WorldWideScience

Sample records for normalisation process model

  1. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. In particular, we consider semi-parametric Bayesian inference in connection with both inhomogeneous Markov point process models… and pairwise interaction point processes…

  2. Random forest meteorological normalisation models for Swiss PM10 trend analysis

    Science.gov (United States)

    Grange, Stuart K.; Carslaw, David C.; Lewis, Alastair C.; Boleti, Eirini; Hueglin, Christoph

    2018-05-01

    Meteorological normalisation is a technique which accounts for changes in meteorology over time in an air quality time series. Controlling for such changes helps support robust trend analysis because there is more certainty that the observed trends are due to changes in emissions or chemistry, not changes in meteorology. Predictive random forest models (RF; a decision tree machine learning technique) were grown for 31 air quality monitoring sites in Switzerland using surface meteorological, synoptic scale, boundary layer height, and time variables to explain daily PM10 concentrations. The RF models were used to calculate meteorologically normalised trends which were formally tested and evaluated using the Theil-Sen estimator. Between 1997 and 2016, significantly decreasing normalised PM10 trends ranged between -0.09 and -1.16 µg m⁻³ yr⁻¹, with urban traffic sites experiencing the greatest mean decrease in PM10 concentrations at -0.77 µg m⁻³ yr⁻¹. Similar magnitudes have been reported for normalised PM10 trends for earlier time periods in Switzerland, which indicates PM10 concentrations are continuing to decrease at rates similar to those in the past. The interpretability of the RF models was leveraged using partial dependence plots to explain the observed trends and the relevant physical and chemical processes influencing PM10 concentrations. Notably, the models suggested two regimes which cause elevated PM10 concentrations in Switzerland: one related to poor dispersion conditions and a second resulting from high rates of secondary PM generation in deep, photochemically active boundary layers. The RF meteorological normalisation process was found to be robust, user-friendly, simple to implement and readily interpretable, which suggests the technique could be useful in many air quality exploratory data analysis situations.
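The Theil-Sen estimator used above to test the normalised trends has a compact definition: the median of the slopes between all pairs of observations, which makes it robust to outlying years. A minimal pure-Python sketch (the annual values below are invented for illustration, not data from the study):

```python
from itertools import combinations
from statistics import median

def theil_sen_slope(years, values):
    """Median of all pairwise slopes: a robust trend estimate (units per year)."""
    slopes = [(v2 - v1) / (y2 - y1)
              for (y1, v1), (y2, v2) in combinations(zip(years, values), 2)
              if y2 != y1]
    return median(slopes)

# Hypothetical normalised annual PM10 means (µg/m³), illustration only
years = [1997, 1998, 1999, 2000, 2001]
pm10  = [28.0, 27.1, 26.5, 25.4, 24.9]
print(theil_sen_slope(years, pm10))  # robust trend, µg/m³ per year
```

In the study this is applied to the meteorologically normalised PM10 series; SciPy's `theilslopes` provides an equivalent routine with confidence intervals.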

  3. ENEKuS--A Key Model for Managing the Transformation of the Normalisation of the Basque Language in the Workplace

    Science.gov (United States)

    Marko, Inazio; Pikabea, Inaki

    2013-01-01

    The aim of this study is to develop a reference model for intervention in the language processes applied to the transformation of language normalisation within organisations of a socio-economic nature. It is based on a case study of an experiment carried out over 10 years within a trade union confederation, and has pursued a strategy of a…

  4. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  5. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  6. The implementation of medical revalidation: an assessment using normalisation process theory

    Directory of Open Access Journals (Sweden)

    Abigail Tazzyman

    2017-11-01

    Full Text Available Abstract Background Medical revalidation is the process by which all licensed doctors are legally required to demonstrate that they are up to date and fit to practise in order to maintain their licence. Revalidation was introduced in the United Kingdom (UK) in 2012, constituting significant change in the regulation of doctors. The governing body, the General Medical Council (GMC), envisages that revalidation will improve patient care and safety. This potential however is, in part, dependent upon how successfully revalidation is embedded into routine practice. The aim of this study was to use Normalisation Process Theory (NPT) to explore issues contributing to or impeding the implementation of revalidation in practice. Methods We conducted seventy-one interviews with sixty UK policymakers and senior leaders at different points during the development and implementation of revalidation: in 2011 (n = 31), 2013 (n = 26) and 2015 (n = 14). We selected interviewees using purposeful sampling. NPT was used as a framework to enable systematic analysis across the interview sets. Results Initial lack of consensus over revalidation’s purpose, and scepticism about its value, decreased over time as participants recognised the benefits it brought to their practice (coherence category of NPT). Though acceptance increased over time, revalidation was not seen as a legitimate part of their role by all doctors. Key individuals, notably the Responsible Officer (RO), were vital for the successful implementation of revalidation in organisations (cognitive participation category). The ease with which revalidation could be integrated into working practices varied greatly depending on the type of role a doctor held and the organisation they worked for, and the provision of resources was a significant variable in this (collective action category). Formal evaluation of revalidation in organisations was lacking but informal evaluation was taking place. Revalidation had

  7. Normalisation genes for expression analyses in the brown alga model Ectocarpus siliculosus

    Directory of Open Access Journals (Sweden)

    Rousvoal Sylvie

    2008-08-01

    Full Text Available Abstract Background Brown algae are plant multi-cellular organisms occupying most of the world coasts and are essential actors in the constitution of ecological niches at the shoreline. Ectocarpus siliculosus is an emerging model for brown algal research. Its genome has been sequenced, and several tools are being developed to perform analyses at different levels of cell organization, including transcriptomic expression analyses. Several topics, including physiological responses to osmotic stress and to exposure to contaminants and solvents, are being studied in order to better understand the adaptive capacity of brown algae to pollution and environmental changes. A series of genes that can be used to normalise expression analyses is required for these studies. Results We monitored the expression of 13 genes under 21 different culture conditions. These included genes encoding proteins and factors involved in protein translation (ribosomal protein 26S, EF1alpha, IF2A, IF4E), protein degradation (ubiquitin, ubiquitin conjugating enzyme) or folding (cyclophilin); proteins involved in both the structure of the cytoskeleton (tubulin alpha, actin, actin-related proteins) and its trafficking function (dynein); and a protein implicated in carbon metabolism (glucose 6-phosphate dehydrogenase). The stability of their expression level was assessed using the Ct range, and by applying both the geNorm and the Normfinder principles of calculation. Conclusion Comparisons of the data obtained with the three methods of calculation indicated that EF1alpha (EF1a) was the best reference gene for normalisation. The normalisation factor should be calculated with at least two genes, alpha tubulin, ubiquitin-conjugating enzyme or actin-related proteins being good partners of EF1a. Our results exclude actin as a good normalisation gene, and, in this, are in agreement with previous studies in other organisms.
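geNorm's stability measure M, referred to above, is in essence a gene's average pairwise variation with every other candidate: for each pair, take the standard deviation of the log2 expression ratios across samples, then average over the pairings; lower M means more stable expression. A minimal sketch (the expression values are invented, not data from the study):

```python
from math import log2
from statistics import stdev

def genorm_m(expression):
    """geNorm-style M value per gene: mean stdev of pairwise log2 ratios.

    expression: dict mapping gene name -> list of relative expression
    values, aligned by sample. Lower M indicates more stable expression.
    """
    genes = list(expression)
    m = {}
    for g in genes:
        pair_sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [log2(a / b) for a, b in zip(expression[g], expression[h])]
            pair_sds.append(stdev(ratios))
        m[g] = sum(pair_sds) / len(pair_sds)
    return m

# Hypothetical relative expression across four conditions (illustration only)
expr = {
    "EF1a":  [1.00, 1.05, 0.98, 1.02],
    "actin": [1.00, 1.60, 0.70, 1.30],
    "tubA":  [1.00, 1.08, 0.95, 1.04],
}
m_values = genorm_m(expr)
best = min(m_values, key=m_values.get)
print(best)  # the most stable candidate in this toy data
```

In the invented data the volatile "actin" series scores the worst (highest) M, mirroring the abstract's conclusion that actin is a poor normalisation gene.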

  8. Implementation of the SMART MOVE intervention in primary care: a qualitative study using normalisation process theory.

    Science.gov (United States)

    Glynn, Liam G; Glynn, Fergus; Casey, Monica; Wilkinson, Louise Gaffney; Hayes, Patrick S; Heaney, David; Murphy, Andrew W M

    2018-05-02

    Problematic translational gaps continue to exist between demonstrating the positive impact of healthcare interventions in research settings and their implementation into routine daily practice. The aim of this qualitative evaluation of the SMART MOVE trial was to conduct a theoretically informed analysis, using normalisation process theory, of the potential barriers and levers to the implementation of an mHealth intervention to promote physical activity in primary care. The study took place in the West of Ireland with recruitment in the community from the Clare Primary Care Network. SMART MOVE trial participants and the staff from four primary care centres were invited to take part and all agreed to do so. A qualitative methodology was employed, combining focus groups (general practitioners, practice nurses and non-clinical staff from four separate primary care centres, n = 14) and individual semi-structured interviews (intervention and control SMART MOVE trial participants, n = 4), with purposeful sampling and analysis following the principles of Framework Analysis. The Normalisation Process Theory was used to develop the topic guide for the interviews and also informed the data analysis process. Four themes emerged from the analysis: personal and professional exercise strategies; roles and responsibilities to support active engagement; utilisation challenges; and evaluation, adoption and adherence. It was evident that introducing a new healthcare intervention demands a comprehensive evaluation of the intervention itself and also of the environment in which it is to operate. Despite certain obstacles, the opportunity exists for the successful implementation of a novel healthcare intervention that addresses a hitherto unresolved healthcare need, provided that the intervention has strong usability attributes for both disseminators and target users and coheres strongly with the core objectives and culture of the health care environment in which it is to operate. We

  9. Using Normalisation Process Theory to investigate the implementation of school-based oral health promotion.

    Science.gov (United States)

    Olajide, O J; Shucksmith, J; Maguire, A; Zohoori, F V

    2017-09-01

    Despite the considerable improvement in oral health of children in the UK over the last forty years, a significant burden of dental caries remains prevalent in some groups of children, indicating the need for more effective oral health promotion intervention (OHPI) strategies in this population. To explore the implementation process of a community-based OHPI, in the North East of England, using Normalisation Process Theory (NPT) to provide insights on how effectiveness could be maximised. Utilising a generic qualitative research approach, 19 participants were recruited into the study. In-depth interviews were conducted with relevant National Health Service (NHS) staff and primary school teachers while focus group discussions were conducted with reception teachers and teaching assistants. Analyses were conducted using thematic analysis with emergent themes mapped onto NPT constructs. Participants highlighted the benefits of OHPI and the need for evidence in practice. However, implementation of 'best evidence' was hampered by lack of adequate synthesis of evidence from available clinical studies on effectiveness of OHPI as these generally have insufficient information on the dynamics of implementation and how effectiveness obtained in clinical studies could be achieved in 'real life'. This impacted on the decision-making process, levels of commitment, collaboration among OHP teams, resource allocation and evaluation of OHPI. A large gap exists between available research evidence and translation of evidence in OHPI in community settings. Effectiveness of OHPI requires not only an awareness of evidence of clinical effectiveness but also synthesised information about change mechanisms and implementation protocols. Copyright© 2017 Dennis Barber Ltd.

  10. Trends of air pollution in Denmark - Normalised by a simple weather index model

    International Nuclear Information System (INIS)

    Kiilsholm, S.; Rasmussen, A.

    2000-01-01

    This report is a part of the Traffic Pool projects on 'Traffic and Environments', 1995-99, financed by the Danish Ministry of Transport. The Traffic Pool projects included five different projects on 'Surveillance of the Air Quality', 'Atmospheric Modelling', 'Atmospheric Chemistry Modelling', 'Smog and ozone' and 'Greenhouse effects and Climate' [Rasmussen, 2000]. This work is a part of the project on 'Surveillance of the Air Quality', with the main objective to make trend analyses of levels of air pollution from traffic in Denmark. Other participants were from the Road Directorate, mainly focusing on measurement of traffic and trend analysis of the air quality utilising a Nordic model for the air pollution in street canyons called BLB (Beregningsmodel for Luftkvalitet i Byluftgader) [Vejdirektoratet 2000]; the National Environmental Research Institute (NERI), mainly focusing on measurements of air pollution and trend analysis with the Operational Street Pollution Model (OSPM) [DMU 2000]; and the Copenhagen Environmental Protection Agency, mainly focusing on measurements. In this study a simpler statistical model has been developed for trend analysis of the air quality. The model filters out the influence of year-to-year variations in meteorological conditions on air pollution levels. The weather factors found most important are wind speed, wind direction and mixing height. Measurements of CO, NO and NO2 from three streets in Copenhagen have been used; these streets are Jagtvej, Bredgade and H. C. Andersen's Boulevard (HCAB). The years 1994-1996 were used for evaluation of the method, and an annual air pollution index dependent only on meteorological parameters, called WEATHIX, was calculated for the years 1990-1997 and used for normalisation of the observed air pollution trends. Meteorological data were taken from either the background stations at the H. C. Oersted building situated close to one of the street stations or the synoptic
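The weather-index normalisation described above can be sketched in heavily simplified form: fit a model of concentration against meteorology over a reference period, express a year's model-predicted level relative to the reference prediction as an index (standing in here for WEATHIX, whose actual definition in the report is more elaborate), and divide the observed level by that index. All variable names and numbers below are invented for illustration:

```python
from statistics import mean

def fit_line(x, y):
    """Ordinary least squares fit for y = a + b*x; returns (a, b)."""
    mx, my = mean(x), mean(y)
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical daily data: wind speed (m/s) vs NO2 (µg/m³), illustration only.
wind = [1.5, 2.0, 3.0, 4.0, 5.5, 6.0]
no2  = [62.0, 55.0, 44.0, 36.0, 25.0, 22.0]
a, b = fit_line(wind, no2)

# Index for a year: model-predicted mean concentration given that year's
# meteorology, relative to the reference-period prediction.
ref_pred  = mean(a + b * w for w in wind)    # reference period
year_wind = [3.5, 4.5, 5.5]                  # a windier (more dispersive) year
year_pred = mean(a + b * w for w in year_wind)
weathix   = year_pred / ref_pred             # < 1: weather favoured low levels

observed_annual_mean = 38.0
normalised = observed_annual_mean / weathix  # weather-corrected level
print(round(weathix, 2), round(normalised, 1))
```

Dividing by the index raises the weather-corrected level for a dispersive year, so year-to-year meteorological luck is removed before the trend is read off.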

  11. The applicability of normalisation process theory to speech and language therapy: a review of qualitative research on a speech and language intervention.

    Science.gov (United States)

    James, Deborah M

    2011-08-12

    The Bercow review found a high level of public dissatisfaction with speech and language services for children. Children with speech, language, and communication needs (SLCN) often have chronic complex conditions that require provision from health, education, and community services. Speech and language therapists are a small group of Allied Health Professionals with a specialist skill-set that equips them to work with children with SLCN. They work within and across the diverse range of public service providers. The aim of this review was to explore the applicability of Normalisation Process Theory (NPT) to the case of speech and language therapy. A review of qualitative research on a successfully embedded speech and language therapy intervention was undertaken to test the applicability of NPT. The review focused on two of the collective action elements of NPT (relational integration and interaction workability) using all previously published qualitative data from both parents and practitioners' perspectives on the intervention. The synthesis of the data based on the Normalisation Process Model (NPM) uncovered strengths in the interpersonal processes between the practitioners and parents, and weaknesses in how the accountability of the intervention is distributed in the health system. The analysis based on the NPM uncovered interpersonal processes between the practitioners and parents that were likely to have given rise to successful implementation of the intervention. In previous qualitative research on this intervention where the Medical Research Council's guidance on developing a design for a complex intervention had been used as a framework, the interpersonal work within the intervention had emerged as a barrier to implementation of the intervention. It is suggested that the design of services for children and families needs to extend beyond the consideration of benefits and barriers to embrace the social processes that appear to afford success in embedding

  12. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    Science.gov (United States)

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of

  13. Assessing the facilitators and barriers of interdisciplinary team working in primary care using normalisation process theory: An integrative review.

    Science.gov (United States)

    O'Reilly, Pauline; Lee, Siew Hwa; O'Sullivan, Madeleine; Cullen, Walter; Kennedy, Catriona; MacFarlane, Anne

    2017-01-01

    Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. We need to know much more about the experiences of the full network of primary care professionals regarding all aspects of implementation work. International

  14. Assessing the facilitators and barriers of interdisciplinary team working in primary care using normalisation process theory: An integrative review

    Science.gov (United States)

    O’Reilly, Pauline; Lee, Siew Hwa; O’Sullivan, Madeleine; Cullen, Walter; Kennedy, Catriona; MacFarlane, Anne

    2017-01-01

    Background Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. Methods and findings An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. Conclusion A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. 
We need to know much more about the experiences of the full network of primary care professionals regarding all aspects

  15. Facilitating professional liaison in collaborative care for depression in UK primary care; a qualitative study utilising normalisation process theory.

    Science.gov (United States)

    Coupe, Nia; Anderson, Emma; Gask, Linda; Sykes, Paul; Richards, David A; Chew-Graham, Carolyn

    2014-05-01

    Collaborative care (CC) is an organisational framework which facilitates the delivery of a mental health intervention to patients by case managers in collaboration with more senior health professionals (supervisors and GPs), and is effective for the management of depression in primary care. However, there remains limited evidence on how to successfully implement this collaborative approach in UK primary care. This study aimed to explore to what extent CC impacts on professional working relationships, and whether CC for depression could be implemented as routine in the primary care setting. This qualitative study explored perspectives of the 6 case managers (CMs), 5 supervisors (trial research team members) and 15 general practitioners (GPs) from practices participating in a randomised controlled trial of CC for depression. Interviews were transcribed verbatim and data were analysed using a two-step approach: an initial thematic analysis, followed by a secondary analysis using the Normalisation Process Theory concepts of coherence, cognitive participation, collective action and reflexive monitoring with respect to the implementation of CC in primary care. Supervisors and CMs demonstrated coherence in their understanding of CC, and consequently reported good levels of cognitive participation and collective action regarding delivering and supervising the intervention. GPs interviewed showed limited understanding of the CC framework, and reported limited collaboration with CMs: barriers to collaboration were identified. All participants identified the potential or experienced benefits of a collaborative approach to depression management and were able to discuss ways in which collaboration can be facilitated. Primary care professionals in this study valued the potential for collaboration, but GPs' understanding of CC and organisational barriers hindered opportunities for communication. Further work is needed to address these organisational barriers in order to facilitate

  16. Understanding clinician attitudes towards implementation of guided self-help cognitive behaviour therapy for those who hear distressing voices: using factor analysis to test normalisation process theory.

    Science.gov (United States)

    Hazell, Cassie M; Strauss, Clara; Hayward, Mark; Cavanagh, Kate

    2017-07-24

    The Normalisation Process Theory (NPT) has been used to understand the implementation of physical health care interventions. The current study aims to apply the NPT model to a secondary mental health context, and test the model using exploratory factor analysis. This study will consider the implementation of a brief cognitive behaviour therapy for psychosis (CBTp) intervention. Mental health clinicians were asked to complete a NPT-based questionnaire on the implementation of a brief CBTp intervention. All clinicians had experience of either working with the target client group or were able to deliver psychological therapies. In total, 201 clinicians completed the questionnaire. The results of the exploratory factor analysis found partial support for the NPT model, as three of the NPT factors were extracted: (1) coherence, (2) cognitive participation, and (3) reflexive monitoring. We did not find support for the fourth NPT factor (collective action). All scales showed strong internal consistency. Secondary analysis of these factors showed clinicians to generally support the implementation of the brief CBTp intervention. This study provides strong evidence for the validity of the three NPT factors extracted. Further research is needed to determine whether participants' level of seniority moderates factor extraction, whether this factor structure can be generalised to other healthcare settings, and whether pre-implementation attitudes predict actual implementation outcomes.
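The "strong internal consistency" reported for each extracted scale is conventionally measured with Cronbach's alpha, which is straightforward to compute from item responses. A minimal sketch with invented Likert-scale data (not the study's questionnaire data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for one scale.

    items: list of item-score lists, one inner list per item, aligned by
    respondent. Uses population variances in the classical formula
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var = sum(pvariance(item) for item in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 5-point Likert responses: 3 items x 6 respondents
coherence_items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
]
print(round(cronbach_alpha(coherence_items), 2))  # ≈ 0.87
```

Values above roughly 0.7 are usually read as acceptable consistency; perfectly correlated items give alpha = 1.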

  17. Implementing online consultations in primary care: a mixed-method evaluation extending normalisation process theory through service co-production.

    Science.gov (United States)

    Farr, Michelle; Banks, Jonathan; Edwards, Hannah B; Northstone, Kate; Bernard, Elly; Salisbury, Chris; Horwood, Jeremy

    2018-03-19

    To examine patient and staff views, experiences and acceptability of a UK primary care online consultation system and ask how the system and its implementation may be improved. Mixed-method evaluation of a primary care e-consultation system. Primary care practices in South West England. Qualitative interviews with 23 practice staff in six practices. Patient survey data for 756 e-consultations from 36 practices, with free-text survey comments from 512 patients, were analysed thematically. Anonymised patients' records were abstracted for 485 e-consultations from eight practices, including consultation types and outcomes. Descriptive statistics were used to analyse quantitative data. Analysis of implementation and the usage of the e-consultation system was informed by: (1) normalisation process theory, (2) a framework that illustrates how e-consultations were co-produced and (3) patients' and staff touchpoints. We found different expectations between patients and staff on how to use e-consultations 'appropriately'. While some patients used the system to try and save time for themselves and their general practitioners (GPs), some used e-consultations when they could not get a timely face-to-face appointment. Most e-consultations resulted in either follow-on phone (32%) or face-to-face appointments (38%) and GPs felt that this duplicated their workload. Patient satisfaction with the system was high, but a minority were dissatisfied with practice communication about their e-consultation. Where both patients and staff interact with technology, it is in effect 'co-implemented'. How patients used e-consultations impacted on practice staff's experiences and appraisal of the system. Overall, the e-consultation system studied could improve access for some patients, but in its current form, it was not perceived by practices as creating sufficient efficiencies to warrant financial investment. We illustrate how this e-consultation system and its implementation can be improved

  18. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10–14) in Wales, UK.

    Directory of Open Access Journals (Sweden)

    Jeremy Segrott

    2017-12-01

    Conclusions: Extended Normalisation Process Theory provided a useful framework for assessing implementation and explaining variation by examining intervention-context interactions. Findings highlight the need for process evaluations to consider both the structural and process components of implementation to explain whether programme activities are delivered as intended and why.

  19. Supervised Object Class Colour Normalisation

    DEFF Research Database (Denmark)

    Riabchenko, Ekatarina; Lankinen, Jukka; Buch, Anders Glent

    2013-01-01

    Colour is an important cue in many applications of computer vision and image processing, but robust usage often requires estimation of the unknown illuminant colour. Usually, to obtain images invariant to the illumination conditions under which they were taken, colour normalisation is used. In this work, we develop such a colour normalisation technique, where true colours are not important per se but where examples of the same classes have photometrically consistent appearance. This is achieved by supervised estimation of a class-specific canonical colour space where the examples have minimal variation in their colours. We demonstrate the effectiveness of our method with qualitative and quantitative examples from the Caltech-101 data set and a real application of 3D pose estimation for robot grasping.

  20. Using normalisation process theory to understand barriers and facilitators to implementing mindfulness-based stress reduction for people with multiple sclerosis.

    Science.gov (United States)

    Simpson, Robert; Simpson, Sharon; Wood, Karen; Mercer, Stewart W; Mair, Frances S

    2018-01-01

    Objectives To study barriers and facilitators to implementation of mindfulness-based stress reduction for people with multiple sclerosis. Methods Qualitative interviews were used to explore barriers and facilitators to implementation of mindfulness-based stress reduction, including 33 people with multiple sclerosis, 6 multiple sclerosis clinicians and 2 course instructors. Normalisation process theory provided the underpinning conceptual framework. Data were analysed deductively using normalisation process theory constructs (coherence, cognitive participation, collective action and reflexive monitoring). Results Key barriers included mismatched stakeholder expectations, lack of knowledge about mindfulness-based stress reduction, high levels of comorbidity and disability, and skepticism about embedding mindfulness-based stress reduction in routine multiple sclerosis care. Facilitators to implementation included introducing a pre-course orientation session and adapting mindfulness-based stress reduction to accommodate comorbidity and disability; participants suggested smaller, shorter classes, shortened practices, exclusion of mindful walking and more time with peers. Post-mindfulness-based stress reduction booster sessions may be required, and objective and subjective reports of benefit would increase clinician confidence in mindfulness-based stress reduction. Discussion Multiple sclerosis patients and clinicians know little about mindfulness-based stress reduction. Mismatched expectations are a barrier to participation, as is rigid application of mindfulness-based stress reduction in the context of disability. Course adaptations in response to patient needs would facilitate uptake and utilisation. Rendering access to mindfulness-based stress reduction rapid and flexible could facilitate implementation. Embedded outcome assessment is desirable.

  1. Normalising convenience food?

    DEFF Research Database (Denmark)

    Halkier, Bente

    2017-01-01

    The construction of convenience food as a social and cultural category for food provisioning, cooking and eating seems to slide between or across understandings of what is considered “proper food” in the existing discourses in everyday life and media. This article sheds light upon some of the social and cultural normativities around convenience food by describing the ways in which convenience food forms part of the daily life of young Danes. Theoretically, the article is based on a practice theoretical perspective. Empirically, the article builds upon a qualitative research project on food habits among Danes aged 20–25. The article presents two types of empirical patterns. The first type of pattern is the degree to which, and the different ways in which, convenience food is normalised in use among the young Danes. The second type of pattern is the normative places of convenience food...

  2. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Science.gov (United States)

    Rostami, Paryaneh; Ashcroft, Darren M; Tully, Mary P

    2018-01-01

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of

  3. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Directory of Open Access Journals (Sweden)

    Paryaneh Rostami

    Full Text Available Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however

  4. Supporting the use of theory in cross-country health services research: a participatory qualitative approach using Normalisation Process Theory as an example.

    Science.gov (United States)

    O'Donnell, Catherine A; Mair, Frances S; Dowrick, Christopher; Brún, Mary O'Reilly-de; Brún, Tomas de; Burns, Nicola; Lionis, Christos; Saridaki, Aristoula; Papadakaki, Maria; Muijsenbergh, Maria van den; Weel-Baumgarten, Evelyn van; Gravenhorst, Katja; Cooper, Lucy; Princz, Christine; Teunissen, Erik; Mareeuw, Francine van den Driessen; Vlahadi, Maria; Spiegel, Wolfgang; MacFarlane, Anne

    2017-08-21

    To describe and reflect on the process of designing and delivering a training programme supporting the use of theory, in this case Normalisation Process Theory (NPT), in a multisite cross-country health services research study. Participatory research approach using qualitative methods. Six European primary care settings involving research teams from Austria, England, Greece, Ireland, The Netherlands and Scotland. RESTORE research team consisting of 8 project applicants, all senior primary care academics, and 10 researchers. Professional backgrounds included general practitioners/family doctors, social/cultural anthropologists, sociologists and health services/primary care researchers. Views of all research team members (n=18) were assessed using qualitative evaluation methods, analysed qualitatively by the trainers after each session. Most of the team had no experience of using NPT and many had not applied theory to prospective, qualitative research projects. Early training proved didactic and overloaded participants with information. Drawing on RESTORE's methodological approach of Participatory Learning and Action, workshops using role play, experiential interactive exercises and light-hearted examples not directly related to the study subject matter were developed. Evaluation showed the study team quickly grew in knowledge and confidence in applying theory to fieldwork. Recommendations applicable to other studies include: accepting that theory application is not a linear process, that time is needed to address researcher concerns with the process, and that experiential, interactive learning is a key device in building conceptual and practical knowledge. An unanticipated benefit was the smooth transition to cross-country qualitative coding of study data. A structured programme of training enhanced and supported the prospective application of a theory, NPT, to our work but raised challenges. These were not unique to NPT but could arise with the application of any

  5. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10-14) in Wales, UK.

    Science.gov (United States)

    Segrott, Jeremy; Murphy, Simon; Rothwell, Heather; Scourfield, Jonathan; Foxcroft, David; Gillespie, David; Holliday, Jo; Hood, Kerenza; Hurlow, Claire; Morgan-Trimmer, Sarah; Phillips, Ceri; Reed, Hayley; Roberts, Zoe; Moore, Laurence

    2017-12-01

    Process evaluations generate important data on the extent to which interventions are delivered as intended. However, the tendency to focus only on assessment of pre-specified structural aspects of fidelity has been criticised for paying insufficient attention to implementation processes and how intervention-context interactions influence programme delivery. This paper reports findings from a process evaluation nested within a randomised controlled trial of the Strengthening Families Programme 10-14 (SFP 10-14) in Wales, UK. It uses Extended Normalisation Process Theory to theorise how interaction between SFP 10-14 and local delivery systems - particularly practitioner commitment/capability and organisational capacity - influenced delivery of intended programme activities: fidelity (adherence to SFP 10-14 content and implementation requirements); dose delivered; dose received (participant engagement); participant recruitment and reach (intervention attendance). A mixed methods design was utilised. Fidelity assessment sheets (completed by practitioners), structured observation by researchers, and routine data were used to assess: adherence to programme content; staffing numbers and consistency; recruitment/retention; and group size and composition. Interviews with practitioners explored implementation processes and context. Adherence to programme content was high - with some variation, linked to practitioner commitment to, and understanding of, the intervention's content and mechanisms. Variation in adherence rates was associated with the extent to which multi-agency delivery team planning meetings were held. Recruitment challenges meant that targets for group size/composition were not always met, but did not affect adherence levels or family engagement. Targets for staffing numbers and consistency were achieved, though capacity within multi-agency networks reduced over time. Extended Normalisation Process Theory provided a useful framework for assessing

  6. Nuclear power 1984: Progressive normalisation

    International Nuclear Information System (INIS)

    Popp, M.

    1984-01-01

    The peaceful use of nuclear power is being integrated into the overall concept of a safe long-term power supply in West Germany. The progress of normalisation is shown particularly in the takeover of all stages of the nuclear fuel cycle by industry, with the exception of the final storage of radioactive waste, which is the responsibility of the West German Government. Normalisation also means the withdrawal of the state from financing projects after completion of the two prototypes SNR-300 and THTR-300 and the German uranium enrichment plant. The state will, however, support future research and development projects in the nuclear field. The expansion of nuclear power capacity is at present being slowed down by the state of the economy, i.e. only nuclear power plants already under construction are proceeding. (orig./HP) [de

  7. Repeated lysergic acid diethylamide in an animal model of depression: Normalisation of learning behaviour and hippocampal serotonin 5-HT2 signalling.

    Science.gov (United States)

    Buchborn, Tobias; Schröder, Helmut; Höllt, Volker; Grecksch, Gisela

    2014-06-01

    A re-balance of postsynaptic serotonin (5-HT) receptor signalling, with an increase in 5-HT1A and a decrease in 5-HT2A signalling, is a final common pathway multiple antidepressants share. Given that the 5-HT1A/2A agonist lysergic acid diethylamide (LSD), when repeatedly applied, selectively downregulates 5-HT2A, but not 5-HT1A receptors, one might expect LSD to similarly re-balance the postsynaptic 5-HT signalling. Challenging this idea, we use an animal model of depression specifically responding to repeated antidepressant treatment (olfactory bulbectomy), and test the antidepressant-like properties of repeated LSD treatment (0.13 mg/kg/d, 11 d). In line with former findings, we observe that bulbectomised rats show marked deficits in active avoidance learning. These deficits, similarly as we earlier noted with imipramine, are largely reversed by repeated LSD administration. Additionally, bulbectomised rats exhibit distinct anomalies of monoamine receptor signalling in hippocampus and/or frontal cortex; from these, only the hippocampal decrease in 5-HT2 related [(35)S]-GTP-gamma-S binding is normalised by LSD. Importantly, the sham-operated rats do not profit from LSD, and exhibit reduced hippocampal 5-HT2 signalling. As behavioural deficits after bulbectomy respond to agents classified as antidepressants only, we conclude that the effect of LSD in this model can be considered antidepressant-like, and discuss it in terms of a re-balance of hippocampal 5-HT2/5-HT1A signalling. © The Author(s) 2014.

  8. Infinitary Combinatory Reduction Systems: Normalising Reduction Strategies

    NARCIS (Netherlands)

    Ketema, J.; Simonsen, Jakob Grue

    2010-01-01

    We study normalising reduction strategies for infinitary Combinatory Reduction Systems (iCRSs). We prove that all fair, outermost-fair, and needed-fair strategies are normalising for orthogonal, fully-extended iCRSs. These facts properly generalise a number of results on normalising strategies in

  9. Rules of Normalisation and their Importance for Interpretation of Systems of Optimal Taxation

    DEFF Research Database (Denmark)

    Munk, Knud Jørgen

    representation of the general equilibrium conditions the rules of normalisation in standard optimal tax models. This allows us to provide an intuitive explanation of what determines the optimal tax system. Finally, we review a number of examples where lack of precision with respect to normalisation in otherwise important contributions to the literature on optimal taxation has given rise to misinterpretations of analytical results.

  10. Effect of food matrix and thermal processing on the performance of a normalised quantitative real-time PCR approach for lupine (Lupinus albus) detection as a potential allergenic food.

    Science.gov (United States)

    Villa, Caterina; Costa, Joana; Gondar, Cristina; Oliveira, M Beatriz P P; Mafra, Isabel

    2018-10-01

    Lupine is widely used as an ingredient in diverse food products, but it is also a source of allergens. This work aimed at proposing a method to detect/quantify lupine as an allergen in processed foods based on a normalised real-time PCR assay targeting the Lup a 4 allergen-encoding gene of Lupinus albus. Sensitivities down to 0.0005%, 0.01% and 0.05% (w/w) of lupine in rice flour, wheat flour and bread, respectively, and 1 pg of L. albus DNA were obtained, with adequate real-time PCR performance parameters using the ΔCt method. Both the food matrix and thermal processing negatively affected the quantitative performance of the assay. The method was successfully validated with blind samples and applied to processed foods. Lupine was estimated between 4.12 and 22.9% in foods, with some results suggesting the common practice of precautionary labelling. In this work, useful and effective tools were proposed for the detection/quantification of lupine in food products. Copyright © 2018 Elsevier Ltd. All rights reserved.
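    The ΔCt normalisation behind assays like this one follows standard relative-quantification arithmetic: the target gene's Ct is normalised against a reference gene, and relative amounts follow the 2^(−ΔΔCt) rule. The sketch below uses hypothetical Ct values (not data from the study) and assumes roughly 100% amplification efficiency.

```python
# Sketch of normalised Delta-Ct quantification in real-time PCR
# (hypothetical Ct values, not data from the study above).

def delta_ct(ct_target: float, ct_reference: float) -> float:
    """Normalise the target Ct against a reference (endogenous) gene."""
    return ct_target - ct_reference

def relative_quantity(d_ct_sample: float, d_ct_calibrator: float) -> float:
    """Relative amount via the 2^(-ddCt) rule, assuming ~100% PCR efficiency."""
    dd_ct = d_ct_sample - d_ct_calibrator
    return 2.0 ** (-dd_ct)

# Example: the sample's target amplifies 2 cycles later than the calibrator's,
# i.e. roughly a 4-fold lower starting amount.
sample = delta_ct(ct_target=26.0, ct_reference=20.0)      # 6.0
calibrator = delta_ct(ct_target=24.0, ct_reference=20.0)  # 4.0
print(relative_quantity(sample, calibrator))              # 0.25
```

    Each extra cycle of delay corresponds to one halving of the starting template, which is why the quantity scales as a power of two.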

  11. Guidelines for normalising Early Modern English corpora: Decisions and justifications

    Directory of Open Access Journals (Sweden)

    Archer Dawn

    2015-03-01

    Full Text Available Corpora of Early Modern English have been collected and released for research for a number of years. With large scale digitisation activities gathering pace in the last decade, much more historical textual data is now available for research on numerous topics including historical linguistics and conceptual history. We summarise previous research which has shown that it is necessary to map historical spelling variants to modern equivalents in order to successfully apply natural language processing and corpus linguistics methods. Manual and semiautomatic methods have been devised to support this normalisation and standardisation process. We argue that it is important to develop a linguistically meaningful rationale to achieve good results from this process. In order to do so, we propose a number of guidelines for normalising corpora and show how these guidelines have been applied in the Corpus of English Dialogues.

  12. The dynamics of the oesophageal squamous epithelium 'normalisation' process in patients with gastro-oesophageal reflux disease treated with long-term acid suppression or anti-reflux surgery.

    Science.gov (United States)

    Mastracci, L; Fiocca, R; Engström, C; Attwood, S; Ell, C; Galmiche, J P; Hatlebakk, J G; Långström, G; Eklund, S; Lind, T; Lundell, L

    2017-05-01

    Proton pump inhibitors and laparoscopic anti-reflux surgery (LARS) offer long-term symptom control to patients with gastro-oesophageal reflux disease (GERD). To evaluate the process of 'normalisation' of the squamous epithelium morphology of the distal oesophagus on these therapies. In the LOTUS trial, 554 patients with chronic GERD were randomised to receive either esomeprazole (20-40 mg daily) or LARS. After 5 years, 372 patients remained in the study (esomeprazole, 192; LARS, 180). Biopsies were taken at the Z-line and 2 cm above, at baseline, 1, 3 and 5 years. A severity score was calculated based on: papillae elongation, basal cell hyperplasia, intercellular space dilatations and eosinophilic infiltration. The epithelial proliferative activity was assessed by Ki-67 immunohistochemistry. A gradual improvement in all variables over 5 years was noted in both groups, at both the Z-line and 2 cm above. The severity score decreased from baseline at each subsequent time point in both groups (P refluxate seems to play the predominant role in restoring tissue morphology. © 2017 John Wiley & Sons Ltd.

  13. Attitudes to Normalisation and Inclusive Education

    Science.gov (United States)

    Sanagi, Tomomi

    2016-01-01

    The purpose of this paper was to clarify the features of teachers' image on normalisation and inclusive education. The participants of the study were both mainstream teachers and special teachers. One hundred and thirty-eight questionnaires were analysed. (1) Teachers completed the questionnaire of SD (semantic differential) images on…

  14. Queer Literature in Spain: Pathways to Normalisation

    Directory of Open Access Journals (Sweden)

    Martínez-Expósito, Alfredo

    2013-06-01

    Full Text Available More than any other, the idea of normalisation has provoked deep divisions within queer activism both at a philosophical and also at a political level. At the root of these divisions lies the irreconcilable divergence between an agenda for social change, which advocates the need for society to accept all sexual behaviours and identities as normal, and an approach of radical resistance against some social structures that can only offer a bourgeois and conformist normalisation. Literary fiction and homo-gay-queer themed cinema have explored these and other sides of the idea of normalisation and have thus contributed to the taking of decisive steps: from the poetics of transgression towards the poetics of celebration and social transformation. In this paper we examine two of these literary normalisation strategies: the use of humour and the proliferation of discursive perspectives both in the cinema and in narrative fiction during the last decades.

  15. Normalisation: ROI optimal treatment planning - SNDH pattern

    International Nuclear Information System (INIS)

    Shilvat, D.V.; Bhandari, Virendra; Tamane, Chandrashekhar; Pangam, Suresh

    2001-01-01

    Delivering dose as precisely as possible to the target/ROI (region of interest), while respecting the tolerance doses of normal tissue, is the aim of ideal treatment planning. This goal is achieved with advanced modalities such as micro-MLC, a simulator and a 3-dimensional treatment planning system. The SNDH pattern, however, uses the minimum available resources (an ALCYON II telecobalt unit, CT scan and the MULTIDATA 2-dimensional treatment planning system) to their maximum utility and reaches the same precision as the advanced modalities. Among the parameters used, 'normalisation to the ROI' achieves the aim of treatment planning most effectively. This is illustrated with an example of modified treatment planning for carcinoma of the esophagus based on the SNDH pattern. Results are attractive and self-explanatory. By implementing the SNDH pattern, the quality index of the treatment plan reaches greater than 90%, with a substantial reduction in dose to vital organs. The aim is to utilise the minimum available resources efficiently to achieve the highest possible precision in delivering a homogeneous dose to the ROI while respecting the tolerance doses of vital organs.

  16. Technology, normalisation and male sex work.

    Science.gov (United States)

    MacPhail, Catherine; Scott, John; Minichiello, Victor

    2015-01-01

    Technological change, particularly the growth of the Internet and smart phones, has increased the visibility of male escorts, expanded their client base and diversified the range of venues in which male sex work can take place. Specifically, the Internet has relocated some forms of male sex work away from the street and thereby increased market reach, visibility and access and the scope of sex work advertising. Using the online profiles of 257 male sex workers drawn from six of the largest websites advertising male sexual services in Australia, the role of the Internet in facilitating the normalisation of male sex work is discussed. Specifically, we examine how engagement with the sex industry has been reconstituted in terms of better-informed consumer-seller decisions for both clients and sex workers. Rather than being seen as a 'deviant' activity, understood in terms of pathology or criminal activity, male sex work is increasingly presented as an everyday commodity in the market place. In this context, the management of risks associated with sex work has shifted from formalised social control to more informal practices conducted among online communities of clients and sex workers. We discuss the implications for health, legal and welfare responses within an empowerment paradigm.

  17. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  18. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  19. Normalisation of body composition parameters for nutritional assessment

    International Nuclear Information System (INIS)

    Preston, Thomas

    2014-01-01

    Full text: Normalisation of body composition parameters to an index of body size facilitates comparison of a subject’s measurements with those of a population. There is an obvious focus on indexes of obesity, but first it is informative to consider Fat Free Mass (FFM) in the context of common anthropometric measures of body size, namely height and weight. The contention is that FFM is a more physiological measure of body size than body mass. Many studies have shown that FFM relates to height^p. Although there is debate over the appropriate exponent, especially in early life, it appears to lie between 2 and 3. If 2, then FFM Index (FFMI; kg/m2) and Fat Mass Index (FMI; kg/m2) can be summed to give BMI. If 3 were used as exponent, then FFMI (kg/m3) plus FMI (kg/m3) gives the Ponderal Index (PI; weight/height3). In 2013, Burton argued that a cubic exponent is appropriate for normalisation as it is a dimensionless quotient. In 2012, Wang and co-workers repeated earlier observations showing a strong linear relationship between FFM and height3. The importance of the latter study comes from the fact that a 4-compartment body composition model was used, which is recognised as the most accurate means of describing FFM. Once the basis of a FFMI has been defined, it can be used to compare measurements with those of a population, either directly, as a ratio to a norm, or as a Z-score. FFMI charts could be developed for use in child growth. Other related indexes can be determined for use in specific circumstances, such as: body cell mass index (growth and wasting); skeletal muscle mass index (SMMI) or appendicular SMMI (growth and sarcopenia); bone mineral mass index (osteoporosis); extracellular fluid index (hydration). Finally, it is logical that the same system is used to define an adiposity index, so Fat Mass Index (FMI; kg/height3) can be used as it is consistent with FFMI (kg/height3) and PI. It should also be noted that the index FM/FFM describes an individual
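    The index arithmetic described in the abstract is easy to make concrete: with exponent 2 the fat-free and fat mass indexes sum to BMI, and with exponent 3 they sum to the Ponderal Index. The sketch below uses hypothetical subject values, not data from the text.

```python
# Body-compartment indexes normalised to height^p
# (hypothetical subject values, not data from the abstract).

def mass_index(mass_kg: float, height_m: float, p: float) -> float:
    """Normalise a body mass compartment to height^p."""
    return mass_kg / height_m ** p

height = 1.75          # m
ffm, fm = 55.0, 15.0   # fat-free mass and fat mass, kg (body weight = 70 kg)

# With exponent p = 2, FFMI + FMI gives BMI (kg/m2):
bmi = mass_index(ffm + fm, height, 2)
ffmi2, fmi2 = mass_index(ffm, height, 2), mass_index(fm, height, 2)

# With exponent p = 3, FFMI + FMI gives the Ponderal Index (kg/m3):
pi = mass_index(ffm + fm, height, 3)
ffmi3, fmi3 = mass_index(ffm, height, 3), mass_index(fm, height, 3)

print(round(ffmi2 + fmi2, 3) == round(bmi, 3))  # True
print(round(ffmi3 + fmi3, 3) == round(pi, 3))   # True
```

    Because the denominator is shared, the decomposition of body weight into FFM and FM carries over directly to any height-normalised index family.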

  20. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  1. A comparison of parametric and nonparametric methods for normalising cDNA microarray data.

    Science.gov (United States)

    Khondoker, Mizanur R; Glasbey, Chris A; Worton, Bruce J

    2007-12-01

    Normalisation is an essential first step in the analysis of most cDNA microarray data, to correct for effects arising from imperfections in the technology. Loess smoothing is commonly used to correct for trends in log-ratio data. However, parametric models, such as the additive plus multiplicative variance model, have been preferred for scale normalisation, though the variance structure of microarray data may be of a more complex nature than can be accommodated by a parametric model. We propose a new nonparametric approach that incorporates location and scale normalisation simultaneously using a Generalised Additive Model for Location, Scale and Shape (GAMLSS, Rigby and Stasinopoulos, 2005, Applied Statistics, 54, 507-554). We compare its performance in inferring differential expression with Huber et al.'s (2002, Bioinformatics, 18, 96-104) arsinh variance stabilising transformation (AVST) using real and simulated data. We show GAMLSS to be as powerful as AVST when the parametric model is correct, and more powerful when the model is wrong. (c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
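
    The location-normalisation role that loess plays above can be illustrated with a deliberately crude stand-in: a running mean over intensity-ordered log-ratios. This sketch is ours, not the authors' loess or GAMLSS method, and the synthetic data are invented:

    ```python
    def normalise_log_ratios(m, a, window=5):
        """Subtract a running-mean trend of M (log-ratio) estimated along
        A (log-intensity) -- a crude stand-in for a loess smoother."""
        order = sorted(range(len(a)), key=lambda i: a[i])
        corrected = [0.0] * len(m)
        for rank, i in enumerate(order):
            lo = max(0, rank - window // 2)
            hi = min(len(order), rank + window // 2 + 1)
            local = [m[order[j]] for j in range(lo, hi)]
            corrected[i] = m[i] - sum(local) / len(local)
        return corrected

    # Synthetic data with an intensity-dependent bias: M rises linearly with A.
    a_vals = [float(i) for i in range(20)]
    m_vals = [0.1 * a for a in a_vals]      # pure trend, no real signal
    flat = normalise_log_ratios(m_vals, a_vals)
    # After correction the systematic trend is largely removed:
    assert max(abs(x) for x in flat) < max(abs(x) for x in m_vals)
    ```

    A real pipeline would additionally model the intensity-dependent *scale* of M, which is exactly the part the GAMLSS approach handles jointly with location.
    
    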

  2. Normalised flood losses in Europe: 1970-2006

    Science.gov (United States)

    Barredo, J. I.

    2009-02-01

    This paper presents an assessment of normalised flood losses in Europe for the period 1970-2006. Normalisation provides an estimate of the losses that would occur if the floods from the past take place under current societal conditions. Economic losses from floods are the result of both societal and climatological factors. Failing to adjust for time-variant socio-economic factors produces loss amounts that are not directly comparable over time, but rather show an ever-growing trend for purely socio-economic reasons. This study has used available information on flood losses from the Emergency Events Database (EM-DAT) and the Natural Hazards Assessment Network (NATHAN). Following the conceptual approach of previous studies, we normalised flood losses by considering the effects of changes in population, wealth, and inflation at the country level. Furthermore, we removed inter-country price differences by adjusting the losses for purchasing power parities (PPP). We assessed normalised flood losses in 31 European countries. These include the member states of the European Union, Norway, Switzerland, Croatia, and the Former Yugoslav Republic of Macedonia. Results show no detectable sign of human-induced climate change in normalised flood losses in Europe. The observed increase in the original flood losses is mostly driven by societal factors.
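
    The normalisation arithmetic described above (adjusting a past loss for inflation and for changes in population and wealth) can be sketched as follows. The factor structure is a simplification of the published methodology and all figures are invented:

    ```python
    def normalise_loss(loss_then, cpi_then, cpi_now,
                       pop_then, pop_now, wealth_pc_then, wealth_pc_now):
        """Estimate what a past flood would cost under current societal
        conditions by scaling for inflation, population and per-capita wealth."""
        inflation = cpi_now / cpi_then
        population = pop_now / pop_then
        wealth = wealth_pc_now / wealth_pc_then   # real per-capita wealth ratio
        return loss_then * inflation * population * wealth

    # A hypothetical flood causing 100 million (local currency) of damage in 1980:
    normalised = normalise_loss(
        loss_then=100e6,
        cpi_then=50.0, cpi_now=100.0,                  # prices doubled
        pop_then=8.0e6, pop_now=10.0e6,                # population up 25%
        wealth_pc_then=10_000, wealth_pc_now=15_000,   # real wealth up 50%
    )
    print(f"{normalised / 1e6:.0f} million")  # 100 * 2 * 1.25 * 1.5 = 375 million
    ```

    The PPP adjustment mentioned in the abstract would enter as one further multiplicative factor when comparing losses across countries.
    
    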

  3. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety ... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision-making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  4. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  5. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially ... hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  6. Use and misuse of temperature normalisation in meta-analyses of thermal responses of biological traits

    Directory of Open Access Journals (Sweden)

    Dimitrios - Georgios Kontopoulos

    2018-02-01

    Full Text Available There is currently unprecedented interest in quantifying variation in thermal physiology among organisms, especially in order to understand and predict the biological impacts of climate change. A key parameter in this quantification of thermal physiology is the performance or value of a rate, across individuals or species, at a common temperature (temperature normalisation. An increasingly popular model for fitting thermal performance curves to data—the Sharpe-Schoolfield equation—can yield strongly inflated estimates of temperature-normalised rate values. These deviations occur whenever a key thermodynamic assumption of the model is violated, i.e., when the enzyme governing the performance of the rate is not fully functional at the chosen reference temperature. Using data on 1,758 thermal performance curves across a wide range of species, we identify the conditions that exacerbate this inflation. We then demonstrate that these biases can compromise tests to detect metabolic cold adaptation, which requires comparison of fitness or rate performance of different species or genotypes at some fixed low temperature. Finally, we suggest alternative methods for obtaining unbiased estimates of temperature-normalised rate values for meta-analyses of thermal performance across species in climate change impact studies.
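
    A common simplified Sharpe-Schoolfield form (with high-temperature deactivation only) illustrates the inflation mechanism the abstract describes: the fitted normalisation constant b0 is the rate at the reference temperature *assuming* the enzyme is fully functional there. The parameter names and values below are our own, chosen only for illustration:

    ```python
    import math

    K = 8.617e-5  # Boltzmann constant, eV/K

    def sharpe_schoolfield(temp, b0, e, eh, th, tref=283.15):
        """Simplified Sharpe-Schoolfield thermal performance curve with
        high-temperature deactivation only. b0 is the rate at tref *if*
        the enzyme were fully functional at that temperature."""
        arrhenius = b0 * math.exp(-e / K * (1.0 / temp - 1.0 / tref))
        deactivation = 1.0 / (1.0 + math.exp(eh / K * (1.0 / th - 1.0 / temp)))
        return arrhenius * deactivation

    # When the deactivation midpoint th sits far above tref, the realised
    # rate at tref is essentially b0:
    far = sharpe_schoolfield(283.15, b0=1.0, e=0.65, eh=3.0, th=320.0)
    # But when th lies close to tref, the realised rate at tref falls well
    # below b0 -- so a b0 fitted to such data overstates the temperature-
    # normalised rate, which is the bias discussed above.
    near = sharpe_schoolfield(283.15, b0=1.0, e=0.65, eh=3.0, th=285.0)
    assert far > 0.99 and near < 0.9
    ```
    
    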

  7. Normalisation and weighting in life cycle assessment: quo vadis?

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Laurent, Alexis; Sala, Serenella

    2017-01-01

    Purpose: Building on the rhetoric question “quo vadis?” (literally “Where are you going?”), this article critically investigates the state of the art of normalisation and weighting approaches within life cycle assessment. It aims at identifying purposes, current practises, pros and cons, as well...

  8. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  9. Oral benfotiamine plus alpha-lipoic acid normalises complication-causing pathways in type 1 diabetes.

    Science.gov (United States)

    Du, X; Edelstein, D; Brownlee, M

    2008-10-01

    We determined whether fixed doses of benfotiamine in combination with slow-release alpha-lipoic acid normalise markers of reactive oxygen species-induced pathways of complications in humans. Male participants with and without type 1 diabetes were studied in the General Clinical Research Centre of the Albert Einstein College of Medicine. Glycaemic status was assessed by measuring baseline values of three different indicators of hyperglycaemia. Intracellular AGE formation, hexosamine pathway activity and prostacyclin synthase activity were measured initially, and after 2 and 4 weeks of treatment. In the nine participants with type 1 diabetes, treatment had no effect on any of the three indicators used to assess hyperglycaemia. However, treatment with benfotiamine plus alpha-lipoic acid completely normalised increased AGE formation, reduced increased monocyte hexosamine-modified proteins by 40% and normalised the 70% decrease in prostacyclin synthase activity from 1,709 +/- 586 pg/ml 6-keto-prostaglandin F(1alpha) to 4,696 +/- 533 pg/ml. These results show that the previously demonstrated beneficial effects of these agents on complication-causing pathways in rodent models of diabetic complications also occur in humans with type 1 diabetes.

  10. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  11. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  12. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small sized, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three
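
    The geNorm stability measure used in studies like this one can be sketched as follows. This is a simplified reading of the algorithm (each gene's M value is the mean standard deviation of its log2 expression ratio against every other candidate, across samples; lower M means more stable); the gene names and expression values below are invented:

    ```python
    import math
    import statistics

    def genorm_m(expr):
        """expr: {gene: [expression per sample]} -> {gene: geNorm M value}."""
        genes = list(expr)
        m = {}
        for g in genes:
            sds = []
            for h in genes:
                if h == g:
                    continue
                # log2 ratio of gene g to gene h in each sample
                ratios = [math.log2(x / y) for x, y in zip(expr[g], expr[h])]
                sds.append(statistics.stdev(ratios))
            m[g] = sum(sds) / len(sds)
        return m

    expr = {
        "stable_a":   [10.0, 20.0, 40.0],   # tracks stable_b perfectly
        "stable_b":   [ 5.0, 10.0, 20.0],
        "unstable_c": [10.0, 80.0, 10.0],   # varies independently
    }
    m = genorm_m(expr)
    # The two co-varying genes score as more stable than the erratic one:
    assert m["stable_a"] < m["unstable_c"] and m["stable_b"] < m["unstable_c"]
    ```

    The full geNorm procedure then iteratively removes the least stable gene and recomputes M; the sketch shows only the core pairwise-variation statistic.
    
    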

  13. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Continuous Cooling Transformation diagram (CCT diagram). When an IT diagram is used in the heat process modelling, we suppose that a sudden cooling (instantaneous...processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable to give a true... This determination is however based on the following approximations: (i) a CCT diagram is valid only for the

  14. Total body neutron activation analysis of calcium: calibration and normalisation

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, N S.J.; Eastell, R; Ferrington, C M; Simpson, J D; Strong, J A [Western General Hospital, Edinburgh (UK); Smith, M A; Tothill, P [Royal Infirmary, Edinburgh (UK)

    1982-05-01

    An irradiation system has been designed, using a neutron beam from a cyclotron, which optimises the uniformity of activation of calcium. Induced activity is measured in a scanning, shadow-shield whole-body counter. Calibration has been effected and reproducibility assessed with three different types of phantom. Corrections were derived for variations in body height, depth and fat thickness. The coefficient of variation for repeated measurements of an anthropomorphic phantom was 1.8% for an absorbed dose equivalent of 13 mSv (1.3 rem). Measurements of total body calcium in 40 normal adults were used to derive normalisation factors which predict the normal calcium in a subject of given size and age. The coefficient of variation of normalised calcium was 6.2% in men and 6.6% in women, with the demonstration of an annual loss of 1.5% after the menopause. The narrow range should make single measurements useful for diagnostic purposes.

  15. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  16. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  17. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  18. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  19. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs

  20. Generating process model collections

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2017-01-01

    Business process management plays an important role in the management of organizations. More and more organizations describe their operations as business processes. It is common for organizations to have collections of thousands of business processes, but for reasons of confidentiality these

  1. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

    Despite that formal and informal quality aspects are of significant importance to business process modeling, there is only little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for quality of process models and focus

  2. OpenPrescribing: normalised data and software tool to research trends in English NHS primary care prescribing 1998-2016.

    Science.gov (United States)

    Curtis, Helen J; Goldacre, Ben

    2018-02-23

    We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
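
    The two-stage name mapping described above (exact match to the current formulary, then a looser match on similar names) can be sketched as follows; the function, the normalisation rule and the drug-to-code mappings below are invented for illustration, not taken from the BNF or the OpenPrescribing codebase:

    ```python
    def map_to_formulary(name, formulary):
        """formulary: {drug name: code}. Returns (code, match_type).
        Stage 1: exact name match. Stage 2: looser match on a
        normalised (lower-cased, hyphen-stripped) form of the name."""
        if name in formulary:
            return formulary[name], "exact"
        simplified = name.lower().replace("-", " ").strip()
        for known, code in formulary.items():
            if known.lower().replace("-", " ").strip() == simplified:
                return code, "similar"
        return None, "unmatched"

    # Invented formulary entries:
    formulary = {"Paracetamol": "CODE-A", "Co-codamol": "CODE-B"}
    assert map_to_formulary("Paracetamol", formulary) == ("CODE-A", "exact")
    assert map_to_formulary("co codamol", formulary) == ("CODE-B", "similar")
    assert map_to_formulary("Unknownium", formulary) == (None, "unmatched")
    ```

    Tracking the match type per drug is what allows the 87.5% exact / 6.5% similar / unmatched breakdown reported above to be computed.
    
    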

  3. A combination of low-dose bevacizumab and imatinib enhances vascular normalisation without inducing extracellular matrix deposition.

    Science.gov (United States)

    Schiffmann, L M; Brunold, M; Liwschitz, M; Goede, V; Loges, S; Wroblewski, M; Quaas, A; Alakus, H; Stippel, D; Bruns, C J; Hallek, M; Kashkar, H; Hacker, U T; Coutelle, O

    2017-02-28

    Vascular endothelial growth factor (VEGF)-targeting drugs normalise the tumour vasculature and improve access for chemotherapy. However, excessive VEGF inhibition fails to improve clinical outcome, and successive treatment cycles lead to incremental extracellular matrix (ECM) deposition, which limits perfusion and drug delivery. We show here, that low-dose VEGF inhibition augmented with PDGF-R inhibition leads to superior vascular normalisation without incremental ECM deposition thus maintaining access for therapy. Collagen IV expression was analysed in response to VEGF inhibition in liver metastasis of colorectal cancer (CRC) patients, in syngeneic (Panc02) and xenograft tumours of human colorectal cancer cells (LS174T). The xenograft tumours were treated with low (0.5 mg kg -1 body weight) or high (5 mg kg -1 body weight) doses of the anti-VEGF antibody bevacizumab with or without the tyrosine kinase inhibitor imatinib. Changes in tumour growth, and vascular parameters, including microvessel density, pericyte coverage, leakiness, hypoxia, perfusion, fraction of vessels with an open lumen, and type IV collagen deposition were compared. ECM deposition was increased after standard VEGF inhibition in patients and tumour models. In contrast, treatment with low-dose bevacizumab and imatinib produced similar growth inhibition without inducing detrimental collagen IV deposition, leading to superior vascular normalisation, reduced leakiness, improved oxygenation, more open vessels that permit perfusion and access for therapy. Low-dose bevacizumab augmented by imatinib selects a mature, highly normalised and well perfused tumour vasculature without inducing incremental ECM deposition that normally limits the effectiveness of VEGF targeting drugs.

  4. Selection of reference genes for normalisation of real-time RT-PCR in brain-stem death injury in Ovis aries

    Directory of Open Access Journals (Sweden)

    Fraser John F

    2009-07-01

    Full Text Available Abstract Background Heart and lung transplantation is frequently the only therapeutic option for patients with end-stage cardiorespiratory disease. Organ donation post brain stem death (BSD) is a pre-requisite, yet BSD itself causes such severe damage that many organs offered for donation are unusable, with lung being the organ most affected by BSD. In Australia and New Zealand, less than 50% of lungs offered for donation post BSD are suitable for transplantation, as compared with over 90% of kidneys, resulting in patients dying for lack of suitable lungs. Our group has developed a novel 24 h sheep BSD model to mimic the physiological milieu of the typical human organ donor. Characterisation of the gene expression changes associated with BSD is critical and will assist in determining the aetiology of lung damage post BSD. Real-time PCR is a highly sensitive method involving multiple steps from extraction to processing of RNA, so the choice of housekeeping genes is important in obtaining reliable results. Little information, however, is available on the expression stability of reference genes in the sheep pulmonary artery and lung. We aimed to establish a set of stably expressed reference genes for use as a standard for analysis of gene expression changes in BSD. Results We evaluated the expression stability of 6 candidate normalisation genes (ACTB, GAPDH, HGPRT, PGK1, PPIA and RPLP0) using real time quantitative PCR. There was a wide range of Ct-values within each tissue for pulmonary artery (15–24) and lung (16–25), but the expression pattern for each gene was similar across the two tissues. After geNorm analysis, ACTB and PPIA were shown to be the most stably expressed in the pulmonary artery and ACTB and PGK1 in the lung tissue of BSD sheep. Conclusion Accurate normalisation is critical in obtaining reliable and reproducible results in gene expression studies. This study demonstrates tissue associated variability in the selection of these

  5. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  6. Inference of financial networks using the normalised mutual information rate

    Science.gov (United States)

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics. PMID:29420644

  7. Inference of financial networks using the normalised mutual information rate.

    Science.gov (United States)

    Goh, Yong Kheng; Hasim, Haslifah M; Antonopoulos, Chris G

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics.
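
    The paper's normalised Mutual Information Rate accounts for temporal structure in the series; as a much simpler stand-in, the sketch below computes plain mutual information from a joint histogram of two discretised series, normalised by their joint entropy (one common convention). The binning scheme and data are our own:

    ```python
    import math
    from collections import Counter

    def discretise(xs, bins=4):
        """Map continuous values onto equal-width integer bins."""
        lo, hi = min(xs), max(xs)
        width = (hi - lo) / bins or 1.0   # guard against a constant series
        return [min(int((x - lo) / width), bins - 1) for x in xs]

    def normalised_mi(x, y, bins=4):
        """Mutual information of two discretised series / their joint entropy."""
        xd, yd = discretise(x, bins), discretise(y, bins)
        n = len(xd)
        pxy = Counter(zip(xd, yd))
        px, py = Counter(xd), Counter(yd)
        mi = sum(c / n * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
                 for (a, b), c in pxy.items())
        h_xy = -sum(c / n * math.log2(c / n) for c in pxy.values())
        return mi / h_xy if h_xy else 0.0

    series  = [0.1, 0.9, 0.2, 0.8, 0.15, 0.85, 0.25, 0.75]
    coupled = [2 * v for v in series]                   # perfectly dependent
    noise   = [0.5] * len(series)                       # constant: no information
    assert normalised_mi(series, coupled) > normalised_mi(series, noise)
    ```

    Thresholding such a pairwise score across all currency/index pairs is one simple way to obtain the kind of adjacency matrix from which the network properties discussed above (small-worldness, assortativity) can be computed.
    
    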

  8. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  9. The stories we tell: qualitative research interviews, talking technologies and the 'normalisation' of life with HIV.

    Science.gov (United States)

    Mazanderani, Fadhila; Paparini, Sara

    2015-04-01

    Since the earliest days of the HIV/AIDS epidemic, talking about the virus has been a key way affected communities have challenged the fear and discrimination directed against them and pressed for urgent medical and political attention. Today, HIV/AIDS is one of the most prolifically and intimately documented of all health conditions, with entrenched infrastructures, practices and technologies--what Vinh-Kim Nguyen has dubbed 'confessional technologies'--aimed at encouraging those affected to share their experiences. Among these technologies, we argue, is the semi-structured interview: the principal methodology used in qualitative social science research focused on patient experiences. Taking the performative nature of the research interview as a talking technology seriously has epistemological implications not merely for how we interpret interview data, but also for how we understand the role of research interviews in the enactment of 'life with HIV'. This paper focuses on one crucial aspect of this enactment: the contemporary 'normalisation' of HIV as 'just another' chronic condition--a process taking place at the level of individual subjectivities, social identities, clinical practices and global health policy, and of which social science research is a vital part. Through an analysis of 76 interviews conducted in London (2009-10), we examine tensions in the experiential narratives of individuals living with HIV in which life with the virus is framed as 'normal', yet where this 'normality' is beset with contradictions and ambiguities. Rather than viewing these as a reflection of resistances to or failures of the enactment of HIV as 'normal', we argue that, insofar as these contradictions are generated by the research interview as a distinct 'talking technology', they emerge as crucial to the normative (re)production of what counts as 'living with HIV' (in the UK) and are an inherent part of the broader performative 'normalisation' of the virus.

  10. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations were popularized in practice in recent decades. The most significant of these notations include Business Process Modeling Notation (OMG BPMN and several Unified Modeling Language (OMG UML extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  11. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  12. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    Conceptual issues in human information processing are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  13. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making, which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  14. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
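
    Most of the processes listed in the abstract reduce to systems of ordinary differential equations of the kind Simulink integrates graphically from block diagrams. As a language-neutral stand-in, the sketch below integrates a hypothetical two-member radionuclide decay chain with plain forward Euler; the decay constants are made up for illustration and are not from the paper.

```python
def decay_chain(n1_0=1.0, lam1=0.1, lam2=0.05, t_end=10.0, dt=1e-3):
    """Forward-Euler integration of a two-member decay chain:

        dN1/dt = -lam1 * N1
        dN2/dt =  lam1 * N1 - lam2 * N2

    Returns (N1, N2) at t_end for initial condition N1 = n1_0, N2 = 0.
    """
    steps = int(t_end / dt)
    n1, n2 = n1_0, 0.0
    for _ in range(steps):
        dn1 = -lam1 * n1
        dn2 = lam1 * n1 - lam2 * n2
        n1 += dt * dn1
        n2 += dt * dn2
    return n1, n2
```

    The result can be checked against the closed-form Bateman solution, N1(t) = e^(-lam1 t) and N2(t) = lam1/(lam2 - lam1) * (e^(-lam1 t) - e^(-lam2 t)); with the defaults these are about 0.368 and 0.477 at t = 10.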

  15. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

    Modelling and simulation are essential parts in the study of dynamic systems behaviours. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamic of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox software that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay process, delayed neutrons effect, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples

  16. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ... grinding force for the two cases were 9.07237 N/mm ... International Journal of Machine Tools & ...

  17. A normalised seawater strontium isotope curve. Possible implications for Neoproterozoic-Cambrian weathering rates and the further oxygenation of the Earth

    International Nuclear Information System (INIS)

    Shields, G.A.

    2007-01-01

    The strontium isotope composition of seawater is strongly influenced on geological time scales by changes in the rates of continental weathering relative to ocean crust alteration. However, the potential of the seawater 87Sr/86Sr curve to trace globally integrated chemical weathering rates has not been fully realised because ocean 87Sr/86Sr is also influenced by the isotopic evolution of Sr sources to the ocean. A preliminary attempt is made here to normalise the seawater 87Sr/86Sr curve to plausible trends in the 87Sr/86Sr ratios of the three major Sr sources: carbonate dissolution, silicate weathering and submarine hydrothermal exchange. The normalised curve highlights the Neoproterozoic-Phanerozoic transition as a period of exceptionally high continental influence, indicating that this interval was characterised by a transient increase in global weathering rates and/or by the weathering of unusually radiogenic crustal rocks. Close correlation between the normalised 87Sr/86Sr curve, a published seawater δ34S curve and atmospheric pCO2 models is used here to argue that elevated chemical weathering rates were a major contributing factor to the steep rise in seawater 87Sr/86Sr from 650 Ma to 500 Ma. Elevated weathering rates during the Neoproterozoic-Cambrian interval led to increased nutrient availability, organic burial and to the further oxygenation of Earth's surface environment. Use of normalised seawater 87Sr/86Sr curves will, it is hoped, help to improve future geochemical models of Earth System dynamics. (orig.)
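
    The normalisation idea can be caricatured as an isotope mass balance. Assuming seawater 87Sr/86Sr is set by mixing a continental (riverine) Sr flux with a hydrothermal flux, the continental share implied by a measured ratio follows from two-endmember mixing. The endmember ratios below are illustrative placeholders, not values from the paper, and real normalisation must also track the time evolution of the source compositions.

```python
def continental_fraction(r_sw, r_cont, r_hyd):
    """Two-endmember mixing: fraction of the Sr flux coming from
    continental sources implied by a seawater ratio r_sw, given the
    continental (r_cont) and hydrothermal (r_hyd) endmember ratios."""
    return (r_sw - r_hyd) / (r_cont - r_hyd)
```

    For example, with a hypothetical hydrothermal endmember of 0.7035 and a continental endmember of 0.7119, a seawater ratio of 0.7090 implies that roughly two thirds of the Sr flux is continental.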

  18. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale ... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases the efficiency of the model development process with respect to the time and resources needed (fast and effective ... In this way the model parameters that drive the main dynamic behavior can be identified, thus providing a better understanding of this type of process. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  19. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...

  20. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than those of a partial-credit IRT model based on outcome data alone.
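
    The decision-theoretic core of such a model is the standard Markov decision process machinery. The sketch below is generic value iteration over a small finite MDP, not the paper's psychometric estimation procedure; the transition and reward arrays in the usage example are hypothetical.

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Value iteration for a finite MDP.

    P: (A, S, S) transition tensor, P[a, s, s'] = Pr(s' | s, a).
    R: (A, S) expected immediate rewards.
    Returns the optimal state values V and a greedy policy (one action per state).
    """
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        Q = R + gamma * P @ V          # (A, S): action values under current V
        V_new = Q.max(axis=0)          # Bellman optimality backup
        if np.max(np.abs(V_new - V)) < tol:
            break
        V = V_new
    return V, Q.argmax(axis=0)
```

    In a two-state toy problem where action 1 moves from state 0 to an absorbing state 1 for reward 1, and action 0 stays put for reward 0, value iteration assigns V(0) = 1, V(1) = 0 and picks action 1 in state 0.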

  1. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  2. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  3. Introducing carrying capacity-based normalisation in LCA: framework and development of references at midpoint level

    DEFF Research Database (Denmark)

    Bjørn, Anders; Hauschild, Michael Zwicky

    2015-01-01

    carrying capacity-based normalisation references. The purpose of this article is to present a framework for normalisation against carrying capacity-based references and to develop average normalisation references (NR) for Europe and the world for all those midpoint impact categories commonly included... A literature review was carried out to identify scientifically sound thresholds for each impact category. Carrying capacities were then calculated from these thresholds and expressed in metrics identical to midpoint indicators, giving priority to those recommended by ILCD. NR was expressed as the carrying... ozone formation and soil quality were found to exceed carrying capacities several times. The developed carrying capacity-based normalisation references offer relevant supplementary reference information to the currently applied references based on society’s background interventions by supporting...
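
    The normalisation step itself is simple arithmetic: a normalisation reference (NR) is a carrying capacity divided across the reference population, and an impact score divided by NR tells how many "person-shares" of capacity it occupies, with values above 1 indicating the capacity is exceeded. The numbers in the sketch below are made up for illustration.

```python
def normalisation_reference(carrying_capacity, population):
    """Average per-person carrying capacity (NR) for an impact category."""
    return carrying_capacity / population

def occupied_share(impact_per_person, nr):
    """Fraction of a person's share of carrying capacity that an impact
    occupies; values above 1 mean the carrying capacity is exceeded."""
    return impact_per_person / nr
```

    For a hypothetical category with a capacity of 1000 units per year shared by 100 people, NR is 10 units per person-year, and a per-person impact of 25 units occupies 2.5 person-shares, i.e. 2.5 times the sustainable level.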

  4. Model of feedstock supply for processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model is developed for supplying raw materials to processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials. The model is distinguished by its orientation towards achieving a cumulative effect for the integrated structure, which acts as the criterion function; this is maximised by optimising capacities, the volumes and qualitative characteristics of raw material deliveries, the costs of industrial processing of raw materials, and the demand for dairy products.

  5. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin... Data were collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT...
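
    The abstract does not state how TTR was computed; the usual choice is Rosendaal's linear interpolation between consecutive INR measurements, sketched here under that assumption with hypothetical test data rather than the study's records.

```python
def time_in_therapeutic_range(days, inrs, low=2.0, high=3.0):
    """Rosendaal linear interpolation (assumed method): fraction of
    inter-test days whose interpolated INR falls inside [low, high].

    days: measurement days (integers, ascending); inrs: INR values on those days.
    """
    in_range = 0.0
    total = 0.0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        total += span
        for step in range(span):
            inr = i0 + (i1 - i0) * step / span  # linear interpolation
            if low <= inr <= high:
                in_range += 1
    return in_range / total
```

    A patient whose INR climbs linearly from 1.0 to 4.0 over ten days, for instance, spends only the middle three days inside the 2.0-3.0 range, giving a TTR of 0.3.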

  6. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  7. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  8. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.

  9. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in this article. These problems are characteristic of various areas of human activity, in particular of problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  10. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    First, the paper provides an integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  11. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

    Our work is a contribution to computer-aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in temporally linking the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the well-known cognitive overload associated with any complex and dangerous evolution of the process.

  12. Preoperative mapping of cortical language areas in adult brain tumour patients using PET and individual non-normalised SPM analyses

    International Nuclear Information System (INIS)

    Meyer, Philipp T.; Sturz, Laszlo; Schreckenberger, Mathias; Setani, Keyvan S.; Buell, Udalrich; Spetzger, Uwe; Meyer, Georg F.; Sabri, Osama

    2003-01-01

    In patients scheduled for the resection of perisylvian brain tumours, knowledge of the cortical topography of language functions is crucial in order to avoid neurological deficits. We investigated the applicability of statistical parametric mapping (SPM) without stereotactic normalisation for individual preoperative language function brain mapping using positron emission tomography (PET). Seven right-handed adult patients with left-sided brain tumours (six frontal and one temporal) underwent 12 oxygen-15 labelled water PET scans during overt verb generation and rest. Individual activation maps were calculated for P<0.005 and P<0.001 without anatomical normalisation and overlaid onto the individuals' magnetic resonance images for preoperative planning. Activations corresponding to Broca's and Wernicke's areas were found in five and six cases, respectively, for P<0.005 and in three and six cases, respectively, for P<0.001. One patient with a glioma located in the classical Broca's area without aphasic symptoms presented an activation of the adjacent inferior frontal cortex and of a right-sided area homologous to Broca's area. Four additional patients with left frontal tumours also presented activations of the right-sided Broca's homologue; two of these showed aphasic symptoms and two only a weak or no activation of Broca's area. Other frequently observed activations included bilaterally the superior temporal gyri, prefrontal cortices, anterior insulae, motor areas and the cerebellum. The middle and inferior temporal gyri were activated predominantly on the left. An SPM group analysis (P<0.05, corrected) in patients with left frontal tumours confirmed the activation pattern shown by the individual analyses. We conclude that SPM analyses without stereotactic normalisation offer a promising alternative for analysing individual preoperative language function brain mapping studies. The observed right frontal activations agree with proposed reorganisation processes, but

  13. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  14. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  15. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  16. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

    Full Text Available To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up in the following stages: formulation of the problem, analysis of the process to be modeled, production of the model, and verification, validation and implementation of the model. This article presents an economic model whose modeling uses mathematical equations and the MatLab software package, which helps us approximate an effective solution. The input data considered are the net cost, the direct cost and the total cost, and the link between them. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of our specific problem.

  17. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages, and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. ... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data.

  18. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality, and we also focus on the specific components.

  19. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes, and the opportunities for using Excel spreadsheets to obtain optimal solutions to problems or to calculate financial operations with the help of built-in functions.

  20. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  1. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il Kim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as an automatic motivation to which relatively little attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  2. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D) computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM.

  3. What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model

    Directory of Open Access Journals (Sweden)

    Lankshear Annette J

    2010-02-01

    Full Text Available Abstract Background There is a considerable evidence base for 'collaborative care' as a method to improve quality of care for depression, but an acknowledged gap between efficacy and implementation. This study utilises the Normalisation Process Model (NPM) to inform the process of implementation of collaborative care in both a future full-scale trial, and the wider health economy. Methods Application of the NPM to qualitative data collected in both focus groups and one-to-one interviews before and after an exploratory randomised controlled trial of a collaborative model of care for depression. Results Findings are presented as they relate to the four factors of the NPM (interactional workability, relational integration, skill-set workability, and contextual integration) and a number of necessary tasks are identified. Using the model, it was possible to observe that predictions about necessary work to implement collaborative care that could be made from analysis of the pre-trial data relating to the four different factors of the NPM were indeed borne out in the post-trial data. However, additional insights were gained from the post-trial interview participants who, unlike those interviewed before the trial, had direct experience of a novel intervention. The professional freedom enjoyed by more senior mental health workers may work both for and against normalisation of collaborative care, as those who wish to adopt new ways of working have the freedom to change their practice but are not obliged to do so. Conclusions The NPM provides a useful structure for both guiding and analysing the process by which an intervention is optimized for testing in a larger scale trial or for subsequent full-scale implementation.

  4. What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model.

    Science.gov (United States)

    Gask, Linda; Bower, Peter; Lovell, Karina; Escott, Diane; Archer, Janine; Gilbody, Simon; Lankshear, Annette J; Simpson, Angela E; Richards, David A

    2010-02-10

    There is a considerable evidence base for 'collaborative care' as a method to improve quality of care for depression, but an acknowledged gap between efficacy and implementation. This study utilises the Normalisation Process Model (NPM) to inform the process of implementation of collaborative care in both a future full-scale trial, and the wider health economy. Application of the NPM to qualitative data collected in both focus groups and one-to-one interviews before and after an exploratory randomised controlled trial of a collaborative model of care for depression. Findings are presented as they relate to the four factors of the NPM (interactional workability, relational integration, skill-set workability, and contextual integration) and a number of necessary tasks are identified. Using the model, it was possible to observe that predictions about necessary work to implement collaborative care that could be made from analysis of the pre-trial data relating to the four different factors of the NPM were indeed borne out in the post-trial data. However, additional insights were gained from the post-trial interview participants who, unlike those interviewed before the trial, had direct experience of a novel intervention. The professional freedom enjoyed by more senior mental health workers may work both for and against normalisation of collaborative care as those who wish to adopt new ways of working have the freedom to change their practice but are not obliged to do so. The NPM provides a useful structure for both guiding and analysing the process by which an intervention is optimized for testing in a larger scale trial or for subsequent full-scale implementation.

  5. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  6. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  7. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, the mass transfer, kinetic (reaction) and luminic steps. In this vein, recent works have developed models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing the controlling step of the overall process. In this paper, a simple method is explained which allows the controlling step to be determined. It is assumed that the reactor is divided into two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided whether reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, using titania Degussa P-25 as catalyst, is studied as a model reaction. The preliminary results obtained are presented here, showing that, in this case, reaction seems to occur only in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  8. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  9. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  10. Normalisation of spot urine samples to 24-h collection for assessment of exposure to uranium

    International Nuclear Information System (INIS)

    Marco, R.; Katorza, E.; Gonen, R.; German, U.; Tshuva, A.; Pelled, O.; Paz-tal, O.; Adout, A.; Karpas, Z.

    2008-01-01

    For dose assessment of workers at Nuclear Research Center Negev exposed to natural uranium, spot urine samples are analysed and the results are normalised to 24-h urine excretion based on a 'standard' man urine volume of 1.6 l d-1. In the present work, the urine volume, uranium level and creatinine concentration were determined in two or three 24-h urine collections from 133 male workers (319 samples) and 33 female workers (88 samples). Three volunteers provided urine spot samples from each voiding during a 24-h period and a good correlation was found between the relative level of creatinine and uranium in spot samples collected from the same individual. The results show that normalisation of uranium concentration to creatinine in a spot sample represents the 24-h content of uranium better than normalisation to the standard volume, and may be used to reduce the uncertainty of dose assessment based on spot samples. (authors)
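The two normalisation schemes compared in this record reduce to simple scalings; a minimal Python sketch follows. The 1.6 l/d 'standard man' volume is from the abstract, but the reference daily creatinine excretion used here is a hypothetical placeholder, not a figure from the study:

```python
def normalise_to_24h_by_creatinine(spot_uranium_ng_per_l,
                                   spot_creatinine_g_per_l,
                                   daily_creatinine_g=1.5):
    """Estimate 24-h uranium excretion (ng) from a spot sample by scaling
    the uranium-to-creatinine ratio by an assumed daily creatinine output.
    `daily_creatinine_g` is a hypothetical reference value for illustration."""
    ng_per_g_creatinine = spot_uranium_ng_per_l / spot_creatinine_g_per_l
    return ng_per_g_creatinine * daily_creatinine_g

def normalise_to_24h_by_volume(spot_uranium_ng_per_l, standard_volume_l=1.6):
    """The 'standard man' alternative: scale the spot concentration by an
    assumed daily urine volume of 1.6 l (as stated in the abstract)."""
    return spot_uranium_ng_per_l * standard_volume_l
```

The study's finding is that the creatinine-based estimate tracks true 24-h excretion more closely, because creatinine output varies less between individuals than urine volume does.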

  11. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Full Text Available Abstract Background Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remain limited. This paper aimed to (1 describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2 identify key issues and methodological challenges for advancing work in this field. Methods A 30-item instrument (Technology Adoption Readiness Scale (TARS for measuring normalisation processes in the context of e-health service interventions was developed on the basis on Normalization Process Theory (NPT. NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice was used by health care professionals. Results The developed instrument was pre-tested in two professional samples (N = 46; N = 231. Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1 greater attention to underlying theoretical assumptions and extent of translation work required; (2 the need for appropriate but flexible approaches to outcomes

  12. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  13. A novel approach to signal normalisation in atmospheric pressure ionisation mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Kirchhoff, Fabian; Geyer, Roland

    2012-07-01

    The aim of our study was to test an alternative principle of signal normalisation in LC-MS/MS. During analyses, post column infusion of the target analyte is done via a T-piece, generating an "area under the analyte peak" (AUP). The ratio of peak area to AUP is assessed as assay response. Acceptable analytical performance of this principle was found for an exemplary analyte. Post-column infusion may allow normalisation of ion suppression not requiring any additional standard compound. This approach can be useful in situations where no appropriate compound is available for classical internal standardisation. Copyright © 2012 Elsevier B.V. All rights reserved.
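The post-column infusion normalisation described above reduces to a simple ratio of chromatographic peak area to the infusion-generated AUP; a minimal sketch follows (function and variable names are illustrative, not from the paper):

```python
def normalised_response(analyte_peak_area, area_under_profile):
    """Assay response as the ratio of the analyte's chromatographic peak
    area to the 'area under the analyte peak' (AUP) generated by post-column
    infusion of the same analyte. Ion suppression attenuates both quantities
    by a similar factor, so the ratio compensates for it without requiring
    a separate internal-standard compound."""
    if area_under_profile <= 0:
        raise ValueError("AUP must be positive")
    return analyte_peak_area / area_under_profile
```

Because suppression scales numerator and denominator together, a suppressed run (e.g. both values reduced by 20%) yields the same response as an unsuppressed one.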

  14. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypotheses.

  15. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  16. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  17. A normalisation for the four-detector system for gamma-gamma angular correlation studies

    International Nuclear Information System (INIS)

    Kiang, G.C.; Chen, C.H.; Niu, W.F.

    1994-01-01

    A normalisation method for the multiple-HPGe-detector system is described. The system consists of four coaxial HPGe detectors with a CAMAC event-by-event data acquisition system, enabling six gamma-gamma angular coincidences to be measured simultaneously. An application to gamma-gamma correlation studies of 82Kr is presented and discussed. 3 figs., 6 refs. (author)

  18. Normalisation of the peaceful use of nuclear energy - consequences for its legal regulation

    International Nuclear Information System (INIS)

    Birkhofer, A.; Lukes, R.

    1985-01-01

    The five reports in this book deal with the importance of the peaceful use of nuclear energy, as well as with several aspects of normalisation. The range of the reports underlines their value in supporting the peaceful use of nuclear energy. (WG) [de

  19. Normalised quantitative polymerase chain reaction for diagnosis of tuberculosis-associated uveitis.

    Science.gov (United States)

    Barik, Manas Ranjan; Rath, Soveeta; Modi, Rohit; Rana, Rajkishori; Reddy, Mamatha M; Basu, Soumyava

    2018-05-01

    Polymerase chain reaction (PCR)-based diagnosis of tuberculosis-associated uveitis (TBU) in TB-endemic countries is challenging due to likelihood of latent mycobacterial infection in both immune and non-immune cells. In this study, we investigated normalised quantitative PCR (nqPCR) in ocular fluids (aqueous/vitreous) for diagnosis of TBU in a TB-endemic population. Mycobacterial copy numbers (mpb64 gene) were normalised to host genome copy numbers (RNAse P RNA component H1 [RPPH1] gene) in TBU (n = 16) and control (n = 13) samples (discovery cohort). The mpb64:RPPH1 ratios (normalised value) from each TBU and control sample were tested against the current reference standard i.e. clinically-diagnosed TBU, to generate Receiver Operating Characteristic (ROC) curves. The optimum cut-off value of mpb64:RPPH1 ratio (0.011) for diagnosing TBU was identified from the highest Youden index. This cut-off value was then tested in a different cohort of TBU and controls (validation cohort, 20 cases and 18 controls), where it yielded specificity, sensitivity and diagnostic accuracy of 94.4%, 85.0%, and 89.4% respectively. The above values for conventional quantitative PCR (≥1 copy of mpb64 per reaction) were 61.1%, 90.0%, and 74.3% respectively. Normalisation markedly improved the specificity and diagnostic accuracy of quantitative PCR for diagnosis of TBU. Copyright © 2018 Elsevier Ltd. All rights reserved.
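The normalisation and cutoff classification described in this record can be sketched in Python. The cutoff (0.011) and the validation-cohort performance figures come from the abstract; the function names and the treatment of the exact boundary value are assumptions:

```python
def mpb64_rpph1_ratio(mpb64_copies, rpph1_copies):
    """Normalised value: mycobacterial mpb64 copies per host RPPH1
    (genome) copy, compensating for the amount of host material sampled."""
    return mpb64_copies / rpph1_copies

def classify_tbu(ratio, cutoff=0.011):
    """Call the sample positive for TB-associated uveitis when the
    normalised ratio exceeds the study's optimum cutoff (0.011).
    Whether the boundary itself counts as positive is assumed here."""
    return ratio > cutoff

# Diagnostic performance from confusion-matrix counts, as used to
# evaluate the cutoff against the clinical reference standard.
def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    return true_neg / (true_neg + false_pos)
```

For example, the validation cohort's reported 85.0% sensitivity over 20 cases and 94.4% specificity over 18 controls correspond to counts of roughly 17 true positives / 3 false negatives and 17 true negatives / 1 false positive.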

  20. Relationships between the normalised difference vegetation index and temperature fluctuations in post-mining sites

    Czech Academy of Sciences Publication Activity Database

    Bujalský, L.; Jirka, V.; Zemek, František; Frouz, J.

    2018-01-01

    Roč. 32, č. 4 (2018), s. 254-263 ISSN 1748-0930 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:67179843 Keywords : temperature * normalised difference * vegetation index (NDVI) * vegetation cover * remote sensing Subject RIV: DF - Soil Science Impact factor: 1.078, year: 2016

  1. An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants

    DEFF Research Database (Denmark)

    Møller, Jesper; Pettitt, A. N.; Reeves, R.

    2006-01-01

    Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable metho...
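The auxiliary-variable idea behind this record can be illustrated with the closely related exchange algorithm (a later simplification of this construction, not the paper's exact method): drawing an auxiliary data set exactly from the model at the proposed parameter makes the intractable normalising constant cancel from the Metropolis-Hastings ratio. The sketch below uses a toy Ising-style model small enough to sample exactly by enumeration; the model, prior range, and all names are illustrative assumptions:

```python
import itertools
import math
import random

# Toy 3x3 Ising-like model: q(x | theta) = exp(theta * s(x)) / Z(theta),
# where s(x) counts agreeing minus disagreeing nearest-neighbour pairs.
# Z(theta) plays the role of the intractable normalising constant; the grid
# is kept small enough that the exact sampling required by the construction
# is possible by brute-force enumeration of all 2^9 = 512 configurations.
N = 3
PAIRS = [((i, j), (i, j + 1)) for i in range(N) for j in range(N - 1)] + \
        [((i, j), (i + 1, j)) for i in range(N - 1) for j in range(N)]
CELLS = [(i, j) for i in range(N) for j in range(N)]

def suff_stat(x):
    return sum(1.0 if x[a] == x[b] else -1.0 for a, b in PAIRS)

# Sufficient statistics of every configuration, precomputed once.
STATS = [suff_stat(dict(zip(CELLS, vals)))
         for vals in itertools.product([-1, 1], repeat=N * N)]

def sample_stat_exact(theta, rng):
    """Draw y ~ q(. | theta) exactly by enumeration; return s(y)."""
    weights = [math.exp(theta * s) for s in STATS]
    r = rng.random() * sum(weights)
    for s, w in zip(STATS, weights):
        r -= w
        if r <= 0.0:
            return s
    return STATS[-1]

def exchange_mcmc(s_obs, n_iter, rng, step=0.2):
    """Exchange algorithm: Z(theta) cancels from the acceptance ratio
    because the auxiliary data y is drawn exactly at the proposed theta."""
    theta, samples = 0.0, []
    for _ in range(n_iter):
        theta_new = theta + rng.uniform(-step, step)
        if -1.0 <= theta_new <= 1.0:          # uniform prior on [-1, 1]
            s_y = sample_stat_exact(theta_new, rng)
            log_alpha = (theta_new - theta) * (s_obs - s_y)
            if math.log(rng.random()) < log_alpha:
                theta = theta_new
        samples.append(theta)
    return samples

rng = random.Random(0)
trace = exchange_mcmc(s_obs=12.0, n_iter=200, rng=rng)  # fully aligned grid
```

With a fully aligned observation (maximal s(x) = 12), the posterior mass concentrates on positive coupling values, which the trace reflects.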

  2. Normalisation et certification dans le photovoltaïque: perspectives juridiques.

    OpenAIRE

    Boy , Laurence

    2012-01-01

    International audience; Legal approach to standardization and certification in the photovoltaic industry in France. Sources of law. Stakeholders' liabilities. Competition aspects.

  3. Bounded real and positive real balanced truncation using Σ-normalised coprime factors

    NARCIS (Netherlands)

    Trentelman, H.L.

    2009-01-01

    In this article, we will extend the method of balanced truncation using normalised right coprime factors of the system transfer matrix to balanced truncation with preservation of half line dissipativity. Special cases are preservation of positive realness and bounded realness. We consider a half

  4. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems and for the reduction associated with measurement processes and the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory, they can, in principle, be tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing a toy model in detail, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  5. Hillslope runoff processes and models

    Science.gov (United States)

    Kirkby, Mike

    1988-07-01

    Hillslope hydrology is concerned with the partition of precipitation as it passes through the vegetation and soil between overland flow and subsurface flow. Flow follows routes which attenuate and delay the flow to different extents, so that a knowledge of the relevant mechanisms is important. In the 1960s and 1970s, hillslope hydrology developed as a distinct topic through the application of new field observations to develop a generation of physically based forecasting models. In its short history, theory has continually been overturned by field observation. Thus the current tendency, particularly among temperate zone hydrologists, to dismiss all Hortonian overland flow as a myth, is now being corrected by a number of significant field studies which reveal the great range in both climatic and hillslope conditions. Some recent models have generally attempted to simplify the processes acting, for example including only vertical unsaturated flow and lateral saturated flows. Others explicitly forecast partial or contributing areas. With hindsight, the most complete and distributed models have generally shown little forecasting advantage over simpler approaches, perhaps trending towards reliable models which can run on desk top microcomputers. The variety now being recognised in hillslope hydrological responses should also lead to models which take account of more complex interactions, even if initially with a less secure physical and mathematical basis than the Richards equation. In particular, there is a need to respond to the variety of climatic responses, and to spatial variability on and beneath the surface, including the role of seepage macropores and pipes which call into question whether the hillside can be treated as a Darcian flow system.

  6. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  7. The contribution of online content to the promotion and normalisation of female genital cosmetic surgery: a systematic review of the literature.

    Science.gov (United States)

    Mowat, Hayley; McDonald, Karalyn; Dobson, Amy Shields; Fisher, Jane; Kirkman, Maggie

    2015-11-25

    Women considering female genital cosmetic surgery (FGCS) are likely to use the internet as a key source of information during the decision-making process. The aim of this systematic review was to determine what is known about the role of the internet in the promotion and normalisation of female genital cosmetic surgery and to identify areas for future research. Eight social science, medical, and communication databases and Google Scholar were searched for peer-reviewed papers published in English. Results from all papers were analysed to identify recurring and unique themes. Five papers met inclusion criteria. Three of the papers reported investigations of website content of FGCS providers, a fourth compared motivations for labiaplasty publicised on provider websites with those disclosed by women in online communities, and the fifth analysed visual depictions of female genitalia in online pornography. Analysis yielded five significant and interrelated patterns of representation, each functioning to promote and normalise the practice of FGCS: pathologisation of genital diversity; female genital appearance as important to wellbeing; characteristics of women's genitals are important for sex life; female body as degenerative and improvable through surgery; and FGCS as safe, easy, and effective. A significant gap was identified in the literature: the ways in which user-generated content might function to perpetuate, challenge, or subvert the normative discourses prevalent in online pornography and surgical websites. Further research is needed to contribute to knowledge of the role played by the internet in the promotion and normalisation of female genital cosmetic surgery.

  8. Symmorphosis through dietary regulation: a combinatorial role for proteolysis, autophagy and protein synthesis in normalising muscle metabolism and function of hypertrophic mice after acute starvation.

    Directory of Open Access Journals (Sweden)

    Henry Collins-Hooper

    Animals are imbued with adaptive mechanisms, spanning from the tissue/organ to the cellular scale, which ensure that processes of homeostasis are preserved in the landscape of size change. However, we and others have postulated that the degree of adaptation is limited and that, once outside the normal levels of size fluctuation, cells and tissues function in an aberrant manner. In this study we examine the function of muscle in the myostatin null mouse, which is an excellent model for hypertrophy beyond levels of normal growth, and the consequences of acute starvation to restore mass. We show that muscle growth is sustained through protein synthesis driven by Serum/Glucocorticoid Kinase 1 (SGK1) rather than Akt1. Furthermore, our metabonomic profiling of hypertrophic muscle shows that carbon from nutrient sources is being channelled into the production of biomass rather than ATP production. However, the muscle displays elevated levels of autophagy and decreased levels of muscle tension. We demonstrate that the myostatin null muscle is acutely sensitive to changes in diet, activating both the proteolytic and autophagy programmes and shutting down protein synthesis more extensively than is the case for wild-types. Poignantly, we show that acute starvation, which is detrimental to wild-type animals, is beneficial in terms of metabolism and muscle function in myostatin null mice by normalising tension production.

  9. 18S rRNA is a reliable normalisation gene for real time PCR based on influenza virus infected cells

    Directory of Open Access Journals (Sweden)

    Kuchipudi Suresh V

    2012-10-01

    Background: One requisite of quantitative reverse transcription PCR (qRT-PCR) is to normalise the data with an internal reference gene that is invariant regardless of treatment, such as virus infection. Several studies have found variability in the expression of commonly used housekeeping genes, such as beta-actin (ACTB) and glyceraldehyde-3-phosphate dehydrogenase (GAPDH), under different experimental settings. However, ACTB and GAPDH remain widely used in studies of the host gene response to virus infections, including influenza viruses. To date no detailed study has been described that compares the suitability of commonly used housekeeping genes in influenza virus infections. The present study evaluated several commonly used housekeeping genes [ACTB; GAPDH; 18S ribosomal RNA (18S rRNA); ATP synthase, H+ transporting, mitochondrial F1 complex, beta polypeptide (ATP5B); and ATP synthase, H+ transporting, mitochondrial Fo complex, subunit C1 (subunit 9) (ATP5G1)] to identify the most stably expressed gene in human, pig, chicken and duck cells infected with a range of influenza A virus subtypes. Results: The relative expression stability of commonly used housekeeping genes was determined in primary human bronchial epithelial cells (HBECs), pig tracheal epithelial cells (PTECs), and chicken and duck primary lung-derived cells infected with five influenza A virus subtypes. Analysis of qRT-PCR data from virus- and mock-infected cells using the NormFinder and BestKeeper software programmes found that 18S rRNA was the most stable gene in HBECs, PTECs and avian lung cells. Conclusions: Based on the presented data from cell culture models (HBECs, PTECs, and chicken and duck lung cells) infected with a range of influenza viruses, we found that 18S rRNA is the most stable reference gene for normalising qRT-PCR data.
    Expression levels of the other housekeeping genes evaluated in this study (including ACTB and GAPDH) were highly affected by influenza virus infection and
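
    The stability screening described above can be sketched in miniature. BestKeeper, for instance, ranks candidate reference genes by the spread of their quantification-cycle (Cq) values across samples; the Cq numbers below are invented for illustration, not taken from the study.

```python
import statistics

# Hypothetical Cq (quantification cycle) values across mock- and
# virus-infected samples; a smaller spread means a more stable gene.
cq_values = {
    "18S_rRNA": [9.1, 9.2, 9.0, 9.1, 9.2],
    "ACTB":     [17.0, 18.4, 19.1, 17.6, 20.2],
    "GAPDH":    [16.2, 17.9, 18.8, 16.5, 19.5],
}

def rank_by_stability(cq):
    """Rank candidate reference genes by the standard deviation of
    their Cq values, the core idea behind BestKeeper-style screening."""
    return sorted(cq, key=lambda gene: statistics.stdev(cq[gene]))

ranking = rank_by_stability(cq_values)
```

    Tools like NormFinder add a model-based decomposition of intra- and inter-group variation on top of this simple spread criterion.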

  10. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties...... and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes....
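
    A minimal illustration of a Cox ("doubly stochastic Poisson") process: the intensity is itself random, so one first draws an intensity level and then simulates an ordinary Poisson process conditional on it. The exponential prior on the intensity here is an arbitrary choice for the sketch.

```python
import random

def simulate_cox(rate_mean=20.0, t_max=1.0, seed=1):
    """Minimal Cox (doubly stochastic Poisson) process on [0, t_max]:
    draw a random intensity level, then simulate an ordinary Poisson
    process conditional on that intensity."""
    rng = random.Random(seed)
    lam = rng.expovariate(1.0 / rate_mean)   # random intensity level
    points, t = [], 0.0
    while True:
        t += rng.expovariate(lam)            # exponential inter-arrivals
        if t > t_max:
            return lam, points
        points.append(t)

lam, points = simulate_cox()
```

    Log Gaussian and shot noise Cox processes replace the single random level with a spatially varying random intensity field, but the conditional-Poisson structure is the same.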

  11. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering...... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...... with a major international engineering company....

  12. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  13. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  14. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  15. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

    Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if being mostly applied to multistep processes, single process steps may be so complex by nature that the needed models to describe them must include multiphysics...... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps...... for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as describing the models in brief mathematical details. Alongside with relevant references to the original work...

  16. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

    This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are then selected and a real example is modeled, reviewing how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfying results of the contemporary process modeling languages on the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  17. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  18. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

    International audience; This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then pres...

  19. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    The article outlines the application of the process approach to the functional description of a designed IT system supporting the operations of a secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and the target processes (“to be”) that use RFID technology for the purpose of their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  20. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
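
    The correlation-analysis idea at the heart of the model can be sketched with plain Pearson correlations over assessment results. The practice names and scores below are invented for illustration and are not CMMI data.

```python
import math
import statistics

# Hypothetical process-assessment scores (0-10) for three CMMI-style
# practice areas across six assessed projects (invented data).
scores = {
    "requirements_mgmt": [3, 5, 6, 4, 7, 8],
    "project_planning":  [2, 5, 7, 4, 6, 9],
    "config_mgmt":       [8, 6, 3, 7, 4, 2],
}

def pearson(x, y):
    """Pearson correlation between two equally long score lists."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Strongly correlated practices are candidates for a joint improvement plan.
r_plan = pearson(scores["requirements_mgmt"], scores["project_planning"])
r_conf = pearson(scores["requirements_mgmt"], scores["config_mgmt"])
```

    In the paper's setting, such correlations would be derived from assessment findings and empirical improvement data rather than raw scores.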

  1. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

    Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools to adequately support model comprehension. In this paper we adapt the concept of syntax

  2. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  3. REPORTING SOCIETAL : LIMITES ET ENJEUX DE LA PROPOSITION DE NORMALISATION INTERNATIONALE " GLOBAL REPORTING INITIATIVE "

    OpenAIRE

    Michel Capron; Françoise Quairel

    2003-01-01

    International audience; Drawing on Anglo-American accounting standardisation, the Global Reporting Initiative (GRI) proposes a framework for the voluntary disclosure of societal information. Its transposition presents limits that in fact render its principles inapplicable. Nevertheless, it tends to establish itself, and large companies may find in it a means of avoiding binding regulation.

  4. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  5. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

    Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Processes models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  6. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  7. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

    Processes are central to geographic information science; yet geographic information systems (GIS) lack capabilities to represent process-related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, which is shown by means of an example.

  8. MORTALITY MODELING WITH LEVY PROCESSES

    Directory of Open Access Journals (Sweden)

    M. Serhat Yucel, FRM

    2012-07-01

    Mortality and longevity risk is usually one of the main risk components in economic capital models of insurance companies. Above all, future mortality expectations are an important input in the modeling and pricing of long term products. Deviations from the expectation can lead an insurance company even to default if sufficient reserves and capital are not held. Thus, modeling mortality time series accurately is a vital concern for the insurance industry. The aim of this study is to perform distributional and spectral testing on the mortality data and to carry out discrete and continuous time modeling. We believe the results and the techniques used in this study will provide a basis for a Value at Risk formula in the case of mortality.
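
    The kind of model alluded to here can be sketched as a jump-diffusion: Gaussian increments capture gradual mortality improvement, while a compound-Poisson term adds rare upward shocks (e.g. pandemics). All parameters below are illustrative, not calibrated to any data.

```python
import math
import random

def simulate_mortality_index(m0=1.0, mu=-0.01, sigma=0.02,
                             jump_rate=0.1, jump_size=0.15,
                             years=50, seed=7):
    """Toy Levy-style path for a mortality index: a Gaussian diffusion
    term models gradual improvement, while occasional jumps model
    shocks (illustrative parameters only)."""
    rng = random.Random(seed)
    path = [m0]
    for _ in range(years):
        increment = rng.gauss(mu, sigma)       # diffusion part
        if rng.random() < jump_rate:           # compound-Poisson jump part
            increment += jump_size
        path.append(path[-1] * math.exp(increment))
    return path

path = simulate_mortality_index()
```

    A Value at Risk figure would then be read off a quantile of many such simulated paths.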

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  11. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

    Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models: 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition; 4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models; 5 Triconnected Abstraction; 5.1 Abstraction Rules; 5.2 Abstraction Algorithm; 6 Related Work and Conclusions

  12. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  13. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  14. Birth/death process model

    Science.gov (United States)

    Solloway, C. B.; Wakeland, W.

    1976-01-01

    First-order Markov model developed on digital computer for population with specific characteristics. System is user interactive, self-documenting, and does not require user to have complete understanding of underlying model details. Contains thorough error-checking algorithms on input and default capabilities.
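
    A first-order Markov birth/death chain of the kind described can be sketched in a few lines; the birth and death probabilities below are arbitrary illustrative values, not parameters from the original code.

```python
import random

def birth_death_step(n, p_birth=0.3, p_death=0.2, rng=random):
    """One transition of a first-order Markov birth/death chain: with
    probability p_birth the population grows by one, with probability
    p_death (if non-empty) it shrinks by one, otherwise it stays put."""
    u = rng.random()
    if u < p_birth:
        return n + 1
    if u < p_birth + p_death and n > 0:
        return n - 1
    return n

def simulate_population(n0=10, steps=100, seed=42):
    rng = random.Random(seed)
    n = n0
    for _ in range(steps):
        n = birth_death_step(n, rng=rng)
    return n

final_n = simulate_population()
```

    The "first-order" property is visible in the code: each step depends only on the current population size n.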

  15. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.
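
    The projection from a collaborative model to one enterprise's interface process can be sketched with a toy representation: interactions the role does not take part in are dropped, and the rest are relabelled as send or receive from that role's viewpoint. The buyer/seller interactions below are hypothetical, and a real MDA transformation would of course operate on richer process metamodels.

```python
# Hypothetical collaborative process: a list of message interactions
# (sender, receiver, activity) describing the global view.
collaboration = [
    ("Buyer", "Seller", "purchase_order"),
    ("Seller", "Buyer", "invoice"),
    ("Buyer", "Seller", "payment"),
]

def interface_process(model, role):
    """Project the global collaborative model onto one enterprise:
    keep only the interactions the role takes part in, relabelled as
    send/receive from that role's viewpoint."""
    view = []
    for sender, receiver, activity in model:
        if sender == role:
            view.append(("send", activity))
        elif receiver == role:
            view.append(("receive", activity))
    return view

seller_view = interface_process(collaboration, "Seller")
```

    Because both interface processes are derived from the same global model, their send/receive pairs match up, which is what makes them interoperable by construction.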

  16. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubinina

    2015-06-01

    The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed in view of the complexity and differentiation of existing methods, as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application are studied through a visualization model of a retailer's activity. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes, owing to its integrated systemological capabilities. A visualized simulation model of the retailer's business process "sales" "as is" was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of this business process.

  17. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    There are now many business process modeling techniques. This article reports research on the differences between them. For each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques, with a comparative framework based on two criteria: notation and how the technique works when applied to Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The conclusion recommends business process modeling techniques that are easy to use and serves as a basis for evaluating further modelling techniques.

  18. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  19. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  20. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  1. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  2. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....
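
    The "sequential modular" approach mentioned above can be illustrated with a toy recycle flowsheet: tear the recycle stream, evaluate the unit modules in sequence, and iterate (successive substitution) until the torn stream converges. The units, conversion and split fraction below are invented for the sketch.

```python
def mixer(feed, recycle):
    return feed + recycle

def reactor(inflow, conversion=0.7):
    return inflow * (1.0 - conversion)   # unconverted material leaving

def splitter(outflow, purge_frac=0.4):
    purge = outflow * purge_frac
    return purge, outflow - purge        # (purge, recycle)

def solve_flowsheet(feed=100.0, tol=1e-8):
    """Sequential-modular solve of mixer -> reactor -> splitter with a
    recycle: tear the recycle stream and iterate (successive
    substitution) until the torn stream stops changing."""
    recycle = 0.0
    while True:
        _, new_recycle = splitter(reactor(mixer(feed, recycle)))
        if abs(new_recycle - recycle) < tol:
            return new_recycle
        recycle = new_recycle

converged_recycle = solve_flowsheet()
```

    The "equation oriented" alternative would instead assemble all unit balances into one nonlinear system and solve it simultaneously.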

  3. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....... the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension...

  4. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  5. Process generalization in conceptual models

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    In conceptual modeling, the universe of discourse (UoD) is divided into classes which have a taxonomic structure. The classes are usually defined in terms of attributes (all objects in a class share attribute names) and possibly of events. For example, the class of employees is the set of objects to

  6. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)

  7. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study the rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Haeno and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time Markov chains. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.
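The continuous-time branching mechanism described in this abstract can be illustrated with a minimal Gillespie-style simulation (stdlib Python; the birth/death rates, population cap, and the Monte-Carlo check of the d/b extinction probability are illustrative choices, not taken from the book):

```python
import random

def branching(b, d, z0=1, t_max=50.0, cap=300, rng=random):
    """Gillespie simulation of a binary branching process: each of the
    z cells independently divides at rate b and dies at rate d."""
    z, t = z0, 0.0
    while 0 < z < cap and t < t_max:
        t += rng.expovariate(z * (b + d))   # waiting time to next event
        if rng.random() < b / (b + d):
            z += 1                          # division
        else:
            z -= 1                          # death
    return z

# For a supercritical process (b > d) started from one cell, the
# extinction probability is d/b; estimate it by Monte Carlo.
random.seed(1)
runs = 600
extinct = sum(branching(1.0, 0.5) == 0 for _ in range(runs)) / runs
```

With b = 1.0 and d = 0.5 the estimated extinction frequency should fall close to d/b = 0.5.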

  8. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested a technology for processing model residential solid waste. They have developed and built a pilot plasma unit based on a plasma chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  9. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and for the sensitivity analysis of such indices to changes in organization parameters and user requirements. The approach uses a timed Petri net and an object-oriented top-down model specification. Results demonstrate the model's representativeness and its usefulness in verifying process conformance to expectations and in performing continuous process improvement and optimization.

  10. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  11. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case. Empirical validation is residual in nature, because its conclusions rest on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures in the simulation model, and it can also serve as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: Sensitivity analysis, which can be performed with DSA (differential sensitivity analysis) or with MCSA (Monte-Carlo sensitivity analysis). Searching for the optimal domains of the input parameters; a procedure based on Monte-Carlo methods and cluster techniques has been developed to find these domains. Residual analysis, carried out in both the time domain and the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation of a thermal simulation model of buildings is presented, studying the behaviour of building components in a Test Cell of LECE of CIEMAT, Spain. (Author) 17 refs
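The MCSA step of such a methodology can be sketched as follows: sample the uncertain inputs over their domains, run the model, and correlate each input with the output to rank sensitivities. The toy steady-state thermal model, its coefficients, and the parameter bounds below are invented for illustration:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def toy_thermal_model(u_wall, solar_gain, infiltration):
    # Hypothetical steady-state zone temperature (deg C).
    return 20.0 + 1.2 * solar_gain - 5.0 * u_wall - 2.0 * infiltration

def mc_sensitivity(model, bounds, n=4000, seed=0):
    """Monte-Carlo sensitivity analysis: uniform sampling of the input
    domains, then input-output correlation as a sensitivity measure."""
    rng = random.Random(seed)
    cols = {k: [] for k in bounds}
    out = []
    for _ in range(n):
        x = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        for k in bounds:
            cols[k].append(x[k])
        out.append(model(**x))
    return {k: pearson(cols[k], out) for k in bounds}

sens = mc_sensitivity(toy_thermal_model,
                      {"u_wall": (0.2, 1.0),
                       "solar_gain": (0.0, 5.0),
                       "infiltration": (0.1, 0.6)})
```

The signs and magnitudes of the correlations identify which parameters the simulated output is most sensitive to.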

  12. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour- oriented style supports composition and abstraction in a natural way. However, analysis of

  13. Parameter identification in multinomial processing tree models

    NARCIS (Netherlands)

    Schmittmann, V.D.; Dolan, C.V.; Raijmakers, M.E.J.; Batchelder, W.H.

    2010-01-01

    Multinomial processing tree models form a popular class of statistical models for categorical data that have applications in various areas of psychological research. As in all statistical models, establishing which parameters are identified is necessary for model inference and selection on the basis

  14. Volatility Determination in an Ambit Process Setting

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Graversen, Svend-Erik

    The probability limit behaviour of normalised quadratic variation is studied for a simple tempo-spatial ambit process, with particular regard to the question of volatility memorylessness.

  15. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase-interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints are considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows the implementation of an efficient solving strategy and a concise error diagnosis. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  16. Confluence via strong normalisation in an algebraic λ-calculus with rewriting

    Directory of Open Access Journals (Sweden)

    Pablo Buiras

    2012-03-01

    The linear-algebraic lambda-calculus and the algebraic lambda-calculus are untyped lambda-calculi extended with arbitrary linear combinations of terms. The former presents the axioms of linear algebra in the form of a rewrite system, while the latter uses equalities. When given by rewrites, algebraic lambda-calculi are not confluent unless further restrictions are added. We provide a type system for the linear-algebraic lambda-calculus enforcing strong normalisation, which gives back confluence. The type system allows an abstract interpretation in System F.

  17. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
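The MC-over-IPM idea can be sketched as two stacked, interacting unit operations whose process-parameter distributions are propagated to a final CQA, from which an OOS probability is estimated. All unit-operation equations, parameter distributions, and the specification limit below are hypothetical, not data from the study:

```python
import random

def fermentation(titer_set, pp_temp, rng):
    # Unit operation 1 (hypothetical): titer drops away from 37 degC.
    return titer_set - 0.4 * abs(pp_temp - 37.0) + rng.gauss(0.0, 0.1)

def capture(titer, pp_load, rng):
    # Unit operation 2 (hypothetical): yield falls at high column load.
    step_yield = 0.95 - 0.02 * max(0.0, pp_load - 30.0) + rng.gauss(0.0, 0.01)
    return titer * step_yield

def simulate_cqa(n=20000, seed=0, lsl=4.2):
    """Propagate PP distributions through both unit operations and
    estimate the probability of an out-of-specification (OOS) batch."""
    rng = random.Random(seed)
    oos = 0
    for _ in range(n):
        temp = rng.gauss(37.0, 0.5)     # PP distributions, e.g. from
        load = rng.gauss(30.0, 2.0)     # development / qualification data
        cqa = capture(fermentation(5.0, temp, rng), load, rng)
        oos += cqa < lsl                # count below the lower spec limit
    return oos / n

p_oos = simulate_cqa()
```

Re-running the simulation with tightened parameter distributions shows how a control strategy lowers the predicted OOS frequency.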

  18. Revising process models through inductive learning

    NARCIS (Netherlands)

    Maggi, F.M.; Corapi, D.; Russo, A.; Lupu, E.; Visaggio, G.; Muehlen, zur M.; Su, J.

    2011-01-01

    Discovering the Business Process (BP) model underpinning existing practices through analysis of event logs, allows users to understand, analyse and modify the process. But, to be useful, the BP model must be kept in line with practice throughout its lifetime, as changes occur to the business

  19. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  20. APROMORE : an advanced process model repository

    NARCIS (Netherlands)

    La Rosa, M.; Reijers, H.A.; Aalst, van der W.M.P.; Dijkman, R.M.; Mendling, J.; Dumas, M.; García-Bañuelos, L.

    2011-01-01

    Business process models are becoming available in large numbers due to their widespread use in many industrial applications such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model

  1. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  2. Distillation modeling for a uranium refining process

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.
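The incremental equilibrium view of such a batch distillation can be sketched as a Rayleigh-type calculation: at each step a small fraction of the charge is vaporised in equilibrium with the liquid, so the more volatile salt is preferentially removed. The relative volatility, charge composition, and step size below are illustrative assumptions, not process data:

```python
def rayleigh_step(salt, uranium, frac, alpha):
    """Vaporise a small fraction of the charge in equilibrium with the
    liquid; alpha is the assumed salt/uranium relative volatility."""
    total = salt + uranium
    x = salt / total                            # liquid mole fraction of salt
    y = alpha * x / (1.0 + (alpha - 1.0) * x)   # equilibrium vapor fraction
    dv = frac * total                           # moles distilled this step
    return salt - y * dv, uranium - (1.0 - y) * dv

# Hypothetical charge: 9 mol eutectic salt, 1 mol uranium, alpha = 50.
salt, u = 9.0, 1.0
for _ in range(2000):
    salt, u = rayleigh_step(salt, u, 0.001, alpha=50.0)
```

After the loop the salt fraction of the residue has dropped sharply while most of the uranium remains, which is the intended separation.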

  3. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

    Contemporary organizations invest much efforts in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  4. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

    Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed: the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's needs when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  5. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly as in the case of correctly written algorithms) contain characteristics that can be mathematically described, and that it will be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that the mentioned area had already been the subject of research investigation in the past: thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.
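A few simple structural measures of the kind surveyed in such reviews (size, density, connector degree) can be computed directly from a process graph. The particular selection and formulas here are a hypothetical illustration, not the twenty-two measures found by the review:

```python
def model_metrics(nodes, edges):
    """Size, density and average connector degree of a process graph
    (a hypothetical selection of measures from the literature)."""
    n, e = len(nodes), len(edges)
    density = e / (n * (n - 1)) if n > 1 else 0.0
    degree = {v: 0 for v in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    connectors = [v for v in nodes if degree[v] > 2]   # splits/joins
    avg_conn = (sum(degree[v] for v in connectors) / len(connectors)
                if connectors else 0.0)
    return {"size": n, "density": density, "avg_connector_degree": avg_conn}

# A four-task model where "b" splits the flow into two branches.
metrics = model_metrics(["a", "b", "c", "d"],
                        [("a", "b"), ("b", "c"), ("b", "d")])
```

Measures like these let a tool flag models that are unusually large, dense, or connector-heavy before deeper analysis.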

  6. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...

  7. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  8. Edgar Schein's Process versus Content Consultation Models.

    Science.gov (United States)

    Rockwood, Gary F.

    1993-01-01

    Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…

  9. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process

  10. Consistent haul road condition monitoring by means of vehicle response normalisation with Gaussian processes

    CSIR Research Space (South Africa)

    Heyns, T

    2012-12-01

    Suboptimal haul road management policies such as routine, periodic and urgent maintenance may result in unnecessary cost, both to roads and vehicles. A recent idea is to continually assess haul road condition based on measured vehicle response...

  11. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  12. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly changed in accordance with new and fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package is used for balance model development, calculating the material flow in technological processes; VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows in the whole plant or in separate lines of the plant. (A.C.)

  13. Fermentation process diagnosis using a mathematical model

    Energy Technology Data Exchange (ETDEWEB)

    Yerushalmi, L; Volesky, B; Votruba, J

    1988-09-01

    Intriguing physiology of a solvent-producing strain of Clostridium acetobutylicum led to the synthesis of a mathematical model of the acetone-butanol fermentation process. The model presented is capable of describing the process dynamics and the culture behavior during a standard and a substandard acetone-butanol fermentation. In addition to the process kinetic parameters, the model includes the culture physiological parameters, such as the cellular membrane permeability and the number of membrane sites for active transport of sugar. Computer process simulation studies for different culture conditions used the model, and quantitatively pointed out the importance of selected culture parameters that characterize the cell membrane behaviour and play an important role in the control of solvent synthesis by the cell. The theoretical predictions by the new model were confirmed by experimental determination of the cellular membrane permeability.
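A minimal sketch in the spirit of such a kinetic model, with Monod-type growth and linear solvent inhibition standing in for the membrane-related physiological parameters described in the abstract (all rate constants, yields, and initial conditions are invented for illustration):

```python
def abe_step(state, dt, mu_max=0.4, ks=2.0, y_xs=0.3, y_ps=0.45, p_max=12.0):
    """One Euler step: Monod growth on sugar, linearly inhibited by the
    accumulated solvent (all constants hypothetical)."""
    x, s, p = state                    # biomass, sugar, solvent (g/L)
    mu = mu_max * s / (ks + s) * max(0.0, 1.0 - p / p_max)
    dx = mu * x * dt                   # biomass growth
    ds = -dx / y_xs                    # sugar consumed for growth
    dp = y_ps * (-ds)                  # solvent formed from consumed sugar
    return x + dx, max(s + ds, 0.0), p + dp

state = (0.1, 50.0, 0.0)               # inoculum, initial sugar, no solvent
for _ in range(2000):                  # 100 h of culture at dt = 0.05 h
    state = abe_step(state, 0.05)
x, s, p = state
```

The trajectory reproduces the qualitative behaviour such models describe: growth stalls when the solvent concentration approaches its inhibitory level, leaving residual sugar unconsumed.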

  14. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  15. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)
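The model-based approach described above can be sketched as a segment-by-segment integration of a Sieverts'-law permeation flux along the tube, with inerts retained in the bore. The permeability, wall thickness, bore pressure, and feed values below are hypothetical, not data from the paper:

```python
import math

def sieverts_flux(perm, thickness, p_high, p_low):
    """Hydrogen flux through a Pd-Ag wall, Sieverts'-law driving force
    (difference of square-root hydrogen partial pressures)."""
    return perm / thickness * (math.sqrt(p_high) - math.sqrt(p_low))

def diffuser_recovery(q_in, x_h2, area, perm, thickness, p_shell, n=200):
    """March down the tube in n segments at an assumed 100 kPa bore
    pressure, removing permeated hydrogen; returns recovered fraction."""
    h2 = q_in * x_h2                # mol/s hydrogen in the feed
    inert = q_in - h2               # inert species do not permeate
    seg = area / n                  # membrane area per segment
    for _ in range(n):
        p_h2 = 100e3 * h2 / (h2 + inert)          # partial pressure, Pa
        flux = sieverts_flux(perm, thickness, max(p_h2, p_shell), p_shell)
        h2 = max(h2 - flux * seg, 0.0)
    return 1.0 - h2 / (q_in * x_h2)

# Hypothetical permeability, wall thickness and feed conditions.
r_small = diffuser_recovery(1e-3, 0.8, 0.05, 1e-8, 2e-4, 1e3)
r_large = diffuser_recovery(1e-3, 0.8, 0.20, 1e-8, 2e-4, 1e3)
```

Comparing recoveries at two membrane areas is the kind of parametric question such a model answers before comparison against operating data.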

  16. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling became a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself, because it is a cornerstone of the general basic infrastructure of a business. This fact impacts all dimensions of business process management. One of these dimensions is the methodology, which postulates that information systems development provide business process management with exact methods and tools for modeling business processes. The methodology underlying the approach presented in this paper also has its roots in information systems development methodology.

  17. Mathematical model of seed germination process

    International Nuclear Information System (INIS)

    Gładyszewska, B.; Koper, R.; Kornarzyński, K.

    1999-01-01

    An analytical model of the seed germination process is described. The model, based on the proposed working hypothesis, leads by analogy to a law corresponding to the Verhulst-Pearl law known from the theory of population kinetics. The model was applied to describe the germination kinetics of tomato seeds of the Promyk field cultivar, biostimulated by laser treatment. Close agreement between experimental and model data was obtained.
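The Verhulst-Pearl law referred to above is the logistic curve; a minimal sketch with hypothetical tomato-seed parameters (sample size, rate constant, and midpoint time are invented for illustration):

```python
import math

def verhulst_pearl(t, n_max, k, t0):
    """Cumulative number of germinated seeds at time t (logistic law):
    n(t) = n_max / (1 + exp(-k (t - t0)))."""
    return n_max / (1.0 + math.exp(-k * (t - t0)))

# Hypothetical fit for a 50-seed sample with the midpoint at 72 h.
counts = [verhulst_pearl(t, 50.0, 0.08, 72.0) for t in range(0, 169, 24)]
```

Fitting n_max, k and t0 to observed cumulative germination counts then gives the kind of model-data comparison reported in the abstract.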

  18. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at the development time and run time of the system. In this article an ontological model of a BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.

  19. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  20. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  1. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  2. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  3. Model visualization for evaluation of biocatalytic processes

    DEFF Research Database (Denmark)

    Law, HEM; Lewis, DJ; McRobbie, I

    2008-01-01

Biocatalysis offers great potential as an additional, and in some cases as an alternative, synthetic tool for organic chemists, especially as a route to introduce chirality. However, the implementation of scalable biocatalytic processes nearly always requires the introduction of process and/or bi......,S-EDDS), a biodegradable chelant, and is characterised by the use of model visualization using 'windows of operation'.

  4. Business process modeling using Petri nets

    NARCIS (Netherlands)

    Hee, van K.M.; Sidorova, N.; Werf, van der J.M.E.M.; Jensen, K.; Aalst, van der W.M.P.; Balbo, G.; Koutny, M.; Wolf, K.

    2013-01-01

    Business process modeling has become a standard activity in many organizations. We start with going back into the history and explain why this activity appeared and became of such importance for organizations to achieve their business targets. We discuss the context in which business process

  5. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

Full Text Available BPMN represents an industrial standard created to offer a common and user-friendly notation to all the participants in a business process. The present paper aims to briefly present the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modeled by workflows.

  6. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

For their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  7. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  8. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  9. Multiphysics modelling of the spray forming process

    International Nuclear Information System (INIS)

    Mi, J.; Grant, P.S.; Fritsching, U.; Belkessam, O.; Garmendia, I.; Landaberea, A.

    2008-01-01

    An integrated, multiphysics numerical model has been developed through the joint efforts of the University of Oxford (UK), University of Bremen (Germany) and Inasmet (Spain) to simulate the spray forming process. The integrated model consisted of four sub-models: (1) an atomization model simulating the fragmentation of a continuous liquid metal stream into droplet spray during gas atomization; (2) a droplet spray model simulating the droplet spray mass and enthalpy evolution in the gas flow field prior to deposition; (3) a droplet deposition model simulating droplet deposition, splashing and re-deposition behavior and the resulting preform shape and heat flow; and (4) a porosity model simulating the porosity distribution inside a spray formed ring preform. The model has been validated against experiments of the spray forming of large diameter IN718 Ni superalloy rings. The modelled preform shape, surface temperature and final porosity distribution showed good agreement with experimental measurements

  10. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system is presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  11. The semantics of hybrid process models

    NARCIS (Netherlands)

    Slaats, T.; Schunselaar, D.M.M.; Maggi, F.M.; Reijers, H.A.; Debruyne, C.; Panetto, H.; Meersman, R.; Dillon, T.; Kuhn, E.; O'Sullivan, D.; Agostino Ardagna, C.

    2016-01-01

    In the area of business process modelling, declarative notations have been proposed as alternatives to notations that follow the dominant, imperative paradigm. Yet, the choice between an imperative or declarative style of modelling is not always easy to make. Instead, a mixture of these styles is

  12. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

Full Text Available Abstract Background Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Results Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion Narrator is a

  13. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation and biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is
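Since the abstract mentions mapping graphical models to Gillespie's direct method, a minimal sketch of that algorithm may help fix ideas. The two-reaction system below (A converting to B and back) is an assumed toy example, not a model from the paper:

```python
import random

random.seed(1)

# Gillespie's direct method for an assumed toy system:
#   R1: A -> B, propensity c1*A
#   R2: B -> A, propensity c2*B
def gillespie(a, b, c1, c2, t_end):
    t = 0.0
    while True:
        p1, p2 = c1 * a, c2 * b
        total = p1 + p2
        if total == 0.0:
            break                          # no reaction can fire
        t += random.expovariate(total)     # step 1: draw time to next reaction
        if t > t_end:
            break
        if random.random() * total < p1:   # step 2: pick which reaction fires
            a, b = a - 1, b + 1
        else:
            a, b = a + 1, b - 1
    return a, b

a, b = gillespie(a=100, b=0, c1=1.0, c2=1.0, t_end=50.0)
print(a, b)  # molecule counts fluctuate around the 50/50 equilibrium
```

The two draws per iteration (when the next reaction fires, and which one) are exactly the "direct" part of the method; mapping a graphical model to this form amounts to generating the propensity expressions and state updates.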

  14. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  15. Normalised subband adaptive filtering with extended adaptiveness on degree of subband filters

    Science.gov (United States)

    Samuyelu, Bommu; Rajesh Kumar, Pullakura

    2017-12-01

This paper proposes an adaptive normalised subband adaptive filtering (NSAF) scheme to improve NSAF performance. In the proposed NSAF, an extended adaptiveness is introduced over its variants in two ways: first, the step-size is made adaptive, and second, the selection of subbands is made adaptive. Hence, the proposed NSAF is termed here the variable step-size-based NSAF with selected subbands (VS-SNSAF). Experimental investigations are carried out to demonstrate the performance (in terms of convergence) of the VS-SNSAF against the conventional NSAF and its state-of-the-art adaptive variants. The results report the superior performance of VS-SNSAF over the traditional NSAF and its variants. Its stability, robustness against noise and computational complexity are also examined.
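As a rough illustration of the normalised, variable step-size idea (simplified here to a single band rather than a true subband filter bank), the following sketch adapts an NLMS-style filter whose step size shrinks as the error power falls. The system, signals and step-size rule are all assumptions for illustration, not the VS-SNSAF algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown system to identify (assumed for illustration)
h_true = np.array([0.8, -0.4, 0.2])
N = len(h_true)

x = rng.standard_normal(2000)          # input signal
d = np.convolve(x, h_true)[:len(x)]    # desired signal (noise-free here)

w = np.zeros(N)      # adaptive filter weights
mu = 1.0             # initial step size
eps = 1e-8           # regularisation to avoid division by zero

for n in range(N, len(x)):
    u = x[n - N + 1:n + 1][::-1]       # current input vector, newest first
    e = d[n] - w @ u                   # a priori error
    # Normalised update: correction scaled by the input energy (NLMS form)
    w += mu * e * u / (u @ u + eps)
    # Crude variable step size: shrink mu as the error power falls
    mu = max(0.05, min(1.0, 10 * e * e))

print(np.round(w, 3))  # weights should be close to h_true
```

The energy normalisation `u @ u` is what makes the update insensitive to the input scale; the subband variants apply the same idea per frequency band.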

  16. Living under the influence: normalisation of alcohol consumption in our cities

    Directory of Open Access Journals (Sweden)

    Xisca Sureda

    2017-01-01

Full Text Available Harmful use of alcohol is one of the world's leading health risks. A positive association between certain characteristics of the urban environment and individual alcohol consumption has been documented in previous research. When developing a tool characterising the urban environment of alcohol in the cities of Barcelona and Madrid we observed that alcohol is ever present in our cities. Urban residents are constantly exposed to a wide variety of alcohol products, marketing and promotion and signs of alcohol consumption. In this field note, we reflect on the normalisation of alcohol in urban environments. We highlight the need for further research to better understand attitudes and practices in relation to alcohol consumption. This type of urban study is necessary to support policy interventions to prevent and control harmful alcohol use.

  17. The normalisation of terror: the response of Israel's stock market to long periods of terrorism.

    Science.gov (United States)

    Peleg, Kobi; Regens, James L; Gunter, James T; Jaffe, Dena H

    2011-01-01

    Man-made disasters such as acts of terrorism may affect a society's resiliency and sensitivity to prolonged physical and psychological stress. The Israeli Tel Aviv stock market TA-100 Index was used as an indicator of reactivity to suicide terror bombings. After accounting for factors such as world market changes and attack severity and intensity, the analysis reveals that although Israel's financial base remained sensitive to each act of terror across the entire period of the Second Intifada (2000-06), sustained psychological resilience was indicated with no apparent overall market shift. In other words, we saw a 'normalisation of terror' following an extended period of continued suicide bombings. The results suggest that investors responded to less transitory global market forces, indicating sustained resilience and long-term market confidence. Future studies directly measuring investor expectations and reactions to man-made disasters, such as terrorism, are warranted. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  18. Microstructural characterisation of a P91 steel normalised and tempered at different temperatures

    International Nuclear Information System (INIS)

    Hurtado-Norena, C.; Danon, C.A.; Luppo, M.I.; Bruzzoni, P.

    2015-01-01

    9%Cr-1%Mo martensitic-ferritic steels are used in power plant components with operating temperatures of around 600 deg. C because of their good mechanical properties at high temperature as well as good oxidation resistance. These steels are generally used in the normalised and tempered condition. This treatment results in a structure of tempered lath martensite where the precipitates are distributed along the lath interfaces and within the martensite laths. The characterisation of these precipitates is of fundamental importance because of their relationship with the creep behaviour of these steels in service. In the present work, the different types of precipitates found in these steels have been studied on specimens in different metallurgical conditions. The techniques used in this investigation were X-ray diffraction with synchrotron light, scanning electron microscopy, energy dispersive microanalysis and transmission electron microscopy. (authors)

  19. Living under the influence: normalisation of alcohol consumption in our cities.

    Science.gov (United States)

    Sureda, Xisca; Villalbí, Joan R; Espelt, Albert; Franco, Manuel

Harmful use of alcohol is one of the world's leading health risks. A positive association between certain characteristics of the urban environment and individual alcohol consumption has been documented in previous research. When developing a tool characterising the urban environment of alcohol in the cities of Barcelona and Madrid we observed that alcohol is ever present in our cities. Urban residents are constantly exposed to a wide variety of alcohol products, marketing and promotion and signs of alcohol consumption. In this field note, we reflect on the normalisation of alcohol in urban environments. We highlight the need for further research to better understand attitudes and practices in relation to alcohol consumption. This type of urban study is necessary to support policy interventions to prevent and control harmful alcohol use. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  20. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

Full Text Available Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAM). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper presents a meta-model that integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  1. Thermochemical equilibrium modelling of a gasifying process

    International Nuclear Information System (INIS)

    Melgar, Andres; Perez, Juan F.; Laget, Hannes; Horillo, Alfonso

    2007-01-01

    This article discusses a mathematical model for the thermochemical processes in a downdraft biomass gasifier. The model combines the chemical equilibrium and the thermodynamic equilibrium of the global reaction, predicting the final composition of the producer gas as well as its reaction temperature. Once the composition of the producer gas is obtained, a range of parameters can be derived, such as the cold gas efficiency of the gasifier, the amount of dissociated water in the process and the heating value and engine fuel quality of the gas. The model has been validated experimentally. This work includes a parametric study of the influence of the gasifying relative fuel/air ratio and the moisture content of the biomass on the characteristics of the process and the producer gas composition. The model helps to predict the behaviour of different biomass types and is a useful tool for optimizing the design and operation of downdraft biomass gasifiers
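Once the equilibrium model predicts a producer gas composition, derived parameters such as the cold gas efficiency follow directly. The sketch below computes the gas heating value and cold gas efficiency for an assumed dry-gas composition; the species heating values, gas yield and biomass heating value are illustrative literature-style numbers, not values from the paper:

```python
# Lower heating values of the combustible producer-gas species, MJ/Nm^3
# (typical textbook values, assumed here for illustration)
LHV = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}

# Example dry producer-gas composition, volume fractions (assumed)
gas = {"CO": 0.20, "H2": 0.17, "CH4": 0.02, "CO2": 0.12, "N2": 0.49}

# Heating value of the producer gas: volume-weighted sum over species
lhv_gas = sum(LHV.get(sp, 0.0) * frac for sp, frac in gas.items())

# Cold gas efficiency = chemical energy in gas / chemical energy in biomass
gas_yield = 2.4       # Nm^3 of producer gas per kg of biomass (assumed)
lhv_biomass = 18.0    # MJ/kg dry biomass (assumed)
cge = lhv_gas * gas_yield / lhv_biomass
print(f"LHV_gas = {lhv_gas:.2f} MJ/Nm^3, CGE = {cge:.1%}")  # → LHV_gas = 5.07 MJ/Nm^3, CGE = 67.6%
```

In the full model the composition itself comes from solving the chemical and thermodynamic equilibrium of the global reaction; this post-processing step is the same either way.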

  2. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in areas of safety, environmental regulatory compliance, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air inleakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  3. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. Because the geologic processes responsible for formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling

  4. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state-of-the-art in modeling of heterogeneous catalytic reactors and processes. Reviews by leading authorities in the respective areas Up-to-date reviews of latest techniques in modeling of catalytic processes Mix of US and European authors, as well as academic/industrial/research institute perspectives Connections between computation and experimental methods in some of the chapters.

  5. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

AbdElHaleem, H S [Cairo Univ. - Civil Eng. Dept., Giza (Egypt)]; El-Ahwany, A H [Cairo University - Faculty of Engineering - Chemical Engineering Department, Giza (Egypt)]; Ibrahim, H I [Helwan University - Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt)]; Ibrahim, G [Menofia University - Faculty of Engineering, Shebin El Kom - Basic Eng. Sc. Dept., Menofia (Egypt)]

    2004-07-01

Mathematical process modeling and the biokinetics of the activated sludge process were reviewed considering different types of models. The task group models ASM1, ASM2 and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, the internal mass transfer inside the flocs is neglected, so the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  6. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

AbdElHaleem, H.S.; El-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

Mathematical process modeling and the biokinetics of the activated sludge process were reviewed considering different types of models. The task group models ASM1, ASM2 and ASM3, versioned by Henze et al., were evaluated considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model, the internal mass transfer inside the flocs is neglected, so the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.
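As a minimal illustration of the homogeneous, one-phase kind of biokinetic model reviewed here, the sketch below integrates Monod growth kinetics for a single substrate S and biomass X with explicit Euler. All parameter values are generic textbook-style assumptions, not values from the ASM models:

```python
# Monod biokinetics, one substrate S and one biomass X (assumed parameters)
mu_max = 0.25   # 1/h, maximum specific growth rate
Ks = 20.0       # mg/L, half-saturation constant
Y = 0.5         # biomass yield, mg X produced per mg S consumed

S, X = 200.0, 50.0   # initial substrate and biomass concentrations, mg/L
dt = 0.01            # h, Euler time step

for _ in range(int(24 / dt)):        # simulate one day
    mu = mu_max * S / (Ks + S)       # Monod specific growth rate
    dX = mu * X                      # biomass growth rate
    dS = -dX / Y                     # substrate consumption rate
    X += dX * dt
    S = max(0.0, S + dS * dt)

print(f"S = {S:.1f} mg/L, X = {X:.1f} mg/L")  # → S = 0.0 mg/L, X = 150.0 mg/L
```

Note the built-in mass balance: X + Y·S stays constant, so the final biomass is the initial biomass plus the yield times the substrate consumed. A heterogeneous model would add diffusion of S into the floc on top of this reaction term.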

  7. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

Full Text Available The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  8. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes which work in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and those which work during the subsequent cloud development. In the second case, particles are removed by cloud droplets or by falling raindrops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW) [de
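The CAS process is commonly summarised by a scavenging coefficient Λ, which turns the collision physics into a simple first-order removal law for the aerosol concentration. The sketch below evaluates that law; the coefficient and concentration values are assumed order-of-magnitude numbers, not results from the report:

```python
import math

# Below-cloud collision scavenging summarised by a scavenging coefficient:
#   dC/dt = -Lambda * C   =>   C(t) = C0 * exp(-Lambda * t)
lam = 1e-4   # scavenging coefficient, 1/s (assumed, moderate rain)
c0 = 50.0    # initial aerosol concentration, ug/m^3 (assumed)

def concentration(t_seconds):
    """Aerosol concentration remaining after t seconds of rain."""
    return c0 * math.exp(-lam * t_seconds)

c_1h = concentration(3600)  # after one hour of rain
print(f"{c_1h:.1f} ug/m^3")  # → 34.9 ug/m^3, i.e. about 69.8% remains
```

The full numerical model computes Λ itself from the raindrop size distribution and collision efficiencies; the exponential depletion above is the downstream consequence.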

  9. Various Models for Reading Comprehension Process

    Directory of Open Access Journals (Sweden)

    Parastoo Babashamsi

    2013-11-01

Full Text Available In recent years, reading has been viewed as a process, as a form of thinking, as a true experience, and as a tool subject. As a process, reading includes visual discrimination, independent recognition of words, rhythmic progression along a line of print, precision in the return sweep of the eyes, and adjustment of rate. Along the same lines, the present paper aims at considering the various models of the reading process. Moreover, the paper will take a look at various factors, such as schema and vocabulary knowledge, which affect the reading comprehension process.

  10. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

The increasing demand for thermal and electric energy in many branches of industry and municipal management accounts for a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels causes significant savings resulting from partial replacement of fossil fuels, and reduction of environmental pollution resulting directly from the limitation of waste migration to the environment (soil, atmospheric air, surface and underground water). The realization of technological processes with the utilization of formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes as well as factors of sustainable development, from a global viewpoint, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures that uniquely identify a real process, together with the conversion of these data by algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the process of forming fuels using a modified simplex algorithm with polynomial running time. This model is a reference point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of formed fuel components, with assumed constraints and decision variables of the task
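The blending optimisation can be illustrated with a deliberately tiny linear programme: two components, one calorific value constraint, solvable in closed form. All component properties and costs below are assumptions for illustration, far simpler than the multi-component model in the paper:

```python
# Two-component blend: a cheap waste-derived fuel plus coal.
# Minimise cost subject to a minimum blend heating value --
# a one-constraint linear programme with a closed-form solution.
lhv_waste, cost_waste = 16.0, 0.5   # MJ/kg and cost per kg (assumed)
lhv_coal,  cost_coal  = 25.0, 2.0   # MJ/kg and cost per kg (assumed)
lhv_min = 20.0                      # required blend heating value, MJ/kg

# Cost falls as the cheap waste fraction x rises, so the heating value
# constraint x*lhv_waste + (1-x)*lhv_coal >= lhv_min is binding:
x_max = (lhv_coal - lhv_min) / (lhv_coal - lhv_waste)
x = min(1.0, max(0.0, x_max))       # optimal waste fraction

blend_cost = x * cost_waste + (1 - x) * cost_coal
blend_lhv = x * lhv_waste + (1 - x) * lhv_coal
print(f"waste fraction = {x:.3f}, cost = {blend_cost:.3f}, LHV = {blend_lhv:.1f}")
# → waste fraction = 0.556, cost = 1.167, LHV = 20.0
```

With more components and constraints (sulphur, chlorine, ash, moisture) the closed form disappears and a simplex-type solver, as used in the paper, takes over.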

  11. Quantum mechanical Hamiltonian models of discrete processes

    International Nuclear Information System (INIS)

    Benioff, P.

    1981-01-01

Here the results of other work on quantum mechanical Hamiltonian models of Turing machines are extended to include any discrete process T on a countably infinite set A. The models are constructed here by use of scattering phase shifts from successive scatterers to turn on successive step interactions. Also a locality requirement is imposed. The construction is done by first associating with each process T a model quantum system M with associated Hilbert space H_M and step operator U_T. Since U_T is not unitary in general, M, H_M, and U_T are extended into a (continuous time) Hamiltonian model on a larger space which satisfies the locality requirement. The construction is compared with the minimal unitary dilation of U_T. It is seen that the model constructed here is larger than the minimal one. However, the minimal one does not satisfy the locality requirement

  12. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
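    The core idea, a nonhomogeneous process made homogeneous on an operational time scale, can be illustrated with a toy one-transition example. The transformation h(t) = t**a and the rate q0 below are arbitrary choices for illustration, not estimates from the paper:

```python
# A process that is homogeneous on an operational time scale u = h(t) = t**a
# is nonhomogeneous on calendar time, with hazard q(t) = q0 * h'(t).
import random

random.seed(1)
q0, a = 1.0, 2.0          # rate on the operational scale; h(t) = t**a

def first_transition_time():
    # Draw the transition epoch on the operational scale (exponential),
    # then map it back to calendar time via the inverse transformation.
    u = random.expovariate(q0)
    return u ** (1.0 / a)

times = [first_transition_time() for _ in range(100_000)]
# On calendar time the hazard q(t) = q0*a*t**(a-1) increases with t, so
# transitions cluster earlier than the operational-scale mean 1/q0 = 1.
mean_t = sum(times) / len(times)
print(mean_t)   # close to Gamma(1.5) = sqrt(pi)/2, about 0.886
```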

  13. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  14. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  15. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
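    The SDE approximation described in the abstract can be sketched numerically. A minimal Euler-Maruyama simulation, assuming one common parameterisation of the Prendiville process (birth rate lam*(N - x), death rate mu*x on {0, ..., N}; these names and values are ours, not the paper's):

```python
# Euler-Maruyama sketch of an SDE approximation to a Prendiville-type
# process: drift b(x) = lam*(N - x) - mu*x, diffusion a(x) = lam*(N - x) + mu*x.
import math, random

random.seed(0)
lam, mu, N = 0.5, 0.5, 100.0
dt, steps = 0.01, 2000
x = 10.0
for _ in range(steps):
    drift = lam * (N - x) - mu * x
    diff  = math.sqrt(max(lam * (N - x) + mu * x, 0.0))
    x += drift * dt + diff * math.sqrt(dt) * random.gauss(0.0, 1.0)
    x = min(max(x, 0.0), N)   # keep the path inside the finite interval
print(x)   # fluctuates around the drift equilibrium lam*N/(lam+mu) = 50
```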

  16. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution

  17. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ±3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information.

  18. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
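    The binomial AR(1) recursion underlying this connection can be simulated in a few lines. The extinction-colonization interpretation and the parameter values below are illustrative, and the names alpha, beta, N are ours:

```python
# Binomial AR(1): X_t = alpha o X_{t-1} + beta o (N - X_{t-1}), where "o"
# denotes binomial thinning (each of the n individuals survives/arrives
# independently with the given probability).
import random

random.seed(42)

def binom(n, p):
    """Draw a Binomial(n, p) variate by direct thinning."""
    return sum(random.random() < p for _ in range(n))

N, alpha, beta = 20, 0.6, 0.2
x, path = 5, []
for _ in range(5000):
    x = binom(x, alpha) + binom(N - x, beta)   # survivors + colonisers
    path.append(x)

# The stationary mean is N*pi with pi = beta / (1 - alpha + beta) = 1/3.
print(sum(path) / len(path))   # close to 20/3, about 6.67
```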

  19. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experiment was designed to measure expectation disconfirmation effects, and the collected data were used to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  20. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

    Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. We are not aware of any previous use of this notation for process modelling within Pathology, in Spain or elsewhere. We present our experience in the elaboration of conceptual models of Pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Dep. of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, on which management and improvements are more easily implemented by health professionals.

  1. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes (DPPs) on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.

  2. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  3. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperature of 300--550 C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported

  4. A process algebra model of QED

    International Nuclear Information System (INIS)

    Sulis, William

    2016-01-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics. (paper)

  5. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4...

  6. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics, and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data in choice models. We discuss the key issues involved in applying the extended framework, focusing on richer data requirements, theories, and models, and present three partial demonstrations of the proposed framework. Future research challenges include the development of more comprehensive empirical tests...

  7. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  8. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs
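    Diffusion, the first of the rate-controlling processes listed, is the classic example. A minimal explicit finite-difference sketch of one-dimensional diffusion of a chemical species through a concrete wall, with purely illustrative parameter values:

```python
# Explicit finite-difference solution of dc/dt = D * d2c/dx2 through a
# 0.1 m concrete wall: fixed surface concentration on one face, zero
# flux on the other. Parameter values are illustrative only.
D  = 1e-11                      # m^2/s, effective diffusivity
L  = 0.1                        # m, wall thickness
nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D          # stable: requires dt <= dx^2 / (2*D)
c  = [0.0] * nx
c[0] = 1.0                      # normalised surface concentration

years = 10
steps = int(years * 365.25 * 86400 / dt)
for _ in range(steps):
    new = c[:]
    for i in range(1, nx - 1):
        new[i] = c[i] + D * dt / dx**2 * (c[i+1] - 2*c[i] + c[i-1])
    new[-1] = new[-2]           # zero-flux inner boundary
    c = new
print(c[nx // 2])               # concentration at mid-depth after 10 years
```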

  9. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate cold point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental temperature profiles matched within ±10% error, a good agreement considering that the food is a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods.
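    A lumped-parameter model of the kind described treats the cold point as a single node with one time constant. A minimal sketch with illustrative values (the paper's model additionally depends on initial process conditions, retort temperature and % solid content):

```python
# Lumped-parameter cold-point heating during retorting: the pouch interior
# is a single node with time constant tau, so dT/dt = (T_retort - T)/tau.
# tau and the temperatures below are illustrative, not fitted values.
import math

T0, T_retort, tau = 30.0, 121.0, 15.0   # deg C, deg C, minutes

def cold_point(t):
    # Analytic solution of the first-order lag model at time t (minutes).
    return T_retort - (T_retort - T0) * math.exp(-t / tau)

for t in (0, 15, 30, 60):
    print(t, round(cold_point(t), 1))   # approaches T_retort exponentially
```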

  10. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  11. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. We also study effective diffusion processes over surfaces due to random walks in the bulk. We consider different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally, it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations. We find, in general, excellent agreement between the theory and the simulations.
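    A toy Monte Carlo version of a random walk in a fluctuating medium can make the idea concrete: here the medium switches between a state that allows motion and one that traps the walker, slowing effective diffusion. The two-state structure and rates are illustrative, not taken from the thesis:

```python
# 1-D random walk in a two-state fluctuating medium: in the "free" state
# the walker takes unit steps, in the "trapping" state it stays put.
import random

random.seed(7)

def walk(steps, p_switch=0.05):
    pos, free = 0, True
    for _ in range(steps):
        if random.random() < p_switch:
            free = not free          # the medium changes its state
        if free:
            pos += random.choice((-1, 1))
    return pos

disp2 = [walk(500) ** 2 for _ in range(1000)]
# The walker moves only about half the time, so the mean-square
# displacement is roughly steps/2 rather than steps.
msd = sum(disp2) / len(disp2)
print(msd)
```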

  12. Évolution de la normalisation dans le domaine des oléagineux et des corps gras

    Directory of Open Access Journals (Sweden)

    Quinsac Alain

    2003-07-01

    Standardisation plays a major role in economic exchanges by contributing to the openness and transparency of markets. The oilseed and fats sector has long integrated standardisation into its strategy. Developed from the needs of the profession, particularly at the level of the customer-supplier relationship, the programmes have mainly concerned sampling and analysis. In recent years, strong changes in the socio-economic and regulatory context (non-food uses, food safety, quality assurance) have broadened the scope of standardisation. The normative approach adopted in the case of biodiesels and of the detection of GMOs in oilseeds is explained. The consequences of the evolution of standardisation and the stakes for the oilseed profession in the future are discussed.

  13. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  14. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  15. Hencky's model for elastomer forming process

    Science.gov (United States)

    Oleinikov, A. A.; Oleinikov, A. I.

    2016-08-01

    In the numerical simulation of elastomer forming processes, Hencky's isotropic hyperelastic material model can guarantee relatively accurate prediction of the strain range under large deformations. It is shown that this material model extends Hooke's law from the domain of infinitesimal strains to that of moderate ones. A new representation of the fourth-order elasticity tensor for Hencky's hyperelastic isotropic material is obtained; it possesses both minor symmetries and the major symmetry. The constitutive relations of the considered model are implemented into the MSC.Marc code. By calculating and fitting curves, the polyurethane elastomer material constants are selected. Simulation of equipment for elastomer sheet forming is considered.

  16. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  17. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  18. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  19. Designing equivalent semantic models for process creation

    NARCIS (Netherlands)

    P.H.M. America (Pierre); J.W. de Bakker (Jaco)

    1986-01-01

    Operational and denotational semantic models are designed for languages with process creation, and the relationships between the two semantics are investigated. The presentation is organized in four sections dealing with a uniform and static, a uniform and dynamic, a nonuniform and

  20. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  1. Mathematical Modelling of Continuous Biotechnological Processes

    Science.gov (United States)

    Pencheva, T.; Hristozov, I.; Shannon, A. G.

    2003-01-01

    Biotechnological processes (BTP) are characterized by a complicated structure of organization and interdependent characteristics. Partial differential equations or systems of partial differential equations are used for their behavioural description as objects with distributed parameters. Modelling of substrate without regard to dispersion…

  2. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  3. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high-cost, or long-lead-time items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  4. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high-power diode laser. The study focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy to be stored in the formed material. The novel hybrid forming process was thus aimed at inducing local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence of laser process parameters, such as source power, scan speed and the starting elastic deformation of the mechanically bent sheets, on the extent of springback was experimentally assessed. Consistent trends in the experimental response with respect to the operational parameters were found. Accordingly, 3D process maps of the extent of springback as a function of the operational parameters were constructed. The effect of the inherent uncertainties in the model parameters on the predicted residual bending was evaluated. In particular, a fuzzy-logic-based approach was used to describe the model uncertainties, and the transformation method was applied to propagate their effect on the residual bending.
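The transformation method mentioned above can be illustrated with a minimal sketch (the reduced variant, which evaluates the model at alpha-cut interval endpoints; the model function and parameter values below are hypothetical, not those of the study):

```python
from itertools import product

def transform_method(f, fuzzy_params, alphas=(0.0, 0.5, 1.0)):
    """Propagate triangular fuzzy parameters (lo, peak, hi) through a
    model f using the reduced transformation method: at each alpha-cut,
    evaluate f at every combination of interval endpoints and keep the
    min/max. Valid for monotonic models; a sketch, not a full variant."""
    result = {}
    for a in alphas:
        # alpha-cut interval of each triangular fuzzy number
        cuts = [(lo + a * (peak - lo), hi - a * (hi - peak))
                for lo, peak, hi in fuzzy_params]
        vals = [f(*combo) for combo in product(*cuts)]
        result[a] = (min(vals), max(vals))
    return result

# hypothetical model: output depends additively on two fuzzy parameters
bend = transform_method(lambda x, y: x + y,
                        [(0.0, 1.0, 2.0), (10.0, 11.0, 12.0)])
print(bend[0.0], bend[1.0])  # (10.0, 14.0) (12.0, 12.0)
```

At alpha = 1 the intervals collapse to the peaks, so the output interval collapses to the crisp model prediction.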

  5. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

Friction stir welding (FSW) is a relatively new process being applied to the joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-lightweight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined, and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  6. Modelling and control of a flotation process

    International Nuclear Information System (INIS)

    Ding, L.; Gustafsson, T.

    1999-01-01

A general description of a flotation process is given. The dynamic model of a MIMO nonlinear subprocess in flotation, i.e. the pulp levels in five compartments in series, is developed, and the model is verified with real data from a production plant. In order to reject constant disturbances, five extra states are introduced and the model is modified. An exact linearization has been made for the nonlinear model, and a linear quadratic Gaussian controller is proposed based on the linearized model. The simulation results show improved performance of the pulp level control when the set points are changed or a disturbance occurs. In the future the controller will be tested in production. (author)

  7. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system. The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines.
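The frequency-counting step at the heart of such passive learning algorithms can be sketched roughly as follows (a toy illustration, not the authors' algorithm: the state is crudely identified with the last observed output, and the nondeterministic extension is omitted):

```python
from collections import defaultdict

def estimate_mdp(traces):
    """Estimate transition probabilities of a labeled MDP from
    alternating input/output traces by simple frequency counting.
    Each trace is a list of (input, output) pairs; the state is
    identified with the last output observed (a crude abstraction)."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        state = "init"
        for inp, out in trace:
            counts[(state, inp)][out] += 1
            state = out
    # normalise counts into conditional probabilities P(out | state, inp)
    model = {}
    for (state, inp), outs in counts.items():
        total = sum(outs.values())
        model[(state, inp)] = {o: c / total for o, c in outs.items()}
    return model

# hypothetical slot-machine-style traces of (input, output) pairs
traces = [[("coin", "play"), ("spin", "lose")],
          [("coin", "play"), ("spin", "win")],
          [("coin", "play"), ("spin", "lose")]]
mdp = estimate_mdp(traces)
print(mdp[("play", "spin")])  # lose with probability 2/3, win with 1/3
```

A real learner would additionally merge states that are statistically indistinguishable, as in the probabilistic finite automata algorithms the paper adapts.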

  8. Advances in modeling plastic waste pyrolysis processes

    Energy Technology Data Exchange (ETDEWEB)

Safadi, Y. [Department of Mechanical Engineering, American University of Beirut, PO Box 11-0236, Beirut (Lebanon)]; Zeaiter, J. [Chemical Engineering Program, American University of Beirut, PO Box 11-0236, Beirut (Lebanon)]

    2013-07-01

The tertiary recycling of plastics via pyrolysis is gaining momentum due to promising economic returns from the generated products, which can be used as a chemical feedstock or fuel. Prediction models that simulate such processes are essential for an in-depth understanding of the mechanisms that take place during the thermal or catalytic degradation of the waste polymer. This paper presents the key models used successfully in the literature so far. Three modeling schemes are identified: Power-Law, Lumped-Empirical, and Population-Balance based equations. The categorization is based mainly on the level of detail and prediction capability of each modeling scheme. The data show that the reliability of these modeling approaches varies with the level of detail that the experimental work and product analysis aim to achieve.
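Of the three schemes, the Power-Law approach is the simplest to illustrate; a minimal sketch might integrate a single power-law conversion rate (the rate constant and reaction order below are illustrative, not fitted values from the literature):

```python
def power_law_conversion(k, n, t_end, dt=1e-3):
    """Integrate the power-law pyrolysis rate dX/dt = k * (1 - X)**n
    with a simple explicit Euler scheme. X is the converted mass
    fraction of the polymer; k and n are illustrative fit parameters."""
    X, t = 0.0, 0.0
    while t < t_end:
        X += dt * k * (1.0 - X) ** n
        t += dt
    return min(X, 1.0)

# for n = 1 the model reduces to first order: X(t) = 1 - exp(-k*t)
X = power_law_conversion(k=0.5, n=1.0, t_end=2.0)
print(round(X, 3))  # 0.632, matching 1 - exp(-1)
```

Lumped-empirical and population-balance schemes replace this single conversion variable with product lumps or full chain-length distributions, at the cost of many more parameters.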

  9. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash-Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against a dataset independent of that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent

  10. Attention training normalises combat-related post-traumatic stress disorder effects on emotional Stroop performance using lexically matched word lists.

    Science.gov (United States)

    Khanna, Maya M; Badura-Brack, Amy S; McDermott, Timothy J; Shepherd, Alex; Heinrichs-Graham, Elizabeth; Pine, Daniel S; Bar-Haim, Yair; Wilson, Tony W

    2015-08-26

We examined two groups of combat veterans, one with post-traumatic stress disorder (PTSD) (n = 27) and another without PTSD (n = 16), using an emotional Stroop task (EST) with word lists matched across a series of lexical variables (e.g. length, frequency, neighbourhood size, etc.). Participants with PTSD exhibited a strong EST effect (longer colour-naming latencies for combat-relevant words as compared to neutral words). Veterans without PTSD produced no such effect, t  .37. Participants with PTSD then completed eight sessions of attention training (Attention Control Training or Attention Bias Modification Training) with a dot-probe task utilising threatening and neutral faces. After training, participants, especially those undergoing Attention Control Training, no longer produced longer colour-naming latencies for combat-related words as compared to other words, indicating normalised attention allocation processes after treatment.

  11. An ecological process model of systems change.

    Science.gov (United States)

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  12. Modeling Veterans Health Administration disclosure processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  13. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

The focus of information processing requirements is shifting from on-line transaction processing (OLTP) issues to on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of real-time on-line transaction processing (which has already exceeded a level of 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying relational theory in the form of the Entity-Relationship model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as relational theory did for transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm, in which mission-critical information is incorporated as "one and only one instance" of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them point to the need for simplifying the highly non-intuitive mathematical constraints found

  14. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-{epsilon} model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu{sub 2}S·yFeS, and assumed to undergo homogeneous oxidation to Cu{sub 2}O, Fe{sub 3}O{sub 4}, and SO{sub 2}. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial-size flash converting shaft. (author)

  16. Process models and model-data fusion in dendroecology

    Directory of Open Access Journals (Sweden)

    Joel eGuiot

    2014-08-01

Dendrochronology (i.e. the study of annually dated tree-ring time series) has proved to be a powerful technique for understanding tree growth. This paper aims to show the value of using ecophysiological modeling not only to understand and predict tree growth (dendroecology) but also to reconstruct past climates (dendroclimatology). Process models have been used for several decades in dendroclimatology, but it is only recently that methods of model-data fusion have led to significant progress in modeling tree growth as a function of climate and in reconstructing past climates. These model-data fusion (MDF) methods, mainly based on the Bayesian paradigm, have been shown to be powerful for both model calibration and model inversion. After a brief survey of tree-growth modeling, we illustrate MDF with examples based on series of southern France Aleppo pines and central France oaks. These examples show that if plants experienced CO2 fertilization, this would have a significant effect on tree growth, which in turn would bias the climate reconstructions. This bias could extend to other environmental non-climatic factors that directly or indirectly affect annual ring formation and are not taken into account in classical empirical models, which supports the use of more complex process-based models. Finally, we conclude by showing the value of the data assimilation methods applied in climatology for producing climate re-analyses.

  17. Reversibility in Quantum Models of Stochastic Processes

    Science.gov (United States)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With code-words of sufficiently long length, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.

  18. Dual elaboration models in attitude change processes

    Directory of Open Access Journals (Sweden)

    Žeželj Iris

    2005-01-01

This article examines empirical and theoretical developments in research on attitude change over the past 50 years. It focuses on the period from 1980 to the present, and on cognitive response theories as the dominant theoretical approach in the field. The postulates of the Elaboration Likelihood Model, the most-researched representative of the dual-process theories, are studied on the basis of a review of the accumulated research evidence. The main research findings are grouped into four basic factors: message source, message content, message recipient and message context. The most influential criticisms of the theory are then presented, regarding its empirical base and its dual-process assumption. Some possible applications and further research perspectives are discussed at the end.

  19. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

In a second generation biorefinery, the biomass pretreatment stage makes an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. A model is therefore formulated that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics, and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any…

  20. Technical Note: On methodologies for determining the size-normalised weight of planktic foraminifera

    Directory of Open Access Journals (Sweden)

    C. J. Beer

    2010-07-01

The size-normalised weight (SNW) of planktic foraminifera, a measure of test wall thickness and density, is potentially a valuable palaeo-proxy for marine carbon chemistry. As increasing attention is given to developing this proxy, it is important that methods are comparable between studies. Here, we compare SNW data generated using two different methods to account for variability in test size, namely (i) the narrow (50 μm range) sieve fraction method and (ii) the individually measured test size method. Using specimens from the 200–250 μm sieve fraction range collected in multinet samples from the North Atlantic, we find that sieving does not constrain size sufficiently well to isolate changes in weight driven by variations in test wall thickness and density from those driven by size. We estimate that the SNW data produced as part of this study are associated with an uncertainty, or error bar, of about ±11%. Errors associated with the narrow sieve fraction method may be reduced by decreasing the size of the sieve window, by using larger tests and by increasing the number of tests employed. In situations where numerous large tests are unavailable, however, substantial errors associated with this sieve method remain unavoidable. In such circumstances the individually measured test size method provides a better means of estimating SNW because, as our results show, this method isolates changes in weight driven by variations in test wall thickness and density from those driven by size.
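A minimal sketch of the individually measured test size method might look like this (the linear weight-to-size scaling and the reference size are simplifying assumptions for illustration; published studies justify their normalisations empirically):

```python
def size_normalised_weight(weights_ug, sizes_um, reference_um=225.0):
    """Scale each foraminifera test's weight to a common reference size,
    so that residual weight differences reflect wall thickness and
    density rather than test size. A linear weight-to-size scaling is
    assumed here purely for illustration."""
    normalised = [w * (reference_um / s) for w, s in zip(weights_ug, sizes_um)]
    return sum(normalised) / len(normalised)

# two hypothetical tests of different sizes but similar wall properties
print(size_normalised_weight([10.0, 12.0], [200.0, 250.0]))  # 11.025
```

Because each test is measured individually, no weight variability is left hidden inside a sieve-fraction window, which is the advantage the abstract reports.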

  1. No upward trend in normalised windstorm losses in Europe: 1970-2008

    Science.gov (United States)

    Barredo, J. I.

    2010-01-01

On 18 January 2007, windstorm Kyrill battered Europe with hurricane-force winds, killing 47 people and causing US$10 billion in damage. Kyrill poses several questions: is Kyrill an isolated or exceptional case? Have there been events costing as much in the past? This paper attempts to put Kyrill into historical context by examining large historical windstorm event losses in Europe for the period 1970-2008 across 29 European countries. It asks the question: what economic losses would these historical events cause if they were to recur under 2008 societal conditions? Loss data were sourced from reinsurance firms and augmented with historical reports, peer-reviewed articles and other ancillary sources. Following the conceptual approach outlined in previous studies, the data were then adjusted for changes in population, wealth, and inflation at the country level, and for inter-country price differences using purchasing power parity. The analyses reveal no trend in the normalised windstorm losses and confirm that increasing disaster losses are driven by societal factors and increasing exposure.
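The normalisation adjustment described can be sketched as a product of country-level correction factors (all numbers below are invented for illustration; the study's actual factors, data sources and purchasing-power adjustment differ):

```python
def normalise_loss(loss, year_factors, base_factors):
    """Normalise a historical windstorm loss to base-year (e.g. 2008)
    societal conditions by adjusting for inflation, population change
    and wealth (GDP per capita) change at the country level.
    All factor values passed in are illustrative."""
    inflation = base_factors["price_level"] / year_factors["price_level"]
    population = base_factors["population"] / year_factors["population"]
    wealth = base_factors["gdp_per_capita"] / year_factors["gdp_per_capita"]
    return loss * inflation * population * wealth

loss_1990 = 2.0e9  # nominal loss in 1990, arbitrary example value
factors_1990 = {"price_level": 70.0, "population": 60e6, "gdp_per_capita": 18e3}
factors_2008 = {"price_level": 100.0, "population": 66e6, "gdp_per_capita": 27e3}
print(normalise_loss(loss_1990, factors_1990, factors_2008) / 1e9)  # ≈ 4.714
```

The point of such a normalisation is that a flat trend in the adjusted series indicates losses grew only because exposure grew, which is the paper's conclusion.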

  2. Normalised Mutual Information of High-Density Surface Electromyography during Muscle Fatigue

    Directory of Open Access Journals (Sweden)

    Adrian Bingham

    2017-12-01

This study developed a technique for identifying the presence of muscle fatigue based on the spatial changes of the normalised mutual information (NMI) between multiple high-density surface electromyography (HD-sEMG) channels. Muscle fatigue in the tibialis anterior (TA) during isometric contractions at 40% and 80% maximum voluntary contraction levels was investigated in ten healthy participants (age range: 21 to 35 years; mean age = 26 years; male = 4, female = 6). HD-sEMG was used to record 64 channels of sEMG using a 16 by 4 electrode array placed over the TA. The NMI of each electrode with every other electrode was calculated to form an NMI distribution for each electrode. The total NMI for each electrode (the summation of the electrode's NMI distribution) highlighted regions of high dependence in the electrode array and was observed to increase as the muscle fatigued. To summarise this increase, a function, M(k), was defined and was found to be significantly affected by fatigue and not by contraction force. The technique discussed in this study overcomes issues regarding electrode placement and was used to investigate how the dependences between sEMG signals within the same muscle change spatially during fatigue.
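The pairwise NMI computation underlying the technique can be sketched with a simple histogram estimator (the bin count and the sqrt-entropy normalisation are common conventions assumed here, not necessarily the study's exact choices):

```python
import math
from collections import Counter

def normalised_mutual_information(x, y, bins=8):
    """NMI between two equal-length signals, from histogram estimates
    of the marginal and joint distributions. I(X;Y) is normalised by
    sqrt(H(X) * H(Y)), one common convention; result lies in [0, 1]."""
    def discretise(sig):
        lo, hi = min(sig), max(sig)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in sig]
    xd, yd = discretise(x), discretise(y)
    n = len(xd)
    px, py = Counter(xd), Counter(yd)
    pxy = Counter(zip(xd, yd))
    hx = -sum(c / n * math.log(c / n) for c in px.values())
    hy = -sum(c / n * math.log(c / n) for c in py.values())
    mi = sum(c / n * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
             for (a, b), c in pxy.items())
    return mi / math.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0

# two short, loosely related hypothetical channel segments
sig_a = [0.0, 0.2, 0.9, 1.0, 0.1, 0.8]
sig_b = [0.1, 0.3, 0.8, 0.9, 0.0, 0.7]
print(round(normalised_mutual_information(sig_a, sig_b), 3))
```

In the study's setting this quantity would be computed for every electrode pair in the 16 by 4 array and summed per electrode to obtain the total NMI.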

  3. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposit on CAGR fuel pin surfaces is described. Using a data-base derived from capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, whilst preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  4. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

The concepts of "communication", "intercultural communication" and "model of communication" are analyzed in the article. The basic components of the communication process are singled out, and a model of intercultural communication is developed. Communicative, behavioral and complex skills are highlighted for the optimal organization of intercultural communication, the establishment of productive contact with a foreign partner to achieve mutual understanding, and the search for acceptable ways of organizing interaction and cooperation for both communicants. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  5. Symmetries and modelling functions for diffusion processes

    International Nuclear Information System (INIS)

    Nikitin, A G; Spichak, S V; Vedula, Yu S; Naumovets, A G

    2009-01-01

    A constructive approach to the theory of diffusion processes is proposed, which is based on application of both symmetry analysis and the method of modelling functions. An algorithm for construction of the modelling functions is suggested. This algorithm is based on the error function expansion (ERFEX) of experimental concentration profiles. The high-accuracy analytical description of the profiles provided by ERFEX approximation allows a convenient extraction of the concentration dependence of diffusivity from experimental data and prediction of the diffusion process. Our analysis is exemplified by its employment in experimental results obtained for surface diffusion of lithium on the molybdenum (1 1 2) surface precovered with dysprosium. The ERFEX approximation can be directly extended to many other diffusion systems.
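The ERFEX idea of building a modelling function from error-function terms can be sketched as follows (a single-term expansion with illustrative numbers; the authors fit multi-term expansions to measured concentration profiles):

```python
import math

def erfex_profile(x, coeffs, widths):
    """Modelling function built as an error-function expansion (ERFEX):
    a sum of complementary-error-function terms,
    c(x) = sum_i a_i * erfc(x / w_i) / 2.
    Coefficients and widths here are illustrative, not fitted values."""
    return sum(a * math.erfc(x / w) / 2.0 for a, w in zip(coeffs, widths))

# a single-term expansion reproduces the classical constant-D solution
# c(x, t) = (c0 / 2) * erfc(x / (2 * sqrt(D * t)))
D, t, c0 = 1e-2, 100.0, 1.0
profile = [erfex_profile(x, [c0], [2 * math.sqrt(D * t)]) for x in (0.0, 1.0, 2.0)]
print([round(c, 3) for c in profile])  # [0.5, 0.24, 0.079]
```

With several terms, the expansion can represent concentration-dependent diffusivity profiles analytically, which is what makes extracting D(c) from experimental data convenient.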

  6. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. It is therefore of interest to model the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic, and other factors. To improve and develop systems for managing the tourism business, economic-mathematical methods are being systematically introduced into this area, because increased competitiveness requires continuous and constructive change. Applying these methods allows processes in tourism to be analyzed and evaluated more systematically and with greater internal unity. A typical feature of some economic processes in tourist activities is that the effect of a factor on the indicators of a process appears not immediately but gradually, after a certain time lag. This delay has to be accounted for when developing mathematical models of tourism business processes. In this case, it is advisable to apply the economic-mathematical formalism of optimal control known as game theory to the simulation of such processes.

  7. Deep inelastic processes and the parton model

    International Nuclear Information System (INIS)

    Altarelli, G.

    The lecture was intended as an elementary introduction to the physics of deep inelastic phenomena from the point of view of theory. General formulae and facts concerning inclusive deep inelastic processes of the form l+N→l'+hadrons (electroproduction, neutrino scattering) are first recalled. Deep inelastic annihilation, e⁺e⁻→hadrons, is then considered. The light-cone approach, the parton model and their relation are the main points of emphasis.

  8. Survivability Assessment: Modeling A Recovery Process

    OpenAIRE

    Paputungan, Irving Vitra; Abdullah, Azween

    2009-01-01

    Survivability is the ability of a system to continue operating, in a timely manner, in the presence of attacks, failures, or accidents. Recovery in survivability is the process by which a system heals or recovers from damage as early as possible so that it can fulfil its mission as conditions permit. In this paper, we show a preliminary recovery model to enhance system survivability. The model focuses on how we preserve the system and resume its critical services under attack as soon as possible. Keywords: surv...

  9. Process Modeling With Inhomogeneous Thin Films

    Science.gov (United States)

    Machorro, R.; Macleod, H. A.; Jacobson, M. R.

    1986-12-01

    Designers of optical multilayer coatings commonly assume that the individual layers will be ideally homogeneous and isotropic. In practice, it is very difficult to control the conditions involved in the complex evaporation process sufficiently to produce such ideal films. Clearly, changes in process parameters, such as evaporation rate, chamber pressure, and substrate temperature, affect the microstructure of the growing film, frequently producing inhomogeneity in structure or composition. In many cases, these effects are interdependent, further complicating the situation. However, this process can be simulated on powerful, interactive, and accessible microcomputers. In this work, we present such a model and apply it to estimate the influence of an inhomogeneous layer on multilayer performance. Presently, the program simulates film growth, thermal expansion and contraction, and thickness monitoring procedures, and includes the effects of uncertainty in these parameters or noise. Although the model is being developed to cover very general cases, we restrict the present discussion to isotropic and nondispersive quarterwave layers to understand the particular effects of inhomogeneity. We studied several coating designs and related results and tolerances to variations in evaporation conditions. The model is composed of several modular subprograms, is written in Fortran, and is executed on an IBM-PC with 640 K of memory. The results can be presented in graphic form on a monochrome monitor. We are currently installing and implementing color capability to improve the clarity of the multidimensional output.
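
    The multilayer performance such a simulator evaluates rests on standard characteristic-matrix optics. A minimal sketch for an ideal quarter-wave mirror at normal incidence (indices and the design wavelength are illustrative; an inhomogeneous layer could be approximated by splitting it into graded sublayers):

```python
import cmath
import math

def layer_matrix(n, d, wavelength):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2.0 * math.pi * n * d / wavelength  # phase thickness
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def matmul(A, B):
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def reflectance(indices, wavelength, n_sub=1.52, n0=1.0):
    """R of a stack whose layers are quarter-wave thick at the given wavelength."""
    M = [[1, 0], [0, 1]]
    for n in indices:
        M = matmul(M, layer_matrix(n, wavelength / (4.0 * n), wavelength))
    B = M[0][0] + M[0][1] * n_sub
    C = M[1][0] + M[1][1] * n_sub
    r = (n0 * B - C) / (n0 * B + C)
    return abs(r) ** 2

# (HL)^3 H quarter-wave mirror on glass, nH = 2.35, nL = 1.38
stack = [2.35, 1.38] * 3 + [2.35]
print(reflectance(stack, 550.0))  # high reflectance at the design wavelength
```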

  10. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  11. [The dual process model of addiction. Towards an integrated model?].

    Science.gov (United States)

    Vandermeeren, R; Hebbrecht, M

    2012-01-01

    Neurobiology and cognitive psychology have provided us with a dual process model of addiction. According to this model, behavior is considered to be the dynamic result of a combination of automatic and controlling processes. In cases of addiction the balance between these two processes is severely disturbed. Automated processes will continue to produce impulses that ensure the continuance of addictive behavior. Weak, reflective or controlling processes are both the reason for and the result of the inability to forgo addiction. The aim was to identify features that are common to current neurocognitive insights into addiction and psychodynamic views on addiction. The picture that emerges from research is not clear. There is some evidence that attentional bias has a causal effect on addiction. There is no evidence that automatic associations have a causal effect, but there is some evidence that automatic action-tendencies do have a causal effect. Current neurocognitive views on the dual process model of addiction can be integrated with an evidence-based approach to addiction and with psychodynamic views on addiction.

  12. Improving the process of process modelling by the use of domain process patterns

    NARCIS (Netherlands)

    Koschmider, A.; Reijers, H.A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process

  13. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from ordering a book online until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  14. Equifinality and process-based modelling

    Science.gov (United States)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead to ambiguity but also methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.
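
    The parameter-equifinality facet can be made concrete with a toy rainfall-runoff model in which two parameters enter only through their product, so distinct parameter sets reproduce identical runoff (entirely illustrative, not SIMHYD):

```python
def simulate_runoff(rain, alpha, beta):
    """Minimal linear reservoir in which only the product alpha * beta
    (the effective recession constant) is identifiable from the output."""
    k = alpha * beta  # must be < 1 for a stable store
    store, flows = 0.0, []
    for r in rain:
        store += r          # all rainfall enters the store
        q = k * store       # linear release
        store -= q
        flows.append(q)
    return flows

rain = [5.0, 0.0, 10.0, 2.0, 0.0, 0.0]
a = simulate_runoff(rain, alpha=0.2, beta=2.0)  # k = 0.4
b = simulate_runoff(rain, alpha=0.8, beta=0.5)  # k = 0.4, different parameters
print(max(abs(x - y) for x, y in zip(a, b)))    # the two runoff series coincide
```

    No amount of runoff data can distinguish the two parameter sets; only additional knowledge (or internal-flux observations) can break the tie, which is the article's point about epistemic uncertainty.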

  15. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
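
    The geodesic-versus-chordal distinction can be sketched in a few lines: for antipodal points the geodesic distance is half the circumference while the chord is only one diameter, so a covariance function evaluated on the two metrics disagrees most where the sphere is "most curved" (the exponential covariance and range are chosen only for illustration):

```python
import math

def geodesic(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance (km) via the spherical law of cosines; degrees in."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    c = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dl)
    return R * math.acos(max(-1.0, min(1.0, c)))  # clamp for rounding safety

def chordal(lat1, lon1, lat2, lon2, R=6371.0):
    """Straight-line distance through the sphere between the same points."""
    theta = geodesic(lat1, lon1, lat2, lon2, R) / R
    return 2.0 * R * math.sin(theta / 2.0)

def exp_cov(d, range_km=5000.0):
    """Exponential covariance; valid with either metric on the sphere."""
    return math.exp(-d / range_km)

d_geo = geodesic(0, 0, 0, 180)  # antipodal: half the circumference
d_chd = chordal(0, 0, 0, 180)   # antipodal: one diameter
print(d_geo, d_chd, exp_cov(d_geo), exp_cov(d_chd))
```

    The chordal covariance is always the larger of the two at long range, which is one source of the "physically unrealistic distortions" the record mentions.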

  16. Modeling Dynamic Regulatory Processes in Stroke

    Science.gov (United States)

    McDermott, Jason E.; Jarman, Kenneth; Taylor, Ronald; Lancaster, Mary; Shankaran, Harish; Vartanian, Keri B.; Stevens, Susan L.; Stenzel-Poore, Mary P.; Sanfilippo, Antonio

    2012-01-01

    The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms. PMID:23071432
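
    The coupled-ODE idea, rolling a dynamic model forward from only an initial state, can be sketched with two hypothetical regulatory clusters and forward-Euler integration (the coefficients below are invented, not fitted to the stroke data):

```python
def integrate_clusters(x0, y0, steps=1000, dt=0.01):
    """Forward-Euler integration of two mutually regulating expression clusters:
        dx/dt = -0.5 x + 0.2 y   (x decays, is activated by y)
        dy/dt = -0.3 y + 0.1 x   (y decays, is activated by x)
    Only the initial state is required to roll the system forward."""
    x, y = x0, y0
    for _ in range(steps):
        dx = -0.5 * x + 0.2 * y
        dy = -0.3 * y + 0.1 * x
        x, y = x + dt * dx, y + dt * dy
    return x, y

print(integrate_clusters(1.0, 0.0))  # both clusters relax toward baseline
```

    Changing the initial input state, as the record describes, amounts to changing `x0`/`y0` and re-running; for this linear toy system the response scales with the perturbation.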

  17. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  18. Identifying Stable Reference Genes for qRT-PCR Normalisation in Gene Expression Studies of Narrow-Leafed Lupin (Lupinus angustifolius L.).

    Directory of Open Access Journals (Sweden)

    Candy M Taylor

    Full Text Available Quantitative Reverse Transcription PCR (qRT-PCR) is currently one of the most popular, high-throughput and sensitive technologies available for quantifying gene expression. Its accurate application depends heavily upon normalisation of gene-of-interest data with reference genes that are uniformly expressed under experimental conditions. The aim of this study was to provide the first validation of reference genes for Lupinus angustifolius (narrow-leafed lupin), a significant grain legume crop, using a selection of seven genes previously trialed as reference genes for the model legume, Medicago truncatula. In a preliminary evaluation, the seven candidate reference genes were assessed on the basis of primer specificity for their respective targeted region, PCR amplification efficiency, and ability to discriminate between cDNA and gDNA. Following this assessment, expression of the three most promising candidates [Ubiquitin C (UBC), Helicase (HEL), and Polypyrimidine tract-binding protein (PTB)] was evaluated using the NormFinder and RefFinder statistical algorithms in two narrow-leafed lupin lines, both with and without vernalisation treatment, and across seven organ types (cotyledons, stem, leaves, shoot apical meristem, flowers, pods and roots) encompassing three developmental stages. UBC was consistently identified as the most stable candidate and has sufficiently uniform expression that it may be used as a sole reference gene under the experimental conditions tested here. However, as organ type and developmental stage were associated with greater variability in relative expression, it is recommended to use UBC and HEL as a pair to achieve optimal normalisation. These results highlight the importance of rigorously assessing candidate reference genes for each species across a diverse range of organs and developmental stages. With emerging technologies, such as RNAseq, and the completion of valuable transcriptome data sets, it is possible that other

  19. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  20. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamic (MD) simulation offered reasonable explanation of CNTs dispersion and their motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model based on the finite element method (FEM) was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic details and mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into the CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords: Carbon

  1. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, and messages. We outline a run-time environment for the processing of events with multiple participants.

  2. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
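
    The GP regression at the core of such identification can be sketched in a few lines: the posterior mean of a zero-mean GP with an RBF kernel, using a small pure-Python linear solve (data and hyperparameters are illustrative, not from the book):

```python
import math

def rbf(a, b, ell=1.0, sf=1.0):
    """Squared-exponential (RBF) covariance between scalar inputs."""
    return sf ** 2 * math.exp(-0.5 * (a - b) ** 2 / ell ** 2)

def solve(A, y):
    """Gauss-Jordan elimination with partial pivoting for small systems."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_predict(X, y, x_star, noise=1e-6):
    """Posterior mean of a zero-mean GP: k_*^T (K + noise I)^{-1} y."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    return sum(a * rbf(x, x_star) for a, x in zip(alpha, X))

X, y = [0.0, 1.0, 2.0, 3.0], [0.0, 0.8, 0.9, 0.1]  # noisy samples of a smooth response
print(gp_predict(X, y, 1.5))  # smooth interpolation between observations
```

    In system identification the inputs would be lagged inputs/outputs of the plant rather than scalars, but the posterior-mean computation is the same.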

  3. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  4. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  5. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems across different branches of large organizations: they unify commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.
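
    The exhaustive-search strategy is the easiest of the three to sketch: score every assignment of the configurable nodes against the event log and keep the best. The activity names, log, and fitness function below are invented for illustration, not the article's:

```python
from itertools import product

MANDATORY = {"register", "decide"}                    # always in the model
CONFIGURABLE = ["check_credit", "send_letter", "archive"]  # keep or drop each

def fitness(kept, log):
    """Share of logged events allowed by the configured model, minus a
    small penalty for enabling activities that never occur in the log."""
    allowed = MANDATORY | set(kept)
    events = [e for trace in log for e in trace]
    covered = sum(e in allowed for e in events) / len(events)
    unused = sum(all(a not in trace for trace in log) for a in kept)
    return covered - 0.1 * unused

def exhaustive(log):
    """Try all 2^n configurations of the configurable nodes."""
    best, best_f = None, float("-inf")
    for choice in product([False, True], repeat=len(CONFIGURABLE)):
        kept = [a for a, c in zip(CONFIGURABLE, choice) if c]
        f = fitness(kept, log)
        if f > best_f:
            best, best_f = kept, f
    return best

log = [["register", "check_credit", "decide"],
       ["register", "decide"],
       ["register", "check_credit", "decide"]]
print(exhaustive(log))  # only the activity the branch actually uses survives
```

    The exponential blow-up the abstract warns about is visible in the `2^n` loop, which is what motivates the genetic and greedy alternatives.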

  6. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation on the basis of Stokes' law has been done for a wet sizing process on cylindrical equipment of laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - Residence time distribution function of emulsion particles in the separating zone of the equipment, depending on flow-rate, height, diameter and structure of the equipment. - Size-distribution function in the fine and coarse parts, depending on the residence time distribution function of emulsion particles, characteristics of the material being processed, such as specific density and shape, and characteristics of the classification environment, such as specific density and viscosity. - An experimental model was developed on data collected from an experimental cylindrical equipment with diameter x height of sedimentation chamber equal to 50 x 40 cm for an emulsion of zirconium silicate in water. - Using this experimental model allows determination of the optimal flow-rate in order to obtain product with desired grain size in terms of average size or size distribution function. (author)
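
    The sedimentation physics underlying such a model is Stokes' law; a minimal sketch relating terminal settling velocity to the classifier's cut size (the particle density is an assumed value for zirconium silicate, and all names are mine):

```python
def stokes_velocity(d, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m)
    in the Stokes regime: v = g d^2 (rho_p - rho_f) / (18 mu)."""
    return g * d ** 2 * (rho_p - rho_f) / (18.0 * mu)

def cut_size(v, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Diameter whose settling velocity equals the upward flow velocity v;
    smaller particles are carried to the fines, larger ones settle."""
    return (18.0 * mu * v / (g * (rho_p - rho_f))) ** 0.5

v = stokes_velocity(1e-5, rho_p=4560.0)  # 10-micron zirconium silicate in water
print(v, cut_size(v, rho_p=4560.0))      # cut_size inverts stokes_velocity
```

    Choosing the flow-rate to hit a desired cut size is exactly the optimisation the record describes.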

  7. Three-dimensional model for fusion processes

    International Nuclear Information System (INIS)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass-flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is the observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. Then, the relationship to these questions will be developed. A unified model of fusion processes applicable to many astronomical phenomena will be proposed and discussed.

  8. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water-quality standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorus primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated.
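
    The Monod-type kinetics mentioned above can be sketched as a single substrate consumed by a fixed biomass, integrated with forward Euler (parameter values are invented, not CW2D's or the ASM's):

```python
def monod_decay(S0, mu_max=4.0, Ks=10.0, X=2.0, Y=0.5, dt=0.01, t_end=5.0):
    """Substrate S consumed by biomass X under Monod kinetics:
        dS/dt = -(mu_max / Y) * S / (Ks + S) * X
    Forward-Euler integration; returns S at t_end."""
    S, t = S0, 0.0
    while t < t_end:
        rate = -(mu_max / Y) * S / (Ks + S) * X
        S = max(0.0, S + dt * rate)  # substrate cannot go negative
        t += dt
    return S

print(monod_decay(50.0))  # substrate is largely consumed by t_end
```

    At high S the rate saturates (zero-order behaviour); at low S it collapses to the first-order kinetics that simpler CW design rules assume.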

  9. Identification of endogenous control genes for normalisation of real-time quantitative PCR data in colorectal cancer.

    LENUS (Irish Health Repository)

    Kheirelseid, Elrasheid A H

    2010-01-01

    BACKGROUND: Gene expression analysis has many applications in cancer diagnosis, prognosis and therapeutic care. Relative quantification is the most widely adopted approach, whereby quantification of gene expression is normalised relative to an endogenously expressed control (EC) gene. Central to the reliable determination of gene expression is the choice of control gene. The purpose of this study was to evaluate a panel of candidate EC genes from which to identify the most stably expressed gene(s) to normalise RQ-PCR data derived from primary colorectal cancer tissue. RESULTS: The expression of thirteen candidate EC genes: B2M, HPRT, GAPDH, ACTB, PPIA, HCRT, SLC25A23, DTX3, APOC4, RTDR1, KRTAP12-3, CHRNB4 and MRPL19 was analysed in a cohort of 64 colorectal tumours and tumour-associated normal specimens. The CXCL12, FABP1, MUC2 and PDCD4 genes were chosen as target genes against which the effect of each EC gene on gene expression could be compared. Data analysis using descriptive statistics, geNorm, NormFinder and qBasePlus indicated significant differences in variance between candidate EC genes. We determined that two genes were required for optimal normalisation and identified B2M and PPIA as the most stably expressed and reliable EC genes. CONCLUSION: This study identified a combination of two EC genes (B2M and PPIA) that more accurately normalised RQ-PCR data in colorectal tissue. Although these control genes might not be optimal for use in other cancer studies, the approach described herein could serve as a template for the identification of valid ECs in other cancer types.
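
    A geNorm-style stability measure, one of the algorithms named above, can be sketched as the average standard deviation of pairwise log-ratios across samples (gene names are from the record; the expression values are invented for illustration):

```python
import math

def m_value(gene, expr):
    """geNorm-style stability M for one gene: the average, over all other
    candidates, of the SD of log2 expression ratios across samples.
    Genes with constant ratios to the others get low M (= stable)."""
    def sd(xs):
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
    others = [g for g in expr if g != gene]
    return sum(
        sd([math.log2(a / b) for a, b in zip(expr[gene], expr[other])])
        for other in others
    ) / len(others)

# Hypothetical relative quantities across five colorectal specimens
expr = {
    "B2M":  [1.00, 1.10, 0.90, 1.00, 1.05],
    "PPIA": [1.00, 1.05, 0.95, 1.00, 1.10],
    "HCRT": [1.00, 3.00, 0.40, 2.50, 0.30],
}
ranked = sorted(expr, key=lambda g: m_value(g, expr))
print(ranked)  # the stable pair ranks ahead of the unstable gene
```

    With these made-up numbers the stable pair (B2M, PPIA) outranks the unstable candidate, mirroring the study's conclusion that two stable ECs should be combined.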

  10. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin... practices using INR POCT in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality.
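
    Warfarin-management quality of the kind studied here is commonly summarised as time in therapeutic range (TTR) computed by Rosendaal linear interpolation between INR measurements. A sketch under that assumption (TTR is not named in the truncated record, and the sampling approximation of the interpolated segment is mine):

```python
def time_in_range(days, inrs, low=2.0, high=3.0, steps=100):
    """Fraction of time the linearly interpolated INR lies in [low, high].
    Each segment between consecutive measurements is sampled at `steps`
    midpoints (an approximation of exact Rosendaal interpolation)."""
    in_range = total = 0.0
    points = list(zip(days, inrs))
    for (d0, i0), (d1, i1) in zip(points, points[1:]):
        span = d1 - d0
        total += span
        for s in range(steps):
            inr = i0 + (i1 - i0) * (s + 0.5) / steps  # midpoint sample
            if low <= inr <= high:
                in_range += span / steps
    return in_range / total

# INR drifting linearly from 1.0 to 4.0 over 10 days: in range ~1/3 of the time
print(time_in_range([0, 10], [1.0, 4.0]))
```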

  11. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments...-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources... to be employed for validation and fine-tuning of the solutions from the model-based framework, thereby, removing the need for trial and error experimental steps. Also, questions related to economic feasibility, operability and sustainability, among others, can be considered in the early stages of design. However...

  12. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms, nonphotosynthetic mechanisms developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process poses problems. On Mars, the high intensity of UV light at the surface is a concern, and alternative mechanisms need to be defined and analyzed. In the search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide, and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  13. Diabetic ketoacidosis in adult patients: an audit of factors influencing time to normalisation of metabolic parameters.

    Science.gov (United States)

    Lee, Melissa H; Calder, Genevieve L; Santamaria, John D; MacIsaac, Richard J

    2018-05-01

    Diabetic ketoacidosis (DKA) is an acute life-threatening metabolic complication of diabetes that imposes a substantial burden on our healthcare system. There is a paucity of published data in Australia assessing factors influencing time to resolution of DKA and length of stay (LOS). To identify factors that predict a slower time to resolution of DKA in adults with diabetes. Retrospective audit of patients admitted to St Vincent's Hospital Melbourne between 2010 and 2014 coded with a diagnosis of 'Diabetic Ketoacidosis'. The primary outcome was time to resolution of DKA based on normalisation of biochemical markers. Episodes of DKA within the wider Victorian hospital network were also explored. Seventy-one patients met biochemical criteria for DKA; median age 31 years (26-45 years), 59% were male and 23% had newly diagnosed diabetes. Insulin omission was the most common precipitant (42%). Median time to resolution of DKA was 11 h (6.5-16.5 h). Individual factors associated with slower resolution of DKA were lower admission pH (P < 0.001) and higher admission serum potassium level (P = 0.03). Median LOS was 3 days (2-5 days), compared to a Victorian state-wide LOS of 2 days. Higher comorbidity scores were associated with longer LOS (P < 0.001). Lower admission pH levels and higher admission serum potassium levels are independent predictors of slower time to resolution of DKA. This may assist in stratifying patients with DKA using markers of severity, to determine who may benefit from closer monitoring and to predict LOS. © 2018 Royal Australasian College of Physicians.

  14. Quantification of tumour 18F-FDG uptake: Normalise to blood glucose or scale to liver uptake?

    Energy Technology Data Exchange (ETDEWEB)

    Keramida, Georgia [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Brighton and Sussex University Hospitals NHS Trust, Department of Nuclear Medicine, Brighton (United Kingdom); University of Sussex, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Dizdarevic, Sabina; Peters, A.M. [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Brighton and Sussex University Hospitals NHS Trust, Department of Nuclear Medicine, Brighton (United Kingdom); Bush, Janice [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom)

    2015-09-15

    To compare normalisation to blood glucose (BG) with scaling to hepatic uptake for quantification of tumour 18F-FDG uptake, using the brain as a surrogate for tumours. Standardised uptake value (SUV) was measured over the liver, cerebellum, basal ganglia, and frontal cortex in 304 patients undergoing 18F-FDG PET/CT. The relationship between brain FDG clearance and SUV was theoretically defined. Brain SUV decreased exponentially with BG, with similar constants between cerebellum, basal ganglia, and frontal cortex (0.099-0.119 (mmol/l)^-1) and similar to values for tumours estimated from the literature. Liver SUV, however, correlated positively with BG. Brain-to-liver SUV ratio therefore showed an inverse correlation with BG, well fitted with a hyperbolic function (R = 0.83), as theoretically predicted. Brain SUV normalised to BG (nSUV) displayed a nonlinear correlation with BG (R = 0.55); however, as theoretically predicted, brain nSUV/liver SUV showed almost no correlation with BG. Correction of brain SUV using BG raised to an exponential power of 0.099 (mmol/l)^-1 also eliminated the correlation between brain SUV and BG. Brain SUV continues to correlate with BG after normalisation to BG. Likewise, liver SUV is unsuitable as a reference for tumour FDG uptake. Brain SUV divided by liver SUV, however, shows minimal dependence on BG. (orig.)
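    The exponential correction described above can be sketched in a few lines. The constant k = 0.099 (mmol/l)^-1 is taken from the abstract; the reference glucose level of 5.0 mmol/l and the function name are illustrative assumptions.

```python
import math

def corrected_suv(suv, bg_mmol_l, k=0.099, bg_ref=5.0):
    """Exponential blood-glucose correction of SUV (sketch).

    Brain SUV falls roughly as exp(-k * BG), so multiplying by
    exp(k * (BG - bg_ref)) removes the BG dependence relative to an
    assumed normal reference glucose level bg_ref.
    """
    return suv * math.exp(k * (bg_mmol_l - bg_ref))
```

    For an uptake that decays exactly as exp(-k·BG), the corrected value is the same at any glucose level, which is the property the authors report.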

  15. Analysis of a simulated microarray dataset: Comparison of methods for data normalisation and detection of differential expression (Open Access publication)

    Directory of Open Access Journals (Sweden)

    Mouzaki Daphné

    2007-11-01

    Full Text Available Microarrays allow researchers to measure the expression of thousands of genes in a single experiment. Before statistical comparisons can be made, the data must be assessed for quality and normalisation procedures must be applied, of which many have been proposed. Methods of comparing the normalised data are also abundant, and no clear consensus has yet been reached. The purpose of this paper was to compare those methods used by the EADGENE network on a very noisy simulated data set. With the a priori knowledge of which genes are differentially expressed, it is possible to compare the success of each approach quantitatively. Use of an intensity-dependent normalisation procedure was common, as was correction for multiple testing. Most variety in performance resulted from differing approaches to data quality and the use of different statistical tests. Very few of the methods used any kind of background correction. A number of approaches achieved a success rate of 95% or above, with relatively small numbers of false positives and negatives. Applying stringent spot selection criteria and elimination of data did not improve the false positive rate and greatly increased the false negative rate. However, most approaches performed well, and it is encouraging that widely available techniques can achieve such good results on a very noisy data set.
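    With the a priori truth of a simulated data set, each approach can be scored by confusion counts, as described above. A minimal sketch (function and gene names are illustrative, not EADGENE code):

```python
def confusion_counts(called_de, true_de, all_genes):
    """Compare genes called differentially expressed (DE) against the
    known truth of a simulated data set; returns (TP, FP, FN, TN)."""
    called, truth = set(called_de), set(true_de)
    tp = len(called & truth)          # correctly called DE
    fp = len(called - truth)          # false positives
    fn = len(truth - called)          # false negatives
    tn = len(set(all_genes)) - tp - fp - fn
    return tp, fp, fn, tn
```

    The success rate quoted in the abstract is then TP / (TP + FN) for the truly differentially expressed genes.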

  16. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.; Hering, Amanda S.

    2017-01-01

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  18. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

    textabstractE-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) formal computational model, and (c) an integrated view of the busi...

  19. Modeling Aspects Of Activated Sludge Processes Part I: Process Modeling Of Activated Sludge Flocculation And Sedimentation

    International Nuclear Information System (INIS)

    Ibrahim, H. I.; EI-Ahwany, A.H.; Ibrahim, G.

    2004-01-01

    Process modeling of activated sludge flocculation and sedimentation is reviewed, considering activated sludge floc characteristics such as morphology, viable to non-viable cell ratio, density and water content. Bioflocculation and its kinetics were studied, considering the characteristics of bioflocculation and the theory of Divalent Cation Bridging, which describes the major role of cations in bioflocculation. Activated sludge flocculation process modeling was studied considering mass transfer limitations, from Clifft and Andrews, 1981, and Benefield and Molz, 1983, through Henze, 1987, to Tyagi, 1996 and G. Ibrahim et al., 2002. Models of aggregation and breakage of flocs were studied by Spicer and Pratsinis, 1996, and Biggs, 2002. Size distribution of flocs influences mass transfer and biomass separation in the activated sludge process. Therefore, it is of primary importance to establish the role of specific process operation factors, such as sludge loading, dynamic sludge age and dissolved oxygen, on this distribution, with special emphasis on the formation of primary particles.
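    The aggregation-and-breakage modeling mentioned above can be illustrated with a toy discrete population balance. The size-independent rate constants and the simple binary breakage rule are assumptions for this sketch, not the models of Spicer and Pratsinis or Biggs.

```python
def pbe_step(n, agg_rate, brk_rate, dt):
    """One explicit Euler step of a minimal discrete population balance
    for floc aggregation and binary breakage.

    n[i] is the number concentration of flocs made of i+1 primary
    particles; rate constants are assumed size-independent."""
    k = len(n)
    dn = [0.0] * k
    # Aggregation: i-mer + j-mer -> (i+j)-mer; growth beyond the largest
    # tracked size is discarded. The 0.5 compensates double counting of
    # ordered pairs, as in the Smoluchowski equation.
    for i in range(k):
        for j in range(k):
            r = agg_rate * n[i] * n[j]
            dn[i] -= r
            if i + j + 1 < k:
                dn[i + j + 1] += 0.5 * r
    # Binary breakage: a floc of m+1 primaries splits into two halves.
    for m in range(1, k):
        r = brk_rate * n[m]
        dn[m] -= r
        a = (m + 1) // 2          # fragment sizes a and (m+1)-a
        dn[a - 1] += r
        dn[m - a] += r
    return [x + dt * d for x, d in zip(n, dn)]
```

    Iterating this step from a monomer-only state shows how the floc size distribution evolves under the competition of aggregation and breakage; total primary-particle mass is conserved as long as no flocs grow past the largest tracked size.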

  20. Quantitative seafloor characterization using angular backscatter data of the multi-beam echo-sounding system - Use of models and model free techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.

    processing gain, bottom slope corrections, and bottom insonification area normalisation were proposed to generate angular backscattering strength for modelling to infer bottom roughness parameters. A software package (NORGCOR) for similar purpose... bottom backscatter data from multibeam system. For each seafloor area, processed backscatter strength values [presented in Fig. 1(c)] are binned at intervals of 1° from -45° to +45°, and averaged over the entire dataset (approximately 100...

  1. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  2. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    Directory of Open Access Journals (Sweden)

    Stanislav Vladimirovich Daletskiy

    2017-01-01

    Full Text Available Aircraft maintenance is realized as a rapid sequence of organizational and technical maintenance states, whose research and analysis are carried out by statistical methods. The maintenance process comprises aircraft technical states, connected with the objective patterns of change in the technical qualities of the aircraft as a maintenance object, and organizational states, which determine the subjective organization and planning of aircraft use. The objective maintenance process is realized in the Maintenance and Repair System, which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors and documentation that sets the rules of their interaction for maintaining aircraft reliability and readiness for flight. The aircraft organizational and technical states are considered; their characteristics and heuristic estimates of the connections in the knots and arcs of the graphs, both during regular maintenance and at technical state failure, are given. It is shown that in real conditions of aircraft maintenance, planned control of the aircraft technical state, and maintenance control through it, is defined only by Maintenance and Repair conditions for a given Maintenance and Repair type and form structure, and correspondingly by the principles of assigning Maintenance and Repair work types for execution, according to the maintenance and reconstruction strategies of the aircraft and all its units. The realization of the planned Maintenance and Repair process determines one of the constant maintenance components. The proposed graphical models allow quantitative correlations between graph knots to be revealed in order to improve maintenance processes by statistical research methods, which reduces manning, timetable and expenses for providing safe civil aviation aircraft maintenance.

  3. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    ... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  4. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    2005-01-01

    Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superposition. We also discuss how...... to simulate specific Cox processes....

  5. Heat source model for welding process

    International Nuclear Information System (INIS)

    Doan, D.D.

    2006-10-01

    One of the major industrial stakes of welding simulation relates to the control of the mechanical effects of the process (residual stress, distortions, fatigue strength...). These effects are directly dependent on the temperature evolutions imposed during the welding process. To model this thermal loading, an original method is proposed instead of the usual methods, such as the equivalent heat source approach or the multi-physical approach. This method is based on the estimation of the weld pool shape together with the heat flux crossing the liquid/solid interface, from experimental data measured in the solid part. Its originality consists in solving an inverse Stefan problem specific to the welding process, and it is shown how to estimate the parameters of the weld pool shape. To solve the heat transfer problem, the liquid/solid interface is modeled by a Bezier curve (2-D) or a Bezier surface (3-D). This approach is well adapted to the wide diversity of weld pool shapes met in the majority of current welding processes (TIG, MIG-MAG, laser, electron beam, hybrid). The number of parameters to be estimated is small: from 2 to 5 in 2-D and from 7 to 16 in 3-D, according to the cases considered. A sensitivity study leads to specifying the location of the sensors, their number and the set of measurements required for a good estimate. The application of the method to test results of TIG welding on thin stainless steel sheets, in fully penetrating and non-penetrating configurations, shows that only one measurement point is enough to estimate the various weld pool shapes in 2-D, and two points in 3-D, whether the penetration is full or not. In the last part of the work, a methodology is developed for the transient analysis. It is based on Duvaut's transformation, which overcomes the discontinuity at the liquid metal interface and therefore gives a continuous variable over the whole spatial domain. Moreover, it allows working on a fixed mesh grid, and the new inverse problem is equivalent to identifying a source
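    The appeal of the Bezier description above is that a whole interface shape is carried by a handful of control points. A small de Casteljau evaluator illustrates this; the cubic control polygon below is a hypothetical 2-D weld-pool front, not data from the thesis.

```python
def bezier_point(control_pts, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using
    de Casteljau's algorithm (numerically stable, no binomials)."""
    pts = [tuple(p) for p in control_pts]
    while len(pts) > 1:
        # Repeated linear interpolation between consecutive points.
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Hypothetical 2-D liquid/solid front: 4 control points give a cubic
# curve, so only a few unknowns parameterise the interface shape.
front = [(0.0, 0.0), (1.0, -2.0), (3.0, -2.0), (4.0, 0.0)]
```

    With 4 points in 2-D (or a small grid of points for a 3-D surface), the inverse problem reduces to estimating those few coordinates, consistent with the 2-to-5 and 7-to-16 parameter counts quoted above.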

  6. Mechanical-mathematical modeling for landslide process

    Science.gov (United States)

    Svalova, V.

    2009-04-01

    500 m and displacement of a landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a motion of 53 cm. Catastrophic activation of the deep blockglide landslide in the area of Khoroshevo in Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new 220 m long creeping block separated from the plateau and began sinking, with displacement of the plateau surface reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid XIX century. The sliding area of Khoroshevo was stable for a long time without manifestations of activity. Revealing the reasons for the deformation and developing means of protection against deep landslide motion is an extremely important and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The reasons for activation and protective measures are discussed. The structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used for modelling the behaviour of matter on landslide slopes. The equation of continuity and an approximated Navier-Stokes equation for slow motion in a thin layer were used. The results of modelling make it possible to define the place of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of matter movement on a landslide slope.

  7. R-process nucleosynthesis: a dynamical model

    Energy Technology Data Exchange (ETDEWEB)

    Hillebrandt, W; Takahashi, K [Technische Hochschule Darmstadt (Germany, F.R.). Inst. fuer Kernphysik; Kodama, T [Centro Brasileiro de Pesquisas Fisicas, Rio de Janeiro

    1976-10-01

    The synthesis of heavy and neutron-rich elements (with mass number A > approximately 70) is reconsidered in the framework of a dynamical supernova model. The synthesis equation for the rapid neutron-capture (or r-) process and the hydrodynamical equations for the supernova explosion are solved simultaneously. Improved systematics of nuclear parameters are used, and the energy release due to beta-decays as well as the energy loss due to neutrinos is taken into account. It is shown that the observed solar-system abundance curve can be reproduced fairly well by assuming only one supernova event on a time-scale of the order of 1 s. However, there are still some discrepancies which may be explained by uncertainties in the nuclear data used.
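    In the simplest waiting-point picture, the synthesis equations solved above reduce to a chain of beta-decay flow equations, dY_i/dt = λ_{i-1} Y_{i-1} − λ_i Y_i. A toy explicit-Euler sketch of such a chain (the decay constants and step size are illustrative, not the paper's nuclear data):

```python
def beta_flow_step(y, lam, dt):
    """One explicit Euler step of a toy beta-decay flow chain.

    y[i] is the abundance of species i; lam[i] is its decay constant
    (len(lam) == len(y) - 1, since the last species is stable):
        dY_i/dt = lam[i-1] * Y[i-1] - lam[i] * Y[i]
    """
    out = []
    for i in range(len(y)):
        gain = lam[i - 1] * y[i - 1] if i > 0 else 0.0
        loss = lam[i] * y[i] if i < len(y) - 1 else 0.0
        out.append(y[i] + dt * (gain - loss))
    return out
```

    Because every loss term reappears as the next species' gain, the total abundance is conserved, matter only flows toward heavier, more stable species, mimicking the r-process flow toward the abundance peaks.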

  8. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the implementation methodology of the KANBAN system and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
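    As a minimal illustration of combining KANBAN with discrete event simulation, the sketch below models a single kanban loop with an event queue. The deterministic demand and processing times, and the parameter names, are simplifying assumptions, not the paper's model.

```python
import heapq

def simulate_kanban(n_cards, proc_time, demand_interval, horizon):
    """Minimal discrete-event sketch of a single kanban loop.

    Demands pull finished items from a buffer capped at n_cards; each
    withdrawal frees a kanban card, which triggers a replenishment
    order taking proc_time. Returns (served, lost) demand counts."""
    events = [(demand_interval, "demand")]   # (time, kind) priority queue
    stock, served, lost = n_cards, 0, 0      # buffer starts full
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "demand":
            if stock > 0:
                stock -= 1
                served += 1
                # Freed card authorises production of one replacement.
                heapq.heappush(events, (t + proc_time, "replenish"))
            else:
                lost += 1                    # stockout: demand not met
            heapq.heappush(events, (t + demand_interval, "demand"))
        else:                                # replenishment completes
            stock += 1
    return served, lost
```

    Running the loop for different card counts is exactly the kind of quick experiment the abstract describes: too few cards produce stockouts, while extra cards only add inventory.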

  9. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity requires a longer action, often characterized by a degree of uncertainty and insecurity in terms of the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a real economic horizon. Often in such cases, it is considered that the simulation technique is the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Carrying out a simulation experiment is a process that takes place in several stages.

  10. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  11. Elliptic Determinantal Processes and Elliptic Dyson Models

    Science.gov (United States)

    Katori, Makoto

    2017-10-01

    We introduce seven families of stochastic systems of interacting particles in one dimension corresponding to the seven families of irreducible reduced affine root systems. We prove that they are determinantal in the sense that all spatio-temporal correlation functions are given by determinants controlled by a single function called the spatio-temporal correlation kernel. For the four families A_{N-1}, B_N, C_N and D_N, we identify the systems of stochastic differential equations solved by these determinantal processes, which will be regarded as the elliptic extensions of the Dyson model. Here we use the notion of martingales in probability theory and the elliptic determinant evaluations of the Macdonald denominators of irreducible reduced affine root systems given by Rosengren and Schlosser.

  12. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of the most advanced technologies and equipment. Increasingly common are such methods as barley toasting, grain extrusion, steaming and grain flattening, boiling-bed explosion, infrared-ray treatment of cereals and legumes followed by flattening, and one-time or two-time granulation of purified whole grain without humidification in matrix presses, followed by grinding of the granules. These methods require special apparatuses, machines and auxiliary equipment, created on the basis of mathematical models compiled by different methods. In roasting, simulating the heat fields arising in the working chamber provides conditions for the decomposition of a portion of the starch to monosaccharides, which makes the grain sweetish, although due to protein denaturation the digestibility of the protein and the availability of amino acids decrease somewhat. Grain is roasted mainly for young animals in order to teach them to eat food at an early age, stimulate the secretory activity of digestion, and better develop the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in feeding animals. These feeds are preliminarily ground, and then cooked for 1 hour or steamed for 30-40 minutes in the feed mill. Such processing allows inactivation of the anti-nutrients in these feeds, which reduce the effectiveness of their use. After processing, legumes are used as protein supplements in an amount of 25-30% of the total nutritional value of the diet. But it is recommended to cook and steam only grain of good quality. A poor-quality grain that has been stored for a long time and damaged by pathogenic micro flora is subject to

  13. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could easily be modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes.
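    The modular design described above can be sketched as a set of process models sharing one interface, so that any one of them can be swapped out independently. The class names, state variables and rate laws below are illustrative assumptions, not the actual YMIM code.

```python
class ProcessModule:
    """Common interface: each process model advances a shared state."""
    def step(self, state, dt):
        raise NotImplementedError

class ContainerFailure(ProcessModule):
    def __init__(self, rate):
        self.rate = rate  # fraction of intact containers failing per year
    def step(self, state, dt):
        state["intact"] *= (1.0 - self.rate * dt)
        return state

class Dissolution(ProcessModule):
    def __init__(self, rate):
        self.rate = rate  # fraction of exposed inventory dissolving per year
    def step(self, state, dt):
        exposed = state["inventory"] * (1.0 - state["intact"])
        dissolved = self.rate * exposed * dt
        state["inventory"] -= dissolved
        state["released"] += dissolved
        return state

def run(modules, state, dt, n_steps):
    """Integrator: chains the modules, oblivious to their internals."""
    for _ in range(n_steps):
        for m in modules:
            state = m.step(state, dt)
    return state
```

    Replacing, say, the constant failure rate with a temperature-dependent one only touches one class; the integrator and the other modules are unchanged, which is the design goal the abstract describes.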

  14. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

    The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how good a given process model describes recorded executions of the actual process. Recently,

  15. On the correlation between process model metrics and errors

    NARCIS (Netherlands)

    Mendling, J.; Neumann, G.; Aalst, van der W.M.P.; Grundy, J.; Hartmann, S.; Laender, S.; Maciaszek, L.; Roddick, J.F.

    2007-01-01

    Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice there are hardly empirical results available on quality aspects of process

  16. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is thus widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN model based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  17. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...... aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives while a reverse modelling method (also...... developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented....

  19. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital-based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abductions, pedophiles, missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not negate the ability that, once the initial field triage is concluded, the system(s)/storage media be transported back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real-world cases, and its investigative importance and pragmatic approach have been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model's forensic soundness, investigative support capabilities and practical considerations.

  20. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling the de facto standard BPMN has emerged. However, applications of this notation use many subsets of its elements and various extensions. Also, BPMN still coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of modelers is a central notion in the choice of modeling languages and notations, in most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection, the business process modeling goal is not formalized and not transparently taken into account. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  1. Atmospheric pollution. From processes to modelling

    International Nuclear Information System (INIS)

    Sportisse, B.

    2008-01-01

    Air quality, greenhouse effect, ozone hole, chemical or nuclear accidents... All these phenomena are tightly linked to the chemical composition of the atmosphere and to the atmospheric dispersion of pollutants. This book aims at supplying the main elements for understanding 'atmospheric pollution': stakes, physical processes involved, role of scientific expertise in decision making. Content: 1 - classifications and scales: chemical composition of the atmosphere, vertical structure, time scales (transport, residence); 2 - matter/light interaction: notions of radiative transfer, application to the Earth's atmosphere; 3 - some elements about the atmospheric boundary layer: notion of scales in meteorology, atmospheric boundary layer (ABL), thermal stratification and stability, description of ABL turbulence, elements of atmospheric dynamics, some elements about the urban climate; 4 - notions of atmospheric chemistry: characteristics, stratospheric ozone chemistry, tropospheric ozone chemistry, brief introduction to indoor air quality; 5 - aerosols, clouds and rains: aerosols and particulates, aerosols and clouds, acid rains and leaching; 6 - towards numerical simulation: equation of reactive dispersion, numerical methods for chemistry-transport models, numerical resolution of the general equation of aerosols dynamics (GDE), modern simulation chains, perspectives. (J.S.)

  2. Signal Processing Model for Radiation Transport

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H

    2008-07-28

    This note describes the design of a simplified gamma ray transport model for use in designing a sequential Bayesian signal processor for low-count detection and classification. It uses a simple one-dimensional geometry to describe the emitting source, shield effects, and detector (see Fig. 1). At present, only Compton scattering and photoelectric absorption are implemented for the shield and the detector. Other effects may be incorporated in the future by revising the expressions for the probabilities of escape and absorption. Pair production would require a redesign of the simulator to incorporate photon correlation effects. The initial design incorporates the physical effects that were present in the previous event mode sequence simulator created by Alan Meyer. The main difference is that this simulator transports the rate distributions instead of single photons. Event mode sequences and other time-dependent photon flux sequences are assumed to be marked Poisson processes that are entirely described by their rate distributions. Individual realizations can be constructed from the rate distribution using a random Poisson point sequence generator.
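    The core assumption stated above — that event mode sequences are marked Poisson processes entirely described by their rate distributions, so individual realizations can be reconstructed with a random Poisson point sequence generator — can be illustrated with a minimal sketch. The decaying rate function and all numeric values below are hypothetical, not taken from the note:

    ```python
    import math
    import random

    def sample_poisson_events(rate, t_max, rate_max, seed=0):
        """Sample event times from an inhomogeneous Poisson process on [0, t_max]
        by thinning: propose candidates at the bounding rate rate_max and accept
        each with probability rate(t) / rate_max."""
        rng = random.Random(seed)
        events, t = [], 0.0
        while True:
            t += rng.expovariate(rate_max)  # next candidate arrival time
            if t > t_max:
                return events
            if rng.random() < rate(t) / rate_max:
                events.append(t)

    # Hypothetical decaying source rate (counts per second).
    rate = lambda t: 50.0 * math.exp(-t / 10.0)
    times = sample_poisson_events(rate, t_max=20.0, rate_max=50.0)
    ```

    Thinning works for any rate function bounded by `rate_max`, which is why transporting the rate distribution, rather than single photons, suffices to generate realizations when needed.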

  3. Evaluating Translational Research: A Process Marker Model

    Science.gov (United States)

    Trochim, William; Kane, Cathleen; Graham, Mark J.; Pincus, Harold A.

    2011-01-01

    Abstract Objective: We examine the concept of translational research from the perspective of evaluators charged with assessing translational efforts. One of the major tasks for evaluators involved in translational research is to help assess efforts that aim to reduce the time it takes to move research to practice and health impacts. Another is to assess efforts that are intended to increase the rate and volume of translation. Methods: We offer an alternative to the dominant contemporary tendency to define translational research in terms of a series of discrete “phases.” Results: We contend that this phased approach has been confusing and that it is insufficient as a basis for evaluation. Instead, we argue for the identification of key operational and measurable markers along a generalized process pathway from research to practice. Conclusions: This model provides a foundation for the evaluation of interventions designed to improve translational research and the integration of these findings into a field of translational studies. Clin Trans Sci 2011; Volume 4: 153–162 PMID:21707944

  4. Modelling of fiberglass pipe destruction process

    Directory of Open Access Journals (Sweden)

    А. К. Николаев

    2017-03-01

    Full Text Available The article deals with an important current issue in the oil and gas industry: the use of tubes made of high-strength, corrosion-resistant composite materials. In order to improve the operational safety of industrial pipes it is feasible to use composite fiberglass tubes. More than half of the accidents at oil and gas sites happen at oil gathering systems due to the high corrosiveness of the pumped fluid. To reduce the number of accidents and improve environmental protection, the issue of industrial pipe durability needs to be solved. This problem can be addressed by using composite fiberglass materials, which have the physical and mechanical properties required for oil pipes. The durability and strength can be controlled through the fiberglass winding method, the number of layers in the composite material, and the high corrosion resistance of fiberglass. The use of high-strength composite materials in oil production is economically feasible; fiberglass pipe production is cheaper than steel pipe production. Fiberglass has a low volume weight, which simplifies pipe transportation and installation. In order to assess the efficiency of using high-strength composite materials at oil production sites, we conducted research on their physical and mechanical properties and modelled the fiberglass pipe destruction process.

  5. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin... in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality. FUNDING: The study received financial support from the Sarah Krabbe Foundation, the General Practitioners’ Education and Development Foundation...

  6. Modeling spatial processes with unknown extremal dependence class

    KAUST Repository

    Huser, Raphaë l G.; Wadsworth, Jennifer L.

    2017-01-01

    Many environmental processes exhibit weakening spatial dependence as events become more extreme. Well-known limiting models, such as max-stable or generalized Pareto processes, cannot capture this, which can lead to a preference for models

  7. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

    The Visual Model Query Language (VMQL) has been invented with the objectives (1) to make it easier for modelers to query models effectively, and (2) to be universally applicable to all modeling languages. In previous work, we have applied VMQL to UML, and validated the first of these two claims. ...

  8. Transforming Process Models to Problem Frames

    NARCIS (Netherlands)

    Fassbender, Stephan; Aysolmaz, Banu; Weske, M.; Rinderle-Ma, S.

    2015-01-01

    An increase of process awareness within organizations and advances in IT systems led to a development of process-aware information systems (PAIS) in many organizations. UPROM is developed as a unified BPM methodology to conduct business process and user requirements analysis for PAIS in an

  9. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  10. A linear time layout algorithm for business process models

    NARCIS (Netherlands)

    Gschwind, T.; Pinggera, J.; Zugal, S.; Reijers, H.A.; Weber, B.

    2014-01-01

    The layout of a business process model influences how easily it can be understood. Existing layout features in process modeling tools often rely on graph representations, but do not take the specific properties of business process models into account. In this paper, we propose an algorithm that is

  11. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  12. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

    One of the reasons for end user avoidance or rejection of health information systems is poor alignment of the system with healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using a business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  13. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

    1.1 Objective of the Study Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...

  14. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong; Jun, Mikyoung; Genton, Marc G.

    2017-01-01

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture

  15. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM...

  16. Plasma Process Modeling for Integrated Circuits Manufacturing

    OpenAIRE

    M. Meyyappan; T. R. Govindan

    1998-01-01

    A reactor model for plasma-based deposition and etching is presented. Two-dimensional results are discussed in terms of plasma density, ion flux, and ion energy. Approaches to develop rapid CAD-type models are discussed.

  17. Modeling microbial processes in porous media

    Science.gov (United States)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donors and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.

  18. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels...) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  19. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  20. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest to introduce an additional stream of data (i.e., eye tracking) to improve the ...

  1. Calculation of normalised organ and effective doses to adult reference computational phantoms from contemporary computed tomography scanners

    International Nuclear Information System (INIS)

    Jansen, Jan T.M.; Shrimpton, Paul C.

    2010-01-01

    The general-purpose Monte Carlo radiation transport code MCNPX has been used to simulate photon transport and energy deposition in anthropomorphic phantoms due to the x-ray exposure from the Philips iCT 256 and Siemens Definition CT scanners, together with the previously studied General Electric 9800. The MCNPX code was compiled with the Intel FORTRAN compiler and run on a Linux PC cluster. A patch has been successfully applied to reduce computing times by about 4%. The International Commission on Radiological Protection (ICRP) has recently published the Adult Male (AM) and Adult Female (AF) reference computational voxel phantoms as successors to the Medical Internal Radiation Dose (MIRD) stylised hermaphrodite mathematical phantoms that form the basis for the widely-used ImPACT CT dosimetry tool. Comparisons of normalised organ and effective doses calculated for a range of scanner operating conditions have demonstrated significant differences in results (in excess of 30%) between the voxel and mathematical phantoms as a result of variations in anatomy. These analyses illustrate the significant influence of choice of phantom on normalised organ doses and the need for standardisation to facilitate comparisons of dose. Further such dose simulations are needed in order to update the ImPACT CT Patient Dosimetry spreadsheet for contemporary CT practice. (author)

  2. Normalisation in product life cycle assessment: an LCA of the global and European economic systems in the year 2000.

    Science.gov (United States)

    Sleeswijk, Anneke Wegener; van Oers, Lauran F C M; Guinée, Jeroen B; Struijs, Jaap; Huijbregts, Mark A J

    2008-02-01

    In the methodological context of the interpretation of environmental life cycle assessment (LCA) results, a normalisation study was performed. Fifteen impact categories were accounted for, including climate change, acidification, eutrophication, human toxicity, ecotoxicity, depletion of fossil energy resources, and land use. The year 2000 was chosen as a reference year, and information was gathered on two spatial levels: the global and the European level. Of the 860 environmental interventions collected, 48 turned out to account for at least 75% of the impact scores of all impact categories. All non-toxicity related, emission dependent impacts are fully dominated by the bulk emissions of only 10 substances or substance groups: CO2, CH4, SO2, NOx, NH3, PM10, NMVOC, and (H)CFC emissions to air and emissions of N- and P-compounds to fresh water. For the toxicity-related emissions (pesticides, organics, metal compounds and some specific inorganics), the availability of information was still very limited, leading to large uncertainty in the corresponding normalisation factors. Apart from their usefulness as a reference for LCA studies, the results of this study stress the importance of efficient measures to combat bulk emissions and to promote the registration of potentially toxic emissions on a more comprehensive scale.
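    The normalisation step the study provides factors for amounts to dividing each characterised impact score by the corresponding reference score for the chosen region and year. A minimal sketch, with entirely hypothetical impact scores and reference totals:

    ```python
    # Hypothetical characterised impact scores for a product system,
    # per functional unit (e.g. kg CO2-eq, kg SO2-eq).
    product_impacts = {"climate change": 120.0, "acidification": 0.8}

    # Hypothetical reference totals for the chosen region in the year 2000.
    reference_totals = {"climate change": 4.2e13, "acidification": 2.4e11}

    # Normalisation: express each impact as a fraction of the reference total,
    # making scores across impact categories comparable.
    normalised = {cat: product_impacts[cat] / reference_totals[cat]
                  for cat in product_impacts}
    ```

    Uncertainty in a reference total (as reported for the toxicity-related categories) propagates directly into the normalised score, which is why the paper stresses more comprehensive registration of toxic emissions.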

  3. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

    Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov

  4. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model based applications such as model based control and process

  5. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLV's). This paper is presented in viewgraph form.

  6. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  7. Modelling of injection processes in ladle metallurgy

    NARCIS (Netherlands)

    Visser, H.

    2016-01-01

    Ladle metallurgical processes constitute a portion of the total production chain of steel from iron ore. With these batch processes, the hot metal or steel transfer ladle is being used as a reactor vessel and a reagent is often injected in order to bring the composition of the hot metal or steel to

  8. The Process of Horizontal Differentiation: Two Models.

    Science.gov (United States)

    Daft, Richard L.; Bradshaw, Patricia J.

    1980-01-01

    Explores the process of horizontal differentiation by examining events leading to the establishment of 30 new departments in five universities. Two types of horizontal differentiation processes--administrative and academic--were observed and each was associated with different organizational conditions. (Author/IRT)

  9. Evolutionary Regeneration Model of Thought Process

    OpenAIRE

    Noboru, HOKKYO; Hitachi Energy Research Laboratory

    1982-01-01

    A preliminary attempt is made to understand the thought process and the evolution of the nervous system on the same footing as regeneration processes obeying certain recursive algebraic rules which possibly economize the information content of the increasingly complex structural-functional correlate of the evolving and thinking nervous system.

  10. Animal models for information processing during sleep

    NARCIS (Netherlands)

    Coenen, A.M.L.; Drinkenburg, W.H.I.M.

    2002-01-01

    Information provided by external stimuli does reach the brain during sleep, although the amount of information is reduced during sleep compared to wakefulness. The process controlling this reduction is called `sensory' gating and evidence exists that the underlying neurophysiological processes take

  11. Business process model repositories : framework and survey

    NARCIS (Netherlands)

    Yan, Z.; Dijkman, R.M.; Grefen, P.W.P.J.

    2009-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  12. A framework for business process model repositories

    NARCIS (Netherlands)

    Yan, Z.; Grefen, P.W.P.J.; Muehlen, zur M.; Su, J.

    2010-01-01

    Large organizations often run hundreds or even thousands of business processes. Managing such large collections of business processes is a challenging task. Intelligent software can assist in that task by providing common repository functions such as storage, search and version management. They can

  13. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    Full Text Available This article is a result of research on models of production processes for quality management and their identification. It discusses the classical model and the indicators for evaluating process capability, taking as its starting point the assumption of normally distributed process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model is proposed that allows, in any real case, a precise description of the statistical characteristics of the process. It gives the opportunity for a more detailed description of the process characteristics, in comparison to the model proposed by the ISO 21747:2006 standard, and for determining process capability. This model contains the type of process, its statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. As one of the model elements, an own classification and a resulting set of process types are proposed. The classification follows the recommendations of ISO 21747:2006, introducing models for non-stationary processes. However, the set of process types allows, beyond a more precise description of the process characteristics, its usage to monitor the process.
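    The classical capability indicators mentioned above, computed under the normal-distribution assumption, can be sketched as follows; the specification limits and measurement data are hypothetical, not from the article:

    ```python
    import statistics

    def capability_indices(samples, lsl, usl):
        """Classical capability indices Cp and Cpk, assuming the process
        characteristic is approximately normally distributed. lsl and usl
        are the lower and upper specification limits."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)          # sample standard deviation
        cp = (usl - lsl) / (6.0 * sigma)           # potential capability
        cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)  # accounts for centering
        return cp, cpk

    # Hypothetical measurements against specification limits 9.0 .. 11.0.
    data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7]
    cp, cpk = capability_indices(data, lsl=9.0, usl=11.0)
    ```

    For a perfectly centered process Cp and Cpk coincide; for non-stationary processes of the kinds classified by ISO 21747:2006, this simple normal-based calculation no longer applies directly, which motivates the more general model the article proposes.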

  14. Study of dissolution process and its modelling

    Directory of Open Access Journals (Sweden)

    Juan Carlos Beltran-Prieto

    2017-01-01

    Full Text Available The use of mathematical concepts and language aiming to describe and represent the interactions and dynamics of a system is known as a mathematical model. Mathematical modelling finds a large number of successful applications across science, social science and engineering fields, including biology, chemistry, physics, computer science, artificial intelligence, bioengineering, finance and economics. In this research, we aim to propose a mathematical model that predicts the dissolution of a solid material immersed in a fluid. The developed model can be used to evaluate the rate of mass transfer and the mass transfer coefficient. Further research is expected to be carried out to use the model as a base for developing models useful to the pharmaceutical industry, to gain information about the dissolution of medicaments in the bloodstream; this could play a key role in the formulation of medicaments.
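    A minimal sketch of the kind of model described, relating the dissolution rate to a mass transfer coefficient, is a Noyes–Whitney type rate law; whether the paper uses exactly this form is an assumption, and all parameter values below are illustrative:

    ```python
    def simulate_dissolution(k, A, V, c_s, t_end, dt=0.01):
        """Forward-Euler integration of a Noyes-Whitney type rate law,
        dC/dt = (k * A / V) * (c_s - C), with C(0) = 0.
        k: mass transfer coefficient, A: solid surface area, V: fluid volume,
        c_s: saturation concentration of the solute."""
        c, t = 0.0, 0.0
        while t < t_end:
            c += (k * A / V) * (c_s - c) * dt
            t += dt
        return c

    # Hypothetical parameters; the concentration approaches saturation over time.
    c_final = simulate_dissolution(k=1e-4, A=0.05, V=1e-3, c_s=10.0, t_end=600.0)
    ```

    Fitting measured concentration-time data to this exponential approach to saturation is one standard way to back out the mass transfer coefficient k that the abstract mentions.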

  15. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

    Full Text Available Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the ’goodness’ of a business process model.

  16. Centrifuge modelling of contaminant transport processes

    OpenAIRE

    Culligan, P. J.; Savvidou, C.; Barry, D. A.

    1996-01-01

    Over the past decade, research workers have started to investigate problems of subsurface contaminant transport through physical modelling on a geotechnical centrifuge. A major advantage of this apparatus is its ability to model complex natural systems in a controlled laboratory environment. In this paper, we discuss the principles and scaling laws related to centrifugal modelling of contaminant transport, and present four examples of recent work that has bee...

  17. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    Full Text Available A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.

  18. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  19. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their application in healthcare are introduced, and the approach to derive a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
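    As an illustration of the Markov chain / closed-formula approach, the expected time from diagnosis to treatment has a closed form when each transient stage either repeats (e.g. repeat tests, waiting) or advances to the next stage. The stage structure and probabilities below are hypothetical, not from the paper:

    ```python
    def expected_steps_to_treatment(stay_probs):
        """Expected number of steps to absorption in a linear Markov chain in
        which each transient stage repeats with probability p and otherwise
        advances: each stage contributes a geometric holding time with
        mean 1 / (1 - p)."""
        return sum(1.0 / (1.0 - p) for p in stay_probs)

    # Hypothetical weekly repeat probabilities for the diagnosis, staging,
    # and treatment-selection stages.
    weeks = expected_steps_to_treatment([0.4, 0.3, 0.2])
    ```

    This kind of closed formula gives instant performance estimates, while a DES model of the same pathway can capture resource contention and non-geometric delays at the cost of simulation time.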

  20. Modeling of Heating During Food Processing

    Science.gov (United States)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
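
The driving-force idea in the abstract can be made concrete with Newton's law of cooling, q = h * A * (T_hot - T_cold): heat flow is proportional to the temperature difference between source and receiver. The numbers below are illustrative assumptions, not values from the chapter.

```python
def heat_flow_w(h_w_per_m2k, area_m2, t_hot_c, t_cold_c):
    """Convective heat flow (W) by Newton's law of cooling:
    q = h * A * (T_hot - T_cold)."""
    return h_w_per_m2k * area_m2 * (t_hot_c - t_cold_c)

# A 0.05 m^2 pan surface at 180 C heating food at 20 C, with an
# assumed film coefficient of 25 W/(m^2 K): q = 25 * 0.05 * 160 W.
q = heat_flow_w(25.0, 0.05, 180.0, 20.0)
```

When the temperature difference shrinks to zero, the driving force, and hence the heat flow, vanishes.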

  1. A QCD motivated model for soft processes

    International Nuclear Information System (INIS)

    Kormilitzin, A.; Levin, E.

    2009-01-01

    In this talk we give a brief description of a QCD motivated model for both hard and soft interactions at high energies. In this model the long distance behaviour of the scattering amplitude is determined by the dipole scattering amplitude in the saturation domain.

  2. Health care management modelling: a process perspective

    NARCIS (Netherlands)

    Vissers, J.M.H.

    1998-01-01

    Modelling-based health care management ought to become just as popular as evidence based medicine. Making managerial decisions based on evidence by modelling efforts is certainly a step forward. Examples can be given of many successful applications in different areas of decision making: disease

  3. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    OpenAIRE

    Stanislav Vladimirovich Daletskiy; Stanislav Stanislavovich Daletskiy

    2017-01-01

    The aircraft maintenance is realized by a rapid sequence of maintenance organizational and technical states; its research and analysis are carried out by statistical methods. The maintenance process comprises aircraft technical states connected with the objective patterns of change in the technical qualities of the aircraft as a maintenance object, and organizational states which determine the subjective organization and planning process of aircraft use. The objective maintenance process is ...

  4. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rules and business process modeling languages, comparing sets of business process modeling languages and business rule representation languages according to selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  5. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keyword: Model Differencing, Model Merging, Model Synchronization...

  6. Modeling fixation locations using spatial point processes.

    Science.gov (United States)

    Barthelmé, Simon; Trukenbrod, Hans; Engbert, Ralf; Wichmann, Felix

    2013-10-01

    Whenever eye movements are measured, a central part of the analysis has to do with where subjects fixate and why they fixated where they fixated. To a first approximation, a set of fixations can be viewed as a set of points in space; this implies that fixations are spatial data and that the analysis of fixation locations can be beneficially thought of as a spatial statistics problem. We argue that thinking of fixation locations as arising from point processes is a very fruitful framework for eye-movement data, helping turn qualitative questions into quantitative ones. We provide a tutorial introduction to some of the main ideas of the field of spatial statistics, focusing especially on spatial Poisson processes. We show how point processes help relate image properties to fixation locations. In particular we show how point processes naturally express the idea that image features' predictability for fixations may vary from one image to another. We review other methods of analysis used in the literature, show how they relate to point process theory, and argue that thinking in terms of point processes substantially extends the range of analyses that can be performed and clarifies their interpretation.
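
The homogeneous spatial Poisson process at the heart of this tutorial can be sampled in two steps: draw the number of points from a Poisson distribution with mean intensity times window area, then place that many points uniformly in the window. The sketch below (stdlib only, illustrative intensity) uses the unit square as the observation window.

```python
import math
import random

def poisson_sample(lam, rng):
    """Draw from Poisson(lam) by Knuth's multiplication method."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def homogeneous_ppp(intensity, rng=None):
    """Sample a homogeneous Poisson point process on the unit
    square: N ~ Poisson(intensity * area), points i.i.d. uniform."""
    rng = rng or random.Random(0)
    n = poisson_sample(intensity * 1.0, rng)  # unit-square area = 1
    return [(rng.random(), rng.random()) for _ in range(n)]

pts = homogeneous_ppp(100.0)
```

An inhomogeneous process (e.g. intensity driven by image saliency) can be obtained from this sampler by thinning: keep each point with probability proportional to the local intensity.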

  7. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  8. Deconstructing crop processes and models via identities

    DEFF Research Database (Denmark)

    Porter, John Roy; Christensen, Svend

    2013-01-01

    This paper is part review and part opinion piece; it has three parts of increasing novelty and speculation in approach. The first presents an overview of how some of the major crop simulation models approach the issue of simulating the responses of crops to changing climatic and weather variables......, mainly atmospheric CO2 concentration and increased and/or varying temperatures. It illustrates an important principle in models of a single cause having alternative effects and vice versa. The second part suggests some features, mostly missing in current crop models, that need to be included...

  9. NONLINEAR MODEL PREDICTIVE CONTROL OF CHEMICAL PROCESSES

    Directory of Open Access Journals (Sweden)

    SILVA R. G.

    1999-01-01

    Full Text Available A new algorithm for model predictive control is presented. The algorithm utilizes a simultaneous solution and optimization strategy to solve the model's differential equations. The equations are discretized by equidistant collocation and, along with the algebraic model equations, are included as constraints in a nonlinear programming (NLP) problem. This algorithm is compared with the algorithm that uses orthogonal collocation on finite elements. The equidistant collocation algorithm results in simpler equations, providing a decrease in computation time for the control moves. Simulation results are presented and show a satisfactory performance of this algorithm.
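
The discretization step described in the abstract, turning a model ODE into algebraic constraints on an equidistant grid, can be sketched on dx/dt = -k*x using the implicit midpoint rule, a one-point collocation scheme. This is an illustration of the idea only, not the paper's algorithm; in the full MPC formulation each such residual equation would enter the NLP as an equality constraint rather than be solved sequentially.

```python
import math

def integrate(k, x0, t_end, n):
    """March dx/dt = -k*x over an equidistant grid of n intervals.
    The one-point collocation (implicit midpoint) constraint
    (x_next - x)/h = -k * (x + x_next)/2 is solved for x_next."""
    h = t_end / n
    x = x0
    for _ in range(n):
        x = x * (1.0 - 0.5 * h * k) / (1.0 + 0.5 * h * k)
    return x

approx = integrate(k=1.0, x0=1.0, t_end=1.0, n=100)
exact = math.exp(-1.0)  # analytic solution at t = 1
```

The scheme is second-order accurate, so even a modest equidistant grid reproduces the analytic decay closely.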

  10. The Structured Process Modeling Theory (SPMT) : a cognitive view on why and how modelers benefit from structuring the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2015-01-01

    After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures

  11. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts.

  12. Dynamic process model of a plutonium oxalate precipitator. Final report

    International Nuclear Information System (INIS)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts

  13. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    Full Text Available The article reveals a scientifically substantiated process of teaching the reading of English-language periodicals in all its components, which are consistently developed and form the interconnection of structural elements in the process of teaching reading. This process is presented as a few interconnected and interdetermined models: 1) models of the process of acquiring standard and expressive lexical knowledge; 2) models of the process of forming the skills to use such vocabulary; 3) models of the development of the skills to read texts of different linguistic levels.

  14. Aspect-Oriented Business Process Modeling with AO4BPMN

    Science.gov (United States)

    Charfi, Anis; Müller, Heiko; Mezini, Mira

    Many crosscutting concerns in business processes need to be addressed already at the business process modeling level such as compliance, auditing, billing, and separation of duties. However, existing business process modeling languages including OMG's Business Process Modeling Notation (BPMN) lack appropriate means for expressing such concerns in a modular way. In this paper, we motivate the need for aspect-oriented concepts in business process modeling languages and propose an aspect-oriented extension to BPMN called AO4BPMN. We also present a graphical editor supporting that extension.

  15. The Role(s) of Process Models in Design Practice

    DEFF Research Database (Denmark)

    Iversen, Søren; Jensen, Mads Kunø Nyegaard; Vistisen, Peter

    2018-01-01

    This paper investigates how design process models are implemented and used in design-driven organisations. The archetypical theoretical framing of process models describes their primary role as guiding the design process, and assigns roles and deliverables throughout the process. We hypothesise...... that the process models also take more communicative roles in practice, both in terms of creating an internal design rationale, as well as demystifying the black box of design thinking to external stakeholders. We investigate this hypothesis through an interview study of four major Danish design......-driven organisations, and analyse the different roles their archetypical process models take in their organisations. The main contribution is the identification of three, often overlapping roles, which design process models showed to assume in design-driven organisations: process guidance, adding transparency...

  16. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    Full Text Available When driving any major change within an organization, strategy and execution are intrinsic to a project’s success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don’t fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to solve BPM initiatives, and the unique capabilities that software systems provide to help ensure both a project’s success and the success of the organization as a whole.

  17. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models explore noise propagation mechanisms associated with noise attenuation and transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volume Computed Tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics, as well as prioritizing noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  18. Modelling of additive manufacturing processes: a review and classification

    Science.gov (United States)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  19. Sensitivity study of reduced models of the activated sludge process ...

    African Journals Online (AJOL)

    2009-08-07

    Sensitivity study of reduced models of the activated sludge process, for the purposes of parameter estimation and process optimisation: Benchmark process with ASM1 and UCT reduced biological models. S du Plessis and R Tzoneva*. Department of Electrical Engineering, Cape Peninsula University of ...

  20. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  1. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not adequate, satisfy a useful role in product-process design. In this case, use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framework...

  2. Guided interaction exploration in artifact-centric process models

    NARCIS (Netherlands)

    van Eck, M.L.; Sidorova, N.; van der Aalst, W.M.P.

    2017-01-01

    Artifact-centric process models aim to describe complex processes as a collection of interacting artifacts. Recent development in process mining allow for the discovery of such models. However, the focus is often on the representation of the individual artifacts rather than their interactions. Based

  3. The Formalization of the Business Process Modeling Goals

    OpenAIRE

    Bušinska, Ligita; Kirikova, Mārīte

    2016-01-01

    In business process modeling the de facto standard BPMN has emerged. However, the applications of this notation have many subsets of elements and various extensions. Also, BPMN still coincides with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of modelers is a central notion in the choice of modeling languages and notations, in most researches that propose guidelines, techniques, and me...

  4. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    Full Text Available This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer-aided modelling environment based on phenomena.

  5. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process becomes one of the crucial activities in supply chain management. In order to select the best supplier(s), it is necessary not only to continuously track and benchmark the performance of suppliers but also to make a trade-off between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytic network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.

  6. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  7. From BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Feig, E.; Kumar, A.

    2006-01-01

    The Business Process Modelling Notation (BPMN) is a graph-oriented language in which control and action nodes can be connected almost arbitrarily. It is supported by various modelling tools but so far no systems can directly execute BPMN models. The Business Process Execution Language for Web

  8. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  9. Correctness-preserving configuration of business process models

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Dumas, M.; Gottschalk, F.; Hofstede, ter A.H.M.; La Rosa, M.; Mendling, J.; Fiadeiro, J.; Inverardi, P.

    2008-01-01

    Reference process models capture recurrent business operations in a given domain such as procurement or logistics. These models are intended to be configured to fit the requirements of specific organizations or projects, leading to individualized process models that are subsequently used for domain

  10. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  11. Business Process Modeling Languages Supporting Collaborative Networks

    NARCIS (Netherlands)

    Soleimani Malekan, H.; Afsarmanesh, H.; Hammoudi, S.; Maciaszek, L.A.; Cordeiro, J.; Dietz, J.L.G.

    2013-01-01

    Formalizing the definition of Business Processes (BPs) performed within each enterprise is fundamental for effective deployment of their competencies and capabilities within Collaborative Networks (CN). In our approach, every enterprise in the CN is represented by its set of BPs, so that other

  12. Anode baking process optimization through computer modelling

    Energy Technology Data Exchange (ETDEWEB)

    Wilburn, D.; Lancaster, D.; Crowell, B. [Noranda Aluminum, New Madrid, MO (United States); Ouellet, R.; Jiao, Q. [Noranda Technology Centre, Pointe Claire, PQ (Canada)

    1998-12-31

    Carbon anodes used in aluminum electrolysis are produced in vertical or horizontal type anode baking furnaces. The carbon blocks are formed from petroleum coke aggregate mixed with a coal tar pitch binder. Before the carbon block can be used in a reduction cell it must be heated to pyrolysis. The baking process represents a large portion of the aluminum production cost, and also has a significant effect on anode quality. To ensure that the baking of the anode is complete, it must be heated to about 1100 degrees C. To improve the understanding of the anode baking process and to improve its efficiency, a menu-driven heat, mass and fluid flow simulation tool, called NABSIM (Noranda Anode Baking SIMulation), was developed and calibrated in 1993 and 1994. It has been used since then to evaluate and screen firing practices, and to determine which firing procedure will produce the optimum heat-up rate, final temperature, and soak time, without allowing unburned tar to escape. NABSIM is used as a furnace simulation tool on a daily basis by Noranda plant process engineers and much effort is expended in improving its utility by creating new versions, and the addition of new modules. In the immediate future, efforts will be directed towards optimizing the anode baking process to improve temperature uniformity from pit to pit. 3 refs., 4 figs.

  13. PRODUCT TRIAL PROCESSING (PTP): A MODEL APPROACH ...

    African Journals Online (AJOL)

    Admin

    This study is a theoretical approach to the consumer's processing of product trial, and equally explored ... consumer's first usage experience with a company's brand or product that is most important in determining ... product, what it is really marketing is the expected ..... confidence, thus there is a positive relationship between ...

  14. Understanding Modeling Requirements of Unstructured Business Processes

    NARCIS (Netherlands)

    Allah Bukhsh, Zaharah; van Sinderen, Marten J.; Sikkel, Nicolaas; Quartel, Dick

    2017-01-01

    Management of structured business processes is of interest to both academia and industry, where academia focuses on the development of methods and techniques while industry focuses on the development of supporting tools. With the shift from routine to knowledge work, the relevance of management of

  15. Modeling Kanban Processes in Systems Engineering

    Science.gov (United States)

    2012-06-01

    • KSS: Incremental SE, with some design up-front and design continuing throughout development, results in lower change traffic and defect incidence.

  16. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  17. Process model simulations of the divergence effect

    Science.gov (United States)

    Anchukaitis, K. J.; Evans, M. N.; D'Arrigo, R. D.; Smerdon, J. E.; Hughes, M. K.; Kaplan, A.; Vaganov, E. A.

    2007-12-01

    We explore the extent to which the Vaganov-Shashkin (VS) model of conifer tree-ring formation can explain evidence for changing relationships between climate and tree growth over recent decades. The VS model is driven by daily environmental forcing (temperature, soil moisture, and solar radiation), and simulates tree-ring growth cell-by-cell as a function of the most limiting environmental control. This simplified representation of tree physiology allows us to examine using a selection of case studies whether instances of divergence may be explained in terms of changes in limiting environmental dependencies or transient climate change. Identification of model-data differences permits further exploration of the effects of tree-ring standardization, atmospheric composition, and additional non-climatic factors.
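
The "most limiting environmental control" principle of the VS model can be sketched as taking the minimum of partial growth responses to temperature, soil moisture and radiation. The response shapes and thresholds below are hypothetical placeholders, not the calibrated VS parameterization.

```python
def ramp(x, lo, opt):
    """Piecewise-linear response: 0 at/below lo, 1 at/above opt."""
    if x <= lo:
        return 0.0
    if x >= opt:
        return 1.0
    return (x - lo) / (opt - lo)

def daily_growth(temp_c, soil_moisture, radiation):
    """Daily growth rate limited by the single most limiting
    environmental control (illustrative thresholds)."""
    g_t = ramp(temp_c, 5.0, 18.0)        # temperature response
    g_w = ramp(soil_moisture, 0.1, 0.3)  # soil-moisture response
    g_e = ramp(radiation, 2.0, 10.0)     # radiation response
    return min(g_t, g_w, g_e)            # most limiting control wins

# A cold day limits growth even under ample moisture and light.
rate = daily_growth(temp_c=8.0, soil_moisture=0.4, radiation=12.0)
```

Divergence-type behaviour appears naturally in such a scheme: if warming pushes sites from temperature limitation into moisture limitation, the tree-ring response to temperature weakens even though the model itself is unchanged.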

  18. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates...... in connection to other modelling tools within the modelling framework are forming a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend...... models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. The developed modelling framework involves three main parts: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which...

  19. Modeling of processing technologies in food industry

    Science.gov (United States)

    Korotkov, V. G.; Sagitov, R. F.; Popov, V. P.; Bachirov, V. D.; Akhmadieva, Z. R.; TSirkaeva, E. A.

    2018-03-01

    Currently, society is facing an urgent need to solve the problems of nutrition (products with increased nutritional value) and to develop energy-saving technologies for food products. Mathematical modeling of the heat and mass transfer of polymer materials in an extruder has proved rather successful in recent years. A mathematical description of movement and heat exchange during extrusion of a gluten-protein-starch-containing material, similar in structure to pasta dough, was taken as the framework for the mathematical model presented in this paper.

  20. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that process instances use during simulation or execution. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. They are based on BPMN, where business process instances use resources in a concurrent manner.

  1. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    Science.gov (United States)

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently neither an international standard nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  2. Statistical properties of several models of fractional random point processes

    Science.gov (United States)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
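
    The reduced-variance criterion mentioned here can be illustrated numerically. The sketch below (a toy construction of mine, not the paper's fractional models) compares the Fano factor (variance-to-mean ratio of counts) of an ordinary Poisson process with that of a conditional (doubly stochastic) Poisson process whose intensity is itself random; all rates are hypothetical.

```python
import random

random.seed(42)

def poisson_counts(rate, t, n_trials):
    """Count events of a homogeneous Poisson process of the given rate on [0, t]."""
    counts = []
    for _ in range(n_trials):
        n, s = 0, random.expovariate(rate)
        while s < t:
            n += 1
            s += random.expovariate(rate)
        counts.append(n)
    return counts

def fano(counts):
    """Variance-to-mean ratio of the counting distribution (1 for Poisson)."""
    m = sum(counts) / len(counts)
    v = sum((c - m) ** 2 for c in counts) / (len(counts) - 1)
    return v / m

# Homogeneous Poisson: Fano factor close to 1.
poisson = poisson_counts(rate=5.0, t=10.0, n_trials=2000)

# Conditional (doubly stochastic) Poisson: the rate itself is random per
# trial, which inflates the count variance (Fano factor well above 1).
cox = []
for _ in range(2000):
    rate = random.choice([2.0, 8.0])          # random intensity, mean 5.0
    cox.extend(poisson_counts(rate, 10.0, 1))

print(f"Poisson Fano: {fano(poisson):.2f}")   # close to 1
print(f"Cox Fano:     {fano(cox):.2f}")       # well above 1
```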

  3. Process modeling of a HLA research lab

    Science.gov (United States)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps of HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps while avoiding mistakes.

  4. Business process modeling in the cloud

    OpenAIRE

    Yarahmadi, Aziz

    2014-01-01

    In this study, I have defined the first steps of creating a methodological framework to implement a cloud business application. The term 'cloud' here refers to applying the processing power of a network of computing tools to business solutions in order to move on from legacy systems. I have introduced the hardware and software requirements of cloud computing in business and the procedure by which the business needs will be found, analyzed and recorded as a decision making system. But first we...

  5. Multiscale Modeling and Simulation of Material Processing

    Science.gov (United States)

    2006-07-01

    challenge is how to develop methods that permit simulation of a process with a fewer number of atoms (e.g. 10^6 instead of 10^14 atoms in a cube)...In dynamic simulations, the mass and momentum...involving rapidly varying stress, such as the stress field near a...significant, as indicated by numerical examples that will follow. We next summarize the coupling scheme with the aid of flowchart Fig. 8. The material

  6. Stochastic Models in the Identification Process

    Czech Academy of Sciences Publication Activity Database

    Slovák, Dalibor; Zvárová, Jana

    2011-01-01

    Roč. 7, č. 1 (2011), s. 44-50 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: identification process * weight-of-evidence formula * coancestry coefficient * beta-binomial sampling formula * DNA mixtures Subject RIV: IN - Informatics, Computer Science http://www.ejbi.eu/images/2011-1/Slovak_en.pdf

  7. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    and on the way they are applied. The paper draws upon established principles of cybernetic systems in an attempt to explain the role played by process modelling in operating and improving PD processes. We use this framework to identify eight key factors which influence the utility of modelling in the context...... of use. Further, we indicate how these factors can be interpreted to identify opportunities to improve modelling utility. The paper is organised as follows. Section 2 provides background and motivation for the paper by discussing an example of PD process modelling practice. After highlighting from......, and the process being modelled. Section 5 draws upon established principles of cybernetic systems theory to incorporate this view in an explanation of the role of modelling in PD process operation and improvement. This framework is used to define modelling utility and to progressively identify influences upon it...

  8. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process...... can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot....

  9. Study on a Process-oriented Knowledge Management Model

    OpenAIRE

    Zhang, Lingling; Li, Jun; Zheng, Xiuyu; Li, Xingsen; Shi, Yong

    2007-01-01

    Now knowledge has become the most important resource of enterprises. Process-oriented knowledge management (POKM) is a new and valuable research field, and may be the most practical method for dealing with difficulties in knowledge management. The paper analyzes the background, hypotheses and purposes of POKM, defines process knowledge, and gives a process-oriented knowledge management model. The model integrates knowledge, process, human, and technology. It can improve the decision support capabili...

  10. Aberrant brain responses to emotionally valent words is normalised after cognitive behavioural therapy in female depressed adolescents.

    Science.gov (United States)

    Chuang, Jie-Yu; J Whitaker, Kirstie; Murray, Graham K; Elliott, Rebecca; Hagan, Cindy C; Graham, Julia Me; Ooi, Cinly; Tait, Roger; Holt, Rosemary J; van Nieuwenhuizen, Adrienne O; Reynolds, Shirley; Wilkinson, Paul O; Bullmore, Edward T; Lennox, Belinda R; Sahakian, Barbara J; Goodyer, Ian; Suckling, John

    2016-01-01

    Depression in adolescence is debilitating with high recurrence in adulthood, yet its pathophysiological mechanism remains enigmatic. To examine the interaction between emotion, cognition and treatment, functional brain responses to sad and happy distractors in an affective go/no-go task were explored before and after Cognitive Behavioural Therapy (CBT) in depressed female adolescents, and healthy participants. Eighty-two depressed and 24 healthy female adolescents, aged 12-17 years, performed a functional magnetic resonance imaging (fMRI) affective go/no-go task at baseline. Participants were instructed to withhold their responses upon seeing happy or sad words. Among these participants, 13 patients had CBT over approximately 30 weeks. These participants and 20 matched controls then repeated the task. At baseline, increased activation in response to happy relative to neutral distractors was observed in the orbitofrontal cortex in depressed patients, which was normalised after CBT. No significant group differences were found behaviourally or in brain activation in response to sad distractors. Improvements in symptoms (mean: 9.31, 95% CI: 5.35-13.27) were related at trend level to activation changes in the orbitofrontal cortex. Only a limited number of post-CBT patients could be recruited for the follow-up. To our knowledge, this is the first fMRI study addressing the effect of CBT in adolescent depression. Although a bias toward negative information is widely accepted as a hallmark of depression, aberrant brain hyperactivity to positive distractors was found and normalised after CBT. Research, assessment and treatment focused on positive stimuli could be a future consideration. Moreover, a pathophysiological mechanism distinct from adult depression may be suggested and awaits further exploration. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  11. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described; a suitable way of describing them could be BPMN notation. Right after processes are described via BPMN, they should be controlled to ensure their expected quality. A system (which could be automated) based on the mathematical expression of qualitative characteristics of process models (i.e. measures of quality of process models) can support such process controls. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe the mentioned system, based on measures of the quality of process models, and to answer the associated scientific questions.

  12. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    .... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and in conjunction with growth and characterization experiments developed much better...

  13. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquake based on spatial-temporal point process. The magnitude distribution is expressed as truncated exponential and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. The earthquakes can be regarded as point patterns that have a temporal clustering feature so we use self-exciting point process for modeling the conditional intensity function. The choice of main shocks is conducted via window algorithm by Gardner and Knopoff and the model can be fitted by maximum likelihood method for three random variables. (paper)
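
    The temporal-clustering idea behind the conditional intensity can be sketched in code. The following is a minimal simulation of a self-exciting (Hawkes-type) point process via Ogata's thinning algorithm, with an exponentially decaying excitation kernel; the parameters are illustrative, not fitted to any earthquake catalogue.

```python
import math
import random

random.seed(7)

def simulate_hawkes(mu, alpha, beta, t_max):
    """Simulate a self-exciting (Hawkes) point process by Ogata's thinning.

    Conditional intensity: lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i))
    over past events t_i < t. Requires alpha < beta for stationarity.
    """
    events, t = [], 0.0
    while t < t_max:
        # The intensity only decays between events, so its current value is
        # an upper bound until the next event -- use it as the envelope.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += random.expovariate(lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if random.random() <= lam_t / lam_bar:   # accept with prob lambda/envelope
            events.append(t)
    return events

# Illustrative (not fitted) parameters: background rate 0.5, each event
# raises the intensity by 0.8, decaying at rate 1.2.
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=200.0)
print(f"{len(events)} events; expected about mu/(1-alpha/beta)*t_max = "
      f"{0.5 / (1 - 0.8 / 1.2) * 200:.0f}")
```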

  14. Towards simplification of hydrologic modeling: Identification of dominant processes

    Science.gov (United States)

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
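
    The FAST procedure used in the study is involved, but the underlying idea (variance-based parameter sensitivity, S_i = Var(E[Y|X_i]) / Var(Y)) can be illustrated with a crude double-loop Monte Carlo estimator on a toy model. The model and all numbers below are hypothetical, not PRMS.

```python
import random

random.seed(1)

def toy_runoff(snow, soil, et):
    """Toy 'hydrologic' response: runoff dominated by the snow parameter."""
    return 5.0 * snow + 1.0 * soil ** 2 + 0.2 * et

def first_order_sensitivity(model, n_params, n_outer=200, n_inner=200):
    """Crude variance-based sensitivity: S_i = Var(E[Y|X_i]) / Var(Y),
    with all parameters drawn uniformly on [0, 1]."""
    base = [model(*[random.random() for _ in range(n_params)])
            for _ in range(n_outer * n_inner)]
    mean_y = sum(base) / len(base)
    var_y = sum((y - mean_y) ** 2 for y in base) / len(base)

    indices = []
    for i in range(n_params):
        cond_means = []
        for _ in range(n_outer):
            xi = random.random()               # fix parameter i at one value
            ys = []
            for _ in range(n_inner):
                x = [random.random() for _ in range(n_params)]
                x[i] = xi
                ys.append(model(*x))
            cond_means.append(sum(ys) / len(ys))
        m = sum(cond_means) / len(cond_means)
        v = sum((c - m) ** 2 for c in cond_means) / len(cond_means)
        indices.append(v / var_y)
    return indices

s = first_order_sensitivity(toy_runoff, 3)
print("sensitivities (snow, soil, et):", [f"{x:.2f}" for x in s])
```

    The dominant process falls out directly: the snow parameter carries almost all of the output variance, so a modeler could disregard the other two with little loss, which is the simplification argument the paper makes.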

  15. Modified Invasion Percolation Models for Multiphase Processes

    Energy Technology Data Exchange (ETDEWEB)

    Karpyn, Zuleima [Pennsylvania State Univ., State College, PA (United States)

    2015-01-31

    This project extends current understanding and modeling capabilities of pore-scale multiphase flow physics in porous media. High-resolution X-ray computed tomography imaging experiments are used to investigate structural and surface properties of the medium that influence immiscible displacement. Using experimental and computational tools, we investigate the impact of wetting characteristics, as well as radial and axial loading conditions, on the development of percolation pathways, residual phase trapping and fluid-fluid interfacial areas.
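
    The development of percolation pathways can be illustrated with the classic (unmodified) invasion percolation algorithm; this is a generic textbook sketch, not the project's modified models, and the grid size and thresholds are arbitrary.

```python
import heapq
import random

random.seed(3)

def invasion_percolation(n):
    """Invasion percolation on an n x n grid of random pore entry thresholds.

    The invading fluid enters along the left edge and always breaks through
    the lowest-threshold pore on its front, until it reaches the right edge.
    """
    threshold = [[random.random() for _ in range(n)] for _ in range(n)]
    invaded = set()
    front = [(threshold[r][0], r, 0) for r in range(n)]   # left-edge inlets
    heapq.heapify(front)
    while front:
        _, r, c = heapq.heappop(front)
        if (r, c) in invaded:
            continue
        invaded.add((r, c))
        if c == n - 1:                 # breakthrough at the right edge
            return invaded
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and (rr, cc) not in invaded:
                heapq.heappush(front, (threshold[rr][cc], rr, cc))
    return invaded

cluster = invasion_percolation(40)
# The invaded cluster is sparse and ramified: far fewer sites than the grid.
print(f"invaded {len(cluster)} of {40 * 40} sites")
```

    Wettability and loading effects of the kind studied in the project would enter through the threshold field; here it is simply uniform random.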

  16. Animated-simulation modeling facilitates clinical-process costing.

    Science.gov (United States)

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
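
    The point about cost ranges versus single-point estimates is easy to demonstrate with a small Monte Carlo sketch; all durations, probabilities and rates below are made up for illustration, not taken from the article.

```python
import random

random.seed(11)

def simulate_visit_cost():
    """One pass through a toy clinical process; step durations in minutes
    are drawn from ranges a clinician might supply, not point estimates."""
    triage = random.triangular(5, 20, 10)      # low, high, mode
    exam = random.triangular(10, 45, 20)
    labs = random.triangular(15, 90, 30) if random.random() < 0.4 else 0.0
    staff_rate = 1.5                           # cost per staff-minute (assumed)
    return (triage + exam + labs) * staff_rate

costs = sorted(simulate_visit_cost() for _ in range(10000))
low, mid, high = (costs[int(q * len(costs))] for q in (0.05, 0.5, 0.95))
print(f"cost per visit: median {mid:.0f}, 90% range {low:.0f}-{high:.0f}")
```

    A spreadsheet would typically report only a single expected cost; the simulated distribution exposes how wide the plausible range actually is, which is the decision-making advantage the abstract describes.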

  17. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  18. Modeling Resource Hotspots: Critical Linkages and Processes

    Science.gov (United States)

    Daher, B.; Mohtar, R.; Pistikopoulos, E.; McCarl, B. A.; Yang, Y.

    2017-12-01

    Growing demands for interconnected resources emerge in the form of hotspots of varying characteristics. The business-as-usual allocation model cannot address the current, let alone anticipated, complex and highly interconnected resource challenges we face. A new paradigm for resource allocation must be adopted: one that identifies cross-sectoral synergies and moves away from silos toward recognition and integration of the nexus. Doing so will result in new opportunities for business growth, economic development, and improved social well-being. Solutions and interventions must be multi-faceted; opportunities should be identified with holistic trade-offs in mind. No single solution fits all: different hotspots will require distinct interventions. Hotspots have varying resource constraints, stakeholders, goals and targets. The San Antonio region represents a complex resource hotspot with promising potential: its rapidly growing population, the Eagle Ford shale play, and the major agricultural activity there make it a hotspot with many competing demands. Stakeholders need tools that allow them to knowledgeably address impending resource challenges. This study will identify contemporary WEF nexus questions and critical system interlinkages that will inform the modeling of the tightly interconnected resource systems and stresses, using the San Antonio region as a base; it will conceptualize a WEF nexus modeling framework and develop assessment criteria to inform integrative planning and decision making.

  19. Model for analyzing decontamination process systems

    International Nuclear Information System (INIS)

    Boykin, R.F.; Rolland, C.W.

    1979-06-01

    Selection of equipment and the design of a new facility with a view to minimizing cost and maximizing capacity is a problem managers face many times in the operations of a manufacturing organization. This paper deals with the actual analysis of equipment and facility design for a decontamination operation. Discussions of the equipment selection method and the development of the facility design criteria are presented, along with insight into the problems encountered in the equipment analysis for a new decontamination facility. The presentation also includes a review of the transition from the old facility into the new facility and the process used to minimize the cost and conveyance problems of the transition.

  20. Modelling of chemical reactions in metallurgical processes

    OpenAIRE

    Kinaci, M. Efe; Lichtenegger, Thomas; Schneiderbauer, Simon

    2017-01-01

    Iron-ore reduction has attracted much interest in the last three decades since it can be considered as a core process in steel industry. The iron-ore is reduced to iron with the use of blast furnace and fluidized bed technologies. To investigate the harsh conditions inside fluidized bed reactors, computational tools can be utilized. One such tool is the CFD-DEM method, in which the gas phase reactions and governing equations are calculated in the Eulerian (CFD) side, whereas the particle reac...

  1. A Queuing Model of the Airport Departure Process

    OpenAIRE

    Balakrishnan, Hamsa; Simaiakis, Ioannis

    2013-01-01

    This paper presents an analytical model of the aircraft departure process at an airport. The modeling procedure includes the estimation of unimpeded taxi-out time distributions and the development of a queuing model of the departure runway system based on the transient analysis of D/E/1 queuing systems. The parameters of the runway service process are estimated using operational data. Using the aircraft pushback schedule as input, the model predicts the expected runway schedule and takeoff ti...
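
    A minimal sketch of the D/E/1 idea (deterministic arrivals, Erlang service, one server) is below. The parameters are illustrative stand-ins, not the operational estimates from the paper.

```python
import random

random.seed(5)

def simulate_runway(n_aircraft, headway, k, service_rate):
    """D/E_k/1 queue sketch: aircraft reach the runway queue at fixed
    intervals (deterministic arrivals) and takeoff service times are
    Erlang-k (sum of k exponentials) with mean 1/service_rate.

    Returns the takeoff (departure) times.
    """
    takeoffs, free_at = [], 0.0
    for i in range(n_aircraft):
        arrival = i * headway
        start = max(arrival, free_at)          # wait if the runway is busy
        service = sum(random.expovariate(k * service_rate) for _ in range(k))
        free_at = start + service
        takeoffs.append(free_at)
    return takeoffs

# Illustrative numbers: one aircraft joins the queue every 60 s; the runway
# serves on average one takeoff per 50 s (utilization about 0.83).
takeoffs = simulate_runway(n_aircraft=500, headway=60.0, k=3,
                           service_rate=1 / 50.0)
waits = [t - i * 60.0 for i, t in enumerate(takeoffs)]
print(f"mean time in system: {sum(waits) / len(waits):.0f} s")
```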

  2. Modelling energy spot prices by Lévy semistationary processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Benth, Fred Espen; Veraart, Almut

    This paper introduces a new modelling framework for energy spot prices based on Lévy semistationary processes. Lévy semistationary processes are special cases of the general class of ambit processes. We provide a detailed analysis of the probabilistic properties of such models and we show how...... they are able to capture many of the stylised facts observed in energy markets. Furthermore, we derive forward prices based on our spot price model. As it turns out, many of the classical spot models can be embedded into our novel modelling framework....
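
    For orientation, the general form of a Lévy semistationary process, as it appears in the broader LSS literature (the paper's exact specification may differ in its kernel and volatility choices), is:

```latex
% Deseasonalised spot price Y modelled as a Levy semistationary process:
%   L      -- a (two-sided) Levy process,
%   g, q   -- deterministic damping kernels vanishing on the negative axis,
%   sigma  -- a stochastic volatility process, a -- a drift process.
Y(t) \;=\; \mu
      \;+\; \int_{-\infty}^{t} g(t-s)\,\sigma(s^{-})\,\mathrm{d}L(s)
      \;+\; \int_{-\infty}^{t} q(t-s)\,a(s)\,\mathrm{d}s
```

    Choosing an exponential kernel g(u) = e^{-\lambda u} recovers a volatility-modulated Ornstein-Uhlenbeck process, which is one way the classical spot models can be embedded in this framework, as the abstract notes.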

  3. Modeling of Multicomponent Mixture Separation Processes Using Hollow fiber Membrane

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sin-Ah; Kim, Jin-Kuk; Lee, Young Moo; Yeo, Yeong-Koo [Hanyang University, Seoul (Korea, Republic of)

    2015-02-15

    So far, most research on the modeling of membrane separation processes has focused on binary feed mixtures. But in actual separation operations, binary feeds are hard to find and most separation processes involve multicomponent feed mixtures. In this work, models for membrane separation processes treating multicomponent feed mixtures are developed. Various model types are investigated and the validity of the proposed models is analysed based on experimental data obtained using hollow-fiber membranes. The proposed separation models show quick convergence and exhibit good tracking performance.

  4. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available specializing in modeling information-driven business processes and their consequences for the communication between information-processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information-processing infrastructure which hinder the smooth implementation of the business processes.

  5. Modeling and simulation for process and safeguards system design

    International Nuclear Information System (INIS)

    Gutmacher, R.G.; Kern, E.A.; Duncan, D.R.; Benecke, M.W.

    1983-01-01

    A computer modeling and simulation approach that meets the needs of both the process and safeguards system designers is described. The results have been useful to Westinghouse Hanford Company process designers in optimizing the process scenario and operating scheme of the Secure Automated Fabrication line. The combined process/measurements model will serve as the basis for the design of the safeguards system. Integration of the process design and the safeguards system design should result in a smoothly operating process that is easier to safeguard.

  6. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  7. Diff-based model synchronization in an industrial MDD process

    DEFF Research Database (Denmark)

    Kindler, Ekkart; Könemann, Patrick; Unland, Ludger

    of different models is maintained manually in many cases today. This paper presents an approach for automated model differencing, so that the differences between two model versions (called delta) can be extracted and stored. It can then be re-used independently of the models it was created from...... to interactively merge different model versions, and for synchronizing other types of models. The main concern was to apply our concepts to an industrial process, so usability and performance were important issues....

  8. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restriction embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the difference and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use a predefined difference template to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, decomposing a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting the difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on the divide-and-conquer strategy, where the difference is described by an edit script whose cost we keep close to the minimum. The extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
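
    The idea of describing the difference between two process trees as an edit script can be sketched with a much-simplified, positional tree diff; the paper's PST/TPST matching and cost minimization are considerably more sophisticated than this.

```python
def tree_diff(a, b, path="root"):
    """Minimal edit script between two process structure trees.

    Trees are (label, [children]) tuples; the script lists the relabel,
    insert and delete operations needed to turn tree `a` into tree `b`.
    Children are matched positionally -- a simplification of the PST/TPST
    matching described in the paper.
    """
    ops = []
    la, ca = a
    lb, cb = b
    if la != lb:
        ops.append(("relabel", path, la, lb))
    for i in range(max(len(ca), len(cb))):
        child_path = f"{path}/{i}"
        if i >= len(ca):
            ops.append(("insert", child_path, cb[i][0]))
        elif i >= len(cb):
            ops.append(("delete", child_path, ca[i][0]))
        else:
            ops.extend(tree_diff(ca[i], cb[i], child_path))
    return ops

# Two variants of the same business process, e.g. with an approval step
# added in the location-restricted mobile variant (hypothetical example).
v1 = ("seq", [("receive order", []), ("ship", [])])
v2 = ("seq", [("receive order", []), ("approve", []), ("ship", [])])
for op in tree_diff(v1, v2):
    print(op)
```

    Positional matching makes the script valid but not minimal (here it relabels "ship" and re-inserts it rather than recognizing the shift); minimizing the edit cost is exactly the harder problem the paper addresses.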

  9. Parameters modelling of amaranth grain processing technology

    Science.gov (United States)

    Derkanosova, N. M.; Shelamova, S. A.; Ponomareva, I. N.; Shurshikova, G. V.; Vasilenko, O. A.

    2018-03-01

    The article presents a technique that allows calculating the structure of a multicomponent bakery mixture for the production of enriched products, taking into account the instability of nutrient content and ensuring the fulfilment of technological requirements while, at the same time, considering consumer preferences. The results of modelling and analysis of optimal solutions are given by the example of calculating the structure of a three-component mixture of wheat and rye flour with an enriching component, whole-hulled amaranth flour, as applied to the technology of bread made from a mixture of rye and wheat flour on a liquid leaven.
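
    A mixture-structure calculation of this kind can be sketched as a small constrained search over blend compositions. All protein contents and constraint values below are hypothetical placeholders, not the article's data.

```python
def best_blend(protein, min_protein, max_amaranth):
    """Pick wheat/rye/amaranth shares (summing to 1, in 1% steps) that
    maximize the enriching amaranth share subject to a minimum protein
    content and a technological cap on amaranth. Returns (protein, shares)."""
    best = None
    for w in range(0, 101):
        for r in range(0, 101 - w):
            a = 100 - w - r
            if a > max_amaranth * 100:         # technological cap on enricher
                continue
            shares = (w / 100, r / 100, a / 100)
            p = sum(s * pc for s, pc in zip(shares, protein))
            if p >= min_protein and (best is None or shares[2] > best[1][2]):
                best = (p, shares)
    return best

# Hypothetical protein contents (%) of wheat, rye and amaranth flour.
protein_content = (11.0, 9.0, 15.0)
p, (wheat, rye, amaranth) = best_blend(protein_content, min_protein=11.5,
                                       max_amaranth=0.15)
print(f"wheat {wheat:.0%}, rye {rye:.0%}, amaranth {amaranth:.0%} "
      f"-> protein {p:.1f}%")
```

    The article's method additionally handles nutrient-content instability; a simple extension of this sketch would repeat the search over sampled nutrient values and keep a blend feasible across all of them.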

  10. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user / developer to identify the processing field at the top of the sequence and to send to the computing module only the data related to the requested result. The remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to get high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  11. Functional Dual Adaptive Control with Recursive Gaussian Process Model

    International Nuclear Information System (INIS)

    Prüher, Jakub; Král, Ladislav

    2015-01-01

    The paper deals with dual adaptive control problem, where the functional uncertainties in the system description are modelled by a non-parametric Gaussian process regression model. Current approaches to adaptive control based on Gaussian process models are severely limited in their practical applicability, because the model is re-adjusted using all the currently available data, which keeps growing with every time step. We propose the use of recursive Gaussian process regression algorithm for significant reduction in computational requirements, thus bringing the Gaussian process-based adaptive controllers closer to their practical applicability. In this work, we design a bi-criterial dual controller based on recursive Gaussian process model for discrete-time stochastic dynamic systems given in an affine-in-control form. Using Monte Carlo simulations, we show that the proposed controller achieves comparable performance with the full Gaussian process-based controller in terms of control quality while keeping the computational demands bounded. (paper)
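
    The cost-bounding motivation is easy to demonstrate. The sketch below is not the paper's recursive update; as a simpler stand-in that also bounds per-step cost, it keeps only a sliding window of recent observations for a standard GP regressor (numpy-based; kernel, window size and data are all illustrative).

```python
import numpy as np

def rbf(x1, x2, ell=1.0, sf=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)

class WindowedGP:
    """GP regression whose training set is capped at `budget` points.

    This bounds per-step cost at O(budget^3) instead of growing with every
    observation -- a cruder device than the recursive update in the paper,
    but it shows why bounding the data set matters for adaptive control.
    """

    def __init__(self, budget=50, noise=0.1):
        self.budget, self.noise = budget, noise
        self.x = np.empty(0)
        self.y = np.empty(0)

    def update(self, x_new, y_new):
        # Append the new observation and drop the oldest beyond the budget.
        self.x = np.append(self.x, x_new)[-self.budget:]
        self.y = np.append(self.y, y_new)[-self.budget:]

    def predict(self, x_star):
        xs = np.atleast_1d(x_star)
        K = rbf(self.x, self.x) + self.noise ** 2 * np.eye(len(self.x))
        k = rbf(self.x, xs)
        mean = k.T @ np.linalg.solve(K, self.y)
        var = rbf(xs, xs) - k.T @ np.linalg.solve(K, k)
        # Clip tiny negative values from numerical round-off.
        return float(mean[0]), max(float(np.diag(var)[0]), 0.0)

rng = np.random.default_rng(0)
gp = WindowedGP(budget=40)
for x in rng.uniform(-3, 3, 200):            # stream of noisy observations
    gp.update(x, np.sin(x) + 0.1 * rng.standard_normal())

mean, var = gp.predict(1.0)
print(f"f(1.0) ~ {mean:.2f} +/- {2 * var ** 0.5:.2f} (true {np.sin(1.0):.2f})")
```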

  12. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure-limitation procedures, first the system's hardware and the media present (water, steam, coolant fluid) are formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state-transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model. At the semi-microscopic level, a cellular data compressor, a feature-detection device and the Intelligent Physical Elements' process dynamics are at work. At the macroscopic level, the Walking Process Elements, a process-evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating and a countermeasure-coordinating device are used. In order to automatically gain process insights, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration.

  13. Towards a structured process modeling method: Building the prescriptive modeling theory

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2017-01-01

    In their effort to control and manage processes, organizations often create process models. The quality of such models is not always optimal, because it is challenging for a modeler to translate her mental image of the process into a formal process description. In order to support this complex human

  14. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    Full Text Available The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.

  15. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  16. The accounting standard-setting process by the IASB: the case of income (Le processus de normalisation comptable par l'IASB : le cas du résultat)

    OpenAIRE

    Le Manh-Béna, Anne

    2009-01-01

    This research aims to contribute to the understanding of the IASB's standard-setting process through a single topic, the definition of income and its presentation in financial statements. Two research questions are addressed: what position do the participants in the due process express concerning the IASB's project on the definition and presentation of income? How can the IASB's pugnacity in imposing a new definition of income be explained? The theoretical framework of this re...

  17. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding of, systematically appraise, and identify areas for improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  18. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  19. Declarative versus imperative process modeling languages : the issue of maintainability

    NARCIS (Netherlands)

    Fahland, D.; Mendling, J.; Reijers, H.A.; Weber, B.; Weidlich, M.; Zugal, S.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    The rise of interest in declarative languages for process modeling both justifies and demands empirical investigations into their presumed advantages over more traditional, imperative alternatives. Our concern in this paper is with the ease of maintaining business process models, for example due to

  20. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  1. Mathematical modelling of anaerobic digestion processes: applications and future needs

    DEFF Research Database (Denmark)

    Batstone, Damien J.; Puyol, Daniel; Flores Alsina, Xavier

    2015-01-01

    Anaerobic process modelling is a mature and well-established field, largely guided by a mechanistic model structure that is defined by our understanding of underlying processes. This led to publication of the IWA ADM1, and strong supporting, analytical, and extension research in the 15 years sinc...

  2. On the suitability of BPMN for business process modelling

    NARCIS (Netherlands)

    Wohed, P.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Russell, N.C.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    In this paper we examine the suitability of the Business Process Modelling Notation (BPMN) for business process modelling, using the Workflow Patterns as an evaluation framework. The Workflow Patterns are a collection of patterns developed for assessing control-flow, data and resource capabilities

  3. Semantics and analysis of business process models in BPMN

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Ouyang, C.

    2008-01-01

    The Business Process Modelling Notation (BPMN) is a standard for capturing business processes in the early phases of systems development. The mix of constructs found in BPMN makes it possible to create models with semantic errors. Such errors are especially serious, because errors in the early

  4. Business process model abstraction : a definition, catalog, and survey

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.

    2012-01-01

    The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been

  5. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

    Full Text Available The main purpose of the paper is the presentation of a new concept for modeling the human decision-making process via an analogy with Automatic Control Theory. From the author's point of view, this concept allows the theory of decision-making to be developed and improved in terms of the study and classification of the specificity of human intellectual processes under different conditions. It was proved that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of the so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or bias and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling the decision results for various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the assumptions that: basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of a proportional-integral-derivative controller; and basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence to certain decision-making models, from the point of view of decision-making process specificity and decision-maker behavior over a certain period of professional activity, was obtained.
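
    The analogy between rational decision-making and a proportional-integral-derivative controller can be made concrete with a small transient-response simulation; the first-order "plant" and the gain values used here are illustrative assumptions, not taken from the paper.

```python
def pid_step_response(kp, ki, kd, n_steps=400, dt=0.05):
    """Step response of a PID controller driving a first-order 'plant'
    (dy/dt = -y + u) toward a setpoint of 1.0, integrated by forward Euler."""
    y, integral, prev_err = 0.0, 0.0, None
    trajectory = []
    for _ in range(n_steps):
        err = 1.0 - y
        integral += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv  # PID control law
        prev_err = err
        y += (-y + u) * dt  # plant update
        trajectory.append(y)
    return trajectory
```

    With moderate gains the simulated "decision" settles on the setpoint; sluggish or oscillatory gain choices reproduce the kinds of transients the paper associates with different decision-making styles.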

  6. Embedding a State Space Model Into a Markov Decision Process

    DEFF Research Database (Denmark)

    Nielsen, Lars Relund; Jørgensen, Erik; Højsgaard, Søren

    2011-01-01

    In agriculture Markov decision processes (MDPs) with finite state and action space are often used to model sequential decision making over time. For instance, states in the process represent possible levels of traits of the animal and transition probabilities are based on biological models...
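
    A finite-state, finite-action MDP of the kind described can be solved by standard value iteration; the two-state "trait level" example with keep/replace actions in the test below is a hypothetical sketch, not the authors' herd model.

```python
def value_iteration(states, actions, P, R, gamma=0.9, tol=1e-9):
    """Solve a finite MDP by value iteration.

    P[(s, a)] maps next states to transition probabilities; R[(s, a)] is
    the expected one-step reward. Returns values and a greedy policy.
    """
    V = {s: 0.0 for s in states}
    while True:
        V_new = {
            s: max(
                R[(s, a)] + gamma * sum(p * V[s2] for s2, p in P[(s, a)].items())
                for a in actions if (s, a) in P
            )
            for s in states
        }
        delta = max(abs(V_new[s] - V[s]) for s in states)
        V = V_new
        if delta < tol:
            break
    policy = {
        s: max(
            [a for a in actions if (s, a) in P],
            key=lambda a: R[(s, a)]
            + gamma * sum(p * V[s2] for s2, p in P[(s, a)].items()),
        )
        for s in states
    }
    return V, policy
```
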

  7. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines.

  8. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  9. A semantic approach for business process model abstraction

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Mouratidis, H.; Rolland, C.

    2011-01-01

    Models of business processes can easily become large and difficult to understand. Abstraction has proven to be an effective means to present a readable, high-level view of a business process model, by showing aggregated activities and leaving out irrelevant details. Yet, it is an open question how

  10. Simple models of the hydrofracture process

    KAUST Repository

    Marder, M.

    2015-12-29

    Hydrofracturing to recover natural gas and oil relies on the creation of a fracture network with pressurized water. We analyze the creation of the network in two ways. First, we assemble a collection of analytical estimates for pressure-driven crack motion in simple geometries, including crack speed as a function of length, energy dissipated by fluid viscosity and used to break rock, and the conditions under which a second crack will initiate while a first is running. We develop a pseudo-three-dimensional numerical model that couples fluid motion with solid mechanics and can generate branching crack structures not specified in advance. One of our main conclusions is that the typical spacing between fractures must be on the order of a meter, and this conclusion arises in two separate ways. First, it arises from analysis of gas production rates, given the diffusion constants for gas in the rock. Second, it arises from the number of fractures that should be generated given the scale of the affected region and the amounts of water pumped into the rock.

  11. Simple models of the hydrofracture process

    KAUST Repository

    Marder, M.; Chen, Chih-Hung; Patzek, Tadeusz

    2015-01-01

    Hydrofracturing to recover natural gas and oil relies on the creation of a fracture network with pressurized water. We analyze the creation of the network in two ways. First, we assemble a collection of analytical estimates for pressure-driven crack motion in simple geometries, including crack speed as a function of length, energy dissipated by fluid viscosity and used to break rock, and the conditions under which a second crack will initiate while a first is running. We develop a pseudo-three-dimensional numerical model that couples fluid motion with solid mechanics and can generate branching crack structures not specified in advance. One of our main conclusions is that the typical spacing between fractures must be on the order of a meter, and this conclusion arises in two separate ways. First, it arises from analysis of gas production rates, given the diffusion constants for gas in the rock. Second, it arises from the number of fractures that should be generated given the scale of the affected region and the amounts of water pumped into the rock.
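
    The metre-scale spacing argument from gas diffusion can be reproduced with a one-line estimate; the diffusion constant and production time below are assumed illustrative values, not figures from the paper.

```python
import math

def drainage_spacing(diffusion_m2_per_s, production_years):
    """Gas within roughly one diffusion length of a fracture face is
    produced over the well's life, so fractures about 2*sqrt(D*t) apart
    are needed to drain the whole stimulated volume."""
    seconds = production_years * 365.25 * 24.0 * 3600.0
    return 2.0 * math.sqrt(diffusion_m2_per_s * seconds)
```

    With an assumed diffusion constant of 1e-9 m²/s and a ten-year production window this gives roughly one metre, consistent with the abstract's conclusion about typical fracture spacing.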

  12. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

    Full Text Available E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the 'action and design' class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered and the model's generic applicability is discussed.

  13. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Full Text Available Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating the business process repository and organisational knowledge as the foundation for knowledge management system development.

  14. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  15. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were encountered during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  16. What Controls the Vertical Distribution of Aerosol? Relationships Between Process Sensitivity in HadGEM3-UKCA and Inter-Model Variation from AeroCom Phase II

    Science.gov (United States)

    Kipling, Zak; Stier, Philip; Johnson, Colin E.; Mann, Graham W.; Bellouin, Nicolas; Bauer, Susanne E.; Bergman, Tommi; Chin, Mian; Diehl, Thomas; Ghan, Steven J.; hide

    2016-01-01

    same processes as the component mass profiles, plus the size distribution of primary emissions. We also show that the processes that affect the AOD-normalised radiative forcing in the model are predominantly those that affect the vertical mass distribution, in particular convective transport, in-cloud scavenging, aqueous oxidation, ageing and the vertical extent of biomass-burning emissions.

  17. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  18. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    Science.gov (United States)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, and (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.

  19. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate related quality assurance and improvement methods and tools are identified. Main manufacturing process principles are investigated in order to scrutinize a general model of the manufacturing process and to define the manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on research conducted, and on the results of applying these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  20. Rational parametrisation of normalised Stiefel manifolds, and explicit non-'t Hooft solutions of the Atiyah-Drinfeld-Hitchin-Manin instanton matrix equations for Sp(n)

    International Nuclear Information System (INIS)

    McCarthy, P.J.

    1981-01-01

    It is proved that normalised Stiefel manifolds admit a rational parametrisation which generalises Cayley's parametrisation of the unitary groups. Applying (the quaternionic case of) this parametrisation to the Atiyah-Drinfeld-Hitchin-Manin (ADHM) instanton matrix equations, large families of new explicit rational solutions emerge. In particular, new explicit non-'t Hooft solutions are presented. (orig.)

  1. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
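
    Bayesian updating of an expert-elicited parameter can be sketched with the conjugate Beta-Binomial case; the prior and the carcass counts below are hypothetical, and the actual chicken processing line model involves more elaborate parameters than a single transfer probability.

```python
def update_beta(prior_a, prior_b, positives, negatives):
    """Conjugate Bayesian update: a Beta(a, b) prior combined with
    Binomial counts yields a Beta(a + positives, b + negatives) posterior."""
    return prior_a + positives, prior_b + negatives

def beta_mean(a, b):
    """Posterior mean of a Beta(a, b) distribution."""
    return a / (a + b)
```

    For example, an expert prior of Beta(2, 8) (mean 0.2) updated with 30 contaminated carcasses out of 100 gives a Beta(32, 78) posterior with mean about 0.29, pulling the expert judgment toward the microbiological data.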

  2. Motivation within the Information Processing Model of Foreign Language Learning

    Science.gov (United States)

    Manolopoulou-Sergi, Eleni

    2004-01-01

    The present article highlights the importance of the motivational construct for the foreign language learning (FLL) process. More specifically, in the present article it is argued that motivation is likely to play a significant role at all three stages of the FLL process as they are discussed within the information processing model of FLL, namely,…

  3. The two-process model : Origin and perspective

    NARCIS (Netherlands)

    Daan, S.; Hut, R. A.; Beersma, D.

    In the two-process model as developed in the early 1980's sleep is controlled by a process-S, representing the rise and fall of sleep demand resulting from prior sleep-wake history, interacting with a process-C representing circadian variation in sleep propensity. S and C together optimize sleep

  4. Advanced social features in a recommendation system for process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.; Abramowicz, W.

    2009-01-01

    Social software is known to stimulate the exchange and sharing of information among peers. This paper describes how an existing system that supports process builders in completing a business process can be enhanced with various social features. In that way, it is easier for process modeler to become

  5. Modelling the pultrusion process of offshore wind turbine blades

    NARCIS (Netherlands)

    Baran, Ismet

    This thesis is devoted to the numerical modelling of the pultrusion process for industrial products such as wind turbine blades and structural profiles. The main focus is on the thermo-chemical and mechanical analyses of the process, in which the process-induced stresses and shape distortions together

  6. Catastrophe Insurance Modeled by Shot-Noise Processes

    Directory of Open Access Journals (Sweden)

    Thorsten Schmidt

    2014-02-01

    Full Text Available Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with shot-noise processes. Besides this, we obtain some highly tractable examples, constituting a useful modeling tool for dynamic claims processes. The results can in particular be used for pricing Catastrophe Bonds (CAT bonds), a traded risk-linked security. Additionally, current results regarding the estimation of shot-noise processes are reviewed.
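
    The jump-then-decline structure described above is easy to simulate; the arrival rate, exponential jump sizes, and exponential decay kernel below are illustrative modeling assumptions, not the time-inhomogeneous drivers of the paper.

```python
import math
import random

def shot_noise_path(rate, decay, mean_jump, horizon, n_grid=200, seed=42):
    """Simulate X(t) = sum over arrivals t_i <= t of Y_i * exp(-decay*(t - t_i)).

    Jumps (shots) arrive as a Poisson process with intensity `rate`; each
    jump size Y_i is exponential with mean `mean_jump` and then decays
    exponentially (the noise).
    """
    rng = random.Random(seed)
    arrivals = []
    t = rng.expovariate(rate)
    while t <= horizon:
        arrivals.append((t, rng.expovariate(1.0 / mean_jump)))
        t += rng.expovariate(rate)
    grid = [horizon * k / n_grid for k in range(n_grid + 1)]
    path = [
        sum(y * math.exp(-decay * (u - s)) for s, y in arrivals if s <= u)
        for u in grid
    ]
    return grid, path
```

    The long-run mean of such a path is rate * mean_jump / decay, which gives a quick sanity check on the simulation.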

  7. A finite difference model of the iron ore sinter process

    OpenAIRE

    Muller, J.; de Vries, T.L.; Dippenaar, B.A.; Vreugdenburg, J.C.

    2015-01-01

    Iron ore fines are agglomerated to produce sinter, which is an important feed material for blast furnaces worldwide. A model of the iron ore sintering process has been developed with the objective of being representative of the sinter pot test, the standard laboratory process in which the behaviour of specific sinter feed mixtures is evaluated. The model aims to predict sinter quality, including chemical quality and physical strength, as well as key sinter process performance parameters such ...

  8. Dual processing model of medical decision-making

    OpenAIRE

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-01-01

    Background: Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administe...

  9. Assessing healthcare process maturity: challenges of using a business process maturity model

    NARCIS (Netherlands)

    Tarhan, A.; Turetken, O.; van den Biggelaar, F.J.H.M.

    2015-01-01

    DOI: 10.4108/icst.pervasivehealth.2015.259105 The quality of healthcare services is influenced by the maturity of the healthcare processes used to develop them. A maturity model is an instrument to assess and continually improve organizational processes. In the last decade, a number of maturity models

  10. Generalised additive modelling approach to the fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chun-Bo; Li, Yun; Pan, Feng; Shi, Zhong-Ping

    2011-03-01

    In this work, generalised additive models (GAMs) were used for the first time to model the fermentation of glutamate (Glu). It was found that three fermentation parameters, fermentation time (T), dissolved oxygen (DO) and oxygen uptake rate (OUR), could capture 97% of the variance in the production of Glu during the fermentation process through a GAM model calibrated using online data from 15 fermentation experiments. This model was applied to investigate the individual and combined effects of T, DO and OUR on the production of Glu. The conditions to optimize the fermentation process were proposed based on a simulation study from this model. Results suggested that the production of Glu can reach a high level by controlling the concentration levels of DO and OUR to the proposed optimization conditions during the fermentation process. The GAM approach therefore provides an alternative way to model and optimize the fermentation process of Glu. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.
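
    The additive structure behind a GAM, y ≈ alpha + f1(x1) + ... + fp(xp), can be illustrated with a minimal backfitting loop using a crude binned-mean smoother; this is a didactic sketch on toy data, not the spline-based fit or the fermentation data of the paper.

```python
def fit_additive(xs, ys, n_bins=10, n_iter=20):
    """Backfitting for an additive model y ~ alpha + f1(x1) + ... + fp(xp).

    Each smooth term is estimated as a binned mean of the partial
    residuals, then centred; returns the in-sample fitted values.
    """
    n, p = len(ys), len(xs[0])
    alpha = sum(ys) / n
    f = [[0.0] * n for _ in range(p)]
    for _ in range(n_iter):
        for j in range(p):
            # Partial residuals with all other components removed.
            resid = [
                ys[i] - alpha - sum(f[k][i] for k in range(p) if k != j)
                for i in range(n)
            ]
            col = [row[j] for row in xs]
            lo, hi = min(col), max(col)
            width = (hi - lo) / n_bins or 1.0
            which = [min(int((v - lo) / width), n_bins - 1) for v in col]
            sums, counts = [0.0] * n_bins, [0] * n_bins
            for b, r in zip(which, resid):
                sums[b] += r
                counts[b] += 1
            means = [sums[b] / counts[b] if counts[b] else 0.0 for b in range(n_bins)]
            fj = [means[b] for b in which]
            centre = sum(fj) / n
            f[j] = [v - centre for v in fj]
    return [alpha + sum(f[k][i] for k in range(p)) for i in range(n)]
```
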

  11. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...

  12. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
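
    The iterate-and-absorb-risk idea of a spiral process can be caricatured in a few lines of Monte Carlo; the effort figures, risk probability, and rework factor below are hypothetical placeholders, not PATT model inputs.

```python
import random

def simulate_spiral(n_iterations, base_effort, risk_p, rework_factor, seed=7):
    """Toy discrete-event view of a spiral process: every iteration runs
    risk assessment plus development; with probability risk_p a requirement
    change materializes and inflates that iteration's effort by rework_factor."""
    rng = random.Random(seed)
    total_effort = 0.0
    for _ in range(n_iterations):
        effort = base_effort
        if rng.random() < risk_p:  # a risk materialized this iteration
            effort *= 1.0 + rework_factor
        total_effort += effort
    return total_effort
```

    Comparing totals across risk probabilities mimics, in miniature, the quantitative cases the model is meant to support when choosing between development processes.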

  13. A model for ageing-home-care service process improvement

    OpenAIRE

    Yu, Shu-Yan; Shie, An-Jin

    2017-01-01

    The purpose of this study was to develop an integrated model to improve service processes in ageing-home-care. According to the literature, existing service processes have potential service failures that affect service quality and efficacy. However, most previous studies have focused only on conceptual model development using New Service Development (NSD) and failed to provide a systematic model to analyse potential service failures and facilitate managers developing solutions to improve the se...

  14. Modeling the curing process of thermosetting resin matrix composites

    Science.gov (United States)

    Loos, A. C.

    1986-01-01

    A model is presented for simulating the curing process of a thermosetting resin matrix composite. The model relates the cure temperature, the cure pressure, and the properties of the prepreg to the thermal, chemical, and rheological processes occurring in the composite during cure. The results calculated with the computer code developed on the basis of the model were compared with the experimental data obtained from autoclave-cured composite laminates. Good agreement between the two sets of results was obtained.
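
    The chemical part of such a cure model is typically a kinetic rate equation for the degree of cure. A minimal sketch, assuming a standard nth-order kinetics form (not the specific kinetics of this record) with invented parameter values:

```python
import math

# Illustrative nth-order cure kinetics at constant temperature:
#   d(alpha)/dt = A * exp(-Ea / (R*T)) * (1 - alpha)**n
# A, Ea, and n below are made-up demonstration values, not measured data.
A, EA, R_GAS, N = 1.0e5, 6.0e4, 8.314, 1.5   # 1/s, J/mol, J/(mol*K), -

def cure(T_kelvin, dt=1.0, t_end=3600.0):
    """Forward-Euler integration of degree of cure alpha over t_end seconds."""
    k = A * math.exp(-EA / (R_GAS * T_kelvin))   # Arrhenius rate constant
    alpha, t = 0.0, 0.0
    while t < t_end:
        alpha += dt * k * (1.0 - alpha) ** N
        alpha = min(alpha, 1.0)                  # degree of cure cannot exceed 1
        t += dt
    return alpha

print(round(cure(450.0), 3))   # degree of cure after 1 h at 450 K
```

    A full process model couples such kinetics to heat conduction (the reaction is exothermic) and to resin viscosity, which is what links cure temperature and pressure to the rheological behaviour described above.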

  15. CFD Modeling and Simulation in Materials Processing 2018

    OpenAIRE

    Nastac, Laurentiu; Pericleous, Koulis; Sabau, Adrian S.; Zhang, Lifeng; Thomas, Brian G.

    2018-01-01

    This book contains the proceedings of the symposium “CFD Modeling and Simulation in Materials Processing” held at the TMS 2018 Annual Meeting & Exhibition in Phoenix, Arizona, USA, March 11–15, 2018. This symposium dealt with computational fluid dynamics (CFD) modeling and simulation of engineering processes. The papers published in this book were requested from researchers and engineers involved in the modeling of multiscale and multiphase phenomena in material processing systems. The sympos...

  16. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Process-oriented information systems (POIS) represent a promising realization form of computerized business information systems, aiming to increase the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations so as to ensure customer satisfaction and high quality of products and services. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.
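
    The Petri net formalism underlying such process models is small: places hold tokens, and a transition fires by consuming a token from each input place and producing one in each output place. A minimal interpreter, with a hypothetical two-step workflow as the example (this is plain Petri net semantics, not the XML-net toolset of the record):

```python
# Minimal Petri net interpreter: places, transitions, and a token marking.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1               # consume input tokens
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1   # produce outputs

# Hypothetical two-step document workflow: draft -> review -> release.
net = PetriNet({"drafted": 1, "reviewed": 0, "released": 0})
net.add_transition("review", ["drafted"], ["reviewed"])
net.add_transition("release", ["reviewed"], ["released"])
net.fire("review")
net.fire("release")
print(net.marking)   # {'drafted': 0, 'reviewed': 0, 'released': 1}
```

    XML nets extend this idea by letting tokens carry structured XML documents rather than being indistinguishable marks, which is what makes the formalism suitable for modeling business processes over real data.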

  17. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639
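
    The mediational chain described above (program condition influences the mediator's growth rate, which in turn influences the outcome's growth rate) can be sketched as a noise-free toy simulation. The path coefficients, wave count, and function names are invented for illustration; a real analysis would estimate these paths in a structural equation model.

```python
# Toy parallel-process mediation: treatment shifts the mediator's growth
# rate (path a); that growth rate shifts the outcome's growth rate (path b);
# the mediated effect is the product a*b. All coefficients are hypothetical.

A_PATH, B_PATH = 0.5, 0.8

def simulate_person(treated, waves=4):
    """Return noise-free mediator and outcome trajectories across waves."""
    med_slope = A_PATH * treated       # treatment -> mediator growth rate
    out_slope = B_PATH * med_slope     # mediator growth -> outcome growth
    mediator = [med_slope * t for t in range(waves)]
    outcome = [out_slope * t for t in range(waves)]
    return mediator, outcome

mediator, outcome = simulate_person(treated=1)
print(mediator)                    # [0.0, 0.5, 1.0, 1.5]
print(round(A_PATH * B_PATH, 2))   # mediated (indirect) effect on growth
```
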

  18. Resin infusion of large composite structures modeling and manufacturing process

    Energy Technology Data Exchange (ETDEWEB)

    Loos, A.C. [Michigan State Univ., Dept. of Mechanical Engineering, East Lansing, MI (United States)

    2006-07-01

    The resin infusion processes resin transfer molding (RTM), resin film infusion (RFI) and vacuum assisted resin transfer molding (VARTM) are cost-effective techniques for the fabrication of complex-shaped composite structures. The dry fibrous preform is placed in the mold, consolidated, resin impregnated and cured in a single-step process. The fibrous preforms are often constructed near net shape using highly automated textile processes such as knitting, weaving and braiding. In this paper, the infusion processes RTM, RFI and VARTM are discussed along with the advantages of each technique compared with traditional composite fabrication methods such as prepreg tape lay-up and autoclave cure. The large number of processing variables and the complex material behavior during infiltration and cure make experimental optimization of the infusion processes costly and inefficient. Numerical models have been developed which can be used to simulate the resin infusion processes. The model formulation and solution procedures for the VARTM process are presented. A VARTM process simulation of a carbon fiber preform was presented to demonstrate the type of information that can be generated by the model and to compare the model predictions with experimental measurements. Overall, the predicted flow front positions, resin pressures and preform thicknesses agree well with the measured values. The results of the simulation show the potential cost and performance benefits that can be realized by using a simulation model as part of the development process. (au)
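
    Flow-front prediction in such infusion models rests on Darcy's law for flow through the porous preform. In the simplest 1-D constant-pressure case this gives the classic result x(t) = sqrt(2*K*dP*t / (mu*phi)). The sketch below uses this textbook relation with invented material values, not the preform properties or full 3-D formulation of the record:

```python
import math

# 1-D Darcy-law flow-front estimate for resin infusion under a constant
# driving pressure. Material values are illustrative, not measured data.
K = 2.0e-10    # preform permeability, m^2
DP = 1.0e5     # vacuum-driven pressure difference, Pa
MU = 0.2       # resin viscosity, Pa*s
PHI = 0.5      # preform porosity

def flow_front(t_seconds):
    """Infiltrated length (m) after t seconds: x = sqrt(2*K*dP*t/(mu*phi))."""
    return math.sqrt(2.0 * K * DP * t_seconds / (MU * PHI))

print(round(flow_front(600.0), 3))   # metres infiltrated after 10 minutes
```

    The square-root dependence on time (flow slows as the wetted length grows) is why mold-filling time, and hence vent and inlet placement, is so sensitive to permeability and viscosity, and why simulation is attractive compared with trial-and-error optimization.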

  19. A Measurable Model of the Creative Process in the Context of a Learning Process

    Science.gov (United States)

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  20. Toward Cognitively Constrained Models of Language Processing: A Review

    Directory of Open Access Journals (Sweden)

    Margreet Vogelzang

    2017-09-01

    Language processing is not an isolated capacity, but is embedded in other aspects of our cognition. However, it is still largely unexplored to what extent, and how, language processing interacts with general cognitive resources. This question can be investigated with cognitively constrained computational models, which simulate the cognitive processes involved in language processing. The theoretical claims implemented in such models interact with general architectural constraints such as memory limitations, generating new predictions that can be tested in experiments, which in turn yield new data that can give rise to new theoretical insights. This theory-model-experiment cycle is a promising method for investigating aspects of language processing that are difficult to examine with more traditional experimental techniques. This review specifically examines the language processing models of Lewis and Vasishth (2005), Reitter et al. (2011), and Van Rij et al. (2010), all implemented in the cognitive architecture Adaptive Control of Thought—Rational (Anderson et al., 2004). These models are all limited by the assumptions about cognitive capacities provided by the cognitive architecture, but use different linguistic approaches. Because of this, their comparison provides insight into the extent to which assumptions about general cognitive resources influence concretely implemented models of linguistic competence. For example, the sheer speed and accuracy of human language processing is a current challenge in the field of cognitive modeling, as it does not seem to adhere to the same memory and processing capacities that have been found in other cognitive processes. Architecture-based cognitive models of language processing may be able to make explicit which language-specific resources are needed to acquire and process natural language. The review sheds light on cognitively constrained models of language processing from two angles: we ...
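
    A concrete example of the architectural constraints such models inherit is ACT-R's base-level activation equation, B_i = ln(sum over j of t_j^(-d)), where t_j are the times since past retrievals of a memory chunk and d is the decay parameter (conventionally 0.5). The retrieval histories below are invented for illustration:

```python
import math

# ACT-R base-level activation: B_i = ln( sum_j t_j**(-d) ).
# Chunks used more recently and more often have higher activation and are
# therefore retrieved faster and more reliably. Lag values are made up.

def base_level_activation(lags_seconds, decay=0.5):
    """Activation of a chunk given times since its past retrievals."""
    return math.log(sum(t ** -decay for t in lags_seconds))

recent = base_level_activation([1.0, 5.0, 10.0])       # used moments ago
distant = base_level_activation([100.0, 500.0, 1000.0])  # used long ago
print(recent > distant)   # more recent use -> higher activation
```

    Because the same decay parameter governs all declarative memory in the architecture, a language model built on ACT-R cannot simply assume faster or more durable memory for linguistic material, which is exactly the kind of cross-domain constraint the review highlights.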