WorldWideScience

Sample records for normalisation process model

  1. Embedding chiropractic in Indigenous Health Care Organisations: applying the normalisation process model.

    Science.gov (United States)

    Polus, Barbara I; Paterson, Charlotte; van Rotterdam, Joan; Vindigni, Dein

    2012-11-26

    Improving the health of Indigenous Australians remains a major challenge. A chiropractic service was established to evaluate this treatment option for musculoskeletal illness in rural Indigenous communities, based on the philosophy of keeping the community involved in all the phases of development, implementation, and evaluation. The development and integration of this service have experienced many difficulties with referrals, funding and building sustainability. Evaluation of the program was a key aspect of its implementation, requiring an appropriate process to identify specific problems and formulate solutions to improve the service. We used the normalisation process model (May 2006) to order the data collected in consultation meetings and to inform our strategy and actions. The normalisation process model provided us with a structure for organising consultation meeting data and helped prioritise tasks. Our data were analysed as they applied to each dimension of the model, noting aspects that the model did not encompass. During this process we reworded the dimensions into more everyday terminology. The final analysis focused on the extent to which the model helped us to prioritise and systematise our tasks and plans. We used the model to consider ways to promote the chiropractic service, to enhance relationships and interactions between clinicians and procedures within the health service, and to avoid disruption of the existing service. We identified ways in which chiropractors can become trusted team members who have acceptable and recognised knowledge and skills. We also developed strategies that should result in chiropractic practitioners finding a place within a complex occupational web, by being seen as similar to well-known occupations such as physiotherapy. Interestingly, one dimension identified by our data, which we have labelled 'emancipatory', was absent from the model. The normalisation process model has resulted in a number of new insights and questions. We

  2. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    to simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection to both inhomogeneous Markov point process models...... and pairwise interaction point processes....

  3. Implementing a provider-initiated testing and counselling (PITC) intervention in Cape town, South Africa: a process evaluation using the normalisation process model.

    Science.gov (United States)

    Leon, Natalie; Lewin, Simon; Mathews, Catherine

    2013-08-26

    Provider-initiated HIV testing and counselling (PITC) increases HIV testing rates in most settings, but its effect on testing rates varies considerably. This paper reports the findings of a process evaluation of a controlled trial of PITC for people with sexually transmitted infections (STI) attending publicly funded clinics in a low-resource setting in South Africa, where the trial results were lower than anticipated compared to the standard Voluntary Counselling and Testing (VCT) approach. This longitudinal study used a variety of qualitative methods, including participant observation of project implementation processes, staff focus groups, patient interviews, and observation of clinical practice. Data were content analysed by identifying the main influences shaping the implementation process. The Normalisation Process Model (NPM) was used as a theoretical framework to analyse implementation processes and explain the trial outcomes. The new PITC intervention became embedded in practice (normalised) during a two-year period (2006 to 2007). Factors that promoted normalisation included strong senior leadership, implementation support, appropriate accountability mechanisms, an intervention design that was responsive to service needs and congruent with professional practice, positive staff and patient perceptions, and a responsive organisational context. Nevertheless, nurses struggled to deploy the intervention efficiently, mainly because of poor sequencing and integration of HIV and STI tasks, a focus on HIV education, tension with a patient-centred communication style, and inadequate training on dealing with the operational challenges. This resulted in longer consultation times, which may account for the low test coverage outcome. Leadership and implementation support, congruent intervention design, and a responsive organisational context strengthened implementation. Poor compatibility with nurse skills on the level of the clinical consultation may have contributed

  4. ENEKuS--A Key Model for Managing the Transformation of the Normalisation of the Basque Language in the Workplace

    Science.gov (United States)

    Marko, Inazio; Pikabea, Inaki

    2013-01-01

    The aim of this study is to develop a reference model for intervention in the language processes applied to the transformation of language normalisation within organisations of a socio-economic nature. It is based on a case study of an experiment carried out over 10 years within a trade union confederation, and has pursued a strategy of a…

  5. Consistent haul road condition monitoring by means of vehicle response normalisation with Gaussian processes

    CSIR Research Space (South Africa)

    Heyns, T

    2012-12-01

    Full Text Available: Engineering Applications of Artificial Intelligence, December 2012, Vol. 25(8). Authors: T. Heyns, J.P. de Villiers, P.S. Heyns.

  6. Model selection for local and regional meteorological normalisation of background concentrations of tropospheric ozone

    Science.gov (United States)

    Libiseller, Claudia; Grimvall, Anders

    Meteorological normalisation of time series of air quality data aims to extract anthropogenic signals by removing natural fluctuations in the collected data. We showed that the currently used procedures to select normalisation models can cause over-fitting to observed data and undesirable smoothing of anthropogenic signals. A simulation study revealed that the risk of such effects is particularly large when: (i) the observed data are serially correlated, (ii) the normalisation model is selected by leave-one-out cross-validation, and (iii) complex models, such as artificial neural networks, are fitted to data. When the size of the test sets used in the cross-validation was increased, and only moderately complex linear models were fitted to data, the over-fitting was less pronounced. An empirical study of the predictive ability of different normalisation models for tropospheric ozone in Finland confirmed the importance of using appropriate model selection strategies. Moderately complex regional models involving contemporaneous meteorological data from a network of stations were found to be superior to single-site models as well as more complex regional models involving both contemporaneous and time-lagged meteorological data from a network of stations.
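    The model-selection pitfall described above can be made concrete with a small sketch. The code below is not from the paper; it uses synthetic, autocorrelated data and an ordinary least-squares normalisation model purely to show the mechanics of scoring a model by leave-one-out cross-validation versus larger contiguous test blocks, the alternative the authors report as less prone to over-fitting.

```python
# Minimal sketch (not from the paper, synthetic data): scoring a simple
# meteorological normalisation model for an autocorrelated ozone series by
# cross-validation.  n_blocks == n gives leave-one-out; fewer blocks give the
# larger contiguous test sets that the study reports as less prone to
# over-fitting when observations are serially correlated.
import numpy as np

rng = np.random.default_rng(0)
n = 500
met = rng.normal(size=(n, 2))            # synthetic meteorology (e.g. temperature, wind)
noise = np.zeros(n)
for t in range(1, n):                    # AR(1) noise to mimic serial correlation
    noise[t] = 0.8 * noise[t - 1] + rng.normal(scale=0.5)
ozone = 1.5 * met[:, 0] - 0.7 * met[:, 1] + noise

def fit_predict(x_train, y_train, x_test):
    """Ordinary least squares with an intercept (a moderately complex linear model)."""
    X = np.column_stack([np.ones(len(x_train)), x_train])
    beta, *_ = np.linalg.lstsq(X, y_train, rcond=None)
    return np.column_stack([np.ones(len(x_test)), x_test]) @ beta

def blocked_cv_error(x, y, n_blocks):
    """Mean squared prediction error over contiguous test blocks."""
    errs = []
    for test in np.array_split(np.arange(len(y)), n_blocks):
        train = np.setdiff1d(np.arange(len(y)), test)
        pred = fit_predict(x[train], y[train], x[test])
        errs.append(np.mean((y[test] - pred) ** 2))
    return float(np.mean(errs))

print("leave-one-out CV error:", round(blocked_cv_error(met, ozone, n), 3))
print("10-block CV error:     ", round(blocked_cv_error(met, ozone, 10), 3))
```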

  7. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  8. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  9. The nursing work of hospital-based clinical practice guideline implementation: an explanatory systematic review using Normalisation Process Theory.

    Science.gov (United States)

    May, Carl; Sibley, Andrew; Hunt, Katherine

    2014-02-01

    To investigate the dynamics of nurses' work in implementing Clinical Practice Guidelines. Hybrid: systematic review techniques used to identify qualitative studies of clinical guideline implementation; theory-led and structured analysis of textual data. CINAHL, CSA Illumina, EMBASE, MEDLINE, PsycINFO, and Sociological Abstracts. Systematic review of qualitative studies of the implementation of Clinical Practice Guidelines, analysed using Directed Content Analysis, and interpreted in the light of Normalisation Process Theory. Seven studies met the inclusion criteria of the review. These revealed that clinical practice guidelines are disposed to normalisation when: (a) They are associated with activities that practitioners can make workable in practice, and practitioners are able to integrate them into their collective workflow. (b) When they are differentiated from existing clinical practice by their proponents, and when claims of differentiation are regarded as legitimate by their potential users. (c) When they are associated with an emergent community of practice, and when members of that community of practice enrol each other into group processes that specify their engagement with it. (d) When they are associated with improvements in the collective knowledge of their users, and when users are able to integrate the application of that knowledge into their individual workflow. And (e) when nurses can minimise disruption to behaviour norms and agreed professional roles, and mobilise structural and cognitive resources in ways that build shared commitments across professional boundaries. This review demonstrates the feasibility and benefits of theory-led review of studies of nursing practice, and proposes a dynamic model of implementation. Normalisation Process Theory supports the analysis of nursing work. It characterises mechanisms by which work is made coherent and meaningful, is formed around sets of relational commitments, is enacted and contextualised, and is

  10. The implementation of medical revalidation: an assessment using normalisation process theory

    Directory of Open Access Journals (Sweden)

    Abigail Tazzyman

    2017-11-01

    Full Text Available Abstract Background Medical revalidation is the process by which all licensed doctors are legally required to demonstrate that they are up to date and fit to practise in order to maintain their licence. Revalidation was introduced in the United Kingdom (UK) in 2012, constituting significant change in the regulation of doctors. The governing body, the General Medical Council (GMC), envisages that revalidation will improve patient care and safety. This potential, however, is in part dependent upon how successfully revalidation is embedded into routine practice. The aim of this study was to use Normalisation Process Theory (NPT) to explore issues contributing to or impeding the implementation of revalidation in practice. Methods We conducted seventy-one interviews with sixty UK policymakers and senior leaders at different points during the development and implementation of revalidation: in 2011 (n = 31), 2013 (n = 26) and 2015 (n = 14). We selected interviewees using purposeful sampling. NPT was used as a framework to enable systematic analysis across the interview sets. Results Initial lack of consensus over revalidation’s purpose, and scepticism about its value, decreased over time as participants recognised the benefits it brought to their practice (coherence category of NPT). Though acceptance increased across time, revalidation was not seen as a legitimate part of their role by all doctors. Key individuals, notably the Responsible Officer (RO), were vital for the successful implementation of revalidation in organisations (cognitive participation category). The ease with which revalidation could be integrated into working practices varied greatly depending on the type of role a doctor held and the organisation they worked for, and the provision of resources was a significant variable in this (collective action category). Formal evaluation of revalidation in organisations was lacking but informal evaluation was taking place. Revalidation had

  11. Normalisation genes for expression analyses in the brown alga model Ectocarpus siliculosus

    Directory of Open Access Journals (Sweden)

    Rousvoal Sylvie

    2008-08-01

    Full Text Available Abstract Background Brown algae are plant multi-cellular organisms occupying most of the world coasts and are essential actors in the constitution of ecological niches at the shoreline. Ectocarpus siliculosus is an emerging model for brown algal research. Its genome has been sequenced, and several tools are being developed to perform analyses at different levels of cell organization, including transcriptomic expression analyses. Several topics, including physiological responses to osmotic stress and to exposure to contaminants and solvents are being studied in order to better understand the adaptive capacity of brown algae to pollution and environmental changes. A series of genes that can be used to normalise expression analyses is required for these studies. Results We monitored the expression of 13 genes under 21 different culture conditions. These included genes encoding proteins and factors involved in protein translation (ribosomal protein 26S, EF1alpha, IF2A, IF4E) and protein degradation (ubiquitin, ubiquitin conjugating enzyme) or folding (cyclophilin), and proteins involved in both the structure of the cytoskeleton (tubulin alpha, actin, actin-related proteins) and its trafficking function (dynein), as well as a protein implicated in carbon metabolism (glucose 6-phosphate dehydrogenase). The stability of their expression level was assessed using the Ct range, and by applying both the geNorm and the Normfinder principles of calculation. Conclusion Comparisons of the data obtained with the three methods of calculation indicated that EF1alpha (EF1a) was the best reference gene for normalisation. The normalisation factor should be calculated with at least two genes, alpha tubulin, ubiquitin-conjugating enzyme or actin-related proteins being good partners of EF1a. Our results exclude actin as a good normalisation gene, and, in this, are in agreement with previous studies in other organisms.
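    The recommendation to build the normalisation factor from at least two stable reference genes can be illustrated with a short sketch. The geometric-mean calculation below follows the geNorm principle mentioned in the abstract; the expression values are invented, and only the gene names (EF1a, alpha tubulin) are taken from the study.

```python
# Illustrative sketch only: a geNorm-style normalisation factor computed as
# the geometric mean of reference-gene expression levels, then applied to a
# target gene.  The gene names follow the abstract's recommendation (EF1a
# plus a second stable gene); the expression values are invented.
import math

def normalisation_factor(reference_levels):
    """Geometric mean of the reference-gene expression levels (geNorm principle)."""
    return math.prod(reference_levels) ** (1.0 / len(reference_levels))

# Relative expression in one sample (hypothetical numbers).
sample = {"EF1a": 1.10, "alpha_tubulin": 0.92, "target_gene": 3.40}

nf = normalisation_factor([sample["EF1a"], sample["alpha_tubulin"]])
print(f"normalisation factor = {nf:.3f}")
print(f"normalised target expression = {sample['target_gene'] / nf:.3f}")
```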

  12. Using Normalisation Process Theory to investigate the implementation of school-based oral health promotion.

    Science.gov (United States)

    Olajide, O J; Shucksmith, J; Maguire, A; Zohoori, F V

    2017-09-01

    Despite the considerable improvement in oral health of children in the UK over the last forty years, a significant burden of dental caries remains prevalent in some groups of children, indicating the need for more effective oral health promotion intervention (OHPI) strategies in this population. To explore the implementation process of a community-based OHPI, in the North East of England, using Normalisation Process Theory (NPT) to provide insights on how effectiveness could be maximised. Utilising a generic qualitative research approach, 19 participants were recruited into the study. In-depth interviews were conducted with relevant National Health Service (NHS) staff and primary school teachers while focus group discussions were conducted with reception teachers and teaching assistants. Analyses were conducted using thematic analysis with emergent themes mapped onto NPT constructs. Participants highlighted the benefits of OHPI and the need for evidence in practice. However, implementation of 'best evidence' was hampered by lack of adequate synthesis of evidence from available clinical studies on effectiveness of OHPI as these generally have insufficient information on the dynamics of implementation and how effectiveness obtained in clinical studies could be achieved in 'real life'. This impacted on the decision-making process, levels of commitment, collaboration among OHP teams, resource allocation and evaluation of OHPI. A large gap exists between available research evidence and translation of evidence in OHPI in community settings. Effectiveness of OHPI requires not only an awareness of evidence of clinical effectiveness but also synthesised information about change mechanisms and implementation protocols. Copyright© 2017 Dennis Barber Ltd.

  13. Trends of air pollution in Denmark - Normalised by a simple weather index model

    International Nuclear Information System (INIS)

    Kiilsholm, S.; Rasmussen, A.

    2000-01-01

    This report is a part of the Traffic Pool projects on 'Traffic and Environments', 1995-99, financed by the Danish Ministry of Transport. The Traffic Pool projects included five different projects on 'Surveillance of the Air Quality', 'Atmospheric Modelling', 'Atmospheric Chemistry Modelling', 'Smog and ozone' and 'Greenhouse effects and Climate', [Rasmussen, 2000]. This work is a part of the project on 'Surveillance of the Air Quality', with the main objective of making trend analyses of levels of air pollution from traffic in Denmark. Other participants were from the Road Directorate, mainly focusing on measurement of traffic and trend analysis of the air quality utilising a Nordic model for the air pollution in street canyons called BLB (Beregningsmodel for Luftkvalitet i Byluftgader) [Vejdirektoratet 2000], the National Environmental Research Institute (NERI), mainly focusing on measurements of air pollution and trend analysis with the Operational Street Pollution Model (OSPM) [DMU 2000], and the Copenhagen Environmental Protection Agency, mainly focusing on measurements. In this study a simpler statistical model has been developed for trend analysis of the air quality. The model filters out the influence of year-to-year variations in the meteorological conditions on the air pollution levels. The weather factors found most important are wind speed, wind direction and mixing height. Measurements of CO, NO and NO2 from three streets in Copenhagen have been used; these streets are Jagtvej, Bredgade and H. C. Andersen's Boulevard (HCAB). The years 1994-1996 were used for evaluation of the method, and annual air pollution indexes dependent only on meteorological parameters, called WEATHIX, were calculated for the years 1990-1997 and used for normalisation of the observed air pollution trends. Meteorological data were taken from either the background stations at the H.C. Oersted building situated close to one of the street stations or the synoptic
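    As a rough illustration of the normalisation step described (not the report's actual data or index construction), dividing each annual mean concentration by that year's weather index removes the meteorologically driven part of the year-to-year variation:

```python
# Very small sketch (hypothetical numbers) of weather-index normalisation of
# an observed pollution trend: each annual mean concentration is divided by
# that year's weather index (WEATHIX-style), so year-to-year meteorological
# variability is filtered out of the trend.  The construction of the index
# itself (from wind speed, wind direction and mixing height) is not shown.
years = [1994, 1995, 1996, 1997]
observed_no2 = [54.0, 49.0, 57.0, 52.0]   # annual mean, ug/m3 (invented)
weathix = [1.05, 0.95, 1.10, 1.00]        # weather index, 1.0 = average year

normalised = [c / w for c, w in zip(observed_no2, weathix)]
for y, raw, norm in zip(years, observed_no2, normalised):
    print(f"{y}: observed {raw:.1f}, weather-normalised {norm:.1f} ug/m3")
```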

  14. The applicability of normalisation process theory to speech and language therapy: a review of qualitative research on a speech and language intervention.

    Science.gov (United States)

    James, Deborah M

    2011-08-12

    The Bercow review found a high level of public dissatisfaction with speech and language services for children. Children with speech, language, and communication needs (SLCN) often have chronic complex conditions that require provision from health, education, and community services. Speech and language therapists are a small group of Allied Health Professionals with a specialist skill-set that equips them to work with children with SLCN. They work within and across the diverse range of public service providers. The aim of this review was to explore the applicability of Normalisation Process Theory (NPT) to the case of speech and language therapy. A review of qualitative research on a successfully embedded speech and language therapy intervention was undertaken to test the applicability of NPT. The review focused on two of the collective action elements of NPT (relational integration and interaction workability) using all previously published qualitative data from both parents and practitioners' perspectives on the intervention. The synthesis of the data based on the Normalisation Process Model (NPM) uncovered strengths in the interpersonal processes between the practitioners and parents, and weaknesses in how the accountability of the intervention is distributed in the health system. The analysis based on the NPM uncovered interpersonal processes between the practitioners and parents that were likely to have given rise to successful implementation of the intervention. In previous qualitative research on this intervention where the Medical Research Council's guidance on developing a design for a complex intervention had been used as a framework, the interpersonal work within the intervention had emerged as a barrier to implementation of the intervention. It is suggested that the design of services for children and families needs to extend beyond the consideration of benefits and barriers to embrace the social processes that appear to afford success in embedding

  15. Exploring the implementation of an electronic record into a maternity unit: a qualitative study using Normalisation Process Theory.

    Science.gov (United States)

    Scantlebury, Arabella; Sheard, Laura; Watt, Ian; Cairns, Paul; Wright, John; Adamson, Joy

    2017-01-07

    To explore the benefits, barriers and disadvantages of implementing an electronic record system (ERS). The extent to which the system has become 'normalised' into routine practice was also explored. Qualitative semi-structured interviews were conducted with 19 members of NHS staff who represented a variety of staff groups (doctors, midwives of different grades, health care assistants) and wards within a maternity unit at an NHS teaching hospital. Interviews were conducted during the first year of the phased implementation of ERS and were analysed thematically. The four mechanisms of Normalisation Process Theory (NPT) (coherence, cognitive participation, collective action and reflexive monitoring) were adapted for use within the study and provided a theoretical framework to interpret the study's findings. Coherence (participants' understanding of why the ERS has been implemented) was mixed - whilst those involved in ERS implementation anticipated advantages such as improved access to information, the majority were unclear why the ERS was introduced. Participants' willingness to engage with and invest time into the ERS (cognitive participation) depended on the amount of training and support they received and their willingness to change from paper to electronic records. Collective action (the extent the ERS was used) may be influenced by whether participants perceived there to be benefits associated with the system. Whilst some individuals reported benefits such as improved legibility of records, others felt benefits were yet to emerge. The parallel use of paper and the lack of integration of electronic systems within and between the trust and other healthcare organisations hindered ERS use. When appraising the ERS (reflexive monitoring) participants perceived the system to negatively impact the patient-clinician relationship, time and patient safety. Despite expectations that the ERS would have a number of advantages, its implementation was perceived to have a range of

  16. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    Science.gov (United States)

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of

  17. Are Brief Alcohol Interventions Adequately Embedded in UK Primary Care? A Qualitative Study Utilising Normalisation Process Theory.

    Science.gov (United States)

    O'Donnell, Amy; Kaner, Eileen

    2017-03-28

    Despite substantial evidence for their effectiveness, the adoption of alcohol screening and brief interventions (ASBI) in routine primary care remains inconsistent. Financial incentive schemes were introduced in England between 2008 and 2015 to encourage their delivery. We used Normalisation Process Theory-informed interviews to understand the barriers and facilitators experienced by 14 general practitioners (GPs) as they implemented ASBI during this period. We found multiple factors shaped provision. GPs were broadly cognisant and supportive of preventative alcohol interventions (coherence) but this did not necessarily translate into personal investment in their delivery (cognitive participation). This lack of investment shaped how GPs operationalised such "work" in day-to-day practice (collective action), with ASBI mostly delegated to nurses, and GPs reverting to "business as usual" in their management and treatment of problem drinking (reflexive monitoring). We conclude there has been limited progress towards the goal of an effectively embedded preventative alcohol care pathway in English primary care. Future policy should consider screening strategies that prioritise patients with conditions with a recognised link with excessive alcohol consumption, and which promote more efficient identification of the most problematic drinkers. Improved GP training to build skills and awareness of evidence-based ASBI tools could also help embed best practice over time.

  18. Supervised Object Class Colour Normalisation

    DEFF Research Database (Denmark)

    Riabchenko, Ekatarina; Lankinen, Jukka; Buch, Anders Glent

    2013-01-01

    Colour is an important cue in many applications of computer vision and image processing, but robust usage often requires estimation of the unknown illuminant colour. Usually, to obtain images invariant to the illumination conditions under which they were taken, colour normalisation is used....... In this work, we develop such a colour normalisation technique, where true colours are not important per se but where examples of the same classes have photometrically consistent appearance. This is achieved by supervised estimation of a class specific canonical colour space where the examples have minimal variation...
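    One plausible way to realise the idea sketched in this abstract is a per-image, per-channel gain that maps every example of a class toward a canonical class colour, so that within-class photometric variation shrinks. The snippet below is an assumed, simplified illustration of that idea, not the authors' actual method.

```python
# Sketch of a supervised, class-specific colour normalisation (assumed
# realisation of the idea in the abstract, not the authors' exact method):
# each image is rescaled per channel so that its mean colour matches the
# canonical mean colour of its object class, reducing within-class
# photometric variation.
import numpy as np

def canonical_class_colour(images):
    """Mean RGB over all pixels of all training images of one class."""
    return np.mean([img.reshape(-1, 3).mean(axis=0) for img in images], axis=0)

def normalise_to_class(image, class_mean):
    """Per-channel (von Kries-style) gain mapping the image mean to the class mean."""
    img_mean = image.reshape(-1, 3).mean(axis=0)
    gain = class_mean / np.maximum(img_mean, 1e-6)
    return np.clip(image * gain, 0.0, 1.0)

# Toy example: two synthetic "images" of the same class under different illuminants.
rng = np.random.default_rng(1)
base = rng.uniform(0.2, 0.8, size=(32, 32, 3))
img_warm = np.clip(base * np.array([1.2, 1.0, 0.8]), 0, 1)
img_cool = np.clip(base * np.array([0.8, 1.0, 1.2]), 0, 1)

class_mean = canonical_class_colour([img_warm, img_cool])
norm_warm = normalise_to_class(img_warm, class_mean)
norm_cool = normalise_to_class(img_cool, class_mean)
print("mean colour difference before:", np.abs(img_warm.mean((0, 1)) - img_cool.mean((0, 1))))
print("mean colour difference after: ", np.abs(norm_warm.mean((0, 1)) - norm_cool.mean((0, 1))))
```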

  19. Assessing the facilitators and barriers of interdisciplinary team working in primary care using normalisation process theory: An integrative review.

    Science.gov (United States)

    O'Reilly, Pauline; Lee, Siew Hwa; O'Sullivan, Madeleine; Cullen, Walter; Kennedy, Catriona; MacFarlane, Anne

    2017-01-01

    Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. We need to know much more about the experiences of the full network of primary care professionals regarding all aspects of implementation work. International

  20. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10–14) in Wales, UK

    Directory of Open Access Journals (Sweden)

    Jeremy Segrott

    2017-12-01

    Conclusions: Extended Normalisation Process Theory provided a useful framework for assessing implementation and explaining variation by examining intervention-context interactions. Findings highlight the need for process evaluations to consider both the structural and process components of implementation to explain whether programme activities are delivered as intended and why.

  1. Using normalisation process theory to understand barriers and facilitators to implementing mindfulness-based stress reduction for people with multiple sclerosis.

    Science.gov (United States)

    Simpson, Robert; Simpson, Sharon; Wood, Karen; Mercer, Stewart W; Mair, Frances S

    2018-01-01

    Objectives To study barriers and facilitators to implementation of mindfulness-based stress reduction for people with multiple sclerosis. Methods Qualitative interviews were used to explore barriers and facilitators to implementation of mindfulness-based stress reduction, including 33 people with multiple sclerosis, 6 multiple sclerosis clinicians and 2 course instructors. Normalisation process theory provided the underpinning conceptual framework. Data were analysed deductively using normalisation process theory constructs (coherence, cognitive participation, collective action and reflexive monitoring). Results Key barriers included mismatched stakeholder expectations, lack of knowledge about mindfulness-based stress reduction, high levels of comorbidity and disability and skepticism about embedding mindfulness-based stress reduction in routine multiple sclerosis care. Facilitators to implementation included introducing a pre-course orientation session and adapting mindfulness-based stress reduction to accommodate comorbidity and disability; participants suggested smaller, shorter classes, shortened practices, exclusion of mindful walking and more time with peers. Post-mindfulness-based stress reduction booster sessions may be required, and objective and subjective reports of benefit would increase clinician confidence in mindfulness-based stress reduction. Discussion Multiple sclerosis patients and clinicians know little about mindfulness-based stress reduction. Mismatched expectations are a barrier to participation, as is rigid application of mindfulness-based stress reduction in the context of disability. Course adaptations in response to patient needs would facilitate uptake and utilisation. Rendering access to mindfulness-based stress reduction rapid and flexible could facilitate implementation. Embedded outcome assessment is desirable.

  2. Normalisation of glomerular filtration rate measurements

    International Nuclear Information System (INIS)

    White, A.J.; Strydom, W.J.

    1991-01-01

    The result of a glomerular filtration rate (GFR) measurement on a particular patient is of limited use to the referring physician since normal GFR values vary widely with the patient's age and build, etc. To overcome this problem, it is usual to normalise the measured GFR by dividing it by the patient's surface area and multiplying the result by the surface area of a 'standard' man. This transforms the measurement onto a scale which applies to all patients, young and old, large and small, where normal values fall within a well-defined range and where the degree of renal impairment can be quantified. We have examined the generally accepted surface area (SA) and the less well-known extracellular volume (ECV) normalisation methods of GFR measurements in a series of 110 patients. The results show that both methods produce essentially the same result; however, ECV normalisation is theoretically more correct, can be found directly without the patient's ECV being measured and does not require the use of empirical formulae. Mathematical justification for ECV normalisation is presented, and a proposed distribution pattern for the normalised measurement is introduced. A simple mathematical model shows that accurate GFR measurements can be made in the presence of an enlarged ECV, but normalisation of these will produce misleading low values. (orig.)
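    The surface-area normalisation described here is easy to state as a formula: normalised GFR = measured GFR × (standard surface area / patient surface area). The sketch below uses the conventional 1.73 m² 'standard man' and the DuBois height/weight formula for body surface area; both are common choices assumed for illustration rather than values quoted from this paper.

```python
# Worked sketch of surface-area (SA) normalisation of a measured GFR,
# as described in the abstract: divide by the patient's surface area and
# multiply by the surface area of a 'standard' man.  The DuBois formula for
# estimating body surface area and the 1.73 m2 standard value are
# conventional choices assumed here, not taken from the paper.

STANDARD_BSA_M2 = 1.73  # commonly used 'standard man' surface area

def dubois_bsa(height_cm: float, weight_kg: float) -> float:
    """Estimate body surface area (m^2) with the DuBois formula."""
    return 0.007184 * (height_cm ** 0.725) * (weight_kg ** 0.425)

def normalise_gfr(measured_gfr_ml_min: float, height_cm: float, weight_kg: float) -> float:
    """Scale a measured GFR to the standard 1.73 m^2 body surface area."""
    bsa = dubois_bsa(height_cm, weight_kg)
    return measured_gfr_ml_min * STANDARD_BSA_M2 / bsa

# Example: a small patient whose raw GFR looks low until it is normalised.
print(round(normalise_gfr(60.0, height_cm=150, weight_kg=45), 1), "mL/min/1.73 m^2")
```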

  3. Normalising convenience food?

    DEFF Research Database (Denmark)

    Halkier, Bente

    2017-01-01

    The construction of convenience food as a social and cultural category for food provisioning, cooking and eating seems to slide between or across understandings of what is considered “proper food” in the existing discourses in everyday life and media. This article sheds light upon some...... of the social and cultural normativities around convenience food by describing the ways in which convenience food forms part of the daily life of young Danes. Theoretically, the article is based on a practice theoretical perspective. Empirically, the article builds upon a qualitative research project on food...... habits among Danes aged 20–25. The article presents two types of empirical patterns. The first types of patterns are the degree to which and the different ways in which convenience food is normalised to use among the young Danes. The second types of patterns are the normative places of convenience food...

  4. Matrix-normalised real-time PCR approach to quantify soybean as a potential food allergen as affected by thermal processing.

    Science.gov (United States)

    Costa, Joana; Amaral, Joana S; Grazina, Liliana; Oliveira, M Beatriz P P; Mafra, Isabel

    2017-04-15

    The addition of soybean protein materials to meat products is a common practice in the food industry, being a potential hidden allergenic commodity. This study aimed at proposing a novel specific and highly sensitive real-time PCR system for the detection/quantification of soybean as an allergenic ingredient in processed meat products. The method achieved a limit of detection of 9.8 pg of soybean DNA (8.6 copies), with adequate real-time PCR performance parameters, regardless of the soybean material (concentrate or isolate) and after thermal treatments. A normalised approach was also proposed in the range of 0.001-10% (w/w) of soybean material in pork meat, which was successfully validated and applied to processed meat products. Soybean was identified in more than 40% of tested samples of cooked ham and mortadella in the range of 0.1-4% (w/w), 3 samples not complying with labelling regulations as a result of undeclared soybean. Copyright © 2016 Elsevier Ltd. All rights reserved.
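    A normalised real-time PCR quantification of this kind is typically built on a calibration curve relating a matrix-normalised delta-Ct to the log of the soybean content. The sketch below illustrates that general scheme with invented Ct values; it should not be read as the authors' exact calibration model.

```python
# Hedged sketch of a matrix-normalised quantification scheme of the kind the
# abstract describes: target (soybean) Ct values are normalised against a
# reference amplicon from the meat matrix (delta-Ct), and a calibration line
# of delta-Ct versus log10(% w/w) is used to estimate unknowns.  All Ct
# values and the exact normaliser are invented for illustration.
import numpy as np

# Calibration standards: % soybean (w/w) and corresponding delta-Ct values.
percent_soy = np.array([0.001, 0.01, 0.1, 1.0, 10.0])
delta_ct = np.array([21.3, 18.0, 14.6, 11.2, 7.9])   # hypothetical

# Fit delta-Ct = slope * log10(%) + intercept.
slope, intercept = np.polyfit(np.log10(percent_soy), delta_ct, 1)

def estimate_percent(sample_delta_ct: float) -> float:
    """Interpolate an unknown sample onto the calibration line."""
    return float(10 ** ((sample_delta_ct - intercept) / slope))

print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
print(f"estimated soybean content: {estimate_percent(12.5):.2f} % (w/w)")
```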

  5. Implementation of a text-messaging intervention for adolescents who self-harm (TeenTEXT): a feasibility study using normalisation process theory.

    Science.gov (United States)

    Owens, Christabel; Charles, Nigel

    2016-01-01

    There are few interventions that directly address self-harming behaviour among adolescents. At the request of clinicians in Child and Adolescent Mental Health Services (CAMHS) in England and working with them, we redeveloped an adult SMS text-messaging intervention to meet the needs of adolescents under the care of CAMHS who self-harm. We used normalisation process theory (NPT) to assess the feasibility of delivering it through CAMHS. We planned to recruit 27 young people who self-harm and their clinicians, working as dyads and using the intervention (TeenTEXT) for 6 months. Despite strong engagement in principle from CAMHS teams, in practice we were able to recruit only three clinician/client dyads. Of these, two dropped out because the clients were too unwell. We identified a number of barriers to implementation. These included: a context of CAMHS in crisis, with heavy workloads and high stress levels; organisational gatekeeping practices, which limited the extent to which clinicians could engage with the intervention; perceived burdensomeness and technophobia on the part of clinicians, and a belief by many clinicians that CAMHS may be the wrong delivery setting and that the intervention may have better fit with schools and universal youth services. User-centred design principles and the use of participatory methods in intervention development are no guarantee of implementability. Barriers to implementation cannot always be foreseen, and early clinical champions may overestimate the readiness of colleagues to embrace new ideas and technologies. NPT studies have an important role to play in identifying whether or not interventions are likely to receive widespread clinical support. This study of a text-messaging intervention to support adolescents who self-harm (TeenTEXT) showed that further work is needed to identify the right delivery setting, before testing the efficacy of the intervention.

  6. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Directory of Open Access Journals (Sweden)

    Paryaneh Rostami

    Full Text Available Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives.Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory.Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported.Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however

  7. Supporting the use of theory in cross-country health services research: a participatory qualitative approach using Normalisation Process Theory as an example.

    Science.gov (United States)

    O'Donnell, Catherine A; Mair, Frances S; Dowrick, Christopher; Brún, Mary O'Reilly-de; Brún, Tomas de; Burns, Nicola; Lionis, Christos; Saridaki, Aristoula; Papadakaki, Maria; Muijsenbergh, Maria van den; Weel-Baumgarten, Evelyn van; Gravenhorst, Katja; Cooper, Lucy; Princz, Christine; Teunissen, Erik; Mareeuw, Francine van den Driessen; Vlahadi, Maria; Spiegel, Wolfgang; MacFarlane, Anne

    2017-08-21

    To describe and reflect on the process of designing and delivering a training programme supporting the use of theory, in this case Normalisation Process Theory (NPT), in a multisite cross-country health services research study. Participatory research approach using qualitative methods. Six European primary care settings involving research teams from Austria, England, Greece, Ireland, The Netherlands and Scotland. RESTORE research team consisting of 8 project applicants, all senior primary care academics, and 10 researchers. Professional backgrounds included general practitioners/family doctors, social/cultural anthropologists, sociologists and health services/primary care researchers. Views of all research team members (n=18) were assessed using qualitative evaluation methods, analysed qualitatively by the trainers after each session. Most of the team had no experience of using NPT and many had not applied theory to prospective, qualitative research projects. Early training proved didactic and overloaded participants with information. Drawing on RESTORE's methodological approach of Participatory Learning and Action, workshops using role play, experiential interactive exercises and light-hearted examples not directly related to the study subject matter were developed. Evaluation showed the study team quickly grew in knowledge and confidence in applying theory to fieldwork. Recommendations applicable to other studies include: accepting that theory application is not a linear process, that time is needed to address researcher concerns with the process, and that experiential, interactive learning is a key device in building conceptual and practical knowledge. An unanticipated benefit was the smooth transition to cross-country qualitative coding of study data. A structured programme of training enhanced and supported the prospective application of a theory, NPT, to our work but raised challenges. These were not unique to NPT but could arise with the application of any

  8. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10-14) in Wales, UK.

    Science.gov (United States)

    Segrott, Jeremy; Murphy, Simon; Rothwell, Heather; Scourfield, Jonathan; Foxcroft, David; Gillespie, David; Holliday, Jo; Hood, Kerenza; Hurlow, Claire; Morgan-Trimmer, Sarah; Phillips, Ceri; Reed, Hayley; Roberts, Zoe; Moore, Laurence

    2017-12-01

    Process evaluations generate important data on the extent to which interventions are delivered as intended. However, the tendency to focus only on assessment of pre-specified structural aspects of fidelity has been criticised for paying insufficient attention to implementation processes and how intervention-context interactions influence programme delivery. This paper reports findings from a process evaluation nested within a randomised controlled trial of the Strengthening Families Programme 10-14 (SFP 10-14) in Wales, UK. It uses Extended Normalisation Process Theory to theorise how interaction between SFP 10-14 and local delivery systems - particularly practitioner commitment/capability and organisational capacity - influenced delivery of intended programme activities: fidelity (adherence to SFP 10-14 content and implementation requirements); dose delivered; dose received (participant engagement); participant recruitment and reach (intervention attendance). A mixed methods design was utilised. Fidelity assessment sheets (completed by practitioners), structured observation by researchers, and routine data were used to assess: adherence to programme content; staffing numbers and consistency; recruitment/retention; and group size and composition. Interviews with practitioners explored implementation processes and context. Adherence to programme content was high - with some variation, linked to practitioner commitment to, and understanding of, the intervention's content and mechanisms. Variation in adherence rates was associated with the extent to which multi-agency delivery team planning meetings were held. Recruitment challenges meant that targets for group size/composition were not always met, but did not affect adherence levels or family engagement. Targets for staffing numbers and consistency were achieved, though capacity within multi-agency networks reduced over time. Extended Normalisation Process Theory provided a useful framework for assessing

  9. Response to "The Normalised Child"

    Science.gov (United States)

    Chisnall, Nicola

    2005-01-01

    Grebennikov has chosen to present to the contemporary gaze the so-called "absolute" concepts of Maria Montessori regarding normalisation and deviation in childhood. Grebennikov has focussed on the issue of challenging behaviour and the "deviations" identified by Montessori 100 years ago. His discursive treatment of the topic…

  10. A Markov chain description of the stepwise mutation model: local and global behaviour of the allele process.

    Science.gov (United States)

    Caliebe, Amke; Jochens, Arne; Krawczak, Michael; Rösler, Uwe

    2010-09-21

    The stepwise mutation model (SMM) is a simple, widely used model to describe the evolutionary behaviour of microsatellites. We apply a Markov chain description of the SMM and derive the marginal and joint properties of this process. In addition to the standard SMM, we also consider the normalised allele process. In contrast to the standard process, the normalised process converges to a stationary distribution. We show that the marginal stationary distribution is unimodal. The standard and normalised processes capture the global and the local behaviour of the SMM, respectively. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
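    The standard SMM itself is simple to simulate: each mutation shifts the repeat count up or down by one with equal probability. The sketch below shows only this basic allele random walk; the normalised process and its stationary distribution, which are the paper's focus, are not reproduced here.

```python
# Minimal simulation of the standard stepwise mutation model (SMM) as a
# Markov chain on microsatellite repeat counts: at each mutation event the
# allele length increases or decreases by one repeat with equal probability.
# The normalised process analysed in the paper is not reproduced here; this
# only illustrates the basic allele random walk.
import numpy as np

def simulate_smm(initial_length: int, n_mutations: int, rng) -> np.ndarray:
    """Trajectory of one allele under the symmetric SMM."""
    steps = rng.choice([-1, 1], size=n_mutations)
    return initial_length + np.concatenate([[0], np.cumsum(steps)])

rng = np.random.default_rng(42)
trajectory = simulate_smm(initial_length=20, n_mutations=1000, rng=rng)
print("final repeat count:", trajectory[-1])
print("variance of repeat count over the trajectory:", round(trajectory.var(), 2))
```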

  11. Repeated lysergic acid diethylamide in an animal model of depression: Normalisation of learning behaviour and hippocampal serotonin 5-HT2 signalling.

    Science.gov (United States)

    Buchborn, Tobias; Schröder, Helmut; Höllt, Volker; Grecksch, Gisela

    2014-06-01

    A re-balance of postsynaptic serotonin (5-HT) receptor signalling, with an increase in 5-HT1A and a decrease in 5-HT2A signalling, is a final common pathway multiple antidepressants share. Given that the 5-HT1A/2A agonist lysergic acid diethylamide (LSD), when repeatedly applied, selectively downregulates 5-HT2A, but not 5-HT1A receptors, one might expect LSD to similarly re-balance the postsynaptic 5-HT signalling. Challenging this idea, we use an animal model of depression specifically responding to repeated antidepressant treatment (olfactory bulbectomy), and test the antidepressant-like properties of repeated LSD treatment (0.13 mg/kg/d, 11 d). In line with former findings, we observe that bulbectomised rats show marked deficits in active avoidance learning. These deficits, similarly as we earlier noted with imipramine, are largely reversed by repeated LSD administration. Additionally, bulbectomised rats exhibit distinct anomalies of monoamine receptor signalling in hippocampus and/or frontal cortex; from these, only the hippocampal decrease in 5-HT2 related [(35)S]-GTP-gamma-S binding is normalised by LSD. Importantly, the sham-operated rats do not profit from LSD, and exhibit reduced hippocampal 5-HT2 signalling. As behavioural deficits after bulbectomy respond to agents classified as antidepressants only, we conclude that the effect of LSD in this model can be considered antidepressant-like, and discuss it in terms of a re-balance of hippocampal 5-HT2/5-HT1A signalling. © The Author(s) 2014.

  12. Infinitary Combinatory Reduction Systems: Normalising Reduction Strategies

    NARCIS (Netherlands)

    Ketema, J.; Simonsen, Jakob Grue

    2010-01-01

    We study normalising reduction strategies for infinitary Combinatory Reduction Systems (iCRSs). We prove that all fair, outermost-fair, and needed-fair strategies are normalising for orthogonal, fully-extended iCRSs. These facts properly generalise a number of results on normalising strategies in

  13. Normalising the Breast: Early Childhood Services Battling the Bottle and the Breast

    Science.gov (United States)

    Duncan, Judith; Bartle, Carol

    2014-01-01

    Normalising practices as a tool for controlling the body and bodily processes have been well-documented using Foucault's theories, including debates around breastfeeding. In this article we explore how the ideas of "normalisation" of the bottle-feeding culture of infants in New Zealand early childhood settings has become the accepted…

  14. Elevated international normalised ratios correlate with severity of ...

    African Journals Online (AJOL)

    admission international normalised ratios (INRs) were correlated with Injury Severity Scores (ISSs) and in-hospital mortality. A multi- variable Poisson model with robust standard errors was used to assess the relationship between coagulopathy and mortality after adjustment for the confounding influence of age and gender.

  15. Alternative psychosis (forced normalisation) in epilepsy

    African Journals Online (AJOL)

    Landolt was the first to report improvement in EEG activity during periods of abnormal behaviour.6-9. The mechanism of forced normalisation is still not fully understood, although the kindling phenomenon, the phenomenon of long-term potentiation and the channel disorder paradigm have all been proposed as possible.

  16. Alternative psychosis (forced normalisation) in epilepsy

    African Journals Online (AJOL)

    Patients with refractory temporal lobe epilepsy who undergo unilateral anterior temporal lobectomy have been observed to develop a de novo psychosis with diminished seizures. This is thought to be an alternative psychosis related to forced normalisation of the EEG.8,12-14. The absence of clear diagnostic criteria for ...

  17. Attitudes to Normalisation and Inclusive Education

    Science.gov (United States)

    Sanagi, Tomomi

    2016-01-01

    The purpose of this paper was to clarify the features of teachers' image on normalisation and inclusive education. The participants of the study were both mainstream teachers and special teachers. One hundred and thirty-eight questionnaires were analysed. (1) Teachers completed the questionnaire of SD (semantic differential) images on…

  18. Evaluating a systematic voiding programme for patients with urinary incontinence after stroke in secondary care using soft systems analysis and Normalisation Process Theory: findings from the ICONS case study phase.

    Science.gov (United States)

    Thomas, L H; French, B; Burton, C R; Sutton, C; Forshaw, D; Dickinson, H; Leathley, M J; Britt, D; Roe, B; Cheater, F M; Booth, J; Watkins, C L

    2014-10-01

    Urinary incontinence (UI) affects between 40 and 60% of people in hospital after stroke, but is often poorly managed in stroke units. To inform an exploratory trial by three methods: identifying the organisational context for embedding the systematic voiding programme (SVP); exploring health professionals' views around embedding the SVP; and measuring presence/absence of UI and frequency of UI episodes at baseline and six weeks post-stroke. A mixed methods single case study included analysis of organisational context using interviews with clinical leaders analysed with soft systems methodology, a process evaluation using interviews with staff delivering the intervention and analysed with Normalisation Process Theory, and outcome evaluation using data from patients receiving the SVP and analysed using descriptive statistics. An 18 bed acute stroke unit in a large Foundation Trust (a 'not for profit' privately controlled entity not accountable to the UK Department of Health) serving a population of 370,000. Health professionals and clinical leaders with a role in either delivering the SVP or linking with it in any capacity were recruited following informed consent. Patients were recruited meeting the following inclusion criteria: aged 18 or over with a diagnosis of stroke; urinary incontinence (UI) as defined by the International Continence Society; conscious; medically stable as judged by the clinical team and with incontinence classified as stress, urge, mixed or 'functional'. All patients admitted to the unit during the intervention period were screened for eligibility; informed consent to collect baseline and outcome data was sought from all eligible patients. Organisational context: 18 health professionals took part in four group interviews. Findings suggest an environment not conducive to therapeutic continence management and a focus on containment of UI. Embedding the SVP into practice: 21 nursing staff took part in six group interviews. Initial confusion gave way to embedding of processes

  19. Use of a pre-analysis osmolality normalisation method to correct for variable urine concentrations and for improved metabolomic analyses.

    Science.gov (United States)

    Chetwynd, Andrew J; Abdul-Sada, Alaa; Holt, Stephen G; Hill, Elizabeth M

    2016-01-29

    Metabolomics analyses of urine have the potential to provide new information on the detection and progression of many disease processes. However, urine samples can vary significantly in total solute concentration and this presents a challenge to achieve high quality metabolomic datasets and the detection of biomarkers of disease or environmental exposures. This study investigated the efficacy of pre- and post-analysis normalisation methods to analyse metabolomic datasets obtained from neat and diluted urine samples from five individuals. Urine samples were extracted by solid phase extraction (SPE) prior to metabolomic analyses using a sensitive nanoflow/nanospray LC-MS technique and the data analysed by principal component analyses (PCA). Post-analysis normalisation of the datasets to either creatinine or osmolality concentration, or to mass spectrum total signal (MSTS), revealed that sample discrimination was driven by the dilution factor of urine rather than the individual providing the sample. Normalisation of urine samples to equal osmolality concentration prior to LC-MS analysis resulted in clustering of the PCA scores plot according to sample source and significant improvements in the number of peaks common to samples of all three dilutions from each individual. In addition, the ability to identify discriminating markers, using orthogonal partial least squared-discriminant analysis (OPLS-DA), was greatly improved when pre-analysis normalisation to osmolality was compared with post-analysis normalisation to osmolality and non-normalised datasets. Further improvements for peak area repeatability were observed in some samples when the pre-analysis normalisation to osmolality was combined with a post-analysis mass spectrum total useful signal (MSTUS) or MSTS normalisation. Future adoption of such normalisation methods may reduce the variability in metabolomics analyses due to differing urine concentrations and improve the discovery of discriminating metabolites
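
    The post-analysis strategies compared above (creatinine, osmolality or total-signal normalisation) all amount to rescaling each sample's peak table by a per-sample factor. The sketch below illustrates that step in Python; the column names, the toy numbers and the choice of rescaling to the median factor are illustrative assumptions, not the authors' pipeline.

      import pandas as pd

      def normalise_peak_table(peaks: pd.DataFrame, factors: pd.Series) -> pd.DataFrame:
          """Rescale each sample (row) of a peak-area table by a per-sample factor.

          `factors` can hold creatinine or osmolality measurements, or the total
          signal per sample (MSTS); rescaling to the median factor keeps the
          normalised intensities on a familiar scale.
          """
          scale = factors.median() / factors        # per-sample correction factor
          return peaks.mul(scale, axis=0)           # rows = samples, columns = features

      # Illustrative usage (toy numbers, not study data):
      peaks = pd.DataFrame({"m1": [100.0, 40.0], "m2": [50.0, 22.0]}, index=["s1", "s2"])
      osmolality = pd.Series({"s1": 800.0, "s2": 300.0})   # mOsm/kg, assumed values
      msts = peaks.sum(axis=1)                             # mass spectrum total signal
      print(normalise_peak_table(peaks, osmolality))
      print(normalise_peak_table(peaks, msts))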

  20. The Normalised Child: A Non-Traditional Psychological Framework

    Science.gov (United States)

    Grebennikov, Leonid

    2005-01-01

    The terms "normalisation" and "normalised child" were introduced into early childhood scholarship by Maria Montessori, whose ideas regarding norm and deviation in children's development and behaviour have been discussed, debated and sometimes criticised, but remain magnetic and recognised worldwide. Contemporary Western society is witnessing a…

  1. An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants

    DEFF Research Database (Denmark)

    Møller, Jesper; Pettitt, A. N.; Reeves, R.

    2006-01-01

    is presented which requires only that independent samples can be drawn from the unnormalised density at any particular parameter value. The proposal distribution is constructed so that the normalising constant cancels from the Metropolis–Hastings ratio. The method is illustrated by producing posterior samples...... for parameters of the Ising model given a particular lattice realisation....
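
    In outline, the method augments the posterior with an auxiliary variable simulated from the same unnormalised model, so that the unknown constants cancel in the acceptance ratio. The display below is a sketch of that cancellation written out from the abstract; the notation (q_theta for the unnormalised density, Z(theta) for its normalising constant, g for the auxiliary-variable density) is ours.

      \[
        \pi(y \mid \theta) = \frac{q_\theta(y)}{Z(\theta)}, \qquad
        \pi(\theta, x \mid y) \;\propto\; \frac{g(x \mid \theta, y)\, q_\theta(y)\, \pi(\theta)}{Z(\theta)} .
      \]
      Proposing \(\theta' \sim p(\theta' \mid \theta)\) together with an auxiliary draw \(x' \sim q_{\theta'}(\cdot)/Z(\theta')\) gives the Metropolis-Hastings ratio
      \[
        H = \frac{\pi(\theta')\, q_{\theta'}(y)\, g(x' \mid \theta', y)\, p(\theta \mid \theta')\, q_\theta(x)}
                 {\pi(\theta)\, q_\theta(y)\, g(x \mid \theta, y)\, p(\theta' \mid \theta)\, q_{\theta'}(x')},
      \]
      in which \(Z(\theta)\) and \(Z(\theta')\) cancel, so only simulation from the unnormalised density is required.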

  2. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on other aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  3. Normalisation and weighting in life cycle assessment: quo vadis?

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Laurent, Alexis; Sala, Serenella

    2017-01-01

    Purpose: Building on the rhetorical question “quo vadis?” (literally “Where are you going?”), this article critically investigates the state of the art of normalisation and weighting approaches within life cycle assessment. It aims at identifying purposes, current practices, pros and cons, as well...... (LCIA). Methods: The empirical work consisted of (i) an online survey to investigate the perception of the LCA community regarding the scientific quality and current practice concerning normalisation and weighting; (ii) a classification followed by systematic expert-based assessment of existing methods...... for normalisation and weighting according to a set of five criteria: scientific robustness, documentation, coverage, uncertainty and complexity. Results and discussion: The survey results showed that normalised results and weighting scores are perceived as relevant for decision-making, but further development...
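
    For orientation, normalisation and weighting in life cycle impact assessment are usually simple linear operations; the display below is a generic sketch in our own notation, not a formula prescribed by the article.

      \[
        N_i = \frac{I_i}{R_i}, \qquad S = \sum_i w_i\, N_i ,
      \]
      where \(I_i\) is the characterised impact score for category \(i\), \(R_i\) the normalisation reference (e.g. the impact of an average person-year in a reference region), \(w_i\) the weight and \(S\) the resulting single score.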

  4. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

    This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models....... These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice followed by a series of case studies drawn from a variety...... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  5. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  6. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

    solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially......-process models, the last part of the thesis, where the integrated process tank model is tested on three examples of activated sludge systems, is initiated. The three case studies are introduced with an increasing degree of model complexity. All three cases take their basis in Danish municipal wastewater treatment...... plants. The first case study involves the modeling of an activated sludge tank undergoing a special controlling strategy with the intention of minimizing the sludge loading on the subsequent secondary settlers during storm events. The applied model is a two-phase model, where the sedimentation of sludge...

  7. Normalisation of body composition parameters for nutritional assessment

    International Nuclear Information System (INIS)

    Preston, Thomas

    2014-01-01

    Full text: Normalisation of body composition parameters to an index of body size facilitates comparison of a subject’s measurements with those of a population. There is an obvious focus on indexes of obesity, but first it is informative to consider Fat Free Mass (FFM) in the context of common anthropometric measures of body size, namely height and weight. The contention is that FFM is a more physiological measure of body size than body mass. Many studies have shown that FFM relates to height^p. Although there is debate over the appropriate exponent especially in early life, it appears to lie between 2 and 3. If 2, then FFM Index (FFMI; kg/m^2) and Fat Mass Index (FMI; kg/m^2) can be summed to give BMI. If 3 were used as exponent, then FFMI (kg/m^3) plus FMI (kg/m^3) gives the Ponderal Index (PI; weight/height^3). In 2013, Burton argued that a cubic exponent is appropriate for normalisation as it is a dimensionless quotient. In 2012, Wang and co-workers repeated earlier observations showing a strong linear relationship between FFM and height^3. The importance of the latter study comes from the fact that a 4-compartment body composition model was used, which is recognised as the most accurate means of describing FFM. Once the basis of an FFMI has been defined it can be used to compare measurements with those of a population, either directly, as a ratio to a norm or as a Z-score. FFMI charts could be developed for use in child growth. Other related indexes can be determined for use in specific circumstances such as: body cell mass index (growth and wasting); skeletal muscle mass index (SMMI) or appendicular SMMI (growth and sarcopenia); bone mineral mass index (osteoporosis); extracellular fluid index (hydration). Finally, it is logical that the same system is used to define an adiposity index, so Fat Mass Index (FMI; kg/height^3) can be used as it is consistent with FFMI (kg/height^3) and PI. It should also be noted that the index FM/FFM describes an individual
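
    The indices discussed in this record follow directly from the choice of exponent p; a compact summary of the standard definitions (written out by us) is:

      \[
        \mathrm{FFMI} = \frac{\mathrm{FFM}}{h^{p}}, \qquad
        \mathrm{FMI} = \frac{\mathrm{FM}}{h^{p}}, \qquad
        \mathrm{FFMI} + \mathrm{FMI} = \frac{\mathrm{FFM} + \mathrm{FM}}{h^{p}} =
        \begin{cases} \mathrm{BMI}, & p = 2 \\ \mathrm{PI}, & p = 3 \end{cases}
      \]
      with \(h\) height in metres, so that FFMI and FMI partition BMI (for p = 2) or the Ponderal Index (for p = 3) into fat-free and fat components.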

  8. A comparison of parametric and nonparametric methods for normalising cDNA microarray data.

    Science.gov (United States)

    Khondoker, Mizanur R; Glasbey, Chris A; Worton, Bruce J

    2007-12-01

    Normalisation is an essential first step in the analysis of most cDNA microarray data, to correct for effects arising from imperfections in the technology. Loess smoothing is commonly used to correct for trends in log-ratio data. However, parametric models, such as the additive plus multiplicative variance model, have been preferred for scale normalisation, though the variance structure of microarray data may be of a more complex nature than can be accommodated by a parametric model. We propose a new nonparametric approach that incorporates location and scale normalisation simultaneously using a Generalised Additive Model for Location, Scale and Shape (GAMLSS, Rigby and Stasinopoulos, 2005, Applied Statistics, 54, 507-554). We compare its performance in inferring differential expression with Huber et al.'s (2002, Bioinformatics, 18, 96-104) arsinh variance stabilising transformation (AVST) using real and simulated data. We show GAMLSS to be as powerful as AVST when the parametric model is correct, and more powerful when the model is wrong. (c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
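
    For context, the comparator transformation (AVST) is usually written as a generalised logarithm; the form below is the conventional statement of the arsinh variance-stabilising transform, quoted from general knowledge of the method rather than from this paper.

      \[
        h(x) = \operatorname{arsinh}(a + b\,x) = \ln\!\left(a + b\,x + \sqrt{(a + b\,x)^{2} + 1}\right),
      \]
      which behaves like a logarithm for large intensities while remaining defined near zero, stabilising the variance under an additive-plus-multiplicative error model.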

  9. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  10. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    Purpose – The paper aims: 1) To develop systematically a structural list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list. 2) To facilitate companies in the process of BM innovation ... by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction ... Practical implications – This paper aimed at strengthening researchers' and, particularly, practitioners' perspectives on the field of business model process configurations. By ensuring an [abstracted] alignment between ... strategic preference, as part of their business model innovation activity planned.

  11. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consists of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  12. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consists of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor

  13. Retrieval of the raindrop size distribution from polarimetric radar data using double-moment normalisation

    Directory of Open Access Journals (Sweden)

    T. H. Raupach

    2017-07-01

    Full Text Available A new technique for estimating the raindrop size distribution (DSD) from polarimetric radar data is proposed. Two statistical moments of the DSD are estimated from polarimetric variables, and the DSD is reconstructed using a double-moment normalisation. The technique takes advantage of the relative invariance of the double-moment normalised DSD. The method was tested using X-band radar data and networks of disdrometers in three different climatic regions. Radar-derived estimates of the DSD compare reasonably well to observations. In the three tested domains, in terms of DSD moments, rain rate, and characteristic drop diameter, the proposed method performs similarly to and often better than a state-of-the-art DSD-retrieval technique. The approach is flexible because no specific DSD model is prescribed. In addition, a method is proposed to treat noisy radar data to improve DSD-retrieval performance with radar measurements.
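
    The double-moment normalisation referred to here can be summarised as follows; this is a standard formulation in terms of two DSD moments M_i and M_j, written in our notation, and the specific moment orders used by the authors are not stated in this record.

      \[
        N(D) = N_0'\, h\!\left(\frac{D}{D_m'}\right), \qquad
        D_m' = \left(\frac{M_j}{M_i}\right)^{1/(j-i)}, \qquad
        N_0' = M_i^{\,(j+1)/(j-i)}\, M_j^{\,-(i+1)/(j-i)} ,
      \]
      so that once the two moments are estimated from the polarimetric variables, the DSD is reconstructed by rescaling the approximately invariant normalised shape \(h\).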

  14. Retrieval of the raindrop size distribution from polarimetric radar data using double-moment normalisation

    Science.gov (United States)

    Raupach, Timothy H.; Berne, Alexis

    2017-07-01

    A new technique for estimating the raindrop size distribution (DSD) from polarimetric radar data is proposed. Two statistical moments of the DSD are estimated from polarimetric variables, and the DSD is reconstructed using a double-moment normalisation. The technique takes advantage of the relative invariance of the double-moment normalised DSD. The method was tested using X-band radar data and networks of disdrometers in three different climatic regions. Radar-derived estimates of the DSD compare reasonably well to observations. In the three tested domains, in terms of DSD moments, rain rate, and characteristic drop diameter, the proposed method performs similarly to and often better than a state-of-the-art DSD-retrieval technique. The approach is flexible because no specific DSD model is prescribed. In addition, a method is proposed to treat noisy radar data to improve DSD-retrieval performance with radar measurements.

  15. Foam process models.

    Energy Technology Data Exchange (ETDEWEB)

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann

    2008-09-01

    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  16. Use and misuse of temperature normalisation in meta-analyses of thermal responses of biological traits

    Directory of Open Access Journals (Sweden)

    Dimitrios - Georgios Kontopoulos

    2018-02-01

    Full Text Available There is currently unprecedented interest in quantifying variation in thermal physiology among organisms, especially in order to understand and predict the biological impacts of climate change. A key parameter in this quantification of thermal physiology is the performance or value of a rate, across individuals or species, at a common temperature (temperature normalisation). An increasingly popular model for fitting thermal performance curves to data—the Sharpe-Schoolfield equation—can yield strongly inflated estimates of temperature-normalised rate values. These deviations occur whenever a key thermodynamic assumption of the model is violated, i.e., when the enzyme governing the performance of the rate is not fully functional at the chosen reference temperature. Using data on 1,758 thermal performance curves across a wide range of species, we identify the conditions that exacerbate this inflation. We then demonstrate that these biases can compromise tests to detect metabolic cold adaptation, which requires comparison of fitness or rate performance of different species or genotypes at some fixed low temperature. Finally, we suggest alternative methods for obtaining unbiased estimates of temperature-normalised rate values for meta-analyses of thermal performance across species in climate change impact studies.
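
    The model at issue is, in its simplified form with high-temperature deactivation only, commonly written as below; this is a standard statement of the Sharpe-Schoolfield equation included for orientation (symbols ours), and it makes the source of the bias visible: B_0, the rate normalised to the reference temperature T_ref, is estimated as if the enzyme suffered no inactivation at T_ref.

      \[
        B(T) = \frac{B_0 \exp\!\left[-\frac{E}{k}\left(\frac{1}{T} - \frac{1}{T_{\mathrm{ref}}}\right)\right]}
                    {1 + \exp\!\left[\frac{E_h}{k}\left(\frac{1}{T_h} - \frac{1}{T}\right)\right]},
      \]
      where \(E\) is the activation energy, \(E_h\) and \(T_h\) control high-temperature deactivation and \(k\) is Boltzmann's constant. When \(T_{\mathrm{ref}}\) lies close to or above the peak of the curve, the fitted \(B_0\) can greatly exceed the rate actually attainable at \(T_{\mathrm{ref}}\).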

  17. Statistical mapping of maize bundle intensity at the stem scale using spatial normalisation of replicated images.

    Directory of Open Access Journals (Sweden)

    David Legland

    Full Text Available The cellular structure of plant tissues is a key parameter for determining their properties. While the morphology of cells can easily be described, few studies focus on the spatial distribution of different types of tissues within an organ. As plants have various shapes and sizes, the integration of several individuals for statistical analysis of tissues distribution is a difficult problem. The aim of this study is to propose a method that quantifies the average spatial organisation of vascular bundles within maize stems, by integrating information from replicated images. In order to compare observations made on stems of different sizes and shapes, a spatial normalisation strategy was used. A model of average stem contour was computed from the digitisation of several stem slab images. Point patterns obtained from individual stem slices were projected onto the average stem to normalise them. Group-wise analysis of the spatial distribution of vascular bundles was applied on normalised data through the construction of average intensity maps. A quantitative description of average bundle organisation was obtained, via a 3D model of bundle distribution within a typical maize internode. The proposed method is generic and could easily be extended to other plant organs or organisms.

  18. Use and misuse of temperature normalisation in meta-analyses of thermal responses of biological traits

    Science.gov (United States)

    García-Carreras, Bernardo; Sal, Sofía; Smith, Thomas P.; Pawar, Samraat

    2018-01-01

    There is currently unprecedented interest in quantifying variation in thermal physiology among organisms, especially in order to understand and predict the biological impacts of climate change. A key parameter in this quantification of thermal physiology is the performance or value of a rate, across individuals or species, at a common temperature (temperature normalisation). An increasingly popular model for fitting thermal performance curves to data—the Sharpe-Schoolfield equation—can yield strongly inflated estimates of temperature-normalised rate values. These deviations occur whenever a key thermodynamic assumption of the model is violated, i.e., when the enzyme governing the performance of the rate is not fully functional at the chosen reference temperature. Using data on 1,758 thermal performance curves across a wide range of species, we identify the conditions that exacerbate this inflation. We then demonstrate that these biases can compromise tests to detect metabolic cold adaptation, which requires comparison of fitness or rate performance of different species or genotypes at some fixed low temperature. Finally, we suggest alternative methods for obtaining unbiased estimates of temperature-normalised rate values for meta-analyses of thermal performance across species in climate change impact studies. PMID:29441242

  19. Evaluating vowel normalisation procedures: A case study on ...

    African Journals Online (AJOL)

    In the present article, eight such procedures are tested on a large data-set of Southern Sotho vowels (4 434 tokens), as produced by twelve speakers (six of each gender and balanced as to age and locality). We concentrated on the examination of two types of normalisation procedures: firstly, the vowel-extrinsic class of ...

  20. Pouvoir pastoral, normalisation et soins infirmiers : une analyse foucaldienne

    Directory of Open Access Journals (Sweden)

    PATRICK MARTIN

    2010-04-01

    Full Text Available Repressive technologies are in place with the aim of constraining, subjugating, normalising and directing the practitioners who work in the health care setting. Among these technologies, confession is one that is widely used. The French philosopher Michel Foucault identified confessional practice as enabling the exercise of a particular form of power that he termed 'pastoral power'. The aim of this article is to explore how confession, for purposes of normalisation (pastoral power), is used in the practice of the nursing profession. To do so, particular attention is paid to the Foucauldian concept of pastoral power, to its origins, and to its application in nursing.

  1. On the likelihood of normalisation in combinatory logic

    OpenAIRE

    Bendkowski, Maciej; Grygiel, Katarzyna; Zaionc, Marek

    2016-01-01

    We present a quantitative basis-independent analysis of combinatory logic. Using a general argument regarding plane binary trees with labelled leaves, we generalise the results of David et al. and Bendkowski et al. to all Turing-complete combinator bases proving, inter alia, that asymptotically almost no combinator is strongly normalising nor typeable. We exploit the structure of recently discovered normal-order reduction grammars showing that for each positive $n$, the set of $\\mathbf{S} \\ma...

  2. GREENSCOPE: Sustainable Process Modeling

    Science.gov (United States)

    EPA researchers are responding to environmental problems by incorporating sustainability into process design and evaluation. EPA researchers are also developing a tool that allows users to assess modifications to existing and new chemical processes to determine whether changes in...

  3. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs

  4. Auditory processing models

    DEFF Research Database (Denmark)

    Dau, Torsten

    2008-01-01

    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook...... will present topics on signal processing which are important in a specific area of acoustics. These will be of interest to specialists in these areas because they will be presented from their technical perspective, rather than a generic engineering approach to signal processing. Non-specialists, or specialists...

  5. INNOVATION PROCESS MODELLING

    Directory of Open Access Journals (Sweden)

    JANUSZ K. GRABARA

    2011-01-01

    Full Text Available Modelling phenomena in accordance with the structural approach enables one to simplify the observed relations and to present the classification grounds. An example may be a model of organisational structure identifying the logical relations between particular units and presenting the division of authority and work.

  6. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

    The subject of this thesis is to develop a methodological framework that can systematically guide mathematical model building for better understanding of multi-enzyme processes. In this way, opportunities for process improvements can be identified by analyzing simulations of either existing...... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases efficiency of the model development process with respect to time and resources needed (fast and effective...... in the scientific literature. Reliable mathematical models of such multi-catalytic schemes can exploit the potential benefit of these processes. In this way, the best outcome of the process can be obtained understanding the types of modification that are required for process optimization. An effective evaluation...

  7. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.

  8. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on a proper understanding of the functionality of information systems that support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the research methodology is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  9. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  10. Chemical Process Modeling and Control.

    Science.gov (United States)

    Bartusiak, R. Donald; Price, Randel M.

    1987-01-01

    Describes some of the features of Lehigh University's (Pennsylvania) process modeling and control program. Highlights the creation and operation of the Chemical Process Modeling and Control Center (PMC). Outlines the program's philosophy, faculty, technical program, current research projects, and facilities. (TW)

  11. Chapter 1: Standard Model processes

    OpenAIRE

    Becher, Thomas

    2017-01-01

    This chapter documents the production rates and typical distributions for a number of benchmark Standard Model processes, and discusses new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  12. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter, the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in health care processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  13. Modeling nuclear processes by Simulink

    Science.gov (United States)

    Rashid, Nahrul Khair Alang Md

    2015-04-01

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay processes, delayed neutron effects, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  14. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

    Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important to assess the expected results of an experiment before the actual experiment is conducted or in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, radionuclide decay processes, delayed neutron effects, reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
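
    The describing equations mentioned in these records are ordinary differential equations, so the same behaviour can be reproduced with any ODE integrator outside Simulink. As a hedged illustration, the sketch below solves one-group point kinetics with delayed neutrons in Python/SciPy; the parameter values are generic textbook numbers, not values taken from the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      # One-group point kinetics: dn/dt = ((rho - beta)/Lam)*n + lam*C
      #                           dC/dt = (beta/Lam)*n - lam*C
      beta, Lam, lam = 0.0065, 1e-4, 0.08   # delayed fraction, generation time (s), decay constant (1/s)
      rho = 0.001                           # small step reactivity insertion (assumed)

      def point_kinetics(t, y):
          n, C = y
          dn = (rho - beta) / Lam * n + lam * C
          dC = beta / Lam * n - lam * C
          return [dn, dC]

      # Start from equilibrium at n = 1 (so C = beta*n/(lam*Lam)) and integrate 10 s.
      y0 = [1.0, beta / (lam * Lam)]
      sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, max_step=0.01)
      print(f"relative power after 10 s: {sol.y[0, -1]:.2f}")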

  15. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables......Many production processes are carried out in stages. At the end of each stage, the production engineer can analyze the intermediate results and correct process parameters (variables) of the next stage. Both analysis of the process and correction to process parameters at next stage should...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling is presented. The mathematics is presented for processes having only one stage, having two stages...

  16. Selection of reference genes for normalisation of real-time RT-PCR in brain-stem death injury in Ovis aries

    Directory of Open Access Journals (Sweden)

    Fraser John F

    2009-07-01

    Full Text Available Abstract Background Heart and lung transplantation is frequently the only therapeutic option for patients with end stage cardio respiratory disease. Organ donation post brain stem death (BSD) is a pre-requisite, yet BSD itself causes such severe damage that many organs offered for donation are unusable, with lung being the organ most affected by BSD. In Australia and New Zealand, less than 50% of lungs offered for donation post BSD are suitable for transplantation, as compared with over 90% of kidneys, resulting in patients dying for lack of suitable lungs. Our group has developed a novel 24 h sheep BSD model to mimic the physiological milieu of the typical human organ donor. Characterisation of the gene expression changes associated with BSD is critical and will assist in determining the aetiology of lung damage post BSD. Real-time PCR is a highly sensitive method involving multiple steps from extraction to processing RNA so the choice of housekeeping genes is important in obtaining reliable results. Little information, however, is available on the expression stability of reference genes in the sheep pulmonary artery and lung. We aimed to establish a set of stably expressed reference genes for use as a standard for analysis of gene expression changes in BSD. Results We evaluated the expression stability of 6 candidate normalisation genes (ACTB, GAPDH, HGPRT, PGK1, PPIA and RPLP0) using real time quantitative PCR. There was a wide range of Ct-values within each tissue for pulmonary artery (15–24) and lung (16–25), but the expression pattern for each gene was similar across the two tissues. After geNorm analysis, ACTB and PPIA were shown to be the most stably expressed in the pulmonary artery and ACTB and PGK1 in the lung tissue of BSD sheep. Conclusion Accurate normalisation is critical in obtaining reliable and reproducible results in gene expression studies. This study demonstrates tissue associated variability in the selection of these
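
    geNorm, mentioned above, ranks candidate reference genes by a stability measure M, the average pairwise variation of each gene against all other candidates. The sketch below is a simplified reimplementation of that calculation written for illustration (expression values assumed to be linear-scale relative quantities); it is not the software used in the study.

      import numpy as np

      def genorm_m_values(expr: np.ndarray, gene_names: list[str]) -> dict[str, float]:
          """geNorm-style stability M for each candidate reference gene.

          expr: samples x genes array of relative expression quantities (linear scale).
          Lower M means more stable expression across samples.
          """
          log_expr = np.log2(expr)
          n_genes = expr.shape[1]
          m = {}
          for j, name in enumerate(gene_names):
              # SD across samples of the log-ratio against every other candidate
              sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                     for k in range(n_genes) if k != j]
              m[name] = float(np.mean(sds))
          return m

      # Toy example with three candidates and four samples (values invented):
      expr = np.array([[1.0, 2.0, 1.1],
                       [1.2, 2.3, 2.0],
                       [0.9, 1.9, 0.7],
                       [1.1, 2.1, 1.6]])
      print(genorm_m_values(expr, ["ACTB", "PPIA", "GAPDH"]))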

  17. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
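
    For readers unfamiliar with the underlying decision model: a Markov decision process is defined by states, actions, transition probabilities and rewards, and its optimal values satisfy the Bellman equation. The snippet below is a minimal, generic value-iteration sketch in Python, included only to make the modelling object concrete; it is not the measurement model estimated in the paper, and the toy transition and reward numbers are invented.

      import numpy as np

      def value_iteration(P, R, gamma=0.95, tol=1e-8):
          """Solve a small MDP by value iteration.

          P: transition probabilities, shape (actions, states, states).
          R: immediate rewards, shape (actions, states).
          Returns optimal state values and a greedy policy.
          """
          V = np.zeros(P.shape[1])
          while True:
              Q = R + gamma * P @ V          # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] * V[s']
              V_new = Q.max(axis=0)
              if np.max(np.abs(V_new - V)) < tol:
                  return V_new, Q.argmax(axis=0)
              V = V_new

      P = np.array([[[0.9, 0.1], [0.2, 0.8]],    # action 0
                    [[0.5, 0.5], [0.0, 1.0]]])   # action 1
      R = np.array([[1.0, 0.0],                  # rewards for action 0
                    [2.0, -1.0]])                # rewards for action 1
      print(value_iteration(P, R))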

  18. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Vol. 22, No. 2 (2011), pp. 58-62. ISSN 0929-2268. Institutional research plan: CEZ:AV0Z10300504. Keywords: process models * PID control * second order dynamics. Subject RIV: JB - Sensors, Measurement, Regulation

  19. Inference of financial networks using the normalised mutual information rate.

    Science.gov (United States)

    Goh, Yong Kheng; Hasim, Haslifah M; Antonopoulos, Chris G

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics.
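
    As a rough illustration of the inference step (building a network from pairwise information-theoretic dependencies), the sketch below computes a plain normalised mutual information between discretised series and thresholds it into an adjacency matrix. This is a deliberate simplification: the paper uses the normalised mutual information rate, which accounts for temporal structure, whereas the toy code treats observations as independent symbols, and the bin count and threshold are arbitrary choices for the example.

      import numpy as np

      def nmi(x, y, bins=8):
          """Normalised mutual information between two series after equal-width binning."""
          counts, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = counts / counts.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
          hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
          hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
          return mi / np.sqrt(hx * hy)

      def infer_network(series, threshold=0.2):
          """Adjacency matrix linking series whose pairwise NMI exceeds a threshold."""
          n = series.shape[1]
          adj = np.zeros((n, n), dtype=int)
          for i in range(n):
              for j in range(i + 1, n):
                  if nmi(series[:, i], series[:, j]) > threshold:
                      adj[i, j] = adj[j, i] = 1
          return adj

      rng = np.random.default_rng(0)
      toy_returns = rng.standard_normal((500, 4))
      toy_returns[:, 1] = 0.7 * toy_returns[:, 0] + 0.3 * toy_returns[:, 1]   # induce one dependency
      print(infer_network(toy_returns))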

  20. Inference of financial networks using the normalised mutual information rate.

    Directory of Open Access Journals (Sweden)

    Yong Kheng Goh

    Full Text Available In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics.

  1. Inference of financial networks using the normalised mutual information rate

    Science.gov (United States)

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics. PMID:29420644

  2. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw-material supply for processing enterprises that form part of a vertically integrated structure for the production and processing of dairy raw materials is developed. Its distinguishing feature is that the cumulative effect achieved by the integrated structure serves as the criterion function, which is maximised by optimising capacities, the volumes and quality characteristics of raw-material deliveries, the costs of industrial processing of raw materials, and the demand for dairy products.

  3. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  4. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative...

  5. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  6. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

    First, the paper provides an integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, and from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined-process perspectives. Second, the paper translates this information into a business process model and a mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  7. The Brookhaven Process Optimization Models

    Energy Technology Data Exchange (ETDEWEB)

    Pilati, D. A.; Sparrow, F. T.

    1979-01-01

    The Brookhaven National Laboratory Industry Model Program (IMP) has undertaken the development of a set of industry-specific process-optimization models. These models are to be used for energy-use projections, energy-policy analyses, and process technology assessments. Applications of the models currently under development show that system-wide energy impacts may be very different from engineering estimates, selected investment tax credits for cogeneration (or other conservation strategies) may have the perverse effect of increasing industrial energy use, and that a proper combination of energy taxes and investment tax credits is more socially desirable than either policy alone. A section is included describing possible extensions of these models to answer questions or address other systems (e.g., a single plant instead of an entire industry).

  8. A normalised seawater strontium isotope curve. Possible implications for Neoproterozoic-Cambrian weathering rates and the further oxygenation of the Earth

    International Nuclear Information System (INIS)

    Shields, G.A.

    2007-01-01

    The strontium isotope composition of seawater is strongly influenced on geological time scales by changes in the rates of continental weathering relative to ocean crust alteration. However, the potential of the seawater 87Sr/86Sr curve to trace globally integrated chemical weathering rates has not been fully realised because ocean 87Sr/86Sr is also influenced by the isotopic evolution of Sr sources to the ocean. A preliminary attempt is made here to normalise the seawater 87Sr/86Sr curve to plausible trends in the 87Sr/86Sr ratios of the three major Sr sources: carbonate dissolution, silicate weathering and submarine hydrothermal exchange. The normalised curve highlights the Neoproterozoic-Phanerozoic transition as a period of exceptionally high continental influence, indicating that this interval was characterised by a transient increase in global weathering rates and/or by the weathering of unusually radiogenic crustal rocks. Close correlation between the normalised 87Sr/86Sr curve, a published seawater δ34S curve and atmospheric pCO2 models is used here to argue that elevated chemical weathering rates were a major contributing factor to the steep rise in seawater 87Sr/86Sr from 650 Ma to 500 Ma. Elevated weathering rates during the Neoproterozoic-Cambrian interval led to increased nutrient availability, organic burial and to the further oxygenation of Earth's surface environment. Use of normalised seawater 87Sr/86Sr curves will, it is hoped, help to improve future geochemical models of Earth System dynamics. (orig.)
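
    The normalisation described here is, in essence, a flux-weighted mass balance over the three source terms; a generic steady-state form, in our notation rather than the author's parameterisation, is:

      \[
        \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\mathrm{sw}} \approx
        \frac{F_{\mathrm{carb}} R_{\mathrm{carb}} + F_{\mathrm{sil}} R_{\mathrm{sil}} + F_{\mathrm{hyd}} R_{\mathrm{hyd}}}
             {F_{\mathrm{carb}} + F_{\mathrm{sil}} + F_{\mathrm{hyd}}},
      \]
      where the \(F\) terms are Sr fluxes from carbonate dissolution, silicate weathering and hydrothermal exchange and the \(R\) terms their isotope ratios; normalising the seawater curve against plausible histories of the \(R\) terms isolates changes in the relative fluxes, i.e. in weathering intensity.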

  9. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent. The theoret...

  10. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  11. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and the Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and focus on the specific components.

  12. Ranking: a closer look on globalisation methods for normalisation of gene expression arrays

    Science.gov (United States)

    Kroll, Torsten C.; Wölfl, Stefan

    2002-01-01

    Data from gene expression arrays are influenced by many experimental parameters that lead to variations not simply accessible by standard quantification methods. To compare measurements from gene expression array experiments, quantitative data are commonly normalised using reference genes or global normalisation methods based on mean or median values. These methods are based on the assumption that (i) selected reference genes are expressed at a standard level in all experiments or (ii) that mean or median signal of expression will give a quantitative reference for each individual experiment. We introduce here a new ranking diagram, with which we can show how the different normalisation methods compare, and how they are influenced by variations in measurements (noise) that occur in every experiment. Furthermore, we show that an upper trimmed mean provides a simple and robust method for normalisation of larger sets of experiments by comparative analysis. PMID:12034851
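
    A minimal Python sketch of the upper-trimmed-mean idea described above: each array is rescaled so that its trimmed mean matches a common reference level. The trimming fraction and the averaging of the per-array means into a target are illustrative assumptions, not parameters taken from the paper.

        import numpy as np

        def upper_trimmed_mean(values, trim_fraction=0.2):
            """Mean of the intensities left after discarding the top trim_fraction."""
            cutoff = np.quantile(values, 1.0 - trim_fraction)
            return values[values <= cutoff].mean()

        def normalise_arrays(arrays, trim_fraction=0.2):
            """Rescale each expression array so its upper-trimmed mean is identical.

            arrays: list of 1-D numpy arrays of signal intensities, one per experiment.
            """
            means = np.array([upper_trimmed_mean(a, trim_fraction) for a in arrays])
            target = means.mean()                      # common reference level
            return [a * (target / m) for a, m in zip(arrays, means)]

    Rescaling by a trimmed rather than a plain mean keeps a handful of very highly expressed genes from dominating the scaling factor, which is the robustness property argued for above.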

  13. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available This article considers a number of methods for the mathematical modelling of economic processes, together with the possibilities of using Excel spreadsheets to obtain optimal solutions to problems or to perform calculations for financial operations with the help of the built-in functions.

  15. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin... INR measurements were collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT...

  16. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  17. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  18. A novel approach to detect respiratory phases from pulmonary acoustic signals using normalised power spectral density and fuzzy inference system.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian; Huliraj, N; Revadi, S S

    2016-07-01

    Monitoring respiration is important in several medical applications. One such application is respiratory rate monitoring in patients with sleep apnoea. The respiratory rate in patients with sleep apnoea disorder is irregular compared with the controls. Respiratory phase detection is required for a proper monitoring of respiration in patients with sleep apnoea. To develop a model to detect the respiratory phases present in the pulmonary acoustic signals and to evaluate the performance of the model in detecting the respiratory phases. Normalised averaged power spectral density for each frame and change in normalised averaged power spectral density between the adjacent frames were fuzzified and fuzzy rules were formulated. The fuzzy inference system (FIS) was developed with both Mamdani and Sugeno methods. To evaluate the performance of both Mamdani and Sugeno methods, correlation coefficient and root mean square error (RMSE) were calculated. In the correlation coefficient analysis in evaluating the fuzzy model using Mamdani and Sugeno method, the strength of the correlation was found to be r = 0.9892 and r = 0.9964, respectively. The RMSE for Mamdani and Sugeno methods are RMSE = 0.0853 and RMSE = 0.0817, respectively. The correlation coefficient and the RMSE of the proposed fuzzy models in detecting the respiratory phases reveals that Sugeno method performs better compared with the Mamdani method. © 2014 John Wiley & Sons Ltd.
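
    As a rough illustration of the two input features described above, the per-frame normalised averaged power spectral density and its change between adjacent frames could be computed as in the following Python sketch before being fed to a Mamdani or Sugeno fuzzy inference system. The frame length, the use of Welch's method and the normalisation by the recording maximum are assumptions made here, not details reported in the abstract.

        import numpy as np
        from scipy.signal import welch

        def psd_features(signal, fs, frame_seconds=0.1):
            """Normalised averaged PSD per frame and its change between adjacent frames."""
            frame_len = int(frame_seconds * fs)
            n_frames = len(signal) // frame_len
            avg_psd = np.empty(n_frames)
            for i in range(n_frames):
                frame = signal[i * frame_len:(i + 1) * frame_len]
                _, pxx = welch(frame, fs=fs, nperseg=min(256, len(frame)))
                avg_psd[i] = pxx.mean()                      # averaged PSD of this frame
            norm_psd = avg_psd / avg_psd.max()               # normalise across the recording
            delta = np.diff(norm_psd, prepend=norm_psd[0])   # change between adjacent frames
            return norm_psd, delta                           # inputs to the FIS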

  19. What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model.

    Science.gov (United States)

    Gask, Linda; Bower, Peter; Lovell, Karina; Escott, Diane; Archer, Janine; Gilbody, Simon; Lankshear, Annette J; Simpson, Angela E; Richards, David A

    2010-02-10

    There is a considerable evidence base for 'collaborative care' as a method to improve quality of care for depression, but an acknowledged gap between efficacy and implementation. This study utilises the Normalisation Process Model (NPM) to inform the process of implementation of collaborative care in both a future full-scale trial, and the wider health economy. Application of the NPM to qualitative data collected in both focus groups and one-to-one interviews before and after an exploratory randomised controlled trial of a collaborative model of care for depression. Findings are presented as they relate to the four factors of the NPM (interactional workability, relational integration, skill-set workability, and contextual integration) and a number of necessary tasks are identified. Using the model, it was possible to observe that predictions about necessary work to implement collaborative care that could be made from analysis of the pre-trial data relating to the four different factors of the NPM were indeed borne out in the post-trial data. However, additional insights were gained from the post-trial interview participants who, unlike those interviewed before the trial, had direct experience of a novel intervention. The professional freedom enjoyed by more senior mental health workers may work both for and against normalisation of collaborative care as those who wish to adopt new ways of working have the freedom to change their practice but are not obliged to do so. The NPM provides a useful structure for both guiding and analysing the process by which an intervention is optimized for testing in a larger scale trial or for subsequent full-scale implementation.

  20. Animal models and conserved processes

    Directory of Open Access Journals (Sweden)

    Greek Ray

    2012-09-01

    Full Text Available Abstract Background The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? Methods We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Results Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. Conclusion We conclude that even the presence of conserved processes is

  1. Animal models and conserved processes.

    Science.gov (United States)

    Greek, Ray; Rice, Mark J

    2012-09-10

    The concept of conserved processes presents unique opportunities for using nonhuman animal models in biomedical research. However, the concept must be examined in the context that humans and nonhuman animals are evolved, complex, adaptive systems. Given that nonhuman animals are examples of living systems that are differently complex from humans, what does the existence of a conserved gene or process imply for inter-species extrapolation? We surveyed the literature including philosophy of science, biological complexity, conserved processes, evolutionary biology, comparative medicine, anti-neoplastic agents, inhalational anesthetics, and drug development journals in order to determine the value of nonhuman animal models when studying conserved processes. Evolution through natural selection has employed components and processes both to produce the same outcomes among species but also to generate different functions and traits. Many genes and processes are conserved, but new combinations of these processes or different regulation of the genes involved in these processes have resulted in unique organisms. Further, there is a hierarchy of organization in complex living systems. At some levels, the components are simple systems that can be analyzed by mathematics or the physical sciences, while at other levels the system cannot be fully analyzed by reducing it to a physical system. The study of complex living systems must alternate between focusing on the parts and examining the intact whole organism while taking into account the connections between the two. Systems biology aims for this holism. We examined the actions of inhalational anesthetic agents and anti-neoplastic agents in order to address what the characteristics of complex living systems imply for inter-species extrapolation of traits and responses related to conserved processes. We conclude that even the presence of conserved processes is insufficient for inter-species extrapolation when the trait or response

  2. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  3. Preoperative mapping of cortical language areas in adult brain tumour patients using PET and individual non-normalised SPM analyses

    International Nuclear Information System (INIS)

    Meyer, Philipp T.; Sturz, Laszlo; Schreckenberger, Mathias; Setani, Keyvan S.; Buell, Udalrich; Spetzger, Uwe; Meyer, Georg F.; Sabri, Osama

    2003-01-01

    In patients scheduled for the resection of perisylvian brain tumours, knowledge of the cortical topography of language functions is crucial in order to avoid neurological deficits. We investigated the applicability of statistical parametric mapping (SPM) without stereotactic normalisation for individual preoperative language function brain mapping using positron emission tomography (PET). Seven right-handed adult patients with left-sided brain tumours (six frontal and one temporal) underwent 12 oxygen-15 labelled water PET scans during overt verb generation and rest. Individual activation maps were calculated for P<0.005 and P<0.001 without anatomical normalisation and overlaid onto the individuals' magnetic resonance images for preoperative planning. Activations corresponding to Broca's and Wernicke's areas were found in five and six cases, respectively, for P<0.005 and in three and six cases, respectively, for P<0.001. One patient with a glioma located in the classical Broca's area without aphasic symptoms presented an activation of the adjacent inferior frontal cortex and of a right-sided area homologous to Broca's area. Four additional patients with left frontal tumours also presented activations of the right-sided Broca's homologue; two of these showed aphasic symptoms and two only a weak or no activation of Broca's area. Other frequently observed activations included bilaterally the superior temporal gyri, prefrontal cortices, anterior insulae, motor areas and the cerebellum. The middle and inferior temporal gyri were activated predominantly on the left. An SPM group analysis (P<0.05, corrected) in patients with left frontal tumours confirmed the activation pattern shown by the individual analyses. We conclude that SPM analyses without stereotactic normalisation offer a promising alternative for analysing individual preoperative language function brain mapping studies. The observed right frontal activations agree with proposed reorganisation processes, but

  4. Models of memory: information processing.

    Science.gov (United States)

    Eysenck, M W

    1988-01-01

    A complete understanding of human memory will necessarily involve consideration of the active processes involved at the time of learning and of the organization and nature of representation of information in long-term memory. In addition to process and structure, it is important for theory to indicate the ways in which stimulus-driven and conceptually driven processes interact with each other in the learning situation. Not surprisingly, no existent theory provides a detailed specification of all of these factors. However, there are a number of more specific theories which are successful in illuminating some of the component structures and processes. The working memory model proposed by Baddeley and Hitch (1974) and modified subsequently has shown how the earlier theoretical construct of the short-term store should be replaced with the notion of working memory. In essence, working memory is a system which is used both to process information and to permit the transient storage of information. It comprises a number of conceptually distinct, but functionally interdependent components. So far as long-term memory is concerned, there is evidence of a number of different kinds of representation. Of particular importance is the distinction between declarative knowledge and procedural knowledge, a distinction which has received support from the study of amnesic patients. Kosslyn has argued for a distinction between literal representation and propositional representation, whereas Tulving has distinguished between episodic and semantic memories. While Tulving's distinction is perhaps the best known, there is increasing evidence that episodic and semantic memory differ primarily in content rather than in process, and so the distinction may be of less theoretical value than was originally believed.(ABSTRACT TRUNCATED AT 250 WORDS)

  5. Mathematical modeling of biological processes

    CERN Document Server

    Friedman, Avner

    2014-01-01

    This book on mathematical modeling of biological processes includes a wide selection of biological topics that demonstrate the power of mathematics and computational codes in setting up biological processes with a rigorous and predictive framework. Topics include: enzyme dynamics, spread of disease, harvesting bacteria, competition among live species, neuronal oscillations, transport of neurofilaments in axon, cancer and cancer therapy, and granulomas. Complete with a description of the biological background and biological question that requires the use of mathematics, this book is developed for graduate students and advanced undergraduate students with only basic knowledge of ordinary differential equations and partial differential equations; background in biology is not required. Students will gain knowledge on how to program with MATLAB without previous programming experience and how to use codes in order to test biological hypothesis.

  6. Modeling pellet impact drilling process

    Science.gov (United States)

    Kovalyov, A. V.; Ryabchikov, S. Ya; Isaev, Ye D.; Ulyanova, O. S.

    2016-03-01

    The paper describes pellet impact drilling, which could be used to increase the drilling speed and the rate of penetration when drilling hard rocks. Pellet impact drilling implies rock destruction by metal pellets with high kinetic energy in the immediate vicinity of the earth formation encountered. The pellets are circulated in the bottom hole by a high velocity fluid jet, which is the principal component of the ejector pellet impact drill bit. The experiments conducted have allowed modeling the process of pellet impact drilling, which creates the scientific and methodological basis for engineering design of drilling operations under different geo-technical conditions.

  7. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

    Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  8. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process

  9. Collapse models and perceptual processes

    International Nuclear Information System (INIS)

    Ghirardi, Gian Carlo; Romano, Raffaele

    2014-01-01

    Theories including a collapse mechanism were presented several years ago. They are based on a modification of standard quantum mechanics in which nonlinear and stochastic terms are added to the evolution equation. Their principal merits derive from the fact that they are mathematically precise schemes accounting, on the basis of a unique universal dynamical principle, both for the quantum behavior of microscopic systems as well as for the reduction associated to measurement processes and for the classical behavior of macroscopic objects. Since such theories qualify themselves not as new interpretations but as modifications of the standard theory they can be, in principle, tested against quantum mechanics. Recently, various investigations identifying possible crucial tests have been discussed. In spite of the extreme difficulty of performing such tests, it seems that recent technological developments allow at least precise limits to be put on the parameters characterizing the modifications of the evolution equation. Here we will simply mention some of the recent investigations in this direction, while we will mainly concentrate our attention on the way in which collapse theories account for definite perceptual processes. The differences between the case of reductions induced by perceptions and those related to measurement procedures by means of standard macroscopic devices will be discussed. On this basis, we suggest a precise experimental test of collapse theories involving conscious observers. We make plausible, by discussing in detail a toy model, that the modified dynamics can give rise to quite small but systematic errors in the visual perceptual process.

  10. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remain limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis on Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  11. An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants

    DEFF Research Database (Denmark)

    Møller, Jesper; Pettitt, A. N.; Reeves, R.

    2006-01-01

    Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is presented which requires only that independent samples can be drawn from the unnormalised density at any particular parameter value. The proposal distribution is constructed so that the normalising constant cancels from the Metropolis–Hastings ratio. The method is illustrated by producing posterior samples...
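
    The algebraic point of the construction is that the intractable constant cancels from the acceptance ratio. In the notation sketched below (introduced here, not quoted from the paper), the likelihood is q_theta(y)/Z(theta) with Z(theta) intractable, p(theta) is the prior, f(x|theta,y) is the auxiliary-variable distribution, and the proposal first draws theta' from q(theta'|theta) and then draws x' exactly from q_{theta'}(.)/Z(theta').

        % Metropolis-Hastings acceptance ratio for the joint move (theta, x) -> (theta', x'):
        \[
          H = \frac{p(\theta')\, q_{\theta'}(y)\, f(x' \mid \theta', y)}
                   {p(\theta)\, q_{\theta}(y)\, f(x \mid \theta, y)}
              \times
              \frac{q(\theta \mid \theta')\, q_{\theta}(x)}
                   {q(\theta' \mid \theta)\, q_{\theta'}(x')}
        \]
        % Z(theta) and Z(theta') each appear once in the numerator (target at theta',
        % reverse proposal of x) and once in the denominator (target at theta, forward
        % proposal of x'), so they cancel and H never requires a normalising constant.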

  12. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes...

  13. Cupola Furnace Computer Process Model

    Energy Technology Data Exchange (ETDEWEB)

    Seymour Katz

    2004-12-31

    The cupola furnace generates more than 50% of the liquid iron used to produce the 9+ million tons of castings annually. The cupola converts iron and steel into cast iron. The main advantages of the cupola furnace are lower energy costs than those of competing furnaces (electric) and the ability to melt less expensive metallic scrap than the competing furnaces. However, the chemical and physical processes that take place in the cupola furnace are highly complex, making it difficult to operate the furnace in optimal fashion. The results are low energy efficiency and poor recovery of important and expensive alloy elements due to oxidation. Between 1990 and 2004, under the auspices of the Department of Energy, the American Foundry Society and General Motors Corp., a computer simulation of the cupola furnace was developed that accurately describes the complex behavior of the furnace. When provided with the furnace input conditions, the model provides accurate values of the output conditions in a matter of seconds. It also provides key diagnostics. Using clues from the diagnostics, a trained specialist can infer changes in the operation that will move the system toward higher efficiency. Repeating the process in an iterative fashion leads to near optimum operating conditions with just a few iterations. More advanced uses of the program have been examined. The program is currently being combined with an "Expert System" to permit optimization in real time. The program has been combined with "neural network" programs to allow very easy scanning of a wide range of furnace operation. Rudimentary efforts were successfully made to operate the furnace using a computer. References to these more advanced systems will be found in the "Cupola Handbook", Chapter 27, American Foundry Society, Des Plaines, IL (1999).

  14. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  15. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  16. Analog modelling of obduction processes

    Science.gov (United States)

    Agard, P.; Zuo, X.; Funiciello, F.; Bellahsen, N.; Faccenna, C.; Savva, D.

    2012-04-01

    Obduction corresponds to one of plate tectonics oddities, whereby dense, oceanic rocks (ophiolites) are presumably 'thrust' on top of light, continental ones, as for the short-lived, almost synchronous Peri-Arabic obduction (which took place along thousands of km from Turkey to Oman in c. 5-10 Ma). Analog modelling experiments were performed to study the mechanisms of obduction initiation and test various triggering hypotheses (i.e., plate acceleration, slab hitting the 660 km discontinuity, ridge subduction; Agard et al., 2007). The experimental setup comprises (1) an upper mantle, modelled as a low-viscosity transparent Newtonian glucose syrup filling a rigid Plexiglas tank and (2) high-viscosity silicone plates (Rhodrosil Gomme with PDMS iron fillers to reproduce densities of continental or oceanic plates), located at the centre of the tank above the syrup to simulate the subducting and the overriding plates - and avoid friction on the sides of the tank. Convergence is simulated by pushing on a piston at one end of the model with velocities comparable to those of plate tectonics (i.e., in the range 1-10 cm/yr). The reference set-up includes, from one end to the other (~60 cm): (i) the piston, (ii) a continental margin containing a transition zone to the adjacent oceanic plate, (iii) a weakness zone with variable resistance and dip (W), (iv) an oceanic plate - with or without a spreading ridge, (v) a subduction zone (S) dipping away from the piston and (vi) an upper, active continental margin, below which the oceanic plate is being subducted at the start of the experiment (as is known to have been the case in Oman). Several configurations were tested and over thirty different parametric tests were performed. Special emphasis was placed on comparing different types of weakness zone (W) and the extent of mechanical coupling across them, particularly when plates were accelerated. Displacements, together with along-strike and across-strike internal deformation in all

  17. De nationale mål for sundhed – en strategi til en normalisering af befolkningen?

    DEFF Research Database (Denmark)

    Boelsbjerg, Hanne Bess; Præstegaard Hendriksen, Marie

    2018-01-01

    There is a growing societal tendency to seek solutions to social as well as medical problems on the basis of biomedical knowledge. The chapter focuses on the significance that the national health targets have as a strategy for normalising the population, and on the role the authorities...

  18. From business value model to coordination process model

    NARCIS (Netherlands)

    Fatemi, Hassan; Wieringa, Roelf J.; Poler, R.; van Sinderen, Marten J.; Sanchis, R.

    2009-01-01

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary

  19. Symmorphosis through dietary regulation: a combinatorial role for proteolysis, autophagy and protein synthesis in normalising muscle metabolism and function of hypertrophic mice after acute starvation.

    Directory of Open Access Journals (Sweden)

    Henry Collins-Hooper

    Full Text Available Animals are imbued with adaptive mechanisms spanning from the tissue/organ to the cellular scale which ensure that processes of homeostasis are preserved in the landscape of size change. However we and others have postulated that the degree of adaptation is limited and that once outside the normal levels of size fluctuations, cells and tissues function in an aberrant manner. In this study we examine the function of muscle in the myostatin null mouse, which is an excellent model for hypertrophy beyond levels of normal growth, and the consequences of acute starvation to restore mass. We show that muscle growth is sustained through protein synthesis driven by Serum/Glucocorticoid Kinase 1 (SGK1) rather than Akt1. Furthermore our metabonomic profiling of hypertrophic muscle shows that carbon from nutrient sources is being channelled for the production of biomass rather than ATP production. However the muscle displays elevated levels of autophagy and decreased levels of muscle tension. We demonstrate the myostatin null muscle is acutely sensitive to changes in diet and activates both the proteolytic and autophagy programmes, shutting down protein synthesis more extensively than is the case for wild-types. Poignantly, we show that acute starvation, which is detrimental to wild-type animals, is beneficial in terms of metabolism and muscle function in the myostatin null mice by normalising tension production.

  20. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

    Full Text Available The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented (“as is” in a manual manner and target processes (“to be”, using the RFID technology for the purpose of their automation. Additionally, the examples of applying the method of structural and dynamic analysis of the processes (process simulation to verify their correctness and efficiency were presented. The extension of the process analysis method is a possibility of applying the warehouse of processes and process mining methods.

  1. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  2. Analysis of a simulated microarray dataset: Comparison of methods for data normalisation and detection of differential expression

    NARCIS (Netherlands)

    Watson, M.; Perez-Alegre, M.; Denis Baron, M.; Delmas, C.; Dovc, P.; Duval, M.; Foulley, J.L.; Garrido-Pavon, J.J.; Hulsegge, B.; Jafrezic, F.; Jiménez-Marín, A.; Lavric, M.; Lê Cao, K.A.; Marot, G.; Mouzaki, D.; Pool, M.H.; Robert-Granié, C.; San Cristobal, M.; Tosser-Klop, G.; Waddington, D.; Koning, de D.J.

    2007-01-01

    Microarrays allow researchers to measure the expression of thousands of genes in a single experiment. Before statistical comparisons can be made, the data must be assessed for quality and normalisation procedures must be applied, of which many have been proposed. Methods of comparing the normalised

  3. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes even hardly manageable taking the size of taxonomies into

  4. The centre of rotation of the shoulder complex and the effect of normalisation.

    Science.gov (United States)

    Amabile, Celia; Bull, Anthony M J; Kedgley, Angela E

    2016-06-14

    Shoulder motions consist of a composite movement of three joints and one pseudo-joint, which together dictate the humerothoracic motion. The purpose of this work was to quantify the location of the centre of rotation (CoR) of the shoulder complex as a whole. Dynamic motion of 12 participants was recorded using optical motion tracking during coronal, scapular and sagittal plane elevation. The instantaneous CoR was found for each angle of elevation using helical axes projected onto the three planes of motion. The location of an average CoR for each plane was evaluated using digitised and anthropometric measures for normalisation. When conducting motion in the coronal, scapular, and sagittal planes, respectively, the coefficients for locating the CoRs of the shoulder complex are -61%, -61%, and -65% of the anterior-posterior dimension - the vector between the midpoint of the incisura jugularis and the xiphoid process and the midpoint of the seventh cervical vertebra and the eighth thoracic vertebra; 0%, -1%, and -2% of the superior-inferior dimension - the vector between the midpoint of the acromioclavicular joints and the midpoint of the anterior superior iliac spines; and 57%, 57%, and 78% of the medial-lateral dimension - 0.129 times the height of the participant. Knowing the location of the CoR of the shoulder complex as a whole enables improved participant positioning for evaluation and rehabilitation activities that involve movement of the hand with a fixed radius, such as those that employ isokinetic dynamometers. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
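
    A small Python sketch of how the reported coronal-plane coefficients could be applied to digitised landmarks. The landmark argument names, the choice of reference point from which the scaled vectors are applied, and the construction of the lateral direction are assumptions made for illustration; the abstract does not spell these details out.

        import numpy as np

        # Coronal-plane coefficients reported above: AP, SI and ML fractions.
        COEF_CORONAL = {"ap": -0.61, "si": 0.00, "ml": 0.57}

        def shoulder_cor_coronal(ij, px, c7, t8, ac_mid, asis_mid, height_m):
            """Estimate the shoulder-complex CoR for coronal-plane elevation.

            Landmarks are 3-D numpy arrays in a common frame: ij = incisura jugularis,
            px = xiphoid process, c7/t8 = vertebral landmarks, ac_mid = midpoint of the
            acromioclavicular joints, asis_mid = midpoint of the anterior superior
            iliac spines; height_m is the participant's height in metres.
            """
            ap_vec = 0.5 * (c7 + t8) - 0.5 * (ij + px)   # anterior-posterior dimension
            si_vec = asis_mid - ac_mid                   # superior-inferior dimension
            ml_len = 0.129 * height_m                    # medial-lateral dimension
            ml_dir = np.cross(si_vec, ap_vec)            # assumed lateral direction
            ml_dir /= np.linalg.norm(ml_dir)
            origin = 0.5 * (ij + px)                     # assumed reference point
            return (origin
                    + COEF_CORONAL["ap"] * ap_vec
                    + COEF_CORONAL["si"] * si_vec
                    + COEF_CORONAL["ml"] * ml_len * ml_dir)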

  5. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin in general practice in the Capital Region of Denmark using INR POCT. METHODS: A total of 20 general practices, ten single-handed and ten group practices using INR POCT, were randomly selected to participate in the study. Practice organisation and patient characteristics were recorded. INR measurements were collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT...
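
    TTR is commonly computed by linearly interpolating the INR between consecutive measurements (the Rosendaal approach); the record does not state which variant was used here, so the Python sketch below is only an illustration of the general calculation, with a therapeutic range of 2.0-3.0 assumed.

        def time_in_therapeutic_range(days, inr, low=2.0, high=3.0):
            """Fraction of time the linearly interpolated INR lies within [low, high].

            days: measurement times in days, ascending; inr: matching INR values.
            """
            in_range = 0.0
            total = 0.0
            for d0, d1, i0, i1 in zip(days, days[1:], inr, inr[1:]):
                span = d1 - d0
                total += span
                steps = max(int(span * 24), 1)               # hourly interpolation grid
                for k in range(steps):
                    value = i0 + (i1 - i0) * (k + 0.5) / steps
                    if low <= value <= high:
                        in_range += span / steps
            return in_range / total if total else float("nan")

    For example, three measurements at days 0, 7 and 21 with INR values 1.8, 2.4 and 3.3 give a TTR of roughly two thirds, since part of the first interval lies below 2.0 and part of the second rises above 3.0.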

  6. Business Process Modelling based on Petri nets

    Directory of Open Access Journals (Sweden)

    Qin Jianglong

    2017-01-01

    Full Text Available Business process modelling is the way business processes are expressed. Business process modelling is the foundation of business process analysis, reengineering, reorganization and optimization. It can not only help enterprises achieve internal information system integration and reuse, but also help enterprises collaborate with external partners. Based on the prototype Petri net, this paper adds time and cost factors to form an extended generalized stochastic Petri net as a formal description of the business process. A semi-formalized business process modelling algorithm based on Petri nets is proposed. Finally, a case from a logistics company shows that the modelling algorithm is correct and effective.

  7. Modeling process flow using diagrams

    NARCIS (Netherlands)

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process

  8. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models of different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
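
    One way to make the idea concrete is the law of total variance applied over both the discrete choice of process model and the parameters within each model. The decomposition below is a generic illustration of such a variance-based process sensitivity measure under that assumption; the notation is introduced here and the paper's exact index may differ.

        % For model output Delta, process A represented by alternative process models
        % M_{A,k} with weights P(M_{A,k}) and parameters theta_A within each model:
        \[
          S_A = \frac{\operatorname{Var}_{M_A,\theta_A}\big[\mathbb{E}(\Delta \mid M_A, \theta_A)\big]}{\operatorname{Var}(\Delta)},
        \]
        \[
          \operatorname{Var}_{M_A,\theta_A}\big[\mathbb{E}(\Delta \mid M_A, \theta_A)\big]
          = \sum_k P(M_{A,k})\operatorname{Var}_{\theta_A}\big[\mathbb{E}(\Delta \mid M_{A,k}, \theta_A)\big]
          + \operatorname{Var}_{M_A}\big[\mathbb{E}(\Delta \mid M_A)\big].
        \]
        % The first term carries parametric uncertainty within each alternative model of
        % the process; the second carries the uncertainty about which process model holds.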

  9. Modeling grinding processes as micro processes

    African Journals Online (AJOL)

    eobe

    into two parts: static specific chip formation energy and dynamic specific chip formation ... the ratio of static normal chip formation force to static tangential chip formation force and the ratio ... grinding processing parameters to the friction coefficient between workpiece and grinding wheel. From equation (20), the calculation ...

  10. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  11. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  12. Mathematical Modeling: A Structured Process

    Science.gov (United States)

    Anhalt, Cynthia Oropesa; Cortez, Ricardo

    2015-01-01

    Mathematical modeling, in which students use mathematics to explain or interpret physical, social, or scientific phenomena, is an essential component of the high school curriculum. The Common Core State Standards for Mathematics (CCSSM) classify modeling as a K-12 standard for mathematical practice and as a conceptual category for high school…

  13. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

    Full Text Available The article examines the essence of process-oriented enterprise management. Because of the complexity and differentiation of existing methods, and the specific language and terminology of enterprise business process modelling, the content and types of the relevant information technologies are analysed. The theoretical aspects of business process modelling are reviewed, and modern modelling techniques that have found practical application in visualising the activity of retailers are studied. The theoretical analysis of the modelling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes due to its integrated systemological capabilities. A visualised simulation model of the business process "sales" as it currently exists ("as is") was designed for retailers using a combination of UFO-elements, with the aim of further practical formalisation and optimisation of the given business process.

  14. Modelling heat processing of dairy products

    NARCIS (Netherlands)

    Hotrum, N.; Fox, M.B.; Lieverloo, H.; Smit, E.; Jong, de P.; Schutyser, M.A.I.

    2010-01-01

    This chapter discusses the application of computer modelling to optimise the heat processing of milk. The chapter first reviews types of heat processing equipment used in the dairy industry. Then, the types of objectives that can be achieved using model-based process optimisation are discussed.

  15. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  16. Modeling process flow using diagrams

    OpenAIRE

    Kemper, B.; de Mast, J.; Mandjes, M.

    2010-01-01

    In the practice of process improvement, tools such as the flowchart, the value-stream map (VSM), and a variety of ad hoc variants of such diagrams are commonly used. The purpose of this paper is to present a clear, precise, and consistent framework for the use of such flow diagrams in process improvement projects. The paper finds that traditional diagrams, such as the flowchart, the VSM, and OR-type of diagrams, have severe limitations, miss certain elements, or are based on implicit but cons...

  17. Steady-State Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    illustrate the “equation oriented” approach as well as the “sequential modular” approach to solving complex flowsheets for steady state applications. The applications include the Williams-Otto plant, the hydrodealkylation (HDA) of toluene, conversion of ethylene to ethanol and a bio-ethanol process....

  18. Two Methods for Normalisation of Measured Energy Performance—Testing of a Net-Zero Energy Building in Sweden

    Directory of Open Access Journals (Sweden)

    Björn Berggren

    2017-10-01

    Full Text Available An increasing demand for energy-efficient buildings has led to an increasing focus on predicted energy performance once a building is in use. Many studies have identified a performance gap between predicted energy use and actual measured energy use once buildings are in the user phase. However, none of the identified studies normalise measured energy use for both internal and external deviating boundary conditions. This study uses a Net-zero energy building (Net ZEB building in Sweden to test two different approaches to the normalisation of measured energy use—static and dynamic methods. The normalisation of energy use for a ground source heat pump reduces the performance gap from 12% to 1–5%, depending on the method of normalisation. The normalisation of energy from photovoltaic (PV panels reduces the performance gap from 17% to 5%, regardless of the method used. The results show that normalisation is important in order to accurately determine the energy performance of buildings. The most important parameters are the indoor temperature and internal loads, which have the largest effect on normalisation in this case study. Furthermore, the case study shows that it is possible to build Net ZEB buildings with existing technologies in a Northern European climate.
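
    As an illustration of what a static normalisation step can look like in practice, the sketch below corrects a measured heating figure for a deviating outdoor climate, indoor temperature and internal loads. The multiplicative form, the degree-day ratio and the roughly 7 %/K indoor-temperature sensitivity are generic assumptions, not the specific method or values used in the study.

        def normalise_heating_static(measured_kwh, hdd_measured, hdd_normal,
                                     t_indoor_measured, t_indoor_design,
                                     gains_measured_kwh, gains_design_kwh):
            """Statically normalise measured space-heating energy to design conditions.

            hdd_*: heating degree days of the measured period and of a normal year;
            t_indoor_*: mean indoor temperatures (deg C); gains_*: internal heat
            gains (kWh) actually present and assumed in the design calculation.
            """
            climate_factor = hdd_normal / hdd_measured          # colder/warmer year
            temperature_factor = 1.0 + 0.07 * (t_indoor_design - t_indoor_measured)
            load_correction = gains_measured_kwh - gains_design_kwh
            return measured_kwh * climate_factor * temperature_factor + load_correction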

  19. Numerical modelling of reflood processes

    International Nuclear Information System (INIS)

    Glynn, D.R.; Rhodes, N.; Tatchell, D.G.

    1983-01-01

    The use of a detailed computer model to investigate the effects of grid size and the choice of wall-to-fluid heat-transfer correlations on the predictions obtained for reflooding of a vertical heated channel is described. The model employs equations for the momentum and enthalpy of vapour and liquid and hence accounts for both thermal non-equilibrium and slip between the phases. Empirical correlations are used to calculate interphase and wall-to-fluid friction and heat-transfer as functions of flow regime and local conditions. The empirical formulae have remained fixed with the exception of the wall-to-fluid heat-transfer correlations. These have been varied according to the practices adopted in other computer codes used to model reflood, namely REFLUX, RELAP and TRAC. Calculations have been performed to predict the CSNI standard problem number 7, and the results are compared with experiment. It is shown that the results are substantially grid-independent, and that the choice of correlation has a significant influence on the general flow behaviour, the rate of quenching and on the maximum cladding temperature predicted by the model. It is concluded that good predictions of reflooding rates can be obtained with particular correlation sets. (author)

  20. Branching process models of cancer

    CERN Document Server

    Durrett, Richard

    2015-01-01

    This volume develops results on continuous time branching processes and applies them to study rate of tumor growth, extending classic work on the Luria-Delbruck distribution. As a consequence, the authors calculate the probability that mutations that confer resistance to treatment are present at detection and quantify the extent of tumor heterogeneity. As applications, the authors evaluate ovarian cancer screening strategies and give rigorous proofs for results of Heano and Michor concerning tumor metastasis. These notes should be accessible to students who are familiar with Poisson processes and continuous time. Richard Durrett is mathematics professor at Duke University, USA. He is the author of 8 books, over 200 journal articles, and has supervised more than 40 Ph.D. students. Most of his current research concerns the applications of probability to biology: ecology, genetics, and most recently cancer.

  1. The Poisson Margin Test for Normalisation Free Significance Analysis of NGS Data

    Science.gov (United States)

    Kowalczyk, Adam; Bedo, Justin; Conway, Thomas; Beresford-Smith, Bryan

    Motivation: The current methods for the determination of the statistical significance of peaks and regions in NGS data require an explicit normalisation step to compensate for (global or local) imbalances in the sizes of sequenced and mapped libraries. There are no canonical methods for performing such compensations, hence a number of different procedures serving this goal in different ways can be found in the literature. Unfortunately, the normalisation has a significant impact on the final results. Different methods yield very different numbers of detected "significant peaks" even in the simplest scenario of ChIP-Seq experiments which compare the enrichment in a single sample relative to a matching control. This becomes an even more acute issue in the more general case of the comparison of multiple samples, where a number of arbitrary design choices will be required in the data analysis stage, each option resulting in possibly (significantly) different outcomes.

  2. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  3. Systematic approach for the identification of process reference models

    CSIR Research Space (South Africa)

    Van Der Merwe, A

    2009-02-01

    Full Text Available Process models are used in different application domains to capture knowledge on the process flow. Process reference models (PRM) are used to capture reusable process models, which should simplify the identification process of process models...

  4. Retrieval of the raindrop size distribution from polarimetric radar data using double-moment normalisation

    OpenAIRE

    Raupach, Timothy H.; Berne, Alexis

    2016-01-01

    A new technique for estimating the raindrop size distribution (DSD) from polarimetric radar data is proposed. Two statistical moments of the DSD are estimated from polarimetric variables, and the DSD is reconstructed. The technique takes advantage of the relative invariance of the double-moment normalised DSD. The method was tested using X-band radar data and networks of disdrometers in three different climatic regions. Radar-derived estimates of the DSD compare reasonably well to observation...
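    A minimal sketch of the double-moment normalisation itself (not of the radar retrieval step) is given below: two moments of a binned DSD define a characteristic diameter and a scaling concentration, and the DSD collapses onto an approximately invariant shape function h(x). The moment orders i = 3 and j = 4 and the toy exponential DSD are assumptions for illustration only.

```python
import numpy as np

def double_moment_normalise(diam, nd, i=3, j=4):
    """Double-moment normalisation of a binned drop size distribution.
    diam: bin-centre diameters (mm); nd: N(D) in m^-3 mm^-1.
    Returns the dimensionless coordinates (x, h) of the normalised DSD."""
    dD = np.gradient(diam)                            # bin widths
    Mi = np.sum(nd * diam**i * dD)                    # i-th moment of the DSD
    Mj = np.sum(nd * diam**j * dD)                    # j-th moment of the DSD
    Dm = (Mj / Mi) ** (1.0 / (j - i))                 # characteristic diameter
    N0 = Mi ** ((j + 1) / (j - i)) * Mj ** (-(i + 1) / (j - i))  # scaling concentration
    return diam / Dm, nd / N0                         # normalised diameter and shape function

# toy exponential DSD, N(D) = 8000 * exp(-2.0 * D)
D = np.linspace(0.1, 6.0, 60)
x, h = double_moment_normalise(D, 8000.0 * np.exp(-2.0 * D))
print(x[:3], h[:3])
```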

  5. Validation process of simulation model

    International Nuclear Information System (INIS)

    San Isidro, M. J.

    1998-01-01

    A methodology for the empirical validation of any detailed simulation model is presented. This kind of validation is always tied to an experimental case and has a residual character, because the conclusions are based on comparisons between simulated outputs and experimental measurements. The methodology guides the detection of failures of the simulation model and can also be used as a guide in the design of subsequent experiments. Three steps can be clearly differentiated: sensitivity analysis, which can be carried out with DSA (differential sensitivity analysis) and with MCSA (Monte Carlo sensitivity analysis); the search for the optimal domains of the input parameters, for which a procedure based on Monte Carlo methods and cluster techniques has been developed; and residual analysis, performed in the time domain and in the frequency domain using correlation analysis and spectral analysis. As an application of this methodology, the validation carried out on a thermal simulation model of buildings is presented, studying the behavior of building components in a test cell of the LECE laboratory of CIEMAT, Spain. (Author) 17 refs
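    The Monte Carlo sensitivity analysis (MCSA) step can be illustrated with a short sketch: input parameters are sampled within their assumed ranges, the simulation model is run for each sample, and parameters are ranked by the correlation between input and output. The toy "thermal model" and the parameter names below are placeholders for the real building-simulation model.

```python
import numpy as np

def mc_sensitivity(model, param_ranges, n_samples=500, seed=0):
    """Minimal Monte Carlo sensitivity analysis (MCSA): sample the inputs uniformly
    within their ranges, run the simulation model for each sample, and rank the
    parameters by the absolute correlation between input and output."""
    rng = np.random.default_rng(seed)
    names = list(param_ranges)
    samples = {k: rng.uniform(lo, hi, n_samples) for k, (lo, hi) in param_ranges.items()}
    outputs = np.array([model({k: samples[k][n] for k in names}) for n in range(n_samples)])
    return {k: abs(np.corrcoef(samples[k], outputs)[0, 1]) for k in names}

# toy stand-in for the thermal building model: indoor temperature vs. two uncertain inputs
toy_model = lambda p: 20 + 5 * p["wall_U_value"] - 0.8 * p["air_change_rate"]
print(mc_sensitivity(toy_model, {"wall_U_value": (0.2, 2.0), "air_change_rate": (0.1, 3.0)}))
```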

  6. Comparisons of the CODD and of the Normaliser Systems For Closed Orbit Measurements in the PS

    CERN Document Server

    Belleman, J; Ludwig, M; Potier, J P; Steerenberg, R; CERN. Geneva. AB Department

    2003-01-01

    The PS ring is equipped with 40 position pick-ups (PUs) distributed around the ring. These are connected to the CODD system, which performs trajectory measurements of any single bunch on two consecutive turns, and to 40 Normalisers, which deliver an averaged orbit. CODD uses a beam-synchronous timing system, which tracks a given bunch all through the acceleration cycle but needs resynchronization after harmonic changes. It measures at injection or at any given C-timing, albeit with no less than 5 ms between acquisitions, and it is blind during harmonic changes. A Normaliser, using a technique originally developed for radial loop control, produces a slow signal proportional to the average position of the bunches. At beam injection a settling time of 1.5 ms is needed; thereafter it follows orbit changes with a 200 µs time constant. Thus it will not show rapid position changes, such as betatron oscillations, and it does not require accurate timing. In the PS installation, the Normaliser outputs are simply sampled at 1 ms intervals. CODD...

  7. Comparison between magnetic bead and qPCR library normalisation methods for forensic MPS genotyping.

    Science.gov (United States)

    Mehta, Bhavik; Venables, Samantha; Roffey, Paul

    2018-01-01

    Massively parallel sequencing (MPS) is fast approaching operational use in forensic science, with the capability to analyse hundreds of DNA identity and DNA intelligence markers in multiple samples simultaneously. The ForenSeq™ DNA Signature Kit on MiSeq FGx™ (Illumina) workflow can provide profiles for autosomal short tandem repeats (STRs), X chromosome and Y chromosome STRs, identity single nucleotide polymorphisms (SNPs), biogeographical ancestry SNPs and phenotype (eye and hair colour) SNPs from a sample. The library preparation procedure involves a series of steps including target amplification, library purification and library normalisation. This study highlights the comparison between the manufacturer-recommended magnetic bead normalisation and quantitative polymerase chain reaction (qPCR) methods. Furthermore, two qPCR chemistries, KAPA® (KAPA Biosystems) and NEBNext® (New England Biolabs Inc.), have also been compared. qPCR normalisation outperformed the bead normalisation method, while the NEBNext® kit obtained higher genotype concordance than KAPA®. The study also established an MPS workflow that can be utilised in any operational forensic laboratory.

  8. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
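    A hedged sketch of the core idea, Monte Carlo propagation of process-parameter variation through stacked unit operations to estimate the probability of an out-of-specification (OOS) result, is given below. The two toy unit-operation models, the parameter distributions and the specification limit are illustrative only and are not the IPM described in the record.

```python
import numpy as np

rng = np.random.default_rng(1)

def fermentation(titer_setpoint, temperature):
    """Toy upstream unit operation: titer depends on one process parameter."""
    return titer_setpoint * (1.0 - 0.02 * np.abs(temperature - 37.0))

def purification(titer, load_ratio):
    """Toy downstream unit operation: purity (the CQA) depends on the incoming
    titer and on a second process parameter."""
    return 99.0 - 0.5 * load_ratio - 0.1 * np.maximum(titer - 5.0, 0.0)

def integrated_mc(n=10_000, spec_limit=97.8):
    """Monte Carlo propagation through the stacked unit operations: sample the
    process-parameter distributions, chain the unit-operation models, and estimate
    the probability of an out-of-specification purity result."""
    temperature = rng.normal(37.0, 0.5, n)        # fermentation temperature (PP)
    load_ratio = rng.normal(2.0, 0.3, n)          # column load ratio (PP)
    titer = fermentation(5.0, temperature)        # intermediate quality attribute
    purity = purification(titer, load_ratio)      # final CQA
    return np.mean(purity < spec_limit)           # estimated OOS probability

print(f"estimated OOS rate: {integrated_mc():.4f}")
```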

  9. Process modelling on a canonical basis

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  10. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

    Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it will be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that this area had already been the subject of research in the past: thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. It would therefore be appropriate to add new measures of quality.

  11. Analysis of a simulated microarray dataset : Comparison of methods for data normalisation and detection of differential expression

    OpenAIRE

    Watson, Michael; Pérez-Alegre, Monica; Baron, Michael Denis; Delmas, Celine; Dovc, Peter; Duval, Mylene; Foulley, Jean Louis; Garrido-Pavon, Juan José; Hulsegge, Ina; Jaffrézic, Florence; Jiménez-Marin, Angeles; Lavric, Miha; Lê Cao, Kim-Anh; Marot, Guillemette; Mouzaki, Daphné

    2007-01-01

    Microarrays allow researchers to measure the expression of thousands of genes in a single experiment. Before statistical comparisons can be made, the data must be assessed for quality and normalisation procedures must be applied, of which many have been proposed. Methods of comparing the normalised data are also abundant, and no clear consensus has yet been reached. The purpose of this paper was to compare those methods used by the EADGENE network on a very noisy simulated data set. With the ...

  12. Modelling income processes with lots of heterogeneity

    DEFF Research Database (Denmark)

    Browning, Martin; Ejrnæs, Mette; Alvarez, Javier

    2010-01-01

    We model earnings processes allowing for lots of heterogeneity across agents. We also introduce an extension to the linear ARMA model which allows the initial convergence to the long run to be different from that implied by the conventional ARMA model. This is particularly important for unit root...

  13. Counting Processes for Retail Default Modeling

    DEFF Research Database (Denmark)

    Kiefer, Nicholas Maximilian; Larson, C. Erik

    in a discrete state space. In a simple case, the states could be default/non-default; in other models relevant for credit modeling the states could be credit scores or payment status (30 dpd, 60 dpd, etc.). Here we focus on the use of stochastic counting processes for mortgage default modeling, using data...

  14. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process

  15. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...

  16. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

    Computer models of processes are necessary for determining optimal operating conditions for closed nuclear fuel cycle (NFC) processes, and they can be quickly updated in accordance with new data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package, a balance model development, is used for calculating the material flow in technological processes; VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows for the whole plant or for separate lines of the plant. (A.C.)

  17. MODELLING PURCHASING PROCESSES FROM QUALITY ASPECTS

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2008-12-01

    Full Text Available Management has a fundamental task to identify and direct the primary and specific processes within the purchasing function, applying up-to-date information infrastructure. ISO 9001:2000 defines a process as a set of interrelated or interacting activities transforming inputs into outputs, and the "process approach" as the systematic identification and management of the processes employed within the organization, particularly the interactions among those processes. To direct a quality management system using the process approach, the organization has to determine the map of its general (basic) processes. Primary processes are determined on the grounds of their interrelationships and their impact on satisfying customers' needs. To make a proper choice of general business processes, it is necessary to determine the entire business flow, from the customer demand up to the delivery of the product or service provided. In the next step the process model is converted into a data model, which is essential for the implementation of an information system enabling automation, monitoring, measuring, inspection, analysis and improvement of key purchasing processes. This paper presents the methodology and some results of an investigation into the development of an information system for the purchasing process from the quality perspective.

  18. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  19. Value-Oriented Coordination Process Modeling

    NARCIS (Netherlands)

    Fatemi, Hassan; van Sinderen, Marten J.; Wieringa, Roelf J.; Hull, Richard; Mendling, Jan; Tai, Stefan

    Business webs are collections of enterprises designed to jointly satisfy a consumer need. Designing business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business value and coordination process perspectives, and for mutually aligning these

  20. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    We develop a general framework that extends choice models by including an explicit representation of the process and context of decision making. Process refers to the steps involved in decision making. Context refers to factors affecting the process, focusing in this paper on social networks. ... The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data...

  1. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  2. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.

  3. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  4. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industry standard created to offer a common and user-friendly notation to all the participants in a business process. The present paper aims to briefly present the main features of this notation as well as an interpretation of some of the main patterns characterizing a business process modeled as workflows.

  5. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Represented in an appropriate representation language, these models are intended to describe process plants and plant automatics in a unified way, allowing verification and computer-aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated by the potential of such a combination in safety analysis based on models comprising software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  6. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

    In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model and optimization and system management derived from the analysis of the mathematical model. All along this work the main features and shortcomings of the process are analyzed and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  7. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial-and-error based experiments, which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time consuming and/or repetitive experimental steps. The advantages of the use of a model ... Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand ...

  8. Extending Model Checking To Object Process Validation

    NARCIS (Netherlands)

    van Rein, H.

    2002-01-01

    Object-oriented techniques allow the gathering and modelling of system requirements in terms of an application area. The expression of data and process models at that level is a great asset in communication with non-technical people in that area, but it does not necessarily lead to consistent

  9. Hierarchical Structured Model for Nonlinear Dynamical Processes ...

    African Journals Online (AJOL)

    The mathematical representation of the process, in this context, is by a set of linear stochastic differential equations (SDE) with unique solutions. The problem of realization is that of constructing the dynamical system by looking at the problem of scientific model building. In model building, one must be able to calculate the ...

  10. Filament winding cylinders. I - Process model

    Science.gov (United States)

    Lee, Soo-Yong; Springer, George S.

    1990-01-01

    A model was developed which describes the filament winding process of composite cylinders. The model relates the significant process variables such as winding speed, fiber tension, and applied temperature to the thermal, chemical and mechanical behavior of the composite cylinder and the mandrel. Based on the model, a user friendly code was written which can be used to calculate (1) the temperature in the cylinder and the mandrel, (2) the degree of cure and viscosity in the cylinder, (3) the fiber tensions and fiber positions, (4) the stresses and strains in the cylinder and in the mandrel, and (5) the void diameters in the cylinder.

  11. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

    AbdElHaleem, H.S.; EI-Ahwany, A. H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

    Mathematical process modeling and the biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2 and ASM3, developed by Henze et al., were evaluated, considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model the internal mass transfer inside the flocs is neglected, and hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  12. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, compliance with environmental regulations, and process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air in-leakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  13. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the subject-features of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage, and the algorithm for developing an integrative model for it, are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and for realistic prediction of pedagogical phenomena.

  14. From Business Value Model to Coordination Process Model

    Science.gov (United States)

    Fatemi, Hassan; van Sinderen, Marten; Wieringa, Roel

    The increased complexity of business webs calls for modeling the collaboration of enterprises from different perspectives, in particular the business and process perspectives, and for mutually aligning these perspectives. Business value modeling and coordination process modeling both are necessary for a good e-business design, but these activities have different goals and use different concepts. Nevertheless, the resulting models should be consistent with each other because they refer to the same system from different perspectives. Hence, checking the consistency between these models or producing one based on the other would be of high value. In this paper we discuss the issue of achieving consistency in multi-level e-business design and give guidelines to produce consistent coordination process models from business value models in a stepwise manner.

  15. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes which act in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and those which act during the subsequent cloud development. In the second case particles are collected by cloud droplets or by falling rain drops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW)

  16. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
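    The central idea, that a nonhomogeneous Markov process becomes homogeneous on a transformed (operational) time scale, can be sketched briefly: with intensity matrix Q on the operational scale and time transformation g, the transition probability matrix from s to t is expm(Q * (g(t) - g(s))). The power-law transform and the three-state intensity matrix below are illustrative assumptions; the paper estimates both jointly from panel data by Fisher scoring.

```python
import numpy as np
from scipy.linalg import expm

def transition_probability(Q, s, t, g=lambda u: u**1.5):
    """Transition probability matrix of a nonhomogeneous Markov process obtained by
    a time transformation: on the operational time scale g(t) the process is
    homogeneous with intensity matrix Q, so P(s, t) = expm(Q * (g(t) - g(s)))."""
    return expm(Q * (g(t) - g(s)))

# toy progressive three-state model: no delirium -> delirium -> recovered
Q = np.array([[-0.30,  0.30,  0.00],
              [ 0.00, -0.50,  0.50],
              [ 0.00,  0.00,  0.00]])
print(transition_probability(Q, s=1.0, t=4.0))
```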

  17. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  18. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  19. Stochastic differential equation model to Prendiville processes

    International Nuclear Information System (INIS)

    Granita; Bahar, Arifah

    2015-01-01

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution

  20. Stochastic differential equation model to Prendiville processes

    Energy Technology Data Exchange (ETDEWEB)

    Granita, E-mail: granitafc@gmail.com [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); Bahar, Arifah [Dept. of Mathematical Science, Universiti Teknologi Malaysia, 81310, Johor Malaysia (Malaysia); UTM Center for Industrial & Applied Mathematics (UTM-CIAM) (Malaysia)

    2015-10-22

    The Prendiville process is another variation of the logistic model which assumes linearly decreasing population growth rate. It is a continuous time Markov chain (CTMC) taking integer values in the finite interval. The continuous time Markov chain can be approximated by stochastic differential equation (SDE). This paper discusses the stochastic differential equation of Prendiville process. The work started with the forward Kolmogorov equation in continuous time Markov chain of Prendiville process. Then it was formulated in the form of a central-difference approximation. The approximation was then used in Fokker-Planck equation in relation to the stochastic differential equation of the Prendiville process. The explicit solution of the Prendiville process was obtained from the stochastic differential equation. Therefore, the mean and variance function of the Prendiville process could be easily found from the explicit solution.
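    A minimal numerical sketch of the resulting SDE is given below: using one common parameterisation of the Prendiville process (birth rate proportional to the distance from the upper boundary, death rate proportional to the distance from the lower boundary), the diffusion approximation has drift equal to birth minus death rate and squared diffusion equal to their sum, and can be integrated by the Euler-Maruyama scheme. The rates, interval and step size are illustrative assumptions, not the paper's worked example.

```python
import numpy as np

def prendiville_sde(x0=5.0, lam=0.4, mu=0.3, n1=0.0, n2=20.0, t_end=10.0, dt=0.01, seed=0):
    """Euler-Maruyama integration of the diffusion (SDE) approximation of a
    Prendiville birth-death process on [n1, n2], with birth rate lam*(n2 - x) and
    death rate mu*(x - n1): drift = birth - death, diffusion^2 = birth + death."""
    rng = np.random.default_rng(seed)
    steps = int(t_end / dt)
    x = np.empty(steps + 1)
    x[0] = x0
    for k in range(steps):
        birth = lam * (n2 - x[k])
        death = mu * (x[k] - n1)
        drift = birth - death
        diffusion = np.sqrt(max(birth + death, 0.0))
        x[k + 1] = x[k] + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
        x[k + 1] = min(max(x[k + 1], n1), n2)     # keep the path inside the finite interval
    return x

path = prendiville_sde()
print(path[-1])   # long-run values fluctuate around n2*lam/(lam + mu) ~ 11.4
```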

  1. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    Full Text Available This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer's expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and the collected data are used to estimate overall satisfaction and to calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  2. Chain binomial models and binomial autoregressive processes.

    Science.gov (United States)

    Weiss, Christian H; Pollett, Philip K

    2012-09-01

    We establish a connection between a class of chain-binomial models of use in ecology and epidemiology and binomial autoregressive (AR) processes. New results are obtained for the latter, including expressions for the lag-conditional distribution and related quantities. We focus on two types of chain-binomial model, extinction-colonization and colonization-extinction models, and present two approaches to parameter estimation. The asymptotic distributions of the resulting estimators are studied, as well as their finite-sample performance, and we give an application to real data. A connection is made with standard AR models, which also has implications for parameter estimation. © 2011, The International Biometric Society.
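    The connection can be made concrete with a short simulation of the binomial AR(1) process X_t = alpha ∘ X_{t-1} + beta ∘ (N - X_{t-1}), where ∘ denotes binomial thinning; read as an extinction-colonization model, each occupied patch survives with probability alpha and each empty patch is colonised with probability beta. The parameter values below are illustrative only.

```python
import numpy as np

def binomial_ar1(N=50, alpha=0.7, beta=0.2, T=200, x0=25, seed=0):
    """Simulate the binomial AR(1) process X_t = alpha o X_{t-1} + beta o (N - X_{t-1}),
    where 'o' is binomial thinning: each occupied patch survives with probability
    alpha and each empty patch is colonised with probability beta."""
    rng = np.random.default_rng(seed)
    x = np.empty(T + 1, dtype=int)
    x[0] = x0
    for t in range(T):
        survivors = rng.binomial(x[t], alpha)            # occupied patches that persist
        colonisations = rng.binomial(N - x[t], beta)     # empty patches newly colonised
        x[t + 1] = survivors + colonisations
    return x

series = binomial_ar1()
print(series[:10])
# the stationary mean is N * beta / (1 - alpha + beta) = 50 * 0.2 / 0.5 = 20
```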

  3. The role of bicycle sharing systems in normalising the image of cycling: An observational study of London cyclists.

    Science.gov (United States)

    Goodman, Anna; Green, Judith; Woodcock, James

    2014-03-01

    Bicycle sharing systems are increasingly popular around the world and have the potential to increase the visibility of people cycling in everyday clothing. This may in turn help normalise the image of cycling, and reduce perceptions that cycling is 'risky' or 'only for sporty people'. This paper sought to compare the use of specialist cycling clothing between users of the London bicycle sharing system (LBSS) and cyclists using personal bicycles. To do this, we observed 3594 people on bicycles at 35 randomly-selected locations across central and inner London. The 592 LBSS users were much less likely to wear helmets (16% vs. 64% among personal-bicycle cyclists), high-visibility clothes (11% vs. 35%) and sports clothes (2% vs. 25%). In total, 79% of LBSS users wore none of these types of specialist cycling clothing, as compared to only 30% of personal-bicycle cyclists. This was true of male and female LBSS cyclists alike (all p >0.25 for interaction). We conclude that bicycle sharing systems may not only encourage cycling directly, by providing bicycles to rent, but also indirectly, by increasing the number and diversity of cycling 'role models' visible.

  4. A neurolinguistic model of grammatical construction processing.

    Science.gov (United States)

    Dominey, Peter Ford; Hoen, Michel; Inui, Toshio

    2006-12-01

    One of the functions of everyday human language is to communicate meaning. Thus, when one hears or reads the sentence, "John gave a book to Mary," some aspect of an event concerning the transfer of possession of a book from John to Mary is (hopefully) transmitted. One theoretical approach to language referred to as construction grammar emphasizes this link between sentence structure and meaning in the form of grammatical constructions. The objective of the current research is to (1) outline a functional description of grammatical construction processing based on principles of psycholinguistics, (2) develop a model of how these functions can be implemented in human neurophysiology, and then (3) demonstrate the feasibility of the resulting model in processing languages of typologically diverse natures, that is, English, French, and Japanese. In this context, particular interest will be directed toward the processing of novel compositional structure of relative phrases. The simulation results are discussed in the context of recent neurophysiological studies of language processing.

  5. Similarity metrics for surgical process models.

    Science.gov (United States)

    Neumuth, Thomas; Loebe, Frank; Jannin, Pierre

    2012-01-01

    The objective of this work is to introduce a set of similarity metrics for comparing surgical process models (SPMs). SPMs are progression models of surgical interventions that support quantitative analyses of surgical activities, supporting systems engineering or process optimization. Five different similarity metrics are presented and proven. These metrics deal with several dimensions of process compliance in surgery, including granularity, content, time, order, and frequency of surgical activities. The metrics were experimentally validated using 20 clinical data sets each for cataract interventions, craniotomy interventions, and supratentorial tumor resections. The clinical data sets were controllably modified in simulations, which were iterated ten times, resulting in a total of 600 simulated data sets. The simulated data sets were subsequently compared to the original data sets to empirically assess the predictive validity of the metrics. We show that the results of the metrics for the surgical process models correlate significantly with the simulated modifications, and that the metrics meet predictive validity. The clinical use of the metrics was demonstrated exemplarily by assessing the learning curves of observers during surgical process model acquisition. Measuring similarity between surgical processes is a complex task. However, metrics for computing the similarity between surgical process models are needed in many applications in the field of medical engineering. These metrics are essential whenever two SPMs need to be compared, such as during the evaluation of technical systems, the education of observers, or the determination of surgical strategies. These metrics are key figures that provide a solid base for medical decisions, such as during validation of sensor systems for use in operating rooms in the future. Copyright © 2011 Elsevier B.V. All rights reserved.
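    The five metrics themselves are defined in the paper; purely as a generic illustration of two of the dimensions mentioned (content and order), the sketch below compares two activity sequences with a Jaccard overlap and a longest-common-subsequence-style ratio. These simple measures and the toy cataract sequences are stand-ins, not the metrics proposed by the authors.

```python
from difflib import SequenceMatcher

def content_similarity(spm_a, spm_b):
    """Content dimension: Jaccard overlap of the activity sets of two SPMs
    (each given as an ordered list of activity labels)."""
    a, b = set(spm_a), set(spm_b)
    return len(a & b) / len(a | b) if a | b else 1.0

def order_similarity(spm_a, spm_b):
    """Order dimension: longest-common-subsequence style ratio of the two
    activity sequences, here via difflib's matching ratio."""
    return SequenceMatcher(None, spm_a, spm_b).ratio()

cataract_a = ["incision", "capsulorhexis", "phacoemulsification", "irrigation", "lens_insertion"]
cataract_b = ["incision", "phacoemulsification", "capsulorhexis", "irrigation", "lens_insertion"]
print(content_similarity(cataract_a, cataract_b), order_similarity(cataract_a, cataract_b))
```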

  6. A process algebra model of QED

    International Nuclear Information System (INIS)

    Sulis, William

    2016-01-01

    The process algebra approach to quantum mechanics posits a finite, discrete, determinate ontology of primitive events which are generated by processes (in the sense of Whitehead). In this ontology, primitive events serve as elements of an emergent space-time and of emergent fundamental particles and fields. Each process generates a set of primitive elements, using only local information, causally propagated as a discrete wave, forming a causal space termed a causal tapestry. Each causal tapestry forms a discrete and finite sampling of an emergent causal manifold (space-time) M and emergent wave function. Interactions between processes are described by a process algebra which possesses 8 commutative operations (sums and products) together with a non-commutative concatenation operator (transitions). The process algebra possesses a representation via nondeterministic combinatorial games. The process algebra connects to quantum mechanics through the set valued process and configuration space covering maps, which associate each causal tapestry with sets of wave functions over M. Probabilities emerge from interactions between processes. The process algebra model has been shown to reproduce many features of the theory of non-relativistic scalar particles to a high degree of accuracy, without paradox or divergences. This paper extends the approach to a semi-classical form of quantum electrodynamics. (paper)

  7. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd . These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function...... and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach....

  8. Performance assessment modeling of pyrometallurgical process wasteforms

    International Nuclear Information System (INIS)

    Nutt, W.M.; Hill, R.N.; Bullen, D.B.

    1995-01-01

    Performance assessment analyses have been completed to estimate the behavior of high-level nuclear wasteforms generated from the pyrometallurgical processing of liquid metal reactor (LMR) and light water reactor (LWR) spent nuclear fuel. Waste emplaced in the proposed repository at Yucca Mountain is investigated as the basis for the study. The resulting cumulative actinide and fission product releases to the accessible environment within a 100,000 year period from the various pyrometallurgical process wasteforms are compared to those of directly disposed LWR spent fuel using the same total repository system model. The impact of differing radionuclide transport models on the overall release characteristics is investigated

  9. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling of the retorting process for multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold-point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental values of the temperature profile matched within ±10 % error, which is a good match considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize the retorting of various Indian traditional vegetarian foods.
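    The lumped-parameter idea can be sketched in a few lines: the cold-point temperature is assumed to follow dT/dt = (T_retort - T) / tau, with a time constant tau that grows with the solid content. The tau correlation and all numerical values below are hypothetical; the paper fits its unified model to the four products listed above.

```python
import numpy as np

def cold_point_profile(t_retort=121.0, t_initial=30.0, solids_frac=0.4, t_end=60.0, dt=0.5):
    """Lumped-parameter sketch of the cold-point temperature in a retorted pouch:
    dT/dt = (T_retort - T) / tau, with the time constant tau (minutes) increased
    by the solid content to mimic slower internal heat transfer."""
    tau = 8.0 * (1.0 + 1.5 * solids_frac)          # hypothetical tau correlation
    times = np.arange(0.0, t_end + dt, dt)         # minutes
    temps = t_retort - (t_retort - t_initial) * np.exp(-times / tau)
    return times, temps

times, temps = cold_point_profile()
print(f"cold-point temperature after {times[-1]:.0f} min: {temps[-1]:.1f} C")
```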

  10. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs
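    For the diffusion-controlled mechanisms listed, a minimal sketch is an explicit finite-difference solution of Fick's second law for the ingress of an aggressive species into the vault wall. The diffusion coefficient, wall thickness and boundary conditions below are illustrative assumptions, not the report's models.

```python
import numpy as np

def diffusion_profile(depth=0.10, years=50.0, D=5e-12, c_surface=1.0, nx=50, nt=5000):
    """Explicit finite-difference solution of Fick's second law, dc/dt = D d2c/dx2,
    for the ingress of an aggressive species (e.g. chloride or sulfate) through a
    concrete wall of the given thickness (m), over the given time (years)."""
    dx = depth / (nx - 1)
    dt = years * 365.25 * 24 * 3600 / nt
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable; increase nt or decrease nx"
    c = np.zeros(nx)
    c[0] = c_surface                       # exposed face held at the surface concentration
    for _ in range(nt):
        c[1:-1] += r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = c_surface                   # re-impose the boundary conditions
        c[-1] = c[-2]                      # zero-flux condition at the inner face
    return c

profile = diffusion_profile()
print(f"relative concentration at ~5 cm depth after 50 years: {profile[25]:.3f}")
```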

  11. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1991-01-01

    Recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial distribution of measured values and geostatistical measures of spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. These deterministic features have their origin in the complex, yet logical, interplay of a number of deterministic geologic processes, including magmatic evolution; volcanic eruption, transport, and emplacement; post-emplacement cooling and alteration; and late-stage (diagenetic) alteration. Because the geologic processes responsible for the formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly, using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling. It is unlikely that any single representation of physical properties at the site will be suitable for all modeling purposes. Instead, the same underlying physical reality will need to be described many times, each in a manner conducive to assessing specific performance issues

  12. A Mathematical Model of Cigarette Smoldering Process

    Directory of Open Access Journals (Sweden)

    Chen P

    2014-12-01

    Full Text Available A mathematical model for a smoldering cigarette has been proposed. In the analysis of the cigarette combustion and pyrolysis processes, a receding burning front is defined, which has a constant temperature (~450 °C) and divides the cigarette into two zones, the burning zone and the pyrolysis zone. The char combustion processes in the burning zone and the pyrolysis of virgin tobacco and evaporation of water in the pyrolysis zone are included in the model. The hot gases flowing from the burning zone are assumed to go out as sidestream smoke during smoldering. The internal heat transport is characterized by effective thermal conductivities in each zone. Thermal conduction of cigarette paper and convective and radiative heat transfer at the outer surface were also considered. The governing partial differential equations were solved using an integral method. Model predictions of smoldering speed as well as temperature and density profiles in the pyrolysis zone for different kinds of cigarettes were found to agree with the experimental data. The model also predicts the coal length and the maximum coal temperatures during smoldering conditions. The model provides a relatively fast and efficient way to simulate the cigarette burning processes. It offers a practical tool for exploring important parameters for cigarette smoldering processes, such as tobacco components, properties of cigarette paper, and heat generation in the burning zone and its dependence on the mass burn rate.

  13. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this approach have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. We also study effective diffusion processes over surfaces due to random walks in the bulk, considering different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally, it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations. We find, in general, excellent agreement between theory and simulations.
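    A toy Monte Carlo version of the trapping problem in a fluctuating medium is sketched below: a walker on a one-dimensional lattice is absorbed at a trap site, while the medium randomly switches between two states that bias the step direction towards or away from the trap. The lattice size, switching rate and step biases are arbitrary illustrative choices, not the models analyzed in the work.

```python
import random

def mean_trapping_time(L=30, switch_rate=0.05, n_walkers=500, rng=random):
    """Monte Carlo sketch of trapping in a fluctuating medium: a walker on a 1-D
    lattice is absorbed at site 0; the medium randomly switches between two states
    that bias the step direction towards or away from the trap."""
    total_steps = 0
    for _ in range(n_walkers):
        x, steps, state = L // 2, 0, 0            # start mid-lattice, medium in state 0
        while x > 0:
            if rng.random() < switch_rate:        # dynamic disorder: the medium switches state
                state = 1 - state
            p_left = 0.6 if state == 0 else 0.4   # state-dependent step bias
            x += -1 if rng.random() < p_left else 1
            x = min(x, L)                         # reflecting boundary at the far end
            steps += 1
        total_steps += steps
    return total_steps / n_walkers

print(mean_trapping_time())
```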

  14. Internet User Behaviour Model Discovery Process

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The Academy of Economic Studies has more than 45000 students and about 5000 computers with Internet access which are connected to AES network. Students can access internet on these computers through a proxy server which stores information about the way the Internet is accessed. In this paper, we describe the process of discovering internet user behavior models by analyzing proxy server raw data and we emphasize the importance of such models for the e-learning environment.
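    A minimal sketch of the kind of preprocessing such a study begins with is given below: aggregating raw proxy log lines into per-user domain visit counts. The log format, field order and example URLs are assumptions made for illustration, not the actual AES proxy format.

```python
# Hedged sketch: build simple per-user browsing profiles from proxy log lines.
from collections import Counter, defaultdict
from urllib.parse import urlparse

def domain_counts_per_user(log_lines):
    """Each line is assumed to be: '<timestamp> <user_id> <url> <status> <bytes>'."""
    profiles = defaultdict(Counter)
    for line in log_lines:
        parts = line.split()
        if len(parts) < 3:
            continue                      # skip malformed lines
        user, url = parts[1], parts[2]
        profiles[user][urlparse(url).netloc] += 1
    return profiles

sample = [
    "1177000000 u001 http://elearning.example.ro/course/1 200 5120",
    "1177000042 u001 http://news.example.com/item 200 2048",
    "1177000050 u002 http://elearning.example.ro/quiz 200 1024",
]
for user, counts in domain_counts_per_user(sample).items():
    print(user, counts.most_common(2))
```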

  15. Process model development for optimization of forged disk manufacturing processes

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, C.E.; Gunasekera, J.S. [Ohio Univ., Athens, OH (United States). Center for Advanced Materials Processing; Malas, J.C. [Wright Labs., Wright Patterson AFB, OH (United States). Materials Directorate

    1997-12-31

    This paper addresses the development of a system which will enable the optimization of an entire processing sequence for a forged part. Typically such a sequence may involve several stages and alternative routes of manufacturing a given part. It is important that such a system be optimized globally (rather than locally, as is the current practice) in order to achieve improvements in affordability, producibility, and performance. This paper demonstrates the development of a simplified forging model, discusses techniques for searching and reducing a very large design space, and presents an objective function to evaluate the cost of a design sequence.
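    The sketch below illustrates, on an invented three-stage example, what globally optimising a processing sequence (rather than each stage in isolation) can look like: every combination of alternative routes is scored with a single objective function. Stage names, costs and the interaction penalty are all hypothetical.

```python
# Toy sketch of whole-sequence (global) optimisation over alternative routes.
from itertools import product

stages = {                                   # alternative options per stage (invented)
    "preform":  [("upset", 3.0), ("roll", 2.5)],
    "forge":    [("hammer", 5.0), ("press", 6.0)],
    "finish":   [("machine", 4.0), ("grind", 3.5)],
}

def sequence_cost(route):
    """Objective: sum of stage costs plus a penalty for an assumed bad pairing."""
    cost = sum(c for _, c in route)
    names = [n for n, _ in route]
    if names[0] == "roll" and names[1] == "hammer":
        cost += 2.0                          # illustrative interaction penalty
    return cost

best = min(product(*stages.values()), key=sequence_cost)
print("best route:", [n for n, _ in best], "cost:", sequence_cost(best))
```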

  16. Derivative processes for modelling metabolic fluxes

    Science.gov (United States)

    Žurauskienė, Justina; Kirk, Paul; Thorne, Thomas; Pinney, John; Stumpf, Michael

    2014-01-01

    Motivation: One of the challenging questions in modelling biological systems is to characterize the functional forms of the processes that control and orchestrate molecular and cellular phenotypes. Recently proposed methods for the analysis of metabolic pathways, for example, dynamic flux estimation, can only provide estimates of the underlying fluxes at discrete time points but fail to capture the complete temporal behaviour. To describe the dynamic variation of the fluxes, we additionally require the assumption of specific functional forms that can capture the temporal behaviour. However, it also remains unclear how to address the noise which might be present in experimentally measured metabolite concentrations. Results: Here we propose a novel approach to modelling metabolic fluxes: derivative processes that are based on multiple-output Gaussian processes (MGPs), which are a flexible non-parametric Bayesian modelling technique. The main advantages that follow from MGPs approach include the natural non-parametric representation of the fluxes and ability to impute the missing data in between the measurements. Our derivative process approach allows us to model changes in metabolite derivative concentrations and to characterize the temporal behaviour of metabolic fluxes from time course data. Because the derivative of a Gaussian process is itself a Gaussian process, we can readily link metabolite concentrations to metabolic fluxes and vice versa. Here we discuss how this can be implemented in an MGP framework and illustrate its application to simple models, including nitrogen metabolism in Escherichia coli. Availability and implementation: R code is available from the authors upon request. Contact: j.norkunaite@imperial.ac.uk; m.stumpf@imperial.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24578401
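    The key property used in the paper, that the derivative of a Gaussian process is again a Gaussian process, can be sketched in a few lines of numpy. The RBF kernel, hyperparameters and toy data below are assumptions for illustration and not the authors' MGP implementation.

```python
# Hedged sketch: predict a "flux" (derivative) from noisy concentration data
# using the cross-covariance between a GP and its derivative.
import numpy as np

def rbf(a, b, ell=1.0):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def d_rbf(a, b, ell=1.0):
    """Cross-covariance Cov(f'(a_i), f(b_j)) = d/da k(a, b) for the RBF kernel."""
    d = a[:, None] - b[None, :]
    return -(d / ell ** 2) * np.exp(-0.5 * (d / ell) ** 2)

# toy metabolite concentrations sampled from a known curve
t = np.linspace(0.0, 5.0, 25)
y = np.sin(t) + 0.01 * np.random.default_rng(0).normal(size=t.size)

sigma2, ell = 1e-4, 1.0
alpha = np.linalg.solve(rbf(t, t, ell) + sigma2 * np.eye(t.size), y)

t_star = np.linspace(0.0, 5.0, 9)
flux_mean = d_rbf(t_star, t, ell) @ alpha      # posterior mean of the derivative
print(np.round(flux_mean, 2))                  # ~cos(t_star) for this toy example
print(np.round(np.cos(t_star), 2))
```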

  17. SSH adequacy to preimplantation mammalian development: Scarce specific transcripts cloning despite irregular normalisation

    Directory of Open Access Journals (Sweden)

    Renard JP

    2005-11-01

    Full Text Available Abstract Background SSH has emerged as a widely used technology to identify genes that are differentially regulated between two biological situations. Because it includes a normalisation step, it is used for preference to clone low abundance differentially expressed transcripts. It does not require previous sequence knowledge and may start from PCR amplified cDNAs. It is thus particularly well suited to biological situations where specific genes are expressed and tiny amounts of RNA are available. This is the case during early mammalian embryo development. In this field, few differentially expressed genes have been characterized from SSH libraries, but an overall assessment of the quality of SSH libraries is still required. Because we are interested in the more systematic establishment of SSH libraries from early embryos, we have developed a simple and reliable strategy based on reporter transcript follow-up to check SSH library quality and repeatability when starting with small amounts of RNA. Results Four independent subtracted libraries were constructed. They aimed to analyze key events in the preimplantation development of rabbit and bovine embryos. The performance of the SSH procedure was assessed through the large-scale screening of thousands of clones from each library for exogenous reporter transcripts mimicking either tester specific or tester/driver common transcripts. Our results show that abundant transcripts escape normalisation which is only efficient for rare and moderately abundant transcripts. Sequencing 1600 clones from one of the libraries confirmed and extended our results to endogenous transcripts and demonstrated that some very abundant transcripts common to tester and driver escaped subtraction. Nonetheless, the four libraries were greatly enriched in clones encoding for very rare (0.0005% of mRNAs) tester-specific transcripts. Conclusion The close agreement between our hybridization and sequencing results shows that the

  18. SSH adequacy to preimplantation mammalian development: scarce specific transcripts cloning despite irregular normalisation.

    Science.gov (United States)

    Bui, L C; Léandri, R D; Renard, J P; Duranthon, V

    2005-11-08

    SSH has emerged as a widely used technology to identify genes that are differentially regulated between two biological situations. Because it includes a normalisation step, it is used for preference to clone low abundance differentially expressed transcripts. It does not require previous sequence knowledge and may start from PCR amplified cDNAs. It is thus particularly well suited to biological situations where specific genes are expressed and tiny amounts of RNA are available. This is the case during early mammalian embryo development. In this field, few differentially expressed genes have been characterized from SSH libraries, but an overall assessment of the quality of SSH libraries is still required. Because we are interested in the more systematic establishment of SSH libraries from early embryos, we have developed a simple and reliable strategy based on reporter transcript follow-up to check SSH library quality and repeatability when starting with small amounts of RNA. Four independent subtracted libraries were constructed. They aimed to analyze key events in the preimplantation development of rabbit and bovine embryos. The performance of the SSH procedure was assessed through the large-scale screening of thousands of clones from each library for exogenous reporter transcripts mimicking either tester specific or tester/driver common transcripts. Our results show that abundant transcripts escape normalisation which is only efficient for rare and moderately abundant transcripts. Sequencing 1600 clones from one of the libraries confirmed and extended our results to endogenous transcripts and demonstrated that some very abundant transcripts common to tester and driver escaped subtraction. Nonetheless, the four libraries were greatly enriched in clones encoding for very rare (0.0005% of mRNAs) tester-specific transcripts. The close agreement between our hybridization and sequencing results shows that the addition and follow-up of exogenous reporter transcripts

  19. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  20. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
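    SCAN and ESACF are SAS procedures, but the same identification task can be approximated in Python: choose the order of differencing with a unit-root test and then select the ARMA orders by an information criterion. The sketch below, using statsmodels on an invented integrated series, is an analogue rather than a reimplementation of SCAN/ESACF.

```python
# Hedged analogue of ARIMA model identification for an integrated process.
import numpy as np
from statsmodels.tsa.stattools import adfuller, arma_order_select_ic

rng = np.random.default_rng(42)
y = np.cumsum(rng.normal(size=400) + 0.1)      # toy integrated (I(1)) series

d = 0
series = y.copy()
while adfuller(series)[1] > 0.05 and d < 2:    # difference until stationary
    series = np.diff(series)
    d += 1

sel = arma_order_select_ic(series, max_ar=3, max_ma=3, ic="bic")
p, q = sel.bic_min_order
print(f"identified ARIMA({p},{d},{q})")
```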

  1. Modeling as a Decision-Making Process

    Science.gov (United States)

    Bleiler-Baxter, Sarah K.; Stephens, D. Christopher; Baxter, Wesley A.; Barlow, Angela T.

    2017-01-01

    The goal in this article is to support teachers in better understanding what it means to model with mathematics by focusing on three key decision-making processes: Simplification, Relationship Mapping, and Situation Analysis. The authors use the Theme Park task to help teachers develop a vision of how students engage in these three decision-making…

  2. Kinetics and modeling of anaerobic digestion process

    DEFF Research Database (Denmark)

    Gavala, Hariklia N.; Angelidaki, Irini; Ahring, Birgitte Kiær

    2003-01-01

    Anaerobic digestion modeling started in the early 1970s when the need for design and efficient operation of anaerobic systems became evident. At that time not only was the knowledge about the complex process of anaerobic digestion inadequate but also there were computational limitations. Thus...

  3. Querying Business Process Models with VMQL

    DEFF Research Database (Denmark)

    Störrle, Harald; Acretoaie, Vlad

    2013-01-01

In this paper, we apply VMQL to the Business Process Modeling Notation (BPMN) to evaluate the second claim. We explore the adaptations required, and re-evaluate the usability of VMQL in this context. We find similar results to earlier work, thus both supporting our claims and establishing the usability of VMQL...

  4. Numerical modeling and simulation in various processes

    Directory of Open Access Journals (Sweden)

    Eliza Consuela ISBĂŞOIU

    2011-12-01

    Economic modeling offers the manager rigour in his actions and multiple ways of connecting existing resources with the objectives pursued over a certain period of time, enabling better and faster thinking and decision-making without distorting reality.

  5. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  6. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the Aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model that would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  7. Modeling of the mechanical alloying process

    Science.gov (United States)

    Maurice, D.; Courtney, T. H.

    1992-01-01

    Two programs have been developed to compute the dimensional and property changes that occur with repetitive impacts during the mechanical alloying process. The more sophisticated of the programs also maintains a running count of the fractions of particles present and from this calculates a population distribution. The programs predict powder particle size and shape changes in accord with the accepted stages of powder development during mechanical alloying of ductile species. They also predict hardness and lamellar thickness changes with processing, again with reasonable agreement with experimental results. These predictions offer support of the model (and thereby give insight into the possible 'actual' happenings of mechanical alloying) and hence allow refinement and calibration of the myriad aspects of the model. They also provide a vehicle for establishing control over the dimensions and properties of the output powders used for consolidation, thereby facilitating optimization of the consolidation process.

  8. Managing risks in business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2010-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start thinking more radically when considering their business models. However, despite the understanding that business model (BM) innovation is a risky enterprise, many companies are still choosing not to apply any risk management in the BM innovation process. The objective of this paper is to develop a better understanding of how risks are handled in the practice of BM innovation. An analysis of the BM innovation experiences of two industrial companies shows that both companies are experiencing high levels of uncertainty and complexity during their innovation processes and are, consequently, struggling to find new processes for handling the risks involved. Based on the two companies’ experiences, various testable propositions are put forward, which link success and failure to the way companies appreciate and handle the risks involved in BM innovation.

  9. Statistical model for high energy inclusive processes

    International Nuclear Information System (INIS)

    Pomorisac, B.

    1980-01-01

    We propose a statistical model of inclusive processes. The model is an extension of the model proposed by Salapino and Sugar for the inclusive distributions in rapidity. The model is defined in terms of a random variable on the full phase space of the produced particles and in terms of a Lorentz-invariant probability distribution. We suggest that the Lorentz invariance is broken spontaneously; this may describe the observed anisotropy of the inclusive distributions. Based on this model we calculate the distribution in transverse momentum. An explicit calculation is given of the one-particle inclusive cross sections and the two-particle correlation. The results give a fair representation of the shape of one-particle inclusive cross sections, and positive correlation for the particles emitted. The relevance of our results to experiments is discussed.

  10. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4) presents the most important aspects of solidification theory related to modelling. Part III (Chapter 5) describes the fluid flow phenomena and in part IV (Chapter 6) the stress-strain analysis is addressed. For all parts, both numerical formulations as well as some important analytical solutions

  11. Temperature Modelling of the Biomass Pretreatment Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Blanke, Mogens; Jensen, Jakob M.

    2012-01-01

    In a second generation biorefinery, the biomass pretreatment stage has an important contribution to the efficiency of the downstream processing units involved in biofuel production. Most of the pretreatment process occurs in a large pressurized thermal reactor that presents an irregular temperature distribution. Therefore, an accurate temperature model is critical for observing the biomass pretreatment. More than that, the biomass is also pushed with a constant horizontal speed along the reactor in order to ensure a continuous throughput. The goal of this paper is to derive a temperature model that captures the environmental temperature differences inside the reactor using distributed parameters. A Kalman filter is then added to account for any missing dynamics and the overall model is embedded into a temperature soft sensor. The operator of the plant will be able to observe the temperature in any...
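    A minimal scalar Kalman filter in the spirit of the soft sensor described above is sketched below; the random-walk process model, noise variances and the toy heating ramp are assumptions, not the paper's distributed-parameter model.

```python
# Hedged sketch: a scalar Kalman-filter "soft sensor" for reactor temperature.
import numpy as np

def kalman_soft_sensor(measurements, q=0.05, r=4.0, x0=150.0, p0=10.0):
    """q: process-noise variance, r: measurement-noise variance (both assumed)."""
    x, p, estimates = x0, p0, []
    for z in measurements:
        p += q                        # predict (random-walk temperature model)
        k = p / (p + r)               # Kalman gain
        x += k * (z - x)              # update with the noisy thermocouple reading
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

rng = np.random.default_rng(3)
true_temp = 150.0 + np.linspace(0.0, 15.0, 60)          # slow heating ramp
noisy = true_temp + rng.normal(scale=2.0, size=60)
print(np.round(kalman_soft_sensor(noisy)[-5:], 1))
```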

  12. Modeling veterans healthcare administration disclosure processes :

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  13. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
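    For reference, the Nash-Sutcliffe efficiency mentioned in point (1) is straightforward to compute; the short sketch below uses invented observed and simulated series purely to illustrate the statistic.

```python
# Nash-Sutcliffe efficiency: 1 means a perfect fit, 0 means no better than the mean.
import numpy as np

def nash_sutcliffe(observed, simulated):
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = [0.12, 0.30, 0.55, 0.41, 0.22, 0.18]      # invented daily concentrations
sim = [0.15, 0.28, 0.45, 0.46, 0.25, 0.20]      # invented model output
print(round(nash_sutcliffe(obs, sim), 3))
```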

  14. The normalisation of terror: the response of Israel's stock market to long periods of terrorism.

    Science.gov (United States)

    Peleg, Kobi; Regens, James L; Gunter, James T; Jaffe, Dena H

    2011-01-01

    Man-made disasters such as acts of terrorism may affect a society's resiliency and sensitivity to prolonged physical and psychological stress. The Israeli Tel Aviv stock market TA-100 Index was used as an indicator of reactivity to suicide terror bombings. After accounting for factors such as world market changes and attack severity and intensity, the analysis reveals that although Israel's financial base remained sensitive to each act of terror across the entire period of the Second Intifada (2000-06), sustained psychological resilience was indicated with no apparent overall market shift. In other words, we saw a 'normalisation of terror' following an extended period of continued suicide bombings. The results suggest that investors responded to less transitory global market forces, indicating sustained resilience and long-term market confidence. Future studies directly measuring investor expectations and reactions to man-made disasters, such as terrorism, are warranted. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  15. Living under the influence: normalisation of alcohol consumption in our cities.

    Science.gov (United States)

    Sureda, Xisca; Villalbí, Joan R; Espelt, Albert; Franco, Manuel

    Harmful use of alcohol is one of the world's leading health risks. A positive association between certain characteristics of the urban environment and individual alcohol consumption has been documented in previous research. When developing a tool characterising the urban environment of alcohol in the cities of Barcelona and Madrid we observed that alcohol is ever present in our cities. Urban residents are constantly exposed to a wide variety of alcohol products, marketing and promotion and signs of alcohol consumption. In this field note, we reflect on the normalisation of alcohol in urban environments. We highlight the need for further research to better understand attitudes and practices in relation to alcohol consumption. This type of urban study is necessary to support policy interventions to prevent and control harmful alcohol use. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  16. Civil Inattention in Public Places: Normalising Unusual Events through Mobile and Embodied Practices

    Directory of Open Access Journals (Sweden)

    Pentti Haddington

    2012-09-01

    Full Text Available This article builds on GOFFMAN's work to study how pedestrians display their orientation to unusual events in public places. It focuses on the mobile and embodied conduct of those passing a smartmob event in which a performing group "froze" in a busy transit hub for four minutes. The data comprise audio-video recordings of the event. We identify and analyse routinised mobile and embodied practices by which passers-by "normalise" the unusual event. These include different organisations of body behaviour and the ways in which passers-by walk around and between the performers as individuals and groups. The findings are supported with illustrations. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120375

  17. Modelling Of Manufacturing Processes With Membranes

    Science.gov (United States)

    Crăciunean, Daniel Cristian; Crăciunean, Vasile

    2015-07-01

    The current objectives to increase the standards of quality and efficiency in manufacturing processes can be achieved only through the best combination of inputs, independent of the spatial distance between them. This paper proposes modelling production processes based on membrane structures introduced in [4]. Inspired by biochemistry, membrane computation [4] is based on the concept of membrane, represented in its formalism by the mathematical concept of multiset. The manufacturing process is the evolution of a super cell system from its initial state according to the given actions of aggregation. In this paper we consider that the atomic production unit of the process is the action. The actions and the resources on which the actions are produced are distributed in a virtual network of companies working together. The destination of the output resources is specified by corresponding output events.
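    A toy multiset-rewriting step in the spirit of membrane computation is sketched below: a rule fires only when the membrane's multiset contains its reactants. The resources and the aggregation rule are invented for illustration.

```python
# Illustrative multiset rewriting with Counter; not the formalism of [4] itself.
from collections import Counter

def apply_rule(state, reactants, products):
    """Fire a rule once if the membrane's multiset contains the reactants."""
    if all(state[r] >= n for r, n in reactants.items()):
        state = state.copy()
        state.subtract(reactants)   # consume reactants
        state.update(products)      # create products
    return +state                   # drop zero/negative counts

membrane = Counter({"sheet_metal": 3, "bolt": 8, "operator": 1})
rule = ({"sheet_metal": 1, "bolt": 4, "operator": 1},   # reactants (invented)
        {"assembly": 1, "operator": 1})                  # products (invented)
print(apply_rule(membrane, *rule))
```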

  18. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from the on-line transaction processing (OLTP) issues to the on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of the real-time on-line transaction processing (which has already exceeded a level of up to 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying the Relational theory in the form of Entity-Relation model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as Relational theory provided for the transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them are pointing to the need for simplifying the highly non-intuitive mathematical constraints found

  19. Mathematical modeling of the flash converting process

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H.Y.; Perez-Tello, M.; Riihilahti, K.M. [Utah Univ., Salt Lake City, UT (United States)

    1996-12-31

    An axisymmetric mathematical model for the Kennecott-Outokumpu flash converting process for converting solid copper matte to copper is presented. The model is an adaptation of the comprehensive mathematical model formerly developed at the University of Utah for the flash smelting of copper concentrates. The model incorporates the transport of momentum, heat, mass, and reaction kinetics between gas and particles in a particle-laden turbulent gas jet. The standard k-ε model is used to describe gas-phase turbulence in an Eulerian framework. The particle phase is treated from a Lagrangian viewpoint which is coupled to the gas phase via the source terms in the Eulerian gas-phase governing equations. Matte particles were represented as Cu₂S·yFeS, and assumed to undergo homogeneous oxidation to Cu₂O, Fe₃O₄, and SO₂. A reaction kinetics mechanism involving both external mass transfer of oxygen gas to the particle surface and diffusion of oxygen through the porous oxide layer is proposed to estimate the particle oxidation rate. Predictions of the mathematical model were compared with the experimental data collected in a bench-scale flash converting facility. Good agreement between the model predictions and the measurements was obtained. The model was used to study the effect of different gas-injection configurations on the overall fluid dynamics in a commercial size flash converting shaft. (author)

  20. Modelling and control of crystallization process

    Directory of Open Access Journals (Sweden)

    S.K. Jha

    2017-03-01

    Full Text Available Batch crystallizers are predominantly used in chemical industries like pharmaceuticals, food industries and specialty chemicals. The nonlinear nature of the batch process leads to difficulties when the objective is to obtain a uniform Crystal Size Distribution (CSD). In this study, a linear PI controller is designed using classical controller tuning methods for controlling the crystallizer outlet temperature by manipulating the inlet jacket temperature; however, the response is not satisfactory. A simple PID controller cannot guarantee a satisfactory response, which is why an optimal controller is designed to keep the concentration and temperature in a range that suits our needs. Any typical process operation has constraints on states, inputs and outputs. So, a nonlinear process needs to be operated satisfying the constraints. Hence, a nonlinear controller like the Generic Model Controller (GMC), which is similar in structure to the PI controller, is implemented. It minimizes the derivative of the squared error, thus improving the output response of the process. Minimization of crystal size variation is considered as an objective function in this study. Model predictive control is also designed, which uses an advanced optimization algorithm to minimize the error while linearizing the process. Constraints are fed into the MPC toolbox in MATLAB, and the prediction and control horizons and performance weights are tuned using the Sridhar and Cooper method. Performances of all three controllers (PID, GMC and MPC) are compared and it is found that MPC is the superior one in terms of settling time and percentage overshoot.
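    The linear PI(D) baseline discussed above can be sketched on a toy first-order temperature model as follows; the plant parameters and controller gains are illustrative assumptions, not the tuning used in the study.

```python
# Hedged sketch: discrete PID loop driving a toy first-order temperature plant.
def simulate_pid(setpoint=30.0, kp=2.0, ki=0.4, kd=0.0, dt=0.1, steps=600):
    temp, integral, prev_err, history = 45.0, 0.0, None, []
    tau, gain = 8.0, 0.5          # assumed plant: dT/dt = (gain*u - (T - T_amb)) / tau
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv      # jacket temperature adjustment
        prev_err = err
        temp += dt * (gain * u - (temp - 25.0)) / tau  # ambient taken as 25 °C
        history.append(temp)
    return history

print(round(simulate_pid()[-1], 2))   # should settle near the 30 °C setpoint
```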

  1. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated...... of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling....

  2. Theoretical modelling of carbon deposition processes

    International Nuclear Information System (INIS)

    Marsh, G.R.; Norfolk, D.J.; Skinner, R.F.

    1985-01-01

    Work based on capsule experiments in the BNL Gamma Facility, aimed at elucidating the chemistry involved in the formation of carbonaceous deposits on CAGR fuel pin surfaces, is described. Using a data-base derived from capsule experiments together with literature values for the kinetics of the fundamental reactions, a chemical model of the gas-phase processes has been developed. This model successfully reproduces the capsule results, whilst preliminary application to the WAGR coolant circuit indicates the likely concentration profiles of various radical species within the fuel channels. (author)

  3. Theoretical Modelling of Intercultural Communication Process

    Directory of Open Access Journals (Sweden)

    Mariia Soter

    2016-08-01

    Full Text Available The definitions of the concepts of “communication”, “intercultural communication” and “model of communication” are analyzed in the article. The basic components of the communication process are singled out. The model of intercultural communication is developed. Communicative, behavioral and complex skills for optimal organization of intercultural communication, establishment of productive contact with a foreign partner to achieve mutual understanding, searching for acceptable ways of organizing interaction and cooperation for both communicants are highlighted in the article. It is noted that intercultural communication, through interaction between people, affects the development of different aspects of cultures.

  4. Empirical process modeling in fast breeder reactors

    International Nuclear Information System (INIS)

    Ikonomopoulos, A.; Endou, A.

    1998-01-01

    A non-linear multi-input/single output (MISO) empirical model is introduced for monitoring vital system parameters in a nuclear reactor environment. The proposed methodology employs a scheme of non-parametric smoothing that models the local dynamics of each fitting point individually, as opposed to global modeling techniques--such as multi-layer perceptrons (MLPs)--that attempt to capture the dynamics of the entire design space. The motivation for employing local models in monitoring arises from one's desire to capture localized idiosyncrasies of the dynamic system utilizing independent estimators. This approach alleviates the effect of negative interference between old and new observations, enhancing the model prediction capabilities. Modeling the behavior of any given system comes down to a trade-off between variance and bias. The building blocks of the proposed approach are tailored to each data set through two separate, adaptive procedures in order to optimize the bias-variance reconciliation. Hetero-associative schemes of the technique presented exhibit insensitivity to sensor noise and provide the operator with accurate predictions of the actual process signals. A comparison between the local model and MLP prediction capabilities is performed and the results appear in favor of the first method. The data used to demonstrate the potential of local regression have been obtained during two startup periods of the Monju fast breeder reactor (FBR).
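    The local-modelling idea described above can be illustrated with a kernel-weighted (locally linear) regression, in which each query point gets its own weighted least-squares fit; the bandwidth and the toy sensor signal below are assumptions and this is not the paper's estimator.

```python
# Hedged sketch of local (kernel-weighted) linear regression.
import numpy as np

def local_linear_predict(x_train, y_train, x_query, bandwidth=0.3):
    preds = []
    for x0 in x_query:
        w = np.exp(-0.5 * ((x_train - x0) / bandwidth) ** 2)   # local weights
        X = np.column_stack([np.ones_like(x_train), x_train - x0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_train)  # weighted LS per query point
        preds.append(beta[0])                                   # intercept = local fit at x0
    return np.array(preds)

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 6, 120))
y = np.sin(x) + 0.1 * rng.normal(size=x.size)                   # toy sensor signal
xq = np.array([1.0, 2.5, 4.0])
print(np.round(local_linear_predict(x, y, xq), 2), np.round(np.sin(xq), 2))
```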

  5. Modeling of processes in the tourism sector

    Directory of Open Access Journals (Sweden)

    Salamatina Victoriya, S.

    2015-06-01

    Full Text Available In modern conditions, tourism is becoming a budget-forming sector for a number of Russian regions. In this regard, it is of interest to simulate the processes occurring in the tourism business, because they are affected by many random parameters arising from various economic, political, geographic, and other aspects. To improve and develop systems for managing the tourism business, economic-mathematical methods are being systematically embedded in this area, because increased competitiveness requires continuous and constructive change. The results of applying this economic-mathematical apparatus allow the applicability of further processes in tourism to be analyzed and evaluated more systematically and coherently. A feature of some economic processes typical of tourist activities is that the effect of a factor on the indicators of the process appears not immediately but gradually, after a certain time, i.e. with a lag. This delay has to be accounted for when developing mathematical models of tourism business processes. In such cases, it is advisable to apply to the simulation of these processes the economic-mathematical formalism of optimal control, known as game theory.

  6. Aqueous Electrolytes: Model Parameters and Process Simulation

    DEFF Research Database (Denmark)

    Thomsen, Kaj

    This thesis deals with aqueous electrolyte mixtures. The Extended UNIQUAC model is being used to describe the excess Gibbs energy of such solutions. Extended UNIQUAC parameters for the twelve ions Na+, K+, NH4+, H+, Cl-, NO3-, SO42-, HSO4-, OH-, CO32-, HCO3-, and S2O82- are estimated. A computer program including a steady state process simulator for the design, simulation, and optimization of fractional crystallization processes is presented.

  7. Markov State Model of Ion Assembling Process.

    Science.gov (United States)

    Shevchuk, Roman

    2016-05-12

    We study the process of ion assembling in aqueous solution by means of molecular dynamics. In this article, we present a method to study many-particle assembly using the Markov state model formalism. We observed that at NaCl concentrations higher than 1.49 mol/kg, the system tends to form a big ionic cluster composed of roughly 70-90% of the total number of ions. Using Markov state models, we estimated the average time needed for the system to make a transition from a disordered state to a state with a big ionic cluster. Our results suggest that the characteristic time to form an ionic cluster is a negative exponential function of the salt concentration. Moreover, we defined and analyzed three different kinetic states of a single ion particle. These states correspond to different particle locations during the nucleation process.
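    Once a transition matrix between coarse states has been estimated, the mean first-passage time to the clustered state follows from a small linear system. The three-state matrix below is invented for illustration and is not the matrix estimated in the article.

```python
# Hedged sketch: mean first-passage time (in lag times) from a Markov state model.
import numpy as np

# states: 0 = disordered, 1 = small clusters, 2 = big ionic cluster (target)
T = np.array([[0.90, 0.09, 0.01],
              [0.20, 0.70, 0.10],
              [0.02, 0.08, 0.90]])

# mean first-passage times to state 2 satisfy  t_i = 1 + sum_{j != 2} T_ij * t_j
non_target = [0, 1]
A = np.eye(2) - T[np.ix_(non_target, non_target)]
mfpt = np.linalg.solve(A, np.ones(2))
print("lag times from disordered state to cluster:", round(mfpt[0], 1))
```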

  8. Environmental Modeling Framework using Stacked Gaussian Processes

    OpenAIRE

    Abdelfatah, Kareem; Bao, Junshu; Terejanu, Gabriel

    2016-01-01

    A network of independently trained Gaussian processes (StackedGP) is introduced to obtain predictions of quantities of interest with quantified uncertainties. The main applications of the StackedGP framework are to integrate different datasets through model composition, enhance predictions of quantities of interest through a cascade of intermediate predictions, and to propagate uncertainties through emulated dynamical systems driven by uncertain forcing variables. By using analytical first an...

  9. Deep inelastic processes and the parton model

    International Nuclear Information System (INIS)

    Altarelli, G.

    The lecture was intended as an elementary introduction to the physics of deep inelastic phenomena from the point of view of theory. General formulae and facts concerning inclusive deep inelastic processes in the form: l+N→l'+hadrons (electroproduction, neutrino scattering) are first recalled. The deep inelastic annihilation e⁺e⁻ → hadrons is then envisaged. The light cone approach, the parton model and their relation are mainly emphasized.

  10. Process Modeling With Inhomogeneous Thin Films

    Science.gov (United States)

    Machorro, R.; Macleod, H. A.; Jacobson, M. R.

    1986-12-01

    Designers of optical multilayer coatings commonly assume that the individual layers will be ideally homogeneous and isotropic. In practice, it is very difficult to control the conditions involved in the complex evaporation process sufficiently to produce such ideal films. Clearly, changes in process parameters, such as evaporation rate, chamber pressure, and substrate temperature, affect the microstructure of the growing film, frequently producing inhomogeneity in structure or composition. In many cases, these effects are interdependent, further complicating the situation. However, this process can be simulated on powerful, interactive, and accessible microcomputers. In this work, we present such a model and apply it to estimate the influence of an inhomogeneous layer on multilayer performance. Presently, the program simulates film growth, thermal expansion and contraction, and thickness monitoring procedures, and includes the effects of uncertainty in these parameters or noise. Although the model is being developed to cover very general cases, we restrict the present discussion to isotropic and nondispersive quarterwave layers to understand the particular effects of inhomogeneity. We studied several coating designs and related results and tolerances to variations in evaporation conditions. The model is composed of several modular subprograms, is written in Fortran, and is executed on an IBM-PC with 640 K of memory. The results can be presented in graphic form on a monochrome monitor. We are currently installing and implementing color capability to improve the clarity of the multidimensional output.

  11. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system potentially will change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect distribution of water, increase kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers as well as potentially changing the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  12. MODELLING OF POSTSEISMIC PROCESSES IN SUBDUCTION ZONES

    Directory of Open Access Journals (Sweden)

    Irina S. Vladimirova

    2012-01-01

    Full Text Available Large intraplate subduction earthquakes are generally accompanied by prolonged and intense postseismic anomalies. In the present work, viscoelastic relaxation in the upper mantle and the asthenosphere is considered as a main mechanism responsible for the occurrence of such postseismic effects. The study of transient processes is performed on the basis of data on postseismic processes accompanying the first Simushir earthquake on 15 November 2006 and the Maule earthquake on 27 February 2010. The methodology of modelling a viscoelastic relaxation process after a large intraplate subduction earthquake is presented. A priori parameters of the selected model describing observed postseismic effects are adjusted by minimizing deviations between modeled surface displacements and actual surface displacements recorded by geodetic methods through solving corresponding inverse problems. The presented methodology yielded estimations of Maxwell's viscosity of the asthenosphere of the central Kuril Arc and also of central Chile. Besides, postseismic slip distribution patterns were obtained for the focus of the Simushir earthquake of 15 November 2006 (Mw=8.3) (Figure 3), and distribution patterns of seismic and postseismic slip were determined for the focus of the Maule earthquake of 27 February 2010 (Mw=8.8) (Figure 6). These estimations and patterns can provide for prediction of the intensity of viscoelastic stress attenuation in the asthenosphere; anomalous values should be taken into account as adjustment factors when analyzing inter-seismic deformation in order to ensure correct estimation of the accumulated elastic seismogenic potential.

  13. Évolution de la normalisation dans le domaine des oléagineux et des corps gras

    Directory of Open Access Journals (Sweden)

    Quinsac Alain

    2003-07-01

    Full Text Available Standardisation plays a major role in economic exchanges by contributing to the openness and transparency of markets. The oilseed and fats sector has long integrated standardisation into its strategy. Drawn up on the basis of the profession's needs, particularly at the level of the customer-supplier relationship, the programmes have mainly concerned sampling and analysis. In recent years, major changes in the socio-economic and regulatory context (non-food uses, food safety, quality assurance) have broadened the scope of standardisation. The normative approach adopted in the case of biodiesels and the detection of GMOs in oilseeds is explained. The consequences of the evolution of standardisation and the stakes for the oilseed profession in the future are discussed.

  14. Multimodal Similarity Gaussian Process Latent Variable Model.

    Science.gov (United States)

    Song, Guoli; Wang, Shuhui; Huang, Qingming; Tian, Qi

    2017-09-01

    Data from real applications involve multiple modalities representing content with the same semantics from complementary aspects. However, relations among heterogeneous modalities are simply treated as observation-to-fit by existing work, and the parameterized modality specific mapping functions lack flexibility in directly adapting to the content divergence and semantic complicacy in multimodal data. In this paper, we build our work based on the Gaussian process latent variable model (GPLVM) to learn the non-parametric mapping functions and transform heterogeneous modalities into a shared latent space. We propose multimodal Similarity Gaussian Process latent variable model (m-SimGP), which learns the mapping functions between the intra-modal similarities and latent representation. We further propose multimodal distance-preserved similarity GPLVM (m-DSimGP) to preserve the intra-modal global similarity structure, and multimodal regularized similarity GPLVM (m-RSimGP) by encouraging similar/dissimilar points to be similar/dissimilar in the latent space. We propose m-DRSimGP, which combines the distance preservation in m-DSimGP and semantic preservation in m-RSimGP to learn the latent representation. The overall objective functions of the four models are solved by simple and scalable gradient decent techniques. They can be applied to various tasks to discover the nonlinear correlations and to obtain the comparable low-dimensional representation for heterogeneous modalities. On five widely used real-world data sets, our approaches outperform existing models on cross-modal content retrieval and multimodal classification.

  15. Spherical Process Models for Global Spatial Statistics

    KAUST Repository

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.

  16. Modeling Dynamic Regulatory Processes in Stroke

    Science.gov (United States)

    McDermott, Jason E.; Jarman, Kenneth; Taylor, Ronald; Lancaster, Mary; Shankaran, Harish; Vartanian, Keri B.; Stevens, Susan L.; Stenzel-Poore, Mary P.; Sanfilippo, Antonio

    2012-01-01

    The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms. PMID:23071432
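    The modelling step described above can be sketched as a small system of coupled linear ODEs in which each functional cluster is driven by the others through an influence matrix; the matrix, the initial expression state and the 24-hour horizon below are assumptions for illustration, not the fitted model.

```python
# Hedged sketch: simulate coupled "functional cluster" expression from an initial state.
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-0.5,  0.3,  0.0],      # assumed regulatory influence matrix
              [ 0.2, -0.4,  0.1],
              [ 0.0,  0.6, -0.8]])

def cluster_dynamics(t, x):
    return A @ x                        # dx/dt = A x

x0 = np.array([1.0, 0.2, 0.0])          # chosen initial expression state
sol = solve_ivp(cluster_dynamics, (0.0, 24.0), x0, t_eval=np.linspace(0, 24, 7))
print(np.round(sol.y.T, 3))              # cluster expression over 24 h
```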

  17. Modeling dynamic regulatory processes in stroke.

    Directory of Open Access Journals (Sweden)

    Jason E McDermott

    Full Text Available The ability to examine the behavior of biological systems in silico has the potential to greatly accelerate the pace of discovery in diseases, such as stroke, where in vivo analysis is time intensive and costly. In this paper we describe an approach for in silico examination of responses of the blood transcriptome to neuroprotective agents and subsequent stroke through the development of dynamic models of the regulatory processes observed in the experimental gene expression data. First, we identified functional gene clusters from these data. Next, we derived ordinary differential equations (ODEs) from the data relating these functional clusters to each other in terms of their regulatory influence on one another. Dynamic models were developed by coupling these ODEs into a model that simulates the expression of regulated functional clusters. By changing the magnitude of gene expression in the initial input state it was possible to assess the behavior of the networks through time under varying conditions since the dynamic model only requires an initial starting state, and does not require measurement of regulatory influences at each time point in order to make accurate predictions. We discuss the implications of our models on neuroprotection in stroke, explore the limitations of the approach, and report that an optimized dynamic model can provide accurate predictions of overall system behavior under several different neuroprotective paradigms.

  18. GREENSCOPE: A Method for Modeling Chemical Process ...

    Science.gov (United States)

    Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Efficiency, and Energy, can evaluate processes with over a hundred different indicators. These indicators provide a means for realizing the principles of green chemistry and green engineering in the context of sustainability. Development of the methodology has centered around three focal points. One is a taxonomy of impacts that describe the indicators and provide absolute scales for their evaluation. The setting of best and worst limits for the indicators allows the user to know the status of the process under study in relation to understood values. Thus, existing or imagined processes can be evaluated according to their relative indicator scores, and process modifications can strive towards realizable targets. A second area of focus is in advancing definitions of data needs for the many indicators of the taxonomy. Each of the indicators has specific data that is necessary for their calculation. Values needed and data sources have been identified. These needs can be mapped according to the information source (e.g., input stream, output stream, external data, etc.) for each of the bases. The user can visualize data-indicator relationships on the way to choosing selected ones for evaluation.

  19. Discovering Reference Process Models by Mining Process Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAIS) has emerged, which allows for dynamic process and service changes (e.g., to insert, delete, and move activities and service executions in a running process). This, in turn, has led to a large number of process variants

  20. Multifunctional multiscale composites: Processing, modeling and characterization

    Science.gov (United States)

    Qiu, Jingjing

    Carbon nanotubes (CNTs) demonstrate extraordinary properties and show great promise in enhancing out-of-plane properties of traditional polymer/fiber composites and enabling functionality. However, current manufacturing challenges hinder the realization of their potential. In the dissertation research, both experimental and computational efforts have been conducted to investigate effective manufacturing techniques of CNT integrated multiscale composites. The fabricated composites demonstrated significant improvements in physical properties, such as tensile strength, tensile modulus, inter-laminar shear strength, thermal dimension stability and electrical conductivity. Such multiscale composites were truly multifunctional with the addition of CNTs. Furthermore, a novel hierarchical multiscale modeling method was developed in this research. Molecular dynamics (MD) simulation offered a reasonable explanation of CNT dispersion and motion in polymer solution. Bi-mode finite-extensible-nonlinear-elastic (FENE) dumbbell simulation was used to analyze the influence of CNT length distribution on the stress tensor and shear-rate-dependent viscosity. Based on the simulated viscosity profile and empirical equations from experiments, a macroscale flow simulation model based on the finite element method (FEM) was developed and validated to predict resin flow behavior in the processing of CNT-enhanced multiscale composites. The proposed multiscale modeling method provided a comprehensive understanding of micro/nano flow in both atomistic detail and at the mesoscale. The simulation model can be used to optimize process design and control of the mold-filling process in multiscale composite manufacturing. This research provided systematic investigations into the CNT-based multiscale composites. The results from this study may be used to leverage the benefits of CNTs and open up new application opportunities for high-performance multifunctional multiscale composites. Keywords. Carbon

  1. Innovative process engineering: a generic model of the innovation process

    OpenAIRE

    Pénide, Thomas; Gourc, Didier; Pingaud, Hervé; Peillon, Philippe

    2013-01-01

    International audience; Innovation can be represented as a knowledge transformation process perceived with different levels of granularity. The milestones of this process allow assessment of each of its steps and set up feedback loops that will be highlighted. This innovation process is a good starting point to understand innovation and then to manage it. Best practices being patterns of processes, we describe innovation best practices as compulsory steps in our innovation process. To put into p...

  2. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is

  3. Non-parametric Bayesian inference for inhomogeneous Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper; Johansen, Per Michael

    is a shot noise process, and the interaction function for a pair of points depends only on the distance between the two points and is a piecewise linear function modelled by a marked Poisson process. Simulation of the resulting posterior using a Metropolis-Hastings algorithm in the "conventional" way...... involves evaluating ratios of unknown normalising constants. We avoid this problem by applying a new auxiliary variable technique introduced by Møller, Pettitt, Reeves & Berthelsen (2006). In the present setting the auxiliary variable used is an example of a partially ordered Markov point process model....

  4. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.

  5. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation on the basis of Stokes' law has been done for the wet sizing process on cylinder equipment of laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - Residence time distribution function of emulsion particles in the separating zone of the equipment depending on flow-rate, height, diameter and structure of the equipment. - Size-distribution function in the fine and coarse parts depending on the residence time distribution function of emulsion particles, characteristics of the material being processed, such as specific density, shapes, and characteristics of the environment of classification, such as specific density, viscosity. - Experimental model was developed on data collected from an experimental cylindrical equipment with diameter x height of sedimentation chamber equal to 50 x 40 cm for an emulsion of zirconium silicate in water. - Using this experimental model allows determination of the optimal flow-rate in order to obtain product with desired grain size in terms of average size or size distribution function. (author)
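    As a hedged illustration of the Stokes'-law basis mentioned above (parameter values are assumptions for the example, not taken from the study), the terminal settling velocity of a small spherical particle governs whether it reports to the fine or coarse fraction for a given residence time:

```python
# Stokes'-law terminal settling velocity of a small sphere in a viscous fluid.
def stokes_settling_velocity(d_particle, rho_particle, rho_fluid, mu_fluid, g=9.81):
    """Terminal velocity (m/s) of a sphere of diameter d_particle (m) by Stokes' law."""
    return (rho_particle - rho_fluid) * g * d_particle**2 / (18.0 * mu_fluid)

# Example: ~10 micrometre zirconium silicate particle (density ~4600 kg/m3) in water.
v = stokes_settling_velocity(d_particle=10e-6, rho_particle=4600.0,
                             rho_fluid=1000.0, mu_fluid=1.0e-3)
print(f"settling velocity: {v:.2e} m/s")
```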

  6. Process-Based Modeling of Constructed Wetlands

    Science.gov (United States)

    Baechler, S.; Brovelli, A.; Rossi, L.; Barry, D. A.

    2007-12-01

    Constructed wetlands (CWs) are widespread facilities for wastewater treatment. In subsurface flow wetlands, contaminated wastewater flows through a porous matrix, where oxidation and detoxification phenomena occur. Despite the large number of working CWs, system design and optimization are still mainly based upon empirical equations or simplified first-order kinetics. This results from an incomplete understanding of the system functioning, and may in turn hinder the performance and effectiveness of the treatment process. As a result, CWs are often considered not suitable to meet high water quality-standards, or to treat water contaminated with recalcitrant anthropogenic contaminants. To date, only a limited number of detailed numerical models have been developed and successfully applied to simulate constructed wetland behavior. Among these, one of the most complete and powerful is CW2D, which is based on Hydrus2D. The aim of this work is to develop a comprehensive simulator tailored to model the functioning of horizontal flow constructed wetlands and in turn provide a reliable design and optimization tool. The model is based upon PHWAT, a general reactive transport code for saturated flow. PHWAT couples MODFLOW, MT3DMS and PHREEQC-2 using an operator-splitting approach. The use of PHREEQC to simulate reactions allows great flexibility in simulating biogeochemical processes. The biogeochemical reaction network is similar to that of CW2D, and is based on the Activated Sludge Model (ASM). Kinetic oxidation of carbon sources and nutrient transformations (nitrogen and phosphorous primarily) are modeled via Monod-type kinetic equations. Oxygen dissolution is accounted for via a first-order mass-transfer equation. While the ASM model only includes a limited number of kinetic equations, the new simulator permits incorporation of an unlimited number of both kinetic and equilibrium reactions. Changes in pH, redox potential and surface reactions can be easily incorporated
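    A minimal sketch (with assumed parameter values, not CW2D or PHWAT code) of the two kinetic building blocks named above: Monod-type oxidation of a carbon substrate and first-order mass transfer of dissolved oxygen towards saturation.

```python
# Toy wetland reactor: dual-Monod substrate oxidation plus first-order re-aeration.
import numpy as np
from scipy.integrate import solve_ivp

mu_max, K_s, K_o, Y = 4.0, 10.0, 0.2, 0.5   # 1/d, mg/L, mg/L, biomass yield (hypothetical)
k_La, O2_sat = 2.0, 9.0                      # 1/d, mg/L (hypothetical aeration parameters)

def wetland_kinetics(t, y):
    S, X, O2 = y                                             # substrate, biomass, oxygen (mg/L)
    growth = mu_max * X * S / (K_s + S) * O2 / (K_o + O2)    # dual Monod limitation
    dS = -growth / Y
    dX = growth
    dO2 = k_La * (O2_sat - O2) - 1.2 * growth                # first-order re-aeration minus uptake
    return [dS, dX, dO2]

sol = solve_ivp(wetland_kinetics, (0, 5), [50.0, 5.0, 6.0], t_eval=np.linspace(0, 5, 6))
print(np.round(sol.y[:, -1], 2))   # substrate, biomass, oxygen after 5 days
```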

  7. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
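    A toy sketch (not taken from the monograph) of GP-based system identification: a one-step-ahead model y[k+1] = f(y[k], u[k]) is learned from input/output data and then queried with an uncertainty estimate, which is the feature that makes GP models attractive for data-based control design.

```python
# GP regression as a NARX-style one-step-ahead model with predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 200)                       # hypothetical input sequence
y = np.zeros(201)
for k in range(200):                              # "true" (unknown) first-order nonlinear plant
    y[k + 1] = 0.8 * y[k] + 0.4 * np.tanh(u[k]) + 0.01 * rng.standard_normal()

X = np.column_stack([y[:-1], u])                  # regressors: current output and input
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, y[1:])

mean, std = gp.predict(np.array([[0.5, -0.3]]), return_std=True)
print(f"predicted next output: {mean[0]:.3f} +/- {2 * std[0]:.3f}")
```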

  8. Model systems for life processes on Mars

    Science.gov (United States)

    Mitz, M. A.

    1974-01-01

    In the evolution of life forms nonphotosynthetic mechanisms are developed. The question remains whether a total life system could evolve which is not dependent upon photosynthesis. In trying to visualize life on other planets, the photosynthetic process has problems. On Mars, the high intensity of light at the surface is a concern and alternative mechanisms need to be defined and analyzed. In the UV search for alternate mechanisms, several different areas may be identified. These involve activated inorganic compounds in the atmosphere, such as the products of photodissociation of carbon dioxide and the organic material which may be created by natural phenomena. In addition, a life system based on the pressure of the atmospheric constituents, such as carbon dioxide, is a possibility. These considerations may be important for the understanding of evolutionary processes of life on another planet. Model systems which depend on these alternative mechanisms are defined and related to presently planned and future planetary missions.

  9. Normalisation method can affect gluteus medius electromyography results during weight bearing exercises in people with hip osteoarthritis (OA): a case control study.

    Science.gov (United States)

    French, Helen P; Huang, Xiaoli; Cummiskey, Andrew; Meldrum, Dara; Malone, Ailish

    2015-02-01

    Surface electromyography (sEMG) is used to assess muscle activation during therapeutic exercise, but data are significantly affected by inter-individual variability and require normalisation of the sEMG signal to enable comparison between individuals. The purpose of this study was to compare two normalisation methods, a maximal method (maximum voluntary isometric contraction (MVIC)) and non-maximal peak dynamic method (PDM), on gluteus medius (GMed) activation using sEMG during three weight-bearing exercises in people with hip osteoarthritis (OA) and healthy controls. Thirteen people with hip OA and 20 controls performed three exercises (Squat, Step-Up, Step-Down). Average root-mean-square EMG amplitude based on MVIC and PDM normalisation was compared between groups for both involved and uninvolved hips using Mann-Whitney tests. Using MVIC normalisation, significantly higher normalised GMed EMG amplitudes were found in the OA group during all Step-Up and Step-Down exercises on the involved side (p=0.02-0.001) and most of the Step exercises on the uninvolved side (p=0.03-0.04), but not the Squat (p>0.05), compared to controls. Using PDM normalisation, significant between-group differences occurred only for Ascending Squat (p=0.03) on the involved side. MVIC normalisation demonstrated higher inter-trial relative reliability (ICCs=0.78-0.99) than PDM (ICCs=0.37-0.84), but poorer absolute reliability using Standard Error of Measurement. Normalisation method can significantly affect interpretation of EMG amplitudes. Although MVIC-normalised amplitudes were more sensitive to differences between groups, there was greater variability using this method, which raises concerns regarding validity. Interpretation of EMG data is strongly influenced by the normalisation method used, and this should be considered when applying EMG results to clinical populations. Copyright © 2014 Elsevier B.V. All rights reserved.
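    The difference between the two normalisation methods can be illustrated with a simplified sketch (hypothetical signals, no filtering or enveloping, not the study's processing pipeline): the same RMS amplitude of a dynamic trial is expressed either relative to the MVIC amplitude or relative to the peak of the dynamic trial itself.

```python
# MVIC-normalised versus peak-dynamic-method (PDM) normalised sEMG amplitude.
import numpy as np

def rms(x):
    return np.sqrt(np.mean(np.square(x)))

rng = np.random.default_rng(1)
mvic_trial = rng.normal(0, 0.40, 2000)      # sEMG during maximum voluntary isometric contraction
squat_trial = rng.normal(0, 0.15, 2000)     # sEMG during a dynamic exercise (e.g. squat)

mvic_reference = rms(mvic_trial)                              # maximal-method reference
pdm_reference = np.max(np.abs(squat_trial))                   # peak-dynamic-method reference
print("MVIC-normalised amplitude (%):", 100 * rms(squat_trial) / mvic_reference)
print("PDM-normalised amplitude  (%):", 100 * rms(squat_trial) / pdm_reference)
```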

  10. Selection of reliable reference genes for the normalisation of gene expression levels following time course LPS stimulation of murine bone marrow derived macrophages.

    Science.gov (United States)

    Tanaka, Akane; To, Joyce; O'Brien, Bronwyn; Donnelly, Sheila; Lund, Maria

    2017-10-03

    Macrophages are key players in the initiation, perpetuation and regulation of both innate and adaptive immune responses. They largely perform these roles through modulation of the expression of genes, especially those encoding cytokines. Murine bone marrow derived macrophages (BMDMs) are commonly used as a model macrophage population for the study of immune responses to pro-inflammatory stimuli, notably lipopolysaccharide (LPS), which may be pertinent to the human situation. Evaluation of the temporal responses of LPS stimulated macrophages is widely conducted via the measurement of gene expression levels by RT-qPCR. While providing a robust and sensitive measure of gene expression levels, RT-qPCR relies on the normalisation of gene expression data to a stably expressed reference gene. Generally, a normalisation gene(s) is selected from a list of "traditional" reference genes without validation of expression stability under the specific experimental conditions of the study. In the absence of such validation, and given that many studies use only a single reference gene, the reliability of data is questionable. The stability of expression levels of eight commonly used reference genes was assessed during the peak (6 h) and resolution (24 h) phases of the BMDM response to LPS. Further, this study identified two additional genes, which have not previously been described as reference genes, and the stability of their expression levels during the same phases of the inflammatory response were validated. Importantly, this study demonstrates that certain "traditional" reference genes are in fact regulated by LPS exposure, and, therefore, are not reliable candidates as their inclusion may compromise the accuracy of data interpretation. Testament to this, this study shows that the normalisation of gene expression data using an unstable reference gene greatly affects the experimental data obtained, and, therefore, the ultimate biological conclusions drawn. This study
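    A small numerical illustration (made-up Ct values, not from the study) of the point made above: if the reference gene is itself regulated by LPS, the fold change calculated by the standard 2^-ddCt method is distorted.

```python
# Effect of an unstable reference gene on the delta-delta-Ct fold change.
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Target cytokine Ct drops from 28 to 24 after LPS (a true ~16-fold induction).
stable_ref = fold_change(24, 20, 28, 20)      # reference gene unchanged by LPS
unstable_ref = fold_change(24, 22, 28, 20)    # reference Ct itself drops by 2 cycles (upregulated)
print(stable_ref, unstable_ref)               # 16.0 vs 64.0: a four-fold overestimate
```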

  11. Comments on: Spatiotemporal models for skewed processes

    KAUST Repository

    Genton, Marc G.

    2017-09-04

    We would first like to thank the authors for this paper that highlights the important problem of building models for non-Gaussian space-time processes. We will hereafter refer to the paper as SGV, and we also would like to acknowledge and thank them for providing us with the temporally detrended temperatures, plotted in their Figure 1, along with the coordinates of the twenty-one locations and the posterior means of the parameters for the MA1 model. We find much of interest to discuss in this paper, and as we progress through points of interest, we pose some questions to the authors that we hope they will be able to address.

  12. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

    textabstractE-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) formal computational model, and (c) an integrated view of the busi...

  13. Privatization processes in banking: Motives and models

    Directory of Open Access Journals (Sweden)

    Ristić Života

    2006-01-01

    Full Text Available The paper consists of three methodologically and causally connected thematic parts: the first part deals with crucial motives and models of the privatization processes in the USA and EU, with a particular analytical focus on the Herfindahl-Hirschman doctrine of the collective domination index, as well as on the essence of merger-acquisition and take-over models. The second thematic part of the paper, as a logical continuation of the first one, represents a brief comparative analysis of the motives and models implemented in bank privatization in the south-eastern European countries, with particular focus on identifying the interests of foreign investors, an optimal volume and price of the investment, and assessment of finalized privatizations in those countries. The final part of the paper, which theoretically and practically stems from the first and second parts and thereby forms an interdependent and compatible thematic whole with them, presents qualitative and quantitative aspects of analyzing the finalized privatization and/or sale-purchase of Serbian banks, with particular focus on IPO and IPOPLUS as the prevailing models of future sale-purchase in privatizing Serbian banks.

  14. Modeling Aspects Of Activated Sludge Processes Part I: Process Modeling Of Activated Sludge Facilitation And Sedimentation

    International Nuclear Information System (INIS)

    Ibrahim, H. I.; EI-Ahwany, A.H.; Ibrahim, G.

    2004-01-01

    Process modeling of activated sludge flocculation and sedimentation is reviewed, considering activated sludge floc characteristics such as morphology, viable and non-viable cell ratio, density and water content. Bio-flocculation and its kinetics were studied considering the characteristics of bio-flocculation and the theory of Divalent Cation Bridging, which describes the major role of cations in bio-flocculation. Activated sludge flocculation process modeling was studied considering mass transfer limitations, from Clifft and Andrew, 1981, and Benefild and Molz, 1983, through Henze, 1987, to Tyagi, 1996, and G. Ibrahim et al., 2002. Models of aggregation and breakage of flocs were studied by Spicer and Pratsinis, 1996, and Biggs, 2002. The size distribution of flocs influences mass transfer and biomass separation in the activated sludge process. Therefore, it is of primary importance to establish the role of specific process operation factors, such as sludge loading, dynamic sludge age and dissolved oxygen, on this distribution, with special emphasis on the formation of primary particles

  15. Normalised Mutual Information of High-Density Surface Electromyography during Muscle Fatigue

    Directory of Open Access Journals (Sweden)

    Adrian Bingham

    2017-12-01

    Full Text Available This study has developed a technique for identifying the presence of muscle fatigue based on the spatial changes of the normalised mutual information (NMI) between multiple high density surface electromyography (HD-sEMG) channels. Muscle fatigue in the tibialis anterior (TA) during isometric contractions at 40% and 80% maximum voluntary contraction levels was investigated in ten healthy participants (Age range: 21 to 35 years; Mean age = 26 years; Male = 4, Female = 6). HD-sEMG was used to record 64 channels of sEMG using a 16 by 4 electrode array placed over the TA. The NMI of each electrode with every other electrode was calculated to form an NMI distribution for each electrode. The total NMI for each electrode (the summation of the electrode’s NMI distribution) highlighted regions of high dependence in the electrode array and was observed to increase as the muscle fatigued. To summarise this increase, a function, M(k), was defined and was found to be significantly affected by fatigue and not by contraction force. The technique discussed in this study has overcome issues regarding electrode placement and was used to investigate how the dependences between sEMG signals within the same muscle change spatially during fatigue.
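    A histogram-based sketch (not the authors' implementation) of the normalised mutual information between two sEMG channels; summing such values over all channel pairs for one electrode gives the "total NMI" quantity described above.

```python
# Normalised mutual information between two continuous signals via a joint histogram.
import numpy as np

def normalised_mutual_information(x, y, bins=32):
    """NMI = 2*I(X;Y) / (H(X) + H(Y)), estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return 2.0 * mi / (hx + hy)

rng = np.random.default_rng(2)
ch_a = rng.standard_normal(5000)
ch_b = 0.7 * ch_a + 0.3 * rng.standard_normal(5000)   # partially dependent neighbouring channel
print(round(normalised_mutual_information(ch_a, ch_b), 3))
```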

  16. No upward trend in normalised windstorm losses in Europe: 1970-2008

    Science.gov (United States)

    Barredo, J. I.

    2010-01-01

    On 18 January 2007, windstorm Kyrill battered Europe with hurricane-force winds killing 47 people and causing US$ 10 billion in damage. Kyrill poses several questions: is Kyrill an isolated or exceptional case? Have there been events costing as much in the past? This paper attempts to put Kyrill into an historical context by examining large historical windstorm event losses in Europe for the period 1970-2008 across 29 European countries. It asks the question what economic losses would these historical events cause if they were to recur under 2008 societal conditions? Loss data were sourced from reinsurance firms and augmented with historical reports, peer-reviewed articles and other ancillary sources. Following the same conceptual approach outlined in previous studies, the data were then adjusted for changes in population, wealth, and inflation at the country level and for inter-country price differences using purchasing power parity. The analyses reveal no trend in the normalised windstorm losses and confirm increasing disaster losses are driven by societal factors and increasing exposure.
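    A hedged sketch of the normalisation principle described above (all figures and the exact adjustment factors are hypothetical, not the paper's data): an historical loss is scaled to reference-year (2008) conditions by the change in population, wealth per capita and the price level since the event.

```python
# Loss normalisation to reference-year societal conditions.
def normalise_loss(loss, pop_event, pop_2008, wealth_pc_event, wealth_pc_2008,
                   cpi_event, cpi_2008):
    """Loss the historical storm would cause under 2008 societal conditions."""
    return (loss
            * (pop_2008 / pop_event)                 # exposure growth
            * (wealth_pc_2008 / wealth_pc_event)     # real wealth per capita growth
            * (cpi_2008 / cpi_event))                # inflation

# Hypothetical 1990 windstorm with 1.0 billion (1990 prices) in losses.
print(normalise_loss(1.0e9, pop_event=56e6, pop_2008=60e6,
                     wealth_pc_event=30e3, wealth_pc_2008=40e3,
                     cpi_event=70.0, cpi_2008=100.0))
```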

  17. Technical Note: On methodologies for determining the size-normalised weight of planktic foraminifera

    Directory of Open Access Journals (Sweden)

    C. J. Beer

    2010-07-01

    Full Text Available The size-normalised weight (SNW) of planktic foraminifera, a measure of test wall thickness and density, is potentially a valuable palaeo-proxy for marine carbon chemistry. As increasing attention is given to developing this proxy it is important that methods are comparable between studies. Here, we compare SNW data generated using two different methods to account for variability in test size, namely (i) the narrow (50 μm) range sieve fraction method and (ii) the individually measured test size method. Using specimens from the 200–250 μm sieve fraction range collected in multinet samples from the North Atlantic, we find that sieving does not constrain size sufficiently well to isolate changes in weight driven by variations in test wall thickness and density from those driven by size. We estimate that the SNW data produced as part of this study are associated with an uncertainty, or error bar, of about ±11%. Errors associated with the narrow sieve fraction method may be reduced by decreasing the size of the sieve window, by using larger tests and by increasing the number of tests employed. In situations where numerous large tests are unavailable, however, substantial errors associated with this sieve method remain unavoidable. In such circumstances the individually measured test size method provides a better means for estimating SNW because, as our results show, this method isolates changes in weight driven by variations in test wall thickness and density from those driven by size.
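    One simple form of the "individually measured test size" idea discussed above can be sketched as follows (hypothetical numbers; published studies may instead normalise by measured area or via a size-weight regression): each test's weight is scaled by the ratio of a reference size to its own measured size, so that remaining weight differences reflect wall thickness and density rather than test size.

```python
# Size-normalised weight from individually measured test sizes (illustrative only).
import numpy as np

diameters_um = np.array([212.0, 231.0, 247.0, 225.0])    # measured individual test sizes
weights_ug = np.array([11.8, 14.9, 17.5, 13.6])           # measured individual test weights
reference_um = 225.0                                       # chosen reference size

snw = weights_ug * (reference_um / diameters_um)           # size-normalised weights
print(np.round(snw, 2), "mean SNW:", round(snw.mean(), 2))
```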

  18. Measuring the precision of multi-perspective process models

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P

    2016-01-01

    Process models need to reflect the real behavior of an organization’s processes to be beneficial for several use cases, such as process analysis, process documentation and process improvement. One quality criterion for a process model is that it should be precise and not express more behavior than

  19. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  20. Geochemical modelization of differentiation processes by crystallization

    International Nuclear Information System (INIS)

    Cebria, J.M.; Lopez Ruiz, J.

    1994-01-01

    During crystallization processes, major and trace elements and stable isotopes fractionate, whereas radiogenic isotopes do not change. The different equations proposed allow us to reproduce the variation in major and trace elements during these differentiation processes. In the case of simple fractional crystallization, the residual liquid is impoverished in compatible elements faster than it is enriched in incompatible elements as crystallization proceeds. During in situ crystallization the incompatible elements evolve in a similar way to the case of simple fractional crystallization, but the enrichment rate of the moderately incompatible elements is slower and the compatible elements do not suffer a depletion as strong as the one observed during simple fractional crystallization, even for higher f values. In a periodically replenished magma chamber, if all the liquid present is removed at the end of each cycle, the magma follows patterns similar to those generated by simple fractional crystallization. On the contrary, if the liquid fraction that crystallizes during each cycle and the one that is extruded at the end of the cycle are small, the residual liquid shows compositions similar to those that would be obtained by equilibrium crystallization. Modelling of crystallization processes is in general less difficult than that of partial melting. If a rock series is the result of simple fractional crystallization, a C_i^L - C_j^L plot, in which i is a compatible element and j a highly incompatible one, allows us to obtain a good approximation to the initial liquid composition. Additionally, log C_i^L - log C_j^L diagrams, in which i is a highly incompatible element, allow us to identify steps in the process and to calculate the bulk distribution coefficients of the trace elements during each step
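    The trace-element behaviour described above follows the standard Rayleigh law for simple fractional crystallization (a textbook relation, quoted here for reference rather than taken from the paper):

```latex
% Rayleigh law for simple fractional crystallization:
% C_i^L = concentration of element i in the residual liquid, C_i^0 = initial concentration,
% F = residual liquid fraction, D_i = bulk solid/liquid distribution coefficient.
\[
  C_i^{L} = C_i^{0}\, F^{\,D_i - 1}
\]
% For a highly incompatible element (D_i \approx 0) the enrichment is modest, C_i^L \approx C_i^0 / F,
% whereas a compatible element (D_i \gg 1) is depleted rapidly as F decreases.
```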

  1. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    2005-01-01

    Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties and point process operations such as thinning, displacements, and super positioning. We also discuss how...... to simulate specific Cox processes....

  2. Heat source model for welding process

    International Nuclear Information System (INIS)

    Doan, D.D.

    2006-10-01

    One of the major industrial stakes of welding simulation relates to the control of the mechanical effects of the process (residual stresses, distortions, fatigue strength...). These effects are directly dependent on the temperature evolutions imposed during the welding process. To model this thermal loading, an original method is proposed instead of the usual methods like the equivalent heat source approach or the multi-physical approach. This method is based on the estimation of the weld pool shape together with the heat flux crossing the liquid/solid interface, from experimental data measured in the solid part. Its originality consists in solving an inverse Stefan problem specific to the welding process, and it is shown how to estimate the parameters of the weld pool shape. To solve the heat transfer problem, the liquid/solid interface is modeled by a Bezier curve (2-D) or a Bezier surface (3-D). This approach is well adapted to a wide diversity of weld pool shapes met for the majority of the current welding processes (TIG, MIG-MAG, Laser, FE, Hybrid). The number of parameters to be estimated is small, ranging according to the cases considered from 2 to 5 in 2D and from 7 to 16 in 3D. A sensitivity study makes it possible to specify the location of the sensors, their number and the set of measurements required for a good estimate. The application of the method to test results of TIG welding on thin stainless steel sheets, in emerging and not emerging configurations, shows that only one measurement point is enough to estimate the various weld pool shapes in 2D, and two points in 3D, whether the penetration is full or not. In the last part of the work, a methodology is developed for the transient analysis. It is based on Duvaut's transformation, which removes the discontinuity at the liquid metal interface and therefore gives a continuous variable over the whole spatial domain. Moreover, it allows working on a fixed mesh grid and the new inverse problem is equivalent to identifying a source

  3. Identifying Stable Reference Genes for qRT-PCR Normalisation in Gene Expression Studies of Narrow-Leafed Lupin (Lupinus angustifolius L.).

    Directory of Open Access Journals (Sweden)

    Candy M Taylor

    Full Text Available Quantitative Reverse Transcription PCR (qRT-PCR) is currently one of the most popular, high-throughput and sensitive technologies available for quantifying gene expression. Its accurate application depends heavily upon normalisation of gene-of-interest data with reference genes that are uniformly expressed under experimental conditions. The aim of this study was to provide the first validation of reference genes for Lupinus angustifolius (narrow-leafed lupin), a significant grain legume crop, using a selection of seven genes previously trialed as reference genes for the model legume, Medicago truncatula. In a preliminary evaluation, the seven candidate reference genes were assessed on the basis of primer specificity for their respective targeted region, PCR amplification efficiency, and ability to discriminate between cDNA and gDNA. Following this assessment, expression of the three most promising candidates [Ubiquitin C (UBC), Helicase (HEL), and Polypyrimidine tract-binding protein (PTB)] was evaluated using the NormFinder and RefFinder statistical algorithms in two narrow-leafed lupin lines, both with and without vernalisation treatment, and across seven organ types (cotyledons, stem, leaves, shoot apical meristem, flowers, pods and roots) encompassing three developmental stages. UBC was consistently identified as the most stable candidate and has sufficiently uniform expression that it may be used as a sole reference gene under the experimental conditions tested here. However, as organ type and developmental stage were associated with greater variability in relative expression, it is recommended to use UBC and HEL as a pair to achieve optimal normalisation. These results highlight the importance of rigorously assessing candidate reference genes for each species across a diverse range of organs and developmental stages. With emerging technologies, such as RNAseq, and the completion of valuable transcriptome data sets, it is possible that other

  4. Modelling and application of stochastic processes

    CERN Document Server

    1986-01-01

    The subject of modelling and application of stochastic processes is too vast to be exhausted in a single volume. In this book, attention is focused on a small subset of this vast subject. The primary emphasis is on realization and approximation of stochastic systems. Recently there has been considerable interest in the stochastic realization problem, and hence, an attempt has been made here to collect in one place some of the more recent approaches and algorithms for solving the stochastic realization problem. Various different approaches for realizing linear minimum-phase systems, linear nonminimum-phase systems, and bilinear systems are presented. These approaches range from time-domain methods to spectral-domain methods. An overview of the chapter contents briefly describes these approaches. Also, in most of these chapters special attention is given to the problem of developing numerically efficient algorithms for obtaining reduced-order (approximate) stochastic realizations. On the application side,...

  5. Mechanical-mathematical modeling for landslide process

    Science.gov (United States)

    Svalova, V.

    2009-04-01

    500 m and displacement of the landslide in plan of over 1 m. The last serious activation of the landslide took place in 2002, with a movement of 53 cm. Catastrophic activation of the deep block-glide landslide in the Khoroshevo area of Moscow took place in 2006-2007. A crack 330 m long appeared in the old sliding circus, along which a new 220 m long creeping block separated from the plateau and began sinking, with the displaced surface of the plateau reaching 12 m. Such activation of the landslide process had not been observed in Moscow since the mid nineteenth century. The sliding area of Khoroshevo had been stable for a long time without manifestations of activity. Identifying the causes of the deformation and developing means of protection against deep landslide motions is an extremely relevant and difficult problem, whose solution is necessary for the preservation of valuable historical monuments and modern city constructions. The causes of activation and possible protective measures are discussed, and the structure of a monitoring system for urban territories is elaborated. A mechanical-mathematical model of a highly viscous fluid was used to model the behaviour of matter on landslide slopes. The continuity equation and an approximate Navier-Stokes equation for slow motion in a thin layer were used. The modelling results make it possible to identify the location of highest velocity on the landslide surface, which could be the best position for a monitoring post. The model can be used for calibration of monitoring equipment and makes it possible to investigate some fundamental aspects of matter movement on a landslide slope.
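    The thin-layer, slow-flow approximation mentioned above can be illustrated with the standard lubrication-theory result for a highly viscous layer on an incline (parameter values here are assumptions for the example, not those of the Moscow landslides):

```python
# Free-surface velocity of a slowly creeping viscous layer of thickness h on a slope.
import numpy as np

def surface_velocity(h, theta_deg, rho=1900.0, mu=1.0e13, g=9.81):
    """Velocity at the free surface, u(h) = rho*g*sin(theta)*h^2 / (2*mu), in m/s."""
    theta = np.radians(theta_deg)
    return rho * g * np.sin(theta) * h**2 / (2.0 * mu)

v = surface_velocity(h=20.0, theta_deg=8.0)        # 20 m thick layer on an 8 degree slope
print(f"{v:.2e} m/s  (~{v * 3.15e7:.2f} m/yr)")     # converted to metres per year
```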

  6. Modeling and simulation of economic processes

    Directory of Open Access Journals (Sweden)

    Bogdan Brumar

    2010-12-01

    Full Text Available In general, any activity extends over a longer period and is often characterized by a degree of uncertainty and insecurity with respect to the size of the objective pursued. Because of the complexity of real economic systems and the stochastic dependencies between the different variables and parameters considered, not all systems can be adequately represented by a model that can be solved by analytical methods while covering all the issues relevant to management decision analysis over a realistic economic horizon. Often in such cases, it is considered that the simulation technique is the only alternative available. Using simulation techniques to study real-world systems often requires laborious work. Performing a simulation experiment is a process that takes place in several stages.

  7. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through its integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES). In that direction, at the beginning, the basics of the KANBAN system are presented with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper is concluded with a practical example which shows that, through understanding the philosophy of the implementation methodology of the KANBAN system and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  8. Elliptic Determinantal Processes and Elliptic Dyson Models

    Science.gov (United States)

    Katori, Makoto

    2017-10-01

    We introduce seven families of stochastic systems of interacting particles in one dimension corresponding to the seven families of irreducible reduced affine root systems. We prove that they are determinantal in the sense that all spatio-temporal correlation functions are given by determinants controlled by a single function called the spatio-temporal correlation kernel. For the four families A_{N-1}, B_N, C_N and D_N, we identify the systems of stochastic differential equations solved by these determinantal processes, which will be regarded as the elliptic extensions of the Dyson model. Here we use the notion of martingales in probability theory and the elliptic determinant evaluations of the Macdonald denominators of irreducible reduced affine root systems given by Rosengren and Schlosser.

  9. Semi-supervised probabilistics approach for normalising informal short text messages

    CSIR Research Space (South Africa)

    Modupe, A

    2017-03-01

    Full Text Available language processing (NLP) techniques. In this study, our contribution is to target non-standard words in short text and propose a method for determining the standard form to which a given word is likely to be transformed. Our method uses language model probability to characterise...
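    A toy sketch of the idea described above (the candidate dictionary and corpus are assumptions, not the authors' system): each non-standard token is mapped to the candidate standard form with the highest probability under a simple unigram language model.

```python
# Choosing among candidate normalisations by unigram language-model probability.
from collections import Counter

corpus = "please come to the meeting tomorrow because the meeting is important".split()
unigram = Counter(corpus)
total = sum(unigram.values())

# Hypothetical candidate expansions for informal tokens (e.g. produced by edit-distance rules).
candidates = {"2moro": ["tomorrow", "tomorrows"], "b/c": ["because", "bc"]}

def normalise(token):
    options = candidates.get(token, [token])
    # Add-one smoothing so unseen candidates still receive a small probability.
    return max(options, key=lambda w: (unigram[w] + 1) / (total + len(unigram)))

print([normalise(t) for t in "please come 2moro b/c the meeting".split()])
```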

  10. [Standardization and modeling of surgical processes].

    Science.gov (United States)

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state-of-the-art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization so that the effectiveness and efficiency of treatment can be improved; however, care must be taken that detrimental consequences, such as loss of skills and placing too much faith in technology, are avoided by adapted training concepts.

  11. Numerical approaches to expansion process modeling

    Directory of Open Access Journals (Sweden)

    G. V. Alekseev

    2017-01-01

    Full Text Available Forage production is currently undergoing a period of intensive renovation and introduction of the most advanced technologies and equipment. Methods such as barley toasting, grain extrusion, steaming and flattening of grain, explosion in a boiling (fluidised) bed, infrared treatment of cereals and legumes followed by flattening, and one-time or two-time granulation of purified whole grain without humidification in matrix presses, followed by grinding of the granules, are used more and more often. These methods require special apparatuses, machines and auxiliary equipment, created on the basis of different kinds of mathematical models. In roasting, simulation of the heat fields arising in the working chamber provides conditions under which a portion of the starch decomposes to monosaccharides, which makes the grain sweetish, but due to protein denaturation the digestibility of the protein and the availability of amino acids decrease somewhat. Grain is roasted mainly for young animals in order to teach them to eat feed at an early age, to stimulate the secretory activity of digestion and to promote better development of the masticatory muscles. In addition, the high temperature is detrimental to bacterial contamination and various types of fungi, which largely avoids possible diseases of the gastrointestinal tract. This method has found wide application directly on farms. Legumes such as peas, soy, lupine and lentils are also used in feeding animals. These feeds are preliminarily ground and then cooked or steamed for 30–40 minutes at the feed mill. Such processing of feeds allows inactivation of the anti-nutrients in them, which otherwise reduce the effectiveness of their use. After processing, legumes are used as protein supplements in an amount of 25–30% of the total nutritional value of the diet. It is recommended to cook and steam only grain of good quality. A poor-quality grain that has been stored for a long time and damaged by pathogenic micro flora is subject to

  12. Living in Surveillance Societies: The Normalisation of Surveillance in Europe and the Threat of Britain’s Bad Example

    Directory of Open Access Journals (Sweden)

    David Murakami Wood

    2009-08-01

    Full Text Available This article argues that surveillance is becoming increasingly normalised across Europe and that this is altering the landscape of liberty and security. It identifies this normalisation as a product of the globalisation of surveillance, the domestication of security, the desire of the European Union (EU) to create a distinct leading role in security, and the influence of the 'bad example' of the United Kingdom (UK). The article uses the two very different examples of video-surveillance and electronic public services in the UK to make this case and to argue for both stronger resistance to calls to make human rights more flexible in a risk and security-driven age and more detailed research into the differences between emerging surveillance societies in Europe.

  13. Towards a data processing plane: An automata-based distributed dynamic data processing model

    NARCIS (Netherlands)

    Cushing, R.; Belloum, A.; Bubak, M.; de Laat, C.

    Data processing complexity, partitionability, locality and provenance play a crucial role in the effectiveness of distributed data processing. Dynamics in data processing necessitates effective modeling which allows the understanding and reasoning of the fluidity of data processing. Through

  14. Identification of endogenous control genes for normalisation of real-time quantitative PCR data in colorectal cancer.

    LENUS (Irish Health Repository)

    Kheirelseid, Elrasheid A H

    2010-01-01

    BACKGROUND: Gene expression analysis has many applications in cancer diagnosis, prognosis and therapeutic care. Relative quantification is the most widely adopted approach whereby quantification of gene expression is normalised relative to an endogenously expressed control (EC) gene. Central to the reliable determination of gene expression is the choice of control gene. The purpose of this study was to evaluate a panel of candidate EC genes from which to identify the most stably expressed gene(s) to normalise RQ-PCR data derived from primary colorectal cancer tissue. RESULTS: The expression of thirteen candidate EC genes: B2M, HPRT, GAPDH, ACTB, PPIA, HCRT, SLC25A23, DTX3, APOC4, RTDR1, KRTAP12-3, CHRNB4 and MRPL19 were analysed in a cohort of 64 colorectal tumours and tumour associated normal specimens. CXCL12, FABP1, MUC2 and PDCD4 genes were chosen as target genes against which a comparison of the effect of each EC gene on gene expression could be determined. Data analysis using descriptive statistics, geNorm, NormFinder and qBasePlus indicated significant difference in variances between candidate EC genes. We determined that two genes were required for optimal normalisation and identified B2M and PPIA as the most stably expressed and reliable EC genes. CONCLUSION: This study identified that the combination of two EC genes (B2M and PPIA) more accurately normalised RQ-PCR data in colorectal tissue. Although these control genes might not be optimal for use in other cancer studies, the approach described herein could serve as a template for the identification of valid ECs in other cancer types.

  15. Analysis of a simulated microarray dataset: Comparison of methods for data normalisation and detection of differential expression (Open Access publication)

    Directory of Open Access Journals (Sweden)

    Mouzaki Daphné

    2007-11-01

    Full Text Available Abstract Microarrays allow researchers to measure the expression of thousands of genes in a single experiment. Before statistical comparisons can be made, the data must be assessed for quality and normalisation procedures must be applied, of which many have been proposed. Methods of comparing the normalised data are also abundant, and no clear consensus has yet been reached. The purpose of this paper was to compare those methods used by the EADGENE network on a very noisy simulated data set. With the a priori knowledge of which genes are differentially expressed, it is possible to compare the success of each approach quantitatively. Use of an intensity-dependent normalisation procedure was common, as was correction for multiple testing. Most variety in performance resulted from differing approaches to data quality and the use of different statistical tests. Very few of the methods used any kind of background correction. A number of approaches achieved a success rate of 95% or above, with relatively small numbers of false positives and negatives. Applying stringent spot selection criteria and elimination of data did not improve the false positive rate and greatly increased the false negative rate. However, most approaches performed well, and it is encouraging that widely available techniques can achieve such good results on a very noisy data set.
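    A compact sketch of the intensity-dependent (loess) normalisation mentioned above, applied to simulated two-colour array data (the data and parameter choices here are illustrative, not the EADGENE set): the M (log-ratio) values are corrected by the loess trend fitted against A (average log-intensity).

```python
# Intensity-dependent (loess) normalisation of two-colour microarray log-ratios.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(3)
true_signal = rng.lognormal(mean=7.0, sigma=1.0, size=2000)
red = true_signal * rng.lognormal(0.0, 0.3, 2000)
green = true_signal * rng.lognormal(0.0, 0.3, 2000) * (1.0 + 0.2 * np.log(true_signal) / 10)

M = np.log2(red / green)                       # log ratio per spot
A = 0.5 * np.log2(red * green)                 # average log intensity per spot
trend = lowess(M, A, frac=0.4, return_sorted=False)
M_normalised = M - trend                        # remove the intensity-dependent bias

print("mean M before:", round(M.mean(), 3), " after:", round(M_normalised.mean(), 3))
```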

  16. Business Process Simulation: Requirements for Business and Resource Models

    OpenAIRE

    Audrius Rima; Olegas Vasilecas

    2015-01-01

    The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN-model-based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  17. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. Thus BPMN is widely used and applied in various areas, one of them being business process simulation. This paper addresses some BPMN-model-based business process simulation problems. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  18. Computer Forensics Field Triage Process Model

    Directory of Open Access Journals (Sweden)

    Marcus K. Rogers

    2006-06-01

    Full Text Available With the proliferation of digital based evidence, the need for the timely identification, analysis and interpretation of digital evidence is becoming more crucial. In many investigations critical information is required while at the scene or within a short period of time - measured in hours as opposed to days. The traditional cyber forensics approach of seizing a system(s)/media, transporting it to the lab, making a forensic image(s), and then searching the entire system for potential evidence, is no longer appropriate in some circumstances. In cases such as child abductions, pedophiles, missing or exploited persons, time is of the essence. In these types of cases, investigators dealing with the suspect or crime scene need investigative leads quickly; in some cases it is the difference between life and death for the victim(s). The Cyber Forensic Field Triage Process Model (CFFTPM) proposes an onsite or field approach for providing the identification, analysis and interpretation of digital evidence in a short time frame, without the requirement of having to take the system(s)/media back to the lab for an in-depth examination or acquiring a complete forensic image(s). The proposed model adheres to commonly held forensic principles, and does not negate the ability that once the initial field triage is concluded, the system(s)/storage media be transported back to a lab environment for a more thorough examination and analysis. The CFFTPM has been successfully used in various real world cases, and its investigative importance and pragmatic approach has been amply demonstrated. Furthermore, the derived evidence from these cases has not been challenged in the court proceedings where it has been introduced. The current article describes the CFFTPM in detail, discusses the model’s forensic soundness, investigative support capabilities and practical considerations.

  19. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

    Full Text Available In business process modeling the de facto standard BPMN has emerged. However, the applications of this notation have many subsets of elements and various extensions. Also, BPMN still coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of modelers is a central notion in the choice of modeling languages and notations, in most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection, the business process modeling goal is not formalized and not transparently taken into account. To overcome this gap, and to explicate and help to handle business process modeling complexity, the approach to formalize the business process modeling goal, and the supporting three-dimensional business process modeling framework, are proposed.

  20. Modelling of fiberglass pipe destruction process

    Directory of Open Access Journals (Sweden)

    А. К. Николаев

    2017-03-01

    Full Text Available The article deals with an important current issue in the oil and gas industry: the use of tubes made of high-strength, corrosion-resistant composite materials. In order to improve the operational safety of industrial pipes it is feasible to use composite fiberglass tubes. More than half of the accidents at oil and gas sites happen in oil gathering systems due to the high corrosiveness of the pumped fluid. To reduce the number of accidents and improve environmental protection, the issue of industrial pipe durability needs to be solved. This problem can be addressed by using fiberglass composite materials, which have the physical and mechanical properties required for oil pipes. The durability and strength can be controlled through the fiberglass winding method, the number of layers in the composite material and the high corrosion resistance of fiberglass. The use of high-strength composite materials in oil production is economically feasible; fiberglass pipe production is cheaper than steel pipe production. Fiberglass has a low volumetric weight, which simplifies pipe transportation and installation. In order to assess the efficiency of using high-strength composite materials at oil production sites, we conducted research on their physical and mechanical properties and modelled the fiberglass pipe destruction process.

  1. Atmospheric pollution. From processes to modelling

    International Nuclear Information System (INIS)

    Sportisse, B.

    2008-01-01

    Air quality, greenhouse effect, ozone hole, chemical or nuclear accidents.. All these phenomena are tightly linked to the chemical composition of atmosphere and to the atmospheric dispersion of pollutants. This book aims at supplying the main elements of understanding of 'atmospheric pollutions': stakes, physical processes involved, role of scientific expertise in decision making. Content: 1 - classifications and scales: chemical composition of the atmosphere, vertical structure, time scales (transport, residence); 2 - matter/light interaction: notions of radiative transfer, application to the Earth's atmosphere; 3 - some elements about the atmospheric boundary layer: notion of scales in meteorology, atmospheric boundary layer (ABL), thermal stratification and stability, description of ABL turbulence, elements of atmospheric dynamics, some elements about the urban climate; 4 - notions of atmospheric chemistry: characteristics, ozone stratospheric chemistry, ozone tropospheric chemistry, brief introduction to indoor air quality; 5 - aerosols, clouds and rains: aerosols and particulates, aerosols and clouds, acid rains and leaching; 6 - towards numerical simulation: equation of reactive dispersion, numerical methods for chemistry-transport models, numerical resolution of the general equation of aerosols dynamics (GDE), modern simulation chains, perspectives. (J.S.)

  2. Mathematical Modelling of Coal Gasification Processes

    Science.gov (United States)

    Sundararajan, T.; Raghavan, V.; Ajilkumar, A.; Vijay Kumar, K.

    2017-07-01

    Coal is by far the most commonly employed fuel for electrical power generation around the world. While combustion could be the route for coal utilization for high grade coals, gasification becomes the preferred process for low grade coals having a higher proportion of volatiles or ash. Indian coals suffer from high ash content, nearly 50% by weight in some cases. Instead of transporting such high ash coals, it is more energy efficient to gasify the coal and transport the product syngas. Integrated Gasification Combined Cycle (IGCC) plants and Underground Gasification of coal have become attractive technologies for the best utilization of high ash coals. Gasification could be achieved in fixed beds, fluidized beds and entrained beds; faster rates of gasification are possible in fluidized beds and entrained flow systems, because of the small particle sizes and higher gas velocities. The media employed for gasification could involve air/oxygen and steam. Use of oxygen will yield relatively higher calorific value syngas because of the absence of nitrogen. Sequestration of the carbon dioxide after the combustion of the syngas is also easier, if oxygen is used for gasification. Addition of steam can increase hydrogen yield in the syngas and thereby increase the calorific value also. Gasification in the presence of suitable catalysts can increase the composition of methane in the product gas. Several competing heterogeneous and homogeneous reactions occur during coal gasification: reactions of the solid carbon with the gasifying media constitute the major heterogeneous reaction pathways, while interactions between carbon monoxide, oxygen, hydrogen, water vapour, methane and carbon dioxide result in several simultaneous gas-phase (homogeneous) reactions. The overall product composition of the coal gasification process depends on the input reactant composition, particle size and type of gasifier, and pressure and temperature of the gasifier. The use of catalysts can also selectively change the product composition. At IIT Madras, over the last decade, both
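    For reference, the principal gasification reactions usually considered in such models are the standard set below (a textbook summary, not taken from the work described above):

```latex
\begin{align*}
  \mathrm{C + H_2O}   &\rightarrow \mathrm{CO + H_2}   && \text{(steam gasification, endothermic)} \\
  \mathrm{C + CO_2}   &\rightarrow \mathrm{2\,CO}      && \text{(Boudouard reaction, endothermic)} \\
  \mathrm{C + 2\,H_2} &\rightarrow \mathrm{CH_4}       && \text{(methanation, exothermic)} \\
  \mathrm{C + O_2}    &\rightarrow \mathrm{CO_2}       && \text{(combustion, exothermic)} \\
  \mathrm{CO + H_2O}  &\rightleftharpoons \mathrm{CO_2 + H_2} && \text{(water--gas shift, gas phase)}
\end{align*}
```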

  3. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    A separation process could be defined as a process that transforms a given mixture of chemicals into two or more compositionally distinct end-use products. One way to design these separation processes is to employ a model-based approach, where mathematical models that reliably predict the process behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures is considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation, are presented. The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  4. Process modeling - It's history, current status, and future

    Science.gov (United States)

    Duttweiler, Russell E.; Griffith, Walter M.; Jain, Sulekh C.

    1991-04-01

    The development of process modeling is reviewed to examine the potential of process applications to prevent and solve problems associated with the aerospace industry. The business and global environments are assessed, and the traditional approach to product/process design is argued to be obsolete. A revised engineering process is described which involves planning and prediction before production by means of process simulation. Process simulation can permit simultaneous engineering of unit processes and complex processes, and examples are given in the cross-coupling of forging-process variance. The implementation of process modeling, CAE, and computer simulations is found to reduce costs and time associated with technological development when incorporated judiciously.

  5. Compliance in Resource-based Process Models

    NARCIS (Netherlands)

    Colombo Tosatto, S.; Elrakaiby, Y.; Ziafati, P.

    2013-01-01

    Execution of business processes often requires resources, the use of which is usually subject to constraints. In this paper, we study the compliance of business processes with resource usage policies. To this end, we relate the execution of a business process to its resource requirements in terms of

  6. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  7. Large external quality assessment survey on thrombin generation with CAT: further evidence for the usefulness of normalisation with an external reference plasma.

    Science.gov (United States)

    Perrin, Julien; Depasse, François; Lecompte, Thomas

    2015-07-01

    Calibrated Automated Thrombography (CAT) has been widely used to assess in vitro thrombin generation as an informative intermediary phenotype of coagulation. Interlaboratory exercises have documented a worrisome poor reproducibility. There are some data on the normalisation with an appropriate external reference plasma (RP). This multicentre study of the French-speaking CAT Club aimed at providing further evidence for the usefulness of such a normalisation. Lyophilised aliquots of a RP along with 3 plasmas (P1=normal; P2=hypo-; P3=hypercoagulable) were sent to 34 laboratories (corresponding to 38 instruments). CAT was studied using 1 and 5 pM tissue factor and other dedicated reagents. Normalisation with the local RP in use in the laboratory could also be performed. Interlaboratory CVs were calculated for each plasma before and after normalisation. Regarding endogenous thrombin potential, a good discrimination between the 3 plasmas was achieved in all laboratories but there was no overlap after normalisation only. CVs were generally not reduced with the use of local RP but were generally improved with normalisation using the external RP, often becoming lower than 10%. Regarding P2 however, the benefit of normalisation was poor, and there were analytical difficulties as well, some laboratories being unable to get a useable signal. We confirm that normalisation of CAT results with a suitable external RP is useful in "real life" practice as it often permits an acceptable level of interlaboratory variability. In case of frank hypocoagulability, further improvements are required to get reliable, potentially clinically relevant results. Copyright © 2015 Elsevier Ltd. All rights reserved.
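
    As a purely illustrative sketch of the normalisation step described above (not the CAT Club's actual data or software), the following Python snippet shows how expressing each laboratory's endogenous thrombin potential as a ratio of its own result for a shared external reference plasma can shrink the interlaboratory coefficient of variation; all values are hypothetical.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical endogenous thrombin potential (ETP, nM.min) results for the same
# test plasma reported by four laboratories, together with each laboratory's
# result for the shared external reference plasma (RP).
etp_test = [1450.0, 1720.0, 1300.0, 1610.0]   # raw ETP of the test plasma
etp_rp   = [1500.0, 1780.0, 1350.0, 1650.0]   # raw ETP of the reference plasma

# Normalisation: express each result as a ratio of the RP value measured
# in the same laboratory and run.
etp_normalised = [t / r for t, r in zip(etp_test, etp_rp)]

print(f"Interlaboratory CV before normalisation: {cv_percent(etp_test):.1f}%")
print(f"Interlaboratory CV after normalisation:  {cv_percent(etp_normalised):.1f}%")
```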

  8. Model-Driven and Pattern-Based Integration of Process-Driven SOA Models

    OpenAIRE

    Zdun, Uwe; Dustdar, Schahram

    2006-01-01

    Service-oriented architectures (SOA) are increasingly used in the context of business processes. However, the modeling approaches for process-driven SOAs do not yet sufficiently integrate the various kinds of models relevant for a process-driven SOA -- ranging from process models to software architectural models to software design models. We propose to integrate process-driven SOA models via a model-driven software development approach that is based on proven practices do...

  9. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  10. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

    1.1 Objective of the Study Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...
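
    As a minimal, hedged illustration of the workflow sketched in this abstract (unrestricted VAR estimation followed by impulse response analysis and forecast error variance decomposition), the following Python sketch uses the statsmodels package on synthetic two-variable data; the series names and parameters are made up, and the snippet is not taken from the book.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic two-variable system as a stand-in for macroeconomic series.
rng = np.random.default_rng(0)
n = 200
y = np.zeros((n, 2))
for t in range(1, n):
    y[t, 0] = 0.5 * y[t-1, 0] + 0.1 * y[t-1, 1] + rng.normal(scale=1.0)
    y[t, 1] = 0.2 * y[t-1, 0] + 0.4 * y[t-1, 1] + rng.normal(scale=1.0)
data = pd.DataFrame(y, columns=["gdp_growth", "inflation"])

# Fit a largely unrestricted reduced-form VAR, picking the lag order by AIC.
model = VAR(data)
results = model.fit(maxlags=4, ic="aic")
print(results.summary())

# Tools for summarising the dynamics: impulse responses and
# forecast error variance decompositions.
irf = results.irf(10)       # impulse responses over 10 periods
fevd = results.fevd(10)     # forecast error variance decomposition
print(fevd.summary())
```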

  11. Modeling microbial processes in porous media

    Science.gov (United States)

    Murphy, Ellyn M.; Ginn, Timothy R.

    The incorporation of microbial processes into reactive transport models has generally proceeded along two separate lines of investigation: (1) transport of bacteria as inert colloids in porous media, and (2) the biodegradation of dissolved contaminants by a stationary phase of bacteria. Research over the last decade has indicated that these processes are closely linked. This linkage may occur when a change in metabolic activity alters the attachment/detachment rates of bacteria to surfaces, either promoting or retarding bacterial transport in a groundwater-contaminant plume. Changes in metabolic activity, in turn, are controlled by the time of exposure of the microbes to electron acceptors/donor and other components affecting activity. Similarly, metabolic activity can affect the reversibility of attachment, depending on the residence time of active microbes. Thus, improvements in quantitative analysis of active subsurface biota necessitate direct linkages between substrate availability, metabolic activity, growth, and attachment/detachment rates. This linkage requires both a detailed understanding of the biological processes and robust quantitative representations of these processes that can be tested experimentally. This paper presents an overview of current approaches used to represent physicochemical and biological processes in porous media, along with new conceptual approaches that link metabolic activity with partitioning of the microorganism between the aqueous and solid phases.

  12. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm...... is performed by analyzing the probabilistic linear temporal logic properties of the system as well as by analyzing the schedulers, in particular the optimal schedulers, induced by the learned models....

  13. Evaluation of EOR Processes Using Network Models

    DEFF Research Database (Denmark)

    Winter, Anatol; Larsen, Jens Kjell; Krogsbøll, Anette

    1998-01-01

    The report consists of the following parts: 1) Studies of wetting properties of model fluids and fluid mixtures aimed at an optimal selection of candidates for micromodel experiments. 2) Experimental studies of multiphase transport properties using physical models of porous networks (micromodels......) including estimation of their "petrophysical" properties (e.g. absolute permeability). 3) Mathematical modelling and computer studies of multiphase transport through pore space using mathematical network models. 4) Investigation of link between pore-scale and macroscopic recovery mechanisms....

  14. Explosive Bubble Modelling by Noncausal Process

    OpenAIRE

    Christian Gouriéroux; Jean-Michel Zakoian

    2013-01-01

    The linear mixed causal and noncausal autoregressive processes often provide a better fit to economic and financial time series than the standard causal linear autoregressive processes. By considering the example of the noncausal Cauchy autoregressive process, we show that this might be explained by the special associated nonlinear causal dynamics. Indeed, these causal dynamics can include unit root, bubble phenomena, or asymmetric cycles often observed in financial markets. The noncausal Cauchy...

  15. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltık, M.B.; Özkan, Leyla; Jacobs, Marc; Padt, van der Albert

    2017-01-01

    In this paper, we present a control relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model based applications such as model based control and process

  16. Modeling Large Time Series for Efficient Approximate Query Processing

    DEFF Research Database (Denmark)

    Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang

    2015-01-01

    -wise aggregation to derive the models. These models are initially created from the original data and are kept in the database along with it. Subsequent queries are answered using the stored models rather than scanning and processing the original datasets. In order to support model query processing, we maintain...

  17. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

    Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov

  18. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Mukkerikar, Amol

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling and a lipid-database of collected experimental data from industry and generated data from validated predictive property models, as well as modeling tools for fast adoption-analysis of property prediction models; ii) modeling of phase behavior of relevant lipid mixtures using the UNIFAC-CI model, development of a master parameter table; iii) development of a model library consisting of new and adopted process models of unit operations involved in lipid processing technologies, validation of the developed models using operating data collected from existing process plants, and application of validated models...

  19. Property Modelling for Applications in Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical-chemical properties of pure chemicals and their mixtures play an important role in the design of chemicals based products and the processes that manufacture them. Although the use of experimental data in design and analysis of chemicals based products and their processes is desirable...... such as database, property model library, model parameter regression, and property-model based product-process design will be presented. The database contains pure component and mixture data for a wide range of organic chemicals. The property models are based on the combined group contribution and atom...... modeling tools in design and analysis of chemical product-process design, including biochemical processes, will be highlighted....

  20. Modelling of injection processes in ladle metallurgy

    NARCIS (Netherlands)

    Visser, H.

    2016-01-01

    Ladle metallurgical processes constitute a portion of the total production chain of steel from iron ore. With these batch processes, the hot metal or steel transfer ladle is being used as a reactor vessel and a reagent is often injected in order to bring the composition of the hot metal or steel to

  1. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    This article results from research on models of production processes for quality management and their identification. It discusses the classical model and the capability indicators, taking as its starting point the assumption of a normal distribution of the process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model is proposed that allows the statistical characteristics of the process to be described precisely in any real case. Compared with the model proposed by the ISO 21747:2006 standard, it permits a more detailed description of the process characteristics and the determination of its capability. This model contains the type of process, the statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. One of the model elements proposed is an own classification and the resulting set of process types. The classification follows the recommendations of ISO 21747:2006 introducing models for non-stationary processes. However, the set of process types allows, beyond a more precise description of the process characteristics, its use for monitoring the process.
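
    As a small, hedged illustration of the classical capability indicators mentioned in the abstract (assuming, as the classical model does, an approximately normal and stationary process), the following Python sketch computes Cp and Cpk for a hypothetical sample and specification limits; it is not the author's general process model.

```python
import statistics

def capability_indices(measurements, lsl, usl):
    """Classical capability indices Cp and Cpk, assuming an approximately
    normal, stationary process characteristic."""
    mu = statistics.mean(measurements)
    sigma = statistics.stdev(measurements)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

# Hypothetical sample of a measured characteristic and specification limits.
sample = [9.98, 10.02, 10.01, 9.97, 10.03, 10.00, 9.99, 10.04, 9.96, 10.01]
cp, cpk = capability_indices(sample, lsl=9.90, usl=10.10)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```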

  2. Modelling the Active Hearing Process in Mosquitoes

    Science.gov (United States)

    Avitabile, Daniele; Homer, Martin; Jackson, Joe; Robert, Daniel; Champneys, Alan

    2011-11-01

    A simple microscopic mechanistic model is described of the active amplification within the Johnston's organ of the mosquito species Toxorhynchites brevipalpis. The model is based on the description of the antenna as a forced-damped oscillator coupled to a set of active threads (ensembles of scolopidia) that provide an impulsive force when they twitch. This twitching is in turn controlled by channels that are opened and closed if the antennal oscillation reaches a critical amplitude. The model matches recent experiments both qualitatively and quantitatively. New results are presented using mathematical homogenization techniques to derive a mesoscopic model as a simple oscillator with nonlinear force and damping characteristics. It is shown how the results from this new model closely resemble those from the microscopic model as the number of threads approaches physiologically correct values.
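
    For illustration only, the following Python sketch integrates a generic forced-damped oscillator of the kind used to represent the antenna in the abstract; the active thread coupling is omitted, and all parameter values are hypothetical rather than those of the published model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for a forced-damped oscillator,
#   x'' + 2*zeta*omega0*x' + omega0**2 * x = F(t),
# standing in for the passive antennal mechanics (active threads omitted).
omega0 = 2 * np.pi * 400.0   # natural angular frequency (rad/s)
zeta = 0.05                  # damping ratio
drive = 2 * np.pi * 380.0    # driving angular frequency (rad/s)
amplitude = 1.0              # forcing amplitude (arbitrary units)

def rhs(t, state):
    x, v = state
    force = amplitude * np.sin(drive * t)
    return [v, force - 2 * zeta * omega0 * v - omega0**2 * x]

sol = solve_ivp(rhs, (0.0, 0.05), [0.0, 0.0], max_step=1e-5)
peak = np.abs(sol.y[0][-500:]).max()   # displacement near the end of the run
print(f"approximate steady-state peak displacement: {peak:.3e}")
```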

  3. Study of dissolution process and its modelling

    Directory of Open Access Journals (Sweden)

    Juan Carlos Beltran-Prieto

    2017-01-01

    The use of mathematical concepts and language to describe and represent the interactions and dynamics of a system is known as a mathematical model. Mathematical modelling finds a huge number of successful applications across scientific, social and engineering fields, including biology, chemistry, physics, computer science, artificial intelligence, bioengineering, finance, economics and others. In this research, we propose a mathematical model that predicts the dissolution of a solid material immersed in a fluid. The developed model can be used to evaluate the rate of mass transfer and the mass transfer coefficient. Further research is expected to use the model as a base for developing useful models for the pharmaceutical industry, to gain information about the dissolution of medicaments in the bloodstream; this could play a key role in the formulation of medicaments.
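
    As a hedged sketch of the kind of model the abstract describes, the snippet below integrates a classical film-theory (Noyes-Whitney type) dissolution rate law, dm/dt = k*A*(Cs - C); this rate form and every parameter value are assumptions standing in for the authors' actual model.

```python
# Hypothetical parameters for a film-theory dissolution model,
#   dm/dt = k * A * (Cs - C(t)),  C(t) = m_dissolved / V.
k = 1.0e-5        # mass transfer coefficient (m/s)
A = 5.0e-3        # solid-liquid interfacial area (m^2), taken constant
Cs = 10.0         # saturation concentration (kg/m^3)
V = 1.0e-3        # fluid volume (m^3)
m0 = 5.0e-3       # initial mass of solid (kg)

dt, t_end = 1.0, 36000.0
m_solid, m_dissolved = m0, 0.0
for step in range(int(t_end / dt)):
    C = m_dissolved / V
    rate = k * A * max(Cs - C, 0.0)     # dissolution rate (kg/s)
    dm = min(rate * dt, m_solid)        # cannot dissolve more than remains
    m_solid -= dm
    m_dissolved += dm
    if m_solid <= 0.0:
        break

print(f"dissolved fraction after {step * dt / 3600:.1f} h: {m_dissolved / m0:.2%}")
```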

  4. Bayesian Modeling of Cerebral Information Processing

    OpenAIRE

    Labatut, Vincent; Pastor, Josette

    2001-01-01

    Modeling explicitly the links between cognitive functions and networks of cerebral areas is necessitated both by the understanding of the clinical outcomes of brain lesions and by the interpretation of activation data provided by functional neuroimaging techniques. At this global level of representation, the human brain can be best modeled by a probabilistic functional causal network. Our modeling approach is based on the anatomical connection pattern, the information ...

  5. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the necessary data and procedures to develop a DES model for the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed formulas. Markov chain models and their applications in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure to derive closed formulas evaluating the diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
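
    To make the Markov chain idea concrete, the following Python sketch propagates a patient cohort through a small set of hypothetical diagnosis-to-treatment states with a made-up weekly transition matrix; neither the states nor the probabilities come from the paper.

```python
import numpy as np

# Hypothetical stages of the diagnosis-to-treatment process and a made-up
# weekly transition matrix; rows sum to 1. "treated" and "exited" are absorbing.
states = ["referred", "diagnosed", "staged", "treated", "exited"]
P = np.array([
    [0.60, 0.30, 0.00, 0.00, 0.10],   # referred
    [0.00, 0.50, 0.40, 0.00, 0.10],   # diagnosed
    [0.00, 0.00, 0.40, 0.55, 0.05],   # staged
    [0.00, 0.00, 0.00, 1.00, 0.00],   # treated (absorbing)
    [0.00, 0.00, 0.00, 0.00, 1.00],   # exited  (absorbing)
])

# Everyone starts at referral; propagate the distribution over 20 weeks.
dist = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
for week in range(20):
    dist = dist @ P

for name, p in zip(states, dist):
    print(f"P(in '{name}' after 20 weeks) = {p:.3f}")
```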

  6. An Abstract Model of Historical Processes

    Directory of Open Access Journals (Sweden)

    Michael Poulshock

    2017-06-01

    A theoretical model is presented which provides a way to simulate, at a very abstract level, power struggles in the social world. In the model, agents can benefit or harm each other, to varying degrees and with differing levels of influence. The agents interact over time, using the power they have to try to get more of it, while being constrained in their strategic choices by social inertia. The outcomes of the model are probabilistic. More research is needed to determine whether the model has any empirical validity.

  7. Modeling and Advanced Control for Sustainable Process ...

    Science.gov (United States)

    This book chapter introduces a novel process systems engineering framework that integrates process control with sustainability assessment tools for the simultaneous evaluation and optimization of process operations. The implemented control strategy consists of a biologically-inspired, multi-agent-based method. The sustainability and performance assessment of process operating points is carried out using the U.S. E.P.A.’s GREENSCOPE assessment tool that provides scores for the selected economic, material management, environmental and energy indicators. The indicator results supply information on whether the implementation of the controller is moving the process towards a more sustainable operation. The effectiveness of the proposed framework is illustrated through a case study of a continuous bioethanol fermentation process whose dynamics are characterized by steady-state multiplicity and oscillatory behavior. This book chapter contribution demonstrates the application of novel process control strategies for sustainability by increasing material management, energy efficiency, and pollution prevention, as needed for SHC Sustainable Uses of Wastes and Materials Management.

  8. Modeling of Heating During Food Processing

    Science.gov (United States)

    Zheleva, Ivanka; Kamburova, Veselka

    Heat transfer processes are important for almost all aspects of food preparation and play a key role in determining food safety. Whether it is cooking, baking, boiling, frying, grilling, blanching, drying, sterilizing, or freezing, heat transfer is part of the processing of almost every food. Heat transfer is a dynamic process in which thermal energy is transferred from one body with higher temperature to another body with lower temperature. Temperature difference between the source of heat and the receiver of heat is the driving force in heat transfer.
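
    As a minimal illustration of temperature difference as the driving force, the sketch below applies a lumped-capacitance (Newton's law of heating) model to a hypothetical food item placed in a hot medium; all property values are assumed for the example.

```python
import math

# Lumped-capacitance sketch: the rate of temperature change is proportional to
# the difference between the heating medium and the product. Hypothetical values.
T_medium = 95.0      # heating medium temperature (deg C)
T0 = 20.0            # initial product temperature (deg C)
h = 25.0             # surface heat transfer coefficient (W/m^2.K)
A = 0.01             # surface area (m^2)
m = 0.25             # mass (kg)
cp = 3500.0          # specific heat capacity (J/kg.K)

tau = m * cp / (h * A)    # thermal time constant (s)

def product_temperature(t_seconds):
    """Exponential approach of the product temperature to the medium temperature."""
    return T_medium + (T0 - T_medium) * math.exp(-t_seconds / tau)

for minutes in (5, 15, 30):
    print(f"after {minutes:2d} min: T = {product_temperature(minutes * 60):.1f} deg C")
```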

  9. A computational model of human auditory signal processing and perception

    OpenAIRE

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell t...

  10. Multiscale soil-landscape process modeling

    NARCIS (Netherlands)

    Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    The general objective of this chapter is to illustrate the role of soils and geomorphological processes in the multiscale soil-landscape context. Included in this context are the fourth dimension (temporal dimension) and the human role (fifth dimension).

  11. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rules and business process modeling languages, compares the different business process modeling languages and business rule representation languages according to selected modeling aspects, and selects the best-fitting set of languages for a three-layer framework for business-rule-based software modeling.

  12. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keyword: Model Differencing, Model Merging, Model Synchronization...

  13. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts.

  14. Dynamic process model of a plutonium oxalate precipitator. Final report

    International Nuclear Information System (INIS)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in an actual plant operation could promote both process control and material safeguards control by serving as a baseline predictor which could give early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code which is being used at LLL for process modeling efforts

  15. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    The article describes a scientifically substantiated process of teaching the reading of English-language periodicals in all its components, which are developed consistently and form the interconnection of the structural elements in the process of teaching reading. This process is presented as several interconnected and interdetermined models: 1) models of the process of acquiring standard and expressive lexical knowledge; 2) models of the process of forming the skills to use such vocabulary; 3) models of the development of the skills to read texts of different linguistic levels.

  16. MODELING OF MANAGEMENT PROCESSES IN AN ORGANIZATION

    Directory of Open Access Journals (Sweden)

    Stefan Iovan

    2016-05-01

    When driving any major change within an organization, strategy and execution are intrinsic to a project's success. Nevertheless, closing the gap between strategy and execution remains a challenge for many organizations [1]. Companies tend to focus more on execution than strategy for quick results, instead of taking the time needed to understand the parts that make up the whole, so the right execution plan can be put in place to deliver the best outcomes. A large part of this is understanding that business operations don't fit neatly within the traditional organizational hierarchy. Business processes are often messy, collaborative efforts that cross teams, departments and systems, making them difficult to manage within a hierarchical structure [2]. Business process management (BPM) fills this gap by redefining an organization according to its end-to-end processes, so opportunities for improvement can be identified and processes streamlined for growth, revenue and transformation. This white paper provides guidelines on what to consider when using business process applications to support BPM initiatives, and the unique capabilities software systems provide that can help ensure both the project's success and the success of the organization as a whole [2].

  17. Modelling of additive manufacturing processes: a review and classification

    Science.gov (United States)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  18. Simulation Model Development for Mail Screening Process

    National Research Council Canada - National Science Library

    Vargo, Trish; Marvin, Freeman; Kooistra, Scott

    2005-01-01

    STUDY OBJECTIVE: Provide decision analysis support to the Homeland Defense Business Unit, Special Projects Team, in developing a simulation model to help determine the most effective way to eliminate backlog...

  19. The NPS Virtual Thermal Image Processing Model

    National Research Council Canada - National Science Library

    Lenter, Yucel

    2001-01-01

    ...). The MRTD is a standard performance measure for forward-looking infrared (FLIR) imaging systems. It takes into account thermal imaging system modeling concerns, such as modulation transfer functions...

  20. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective Prior research has emphasized the potential of visual cues to

  1. A model of the gas analysis system operation process

    Science.gov (United States)

    Yakimenko, I. V.; Kanishchev, O. A.; Lyamets, L. L.; Volkova, I. V.

    2017-12-01

    The characteristic features of modeling the gas-analysis measurement system operation process on the basis of the semi-Markov process theory are discussed. The model of the measuring gas analysis system operation process is proposed, which makes it possible to take into account the influence of the replacement interval, the level of reliability and maintainability and to evaluate the product reliability.

  2. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not always adequate, serve a useful role in product-process design. In this case, the use of a multi-dimensional and multi-scale model-based approach is important in product-process development. A computer-aided framewor...

  3. Capability Maturity Model (CMM) for Software Process Improvements

    Science.gov (United States)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  4. Support of Modelling in Process-Engineering Education

    NARCIS (Netherlands)

    Schaaf, van der H.; Vermuë, M.H.; Tramper, J.; Hartog, R.J.M.

    2006-01-01

    An objective of the Process Technology curriculum at Wageningen University is to teach students a stepwise modeling approach in the context of process engineering. Many process-engineering students have difficulty with learning to design a model. Some common problems are lack of structure in the

  5. On Process Modelling Using Physical Oriented And Phenomena Based Principles

    Directory of Open Access Journals (Sweden)

    Mihai Culea

    2000-12-01

    This work presents a modelling framework based on a phenomena description of the process. The approach is intended to make process models easy to understand and construct in heterogeneous, possibly distributed, modelling and simulation environments. A simplified case study of a heat exchanger is considered, and the Modelica modelling language is used to check the proposed concept. The partial results are promising, and the research effort will be extended into a computer aided modelling environment based on phenomena.

  6. Statistical image processing and multidimensional modeling

    CERN Document Server

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  7. Sensitivity study of reduced models of the activated sludge process ...

    African Journals Online (AJOL)

    2009-08-07

    ... order to fit the reduced model behaviour to the real data for the process behaviour. Keywords: wastewater treatment, activated sludge process, reduced model, model parameters, sensitivity function, Matlab simulation. The problem of effective and optimal control of wastewater treatment plants ...

  8. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  9. Neuro-fuzzy model for evaluating the performance of processes ...

    Indian Academy of Sciences (India)

    In this work an Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to model the periodic performance of some multi-input single-output (MISO) processes, namely: brewery operations (case study 1) and soap production (case study 2) processes. Two ANFIS models were developed to model the performance of the ...

  10. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  11. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    This paper describes the package PtProcess which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. Then the package enables one to carry out maximum likelihood fitting, goodness of fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions for a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences but the package is intended to have a much wider applicability.
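
    The package itself is in R; as a language-neutral illustration of simulating a temporal point process from a user-supplied conditional intensity (one of the tasks PtProcess supports), the following Python sketch uses Ogata-style thinning with a simple self-exciting (Hawkes-type) intensity. The intensity form and its parameters are assumptions, and this is not the PtProcess API.

```python
import math
import random

def intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    """Hypothetical conditional intensity of a self-exciting (Hawkes-type)
    process: background rate plus exponentially decaying excitation from
    every event at or before time t."""
    return mu + sum(alpha * math.exp(-beta * (t - s)) for s in events if s <= t)

def simulate_by_thinning(t_end=50.0, seed=42):
    """Ogata-style thinning. For the exponential kernel the intensity only
    decays between events, so its value just after the current time is a
    valid local upper bound for proposing the next candidate point."""
    random.seed(seed)
    t, events = 0.0, []
    while True:
        lam_bar = intensity(t, events)              # local upper bound
        t += random.expovariate(lam_bar)            # candidate point
        if t >= t_end:
            return events
        if random.random() < intensity(t, events) / lam_bar:
            events.append(t)                        # accept the candidate

events = simulate_by_thinning()
print(f"simulated {len(events)} events on [0, 50]")
```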

  12. A compositional process control model and its application to biochemical processes.

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    1999-01-01

    A compositional generic process control model is presented which has been applied to control enzymatic biochemical processes. The model has been designed at a conceptual and formal level using the compositional development method DESIRE, and includes processes for analysis, planning and simulation.

  13. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  14. A Compositional Process Control Model and its Application to Biochemical Processes

    NARCIS (Netherlands)

    Jonker, C.M.; Treur, J.

    2002-01-01

    A compositional generic process control model is presented which has been applied to control enzymatic biochemical processes. The model has been designed at a conceptual and formal level using the compositional development method DESIRE, and includes processes for analysis, planning and simulation.

  15. Business Process Modeling Languages Supporting Collaborative Networks

    NARCIS (Netherlands)

    Soleimani Malekan, H.; Afsarmanesh, H.; Hammoudi, S.; Maciaszek, L.A.; Cordeiro, J.; Dietz, J.L.G.

    2013-01-01

    Formalizing the definition of Business Processes (BPs) performed within each enterprise is fundamental for effective deployment of their competencies and capabilities within Collaborative Networks (CN). In our approach, every enterprise in the CN is represented by its set of BPs, so that other

  16. Model based optimization of MSWC process control

    NARCIS (Netherlands)

    Kessel, L.B.M. van; Leskens, M.

    2002-01-01

    Optimization of municipal solid waste combustion (MSWC) processes is an important issue due to the everlasting need for emission reduction, more optimal use of raw materials and overall cost reduction. The key of the approach of TNO (Netherlands Organisation for Applied Scientific Research) to

  17. Understanding Modeling Requirements of Unstructured Business Processes

    NARCIS (Netherlands)

    Allah Bukhsh, Zaharah; van Sinderen, Marten J.; Sikkel, Nicolaas; Quartel, Dick

    2017-01-01

    Management of structured business processes is of interest to both academia and industry, where academia focuses on the development of methods and techniques while industry focuses on the development of supporting tools. With the shift from routine to knowledge work, the relevance of management of

  18. Deconstructing crop processes and models via identities

    DEFF Research Database (Denmark)

    Porter, John Roy; Christensen, Svend

    2013-01-01

    This paper is part review and part opinion piece; it has three parts of increasing novelty and speculation in approach. The first presents an overview of how some of the major crop simulation models approach the issue of simulating the responses of crops to changing climatic and weather variables, mainly atmospheric CO2 concentration and increased and/or varying temperatures. It illustrates an important principle in models of a single cause having alternative effects and vice versa. The second part suggests some features, mostly missing in current crop models, that need to be included in the future, focussing on extreme events such as high temperature or extreme drought. The final opinion part is speculative but novel. It describes an approach to deconstruct resource use efficiencies into their constituent identities or elements based on the Kaya-Porter identity, each of which can...

  19. Model-Based Methods in the Biopharmaceutical Process Lifecycle.

    Science.gov (United States)

    Kroll, Paul; Hofer, Alexandra; Ulonska, Sophia; Kager, Julian; Herwig, Christoph

    2017-12-01

    Model-based methods are increasingly used in all areas of biopharmaceutical process technology. They can be applied in the field of experimental design, process characterization, process design, monitoring and control. Benefits of these methods are lower experimental effort, process transparency, clear rationality behind decisions and increased process robustness. The possibility of applying methods adopted from different scientific domains accelerates this trend further. In addition, model-based methods can help to implement regulatory requirements as suggested by recent Quality by Design and validation initiatives. The aim of this review is to give an overview of the state of the art of model-based methods, their applications, further challenges and possible solutions in the biopharmaceutical process life cycle. Today, despite these advantages, the potential of model-based methods is still not fully exhausted in bioprocess technology. This is due to a lack of (i) acceptance of the users, (ii) user-friendly tools provided by existing methods, (iii) implementation in existing process control systems and (iv) clear workflows to set up specific process models. We propose that model-based methods be applied throughout the lifecycle of a biopharmaceutical process, starting with the set-up of a process model, which is used for monitoring and control of process parameters, and ending with continuous and iterative process improvement via data mining techniques.

  20. Modeling of processing technologies in food industry

    Science.gov (United States)

    Korotkov, V. G.; Sagitov, R. F.; Popov, V. P.; Bachirov, V. D.; Akhmadieva, Z. R.; TSirkaeva, E. A.

    2018-03-01

    Currently, society is facing an urgent need to solve the problems of nutrition (products with increased nutritional value) and to develop energy-saving technologies for food products. Mathematical modeling of heat and mass transfer of polymer materials in extruders has been rather successful in recent years. A mathematical description of the movement and heat exchange during extrusion of a gluten-protein-starch-containing material, similar in structure to pasta dough, was taken as the framework for the mathematical model presented in this paper.

  1. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of the resources that processes use during simulation or execution of a process instance. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools, which are based on BPMN, where business process instances use resources concurrently.

  2. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    Science.gov (United States)

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently no international standard, nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  3. Modelling spray drying processes for dairy products

    NARCIS (Netherlands)

    Verdurmen, Ruud E.M.; Straatsma, Han; Verschueren, Maykel; van Haren, Jan; Smit, Erik; Bargeman, Gerrald; de Jong, Peter

    2002-01-01

    NIZO food research (The Netherlands) has been working for the food industry, the dairy industry in particular, for over 50 years. During the past 15 years NIZO food research has put a lot of effort into developing predictive computer models for the food industry. Nowadays the main challenges in the

  4. Mathematical modelling of the calcination process | Olayiwola ...

    African Journals Online (AJOL)

    High quality lime is an essential raw material for Electric Arc Furnaces and Basic Oxygen Furnaces, steelmaking, alumina production etc. Decrease in fuel consumption in metallurgical furnaces is a tremendous opportunity for reduction of greenhouse gas emissions into the atmosphere. In this paper, a mathematical model ...

  5. Process modeling of a HLA research lab

    Science.gov (United States)

    Ribeiro, Bruna G. C.; Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.

    2017-11-01

    Bioinformatics has provided tremendous breakthroughs in the field of molecular biology. All this evolution has generated a large volume of biological data that increasingly requires the use of computing for analysis and storage. The identification of human leukocyte antigen (HLA) genotypes is critical to the success of organ transplants in humans. HLA typing involves not only laboratory tests but also DNA sequencing, with the participation of several professionals responsible for different stages of the process. Thus, the objective of this paper is to map the main steps in HLA typing in a laboratory specialized in performing such procedures, analyzing each process and proposing solutions to speed up these steps while avoiding mistakes.

  6. Modeling bacterial decay coefficient during SSDML process

    Energy Technology Data Exchange (ETDEWEB)

    Sreekrishnan, T.R.; Tyagi, R.D.; Blais, J.F.; Meunier, N.; Cambell, P.G.C. [Univ. de Quebec, Ste-Foy, Quebec (Canada)

    1996-11-01

    The simultaneous sludge digestion and metal leaching (SSDML) process can leach out heavy metals, achieve sludge solids reduction, and eliminate sludge pathogens. The potential for application in the wastewater treatment industry requires a sound knowledge of the system kinetics. The present work targets a better understanding of the qualitative as well as quantitative relationships between solids reduction rate and other parameters such as sludge pH, initial MLSS concentration, and availability of oxygen during the SSDML process. Experiments were carried out in laboratory batch reactors (20 L working volume) as well as in a 4,000 L capacity pilot facility. Based on the results of these experiments, it was concluded that degradation rate of sludge volatile matter is influenced by (1) sludge pH; (2) availability of oxygen; and (3) initial mixed liquor suspended solids (MLSS) concentration of the sludge. The degradation rate constant for biodegradable fraction of the mixed liquor volatile suspended solids [MLVSS(B)] was computed for various initial MLVSS concentration and sludge pH ranges. The value of k{sub d} decreased with decreasing pH in all cases. Effect of initial MLSS concentration on the value of k{sub d} was found to be minimal for the sludge studied. The relation between the sludge pH and k{sub d} for this sludge was expressed in the form of two polynomials. The relations developed were used in conjunction with previous results on the SSDML process kinetics to simulate the overall SSDML process. Results of these simulation studies were found satisfactory when compared to actual experimental results.
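
    Purely as an illustration of the kind of relation reported (a pH-dependent degradation rate constant k_d driving the decay of the biodegradable volatile solids), the sketch below uses a made-up polynomial for k_d and an assumed first-order decay law; neither reproduces the authors' fitted expressions.

```python
import math

def k_d(pH):
    """Made-up relation: the decay rate constant (1/day) decreases as pH drops,
    qualitatively mirroring the trend described in the abstract."""
    return max(0.05 + 0.03 * (pH - 2.0), 0.0)

def mlvss_b(t_days, initial_mlvss_b, pH):
    """Assumed first-order decay of the biodegradable MLVSS fraction,
    X(t) = X0 * exp(-k_d * t)."""
    return initial_mlvss_b * math.exp(-k_d(pH) * t_days)

for pH in (2.0, 3.0, 4.0):
    remaining = mlvss_b(t_days=10.0, initial_mlvss_b=12.0, pH=pH)  # g/L
    print(f"pH {pH:.1f}: MLVSS(B) after 10 d = {remaining:.2f} g/L "
          f"(k_d = {k_d(pH):.2f} 1/d)")
```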

  7. Stochastic Models in the Identification Process

    Czech Academy of Sciences Publication Activity Database

    Slovák, Dalibor; Zvárová, Jana

    2011-01-01

    Vol. 7, No. 1 (2011), pp. 44-50, ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: identification process * weight-of-evidence formula * coancestry coefficient * beta-binomial sampling formula * DNA mixtures Subject RIV: IN - Informatics, Computer Science http://www.ejbi.eu/images/2011-1/Slovak_en.pdf

  8. A process model of global purchasing

    OpenAIRE

    MATTHYSSENS, Paul; QUINTENS, Lieven; FAES, Wouter

    2003-01-01

    Inward internationalisation has received more and more attention in recent literature. This article contributes to this developing domain by providing a holistic description of the underlying processes of global purchasing. By means of case study research, carried out in eight companies, drivers and inhibitors of globalisation are highlighted. Conditions that could make global purchasing more efficient and effective are suggested. Attention is drawn to key factors on which companies strategie...

  9. A model for dealing with parallel processes in supervision

    OpenAIRE

    Lilja Cajvert

    2011-01-01

    Supervision in social work is essential for successful outcomes when working with clients. In social work, unconscious difficulties may arise and similar difficulties may occur in supervision as parallel processes. In this article, the development of a practice-based model of supervision to deal with parallel processes in supervision is described. The model has six phases. In the first phase, the focus is on the supervisor's inner ...

  10. Catastrophe insurance modeled by shot-noise processes

    OpenAIRE

    Schmidt, Thorsten

    2014-01-01

    Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with s...
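
    As a hedged sketch of the process class described (a Poisson-driven jump followed by an exponential decline), the following Python snippet simulates a basic shot-noise path; the rate, jump-size distribution and decay constant are all hypothetical.

```python
import math
import random

def simulate_shot_noise(rate=2.0, mean_jump=1.0, decay=0.7,
                        t_end=10.0, dt=0.01, seed=1):
    """Simulate S(t) = sum over shots of Y_i * exp(-decay * (t - T_i)) for t >= T_i:
    Poisson arrival times T_i (intensity `rate`), exponential jump sizes Y_i,
    and an exponential decline after each shot. All parameters are hypothetical."""
    random.seed(seed)
    # Draw the Poisson arrival times and jump sizes on [0, t_end].
    jumps, t = [], 0.0
    while True:
        t += random.expovariate(rate)
        if t >= t_end:
            break
        jumps.append((t, random.expovariate(1.0 / mean_jump)))

    # Evaluate the process on a regular time grid.
    path = []
    for k in range(int(t_end / dt) + 1):
        s = k * dt
        value = sum(y * math.exp(-decay * (s - ti)) for ti, y in jumps if ti <= s)
        path.append((s, value))
    return path

path = simulate_shot_noise()
peak = max(v for _, v in path)
print(f"peak of the shot-noise path on [0, 10]: {peak:.2f}")
```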

  11. Calculation of normalised organ and effective doses to adult reference computational phantoms from contemporary computed tomography scanners

    International Nuclear Information System (INIS)

    Jansen, Jan T.M.; Shrimpton, Paul C.

    2010-01-01

    The general-purpose Monte Carlo radiation transport code MCNPX has been used to simulate photon transport and energy deposition in anthropomorphic phantoms due to the x-ray exposure from the Philips iCT 256 and Siemens Definition CT scanners, together with the previously studied General Electric 9800. The MCNPX code was compiled with the Intel FORTRAN compiler and run on a Linux PC cluster. A patch has been successfully applied to reduce computing times by about 4%. The International Commission on Radiological Protection (ICRP) has recently published the Adult Male (AM) and Adult Female (AF) reference computational voxel phantoms as successors to the Medical Internal Radiation Dose (MIRD) stylised hermaphrodite mathematical phantoms that form the basis for the widely-used ImPACT CT dosimetry tool. Comparisons of normalised organ and effective doses calculated for a range of scanner operating conditions have demonstrated significant differences in results (in excess of 30%) between the voxel and mathematical phantoms as a result of variations in anatomy. These analyses illustrate the significant influence of choice of phantom on normalised organ doses and the need for standardisation to facilitate comparisons of dose. Further such dose simulations are needed in order to update the ImPACT CT Patient Dosimetry spreadsheet for contemporary CT practice. (author)

  12. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations live in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and BPMN notation is a suitable way to describe them. Once processes are described in BPMN, they should be controlled to ensure their expected quality. A system (which could be automated) based on mathematical expressions of the qualitative characteristics of process models (i.e., measures of the quality of process models) can support such process controls. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe the mentioned system, based on measures of the quality of process models, and to answer the associated scientific questions.

  13. Validation of the filament winding process model

    Science.gov (United States)

    Calius, Emilo P.; Springer, George S.; Wilson, Brian A.; Hanson, R. Scott

    1987-01-01

    Tests were performed toward validating the WIND model developed previously for simulating the filament winding of composite cylinders. In these tests two 24 in. long, 8 in. diam and 0.285 in. thick cylinders, made of IM-6G fibers and HBRF-55 resin, were wound at + or - 45 deg angle on steel mandrels. The temperatures on the inner and outer surfaces and inside the composite cylinders were recorded during oven cure. The temperatures inside the cylinders were also calculated by the WIND model. The measured and calculated temperatures were then compared. In addition, the degree of cure and resin viscosity distributions inside the cylinders were calculated for the conditions which existed in the tests.

  14. Modified Invasion Percolation Models for Multiphase Processes

    Energy Technology Data Exchange (ETDEWEB)

    Karpyn, Zuleima [Pennsylvania State Univ., State College, PA (United States)

    2015-01-31

    This project extends current understanding and modeling capabilities of pore-scale multiphase flow physics in porous media. High-resolution X-ray computed tomography imaging experiments are used to investigate structural and surface properties of the medium that influence immiscible displacement. Using experimental and computational tools, we investigate the impact of wetting characteristics, as well as radial and axial loading conditions, on the development of percolation pathways, residual phase trapping and fluid-fluid interfacial areas.

  15. Modeling non-Gaussian time-varying vector autoregressive process

    Data.gov (United States)

    National Aeronautics and Space Administration — We present a novel and general methodology for modeling time-varying vector autoregressive processes which are widely used in many areas such as modeling of chemical...
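    The record gives no algorithmic detail, so the following is only a minimal illustration of the object being modelled: a two-dimensional VAR(1) process whose coefficient matrix drifts smoothly over time. The dimensions, drift scheme and noise level are assumptions for illustration, not the methodology of the referenced work.

      import numpy as np

      def simulate_tvar1(T=500, seed=0):
          """Simulate a 2-D time-varying VAR(1): x_t = A(t) x_{t-1} + e_t.

          The coefficient matrix A(t) drifts smoothly between two regimes,
          which is one simple way a time-varying autoregression can arise.
          Illustrative only; not the referenced paper's method.
          """
          rng = np.random.default_rng(seed)
          A0 = np.array([[0.5, 0.1], [0.0, 0.4]])   # coefficients at t = 0
          A1 = np.array([[0.1, -0.3], [0.2, 0.6]])  # coefficients at t = T
          x = np.zeros((T, 2))
          for t in range(1, T):
              w = t / (T - 1)                        # smooth interpolation weight
              A = (1 - w) * A0 + w * A1
              x[t] = A @ x[t - 1] + rng.normal(scale=0.1, size=2)
          return x

      series = simulate_tvar1()
      print(series[:5])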

  16. Modeling Resource Hotspots: Critical Linkages and Processes

    Science.gov (United States)

    Daher, B.; Mohtar, R.; Pistikopoulos, E.; McCarl, B. A.; Yang, Y.

    2017-12-01

    Growing demands for interconnected resources emerge in the form of hotspots of varying characteristics. The business as usual allocation model cannot address the current, let alone anticipated, complex and highly interconnected resource challenges we face. A new paradigm for resource allocation must be adopted: one that identifies cross-sectoral synergies and that moves away from silos to recognition of the nexus and integration of it. Doing so will result in new opportunities for business growth, economic development, and improved social well-being. Solutions and interventions must be multi-faceted; opportunities should be identified with holistic trade-offs in mind. No single solution fits all: different hotspots will require distinct interventions. Hotspots have varying resource constraints, stakeholders, goals and targets. The San Antonio region represents a complex resource hotspot with promising potential: its rapidly growing population, the Eagle Ford shale play, and the major agricultural activity there make it a hotspot with many competing demands. Stakeholders need tools to allow them to knowledgeably address impending resource challenges. This study will identify contemporary WEF nexus questions and critical system interlinkages that will inform the modeling of the tightly interconnected resource systems and stresses using the San Antonio Region as a base; it will conceptualize a WEF nexus modeling framework, and develop assessment criteria to inform integrative planning and decision making.

  17. Modelling of chemical reactions in metallurgical processes

    OpenAIRE

    Kinaci, M. Efe; Lichtenegger, Thomas; Schneiderbauer, Simon

    2017-01-01

    Iron-ore reduction has attracted much interest in the last three decades since it can be considered a core process in the steel industry. The iron ore is reduced to iron using blast furnace and fluidized bed technologies. To investigate the harsh conditions inside fluidized bed reactors, computational tools can be utilized. One such tool is the CFD-DEM method, in which the gas phase reactions and governing equations are calculated on the Eulerian (CFD) side, whereas the particle reac...

  18. Holonic Business Process Modeling in Small to Medium Sized Enterprises

    OpenAIRE

    Nur Budi Mulyono; Tezar Yuliansyah Saputra; Nur Arief Rahmatsyah

    2012-01-01

    Holonic modeling analysis, the application of systems thinking to design, management, and improvement, is used in a novel context for business process modeling. An approach and techniques of holons and holarchies are presented specifically for small and medium sized enterprise process modeling development. The fitness of the approach is compared with the well-known reductionist, or task breakdown, approach. The strengths and weaknesses of holonic modeling are discussed with an illustrative case exa...

  19. Modeling interdependencies between business and communication processes in hospitals.

    Science.gov (United States)

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, there are no tools available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we will present an approach which facilitates the representation and analysis of business processes and of the resulting communication processes between application components, together with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  20. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restriction embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models to their corresponding refined process structure trees (PSTs), that is, we decompose a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on a divide and conquer strategy, where the difference is described by an edit script whose cost we keep close to the minimum. The extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
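    The approach above builds refined process structure trees (PSTs), converts them to task-based trees (TPSTs) and derives a near-minimum-cost edit script by divide and conquer. The sketch below illustrates only the last idea, diffing two small labelled trees into an edit script; the node structure, positional child matching and operation names are simplifying assumptions, not the paper's actual data structures or algorithm.

      # Minimal sketch: diff two labelled trees into an edit script.
      # A real implementation would first derive TPSTs from the process models
      # and search for a minimum-cost alignment of children.
      from dataclasses import dataclass, field

      @dataclass
      class Node:
          label: str
          children: list = field(default_factory=list)

      def diff(a, b, path="root"):
          """Return a list of edit operations turning tree `a` into tree `b`."""
          ops = []
          if a is None:
              return [("insert", path, b.label)]
          if b is None:
              return [("delete", path, a.label)]
          if a.label != b.label:
              ops.append(("relabel", path, a.label, b.label))
          # Pair children positionally (a simplifying assumption).
          for i in range(max(len(a.children), len(b.children))):
              ca = a.children[i] if i < len(a.children) else None
              cb = b.children[i] if i < len(b.children) else None
              ops.extend(diff(ca, cb, f"{path}/{i}"))
          return ops

      t1 = Node("seq", [Node("A"), Node("B"), Node("C")])
      t2 = Node("seq", [Node("A"), Node("X"), Node("C"), Node("D")])
      print(diff(t1, t2))
      # e.g. [('relabel', 'root/1', 'B', 'X'), ('insert', 'root/3', 'D')]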

  1. Modelling energy spot prices by Lévy semistationary processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Benth, Fred Espen; Veraart, Almut

    This paper introduces a new modelling framework for energy spot prices based on Lévy semistationary processes. Lévy semistationary processes are special cases of the general class of ambit processes. We provide a detailed analysis of the probabilistic properties of such models and we show how they are able to capture many of the stylised facts observed in energy markets. Furthermore, we derive forward prices based on our spot price model. As it turns out, many of the classical spot models can be embedded into our novel modelling framework....
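    For readers unfamiliar with the class, a Lévy semistationary process is commonly written as a kernel-smoothed moving average of a Lévy process. The display below is the standard textbook form; the exact specification used in the paper may differ in details such as the drift term.

      Y_t = \mu + \int_{-\infty}^{t} g(t-s)\,\sigma_{s-}\,\mathrm{d}L_s + \int_{-\infty}^{t} q(t-s)\,a_s\,\mathrm{d}s ,

    where L is a two-sided Lévy process, g and q are deterministic kernels vanishing on the negative half-line, and σ and a are stationary volatility and drift processes; stationarity of (σ, a) is what makes Y stationary, hence "semistationary".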

  2. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  3. Diff-based model synchronization in an industrial MDD process

    DEFF Research Database (Denmark)

    Kindler, Ekkart; Könemann, Patrick; Unland, Ludger

    of different models is maintained manually in many cases today. This paper presents an approach for automated model differencing, so that the differences between two model versions (called delta) can be extracted and stored. It can then be re-used independently of the models it was created from...... to interactively merge different model versions, and for synchronizing other types of models. The main concern was to apply our concepts to an industrial process, so usability and performance were important issues....

  4. A Garbage Can Model of the Psychological Research Process.

    Science.gov (United States)

    Martin, Joanne

    1981-01-01

    Reviews models commonly used in psychological research, and, particularly, in organizational decision making. An alternative model of organizational decision making is suggested. The model, referred to as the garbage can model, describes a process in which members of an organization collect the problems and solutions they generate by dumping them…

  5. Modeling Grinding Processes as Micro-Machining Operation ...

    African Journals Online (AJOL)

    A computation-based model for the surface grinding process as a micro-machining operation has been developed. In this model, grinding forces are made up of a chip formation force and a sliding force. Mathematical expressions for modeling the tangential grinding force and the normal grinding force were obtained. The model was ...
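    In symbols (the notation here is an assumption, not taken from the article), the stated decomposition of the grinding forces reads

      F_t = F_{t,\mathrm{chip}} + F_{t,\mathrm{sl}} , \qquad F_n = F_{n,\mathrm{chip}} + F_{n,\mathrm{sl}} ,

    where F_t and F_n are the tangential and normal grinding forces and the two terms on each right-hand side are the chip-formation and sliding contributions.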

  6. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field on top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and only slows down the processing. The biggest challenge nowadays is to get high quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on getting high calculation performance, which increases the computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  7. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior research advocated integrating business rules into business process models to improve the effectiveness of important organizational activities, such as developing shared understanding, effective communication, and process improvement. However, whether such integrated modeling can improve the understanding of business processes has not been empirically evaluated. In this paper, we report on an experiment that investigates the effect of linked rules, a specific rule integration approach, on business process model understanding. Our results indicate that linked rules are associated with better time efficiency......

  8. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure-limitation procedures, first the system's hardware and the media present (water, steam, coolant fluid) are formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model. At the semi-microscopic level, a cellular data compressor, a feature detection device and the Intelligent Physical Element's process dynamics are at work. At the macroscopic level, the Walking Process Elements, a process-evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating device and a countermeasure-coordinating device are used. In order to obtain process insights automatically, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration

  9. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    Full Text Available The Precipitation-Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized by process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.
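    The study ran the Fourier amplitude sensitivity test (FAST) on PRMS itself; as a much smaller illustration of the same kind of analysis, the sketch below applies FAST to a toy response using the SALib package (an assumption about tooling). The parameter names, bounds and toy model are hypothetical and are not taken from the study.

      # Minimal FAST-style sensitivity sketch on a toy response, assuming SALib.
      # The study itself ran FAST on PRMS over ~110,000 spatial units.
      import numpy as np
      from SALib.sample import fast_sampler
      from SALib.analyze import fast

      problem = {
          "num_vars": 3,
          "names": ["snow_melt_rate", "soil_capacity", "et_coeff"],  # hypothetical names
          "bounds": [[1.0, 5.0], [50.0, 500.0], [0.5, 1.5]],
      }

      X = fast_sampler.sample(problem, 1000)           # FAST sampling design

      def toy_runoff(x):
          melt, soil, et = x
          return 2.0 * melt - 0.01 * soil - 1.5 * et   # stand-in for a model output statistic

      Y = np.array([toy_runoff(row) for row in X])
      Si = fast.analyze(problem, Y)                    # first-order and total-order indices
      for name, s1 in zip(problem["names"], Si["S1"]):
          print(f"{name}: S1 = {s1:.2f}")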

  10. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  11. A Kinetic Ladle Furnace Process Simulation Model: Effective Equilibrium Reaction Zone Model Using FactSage Macro Processing

    Science.gov (United States)

    Van Ende, Marie-Aline; Jung, In-Ho

    2017-02-01

    The ladle furnace (LF) is widely used in the secondary steelmaking process in particular for the de-sulfurization, alloying, and reheating of liquid steel prior to the casting process. The Effective Equilibrium Reaction Zone model using the FactSage macro processing code was applied to develop a kinetic LF process model. The slag/metal interactions, flux additions to slag, various metallic additions to steel, and arcing in the LF process were taken into account to describe the variations of chemistry and temperature of steel and slag. The LF operation data for several steel grades from different plants were accurately described using the present kinetic model.

  12. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates where the experimental effort could be focused. In this contribution a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its tools integration and model import and export capabilities, is presented. Modelling templates ... provides building blocks for the templates (generic models previously developed); 3) computer aided methods and tools, that include procedures to perform model translation, model analysis, model verification/validation, model solution and model documentation. In this work, the integrated use of all three ...

  13. Case study modelling for an ettringite treatment process ...

    African Journals Online (AJOL)

    The process modelled in this study includes the formation of ettringite and the recovery of gibbsite through the decomposition of recycled ettringite. The modelling of this process was done using PHREEQC and the results presented in this paper are based on the outcome of different case studies that investigated how the ...

  14. Parallel direct solver for finite element modeling of manufacturing processes

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Martins, P.A.F.

    2017-01-01

    The central processing unit (CPU) time is of paramount importance in finite element modeling of manufacturing processes. Because the most significant part of the CPU time is consumed in solving the main system of equations resulting from finite element assemblies, different approaches have been...... developed to optimize solutions and reduce the overall computational costs of large finite element models....

  15. Product Trial Processing (PTP): a model approach from ...

    African Journals Online (AJOL)

    Product Trial Processing (PTP): a model approach from the consumer's perspective. ... Global Journal of Social Sciences ... Among the constructs used in the model of the consumer's processing of product trial are: experiential and non-experiential attributes, perceived validity of product trial, consumer perceived expertise, ...

  16. MODELING OF AUTOMATION PROCESSES CONCERNING CROP CULTIVATION BY AVIATION

    Directory of Open Access Journals (Sweden)

    V. I. Ryabkov

    2010-01-01

    Full Text Available The paper considers the modeling of automated processes for crop cultivation by aviation. Processes that take place in three interconnected environments (human, technical, and mobile airborne objects) are described by a model based on set theory. A stochastic network theory of queueing (mass service) systems is proposed for describing the real-time human-machine system.

  17. DEVELOPMENT OF SMALL-SCALE CONSTRUCTION ENTERPRISE PROCESS MANAGEMENT MODEL

    OpenAIRE

    E. V. Folomeev

    2012-01-01

    The process approach is one of the most effective ways of managing construction companies. By using models based on this approach, the company's structure becomes flexible enough to be quickly tuned, functionally and structurally, for specific projects. This article demonstrates how to develop a process management model for a small-scale construction company.

  18. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  19. Simple models of the hydrofracture process

    KAUST Repository

    Marder, M.

    2015-12-29

    Hydrofracturing to recover natural gas and oil relies on the creation of a fracture network with pressurized water. We analyze the creation of the network in two ways. First, we assemble a collection of analytical estimates for pressure-driven crack motion in simple geometries, including crack speed as a function of length, energy dissipated by fluid viscosity and used to break rock, and the conditions under which a second crack will initiate while a first is running. Second, we develop a pseudo-three-dimensional numerical model that couples fluid motion with solid mechanics and can generate branching crack structures not specified in advance. One of our main conclusions is that the typical spacing between fractures must be on the order of a meter, and this conclusion arises in two separate ways. First, it arises from analysis of gas production rates, given the diffusion constants for gas in the rock. Second, it arises from the number of fractures that should be generated given the scale of the affected region and the amounts of water pumped into the rock.
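    The production-rate argument for metre-scale spacing can be phrased as a diffusion-length estimate (a paraphrase of the scaling, not the paper's exact expressions): gas sitting half a spacing away from the nearest fracture is recovered within a production time t_prod only if roughly

      \frac{s}{2} \lesssim \sqrt{D \, t_{\mathrm{prod}}} ,

    where D is the effective diffusivity of gas in the rock. For the very low diffusivities of tight shale and production times of a few years, this bound works out to the order of a metre, which is the spacing quoted above.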

  20. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Full Text Available Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating the business process repository and organisational knowledge as the foundation for knowledge management system development.

  1. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example for a framework of integrated process modeling not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000 evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  2. Object Oriented Business Process Modelling in RFID Applied Computing Environments

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    As a tracking technology, Radio Frequency Identification (RFID) is now widely applied to enhance the context awareness of enterprise information systems. Such awareness provides great opportunities to facilitate business process automation and thereby improve operation efficiency and accuracy. With the aim to incorporate business logics into RFID-enabled applications, this book chapter addresses how RFID technologies impact current business process management and the characteristics of object-oriented business process modelling. This chapter first discusses the rationality and advantages of applying object-oriented process modelling in RFID applications, then addresses the requirements and guidelines for RFID data management and process modelling. Two typical solutions are introduced to further illustrate the modelling and incorporation of business logics/business processes into RFID edge systems. To demonstrate the applicability of these two approaches, a detailed case study is conducted within a distribution centre scenario.

  3. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  4. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350 °F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  5. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate quality assurance and improvement methods and tools are identified. The main manufacturing process principles are investigated in order to scrutinize one general model of the manufacturing process and to define the manufacturing process preparation level. Development and introduction of the operational quality improvement model is based on research conducted into the application possibilities of these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  6. Space in multi-agent systems modelling spatial processes

    Directory of Open Access Journals (Sweden)

    Petr Rapant

    2007-06-01

    Full Text Available The need for modelling of spatial processes has recently arisen in the sphere of geoinformation systems. Some processes (especially natural ones) can be modelled by means of external tools, e.g. for modelling of contaminant transport in the environment. But in the case of socio-economic processes, suitable tools interconnected with GIS are still a subject of research and development. One of the candidate technologies is so-called multi-agent systems. Their theory is quite well developed, but they lack suitable means for dealing with space. This article deals with this problem and proposes a solution for the field of road transport modelling.

  7. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
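    The record does not give the updating details, but the idea of refreshing an expert-elicited parameter with plant data can be sketched with a conjugate Beta-Binomial example. The parameter, prior and counts below are hypothetical and are not the values from Nauta et al. (2005) or the Berrang and Dickens data.

      # Hypothetical conjugate sketch: an expert prior on a transfer probability p
      # (e.g. Campylobacter transfer at one processing stage) updated with
      # binomial plant data.  All numbers are illustrative only.
      from scipy import stats

      alpha_prior, beta_prior = 2.0, 8.0      # expert judgment, prior mean 0.2
      positives, trials = 35, 100             # hypothetical microbiological counts

      alpha_post = alpha_prior + positives
      beta_post = beta_prior + trials - positives

      posterior = stats.beta(alpha_post, beta_post)
      print(f"posterior mean = {posterior.mean():.3f}")
      print(f"95% credible interval = {posterior.interval(0.95)}")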

  8. Modelling the pultrusion process of off shore wind turbine blades

    NARCIS (Netherlands)

    Baran, Ismet

    This thesis is devoted to the numerical modelling of the pultrusion process for industrial products such as wind turbine blades and structural profiles. The main focus is on the thermo-chemical and mechanical analyses of the process, in which the process-induced stresses and shape distortions together

  9. process setting models for the minimization of costs defectives

    African Journals Online (AJOL)

    Dr Obe

    Optimal setting process models: optimal setting of the process mean in the case of a one-sided limit. In a filling operation, the process average net weight must be set. The standards prescribe the minimum weight, which is printed on the packet. This set of quality control problems has a one-sided limit (the minimum net weight).
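    The classic version of this problem can be solved numerically in a few lines: with a lower specification limit (the printed minimum weight), a normally distributed fill weight, a cost per gram of product and a cost per under-weight packet, the expected cost per packet is minimised over the mean setting. The costs and standard deviation below are assumed for illustration, not taken from the article.

      # Minimal sketch of setting a filling-process mean against a one-sided
      # (minimum net weight) limit.  Costs and sigma are assumed values.
      from scipy import stats
      from scipy.optimize import minimize_scalar

      L = 500.0        # printed minimum net weight (g)
      sigma = 5.0      # process standard deviation (g)
      c_fill = 0.002   # cost per gram of product filled
      c_reject = 2.0   # cost of one under-weight packet

      def expected_cost(mu):
          p_under = stats.norm.cdf(L, loc=mu, scale=sigma)   # P(weight < L)
          return c_fill * mu + c_reject * p_under

      res = minimize_scalar(expected_cost, bounds=(L, L + 6 * sigma), method="bounded")
      print(f"optimal mean setting ~ {res.x:.1f} g, "
            f"underweight fraction ~ {stats.norm.cdf(L, res.x, sigma):.4f}")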

  10. The two-process model : Origin and perspective

    NARCIS (Netherlands)

    Daan, S.; Hut, R. A.; Beersma, D.

    In the two-process model as developed in the early 1980's sleep is controlled by a process-S, representing the rise and fall of sleep demand resulting from prior sleep-wake history, interacting with a process-C representing circadian variation in sleep propensity. S and C together optimize sleep

  11. An Information-Processing Model of Crisis Management.

    Science.gov (United States)

    Egelhoff, William G.; Sen, Falguni

    1992-01-01

    Develops a contingency model for managing a variety of corporate crises. Views crisis management as an information-processing situation and organizations that must cope with crisis as information-processing systems. Attempts to fit appropriate information-processing mechanisms to different categories of crises. (PRA)

  12. Software engineering with process algebra: Modelling client / server architecures

    NARCIS (Netherlands)

    Diertens, B.

    2009-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. We also described this software development process more formally by presenting the

  13. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  14. Catastrophe Insurance Modeled by Shot-Noise Processes

    Directory of Open Access Journals (Sweden)

    Thorsten Schmidt

    2014-02-01

    Full Text Available Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with shot-noise processes. Besides this, we obtain some highly tractable examples that constitute a useful modeling tool for dynamic claims processes. The results can in particular be used for pricing Catastrophe Bonds (CAT bonds), a traded risk-linked security. Additionally, current results regarding the estimation of shot-noise processes are reviewed.
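    The defining jump-then-decay behaviour is easy to simulate. The sketch below draws a homogeneous compound Poisson stream of claim shots and lets each decay exponentially; the exponential kernel and all parameter values are assumptions for illustration, whereas the paper treats more general, time-inhomogeneous drivers.

      # Illustrative shot-noise path: claims arrive as a Poisson process and each
      # shot decays exponentially afterwards.
      import numpy as np

      def shot_noise_path(T=10.0, rate=2.0, decay=1.5, mean_shot=1.0,
                          n_grid=1000, seed=1):
          rng = np.random.default_rng(seed)
          n_shots = rng.poisson(rate * T)
          arrival_times = np.sort(rng.uniform(0.0, T, size=n_shots))
          shot_sizes = rng.exponential(mean_shot, size=n_shots)
          t = np.linspace(0.0, T, n_grid)
          # S(t) = sum over shots before t of size * exp(-decay * (t - arrival))
          path = np.zeros_like(t)
          for ti, yi in zip(arrival_times, shot_sizes):
              mask = t >= ti
              path[mask] += yi * np.exp(-decay * (t[mask] - ti))
          return t, path

      t, s = shot_noise_path()
      print(f"max of path: {s.max():.2f}")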

  15. Modeling technique for the process of liquid film disintegration

    Science.gov (United States)

    Modorskii, V. Ya.; Sipatov, A. M.; Babushkina, A. V.; Kolodyazhny, D. Yu.; Nagorny, V. S.

    2016-10-01

    In the course of numerical experiments, a method for the calculation of two-phase flows was developed by solving a model problem. The results of the study were compared between two models that describe the processes of two-phase flow and the collapse of the liquid jet into droplets: the VoF model and the QMOM model were considered for the implementation of the spray.

  16. Measuring the Compliance of Processes with Reference Models

    Science.gov (United States)

    Gerke, Kerstin; Cardoso, Jorge; Claus, Alexander

    Reference models provide a set of generally accepted best practices to create efficient processes to be deployed inside organizations. However, a central challenge is to determine how these best practices are implemented in practice. One limitation of existing approaches for measuring compliance is the assumption that the compliance can be determined using the notion of process equivalence. Nonetheless, the use of equivalence algorithms is not adequate since two models can have different structures but one process can still be compliant with the other. This paper presents a new approach and algorithm which allow the compliance of process models with reference models to be measured. We evaluate our approach by measuring the compliance of a model currently used by a German passenger airline with the IT Infrastructure Library (ITIL) reference model and by comparing our results with existing approaches.

  17. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and, second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics......

  18. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

  19. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  20. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639

  1. What Controls the Vertical Distribution of Aerosol? Relationships Between Process Sensitivity in HadGEM3-UKCA and Inter-Model Variation from AeroCom Phase II

    Science.gov (United States)

    Kipling, Zak; Stier, Philip; Johnson, Colin E.; Mann, Graham W.; Bellouin, Nicolas; Bauer, Susanne E.; Bergman, Tommi; Chin, Mian; Diehl, Thomas; Ghan, Steven J.; hide

    2016-01-01

    same processes as the component mass profiles, plus the size distribution of primary emissions. We also show that the processes that affect the AOD-normalised radiative forcing in the model are predominantly those that affect the vertical mass distribution, in particular convective transport, in-cloud scavenging, aqueous oxidation, ageing and the vertical extent of biomass-burning emissions.

  2. A Measurable Model of the Creative Process in the Context of a Learning Process

    Science.gov (United States)

    Ma, Min; Van Oystaeyen, Fred

    2016-01-01

    The authors' aim was to arrive at a measurable model of the creative process by putting creativity in the context of a learning process. The authors aimed to provide a rather detailed description of how creative thinking fits in a general description of the learning process without trying to go into an analysis of a biological description of the…

  3. Toward Cognitively Constrained Models of Language Processing: A Review

    Directory of Open Access Journals (Sweden)

    Margreet Vogelzang

    2017-09-01

    Full Text Available Language processing is not an isolated capacity, but is embedded in other aspects of our cognition. However, it is still largely unexplored to what extent and how language processing interacts with general cognitive resources. This question can be investigated with cognitively constrained computational models, which simulate the cognitive processes involved in language processing. The theoretical claims implemented in cognitive models interact with general architectural constraints such as memory limitations. This way, it generates new predictions that can be tested in experiments, thus generating new data that can give rise to new theoretical insights. This theory-model-experiment cycle is a promising method for investigating aspects of language processing that are difficult to investigate with more traditional experimental techniques. This review specifically examines the language processing models of Lewis and Vasishth (2005), Reitter et al. (2011), and Van Rij et al. (2010), all implemented in the cognitive architecture Adaptive Control of Thought-Rational (Anderson et al., 2004). These models are all limited by the assumptions about cognitive capacities provided by the cognitive architecture, but use different linguistic approaches. Because of this, their comparison provides insight into the extent to which assumptions about general cognitive resources influence concretely implemented models of linguistic competence. For example, the sheer speed and accuracy of human language processing is a current challenge in the field of cognitive modeling, as it does not seem to adhere to the same memory and processing capacities that have been found in other cognitive processes. Architecture-based cognitive models of language processing may be able to make explicit which language-specific resources are needed to acquire and process natural language. The review sheds light on cognitively constrained models of language processing from two angles: we

  4. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the availability of a large number of modelling languages and also due to the lack of guidelines on evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
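    The framework combines SEQUAL-based quality criteria with an MCDA step. As a stand-in for whatever aggregation the authors actually apply (the abstract does not specify it), a simple additive weighting over hypothetical criteria scores illustrates the selection step.

      # Illustrative additive-weighting step for choosing a modelling language.
      # Criteria, weights and scores are hypothetical; the paper's MCDA method
      # and SEQUAL-based criteria set are not detailed in the abstract.
      criteria_weights = {"expressiveness": 0.4, "comprehensibility": 0.35, "tool_support": 0.25}

      scores = {
          "BPMN":       {"expressiveness": 0.9, "comprehensibility": 0.7, "tool_support": 0.9},
          "EPC":        {"expressiveness": 0.7, "comprehensibility": 0.8, "tool_support": 0.6},
          "Petri nets": {"expressiveness": 0.8, "comprehensibility": 0.5, "tool_support": 0.5},
      }

      ranking = sorted(
          ((sum(criteria_weights[c] * s[c] for c in criteria_weights), lang)
           for lang, s in scores.items()),
          reverse=True,
      )
      for total, lang in ranking:
          print(f"{lang}: {total:.2f}")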

  5. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design is discussed and the need for a hybrid......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given....... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  6. Using CASE to Exploit Process Modeling in Technology Transfer

    Science.gov (United States)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of systems and processes business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from collection of issues through a systems analyst interface approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  7. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Díaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Mukkerikar, Amol

    2011-01-01

    The aim of this work is to present the development of a computer aided multilevel modeling network for the systematic design and analysis of processes employing lipid technologies. This is achieved by decomposing the problem into four levels of modeling: i) pure component property modeling...

  8. modelling of queuing process at airport check-in system

    African Journals Online (AJOL)

    HOD

    The study adopted travel demand data for Manchester and Leeds-Bradford airports from the United Kingdom Civil Aviation Authority database. Previous researchers have examined the queuing process extensively and developed analytical models used for queue studies.
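    A standard analytical model for a bank of check-in desks is the M/M/c queue; the Erlang C formula below gives the probability of waiting and the mean queueing time. This generic textbook model is offered only to illustrate the kind of analytical model referred to above; the arrival and service rates are assumed values, not the article's data.

      # Textbook M/M/c (Erlang C) illustration of an analytical check-in queue model.
      from math import factorial

      def erlang_c(lam, mu, c):
          """Probability an arriving passenger must wait, and mean waiting time."""
          a = lam / mu                       # offered load (Erlangs)
          rho = a / c                        # desk utilisation, must be < 1
          if rho >= 1:
              raise ValueError("unstable queue: utilisation >= 1")
          p0_inv = sum(a**k / factorial(k) for k in range(c)) \
                   + a**c / (factorial(c) * (1 - rho))
          p_wait = (a**c / (factorial(c) * (1 - rho))) / p0_inv
          mean_wait = p_wait / (c * mu - lam)
          return p_wait, mean_wait

      lam = 3.0   # passengers per minute (assumed)
      mu = 0.5    # service completions per desk per minute (assumed)
      c = 8       # open check-in desks (assumed)
      p_wait, w_q = erlang_c(lam, mu, c)
      print(f"P(wait) = {p_wait:.2f}, mean queueing time = {w_q:.1f} min")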

  9. A mathematical model for the leukocyte filtration process

    NARCIS (Netherlands)

    Bruil, A.; Bruil, Anton; Beugeling, T.; Beugeling, Tom; Feijen, Jan

    1995-01-01

    Leukocyte filters are applied clinically to remove leukocytes from blood. In order to optimize leukocyte filters, a mathematical model to describe the leukocyte filtration process was developed by modification of a general theoretical model for depth filtration. The model presented here can be used

  10. A consolidation based extruder model to explore GAME process configurations

    NARCIS (Netherlands)

    Willems, P.; Kuipers, N.J.M.; de Haan, A.B.

    2009-01-01

    A mathematical model from literature was adapted to predict the pressure profile and oil yield for canola in a lab-scale extruder. Changing the description of the expression process from filtration to consolidation significantly improved the performance and physical meaning of the model. The model

  11. Modelling the embedded rainfall process using tipping bucket data

    DEFF Research Database (Denmark)

    Thyregod, Peter; Arnbjerg-Nielsen, Karsten; Madsen, Henrik

    1998-01-01

    A new method for modelling the dynamics of rain measurement processes is suggested. The method takes the discrete nature and autocorrelation of measurements from the tipping bucket rain gauge into consideration. The considered model is a state space model with a Poisson marginal distribution. In ...
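    A minimal version of such a model treats a latent rain intensity as evolving slowly in time, with the number of bucket tips per interval observed as a Poisson draw given that intensity. The sketch below simulates from that structure; the state dynamics and parameter values are illustrative assumptions, not the model of the paper.

      # Minimal sketch of a Poisson state-space view of tipping-bucket data:
      # a latent rain intensity evolves over time and the observed number of
      # bucket tips per interval is Poisson given that intensity.
      import numpy as np

      def simulate_tips(n_steps=200, bucket_mm=0.2, dt_min=1.0, seed=3):
          rng = np.random.default_rng(seed)
          log_intensity = np.log(0.5)                       # mm/min, latent state
          tips = np.zeros(n_steps, dtype=int)
          for k in range(n_steps):
              log_intensity += rng.normal(scale=0.1)        # slowly varying intensity
              rain_mm = np.exp(log_intensity) * dt_min
              tips[k] = rng.poisson(rain_mm / bucket_mm)    # discrete tip counts
          return tips

      print(simulate_tips()[:20])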

  12. Conflicts Analysis for Inter-Enterprise Business Process Model

    OpenAIRE

    Wei Ding; Zhong Tian; Jian Wang; Jun Zhu; Haiqi Liang; Lei Zhang

    2003-01-01

    Business process (BP) management systems facilitate the understanding and execution of business processes, which tend to change frequently due to both internal and external change in an enterprise. Therefore, the need for analysis methods to verify the correctness of business process models is becoming more prominent. One key element of such a business process is its control flow. We show how a flow specification may contain certain structural conflicts that could compromise its correct executi...

  13. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin in general practice in the Capital Region of Denmark using INR POCT. METHODS: A total of 20 general practices, ten single-handed and ten group practices using INR POCT, were randomly selected to participate in the study. Practice organisation and patient characteristics were recorded. INR measurements were collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT...

  14. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    Full Text Available The method of acquisition and presentation of organizational process knowledge has been considered by many KM researchers. In this research, a model for process knowledge acquisition and presentation is presented using the Case-Based Reasoning approach. The presented model was validated by conducting an expert panel. Then software was developed based on the presented model and implemented in Eghtesad Novin Bank of Iran. In this company, based on the stages of the presented model, first the knowledge-intensive processes were identified, then the process knowledge was stored in a knowledge base in the format of problem/solution/consequent. Retrieval of the knowledge was done based on nearest-neighbour similarity. For validation of the implemented system, the system's results were compared with the decisions made by the process expert.
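    The retrieval step described above (finding the stored problem most similar to a new one and reusing its solution) can be sketched as a weighted nearest-neighbour match. The attributes, weights, similarity function and cases below are hypothetical and only illustrate the mechanism.

      # Illustrative nearest-neighbour retrieval over a small case base of
      # problem/solution pairs.  Attributes, weights and cases are hypothetical.
      case_base = [
          ({"process": "loan_approval", "volume": 120, "channel": "branch"},
           "route to senior credit officer"),
          ({"process": "loan_approval", "volume": 40, "channel": "online"},
           "automatic scoring, manual review on flags"),
          ({"process": "account_opening", "volume": 300, "channel": "online"},
           "fully automated workflow"),
      ]

      weights = {"process": 0.5, "volume": 0.2, "channel": 0.3}

      def similarity(a, b):
          s = weights["process"] * (a["process"] == b["process"])
          s += weights["channel"] * (a["channel"] == b["channel"])
          s += weights["volume"] * (1.0 - min(abs(a["volume"] - b["volume"]) / 300.0, 1.0))
          return s

      def retrieve(problem):
          return max(case_base, key=lambda case: similarity(problem, case[0]))

      new_problem = {"process": "loan_approval", "volume": 100, "channel": "branch"}
      best_case, solution = retrieve(new_problem)
      print("reused solution:", solution)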

  15. Event Modeling in UML. Unified Modeling Language and Unified Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2002-01-01

    We show how events can be modeled in terms of UML. We view events as change agents that have consequences and as information objects that represent information. We show how to create object-oriented structures that represent events in terms of attributes, associations, operations, state charts, a...

  16. Mechanistic Fermentation Models for Process Design, Monitoring, and Control.

    Science.gov (United States)

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Sin, Gürkan; Gernaey, Krist V

    2017-10-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Holonic Business Process Modeling in Small to Medium Sized Enterprises

    Directory of Open Access Journals (Sweden)

    Nur Budi Mulyono

    2012-01-01

    Full Text Available Holonic modeling, the application of systems thinking to design, management, and improvement, is used in a novel context for business process modeling. An approach and techniques based on holons and holarchies are presented specifically for process modeling in small and medium sized enterprises. The fitness of the approach is compared with the well-known reductionist, or task-breakdown, approach. The strengths and weaknesses of holonic modeling are discussed with an illustrative case example concerning its suitability for an Indonesian small and medium sized industry. The ideas in this paper affect the way analysts should perceive business processes. Future research will apply the approach in a supply chain context. Key words: Business process, holonic modeling, operations management, small to medium sized enterprise

  18. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted...... for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool...... for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding....

  19. Aberrant brain responses to emotionally valent words is normalised after cognitive behavioural therapy in female depressed adolescents.

    Science.gov (United States)

    Chuang, Jie-Yu; J Whitaker, Kirstie; Murray, Graham K; Elliott, Rebecca; Hagan, Cindy C; Graham, Julia Me; Ooi, Cinly; Tait, Roger; Holt, Rosemary J; van Nieuwenhuizen, Adrienne O; Reynolds, Shirley; Wilkinson, Paul O; Bullmore, Edward T; Lennox, Belinda R; Sahakian, Barbara J; Goodyer, Ian; Suckling, John

    2016-01-01

    Depression in adolescence is debilitating with high recurrence in adulthood, yet its pathophysiological mechanism remains enigmatic. To examine the interaction between emotion, cognition and treatment, functional brain responses to sad and happy distractors in an affective go/no-go task were explored before and after Cognitive Behavioural Therapy (CBT) in depressed female adolescents, and healthy participants. Eighty-two Depressed and 24 healthy female adolescents, aged 12-17 years, performed a functional magnetic resonance imaging (fMRI) affective go/no-go task at baseline. Participants were instructed to withhold their responses upon seeing happy or sad words. Among these participants, 13 patients had CBT over approximately 30 weeks. These participants and 20 matched controls then repeated the task. At baseline, increased activation in response to happy relative to neutral distractors was observed in the orbitofrontal cortex in depressed patients which was normalised after CBT. No significant group differences were found behaviourally or in brain activation in response to sad distractors. Improvements in symptoms (mean: 9.31, 95% CI: 5.35-13.27) were related at trend-level to activation changes in orbitofrontal cortex. In the follow-up section, a limited number of post-CBT patients were recruited. To our knowledge, this is the first fMRI study addressing the effect of CBT in adolescent depression. Although a bias toward negative information is widely accepted as a hallmark of depression, aberrant brain hyperactivity to positive distractors was found and normalised after CBT. Research, assessment and treatment focused on positive stimuli could be a future consideration. Moreover, a pathophysiological mechanism distinct from adult depression may be suggested and awaits further exploration. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  20. An extension of Clarke's model with stochastic amplitude flip processes

    KAUST Repository

    Hoel, Hakon

    2014-07-01

    Stochastic modeling is an essential tool for studying statistical properties of wireless channels. In multipath fading channel (MFC) models, the signal reception is modeled by a sum of wave path contributions, and Clarke's model is an important example of such which has been widely accepted in many wireless applications. However, since Clarke's model is temporally deterministic, Feng and Field noted that it does not model real wireless channels with time-varying randomness well. Here, we extend Clarke's model to a novel time-varying stochastic MFC model with scatterers randomly flipping on and off. Statistical properties of the MFC model are analyzed and shown to fit well with real signal measurements, and a limit Gaussian process is derived from the model when the number of active wave paths tends to infinity. A second focus of this work is a comparison study of the error and computational cost of generating signal realizations from the MFC model and from its limit Gaussian process. By rigorous analysis and numerical studies, we show that in many settings, signal realizations are generated more efficiently by Gaussian process algorithms than by the MFC model's algorithm. Numerical examples that strengthen these observations are also presented. © 2014 IEEE.
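
    The record above describes the received signal as a sum of wave path contributions in which scatterers randomly switch on and off. The following NumPy sketch illustrates that general idea with a sum-of-paths fading simulator whose paths flip state at random; the path count, Doppler frequency and flip probability are illustrative assumptions, and the sketch does not reproduce the authors' calibrated model or its Gaussian-process limit.

```python
import numpy as np

def flipping_clarke(n_paths=32, f_d=50.0, dt=1e-4, n_steps=5000, p_flip=0.01, seed=0):
    """Sum-of-paths fading with on/off scatterers: each path has a random angle
    of arrival and phase, as in Clarke's model, plus an activity state that
    flips with probability p_flip per time step (a discrete-time approximation)."""
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, 2.0 * np.pi, n_paths)      # angles of arrival
    phi = rng.uniform(0.0, 2.0 * np.pi, n_paths)        # initial phases
    active = rng.integers(0, 2, n_paths).astype(bool)   # initial on/off states
    t = np.arange(n_steps) * dt
    h = np.zeros(n_steps, dtype=complex)
    for k in range(n_steps):
        phase = 2.0 * np.pi * f_d * np.cos(theta) * t[k] + phi
        h[k] = np.sum(active * np.exp(1j * phase)) / np.sqrt(n_paths)
        active ^= rng.random(n_paths) < p_flip           # random on/off flips
    return t, h

t, h = flipping_clarke()
print("mean received power:", np.mean(np.abs(h) ** 2))
```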

  1. Introduction of Virtualization Technology to Multi-Process Model Checking

    Science.gov (United States)

    Leungwattanakit, Watcharin; Artho, Cyrille; Hagiya, Masami; Tanabe, Yoshinori; Yamamoto, Mitsuharu

    2009-01-01

    Model checkers find failures in software by exploring every possible execution schedule. Java PathFinder (JPF), a Java model checker, has been extended recently to cover networked applications by caching data transferred in a communication channel. A target process is executed by JPF, whereas its peer process runs on a regular virtual machine outside. However, non-deterministic target programs may produce different output data in each schedule, causing the cache to restart the peer process to handle the different set of data. Virtualization tools could help us restore previous states of peers, eliminating peer restart. This paper proposes the application of virtualization technology to networked model checking, concentrating on JPF.

  2. Remote sensing models and methods for image processing

    CERN Document Server

    Schowengerdt, Robert A

    1997-01-01

    This book is a completely updated, greatly expanded version of the previously successful volume by the author. The Second Edition includes new results and data, and discusses a unified framework and rationale for designing and evaluating image processing algorithms.Written from the viewpoint that image processing supports remote sensing science, this book describes physical models for remote sensing phenomenology and sensors and how they contribute to models for remote-sensing data. The text then presents image processing techniques and interprets them in terms of these models. Spectral, s

  3. Enzymatic corn wet milling: engineering process and cost model.

    Science.gov (United States)

    Ramírez, Edna C; Johnston, David B; McAloon, Andrew J; Singh, Vijay

    2009-01-21

    Enzymatic corn wet milling (E-milling) is a process derived from conventional wet milling for the recovery and purification of starch and co-products using proteases to eliminate the need for sulfites and decrease the steeping time. In 2006, the total starch production in USA by conventional wet milling equaled 23 billion kilograms, including modified starches and starches used for sweeteners and ethanol production 1. Process engineering and cost models for an E-milling process have been developed for a processing plant with a capacity of 2.54 million kg of corn per day (100,000 bu/day). These models are based on the previously published models for a traditional wet milling plant with the same capacity. The E-milling process includes grain cleaning, pretreatment, enzymatic treatment, germ separation and recovery, fiber separation and recovery, gluten separation and recovery and starch separation. Information for the development of the conventional models was obtained from a variety of technical sources including commercial wet milling companies, industry experts and equipment suppliers. Additional information for the present models was obtained from our own experience with the development of the E-milling process and trials in the laboratory and at the pilot plant scale. The models were developed using process and cost simulation software (SuperPro Designer) and include processing information such as composition and flow rates of the various process streams, descriptions of the various unit operations and detailed breakdowns of the operating and capital cost of the facility. Based on the information from the model, we can estimate the cost of production per kilogram of starch using the input prices for corn, enzyme and other wet milling co-products. The work presented here describes the E-milling process and compares the process, the operation and costs with the conventional process. The E-milling process was found to be cost competitive with the conventional

  4. Enzymatic corn wet milling: engineering process and cost model

    Directory of Open Access Journals (Sweden)

    McAloon Andrew J

    2009-01-01

    Full Text Available Abstract Background Enzymatic corn wet milling (E-milling) is a process derived from conventional wet milling for the recovery and purification of starch and co-products using proteases to eliminate the need for sulfites and decrease the steeping time. In 2006, the total starch production in USA by conventional wet milling equaled 23 billion kilograms, including modified starches and starches used for sweeteners and ethanol production 1. Process engineering and cost models for an E-milling process have been developed for a processing plant with a capacity of 2.54 million kg of corn per day (100,000 bu/day). These models are based on the previously published models for a traditional wet milling plant with the same capacity. The E-milling process includes grain cleaning, pretreatment, enzymatic treatment, germ separation and recovery, fiber separation and recovery, gluten separation and recovery and starch separation. Information for the development of the conventional models was obtained from a variety of technical sources including commercial wet milling companies, industry experts and equipment suppliers. Additional information for the present models was obtained from our own experience with the development of the E-milling process and trials in the laboratory and at the pilot plant scale. The models were developed using process and cost simulation software (SuperPro Designer®) and include processing information such as composition and flow rates of the various process streams, descriptions of the various unit operations and detailed breakdowns of the operating and capital cost of the facility. Results Based on the information from the model, we can estimate the cost of production per kilogram of starch using the input prices for corn, enzyme and other wet milling co-products. The work presented here describes the E-milling process and compares the process, the operation and costs with the conventional process. Conclusion The E-milling process

  5. Adaptive Gaussian Predictive Process Models for Large Spatial Datasets

    Science.gov (United States)

    Guhaniyogi, Rajarshi; Finley, Andrew O.; Banerjee, Sudipto; Gelfand, Alan E.

    2011-01-01

    Large point referenced datasets occur frequently in the environmental and natural sciences. Use of Bayesian hierarchical spatial models for analyzing these datasets is undermined by onerous computational burdens associated with parameter estimation. Low-rank spatial process models attempt to resolve this problem by projecting spatial effects to a lower-dimensional subspace. This subspace is determined by a judicious choice of “knots” or locations that are fixed a priori. One such representation yields a class of predictive process models (e.g., Banerjee et al., 2008) for spatial and spatial-temporal data. Our contribution here expands upon predictive process models with fixed knots to models that accommodate stochastic modeling of the knots. We view the knots as emerging from a point pattern and investigate how such adaptive specifications can yield more flexible hierarchical frameworks that lead to automated knot selection and substantial computational benefits. PMID:22298952
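
    A predictive process replaces the full-rank Gaussian spatial effect with its projection onto the process values at a small set of knot locations, so the induced covariance is c(s)ᵀ C*⁻¹ c(s'), where c(s) collects covariances between location s and the knots and C* is the knot-to-knot covariance. The NumPy sketch below builds this low-rank approximation for an assumed exponential covariance with fixed knots; the adaptive (stochastic) knot modelling proposed in the record is not reproduced, and all parameter values are illustrative.

```python
import numpy as np

def exp_cov(a, b, sigma2=1.0, phi=0.5):
    """Exponential covariance sigma2 * exp(-phi * ||a - b||) between two point sets."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return sigma2 * np.exp(-phi * d)

rng = np.random.default_rng(0)
locs = rng.uniform(0.0, 10.0, size=(500, 2))   # observed locations
knots = rng.uniform(0.0, 10.0, size=(25, 2))   # knot locations, fixed a priori here

C_star = exp_cov(knots, knots)                 # m x m knot covariance
c = exp_cov(locs, knots)                       # n x m cross-covariance

# Low-rank predictive process covariance c C*^{-1} c^T: n x n but rank at most m.
C_tilde = c @ np.linalg.solve(C_star, c.T)
print(C_tilde.shape, np.linalg.matrix_rank(C_tilde))
```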

  6. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Sonnenthale, E.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the ''Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report'', Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M and O) 2000 [1534471]) and ''Technical Work Plan for Nearfield Environment Thermal Analyses and Testing'' (CRWMS M and O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: Continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are

  7. A note on the criticisms against the internationalization process model

    OpenAIRE

    Hadjikhani, Amjad

    1997-01-01

    The internationalization process model introduced three decades ago still influences international business studies. Since that time, a growing number of researchers have tested the model to show its strengths and weaknesses. Among the critics, some focus on the weakness of the theoretical aspects, while others argue against parts of the model. This paper will review these criticisms and compare them with the original ideas in the internationalization model. One criticized aspect of the inter...

  8. Dispersive processes in models of regional radionuclide migration. Technical memorandum

    International Nuclear Information System (INIS)

    Evenson, D.E.; Dettinger, M.D.

    1980-05-01

    Three broad areas of concern in the development of aquifer scale transport models will be local scale diffusion and dispersion processes, regional scale dispersion processes, and numerical problems associated with the advection-dispersion equation. Local scale dispersion processes are fairly well understood and accessible to observation. These processes will generally be dominated in large scale systems by regional processes, or macro-dispersion. Macro-dispersion is primarily the result of large scale heterogeneities in aquifer properties. In addition, the effects of many modeling approximations are often included in the process. Because difficulties arise in parameterization of this large scale phenomenon, parameterization should be based on field measurements made at the same scale as the transport process of interest or else partially circumvented through the application of a probabilistic advection model. Other problems associated with numerical transport models include difficulties with conservation of mass, stability, numerical dissipation, overshoot, flexibility, and efficiency. We recommend the random-walk model formulation for Lawrence Livermore Laboratory's purposes as the most flexible, accurate and relatively efficient modeling approach that overcomes these difficulties

  9. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

    Full Text Available Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g. ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange Multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e. the ARCH (AutoRegressive Conditional Heteroskedasticity) effect, a nonlinear phenomenon of the variance behaviour) in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate in modelling monthly flows, no model is adequate in modelling daily streamflow processes because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroskedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve the seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, the work can be a useful addition in terms of statistical modelling of daily streamflow processes for the hydrological community.
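
    The ARMA-GARCH error model described above pairs an ARMA model for the conditional mean with a GARCH model for the conditional variance of the ARMA residuals. A minimal two-step sketch is given below; it assumes a deseasonalised daily flow series, uses illustrative ARMA(1,1) and GARCH(1,1) orders rather than the orders fitted in the study, and relies on the statsmodels and third-party arch packages.

```python
# Two-step ARMA-GARCH sketch: an ARMA model for the mean behaviour and a
# GARCH(1,1) model for the conditional variance of the ARMA residuals.
# The series below is a synthetic placeholder for a deseasonalised daily flow.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model  # third-party 'arch' package

rng = np.random.default_rng(0)
flow = pd.Series(rng.gamma(shape=2.0, scale=50.0, size=2000))

# Step 1: ARMA(1,1) for the mean (illustrative order, not the fitted one).
mean_fit = ARIMA(flow, order=(1, 0, 1)).fit()
resid = mean_fit.resid

# Step 2: GARCH(1,1) on the residuals to capture the ARCH effect.
var_fit = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=1).fit(disp="off")

print(mean_fit.params)
print(var_fit.params)
```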

  10. Multivariate Hawkes process models of the occurrence of regulatory elements

    DEFF Research Database (Denmark)

    Carstensen, L; Sandelin, A; Winther, Ole

    2010-01-01

    distribution of the occurrences of these TREs along the genome. RESULTS: We present a model of TRE occurrences known as the Hawkes process. We illustrate the use of this model by analyzing two different publically available data sets. We are able to model, in detail, how the occurrence of one TRE is affected....... For each of the two data sets we provide two results: first, a qualitative description of the dependencies among the occurrences of the TREs, and second, quantitative results on the favored or avoided distances between the different TREs. CONCLUSIONS: The Hawkes process is a novel way of modeling the joint...
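
    A Hawkes process is a point process whose conditional intensity rises after each event and decays back towards a baseline, which is what lets it capture favored or avoided distances between occurrences. The sketch below evaluates a univariate conditional intensity with an exponential kernel, lambda(t) = mu + sum over past events of alpha*exp(-beta*(t - t_i)); the kernel form, parameter values and event positions are illustrative assumptions, not those estimated in the record.

```python
import numpy as np

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a univariate Hawkes process with an exponential
    kernel: lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    past = np.asarray(events)
    past = past[past < t]
    return mu + np.sum(alpha * np.exp(-beta * (t - past)))

# Illustrative occurrence positions (e.g. element locations along a sequence axis).
events = [1.0, 2.5, 2.7, 6.0]
for t in np.linspace(0.0, 10.0, 6):
    print(t, round(hawkes_intensity(t, events, mu=0.2, alpha=0.8, beta=1.5), 3))
```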

  11. Ultrasonic-assisted manufacturing processes: Variational model and numerical simulations

    KAUST Repository

    Siddiq, Amir

    2012-04-01

    We present a computational study of ultrasonic assisted manufacturing processes including sheet metal forming, upsetting, and wire drawing. A fully variational porous plasticity model is modified to include ultrasonic softening effects and then utilized to account for instantaneous softening when ultrasonic energy is applied during deformation. Material model parameters are identified via inverse modeling, i.e. by using experimental data. The versatility and predictive ability of the model are demonstrated and the effect of ultrasonic intensity on the manufacturing process at hand is investigated and compared qualitatively with experimental results reported in the literature. © 2011 Elsevier B.V. All rights reserved.

  12. Modeling cancer registration processes with an enhanced activity diagram.

    Science.gov (United States)

    Lyalin, D; Williams, W

    2005-01-01

    Adequate instruments are needed to reflect the complexity of routine cancer registry operations properly in a business model. The activity diagram is a key instrument of the Unified Modeling Language (UML) for the modeling of business processes. The authors aim to improve descriptions of processes in cancer registration, as well as in other public health domains, through enhancements to the activity diagram notation within the standard semantics of UML. The authors introduce a practical approach to enhancing a conventional UML activity diagram, complementing it with the following business process concepts: timeline, duration for individual activities, responsibilities for individual activities within swimlanes, and descriptive text. The authors used an enhanced activity diagram for modeling surveillance processes in the cancer registration domain. A specific example illustrates the use of an enhanced activity diagram to visualize a process of linking cancer registry records with external mortality files. The enhanced activity diagram allows more business concepts to be added to a single diagram and can improve descriptions of processes in cancer registration, as well as in other domains. Additional features of the enhanced activity diagram make it possible to advance the visualization of cancer registration processes. That, in turn, promotes the clarification of issues related to the process timeline, responsibilities for particular operations, and collaborations among process participants. Our first experiences in a cancer registry best practices development workshop setting support the usefulness of such an approach.

  13. Residence time modeling of hot melt extrusion processes.

    Science.gov (United States)

    Reitz, Elena; Podhaisky, Helmut; Ely, David; Thommes, Markus

    2013-11-01

    The hot melt extrusion process is a widespread technique to mix viscous melts. The residence time of material in the process frequently determines the product properties. An experimental setup and a corresponding mathematical model were developed to evaluate residence time and residence time distribution in twin screw extrusion processes. The extrusion process was modeled as the convolution of a mass transport process described by a Gaussian probability function, and a mixing process represented by an exponential function. The residence time of the extrusion process was determined by introducing a tracer at the extruder inlet and measuring the tracer concentration at the die. These concentrations were fitted to the residence time model, and an adequate correlation was found. Different parameters were derived to characterize the extrusion process including the dead time, the apparent mixing volume, and a transport related axial mixing. A 2³ design of experiments was performed to evaluate the effect of powder feed rate, screw speed, and melt viscosity of the material on the residence time. All three parameters affect the residence time of material in the extruder. In conclusion, a residence time model was developed to interpret experimental data and to get insights into the hot melt extrusion process. Copyright © 2013 Elsevier B.V. All rights reserved.
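
    The residence time distribution described above is the convolution of a Gaussian transport term with an exponential mixing term. The sketch below forms that convolution numerically and fits it to a tracer response with SciPy's curve_fit; the delay, spread and mixing-time values, and the synthetic tracer data, are illustrative assumptions rather than results from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def rtd_model(t, t0, sigma, tau):
    """Residence time distribution formed as the numerical convolution of a
    Gaussian transport term (delay t0, spread sigma) with an exponential
    mixing term (time constant tau); t is assumed to be a uniform grid."""
    dt = t[1] - t[0]
    gauss = np.exp(-0.5 * ((t - t0) / sigma) ** 2)
    mixing = np.exp(-t / tau)
    rtd = np.convolve(gauss, mixing)[: len(t)]
    return rtd / (rtd.sum() * dt)  # normalise to unit area

# Synthetic tracer response standing in for measured die concentrations.
t = np.linspace(0.0, 300.0, 301)
data = rtd_model(t, t0=60.0, sigma=10.0, tau=40.0)
data = data + np.random.default_rng(1).normal(0.0, 0.0005, t.size)

popt, _ = curve_fit(rtd_model, t, data, p0=(50.0, 5.0, 30.0))
print("fitted (t0, sigma, tau):", popt)
```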

  14. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with a modeling algorithm that maps nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. The memory phenomena and the functions of memorization and strengthening are simulated by information-processing algorithms. Theoretical analysis and simulation results show that the model is consistent with memory phenomena from an information-processing view.

  15. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from an information-processing view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network obtained by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices on the basis of biology and graph theory, and an intra-modular network is developed with a modeling algorithm that maps nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. The memory phenomena and the functions of memorization and strengthening are simulated by information-processing algorithms. Theoretical analysis and simulation results show that the model is consistent with memory phenomena from an information-processing view. PMID:27876847

  16. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  17. Mathematical modeling of electromechanical processes in a brushless DC motor

    Directory of Open Access Journals (Sweden)

    V.I. Tkachuk

    2014-03-01

    Full Text Available On the basis of initial assumptions, a mathematical model that describes electromechanical processes in a brushless DC electric motor with a salient-pole stator and permanent-magnet excitation is created.

  18. Fault Management: Degradation Signature Detection, Modeling, and Processing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  19. Modeling of Cloud/Radiation Processes for Cirrus Cloud Formation

    National Research Council Canada - National Science Library

    Liou, K

    1997-01-01

    This technical report includes five reprints and pre-prints of papers associated with the modeling of cirrus cloud and radiation processes as well as remote sensing of cloud optical and microphysical...

  20. Sketch of a Noisy Channel Model for the Translation Process

    DEFF Research Database (Denmark)

    Carl, Michael

    The paper develops a Noisy Channel Model for the translation process that is based on actual user activity data. It builds on the monitor model and makes a distinction between early, automatic and late, conscious translation processes: while early priming processes are at the basis of a "literal default rendering" procedure, later conscious processes are triggered by a monitor who interferes when something goes wrong. An attempt is made to explain monitor activities with relevance theoretic concepts according to which a translator needs to ensure the similarity of explicatures and implicatures of the source and the target texts. It is suggested that events and parameters in the model need to be measurable and quantifiable in the user activity data so as to trace back monitoring activities in the translation process data. Michael Carl is a Professor with special responsibilities at the Department...

  1. Modelling and Control of TCV

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, A.S.; Limebeer, D.J.N.; Jaimoukha, I.M.; Lister, J.B

    2001-11-01

    A new approach to the modelling and control of tokamak fusion reactors is presented. A nonlinear model is derived using the classical arguments of Hamiltonian mechanics and a low-order linear model is derived from it. The modelling process used here addresses flux and energy conservation issues explicitly and self-consistently. The model is of particular value, because it shows the relationship between the initial modelling assumptions and the resulting predictions. The mechanisms behind the creation of uncontrollable modes in tokamak models are discussed. A normalised coprime factorisation controller is developed for the TCV tokamak using the verified linear model. Recent theory is applied to reduce the controller order significantly whilst guaranteeing a priori bounds on the robust stability and performance. The controller is shown to track successfully reference signals that dictate the plasma's shape, position and current. The tests used to verify this were carried out on linear and nonlinear models. (author)

  2. A Dirichlet process mixture model for brain MRI tissue classification.

    Science.gov (United States)

    Ferreira da Silva, Adelino R

    2007-04-01

    Accurate classification of magnetic resonance images according to tissue type or region of interest has become a critical requirement in diagnosis, treatment planning, and cognitive neuroscience. Several authors have shown that finite mixture models give excellent results in the automated segmentation of MR images of the human normal brain. However, performance and robustness of finite mixture models deteriorate when the models have to deal with a variety of anatomical structures. In this paper, we propose a nonparametric Bayesian model for tissue classification of MR images of the brain. The model, known as Dirichlet process mixture model, uses Dirichlet process priors to overcome the limitations of current parametric finite mixture models. To validate the accuracy and robustness of our method we present the results of experiments carried out on simulated MR brain scans, as well as on real MR image data. The results are compared with similar results from other well-known MRI segmentation methods.
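
    A Dirichlet process mixture avoids fixing the number of mixture components in advance by placing a Dirichlet process prior on the mixing distribution; in software this is commonly approximated by a truncated variational mixture. The sketch below clusters synthetic one-dimensional "voxel intensities" with scikit-learn's BayesianGaussianMixture using a Dirichlet-process weight prior; the data, truncation level and concentration value are illustrative assumptions, and this is not the inference scheme used in the record.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D "voxel intensities" standing in for three tissue classes.
intensities = np.concatenate([rng.normal(30.0, 5.0, 1000),
                              rng.normal(70.0, 6.0, 1500),
                              rng.normal(110.0, 7.0, 1200)]).reshape(-1, 1)

# Truncated Dirichlet process mixture: unneeded components receive near-zero
# weight, so the number of tissue classes does not have to be fixed in advance.
dpgmm = BayesianGaussianMixture(
    n_components=10,                                    # truncation level
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.1,
    max_iter=500,
    random_state=0,
).fit(intensities)

labels = dpgmm.predict(intensities)
print("effective number of components:", int(np.sum(dpgmm.weights_ > 0.01)))
```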

  3. The Use of Reference Models in Business Process Renovation

    Directory of Open Access Journals (Sweden)

    Dejan Pajk

    2010-01-01

    Full Text Available Enterprise resource planning (ERP) systems are often used by companies to automate and enhance their business processes. The capabilities of ERP systems can be described by best-practice reference models. The purpose of the article is to demonstrate the business process renovation approach with the use of reference models. Although the use of reference models brings many positive effects for business, they are still rarely used in Slovenian small and medium-sized companies. The reasons for this may be found in the reference models themselves as well as in project implementation methodologies. In the article a reference model based on Microsoft Dynamics NAV is suggested. The reference model is designed using upgraded BPMN notation with additional business objects, which help to describe the models in more detail.

  4. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    P. Dixon

    2004-04-05

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The

  5. How should mathematical models of geomorphic processes be judged?

    Science.gov (United States)

    Iverson, Richard M.

    Mathematical models of geomorphic processes can have value as both predictive tools and precise conceptual frameworks. Well-posed mechanistic models have great conceptual value because they link geomorphic processes to universal scientific principles, such as conservation of energy, momentum, and mass. Models without this linkage (e.g., models based exclusively on cellular rules or empirical correlations) have less conceptual value but offer logical methodology for making practical predictions in some circumstances. Clear tests of the predictive power of mechanistic models can be achieved in controlled experiments, whereas natural landscapes typically have uncontrolled initial and boundary conditions and unresolved geological heterogeneities that preclude decisive tests. The best mechanistic models have a simplicity that results from minimizing assumptions and postulates, rather than minimizing mathematics, and this simplicity promotes conclusive tests. Optimal models also employ only parameters that are defined and measured outside the model context. Common weaknesses in geomorphic models result from use of freely coined equations without clear links to conservation laws or compelling data, use of fitted rather than measured values of parameters, lack of clear distinction between assumptions and approximations, and neglect of the four-dimensional (space + time) nature of most geomorphic processes. Models for predicting landslide runout illustrate principles and pitfalls that are common to all geomorphic modeling.

  6. A multi-phase flow model for electrospinning process

    Directory of Open Access Journals (Sweden)

    Xu Lan

    2013-01-01

    Full Text Available An electrospinning process is a multi-phase and multi-physical process in which flow, electric and magnetic fields are coupled. This paper establishes a multi-phase model for numerical study and explains how to prepare nanofibers and nanoporous materials. The model provides a powerful tool for controlling electrospinning parameters such as voltage, flow rate, and others.

  7. A Markov Process Inspired Cellular Automata Model of Road Traffic

    OpenAIRE

    Wang, Fa; Li, Li; Hu, Jianming; Ji, Yan; Yao, Danya; Zhang, Yi; Jin, Xuexiang; Su, Yuelong; Wei, Zheng

    2008-01-01

    To provide a more accurate description of the driving behaviors in vehicle queues, a so-called Markov-Gap cellular automata model is proposed in this paper. It views the variation of the gap between two consecutive vehicles as a Markov process whose stationary distribution corresponds to the observed distribution of practical gaps. The multiformity of this Markov process gives the model enough flexibility to describe various driving behaviors. Two examples are given to show how to specialize i...

  8. Reverse Osmosis Processing of Organic Model Compounds and Fermentation Broths

    Science.gov (United States)

    2006-04-01

    key species found in the fermentation broth: ethanol, butanol, acetic acid, oxalic acid, lactic acid, and butyric acid. Correlations of the rejection... [Report AFRL-ML-TY-TP-2007-4545, postprint by Robert Diltz et al.; published in Bioresource Technology 98 (2007) 686–695.]

  9. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: •Novel non-linear stochastic process corrosion growth model is proposed. •Corrosion rate modeled as random Poisson pulses. •Time to corrosion initiation and inherent time-variability properly represented. •Continuous corrosion growth histories obtained. •Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where corrosion rate is represented as a Poisson square wave process. The resulting model represents inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing problem physics
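
    In a Poisson square wave model the corrosion rate is piecewise constant: it is redrawn at the events of a Poisson process and held between events, so the defect depth is the running time integral of that rate. The NumPy sketch below simulates such depth histories; the event intensity, rate distribution and initiation time are illustrative assumptions, not values fitted to the inspection data in the record.

```python
import numpy as np

def corrosion_history(t_end, lam, draw_rate, t_init=0.0, seed=None):
    """Simulate one corrosion-depth history in which the rate is a Poisson
    square wave: it is redrawn by draw_rate at events of a Poisson process
    with intensity lam and held constant in between. Depth is zero at the
    initiation time t_init and grows continuously afterwards."""
    rng = np.random.default_rng(seed)
    times = [t_init]
    while times[-1] < t_end:                      # Poisson event times
        times.append(times[-1] + rng.exponential(1.0 / lam))
    times = np.minimum(np.array(times), t_end)
    rates = draw_rate(rng, len(times) - 1)        # one rate per holding interval
    depth = np.concatenate(([0.0], np.cumsum(rates * np.diff(times))))
    return times, depth

# Example: events roughly every 2 years, gamma-distributed rates (mm/yr), initiation at 2 yr.
t, d = corrosion_history(30.0, lam=0.5,
                         draw_rate=lambda rng, n: rng.gamma(2.0, 0.05, n),
                         t_init=2.0, seed=1)
print(np.column_stack((t, d))[:5])
```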

  10. Le processus de normalisation comptable par l'IASB : le cas du résultat [The IASB's accounting standard-setting process: the case of income]

    OpenAIRE

    Le Manh-Béna, Anne

    2009-01-01

    This research aims to contribute to the understanding of the IASB's standard-setting process through a single topic: the definition of income and its presentation in financial statements. Two research questions are addressed: what position do participants in the due process express concerning the IASB's project on the definition and presentation of income? And how can the IASB's pugnacity in imposing a new definition of income be explained? The theoretical framework of this re...

  11. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....

  12. Exposing earth surface process model simulations to a large audience

    Science.gov (United States)

    Overeem, I.; Kettner, A. J.; Borkowski, L.; Russell, E. L.; Peddicord, H.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) represents a diverse group of >1300 scientists who develop and apply numerical models to better understand the Earth's surface. CSDMS has a mandate to make the public more aware of model capabilities and therefore started sharing state-of-the-art surface process modeling results with large audiences. One platform to reach audiences outside the science community is through museum displays on 'Science on a Sphere' (SOS). Developed by NOAA, SOS is a giant globe, linked with computers and multiple projectors and can display data and animations on a sphere. CSDMS has developed and contributed model simulation datasets for the SOS system since 2014, including hydrological processes, coastal processes, and human interactions with the environment. Model simulations of a hydrological and sediment transport model (WBM-SED) illustrate global river discharge patterns. WAVEWATCH III simulations have been specifically processed to show the impacts of hurricanes on ocean waves, with focus on hurricane Katrina and super storm Sandy. A large world dataset of dams built over the last two centuries gives an impression of the profound influence of humans on water management. Given the exposure of SOS, CSDMS aims to contribute at least 2 model datasets a year, and will soon provide displays of global river sediment fluxes and changes of the sea ice free season along the Arctic coast. Over 100 facilities worldwide show these numerical model displays to an estimated 33 million people every year. Datasets storyboards, and teacher follow-up materials associated with the simulations, are developed to address common core science K-12 standards. CSDMS dataset documentation aims to make people aware of the fact that they look at numerical model results, that underlying models have inherent assumptions and simplifications, and that limitations are known. CSDMS contributions aim to familiarize large audiences with the use of numerical

  13. Ecosystem management via interacting models of political and ecological processes

    Directory of Open Access Journals (Sweden)

    Haas, T. C.

    2004-01-01

    Full Text Available The decision to implement environmental protection options is a political one. Political realities may cause a country to not heed the most persuasive scientific analysis of an ecosystem's future health. A predictive understanding of the political processes that result in ecosystem management decisions may help guide ecosystem management policymaking. To this end, this article develops a stochastic, temporal model of how political processes influence and are influenced by ecosystem processes. This model is realized in a system of interacting influence diagrams that model the decision making of a country's political bodies. These decisions interact with a model of the ecosystem enclosed by the country. As an example, a model for Cheetah (Acinonyx jubatus) management in Kenya is constructed and fitted to decision and ecological data.

  14. Virtual models of the HLA class I antigen processing pathway.

    Science.gov (United States)

    Petrovsky, Nikolai; Brusic, Vladimir

    2004-12-01

    Antigen recognition by cytotoxic CD8 T cells is dependent upon a number of critical steps in MHC class I antigen processing including proteosomal cleavage, TAP transport into the endoplasmic reticulum, and MHC class I binding. Based on extensive experimental data relating to each of these steps there is now the capacity to model individual antigen processing steps with a high degree of accuracy. This paper demonstrates the potential to bring together models of individual antigen processing steps, for example proteosome cleavage, TAP transport, and MHC binding, to build highly informative models of functional pathways. In particular, we demonstrate how an artificial neural network model of TAP transport was used to mine a HLA-binding database so as to identify HLA-binding peptides transported by TAP. This integrated model of antigen processing provided the unique insight that HLA class I alleles apparently constitute two separate classes: those that are TAP-efficient for peptide loading (HLA-B27, -A3, and -A24) and those that are TAP-inefficient (HLA-A2, -B7, and -B8). Hence, using this integrated model we were able to generate novel hypotheses regarding antigen processing, and these hypotheses are now capable of being tested experimentally. This model confirms the feasibility of constructing a virtual immune system, whereby each additional step in antigen processing is incorporated into a single modular model. Accurate models of antigen processing have implications for the study of basic immunology as well as for the design of peptide-based vaccines and other immunotherapies.

  15. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in the innovation process planning model. The relations argued here guarantee a stable and reliable way to achieve the intended result, increased competitiveness through professionally directed development of the company. When initiating the process, the manager needs to specify the intended effect and to achieve it through a system of indirect goals. The original model proposed here shows the standard dependences between the plans for the fragments of the innovation process that together achieve its final goal. The relations are expressed using the standard Business Process Model and Notation, which enables the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned. This allows better coordination of the process and reduces the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceeding that aims at improving the effectiveness of innovation management at the operational level. The model could form the basis for systems supporting decision making, knowledge management, or communication in innovation processes.

  16. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process...

  17. Parallel direct solver for finite element modeling of manufacturing processes

    DEFF Research Database (Denmark)

    Nielsen, Chris Valentin; Martins, P.A.F.

    2017-01-01

    The central processing unit (CPU) time is of paramount importance in finite element modeling of manufacturing processes. Because the most significant part of the CPU time is consumed in solving the main system of equations resulting from finite element assemblies, different approaches have been d...

  18. FibreChain: characterization and modeling of thermoplastic composites processing

    NARCIS (Netherlands)

    Rietman, Bert; Niazi, Muhammad Sohail; Akkerman, Remko; Lomov, S.V.

    2013-01-01

    Thermoplastic composites feature the advantage of melting and shaping. The material properties during processing and the final product properties are to a large extent determined by the thermal history of the material. The approach in the FP7-project FibreChain for process chain modeling of

  19. PROGRAM COMPLEX FOR MODELING OF THE DETAILS HARDENING PROCESS

    Directory of Open Access Journals (Sweden)

    S. P. Kundas

    2004-01-01

    Full Text Available The article presents the program complex ThermoSim, consisting of a preprocessor, processor and postprocessor, intended for modeling (analysis) of thermophysical processes and characteristics of parts in instrument-making and machine-building, and for diagnostics and optimization of heat-treatment processes and part designs without the use of destructive testing methods.

  20. Innovative model of business process reengineering at machine building enterprises

    Science.gov (United States)

    Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.

    2017-10-01

    The paper considers business process reengineering as a managerial innovation adopted by present-day machine building enterprises, as well as ways to improve its procedure. A developed innovative model of reengineering measures is described; it is based on the process approach and other principles of company management.

  1. Biomolecular Modeling in a Process Dynamics and Control Course

    Science.gov (United States)

    Gray, Jeffrey J.

    2006-01-01

    I present modifications to the traditional course entitled, "Process dynamics and control," which I renamed "Modeling, dynamics, and control of chemical and biological processes." Additions include the central dogma of biology, pharmacokinetic systems, population balances, control of gene transcription, and large-scale…

  2. Modelling of the aqueous debittering process of Lupinus mutabilis Sweet

    NARCIS (Netherlands)

    Carvajal-Larenas, F.E.; Nout, M.J.R.; Boekel, van M.A.J.S.; Linnemann, A.R.

    2013-01-01

    We investigated the process of lupin debittering by soaking, cooking and washing in water using a newly designed hydroagitator. The effects on alkaloid content, solids in the product, final weight, processing time, and water and energy consumption were expressed in a mathematical model for

  3. CFD modelling of condensers for freeze-drying processes

    Indian Academy of Sciences (India)

    ... the condenser, in order to evaluate condenser efficiency and gain deeper insights of the process to be used for the improvement of its design. Both a complete laboratory-scale freeze-drying apparatus and an industrial-scale condenser have been investigated in this work, modelling the process of water vapour deposition.

  4. A Unified Toolset for Business Process Model Formalization

    NARCIS (Netherlands)

    B. Changizi (Behnaz); N. Kokash (Natallia); F. Arbab (Farhad)

    2010-01-01

    In this paper, we present a toolset to automate the transformation of Business Process Modeling Notation (BPMN), UML Sequence Diagrams, and Business Process Execution Language (BPEL), into their proposed formal semantics expressed in the channel-based coordination language Reo. Such

  5. Process chain modeling and selection in an additive manufacturing context

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Stolfi, Alessandro; Mischkot, Michael

    2016-01-01

    This paper introduces a new two-dimensional approach to modeling manufacturing process chains. This approach is used to consider the role of additive manufacturing technologies in process chains for a part with micro scale features and no internal geometry. It is shown that additive manufacturing...... evolving fields like additive manufacturing....

  6. Modelling aspects of distributed processing in telecommunication networks

    NARCIS (Netherlands)

    Tomasgard, A; Audestad, JA; Dye, S; Stougie, L; van der Vlerk, MH; Wallace, SW

    1998-01-01

    The purpose of this paper is to formally describe new optimization models for telecommunication networks with distributed processing. Modern distributed networks put more focus on the processing of information and less on the actual transportation of data than we are traditionally used to in

  7. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    E.L. Hardin

    2000-07-17

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II).

  8. Engineered Barrier System Degradation, Flow, and Transport Process Model Report

    International Nuclear Information System (INIS)

    E.L. Hardin

    2000-01-01

    The Engineered Barrier System Degradation, Flow, and Transport Process Model Report (EBS PMR) is one of nine PMRs supporting the Total System Performance Assessment (TSPA) being developed by the Yucca Mountain Project for the Site Recommendation Report (SRR). The EBS PMR summarizes the development and abstraction of models for processes that govern the evolution of conditions within the emplacement drifts of a potential high-level nuclear waste repository at Yucca Mountain, Nye County, Nevada. Details of these individual models are documented in 23 supporting Analysis/Model Reports (AMRs). Nineteen of these AMRs are for process models, and the remaining 4 describe the abstraction of results for application in TSPA. The process models themselves cluster around four major topics: ''Water Distribution and Removal Model, Physical and Chemical Environment Model, Radionuclide Transport Model, and Multiscale Thermohydrologic Model''. One AMR (Engineered Barrier System-Features, Events, and Processes/Degradation Modes Analysis) summarizes the formal screening analysis used to select the Features, Events, and Processes (FEPs) included in TSPA and those excluded from further consideration. Performance of a potential Yucca Mountain high-level radioactive waste repository depends on both the natural barrier system (NBS) and the engineered barrier system (EBS) and on their interactions. Although the waste packages are generally considered as components of the EBS, the EBS as defined in the EBS PMR includes all engineered components outside the waste packages. The principal function of the EBS is to complement the geologic system in limiting the amount of water contacting nuclear waste. A number of alternatives were considered by the Project for different EBS designs that could provide better performance than the design analyzed for the Viability Assessment. The design concept selected was Enhanced Design Alternative II (EDA II)

  9. Characterisation and normalisation factors for life cycle impact assessment mined abiotic resources categories in South Africa - The manufacturing of catalytic converter exhaust systems as a case study

    CSIR Research Space (South Africa)

    Strauss, K

    2006-05-01

    Full Text Available levels. The normalisation factors are based on the total economic reserves of key South African minerals and world non-renewable energy resources respectively. A case study of the manufacturing of an exhaust system for a standard sedan is used to compare...
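
    The abstract above is truncated, but the normalisation it refers to is, in standard life cycle impact assessment, a division of each characterised category score by a reference value. The sketch below states only this generic relation; the symbols are chosen here, and the paper's own notation and South African reference values are not reproduced.

```latex
% Illustrative only: the generic LCIA normalisation step.
% N_i : normalised score for mined-abiotic-resource category i
% S_i : characterised score for category i (e.g. in kg of resource equivalents)
% R_i : normalisation reference, here based on total economic reserves of the
%       corresponding South African mineral (or world non-renewable energy resources)
\[
  N_i = \frac{S_i}{R_i}
\]
```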

  10. Rational parametrisation of normalised Stiefel manifolds, and explicit non-'t Hooft solutions of the Atiyah-Drinfeld-Hitchin-Manin instanton matrix equations for Sp(n)

    International Nuclear Information System (INIS)

    McCarthy, P.J.

    1981-01-01

    It is proved that normalised Stiefel manifolds admit a rational parametrisation which generalises Cayley's parametrisation of the unitary groups. Applying (the quaternionic case of) this parametrisation to the Atiyah-Drinfeld-Hitchin-Manin (ADHM) instanton matrix equations, large families of new explicit rational solutions emerge. In particular, new explicit non-'t Hooft solutions are presented. (orig.)
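
    For orientation, the classical result that the paper generalises can be stated compactly. The sketch below gives only Cayley's parametrisation of the unitary group, under the usual restriction that -1 is not an eigenvalue; the paper's extension to normalised Stiefel manifolds and its quaternionic use in the ADHM equations are not reproduced here.

```latex
% Cayley's parametrisation of U(n): every unitary matrix U without the
% eigenvalue -1 arises from a skew-Hermitian matrix A via
\[
  U = (I - A)(I + A)^{-1}, \qquad A^{\dagger} = -A ,
\]
% and the map is rational in the entries of A.
```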

  11. Modeling a novel glass immobilization waste treatment process using FLOW

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Nehls, J.W. Jr.; Welch, T.D.; Giardina, J.L.

    1996-01-01

    One option for control and disposal of surplus fissile materials is the Glass Material Oxidation and Dissolution System (GMODS), a process developed at ORNL for directly converting Pu-bearing material into a durable high-quality glass waste form. This paper presents a preliminary assessment of the GMODS process flowsheet using FLOW, a chemical process simulator. The simulation showed that the glass chemistry postulated in the models has acceptable levels of risk.

  12. Numerical modelling of the tilt casting processes of titanium aluminides

    OpenAIRE

    Wang, Hong

    2008-01-01

    This research has investigated the modelling and optimisation of the tilt casting process of Titanium Aluminides (TiAl). This study is carried out in parallel with the experimental research undertaken in the IRC at the University of Birmingham. They propose to use tilt casting inside a vacuum chamber and attempt to combine this tilt casting process with Induction Skull Melting (ISM). A totally novel investment casting process is being developed, which is suitable for casting gamma TiAl. As ...

  13. Modeling process-structure-property relationships for additive manufacturing

    Science.gov (United States)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including using data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, to the meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, which is motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance AM fundamental understanding and guide the monitoring and advanced diagnostics of AM processing.

  14. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest introducing an additional stream of data (i.e., eye tracking) to improve the analysis of the PPM. We show that, by exploiting this additional source of information, we can refine the detection of comprehension phases (introducing activities such as “semantic validation” or “problem understanding”) as well as provide more exploratory visualizations (e.g., combined modeling phase diagram, heat maps, fixations distributions), both static and dynamic (i.e., movies with the evolution of the model and eye tracking data on top).

  15. A decision model for the risk management of hazardous processes

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A decision model for risk management of hazardous processes as an optimisation problem of a point process is formulated in the study. In the approach, the decisions made by the management are divided into three categories: (1) planned process lifetime, (2) selection of the design and, (3) operational decisions. These three controlling methods play quite different roles in the practical risk management, which is also reflected in our approach. The optimisation of the process lifetime is related to the licensing problem of the process. It provides a boundary condition for a feasible utility function that is used as the actual objective function, i.e., maximizing the process lifetime utility. By design modifications, the management can affect the inherent accident hazard rate of the process. This is usually a discrete optimisation task. The study particularly concentrates upon the optimisation of the operational strategies given a certain design and licensing time. This is done by a dynamic risk model (marked point process model) representing the stochastic process of events observable or unobservable to the decision maker. An optimal long term control variable guiding the selection of operational alternatives in short term problems is studied. The optimisation problem is solved by the stochastic quasi-gradient procedure. The approach is illustrated by a case study. (23 refs.)
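
    To make the flavour of the operational-decision layer concrete, the toy sketch below (not the author's model) maximises a Monte Carlo estimate of lifetime utility over a single safety-effort variable by finite-difference quasi-gradient ascent, a crude sample-average stand-in for the stochastic quasi-gradient procedure mentioned above. All hazard, revenue, and cost figures are invented for illustration.

```python
import random

T_LICENSE = 20.0      # licensed process lifetime (years) -- hypothetical
REVENUE = 1.0         # utility earned per year of undisturbed operation
ACCIDENT_LOSS = 10.0  # utility lost if an accident ends the process early
EFFORT_COST = 0.05    # yearly cost per unit of safety effort

def lifetime_utility(effort, rng):
    """One realisation of lifetime utility for a given operational safety effort."""
    hazard = 0.2 / (1.0 + effort)          # illustrative inherent accident hazard
    t_accident = rng.expovariate(hazard)   # time of the first accident
    t_end = min(t_accident, T_LICENSE)
    utility = (REVENUE - EFFORT_COST * effort) * t_end
    if t_accident < T_LICENSE:
        utility -= ACCIDENT_LOSS
    return utility

def expected_utility(effort, n=4000, seed=0):
    rng = random.Random(seed)              # common random numbers across calls
    return sum(lifetime_utility(effort, rng) for _ in range(n)) / n

def optimise(effort=1.0, iters=100, step=0.2, fd=0.2):
    """Finite-difference quasi-gradient ascent on the Monte Carlo estimate."""
    for _ in range(iters):
        grad = (expected_utility(effort + fd) - expected_utility(effort - fd)) / (2 * fd)
        effort = max(0.0, effort + step * grad)
    return effort

if __name__ == "__main__":
    best = optimise()
    print(f"near-optimal effort ~ {best:.1f}, expected utility ~ {expected_utility(best):.2f}")
```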

  16. Continuation-like semantics for modeling structural process anomalies

    Directory of Open Access Journals (Sweden)

    Grewe Niels

    2012-09-01

    Full Text Available Abstract Background Biomedical ontologies usually encode knowledge that applies always or at least most of the time, that is in normal circumstances. But for some applications like phenotype ontologies it is becoming increasingly important to represent information about aberrations from a norm. These aberrations may be modifications of physiological structures, but also modifications of biological processes. Methods To facilitate precise definitions of process-related phenotypes, such as delayed eruption of the primary teeth or disrupted ocular pursuit movements, I introduce a modeling approach that draws inspiration from the use of continuations in the analysis of programming languages and apply a similar idea to ontological modeling. This approach characterises processes by describing their outcome up to a certain point and the way they will continue in the canonical case. Definitions of process types are then given in terms of their continuations and anomalous phenotypes are defined by their differences to the canonical definitions. Results The resulting model is capable of accurately representing structural process anomalies. It allows distinguishing between different anomaly kinds (delays, interruptions), gives identity criteria for interrupted processes, and explains why normal and anomalous process instances can be subsumed under a common type, thus establishing the connection between canonical and anomalous process-related phenotypes. Conclusion This paper shows how to give semantically rich definitions of process-related phenotypes. These allow the application areas of phenotype ontologies to be expanded beyond literature annotation and the establishment of genotype-phenotype associations to the detection of anomalies in suitably encoded datasets.
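
    The continuation idea can be illustrated in ordinary code, although the paper works in an ontological formalism rather than a programming language. In the hedged sketch below, a process state records what has happened so far plus its canonical continuation, and delays and interruptions are classified as deviations from that continuation; every class and field name is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class ProcessState:
    """Outcome of a process 'up to a certain point' plus its canonical continuation."""
    completed_stages: tuple          # what has already happened
    canonical_continuation: tuple    # how the process will continue in the normal case
    elapsed: float                   # time spent so far

@dataclass
class CanonicalType:
    """A process type defined via its expected stages and nominal duration."""
    stages: tuple
    nominal_duration: float

    def classify(self, state: ProcessState) -> str:
        expected_rest = self.stages[len(state.completed_stages):]
        if not state.canonical_continuation and expected_rest:
            return "interrupted"     # continuation missing although stages remain
        if state.canonical_continuation == expected_rest \
                and state.elapsed > self.nominal_duration:
            return "delayed"         # same continuation, but later than the norm
        if state.canonical_continuation == expected_rest:
            return "canonical"
        return "otherwise anomalous"

# Example: eruption of the primary teeth, grossly simplified
tooth_eruption = CanonicalType(stages=("incisors", "canines", "molars"),
                               nominal_duration=12.0)
print(tooth_eruption.classify(
    ProcessState(("incisors",), ("canines", "molars"), elapsed=18.0)))  # -> delayed
```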

  17. Control of automatic processes: A parallel distributed-processing model of the stroop effect. Technical report

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J.D.; Dunbar, K.; McClelland, J.L.

    1988-06-16

    A growing body of evidence suggests that traditional views of automaticity are in need of revision. For example, automaticity has often been treated as an all-or-none phenomenon, and traditional theories have held that automatic processes are independent of attention. Yet recent empirical data suggest that automatic processes are continuous, and furthermore are subject to attentional control. In this paper we present a model of attention which addresses these issues. Using a parallel distributed processing framework we propose that the attributes of automaticity depend upon the strength of a process and that strength increases with training. Using the Stroop effect as an example, we show how automatic processes are continuous and emerge gradually with practice. Specifically, we present a computational model of the Stroop task which simulates the time course of processing as well as the effects of learning.
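
    A minimal toy in the spirit of the strength-based account, not the authors' simulation, is sketched below: evidence for the colour-naming response accumulates under the joint influence of a weaker colour pathway and a stronger, more practised word pathway, so incongruent trials take longer, which is the Stroop interference pattern. All strengths, rates, and thresholds are invented for illustration.

```python
import numpy as np

def cycles_to_respond(color_strength, word_strength, congruent,
                      attention_color=1.0, attention_word=0.3,
                      threshold=1.0, rate=0.05, noise=0.01, seed=0):
    """Toy evidence accumulator for the colour-naming response.

    The colour pathway drives the correct response in proportion to its strength
    and the attention it receives; the word pathway cannot be gated out completely
    and helps (congruent) or hinders (incongruent) the response."""
    rng = np.random.default_rng(seed)
    sign = 1.0 if congruent else -1.0
    drift = attention_color * color_strength + sign * attention_word * word_strength
    evidence, t = 0.0, 0
    while evidence < threshold and t < 10_000:
        evidence += rate * drift + rng.normal(0.0, noise)
        t += 1
    return t

# Word reading is the more practised, hence stronger, pathway.
COLOR_STRENGTH, WORD_STRENGTH = 0.4, 1.0
for label, congruent in [("congruent", True), ("incongruent", False)]:
    rt = cycles_to_respond(COLOR_STRENGTH, WORD_STRENGTH, congruent)
    print(f"colour naming, {label} word: {rt} cycles")
```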

  18. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  19. ARTA process model of maritime clutter and targets

    CSIR Research Space (South Africa)

    McDonald, A

    2012-10-01

    Full Text Available The validity and practicality of the ARTA process model is demonstrated by deriving models for a maritime target and for sea clutter, both from measurements and without any prior assumption regarding the distribution of measurements. This ability to generate...

  20. In-situ biogas upgrading process: modeling and simulations aspects

    DEFF Research Database (Denmark)

    Lovato, Giovanna; Alvarado-Morales, Merlin; Kovalovszki, Adam

    2017-01-01

    Biogas upgrading processes by in-situ hydrogen (H2) injection are still challenging and could benefit from a mathematical model to predict system performance. Therefore, a previous model on anaerobic digestion was updated and expanded to include the effect of H2 injection into the liquid phase of...
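
    A minimal sketch of the core mechanism such a model must capture, not the authors' extended anaerobic digestion model, is given below: dissolved H2 enters the liquid through gas-liquid transfer and is consumed together with CO2 by hydrogenotrophic methanogenesis, raising the CH4 yield. All rate constants and concentrations are placeholders.

```python
from scipy.integrate import solve_ivp

# Illustrative parameters only (not taken from the paper)
KLA = 200.0      # 1/d        gas-liquid mass-transfer coefficient for H2
H2_STAR = 0.6    # mol/m3     dissolved H2 in equilibrium with the injected gas
R_MAX = 40.0     # mol/(m3*d) maximum hydrogenotrophic H2 uptake rate
K_H2 = 0.01      # mol/m3     half-saturation constant for dissolved H2
K_CO2 = 0.5      # mol/m3     half-saturation constant for dissolved CO2

def rhs(t, y):
    h2, co2, ch4 = y
    transfer = KLA * (H2_STAR - h2)                          # H2 injected into the liquid
    uptake = R_MAX * h2 / (K_H2 + h2) * co2 / (K_CO2 + co2)  # dual-Monod kinetics
    # hydrogenotrophic methanogenesis: 4 H2 + CO2 -> CH4 + 2 H2O
    return [transfer - uptake, -uptake / 4.0, uptake / 4.0]

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 30.0, 0.0], max_step=1e-3)
h2, co2, ch4 = sol.y[:, -1]
print(f"after 2 d: H2 = {h2:.2f}, CO2 = {co2:.1f}, CH4 = {ch4:.1f} mol/m3")
```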

  1. Model Based Monitoring and Control of Chemical and Biochemical Processes

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted

    This presentation will give an overview of the work performed at the department of Chemical and Biochemical Engineering related to process control. A research vision is formulated and related to a number of active projects at the department. In more detail a project describing model estimation...... and controller tuning in Model Predictive Control application is discussed....

  2. NEURO-FUZZY MODELLING OF BLENDING PROCESS IN CEMENT PLANT

    Directory of Open Access Journals (Sweden)

    Dauda Olarotimi Araromi

    2015-11-01

    Full Text Available The profitability of a cement plant depends largely on the efficient operation of the blending stage; therefore, there is a need to control the process at the blending stage in order to maintain the chemical composition of the raw mix near or at the desired value with minimum variance despite variation in the raw material composition. In this work, a neuro-fuzzy model is developed for the dynamic behaviour of the system to predict the total carbonate content in the raw mix at different clay feed rates. The data used for parameter estimation and model validation were obtained from one of the cement plants in Nigeria. The data were pre-processed to remove outliers and filtered using a smoothing technique in order to reveal their dynamic nature. An autoregressive exogenous (ARX) model was developed for comparison purposes. The ARX model gave high root mean square errors (RMSE) of 5.408 and 4.0199 for training and validation, respectively. The poor fit of the ARX model is an indication of the nonlinear nature of the process. However, both visual and statistical analyses of the neuro-fuzzy (ANFIS) model gave a far better result. The RMSE values for training and validation are 0.28167 and 0.7436, respectively, and the sum of squared errors (SSE) and R-square are 39.6692 and 0.9969, respectively. All of these indicate good performance of the ANFIS model. This model can be used for control design of the process.
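
    As a point of reference for the ARX baseline reported above (the ANFIS counterpart would require a fuzzy-inference implementation), the hedged sketch below fits a first-order ARX model by ordinary least squares and scores it with RMSE; the data are synthetic stand-ins, not the Nigerian plant data.

```python
import numpy as np

def fit_arx(y, u):
    """Fit y[k] = a*y[k-1] + b*u[k-1] + c by ordinary least squares (ARX(1,1))."""
    Y = y[1:]
    Phi = np.column_stack([y[:-1], u[:-1], np.ones(len(Y))])
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Illustrative stand-in data: total carbonate content (y) vs clay feed rate (u)
rng = np.random.default_rng(0)
u = 10.0 + rng.normal(0.0, 1.0, 500)
y = np.empty_like(u)
y[0] = 78.0
for k in range(1, len(u)):
    y[k] = 0.9 * y[k - 1] - 0.3 * u[k - 1] + 11.0 + rng.normal(0.0, 0.2)

a, b, c = fit_arx(y, u)
y_hat = a * y[:-1] + b * u[:-1] + c           # one-step-ahead predictions
print("ARX parameters:", a, b, c)
print("training RMSE:", rmse(y[1:], y_hat))
```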

  3. The quark-gluon model for particle production processes

    International Nuclear Information System (INIS)

    Volkovitskij, P.E.

    1983-01-01

    The quark-gluon model for hadronization of strings produced in soft and hard processes is suggested. The model is based on the distribution functions of valence quarks in hadrons which have correct Regge behaviour. The simplest case is discussed in which only the longitudinal degrees of freedom are taken into account

  4. Models as instruments for optimizing hospital processes: a systematic review

    NARCIS (Netherlands)

    van Sambeek, J. R. C.; Cornelissen, F. A.; Bakker, P. J. M.; Krabbendam, J. J.

    2010-01-01

    PURPOSE: The purpose of this article is to find decision-making models for the design and control of processes regarding patient flows, considering various problem types, and to find out how usable these models are for managerial decision making. DESIGN/METHODOLOGY/APPROACH: A systematic review of

  5. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  6. Consolidation process model for film stacking glass/PPS laminates

    NARCIS (Netherlands)

    Grouve, Wouter Johannes Bernardus; Akkerman, Remko

    2010-01-01

    A model is proposed to optimise the processing parameters for the consolidation of glass/polyphenylene sulphide (PPS) laminates using a film stacking procedure. In a split approach, the heating and consolidation phase are treated separately. The heating phase is modelled using the one-dimensional

  7. Stochastic Greybox Modeling of an Alternating Activated Sludge Process

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus Fogtmann; Munk-Nielsen, T.; Tychsen, P.

    Summary of key findings We found a greybox model for state estimation and control of the BioDenitro process based on a reduced ASM1. We then applied Maximum Likelihood Estimation on measurements from a real full-scale waste water treatment plant to estimate the model parameters. The estimation me...

  8. Stochastic Greybox Modeling of an Alternating Activated Sludge Process

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus Fogtmann; Munk-Nielsen, T.; Tychsen, P.

    Summary of key findings We found a greybox model for state estimation and control of the BioDenitro process based on a reduced ASM1. We then applied Maximum Likelihood Estimation on measurements from a real full-scale waste water treatment plant to estimate the model parameters. The estimation me...... forecasts of the load....

  9. A computational model of human auditory signal processing and perception.

    Science.gov (United States)

    Jepsen, Morten L; Ewert, Stephan D; Dau, Torsten

    2008-07-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell transduction stage, a squaring expansion, an adaptation stage, a 150-Hz lowpass modulation filter, a bandpass modulation filterbank, a constant-variance internal noise, and an optimal detector stage. The model was evaluated in experimental conditions that reflect, to a different degree, effects of compression as well as spectral and temporal resolution in auditory processing. The experiments include intensity discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications.
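
    A greatly simplified sketch of the kind of front-end chain listed in the abstract is given below: outer/middle-ear filtering, a compressive nonlinearity, hair-cell style envelope extraction, and a 150-Hz modulation lowpass. Filter orders, cut-offs, and the compression exponent are placeholders, and the adaptation stage, modulation filterbank, internal noise, and optimal detector are omitted.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 16_000  # Hz, sampling rate of the input signal

def stage_outer_middle_ear(x):
    sos = butter(2, [450, 7000], btype="bandpass", fs=FS, output="sos")
    return sosfilt(sos, x)

def stage_compression(x, exponent=0.4):
    # crude stand-in for compressive basilar-membrane processing
    return np.sign(x) * np.abs(x) ** exponent

def stage_haircell(x):
    # half-wave rectification followed by a 1-kHz lowpass
    sos = butter(2, 1000, btype="lowpass", fs=FS, output="sos")
    return sosfilt(sos, np.maximum(x, 0.0))

def stage_modulation_lowpass(x, cutoff=150.0):
    sos = butter(1, cutoff, btype="lowpass", fs=FS, output="sos")
    return sosfilt(sos, x)

def internal_representation(x):
    for stage in (stage_outer_middle_ear, stage_compression,
                  stage_haircell, stage_modulation_lowpass):
        x = stage(x)
    return x

t = np.arange(0, 0.5, 1.0 / FS)
tone = np.sin(2 * np.pi * 1000 * t) * (1.0 + 0.5 * np.sin(2 * np.pi * 8 * t))
print(internal_representation(tone)[:5])
```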

  10. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  11. Analysis and synthesis of solutions for the agglomeration process modeling

    Science.gov (United States)

    Babuk, V. A.; Dolotkazin, I. N.; Nizyaev, A. A.

    2013-03-01

    The present work is devoted to the development of a model of the agglomeration process for propellants based on ammonium perchlorate (AP), ammonium dinitramide (ADN), HMX, an inactive binder, and nanoaluminum. Generalization of experimental data, development of a physical picture of agglomeration for the listed propellants, and development and analysis of mathematical models are carried out. Synthesis of models of the various phenomena taking place during agglomeration allows prediction of the size and quantity, chemical composition, and structure of the forming agglomerates, and of their fraction in the set of condensed combustion products. This became possible in many respects due to the development of a new model of agglomerating-particle evolution on the surface of the burning propellant. The obtained results correspond to available experimental data. It is supposed that an analogous method, based on the analysis of mathematical models of particular phenomena and their synthesis, will allow implementation of agglomeration-process modeling for other types of metalized solid propellants.

  12. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed...... and the backingplate by solving an inverse modelling problem in which experimental data and a numerical model are used for determining the contact heat transfer coefficient. Different parametrizations of the spatial distribution of the heat transfer coefficient are studied and discussed, and the optimization problem...

  13. Representing vegetation processes in hydrometeorological simulations using the WRF model

    DEFF Research Database (Denmark)

    Nielsen, Joakim Refslund

    For accurate predictions of weather and climate, it is important that the land surface and its processes are well represented. In a mesoscale model the land surface processes are calculated in a land surface model (LSM). These processes include exchanges of energy, water and momentum between...... data and the default vegetation data in WRF were further used in high-resolution simulations over Denmark down to cloud-resolving scale (3 km). Results from two spatial resolutions were compared to investigate the influence of parametrized and resolved convection. The simulations using the parametrized

  14. TECHNOLOGICAL PROCESS MODELING AIMING TO IMPROVE ITS OPERATIONS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ivan Mihajlović

    2011-11-01

    Full Text Available This paper presents the modeling procedure of one real technological system. In this study, the copper extraction from the copper flotation waste generated at the Bor Copper Mine (Serbia) was the object of modeling. A sufficient data base for statistical modeling was constructed using the orthogonal factorial design of experiments. A mathematical model of the investigated system was developed using a combination of linear and multiple linear statistical analysis approaches. The purpose of such a model is obtaining optimal states of the system that enable efficient operations management. Besides technological and economic parameters, ecological parameters of the process were considered as crucial input variables.

  15. New process model proves accurate in tests on catalytic reformer

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar-Rodriguez, E.; Ancheyta-Juarez, J. (Inst. Mexicano del Petroleo, Mexico City (Mexico))

    1994-07-25

    A mathematical model has been devised to represent the process that takes place in a fixed-bed, tubular, adiabatic catalytic reforming reactor. Since its development, the model has been applied to the simulation of a commercial semiregenerative reformer. The development of mass and energy balances for this reformer led to a model that predicts both concentration and temperature profiles along the reactor. A comparison of the model's results with experimental data illustrates its accuracy at predicting product profiles. Simple steps show how the model can be applied to simulate any fixed-bed catalytic reformer.
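
    The style of model described, coupled mass and energy balances integrated along an adiabatic fixed bed, can be sketched as follows with a single lumped endothermic dehydrogenation reaction; every kinetic and thermodynamic constant here is illustrative and nothing is taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative constants (not from the paper)
K0 = 5.0e5        # 1/s        pre-exponential factor
EA = 9.0e4        # J/mol      activation energy
DH = 7.0e4        # J/mol      heat absorbed by the lumped dehydrogenation reaction
R = 8.314
RHO_CP = 2.0e5    # J/(m3*K)   volumetric heat capacity of the flowing mixture
U_SUP = 0.3       # m/s        superficial velocity

def balances(z, y):
    """dC/dz and dT/dz for a 1-D adiabatic plug-flow reforming bed."""
    c_naph, temp = y
    rate = K0 * np.exp(-EA / (R * temp)) * c_naph     # mol/(m3*s)
    dc_dz = -rate / U_SUP                              # mass balance
    dt_dz = -DH * rate / (RHO_CP * U_SUP)              # energy balance (endothermic)
    return [dc_dz, dt_dz]

sol = solve_ivp(balances, (0.0, 5.0), [300.0, 770.0], dense_output=True)
for z in (0.0, 2.5, 5.0):
    c, temp = sol.sol(z)
    print(f"z = {z:3.1f} m   C_naphthenes = {c:6.1f} mol/m3   T = {temp:5.1f} K")
```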

  16. Ground-up-top down: a mixed method action research study aimed at normalising research in practice for nurses and midwives.

    Science.gov (United States)

    Parker, Vicki; Lieschke, Gena; Giles, Michelle

    2017-01-01

    Improving health, patient and system outcomes through a practice-based research agenda requires infrastructural supports, leadership and capacity building approaches, at both the individual and organisational levels. Embedding research as normal nursing and midwifery practice requires a flexible approach that is responsive to the diverse clinical contexts within which care is delivered and the variable research skills and interest of clinicians. This paper reports the study protocol for research being undertaken in a Local Health District (LHD) in New South Wales (NSW) Australia. The study aims to evaluate existing nursing and midwifery research activity, culture, capacity and capability across the LHD. This information, in addition to input from key stakeholders will be used to develop a responsive, productive and sustainable research capacity building framework aimed at enculturating practice-based research activities within and across diverse clinical settings of the LHD. A three-phased, sequential mixed-methods action research design underpinned by Normalization Process Theory (NPT). Participants will be nursing and midwifery clinicians and managers across rural and metropolitan services. A combination of survey, focus group, individual interviews and peer supported action-learning groups will be used to gather data. Quantitative data will be analysed using descriptive statistics, correlation and regression, together with thematic analysis of qualitative data to produce an integrated report. Understanding the current research activity and capacity of nurses and midwives, together with organisational supports and culture is essential to developing a productive and sustainable research environment. However, knowledge alone will not bring about change. This study will move beyond description of barriers to research participation for nurses and midwives and the promulgation of various capacity building frameworks to employ a theory driven action-oriented approach

  17. Modelling of the Heating Process in a Thermal Screw

    Science.gov (United States)

    Zhang, Xuan; Veje, Christian T.; Lassen, Benny; Willatzen, Morten

    2012-11-01

    The procedure of efficiently separating dry-stuff (proteins), fat, and water is an important process in the handling of waste products from industrial and commercial meat manufacturers. One of the sub-processes in a separation facility is a thermal screw where the raw material (after proper mincing) is heated in order to melt fat, coagulate protein, and free water. This process is very energy consuming and the efficiency of the product is highly dependent on accurate temperature control of the process. A key quality parameter is the time that the product is maintained at temperatures within a certain threshold. A detailed mathematical model for the heating process in the thermal screw is developed and analysed. The model is formulated as a set of partial differential equations including the latent heat for the melting process of the fat and the boiling of water, respectively. The product is modelled by three components: water, fat and dry-stuff (bones and proteins). The melting of the fat component is captured as a plateau in the product temperature. The model effectively captures the product outlet temperature and the energy consumed. Depending on raw material composition, "soft" or "dry", the model outlines the heat injection and screw speeds necessary to obtain optimal output quality.
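
    The latent-heat treatment described above can be illustrated with an apparent-heat-capacity formulation in a zero-dimensional energy balance: heat input raises the product temperature, but the fat-melting range absorbs extra energy and produces the plateau mentioned in the text. All property values below are placeholders, and the boiling of water is not modelled.

```python
# Placeholder material data (not from the paper)
MASS = 1.0            # kg of product in the control volume
CP = 3200.0           # J/(kg*K) baseline specific heat of the mix
L_FAT = 120_000.0     # J/kg latent heat of the fat fraction
X_FAT = 0.3           # mass fraction of fat
T_MELT_LO, T_MELT_HI = 35.0, 45.0   # degC, fat melting range
Q_IN = 400.0          # W, heat delivered by the screw jacket
DT = 1.0              # s, time step

def apparent_cp(temp):
    """Baseline cp plus the latent heat smeared over the melting range."""
    if T_MELT_LO <= temp <= T_MELT_HI:
        return CP + X_FAT * L_FAT / (T_MELT_HI - T_MELT_LO)
    return CP

temp, log = 20.0, []
for step in range(600):                      # 10 minutes of heating
    temp += Q_IN * DT / (MASS * apparent_cp(temp))
    log.append(temp)

print("temperature after 5 min:", round(log[299], 1), "degC")
print("temperature after 10 min:", round(log[-1], 1), "degC")
```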

  18. Rapid Prototyping of wax foundry models in an incremental process

    Directory of Open Access Journals (Sweden)

    B. Kozik

    2011-04-01

    Full Text Available The paper presents an analysis of incremental methods of creating wax foundry models. There are two methods of Rapid Prototyping of wax models in an incremental process which are more and more often used in industrial practice and in scientific research. Applying Rapid Prototyping methods in the process of making casts allows for acceleration of work on preparing prototypes. It is especially important in case of elements having complicated shapes. The time of making a wax model, depending on the size and the applied RP method, may vary from several to a few dozen hours.

  19. Numerical modelling of the jet nozzle enrichment process

    International Nuclear Information System (INIS)

    Vercelli, P.

    1983-01-01

    A numerical model was developed for the simulation of the isotopic enrichment produced by the jet nozzle process. The flow was considered stationary and under ideal gas conditions. The model calculates, for any position of the skimmer piece: (a) values of radial mass concentration profiles for each isotopic species and (b) values of elementary separation effect (Σ sub(A)) and uranium cut (theta). The comparison of the numerical results obtained with the experimental values given in the literature proves the validity of the present work as an initial step in the modelling of the process. (Author) [pt
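
    The quantities named in the abstract, the uranium cut and the elementary separation effect, have standard definitions in enrichment work; the sketch below is a reminder of these generic definitions, and the paper's own notation (e.g. Sigma sub(A)) and values may differ.

```latex
% theta  : uranium cut = fraction of the feed leaving in the light (enriched) fraction
% R      : isotopic abundance ratio N_{235}/N_{238}
% alpha  : elementary separation factor between light and heavy fractions
% eps_A  : elementary separation effect (written Sigma sub(A) in the abstract above)
\[
  \theta = \frac{L_{\text{light}}}{L_{\text{feed}}}, \qquad
  \alpha = \frac{R_{\text{light}}}{R_{\text{heavy}}}, \qquad
  \varepsilon_A = \alpha - 1 .
\]
```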

  20. Modelling the Pultrusion Process of Off Shore Wind Turbine Blades

    DEFF Research Database (Denmark)

    Baran, Ismet

    to the quasi-static mechanical model in which the finite element method is employed. In the mechanical model, the composite part is assumed to advance along the pulling direction meanwhile tracking the corresponding temperature and degree of cure profiles. Modelling the pultrusion process containing both uni....... The compaction, viscous and frictional forces have been predicted for a pultruded composite rod. The viscous drag is found to be the main contribution in terms of the frictional force to the overall pulling force, while the contribution due to material compaction at the inlet is found to be negligible. Process...