WorldWideScience

Sample records for model assessing functional

  1. Predictive assessment of models for dynamic functional connectivity

    DEFF Research Database (Denmark)

    Nielsen, Søren Føns Vind; Schmidt, Mikkel Nørgaard; Madsen, Kristoffer Hougaard

    2018-01-01

    In neuroimaging, it has become evident that models of dynamic functional connectivity (dFC), which characterize how intrinsic brain organization changes over time, can provide a more detailed representation of brain function than traditional static analyses. Many dFC models in the literature represent functional brain networks as a meta-stable process with a discrete number of states; however, there is a lack of consensus on how to perform model selection and learn the number of states, as well as a lack of understanding of how different modeling assumptions influence the estimated state dynamics. To address these issues, we consider a predictive likelihood approach to model assessment, where models are evaluated based on their predictive performance on held-out test data. Examining several prominent models of dFC (in their probabilistic formulations) we demonstrate our framework...
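
    A minimal sketch of the predictive-likelihood idea described in this record: candidate probabilistic models are fitted on training data and compared by their mean log-likelihood on held-out data. A Gaussian mixture from scikit-learn is used here as a stand-in for the dFC state models evaluated in the paper (which are not specified here); the data and number of components are illustrative assumptions.

```python
# Sketch of held-out predictive likelihood for model selection, assuming
# scikit-learn is available; GaussianMixture is a stand-in for a dFC state model.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic "connectivity features": two latent states with different means.
X = np.vstack([rng.normal(-1.0, 1.0, size=(500, 5)),
               rng.normal(+1.0, 1.0, size=(500, 5))])
X_train, X_test = train_test_split(X, test_size=0.5, random_state=0)

scores = {}
for n_states in (1, 2, 3, 4):
    model = GaussianMixture(n_components=n_states, random_state=0).fit(X_train)
    # score() returns the mean held-out log-likelihood per sample.
    scores[n_states] = model.score(X_test)

best = max(scores, key=scores.get)
print(scores, "-> selected number of states:", best)
```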

  2. Assessment of tropospheric delay mapping function models in Egypt: Using PTD database model

    Science.gov (United States)

    Abdelfatah, M. A.; Mousa, Ashraf E.; El-Fiky, Gamal S.

    2018-06-01

    For space geodetic measurements, estimates of tropospheric delays are highly correlated with site coordinates and receiver clock biases. Thus, it is important to use the most accurate models for the tropospheric delay to reduce errors in the estimates of the other parameters. Both the zenith delay value and the mapping function should be assigned correctly to reduce such errors. Several mapping function models can treat the troposphere slant delay. The recent models have not been evaluated for the Egyptian local climate conditions, and an assessment of these models is needed to choose the most suitable one. The goal of this paper is to test the quality of global mapping functions in terms of their consistency with precise troposphere delay (PTD) mapping functions. The PTD model is derived from radiosonde data using ray tracing and is considered in this paper as the true value. The PTD mapping functions were compared with three recent total mapping function models and another three separate dry and wet mapping function models. The results of the research indicate that the models are very close up to a zenith angle of 80°. The Saastamoinen and 1/cos z models lag behind in accuracy. The Niell model is better than the VMF model. The model of Black and Eisner is a good model. The results also indicate that the geometric range error has an insignificant effect on the slant delay and that the fluctuation of the azimuthal anti-symmetry is about 1%.
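
    To illustrate what a mapping function does, the sketch below compares the simple 1/cos(z) form with a Marini/Herring-style normalized continued fraction, the functional form underlying modern mapping functions. The a, b, c coefficients are illustrative placeholders, not the fitted values of the Niell or VMF models.

```python
# Compare a plane-parallel mapping function with a continued-fraction form.
import numpy as np

def mf_cosecant(zenith_deg):
    """Plane-parallel atmosphere: m(z) = 1 / cos(z)."""
    return 1.0 / np.cos(np.radians(zenith_deg))

def mf_continued_fraction(zenith_deg, a=1.2e-3, b=3.0e-3, c=6.0e-2):
    """Normalized three-term continued fraction in elevation e = 90 deg - z."""
    e = np.radians(90.0 - zenith_deg)
    s = np.sin(e)
    numer = 1.0 + a / (1.0 + b / (1.0 + c))
    denom = s + a / (s + b / (s + c))
    return numer / denom

for z in (0, 30, 60, 70, 80, 85):
    print(f"z={z:2d} deg  1/cos(z)={mf_cosecant(z):7.3f}  "
          f"continued fraction={mf_continued_fraction(z):7.3f}")
```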

  3. Quality assessment of protein model-structures based on structural and functional similarities.

    Science.gov (United States)

    Konopka, Bogumil M; Nebel, Jean-Christophe; Kotulska, Malgorzata

    2012-09-21

    Experimental determination of protein 3D structures is expensive, time consuming and sometimes impossible. The gap between the number of protein structures deposited in the Worldwide Protein Data Bank and the number of sequenced proteins constantly broadens. Computational modeling is deemed to be one of the ways to deal with the problem. Although protein 3D structure prediction is a difficult task, many tools are available. These tools can model a structure from a sequence or from partial structural information, e.g. contact maps. Consequently, biologists have the ability to generate automatically a putative 3D structure model of any protein. However, the main issue becomes evaluation of the model quality, which is one of the most important challenges of structural biology. GOBA (Gene Ontology-Based Assessment) is a novel Protein Model Quality Assessment Program. It estimates the compatibility between a model-structure and its expected function. GOBA is based on the assumption that a high quality model is expected to be structurally similar to proteins functionally similar to the prediction target. Whereas DALI is used to measure structure similarity, protein functional similarity is quantified using the standardized and hierarchical description of proteins provided by Gene Ontology combined with Wang's algorithm for calculating semantic similarity. Two approaches are proposed to express the quality of protein model-structures. One is a single model quality assessment method, the other is its modification, which provides a relative measure of model quality. Exhaustive evaluation is performed on data sets of model-structures submitted to the CASP8 and CASP9 contests. The validation shows that the method is able to discriminate between good and bad model-structures. The best of the tested GOBA scores achieved 0.74 and 0.8 as a mean Pearson correlation to the observed quality of models in our CASP8- and CASP9-based validation sets. GOBA also obtained the best result for two targets of CASP8, and

  4. Assessing elders using the functional health pattern assessment model.

    Science.gov (United States)

    Beyea, S; Matzo, M

    1989-01-01

    The impact of older Americans on the health care system requires that we increase our students' awareness of their unique needs. The authors discuss strategies to develop skills in using Gordon's Functional Health Patterns Assessment for assessing older clients.

  5. Assessment of nutritional status in the elderly: a proposed function-driven model.

    Science.gov (United States)

    Engelheart, Stina; Brummer, Robert

    2018-01-01

    There is no accepted or standardized definition of 'malnutrition'. Hence, there is also no definition of what constitutes an adequate nutritional status. In elderly people, assessment of nutritional status is complex and is complicated by multi-morbidity and disabilities combined with nutrition-related problems, such as dysphagia, decreased appetite, fatigue, and muscle weakness. We propose a nutritional status model that presents nutritional status from a comprehensive functional perspective. This model visualizes the complexity of nutritional status in elderly people. The presented model can be interpreted as meaning that nutritional status is conditional on a person's optimal function or situation. Another way of looking at it might be that a person's nutritional status affects his or her optimal situation. The proposed model includes four domains: (1) physical function and capacity; (2) health and somatic disorders; (3) food and nutrition; and (4) cognitive, affective, and sensory function. Each domain has a major impact on nutritional status, which in turn has a major impact on the outcome of each domain. Nutritional status is a multifaceted concept and there are several knowledge gaps in the diagnosis, prevention, and optimization of treatment of inadequate nutritional status in elderly people. The nutritional status model may be useful in nutritional assessment research, as well as in the clinical setting.

  6. Challenging the foundations of the clinical model of foot function: further evidence that the root model assessments fail to appropriately classify foot function.

    Science.gov (United States)

    Jarvis, Hannah L; Nester, Christopher J; Bowden, Peter D; Jones, Richard K

    2017-01-01

    The Root model of normal and abnormal foot function remains the basis for clinical foot orthotic practice globally. Our aim was to investigate the relationship between the foot deformities and kinematic compensations that are the foundations of the model. A convenience sample of 140 people was screened, and 100 symptom-free participants aged 18-45 years were invited to participate. The static biomechanical assessment described by the Root model was used to identify five foot deformities. A six-segment foot model was used to measure foot kinematics during gait. Statistical tests compared foot kinematics between feet with and without foot deformities and correlated the degree of deformity with any compensatory motions. None of the deformities proposed by the Root model were associated with distinct differences in foot kinematics during gait when compared with feet without deformities or with each other. Static and dynamic parameters were not correlated. Taken as part of a wider body of evidence, the results of this study have profound implications for clinical foot health practice. We believe that the assessment protocol advocated by the Root model is no longer a suitable basis for professional practice. We recommend that clinicians stop using the sub-talar neutral position during clinical assessments and stop assessing the non-weight-bearing range of ankle dorsiflexion, first ray position, and forefoot alignments and movement as a means of defining the associated foot deformities. The results question the relevance of the Root assessments in the prescription of foot orthoses.

  7. Quick Assessment of Family Functioning.

    Science.gov (United States)

    Golden, Larry B.

    1988-01-01

    Describes five criteria of family functioning (parental resources, chronicity, communication between family members, parental authority, and rapport with professional helpers) in an assessment model that can be used to determine which families could benefit from brief interventions by a school counselor. Provides results of 20 case studies with…

  8. Protein single-model quality assessment by feature-based probability density functions.

    Science.gov (United States)

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob was blindly tested in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which was officially ranked 3rd out of 143 predictors. This good performance shows that Qprob is effective at assessing the quality of models of hard targets. These results demonstrate that this new probability-density-based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The web server of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is also freely available from the Qprob web server.
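
    A simplified sketch of feature-based probability-density scoring in the spirit of this record (not the actual Qprob implementation): for each feature, the density of (feature value minus true GDT-TS) is estimated from a reference set, and a new model is scored by how probable its feature errors are at a given assumed quality. Feature names and data are hypothetical.

```python
# Feature-error density scoring sketch using kernel density estimation.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Hypothetical reference data: per-model feature values and true GDT-TS scores.
n_ref = 400
true_gdt = rng.uniform(0.2, 0.9, n_ref)
features = {
    "secondary_structure_score": true_gdt + rng.normal(0, 0.05, n_ref),
    "solvent_accessibility_score": true_gdt + rng.normal(0, 0.10, n_ref),
}

# One KDE of the signed error per feature.
error_kdes = {name: gaussian_kde(vals - true_gdt) for name, vals in features.items()}

def log_score(feature_values, assumed_quality):
    """Sum of log-densities of the feature errors at an assumed quality level."""
    return sum(np.log(error_kdes[name](feature_values[name] - assumed_quality)[0])
               for name in error_kdes)

# Pick the quality level that maximizes the combined density (a crude estimate).
candidate = {"secondary_structure_score": 0.62, "solvent_accessibility_score": 0.58}
grid = np.linspace(0.0, 1.0, 101)
estimate = grid[np.argmax([log_score(candidate, q) for q in grid])]
print("estimated quality (GDT-TS-like):", round(estimate, 2))
```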

  9. The treatment of climate science in Integrated Assessment Modelling: integration of climate step function response in an energy system integrated assessment model.

    Science.gov (United States)

    Dessens, Olivier

    2016-04-01

    Integrated Assessment Models (IAMs) are used as crucial inputs to policy-making on climate change. These models simulate aspects of the economy and the climate system to deliver future projections and to explore the impact of mitigation and adaptation policies. The IAMs' climate representation is extremely important as it can have a great influence on future political action. The step-function response is a simple climate model recently developed by the UK Met Office and is an alternative method of estimating the climate response to an emission trajectory directly from global climate model step simulations. Good et al. (2013) have formulated a method of reconstructing general circulation models' (GCMs) climate response to emission trajectories through an idealized experiment. This method is called the "step-response approach" and is based on the results of an idealized abrupt CO2 step experiment. TIAM-UCL is a technology-rich model that belongs to the family of partial-equilibrium, bottom-up models, developed at University College London to represent a wide spectrum of energy systems in 16 regions of the globe (Anandarajah et al. 2011). The model uses optimisation functions to obtain cost-efficient solutions in meeting an exogenously defined set of energy-service demands, given certain technological and environmental constraints. Furthermore, it employs linear programming techniques, which makes the step-function representation of the climate change response well adapted to the model's mathematical formulation. For the first time, we have introduced the "step-response approach" method developed at the UK Met Office into an IAM, the TIAM-UCL energy system model, and we investigate the main consequences of this modification on the results of the model in terms of climate and energy system responses. The main advantage of this approach (apart from the low computational cost it entails) is that its results are directly traceable to the GCM involved and closely connected to well-known methods of
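
    The core of the step-response approach is linear superposition: the temperature trajectory for an arbitrary forcing scenario is reconstructed by convolving yearly forcing increments with a GCM step-response curve. The sketch below illustrates that convolution; the exponential response shape and all numbers are illustrative assumptions, not Met Office or TIAM-UCL values.

```python
# Reconstruct a temperature trajectory from scaled, lagged step responses.
import numpy as np

years = np.arange(0, 100)

# Illustrative step response of global-mean temperature to an abrupt forcing
# F_step (two-timescale exponential relaxation towards 3 K).
F_step = 3.7  # W m^-2, roughly a CO2 doubling (assumption)
step_response = 3.0 * (1.0 - 0.6 * np.exp(-years / 4.0) - 0.4 * np.exp(-years / 100.0))

# Illustrative scenario: forcing ramps up linearly for 50 years, then stabilizes.
forcing = np.minimum(years, 50) * 0.06  # W m^-2
dF = np.diff(forcing, prepend=0.0)      # yearly forcing increments

# Linear superposition of scaled, lagged step responses (discrete convolution).
temperature = np.convolve(dF / F_step, step_response)[: len(years)]

print("warming after 50 y: %.2f K, after 99 y: %.2f K" % (temperature[50], temperature[99]))
```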

  10. Assessing gene function in the ruminant placenta.

    Science.gov (United States)

    Anthony, R V; Cantlon, J D; Gates, K C; Purcell, S H; Clay, C M

    2010-01-01

    The placenta provides the means for nutrient transfer from the mother to the fetus, waste transfer from the fetus to the mother, and protection of the fetus from the maternal immune system, and it is an active endocrine organ. While many placental functions have been defined and investigated, assessing the function of specific genes expressed by the placenta has been problematic, since classical ablation-replacement methods are not feasible with the placenta. The pregnant sheep has been a long-standing animal model for assessing in vivo physiology during pregnancy, since surgical placement of indwelling catheters into both the maternal and fetal vasculature has allowed the assessment of placental nutrient transfer and utilization, as well as placental hormone secretion, under unanesthetized, unstressed steady-state sampling conditions. However, in ruminants the lack of well-characterized trophoblast cell lines and the inefficiency of creating transgenic pregnancies have inhibited our ability to assess specific gene function. Recently, sheep and cattle primary trophoblast cell lines have been reported, and these may further our ability to investigate trophoblast function and transcriptional regulation of genes expressed by the placenta. Furthermore, viral infection of the trophectoderm layer of hatched blastocysts, as a means for placenta-specific transgenesis, holds considerable potential to assess gene function in the ruminant placenta. This approach has been used successfully to "knock down" gene expression in the developing sheep conceptus, and has the potential for gain-of-function experiments as well. While this technology is still being developed, it may provide an efficient approach to assess specific gene function in the ruminant placenta.

  11. Body composition in dialysis patients: a functional assessment of bioimpedance using different prediction models.

    Science.gov (United States)

    Broers, Natascha J H; Martens, Remy J H; Cornelis, Tom; Diederen, Nanda M P; Wabel, Peter; van der Sande, Frank M; Leunissen, Karel M L; Kooman, Jeroen P

    2015-03-01

    The assessment of body composition (BC) in dialysis patients is of clinical importance given its role in the diagnosis of malnutrition and sarcopenia. Bioimpedance techniques routinely express BC as a 2-compartment (2-C) model distinguishing fat mass (FM) and fat-free mass (FFM), which may be influenced by the hydration of adipose tissue and fluid overload (OH). Recently, the BC monitor was introduced, which applies a 3-compartment (3-C) model distinguishing OH, adipose tissue mass, and lean tissue mass. The aim of this study was to compare BC between the 2-C and 3-C models and assess their relation with markers of functional performance (handgrip strength [HGS] and the 4-m walking test), as well as with biochemical markers of nutrition. Forty-seven dialysis patients (30 males and 17 females; 35 hemodialysis, 12 peritoneal dialysis) with a mean age of 64.8 ± 16.5 years were studied. 3-C BC was assessed by the BC monitor, whereas the obtained resistivity values were used to calculate FM and FFM according to the Xitron Hydra 4200 formulas, which are based on a 2-C model. FFM (3-C) was 0.99 kg (95% confidence interval [CI], 0.27 to 1.71, P = .008) higher than FFM (2-C), and FM (3-C) differed from FM (2-C) by 2.43 kg (95% CI, 1.70-3.15); both FFM (3-C) and FFM (2-C) were significantly related to HGS. Bioimpedance, HGS, and the 4-m walking test may all be valuable tools in the multidimensional nutritional assessment of both hemodialysis and peritoneal dialysis patients. Copyright © 2015 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  12. Assessing the protection function of Alpine forest ecosystems using BGC modelling theory

    Science.gov (United States)

    Pötzelsberger, E.; Hasenauer, H.; Petritsch, R.; Pietsch, S. A.

    2009-04-01

    The purpose of this study was to assess the protection function of forests in Alpine areas by modelling the flux dynamics (water, carbon, nutrients) within a watershed as they depend on the vegetation pattern and forest management impacts. The application case for this study was the Schmittenbach catchment, located in the province of Salzburg. The available data covered the hydrology (rainfall measurements from 1981 to 1998 and runoff measurements at the river Schmittenbach from 1981 to 2005) and the vegetation dynamics (currently 69% forest, predominantly Norway spruce). The method of simulating forest growth and water outflow was validated. For simulations of the key ecosystem processes (e.g. photosynthesis, carbon and nitrogen allocation in the different plant parts, litter fall, mineralisation, tree water uptake, transpiration, rainfall interception, evaporation, snow accumulation and snow melt, outflow of spare water) the biogeochemical ecosystem model Biome-BGC was applied. Relevant model extensions were the tree-species-specific parameter sets and the improved thinning regime. The model is sensitive to site characteristics and needs daily weather data and information on the atmospheric composition, which makes it sensitive to higher CO2 levels and climate change. For model validation, 53 plots were selected covering the full range of site quality and stand age. Tree volume and soil were measured and compared with the respective model results. The outflow for the watershed was predicted by combining the simulated forest outflow (derived from plot outflow) with the outflow from the non-forest area (calculated with a fixed outflow/rainfall coefficient (OC)). The analysis of production- and water-related model outputs indicated that mechanistic modelling can be used as a tool to assess the performance of Alpine protection forests. The Water Use Efficiency (WUE), the ratio of Net Primary Production (NPP) to Transpiration, was found to be highest for juvenile stands (

  13. Model and methods to assess hepatic function from indocyanine green fluorescence dynamical measurements of liver tissue.

    Science.gov (United States)

    Audebert, Chloe; Vignon-Clementel, Irene E

    2018-03-30

    The indocyanine green (ICG) clearance, presented as the plasma disappearance rate, is presently a reliable method to estimate hepatic "function". However, this technique is not instantaneously available and thus cannot be used intra-operatively (during liver surgery). Near-infrared spectroscopy enables assessment of hepatic ICG concentration over time in the liver tissue. This article proposes to extract more information from the liver intensity dynamics by interpreting it through a dedicated pharmacokinetics model. In order to account for the different exchanges between the liver tissues, the proposed model includes three liver compartments (sinusoids, hepatocytes and bile canaliculi). The dependency of the model output on its parameters is studied with sensitivity analysis and by solving an inverse problem on synthetic data. The estimation of model parameters is then performed with in-vivo measurements in rabbits (El-Desoky et al. 1999). Parameters for different liver states are estimated, and their link with liver function is investigated. A non-linear (Michaelis-Menten type) excretion rate from the hepatocytes to the bile canaliculi was necessary to reproduce the measurements for different liver conditions. In case of bile duct ligation, the model suggests that this rate is reduced and that the ICG is stored in the hepatocytes. Moreover, the level of ICG remains high in the blood following ligation of the bile duct. The percentage of retention of indocyanine green in blood, which is a common test for hepatic function estimation, is also investigated with the model. The impact of bile duct ligation and reduced liver inflow on the percentage of ICG retention in blood is studied. The estimation of the pharmacokinetics model parameters may lead to an evaluation of different liver functions. Copyright © 2018 Elsevier B.V. All rights reserved.
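
    An illustrative three-compartment ICG kinetics sketch (sinusoids to hepatocytes to bile) with a Michaelis-Menten canalicular excretion step, as described in the record. The rate constants and the exact equations are assumptions for the sketch, not the authors' fitted model.

```python
# Toy ICG pharmacokinetics with saturable hepatocyte-to-bile excretion.
import numpy as np
from scipy.integrate import solve_ivp

k_in = 0.5      # uptake from sinusoids into hepatocytes (1/min), assumption
k_out = 0.05    # washout from sinusoids back to blood (1/min), assumption
v_max = 0.8     # maximal canalicular excretion rate (a.u./min), assumption
k_m = 1.0       # Michaelis constant (a.u.), assumption

def rhs(t, y):
    sinusoid, hepatocyte, bile = y
    uptake = k_in * sinusoid
    excretion = v_max * hepatocyte / (k_m + hepatocyte)  # Michaelis-Menten
    return [-uptake - k_out * sinusoid,
            uptake - excretion,
            excretion]

sol = solve_ivp(rhs, (0.0, 60.0), y0=[1.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 60.0, 7)
for ti, (s, h, b) in zip(t, sol.sol(t).T):
    print(f"t={ti:4.0f} min  sinusoid={s:.3f}  hepatocyte={h:.3f}  bile={b:.3f}")
```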

  14. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
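
    A small sketch of the stochastic procedure recommended here: propagate uncertain parameters through a transfer model by Monte Carlo sampling and rank them by their contribution to the spread of predictions. The pathway, parameter names, and distributions are purely illustrative.

```python
# Monte Carlo propagation and parameter-importance ranking for a toy pathway.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain inputs of a soil -> plant -> dose pathway.
transfer_factor = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)
intake_rate = rng.normal(loc=200.0, scale=40.0, size=n)                 # kg/y
dose_coefficient = rng.lognormal(mean=np.log(1e-8), sigma=0.3, size=n)  # Sv/Bq
soil_concentration = 1_000.0                                            # Bq/kg, fixed

dose = soil_concentration * transfer_factor * intake_rate * dose_coefficient

print("median dose: %.2e Sv, 95th percentile: %.2e Sv"
      % (np.median(dose), np.percentile(dose, 95)))

# Rank parameters by squared Spearman rank correlation with the predicted dose.
for name, values in [("transfer_factor", transfer_factor),
                     ("intake_rate", intake_rate),
                     ("dose_coefficient", dose_coefficient)]:
    rho, _ = spearmanr(values, dose)
    print(f"{name:17s} contribution (rho^2): {rho**2:.2f}")
```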

  15. East Tennessee State University's "Make a Difference" Project: Using a Team-Based Consultative Model To Conduct Functional Behavioral Assessments.

    Science.gov (United States)

    Vaughn, Kelley; Hales, Cindy; Bush, Marta; Fox, James

    1998-01-01

    Describes implementation of functional behavioral assessment (FBA) through collaboration between a university (East Tennessee State University) and the local school system. Discusses related issues such as factors in team training, team size, FBA adaptations, and replicability of the FBA team model. (Author/DB)

  16. Functional Assessment Inventory Manual.

    Science.gov (United States)

    Crewe, Nancy M.; Athelstan, Gary T.

    This manual, which provides extensive new instructions for administering the Functional Assessment Inventory (FAI), is intended to enable counselors to begin using the inventory without undergoing any special training. The first two sections deal with the need for functional assessment and issues in the development and use of the inventory. The…

  17. Assessment of the Tourism Function in Region Development

    Directory of Open Access Journals (Sweden)

    Nataliia Zigern-Korn

    2014-07-01

    Full Text Available This article presents methodological approaches to estimating the value and function of tourism in the development of Russian regions and contains some results of the assessment carried out. The assessment framework is built around the role of tourism for different types of territory exploitation and levels of social and economic development, and around methods of tourist development of regional space. The selection of model regions according to their development types formed the information basis for the assessment. Specific indicators and an assessment algorithm were determined for each model region. The results of the assessment allow rethinking the category of the tourist and recreational potential of a territory from the standpoint of the prospects and principles of sustainable development. This technique for assessing the impact of tourism on regional development enables public authorities to adopt correct strategic decisions in accordance with the principles of Smart Development.

  18. New statistical potential for quality assessment of protein models and a survey of energy functions

    Directory of Open Access Journals (Sweden)

    Rykunov Dmitry

    2010-03-01

    Full Text Available Abstract Background Scoring functions, such as molecular mechanics force fields and statistical potentials, are fundamentally important tools in protein structure modeling and quality assessment. Results The performances of a number of publicly available scoring functions are compared with statistical rigor, with an emphasis on knowledge-based potentials. We explored the effect on accuracy of alternative choices for representing interaction center types and other features of scoring functions, such as using information on solvent accessibility, on torsion angles, accounting for secondary structure preferences and side chain orientation. Partially based on the observations made, we present a novel residue-based statistical potential, which employs a shuffled reference state definition and takes into account the mutual orientation of residue side chains. Atom- and residue-level statistical potentials and Linux executables to calculate the energy of a given protein proposed in this work can be downloaded from http://www.fiserlab.org/potentials. Conclusions Among the most influential terms, we observed a critical role of a proper reference state definition and the benefits of including information about the microenvironment of interaction centers. Molecular mechanical potentials were also tested and found to be over-sensitive to small local imperfections in a structure, requiring unfeasibly long energy relaxation before energy scores started to correlate with model quality.
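
    A minimal sketch of a knowledge-based (statistical) potential using the inverse Boltzmann formula E(i, j) = -ln(P_obs(i, j) / P_ref(i, j)). The contact counts are made up for illustration; real potentials are derived from large sets of experimental structures, and the choice of reference state (a key point of this record) matters greatly.

```python
# Pseudo-energies for residue-pair contacts from (fictitious) observed counts.
import numpy as np

residues = ["ALA", "LEU", "LYS", "ASP"]

# Hypothetical observed contact counts between residue types (symmetric).
observed = np.array([[120,  90,  40,  35],
                     [ 90, 200,  50,  45],
                     [ 40,  50,  30, 110],
                     [ 35,  45, 110,  25]], dtype=float)

p_obs = observed / observed.sum()

# Simple composition-based reference state: expected contact frequency if
# residues paired at random according to their overall abundance.
abundance = observed.sum(axis=1) / observed.sum()
p_ref = np.outer(abundance, abundance)

energy = -np.log(p_obs / p_ref)   # pseudo-energies in units of kT

for i, ri in enumerate(residues):
    row = "  ".join(f"{energy[i, j]:6.2f}" for j in range(len(residues)))
    print(f"{ri}: {row}")
```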

  19. Personalized pseudophakic model for refractive assessment.

    Science.gov (United States)

    Ribeiro, Filomena J; Castanheira-Dinis, António; Dias, João M

    2012-01-01

    To test a pseudophakic eye model that allows for intraocular lens (IOL) power calculation, both in normal eyes and in extreme conditions, such as post-LASIK. The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing us to test the same method in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF) weighted according to the contrast sensitivity function (CSF) were applied. Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778). Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  20. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    Science.gov (United States)

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters such as diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider uncertainty in the mass-transfer parameters as well as in other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (i.e., diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts is possible, and the associated migration risk and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
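
    A sketch of the probabilistic idea: sample the uncertain inputs (barrier diffusion coefficient and thickness), push them through a deliberately simplified migration criterion, and report the spread. The lag-time formula t_lag = L^2 / (6 D) and all numbers are illustrative assumptions, not the tool described in the record.

```python
# Monte Carlo over barrier properties -> distribution of breakthrough times.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Uncertain inputs (assumed distributions).
D = rng.lognormal(mean=np.log(1e-13), sigma=1.0, size=n)    # cm^2/s
L = rng.normal(loc=20e-4, scale=2e-4, size=n)               # cm (20 micrometres)
L = np.clip(L, 5e-4, None)                                  # keep thickness physical

# Time for the migrant front to break through the functional barrier.
t_lag_days = (L**2 / (6.0 * D)) / 86_400.0

lo, med, hi = np.percentile(t_lag_days, [5, 50, 95])
print(f"barrier breakthrough time: median {med:.0f} d, 90% interval [{lo:.0f}, {hi:.0f}] d")
print("fraction of cases with breakthrough within 1 year:",
      np.mean(t_lag_days < 365.0))
```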

  1. Distinguishing Differential Testlet Functioning from Differential Bundle Functioning Using the Multilevel Measurement Model

    Science.gov (United States)

    Beretvas, S. Natasha; Walker, Cindy M.

    2012-01-01

    This study extends the multilevel measurement model to handle testlet-based dependencies. A flexible two-level testlet response model (the MMMT-2 model) for dichotomous items is introduced that permits assessment of differential testlet functioning (DTLF). A distinction is made between this study's conceptualization of DTLF and that of…

  2. Assessment of exposure-response functions for rocket-emission toxicants

    National Research Council Canada - National Science Library

    Subcommittee on Rocket-Emission Toxicants, National Research Council

    ... aborted launch that results in a rocket being destroyed near the ground. Assessment of Exposure-Response Functions for Rocket-Emission Toxicants evaluates the model and the data used for three rocket emission toxicants...

  3. Data Acquisition for Quality Loss Function Modelling

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard; Howard, Thomas J.

    2016-01-01

    Quality loss functions can be a valuable tool when assessing the impact of variation on product quality. Typically, the input for the quality loss function would be a measure of the varying product performance and the output would be a measure of quality. While the unit of the input is given by the product function in focus, the quality output can be measured and quantified in a number of ways. In this article a structured approach for acquiring stakeholder satisfaction data for use in quality loss function modelling is introduced.
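
    For illustration, the sketch below uses the classic quadratic (Taguchi-style) loss, L(y) = k (y - m)^2, with k calibrated from a stakeholder-supplied loss at a tolerance limit. The quadratic form and all numbers are assumptions for the example, not necessarily the loss model used in the article.

```python
# Quadratic quality loss calibrated from one stakeholder data point.
target = 10.0             # nominal product performance m
tolerance = 0.5           # deviation at which stakeholders report the reference loss
loss_at_tolerance = 20.0  # e.g. monetary loss (or dissatisfaction score) at m +/- tolerance

k = loss_at_tolerance / tolerance**2

def quality_loss(performance: float) -> float:
    """Quadratic loss for one observed performance value."""
    return k * (performance - target) ** 2

for y in (10.0, 10.2, 10.5, 11.0):
    print(f"performance {y:5.2f} -> loss {quality_loss(y):6.1f}")
```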

  4. Personalized pseudophakic model for refractive assessment.

    Directory of Open Access Journals (Sweden)

    Filomena J Ribeiro

    Full Text Available PURPOSE: To test a pseudophakic eye model that allows for intraocular lens (IOL) power calculation, both in normal eyes and in extreme conditions, such as post-LASIK. METHODS: PARTICIPANTS: The model's efficacy was tested in 54 participants (104 eyes) who underwent LASIK and were assessed before and after surgery, thus allowing us to test the same method in the same eye after changing only the corneal topography. MODELLING: The Liou-Brennan eye model was used as a starting point, and biometric values were replaced by individual measurements. Detailed corneal surface data were obtained from topography (Orbscan®) and a grid of elevation values was used to define corneal surfaces in an optical ray-tracing software (Zemax®). To determine IOL power, optimization criteria based on values of the modulation transfer function (MTF) weighted according to the contrast sensitivity function (CSF) were applied. RESULTS: Pre-operative refractive assessment calculated by our eye model correlated very strongly with SRK/T (r = 0.959). Comparison of post-operative refractive assessment obtained using our eye model with the average of currently used formulas showed a strong correlation (r = 0.778). CONCLUSIONS: Results suggest that personalized pseudophakic eye models and ray-tracing allow for the use of the same methodology, regardless of previous LASIK, independent of population averages and commonly used regression correction factors, which represents a clinical advantage.

  5. Assessment of Safety and Functional Efficacy of Stem Cell-Based Therapeutic Approaches Using Retinal Degenerative Animal Models

    Directory of Open Access Journals (Sweden)

    Tai-Chi Lin

    2017-01-01

    Full Text Available Dysfunction and death of retinal pigment epithelium (RPE) and/or photoreceptors can lead to irreversible vision loss. The eye represents an ideal microenvironment for stem cell-based therapy. It is considered an "immune privileged" site, and the number of cells needed for therapy is relatively low for the area of focused vision (macula). Further, surgical placement of stem cell-derived grafts (RPE, retinal progenitors, and photoreceptor precursors) into the vitreous cavity or subretinal space has been well established. For preclinical tests, assessments of stem cell-derived graft survival and functionality are conducted in animal models by various noninvasive approaches and imaging modalities. In vivo experiments conducted in animal models based on replacing photoreceptors and/or RPE cells have shown survival and functionality of the transplanted cells, rescue of the host retina, and improvement of visual function. Based on the positive results obtained from these animal experiments, human clinical trials are being initiated. Despite such progress in stem cell research, ethical, regulatory, safety, and technical difficulties still remain a challenge for the transformation of this technique into a standard clinical approach. In this review, the current status of preclinical safety and efficacy studies for retinal cell replacement therapies conducted in animal models will be discussed.

  6. Assessing work disability for social security benefits: international models for the direct assessment of work capacity.

    Science.gov (United States)

    Geiger, Ben Baumberg; Garthwaite, Kayleigh; Warren, Jon; Bambra, Clare

    2017-08-25

    It has been argued that social security disability assessments should directly assess claimants' work capacity, rather than relying on proxies such as functioning. However, there is little academic discussion of how such assessments could be conducted. The article presents an account of different models of direct disability assessment based on case studies of the Netherlands, Germany, Denmark, Norway, the United States of America, Canada, Australia, and New Zealand, utilising over 150 documents and 40 expert interviews. Three models of direct work disability assessment can be observed: (i) structured assessment, which measures the functional demands of jobs across the national economy and compares these to claimants' functional capacities; (ii) demonstrated assessment, which looks at claimants' actual experiences in the labour market and infers a lack of work capacity from the failure of a concerted rehabilitation attempt; and (iii) expert assessment, based on the judgement of skilled professionals. Direct disability assessment within social security is not just theoretically desirable, but can be implemented in practice. We have shown that there are three distinct ways that this can be done, each with different strengths and weaknesses. Further research is needed to clarify the costs, validity/legitimacy, and consequences of these different models. Implications for rehabilitation: It has recently been argued that social security disability assessments should directly assess work capacity rather than simply assessing functioning, but we have no understanding of how this can be done in practice. Based on case studies of nine countries, we show that direct disability assessment can be implemented, and argue that there are three different ways of doing it. These are "demonstrated assessment" (using claimants' experiences in the labour market), "structured assessment" (matching functional requirements to workplace demands), and "expert assessment" (the

  7. Assessment of left ventricular function with single breath-hold highly accelerated cine MRI combined with guide-point modeling

    International Nuclear Information System (INIS)

    Heilmaier, Christina; Nassenstein, Kai; Nielles-Vallespin, Sonia; Zuehlsdorff, Sven; Hunold, Peter; Barkhausen, Joerg

    2010-01-01

    Purpose: To prospectively assess the performance of highly accelerated cine MRI in multiple orientations combined with a new guide-point modeling post-processing technique (GPM approach) for assessment of left ventricular (LV) function, compared to the standard summation-of-slices method based on a stack of short-axis views (SoS approach). Materials and methods: 33 consecutive patients were examined on a 1.5 T scanner with a standard steady-state free precession (SSFP) sequence (TR, 3.0 ms; TE, 1.5 ms; flip angle (FA), 60°; acceleration factor (AF), 2) analyzed with the SoS method, and a highly accelerated, single breath-hold temporal parallel acquisition SSFP sequence (TR, 4.6 ms; TE, 1.1 ms; AF, 3) post-processed with the GPM method. LV function values were measured by two independent readers with different experience in cardiac MRI and compared using the paired t-test and F-test. Inter- and intraobserver agreements were calculated using Bland-Altman plots. Results: Mean acquisition and post-processing time was significantly shorter with the GPM approach (15 s/3 min versus 360 s/6 min). For all LV function parameters, interobserver agreement between the experienced and non-experienced reader was significantly improved when the GPM approach was used. However, end-diastolic and end-systolic volumes were larger for the GPM technique when compared to the SoS method. For both readers and all parameters, variances did not differ significantly (P ≥ 0.409) and the two approaches showed an excellent linear correlation (r > 0.951). Conclusion: Due to its accurate, fast and reproducible assessment of LV function parameters, highly accelerated MRI combined with the GPM technique may become the technique of first choice for assessment of LV function in clinical routine.
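
    For reference, the summation-of-slices (SoS) approach used as the comparator here approximates left ventricular volume by summing the contoured endocardial areas of the short-axis stack times the slice spacing; ejection fraction then follows from the end-diastolic and end-systolic volumes. The slice areas below are made-up example values.

```python
# Summation-of-slices LV volume and ejection fraction from a short-axis stack.
def lv_volume_ml(slice_areas_cm2, slice_spacing_cm):
    """Summation of slices: V = sum(A_i) * d, returned in millilitres (cm^3)."""
    return sum(slice_areas_cm2) * slice_spacing_cm

# Hypothetical contoured areas (cm^2) for each short-axis slice.
ed_areas = [22.0, 24.5, 23.0, 20.0, 16.5, 11.0, 5.0]   # end-diastole
es_areas = [12.0, 13.5, 12.5, 10.0, 7.5, 4.5, 1.5]     # end-systole
spacing = 1.0  # cm (slice thickness plus gap)

edv = lv_volume_ml(ed_areas, spacing)
esv = lv_volume_ml(es_areas, spacing)
ef = 100.0 * (edv - esv) / edv

print(f"EDV = {edv:.0f} ml, ESV = {esv:.0f} ml, EF = {ef:.0f} %")
```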

  8. Refining Inquiry with Multi-Form Assessment: Formative and summative assessment functions for flexible inquiry

    Science.gov (United States)

    Zuiker, Steven; Reid Whitaker, J.

    2014-04-01

    This paper describes the 5E+I/A inquiry model and reports a case study of one curricular enactment by a US fifth-grade classroom. A literature review establishes the model's conceptual adequacy with respect to longstanding research related to both the 5E inquiry model and multiple, incremental innovations of it. As a collective line of research, the review highlights a common emphasis on formative assessment, at times coupled either with differentiated instruction strategies or with activities that target the generalization of learning. The 5E+I/A model contributes a multi-level assessment strategy that balances formative and summative functions of multiple forms of assessment in order to support classroom participation while still attending to individual achievement. The case report documents the enactment of a weeklong 5E+I/A curricular design as a preliminary account of the model's empirical adequacy. A descriptive and analytical narrative illustrates variable ways that multi-level assessment makes student thinking visible and pedagogical decision-making more powerful. In light of both, it also documents productive adaptations to a flexible curricular design and considers future research to advance this collective line of inquiry.

  9. Better assessment of physical function: item improvement is neglected but essential.

    Science.gov (United States)

    Bruce, Bonnie; Fries, James F; Ambrosini, Debbie; Lingala, Bharathi; Gandek, Barbara; Rose, Matthias; Ware, John E

    2009-01-01

    Physical function is a key component of patient-reported outcome (PRO) assessment in rheumatology. Modern psychometric methods, such as Item Response Theory (IRT) and Computerized Adaptive Testing, can materially improve measurement precision at the item level. We present the qualitative and quantitative item-evaluation process for developing the Patient Reported Outcomes Measurement Information System (PROMIS) Physical Function item bank. The process was stepwise: we searched extensively to identify extant Physical Function items and then classified and selectively reduced the item pool. We evaluated retained items for content, clarity, relevance and comprehension, reading level, and translation ease by experts and patient surveys, focus groups, and cognitive interviews. We then assessed items using classical test theory and IRT, with confirmatory factor analyses and graded response modeling for parameter estimation. We retained the 20 Legacy (original) Health Assessment Questionnaire Disability Index (HAQ-DI) items and the 10 items of the SF-36 PF-10 for comparison. Subjects were from rheumatoid arthritis, osteoarthritis, and healthy aging cohorts (n = 1,100) and a national Internet sample of 21,133 subjects. We identified 1,860 items. After qualitative and quantitative evaluation, 124 newly developed PROMIS items composed the PROMIS item bank, which included revised Legacy items with good fit that met IRT model assumptions. Results showed that the clearest and best-understood items were simple, in the present tense, and straightforward. Basic tasks (like dressing) were more relevant and important than complex ones (like dancing). Revised HAQ-DI and PF-10 items with five response options had higher item-information content than did comparable original Legacy items with fewer response options. IRT analyses showed that the Physical Function domain satisfied general criteria for unidimensionality with one-, two-, three-, and four-factor models
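
    A minimal sketch of the graded response model (GRM) used for parameter estimation above: category probabilities for one polytomous physical-function item as a function of the latent trait theta. The discrimination and threshold values are illustrative, not PROMIS parameters.

```python
# Category probabilities for one item under the graded response model.
import numpy as np

def grm_category_probs(theta, a, thresholds):
    """P(X = k | theta) for k = 0..K-1 under the graded response model."""
    thresholds = np.asarray(thresholds, dtype=float)
    # Cumulative probabilities P(X >= k) for k = 1..K-1.
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - thresholds)))
    cum = np.concatenate(([1.0], p_star, [0.0]))
    return cum[:-1] - cum[1:]

a = 2.0                        # discrimination (assumed)
thresholds = [-1.5, 0.0, 1.2]  # between 4 ordered response options (assumed)

for theta in (-2.0, 0.0, 2.0):
    probs = grm_category_probs(theta, a, thresholds)
    print(f"theta={theta:+.1f}:", np.round(probs, 3), "sum =", round(probs.sum(), 3))
```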

  10. Functional Fault Model Development Process to Support Design Analysis and Operational Assessment

    Science.gov (United States)

    Melcher, Kevin J.; Maul, William A.; Hemminger, Joseph A.

    2016-01-01

    A functional fault model (FFM) is an abstract representation of the failure space of a given system. As such, it simulates the propagation of failure effects along paths between the origin of the system failure modes and points within the system capable of observing the failure effects. As a result, FFMs may be used to diagnose the presence of failures in the modeled system. FFMs necessarily contain a significant amount of information about the design, operations, and failure modes and effects. One of the important benefits of FFMs is that they may be qualitative, rather than quantitative and, as a result, may be implemented early in the design process when there is more potential to positively impact the system design. FFMs may therefore be developed and matured throughout the monitored system's design process and may subsequently be used to provide real-time diagnostic assessments that support system operations. This paper provides an overview of a generalized NASA process that is being used to develop and apply FFMs. FFM technology has been evolving for more than 25 years. The FFM development process presented in this paper was refined during NASA's Ares I, Space Launch System, and Ground Systems Development and Operations programs (i.e., from about 2007 to the present). Process refinement took place as new modeling, analysis, and verification tools were created to enhance FFM capabilities. In this paper, standard elements of a model development process (i.e., knowledge acquisition, conceptual design, implementation & verification, and application) are described within the context of FFMs. Further, newer tools and analytical capabilities that may benefit the broader systems engineering process are identified and briefly described. The discussion is intended as a high-level guide for future FFM modelers.

  11. Mathematical modeling and visualization of functional neuroimages

    DEFF Research Database (Denmark)

    Rasmussen, Peter Mondrup

    This dissertation presents research results regarding mathematical modeling in the context of the analysis of functional neuroimages. Specifically, the research focuses on pattern-based analysis methods that recently have become popular analysis tools within the neuroimaging community. Such methods...... neuroimaging data sets are characterized by relatively few data observations in a high dimensional space. The process of building models in such data sets often requires strong regularization. Often, the degree of model regularization is chosen in order to maximize prediction accuracy. We focus on the relative...... be carefully selected, so that the model and its visualization enhance our ability to interpret brain function. The second part concerns interpretation of nonlinear models and procedures for extraction of ‘brain maps’ from nonlinear kernel models. We assess the performance of the sensitivity map as means...

  12. THE MODEL FOR RISK ASSESSMENT ERP-SYSTEMS INFORMATION SECURITY

    Directory of Open Access Journals (Sweden)

    V. S. Oladko

    2016-12-01

    Full Text Available The article deals with the problem of assessing information security risks in ERP systems. ERP-system functions and architecture are studied. A model of malicious impacts on the levels of the ERP-system architecture is composed. A model for risk assessment, combining quantitative and qualitative approaches and built on a partial unification of three methods for studying information security risks (security models with full overlapping, the CRAMM technique, and the FRAP technique), is developed.

  13. PSA Model Improvement Using Maintenance Rule Function Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro [KHNP-CRI, Nuclear Safety Laboratory, Daejeon (Korea, Republic of)

    2011-10-15

    The Maintenance Rule (MR) program is, by nature, a performance-based program. Therefore, the risk information derived from the Probabilistic Safety Assessment (PSA) model is introduced into the MR program during the Safety Significance determination and Performance Criteria selection processes. However, this process also facilitates the determination of vulnerabilities in currently utilized PSA models and offers means of improving them. To find vulnerabilities in an existing PSA model, an initial review determines whether the safety-related MR functions are included in the PSA model. Because safety-related MR functions are related to accident prevention and mitigation, it is generally necessary for them to be included in the PSA model. In the process of determining the safety significance of each function, quantitative risk importance levels are determined through a process known as PSA model basic event mapping to MR functions. During this process, it is common for some inadequate and overlooked models to be uncovered. In this paper, the PSA model and the MR program of Wolsong Unit 1 were used as references.

  14. Optimizing cost-efficiency in mean exposure assessment--cost functions reconsidered.

    Science.gov (United States)

    Mathiassen, Svend Erik; Bolin, Kristian

    2011-05-21

    Reliable exposure data is a vital concern in medical epidemiology and intervention studies. The present study addresses the needs of the medical researcher to spend monetary resources devoted to exposure assessment with optimal cost-efficiency, i.e. to obtain the best possible statistical performance at a specified budget. A few previous studies have suggested mathematical optimization procedures based on very simple cost models; this study extends the methodology to cover even non-linear cost scenarios. Statistical performance, i.e. efficiency, was assessed in terms of the precision of an exposure mean value, as determined in a hierarchical, nested measurement model with three stages. Total costs were assessed using a corresponding three-stage cost model, allowing costs at each stage to vary non-linearly with the number of measurements according to a power function. Using these models, procedures for identifying the optimally cost-efficient allocation of measurements under a constrained budget were developed and applied to 225 scenarios combining different sizes of unit costs, cost function exponents, and exposure variance components. Explicit mathematical rules for identifying the optimal allocation could be developed when cost functions were linear, while non-linear cost functions implied that parts of, or the entire, optimization procedure had to be carried out using numerical methods. For many of the 225 scenarios, the optimal strategy consisted in measuring on only one occasion from each of as many subjects as allowed by the budget. Significant deviations from this principle occurred if costs for recruiting subjects were large compared to costs for setting up measurement occasions and, at the same time, the between-subjects to within-subject variance ratio was small. In these cases, non-linearities had a profound influence on the optimal allocation and on the eventual size of the exposure data set. The analysis procedures developed in the present study can be used
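
    A sketch of the allocation problem described here: choose how many subjects, occasions per subject, and measurements per occasion to use so that the variance of the estimated mean exposure is minimized under a fixed budget. The variance components, unit costs, and cost-function exponents are illustrative assumptions, and the brute-force grid search stands in for the paper's analytical/numerical optimization.

```python
# Cost-constrained allocation for a three-stage nested exposure design.
import itertools

var_subject, var_occasion, var_residual = 1.0, 0.5, 0.8   # variance components
c_subject, c_occasion, c_sample = 100.0, 20.0, 5.0        # unit costs
e_subject, e_occasion, e_sample = 1.0, 1.2, 0.9           # power-function exponents
budget = 5_000.0

def variance_of_mean(n_s, n_o, n_r):
    return (var_subject / n_s
            + var_occasion / (n_s * n_o)
            + var_residual / (n_s * n_o * n_r))

def total_cost(n_s, n_o, n_r):
    return (c_subject * n_s**e_subject
            + c_occasion * (n_s * n_o)**e_occasion
            + c_sample * (n_s * n_o * n_r)**e_sample)

best = min((alloc for alloc in itertools.product(range(1, 60), range(1, 10), range(1, 10))
            if total_cost(*alloc) <= budget),
           key=lambda alloc: variance_of_mean(*alloc))

print("optimal allocation (subjects, occasions, samples/occasion):", best)
print("variance of mean: %.4f, cost: %.0f" % (variance_of_mean(*best), total_cost(*best)))
```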

  15. A Porcine Model for Initial Surge Mechanical Ventilator Assessment and Evaluation of Two Limited Function Ventilators

    Science.gov (United States)

    Dickson, Robert P; Hotchkin, David L; Lamm, Wayne JE; Hinkson, Carl; Pierson, David J; Glenny, Robb W; Rubinson, Lewis

    2013-01-01

    Objective To adapt an animal model of acute lung injury for use as a standard protocol for screening and initial evaluation of limited function, or “surge,” ventilators for use in mass casualty scenarios. Design Prospective, experimental animal study. Setting University research laboratory. Subjects 12 adult pigs. Interventions 12 spontaneously breathing pigs (6 in each group) were subjected to acute lung injury/acute respiratory distress syndrome (ALI/ARDS) via pulmonary artery infusion of oleic acid. Following development of respiratory failure, animals were mechanically ventilated with a limited function ventilator (Simplified Automatic Ventilator [SAVe] I or II; Automedx) for one hour or until the ventilator could not support the animal. The limited function ventilator was then exchanged for a full function ventilator (Servo 900C; Siemens). Measurements and Main Results Reliable and reproducible levels of ALI/ARDS were induced. The SAVe I was unable to adequately oxygenate 5 animals, with a lower PaO2 (52.0 ± 11.1 torr) compared to the Servo (106.0 ± 25.6 torr; p=0.002). The SAVe II was able to oxygenate and ventilate all 6 animals for one hour with no difference in PaO2 (141.8 ± 169.3 torr) compared to the Servo (158.3 ± 167.7 torr). Conclusions We describe a novel in vivo model of ALI/ARDS that can be used to initially screen limited function ventilators considered for mass respiratory failure stockpiles, and is intended to be combined with additional studies to definitively assess appropriateness for mass respiratory failure. Specifically, during this study we demonstrate that the SAVe I ventilator is unable to provide sufficient gas exchange, while the SAVe II, with several more functions, was able to support the same level of hypoxemic respiratory failure secondary to ALI/ARDS for one hour. PMID:21187747

  16. Assessment of recovery in older patients hospitalized with different diagnoses and functional levels, evaluated with and without geriatric assessment.

    Science.gov (United States)

    Abrahamsen, Jenny Foss; Haugland, Cathrine; Ranhoff, Anette Hylen

    2016-01-01

    The objective of the present study was to investigate 1) the role of different admission diagnoses and 2) the degree of functional loss on the rate of recovery of older patients after acute hospitalization, and furthermore to compare the predictive value of simple assessments that can be carried out in a hospital lacking geriatric services with assessments including geriatric screening tests. Prospective, observational cohort study, including 961 community-dwelling patients aged ≥ 70 years, transferred from medical, cardiac, pulmonary and orthopedic acute hospital departments to intermediate care in a nursing home. Functional assessment with the Barthel index (BI) was performed at admission to the nursing home, and further geriatric assessment tests were performed during the first week. Logistic regression models with and without geriatric assessment were compared for patients having 1) slow recovery (nursing home stay of up to 2 months before return home) or 2) poor recovery (dead or still in the nursing home at 2 months). Slow recovery was independently associated with a diagnosis of non-vertebral fracture and with BI subgroups 50-79 and lower and, in the model including geriatric assessment, also with cognitive impairment. Poor recovery was more complex, being independently associated with lower BI and, in the model including geriatric assessment, with cognitive impairment. Geriatric assessment is optimal for determining the recovery potential of older patients after acute hospitalization. As some hospitals lack geriatric services and the ability to perform geriatric screening tests, a simpler assessment based on admission diagnoses and ADL function (BI) gives good information regarding the likely rehabilitation time and the possibility of returning home.

  17. Assessing the effects of management on forest growth across France: insights from a new functional-structural model.

    Science.gov (United States)

    Guillemot, Joannès; Delpierre, Nicolas; Vallet, Patrick; François, Christophe; Martin-StPaul, Nicolas K; Soudani, Kamel; Nicolas, Manuel; Badeau, Vincent; Dufrêne, Eric

    2014-09-01

    The structure of a forest stand, i.e. the distribution of tree size features, has strong effects on its functioning. The management of the structure is therefore an important tool in mitigating the impact of predicted changes in climate on forests, especially with respect to drought. Here, a new functional-structural model is presented and is used to assess the effects of management on forest functioning at a national scale. The stand process-based model (PBM) CASTANEA was coupled to a stand structure module (SSM) based on empirical tree-to-tree competition rules. The calibration of the SSM was based on a thorough analysis of intersite and interannual variability of competition asymmetry. The coupled CASTANEA-SSM model was evaluated across France using forest inventory data, and used to compare the effect of contrasted silvicultural practices on simulated stand carbon fluxes and growth. The asymmetry of competition varied consistently with stand productivity at both spatial and temporal scales. The modelling of the competition rules enabled efficient prediction of changes in stand structure within the CASTANEA PBM. The coupled model predicted an increase in net primary productivity (NPP) with management intensity, resulting in higher growth. This positive effect of management was found to vary at a national scale across France: the highest increases in NPP were attained in forests facing moderate to high water stress; however, the absolute effect of management on simulated stand growth remained moderate to low because stand thinning involved changes in carbon allocation at the tree scale. This modelling approach helps to identify the areas where management efforts should be concentrated in order to mitigate near-future drought impact on national forest productivity. Around a quarter of the French temperate oak and beech forests are currently in zones of high vulnerability, where management could thus mitigate the influence of climate change on forest yield.

  18. Local and Global Function Model of the Liver

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hesheng, E-mail: hesheng@umich.edu [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan (United States); Feng, Mary [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan (United States); Jackson, Andrew [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York (United States); Ten Haken, Randall K.; Lawrence, Theodore S. [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan (United States); Cao, Yue [Department of Radiation Oncology, University of Michigan, Ann Arbor, Michigan (United States); Department of Radiology, University of Michigan, Ann Arbor, Michigan (United States); Department of Biomedical Engineering, University of Michigan, Ann Arbor, Michigan (United States)

    2016-01-01

    Purpose: To develop a local and global function model in the liver based on regional and organ function measurements to support individualized adaptive radiation therapy (RT). Methods and Materials: A local and global model for liver function was developed to include both functional volume and the effect of functional variation of subunits. Adopting the assumption of parallel architecture in the liver, the global function was composed of a sum of local function probabilities of subunits, varying between 0 and 1. The model was fit to 59 datasets of liver regional and organ function measures from 23 patients obtained before, during, and 1 month after RT. The local function probabilities of subunits were modeled by a sigmoid function of MRI-derived portal venous perfusion values. The global function was fitted to the logarithm of the indocyanine green retention rate at 15 minutes (an overall liver function measure). Cross-validation was performed by leave-m-out tests. The model was further evaluated by fitting to the data divided according to whether the patients had hepatocellular carcinoma (HCC) or not. Results: The liver function model showed that (1) a perfusion value of 68.6 mL/(100 g · min) yielded a local function probability of 0.5; (2) the probability reached 0.9 at a perfusion value of 98 mL/(100 g · min); and (3) at a probability of 0.03 [corresponding to a perfusion of 38 mL/(100 g · min)] or lower, the contribution to global function was lost. Cross-validations showed that the model parameters were stable. The model fitted to the data from the patients with HCC indicated that the same amount of portal venous perfusion was translated into less local function probability than in the patients with non-HCC tumors. Conclusions: The developed liver function model could provide a means to better assess individual and regional dose-responses of hepatic functions, and provide guidance for individualized treatment planning of RT.
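
    The model structure described above can be illustrated with a short sketch: a sigmoid maps MRI-derived portal venous perfusion to a local function probability between 0 and 1, the global function is a scaled sum of subunit probabilities, and the parameters are fitted against the logarithm of an overall function measure. The synthetic data and the scale/offset parameters below are illustrative assumptions, not the authors' implementation.

        # Minimal sketch (not the authors' code) of the local/global liver function
        # model: a sigmoid maps portal venous perfusion to a local function
        # probability; the global function is a scaled sum over subunits and is
        # fitted to the logarithm of an overall function measure.  Data are synthetic.
        import numpy as np
        from scipy.optimize import curve_fit

        N_PATIENTS = 12

        def local_probability(perfusion, f50, slope):
            # Local function probability (0-1) vs perfusion in mL/(100 g . min).
            return 1.0 / (1.0 + np.exp(-(perfusion - f50) / slope))

        def global_function(perfusion_map, f50, slope, scale, offset):
            # Global function proxy: scaled sum of subunit probabilities plus offset.
            p = local_probability(perfusion_map.reshape(N_PATIENTS, -1), f50, slope)
            return scale * p.sum(axis=1) + offset

        rng = np.random.default_rng(0)
        perfusion = rng.normal(70.0, 20.0, size=(N_PATIENTS, 200)).clip(min=0.0)

        # Synthetic "measured" overall function generated from assumed true parameters
        # (f50 = 68.6 mL/(100 g . min), consistent with the 0.5 probability above).
        log_overall = global_function(perfusion, 68.6, 13.0, -0.01, 0.5)
        log_overall += rng.normal(0.0, 0.02, size=N_PATIENTS)

        popt, _ = curve_fit(global_function, perfusion, log_overall,
                            p0=[60.0, 10.0, -0.02, 0.0], maxfev=10000)
        print("fitted f50 = %.1f, slope = %.1f" % (popt[0], popt[1]))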

  19. Fun cube based brain gym cognitive function assessment system.

    Science.gov (United States)

    Zhang, Tao; Lin, Chung-Chih; Yu, Tsang-Chu; Sun, Jing; Hsu, Wen-Chuin; Wong, Alice May-Kuen

    2017-05-01

    The aim of this study is to design and develop a fun cube (FC) based brain gym (BG) cognitive function assessment system using the wireless sensor network and multimedia technologies. The system comprised (1) interaction devices, FCs and a workstation used as interactive tools for collecting and transferring data to the server, (2) a BG information management system responsible for managing the cognitive games and storing test results, and (3) a feedback system used for conducting the analysis of cognitive functions to assist caregivers in screening high risk groups with mild cognitive impairment. Three kinds of experiments were performed to evaluate the developed FC-based BG cognitive function assessment system. The experimental results showed that the Pearson correlation coefficient between the system's evaluation outcomes and the traditional Montreal Cognitive Assessment scores was 0.83. The average Technology Acceptance Model 2 score was close to six for 31 elderly subjects. Most subjects considered that the brain games are interesting and the FC human-machine interface is easy to learn and operate. The control group and the cognitive impairment group had statistically significant difference with respect to the accuracy of and the time taken for the brain cognitive function assessment games, including Animal Naming, Color Search, Trail Making Test, Change Blindness, and Forward / Backward Digit Span. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Fine Scale ANUClimate Data for Ecosystem Modeling and Assessment of Plant Functional Types

    Science.gov (United States)

    Hutchinson, M. F.; Kesteven, J. L.; Xu, T.; Evans, B. J.; Togashi, H. F.; Stein, J. L.

    2015-12-01

    High resolution spatially extended values of climate variables play a central role in the assessment of climate and projected future climate in ecosystem modeling. The ground based meteorological network remains a key resource for deriving these spatially extended climate variables. We report on the production, and applications, of new anomaly based fine scale spatial interpolations of key climate variables at daily and monthly time scale, across the Australian continent. The methods incorporate several innovations that have significantly improved spatial predictive accuracy, as well as providing a platform for the incorporation of additional remotely sensed data. The interpolated climate data are supporting many continent-wide ecosystem modeling applications and are playing a key role in testing optimality hypotheses associated with plant functional types (PFTs). The accuracy, and robustness to data error, of anomaly-based interpolation has been enhanced by incorporating physical process aspects of the different climate variables and employing robust statistical methods implemented in the ANUSPLIN package. New regression procedures have also been developed to estimate "background" monthly climate normals from all stations with minimal records to substantially increase the density of supporting spatial networks. Monthly mean temperature interpolation has been enhanced by incorporating process based coastal effects that have reduced predictive error by around 10%. Overall errors in interpolated monthly temperature fields are around 25% less than errors reported by an earlier study. For monthly and daily precipitation, a new anomaly structure has been devised to take account of the skewness in precipitation data and the large proportion of zero values that present significant challenges to standard interpolation methods. The many applications include continent-wide Gross Primary Production modeling and assessing constraints on light and water use efficiency derived

  1. Contribution to a quantitative assessment model for reliability-based metrics of electronic and programmable safety-related functions

    International Nuclear Information System (INIS)

    Hamidi, K.

    2005-10-01

    The use of fault-tolerant electronic and programmable (EP) architectures has induced growing constraints whose influence on reliability-based performance metrics is no longer negligible. To address the growing influence of simultaneous failures, this thesis proposes, for safety-related functions, a new reliability assessment method based on a more explicit treatment of time-dependent behaviour. The report introduces the concept of information and uses it to interpret the failure modes of a safety-related function as the direct result of the initiation and propagation of erroneous information up to the actuator level. The main idea is to distinguish between the appearance and disappearance of erroneous states, which are intrinsically dependent on hardware characteristics and maintenance policies, and their possible activation, constrained by architectural choices, which leads to the failure of the safety-related function. The approach relies, at a low level, on deterministic discrete-event system (SED) models of the architecture and uses non-homogeneous Markov chains to describe the time evolution of the error probabilities. (author)
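
    The last point can be made concrete with a small illustration: the probabilities of a three-state error model (error-free, latent erroneous state, failed function) are propagated with a non-homogeneous Markov chain by integrating the Kolmogorov forward equations dπ/dt = π(t)Q(t) with a time-dependent generator. The states and rates below are invented for illustration; they are not taken from the thesis.

        # Sketch: time evolution of error-state probabilities with a non-homogeneous
        # Markov chain (Kolmogorov forward equations, time-dependent generator).
        # States and rates are illustrative, not taken from the thesis.
        import numpy as np
        from scipy.integrate import solve_ivp

        def generator(t):
            # States: 0 = error-free, 1 = latent erroneous state, 2 = function failed.
            lam = 1e-3 * (1.0 + 0.5 * np.sin(2.0 * np.pi * t / 24.0))  # time-varying error rate (1/h)
            mu = 1e-2    # disappearance/repair rate of the erroneous state (1/h)
            act = 5e-4   # activation rate: the error propagates to the actuator level (1/h)
            return np.array([[-lam,         lam,  0.0],
                             [  mu, -(mu + act),  act],
                             [ 0.0,         0.0,  0.0]])   # failed state is absorbing here

        def forward(t, p):
            return p @ generator(t)       # d pi/dt = pi(t) Q(t)

        p0 = np.array([1.0, 0.0, 0.0])    # start error-free
        sol = solve_ivp(forward, (0.0, 1000.0), p0, rtol=1e-8)
        print("P(safety-related function failed) at t = 1000 h: %.3e" % sol.y[2, -1])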

  2. Exploring the Assessment of the DSM-5 Alternative Model for Personality Disorders With the Personality Assessment Inventory.

    Science.gov (United States)

    Busch, Alexander J; Morey, Leslie C; Hopwood, Christopher J

    2017-01-01

    Section III of the Diagnostic and Statistical Manual of Mental Disorders (5th ed. [DSM-5]; American Psychiatric Association, 2013) contains an alternative model for the diagnosis of personality disorder involving the assessment of 25 traits and a global level of overall personality functioning. There is hope that this model will be increasingly used in clinical and research settings, and the ability to apply established instruments to assess these concepts could facilitate this process. This study sought to develop scoring algorithms for these alternative model concepts using scales from the Personality Assessment Inventory (PAI). A multiple regression strategy was used to predict scores in 2 undergraduate samples on DSM-5 alternative model instruments: the Personality Inventory for the DSM-5 (PID-5) and the General Personality Pathology scale (GPP; Morey et al., 2011). These regression functions resulted in scores that demonstrated promising convergent and discriminant validity across the alternative model concepts, as well as a factor structure in a cross-validation sample that was congruent with the putative structure of the alternative model traits. Results were linked to the PAI community normative data to provide normative information regarding these alternative model concepts that can be used to identify elevated traits and personality functioning level scores.
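
    The kind of scoring algorithm described, a weighted combination of PAI scale scores obtained by multiple regression and then applied to new respondents, can be sketched as follows; the scale set, the sample and the resulting weights are purely hypothetical, not the published algorithms.

        # Hypothetical sketch of a multiple-regression scoring algorithm: predict a
        # PID-5-style domain score from a set of PAI scale scores.  The scales, the
        # simulated sample and the fitted weights are illustrative only.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        n = 300                                    # sample size (made up)
        pai = rng.normal(50.0, 10.0, size=(n, 4))  # T-scores on 4 hypothetical PAI scales
        true_w = np.array([0.04, -0.02, 0.05, 0.01])
        criterion = pai @ true_w + rng.normal(0.0, 0.3, n)   # e.g. a PID-5 domain score

        model = LinearRegression().fit(pai, criterion)
        print("regression weights:", np.round(model.coef_, 3))

        # Applying the fitted scoring function to a new respondent's PAI profile:
        new_profile = np.array([[62.0, 48.0, 55.0, 51.0]])
        print("predicted domain score: %.2f" % model.predict(new_profile)[0])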

  3. Development of safety function assessment trees for pressurized heavy water reactor LP/SD operations

    International Nuclear Information System (INIS)

    Yang, Hui Chang; Chung, Chang Hyun; Kim, Ki Yong; Jee, Moon Hak; Sung, Chang Kyoung

    2003-01-01

    The objective of a Configuration Risk Management Program (CRMP) is to maintain the safety level by assuring the defense-in-depth of a nuclear power plant while configurations are changed during plant operations, especially during low-power and shutdown (LP/SD) conditions. Such a safety purpose can be achieved by establishing risk monitoring programs with both quantitative and qualitative features. Generally, quantitative risk evaluation models, i.e., PRA models, are used for risk evaluation during full-power operation, while qualitative risk evaluation models such as safety function assessment trees are used for LP/SD operations. Through this study, safety function assessment trees were developed for pressurized heavy water reactor LP/SD operations.

  4. Changes in right ventricular function assessed by echocardiography in dog models of mild RV pressure overload.

    Science.gov (United States)

    Morita, Tomoya; Nakamura, Kensuke; Osuga, Tatsuyuki; Yokoyama, Nozomu; Morishita, Keitaro; Sasaki, Noboru; Ohta, Hiroshi; Takiguchi, Mitsuyoshi

    2017-07-01

    The assessment of hemodynamic change by echocardiography is clinically useful in patients with pulmonary hypertension. Recently, mild elevation of the mean pulmonary arterial pressure (PAP) has been shown to be associated with increased mortality. However, changes in the echocardiographic indices of right ventricular (RV) function are still unknown. The objective of this study was to validate the relationship between echocardiographic indices of RV function and right heart catheterization variables under a mild RV pressure overload condition. Echocardiography and right heart catheterization were performed in dog models of mild RV pressure overload induced by thromboxane A 2 analog (U46619) (n=7). The mean PAP was mildly increased (19.3±1.1 mm Hg), and the cardiac index was decreased. Most echocardiographic indices of RV function were significantly impaired even under a mild RV pressure overload condition. Multivariate analysis revealed that the RV free wall longitudinal strain (RVLS), standard deviation of the time-to-peak longitudinal strain of RV six segments (RV-SD) by speckle-tracking echocardiography, and Tei index were independent echocardiographic predictors of the mean PAP (free wall RVLS, β=-0.60, P<.001; RV-SD, β=0.40, P=.011), pulmonary vascular resistance (free wall RVLS, β=-0.39, P=.020; RV-SD, β=0.47, P=.0086; Tei index, β=0.34, P=.047), and cardiac index (Tei index, β=-0.65, P<.001). Free wall RVLS, RV-SD, and Tei index are useful for assessing the hemodynamic change under a mild RV pressure overload condition. © 2017, Wiley Periodicals, Inc.

  5. Assessment of soil contamination--a functional perspective.

    Science.gov (United States)

    van Straalen, Nico M

    2002-01-01

    In many industrialized countries the use of land is impeded by soil pollution from a variety of sources. Decisions on clean-up, management or set-aside of contaminated land are based on various considerations, including human health risks, but ecological arguments do not have a strong position in such assessments. This paper analyses why this should be so, and how ecotoxicology and theoretical ecology can improve the situation. It seems that soil assessment suffers from a fundamental weakness, which relates to the absence of a commonly accepted framework that may act as a reference. Soil contamination can be assessed both from a functional perspective and a structural perspective. The relationship between structure and function in ecosystems is a fundamental question of ecology which receives a lot of attention in the recent literature; however, a general concept that may guide ecotoxicological assessments has not yet arisen. On the experimental side, a good deal of progress has been made in the development and standardized use of terrestrial model ecosystems (TME). In such systems, usually consisting of intact soil columns incubated in the laboratory under conditions allowing plant growth and drainage of water, a compromise is sought between field relevance and experimental manageability. A great variety of measurements can be made on such systems, including microbiological processes and activities, but also activities of the decomposer soil fauna. I propose that these TMEs can be useful instruments in ecological soil quality assessments. In addition, a "bioinformatics approach" to the analysis of data obtained in TME experiments is proposed. Soil function should be considered as a multidimensional concept, and the various measurements can be considered as indicators whose combined values define the "normal operating range" of the system. Deviations from the normal operating range indicate that the system is in a condition of stress. It is hoped that more work

  6. Modeling risk assessment for nuclear processing plants with LAVA

    International Nuclear Information System (INIS)

    Smith, S.T.; Tisinger, R.M.

    1988-01-01

    Using the Los Alamos Vulnerability and Risk Assessment (LAVA) methodology, the authors developed a model for assessing risks associated with nuclear processing plants. LAVA is a three-part systematic approach to risk assessment. The first part is the mathematical methodology; the second is the general personal computer-based software engine; and the third is the application itself. The methodology provides a framework for creating applications for the software engine to operate upon; all application-specific information is data. Using LAVA, the authors build knowledge-based expert systems to assess risks in applications systems comprising a subject system and a safeguards system. The subject system model is sets of threats, assets, and undesirable outcomes. The safeguards system model is sets of safeguards functions for protecting the assets from the threats by preventing or ameliorating the undesirable outcomes, sets of safeguards subfunctions whose performance determine whether the function is adequate and complete, and sets of issues, appearing as interactive questionnaires, whose measures (in both monetary and linguistic terms) define both the weaknesses in the safeguards system and the potential costs of an undesirable outcome occurring

  7. Power probability density function control and performance assessment of a nuclear research reactor

    International Nuclear Information System (INIS)

    Abharian, Amir Esmaeili; Fadaei, Amir Hosein

    2014-01-01

    Highlights: • In this paper, the performance assessment of a static PDF control system is discussed. • The reactor PDF model is set up based on B-spline functions. • The neutronics (Nu) and thermal-hydraulics (Th-h) equations are solved concurrently by a reformed Hansen’s method. • A principle of performance assessment is put forward for the PDF of the NR control. - Abstract: One of the main issues in controlling a system is to keep track of the condition of the system function. The performance condition of the system should be inspected continuously to keep the system in a reliable working condition. In this study, the nuclear reactor is considered a complicated system, and a principle of performance assessment is used for analyzing the performance of the power probability density function (PDF) control of the nuclear research reactor. First, the model of the power PDF is set up; then the controller is designed to make the power PDF trace the given shape, which makes the reactor a closed-loop system. The operating data of the closed-loop reactor are used to assess the control performance with the performance assessment criteria. The modeling, controller design and performance assessment of the power PDF are all applied to the control of Tehran Research Reactor (TRR) power in a nuclear process. In this paper, the performance assessment of the static PDF control system is discussed, the efficacy and efficiency of the proposed method are investigated, and finally its reliability is proven.
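
    The B-spline representation mentioned in the highlights can be illustrated with a short, generic sketch: the output (power) probability density is approximated as a weighted sum of fixed B-spline basis functions, so that shaping the PDF reduces to manipulating a finite weight vector. The knots, weights and normalisation below are illustrative assumptions, not the TRR model.

        # Sketch: approximating an output probability density with B-spline basis
        # functions, the representation used in B-spline-based PDF-shaping control.
        # Knots and weights are illustrative, not the reactor model.
        import numpy as np
        from scipy.interpolate import BSpline

        degree = 3
        knots = np.concatenate(([0.0] * degree, np.linspace(0.0, 1.0, 8), [1.0] * degree))
        n_basis = len(knots) - degree - 1

        def pdf_from_weights(y, weights):
            # Weighted sum of B-spline basis elements, normalised to integrate to 1.
            basis = np.array([BSpline.basis_element(knots[i:i + degree + 2],
                                                    extrapolate=False)(y)
                              for i in range(n_basis)])
            basis = np.nan_to_num(basis)          # basis elements are zero outside support
            f = np.maximum(weights @ basis, 0.0)
            return f / np.trapz(f, y)

        y = np.linspace(0.0, 1.0, 401)            # normalised reactor power
        weights = np.array([0.0, 0.2, 0.8, 1.0, 0.7, 0.3, 0.1, 0.05, 0.0, 0.0])
        target_pdf = pdf_from_weights(y, weights[:n_basis])
        print("integral of the modelled PDF: %.3f" % np.trapz(target_pdf, y))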

  8. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
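
    As a small numerical illustration of the formalism, the sketch below chooses a discretized distribution on a bounded support that maximizes Shannon's informational entropy subject to a normalization constraint and an assumed mean value; the support, the target mean and the grid are hypothetical, not the ground water travel time data. With constraints on E[ln x] and E[ln(1 - x)] instead of the mean, the same procedure recovers a beta-type density like the one generated in the paper.

        # Sketch: maximum-entropy density on a bounded support, subject to
        # normalization and an assumed mean constraint.  Support and mean are
        # hypothetical, not the site-characterization data.
        import numpy as np
        from scipy.optimize import minimize

        x = np.linspace(0.0, 1.0, 61)            # normalised parameter support
        dx = x[1] - x[0]
        target_mean = 0.3                        # assumed constraint

        def density(w):
            p = np.exp(w)                        # positive by construction
            return p / (np.sum(p) * dx)          # normalised to a density

        def neg_entropy(w):
            p = density(w)
            return np.sum(p * np.log(p + 1e-300)) * dx

        def mean_constraint(w):
            return np.sum(x * density(w)) * dx - target_mean

        res = minimize(neg_entropy, x0=np.zeros_like(x), method="SLSQP",
                       constraints=[{"type": "eq", "fun": mean_constraint}],
                       options={"maxiter": 500})
        p = density(res.x)
        print("achieved mean: %.3f (maximum-entropy density here is a truncated exponential)"
              % (np.sum(x * p) * dx))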

  9. Assessing the reliability of predictive activity coefficient models for molecules consisting of several functional groups

    Directory of Open Access Journals (Sweden)

    R. P. Gerber

    2013-03-01

    Currently, the most successful predictive models for activity coefficients are those based on functional groups, such as UNIFAC. However, these models require a large amount of experimental data for the determination of their parameter matrix. A more recent alternative is the family of models based on COSMO, for which only a small set of universal parameters must be calibrated. In this work, a recalibrated COSMO-SAC model was compared with the UNIFAC (Do) model employing experimental infinite dilution activity coefficient data for 2236 non-hydrogen-bonding binary mixtures at different temperatures. As expected, UNIFAC (Do) presented better overall performance, with a mean absolute error of 0.12 ln-units against 0.22 for our COSMO-SAC implementation. However, in cases involving molecules with several functional groups, or when functional groups appear in an unusual way, the deviation for UNIFAC was 0.44 as opposed to 0.20 for COSMO-SAC. These results show that COSMO-SAC provides more reliable predictions for multi-functional or more complex molecules, reaffirming its future prospects.

  10. Modeling phytoplankton community in reservoirs. A comparison between taxonomic and functional groups-based models.

    Science.gov (United States)

    Di Maggio, Jimena; Fernández, Carolina; Parodi, Elisa R; Diaz, M Soledad; Estrada, Vanina

    2016-01-01

    In this paper we address the formulation of two mechanistic water quality models that differ in the way the phytoplankton community is described. We carry out parameter estimation subject to differential-algebraic constraints and validation for each model, and a comparison of model performance. The first approach aggregates phytoplankton species based on their phylogenetic characteristics (Taxonomic group model) and the second one, on their morpho-functional properties following Reynolds' classification (Functional group model). The latter approach takes into account tolerance and sensitivity to environmental conditions. The constrained parameter estimation problems are formulated within an equation-oriented framework, with a maximum likelihood objective function. The study site is Paso de las Piedras Reservoir (Argentina), which supplies drinking water to a population of 450,000. Numerical results show that phytoplankton morpho-functional groups more closely represent each species' growth requirements within the group. The performance of each model is quantitatively assessed by three diagnostic measures. Parameter estimation results for seasonal dynamics of the phytoplankton community and main biogeochemical variables for a one-year time horizon are presented and compared for both models, showing the enhanced performance of the functional group model. Finally, we explore increasing nutrient loading scenarios and predict their effect on phytoplankton dynamics throughout a one-year time horizon. Copyright © 2015 Elsevier Ltd. All rights reserved.
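
    The estimation setup can be sketched in a heavily simplified form: a small phytoplankton-nutrient model is integrated inside a negative log-likelihood that is minimized with respect to the growth parameters, which mimics a maximum likelihood objective subject to the model dynamics. The equations, parameter values and synthetic data below are illustrative only, not the Paso de las Piedras model.

        # Illustrative sketch of constrained parameter estimation: a two-state
        # phytoplankton/nutrient model is integrated inside a negative
        # log-likelihood that is minimised over the growth parameters.
        # Model and data are drastically simplified, not the published model.
        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import minimize

        def rhs(t, y, mu_max, k_s, loss):
            phyto, nutrient = y
            growth = mu_max * nutrient / (k_s + nutrient) * phyto   # Monod growth
            return [growth - loss * phyto, -growth]

        t_obs = np.linspace(0.0, 30.0, 16)                  # days
        true = (0.8, 0.5, 0.1)
        ref = solve_ivp(rhs, (0.0, 30.0), [0.1, 5.0], t_eval=t_obs, args=true)
        rng = np.random.default_rng(2)
        obs = ref.y[0] + rng.normal(0.0, 0.05, size=t_obs.size)   # "measured" biomass

        def neg_log_likelihood(theta, sigma=0.05):
            sim = solve_ivp(rhs, (0.0, 30.0), [0.1, 5.0], t_eval=t_obs, args=tuple(theta))
            if not sim.success or sim.y.shape[1] != t_obs.size:
                return 1e12
            return 0.5 * np.sum(((obs - sim.y[0]) / sigma) ** 2)

        fit = minimize(neg_log_likelihood, x0=[0.5, 1.0, 0.05], method="Nelder-Mead")
        print("estimated (mu_max, k_s, loss):", np.round(fit.x, 3))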

  11. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  12. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    Functional layout of the building, understood as a layout or set of the facility rooms (or groups of rooms) with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the object. Achieving optimal (from the occupants’ point of view) spatial arrangement is possible through activities that often go beyond the stage of architectural design. Adopted in the architectural design, most often during trial and error process or on the basis of previous experience (evidence-based design), functional layout is subject to continuous evaluation and dynamic changing since the beginning of its use. Such verification of the occupancy phase allows to plan future, possible transformations, as well as to develop model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing functional layout of buildings. In other words, if it is possible to develop an objective method of assessing functional layouts basing on a set of buildings’ parameters: technical, technological and functional ones and whether the method allows developing a set of tools enhancing the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property’s impact on environment), economic (by optimizing its cost) and social (through the implementation of high-performance work environment). Parameterization of size and functional connections of the facility become part of the analyses, as well as the element of model solutions. The “lean” approach means the process of analysis of the existing scheme and consequently - finding weak points as well as means for eliminating these

  13. A multi-state model for the reliability assessment of a distributed generation system via universal generating function

    International Nuclear Information System (INIS)

    Li, Yan-Fu; Zio, Enrico

    2012-01-01

    The current and future developments of electric power systems are pushing the boundaries of reliability assessment to consider distribution networks with renewable generators. Given the stochastic features of these elements, most modeling approaches rely on Monte Carlo simulation. The computational costs associated to the simulation approach force to treating mostly small-sized systems, i.e. with a limited number of lumped components of a given renewable technology (e.g. wind or solar, etc.) whose behavior is described by a binary state, working or failed. In this paper, we propose an analytical multi-state modeling approach for the reliability assessment of distributed generation (DG). The approach allows looking to a number of diverse energy generation technologies distributed on the system. Multiple states are used to describe the randomness in the generation units, due to the stochastic nature of the generation sources and of the mechanical degradation/failure behavior of the generation systems. The universal generating function (UGF) technique is used for the individual component multi-state modeling. A multiplication-type composition operator is introduced to combine the UGFs for the mechanical degradation and renewable generation source states into the UGF of the renewable generator power output. The overall multi-state DG system UGF is then constructed and classical reliability indices (e.g. loss of load expectation (LOLE), expected energy not supplied (EENS)) are computed from the DG system generation and load UGFs. An application of the model is shown on a DG system adapted from the IEEE 34 nodes distribution test feeder.
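
    The UGF bookkeeping described above can be sketched with a few lines of code: each u-function is stored as a mapping from performance level to probability, and a composition operator combines two u-functions by applying a pairwise function to every state pair (a product when combining a mechanical-degradation state with a renewable-source state, a sum when aggregating generator outputs). All state values and probabilities below are invented for illustration; they are not the IEEE 34-node case study.

        # Sketch of universal generating function (UGF) composition for a
        # multi-state renewable generation unit.  A UGF is stored as
        # {performance: probability}; composition applies a pairwise operator
        # to every state pair.  Values are illustrative.
        from collections import defaultdict
        from operator import add, mul

        def compose(u1, u2, op):
            out = defaultdict(float)
            for g1, p1 in u1.items():
                for g2, p2 in u2.items():
                    out[op(g1, g2)] += p1 * p2
            return dict(out)

        # Mechanical condition of a wind generator: available fraction of rated power.
        u_mech = {1.0: 0.90, 0.5: 0.07, 0.0: 0.03}
        # Wind resource state: available power in MW.
        u_wind = {2.0: 0.3, 1.0: 0.5, 0.0: 0.2}
        # Multiplication-type composition -> power-output UGF of the generator.
        u_gen = compose(u_mech, u_wind, mul)

        # Aggregate two identical generators (sum of outputs) and compute a simple
        # loss-of-load probability against a constant 2 MW load.
        u_system = compose(u_gen, u_gen, add)
        lolp = sum(p for g, p in u_system.items() if g < 2.0)
        print("system output UGF:", {round(g, 2): round(p, 4) for g, p in sorted(u_system.items())})
        print("LOLP for a constant 2 MW load: %.4f" % lolp)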

  14. Transfer Function Identification Using Orthogonal Fourier Transform Modeling Functions

    Science.gov (United States)

    Morelli, Eugene A.

    2013-01-01

    A method for transfer function identification, including both model structure determination and parameter estimation, was developed and demonstrated. The approach uses orthogonal modeling functions generated from frequency domain data obtained by Fourier transformation of time series data. The method was applied to simulation data to identify continuous-time transfer function models and unsteady aerodynamic models. Model fit error, estimated model parameters, and the associated uncertainties were used to show the effectiveness of the method for identifying accurate transfer function models from noisy data.
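
    A minimal frequency-domain sketch of the general idea (not the orthogonal modeling-function algorithm itself): simulate a known first-order system, estimate its frequency response from spectral estimates of the input and output records, and fit a continuous-time transfer function G(s) = b/(s + a) by linear least squares. The system, noise level and frequency band are assumptions chosen for illustration.

        # Sketch: transfer function identification from frequency-domain data.
        # A first-order system is simulated, its frequency response is estimated
        # from input/output spectra, and G(s) = b/(s + a) is fitted by linear
        # least squares.  This is a generic illustration, not the paper's method.
        import numpy as np
        from scipy import signal

        dt, n = 0.01, 8192
        rng = np.random.default_rng(3)
        u = rng.standard_normal(n)                      # broadband input
        a_true, b_true = 2.0, 3.0                       # G(s) = 3 / (s + 2)
        y = np.zeros(n)
        for k in range(1, n):                           # simple Euler simulation
            y[k] = y[k - 1] + dt * (-a_true * y[k - 1] + b_true * u[k - 1])
        y += 0.01 * rng.standard_normal(n)              # measurement noise

        freq, Puy = signal.csd(u, y, fs=1.0 / dt, nperseg=1024)
        _, Puu = signal.welch(u, fs=1.0 / dt, nperseg=1024)
        band = (freq >= 0.1) & (freq <= 5.0)            # well-excited band
        w = 2.0 * np.pi * freq[band]
        H = (Puy / Puu)[band]                           # empirical frequency response

        # G(jw) = b/(jw + a)  =>  jw*H = -a*H + b, which is linear in (a, b).
        A = np.column_stack([-H, np.ones_like(H)])
        rhs = 1j * w * H
        theta, *_ = np.linalg.lstsq(np.vstack([A.real, A.imag]),
                                    np.concatenate([rhs.real, rhs.imag]), rcond=None)
        print("estimated (a, b) = (%.2f, %.2f); true (2.00, 3.00)" % tuple(theta))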

  15. The family receiving home care: functional health pattern assessment.

    Science.gov (United States)

    Hooper, J I

    1996-01-01

    The winds of change in health care make assessment of the family more important than ever as a tool for health care providers seeking to assist the family move themselves toward high-level wellness. Limited medical care and imposed self-responsibility for health promotion and illness prevention, which are natural consequences of these changes, move the locus of control for health management back to the family. The family's teachings, modeling, and interactions are greater influences than ever on the health of the patient. Gordon's functional health patterns provide a holistic model for assessment of the family because assessment data are classified under 11 headings: health perception and health management, nutritional-metabolic, elimination, activity and exercise, sleep and rest, cognition and perception, self-perception and self-concept, roles and relationships, sexuality and reproduction, coping and stress tolerance, and values and beliefs. Questions posed under each of the health patterns can be varied to reflect the uniqueness of the individual family as well as to inquire about family strengths and weaknesses in all patterns. Data using this model provide a comprehensive base for including the family in designing a plan of care.

  16. Assessing the Quality of Persian Translation of Kite Runner based on House’s (2014) Functional Pragmatic Model

    Directory of Open Access Journals (Sweden)

    Fateme Kargarzadeh

    2017-03-01

    Translation quality assessment is at the heart of any theory of translation. It is used in academic or teaching contexts to judge translations, to discuss their merits and demerits, and to suggest solutions. However, literary translation needs more consideration in terms of quality and clarity, as it is a widely read form of translation. In this respect, the Persian literary translation of Kite Runner was investigated based on House’s (2014) functional pragmatic model of translation quality assessment. To this end, around 100 pages from the beginning of both the English and Persian versions of the novel were selected and compared. Using House’s model, the profile of the source text register was created and the genre was identified. The source text profile was then compared with the translation text profile. The comparison revealed minor mismatches in field, tenor, and mode, which were counted as overtly erroneous expressions, while the prevailing matches were taken as evidence of covert translation. The mismatches consisted of some mistranslations of tenses and the selection of inappropriate meanings for lexical items. Since informal and culture-specific terms were transferred thoroughly, the cultural filter was not applied and the translation was judged to be a covert one. The findings of the study have implications for translators, researchers and translator trainers.

  17. Assessing executive functions in preschoolers using Shape School Task

    Directory of Open Access Journals (Sweden)

    Marta Nieto

    2016-09-01

    Over the last two decades, there has been a growing interest in the study of the development of executive functions in preschool children due to their relationship with different cognitive, psychological, social and academic domains. Early detection of individual differences in executive functioning can have major implications for basic and applied research. Consequently, there is a key need for assessment tools adapted to preschool skills: Shape School has been shown to be a suitable task for this purpose. Our study uses Shape School as the main task to analyze the development of inhibition, task-switching and working memory in a sample of 304 preschoolers (age range 3.25-6.50 years). Additionally, we include cognitive tasks for the evaluation of verbal variables (vocabulary, word reasoning and short-term memory) and performance variables (picture completion and symbol search), so as to analyze their relationship with executive functions. Our results show age-associated improvements in executive functions and the cognitive variables assessed. Furthermore, correlation analyses reveal positive relationships between executive functions and the other cognitive variables. More specifically, using structural equation modeling and including direct and indirect effects of age, our results suggest that executive functions explain to a greater extent performance on verbal and performance tasks. These findings provide further information to support research that considers preschool age to be a crucial period for the development of executive functions and their relationship with other cognitive processes.

  18. Graft function assessment in mouse models of single- and dual- kidney transplantation.

    Science.gov (United States)

    Wang, Lei; Wang, Ximing; Jiang, Shan; Wei, Jin; Buggs, Jacentha; Fu, Liying; Zhang, Jie; Liu, Ruisheng

    2018-05-23

    Animal models of kidney transplantation (KTX) are widely used in studying the immune response of hosts to implanted grafts. Additionally, KTX can be used in generating kidney-specific knockout animal models by transplantation of kidneys from donors with global knockout of a gene to wild-type recipients, or vice versa. Dual kidney transplantation (DKT) provides a more physiological environment for recipients than single kidney transplantation (SKT). However, DKT in mice is rare due to technical challenges. In this study, we successfully performed DKT in mice and compared the hemodynamic response and graft function with SKT. The surgical time, complications and survival rate of DKT were not significantly different from SKT, where survival rates were above 85%. Mice with DKT showed less injury and quicker recovery with lower plasma creatinine (Pcr) and higher GFR than SKT mice (Pcr = 0.34 and 0.17 mg/dl in DKT vs. 0.50 and 0.36 mg/dl in SKT at 1 and 3 days, respectively; GFR = 215 and 131 µl/min for DKT and SKT, respectively). In addition, DKT exhibited better renal functional reserve and a better long-term outcome of renal graft function than SKT based on the response to acute volume expansion. In conclusion, we have successfully generated a mouse DKT model. The hemodynamic responses of DKT better mimic physiological situations with less kidney injury and better recovery than SKT because of reduced confounding factors such as single nephron hyperfiltration. We anticipate DKT in mice will provide an additional tool for evaluation of renal significance in physiology and disease.

  19. Imaging and assessment of placental function.

    LENUS (Irish Health Repository)

    Moran, Mary

    2011-09-01

    The placenta is the vital support organ for the developing fetus. This article reviews current ultrasound (US) methods of assessing placental function. The ability of ultrasound to detect placental pathology is discussed. Doppler technology to investigate the fetal, placental, and maternal circulations in both high-risk and uncomplicated pregnancies is discussed and the current literature on the value of three-dimensional power Doppler studies to assess placental volume and vascularization is also evaluated. The article highlights the need for further research into three-dimensional ultrasound and alternative methods of placental evaluation if progress is to be made in optimizing placental function assessment.

  20. Assessment of physiological noise modelling methods for functional imaging of the spinal cord.

    Science.gov (United States)

    Kong, Yazhuo; Jenkinson, Mark; Andersson, Jesper; Tracey, Irene; Brooks, Jonathan C W

    2012-04-02

    The spinal cord is the main pathway for information between the central and the peripheral nervous systems. Non-invasive functional MRI offers the possibility of studying spinal cord function and central sensitisation processes. However, imaging neural activity in the spinal cord is more difficult than in the brain. A significant challenge when dealing with such data is the influence of physiological noise (primarily cardiac and respiratory), and currently there is no standard approach to account for these effects. We have previously studied the various sources of physiological noise for spinal cord fMRI at 1.5T and proposed a physiological noise model (PNM) (Brooks et al., 2008). An alternative de-noising strategy, selective averaging filter (SAF), was proposed by Deckers et al. (2006). In this study we reviewed and implemented published physiological noise correction methods at higher field (3T) and aimed to find the optimal models for gradient-echo-based BOLD acquisitions. Two general techniques were compared: physiological noise model (PNM) and selective averaging filter (SAF), along with regressors designed to account for specific signal compartments and physiological processes: cerebrospinal fluid (CSF), motion correction (MC) parameters, heart rate (HR), respiration volume per time (RVT), and the associated cardiac and respiratory response functions. Functional responses were recorded from the cervical spinal cord of 18 healthy subjects in response to noxious thermal and non-noxious punctate stimulation. The various combinations of models and regressors were compared in three ways: the model fit residuals, regression model F-tests and the number of activated voxels. The PNM was found to outperform SAF in all three tests. Furthermore, inclusion of the CSF regressor was crucial as it explained a significant amount of signal variance in the cord and increased the number of active cord voxels. Whilst HR, RVT and MC explained additional signal (noise) variance

  1. Assessment of input function distortions on kinetic model parameters in simulated dynamic 82Rb PET perfusion studies

    International Nuclear Information System (INIS)

    Meyer, Carsten; Peligrad, Dragos-Nicolae; Weibrecht, Martin

    2007-01-01

    Cardiac 82Rb dynamic PET studies allow quantifying absolute myocardial perfusion by using tracer kinetic modeling. Here, the accurate measurement of the input function, i.e. the tracer concentration in blood plasma, is a major challenge. This measurement is deteriorated by inappropriate temporal sampling, spillover, etc. Such effects may influence the measured input peak value and the measured blood pool clearance. The aim of our study is to evaluate the effect of input function distortions on the myocardial perfusion as estimated by the model. To this end, we simulate noise-free myocardium time activity curves (TACs) with a two-compartment kinetic model. The input function to the model is a generic analytical function. Distortions of this function have been introduced by varying its parameters. Using the distorted input function, the compartment model has been fitted to the simulated myocardium TAC. This analysis has been performed for various sets of model parameters covering a physiologically relevant range. The evaluation shows that ±10% error in the input peak value can easily lead to ±10-25% error in the model parameter K1, which relates to myocardial perfusion. Variations in the input function tail are generally less relevant. We conclude that an accurate estimation especially of the plasma input peak is crucial for a reliable kinetic analysis and blood flow estimation.
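
    The simulation experiment can be sketched as follows: generate a noise-free myocardial TAC from a simple one-tissue (two-compartment) model driven by an analytical input function, rescale the input amplitude by ±10% as the simplest form of peak distortion, refit, and compare the recovered K1. The gamma-variate input shape and all rate constants are illustrative assumptions, not the generic function or parameter sets used in the study.

        # Sketch: effect of input-function peak distortion on the fitted uptake
        # parameter K1 in a one-tissue compartment model.  The gamma-variate input
        # and the rate constants are illustrative assumptions.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0.0, 300.0, 601)                 # seconds
        dt = t[1] - t[0]

        def input_function(t, peak=1.0, t0=20.0, alpha=3.0, beta=8.0):
            s = np.clip(t - t0, 0.0, None)
            cp = (s / beta) ** alpha * np.exp(-s / beta)
            return peak * cp / cp.max()

        def tissue_tac(t, k1, k2, cp):
            # Ct(t) = K1 * [exp(-k2 t) convolved with Cp(t)]
            kernel = np.exp(-k2 * t)
            return k1 * np.convolve(cp, kernel)[: t.size] * dt

        cp_true = input_function(t)
        tac = tissue_tac(t, 0.7, 0.12, cp_true)          # noise-free "measured" TAC

        for peak_scale in (1.0, 1.1, 0.9):               # 0 %, +10 %, -10 % input error
            cp_meas = input_function(t, peak=peak_scale)
            fit, _ = curve_fit(lambda tt, k1, k2: tissue_tac(tt, k1, k2, cp_meas),
                               t, tac, p0=[0.5, 0.1])
            print("input amplitude x%.1f -> fitted K1 = %.3f (true 0.700)"
                  % (peak_scale, fit[0]))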

  2. Modelling metal speciation in the Scheldt Estuary: Combining a flexible-resolution transport model with empirical functions

    Energy Technology Data Exchange (ETDEWEB)

    Elskens, Marc [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); Gourgue, Olivier [Université catholique de Louvain, Institute of Mechanics, Materials and Civil Engineering (IMMC), 4 Avenue G. Lemaître, bte L4.05.02, BE-1348 Louvain-la-Neuve (Belgium); Université catholique de Louvain, Georges Lemaître Centre for Earth and Climate Research (TECLIM), Place Louis Pasteur 2, bte L4.03.08, BE-1348 Louvain-la-Neuve (Belgium); Baeyens, Willy [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); Chou, Lei [Université Libre de Bruxelles, Biogéochimie et Modélisation du Système Terre (BGéoSys) —Océanographie Chimique et Géochimie des Eaux, Campus de la Plaine —CP 208, Boulevard du Triomphe, BE-1050 Brussels (Belgium); Deleersnijder, Eric [Université catholique de Louvain, Institute of Mechanics, Materials and Civil Engineering (IMMC), 4 Avenue G. Lemaître, bte L4.05.02, BE-1348 Louvain-la-Neuve (Belgium); Université catholique de Louvain, Earth and Life Institute (ELI), Georges Lemaître Centre for Earth and Climate Research (TECLIM), Place Louis Pasteur 2, bte L4.03.08, BE-1348 Louvain-la-Neuve (Belgium); Leermakers, Martine [Vrije Universiteit Brussel, Analytical, Pleinlaan 2, BE-1050 Brussels (Belgium); and others

    2014-04-01

    Predicting metal concentrations in surface waters is an important step in the understanding and ultimately the assessment of the ecological risk associated with metal contamination. In terms of risk an essential piece of information is the accurate knowledge of the partitioning of the metals between the dissolved and particulate phases, as the former species are generally regarded as the most bioavailable and thus harmful form. As a first step towards the understanding and prediction of metal speciation in the Scheldt Estuary (Belgium, the Netherlands), we carried out a detailed analysis of a historical dataset covering the period 1982–2011. This study reports on the results for two selected metals: Cu and Cd. Data analysis revealed that both the total metal concentration and the metal partitioning coefficient (Kd) could be predicted using relatively simple empirical functions of environmental variables such as salinity and suspended particulate matter concentration (SPM). The validity of these functions has been assessed by their application to salinity and SPM fields simulated by the hydro-environmental model SLIM. The high-resolution total and dissolved metal concentrations reconstructed using this approach, compared surprisingly well with an independent set of validation measurements. These first results from the combined mechanistic-empirical model approach suggest that it may be an interesting tool for risk assessment studies, e.g. to help identify conditions associated with elevated (dissolved) metal concentrations. - Highlights: • Empirical functions were designed for assessing metal speciation in estuarine water. • The empirical functions were implemented in the hydro-environmental model SLIM. • Validation was carried out in the Scheldt Estuary using historical data 1982–2011. • This combined mechanistic-empirical approach is useful for risk assessment.
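
    The empirical layer of such a combined approach can be sketched as follows: a regression-type expression returns the partition coefficient Kd from salinity and SPM, and the dissolved fraction then follows from the standard two-phase partitioning relation f_diss = 1/(1 + Kd·SPM). The coefficient values below are placeholders, not the calibrated Scheldt functions for Cu or Cd.

        # Sketch of the empirical speciation layer: estimate Kd from salinity and
        # suspended particulate matter (SPM), then split a total metal concentration
        # into dissolved and particulate fractions.  The regression coefficients are
        # placeholders, not the calibrated Scheldt Estuary functions.
        import numpy as np

        def log10_kd(salinity, spm_mg_per_l, a=5.0, b=-0.02, c=-0.3):
            # Hypothetical empirical form: log10 Kd [L/kg] = a + b*S + c*log10(SPM)
            return a + b * salinity + c * np.log10(spm_mg_per_l)

        def dissolved_fraction(salinity, spm_mg_per_l):
            kd = 10.0 ** log10_kd(salinity, spm_mg_per_l)   # L/kg
            spm_kg_per_l = spm_mg_per_l * 1e-6
            return 1.0 / (1.0 + kd * spm_kg_per_l)

        # Example along a salinity/SPM gradient (as would be provided by the SLIM
        # fields) for a hypothetical total metal concentration of 20 nM.
        salinity = np.array([0.5, 5.0, 15.0, 30.0])
        spm = np.array([150.0, 80.0, 40.0, 10.0])           # mg/L
        f_d = dissolved_fraction(salinity, spm)
        total = 20.0                                         # nM
        print("dissolved fraction:", np.round(f_d, 2))
        print("dissolved (nM):   ", np.round(total * f_d, 1))
        print("particulate (nM): ", np.round(total * (1.0 - f_d), 1))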

  3. Depression in Schizophrenia: Associations With Cognition, Functional Capacity, Everyday Functioning, and Self-Assessment.

    Science.gov (United States)

    Harvey, Philip D; Twamley, Elizabeth W; Pinkham, Amy E; Depp, Colin A; Patterson, Thomas L

    2017-05-01

    Depressed mood has a complex relationship with self-evaluation of personal competence in multiple populations. The absence of depression may be associated with overestimation of abilities, while mild depression seems to lead to accurate self-assessment. Significant depression may lead to underestimation of functioning. In this study, we expand on our previous work by directly comparing the association between different levels of depression, everyday functioning, cognitive and functional capacity performance, and self-assessment of everyday functioning in a large (n = 406) sample of outpatients with schizophrenia. Participants with very low self-reported depression overestimated their everyday functioning compared with informant reports. Higher levels of depression were associated with more accurate self-assessment, but no subgroup of patients underestimated their functioning. Depressive symptom severity was associated with poorer informant-rated social functioning, but there were no differences in vocational functioning, everyday activities, cognitive performance, and functional capacity associated with the severity of self-reported depression. There was minimal evidence of impact of depression on most aspects of everyday functioning and objective test performance and a substantial relationship between depression and accuracy of self-assessment. © The Author 2016. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  4. A model for assessing human cognitive reliability in PRA studies

    International Nuclear Information System (INIS)

    Hannaman, G.W.; Spurgin, A.J.; Lukic, Y.

    1985-01-01

    This paper summarizes the status of a research project sponsored by EPRI as part of the Probabilistic Risk Assessment (PRA) technology improvement program and conducted by NUS Corporation to develop a model of Human Cognitive Reliability (HCR). The model was synthesized from features identified in a review of existing models. The model development was based on the hypothesis that the key factors affecting crew response times are separable. The inputs to the model consist of key parameters the values of which can be determined by PRA analysts for each accident situation being assessed. The output is a set of curves which represent the probability of control room crew non-response as a function of time for different conditions affecting their performance. The non-response probability is then a contributor to the overall non-success of operating crews to achieve a functional objective identified in the PRA study. Simulator data and some small scale tests were utilized to illustrate the calibration of interim HCR model coefficients for different types of cognitive processing since the data were sparse. The model can potentially help PRA analysts make human reliability assessments more explicit. The model incorporates concepts from psychological models of human cognitive behavior, information from current collections of human reliability data sources and crew response time data from simulator training exercises
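
    The model's output, the probability of control room crew non-response as a function of time, is commonly represented in the HCR correlation as a Weibull-type curve in time normalized by the median crew response time, with different coefficient sets for skill-, rule- and knowledge-based cognitive processing. The sketch below uses that general form with illustrative coefficients; the calibrated values belong to the EPRI documentation and are not reproduced here.

        # Sketch of an HCR-style non-response curve: the probability that the crew
        # has not yet responded, as a function of time normalised by the median
        # response time.  The coefficients are illustrative, not calibrated values.
        import numpy as np

        def non_response_probability(t, t_median, gamma, eta, beta):
            x = np.asarray(t, dtype=float) / t_median
            z = np.clip((x - gamma) / eta, 0.0, None)
            return np.exp(-z ** beta)

        t = np.array([30.0, 60.0, 120.0, 300.0, 600.0])      # seconds after the cue
        t_median = 120.0                                      # median crew response time (s)
        illustrative = {"skill-based": (0.7, 0.4, 1.2),
                        "rule-based": (0.6, 0.6, 1.1),
                        "knowledge-based": (0.5, 0.8, 1.0)}
        for kind, (gamma, eta, beta) in illustrative.items():
            p = non_response_probability(t, t_median, gamma, eta, beta)
            print(kind.ljust(16), np.round(p, 3))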

  5. A Comprehensive Assessment Model for Critical Infrastructure Protection

    Directory of Open Access Journals (Sweden)

    Häyhtiö Markus

    2017-12-01

    International business demands seamless service and IT infrastructure throughout the entire supply chain. However, dependencies between different parts of this vulnerable ecosystem form a fragile web. Assessment of the financial effects of any abnormality in any part of the network is required in order to protect the network in a financially viable way. The contractual environment between the actors in a supply chain, and the different business domains and functions involved, requires a management model that enables network-wide protection of critical infrastructure. In this paper the authors introduce such a model. It can be used to assess the financial differences between centralized and decentralized protection of critical infrastructure. As an end result of this assessment, business resilience to unknown threats can be improved across the entire supply chain.

  6. Mathematical modeling and visualization of functional neuroimages

    DEFF Research Database (Denmark)

    Rasmussen, Peter Mondrup

    This dissertation presents research results regarding mathematical modeling in the context of the analysis of functional neuroimages. Specifically, the research focuses on pattern-based analysis methods that recently have become popular within the neuroimaging community. Such methods attempt...... sets are characterized by relatively few data observations in a high dimensional space. The process of building models in such data sets often requires strong regularization. Often, the degree of model regularization is chosen in order to maximize prediction accuracy. We focus on the relative influence...... be carefully selected, so that the model and its visualization enhance our ability to interpret the brain. The second part concerns interpretation of nonlinear models and procedures for extraction of ‘brain maps’ from nonlinear kernel models. We assess the performance of the sensitivity map as means...

  7. Nonparametric Transfer Function Models

    Science.gov (United States)

    Liu, Jun M.; Chen, Rong; Yao, Qiwei

    2009-01-01

    In this paper a class of nonparametric transfer function models is proposed to model nonlinear relationships between ‘input’ and ‘output’ time series. The transfer function is smooth with unknown functional forms, and the noise is assumed to be a stationary autoregressive-moving average (ARMA) process. The nonparametric transfer function is estimated jointly with the ARMA parameters. By modeling the correlation in the noise, the transfer function can be estimated more efficiently. The parsimonious ARMA structure improves the estimation efficiency in finite samples. The asymptotic properties of the estimators are investigated. The finite-sample properties are illustrated through simulations and one empirical example. PMID:20628584

  8. Utility of Social Modeling for Proliferation Assessment - Enhancing a Facility-Level Model for Proliferation Resistance Assessment of a Nuclear Enegry System

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Brothers, Alan J.; Gastelum, Zoe N.; Olson, Jarrod; Thompson, Sandra E.

    2009-10-26

    based nuclear facility assessment; 5. a discussion of a way to engage with the owners of the PR assessment methodology to assess and improve the enhancement concept; 6. a discussion of implementation of the proposed approach, including a discussion of functionality and potential users; and 7. conclusions from the research. This report represents technical deliverables for the NA-22 Simulations, Algorithms, and Modeling program. Specifically this report is the Task 2 and 3 deliverables for project PL09-UtilSocial.

  9. Nonlinear System Identification via Basis Functions Based Time Domain Volterra Model

    Directory of Open Access Journals (Sweden)

    Yazid Edwar

    2014-07-01

    This paper proposes a basis-function-based time domain Volterra model for nonlinear system identification. The Volterra kernels are expanded using complex exponential basis functions and estimated via a genetic algorithm (GA). The accuracy and practicability of the proposed method are then assessed experimentally on a 1:100 scaled model of a prototype truss spar platform. Identification results in the time and frequency domains are presented, and coherence functions are computed to check the quality of the identification results. It is shown that the experimental data and the results of the proposed method are in good agreement.

  10. Developing the algorithm for assessing the competitive abilities of functional foods in marketing

    Directory of Open Access Journals (Sweden)

    Nilova Liudmila

    2017-01-01

    A thorough analysis of the competitive factors of functional foods has made it possible to develop an algorithm for assessing the competitive abilities of functional food products with respect to their essential consumer features: quality, safety and functionality. Questionnaires filled in by experts and published results of consumer surveys from different countries were used to select the essential consumer features of functional foods. A "desirability of consumer features" model triangle, based on functional bread and bakery products, was constructed using the Harrington desirability function.
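
    The Harrington function referred to above maps each suitably coded consumer feature onto a 0-1 desirability scale via d = exp(-exp(-y')), and the individual desirabilities are conventionally aggregated with a geometric mean. The sketch below applies that standard form to invented coded scores for the three features; the paper's actual coding and expert scores are not reproduced.

        # Sketch: Harrington desirability applied to coded scores for the three
        # consumer features (quality, safety, functionality).  Coded values are
        # invented; the paper's expert scores and coding are not reproduced.
        import math

        def harrington(y_coded):
            # One-sided Harrington desirability: d = exp(-exp(-y')).
            return math.exp(-math.exp(-y_coded))

        def overall_desirability(coded_scores):
            # Geometric mean of the individual desirabilities.
            ds = [harrington(y) for y in coded_scores]
            return math.prod(ds) ** (1.0 / len(ds)), ds

        coded = {"quality": 1.8, "safety": 2.5, "functionality": 0.9}   # hypothetical
        overall, individual = overall_desirability(list(coded.values()))
        for name, d in zip(coded, individual):
            print("%-13s d = %.3f" % (name, d))
        print("overall desirability (competitive-ability proxy) = %.3f" % overall)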

  11. Safety assessment of automated vehicle functions by simulation-based fault injection

    OpenAIRE

    Juez, Garazi; Amparan, Estibaliz; Lattarulo, Ray; Rastelli, Joshue Perez; Ruiz, Alejandra; Espinoza, Huascar

    2017-01-01

    As automated driving vehicles become more sophisticated and pervasive, it is increasingly important to assure its safety even in the presence of faults. This paper presents a simulation-based fault injection approach (Sabotage) aimed at assessing the safety of automated vehicle functions. In particular, we focus on a case study to forecast fault effects during the model-based design of a lateral control function. The goal is to determine the acceptable fault detection interval for pe...

  12. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcomes through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  13. A systemic approach for modeling soil functions

    Science.gov (United States)

    Vogel, Hans-Jörg; Bartke, Stephan; Daedlow, Katrin; Helming, Katharina; Kögel-Knabner, Ingrid; Lang, Birgit; Rabot, Eva; Russell, David; Stößel, Bastian; Weller, Ulrich; Wiesmeier, Martin; Wollschläger, Ute

    2018-03-01

    The central importance of soil for the functioning of terrestrial systems is increasingly recognized. Critically relevant for water quality, climate control, nutrient cycling and biodiversity, soil provides more functions than just the basis for agricultural production. Nowadays, soil is increasingly under pressure as a limited resource for the production of food, energy and raw materials. This has led to an increasing demand for concepts assessing soil functions so that they can be adequately considered in decision-making aimed at sustainable soil management. The various soil science disciplines have progressively developed highly sophisticated methods to explore the multitude of physical, chemical and biological processes in soil. It is not obvious, however, how the steadily improving insight into soil processes may contribute to the evaluation of soil functions. Here, we present a new systemic modeling framework that allows reductionist yet observable indicators of soil functions to be consistently coupled with detailed process understanding. It is based on the mechanistic relationships between soil functional attributes, each explained by a network of interacting processes as derived from scientific evidence. The non-linear character of these interactions produces stability and resilience of soil with respect to functional characteristics. We anticipate that this new conceptual framework will integrate the various soil science disciplines and help identify important future research questions at the interface between disciplines. It allows the overwhelming complexity of soil systems to be adequately coped with and paves the way for steadily improving our capability to assess soil functions based on scientific understanding.

  14. Assessment of left atrial volume and function

    DEFF Research Database (Denmark)

    Kühl, J Tobias; Lønborg, Jacob; Fuchs, Andreas

    2012-01-01

    dynamic LA volume changes. Conversely, cardiac magnetic resonance imaging (CMR) and multi-slice computed tomography (MSCT) appear more appropriate for such measures. We sought to determine the relationship between LA size assessed with TTE and LA size and function assessed with CMR and MSCT. Fifty-four patients were examined 3 months post myocardial infarction with echocardiography, CMR and MSCT. Left atrial volumes and LA reservoir function were assessed by TTE. LA time-volume curves were determined and LA reservoir function (cyclic change and fractional change), passive emptying function (reservoir...... between CMR and MSCT, with a small to moderate bias in LA(max) (4.9 ± 10.4 ml), CC (3.1 ± 9.1 ml) and reservoir volume (3.4 ± 9.1 ml). TTE underestimates LA(max) by up to 32% compared with CMR and MSCT (P ...

  15. Results of the use of a kinetic model of radiohippuran transport in the human body for quantitative assessment of summary and isolated renal function

    International Nuclear Information System (INIS)

    Ryabov, S.I.; Degtereva, O.A.; Klemina, I.K.; Degterev, B.V.; Senchik, R.V.

    1986-01-01

    A method is presented for interpreting the commonly used determinations of blood clearance and radionephrography with 131I-hippuran, based on a mathematical model of hippuran transport in the human body. Empirical values of the model parameters were obtained in 120 patients with chronic glomerulo- and pyelonephritides verified morphologically and roentgenologically. The use of computational-interpretation algorithms made it possible to determine the volume of circulating plasma (blood) and the values of true summary and isolated effective renal plasma flow (blood flow) by means of a single i.v. hippuran administration. New indicators for the assessment of isolated excretory-transport function and renal hemodynamics, as well as indicators of the symmetry of renal function, were proposed. The results of a statistical analysis made it possible to recommend some of them as criteria for early diagnosis of preuremic disorder of renal function. Radionuclide indicators of renal function showed good correlation with biochemical, morphological and roentgenological characteristics of renal damage in renal …

  16. Homeostasis model assessment of insulin resistance in relation to the poor functional outcomes in nondiabetic patients with ischemic stroke

    Science.gov (United States)

    Li, Siou; Yin, Changhao; Zhao, Weina; Zhu, Haifu; Xu, Dan; Xu, Qing; Jiao, Yang; Wang, Xue; Qiao, Hong

    2018-01-01

    Whether insulin resistance (IR) predicts worse functional outcome in ischemic stroke is still a matter of debate. The aim of the present study is to determine the association between IR and risk of poor outcome in 173 Chinese nondiabetic patients with acute ischemic stroke. This is a prospective, population-based cohort study. Insulin sensitivity was expressed by the homeostasis model assessment (HOMA) index, where HOMA index = (fasting insulin × fasting glucose)/22.5. IR was defined by a HOMA-IR index in the top quartile (Q4). Functional impairment was evaluated at discharge using the modified Rankin scale (mRS). The median (interquartile range) HOMA-IR was 2.14 (1.17–2.83), and Q4 was at least 2.83. There was a significantly positive correlation between HOMA-IR and National Institutes of Health Stroke Scale score (r = 0.408). Patients in the IR group had a higher risk of poor functional outcome (odds ratio (OR) = 3.23; 95% confidence interval (CI) = 1.75–5.08; P=0.001). In multivariate models comparing the third and fourth quartiles against the first quartile of the HOMA-IR, levels of HOMA-IR were associated with poor outcome, and the adjusted risk of poor outcome increased by 207% (OR = 3.05 (95% CI 1.70–4.89), P=0.006) and 429% (OR = 5.29 (95% CI 3.05–9.80)), respectively. Adding HOMA-IR to clinical examination variables improved outcome prediction (P=0.02). A high HOMA-IR index is associated with a poor functional outcome in nondiabetic patients with acute ischemic stroke. PMID:29588341
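
    A minimal sketch of the HOMA-IR computation quoted in this abstract, together with the top-quartile rule used to label insulin resistance; the cohort values, units and variable names below are illustrative placeholders, not data from the study.

```python
# Sketch: HOMA index = (fasting insulin x fasting glucose) / 22.5,
# with insulin resistance defined as the top quartile (Q4) of the cohort.
import numpy as np

def homa_ir(fasting_insulin, fasting_glucose):
    """Return the HOMA index for paired insulin/glucose measurements."""
    return (np.asarray(fasting_insulin) * np.asarray(fasting_glucose)) / 22.5

# Illustrative cohort values (assumed units: insulin in microU/mL, glucose in mmol/L).
insulin = np.array([8.2, 12.5, 6.1, 15.3, 9.8])
glucose = np.array([5.1, 5.6, 4.8, 6.0, 5.3])

scores = homa_ir(insulin, glucose)
q4_cutoff = np.quantile(scores, 0.75)      # top-quartile threshold
is_resistant = scores >= q4_cutoff         # flag insulin-resistant patients
print(scores.round(2), round(float(q4_cutoff), 2), is_resistant)
```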

  17. Preschoolers’ Technology-Assessed Physical Activity and Cognitive Function: A Cross-Sectional Study

    Directory of Open Access Journals (Sweden)

    Minghui Quan

    2018-05-01

    Early childhood is a critical period for the development of cognitive function, but research on the association between physical activity and cognitive function in preschool children is limited and inconclusive. This study aimed to examine the association between technology-assessed physical activity and cognitive function in preschool children. A cross-sectional analysis of baseline data from the Physical Activity and Cognitive Development Study was conducted in Shanghai, China. Physical activity was measured with accelerometers for 7 consecutive days, and cognitive functions were assessed using the Chinese version of the Wechsler Young Children Scale of Intelligence (C-WYCSI). Linear regression analyses were used to assess the association between physical activity and cognitive function. A total of 260 preschool children (boys, 144; girls, 116; mean age: 57.2 ± 5.4 months) were included in the analyses for this study. After adjusting for confounding factors, we found that Verbal Intelligence Quotient, Performance Intelligence Quotient, and Full Intelligence Quotient were significantly correlated with light physical activity, not moderate to vigorous physical activity, in boys. Standardized coefficients were 0.211, 0.218, and 0.242 (all p < 0.05) in three different models, respectively. However, the correlation between physical activity and cognitive function was not significant in girls (p > 0.05). These findings suggest that cognitive function is apparently associated with light physical activity in boys. Further studies are required to clarify the sex-specific effect of physical activity on cognitive function.

  18. Plasma exogenous creatinine excretion for the assessment of renal function in avian medicine--pharmacokinetic modeling in racing pigeons (Columba livia).

    Science.gov (United States)

    Scope, Alexandra; Schwendenwein, Ilse; Schauberger, Günther

    2013-09-01

    The diagnostic evaluation of the glomerular filtration rate by urinary clearance has significant practical limitations in birds because urine is excreted together with feces. Thus, pharmacokinetic modeling of an exogenous plasma creatinine clearance could be useful for assessing renal creatinine excretion in birds. For this study, creatinine (50 mg/kg) was administered to 2 groups of 15 pigeons (Columba livia) each; in one group by the intravenous (IV) route and in the second by the intramuscular (IM) route. The time series of the plasma creatinine concentrations were analyzed by pharmacokinetic models. Body mass-specific creatinine excretion was determined to be 6.30 and 6.44 mL/min per kg for IV and IM administration, respectively. Body surface area-specific creatinine clearance, which is related to the metabolic rate, was calculated as 0.506 and 0.523 mL/min per dm2, respectively. The results showed that IV as well as IM administration can be used for assessing renal creatinine excretion in pigeons. For practical reasons, IM administration is recommended, with the use of the Bateman function to calculate creatinine elimination.
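
    The abstract recommends the Bateman function for calculating creatinine elimination after intramuscular dosing. The sketch below shows a generic Bateman (first-order absorption and elimination) curve; the dose, volume of distribution and rate constants are made-up placeholders, not the pigeon parameters estimated in the paper.

```python
# Sketch: Bateman function C(t) = (D*ka)/(V*(ka-ke)) * (exp(-ke*t) - exp(-ka*t)).
import numpy as np

def bateman(t, dose, vd, ka, ke):
    """Plasma concentration after an extravascular dose (first-order kinetics)."""
    return (dose * ka) / (vd * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.0, 300.0, 61)  # minutes after administration
# Illustrative parameters: dose in mg/kg, vd in L/kg, rate constants in 1/min.
conc = bateman(t, dose=50.0, vd=0.3, ka=0.10, ke=0.02)
print(round(float(conc.max()), 2), float(t[conc.argmax()]))  # peak level and time of peak
```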

  19. Formal safety assessment based on relative risks model in ship navigation

    Energy Technology Data Exchange (ETDEWEB)

    Hu Shenping [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: sphu@mmc.shmtu.edu.cn; Fang Quangen [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: qgfang@mmc.shmtu.edu.cn; Xia Haibo [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: hbxia@mmc.shmtu.edu.cn; Xi Yongtao [Merchant Marine College, Shanghai Maritime University, 1550, Pudong Dadao, Shanghai 200135 (China)]. E-mail: xiyt@mmc.shmtu.edu.cn

    2007-03-15

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has gradually come into broad use in the shipping industry around the world. On the basis of an analysis of the FSA approach, this paper discusses quantitative risk assessment and the generic risk model in FSA, especially frequency and severity criteria in ship navigation. It then puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China, which demonstrates that MRRA is a useful method for solving practical problems in the risk assessment of ship navigation safety.

  20. Formal safety assessment based on relative risks model in ship navigation

    International Nuclear Information System (INIS)

    Hu Shenping; Fang Quangen; Xia Haibo; Xi Yongtao

    2007-01-01

    Formal safety assessment (FSA) is a structured and systematic methodology aiming at enhancing maritime safety. It has gradually come into broad use in the shipping industry around the world. On the basis of an analysis of the FSA approach, this paper discusses quantitative risk assessment and the generic risk model in FSA, especially frequency and severity criteria in ship navigation. It then puts forward a new model based on relative risk assessment (MRRA). The model presents a risk-assessment approach based on fuzzy functions and takes five factors into account, including detailed information about accident characteristics. It has already been used for the assessment of pilotage safety in Shanghai harbor, China, which demonstrates that MRRA is a useful method for solving practical problems in the risk assessment of ship navigation safety.

  1. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of trust case development.

  2. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of trust case development.

  3. PRACTICAL APPLICATION OF A MODEL FOR ASSESSING

    Directory of Open Access Journals (Sweden)

    Petr NOVOTNÝ

    2015-12-01

    Rail transport is an important sub-sector of transport infrastructure. Disruption of its operation due to emergencies can result in a reduction in the functional parameters of provided services, with consequent impacts on society. Identification of the critical elements of this system enables its timely and effective protection. On this basis, the article presents a draft model for assessing the criticality of railway infrastructure elements. This model uses a systems approach and a multicriteria semi-quantitative analysis with weighted criteria for calculating the criticality of individual elements of the railway infrastructure. Finally, it presents a practical application of the proposed model, including a discussion of the results.

  4. Assessing the Utility of a Demand Assessment for Functional Analysis

    Science.gov (United States)

    Roscoe, Eileen M.; Rooker, Griffin W.; Pence, Sacha T.; Longworth, Lynlea J.

    2009-01-01

    We evaluated the utility of an assessment for identifying tasks for the functional analysis demand condition with 4 individuals who had been diagnosed with autism. During the demand assessment, a therapist presented a variety of tasks, and observers measured problem behavior and compliance to identify demands associated with low levels of…

  5. A new assessment model and tool for pediatric nurse practitioners.

    Science.gov (United States)

    Burns, C

    1992-01-01

    This article presents a comprehensive assessment model for pediatric nurse practitioner (PNP) practice that integrates familiar elements of the classical medical history, Gordon's Functional Health Patterns, and developmental fields into one system. This model drives the diagnostic reasoning process toward consideration of a broad range of disease, daily living (nursing diagnosis), and developmental diagnoses, which represents PNP practice better than the medical model does.

  6. Physiologically assessed hot flashes and endothelial function among midlife women.

    Science.gov (United States)

    Thurston, Rebecca C; Chang, Yuefang; Barinas-Mitchell, Emma; Jennings, J Richard; von Känel, Roland; Landsittel, Doug P; Matthews, Karen A

    2017-08-01

    Hot flashes are experienced by most midlife women. Emerging data indicate that they may be associated with endothelial dysfunction. No studies have tested whether hot flashes are associated with endothelial function using physiologic measures of hot flashes. We tested whether physiologically assessed hot flashes were associated with poorer endothelial function. We also considered whether age modified associations. Two hundred seventy-two nonsmoking women reporting either daily hot flashes or no hot flashes, aged 40 to 60 years, and free of clinical cardiovascular disease, underwent ambulatory physiologic hot flash and diary hot flash monitoring; a blood draw; and ultrasound measurement of brachial artery flow-mediated dilation to assess endothelial function. Associations between hot flashes and flow-mediated dilation were tested in linear regression models controlling for lumen diameter, demographics, cardiovascular disease risk factors, and estradiol. In multivariable models incorporating cardiovascular disease risk factors, significant interactions by age were observed: among the younger women, hot flashes (beta [standard error] = -2.07 [0.79], P = 0.01) and more frequent physiologic hot flashes (for each hot flash: beta [standard error] = -0.10 [0.05], P = 0.03, multivariable) were associated with lower flow-mediated dilation. Associations were not accounted for by estradiol. Associations were not observed among the older women (age 54-60 years) or for self-reported hot flash frequency, severity, or bother. Among the younger women, hot flashes explained more variance in flow-mediated dilation than standard cardiovascular disease risk factors or estradiol. Among younger midlife women, frequent hot flashes were associated with poorer endothelial function and may provide information about women's vascular status beyond cardiovascular disease risk factors and estradiol.

  7. Assessing item fit for unidimensional item response theory models using residuals from estimated item response functions.

    Science.gov (United States)

    Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee

    2013-07-01

    Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large sample distribution of the residual is proved to be standardized normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
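
    As a rough illustration of the residual-based item-fit checking discussed in this abstract, the sketch below groups simulated examinees by estimated ability and compares observed proportions correct with model-implied 2PL probabilities using the classic standardized residual. The paper's proposed ratio-estimate residual is not reproduced, and all item and ability values are invented.

```python
# Sketch: classic standardized item-fit residuals for one 2PL item.
import numpy as np

def p_2pl(theta, a, b):
    """2PL item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

rng = np.random.default_rng(0)
theta_hat = rng.normal(size=2000)                    # estimated abilities (simulated)
a_hat, b_hat = 1.2, 0.3                              # estimated item parameters (illustrative)
responses = rng.binomial(1, p_2pl(theta_hat, a_hat, b_hat))

bins = np.quantile(theta_hat, np.linspace(0, 1, 11)) # 10 ability groups
bins[-1] = np.inf                                    # include the maximum in the last bin
for lo, hi in zip(bins[:-1], bins[1:]):
    idx = (theta_hat >= lo) & (theta_hat < hi)
    n = idx.sum()
    obs = responses[idx].mean()                      # observed proportion correct
    exp = p_2pl(theta_hat[idx], a_hat, b_hat).mean() # model-implied proportion
    z = (obs - exp) / np.sqrt(exp * (1 - exp) / n)   # approx. N(0,1) if the model fits
    print(f"group [{lo:6.2f}, {hi:6.2f})  n={n:4d}  z={z:5.2f}")
```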

  8. Questionnaire-based assessment of executive functioning: Case studies.

    Science.gov (United States)

    Kronenberger, William G; Castellanos, Irina; Pisoni, David B

    2018-01-01

    Delays in the development of executive functioning skills are frequently observed in pediatric neuropsychology populations and can have a broad and significant impact on quality of life. As a result, assessment of executive functioning is often relevant for the development of formulations and recommendations in pediatric neuropsychology clinical work. Questionnaire-based measures of executive functioning behaviors in everyday life have unique advantages and complement traditional neuropsychological measures of executive functioning. Two case studies of children with spina bifida are presented to illustrate the clinical use of a new questionnaire measure of executive and learning-related functioning, the Learning, Executive, and Attention Functioning Scale (LEAF). The LEAF emphasizes clinical utility in assessment by incorporating four characteristics: brevity in administration, breadth of additional relevant content, efficiency of scoring and interpretation, and ease of availability for use. LEAF results were consistent with another executive functioning checklist in documenting everyday behavior problems related to working memory, planning, and organization while offering additional breadth of assessment of domains such as attention, processing speed, and novel problem-solving. These case study results demonstrate the clinical utility of questionnaire-based measurement of executive functioning in pediatric neuropsychology and provide a new measure for accomplishing this goal.

  9. Functional dynamic factor models with application to yield curve forecasting

    KAUST Repository

    Hays, Spencer

    2012-09-01

    Accurate forecasting of zero coupon bond yields for a continuum of maturities is paramount to bond portfolio management and derivative security pricing. Yet a universal model for yield curve forecasting has been elusive, and prior attempts often resulted in a trade-off between goodness of fit and consistency with economic theory. To address this, herein we propose a novel formulation which connects the dynamic factor model (DFM) framework with concepts from functional data analysis: a DFM with functional factor loading curves. This results in a model capable of forecasting functional time series. Further, in the yield curve context we show that the model retains economic interpretation. Model estimation is achieved through an expectation-maximization algorithm, where the time series parameters and factor loading curves are simultaneously estimated in a single step. Efficient computing is implemented and a data-driven smoothing parameter is nicely incorporated. We show that our model performs very well on forecasting actual yield data compared with existing approaches, especially in regard to profit-based assessment for an innovative trading exercise. We further illustrate the viability of our model to applications outside of yield forecasting.

  10. The assessment of executive functioning in children

    OpenAIRE

    Henry, L.; Bettenay, C.

    2010-01-01

    Background: Executive functioning is increasingly seen as incorporating several component sub-skills, and clinical assessments should reflect this complexity. Method: Tools for assessing executive functioning in children are reviewed within five key areas, across verbal and visuospatial abilities, with emphasis on batteries of tests. Results: There are many appropriate tests for children, although the choice is more limited for those under the age of 8 years. Conclusion: …

  11. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.; Katzfuss, M.; Hu, J.; Johnson, V. E.

    2014-01-01

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.
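
    A minimal sketch of the pivotal-discrepancy idea described in this abstract, under simplifying assumptions: for each posterior draw of the covariance parameters, the quadratic form (y - mu)' Sigma^{-1} (y - mu) is referred to a chi-square distribution. The observation locations, the exponential covariance, the synthetic data and the stand-in "posterior" draws below are purely illustrative and do not reproduce the authors' test.

```python
# Sketch: pivotal discrepancy check for a zero-mean Gaussian-process model.
import numpy as np
from scipy import stats
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
locs = rng.uniform(0.0, 10.0, size=(60, 2))                   # observation locations
d = cdist(locs, locs)
y = rng.multivariate_normal(np.zeros(60), np.exp(-d / 2.0))   # synthetic realization

posterior_ranges = rng.normal(2.0, 0.1, size=200)             # stand-in posterior draws
pit = []
for r in posterior_ranges:
    cov = np.exp(-d / r)                                      # covariance at this draw
    q = y @ np.linalg.solve(cov, y)                           # pivotal quadratic form
    pit.append(stats.chi2.cdf(q, df=len(y)))                  # ~Uniform(0,1) under adequacy

print(np.round(np.quantile(pit, [0.1, 0.5, 0.9]), 2))         # crude check of uniformity
```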

  12. Assessing fit in Bayesian models for spatial processes

    KAUST Repository

    Jun, M.

    2014-09-16

    © 2014 John Wiley & Sons, Ltd. Gaussian random fields are frequently used to model spatial and spatial-temporal data, particularly in geostatistical settings. As much of the attention of the statistics community has been focused on defining and estimating the mean and covariance functions of these processes, little effort has been devoted to developing goodness-of-fit tests to allow users to assess the models' adequacy. We describe a general goodness-of-fit test and related graphical diagnostics for assessing the fit of Bayesian Gaussian process models using pivotal discrepancy measures. Our method is applicable for both regularly and irregularly spaced observation locations on planar and spherical domains. The essential idea behind our method is to evaluate pivotal quantities defined for a realization of a Gaussian random field at parameter values drawn from the posterior distribution. Because the nominal distribution of the resulting pivotal discrepancy measures is known, it is possible to quantitatively assess model fit directly from the output of Markov chain Monte Carlo algorithms used to sample from the posterior distribution on the parameter space. We illustrate our method in a simulation study and in two applications.

  13. Measuring everyday functional competence using the Rasch assessment of everyday activity limitations (REAL) item bank.

    Science.gov (United States)

    Oude Voshaar, Martijn A H; Ten Klooster, Peter M; Vonkeman, Harald E; van de Laar, Mart A F J

    2017-11-01

    Traditional patient-reported physical function instruments often poorly differentiate patients with mild-to-moderate disability. We describe the development and psychometric evaluation of a generic item bank for measuring everyday activity limitations in outpatient populations. Seventy-two items generated from patient interviews and mapped to the International Classification of Functioning, Disability and Health (ICF) domestic life chapter were administered to 1128 adults representative of the Dutch population. The partial credit model was fitted to the item responses and evaluated with respect to its assumptions, model fit, and differential item functioning (DIF). Measurement performance of a computerized adaptive testing (CAT) algorithm was compared with the SF-36 physical functioning scale (PF-10). A final bank of 41 items was developed. All items demonstrated acceptable fit to the partial credit model and measurement invariance across age, sex, and educational level. Five- and ten-item CAT simulations were shown to have high measurement precision, which exceeded that of SF-36 physical functioning scale across the physical function continuum. Floor effects were absent for a 10-item empirical CAT simulation, and ceiling effects were low (13.5%) compared with SF-36 physical functioning (38.1%). CAT also discriminated better than SF-36 physical functioning between age groups, number of chronic conditions, and respondents with or without rheumatic conditions. The Rasch assessment of everyday activity limitations (REAL) item bank will hopefully prove a useful instrument for assessing everyday activity limitations. T-scores obtained using derived measures can be used to benchmark physical function outcomes against the general Dutch adult population.
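
    To illustrate the computerized adaptive testing (CAT) step mentioned in this abstract, the sketch below selects items by maximum Fisher information under a dichotomous Rasch model. The REAL bank itself uses the partial credit model, so this is a simplified stand-in; the item difficulties, the fixed responses and the crude ability update are invented placeholders, not an operational estimator.

```python
# Sketch: maximum-information item selection for a short CAT (dichotomous Rasch model).
import numpy as np

def rasch_p(theta, b):
    """Probability of endorsing an item with difficulty b at ability theta."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

def item_information(theta, b):
    p = rasch_p(theta, b)
    return p * (1.0 - p)                          # Fisher information for a Rasch item

difficulties = np.linspace(-2.0, 2.0, 41)         # illustrative 41-item bank
administered, theta = [], 0.0                     # start the CAT at average ability
for step in range(5):                             # 5-item CAT simulation
    info = item_information(theta, difficulties)
    info[administered] = -np.inf                  # never re-administer an item
    nxt = int(np.argmax(info))
    administered.append(nxt)
    response = 1                                  # stand-in response; normally observed
    # Crude illustrative update step toward the response (not a real MLE/EAP update).
    theta += response - rasch_p(theta, difficulties[nxt])

print(administered, round(theta, 2))
```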

  14. Assessing Functional Vision Using Microcomputers.

    Science.gov (United States)

    Spencer, Simon; Ross, Malcolm

    1989-01-01

    The paper describes a software system which uses microcomputers to aid in the assessment of functional vision in visually impaired students. The software also aims to be visually stimulating and to develop hand-eye coordination, visual memory, and cognitive abilities. (DB)

  15. Functional Behavior Assessment in Schools: Current Status and Future Directions

    Science.gov (United States)

    Anderson, Cynthia M.; Rodriguez, Billie Jo; Campbell, Amy

    2015-01-01

    Functional behavior assessment is becoming a commonly used practice in school settings. Accompanying this growth has been an increase in research on functional behavior assessment. We reviewed the extant literature on documenting indirect and direct methods of functional behavior assessment in school settings. To discern best practice guidelines…

  16. Assessment of right atrial function analysis

    International Nuclear Information System (INIS)

    Shohgase, Takashi; Miyamoto, Atsushi; Kanamori, Katsushi; Kobayashi, Takeshi; Yasuda, Hisakazu

    1988-01-01

    To assess the potential utility of right atrial function analysis in cardiac disease, reservoir function, pump function, and right atrial peak emptying rate (RAPER) were compared in 10 normal subjects, 32 patients with coronary artery disease, and 4 patients with primary pulmonary hypertension. Right atrial volume curves were obtained using a cardiac radionuclide method with Kr-81m. In normal subjects, the reservoir function index was 0.41 ± 0.05 and the pump function index was 0.25 ± 0.05. Both types of patients had decreased reservoir function and increased pump function. Pump function tended to decrease with an increase of right ventricular end-diastolic pressure. RAPER correlated well with right ventricular peak filling rate, probably reflecting right ventricular diastolic function. Analysis of right atrial function seemed to be of value in evaluating factors regulating right ventricular contraction and diastolic function, and cardiac output. (Namekawa, K)

  17. A comparison of long-term functional outcome after 2 middle cerebral artery occlusion models in rats.

    Science.gov (United States)

    Roof, R L; Schielke, G P; Ren, X; Hall, E D

    2001-11-01

    Proven behavioral assessment strategies for testing potential therapeutic agents in rat stroke models are needed. Few studies include tasks that demand higher levels of sensorimotor and cognitive function. Because behavioral outcome and rate of recovery vary among ischemia models, there is a need to characterize and compare performance on specific tasks across models. To this end, sensorimotor and cognitive deficits were assessed during a 5-week period after either permanent proximal middle cerebral artery occlusion (pMCAO) or permanent distal middle cerebral artery occlusion combined with a 90-minute occlusion of both common carotid arteries (dMCAO/tCCAO) in Sprague-Dawley rats. The EBST, hindlimb and forelimb placing, and cylinder tests were given at regular intervals postinjury to assess sensorimotor function. Cognitive function was assessed with a multitrial water navigation task. pMCAO, which caused both striatal and cortical damage, produced persistent sensorimotor and cognitive deficits. Limb placing responses and postural reflexes were impaired throughout the month of testing. A persistent bias for using the ipsilateral forelimb for wall movements in the cylinder test was observed, as well as a bias for landing on the opposite forelimb. pMCAO rats were also impaired in the water navigation task. dMCAO/tCCAO, which caused only cortical damage, produced similar sensorimotor deficits, but these were greatly diminished by 2 weeks after injury. No impairment was found for water tank navigation. Correlations between forelimb placing (both models), water navigation performance (pMCAO model), and sensorimotor asymmetry (dMCAO/tCCAO model) and infarct volume were observed. Based on the range of functions affected and the stability of observed deficits, the pMCAO model appears to be preferable to the dMCAO/tCCAO model for use in assessing therapeutic agents for stroke.

  18. Testing the efficacy of downscaling in species distribution modelling: a comparison between MaxEnt and Favourability Function models

    Energy Technology Data Exchange (ETDEWEB)

    Olivero, J.; Toxopeus, A.G.; Skidmore, A.K.; Real, R.

    2016-07-01

    Statistical downscaling is used to improve the knowledge of spatial distributions from broad–scale to fine–scale maps with higher potential for conservation planning. We assessed the effectiveness of downscaling in two commonly used species distribution models: Maximum Entropy (MaxEnt) and the Favourability Function (FF). We used atlas data (10 x 10 km) of the fire salamander Salamandra salamandra distribution in southern Spain to derive models at a 1 x 1 km resolution. Downscaled models were assessed using an independent dataset of the species’ distribution at 1 x 1 km. The Favourability model showed better downscaling performance than the MaxEnt model, and the models that were based on linear combinations of environmental variables performed better than models allowing higher flexibility. The Favourability model minimized model overfitting compared to the MaxEnt model. (Author)
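
    A brief sketch of the Favourability Function that the study compares against MaxEnt, in the form usually attributed to Real and colleagues: logistic-regression probabilities are corrected for sample prevalence so that values become comparable across datasets. The probabilities and presence/absence counts below are illustrative, not the Salamandra salamandra data.

```python
# Sketch: convert modelled occurrence probabilities to favourability values,
# F = (P/(1-P)) / (n1/n0 + P/(1-P)), with n1 presences and n0 absences.
import numpy as np

def favourability(p, n_presences, n_absences):
    """Prevalence-corrected favourability from logistic probabilities."""
    odds = p / (1.0 - p)
    return odds / (n_presences / n_absences + odds)

p = np.array([0.10, 0.35, 0.50, 0.80])     # illustrative modelled probabilities
print(favourability(p, n_presences=120, n_absences=480).round(3))
```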

  19. Utility of Social Modeling for Proliferation Assessment - Preliminary Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coles, Garill A.; Gastelum, Zoe N.; Brothers, Alan J.; Thompson, Sandra E.

    2009-06-01

    This Preliminary Assessment draft report will present the results of a literature search and preliminary assessment of the body of research, analysis methods, models and data deemed to be relevant to the Utility of Social Modeling for Proliferation Assessment research. This report will provide: 1) a description of the problem space and the kinds of information pertinent to the problem space, 2) a discussion of key relevant or representative literature, 3) a discussion of models and modeling approaches judged to be potentially useful to the research, and 4) the next steps of this research that will be pursued based on this preliminary assessment. This draft report represents a technical deliverable for the NA-22 Simulations, Algorithms, and Modeling (SAM) program. Specifically this draft report is the Task 1 deliverable for project PL09-UtilSocial-PD06, Utility of Social Modeling for Proliferation Assessment. This project investigates non-traditional use of social and cultural information to improve nuclear proliferation assessment, including nonproliferation assessment, proliferation resistance assessments, safeguards assessments and other related studies. These assessments often use and create technical information about the State’s posture towards proliferation, the vulnerability of a nuclear energy system to an undesired event, and the effectiveness of safeguards. This project will find and fuse social and technical information by explicitly considering the role of cultural, social and behavioral factors relevant to proliferation. The aim of this research is to describe and demonstrate if and how social science modeling has utility in proliferation assessment.

  20. Equivalent magnetic vector potential model for low-frequency magnetic exposure assessment

    Science.gov (United States)

    Diao, Y. L.; Sun, W. N.; He, Y. Q.; Leung, S. W.; Siu, Y. M.

    2017-10-01

    In this paper, a novel source model based on a magnetic vector potential for the assessment of induced electric field strength in a human body exposed to the low-frequency (LF) magnetic field of an electrical appliance is presented. The construction of the vector potential model requires only a single-component magnetic field to be measured close to the appliance under test, hence relieving considerable practical measurement effort. Radial basis functions (RBFs) are adopted for the interpolation of the discrete measurements; the magnetic vector potential model can then be directly constructed by summing a set of simple algebraic functions of the RBF parameters. The vector potentials are then incorporated into numerical calculations as the equivalent source for evaluation of the induced electric field in the human body model. The accuracy and effectiveness of the proposed model are demonstrated by comparing the induced electric field in a human model to that of the full-wave simulation. This study presents a simple and effective approach for modelling the LF magnetic source. The result of this study could simplify the compliance test procedure for assessing an electrical appliance regarding LF magnetic exposure.
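
    A minimal sketch of the RBF-interpolation step described here, assuming SciPy's RBFInterpolator as a stand-in implementation: discrete single-component field measurements near an appliance are interpolated to arbitrary query positions. The measurement grid and the dipole-like field values are synthetic, and the subsequent construction of the vector potential from the RBF parameters is not reproduced.

```python
# Sketch: interpolate scattered single-component magnetic-field samples with RBFs.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
points = rng.uniform(-0.2, 0.2, size=(80, 3))             # measurement positions (m), synthetic
bz = 1e-6 / (0.05 + np.sum(points**2, axis=1)) ** 1.5     # synthetic Bz samples (T)

interp = RBFInterpolator(points, bz, kernel="thin_plate_spline")
query = np.array([[0.00, 0.00, 0.10],
                  [0.05, 0.00, 0.15]])                    # positions where the field is wanted
print(interp(query))                                      # interpolated Bz values
```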

  1. A model for the two-point velocity correlation function in turbulent channel flow

    International Nuclear Information System (INIS)

    Sahay, A.; Sreenivasan, K.R.

    1996-01-01

    A relatively simple analytical expression is presented to approximate the equal-time, two-point, double-velocity correlation function in turbulent channel flow. To assess the accuracy of the model, we perform the spectral decomposition of the integral operator having the model correlation function as its kernel. Comparisons of the empirical eigenvalues and eigenfunctions with those constructed from direct numerical simulations data show good agreement. copyright 1996 American Institute of Physics
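
    To illustrate the assessment step described in this abstract, the sketch below discretizes a two-point correlation kernel on a wall-normal grid and computes the eigenvalues of the resulting integral operator (a basic proper orthogonal decomposition). The exponential kernel is a generic placeholder, not the authors' channel-flow expression.

```python
# Sketch: spectral decomposition of an integral operator whose kernel is a
# model two-point correlation function R(y, y') discretized on a grid.
import numpy as np

n = 200
y = np.linspace(-1.0, 1.0, n)                        # wall-normal coordinate
dy = y[1] - y[0]
R = np.exp(-np.abs(y[:, None] - y[None, :]) / 0.3)   # stand-in correlation kernel

eigvals, eigvecs = np.linalg.eigh(R * dy)            # quadrature-weighted operator
order = np.argsort(eigvals)[::-1]                    # sort modes by energy
print(eigvals[order][:5].round(3))                   # leading POD eigenvalues
```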

  2. A universal Model-R Coupler to facilitate the use of R functions for model calibration and analysis

    Science.gov (United States)

    Wu, Yiping; Liu, Shuguang; Yan, Wende

    2014-01-01

    Mathematical models are useful in various fields of science and engineering. However, it is a challenge to make a model utilize the open and growing collection of functions (e.g., for model inversion) on the R platform, because doing so requires accessing and revising the model's source code. To overcome this barrier, we developed a universal tool that aims to convert a model developed in any computer language to an R function, using the template and instruction concept of the Parameter ESTimation program (PEST) and the operational structure of the R-Soil and Water Assessment Tool (R-SWAT). The developed tool (Model-R Coupler) is promising because users of any model can connect an external algorithm (written in R) with their model to implement various model behavior analyses (e.g., parameter optimization, sensitivity and uncertainty analysis, performance evaluation, and visualization) without accessing or modifying the model's source code.

  3. Assessment of functional vision and its rehabilitation.

    Science.gov (United States)

    Colenbrander, August

    2010-03-01

    This article, based on a report prepared for the International Council of Ophthalmology (ICO) and the International Society for Low Vision Research and Rehabilitation (ISLRR), explores the assessment of various aspects of visual functioning as needed to document the outcomes of vision rehabilitation. Documenting patient abilities and functional vision (how the person functions) is distinct from the measurement of visual functions (how the eye functions) and also from the assessment of quality of life. All three areas are important, but their assessment should not be mixed. Observation of task performance offers the most objective measure of functional vision, but it is time-consuming and not feasible for many tasks. Where possible, timing and error rates provide an easy score. Patient response questionnaires provide an alternative. They may save time and can cover a wider area, but the responses are subjective and proper scoring presents problems. Simple Likert scoring still predominates but Rasch analysis, needed to provide better result scales, is gaining ground. Selection of questions is another problem. If the range of difficulties does not match the range of patient abilities, and if the difficulties are not distributed evenly, the results are not optimal. This may be an argument to use different outcome questions for different conditions. Generic questionnaires are appropriate for the assessment of generic quality of life, but not for specific rehabilitation outcomes. Different questionnaires are also needed for screening, intake and outcomes. Intake questions must be relevant to actual needs to allow prioritization of rehabilitation goals; the activity inventory presents a prototype. Outcome questions should be targeted at predefined rehabilitation goals. The Appendix cites some promising examples. The Low Vision Intervention Trial (LOVIT) is an example of a properly designed randomized control study, and has demonstrated the remarkable effectiveness of

  4. Improving Flood Damage Assessment Models in Italy

    Science.gov (United States)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage over land use. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the first is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values up to a factor of 4.5 for tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than a half of the amount predicted by the standard SDC methods.
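
    A minimal sketch of how a stage-damage curve (SDC) converts water depth to asset damage for one land-use class, applied to a few flooded cells; the curve points, depths and exposed values are invented for illustration and are not the calibrated values from this study.

```python
# Sketch: piecewise-linear stage-damage curve applied to gridded flood depths.
import numpy as np

depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])        # water depth (m)
damage_frac = np.array([0.0, 0.15, 0.35, 0.65, 0.90])  # fraction of asset value damaged

def sdc_damage(depth, exposed_value):
    """Return absolute damage per cell from depth and exposed asset value."""
    frac = np.interp(depth, depth_pts, damage_frac)    # interpolate along the SDC
    return frac * exposed_value

depths = np.array([0.2, 0.8, 1.5, 3.0])                # flooded cells (m), illustrative
values = np.array([1.0e6, 2.5e6, 0.8e6, 1.2e6])        # exposed value per cell (EUR)
print(sdc_damage(depths, values).round(0))
```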

  5. Causal Agency Theory: Reconceptualizing a Functional Model of Self-Determination

    Science.gov (United States)

    Shogren, Karrie A.; Wehmeyer, Michael L.; Palmer, Susan B.; Forber-Pratt, Anjali J.; Little, Todd J.; Lopez, Shane

    2015-01-01

    This paper introduces Causal Agency Theory, an extension of the functional model of self-determination. Causal Agency Theory addresses the need for interventions and assessments pertaining to self-determination for all students and incorporates the significant advances in understanding of disability and in the field of positive psychology since the…

  6. Measurement Properties of Indirect Assessment Methods for Functional Behavioral Assessment: A Review of Research

    Science.gov (United States)

    Floyd, Randy G.; Phaneuf, Robin L.; Wilczynski, Susan M.

    2005-01-01

    Indirect assessment instruments used during functional behavioral assessment, such as rating scales, interviews, and self-report instruments, represent the least intrusive techniques for acquiring information about the function of problem behavior. This article provides criteria for examining the measurement properties of these instruments…

  7. Construction and pilot assessment of the Lower Limb Function Assessment Scale.

    Science.gov (United States)

    Allart, Etienne; Paquereau, Julie; Rogeau, Caroline; Daveluy, Walter; Kozlowski, Odile; Rousseaux, Marc

    2014-01-01

    Stroke often leads to upright standing and walking impairments. Clinical assessments do not sufficiently address ecological aspects or the patient's subjective evaluation of function. The objective was to perform a pilot assessment of the psychometric properties of the Lower Limb-Function Assessment Scale (LL-FAS). The LL-FAS includes 30 items assessing the patient's perception (in a questionnaire) and the examiner's perception (in a practical test) of upright standing and walking impairments and their impact on activities of daily living. We analyzed the LL-FAS's reliability, construct validity, internal consistency, predictive validity and feasibility. Thirty-five stroke patients were included. The scale's mean ± SD completion time was 25 ± 6 min. Intra-observer reliability was good to excellent (intraclass correlation coefficients (ICCs) > 0.82). Interobserver reliability was moderate (ICCs from 0.67). Internal consistency (> 0.9) and predictive validity were excellent. The LL-FAS showed fair psychometric properties in this pilot study and may be of value for evaluating post-stroke lower limb impairment.

  8. Structural equation modeling of motor impairment, gross motor function, and the functional outcome in children with cerebral palsy.

    Science.gov (United States)

    Park, Eun-Young; Kim, Won-Ho

    2013-05-01

    Physical therapy intervention for children with cerebral palsy (CP) is focused on reducing neurological impairments, improving strength, and preventing the development of secondary impairments in order to improve functional outcomes. However, the relationship between motor impairments and functional outcome has not been proved definitively. This study confirmed the construct of motor impairment and performed structural equation modeling (SEM) between motor impairment, gross motor function, and functional outcomes regarding activities of daily living in children with CP. 98 children (59 boys, 39 girls) with CP participated in this cross-sectional study. Mean age was 11 y 5 mo (SD 1 y 9 mo). The Manual Muscle Test (MMT), the Modified Ashworth Scale (MAS), range of motion (ROM) measurement, and the selective motor control (SMC) scale were used to assess motor impairments. Gross motor function and functional outcomes were measured using the Gross Motor Function Measure (GMFM) and the Functional Skills domain of the Pediatric Evaluation of Disability Inventory (PEDI), respectively. The measurement of motor impairment consisted of strength, spasticity, ROM, and SMC. The construct of motor impairment was confirmed through an examination of a measurement model. The proposed SEM model showed good fit indices. Motor impairment affected gross motor function (β=-.0869). Gross motor function and motor impairment affected functional outcomes directly (β=0.890) and indirectly (β=-0.773), respectively. We confirmed that the construct of motor impairment consists of strength, spasticity, ROM, and SMC, and it was identified through measurement model analysis. Functional outcomes are best predicted by gross motor function, and motor impairments have indirect effects on functional outcomes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. A factor analysis of Functional Independence and Functional Assessment Measure scores among focal and diffuse brain injury patients: The importance of bi-factor models.

    Science.gov (United States)

    Gunn, Sarah; Burgess, Gerald H; Maltby, John

    2018-04-28

    To explore the factor structure of the UK Functional Independence Measure and Functional Assessment Measure (FIM+FAM) among focal and diffuse acquired brain injury patients. Criterion standard. An NHS acute acquired brain injury inpatient rehabilitation hospital. Referred sample of 447 adults (835 cases after exclusions) admitted for inpatient treatment following an acquired brain injury significant enough to justify intensive inpatient neurorehabilitation. Not applicable. Functional Independence Measure and Functional Assessment Measure. Exploratory Factor Analysis suggested a two-factor structure to FIM+FAM scores, among both focal-proximate and diffuse-proximate acquired brain injury aetiologies. Confirmatory Factor Analysis suggested that a three-factor bi-factor structure presented the best fit of the FIM+FAM score data across both aetiologies. However, across both analyses, a convergence was found towards a general factor, demonstrated by high correlations between factors in the Exploratory Factor Analysis, and by a general factor explaining the majority of the variance in scores in the Confirmatory Factor Analysis. Our findings suggest that although factors describing specific functional domains can be derived from FIM+FAM item scores, there is a convergence towards a single factor describing overall functioning. This single factor informs the specific group factors (e.g. motor, psychosocial and communication function) following brain injury. Further research into the comparative value of the general and group factors as evaluative/prognostic measures is indicated. Copyright © 2018. Published by Elsevier Inc.

  10. Testing the efficacy of downscaling in species distribution modelling: a comparison between MaxEnt and Favourability Function models

    Directory of Open Access Journals (Sweden)

    Olivero, J.

    2016-03-01

    Statistical downscaling is used to improve the knowledge of spatial distributions from broad-scale to fine-scale maps with higher potential for conservation planning. We assessed the effectiveness of downscaling in two commonly used species distribution models: Maximum Entropy (MaxEnt) and the Favourability Function (FF). We used atlas data (10 x 10 km) of the fire salamander Salamandra salamandra distribution in southern Spain to derive models at a 1 x 1 km resolution. Downscaled models were assessed using an independent dataset of the species' distribution at 1 x 1 km. The Favourability model showed better downscaling performance than the MaxEnt model, and the models that were based on linear combinations of environmental variables performed better than models allowing higher flexibility. The Favourability model minimized model overfitting compared to the MaxEnt model.

  11. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    Science.gov (United States)

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluation of a damage function or multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  12. Structure functions from chiral soliton models

    International Nuclear Information System (INIS)

    Weigel, H.; Reinhardt, H.; Gamberg, L.

    1997-01-01

    We study nucleon structure functions within the bosonized Nambu-Jona-Lasinio (NJL) model, where the nucleon emerges as a chiral soliton. We discuss the model predictions for the Gottfried sum rule for electron-nucleon scattering. A comparison with a low-scale parametrization shows that the model reproduces the gross features of the empirical structure functions. We also compute the leading twist contributions to the polarized structure functions g1 and g2 in this model. We compare the model predictions for these structure functions with data from the E143 experiment by GLAP-evolving them from the scale characteristic of the NJL model to the scale of the data.

  13. Ecological models and pesticide risk assessment: current modeling practice.

    Science.gov (United States)

    Schmolke, Amelie; Thorbek, Pernille; Chapman, Peter; Grimm, Volker

    2010-04-01

    Ecological risk assessments of pesticides usually focus on risk at the level of individuals, and are carried out by comparing exposure and toxicological endpoints. However, in most cases the protection goal is populations rather than individuals. On the population level, effects of pesticides depend not only on exposure and toxicity, but also on factors such as life history characteristics, population structure, timing of application, presence of refuges in time and space, and landscape structure. Ecological models can integrate such factors and have the potential to become important tools for the prediction of population-level effects of exposure to pesticides, thus allowing extrapolations, for example, from laboratory to field. Indeed, a broad range of ecological models have been applied to chemical risk assessment in the scientific literature, but so far such models have only rarely been used to support regulatory risk assessments of pesticides. To better understand the reasons for this situation, the current modeling practice in this field was assessed in the present study. The scientific literature was searched for relevant models and assessed according to nine characteristics: model type, model complexity, toxicity measure, exposure pattern, other factors, taxonomic group, risk assessment endpoint, parameterization, and model evaluation. The present study found that, although most models were of a high scientific standard, many of them would need modification before they are suitable for regulatory risk assessments. The main shortcomings of currently available models in the context of regulatory pesticide risk assessments were identified. When ecological models are applied to regulatory risk assessments, we recommend reviewing these models according to the nine characteristics evaluated here. (c) 2010 SETAC.

  14. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible

  15. Diagnosing time scales of flux tower-model agreement as a function of environmental regime

    Science.gov (United States)

    Brunsell, N. A.; Barlage, M. J.; Monaghan, A. J.

    2013-12-01

    Understanding the extent of agreement between land surface models and observations can provide insight into theoretical advancements in our understanding of land-atmosphere interactions. In particular, understanding the conditions under which models perform particularly well or poorly is essential for identifying potential model limitations. Here, we use three eddy covariance towers over different land cover to assess the agreement with the Noah and Noah-MP ("Multi-Physics") land surface models as a function of environmental variables. The data spans 2007-2012 and encompasses both normal and drought conditions. The environmental regimes are isolated using self-organizing maps (SOMs) to diagnose the relative importance of factors (soil moisture, air temperature, humidity, solar radiation, wind-speed, etc.) on the resulting water and carbon fluxes. The temporal variability of model limitations is assessed with an information theory based wavelet technique within each environmental regime. Discussion will focus on the role of predicting potential model biases as a function of environmental condition.

  16. Validating the JobFit system functional assessment method

    Energy Technology Data Exchange (ETDEWEB)

    Jenny Legge; Robin Burgess-Limerick

    2007-05-15

    Workplace injuries are costing the Australian coal mining industry and its communities $410 Million a year. This ACARP study aims to meet those demands by developing a safe, reliable and valid pre-employment functional assessment tool. All JobFit System Pre-Employment Functional Assessments (PEFAs) consist of a musculoskeletal screen, balance test, aerobic fitness test and job-specific postural tolerances and material handling tasks. The results of each component are compared to the applicant's job demands and an overall PEFA score between 1 and 4 is given with 1 being the better score. The reliability study and validity study were conducted concurrently. The reliability study examined test-retest, intra-tester and inter-tester reliability of the JobFit System Functional Assessment Method. Overall, good to excellent reliability was found, which was sufficient to be used for comparison with injury data for determining the validity of the assessment. The overall assessment score and material handling tasks had the greatest reliability. The validity study compared the assessment results of 336 records from a Queensland underground and open cut coal mine with their injury records. A predictive relationship was found between PEFA score and the risk of a back/trunk/shoulder injury from manual handling. An association was also found between PEFA score of 1 and increased length of employment. Lower aerobic fitness test results had an inverse relationship with injury rates. The study found that underground workers, regardless of PEFA score, were more likely to have an injury when compared to other departments. No relationship was found between age and risk of injury. These results confirm the validity of the JobFit System Functional Assessment method.

  17. Network analysis of mesoscale optical recordings to assess regional, functional connectivity.

    Science.gov (United States)

    Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H

    2015-10-01

    With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
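
    The authors provide their own analysis script linked from the paper; the sketch below is not that script but a generic illustration of regional functional network analysis, assuming a region-by-region correlation matrix has already been derived from the imaging data. The threshold, region count and use of networkx are assumptions for illustration.

```python
import numpy as np
import networkx as nx

def build_functional_network(corr, threshold=0.6):
    """Threshold a region-by-region correlation matrix into an undirected graph."""
    n = corr.shape[0]
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] >= threshold:
                g.add_edge(i, j, weight=corr[i, j])
    return g

# Illustrative symmetric correlation matrix for 10 cortical regions.
rng = np.random.default_rng(0)
a = rng.uniform(0, 1, size=(10, 10))
corr = (a + a.T) / 2
np.fill_diagonal(corr, 1.0)

g = build_functional_network(corr)
# Typical regional network properties used to compare preparations or disease models.
properties = {
    "degree": dict(g.degree()),
    "clustering": nx.clustering(g),
    "betweenness": nx.betweenness_centrality(g),
}
```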

  18. Feasibility study for remote assessment of cognitive function in multiple sclerosis.

    Science.gov (United States)

    George, Michaela F; Holingue, Calliope B; Briggs, Farren B S; Shao, Xiaorong; Bellesis, Kalliope H; Whitmer, Rachel A; Schaefer, Catherine; Benedict, Ralph Hb; Barcellos, Lisa F

    2016-01-01

    Cognitive impairment is common in multiple sclerosis (MS), and affects employment and quality of life. Large studies are needed to identify risk factors for cognitive decline. Currently, an MS-validated remote assessment for cognitive function does not exist. Studies to determine the feasibility of large remote cognitive function investigations in MS have not been published. The aim was to determine whether MS patients would participate in remote cognitive studies. We utilized the Modified Telephone Interview for Cognitive Status (TICS-M), a previously validated phone assessment for cognitive function in healthy elderly populations, to detect mild cognitive impairment. We identified factors that influenced participation rates. We investigated the relationship between MS risk factors and TICS-M score in cases, and score differences between cases and control individuals. The TICS-M was administered to MS cases and controls. Linear and logistic regression models were utilized. 11.5% of eligible study participants did not participate in cognitive testing. MS cases, females and individuals with lower educational status were more likely to refuse. TICS-M score was significantly lower in cases compared to controls (p=0.007). Our results demonstrate that a remotely administered cognitive assessment is feasible for conducting large epidemiologic studies in MS, and lay the much-needed foundation for future work that will utilize MS-validated cognitive measures.

  19. Electricity price forecasting through transfer function models

    International Nuclear Information System (INIS)

    Nogales, F.J.; Conejo, A.J.

    2006-01-01

    Forecasting electricity prices in present-day competitive electricity markets is a must for both producers and consumers because both need price estimates to develop their respective market bidding strategies. This paper proposes a transfer function model to predict electricity prices based on both past electricity prices and demands, and discusses the rationale behind building it. The importance of electricity demand information is assessed. Appropriate metrics to appraise prediction quality are identified and used. Realistic and extensive simulations based on data from the PJM Interconnection for year 2003 are conducted. The proposed model is compared with naive and other techniques. Journal of the Operational Research Society (2006) 57, 350-356. doi:10.1057/palgrave.jors.2601995; published online 18 May 2005. (author)
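
    The paper's exact transfer function specification is not reproduced in this record. As a rough illustration of the idea, regressing price on its own past and on demand, the sketch below fits an ARIMA model with an exogenous demand regressor using statsmodels; the series, orders and forecast horizon are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Illustrative hourly series; in practice these would be PJM prices and demands for 2003.
rng = np.random.default_rng(0)
hours = pd.date_range("2003-01-01", periods=500, freq="H")
demand = pd.Series(30 + 5 * np.sin(np.arange(500) * 2 * np.pi / 24) + rng.normal(0, 1, 500), index=hours)
price = pd.Series(20 + 0.8 * demand.values + rng.normal(0, 2, 500), index=hours)

# Price is modelled on its own past (ARMA terms) plus demand as an exogenous input.
result = ARIMA(price, exog=demand, order=(2, 0, 1)).fit()

# One-day-ahead forecast, conditional on an assumed demand forecast
# (here simply the last observed day, as a stand-in).
future_demand = demand.values[-24:].reshape(-1, 1)
forecast = result.forecast(steps=24, exog=future_demand)
```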

  20. Questionnaire-based assessment of executive functioning: Psychometrics.

    Science.gov (United States)

    Castellanos, Irina; Kronenberger, William G; Pisoni, David B

    2018-01-01

    The psychometric properties of the Learning, Executive, and Attention Functioning (LEAF) scale were investigated in an outpatient clinical pediatric sample. As a part of clinical testing, the LEAF scale, which broadly measures neuropsychological abilities related to executive functioning and learning, was administered to parents of 118 children and adolescents referred for psychological testing at a pediatric psychology clinic; 85 teachers also completed LEAF scales to assess reliability across different raters and settings. Scores on neuropsychological tests of executive functioning and academic achievement were abstracted from charts. Psychometric analyses of the LEAF scale demonstrated satisfactory internal consistency, parent-teacher inter-rater reliability in the small to large effect size range, and test-retest reliability in the large effect size range, similar to values for other executive functioning checklists. Correlations between corresponding subscales on the LEAF and other behavior checklists were large, while most correlations with neuropsychological tests of executive functioning and achievement were significant but in the small to medium range. Results support the utility of the LEAF as a reliable and valid questionnaire-based assessment of delays and disturbances in executive functioning and learning. Applications and advantages of the LEAF and other questionnaire measures of executive functioning in clinical neuropsychology settings are discussed.

  1. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Klos, Richard

    2008-03-15

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment
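
    GEMA's actual compartments and transfer rates are not reproduced here; the sketch below illustrates the generic compartment-flux structure such a model is built on, with hypothetical compartments, first-order transfer rates and a constant release term.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative linear compartment model: radionuclide inventory (Bq) in three
# surface compartments, with first-order transfer rates (1/yr) between them
# and a constant release into the first compartment.
labels = ["soil", "stream water", "lake sediment"]
k = np.array([
    [-0.05,  0.00,  0.00],   # losses from soil
    [ 0.03, -0.20,  0.01],   # soil -> water, losses from water, resuspension
    [ 0.00,  0.15, -0.01],   # water -> sediment, losses from sediment
])
release = np.array([1.0e6, 0.0, 0.0])    # Bq/yr released into soil
decay = np.log(2) / 30.0                 # e.g. a ~30-year half-life nuclide

def dinventory(t, n):
    # Inter-compartment fluxes plus source term minus radioactive decay.
    return k @ n + release - decay * n

sol = solve_ivp(dinventory, (0, 200), y0=[0.0, 0.0, 0.0],
                t_eval=np.linspace(0, 200, 201))
# sol.y[i] is the time history of compartment i; dividing by a compartment
# volume or mass would give the environmental concentration used for dose.
```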

  2. The Generalised Ecosystem Modelling Approach in Radiological Assessment

    International Nuclear Information System (INIS)

    Klos, Richard

    2008-03-01

    An independent modelling capability is required by SSI in order to evaluate dose assessments carried out in Sweden by, amongst others, SKB. The main focus is the evaluation of the long-term radiological safety of radioactive waste repositories for both spent fuel and low-level radioactive waste. To meet the requirement for an independent modelling tool for use in biosphere dose assessments, SSI through its modelling team CLIMB commissioned the development of a new model in 2004, a project to produce an integrated model of radionuclides in the landscape. The generalised ecosystem modelling approach (GEMA) is the result. GEMA is a modular system of compartments representing the surface environment. It can be configured, through water and solid material fluxes, to represent local details in the range of ecosystem types found in the past, present and future Swedish landscapes. The approach is generic but fine tuning can be carried out using local details of the surface drainage system. The modular nature of the modelling approach means that GEMA modules can be linked to represent large scale surface drainage features over an extended domain in the landscape. System change can also be managed in GEMA, allowing a flexible and comprehensive model of the evolving landscape to be constructed. Environmental concentrations of radionuclides can be calculated and the GEMA dose pathway model provides a means of evaluating the radiological impact of radionuclide release to the surface environment. This document sets out the philosophy and details of GEMA and illustrates the functioning of the model with a range of examples featuring the recent CLIMB review of SKB's SR-Can assessment

  3. Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, Robert D.; Hadley, Donald L.; Armstrong, Peter R.; Buck, John W.; Hoopes, Bonnie L.; Janus, Michael C.

    2001-03-01

    Indoor air quality effects on human health are of increasing concern to public health agencies and building owners. The prevention and treatment of 'sick building' syndrome and the spread of air-borne diseases in hospitals, for example, are well known priorities. However, increasing attention is being directed to the vulnerability of our public buildings/places, public security and national defense facilities to terrorist attack or the accidental release of air-borne biological pathogens, harmful chemicals, or radioactive contaminants. The Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System (IA-NBC-HMAS) was developed to serve as a health impact analysis tool for use in addressing these concerns. The overall goal was to develop a user-friendly fully functional prototype Health Modeling and Assessment system, which will operate under the PNNL FRAMES system for ease of use and to maximize its integration with other modeling and assessment capabilities accessible within the FRAMES system (e.g., ambient air fate and transport models, water borne fate and transport models, Physiologically Based Pharmacokinetic models, etc.). The prototype IA-NBC-HMAS is designed to serve as a functional Health Modeling and Assessment system that can be easily tailored to meet specific building analysis needs of a customer. The prototype system was developed and tested using an actual building (i.e., the Churchville Building located at the Aberdeen Proving Ground) and release scenario (i.e., the release and measurement of tracer materials within the building) to ensure realism and practicality in the design and development of the prototype system. A user-friendly "demo" accompanies this report to allow the reader the opportunity for a "hands on" review of the prototype system's capability.

  4. The comparability of English, French and Dutch scores on the Functional Assessment of Chronic Illness Therapy-Fatigue (FACIT-F): an assessment of differential item functioning in patients with systemic sclerosis.

    Directory of Open Access Journals (Sweden)

    Linda Kwakkenbos

    Full Text Available The Functional Assessment of Chronic Illness Therapy-Fatigue (FACIT-F) is commonly used to assess fatigue in rheumatic diseases, and has shown to discriminate better across levels of the fatigue spectrum than other commonly used measures. The aim of this study was to assess the cross-language measurement equivalence of the English, French, and Dutch versions of the FACIT-F in systemic sclerosis (SSc) patients. The FACIT-F was completed by 871 English-speaking Canadian, 238 French-speaking Canadian and 230 Dutch SSc patients. Confirmatory factor analysis was used to assess the factor structure in the three samples. The Multiple-Indicator Multiple-Cause (MIMIC) model was utilized to assess differential item functioning (DIF), comparing English versus French and versus Dutch patient responses separately. A unidimensional factor model showed good fit in all samples. Comparing French versus English patients, statistically significant, but small-magnitude DIF was found for 3 of 13 items. French patients had 0.04 of a standard deviation (SD) lower latent fatigue scores than English patients and there was an increase of only 0.03 SD after accounting for DIF. For the Dutch versus English comparison, 4 items showed small, but statistically significant, DIF. Dutch patients had 0.20 SD lower latent fatigue scores than English patients. After correcting for DIF, there was a reduction of 0.16 SD in this difference. There was statistically significant DIF in several items, but the overall effect on fatigue scores was minimal. English, French and Dutch versions of the FACIT-F can be reasonably treated as having equivalent scoring metrics.

  5. Modeling of Communication in a Computational Situation Assessment Model

    International Nuclear Information System (INIS)

    Lee, Hyun Chul; Seong, Poong Hyun

    2009-01-01

    Operators in nuclear power plants have to acquire information from human system interfaces (HSIs) and the environment in order to create, update, and confirm their understanding of a plant state, or situation awareness, because failures of situation assessment may result in wrong decisions for process control and finally errors of commission in nuclear power plants. Quantitative or prescriptive models that predict an operator's situation assessment in a given situation (the results of situation assessment) provide many benefits such as HSI design solutions, human performance data, and human reliability. Unfortunately, only a few computational situation assessment models for NPP operators have been proposed, and those insufficiently embed human cognitive characteristics. Thus, we proposed a new computational situation assessment model of nuclear power plant operators. The proposed model, which incorporates significant cognitive factors, uses a Bayesian belief network (BBN) as its architecture. It is believed that communication between nuclear power plant operators affects operators' situation assessment and its result, situation awareness. We tried to verify that the proposed model represents the effects of communication on situation assessment. As a result, the proposed model succeeded in representing the operators' behavior; this paper shows the details.
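
    The proposed BBN itself is not reproduced in this record. The sketch below shows only the underlying Bayesian update that such a situation assessment model performs, with hypothetical plant states and indicator likelihoods; communication between operators could be represented as one more evidence term of the same form.

```python
import numpy as np

# Hypothetical plant states and the operator's prior belief over them.
states = ["normal", "small LOCA", "steam generator tube rupture"]
prior = np.array([0.90, 0.06, 0.04])

# P(indication | state) for two HSI indications (illustrative numbers).
p_low_pressurizer_pressure = np.array([0.05, 0.90, 0.70])
p_high_sg_radiation        = np.array([0.01, 0.10, 0.95])

def update(belief, likelihood):
    """One Bayesian update step: posterior is proportional to prior times likelihood."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

belief = update(prior, p_low_pressurizer_pressure)   # after the first indication
belief = update(belief, p_high_sg_radiation)         # after the second indication
# 'belief' now approximates the operator's situation assessment given both cues;
# a communicated cue from another operator would enter as a further likelihood vector.
```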

  6. Improving Assessment of Work Related Mental Health Function Using the Work Disability Functional Assessment Battery (WD-FAB).

    Science.gov (United States)

    Marfeo, Elizabeth E; Ni, Pengsheng; McDonough, Christine; Peterik, Kara; Marino, Molly; Meterko, Mark; Rasch, Elizabeth K; Chan, Leighton; Brandt, Diane; Jette, Alan M

    2018-03-01

    Purpose To improve the mental health component of the Work Disability Functional Assessment Battery (WD-FAB), developed for the US Social Security Administration's (SSA) disability determination process. Specifically our goal was to expand the WD-FAB scales of mood & emotions, resilience, social interactions, and behavioral control to improve the depth and breadth of the current scales and expand the content coverage to include aspects of cognition & communication function. Methods Data were collected from a random, stratified sample of 1695 claimants applying for the SSA work disability benefits, and a general population sample of 2025 working age adults. 169 new items were developed to replenish the WD-FAB scales and analyzed using factor analysis and item response theory (IRT) analysis to construct unidimensional scales. We conducted computer adaptive test (CAT) simulations to examine the psychometric properties of the WD-FAB. Results Analyses supported the inclusion of four mental health subdomains: Cognition & Communication (68 items), Self-Regulation (34 items), Resilience & Sociability (29 items) and Mood & Emotions (34 items). All scales yielded acceptable psychometric properties. Conclusions IRT methods were effective in expanding the WD-FAB to assess mental health function. The WD-FAB has the potential to enhance work disability assessment both within the context of the SSA disability programs as well as other clinical and vocational rehabilitation settings.
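
    The WD-FAB item parameters are not public in this record; the sketch below is a generic two-parameter logistic IRT setup with simulated item parameters, plus the maximum-information item selection rule commonly used in CAT simulations. All numbers are illustrative.

```python
import numpy as np

def p_endorse(theta, a, b):
    """Two-parameter logistic IRT: probability of endorsing an item at level theta."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at function level theta."""
    p = p_endorse(theta, a, b)
    return a ** 2 * p * (1 - p)

rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.0, size=34)    # discriminations (illustrative)
b = rng.normal(0.0, 1.0, size=34)     # item locations / difficulties (illustrative)

theta_hat = 0.0                        # current estimate of the respondent's level
administered = set()                   # items already given in this CAT session

# One CAT step: administer the not-yet-used item that is most informative
# at the current estimate; theta would then be re-estimated from the responses.
info = item_information(theta_hat, a, b)
info[list(administered)] = -np.inf
next_item = int(np.argmax(info))
```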

  7. A more general model for testing measurement invariance and differential item functioning.

    Science.gov (United States)

    Bauer, Daniel J

    2017-09-01

    The evaluation of measurement invariance is an important step in establishing the validity and comparability of measurements across individuals. Most commonly, measurement invariance has been examined using 1 of 2 primary latent variable modeling approaches: the multiple groups model or the multiple-indicator multiple-cause (MIMIC) model. Both approaches offer opportunities to detect differential item functioning within multi-item scales, and thereby to test measurement invariance, but both approaches also have significant limitations. The multiple groups model allows one to examine the invariance of all model parameters but only across levels of a single categorical individual difference variable (e.g., ethnicity). In contrast, the MIMIC model permits both categorical and continuous individual difference variables (e.g., sex and age) but permits only a subset of the model parameters to vary as a function of these characteristics. The current article argues that moderated nonlinear factor analysis (MNLFA) constitutes an alternative, more flexible model for evaluating measurement invariance and differential item functioning. We show that the MNLFA subsumes and combines the strengths of the multiple group and MIMIC models, allowing for a full and simultaneous assessment of measurement invariance and differential item functioning across multiple categorical and/or continuous individual difference variables. The relationships between the MNLFA model and the multiple groups and MIMIC models are shown mathematically and via an empirical demonstration. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Renal function assessment in heart failure.

    Science.gov (United States)

    Pérez Calvo, J I; Josa Laorden, C; Giménez López, I

    Renal function is one of the most consistent prognostic determinants in heart failure. The prognostic information it provides is independent of the ejection fraction and functional status. This article reviews the various renal function assessment measures, with special emphasis on the fact that the patient's clinical situation and response to the heart failure treatment should be considered for the correct interpretation of the results. Finally, we review the literature on the performance of tubular damage biomarkers. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  9. A model for assessing the systemic vulnerability in landslide prone areas

    Directory of Open Access Journals (Sweden)

    S. Pascale

    2010-07-01

    Full Text Available The objectives of spatial planning should include the definition and assessment of possible mitigation strategies regarding the effects of natural hazards on the surrounding territory. Unfortunately, however, there is often a lack of adequate tools to provide necessary support to the local bodies responsible for land management. This paper deals with the conception, the development and the validation of an integrated numerical model for assessing systemic vulnerability in complex and urbanized landslide-prone areas. The proposed model considers this vulnerability not as a characteristic of a particular element at risk, but as a peculiarity of a complex territorial system, in which the elements are reciprocally linked in a functional way. It is an index of the tendency of a given territorial element to suffer damage (usually of a functional kind) due to its interconnections with other elements of the same territorial system. The innovative nature of this work also lies in the formalization of a procedure based on a network of influences for an adequate assessment of such "systemic" vulnerability.

    This approach can be used to obtain information which is useful, in any given situation of a territory hit by a landslide event, for identifying the element which has suffered the most functional damage, i.e., the most "critical" element, and the element which has the greatest repercussions on other elements of the system and thus plays a "decisive" role in the management of the emergency.

    This model was developed within a GIS system through the following phases:

    1. the topological characterization of the territorial system studied and the assessment of the scenarios in terms of spatial landslide hazard. A statistical method based on neural networks was proposed for the assessment of landslide hazard;

    2. the analysis of the direct consequences of a scenario event on the system;

    3. the definition of the

  10. Plant functional modelling as a basis for assessing the impact of management on plant safety

    International Nuclear Information System (INIS)

    Rasmussen, Birgitte; Petersen, Kurt E.

    1999-01-01

    A major objective of the present work is to provide means for representing a chemical process plant as a socio-technical system, so as to allow hazard identification at a high level in order to identify major targets for safety development. The main phases of the methodology are: (1) preparation of a plant functional model where a set of plant functions coherently describes hardware, software, operations, work organization and other safety-related aspects. The basic principle is that any aspect of the plant can be represented by an object based upon an Intent; associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. (2) Plant-level hazard identification based on keywords/checklists and the functional model. (3) Development of incident scenarios and selection of hazardous situations with different safety characteristics. (4) Evaluation of the impact of management on plant safety through interviews. (5) Identification of safety-critical ways of action in the management system, i.e. identification of possible error- and violation-producing conditions.

  11. Using ecosystem modelling techniques in exposure assessments of radionuclides - an overview

    International Nuclear Information System (INIS)

    Kumblad, L.

    2005-01-01

    The risk to humans from potential releases from nuclear facilities is evaluated in safety assessments. Essential components of these assessments are exposure models, which estimate the transport of radionuclides in the environment, the uptake in biota, and transfer to humans. Recently, there has been a growing concern for radiological protection of the whole environment, not only humans, and a first attempt has been to employ model approaches based on stylized environments and transfer functions to biota based exclusively on bioconcentration factors (BCF). They are generally of a non-mechanistic nature and involve no knowledge of the actual processes involved, which is a severe limitation when assessing real ecosystems. In this paper, the possibility of using an ecological modelling approach as a complement or an alternative to the use of BCF-based models is discussed. The paper gives an overview of ecological and ecosystem modelling and examples of studies where ecosystem models have been used in association with ecological risk assessment studies for pollutants other than radionuclides. It also discusses the potential to use this technique in exposure assessments of radionuclides with a few examples from the safety assessment work performed by the Swedish nuclear fuel and waste management company (SKB). Finally, there is a comparison of the characteristics of ecosystem models and traditional exposure models for radionuclides used to estimate the radionuclide exposure of biota. The evaluation of ecosystem models already applied in safety assessments has shown that the ecosystem approach can be used to assess exposure to biota, and that it can handle many of the identified modelling problems related to BCF-models. The findings in this paper suggest that both national and international assessment frameworks for protection of the environment from ionising radiation would benefit from striving to adopt methodologies based on ecologically sound principles and
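
    As a toy illustration of the contrast drawn in the paper, the sketch below compares a BCF (equilibrium) estimate of biota concentration with a simple first-order uptake and elimination model; the rate constants and concentrations are illustrative assumptions, not values from any assessment.

```python
import numpy as np
from scipy.integrate import solve_ivp

c_water = 2.0      # radionuclide concentration in water, Bq/L (illustrative)
bcf = 50.0         # bioconcentration factor, L/kg (illustrative)
k_uptake = 5.0     # uptake rate constant, L/kg/day
k_elim = 0.1       # elimination rate constant, 1/day

# BCF approach: biota concentration assumed to be at equilibrium with water.
c_biota_bcf = bcf * c_water

# Kinetic approach: concentration approaches (k_uptake / k_elim) * c_water over time,
# so the two agree at steady state but differ during transients or pulsed releases.
def dc(t, c):
    return k_uptake * c_water - k_elim * c

sol = solve_ivp(dc, (0, 60), y0=[0.0], t_eval=np.linspace(0, 60, 61))
c_biota_dynamic = sol.y[0]   # time-dependent biota concentration, Bq/kg
```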

  12. On the conversion of functional models : Bridging differences between functional taxonomies in the modeling of user actions

    NARCIS (Netherlands)

    Van Eck, D.

    2009-01-01

    In this paper, I discuss a methodology for the conversion of functional models between functional taxonomies developed by Kitamura et al. (2007) and Ookubo et al. (2007). They apply their methodology to the conversion of functional models described in terms of the Functional Basis taxonomy into

  13. Sensitivity Assessment of Ozone Models

    Energy Technology Data Exchange (ETDEWEB)

    Shorter, Jeffrey A.; Rabitz, Herschel A.; Armstrong, Russell A.

    2000-01-24

    The activities under this contract effort were aimed at developing sensitivity analysis techniques and fully equivalent operational models (FEOMs) for applications in the DOE Atmospheric Chemistry Program (ACP). MRC developed a new model representation algorithm that uses a hierarchical, correlated function expansion containing a finite number of terms. A full expansion of this type is an exact representation of the original model, and each of the expansion functions is explicitly calculated using the original model. Once calculated, the expansion functions are assembled into a fully equivalent operational model (FEOM) that can directly replace the original model.
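
    The contract's FEOM construction is not reproduced here; the sketch below shows a generic first-order, cut-HDMR-style expansion around a reference point, which is the kind of hierarchical correlated function expansion described, applied to a stand-in model.

```python
import numpy as np

def original_model(x):
    """Stand-in for an expensive atmospheric chemistry model (illustrative)."""
    return np.sin(x[0]) + 0.5 * x[1] ** 2 + 0.2 * x[0] * x[1]

def build_first_order_feom(model, x_ref, grids):
    """Tabulate f0 and the first-order component functions f_i(x_i) along cut lines."""
    f0 = model(x_ref)
    components = []
    for i, grid in enumerate(grids):
        vals = []
        for xi in grid:
            x = np.array(x_ref, dtype=float)
            x[i] = xi
            vals.append(model(x) - f0)
        components.append((np.asarray(grid), np.asarray(vals)))
    return f0, components

def evaluate_feom(x, f0, components):
    """Fast surrogate: f(x) is approximated by f0 plus the sum of f_i(x_i)."""
    return f0 + sum(np.interp(x[i], grid, vals)
                    for i, (grid, vals) in enumerate(components))

x_ref = np.array([0.0, 0.0])
grids = [np.linspace(-2, 2, 21), np.linspace(-2, 2, 21)]
f0, comps = build_first_order_feom(original_model, x_ref, grids)
approx = evaluate_feom(np.array([0.7, -1.2]), f0, comps)   # surrogate evaluation
```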

  14. Differences between Mothers' and Fathers' Ratings of Family Functioning with the Family Assessment Device: The Validity of Combined Parent Scores

    Science.gov (United States)

    Cooke, Dawson; Marais, Ida; Cavanagh, Robert; Kendall, Garth; Priddis, Lynn

    2015-01-01

    The psychometric properties of the General Functioning subscale of the McMaster Family Assessment Device were examined using the Rasch Model (N = 237 couples). Mothers' and fathers' ratings of the General Functioning subscale of the McMaster Family Assessment Device are recommended, provided these are analyzed separately. More than a quarter of…

  15. The Feasibility of Quality Function Deployment (QFD) as an Assessment and Quality Assurance Model

    Science.gov (United States)

    Matorera, D.; Fraser, W. J.

    2016-01-01

    Business schools are globally often seen as structured, purpose-driven, multi-sector and multi-perspective organisations. This article is based on the response of a graduate school to an innovative industrial Quality Function Deployment-based model (QFD), which was to be adopted initially in a Master's degree programme for quality assurance…

  16. Assessing and Promoting Functional Resilience in Flight Crews During Exploration Missions

    Science.gov (United States)

    Shelhamer, M.

    2015-01-01

    The NASA Human Research Program works to mitigate risks to health and performance on extended missions. However, research should be directed not only to mitigating known risks, but also to providing crews with tools to assess and enhance resilience, as a group and individually. We can draw on ideas from complexity theory to assess resilience. The entire crew or the individual crewmember can be viewed as a complex system composed of subsystems; the interactions between subsystems are of crucial importance. Understanding the interactions can provide important information even in the absence of complete information on the component subsystems. Enabled by advances in noninvasive measurement of physiological and behavioral parameters, subsystem monitoring can be implemented within a mission and during training to establish baselines. Coupled with mathematical modeling, this can provide assessment of health and function. Since the web of physiological systems (and crewmembers) can be interpreted as a network in mathematical terms, we can draw on recent work that relates the structure of such networks to their resilience (ability to self-organize in the face of perturbation). Some of the many parameters and interactions to choose from include: sleep cycles, coordination of work and meal times, cardiorespiratory rhythms, circadian rhythms and body temperature, stress markers and cognition, sleep and performance, immune function and nutritional status. Tools for resilience are then the means to measure and analyze these parameters, incorporate them into models of normal variability and interconnectedness, and recognize when parameters or their couplings are outside of normal limits.

  17. Statistical modelling with quantile functions

    CERN Document Server

    Gilchrist, Warren

    2000-01-01

    Galton used quantiles more than a hundred years ago in describing data. Tukey and Parzen used them in the 60s and 70s in describing populations. Since then, the authors of many papers, both theoretical and practical, have used various aspects of quantiles in their work. Until now, however, no one put all the ideas together to form what turns out to be a general approach to statistics. Statistical Modelling with Quantile Functions does just that. It systematically examines the entire process of statistical modelling, starting with using the quantile function to define continuous distributions. The author shows that by using this approach, it becomes possible to develop complex distributional models from simple components. A modelling kit can be developed that applies to the whole model - deterministic and stochastic components - and this kit operates by adding, multiplying, and transforming distributions rather than data. Statistical Modelling with Quantile Functions adds a new dimension to the practice of stati...
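
    As a small illustration of the book's central device, defining distributions by their quantile functions and building new models by combining components, the sketch below adds two quantile functions and samples from the result by inverse transform; the particular components are illustrative.

```python
import numpy as np

def q_logistic(p, loc=0.0, scale=1.0):
    """Quantile function of the logistic distribution."""
    return loc + scale * np.log(p / (1 - p))

def q_heavy_tail(p, scale=1.0, shape=0.5):
    """A right-skewed tail component expressed as a quantile function."""
    return scale * ((1 - p) ** (-shape) - 1)

def q_model(p):
    """Adding non-decreasing quantile functions yields a new, valid quantile function."""
    return q_logistic(p, loc=10.0, scale=2.0) + q_heavy_tail(p, scale=1.5, shape=0.4)

# Simulating from the model is just inverse-transform sampling
# (endpoints avoided to keep the logistic component finite).
u = np.random.default_rng(0).uniform(0.001, 0.999, size=1000)
sample = q_model(u)
```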

  18. Function Model for Community Health Service Information

    Science.gov (United States)

    Yang, Peng; Pan, Feng; Liu, Danhong; Xu, Yongyong

    In order to construct a function model of community health service (CHS) information for the development of a CHS information management system, Integration Definition for Function Modeling (IDEF0), an IEEE standard which is extended from Structured Analysis and Design (SADT) and is now a widely used function modeling method, was used to classify its information from top to bottom. The contents of every level of the model were described and coded. A function model for CHS information, which includes 4 super-classes, 15 classes and 28 sub-classes of business function, 43 business processes and 168 business activities, was then established. This model can facilitate information management system development and workflow refinement.

  19. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    Science.gov (United States)

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  20. Routine functional assessment for hip fracture patients

    DEFF Research Database (Denmark)

    Pedersen, Tonny J; Lauritsen, Jens M

    2016-01-01

    Background and purpose - Pre-fracture functional level has been shown to be a consistent predictor of rehabilitation outcomes in older hip fracture patients. We validated 4 overall pre-fracture functional level assessment instruments in patients aged 65 or more, used the prediction of outcome at 4...... months post-fracture, and assessed cutoff values for decision making in treatment and rehabilitation. Patients and methods - 165 consecutive patients with acute primary hip fracture were prospectively included in the study. Pre-fracture Barthel-20, Barthel-100, cumulated ambulation score, and new...... investigation of usage for guidance of clinical and rehabilitation decisions concerning hip fracture patients is warranted....

  1. Assessment of Large Transport Infrastructure Projects: the CBA-DK model

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Banister, David

    2008-01-01

    The scope of this paper is to present a newly developed decision support model to assess transport infrastructure projects: CBA-DK. The model makes use of conventional cost-benefit analysis resulting in aggregated single point estimates and quantitative risk analysis using Monte Carlo simulation...... resulting in interval results. The embedded uncertainties within traditional CBA such as ex-ante based investment costs and travel time savings are of particular concern. The methodological approach has been to apply suitable probability distribution functions on the uncertain parameters, thus resulting...... in feasibility risk assessment moving from point to interval results. Decision support as illustrated in this paper aims to provide assistance in the development and ultimately the choice of action while accounting for the uncertainties surrounding transport appraisal schemes. The modelling framework...
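
    CBA-DK's actual distribution choices and parameters are not reproduced here; the sketch below shows the general Monte Carlo approach of replacing point estimates with probability distributions and reading off an interval result. All figures and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs (illustrative): investment cost with a right-skewed distribution
# to reflect optimism bias, and annual travel time savings with symmetric uncertainty.
investment_cost = rng.lognormal(mean=np.log(1_000), sigma=0.25, size=n)   # million DKK
annual_benefit = rng.normal(loc=80, scale=15, size=n)                     # million DKK/yr

years, discount_rate = 30, 0.04
discount_sum = sum(1 / (1 + discount_rate) ** t for t in range(1, years + 1))

benefit_cost_ratio = (annual_benefit * discount_sum) / investment_cost

# Interval result instead of a single point estimate.
low, median, high = np.percentile(benefit_cost_ratio, [5, 50, 95])
prob_feasible = (benefit_cost_ratio > 1).mean()
```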

  2. Alternative parameters for echocardiographic assessment of fetal diastolic function

    Directory of Open Access Journals (Sweden)

    Zielinsky P.

    2004-01-01

    Full Text Available Alternative methods to assess ventricular diastolic function in the fetus are proposed. Fetal myocardial hypertrophy in maternal diabetes was used as a model of decreased left ventricular compliance (LVC), and fetal respiratory movements as a model of increased LVC. Comparison of three groups of fetuses showed that, in 10 fetuses of diabetic mothers (FDM) with septal hypertrophy (SH), the mean excursion index of the septum primum (EISP) (ratio between the linear excursion of the flap valve and the left atrial diameter) was 0.36 ± 0.09, in 8 FDM without SH it was 0.51 ± 0.09 (P = 0.001), and in the 8 normal control fetuses (NCF) it was 0.49 ± 0.12 (P = 0.003). In another study, 28 fetuses in apnea had a mean EISP of 0.39 ± 0.05 which increased to 0.57 ± 0.07 during respiration (P < 0.001). These two studies showed that the mobility of the septum primum was reduced when LVC was decreased and was increased when LVC was enhanced. Mean pulmonary vein pulsatility was higher in 14 FDM (1.83 ± 1.21) than in 26 NCF (1.02 ± 0.31; P = 0.02). In the same fetuses, mean left atrial shortening was decreased (0.40 ± 0.11) in relation to NCF (0.51 ± 0.09; P = 0.011). These results suggest that FDM may have a higher preload than normal controls, probably as a result of increased myocardial mass and LV hypertrophy. Prenatal assessment of LV diastolic function by fetal echocardiography should include analysis of septum primum mobility, pulmonary vein pulsatility, and left atrial shortening.

  3. Functional model of biological neural networks.

    Science.gov (United States)

    Lo, James Ting-Ho

    2010-12-01

    A functional model of biological neural networks, called temporal hierarchical probabilistic associative memory (THPAM), is proposed in this paper. THPAM comprises functional models of dendritic trees for encoding inputs to neurons, a first type of neuron for generating spike trains, a second type of neuron for generating graded signals to modulate neurons of the first type, supervised and unsupervised Hebbian learning mechanisms for easy learning and retrieving, an arrangement of dendritic trees for maximizing generalization, hardwiring for rotation-translation-scaling invariance, and feedback connections with different delay durations for neurons to make full use of present and past information generated by neurons in the same and higher layers. These functional models and their processing operations have many functions of biological neural networks that have not been achieved by other models in the open literature and provide logically coherent answers to many long-standing neuroscientific questions. However, biological justifications of these functional models and their processing operations are required for THPAM to qualify as a macroscopic model (or low-order approximation) of biological neural networks.

  4. Critical Zone Experimental Design to Assess Soil Processes and Function

    Science.gov (United States)

    Banwart, Steve

    2010-05-01

    Through unsustainable land use practices, mining, deforestation, urbanisation and degradation by industrial pollution, soil losses are now hypothesized to be much faster (100 times or more) than soil formation - with the consequence that soil has become a finite resource. The crucial challenge for the international research community is to understand the rates of processes that dictate soil mass stocks and their function within Earth's Critical Zone (CZ). The CZ is the environment where soils are formed, degrade and provide their essential ecosystem services. Key among these ecosystem services are food and fibre production, filtering, buffering and transformation of water, nutrients and contaminants, storage of carbon and maintaining biological habitat and genetic diversity. We have initiated a new research project to address the priority research areas identified in the European Union Soil Thematic Strategy and to contribute to the development of a global network of Critical Zone Observatories (CZO) committed to soil research. Our hypothesis is that the combined physical-chemical-biological structure of soil can be assessed from first-principles and the resulting soil functions can be quantified in process models that couple the formation and loss of soil stocks with descriptions of biodiversity and nutrient dynamics. The objectives of this research are to 1. Describe from 1st principles how soil structure influences processes and functions of soils, 2. Establish 4 European Critical Zone Observatories to link with established CZOs, 3. Develop a CZ Integrated Model of soil processes and function, 4. Create a GIS-based modelling framework to assess soil threats and mitigation at EU scale, 5. Quantify impacts of changing land use, climate and biodiversity on soil function and its value and 6. Form with international partners a global network of CZOs for soil research and deliver a programme of public outreach and research transfer on soil sustainability. The

  5. The Comparability of English, French and Dutch Scores on the Functional Assessment of Chronic Illness Therapy-Fatigue (FACIT-F): An Assessment of Differential Item Functioning in Patients with Systemic Sclerosis

    Science.gov (United States)

    Kwakkenbos, Linda; Willems, Linda M.; Baron, Murray; Hudson, Marie; Cella, David; van den Ende, Cornelia H. M.; Thombs, Brett D.

    2014-01-01

    Objective The Functional Assessment of Chronic Illness Therapy- Fatigue (FACIT-F) is commonly used to assess fatigue in rheumatic diseases, and has shown to discriminate better across levels of the fatigue spectrum than other commonly used measures. The aim of this study was to assess the cross-language measurement equivalence of the English, French, and Dutch versions of the FACIT-F in systemic sclerosis (SSc) patients. Methods The FACIT-F was completed by 871 English-speaking Canadian, 238 French-speaking Canadian and 230 Dutch SSc patients. Confirmatory factor analysis was used to assess the factor structure in the three samples. The Multiple-Indicator Multiple-Cause (MIMIC) model was utilized to assess differential item functioning (DIF), comparing English versus French and versus Dutch patient responses separately. Results A unidimensional factor model showed good fit in all samples. Comparing French versus English patients, statistically significant, but small-magnitude DIF was found for 3 of 13 items. French patients had 0.04 of a standard deviation (SD) lower latent fatigue scores than English patients and there was an increase of only 0.03 SD after accounting for DIF. For the Dutch versus English comparison, 4 items showed small, but statistically significant, DIF. Dutch patients had 0.20 SD lower latent fatigue scores than English patients. After correcting for DIF, there was a reduction of 0.16 SD in this difference. Conclusions There was statistically significant DIF in several items, but the overall effect on fatigue scores was minimal. English, French and Dutch versions of the FACIT-F can be reasonably treated as having equivalent scoring metrics. PMID:24638101

  6. Assessment of symbolic function in Mexican preschool children

    Directory of Open Access Journals (Sweden)

    N. R. Jiménez Barreto

    2013-04-01

    Full Text Available Development of symbolic function is an important psychological formation of pre-school age and reflects the child's ability to use signs and symbols in a conscious way. Assessment of symbolic function can be used as one indicator of readiness for school. The objective of the present study is to characterize the level of symbolic function development in Mexican pre-school children. 59 children were included in the study. The ages of the children were between 5 and 6 years and all of them attended a suburban pre-school institution. All 59 children participated in this study for the first time. Our assessment consisted of specific tasks with symbolic means on materialized, perceptive and verbal levels. Each child was tested individually. Results showed an insufficient development of the symbolic function in all evaluated children. More than 78% of the children showed difficulties performing the assessment tasks; their drawings were undifferentiated and had few essential characteristics. The obtained results show the necessity to implement developmental strategies in order to guarantee the formation of the ability to use symbolic means consciously and consistently by the end of pre-school age.

  7. Gene transfer in rodents and primates as a new tool for modeling diseases in animals and assessing functions by in vivo imaging

    Energy Technology Data Exchange (ETDEWEB)

    Deglon, N. [Atomic Energy Commission (CEA), Dept. of Medical Research and MIRCen Program, 91 - Orsay (France)]

    2006-07-01

    The identification of disease-causing genes in familial forms of neuro-degenerative disorders and the development of genetic models closely replicating human CNS pathologies have drastically changed our understanding of the molecular events leading to neuronal cell death. If these achievements open new opportunities of therapeutic interventions efficient delivery systems taking into account the specificity of the central nervous system are required to administer therapeutic candidates. In addition, there is a need to develop 1) genetic models in large animals that replicate late stages of the diseases and 2) imaging techniques suitable for longitudinal, quantitative and non-invasive evaluation of disease progression and the evaluation of new therapeutic strategies. Over the last few years, we have investigated the potential of lentiviral vectors as tool to model and treat CNS disorders. The use of lentiviral vectors to create animal model of these pathologies holds various advantages compared to classical transgenic approaches. Viral vectors are versatile, highly flexible tools to perform in vivo studies. Multiple genetic models can be created in a short period of time. High transduction efficiencies as well as robust and sustained trans-gene expression lead to the rapid appearance of functional and behavioral abnormalities and severe neuro-degeneration. Targeted injections in different brain areas can be used to investigate the regional specificity of the neuro-pathology and eliminate potential side effects associated with a widespread over-expression of the trans-gene. Finally, models can be established in different mammalian species including non-human primates, thereby providing an opportunity to assess complex behavioral changes and perform longitudinal follow-up of neuro-pathological alterations by imaging. We have demonstrated the proof of principle of this approach for Huntington's disease. We have shown that the intratriatal injection of lentiviral

  8. Gene transfer in rodents and primates as a new tool for modeling diseases in animals and assessing functions by in vivo imaging

    International Nuclear Information System (INIS)

    Deglon, N.

    2006-01-01

    The identification of disease-causing genes in familial forms of neuro-degenerative disorders and the development of genetic models closely replicating human CNS pathologies have drastically changed our understanding of the molecular events leading to neuronal cell death. If these achievements open new opportunities of therapeutic interventions efficient delivery systems taking into account the specificity of the central nervous system are required to administer therapeutic candidates. In addition, there is a need to develop 1) genetic models in large animals that replicate late stages of the diseases and 2) imaging techniques suitable for longitudinal, quantitative and non-invasive evaluation of disease progression and the evaluation of new therapeutic strategies. Over the last few years, we have investigated the potential of lentiviral vectors as tool to model and treat CNS disorders. The use of lentiviral vectors to create animal model of these pathologies holds various advantages compared to classical transgenic approaches. Viral vectors are versatile, highly flexible tools to perform in vivo studies. Multiple genetic models can be created in a short period of time. High transduction efficiencies as well as robust and sustained trans-gene expression lead to the rapid appearance of functional and behavioral abnormalities and severe neuro-degeneration. Targeted injections in different brain areas can be used to investigate the regional specificity of the neuro-pathology and eliminate potential side effects associated with a widespread over-expression of the trans-gene. Finally, models can be established in different mammalian species including non-human primates, thereby providing an opportunity to assess complex behavioral changes and perform longitudinal follow-up of neuro-pathological alterations by imaging. We have demonstrated the proof of principle of this approach for Huntington's disease. We have shown that the intratriatal injection of lentiviral vector

  9. ANALYSIS OF LAND RESOURCES SUITABILITY BY FUNCTIONAL MODEL IN EASTERN CROATIA REGION

    Directory of Open Access Journals (Sweden)

    Vladimir Vukadinović

    2011-06-01

    Full Text Available A total of 17,405 soil samples (from the years 2003-2009) were analyzed in the eastern part of Croatia. The aim of this paper is to assess land suitability for crops, i.e., to describe land quality quantitatively and indicate shortcomings of the land use system in the investigated area. The described mathematical model uses score functions for estimating indicators of soil suitability. The GIS-supported computer model for assessing soil suitability for crops proved to be fast, efficient and sufficiently reliable. Using GIS tools it is possible to visualize land suitability and present it on different cartographic bases such as maps, while the geostatistical method of kriging makes it possible to regionalize the production area based on the quantitative assessment of land suitability for crops.
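
    The model's actual score functions and weights are not reproduced in this record; a toy version of the idea, mapping each measured soil property through a 0-1 score function and aggregating weighted scores into a suitability index, is sketched below with illustrative thresholds. The resulting index per sample point is what kriging would then interpolate across the production area.

```python
import numpy as np

def score_linear(x, low, high):
    """Piecewise-linear score function: 0 below 'low', 1 above 'high'."""
    return np.clip((x - low) / (high - low), 0.0, 1.0)

def score_optimum(x, opt, tol):
    """Score function peaking at an optimum value (e.g. pH) and falling off linearly."""
    return np.clip(1.0 - np.abs(x - opt) / tol, 0.0, 1.0)

# Illustrative samples: pH, humus (%), available P (mg/100 g), available K (mg/100 g).
samples = np.array([
    [6.8, 2.5, 18.0, 22.0],
    [5.2, 1.2,  8.0, 12.0],
    [7.9, 3.1, 25.0, 30.0],
])

scores = np.column_stack([
    score_optimum(samples[:, 0], opt=6.5, tol=2.0),    # pH
    score_linear(samples[:, 1], low=1.0, high=3.0),    # humus
    score_linear(samples[:, 2], low=5.0, high=20.0),   # phosphorus
    score_linear(samples[:, 3], low=10.0, high=25.0),  # potassium
])

weights = np.array([0.3, 0.3, 0.2, 0.2])   # illustrative indicator weights
suitability = scores @ weights              # 0-1 suitability index per sample
```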

  10. Riluzole does not improve lifespan or motor function in three ALS mouse models.

    Science.gov (United States)

    Hogg, Marion C; Halang, Luise; Woods, Ina; Coughlan, Karen S; Prehn, Jochen H M

    2017-12-08

    Riluzole is the most widespread therapeutic for treatment of the progressive degenerative disease amyotrophic lateral sclerosis (ALS). Riluzole gained FDA approval in 1995 before the development of ALS mouse models. We assessed riluzole in three transgenic ALS mouse models: the SOD1 G93A model, the TDP-43 A315T model, and the recently developed FUS (1-359) model. Age, sex and litter-matched mice were treated with riluzole (22 mg/kg) in drinking water or vehicle (DMSO) from symptom onset. Lifespan was assessed and motor function tests were carried out twice weekly to determine whether riluzole slowed disease progression. Riluzole treatment had no significant benefit on lifespan in any of the ALS mouse models tested. Riluzole had no significant impact on decline in motor performance in the FUS (1-359) and SOD1 G93A transgenic mice as assessed by Rotarod and stride length analysis. Riluzole is widely prescribed for ALS patients despite questions surrounding its efficacy. Our data suggest that if riluzole was identified as a therapeutic candidate today it would not progress past pre-clinical assessment. This raises questions about the standards used in pre-clinical assessment of therapeutic candidates for the treatment of ALS.

  11. Modeling fire occurrence as a function of landscape

    Science.gov (United States)

    Loboda, T. V.; Carroll, M.; DiMiceli, C.

    2011-12-01

    Wildland fire is a prominent component of ecosystem functioning worldwide. Nearly all ecosystems experience the impact of naturally occurring or anthropogenically driven fire. Here, we present a spatially explicit and regionally parameterized Fire Occurrence Model (FOM) aimed at developing fire occurrence estimates at landscape and regional scales. The model provides spatially explicit scenarios of fire occurrence based on the available records from fire management agencies, satellite observations, and auxiliary geospatial data sets. Fire occurrence is modeled as a function of the risk of ignition, potential fire behavior, and fire weather using internal regression tree-driven algorithms and empirically established, regionally derived relationships between fire occurrence, fire behavior, and fire weather. The FOM presents a flexible modeling structure with a set of internal globally available default geospatial independent and dependent variables. However, the flexible modeling environment adapts to ingest a variable number, resolution, and content of inputs provided by the user to supplement or replace the default parameters to improve the model's predictive capability. A Southern California FOM instance (SC FOM) was developed using satellite assessments of fire activity from a suite of Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data, Monitoring Trends in Burn Severity fire perimeters, and auxiliary geospatial information including land use and ownership, utilities, transportation routes, and the Remote Automated Weather Station data records. The model was parameterized based on satellite data acquired between 2001 and 2009 and fire management fire perimeters available prior to 2009. SC FOM predictive capabilities were assessed using observed fire occurrence available from the MODIS active fire product during 2010. The results show that SC FOM provides a realistic estimate of fire occurrence at the landscape level: the fraction of
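
    The FOM's internal regression-tree algorithms and parameterization are not public in this record; the sketch below shows the general pattern of fitting a tree-based classifier to landscape covariates and producing a per-cell probability of fire occurrence. The covariates, labels and model choice are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5_000

# Illustrative landscape covariates per grid cell: distance to roads (km),
# a fuel moisture proxy, a land-use/ownership class, and a fire weather index.
X = np.column_stack([
    rng.exponential(3.0, n),        # distance to nearest road
    rng.uniform(0.05, 0.35, n),     # fuel moisture
    rng.integers(0, 5, n),          # land use / ownership class
    rng.gamma(2.0, 5.0, n),         # fire weather index
])

# Synthetic "historical" fire occurrence labels standing in for the 2001-2009 record.
logit = -2.0 - 0.3 * X[:, 0] - 8.0 * X[:, 1] + 0.08 * X[:, 3]
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
p_fire = model.predict_proba(X)[:, 1]   # per-cell probability of fire occurrence
```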

  12. Objective Integrated Assessment of Functional Outcomes in Reduction Mammaplasty

    Science.gov (United States)

    Passaro, Ilaria; Malovini, Alberto; Faga, Angela; Toffola, Elena Dalla

    2013-01-01

    Background: The aim of our study was an objective integrated assessment of the functional outcomes of reduction mammaplasty. Methods: The study involved 17 women undergoing reduction mammaplasty from March 2009 to June 2011. Each patient was assessed before surgery and 2 months postoperatively with the original association of 4 subjective and objective assessment methods: a physiatric clinical examination, the Roland Morris Disability Questionnaire, the Berg Balance Scale, and a static force platform analysis. Results: All of the tests proved multiple statistically significant associated outcomes demonstrating a significant improvement in the functional status following reduction mammaplasty. Surgical correction of breast hypertrophy could achieve both spinal pain relief and recovery of performance status in everyday life tasks, owing to a muscular postural functional rearrangement with a consistent antigravity muscle activity sparing. Pain reduction in turn could reduce the antalgic stiffness and improved the spinal range of motion. In our sample, the improvement of the spinal range of motion in flexion matched a similar improvement in extension. Recovery of a more favorable postural pattern with reduction of the anterior imbalance was demonstrated by the static force stabilometry. Therefore, postoperatively, all of our patients narrowed the gap between the actual body barycenter and the ideal one. The static force platform assessment also consistently confirmed the effectiveness of an accurate clinical examination of functional impairment from breast hypertrophy. Conclusions: The static force platform assessment might help the clinician to support the diagnosis of functional impairment from a breast hypertrophy with objectively based data. PMID:25289256

  13. Functional evaluation of peripheral nerve regeneration and target reinnervation in animal models: a critical overview.

    Science.gov (United States)

    Navarro, Xavier

    2016-02-01

    Peripheral nerve injuries usually lead to severe loss of motor, sensory and autonomic functions in the patients. Due to the complex requirements for adequate axonal regeneration, functional recovery is often poorly achieved. Experimental models are useful to investigate the mechanisms related to axonal regeneration and tissue reinnervation, and to test new therapeutic strategies to improve functional recovery. Therefore, objective and reliable evaluation methods should be applied for the assessment of regeneration and function restitution after nerve injury in animal models. This review gives an overview of the most useful methods to assess nerve regeneration, target reinnervation and recovery of complex sensory and motor functions, their values and limitations. The selection of methods has to be adequate to the main objective of the research study, either enhancement of axonal regeneration, improving regeneration and reinnervation of target organs by different types of nerve fibres, or increasing recovery of complex sensory and motor functions. It is generally recommended to use more than one functional method for each purpose, and also to perform morphological studies of the injured nerve and the reinnervated targets. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  14. Canine intrahepatic vasculature: is a functional anatomic model relevant to the dog?

    Science.gov (United States)

    Hall, Jon L; Mannion, Paddy; Ladlow, Jane F

    2015-01-01

    To clarify canine intrahepatic portal and hepatic venous system anatomy using corrosion casting and advanced imaging and to devise a novel functional anatomic model of the canine liver to investigate whether this could help guide the planning and surgical procedure of partial hepatic lobectomy and interventional radiological procedures. Prospective experimental study. Adult Greyhound cadavers (n = 8). Portal and hepatic vein corrosion casts of healthy livers were assessed using computed tomography (CT). The hepatic lobes have a consistent hilar hepatic and portal vein supply with some variation in the number of intrahepatic branches. For all specimens, 3 surgically resectable areas were identified in the left lateral lobe and 2 surgically resectable areas were identified in the right medial lobe as defined by a functional anatomic model. CT of detailed acrylic casts allowed complex intrahepatic vascular relationships to be investigated and compared with previous studies. Improving understanding of the intrahepatic vascular supply facilitates interpretation of advanced images in clinical patients, the planning and performance of surgical procedures, and may facilitate interventional vascular procedures, such as intravenous embolization of portosystemic shunts. Functional division of the canine liver similar to human models is possible. The left lateral and right medial lobes can be consistently divided into surgically resectable functional areas and partial lobectomies can be performed following a functional model; further study in clinically affected animals would be required to investigate the relevance of this functional model in the dog. © Copyright 2014 by The American College of Veterinary Surgeons.

  15. Acceptability of Functional Behavioral Assessment Procedures to Special Educators and School Psychologists

    Science.gov (United States)

    O'Neill, Robert E.; Bundock, Kaitlin; Kladis, Kristin; Hawken, Leanne S.

    2015-01-01

    This survey study assessed the acceptability of a variety of functional behavioral assessment (FBA) procedures (i.e., functional assessment interviews, rating scales/questionnaires, systematic direct observations, functional analysis manipulations) to a national sample of 123 special educators and a state sample of 140 school psychologists.…

  16. Value function in economic growth model

    Science.gov (United States)

    Bagno, Alexander; Tarasyev, Alexandr A.; Tarasyev, Alexander M.

    2017-11-01

    Properties of the value function are examined in an infinite horizon optimal control problem with an unlimited integrand index appearing in the quality functional with a discount factor. Optimal control problems of this type describe solutions in models of economic growth. Necessary and sufficient conditions are derived to ensure that the value function satisfies the infinitesimal stability properties. It is proved that the value function coincides with the minimax solution of the Hamilton-Jacobi equation. The asymptotic growth behavior of the value function is described for the logarithmic, power and exponential quality functionals, and an example is given to illustrate construction of the value function in economic growth models.
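
    For orientation, the class of problems described above can be written compactly. The LaTeX sketch below states a generic discounted infinite-horizon functional and the associated stationary Hamilton-Jacobi-Bellman equation; the notation (state x, control u, discount rate rho, utility g) is introduced here for illustration and is not taken from the paper.

```latex
% Generic discounted infinite-horizon optimal control problem (illustrative notation)
\[
J(x_0,u) \;=\; \int_0^{\infty} e^{-\rho t}\, g\bigl(x(t),u(t)\bigr)\,dt,
\qquad \dot{x}(t) = f\bigl(x(t),u(t)\bigr),\quad x(0)=x_0 .
\]
% The value function V(x) = \sup_u J(x,u) is characterized, in the minimax/viscosity
% sense referred to in the abstract, by the stationary Hamilton-Jacobi-Bellman equation
\[
\rho\, V(x) \;=\; \sup_{u}\;\bigl\{\, g(x,u) + \nabla V(x)\cdot f(x,u) \,\bigr\}.
\]
```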

  17. Relating Memory To Functional Performance In Normal Aging to Dementia Using Hierarchical Bayesian Cognitive Processing Models

    Science.gov (United States)

    Shankle, William R.; Pooley, James P.; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D.

    2012-01-01

    Determining how cognition affects functional abilities is important in Alzheimer’s disease and related disorders (ADRD). 280 patients (normal or ADRD) received a total of 1,514 assessments using the Functional Assessment Staging Test (FAST) procedure and the MCI Screen (MCIS). A hierarchical Bayesian cognitive processing (HBCP) model was created by embedding a signal detection theory (SDT) model of the MCIS delayed recognition memory task into a hierarchical Bayesian framework. The SDT model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the six FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. HBCP models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition to a continuous measure of functional severity for both individuals and FAST groups. Such a translation links two levels of brain information processing, and may enable more accurate correlations with other levels, such as those characterized by biomarkers. PMID:22407225
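
    The signal detection theory layer of the model maps recognition hits and false alarms onto a discriminability parameter (memory) and a response-bias parameter (executive function). As a rough, non-hierarchical illustration of that mapping, the sketch below computes the standard equal-variance Gaussian point estimates d' and c from raw counts; the function name and the example counts are hypothetical, and the paper's hierarchical Bayesian treatment goes well beyond this.

```python
from scipy.stats import norm

def sdt_point_estimates(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian SDT: discriminability d' and response bias c.

    A small correction (add 0.5 to each cell) keeps the z-transform finite
    when a rate would otherwise be exactly 0 or 1.
    """
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_hit - z_fa               # memory / discriminability
    criterion = -0.5 * (z_hit + z_fa)    # response bias (executive component)
    return d_prime, criterion

# Hypothetical recognition-memory counts for one assessment
print(sdt_point_estimates(hits=8, misses=2, false_alarms=3, correct_rejections=7))
```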

  18. Developing models of how cognitive improvements change functioning: Mediation, moderation and moderated mediation

    Science.gov (United States)

    Wykes, Til; Reeder, Clare; Huddy, Vyv; Taylor, Rumina; Wood, Helen; Ghirasim, Natalia; Kontis, Dimitrios; Landau, Sabine

    2012-01-01

    Background Cognitive remediation (CRT) affects functioning but the extent and type of cognitive improvements necessary are unknown. Aim To develop and test models of how cognitive improvement transfers to work behaviour using the data from a current service. Method Participants (N = 49) with a support worker and a paid or voluntary job were offered CRT in a Phase 2 single-group design with three assessments: baseline, post therapy and follow-up. Working memory, cognitive flexibility, planning and work outcomes were assessed. Results Three models were tested (mediation — cognitive improvements drive functioning improvement; moderation — post-treatment cognitive level affects the impact of CRT on functioning; moderated mediation — cognition drives functioning improvements only after a certain level is achieved). There was evidence of mediation (planning improvement associated with improved work quality). There was no evidence that cognitive flexibility (total Wisconsin Card Sorting Test errors) and working memory (Wechsler Adult Intelligence Scale III digit span) mediated work functioning despite significant effects. There was some evidence of moderated mediation for planning improvement if participants had poorer memory and/or made fewer WCST errors. The total CRT effect on work quality was d = 0.55, but the indirect (planning-mediated) CRT effect was d = 0.082. Conclusion Planning improvements led to better work quality but only accounted for a small proportion of the total effect on work outcome. Other specific and non-specific effects of CRT and the work programme are likely to account for some of the remaining effect. This is the first time complex models have been tested, and future Phase 3 studies need to further test mediation and moderated mediation models. PMID:22503640
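
    As a minimal sketch of the mediation logic tested here (treatment to cognitive improvement to work outcome), the code below estimates a textbook product-of-coefficients mediation with ordinary least squares on simulated data; all variable names and effect sizes are illustrative, and the study's actual analyses also include moderation and moderated mediation.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
crt = rng.integers(0, 2, n).astype(float)       # treatment indicator (CRT yes/no)
planning = 0.5 * crt + rng.normal(size=n)       # mediator: planning improvement
work_quality = 0.3 * planning + 0.2 * crt + rng.normal(size=n)  # outcome

# Path a: treatment -> mediator
a = sm.OLS(planning, sm.add_constant(crt)).fit().params[1]
# Path b and direct effect c': outcome ~ mediator + treatment
fit_b = sm.OLS(work_quality, sm.add_constant(np.column_stack([planning, crt]))).fit()
b, c_prime = fit_b.params[1], fit_b.params[2]

indirect = a * b                # mediated (indirect) effect of CRT via planning
total = indirect + c_prime      # total effect = indirect + direct
print(f"indirect={indirect:.3f}  direct={c_prime:.3f}  total={total:.3f}")
```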

  19. [Assessment of Functioning when Conducting Occupational Capacity Evaluations--What is "Evidence-Based"?].

    Science.gov (United States)

    Canela, Carlos; Schleifer, Roman; Dube, Anish; Hengartner, Michael P; Ebner, Gerhard; Seifritz, Erich; Liebrenz, Michael

    2016-03-01

    Occupational capacity evaluations have previously been subject to criticism for lacking in quality and consistency. To the authors' knowledge, there is no clear consensus on the best way to formally assess functioning within capacity evaluations. In this review we investigated different instruments that are used to assess functioning in occupational capacity evaluations. Systematic review of the literature. Though several instruments that assess functional capacity were found in our search, a specific validated instrument assessing occupational capacity as part of a larger psychiatric evaluation was not found. The limitations of the existing instruments on assessing functional capacity are discussed. Medical experts relying on instruments to conduct functional capacity evaluations should be cognizant of their limitations. The findings call for the development and use of an instrument specifically designed to assess the functional and occupational capacity of psychiatric patients, which is also likely to improve the quality of these reports. © Georg Thieme Verlag KG Stuttgart · New York.

  20. A Systematic Approach for Real-Time Operator Functional State Assessment

    Science.gov (United States)

    Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean; Li, Jiang; Li, Feng; McKenzie, Frederick

    2012-01-01

    A task overload condition often leads to high stress for an operator, causing performance degradation and possibly disastrous consequences. Just as dangerous, with automated flight systems, an operator may experience a task underload condition (during the en-route flight phase, for example), becoming easily bored and finding it difficult to maintain sustained attention. When an unexpected event occurs, either internal or external to the automated system, the disengaged operator may neglect, misunderstand, or respond slowly/inappropriately to the situation. In this paper, we discuss an approach for Operator Functional State (OFS) monitoring in a typical aviation environment. A systematic ground truth finding procedure has been designed based on subjective evaluations, performance measures, and strong physiological indicators. The derived OFS ground truth is continuous in time compared to a very sparse estimation of OFS based on an expert review or subjective evaluations. It can capture the variations of OFS during a mission to better guide the training process of the OFS assessment model. Furthermore, an OFS assessment model framework based on advanced machine learning techniques was designed, and the systematic approach was then verified and validated with experimental data collected in a high fidelity Boeing 737 simulator. Preliminary results show highly accurate engagement/disengagement detection, making it suitable for real-time applications to assess pilot engagement.

  1. Determination of probability density functions for parameters in the Munson-Dawson model for creep behavior of salt

    International Nuclear Information System (INIS)

    Pfeifle, T.W.; Mellegard, K.D.; Munson, D.E.

    1992-10-01

    The modified Munson-Dawson (M-D) constitutive model that describes the creep behavior of salt will be used in performance assessment calculations to assess compliance of the Waste Isolation Pilot Plant (WIPP) facility with requirements governing the disposal of nuclear waste. One of these standards requires that the uncertainty of future states of the system, material model parameters, and data be addressed in the performance assessment models. This paper presents a method in which measurement uncertainty and the inherent variability of the material are characterized by treating the M-D model parameters as random variables. The random variables can be described by appropriate probability distribution functions which then can be used in Monte Carlo or structural reliability analyses. Estimates of three random variables in the M-D model were obtained by fitting a scalar form of the model to triaxial compression creep data generated from tests of WIPP salt. Candidate probability distribution functions for each of the variables were then fitted to the estimates and their relative goodness-of-fit tested using the Kolmogorov-Smirnov statistic. A sophisticated statistical software package obtained from BMDP Statistical Software, Inc. was used in the M-D model fitting. A separate software package, STATGRAPHICS, was used in fitting the candidate probability distribution functions to estimates of the variables. Skewed distributions, i.e., lognormal and Weibull, were found to be appropriate for the random variables analyzed
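
    The distribution-selection step described above (fitting candidate probability distributions to repeated parameter estimates and comparing them with a Kolmogorov-Smirnov statistic) can be sketched with scipy in place of the BMDP and STATGRAPHICS packages mentioned; the synthetic sample below stands in for the fitted M-D parameter estimates and is not WIPP data.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for repeated estimates of one M-D model parameter
rng = np.random.default_rng(1)
estimates = rng.lognormal(mean=0.0, sigma=0.4, size=40)

candidates = {
    "lognormal": stats.lognorm,
    "weibull":   stats.weibull_min,
    "normal":    stats.norm,
}

for name, dist in candidates.items():
    params = dist.fit(estimates)                          # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(estimates, dist.cdf, args=params)
    print(f"{name:9s}  KS={ks_stat:.3f}  p={p_value:.3f}")  # smaller KS = better fit
```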

  2. Structure-Function Network Mapping and Its Assessment via Persistent Homology

    Science.gov (United States)

    2017-01-01

    Understanding the relationship between brain structure and function is a fundamental problem in network neuroscience. This work deals with the general method of structure-function mapping at the whole-brain level. We formulate the problem as a topological mapping of structure-function connectivity via matrix function, and find a stable solution by exploiting a regularization procedure to cope with large matrices. We introduce a novel measure of network similarity based on persistent homology for assessing the quality of the network mapping, which enables a detailed comparison of network topological changes across all possible thresholds, rather than just at a single, arbitrary threshold that may not be optimal. We demonstrate that our approach can uncover the direct and indirect structural paths for predicting functional connectivity, and our network similarity measure outperforms other currently available methods. We systematically validate our approach with (1) a comparison of regularized vs. non-regularized procedures, (2) a null model of the degree-preserving random rewired structural matrix, (3) different network types (binary vs. weighted matrices), and (4) different brain parcellation schemes (low vs. high resolutions). Finally, we evaluate the scalability of our method with relatively large matrices (2514x2514) of structural and functional connectivity obtained from 12 healthy human subjects measured non-invasively while at rest. Our results reveal a nonlinear structure-function relationship, suggesting that the resting-state functional connectivity depends on direct structural connections, as well as relatively parsimonious indirect connections via polysynaptic pathways. PMID:28046127
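
    The abstract does not state which matrix function is used, so the sketch below takes one common choice, a scaled matrix exponential (communicability) of the structural matrix, purely to illustrate how direct plus indirect polysynaptic paths can be combined into a predicted functional matrix; the scaling parameter and the random matrices are placeholders.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 90                                        # e.g. a low-resolution parcellation
S = rng.random((n, n)); S = (S + S.T) / 2     # symmetric "structural" connectivity
np.fill_diagonal(S, 0.0)

beta = 0.5                                    # illustrative spreading/scaling parameter
F_pred = expm(beta * S)                       # sums direct + indirect (polysynaptic) paths

# Compare predicted and "observed" functional connectivity (random here) by
# correlating the upper triangles, as is common in structure-function studies.
F_obs = rng.random((n, n)); F_obs = (F_obs + F_obs.T) / 2
iu = np.triu_indices(n, k=1)
print(np.corrcoef(F_pred[iu], F_obs[iu])[0, 1])
```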

  3. Use of the catena principle in geomorphological impact assessment: a functional approach

    NARCIS (Netherlands)

    Wolfert, H.P.

    1995-01-01

    An integral method for assessing geomorphological landscape qualities is presented, to be used in environmental impact assessments. Five groups of landform functions are distinguished in the Netherlands, an area of low relief: orientation functions, information functions, ordering functions,

  4. Direct assessment of lung function in COPD using CT densitometric measures

    International Nuclear Information System (INIS)

    Gu, Suicheng; Leader, Joseph; Gur, David; Pu, Jiantao; Zheng, Bin; Chen, Qihang; Sciurba, Frank; Kminski, Naftali

    2014-01-01

    To investigate whether lung function in patients with chronic obstructive pulmonary disease (COPD) can be directly predicted using CT densitometric measures and assess the underlying prediction errors as compared with the traditional spirometry-based measures. A total of 600 CT examinations were collected from a COPD study. In addition to the entire lung volume, the extent of emphysema depicted in each CT examination was quantified using density mask analysis (densitometry). The partial least square regression was used for constructing the prediction model, where a repeated random split-sample validation was employed. For each split, we randomly selected 400 CT exams for training (regression) purpose and the remaining 200 exams for assessing performance in prediction of lung function (e.g., FEV1 and FEV1/FVC) and disease severity. The absolute and percentage errors as well as their standard deviations were computed. The averaged percentage errors in prediction of FEV1, FEV1/FVC%, TLC, RV/TLC% and DLco% predicted were 33%, 17%, 9%, 18% and 23%, respectively. When classifying the exams in terms of disease severity grades using the CT measures, 37% of the subjects were correctly classified with no error and 83% of the exams were either correctly classified or classified into immediate neighboring categories. The linear weighted kappa and quadratic weighted kappa were 0.54 (moderate agreement) and 0.72 (substantial agreement), respectively. Despite the existence of certain prediction errors in quantitative assessment of lung function, the CT densitometric measures could be used to relatively reliably classify disease severity grade of COPD patients in terms of GOLD. (paper)
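
    The prediction pipeline sketched in the abstract, densitometric features regressed onto a lung-function index with partial least squares under a repeated random split, looks roughly like the following scikit-learn code; the simulated features and target are placeholders for the CT measures and PFT indices, and the number of PLS components is an assumption.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_exams, n_features = 600, 12                  # e.g. lung volume plus emphysema scores
X = rng.normal(size=(n_exams, n_features))     # CT densitometric measures (simulated)
y = 2.5 + 0.2 * X @ rng.normal(size=n_features) + rng.normal(scale=0.3, size=n_exams)  # e.g. FEV1 (L)

pct_errors = []
for split in range(20):                        # repeated random split-sample validation
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=400, random_state=split)
    model = PLSRegression(n_components=4).fit(X_tr, y_tr)
    y_hat = model.predict(X_te).ravel()
    pct_errors.append(np.mean(np.abs(y_hat - y_te) / np.abs(y_te)))

print(f"average percentage error: {100 * np.mean(pct_errors):.1f}%")
```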

  5. Biodiversity, ecosystem functions and services in environmental risk assessment: introduction to the special issue.

    Science.gov (United States)

    Schäfer, Ralf B

    2012-01-15

    This Special Issue focuses on the questions of whether and how biodiversity, ecosystem functions and the resulting services could be incorporated into Ecological Risk Assessment (ERA). To this end, three articles provide a framework for the integration of ecosystem services into the ERA of soils, sediments and pesticides. Further articles demonstrate ways in which stakeholders can be integrated into an ecosystem service-based ERA for soils and describe how current monitoring could be adapted to new assessment endpoints that are directly linked to ecosystem services. Case studies show that the current ERA may not be protective for biodiversity, ecosystem functions and the resulting services, and that both pesticides and salinity currently adversely affect ecosystem functions in the field. Moreover, ecological models can be used for the prediction of new protection goals and could ultimately support their implementation into the ERA. Overall, the Special Issue stresses the urgent need to enhance current ERA procedures if biodiversity, ecosystem functions and the resulting services are to be protected. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Right ventricular function: methodologic and clinical considerations in noninvasive scintigraphic assessment

    International Nuclear Information System (INIS)

    Manno, B.V.; Iskandrian, A.S.; Hakki, A.H.

    1984-01-01

    Right ventricular function plays an important role in many cardiac disorders. Changes in left ventricular function, right ventricular afterload and preload, cardiac medications and ischemia may affect right ventricular function. Radionuclide ventriculography permits quantitative assessment of regional and global function of the right ventricle. This assessment can be made at rest, during exercise or after pharmacologic interventions. The overlap between right ventricle and right atrium is a major limitation for gated scintigraphic techniques. The use of imaging with newer short-lived radionuclides may permit more accurate and reproducible assessment of right ventricular function by means of the first pass method. Further work in areas related to improvement of techniques and the impact of right ventricular function on prognosis is needed

  7. Functional State Modelling of Saccharomyces cerevisiae Cultivations

    Directory of Open Access Journals (Sweden)

    Iasen Hristozov

    2004-10-01

    The implementation of the functional state approach for modelling of yeast cultivation is considered in this paper. This concept helps in the monitoring and control of complex processes such as bioprocesses. Using the functional state modelling approach for fermentation processes aims to overcome the main disadvantages of using a global process model, namely a complex model structure and a large number of model parameters. The main advantage of functional state modelling is that the parameters of each local model can be estimated separately from the parameters of the other local models. The results achieved from batch, as well as from fed-batch, cultivations are presented.

  8. Cognitive functioning and behaviour of epileptic children in parents' assessment.

    Science.gov (United States)

    Talarska, Dorota; Steinborn, Barbara; Michalak, Michał

    2011-01-01

    Cognitive functioning and behaviour of chronically ill children are affected by many factors, including anxiety due to hospitalization, persistent symptoms of sickness and adverse side effects of medications. The aim of this work was to seek out parents' opinion concerning cognitive functioning and behaviour of children with epilepsy. The study comprised 156 children with epilepsy aged 7-18 and treated in the Department of Developmental Neurology at Karol Marcinkowski Poznan University of Medical Sciences and in an outpatient clinic. The research tool used was the questionnaire Quality of Life in Childhood Epilepsy (QOLCE) completed by parents. Assessment of cognitive functioning and behaviour was based on the analysis of the areas V (cognitive processes) and VII (behaviour). Parents assessed children's functioning in the areas of cognitive processes and behaviour at a similar level - 55 points. In the area of cognitive processes, concentration while performing some tasks and reading was assessed as the worst. A significant difference in caregivers' assessment was found according to age, frequency of seizures and duration of disease. In the area analysing the child's behaviour, parents indicated getting angry easily and not being upset by other people's opinions. The display of aggression towards others got the lowest number of comments. The children's functioning was assessed by parents as rather poor in both analysed areas. Parents of children treated with polytherapy noticed more difficulties in cognitive functioning and behaviour than parents of children treated with one medication.

  9. Reliability of the Hazelbaker Assessment Tool for Lingual Frenulum Function

    Directory of Open Access Journals (Sweden)

    James Jennifer P

    2006-03-01

    Background About 3% of infants are born with a tongue-tie, which may lead to breastfeeding problems such as ineffective latch, painful attachment or poor weight gain. The Hazelbaker Assessment Tool for Lingual Frenulum Function (HATLFF) has been developed to give a quantitative assessment of the tongue-tie and a recommendation about frenotomy (release of the frenulum). The aim of this study was to assess the inter-rater reliability of the HATLFF. Methods Fifty-eight infants referred to the Breastfeeding Education and Support Services (BESS) at The Royal Women's Hospital for assessment of tongue-tie and 25 control infants were assessed by two clinicians independently. Results The Appearance items received kappas between about 0.4 and 0.6, which represents "moderate" reliability. The first three Function items (lateralization, lift and extension of tongue) had kappa values over 0.65, which indicates "substantial" agreement. The four Function items relating to infant sucking (spread, cupping, peristalsis and snapback) received low kappa values with insignificant p values. There was 96% agreement between the two assessors on the recommendation for frenotomy (kappa 0.92, excellent agreement). The study found that the Function Score can be more simply assessed using only the first three function items (i.e., not scoring the sucking items), with a cut-off of ≤4 for recommendation of frenotomy. Conclusion We found that the HATLFF has high reliability in a study of infants with tongue-tie and control infants.

  10. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    Science.gov (United States)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats well the uncertainty in the extreme flows of hydrological models' simulations. This study proposes a Bayesian modularization approach to uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian models: the AR(1) plus Normal and time-period-independent model (Model 1), the AR(1) plus Normal and time-period-dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of the entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
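
    For readers unfamiliar with the likelihood variants listed, the AR(1)-plus-Normal error model underlying Models 1 and 2 can be written generically as below; the notation is mine, and the time-period dependence of the variance in Model 2 is only indicated schematically.

```latex
% Residuals between observed and simulated discharge, with AR(1) structure:
\[
\varepsilon_t = Q^{\mathrm{obs}}_t - Q^{\mathrm{sim}}_t(\theta),
\qquad
\varepsilon_t = \phi\,\varepsilon_{t-1} + \eta_t,
\qquad
\eta_t \sim \mathcal{N}\bigl(0,\sigma_t^2\bigr).
\]
% Model 1: \sigma_t = \sigma (time-period independent);
% Model 2: \sigma_t varies with the time period (e.g. season or flow regime).
% The resulting log-likelihood, conditional on \varepsilon_0, is
\[
\ell(\theta,\phi,\sigma) = -\tfrac{1}{2}\sum_{t=1}^{T}
\left[ \log\bigl(2\pi\sigma_t^{2}\bigr)
+ \frac{\bigl(\varepsilon_t - \phi\,\varepsilon_{t-1}\bigr)^{2}}{\sigma_t^{2}} \right].
\]
```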

  11. Bayesian inference in an item response theory model with a generalized student t link function

    Science.gov (United States)

    Azevedo, Caio L. N.; Migon, Helio S.

    2012-10-01

    In this paper we introduce a new item response theory (IRT) model with a generalized Student t-link function with unknown degrees of freedom (df), named the generalized t-link (GtL) IRT model. In this model we consider only the difficulty parameter in the item response function. GtL is an alternative to the two-parameter logit and probit models, since the degrees of freedom (df) play a similar role to the discrimination parameter. However, the behavior of the GtL curves is different from that of the two-parameter models and the usual Student t link, since in GtL the curves obtained with different df values can cross the probit curves at more than one latent trait level. The GtL model has properties similar to the generalized linear mixed models, such as the existence of sufficient statistics and easy parameter interpretation. Also, many techniques of parameter estimation, model fit assessment and residual analysis developed for those models can be used for the GtL model. We develop fully Bayesian estimation and model fit assessment tools through a Metropolis-Hastings step within a Gibbs sampling algorithm. We consider prior sensitivity with respect to the choice for the degrees of freedom. The simulation study indicates that the algorithm recovers all parameters properly. In addition, some Bayesian model fit assessment tools are considered. Finally, a real data set is analyzed using our approach and other usual models. The results indicate that our model fits the data better than the two-parameter models.
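
    Since only the difficulty parameter enters the item response function, the GtL response curve can be sketched as a Student-t CDF evaluated at (theta - difficulty), with the degrees of freedom playing a discrimination-like role; this exact form is an assumption made for illustration, and the function name and parameter values below are not from the paper.

```python
import numpy as np
from scipy.stats import t, norm

def gtl_irf(theta, difficulty, df):
    """Sketch of an item response function with a Student-t link:
    P(correct | theta) = F_df(theta - difficulty)."""
    return t.cdf(theta - difficulty, df=df)

theta = np.linspace(-4, 4, 9)   # latent trait levels
b = 0.0                          # item difficulty (illustrative)
for df in (1, 4, 30):
    print(f"df={df:2d}:", np.round(gtl_irf(theta, b, df), 3))
print("probit:", np.round(norm.cdf(theta - b), 3))  # limiting curve as df -> infinity
```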

  12. Correlation functions of the Ising model and the eight-vertex model

    International Nuclear Information System (INIS)

    Ko, L.F.

    1986-01-01

    Calculations of the two-point correlation functions in the scaling limit for two statistical models are presented. In Part I, the Ising model with a linear defect is studied for T < T_c and T > T_c. The transfer matrix method of Onsager and Kaufman is used. The energy-density correlation is given by functions related to the modified Bessel functions. The dispersion expansions for the spin-spin correlation functions are derived. The dominant behavior for large separations at T not equal to T_c is extracted. It is shown that these expansions lead to systems of Fredholm integral equations. In Part II, the electric correlation function of the eight-vertex model for T < T_c is studied. The eight-vertex model decouples into two independent Ising models when the four-spin coupling vanishes. To first order in the four-spin coupling, the electric correlation function is related to a three-point function of the Ising model. This relation is systematically investigated and the full dispersion expansion (to first order in the four-spin coupling) is obtained. The result is a new kind of structure which, unlike those of many solvable models, is apparently not expressible in terms of linear integral equations.

  13. Variance Function Partially Linear Single-Index Models.

    Science.gov (United States)

    Lian, Heng; Liang, Hua; Carroll, Raymond J

    2015-01-01

    We consider heteroscedastic regression models where the mean function is a partially linear single index model and the variance function depends upon a generalized partially linear single index model. We do not insist that the variance function depend only upon the mean function, as happens in the classical generalized partially linear single index model. We develop efficient and practical estimation methods for the variance function and for the mean function. Asymptotic theory for the parametric and nonparametric parts of the model is developed. Simulations illustrate the results. An empirical example involving ozone levels is used to further illustrate the results, and is shown to be a case where the variance function does not depend upon the mean function.

  14. Functional outcome measures in a surgical model of hip osteoarthritis in dogs

    OpenAIRE

    Little, Dianne; Johnson, Stephen; Hash, Jonathan; Olson, Steven A.; Estes, Bradley T.; Moutos, Franklin T.; Lascelles, B. Duncan X.; Guilak, Farshid

    2016-01-01

    Background The hip is one of the most common sites of osteoarthritis in the body, second only to the knee in prevalence. However, current animal models of hip osteoarthritis have not been assessed using many of the functional outcome measures used in orthopaedics, a characteristic that could increase their utility in the evaluation of therapeutic interventions. The canine hip shares similarities with the human hip, and functional outcome measures are well documented in veterinary medicine, pr...

  15. Validation and clinical significance of the Childhood Myositis Assessment Scale for assessment of muscle function in the juvenile idiopathic inflammatory myopathies.

    Science.gov (United States)

    Huber, Adam M; Feldman, Brian M; Rennebohm, Robert M; Hicks, Jeanne E; Lindsley, Carol B; Perez, Maria D; Zemel, Lawrence S; Wallace, Carol A; Ballinger, Susan H; Passo, Murray H; Reed, Ann M; Summers, Ronald M; White, Patience H; Katona, Ildy M; Miller, Frederick W; Lachenbruch, Peter A; Rider, Lisa G

    2004-05-01

    To examine the measurement characteristics of the Childhood Myositis Assessment Scale (CMAS) in children with juvenile idiopathic inflammatory myopathy (juvenile IIM), and to obtain preliminary data on the clinical significance of CMAS scores. One hundred eight children with juvenile IIM were evaluated on 2 occasions, 7-9 months apart, using various measures of physical function, strength, and disease activity. Interrater reliability, construct validity, and responsiveness of the CMAS were examined. The minimum clinically important difference (MID) and CMAS scores corresponding to various degrees of physical disability were estimated. The intraclass correlation coefficient for 26 patients assessed by 2 examiners was 0.89, indicating very good interrater reliability. The CMAS score correlated highly with the Childhood Health Assessment Questionnaire (C-HAQ) score and with findings on manual muscle testing (MMT) (r(s) = -0.73 and 0.73, respectively) and moderately with physician-assessed global disease activity and skin activity, parent-assessed global disease severity, and muscle magnetic resonance imaging (r(s) = -0.44 to -0.61), thereby demonstrating good construct validity. The standardized response mean was 0.81 (95% confidence interval 0.53, 1.09) in patients with at least 0.8 cm improvement on a 10-cm visual analog scale for physician-assessed global disease activity, indicating strong responsiveness. In bivariate regression models predicting physician-assessed global disease activity, MMT remained significant in models containing the CMAS (P = 0.03) while the C-HAQ did not (P = 0.4). Estimates of the MID ranged from 1.5 to 3.0 points on a 0-52-point scale. CMAS scores corresponding to no, mild, mild-to-moderate, and moderate physical disability, respectively, were 48, 45, 39, and 30. The CMAS exhibits good reliability, construct validity, and responsiveness, and is therefore a valid instrument for the assessment of physical function, muscle strength, and

  16. Assessment of personality-related levels of functioning: A pilot study of clinical assessment of the DSM-5 Level of Personality Functioning based on a semi-structured interview

    DEFF Research Database (Denmark)

    Thylstrup, Birgitte; Simonsen, Sebastian; Nemery, Caroline

    2016-01-01

    The aim of this pilot study was to test the Clinical Assessment of the Level of Personality Functioning Scale [CALF], a semi-structured clinical interview, designed to assess the Level of Personality Functioning Scale of the DSM-5 (Section III) by applying strategies similar to what characterizes assessments in clinical practice.... Methods: The inter-rater reliability of the assessment of the four domains and the total impairment in the Level of Personality Functioning Scale were measured in a patient sample that varied in terms of severity and type of pathology. Ratings were done independently by the interviewer and two experts who... watched a videotaped interview. Results: Inter-rater reliability coefficients varied between domains and were not sufficient for clinical practice, but may support the use of the interview to assess the dimensions of personality functioning for research purposes. Conclusions: While designed to measure...

  17. Functionalized anatomical models for EM-neuron Interaction modeling

    Science.gov (United States)

    Neufeld, Esra; Cassará, Antonino Mario; Montanaro, Hazael; Kuster, Niels; Kainz, Wolfgang

    2016-06-01

    The understanding of interactions between electromagnetic (EM) fields and nerves is crucial in contexts ranging from therapeutic neurostimulation to low-frequency EM exposure safety. To properly consider the impact of in vivo induced field inhomogeneity on non-linear neuronal dynamics, coupled EM-neuronal dynamics modeling is required. For that purpose, novel functionalized computable human phantoms have been developed. Their implementation and the systematic verification of the integrated anisotropic quasi-static EM solver and neuronal dynamics modeling functionality, based on the method of manufactured solutions and numerical reference data, are described. Electric and magnetic stimulation of the ulnar and sciatic nerves were modeled to help understand a range of controversial issues related to the magnitude and optimal determination of strength-duration (SD) time constants. The results indicate the importance of considering the stimulation-specific inhomogeneous field distributions (especially at tissue interfaces), realistic models of non-linear neuronal dynamics, very short pulses, and suitable SD extrapolation models. These results and the functionalized computable phantom will influence and support the development of safe and effective neuroprosthetic devices and novel electroceuticals. Furthermore, they will assist the evaluation of existing low-frequency exposure standards for the entire population under all exposure conditions.
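
    For context on the strength-duration (SD) controversy mentioned above, the two classical SD relations from which a time constant is usually extracted are stated below in generic notation; fitting the same threshold data with the two forms can give different time constants, which is part of why their optimal determination is debated.

```latex
% Weiss (linear-charge) form and Lapicque (exponential) form, generic notation:
\[
I_{\mathrm{th}}(d) \;=\; I_{\mathrm{rh}}\left(1 + \frac{\tau_{\mathrm{SD}}}{d}\right),
\qquad
I_{\mathrm{th}}(d) \;=\; \frac{I_{\mathrm{rh}}}{1 - e^{-d/\tau_{\mathrm{SD}}}} ,
\]
% d: pulse duration, I_th: threshold current, I_rh: rheobase,
% \tau_SD: strength-duration time constant.
```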

  18. Function modeling: improved raster analysis through delayed reading and function raster datasets

    Science.gov (United States)

    John S. Hogland; Nathaniel M. Anderson; J .Greg Jones

    2013-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...

  19. UPPER LIMB FUNCTIONAL ASSESSMENT USING HAPTIC INTERFACE

    Directory of Open Access Journals (Sweden)

    Aleš Bardorfer

    2004-12-01

    A new method for the assessment of the upper limb (UL) functional state, using a haptic interface, is presented. A haptic interface is used as a measuring device, capable of providing objective, repeatable and quantitative data on UL motion. A patient is presented with a virtual environment, both graphically via a computer screen and haptically via the Phantom Premium 1.5 haptic interface. The setup allows the patient to explore and feel the virtual environment with three of his/her senses: sight, hearing, and, most important, touch. Specially designed virtual environments are used to assess the patient’s UL movement capabilities. The tests range from tracking tasks (to assess the accuracy of movement), and tracking tasks with added disturbances in the form of random forces (to assess the patient’s control abilities), to a labyrinth test (to assess both speed and accuracy), and a final test measuring the maximal force capacity of the UL. A comprehensive study, using the developed measurement setup within the

  20. Longitudinal assessment of endothelial function in the microvasculature of mice in-vivo.

    Science.gov (United States)

    Belch, Jill J F; Akbar, Naveed; Alapati, Venkateswara; Petrie, John; Arthur, Simon; Khan, Faisel

    2013-01-01

    Endothelial dysfunction is associated with early development of cardiovascular disease, making longitudinal measurements desirable. We devised a protocol using laser Doppler imaging (LDI) and iontophoresis of acetylcholine (ACh) and sodium nitroprusside (SNP) to assess the skin microcirculation longitudinally in mice every 4 weeks for 24 weeks in two groups of C57BL/6 mice, chow versus high-cholesterol diet (known to induce endothelial dysfunction). LDI measurements were compared with vascular function (isometric tension) measured using wire myography in the tail artery in response to ACh and SNP. Microvascular responses to ACh were significantly reduced in cholesterol-fed versus chow-fed mice from week 4 onwards. Mice treated with the nitric oxide synthase inhibitor N(G)-nitro-L-arginine methyl ester hydrochloride (L-NAME) showed a significant reduction in ACh response compared with vehicle-treated animals (P<0.05) at baseline and at 12 weeks. In cholesterol-fed mice, ACh responses were 226 ± 21 and 180 ± 21 AU (P=0.03) before and after L-NAME, respectively. A reduction in ex-vivo ACh response was detected in the tail artery in cholesterol-fed mice, and a significant correlation found between peak microvascular ACh response and maximum ACh response in the tail artery (r=0.699, P=0.017). No changes were found in SNP responses in the microvasculature or tail artery. Using this protocol, we have shown longitudinal decreases in microvascular endothelial function to cholesterol feeding. L-NAME studies confirm that the reduced vasodilatation to ACh in cholesterol-fed mice was mediated partly through reduced NO bioavailability. Wire myography of tail arteries confirmed that in-vivo measurements of microvascular function reflect ex-vivo vascular function in other beds. Longitudinal assessments of skin microvascular function in mice could provide a useful translatable model for assessing early endothelial dysfunction. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. Ultrasonographic Assessment of Diaphragm Function in Critically Ill Subjects.

    Science.gov (United States)

    Umbrello, Michele; Formenti, Paolo

    2016-04-01

    The majority of patients admitted to the ICU require mechanical ventilation as a part of their process of care. However, mechanical ventilation itself or the underlying disease can lead to dysfunction of the diaphragm, a condition that may contribute to the failure of weaning from mechanical ventilation. Moreover, extended time on the ventilator increases health-care costs and greatly increases patient morbidity and mortality. Yet symptoms and signs of muscle disease in a bedridden (or bed rest-only) ICU patient are often difficult to assess because of concomitant confounding factors. Conventional assessment of diaphragm function lacks specific, noninvasive, time-saving, and easily performed bedside tools or requires patient cooperation. Recently, the use of ultrasound has attracted great interest as a simple, noninvasive method of quantification of diaphragm contractile activity. In this review, we discuss the physiology and the relevant pathophysiology of diaphragm function, and we summarize the recent findings concerning the evaluation of its (dys)function in critically ill patients, with a special focus on the role of ultrasound. We describe how to assess diaphragm excursion and diaphragm thickening during breathing and the meaning of these measurements under spontaneous or mechanical ventilation, as well as the reference values in health and disease. The spread of ultrasonographic assessment of diaphragm function may result in timely identification of patients with diaphragm dysfunction and a potential improvement in the assessment of recovery from diaphragm weakness. Copyright © 2016 by Daedalus Enterprises.

  2. Association between changes on the Negative Symptom Assessment scale (NSA-16) and measures of functional outcome in schizophrenia.

    Science.gov (United States)

    Velligan, Dawn I; Alphs, Larry; Lancaster, Scott; Morlock, Robert; Mintz, Jim

    2009-09-30

    We examined whether changes in negative symptoms, as measured by scores on the 16-item Negative Symptom Assessment scale (NSA-16), were associated with changes in functional outcome. A group of 125 stable outpatients with schizophrenia were assessed at baseline and at 6 months using the NSA-16, the Brief Psychiatric Rating Scale, and multiple measures of functional outcome. Baseline adjusted regression coefficients indicated moderate correlations between negative symptoms and functional outcomes when baseline values of both variables were controlled. Results were nearly identical when we controlled for positive symptoms. Cross-lag panel correlations and Structural Equation Modeling were used to examine whether changes in negative symptoms drove changes in functional outcomes over time. Results indicated that negative symptoms drove the changes in the Social and Occupational Functioning Scale (SOFAS) rather than the reverse. Measures of Quality of Life and measures of negative symptoms may be assessing overlapping constructs or changes in both may be driven by a third variable. Negative symptoms were unrelated over time to scores on a performance-based measure of functional capacity. This study indicates that the relationship between negative symptom change and the change in functional outcomes is complex, and points to potential issues in selection of assessments.

  3. Exploitation of geoinformatics at modelling of functional effects of forest functions

    International Nuclear Information System (INIS)

    Sitko, R.

    2005-01-01

    From the point of view of spatial modelling, geoinformatics has wide application to the group of ecological functions of the forest, because these directly depend on the natural conditions of the site. The modelling application was realised on the territory of TANAP (Tatras National Park), West Tatras, in the Liptovske Kopy area. This territory covers about 4,900 hectares, and its forests serve above all significant ecological functions, namely soil protection against erosion, water management, and anti-avalanche protection. Among the environmental functions, they provide the recreational role of the forest and the function of nature protection. The anti-avalanche and anti-erosion functions of the forest are evaluated in this presentation

  4. Correlation of 68Ga Ventilation-Perfusion PET/CT with Pulmonary Function Test Indices for Assessing Lung Function.

    Science.gov (United States)

    Le Roux, Pierre-Yves; Siva, Shankar; Steinfort, Daniel P; Callahan, Jason; Eu, Peter; Irving, Lou B; Hicks, Rodney J; Hofman, Michael S

    2015-11-01

    Pulmonary function tests (PFTs) are routinely used to assess lung function, but they do not provide information about regional pulmonary dysfunction. We aimed to assess correlation of quantitative ventilation-perfusion (V/Q) PET/CT with PFT indices. Thirty patients underwent V/Q PET/CT and PFT. Respiration-gated images were acquired after inhalation of (68)Ga-carbon nanoparticles and administration of (68)Ga-macroaggregated albumin. Functional volumes were calculated by dividing the volume of normal ventilated and perfused (%NVQ), unmatched and matched defects by the total lung volume. These functional volumes were correlated with forced expiratory volume in 1 s (FEV1), forced vital capacity (FVC), FEV1/FVC, and diffusing capacity for carbon monoxide (DLCO). All functional volumes were significantly different in patients with chronic obstructive pulmonary disease (P volume of unmatched defects (r = -0.55). Considering %NVQ only, a cutoff value of 90% correctly categorized 28 of 30 patients with or without significant pulmonary function impairment. Our study demonstrates strong correlations between V/Q PET/CT functional volumes and PFT parameters. Because V/Q PET/CT is able to assess regional lung function, these data support the feasibility of its use in radiation therapy and preoperative planning and assessing pulmonary dysfunction in a variety of respiratory diseases. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
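
    The functional-volume idea is simple to state in code: count the voxels that are both normally ventilated and normally perfused, then divide by the total lung volume. The sketch below does exactly that on random arrays; the threshold used to define "normal" uptake and the array sizes are placeholders, not the paper's criteria.

```python
import numpy as np

def percent_nvq(ventilation, perfusion, lung_mask, threshold):
    """Percent of lung volume that is normally ventilated AND perfused (%NVQ).

    ventilation / perfusion: voxel-wise uptake maps; lung_mask: boolean lung mask.
    The 'normal' threshold here is illustrative only.
    """
    normal_v = ventilation > threshold
    normal_q = perfusion > threshold
    nvq_voxels = np.count_nonzero(normal_v & normal_q & lung_mask)
    total_voxels = np.count_nonzero(lung_mask)
    return 100.0 * nvq_voxels / total_voxels

rng = np.random.default_rng(4)
shape = (64, 64, 48)
lung = np.ones(shape, dtype=bool)             # stand-in for a segmented lung mask
print(percent_nvq(rng.random(shape), rng.random(shape), lung, threshold=0.3))
```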

  5. Self Assessment in Schizophrenia: Accuracy of Evaluation of Cognition and Everyday Functioning

    Science.gov (United States)

    Gould, Felicia; McGuire, Laura Stone; Durand, Dante; Sabbag, Samir; Larrauri, Carlos; Patterson, Thomas L.; Twamley, Elizabeth W.; Harvey, Philip D.

    2015-01-01

    Objective Self-assessment deficits, often referred to as impaired insight or unawareness of illness, are well established in people with schizophrenia. There are multiple levels of awareness, including awareness of symptoms, functional deficits, cognitive impairments, and the ability to monitor cognitive and functional performance in an ongoing manner. The present study aimed to evaluate the comparative predictive value of each aspect of awareness on the levels of everyday functioning in people with schizophrenia. Method We examined multiple aspects of self-assessment of functioning in 214 people with schizophrenia. We also collected information on everyday functioning rated by high contact clinicians and examined the importance of self-assessment for the prediction of real world functional outcomes. The relative impact of performance based measures of cognition, functional capacity, and metacognitive performance on everyday functioning was also examined. Results Misestimation of ability emerged as the strongest predictor of real world functioning and exceeded the influences of cognitive performance, functional capacity performance, and performance-based assessment of metacognitive monitoring. The relative contribution of the factors other than self-assessment varied according to which domain of everyday functioning was being examined, but in all cases, accounted for less predictive variance. Conclusions These results underscore the functional impact of misestimating one’s current functioning and relative level of ability. These findings are consistent with the use of insight-focused treatments and compensatory strategies designed to increase self-awareness in multiple functional domains. PMID:25643212

  6. Self-assessment in schizophrenia: Accuracy of evaluation of cognition and everyday functioning.

    Science.gov (United States)

    Gould, Felicia; McGuire, Laura Stone; Durand, Dante; Sabbag, Samir; Larrauri, Carlos; Patterson, Thomas L; Twamley, Elizabeth W; Harvey, Philip D

    2015-09-01

    Self-assessment deficits, often referred to as impaired insight or unawareness of illness, are well established in people with schizophrenia. There are multiple levels of awareness, including awareness of symptoms, functional deficits, cognitive impairments, and the ability to monitor cognitive and functional performance in an ongoing manner. The present study aimed to evaluate the comparative predictive value of each aspect of awareness on the levels of everyday functioning in people with schizophrenia. We examined multiple aspects of self-assessment of functioning in 214 people with schizophrenia. We also collected information on everyday functioning rated by high contact clinicians and examined the importance of self-assessment for the prediction of real-world functional outcomes. The relative impact of performance-based measures of cognition, functional capacity, and metacognitive performance on everyday functioning was also examined. Misestimation of ability emerged as the strongest predictor of real-world functioning and exceeded the influences of cognitive performance, functional capacity performance, and performance-based assessment of metacognitive monitoring. The relative contribution of the factors other than self-assessment varied according to which domain of everyday functioning was being examined, but, in all cases, accounted for less predictive variance. These results underscore the functional impact of misestimating one's current functioning and relative level of ability. These findings are consistent with the use of insight-focused treatments and compensatory strategies designed to increase self-awareness in multiple functional domains. (c) 2015 APA, all rights reserved).

  7. Computational modeling of the structure-function relationship in human placental terminal villi.

    OpenAIRE

    Plitman, Mayo R; Olsthoorn, Jason; Charnock-Jones, David Stephen; Burton, Graham James; Oyen, Michelle Lynn

    2016-01-01

    Placental oxygen transport takes place at the final branches of the villous tree and is dictated by the relative arrangement of the maternal and fetal circulations. Modeling techniques have failed to accurately assess the structure-function relationship in the terminal villi due to the geometrical complexity. Three-dimensional blood flow and oxygen transport were modeled in four terminal villi reconstructed from confocal image stacks. The blood flow was analyzed along the center lines of capil...

  8. Bridging the gap between measurements and modelling: a cardiovascular functional avatar.

    Science.gov (United States)

    Casas, Belén; Lantz, Jonas; Viola, Federica; Cedersund, Gunnar; Bolger, Ann F; Carlhäll, Carl-Johan; Karlsson, Matts; Ebbers, Tino

    2017-07-24

    Lumped parameter models of the cardiovascular system have the potential to assist researchers and clinicians to better understand cardiovascular function. The value of such models increases when they are subject specific. However, most approaches to personalize lumped parameter models have thus far required invasive measurements or fall short of being subject specific due to a lack of the necessary clinical data. Here, we propose an approach to personalize parameters in a model of the heart and the systemic circulation using exclusively non-invasive measurements. The personalized model is created using flow data from four-dimensional magnetic resonance imaging and cuff pressure measurements in the brachial artery. We term this personalized model the cardiovascular avatar. In our proof-of-concept study, we evaluated the capability of the avatar to reproduce pressures and flows in a group of eight healthy subjects. Both quantitatively and qualitatively, the model-based results agreed well with the pressure and flow measurements obtained in vivo for each subject. This non-invasive and personalized approach can synthesize medical data into clinically relevant indicators of cardiovascular function, and estimate hemodynamic variables that cannot be assessed directly from clinical measurements.
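
    Lumped parameter circulation models can be as small as a two-element Windkessel, which already illustrates the kind of ordinary differential equation whose parameters get personalized from flow and pressure data. The sketch below integrates one with an assumed half-sine ejection waveform; the resistance, compliance and flow values are purely illustrative and this is not the avatar model itself.

```python
import numpy as np
from scipy.integrate import solve_ivp

R, C = 1.0, 1.5          # peripheral resistance (mmHg*s/mL) and compliance (mL/mmHg), illustrative
T = 0.8                  # cardiac period (s)

def inflow(t):
    """Half-sine ejection during systole (first 0.3 s), zero in diastole."""
    tc = t % T
    return 400.0 * np.sin(np.pi * tc / 0.3) if tc < 0.3 else 0.0

def windkessel(t, p):
    # Two-element Windkessel: C * dP/dt = Q_in(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

sol = solve_ivp(windkessel, (0.0, 10 * T), [80.0], max_step=1e-3)
pressure = sol.y[0][sol.t > 9 * T]            # last beat, after initial transients
print(f"systolic ~{pressure.max():.0f} mmHg, diastolic ~{pressure.min():.0f} mmHg")
```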

  9. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these errors upon flood prediction. Assessment of SRTM error was carried out by Rodriguez et al (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models - the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for 3 deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using LISFLOOD-FP, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product of this is an inundation map with the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power, but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
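
    The perturbation step, drawing spatially correlated vertical-error fields and adding them to the DEM to build a catalogue of plausible terrains, can be sketched with a multivariate normal whose covariance decays exponentially with distance; the grid size, error standard deviation and correlation length below are placeholders rather than values estimated in the study.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 40                                    # grid is n x n cells (kept small: dense covariance)
cell = 90.0                               # SRTM-like cell size in metres
sigma, corr_len = 3.0, 500.0              # error std dev (m) and correlation length (m), placeholders

# Pairwise distances between cell centres
yy, xx = np.meshgrid(np.arange(n) * cell, np.arange(n) * cell, indexing="ij")
coords = np.column_stack([yy.ravel(), xx.ravel()])
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

cov = sigma**2 * np.exp(-d / corr_len)    # exponential covariance model
cov += 1e-6 * np.eye(n * n)               # small nugget for numerical stability
error_field = rng.multivariate_normal(np.zeros(n * n), cov).reshape(n, n)

dem = np.zeros((n, n))                    # stand-in for the SRTM tile
perturbed_dem = dem + error_field         # one member of the plausible-DEM catalogue
print(perturbed_dem.std())
```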

  10. Assessment of adrenal function in liver diseases

    Directory of Open Access Journals (Sweden)

    Sandeep Kharb

    2013-01-01

    Background: In recent times, there are reports of adrenal dysfunction in the whole spectrum of liver disease. Adrenal insufficiency (AI) has been shown to correlate with progression of liver disease. Hence this study was conducted to assess adrenal function in subjects with acute liver disease (ALD), chronic liver disease (CLD) and post liver transplantation (LT). Material and Methods: This study included 25 healthy controls, 25 patients with ALD, 20 subjects with CLD of Child-Pugh stage A (CLD-1) and 30 with Child-Pugh stage B or C (CLD-2), and 10 subjects with LT. All subjects were assessed clinically, biochemically and for adrenal functions. Results: AI was present in 9 (34.6%) patients with ALD, 20 (40%) patients with CLD and 4 (40%) subjects with LT. AI was more common in CLD-2 (18 patients, 60%) than CLD-1 (2 patients, 10%). All patients with chronic liver disease had significantly lower basal cortisol (8.8±4.8, P=0.01), stimulated cortisol (18.2±6.3, P<0.00001) and incremental cortisol (9.4±4.6, P<0.00001) as compared to controls. There was an increase in the percentage of subjects with adrenal dysfunction with progression of liver disease as assessed by Child-Pugh staging. AI was predicted by lower levels of serum protein, serum albumin, total cholesterol and HDL cholesterol, and higher levels of serum bilirubin and INR. Adrenal functions showed recovery following liver transplantation. Conclusions: AI forms an important part of the spectrum of acute and chronic liver disease. Deterioration of the synthetic functions of the liver predicts the presence of AI, and these patients should be evaluated for adrenal dysfunction periodically.

  11. The most frequently used tests for assessing executive functions in aging

    Directory of Open Access Journals (Sweden)

    Camila de Assis Faria

    There are numerous neuropsychological tests for assessing executive functions in aging, which vary according to the different domains assessed. OBJECTIVE: To present a systematic review of the most frequently used instruments for assessing executive functions in older adults with different educational levels in clinical and experimental research. METHODS: We searched for articles published in the last five years, using the PubMed database with the following terms: "neuropsychological tests", "executive functions", and "mild cognitive impairment". There was no language restriction. RESULTS: 25 articles fulfilled all the inclusion criteria. The seven neuropsychological tests most frequently used to evaluate executive functions in aging were: [1] Trail Making Test (TMT) Form B; [2] Verbal Fluency Test (VFT) - F, A and S; [3] VFT Animals category; [4] Clock Drawing Test (CDT); [5] Digits Forward and Backward subtests (WAIS-R or WAIS-III); [6] Stroop Test; and [7] Wisconsin Card Sorting Test (WCST) and its variants. The domains of executive functions most frequently assessed were: mental flexibility, verbal fluency, planning, working memory, and inhibitory control. CONCLUSION: The study identified the tests and domains of executive functions most frequently used in the last five years by research groups worldwide to evaluate older adults. These results can direct future research and help build evaluation protocols for assessing executive functions, taking into account the different educational levels and socio-demographic profiles of older adults in Brazil.

  12. Assessing functional diversity by program slicing

    International Nuclear Information System (INIS)

    Wallace, D.R.; Lyle, J.R.; Gallagher, K.B.; Ippolito, L.M.

    1994-01-01

    A responsibility of the Nuclear Regulatory Commission auditors is to provide assessments of the quality of the safety systems. For software, the audit process as currently implemented is a slow, tedious, manual process prone to human errors. While auditors cannot possibly examine all components of the system in complete detail, they do check for implementation of specific principles like functional diversity. This paper describes an experimental prototype Computer Aided Software Engineering (CASE) tool, UNRAVEL, designed to enable auditors to check for functional diversity and aid an auditor in examining software by extracting all code relevant to a computation identified for detailed inspection

  13. Tackling Biocomplexity with Meta-models for Species Risk Assessment

    Directory of Open Access Journals (Sweden)

    Philip J. Nyhus

    2007-06-01

    Full Text Available We describe results of a multi-year effort to strengthen consideration of the human dimension in endangered species risk assessments and to strengthen research capacity to understand biodiversity risk assessment in the context of coupled human-natural systems. A core group of social and biological scientists have worked with a network of more than 50 individuals from four countries to develop a conceptual framework illustrating how human-mediated processes influence biological systems and to develop tools to gather, translate, and incorporate these data into existing simulation models. A central theme of our research focused on (1) the difficulties often encountered in identifying and securing the diverse bodies of expertise and information that are necessary to adequately address complex species conservation issues; and (2) the development of quantitative simulation modeling tools that could explicitly link these datasets as a way to gain deeper insight into these issues. To address these important challenges, we promote a "meta-modeling" approach where computational links are constructed between discipline-specific models already in existence. In this approach, each model can function as a powerful stand-alone program, but interaction between applications is achieved by passing data structures describing the state of the system between programs. As one example of this concept, an integrated meta-model of wildlife disease and population biology is described. A goal of this effort is to improve science-based capabilities for decision making by scientists, natural resource managers, and policy makers addressing environmental problems in general, and focusing on biodiversity risk assessment in particular.

  14. Investigation of local load effect on damping characteristics of synchronous generator using transfer-function block-diagram model

    Directory of Open Access Journals (Sweden)

    Pichai Aree

    2005-07-01

    Full Text Available The transfer-function block-diagram model of single-machine infinite-bus power system has been a popular analytical tool amongst power engineers for explaining and assessing synchronous generator dynamic behaviors. In previous studies, the effects of local load together with damper circuit on generator damping have not yet been addressed because neither of them was integrated into this model. Since the model only accounts for the generator main field circuit, it may not always yield a realistic damping assessment due to lack of damper circuit representation. This paper presents an extended transfer-function block-diagram model, which includes one of the q-axis damper circuits as well as local load. This allows a more realistic investigation of the local load effect on the generator damping. The extended model is applied to assess the generator dynamic performance. The results show that the damping power components mostly derived from the q-axis damper and the field circuits can be improved according to the local load. The frequency response method is employed to carry out the fundamental analysis.

  15. Measurement of Function Post Hip Fracture: Testing a Comprehensive Measurement Model of Physical Function.

    Science.gov (United States)

    Resnick, Barbara; Gruber-Baldini, Ann L; Hicks, Gregory; Ostir, Glen; Klinedinst, N Jennifer; Orwig, Denise; Magaziner, Jay

    2016-07-01

    Measurement of physical function post hip fracture has been conceptualized using multiple different measures. This study tested a comprehensive measurement model of physical function. This was a descriptive secondary data analysis including 168 men and 171 women post hip fracture. Using structural equation modeling, a measurement model of physical function which included grip strength, activities of daily living, instrumental activities of daily living, and performance was tested for fit at 2 and 12 months post hip fracture, and among male and female participants. Validity of the measurement model of physical function was evaluated based on how well the model explained physical activity, exercise, and social activities post hip fracture. The measurement model of physical function fit the data. The amount of variance the model or individual factors of the model explained varied depending on the activity. Decisions about the ideal way in which to measure physical function should be based on outcomes considered and participants. The measurement model of physical function is a reliable and valid method to comprehensively measure physical function across the hip fracture recovery trajectory. © 2015 Association of Rehabilitation Nurses.

  16. Efficiency assessment models of higher education institution staff activity

    Directory of Open Access Journals (Sweden)

    K. A. Dyusekeyev

    2016-01-01

    Full Text Available The paper substantiates the necessity of improving the university staff incentive system under conditions of competition in the field of higher education, and the need to develop a separate model for evaluating the effectiveness of department heads. The authors analysed methods for assessing the production function of units and show the advantage of applying frontier (boundary) efficiency methods to economic structures in the field of higher education. The choice of the data envelopment analysis (DEA) method to solve the problem is justified, and a model for evaluating the activity of university departments on the basis of the DEA methodology has been developed. On the basis of the staff pay systems of universities operating in Russia, Kazakhstan and other countries, the structure of the criteria system for evaluating university staff activity has been designed. To clarify and specify the criteria for departmental efficiency, a strategic map has been developed that allowed the input and output parameters of the model to be determined. Using the DEA methodology makes it possible to take into account a large number of input and output parameters, increases the objectivity of the assessment by excluding experts, and yields interim data that identify the strengths and weaknesses of the evaluated object.
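
    As an illustration of the kind of calculation the DEA methodology performs, the sketch below solves the input-oriented CCR envelopment problem for a handful of hypothetical departments with scipy.optimize.linprog. The department data, the choice of inputs and outputs, and the function name are illustrative assumptions and do not reproduce the criteria system developed in the paper.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: rows are departments (DMUs).
        # Inputs: staff headcount, budget (thousands); outputs: publications, graduates.
        X = np.array([[12.0, 400.0],
                      [15.0, 520.0],
                      [ 9.0, 310.0],
                      [20.0, 650.0]])     # inputs, shape (n_units, n_inputs)
        Y = np.array([[30.0, 150.0],
                      [28.0, 140.0],
                      [25.0, 110.0],
                      [60.0, 260.0]])     # outputs, shape (n_units, n_outputs)

        def ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of unit o (1.0 = on the frontier)."""
            n, m = X.shape                 # number of units, number of inputs
            s = Y.shape[1]                 # number of outputs
            c = np.zeros(n + 1)            # decision variables: [theta, lambda_1..lambda_n]
            c[0] = 1.0                     # minimise theta
            A_ub, b_ub = [], []
            for i in range(m):             # sum_j lambda_j * x_ij <= theta * x_io
                A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
                b_ub.append(0.0)
            for r in range(s):             # sum_j lambda_j * y_rj >= y_ro
                A_ub.append(np.concatenate(([0.0], -Y[:, r])))
                b_ub.append(-Y[o, r])
            res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                          bounds=[(0, None)] * (n + 1), method="highs")
            return res.x[0]

        for o in range(X.shape[0]):
            print(f"Department {o}: CCR efficiency = {ccr_efficiency(X, Y, o):.3f}")

    A score of 1.0 marks a department on the efficiency frontier; lower values indicate the proportional input reduction that would be needed to reach the frontier given the observed outputs.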

  17. Complementary approaches to the assessment of personality disorder. The Personality Assessment Schedule and Adult Personality Functioning Assessment compared.

    Science.gov (United States)

    Hill, J; Fudge, H; Harrington, R; Pickles, A; Rutter, M

    2000-05-01

    Current concepts and measures of personality disorder are in many respects unsatisfactory. To establish agreement between two contrasting measures of personality disorder, and to compare subject-informant agreement on each. To examine the extent to which trait abnormality can be separated from interpersonal and social role dysfunction. Fifty-six subjects and their closest informants were interviewed and rated independently. Personality functioning was assessed using a modified Personality Assessment Schedule (M-PAS), and the Adult Personality Functioning Assessment (APFA). Subject-informant agreement on the M-PAS was moderately good, and agreement between the M-PAS and the APFA, across and within subjects and informants, was comparable to that for the M-PAS. This was equally the case when M-PAS trait plus impairment scores and trait abnormality scores were used. The M-PAS and the APFA are probably assessing similar constructs. Trait abnormalities occur predominantly in an interpersonal context and could be assessed within that context.

  18. Dynamic model for the assessment of radiological exposure to marine biota

    Energy Technology Data Exchange (ETDEWEB)

    Vives i Batlle, J. [Westlakes Scientific Consulting Ltd, The Princess Royal Building, Westlakes Science and Technology Park, Moor Row, Cumbria CA24 3LN (United Kingdom)], E-mail: jordi.vives@westlakes.ac.uk; Wilson, R.C.; Watts, S.J.; Jones, S.R.; McDonald, P.; Vives-Lynch, S. [Westlakes Scientific Consulting Ltd, The Princess Royal Building, Westlakes Science and Technology Park, Moor Row, Cumbria CA24 3LN (United Kingdom)

    2008-11-15

    A generic approach has been developed to simulate dynamically the uptake and turnover of radionuclides by marine biota. The approach incorporates a three-compartment biokinetic model based on first order linear kinetics, with interchange rates between the organism and its surrounding environment. Model rate constants are deduced as a function of known parameters: biological half-lives of elimination, concentration factors and a sample point of the retention curve, allowing for the representation of multi-component release. The new methodology has been tested and validated in respect of non-dynamic assessment models developed for regulatory purposes. The approach has also been successfully tested against research dynamic models developed to represent the uptake of technetium and radioiodine by lobsters and winkles. Assessments conducted on two realistic test scenarios demonstrated the importance of simulating time-dependency for ecosystems in which environmental levels of radionuclides are not in equilibrium.
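
    The deduction of rate constants from known parameters described above can be illustrated with a single-compartment simplification (the published approach uses three compartments). In the Python sketch below, the biological half-life, concentration factor and release scenario are hypothetical values chosen only to show the non-equilibrium behaviour the dynamic model is designed to capture.

        import numpy as np

        # One-compartment first-order biokinetics; parameter values are illustrative.
        T_BIOL_HALF = 30.0      # biological half-life of elimination (days)
        CF = 1000.0             # equilibrium concentration factor (L/kg)

        k_elim = np.log(2.0) / T_BIOL_HALF   # elimination rate constant (1/day)
        k_up = CF * k_elim                   # uptake rate so that C_org/C_w -> CF at equilibrium

        def simulate(c_water, dt=0.1, days=365.0):
            """Integrate dC_org/dt = k_up*C_w(t) - k_elim*C_org with forward Euler."""
            n = int(days / dt)
            c_org = np.zeros(n)
            for step in range(1, n):
                cw = c_water(step * dt)
                c_org[step] = c_org[step - 1] + dt * (k_up * cw - k_elim * c_org[step - 1])
            return c_org

        # Non-equilibrium scenario: a release that stops after 100 days.
        def pulse(t):
            return 1.0 if t < 100.0 else 0.0     # Bq/L in seawater

        activity = simulate(pulse)
        print("peak organism activity  :", round(activity.max(), 1), "Bq/kg")
        print("activity at end of year :", round(activity[-1], 1), "Bq/kg")

    At equilibrium the ratio k_up/k_elim reproduces the concentration factor, which is the consistency condition that ties the dynamic formulation back to the equilibrium models used for regulatory assessments.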

  19. Assessing Executive Function components in 9 years old children

    Directory of Open Access Journals (Sweden)

    Sandra Reyes

    2014-05-01

    Full Text Available Executive Function (EF) is a multidimensional construct. It includes a set of abilities that allows actions to be executed purposefully, directed towards a goal, in an efficient way. The objective of this work is to explore some of the cognitive abilities that constitute a common factor for EF in 9-year-old children. The chosen instruments, the Batería de Evaluación Neuropsicológica de la Función Ejecutiva en niños (ENFEN; Battery of Neuropsychological Assessment of Executive Function in Children) along with the Backward Digits subtest from the WISC-III, were administered to 101 children from private schools of Buenos Aires Province, Argentina. The ENFEN consists of EF tasks, including Phonological and Semantic Fluency, Trail Making Test versions for children (gray and colored sets), an Interference Task, and Planning of disc movements according to a model. An initial confirmatory factor analysis did not show significant fit indexes, with Inhibitory Control being the variable with the lowest and non-significant factorial weight. A second model excluding the Inhibitory Control measure was conducted, and it showed excellent fit indexes. Therefore, it can be concluded that at this age, some of the cognitive abilities included in EF are Phonological and Semantic Fluency, Sustained and Selective Attention, Planning and Working Memory, which is not the case for Inhibitory Control (measured by the Interference Task in the ENFEN).

  20. Functional outcome measures in a surgical model of hip osteoarthritis in dogs.

    Science.gov (United States)

    Little, Dianne; Johnson, Stephen; Hash, Jonathan; Olson, Steven A; Estes, Bradley T; Moutos, Franklin T; Lascelles, B Duncan X; Guilak, Farshid

    2016-12-01

    The hip is one of the most common sites of osteoarthritis in the body, second only to the knee in prevalence. However, current animal models of hip osteoarthritis have not been assessed using many of the functional outcome measures used in orthopaedics, a characteristic that could increase their utility in the evaluation of therapeutic interventions. The canine hip shares similarities with the human hip, and functional outcome measures are well documented in veterinary medicine, providing a baseline for pre-clinical evaluation of therapeutic strategies for the treatment of hip osteoarthritis. The purpose of this study was to evaluate a surgical model of hip osteoarthritis in a large laboratory animal model and to evaluate functional and end-point outcome measures. Seven dogs were subjected to partial surgical debridement of cartilage from one femoral head. Pre- and postoperative pain and functional scores, gait analysis, radiographs, accelerometry, goniometry and limb circumference were evaluated through a 20-week recovery period, followed by histological evaluation of cartilage and synovium. Animals developed histological and radiographic evidence of osteoarthritis, which was correlated with measurable functional impairment. For example, Mankin scores in operated limbs were positively correlated to radiographic scores but negatively correlated to range of motion, limb circumference and 20-week peak vertical force. This study demonstrates that multiple relevant functional outcome measures can be used successfully in a large laboratory animal model of hip osteoarthritis. These measures could be used to evaluate relative efficacy of therapeutic interventions relevant to human clinical care.

  1. Integrated performance assessment model for waste package behavior and radionuclide release

    International Nuclear Information System (INIS)

    Kossik, R.; Miller, I.; Cunnane, M.

    1992-01-01

    Golder Associates Inc. (GAI) has developed a probabilistic total system performance assessment and strategy evaluation model (RIP) which can be applied in an iterative manner to evaluate repository site suitability and guide site characterization. This paper describes one component of the RIP software, the waste package behavior and radionuclide release model. The waste package component model considers waste package failure by various modes, matrix alteration/dissolution, and radionuclide mass transfer. Model parameters can be described as functions of local environmental conditions. The waste package component model is coupled to component models for far-field radionuclide transport and disruptive events. The model has recently been applied to the proposed repository at Yucca Mountain.

  2. Modeling Deficits From Early Auditory Information Processing to Psychosocial Functioning in Schizophrenia.

    Science.gov (United States)

    Thomas, Michael L; Green, Michael F; Hellemann, Gerhard; Sugar, Catherine A; Tarasenko, Melissa; Calkins, Monica E; Greenwood, Tiffany A; Gur, Raquel E; Gur, Ruben C; Lazzeroni, Laura C; Nuechterlein, Keith H; Radant, Allen D; Seidman, Larry J; Shiluk, Alexandra L; Siever, Larry J; Silverman, Jeremy M; Sprock, Joyce; Stone, William S; Swerdlow, Neal R; Tsuang, Debby W; Tsuang, Ming T; Turetsky, Bruce I; Braff, David L; Light, Gregory A

    2017-01-01

    Neurophysiologic measures of early auditory information processing (EAP) are used as endophenotypes in genomic studies and biomarkers in clinical intervention studies. Research in schizophrenia has established correlations among measures of EAP, cognition, clinical symptoms, and functional outcome. Clarifying these associations by determining the pathways through which deficits in EAP affect functioning would suggest when and where to therapeutically intervene. To characterize the pathways from EAP to outcome and to estimate the extent to which enhancement of basic information processing might improve cognition and psychosocial functioning in schizophrenia. Cross-sectional data were analyzed using structural equation modeling to examine the associations among EAP, cognition, negative symptoms, and functional outcome. Participants were recruited from the community at 5 geographically distributed laboratories as part of the Consortium on the Genetics of Schizophrenia 2 from July 1, 2010, through January 31, 2014. This well-characterized cohort of 1415 patients with schizophrenia underwent EAP, cognitive, and thorough clinical and functional assessment. Mismatch negativity, P3a, and reorienting negativity were used to measure EAP. Cognition was measured by the Letter Number Span test and scales from the California Verbal Learning Test-Second Edition, the Wechsler Memory Scale-Third Edition, and the Penn Computerized Neurocognitive Battery. Negative symptoms were measured by the Scale for the Assessment of Negative Symptoms. Functional outcome was measured by the Role Functioning Scale. Participants included 1415 unrelated outpatients diagnosed with schizophrenia or schizoaffective disorder (mean [SD] age, 46 [11] years; 979 males [69.2%] and 619 white [43.7%]). Early auditory information processing had a direct effect on cognition (β = 0.37), and the findings were consistent with a model in which EAP deficits lead to poor functional outcome via impaired cognition and increased negative symptoms.

  3. Intelligence and specific cognitive functions in intellectual disability: implications for assessment and classification.

    Science.gov (United States)

    Bertelli, Marco O; Cooper, Sally-Ann; Salvador-Carulla, Luis

    2018-03-01

    Current diagnostic criteria for intellectual disability categorize ability as measured by IQ tests. However, this does not suit the new conceptualization of intellectual disability, which refers to a range of neuropsychiatric syndromes that have in common early onset, cognitive impairments, and consequent deficits in learning and adaptive functioning. A literature review was undertaken on the concept of intelligence and whether it encompasses a range of specific cognitive functions to solve problems, which might be better reported as a profile, instead of an IQ, with implications for diagnosis and classification of intellectual disability. Data support a model of intelligence consisting of distinct but related processes. Persons with intellectual disability with the same IQ level have different cognitive profiles, based on varying factors involved in aetiopathogenesis. Limitations of functioning and many biopsychological factors associated with intellectual disability are more highly correlated with impairments of specific cognitive functions than with overall IQ. The current model of intelligence, based on IQ, is of limited utility for intellectual disability, given the wide range and variability of cognitive functions and adaptive capacities. Assessing level of individual impairment in executive and specific cognitive functions may be a more useful alternative. This has considerable implications for the revision of the International Classification of Diseases and for the cultural attitude towards intellectual disability in general.

  4. Predicting cognitive function of the Malaysian elderly: a structural equation modelling approach.

    Science.gov (United States)

    Foong, Hui Foh; Hamid, Tengku Aizan; Ibrahim, Rahimah; Haron, Sharifah Azizah; Shahar, Suzana

    2018-01-01

    The aim of this study was to identify the predictors of the elderly's cognitive function based on biopsychosocial and cognitive reserve perspectives. The study included 2322 community-dwelling elderly in Malaysia, randomly selected through a multi-stage proportional cluster random sampling from Peninsular Malaysia. The elderly were surveyed on socio-demographic information, biomarkers, psychosocial status, disability, and cognitive function. A biopsychosocial model of cognitive function was developed to test the variables' predictive power on cognitive function. Statistical analyses were performed using SPSS (version 15.0) in conjunction with Analysis of Moment Structures Graphics (AMOS 7.0). The estimated theoretical model fitted the data well. Psychosocial stress and metabolic syndrome (MetS) negatively predicted cognitive function, and psychosocial stress appeared as the main predictor. Socio-demographic characteristics, except gender, also had significant effects on cognitive function. However, disability failed to predict cognitive function. Several factors together may predict cognitive function in the Malaysian elderly population, and the variance accounted for is large enough to be considered substantial. The key factor associated with the elderly's cognitive function seems to be psychosocial well-being. Thus, psychosocial well-being should be included in the assessment of the elderly, apart from medical conditions, in both clinical and community settings.

  5. Automatic assessment of functional health decline in older adults based on smart home data.

    Science.gov (United States)

    Alberdi Aramendi, Ane; Weakley, Alyssa; Aztiria Goenaga, Asier; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2018-05-01

    In the context of an aging population, tools to help the elderly live independently must be developed. The goal of this paper is to evaluate the possibility of using unobtrusively collected activity-aware smart home behavioral data to automatically detect one of the most common consequences of aging: functional health decline. After gathering the longitudinal smart home data of 29 older adults for an average of >2 years, we automatically labeled the data with corresponding activity classes and extracted time-series statistics containing 10 behavioral features. Using these data, we created regression models to predict absolute and standardized functional health scores, as well as classification models to detect reliable absolute change and positive and negative fluctuations in everyday functioning. Functional health was assessed every six months by means of the Instrumental Activities of Daily Living-Compensation (IADL-C) scale. Results show that the total IADL-C score and subscores, as well as reliable change in these scores, can be predicted by means of activity-aware smart home data. Positive and negative fluctuations in everyday functioning are harder to detect using in-home behavioral data, yet changes in social skills have been shown to be predictable. Future work must focus on improving the sensitivity of the presented models and performing an in-depth feature selection to improve overall accuracy. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Assessment of morphological-functional state of children with cochlear implants

    Directory of Open Access Journals (Sweden)

    V.M. Pysanko

    2016-10-01

    Full Text Available Purpose: to assess the morphological-functional state of pre-school age children with cochlear implants and to substantiate the need for post-operative rehabilitation during the period of preparation for comprehensive school. Material: we tested hearing-impaired children with cochlear implants (n=127, age 5.6±0.6 years), who formed the main group. The control group consisted of children with normal hearing (n=70, age 5.7±0.4 years). Morphological-functional state was assessed by indicators of physical and biological condition, the visual analyzer, posture parameters and foot arch, the muscular system and the level of coordination. We calculated an index for the integral assessment of morphological-functional state. Results: the morphological-functional state of most children with cochlear implants was characterized by low physical condition indicators and disharmony. We also observed a delay in biological development. The index of integral morphological-functional state assessment indicates that such a child cannot yet study in a comprehensive school, whereas a rehabilitation program can reduce the gap between children with normal hearing and those with cochlear implants. Conclusions: the rehabilitation program facilitates quicker domestic and social rehabilitation of children by widening their circle of communication and by teaching new actions and concepts. It can allow such children to study at school together with their healthy peers.

  7. Automatic creation of Markov models for reliability assessment of safety instrumented systems

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2008-01-01

    After the release of new international functional safety standards such as IEC 61508, more attention is being paid to the safety and availability of safety instrumented systems. Markov analysis is a powerful and flexible technique for assessing the reliability measures of safety instrumented systems, but creating Markov models manually is error-prone and time-consuming. This paper presents a new technique to automatically create Markov models for reliability assessment of safety instrumented systems. Many safety-related factors, such as failure modes, self-diagnostics, restorations, common cause failures and voting, are included in the Markov models. A framework is generated first, based on voting, failure modes and self-diagnostics. Then, repairs and common-cause failures are incorporated into the framework to build a complete Markov model. The resulting Markov model can subsequently be simplified by state merging. Examples given in the paper show how explosively the size of a Markov model grows as the system becomes only slightly more complicated, and illustrate the advantages of automatic creation of Markov models.
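
    To make the kind of model being generated concrete, the sketch below assembles a transition-rate matrix for a simple single-channel safety function with dangerous-detected and dangerous-undetected failures and solves for its steady-state probabilities. The state set, rates and diagnostic coverage are illustrative assumptions; the tool described in the paper additionally handles voting architectures, common cause failures and state merging.

        import numpy as np

        # States: 0 = OK, 1 = dangerous-detected failure under repair,
        #         2 = dangerous-undetected failure awaiting the proof test.
        LAMBDA_D = 2e-6          # dangerous failure rate (1/h), illustrative
        DC = 0.6                 # diagnostic coverage
        MU_REPAIR = 1.0 / 8.0    # repair rate (1/h), mean repair time 8 h
        MU_PROOF = 1.0 / 4380.0  # proof-test rate (1/h), half-yearly test interval

        lam_dd = DC * LAMBDA_D
        lam_du = (1.0 - DC) * LAMBDA_D

        # Generator matrix Q: Q[i, j] is the rate from state i to state j,
        # with diagonals chosen so that each row sums to zero.
        Q = np.array([
            [-(lam_dd + lam_du), lam_dd,     lam_du   ],
            [MU_REPAIR,          -MU_REPAIR, 0.0      ],
            [MU_PROOF,           0.0,        -MU_PROOF],
        ])

        # Steady state: solve pi @ Q = 0 together with sum(pi) = 1.
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        print("steady-state probabilities:", pi)
        print("steady-state unavailability (states 1 and 2):", pi[1] + pi[2])

    Even this three-state model grows quickly once a second channel, voting and common cause failures are introduced, which is the growth the paper's examples illustrate and the reason for automating the construction.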

  8. Load function modelling for light impact

    International Nuclear Information System (INIS)

    Klingmueller, O.

    1982-01-01

    For Pile Integrity Testing, lightweight drop hammers are used to induce stress waves. In the computational analysis of one-dimensional wave propagation, a load function has to be used. Several mechanical models and corresponding load functions are discussed. It is shown that a bell-shaped function, which does not correspond to a mechanical model, is in best accordance with test results and does not lead to numerical disturbances in the computational results. (orig.) [de]
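
    A common closed form for such a bell-shaped impact pulse is a raised-cosine (Hanning) curve. The abstract does not give the exact expression used in the paper, so the sketch below uses the raised-cosine form, with hypothetical peak force and contact time, purely to illustrate a smooth load function suitable for a one-dimensional wave-propagation analysis.

        import numpy as np

        def bell_load(t, f_max=10.0e3, t_contact=1.0e-3):
            """Raised-cosine (Hanning) pulse: a smooth, bell-shaped load function.

            f_max and t_contact (peak force in N, impact duration in s) are
            illustrative values; any smooth pulse with zero slope at both ends
            behaves similarly in a one-dimensional wave-propagation code.
            """
            t = np.asarray(t, dtype=float)
            inside = (t >= 0.0) & (t <= t_contact)
            pulse = 0.5 * f_max * (1.0 - np.cos(2.0 * np.pi * t / t_contact))
            return np.where(inside, pulse, 0.0)

        time = np.linspace(-0.5e-3, 2.0e-3, 500)
        force = bell_load(time)
        print("peak force [N]:", force.max())
        print("impulse [N*s] :", np.trapz(force, time))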

  9. Hepatobiliary function assessed by {sup 99m}Tc-mebrofenin cholescintigraphy in the evaluation of severity of steatosis in a rat model

    Energy Technology Data Exchange (ETDEWEB)

    Vetelaeinen, Reeta L.; Vliet, Arlene van; Gulik, Thomas M. van [Academic Medical Center, Department of Surgery, Amsterdam (Netherlands); Bennink, Roelof J.; Bruin, Kora de [Academic Medical Center, Department of Nuclear Medicine, Amsterdam (Netherlands)

    2006-10-15

    This study evaluated the utility of non-invasive assessment of hepatobiliary function by {sup 99m}Tc-mebrofenin cholescintigraphy in a rat model of diet-induced steatosis. Male Wistar rats (250-300 g) were fed a standard methionine- and choline-deficient (MCD) diet for up to 5 weeks, thereby inducing hepatic fat accumulation, progressive inflammation and fibrogenesis corresponding with clinical steatosis. {sup 99m}Tc-mebrofenin pinhole scintigraphy was used to evaluate the hepatocyte mebrofenin uptake rate, the time of maximum hepatic uptake (T{sub peak}) and the time required for peak activity to decrease by 50% (T{sub 1/2peak}). Scintigraphic parameters were correlated with biochemical and serological parameters and with liver histopathology. MCD diet induced mild steatosis after 1 week and severe steatosis with prominent inflammation after 5 weeks. T{sub peak}, T{sub 1/2peak} prolonged and the uptake rate decreased significantly, while the severity of steatosis increased (p<0.05). There was a strong, significant correlation between the severity of steatosis (histopathology, hepatic triglyceride content) and the {sup 99m}Tc-mebrofenin uptake rate (r {sup 2}=0.83, p<0.0001 and r {sup 2}=0.82, p<0.0001, respectively). In addition, the uptake rate correlated significantly with the increased inflammation (plasma and hepatic TNF-{alpha}, r {sup 2}=0.72, p<0.0001 and r {sup 2}=0.52, p=0.001, respectively). The correlation of the uptake rate with hepatocellular damage was weak (AST and ALT, r {sup 2}=0.29 and 0.32, respectively), but correlation with synthetic function was strong (prothrombin time, r {sup 2}=0.70, p<0.001). Hepatobiliary function assessed by {sup 99m}Tc-mebrofenin scintigraphy correlates with the extent and progression of steatosis. These results suggest a potential role for mebrofenin scintigraphy as a non-invasive functional follow-up method for disease progression in steatotic patients. (orig.)

  10. Models for waste life cycle assessment: Review of technical assumptions

    DEFF Research Database (Denmark)

    Gentil, Emmanuel; Damgaard, Anders; Hauschild, Michael Zwicky

    2010-01-01

    A number of waste life cycle assessment (LCA) models have been gradually developed since the early 1990s, in a number of countries, usually independently from each other. Large discrepancies in results have been observed among different waste LCA models, although it has also been shown that results from different LCA studies can be consistent. This paper is an attempt to identify, review and analyse methodologies and technical assumptions used in various parts of selected waste LCA models. Several criteria were identified which could have significant impacts on the results, such as the functional unit, system boundaries, waste composition and energy modelling. The modelling assumptions of waste management processes, ranging from collection, transportation, intermediate facilities, recycling, thermal treatment, biological treatment, and landfilling, are obviously critical when comparing the different waste LCA models.

  11. Functional-Based Assessment of Social Behavior: Introduction and Overview.

    Science.gov (United States)

    Lewis, Timothy J.; Sugai, George

    1994-01-01

    This introduction to and overview of a special issue on social behavior assessment within schools discusses the impact of function-based methodologies on assessment and intervention practices in identification and remediation of challenging social behaviors. (JDD)

  12. A Comparison of Functional Behavioral Assessment and Functional Analysis Methodology among Students with Mild Disabilities

    Science.gov (United States)

    Lewis, Timothy J.; Mitchell, Barbara S.; Harvey, Kristin; Green, Ambra; McKenzie, Jennifer

    2015-01-01

    Functional behavioral assessment (FBA) and functional analyses (FA) are grounded in the applied behavior analysis principle that posits problem behavior is functionally related to the environment in which it occurs and is maintained by either providing access to reinforcing outcomes or allowing the individual to avoid or escape that which they…

  13. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    Science.gov (United States)

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.

  14. An Assessment of the Effectiveness of Tree-Based Models for Multi-Variate Flood Damage Assessment in Australia

    Directory of Open Access Journals (Sweden)

    Roozbeh Hasanzadeh Nafari

    2016-07-01

    Full Text Available Flood is a frequent natural hazard that has significant financial consequences for Australia. In Australia, physical losses caused by floods are commonly estimated by stage-damage functions. These methods usually consider only the depth of the water and the type of buildings at risk. However, flood damage is a complicated process, and it is dependent on a variety of factors which are rarely taken into account. This study explores the interaction, importance, and influence of water depth, flow velocity, water contamination, precautionary measures, emergency measures, flood experience, floor area, building value, building quality, and socioeconomic status. The study uses tree-based models (regression trees and bagging decision trees) and a dataset collected from the 2012 to 2013 flood events in Queensland, which includes information on structural damages, impact parameters, and resistance variables. The tree-based approaches show water depth, floor area, precautionary measures, building value, and building quality to be important damage-influencing parameters. Furthermore, the performance of the tree-based models is validated and contrasted with the outcomes of a multi-parameter loss function (FLFArs) from Australia. The tree-based models are shown to be more accurate than the stage-damage function. Consequently, considering more parameters and taking advantage of tree-based models is recommended. The outcome is important for improving established Australian flood loss models and assisting decision-makers and insurance companies dealing with flood risk assessment.
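
    The tree-based workflow described above can be sketched in a few lines with scikit-learn. The synthetic data frame, the column names and the damage relationship below are invented stand-ins for the Queensland survey data, so the sketch only demonstrates how a single regression tree and a bagged ensemble would be fitted and compared, not the study's actual results.

        import numpy as np
        import pandas as pd
        from sklearn.ensemble import BaggingRegressor
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeRegressor

        # Synthetic stand-in for the flood-damage survey (illustrative only).
        rng = np.random.default_rng(0)
        n = 400
        df = pd.DataFrame({
            "water_depth": rng.uniform(0.0, 3.0, n),        # m above floor level
            "flow_velocity": rng.uniform(0.0, 2.0, n),      # m/s
            "precaution_score": rng.integers(0, 5, n),
            "floor_area": rng.uniform(80.0, 400.0, n),      # m^2
            "building_quality": rng.integers(1, 4, n),
        })
        df["relative_loss"] = np.clip(
            0.25 * df["water_depth"] + 0.1 * df["flow_velocity"]
            - 0.03 * df["precaution_score"] + rng.normal(0.0, 0.05, n), 0.0, 1.0)

        X, y = df.drop(columns="relative_loss"), df["relative_loss"]
        models = [
            ("single regression tree", DecisionTreeRegressor(max_depth=5, random_state=0)),
            ("bagged trees", BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                                              random_state=0)),
        ]
        for name, model in models:
            mae = -cross_val_score(model, X, y, cv=5,
                                   scoring="neg_mean_absolute_error").mean()
            print(f"{name:>22s}: cross-validated MAE = {mae:.3f}")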

  15. Confronting species distribution model predictions with species functional traits.

    Science.gov (United States)

    Wittmann, Marion E; Barnes, Matthew A; Jerde, Christopher L; Jones, Lisa A; Lodge, David M

    2016-02-01

    Species distribution models are valuable tools in studies of biogeography, ecology, and climate change and have been used to inform conservation and ecosystem management. However, species distribution models typically incorporate only climatic variables and species presence data. Model development or validation rarely considers functional components of species traits or other types of biological data. We implemented a species distribution model (Maxent) to predict global climate habitat suitability for Grass Carp (Ctenopharyngodon idella). We then tested the relationship between the degree of climate habitat suitability predicted by Maxent and the individual growth rates of both wild (N = 17) and stocked (N = 51) Grass Carp populations using correlation analysis. The Grass Carp Maxent model accurately reflected the global occurrence data (AUC = 0.904). Observations of Grass Carp growth rate covered six continents and ranged from 0.19 to 20.1 g day(-1). Species distribution model predictions were correlated (r = 0.5, 95% CI (0.03, 0.79)) with observed growth rates for wild Grass Carp populations but were not correlated (r = -0.26, 95% CI (-0.5, 0.012)) with stocked populations. Further, a review of the literature indicates that the few studies for other species that have previously assessed the relationship between the degree of predicted climate habitat suitability and species functional traits have also discovered significant relationships. Thus, species distribution models may provide inferences beyond just where a species may occur, providing a useful tool to understand the linkage between species distributions and underlying biological mechanisms.
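
    The reported correlations with confidence intervals, such as r = 0.5 with a 95% CI of (0.03, 0.79) for the 17 wild populations, are of the kind produced by a Fisher z-transform. The sketch below shows that calculation on made-up suitability and growth values, not the study's data.

        import numpy as np
        from scipy import stats

        def pearson_with_ci(x, y, alpha=0.05):
            """Pearson r with a confidence interval from the Fisher z-transform."""
            r, p = stats.pearsonr(x, y)
            z = np.arctanh(r)                       # Fisher z
            se = 1.0 / np.sqrt(len(x) - 3)
            zc = stats.norm.ppf(1.0 - alpha / 2.0)
            return r, (np.tanh(z - zc * se), np.tanh(z + zc * se)), p

        # Hypothetical example: Maxent habitat suitability vs. observed growth rate
        # for 17 wild populations (values are invented for illustration).
        rng = np.random.default_rng(1)
        suitability = rng.uniform(0.2, 0.9, 17)
        growth = 5.0 + 10.0 * suitability + rng.normal(0.0, 3.0, 17)

        r, ci, p = pearson_with_ci(suitability, growth)
        print(f"r = {r:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), p = {p:.3f}")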

  16. Assessing physical function and physical activity in patients with CKD.

    Science.gov (United States)

    Painter, Patricia; Marcus, Robin L

    2013-05-01

    Patients with CKD are characterized by low levels of physical functioning, which, along with low physical activity, predict poor outcomes in those treated with dialysis. The hallmark of clinical care in geriatric practice and geriatric research is the orientation to and assessment of physical function and functional limitations. Although there is increasing interest in physical function and physical activity in patients with CKD, the nephrology field has not focused on this aspect of care. This paper provides an in-depth review of the measurement of physical function and physical activity. It focuses on physiologic impairments and physical performance limitations (impaired mobility and functional limitations). The review is based on established frameworks of physical impairment and functional limitations that have guided research in physical function in the aging population. Definitions and measures for physiologic impairments, physical performance limitations, self-reported function, and physical activity are presented. On the basis of the information presented, recommendations for incorporating routine assessment of physical function and encouragement for physical activity in clinical care are provided.

  17. A new method for predicting functional recovery of stroke patients with hemiplegia: logarithmic modelling.

    Science.gov (United States)

    Koyama, Tetsuo; Matsumoto, Kenji; Okuno, Taiji; Domen, Kazuhisa

    2005-10-01

    To examine the validity and applicability of logarithmic modelling for predicting functional recovery of stroke patients with hemiplegia. Longitudinal postal survey. Stroke patients with hemiplegia staying in a long-term rehabilitation facility, who had been referred from acute medical service 30-60 days after onset. Functional Independence Measure (FIM) scores were periodically assessed during hospitalization. For each individual, a logarithmic formula that was scaled by an interval increase in FIM scores during the initial 2-6 weeks was used for predicting functional recovery. For the study, we recruited 18 patients who showed a wide variety of disability levels on admission (FIM scores 25-107). For each patient, the predicted FIM scores derived from the logarithmic formula matched the actual change in FIM scores. The changes predicted the recovery of motor rather than cognitive functions. Regression analysis showed a close fit between logarithmic modelling and actual FIM scores (across-subject R2 = 0.945). Provided with two initial time-point samplings, logarithmic modelling allows accurate prediction of functional recovery for individuals. Because the modelling is mathematically simple, it can be widely applied in daily clinical practice.
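
    The two-point logarithmic extrapolation described above can be written down directly. In the sketch below, the anchor days, the example FIM scores and the clamping to the 18-126 FIM range are illustrative assumptions; the paper scales its curve by the FIM gain observed over the initial 2-6 weeks.

        import math

        def fit_log_recovery(t1, fim1, t2, fim2):
            """Fit FIM(t) = a + b*ln(t) through two early assessments.

            t1, t2 are days since stroke onset and fim1, fim2 the corresponding
            FIM scores; the returned function clamps predictions to the valid
            FIM range of 18-126.
            """
            b = (fim2 - fim1) / (math.log(t2) - math.log(t1))
            a = fim1 - b * math.log(t1)
            return lambda t: min(126.0, max(18.0, a + b * math.log(t)))

        # Example with made-up scores: FIM 55 at day 40 and 68 at day 68.
        predict = fit_log_recovery(40, 55, 68, 68)
        for day in (90, 120, 150, 180):
            print(f"day {day}: predicted FIM = {predict(day):.0f}")

    Because only two early assessments are needed to fix the curve, the same arithmetic can be applied at the bedside without any model-fitting software.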

  18. Assessing parameter variability in a photosynthesis model within and between plant functional types using global Fluxnet eddy covariance data

    NARCIS (Netherlands)

    Groenendijk, M.; Dolman, A.J.; Molen, van der M.K.; Leuning, R.; Arneth, A.; Delpierre, N.; Gash, J.H.C.; Lindroth, A.; Richardson, A.D.; Verbeeck, H.; Wohlfahrt, G.

    2011-01-01

    The vegetation component in climate models has advanced since the late 1960s from a uniform prescription of surface parameters to plant functional types (PFTs). PFTs are used in global land-surface models to provide parameter values for every model grid cell. With a simple photosynthesis model we

  19. A Memristor Model with Piecewise Window Function

    Directory of Open Access Journals (Sweden)

    J. Yu

    2013-12-01

    Full Text Available In this paper, we present a memristor model with piecewise window function, which is continuously differentiable and consists of three nonlinear pieces. By introducing two parameters, the shape of this window function can be flexibly adjusted to model different types of memristors. Using this model, one can easily obtain an expression of memristance depending on charge, from which the numerical value of memristance can be readily calculated for any given charge, and eliminate the error occurring in the simulation of some existing window function models.
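
    The abstract does not reproduce the paper's piecewise window function, so the sketch below uses the common Joglekar-style window inside an HP-type memristor model purely to illustrate how a window function shapes the state equation and how memristance then follows from the accumulated charge; all device parameters are hypothetical.

        import numpy as np

        R_ON, R_OFF = 100.0, 16e3     # bounding resistances (ohm), illustrative
        K = 10e3                      # mobility term mu_v * R_on / D**2 (1/C), illustrative
        P = 2                         # window exponent

        def window(x, p=P):
            # Joglekar window: 1 at the centre, 0 at the state boundaries.
            return 1.0 - (2.0 * x - 1.0) ** (2 * p)

        def memristance(x):
            return R_ON * x + R_OFF * (1.0 - x)

        def simulate(current, dt, x0=0.1):
            """Integrate dx/dt = K * i(t) * f(x); return voltage and memristance."""
            x, ms = x0, []
            for i in current:
                x = np.clip(x + dt * K * i * window(x), 0.0, 1.0)
                ms.append(memristance(x))
            ms = np.array(ms)
            return ms * current, ms    # v = M(x) * i

        t = np.linspace(0.0, 2.0, 4000)
        i = 1e-3 * np.sin(2.0 * np.pi * 1.0 * t)     # 1 mA, 1 Hz sinusoidal drive
        v, m = simulate(i, t[1] - t[0])
        print("memristance swing: %.0f ohm to %.0f ohm" % (m.min(), m.max()))
        print("peak voltage     : %.2f V" % np.abs(v).max())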

  20. Using niche-based modelling to assess the impact of climate change on tree functional diversity in Europe

    DEFF Research Database (Denmark)

    Thuiller, Wilfried; Lavorel, Sandra; Sykes, Martin T.

    2006-01-01

    Rapid anthropogenic climate change is already affecting species distributions and ecosystem functioning worldwide. We applied niche-based models to analyse the impact of climate change on tree species and functional diversity in Europe. Present-day climate was used to predict the distributions of 122 tree species from different functional types (FT). We then explored projections of future distributions under one climate scenario for 2080, considering two alternative dispersal assumptions: no dispersal and unlimited dispersal. The species-rich broadleaved deciduous group appeared to play a key role in the future of different European regions. Temperate areas were projected to lose both species richness and functional diversity due to the loss of broadleaved deciduous trees. These were projected to migrate to boreal forests, thereby increasing their species richness and functional diversity...

  1. Modeling for operational event risk assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has been using risk models to evaluate the risk significance of operational events in U.S. commercial nuclear power plants for more than seventeen years. During that time, the models have evolved in response to the advances in risk assessment technology and insights gained with experience. Evaluation techniques fall into two categories, initiating event assessments and condition assessments. The models used for these analyses have become uniquely specialized for just this purpose.

  2. Numerical model of the influence function of deformable mirrors based on Bessel Fourier orthogonal functions

    International Nuclear Information System (INIS)

    Li Shun; Zhang Sijiong

    2014-01-01

    A numerical model is presented to simulate the influence function of deformable mirror actuators. The numerical model is formed by Bessel Fourier orthogonal functions, which are constituted of Bessel orthogonal functions and a Fourier basis. A detailed comparison is presented between the new Bessel Fourier model, the Zernike model, the Gaussian influence function and the modified Gaussian influence function. Numerical experiments indicate that the new numerical model is easy to use and more accurate compared with other numerical models. The new numerical model can be used for describing deformable mirror performances and numerical simulations of adaptive optics systems. (research papers)

  3. Systematic assessment of apraxia and functional predictions from the Birmingham Cognitive Screen.

    Science.gov (United States)

    Bickerton, Wai-Ling; Riddoch, M Jane; Samson, Dana; Balani, Alex Bahrami; Mistry, Bejal; Humphreys, Glyn W

    2012-05-01

    The validity and functional predictive values of the apraxia tests in the Birmingham Cognitive Screen (BCoS) were evaluated. BCoS was developed to identify patients with different forms of praxic deficit using procedures designed to be inclusive for patients with aphasia and/or spatial neglect. Observational studies were conducted from a university neuropsychological assessment centre and from acute and rehabilitation stroke care hospitals throughout an English region. Volunteers from referred patients with chronic acquired brain injuries, a consecutive hospital sample of patients within 3 months of stroke (n=635) and a population based healthy control sample (n=100) were recruited. The main outcome measures used were the Barthel Index, the Nottingham Extended Activities of Daily Living Scale as well as recovery from apraxia. There were high inter-rater reliabilities and correlations between the BCoS apraxia tasks and counterpart tests from the literature. The vast majority (88.3%) of the stroke survivors were able to complete the screen. Pantomime and gesture recognition tasks were more sensitive in differentiating between individuals with left hemisphere damage and right hemisphere damage whereas the Multistep Object Use test and the imitation task had higher functional correlates over and above effects of hemiplegia. Together, the initial scores of the four tasks enabled predictions with 75% accuracy, the recovery of apraxia and independence level at 9 months. As a model based assessment, BCoS offers a quick and valid way to detect apraxia and predict functional recovery. It enables early and informative assessment of most stroke patients for rehabilitation planning.

  4. Are Imaging and Lesioning Convergent Methods for Assessing Functional Specialisation? Investigations Using an Artificial Neural Network

    Science.gov (United States)

    Thomas, Michael S. C.; Purser, Harry R. M.; Tomlinson, Simon; Mareschal, Denis

    2012-01-01

    This article presents an investigation of the relationship between lesioning and neuroimaging methods of assessing functional specialisation, using synthetic brain imaging (SBI) and lesioning of a connectionist network of past-tense formation. The model comprised two processing "routes": one was a direct route between layers of input and output…

  5. An assessment of geographical distribution of different plant functional types over North America simulated using the CLASS-CTEM modelling framework

    Science.gov (United States)

    Shrestha, Rudra K.; Arora, Vivek K.; Melton, Joe R.; Sushama, Laxmi

    2017-10-01

    The performance of the competition module of the CLASS-CTEM (Canadian Land Surface Scheme and Canadian Terrestrial Ecosystem Model) modelling framework is assessed at 1° spatial resolution over North America by comparing the simulated geographical distribution of its plant functional types (PFTs) with two observation-based estimates. The model successfully reproduces the broad geographical distribution of trees, grasses and bare ground although limitations remain. In particular, compared to the two observation-based estimates, the simulated fractional vegetation coverage is lower in the arid southwest North American region and higher in the Arctic region. The lower-than-observed simulated vegetation coverage in the southwest region is attributed to lack of representation of shrubs in the model and plausible errors in the observation-based data sets. The observation-based data indicate vegetation fractional coverage of more than 60 % in this arid region, despite only 200-300 mm of precipitation that the region receives annually, and observation-based leaf area index (LAI) values in the region are lower than one. The higher-than-observed vegetation fractional coverage in the Arctic is likely due to the lack of representation of moss and lichen PFTs and also likely because of inadequate representation of permafrost in the model as a result of which the C3 grass PFT performs overly well in the region. The model generally reproduces the broad spatial distribution and the total area covered by the two primary tree PFTs (needleleaf evergreen trees, NDL-EVG; and broadleaf cold deciduous trees, BDL-DCD-CLD) reasonably well. The simulated fractional coverage of tree PFTs increases after the 1960s in response to the CO2 fertilization effect and climate warming. Differences between observed and simulated PFT coverages highlight model limitations and suggest that the inclusion of shrubs, and moss and lichen PFTs, and an adequate representation of permafrost will help improve

  6. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    Science.gov (United States)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    Summary: With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches adequately treats the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via the Bayesian method.
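
    The Metropolis-Hastings machinery underlying the standard Bayesian runs can be sketched compactly. The toy one-parameter runoff model, the independent Gaussian error assumption and all data below are invented stand-ins for WASMOD and its AR(1)-based likelihoods, and the modularized treatment of the highest flows is not reproduced; the sketch shows only the sampling loop itself.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy "hydrological" model: a single runoff coefficient applied to
        # synthetic precipitation (a stand-in for WASMOD).
        precip = rng.gamma(2.0, 5.0, size=365)
        true_theta = 0.35
        q_obs = true_theta * precip + rng.normal(0.0, 1.0, size=365)

        def log_likelihood(theta, sigma=1.0):
            """Independent Gaussian errors (the study also uses AR(1) error models)."""
            resid = q_obs - theta * precip
            return -0.5 * np.sum((resid / sigma) ** 2) - resid.size * np.log(sigma)

        def metropolis_hastings(n_iter=20000, step=0.01):
            theta = 0.5                                   # initial guess
            ll = log_likelihood(theta)
            samples = []
            for _ in range(n_iter):
                prop = theta + rng.normal(0.0, step)      # random-walk proposal
                if 0.0 < prop < 1.0:                      # flat prior on (0, 1)
                    ll_prop = log_likelihood(prop)
                    if np.log(rng.uniform()) < ll_prop - ll:
                        theta, ll = prop, ll_prop
                samples.append(theta)
            return np.array(samples[5000:])               # discard burn-in

        posterior = metropolis_hastings()
        print("posterior mean %.3f, 95%% interval (%.3f, %.3f)"
              % (posterior.mean(), *np.quantile(posterior, [0.025, 0.975])))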

  7. Hepatobiliary function assessed by 99mTc-mebrofenin cholescintigraphy in the evaluation of severity of steatosis in a rat model

    International Nuclear Information System (INIS)

    Vetelaeinen, Reeta L.; Vliet, Arlene van; Gulik, Thomas M. van; Bennink, Roelof J.; Bruin, Kora de

    2006-01-01

    This study evaluated the utility of non-invasive assessment of hepatobiliary function by 99mTc-mebrofenin cholescintigraphy in a rat model of diet-induced steatosis. Male Wistar rats (250-300 g) were fed a standard methionine- and choline-deficient (MCD) diet for up to 5 weeks, thereby inducing hepatic fat accumulation, progressive inflammation and fibrogenesis corresponding with clinical steatosis. 99mTc-mebrofenin pinhole scintigraphy was used to evaluate the hepatocyte mebrofenin uptake rate, the time of maximum hepatic uptake (Tpeak) and the time required for peak activity to decrease by 50% (T1/2peak). Scintigraphic parameters were correlated with biochemical and serological parameters and with liver histopathology. MCD diet induced mild steatosis after 1 week and severe steatosis with prominent inflammation after 5 weeks. Tpeak and T1/2peak were prolonged and the uptake rate decreased significantly, while the severity of steatosis increased (p<0.05). There was a strong, significant correlation between the severity of steatosis (histopathology, hepatic triglyceride content) and the 99mTc-mebrofenin uptake rate (r2=0.83, p<0.0001 and r2=0.82, p<0.0001, respectively). In addition, the uptake rate correlated significantly with the increased inflammation (plasma and hepatic TNF-alpha, r2=0.72, p<0.0001 and r2=0.52, p=0.001, respectively). The correlation of the uptake rate with hepatocellular damage was weak (AST and ALT, r2=0.29 and 0.32, respectively), but correlation with synthetic function was strong (prothrombin time, r2=0.70, p<0.001). Hepatobiliary function assessed by 99mTc-mebrofenin scintigraphy correlates with the extent and progression of steatosis. These results suggest a potential role for mebrofenin scintigraphy as a non-invasive functional follow-up method for disease progression in steatotic patients. (orig.)

  8. Assessing Impacts of Climate Change on Forests: The State of Biological Modeling

    Science.gov (United States)

    Dale, V. H.; Rauscher, H. M.

    1993-04-06

    Models that address the impacts to forests of climate change are reviewed by four levels of biological organization: global, regional or landscape, community, and tree. The models are compared as to their ability to assess changes in greenhouse gas flux, land use, maps of forest type or species composition, forest resource productivity, forest health, biodiversity, and wildlife habitat. No one model can address all of these impacts, but landscape transition models and regional vegetation and land-use models consider the largest number of impacts. Developing landscape vegetation dynamics models of functional groups is suggested as a means to integrate the theory of both landscape ecology and individual tree responses to climate change. Risk assessment methodologies can be adapted to deal with the impacts of climate change at various spatial and temporal scales. Four areas of research development are identified: (1) linking socioeconomic and ecologic models, (2) interfacing forest models at different scales, (3) obtaining data on susceptibility of trees and forest to changes in climate and disturbance regimes, and (4) relating information from different scales.

  9. COPEWELL: A Conceptual Framework and System Dynamics Model for Predicting Community Functioning and Resilience After Disasters.

    Science.gov (United States)

    Links, Jonathan M; Schwartz, Brian S; Lin, Sen; Kanarek, Norma; Mitrani-Reiser, Judith; Sell, Tara Kirk; Watson, Crystal R; Ward, Doug; Slemp, Cathy; Burhans, Robert; Gill, Kimberly; Igusa, Tak; Zhao, Xilei; Aguirre, Benigno; Trainor, Joseph; Nigg, Joanne; Inglesby, Thomas; Carbone, Eric; Kendra, James M

    2018-02-01

    Policy-makers and practitioners have a need to assess community resilience in disasters. Prior efforts conflated resilience with community functioning, combined resistance and recovery (the components of resilience), and relied on a static model for what is inherently a dynamic process. We sought to develop linked conceptual and computational models of community functioning and resilience after a disaster. We developed a system dynamics computational model that predicts community functioning after a disaster. The computational model outputted the time course of community functioning before, during, and after a disaster, which was used to calculate resistance, recovery, and resilience for all US counties. The conceptual model explicitly separated resilience from community functioning and identified all key components for each, which were translated into a system dynamics computational model with connections and feedbacks. The components were represented by publicly available measures at the county level. Baseline community functioning, resistance, recovery, and resilience evidenced a range of values and geographic clustering, consistent with hypotheses based on the disaster literature. The work is transparent, motivates ongoing refinements, and identifies areas for improved measurements. After validation, such a model can be used to identify effective investments to enhance community resilience. (Disaster Med Public Health Preparedness. 2018;12:127-137).
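
    The separation of resistance and recovery from a community-functioning time course can be illustrated numerically. The functioning curve, its parameters and the summary definitions in the sketch below are hypothetical simplifications and are not the COPEWELL model's actual equations or county-level measures.

        import numpy as np

        T_EVENT = 12.0                      # month of the disaster (illustrative)

        def functioning_curve(t, drop=0.4, recovery_rate=0.15, baseline=0.85):
            """Community functioning in [0, 1]: a step drop at the disaster
            followed by an exponential return toward baseline (hypothetical form)."""
            cf = np.full_like(t, baseline, dtype=float)
            after = t >= T_EVENT
            cf[after] = baseline - drop * np.exp(-recovery_rate * (t[after] - T_EVENT))
            return cf

        t = np.linspace(0.0, 60.0, 601)     # months
        cf = functioning_curve(t)
        baseline = cf[0]

        resistance = cf.min() / baseline    # fraction of functioning retained at the nadir
        # Recovery: time until functioning is back within 5% of baseline.
        recovered = t[(t > T_EVENT) & (cf >= 0.95 * baseline)]
        recovery_time = recovered[0] - T_EVENT if recovered.size else np.inf
        # One simple resilience summary: functioning retained over the two years
        # after the event, relative to the no-disaster counterfactual.
        mask = (t >= T_EVENT) & (t <= T_EVENT + 24.0)
        resilience = np.trapz(cf[mask], t[mask]) / (baseline * 24.0)

        print(f"resistance {resistance:.2f}, recovery {recovery_time:.1f} months, "
              f"resilience {resilience:.2f}")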

  10. Financial and testamentary capacity evaluations: procedures and assessment instruments underneath a functional approach.

    Science.gov (United States)

    Sousa, Liliana B; Simões, Mário R; Firmino, Horácio; Peisah, Carmelle

    2014-02-01

    Mental health professionals are frequently involved in mental capacity determinations. However, there is a lack of specific measures and well-defined procedures for these evaluations. The main purpose of this paper is to provide a review of financial and testamentary capacity evaluation procedures, including not only the traditional neuropsychological and functional assessment but also the more recently developed forensic assessment instruments (FAIs), which have been developed to provide a specialized answer to legal systems regarding civil competencies. Here the main guidelines, papers, and other references are reviewed in order to achieve a complete and comprehensive selection of instruments used in the assessment of financial and testamentary capacity. Although some specific measures for financial abilities have been developed recently, the same is not true for testamentary capacity. Here are presented several instruments or methodologies for assessing financial and testamentary capacity, including neuropsychological assessment, functional assessment scales, performance based functional assessment instruments, and specific FAIs. FAIs are the only specific instruments intended to provide a specific and direct answer to the assessment of financial capacity based on legal systems. Considering the need to move from a diagnostic to a functional approach in financial and testamentary capacity evaluations, it is essential to consider both general functional examination as well as cognitive functioning.

  11. Evaluating the Performance of DFT Functionals in Assessing the Interaction Energy and Ground-State Charge Transfer of Donor/Acceptor Complexes: Tetrathiafulvalene−Tetracyanoquinodimethane (TTF−TCNQ) as a Model Case

    KAUST Repository

    Sini, Gjergji

    2011-03-08

    We have evaluated the performance of several density functional theory (DFT) functionals for the description of the ground-state electronic structure and charge transfer in donor/acceptor complexes. The tetrathiafulvalene-tetracyanoquinodimethane (TTF-TCNQ) complex has been considered as a model test case. Hybrid functionals have been chosen together with recently proposed long-range corrected functionals (ωB97X, ωB97X-D, LRC-ωPBEh, and LC-ωPBE) in order to assess the sensitivity of the results to the treatment and magnitude of exact exchange. The results show an approximately linear dependence of the ground-state charge transfer with the HOMO(TTF)-LUMO(TCNQ) energy gap, which in turn depends linearly on the percentage of exact exchange in the functional. The reliability of ground-state charge transfer values calculated in the framework of a monodeterminantal DFT approach was also examined. © 2011 American Chemical Society.

  12. Evaluating the Performance of DFT Functionals in Assessing the Interaction Energy and Ground-State Charge Transfer of Donor/Acceptor Complexes: Tetrathiafulvalene−Tetracyanoquinodimethane (TTF−TCNQ) as a Model Case

    KAUST Repository

    Sini, Gjergji; Sears, John S.; Brédas, Jean-Luc

    2011-01-01

    We have evaluated the performance of several density functional theory (DFT) functionals for the description of the ground-state electronic structure and charge transfer in donor/acceptor complexes. The tetrathiafulvalene-tetracyanoquinodimethane (TTF-TCNQ) complex has been considered as a model test case. Hybrid functionals have been chosen together with recently proposed long-range corrected functionals (ωB97X, ωB97X-D, LRC-ωPBEh, and LC-ωPBE) in order to assess the sensitivity of the results to the treatment and magnitude of exact exchange. The results show an approximately linear dependence of the ground-state charge transfer with the HOMO(TTF)-LUMO(TCNQ) energy gap, which in turn depends linearly on the percentage of exact exchange in the functional. The reliability of ground-state charge transfer values calculated in the framework of a monodeterminantal DFT approach was also examined. © 2011 American Chemical Society.

  13. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  14. Imbalanced Learning for Functional State Assessment

    Science.gov (United States)

    Li, Feng; McKenzie, Frederick; Li, Jiang; Zhang, Guangfan; Xu, Roger; Richey, Carl; Schnell, Tom

    2011-01-01

    This paper presents results of several imbalanced learning techniques applied to operator functional state assessment, where the data are highly imbalanced, i.e., some functional states (majority classes) have many more training samples than other states (minority classes). Conventional machine learning techniques usually tend to classify all data samples into majority classes and perform poorly for minority classes. In this study, we implemented five imbalanced learning techniques, including random under-sampling, random over-sampling, synthetic minority over-sampling technique (SMOTE), borderline-SMOTE and adaptive synthetic sampling (ADASYN), to solve this problem. Experimental results on a benchmark driving test dataset show that accuracies for minority classes could be improved dramatically at the cost of slight performance degradations for majority classes.
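
    As a concrete illustration of one technique named above, the sketch below applies SMOTE from the imbalanced-learn package to a synthetic two-class problem with a 95/5 class split and compares per-class recall before and after resampling. The dataset, classifier, and class ratio are hypothetical stand-ins for the operator functional state data used in the study.

    ```python
    # Minimal SMOTE illustration (assumes scikit-learn and imbalanced-learn are installed).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import recall_score
    from sklearn.model_selection import train_test_split
    from imblearn.over_sampling import SMOTE

    # Synthetic data: 95% majority class, 5% minority class.
    X, y = make_classification(n_samples=4000, n_features=10, n_informative=5,
                               weights=[0.95, 0.05], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    # Baseline: train on the imbalanced data as-is.
    base = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

    # SMOTE: synthesize minority-class samples, then retrain.
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
    smote = RandomForestClassifier(random_state=0).fit(X_res, y_res)

    for name, clf in [("baseline", base), ("with SMOTE", smote)]:
        rec = recall_score(y_te, clf.predict(X_te), average=None)
        print(f"{name}: majority recall = {rec[0]:.2f}, minority recall = {rec[1]:.2f}")
    ```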

  15. A model to facilitate implementation of the International Classification of Functioning, Disability and Health into prosthetics and orthotics.

    Science.gov (United States)

    Jarl, Gustav; Ramstrand, Nerrolyn

    2017-09-01

    The International Classification of Functioning, Disability and Health is a classification of human functioning and disability and is based on a biopsychosocial model of health. As such, International Classification of Functioning, Disability and Health seems suitable as a basis for constructing models defining the clinical P&O process. The aim was to use International Classification of Functioning, Disability and Health to facilitate development of such a model. Proposed model: A model, the Prosthetic and Orthotic Process (POP) model, is proposed. The Prosthetic and Orthotic Process model is based on the concepts of the International Classification of Functioning, Disability and Health and comprises four steps in a cycle: (1) Assessment, including the medical history and physical examination of the patient. (2) Goals, specified on four levels including those related to participation, activity, body functions and structures and technical requirements of the device. (3) Intervention, in which the appropriate course of action is determined based on the specified goal and evidence-based practice. (4) Evaluation of outcomes, where the outcomes are assessed and compared to the corresponding goals. After the evaluation of goal fulfilment, the first cycle in the process is complete, and a broad evaluation is now made including overriding questions about the patient's satisfaction with the outcomes and the process. This evaluation will determine if the process should be ended or if another cycle in the process should be initiated. The Prosthetic and Orthotic Process model can provide a common understanding of the P&O process. Concepts of International Classification of Functioning, Disability and Health have been incorporated into the model to facilitate communication with other rehabilitation professionals and encourage a holistic and patient-centred approach in clinical practice. Clinical relevance: The Prosthetic and Orthotic Process model can support the implementation of the International Classification of Functioning, Disability and Health in prosthetic and orthotic clinical practice.

  16. Development of a self-report physical function instrument for disability assessment: item pool construction and factor analysis.

    Science.gov (United States)

    McDonough, Christine M; Jette, Alan M; Ni, Pengsheng; Bogusz, Kara; Marfeo, Elizabeth E; Brandt, Diane E; Chan, Leighton; Meterko, Mark; Haley, Stephen M; Rasch, Elizabeth K

    2013-09-01

    To build a comprehensive item pool representing work-relevant physical functioning and to test the factor structure of the item pool. These developmental steps represent initial outcomes of a broader project to develop instruments for the assessment of function within the context of Social Security Administration (SSA) disability programs. Comprehensive literature review; gap analysis; item generation with expert panel input; stakeholder interviews; cognitive interviews; cross-sectional survey administration; and exploratory and confirmatory factor analyses to assess item pool structure. In-person and semistructured interviews and Internet and telephone surveys. Sample of SSA claimants (n=1017) and a normative sample of adults from the U.S. general population (n=999). Not applicable. Model fit statistics. The final item pool consisted of 139 items. Within the claimant sample, 58.7% were white; 31.8% were black; 46.6% were women; and the mean age was 49.7 years. Initial factor analyses revealed a 4-factor solution, which included more items and allowed separate characterization of: (1) changing and maintaining body position, (2) whole body mobility, (3) upper body function, and (4) upper extremity fine motor. The final 4-factor model included 91 items. Confirmatory factor analyses for the 4-factor models for the claimant and the normative samples demonstrated very good fit. Fit statistics for claimant and normative samples, respectively, were: Comparative Fit Index=.93 and .98; Tucker-Lewis Index=.92 and .98; and root mean square error of approximation=.05 and .04. The factor structure of the physical function item pool closely resembled the hypothesized content model. The 4 scales relevant to work activities offer promise for providing reliable information about claimant physical functioning relevant to work disability. Copyright © 2013 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
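
    A minimal sketch of the exploratory step described above, using the factor_analyzer package on hypothetical item-response data; the item names, the two-factor structure, and the promax rotation are assumptions chosen for illustration and do not reproduce the SSA item pool or its 4-factor solution.

    ```python
    # Exploratory factor analysis sketch (assumes numpy, pandas, factor_analyzer installed).
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    rng = np.random.default_rng(0)
    n = 500

    # Hypothetical item responses driven by two latent traits ("mobility", "fine motor").
    mobility = rng.normal(size=n)
    fine_motor = rng.normal(size=n)
    data = {f"mob_{i}": mobility + rng.normal(scale=0.6, size=n) for i in range(3)}
    data.update({f"fm_{i}": fine_motor + rng.normal(scale=0.6, size=n) for i in range(3)})
    items = pd.DataFrame(data)

    # Exploratory step: fit a 2-factor model with an oblique (promax) rotation.
    fa = FactorAnalyzer(n_factors=2, rotation="promax", method="minres")
    fa.fit(items)

    print(pd.DataFrame(fa.loadings_, index=items.columns).round(2))  # pattern loadings
    print(fa.get_factor_variance())  # variance explained by each factor
    ```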

  17. The role of spatial information in the preservation of the shrimp nursery function of mangroves: a spatially explicit bio-economic model for the assessment of land use trade-offs.

    Science.gov (United States)

    Zavalloni, Matteo; Groeneveld, Rolf A; van Zwieten, Paul A M

    2014-10-01

    Conversion to aquaculture affects the provision of important ecosystem services provided by mangrove ecosystems, and this effect depends strongly on the location of the conversion. We introduce in a bio-economic mathematical programming model relevant spatial elements that affect the provision of the nursery habitat service of mangroves: (1) direct or indirect connection of mangroves to watercourses; (2) the spatial allocation of aquaculture ponds; and (3) the presence of non-linear relations between mangrove extent and juvenile recruitment to wild shrimp populations. By tracing out the production possibilities frontier of wild and cultivated shrimp, the model assesses the role of spatial information in the trade-off between aquaculture and the nursery habitat function using spatial elements relevant to our model of a mangrove area in Ca Mau Province, Viet Nam. Results show that where mangrove forests have to coexist with shrimp aquaculture ponds, the inclusion of specific spatial information on ecosystem functions in considerations of land allocation can achieve aquaculture benefits while largely preserving the economic benefits generated by the nursery habitat function. However, if spatial criteria are ignored, ill-advised land allocation decisions can easily lead to a collapse of the mangrove's nursery function. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. COMPLEX FUNCTIONAL ASSESSMENT OF THE HIP JOINT.

    Directory of Open Access Journals (Sweden)

    Maya S. Krastanova

    2015-09-01

    Full Text Available Introduction: In relation to the study reporting the effects of applying phased complex rehabilitation in patients with total hip arthroplasty, it has been concluded that the everyday clinical practice in Bulgaria does not apply complex examination, giving an objective picture about the extent of functional status of patients with trauma and diseases of the hip. Aim: The main goal of this report is to present a test which incorporates all known and routine research and in which the total number of points determines the functional status of patients with trauma and diseases of the hip. Material and Methods: Based on the Hip dysfunction and Osteoarthritis Outcome Score, the Harris Hip Score modified test, scale D’Aubigne and Postel and Iowa’s test for complex functional evaluation of the hip joint, we have developed a test including information about the degree of pain; goniometry and manual muscle testing of the hip; locomotor test – type of gait and adjuvants; test for Daily Activities of Life. The test has been developed on the basis of expert assessment by doctors and physiotherapists of the proposed indicators for evaluation and determination of the weighting factors’ contribution to the general condition of the patient. Conclusion: The developed and tested method of complex functional assessment of the hip joint enables our colleagues, dealing with trauma and diseases of the hip, to use it in various research and scientific projects, as well as in general medical practice.

  19. Functional Assessment-Based Intervention for Selective Mutism

    Science.gov (United States)

    Kern, Lee; Starosta, Kristin M.; Bambara, Linda M.; Cook, Clayton R.; Gresham, Frank R.

    2007-01-01

    The process of functional assessment has emerged as an essential component for intervention development. Applications across divergent types of problem behavior, however, remain limited. This study evaluated the applicability of this promising approach to students with selective mutism. Two middle school students served as participants. The…

  20. Effects of Training in Functional Behavior Assessment

    Science.gov (United States)

    Dukes, Charles; Rosenberg, Howard; Brady, Michael

    2008-01-01

    The purpose of this study was to investigate the effectiveness of training special education teachers in the process of functional behavioral assessment (FBA) and subsequent development of recommendations to promote behavior change. An original evaluation instrument was developed that included measures of special education teachers' knowledge of…

  1. [Neuropsychological models of autism spectrum disorders - behavioral evidence and functional imaging].

    Science.gov (United States)

    Dziobek, Isabel; Bölte, Sven

    2011-03-01

    To review neuropsychological models of theory of mind (ToM), executive functions (EF), and central coherence (CC) as a framework for cognitive abnormalities in autism spectrum disorders (ASD). Behavioral and functional imaging studies are described that assess social-cognitive, emotional, and executive functions as well as locally oriented perception in ASD. Impairments in ToM and EF as well as alterations in CC are frequently replicated phenomena in ASD. Problems concerning social perception and ToM in particular have high explanatory value for clinical symptomatology. Brain activation patterns differ between individuals with and without ASD for ToM, EF, and CC functions. An approach focussing on reduced cortical connectivity seems to be increasingly favored over explanations focussing on single affected brain sites. A better understanding of the complexities of ASD in future research demands the integration of clinical, neuropsychological, functional imaging, and molecular genetics evidence. Weaknesses in ToM and EF as well as strengths in detail-focussed perception should be used for individual intervention planning.

  2. Individual-based model for radiation risk assessment

    Science.gov (United States)

    Smirnova, O.

    A mathematical model is developed which enables one to predict the life span probability for mammals exposed to radiation. It relates statistical biometric functions with statistical and dynamic characteristics of an organism's critical system. To calculate the dynamics of the latter, the respective mathematical model is used too. This approach is applied to describe the effects of low level chronic irradiation on mice when the hematopoietic system (namely, thrombocytopoiesis) is the critical one. For identification of the joint model, experimental data on hematopoiesis in nonirradiated and irradiated mice, as well as on mortality dynamics of those in the absence of radiation are utilized. The life span probability and life span shortening predicted by the model agree with corresponding experimental data. Modeling results show the significance of accounting for the variability of the individual radiosensitivity of critical system cells when estimating the radiation risk. These findings are corroborated by clinical data on persons involved in the elimination of the Chernobyl catastrophe aftereffects. All this makes it feasible to use the model for radiation risk assessments for cosmonauts and astronauts on long-term missions such as a voyage to Mars or a lunar colony. In this case the model coefficients have to be determined by making use of the available data for humans. Scenarios for the dynamics of dose accumulation during space flights should also be taken into account.

  3. Integrated performance assessment model for waste package behavior and radionuclide release

    International Nuclear Information System (INIS)

    Kossik, R.; Miller, I.; Cunnane, M.

    1992-01-01

    Golder Associates Inc. (GAI) has developed a probabilistic total system performance assessment and strategy evaluation model (RIP) which can be applied in an iterative manner to evaluate repository site suitability and guide site characterization. This paper describes one component of the RIP software, the waste package behavior and radionuclide release model. The waste package component model considers waste package failure by various modes, matrix alteration/dissolution, and radionuclide mass transfer. Model parameters can be described as functions of local environmental conditions. The waste package component model is coupled to component models for far-field radionuclide transport and disruptive events. The model has recently been applied to the proposed repository at Yucca Mountain

  4. Development of an assessment of functioning scale for prison environments.

    Science.gov (United States)

    Shelton, Deborah; Wakai, Sara

    2015-01-01

    This paper reports the development of a global assessment of functioning (GAF), modified from the DSM Axis V GAF for the prison environment. Focus groups, which were conducted with 36 correctional officers and clinicians in two prisons, provided descriptions of behavior in prison settings to re-align the GAF scale. Face validity was established. It was found that Habitation/Behavior, Social, and Symptoms emerged as important domains of functioning in prison. Gender differences were noted with regard to cleanliness, relationships and coping strategies. The cut-off score was identified at a score where offenders were unable to participate in a disciplinary process due to their mental illness. The structure of prison alters human functioning, requiring different assessment language and ratings to measure perceived behavioral norms and/or expectations. Front-line staff need the ability to observe and communicate behavioral changes quickly and accurately in a prison environment without undue burden upon their workload. This assessment was modified by front-line staff specifically for the prison environment to document quick and frequent assessments of observed changes over time in the offender population.

  5. Maximum entropy models of ecosystem functioning

    International Nuclear Information System (INIS)

    Bertram, Jason

    2014-01-01

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example

  6. Maximum entropy models of ecosystem functioning

    Energy Technology Data Exchange (ETDEWEB)

    Bertram, Jason, E-mail: jason.bertram@anu.edu.au [Research School of Biology, The Australian National University, Canberra ACT 0200 (Australia)

    2014-12-05

    Using organism-level traits to deduce community-level relationships is a fundamental problem in theoretical ecology. This problem parallels the physical one of using particle properties to deduce macroscopic thermodynamic laws, which was successfully achieved with the development of statistical physics. Drawing on this parallel, theoretical ecologists from Lotka onwards have attempted to construct statistical mechanistic theories of ecosystem functioning. Jaynes’ broader interpretation of statistical mechanics, which hinges on the entropy maximisation algorithm (MaxEnt), is of central importance here because the classical foundations of statistical physics do not have clear ecological analogues (e.g. phase space, dynamical invariants). However, models based on the information theoretic interpretation of MaxEnt are difficult to interpret ecologically. Here I give a broad discussion of statistical mechanical models of ecosystem functioning and the application of MaxEnt in these models. Emphasising the sample frequency interpretation of MaxEnt, I show that MaxEnt can be used to construct models of ecosystem functioning which are statistical mechanical in the traditional sense using a savanna plant ecology model as an example.
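
    The core MaxEnt calculation referred to above (choosing the least-biased distribution consistent with a constraint) can be sketched numerically. The example below finds the discrete maximum-entropy distribution over abundance classes subject to a fixed mean, using the standard exponential (Lagrange multiplier) form; the states and the mean constraint are hypothetical, and the sketch is not the savanna model discussed in the paper.

    ```python
    # Maximum-entropy distribution over discrete states with a fixed mean (generic sketch).
    import numpy as np
    from scipy.optimize import brentq

    states = np.arange(1, 11)      # hypothetical abundance classes 1..10
    target_mean = 3.0              # hypothetical constraint on the mean

    def mean_given_lambda(lam):
        """Mean of the MaxEnt distribution p_i proportional to exp(-lam * x_i)."""
        w = np.exp(-lam * states)
        p = w / w.sum()
        return (p * states).sum()

    # Solve for the Lagrange multiplier that matches the target mean.
    lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -5.0, 5.0)

    w = np.exp(-lam * states)
    p = w / w.sum()
    entropy = -(p * np.log(p)).sum()
    print("lambda =", round(lam, 3), " mean =", round((p * states).sum(), 3),
          " entropy =", round(entropy, 3))
    ```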

  7. Assessment of endothelial function and myocardial flow reserve using 15O-water PET without attenuation correction

    International Nuclear Information System (INIS)

    Tuffier, Stephane; Joubert, Michael; Bailliez, Alban; Legallois, Damien; Belin, Annette; Redonnet, Michel; Agostini, Denis; Manrique, Alain

    2016-01-01

    Myocardial blood flow (MBF) measurement using positron emission tomography (PET) from the washout rate of 15O-water is theoretically independent of tissue attenuation. The aim of this study was to evaluate the impact of not using attenuation correction in the assessment of coronary endothelial function and myocardial flow reserve (MFR) using 15O-water PET. We retrospectively processed 70 consecutive 15O-water PET examinations obtained at rest and during cold pressor testing (CPT) in patients with dilated cardiomyopathy (n = 58), or at rest and during adenosine infusion in heart transplant recipients (n = 12). Data were reconstructed with attenuation correction (AC) and without attenuation correction (NAC) using filtered backprojection, and MBF was quantified using a single compartmental model. The agreement between AC and NAC data was assessed using Lin's concordance correlation coefficient followed by Bland-Altman plot analysis. Regarding endothelial function, NAC PET showed poor reproducibility and poor agreement with AC PET data. Conversely, NAC PET demonstrated high reproducibility and a strong agreement with AC PET for the assessment of MFR. Non-attenuation-corrected 15O-water PET provided an accurate measurement of MFR compared to attenuation-corrected PET. However, non-attenuation-corrected PET data were less effective for the assessment of endothelial function using CPT in this population. (orig.)

  8. Functional Behavioral Assessment in Early Education Settings

    Science.gov (United States)

    Neilsen, Shelley; McEvoy, Mary

    2004-01-01

    Functional behavioral assessment (FBA) is the process of identifying the events in the environment that consistently precede and follow challenging behavior. The use of FBA has increased significantly following the reauthorization of the Individuals with Disabilities Education Act in 1997, which mandated FBAs be conducted when children with…

  9. Accountability for Early Childhood Education (Assessing Global Functioning).

    Science.gov (United States)

    Cassel, Russell N.

    1995-01-01

    Discusses the pacing of learning activity, knowledge of progress in student learning, teacher role, accountability in learning, feedback on knowledge of success, the global functioning assessment concept, and the mother surrogate. (RS)

  10. Prediction of Chemical Function: Model Development and ...

    Science.gov (United States)

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration at which it is present, thereby impacting exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning based models for classifying chemicals in terms of their likely functional roles in products based on structure was developed. This effort required collection, curation, and harmonization of publicly available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
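
    A minimal sketch of the modelling step described above: a cross-validated random-forest classifier mapping chemical descriptors to functional-use categories. The descriptor matrix, the three category labels, and the sample size are synthetic placeholders rather than the curated ExpoCast data, so the reported accuracy only illustrates the workflow.

    ```python
    # Random-forest functional-use classifier sketch (assumes scikit-learn installed).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_chem, n_desc = 600, 20

    # Hypothetical physicochemical/structural descriptors and functional-use labels.
    X = rng.normal(size=(n_chem, n_desc))
    labels = np.array(["solvent", "plasticizer", "fragrance"])
    y = labels[rng.integers(0, 3, size=n_chem)]

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)        # 5-fold cross-validated accuracy
    print("CV accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))

    # Fit on all data and inspect which descriptors drive the predictions.
    clf.fit(X, y)
    top = np.argsort(clf.feature_importances_)[::-1][:5]
    print("Most informative descriptor indices:", top)
    ```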

  11. Functioning with a Sticky Model.

    Science.gov (United States)

    Reys, Robert E.

    1981-01-01

    A model that can be effectively used to develop the notion of function and provide varied practice by using "real world" examples and concrete objects is covered. The use of Popsicle-sticks is featured, with some suggestions for tasks involving functions with one operation, two operations, and inverse operations covered. (MP)

  12. Functional Modeling of Neural-Glia Interaction

    DEFF Research Database (Denmark)

    Postnov, D.E.; Brazhe, N.A.; Sosnovtseva, Olga

    2012-01-01

    Functional modeling is an approach that focuses on the representation of the qualitative dynamics of the individual components (e.g. cells) of a system and on the structure of the interaction network.

  13. Ecosystem Model Skill Assessment. Yes We Can!

    Science.gov (United States)

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation), and a suite of time-series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is possible to not only assess the skill of a complicated marine ecosystem model, but that it is necessary do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable
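
    The four skill metrics named above can be computed directly from paired observed and modelled series. The sketch below implements them with numpy and scipy; the Nash-Sutcliffe form used here for "modeling efficiency" is an assumption about which of several variants the authors used, and the time series are synthetic.

    ```python
    # Skill metrics for paired observed vs. modelled series (numpy + scipy assumed).
    import numpy as np
    from scipy.stats import spearmanr

    def skill(obs, mod):
        obs, mod = np.asarray(obs, float), np.asarray(mod, float)
        aae = np.mean(np.abs(mod - obs))                         # average absolute error
        rmse = np.sqrt(np.mean((mod - obs) ** 2))                # root mean squared error
        mef = 1.0 - np.sum((mod - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)  # Nash-Sutcliffe
        rho, _ = spearmanr(obs, mod)                             # rank correlation
        return {"AAE": aae, "RMSE": rmse, "MEF": mef, "Spearman": rho}

    # Synthetic example: a noisy forecast of an observed biomass trajectory.
    t = np.arange(10)
    observed = 100 + 5 * t
    forecast = observed + np.random.default_rng(0).normal(scale=8, size=t.size)
    print({k: round(v, 3) for k, v in skill(observed, forecast).items()})
    ```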

  14. Kaon fragmentation function from NJL-jet model

    International Nuclear Information System (INIS)

    Matevosyan, Hrayr H.; Thomas, Anthony W.; Bentz, Wolfgang

    2010-01-01

    The NJL-jet model provides a sound framework for calculating the fragmentation functions in an effective chiral quark theory, where the momentum and isospin sum rules are satisfied without the introduction of ad hoc parameters [1]. Earlier studies of the pion fragmentation functions using the Nambu-Jona-Lasinio (NJL) model within this framework showed good qualitative agreement with the empirical parameterizations. Here we extend the NJL-jet model by including the strange quark. The corrections to the pion fragmentation function and corresponding kaon fragmentation functions are calculated using the elementary quark to quark-meson fragmentation functions from NJL. The results for the kaon fragmentation function exhibit a qualitative agreement with the empirical parameterizations, while the unfavored strange quark fragmentation to pions is shown to be of the same order of magnitude as the unfavored light quark's. The results of these studies are expected to provide important guidance for the analysis of a large variety of semi-inclusive data.

  15. Avian collision risk models for wind energy impact assessments

    International Nuclear Information System (INIS)

    Masden, E.A.; Cook, A.S.C.P.

    2016-01-01

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.

  16. Avian collision risk models for wind energy impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    Masden, E.A., E-mail: elizabeth.masden@uhi.ac.uk [Environmental Research Institute, North Highland College-UHI, University of the Highlands and Islands, Ormlie Road, Thurso, Caithness KW14 7EE (United Kingdom); Cook, A.S.C.P. [British Trust for Ornithology, The Nunnery, Thetford IP24 2PU (United Kingdom)

    2016-01-15

    With the increasing global development of wind energy, collision risk models (CRMs) are routinely used to assess the potential impacts of wind turbines on birds. We reviewed and compared the avian collision risk models currently available in the scientific literature, exploring aspects such as the calculation of a collision probability, inclusion of stationary components e.g. the tower, angle of approach and uncertainty. 10 models were cited in the literature and of these, all included a probability of collision of a single bird colliding with a wind turbine during passage through the rotor swept area, and the majority included a measure of the number of birds at risk. 7 out of the 10 models calculated the probability of birds colliding, whilst the remainder used a constant. We identified four approaches to calculate the probability of collision and these were used by others. 6 of the 10 models were deterministic and included the most frequently used models in the UK, with only 4 including variation or uncertainty in some way, the most recent using Bayesian methods. Despite their appeal, CRMs have their limitations and can be ‘data hungry’ as well as assuming much about bird movement and behaviour. As data become available, these assumptions should be tested to ensure that CRMs are functioning to adequately answer the questions posed by the wind energy sector. - Highlights: • We highlighted ten models available to assess avian collision risk. • Only 4 of the models included variability or uncertainty. • Collision risk models have limitations and can be ‘data hungry’. • It is vital that the most appropriate model is used for a given task.
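
    The basic structure shared by the reviewed CRMs, namely an expected number of collisions equal to the number of rotor transits multiplied by a per-transit collision probability and discounted by avoidance, can be sketched as follows. All parameter values are hypothetical, and the fixed per-transit probability stands in for the blade-geometry integral of models such as the Band model.

    ```python
    # Skeleton of a deterministic avian collision risk calculation (illustrative only).
    import math

    def expected_collisions(bird_flux_per_m2, rotor_radius_m, operating_fraction,
                            p_collision_per_transit, avoidance_rate, n_turbines):
        """Expected collisions per year for a wind farm.

        bird_flux_per_m2        : bird passages per m^2 of airspace per year (hypothetical)
        rotor_radius_m          : rotor radius
        operating_fraction      : fraction of time the rotor is turning
        p_collision_per_transit : probability a non-avoiding bird is struck during one transit
        avoidance_rate          : fraction of birds that take avoiding action
        n_turbines              : number of turbines in the farm
        """
        swept_area = math.pi * rotor_radius_m ** 2
        transits = bird_flux_per_m2 * swept_area * operating_fraction * n_turbines
        return transits * p_collision_per_transit * (1.0 - avoidance_rate)

    # Hypothetical inputs: results are highly sensitive to the assumed avoidance rate.
    for avoidance in (0.95, 0.98, 0.99):
        c = expected_collisions(bird_flux_per_m2=0.02, rotor_radius_m=50,
                                operating_fraction=0.85, p_collision_per_transit=0.08,
                                avoidance_rate=avoidance, n_turbines=30)
        print(f"avoidance {avoidance:.2f}: {c:.1f} expected collisions per year")
    ```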

  17. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  18. Functional summary statistics for the Johnson-Mehl model

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    The Johnson-Mehl germination-growth model is a spatio-temporal point process model which, among other things, has been used for the description of neurotransmitter datasets. However, for such datasets parametric Johnson-Mehl models fitted by maximum likelihood have not yet been evaluated by means of functional summary statistics. This paper therefore invents four functional summary statistics adapted to the Johnson-Mehl model, with two of them based on the second-order properties and the other two on the nuclei-boundary distances for the associated Johnson-Mehl tessellation. The theoretical properties of the functional summary statistics are investigated, non-parametric estimators are suggested, and their usefulness for model checking is examined in a simulation study. The functional summary statistics are also used for checking fitted parametric Johnson-Mehl models for a neurotransmitter dataset.

  19. Relationship of amotivation to neurocognition, self-efficacy and functioning in first-episode psychosis: a structural equation modeling approach.

    Science.gov (United States)

    Chang, W C; Kwong, V W Y; Hui, C L M; Chan, S K W; Lee, E H M; Chen, E Y H

    2017-03-01

    Better understanding of the complex interplay among key determinants of functional outcome is crucial to promoting recovery in psychotic disorders. However, this is understudied in the early course of illness. We aimed to examine the relationships among negative symptoms, neurocognition, general self-efficacy and global functioning in first-episode psychosis (FEP) patients using structural equation modeling (SEM). Three hundred and twenty-one Chinese patients aged 26-55 years presenting with FEP to an early intervention program in Hong Kong were recruited. Assessments encompassing symptom profiles, functioning, perceived general self-efficacy and a battery of neurocognitive tests were conducted. Negative symptom measurement was subdivided into amotivation and diminished expression (DE) domain scores based on the ratings in the Scale for the Assessment of Negative Symptoms. An initial SEM model showed no significant association between functioning and DE, which was therefore removed from further analysis. A final trimmed model yielded very good model fit (χ2 = 15.48, p = 0.63; comparative fit index = 1.00; root mean square error of approximation within the acceptable range). Amotivation, neurocognition and general self-efficacy had a direct effect on global functioning. Amotivation was also found to mediate a significant indirect effect of neurocognition and general self-efficacy on functioning. Neurocognition was not significantly related to general self-efficacy. Our results indicate a critical intermediary role of amotivation in linking neurocognitive impairment to functioning in FEP. General self-efficacy may represent a promising treatment target for improvement of motivational deficits and functional outcome in the early illness stage.
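
    The mediation finding reported above (amotivation carrying an indirect effect of neurocognition on functioning) can be illustrated with a much simpler product-of-coefficients calculation than the full SEM used in the study. The sketch below uses ordinary least squares on synthetic data; the variable names and effect sizes are invented for illustration.

    ```python
    # Simple mediation sketch (product of coefficients), not the full SEM of the study.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 300

    # Synthetic data consistent with the hypothesized pathway:
    # neurocognition -> amotivation -> functioning (plus a direct path).
    neurocog = rng.normal(size=n)
    amotivation = -0.5 * neurocog + rng.normal(scale=0.8, size=n)  # better cognition, less amotivation
    functioning = 0.3 * neurocog - 0.6 * amotivation + rng.normal(scale=0.8, size=n)

    # Path a: predictor -> mediator.
    a = sm.OLS(amotivation, sm.add_constant(neurocog)).fit().params[1]

    # Paths b (mediator -> outcome) and c' (direct effect), estimated jointly.
    X = sm.add_constant(np.column_stack([neurocog, amotivation]))
    fit = sm.OLS(functioning, X).fit()
    c_prime, b = fit.params[1], fit.params[2]

    print(f"direct effect c' = {c_prime:.2f}")
    print(f"indirect effect a*b = {a * b:.2f}")
    ```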

  20. Should we assess climate model predictions in light of severe tests?

    Science.gov (United States)

    Katzav, Joel

    2011-06-01

    According to Austro-British philosopher Karl Popper, a system of theoretical claims is scientific only if it is methodologically falsifiable, i.e., only if systematic attempts to falsify or severely test the system are being carried out [Popper, 2005, pp. 20, 62]. He holds that a test of a theoretical system is severe if and only if it is a test of the applicability of the system to a case in which the system's failure is likely in light of background knowledge, i.e., in light of scientific assumptions other than those of the system being tested [Popper, 2002, p. 150]. Popper counts the 1919 tests of general relativity's then unlikely predictions of the deflection of light in the Sun's gravitational field as severe. An implication of Popper's above condition for being a scientific theoretical system is the injunction to assess theoretical systems in light of how well they have withstood severe testing. Applying this injunction to assessing the quality of climate model predictions (CMPs), including climate model projections, would involve assigning a quality to each CMP as a function of how well it has withstood severe tests allowed by its implications for past, present, and near-future climate or, alternatively, as a function of how well the models that generated the CMP have withstood severe tests of their suitability for generating the CMP.

  1. Functional, Structural, and Neurotoxicity Biomarkers in Integrative Assessment of Concussions

    Directory of Open Access Journals (Sweden)

    Svetlana A Dambinova

    2016-10-01

    Full Text Available Concussion is a complex, heterogeneous process affecting the brain. Accurate assessment and diagnosis and appropriate management of concussion are essential to ensure athletes do not prematurely return to play, or others to work or active military duty, risking re-injury. To date, clinical diagnosis relies primarily on evaluating subjects for functional impairment using instruments that include neurocognitive testing, subjective symptom report, and neurobehavioral assessments, such as balance and vestibular-ocular reflex testing. Structural biomarkers, defined as advanced neuroimaging techniques, and biomarkers assessing neurotoxicity and immunoexcitotoxicity may complement the use of functional biomarkers. We hypothesize that neurotoxicity AMPA, NMDA, and kainate receptor biomarkers might be utilized as part of a comprehensive approach to concussion evaluations, with the goal of increasing diagnostic accuracy and facilitating treatment planning and prognostic assessment.

  2. Road Maintenance and Rehabilitation Program Using Functional and Structural Assessment

    Science.gov (United States)

    Setianingsih, A. I.; Sangaji, S.; Setyawan, A.

    2017-02-01

    Road sector development policy in Bangka Belitung emphasizes equitable development, opening up new areas for industrial development zones and for potential marine and coastal tourism; this shifts budget priorities toward building new roads and leaves only a minimal budget for maintaining existing roads. This study aimed to evaluate pavement condition both functionally and structurally, together with the growth of traffic density and the availability of funds for road maintenance, and then to analyze the influence of existing road condition, traffic density, and maintenance budget on the type of road maintenance management. The results are compared with the existing maintenance practice of the Public Works Department of Bangka Belitung province. The pavement evaluation comprises a functional assessment of pavement condition using the International Roughness Index (IRI), a structural assessment based on deflections measured with the Benkelman Beam (BB), and the actual traffic load. IRI values, deflections, and traffic growth for the years 2011-2015 were then used to build regression models and to obtain the corresponding correlation coefficients. The analysis showed that, with a budget of roughly the same magnitude as that of 2011-2015, giving priority to the maintenance of roads in good condition can keep 100% of the network in a steady, serviceable state. The recommendation is therefore to maintain roads while they are still in good condition, since such preservation provides the best results at the lowest maintenance cost.

  3. A deterministic width function model

    Directory of Open Access Journals (Sweden)

    C. E. Puente

    2003-01-01

    Full Text Available Use of a deterministic fractal-multifractal (FM) geometric method to model width functions of natural river networks, as derived distributions of simple multifractal measures via fractal interpolating functions, is reported. It is first demonstrated that the FM procedure may be used to simulate natural width functions, preserving their most relevant features like their overall shape and texture and their observed power-law scaling on their power spectra. It is then shown, via two natural river networks (Racoon and Brushy creeks in the United States), that the FM approach may also be used to closely approximate existing width functions.

  4. Bioavailability in the boris assessment model

    International Nuclear Information System (INIS)

    Norden, M.; Avila, R.; Gonze, M.A.; Tamponnet, C.

    2004-01-01

    The fifth framework EU project BORIS (Bioavailability Of Radionuclides In Soils: role of biological components and resulting improvement of prediction models) has three scientific objectives. The first is to improve understanding of the mechanisms governing the transfer of radionuclides to plants. The second is to improve existing predictive models of radionuclide interaction with soils by incorporating the knowledge acquired from the experimental results. The last and third objective is to extract from the experimental results some scientific basis for the development of bioremediation methods of radionuclides contaminated soils and to apprehend the role of additional non-radioactive pollutants on radionuclide bio-availability. This paper is focused on the second objective. The purpose of the BORIS assessment model is to describe the behaviour of radionuclides in the soil-plant system with the aim of making predictions of the time dynamics of the bioavailability of radionuclides in soil and the radionuclides concentrations in plants. To be useful the assessment model should be rather simple and use only a few parameters, which are commonly available or possible to measure for different sites. The model shall take into account, as much as possible, the results of the experimental studies and the mechanistic models developed in the BORIS project. One possible approach is to introduce in the assessment model a quantitative relationship between bioavailability of the radionuclides in soil and the soil properties. To do this an operational definition of bioavailability is needed. Here operational means experimentally measurable, directly or indirectly, and that the bioavailability can be translated into a mathematical expression. This paper describes the reasoning behind the chosen definition of bioavailability for the assessment model, how to derive operational expressions for the bioavailability and how to use them in the assessment model. (author)

  5. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    Science.gov (United States)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of a GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools, and subsequent derivation of flood risk maps with the help of a hydraulic model. Among many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques, and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is to assess the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability might be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EUROCORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the
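
    The emulator idea described above, replacing the expensive hydraulic model at a given cross-section with a cheap fitted input-output relation, can be sketched in a few lines. Here a cubic polynomial stands in for the nonlinear transfer function and a made-up rating-curve-like expression stands in for the hydraulic model; both are illustrative assumptions, not the actual simulator.

    ```python
    # Emulator sketch: fit a cheap surrogate to a few runs of an expensive model (illustrative).
    import numpy as np

    def hydraulic_model(discharge):
        """Stand-in for an expensive hydraulic simulation: water level at one cross-section."""
        return 1.5 + 0.8 * discharge ** 0.45       # hypothetical rating-curve-like response

    # "Training" runs of the expensive model at a handful of discharges.
    q_train = np.linspace(50, 1500, 12)
    h_train = hydraulic_model(q_train)

    # Cheap surrogate: cubic polynomial standing in for the nonlinear transfer function.
    coeffs = np.polyfit(q_train, h_train, deg=3)
    emulator = np.poly1d(coeffs)

    # Use the emulator for the many evaluations a Monte Carlo flood-risk study needs.
    q_scenarios = np.random.default_rng(0).uniform(50, 1500, size=5)
    for q in q_scenarios:
        print(f"Q = {q:7.1f} m3/s  emulated level = {emulator(q):.2f} m  "
              f"full model = {hydraulic_model(q):.2f} m")
    ```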

  6. A simple method for assessment of muscle force, velocity, and power producing capacities from functional movement tasks.

    Science.gov (United States)

    Zivkovic, Milena Z; Djuric, Sasa; Cuk, Ivan; Suzovic, Dejan; Jaric, Slobodan

    2017-07-01

    A range of force (F) and velocity (V) data obtained from functional movement tasks (e.g., running, jumping, throwing, lifting, cycling) performed under a variety of external loads has typically revealed strong and approximately linear F-V relationships. The regression model parameters reveal the maximum F (F-intercept), V (V-intercept), and power (P) producing capacities of the tested muscles. The aim of the present study was to evaluate the level of agreement between the routinely used "multiple-load model" and a simple "two-load model" based on direct assessment of the F-V relationship from only 2 applied external loads. Twelve participants were tested on maximum-performance vertical jumps, cycling, bench press throws, and bench pulls performed against a variety of different loads. All 4 tested tasks revealed both exceptionally strong relationships between the parameters of the 2 models (median R = 0.98) and a lack of meaningful differences between their magnitudes (fixed bias below 3.4%). Therefore, the addition of another load to the standard tests of various functional tasks, typically conducted under a single set of mechanical conditions, could allow for the assessment of muscle mechanical properties such as the muscle F, V, and P producing capacities.
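
    The "two-load model" above amounts to fitting a straight line through two (F, V) points and reading off its intercepts; for a linear F-V relationship the peak power is F0*V0/4. A minimal sketch, with hypothetical mean force and velocity values from a light and a heavy load:

    ```python
    # Two-load force-velocity model: line through two (F, V) points (illustrative values).
    def fv_parameters(f1, v1, f2, v2):
        """Return F0 (force intercept), V0 (velocity intercept), Pmax for a linear F-V relation."""
        slope = (f2 - f1) / (v2 - v1)          # negative for a descending F-V line
        f0 = f1 - slope * v1                   # extrapolated maximum force (at V = 0)
        v0 = -f0 / slope                       # extrapolated maximum velocity (at F = 0)
        p_max = f0 * v0 / 4.0                  # peak power of a linear F-V relationship
        return f0, v0, p_max

    # Hypothetical mean force (N) and velocity (m/s) from a light and a heavy load.
    f0, v0, p_max = fv_parameters(f1=300.0, v1=2.0, f2=550.0, v2=1.0)
    print(f"F0 = {f0:.0f} N, V0 = {v0:.2f} m/s, Pmax = {p_max:.0f} W")
    ```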

  7. Assessing Differential Item Functioning on the Test of Relational Reasoning

    Directory of Open Access Journals (Sweden)

    Denis Dumas

    2018-03-01

    Full Text Available The test of relational reasoning (TORR) is designed to assess the ability to identify complex patterns within visuospatial stimuli. The TORR is designed for use in school and university settings, and therefore, its measurement invariance across diverse groups is critical. In this investigation, a large sample, representative of a major university on key demographic variables, was collected, and the resulting data were analyzed using a multi-group, multidimensional item-response theory model-comparison procedure. No significant differential item functioning was found on any of the TORR items across any of the demographic groups of interest. This finding is interpreted as evidence of the cultural fairness of the TORR, and potential test-development choices that may have contributed to that cultural fairness are discussed.
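
    Although the study used a multi-group, multidimensional IRT model comparison, the underlying question (does group membership predict an item response after conditioning on ability?) can be illustrated with the simpler logistic-regression DIF procedure of Swaminathan and Rogers. The sketch below screens one dichotomous item on synthetic data; the item, the groups, and the use of the total score as an ability proxy are all hypothetical.

    ```python
    # Logistic-regression DIF screen for one dichotomous item (a simpler proxy for MIRT DIF).
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import chi2

    rng = np.random.default_rng(0)
    n = 1000
    group = rng.integers(0, 2, size=n)                       # 0/1 demographic group
    ability = rng.normal(size=n)
    total_score = ability + rng.normal(scale=0.3, size=n)    # observed ability proxy

    # Simulate an item with NO DIF: response depends on ability only.
    p = 1.0 / (1.0 + np.exp(-(1.2 * ability - 0.2)))
    item = rng.binomial(1, p)

    # Compare a model with ability only vs. ability + group (uniform DIF test).
    m0 = sm.Logit(item, sm.add_constant(total_score)).fit(disp=0)
    m1 = sm.Logit(item, sm.add_constant(np.column_stack([total_score, group]))).fit(disp=0)

    lr_stat = 2.0 * (m1.llf - m0.llf)
    p_value = chi2.sf(lr_stat, df=1)
    print(f"LR statistic = {lr_stat:.2f}, p = {p_value:.3f}  (large p => no evidence of DIF)")
    ```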

  8. Systematic assessment of blood circulation time of functionalized upconversion nanoparticles in the chick embryo

    Science.gov (United States)

    Nadort, Annemarie; Liang, Liuen; Grebenik, Ekaterina; Guller, Anna; Lu, Yiqing; Qian, Yi; Goldys, Ewa; Zvyagin, Andrei

    2015-12-01

    Nanoparticle-based delivery of drugs and contrast agents holds great promise in cancer research, because of the increased delivery efficiency compared to `free' drugs and dyes. A versatile platform to investigate nanotechnology is the chick embryo chorioallantoic membrane tumour model, due to its availability (easy, cheap) and accessibility (interventions, imaging). In our group, we developed this model using several tumour cell lines (e.g. breast cancer, colon cancer). In addition, we have synthesized in-house silica-coated photoluminescent upconversion nanoparticles with several functional groups (COOH, NH2, PEG). In this work we will present the systematic assessment of their in vivo blood circulation times. To this end, we injected chick embryos grown ex ovo with the functionalized UCNPs and obtained a small amount of blood at several time points after injection to create blood smears. The UCNP signal from the blood smears was quantified using a modified inverted microscope imaging set-up. The results of this systematic study are valuable to optimize biochemistry protocols and guide nanomedicine advancement in the versatile chick embryo tumour model.

  9. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes

  10. Models for Pesticide Risk Assessment

    Science.gov (United States)

    EPA considers the toxicity of the pesticide as well as the amount of pesticide to which a person or the environments may be exposed in risk assessment. Scientists use mathematical models to predict pesticide concentrations in exposure assessment.

  11. Functional Assessment in Transition and Rehabilitation for Adolescents and Adults with Learning Disorders.

    Science.gov (United States)

    Bullis, Michael, Ed.; Davis, Cheryl D., Ed.

    This manual is based on a 3-year, federally funded program, Project FASTER (Functional Assessment Services for Transition, Education, and Rehabilitation), that developed functional assessment procedures and provided assessment services to adolescents and adults with learning or behavioral disorders who were involved in school-based transition…

  12. Cardiorespiratory functional assessment after pediatric heart transplantation.

    Science.gov (United States)

    Pastore, E; Turchetta, A; Attias, L; Calzolari, A; Giordano, U; Squitieri, C; Parisi, F

    2001-12-01

    Limited data are available on the exercise capacity of young heart transplant recipients. The aim of this study was therefore to assess cardiorespiratory responses to exercise in this group of patients. Fourteen consecutive heart transplant recipients (six girls and eight boys, age-range 5-15 yr) and 14 healthy matched controls underwent a Bruce treadmill test to determine: duration of test; resting and maximum heart rates; maximum systolic blood pressure; peak oxygen consumption (VO2 peak); and cardiac output. Duration of test and heart rate increase were then compared with: time since transplantation, rejections per year, and immunosuppressive drugs received. The recipients also underwent the following lung function tests: forced vital capacity (FVC) and forced expiratory volume in 1 s (FEV1). When compared with healthy controls, transplant recipients had tachycardia at rest (126 +/- 3.7 beats/min) and took part in less physical activity, possibly owing to over-protective parents and teachers and to a lack of suitable supervised facilities. The authors stress the importance of a cardiorespiratory functional evaluation for assessment of health status and to encourage recipients, if possible, to undertake regular physical activity.

  13. Magnetic resonance in the assessment of renal function

    International Nuclear Information System (INIS)

    Knesplova, L.; Krestin, G.P.

    1998-01-01

    The kidneys are the most important organs to maintain homeostasis. In the assessment of renal functional disorders laboratory tests offer only indirect hints on location of the disease; radionuclide nephrography is hampered by low spatial resolution and radiologic methods provide only limited quantitative information. The MRI technique with fast pulse sequences and renally eliminated contrast agent has the capability of combining both anatomic and functional information. This article gives an overview on functional MRI of the kidneys with its possibilities and limitations. The clinical application of functional MRI allows a better understanding of some pathologic conditions such as urinary tract obstruction, renal insufficiency, effects of extracorporeal shock wave lithotripsy, different states of hydration, effects of drugs, vascular disorders, and effects of transplantation. (orig.)

  14. Assessment of the Rescorla-Wagner model.

    Science.gov (United States)

    Miller, R R; Barnet, R C; Grahame, N J

    1995-05-01

    The Rescorla-Wagner model has been the most influential theory of associative learning to emerge from the study of animal behavior over the last 25 years. Recently, equivalence to this model has become a benchmark in assessing connectionist models, with such equivalence often achieved by incorporating the Widrow-Hoff delta rule. This article presents the Rescorla-Wagner model's basic assumptions, reviews some of the model's predictive successes and failures, relates the failures to the model's assumptions, and discusses the model's heuristic value. It is concluded that the model has had a positive influence on the study of simple associative learning by stimulating research and contributing to new model development. However, this benefit should neither lead to the model being regarded as inherently "correct" nor imply that its predictions can be profitably used to assess other models.

  15. Model wave functions for the deuteron

    International Nuclear Information System (INIS)

    Certov, A.; Mathelitsch, L.; Moravcsik, M.J.

    1987-01-01

    Model wave functions are constructed for the deuteron to facilitate the unambiguous exploration of dependencies on the percentage D state and on the small-, medium-, and large-distance parts of the deuteron wave function. The wave functions are constrained by those deuteron properties which are accurately known experimentally, and are in an analytic form which is easily integrable in expressions usually encountered in the use of such wave functions

  16. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  17. An assessment of geographical distribution of different plant functional types over North America simulated using the CLASS–CTEM modelling framework

    Directory of Open Access Journals (Sweden)

    R. K. Shrestha

    2017-10-01

    Full Text Available The performance of the competition module of the CLASS–CTEM (Canadian Land Surface Scheme and Canadian Terrestrial Ecosystem Model) modelling framework is assessed at 1° spatial resolution over North America by comparing the simulated geographical distribution of its plant functional types (PFTs) with two observation-based estimates. The model successfully reproduces the broad geographical distribution of trees, grasses and bare ground although limitations remain. In particular, compared to the two observation-based estimates, the simulated fractional vegetation coverage is lower in the arid southwest North American region and higher in the Arctic region. The lower-than-observed simulated vegetation coverage in the southwest region is attributed to lack of representation of shrubs in the model and plausible errors in the observation-based data sets. The observation-based data indicate vegetation fractional coverage of more than 60 % in this arid region, despite only 200–300 mm of precipitation that the region receives annually, and observation-based leaf area index (LAI) values in the region are lower than one. The higher-than-observed vegetation fractional coverage in the Arctic is likely due to the lack of representation of moss and lichen PFTs and also likely because of inadequate representation of permafrost in the model as a result of which the C3 grass PFT performs overly well in the region. The model generally reproduces the broad spatial distribution and the total area covered by the two primary tree PFTs (needleleaf evergreen trees, NDL-EVG; and broadleaf cold deciduous trees, BDL-DCD-CLD) reasonably well. The simulated fractional coverage of tree PFTs increases after the 1960s in response to the CO2 fertilization effect and climate warming. Differences between observed and simulated PFT coverages highlight model limitations and suggest that the inclusion of shrubs, and moss and lichen PFTs, and an adequate representation of

  18. Dynamic behavioural model for assessing impact of regeneration actions on system availability: Application to weapon systems

    International Nuclear Information System (INIS)

    Monnin, Maxime; Iung, Benoit; Senechal, Olivier

    2011-01-01

    Mastering system availability all along the system life cycle is now a critical issue in systems engineering. This is even more true for military systems, which operate in a battle context. Because they must act in a hostile environment, they can become unavailable due to failures of, or damage to, the system. In both cases, system regeneration is required to restore availability. Many approaches based on system modelling have been developed to assess availability. However, very few of them take battlefield damage into account, and relevant methods for model development are missing. In this paper, a modelling method for the architecture of a weapon system of systems that supports regeneration engineering is proposed. On the one hand, this method relies on a unified failure/damage approach to extend acknowledged availability models. It allows failures and damage, as well as the possibility of regeneration, to be integrated into operational availability assessment. Architectures are modelled as a set of operational functions, supported by components that belong to the platform (system). Modelling atoms (i.e. elementary units of modelling) for both the architecture components and functions are defined, based on a state-space formalism. The Monte Carlo method is used to estimate availability through simulation. Availability of the architecture is defined on the basis of the possible states of the functions required for a mission. The state of a function directly depends on the states of the corresponding components (i.e. the components that support the function). Aggregation rules define the state of the function given the states of each component. Aggregation is defined by means of combinatorial equations of the component states. The modelling approach is supported by means of stochastic activity networks for model simulation. Results are analysed in terms of graphs of availability over mission days. Thus, given the simulation results, it is possible to plan combat
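
    A minimal sketch of the Monte Carlo estimation step described above is given below, assuming a single operational function supported by three components whose states follow simple failure/regeneration rates and an AND aggregation rule; the component names, rates and mission length are placeholders, not values from the paper.

```python
# Minimal Monte Carlo sketch of operational availability for one function
# supported by several components, in the spirit of the state-space /
# aggregation approach described above. Rates, mission length and the
# AND-aggregation rule are illustrative assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)

# (failure/damage rate per day, regeneration rate per day) -- hypothetical
components = {
    "sensor":   (0.02, 0.5),
    "effector": (0.01, 0.25),
    "platform": (0.005, 0.1),
}

mission_days = 30
n_runs = 2000
dt = 0.1                                   # time step in days
steps = int(mission_days / dt)

up_count = np.zeros(steps)
for _ in range(n_runs):
    state = {c: True for c in components}  # all components initially operational
    for t in range(steps):
        for c, (lam, mu) in components.items():
            if state[c]:
                state[c] = rng.random() >= lam * dt     # may fail or be damaged
            else:
                state[c] = rng.random() < mu * dt       # may be regenerated
        # Aggregation rule: the operational function needs every component.
        up_count[t] += all(state.values())

availability = up_count / n_runs
for day in range(0, mission_days, 5):
    print(f"day {day:2d}: estimated availability {availability[int(day / dt)]:.3f}")
```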

  19. Performance assessment of sealing systems. Conceptual and integrated modelling of plugs and seals

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Andre; Buhmann, Dieter; Kindlein, Jonathan; Lauke, Thomas

    2016-08-15

    The long-term isolation of radionuclides from the biosphere is the goal of storing radioactive waste in deep geological repositories. For repositories in rock salt, this goal is achieved on the one hand by the impermeable undisturbed part of the salt host rock formation and on the other hand by crushed salt, which is used to backfill the mine openings in the emplacement areas and galleries created during the construction of the repository. The crushed salt backfill is compacted over time and achieves a sufficiently high hydraulic resistance to avoid inflow of brines into the emplacement areas of the repository in the long term. Plugs and seals must additionally provide their sealing function during the early post-closure phase, until the compaction of the backfill is adequate and the permeability of the backfill is sufficiently low. To assess the future development of the waste repository, adequate knowledge of the material behaviour is necessary, and related mathematical models must be developed to be able to perform predictions on the long-term safety of the repository. An integrated performance assessment model was formulated that describes the long-term behaviour of a sealing built from salt concrete. The average permeability of the sealing changes with time after its emplacement owing to various processes, of which two were considered in a constitutive model: first, the healing of the EDZ in the host rock around the sealing, and second, the corrosion of the salt concrete material resulting from brine attack. Empirical parameter model functions were defined for both processes to reflect the actual behaviour. The mathematical model was implemented in the integrated performance assessment model LOPOS, which is used by GRS as a near-field model for repositories in salt. Deterministic and probabilistic calculations were performed with realistic parameters showing how the permeability of the sealing decreases during the first 2 000 years due to the healing of the EDZ
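
    The sketch below illustrates the kind of empirical parameter model function described above: an average sealing permeability that first decreases as the EDZ heals and later degrades as the salt concrete corrodes under brine attack. The functional forms and all numerical values are placeholders chosen for the illustration; they are not the parameter functions implemented in LOPOS.

```python
# Illustrative time-dependent permeability for a salt-concrete sealing,
# combining an EDZ-healing term that lowers permeability and a corrosion
# term that degrades it later. Functional forms, rate constants and the
# initial/asymptotic permeabilities are placeholders for the sketch only.
import numpy as np

def sealing_permeability(t_years,
                         k_initial=1e-17,      # m^2, freshly emplaced (assumed)
                         k_healed=1e-20,       # m^2, after EDZ healing (assumed)
                         tau_healing=700.0,    # years, healing time scale (assumed)
                         corrosion_rate=5e-25  # m^2 per year of brine attack (assumed)
                         ):
    """Average permeability of the sealing as a function of time."""
    healing = k_healed + (k_initial - k_healed) * np.exp(-t_years / tau_healing)
    corrosion = corrosion_rate * t_years
    return healing + corrosion

for t in (0, 500, 1000, 2000, 10000, 50000):
    print(f"t = {t:6d} a   k = {sealing_permeability(t):.2e} m^2")
```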

  20. Performance assessment of sealing systems. Conceptual and integrated modelling of plugs and seals

    International Nuclear Information System (INIS)

    Ruebel, Andre; Buhmann, Dieter; Kindlein, Jonathan; Lauke, Thomas

    2016-08-01

    The long-term isolation of radionuclides from the biosphere is the goal of storing radioactive waste in deep geological repositories. For repositories in rock salt, this goal is achieved on the one hand by the impermeable undisturbed part of the salt host rock formation and on the other hand by crushed salt, which is used to backfill the mine openings in the emplacement areas and galleries created during the construction of the repository. The crushed salt backfill is compacted over time and achieves a sufficiently high hydraulic resistance to avoid inflow of brines into the emplacement areas of the repository in the long term. Plugs and seals must additionally provide their sealing function during the early post-closure phase, until the compaction of the backfill is adequate and the permeability of the backfill is sufficiently low. To assess the future development of the waste repository, adequate knowledge of the material behaviour is necessary, and related mathematical models must be developed to be able to perform predictions on the long-term safety of the repository. An integrated performance assessment model was formulated that describes the long-term behaviour of a sealing built from salt concrete. The average permeability of the sealing changes with time after its emplacement owing to various processes, of which two were considered in a constitutive model: first, the healing of the EDZ in the host rock around the sealing, and second, the corrosion of the salt concrete material resulting from brine attack. Empirical parameter model functions were defined for both processes to reflect the actual behaviour. The mathematical model was implemented in the integrated performance assessment model LOPOS, which is used by GRS as a near-field model for repositories in salt. Deterministic and probabilistic calculations were performed with realistic parameters showing how the permeability of the sealing decreases during the first 2 000 years due to the healing of the EDZ

  1. Understanding National Models for Climate Assessments

    Science.gov (United States)

    Dave, A.; Weingartner, K.

    2017-12-01

    National-level climate assessments have been produced or are underway in a number of countries. These efforts showcase a variety of approaches to mapping climate impacts onto human and natural systems, and involve a variety of development processes, organizational structures, and intended purposes. This presentation will provide a comparative overview of national 'models' for climate assessments worldwide, drawing from a geographically diverse group of nations with varying capacities to conduct such assessments. Using an illustrative sampling of assessment models, the presentation will highlight the range of assessment mandates and requirements that drive this work, methodologies employed, focal areas, and the degree to which international dimensions are included for each nation's assessment. This not only allows the U.S. National Climate Assessment to be better understood within an international context, but provides the user with an entry point into other national climate assessments around the world, enabling a better understanding of the risks and vulnerabilities societies face.

  2. Hydrologic-geochemical modeling needs for nuclear waste disposal systems performance assessments from the NEA perspective

    International Nuclear Information System (INIS)

    Muller, A.B.

    1986-01-01

    Credible scenarios for releases from high level nuclear waste repositories require radionuclides to be mobilized and transported by ground water. The capability to predict ground water flow velocities and directions, as well as radionuclide concentrations in the flow system as a function of time, is essential for assessing the performance of disposal systems. The former can be estimated by hydrologic modeling, while the concentrations can be predicted by geochemical modeling. The complementary use of empirical and phenomenological approaches to the geochemical modeling, when effectively coupled with hydrologic models, can provide the tools needed for realistic performance assessment. An overview of the activities of the NEA in this area, with emphasis on the geochemical data bases (ISIRS for K_d data and the thermochemical data base critical review), rock/water interaction modeling (code development and short-courses), and hydrologic-geochemical code coupling (workshop and in-house activities) is presented in this paper from the perspective of probabilistic risk assessment needs. (author)
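
    A common way such K_d data enter coupled hydrologic-geochemical calculations is through the linear-sorption retardation factor, sketched below; the bulk density, porosity, K_d and pore-water velocity are example values, not data from the NEA databases mentioned above.

```python
# Standard retardation-factor relation often used when distribution
# coefficient (Kd) data are coupled to a hydrologic transport model:
# R = 1 + (rho_b / n) * Kd, and the retarded nuclide velocity v / R.
# All numerical values below are illustrative examples, not NEA data.

def retardation_factor(kd_m3_per_kg, bulk_density_kg_m3=1600.0, porosity=0.3):
    """Linear-sorption retardation factor R = 1 + (rho_b / n) * Kd."""
    return 1.0 + (bulk_density_kg_m3 / porosity) * kd_m3_per_kg

def retarded_velocity(pore_velocity_m_per_yr, kd_m3_per_kg, **kwargs):
    """Effective radionuclide velocity under linear equilibrium sorption."""
    return pore_velocity_m_per_yr / retardation_factor(kd_m3_per_kg, **kwargs)

kd_example = 0.05     # m^3/kg, illustrative sorption coefficient
v_water = 2.0         # m/yr, illustrative pore-water velocity
print(f"R = {retardation_factor(kd_example):.0f}, "
      f"nuclide velocity = {retarded_velocity(v_water, kd_example):.4f} m/yr")
```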

  3. An ethical assessment model for digital disease detection technologies.

    Science.gov (United States)

    Denecke, Kerstin

    2017-09-20

    Digital epidemiology, also referred to as digital disease detection (DDD), has successfully provided methods and strategies for using information technology to support infectious disease monitoring and surveillance or to understand attitudes and concerns about infectious diseases. However, Internet-based research and social media usage in epidemiology and healthcare pose new technical, functional and formal challenges. The focus of this paper is on the ethical issues to be considered when integrating digital epidemiology with existing practices. Taking existing ethical guidelines and the results from the EU project M-Eco and SORMAS as a starting point, we develop an ethical assessment model aimed at providing support in identifying relevant ethical concerns in future DDD projects. The assessment model has four dimensions: user, application area, data source and methodology. The model supports becoming aware of, identifying and describing the ethical dimensions of a DDD technology or use case, and identifying the ethical issues raised by the technology's use from different perspectives. It can be applied in an interdisciplinary meeting to collect different viewpoints on a DDD system even before the implementation starts, and aims at triggering discussions and finding solutions for risks that might not be acceptable even in the development phase. From the answers, ethical issues concerning confidence, privacy, data and patient security or justice may be judged and weighted.

  4. Predictions of models for environmental radiological assessment

    International Nuclear Information System (INIS)

    Peres, Sueli da Silva; Lauria, Dejanira da Costa; Mahler, Claudio Fernando

    2011-01-01

    In the field of environmental impact assessment, models are used for estimating source terms, environmental dispersion and transfer of radionuclides, exposure pathways, radiation doses and the risk to human beings. Although it is recognized that specific local data are important to improve the quality of dose assessment results, in practice obtaining them can be very difficult and expensive. Sources of uncertainty are numerous, among which we can cite: the subjectivity of modelers, exposure scenarios and pathways, the codes used and general parameters. The various models available use different mathematical approaches with different complexities that can result in different predictions. Thus, for the same inputs, different models can produce very different outputs. This paper briefly presents the main advances in the field of environmental radiological assessment that aim to improve the reliability of the models used in the assessment of environmental radiological impact. A model intercomparison exercise supplied incompatible results for 137Cs and 60Co, reinforcing the need to develop reference methodologies for environmental radiological assessment that allow dose estimates to be confronted on a common basis of comparison. The results of the intercomparison exercise are presented briefly. (author)

  5. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop

  6. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus

  7. Inter-doctor variations in the assessment of functional incapacities by insurance physicians

    Directory of Open Access Journals (Sweden)

    Schellart Antonius JM

    2011-11-01

    Full Text Available Abstract Background The aim of this study was to determine the - largely unexplored - extent of systematic variation in the work disability assessment by Dutch insurance physicians (IPs) of employees on long-term sick leave, and to ascertain whether this variation was associated with the individual characteristics and opinions of IPs. Methods In March 2008 we conducted a survey among IPs on the basis of the 'Attitude - Social norm - self-Efficacy' (ASE) model. We used the ensuing data to form latent variables for the ASE constructs. We then linked the background variables and the measured constructs for IPs (n = 199) working at regional offices (n = 27) to the work disability assessments of clients (n = 83,755) and their characteristics. These assessments were carried out between July 2003 and April 2008. We performed multilevel regression analysis on three important assessment outcomes: No Sustainable Capacity or Restrictions for Working Hours (binomial), Functional Incapacity Score (scale 0-6) and Maximum Work Disability Class (binomial). We calculated Intra Class Correlations (ICCs) at IP level and office level and explained variances (R2) for the three outcomes. A higher ICC reflects stronger systematic variation. Results The ICCs at IP level were approximately 6% for No Sustainable Capacity or Restrictions for Working Hours and Maximum Work Disability Class and 12% for Functional Incapacity Score. Background IP variables and the measured ASE constructs for physicians contributed very little to the variation - at most 1%. The ICCs at office level ranged from 0% to around 1%. The R2 was 11% for No Sustainable Capacity or Restrictions for Working Hours, 19% for Functional Incapacity Score and 37% for Maximum Work Disability Class. Conclusion Our study uncovered small to moderate systematic variations in the outcome of disability assessments in the Netherlands. However, the individual characteristics and opinions of insurance physicians have very
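
    For readers unfamiliar with the ICC used above, the sketch below fits a two-level random-intercept model to synthetic physician/client data and computes the physician-level ICC as the between-physician variance divided by the total variance; the simulated data merely stand in for the real assessment outcomes, which are not reproduced here.

```python
# Minimal sketch of the physician-level intraclass correlation (ICC) from a
# two-level random-intercept model, using synthetic data as a stand-in for
# the real incapacity scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_physicians, clients_per_physician = 50, 40
physician_effect = rng.normal(0.0, 0.6, size=n_physicians)        # between-IP variation
rows = []
for ip in range(n_physicians):
    for _ in range(clients_per_physician):
        score = 3.0 + physician_effect[ip] + rng.normal(0.0, 1.5)  # e.g. incapacity score
        rows.append({"ip": ip, "score": score})
df = pd.DataFrame(rows)

result = smf.mixedlm("score ~ 1", df, groups=df["ip"]).fit()
var_between = float(result.cov_re.iloc[0, 0])   # physician-level variance
var_within = result.scale                       # residual (client-level) variance
icc = var_between / (var_between + var_within)
print(f"ICC at physician level: {icc:.2%}")
```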

  8. Assessment of lung function in a large cohort of patients with acromegaly.

    Science.gov (United States)

    Störmann, Sylvère; Gutt, Bodo; Roemmler-Zehrer, Josefine; Bidlingmaier, Martin; Huber, Rudolf M; Schopohl, Jochen; Angstwurm, Matthias W

    2017-07-01

    Acromegaly is associated with increased mortality due to respiratory disease. To date, lung function in patients with acromegaly has only been assessed in small studies, with contradicting results. We assessed lung function parameters in a large cohort of patients with acromegaly. Lung function of acromegaly patients was prospectively assessed using spirometry, blood gas analysis and body plethysmography. Biochemical indicators of acromegaly were assessed through measurement of growth hormone and IGF-I levels. This study was performed at the endocrinology outpatient clinic of a tertiary referral center in Germany. We prospectively tested lung function of 109 acromegaly patients (53 male, 56 female; aged 24-82 years; 80 with active acromegaly) without severe acute or chronic pulmonary disease. We compared lung volume, air flow, airway resistance and blood gases to normative data. Acromegaly patients had greater lung volumes (maximal vital capacity, intra-thoracic gas volume and residual volume: P  acromegaly. Female patients had significantly altered lung function in terms of subclinical airway obstruction. In our cross-sectional analysis of lung function in 109 patients with acromegaly, lung volumes were increased compared to healthy controls. Additionally, female patients showed signs of subclinical airway obstruction. There was no difference between patients with active acromegaly compared with patients biochemically in remission. © 2017 European Society of Endocrinology.

  9. Gadoxetate-enhanced MR imaging and compartmental modelling to assess hepatocyte bidirectional transport function in rats with advanced liver fibrosis

    Energy Technology Data Exchange (ETDEWEB)

    Giraudeau, Celine; Leporq, Benjamin; Doblas, Sabrina [University Paris Diderot, Sorbonne Paris Cite, Hopital Beaujon, Laboratory of Imaging Biomarkers, UMR1149 Inserm, Clichy (France); Lagadec, Matthieu; Daire, Jean-Luc; Van Beers, Bernard E. [University Paris Diderot, Sorbonne Paris Cite, Hopital Beaujon, Laboratory of Imaging Biomarkers, UMR1149 Inserm, Clichy (France); Beaujon University Hospital Paris Nord, Department of Radiology, Clichy (France); Pastor, Catherine M. [University Paris Diderot, Sorbonne Paris Cite, Hopital Beaujon, Laboratory of Imaging Biomarkers, UMR1149 Inserm, Clichy (France); Hopitaux Universitaires de Geneve, Departement d' Imagerie et des Sciences de l' Information Medicale, Geneva (Switzerland)

    2017-05-15

    Changes in the expression of hepatocyte membrane transporters in advanced fibrosis decrease the hepatic transport function of organic anions. The aim of our study was to assess if these changes can be evaluated with pharmacokinetic analysis of the hepatobiliary transport of the MR contrast agent gadoxetate. Dynamic gadoxetate-enhanced MRI was performed in 17 rats with advanced fibrosis and 8 normal rats. After deconvolution, hepatocyte three-compartmental analysis was performed to calculate the hepatocyte influx, biliary efflux and sinusoidal backflux rates. The expression of Oatp1a1, Mrp2 and Mrp3 organic anion membrane transporters was assessed with reverse transcription polymerase chain reaction. In the rats with advanced fibrosis, the influx and efflux rates of gadoxetate decreased and the backflux rate increased significantly (p = 0.003, 0.041 and 0.010, respectively). Significant correlations were found between influx and Oatp1a1 expression (r = 0.78, p < 0.001), biliary efflux and Mrp2 (r = 0.50, p = 0.016) and sinusoidal backflux and Mrp3 (r = 0.61, p = 0.002). These results show that changes in the bidirectional organic anion hepatocyte transport function in rats with advanced liver fibrosis can be assessed with compartmental analysis of gadoxetate-enhanced MRI. (orig.)
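
    A generic three-compartment formulation of the kind used for such hepatobiliary kinetics is sketched below, with hepatocyte influx, biliary efflux and sinusoidal backflux rate constants; the rate values, the toy input function and the omission of the deconvolution step are assumptions of the sketch, not the study's fitted model.

```python
# Generic three-compartment sketch (extracellular space -> hepatocyte -> bile)
# for gadoxetate-like kinetics, with hepatocyte influx k_in, biliary efflux
# k_bile and sinusoidal backflux k_back. All values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k_in, k_bile, k_back = 0.8, 0.08, 0.05     # 1/min, assumed rate constants

def aif(t):
    """Toy vascular input function (bi-exponential bolus)."""
    return 5.0 * (np.exp(-0.3 * t) - np.exp(-3.0 * t))

def rhs(t, y):
    c_ec, c_hep, c_bile = y                # extracellular, hepatocyte, bile
    dc_ec = aif(t) - k_in * c_ec + k_back * c_hep
    dc_hep = k_in * c_ec - (k_bile + k_back) * c_hep
    dc_bile = k_bile * c_hep
    return [dc_ec, dc_hep, dc_bile]

sol = solve_ivp(rhs, (0.0, 30.0), [0.0, 0.0, 0.0], dense_output=True)
for t in (5, 15, 30):
    c_ec, c_hep, c_bile = sol.sol(t)
    print(f"t = {t:2d} min  hepatocyte signal ~ {c_hep:.2f}, bile ~ {c_bile:.2f}")
```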

  10. Gadoxetate-enhanced MR imaging and compartmental modelling to assess hepatocyte bidirectional transport function in rats with advanced liver fibrosis

    International Nuclear Information System (INIS)

    Giraudeau, Celine; Leporq, Benjamin; Doblas, Sabrina; Lagadec, Matthieu; Daire, Jean-Luc; Van Beers, Bernard E.; Pastor, Catherine M.

    2017-01-01

    Changes in the expression of hepatocyte membrane transporters in advanced fibrosis decrease the hepatic transport function of organic anions. The aim of our study was to assess if these changes can be evaluated with pharmacokinetic analysis of the hepatobiliary transport of the MR contrast agent gadoxetate. Dynamic gadoxetate-enhanced MRI was performed in 17 rats with advanced fibrosis and 8 normal rats. After deconvolution, hepatocyte three-compartmental analysis was performed to calculate the hepatocyte influx, biliary efflux and sinusoidal backflux rates. The expression of Oatp1a1, Mrp2 and Mrp3 organic anion membrane transporters was assessed with reverse transcription polymerase chain reaction. In the rats with advanced fibrosis, the influx and efflux rates of gadoxetate decreased and the backflux rate increased significantly (p = 0.003, 0.041 and 0.010, respectively). Significant correlations were found between influx and Oatp1a1 expression (r = 0.78, p < 0.001), biliary efflux and Mrp2 (r = 0.50, p = 0.016) and sinusoidal backflux and Mrp3 (r = 0.61, p = 0.002). These results show that changes in the bidirectional organic anion hepatocyte transport function in rats with advanced liver fibrosis can be assessed with compartmental analysis of gadoxetate-enhanced MRI. (orig.)

  11. Modeling the World Health Organization Disability Assessment Schedule II using non-parametric item response models.

    Science.gov (United States)

    Galindo-Garre, Francisca; Hidalgo, María Dolores; Guilera, Georgina; Pino, Oscar; Rojo, J Emilio; Gómez-Benito, Juana

    2015-03-01

    The World Health Organization Disability Assessment Schedule II (WHO-DAS II) is a multidimensional instrument developed for measuring disability. It comprises six domains (understanding and communicating, getting around, self-care, getting along with others, life activities and participation in society). The main purpose of this paper is the evaluation of the psychometric properties of each domain of the WHO-DAS II with parametric and non-parametric Item Response Theory (IRT) models. A secondary objective is to assess whether the WHO-DAS II items within each domain form a hierarchy of invariantly ordered severity indicators of disability. A sample of 352 patients with a schizophrenia spectrum disorder is used in this study. The 36-item WHO-DAS II was administered during the consultation. Partial Credit and Mokken scale models are used to study the psychometric properties of the questionnaire. The psychometric properties of the WHO-DAS II scale are satisfactory for all the domains. However, we identify a few items that do not discriminate satisfactorily between different levels of disability and cannot be invariantly ordered in the scale. In conclusion, the WHO-DAS II can be used to assess overall disability in patients with schizophrenia, but some domains are too general to assess functionality in these patients because they contain items that are not applicable to this pathology. Copyright © 2014 John Wiley & Sons, Ltd.
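
    To make the parametric (Partial Credit Model) side of the analysis concrete, the sketch below computes category probabilities for a single polytomous item with hypothetical thresholds; it is not fitted to the WHO-DAS II data.

```python
# Minimal sketch of Partial Credit Model category probabilities for one
# polytomous item: P(X = k | theta) is proportional to
# exp(sum_{j <= k} (theta - delta_j)). Thresholds are invented.
import numpy as np

def pcm_probabilities(theta, thresholds):
    """Category probabilities (categories 0..m) under the Partial Credit Model."""
    # Cumulative sums of (theta - delta_j); category 0 has an empty sum = 0.
    steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(thresholds))))
    expo = np.exp(steps - steps.max())          # stabilised softmax
    return expo / expo.sum()

thresholds = [-1.0, 0.2, 1.5]                   # delta_1..delta_3, hypothetical
for theta in (-2.0, 0.0, 2.0):
    probs = pcm_probabilities(theta, thresholds)
    line = ", ".join(f"P(X={k})={p:.2f}" for k, p in enumerate(probs))
    print(f"theta = {theta:+.1f}: {line}")
```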

  12. An ecologically valid performance-based social functioning assessment battery for schizophrenia.

    Science.gov (United States)

    Shi, Chuan; He, Yi; Cheung, Eric F C; Yu, Xin; Chan, Raymond C K

    2013-12-30

    Psychiatrists nowadays pay more attention to the social functioning outcomes of schizophrenia. Evaluating real-world functioning in schizophrenia is a challenging task, and because of cultural differences no such instrument existed for the Chinese setting. This study aimed to report the validation of an ecologically valid performance-based everyday functioning assessment for schizophrenia, namely the Beijing Performance-based Functional Ecological Test (BJ-PERFECT). Fifty community-dwelling adults with schizophrenia and 37 healthy controls were recruited. Fifteen of the healthy controls were re-tested one week later. All participants were administered the University of California, San Diego, Performance-based Skills Assessment-Brief version (UPSA-B) and the MATRICS Consensus Cognitive Battery (MCCB). The finalized assessment included three subdomains: transportation, financial management and work ability. The test-retest and inter-rater reliabilities were good. The total score significantly correlated with the UPSA-B. The performance of individuals with schizophrenia was significantly more impaired than that of healthy controls, especially in the domain of work ability. Among individuals with schizophrenia, functional outcome was influenced by premorbid functioning, negative symptoms and neurocognition, such as processing speed, visual learning and attention/vigilance. © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Development of bilingual tools to assess functional health patterns.

    Science.gov (United States)

    Krozy, R E; McCarthy, N C

    1999-01-01

    The theory and process of developing bilingual assessment tools based on Gordon's 11 functional health patterns. To facilitate assessing the individual, family, and community in a student clinical practicum in a Spanish-speaking country. Multiple family and community health promotion theories; translation theories, Gordon's Manual of Nursing Diagnosis (1982); translation/back-translation involving Ecuadorian faculty and students; student community assessments; faculty and staff workshops in Ecuador. Bilingual, culturally sensitive health assessment tools facilitate history taking, establish nursing diagnoses and interventions, and promote mutual learning. These outcomes demonstrate potential application to other systems in the international nursing community.

  14. [Assessment of cognitive functions in internal medicine].

    Science.gov (United States)

    Capron, J

    2015-12-01

    The evaluation of cognitive functions can be performed using two approaches: a quantitative one, based on screening tools; a qualitative one, based on the examination of specific cognitive functions. The quantitative approach offers a pragmatic process: to screen rapidly for a cognitive dysfunction that may require assistance or treatments. We will present three screening tools and their diagnostic value: the clock test, the Mini Mental State Examination and the Montreal Cognitive Assessment. They help select patients who require a more detailed examination to precisely diagnose their cognitive dysfunction. We propose a way to perform a detailed cognitive examination at the bedside, including the examination of alertness, attention, memory, language, frontal functions, praxis and hemi-neglect. This simple examination indicates the location of the cerebral lesion and sometimes suggests the underlying disease. Copyright © 2015. Published by Elsevier SAS.

  15. Assessment of Global Functioning in Adolescents with Autism Spectrum Disorders: Utility of the Developmental Disability-Child Global Assessment Scale

    Science.gov (United States)

    White, Susan W.; Smith, Laura A.; Schry, Amie R.

    2014-01-01

    Assessment of global functioning is an important consideration in treatment outcome research; yet, there is little guidance on its evidence-based assessment for children with autism spectrum disorders. This study investigated the utility and validity of clinician-rated global functioning using the Developmental Disability-Child Global Assessment…

  16. Validation of ultrasonography for non-invasive assessment of diaphragm function in muscular dystrophy.

    Science.gov (United States)

    Whitehead, Nicholas P; Bible, Kenneth L; Kim, Min Jeong; Odom, Guy L; Adams, Marvin E; Froehner, Stanley C

    2016-12-15

    Duchenne muscular dystrophy (DMD) is a severe, degenerative muscle disease that is commonly studied using the mdx mouse. The mdx diaphragm muscle closely mimics the pathophysiological changes in DMD muscles. mdx diaphragm force is commonly assessed ex vivo, precluding time course studies. Here we used ultrasonography to evaluate time-dependent changes in diaphragm function in vivo, by measuring diaphragm movement amplitude. In mdx mice, diaphragm amplitude decreased with age and values were much lower than for wild-type mice. Importantly, diaphragm amplitude strongly correlated with ex vivo specific force values. Micro-dystrophin administration increased mdx diaphragm amplitude by 26% after 4 weeks. Diaphragm amplitude correlated positively with ex vivo force values and negatively with diaphragm fibrosis, a major cause of DMD muscle weakness. These studies validate diaphragm ultrasonography as a reliable technique for assessing time-dependent changes in mdx diaphragm function in vivo. This technique will be valuable for testing potential therapies for DMD. Duchenne muscular dystrophy (DMD) is a severe, degenerative muscle disease caused by dystrophin mutations. The mdx mouse is a widely used animal model of DMD. The mdx diaphragm muscle most closely recapitulates key features of DMD muscles, including progressive fibrosis and considerable force loss. Diaphragm function in mdx mice is commonly evaluated by specific force measurements ex vivo. While useful, this method only measures force from a small muscle sample at one time point. Therefore, accurate assessment of diaphragm function in vivo would provide an important advance to study the time course of functional decline and treatment benefits. Here, we evaluated an ultrasonography technique for measuring time-dependent changes of diaphragm function in mdx mice. Diaphragm movement amplitude values for mdx mice were considerably lower than those for wild-type, decreased from 8 to 18 months of age, and correlated

  17. Magnetic resonance in the assessment of renal function

    Energy Technology Data Exchange (ETDEWEB)

    Knesplova, L.; Krestin, G.P. [Department of Radiology, University Hospital Zurich (Switzerland)

    1998-03-01

    The kidneys are the most important organs to maintain homeostasis. In the assessment of renal functional disorders laboratory tests offer only indirect hints on location of the disease; radionuclide nephrography is hampered by low spatial resolution and radiologic methods provide only limited quantitative information. The MRI technique with fast pulse sequences and renally eliminated contrast agent has the capability of combining both anatomic and functional information. This article gives an overview on functional MRI of the kidneys with its possibilities and limitations. The clinical application of functional MRI allows a better understanding of some pathologic conditions such as urinary tract obstruction, renal insufficiency, effects of extracorporeal shock wave lithotripsy, different states of hydration, effects of drugs, vascular disorders, and effects of transplantation. (orig.) With 9 figs., 62 refs.

  18. STAMINA - Model description. Standard Model Instrumentation for Noise Assessments

    NARCIS (Netherlands)

    Schreurs EM; Jabben J; Verheijen ENG; CMM; mev

    2010-01-01

    This report describes the STAMINA model, which stands for Standard Model Instrumentation for Noise Assessments and was developed by RIVM. The institute uses this standard model to map environmental noise in the Netherlands. The model is based on the Standaard Karteringsmethode

  19. AN APPLICATION OF FUNCTIONAL MULTIVARIATE REGRESSION MODEL TO MULTICLASS CLASSIFICATION

    OpenAIRE

    Krzyśko, Mirosław; Smaga, Łukasz

    2017-01-01

    In this paper, the scale response functional multivariate regression model is considered. By using the basis functions representation of functional predictors and regression coefficients, this model is rewritten as a multivariate regression model. This representation of the functional multivariate regression model is used for multiclass classification for multivariate functional data. Computational experiments performed on real labelled data sets demonstrate the effectiveness of the proposed ...

  20. PARALLEL MODELS OF ASSESSMENT: INFANT MENTAL HEALTH AND THERAPEUTIC ASSESSMENT MODELS INTERSECT THROUGH EARLY CHILDHOOD CASE STUDIES.

    Science.gov (United States)

    Gart, Natalie; Zamora, Irina; Williams, Marian E

    2016-07-01

    Therapeutic Assessment (TA; S.E. Finn & M.E. Tonsager, 1997; J.D. Smith, 2010) is a collaborative, semistructured model that encourages self-discovery and meaning-making through the use of assessment as an intervention approach. This model shares core strategies with infant mental health assessment, including close collaboration with parents and caregivers, active participation of the family, a focus on developing new family stories and increasing parents' understanding of their child, and reducing isolation and increasing hope through the assessment process. The intersection of these two theoretical approaches is explored, using case studies of three infants/young children and their families to illustrate the application of TA to infant mental health. The case of an 18-month-old girl whose parents fear that she has bipolar disorder illustrates the core principles of the TA model, highlighting the use of assessment intervention sessions and the clinical approach to preparing assessment feedback. The second case follows an infant with a rare genetic syndrome from ages 2 to 24 months, focusing on the assessor-parent relationship and the importance of a developmental perspective. Finally, assessment of a 3-year-old boy illustrates the development and use of a fable as a tool to provide feedback to a young child about assessment findings and recommendations. © 2016 Michigan Association for Infant Mental Health.

  1. Scintigraphic assessment of liver function in patients requiring liver surgery

    NARCIS (Netherlands)

    Cieślak, K.P.

    2018-01-01

    This thesis addresses various aspects of assessment of liver function using a quantitative liver function test, 99mTc-mebrofenin hepatobiliary scintigraphy (HBS). HBS enables direct measurement of at least one of the liver’s true processes with minimal external interference and offers the

  2. A new climate dataset for systematic assessments of climate change impacts as a function of global warming

    Directory of Open Access Journals (Sweden)

    J. Heinke

    2013-10-01

    Full Text Available In the ongoing political debate on climate change, global mean temperature change (ΔTglob) has become the yardstick by which mitigation costs, impacts from unavoided climate change, and adaptation requirements are discussed. For a scientifically informed discourse along these lines, systematic assessments of climate change impacts as a function of ΔTglob are required. The current availability of climate change scenarios constrains this type of assessment to a narrow range of temperature change and/or a reduced ensemble of climate models. Here, a newly composed dataset of climate change scenarios is presented that addresses the specific requirements for global assessments of climate change impacts as a function of ΔTglob. A pattern-scaling approach is applied to extract generalised patterns of spatially explicit change in temperature, precipitation and cloudiness from 19 Atmosphere–Ocean General Circulation Models (AOGCMs). The patterns are combined with scenarios of global mean temperature increase obtained from the reduced-complexity climate model MAGICC6 to create climate scenarios covering warming levels from 1.5 to 5 degrees above pre-industrial levels around the year 2100. The patterns are shown to sufficiently maintain the original AOGCMs' climate change properties, even though they, necessarily, utilise simplified relationships between ΔTglob and changes in local climate properties. The dataset (made available online upon final publication of this paper) facilitates systematic analyses of climate change impacts as it covers a wider and finer-spaced range of climate change scenarios than the original AOGCM simulations.
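
    A minimal sketch of the pattern-scaling idea used to build such a dataset is given below: local change is approximated as a time-invariant spatial pattern of change per degree of global warming multiplied by a global mean temperature trajectory. The small grid and the ΔTglob values are invented placeholders, not values from the dataset.

```python
# Minimal pattern-scaling sketch: local change = spatial pattern (change per
# degree of global warming) x global mean temperature trajectory. The 4 x 5
# "pattern" and the Delta-T trajectory are invented placeholders.
import numpy as np

pattern_deg_per_degglob = np.array([        # local warming per 1 K global warming
    [1.8, 1.7, 1.6, 1.5, 1.4],
    [1.3, 1.2, 1.2, 1.1, 1.1],
    [1.0, 1.0, 0.9, 0.9, 0.8],
    [0.9, 0.8, 0.8, 0.7, 0.7],
])

years = np.arange(2000, 2101, 25)
delta_t_glob = np.array([0.6, 1.1, 1.7, 2.4, 3.1])   # K above pre-industrial (assumed)

# Broadcast: one local-change field per time slice.
local_change = delta_t_glob[:, None, None] * pattern_deg_per_degglob[None, :, :]

for yr, dt_g, field in zip(years, delta_t_glob, local_change):
    print(f"{yr}: global +{dt_g:.1f} K, local change {field.min():.1f}-{field.max():.1f} K")
```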

  3. Characterizing Cognitive Aging of Working Memory and Executive Function in Animal Models

    Directory of Open Access Journals (Sweden)

    Jennifer Lynn Bizon

    2012-09-01

    Full Text Available Executive functions supported by prefrontal cortical systems provide essential control and planning mechanisms to guide goal-directed behavior. As such, age-related alterations in executive functions can mediate profound and widespread deficits in a diverse array of neurocognitive processes. Many of the critical neuroanatomical and functional characteristics of prefrontal cortex are preserved in rodents, allowing for meaningful cross-species comparisons relevant to the study of cognitive aging. In particular, as rodents lend themselves to genetic, cellular and biochemical approaches, rodent models of executive function stand to significantly contribute to our understanding of the critical neurobiological mechanisms that mediate decline of executive processes across the lifespan. Moreover, rodent analogues of executive functions that decline in human aging represent an essential component of a targeted, rational approach for developing and testing effective treatment and prevention therapies for age-related cognitive decline. This paper reviews behavioral approaches used to study executive function in rodents, with a focus on those assays that share a foundation in the psychological and neuroanatomical constructs important for human aging. A particular emphasis is placed on behavioral approaches used to assess working memory and cognitive flexibility, which are sensitive to decline with age across species and for which strong rodent models currently exist. In addition, other approaches in rodent behavior that have potential for providing analogues to functions that reliably decline in human aging (e.g., information processing speed) are discussed.

  4. A Basis Function Approach to Simulate Storm Surge Events for Coastal Flood Risk Assessment

    Science.gov (United States)

    Wu, Wenyan; Westra, Seth; Leonard, Michael

    2017-04-01

    Storm surge is a significant contributor to flooding in coastal and estuarine regions, especially when it coincides with other flood-producing mechanisms, such as extreme rainfall. Therefore, storm surge has always been a research focus in coastal flood risk assessment. Numerical models have often been developed to understand storm surge events for risk assessment (Kumagai et al. 2016; Li et al. 2016; Zhang et al. 2016; Bastidas et al. 2016; Bilskie et al. 2016; Dalledonne and Mayerle 2016; Haigh et al. 2014; Kodaira et al. 2016; Lapetina and Sheng 2015), and to assess how these events may change or evolve in the future (Izuru et al. 2015; Oey and Chou 2016). However, numerical models often require a large amount of input information, and difficulties arise when sufficient data are not available (Madsen et al. 2015). Alternatively, statistical methods have been used to forecast storm surge based on historical data (Hashemi et al. 2016; Kim et al. 2016) or to examine the long-term trend in the change of storm surge events, especially under climate change (Balaguru et al. 2016; Oh et al. 2016; Rueda et al. 2016). In these studies, often only the peak of surge events is used, which results in the loss of dynamic information within a tidal cycle or surge event (i.e. a time series of storm surge values). In this study, we propose an alternative basis function (BF) based approach to examine the different attributes (e.g. peak and duration) of storm surge events using historical data. Two simple two-parameter BFs were used: the exponential function and the triangular function. High-quality hourly storm surge records from 15 tide gauges around Australia were examined. It was found that there is significant location and seasonal variability in the peak and duration of storm surge events, which provides additional insight into coastal flood risk. In addition, the simple form of these BFs allows fast simulation of storm surge events and minimises the complexity of joint probability
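
    The two two-parameter basis functions mentioned above can be written down directly; the sketch below does so for an exponential and a triangular event shape parameterised by peak height and a duration scale, with illustrative parameter values rather than fitted ones.

```python
# Sketch of two two-parameter basis functions for a surge event as a function
# of time around its peak: an exponential shape and a triangular shape, each
# parameterised by peak height and a duration scale. Values are illustrative.
import numpy as np

def exponential_bf(t_hours, peak_m, duration_h):
    """Surge height: peak * exp(-|t| / duration), t measured from the peak."""
    return peak_m * np.exp(-np.abs(t_hours) / duration_h)

def triangular_bf(t_hours, peak_m, duration_h):
    """Surge height: linear rise/fall reaching zero at +/- duration."""
    return peak_m * np.clip(1.0 - np.abs(t_hours) / duration_h, 0.0, None)

t = np.linspace(-24, 24, 7)                 # hours relative to the surge peak
print("t (h):      ", t)
print("exponential:", np.round(exponential_bf(t, peak_m=0.9, duration_h=8.0), 2))
print("triangular: ", np.round(triangular_bf(t, peak_m=0.9, duration_h=8.0), 2))
```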

  5. A multi-model assessment of terrestrial biosphere model data needs

    Science.gov (United States)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight on data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of the Ecosystem Demography model v2 (ED) model outputs across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure to overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's Hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial
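
    A rough sketch of how parameter uncertainty and model sensitivity combine into fractional contributions to predictive variance is given below, using a first-order approximation for independent parameters, a toy model and invented trait uncertainties in place of PEcAn's actual workflow outputs.

```python
# Rough sketch of first-order variance decomposition for independent
# parameters: share_i ~ (df/dtheta_i)^2 * Var(theta_i) / total. The toy model
# and the parameter standard deviations are stand-ins, not PEcAn outputs.
import numpy as np

def toy_model(params):
    """Hypothetical annual NPP (kg C m^-2 yr^-1) from three plant traits."""
    sla, vcmax, leaf_lifespan = params
    return 0.002 * sla * vcmax * np.log1p(leaf_lifespan)

theta_mean = np.array([15.0, 60.0, 12.0])     # SLA, Vcmax, leaf lifespan (assumed)
theta_sd = np.array([4.0, 15.0, 6.0])         # posterior SDs from meta-analysis (assumed)

# Central-difference sensitivities at the posterior mean.
grad = np.zeros_like(theta_mean)
for i in range(len(theta_mean)):
    step = 1e-3 * theta_mean[i]
    up, dn = theta_mean.copy(), theta_mean.copy()
    up[i] += step
    dn[i] -= step
    grad[i] = (toy_model(up) - toy_model(dn)) / (2.0 * step)

contrib = (grad * theta_sd) ** 2
for name, share in zip(("SLA", "Vcmax", "leaf lifespan"), contrib / contrib.sum()):
    print(f"{name:14s} fractional contribution to predictive variance: {share:.1%}")
```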

  6. Comparing the dependability and associations with functioning of the DSM-5 Section III trait model of personality pathology and the DSM-5 Section II personality disorder model.

    Science.gov (United States)

    Chmielewski, Michael; Ruggero, Camilo J; Kotov, Roman; Liu, Keke; Krueger, Robert F

    2017-07-01

    Two competing models of personality psychopathology are included in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5; American Psychiatric Association, 2013): the traditional personality disorder (PD) model included in Section II and an alternative trait-based model included in Section III. Numerous studies have examined the validity of the alternative trait model and its official assessment instrument, the Personality Inventory for DSM-5 (PID-5; Krueger, Derringer, Markon, Watson, & Skodol, 2012). However, few studies have directly compared the trait-based model to the traditional PD model empirically in the same dataset. Moreover, to our knowledge, only a single study (Suzuki, Griffin, & Samuel, 2015) has examined the dependability of the PID-5, which is an essential component of construct validity for traits (Chmielewski & Watson, 2009; McCrae, Kurtz, Yamagata, & Terracciano, 2011). The current study directly compared the dependability of the DSM-5 traits, as assessed by the PID-5, and the traditional PD model, as assessed by the Personality Diagnostic Questionnaire-4 (PDQ-4+), in a large undergraduate sample. In addition, it evaluated and compared their associations with functioning, another essential component of personality pathology. In general, our findings indicate that most DSM-5 traits demonstrate high levels of dependability that are superior to the traditional PD model; however, some of the constructs assessed by the PID-5 may be more state-like. The models were roughly equivalent in terms of their associations with functioning. The current results provide additional support for the validity of the PID-5 and the DSM-5 Section III personality pathology model. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Road Assessment Model and Pilot Application in China

    Directory of Open Access Journals (Sweden)

    Tiejun Zhang

    2014-01-01

    Full Text Available Risk assessment of roads is an effective approach for road agencies to determine safety improvement investments. It can increase the cost-effective returns in crash and injury reductions. To obtain a powerful risk assessment model for China, the Research Institute of Highway (RIOH) is developing the China Road Assessment Programme (ChinaRAP) model in partnership with the International Road Assessment Programme (iRAP) to address traffic crashes in China. The ChinaRAP model is based upon RIOH’s achievements and iRAP models. This paper documents part of ChinaRAP’s research work, mainly including the RIOH model and its pilot application in a province in China.

  8. Is functional MR imaging assessment of hemispheric language dominance as good as the Wada test?: a meta-analysis.

    Science.gov (United States)

    Dym, R Joshua; Burns, Judah; Freeman, Katherine; Lipton, Michael L

    2011-11-01

    To perform a systematic review and meta-analysis to quantitatively assess functional magnetic resonance (MR) imaging lateralization of language function in comparison with the Wada test. This study was determined to be exempt from review by the institutional review board. A systematic review and meta-analysis were performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. A structured Medline search was conducted to identify all studies that compared functional MR imaging with the Wada test for determining hemispheric language dominance prior to brain surgery. Studies meeting predetermined inclusion criteria were selected independently by two radiologists who also assessed their quality using the Quality Assessment of Diagnostic Accuracy Studies tool. Language dominance was classified as typical (left hemispheric language dominance) or atypical (right hemispheric language dominance or bilateral language representation) for each patient. A meta-analysis was then performed by using a bivariate random-effects model to derive estimates of sensitivity and specificity, with Wada as the standard of reference. Subgroup analyses were also performed to compare the different functional MR imaging techniques utilized by the studies. Twenty-three studies, comprising 442 patients, met inclusion criteria. The sensitivity and specificity of functional MR imaging for atypical language dominance (compared with the Wada test) were 83.5% (95% confidence interval: 80.2%, 86.7%) and 88.1% (95% confidence interval: 87.0%, 89.2%), respectively. Functional MR imaging provides an excellent, noninvasive alternative for language lateralization and should be considered for the initial preoperative assessment of hemispheric language dominance. Further research may help determine which functional MR methods are most accurate for specific patient populations. RSNA, 2011

  9. The universal function in color dipole model

    Science.gov (United States)

    Jalilian, Z.; Boroun, G. R.

    2017-10-01

    In this work we review the color dipole model and recall the properties of saturation and geometrical scaling in this model. Our primary aim is to determine the exact universal function in terms of the introduced scaling variable at distances different from the saturation radius. By including the quark mass in the calculation, we numerically compute the contribution of heavy-quark production at small x to the total structure function via the ratio of universal functions, and show that geometrical scaling holds with respect to the scaling variable introduced in this study.

  10. NJL-jet model for quark fragmentation functions

    International Nuclear Information System (INIS)

    Ito, T.; Bentz, W.; Cloeet, I. C.; Thomas, A. W.; Yazaki, K.

    2009-01-01

    A description of fragmentation functions which satisfy the momentum and isospin sum rules is presented in an effective quark theory. Concentrating on the pion fragmentation function, we first explain why the elementary (lowest order) fragmentation process q→qπ is completely inadequate to describe the empirical data, although the crossed process π→qq describes the quark distribution functions in the pion reasonably well. Taking into account cascadelike processes in a generalized jet-model approach, we then show that the momentum and isospin sum rules can be satisfied naturally, without the introduction of ad hoc parameters. We present results for the Nambu-Jona-Lasinio (NJL) model in the invariant mass regularization scheme and compare them with the empirical parametrizations. We argue that the NJL-jet model, developed herein, provides a useful framework with which to calculate the fragmentation functions in an effective chiral quark theory.
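
    The momentum sum rule referred to above requires that the hadron momentum fractions carried by all fragmentation channels add up to one. The sketch below checks this numerically for an invented parametrisation shared by a few pion channels; it is not the NJL-jet result.

```python
# Toy numerical check of the momentum sum rule for fragmentation functions,
# sum_h int_0^1 z D_q^h(z) dz = 1, using an invented parametrisation
# D(z) = N z^a (1-z)^b for a few pion channels. Not the NJL-jet result.
import numpy as np
from scipy.integrate import quad
from scipy.special import beta

a, b = 0.5, 1.5
channels = {"pi+": 0.5, "pi0": 0.3, "pi-": 0.2}   # momentum fractions, assumed

def D(z, weight):
    # Normalised so that the z-weighted integral equals `weight` for this channel.
    return weight * z**a * (1.0 - z)**b / beta(a + 2.0, b + 1.0)

total = sum(quad(lambda z: z * D(z, w), 0.0, 1.0)[0] for w in channels.values())
print(f"momentum sum over channels: {total:.3f} (should be 1)")
```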

  11. Safety functions and safety function indicators - key elements in SKB'S methodology for assessing long-term safety of a KBS-3 repository

    International Nuclear Information System (INIS)

    Hedin, A.

    2008-01-01

    The application of so-called safety function indicators in SKB's safety assessment of a KBS-3 repository for spent nuclear fuel is presented. Isolation and retardation are the two main safety functions of the KBS-3 concept. In order to quantitatively evaluate safety on a sub-system level, these functions need to be differentiated, associated with quantitative measures and, where possible, with quantitative criteria relating to the fulfillment of the safety functions. A safety function is defined as a role through which a repository component contributes to safety. A safety function indicator is a measurable or calculable property of a repository component that allows quantitative evaluation of a safety function. A safety function indicator criterion is a quantitative limit such that if the criterion is fulfilled, the corresponding safety function is upheld. The safety functions and their associated indicators and criteria developed for the KBS-3 repository are primarily related to the isolating potential and to physical states of the canister and the clay buffer surrounding the canister. They are thus not directly related to release rates of radionuclides. The paper also describes how the concepts introduced i) aid in focussing the assessment on critical, safety-related issues, ii) provide a framework for the accounting of safety throughout the different time frames of the assessment and iii) provide key information in the selection of scenarios for the safety assessment. (author)

  12. Factorisations for partition functions of random Hermitian matrix models

    International Nuclear Information System (INIS)

    Jackson, D.M.; Visentin, T.I.

    1996-01-01

    The partition function Z_N for Hermitian-complex matrix models can be expressed as an explicit integral over R^N, where N is a positive integer. Such an integral also occurs in connection with random surfaces and models of two-dimensional quantum gravity. We show that Z_N can be expressed as the product of two partition functions, evaluated at translated arguments, for another model, giving an explicit connection between the two models. We also give an alternative computation of the partition function for the φ⁴-model. The approach is an algebraic one and holds for the functions regarded as formal power series in the appropriate ring. (orig.)

  13. Proposal for a Method for Business Model Performance Assessment: Toward an Experimentation Tool for Business Model Innovation

    Directory of Open Access Journals (Sweden)

    Antonio Batocchio

    2017-04-01

    Full Text Available The representation of business models has recently become widespread, especially in the pursuit of innovation. However, defining a company's business model is sometimes limited to discussion and debate. This study observes the need for performance measurement so that business models can be data-driven. To meet this goal, the work proposes, as its hypothesis, a method that combines the practices of the Balanced Scorecard with a method of business model representation, the Business Model Canvas. The combination is based on a study of conceptual adaptation, resulting in an application roadmap. A case study focusing on startup organizations was carried out to check the functionality of the proposition. It was concluded that, based on the performance assessment of the business model, it is possible to pursue change through experimentation, a path that can lead to business model innovation.

  14. Function Modelling Of The Market And Assessing The Degree Of Similarity Between Real Properties - Dependent Or Independent Procedures In The Process Of Office Property Valuation

    Directory of Open Access Journals (Sweden)

    Barańska Anna

    2015-09-01

    Full Text Available Referring to the two-stage algorithm for real estate valuation developed and presented in previous publications (e.g. Barańska 2011), this article addresses the problem of the relationship between the two stages of the algorithm. An essential part of the first stage is the multi-dimensional function modelling of the real estate market. As a result of selecting the model best fitted to the market data, in which the dependent variable is always the price of a real property, a set of market attributes is obtained which in this model are considered to be price-determining. In the second stage, from the collection of real estate which served as a database in the process of estimating model parameters, the objects most similar to the one subject to valuation are selected and form the basis for predicting the final value of the property being valued. Assessing the degree of similarity between real properties can be carried out based on the full spectrum of real estate attributes that potentially affect their value and about which it is possible to gather information, or only on the basis of those attributes which were considered to be price-determining in function modelling. It can also be performed by various methods. This article examines the effect of these various approaches on the final value of the property obtained using the two-stage prediction. In order to fulfill the study aim as precisely as possible, the results of each calculation step of the algorithm have been investigated in detail. Each of them points to the independence of the two procedures.

  15. Correlation functions of two-matrix models

    International Nuclear Information System (INIS)

    Bonora, L.; Xiong, C.S.

    1993-11-01

    We show how to calculate correlation functions of two matrix models without any approximation technique (except for genus expansion). In particular we do not use any continuum limit technique. This allows us to find many solutions which are invisible to the latter technique. To reach our goal we make full use of the integrable hierarchies and their reductions which were shown in previous papers to naturally appear in multi-matrix models. The second ingredient we use, even though to a lesser extent, are the W-constraints. In fact an explicit solution of the relevant hierarchy, satisfying the W-constraints (string equation), underlies the explicit calculation of the correlation functions. The correlation functions we compute lend themselves to a possible interpretation in terms of topological field theories. (orig.)

  16. The use of generalised audit software by internal audit functions in a developing country: A maturity level assessment

    OpenAIRE

    D.P. van der Nest; Louis Smidt; Dave Lubbe

    2017-01-01

    This article explores the existing practices of internal audit functions in the locally controlled South African banking industry regarding the use of Generalised Audit Software (GAS), against a benchmark developed from recognised data analytic maturity models, in order to assess the current maturity levels of the locally controlled South African banks in the use of this software for tests of controls. The literature review indicates that the use of GAS by internal audit functions is still at...

  17. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP) and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model application to a software reliability analysis.
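
    The MERF and EARF formulations themselves are not spelled out in this record; as a hedged point of reference, the sketch below implements the classic Goel-Okumoto model, another NHPP-based software reliability function of the kind mentioned above. The parameter values, time units and variable names are illustrative assumptions, not quantities from the paper.

      import math

      # Goel-Okumoto NHPP (illustrative stand-in, not the MERF/EARF model):
      # expected cumulative failures m(t) = a * (1 - exp(-b * t)).
      a, b = 120.0, 0.05           # assumed total fault content and detection rate

      def m(t):
          """Mean value function: expected number of failures observed by time t."""
          return a * (1.0 - math.exp(-b * t))

      def reliability(x, t):
          """P(no failure in (t, t + x]) given testing up to time t."""
          return math.exp(-(m(t + x) - m(t)))

      t_now = 30.0                 # assumed weeks of testing completed so far
      print(f"expected faults found by week {t_now:.0f}: {m(t_now):.1f}")
      print(f"P(no failure during the next week): {reliability(1.0, t_now):.3f}")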

  18. The Use of Logistic Model in RUL Assessment

    Science.gov (United States)

    Gumiński, R.; Radkowski, S.

    2017-12-01

    The paper takes on the issue of assessment of remaining useful life (RUL). The goal of the paper was to develop a method that would enable the use of diagnostic information in the task of reducing the uncertainty related to technical risk. Prediction of the remaining useful life of a system is a very important task for maintenance strategy. In the literature, the RUL of an engineering system is defined as the first future time instant at which thresholds on conditions (safety, operational quality, maintenance cost, etc.) are violated. Knowledge of RUL offers the possibility of planning testing and repair activities. Building models of damage development is important in this task. In the presented work, a logistic function is used to model fatigue crack development. It should be remembered that modeling every phase of damage development is very difficult, yet modeling each phase of damage separately, especially when on-line diagnostic information is included, is more effective. Particular attention was paid to the possibility of forecasting the occurrence of damage due to fatigue while relying on the analysis of the structure of a vibroacoustic signal.
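
    As an illustration of the idea described above, the following sketch fits a logistic growth curve to hypothetical crack-length inspections and reads off the remaining useful life as the time left until an assumed critical crack length is reached. The data, threshold and parameter names are invented for illustration and are not taken from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic(t, L, k, t0):
          """Logistic damage model: asymptote L, growth rate k, midpoint t0."""
          return L / (1.0 + np.exp(-k * (t - t0)))

      # Hypothetical inspection data: time in kilocycles, crack length in mm.
      rng = np.random.default_rng(0)
      t_obs = np.linspace(0.0, 60.0, 13)
      a_obs = logistic(t_obs, 8.0, 0.12, 45.0) + rng.normal(0.0, 0.05, t_obs.size)

      (L, k, t0), _ = curve_fit(logistic, t_obs, a_obs, p0=[8.0, 0.1, 40.0])

      a_fail = 0.9 * L                             # assumed critical crack length
      t_fail = t0 - np.log(L / a_fail - 1.0) / k   # invert the fitted logistic
      print(f"estimated RUL: {t_fail - t_obs[-1]:.1f} kilocycles")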

  19. Fetal functional brain age assessed from universal developmental indices obtained from neuro-vegetative activity patterns.

    Directory of Open Access Journals (Sweden)

    Dirk Hoyer

    Full Text Available Fetal brain development involves the development of the neuro-vegetative (autonomic) control that is mediated by the autonomic nervous system (ANS). Disturbances of fetal brain development have implications for diseases in later postnatal life. In that context, the fetal functional brain age can be altered. Universal principles of developmental biology applied to patterns of autonomic control may allow a functional age assessment. The work aims at the development of a fetal autonomic brain age score (fABAS) based on heart rate patterns. We analysed n = 113 recordings in quiet sleep, n = 286 in active sleep, and n = 29 in active awakeness from normals. We estimated fABAS from magnetocardiographic recordings (21.4-40.3 weeks of gestation) preclassified into quiet sleep (n = 113, 63 females) and active sleep (n = 286, 145 females) states by cross-validated multivariate linear regression models in a cross-sectional study. According to universal system developmental principles, we included indices that address increasing fluctuation range, increasing complexity, and pattern formation (skewness, power spectral ratio VLF/LF, pNN5). The resulting models constituted fABAS. fABAS explained 66/63% of the variance by age (coefficient of determination R², training/validation set) in quiet sleep, and 51/50% in active sleep. By means of a logistic regression model using fluctuation range and fetal age, quiet and active sleep were automatically reclassified (94.3/93.1% correct classifications). We did not find relevant gender differences. We conclude that functional brain age can be assessed based on universal developmental indices obtained from autonomic control patterns. fABAS reflects normal complex functional brain maturation. The presented normative data are supplemented by an explorative study of 19 fetuses compromised by intrauterine growth restriction. We observed a shift in the state distribution towards active awakeness. The lower WGA

  20. Assessment of Executive Function in Patients With Substance Use Disorder: A Comparison of Inventory- and Performance-Based Assessment.

    Science.gov (United States)

    Hagen, Egon; Erga, Aleksander H; Hagen, Katrin P; Nesvåg, Sverre M; McKay, James R; Lundervold, Astri J; Walderhaug, Espen

    2016-07-01

    Chronic polysubstance abuse (SUD) is associated with neurophysiological and neuroanatomical changes. Neurocognitive impairment tends to affect quality of life, occupational functioning, and the ability to benefit from therapy. Neurocognitive assessment is thus of importance, but costly and not widely available. Therefore, in a busy clinical setting, procedures that include readily available measures targeting core cognitive deficits would be beneficial. This paper investigates the utility of psychometric tests and a questionnaire-based inventory to assess "hot" and "cold" neurocognitive measures of executive functions (EF) in adults with a substance use disorder. Hot decision-making processes are associated with emotional, affective, and visceral responses, while cold executive functions are associated with rational decision-making. Subjects with polysubstance abuse (n=126) and healthy controls (n=32) were compared on hot (Iowa Gambling Task) and cold (Stroop and the Trail Making Test) measures of EF, in addition to a questionnaire assessing everyday EF related problems (BRIEF-A; Behavior Rating Inventory of Executive Function - Adult, self-report version). Information about the substance abuse and social adjustment were assessed by self-report. Logistic regression analyses were applied to assess independent correlates of SUD status and social adjustment. A multiple linear regression was performed to predict the number of previous treatment attempts. The psychometric test of hot EF (the Iowa Gambling Task) did not differentiate the patients with polysubstance abuse from controls, and was not associated with social adjustment. The psychometric tests of cold EF distinguished somewhat between the groups and were associated with one indicator of social adjustment. The BRIEF-A differentiated between groups on all the clinical scales and was associated with three out of five social adjustment indicators ("criminal lifestyle," "conflict with caregiver," and "stable

  1. Modeling Functional Neuroanatomy for an Anatomy Information System

    Science.gov (United States)

    Niggemann, Jörg M.; Gebert, Andreas; Schulz, Stefan

    2008-01-01

    Objective Existing neuroanatomical ontologies, databases and information systems, such as the Foundational Model of Anatomy (FMA), represent outgoing connections from brain structures, but cannot represent the “internal wiring” of structures and as such, cannot distinguish between different independent connections from the same structure. Thus, a fundamental aspect of Neuroanatomy, the functional pathways and functional systems of the brain such as the pupillary light reflex system, is not adequately represented. This article identifies underlying anatomical objects which are the source of independent connections (collections of neurons) and uses these as basic building blocks to construct a model of functional neuroanatomy and its functional pathways. Design The basic representational elements of the model are unnamed groups of neurons or groups of neuron segments. These groups, their relations to each other, and the relations to the objects of macroscopic anatomy are defined. The resulting model can be incorporated into the FMA. Measurements The capabilities of the presented model are compared to the FMA and the Brain Architecture Management System (BAMS). Results Internal wiring as well as functional pathways can correctly be represented and tracked. Conclusion This model bridges the gap between representations of single neurons and their parts on the one hand and representations of spatial brain structures and areas on the other hand. It is capable of drawing correct inferences on pathways in a nervous system. The object and relation definitions are related to the Open Biomedical Ontology effort and its relation ontology, so that this model can be further developed into an ontology of neuronal functional systems. PMID:18579841

  2. Assessing the Social Acceptability of the Functional Analysis of Problem Behavior

    Science.gov (United States)

    Langthorne, Paul; McGill, Peter

    2011-01-01

    Although the clinical utility of the functional analysis is well established, its social acceptability has received minimal attention. The current study assessed the social acceptability of functional analysis procedures among 10 parents and 3 teachers of children who had recently received functional analyses. Participants completed a 9-item…

  3. A Comparison of Experimental Functional Analysis and the Questions about Behavioral Function (QABF) in the Assessment of Challenging Behavior of Individuals with Autism

    Science.gov (United States)

    Healy, Olive; Brett, Denise; Leader, Geraldine

    2013-01-01

    We compared two functional behavioral assessment methods: the Questions About Behavioral Function (QABF; a standardized test) and experimental functional analysis (EFA) to identify behavioral functions of aggressive/destructive behavior, self-injurious behavior and stereotypy in 32 people diagnosed with autism. Both assessments found that self…

  4. LED Lighting System Reliability Modeling and Inference via Random Effects Gamma Process and Copula Function

    Directory of Open Access Journals (Sweden)

    Huibing Hao

    2015-01-01

    Full Text Available Light-emitting diode (LED) lamps have attracted increasing interest in the field of lighting systems due to their low energy consumption and long lifetime. For its different functions (i.e., illumination and color), a lamp may have two or more performance characteristics. When the multiple performance characteristics are dependent, accurately analyzing the system reliability becomes a challenging problem. In this paper, we assume that the system has two performance characteristics, and each performance characteristic is governed by a random-effects Gamma process, where the random effects capture unit-to-unit differences. The dependency of the performance characteristics is described by a Frank copula function. Via the copula function, the reliability assessment model is proposed. Considering that the model is complicated and analytically intractable, the Markov chain Monte Carlo (MCMC) method is used to estimate the unknown parameters. A numerical example based on actual LED lamp data is given to demonstrate the usefulness and validity of the proposed model and method.
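
    A much-simplified sketch of the construction described above is given below: each performance characteristic degrades as a Gamma process, and the two marginal reliabilities are coupled through a Frank copula. The random effects and the MCMC estimation of the paper are omitted, and all parameter values (degradation rates, failure thresholds, copula parameter) are invented for illustration.

      import numpy as np
      from scipy.stats import gamma

      def marginal_reliability(t, shape_rate, scale, threshold):
          """P(Gamma-process degradation at time t is still below the threshold)."""
          return gamma.cdf(threshold, a=shape_rate * t, scale=scale)

      def frank_copula(u, v, theta):
          """Frank copula C(u, v); theta > 0 gives positive dependence."""
          num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
          return -np.log(1.0 + num / (np.exp(-theta) - 1.0)) / theta

      t = np.linspace(1.0, 5000.0, 6)                         # hours
      r_lumen = marginal_reliability(t, 2e-3, 1.0, 12.0)      # lumen-depreciation margin
      r_color = marginal_reliability(t, 1e-3, 1.0, 6.0)       # colour-shift margin
      r_joint = frank_copula(r_lumen, r_color, theta=3.0)     # dependent system reliability
      for ti, ri in zip(t, r_joint):
          print(f"t = {ti:6.0f} h   R_system ~ {ri:.3f}")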

  5. Discrete two-sex models of population dynamics: On modelling the mating function

    Science.gov (United States)

    Bessa-Gomes, Carmen; Legendre, Stéphane; Clobert, Jean

    2010-09-01

    Although sexual reproduction has long been a central subject of theoretical ecology, until recently its consequences for population dynamics were largely overlooked. This is now changing, and many studies have addressed this issue, showing that when the mating system is taken into account, the population dynamics depends on the relative abundance of males and females, and is non-linear. Moreover, sexual reproduction increases the extinction risk, namely due to the Allee effect. Nevertheless, different studies have identified diverse potential consequences, depending on the choice of mating function. In this study, we investigate the consequences of three alternative mating functions that are frequently used in discrete population models: the minimum; the harmonic mean; and the modified harmonic mean. We consider their consequences at three levels: on the probability that females will breed; on the presence and intensity of the Allee effect; and on the extinction risk. When we consider the harmonic mean, the number of times the individuals of the least abundant sex mate exceeds their mating potential, which implies that with variable sex-ratios the potential reproductive rate is no longer under the modeller's control. Consequently, the female breeding probability exceeds 1 whenever the sex-ratio is male-biased, which constitutes an obvious problem. The use of the harmonic mean is thus only justified if we think that this parameter should be re-defined in order to represent the females' breeding rate and the fact that females may reproduce more than once per breeding season. This phenomenon buffers the Allee effect, and reduces the extinction risk. However, when we consider birth-pulse populations, such a phenomenon is implausible because the number of times females can reproduce per birth season is limited. In general, the minimum or modified harmonic mean mating functions seem to be more suitable for assessing the impact of mating systems on population dynamics.
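
    The sketch below illustrates the point made above about the harmonic mean: under monogamy, the implied probability that a female breeds exceeds 1 as soon as the sex ratio is male-biased, whereas the minimum mating function caps it at 1. The modified harmonic mean of the study is not reproduced here, and the population sizes are arbitrary.

      def matings_minimum(f, m):
          """Minimum mating function: pairs limited by the scarcer sex (monogamy)."""
          return min(f, m)

      def matings_harmonic(f, m):
          """Harmonic-mean mating function: 2*f*m / (f + m)."""
          return 2.0 * f * m / (f + m) if f + m > 0 else 0.0

      females = 100.0
      for males in (25.0, 100.0, 400.0):           # female-biased, even, male-biased
          for name, fn in (("minimum", matings_minimum),
                           ("harmonic", matings_harmonic)):
              p_breed = fn(females, males) / females   # implied female breeding probability
              print(f"males={males:5.0f}  {name:8s}  P(female breeds)={p_breed:.2f}")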

  6. On Support Functions for the Development of MFM Models

    DEFF Research Database (Denmark)

    Heussen, Kai; Lind, Morten

    2012-01-01

    A modeling environment and methodology are necessary to ensure quality and reusability of models in any domain. For MFM in particular, as a tool for modeling complex systems, awareness of this need has been increasing. Introducing the context of modeling support functions, this paper provides a review of MFM applications and contextualizes the model development with respect to process design and operation knowledge. Developing a perspective for an environment for MFM-oriented model- and application-development, a tool-chain is outlined and relevant software functions are discussed. With a perspective on MFM-modeling for existing processes and automation design, modeling stages and corresponding formal model properties are identified. Finally, practically feasible support functions and model-checks to support the model-development are suggested.

  7. Function of dynamic models in systems biology: linking structure to behaviour.

    Science.gov (United States)

    Knüpfer, Christian; Beckstein, Clemens

    2013-10-08

    Dynamic models in Systems Biology are used in computational simulation experiments for addressing biological questions. The complexity of the modelled biological systems and the growing number and size of the models calls for computer support for modelling and simulation in Systems Biology. This computer support has to be based on formal representations of relevant knowledge fragments. In this paper we describe different functional aspects of dynamic models. This description is conceptually embedded in our "meaning facets" framework which systematises the interpretation of dynamic models in structural, functional and behavioural facets. Here we focus on how function links the structure and the behaviour of a model. Models play a specific role (teleological function) in the scientific process of finding explanations for dynamic phenomena. In order to fulfil this role a model has to be used in simulation experiments (pragmatical function). A simulation experiment always refers to a specific situation and a state of the model and the modelled system (conditional function). We claim that the function of dynamic models refers to both the simulation experiment executed by software (intrinsic function) and the biological experiment which produces the phenomena under investigation (extrinsic function). We use the presented conceptual framework for the function of dynamic models to review formal accounts for functional aspects of models in Systems Biology, such as checklists, ontologies, and formal languages. Furthermore, we identify missing formal accounts for some of the functional aspects. In order to fill one of these gaps we propose an ontology for the teleological function of models. We have thoroughly analysed the role and use of models in Systems Biology. The resulting conceptual framework for the function of models is an important first step towards a comprehensive formal representation of the functional knowledge involved in the modelling and simulation process

  8. FLORA™: Phase I development of a functional vision assessment for prosthetic vision users.

    Science.gov (United States)

    Geruschat, Duane R; Flax, Marshall; Tanna, Nilima; Bianchi, Michelle; Fisher, Andy; Goldschmidt, Mira; Fisher, Lynne; Dagnelie, Gislin; Deremeik, Jim; Smith, Audrey; Anaflous, Fatima; Dorn, Jessy

    2015-07-01

    Research groups and funding agencies need a functional assessment suitable for an ultra-low vision population to evaluate the impact of new vision-restoration treatments. The purpose of this study was to develop a pilot assessment to capture the functional visual ability and well-being of subjects whose vision has been partially restored with the Argus II Retinal Prosthesis System. The Functional Low-Vision Observer Rated Assessment (FLORA) pilot assessment involved a self-report section, a list of functional visual tasks for observation of performance and a case narrative summary. Results were analysed to determine whether the interview questions and functional visual tasks were appropriate for this ultra-low vision population and whether the ratings suffered from floor or ceiling effects. Thirty subjects with severe to profound retinitis pigmentosa (bare light perception or worse in both eyes) were enrolled in a clinical trial and implanted with the Argus II System. From this population, 26 subjects were assessed with the FLORA. Seven different evaluators administered the assessment. All 14 interview questions were asked. All 35 tasks for functional vision were selected for evaluation at least once, with an average of 20 subjects being evaluated for each test item. All four rating options—impossible (33 per cent), difficult (23 per cent), moderate (24 per cent) and easy (19 per cent)—were used by the evaluators. Evaluators also judged the amount of vision they observed the subjects using to complete the various tasks, with 'vision only' occurring 75 per cent on average with the System ON, and 29 per cent with the System OFF. The first version of the FLORA was found to contain useful elements for evaluation and to avoid floor and ceiling effects. The next phase of development will be to refine the assessment and to establish reliability and validity to increase its value as an assessment tool for functional vision and well-being. © 2015 The Authors. Clinical

  9. Assessment of the function of the nervus intermedius by means of functional salivary gland-scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, J.P.; Bertram, G.; Moedder, G.

    1982-01-01

    Using functional scintigraphy of the salivary glands, the function of the nervus intermedius can be assessed by estimating excretory quotients for both submandibular glands. This method is preferred to the standard salivation test of Magielski and Blatt, despite the minimal radiation exposure of the patient from the injected sodium pertechnetate. The technical course of this investigation, along with the indications for its use, is presented.

  10. Reliability assessment of embedded digital system using multi-state function

    International Nuclear Information System (INIS)

    Choi, Jong Gyun; Seong, Poong Hyun

    2006-01-01

    This work describes a combinatorial model for estimating the reliability of embedded digital systems by means of a multi-state function. The model includes a coverage model for fault-handling techniques implemented in digital systems. These fault-handling techniques make it difficult for many types of components in a digital system to be treated as binary-state, good or bad. The multi-state function provides a complete analysis of multi-state systems, as which digital systems can be regarded. Through adaptation of the software operational profile flow to the multi-state function, the HW/SW interaction is also considered in estimating the reliability of the digital system. Using this model, we evaluate the reliability of one board controller in a digital system, the Interposing Logic System (ILS), which is installed in YGN nuclear power units 3 and 4. Since the proposed model is a generalized combinatorial model, its simplification reduces to the conventional model that treats the system as binary-state. This modeling method is particularly attractive for embedded systems in which small-sized application software is implemented, since applying the method to systems with large software would require very laborious work.

  11. Multi-model approach to assess the impact of climate change on runoff

    Science.gov (United States)

    Dams, J.; Nossent, J.; Senbeta, T. B.; Willems, P.; Batelaan, O.

    2015-10-01

    The assessment of climate change impacts on hydrology is subject to uncertainties related to the climate change scenarios, stochastic uncertainties of the hydrological model and structural uncertainties of the hydrological model. This paper focuses on the contribution of structural uncertainty of hydrological models to the overall uncertainty of the climate change impact assessment. To quantify the structural uncertainty of hydrological models, four physically based hydrological models (SWAT, PRMS and a semi- and fully distributed version of the WetSpa model) are set up for a catchment in Belgium. Each model is calibrated using four different objective functions. Three climate change scenarios with a high, mean and low hydrological impact are statistically perturbed from a large ensemble of climate change scenarios and are used to force the hydrological models. This methodology allows assessing and comparing the uncertainty introduced by the climate change scenarios with the uncertainty introduced by the hydrological model structure. Results show that the hydrological model structure introduces a large uncertainty on both the average monthly discharge and the extreme peak and low flow predictions under the climate change scenarios. For the low impact climate change scenario, the uncertainty range of the mean monthly runoff is comparable to the range of these runoff values in the reference period. However, for the mean and high impact scenarios, this range is significantly larger. The uncertainty introduced by the climate change scenarios is larger than the uncertainty due to the hydrological model structure for the low and mean hydrological impact scenarios, but the reverse is true for the high impact climate change scenario. The mean and high impact scenarios project increasing peak discharges, while the low impact scenario projects increasing peak discharges only for peak events with return periods larger than 1.6 years. All models suggest for all scenarios a

  12. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    This article proposes an integrated function modelling framework, which specifically aims at relating the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis of the functionality of a system. The article further presents the application of the framework based on a product example. Finally, an empirical study in industry is presented. Therein, feedback on the potential of the proposed framework to support interdisciplinary design practice as well as on areas of further...

  13. Composite spectral functions for solving Volterra's population model

    International Nuclear Information System (INIS)

    Ramezani, M.; Razzaghi, M.; Dehghan, M.

    2007-01-01

    An approximate method for solving Volterra's population model for the population growth of a species in a closed system is proposed. Volterra's model is a nonlinear integro-differential equation, where the integral term represents the effect of toxin accumulation. The approach is based upon composite spectral function approximations. The properties of composite spectral functions consisting of a few terms of orthogonal functions are presented and are utilized to reduce the solution of Volterra's model to the solution of a system of algebraic equations. The method is easy to implement and yields very accurate results.
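
    For reference, the commonly used nondimensional form of Volterra's population model is κ u'(t) = u − u² − u ∫₀ᵗ u(s) ds, with the integral term representing toxin accumulation. The sketch below integrates it with a plain explicit Euler scheme rather than the composite spectral functions of the paper; the parameter values are illustrative.

      # Explicit-Euler integration of kappa * u'(t) = u - u**2 - u * I(t),
      # where I(t) is the running integral of u (toxin accumulation).
      kappa, u0 = 0.1, 0.1
      dt, t_end = 1e-4, 5.0

      u, integral, t, u_max = u0, 0.0, 0.0, u0
      while t < t_end:
          du = (u - u * u - u * integral) / kappa
          integral += u * dt        # update of int_0^t u(s) ds
          u += du * dt
          t += dt
          u_max = max(u_max, u)

      # The commonly quoted closed form u_max = 1 + kappa*ln(kappa/(1 + kappa - u0))
      # gives about 0.7697 for these parameter values.
      print(f"numerical peak population: {u_max:.4f}")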

  14. Exact 2-point function in Hermitian matrix model

    International Nuclear Information System (INIS)

    Morozov, A.; Shakirov, Sh.

    2009-01-01

    J. Harer and D. Zagier have found a strikingly simple generating function [1,2] for exact (all-genera) 1-point correlators in the Gaussian Hermitian matrix model. In this paper we generalize their result to 2-point correlators, using Toda integrability of the model. Remarkably, this exact 2-point correlation function turns out to be an elementary function - arctangent. Relation to the standard 2-point resolvents is pointed out. Some attempts of generalization to 3-point and higher functions are described.

  15. Information sensitivity functions to assess parameter information gain and identifiability of dynamical systems.

    Science.gov (United States)

    Pant, Sanjay

    2018-05-01

    A new class of functions, called the 'information sensitivity functions' (ISFs), which quantify the information gain about the parameters through the measurements/observables of a dynamical system are presented. These functions can be easily computed through classical sensitivity functions alone and are based on Bayesian and information-theoretic approaches. While marginal information gain is quantified by decrease in differential entropy, correlations between arbitrary sets of parameters are assessed through mutual information. For individual parameters, these information gains are also presented as marginal posterior variances, and, to assess the effect of correlations, as conditional variances when other parameters are given. The easy to interpret ISFs can be used to (a) identify time intervals or regions in dynamical system behaviour where information about the parameters is concentrated; (b) assess the effect of measurement noise on the information gain for the parameters; (c) assess whether sufficient information in an experimental protocol (input, measurements and their frequency) is available to identify the parameters; (d) assess correlation in the posterior distribution of the parameters to identify the sets of parameters that are likely to be indistinguishable; and (e) assess identifiability problems for particular sets of parameters. © 2018 The Authors.
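
    The sketch below is a hedged, linear-Gaussian illustration of the underlying idea: classical sensitivities and the measurement noise give a Fisher information matrix, and the information gain for each parameter is reported as the reduction of its marginal posterior entropy. It uses generic Bayesian theory with an invented sensitivity matrix and noise level, not the paper's exact ISF definitions.

      import numpy as np

      # Hypothetical sensitivity matrix: rows = measurement times, cols = parameters.
      S = np.array([[0.5, 0.1],
                    [1.2, 0.3],
                    [2.0, 0.9],
                    [2.5, 1.8]])
      sigma2 = 0.05 ** 2                   # assumed measurement noise variance
      prior_cov = np.diag([1.0, 1.0])      # assumed Gaussian prior on the parameters

      fisher = S.T @ S / sigma2            # information carried by the measurements
      post_cov = np.linalg.inv(np.linalg.inv(prior_cov) + fisher)

      for i in range(S.shape[1]):
          gain_nats = 0.5 * np.log(prior_cov[i, i] / post_cov[i, i])
          print(f"theta_{i}: marginal information gain ~ {gain_nats:.2f} nats")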

  16. The Model for Assessment of Telemedicine (MAST)

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Clemensen, Jane; Caffery, Liam J

    2017-01-01

    The evaluation of telemedicine can be achieved using different evaluation models or theoretical frameworks. This paper presents a scoping review of published studies which have applied the Model for Assessment of Telemedicine (MAST). MAST includes pre-implementation assessment (e.g. by use...

  17. Modelling the pre-assessment learning effects of assessment: evidence in the validity chain.

    Science.gov (United States)

    Cilliers, Francois J; Schuwirth, Lambert W T; van der Vleuten, Cees P M

    2012-11-01

    We previously developed a model of the pre-assessment learning effects of consequential assessment and started to validate it. The model comprises assessment factors, mechanism factors and learning effects. The purpose of this study was to continue the validation process. For stringency, we focused on a subset of assessment factor-learning effect associations that featured least commonly in a baseline qualitative study. Our aims were to determine whether these uncommon associations were operational in a broader but similar population to that in which the model was initially derived. A cross-sectional survey of 361 senior medical students at one medical school was undertaken using a purpose-made questionnaire based on a grounded theory and comprising pairs of written situational tests. In each pair, the manifestation of an assessment factor was varied. The frequencies at which learning effects were selected were compared for each item pair, using an adjusted alpha to assign significance. The frequencies at which mechanism factors were selected were calculated. There were significant differences in the learning effect selected between the two scenarios of an item pair for 13 of this subset of 21 uncommon associations, even when the adjusted alpha was applied. For a subset of uncommon associations in the model, the role of most assessment factor-learning effect associations and the mechanism factors involved were supported in a broader but similar population to that in which the model was derived. Although model validation is an ongoing process, these results move the model one step closer to the stage of usefully informing interventions. Results illustrate how factors not typically included in studies of the learning effects of assessment could confound the results of interventions aimed at using assessment to influence learning. © Blackwell Publishing Ltd 2012.

  18. Integrated Assessment Model Evaluation

    Science.gov (United States)

    Smith, S. J.; Clarke, L.; Edmonds, J. A.; Weyant, J. P.

    2012-12-01

    Integrated assessment models of climate change (IAMs) are widely used to provide insights into the dynamics of the coupled human and socio-economic system, including emission mitigation analysis and the generation of future emission scenarios. Similar to the climate modeling community, the integrated assessment community has a two-decade history of model inter-comparison, which has served as one of the primary venues for model evaluation and confirmation. While analysis of historical trends in the socio-economic system has long played a key role in diagnostics of future scenarios from IAMs, formal hindcast experiments are just now being contemplated as evaluation exercises. Some initial thoughts on setting up such IAM evaluation experiments are discussed. Socio-economic systems do not follow strict physical laws, which means that evaluation needs to take place in a context, unlike that of physical system models, in which there are few fixed, unchanging relationships. Of course strict validation of even earth system models is not possible (Oreskes et al. 2004), a fact borne out by the inability of models to constrain the climate sensitivity. Energy-system models have also been grappling with some of the same questions over the last quarter century. For example, one of "the many questions in the energy field that are waiting for answers in the next 20 years" identified by Hans Landsberg in 1985 was "Will the price of oil resume its upward movement?" Of course we are still asking this question today. While, arguably, even fewer constraints apply to socio-economic systems, numerous historical trends and patterns have been identified, although often only in broad terms, that are used to guide the development of model components, parameter ranges, and scenario assumptions. IAM evaluation exercises are expected to provide useful information for interpreting model results and improving model behavior. A key step is the recognition of model boundaries, that is, what is inside

  19. Zhang functions and various models

    CERN Document Server

    Zhang, Yunong

    2015-01-01

    This book focuses on solving different types of time-varying problems. It presents various Zhang dynamics (ZD) models by defining various Zhang functions (ZFs) in real and complex domains. It then provides theoretical analyses of such ZD models and illustrates their results. It also uses simulations to substantiate their efficacy and show the feasibility of the presented ZD approach (i.e., different ZFs leading to different ZD models), which is further applied to the repetitive motion planning (RMP) of redundant robots, showing its application potential.

  20. An automated system for assessing cognitive function in any environment

    Science.gov (United States)

    Wesnes, Keith A.

    2005-05-01

    The Cognitive Drug Research (CDR) computerized assessment system has been in use in worldwide clinical trials for over 20 years. It is a computer-based system which assesses core aspects of human cognitive function, including attention, information processing, working memory and long-term memory. It has been extensively validated and can be performed by a wide range of clinical populations, including patients with various types of dementia. It is currently in worldwide use in clinical trials to evaluate new medicines, as well as in a variety of programs involving the effects of age, stressors, illnesses and trauma upon human cognitive function. Besides being highly sensitive to drugs which will impair or improve function, its utility has been maintained over the last two decades by constantly increasing the number of platforms upon which it can operate. Besides notebook versions, the system can be used on a wrist-worn device or PDA, via the telephone and over the internet. It is the most widely used automated cognitive function assessment system in worldwide clinical research. It has dozens of parallel forms and requires little training to use or administer. The basic development of the system will be identified, and the huge databases (normative, patient population, drug effects) which have been built up from hundreds of clinical trials will be described. The system is available for use in virtually any environment or type of trial.

  1. SCORING ASSESSMENT AND FORECASTING MODELS BANKRUPTCY RISK OF COMPANIES

    Directory of Open Access Journals (Sweden)

    SUSU Stefanita

    2014-07-01

    Full Text Available Bankruptcy risk has been the subject of many research studies that aim at identifying the time of bankruptcy, the factors that contribute to reaching this state, and the indicators that best express this orientation towards bankruptcy. The threats to enterprises require that managers continually know the economic and financial situation and the vulnerable areas with development potential. Managers need to identify and properly manage the threats that would prevent them from achieving their targets. The methods known in the literature for the assessment and evaluation of bankruptcy risk are static, functional, strategic, scoring and non-financial models. This article addresses the Altman and Conan-Holder models, known internationally, as well as a model developed at national level by two professors from prestigious universities in our country: the Robu-Mironiuc model. These models are applied to data from the profit and loss account and the balance sheet of the Turism Covasna company, on which the bankruptcy risk analysis is performed. The results of the analysis are interpreted while trying to formulate solutions for the economic and financial viability of the entity.
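
    As a concrete illustration of a scoring model, the sketch below computes the classic Altman (1968) Z-score with its commonly quoted coefficients; the Conan-Holder and Robu-Mironiuc scoring functions are not reproduced, and the input figures are hypothetical rather than Turism Covasna data.

      def altman_z(working_capital, retained_earnings, ebit,
                   market_value_equity, total_liabilities, sales, total_assets):
          """Classic Altman (1968) Z-score for public manufacturing firms."""
          x1 = working_capital / total_assets
          x2 = retained_earnings / total_assets
          x3 = ebit / total_assets
          x4 = market_value_equity / total_liabilities
          x5 = sales / total_assets
          return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

      # Hypothetical balance-sheet and profit-and-loss figures (same currency units).
      z = altman_z(working_capital=150_000, retained_earnings=220_000, ebit=90_000,
                   market_value_equity=400_000, total_liabilities=350_000,
                   sales=1_100_000, total_assets=1_000_000)
      zone = "safe" if z > 2.99 else "grey" if z > 1.81 else "distress"
      print(f"Z = {z:.2f} -> {zone} zone")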

  2. Structure functions in the chiral bag model

    International Nuclear Information System (INIS)

    Sanjose, V.; Vento, V.; Centro Mixto CSIC/Valencia Univ., Valencia

    1989-01-01

    We calculate the structure functions of an isoscalar nuclear target for deep inelastic scattering by leptons in an extended version of the chiral bag model which incorporates the q anti-q structure of the pions in the cloud. Bjorken scaling and Regge behavior are satisfied. The model calculation reproduces the low-x behavior of the data but fails to explain the medium- to large-x behavior. Evolution of the quark structure functions seems inevitable to attempt a connection between the low-energy models and the high-energy behavior of quantum chromodynamics. (orig.)

  3. Structure functions in the chiral bag model

    Energy Technology Data Exchange (ETDEWEB)

    Sanjose, V.; Vento, V.

    1989-07-13

    We calculate the structure functions of an isoscalar nuclear target for deep inelastic scattering by leptons in an extended version of the chiral bag model which incorporates the q anti-q structure of the pions in the cloud. Bjorken scaling and Regge behavior are satisfied. The model calculation reproduces the low-x behavior of the data but fails to explain the medium- to large-x behavior. Evolution of the quark structure functions seems inevitable to attempt a connection between the low-energy models and the high-energy behavior of quantum chromodynamics. (orig.).

  4. Hazard identification based on plant functional modelling

    International Nuclear Information System (INIS)

    Rasmussen, B.; Whetton, C.

    1993-10-01

    A major objective of the present work is to provide means for representing a process plant as a socio-technical system, so as to allow hazard identification at a high level. The method includes technical, human and organisational aspects and is intended to be used for plant level hazard identification so as to identify critical areas and the need for further analysis using existing methods. The first part of the method is the preparation of a plant functional model where a set of plant functions link together hardware, software, operations, work organisation and other safety related aspects of the plant. The basic principle of the functional modelling is that any aspect of the plant can be represented by an object (in the sense that this term is used in computer science) based upon an Intent (or goal); associated with each Intent are Methods, by which the Intent is realized, and Constraints, which limit the Intent. The Methods and Constraints can themselves be treated as objects and decomposed into lower-level Intents (hence the procedure is known as functional decomposition) so giving rise to a hierarchical, object-oriented structure. The plant level hazard identification is carried out on the plant functional model using the Concept Hazard Analysis method. In this, the user will be supported by checklists and keywords and the analysis is structured by pre-defined worksheets. The preparation of the plant functional model and the performance of the hazard identification can be carried out manually or with computer support. (au) (4 tabs., 10 ills., 7 refs.)

  5. Annotation and retrieval system of CAD models based on functional semantics

    Science.gov (United States)

    Wang, Zhansong; Tian, Ling; Duan, Wenrui

    2014-11-01

    CAD model retrieval based on functional semantics is more significant than content-based 3D model retrieval during the mechanical conceptual design phase. However, relevant research is still not fully discussed. Therefore, a functional semantic-based CAD model annotation and retrieval method is proposed to support mechanical conceptual design and design reuse, inspire designer creativity through existing CAD models, shorten the design cycle, and reduce costs. Firstly, the CAD model functional semantic ontology is constructed to formally represent the functional semantics of CAD models and describe the mechanical conceptual design space comprehensively and consistently. Secondly, an approach to represent CAD models as attributed adjacency graphs (AAG) is proposed. In this method, the geometry and topology data are extracted from STEP models. On the basis of the AAG, the functional semantics of CAD models are annotated semi-automatically by matching CAD models that contain the partial features of which the functional semantics have been annotated manually, thereby constructing a CAD model repository that supports model retrieval based on functional semantics. Thirdly, a CAD model retrieval algorithm that supports multi-function extended retrieval is proposed to explore more potential creative design knowledge at the semantic level. Finally, a prototype system, called the Functional Semantic-based CAD Model Annotation and Retrieval System (FSMARS), is implemented. A case demonstrates that FSMARS can successfully obtain multiple potential CAD models that conform to the desired function. The proposed research addresses actual needs and presents a new way to acquire CAD models in the mechanical conceptual design phase.

  6. Assessment of the assessment: Evaluation of the model quality estimates in CASP10

    KAUST Repository

    Kryshtafovych, Andriy

    2013-08-31

    The article presents an assessment of the ability of the thirty-seven model quality assessment (MQA) methods participating in CASP10 to provide an a priori estimation of the quality of structural models, and of the 67 tertiary structure prediction groups to provide confidence estimates for their predicted coordinates. The assessment of MQA predictors is based on the methods used in previous CASPs, such as correlation between the predicted and observed quality of the models (both at the global and local levels), accuracy of methods in distinguishing between good and bad models as well as good and bad regions within them, and ability to identify the best models in the decoy sets. Several numerical evaluations were used in our analysis for the first time, such as comparison of global and local quality predictors with reference (baseline) predictors and a ROC analysis of the predictors' ability to differentiate between the well and poorly modeled regions. For the evaluation of the reliability of self-assessment of the coordinate errors, we used the correlation between the predicted and observed deviations of the coordinates and a ROC analysis of correctly identified errors in the models. A modified two-stage procedure for testing MQA methods in CASP10 whereby a small number of models spanning the whole range of model accuracy was released first followed by the release of a larger number of models of more uniform quality, allowed a more thorough analysis of abilities and inabilities of different types of methods. Clustering methods were shown to have an advantage over the single- and quasi-single-model methods on the larger datasets. At the same time, the evaluation revealed that the size of the dataset has smaller influence on the global quality assessment scores (for both clustering and nonclustering methods), than its diversity. Narrowing the quality range of the assessed models caused significant decrease in accuracy of ranking for global quality predictors but
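
    Two of the evaluation measures described above, the correlation between predicted and observed model quality and a ROC analysis of residue-level discrimination, can be sketched as follows. The scores, the 0.5 "well modelled" threshold and the AUC computation via the Mann-Whitney statistic are illustrative choices, not the official CASP evaluation code.

      import numpy as np

      predicted = np.array([0.9, 0.7, 0.4, 0.8, 0.3, 0.6, 0.2, 0.5])    # hypothetical MQA scores
      observed  = np.array([0.85, 0.75, 0.5, 0.7, 0.2, 0.65, 0.3, 0.4]) # hypothetical observed quality
      print("Pearson r:", np.corrcoef(predicted, observed)[0, 1])

      # ROC AUC as the Mann-Whitney statistic: probability that a randomly chosen
      # well-modelled case receives a higher predicted score than a poorly modelled one.
      labels = (observed >= 0.5).astype(int)       # assumed "well modelled" threshold
      pos, neg = predicted[labels == 1], predicted[labels == 0]
      auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])
      print("ROC AUC:", auc)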

  7. Assessment of endothelial function and myocardial flow reserve using {sup 15}O-water PET without attenuation correction

    Energy Technology Data Exchange (ETDEWEB)

    Tuffier, Stephane; Joubert, Michael; Bailliez, Alban [EA 4650, Normandie Universite, Caen (France); Legallois, Damien [EA 4650, Normandie Universite, Caen (France); Caen University Hospital, Department of Cardiology, Caen (France); Belin, Annette [Caen University Hospital, Department of Cardiac Surgery, Caen (France); Redonnet, Michel [Rouen University Hospital, Department of Cardiac Surgery, Rouen (France); Agostini, Denis [EA 4650, Normandie Universite, Caen (France); Caen University Hospital, Department of Nuclear Medicine, Caen (France); Manrique, Alain [EA 4650, Normandie Universite, Caen (France); Caen University Hospital, Department of Nuclear Medicine, Caen (France); Cyceron PET Centre, Caen (France)

    2016-02-15

    Myocardial blood flow (MBF) measurement using positron emission tomography (PET) from the washout rate of {sup 15}O-water is theoretically independent of tissue attenuation. The aim of this study was to evaluate the impact of not using attenuation correction in the assessment of coronary endothelial function and myocardial flow reserve (MFR) using {sup 15}O-water PET. We retrospectively processed 70 consecutive {sup 15}O-water PET examinations obtained at rest and during cold pressor testing (CPT) in patients with dilated cardiomyopathy (n = 58), or at rest and during adenosine infusion in heart transplant recipients (n = 12). Data were reconstructed with attenuation correction (AC) and without attenuation correction (NAC) using filtered backprojection, and MBF was quantified using a single compartmental model. The agreement between AC and NAC data was assessed using Lin's concordance correlation coefficient followed by Bland-Altman plot analysis. Regarding endothelial function, NAC PET showed poor reproducibility and poor agreement with AC PET data. Conversely, NAC PET demonstrated high reproducibility and a strong agreement with AC PET for the assessment of MFR. Non-attenuation-corrected {sup 15}O-water PET provided an accurate measurement of MFR compared to attenuation-corrected PET. However, non-attenuation-corrected PET data were less effective for the assessment of endothelial function using CPT in this population. (orig.)
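
    The two agreement statistics mentioned above, Lin's concordance correlation coefficient and Bland-Altman limits of agreement, are sketched below on hypothetical AC and NAC myocardial blood flow values; the numbers do not come from the study.

      import numpy as np

      ac  = np.array([0.8, 1.1, 2.4, 3.0, 1.7, 2.2, 0.9, 2.8])   # hypothetical MBF, mL/min/g
      nac = np.array([0.9, 1.0, 2.5, 2.8, 1.8, 2.1, 1.0, 2.7])

      def lins_ccc(x, y):
          """Lin's concordance correlation coefficient (population moments)."""
          mx, my = x.mean(), y.mean()
          cov = ((x - mx) * (y - my)).mean()
          return 2.0 * cov / (x.var() + y.var() + (mx - my) ** 2)

      diff = nac - ac
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)           # Bland-Altman limits of agreement
      print(f"Lin's CCC = {lins_ccc(ac, nac):.3f}")
      print(f"Bland-Altman bias = {bias:.3f}, limits = [{bias - loa:.3f}, {bias + loa:.3f}]")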

  8. The role of dual-energy computed tomography in the assessment of pulmonary function

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Hye Jeon [Department of Radiology, Hallym University College of Medicine, Hallym University Sacred Heart Hospital, 22, Gwanpyeong-ro 170beon-gil, Dongan-gu, Anyang-si, Gyeonggi-do 431-796 (Korea, Republic of); Hoffman, Eric A. [Departments of Radiology, Medicine, and Biomedical Engineering, University of Iowa, 200 Hawkins Dr, CC 701 GH, Iowa City, IA 52241 (United States); Lee, Chang Hyun; Goo, Jin Mo [Department of Radiology, Seoul National University College of Medicine, 103 Daehak-ro, Jongno-gu, Seoul 110-799 (Korea, Republic of); Levin, David L. [Department of Radiology, Mayo Clinic College of Medicine, 200 First Street, SW, Rochester, MN 55905 (United States); Kauczor, Hans-Ulrich [Diagnostic and Interventional Radiology, University Hospital Heidelberg, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Seo, Joon Beom, E-mail: seojb@amc.seoul.kr [Department of Radiology and Research Institute of Radiology, Asan Medical Center, University of Ulsan College of Medicine, 388-1, Pungnap 2-dong, Songpa-ku, Seoul, 05505 (Korea, Republic of)

    2017-01-15

    Highlights: • The dual-energy CT technique enables the differentiation of contrast materials with material decomposition algorithm. • Pulmonary functional information can be evaluated using dual-energy CT with anatomic CT information, simultaneously. • Pulmonary functional information from dual-energy CT can improve diagnosis and severity assessment of diseases. - Abstract: The assessment of pulmonary function, including ventilation and perfusion status, is important in addition to the evaluation of structural changes of the lung parenchyma in various pulmonary diseases. The dual-energy computed tomography (DECT) technique can provide the pulmonary functional information and high resolution anatomic information simultaneously. The application of DECT for the evaluation of pulmonary function has been investigated in various pulmonary diseases, such as pulmonary embolism, asthma and chronic obstructive lung disease and so on. In this review article, we will present principles and technical aspects of DECT, along with clinical applications for the assessment pulmonary function in various lung diseases.

  9. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
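
    A generic toy illustration of the core idea, failure-effect propagation from failure modes to observation points over a directed graph, is sketched below; the component and node names are invented and this is not the AGSM IHM tooling itself.

      from collections import deque

      # Invented failure-effect propagation paths (failure modes -> effects -> observations).
      edges = {
          "valve_stuck_closed": ["no_flow"],
          "pump_degraded": ["low_pressure"],
          "no_flow": ["low_pressure", "tank_level_high"],
          "low_pressure": ["pressure_sensor_low"],      # observation point
          "tank_level_high": ["level_sensor_high"],     # observation point
      }

      def reachable(start):
          """All effects reachable from a node by following propagation paths."""
          seen, queue = set(), deque([start])
          while queue:
              for nxt in edges.get(queue.popleft(), []):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      failure_modes = ["valve_stuck_closed", "pump_degraded"]
      observation = "pressure_sensor_low"
      candidates = [fm for fm in failure_modes if observation in reachable(fm)]
      print("failure modes consistent with", observation, ":", candidates)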

  10. Attention modeling for video quality assessment

    DEFF Research Database (Denmark)

    You, Junyong; Korhonen, Jari; Perkis, Andrew

    2010-01-01

    averaged spatiotemporal pooling. The local quality is derived from visual attention modeling and quality variations over frames. Saliency, motion, and contrast information are taken into account in modeling visual attention, which is then integrated into IQMs to calculate the local quality of a video frame...... average between the global quality and the local quality. Experimental results demonstrate that the combination of the global quality and local quality outperforms both sole global quality and local quality, as well as other quality models, in video quality assessment. In addition, the proposed video...... quality modeling algorithm can improve the performance of image quality metrics on video quality assessment compared to the normal averaged spatiotemporal pooling scheme....

  11. Modelling stream-fish functional traits in reference conditions: regional and local environmental correlates.

    Directory of Open Access Journals (Sweden)

    João M Oliveira

    Identifying the environmental gradients that control the functional structure of biological assemblages in reference conditions is fundamental to help river management and predict the consequences of anthropogenic stressors. Fish metrics (density of ecological guilds, and species richness) from 117 least disturbed stream reaches in several western Iberia river basins were modelled with generalized linear models in order to investigate the importance of regional- and local-scale abiotic gradients to variation in the functional structure of fish assemblages. Functional patterns were primarily associated with regional features, such as catchment elevation and slope, rainfall, and drainage area. Spatial variations of fish guilds were thus associated with broad geographic gradients, showing (1) pronounced latitudinal patterns, affected mainly by climatic factors and topography, or (2) at the basin level, strong upstream-downstream patterns related to stream position in the longitudinal gradient. Maximum native species richness was observed in midsize streams, in accordance with the river continuum concept. The findings of our study emphasized the need to use a multi-scale approach in order to fully assess the factors that govern the functional organization of biotic assemblages in 'natural' streams, as well as to improve biomonitoring and restoration of fluvial ecosystems.
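    As a rough illustration of the kind of analysis this record describes, the sketch below fits a generalized linear model of species richness against regional-scale predictors. The data frame, predictor names and the Poisson family are illustrative assumptions, not the authors' dataset or exact model specification.

    # Minimal GLM sketch: guild density or species richness modelled against
    # regional-scale predictors. Variables and the Poisson family are assumptions.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 117  # number of least-disturbed reaches in the study
    reaches = pd.DataFrame({
        "elevation_m": rng.uniform(50, 1200, n),
        "drainage_km2": rng.lognormal(4, 1, n),
        "rainfall_mm": rng.uniform(500, 1500, n),
    })
    reaches["log_drainage"] = np.log(reaches["drainage_km2"])
    # Synthetic response: native species richness per reach (counts).
    lam = np.exp(1.5 - 0.001 * reaches["elevation_m"] + 0.1 * reaches["log_drainage"])
    reaches["richness"] = rng.poisson(lam)

    model = smf.glm("richness ~ elevation_m + log_drainage + rainfall_mm",
                    data=reaches, family=sm.families.Poisson()).fit()
    print(model.summary())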

  12. Technology-based functional assessment in early childhood intervention: a pilot study.

    Science.gov (United States)

    Khetani, Mary A; McManus, Beth M; Arestad, Kristen; Richardson, Zachary; Charlifue-Smith, Renee; Rosenberg, Cordelia; Rigau, Briana

    2018-01-01

    Electronic patient-reported outcomes (e-PROs) may provide valid and feasible options for obtaining family input on their child's functioning for care planning and outcome monitoring, but they have not been adopted into early intervention (EI). The purpose of this pilot study was to evaluate the feasibility of implementing technology-based functional assessment into EI practice and to examine child, family, service, and environmental correlates of caregiver-reported child functioning in the home. In a cross-sectional design, eight individual EI providers participated in a 90-min technology-based functional assessment training to recruit participants and a 60-min semi-structured focus group post data collection. Participants completed the Young Children's Participation and Environment Measure (YC-PEM) home section online and Pediatric Evaluation of Disability Inventory Computer Adaptive Test (PEDI-CAT) via iPad. Participants' EI service use data were obtained from administrative records. A total of 37 caregivers of children between 6 and 35 months old (mean age = 19.4, SD = 7.7) enrolled, a rate of 44% (37/84) in 2.5 months. Providers suggested expanding staff training, gathering data during scheduled evaluations, and providing caregivers and providers with access to assessment summaries. Caregivers wanted their child's participation to change in 56% of home activities. Lower caregiver education and higher EI intensity were related to less child involvement in home activities. Implementing technology-based functional assessment is feasible with modifications, and these data can be useful for highlighting child, family, and EI service correlates of caregiver-reported child functioning that merit further study. Feasibility results informed protocol modifications related to EI provider training, timing of data collection, and management of EI service use data extraction, as preparation for a subsequent scale-up study that is underway.

  13. Functional redundancy and food web functioning in linuron-exposed ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    De Laender, F., E-mail: frederik.delaender@ugent.be [Laboratory of Environmental Toxicity and Aquatic Ecology, Ghent University, Plateaustraat 22, 9000 Ghent (Belgium); Van den Brink, P.J., E-mail: Paul.vandenBrink@wur.nl [Department of Aquatic Ecology and Water Quality Management, Wageningen University, PO Box 47, 6700 AA Wageningen (Netherlands); Janssen, C.R., E-mail: colin.janssen@ugent.be [Laboratory of Environmental Toxicity and Aquatic Ecology, Ghent University, Plateaustraat 22, 9000 Ghent (Belgium)

    2011-10-15

    An extensive data set describing effects of the herbicide linuron on macrophyte-dominated microcosms was analysed with a food web model to assess effects on ecosystem functioning. We showed that sensitive phytoplankton and periphyton groups in the diets of heterotrophs were gradually replaced by more tolerant phytoplankton species as linuron concentrations increased. This diet shift - showing redundancy among phytoplankton species - allowed heterotrophs to maintain their functions in the contaminated microcosms. On an ecosystem level, total gross primary production was up to a hundred times lower in the treated microcosms, but the uptake of dissolved organic carbon by bacteria and mixotrophs was less sensitive. Food web efficiency was not consistently lower in the treated microcosms. We conclude that linuron predominantly affected the macrophytes but did not alter the overall functioning of the surrounding planktonic food web. Therefore, a risk assessment that protects macrophyte growth also protects the functioning of macrophyte-dominated microcosms. - Highlights: > Food web modelling reveals the functional response of species and ecosystem to linuron. > Primary production was more sensitive to linuron than bacterial production. > Linuron replaced sensitive phytoplankton by tolerant phytoplankton in heterotrophs' diets. > Linuron did not change the functioning of heterotrophs. - Food web modelling reveals functional redundancy of the planktonic community in microcosms treated with linuron.

  14. Functional redundancy and food web functioning in linuron-exposed ecosystems

    International Nuclear Information System (INIS)

    De Laender, F.; Van den Brink, P.J.; Janssen, C.R.

    2011-01-01

    An extensive data set describing effects of the herbicide linuron on macrophyte-dominated microcosms was analysed with a food web model to assess effects on ecosystem functioning. We showed that sensitive phytoplankton and periphyton groups in the diets of heterotrophs were gradually replaced by more tolerant phytoplankton species as linuron concentrations increased. This diet shift - showing redundancy among phytoplankton species - allowed heterotrophs to maintain their functions in the contaminated microcosms. On an ecosystem level, total gross primary production was up to a hundred times lower in the treated microcosms, but the uptake of dissolved organic carbon by bacteria and mixotrophs was less sensitive. Food web efficiency was not consistently lower in the treated microcosms. We conclude that linuron predominantly affected the macrophytes but did not alter the overall functioning of the surrounding planktonic food web. Therefore, a risk assessment that protects macrophyte growth also protects the functioning of macrophyte-dominated microcosms. - Highlights: → Food web modelling reveals the functional response of species and ecosystem to linuron. → Primary production was more sensitive to linuron than bacterial production. → Linuron replaced sensitive phytoplankton by tolerant phytoplankton in heterotrophs' diets. → Linuron did not change the functioning of heterotrophs. - Food web modelling reveals functional redundancy of the planktonic community in microcosms treated with linuron.

  15. A Function-Based Framework for Stream Assessment & Restoration Projects

    Science.gov (United States)

    This report lays out a framework for approaching stream assessment and restoration projects that focuses on understanding the suite of stream functions at a site in the context of what is happening in the watershed.

  16. Quark fragmentation function and the nonlinear chiral quark model

    International Nuclear Information System (INIS)

    Zhu, Z.K.

    1993-01-01

    The scaling law of the fragmentation function is proved in this paper. With that, we show that the low-p_T quark fragmentation function can be studied as low-energy physics in the light-cone coordinate frame. We therefore use the nonlinear chiral quark model, which is able to describe low-energy physics below the scale Λ_CSB, to study such a function. Meanwhile, the formalism for studying the quark fragmentation function has been established. The nonlinear chiral quark model is quantized on the light front. We then use old-fashioned perturbation theory to study the quark fragmentation function. Our first-order result for such a function is in agreement with the phenomenological model study of e+e- jets. The probability for u, d pair formation in the e+e- jet from our calculation is also in agreement with the phenomenological model results

  17. Using transfer functions to quantify El Niño Southern Oscillation dynamics in data and models.

    Science.gov (United States)

    MacMartin, Douglas G; Tziperman, Eli

    2014-09-08

    Transfer function tools commonly used in engineering control analysis can be used to better understand the dynamics of El Niño Southern Oscillation (ENSO), compare data with models and identify systematic model errors. The transfer function describes the frequency-dependent input-output relationship between any pair of causally related variables, and can be estimated from time series. This can be used first to assess whether the underlying relationship is or is not frequency dependent, and if so, to diagnose the underlying differential equations that relate the variables, and hence describe the dynamics of individual subsystem processes relevant to ENSO. Estimating process parameters allows the identification of compensating model errors that may lead to a seemingly realistic simulation in spite of incorrect model physics. This tool is applied here to the TAO array ocean data, the GFDL-CM2.1 and CCSM4 general circulation models, and to the Cane-Zebiak ENSO model. The delayed oscillator description is used to motivate a few relevant processes involved in the dynamics, although any other ENSO mechanism could be used instead. We identify several differences in the processes between the models and data that may be useful for model improvement. The transfer function methodology is also useful in understanding the dynamics and evaluating models of other climate processes.
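    For readers unfamiliar with the technique, the sketch below estimates an empirical transfer function H(f) = S_xy(f)/S_xx(f) from a pair of time series using Welch cross- and auto-spectra. The synthetic "input/output" pair and all filter settings are placeholders, not the TAO, GFDL-CM2.1, CCSM4 or Cane-Zebiak data discussed in the record.

    # Empirical transfer function estimation between an "input" and an "output"
    # time series. The synthetic series stand in for any causally related pair.
    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(1)
    fs = 12.0                      # samples per year (monthly data)
    t = np.arange(0, 100, 1 / fs)  # 100 years of monthly values
    x = rng.standard_normal(t.size)
    # Output = low-pass filtered, delayed version of the input plus noise.
    b, a = signal.butter(2, 0.1)
    y = signal.lfilter(b, a, np.roll(x, 3)) + 0.1 * rng.standard_normal(t.size)

    f, Pxy = signal.csd(x, y, fs=fs, nperseg=512)
    _, Pxx = signal.welch(x, fs=fs, nperseg=512)
    H = Pxy / Pxx                  # complex transfer function estimate
    gain, phase = np.abs(H), np.angle(H)
    print(gain[:5], phase[:5])

    Inspecting the gain and phase as functions of frequency is what allows a frequency-dependent relationship, and hence the underlying differential equation, to be diagnosed.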

  18. Modelling of Functional States during Saccharomyces cerevisiae Fed-batch Cultivation

    Directory of Open Access Journals (Sweden)

    Stoyan Tzonkov

    2005-04-01

    An implementation of the functional state approach for modelling of yeast fed-batch cultivation is presented in this paper. The functional state modelling approach aims to overcome the main disadvantage of using a global process model, namely a complex model structure and a large number of model parameters, which complicate model simulation and parameter estimation. This approach has computational advantages, such as the possibility to use the estimated values from the previous state as starting values for estimating the parameters of a new state. The functional state modelling approach is applied here to fed-batch cultivation of Saccharomyces cerevisiae. Four functional states are recognised, and parameter estimation of the local models is presented as well.

  19. Increasing Compliance in Students with Intellectual Disabilities Using Functional Behavioral Assessment and Self-Monitoring

    Science.gov (United States)

    Wadsworth, Jamie P.; Hansen, Blake D.; Wills, Sarah B.

    2015-01-01

    Noncompliance in three elementary age students with intellectual disabilities was assessed using functional behavioral assessments. Escape was identified as the primary function of the behavior in all three students, and access to tangible items was identified in one of the students as a secondary function. Teacher-monitoring and self-monitoring…

  20. Quark fragmentation functions in NJL-jet model

    Science.gov (United States)

    Bentz, Wolfgang; Matevosyan, Hrayr; Thomas, Anthony

    2014-09-01

    We report on our studies of quark fragmentation functions in the Nambu-Jona-Lasinio (NJL) - jet model. The results of Monte-Carlo simulations for the fragmentation functions to mesons and nucleons, as well as to pion and kaon pairs (dihadron fragmentation functions), are presented. The important role of intermediate vector meson resonances for those semi-inclusive deep inelastic production processes is emphasized. Our studies are very relevant for the extraction of transverse momentum dependent quark distribution functions from measured scattering cross sections. Supported by Grant in Aid for Scientific Research, Japanese Ministry of Education, Culture, Sports, Science and Technology, Project No. 20168769.

  1. Statistical credit risk assessment model of small and very small enterprises for Lithuanian credit unions

    OpenAIRE

    Špicas, Renatas

    2017-01-01

    While functioning in accordance with the new, business- and efficiency-oriented operating model, credit unions develop and begin operating outside the community. It is universally recognised in the scientific literature that as credit unions expand their activities beyond a community, social relations with credit union members weaken and the credit unions lose their social control element, which helps them to better assess and manage information asymmetry and credit risk. So far, the analysis of ...

  2. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    Science.gov (United States)

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  3. Neuropsychological assessment of language functions during functional magnetic resonance imaging: development of new tasks. Preliminary report.

    Science.gov (United States)

    Fersten, Ewa; Jakuciński, Maciej; Kuliński, Radosław; Koziara, Henryk; Mroziak, Barbara; Nauman, Paweł

    2011-01-01

    Due to the complex and extended cerebral organization of language functions, the brain regions crucial for speech and language, i.e. eloquent areas, are liable to be affected by neuro-oncological surgery. One of the techniques that may be helpful in pre-operative planning of the extent of tumour removal and in estimating possible complications seems to be functional magnetic resonance imaging (fMRI). The aim of the study was to develop valid procedures for neuropsychological assessment of various language functions visualisable by fMRI in healthy individuals. In this fMRI study, 10 healthy (with no CNS pathology), right-handed volunteers aged 25-35 were examined using four tasks designed to measure different language functions, and one for short-term memory assessment. A 1.5-T MRI scanner performing ultrafast functional (EPI) sequences with 4-mm slice thickness and 1-mm interslice gap was used to detect the BOLD response to stimuli presented in a block design (30-second alternating blocks of activity and rest). The analyses used the SPM software running in a MATLAB environment, and the obtained data were interpreted by means of colour-coded maps superimposed on structural brain scans. For each of the tasks developed for particular language functions, a different area of increased neuronal activity was found. The differential localization of function-related neuronal activity seems interesting and the research worth continuing, since verbal communication failure may result from impairment of any of various language functions, and studies reported in the literature seem to focus on verbal expression only.

  4. Assessment of Beta-Cell Function During Pregnancy and after Delivery

    Directory of Open Access Journals (Sweden)

    Genova M. P.

    2014-06-01

    The aim of the present study was to assess β-cell function using the homeostasis model (HOMA-B) and the disposition index (DI) in pregnant women with/without gestational diabetes, and after delivery. A total of 102 pregnant women between 24-28 gestational weeks (53 with gestational diabetes mellitus (GDM) and 49 with normal glucose tolerance (NGT)) and 22 GDM postpartum women (8-12 weeks after delivery) were included in the study. All postpartum women had a history of GDM. HOMA indexes (HOMA-IR for insulin resistance and HOMA-B for assessing β-cell function) were calculated from fasting glucose and insulin concentrations. To estimate insulin secretion independent of insulin sensitivity, DI was calculated using glucose and insulin levels at 0 and 60 min during the course of a 2 h 75 g oral glucose tolerance test (OGTT). In GDM pregnant women HOMA-B was significantly lower compared to NGT women (p = 0.017), but there was no significant difference compared to women after birth (NS). There was a difference between NGT and postpartum women (p < 0.05). DI was significantly lower for GDM pregnant women in comparison to NGT and postpartum women (p < 0.0001; p = 0.011), and between NGT and women after birth (p < 0.04). In our study, comparison of HOMA-B in NGT and GDM pregnant women demonstrated that the OR of developing GDM was 0.989 (95% CI, 0.980-0.998, P = 0.013), and comparison of DI in healthy pregnant and GDM women showed that the OR of developing GDM was 0.967 (95% CI, 0.947-0.988, P = 0.002). Therefore, HOMA-B and DI appear to be protective factors in the risk of developing GDM. According to our results, assessment of β-cell function using HOMA-B and DI showed that they are lower in the GDM group than in the NGT group and postpartum women. It is important to note that HOMA-B did not show a significant difference between GDM pregnant women and women after delivery with a history of GDM. We assume that pregnant women with GDM have a pancreatic β-cell defect that remains after birth. These women
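    The fasting indices named in this record follow the standard published HOMA formulas; the sketch below computes them, together with one common OGTT-based formulation of the disposition index. The DI formula shown is an assumption, since the abstract does not spell out how DI was derived from the 0 and 60 min samples.

    # HOMA-IR and HOMA-B use the standard fasting formulas (glucose in mmol/L,
    # insulin in microU/mL). The disposition index here is one common choice
    # (insulinogenic index times a crude fasting sensitivity proxy) and is only
    # an assumption about this study's exact calculation.
    def homa_ir(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
        return glucose_mmol_l * insulin_uU_ml / 22.5

    def homa_b(glucose_mmol_l: float, insulin_uU_ml: float) -> float:
        return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

    def disposition_index(g0, g60, i0, i60):
        insulinogenic = (i60 - i0) / (g60 - g0)   # early-phase secretion
        sensitivity = 1.0 / i0                     # fasting sensitivity proxy
        return insulinogenic * sensitivity

    print(homa_ir(5.1, 9.0), homa_b(5.1, 9.0), disposition_index(5.1, 7.8, 9.0, 55.0))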

  5. The BACHD Rat Model of Huntington Disease Shows Specific Deficits in a Test Battery of Motor Function.

    Science.gov (United States)

    Manfré, Giuseppe; Clemensson, Erik K H; Kyriakou, Elisavet I; Clemensson, Laura E; van der Harst, Johanneke E; Homberg, Judith R; Nguyen, Huu Phuc

    2017-01-01

    Rationale: Huntington disease (HD) is a progressive neurodegenerative disorder characterized by motor, cognitive and neuropsychiatric symptoms. HD is usually diagnosed by the appearance of motor deficits, resulting in skilled hand use disruption, gait abnormality, muscle wasting and choreatic movements. The BACHD transgenic rat model for HD represents a well-established transgenic rodent model of HD, offering the prospect of an in-depth characterization of the motor phenotype. Objective: The present study aims to characterize different aspects of motor function in BACHD rats, combining classical paradigms with novel high-throughput behavioral phenotyping. Methods: Wild-type (WT) and transgenic animals were tested longitudinally from 2 to 12 months of age. To measure fine motor control, rats were challenged with the pasta handling test and the pellet reaching test. To evaluate gross motor function, animals were assessed by using the holding bar and the grip strength tests. Spontaneous locomotor activity and circadian rhythmicity were assessed in an automated home-cage environment, namely the PhenoTyper. We then integrated existing classical methodologies to test motor function with automated home-cage assessment of motor performance. Results: BACHD rats showed strong impairment in muscle endurance at 2 months of age. Altered circadian rhythmicity and locomotor activity were observed in transgenic animals. On the other hand, reaching behavior, forepaw dexterity and muscle strength were unaffected. Conclusions: The BACHD rat model exhibits certain features of HD patients, like muscle weakness and changes in circadian behavior. We have observed modest but clear-cut deficits in distinct motor phenotypes, thus confirming the validity of this transgenic rat model for treatment and drug discovery purposes.

  6. Beyond the global assessment of functioning: learning from Virginia Apgar.

    Science.gov (United States)

    Dimsdale, Joel E; Jeste, Dilip V; Patterson, Thomas L

    2010-01-01

    The Global Assessment of Functioning (GAF) scale is widely used in psychiatry, yet it has certain drawbacks. The authors seek to generate further discussion and research around developing an improved successor to the GAF. The authors used the Apgar scale as a template for constructing a possible successor to the GAF. Consulting with 16 colleagues, they selected 5 domains that were felt to be central to functioning in psychiatric patients. Psychiatrists in diverse clinical settings then completed both a GAF and a Psychiatric Apgar scale on 40 patients. The two scales were found to agree significantly. Use of the Psychiatric Apgar, however, provides clearer guidance about assessing functioning. The GAF was a brilliant addition to psychiatric practice. As we develop the next Diagnostic and Statistical Manual, it is pertinent to ask whether the GAF approach could be optimized even further by applying the lessons of Virginia Apgar.

  7. Relationship between functional connectivity and motor function assessment in stroke patients with hemiplegia: a resting-state functional MRI study

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ye; Wang, Li; Zhang, Jingna; Sang, Linqiong; Li, Pengyue; Qiu, Mingguo [Third Military Medical University, Department of Medical Imaging, College of Biomedical Engineering, Chongqing (China); Liu, Hongliang; Yan, Rubing [Third Military Medical University, Department of Rehabilitation, Southwest Hospital, Chongqing (China); Yang, Jun; Wang, Jian [Third Military Medical University, Department of Radiology, Southwest Hospital, Chongqing (China)

    2016-05-15

    Resting-state functional magnetic resonance imaging (fMRI) has been used to examine the brain mechanisms of stroke patients with hemiplegia, but the relationship between functional connectivity (FC) and treatment-induced motor function recovery has not yet been fully investigated. This study aimed to identify the brain FC changes in stroke patients and study the relationship between FC and motor function assessment using the resting-state fMRI. Seventeen stroke patients with hemiplegia and fifteen healthy control subjects (HCSs) were recruited in this study. We compared the FC between the ipsilesional primary motor cortex (M1) and the whole brain of the patients with the FC of the HCSs and studied the FC changes in the patients before and after conventional rehabilitation and motor imagery therapy. Additionally, correlations between the FC change and motor function of the patients were studied. Compared to the HCSs, the FC in the patient group was significantly increased between the ipsilesional M1 and the ipsilesional inferior parietal cortex, frontal gyrus, supplementary motor area (SMA), and contralesional angular and decreased between the ipsilesional M1 and bilateral M1. After the treatment, the FC between the ipsilesional M1 and contralesional M1 increased while the FC between the ipsilesional M1 and ipsilesional SMA and paracentral lobule decreased. A statistically significant correlation was found between the FC change in the bilateral M1 and the Fugl-Meyer assessment (FMA) score change. Our results revealed an abnormal motor network after stroke and suggested that the FC could serve as a biomarker of motor function recovery in stroke patients with hemiplegia. (orig.)
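    For orientation, the sketch below shows the basic seed-based functional connectivity computation that studies of this kind build on: correlate the time series of a seed region (here, a stand-in for ipsilesional M1) with every other parcel and Fisher z-transform the result. The data are synthetic and none of the study's preprocessing, registration or statistics is reproduced.

    # Seed-based resting-state functional connectivity on synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    n_timepoints, n_parcels = 240, 90
    bold = rng.standard_normal((n_timepoints, n_parcels))
    seed = bold[:, 0]                       # pretend parcel 0 is ipsilesional M1

    # Pearson correlation of the seed with every parcel, then Fisher z-transform.
    bold_c = bold - bold.mean(axis=0)
    seed_c = seed - seed.mean()
    r = (bold_c.T @ seed_c) / (np.linalg.norm(bold_c, axis=0) * np.linalg.norm(seed_c))
    z = np.arctanh(np.clip(r, -0.999999, 0.999999))
    print(r[:5], z[:5])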

  8. Relationship between functional connectivity and motor function assessment in stroke patients with hemiplegia: a resting-state functional MRI study

    International Nuclear Information System (INIS)

    Zhang, Ye; Wang, Li; Zhang, Jingna; Sang, Linqiong; Li, Pengyue; Qiu, Mingguo; Liu, Hongliang; Yan, Rubing; Yang, Jun; Wang, Jian

    2016-01-01

    Resting-state functional magnetic resonance imaging (fMRI) has been used to examine the brain mechanisms of stroke patients with hemiplegia, but the relationship between functional connectivity (FC) and treatment-induced motor function recovery has not yet been fully investigated. This study aimed to identify the brain FC changes in stroke patients and study the relationship between FC and motor function assessment using the resting-state fMRI. Seventeen stroke patients with hemiplegia and fifteen healthy control subjects (HCSs) were recruited in this study. We compared the FC between the ipsilesional primary motor cortex (M1) and the whole brain of the patients with the FC of the HCSs and studied the FC changes in the patients before and after conventional rehabilitation and motor imagery therapy. Additionally, correlations between the FC change and motor function of the patients were studied. Compared to the HCSs, the FC in the patient group was significantly increased between the ipsilesional M1 and the ipsilesional inferior parietal cortex, frontal gyrus, supplementary motor area (SMA), and contralesional angular and decreased between the ipsilesional M1 and bilateral M1. After the treatment, the FC between the ipsilesional M1 and contralesional M1 increased while the FC between the ipsilesional M1 and ipsilesional SMA and paracentral lobule decreased. A statistically significant correlation was found between the FC change in the bilateral M1 and the Fugl-Meyer assessment (FMA) score change. Our results revealed an abnormal motor network after stroke and suggested that the FC could serve as a biomarker of motor function recovery in stroke patients with hemiplegia. (orig.)

  9. Environmental Modeling and Bayesian Analysis for Assessing Human Health Impacts from Radioactive Waste Disposal

    Science.gov (United States)

    Stockton, T.; Black, P.; Tauxe, J.; Catlett, K.

    2004-12-01

    Bayesian decision analysis provides a unified framework for coherent decision-making. Two key components of Bayesian decision analysis are probability distributions and utility functions. Calculating posterior distributions and performing decision analysis can be computationally challenging, especially for complex environmental models. In addition, probability distributions and utility functions for environmental models must be specified through expert elicitation, stakeholder consensus, or data collection, all of which have their own set of technical and political challenges. Nevertheless, a grand appeal of the Bayesian approach for environmental decision-making is the explicit treatment of uncertainty, including expert judgment. The impact of expert judgment on the environmental decision process, though integral, goes largely unassessed. Regulations and orders of the Environmental Protection Agency, Department of Energy, and Nuclear Regulatory Commission require assessing the impact on human health of radioactive waste contamination over periods of up to ten thousand years. Towards this end, complex environmental simulation models are used to assess "risk" to human and ecological health from migration of radioactive waste. As the computational burden of environmental modeling is continually reduced, probabilistic process modeling using Monte Carlo simulation is becoming routinely used to propagate uncertainty from model inputs through model predictions. The utility of a Bayesian approach to environmental decision-making is discussed within the context of a buried radioactive waste example. This example highlights the desirability and difficulties of merging the cost of monitoring, the cost of the decision analysis, the cost and viability of clean up, and the probability of human health impacts within a rigorous decision framework.
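    As a toy illustration of the Monte Carlo plus utility idea described here, the sketch below propagates uncertain inputs through a simplistic "dose" proxy and compares the expected utility of two actions. Every distribution, cost and threshold is an invented placeholder, not part of the buried-waste example in the record.

    # Monte Carlo uncertainty propagation and expected-utility comparison.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    source_term = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # uncertain inventory
    transport = rng.beta(2, 5, size=n)                         # uncertain migration fraction
    dose = source_term * transport                             # simplistic dose proxy

    def utility(action_cost, dose, dose_limit=0.5, health_cost=10.0):
        # Negative cost plus a penalty whenever the dose limit is exceeded.
        return -(action_cost + health_cost * (dose > dose_limit))

    eu_monitor = utility(1.0, dose).mean()
    eu_cleanup = utility(4.0, 0.2 * dose).mean()   # cleanup assumed to cut dose by 80%
    print(eu_monitor, eu_cleanup)

    The action with the higher expected utility would be preferred; in a full Bayesian analysis the input distributions would come from elicitation or posterior inference rather than being fixed up front.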

  10. Diagnostics for Linear Models With Functional Responses

    OpenAIRE

    Xu, Hongquan; Shen, Qing

    2005-01-01

    Linear models where the response is a function and the predictors are vectors are useful in analyzing data from designed experiments and other situations with functional observations. Residual analysis and diagnostics are considered for such models. Studentized residuals are defined and their properties are studied. Chi-square quantile-quantile plots are proposed to check the assumption of Gaussian error process and outliers. Jackknife residuals and an associated test are proposed to det...

  11. Review of early assessment models of innovative medical technologies.

    Science.gov (United States)

    Fasterholdt, Iben; Krahn, Murray; Kidholm, Kristian; Yderstræde, Knud Bonnet; Pedersen, Kjeld Møller

    2017-08-01

    Hospitals increasingly make decisions regarding the early development of and investment in technologies, but a formal evaluation model for assisting hospitals early on in assessing the potential of innovative medical technologies is lacking. This article provides an overview of models for early assessment in different health organisations and discusses which models hold most promise for hospital decision makers. A scoping review of published studies between 1996 and 2015 was performed using nine databases. The following information was collected: decision context, decision problem, and a description of the early assessment model. 2362 articles were identified and 12 studies fulfilled the inclusion criteria. An additional 12 studies were identified and included in the review by searching reference lists. The majority of the 24 early assessment studies were variants of traditional cost-effectiveness analysis. Around one fourth of the studies presented an evaluation model with a broader focus than cost-effectiveness. Uncertainty was mostly handled by simple sensitivity or scenario analysis. This review shows that evaluation models using known methods for assessing cost-effectiveness are most prevalent in early assessment but seem ill-suited for early assessment in hospitals. Four models provided some usable elements for the development of a hospital-based model.

  12. Contribution to a quantitative assessment model for reliability-based metrics of electronic and programmable safety-related functions; Contribution a un modele d'evaluation quantitative des performances fiabilistes de fonctions electroniques et programmables dediees a la securite

    Energy Technology Data Exchange (ETDEWEB)

    Hamidi, K

    2005-10-15

    The use of fault-tolerant EP architectures has induced growing constraints, whose influence on reliability-based performance metrics is no longer negligible. To address the growing influence of simultaneous failures, this thesis proposes, for safety-related functions, a reliability assessment method that takes the time aspect into account more accurately. The report introduces the concept of information and uses it to interpret the failure modes of a safety-related function as the direct result of the initiation and propagation of erroneous information up to the actuator level. The main idea is to distinguish between the appearance and disappearance of erroneous states, which depend intrinsically on hardware characteristics and maintenance policies, and their possible activation, constrained by architectural choices, which leads to the failure of the safety-related function. At a low level, this approach is based on deterministic discrete-event (SED) models of the architecture and uses non-homogeneous Markov chains to describe the time evolution of error probabilities. (author)
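    To make the error-state idea concrete, the sketch below tracks the probabilities of a tiny three-state error model over time. The thesis uses non-homogeneous Markov chains; a time-homogeneous generator is used here purely to keep the illustration short, and all rates are invented.

    # Three-state sketch: OK -> erroneous (latent) -> failed, with repair.
    import numpy as np
    from scipy.linalg import expm

    lam, act, rep = 1e-4, 5e-3, 1e-2   # error rate, activation rate, repair rate (per hour)
    Q = np.array([
        [-lam,          lam,  0.0],   # OK -> erroneous
        [ rep, -(rep + act),  act],   # erroneous -> repaired, or activated (failure)
        [ 0.0,          0.0,  0.0],   # failed is absorbing
    ])
    p0 = np.array([1.0, 0.0, 0.0])
    for t in (1e2, 1e3, 1e4):          # hours
        print(t, p0 @ expm(Q * t))     # probabilities of OK / erroneous / failed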

  13. The effect of language on functional capacity assessment in middle-aged and older US Latinos with schizophrenia.

    Science.gov (United States)

    Bengoetxea, Eneritz; Burton, Cynthia Z; Mausbach, Brent T; Patterson, Thomas L; Twamley, Elizabeth W

    2014-08-15

    The U.S. Latino population is steadily increasing, prompting a need for cross-cultural outcome measures in schizophrenia research. This study examined the contribution of language to functional assessment in middle-aged Latino patients with schizophrenia by comparing 29 monolingual Spanish-speakers, 29 Latino English-speakers, and 29 non-Latino English-speakers who were matched on relevant demographic variables and who completed cognitive and functional assessments in their native language. There were no statistically significant differences between groups on the four everyday functioning variables (UCSD Performance-Based Skills Assessment [UPSA], Social Skills Performance Assessment [SSPA], Medication Management Ability Assessment [MMAA], and the Global Assessment of Functioning [GAF]). The results support the cross-linguistic and cross-cultural acceptability of these functional assessment instruments. It appears that demographic variables other than language (e.g., age, education) better explain differences in functional assessment among ethnically diverse subpopulations. Considering the influence of these other factors in addition to language on functional assessments will help ensure that measures can be appropriately interpreted among the diverse residents of the United States.

  14. The effect of language on functional capacity assessment in middle-aged and older US Latinos with schizophrenia

    Science.gov (United States)

    Bengoetxea, Eneritz; Burton, Cynthia Z.; Mausbach, Brent T.; Patterson, Thomas L.; Twamley, Elizabeth W.

    2014-01-01

    The U.S. Latino population is steadily increasing, prompting a need for cross-cultural outcome measures in schizophrenia research. This study examined the contribution of language to functional assessment in middle-aged Latino patients with schizophrenia by comparing 29 monolingual Spanish-speakers, 29 Latino English-speakers, and 29 non-Latino English-speakers who were matched on relevant demographic variables and who completed cognitive and functional assessments in their native language. There were no statistically significant differences between groups on the four everyday functioning variables (UCSD Performance-Based Skills Assessment [UPSA], Social Skills Performance Assessment [SSPA], Medication Management Ability Assessment [MMAA], and the Global Assessment of Functioning [GAF]). The results support the cross-linguistic and cross-cultural acceptability of these functional assessment instruments. It appears that demographic variables other than language (e.g., age, education) better explain differences in functional assessment among ethnically diverse subpopulations. Considering the influence of these other factors in addition to language on functional assessments will help ensure that measures can be appropriately interpreted among the diverse residents of the United States. PMID:24751379

  15. Structure, Function, and Applications of the Georgetown-Einstein (GE) Breast Cancer Simulation Model.

    Science.gov (United States)

    Schechter, Clyde B; Near, Aimee M; Jayasekera, Jinani; Chandler, Young; Mandelblatt, Jeanne S

    2018-04-01

    The Georgetown University-Albert Einstein College of Medicine breast cancer simulation model (Model GE) has evolved over time in structure and function to reflect advances in knowledge about breast cancer, improvements in early detection and treatment technology, and progress in computing resources. This article describes the model and provides examples of model applications. The model is a discrete events microsimulation of single-life histories of women from multiple birth cohorts. Events are simulated in the absence of screening and treatment, and interventions are then applied to assess their impact on population breast cancer trends. The model accommodates differences in natural history associated with estrogen receptor (ER) and human epidermal growth factor receptor 2 (HER2) biomarkers, as well as conventional breast cancer risk factors. The approach for simulating breast cancer natural history is phenomenological, relying on dates, stage, and age of clinical and screen detection for a tumor molecular subtype without explicitly modeling tumor growth. The inputs to the model are regularly updated to reflect current practice. Numerous technical modifications, including the use of object-oriented programming (C++), and more efficient algorithms, along with hardware advances, have increased program efficiency permitting simulations of large samples. The model results consistently match key temporal trends in US breast cancer incidence and mortality. The model has been used in collaboration with other CISNET models to assess cancer control policies and will be applied to evaluate clinical trial design, recurrence risk, and polygenic risk-based screening.
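    The "single life history, earliest event wins" logic of a discrete-event microsimulation can be illustrated with a very small sketch. The event-time distributions below are invented and do not reflect Model GE's calibrated natural-history inputs or its biomarker-specific structure.

    # Toy discrete-event microsimulation of one woman's competing events.
    import numpy as np

    rng = np.random.default_rng(5)

    def simulate_one_woman():
        age_other_death = rng.weibull(5) * 90          # death from other causes
        age_onset = rng.uniform(30, 90)                # preclinical cancer onset
        age_clinical = age_onset + rng.exponential(4)  # clinical surfacing
        age_screen = age_onset + rng.exponential(2)    # screen detection (if screened)
        events = {"other_death": age_other_death,
                  "screen_detect": age_screen,
                  "clinical_detect": age_clinical}
        first = min(events, key=events.get)            # earliest event wins
        return first, events[first]

    outcomes = [simulate_one_woman() for _ in range(10_000)]
    print(sum(1 for e, _ in outcomes if e == "screen_detect") / len(outcomes))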

  16. FUNCTIONAL MODELLING FOR FAULT DIAGNOSIS AND ITS APPLICATION FOR NPP

    Directory of Open Access Journals (Sweden)

    MORTEN LIND

    2014-12-01

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large scale systems like nuclear plants is explained. The diagnosis task is analyzed and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR; and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies including decomposition and reuse.

  17. Functional Modelling for fault diagnosis and its application for NPP

    Energy Technology Data Exchange (ETDEWEB)

    Lind, Morten; Zhang, Xin Xin [Dept. of Electrical Engineering, Technical University of Denmark, Kongens Lyngby (Denmark)

    2014-12-15

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large scale systems like nuclear plants is explained. The diagnosis task is analyzed and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR; and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies including decomposition and reuse.

  18. Functional Modelling for fault diagnosis and its application for NPP

    International Nuclear Information System (INIS)

    Lind, Morten; Zhang, Xin Xin

    2014-01-01

    The paper presents functional modelling and its application for diagnosis in nuclear power plants. Functional modelling is defined and its relevance for coping with the complexity of diagnosis in large scale systems like nuclear plants is explained. The diagnosis task is analyzed and it is demonstrated that the levels of abstraction in models for diagnosis must reflect plant knowledge about goals and functions which is represented in functional modelling. Multilevel flow modelling (MFM), which is a method for functional modelling, is introduced briefly and illustrated with a cooling system example. The use of MFM for reasoning about causes and consequences is explained in detail and demonstrated using the reasoning tool, the MFMSuite. MFM applications in nuclear power systems are described by two examples: a PWR; and an FBR reactor. The PWR example shows how MFM can be used to model and reason about operating modes. The FBR example illustrates how the modelling development effort can be managed by proper strategies including decomposition and reuse.

  19. Interactive Rapid Dose Assessment Model (IRDAM): reactor-accident assessment methods. Vol.2

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness, the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This document describes the technical bases for IRDAM including methods, models and assumptions used in calculations. IRDAM calculates whole body (5-cm depth) and infant thyroid doses at six fixed downwind distances between 500 and 20,000 meters. Radionuclides considered primarily consist of noble gases and radioiodines. In order to provide a rapid assessment capability consistent with the capacity of the Osborne-1 computer, certain simplifying approximations and assumptions are made. These are described, along with default values (assumptions used in the absence of specific input) in the text of this document. Two companion volumes to this one provide additional information on IRDAM. The user's Guide (NUREG/CR-3012, Volume 1) describes the setup and operation of equipment necessary to run IRDAM. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  20. Statistical Assessment of Solvent Mixture Models Used for Separation of Biological Active Compounds

    Directory of Open Access Journals (Sweden)

    Lorentz Jäntschi

    2008-08-01

    Two mathematical models with seven and six parameters have been created for use as methods for identifying the optimum mobile phase in chromatographic separations. A series of chromatographic response functions were proposed and implemented in order to assess and validate the models. The assessment was performed on a set of androstane isomers. Pearson, Spearman, Kendall tau-a,b,c and Goodman-Kruskal correlation coefficients were used in order to identify and quantify the link and its nature (quantitative, categorical, semi-quantitative, or both quantitative and categorical) between the experimental values and the values estimated by the mathematical models. The study revealed that the six-parameter model is valid and reliable for five chromatographic response factors (retardation factor, retardation factor ordered ascending by chromatographic peak, resolution of pairs of compounds, resolution matrix of successive chromatographic peaks, and quality factor). Furthermore, the model could be used as an instrument for analysing the quality of experimental data. The results obtained by applying the six-parameter model to deviations of rank sums suggest that the data of experiment no. 8 are questionable.
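    The agreement checks named in this record can be reproduced in outline with standard library calls. The sketch below compares synthetic "experimental" retardation factors with "model" estimates; Goodman-Kruskal gamma is omitted because SciPy does not ship it, and scipy.stats.kendalltau returns tau-b (tau-a and tau-c would need custom code).

    # Correlation between experimental and model-estimated values (synthetic data).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    rf_experimental = rng.uniform(0.1, 0.9, size=20)
    rf_model = rf_experimental + rng.normal(0, 0.05, size=20)   # estimate + noise

    print("Pearson :", stats.pearsonr(rf_experimental, rf_model))
    print("Spearman:", stats.spearmanr(rf_experimental, rf_model))
    print("Kendall :", stats.kendalltau(rf_experimental, rf_model))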

  1. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)
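    For context, SLIM-type methods typically combine weighted expert ratings into a success likelihood index (SLI) and then map the SLI to a human error probability (HEP) through a log-linear calibration anchored on tasks with known HEPs. The sketch below illustrates that generic recipe with invented weights, ratings and anchors; it is not the SLIM-MAUD implementation reviewed in this record.

    # Generic SLIM-style calculation: SLI from weighted ratings, then
    # log10(HEP) assumed linear in SLI, calibrated on two anchor tasks.
    import numpy as np

    weights = np.array([0.4, 0.3, 0.3])          # e.g. stress, training, interface
    ratings = np.array([6.0, 4.0, 7.0])          # expert ratings on a 1-9 scale
    sli = float(weights @ ratings)

    sli_anchors = np.array([2.0, 8.0])           # anchor tasks with known HEPs
    hep_anchors = np.array([1e-1, 1e-3])
    a, b = np.polyfit(sli_anchors, np.log10(hep_anchors), 1)

    hep = 10 ** (a * sli + b)
    print(sli, hep)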

  2. Item Response Theory with Covariates (IRT-C): Assessing Item Recovery and Differential Item Functioning for the Three-Parameter Logistic Model

    Science.gov (United States)

    Tay, Louis; Huang, Qiming; Vermunt, Jeroen K.

    2016-01-01

    In large-scale testing, the use of multigroup approaches is limited for assessing differential item functioning (DIF) across multiple variables as DIF is examined for each variable separately. In contrast, the item response theory with covariate (IRT-C) procedure can be used to examine DIF across multiple variables (covariates) simultaneously. To…
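    The three-parameter logistic (3PL) item response function underlying the procedure in this record is standard and is sketched below; the covariate effect that gives IRT-C its name is only hinted at in a comment, since the record does not spell out its parameterization.

    # Standard 3PL item response function: discrimination a, difficulty b, guessing c.
    import numpy as np

    def p_correct_3pl(theta, a, b, c):
        """Probability of a correct response under the 3PL model."""
        return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

    theta = np.linspace(-3, 3, 7)
    print(p_correct_3pl(theta, a=1.2, b=0.0, c=0.2))
    # DIF sketch: a covariate z shifting difficulty, b_z = b + beta * z, would make the
    # item harder or easier for subgroups -- the kind of effect IRT-C screens for.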

  3. Multivariate Heteroscedasticity Models for Functional Brain Connectivity

    Directory of Open Access Journals (Sweden)

    Christof Seiler

    2017-12-01

    Functional brain connectivity is the co-occurrence of brain activity in different areas during resting and while doing tasks. The data of interest are multivariate time series measured simultaneously across brain parcels using resting-state fMRI (rfMRI). We analyze functional connectivity using two heteroscedasticity models. Our first model is low-dimensional and scales linearly in the number of brain parcels. Our second model scales quadratically. We apply both models to data from the Human Connectome Project (HCP), comparing connectivity between short and conventional sleepers. We find stronger functional connectivity in short sleepers than in conventional sleepers in brain areas consistent with previous findings. This might be due to subjects falling asleep in the scanner. Consequently, we recommend the inclusion of average sleep duration as a covariate to remove unwanted variation in rfMRI studies. A power analysis using the HCP data shows that a sample size of 40 detects 50% of the connectivity at a false discovery rate of 20%. We provide implementations using R and the probabilistic programming language Stan.

  4. A battery of tests for assessing cognitive function in U.S. Chinese older adults--findings from the PINE Study.

    Science.gov (United States)

    Chang, E-Shien; Dong, XinQi

    2014-11-01

    Existing methodological challenges in aging research have hampered our assessment of cognitive function among minority older adults. We aim to report the composite scores of five cognitive function tests among U.S. Chinese older adults, and to examine the association between cognitive function and key sociodemographic characteristics. The Population Study of Chinese Elderly in Chicago Study enrolled an epidemiological cohort of 3,159 community-dwelling Chinese older adults. We administered five cognitive function tests, including the Chinese Mini-Mental State Examination, the immediate and delayed recall of the East Boston Memory Test, the Digit Span Backwards assessment, and the Symbol Digit Modalities Test. We used Spearman correlation coefficients to examine the correlation between cognitive function and sociodemographic variables. Linear regression models were used to report the effects of sociodemographic and health variables, including age, sex, and education, on cognitive function. Our multivariate analysis suggested that performance in each domain of cognitive function was inversely associated with age and positively related to education. With respect to sex, after adjusting for age, education and all key variables presented in the model, being male was positively related to global cognitive score and working memory. Being married, having fewer children, having been in the United States for fewer years, having been in the community for fewer years, and better self-reported health were positively correlated with all cognitive function domains. This population-based study of U.S. Chinese older adults is among the first to examine a battery of five cognitive function tests, which in aggregate enables researchers to capture a wide range of cognitive performance.

  5. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant. They need to consider a Lyapunov function candidate as an evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. MFAFC considers the approximate evaluation function as an evaluative control performance measure similar to the state-action value function in reinforcement learning. The simulation results of applying MFAFC to the inverted pendulum benchmark verified the proposed scheme's efficacy.

  6. Model Penentuan Nilai Target Functional Requirement Berbasis Utilitas [A Utility-Based Model for Determining Functional Requirement Target Values]

    Directory of Open Access Journals (Sweden)

    Cucuk Nur Rosyidi

    2012-01-01

    In a product design and development process, a designer faces the problem of deciding functional requirement (FR) target values. That decision is made under risk, since it is taken in the early design phase using incomplete information. A utility function can be used to reflect the decision maker's attitude towards risk in making such a decision. In this research, we develop a utility-based model to determine FR target values using a quadratic utility function and information from Quality Function Deployment (QFD). A pencil design is used as a numerical example, using a quadratic utility function for each FR. The model can be applied to balance customer and designer interests in determining FR target values.

  7. Towards refactoring the Molecular Function Ontology with a UML profile for function modeling.

    Science.gov (United States)

    Burek, Patryk; Loebe, Frank; Herre, Heinrich

    2017-10-04

    Gene Ontology (GO) is the largest resource for cataloging gene products. This resource grows steadily and, naturally, this growth raises issues regarding the structure of the ontology. Moreover, modeling and refactoring large ontologies such as GO is generally far from being simple, as a whole as well as when focusing on certain aspects or fragments. It seems that human-friendly graphical modeling languages such as the Unified Modeling Language (UML) could be helpful in connection with these tasks. We investigate the use of UML for making the structural organization of the Molecular Function Ontology (MFO), a sub-ontology of GO, more explicit. More precisely, we present a UML dialect, called the Function Modeling Language (FueL), which is suited for capturing functions in an ontologically founded way. FueL is equipped, among other features, with language elements that arise from studying patterns of subsumption between functions. We show how to use this UML dialect for capturing the structure of molecular functions. Furthermore, we propose and discuss some refactoring options concerning fragments of MFO. FueL enables the systematic, graphical representation of functions and their interrelations, including making information explicit that is currently either implicit in MFO or is mainly captured in textual descriptions. Moreover, the considered subsumption patterns lend themselves to the methodical analysis of refactoring options with respect to MFO. On this basis we argue that the approach can increase the comprehensibility of the structure of MFO for humans and can support communication, for example, during revision and further development.

  8. Improvement of the Model of Enterprise Management Process on the Basis of General Management Functions

    Directory of Open Access Journals (Sweden)

    Ruslan Skrynkovskyy

    2017-12-01

    The purpose of the article is to improve the model of the enterprise (institution, organization) management process on the basis of general management functions. A graphic model of the management process according to process-structured management is presented. It has been established that in today's business environment, the model of the management process should include such general management functions as: (1) controlling the achievement of results; (2) planning based on the main goal; (3) coordination and corrective actions (in the system of organization of work and production); (4) action as a form of act (conscious, volitional, directed); (5) the accounting system (accounting, statistical, operational-technical and managerial); (6) diagnosis (economic, legal), with such subfunctions as: identification of the state and capabilities; analysis (economic, legal, systemic) with argumentation; and assessment of the state, trends and prospects of development. The prospects for further research in this direction are: (1) the formation of a system of interrelation of functions and management methods, taking into account the presented research results; (2) development of a model of an effective and efficient communication business process of the enterprise.

  9. The Assessment of Executive Functioning in People with Intellectual Disabilities: An Exploratory Analysis

    Science.gov (United States)

    Bevins, Shelley; Hurse, Emily

    2016-01-01

    The following article details a piece of service development work undertaken as part of the Plymouth Down Syndrome Screening Programme. The work aimed to review the use of three measures assessing executive functioning skills used within the Programme as well as with people without Down syndrome. Three tasks assessing executive functioning (the…

  10. Bayesian assessment of moving group membership: importance of models and prior knowledge

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-04-01

    Young nearby moving groups are important and useful in many fields of astronomy such as studying exoplanets, low-mass stars, and the stellar evolution of the early planetary systems over tens of millions of years, which has led to intensive searches for their members. Identification of members depends sensitively on the models used; therefore, careful examination of the models is required. In this study, we investigate the effects of the models used in moving group membership calculations based on a Bayesian framework (e.g. BANYAN II), focusing on the beta-Pictoris moving group (BPMG). Three improvements for building models are suggested: (1) updating a list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZ and UVW. The effect of each change is investigated, and we suggest using all of these improvements simultaneously in future membership probability calculations. Using this improved moving-group membership calculation and a careful examination of ages, 57 bona fide members of the BPMG are confirmed, including 12 new members. We additionally suggest 17 highly probable members.
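
    As a rough illustration of the Bayesian framework referred to above, the sketch below computes a two-hypothesis membership probability from Gaussian moving-group and field models in XYZ (pc) and UVW (km/s); the means, covariances and prior are illustrative, not the values derived in the paper.

```python
# Hedged sketch of a BANYAN-like membership calculation: compare a Gaussian
# moving-group model against a broader Gaussian field model in XYZUVW.
# The means, covariances and prior below are illustrative only.
import numpy as np
from scipy.stats import multivariate_normal

def membership_probability(xyzuvw, mg_mean, mg_cov, field_mean, field_cov, prior_mg=0.01):
    """P(member | data) for a single star via Bayes' theorem with two hypotheses."""
    like_mg = multivariate_normal.pdf(xyzuvw, mean=mg_mean, cov=mg_cov)
    like_field = multivariate_normal.pdf(xyzuvw, mean=field_mean, cov=field_cov)
    num = prior_mg * like_mg
    return num / (num + (1.0 - prior_mg) * like_field)

# Illustrative models: a compact moving group vs. a diffuse field population.
mg_mean = np.array([10., -5., -20., -10., -16., -9.])
mg_cov = np.diag([30.0**2] * 3 + [1.5**2] * 3)
field_mean = np.zeros(6)
field_cov = np.diag([80.0**2] * 3 + [25.0**2] * 3)

star = np.array([12., -8., -25., -10.5, -15.8, -8.7])
print(membership_probability(star, mg_mean, mg_cov, field_mean, field_cov))
```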

  11. Environmental assessment of solid waste landfilling technologies by means of LCA-modeling

    DEFF Research Database (Denmark)

    Manfredi, Simone; Christensen, Thomas Højlund

    2009-01-01

    By using life cycle assessment (LCA) modeling, this paper compares the environmental performance of six landfilling technologies (open dump, conventional landfill with flares, conventional landfill with energy recovery, standard bioreactor landfill, flushing bioreactor landfill and semi-aerobic landfill) and assesses the influence of the active operations practiced on these performances. The environmental assessments have been performed by means of the LCA-based tool EASEWASTE, whereby the functional unit utilized for the LCA is “landfilling of 1 ton of wet household waste in a 10 m deep landfill… that it is crucially important to ensure the highest collection efficiency of landfill gas and leachate since a poor capture compromises the overall environmental performance. Once gas and leachate are collected and treated, the potential impacts in the standard environmental categories and on spoiled groundwater...

  12. Global and 3D spatial assessment of neuroinflammation in rodent models of Multiple Sclerosis.

    Directory of Open Access Journals (Sweden)

    Shashank Gupta

    Full Text Available Multiple Sclerosis (MS) is a progressive autoimmune inflammatory and demyelinating disease of the central nervous system (CNS). T cells play a key role in the progression of neuroinflammation in MS and also in the experimental autoimmune encephalomyelitis (EAE) animal models for the disease. A technology for quantitative and 3-dimensional (3D) spatial assessment of inflammation in this and other CNS inflammatory conditions is much needed. Here we present a procedure for 3D spatial assessment and global quantification of the development of neuroinflammation based on Optical Projection Tomography (OPT). Applying this approach to the analysis of rodent models of MS, we provide global quantitative data on the major inflammatory component as a function of the clinical course. Our data demonstrate a strong correlation between the development and progression of neuroinflammation and clinical disease in several mouse models and a rat model of MS, refining the information regarding the spatial dynamics of the inflammatory component in EAE. This method provides a powerful tool to investigate the effect of environmental and genetic forces and to assess the therapeutic effects of drug therapy in animal models of MS and other neuroinflammatory/neurodegenerative disorders.

  13. Report on the 2011 Critical Assessment of Function Annotation (CAFA) meeting

    Energy Technology Data Exchange (ETDEWEB)

    Friedberg, Iddo [Miami Univ., Oxford, OH (United States)

    2015-01-21

    The Critical Assessment of Function Annotation meeting was held July 14-15, 2011 at the Austria Conference Center in Vienna, Austria. There were 73 registered delegates at the meeting. We thank the DOE for this award. It helped us organize and support a scientific meeting, AFP 2011, as a special interest group (SIG) meeting associated with the ISMB 2011 conference. The conference was held in Vienna, Austria, in July 2011. The AFP SIG was held on July 15-16, 2011 (immediately preceding the conference). The meeting consisted of two components, the first being a series of talks (invited and contributed) and discussion sessions dedicated to protein function research, with an emphasis on the theory and practice of computational methods utilized in functional annotation. The second component provided a large-scale assessment of computational methods through participation in the Critical Assessment of Functional Annotation (CAFA). The meeting was exciting and, based on feedback, quite successful. The schedule was only slightly different from the one proposed, due to two cancellations. Dr. Olga Troyanskaya canceled, and we invited Dr. David Jones instead. Similarly, instead of Dr. Richard Roberts, Dr. Simon Kasif gave the closing keynote. The remaining invited speakers were Janet Thornton (EBI) and Amos Bairoch (University of Geneva).

  14. Group-ICA model order highlights patterns of functional brain connectivity

    Directory of Open Access Journals (Sweden)

    Ahmed eAbou Elseoud

    2011-06-01

    Full Text Available Resting-state networks (RSNs) can be reliably and reproducibly detected using independent component analysis (ICA) at both individual subject and group levels. Altering ICA dimensionality (model order) estimation can have a significant impact on the spatial characteristics of the RSNs as well as their parcellation into sub-networks. Recent evidence from several neuroimaging studies suggests that the human brain has a modular hierarchical organization which resembles the hierarchy depicted by different ICA model orders. We hypothesized that functional connectivity between-group differences measured with ICA might be affected by model order selection. We investigated differences in functional connectivity using so-called dual regression as a function of ICA model order in a group of unmedicated seasonal affective disorder (SAD) patients compared to normal healthy controls. The results showed that the detected disease-related differences in functional connectivity alter as a function of ICA model order. The volume of between-group differences altered significantly as a function of ICA model order, reaching a maximum at model order 70 (which seems to be an optimal point that conveys the largest between-group difference) and then stabilizing afterwards. Our results show that fine-grained RSNs enable better detection of detailed disease-related functional connectivity changes. However, high model orders show an increased risk of false positives that needs to be overcome. Our findings suggest that multilevel ICA exploration of functional connectivity enables optimization of sensitivity to brain disorders.
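
    The dual-regression step mentioned above can be sketched in a few lines of numpy (this is a simplified illustration, not the FSL implementation); the array shapes and data are hypothetical.

```python
# Minimal numpy sketch of dual regression (not the FSL implementation):
# stage 1 regresses group ICA spatial maps against each subject's data to get
# subject-specific time courses; stage 2 regresses those time courses to get
# subject-specific spatial maps. Shapes and data below are hypothetical.
import numpy as np

def dual_regression(subject_data, group_maps):
    """subject_data: (timepoints, voxels); group_maps: (components, voxels)."""
    # Stage 1: spatial regression -> time courses (timepoints, components)
    tcs, *_ = np.linalg.lstsq(group_maps.T, subject_data.T, rcond=None)
    tcs = tcs.T
    # Stage 2: temporal regression -> subject spatial maps (components, voxels)
    maps, *_ = np.linalg.lstsq(tcs, subject_data, rcond=None)
    return tcs, maps

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 5000))            # one subject's data (time x voxels)
group_ica_maps = rng.standard_normal((70, 5000))   # e.g., model order 70
tcs, subj_maps = dual_regression(data, group_ica_maps)
print(tcs.shape, subj_maps.shape)
```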

  15. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Full Text Available Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this lacuna, test effort was used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic as, at infinite testing time, test effort will be infinite. Hence in this paper, we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. Here it is possible to obtain many sets of weights for the same model that describe the past failure data equally well. We use a machine learning approach to select the appropriate set of weights, one which describes both the past and the future data well. We compare the performance of the proposed model with existing models using practical software failure data sets. The proposed log-power TEF-based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation compared with existing TEFs, and can be used for software release time determination as well.
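
    The idea of a test-effort-driven NHPP model can be sketched as follows; the specific log-power form of the cumulative test-effort function W(t) used here is an illustrative assumption rather than the exact TEF proposed in the paper.

```python
# Hedged sketch of a test-effort-based NHPP software reliability growth model:
# expected cumulative failures m(t) = a * (1 - exp(-b * W(t))), where W(t) is a
# cumulative test-effort function. The log-power form of W(t) used here is an
# illustrative assumption, not necessarily the exact TEF proposed in the paper.
import math

def log_power_effort(t, alpha=50.0, beta=1.2):
    """Unbounded ("infinite") cumulative test effort: grows without limit as t -> inf."""
    return alpha * math.log(1.0 + t) ** beta

def expected_failures(t, a=120.0, b=0.02, effort=log_power_effort):
    """NHPP mean value function driven by test effort rather than calendar time."""
    return a * (1.0 - math.exp(-b * effort(t)))

for week in (1, 10, 50, 200):
    print(week, round(expected_failures(week), 1))
```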

  16. How TK-TD and population models for aquatic macrophytes could support the risk assessment for plant protection products.

    Science.gov (United States)

    Hommen, Udo; Schmitt, Walter; Heine, Simon; Brock, Theo Cm; Duquesne, Sabine; Manson, Phil; Meregalli, Giovanna; Ochoa-Acuña, Hugo; van Vliet, Peter; Arts, Gertie

    2016-01-01

    This case study of the Society of Environmental Toxicology and Chemistry (SETAC) workshop MODELINK demonstrates the potential use of mechanistic effects models for macrophytes to extrapolate from effects of a plant protection product observed in laboratory tests to effects resulting from dynamic exposure on macrophyte populations in edge-of-field water bodies. A standard European Union (EU) risk assessment for an example herbicide based on macrophyte laboratory tests indicated risks for several exposure scenarios. Three of these scenarios are further analyzed using effect models for 2 aquatic macrophytes, the free-floating standard test species Lemna sp., and the sediment-rooted submerged additional standard test species Myriophyllum spicatum. Both models include a toxicokinetic (TK) part, describing uptake and elimination of the toxicant, a toxicodynamic (TD) part, describing the internal concentration-response function for growth inhibition, and a description of biomass growth as a function of environmental factors to allow simulating seasonal dynamics. The TK-TD models are calibrated and tested using laboratory tests, whereas the growth models were assumed to be fit for purpose based on comparisons of predictions with typical growth patterns observed in the field. For the risk assessment, biomass dynamics are predicted for the control situation and for several exposure levels. Based on specific protection goals for macrophytes, preliminary example decision criteria are suggested for evaluating the model outputs. The models refined the risk indicated by lower tier testing for 2 exposure scenarios, while confirming the risk associated for the third. Uncertainties related to the experimental and the modeling approaches and their application in the risk assessment are discussed. Based on this case study and the assumption that the models prove suitable for risk assessment once fully evaluated, we recommend that 1) ecological scenarios be developed that are also
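
    A generic TK-TD structure of the kind described above (first-order uptake and elimination coupled to growth inhibition) can be sketched as a small ODE system; this is not the calibrated Lemna or Myriophyllum model, and all parameters and the exposure profile are hypothetical.

```python
# Hedged sketch of a generic TK-TD macrophyte model (not the calibrated Lemna or
# Myriophyllum models): first-order uptake/elimination of the toxicant and
# logistic biomass growth inhibited by a log-logistic concentration-response on
# the internal concentration. All parameters and the exposure pulse are hypothetical.
import numpy as np
from scipy.integrate import odeint

def tktd(y, t, k_in, k_out, r_max, K, ec50, slope, c_water):
    c_int, biomass = y
    dc_int = k_in * c_water(t) - k_out * c_int                    # toxicokinetics
    inhibition = 1.0 / (1.0 + (c_int / ec50) ** slope)             # toxicodynamics
    dbiomass = r_max * inhibition * biomass * (1.0 - biomass / K)  # growth
    return [dc_int, dbiomass]

exposure = lambda t: 2.0 if 5.0 <= t <= 15.0 else 0.0              # pulsed exposure (µg/L)
t = np.linspace(0.0, 60.0, 601)                                    # days
sol = odeint(tktd, [0.0, 1.0], t, args=(0.5, 0.2, 0.3, 100.0, 1.0, 2.0, exposure))
print("final biomass:", sol[-1, 1])
```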

  17. Neural modeling of prefrontal executive function

    Energy Technology Data Exchange (ETDEWEB)

    Levine, D.S. [Univ. of Texas, Arlington, TX (United States)

    1996-12-31

    Brain executive function is based in a distributed system whereby the prefrontal cortex is interconnected with other cortical and subcortical loci. Executive function is divided roughly into three interacting parts: affective guidance of responses; linkage among working memory representations; and forming complex behavioral schemata. Neural network models of each of these parts are reviewed and fit into a preliminary theoretical framework.

  18. Evaluating intersectoral collaboration: a model for assessment by service users

    Directory of Open Access Journals (Sweden)

    Bengt Ahgren

    2009-02-01

    Full Text Available Introduction: DELTA was launched as a project in 1997 to improve intersectoral collaboration in the rehabilitation field. In 2005 DELTA was transformed into a local association for financial co-ordination between the institutions involved. Based on a study of the DELTA service users, the purpose of this article is to develop and to validate a model that can be used to assess the integration of welfare services from the perspective of the service users. Theory: The foundation of integration is a well-functioning structure of integration. Without such structural conditions, it is difficult to develop a process of integration that combines the resources and competences of the collaborating organisations to create services advantageous for the service users. In this way, both the structure and the process will contribute to the outcome of integration. Method: The study was carried out as a retrospective cross-sectional survey during two weeks, including all the current service users of DELTA. The questionnaire contained 32 questions, which were derived from the theoretical framework and research on service users, capturing perceptions of integration structure, process and outcome. Ordinal scales and open questions were used for the assessment. Results: The survey had a response rate of 82% and no serious biases of the results were detected. The study shows that the users of the rehabilitation services perceived the services as well integrated, relevant and adapted to their needs. The assessment model was tested for reliability and validity and a few modifications were suggested. Some key measurement themes were derived from the study. Conclusion: The model developed in this study is an important step towards an assessment of service integration from the perspective of the service users. It needs to be further refined, however, before it can be used in other evaluations of collaboration in the provision of integrated welfare services.

  19. A class of non-linear exposure-response models suitable for health impact assessment applicable to large cohort studies of ambient air pollution.

    Science.gov (United States)

    Nasari, Masoud M; Szyszkowicz, Mieczysław; Chen, Hong; Crouse, Daniel; Turner, Michelle C; Jerrett, Michael; Pope, C Arden; Hubbell, Bryan; Fann, Neal; Cohen, Aaron; Gapstur, Susan M; Diver, W Ryan; Stieb, David; Forouzanfar, Mohammad H; Kim, Sun-Young; Olives, Casey; Krewski, Daniel; Burnett, Richard T

    2016-01-01

    The effectiveness of regulatory actions designed to improve air quality is often assessed by predicting changes in public health resulting from their implementation. Risk of premature mortality from long-term exposure to ambient air pollution is the single most important contributor to such assessments and is estimated from observational studies generally assuming a log-linear, no-threshold association between ambient concentrations and death. There has been only limited assessment of this assumption in part because of a lack of methods to estimate the shape of the exposure-response function in very large study populations. In this paper, we propose a new class of variable coefficient risk functions capable of capturing a variety of potentially non-linear associations which are suitable for health impact assessment. We construct the class by defining transformations of concentration as the product of either a linear or log-linear function of concentration multiplied by a logistic weighting function. These risk functions can be estimated using hazard regression survival models with currently available computer software and can accommodate large population-based cohorts which are increasingly being used for this purpose. We illustrate our modeling approach with two large cohort studies of long-term concentrations of ambient air pollution and mortality: the American Cancer Society Cancer Prevention Study II (CPS II) cohort and the Canadian Census Health and Environment Cohort (CanCHEC). We then estimate the number of deaths attributable to changes in fine particulate matter concentrations over the 2000 to 2010 time period in both Canada and the USA using both linear and non-linear hazard function models.
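
    The class of risk functions described above (a linear or log-linear function of concentration multiplied by a logistic weighting function) can be sketched as follows; the parameter values are illustrative only.

```python
# Hedged sketch of the class of risk functions described above: a transformation
# of concentration given by a linear or log-linear term multiplied by a logistic
# weighting function, used as the covariate in a hazard (Cox-type) model.
# Parameter values are illustrative only.
import math

def transformed_concentration(z, mu=10.0, tau=2.0, log_linear=False):
    """f(z) * logistic weight, with f(z) = z or log(1 + z)."""
    f = math.log(1.0 + z) if log_linear else z
    weight = 1.0 / (1.0 + math.exp(-(z - mu) / tau))
    return f * weight

def relative_risk(z, gamma=0.01, **kwargs):
    """Log-linear hazard ratio on the transformed concentration scale."""
    return math.exp(gamma * transformed_concentration(z, **kwargs))

for pm25 in (5.0, 10.0, 20.0, 35.0):
    print(pm25, round(relative_risk(pm25), 3))
```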

  20. Benefits of Simulation and Role-Playing to Teach Performance of Functional Assessments.

    Science.gov (United States)

    Trail Ross, Mary Ellen; Otto, Dorothy A; Stewart Helton, Anne

    The use of simulation is an innovative teaching strategy that has proven to be valuable in nursing education. This article describes the benefits of a simulation lab involving faculty role-play to teach baccalaureate nursing students how to properly assess the functional status of older adults. Details about the simulation lab, which involved functional assessments of two elderly community-dwelling residents, are presented, along with student and faculty evaluations of this teaching modality.

  1. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  2. The Assessment of Patient Clinical Outcome: Advantages, Models, Features of an Ideal Model

    Directory of Open Access Journals (Sweden)

    Mou’ath Hourani

    2016-06-01

    Full Text Available Background: The assessment of patient clinical outcome focuses on measuring various aspects of the health status of a patient who is under healthcare intervention. Patient clinical outcome assessment is a very significant process in the clinical field as it allows health care professionals to better understand the effectiveness of their health care programs and thus to enhance health care quality in general. It is thus vital that a high quality, informative review of current issues regarding the assessment of patient clinical outcome be conducted. Aims & Objectives: 1) summarizes the advantages of the assessment of patient clinical outcome; 2) reviews some of the existing patient clinical outcome assessment models, namely: Simulation, Markov, Bayesian belief networks, Bayesian statistics and Conventional statistics, and Kaplan-Meier analysis models; and 3) demonstrates the desired features that should be fulfilled by a well-established ideal patient clinical outcome assessment model. Material & Methods: An integrative review of the literature has been performed using Google Scholar to explore the field of patient clinical outcome assessment. Conclusion: This paper will directly support researchers, clinicians and health care professionals in their understanding of developments in the domain of the assessment of patient clinical outcome, thus enabling them to propose ideal assessment models.

  4. Quantitative assessment of left ventricular systolic function using 3-dimensional echocardiography

    Directory of Open Access Journals (Sweden)

    Rahul Mehrotra

    2013-09-01

    Full Text Available Assessment of left ventricular systolic function is the commonest and one of the most important indications for performing echocardiography. It is important for prognostication, determination of the treatment plan, decisions related to expensive device therapies and assessment of response to treatment. The current methods based on two-dimensional echocardiography are not reliable, have a high degree of inter-observer and intra-observer variability and are based on presumptions about the geometry of the left ventricle (LV). Real-time three-dimensional echocardiography (RT3DE), on the other hand, is fast, easy, accurate, relatively operator independent and is not based on any assumptions related to the shape of the LV. Owing to these advantages, it is the echocardiographic modality of choice for assessment of systolic function of the LV. We describe here a step-by-step approach to evaluation of LV volumes, ejection fraction, regional systolic function and dyssynchrony analysis based on RT3DE. It has been well validated in clinical studies and is rapidly being incorporated in routine clinical practice.

  5. Neurophysiology and new techniques to assess esophageal sensory function: an update.

    Science.gov (United States)

    Brock, Christina; McCallum, Richard W; Gyawali, C Prakash; Farmer, Adam D; Frøkjaer, Jens Brøndum; McMahon, Barry P; Drewes, Asbjørn Mohr

    2016-09-01

    This review aims to discuss the neurophysiology of the esophagus and new methods to assess esophageal nociception. Pain and other symptoms can be caused by diseases in the mucosa or muscular or sphincter dysfunction, together with abnormal pain processing, either in the peripheral or central nervous systems. Therefore, we present new techniques in the assessment of esophageal function and the potential role of the mucosal barrier in the generation and propagation of pain. We discuss the assessment and role of esophageal sphincters in nociception, as well as imaging and electrophysiological techniques, with examples of their use in understanding the sensory system following noxious stimuli to the esophagus. Additionally, we discuss the mechanisms behind functional diseases of the esophagus. We conclude that the new methods have identified many of the mechanisms behind malfunction of the mucosa, disturbances of muscular and sphincter functions, and the central response to different stimuli. Taken together, this has increased our understanding of esophageal disorders and may lead to new treatment modalities. © 2016 New York Academy of Sciences.

  6. Symmetries and modelling functions for diffusion processes

    International Nuclear Information System (INIS)

    Nikitin, A G; Spichak, S V; Vedula, Yu S; Naumovets, A G

    2009-01-01

    A constructive approach to the theory of diffusion processes is proposed, which is based on application of both symmetry analysis and the method of modelling functions. An algorithm for construction of the modelling functions is suggested. This algorithm is based on the error function expansion (ERFEX) of experimental concentration profiles. The high-accuracy analytical description of the profiles provided by ERFEX approximation allows a convenient extraction of the concentration dependence of diffusivity from experimental data and prediction of the diffusion process. Our analysis is exemplified by its employment in experimental results obtained for surface diffusion of lithium on the molybdenum (1 1 2) surface precovered with dysprosium. The ERFEX approximation can be directly extended to many other diffusion systems.
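
    An ERFEX-style fit can be sketched as a least-squares fit of a short sum of complementary-error-function terms to a measured concentration profile; the exact expansion used by the authors may differ, and the synthetic data below are illustrative.

```python
# Hedged sketch of an error-function expansion (ERFEX-style) fit of a measured
# concentration profile c(x): a short sum of complementary-error-function terms
# with different effective diffusion lengths. The exact expansion used in the
# paper may differ; the synthetic data and basis below are illustrative.
import numpy as np
from scipy.special import erfc
from scipy.optimize import curve_fit

def erfex(x, a1, L1, a2, L2):
    """Two-term expansion: c(x) = a1*erfc(x/L1) + a2*erfc(x/L2)."""
    return a1 * erfc(x / L1) + a2 * erfc(x / L2)

# Synthetic "measured" profile with noise.
x = np.linspace(0.0, 50.0, 200)
c_meas = (0.6 * erfc(x / 8.0) + 0.4 * erfc(x / 20.0)
          + 0.01 * np.random.default_rng(1).standard_normal(x.size))

params, _ = curve_fit(erfex, x, c_meas, p0=[0.5, 5.0, 0.5, 15.0])
print("fitted amplitudes and diffusion lengths:", params.round(2))
```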

  7. Estimating the effects of wetland conservation practices in croplands: Approaches for modeling in CEAP–Cropland Assessment

    Science.gov (United States)

    De Steven, Diane; Mushet, David

    2018-01-01

    Quantifying the current and potential benefits of conservation practices can be a valuable tool for encouraging greater practice adoption on agricultural lands. A goal of the CEAP-Cropland Assessment is to estimate the environmental effects of conservation practices that reduce losses (exports) of soil, nutrients, and pesticides from farmlands to streams and rivers. The assessment approach combines empirical data on reported cropland practices with simulation modeling that compares field-level exports for scenarios “with practices” and “without practices.” Conserved, restored, and created wetlands collectively represent conservation practices that can influence sediment and nutrient exports from croplands. However, modeling the role of wetlands within croplands presents some challenges, including the potential for negative impacts of sediment and nutrient inputs on wetland functions. This Science Note outlines some preliminary solutions for incorporating wetlands and wetland practices into the CEAP-Cropland modeling framework. First, modeling the effects of wetland practices requires identifying wetland hydrogeomorphic type and accounting for the condition of both the wetland and an adjacent upland zone. Second, modeling is facilitated by classifying wetland-related practices into two functional categories (wetland and upland buffer). Third, simulating practice effects requires alternative field configurations to account for hydrological differences among wetland types. These ideas are illustrated for two contrasting wetland types (riparian and depressional).

  8. Terrestrial population models for ecological risk assessment: A state-of-the-art review

    Science.gov (United States)

    Emlen, J.M.

    1989-01-01

    Few attempts have been made to formulate models for predicting impacts of xenobiotic chemicals on wildlife populations. However, considerable effort has been invested in wildlife optimal exploitation models. Because death from intoxication has a similar effect on population dynamics as death by harvesting, these management models are applicable to ecological risk assessment. An underlying Leslie-matrix bookkeeping formulation is widely applicable to vertebrate wildlife populations. Unfortunately, however, the various submodels that track birth, death, and dispersal rates as functions of the physical, chemical, and biotic environment are by their nature almost inevitably highly species- and locale-specific. Short-term prediction of one-time chemical applications requires only information on mortality before and after contamination. In such cases a simple matrix formulation may be adequate for risk assessment. But generally, risk must be projected over periods of a generation or more. This precludes generic protocols for risk assessment and also the ready and inexpensive predictions of a chemical's influence on a given population. When designing and applying models for ecological risk assessment at the population level, the endpoints (output) of concern must be carefully and rigorously defined. The most easily accessible and appropriate endpoints are (1) pseudoextinction (the frequency or probability of a population falling below a prespecified density), and (2) temporal mean population density. Spatial and temporal extent of predicted changes must be clearly specified a priori to avoid apparent contradictions and confusion.
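
    The Leslie-matrix bookkeeping and the two endpoints named above (pseudoextinction frequency and temporal mean density) can be sketched as follows; fecundities, survival rates, the added chemically induced mortality and the threshold are all hypothetical.

```python
# Hedged sketch of the Leslie-matrix bookkeeping described above: project an
# age-structured population with and without an added chemically induced
# mortality, and report the two endpoints mentioned (pseudoextinction frequency
# and temporal mean density). All parameter values are hypothetical.
import numpy as np

def leslie(fecundity, survival, extra_mortality=0.0):
    """Build a Leslie matrix; extra_mortality multiplies survival by (1 - m)."""
    n = len(fecundity)
    L = np.zeros((n, n))
    L[0, :] = fecundity
    for i, s in enumerate(survival):                  # sub-diagonal survival terms
        L[i + 1, i] = s * (1.0 - extra_mortality)
    return L

def project(L, n0, years=50, threshold=50.0):
    sizes, n = [], np.array(n0, dtype=float)
    for _ in range(years):
        n = L @ n
        sizes.append(n.sum())
    sizes = np.array(sizes)
    return sizes.mean(), np.mean(sizes < threshold)   # mean density, pseudoextinction freq.

fec, surv, n0 = [0.0, 1.2, 1.8], [0.5, 0.7], [100, 40, 20]
print("baseline:", project(leslie(fec, surv), n0))
print("with 20% added mortality:", project(leslie(fec, surv, 0.2), n0))
```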

  9. Interactive Rapid Dose Assessment Model (IRDAM): user's guide

    International Nuclear Information System (INIS)

    Poeton, R.W.; Moeller, M.P.; Laughlin, G.J.; Desrosiers, A.E.

    1983-05-01

    As part of the continuing emphasis on emergency preparedness the US Nuclear Regulatory Commission (NRC) sponsored the development of a rapid dose assessment system by Pacific Northwest Laboratory (PNL). This system, the Interactive Rapid Dose Assessment Model (IRDAM) is a micro-computer based program for rapidly assessing the radiological impact of accidents at nuclear power plants. This User's Guide provides instruction in the setup and operation of the equipment necessary to run IRDAM. Instructions are also given on how to load the magnetic disks and access the interactive part of the program. Two other companion volumes to this one provide additional information on IRDAM. Reactor Accident Assessment Methods (NUREG/CR-3012, Volume 2) describes the technical bases for IRDAM including methods, models and assumptions used in calculations. Scenarios for Comparing Dose Assessment Models (NUREG/CR-3012, Volume 3) provides the results of calculations made by IRDAM and other models for specific accident scenarios

  10. Functional linear models for association analysis of quantitative traits.

    Science.gov (United States)

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY
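
    In generic form (our notation, consistent with the description above), the fixed effect functional linear model for a quantitative trait can be written as shown in the sketch below.

```latex
% Generic form of a fixed effect functional linear model for a quantitative
% trait (a sketch consistent with the description above; the notation is ours):
% y_i : trait of individual i, X_i : covariates, G_i(t) : genotype "function"
% over genomic position t in the region [a, b], beta(t) : genetic effect function.
\begin{equation*}
  y_i \;=\; \alpha_0 \;+\; X_i^{\top}\alpha \;+\; \int_a^b G_i(t)\,\beta(t)\,dt \;+\; \varepsilon_i,
  \qquad \varepsilon_i \sim N(0,\sigma^2).
\end{equation*}
% In practice G_i(t) and beta(t) are expanded in basis functions (e.g., B-spline
% or Fourier bases), reducing the integral to a finite-dimensional regression
% whose coefficients can be tested with an F-statistic.
```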

  11. Enabling Cross-Discipline Collaboration Via a Functional Data Model

    Science.gov (United States)

    Lindholm, D. M.; Wilson, A.; Baltzer, T.

    2016-12-01

    Many research disciplines have very specialized data models that are used to express the detailed semantics that are meaningful to that community and easily utilized by their data analysis tools. While invaluable to members of that community, such expressive data structures and metadata are of little value to potential collaborators from other scientific disciplines. Many data interoperability efforts focus on the difficult task of computationally mapping concepts from one domain to another to facilitate discovery and use of data. Although these efforts are important and promising, we have found that a great deal of discovery and dataset understanding still happens at the level of less formal, personal communication. However, a significant barrier to inter-disciplinary data sharing that remains is one of data access. Scientists and data analysts continue to spend inordinate amounts of time simply trying to get data into their analysis tools. Providing data in a standard file format is often not sufficient since data can be structured in many ways. Adhering to more explicit community standards for data structure and metadata does little to help those in other communities. The Functional Data Model specializes the Relational Data Model (used by many database systems) by defining relations as functions between independent (domain) and dependent (codomain) variables. Given that arrays of data in many scientific data formats generally represent functionally related parameters (e.g. temperature as a function of space and time), the Functional Data Model is quite relevant for these datasets as well. The LaTiS software framework implements the Functional Data Model and provides a mechanism to expose an existing data source as a LaTiS dataset. LaTiS datasets can be manipulated using a Functional Algebra and output in any number of formats. LASP has successfully used the Functional Data Model and its implementation in the LaTiS software framework to bridge the gap between

  12. Association of arsenobetaine with beta-cell function assessed by homeostasis model assessment (HOMA) in nondiabetic Koreans: data from the fourth Korea National Health and Nutrition Examination Survey (KNHANES) 2008-2009.

    Science.gov (United States)

    Baek, Kiook; Lee, Namhoon; Chung, Insung

    2017-01-01

    Arsenic is known as an endocrine disruptor that people are exposed to through various sources such as drinking water and ingestion of marine products. Although some epidemiological and animal studies have reported a correlation between arsenic exposure and diabetes development, there are limited studies regarding the toxic effects of organic arsenic, including arsenobetaine, on the human body. Here, we analyzed the association between urine arsenobetaine and the homeostasis model assessment of β-cell function (HOMA-β), which is an index for predicting diabetes development and reflecting the function of pancreatic β-cells. In the fourth Korea National Health and Nutrition Examination Survey (KNHANES), health and nutrition surveys and screening tests were performed. Of the total survey population, people with confirmed values for urine total arsenic and arsenobetaine were included, and known diabetic patients were excluded. A total of 369 participants were finally included in the study. We collected data on health, height, body weight, body mass index, blood mercury level, fasting glucose level, and serum insulin level and calculated the HOMA indices. Owing to sex differences, we performed sexually stratified analyses. Urine total arsenic and total arsenic minus arsenobetaine were not associated with HOMA-IR and HOMA-β in univariate analysis or in sexually stratified analysis. However, urine arsenobetaine showed a statistically significant relationship with HOMA-β in univariate analysis, and only male participants showed a significant correlation in sexually stratified analysis. In the analysis adjusted for age, BMI, smoking, alcohol drinking, physical activity and blood mercury, the HOMA-β value in the group below the 25th percentile of arsenobetaine was significantly higher than in the group between the 50th and 75th percentiles, while no difference was shown for HOMA-IR. In sexually stratified analysis, the value of HOMA-β was significantly higher in male participants
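
    For reference, the homeostasis model assessment indices referred to above are standard functions of fasting glucose (mmol/L) and fasting insulin (µU/mL); the sketch below uses the usual HOMA1 formulas, since the survey's exact computation is not given in the abstract.

```python
# The homeostasis model assessment indices used above are standard functions of
# fasting glucose (mmol/L) and fasting insulin (µU/mL); shown here as the usual
# HOMA1 formulas (the survey's exact computation is not reproduced in the abstract).

def homa_ir(glucose_mmol_l, insulin_uU_ml):
    """Insulin resistance index."""
    return glucose_mmol_l * insulin_uU_ml / 22.5

def homa_beta(glucose_mmol_l, insulin_uU_ml):
    """Beta-cell function index (percent); valid for glucose > 3.5 mmol/L."""
    return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

print(homa_ir(5.2, 8.0), homa_beta(5.2, 8.0))   # e.g., a normoglycemic fasting sample
```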

  13. Model and Analytic Processes for Export License Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.; Wood, Thomas W.; Daly, Don S.; Brothers, Alan J.; Sanfilippo, Antonio P.; Cook, Diane; Holder, Larry

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision-framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs and the impact if successful. Modeling methodologies were divided into whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An

  14. Comparison of in vivo postexercise phosphocreatine recovery and resting ATP synthesis flux for the assessment of skeletal muscle mitochondrial function

    NARCIS (Netherlands)

    Broek, van den N.M.A.; Ciapaite, J.; Nicolay, K.; Prompers, J.J.

    2010-01-01

    31P magnetic resonance spectroscopy (MRS) has been used to assess skeletal muscle mitochondrial function in vivo by measuring 1) phosphocreatine (PCr) recovery after exercise or 2) resting ATP synthesis flux with saturation transfer (ST). In this study, we compared both parameters in a rat model of

  15. Towards aspect-oriented functional--structural plant modelling.

    Science.gov (United States)

    Cieslak, Mikolaj; Seleznyova, Alla N; Prusinkiewicz, Przemyslaw; Hanan, Jim

    2011-10-01

    Functional-structural plant models (FSPMs) are used to integrate knowledge and test hypotheses of plant behaviour, and to aid in the development of decision support systems. A significant amount of effort is being put into providing a sound methodology for building them. Standard techniques, such as procedural or object-oriented programming, are not suited for clearly separating aspects of plant function that criss-cross between different components of plant structure, which makes it difficult to reuse and share their implementations. The aim of this paper is to present an aspect-oriented programming approach that helps to overcome this difficulty. The L-system-based plant modelling language L+C was used to develop an aspect-oriented approach to plant modelling based on multi-modules. Each element of the plant structure was represented by a sequence of L-system modules (rather than a single module), with each module representing an aspect of the element's function. Separate sets of productions were used for modelling each aspect, with context-sensitive rules facilitated by local lists of modules to consider/ignore. Aspect weaving or communication between aspects was made possible through the use of pseudo-L-systems, where the strict-predecessor of a production rule was specified as a multi-module. The new approach was used to integrate previously modelled aspects of carbon dynamics, apical dominance and biomechanics with a model of a developing kiwifruit shoot. These aspects were specified independently and their implementation was based on source code provided by the original authors without major changes. This new aspect-oriented approach to plant modelling is well suited for studying complex phenomena in plant science, because it can be used to integrate separate models of individual aspects of plant development and function, both previously constructed and new, into clearly organized, comprehensive FSPMs. In a future work, this approach could be further

  16. Comparison of left ventricular function assessment between echocardiography and MRI in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Buddhe, Sujatha; Lewin, Mark; Olson, Aaron; Soriano, Brian D. [University of Washington School of Medicine and Seattle Children' s Hospital, Division of Cardiology, Department of Pediatrics, Seattle, WA (United States); Ferguson, Mark [University of Washington School of Medicine and Seattle Children' s Hospital, Department of Radiology, Seattle, WA (United States)

    2016-09-15

    Cardiomyopathy in Duchenne muscular dystrophy (DMD) is associated with death in approximately 40% of patients. Echocardiography is routinely used to assess left ventricular (LV) function; however, it has limitations in these patients. We compared echocardiographic measures of cardiac function assessment to cardiac MRI. We included children and young adults with DMD who had MRI performed between January 2010 and July 2015. We measured echocardiographic and MRI parameters of function assessment, including strain. Presence of late gadolinium enhancement (LGE) was assessed by MRI. Subjects were divided into two groups based on MRI left ventricular ejection fraction (LVEF): group I, LVEF ≥55% and group II, LVEF <55%. We included 41 studies in 33 subjects, with 25 in group I and 16 in group II. Mean age of subjects was 13.6 ± 2.8 years and mean duration between echocardiogram and MRI was 7.6 ± 4.1 months. Only 8 of 16 (50%) patients in group II had diminished function on echocardiogram. Echocardiographic images were suboptimal in 16 subjects (39%). Overall, echocardiographic parameters had weak correlation with MRI-derived ejection fraction percentage. MRI-derived myocardial strain assessment has better correlation with MRI ejection fraction as compared to echocardiography-derived strain parameters. Echocardiography-based ventricular functional assessment has weak correlation with MRI parameters in children and young adults with Duchenne muscular dystrophy. While this correlation improves in the subset of subjects with adequate echocardiographic image quality, it remains modest and potentially suboptimal for clinical management. Accordingly, we conclude that MRI should be performed routinely and early in children with DMD, not only for LGE imaging but also for functional assessment. (orig.)

  17. Comparison of left ventricular function assessment between echocardiography and MRI in Duchenne muscular dystrophy

    International Nuclear Information System (INIS)

    Buddhe, Sujatha; Lewin, Mark; Olson, Aaron; Soriano, Brian D.; Ferguson, Mark

    2016-01-01

    Cardiomyopathy in Duchenne muscular dystrophy (DMD) is associated with death in approximately 40% of patients. Echocardiography is routinely used to assess left ventricular (LV) function; however, it has limitations in these patients. We compared echocardiographic measures of cardiac function assessment to cardiac MRI. We included children and young adults with DMD who had MRI performed between January 2010 and July 2015. We measured echocardiographic and MRI parameters of function assessment, including strain. Presence of late gadolinium enhancement (LGE) was assessed by MRI. Subjects were divided into two groups based on MRI left ventricular ejection fraction (LVEF): group I, LVEF ≥55% and group II, LVEF <55%. We included 41 studies in 33 subjects, with 25 in group I and 16 in group II. Mean age of subjects was 13.6 ± 2.8 years and mean duration between echocardiogram and MRI was 7.6 ± 4.1 months. Only 8 of 16 (50%) patients in group II had diminished function on echocardiogram. Echocardiographic images were suboptimal in 16 subjects (39%). Overall, echocardiographic parameters had weak correlation with MRI-derived ejection fraction percentage. MRI-derived myocardial strain assessment has better correlation with MRI ejection fraction as compared to echocardiography-derived strain parameters. Echocardiography-based ventricular functional assessment has weak correlation with MRI parameters in children and young adults with Duchenne muscular dystrophy. While this correlation improves in the subset of subjects with adequate echocardiographic image quality, it remains modest and potentially suboptimal for clinical management. Accordingly, we conclude that MRI should be performed routinely and early in children with DMD, not only for LGE imaging but also for functional assessment. (orig.)

  18. Functional computed tomography imaging of tumor-induced angiogenesis. Preliminary results of new tracer kinetic modeling using a computer discretization approach

    International Nuclear Information System (INIS)

    Kaneoya, Katsuhiko; Ueda, Takuya; Suito, Hiroshi

    2008-01-01

    The aim of this study was to establish functional computed tomography (CT) imaging as a method for assessing tumor-induced angiogenesis. Functional CT imaging was mathematically analyzed for 14 renal cell carcinomas by means of two-compartment modeling using a computer-discretization approach. The model incorporated diffusible kinetics of contrast medium including leakage from the capillary to the extravascular compartment and back-flux to the capillary compartment. The correlations between functional CT parameters [relative blood volume (rbv), permeability 1 (Pm1), and permeability 2 (Pm2)] and histopathological markers of angiogenesis [microvessel density (MVD) and vascular endothelial growth factor (VEGF)] were statistically analyzed. The modeling was successfully performed, showing similarity between the mathematically simulated curve and the measured time-density curve. There were significant linear correlations between MVD grade and Pm1 (r=0.841, P=0.001) and between VEGF grade and Pm2 (r=0.804, P=0.005) by Pearson's correlation coefficient. This method may be a useful tool for the assessment of tumor-induced angiogenesis. (author)
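
    A two-compartment model with capillary-to-extravascular leakage and back-flux, of the kind described above, can be sketched as a single ODE driven by an input function; the toy input function and rate constants below are illustrative, and this is not the authors' computer-discretization scheme.

```python
# Hedged sketch of a two-compartment tracer model of the kind described:
# contrast leaks from the capillary (plasma) compartment to the extravascular
# compartment at rate k_leak and fluxes back at rate k_back; tissue enhancement
# is a blood-volume-weighted sum. The input function and parameters are
# illustrative, and this is not the authors' discretization scheme.
import numpy as np
from scipy.integrate import odeint

def aif(t):
    """Toy arterial input function: a fast bolus with washout."""
    return 5.0 * t * np.exp(-t / 0.5)

def extravascular(ce, t, k_leak, k_back):
    return k_leak * aif(t) - k_back * ce

t = np.linspace(0.0, 5.0, 501)                        # minutes
ce = odeint(extravascular, 0.0, t, args=(0.4, 0.6)).ravel()
rbv = 0.08                                            # relative blood volume (illustrative)
tissue_curve = rbv * aif(t) + ce                      # simulated time-density curve
print("peak tissue enhancement at t =", t[np.argmax(tissue_curve)], "min")
```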

  19. Virtual Plants Need Water Too: Functional-Structural Root System Models in the Context of Drought Tolerance Breeding.

    Science.gov (United States)

    Ndour, Adama; Vadez, Vincent; Pradal, Christophe; Lucas, Mikaël

    2017-01-01

    Developing a sustainable agricultural model is one of the great challenges of the coming years. The agricultural practices inherited from the Green Revolution of the 1960s show their limits today, and new paradigms need to be explored to counter rising issues such as the multiplication of climate-change related drought episodes. Two such new paradigms are the use of functional-structural plant models to complement and rationalize breeding approaches and a renewed focus on root systems as untapped sources of plant amelioration. Since the late 1980s, numerous functional and structural models of root systems were developed and used to investigate the properties of root systems in soil or lab-conditions. In this review, we focus on the conception and use of such root models in the broader context of research on root-driven drought tolerance, on the basis of root system architecture (RSA) phenotyping. Such models result from the integration of architectural, physiological and environmental data. Here, we consider the different phenotyping techniques allowing for root architectural and physiological study and their limits. We discuss how QTL and breeding studies support the manipulation of RSA as a way to improve drought resistance. We then go over the integration of the generated data within architectural models, how those architectural models can be coupled with functional hydraulic models, and how functional parameters can be measured to feed those models. We then consider the assessment and validation of those hydraulic models through confrontation of simulations to experimentations. Finally, we discuss the up and coming challenges facing root systems functional-structural modeling approaches in the context of breeding.

  20. Virtual Plants Need Water Too: Functional-Structural Root System Models in the Context of Drought Tolerance Breeding

    Directory of Open Access Journals (Sweden)

    Adama Ndour

    2017-09-01

    Full Text Available Developing a sustainable agricultural model is one of the great challenges of the coming years. The agricultural practices inherited from the Green Revolution of the 1960s show their limits today, and new paradigms need to be explored to counter rising issues such as the multiplication of climate-change related drought episodes. Two such new paradigms are the use of functional-structural plant models to complement and rationalize breeding approaches and a renewed focus on root systems as untapped sources of plant amelioration. Since the late 1980s, numerous functional and structural models of root systems were developed and used to investigate the properties of root systems in soil or lab-conditions. In this review, we focus on the conception and use of such root models in the broader context of research on root-driven drought tolerance, on the basis of root system architecture (RSA phenotyping. Such models result from the integration of architectural, physiological and environmental data. Here, we consider the different phenotyping techniques allowing for root architectural and physiological study and their limits. We discuss how QTL and breeding studies support the manipulation of RSA as a way to improve drought resistance. We then go over the integration of the generated data within architectural models, how those architectural models can be coupled with functional hydraulic models, and how functional parameters can be measured to feed those models. We then consider the assessment and validation of those hydraulic models through confrontation of simulations to experimentations. Finally, we discuss the up and coming challenges facing root systems functional-structural modeling approaches in the context of breeding.

  1. Model validation and calibration based on component functions of model output

    International Nuclear Information System (INIS)

    Wu, Danqing; Lu, Zhenzhou; Wang, Yanping; Cheng, Lei

    2015-01-01

    The aim of this work is to validate the component functions of model output against physical observations, using the area metric to compare the computational model with the observations. Based on the theory of high dimensional model representations (HDMR) of independent input variables, conditional expectations are component functions of model output, and these conditional expectations reflect partial information of the model output. Therefore, the model validation of conditional expectations quantifies the discrepancy between the partial information of the computational model output and that of the observations. A calibration of the conditional expectations is then carried out to reduce the value of the model validation metric. After that, the model validation metric of the model output is recalculated with the calibrated model parameters, and the result shows that a reduction of the discrepancy in the conditional expectations can help decrease the difference in model output. Finally, several examples are employed to demonstrate the rationality and necessity of the methodology in the case of both a single validation site and multiple validation sites. - Highlights: • A validation metric of conditional expectations of model output is proposed. • HDMR explains the relationship between conditional expectations and model output. • An improved approach to parameter calibration updates the computational models. • The validation and calibration process is applied at single and multiple sites. • The validation and calibration process shows superiority over existing methods
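
    The two ingredients discussed above can be sketched numerically: estimating a component function E[Y|X1] by binning samples on X1, and comparing model and observation with the area metric (the area between empirical CDFs); the data and bin choices are illustrative.

```python
# Hedged sketch of the two ingredients discussed above: (1) estimating a
# component function E[Y | X1] of model output by binning samples on X1, and
# (2) comparing model and observation with the area metric, i.e. the area
# between their empirical CDFs. Data and bin choices are illustrative.
import numpy as np

def conditional_expectation(x1, y, bins=10):
    edges = np.linspace(x1.min(), x1.max(), bins + 1)
    idx = np.clip(np.digitize(x1, edges) - 1, 0, bins - 1)
    return np.array([y[idx == b].mean() for b in range(bins)])

def area_metric(sample_a, sample_b, grid_size=500):
    grid = np.linspace(min(sample_a.min(), sample_b.min()),
                       max(sample_a.max(), sample_b.max()), grid_size)
    cdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / s.size
    diffs = np.abs(cdf(sample_a) - cdf(sample_b))
    return np.sum(0.5 * (diffs[:-1] + diffs[1:]) * np.diff(grid))   # trapezoidal area

rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 1.0, 2000)
y_model = np.sin(2 * np.pi * x1) + 0.1 * rng.standard_normal(x1.size)
y_obs = np.sin(2 * np.pi * x1) + 0.15 + 0.1 * rng.standard_normal(x1.size)  # biased "observations"
print("area metric on E[Y|X1]:",
      round(area_metric(conditional_expectation(x1, y_model),
                        conditional_expectation(x1, y_obs)), 3))
```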

  2. Using game authoring platforms to develop screen-based simulated functional assessments in persons with executive dysfunction following traumatic brain injury.

    Science.gov (United States)

    Martínez-Pernía, David; Núñez-Huasaf, Javier; Del Blanco, Ángel; Ruiz-Tagle, Amparo; Velásquez, Juan; Gomez, Mariela; Robert Blesius, Carl; Ibañez, Agustin; Fernández-Manjón, Baltasar; Slachevsky, Andrea

    2017-10-01

    The assessment of functional status is a critical component of clinical neuropsychological evaluations used for both diagnostic and therapeutic purposes in patients with cognitive brain disorders. There are, however, no widely adopted neuropsychological tests that are both ecologically valid and easily administered in daily clinical practice. This discrepancy is a roadblock to the widespread adoption of functional assessments. In this paper, we propose a novel approach using a serious game authoring platform (eAdventure) for creating screen-based simulated functional assessments. We created a naturalistic functional task that consisted of preparing a cup of tea (SBS-COT) and applied the assessment in a convenience sample of eight dyads of therapists/patients with mild executive dysfunction after traumatic brain injury. We had three main aims. First, we performed a comprehensive review of executive function assessment in activities of daily living. Second, we were interested in measuring the feasibility of this technology with respect to staffing, economic and technical requirements. Third, a serious game was administered to patients to study the feasibility of this technology in the clinical context (pre-screening test). In addition, quantitative (Technology Acceptance Model (TAM) questionnaires) and qualitative (semistructured interviews) evaluations were applied to obtain user input. Our results suggest that the staffing, economic and technical requirements of the SBS-COT are feasible. The outcomes of the pre-screening test provide evidence that this technology is useful in the functional assessment of patients with executive dysfunction. In relation to subjective data, the TAM questionnaire showed good user acceptability from a professional perspective. Interview analyses with professionals and patients showed positive experiences related to the use of the SBS-COT. Our work indicates that the use of these types of authoring platforms could have positive long

  3. Functional assessment of patients after total knee replacement

    Directory of Open Access Journals (Sweden)

    Matla Joanna

    2017-06-01

    Full Text Available Introduction: In the society of the 21st century, osteoarthritis is considered one of the primary causes of the occurrence of pain and disability. Arthroplasty is the treatment of choice for advanced degenerative changes. The aim of the study was to carry out a functional assessment of patients at early stages of rehabilitation after total knee replacement.

  4. Cost functions of greenhouse models

    International Nuclear Information System (INIS)

    Linderoth, H.

    2000-01-01

    The benchmark is equal to the cost (D) caused by an increase in temperature since the middle of the nineteenth century (T) of nearly 2.5 deg. C. According to mainstream economists, the benchmark is 1-2% of GDP, but very different estimates can also be found. Even though there appears to be agreement among a number of economists that the benchmark is 1-2% of GDP, major differences exist when it comes to estimating D for different sectors. One of the main problems is how to estimate non-market activities. Normally, the benchmark is the best guess, but due to the possibility of catastrophic events it can be considerably smaller than the mean. Certainly, the cost function is skewed to the right. The benchmark is just one point on the cost curve. To a great extent, cost functions are alike in greenhouse models (D = α·T^λ). Cost functions are region and sector dependent in several models. In any case, both α (benchmark) and λ are rough estimates. Besides being dependent on α and λ, the marginal emission cost depends on the discount rate. In fact, because emissions have effects continuing for many years, the discount rate is clearly the most important parameter. (au)
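    As a purely numerical illustration of the power-law cost function D = α·T^λ mentioned above (parameter values are placeholders, not estimates from any particular greenhouse model), the short Python sketch below calibrates α to a 1.5%-of-GDP benchmark at 2.5 deg. C and shows how strongly the present value of a long damage stream depends on the discount rate.

        import numpy as np

        def damage(T, alpha, lam=2.0):
            # Damage D as a fraction of GDP for a temperature rise T: D = alpha * T**lam
            return alpha * T ** lam

        # Calibrate alpha so that the benchmark D(2.5 deg C) is ~1.5% of GDP (illustrative).
        benchmark, T_bench, lam = 0.015, 2.5, 2.0
        alpha = benchmark / T_bench ** lam
        print(f"alpha = {alpha:.4f}, D(2.5) = {damage(2.5, alpha, lam):.2%}, D(4.0) = {damage(4.0, alpha, lam):.2%}")

        # Discounting dominates the marginal emission cost because damages run far into the
        # future: compare present-value factors of a constant annual damage stream.
        years = np.arange(1, 201)
        for r in (0.01, 0.05):
            print(f"discount rate {r:.0%}: 200-yr present-value factor = {(1.0 / (1.0 + r) ** years).sum():.1f}")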

  5. Using Lambert W function and error function to model phase change on microfluidics

    Science.gov (United States)

    Bermudez Garcia, Anderson

    2014-05-01

    Solidification and melting on microfluidics are modeled and solved using the Lambert W function and error functions. The models are formulated from the heat diffusion equation. The generic case posed is the melting of a slab with time-dependent surface temperature, with a micro- or nano-fluid liquid phase. Initially the solid slab is at the melting temperature. One face of the slab is then held at a temperature greater than the melting point that varies in time. The Lambert W function and error function are applied via Maple to obtain the analytic evolution of the microfluidic solid-liquid interface front, and the corresponding melting time of the slab is determined. The analytical results are expected to be useful for food engineering, cooking engineering, pharmaceutical engineering, nano-engineering and bio-medical engineering.
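    The paper's closed forms are not reproduced here; as a related minimal sketch (our simplification, not the paper's time-dependent formulation), the classical one-phase Stefan problem with a constant surface temperature already shows how the melt front follows X(t) = 2λ√(κt), with λ fixed by a transcendental equation in the error function that SciPy solves in a few lines. All material properties below are illustrative water-like values.

        import numpy as np
        from scipy.optimize import brentq
        from scipy.special import erf

        def stefan_lambda(stefan_number):
            # Solve lam * exp(lam**2) * erf(lam) = St / sqrt(pi) for the front constant lam.
            f = lambda lam: lam * np.exp(lam ** 2) * erf(lam) - stefan_number / np.sqrt(np.pi)
            return brentq(f, 1e-9, 5.0)

        cp, L, Ts, Tm, kappa = 4186.0, 3.34e5, 10.0, 0.0, 1.4e-7   # illustrative properties
        St = cp * (Ts - Tm) / L                    # Stefan number
        lam = stefan_lambda(St)
        t = 60.0                                   # seconds
        X = 2.0 * lam * np.sqrt(kappa * t)         # melt-front position X(t) = 2*lam*sqrt(kappa*t)
        print(f"Stefan number {St:.3f}, lambda {lam:.3f}, front after 60 s: {X * 1e3:.2f} mm")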

  6. Assessing Adaptive Functioning in Death Penalty Cases after Hall and DSM-5.

    Science.gov (United States)

    Hagan, Leigh D; Drogin, Eric Y; Guilmette, Thomas J

    2016-03-01

    DSM-5 and Hall v. Florida (2014) have dramatically refocused attention on the assessment of adaptive functioning in death penalty cases. In this article, we address strategies for assessing the adaptive functioning of defendants who seek exemption from capital punishment pursuant to Atkins v. Virginia (2002). In particular, we assert that evaluations of adaptive functioning should address assets as well as deficits; seek to identify credible and reliable evidence concerning the developmental period and across the lifespan; distinguish incapacity from the mere absence of adaptive behavior; adhere faithfully to test manual instructions for using standardized measures of adaptive functioning; and account for potential bias on the part of informants. We conclude with brief caveats regarding the standard error of measurement (SEM) in light of Hall, with reference to examples of ordinary life activities that directly illuminate adaptive functioning relevant to capital cases. © 2016 American Academy of Psychiatry and the Law.

  7. The assessment of the function of the nervus intermedius by means of functional salivary gland-scintigraphy

    International Nuclear Information System (INIS)

    Thomas, J.P.; Bertram, G.; Moedder, G.

    1982-01-01

    Using functional scintigraphy of the salivary glands, the function of the nervus intermedius can be assessed by estimating excretory quotients for both submandibular glands. This method is preferred to the standard salivation test method of Magielski and Blatt, despite the minimal radiation exposure of the patient from the injected sodium pertechnetate. The technical course of this investigation, along with the indications for its use, will be presented. (orig.) [de

  8. Self-Report Assessment of Executive Functioning in College Students with Disabilities

    Science.gov (United States)

    Grieve, Adam; Webne-Behrman, Lisa; Couillou, Ryan; Sieben-Schneider, Jill

    2014-01-01

    This study presents a unique assessment of executive functioning (EF) among postsecondary students with disabilities, with the aim of understanding the extent to which students with different disabilities and in different age groups assess their own difficulties with relevant and educationally-adaptive skills such as planning, initiating, managing…

  9. A novel approach to the assessment of vascular endothelial function

    International Nuclear Information System (INIS)

    Sathasivam, S; Siddiqui, Z; Greenwald, S; Phababpha, S; Sengmeuan, P; Detchaporn, P; Kukongviriyapan, U

    2011-01-01

    Impaired endothelial function (EF) is associated with atherogenesis, and its quantitative assessment has prognostic value. Currently, methods based on assessing flow-mediated dilation (FMD) are technically difficult and expensive. We tested a novel way of assessing EF by measuring the time difference between pulses arriving at the middle fingers of each hand (f-fΔT), whilst FMD is induced in one arm. We compared f-fΔT with standard methods in healthy and diseased subjects. Our findings suggest that the proposed simple and inexpensive technique gives comparable results and has the potential to qualitatively assess EF in the clinical setting, although further work is required.

  10. Mathematical Models of Cardiac Pacemaking Function

    Science.gov (United States)

    Li, Pan; Lines, Glenn T.; Maleckar, Mary M.; Tveito, Aslak

    2013-10-01

    Over the past half century, there has been intense and fruitful interaction between experimental and computational investigations of cardiac function. This interaction has, for example, led to deep understanding of cardiac excitation-contraction coupling; how it works, as well as how it fails. However, many lines of inquiry remain unresolved, among them the initiation of each heartbeat. The sinoatrial node, a cluster of specialized pacemaking cells in the right atrium of the heart, spontaneously generates an electro-chemical wave that spreads through the atria and through the cardiac conduction system to the ventricles, initiating the contraction of cardiac muscle essential for pumping blood to the body. Despite the fundamental importance of this primary pacemaker, this process is still not fully understood, and ionic mechanisms underlying cardiac pacemaking function are currently under heated debate. Several mathematical models of sinoatrial node cell membrane electrophysiology have been constructed as based on different experimental data sets and hypotheses. As could be expected, these differing models offer diverse predictions about cardiac pacemaking activities. This paper aims to present the current state of debate over the origins of the pacemaking function of the sinoatrial node. Here, we will specifically review the state-of-the-art of cardiac pacemaker modeling, with a special emphasis on current discrepancies, limitations, and future challenges.

  11. Use of radionuclide techniques for assessment of splenic function and detection of splenic remnants

    International Nuclear Information System (INIS)

    Ganguly, S.; Sinha, S.; Sarkar, B.R.; Basu, S.; Ghosh, S.

    1998-01-01

    Full text: The spleen is often involved in hematological malignancies; it is also the site of RBC destruction in thalassemia and ITP. In the latter cases, splenectomy is often performed and, postoperatively, the presence of functioning splenic remnants affects the prognosis adversely. In this study, we assessed the usefulness of radionuclide techniques in: a) assessment of splenic function in primarily non-splenic diseases (benign or malignant), and b) detection of splenic remnants after splenectomy. 12 patients with splenomegaly and 5 patients after splenectomy underwent splenic imaging; imaging was performed using both 99mTc-sulphur colloid (with first pass) and 99mTc-labelled heat-denatured RBCs as tracers. Thus splenic perfusion, morphology and RBC-trapping functions were all assessed. The colloid images usually matched the RBC images except in 2 cases where photopenic areas (presumably infarcts) were visualized on RBC scans but missed on colloid scans. Three of the post-splenectomy cases revealed functioning splenic remnants, which were also better visualized on RBC scans. It is concluded that radionuclide imaging could be used regularly for assessing splenic function or detecting splenic remnants.

  12. Contribution to a quantitative assessment model for reliability-based metrics of electronic and programmable safety-related functions; Contribution a un modele d'evaluation quantitative des performances fiabilistes de fonctions electroniques et programmables dediees a la securite

    Energy Technology Data Exchange (ETDEWEB)

    Hamidi, K

    2005-10-15

    The use of fault-tolerant EP architectures has induced growing constraints, whose influence on reliability-based performance metrics is no longer negligible. To address the growing influence of simultaneous failures, this thesis proposes, for safety-related functions, a new assessment method for reliability based on a better accounting of temporal aspects. This report introduces the concept of information and uses it to interpret the failure modes of a safety-related function as the direct result of the initiation and propagation of erroneous information down to the actuator level. The main idea is to distinguish the appearance and disappearance of erroneous states, which can be regarded as intrinsically dependent on hardware characteristics and maintenance policies, from their possible activation, constrained by architectural choices and leading to the failure of the safety-related function. This approach is based, at a low level, on deterministic discrete-event system (SED) models of the architecture and uses non-homogeneous Markov chains to describe the time evolution of the error probabilities. (author)
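    The thesis itself is not available here; as a minimal sketch of the kind of non-homogeneous Markov model mentioned (our simplification: two states, a time-varying error-appearance rate and a constant detection/repair rate, with purely illustrative values), the Kolmogorov forward equations dP/dt = P(t)·Q(t) can be integrated directly.

        import numpy as np
        from scipy.integrate import solve_ivp

        def lam(t):
            # Illustrative, time-increasing rate of appearance of erroneous states (per hour).
            return 1e-4 * (1.0 + 0.1 * t)

        mu = 0.05  # illustrative detection/repair rate (per hour)

        def kolmogorov(t, P):
            # State 0 = correct information, state 1 = erroneous state present.
            Q = np.array([[-lam(t), lam(t)],
                          [mu, -mu]])
            return P @ Q   # forward equations for the row vector of state probabilities

        sol = solve_ivp(kolmogorov, (0.0, 1000.0), [1.0, 0.0])
        print(f"P(erroneous state at t = 1000 h) ~ {sol.y[1, -1]:.4f}")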

  13. The Schroedinger functional for Gross-Neveu models

    International Nuclear Information System (INIS)

    Leder, B.

    2007-01-01

    Gross-Neveu type models with a finite number of fermion flavours are studied on a two-dimensional Euclidean space-time lattice. The models are asymptotically free and are invariant under a chiral symmetry. These similarities to QCD make them perfect benchmark systems for fermion actions used in large scale lattice QCD computations. The Schroedinger functional for the Gross-Neveu models is defined for both Wilson and Ginsparg-Wilson fermions, and shown to be renormalisable in 1-loop lattice perturbation theory. In two dimensions the four fermion interactions of the Gross-Neveu models have dimensionless coupling constants. The symmetry properties of the four fermion interaction terms and the relations among them are discussed. For Wilson fermions chiral symmetry is explicitly broken and additional terms must be included in the action. Chiral symmetry is restored up to cut-off effects by tuning the bare mass and one of the couplings. The critical mass and the symmetry-restoring coupling are computed to second order in lattice perturbation theory. This result is used in the 1-loop computation of the renormalised couplings and the associated beta-functions. The renormalised couplings are defined in terms of suitable boundary-to-boundary correlation functions. In the computation the known first order coefficients of the beta-functions are reproduced. One of the couplings is found to have a vanishing beta-function. The calculation is repeated for the recently proposed Schroedinger functional with exact chiral symmetry, i.e. Ginsparg-Wilson fermions. The renormalisation pattern is found to be the same as in the Wilson case. Using the regularisation-dependent finite part of the renormalised couplings, the ratio of the Lambda-parameters is computed. (orig.)

  14. ERUPTION TO DOSE: COUPLING A TEPHRA DISPERSAL MODEL WITHIN A PERFORMANCE ASSESSMENT FRAMEWORK

    International Nuclear Information System (INIS)

    G. N. Keating, J. Pelletier

    2005-01-01

    The tephra dispersal model used by the Yucca Mountain Project (YMP) to evaluate the potential consequences of a volcanic eruption through the waste repository must incorporate simplifications in order to function within a large Monte-Carlo style performance assessment framework. That is, the explicit physics of the conduit, vent, and eruption column processes are abstracted to a 2-D, steady-state advection-dispersion model (ASHPLUME) that can be run quickly over thousands of realizations of the overall system model. Given the continuous development of tephra dispersal modeling techniques in the last few years, we evaluated the adequacy of this simplified model for its intended purpose within the YMP total system performance assessment (TSPA) model. We evaluated uncertainties inherent in model simplifications including (1) instantaneous, steady-state vs. unsteady eruption, which affects column height, (2) constant wind conditions, and (3) power-law distribution of the tephra blanket; comparisons were made to other models and published ash distributions. Spatial statistics are useful for evaluating differences between this model's output and results obtained with more complex wind, column height, and tephra deposition patterns. However, in order to assess the adequacy of the model for its intended use in TSPA, we evaluated the propagation of these uncertainties through FAR, the YMP ash redistribution model, which utilizes ASHPLUME tephra deposition results to calculate the concentration of nuclear waste-contaminated tephra at a dose-receptor population as a result of sedimentary transport and mixing processes on the landscape. Questions we sought to answer include: (1) What conditions of unsteadiness, wind variability, or departure from the simplified tephra distribution result in significant effects on waste concentration (related to the dose calculated for the receptor population)? (2) What criteria can be established for the adequacy of a tephra dispersal model within the TSPA

  15. The SOS model partition function and the elliptic weight functions

    International Nuclear Information System (INIS)

    Pakuliak, S; Silantyev, A; Rubtsov, V

    2008-01-01

    We generalized a recent observation (Khoroshkin and Pakuliak 2005 Theor. Math. Phys. 145 1373) that the partition function of the six-vertex model with domain wall boundary conditions can be obtained from a calculation of projections of the product of total currents in the quantum affine algebra U_q(ŝl_2) in its current realization. A generalization is done for the elliptic current algebra (Enriquez and Felder 1998 Commun. Math. Phys. 195 651, Enriquez and Rubtsov 1997 Ann. Sci. Ecole Norm. Sup. 30 821). The projections of the product of total currents in this case are calculated explicitly and are presented as integral transforms of a product of the total currents. It is proved that the integral kernel of this transform is proportional to the partition function of the SOS model with domain wall boundary conditions.

  16. Integrated assessment models of global climate change

    International Nuclear Information System (INIS)

    Parson, E.A.; Fisher-Vanden, K.

    1997-01-01

    The authors review recent work in the integrated assessment modeling of global climate change. This field has grown rapidly since 1990. Integrated assessment models seek to combine knowledge from multiple disciplines in formal integrated representations; inform policy-making, structure knowledge, and prioritize key uncertainties; and advance knowledge of broad system linkages and feedbacks, particularly between socio-economic and bio-physical processes. They may combine simplified representations of the socio-economic determinants of greenhouse gas emissions, the atmosphere and oceans, impacts on human activities and ecosystems, and potential policies and responses. The authors summarize current projects, grouping them according to whether they emphasize the dynamics of emissions control and optimal policy-making, uncertainty, or spatial detail. They review the few significant insights that have been claimed from work to date and identify important challenges for integrated assessment modeling in its relationships to disciplinary knowledge and to broader assessment seeking to inform policy- and decision-making. 192 refs., 2 figs

  17. An Assessment of Mean Areal Precipitation Methods on Simulated Stream Flow: A SWAT Model Performance Assessment

    Directory of Open Access Journals (Sweden)

    Sean Zeiger

    2017-06-01

    Full Text Available Accurate mean areal precipitation (MAP) estimates are essential input forcings for hydrologic models. However, the selection of the most accurate method to estimate MAP can be daunting because there are numerous methods to choose from (e.g., proximate gauge, direct weighted average, surface-fitting, and remotely sensed methods). Multiple methods (n = 19) were used to estimate MAP with precipitation data from 11 distributed monitoring sites and 4 remotely sensed data sets. Each method was validated against the hydrologic model simulated stream flow using the Soil and Water Assessment Tool (SWAT). SWAT was validated using a split-site method and the observed stream flow data from five nested-scale gauging sites in a mixed-land-use watershed of the central USA. Cross-validation results showed the error associated with surface-fitting and remotely sensed methods ranging from −4.5 to −5.1% and −9.8 to −14.7%, respectively. Split-site validation results showed percent bias (PBIAS) values that ranged from −4.5 to −160%. Second-order polynomial functions especially overestimated precipitation and subsequent stream flow simulations (PBIAS = −160% in the headwaters). The results indicated that using an inverse-distance weighted, linear polynomial interpolation or multiquadric function method to estimate MAP may improve SWAT model simulations. Collectively, the results highlight the importance of spatially distributed observed hydroclimate data for precipitation and subsequent stream flow estimations. The MAP methods demonstrated in the current work can be used to reduce hydrologic model uncertainty caused by watershed physiographic differences.
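    To make the inverse-distance-weighted MAP idea concrete, here is a minimal NumPy sketch with made-up gauge locations and depths (none of the study's data); the watershed is idealised as the unit square sampled on a coarse grid.

        import numpy as np

        def idw_map(gauge_xy, gauge_precip, grid_xy, power=2.0):
            # Inverse-distance-weighted precipitation at each grid point; the mean areal
            # precipitation (MAP) is the average of these gridded estimates.
            d = np.linalg.norm(grid_xy[:, None, :] - gauge_xy[None, :, :], axis=2)
            d = np.maximum(d, 1e-9)                       # avoid division by zero on a gauge
            w = 1.0 / d ** power
            w /= w.sum(axis=1, keepdims=True)
            return w @ gauge_precip

        gauges = np.array([[0.2, 0.3], [0.7, 0.8], [0.5, 0.1]])   # illustrative coordinates
        precip = np.array([12.0, 20.0, 8.0])                      # illustrative depths (mm)
        gx, gy = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        print(f"mean areal precipitation ~ {idw_map(gauges, precip, grid).mean():.1f} mm")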

  18. Ab initio derivation of model energy density functionals

    International Nuclear Information System (INIS)

    Dobaczewski, Jacek

    2016-01-01

    I propose a simple and manageable method that allows for deriving coupling constants of model energy density functionals (EDFs) directly from ab initio calculations performed for finite fermion systems. A proof-of-principle application allows for linking properties of finite nuclei, determined by using the nuclear nonlocal Gogny functional, to the coupling constants of the quasilocal Skyrme functional. The method does not rely on properties of infinite fermion systems but on the ab initio calculations in finite systems. It also allows for quantifying merits of different model EDFs in describing the ab initio results. (letter)

  19. Uncertain Quality Function Deployment Using a Hybrid Group Decision Making Model

    Directory of Open Access Journals (Sweden)

    Ze-Ling Wang

    2016-11-01

    Full Text Available Quality function deployment (QFD) is a widely used quality system tool for translating customer requirements (CRs) into the engineering design requirements (DRs) of products or services. The conventional QFD analysis, however, has been criticized as having some limitations such as in the assessment of relationships between CRs and DRs, the determination of CR weights and the prioritization of DRs. This paper aims to develop a new hybrid group decision-making model based on hesitant 2-tuple linguistic term sets and an extended QUALIFLEX (qualitative flexible multiple criteria method) approach for handling QFD problems with incomplete weight information. First, hesitant linguistic term sets are combined with interval 2-tuple linguistic variables to express various uncertainties in the assessment information of QFD team members. Borrowing the idea of grey relational analysis (GRA), a multiple objective optimization model is constructed to determine the relative weights of CRs. Then, an extended QUALIFLEX approach with an inclusion comparison method is suggested to determine the ranking of the DRs identified in QFD. Finally, an analysis of a market segment selection problem is conducted to demonstrate and validate the proposed QFD approach.

  20. Establishing verbal repertoires in children with autism using function-based video modeling.

    Science.gov (United States)

    Plavnick, Joshua B; Ferreri, Summer J

    2011-01-01

    Previous research suggests that language-training procedures for children with autism might be enhanced following an assessment of conditions that evoke emerging verbal behavior. The present investigation examined a methodology to teach recognizable mands based on environmental variables known to evoke participants' idiosyncratic communicative responses in the natural environment. An alternating treatments design was used during Experiment 1 to identify the variables that were functionally related to gestures emitted by 4 children with autism. Results showed that gestures functioned as requests for attention for 1 participant and as requests for assistance to obtain a preferred item or event for 3 participants. Video modeling was used during Experiment 2 to compare mand acquisition when video sequences were either related or unrelated to the results of the functional analysis. An alternating treatments within multiple probe design showed that participants repeatedly acquired mands during the function-based condition but not during the nonfunction-based condition. In addition, generalization of the response was observed during the former but not the latter condition.

  1. BioModels: Content, Features, Functionality, and Use

    Science.gov (United States)

    Juty, N; Ali, R; Glont, M; Keating, S; Rodriguez, N; Swat, MJ; Wimalaratne, SM; Hermjakob, H; Le Novère, N; Laibe, C; Chelliah, V

    2015-01-01

    BioModels is a reference repository hosting mathematical models that describe the dynamic interactions of biological components at various scales. The resource provides access to over 1,200 models described in literature and over 140,000 models automatically generated from pathway resources. Most model components are cross-linked with external resources to facilitate interoperability. A large proportion of models are manually curated to ensure reproducibility of simulation results. This tutorial presents BioModels' content, features, functionality, and usage. PMID:26225232

  2. Model-driven dependability assessment of software systems

    CERN Document Server

    Bernardi, Simona; Petriu, Dorina C

    2013-01-01

    In this book, the authors present cutting-edge model-driven techniques for modeling and analysis of software dependability. Most of them are based on the use of UML as software specification language. From the software system specification point of view, such techniques exploit the standard extension mechanisms of UML (i.e., UML profiling). UML profiles enable software engineers to add non-functional properties to the software model, in addition to the functional ones. The authors detail the state of the art on UML profile proposals for dependability specification and rigorously describe the t

  3. Plutonium assessment modeling: government policy, non-proliferation, and the government fence

    International Nuclear Information System (INIS)

    Kurstedt, H.A. Jr.; Nachlas, J.A.

    1977-01-01

    Assessment modeling for the evaluation of plutonium as an energy resource is stressed, and generic mathematical model forms are outlined. Representative necessary objective functions are developed. Constraints and assumptions are listed. An example involving present-day light water reactor technology is demonstrated. Technical, environmental, and political implications are drawn. Specific new directions for analysis are suggested. The position of the boundary of government control and responsibility--the government exclusion fence--is shown to be a critical, but overlooked, constraint. Existing governmental uranium stockpiles may be an unmentioned, though important, constraint. Plutonium is the most abundant proven energy equivalent and most controversial energy resource. Plutonium results from an intermediate nuclear reactor processing stage starting with the raw material 238U. Therefore, the plutonium resource differs from the 238U resource only through minimal conversion losses and through the political and/or social will to perform the conversion. The relative abundance of 238U, and therefore of plutonium, is high. There is a great need to assess plutonium in relation to the potential available energy for a society in short supply

  4. Deep inelastic structure functions in the chiral bag model

    International Nuclear Information System (INIS)

    Sanjose, V.; Vento, V.; Centro Mixto CSIC/Valencia Univ., Valencia

    1989-01-01

    We calculate the structure functions for deep inelastic scattering on baryons in the cavity approximation to the chiral bag model. The behavior of these structure functions is analyzed in the Bjorken limit. We conclude that scaling is satisfied, but not Regge behavior. A trivial extension as a parton model can be achieved by introducing the structure function for the pion in a convolution picture. In this extended version of the model not only scaling but also Regge behavior is satisfied. Conclusions are drawn from the comparison of our results with experimental data. (orig.)

  5. Deep inelastic structure functions in the chiral bag model

    Energy Technology Data Exchange (ETDEWEB)

    Sanjose, V. (Valencia Univ. (Spain). Dept. de Didactica de las Ciencias Experimentales); Vento, V. (Valencia Univ. (Spain). Dept. de Fisica Teorica; Centro Mixto CSIC/Valencia Univ., Valencia (Spain). Inst. de Fisica Corpuscular)

    1989-10-02

    We calculate the structure functions for deep inelastic scattering on baryons in the cavity approximation to the chiral bag model. The behavior of these structure functions is analyzed in the Bjorken limit. We conclude that scaling is satisfied, but not Regge behavior. A trivial extension as a parton model can be achieved by introducing the structure function for the pion in a convolution picture. In this extended version of the model not only scaling but also Regge behavior is satisfied. Conclusions are drawn from the comparison of our results with experimental data. (orig.).

  6. Diet models with linear goal programming: impact of achievement functions.

    Science.gov (United States)

    Gerdessen, J C; de Vries, J H M

    2015-11-01

    Diet models based on goal programming (GP) are valuable tools in designing diets that comply with nutritional, palatability and cost constraints. Results derived from GP models are usually very sensitive to the type of achievement function that is chosen. This paper aims to provide a methodological insight into several achievement functions. It describes the extended GP (EGP) achievement function, which enables the decision maker to use either a MinSum achievement function (which minimizes the sum of the unwanted deviations) or a MinMax achievement function (which minimizes the largest unwanted deviation), or a compromise between both. An additional advantage of EGP models is that from one set of data and weights multiple solutions can be obtained. We use small numerical examples to illustrate the 'mechanics' of achievement functions. Then, the EGP achievement function is demonstrated on a diet problem with 144 foods, 19 nutrients and several types of palatability constraints, in which the nutritional constraints are modeled with fuzzy sets. Choice of achievement function affects the results of diet models. MinSum achievement functions can give rise to solutions that are sensitive to weight changes and that pile all unwanted deviations on a limited number of nutritional constraints. MinMax achievement functions spread the unwanted deviations as evenly as possible, but may create many (small) deviations. EGP comprises both types of achievement functions, as well as compromises between them. It can thus, from one data set, find a range of solutions with various properties.
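    A toy linear-programming sketch (made-up numbers, with deliberately conflicting goals so that deviations are unavoidable) illustrates how MinSum and MinMax achievement functions distribute the unwanted deviations differently; SciPy's linprog stands in for the GP solver here, and none of the data comes from the paper.

        import numpy as np
        from scipy.optimize import linprog

        # One "food" and two conflicting nutrient goals; variables are x plus the
        # under- (d-) and over- (d+) deviations for each goal.
        A = np.array([[1.0], [1.0]])
        goals = np.array([10.0, 6.0])
        n_goals, n_foods = A.shape
        A_eq = np.hstack([A, np.eye(n_goals), -np.eye(n_goals)])   # A x + d- - d+ = goals
        bounds = [(0, None)] * (n_foods + 2 * n_goals)

        # MinSum: minimise the sum of all deviations.
        c_sum = np.concatenate([np.zeros(n_foods), np.ones(2 * n_goals)])
        minsum = linprog(c_sum, A_eq=A_eq, b_eq=goals, bounds=bounds)

        # MinMax: add a variable z bounding every deviation and minimise z.
        c_max = np.concatenate([np.zeros(n_foods + 2 * n_goals), [1.0]])
        A_ub = np.hstack([np.zeros((2 * n_goals, n_foods)), np.eye(2 * n_goals),
                          -np.ones((2 * n_goals, 1))])              # each d_k - z <= 0
        minmax = linprog(c_max, A_ub=A_ub, b_ub=np.zeros(2 * n_goals),
                         A_eq=np.hstack([A_eq, np.zeros((n_goals, 1))]), b_eq=goals,
                         bounds=bounds + [(0, None)])

        d1 = minsum.x[n_foods:]
        d2 = minmax.x[n_foods:n_foods + 2 * n_goals]
        print(f"MinSum: total deviation {d1.sum():.1f}, largest {d1.max():.1f}")
        print(f"MinMax: total deviation {d2.sum():.1f}, largest {d2.max():.1f}")

    In this toy case both solutions have the same total deviation, but the MinSum solution typically piles it all on one goal, whereas the MinMax solution spreads it evenly across the goals.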

  7. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction

    Directory of Open Access Journals (Sweden)

    Damir Kralj

    2015-09-01

    Full Text Available Background Family medicine practices (FMPs) make the basis for the Croatian health care system. Use of electronic health record (EHR) software is mandatory and it plays an important role in running these practices, but important functional features still remain uneven and largely left to the will of the software developers. Objective The objective of this study was to develop a novel and comprehensive model for functional evaluation of the EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Methods Based on previous theoretical and experimental research in this area, we made the initial framework model consisting of six basic categories as a base for online survey questionnaire. Family doctors assessed perceived software quality by using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods over the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. Results The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The intense ergonomic orientation of the novel measurement model was particularly emphasised. Conclusions The resulting novel model is multiple validated, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of the ambulatory EHR software and therefore useful to all stakeholders in this area of the health care informatisation.

  8. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    Science.gov (United States)

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) make the basis for the Croatian health care system. Use of electronic health record (EHR) software is mandatory and it plays an important role in running these practices, but important functional features still remain uneven and largely left to the will of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of the EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we made the initial framework model consisting of six basic categories as a base for online survey questionnaire. Family doctors assessed perceived software quality by using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods over the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The intense ergonomic orientation of the novel measurement model was particularly emphasised. The resulting novel model is multiple validated, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of the ambulatory EHR software and therefore useful to all stakeholders in this area of the health care informatisation.

  9. Can genetics help psychometrics? Improving dimensionality assessment through genetic factor modeling.

    Science.gov (United States)

    Franić, Sanja; Dolan, Conor V; Borsboom, Denny; Hudziak, James J; van Beijsterveldt, Catherina E M; Boomsma, Dorret I

    2013-09-01

    In the present article, we discuss the role that quantitative genetic methodology may play in assessing and understanding the dimensionality of psychological (psychometric) instruments. Specifically, we study the relationship between the observed covariance structures, on the one hand, and the underlying genetic and environmental influences giving rise to such structures, on the other. We note that this relationship may be such that it hampers obtaining a clear estimate of dimensionality using standard tools for dimensionality assessment alone. One situation in which dimensionality assessment may be impeded is that in which genetic and environmental influences, of which the observed covariance structure is a function, differ from each other in structure and dimensionality. We demonstrate that in such situations settling dimensionality issues may be problematic, and propose using quantitative genetic modeling to uncover the (possibly different) dimensionalities of the underlying genetic and environmental structures. We illustrate using simulations and an empirical example on childhood internalizing problems.

  10. Effects of tailored neck-shoulder pain treatment based on a decision model guided by clinical assessments and standardized functional tests. A study protocol of a randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Björklund Martin

    2012-05-01

    Full Text Available Abstract Background A major problem with rehabilitation interventions for neck pain is that the condition may have multiple causes, so a single treatment approach is seldom efficient. The present study protocol outlines a single-blinded randomised controlled trial evaluating the effect of tailored treatment for neck-shoulder pain. The treatment is based on a decision model guided by standardized clinical assessment and functional tests with cut-off values. Our main hypothesis is that the tailored treatment has better short-, intermediate- and long-term effects than either non-tailored treatment or treatment-as-usual (TAU) on pain and function. We subsequently hypothesize that tailored and non-tailored treatment both have better effects than TAU. Methods/Design 120 working women aged 20–65 with a minimum of six weeks of nonspecific neck-shoulder pain are allocated by minimisation, with the factors age, duration of pain, pain intensity and disability, into the groups tailored treatment (T), non-tailored treatment (NT) or treatment-as-usual (TAU). Treatment is given to the groups T and NT for 11 weeks (27 sessions evenly distributed). An extensive presentation of the tests and treatment decision model is provided. The main treatment components are manual therapy, cranio-cervical flexion exercise and strength training, EMG-biofeedback training, treatment for cervicogenic headache, and neck motor control training. A decision algorithm based on the baseline assessment determines the treatment components given to each participant of the T- and NT-groups. Primary outcome measures are physical functioning (Neck Disability Index) and average pain intensity in the last week (Numeric Rating Scale). Secondary outcomes are general improvement (Patient Global Impression of Change scale), symptoms (Profile Fitness Mapping neck questionnaire), capacity to work in the last 6 weeks (quality and quantity) and pressure pain threshold of m. trapezius. Primary and secondary outcomes will

  11. Underwater noise modelling for environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Farcas, Adrian [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom); Thompson, Paul M. [Lighthouse Field Station, Institute of Biological and Environmental Sciences, University of Aberdeen, Cromarty IV11 8YL (United Kingdom); Merchant, Nathan D., E-mail: nathan.merchant@cefas.co.uk [Centre for Environment, Fisheries and Aquaculture Science (Cefas), Pakefield Road, Lowestoft, NR33 0HT (United Kingdom)

    2016-02-15

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  12. Underwater noise modelling for environmental impact assessment

    International Nuclear Information System (INIS)

    Farcas, Adrian; Thompson, Paul M.; Merchant, Nathan D.

    2016-01-01

    Assessment of underwater noise is increasingly required by regulators of development projects in marine and freshwater habitats, and noise pollution can be a constraining factor in the consenting process. Noise levels arising from the proposed activity are modelled and the potential impact on species of interest within the affected area is then evaluated. Although there is considerable uncertainty in the relationship between noise levels and impacts on aquatic species, the science underlying noise modelling is well understood. Nevertheless, many environmental impact assessments (EIAs) do not reflect best practice, and stakeholders and decision makers in the EIA process are often unfamiliar with the concepts and terminology that are integral to interpreting noise exposure predictions. In this paper, we review the process of underwater noise modelling and explore the factors affecting predictions of noise exposure. Finally, we illustrate the consequences of errors and uncertainties in noise modelling, and discuss future research needs to reduce uncertainty in noise assessments.

  13. Global sensitivity analysis of computer models with functional inputs

    International Nuclear Information System (INIS)

    Iooss, Bertrand; Ribatet, Mathieu

    2009-01-01

    Global sensitivity analysis is used to quantify the influence of uncertain model inputs on the response variability of a numerical model. The common quantitative methods are appropriate for computer codes having scalar model inputs. This paper aims at illustrating different variance-based sensitivity analysis techniques, based on the so-called Sobol indices, when some model inputs are functional, such as stochastic processes or random spatial fields. In this work, we focus on computer codes with large CPU times, which need a preliminary metamodeling step before performing the sensitivity analysis. We propose the use of the joint modeling approach, i.e., modeling simultaneously the mean and the dispersion of the code outputs using two interlinked generalized linear models (GLMs) or generalized additive models (GAMs). The 'mean model' allows the sensitivity indices of each scalar model input to be estimated, while the 'dispersion model' allows the total sensitivity index of the functional model inputs to be derived. The proposed approach is compared with some classical sensitivity analysis methodologies on an analytical function. Lastly, the new methodology is applied to an industrial computer code that simulates nuclear fuel irradiation.
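    The joint GLM/GAM metamodelling itself is not sketched here; as a generic reminder of what a variance-based (Sobol) sensitivity index is, the following NumPy sketch estimates first-order indices of the Ishigami test function with a Saltelli-style pick-freeze estimator (all choices are ours, for illustration only).

        import numpy as np

        def model(x1, x2, x3):
            # Ishigami function, a standard analytical benchmark for Sobol indices.
            return np.sin(x1) + 7.0 * np.sin(x2) ** 2 + 0.1 * x3 ** 4 * np.sin(x1)

        rng = np.random.default_rng(0)
        n = 100_000
        A = rng.uniform(-np.pi, np.pi, size=(n, 3))
        B = rng.uniform(-np.pi, np.pi, size=(n, 3))
        yA, yB = model(*A.T), model(*B.T)
        var_y = np.var(np.concatenate([yA, yB]))
        for i in range(3):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                                   # swap only input x_i
            Si = np.mean(yB * (model(*ABi.T) - yA)) / var_y       # first-order index estimator
            print(f"S{i + 1} ~ {Si:.2f}")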

  14. Business risks, functions, methods of assessment and ways to reduce risk

    Directory of Open Access Journals (Sweden)

    A.V. Mihalchuk

    2015-06-01

    Full Text Available For successful existence in a market economy, entrepreneurs have to take bold actions, and this increases risk. The article describes the concepts of entrepreneurship and business risk, and the positive and negative aspects of the functions of risk in business. It is therefore necessary to assess risk properly and to be able to manage it in order to achieve the most effective results in the market. In market conditions, the problem of assessing and accounting for risk acquires independent theoretical and practical significance as an important component of the theory and practice of management. Risk is a key element of business activity. The development of risk situations can lead both to adverse effects (losses, lost profits) and to positive results for a company in the form of increased profit. This article describes the concepts of entrepreneurship, risk and business risk; characterizes the positive and negative aspects of the functions of risk in business; covers methods of risk assessment and ways of risk reduction; and presents formulae and examples that can be used to assess risk in an enterprise. Based on an analysis of established risk assessment methods, a number of rules are proposed in order to reduce business risk.

  15. Reduced Rank Mixed Effects Models for Spatially Correlated Hierarchical Functional Data

    KAUST Repository

    Zhou, Lan

    2010-03-01

    Hierarchical functional data are widely seen in complex studies where sub-units are nested within units, which in turn are nested within treatment groups. We propose a general framework of functional mixed effects model for such data: within unit and within sub-unit variations are modeled through two separate sets of principal components; the sub-unit level functions are allowed to be correlated. Penalized splines are used to model both the mean functions and the principal components functions, where roughness penalties are used to regularize the spline fit. An EM algorithm is developed to fit the model, while the specific covariance structure of the model is utilized for computational efficiency to avoid storage and inversion of large matrices. Our dimension reduction with principal components provides an effective solution to the difficult tasks of modeling the covariance kernel of a random function and modeling the correlation between functions. The proposed methodology is illustrated using simulations and an empirical data set from a colon carcinogenesis study. Supplemental materials are available online.

  16. Predicting recovery of cognitive function soon after stroke: differential modeling of logarithmic and linear regression.

    Science.gov (United States)

    Suzuki, Makoto; Sugimura, Yuko; Yamada, Sumio; Omori, Yoshitsugu; Miyamoto, Masaaki; Yamamoto, Jun-ichi

    2013-01-01

    Cognitive disorders in the acute stage of stroke are common and are important independent predictors of adverse outcome in the long term. Despite the impact of cognitive disorders on both patients and their families, it is still difficult to predict the extent or duration of cognitive impairments. The objective of the present study was, therefore, to provide data on predicting the recovery of cognitive function soon after stroke by differential modeling with logarithmic and linear regression. This study included two rounds of data collection, comprising 57 stroke patients enrolled in the first round for the purpose of identifying the time course of cognitive recovery in the early-phase group data, and 43 stroke patients in the second round for the purpose of ensuring that the correlation of the early-phase group data applied to the prediction of each individual's degree of cognitive recovery. In the first round, Mini-Mental State Examination (MMSE) scores were assessed 3 times during hospitalization, and the scores were regressed on the logarithm of time and on time. In the second round, calculations of MMSE scores were made for the first two scoring times after admission to tailor the structures of logarithmic and linear regression formulae to fit an individual's degree of functional recovery. The time course of early-phase recovery for cognitive functions resembled both logarithmic and linear functions. However, MMSE scores sampled at two baseline points based on logarithmic regression modeling could estimate prediction of cognitive recovery more accurately than could linear regression modeling (logarithmic modeling, R(2) = 0.676, P < …). Logarithmic modeling based on MMSE scores could accurately predict the recovery of cognitive function soon after the occurrence of stroke. This logarithmic modeling with mathematical procedures is simple enough to be adopted in daily clinical practice.
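    A minimal synthetic sketch (scores and time points invented, not the study's data) of the differential-modelling idea: fit both a logarithmic and a linear curve through two early MMSE scores and compare their later predictions.

        import numpy as np

        days = np.array([3.0, 10.0])        # two baseline assessments (days post stroke)
        scores = np.array([18.0, 23.0])     # illustrative MMSE scores at those times

        b, a = np.polyfit(np.log(days), scores, 1)   # logarithmic model: score = a + b*ln(t)
        d, c = np.polyfit(days, scores, 1)           # linear model:      score = c + d*t

        t_future = 30.0
        print(f"log model at day 30:    {a + b * np.log(t_future):.1f}")
        print(f"linear model at day 30: {c + d * t_future:.1f}  (overshoots the MMSE ceiling of 30)")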

  17. Functional assessment of the right ventricle with gated myocardial perfusion SPECT

    International Nuclear Information System (INIS)

    Wadhwa, S.S.; Abbati, D.; Carolan, M.

    2002-01-01

    Full text: Evaluation of right ventricular function can provide valuable information in a variety of cardiac and non-cardiac conditions. Functional assessment of the right ventricle is difficult owing to its anatomy and geometry. We describe a method of assessing right ventricular function using gated myocardial perfusion SPECT. In 20 patients, right and left ventricular ejection fractions (RVEF, LVEF) were determined using gated blood pool (GBPS) and gated myocardial perfusion SPECT (GSPECT). To avoid contamination with right atrial activity, the two-frame method was adopted for gated blood pool data when measuring RVEF. In 9 patients with normal right ventricles, an index of wall thickening for the right ventricle was derived from the peak systolic and diastolic counts in the free wall. There was good linear correlation between the two methods adopted for calculation of LVEF and RVEF. Bland-Altman analysis demonstrated good agreement between the two methods with no specific bias. The mean LVEF was 47.9 +/- 12% (GBPS) and 47.3 +/- 12.4% (GSPECT). The mean RVEF was 43.2 +/- 9.6% (GBPS) and 44.2 +/- 8.5% (GSPECT). In neither case were the values significantly different. The mean wall motion index was 35%. There was no correlation between the wall thickness index and ejection fraction; however, the index was greater in patients with normal right ventricles than in those with reduced RVEF. Gated SPECT offers an alternative to GBPS for the functional assessment of the right ventricle. Utilising GSPECT will allow the simultaneous assessment of both the right and left ventricles. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc

  18. A model-based approach to preplanting risk assessment for gray leaf spot of maize.

    Science.gov (United States)

    Paul, P A; Munkvold, G P

    2004-12-01

    ABSTRACT Risk assessment models for gray leaf spot of maize, caused by Cercospora zeae-maydis, were developed using preplanting site and maize genotype data as predictors. Disease severity at the dough/dent plant growth stage was categorized into classes and used as the response variable. Logistic regression and classification and regression tree (CART) modeling approaches were used to predict severity classes as a function of planting date (PD), amount of maize soil surface residue (SR), cropping sequence, genotype maturity and gray leaf spot resistance (GLSR) ratings, and longitude (LON). Models were developed using 332 cases collected between 1998 and 2001. Thirty cases collected in 2002 were used to validate the models. Preplanting data showed a strong relationship with late-season gray leaf spot severity classes. The most important predictors were SR, PD, GLSR, and LON. Logistic regression models correctly classified 60 to 70% of the validation cases, whereas the CART models correctly classified 57 to 77% of these cases. Cases misclassified by the CART models were mostly due to overestimation, whereas the logistic regression models tended to misclassify cases by underestimation. Both the CART and logistic regression models have potential as management decision-making tools. Early quantitative assessment of gray leaf spot risk would allow for more sound management decisions being made when warranted.
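    As a schematic of the two modelling approaches (entirely synthetic predictors and severity classes, not the study's data or fitted models), scikit-learn's logistic regression and decision tree can both be trained on preplanting variables and queried for a new field.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 300
        SR = rng.uniform(0, 100, n)          # maize surface residue (%)
        PD = rng.uniform(100, 160, n)        # planting date (day of year)
        GLSR = rng.integers(1, 10, n)        # hybrid gray leaf spot resistance rating
        logit = 0.04 * SR - 0.05 * (PD - 130) - 0.4 * GLSR + 1.0
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # True = high late-season severity

        X = np.column_stack([SR, PD, GLSR])
        logistic = LogisticRegression(max_iter=1000).fit(X, y)
        cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

        field = np.array([[80.0, 120.0, 3]])   # high residue, early planting, susceptible hybrid
        print("logistic P(high severity):", round(logistic.predict_proba(field)[0, 1], 2))
        print("CART predicted class:", int(cart.predict(field)[0]))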

  19. Assessment of a virtual functional prototyping process for the rapid manufacture of passive-dynamic ankle-foot orthoses.

    Science.gov (United States)

    Schrank, Elisa S; Hitch, Lester; Wallace, Kevin; Moore, Richard; Stanhope, Steven J

    2013-10-01

    Passive-dynamic ankle-foot orthosis (PD-AFO) bending stiffness is a key functional characteristic for achieving enhanced gait function. However, current orthosis customization methods inhibit objective premanufacture tuning of the PD-AFO bending stiffness, making optimization of orthosis function challenging. We have developed a novel virtual functional prototyping (VFP) process, which harnesses the strengths of computer aided design (CAD) model parameterization and finite element analysis, to quantitatively tune and predict the functional characteristics of a PD-AFO, which is rapidly manufactured via fused deposition modeling (FDM). The purpose of this study was to assess the VFP process for PD-AFO bending stiffness. A PD-AFO CAD model was customized for a healthy subject and tuned to four bending stiffness values via VFP. Two sets of each tuned model were fabricated via FDM using medical-grade polycarbonate (PC-ISO). Dimensional accuracy of the fabricated orthoses was excellent (average 0.51 ± 0.39 mm). Manufacturing precision ranged from 0.0 to 0.74 Nm/deg (average 0.30 ± 0.36 Nm/deg). Bending stiffness prediction accuracy was within 1 Nm/deg using the manufacturer provided PC-ISO elastic modulus (average 0.48 ± 0.35 Nm/deg). Using an experimentally derived PC-ISO elastic modulus improved the optimized bending stiffness prediction accuracy (average 0.29 ± 0.57 Nm/deg). Robustness of the derived modulus was tested by carrying out the VFP process for a disparate subject, tuning the PD-AFO model to five bending stiffness values. For this disparate subject, bending stiffness prediction accuracy was strong (average 0.20 ± 0.14 Nm/deg). Overall, the VFP process had excellent dimensional accuracy, good manufacturing precision, and strong prediction accuracy with the derived modulus. Implementing VFP as part of our PD-AFO customization and manufacturing framework, which also includes fit customization, provides a novel and powerful method to

  20. Modeling inputs to computer models used in risk assessment

    International Nuclear Information System (INIS)

    Iman, R.L.

    1987-01-01

    Computer models for various risk assessment applications are closely scrutinized both from the standpoint of questioning the correctness of the underlying mathematical model with respect to the process it is attempting to model and from the standpoint of verifying that the computer model correctly implements the underlying mathematical model. A process that receives less scrutiny, but is nonetheless of equal importance, concerns the individual and joint modeling of the inputs. This modeling effort clearly has a great impact on the credibility of results. Model characteristics are reviewed in this paper that have a direct bearing on the model input process and reasons are given for using probabilities-based modeling with the inputs. The authors also present ways to model distributions for individual inputs and multivariate input structures when dependence and other constraints may be present

  1. Assessing NARCCAP climate model effects using spatial confidence regions

    Directory of Open Access Journals (Sweden)

    J. P. French

    2017-07-01

    Full Text Available We assess similarities and differences between model effects for the North American Regional Climate Change Assessment Program (NARCCAP) climate models using varying classes of linear regression models. Specifically, we consider how the average temperature effect differs for the various global and regional climate model combinations, including assessment of possible interaction between the effects of global and regional climate models. We use both pointwise and simultaneous inference procedures to identify regions where global and regional climate model effects differ. We also show conclusively that results from pointwise inference are misleading, and that accounting for multiple comparisons is important for making proper inference.

  2. Statistical Modelling of Resonant Cross Section Structure in URR, Model of the Characteristic Function

    International Nuclear Information System (INIS)

    Koyumdjieva, N.

    2006-01-01

    A statistical model for the resonant cross section structure in the Unresolved Resonance Region has been developed in the framework of the R-matrix formalism in the Reich-Moore approach, with effective accounting of the fluctuations of the resonance parameters. The model uses only the average resonance parameters and can be effectively applied to analyses of cross section functionals averaged over many resonances, namely cross section moments and transmission and self-indication functions measured through a thick sample. In this statistical model the resonant cross section structure is taken to be periodic and the R-matrix is a function of ε = E/D with period 0 ≤ ε ≤ N: R_nc(ε) = (π/2)·√(S_n·S_c)·(1/N)·Σ_{i=1..N} β_in·β_ic·cot[π(ε_i − ε − i·S_i)/N]. Here S_n, S_c and S_i are, respectively, the neutron strength function, the strength function for the fission or inelastic channel, and the strength function for radiative capture; N is the number of resonances (ε_i, β_i), which obey Porter-Thomas and Wigner statistics. The simple case of this statistical model concerns the resonant cross section structure for non-fissile nuclei below the threshold for inelastic scattering - the model of the characteristic function implemented in the HARFOR program. In the above model some improvements in the calculation of the phases and logarithmic derivatives of the neutron channels have been made. In the parameterization we use the free parameter R_l^∞, which accounts for the influence of distant resonances. The above scheme for statistical modelling of the resonant cross section structure has been applied to the evaluation of experimental data for the total, capture and inelastic cross sections of 232Th in the URR (4-150) keV and also to the transmission and self-indication functions in (4-175) keV. A set of evaluated average resonance parameters has been obtained. The evaluated average resonance parameters in the URR are consistent with those in the Resolved Resonance Region (CRP for the Th-U cycle, Vienna, 2006).

  3. Towards a functional model of mental disorders incorporating the laws of thermodynamics.

    Science.gov (United States)

    Murray, George C; McKenzie, Karen

    2013-05-01

    The current paper presents the hypothesis that the understanding of mental disorders can be advanced by incorporating the laws of thermodynamics, specifically relating to energy conservation and energy transfer. These ideas, along with the introduction of the notion that entropic activities are symptomatic of inefficient energy transfer or disorder, were used to propose a model of understanding mental ill health as resulting from the interaction of entropy, capacity and work (environmental demands). The model was applied to Attention Deficit Hyperactivity Disorder, and was shown to be compatible with current thinking about this condition, as well as emerging models of mental disorders as complex networks. A key implication of the proposed model is that it argues that all mental disorders require a systemic functional approach, with the advantage that it offers a number of routes into the assessment, formulation and treatment for mental health problems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Functional assessment of cutaneous microvasculature after radiation

    International Nuclear Information System (INIS)

    Doll, C.; Durand, R.; Grulkey, W.; Sayer, S.; Olivotto, I.

    1999-01-01

    Background and purpose: To determine if laser Doppler flowmetry could be used to non-invasively evaluate microvasculature function after radiation therapy (RT), we assessed blood flow response to heating in women following RT after breast conservation. Materials and methods: Forty women with unilateral stage I/II breast cancer treated with conservative surgery and RT were evaluated at varying intervals post RT. Ten patients were retested after an interval of 55 to 57 months to assess reproducibility of the control data. A laser Doppler probe fitted into a heat source was used to non-invasively measure blood flow in a small area of skin on the treated breast and a matched area on the untreated side. The heating element increased skin surface temperature to 40°C, permitting assessment of heat-stress-induced changes in blood flow. Results: At 36 months post RT, there was no significant difference seen in relative blood flow between the irradiated and non-irradiated sides. Cutaneous blood flow response to the heat stress was very reproducible when women were reassessed 55 to 57 months after initial testing. Conclusions: Heat-induced cutaneous blood flow response showed no lasting difference between the irradiated and non-irradiated breasts at 36 months post RT. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)

  5. Functional Median Polish

    KAUST Repository

    Sun, Ying

    2012-08-03

    This article proposes functional median polish, an extension of univariate median polish, for one-way and two-way functional analysis of variance (ANOVA). The functional median polish estimates the functional grand effect and functional main factor effects based on functional medians in an additive functional ANOVA model assuming no interaction among factors. A functional rank test is used to assess whether the functional main factor effects are significant. The robustness of the functional median polish is demonstrated by comparing its performance with the traditional functional ANOVA fitted by means under different outlier models in simulation studies. The functional median polish is illustrated on various applications in climate science, including one-way and two-way ANOVA when functional data are either curves or images. Specifically, Canadian temperature data, U. S. precipitation observations and outputs of global and regional climate models are considered, which can facilitate the research on the close link between local climate and the occurrence or severity of some diseases and other threats to human health. © 2012 International Biometric Society.
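
    The univariate median polish that the functional version extends can be sketched in a few lines of NumPy. This is a generic illustration of Tukey's procedure on a made-up two-way table, not code from the paper; the function and variable names are my own.

```python
import numpy as np

def median_polish(table, n_iter=10):
    """Tukey's median polish: table ≈ overall + row_eff + col_eff + residual."""
    z = np.asarray(table, dtype=float).copy()
    overall, row, col = 0.0, np.zeros(z.shape[0]), np.zeros(z.shape[1])
    for _ in range(n_iter):
        r = np.median(z, axis=1)           # sweep out row medians
        z -= r[:, None]
        row += r
        delta = np.median(col)             # re-centre column effects
        col -= delta
        overall += delta
        c = np.median(z, axis=0)           # sweep out column medians
        z -= c[None, :]
        col += c
        delta = np.median(row)             # re-centre row effects
        row -= delta
        overall += delta
    return overall, row, col, z            # z now holds the residuals

# Toy two-way table (rows = levels of factor A, columns = levels of factor B).
table = np.array([[14.0, 15.0, 14.0],
                  [ 7.0,  4.0,  7.0],
                  [ 8.0,  2.0, 10.0],
                  [15.0,  9.0, 10.0],
                  [ 0.0,  2.0,  0.0]])
overall, row_eff, col_eff, resid = median_polish(table)
print(overall, row_eff, col_eff, sep="\n")
```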

  6. Multiple Types of Memory and Everyday Functional Assessment in Older Adults

    Science.gov (United States)

    Beaver, Jenna

    2017-01-01

    Abstract Objective: Current proxy measures for assessing everyday functioning (e.g., questionnaires, performance-based measures, and direct observation) show discrepancies in their rating of functional status. The present study investigated the relationship between multiple proxy measures of functional status and content memory (i.e., memory for information), temporal order memory, and prospective memory in an older adult sample. Method: A total of 197 community-dwelling older adults who did (n = 45) or did not meet (n = 152) criteria for mild cognitive impairment (MCI) completed six different assessments of functional status (two questionnaires, two performance-based tasks, and two direct observation tasks) as well as experimental measures of content memory, prospective memory, and temporal order memory. Results: After controlling for demographics and content memory, the temporal order and prospective memory measures explained a significant amount of variance in all proxy functional status measures. When all variables were entered into the regression analyses, content memory and prospective memory were found to be significant predictors of all measures of functional status, whereas temporal order memory was a significant predictor for the questionnaire and direct observation measures, but not performance-based measures. Conclusion: The results suggest that direct observation and questionnaire measures may be able to capture components of everyday functioning that require context and temporal sequencing abilities, such as multi-tasking, that are not as well captured in many current laboratory performance-based measures of functional status. Future research should aim to inform the development and use of maximally effective and valid proxy measures of functional ability. PMID:28334170

  7. Observation-based assessment of functional ability in patients with chronic widespread pain

    DEFF Research Database (Denmark)

    Amris, Kirstine; Wæhrens, Eva Ejlersen; Jespersen, Anders

    2011-01-01

    Knowledge about functional ability, including activities of daily living (ADL), in patients with chronic widespread pain (CWP) and fibromyalgia (FMS) is largely based on self-report. The purpose of this study was to assess functional ability by using standardised, observation-based assessment...... of ADL performance and to examine the relationship between self-reported and observation-based measures of disability. A total of 257 women with CWP, 199 (77%) fulfilling the American College of Rheumatology tender point criteria for FMS, were evaluated with the Assessment of Motor and Process Skills...... (AMPS), an observation-based assessment providing linear measures of ADL motor and ADL process skill ability (unit: logits). A cutoff for effortless and independent ADL task performance is set at 2.0 for the motor scale and 1.0 for the process scale. A total of 248 (96.5%) had ability measures below...

  8. Application of eco-exergy for assessment of ecosystem health and development of structurally dynamic models

    DEFF Research Database (Denmark)

    Zhang, J.; Gürkan, Zeren; Jørgensen, S.E.

    2010-01-01

    Eco-exergy has been widely used in the assessment of ecosystem health, parameter estimations, calibrations, validations and prognoses. It offers insights into the understanding of ecosystem dynamics and disturbance-driven changes. Particularly, structurally dynamic models (SDMs), which are developed using eco-exergy as the goal function, have been applied in explaining and exploring ecosystem properties and changes in community structure driven by biotic and abiotic factors. In this paper, we review the application of eco-exergy for the assessment of ecosystem health and the development of structurally dynamic models.

  9. An Improved Nested Sampling Algorithm for Model Selection and Assessment

    Science.gov (United States)

    Zeng, X.; Ye, M.; Wu, J.; WANG, D.

    2017-12-01

    Multimodel strategies are a general approach for treating model structure uncertainty in recent research. The unknown groundwater system is represented by several plausible conceptual models, and each alternative conceptual model is assigned a weight representing its plausibility. In the Bayesian framework, the posterior model weight is computed as the product of the model prior weight and the marginal likelihood (also termed the model evidence). As a result, estimating marginal likelihoods is crucial for reliable model selection and assessment in multimodel analysis. The nested sampling estimator (NSE) is a newly proposed algorithm for marginal likelihood estimation. NSE searches the parameter space gradually from low- to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure; the efficiency of NSE is therefore dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm and its variants are often used for local sampling in NSE. However, M-H is not an efficient sampling algorithm for high-dimensional or complex likelihood functions. To improve the performance of NSE, a more efficient and elaborate sampling algorithm, DREAMzs, can be integrated into the local sampling step. In addition, to overcome the computational burden of the large number of repeated model executions required for marginal likelihood estimation, an adaptive sparse-grid stochastic collocation method is used to build surrogates for the original groundwater model.
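
    As a rough illustration of the estimator under discussion, the sketch below runs a toy nested sampling loop for a one-dimensional problem with a uniform prior and a Gaussian likelihood. The constrained-replacement step here is naive rejection from the prior, which is exactly the part the abstract proposes to strengthen with better local sampling (e.g., M-H or DREAMzs); the likelihood and all numbers are placeholders, not a groundwater model.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_like(theta):
    """Toy Gaussian likelihood centred at 0.5; the prior is U(0, 1)."""
    return -0.5 * ((theta - 0.5) / 0.05) ** 2 - np.log(0.05 * np.sqrt(2.0 * np.pi))

n_live, n_iter = 200, 1000
live = rng.uniform(0.0, 1.0, n_live)             # live points drawn from the prior
live_logL = log_like(live)

Z, X_prev = 0.0, 1.0                             # evidence accumulator, prior volume
for i in range(1, n_iter + 1):
    worst = np.argmin(live_logL)                 # lowest-likelihood live point
    X_i = np.exp(-i / n_live)                    # expected remaining prior volume
    Z += np.exp(live_logL[worst]) * (X_prev - X_i)
    X_prev = X_i
    # Replace the worst point by a prior draw with higher likelihood.
    # Naive rejection here; this is the step NSE refines with local sampling.
    L_min = live_logL[worst]
    cand = rng.uniform(0.0, 1.0)
    while log_like(cand) <= L_min:
        cand = rng.uniform(0.0, 1.0)
    live[worst], live_logL[worst] = cand, log_like(cand)

Z += np.exp(live_logL).mean() * X_prev           # contribution of the remaining live points
print(f"estimated evidence: {Z:.3f}")            # the analytic value here is close to 1
```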

  10. A Model for Situation and Threat Assessment

    Science.gov (United States)

    2006-12-01

    A model is presented for situation and threat assessment (Alan Steinberg, CUBRC, Inc., November 2005).

  11. ASSESSMENT CRITERIA OF FUNCTIONALITY GEOTEXTILES USED IN ROAD CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    LUCA Cristinel

    2016-05-01

    Full Text Available This work was performed in order to assess the functionality of geotextiles used in road construction. Increasing the quality of road works requires the use of geotextiles in their structure. Depending on their role and the benefits they offer, geotextiles have a number of physical, hydraulic, endurance and degradation-related properties whose characteristics must be optimal. Geotextile properties were identified and grouped by characteristic area, yielding textile-oriented properties and properties geared to the application field, namely reinforcement, drainage and filtration. Value engineering works at the level of product conception and production; it is instrumented through functional analysis, value functions and the design or redesign of geotextiles based on the required functions. A systematic research method allowed the dimensioning of geotextile functions in order to obtain products with maximum quality, reliability and operational performance. The functions obtained from the analysis are each appropriate for a single property. After obtaining the set of decisions, it was possible to rank the geotextile functions by the significance of their use. Establishing the importance coefficients, or ranking the characteristics by weight, requires comparing the features with one another and grading them in proportion to their degree of importance. The ranking of these functions is beneficial when designing or redesigning geotextiles.

  12. A Simple Model of Self-Assessments

    NARCIS (Netherlands)

    S. Dominguez Martinez (Silvia); O.H. Swank (Otto)

    2006-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too

  13. Driver steering model for closed-loop steering function analysis

    Science.gov (United States)

    Bolia, Pratiksh; Weiskircher, Thomas; Müller, Steffen

    2014-05-01

    In this paper, a two-level preview driver steering control model for use in numerical vehicle dynamics simulation is introduced. The proposed model is composed of cascaded control loops: the outer loop is the path-following layer, based on a potential field framework, while the inner loop tries to capture the driver's physical behaviour. The proposed driver model allows easy implementation of different driving situations to simulate a wide range of different driver types, moods and vehicle types. The expediency of the proposed driver model is shown with the help of a driver steering assist (DSA) function integrated with a conventional series-production system (an electric power steering system with a rack-assist servo unit). With the help of the DSA function, the driver is prevented from over-saturating the front tyre forces and from losing stability and controllability during cornering. The simulation results show different driver reactions, caused by changes in the parameters or properties of the proposed driver model, when the DSA function is activated. Thus, the proposed driver model is useful for evaluating advanced driver steering and vehicle stability assist functions in the early stages of vehicle dynamics, handling and stability development.

  14. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  15. Conceptual Models and Guidelines for Clinical Assessment of Financial Capacity.

    Science.gov (United States)

    Marson, Daniel

    2016-09-01

    The ability to manage financial affairs is a life skill of critical importance, and neuropsychologists are increasingly asked to assess financial capacity across a variety of settings. Sound clinical assessment of financial capacity requires knowledge and appreciation of applicable clinical conceptual models and principles. However, the literature has presented relatively little conceptual guidance for clinicians concerning financial capacity and its assessment. This article seeks to address this gap. The article presents six clinical models of financial capacity: (1) the early gerontological IADL model of Lawton, (2) the clinical skills model and (3) a related cognitive psychological model developed by Marson and colleagues, (4) a financial decision-making model adapting earlier decisional capacity work of Appelbaum and Grisso, (5) a person-centered model of financial decision-making developed by Lichtenberg and colleagues, and (6) a recent model of financial capacity in the real world developed through the Institute of Medicine. Accompanying presentation of the models is discussion of conceptual and practical perspectives they represent for clinician assessment. Based on the models, the article concludes by presenting a series of conceptually oriented guidelines for clinical assessment of financial capacity. In summary, sound assessment of financial capacity requires knowledge and appreciation of clinical conceptual models and principles. Awareness of such models, principles and guidelines will strengthen and advance clinical assessment of financial capacity. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Sample heterogeneity in unipolar depression as assessed by functional connectivity analyses is dominated by general disease effects.

    Science.gov (United States)

    Feder, Stephan; Sundermann, Benedikt; Wersching, Heike; Teuber, Anja; Kugel, Harald; Teismann, Henning; Heindel, Walter; Berger, Klaus; Pfleiderer, Bettina

    2017-11-01

    Combinations of resting-state fMRI and machine-learning techniques are increasingly employed to develop diagnostic models for mental disorders. However, little is known about the neurobiological heterogeneity of depression, and diagnostic machine learning has mainly been tested in homogeneous samples. Our main objective was to explore the inherent structure of a diverse unipolar depression sample. The secondary objective was to assess whether such information can improve diagnostic classification. We analyzed data from 360 patients with unipolar depression and 360 non-depressed population controls, who were subdivided into two independent subsets. Cluster analyses (unsupervised learning) of functional connectivity were used to generate hypotheses about potential patient subgroups from the first subset. The relationship of clusters with demographical and clinical measures was assessed. Subsequently, diagnostic classifiers (supervised learning), which incorporated information about these putative depression subgroups, were trained. Exploratory cluster analyses revealed two weakly separable subgroups of depressed patients. These subgroups differed in the average duration of depression and in the proportion of patients with concurrently severe depression and anxiety symptoms. The diagnostic classification models performed at chance level. It remains unresolved whether the subgroups represent distinct biological subtypes, variability of continuous clinical variables, or in part an overfitting of sparsely structured data. Functional connectivity in unipolar depression is associated with general disease effects. Cluster analyses provide hypotheses about potential depression subtypes. Diagnostic models did not benefit from this additional information regarding heterogeneity. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Attention and executive functions in a rat model of chronic epilepsy.

    Science.gov (United States)

    Faure, Jean-Baptiste; Marques-Carneiro, José E; Akimana, Gladys; Cosquer, Brigitte; Ferrandon, Arielle; Herbeaux, Karine; Koning, Estelle; Barbelivien, Alexandra; Nehlig, Astrid; Cassel, Jean-Christophe

    2014-05-01

    Temporal lobe epilepsy is a relatively frequent, disabling, and often refractory neurologic disorder. It is associated with cognitive impairments that affect memory and executive functions. In the rat lithium-pilocarpine temporal lobe epilepsy model, memory impairment and anxiety disorder are classically reported. Here we evaluated sustained visual attention in this model of epilepsy, a function not frequently explored. Thirty-five Sprague-Dawley rats were subjected to lithium-pilocarpine status epilepticus. Twenty of them received carisbamate treatment for 7 days, starting 1 h after status epilepticus onset. Twelve controls received lithium and saline. Five months later, attention was assessed in the five-choice serial reaction time task, a task that tests visual attention and inhibitory control (impulsivity/compulsivity). Neuronal counting was performed in brain regions of interest to the functions studied (hippocampus, prefrontal cortex, nucleus basalis magnocellularis, and pedunculopontine tegmental nucleus). Lithium-pilocarpine rats developed motor seizures. When they were able to learn the task, they exhibited attention impairment and a tendency toward impulsivity and compulsivity. These disturbances occurred in the absence of neuronal loss in structures classically related to attentional performance, although they seemed to better correlate with neuronal loss in the hippocampus. Globally, rats that received carisbamate and developed motor seizures were as impaired as untreated rats, whereas those that did not develop overt motor seizures performed like controls, despite evidence for hippocampal damage. This study shows that attention deficits reported by patients with temporal lobe epilepsy can be observed in the lithium-pilocarpine model. Carisbamate prevents the occurrence of motor seizures, attention impairment, impulsivity, and compulsivity in a subpopulation of neuroprotected rats. Wiley Periodicals, Inc. © 2014 International League Against Epilepsy.

  18. A general phenomenological model for work function

    Science.gov (United States)

    Brodie, I.; Chou, S. H.; Yuan, H.

    2014-07-01

    A general phenomenological model is presented for obtaining the zero Kelvin work function of any crystal facet of metals and semiconductors, both clean and covered with a monolayer of electropositive atoms. It utilizes the known physical structure of the crystal and the Fermi energy of the two-dimensional electron gas assumed to form on the surface. A key parameter is the number of electrons donated to the surface electron gas per surface lattice site or adsorbed atom, which is taken to be an integer. Initially this is found by trial and later justified by examining the state of the valence electrons of the relevant atoms. In the case of adsorbed monolayers of electropositive atoms a satisfactory justification could not always be found, particularly for cesium, but a trial value always predicted work functions close to the experimental values. The model can also predict the variation of work function with temperature for clean crystal facets. The model is applied to various crystal faces of tungsten, aluminium, silver, and select metal oxides, and most demonstrate good fits compared to available experimental values.
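
    For orientation, the quantity at the heart of the described model is the Fermi energy of a two-dimensional electron gas on the surface. The relations below are standard free-electron background rather than equations quoted from the paper; treating the areal density n_s as z donated electrons per surface site of area A_site is my reading of the parameterization sketched in the abstract.

```latex
% Standard 2D free-electron-gas relations (background, not quoted from the paper).
% n_s: areal electron density; z: electrons donated per surface site;
% A_site: area per surface lattice site; m_e: electron mass.
\[
  n_s = \frac{z}{A_{\mathrm{site}}}, \qquad
  k_F = \sqrt{2\pi n_s}, \qquad
  E_F = \frac{\hbar^2 k_F^2}{2 m_e} = \frac{\pi \hbar^2 n_s}{m_e}.
\]
```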

  19. Functional results-oriented healthcare leadership: a novel leadership model.

    Science.gov (United States)

    Al-Touby, Salem Said

    2012-03-01

    This article modifies the traditional functional leadership model to accommodate contemporary needs in healthcare leadership, based on two findings. First, the article argues that ideal healthcare leadership should emphasize the outcomes of patient care more than the processes and structures used to deliver such care; and second, that leadership must strive to attain effectiveness of care provision and not merely target the attractive option of efficient operations. Based on these premises, the paper reviews the traditional Functional Leadership Model and the three elements that define the type of leadership an organization has, namely the tasks, the individuals, and the team. The article argues that concentrating on any one of these elements is not ideal and proposes adding a new element to the model to construct a novel Functional Results-Oriented healthcare leadership model. The recommended model places the results element on top of the other three elements so that every effort in healthcare leadership is directed towards attaining excellent patient outcomes.

  20. Mathematical Models of Cardiac Pacemaking Function

    Directory of Open Access Journals (Sweden)

    Pan Li

    2013-10-01

    Full Text Available Over the past half century, there has been intense and fruitful interaction between experimental and computational investigations of cardiac function. This interaction has, for example, led to deep understanding of cardiac excitation-contraction coupling; how it works, as well as how it fails. However, many lines of inquiry remain unresolved, among them the initiation of each heartbeat. The sinoatrial node, a cluster of specialized pacemaking cells in the right atrium of the heart, spontaneously generates an electro-chemical wave that spreads through the atria and through the cardiac conduction system to the ventricles, initiating the contraction of cardiac muscle essential for pumping blood to the body. Despite the fundamental importance of this primary pacemaker, this process is still not fully understood, and ionic mechanisms underlying cardiac pacemaking function are currently under heated debate. Several mathematical models of sinoatrial node cell membrane electrophysiology have been constructed as based on different experimental data sets and hypotheses. As could be expected, these differing models offer diverse predictions about cardiac pacemaking activities. This paper aims to present the current state of debate over the origins of the pacemaking function of the sinoatrial node. Here, we will specifically review the state-of-the-art of cardiac pacemaker modeling, with a special emphasis on current discrepancies, limitations, and future challenges.

  1. Novel prediction model of renal function after nephrectomy from automated renal volumetry with preoperative multidetector computed tomography (MDCT).

    Science.gov (United States)

    Isotani, Shuji; Shimoyama, Hirofumi; Yokota, Isao; Noma, Yasuhiro; Kitamura, Kousuke; China, Toshiyuki; Saito, Keisuke; Hisasue, Shin-ichi; Ide, Hisamitsu; Muto, Satoru; Yamaguchi, Raizo; Ukimura, Osamu; Gill, Inderbir S; Horie, Shigeo

    2015-10-01

    A predictive model of postoperative renal function may affect the planning of nephrectomy. The aims were to develop a novel predictive model combining clinical indices with computer volumetry of the preserved renal cortex volume (RCV) on multidetector computed tomography (MDCT), and to prospectively validate the performance of the model. A total of 60 patients undergoing radical nephrectomy from 2011 to 2013 participated, including a development cohort of 39 patients and an external validation cohort of 21 patients. RCV was calculated by voxel count using software (Vincent, FUJIFILM). Renal function before and after radical nephrectomy was assessed via the estimated glomerular filtration rate (eGFR). Factors affecting postoperative eGFR were examined by regression analysis to develop the novel model for predicting postoperative eGFR with a backward elimination method. The predictive model was externally validated and its performance was compared with that of previously reported models. The postoperative eGFR value was associated with age, preoperative eGFR, preserved renal parenchymal volume (RPV), preserved RCV, % of RPV alteration, and % of RCV alteration (significant p-values). The combination of computer volumetry and clinical indices might yield an important tool for predicting postoperative renal function.
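
    A generic sketch of the backward-elimination regression workflow described above is given below, using statsmodels. The DataFrame, column names and the 0.05 retention threshold are illustrative assumptions, not values taken from the study.

```python
import pandas as pd
import statsmodels.api as sm

def backward_eliminate(df, outcome, candidates, p_remove=0.05):
    """Drop the least significant predictor until every remaining p-value < p_remove."""
    kept = list(candidates)
    while kept:
        X = sm.add_constant(df[kept])
        fit = sm.OLS(df[outcome], X).fit()
        pvals = fit.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < p_remove:
            return fit, kept                 # all remaining predictors are significant
        kept.remove(worst)                   # discard the weakest predictor and refit
    return None, []

# Hypothetical usage with the candidate predictors named in the abstract
# (df would hold one row per patient):
# fit, kept = backward_eliminate(
#     df, "post_eGFR",
#     ["age", "pre_eGFR", "preserved_RPV", "preserved_RCV",
#      "pct_RPV_change", "pct_RCV_change"])
```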

  2. The assessment of cognitive function in older adult patients with chronic kidney disease: an integrative review.

    Science.gov (United States)

    Hannan, Mary; Steffen, Alana; Quinn, Lauretta; Collins, Eileen G; Phillips, Shane A; Bronas, Ulf G

    2018-05-25

    Chronic kidney disease (CKD) is a common chronic condition in older adults that is associated with cognitive decline. However, the exact prevalence of cognitive impairment in older adults with CKD is unclear, likely due to the variety of methods utilized to assess cognitive function. The purpose of this integrative review is to determine how cognitive function is most frequently assessed in older adult patients with CKD. Five electronic databases were searched to explore relevant literature related to cognitive function assessment in older adult patients with CKD. Inclusion and exclusion criteria were created to focus the search on the assessment of cognitive function with standardized cognitive tests in older adults with CKD not on renal replacement therapy. Through the search methods, 36 articles were found that fulfilled the purpose of the review. There were 36 different types of cognitive tests utilized in the included articles, with each study utilizing between one and 12 tests. The most commonly utilized cognitive test was the Mini Mental State Exam (MMSE), followed by tests of digit symbol substitution and verbal fluency. The most commonly assessed aspect of cognitive function was global cognition. The assessment of cognitive function in older adults with CKD with standardized tests is completed in various ways. Unfortunately, the common methods of assessment may not fully examine the domains of impairment commonly found in older adults with CKD. Further research is needed to identify the ideal cognitive test to best assess older adults with CKD for cognitive impairment.

  3. Using ecological momentary assessment to investigate short-term variations in sexual functioning in a sample of peri-menopausal women from Iran.

    Directory of Open Access Journals (Sweden)

    Amir H Pakpour

    Full Text Available The investigation of short-term changes in female sexual functioning has received little attention so far. The aims of the study were to gain empirical knowledge on within-subject and within- and across-variable fluctuations in women's sexual functioning over time. More specifically, to investigate the stability of women´s self-reported sexual functioning and the moderating effects of contextual and interpersonal factors. A convenience sample of 206 women, recruited across eight Health care Clinics in Rasht, Iran. Ecological momentary assessment was used to examine fluctuations of sexual functioning over a six week period. A shortened version of the Female Sexual Function Index (FSFI was applied to assess sexual functioning. Self-constructed questions were included to assess relationship satisfaction, partner's sexual performance and stress levels. Mixed linear two-level model analyses revealed a link between orgasm and relationship satisfaction (Beta = 0.125, P = 0.074 with this link varying significantly between women. Analyses further revealed a significant negative association between stress and all six domains of women's sexual functioning. Women not only reported differing levels of stress over the course of the assessment period, but further differed from each other in how much stress they experienced and how much this influenced their sexual response. Orgasm and sexual satisfaction were both significantly associated with all other domains of sexual function (P<0.001. And finally, a link between partner performance and all domains of women`s sexual functioning (P<0.001 could be detected. Except for lubrication (P = 0.717, relationship satisfaction had a significant effect on all domains of the sexual response (P<0.001. Overall, our findings support the new group of criteria introduced in the DSM-5, called "associated features" such as partner factors and relationship factors. Consideration of these criteria is important and necessary for

  4. Assessment of macrovascular endothelial function using pulse wave analysis and its association with microvascular reactivity in healthy subjects.

    Science.gov (United States)

    Ibrahim, N N I N; Rasool, A H G

    2017-08-01

    Pulse wave analysis (PWA) and laser Doppler fluximetry (LDF) are non-invasive methods of assessing macrovascular endothelial function and microvascular reactivity respectively. The aim of this study was to assess the correlation between macrovascular endothelial function assessed by PWA and microvascular reactivity assessed by LDF. 297 healthy and non-smoking subjects (159 females, mean age (±SD) 23.56 ± 4.54 years) underwent microvascular reactivity assessment using LDF followed by macrovascular endothelial function assessments using PWA. Pearson's correlation showed no correlation between macrovascular endothelial function and microvascular reactivity (r = -0.10, P = 0.12). There was no significant correlation between macrovascular endothelial function assessed by PWA and microvascular reactivity assessed by LDF in healthy subjects. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
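
    The reported analysis reduces to a single Pearson correlation between the two measures. A minimal sketch with simulated stand-ins for the PWA and LDF indices (placeholder data, not the study's measurements) mirrors the near-zero correlation reported above:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
macro = rng.normal(size=297)        # stand-in for a PWA endothelial-function index
micro = rng.normal(size=297)        # stand-in for an LDF microvascular reactivity index
r, p = pearsonr(macro, micro)
print(f"r = {r:.2f}, P = {p:.2f}")  # independent noise gives r near 0
```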

  5. Psychological distress negatively affects self-assessment of shoulder function in patients with rotator cuff tears.

    Science.gov (United States)

    Potter, Michael Q; Wylie, James D; Greis, Patrick E; Burks, Robert T; Tashjian, Robert Z

    2014-12-01

    In many areas of orthopaedics, patients with greater levels of psychological distress report inferior self-assessments of pain and function. This effect can lead to lower-than-expected baseline scores on common patient-reported outcome scales, even those not traditionally considered to have a psychological component. This study attempts to answer the following questions: (1) Are higher levels of psychological distress associated with clinically important differences in baseline scores on the VAS for pain, the Simple Shoulder Test, and the American Shoulder and Elbow Surgeons score in patients undergoing arthroscopic rotator cuff repair? (2) Does psychological distress remain a negative predictor of baseline shoulder scores when other clinical variables are controlled? Eighty-five patients with full-thickness rotator cuff tears were prospectively enrolled. Psychological distress was quantified using the Distress Risk Assessment Method questionnaire. Patients completed baseline self-assessments including the VAS for pain, the Simple Shoulder Test, and the American Shoulder and Elbow Surgeons score. Age, sex, BMI, smoking status, American Society of Anesthesiologists classification, tear size, and tear retraction were recorded for each patient. Bivariate correlations and multivariate regression models were used to assess the effect of psychological distress on patient self-assessment of shoulder pain and function. Distressed patients reported higher baseline VAS scores (6.7 [95% CI, 4.4-9.0] versus 2.9 [95% CI, 2.3-3.6], p = 0.001) and lower baseline Simple Shoulder Test (3.7 [95% CI, 2.9-4.5] versus 5.7 [95% CI 5.0-6.4], p = 0.001) and American Shoulder and Elbow Surgeons scores (39 [95% CI, 34-45] versus 58 [95% CI, 53-63]). Higher levels of psychological distress are associated with inferior baseline patient self-assessment of shoulder pain and function using the VAS, the Simple Shoulder Test, and the American Shoulder and Elbow Surgeons score. Longitudinal followup is

  6. A simple model of self-assessment

    NARCIS (Netherlands)

    Dominguez-Martinez, S.; Swank, O.H.

    2009-01-01

    We develop a simple model that describes individuals' self-assessments of their abilities. We assume that individuals learn about their abilities from appraisals of others and experience. Our model predicts that if communication is imperfect, then (i) appraisals of others tend to be too positive and

  7. Modeling risk evolution of digestive tract functional violations when exposed to chemical environmental factors

    Directory of Open Access Journals (Sweden)

    M.R. Kamaltdinov

    2015-06-01

    Full Text Available Modern methods of health risk assessment are based on representing individual and public health as a dynamic process of “evolution”, which describes a continuous course of negative (and positive) changes in the condition of the body. The article presents a conceptual diagram of multilevel modeling of health risk evolution under the influence of environmental factors. The main aspects associated with the simulation of digestive processes at the “meso level” are considered, along with some results of solving the problem of flow in the antroduodenal area of the digestive tract, taking tract motility into account. Ways of further developing the model are outlined: accounting for biochemical reactions and for the secretory and absorptive functions of the tract. The proposed approach will make it possible not only to predict the risk of functional disorders of the digestive system, but also to take into account basic physiological processes and the mechanisms of intake, distribution and excretion of chemicals.

  8. A note on monotonicity of item response functions for ordered polytomous item response theory models.

    Science.gov (United States)

    Kang, Hyeon-Ah; Su, Ya-Hui; Chang, Hua-Hua

    2018-03-08

    A monotone relationship between a true score (τ) and a latent trait level (θ) has been a key assumption for many psychometric applications. The monotonicity property in dichotomous response models is evident as a result of a transformation via a test characteristic curve. Monotonicity in polytomous models, in contrast, is not immediately obvious because item response functions are determined by a set of response category curves, which are conceivably non-monotonic in θ. The purpose of the present note is to demonstrate strict monotonicity in ordered polytomous item response models. Five models that are widely used in operational assessments are considered for proof: the generalized partial credit model (Muraki, 1992, Applied Psychological Measurement, 16, 159), the nominal model (Bock, 1972, Psychometrika, 37, 29), the partial credit model (Masters, 1982, Psychometrika, 47, 147), the rating scale model (Andrich, 1978, Psychometrika, 43, 561), and the graded response model (Samejima, 1972, A general model for free-response data (Psychometric Monograph no. 18). Psychometric Society, Richmond). The study asserts that the item response functions in these models strictly increase in θ and thus there exists strict monotonicity between τ and θ under certain specified conditions. This conclusion validates the practice of customarily using τ in place of θ in applied settings and provides theoretical grounds for one-to-one transformations between the two scales. © 2018 The British Psychological Society.
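
    The property the note proves can be checked numerically for one of the models it covers, the generalized partial credit model: the expected item score increases strictly in theta. The sketch below uses arbitrary example item parameters; it is an illustration of the GPCM item response function, not code from the paper.

```python
import numpy as np

def gpcm_probs(theta, a, b):
    """Category probabilities P_k(theta), k = 0..K, for a GPCM item
    with discrimination a and step difficulties b[0..K-1]."""
    theta = np.atleast_1d(np.asarray(theta, dtype=float))
    b = np.asarray(b, dtype=float)
    z = a * (theta[:, None] - b[None, :])                                 # (T, K)
    cum = np.concatenate([np.zeros((theta.size, 1)), np.cumsum(z, axis=1)], axis=1)
    cum -= cum.max(axis=1, keepdims=True)                                 # numerical stability
    e = np.exp(cum)
    return e / e.sum(axis=1, keepdims=True)                               # (T, K+1)

theta = np.linspace(-4.0, 4.0, 401)
P = gpcm_probs(theta, a=1.2, b=[-1.0, 0.2, 1.5])      # an arbitrary 4-category item
irf = P @ np.arange(P.shape[1])                       # expected score E[X | theta]
assert np.all(np.diff(irf) > 0), "item response function should increase strictly in theta"
```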

  9. Assessment of multislice CT to quantify pulmonary emphysema function and physiology in a rat model

    Science.gov (United States)

    Cao, Minsong; Stantz, Keith M.; Liang, Yun; Krishnamurthi, Ganapathy; Presson, Robert G., Jr.

    2005-04-01

    Purpose: The purpose of this study is to evaluate multi-slice computed tomography technology for quantifying functional and physiologic changes in rats with pulmonary emphysema. Method: Seven rats were scanned using a 16-slice CT (Philips MX8000 IDT) before and after artificial inducement of emphysema. Functional parameters, i.e. lung volumes, were measured by non-contrast spiral scan during forced breath-hold at inspiration and expiration, followed by image segmentation based on an attenuation threshold. Dynamic CT imaging was performed immediately following the contrast injection to estimate physiologic changes. Pulmonary perfusion, fractional blood volume, and mean transit times (MTTs) were estimated by fitting the time-density curves of contrast material using a compartmental model. Results: The preliminary results indicated that the lung volumes of emphysema rats increased by 3.52+/-1.70 mL, and mean lung attenuation in the emphysema rats decreased by 91.76+/-68.11 HU. Perfusion values for normal and emphysema rats were 0.25+/-0.04 ml/s/ml and 0.32+/-0.09 ml/s/ml respectively. The fractional blood volumes for normal and emphysema rats were 0.21+/-0.04 and 0.15+/-0.02. There was a trend toward faster MTTs for emphysema rats (0.42+/-0.08 s) than normal rats (0.89+/-0.19 s). Conclusion: The use of multi-slice CT to quantify function and physiology in pulmonary emphysema appears promising for small animals.

  10. Hierarchical functional model for automobile development; Jidosha kaihatsu no tame no kaisogata kino model

    Energy Technology Data Exchange (ETDEWEB)

    Sumida, S [U-shin Ltd., Tokyo (Japan); Nagamatsu, M; Maruyama, K [Hokkaido Institute of Technology, Sapporo (Japan); Hiramatsu, S [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    A new approach to modeling is put forward in order to compose the virtual prototype that is indispensable for fully computer-integrated concurrent development of automobile products. A basic concept of the hierarchical functional model is proposed as the concrete form of this new modeling technology. This model is used mainly for explaining and simulating the functions and efficiencies of both the parts and the total automobile product. All engineers engaged in the design and development of automobiles can collaborate with one another using this model. Some application examples are shown, and the usefulness of this model is demonstrated. 5 refs., 5 figs.

  11. Exploring Moodle Functionality for Managing Open Distance Learning E-Assessments

    Directory of Open Access Journals (Sweden)

    Indira KONERU

    2017-10-01

    Full Text Available Current and emerging technologies enable Open Distance Learning (ODL) institutions to integrate e-Learning in innovative ways and add value to the existing teaching-learning and assessment processes. ODL e-Assessment systems have evolved from Computer Assisted/Aided Assessment (CAA) systems through intelligent assessment and feedback systems. E-Assessment (electronic assessment) connotes using electronic technology and tools to design and administer assessments, collect and store students' assessment evidence, grade performance, provide feedback and generate reports. The widely recognized advantages of e-Assessment over traditional, paper-based assessment include: lower long-term costs, instant feedback to students, greater flexibility with respect to location and timing, improved reliability with machine marking, improved impartiality, and enhanced question styles that incorporate interactivity and multimedia. The advent of Learning Management Systems (LMS), such as Moodle (Modular Object-Oriented Dynamic Learning Environment), paved the way for integrated advanced services for interactive dialogue, knowledge control at different stages of the distance-learning process, and e-Assessment systems. Moodle provides a complete integrated environment for handling all aspects of e-Assessment, from authoring questions through to reports for course teams (Butcher, 2008). This study explores how Moodle functionality supports diagnostic, formative, summative and competency-based assessments and facilitates ODL institutions in designing, administering and managing e-Assessments.

  12. Dose related risk and effect assessment model (DREAM) -- A more realistic approach to risk assessment of offshore discharges

    International Nuclear Information System (INIS)

    Johnsen, S.; Furuholt, E.

    1995-01-01

    Risk assessment of discharges from offshore oil and gas production to the marine environment features determination of potential environmental concentration (PEC) levels and no observed effect concentration (NOEC) levels. The PEC values are normally based on dilution of chemical components in the actual discharge source in the recipient, while the NOEC values are determined by applying a safety factor to acute toxic effects from laboratory tests. The DREAM concept focuses on realistic exposure doses as a function of contact time and dilution, rather than fixed exposure concentrations of chemicals in long-term exposure regimes. In its present state, the DREAM model is based on a number of assumptions with respect to the link between real-life exposure doses and effects observed in laboratory tests. A research project has recently been initiated to develop the concept further, with special focus on chronic effects of different chemical compounds on the marine ecosystem. One of the questions that will be addressed is the link between exposure time, dose, concentration and effect. Validation of the safety factors applied for transforming acute toxic data into NOEC values will also be included. The DREAM model has been used by Statoil for risk assessment of discharges from new and existing offshore oil and gas production fields, and has been found to give much more realistic results than conventional risk assessment tools. The presentation outlines the background for the DREAM approach, describes the model in its present state, discusses further developments and applications, and shows a number of examples of the performance of DREAM.
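
    For contrast with the dose-based DREAM approach, the conventional screening step described in the first sentences reduces to a simple ratio check. The sketch below is a generic illustration with made-up numbers; it is not Statoil data and not the DREAM model itself.

```python
def screening_risk_ratio(discharge_conc_mg_l, dilution_factor, noec_mg_l, safety_factor=10.0):
    """Conventional fixed-concentration screening: PEC versus NOEC / safety factor."""
    pec = discharge_conc_mg_l / dilution_factor   # predicted environmental concentration
    threshold = noec_mg_l / safety_factor         # 'safe' level derived from the lab NOEC
    return pec / threshold                        # a ratio > 1 flags potential risk

# Illustrative numbers only.
print(screening_risk_ratio(discharge_conc_mg_l=50.0, dilution_factor=1000.0, noec_mg_l=2.0))
```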

  13. Evaluation of right heart function in a rat model using modified echocardiographic views.

    Science.gov (United States)

    Bernardo, Ivan; Wong, James; Wlodek, Mary E; Vlahos, Ross; Soeding, Paul

    2017-01-01

    Echocardiography plays a major role in assessing cardiac function in animal models. We investigated the use of modified parasternal mid right-ventricular (MRV) and right ventricle (RV) outflow tract (RVOT) views in assessing RV size and function, and the suitability of advanced 2D-strain analysis. Fifteen WKY rats were examined using transthoracic echocardiography. The left heart was assessed using standard short- and long-axis views. For the right ventricle, MRV and RVOT views were used to measure RV chamber and free wall area. 2D-strain analysis was applied to both ventricles using off-line analysis. RV chamber volume was determined by injection of 2% agarose gel, and the RV free wall was dissected and weighed. Echocardiographic measurements were correlated with necropsy findings. The RV mid-ventricular dimension (R1) was 0.42±0.07 cm and the right ventricular outflow tract dimension (R2) was 0.34±0.06 cm; chamber end-diastolic area measurements were 0.38±0.09 cm2 and 0.29±0.08 cm2 for the MRV and RVOT views respectively. RVOT and MRV chamber areas correlated with gel mass. Doppler RV stroke volume was 0.32±0.08 ml, cardiac output (CO) was 110±27 ml.min-1, and RV free wall contractility could be assessed using 2D-strain analysis. We have shown that modified MRV and RVOT views can provide detailed assessment of the RV in rodents, with 2D-strain analysis of the RV free wall potentially feasible.

  14. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    Science.gov (United States)

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. Although the intent of the HGM approach is to use level of functioning as a

  15. AUTHENTIC SELF-ASSESSMENT MODEL FOR DEVELOPING STUDENTS' EMPLOYABILITY SKILLS IN HIGHER VOCATIONAL EDUCATION

    Directory of Open Access Journals (Sweden)

    I Made Suarta

    2015-06-01

    Abstract: The purpose of this research is to develop assessment tools to evaluate the achievement of employability skills that are integrated into the learning of database applications. The assessment model developed is a combination of self-assessment and authentic assessment, proposed as a model of authentic self-assessment. The steps of developing authentic self-assessment models include: identifying the standards, selecting an authentic task, identifying the criteria for the task, and creating the rubric. The assessment tools developed include: (1) a problem-solving skills assessment model, (2) a self-management skills assessment model, and (3) a database application competence assessment model. The model can be used to assess cognitive, affective, and psychomotor achievement. The results indicate that achievement of problem-solving and self-management abilities was in the good category, and competence in designing conceptual and logical databases was in the high category. The model also meets the basic principles of assessment, i.e. validity, reliability, focus on competencies, comprehensiveness, objectivity, and the principle of educating. Keywords: authentic assessment, self-assessment, problem solving skills, self-management skills, vocational education

  16. Apply Functional Modelling to Consequence Analysis in Supervision Systems

    DEFF Research Database (Denmark)

    Zhang, Xinxin; Lind, Morten; Gola, Giulio

    2013-01-01

    This paper will first present the purpose and goals of applying a functional modelling approach to consequence analysis by adopting Multilevel Flow Modelling (MFM). MFM models describe a complex system at multiple abstraction levels in both the means-end dimension and the whole-part dimension, and they contain causal relations between functions and goals. A rule-based system can be developed to trace the causal relations and perform consequence propagation. This paper will illustrate how to use MFM for consequence reasoning by using rule-based technology and describe the challenges of integrating functional consequence analysis into practical or online applications in supervision systems. It will also suggest a multiagent solution as the integration architecture for developing tools to facilitate the utilization of the results of functional consequence analysis. Finally, a prototype of the multiagent reasoning system...
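
    The rule-based consequence propagation mentioned above can be pictured as tracing a directed graph of causal relations. The toy graph and deviation names below are illustrative only; they are not an actual MFM model or the paper's prototype.

```python
from collections import deque

# Directed causal edges: a deviation (key) propagates to downstream deviations (values).
causes = {
    "pump_flow_low": ["tank_level_low"],
    "tank_level_low": ["cooling_insufficient"],
    "cooling_insufficient": ["temperature_high"],
}

def propagate(root):
    """Breadth-first trace of every consequence reachable from a root deviation."""
    seen, queue, consequences = {root}, deque([root]), []
    while queue:
        node = queue.popleft()
        for nxt in causes.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                consequences.append(nxt)
                queue.append(nxt)
    return consequences

print(propagate("pump_flow_low"))
# ['tank_level_low', 'cooling_insufficient', 'temperature_high']
```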

  17. Functional inverted Wishart for Bayesian multivariate spatial modeling with application to regional climatology model data.

    Science.gov (United States)

    Duan, L L; Szczesniak, R D; Wang, X

    2017-11-01

    Modern environmental and climatological studies produce multiple outcomes at high spatial resolutions. Multivariate spatial modeling is an established means to quantify cross-correlation among outcomes. However, existing models typically suffer from poor computational efficiency and lack the flexibility to simultaneously estimate auto- and cross-covariance structures. In this article, we undertake a novel construction of covariance by utilizing spectral convolution and by imposing an inverted Wishart prior on the cross-correlation structure. The cross-correlation structure with this functional inverted Wishart prior flexibly accommodates not only positive but also weak or negative associations among outcomes while preserving spatial resolution. Furthermore, the proposed model is computationally efficient and produces easily interpretable results, including the individual autocovariances and full cross-correlation matrices, as well as a partial cross-correlation matrix reflecting the outcome correlation after excluding the effects caused by spatial convolution. The model is examined using simulated data sets under different scenarios. It is also applied to the data from the North American Regional Climate Change Assessment Program, examining long-term associations between surface outcomes for air temperature, pressure, humidity, and radiation, on the land area of the North American West Coast. Results and predictive performance are compared with findings from approaches using convolution only or coregionalization.
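
    The prior named in the abstract is available off the shelf in SciPy, which is enough to sketch the basic building block: a draw of a cross-covariance matrix from an inverted Wishart and the cross-correlation it implies. The dimensions and hyperparameters below are arbitrary choices, not those used by the authors.

```python
import numpy as np
from scipy.stats import invwishart

p = 4                                  # e.g., temperature, pressure, humidity, radiation
nu = p + 2                             # degrees of freedom (must exceed p - 1)
scale = np.eye(p)                      # prior scale matrix

sigma = invwishart.rvs(df=nu, scale=scale, random_state=0)   # one prior draw (p x p)
d = np.sqrt(np.diag(sigma))
cross_corr = sigma / np.outer(d, d)    # implied cross-correlation matrix
print(np.round(cross_corr, 2))
```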

  18. Functional inverted Wishart for Bayesian multivariate spatial modeling with application to regional climatology model data

    Science.gov (United States)

    Duan, L. L.; Szczesniak, R. D.; Wang, X.

    2018-01-01

    Modern environmental and climatological studies produce multiple outcomes at high spatial resolutions. Multivariate spatial modeling is an established means to quantify cross-correlation among outcomes. However, existing models typically suffer from poor computational efficiency and lack the flexibility to simultaneously estimate auto- and cross-covariance structures. In this article, we undertake a novel construction of covariance by utilizing spectral convolution and by imposing an inverted Wishart prior on the cross-correlation structure. The cross-correlation structure with this functional inverted Wishart prior flexibly accommodates not only positive but also weak or negative associations among outcomes while preserving spatial resolution. Furthermore, the proposed model is computationally efficient and produces easily interpretable results, including the individual autocovariances and full cross-correlation matrices, as well as a partial cross-correlation matrix reflecting the outcome correlation after excluding the effects caused by spatial convolution. The model is examined using simulated data sets under different scenarios. It is also applied to the data from the North American Regional Climate Change Assessment Program, examining long-term associations between surface outcomes for air temperature, pressure, humidity, and radiation, on the land area of the North American West Coast. Results and predictive performance are compared with findings from approaches using convolution only or coregionalization. PMID:29576735

  19. Climate Modeling Computing Needs Assessment

    Science.gov (United States)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  20. A hybrid input–output multi-objective model to assess economic–energy–environment trade-offs in Brazil

    International Nuclear Information System (INIS)

    Carvalho, Ariovaldo Lopes de; Antunes, Carlos Henggeler; Freire, Fausto; Henriques, Carla Oliveira

    2015-01-01

    A multi-objective linear programming (MOLP) model based on a hybrid Input–Output (IO) framework is presented. This model aims at assessing the trade-offs between economic, energy, environmental (E3) and social objectives in the Brazilian economic system. This combination of multi-objective models with Input–Output Analysis (IOA) plays a supplementary role in understanding the interactions between the economic and energy systems, and the corresponding impacts on the environment, offering a consistent framework for assessing the effects of distinct policies on these systems. Firstly, the System of National Accounts (SNA) is reorganized to include the National Energy Balance, creating a hybrid IO framework that is extended to assess Greenhouse Gas (GHG) emissions and the employment level. The objective functions considered are the maximization of GDP (gross domestic product) and employment levels, as well as the minimization of energy consumption and GHG emissions. An interactive method enabling a progressive and selective search of non-dominated solutions with distinct characteristics and underlying trade-offs is utilized. Illustrative results indicate that the maximization of GDP and employment levels leads to an increase in both energy consumption and GHG emissions, while the minimization of either GHG emissions or energy consumption causes negative impacts on GDP and employment. - Highlights: • A hybrid Input–Output multi-objective model is applied to the Brazilian economy. • Objective functions are GDP, employment level, energy consumption and GHG emissions. • Interactive search process identifies trade-offs between the competing objectives. • Positive correlations between GDP growth and employment. • Positive correlations between energy consumption and GHG emissions.
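
    One common way to expose the trade-offs such a multi-objective linear program describes is to scalarize the objectives with weights and solve the resulting LP repeatedly. The two-sector sketch below does exactly that with made-up coefficients and SciPy's linprog; it is a generic illustration, not the Brazilian hybrid IO model or the interactive method used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Per-unit-activity coefficients for two aggregate sectors (made-up numbers).
gdp    = np.array([1.0, 0.8])   # value added
jobs   = np.array([0.5, 1.2])   # employment
energy = np.array([0.9, 0.3])   # energy use
ghg    = np.array([1.1, 0.2])   # GHG emissions

# Weighted-sum scalarization: reward GDP and jobs, penalize energy use and GHG.
w = {"gdp": 1.0, "jobs": 0.5, "energy": 0.3, "ghg": 0.3}
c = -(w["gdp"] * gdp + w["jobs"] * jobs) + w["energy"] * energy + w["ghg"] * ghg

A_ub = [energy, ghg]            # cap total energy use and total emissions
b_ub = [100.0, 80.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, 120.0)] * 2, method="highs")

x = res.x
print("activity levels:", x)
print("GDP:", gdp @ x, "jobs:", jobs @ x, "energy:", energy @ x, "GHG:", ghg @ x)
# Re-solving with different weight vectors traces out different non-dominated solutions.
```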