Crisóstomo, Verónica; Sun, Fei; Maynar, Manuel; Báez-Díaz, Claudia; Blanco, Virginia; Garcia-Lindo, Monica; Usón-Gargallo, Jesús; Sánchez-Margallo, Francisco Miguel
Cardiovascular diseases are a major health concern and therefore an important topic in biomedical research. Large animal models allow researchers to assess the safety and efficacy of new cardiovascular procedures in systems that resemble human anatomy; additionally, they can be used to emulate scenarios for training purposes. Among the many biomedical models that are described in published literature, it is important that researchers understand and select those that are best suited to achieve the aims of their research, that facilitate the humane care and management of their research animals and that best promote the high ethical standards required of animal research. In this resource the authors describe some common swine models that can be easily incorporated into regular practices of research and training at biomedical institutions. These models use both native and altered vascular anatomy of swine to carry out research protocols, such as testing biological reactions to implanted materials, surgically creating aneurysms using autologous tissue and inducing myocardial infarction through closed-chest procedures. Such models can also be used for training, where native and altered vascular anatomy allow medical professionals to learn and practice challenging techniques in anatomy that closely simulates human systems.
Fu, L.; Ma, X.; West, P.; Beaulieu, S. E.; Di Stefano, M.; Fox, P. A.
Provenance is information about entities, activities, and people involved in producing a piece of data or thing, which can be used to form assessments about its quality, reliability, or trustworthiness. In a research publication, provenance covers the entities, activities, and people involved in the process leading to the parts of the publication, such as figures, tables, and paragraphs. Such information helps readers correctly interpret publication content and enables them to evaluate the credibility of the reported results by digging into the software used, the source data, and the responsible agents, or even by reproducing the results themselves. In this presentation, we will describe our ontology designed to model the preparation process of research publications, based on our experience from two projects, both focused on provenance capture for research publications. The first project captures provenance information for a National Climate Assessment (NCA) report of the US Global Change Research Program (USGCRP), and the second captures provenance information for an Ecosystem Status Report (ESR) of the Northeast Fisheries Science Center (NEFSC). Both projects base their provenance modeling on the W3C Provenance Ontology (PROV-O), which has proven to be an effective basis for provenance-capture models. We will illustrate the commonalities and differences between the use cases of these two projects and how we derive a common model from the models specifically designed to capture provenance information for each project.
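The PROV-O core pattern the abstract refers to (entities, activities, and agents) can be sketched in a few lines. This is a minimal illustration, not the authors' actual model: the `prov:` terms are from the PROV-O vocabulary, but the specific figure, dataset, script, and analyst identifiers (`ex:figure2` etc.) are invented for the example.

```python
# Minimal sketch of the W3C PROV-O core pattern (Entity, Activity, Agent)
# applied to a research-publication figure. Identifiers in the "ex:" namespace
# are hypothetical; only the prov: terms come from the PROV-O vocabulary.
PROV = "http://www.w3.org/ns/prov#"

triples = [
    # The figure is an Entity derived from a source dataset.
    ("ex:figure2", "rdf:type", PROV + "Entity"),
    ("ex:figure2", PROV + "wasDerivedFrom", "ex:sourceDataset"),
    # The plotting run is the Activity that generated it.
    ("ex:plottingRun", "rdf:type", PROV + "Activity"),
    ("ex:figure2", PROV + "wasGeneratedBy", "ex:plottingRun"),
    ("ex:plottingRun", PROV + "used", "ex:sourceDataset"),
    # The analyst is the responsible Agent.
    ("ex:analyst", "rdf:type", PROV + "Agent"),
    ("ex:figure2", PROV + "wasAttributedTo", "ex:analyst"),
]

def provenance_of(entity, triples):
    """Return all statements about one entity (a trivial 'dig into' query)."""
    return [(p, o) for s, p, o in triples if s == entity]

for p, o in provenance_of("ex:figure2", triples):
    print(p, o)
```

Querying `provenance_of("ex:figure2", ...)` returns the derivation, generation, and attribution statements, which is exactly the kind of "digging into the software, source data, and responsible agents" the abstract describes.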
Balakrishna, S.; Acheson, Michael J.
Recent NASA Common Research Model (CRM) tests at the Langley National Transonic Facility (NTF) and Ames 11-foot Transonic Wind Tunnel (11-foot TWT) have generated an experimental database for CFD code validation. The database consists of force and moment, surface pressures and wideband wing-root dynamic strain/wing Kulite data from continuous sweep pitch polars. The dynamic data sets, acquired at 12,800 Hz sampling rate, are analyzed in this study to evaluate CRM wing buffet onset and potential CRM wing flow separation.
Chan, Y. N.; Harmin, M. Y.; Rafie, A. S. M.
A baseline model referencing the NASA Common Research Model is selected to illustrate the concept of aeroelastic tailoring of the wing rib structure by varying the orientation of the ribs. This allows the torsional-bending mode characteristics to be altered, opening the possibility of improved aeroelastic performance without compromising overall weight. Two strategies are implemented in this work: the first orients all ribs in parallel, while the second divides the wing into six parts in which the ribs may be oriented in parallel only within their own part. In many cases there is a significant improvement, with flutter speed increases of up to 5.5% obtained by reorienting the rib sections, which in turn opens the possibility of significant structural weight reduction.
With increased computing power, more data than ever are being and will be produced, stored, and (re-)used. Data are collected in databases, computed and annotated, or transformed by specific tools. The knowledge derived from data is documented in research publications, reports, presentations, or other types of files. The management of data and knowledge is difficult, and their re-use, exchange, or integration is even more complicated. To allow quality analysis or integration across data sets and to ensure access to scientific knowledge, additional information - Research Information - has to be assigned to data and knowledge entities. We present the metadata model CERIF, which adds information to entities such as Publication, Project, Organisation, Person, Product, Patent, Service, Equipment, and Facility and manages the semantically enhanced relationships between these entities in a formalized way. CERIF was released as an EC Recommendation to European Member States in 2000. Here, we refer to the latest version, CERIF 2008-1.0.
Ayodeji Emmanuel Oke
The use of structural equation modelling (SEM) in construction-related research has been increasing over the years. The essence of this study is not to compare the level of usage of SEM with other modelling methods, nor to examine its extent of adoption in construction management - as this has been researched in previous works - but to arrive at a common ground for future construction-related research, based on the findings and recommendations of existing studies on SEM. Research materials within and outside the field of construction management were reviewed, and it was found that SEM using AMOS (the covariance approach) is the most appropriate method for construction research studies. This is not just because it is the most widely available of the software programs, but because of the numerous benefits and advantages highlighted in previous studies. The study also recommends an appropriate sample size as well as cut-off values for the various goodness-of-fit tests required of an SEM model.
Rivers, Melissa B.; Balakrishna, S.
The NASA Common Research Model (CRM) high Reynolds number transonic wind tunnel testing program was established to generate an experimental database for applied Computational Fluid Dynamics (CFD) validation studies. During transonic wind tunnel tests, the CRM encounters large sting vibrations when the angle of attack approaches the second pitching moment break, which can sometimes become divergent. CRM transonic test data analysis suggests that sting divergent oscillations are related to negative net sting damping episodes associated with flow separation instability. The National Transonic Facility (NTF) has been addressing remedies to extend polar testing up to and beyond the second pitching moment break point of the test articles using an active piezoceramic damper system for both ambient and cryogenic temperatures. This paper reviews CRM test results to gain understanding of sting dynamics with a simple model describing the mechanics of a sting-model system and presents the performance of the damper under cryogenic conditions.
Kim, Hyeoneui; Choi, Jeeyae; Jang, Imho; Quach, Jimmy; Ohno-Machado, Lucila
We explored the feasibility of representing nursing research data with the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM) to understand the challenges and opportunities in representing various types of health data not limited to diseases and drug treatments. We collected 1,431 unique data items from 256 nursing articles and mapped them to the OMOP CDM. A deeper level of mapping was explored by simulating 10 data search use cases. Although the majority of the data could be represented in the OMOP CDM, potential information loss was identified in content related to patient-reported outcomes, socio-economic information, and locally developed nursing intervention protocols. These areas will be further investigated in a follow-up study. We will use lessons learned in this study to inform the metadata development efforts for data discovery.
Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia
The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast matching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.
Čantrak Đorđe S.
The paper presents a high-speed stereo particle image velocimetry investigation of the NASA Common Research Model wing-tip vortex. A three-percent scaled semi-span model, without nacelle and pylon, was tested in the 32- by 48-inch Indraft tunnel at the Fluid Mechanics Laboratory at the NASA Ames Research Center. A turbulence investigation of the wing-tip vortex is presented. Measurements of the wing-tip vortex were performed in a vertical cross-stream plane three tip-chords downstream of the wing-tip trailing edge with a 2 kHz sampling rate. Experimental data are analyzed in the invariant anisotropy maps for three angles of attack (0°, 2°, and 4°) at the same tunnel speed (V∞ = 50 m/s). This corresponds to a chord Reynolds number of 2.68x10^5, where the chord length of 3 in is taken as the characteristic length. The region of interest was x = 220 mm and y = 90 mm. A total of 20,000 particle image velocimetry samples were acquired at each condition. Velocity fields and turbulence statistics are given for all cases, as well as the turbulence structure in light of invariant theory. Prediction of wing-tip vortices remains a challenge for computational fluid dynamics codes due to significant pressure and velocity gradients. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. TR 35046]
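The quoted chord Reynolds number can be checked against the stated quantities. The sketch below assumes a kinematic viscosity of air of about 1.42x10^-5 m^2/s at the tunnel condition; that value is an assumption, not given in the abstract.

```python
# Chord Reynolds number check: Re = V * c / nu, using the quantities
# stated in the abstract. The kinematic viscosity nu is an assumed value.
V = 50.0          # freestream speed, m/s (stated)
c = 3 * 0.0254    # chord length: 3 inches converted to metres (stated)
nu = 1.42e-5      # assumed kinematic viscosity of air, m^2/s

Re = V * c / nu
print(f"Re = {Re:.3g}")
```

With these inputs the result is on the order of 2.7x10^5, consistent with the quoted 2.68x10^5.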
Macrae, I M
This review describes the most commonly used rodent models and outcome measures in preclinical stroke research and discusses their strengths and limitations. Most models involve permanent or transient middle cerebral artery occlusion with therapeutic agents tested for their ability to reduce stroke-induced infarcts and improve neurological deficits. Many drugs have demonstrated preclinical efficacy but, other than thrombolytics, which restore blood flow, none have demonstrated efficacy in clinical trials. This failure to translate efficacy from bench to bedside is discussed alongside achievable steps to improve the ability of preclinical research to predict clinical efficacy: (i) Improvements in study quality and reporting. Study design must include randomization, blinding and predefined inclusion/exclusion criteria, and journal editors have the power to ensure statements on these and mortality data are included in preclinical publications. (ii) Negative and neutral studies must be published to enable preclinical meta-analyses and systematic reviews to more accurately predict drug efficacy in man. (iii) Preclinical groups should work within networks and agree on standardized procedures for assessing final infarct and functional outcome. This will improve research quality, timeliness and translational capacity. (iv) Greater uptake of and improvements in non-invasive diagnostic imaging to detect and study potentially salvageable penumbral tissue, the target for acute neuroprotection. Drug effects on penumbra lifespan studied serially, followed by assessment of behavioural outcome and infarct within the same animal group, will increase the power to detect drug efficacy preclinically. Similar progress in detecting drug efficacy clinically will follow from patient recruitment into acute stroke trials based on evidence of remaining penumbra. © 2011 The Author. British Journal of Pharmacology © 2011 The British Pharmacological Society.
Jutte, Christine V.; Stanford, Bret K.; Wieseman, Carol D.
This work explores the use of alternative internal structural designs within a full-scale wing box structure for aeroelastic tailoring, with a focus on curvilinear spars, ribs, and stringers. The baseline wing model is a fully-populated, cantilevered wing box structure of the Common Research Model (CRM). Metrics of interest include the wing weight, the onset of dynamic flutter, and the static aeroelastic stresses. Twelve parametric studies alter the number of internal structural members along with their location, orientation, and curvature. Additional evaluation metrics are considered to identify design trends that lead to lighter-weight, aeroelastically stable wing designs. The best designs of the individual studies are compared and discussed, with a focus on weight reduction and flutter resistance. The largest weight reductions were obtained by removing the inner spar, and performance was maintained by shifting stringers forward and/or using curvilinear ribs: a 5.6% weight reduction and a 13.9% improvement in flutter speed, but a 3.0% increase in stress levels. Flutter resistance was also maintained using straight, rotated ribs, although that design had a 4.2% lower flutter speed than the curved ribs of similar weight, and its stress levels were higher. For some configurations, the differences between curved and straight ribs were smaller, which provides motivation for future optimization-based studies to fully exploit the trade-offs.
This thesis will examine potential propulsive and aerodynamic benefits of integrating a boundary-layer ingestion (BLI) propulsion system with a typical commercial aircraft using the Common Research Model geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment will be used to generate engine conditions for CFD analysis. Improvements to the BLI geometry will be made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs are presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2 deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.3% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from boundary-layer ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.
Blumenthal, Brennan T.; Elmiligui, Alaa; Geiselhart, Karl A.; Campbell, Richard L.; Maughmer, Mark D.; Schmitz, Sven
The present paper examines potential propulsive and aerodynamic benefits of integrating a Boundary-Layer Ingestion (BLI) propulsion system into a typical commercial aircraft using the Common Research Model (CRM) geometry and the NASA Tetrahedral Unstructured Software System (TetrUSS). The Numerical Propulsion System Simulation (NPSS) environment is used to generate engine conditions for CFD analysis. Improvements to the BLI geometry are made using the Constrained Direct Iterative Surface Curvature (CDISC) design method. Previous studies have shown reductions of up to 25% in terms of propulsive power required for cruise for other axisymmetric geometries using the BLI concept. An analysis of engine power requirements, drag, and lift coefficients using the baseline and BLI geometries coupled with the NPSS model are shown. Potential benefits of the BLI system relating to cruise propulsive power are quantified using a power balance method, and a comparison to the baseline case is made. Iterations of the BLI geometric design are shown and any improvements between subsequent BLI designs are presented. Simulations are conducted for a cruise flight condition of Mach 0.85 at an altitude of 38,500 feet and an angle of attack of 2 deg for all geometries. A comparison between available wind tunnel data, previous computational results, and the original CRM model is presented for model verification purposes along with full results for BLI power savings. Results indicate a 14.4% reduction in engine power requirements at cruise for the BLI configuration over the baseline geometry. Minor shaping of the aft portion of the fuselage using CDISC has been shown to increase the benefit from Boundary-Layer Ingestion further, resulting in a 15.6% reduction in power requirements for cruise as well as a drag reduction of eighteen counts over the baseline geometry.
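The bookkeeping behind the quoted savings can be illustrated with a short sketch. The actual power-balance method is far more involved; here the baseline power value is a placeholder (arbitrary units), and only the quoted percentages and the standard convention that one drag count equals 0.0001 in drag coefficient are taken as given.

```python
# Percent power reduction and drag-count bookkeeping for a BLI comparison.
# P_baseline is a placeholder (arbitrary units); only the quoted percentages
# and the standard convention 1 drag count = 0.0001 in drag coefficient
# are taken as given.
def percent_reduction(baseline, improved):
    """Power saving as a percentage of the baseline requirement."""
    return 100.0 * (baseline - improved) / baseline

P_baseline = 100.0                       # placeholder cruise power requirement
P_bli = P_baseline * (1 - 0.144)         # 14.4% reduction quoted for the BLI case
P_bli_cdisc = P_baseline * (1 - 0.156)   # 15.6% after CDISC aft-fuselage shaping

# The quoted drag reduction of eighteen counts corresponds to
# a drag-coefficient change of 18 * 0.0001 = 0.0018.
delta_Cd = round(18 * 1e-4, 4)

print(percent_reduction(P_baseline, P_bli), delta_Cd)
```

The CDISC reshaping thus buys an additional 1.2 percentage points of power saving over the unshaped BLI geometry in this accounting.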
Acheson, Michael J.; Balakrishna, S.
Recent tests using the Common Research Model (CRM) at the Langley National Transonic Facility (NTF) and the Ames 11-Foot Transonic Wind Tunnel (11-ft TWT) produced large sets of data that have been used to examine the effects of active damping on transonic tunnel aerodynamic data quality. In particular, large, statistically significant sets of repeat data demonstrate that the active damping system had no apparent effect on drag, lift, and pitching-moment repeatability during warm testing conditions, while simultaneously enabling aerodynamic data to be obtained post-stall. A small set of cryogenic (high Reynolds number) repeat data was obtained at the NTF and again showed a negligible effect on data repeatability. However, due to a degradation of the active damping system's control power at cryogenic temperatures, the ability to obtain test data post-stall was not achieved during cryogenic testing. Additionally, comparisons of data repeatability between NTF and 11-ft TWT CRM data led to further (warm) testing at the NTF, which demonstrated that, for a modest increase in data sampling time, a factor of 2-3 improvement in drag and pitching-moment repeatability was readily achieved, unrelated to the active damping system.
Jutte, Christine V.; Stanford, Bret K.; Wieseman, Carol D.; Moore, James B.
This work explores the use of tow-steered composite laminates, functionally graded metals (FGM), thickness distributions, and curvilinear rib/spar/stringer topologies for aeroelastic tailoring. Parameterized models of the Common Research Model (CRM) wing box have been developed for passive aeroelastic tailoring trade studies. Metrics of interest include the wing weight, the onset of dynamic flutter, and the static aeroelastic stresses. Compared to a baseline structure, the lowest aggregate static wing stresses could be obtained with tow-steered skins (47% improvement), and many of these designs could reduce weight as well (up to 14%). For these structures, the trade-off between flutter speed and weight is generally strong, although one case showed both a 100% flutter improvement and a 3.5% weight reduction. Material grading showed no benefit in the skins, but moderate flutter speed improvements (with no weight or stress increase) could be obtained by grading the spars (4.8%) or ribs (3.2%), where the best flutter results were obtained by grading both thickness and material. For the topology work, large weight reductions were obtained by removing an inner spar, and performance was maintained by shifting stringers forward and/or using curvilinear ribs: a 5.6% weight reduction and a 13.9% improvement in flutter speed, but a 3.0% increase in stress levels. Flutter resistance was also maintained using straight, rotated ribs, although that design had a 4.2% lower flutter speed than the curved ribs of similar weight, and its stress levels were higher. These results will guide the development of a future design optimization scheme established to exploit and combine the individual attributes of these technologies.
Uhlir, Paul F. [Board on Research Data and Information Policy and Global Affairs, Washington, DC (United States)
Recent decades have witnessed an ever-increasing range and volume of digital data. All elements of the pillars of science--whether observation, experiment, or theory and modeling--are being transformed by the continuous cycle of generation, dissemination, and use of factual information. This is even more so in terms of the re-using and re-purposing of digital scientific data beyond the original intent of the data collectors, often with dramatic results. We all know about the potential benefits and impacts of digital data, but we are also aware of the barriers and the challenges in maximizing the access and use of such data. There is thus a need to think about how a data infrastructure can enhance capabilities for finding, using, and integrating information to accelerate discovery and innovation. How can we best implement an accessible, interoperable digital environment so that the data can be repeatedly used by a wide variety of users in different settings and with different applications? With this objective in mind, and using microbial communities and microbial data, literature, and the research materials themselves as a test case, the Board on Research Data and Information held an International Symposium on Designing the Microbial Research Commons at the National Academy of Sciences in Washington, DC on 8-9 October 2009. The symposium addressed topics such as models to lower the transaction costs and support access to and use of microbiological materials and digital resources from the perspective of publicly funded research, public-private interactions, and developing country concerns. The overall goal of the symposium was to stimulate more research and implementation of improved legal and institutional models for publicly funded research in microbiology.
The primary purpose of this column is to focus on several common core concepts that are foundational to qualitative research. Discussion of these concepts is at an introductory level and is designed to raise awareness and understanding of several conceptual foundations that undergird qualitative research. Because of the variety of qualitative approaches, not all concepts are relevant to every design and tradition. However, foundational aspects were selected for highlighting.
Watkins, A. Neal; Lipford, William E.; Leighty, Bradley D.; Goodman, Kyle Z.; Goad, William K.; Goad, Linda R.
This report will serve to present results of a test of the pressure sensitive paint (PSP) technique on the Common Research Model (CRM). This test was conducted at the National Transonic Facility (NTF) at NASA Langley Research Center. PSP data was collected on several surfaces with the tunnel operating in both cryogenic mode and standard air mode. This report will also outline lessons learned from the test as well as possible approaches to challenges faced in the test that can be applied to later entries.
Engel, Martin; Do-Ha, Dzung; Muñoz, Sonia Sanz; Ooi, Lezanne
Induced pluripotent stem cells and embryonic stem cells have revolutionized cellular neuroscience, providing the opportunity to model neurological diseases and test potential therapeutics in a pre-clinical setting. The power of these models has been widely discussed, but the potential pitfalls of stem cell differentiation in this research are less well described. We have analyzed the literature that describes differentiation of human pluripotent stem cells into three neural cell types that are commonly used to study diseases, including forebrain cholinergic neurons for Alzheimer's disease, midbrain dopaminergic neurons for Parkinson's disease and cortical astrocytes for neurodegenerative and psychiatric disorders. Published protocols for differentiation vary widely in the reported efficiency of target cell generation. Additionally, characterization of the cells by expression profile and functionality differs between studies and is often insufficient, leading to highly variable protocol outcomes. We have synthesized this information into a simple methodology that can be followed when performing or assessing differentiation techniques. Finally we propose three considerations for future research, including the use of physiological O2 conditions, three-dimensional co-culture systems and microfluidics to control feeding cycles and growth factor gradients. Following these guidelines will help researchers to ensure that robust and meaningful data is generated, enabling the full potential of stem cell differentiation for disease modeling and regenerative medicine.
Truth is for sale today, some critics claim. The increased commodification of science corrupts it, scientific fraud is rampant and the age-old trust in science is shattered. This cynical view, although gaining in prominence, does not explain very well the surprising motivation and integrity that is still central to the scientific life. Although scientific knowledge becomes more and more treated as a commodity or as a product that is for sale, a central part of academic scientific practice is still organized according to different principles. In this paper, I critically analyze alternative models for understanding the organization of knowledge, such as the idea of the scientific commons and the gift economy of science. After weighing the diverse positive and negative aspects of free market economies of science and gift economies of science, a commons structured as a gift economy seems best suited to preserve and take advantage of the specific character of scientific knowledge. Furthermore, commons and gift economies promote the rich social texture that is important for supporting central norms of science. Some of these basic norms might break down if the gift character of science is lost. To conclude, I consider the possibility and desirability of hybrid economies of academic science, which combine aspects of gift economies and free market economies. The aim of this paper is to gain a better understanding of these deeper structural challenges faced by science policy. Such theoretical reflections should eventually assist us in formulating new policy guidelines.
Bell, James H.
The luminescence lifetime technique was used to make pressure-sensitive paint (PSP) measurements on a 2.7% Common Research Model in the NASA Ames 11ft Transonic Wind Tunnel. PSP data were obtained on the upper and lower surfaces of the wing and horizontal tail, as well as one side of the fuselage. Data were taken for several model attitudes of interest at Mach numbers between 0.70 and 0.87. Image data were mapped onto a three-dimensional surface grid suitable both for comparison with CFD and for integration of pressures to determine loads. Luminescence lifetime measurements were made using strobed LED (light-emitting diode) lamps to illuminate the PSP and fast-framing interline transfer cameras to acquire the PSP emission.
Rivers, Melissa; Quest, Juergen; Rudnik, Ralf
Experimental aerodynamic investigations of the NASA Common Research Model have been conducted in the NASA Langley National Transonic Facility, the NASA Ames 11-ft wind tunnel, and the European Transonic Wind Tunnel. In the NASA Ames 11-ft wind tunnel, data have been obtained only at a chord Reynolds number of 5 million for a wing/body/tail configuration with 0-degree tail incidence. Data have been obtained at chord Reynolds numbers of 5, 19.8, and 30 million for the same configuration in the National Transonic Facility and in the European Transonic Wind Tunnel. Force and moment, surface pressure, wing bending and twist, and surface flow visualization data were obtained in all three facilities, but only the force and moment and surface pressure data are presented herein.
Nakao, Kazuwa; Yasoda, Akihiro; Ebihara, Ken; Hosoda, Kiminori; Mukoyama, Masashi
Since the 1980s, a number of bioactive molecules, now known as cardiovascular hormones, have been isolated from the heart and blood vessels, particularly from the subset of vascular endothelial cells. The natriuretic peptide family is the prototype of the cardiovascular hormones. Over the following decade, a variety of hormones and cytokines, now known as adipokines or adipocytokines, have also been isolated from adipose tissue. Leptin is the only adipokine demonstrated to cause an obese phenotype in both animals and humans upon deletion. Thus, the past two decades have seen the identification of two important classes of bioactive molecules secreted by newly recognized endocrine cells, both of which differentiate from mesenchymal stem cells. To assess the physiological and clinical implications of these novel hormones, we have investigated their functions using animal models. We have also developed and analyzed mice overexpressing transgenic forms of these proteins and knockout mice deficient in these and related genes. Here, we demonstrate the current state of the translational research of these novel hormones, the natriuretic peptide family and leptin, and discuss how lessons learned from excellent animal models and rare human diseases can provide a better understanding of common human diseases.
The development of energy-efficient, frequency-controlled traction induction motors for hybrid-drive vehicles creates practical interest in new methods of testing and simulation. It is desirable to test these machines with energy recovery in a motor-generator arrangement, where a common shaft unites the two machines. At present, simulation of the motor-generator system is carried out on simplified models that neglect saturation, skin effect, toothed cores, and the non-sinusoidal voltage of frequency converters. A refined, interrelated mathematical model of an asynchronous motor and generator operating on a common shaft is presented, based on the theory of electrical circuits and field theory. The models allow coupled modelling and study of static and dynamic modes of the electrical machines, taking into account saturation, skin effect, toothed cores, the non-sinusoidal voltage of the frequency inverter, and variation of the winding parameters.
Mayo Fuster Morell
This paper addresses an emerging phenomenon characterized by continuous change and experimentation: the collaborative commons creation of audiovisual content online. The analysis focuses on models of sustainability of collaborative online creation, paying particular attention to the use of different forms of advertising. This article is an excerpt of a larger investigation whose units of analysis are Online Creation Communities whose central node of activity is the Catalan territory. From 22 selected cases, the methodology combines quantitative analysis, through a questionnaire delivered to all cases, and qualitative analysis, through face-to-face interviews conducted in 8 of the cases studied. The research, whose conclusions we summarize in this article, leads us to conclude that the sustainability of such projects depends largely on relationships of trust and interdependence between different voluntary agents, on non-monetary contributions and rewards, and on resources and infrastructure available for free use. Altogether, this leads us to understand that this is and will be a very important area for the future of audiovisual content and its sustainability, which will imply changes in the policies that govern it.
The choice of a common metric for the meta-analysis (quantitative synthesis) of correlational and experimental research studies is presented and justified. First, a background for the problem of identifying a common metric is presented. Second, the percentage of accounted variance (PAV) is described as the metric of choice, and reasons are given…
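As a sketch of how such a common metric lets correlational and experimental results be synthesized together: a correlation r converts directly to percentage of accounted variance (PAV = 100·r²), and a standardized mean difference d can first be converted to r using the standard equal-groups formula. The numbers below are purely illustrative:

```python
import math

def pav_from_r(r):
    """Percentage of accounted variance from a correlation coefficient."""
    return 100.0 * r ** 2

def r_from_d(d):
    """Convert a standardized mean difference d to r (equal group sizes),
    using the standard conversion r = d / sqrt(d^2 + 4)."""
    return d / math.sqrt(d ** 2 + 4)

# A correlational study reporting r = 0.50 and an experimental study
# reporting d = 0.80 can be compared on the same PAV scale:
print(round(pav_from_r(0.50), 1))           # 25.0
print(round(pav_from_r(r_from_d(0.80)), 1))  # 13.8
```

On this common scale, the r = 0.50 study accounts for 25% of variance while the d = 0.80 experiment accounts for about 13.8%, making the two directly comparable in a meta-analysis.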
Akerlind, Gerlese S.
This paper focuses on the data analysis stage of phenomenographic research, elucidating what is involved in terms of both commonality and variation in accepted practice. The analysis stage of phenomenographic research is often not well understood. This paper helps to clarify the process, initially by collecting together in one location the more…
Rivers, Melissa; Hunter, Craig; Vatsa, Veer
Two Navier-Stokes codes were used to compute flow over the High-Lift Common Research Model (HL-CRM) in preparation for a wind tunnel test to be performed at the NASA Langley Research Center 14-by-22-Foot Subsonic Tunnel in fiscal year 2018. Both flight and wind tunnel conditions were simulated by the two codes at set Mach numbers and Reynolds numbers over a full angle-of-attack range for three configurations: cruise, landing and takeoff. Force curves, drag polars and surface pressure contour comparisons are shown for the two codes. The lift and drag curves compare well for the cruise configuration up to a 10° angle of attack but not as well for the other two configurations. The drag polars compare reasonably well for all three configurations. The surface pressure contours compare well for some of the conditions modeled but not as well for others.
Stott, James E.; Britton, Paul; Ring, Robert W.; Hark, Frank; Hatfield, G. Spencer
Aggregate nuclear plant failure data is used to produce generic common-cause factors specifically for use in the common-cause failure models of NUREG/CR-5485. Furthermore, the models presented in NUREG/CR-5485 are specifically designed to incorporate two significantly distinct assumptions about the surveillance-testing methods from which this aggregate failure data came. What are the implications of using these NUREG generic factors to model the common-cause failures of aerospace systems? Herein, the implications of using the NUREG generic factors in the modeling of aerospace systems are investigated in detail, and strong recommendations for modeling the common-cause failures of aerospace systems are given.
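The alpha-factor model is one of the common-cause failure models presented in NUREG/CR-5485, and the testing-scheme assumption shows up directly in its basic-event formulas. The sketch below implements the alpha-factor expressions as they are commonly stated (non-staggered and staggered testing); the alpha values and total failure probability are illustrative, not the NUREG generic factors themselves:

```python
from math import comb

def alpha_factor_q(alphas, q_total, staggered=False):
    """Basic-event probabilities Q_k for k-of-m common-cause failures
    under the alpha-factor model, in the form commonly presented in
    NUREG/CR-5485.  alphas[k-1] is the alpha factor for events failing
    exactly k components; q_total is the total component failure
    probability.  Values used here are illustrative only."""
    m = len(alphas)
    alpha_t = sum(k * a for k, a in enumerate(alphas, start=1))
    qs = []
    for k, a_k in enumerate(alphas, start=1):
        if staggered:
            # Staggered testing assumption
            qs.append(a_k * q_total / comb(m - 1, k - 1))
        else:
            # Non-staggered testing assumption
            qs.append(k * a_k * q_total / (comb(m - 1, k - 1) * alpha_t))
    return qs

# Illustrative alpha factors for a group of m = 3 redundant components:
alphas = [0.95, 0.03, 0.02]
qs = alpha_factor_q(alphas, q_total=1e-3, staggered=False)
print([f"{q:.2e}" for q in qs])
```

The point of the abstract's question is visible here: switching the `staggered` flag changes every Q_k, so applying nuclear-plant generic factors to aerospace systems implicitly imports the surveillance-testing assumption under which those factors were estimated.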
Most sponsored research in this world is driven by the need to improve livelihoods and the environment around us. This is particularly true for earth and environmental issues involving the resources of water, food, energy and health. However, is such research guaranteed to bring positive benefits for society as soon as it is documented in peer-reviewed forums or in media publications? More than two decades ago, the United States National Research Council popularized the term "Valley of Death" to describe the region where research findings struggle to survive before reaching maturity for societal applications. Recent experience in the field of earth and environmental sciences shows that many of the potential beneficiaries (i.e., the common people), who are not as familiar with the motivation behind sponsored research in the field, may hold a more skeptical view rooted in the current and long-established practices of their livelihoods. This talk will shed light on this "Valley of Death" for research and on ways to accelerate the societal impact of research for the common people. Using examples drawn from technology, water, food and physical modeling of the earth, this talk will also share lessons learned on ways to be effective agents of change for making a direct impact with scientific research.
Tam, Kai Chung
The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…
This paper discusses the problem of the lack of clear licensing and transparency of usage terms and conditions for research metadata. Making research data connected, discoverable and reusable is a key enabler of the new data revolution in research. We discuss how the lack of transparency hinders discovery of research data and makes it disconnected from the publication and other trusted research outcomes. In addition, we discuss the application of Creative Commons licenses for research metadata...
Creating a Common Platform for HIV Vaccine Research and HIV Care and Treatment Program. Second only to South Africa in HIV burden, Nigeria's complex epidemic of HIV viral subtypes is a vital target for HIV vaccine evaluation research. This grant will support a partnership between the Institute of Human ...
Environmental and geoscientific research infrastructures (RIs) are dedicated to distinct aspects of ocean, atmosphere, ecosystem, or solid Earth research, yet there is significant commonality in the way they conceive, develop, operate and upgrade their observation systems and platforms. Many environmental RIs are distributed networks of observatories (be they drifting buoys, geophysical observatories, ocean-bottom stations, or atmospheric measurement sites) with needs for remote operations. Most RIs have to deal with calibration and standardization issues. RIs use a variety of measurement technologies, but this variety is based on a small, common set of physical principles. All RIs have set their own research and development priorities and developed their own solutions to their own problems; however, many problems are common across RIs. Finally, RIs may overlap in terms of scientific perimeter. In ENVRIplus we aim, for the first time, to identify common opportunities for innovation, to support common research and development across RIs on promising issues, and more generally to create a forum to spread state-of-the-art techniques among participants. ENVRIplus activities include 1) measurement technologies: where are the common types of measurement for which we can share expertise or common development? 2) Metrology: how do we tackle together the diversified challenge of quality assurance and standardization? 3) Remote operations: can we address collectively the need for autonomy, robustness and distributed data handling? And 4) joint operations for research: are we able to demonstrate that, together, RIs are able to provide relevant information to support excellent research? In this process we need to nurture an ecosystem of key players. Can we involve all the key technologists of the European RIs for greater mutual benefit? Can we pave the way to a growing common market for innovative European SMEs, with a common programmatic approach conducive to targeted R&D? Can we
Busseri, Michael; Sadava, Stanley; DeCourville, Nancy
The primary components of subjective well-being (SWB) include life satisfaction (LS), positive affect (PA), and negative affect (NA). There is little consensus, however, concerning how these components form a model of SWB. In this paper, six longitudinal studies varying in demographic characteristics, length of time between assessment periods,…
Lynch, David R.; Pandolfo, Massimo; Schulz, Jorg B.; Perlman, Susan; Delatycki, Martin B.; Payne, R. Mark; Shaddy, Robert; Fischbeck, Kenneth H.; Farmer, Jennifer; Kantor, Paul; Raman, Subha V.; Hunegs, Lisa; Odenkirchen, Joanne; Miller, Kristy; Kaufmann, Petra
Background To reduce study start-up time, increase data sharing, and assist investigators conducting clinical studies, the National Institute of Neurological Disorders and Stroke embarked on an initiative to create common data elements for neuroscience clinical research. The Common Data Element Team developed general common data elements which are commonly collected in clinical studies regardless of therapeutic area, such as demographics. In the present project, we applied such approaches to data collection in Friedreich ataxia, a neurological disorder that involves multiple organ systems. Methods To develop Friedreich’s ataxia common data elements, Friedreich’s ataxia experts formed a working group and subgroups to define elements in: Ataxia and Performance Measures; Biomarkers; Cardiac and Other Clinical Outcomes; and Demographics, Laboratory Tests and Medical History. The basic development process included: Identification of international experts in Friedreich’s ataxia clinical research; Meeting via teleconference to develop a draft of standardized common data elements recommendations; Vetting of recommendations across the subgroups; Dissemination of recommendations to the research community for public comment. Results The full recommendations were published online in September 2011 at http://www.commondataelements.ninds.nih.gov/FA.aspx. The Subgroups’ recommendations are classified as core, supplemental or exploratory. Template case report forms were created for many of the core tests. Conclusions The present set of data elements should ideally lead to decreased initiation time for clinical research studies and greater ability to compare and analyze data across studies. Their incorporation into new and ongoing studies will be assessed in an ongoing fashion to define their utility in Friedreich’s ataxia. PMID:23239403
Training simulators for nuclear power plant operating staff have gained increasing importance over the last twenty years. One of the recommendations of the 1983 IAEA Specialists' Meeting on Nuclear Power Plant Training Simulators in Helsinki was to organize a Co-ordinated Research Programme (CRP) on some aspects of training simulators. The goal statement was: ''To establish and maintain a common approach to modelling for nuclear training simulators based on defined training requirements''. Before adopting this goal statement, the participants considered many alternatives for defining the common aspects of training simulator models, such as the programming language used, the nature of the simulator computer system, the size of the simulation computers, and the scope of simulation. The participants agreed that it was the training requirements that defined the need for a simulator, the scope of models and hence the type of computer complex that was required, and the criteria for fidelity and verification, and that they were therefore the most appropriate basis for the commonality of modelling approaches. It should be noted that the Co-ordinated Research Programme was restricted, for a variety of reasons, to considering only a few aspects of training simulators. This report reflects these limitations and covers only the topics considered within the scope of the programme. The information in this document is intended as an aid for operating organizations to identify possible modelling approaches for training simulators for nuclear power plants.
Harms, Craig A
As the use of fish models in biomedical research increases, so too does the need for individuals working in laboratory animal science to be familiar with surgical procedures in these animals. The author presents a primer on fish surgery with an emphasis on research applications.
Reflecting the change in funding strategies for European research projects, and the goal to jointly improve medical radiation protection through sustainable research efforts, five medical societies involved in the application of ionising radiation (European Association of Nuclear Medicine, EANM; European Federation of Organizations for Medical Physics, EFOMP; European Federation of Radiographer Societies, EFRS; European Society of Radiology, ESR; European Society for Radiotherapy and Oncology, ESTRO) have identified research areas of common interest and developed this first edition of the Common Strategic Research Agenda (SRA) for medical radiation protection. The research topics considered necessary and most urgent for effective medical care and efficient radiation protection are summarised in five main themes: 1. Measurement and quantification in the field of medical applications of ionising radiation; 2. Normal tissue reactions, radiation-induced morbidity and long-term health problems; 3. Optimisation of radiation exposure and harmonisation of practices; 4. Justification of the use of ionising radiation in medical practice; 5. Infrastructures for quality assurance. The SRA is a living document; thus comments and suggestions by all stakeholders in medical radiation protection are welcome and will be dealt with by the European Alliance for Medical Radiation Protection Research (EURAMED), established by the above-mentioned societies. • Overcome the fragmentation of medical radiation protection research in Europe • Identify research areas of joint interest in the field of medical radiation protection • Improve the use of ionising radiation in medicine • Collect stakeholder feedback and seek consensus • Emphasise the importance of clinical translation and evaluation of research results.
George, Brandon J.; Beasley, T. Mark; Brown, Andrew W.; Dawson, John; Dimova, Rositsa; Divers, Jasmin; Goldsby, TaShauna U.; Heo, Moonseong; Kaiser, Kathryn A.; Keith, Scott; Kim, Mimi Y.; Li, Peng; Mehta, Tapan; Oakes, J. Michael; Skinner, Asheley; Stuart, Elizabeth; Allison, David B.
We identify 10 common errors and problems in the statistical analysis, design, interpretation, and reporting of obesity research and discuss how they can be avoided. The 10 topics are: 1) misinterpretation of statistical significance, 2) inappropriate testing against baseline values, 3) excessive and undisclosed multiple testing and “p-value hacking,” 4) mishandling of clustering in cluster randomized trials, 5) misconceptions about nonparametric tests, 6) mishandling of missing data, 7) miscalculation of effect sizes, 8) ignoring regression to the mean, 9) ignoring confirmation bias, and 10) insufficient statistical reporting. We hope that discussion of these errors can improve the quality of obesity research by helping researchers to implement proper statistical practice and to know when to seek the help of a statistician. PMID:27028280
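Error 3 in the list above (excessive, undisclosed multiple testing) can be made concrete with a short calculation: with m independent tests of true-null hypotheses at α = 0.05, the probability of at least one spurious "significant" finding grows rapidly with m. The sketch below computes this familywise error rate and shows how a Bonferroni adjustment restores control; the numbers are a textbook illustration, not drawn from the paper:

```python
# Illustration of multiple-testing inflation: with m independent tests
# at alpha = 0.05 and no true effects, P(at least one false positive)
# is 1 - (1 - alpha)^m.

ALPHA = 0.05

def familywise_error(m, alpha=ALPHA):
    """P(at least one false positive) across m independent null tests."""
    return 1 - (1 - alpha) ** m

for m in (1, 5, 10, 20):
    print(m, round(familywise_error(m), 3))

# A Bonferroni correction (test each hypothesis at alpha / m) keeps the
# familywise error rate at or below the nominal alpha:
print(round(familywise_error(10, alpha=ALPHA / 10), 3))
```

For example, ten undisclosed tests inflate the chance of a false positive to about 40%, which is why the authors flag undisclosed multiple testing and "p-value hacking" as a threat to the obesity literature.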
Rode, Carsten; Woloszyn, Monika
Subtask 1 of the IEA ECBCS Annex 41 (IEA 2007) project had the purpose to advance development in modelling of integral Heat, Air and Moisture (HAM) transfer processes that take place in “whole buildings”. Such modelling considers all relevant elements of buildings: The indoor air, building envelope...
Rode, Carsten; Woloszyn, Monika
Subtask 1 of the IEA Annex 41 project had the purpose to advance the development in modelling the integral heat, air and moisture transfer processes that take place in “whole buildings”. Such modelling comprises all relevant elements of buildings: The indoor air, the building envelope, the inside...
Baghaei, Purya; Aryadoust, Vahid
Research shows that test method can exert a significant impact on test takers' performance and thereby contaminate test scores. We argue that common test method can exert the same effect as common stimuli and violate the conditional independence assumption of item response theory models because, in general, subsets of items which have a shared…
Biering-Sørensen, F; Alai, S; Anderson, K.
OBJECTIVES: To develop a comprehensive set of common data elements (CDEs), data definitions, case report forms and guidelines for use in spinal cord injury (SCI) clinical research, as part of the CDE project at the National Institute of Neurological Disorders and Stroke (NINDS) of the US National...... with and cross-referenced to development of the International Spinal Cord Society (ISCoS) International SCI Data Sets. The recommendations were compiled, subjected to internal review and posted online for external public comment. The final version was reviewed by all working groups and the NINDS CDE team before...
This paper presents Distributed Systems Foundation (DSF), a common platform for distributed systems research and development. It can run a distributed algorithm written in Java under multiple execution modes—simulation, massive multi-tenancy, and real deployment. DSF provides a set of novel features to facilitate testing and debugging, including chaotic timing test and time travel debugging with mutable replay. Unlike existing research prototypes that offer advanced debugging features by hacking programming tools, DSF is written entirely in Java, without modifications to any external tools such as JVM, Java runtime library, compiler, linker, system library, OS, or hypervisor. This simplicity stems from our goal of making DSF not only a research prototype but more importantly a production tool. Experiments show that DSF is efficient and easy to use. DSF's massive multi-tenancy mode can run 4,000 OS-level threads in a single JVM to concurrently execute (as opposed to simulate) 1,000 DHT nodes in real-time.
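The abstract does not spell out how DSF's chaotic timing test and replay work internally, but the underlying idea can be sketched in a toy form: drive nondeterministic interleavings from a seeded random scheduler, so that any schedule that exposes a bug can be replayed deterministically from its seed. The names and structure below are hypothetical illustrations, not DSF's actual API:

```python
import random

def run_schedule(seed, steps=10):
    """Toy cooperative 'scheduler': a seeded RNG decides which of two
    logical nodes advances at each step, so an entire interleaving is
    fully determined by the seed and can be replayed exactly."""
    rng = random.Random(seed)
    counters = {"node_a": 0, "node_b": 0}
    trace = []
    for _ in range(steps):
        node = rng.choice(["node_a", "node_b"])
        counters[node] += 1          # that node takes one step
        trace.append(node)
    return trace

# A chaotic timing test would explore many seeds; an interleaving that
# triggered a failure under seed 42 is reproduced by re-running it:
first = run_schedule(42)
replay = run_schedule(42)
print(first == replay)  # True
```

Real tools like DSF additionally allow *mutable* replay (perturbing a recorded schedule while keeping the rest fixed), but the seed-determines-schedule principle above is the core of making timing bugs reproducible.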
Chen, Shi-Yi; Liu, Qin; Feng, Zhe
R is a computer language and has been widely used in science community due to the powerful capability in data analysis and visualization; and these functions are mainly provided by the developed packages. Because every package has strict format definitions on the inputted data, it is always required to appropriately manipulate the original data in advance. Unfortunately, users, especially for the beginners, are always confused by the extreme flexibility with R in data manipulation. In the present paper, we roughly categorize the common manipulations with R for biological data into four classes, including overview of data, transformation, summarization, and reshaping. Subsequently, these manipulations are exemplified in a sample data of clinical records of diabetic patients. Our main purpose is to provide a better landscape on the data manipulation with R and hence facilitate the practical applications in biological researches.
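The paper works in R, but the four classes of manipulation it names (overview, transformation, summarization, reshaping) are language-agnostic. As a rough analogue, the sketch below walks the same four steps in Python with pandas on a toy clinical-style table; the column names and values are invented for illustration and do not come from the paper's diabetic-patient data:

```python
import pandas as pd

# Toy clinical-style records (invented values, for illustration only).
df = pd.DataFrame({
    "patient": ["p1", "p2", "p3", "p4"],
    "group":   ["treated", "control", "treated", "control"],
    "glucose": [7.8, 6.1, 9.2, 5.5],
})

# 1) Overview of the data
print(df.head())
print(df.describe())

# 2) Transformation: derive a new column from an existing one
df["high_glucose"] = df["glucose"] > 7.0

# 3) Summarization: aggregate a measurement by group
summary = df.groupby("group")["glucose"].mean()
print(summary)

# 4) Reshaping: long -> wide layout
wide = df.pivot(index="patient", columns="group", values="glucose")
print(wide)
```

In R the equivalent idioms would typically use `head`/`summary`, vectorized column creation, `aggregate` or dplyr's `group_by`, and `reshape2`/`tidyr`, which is the landscape the paper sets out to map.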
A methodology has been conceived for efficient synthesis of dynamical models that simulate common-sense decision-making processes. This methodology is intended to contribute to the design of artificial-intelligence systems that could imitate human common-sense decision making or assist humans in making correct decisions in unanticipated circumstances. This methodology is a product of continuing research on mathematical models of the behaviors of single- and multi-agent systems known in biology, economics, and sociology, ranging from a single-cell organism at one extreme to the whole of human society at the other extreme. Earlier results of this research were reported in several prior NASA Tech Briefs articles, the three most recent and relevant being Characteristics of Dynamics of Intelligent Systems (NPO-21037), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48; Self-Supervised Dynamical Systems (NPO-30634), NASA Tech Briefs, Vol. 27, No. 3 (March 2003), page 72; and Complexity for Survival of Living Systems (NPO-43302), NASA Tech Briefs, Vol. 33, No. 7 (July 2009), page 62. The methodology involves the concepts reported previously, albeit viewed from a different perspective. One of the main underlying ideas is to extend the application of physical first principles to the behaviors of living systems. Models of motor dynamics are used to simulate the observable behaviors of systems or objects of interest, and models of mental dynamics are used to represent the evolution of the corresponding knowledge bases. For a given system, the knowledge base is modeled in the form of probability distributions, and the mental dynamics is represented by models of the evolution of the probability densities or, equivalently, models of flows of information. Autonomy is imparted to the decision-making process by feedback from mental to motor dynamics. This feedback replaces unavailable external information by information stored in the internal knowledge base. Representation
Common, often overlooked, variables in biomedical research with animals are reviewed. The barren primary enclosure is an abnormal living environment for laboratory animals. Species-appropriate enrichment attenuates some of the distress resulting from chronic understimulation. Social deprivation distress of individually-caged social animals is best mitigated by the provision of compatible companionship. Biotelemetry and positive reinforcement training avoid or minimize stress reactions that typically occur when animals are forcibly restrained during procedures. The variables, 'light' and 'position of living quarters' are inherent in the multi-tier caging system. To date there is no satisfactory alternative other than the single-tier cage arrangement that eliminates both variables. Removing test animals from their familiar home environment and from their cage mates for procedures introduces stress as an avoidable influential variable. Music may become an important variable if not all subjects are exposed to it. Disturbance time cannot be controlled as an extraneous variable but it should at least be mentioned to explain possible incongruities of data. A positive relationship between animal care personnel and research subjects is a key requisite to minimize stress as a data-confounding variable.
Wang, H P; Zheng, D; Tian, Y
In this paper, modeling and common-rail pressure control of a high-pressure common rail injection system (HPCRIS) is presented. The proposed mathematical model of the high-pressure common rail injection system, which contains three sub-models (a high-pressure pump sub-model, a common rail sub-model and an injector sub-model), is a relatively complicated nonlinear system. The mathematical model is validated with Matlab and a detailed virtual simulation environment. For the considered HPCRIS, an effective model-free controller, called the Extended State Observer-based intelligent Proportional-Integral (ESO-based iPI) controller, is designed. The proposed method is composed mainly of the ESO observer and a time-delay-estimation-based iPI controller. Finally, to demonstrate its performance, the proposed ESO-based iPI controller is compared with a conventional PID controller and ADRC.
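The model-free "intelligent" control family the paper builds on rests on an ultra-local model ẏ = F + αu, where the lumped unknown dynamics F are estimated online from the measured output and cancelled in the control law. The toy loop below shows this principle on an invented first-order plant; the plant, gains, and α are illustrative choices only, not the paper's HPCRIS model or its ESO design:

```python
# Toy model-free "intelligent proportional" loop, in the spirit of the
# ultra-local model behind iPI control: estimate the lumped dynamics
# F from the measured derivative, then cancel them with
#   u = (-F_hat + Kp * e) / alpha.
# Plant, gains, and alpha are illustrative, not the paper's HPCRIS model.

DT = 1e-3
ALPHA = 3.0   # assumed input gain of the ultra-local model
KP = 5.0

def plant(y, u):
    """Plant unknown to the controller: dy/dt = -2*y + 3*u + 1."""
    return -2.0 * y + 3.0 * u + 1.0

y, y_prev, u = 0.0, 0.0, 0.0
y_ref = 1.0                      # constant rail-pressure-like setpoint
for _ in range(5000):            # 5 s of simulated time
    f_hat = (y - y_prev) / DT - ALPHA * u   # lumped-dynamics estimate
    e = y_ref - y
    u = (-f_hat + KP * e) / ALPHA
    y_prev = y
    y += DT * plant(y, u)

print(round(y, 3))  # settles near the setpoint
```

The paper's controller adds an extended state observer for F and an integral term (iPI), but the same cancel-the-unknown-dynamics structure is what makes the approach "model-free": no explicit pump, rail, or injector model enters the control law.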
At its session on March 13, the Council of the European Communities approved the following two new research programmes: the research programme of the Common Research Agency for the period 1980 to 1983, and the research programme of Euratom in the field of controlled nuclear fusion for the period 1979 to 1983. The programme for the Common Research Agency earmarks financing of 510.87 m ecus for a 4-year period as of January 1, 1980, by means of which the new research programme will be conducted with 2260 employees in the research centres of the EC. The programme focuses on reactor safety, modern energy sources, environmental research, reference measurements and activities for the Commission. The research and training programme in the field of controlled nuclear fusion provides financing of 190.5 m ecus for the 4-year period as of January 1, 1979 (partial replacement of the former programme for 1976-80), including a staff of 113 employees ensuring coordination between national research centres. An additional 145 m ecus are earmarked for the large-scale project JET, whose preparation is being worked on by 150 employees.
Douglas R Pedersen, Jessica E Goetz, Gail L Kurriger, James A Martin
Department of Orthopaedics and Rehabilitation, University of Iowa, Iowa City, IA, USA
Purpose: This study addresses the species-specific and site-specific details of weight-bearing articular cartilage zone depths and chondrocyte distributions among humans and common osteoarthritis (OA) animal models using contemporary digital imaging tools. Histological analysis is the gold-standard research tool for evaluating cartilage health, OA severity, and treatment efficacy. Historically, evaluations were made by expert analysts. However, state-of-the-art tools have been developed that allow for digitization of entire histological sections for computer-aided analysis. Large volumes of common digital cartilage metrics directly complement elucidation of trends in OA inducement and concomitant potential treatments.
Materials and methods: Sixteen fresh human knees, 26 adult New Zealand rabbit stifles, and 104 bovine lateral plateaus were measured for four cartilage zones and the cell densities within each zone. Each knee was divided into four weight-bearing sites: the medial and lateral plateaus and femoral condyles.
Results: One-way analysis of variance followed by pairwise multiple comparisons (Holm–Sidak method) at a significance of 0.05 clearly confirmed the variability between cartilage depths at each site, between sites in the same species, and between weight-bearing articular cartilage definitions in different species.
Conclusion: The present study clearly demonstrates multisite, multispecies differences in normal weight-bearing articular cartilage, which can be objectively quantified by a common digital histology imaging technique. The clear site-specific differences in normal cartilage must be taken into consideration when characterizing the pathoetiology of OA models. Together, these provide a path to consistently analyze the volume and variety of histologic slides necessarily generated
Sayyah, Rana; Hunt, Mitchell; MacLeond, Todd C.; Ho, Fat D.
This paper presents a mathematical model characterizing the behavior of a common-source amplifier using a FeFET. The model is based on empirical data and incorporates several variables that affect the output, including frequency, load resistance, and gate-to-source voltage. Since the common-source amplifier is the most widely used amplifier in MOS technology, understanding and modeling the behavior of the FeFET-based common-source amplifier will help in the integration of FeFETs into many circuits.
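The standard small-signal starting point for any common-source stage model is the voltage gain Av = −gm·(RD ∥ ro). The sketch below computes it with the familiar MOSFET expression; the component values are illustrative, and a FeFET-based stage would add the frequency-, load-, and gate-voltage-dependent effects the paper models empirically:

```python
# Small-signal voltage gain of a common-source amplifier stage:
#   Av = -gm * (RD || ro)
# gm: transconductance, RD: drain/load resistor, ro: output resistance.
# Values below are illustrative, not the paper's FeFET measurements.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def cs_gain(gm, rd, ro):
    """Common-source small-signal voltage gain (inverting)."""
    return -gm * parallel(rd, ro)

gm = 2e-3    # 2 mS transconductance
rd = 10e3    # 10 kOhm drain load
ro = 40e3    # 40 kOhm transistor output resistance

print(round(cs_gain(gm, rd, ro), 1))  # -16.0
```

The negative sign reflects the stage's phase inversion; an empirical FeFET model of the kind described would effectively make gm (and hence the gain) a measured function of frequency, load resistance, and gate-to-source voltage rather than a constant.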
The European Parliament sets priorities for future research policy; it supports, for example, the Europeanization of large-scale research, because this facilitates the combination of research and financing potential. It also secures wide dissemination of the knowledge achieved through the joint financial efforts of all member states. The demands made for the joint research project are: that the next multi-year programme of the joint research project establish its position and special qualification as a safety research centre for industrial activities of high risk (nuclear energy sector, chemistry, biology), and that the joint research project be organized as independently as possible. Moreover, the European Parliament demands that the member states increase their research budgets to at least 2.5% of the gross national product.
CPTAC supports analyses of the mass spectrometry raw data (mapping of spectra to peptide sequences and protein identification) for the public using a Common Data Analysis Pipeline (CDAP). The data types available on the public portal are described below. A general overview of this pipeline can be downloaded here. Mass Spectrometry Data Formats RAW (Vendor) Format
of common cancers among patients treated at Kamuzu Central Hospital (KCH) in Lilongwe, and to determine the prevalence of Human .... study population. Cancer of the oesophagus alone comprised over one-third of the cases, followed by cervical cancer contributing to over one-fifth (21%) and then Kaposi's sarcoma and ...
B. V. Hrynchyshyn
We consider various approaches to the reconstruction of historical fencing. It is shown that the activities of such societies have a positive effect on research into the features of medieval weapons and the fighting tactics of different periods.
Choo, Esther K; Garro, Aris C; Ranney, Megan L; Meisel, Zachary F; Morrow Guthrie, Kate
Qualitative methods are increasingly being used in emergency care research. Rigorous qualitative methods can play a critical role in advancing the emergency care research agenda by allowing investigators to generate hypotheses, gain an in-depth understanding of health problems or specific populations, create expert consensus, and develop new intervention and dissemination strategies. This article, Part I of a two-article series, provides an introduction to general principles of applied qualitative health research and examples of its common use in emergency care research, describing study designs and data collection methods most relevant to our field, including observation, individual interviews, and focus groups. In Part II of this series, we will outline the specific steps necessary to conduct a valid and reliable qualitative research project, with a focus on interview-based studies. These elements include building the research team, preparing data collection guides, defining and obtaining an adequate sample, collecting and organizing qualitative data, and coding and analyzing the data. We also discuss potential ethical considerations unique to qualitative research as it relates to emergency care research.
This talk will introduce the Common Workflow Language project. In July 2016 they released standards that enable the portable, interoperable, and executable description of command line data analysis tools and workflows made from those tools. These descriptions are enhanced by CWL's first class (but optional) support for Docker containers. CWL originated from the world of bioinformatics but is not discipline specific and is gaining interest and use in other fields. Attendees who want to play with CWL prior to attending the presentation are invited to go through the "Gentle Introduction to the Common Workflow Language" tutorial on any OS X or Linux machine on their own time. About the speaker Michael R. Crusoe is one of the co-founders of the CWL project and is the CWL Community Engineer. His facilitation, technical contributions, and training on behalf of the project draw from his time as the former lead developer of C. Titus Brown's khmer project, his previous career as a sysadmin and programmer, and his ex...
Public access to results of federally-funded research is a new mandate for large departments of the United States government. Public access to scholarly literature from U.S. investments is straightforward, with policies and systems like PubMed Central and PubAg (http://pubag.nal.usda.gov) already im...
Chen, Charles P.
International college students studying in North America endure substantial psychological stress in their daily lives. The nature and function of stressors in the context of international college students' subjective appraisal are discussed and analyzed using the Lazarus and Folkman's concept of stress. Recommendations for future research are…
Both an improved understanding of the causes and consequences of global warming as well as the exploration of responses to global warming require the integration of knowledge from a wide variety of disciplines in the natural sciences, social sciences, and humanities. There are a variety of examples of successful multidisciplinary enterprises that have conducted research over an extended period of time
Ambrosiano, J. [Los Alamos National Lab., NM (United States); Butler, D.M. [Limit Point Systems, Inc. (United States); Matarazzo, C.; Miller, M. [Lawrence Livermore National Lab., CA (United States); Schoof, L. [Sandia National Lab., Albuquerque, NM (United States)
The problem of sharing data among scientific simulation models is a difficult and persistent one. Computational scientists employ an enormous variety of discrete approximations in modeling physical processes on computers. Problems occur when models based on different representations are required to exchange data with one another, or with some other software package. Within the DOE's Accelerated Strategic Computing Initiative (ASCI), a cross-disciplinary group called the Data Models and Formats (DMF) group has been working to develop a common data model. The current model comprises several layers of increasing semantic complexity. One of these layers is an abstract model based on set theory and topology called the fiber bundle kernel (FBK). This layer provides the flexibility needed to describe a wide range of mesh-approximated functions as well as other entities. This paper briefly describes the ASCI common data model, its mathematical basis, and ASCI prototype development. These prototypes include an object-oriented data management library developed at Los Alamos called the Common Data Model Library (CDMlib), the Vector Bundle API from Lawrence Livermore National Laboratory, and the DMF API from Sandia National Laboratory.
Sex determination is a complicated process involving large-scale modifications in gene expression affecting virtually every tissue in the body. Although the evolutionary origin of sex remains controversial, there is little doubt that it has developed as a process of optimizing metabolic control, as well as developmental and reproductive functions within a given setting of limited resources and environmental pressure. Evidence from various model organisms supports the view that sex determination may occur as a result of direct environmental induction or genetic regulation. The first process has been well documented in reptiles and fish, while the second is the classic case for avian species and mammals. Both of the latter have developed a variety of sex-specific/sex-related genes, which ultimately form a complete chromosome pair (sex chromosomes/gonosomes). Interestingly, combinations of environmental and genetic mechanisms have been described among different classes of animals, thus rendering the possibility of a unidirectional continuous evolutionary process from the one type of mechanism to the other unlikely. On the other hand, common elements appear throughout the animal kingdom, with regard to a) conserved key genes and b) a central role of sex steroid control as a prerequisite for ultimately normal sex differentiation. Studies in invertebrates also indicate a role of epigenetic chromatin modification, particularly with regard to alternative splicing options. This review summarizes current evidence from research in this hot field and signifies the need for further study of both normal hormonal regulators of sexual phenotype and patterns of environmental disruption. PMID:22357269
This report explains how to use the Binomial Failure Rate (BFR) method to estimate common cause failure rates. The entire method is described, beginning with the conceptual model and covering practical issues of data preparation, treatment of variation in the failure rates, Bayesian estimation of the quantities of interest, checking the model assumptions for lack of fit to the data, and the ultimate application of the answers.
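The conceptual model behind the BFR method can be sketched numerically: shocks strike the system at rate mu, and each shock independently fails each of m components with probability p, so the rate of events failing exactly k components follows a binomial decomposition. A minimal sketch under these standard assumptions (function name and parameter values are illustrative, not from the report):

```python
from math import comb

def bfr_rates(mu, p, m):
    """Rates of common-cause events that fail exactly k of m components
    under the Binomial Failure Rate model: shocks arrive at rate mu and
    each shock independently fails each component with probability p."""
    return {k: mu * comb(m, k) * p**k * (1 - p)**(m - k) for k in range(m + 1)}

# Example: 4 redundant components, shock rate 0.1 per year, lethality p = 0.3
rates = bfr_rates(mu=0.1, p=0.3, m=4)
```

Summing the rates over all k recovers the total shock rate mu; the k = 0 term is the rate of shocks that happen to fail nothing.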
Schoenfeld, Alan H.
On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…
Introduction: Advances in science across many fields have changed the methods used to determine geographical location. Precision farming involves new technologies that give farmers the opportunity to monitor and evaluate factors such as nutrients, plant-available soil moisture, and soil physical and chemical characteristics at spatial resolutions from less than a centimeter to several meters. GPS receivers used in precision farming operations require the following accuracies: 1) crop monitoring and soil sampling (accuracy better than one meter); 2) fertilizer, pesticide and seeding applications (accuracy better than half a meter); 3) transplanting and row cultivation (accuracy better than 4 cm) (Perez et al., 2011). In one application of GPS in agriculture, a route-guidance system for precision farming tractors was designed to reduce deviation from the specified path to within 50 to 300 mm, informing the driver through an improved display (Perez et al., 2011). In another study, an automatic guidance system based on RTK-GPS technology was used for precision tillage operations between and within rows, passing within 50 mm of drip irrigation pipe without damage to the pipe or the crops (Abidine et al., 2004). In a further study comparing the accuracy and precision of receivers, five different models of Trimble GPS devices were used to map 15 stations; the results indicated that the minimum error, 91 cm, was obtained with the Geo XT model and the maximum error, 5.62 m, with the Pharos model (Kindra et al., 2006). Given the increasing use of GPS receivers in agriculture and the lack of reliable information on the real accuracy and precision of receivers, this study aimed to compare the positioning accuracy and precision of three commonly used GPS receiver models in order to identify the receiver with the lowest error for precision
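The receiver comparison above reduces to two quantities computed from repeated fixes at a surveyed point: accuracy (error relative to the known position) and precision (scatter of the fixes about their own mean). A minimal sketch (function and variable names are illustrative, not from the study):

```python
import numpy as np

def accuracy_precision(easting, northing, true_e, true_n):
    """RMS accuracy (distance to the known true point) and RMS precision
    (scatter about the mean fix) for a set of repeated GPS fixes."""
    e = np.asarray(easting, dtype=float)
    n = np.asarray(northing, dtype=float)
    accuracy = np.sqrt(np.mean((e - true_e) ** 2 + (n - true_n) ** 2))
    precision = np.sqrt(np.mean((e - e.mean()) ** 2 + (n - n.mean()) ** 2))
    return accuracy, precision
```

A receiver with a constant bias can thus be very precise yet inaccurate, which is why the two measures are reported separately.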
Wancket, L M
Bone implants and devices are a rapidly growing field within biomedical research, and implants have the potential to significantly improve human and animal health. Animal models play a key role in initial product development and are important components of nonclinical data included in applications for regulatory approval. Pathologists are increasingly being asked to evaluate these models at the initial developmental and nonclinical biocompatibility testing stages, and it is important to understand the relative merits and deficiencies of various species when evaluating a new material or device. This article summarizes characteristics of the most commonly used species in studies of bone implant materials, including detailed information about the relevance of a particular model to human bone physiology and pathology. Species reviewed include mice, rats, rabbits, guinea pigs, dogs, sheep, goats, and nonhuman primates. Ultimately, a comprehensive understanding of the benefits and limitations of different model species will aid in rigorously evaluating a novel bone implant material or device. © The Author(s) 2015.
This paper studies the effects of common shocks on the OLS estimators of the slope parameters in linear panel data models. The shocks are assumed to affect both the errors and some of the explanatory variables. In contrast to existing approaches, which rely on results on martingale difference sequences, our method relies on conditional strong laws of large numbers and conditional central limit theorems for conditionally-heterogeneous random variables.
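The setting can be illustrated with a small simulation: a time-specific common shock that loads on both the regressor and the error induces correlation between them and biases the pooled-OLS slope, while with no common component the slope remains consistent. A hedged sketch (all names and numbers are illustrative, not from the paper):

```python
import numpy as np

def simulate_panel_ols(n_units=500, t_periods=20, beta=2.0,
                       shock_in_x=0.0, shock_in_e=0.0, seed=0):
    """Simulate y_it = beta * x_it + e_it, where a common time shock f_t
    may load on both the regressor and the error, and return the
    pooled-OLS slope estimate."""
    rng = np.random.default_rng(seed)
    f = rng.normal(size=t_periods)                    # one common shock per period
    x = rng.normal(size=(n_units, t_periods)) + shock_in_x * f
    e = rng.normal(size=(n_units, t_periods)) + shock_in_e * f
    y = beta * x + e
    xf, yf = x.ravel(), y.ravel()
    return float(xf @ yf / (xf @ xf))

# No common component: the pooled-OLS slope is consistent for beta.
# Shock loading on both x and e: x and e become correlated and OLS is biased.
```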
Wang, Xin-juan; Zhao, Bai-xiao
With the application of scientific study methods, the level of clinical research has improved greatly, and more attention has been paid to the scientific evaluation of the clinical effects of Chinese medicine and acu-moxibustion. Formerly, because Chinese clinical acu-moxibustion researchers had not widely accepted and applied modern scientific study methods, their achievements were not recognized by the international academic community owing to flawed study designs. The randomized controlled trial (RCT) is the gold-standard method widely accepted at present, so it is of great importance for clinical acu-moxibustion researchers to apply control methods correctly and effectively. This article discusses the control methods commonly used in overseas clinical acu-moxibustion studies in order to offer suggestions of benefit to domestic acu-moxibustion clinical researchers.
Asmi, Ari; Konjin, Jacco; Pursula, Antti
Environmental research infrastructures are facilities, resources, systems and related services that are used by research communities to conduct top-level research. Environmental research addresses processes at very different time scales, and supporting research infrastructures must be designed as long-term facilities in order to meet the requirements of continuous environmental observation, measurement and analysis. This longevity makes environmental research infrastructures ideal structures to support long-term development in the environmental sciences. The ENVRI project is a collaborative action of the major European (ESFRI) environmental research infrastructures working towards increased co-operation and interoperability between the infrastructures. One of the key products of the ENVRI project is to combine the long-term plans of the individual infrastructures into a common strategy, describing the vision and planned actions. The envisaged vision for environmental research infrastructures towards 2030 is to support the holistic understanding of our planet and its behaviour. The development of a 'Standard Model of the Planet' is a common ambition: a challenge to define an environmental standard model, a framework of all interactions within the Earth system, from the solid earth to near space. Indeed, scientists feel challenged to contribute to a 'Standard Model of the Planet' with data, models, algorithms and discoveries. Understanding the Earth system as an interlinked system requires a systems approach, and the environmental sciences are rapidly moving to become a system-level science, mainly because modern science, engineering and society increasingly face complex problems that can only be understood in the context of the full overall system. The strategy of the supporting collaborating research infrastructures is based on developing three key factors for the environmental sciences: the technological, the cultural and the human capital. The technological
Fiengo, Giovanni; Palladino, Angelo; Giglio, Veniero
Progressive reductions in vehicle emission requirements have forced the automotive industry to invest in research and development of alternative control strategies. Continual control action exerted by a dedicated electronic control unit ensures that best performance in terms of pollutant emissions and power density is married with driveability and diagnostics. Gasoline direct injection (GDI) engine technology is a way to attain these goals. This brief describes in detail the functioning of a GDI engine equipped with a common rail (CR) system, and the devices necessary to run test-bench experiments. The text should prove instructive to researchers in engine control, and students are recommended this brief as a first approach to this technology. Later chapters of the brief relate an innovative strategy designed to assist the engine management system; injection pressure regulation for fuel pressure stabilization in the CR fuel line is proposed and validated by experiment. The resulting control scheme ...
Fisher, Laurel J
Motivational Spiral Models (MSM) show links over time among self concepts, feelings, strategies, skills and participation in everyday activities. In theory, MSM have many common features, with distinct features in particular contexts. This project examined children's motivation to participate in literacy (MSM-L), social (MSM-S) and physical activities (MSM-P). The participants in Study 1 (N = 32) were 9 to 11 years old, and in Study 2 (N = 73) were 4- to 12-year-old children. Locations were cl...
Jensen, Per Anker; Andersen, Per Dannemand; Rasmussen, Birgitte
Purpose: To identify trends and challenges in relation to the FM profession in the Nordic countries and to identify inputs to a common Nordic research agenda. Theory: The study is based on theory from innovation systems and strategic foresight. Based on a literature review, an innovation systems model of the FM sector was developed as a framework for the study. Approach: The study contained four elements. First, a review of literature on the future of FM was carried out. Secondly, four national workshops were held involving FM practitioners and researchers from Denmark, Norway, Sweden and Finland. Third, the results of the workshops were presented and discussed at a joint Nordic workshop at a Nordic FM conference in August 2011. Finally, an adapted Delphi survey was carried out as a final data collection and validation of the findings. Findings: The results of the study show...
Vaughan Henriques; Maureen Tanner
Background/Aim/Purpose: A commonly implemented software process improvement framework is the Capability Maturity Model Integration (CMMI). Existing literature indicates that higher levels of CMMI maturity could result in a loss of agility due to its organizational focus. To maintain agility, research has focused attention on agile maturity models. The objective of this paper is to find the common research themes and conclusions in agile maturity model research. Methodology: This research adop...
DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. ESPC Common Model Architecture, Earth System Modeling. The National Unified Operational Prediction Capability (NUOPC) was established between NOAA and the Navy to develop a common software architecture for easy and efficient interoperability. This project concerns the common model architecture and other software-related standards. OBJECTIVES: NUOPC proposes to accelerate improvement of our national
Abuhelwa, Ahmad Y; Foster, David J R; Upton, Richard N
The analytical solutions to compartmental pharmacokinetic models are well known, but have not been presented in a form that easily allows for complex dosing regimens and changes in covariate/parameter values that may occur at discrete times within and/or between dosing intervals. Laplace transforms were used to derive ADVAN-style analytical solutions for 1-, 2-, and 3-compartment linear pharmacokinetic models of intravenous and first-order absorption drug administration. The equations calculate the change in drug amounts in each compartment of the model over a time interval (Δt = t2 - t1), accounting for any dose or covariate events acting in that interval. The equations were coded in the R language and used to simulate the time-course of drug amounts in each compartment of the systems. The equations were validated against commercial software [NONMEM (Beal, Sheiner, Boeckmann, & Bauer, 2009)] output to assess their capability to handle both complex dosage regimens and the effect of changes in covariate/parameter values that may occur at discrete times within or between dosing intervals. For all tested pharmacokinetic models, the time-courses of drug amounts from the ADVAN-style analytical solutions were identical to NONMEM outputs to at least four significant figures, confirming the validity of the presented equations. To our knowledge, this paper presents the ADVAN-style equations for common pharmacokinetic models in the literature for the first time. The presented ADVAN-style equations overcome obstacles to implementing the classical analytical solutions in software, and have speed advantages over solutions using differential equation solvers. The equations presented in this paper fill a gap in the pharmacokinetic literature, and it is expected that they will facilitate the development of useful open-source software for modelling pharmacokinetic data. Copyright © 2015 Elsevier Inc. All rights reserved.
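The stepwise idea behind such solutions can be illustrated with the simplest case, a one-compartment IV model: the closed-form exponential decay is applied interval by interval, with bolus doses and covariate-driven parameter changes handled at interval boundaries. A minimal sketch in Python rather than R (the paper derives 1-, 2-, and 3-compartment equations; this toy function is illustrative only):

```python
from math import exp

def one_cmt_iv_step(amount, dt, cl, v, dose=0.0):
    """Advance a one-compartment IV model across an interval of length dt
    using the closed-form solution A(t1 + dt) = (A(t1) + dose) * exp(-(CL/V) * dt).
    Bolus doses and parameter (covariate) changes are applied at interval
    boundaries, in the spirit of the stepwise ADVAN-style equations."""
    return (amount + dose) * exp(-(cl / v) * dt)

# 100 mg bolus at t = 0 h; clearance doubles at t = 12 h (a covariate event)
a12 = one_cmt_iv_step(0.0, 12.0, cl=1.0, v=20.0, dose=100.0)
a24 = one_cmt_iv_step(a12, 12.0, cl=2.0, v=20.0)
```

Because each step is exact, no ODE solver tolerance is involved, which is the source of the speed advantage the authors describe.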
Wyser, Emmanuel; Jaboyedoff, Michel
Granular studies have received increasing interest during the last decade, and many scientific investigations have successfully addressed the ubiquitous behavior of granular matter. We investigate liquid impacts onto granular beds, i.e. the influence of the packing and of the compaction-dilation transition. However, a physically-based model is still lacking that addresses complex microscopic features of the granular bed response during liquid impacts, such as the compaction-dilation transition or granular bed uplifts (Wyser et al., in review). We present our preliminary 2D numerical modeling based on the Discrete Element Method (DEM) using a nonlinear contact force law (the Hertz-Mindlin model) for disk-shaped particles. The algorithm is written in the C programming language. Our 2D model provides an analytical tool to address granular problems such as i) granular collapses and ii) static granular assembly problems. This provides a validation framework for our numerical approach by comparing our numerical results with previous laboratory experiments or numerical works. Inspired by the work of Warnett et al. (2014) and Staron & Hinch (2005), we studied i) the axisymmetric collapse of granular columns, addressing the scaling between the initial aspect ratio and the final runout distance. Our numerical results are in good agreement with the previous studies of Warnett et al. (2014) and Staron & Hinch (2005). ii) Reproducing static problems for regularly and randomly stacked particles provides a valid comparison to the results of Egholm (2007). Vertical and horizontal stresses within the assembly are nearly identical to the stresses obtained by Egholm (2007), thus demonstrating the consistency of our 2D numerical model. Our 2D numerical model is able to reproduce common granular case studies such as granular collapses or static problems. However, a sufficiently small timestep should be used to ensure good numerical consistency, resulting in higher computational time. The latter becomes critical
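The normal part of the Hertz-Mindlin contact law mentioned above can be sketched as follows. This is the textbook Hertz normal-force expression for two elastic particles, not the authors' C code; symbols and values are illustrative:

```python
from math import sqrt

def hertz_normal_force(r1, r2, e1, e2, nu1, nu2, overlap):
    """Normal part of the Hertz-Mindlin contact law between two elastic
    particles: F_n = (4/3) * E_eff * sqrt(R_eff) * overlap**1.5, with
    zero force when the particles are not in contact (overlap <= 0)."""
    if overlap <= 0.0:
        return 0.0
    r_eff = r1 * r2 / (r1 + r2)                                  # effective radius
    e_eff = 1.0 / ((1.0 - nu1**2) / e1 + (1.0 - nu2**2) / e2)    # effective modulus
    return (4.0 / 3.0) * e_eff * sqrt(r_eff) * overlap**1.5
```

The 3/2-power overlap dependence is what makes the law nonlinear, in contrast to a linear-spring contact model.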
Harris Alex HS
Background: To assist educators and researchers in improving the quality of medical research, we surveyed the editors and statistical reviewers of high-impact medical journals to ascertain the most frequent and critical statistical errors in submitted manuscripts. Findings: The Editors-in-Chief and statistical reviewers of the 38 medical journals with the highest impact factor in the 2007 Science Journal Citation Report and the 2007 Social Science Journal Citation Report were invited to complete an online survey about the statistical and design problems they most frequently found in manuscripts. Content analysis of the responses identified major issues. Editors and statistical reviewers (n = 25) from 20 journals responded. Respondents described problems that we classified into two broad themes: A. statistical and sampling issues and B. inadequate reporting clarity or completeness. Problems included in the first theme were (1) inappropriate or incomplete analysis, including violations of model assumptions and analysis errors, (2) uninformed use of propensity scores, (3) failing to account for clustering in data analysis, (4) improperly addressing missing data, and (5) power/sample size concerns. Issues subsumed under the second theme were (1) inadequate description of the methods and analysis and (2) misstatement of results, including undue emphasis on p-values and incorrect inferences and interpretations. Conclusions: The scientific quality of submitted manuscripts would increase if researchers addressed these common design, analytical, and reporting issues. Improving the application and presentation of quantitative methods in scholarly manuscripts is essential to advancing medical research.
Hoedemaekers, R.H.M.V.; Gordijn, B.; Pijnenburg, M.A.M.
In genomic research the ideal standard of free, informed, prior, and explicit consent is believed to restrict important research studies. For certain types of genomic research other forms of consent are therefore proposed which are ethically justified by an appeal to the common good. This notion is
The common utilization of the facilities of the Japan Atomic Energy Research Institute in 1982 finished in an active state, and the results of the research have reached the stage of publication. The subjects of the research spread over wide fields, and in 1982 extremely diversified research was again carried out. In this report, these results are collected in one volume, and it is hoped that it will be actively utilized. The number of research themes is 131. In the field of general research, work on radiochemistry, the utilization of radiation and the effects of irradiation was mostly carried out, while cooperative research mainly concerned nuclear reactor engineering and nuclear reactor materials. The total number of visitors was 3025. The facilities offered for common utilization were JRR-2, JRR-3, JRR-4, the Co-60 irradiation facility and others. The abstracts of the papers are reported. (J.P.N.)
Sennott, Samuel C.; Light, Janice C.; McNaughton, David
A systematic review of research on the effects of interventions that include communication partner modeling of aided augmentative and alternative communication (AAC) on the language acquisition of individuals with complex communication needs was conducted. Included studies incorporated AAC modeling as a primary component of the intervention,…
Nadkarni, Prakash M.; Brandt, Cynthia A.
Objectives The National Cancer Institute (NCI) has developed the Common Data Elements (CDE) to serve as a controlled vocabulary of data descriptors for cancer research, to facilitate data interchange and inter-operability between cancer research centers. We evaluated CDE’s structure to see whether it could represent the elements necessary to support its intended purpose, and whether it could prevent errors and inconsistencies from being accidentally introduced. We also performed automated checks for certain types of content errors that provided a rough measure of curation quality. Methods Evaluation was performed on CDE content downloaded via the NCI’s CDE Browser, and transformed into relational database form. Evaluation was performed under three categories: 1) compatibility with the ISO/IEC 11179 metadata model, on which CDE structure is based, 2) features necessary for controlled vocabulary support, and 3) support for a stated NCI goal, set up of data collection forms for cancer research. Results Various limitations were identified both with respect to content (inconsistency, insufficient definition of elements, redundancy) as well as structure – particularly the need for term and relationship support, as well as the need for metadata supporting the explicit representation of electronic forms that utilize sets of common data elements. Conclusions While there are numerous positive aspects to the CDE effort, there is considerable opportunity for improvement. Our recommendations include review of existing content by diverse experts in the cancer community; integration with the NCI thesaurus to take advantage of the latter’s links to nationally used controlled vocabularies, and various schema enhancements required for electronic form support. PMID:17149500
Pennington, D. D.; Gandara, A.; Gris, I.
The Virtual Learning Commons (VLC), funded by the National Science Foundation Office of Cyberinfrastructure CI-Team Program, is a combination of Semantic Web, mash up, and social networking tools that supports knowledge sharing and innovation across scientific disciplines in research and education communities and networks. The explosion of scientific resources (data, models, algorithms, tools, and cyberinfrastructure) challenges the ability of researchers to be aware of resources that might benefit them. Even when aware, it can be difficult to understand enough about those resources to become potential adopters or re-users. Often scientific data and emerging technologies have little documentation, especially about the context of their use. The VLC tackles this challenge by providing mechanisms for individuals and groups of researchers to organize Web resources into virtual collections, and engage each other around those collections in order to a) learn about potentially relevant resources that are available; b) design research that leverages those resources; and c) develop initial work plans. The VLC aims to support the "fuzzy front end" of innovation, where novel ideas emerge and there is the greatest potential for impact on research design. It is during the fuzzy front end that conceptual collisions across disciplines and exposure to diverse perspectives provide opportunity for creative thinking that can lead to inventive outcomes. The VLC integrates Semantic Web functionality for structuring distributed information, mash up functionality for retrieving and displaying information, and social media for discussing/rating information. We are working to provide three views of information that support researchers in different ways: 1. Innovation Marketplace: supports users as they try to understand what research is being conducted, who is conducting it, where they are located, and who they collaborate with; 2. Conceptual Mapper: supports users as they organize their
Jacob, Merle; Hellström, Tomas
This paper makes a plea for the construction of a common agenda for higher education and science, technology and innovation (STI) policy research. The public higher education and research sector in all countries is currently in the grip of several challenges arising from increased accountability, internationalization and in some cases dwindling…
Clark, Martyn; Kavetski, Dmitri; Fenicia, Fabrizio; Gupta, Hoshin
provide a common framework for model development and analysis. We recognize that the majority of process-based hydrological models use the same set of physics - most models use Darcy's Law to represent the flow of water through the soil matrix and Fourier's Law for thermodynamics. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including different methods to represent spatial variability and different parameterizations of surface fluxes and shallow groundwater. Our analysis isolates individual modeling decisions and uses orthogonal diagnostic signatures to evaluate model behavior. Application of this framework in research basins demonstrates that the combination of (1) flexibility in the numerical model and (2) comprehensive scrutiny of orthogonal signatures provides a powerful approach to identify the suitability of different modeling options and different model parameter values. We contend that this common framework has general utility, and its widespread application in both research basins and at larger spatial scales will help accelerate the development of process-based hydrologic models.
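The two governing laws named above can be sketched as one-line flux calculations. This is a generic textbook form, not the framework's actual discretization; all names are illustrative:

```python
def darcy_flux(k_sat, head_upper, head_lower, dz):
    """Darcy's Law: vertical liquid water flux q = -K * dH/dz between two
    model layers, with total head H = pressure head + elevation head."""
    return -k_sat * (head_upper - head_lower) / dz

def fourier_flux(conductivity, t_upper, t_lower, dz):
    """Fourier's Law: conductive heat flux G = -lambda * dT/dz between layers."""
    return -conductivity * (t_upper - t_lower) / dz
```

In a flexible framework of the kind described, these flux laws form the fixed structural core, while the parameterizations feeding them (e.g. how K varies with moisture) are the swappable modeling decisions.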
Markus, K.A.; Borsboom, D.
Causal theories of measurement view test items as effects of a common cause. Behavior domain theories view test item responses as behaviors sampled from a common domain. A domain score is a composite score over this domain. The question arises whether latent variables can simultaneously constitute
Yang, Xin; Jin, Jiaoying; Xu, Mengling; Wu, Huihui; He, Wanji; Yuchi, Ming; Ding, Mingyue
Carotid atherosclerosis is a major cause of stroke, a leading cause of death and disability. In this paper, a segmentation method based on the Active Shape Model (ASM) is developed and evaluated to outline the common carotid artery (CCA) for carotid atherosclerosis computer-aided evaluation and diagnosis. The proposed method is used to segment both the media-adventitia boundary (MAB) and the lumen-intima boundary (LIB) on transverse-view slices from three-dimensional ultrasound (3D US) images. The data set consists of sixty-eight (17 × 2 × 2) 3D US volumes acquired from the left and right carotid arteries of seventeen patients (eight treated with 80 mg atorvastatin and nine with placebo), who had carotid stenosis of 60% or more, at baseline and after three months of treatment. Boundaries manually outlined by an expert were adopted as the ground truth for evaluation. For the MAB and LIB segmentations, respectively, the algorithm yielded a Dice Similarity Coefficient (DSC) of 94.4% ± 3.2% and 92.8% ± 3.3%, mean absolute distances (MAD) of 0.26 ± 0.18 mm and 0.33 ± 0.21 mm, and maximum absolute distances (MAXD) of 0.75 ± 0.46 mm and 0.84 ± 0.39 mm. It took 4.3 ± 0.5 min to segment a single 3D US image, while manual segmentation took 11.7 ± 1.2 min. The method would promote the translation of carotid 3D US to clinical care for the monitoring of atherosclerotic disease progression and regression. PMID:23533535
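The Dice Similarity Coefficient used for the evaluation above is straightforward to compute from two binary masks. A minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """Dice Similarity Coefficient, DSC = 2|A ∩ B| / (|A| + |B|), between
    two binary segmentation masks (e.g. algorithm vs. expert outline)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```

A DSC of 1.0 indicates perfect overlap and 0.0 indicates none, so values above 90%, as reported here, indicate close agreement with the expert boundaries.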
Lyne, Mike; Smith, Richard N; Lyne, Rachel; Aleksic, Jelena; Hu, Fengyuan; Kalderimis, Alex; Stepan, Radek; Micklem, Gos
Common metabolic and endocrine diseases such as diabetes affect millions of people worldwide and have a major health impact, frequently leading to complications and mortality. In a search for better prevention and treatment, there is ongoing research into the underlying molecular and genetic bases of these complex human diseases, as well as into the links with risk factors such as obesity. Although an increasing number of relevant genomic and proteomic data sets have become available, the quantity and diversity of the data make their efficient exploitation challenging. Here, we present metabolicMine, a data warehouse with a specific focus on the genomics, genetics and proteomics of common metabolic diseases. Developed in collaboration with leading UK metabolic disease groups, metabolicMine integrates data sets from a range of experiments and model organisms alongside tools for exploring them. The current version brings together information covering genes, proteins, orthologues, interactions, gene expression, pathways, ontologies, diseases, genome-wide association studies and single nucleotide polymorphisms. Although the emphasis is on human data, key data sets from mouse and rat are included. These are complemented by interoperation with the RatMine rat genomics database, with a corresponding mouse version under development by the Mouse Genome Informatics (MGI) group. The web interface contains a number of features including keyword search, a library of Search Forms, the QueryBuilder and list analysis tools. This provides researchers with many different ways to analyse, view and flexibly export data. Programming interfaces and automatic code generation in several languages are supported, and many of the features of the web interface are available through web services. The combination of diverse data sets integrated with analysis tools and a powerful query system makes metabolicMine a valuable research resource. The web interface makes it accessible to first
Genetically modified mice have contributed much to studies in the life sciences. In some research fields, however, mouse models are insufficient for analyzing the molecular mechanisms of pathology or as disease models. Often, genetically modified non-human primate (NHP) models are desired, as they are more similar to human physiology, morphology, and anatomy. Recent progress in studies of the reproductive biology in NHPs has enabled the introduction of exogenous genes into NHP genomes or the alteration of endogenous NHP genes. This review summarizes recent progress in the production of genetically modified NHPs, including the common marmoset, and future perspectives for realizing genetically modified NHP models for use in life sciences research. Copyright © 2015 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.
Petersen, Karen Bjerg
This paper sets out to consider concepts of diversity as means to discuss and address the increasing diversity of modern societies and to reflect the development of research priorities for universities in Nordic and Southern African countries. Based on reconceptualisations of theoretical concepts...... countries like e.g. Denmark have responded differently to diversification during the past decades. Based on this, challenges in dealing with diversity as a common research priority for Nordic and African universities will be shortly addressed....
Stohlmann, Micah; Maiorca, Cathrine; Olson, Travis A.
Mathematical modeling is an essential integrated piece of the Common Core State Standards. However, researchers have shown that mathematical modeling activities can be difficult for teachers to implement. Teachers are more likely to implement mathematical modeling activities if they have their own successful experiences with such activities. This…
Roc'h, A.; Bergsma, H.; Bergsma, J.G.; Leferink, Frank Bernardus Johannes
A well-designed common mode filter for motor drive applications can significantly reduce the level of electromagnetic interference generated by the cable and the motor housing. The subsequent design of this filter is strongly dependent on the actual in situ parameters of the motor drive and often
Kajonius, Petri J.
Research is currently testing how the new maladaptive personality inventory for DSM (PID-5) and the well-established common Five-Factor Model (FFM) together can serve as an empirical and theoretical foundation for clinical psychology. The present study investigated the official short version of the PID-5 together with a common short version of…
Davis, Sean D.; Piercy, Fred P.
Some researchers have hypothesized that factors common across therapy models are largely responsible for change. In this study we conducted semi-structured, open-ended qualitative interviews with three different MFT model developers (Dr. Susan M. Johnson, Emotionally Focused Therapy; Dr. Frank M. Dattilio, Cognitive-Behavioral Therapy; and Dr.…
The absence of well-defined inbred lines is an important problem in scientific research on fish. Inbred lines can be produced by conventional full-sib mating, but at least 10-15 generations are needed to produce homozygous inbred lines. Using common carp, which reach maturity
Reviews models commonly used in psychological research, and, particularly, in organizational decision making. An alternative model of organizational decision making is suggested. The model, referred to as the garbage can model, describes a process in which members of an organization collect the problems and solutions they generate by dumping them…
Koskinen, Camilla; Nyström, Lisbet
To clinically and contextually implement the theoretical and factual knowledge of care and caring that has been developed in the last 30 years is seen as a great challenge in caring science research. Emphasis has been put on problem-solving research methodologies and action research in hopes of narrowing the divide between caring theory and clinical practice. Thus, the intention is now to further action research towards a hermeneutic approach and to put emphasis on hermeneutic application where theory and praxis become one through human dialogue. This article highlights hermeneutic application research as an alternative methodology within participatory-oriented research, which presents a new opportunity to unite clinical practice and caring theory. The aim is to contribute to the development of the hermeneutical application research design in its epistemological, ontological and ethical perspective, by articulating and clarifying the central foundations in the application. On the basis of Gadamer's hermeneutical thinking and Levinas's ethical thinking, the central foundations in the application research are ethics, creation of a hermeneutical room, dialogue and common understanding, and appropriation and action. When theoretical understanding turns into praxis, knowledge also becomes activity and theory and practice become one. Application thus realises the basic idea that praxis and theory are one, and thus, theory of caring can only become evident and implemented in clinical practice through moments when the participants find a common understanding and consensus on the knowledge of care and caring. © 2015 Nordic College of Caring Science.
Carstensen, Tina; Kasch, Helge; Frostholm, Lisbeth
Background: Various predisposing, precipitating and perpetuating factors are found to be associated with development of persistent symptoms and disability after whiplash trauma. According to the commonsense model of illness, people use commonsense knowledge to develop individual illness models when...... of precollision sick leave on chronic whiplash? Methods: This presentation will integrate findings from research on predisposing, precipitating, perpetuating factors that are associated with poor outcome after whiplash trauma and propose the common-sense model as a unifying model. Data from a study including 740...... into specific factors in the model we found that negative perception of the whiplash trauma mediated the effect of previous sick leave on future neck pain, sum of indirect effect: 0.017, confidence intervals (0.008;0.028). Conclusion: Previous life experiences have an impact on how people make sense of a health...
Hark, Frank; Ring, Rob; Novack, Steven D.; Britton, Paul
Common Cause Failures (CCFs) are a known and documented phenomenon that defeats system redundancy. CCFs are dependent failures that can be caused, for example, by system environments, manufacturing, transportation, storage, maintenance, and assembly. Since many factors contribute to CCFs, they can be reduced but are difficult to eliminate entirely. Furthermore, failure databases sometimes fail to differentiate between independent and dependent failures. Because common cause failure data are limited in the aerospace industry, the Probabilistic Risk Assessment (PRA) Team at Bastion Technology Inc. is estimating CCF risk using generic data collected by the Nuclear Regulatory Commission (NRC). Consequently, common cause risk estimates based on this database, when applied to other industry applications, are highly uncertain. Therefore, it is important to account for a range of values for independent and CCF risk and to communicate the uncertainty to decision makers. There is an existing methodology for reducing CCF risk during design, which includes a checklist of 40+ factors grouped into eight categories. Using this checklist, an approach is being investigated to produce a beta factor estimate that quantitatively relates these factors. In this example, the checklist will be tailored to space launch vehicles, a quantitative approach will be described, and an example of the method will be presented.
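The beta-factor approach referred to above splits a component's total failure probability into an independent part and a common cause part that fails all redundant trains at once. A minimal sketch follows; the checklist weights, the beta bounds, and the linear scoring rule are all illustrative assumptions, not the methodology's actual values.

```python
def beta_from_checklist(scores, weights):
    """Hypothetical score-based beta-factor estimate: a weighted average of
    defence-category scores (0 = strong defence against CCF, 1 = weak),
    scaled linearly into an assumed plausible beta range."""
    assert len(scores) == len(weights)
    beta_min, beta_max = 0.001, 0.1  # assumed bounds on the beta factor
    s = sum(w * x for w, x in zip(weights, scores)) / sum(weights)
    return beta_min + s * (beta_max - beta_min)

def two_train_unavailability(q_total, beta):
    """Failure probability of a 1-out-of-2 redundant system under the
    beta-factor model: the independent parts must fail both trains
    separately, while the common cause part (beta * q_total) fails both."""
    q_ind = (1.0 - beta) * q_total
    return q_ind ** 2 + beta * q_total
```

With q_total = 1e-3 and beta = 0.05, the common cause term (5e-5) dominates the squared independent term (about 9e-7), illustrating why CCFs defeat redundancy.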
Gullick, Janice; West, Sandra
The purpose of this paper is to describe the use of a common qualitative data set analysed with both a quality improvement tool to facilitate service improvement, and a rigorous research methodology to engage beginning nurse researchers in a mentored project. A qualitative cohort study of the experience of hospitalisation across six diagnostic groups interrogated data from 104 patient and carer interviews using the Picker Dimensions of Experience and Heideggerian Phenomenology. The paper reveals that well-conducted qualitative interviews can provide common ground for service improvement initiatives and rigorous research analysis. The Picker Dimensions use simple coding methods that push findings towards utility, but at times are overly reductionist and exile any data not related to hospital services. Heideggerian phenomenology is training and resource intensive, but its exploration of the meaning of the illness experience provides a profound backdrop for the subsequent understanding of hospitalisation. The access that qualitative data provides to the patient and family's perspective is becoming increasingly valued in processes of ongoing quality improvement, clinical redesign and evaluation for hospital accreditation. The intrinsic rewards of deep qualitative analysis for the staff involved are extraordinary. Clinicians were humbled by new understandings, which surprised them despite their long clinical experience. While quality improvement processes require training, ethics applications and data collection, the same framework can support rigorous qualitative research through use of the data as "common ground". The researchers experienced a tension, but eventually, a balance between the strengths and limitations of these combined modes of qualitative inquiry:
Peeters, P.; Van Hoestenberghe, T.; Vincke, L.; Visser, P.J.
The use of breach models includes two tasks: predicting breach characteristics and estimating flow through the breach. Strengths and weaknesses as well as opportunities and threats of different simplified and detailed physically-based breach models are listed following theoretical and practical
Olesen, H. R.
Proceedings of the Twenty-first NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held November 6-10, 1995, in Baltimore, Maryland.
Orlinsky, D E; Willutzki, U; Meyerberg, J; Cierpka, M; Buchheim, P; Ambühl, H
In view of the great diversity to be found among psychotherapists in many countries in terms of professional background, theoretical orientation, and other personal and demographic characteristics, it is surprising to find certain areas of great commonality. Among the most striking of these are therapists' reports of their ideals and perceptions concerning their manner of relating to their patients. A very large majority of nearly 2,400 therapists surveyed in an on-going study of psychotherapeutic development wanted to and did see their behavior vis-a-vis patients as accepting, friendly, warm, tolerant, committed, and involved. These traits, which indicate a strong proclivity toward forming a positive therapeutic bond or alliance, also closely match qualities that therapists perceive in their own personal relationships. Discussion of these findings focuses on the possible sources and therapeutic consequences of this common pattern of interpersonal behavior.
Full Text Available A multi-dimensional strategy to tackle the global obesity epidemic requires an in-depth understanding of the mechanisms that underlie this complex condition. Much of the current mechanistic knowledge has arisen from preclinical research performed mostly, but not exclusively, in laboratory mouse and rat strains. These experimental models mimic certain aspects of the human condition and its root causes, particularly the over-consumption of calories and unbalanced diets. As with human obesity, obesity in rodents is the result of complex gene–environment interactions. Here, we review the traditional monogenic models of obesity, their contemporary optogenetic and chemogenetic successors, and the use of dietary manipulations and meal-feeding regimes to recapitulate the complexity of human obesity. We critically appraise the strengths and weaknesses of these different models to explore the underlying mechanisms, including the neural circuits that drive behaviours such as appetite control. We also discuss the use of these models for testing and screening anti-obesity drugs, beneficial bio-actives, and nutritional strategies, with the goal of ultimately translating these findings for the treatment of human obesity.
Kenneth J. Knapp
Full Text Available This study proposes using an established common body of knowledge (CBK) as one means of organizing information security literature. Consistent with calls for more relevant information systems (IS) research, this industry-developed framework can motivate future research towards topics that are important to the security practitioner. In this review, forty-eight articles from ten IS journals from 1995 to 2004 are selected and cross-referenced to the ten domains of the information security CBK. Further, we distinguish articles as empirical research, frameworks, or tutorials. Generally, this study identified a need for additional empirical research in every CBK domain, including topics related to legal aspects of information security. Specifically, this study identified a need for additional IS security research relating to applications development, physical security, operations security, and business continuity. The CBK framework is inherently practitioner oriented and using it will promote relevancy by steering IS research towards topics important to practitioners. This is important considering the frequent calls by prominent information systems scholars for more relevant research. Few research frameworks have emerged from the literature that specifically classify the diversity of security threats and range of problems that businesses today face. With the recent surge of interest in security, the need for a comprehensive framework that also promotes relevant research can be of great value.
Chung, Dae Wook; Kang, Chang Soon
The most widely used models in common cause analysis are (single) shock models such as the BFR and the MFR. However, a single shock model cannot treat individual common causes separately and rests on some unrealistic assumptions. Here, a multiple shock model for common cause failures is developed using Markov chain theory. This model treats each common cause shock as a separately and sequentially occurring event, so as to capture the change in the failure probability distribution caused by each shock. The final failure probability distribution is evaluated and compared with that from the BFR model. The results show that the multiple shock model, which minimizes the assumptions of the BFR model, is more realistic and conservative than the BFR model. Further work toward application involves estimating parameters such as the common cause shock rate and the component failure probability given a shock, p, through data analysis
Wächter, Joachim; Hammitzsch, Martin; Kerschke, Dorit; Lauterjung, Jörn
Research infrastructures (RIs) are platforms integrating facilities, resources and services used by the research communities to conduct research and foster innovation. RIs include scientific equipment, e.g., sensor platforms, satellites or other instruments, but also scientific data, sample repositories or archives. E-infrastructures on the other hand provide the technological substratum and middleware to interlink distributed RI components with computing systems and communication networks. The resulting platforms provide the foundation for the design and implementation of RIs and play an increasing role in the advancement and exploitation of knowledge and technology. RIs are regarded as essential to achieve and maintain excellence in research and innovation crucial for the European Research Area (ERA). The implementation of RIs has to be considered as a long-term, complex development process often over a period of 10 or more years. The ongoing construction of Spatial Data Infrastructures (SDIs) provides a good example for the general complexity of infrastructure development processes especially in system-of-systems environments. A set of directives issued by the European Commission provided a framework of guidelines for the implementation processes addressing the relevant content and the encoding of data as well as the standards for service interfaces and the integration of these services into networks. Additionally, a time schedule for the overall construction process has been specified. As a result this process advances with a strong participation of member states and responsible organisations. Today, SDIs provide the operational basis for new digital business processes in both national and local authorities. Currently, the development of integrated RIs in Earth and Environmental Sciences is characterised by the following properties: • A high number of parallel activities on European and national levels with numerous institutes and organisations participating
Full Text Available DETRA (Developing a European Transport Research Alliance) is a 7th Framework project whose concept derived from the so-called Lyon Declaration; it concerns deepening the European Research Area objectives in transport in order to address the Grand Challenges. A key priority of this Alliance is to examine the strengths, weaknesses, opportunities and threats (SWOT) in the domain and to develop common understanding and approaches to reducing fragmentation and overcoming barriers. The DETRA project aimed to meet and exceed the requirements and objectives of the call for an analysis of the state of ERA development within the transport domain and to develop recommendations for the EC, member states and other stakeholders, as well as for the DETRA partner organizations themselves. In this study, particular emphasis is given to the part of DETRA concerning the development of a single trans-European research program, which can serve as a compass for future research activities across the whole transportation area.
Tang, D.L.; Wiseman, E.; Keating, T.; Archambeault, J.
The National Research Council of Canada (NRC) co-chairs an international working group on performance benchmarking and impact assessment of Research and Technology Organizations (RTO). The Knowledge Management branch of the NRC conducted the patent analysis portion of the benchmarking study. In this paper, we present a Weighted Originality index that can more accurately measure the spread of technological combinations in terms of hierarchical patent classifications. Using this patent indicator, we revealed a common pattern of distribution of invention originality in RTOs. Our work contributes to the methodological advancement of patent measures for the scientometric community. (Author)
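The Weighted Originality index described in this abstract builds on the classic (Trajtenberg-style) originality measure: one minus the Herfindahl concentration of the technology classes cited by a patent. A sketch of that baseline measure follows; the hierarchical class-distance weighting that the paper adds is not reproduced here, as its exact scheme is not given in the abstract.

```python
from collections import Counter

def originality_index(cited_classes):
    """Classic (unweighted) originality index of a patent: 1 minus the
    Herfindahl concentration of the technology classes of its backward
    citations. 0 = all citations in one class; values approaching 1 mean
    the invention draws on many different classes."""
    counts = Counter(cited_classes)
    n = sum(counts.values())
    if n == 0:
        return 0.0  # no backward citations: no spread to measure
    return 1.0 - sum((c / n) ** 2 for c in counts.values())
```

For example, a patent citing only class "A" scores 0, while one citing four distinct classes equally scores 0.75, reflecting a broader technological combination.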
Savannah L. Kelly
Full Text Available Abstract Objective – The purpose of this quantitative study was to measure the impact of providing research struggle videos on first-year students' research self-efficacy. The three-part video series explicated and briefly addressed common first-year roadblocks related to searching, evaluating, and caring about sources. The null hypothesis tested was that students would have similar research self-efficacy scores regardless of exposure to the video series. Methods – The study was a quasi-experimental, nonequivalent control group design. The population included all 22 sections (N = 359) of First-Year Writing affiliated with the FASTrack Learning Community at the University of Mississippi. Of the 22 sections, 12 (N = 212) served as the intervention group exposed to the videos, while the other 10 (N = 147) served as the control group. A research self-efficacy pretest-posttest measure was administered to all students. In addition, all 22 sections, regardless of control or intervention status, received a face-to-face one-shot library instruction session. Results – As a whole, this study failed to reject the null hypothesis. Students exposed to the research struggle videos reported research self-efficacy scores similar to those of students who were not exposed to the videos. A significant difference, however, did exist between all students' pretest and posttest scores, suggesting that something else, possibly the in-person library session, did have an impact on students' research self-efficacy. Conclusion – Although students' research self-efficacy may have increased due to the presence of an in-person library session, this current research was most interested in evaluating the effect of providing supplemental instruction via struggle videos for first-year students. As this was not substantiated, it is recommended that researchers review the findings and limitations of this current study in order to identify more effective approaches in providing
Terry Robert F
Full Text Available Abstract Health research priority setting processes assist researchers and policymakers in effectively targeting research that has the greatest potential public health benefit. Many different approaches to health research prioritization exist, but there is no agreement on what might constitute best practice. Moreover, because of the many different contexts for which priorities can be set, attempting to produce one best practice is in fact not appropriate, as the optimal approach varies per exercise. Therefore, following a literature review and an analysis of health research priority setting exercises that were organized or coordinated by the World Health Organization since 2005, we propose a checklist for health research priority setting that allows for informed choices on different approaches and outlines nine common themes of good practice. It is intended to provide generic assistance for planning health research prioritization processes. The checklist explains what needs to be clarified in order to establish the context for which priorities are set; it reviews available approaches to health research priority setting; it offers discussions on stakeholder participation and information gathering; it sets out options for use of criteria and different methods for deciding upon priorities; and it emphasizes the importance of well-planned implementation, evaluation and transparency.
Nicole Leite Galvão-Coelho
Full Text Available Major depression is a psychiatric disorder with high prevalence in the general population and increasing expression in adolescence, affecting about 14% of young people. Frequently, it presents as a chronic condition, showing no remission even after several pharmacological treatments and persisting into adult life. Therefore, distinct protocols and animal models have been developed to increase understanding of this disease or to search for new therapies. To this end, this study investigated the effects of chronic social isolation and the potential antidepressant action of nortriptyline in juvenile Callithrix jacchus males and females by monitoring fecal cortisol, body weight, and behavioral parameters, searching for biomarkers and a protocol for inducing depression. The purpose was to validate this species and protocol as a translational model of juvenile depression, addressing all domain criteria of validation: etiologic, face, functional, predictive, inter-relational, evolutionary, and population. In both sexes and both protocols (IDS and DPT), we observed a significant reduction in cortisol levels in the last phase of social isolation, concomitant with increases in autogrooming, stereotyped and anxiety behaviors, and the presence of anhedonia. The alterations induced by chronic social isolation are characteristic of the depressive state in non-human primates and/or in humans, and were reversed in large part by treatment with an antidepressant drug (nortriptyline). Therefore, these results indicate C. jacchus as a potential translational model of juvenile depression, addressing all criteria of validation.
The common utilization of the facilities of the Japan Atomic Energy Research Institute by universities has been carried out for 20 years, and it has contributed greatly to progress in basic research on atomic energy and to the training of personnel in the atomic energy field. This report summarizes the results of research carried out actively in 1980. The total number of subjects in the common utilization of reactors and other facilities was 126, and the total number of visitors during the year was 3356 man-days. 19 cold rooms and 6 hot rooms were leased from the JAERI as the university open laboratory, equipped with Ge(Li) semiconductor detectors, multiple pulse height analyzers, gas chromatographs, spectrophotometers, Moessbauer effect measuring equipment, X-ray diffraction equipment and other instruments. A minicomputer was installed in 1978, and preparation is in progress to make it available as a new correlation measuring system. A pure Ge semiconductor detector and a 4000-channel multiple pulse height analyzer were additionally installed in 1980. The state of RI management and radiation control in the open laboratory is reported, and abstracts of the research reports are provided. (Kako, I.)
NUCLEAR REGULATORY COMMISSION: Draft NUREG, "Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research," issued for public comment.
Full Text Available Model coupling requires a thorough conceptualisation of the coupling strategy, including an exact definition of the individual model domains, the "transboundary" processes and the exchange parameters. It is shown here that in the case of coupling groundwater flow and hydrological models – in particular on the regional scale – it is very important to find a common definition and scale-appropriate process description of groundwater recharge and baseflow (or "groundwater runoff/discharge") in order to achieve a meaningful representation of the processes that link the unsaturated and saturated zones and the river network. As such, integration by means of coupling established disciplinary models is problematic given that in such models, processes are defined from a purpose-oriented, disciplinary perspective and are therefore not necessarily consistent with definitions of the same process in the model concepts of other disciplines. This article contains a general introduction to the requirements and challenges of model coupling in Integrated Water Resources Management including a definition of the most relevant technical terms, a short description of the commonly used approach of model coupling and finally a detailed consideration of the role of groundwater recharge and baseflow in coupling groundwater models with hydrological models. The conclusions summarize the most relevant problems rather than giving practical solutions. This paper aims to point out that working on a large scale in an integrated context requires rethinking traditional disciplinary workflows and encouraging communication between the different disciplines involved. It is worth noting that the aspects discussed here are mainly viewed from a groundwater perspective, which reflects the author's background.
Hudson, Cody L; Topaloglu, Umit; Bian, Jiang; Hogan, William; Kieber-Emmons, Thomas
Clinical research data generated by a federation of collection mechanisms and systems often produce highly dissimilar data of varying quality. Poor data quality can result in the inefficient use of research data or can even require repetition of the performed studies, a costly process. This work presents two tools for improving the quality of clinical research data, relying on the National Cancer Institute's Common Data Elements (CDEs) as a standard representation of possible questions and data elements, to (a) automatically suggest CDE annotations for already-collected data, based on semantic and syntactic analysis utilizing the Unified Medical Language System (UMLS) Terminology Services' Metathesaurus, and (b) annotate and constrain new clinical research questions through a simple-to-use "CDE Browser." In this work, these tools are built and tested on the open-source LimeSurvey software and on research data analyzed and identified to contain various data quality issues captured by the Comprehensive Research Informatics Suite (CRIS) at the University of Arkansas for Medical Sciences.
Egberto Ribeiro Turato
Full Text Available CONTEXT AND OBJECTIVE: Medical literature should consist of knowledge applicable to professional education; nevertheless, the profusion of articles in databases provokes disquiet among students. The authors considered the premise that scientific production in the field of health follows a mechanical description of phenomena without the clarity of motivating questions. The aim was to interpret material from expert reports, applied by medical students to analyze articles from renowned journals. DESIGN AND SETTING: This research project was exploratory, searching for latent meanings regarding methodological problems in a sample of papers. It was performed in a Brazilian medical school. METHODS: The sample was intentionally built, consisting of articles related to original research in the field of health, published over the previous five years. The results came from text content analysis, performed by a professor and his medical students. RESULTS: (1) Failure to state a hypothesis is an equivocal practice: articles did not show clarity of hypothesis to demonstrate that their authors had epistemological knowledge of the methods chosen. (2) There is a certain belief that in normal scientific practice, hypotheses are unnecessary: studies without explicit hypotheses led to suppositions that they merely repeat dominant models. (3) Presentation of common sense as scientific conclusions: research brings together what would have mobilized the researchers initially. CONCLUSIONS: Absence of formal hypotheses leaves scientific production vulnerable when put under epistemological discussion. Conclusions from scientific articles are often confounded with common-sense statements. Quantitative research is suggested, for studying the frequency of occurrence of these dubious methodological points.
Fleming, K.N.; Mosleh, A.; Deremer, R.K.
Common cause events are an important class of dependent events with respect to their contribution to system unavailability and to plant risk. Unfortunately, these events have not been treated with any kind of consistency in applied risk studies over the past decade. Many probabilistic risk assessments (PRAs) have not included these events at all, and those that have did not employ the kind of systematic procedures needed to achieve consistency, accuracy, and credibility in this area of PRA methodology. In this paper, the authors report on recent progress in the development of a systematic approach for incorporating common cause events into applied risk and reliability evaluations. This approach takes advantage of experience from recently completed PRAs and is the result of a project, sponsored by the Electric Power Research Institute (EPRI), in which procedures for dependent-events analysis are being developed. The paper describes a general framework for system-level common cause failure (CCF) analysis and its application to a three-train auxiliary feedwater system. Within this general framework, three parametric CCF models are compared: the basic parameter (BP), multiple Greek letter (MGL), and binomial failure rate (BFR) models. Pitfalls of not following the recommended procedure are discussed, and some old issues, such as the benefits of redundancy and diversity, are reexamined.
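The three-train MGL comparison lends itself to a short numerical sketch. The mapping below from MGL parameters to basic-parameter failure probabilities follows the standard MGL formulation; the values of Q_t, beta, and gamma are illustrative assumptions, not figures from the study.

```python
# A minimal sketch of the multiple Greek letter (MGL) parameterization for
# an m-train system. Q_t is the total failure probability per component;
# beta, gamma, ... are the MGL parameters. Numerical values below are
# illustrative assumptions only.
from math import comb

def mgl_basic_parameters(q_total, greeks):
    """Map MGL parameters to basic-parameter probabilities Q_1..Q_m.

    greeks = [beta, gamma, ...]; rho_1 = 1 and rho_{m+1} = 0 implicitly.
    Q_k = (1 / C(m-1, k-1)) * (prod_{i<=k} rho_i) * (1 - rho_{k+1}) * Q_t
    """
    rho = [1.0] + list(greeks) + [0.0]   # rho_1..rho_{m+1}, zero-indexed
    m = len(rho) - 1
    q = []
    prod = 1.0
    for k in range(1, m + 1):
        prod *= rho[k - 1]               # running product rho_1..rho_k
        q.append(prod * (1.0 - rho[k]) * q_total / comb(m - 1, k - 1))
    return q

# Three-train example with assumed beta = 0.10, gamma = 0.50:
# Q1 = (1-beta)Qt, Q2 = beta(1-gamma)Qt/2, Q3 = beta*gamma*Qt
q1, q2, q3 = mgl_basic_parameters(1e-3, [0.10, 0.50])
```

The same basic-parameter outputs could then feed a fault-tree quantification, which is what makes the BP, MGL, and BFR models directly comparable at the system level.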
Pierson, Kawika; Hand, Michael L.; Thompson, Fred
Quantitative public financial management research focused on local governments is limited by the absence of a common database for empirical analysis. While the U.S. Census Bureau distributes government finance data that some scholars have utilized, the arduous process of collecting, interpreting, and organizing the data has made its adoption prohibitive and inconsistent. In this article we offer a single, coherent resource that contains all of the government financial data from 1967-2012, uses easy-to-understand natural-language variable names, and will be extended when new data become available. PMID:26107821
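The intended ease of use can be illustrated with a toy query over a file shaped like the resource described. The column names and figures here are hypothetical stand-ins, not drawn from the actual Census-based dataset.

```python
# Sketch of the kind of analysis natural-language variable names enable:
# aggregate a surplus measure by fiscal year from a flat CSV. All names
# and numbers are hypothetical examples.
import csv, io

sample = io.StringIO(
    "year,government_name,total_revenue,total_expenditure\n"
    "2011,Springfield,120,110\n"
    "2012,Springfield,130,125\n"
    "2012,Shelbyville,90,95\n"
)

def surplus_by_year(fileobj):
    """Aggregate revenue minus expenditure per fiscal year."""
    totals = {}
    for row in csv.DictReader(fileobj):
        diff = float(row["total_revenue"]) - float(row["total_expenditure"])
        totals[row["year"]] = totals.get(row["year"], 0.0) + diff
    return totals

print(surplus_by_year(sample))
```

Self-describing column names are what make a query like this readable without a separate codebook, which is the usability gap the resource aims to close.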
Brinkman, T. J.; Cold, H.; Stinchcomb, T.; Brown, C.; Hollingsworth, T. N.
Indigenous communities in the Arctic have received increased attention from scientists in recent decades because of rapid climate change and resource development. Although many successful collaborations have occurred, some communities have been overwhelmed by the volume of research activity and frustrated with inadequate integration of local priorities into the research agenda. We present a northern case study to demonstrate how these challenges can be overcome through innovative community-based research and responsive scientific study designs. We collaborated with the community of Nuiqsut, Alaska to pilot a monitoring program that used camera-equipped GPS units to document social-ecological changes important to the community. Nuiqsut residents embraced an engagement strategy that avoided common methods of community collaboration (e.g., interviews) and instead utilized novel, locally accessible tools for documenting change. The monitoring program structure facilitated the integration of indigenous knowledge (traditional ecological knowledge, TEK) with western science. Scientists from diverse disciplines benefitted from local narratives on biophysical and social disturbances relevant to their research. The community benefitted from several subsequent scientific investigations that were launched to address the most pressing concerns voiced by local residents. Our community-based research strategy has expanded to ten rural communities within the last year. We share our story and provide specific recommendations for enhancing community collaborations.
Huber, Robert; Beranzoli, Laura; Fiebig, Markus; Gilbert, Olivier; Laj, Paolo; Mazzola, Mauro; Paris, Jean-Daniel; Pedersen, Helle; Stocker, Markus; Vitale, Vito; Waldmann, Christoph
European Environmental Research Infrastructures (RIs) frequently comprise in situ observatories, from large-scale networks of platforms or sites to local networks of various sensors. Network operation is usually a cumbersome aspect of these RIs, facing specific technological problems related to operations in remote areas, maintenance of the network, transmission of observation values, etc. Robust inter-connection within and across these networks is still in its infancy, and the burden increases with the remoteness of the station, the harshness of environmental conditions, and the unavailability of classic communication systems, which is a common feature here. Although existing RIs have developed ad hoc solutions to overcome specific problems, and innovative technologies are becoming available, no common approach yet exists. Within the European project ENVRIplus, a dedicated work package aims to stimulate common network operation technologies and approaches in terms of power supply and storage, robustness, and data transmission. The major objectives of this task are to review existing technologies and RI requirements, propose innovative solutions, and evaluate the standardization potential prior to wider deployment across networks. Focus areas within these efforts are improving energy production and storage units, testing the robustness of RI equipment under extreme conditions, and methodologies for robust data transmission. We will introduce current project activities, which are coordinated at various levels including the engineering and data management perspectives, and explain how environmental RIs can benefit from the developments.
Claudot, Frédérique; Alla, François; Fresson, Jeanne; Calvez, Thierry; Coudane, Henry; Bonaïti-Pellié, Catherine
Background: Research ethics have become universal in their principles through international agreements. The standardization of regulations facilitates the internationalization of research concerning drugs. However, in so-called observational studies (i.e., those based on data collected retrospectively or prospectively, obtained without any additional therapy or monitoring procedure), the modalities used for applying the main principles vary from one country to another. This situation may entail problems for the conduct of multi-centric international studies, as well as for the publication of results if the authors and editors come from countries governed by different regulations. In particular, several French observational studies were rejected or retracted by United States peer-reviewed journals because their protocols had not been submitted to an Institutional Review Board/Independent Ethics Committee (IRB/IEC). Methods: National legislation case analysis. Results: In accordance with European regulation, French observational studies based on data obtained without any additional therapy or monitoring procedure do not need the approval of an IRB/IEC. Nevertheless, such research is exempt neither from scientific opinion nor from ethical and legal authorization. Conclusion: We wish to demonstrate through this example that different bodies of law can provide equivalent levels of protection that respect the same ethical principles. Our purpose in writing this paper was to encourage public bodies, scientific journals, and researchers to gain a better understanding of the various sets of specific national regulations and to speak a common language. PMID:19336436
Korsah, Kofi; Wood, Richard Thomas
Experience with applying current guidance and practices for common-cause failure (CCF) mitigation to digital instrumentation and control (I&C) systems has proven problematic, and the regulatory environment has been unpredictable. The impact of CCF vulnerability is to inhibit I&C modernization and, thereby, challenge the long-term sustainability of existing plants. For new plants and advanced reactor concepts, the issue of CCF vulnerability for highly integrated digital I&C systems imposes a design burden resulting in higher costs and increased complexity. The regulatory uncertainty regarding which mitigation strategies are acceptable (e.g., what diversity is needed and how much is sufficient) drives designers to adopt complicated, costly solutions devised for existing plants. The conditions that constrain the transition to digital I&C technology by the U.S. nuclear industry require crosscutting research to resolve uncertainty, demonstrate necessary characteristics, and establish an objective basis for qualification of digital technology for use in Nuclear Power Plant (NPP) I&C applications. To fulfill this research need, Oak Ridge National Laboratory is conducting an investigation into mitigation of CCF vulnerability for nuclear-qualified applications. The outcome of this research is expected to contribute to a fundamentally sound, comprehensive technical basis for establishing the qualification of digital technology for nuclear power applications. This report documents the investigation of modeling approaches for representing failure of I&C systems. Failure models are used when there is a need to analyze how the probability of success (or failure) of a system depends on the success (or failure) of individual elements. If these failure models are extensible to represent CCF, then they can be employed to support analysis of CCF vulnerabilities and mitigation strategies. Specifically, the research findings documented in this report identify modeling approaches that
Souza, Alejandro J.; Bell, Paul S.; Amoudry, Laurent
U.K. Sediment Initiative 2009: Developing Multidisciplinary Sediment Dynamics Research in a Strategic Context;Liverpool, United Kingdom, 27-29 April 2009; A workshop funded by the U.K. Natural Environment Research Council (NERC) brought together U.K.-based researchers, stakeholders, and policy makers with an interest in sediment processes to foster collaborative links and to help NERC theme leaders develop future Theme Action Plans (TAPs). This could be achieved only by identifying gaps in knowledge and prioritizing research needs toward formulating a U.K. sediment transport research strategy. More than 50 participants from NERC research and collaborative centers, U.K. higher education institution researchers, industry consultants, and government departments and agencies attended the workshop. The workshop was divided into three parts. First, in an introduction, guest speakers discussed the importance of sediment transport from different perspectives. The speakers included Darius Campbell (U.K. Department for Environment, Food, and Rural Affairs) on policy drivers, Richard Whitehouse (HR Wallingford, Ltd.) and David Lambkin (ABPMer, Ltd.) on industry needs, John Rees (NERC theme leader) on the NERC TAPs and possible funding opportunities, Alan Davies (University of Bangor) on the academic research perspective, and Chris Sherwood (U.S. Geological Survey) on international insight on large sediment transport projects and on developments of the U.S. National Community Sediment Transport Model project. Second, in a series of breakout sessions, participants considered the stakeholders’ needs, the different dynamic areas of the coastal ocean, and the process time scales. Third, the workshop group discussed possible funding streams and ways to better formulate a concerted research plan for presentation to funding bodies.
Makda, Ishtiyaq Ahmed; Nymand, Morten; Madawala, Udaya
In this paper, common mode noise modeling of low-voltage high-current isolated full-bridge boost dc-dc converters intended for fuel cell applications is presented. Due to the tightly coupled primary and secondary windings of the transformer, such a converter has inherently large capacitive coupling between input and output, which is normally associated with high common mode noise generation. In this work, common mode noise sources in the converter are identified, and a common mode noise model is developed. Based on the established noise model, a practical CM filter is designed to comply…
Lu, George C.
The purpose of the EXPRESS (Expedite the PRocessing of Experiments to Space Station) rack project is to provide a set of predefined interfaces for scientific payloads which allow rapid integration into a payload rack on the International Space Station (ISS). VxWorks was selected as the operating system for the rack and payload resource controller, primarily based on the proliferation of VME (Versa Module Eurocard) products. These products provide needed flexibility for future hardware upgrades to meet ever-changing science research rack configuration requirements. On the International Space Station, there are multiple science research rack configurations, including: 1) Human Research Facility (HRF); 2) EXPRESS ARIS (Active Rack Isolation System); 3) WORF (Window Observational Research Facility); and 4) HHR (Habitat Holding Rack). The RIC (Rack Interface Controller) connects payloads to the ISS bus architecture for data transfer between the payload and ground control. The RIC is a general-purpose embedded computer which supports multiple communication protocols, including fiber optic communication buses, Ethernet buses, EIA-422, Mil-Std-1553 buses, SMPTE (Society of Motion Picture and Television Engineers)-170M video, and audio interfaces to payloads and the ISS. As a cost-saving and software reliability strategy, the Boeing Payload Software Organization developed reusable common software where appropriate. These reusable modules included a set of low-level driver software interfaces to 1553B, RS232, RS422, Ethernet buses, HRDL (High Rate Data Link), video switch functionality, telemetry processing, and executive software hosted on the FUC computer. These drivers formed the basis for software development of the HRF, EXPRESS, EXPRESS ARIS, WORF, and HHR RIC executable modules. The reusable RIC common software has provided extensive benefits, including: 1) Significant reduction in development flow time; 2) Minimal rework and maintenance; 3) Improved reliability; and 4) Overall
Weinfurt, Kevin P; Bollinger, Juli M; Brelsford, Kathleen M; Bresciani, Martina; Lampron, Zachary; Lin, Li; Topazian, Rachel J; Sugarman, Jeremy
For pragmatic clinical research comparing commonly used treatments, questions exist about whether and how to notify participants and secure their authorization for participation. The objective was to determine how patients react when they seek clinical care and encounter one of several different pragmatic clinical research studies. In an online survey using a between-subjects experimental design, respondents (English-speaking US adults 18 years and older) read and responded to 1 of 24 hypothetical research scenarios reflecting different types of studies and approaches to notification and authorization (e.g., general notification, oral consent, written consent). Outcomes were willingness to participate in the hypothetical study, acceptability of the notification and authorization approach, understanding of the study, perceptions of benefit/harm, trust, and perception of the amount of study information received. Willingness to participate did not differ by notification and authorization approach. Some (21%-36%) of the patients randomized to general notification with an explicit opt-out provision were not aware they would be enrolled by default. Acceptability was greatest for, and similar among, notification and authorization approaches that actively engaged the patient (e.g., oral or written consent), and lower for approaches with less engagement (e.g., general notification). Problems of understanding were found among 20%-55% of respondents, depending on the particular scenario. Most respondents (77%-94%) felt that participation in the hypothetical study posed no risks of harm to their health or privacy. Current attitudes about notification and authorization approaches, together with difficulties understanding pragmatic clinical research, pose significant challenges for pragmatic research. Data from this study provide a starting point for developing solutions to these surprisingly complex issues.
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis; develop, test, and execute models; and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
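The kind of trace these libraries capture can be sketched as plain PROV-style triples. The identifiers below are hypothetical, and this is only a minimal sketch: the real `recordr` and `matlab-dataone` libraries emit richer ProvONE/PROV-O metadata.

```python
# Minimal sketch of a provenance trace as W3C PROV-style triples linking
# one script execution to its inputs, outputs, and responsible agent.
# All identifiers are hypothetical examples.
def prov_trace(script, inputs, outputs, agent):
    """Return PROV-style (subject, predicate, object) triples for one run."""
    run = f"execution-of-{script}"
    triples = [(run, "prov:wasAssociatedWith", agent)]
    triples += [(run, "prov:used", data) for data in inputs]
    triples += [(out, "prov:wasGeneratedBy", run) for out in outputs]
    return triples

trace = prov_trace("analysis.R", ["temperature.csv"], ["figure1.png"],
                   "orcid:0000-0001-2345-6789")
for subj, pred, obj in trace:
    print(subj, pred, obj)
```

Traces of this shape are what make it possible to walk backward from a figure to the script and input data that produced it, which is the relationship the DataONE portal displays.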
Roesch, R; Ogloff, J R; Eaves, D
There is a need for researchers and policy makers in the area of mental health and law to collaborate and develop common methods of approach to research. Although we have learned a great deal about the prevalence and needs of mentally ill offenders in jails and prisons, there are a number of research questions that remain. If the "second generation" of research is to be fruitful--and useful to policy makers--we need to be sure that the methods we employ are valid and that the findings we obtain are reliable. By collaborating with colleagues in other jurisdictions, we can begin to learn whether some of the existing findings are of a general nature, or dependent upon the system in which they were found. Similarly, while the first-generation research has alerted us to the needs of mentally ill offenders in jails and prisons, second-generation research is needed to help identify factors that may help prevent the "revolving door phenomenon," which results in mentally ill people being volleyed among mental health, criminal justice, and community settings. One area that has received embarrassingly little attention has been the need for considering the relationship between substance abuse and mental disorders. In our own work, we have found an alarmingly high degree of substance abuse among offenders, including mentally ill offenders. We have come to realize the importance of considering the role that substance abuse coupled with other mental disorders may play in the criminal justice system. As a result of this concern, the Surrey Mental Health Project recently hired a full-time drug and alcohol counselor whose job it is to work with inmates with substance abuse disorders while in the jail, and to help arrange continuing treatment resources upon their release. As Wilson et al. (1995) discuss, intensive case management projects may be particularly useful at targeting the unique needs of mentally ill offenders with multiple problems. Much of the research conducted with
Prins, Noeline W; Pohlmeyer, Eric A; Debnath, Shubham; Mylavarapu, Ramanamurthy; Geng, Shijia; Sanchez, Justin C; Rothen, Daniel; Prasad, Abhishek
The common marmoset (Callithrix jacchus) has been proposed as a suitable bridge between rodents and larger primates. Marmosets have been used in several types of research, including auditory, vocal, visual, pharmacological, and genetics studies, but have not been used as much for behavioral studies. Here we present data from training 12 adult marmosets for behavioral neuroscience studies. We discuss husbandry, food preferences, handling, acclimation to laboratory environments, and neurosurgical techniques. In this paper, we also present a custom-built "scoop" and a monkey chair suitable for training these animals. The animals were trained on three tasks: a four-target center-out reaching task, a reaching task in which reaches controlled robot actions, and a touch-screen task. All animals learned the center-out reaching task within 1-2 weeks, whereas learning the robot-control reaching task took several months of behavioral training, during which the monkeys learned to associate robot actions with food rewards. We propose the marmoset as a model for behavioral neuroscience research and an alternative to larger primate models, owing to its ease of handling, quick reproduction, available neuroanatomy, sensorimotor system similar to that of larger primates and humans, and a lissencephalic brain that enables relatively easy implantation of microelectrode arrays at various cortical locations compared with larger primates. All animals learned the behavioral tasks well, and we present the marmoset as an alternative model for simple behavioral neuroscience tasks.
The common rail pressure has a direct influence on the working stability of Opposed-Piston Two-Stroke (OP2S) diesel engines, especially on performance indexes such as power, economy, and emissions. Meanwhile, rail pressure overshoot occurs frequently due to the operating characteristics of OP2S diesel engines, which can lead to serious consequences. To solve the rail pressure overshoot problem of OP2S diesel engines, a nonlinear concerted algorithm adding speed state feedback was investigated. First, a nonlinear Linear Parameter Varying (LPV) model was used to describe the coupling relationship between engine speed and rail pressure. The Linear Quadratic Regulator (LQR) optimal control algorithm was applied to design the controller using feedback of speed and rail pressure. Second, cooperating with the switching characteristics of the injectors, co-simulation of MATLAB/Simulink and GT-Power was used to verify the validity of the control algorithm and analyze the workspaces for both normal and special sections. Finally, bench test results showed that the accuracy of the rail pressure control was within ±1 MPa under sudden speed increases of 600 r/min. In addition, the fuel mass was reduced by 76.3% compared with the maximum fuel supply quantity, and the rail pressure fluctuation was less than 20 MPa. The algorithm is also appropriate for other types of common rail systems thanks to its universality.
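The LQR design step can be illustrated on a toy scalar stand-in for the rail-pressure dynamics. The first-order model dp/dt = a*p + b*u and its coefficients below are assumptions for illustration, not the LPV model identified in the paper.

```python
# Toy sketch of LQR state feedback for rail-pressure regulation on a
# first-order scalar plant dp/dt = a*p + b*u. Coefficients are assumed
# illustrative values, not engine data.
import math

def lqr_scalar(a, b, q, r):
    """Solve the scalar continuous-time algebraic Riccati equation
    2*a*P - (b**2/r)*P**2 + q = 0 for P > 0, and return the gain K = b*P/r."""
    c2 = b * b / r                               # quadratic coefficient in P
    p = (2 * a + math.sqrt(4 * a * a + 4 * c2 * q)) / (2 * c2)
    return b * p / r

# Stable plant, moderate penalty on pressure error vs. actuator effort.
k = lqr_scalar(a=-0.5, b=2.0, q=1.0, r=0.1)
u = -k * 5.0  # control action for a 5 MPa pressure error (illustrative)
```

In the actual controller the gain would be scheduled over the LPV operating points rather than fixed, but the Riccati-based design at each point follows the same pattern.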
Cui Xuee; Li Minghua; Wang Yongli; Cheng Yingsheng; Li Wenbin
Objective: To study the feasibility of establishing an experimental model of the human internal carotid artery siphon segment in the canine common carotid artery (CCA) by end-to-end anastomosis of a segment of one common carotid artery with the contralateral common carotid artery. Methods: Surgical techniques were used to create the siphon model in 8 canines. One CCA served as the parent artery and was anastomosed with the excised contralateral CCA segment, which had been passed through an S-shaped glass tube. Two weeks after model creation, angiography showed that the model siphons were patent. Results: Experimental models of the human internal carotid artery siphon segment were successfully created in all 8 dogs. Conclusions: It is feasible to establish an experimental canine common carotid artery model of the siphon segment simulating the human internal carotid artery.
Revelo, Renata A.; Loui, Michael C.
We studied mentoring relationships between undergraduate and graduate students in a summer undergraduate research program, over three years. Using a grounded theory approach, we created a model of research mentoring that describes how the roles of the mentor and the student can change. Whereas previous models of research mentoring ignored student…
Redman, P M; Kelly, J A; Albright, E D; Anderson, P F; Mulder, C; Schnell, E H
The establishment of the HealthWeb project by twelve health sciences libraries provides a collaborative means of organizing and enhancing access to Internet resources for the international health sciences community. The project is based on the idea that the Internet is common ground for all libraries and that through collaboration a more comprehensive, robust, and long-lasting information product can be maintained. The participants include more than seventy librarians from the health sciences libraries of the Committee on Institutional Cooperation (CIC), an academic consortium of twelve major research universities. The Greater Midwest Region of the National Network of Libraries of Medicine serves as a cosponsor. HealthWeb is an information resource that provides access to evaluated, annotated Internet resources via the World Wide Web. The project vision as well as the progress reported on its implementation may serve as a model for other collaborative Internet projects.
Bodea, Constanta Nicoleta; Dascalu, Mariana Iuliana
The authors propose a risk evaluation model for research projects. The model is based on fuzzy inference. The knowledge base for the fuzzy process is built with a causal and cognitive map of risks. The map was developed especially for research projects, taking into account their typical lifecycle. The model was applied to an e-testing research…
Background/Aim/Purpose: A commonly implemented software process improvement framework is the Capability Maturity Model Integration (CMMI). Existing literature indicates that higher levels of CMMI maturity can result in a loss of agility due to its organizational focus. To maintain agility, research has focused attention on agile maturity models. The objective of this paper is to find the common research themes and conclusions in agile maturity model research. Methodology: This research adopts a systematic approach to agile maturity model research, using Google Scholar, Science Direct, and IEEE Xplore as sources. In total, 531 articles initially matched the search criteria; these were filtered to 39 articles by applying specific exclusion criteria. Contribution: The article highlights the trends in agile maturity model research, specifically bringing to light the lack of research providing validation of such models. Findings: Two major themes emerge: the coexistence of agile and CMMI, and the development of maturity models based on agile principles. The research trend indicates an increase in agile maturity model articles, particularly in the latter half of the last decade, with concentrations of research coinciding with version updates of CMMI. While there is general consensus that higher CMMI maturity levels are incompatible with true agility, there is evidence of the two coexisting when agile is introduced into already highly matured environments. Future Research: Future research directions for this topic include how to attain higher levels of CMMI maturity using only agile methods, how governance is addressed in agile environments, and whether existing agile maturity models relate to improved project success.
Yan, Kaihong; Dong, Zhaomin; Liu, Yanju; Naidu, Ravi
Bioaccessibility, used to assess potential risks resulting from exposure to Pb-contaminated soils, is commonly estimated using various in vitro methods. However, existing in vitro methods yield different results depending on the composition of the extractant as well as the contaminated soils. For this reason, the relationships between five commonly used in vitro methods, the Relative Bioavailability Leaching Procedure (RBALP), the unified BioAccessibility Research Group Europe (BARGE) method (UBM), the Solubility Bioaccessibility Research Consortium assay (SBRC), a Physiologically Based Extraction Test (PBET), and the in vitro Digestion Model (RIVM), were quantified statistically using 10 soils from long-term Pb-contaminated mining and smelter sites located in Western Australia and South Australia. Across all 10 soils and all in vitro methods, measured Pb bioaccessibility varied from 1.9 to 106% for the gastric phase, higher than the 0.2-78.6% range for the intestinal phase. The variation in Pb bioaccessibility depends on the in vitro model being used, suggesting that the method chosen for bioaccessibility assessment must be validated against in vivo studies prior to use for predicting risk. Regression studies between RBALP and SBRC, and between RBALP and RIVM (0.06) (0.06 g of soil in each tube; S:L ratios for the gastric and intestinal phases of 1:375 and 1:958, respectively), showed that Pb bioaccessibility based on the three methods was comparable. Meanwhile, the slopes between RBALP and UBM, and between RBALP and RIVM (0.6) (0.6 g of soil in each tube; S:L ratios for the gastric and intestinal phases of 1:37.5 and 1:96, respectively), were 1.21 and 1.02, respectively. The findings presented in this study could help standardize in vitro bioaccessibility measurements and provide a scientific basis for further relating Pb bioavailability to soil properties.
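The inter-method regression step can be sketched as an ordinary least-squares fit on paired measurements. The data points below are invented for illustration only; the study itself reports slopes such as 1.21 for RBALP versus UBM.

```python
# Sketch of the regression relating paired bioaccessibility measurements
# from two in vitro methods. Data points are hypothetical examples.
def regression_slope(x, y):
    """Ordinary least-squares slope of y on x (with intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

ubm = [10.0, 25.0, 40.0, 60.0, 85.0]     # % Pb bioaccessible, method A (assumed)
rbalp = [13.0, 30.0, 50.0, 72.0, 103.0]  # % Pb bioaccessible, method B (assumed)
slope = regression_slope(ubm, rbalp)
```

A slope near 1 indicates the two methods give comparable estimates, which is the criterion the study uses when arguing that RBALP, SBRC, and RIVM (0.06) are interchangeable.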
Schaeffel, Frank; Feldkaemper, Marita
Our current understanding of the development of refractive errors, in particular myopia, would be substantially limited had Wiesel and Raviola not discovered by accident that monkeys develop axial myopia as a result of deprivation of form vision. Similarly, if Josh Wallman and colleagues had not found that simple plastic goggles attached to the chicken eye generate large amounts of myopia, the chicken model would perhaps not have become such an important animal model. Contrary to previous assumptions about the mechanisms of myopia, these animal models suggested that eye growth is visually controlled locally by the retina, that an afferent connection to the brain is not essential and that emmetropisation uses more sophisticated cues than just the magnitude of retinal blur. While animal models have shown that the retina can determine the sign of defocus, the underlying mechanism is still not entirely clear. Animal models have also provided knowledge about the biochemical nature of the signal cascade converting the output of retinal image processing to changes in choroidal thickness and scleral growth; however, a critical question was, and still is, can the results from animal models be applied to myopia in children? While the basic findings from chickens appear applicable to monkeys, some fundamental questions remain. If eye growth is guided by visual feedback, why is myopic development not self-limiting? Why does undercorrection not arrest myopic progression even though positive lenses induce myopic defocus, which leads to the development of hyperopia in emmetropic animals? Why do some spectacle or contact lens designs reduce myopic progression and others not? It appears that some major differences exist between animals reared with imposed defocus and children treated with various optical corrections, although without the basic knowledge obtained from animal models, we would be lost in an abundance of untestable hypotheses concerning human myopia. © 2015 Optometry
Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)
Recent advances in data collection and data storage techniques enable marketing researchers to study the individual characteristics of a large range of transactions and purchases, in particular the effects of household-specific characteristics. This book presents the most important and
Declercq, A M; Chiers, K; Haesebrouck, F; Van den Broeck, W; Dewulf, J; Cornelissen, M; Decostere, A
Challenge models generating gill lesions typical for columnaris disease were developed for the fry of both Common Carp Cyprinus carpio and Rainbow Trout Oncorhynchus mykiss by means of an immersion challenge and Flavobacterium columnare field isolates were characterized regarding virulence. Carp inoculated with highly virulent isolates revealed diffuse, whitish discoloration of the gills affecting all arches, while in trout mostly unilateral focal lesions, which were restricted to the first two gill arches, occurred. Light microscopic examination of the gills of carp exposed to highly virulent isolates revealed a diffuse loss of branchial structures and desquamation and necrosis of gill epithelium with fusion of filaments and lamellae. In severe cases, large parts of the filaments were replaced with necrotic debris entangled with massive clusters of F. columnare bacterial cells enwrapped in an eosinophilic matrix. In trout, histopathologic lesions were similar but less extensive and much more focal, and well delineated from apparently healthy tissue. Scanning and transmission electron microscopic observations of the affected gills showed long, slender bacterial cells contained in an extracellular matrix and in close contact with the damaged gill tissue. This is the first study to reveal gill lesions typical for columnaris disease at macroscopic, light microscopic, and ultrastructural levels in both Common Carp and Rainbow Trout following a challenge with F. columnare. The results provide a basis for research opportunities to examine pathogen-gill interactions.
[Extraction-damaged fragment of a turbulence-modelling report. The recoverable content: wave-number space is introduced via a Fourier transform of the two-point correlation R, defining the three-dimensional spectrum E(k); a simple model is proposed for the transfer term T that has derivatives only (no integrals) in both t and r space, with coefficients chosen so that for Kolmogorov equilibrium, i.e. T = 0, the only solution is E = const · k^{-5/3}.]
Dinges, Laslo; Al-Hamadi, Ayoub; Elzobi, Moftah; El-Etriby, Sherif
Document analysis tasks such as pattern recognition, word spotting or segmentation require comprehensive databases for training and validation. Not only variations in writing style but also the list of words used is important when training samples should reflect the input of a specific area of application. However, generation of training samples is expensive in terms of manpower and time, particularly if complete text pages including complex ground truth are required. This is why there is a lack of such databases, especially for Arabic, the second most popular language. Moreover, Arabic handwriting recognition involves different preprocessing, segmentation and recognition methods. Each requires particular ground truth or samples to enable optimal training and validation, which are often not covered by the currently available databases. To overcome this issue, we propose a system that synthesizes Arabic handwritten words and text pages and generates corresponding detailed ground truth. We use these syntheses to validate a new, segmentation-based system that recognizes handwritten Arabic words. We found that a modification of the Active Shape Model based character classifiers that we proposed earlier improves the word recognition accuracy. Further improvements are achieved by using a vocabulary of the 50,000 most common Arabic words for error correction.
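The vocabulary-based error correction step mentioned above (matching a recognizer's output against a list of the most common words) is typically implemented as a nearest-neighbour search under edit distance. A minimal sketch, with a hypothetical toy vocabulary standing in for the 50,000-word list:

```python
# Illustrative vocabulary-based error correction: a misrecognized word is
# replaced by its nearest neighbour (edit distance) in a ranked word list.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming (two rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def correct(word: str, vocabulary: list) -> str:
    """Return the vocabulary entry closest to `word`; ties favour
    earlier (i.e. more frequent) entries in the ranked list."""
    return min(vocabulary, key=lambda v: edit_distance(word, v))

vocab = ["research", "recognition", "segmentation", "training", "validation"]
print(correct("recogntion", vocab))  # → recognition
```

For a 50,000-word vocabulary a linear scan like this is slow; production systems usually prune candidates with a trie or BK-tree, but the distance criterion is the same.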
... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft Report for Comment; Correction AGENCY: Nuclear Regulatory Commission. ACTION: Draft NUREG; request for...
... COMMISSION Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and Research, Draft...-xxxx, Revision 0, ``Common-Cause Failure Analysis in Event and Condition Assessment: Guidance and... at (301) 492-3446. FOR FURTHER INFORMATION CONTACT: Song-Hua Shen, Division of Risk Analysis, Office...
The purpose of this research was to examine the relations between different aspects of aggressiveness and personality traits. The Buss-Perry Aggression Questionnaire (AQ), the Eysenck Personality Questionnaire (EPQ), which represents the psychobiological model, and the Big Five Plus Two Inventory (BF+2), which represents the psycholexical model of personality in the Serbian language, were administered to 478 participants. The results revealed that affective impulsive aggressiveness and predatory or instrumental aggressiveness could be identified in the aggressiveness-personality traits relationships. Those aspects of aggressiveness could take manifest or latent character. As expected, Psychoticism from the EPQ and Aggressiveness and Negative Valence from the BF+2 showed a significant contribution to all identified forms, except for Aggressiveness in relations with "acting out" physical aggression. Although these personality traits carry significant loadings, these loadings were not always the highest. Affective-impulsive aggressiveness, which was mainly determined by the components of the latent domain of the AQ, was related to Neuroticism from both models. The remaining forms of manifest aggressiveness were related to low Conscientiousness, whereas Physical aggression was connected to Extraversion and Openness. This connection represents a possible "acting out" reaction or a more frequent tendency toward impulsive physical aggression. The results showed that aggressiveness represents a multidimensional construct which could be explained by a specific constellation of personality traits, depending on which aspects of aggressiveness are of interest. [Projekat Ministarstva nauke Republike Srbije, br. ON179006: Nasledni, sredinski i psihološki činioci mentalnog zdravlja]
Ginsburg, Geoffrey S.; Kuderer, Nicole M.
Despite stunning advances in our understanding of the genetics and the molecular basis for cancer, many patients with cancer are not yet receiving therapy tailored specifically to their tumor biology. The translation of these advances into clinical practice has been hindered, in part, by the lack of evidence for biomarkers supporting the personalized medicine approach. Most stakeholders agree that the translation of biomarkers into clinical care requires evidence of clinical utility. The highest level of evidence comes from randomized controlled clinical trials (RCTs). However, in many instances, there may be no RCTs that are feasible for assessing the clinical utility of potentially valuable genomic biomarkers. In the absence of RCTs, evidence generation will require well-designed cohort studies for comparative effectiveness research (CER) that link detailed clinical information to tumor biology and genomic data. CER also uses systematic reviews, evidence-quality appraisal, and health outcomes research to provide a methodologic framework for assessing biologic patient subgroups. Rapid learning health care (RLHC) is a model in which diverse data are made available, ideally in a robust and real-time fashion, potentially facilitating CER and personalized medicine. Nonetheless, to realize the full potential of personalized care using RLHC requires advances in CER and biostatistics methodology and the development of interoperable informatics systems, which has been recognized by the National Cancer Institute's program for CER and personalized medicine. The integration of CER methodology and genomics linked to RLHC should enhance, expedite, and expand the evidence generation required for fully realizing personalized cancer care. PMID:23071236
Power, Séamus A; Velez, Gabriel; Qadafi, Ahmad; Tennant, Joseph
We propose a SAGE model for social psychological research. Encapsulated in our acronym is a proposal to have a synthetic approach to social psychological research, in which qualitative methods are augmentative to quantitative ones, qualitative methods can be generative of new experimental hypotheses, and qualitative methods can capture experiences that evade experimental reductionism. We remind social psychological researchers that psychology was founded in multiple methods of investigation at multiple levels of analysis. We discuss historical examples and our own research as contemporary examples of how a SAGE model can operate in part or as an integrated whole. The implications of our model are discussed.
Neill, K M
Collaboration, when viewed as a nonhierarchic endeavor based on a sharing of power and authority, may not be an ideal model for interdisciplinary research. A cooperative venture in which participants willingly join in planning, making decisions about, and implementing a project supervised by a principal investigator is historically the successful structure for interdisciplinary research. A holistic model of mutually respectful cooperation in a hierarchic team is a more realistic goal for interdisciplinary research efforts.
Rodysill, J. R.
Proxy-based reconstructions provide vital information for developing histories of environmental and climate changes. Networks of spatiotemporal paleoclimate information are powerful tools for understanding dynamical processes within the global climate system and improving model-based predictions of the patterns and magnitudes of climate changes at local- to global-scales. Compiling individual paleoclimate records and integrating reconstructed climate information in the context of an ensemble of multi-proxy records, which are fundamental for developing a spatiotemporal climate data network, are hindered by challenges related to data and information accessibility, chronological uncertainty, sampling resolution, climate proxy type, and differences between depositional environments. The U.S. Geological Survey (USGS) North American Holocene Climate Synthesis Working Group has been compiling and integrating multi-proxy paleoclimate data as part of an ongoing effort to synthesize Holocene climate records from North America. The USGS North American Holocene Climate Synthesis Working Group recently completed a late Holocene hydroclimate synthesis for the North American continent using several proxy types from a range of depositional environments, including lakes, wetlands, coastal marine, and cave speleothems. Using new age-depth relationships derived from the Bacon software package, we identified century-scale patterns of wetness and dryness for the past 2000 years with an age uncertainty-based confidence rating for each proxy record. Additionally, for highly-resolved North American lake sediment records, we computed average late Holocene sediment deposition rates and identified temporal trends in age uncertainty that are common to multiple lakes. This presentation addresses strengths and challenges of compiling and integrating data from different paleoclimate archives, with a particular focus on lake sediments, which may inform and guide future paleolimnological studies.
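The average deposition rates mentioned above are, at their simplest, depth differences divided by age differences between dated control points of an age-depth model. A minimal sketch with hypothetical age-depth values for a single core:

```python
# Hypothetical age-depth control points for one lake core (depth in cm below
# the sediment surface, calibrated age in years BP); values are illustrative.
depths = [0.0, 25.0, 60.0, 110.0]       # cm
ages   = [-50.0, 400.0, 950.0, 1900.0]  # cal yr BP (core top post-dates 1950)

# Deposition rate between successive control points, in cm per year.
rates = []
for i in range(1, len(depths)):
    rates.append((depths[i] - depths[i - 1]) / (ages[i] - ages[i - 1]))

# Core-wide mean rate over the whole dated interval.
mean_rate = (depths[-1] - depths[0]) / (ages[-1] - ages[0])
print(rates, mean_rate)
```

Real syntheses propagate the chronological uncertainty from the age model (e.g. Bacon posterior age ensembles) through this calculation rather than using point ages, which is what supports the confidence ratings described above.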
Creswell, John W.; Bean, John P.
A test of the Biglan model of faculty subcultures using measures of research output and tests of the model controlling for the effects of faculty socialization are described. The Biglan model is found to be valid, and the distinctiveness of the Biglan groups appears to increase with the socialization of faculty into subject areas. (Author/MLW)
Sengupta, S.K.; Boyle, J.S.
Variables describing atmospheric circulation and other climate parameters derived from various GCMs and obtained from observations can be represented on a spatio-temporal grid (lattice) structure. The primary objective of this paper is to explore existing as well as some new statistical methods to analyze such data structures for the purpose of model diagnostics and intercomparison from a statistical perspective. Among the several statistical methods considered here, a new method based on common principal components appears most promising for the purpose of intercomparison of spatio-temporal data structures arising in the task of model/model and model/data intercomparison. A complete strategy for such an intercomparison is outlined. The strategy includes two steps. First, the commonality of spatial structures in two (or more) fields is captured in the common principal vectors. Second, the corresponding principal components obtained as time series are then compared on the basis of similarities in their temporal evolution
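The two-step strategy outlined above can be sketched numerically. The snippet below is an illustrative approximation only: it estimates a common principal vector from the pooled covariance of two synthetic fields (the full common-principal-components model uses joint diagonalization rather than pooling), then compares the temporal evolution of the resulting principal components:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical spatio-temporal fields (time x grid points), e.g. the
# same circulation variable from a model run and from observations.
t, p = 200, 10
signal = rng.standard_normal((t, 1)) @ rng.standard_normal((1, p))
field_a = signal + 0.3 * rng.standard_normal((t, p))
field_b = 0.8 * signal + 0.3 * rng.standard_normal((t, p))

# Step 1: common spatial structure. A simple approximation to common
# principal vectors: eigenvectors of the pooled covariance matrix.
pooled = 0.5 * (np.cov(field_a.T) + np.cov(field_b.T))
eigvals, eigvecs = np.linalg.eigh(pooled)
leading = eigvecs[:, -1]              # leading common vector

# Step 2: temporal evolution. Project each field onto the common vector
# and compare the resulting principal-component time series.
pc_a = field_a @ leading
pc_b = field_b @ leading
similarity = np.corrcoef(pc_a, pc_b)[0, 1]
print(f"temporal correlation of leading common PC: {similarity:.2f}")
```

Because both synthetic fields share one spatial pattern, the leading common vector recovers it and the projected time series correlate strongly; disagreement between model and data would show up as low correlation in this second step.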
Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann
...of maturity models. Specifically, it explores maturity models literature in IS and standard guidelines, if any, to develop maturity models, challenges identified and solutions proposed. Our systematic literature review of IS publications revealed over a hundred and fifty articles on maturity models. Extant literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan's Stage of Growth Model, Crosby's Grid, and Capability...
Jordan Cizelj, R.; Vrbanic, I.
As a rule, common cause failures have a high influence on the results of Probabilistic Safety Assessments. In the paper, an uncertainty analysis for the parameters of the Multiple Greek Letter common-cause model due to the stochastic nature of events is presented. Results of Bayesian analysis and maximum likelihood analysis are compared and interpreted. Special emphasis is given to the assessment of the Bayesian inclusion of generic knowledge, since it may bias the results conservatively. (author)
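For readers unfamiliar with the Multiple Greek Letter parameterisation analysed above, the sketch below shows the standard three-component form (the NUREG/CR-4780 convention); the parameter values are hypothetical, not the paper's estimates:

```python
from math import comb

def mgl_three(q_total: float, beta: float, gamma: float):
    """Return (Q1, Q2, Q3): probabilities of basic events involving
    exactly 1, 2 or 3 specific components in a group of m = 3.
    Q_t is the total failure probability of one component; beta and
    gamma apportion it among single, double and triple failures."""
    q1 = (1 - beta) * q_total
    q2 = beta * (1 - gamma) * q_total / comb(2, 1)  # divide by C(m-1, k-1)
    q3 = beta * gamma * q_total
    return q1, q2, q3

q1, q2, q3 = mgl_three(q_total=1e-3, beta=0.10, gamma=0.27)

# Consistency check: the Q_k recombine to the total single-component
# failure probability, Q_t = sum over k of C(m-1, k-1) * Q_k.
total = comb(2, 0) * q1 + comb(2, 1) * q2 + comb(2, 2) * q3
print(q1, q2, q3, total)
```

The uncertainty analyses compared in the paper (Bayesian versus maximum likelihood) concern the estimation of beta and gamma from sparse event data; the mapping above from parameters to basic-event probabilities is the deterministic part of the model.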
Minie, Mark; Bowers, Stuart; Tarczy-Hornoch, Peter; Roberts, Edward; James, Rose A; Rambo, Neil; Fuller, Sherrilynne
The University of Washington Health Sciences Libraries and Information Center BioCommons serves the bioinformatics needs of researchers at the university and in the vibrant for-profit and not-for-profit biomedical research sector in the Washington area and region. The BioCommons comprises services addressing internal University of Washington, not-for-profit, for-profit, and regional and global clientele. The BioCommons is maintained and administered by the BioResearcher Liaison Team. The BioCommons architecture provides a highly flexible structure for adapting to rapidly changing resources and needs. BioCommons uses Web-based pre- and post-course evaluations and periodic user surveys to assess service effectiveness. Recent surveys indicate substantial usage of BioCommons services and a high level of effectiveness and user satisfaction. BioCommons is developing novel collaborative Web resources to distribute bioinformatics tools and is experimenting with Web-based competency training in bioinformation resource use.
The results of the joint researches by utilizing the facilities of JAERI in 1992 fiscal year were summarized. The number of research themes in 1992 was 247 cases. In this book, 166 reports are collected. (J.P.N.)
The results of the joint researches by utilizing the facilities of JAERI in 1993 fiscal year were summarized. The number of research themes in 1993 was 228 cases. In this book, 243 reports are collected. (J.P.N.)
List, Johann-Mattis; Pathmanathan, Jananan Sylvestre; Lopez, Philippe; Bapteste, Eric
For a long time biologists and linguists have been noticing surprising similarities between the evolution of life forms and languages. Most of the proposed analogies have been rejected. Some, however, have persisted, and some even turned out to be fruitful, inspiring the transfer of methods and models between biology and linguistics up to today. Most proposed analogies were based on a comparison of the research objects rather than the processes that shaped their evolution. Focusing on process-based analogies, however, has the advantage of minimizing the risk of overstating similarities, while at the same time reflecting the common strategy to use processes to explain the evolution of complexity in both fields. We compared important evolutionary processes in biology and linguistics and identified processes specific to only one of the two disciplines as well as processes which seem to be analogous, potentially reflecting core evolutionary processes. These new process-based analogies support novel methodological transfer, expanding the application range of biological methods to the field of historical linguistics. We illustrate this by showing (i) how methods dealing with incomplete lineage sorting offer an introgression-free framework to analyze highly mosaic word distributions across languages; (ii) how sequence similarity networks can be used to identify composite and borrowed words across different languages; (iii) how research on partial homology can inspire new methods and models in both fields; and (iv) how constructive neutral evolution provides an original framework for analyzing convergent evolution in languages resulting from common descent (Sapir's drift). Apart from new analogies between evolutionary processes, we also identified processes which are specific to either biology or linguistics. This shows that general evolution cannot be studied from within one discipline alone. In order to get a full picture of evolution, biologists and linguists need to
Bouter, Lex M.
Universities are, to a large extent, publicly funded. It is reasonable to expect that society should benefit as a result. This means that scientific research should at least have a potential societal impact. Universities and individual researchers should therefore give serious thought to the societal relevance of their research activities and…
Rohweder, Catherine L; Laping, Jane L; Diehl, Sandra J; Moore, Alexis A; Isler, Malika Roman; Scott, Jennifer Elissa; Enga, Zoe Kaori; Black, Molly C; Dave, Gaurav; Corbie-Smith, Giselle; Melvin, Cathy L
Innovative models to facilitate more rapid uptake of research findings into practice are urgently needed. Community members who engage in research can accelerate this process by acting as adoption agents. We implemented an Evidence Academy conference model bringing together researchers, health care professionals, advocates, and policy makers across North Carolina to discuss high-impact, life-saving study results. The overall goal is to develop dissemination and implementation strategies for translating evidence into practice and policy. Each 1-day, single-theme, regional meeting focuses on a leading community-identified health priority. The model capitalizes on the power of diverse local networks to encourage broad, common awareness of new research findings. Furthermore, it emphasizes critical reflection and active group discussion on how to incorporate new evidence within and across organizations, health care systems, and communities. During the concluding session, participants are asked to articulate action plans relevant to their individual interests, work setting, or area of expertise.
Igor Novitski et al.
A high-field dipole magnet based on the common coil design was developed at Fermilab for a future Very Large Hadron Collider. A short model of this magnet with a design field of 11 T in two 40-mm apertures is being fabricated using the react-and-wind technique. In order to study and optimize the magnet design two 165-mm long mechanical models were assembled and tested. A technological model consisting of magnet straight section and ends was also fabricated in order to check the tooling and the winding and assembly procedures. This paper describes the design and technology of the common coil dipole magnet and summarizes the status of short model fabrication. The results of the mechanical model tests and comparison with FE mechanical analysis are also presented.
Kvam, Paul H.; Martz, Harry F.
We consider redundant systems of identical components for which reliability is assessed statistically using only demand-based failures and successes. Direct assessment of system reliability can lead to gross errors in estimation if there exist external events in the working environment that cause two or more components in the system to fail in the same demand period which have not been included in the reliability model. We develop a simple Bayesian model for estimating component reliability and the corresponding probability of common cause failure in operating systems for which the data is confounded; that is, the common cause failures cannot be distinguished from multiple independent component failures in the narrative event descriptions
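The confounding described above can be made concrete for a two-component system: an observed double failure may be one common cause event or two coincident independent failures, and the likelihood must reflect that mixture. The grid-posterior sketch below is illustrative only (hypothetical counts, uniform priors), not the paper's model:

```python
import numpy as np

# Two-component redundant system: each demand period reports only how many
# components failed. p = independent per-component failure probability,
# q = probability of a common cause event failing both. Counts hypothetical.
n0, n1, n2 = 940, 50, 10   # periods with 0, 1, 2 observed failures

def log_likelihood(p, q):
    p0 = (1 - q) * (1 - p) ** 2        # no failures
    p1 = (1 - q) * 2 * p * (1 - p)     # exactly one fails
    p2 = q + (1 - q) * p ** 2          # both fail: CCF OR two independent
    return n0 * np.log(p0) + n1 * np.log(p1) + n2 * np.log(p2)

# Grid posterior under independent uniform priors on p and q.
ps = np.linspace(1e-4, 0.2, 400)
qs = np.linspace(1e-5, 0.05, 400)
P, Q = np.meshgrid(ps, qs, indexing="ij")
post = np.exp(log_likelihood(P, Q) - log_likelihood(P, Q).max())
post /= post.sum()

p_mean = (post.sum(axis=1) * ps).sum()   # posterior mean of p
q_mean = (post.sum(axis=0) * qs).sum()   # posterior mean of q
print(f"E[p] ~ {p_mean:.4f}, E[q] ~ {q_mean:.4f}")
```

Even though no individual double failure can be attributed to common cause or coincidence, the three observed frequencies jointly identify both parameters, which is the essence of estimating from confounded data.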
Sparre, Mogens; Rasmussen, Ole Horn; Fast, Alf Michael
Abstract: Participatory Action Research (PAR) has a longer academic history compared with the idea of business models (BMs). This paper indicates how industries gain by using the combined methodology. The research question "Can participatory action research create value for Business Model Innovation (BMI)?" has been investigated from five different perspectives based upon The Business Model Cube and The Where to Look Model. Using both established and newly developed tools, the paper presents how. Theory and data from two cases are presented, and it is demonstrated how industry increases its monetary and/or non-monetary value creation doing BMI based upon PAR. The process is essential, and using the methodology of PAR creates meaning. Behind the process, the PAR methodology and its link to BM and BMI may contribute to theory construction and creation of a common language in academia around...
To support its competitive advantages in current market conditions, each company needs to choose better ways of guaranteeing its favorable competitive position. In this regard, considerable interest lies in the structuring and algorithmization of marketing research processes that provide the information background for such a choice. The article is devoted to modeling the process of marketing research of the competitive environment.
Olsen, Michelle D. Hunt
A study was conducted to propose a research-based model for a longitudinal data research system that addressed recommendations from a synthesis of literature related to: (1) needs reported by the U.S. Department of Education, (2) the twelve mandatory elements that define federally approved state longitudinal data systems (SLDS), (3) the…
Kap, Yolanda S.; Jagessar, S. Anwar; Dunham, Jordon; 't Hart, Bert A.
New drugs often fail in the translation from the rodent experimental autoimmune encephalomyelitis (EAE) model to human multiple sclerosis (MS). Here, we present the marmoset EAE model as an indispensable model for translational research into MS. The genetic heterogeneity of this species and lifelong
Saeki, Elina; Pendergast, Laura; Segool, Natasha K.; von der Embse, Nathaniel P.
Despite the recent rollout of the Common Core State Standards (CCSS), CCSS-aligned assessments, and test-based teacher evaluation systems, questions remain regarding the impact that these accountability policies will have on teachers and students. This article discusses the psychosocial and instructional consequences of test-based accountability…
A.C.J.W. Janssens (Cécile); P. Tikka-Kleemola (Päivi)
The translation of emerging genomic knowledge into public health and clinical care is one of the major challenges for the coming decades. At the moment, genome-based prediction of common diseases, such as type 2 diabetes, coronary heart disease and cancer, is still not informative. Our
Reumann-Moore, Rebecca; Duffy, Mark
Initiated for the 2013-14 school year, the Common Assignment Study (CAS) is a three-year effort being led by the Colorado Education Initiative (CEI) and The Fund for Transforming Education in Kentucky (The Fund) with support from the Bill & Melinda Gates Foundation. Conceptually, CAS builds on previous efforts to improve instruction through…
Genomics was introduced with big promises and expectations of its future contribution to our society. Medical genomics was introduced as that which would lay the foundation for a revolution in our management of common diseases. Genomics would lead the way towards a future of personalised medicine.
In this article, we provide some useful perspectives and experiences in mentoring students in undergraduate research (UR) in mathematical modeling using differential equations. To engage students in this topic, we present a systematic approach to the creation of rich problems from real-world phenomena; present mathematical models that are derived…
Policicchio, Benjamin B; Pandrea, Ivona; Apetrei, Cristian
The HIV-1/AIDS pandemic continues to spread unabated worldwide, and no vaccine exists within our grasp. Effective antiretroviral therapy (ART) has been developed, but ART cannot clear the virus from the infected patient. A cure for HIV-1 is badly needed to stop both the spread of the virus in human populations and disease progression in infected individuals. A safe and effective cure strategy for human immunodeficiency virus (HIV) infection will require multiple tools, and appropriate animal models are tools that are central to cure research. An ideal animal model should recapitulate the essential aspects of HIV pathogenesis and associated immune responses, while permitting invasive studies, thus allowing a thorough evaluation of strategies aimed at reducing the size of the reservoir (functional cure) or eliminating the reservoir altogether (sterilizing cure). Since there is no perfect animal model for cure research, multiple models have been tailored and tested to address specific quintessential questions of virus persistence and eradication. The development of new non-human primate and mouse models, along with a certain interest in the feline model, has the potential to fuel cure research. In this review, we highlight the major animal models currently utilized for cure research and the contributions of each model to this goal.
Alizadeh, Siamak; Sriramula, Srinivas
Redundant safety systems are commonly used in the process industry to respond to hazardous events. In redundant systems composed of identical units, Common Cause Failures (CCFs) can significantly influence system performance with regard to reliability and safety. However, their impact has been overlooked due to the inherent complexity of modelling common cause induced failures. This article develops a reliability model for a redundant safety system using a Markov analysis approach. The proposed model incorporates process demands in conjunction with CCF for the first time and evaluates their impact on the reliability quantification of safety systems without automatic diagnostics. The reliability of the Markov model is quantified by considering the Probability of Failure on Demand (PFD) as a measure for low-demand systems. The safety performance of the model is analysed using the Hazardous Event Frequency (HEF) to evaluate the frequency of entering a hazardous state that will lead to an accident if the situation is not controlled. The utilisation of the Markov model is demonstrated for a simple case study of a pressure protection system, and it is shown that the proposed approach gives a sufficiently accurate result for all demand rates, durations, component failure rates and corresponding repair rates for the low-demand mode of operation. The Markov model proposed in this paper assumes the absence of automatic diagnostics, along with a multiple-stage repair strategy for CCFs and restoration of the system from the hazardous state to the "as good as new" state. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
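A stripped-down version of such a Markov model can be sketched for a 1oo2 system with a beta-factor CCF and single-stage repair. This omits the paper's process-demand states and multi-stage repair strategy, and the rates are hypothetical:

```python
import numpy as np

# States of a 1oo2 redundant safety system: 0 = both channels up,
# 1 = one channel down, 2 = both down (system unavailable on demand).
lam, beta, mu = 1e-4, 0.05, 1e-2   # failure rate /h, CCF fraction, repair /h

# Continuous-time Markov generator (rows sum to zero). Independent failures
# occur at rate (1-beta)*lam per channel; a CCF at rate beta*lam fails both.
Q = np.array([
    [-(2 * (1 - beta) * lam + beta * lam), 2 * (1 - beta) * lam, beta * lam],
    [mu,                                   -(mu + lam),          lam],
    [0.0,                                  mu,                   -mu],
])

# Steady-state distribution: pi @ Q = 0 with sum(pi) = 1, solved by
# least squares on the stacked system.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

pfd = pi[2]   # 1oo2 fails on demand only when both channels are down
print(f"steady-state PFD ~ {pfd:.2e}")
```

With these rates the both-down probability is dominated by the CCF path (roughly beta*lam/mu), which is why neglecting CCF in redundant systems badly understates the PFD.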
Harwood, T Mark; Beutler, Larry E; Castillo, Salvador; Karno, Mitch
The generic model of psychotherapy (Orlinsky & Howard, 1987) eschews the view that inputs, processes or outputs associated with treatment exert linear and independent effects on outcomes. Variables within these three clusters must be viewed both within the context of time and through their interactions with other variables within a class. This study illustrates the use of this model by identifying common factors (comprising both traditional relationship factors and shared therapy ingredients) and specific factors in cognitive-behavioural (CB) and family systems (FS) treatments for alcoholic couples, and by tracking their contributions over two treatment phases: the acute phase and the follow-up phase. While four process variables (therapy type, intensity of treatment, common elements and FS-specific procedures) contributed to outcomes during the active treatment phase, these variables became more interactive during follow-up. Indeed, high levels of the specific interventions of both treatments were negatively associated with benefit if common factors were also frequently used during the acute phase. The best effects were obtained when common and specific interventions were counterbalanced, one being used frequently and the other infrequently. Implications for future alcohol treatment and recommendations for research on common and specific factors are discussed.
Reza, Ashif; Misra, Anuraag; Das, Parnika
This paper presents an improved model for the prediction of the bandwidth enhancement factor (BWEF) of an inductively tuned common source amplifier. In this model, we include the effect of the drain-source channel resistance of the field effect transistor, along with the load inductance and output capacitance, on the BWEF of the amplifier. A frequency domain analysis of the model is performed and a closed-form expression is derived for the BWEF of the amplifier. A prototype common source amplifier is designed and tested. The BWEF of the amplifier is obtained from the measured frequency response as a function of drain current and load inductance. We clearly demonstrate that including the drain-source channel resistance in the proposed model helps to estimate the BWEF to within 5% of the measured results.
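The paper's closed-form expression is not reproduced in the abstract, but the basic bandwidth enhancement mechanism of an inductively peaked load can be sketched numerically. The textbook shunt-peaked RC load below, which ignores the drain-source channel resistance that the paper adds, and all component values are assumptions for illustration:

```python
import numpy as np

# Illustrative load values (not from the paper): 1 kOhm and 1 pF give a plain
# RC bandwidth of about 159 MHz.
R, C = 1e3, 1e-12
L = R**2 * C / np.sqrt(2)     # classic max-bandwidth choice: L/(R^2*C) = 1/sqrt(2)

f = np.logspace(6, 10, 20000)     # 1 MHz .. 10 GHz sweep
s = 2j * np.pi * f

def f3db(Z):
    """First frequency where |Z| drops below its low-frequency value / sqrt(2)."""
    mag = np.abs(Z)
    return f[np.argmax(mag < mag[0] / np.sqrt(2))]

Z_rc = 1.0 / (1.0 / R + s * C)              # plain RC load
Z_pk = 1.0 / (1.0 / (R + s * L) + s * C)    # shunt-peaked load: (R + sL) || 1/(sC)

bwef = f3db(Z_pk) / f3db(Z_rc)              # bandwidth enhancement factor
print(f"BWEF ~ {bwef:.2f}")
```

With L chosen so that L/(R^2*C) = 1/sqrt(2), classic shunt-peaking theory predicts a bandwidth extension of about 1.85x over the plain RC load; the refined model in the paper adjusts this figure for the transistor's finite output resistance.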
The aim of design science research (DSR) in information systems is the user-centred creation of IT-artifacts with regard to specific social environments. For culture research in the field, which is necessary for a proper localization of IT-artifacts, models and research approaches from the social sciences are usually adopted. Descriptive, dimension-based culture models are most commonly applied for this purpose; they assume culture to be a national phenomenon and tend to reduce it to basic values. Such models are useful for investigations in behavioural culture research because it aims to isolate, describe and explain culture-specific attitudes and characteristics within a selected society. In contrast, given the necessity to deduce concrete decisions for artifact design, research results from DSR need to go beyond this aim. As a hypothesis, this contribution questions the general applicability of such generic culture dimensions' models for DSR and focuses on their theoretical foundation, which goes back to Hofstede's conceptual Onion Model of Culture. The literature-based analysis applied here confirms the hypothesis. Consequently, an alternative conceptual culture model is introduced and discussed as a theoretical foundation for culture research in DSR.
Common Variability Language (CVL) is a recent proposal for OMG's upcoming Variability Modeling standard. CVL models variability in terms of model fragments. Usability is a widely recognized quality criterion essential to guarantee the successful use of tools that put these ideas into practice. Facing the need to evaluate the usability of CVL modeling tools, this paper presents a usability evaluation of CVL applied to a modeling tool for firmware code of induction hobs. This evaluation addresses the configuration, scoping and visualization facets. The evaluation involved the end users of the tool, who are engineers of our induction hob industrial partner. Effectiveness and efficiency results indicate that model configuration in terms of model fragment substitutions is intuitive enough, but both scoping and visualization require improved tool support. The results also enabled us to identify a list of usability problems which may help alleviate scoping and visualization issues in CVL.
J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)
The paper considers the problem of whether financial returns have a common volatility process in the framework of the stochastic volatility models suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993), …
Merrill, Ryan; Sintov, Nicole
As atmospheric CO2 continues to rise above 450 ppm, policymakers struggle with uncertainty concerning predictors of citizen support for environmental energy policies (EEPs) and preferences for their design, topics which have received limited attention in the empirical literature. We present an original model of policy support based on citizens' affinity-to-commons: pathways by which individuals enjoy natural public goods that in turn shape preferences between alternative policy mechanisms. We evaluate this model using a survey of southern California electricity customers, with results indicating the model's utility in predicting public support of EEP. Stronger community ties are associated with preferences for "pull"-type subsidies, whereas stronger connections to natural commons are linked to support for both "pull" and "push"-type sanctions. Findings have implications for coalition building, as advocates may engender support for green energy policy by framing sanctions as protecting natural commons, and framing subsidies either in this same way and/or as producing benefits for communities. - Highlights: • A commons-oriented model of citizen support for environmental energy policy is proposed (Thaler, 2012). • A factor analysis identifies local tax shifts, green subsidies, and energy taxes (Schultz et al., 1995). • Community connections predict support for policies employing subsidies (Sabatier, 2006). • Connection to nature predicts support for policies using both sanctions and subsidies (Stern et al., 1999).
Timmers, J.M.H.; Verbeek, A.L.M.; Hout, J. in't; Pijnappel, R.M.; Broeders, M.J.M.; Heeten, G.J. den
OBJECTIVES: To develop a prediction model for breast cancer based on common mammographic findings on screening mammograms, aiming to reduce reader variability in assigning BI-RADS. METHODS: We retrospectively reviewed 352 positive screening mammograms of women participating in the Dutch screening programme…
Hunt, Mitchell; Sayyah, Rana; Mitchell, Cody; Laws, Crystal; MacLeod, Todd C.; Ho, Fat D.
Mathematical models of the common-source and common-gate amplifiers using metal-ferroelectric- semiconductor field effect transistors (MOSFETs) are developed in this paper. The models are compared against data collected with MOSFETs of varying channel lengths and widths, and circuit parameters such as biasing conditions are varied as well. Considerations are made for the capacitance formed by the ferroelectric layer present between the gate and substrate of the transistors. Comparisons between the modeled and measured data are presented in depth as well as differences and advantages as compared to the performance of each circuit using a MOSFET.
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty was considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, what is known about the system, and what causes the uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on…
Fu, Liang-Yu; Wang, Guang-Zhong; Ma, Bin-Guang; Zhang, Hong-Yu
Highlights: • There exists a universal G:C → A:T mutation bias in three domains of life. • This universal mutation bias has not been sufficiently explained. • A DNA mutation model proposed by Loewdin 40 years ago offers a common explanation. -- Abstract: Recently, numerous genome analyses revealed the existence of a universal G:C → A:T mutation bias in bacteria, fungi, plants and animals. To explore the molecular basis of this mutation bias, we examined three well-known DNA mutation models, i.e., the oxidative damage model, the UV-radiation damage model and the CpG hypermutation model. It was revealed that these models cannot sufficiently explain the universal mutation bias. Therefore, we resorted to a DNA mutation model proposed by Loewdin 40 years ago, which is based on inter-base double proton transfers (DPT). Since DPT is a fundamental and spontaneous chemical process and occurs much more frequently within GC pairs than AT pairs, the Loewdin model offers a common explanation for the observed universal mutation bias and thus has broad biological implications.
Lesieutre, Bernard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bravo, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yinger, Robert [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chassin, Dave [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Huang, Henry [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lu, Ning [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hiskens, Ian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Venkataramanan, Giri [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
The research presented in this report primarily focuses on improving power system load models to better represent their impact on system behavior. The previous standard load model fails to capture the delayed voltage recovery events that are observed in the Southwest and elsewhere. These events are attributed to air conditioner units stalling after a fault. To gain a better understanding of their role in these events and to guide modeling efforts, typical air conditioner units were tested in laboratories. Using data obtained from these extensive tests, new load models were developed to match air conditioner behavior. An air conditioner model is incorporated in the new WECC composite load model. These models are used in dynamic studies of the West and can impact power transfer limits for California. Unit-level and system-level measures are proposed as potential solutions to the delayed voltage recovery problem.
Lu, Tielin; Fan, Zitian; Wang, Chunxi; Liu, Xiaojing; Wang, Shuo; Zhao, Hua
A method for describing measurement equipment data is proposed based on property resource analysis. The application of a common data dictionary (CDD) to devices and equipment is mainly used in the digital factory to advance data management, not only within one enterprise but also across different enterprises sharing the same data environment. In this paper, we give a brief overview of the data flow in the whole manufacturing enterprise and the automatic triggering of the data exchange process. Furthermore, the application of the data dictionary is available for measurement and control equipment and can also be used in other industries in smart manufacturing.
Daunton, Nancy G.
Practical information on candidate animal models for motion sickness research and on methods used to elicit and detect motion sickness in these models is provided. Four good potential models for use in motion sickness experiments include the dog, cat, squirrel monkey, and rat. It is concluded that the appropriate use of the animal models, combined with exploitation of state-of-the-art biomedical techniques, should generate a great step forward in the understanding of motion sickness mechanisms and in the development of efficient and effective approaches to its prevention and treatment in humans.
The results of the joint research utilizing the facilities of JAERI in the 1991 fiscal year were summarized, and this report was completed. Many researchers from all over Japan took part in many themes, and very significant results were obtained. This joint research has now reached a major turning point. The reconstructed JRR-3M has been offered for joint utilization since April 1991, and its use for neutron diffraction and scattering increased considerably. As for the ion irradiation facility in the Takasaki Research Establishment, partial operation will start next year, and joint utilization is expected to begin. Accompanying the diversification of facility utilization, a thorough revision of the system seems necessary in order to properly meet the needs of users. The number of research themes in 1991 was 222. JRR-3M accomplished the planned joint utilization operation of 8 cycles, but JRR-2 suffered a trouble during the 5th cycle, and subsequent operation was cancelled. In this book, 159 reports are collected. (K.I.)
We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (the algebra underlying Hilbert (quantum) space). We give an equivalent proof for classical logic, which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, that enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit) computer and a nondigital (say, a six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.
Kubiak, Christine; de Andres-Trelles, Fernando; Kuchinke, Wolfgang
BACKGROUND: Thorough knowledge of the regulatory requirements is a challenging prerequisite for conducting multinational clinical studies in Europe, given their complexity and the heterogeneity in regulation and perception across the EU member states. METHODS: In order to summarise the current situation in relation to the wide spectrum of clinical research, the European Clinical Research Infrastructures Network (ECRIN) developed a multinational survey in ten European countries. However, a lack of a common classification framework for the major categories of clinical research was identified, and therefore reaching an agreement on a common classification was the initial step in the development of the survey. RESULTS: The ECRIN transnational working group on regulation, composed of experts in the field of clinical research from ten European countries, defined seven major categories of clinical research that seem relevant…
Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.
Brookhart, Susan M.; Guskey, Thomas R.; Bowers, Alex J.; McMillan, James H.; Smith, Jeffrey K.; Smith, Lisa F.; Stevens, Michael T.; Welsh, Megan E.
Grading refers to the symbols assigned to individual pieces of student work or to composite measures of student performance on report cards. This review of over 100 years of research on grading considers five types of studies: (a) early studies of the reliability of grades, (b) quantitative studies of the composition of K-12 report card grades,…
Harla, Donna K.
Parental involvement in schools is an important potential contributor to improving American education and making the U.S. more globally competitive. This qualitative and quantitative mixed-methodology action research study probed the viability of engaging parents around issues of educational improvement by inviting them to participate in training…
Argues that poetry therapy is similar to the other creative arts therapies in its use of creative processes and products, and in its intrinsic positiveness, gentle indirectness, and breadth of appeal and application. Suggests that collaborative research efforts among creative arts therapists can lead to new understandings of the processes and…
About the Cover: The Thailand Initiative in Genomics and Expression Research for Liver Cancer (TIGER-LC) Consortium (depicted as a tiger) emerges from foliage, representing molecular, clinical, and epidemiological studies from teams in the United States, Thailand, and Japan, to generate a multilayered genomic and genetic liver cancer data ecosystem (represented by the tiger’s tail).
Rönnebeck, Silke; Bernholt, Sascha; Ropohl, Mathias
Despite the importance of scientific inquiry in science education, researchers and educators disagree considerably regarding what features define this instructional approach. While a large body of literature addresses theoretical considerations, numerous empirical studies investigate scientific inquiry on quite different levels of detail and also…
Hauser, Sandra; Jung, Friedrich; Pietzsch, Jens
Endothelial cell (EC) models have evolved as important tools in biomaterial research due to ubiquitously occurring interactions between implanted materials and the endothelium. However, screening the available literature has revealed a gap between material scientists and physiologists in terms of their understanding of these biomaterial-endothelium interactions and their relative importance. Consequently, EC models are often applied in nonphysiological experimental setups, or too extensive conclusions are drawn from their results. The question arises whether this might be one reason why, among the many potential biomaterials, only a few have found their way into the clinic. In this review, we provide an overview of established EC models and possible selection criteria to enable researchers to determine the most reliable and relevant EC model to use. Copyright © 2016 Elsevier Ltd. All rights reserved.
Network Common Data Form (netCDF) software package for Windows distributed by the Unidata Program Center at the University Corporation for Atmospheric Research (UCAR) … standard ASCII format, while the provided WRF model results are often in netCDF format. Therefore, the 3DWF model and its GUI needed to be modified so … Garvey D, Chang S, Cogan J. Application of a multigrid method to a mass consistent diagnostic wind model. J. Appl. Meteorology. 2005;44:1078–1089.
Heydari, Abbas; Khorashadizadeh, Fatemeh
This review shows how researchers use Pender's health promotion model. We included all articles in which Pender's health promotion model was used as the theoretical framework. Eligible articles were selected by reviewing abstracts. The search was conducted in electronic databases from 1990 to 2012. Based on our search, 74 articles with various methodologies were relevant for review. The aims of these studies were to predict factors affecting or hindering health promotion behaviours, to detect the effects of intervention programmes for improving health promotion behaviours, to test the model, to identify quality of life and health promotion behaviour, to predict the stage of change and related factors that affect health promotion behaviour, to prevent events that interfere with health promotion behaviour, to develop other models similar to this model, to compare this model with other models, and to determine the relationship of variables associated with health promotion behaviours.
Common carp (Cyprinus carpio) is an important food resource in European and Asian countries. Nowadays, dried common carp is appreciated by transportation agencies and the food industry because of its low transportation cost. Changes in acid value (AV), total bacterial count (TBC), and peroxide value (PV) were reported in this study. We found that the changes of AV, TBC and PV of dried common carp fitted a first-order reaction model, and that the reaction energies of the changes of AV, TBC, and PV during storage were 4.56 kJ/mol, 2.21 kJ/mol, and 2.33 kJ/mol, respectively. This study provides theoretical knowledge to food factories dealing with dry fish storage and transportation.
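The first-order/Arrhenius behaviour described in this abstract can be sketched numerically. Only the 2.33 kJ/mol reaction energy for PV comes from the abstract; the pre-exponential factor, temperatures, and initial value below are hypothetical:

```python
import numpy as np

# First-order quality change with Arrhenius temperature dependence.
R_GAS = 8.314      # gas constant, J/(mol K)
Ea = 2.33e3        # reaction energy for peroxide value changes (J/mol)
A_PRE = 1.0e-2     # hypothetical pre-exponential factor (1/day)

def rate_constant(T_kelvin):
    """Arrhenius rate constant: k = A * exp(-Ea / (R * T))."""
    return A_PRE * np.exp(-Ea / (R_GAS * T_kelvin))

def first_order(v0, k, t_days):
    """First-order growth of a deterioration index: v(t) = v0 * exp(k * t)."""
    return v0 * np.exp(k * t_days)

k25 = rate_constant(298.15)                       # storage at 25 degrees C
pv = first_order(1.0, k25, np.arange(0, 91, 30))  # 0, 30, 60, 90 days
print(pv)
```

Such a fit lets a storage planner predict, for a given temperature, how long a quality index stays below an acceptability threshold.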
Jebena, Mulusew G; Lindstrom, David; Belachew, Tefera; Hadley, Craig; Lachat, Carl; Verstraeten, Roos; De Cock, Nathalie; Kolsteren, Patrick
Although the consequences of food insecurity for the physical health and nutritional status of youth have been reported, its effect on their mental health remains less investigated in developing countries. The aim of this study was to examine the pathways through which food insecurity is associated with poor mental health status among youth living in Ethiopia. We used data from the Jimma Longitudinal Family Survey of Youth (JLFSY) collected in 2009/10. A total of 1,521 youth were included in the analysis. We measured food insecurity using a 5-item scale and common mental disorders using the 20-item Self-Reporting Questionnaire (SRQ-20). Structural and generalized equation modeling using the maximum likelihood estimation method was used to analyze the data. The prevalence of common mental disorders was 30.8% (95% CI: 28.6, 33.2). Food insecurity was independently associated with common mental disorders (β = 0.323). The effect of food insecurity on common mental disorders was direct, and only 8.2% of their relationship was mediated by physical health. In addition, poor self-rated health (β = 0.285) was associated with common mental disorders. Food insecurity is directly associated with common mental disorders among youth in Ethiopia. Interventions that aim to improve the mental health status of youth should consider strategies to improve access to sufficient, safe and nutritious food.
Gaucher disease (GD), the most common lysosomal storage disorder (LSD), is caused by the defective activity of the lysosomal hydrolase glucocerebrosidase, which is encoded by the GBA gene. Generation of animal models that faithfully recapitulate the three clinical subtypes of GD has proved to be more of a challenge than first anticipated. The first mouse to be produced died within hours after birth owing to skin permeability problems, and mice with point mutations in Gba did not display symptoms correlating with human disease and also died soon after birth. Recently, conditional knockout mice that mimic some features of the human disease have become available. Here, we review the contribution of all currently available animal models to examining pathological pathways underlying GD and to testing the efficacy of new treatment modalities, and propose a number of criteria for the generation of more appropriate animal models of GD.
Andrey Borisovich Nikolaev
This article considers the estimation accuracy of average integral characteristics of a random process in the course of simulation modeling. For analytical treatment of the initial stage of modeling, a conditionally nonstationary Gaussian process is analyzed as a stationary Gaussian process with a boundary prehistory. A model of the approximating autocorrelation function is recommended. Analytical expressions for the variance and mathematical expectation of the average integral estimate are obtained. A statistical estimation efficiency criterion, the probability of belonging to a correct parameter interval, is introduced. The dependence of the estimate's closeness on the statistics-clearing interval under transient behavior is investigated for various types of processes.
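The quantity being estimated above, the time (integral) average of a correlated Gaussian process, can be illustrated with a small Monte Carlo experiment. The AR(1) process and all parameters below are assumptions for illustration, not the article's model:

```python
import numpy as np

# Variance of the time-average of a stationary Gaussian (AR(1)) process,
# compared with the asymptotic analytical expression.
rng = np.random.default_rng(42)
phi, sigma = 0.8, 1.0          # AR(1): x_t = phi * x_{t-1} + sigma * eps_t
n, reps = 2000, 500            # length of each run, number of replications

var_x = sigma**2 / (1 - phi**2)          # stationary variance of x_t
means = np.empty(reps)
for r in range(reps):
    x = np.empty(n)
    x[0] = rng.normal(0.0, np.sqrt(var_x))   # start in the stationary law
    eps = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    means[r] = x.mean()                      # average integral characteristic

emp_var = means.var()
# Asymptotic variance of the sample mean of an AR(1) process:
theory = var_x * (1 + phi) / ((1 - phi) * n)
print(emp_var, theory)
```

The correlation inflates the variance of the time average by the factor (1 + phi)/(1 - phi) relative to an uncorrelated sample, which is why accuracy estimates for simulation runs must account for the autocorrelation structure.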
Kaufman, Bruce E.; And Others
Kaufman attempts to identify essential characteristics that distinguish behavioral from nonbehavioral research in industrial relations. He argues that they are distinguished by the psychological model of man that is contained in the theoretical framework used to deduce or test hypotheses. Comments from Lewin, Mincer, and Cummings with Kaufman's…
Andersen, Malene Friis; Nielsen, Karina M; Brinkmann, Svend
The purpose of this study was to investigate which opportunities and obstacles employees with common mental disorders (CMD) experience in relation to return to work (RTW) and how they perceive the process of returning to work. In addition, the study explores what characterizes an optimal RTW intervention and points to possible ways to improve future interventions for employees with CMD. A systematic literature search was conducted, and eight qualitative studies of medium or high quality published between 1995 and 2011 were included in this systematic review. The eight studies were synthesized using the meta-ethnographic method. This meta-synthesis found that employees with CMD identify a number of obstacles to and facilitators of returning to work related to their own personality, social support at the workplace, and the social and rehabilitation systems. The employees found it difficult to decide when they were ready to resume work and experienced difficulties implementing RTW solutions at the workplace. This study reveals that the RTW process should be seen as a continuous and coherent one where experiences of the past and present and anticipation of the future are dynamically interrelated and affect the success or failure of RTW. The meta-synthesis also illuminates insufficient coordination between the social and rehabilitation systems and suggests how an optimal RTW intervention could be designed.
Kunugi, Tomoaki; Satake, Shin-ichi
Scientific interest in the environmental and meteorological fields has recently focused on the estimation of the temperature rise on the earth in the near and distant future. This problem is strongly related to the imbalance of the amount of carbon on the earth after the industrial revolution, termed the 'Missing Sink' problem. The temperature rise is estimated by the gas transfer flux = (gas transfer rate) x (partial pressure difference of CO2 between air and sea surface). It is very difficult to measure and estimate the gas exchange coefficient resulting from the air-sea interaction because of the very high Schmidt number (Sc) of the turbulent fluid flow with free-surface deformation. On the other hand, the utilization of a high Prandtl number (Pr) fluid flow with a free surface as a coolant in an advanced magnetic fusion reactor, and as a chamber protection scheme in an inertial confinement fusion reactor, has been considered. Because the diffusivities of high Pr or Sc fluids are very small, when high temperature or concentration regions appear on the free surface, caused by plasma radiation or carbon-dioxide gas absorption, respectively, the scalar transport from the free surface to the bulk flow is very slow compared to the fluid motion. In this paper, some common aspects between the heat transfer of high Pr free-surface flow in fusion engineering and the mass transfer of high Sc free-surface flow in the global warming problem are discussed. (author)
Shuai, Zhisheng; van den Driessche, P
A mathematical model is formulated for the transmission and spread of cholera in a heterogeneous host population that consists of several patches of homogeneous host populations sharing a common water source. The basic reproduction number ℛ0 is derived and shown to determine whether or not cholera dies out. Explicit formulas are derived for target/type reproduction numbers that measure the control strategies required to eradicate cholera from all patches.
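As a hedged sketch of the standard machinery behind such a result (this is the generic next-generation construction, not the paper's specific patch system), the basic reproduction number is obtained as a spectral radius:

```latex
% Generic next-generation construction (illustrative only).
% F collects new-infection terms and V collects transition/removal terms,
% both linearized at the disease-free equilibrium over all patches:
\mathcal{R}_0 = \rho\!\left(F V^{-1}\right)
% Cholera dies out when \mathcal{R}_0 < 1 and persists when \mathcal{R}_0 > 1.
```

Target/type reproduction numbers refine this by applying the same construction to a selected subset of patches or transmission routes, which is what yields patch-specific control thresholds.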
Wang, Qiuling; He, Qijin; Zhou, Guangsheng
In the context of climate warming, varying soil moisture caused by changing precipitation patterns will affect the applicability of stomatal conductance models, thereby affecting the simulation accuracy of carbon-nitrogen-water cycles in ecosystems. We studied the applicability of four common stomatal conductance models, the Jarvis, Ball-Woodrow-Berry (BWB), Ball-Berry-Leuning (BBL) and unified stomatal optimization (USO) models, based on summer maize leaf gas exchange data from a manipulation experiment with consecutively decreasing soil moisture. The results showed that the USO model performed best, followed by the BBL and BWB models, while the Jarvis model performed worst under varying soil moisture conditions. The effects of soil moisture made a difference in the relative performance among the models. Introducing a water response function improved the performance of the Jarvis, BWB, and USO models, decreasing the normalized root mean square error (NRMSE) by 15.7%, 16.6% and 3.9%, respectively; however, it worsened the performance of the BBL model, increasing the NRMSE by 5.3%. The Jarvis, BWB, BBL and USO models were applicable within different ranges of soil relative water content (55%-65%, 56%-67%, 37%-79% and 37%-95%, respectively) based on the 95% confidence limits. Moreover, introducing a water response function improved the applicability of the Jarvis and BWB models. The USO model performed best with or without the water response function and was applicable under varying soil moisture conditions. Our results provide a basis for selecting appropriate stomatal conductance models under drought conditions.
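Of the four models compared, the Ball-Woodrow-Berry model has the simplest closed form, gs = g0 + g1·A·hs/cs. A minimal sketch of it and of the NRMSE metric used above (the coefficient values are illustrative defaults, not the fitted values from the study):

```python
import numpy as np

def bwb_conductance(A, hs, cs, g0=0.01, g1=9.0):
    """Ball-Woodrow-Berry stomatal conductance (mol m-2 s-1).
    A: net assimilation rate, hs: relative humidity at the leaf surface (0-1),
    cs: CO2 concentration at the leaf surface. g0 (residual conductance) and
    g1 (slope) are fitted empirically; the defaults here are only illustrative."""
    return g0 + g1 * A * hs / cs

def nrmse(observed, simulated):
    """Normalized root mean square error, as a fraction of the observed mean."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    return np.sqrt(np.mean((simulated - observed) ** 2)) / observed.mean()
```

A water response function, as introduced in the study, would typically multiply the slope term by a factor between 0 and 1 derived from soil relative water content.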
Russell, Richard A.; Waiss, Richard D.
A study was conducted to identify the common support equipment and Space Station interface requirements for the IOC (initial operating capabilities) model technology experiments. In particular, each principal investigator for the proposed model technology experiment was contacted and visited for technical understanding and support for the generation of the detailed technical backup data required for completion of this study. Based on the data generated, a strong case can be made for a dedicated technology experiment command and control work station consisting of a command keyboard, cathode ray tube, data processing and storage, and an alert/annunciator panel located in the pressurized laboratory.
Bühler, Kora-Mareen; Giné, Elena; Echeverry-Alzate, Victor; Calleja-Conde, Javier; de Fonseca, Fernando Rodriguez; López-Moreno, Jose Antonio
Drug-related phenotypes are common, complex and highly heritable traits. In the last few years, candidate gene association studies (CGAS) and genome-wide association studies (GWAS) have identified a huge number of single nucleotide polymorphisms (SNPs) associated with drug use, abuse or dependence, mainly related to alcohol or nicotine. Nevertheless, few of these associations have been replicated in independent studies. The aim of this study was to provide a review of the SNPs that have been most significantly associated with alcohol-, nicotine-, cannabis- and cocaine-related phenotypes in humans between the years 2000 and 2012. To this end, we selected CGAS, GWAS, family-based association and case-only studies published in peer-reviewed international scientific journals (using the PubMed/MEDLINE and Addiction GWAS Resource databases) in which a significant association was reported. A total of 371 studies fit the search criteria. We then filtered SNPs with at least one replication study and performed meta-analysis of the significance of the associations. SNPs in the alcohol-metabolizing genes, in the cholinergic gene cluster CHRNA5-CHRNA3-CHRNB4, and in the DRD2 and ANKK1 genes are, to date, the most replicated and significant gene variants associated with alcohol- and nicotine-related phenotypes. For cannabis and cocaine, far fewer studies and replications have been reported, indicating either a need for further investigation or that the genetics of cannabis/cocaine addiction are more elusive. This review provides a global, state-of-the-art view of the behavioral genetics of addiction and contributes to the formulation of new hypotheses to guide future work.
Ernst, Anja F; Albers, Casper J
Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. They lead to using linear regression when it is inappropriate, and to employing alternative procedures with less statistical power when doing so is unnecessary. Our systematic literature review investigated the employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper calls for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
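The misconception flagged above, testing the variables instead of the errors, is easy to demonstrate with simulated data. A minimal sketch using only NumPy and a sample-skewness statistic (the data are simulated for illustration, not drawn from the review):

```python
import numpy as np

def skewness(a):
    """Sample skewness: near 0 for roughly normal data, large for skewed data."""
    a = np.asarray(a)
    return np.mean((a - a.mean()) ** 3) / a.std() ** 3

rng = np.random.default_rng(0)
x = rng.exponential(size=500)                  # a strongly skewed predictor is fine
y = 2.0 + 3.0 * x + rng.normal(0.0, 1.0, 500)  # the *errors* are what must be normal

slope, intercept = np.polyfit(x, y, 1)
residuals = y - (intercept + slope * x)

# The normality assumption concerns the residuals; checking x (or y) directly
# is exactly the misconception the review documents.
print(skewness(x))          # large: x is skewed, which is irrelevant
print(skewness(residuals))  # near zero: the errors look normal, so the check passes
```

Here the model is perfectly valid even though the predictor is far from normal, which is why testing the distribution of the variables themselves can wrongly steer researchers away from linear regression.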
James R. Roede
Due to its short lifespan, ease of use and age-related pathologies that mirror those observed in humans, the common marmoset (Callithrix jacchus) is poised to become a standard nonhuman primate model of aging. Blood and extracellular fluid possess two major thiol-dependent redox nodes involving cysteine (Cys), cystine (CySS), glutathione (GSH) and glutathione disulfide (GSSG). Alteration in these plasma redox nodes significantly affects cellular physiology, and oxidation of the plasma Cys/CySS redox potential (EhCySS) is associated with aging and disease risk in humans. The purpose of this study was to determine age-related changes in plasma redox metabolites and corresponding redox potentials (Eh) to further validate the marmoset as a nonhuman primate model of aging. We measured plasma thiol redox states in marmosets and used existing human data with multivariate adaptive regression splines (MARS) to model the relationships between age and redox metabolites. A classification accuracy of 70.2% and an AUC of 0.703 were achieved using the MARS model built from the marmoset redox data to classify the human samples as young or old. These results show that common marmosets provide a useful model for thiol redox biology of aging.
Pennel, Cara L; Burdine, James N; Prochaska, John D; McLeroy, Kenneth R
Community health assessment and community health improvement planning are continuous, systematic processes for assessing and addressing health needs in a community. Since there are different models to guide assessment and planning, as well as a variety of organizations and agencies that carry out these activities, there may be confusion in choosing among approaches. By examining the various components of the different assessment and planning models, we are able to identify areas for coordination, ways to maximize collaboration, and strategies to further improve community health. We identified 11 common assessment and planning components across 18 models and requirements, with a particular focus on health department, health system, and hospital models and requirements. These common components included preplanning; developing partnerships; developing vision and scope; collecting, analyzing, and interpreting data; identifying community assets; identifying priorities; developing and implementing an intervention plan; developing and implementing an evaluation plan; communicating and receiving feedback on the assessment findings and/or the plan; planning for sustainability; and celebrating success. Within several of these components, we discuss characteristics that are critical to improving community health. Practice implications include better understanding of different models and requirements by health departments, hospitals, and others involved in assessment and planning to improve cross-sector collaboration, collective impact, and community health. In addition, federal and state policy and accreditation requirements may be revised or implemented to better facilitate assessment and planning collaboration between health departments, hospitals, and others for the purpose of improving community health.
Ames, D. P.; Peterson, M.; Larsen, J.
A steady flow of manuscripts describing integrated water resources management (IWRM) modelling has been published in Environmental Modelling & Software since the journal's inaugural issue in 1997. These papers represent two decades of peer-reviewed scientific knowledge regarding methods, practices, and protocols for conducting IWRM. We have undertaken to explore this specific assemblage of literature with the intention of identifying commonly reported procedures in terms of data integration methods, modelling techniques, approaches to stakeholder participation, means of communication of model results, and other elements of the model development and application life cycle. Initial results from this effort will be presented including a summary of commonly used practices, and their evolution over the past two decades. We anticipate that results will show a pattern of movement toward greater use of both stakeholder/participatory modelling methods as well as increased use of automated methods for data integration and model preparation. Interestingly, such results could be interpreted to show that the availability of better, faster, and more integrated software tools and technologies free the modeler to take a less technocratic and more human approach to water resources modelling.
The partners and allies of business practice in the processes of modernization of the economy are scientific-research institutions, such as scientific institutes, research-development units and universities. A market is forming on which companies can look for the solutions they need and scientific-research institutions can look for inspiration, partners and capital. The market provides conditions for the operation, development and implementation of developed solutions. Given the complexity of market, technical, legal, financial, or intellectual-property-protection issues, research-development units are more and more frequently unable to function efficiently without a clear and unequivocal definition of the goals, methods and conditions of their activity. Their market offer has to take into consideration not just scientific-research or methodological aspects. Given the continuously growing demands of clients and the pressure of competition, scientific-research institutions also have to pay attention to market, information, personnel, or financial aspects typical of strictly commercial ventures. A comprehensive preparation and implementation of scientific-research activities under market conditions may be supported by tools successfully used in trade and the economy, such as business models. These issues are the basis of the deliberations contained in this work.
Nakamura, Motoki; Haarmann-Stemmann, Thomas; Krutmann, Jean; Morita, Akimichi
Increasing ethical concerns regarding animal experimentation have led to the development of various alternative methods based on the 3Rs (Refinement, Reduction, and Replacement), first described by Russell and Burch in 1959. Cosmetic and skin aging research is particularly susceptible to concerns related to animal testing. In addition to animal welfare reasons, there are scientific and economic reasons to reduce and avoid animal experiments. Importantly, animal experiments may not reflect findings in humans, mainly because of the differences in architecture and immune responses between animal skin and human skin. Here we review the shift from animal testing to the development and application of alternative non-animal-based methods and the necessity and benefits of this shift. Some specific alternatives to animal models are discussed, including biochemical approaches, two-dimensional and three-dimensional cell cultures, and volunteer studies, as well as future directions, including genome-based research and the development of in silico computer simulations of skin models. Among the in vitro methods, three-dimensional reconstructed skin models are highly popular and useful alternatives to animal models; however, they still have many limitations. With careful selection and skillful handling, these alternative methods will become indispensable for modern dermatology and skin aging research. This article is protected by copyright. All rights reserved.
Sunil K. Panchal
Rodents are widely used to mimic human diseases to improve understanding of the causes and progression of disease symptoms and to test potential therapeutic interventions. Chronic diseases such as obesity, diabetes and hypertension, together known as the metabolic syndrome, are causing increasing morbidity and mortality. To control these diseases, research in rodent models that closely mimic the changes in humans is essential. This review will examine the adequacy of the many rodent models of metabolic syndrome to mimic the causes and progression of the disease in humans. The primary criterion will be whether a rodent model initiates all of the signs, especially obesity, diabetes, hypertension and dysfunction of the heart, blood vessels, liver and kidney, primarily by diet since these are the diet-induced signs in humans with metabolic syndrome. We conclude that the model that comes closest to fulfilling this criterion is the high carbohydrate, high fat-fed male rodent.
Pan, X.; Chin, M.; Gautam, R.; Bian, H.; Kim, D.; Colarco, P. R.; Diehl, T. L.; Takemura, T.; Pozzoli, L.; Tsigaridis, K.; Bauer, S.; Bellouin, N.
Atmospheric pollution over South Asia attracts special attention due to its effects on regional climate, the water cycle and human health. These effects are potentially growing owing to rising trends in anthropogenic aerosol emissions. In this study, the spatio-temporal aerosol distributions over South Asia from seven global aerosol models are evaluated against aerosol retrievals from NASA satellite sensors and ground-based measurements for the period 2000-2007. Overall, substantial underestimations of aerosol loading over South Asia are found systematically in most model simulations. Averaged over the entire South Asia region, the annual mean aerosol optical depth (AOD) is underestimated by a range of 15 to 44% across models compared to MISR (Multi-angle Imaging SpectroRadiometer), which is the lowest bound among various satellite AOD retrievals (from MISR, SeaWiFS (Sea-Viewing Wide Field-of-View Sensor), MODIS (Moderate Resolution Imaging Spectroradiometer) Aqua and Terra). In particular during the post-monsoon and wintertime periods (i.e., October-January), when agricultural waste burning and anthropogenic emissions dominate, models fail to capture AOD and aerosol absorption optical depth (AAOD) over the Indo-Gangetic Plain (IGP) compared to ground-based Aerosol Robotic Network (AERONET) sunphotometer measurements. The underestimations of aerosol loading in models generally occur in the lower troposphere (below 2 km) based on comparisons of aerosol extinction profiles calculated by the models with those from Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data. Furthermore, surface concentrations of all aerosol components (sulfate, nitrate, organic aerosol (OA) and black carbon (BC)) from the models are found to be much lower than in situ measurements in winter. Several possible causes for these common problems of underestimating aerosols in models during the post-monsoon and wintertime periods are identified: the aerosol hygroscopic growth and formation of
Zhang, Li; Gezan, Salvador A; Eduardo Vallejos, C; Jones, James W; Boote, Kenneth J; Clavijo-Michelangeli, Jose A; Bhakta, Mehul; Osorno, Juan M; Rao, Idupulapati; Beebe, Stephen; Roman-Paoli, Elvin; Gonzalez, Abiezer; Beaver, James; Ricaurte, Jaumer; Colbert, Raphael; Correll, Melanie J
This work reports the effects of the genetic makeup, the environment and the genotype by environment interactions for node addition rate in an RIL population of common bean. This information was used to build a predictive model for node addition rate. To select a plant genotype that will thrive in targeted environments it is critical to understand the genotype by environment interaction (GEI). In this study, multi-environment QTL analysis was used to characterize node addition rate (NAR, nodes day⁻¹) on the main stem of the common bean (Phaseolus vulgaris L.). This analysis was carried out with field data of 171 recombinant inbred lines that were grown at five sites (Florida, Puerto Rico, 2 sites in Colombia, and North Dakota). Four QTLs (Nar1, Nar2, Nar3 and Nar4) were identified, one of which had significant QTL by environment interactions (QEI), that is, Nar2 with temperature. Temperature was identified as the main environmental factor affecting NAR while day length and solar radiation played a minor role. Integration of sites as covariates into a QTL mixed site-effect model, and further replacing the site component with explanatory environmental covariates (i.e., temperature, day length and solar radiation) yielded a model that explained 73% of the phenotypic variation for NAR with a root mean square error of 16.25% of the mean. The QTL consistency and stability were examined through a tenfold cross validation with different sets of genotypes, and these four QTLs were always detected with 50-90% probability. The final model was evaluated using the leave-one-site-out method to assess the influence of site on node addition rate. These analyses provided a quantitative measure of the effects on NAR of common beans exerted by the genetic makeup, the environment and their interactions.
Lee, K. Y.
This study compares the modeling and experimental results on the equilibrium phase partitioning behavior of three common alcohols (ethanol, isopropanol, and methanol) in a two-phase system consisting of water and a BTEX compound. A previously developed computer program is used to generate ternary phase diagrams for each alcohol-water-NAPL mixture combination, where the required activity coefficients are estimated using the UNIFAC model. A set of laboratory experiments is conducted to determine the maximum single-phase water content for every alcohol-water-NAPL mixture combination considered in this study, where the initial volume composition is 85 percent alcohol and 15 percent NAPL. Comparison of experimental results against UNIFAC-derived modeling results shows good agreement for mixtures containing ethanol and methanol, but relatively poor agreement for mixtures containing isopropanol.
Two issues are addressed. The first is the development of an event-based parametric model called the ξ-CCF model. Its parameters are expressed as fractions of the progressive multiplicities of failure events. Through these expressions, the contribution of each multiple failure can be presented more clearly, which can help in selecting defense tactics against common cause failures. The second is a method, based on operational experience and engineering judgement, for estimating the effectiveness of defense tactics. It is expressed in terms of a reduction matrix for given tactics on a specific plant, in event-by-event form. Application to a practical example shows that the model, in cooperation with the method, can simply estimate the effectiveness of defense tactics. It can be easily used by operators and its application may be extended
Wang, Shuliang; Chi, Hehua; Yuan, Hanning; Geng, Jing
Human facial expressions are a key ingredient in conveying an individual's innate emotions in communication. However, variation in facial expressions affects the reliable identification of human emotions. In this paper, we present a cloud model to extract facial features for representing human emotion. First, the uncertainties in facial expression are analyzed in the context of the cloud model. The feature extraction and representation algorithm is established using cloud generators. With a forward cloud generator, facial expression images can be re-generated as many times as desired to visually represent the three extracted features, each of which plays a different role. The effectiveness of the computing model is tested on the Japanese Female Facial Expression database. Three common features are extracted from seven facial expression images. Finally, the paper concludes with remarks.
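The forward normal cloud generator mentioned above follows a standard two-stage sampling recipe: draw a per-drop entropy, then draw the drop position and score its certainty degree. A minimal sketch (the parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def forward_cloud(ex, en, he, n, seed=0):
    """Forward normal cloud generator.
    ex: expectation, en: entropy, he: hyper-entropy, n: number of cloud drops.
    Returns (x, mu): drop positions and their certainty degrees in (0, 1]."""
    rng = np.random.default_rng(seed)
    en_prime = rng.normal(en, he, n)          # stage 1: per-drop entropy sample
    x = rng.normal(ex, np.abs(en_prime))      # stage 2: drop position
    mu = np.exp(-(x - ex) ** 2 / (2.0 * en_prime ** 2))  # certainty degree
    return x, mu

# Illustrative use: 1000 drops around expectation 0 with entropy 3, hyper-entropy 0.1.
x, mu = forward_cloud(0.0, 3.0, 0.1, 1000)
```

The hyper-entropy `he` controls how much the drops' dispersion itself varies, which is how the cloud model encodes the uncertainty-of-uncertainty discussed in the abstract.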
Visser, S.M.; Flanagan, D.C.
Since the late 1980s, the Agricultural Research Service (ARS) of the United States Department of Agriculture (USDA) has been developing process-based erosion models to predict water erosion and wind erosion. During much of that time, the development efforts of the Water Erosion Prediction Project
Møller Andersen, Frits; Alberg Østergaard, Poul
This editorial introduces the seventh volume of the International Journal of Sustainable Energy Planning and Management. The volume presents part of the outcome of the project Energy Systems Modelling Research and Analysis (ENSYMORA), funded by the Danish Innovation Fund. The project, carried out by 11 university and industry partners, has improved the basis for decision-making within energy planning and energy scenario making by providing new and improved tools and methods for energy systems analyses.
Claudia Ioana CIOBANU
Compliance with the construct validity criteria is necessary for the correct assessment of research quality and for the further development of marketing models. The identification of formative and reflective constructs, as well as the correct testing of their validity and reliability, are important methodological steps for marketing research, as described in this article. The first part defines the reflective and the formative constructs and highlights their particularities by analysing the theoretical criteria that differentiate them. The second part of the study presents aspects of validity and reliability for the formative and reflective constructs, along with some empirical considerations from the research literature regarding their measurement.
Dozier, A.; Arabi, M.; David, O.
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common
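The publish-subscribe coupling described above can be illustrated with a minimal in-process sketch. The class, topic, and variable names below are invented for illustration; the actual system described spans languages, machines, and operating systems:

```python
from collections import defaultdict

class ModelBus:
    """Minimal publish-subscribe hub: models publish variable updates by topic,
    and other models subscribe without any compile-time coupling, avoiding the
    'inseparable super-model' the abstract warns against."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to run whenever a value is published on topic."""
        self._subs[topic].append(callback)

    def publish(self, topic, value):
        """Push a value to every subscriber of topic."""
        for callback in self._subs[topic]:
            callback(value)

# Hypothetical example: a hydrologic model exposes streamflow mid-simulation,
# and an economic model reacts to it, giving a two-way feedback hook.
bus = ModelBus()
received = []
bus.subscribe("streamflow", received.append)
bus.publish("streamflow", 42.0)
```

In the real setting each callback would marshal the value across a language or process boundary (e.g., to a FORTRAN legacy model), but the topic-based decoupling is the same.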
In recent years, the incidence of nonalcoholic fatty liver disease (NAFLD) has increased gradually along with the rising prevalence of obesity, type 2 diabetes, and hyperlipidemia, and NAFLD has become one of the most common chronic liver diseases in the world and the second major liver disease after chronic viral hepatitis in China. However, its pathogenesis has not yet been clarified. Animal models play an important role in research on NAFLD because the development and progression of NAFLD require a long period of time and because ethical limitations exist in conducting drug trials in patients or collecting liver tissues from patients. The animal models with histopathology similar to that of NAFLD patients are reviewed, and their modeling principles, as well as their advantages and disadvantages, are compared. Animal models provide a powerful tool for further studies of NAFLD pathogenesis and drug screening for the prevention and treatment of NAFLD.
This research investigates the overall heating energy consumption using various control strategies, secondary heat emitters, and primary plant for a building. Previous research has successfully demonstrated that a dynamic distributed heat emitter model embedded within a simplified third-order lumped-parameter building model is capable of achieving improved results when compared to other commercially available modelling tools. With the enhanced ability to capture transient effects of emitter thermal capacity, this research studies the influence of control strategies and primary plant configurations on the rate of energy consumption of a heating system. Four alternative control strategies are investigated: zone feedback; weather-compensated; a combination of both of these methods; and thermostatic control. The alternative plant configurations consist of conventional boilers, biomass boilers, and heat pumps supporting radiator heating and underfloor heating. The performance of the model is tested on a primary school building and can be applied to any residential or commercial building with a heating system. Results show that the new methods reported offer greater detail and rigor in the conduct of building energy modelling.
Nowik, N; Podlasz, P; Jakimiuk, A; Kasica, N; Sienkiewicz, W; Kaleczyc, J
The zebrafish (Danio rerio) has become known as an excellent model organism for studies of vertebrate biology, vertebrate genetics, embryonic development, diseases and drug screening. Nevertheless, there is still a lack of detailed reports on the use of the zebrafish as a model in veterinary medicine. Compared with other vertebrates, zebrafish can lay hundreds of eggs at weekly intervals, and externally fertilized zebrafish embryos are accessible to observation and manipulation at all stages of their development, which makes it possible to simplify research techniques such as fate mapping, fluorescent tracer time-lapse lineage analysis and single-cell transplantation. Although zebrafish are only 2.5 cm long, they are easy to maintain. Intraperitoneal and intracerebroventricular injections, blood sampling and measurement of food intake can be carried out in adult zebrafish. Danio rerio is a useful animal model for neurobiology, developmental biology, drug research, virology, microbiology and genetics. Many diseases for which the zebrafish is a perfect model organism affect aquatic animals. For some of them, such as those caused by Mycobacterium marinum or Pseudoloma neurophilia, Danio rerio is a natural host, but the zebrafish is also susceptible to most fish diseases, including Ich, spring viraemia of carp and infectious spleen and kidney necrosis. The zebrafish is commonly used in research on bacterial virulence. The zebrafish embryo allows rapid, non-invasive and real-time analysis of bacterial infections in a vertebrate host. Plenty of common pathogens can be examined using the zebrafish model: Streptococcus iniae, Vibrio anguillarum or Listeria monocytogenes. Steps are being taken to use the zebrafish also in fungal research, especially research dealing with Candida albicans and Cryptococcus neoformans. Although the zebrafish is commonly used as an animal model to study diseases caused by external agents, it is also useful in studies of metabolic
Moazeni, Najmeh; Vadood, Morteza; Semnani, Dariush; Hasani, Hossein
The common bile duct is one of the body's most sensitive organs, and a polyurethane nanofiber tube can be used as a prosthesis for the common bile duct. Compliance is one of the most important properties of such a prosthesis, which should remain adequately compliant for as long as possible to preserve its behavioral integrity. In the present paper, the prosthesis compliance was measured and modeled using a regression method and an artificial neural network (ANN) based on electrospinning process parameters such as polymer concentration, voltage, tip-to-collector distance and flow rate. Because the ANN model contains several parameters that directly affect prediction accuracy, a genetic algorithm (GA) was used to optimize the ANN parameters. Finally, it was observed that the ANN model optimized by the GA can predict the compliance with high accuracy (mean absolute percentage error = 8.57%). Moreover, the contribution of the variables to the compliance was investigated through relative importance analysis, and the optimum values of the parameters for ideal compliance were determined.
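The accuracy metric quoted above, mean absolute percentage error, is straightforward to compute. A minimal sketch (the sample values are made up for illustration, not the paper's compliance data):

```python
def mape(observed, predicted):
    """Mean absolute percentage error, in percent, of predictions vs. observations."""
    return 100.0 * sum(abs((o - p) / o)
                       for o, p in zip(observed, predicted)) / len(observed)

# Hypothetical compliance measurements vs. model predictions:
print(mape([100.0, 200.0], [90.0, 220.0]))  # 10.0
```

A MAPE of 8.57%, as reported for the GA-optimized ANN, means predictions deviate from measurements by under 9% on average in relative terms.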
Devinck, Frédéric; Knoblauch, Kenneth
Establishing the relation between perception and discrimination is a fundamental objective in psychophysics, with the goal of characterizing the neural mechanisms mediating perception. Here, we show that a procedure for estimating a perceptual scale based on a signal detection model also predicts discrimination performance. We use a recently developed procedure, Maximum Likelihood Difference Scaling (MLDS), to measure the perceptual strength of a long-range, color, filling-in phenomenon, the Watercolor Effect (WCE), as a function of the luminance ratio between the two components of its generating contour. MLDS is based on an equal-variance, gaussian, signal detection model and yields a perceptual scale with interval properties. The strength of the fill-in percept increased 10-15 times the estimate of the internal noise level for a 3-fold increase in the luminance ratio. Each observer's estimated scale predicted discrimination performance in a subsequent paired-comparison task. A common signal detection model accounts for both the appearance and discrimination data. Since signal detection theory provides a common metric for relating discrimination performance and neural response, the results have implications for comparing perceptual and neural response functions.
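The equal-variance Gaussian model underlying the scale above links scale differences to discrimination probability in a simple way. A minimal sketch of that link only (not the full MLDS fitting procedure; function names are invented for illustration):

```python
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_discriminate(psi_a, psi_b, sigma=1.0):
    """Equal-variance Gaussian signal detection: probability of judging
    stimulus b as stronger than a, given perceptual scale values psi
    expressed in internal-noise units. The difference of two independent
    N(psi, sigma^2) responses has variance 2*sigma^2, hence the sqrt(2)."""
    return phi((psi_b - psi_a) / (math.sqrt(2.0) * sigma))

# Equal scale values give chance performance; a scale difference of several
# noise units, as with the WCE luminance-ratio manipulation, gives near-perfect
# discrimination.
print(p_discriminate(0.0, 0.0))   # 0.5
```

This is what lets an MLDS-estimated scale, calibrated in units of internal noise, predict subsequent paired-comparison performance.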
Ledermann, Thomas; Macho, Siegfried
For the study of growth in dyads, methods have been developed to analyze growth at the level of the dyad members. In this article, we present a novel approach that we call the Common Fate Growth Model (CFGM). This model permits an analysis of growth at the level of the dyads when members are either distinguishable (e.g., heterosexual couples) or indistinguishable (e.g., lesbian couples). To estimate the model, we describe the use of structural equation modeling (SEM) for both distinguishable and indistinguishable members. For indistinguishable members and small groups, such as families, we provide details for the use of multilevel SEM (MSEM). For both SEM and MSEM, we address the issue of measurement invariance (MI) and the estimation of group-level means. The models are illustrated with data from couples collected at seven measurement occasions. To aid the estimation of the models, Mplus code and Amos setups are provided. PsycINFO Database Record (c) 2014 APA, all rights reserved.
El-Banna, Adel I.; Naeem, Marwa A.
This research work aimed at making use of Machine Translation to help students avoid some syntactic, semantic and pragmatic common errors in translation from English into Arabic. Participants were a hundred and five freshmen who studied the "Translation Common Errors Remedial Program" prepared by the researchers. A testing kit that…
Background: Thorough knowledge of the regulatory requirements is a challenging prerequisite for conducting multinational clinical studies in Europe, given their complexity and the heterogeneity in regulation and perception across the EU member states. Methods: In order to summarise the current situation across the wide spectrum of clinical research, the European Clinical Research Infrastructures Network (ECRIN) developed a multinational survey in ten European countries. However, the lack of a common classification framework for major categories of clinical research was identified, and reaching agreement on a common classification was therefore the initial step in the development of the survey. Results: The ECRIN transnational working group on regulation, composed of experts in the field of clinical research from ten European countries, defined seven major categories of clinical research that seem relevant from both the regulatory and the scientific points of view, and correspond to congruent definitions in all countries: clinical trials on medicinal products; clinical trials on medical devices; other therapeutic trials (including surgery trials, transplantation trials, transfusion trials, trials with cell therapy, etc.); diagnostic studies; clinical research on nutrition; other interventional clinical research (including trials in complementary and alternative medicine, trials with collection of blood or tissue samples, physiology studies, etc.); and epidemiology studies. This classification was essential to develop a survey focused on protocol submission to ethics committees and competent authorities, procedures for amendments, requirements for sponsor and insurance, and adverse event reporting, following five main phases: drafting, consensus, data collection, validation, and finalising. Conclusion: The list of clinical research categories as used for the survey could serve as a contribution to the much-needed task of harmonisation and simplification of the
Mahmood, Shakeel; Hort, Krishna; Ahmed, Shakil; Salam, Mohammed; Cravioto, Alejandro
There is increasing interest in building the capacity of researchers in low and middle income countries (LMIC) to address their national priority health and health policy problems. However, the number and variety of partnerships and funding arrangements can create management problems for LMIC research institutes. This paper aims to identify problems faced by a health research institute in Bangladesh, describe two strategies developed to address these problems, and identify the results after three years of implementation. This paper uses a mixture of quantitative and qualitative data collected during independent annual reviews of the International Centre for Diarrhoeal Disease Research, Bangladesh (ICDDR,B) between 2006 and 2010. Quantitative data includes the number of research activities according to strategic priority areas, revenues collected and expenditure. Qualitative data includes interviews of researchers and management of ICDDR,B, and of research users and key donors. Data in a Monitoring and Evaluation Framework (MEF) were assessed against agreed indicators. The key problems faced by ICDDR,B in 2006 were insufficient core funds to build research capacity and supporting infrastructure, and an inability to direct research funds towards the identified research priorities in its strategic plan. Two strategies were developed to address these problems: a group of donors agreed to provide unearmarked pooled core funding, and accept a single common report based on an agreed MEF. On review after three years, there had been significant increases in total revenue, and the ability to allocate greater amounts of money on capacity building and infrastructure. The MEF demonstrated progress against strategic objectives, and better alignment of research against strategic priorities. There had also been changes in the sense of ownership and collaboration between ICDDR,B's management and its core donors. The changes made to funding relationships supported and monitored by
Simple regulatory mechanisms based on the idea of the saturable 'common stomach' can control the regulation of construction behavior and colony-level responses to environmental perturbations in Metapolybia wasp societies. We mapped the different task groups to mutual inductance electrical circuits and used Kirchhoff's basic voltage laws to build a model that uses master equations from physics, yet is able to provide strong predictions for this complex biological phenomenon. As in real colonies, and independently of the initial conditions, the system quickly settles into an equilibrium that provides optimal task allocation for steady construction, depending on the influx of accessible water. The system is very flexible: in the case of perturbations, it reallocates its workforce and adapts to the new situation with different equilibrium levels. Consistent with field studies, decreasing any task group caused a decrease in construction; increasing or decreasing water inflow stimulated or reduced the work of other task groups while triggering compensatory behavior in water foragers. We also showed that only well-connected circuits are able to produce adequate construction, which agrees with the finding that this type of task partitioning exists only in larger colonies. Studying the buffer properties of the common stomach and its effect on the foragers revealed that it provides stronger negative feedback to the water foragers, while the connection between the pulp foragers and the common stomach has a strong fixed-point attractor, as evidenced by the dissipative trajectory.
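A heavily simplified, two-task sketch of the saturable common-stomach idea (not the paper's mutual-inductance circuit model): water foragers fill a shared crop whose fullness throttles further intake, while pulp foragers drain it, so the system settles to an equilibrium set by the two rates. All rate constants here are invented for illustration.

```python
import numpy as np

def simulate(w_forage=1.0, p_forage=0.8, steps=5000, dt=0.01):
    """Euler-integrate a toy common-stomach model.
    S is the shared crop fullness in [0, 1]; water intake saturates
    as S fills, and pulp foraging consumes the stored water."""
    S = 0.1
    history = []
    for _ in range(steps):
        inflow = w_forage * (1.0 - S)   # intake slows as the crop fills
        outflow = p_forage * S          # pulp foragers draw water out
        S += dt * (inflow - outflow)
        history.append(S)
    return np.array(history)

traj = simulate()
# Analytic equilibrium of this toy model: w / (w + p).
print(f"equilibrium fullness ~ {traj[-1]:.3f}")
```

Cutting the water-foraging rate lowers the equilibrium fullness, a crude analogue of the compensatory shifts the abstract describes after perturbing water inflow.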
De Groef, Lies; Dekeyster, Eline; Geeraerts, Emiel; Lefevere, Evy; Stalmans, Ingeborg; Salinas-Navarro, Manuel; Moons, Lieve
Mouse disease models have proven indispensable in glaucoma research, yet the complexity of the vast number of models and mouse strains has also led to confusing findings. In this study, we evaluated baseline intraocular pressure, retinal histology, and retinofugal projections in three mouse strains commonly used in glaucoma research, i.e. C57Bl/6, C57Bl/6-Tyr(c), and CD-1 mice. We found that the mouse strains under study not only display moderate variations in intraocular pressure, retinal architecture, and retinal ganglion cell density; their retinofugal projections to the dorsal lateral geniculate nucleus and the superior colliculus also revealed striking differences, potentially underlying diverging optokinetic tracking responses and visual acuity. Next, we reviewed the success rate of three models of (glaucomatous) optic neuropathies (intravitreal N-methyl-d-aspartic acid injection, optic nerve crush, and laser photocoagulation-induced ocular hypertension), looking for differences in disease susceptibility between these mouse strains. Different genetic backgrounds and albinism led to differential susceptibility to experimentally induced retinal ganglion cell death among these three mouse strains. Overall, CD-1 mice appeared to have the highest sensitivity to retinal ganglion cell damage, while the C57Bl/6 background was more resistant in the three models used. Copyright © 2016 Elsevier Ltd. All rights reserved.
Karaa, Amel; Rahman, Shamima; Lombès, Anne; Yu-Wai-Man, Patrick; Sheikh, Muniza K; Alai-Hansen, Sherita; Cohen, Bruce H; Dimmock, David; Emrick, Lisa; Falk, Marni J; McCormack, Shana; Mirsky, David; Moore, Tony; Parikh, Sumit; Shoffner, John; Taivassalo, Tanja; Tarnopolsky, Mark; Tein, Ingrid; Odenkirchen, Joanne C; Goldstein, Amy
The common data elements (CDE) project was developed by the National Institute of Neurological Disorders and Stroke (NINDS) to provide clinical researchers with tools to improve data quality and allow for harmonization of data collected in different research studies. CDEs have been created for several neurological diseases; the aim of this project was to develop CDEs specifically curated for mitochondrial disease (Mito) to enhance clinical research. Nine working groups (WGs), composed of international mitochondrial disease experts, provided recommendations for Mito clinical research. They initially reviewed existing NINDS CDEs and instruments, and developed new data elements or instruments when needed. Recommendations were organized, internally reviewed by the Mito WGs, and posted online for external public comment for a period of eight weeks. The final version was again reviewed by all WGs and the NINDS CDE team prior to posting for public use. The NINDS Mito CDEs and supporting documents are publicly available on the NINDS CDE website ( https://commondataelements.ninds.nih.gov/ ), organized into domain categories such as Participant/Subject Characteristics, Assessments, and Examinations. We developed a comprehensive set of CDE recommendations, data definitions, case report forms (CRFs), and guidelines for use in Mito clinical research. The widespread use of CDEs is intended to enhance Mito clinical research endeavors, including natural history studies, clinical trial design, and data sharing. Ongoing international collaboration will facilitate regular review, updates and online publication of Mito CDEs, and support improved consistency of data collection and reporting.
Multicore systems are complex in that multiple processes run concurrently and can interfere with each other. Real-time systems add time constraints on top of that, making results invalid as soon as a deadline is missed. Tracing is often the most reliable and accurate tool available to study and understand such systems. However, tracing requires that users understand the kernel events and their meaning, so it is not very accessible. Using modeling to generate source code or to represent an application’s workflow is handy for developers and has emerged as part of the model-driven development methodology. In this paper, we propose a new approach to system analysis using model-based constraints on top of userspace and kernel traces. We introduce the constraint representation and show how traces can be used to follow the application’s workflow and check the constraints we set on the model. We then present a number of common problems that we encountered in real-time and multicore systems and describe how our model-based constraints could have helped to save time by automatically identifying the unwanted behavior.
Goethem, G. van
At the Lisbon 2000 Summit, a strategic goal was proposed for the European Union: 'to become the most competitive knowledge-based economy with more and better employment and social cohesion by 2010'. Overall, and in particular in the nuclear fission community, this EC initiative was well accepted by the main stakeholders. In Europe, the main stakeholders (i.e. suppliers and/or demanders) of nuclear knowledge are: the research organisations (with mixed public/private funding), the manufacturing industry (or vendors), the utilities and waste management organisations, the regulatory bodies (or technical safety organisations/TSOs) and academia. In the nuclear fission research area, under Euratom FP-5 (1998-2002), criticism was raised by a number of 'high level experts' that too many Community efforts were devoted to production (e.g. through execution of shared cost actions) and not enough to dissemination and transfer (e.g. through education and training) and exploitation (e.g. through innovation) of nuclear knowledge. They also complained about the resources wasted due to the 'fragmentation' of EU research. As far as production of nuclear fission knowledge is concerned, a variety of poles (or fragments) of scientific research and operational feedback exists in many countries, but there is no clear common strategy on how to integrate these fragments at the European level with a long-term prospect. As far as dissemination and transfer of nuclear knowledge are concerned, the situation in some EU-25 countries is dramatic: mostly due to a bad public perception of nuclear energy, the lack of teachers and students is becoming a serious concern. As far as exploitation of nuclear knowledge is concerned, all stakeholders are concerned about the unfair balance between supply and demand of knowledge, and about the relatively poor impact of research on technological and societal change. In conclusion, from an EU research point of view, the solutions to the above 'nuclear
Wang, Weining; Iyer, N. Gopalakrishna; Tay, Hsien Ts’ung; Wu, Yonghui; Lim, Tony K. H.; Zheng, Lin; Song, In Chin; Kwoh, Chee Keong; Huynh, Hung; Tan, Patrick O. B.; Chow, Pierce K. H.
Despite advances in therapeutics, outcomes for hepatocellular carcinoma (HCC) remain poor and there is an urgent need for efficacious systemic therapy. Unfortunately, drugs that are successful in preclinical studies often fail in the clinical setting, and we hypothesize that this is due to functional differences between primary tumors and commonly used preclinical models. In this study, we attempt to answer this question by comparing tumor morphology and gene expression profiles between primary tumors, xenografts and HCC cell lines. HepG2 cell lines and tumor cells from patient tumor explants were injected subcutaneously (ectopically) into the flank and orthotopically into the liver parenchyma of Mus musculus SCID mice. The mice were euthanized after two weeks. RNA was extracted from the tumors, and gene expression profiling was performed using the GeneChip Human Genome U133 Plus 2.0. Principal component analyses (PCA) and construction of dendrograms were conducted using the Partek genomics suite. PCA showed that the commonly used HepG2 cell line model and its xenograft counterparts were vastly different from all fresh primary tumors. Expression profiles of primary tumors were also significantly divergent from their counterpart patient-derived xenograft (PDX) models, regardless of the site of implantation. Xenografts from the same primary tumors were more likely to cluster together regardless of site of implantation, although heat maps showed distinct differences in gene expression profiles between orthotopic and ectopic models. The data presented here challenge the utility of routinely used preclinical models. Models using HepG2 were vastly different from primary tumors and PDXs, suggesting that this model is not clinically representative. Surprisingly, site of implantation (orthotopic versus ectopic) had limited impact on gene expression profiles, and in both scenarios xenografts differed significantly from the original primary tumors, challenging the long
Madsen, Robert; Barrett, Steven; Wilcox, Michael
The vision system of the common house fly has many properties, such as hyperacuity and parallel structure, which would be advantageous in a machine vision system. A software model has been developed which is ultimately intended to be a tool to guide the design of an analog real time vision system. The model starts by laying out cartridges over an image. The cartridges are analogous to the ommatidium of the fly's eye and contain seven photoreceptors each with a Gaussian profile. The spacing between photoreceptors is variable providing for more or less detail as needed. The cartridges provide information on what type of features they see and neighboring cartridges share information to construct a feature map.
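The cartridge layout described above might be sketched as follows: each cartridge holds seven photoreceptors with Gaussian acceptance profiles (one central, six on a hexagon), taking overlapping weighted samples of an image. The spacing, sigma, and kernel size below are arbitrary choices, not values from the model.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Normalized 2-D Gaussian acceptance profile."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def cartridge_response(image, cx, cy, spacing=3, sigma=1.5, size=7):
    """Seven Gaussian-weighted photoreceptor samples centred on (cx, cy):
    one central receptor plus six on a hexagon of the given spacing."""
    angles = np.linspace(0, 2 * np.pi, 7)[:-1]
    offsets = [(0, 0)] + [(int(round(spacing * np.cos(a))),
                           int(round(spacing * np.sin(a)))) for a in angles]
    k = gaussian_kernel(size, sigma)
    h = size // 2
    out = []
    for dx, dy in offsets:
        x, y = cx + dx, cy + dy
        patch = image[y - h : y + h + 1, x - h : x + h + 1]
        out.append(float((patch * k).sum()))
    return out  # 7 photoreceptor outputs

img = np.zeros((40, 40))
img[:, 20:] = 1.0                      # a vertical luminance edge
resp = cartridge_response(img, 20, 20)
print([f"{r:.2f}" for r in resp])
```

The receptor on the bright side of the edge responds much more strongly than its mirror on the dark side, which is the kind of local feature signal neighboring cartridges would share.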
The impact of common cause failures (CCF) on PSA results for NPPs contrasts sharply with the limited quality that can be achieved in their assessment. This is due to the dearth of observations and cannot be remedied in the short run. The methods employed for calculating failure rates should therefore be devised so as to make the best use of the few available observations on CCF. The Multi-Class Binomial Failure Rate (MCBFR) model achieves this by assigning observed failures to different classes according to their technical characteristics and applying the BFR formalism to each of them. The results are hence determined by a superposition of BFR-type expressions for each class, each with its own coupling factor. The model thus obtained flexibly reproduces the dependence of CCF rates on failure multiplicity suggested by the observed failure multiplicities. This is demonstrated by evaluating CCFs observed for combined impulse pilot valves in German NPPs.
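The superposition idea behind MCBFR can be illustrated with the standard BFR formulas: non-lethal shocks arrive at rate mu and fail each of m components independently with probability p, so event multiplicities are binomially distributed, and class contributions are summed. The class parameters below are invented for illustration, not estimated from the German valve data.

```python
from math import comb

def bfr_rates(m, mu, p, omega=0.0):
    """Binomial Failure Rate model: rate of shocks failing exactly k of
    m components.  Non-lethal shocks (rate mu) fail each component
    independently with probability p; lethal shocks (rate omega) fail
    all m components at once."""
    rates = [mu * comb(m, k) * p**k * (1 - p)**(m - k) for k in range(m + 1)]
    rates[m] += omega
    return rates  # rates[k] = rate of multiplicity-k events

def mcbfr_rates(m, classes):
    """Multi-Class BFR: superpose one BFR term per failure class,
    each class carrying its own coupling parameters (mu, p, omega)."""
    total = [0.0] * (m + 1)
    for mu, p, omega in classes:
        for k, r in enumerate(bfr_rates(m, mu, p, omega)):
            total[k] += r
    return total

# Hypothetical classes for a group of 4 valves (per-year rates):
classes = [(0.1, 0.2, 0.0),     # weakly coupled failure class
           (0.02, 0.7, 0.001)]  # strongly coupled class with lethal shocks
rates = mcbfr_rates(4, classes)
for k, r in enumerate(rates):
    print(f"multiplicity {k}: {r:.5f} /yr")
```

Because each class keeps its own coupling factor, the summed profile can match multiplicity distributions that a single-class BFR fit cannot.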
Ceusters, Werner; Blaisure, Jonathan
Correctly counting entities is a requirement for analytics tools to function appropriately. The Observational Medical Outcomes Partnership's (OMOP) Common Data Model (CDM) specifications were examined to assess the extent to which counting in OMOP CDM compatible data repositories would work as expected. To that end, constructs (tables, fields and attributes) defined in the OMOP CDM as well as cardinality constraints and other business rules found in its documentation and related literature were compared to the types of entities and axioms proposed in realism-based ontologies. It was found that not only the model itself, but also a proposed standard algorithm for computing condition eras may lead to erroneous counting of several sorts of entities.
Biering-Sørensen, Fin; Alai, Sherita; Anderson, Kim; Charlifue, Susan; Chen, Yuying; DeVivo, Michael; Flanders, Adam E.; Jones, Linda; Kleitman, Naomi; Lans, Aria; Noonan, Vanessa K.; Odenkirchen, Joanne; Steeves, John; Tansey, Keith; Widerström-Noga, Eva; Jakeman, Lyn B.
Objective: To develop a comprehensive set of common data elements (CDEs), data definitions, case report forms and guidelines for use in spinal cord injury (SCI) clinical research, as part of the CDE project at the National Institute of Neurological Disorders and Stroke (NINDS) of the US National Institutes of Health. Setting: International working groups. Methods: Nine working groups composed of international experts reviewed existing CDEs and instruments, created new elements when needed, and provided recommendations for SCI clinical research. The project was carried out in collaboration with and cross-referenced to the development of the International Spinal Cord Society (ISCoS) International SCI Data Sets. The recommendations were compiled, subjected to internal review, and posted online for external public comment. The final version was reviewed by all working groups and the NINDS CDE team prior to release. Results: The NINDS SCI CDEs and supporting documents are publicly available on the NINDS CDE website and the ISCoS website. The CDEs span the continuum of SCI care and the full range of domains of the International Classification of Functioning, Disability and Health. Conclusions: Widespread use of common data elements can facilitate SCI clinical research and trial design, data sharing, and retrospective analyses. Continued international collaboration will enable consistent data collection and reporting, and will help ensure that the data elements are updated, reviewed and broadcast as additional evidence is obtained. PMID:25665542
Chimeric antigen receptor (CAR) T-cell immunotherapy has emerged as a promising treatment for pre-B cell acute lymphoblastic leukemia (B-ALL), the most common type of childhood cancer. B-ALL is characterized by an overproduction of immature white blood cells called lymphoblasts. In a trial led by Center for Cancer Research investigators, around 70 to 90 percent of patients whose B-ALL had relapsed or developed resistance to chemotherapy entered remission after CAR T-cell therapy targeting CD19.
Davidson, William H.
National technology development initiatives in Japan and Europe are playing an increasingly important role in many fields of research. Such heightened international activity suggests a need for a more global perspective on research administration, and raises many questions for the United States' research community and science and technology…
Recent high-throughput sequencing has enabled the composition of Escherichia coli strains in the human microbial community to be profiled en masse. However, there are two challenges to address: (1) exploring the genetic differences between E. coli strains in the human gut and (2) characterizing the dynamic responses of E. coli to diverse stress conditions. We therefore investigated the E. coli strains in the human gut microbiome using deep sequencing data and reconstructed genome-wide metabolic networks for the three most common E. coli strains: E. coli HS, UTI89, and CFT073. The metabolic models show obvious strain-specific characteristics, both in network contents and in behaviors. We predicted optimal biomass production for the three models on four different carbon sources (acetate, ethanol, glucose, and succinate) and found that the stress-associated genes were involved in host-microbial interactions and increased in human obesity. The growth rates are similar among the models, but the flux distributions differ, even in E. coli core reactions. The correlations between human diabetes-associated metabolic reactions in the E. coli models were also predicted. The study provides a systems perspective on E. coli strains in the human gut microbiome and will be helpful for integrating diverse data sources in subsequent studies.
Tanner, Susanne E.; Teles-Machado, Ana; Martinho, Filipe; Peliz, Álvaro; Cabral, Henrique N.
Individual-based coupled physical-biological models have become the standard tool for studying ichthyoplankton dynamics and assessing fish recruitment. Here, common sole (Solea solea L.), a flatfish of high commercial importance in Europe, was used to evaluate the transport of eggs and larvae and to investigate the connectivity between spawning and nursery areas along the western Iberian coast, as spatio-temporal variability in dispersal and recruitment patterns can result in very strong or weak year-classes, causing large fluctuations in stock size. A three-dimensional particle tracking model coupled to a Regional Ocean Modelling System (ROMS) model was used to investigate the variability of sole larvae dispersal along the western Iberian coast over a five-year period (2004-2009). A sensitivity analysis evaluated (1) the importance of diel vertical migrations of larvae and (2) the size of designated recruitment areas. Results suggested that the connectivity patterns of sole larvae dispersal and their spatio-temporal variability are influenced by the configuration of the coast with its topographical structures, and thus the suitable recruitment area available, as well as by the wind-driven mesoscale circulation along the Iberian coast.
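A minimal Lagrangian particle-tracking sketch in the spirit of this kind of study (not the ROMS-coupled model itself): particles released in a "spawning" box are advected through an idealized coastal jet with a random-walk diffusion term, and connectivity is the fraction reaching a downstream "nursery" box. The velocity field, box locations, and diffusivity are all invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def velocity(x, y):
    """Idealized alongshore jet: weak cross-shore drift (u) plus an
    alongshore flow (v) that varies with cross-shore position."""
    return 0.05 * np.ones_like(x), 0.2 + 0.1 * np.sin(2 * np.pi * x)

def track(n=500, steps=400, dt=0.05, diff=0.02):
    """Euler advection with a random-walk term standing in for
    unresolved turbulent diffusion."""
    x = rng.uniform(0.0, 0.2, n)       # release in the spawning box
    y = rng.uniform(0.0, 0.2, n)
    for _ in range(steps):
        u, v = velocity(x, y)
        x += u * dt + diff * rng.normal(size=n) * np.sqrt(dt)
        y += v * dt + diff * rng.normal(size=n) * np.sqrt(dt)
    return x, y

x, y = track()
# Connectivity: fraction of particles ending inside a downstream box.
arrived = float(np.mean((y > 3.5) & (x > 0.9) & (x < 1.5)))
print(f"spawning-to-nursery connectivity: {arrived:.2%}")
```

Moving or resizing the nursery box, or changing `diff`, changes the connectivity estimate, which is the kind of sensitivity the study's analysis of recruitment-area size probes.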
We propose a new regulation mechanism based on the idea of the "common stomach" to explain several aspects of the resilience and homeostatic regulation of honeybee colonies. This mechanism exploits shared pools of substances (pollen, nectar, workers, brood) that modulate recruitment, abandonment and allocation patterns at the colony level and enable bees to perform several survival strategies to cope with difficult circumstances: lack of proteins leads to reduced feeding of young brood, to early capping of old brood and to regaining of already spent proteins through brood cannibalism. We modeled this system using linear interaction terms and the mass-action law. To test the predictive power of the model of this regulatory mechanism, we compared our model predictions to experimental data from several studies. These comparisons show that the proposed regulation mechanism can explain a variety of colony-level behaviors. Detailed analysis of the model revealed that these mechanisms could explain the resilience, stability and self-regulation observed in honeybee colonies. We found that manipulation of material flow and sudden perturbations of colony stocks are quickly compensated by a resulting counter-acting shift in task selection. Selective analysis of feedback loops allowed us to discriminate the importance of different feedback loops in the self-regulation of honeybee colonies. We stress that a network of simple proximate mechanisms can explain significant colony-level abilities that can also be seen as the ultimate outcome of the evolutionary trajectory of honeybees.
Amanda R. Panfil
Since the isolation and discovery of human T-cell leukemia virus type 1 (HTLV-1) over 30 years ago, researchers have utilized animal models to study HTLV-1 transmission, viral persistence, virus-elicited immune responses, and HTLV-1-associated disease development (ATL, HAM/TSP). Non-human primates, rabbits, rats, and mice have all been used to help understand HTLV-1 biology and disease progression. Non-human primates offer a model system that is phylogenetically similar to humans for examining viral persistence. Viral transmission, persistence, and immune responses have been widely studied using New Zealand White rabbits. The advent of molecular clones of HTLV-1 has offered the opportunity to assess the importance of various viral genes in rabbits, non-human primates, and mice. Additionally, over-expression of viral genes using transgenic mice has helped uncover the importance of Tax and Hbz in the induction of lymphoma and other lymphocyte-mediated diseases. HTLV-1 inoculation of certain strains of rats results in histopathological features and clinical symptoms similar to those of humans with HAM/TSP. Transplantation of certain types of ATL cell lines into immunocompromised mice results in lymphoma. Recently, “humanized” mice have been used to model ATL development for the first time. Not all HTLV-1 animal models develop disease, and those that do vary in consistency depending on the type of monkey, strain of rat, or even type of ATL cell line used. However, the progress made using animal models cannot be overstated, as it has led to insights into the mechanisms regulating viral replication, viral persistence and disease development and, most importantly, to model systems for testing disease treatments.
Thirel, Guillaume; Andréassian, Vazken; Perrin, Charles
This communication will present a summary of the outcomes of a workshop session held in Gothenburg (Sweden) during the International Association of Hydrological Sciences (IAHS) General Assembly in 2013 on the topic of modelling of temporally-varying catchments, i.e. catchments that exhibit significant changes in their physical or climate conditions over a period of record. This workshop aimed at contributing to the Panta Rhei IAHS decade by offering a forum for modellers to debate hydrological modelling under change. For this workshop, the participants had been invited to apply a calibration and evaluation protocol to their own hydrological models on a given set of changing catchments and to come to Gothenburg to present their results (Thirel et al., 2015a). It was recognized that this protocol, based on calibration and evaluation over contrasting periods, is an appropriate way of assessing the suitability of hydrological models to handle changing conditions. Some modellers saw this exercise as an opportunity to confront their models with conditions different from their usual application area, or to use models to better understand hydrological changes. The crucial need for dedicated protocols to evaluate models under change was also stressed by some modellers, who proposed complementary testing protocols (Thirel et al., 2015b). It is of utmost importance that studies in which models are applied under extreme conditions (meaning conditions very different from their calibration conditions) are performed using well-defined protocols. Several challenges for future research to improve the hydrological modelling of changing catchments were discussed during the workshop and will be presented. References: Thirel G., V. Andréassian, C. Perrin, J.-N. Audouy, L. Berthet, P. Edwards, N. Folton, C. Furusho, A. Kuentz, J. Lerat, G. Lindström, E. Martin, T. Mathevet, R. Merz, J. Parajka, D. Ruelland, J. Vaze. Hydrology under change: an evaluation protocol to investigate how
Benjamin, S.; Sun, S.; Grell, G. A.; Green, B.; Olson, J.; Kenyon, J.; James, E.; Smirnova, T. G.; Brown, J. M.
Cloud-radiation representation in models of subgrid-scale clouds is a known gap, from subseasonal-to-seasonal models down to storm-scale models applied for forecast durations of only a few hours. NOAA/ESRL has been applying common physical parameterizations for scale-aware deep/shallow convection and boundary-layer mixing over this wide range of time and spatial scales, with some progress to be reported in this presentation. The Grell-Freitas scheme (2014, Atmos. Chem. Phys.) and the MYNN boundary-layer EDMF scheme (Olson / Benjamin et al. 2016, Mon. Wea. Rev.) have been applied and tested extensively in the NOAA hourly updated 3-km High-Resolution Rapid Refresh (HRRR) and 13-km Rapid Refresh (RAP) model/assimilation systems over the United States and North America, targeting improved boundary-layer evolution and cloud-radiation representation in all seasons. This representation is critical both for warm-season severe convective storm forecasting and for winter-storm prediction of snow and mixed precipitation. At the same time, the Grell-Freitas scheme has also been applied as an option for subseasonal forecasting toward improved US week 3-4 prediction with the FIM-HYCOM coupled model (Green et al. 2017, MWR). Cloud/radiation evaluation using CERES satellite-based estimates has been applied both to 12-h RAP (13 km) forecasts and to Weeks 1-4 of 32-day FIM-HYCOM (60 km) forecasts. Initial results reveal that improved cloud representation is needed at both resolutions, and this is now guiding further refinement of cloud representation, including in the Grell-Freitas scheme and the updated MYNN-EDMF scheme (both now also in global testing as well as in the 3-km HRRR and 13-km RAP models).
Edwardson, Matthew A; Wang, Ximing; Liu, Brent; Ding, Li; Lane, Christianne J; Park, Caron; Nelsen, Monica A; Jones, Theresa A; Wolf, Steven L; Winstein, Carolee J; Dromerick, Alexander W
Stroke patients with mild-moderate upper extremity motor impairments and minimal sensory and cognitive deficits provide a useful model to study recovery and improve rehabilitation. Laboratory-based investigators use lesioning techniques for similar goals. Our objective was to determine whether stroke lesions in an upper extremity rehabilitation trial cohort match lesions from the preclinical stroke recovery models used to drive translational research. Clinical neuroimages from 297 participants enrolled in the Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (ICARE) study were reviewed. Images were characterized based on lesion type (ischemic or hemorrhagic), volume, vascular territory, depth (cortical gray matter, cortical white matter, subcortical), old strokes, and leukoaraiosis. Lesions were compared with those of preclinical stroke models commonly used to study upper limb recovery. Among the ischemic stroke participants, median infarct volume was 1.8 mL, with most lesions confined to subcortical structures (61%), including the anterior choroidal artery territory (30%) and the pons (23%). ICARE participants do not represent all stroke patients, but they represent a clinically and scientifically important subgroup. Compared with lesions in general stroke populations and widely studied animal models of recovery, ICARE participants had smaller, more subcortically based strokes. Improved preclinical-clinical translational efforts may require better alignment of lesions between preclinical and human stroke recovery models.
Davis, Sean D.; Piercy, Fred P.
Proponents of the common factors movement in marriage and family therapy (MFT) suggest that, rather than specific models of therapy, elements common across models of therapy and common to the process of therapy itself are responsible for therapeutic change. This article--the second of two companion articles--reports on a study designed to further…
Hydro2k Consortium, Pages
Water availability is fundamental to societies and ecosystems, but our understanding of variations in hydroclimate (including extreme events, flooding, and decadal periods of drought) is limited because of a paucity of modern instrumental observations that are distributed unevenly across the globe and only span parts of the 20th and 21st centuries. Such data coverage is insufficient for characterizing hydroclimate and its associated dynamics because of its multidecadal to centennial variability and highly regionalized spatial signature. High-resolution (seasonal to decadal) hydroclimatic proxies that span all or parts of the Common Era (CE) and paleoclimate simulations from climate models are therefore important tools for augmenting our understanding of hydroclimate variability. In particular, the comparison of the two sources of information is critical for addressing the uncertainties and limitations of both while enriching each of their interpretations. We review the principal proxy data available for hydroclimatic reconstructions over the CE and highlight the contemporary understanding of how these proxies are interpreted as hydroclimate indicators. We also review the available last-millennium simulations from fully coupled climate models and discuss several outstanding challenges associated with simulating hydroclimate variability and change over the CE. A specific review of simulated hydroclimatic changes forced by volcanic events is provided, as is a discussion of expected improvements in estimated radiative forcings, models, and their implementation in the future. Our review of hydroclimatic proxies and last-millennium model simulations is used as the basis for articulating a variety of considerations and best practices for how to perform proxy-model comparisons of CE hydroclimate. This discussion provides a framework for how best to evaluate hydroclimate variability and its associated dynamics using these comparisons and how they can better inform
Lindholm, D. M.; Wilson, A.
Much progress has been made in scientific data interoperability, especially in the areas of metadata and discovery. However, while a data user may have improved techniques for finding data, there is often a large chasm to span when it comes to acquiring the desired subsets of various datasets and integrating them into a data processing environment. Some tools such as OPeNDAP servers and the Unidata Common Data Model (CDM) have introduced improved abstractions for accessing data via a common interface, but they alone do not go far enough to enable fusion of data from multidisciplinary sources. Although data from various scientific disciplines may represent semantically similar concepts (e.g. time series), the user may face widely varying structural representations of the data (e.g. row versus column oriented), not to mention radically different storage formats. It is not enough to convert data to a common format. The key to fusing scientific data is to represent each dataset with consistent sampling. This can best be done by using a data model that expresses the functional relationship that each dataset represents. The domain of those functions determines how the data can be combined. The Visualization for Algorithm Development (VisAD) Java API has provided a sophisticated data model for representing the functional nature of scientific datasets for well over a decade. Because VisAD is largely designed for its visualization capabilities, the data model can be cumbersome to use for numerical computation, especially for those not comfortable with Java. Although both VisAD and the implementation of the CDM are written in Java, neither defines a pure Java interface that others could implement and program to, further limiting potential for interoperability. In this talk, we will present a solution for data integration based on a simple discipline-agnostic scientific data model and programming interface that enables a dataset to be defined in terms of three variable types
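The kind of functional data model described here can be pictured with a hypothetical sketch (class and method names are invented, not from VisAD or the CDM): each dataset is a function from a domain to a codomain, and fusion is permitted only when the domains, i.e. the sampling, agree.

```python
# Hypothetical sketch of a discipline-agnostic functional data model with
# three variable types; the names Scalar, Tuple, and Function are
# illustrative assumptions, not the cited API.

class Scalar:
    """A single named value."""
    def __init__(self, name, value):
        self.name, self.value = name, value

class Tuple:
    """An ordered grouping of variables with no functional relationship."""
    def __init__(self, name, *variables):
        self.name, self.variables = name, variables

class Function:
    """A mapping from a domain to a codomain: the functional relationship
    that a dataset represents (e.g. time -> temperature)."""
    def __init__(self, name, domain, codomain):
        self.name, self.domain, self.codomain = name, domain, codomain

    def join(self, other):
        # Two datasets can be fused only if their sampling (domain) matches.
        if [s.value for s in self.domain] != [s.value for s in other.domain]:
            raise ValueError("domains differ; resample before fusing")
        return Function(f"{self.name}+{other.name}", self.domain,
                        [Tuple("sample", a, b)
                         for a, b in zip(self.codomain, other.codomain)])

# A time series is a function from time to a measured quantity.
times = [Scalar("time", t) for t in (0, 1, 2)]
temps = [Scalar("T", v) for v in (273.0, 274.5, 275.1)]
winds = [Scalar("u", v) for v in (3.2, 2.8, 4.1)]
ts1 = Function("temperature", times, temps)
ts2 = Function("wind", times, winds)
fused = ts1.join(ts2)   # consistent sampling, so fusion succeeds
```

The point of the sketch is that the domain, not the storage format, determines how datasets can be combined.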
El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.
The Moroccan 2 MW TRIGA MARK II research reactor at the Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively support various fields of basic nuclear research, manpower training, and production of radioisotopes for use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparison with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) of the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous-energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with essentially no physical approximation. Continuous-energy cross-section data from recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3), as well as the S(α, β) thermal neutron scattering functions distributed with the MCNP code, were used. The cross-section libraries were generated using the NJOY99 system updated to its most recent patch file 'up259'. The consistency and accuracy of both the Monte Carlo simulation and the neutron transport physics were established by benchmarking against the TRIGA experiments. Core excess reactivity, total and integral control rod worths, and power peaking factors were used in the validation process. Results of the calculations are analysed and discussed.
Yun, Xiang; Feng, Xiancheng
Combining voice, data, and broadband audio-video services over the IP protocol, to deliver real-time and interactive services to terminal users (homes), is a research hotspot of current broadband networking. Home networking, also called the digital home network, is a new kind of network and application technology that can provide a variety of services: PCs, home entertainment equipment, home appliances, home wiring, security, and illumination systems communicate with one another over a composed internal home network and connect to the WAN through a home gateway. Currently, home networking equipment falls into three kinds: information equipment, home appliances, and communication equipment. Devices inside the home network can exchange information with outside networks through the home gateway; this communication is bidirectional, so users can obtain information and services from the public network through internal home networking equipment, and can likewise obtain information and resources to control that internal equipment. The general network model of home networking defines four functional entities: (1) HA (Home Access), the home network access entity; (2) HB (Home Bridge), the home network bridging entity; (3) HC (Home Client), the home network client entity; and (4) HD (Home Device), the decoder entity. There are many physical ways to implement these four functional entities, and on top of them, reference models are defined for the physical layer, the link layer, the IP layer, and the application layer. In the future home network
Sunscreen products are predominantly regulated as over-the-counter (OTC) drugs by the US FDA. The "active" ingredients function as ultraviolet filters. Once a sunscreen product is generally recognized as safe and effective (GRASE) via an OTC drug review process, new formulations using these ingredients do not require FDA review and approval. However, the majority of these ingredients have never been tested to uncover any potential endocrine activity, and their ability to interact with the estrogen receptor (ER) is unknown, despite the fact that the ER is a very extensively studied target related to endocrine activity. Consequently, we have developed an in silico model to prioritize single-ingredient estrogen receptor activity for use when actual animal data are inadequate, equivocal, or absent. It relies on consensus modeling to qualitatively and quantitatively predict ER binding activity. As proof of concept, the model was applied to ingredients commonly used in sunscreen products worldwide and a few reference chemicals. Of the 32 chemicals with unknown ER binding activity that were evaluated, seven were predicted to be active estrogenic compounds. Five of the seven were confirmed by published data. Further experimental data are needed to confirm the other two predictions.
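The consensus idea described here can be sketched in a few lines (the component "models" below are invented stand-in functions scoring one descriptor, not the study's actual QSAR models): several independent predictors each vote on ER binding, and the consensus combines a majority qualitative call with an averaged quantitative score.

```python
# Illustrative consensus prediction: majority vote for the qualitative
# (active/inactive) call, mean score for the quantitative call.

def consensus_predict(models, chemical):
    votes = [m(chemical) for m in models]               # (active?, score) pairs
    n_active = sum(1 for active, _ in votes if active)
    is_active = n_active > len(models) / 2              # qualitative consensus
    mean_score = sum(s for _, s in votes) / len(votes)  # quantitative consensus
    return is_active, mean_score

# Three hypothetical component models, each thresholding one descriptor.
m1 = lambda c: (c["logP"] > 3.0, 0.8 if c["logP"] > 3.0 else 0.1)
m2 = lambda c: (c["logP"] > 4.0, 0.6 if c["logP"] > 4.0 else 0.2)
m3 = lambda c: (c["logP"] > 2.5, 0.9 if c["logP"] > 2.5 else 0.1)

active, score = consensus_predict([m1, m2, m3], {"logP": 3.5})
```

Consensus over diverse component models tends to be more robust than any single model, which is the rationale the abstract gives for the approach.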
Poirier, S.; Buteau, A.; Ounsy, M.; Rodriguez, C.; Hauser, N.; Lam, T.; Xiong, N.
For almost 20 years, the scientific community of neutron and synchrotron institutes has been dreaming of a common data format for exchanging experimental results and of common applications for reducing and analyzing the data. Using HDF5 as a data container has become the standard in many facilities. The big issue is the standardization of the data organization (schema) within the HDF5 container. By introducing a new level of indirection for data access, the Common-Data-Model-Access (CDMA) framework proposes a solution and allows separation of responsibilities between data reduction developers and the institute: data reduction developers are responsible for the data reduction code; the institute provides a plug-in to access the data. The CDMA is a core API that accesses data through a data-format plug-in mechanism and scientific application definitions (sets of keywords) arrived at by consensus between scientists and institutes. Using an innovative 'mapping' system between application definitions and physical data organizations, the CDMA allows data reduction applications to be developed independently of both the data file container and the schema. Each institute develops a data access plug-in for its own data file formats, along with the mapping between application definitions and its data files. Thus data reduction applications can be developed from a strictly scientific point of view and are immediately able to process data acquired from several institutes.
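The indirection the CDMA introduces can be pictured with a small hypothetical sketch (institute names, paths, and keywords below are all invented): reduction code addresses data only through application-definition keywords, while each institute's plug-in supplies the mapping from those keywords to its own physical schema.

```python
# Sketch of keyword-based data access through per-institute plug-ins.

class InstitutePlugin:
    """One per facility: knows the physical layout of that facility's files."""
    def __init__(self, mapping, storage):
        self.mapping = mapping    # application keyword -> physical path
        self.storage = storage    # stand-in for the HDF5-like container

    def get(self, keyword):
        return self.storage[self.mapping[keyword]]

# Two institutes store the same concept under different schemas.
inst_a = InstitutePlugin({"detector_image": "/entry/scan/det0/data"},
                         {"/entry/scan/det0/data": [1, 2, 3]})
inst_b = InstitutePlugin({"detector_image": "/data/frames"},
                         {"/data/frames": [4, 5, 6]})

def reduce_data(plugin):
    # Data-reduction code is written once, against keywords only.
    return sum(plugin.get("detector_image"))

print(reduce_data(inst_a), reduce_data(inst_b))  # 6 15
```

The reduction function never sees a physical path, so the same scientific code processes data from both facilities unchanged.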
This paper investigates the valuation of vulnerable European options considering the market prices of common systematic jump risks under regime-switching jump-diffusion models. A regime-switching Esscher transform is adopted to identify an equivalent martingale measure for pricing vulnerable European options. Explicit analytical pricing formulae for vulnerable European options are derived by risk-neutral pricing theory. For comparison, two other cases are also considered separately: the first treats all jump risks as unsystematic risks, while the second assumes all jump risks to be systematic. Numerical examples for the valuation of vulnerable European options are provided to illustrate our results and indicate the influence of the market prices of jump risks on the valuation of vulnerable European options.
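For intuition only, here is a greatly simplified Monte Carlo sketch of a vulnerable European call: the paper's regime-switching jump-diffusion dynamics and Esscher-transform measure change are replaced by plain geometric Brownian motion under a risk-neutral measure, with an independent counterparty asset process and a fixed recovery fraction on default. All parameter values are invented.

```python
# Vulnerable call sketch: if the counterparty's asset value V_T ends below
# its debt level D, the holder recovers only a fraction of the payoff.
import math, random

random.seed(0)
S0, K, r, sigma_S = 100.0, 100.0, 0.03, 0.2   # underlying
V0, D, sigma_V, rec = 120.0, 100.0, 0.3, 0.4  # counterparty assets, debt, recovery
T, n = 1.0, 100_000

total = 0.0
for _ in range(n):
    zs, zv = random.gauss(0, 1), random.gauss(0, 1)   # independent shocks
    ST = S0 * math.exp((r - 0.5 * sigma_S**2) * T + sigma_S * math.sqrt(T) * zs)
    VT = V0 * math.exp((r - 0.5 * sigma_V**2) * T + sigma_V * math.sqrt(T) * zv)
    payoff = max(ST - K, 0.0)
    if VT < D:                      # counterparty default: partial recovery
        payoff *= rec
    total += payoff
price = math.exp(-r * T) * total / n
```

The resulting price is strictly below the default-free Black-Scholes value, which is the defining feature of a vulnerable option; the paper's contribution is pricing this effect in closed form under far richer dynamics.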
Common rail injection systems are becoming a more widely used solution in the fuel systems of modern diesel engines. The main component and characteristic feature of the system is the rail, which stores fuel under high pressure and passes it to the injectors and on to the combustion chamber. An important element in this process is the high-pressure pump, maintaining adequate pressure in the rail. Common rail (CR) systems are being modified in order to optimise their operation, and virtual simulations are a useful tool for analysing the correctness of operation of the system while varying its parameters and settings, without any negative impact on the real object. In the study described here, a computer simulation of the high-pressure pump of a CR system was built in the MatLab environment, based on the actual dimensions of the object, a one-cylinder diesel engine, the Farymann Diesel 18W. The resulting model consists of two parts: the first is responsible for simulating the operation of the high-pressure pump, and the second for simulating the remaining elements of the CR system. The simulation produced waveforms of the following parameters: fluid flow from the manifold to the injector [m3/s], fluid flow from the manifold to the atmosphere [m3/s], and manifold pressure [Pa]. The simulation results allow for a positive verification of the model, and the resulting system could become a useful element of a simulation of the entire test stand and its control algorithm.
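The manifold-pressure behaviour such a model reproduces can be sketched with a lumped-volume balance, dp/dt = (K/V)(Q_in − Q_out), integrated explicitly. The parameter values below are illustrative, not those of the Farymann 18W rig.

```python
# Minimal rail-pressure sketch: a single control volume with fuel bulk
# modulus K, constant pump delivery, and a crude pressure-dependent outflow.
# All values are invented for illustration.

K = 1.2e9       # fuel bulk modulus [Pa]
V = 2.0e-5      # rail volume [m^3]
Q_pump = 5e-7   # pump delivery [m^3/s]
c_inj = 5e-15   # hypothetical injector flow coefficient [m^3/(s*Pa)]

p = 1e5         # initial rail pressure [Pa]
dt = 1e-4       # explicit Euler time step [s]
for _ in range(200_000):                # 20 s of simulated time
    Q_inj = c_inj * p                   # outflow grows with rail pressure
    p += dt * (K / V) * (Q_pump - Q_inj)

p_steady = Q_pump / c_inj               # equilibrium where inflow = outflow
```

The pressure relaxes toward the equilibrium where pump inflow balances injector outflow; the real model adds the pump kinematics, injector timing, and leakage paths on top of this balance.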
A Common User Data Model, CUDM, has been developed for the purpose of benchmark calculations between the APOLLO3 and MARBLE code systems. The current version of CUDM was designed for core calculation benchmark problems with three-dimensional Cartesian (3-D XYZ) geometry. CUDM is able to manage all input/output data, such as the 3-D XYZ geometry, effective macroscopic cross sections, the effective multiplication factor, and the neutron flux. In addition, visualization tools for geometry and neutron flux were included. CUDM was designed using object-oriented techniques and implemented in the Python programming language. Based on CUDM, a prototype system for benchmark calculations, CUDM-benchmark, was also developed. CUDM-benchmark supports input/output data conversion for the IDT solver in APOLLO3 and the TRITAC and SNT solvers in MARBLE. In order to evaluate the pertinence of CUDM, CUDM-benchmark was applied to benchmark problems proposed by T. Takeda, G. Chiba and I. Zmijarevic. It was verified that CUDM-benchmark successfully reproduced the results calculated with the reference input data files, and provided consistent results among all the solvers using one common input data set defined by CUDM. In addition, a detailed benchmark calculation for the Chiba benchmark was performed using CUDM-benchmark. The Chiba benchmark is a neutron transport benchmark problem for a fast criticality assembly without homogenization. It consists of 4 core configurations which have different sodium void regions, and each core configuration is defined by more than 5,000 fuel/material cells. In this application, it was found that the results of the IDT and SNT solvers agreed well with the reference results of a Monte Carlo code. In addition, model effects such as the quadrature set, S_n order, and mesh size effects were systematically evaluated and summarized in this report.
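A neutral exchange object of this kind might look like the following Python sketch (class and field names are invented, not CUDM's actual API): solver-specific converters read from one common case description, so every solver is guaranteed to consume the same problem definition.

```python
# Hypothetical common-user-data-model layer for a 3-D XYZ benchmark.

class XYZGeometry:
    def __init__(self, nx, ny, nz, materials):
        assert len(materials) == nx * ny * nz
        self.shape = (nx, ny, nz)
        self.materials = materials          # material id per cell

class BenchmarkCase:
    """Neutral exchange object shared by all solvers."""
    def __init__(self, geometry, xs_by_material):
        self.geometry = geometry
        self.xs = xs_by_material            # macroscopic cross sections
        self.keff = None                    # filled in by a solver
        self.flux = None

def to_solver_input(case, solver):
    # One converter per solver; the common case itself never changes.
    if solver == "IDT":
        return {"mesh": case.geometry.shape, "sigma": case.xs}
    if solver == "SNT":
        return {"grid": list(case.geometry.shape), "xsec": case.xs}
    raise ValueError(f"no converter for {solver}")

geom = XYZGeometry(2, 2, 2, [1] * 8)
case = BenchmarkCase(geom, {1: {"total": 0.5, "nu_fission": 0.3}})
idt_in = to_solver_input(case, "IDT")
snt_in = to_solver_input(case, "SNT")
```

Because every solver input is derived from the same object, any discrepancy between solver results must come from the solvers themselves, not from divergent input decks, which is the point of such a benchmark layer.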
Meisner, Søren; Lehur, Paul-Antoine; Moran, Brendan; Martins, Lina; Jemec, Gregor Borut Ernst
Background: Peristomal skin complications (PSCs) are the most common post-operative complications following creation of a stoma. Living with a stoma is a challenge, not only for the patient and their carers, but also for society as a whole. Due to methodological problems of PSC assessment, the associated health-economic burden of medium- to long-term complications has been poorly described. Aim: The aim of the present study was to create a model to estimate treatment costs of PSCs using the standardized assessment Ostomy Skin Tool as a reference. The resultant model was applied to a real-life global data set of stoma patients (n = 3017) to determine the prevalence and financial burden of PSCs. Methods: Eleven experienced stoma care nurses were interviewed to get a global understanding of a treatment algorithm that formed the basis of the cost analysis. The estimated costs were based on a seven-week treatment period. PSC costs were estimated for five underlying diagnostic categories and three levels of severity. The estimated treatment costs of severe cases of PSCs were increased 2–5 fold for the different diagnostic categories of PSCs compared with mild cases. French unit costs were applied to the global data set. Results: The estimated total average cost for a seven-week treatment period (including appliances and accessories) was 263€ for those with PSCs (n = 1742) compared to 215€ for those without PSCs (n = 1172). A covariance analysis showed that leakage level had a significant impact on PSC cost, from 'rarely/never' to 'always/often' (p<0.00001) and from 'rarely/never' to 'sometimes' (p = 0.0115). Conclusion: PSCs are common and troublesome and the consequences are substantial, both for the patient and from a health-economic viewpoint. PSCs should be diagnosed and treated at an early stage to prevent long-term, debilitating and expensive complications. PMID:22679479
A I Pearce
Development of an optimal interface between bone and orthopaedic and dental implants has been pursued for many years. In order to determine whether a newly developed implant material conforms to the requirements of biocompatibility, mechanical stability and safety, it must undergo rigorous testing both in vitro and in vivo. Results from in vitro studies can be difficult to extrapolate to the in vivo situation. For this reason the use of animal models is often an essential step in the testing of orthopaedic and dental implants prior to clinical use in humans. This review discusses some of the more commonly available and frequently used animal models, such as the dog, sheep, goat, pig and rabbit, for the evaluation of bone-implant interactions. Factors to consider when choosing an animal model and implant design are discussed. Various bone-specific features are discussed, including how commonly each species is used, bone macrostructure and microstructure, and bone composition and remodelling, with emphasis placed on the similarity between the animal model and the human clinical situation. While the rabbit was the most commonly used of the species discussed in this review, it is clear that this species shows the least similarity to human bone. There were only minor differences in bone composition between the various species and humans. The pig demonstrates a good likeness to human bone; however, difficulties may be encountered in relation to its size and ease of handling. In this respect the dog and sheep/goat show more promise as animal models for the testing of bone implant materials. While no species fulfils all of the requirements of an ideal model, an understanding of the differences in bone architecture and remodelling between the species is likely to assist in the selection of a suitable species for a defined research question.
We work in a model where all CP-violating phenomena have a common source. CP is spontaneously broken at a large scale V through the phase of a complex singlet scalar. An additional SU(2)_L-singlet vector-like down-type quark relates this high-scale CP violation to low energy. We quantitatively analyse this model in the quark sector. We obtain the numerical values of the parameters of the Lagrangian in the quark sector for a specific ansatz of the 4×4 down-type quark mass matrix in which the weak phase is generated minimally. The Z → b b̄ vertex is modified in the presence of the extra vector-like down-type quark; from the experimental lower bound on the partial decay width Γ(Z → b b̄), we find a lower bound on the mass of the additional down-type quark. Tree-level flavour-changing neutral currents appear in this model due to the presence of the extra vector-like down-type quark. We give the range of values of the mass splitting Δm_{B_q} in the B^0_q–B̄^0_q system, using the SM box, Z-mediated tree-level, and Z-mediated one-loop diagrams together, for both q = d, s. We find the analytical expression for Γ^q_{12} in this model from the standard box and the Z- and Higgs-mediated penguin diagrams for the B^0_q–B̄^0_q system, q = d, s. From this we numerically evaluate the decay width difference ΔΓ_{B_q}/Γ_{B_q}. We also find the numerical values of the CP asymmetry parameters a_J and a_π for the decays B^0_d → J/ψ K_S and B^0_d → π^+ π^-, respectively. We obtain a lower bound on the scale V through the upper bound on the strong CP phase
In order to respond to the increasingly polluted environment and maintain sustainable economic and social development in Jiangsu province, the author calculated resource and environment indices for Jiangsu using the LMDI (logarithmic-mean Divisia index) decomposition method based on the Commoner model (see formulas (2), (5), (6) and (7)), to reflect the cumulative effects of the three major influencing factors. As shown in Table 2 and Figure 3, the research results show that the expansion of the economy and the growth of the population increase resource consumption and aggravate environmental pollution, while technological progress reduces the pressure on resources and the environment. According to these findings, the paper proposes policy recommendations, such as developing a circular economy, promoting technological innovation and strengthening regional cooperation mechanisms, in order to reduce environmental pollution while the economy develops. These will be useful to policymakers.
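The additive LMDI decomposition used in such studies splits the change in a quantity C = Π x_i exactly into one term per factor, ΔC_x = L(C_T, C_0) ln(x_T/x_0), where L(a, b) = (a − b)/ln(a/b) is the logarithmic mean. A sketch with made-up two-period data for a Commoner-style identity C = population × affluence × technology:

```python
# Additive LMDI-I decomposition of a three-factor Kaya/Commoner identity.
# All numbers are invented illustrative data, not the Jiangsu figures.
import math

def logmean(a, b):
    return (a - b) / math.log(a / b) if a != b else a

# Base year (0) and end year (T): population, GDP per capita, emissions per GDP.
P0, A0, T0 = 70.0, 2.0, 0.5
PT, AT, TT = 80.0, 3.0, 0.4
C0, CT = P0 * A0 * T0, PT * AT * TT   # total impact in each year

L = logmean(CT, C0)
effects = {
    "population": L * math.log(PT / P0),
    "affluence":  L * math.log(AT / A0),
    "technology": L * math.log(TT / T0),
}
# LMDI is a "perfect" decomposition: the factor effects sum exactly to the
# total change, with no residual term.
assert abs(sum(effects.values()) - (CT - C0)) < 1e-9
```

Here the technology term is negative (emissions per unit GDP fell), mirroring the paper's finding that technological progress offsets part of the scale and population effects.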
Fu, Y.; Fang, H.
The design methods of furniture differ between East and West, and this has been a hotspot among scholars. However, in terms of the theory of modern design innovation, neither the early creation theories, modern design theory, nor the widely applied TRIZ theory fully fits modern furniture design innovation, so it is urgent to study modern furniture design theory. Based on the ideas of TRIZ theory, and drawing on a large body of literature as data, this paper uses a statistical stratification method to analyse and sort out the research on modern seating, and finally puts forward a modern furniture design model, which provides new ideas and perspectives for the modern design of Chinese furniture.
Pascoe, Charlotte; Lawrence, Bryan; Moine, Marie-Pierre; Ford, Rupert; Devine, Gerry
The EU METAFOR Project (http://metaforclimate.eu) has created a web-based model documentation questionnaire to collect metadata from the modelling groups that are running simulations in support of the Coupled Model Intercomparison Project - 5 (CMIP5). The CMIP5 model documentation questionnaire will retrieve information about the details of the models used, how the simulations were carried out, how the simulations conformed to the CMIP5 experiment requirements, and details of the hardware used to perform the simulations. The metadata collected by the CMIP5 questionnaire will allow CMIP5 data to be compared in a scientifically meaningful way. This paper describes the life-cycle of the CMIP5 questionnaire development, which starts with relatively unstructured input from domain specialists and ends with formal XML documents that comply with the METAFOR Common Information Model (CIM). Each development step is associated with a specific tool: (1) mind maps are used to capture information requirements from domain experts and build a controlled vocabulary, (2) a Python parser processes the XML files generated by the mind maps, (3) Django (Python) is used to generate the dynamic structure and content of the web-based questionnaire from the processed XML and the METAFOR CIM, (4) Python parsers ensure that information entered into the CMIP5 questionnaire is output as CIM-compliant XML, (5) CIM-compliant output allows automatic information capture tools to harvest questionnaire content into databases such as the Earth System Grid (ESG) metadata catalogue. This paper will focus on how Django (Python) and XML input files are used to generate the structure and content of the CMIP5 questionnaire. It will also address how the choice of development tools listed above provided a framework that enabled working scientists (whom we would never ordinarily get to interact with UML and XML) to be part of the iterative development process and ensure that the CMIP5 model documentation questionnaire
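Step (2) of the pipeline, parsing controlled-vocabulary XML into structures from which form fields could be generated, might look like this miniature sketch (the XML element names and field layout are invented, not METAFOR's actual schema):

```python
# Parse a mind-map-style controlled vocabulary into form-field definitions.
import xml.etree.ElementTree as ET

VOCAB_XML = """
<vocab component="Atmosphere">
  <property name="TopOfModel" type="float" units="km"/>
  <property name="AdvectionScheme" type="enum">
    <value>semi-Lagrangian</value>
    <value>Eulerian</value>
  </property>
</vocab>
"""

def parse_vocab(xml_text):
    root = ET.fromstring(xml_text)
    fields = []
    for prop in root.findall("property"):
        field = {"name": prop.get("name"), "type": prop.get("type")}
        if prop.get("type") == "enum":
            # Enumerated properties become choice widgets in the questionnaire.
            field["choices"] = [v.text for v in prop.findall("value")]
        fields.append(field)
    return root.get("component"), fields

component, fields = parse_vocab(VOCAB_XML)
```

In a Django setting, each returned field dictionary would drive the creation of one dynamically generated form field, which is how unstructured expert input ends up as a structured, CIM-conformant questionnaire.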
Saver, Jeffrey L.; Warach, Steven; Janis, Scott; Odenkirchen, Joanne; Becker, Kyra; Benavente, Oscar; Broderick, Joseph; Dromerick, Alexander W.; Duncan, Pamela; Elkind, Mitchell S. V.; Johnston, Karen; Kidwell, Chelsea S.; Meschia, James F.; Schwamm, Lee
Background and Purpose The National Institute of Neurological Disorders and Stroke initiated development of stroke-specific Common Data Elements (CDEs) as part of a project to develop data standards for funded clinical research in all fields of neuroscience. Standardizing data elements in translational, clinical and population research in cerebrovascular disease could decrease study start-up time, facilitate data sharing, and promote well-informed clinical practice guidelines. Methods A Working Group of diverse experts in cerebrovascular clinical trials, epidemiology, and biostatistics met regularly to develop a set of Stroke CDEs, selecting among, refining, and adding to existing, field-tested data elements from national registries and funded trials and studies. Candidate elements were revised based on comments from leading national and international neurovascular research organizations and the public. Results The first iteration of the NINDS stroke-specific CDEs comprises 980 data elements spanning nine content areas: 1) Biospecimens and Biomarkers; 2) Hospital Course and Acute Therapies; 3) Imaging; 4) Laboratory Tests and Vital Signs; 5) Long Term Therapies; 6) Medical History and Prior Health Status; 7) Outcomes and Endpoints; 8) Stroke Presentation; 9) Stroke Types and Subtypes. A CDE website provides uniform names and structures for each element, a data dictionary, and template case report forms (CRFs) using the CDEs. Conclusion Stroke-specific CDEs are now available as standardized, scientifically-vetted variable structures to facilitate data collection and data sharing in cerebrovascular patient-oriented research. The CDEs are an evolving resource that will be iteratively improved based on investigator use, new technologies, and emerging concepts and research findings. PMID:22308239
Vieweger, Anja; Döring, Thomas F
In agriculture and food systems, health-related research includes a vast diversity of topics. Nutritional, toxicological, pharmacological, epidemiological, behavioural, sociological, economic and political methods are used to study health in the five domains of soils, plants, livestock, humans and ecosystems. An idea developed in the early founding days of organic agriculture stated that the health of all domains is one and indivisible. Here we show that recent research reveals the existence and complex nature of such health links among domains. However, studies of health aspects in agriculture are often separated by disciplinary boundaries. This restrains the understanding of health in agricultural systems. Therefore we explore the opportunities and limitations of bringing perspectives together from the different domains. We review current approaches to define and assess health in agricultural contexts, comparing the state of the art of commonly used approaches and bringing together the presently disconnected debates in soil science, plant science, veterinary science and human medicine. Based on a qualitative literature analysis, we suggest that many health criteria fall into two paradigms: (1) the Growth Paradigm, where terms are primarily oriented towards continued growth; (2) the Boundary Paradigm, where terms focus on maintaining or coming back to a status quo, recognising system boundaries. Scientific health assessments in agricultural and food systems need to be explicit in terms of their position on the continuum between Growth Paradigm and Boundary Paradigm. Finally, we identify areas and concepts for a future direction of health assessment and research in agricultural and food systems. © 2014 Society of Chemical Industry.
Chen, Winson X; Poon, Eric K W; Hutchins, Nicholas; Thondapu, Vikas; Barlis, Peter; Ooi, Andrew
The haemodynamic behaviour of blood inside a coronary artery after stenting is greatly affected by individual stent features as well as complex geometrical properties of the artery, including tortuosity and curvature. Regions at higher risk of restenosis are marked by low wall shear stress (WSS). Computational modelling and computational fluid dynamics methodologies were used to analyse the haemodynamic characteristics in curved stented arteries using several common stent models. Results in this study showed that stent strut thickness was one major factor influencing the distribution of WSS in curved arteries. Regions of low WSS were found behind struts, particularly those oriented at a large angle relative to the streamwise flow direction. These findings were similar to those obtained in studies of straight arteries. An uneven distribution of WSS at the inner and outer bends of curved arteries was observed, where the WSS was lower at the inner bend. In this study, it was also shown that stents with a helical configuration generated an extra swirling component of the flow based on the helical direction; however, this extra swirl in the flow field did not cause significant changes in the distribution of WSS under the current setup.
Erickson, T. A.; Koziol, B. W.; Rood, R. B.
The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF or GRIB), while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e., areas of interest) and translating the gridded data on the fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.
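The core "clip gridded data to an area of interest" operation can be sketched in a few lines. This is a simplified stand-in, not OpenClimateGIS code: the real system reads netCDF via OPeNDAP and supports arbitrary vector geometries, whereas here a plain NumPy grid and a rectangular bounding box are assumed for illustration.

```python
# Minimal sketch of clipping a gridded field to a bounding box (a
# rectangular stand-in for a vector area of interest).
import numpy as np

lats = np.arange(40.0, 50.0, 1.0)        # 10 grid rows (degrees north)
lons = np.arange(-110.0, -100.0, 1.0)    # 10 grid cols (degrees east)
temp = np.random.default_rng(0).uniform(260, 300, (lats.size, lons.size))

def clip_to_bbox(data, lats, lons, south, north, west, east):
    """Return the subgrid (and its coordinates) falling inside the bbox."""
    ri = np.where((lats >= south) & (lats <= north))[0]
    ci = np.where((lons >= west) & (lons <= east))[0]
    # np.ix_ builds the row/column cross-product index for the subgrid
    return data[np.ix_(ri, ci)], lats[ri], lons[ci]

sub, sub_lats, sub_lons = clip_to_bbox(temp, lats, lons, 42, 45, -108, -104)
print(sub.shape)  # (4, 5)
```

The clipped array and its coordinate vectors are then what a service like the one described would serialize into a vector format (e.g. one polygon feature per grid cell) for GIS consumption.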
The study evaluated allocative efficiency levels of common bean farms in Eastern Uganda and the factors influencing the allocative efficiency of these farms. To achieve this objective, a sample of 480 households was randomly selected in the Busia, Mbale, Budaka and Tororo districts of Eastern Uganda. Data were collected using a personally administered structured questionnaire focused on household decision makers, and a stochastic frontier model and a two-limit Tobit regression model were employed in the analysis. It was established that the mean allocative efficiency was 29.37% and that it was significantly influenced by farm size, off-farm income, asset value and distance to the market. The study therefore suggested the need for policies to discourage land fragmentation and to promote road and market infrastructure development in rural areas. The study also revealed the need for farmers to be trained in entrepreneurial skills so that they can invest their farm profits in more income-generating activities that will harness more farming capital.
ARL-TR-8155, September 2017. US Army Research Laboratory, Computational and Information Sciences Directorate. Atmospheric Renewable Energy Research, Volume 5 (Solar Radiation Flux Model), by Clayton Walker and Gail Vaucher.
Corwin, Lisa A; Graham, Mark J; Dolan, Erin L
Course-based undergraduate research experiences (CUREs) are being championed as scalable ways of involving undergraduates in science research. Studies of CUREs have shown that participating students achieve many of the same outcomes as students who complete research internships. However, CUREs vary widely in their design and implementation, and aspects of CUREs that are necessary and sufficient to achieve desired student outcomes have not been elucidated. To guide future research aimed at understanding the causal mechanisms underlying CURE efficacy, we used a systems approach to generate pathway models representing hypotheses of how CURE outcomes are achieved. We started by reviewing studies of CUREs and research internships to generate a comprehensive set of outcomes of research experiences, determining the level of evidence supporting each outcome. We then used this body of research and drew from learning theory to hypothesize connections between what students do during CUREs and the outcomes that have the best empirical support. We offer these models as hypotheses for the CURE community to test, revise, elaborate, or refute. We also cite instruments that are ready to use in CURE assessment and note gaps for which instruments need to be developed. © 2015 L. A. Corwin et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Considering the large quantities of wastewater generated by iron and steel enterprises in China, this paper examines common methods for evaluating the integrated wastewater treatment effect of such enterprises. Based on survey results on environmental protection performance, technological economy, resource and energy consumption, services and management, an indicator system for evaluating the operational effect of integrated wastewater treatment facilities is set up. By discussing the standards and industrial policies in and outside China, 27 key secondary indicators are further defined on the basis of an investigation of the main equipment and key processes for wastewater treatment, so as to determine the method for setting key quantitative and qualitative indicators for the evaluation indicator system. The work is also expected to satisfy the basic requirements of reasonable resource allocation, environmental protection and sustainable economic development, further improve the integrated wastewater treatment effect of iron and steel enterprises, and reduce emissions of hazardous substances and environmental impacts.
The increasingly intensive growth of agricultural crops under mineral fertilization, together with environmental pollution, degrades the soil: it reduces soil fertility and increases the concentration of heavy metals. Especially dangerous is the combined, synergistic effect of heavy metals. Vermicompost optimizes pH, texture and organic material content, the soil indicators that are the major contributors to the migration of heavy metals within the soil and from it to plants. This article investigates the influence of vermicompost on the bioaccumulation of heavy metals in common meadow-grass. The experimental research determined that immobilization of heavy metals was best in a soil-vermicompost substrate prepared in a ratio of 1:2. Cadmium (Cd) concentrations were lowest, and the difference in heavy metal content between roots and shoots was greatest, in biomass grown in that mixture: in the underground part of the plant the concentration was 11.10 mg/kg and in the above-ground part 1.05 mg/kg. The situation for lead (Pb) and copper (Cu) is analogous. This is the optimal ratio for mixture preparation.
Zheng, Xiaoyu; Yamaguchi, Akira; Takata, Takashi
The traditional α-factor model has focused on the occurrence frequencies of common cause failure (CCF) events. Global α-factors in the α-factor model are defined as fractions of failure probability for particular groups of components. However, there are unknown uncertainties in CCF parameter estimation owing to the scarcity of available failure data. Joint distributions of CCF parameters are actually determined by a set of possible causes, which are characterized by their CCF-triggering abilities and occurrence frequencies. In the present paper, the process of α-decomposition (the Kelly-CCF method) is developed to learn about sources of uncertainty in CCF parameter estimation. Moreover, it aims to evaluate the CCF risk significance of different causes, expressed as decomposed α-factors. Firstly, a hybrid Bayesian network is adopted to reveal the relationship between potential causes and failures. Secondly, because all potential causes have different occurrence frequencies and abilities to trigger dependent or independent failures, a regression model is provided and proved by conditional probability. Global α-factors are expressed in terms of explanatory variables (causes' occurrence frequencies) and parameters (decomposed α-factors). Finally, an example is provided to illustrate the process of hierarchical Bayesian inference for the α-decomposition process. This study shows that the α-decomposition method can integrate failure information from the cause, component and system levels. It can parameterize the CCF risk significance of possible causes and can update the probability distributions of global α-factors. Besides, it provides a reliable way to evaluate uncertainty sources and reduce uncertainty in probabilistic risk assessment. It is recommended to build databases including CCF parameters and the corresponding occurrence frequency of each cause for each targeted system.
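For context, the classical (global) alpha-factor point estimate that the paper's α-decomposition refines can be sketched in a few lines: α_k is the fraction of failure events in a common-cause group that involve exactly k components. The event counts below are invented for illustration, not taken from the paper.

```python
# Sketch of the classical alpha-factor point estimate.
# event_counts[k-1] = number of observed events failing exactly k
# components in a common-cause group of m redundant components.
def alpha_factors(event_counts):
    """Return [alpha_1, ..., alpha_m] as fractions of all events."""
    total = sum(event_counts)
    return [n / total for n in event_counts]

# Hypothetical data for a group of 3 redundant components:
# 90 single failures, 8 double failures, 2 complete-CCF (triple) events.
alphas = alpha_factors([90, 8, 2])
print(alphas)  # [0.9, 0.08, 0.02]
```

The paper's point is that these global fractions hide heterogeneous causes; the α-decomposition regresses them on the occurrence frequencies of individual causes to obtain cause-specific (decomposed) α-factors.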
Domínguez-Hüttinger, Elisa; Christodoulides, Panayiotis; Miyauchi, Kosuke; Irvine, Alan D; Okada-Hatakeyama, Mariko; Kubo, Masato; Tanaka, Reiko J
The skin barrier acts as the first line of defense against constant exposure to biological, microbial, physical, and chemical environmental stressors. Dynamic interplay between defects in the skin barrier, dysfunctional immune responses, and environmental stressors are major factors in the development of atopic dermatitis (AD). A systems biology modeling approach can yield significant insights into these complex and dynamic processes through integration of prior biological data. We sought to develop a multiscale mathematical model of AD pathogenesis that describes the dynamic interplay between the skin barrier, environmental stress, and immune dysregulation and use it to achieve a coherent mechanistic understanding of the onset, progression, and prevention of AD. We mathematically investigated synergistic effects of known genetic and environmental risk factors on the dynamic onset and progression of the AD phenotype, from a mostly asymptomatic mild phenotype to a severe treatment-resistant form. Our model analysis identified a "double switch," with 2 concatenated bistable switches, as a key network motif that dictates AD pathogenesis: the first switch is responsible for the reversible onset of inflammation, and the second switch is triggered by long-lasting or frequent activation of the first switch, causing irreversible onset of systemic TH2 sensitization and worsening of AD symptoms. Our mathematical analysis of the bistable switch predicts that genetic risk factors decrease the threshold of environmental stressors to trigger systemic TH2 sensitization. This analysis predicts and explains 4 common clinical AD phenotypes from a mild and reversible phenotype through to severe and recalcitrant disease and provides a mechanistic explanation for clinically demonstrated preventive effects of emollient treatments against development of AD. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
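The bistable-switch motif described above can be illustrated with a generic one-variable model (this is not the authors' AD model; the equation and all parameter values are arbitrary and chosen only to exhibit bistability): dx/dt = s + beta*x^2/(K^2 + x^2) - x, where positive feedback through the Hill term makes a transient stimulus s flip x to a high state that persists after s returns to zero, the irreversible "switch" behaviour.

```python
# Generic bistable-switch illustration (not the authors' model):
# dx/dt = s + beta * x^2 / (K^2 + x^2) - x, integrated with forward Euler.
# With beta=3, K=1 the system has stable states near x=0 and x~2.62 at s=0.
def simulate(stimulus, x0=0.0, beta=3.0, K=1.0, dt=0.01):
    x = x0
    for s in stimulus:
        x += dt * (s + beta * x * x / (K * K + x * x) - x)
    return x

steps = 5000
pulse = [1.0] * steps + [0.0] * steps   # stimulus on, then off
quiet = [0.0] * (2 * steps)             # never stimulated

x_after_pulse = simulate(pulse)   # remains in the high state after the pulse
x_never = simulate(quiet)         # stays in the low state
print(x_after_pulse > 1.0, x_never < 0.1)  # True True
```

The hysteresis is the point: once the pulse has pushed x past the unstable intermediate state, removing the stimulus no longer returns the system to baseline, which is the qualitative behaviour the paper attributes to its second (irreversible) switch.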
Chan, T.; Nakka, B.W.; O'Connor, P.A.; Uphori, D.U.; Reid, J.A.K.; Scheier, N.W.; Stanchell, F.W.
This report presents details of the modelling that was done to support the development of the simplified geosphere model (GEONET), which was used in the assessment that was presented in the Environmental Impact Statement on the proposed concept for the disposal of Canada's nuclear fuel waste. Detailed modelling of groundwater flow, heat transport and contaminant transport through the geosphere was performed using the MOTIF finite-element computer code and the particle-tracking code TRACK3D. The GEONET model was developed using data from the Whiteshell Research Area, with a hypothetical disposal vault located at a depth of 500 m. This report first briefly describes the conceptual model and summarises the two-dimensional (2-D) simulations that were used initially to define an adequate 3-D representation of the system. The analysis showed that the configuration of major fracture zones could have a large influence on the groundwater flow patterns. These major fracture zones can have high velocities and large flows. The proximity of the radionuclide source to a major fracture zone may strongly influence the time for a radionuclide to be transported from the disposal vault to the surface. Groundwater flow was then simulated and advective/convective particle tracking was conducted in the selected 3-D representation of the system, to aid in selecting a suitable form for the simplified model to be used in the overall systems assessment with the SYVAC3-CC3 computer code. Sensitivity analyses were performed on the effects of (a) different natural geometries of part of the model domain, (b) different hydraulic properties, (c) construction, operation and closure of the vault, (d) the presence of a water supply well and (e) the presence of an open borehole. These analyses indicated that the shape of the topography and the presence of a major low-dipping fracture zone focuses groundwater passing through the vault into a discharge area that is much smaller than the area of the
Straub, Shannon C K; Fishbein, Mark; Livshultz, Tatyana; Foster, Zachary; Parks, Matthew; Weitemier, Kevin; Cronn, Richard C; Liston, Aaron
Milkweeds (Asclepias L.) have been extensively investigated in diverse areas of evolutionary biology and ecology; however, there are few genetic resources available to facilitate and complement these studies. This study explored how low coverage genome sequencing of the common milkweed (Asclepias syriaca L.) could be useful in characterizing the genome of a plant without prior genomic information and for development of genomic resources as a step toward further developing A. syriaca as a model in ecology and evolution. A 0.5× genome of A. syriaca was produced using Illumina sequencing. A virtually complete chloroplast genome of 158,598 bp was assembled, revealing few repeats and loss of three genes: accD, clpP, and ycf1. A nearly complete rDNA cistron (18S-5.8S-26S; 7,541 bp) and 5S rDNA (120 bp) sequence were obtained. Assessment of polymorphism revealed that the rDNA cistron and 5S rDNA had 0.3% and 26.7% polymorphic sites, respectively. A partial mitochondrial genome sequence (130,764 bp), with identical gene content to tobacco, was also assembled. An initial characterization of repeat content indicated that Ty1/copia-like retroelements are the most common repeat type in the milkweed genome. At least one A. syriaca microread hit 88% of Catharanthus roseus (Apocynaceae) unigenes (median coverage of 0.29×) and 66% of single copy orthologs (COSII) in asterids (median coverage of 0.14×). From this partial characterization of the A. syriaca genome, markers for population genetics (microsatellites) and phylogenetics (low-copy nuclear genes) studies were developed. The results highlight the promise of next generation sequencing for development of genomic resources for any organism. Low coverage genome sequencing allows characterization of the high copy fraction of the genome and exploration of the low copy fraction of the genome, which facilitate the development of molecular tools for further study of a target species and its relatives. This study represents a first
Phuc Huu Nguyen
Abstract. This paper suggests a theoretical framework for analyzing the mechanism behind the behavior of academic researchers, whose interests are tangled and vary widely across academic factors (the intrinsic satisfaction of conducting research, the improvement of individual research ability, etc.) and non-academic factors (career rewards, financial rewards, etc.). Furthermore, each researcher also has different academic stances in their preferences regarding academic freedom and academic entrepreneurship. Understanding the behavior of academic researchers will contribute to nurturing young researchers, improving the standard of research and education, and boosting academia-industry collaboration. In particular, as open innovation increasingly needs the involvement of university researchers, to establish a successful approach to enticing researchers into enterprises' research, companies must comprehend the behavior of university researchers, who have multiple complex motivations. The paper explores academic researchers' behavior by optimizing their utility functions, i.e. the satisfaction obtained from their research outputs. The paper characterizes these outputs as the results of researchers' 3C: Competence (the ability to implement the research), Commitment (the effort to do the research), and Contribution (finding meaning in the research). Most previous research utilized empirical methods to study researcher motivation. Without adopting economic theory into the analysis, the past literature could not offer a deeper understanding of researcher behavior. Our contribution is important both conceptually and practically because it provides the first theoretical framework to study the mechanism of researcher behavior. Keywords: academia-industry, researcher behavior, Ulrich model's 3C.
Posamentier, Alfred S (Steven); Jaye, Daniel I
The math teacher's go-to resource, now updated for the Common Core! What works in math and why has never been the issue; the research is all out there. Where teachers struggle is the "how." That's the big service What Successful Math Teachers Do provides. It's a powerful portal to what the best research looks like in practice, strategy by strategy, now aligned to both the Common Core and the NCTM Standards. For each of the book's 80 strategies, the authors present a brief description, a summary of supporting research, the corresponding NCTM and Common Core Standards, classroom applications, possible pitfalls, and recommended reading and research.
Harte-Hargrove, Lauren C; French, Jacqueline A; Pitkänen, Asla; Galanopoulou, Aristea S; Whittemore, Vicky; Scharfman, Helen E
The major objective of preclinical translational epilepsy research is to advance laboratory findings toward clinical application by testing potential treatments in animal models of seizures and epilepsy. Recently there has been a focus on the failure of preclinical discoveries to translate reliably, or even to be reproduced in different laboratories. One potential cause is a lack of standardization in preclinical data collection. The resulting difficulties in comparing data across studies have led to high cost and missed opportunity, which in turn impede clinical trials and advances in medical care. Preclinical epilepsy research has successfully brought numerous antiseizure treatments into clinical practice, yet the unmet clinical needs have prompted the reconsideration of research strategies to optimize epilepsy therapy development. In the field of clinical epilepsy there have been successful steps to address such problems, such as the generation of common data elements (CDEs), case report forms (CRFs), and standards of data collection and reporting by a team of leaders in the field. Therefore, the Translational Task Force was appointed by the International League Against Epilepsy (ILAE) and the American Epilepsy Society (AES), in partnership with the National Institute of Neurological Disorders and Stroke (NINDS) and the National Institutes of Health (NIH), to define CDEs for animal epilepsy research studies and prepare guidelines for data collection and experimental procedures. If adopted, the preclinical CDEs could facilitate collaborative epilepsy research, comparisons of data across different laboratories, and promote rigor, transparency, and impact, particularly in therapy development. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Tediosi, A.; Bulgheroni, C.; Sali, G.; Facchi, A.; Gandolfi, C.
A few years after the delivery of the EU Water Framework Directive (WFD), the need to link agriculture and the WFD has emerged as one of the highest priorities; it is therefore important to discuss how the EU Common Agricultural Policy (CAP) can contribute to the achievement of the WFD objectives. The recent CAP reform, known as the Mid Term Review (MTR) or Fischler Reform, has increased the opportunities, offering farmers increased support to address some environmental issues. The central novelty of the MTR is the introduction of a farm single payment, which aims at the decoupling of EU agricultural support from production. Other important MTR topics deal with the modulation of payments, cross-compliance and the strengthening of the Rural Development policy. All these new elements will affect farmers' behaviour, steering their productive choices for the future, which, in turn, will have consequences for the water demand for irrigation. Indeed, from the water quantity viewpoint, agriculture is a large consumer, and improving water use efficiency is one of the main issues at stake, following the increasing impacts of water scarcity and droughts across Europe in a context of climate change. According to a recent survey by the European Commission, the saving potential in the agricultural sector is 43% of present abstraction, and 95% of it is concentrated in southern Europe. Many models have been developed to forecast farmers' behaviour as a consequence of agricultural policies, at both the sector and regional level; all of them are founded on mathematical programming techniques and many of them use the positive approach, which better fits the territorial dimension. A large body of literature also exists focusing on the assessment of irrigation water requirements. Examples of conjunctive modelling of the two aspects are, however, much more limited. The work presented has some innovative aspects: not only does it couple an economic model
Cheng, Ching-Yu; Li, Qing
Postpartum mothers experience certain physical health conditions that may affect their quality of life, future health, and health of their children. Yet, the physical health of postpartum mothers is relatively neglected in both research and practice. The purpose of this review is to describe the general health status and prevalence of common physical health conditions of postpartum mothers. The review followed standard procedures for integrative literature reviews. Twenty-two articles were reviewed from searches in scientific databases, reference lists, and an up-to-date survey. Three tables were designed to answer review questions. In general, postpartum mothers self-rate their health as good. They experience certain physical conditions such as fatigue/physical exhaustion, sleep-related problems, pain, sex-related concerns, hemorrhoids/constipation, and breast problems. Despite a limited number of studies, the findings provide a glimpse of the presence of a number of physical health conditions experienced by women in the 2 years postpartum. In the articles reviewed, physical health conditions and postpartum period were poorly defined, no standard scales existed, and the administration of surveys varied widely in time. Those disparities prevented systematic comparisons of results and made it difficult to gain a coherent understanding of the physical health conditions of postpartum mothers. More longitudinal research is needed that focuses on the etiology, predictors, and management of the health conditions most prevalent among postpartum mothers. Instruments are needed that target a broader range of physical conditions in respect to type and severity.
Milk had the second biggest share of production value in the Croatian agricultural sector in 2013 (CBS, 2014). It could be speculated that, after the abolition of quotas in the European Union, the declining trend in domestic production will continue and that exposure to the free European market will significantly affect the competitiveness of domestic production. The aim of this paper is to analyse the prospects of the Croatian dairy industry (sector) under certain conditions of the EU Common Agricultural Policy (CAP) and to present projections simulated with the help of the partial equilibrium model AGMEMOD. The main model inputs are policy and macroeconomic variables, supply-use balances of agro-food products and producer prices. The baseline projections show that in 2025, in line with CAP implementation, there might be a decrease in the number of dairy cows by 33%, in the raw milk price by 14% and in the amount of collected cow's milk by 13% compared with the five-year average of 2008-2012. A positive effect was noted in productivity, which according to the simulation increases by 25%, and may consequently lead to an increase in deliveries to dairies of about 17%. Preliminary results therefore show that, accounting for milk processing, the dairy sector in Croatia might reach a favourable situation by 2025. Taking into account the EU market situation, there is an opportunity to increase milk processing given the current level of prices in the EU and global markets, and taking into account the abolition of milk quotas. The results also suggest, according to the experience of other states, that the utilization of funds from the 1st and 2nd pillars of the CAP (utilization of measures across projects to improve the production structure and efficiency) will play an important role.
Vilà, M.; Fernández, M.; Jiménez-Munt, I.
Determining the temperature distribution within the lithosphere requires the knowledge of the radiogenic heat production (RHP) distribution within the crust and the lithospheric mantle. RHP of crustal rocks varies considerably at different scales as a result of the petrogenetic processes responsible for their formation and therefore RHP depends on the considered lithologies. In this work we address RHP variability of some common lithological groups from a compilation of a total of 2188 representative U, Th and K concentrations of different worldwide rock types derived from 102 published studies. To optimize the use of the generated RHP database we have classified and renamed the rock-type denominations of the original works following a petrologic classification scheme with a hierarchical structure. The RHP data of each lithological group is presented in cumulative distribution plots, and we report a table with the mean, the standard deviation, the minimum and maximum values, and the significant percentiles of these lithological groups. We discuss the reported RHP distribution for the different igneous, sedimentary and metamorphic lithological groups from a petrogenetic viewpoint and give some useful guidelines to assign RHP values to lithospheric thermal modeling.
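The abstract does not reproduce the underlying conversion, but the standard way to turn U, Th and K concentrations into radiogenic heat production is Rybach's (1988) relation. A minimal sketch, with made-up sample compositions (the function name, densities and concentrations are illustrative, not from the paper):

```python
import numpy as np

def rhp_from_concentrations(rho, c_u, c_th, c_k):
    """Radiogenic heat production in microW/m^3 via Rybach's (1988) relation.

    rho  : rock density in kg/m^3
    c_u  : U concentration in ppm
    c_th : Th concentration in ppm
    c_k  : K concentration in wt%
    """
    return 1e-5 * rho * (9.52 * c_u + 2.56 * c_th + 3.48 * c_k)

# Hypothetical samples of a lithological group: (density, U, Th, K)
samples = np.array([
    [2700.0, 2.5, 10.0, 2.5],   # granitoid-like composition
    [2850.0, 0.9,  2.7, 1.0],   # andesite-like composition
    [2950.0, 0.1,  0.4, 0.2],   # basalt-like composition
])
rhp = rhp_from_concentrations(*samples.T)

# Summary statistics of the kind tabulated in the study
print("mean =", rhp.mean().round(3),
      "sd =", rhp.std(ddof=1).round(3),
      "p50 =", np.percentile(rhp, 50).round(3))
```

Felsic rocks come out roughly an order of magnitude more productive than mafic ones, which is the kind of lithology-dependent spread the cumulative distribution plots summarize.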
Scanlan, Aaron T; Fox, Jordan L; Borges, Nattai R; Dascombe, Ben J; Dalbo, Vincent J
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity. Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0-10, 0-20, 0-30, and 0-40 min). sRPE TL increased (P basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66-.69; P training doses (r = .84-.89; P basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
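The abstract does not spell out the training-load formulas, but two of the internal-load models it names are commonly computed as below. This is a sketch using the standard Banister TRIMP and Edwards summated-heart-rate-zones formulations, with hypothetical heart-rate values (the numbers are not from the study):

```python
import math

def banister_trimp(duration_min, hr_ex, hr_rest, hr_max, male=True):
    """Banister training impulse for a steady-state bout (standard form)."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)   # fractional HR reserve
    b = 1.92 if male else 1.67                     # sex-specific weighting
    return duration_min * dhr * 0.64 * math.exp(b * dhr)

def edwards_shrz(minutes_in_zone):
    """Summated HR zones: time in five %HRmax zones weighted 1..5."""
    return sum(w * t for w, t in enumerate(minutes_in_zone, start=1))

# Hypothetical 10-min bout: mean exercise HR 165, rest 60, max 200
print(round(banister_trimp(10, 165, 60, 200), 1))
# Hypothetical minutes in zones 50-60% ... 90-100% of HRmax
print(edwards_shrz([1, 2, 3, 3, 1]))   # -> 1*1 + 2*2 + 3*3 + 4*3 + 5*1 = 31
```

Both reduce a heart-rate trace to a single arbitrary-unit dose, which is what makes dose-dependent comparisons against sRPE and external-load measures possible.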
Agarwala, Matthew; Lovett, Andrew; Bateman, Ian; Day, Brett; Agnolucci, Paolo; Ziv, Guy
The UK Government is formally committed to reducing carbon emissions and protecting and improving natural capital and the environment. However, actually delivering on these objectives requires an integrated approach to addressing two parallel challenges: de-carbonising future energy system pathways; and safeguarding natural capital to ensure the continued flow of ecosystem services. Although both emphasise benefiting from natural resources, efforts to connect natural capital and energy systems research have been limited, meaning opportunities to improve management of natural resources and meet society's energy needs could be missed. The ecosystem services paradigm provides a consistent conceptual framework that applies in multiple disciplines across the natural and economic sciences, and facilitates collaboration between them. At the forefront of the field, integrated ecosystem service-economy models have guided public- and private-sector decision making at all levels. Models vary in sophistication from simple spreadsheet tools to complex software packages integrating biophysical, GIS and economic models, and draw upon many fields, including ecology, hydrology, geography, systems theory, economics and the social sciences. They also differ in their ability to value changes in natural capital and ecosystem services at various spatial and temporal scales. Despite these differences, current models share a common feature: their treatment of energy systems is superficial at best. In contrast, energy systems research has no widely adopted, unifying conceptual framework that organises thinking about key system components and interactions. Instead, the literature is organised around modelling approaches, including life cycle analyses, econometric investigations, linear programming and computable general equilibrium models. However, some consistencies do emerge. First, these approaches often contain a linear set of steps, from exploration to resource supply, fuel processing, conversion
B.J. Hipple Walters (Bethany); S.A. Adams (Samantha); A.P. Nieboer (Anna); R.A. Bal (Roland)
Background: Disease management programs, especially those based on the Chronic Care Model (CCM), are increasingly common in the Netherlands. While disease management programs have been well-researched quantitatively and economically, less qualitative research has been done. The overall aim
Solomon, Jesse A.; Tarnopolsky, Mark A.; Hamadeh, Mazen J.
There is no consensus among research laboratories around the world on the criteria that define endpoint in studies involving rodent models of amyotrophic lateral sclerosis (ALS). Data from 4 nutrition intervention studies using 162 G93A mice, a model of ALS, were analyzed to determine if differences exist between the following endpoint criteria: CS 4 (functional paralysis of both hindlimbs), CS 4+ (CS 4 in addition to the earliest age of body weight loss, body condition deterioration or righting reflex), and CS 5 (CS 4 plus righting reflex >20 s). The age (d; mean ± SD) at which mice reached endpoint was recorded as the unit of measurement. Mice reached CS 4 at 123.9±10.3 d, CS 4+ at 126.6±9.8 d and CS 5 at 127.6±9.8 d, all significantly different from each other (P<0.001). There was a significant positive correlation between CS 4 and CS 5 (r = 0.95, P<0.001), CS 4 and CS 4+ (r = 0.96, P<0.001), and CS 4+ and CS 5 (r = 0.98, P<0.001), with the Bland-Altman plot showing an acceptable bias between all endpoints. Logrank tests showed that mice reached CS 4 24% and 34% faster than CS 4+ (P = 0.046) and CS 5 (P = 0.006), respectively. Adopting CS 4 as endpoint would spare a mouse an average of 4 days (P<0.001) from further neuromuscular disability and poor quality of life compared to CS 5. Alternatively, CS 5 provides information regarding proprioception and severe motor neuron death, both could be important parameters in establishing the efficacy of specific treatments. Converging ethics and discovery, would adopting CS 4 as endpoint compromise the acquisition of insight about the effects of interventions in animal models of ALS? PMID:21687686
Mohanty, Sambit K; Mistry, Amita T; Amin, Waqas; Parwani, Anil V; Pople, Andrew K; Schmandt, Linda; Winters, Sharon B; Milliken, Erin; Kim, Paula; Whelan, Nancy B; Farhat, Ghada; Melamed, Jonathan; Taioli, Emanuela; Dhir, Rajiv; Pass, Harvey I; Becich, Michael J
Recent advances in genomics, proteomics, and the increasing demands for biomarker validation studies have catalyzed changes in the landscape of cancer research, fueling the development of tissue banks for translational research. A result of this transformation is the need for sufficient quantities of clinically annotated and well-characterized biospecimens to support the growing needs of the cancer research community. Clinical annotation allows samples to be better matched to the research question at hand and ensures that experimental results are better understood and can be verified. To facilitate and standardize such annotation in bio-repositories, we have combined three accepted and complementary sets of data standards: the College of American Pathologists (CAP) Cancer Checklists, the protocols recommended by the Association of Directors of Anatomic and Surgical Pathology (ADASP) for pathology data, and the North American Association of Central Cancer Registry (NAACCR) elements for epidemiology, therapy and follow-up data. Combining these approaches creates a set of International Standards Organization (ISO) - compliant Common Data Elements (CDEs) for the mesothelioma tissue banking initiative supported by the National Institute for Occupational Safety and Health (NIOSH) of the Center for Disease Control and Prevention (CDC). The purpose of the project is to develop a core set of data elements for annotating mesothelioma specimens, following standards established by the CAP checklist, ADASP cancer protocols, and the NAACCR elements. We have associated these elements with modeling architecture to enhance both syntactic and semantic interoperability. The system has a Java-based multi-tiered architecture based on Unified Modeling Language (UML). Common Data Elements were developed using controlled vocabulary, ontology and semantic modeling methodology. The CDEs for each case are of different types: demographic, epidemiologic data, clinical history, pathology data
Sivo, Stephen; Fan, Xitao
Autocorrelated residuals are widely reported as common in longitudinal data. Yet few, if any, researchers modeling growth processes evaluate a priori whether their data have this feature. Sivo, Fan, and Witta (2005) found that failing to model autocorrelated residuals present in longitudinal data severely biases latent curve…
Joshua F Goldberg
Many large carnivores occupy a wide geographic distribution and face threats from habitat loss and fragmentation, poaching, prey depletion, and human-wildlife conflict. Conservation requires robust techniques for estimating population densities and trends, but the elusive nature and low densities of many large carnivores make them difficult to detect. Spatial capture-recapture (SCR) models provide a means for handling imperfect detectability, while linking population estimates to individual movement patterns to provide more accurate estimates than standard approaches. Within this framework, we investigate the effect of different sample interval lengths on density estimates, using simulations and a common leopard (Panthera pardus) model system. We apply Bayesian SCR methods to 89 simulated datasets and camera-trapping data from 22 leopards captured 82 times during winter 2010-2011 in Royal Manas National Park, Bhutan. We show that sample interval length from daily, weekly, monthly or quarterly periods did not appreciably affect median abundance or density, but did influence precision. We observed the largest gains in precision when moving from quarterly to shorter intervals. We therefore recommend daily sampling intervals for monitoring rare or elusive species where practicable, but note that monthly or quarterly sample periods can have similar informative value. We further develop a novel application of Bayes factors to select models where multiple ecological factors are integrated into density estimation. Our simulations demonstrate that these methods can help identify the "true" explanatory mechanisms underlying the data. Using this method, we found strong evidence for sex-specific movement distributions in leopards, suggesting that sexual patterns of space-use influence density. This model estimated a density of 10.0 leopards/100 km2 (95% credibility interval: 6.25-15.93), comparable to contemporary estimates in Asia. These SCR methods provide
This paper presents an analysis of the different types of reasoning and physical explanation used in science, common thought, and physics teaching. It then reflects on the learning difficulties connected with these various approaches, and suggests some possible didactic strategies. Although causal reasoning occurs very frequently in common thought…
Sibonga, Jean D.
This slide presentation reviews the use of animal models to research and inform bone morphology, in particular relating to human research on bone loss resulting from low-gravity environments. Reasons for using animal models as tools in human research programs include time efficiency, cost-effectiveness, the possibility of invasive measures, and predictability, as some models are predictive of drug effects.
Schneider, Nadine; Fechner, Nikolas; Landrum, Gregory A; Stiefl, Nikolaus
Big data is one of the key transformative factors which increasingly influences all aspects of modern life. Although this transformation brings vast opportunities it also generates novel challenges, not the least of which is organizing and searching this data deluge. The field of medicinal chemistry is not different: more and more data are being generated, for instance, by technologies such as DNA encoded libraries, peptide libraries, text mining of large literature corpora, and new in silico enumeration methods. Handling those huge sets of molecules effectively is quite challenging and requires compromises that often come at the expense of the interpretability of the results. In order to find an intuitive and meaningful approach to organizing large molecular data sets, we adopted a probabilistic framework called "topic modeling" from the text-mining field. Here we present the first chemistry-related implementation of this method, which allows large molecule sets to be assigned to "chemical topics" and the relationships between them to be investigated. In this first study, we thoroughly evaluate this novel method in different experiments and discuss both its disadvantages and advantages. We show very promising results in reproducing human-assigned concepts using the approach to identify and retrieve chemical series from sets of molecules. We have also created an intuitive visualization of the chemical topics output by the algorithm. This is a huge benefit compared to other unsupervised machine-learning methods, like clustering, which are commonly used to group sets of molecules. Finally, we applied the new method to the 1.6 million molecules of the ChEMBL22 data set to test its robustness and efficiency. In about 1 h we built a 100-topic model of this large data set in which we could identify interesting topics like "proteins", "DNA", or "steroids". Along with this publication we provide our data sets and an open-source implementation of the new method (CheTo) which
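The published implementation (CheTo) builds on RDKit fragment fingerprints; as a rough illustration of the underlying idea only, treating molecules as "documents" whose "words" are substructure fragment counts, one can run latent Dirichlet allocation on a toy count matrix. The data below are synthetic, not from ChEMBL, and the two "series" are deliberately disjoint so the topics are easy to recover:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
# 20 hypothetical molecules x 12 fragment types, two disjoint chemical series
series_a = rng.poisson(lam=5, size=(10, 6)) + 1
series_b = rng.poisson(lam=5, size=(10, 6)) + 1
counts = np.block([
    [series_a, np.zeros((10, 6), dtype=int)],
    [np.zeros((10, 6), dtype=int), series_b],
])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)      # molecule -> topic weight matrix
dominant = theta.argmax(axis=1)        # dominant "chemical topic" per molecule
print(dominant)
```

Molecules sharing a fragment vocabulary end up loading on the same dominant topic, which is the mechanism behind retrieving chemical series from unlabeled sets.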
Kionna Oliveira Bernardes Santos
Background. The Self-Reporting Questionnaire (SRQ-20) is widely used for evaluating common mental disorders. However, few studies have evaluated the measurement performance of the SRQ-20 in occupational groups. This study aimed to describe patterns of common mental disorder symptoms among worker populations by using latent class analysis. Methods. Data derived from 9,959 Brazilian workers, obtained from four cross-sectional studies that used similar methodology, covering groups of informal workers, teachers, healthcare workers, and urban workers. Common mental disorders were measured using the SRQ-20. Latent class analysis was performed on each database separately. Results. Three classes of symptoms were confirmed in the occupational categories investigated. In all studies, class I best met the criteria for suspicion of common mental disorders. Class II comprised workers with intermediate probabilities of endorsing the anxiety, sadness, and energy-decrease items that characterize common mental disorders. Class III was composed of subgroups of workers with a low probability of responding positively to the screening questions for common mental disorders. Conclusions. Three patterns of common mental disorder symptoms were identified in the occupational groups investigated, ranging from distinctive features to low probabilities of occurrence. The SRQ-20 measurements showed stability in capturing nonpsychotic symptoms.
Parnis, J. Mark; Thompson, Matthew G. K.
An introductory undergraduate physical organic chemistry exercise that introduces the harmonic oscillator's use in vibrational spectroscopy is developed. The analysis and modeling exercise begins with the students calculating the stretching modes of common organic molecules with the help of the quantum mechanical harmonic oscillator (QMHO) model.
Cipolla, Laura; Ferrari, Lia A.
A hands-on approach to introduce the chemical elements and the atomic structure to elementary/middle school students is described. The proposed classroom activity presents Bohr models of atoms using common and inexpensive materials, such as nested plastic balls, colored modeling clay, and small-sized pasta (or small plastic beads).
Mayet, Aurélie; Legleye, Stéphane; Beck, François; Falissard, Bruno; Chau, Nearkasen
The aim of this study was to describe the transitions between tobacco (T), cannabis (C) and other illicit drugs (OIDs) initiations, to simultaneously explore several substance use theories: gateway theory (GT), common liability model (CLM) and route of administration model (RAM). Data from 2 French nationwide surveys conducted in 2005 and 2010 were used (16,421 subjects aged 18-34). Using reported ages at initiations, we reconstituted a retrospective cohort describing all initiation sequences between T, C and OID. Transition probabilities between the substances were computed using a Markov multi-state model that also tested the effect of 2 latent variables (item response theory scores reflecting propensity for early onset and further substance use) on all transitions. T initiation was associated with increased likelihood of subsequent C initiation, but the reverse relationship was also observed. While the most likely initiation sequence among subjects who initiated the 3 groups of substances was the 'gateway' sequence T → C → OID, this pattern was not associated with substance use propensity more than alternative sequences. Early use propensity was associated with the 'gateway' sequence but also with some alternative ones beginning with T, C or OID. If the gateway sequence appears as the most likely pattern, in line with GT, the effects of early onset and substance use propensities were also observed for some alternative sequences, which is more in line with CLM. RAM could explain reciprocal interactions observed between T and C. This suggests shared influences of individual (personality traits) and environmental (substance availability, peer influence) characteristics. © 2015 S. Karger AG, Basel.
Li Jiaorui; Xu Wei; Xie Wenxian; Ren Zhengzheng
In consideration of the many uncertain factors in economic systems, a nonlinear stochastic dynamical price model subjected to Gaussian white noise excitation is proposed on the basis of a deterministic model. A one-dimensional averaged Itô stochastic differential equation for the model is derived using the stochastic averaging method and applied to investigate the stability of the trivial solution and the first-passage failure of the stochastic price model. The stochastic price model and the methods presented in this paper are verified by numerical studies.
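The abstract does not state the model's equations, so as a purely illustrative sketch, the simulation side of such a study can be shown with an Euler-Maruyama scheme for a hypothetical mean-reverting price SDE with multiplicative Gaussian white noise, dP = k(P_eq - P) dt + sigma P dW (all parameter values are made up):

```python
import numpy as np

def euler_maruyama(p0, k, p_eq, sigma, dt, n_steps, rng):
    """Simulate dP = k*(p_eq - P) dt + sigma*P dW with the Euler-Maruyama scheme."""
    p = np.empty(n_steps + 1)
    p[0] = p0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))   # Wiener increment over one step
        p[i + 1] = p[i] + k * (p_eq - p[i]) * dt + sigma * p[i] * dw
    return p

rng = np.random.default_rng(42)
path = euler_maruyama(p0=120.0, k=0.8, p_eq=100.0, sigma=0.15,
                      dt=0.01, n_steps=5000, rng=rng)
print(round(path[-500:].mean(), 1))   # should hover near the equilibrium price
```

Monte Carlo runs of this kind are the usual way to corroborate analytic stability and first-passage results obtained from the averaged Itô equation.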
Steven A. Cohen
and similarities among five commonly used measures of rurality in the United States. There are important, quantifiable distinctions in defining what it means to be a rural county depending on both the geographic region and the measurement used. These findings highlight the importance of developing and selecting an appropriate rurality metric in health research.
Luo, Lin; Sun, Xianzhong
Rhythmic gymnastics training guidance model, taking into consideration the features of artistic gymnastics training, is put forward to help gymnasts identify their deficiencies and unskilled technical movements and improve their training effects. The model is built on the foundation of both the physical quality indicator model and the artistic gymnastics training indicator model. The physical quality indicator model, composed of a bodily factor, a flexibility-strength factor and a speed-dexterity factor, delivers an objective evaluation with reference to basic sport testing data. The training indicator model, based on the physical fitness indicators, helps analyze the technical movements, through which the impact of each bodily factor on technical movements is revealed. The AG training guidance model, in further combination with actual training data and in comparison with the data shown in the training indicator model, helps identify problems in training and thus improve the training effect. These three models, when used in combination and compared with historical model data, can check and verify the improvement in training effect over a certain period of time.
Tanya A Enderli, Stephanie R Burtch, Jara N Templet, Alessandra Carriero; Department of Biomedical Engineering, Florida Institute of Technology, Melbourne, FL, USA. Abstract: Osteogenesis imperfecta (OI), commonly known as brittle bone disease, is a genetic disease characterized by extreme bone fragility and consequent skeletal deformities. This connective tissue disorder is caused by mutations affecting the quality and quantity of the collagen, which in turn affect the overall mechanical integrity of the bone, increasing its vulnerability to fracture. Animal models of the disease have played a critical role in the understanding of the pathology and causes of OI and in the investigation of a broad range of clinical therapies for the disease. Currently, at least 20 animal models have been officially recognized to represent the phenotype and biochemistry of the 17 different types of OI in humans. These include mice, dogs, and fish. Here, we describe each of the animal models and the type of OI they represent, and present their application in clinical research for treatments of OI, such as drug therapies (ie, bisphosphonates and sclerostin) and mechanical therapies (ie, vibrational loading). In the future, different dosages and lengths of treatment need to be further investigated in different animal models of OI using potentially promising treatments, such as cellular and chaperone therapies. A combination of therapies may also offer a viable treatment regime to improve bone quality and reduce fragility in animals before being introduced into clinical trials for OI patients. Keywords: OI, brittle bone, clinical research, mouse, dog, zebrafish
Vonderhaar, T. H.
The NIMBUS 6 data were analyzed to form an up-to-date climatology of the Earth radiation budget as a basis for numerical model definition studies. Global maps depicting infrared emitted flux, net flux and albedo from processed NIMBUS 6 data for July 1977 are presented. Zonal averages of net radiation flux for April, May, and June and zonal mean emitted flux and net flux for the December-January period are also presented. The development of two models is reported. The first is a statistical-dynamical model with vertical and horizontal resolution. The second is a two-level global linear balance model. The results of time integration of the model up to 120 days, simulating the January circulation, are discussed. Average zonal wind, meridional wind component, vertical velocity, and moisture budget are among the parameters addressed.
Böhnke, Jan R; Croudace, Tim J
The assessment of 'general health and well-being' in public mental health research stimulates debates around relative merits of questionnaire instruments and their items. Little evidence regarding alignment or differential advantages of instruments or items has appeared to date. Population-based psychometric study of items employed in public mental health narratives. Multidimensional item response theory was applied to General Health Questionnaire (GHQ-12), Warwick-Edinburgh Mental Well-being Scale (WEMWBS) and EQ-5D items (Health Survey for England, 2010-2012; n = 19 290). A bifactor model provided the best account of the data and showed that the GHQ-12 and WEMWBS items assess mainly the same construct. Only one item of the EQ-5D showed relevant overlap with this dimension (anxiety/depression). Findings were corroborated by comparisons with alternative models and cross-validation analyses. The consequences of this lack of differentiation (GHQ-12 v. WEMWBS) for mental health and well-being narratives deserves discussion to enrich debates on priorities in public mental health and its assessment. © The Royal College of Psychiatrists 2015.
Peterson, M.F.; Arregle, J-L.; Martin, Xavier
Multiple-level (or mixed linear) modeling (MLM) can simultaneously test hypotheses at several levels of analysis (usually two or three), or control for confounding effects at one level while testing hypotheses at others. Advances in multi-level modeling allow increased precision in quantitative
Choi, Youngna; Spero, Steven
In this article, we study financing in the real estate market and show how various types of mortgages can be modeled and analyzed. With only an introductory level of interest theory, finance, and calculus, we model and analyze three types of popular mortgages with real life examples that explain the background and inevitable outcome of the current…
Progress achieved during this report period is presented on the following topics: Development and application of gyrokinetic particle codes to tokamak transport, development of techniques to take advantage of parallel computers; model dynamo and bootstrap current drive; and in general maintain our broad-based program in basic plasma physics and computer modeling.
Feeney, Daniel F; Meyer, François G; Noone, Nicholas; Enoka, Roger M
Motor neurons appear to be activated with a common input signal that modulates the discharge activity of all neurons in the motor nucleus. It has proven difficult for neurophysiologists to quantify the variability in a common input signal, but characterization of such a signal may improve our understanding of how the activation signal varies across motor tasks. Contemporary methods of quantifying the common input to motor neurons rely on compiling discrete action potentials into continuous time series, assuming the motor pool acts as a linear filter, and requiring signals to be of sufficient duration for frequency analysis. We introduce a state-space model in which the discharge activity of motor neurons is modeled as inhomogeneous Poisson processes and propose a method to quantify an abstract latent trajectory that represents the common input received by motor neurons. The approach also approximates the variation in synaptic noise in the common input signal. The model is validated with four data sets: a simulation of 120 motor units, a pair of integrate-and-fire neurons with a Renshaw cell providing inhibitory feedback, the discharge activity of 10 integrate-and-fire neurons, and the discharge times of concurrently active motor units during an isometric voluntary contraction. The simulations revealed that a latent state-space model is able to quantify the trajectory and variability of the common input signal across all four conditions. When compared with the cumulative spike train method of characterizing common input, the state-space approach was more sensitive to the details of the common input current and was less influenced by the duration of the signal. The state-space approach appears to be capable of detecting rather modest changes in common input signals across conditions. NEW & NOTEWORTHY We propose a state-space model that explicitly delineates a common input signal sent to motor neurons and the physiological noise inherent in synaptic signal
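The generative idea behind the model, many motor neurons firing as inhomogeneous Poisson processes driven by one shared latent input plus synaptic noise, can be sketched as follows. All parameter values, gains and the sinusoidal drive are hypothetical, not from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.001                                   # 1 ms bins
t = np.arange(0.0, 10.0, dt)                 # 10 s of simulated activity

# Shared latent drive (a.u.) plus independent synaptic noise per neuron
common_input = 20.0 + 15.0 * np.sin(2 * np.pi * 0.5 * t)
noise_sd = 1.0

n_neurons = 5
gains = np.linspace(0.8, 1.2, n_neurons)     # neuron-specific input gains
rates = np.clip(gains[:, None] * common_input
                + rng.normal(0.0, noise_sd, (n_neurons, t.size)), 0.0, None)

# Bernoulli approximation of an inhomogeneous Poisson process per 1 ms bin
spikes = rng.random((n_neurons, t.size)) < rates * dt

# The shared drive induces positive correlation between binned spike counts
counts = spikes.reshape(n_neurons, -1, 100).sum(axis=2)   # 100 ms bins
print(np.corrcoef(counts)[0, 1].round(2))
```

The inference problem the paper addresses is the reverse direction: recovering the latent trajectory (here, `common_input`) and its noise from the observed discharge times alone.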
Seel, Norbert M.
This article provides a review of theoretical approaches to model-based learning and related research. In accordance with the definition of model-based learning as an acquisition and utilization of mental models by learners, the first section centers on mental model theory. In accordance with epistemology of modeling the issues of semantics,…
could be united in a common environment based on distributed technologies and virtual reality so that shared research could be realized. For this reason, the main goal of the project team is to develop a Virtual Research Laboratory (VRL) as a collection of relatively independent distributed virtual mediums, search machine ...
Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox
Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...
Sarli, Cathy C; Dubinsky, Ellen K; Holmes, Kristi L
Is there a means of assessing research impact beyond citation analysis? The case study took place at the Washington University School of Medicine Becker Medical Library. This case study analyzed the research study process to identify indicators beyond citation count that demonstrate research impact. The authors discovered a number of indicators that can be documented for assessment of research impact, as well as resources to locate evidence of impact. As a result of the project, the authors developed a model for assessment of research impact, the Becker Medical Library Model for Assessment of Research. Assessment of research impact using traditional citation analysis alone is not a sufficient tool for assessing the impact of research findings, and it is not predictive of subsequent clinical applications resulting in meaningful health outcomes. The Becker Model can be used by both researchers and librarians to document research impact to supplement citation analysis.
As Open Educational Resources (OER) increasingly receive attention from academics, educational foundations, and government agencies, exemplars will emerge that lower student textbook costs by moving away from commercial publishers through self-publishing or curating web-based resources. Joe Moxley's "Writing Commons" serves as a scaled…
Orlowska, Elzbieta; Laszcyca, Katarzyna Malgorzata; Urbanski, Dorian Fabian
show that the iron distribution in L. filicaulis seeds is similar to that in common beans, while the seeds of L. japonicus show a different pattern of iron accumulation. RILs from a cross between these two species are being studied in order to find genes that are important for seed iron distribution...
Shannon C.K. Straub; Mark Fishbein; Tatyana Livshult; Zachary Foster; Matthew Parks; Kevin Weitemier; Richard C. Cronn; Aaron Liston
Milkweeds (Asclepias L.) have been extensively investigated in diverse areas of evolutionary biology and ecology; however, there are few genetic resources available to facilitate and complement these studies. This study explored how low coverage genome sequencing of the common milkweed (Asclepias syriaca L.) could be useful in...
Wolf, Nancy Butler
Educational reform is most likely to be successful when teachers are knowledgeable about the intended reform, and when their concerns about the reform are understood and addressed. The Common Core State Standards (CCSS) is an effort to establish a set of nationwide expectations for students and teachers. This study examined teacher understanding…
Metcalf, Karen K.; Barlow, Amy; Hudson, Lisa; Jones, Elizabeth; Lyons, Dennis; Piersall, James; Munfus, Laureen
Provides guidelines on how to adapt common games such as checkers, tic tac toe, obstacle courses, and memory joggers into interactive games in multimedia courseware. Emphasizes creating generic games that can be recycled and used for multiple topics to save development time and keep costs low. Discusses topic themes, game structure, and…
Computer networks and their services have become an essential part of research and education. Nowadays every modern R&E institution must have a computer network and provide network services to its students and staff. In addition to its internal computer network, every R&E institution must have a
Selig, James P.; Trott, Arianna; Lemberger, Matthew E.
Researchers in group counseling often encounter complex data from individual clients who are members of a group. Clients in the same group may be more similar than clients from different groups and this can lead to violations of statistical assumptions. The complexity of the data also means that predictors and outcomes can be measured at both the…
Disruptive Influences on Research in Academic Pathology Departments: Proposed Changes to the Common Rule Governing Informed Consent for Research Use of Biospecimens and to Rules Governing Return of Research Results.
Sobel, Mark E; Dreyfus, Jennifer C
Academic pathology departments will be dramatically affected by proposed United States federal government regulatory initiatives. Pathology research will be substantially altered if proposed changes to the Common Rule (Code of Federal Regulations: Protection of Human Subjects title 45 CFR 46) and regulations governing the return of individual research results are approved and finalized, even more so now that the Precision Medicine initiative has been launched. Together, these changes are disruptive influences on academic pathology research as we know it, straining limited resources and compromising advances in diagnostic and academic pathology. Academic research pathologists will be challenged over the coming years and must demonstrate leadership to ensure the continued availability of and the ethical use of research pathology specimens. Copyright © 2017 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
Graff, Paige Valderrama
The Expedition Earth and Beyond Student Scientist Guidebook is designed to help student researchers model the process of science and conduct a research investigation. The Table of Contents outlines the steps included in this guidebook.
Stanford, Bret K.; Jutte, Christine V.
This work quantifies the potential aeroelastic benefits of tailoring a full-scale wing box structure using tailored thickness distributions, material distributions, or both simultaneously. These tailoring schemes are considered for the wing skins, the spars, and the ribs. Material grading utilizes a spatially-continuous blend of two metals: Al and Al+SiC. Thicknesses and material fraction variables are specified at the 4 corners of the wing box, and a bilinear interpolation is used to compute these parameters for the interior of the planform. Pareto fronts detailing the conflict between static aeroelastic stresses and dynamic flutter boundaries are computed with a genetic algorithm. In some cases, a true material grading is found to be superior to a single-material structure.
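The interpolation step described above, design variables specified only at the four corners of the wing-box planform and bilinearly interpolated for interior points, is simple enough to sketch directly. The corner thickness values below are hypothetical, chosen only to illustrate the scheme.

```python
# Hedged sketch of the bilinear-interpolation step: thickness (or material
# fraction) is given at the four corners of a unit-square planform and
# interpolated everywhere else. Corner values here are invented.
def bilinear(x, y, corners):
    """Interpolate on the unit square from values at (0,0), (1,0), (0,1), (1,1)."""
    f00, f10, f01, f11 = corners
    return (f00 * (1 - x) * (1 - y) + f10 * x * (1 - y)
            + f01 * (1 - x) * y + f11 * x * y)

# skin thickness (mm) at the four planform corners -- hypothetical values
t_corners = (12.0, 8.0, 10.0, 6.0)
print(bilinear(0.5, 0.5, t_corners))  # centre of the planform -> 9.0
```

With only four design variables per field, the optimizer's search space stays small while still allowing spanwise and chordwise grading.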
Zhang, Qian-Qian; Ying, Guang-Guo; Chen, Zhi-Feng; Liu, You-Sheng; Liu, Wang-Rong; Zhao, Jian-Liang
Climbazole is an antidandruff active ingredient commonly used in personal care products, but little is known about its environmental fate. The aim of this study was to evaluate the fate of climbazole in the water, sediment, soil and air compartments across the whole of China by using a level III multimedia fugacity model. The usage of climbazole in China was calculated to be 345 t according to market research data, and after wastewater treatment a total emission of 245 t was discharged into the receiving environment, with approximately 93% entering the water compartment and 7% the soil compartment. The developed fugacity model was successfully applied to estimate the contamination levels and mass inventories of climbazole in various environmental compartments of the river basins in China. The predicted environmental concentration ranges of climbazole were 0.20-367 ng/L in water and 0.009-25.2 ng/g dry weight in sediment. The highest concentration was mainly found in the Haihe River basin and the lowest in the basins of the Tibet and Xinjiang regions. The mass inventory of climbazole across China was estimated to be 294 t, with 6.79% in water, 83.7% in sediment, 9.49% in soil, and 0.002% in air. Preliminary risk assessment showed high risks in sediment posed by climbazole in 2 out of 58 basins in China. The medium risks in water and sediment were mostly concentrated in north China. To the best of our knowledge, this is the first report on the emissions and multimedia fate of climbazole in the river basins of China. Copyright © 2015 Elsevier B.V. All rights reserved.
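The compartment split reported in the abstract above is easy to sanity-check with back-of-envelope arithmetic. This is not the fugacity model itself, only the reported emission and inventory percentages applied to the reported totals.

```python
# Arithmetic check of the figures quoted in the abstract (not the level III
# fugacity model): 245 t emitted, ~93% to water and 7% to soil; a 294 t
# standing inventory split across four compartments.
emission_total = 245.0                 # tonnes/year reaching the environment
to_water = 0.93 * emission_total       # ~227.85 t to the water compartment
to_soil = 0.07 * emission_total        # ~17.15 t to the soil compartment

inventory = 294.0                      # standing mass inventory, tonnes
fractions = {"water": 0.0679, "sediment": 0.837, "soil": 0.0949, "air": 0.00002}
masses = {k: v * inventory for k, v in fractions.items()}

print(f"water: {to_water:.2f} t/yr, soil: {to_soil:.2f} t/yr")
print({k: round(v, 1) for k, v in masses.items()})
```

The fractions sum to roughly 1, consistent with the abstract's rounding; sediment dominates the standing inventory even though nearly all emission enters the water column.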
Hatziioannou, Theodora; Evans, David T.
The AIDS pandemic continues to present us with unique scientific and public health challenges. Although the development of effective antiretroviral therapy has been a major triumph, the emergence of drug resistance requires active management of treatment regimens and the continued development of new antiretroviral drugs. Moreover, despite nearly 30 years of intensive investigation, we still lack the basic scientific knowledge necessary to produce a safe and effective vaccine against HIV-1. Animal models offer obvious advantages in the study of HIV/AIDS, allowing for a more invasive investigation of the disease and for preclinical testing of drugs and vaccines. Advances in humanized mouse models, non-human primate immunogenetics and recombinant challenge viruses have greatly increased the number and sophistication of available mouse and simian models. Understanding the advantages and limitations of each of these models is essential for the design of animal studies to guide the development of vaccines and antiretroviral therapies for the prevention and treatment of HIV-1 infection. PMID:23154262
José M. Rey Benayas
We examined patterns of commonness and rarity among plant species in montane wet grasslands of Iberia. This examination is set within two contexts. First, we expanded on an earlier scheme for classifying species as common or rare by adding a fourth criterion, the ability of that species to occupy a larger or smaller fraction of its potential suitable habitats, i.e., habitat occupancy. Second, we explicated two theories, the superior organism theory and the generalist/specialist trade-off theory. The data consisted of 232 species distributed among 92 plots. The species were measured for mean local abundance, size of environmental volume occupied, percentage of volume occupied, range within Iberia, and range in Europe and the Mediterranean basin. In general, all measures were positively correlated, in agreement with the superior organism theory. However, specialist species were also found. Thus, patterns of commonness and rarity may be due to a combination of mechanisms. Analyses such as ours can also be used as a first step in identifying habitats and species that may be endangered.
We address a poorly understood aspect of ecological niche modeling: its sensitivity to different levels of geographic uncertainty in organism occurrence data. Our primary interest was to assess how accuracy degrades under increasing uncertainty, with performance measured indirectly through model consistency. We used Monte Carlo simulations and a similarity measure to assess model sensitivity across three variables: locality accuracy, niche modeling method, and species. Randomly generated data sets with known levels of locality uncertainty were compared to an original prediction using Fuzzy Kappa. Data sets where locality uncertainty is low were expected to produce similar distribution maps to the original. In contrast, data sets where locality uncertainty is high were expected to produce less similar maps. BIOCLIM, DOMAIN, Maxent and GARP were used to predict the distributions for 1200 simulated datasets (3 species x 4 buffer sizes x 100 randomized data sets). Thus, our experimental design produced a total of 4800 similarity measures, with each of the simulated distributions compared to the prediction of the original data set and corresponding modeling method. A general linear model (GLM) analysis was performed which enables us to simultaneously measure the effect of buffer size, modeling method, and species, as well as interactions among all variables. Our results show that modeling method has the largest effect on similarity scores and uniquely accounts for 40% of the total variance in the model. The second most important factor was buffer size, but it uniquely accounts for only 3% of the variation in the model. The newer and currently more popular methods, GARP and Maxent, were shown to produce more inconsistent predictions than the earlier and simpler methods, BIOCLIM and DOMAIN. Understanding the performance of different niche modeling methods under varying levels of geographic uncertainty is an important step toward more productive
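The core perturbation step in the experiment above, re-drawing each occurrence point within a buffer of known radius to simulate georeferencing error, can be sketched briefly. This is an illustrative reconstruction, not the authors' code; the coordinates and radii are invented, and the degree-to-kilometre conversion is the usual small-angle approximation.

```python
import math
import random

random.seed(42)

def jitter(lon, lat, radius_km):
    """Displace a point uniformly within a disc of the given radius (km)."""
    r = radius_km * math.sqrt(random.random())  # uniform over the disc area
    theta = random.uniform(0.0, 2.0 * math.pi)
    dlat = (r / 111.0) * math.sin(theta)        # ~111 km per degree latitude
    dlon = (r / (111.0 * math.cos(math.radians(lat)))) * math.cos(theta)
    return lon + dlon, lat + dlat

occurrences = [(-99.1, 19.4), (-98.7, 19.9)]    # hypothetical localities
for radius in (1, 5, 10, 25):                   # buffer sizes, km
    simulated = [jitter(lon, lat, radius) for lon, lat in occurrences]
```

Repeating the draw 100 times per buffer size and re-fitting the niche model on each randomized set yields the simulated distributions that Fuzzy Kappa then compares against the original prediction.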
Fu, Liang-Yu; Wang, Guang-Zhong; Ma, Bin-Guang; Zhang, Hong-Yu
Recently, numerous genome analyses revealed the existence of a universal G:C→A:T mutation bias in bacteria, fungi, plants and animals. To explore the molecular basis for this mutation bias, we examined the three well-known DNA mutation models, i.e., the oxidative damage model, UV-radiation damage model and CpG hypermutation model. It was revealed that these models cannot provide a sufficient explanation for the universal mutation bias. Therefore, we resorted to a DNA mutation model proposed by Löwdin 40 years ago, which was based on inter-base double proton transfers (DPT). Since DPT is a fundamental and spontaneous chemical process and occurs much more frequently within GC pairs than AT pairs, the Löwdin model offers a common explanation for the observed universal mutation bias and thus has broad biological implications. Copyright © 2011 Elsevier Inc. All rights reserved.
Long-run relations and common trends are discussed in terms of the multivariate cointegration model given in the autoregressive and the moving average form. The basic results needed for the analysis of I(1) and I(2) processes are reviewed and the results applied to Danish monetary data. The test...
Erdogan, Cihan Suleyman; Hansen, Benni Winding; Vang, Ole
is an evolutionarily conserved key protein kinase in the TOR pathway that regulates growth, proliferation and cell metabolism in response to nutrients, growth factors and stress. Comparing the ageing process in invertebrate model organisms with relatively short lifespans with mammals provides valuable information about … the molecular mechanisms underlying the ageing process faster than mammalian systems. Inhibition of TOR pathway activity via either genetic manipulation or rapamycin increases lifespan profoundly in most invertebrate model organisms. This contribution will review the recent findings in invertebrates concerning … the TOR pathway and effects of TOR inhibition by rapamycin on lifespan. Besides some contradictory results, the majority point out that rapamycin induces longevity. This suggests that administration of rapamycin in invertebrates is a promising tool for pursuing the scientific puzzle of lifespan…
Bjoerneby, Henrik; Alvik, Ivar
The main objective of this study is to consider the legal advantages and disadvantages of different contract models given NordREG's choice of a supplier centric model with mandatory combined billing in a future Nordic end-user market for electricity. At the outset, there are today three relevant categories of agreements in place between customers, suppliers and DSOs in the Nordic electricity retail markets: the electricity supply agreements between customers and suppliers, the grid use agreements between customers and DSOs, and the grid connection agreements usually entered into between customers and DSOs. We have assumed that issues governed by the grid connection agreements will still be entered into by DSOs under a supplier centric model. Two general contract models have on this basis been considered as possible approaches to regulation of electricity supply and grid use terms under a future supplier centric model. The subcontractor model is considered in more detail in chapter 7 of this report. Under this model, the customer enters into a contract with the supplier governing both electricity supply and grid use. The supplier then enters into a separate contract with the DSO for grid use, making the DSO a subcontractor for this service. The Danish wholesale model which will be implemented from 1 October 2014 represents one example of a subcontractor model. The main advantage of the subcontractor model is that it will entitle the customer to envisage the electricity supply, including grid services, as a single service delivered by the supplier. On the other hand, the sub-contractor model will extend the responsibilities of suppliers towards customers. We discuss the advantages and disadvantages of this model further in section 7.2. The power of attorney model is considered in more detail in chapter 8 of this report. Under this model, the customer and the DSO will still formally be contract parties to the grid use agreement, but the supplier will act with a
Ritter, Thomas; Lettl, Christopher
Business-model research has struggled to develop a clear footprint in the strategic management field. This introduction to the special issue on the wider implications of business-model research argues that part of this struggle relates to the application of five different perspectives on the term "business model," which creates ambiguity about the conceptual boundaries of business models, the applied terminology, and the potential contributions of business-model research to strategic management literature. By explicitly distinguishing among these five perspectives and by aligning them into one…
Conn, P. Michael
... have the advantage that the reproductive, mitotic, development or aging cycles are rapid compared with those in humans; others are utilized because individual proteins may be studied in an advantageous way and that have human homologs. Other organisms are facile to grow in laboratory settings or lend themselves to convenient analyses, have defined genomes or present especially good human models of human or animal disease. We have made an effort not to be seduced into making the entire book homage to the rem...
Javier Blanch, Francisco Gil
The objective of this article is twofold; firstly, we establish the theoretical boundaries of positive leadership and the reasons for its emergence. It is related to the new paradigm of positive psychology that has recently been shaping the scope of organizational knowledge. This conceptual framework has triggered the development of the various forms of positive leadership (i.e. transformational, servant, spiritual, authentic, and positive). Although the construct does not seem univocally defined, these different types of leadership overlap and share a significant affinity. Secondly, we review the empirical evidence that shows the impact of positive leadership in organizations and we highlight the positive relationship between these forms of leadership and key positive organizational variables. Lastly, we analyse future research areas in order to further develop this concept.
Kim, Hyunsik; Eaton, Nicholas R
Studies of mental disorder comorbidity have produced an unsynthesized literature with multiple competing transdiagnostic models. The current study attempted to (a) integrate these models into an overarching comorbidity hierarchy, (b) link the resulting transdiagnostic factors to the bifactor model of psychopathology, and (c) investigate predictive validity of transdiagnostic factors for important future outcomes. A series of exploratory structural equation models (ESEMs) was conducted on 12 common mental disorders from a large, 2-wave nationally representative sample, using the bass-ackwards method to explore the hierarchical structure of transdiagnostic comorbidity factors. These Wave 1 factors were then linked with the bifactor model and with mental disorders at Wave 2. Results indicated that common mental disorder comorbidity was structured into an interpretable hierarchy. Connections between the hierarchy's general factor of psychopathology (denoted p), internalizing, and distress were very strong; these factors also linked strongly with the bifactor model's p factor. Predictive validity analyses prospectively predicting subsequent diagnoses indicated that, overall: (a) transdiagnostic factors outperformed disorder-specific variance; (b) within hierarchy levels, transdiagnostic factors where disorders optimally loaded outperformed other transdiagnostic factors, but this differed by disorder type; and (c) between hierarchy levels, transdiagnostic factors where disorders optimally loaded showed similar predictive validity. We discuss implications for hierarchical structure modeling, the integration of multiple competing comorbidity models, and benefits of transdiagnostic factors for understanding the continuity of mental disorders over time. (c) 2015 APA, all rights reserved.
De Brún, Aoife; McCarthy, Mary; McKenzie, Kenneth; McGloin, Aileen
This study examined the Irish media discourse on obesity by employing the Common Sense Model of Illness Representations. A media sample of 368 transcripts was compiled from newspaper articles (n = 346), radio discussions (n = 5), and online news articles (n = 17) on overweight and obesity from the years 2005, 2007, and 2009. Using the Common Sense Model and framing theory to guide the investigation, a thematic analysis was conducted on the media sample. Analysis revealed that the behavioral dimensions of diet and activity levels were the most commonly cited causes of and interventions in obesity. The advertising industry was blamed for obesity, and there were calls for increased government action to tackle the issue. Physical illness and psychological consequences of obesity were prevalent in the sample, and analysis revealed that the economy, regardless of its state, was blamed for obesity. These results are discussed in terms of expectations of audience understandings of the issue and the implications of these dominant portrayals and framings on public support for interventions. The article also outlines the value of a qualitative analytical framework that combines the Common Sense Model and framing theory in the investigation of illness narratives.
In the case of social milieu research, the practical, non-scientific context of market research can create problems for practical social research. Some of these problems are examined in this paper. Social milieu research can be seen as a new approach to social structure. It can also be seen as a paradigm for the combination of quantitative and qualitative research techniques and research strategies. Two important and influential milieu models are presented here: the milieu model of Gerhard SCHULZE and the milieu model of Sinus Sociovision. In this paper two important concepts are differentiated: the milieu model and the milieu instrument. The milieu model (the "milieu map") represents the social structure of a society as an organized set of social milieus. The milieu instrument consists of quantitative and qualitative techniques for measuring and interpreting the social milieus of the milieu model. In the context of market research, the milieu instrument enables the milieu-researcher to relate the customer's product to the social milieus in an interpretative and also quantifying way. However, there are two main problems: 1. Social milieu research in the context of market research is theoretically and methodologically highly sophisticated. Therefore the acceptance of milieu-thinking and milieu-theorizing must be created and distributed in the field of market research, namely to the customer of commercial milieu research. The distribution of milieu-theory has to be realized through the whole process of consulting and the customer has to be integrated in the process of qualitative interpretation. 2. In using the concept of social field of Pierre BOURDIEU, a problematic strategy of market research companies is analyzed. On one hand, market research companies are interested in distributing the knowledge and in maximizing the acceptance of their milieu-model in the field of market research, so that it becomes a prevailing standard, a "currency" in the
Mulcahy, Candace A.; Maccini, Paula; Wright, Kenneth; Miller, Jason
In this review, the authors offer a critical analysis of published interventions for improving mathematics performance among middle and high school students with EBD in light of the Common Core State Standards. An exhaustive review of literature from 1975 to December 2012 yielded 20 articles that met criteria for inclusion. The authors analyzed…
Tuai, Cameron K.
The integration of librarians and technologists to deliver information services represents a new and potentially costly organizational challenge for many library administrators. To understand better how to control the costs of integration, the research presented here will use structural contingency theory to study the coordination of librarians…
Abu Said, Al-Mansor; Mohd Rasdi, Roziah; Abu Samah, Bahaman; Silong, Abu Daud; Sulaiman, Suzaimah
Purpose: The purpose of this paper is to develop a career success model for academics at the Malaysian research universities. Design/methodology/approach: Self-administered and online surveys were used for data collection among 325 academics from Malaysian research universities. Findings: Based on the analysis of structural equation modeling, the…
Coppola, Jennifer J; Disney, Anita A
Acetylcholine (ACh) is believed to act as a neuromodulator in cortical circuits that support cognition, specifically in processes including learning, memory consolidation, vigilance, arousal and attention. The cholinergic modulation of cortical processes is studied in many model systems including rodents, cats and primates. Further, these studies are performed in cortical areas ranging from the primary visual cortex to the prefrontal cortex and using diverse methodologies. The results of these studies have been combined into singular models of function-a practice based on an implicit assumption that the various model systems are equivalent and interchangeable. However, comparative anatomy both within and across species reveals important differences in the structure of the cholinergic system. Here, we will review anatomical data including innervation patterns, receptor expression, synthesis and release compared across species and cortical area with a focus on rodents and primates. We argue that these data suggest no canonical cortical model system exists for the cholinergic system. Further, we will argue that as a result, care must be taken both in combining data from studies across cortical areas and species, and in choosing the best model systems to improve our understanding and support of human health.
Soković, Marina; Glamočlija, Jasmina; Marin, Petar D; Brkić, Dejan; van Griensven, Leo J L D
The chemical composition and antibacterial activity of essential oils from 10 commonly consumed herbs: Citrus aurantium, C. limon, Lavandula angustifolia, Matricaria chamomilla, Mentha piperita, M. spicata, Ocimum basilicum, Origanum vulgare, Thymus vulgaris and Salvia officinalis have been determined. The antibacterial activity of these oils and their main components, i.e. camphor, carvacrol, 1,8-cineole, linalool, linalyl acetate, limonene, menthol, α-pinene, β-pinene, and thymol, were assayed against the human pathogenic bacteria Bacillus subtilis, Enterobacter cloacae, Escherichia coli O157:H7, Micrococcus flavus, Proteus mirabilis, Pseudomonas aeruginosa, Salmonella enteritidis, S. epidermidis, S. typhimurium, and Staphylococcus aureus. The highest and broadest activity was shown by O. vulgare oil. Carvacrol had the highest antibacterial activity among the tested components.
The informal group mentoring model for research skills development begins with desk research for qualities of a publishable paper. Five dummy papers were reviewed by participants for quality. Participants conducted new studies and wrote research articles. These articles were peer reviewed by participants and submitted ...
Stahl, Bernd; Obach, Michael; Yaghmaei, Emad
Responsible research and innovation (RRI) is an approach to research and innovation governance aiming to ensure that research purpose, process and outcomes are acceptable, sustainable and even desirable. In order to achieve this ambitious aim, RRI must be relevant to research and innovation in … studies show the model to be viable and useful in corporate innovation processes. With this approach, we aim to inspire further research and evaluation of the proposed maturity model as a tool for facilitating the integration of RRI in corporate management.
Rader, David J
Uniquely blends mathematical theory and algorithm design for understanding and modeling real-world problems. Optimization modeling and algorithms are key components to problem-solving across various fields of research, from operations research and mathematics to computer science and engineering. Addressing the importance of the algorithm design process, Deterministic Operations Research focuses on the design of solution methods for both continuous and discrete linear optimization problems. The result is a clear-cut resource for understanding three cornerstones of deterministic operations research
Karanikas, Nektarios; Schwarz, M; Harfmann, J
A symbiotic relationship between human factors and safety scientists is needed to ensure the provision of holistic solutions for problems emerging in modern socio-technical systems. System Theoretic Accident Model and Processes (STAMP) tackles both interactions and individual failures of human and …
This article identifies elements and connections that seem to be relevant to explain persistent aggregate behavioral patterns in educational systems when using complex dynamical systems modeling and simulation approaches. Several studies have shown what factors are at play in educational fields, but confusion still remains about the underlying…
M.P.M. Vierboom (Michel); E. Breedveld (Elly); I. Kondova (Ivanela); B.A. 't Hart (Bert)
Introduction: There is an ever-increasing need for animal models to evaluate efficacy and safety of new therapeutics in the field of rheumatoid arthritis (RA). Particularly for the early preclinical evaluation of human-specific biologicals targeting the progressive phase of the disease, …
Kelly, Rebecca A.; Jakeman, Anthony J.; Barreteau, Olivier; Borsuk, Mark E.; El-Sawah, Sondoss; Hamilton, Serena H.; Henriksen, Hans Jorgen; Kuikka, Sakari; Maier, Holger R.; Rizzoli, Andrea Emilio; van Delden, H.; Voinov, A.
The design and implementation of effective environmental policies need to be informed by a holistic understanding of the system processes (biophysical, social and economic), their complex interactions, and how they respond to various changes. Models, integrating different system processes into a …
Soto, Fabian A.; Wasserman, Edward A.
A wealth of empirical evidence has now accumulated concerning animals' categorizing photographs of real-world objects. Although these complex stimuli have the advantage of fostering rapid category learning, they are difficult to manipulate experimentally and to represent in formal models of behavior. We present a solution to the representation…
Goodrich, J Marc; Lonigan, Christopher J
According to the common underlying proficiency model (Cummins, 1981), as children acquire academic knowledge and skills in their first language, they also acquire language-independent information about those skills that can be applied when learning a second language. The purpose of this study was to evaluate the relevance of the common underlying proficiency model for the early literacy skills of Spanish-speaking language-minority children using confirmatory factor analysis. Eight hundred fifty-eight Spanish-speaking language-minority preschoolers (mean age = 60.83 months, 50.2% female) participated in this study. Results indicated that bifactor models that consisted of language-independent as well as language-specific early literacy factors provided the best fits to the data for children's phonological awareness and print knowledge skills. Correlated factors models that only included skills specific to Spanish and English provided the best fits to the data for children's oral language skills. Children's language-independent early literacy skills were significantly related across constructs and to language-specific aspects of early literacy. Language-specific aspects of early literacy skills were significantly related within but not across languages. These findings suggest that language-minority preschoolers have a common underlying proficiency for code-related skills but not language-related skills that may allow them to transfer knowledge across languages.
Milat, Andrew J; Bauman, Adrian E; Redman, Sally
Research funding agencies continue to grapple with assessing research impact. Theoretical frameworks are useful tools for describing and understanding research impact. The purpose of this narrative literature review was to synthesize evidence that describes processes and conceptual models for assessing policy and practice impacts of public health research. The review involved keyword searches of electronic databases, including MEDLINE, CINAHL, PsycINFO, EBM Reviews, and Google Scholar in July/August 2013. Review search terms included 'research impact', 'policy and practice', 'intervention research', 'translational research', 'health promotion', and 'public health'. The review included theoretical and opinion pieces, case studies, descriptive studies, frameworks, and systematic reviews describing processes and conceptual models for assessing research impact. The review was conducted in two phases: initially, abstracts were retrieved and assessed against the review criteria, followed by the retrieval and assessment of full papers against the review criteria. Thirty-one primary studies and one systematic review met the review criteria, with 88% of studies published since 2006. Studies comprised assessments of the impacts of a wide range of health-related research, including basic and biomedical research, clinical trials, and health service research, as well as public health research. Six studies had an explicit focus on assessing impacts of health promotion or public health research and one had a specific focus on intervention research impact assessment. A total of 16 different impact assessment models were identified, with the 'payback model' the most frequently used conceptual framework. Typically, impacts were assessed across multiple dimensions using mixed methodologies, including publication and citation analysis, interviews with principal investigators, peer assessment, case studies, and document analysis. The vast majority of studies relied on principal investigator …
Burnette, Margaret H.
The increasing interdisciplinarity of scientific research creates both challenges and opportunities for librarians. The liaison model may be inadequate for supporting campus research that represents multiple disciplines and geographically dispersed departments. The identification of units, researchers, and projects is a first step in planning and…
Simons, Jeffrey S.; Carey, Kate B.; Wills, Thomas A.
This study tested a theoretical model hypothesizing differential pathways from five predictors to alcohol abuse and dependence symptoms. The participants were college students (N = 2,270) surveyed on two occasions in a 6-month prospective design. Social norms, perceived utility of alcohol use, and family history of alcohol problems were indirectly associated with Time 2 (T2) abuse and dependence symptoms through influencing level of alcohol consumption. Poor behavioral control had a direct effect …
Lasmézas, C I; Deslys, J P; Demaimay, R; Adjou, K T; Hauw, J J; Dormont, D
The development of transmissible spongiform encephalopathies in experimental models depends on two major factors: the intracerebral accumulation of an abnormal, protease-resistant isoform of PrP (PrPres), which is a host protein mainly expressed in neurons; and the existence of different strains of agent. In order to make a distinction between pathogenic mechanisms depending upon the accumulation of host-derived PrPres and the strain-specific effects, we quantified and compared the sequence of molecular [PrPres and glial fibrillary acidic protein (GFAP) accumulation] and pathological events in the brains of syngeneic mice throughout the course of infection with two different strains of agent. The bovine spongiform encephalopathy (BSE) agent exhibits properties different from any known scrapie source and has been studied in comparison with a classical scrapie strain. Convergent kinetic data in both models confirmed the cause-effect relationship between PrPres and pathological changes and showed that PrPres accumulation is directly responsible for astrocyte activation in vivo. Moreover, we observed a threshold level of PrPres for this effect on astroglial cells. However, despite similar infectivity titres, the BSE model produced less PrPres than scrapie, and the relative importance of gliosis was higher. The comparison of the molecular and pathological features after intracerebral or intraperitoneal inoculation also revealed differences between the models. Therefore, the mechanisms leading to the targeting and the fine regulation of the molecular events seem to be independent of the host PrP and to depend upon the agent. The possible involvement of a regulatory molecule accounting for these specificities has to be considered.
Anchukaitis, K. J.; LeGrande, A. N.
The Asian monsoon can be characterized in terms of both precipitation variability and atmospheric circulation across a range of spatial and temporal scales. While multicentury time series of tree-ring widths at hundreds of sites across Asia provide estimates of past rainfall, the oxygen isotope ratios of annual rings may reveal broader regional hydroclimate and atmosphere-ocean dynamics. Tree-ring oxygen isotope chronologies from Monsoon Asia have been interpreted to reflect a local 'amount effect', relative humidity, source water and seasonality, and winter snowfall. Here, we use an isotope-enabled general circulation model simulation from the NASA Goddard Institute for Space Studies (GISS) ModelE and a proxy system model of the oxygen isotope composition of tree-ring cellulose to interpret the large-scale and local climate controls on δ18O chronologies. Broad-scale dominant signals are associated with a suite of covarying hydroclimate variables including growing season rainfall amounts, relative humidity, and vapor pressure deficit. Temperature and source water influences are region-dependent, as are the simulated tree-ring isotope signals associated with the El Niño-Southern Oscillation (ENSO) and large-scale indices of the Asian monsoon circulation. At some locations, including southern coastal Viet Nam, local precipitation isotope ratios and the resulting simulated δ18O tree-ring chronologies reflect upstream rainfall amounts and atmospheric circulation associated with monsoon strength and wind anomalies.
Berninger, V.; Rijlaarsdam, G.; Fayol, M.L.; Fayol, M.; Alamargot, D.; Berninger, V.W.
About the book: Translation of cognitive representations into written language is one of the most important processes in writing. This volume provides a long-awaited updated overview of the field. The contributors discuss each of the commonly used research methods for studying translation; theorize
Lindsay, Helen; Yap, Von Bing; Ying, Hua; Huttley, Gavin A
Neighboring nucleotides exert a striking influence on mutation, with the hypermutability of CpG dinucleotides in many genomes being an exemplar. Among the approaches employed to measure the relative importance of sequence neighbors on molecular evolution have been continuous-time Markov process models for substitutions that treat sequences as a series of independent tuples. The most widely used examples are the codon substitution models. We evaluated the suitability of derivatives of the nucleotide frequency weighted (hereafter NF) and tuple frequency weighted (hereafter TF) models for measuring sequence context dependent substitution. Critical properties we address are their relationships to an independent nucleotide process and the robustness of parameter estimation to changes in sequence composition. We then consider the impact on inference concerning dinucleotide substitution processes from application of these two forms to intron sequence alignments from primates. We prove that the NF form always nests the independent nucleotide process and that this is not true for the TF form. As a consequence, using TF to study context effects can be misleading, which is shown by both theoretical calculations and simulations. We describe a simple example where a context parameter estimated under TF is confounded with composition terms unless all sequence states are equi-frequent. We illustrate this for the dinucleotide case by simulation under a nucleotide model, showing that the TF form identifies a CpG effect when none exists. Our analysis of primate introns revealed that the effect of nucleotide neighbors is over-estimated under TF compared with NF. Parameter estimates for a number of contexts are also strikingly discordant between the two model forms. Our results establish that the NF form should be used for analysis of independent-tuple context dependent processes. Although neighboring effects in general are still important, prominent influences such as the elevated CpG …
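The abstract above does not reproduce the NF or TF model equations. As an illustrative sketch only, the kind of context effect at issue (CpG hypermutability) is commonly screened with an observed/expected dinucleotide ratio whose expectation, in the spirit of the NF form's independent-nucleotide baseline, is built from single-nucleotide frequencies; the sequence below is a toy example, not data from the study:

```python
def dinucleotide_obs_exp(seq, dinuc="CG"):
    """Observed/expected ratio for a dinucleotide. The expectation is
    built from single-nucleotide frequencies, i.e. it assumes an
    independent-nucleotide process (an NF-style baseline)."""
    n = len(seq)
    # observed frequency of the dinucleotide over all adjacent pairs
    obs = sum(1 for i in range(n - 1) if seq[i:i + 2] == dinuc) / (n - 1)
    # expected frequency under independence of neighboring sites
    p1 = seq.count(dinuc[0]) / n
    p2 = seq.count(dinuc[1]) / n
    exp = p1 * p2
    return obs / exp if exp > 0 else float("nan")

# Toy sequence: a ratio well below 1 would suggest CpG depletion
ratio = dinucleotide_obs_exp("ACGTACGTAAATTTGGGCCC")
```

A ratio near 1 is consistent with the independent-nucleotide baseline; genuine context effects (or, per the abstract, composition confounding under a TF-style expectation) show up as departures from it.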
Sport science can be thought of as a scientific process used to guide the practice of sport with the ultimate aim of improving sporting performance. However, despite this goal, the general consensus is that the translation of sport-science research to practice is poor. Furthermore, researchers have been criticised for failing to study problems relevant to practitioners and for disseminating findings that are difficult to implement within a practical setting. This paper proposes that the situation may be improved by the adoption of a model that guides the direction of research required to build our evidence base about how to improve performance. Central to the Applied Research Model for the Sport Sciences (ARMSS) described in this report is the idea that only research leading to practices that can and will be adopted can improve sporting performance. The eight stages of the proposed model are (i) defining the problem; (ii) descriptive research; (iii) predictors of performance; (iv) experimental testing of predictors; (v) determinants of key performance predictors; (vi) efficacy studies; (vii) examination of barriers to uptake; and (viii) implementation studies in a real sporting setting. It is suggested that, from the very inception, researchers need to consider how their research findings might ultimately be adapted to the intended population, in the actual sporting setting, delivered by persons with diverse training and skills, and using the available resources. It is further argued in the model that a greater understanding of the literature and more mechanistic studies are essential to inform subsequent research conducted in real sporting settings. The proposed ARMSS model therefore calls for a fundamental change in the way in which many sport scientists think about the research process. While there is no guarantee that application of this proposed research model will improve actual sports performance, anecdotal evidence suggests that sport-science research is …
Fuchs, Peter; Nussbeck, Fridtjof W; Meuwly, Nathalie; Bodenmann, Guy
The analysis of observational data is often seen as a key approach to understanding dynamics in romantic relationships, and in dyadic systems in general. Statistical models for the analysis of dyadic observational data are not commonly known or applied. In this contribution, selected approaches to dyadic sequence data are presented, with a focus on models that can be applied when sample sizes are of medium size (N = 100 couples or less). Each of the statistical models is motivated by an underlying potential research question, and the most important model results are presented and linked to that question. The following research questions and models are compared with respect to their applicability using a hands-on approach: (I) Is there an association between a particular behavior by one partner and the reaction by the other? (Pearson correlation); (II) Does the behavior of one member trigger an immediate reaction by the other? (aggregated logit models; multi-level approach; basic Markov model); (III) Is there an underlying dyadic process which might account for the observed behavior? (hidden Markov model); and (IV) Are there latent groups of dyads which might account for observing different reaction patterns? (mixture Markov; optimal matching). Finally, recommendations for researchers on choosing among the different models, issues of data handling, and advice on properly applying the statistical models in empirical research are given (e.g., in the new R package "DySeq").
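The abstract names the basic Markov model for question (II) but gives no implementation detail. As a hedged sketch of its simplest ingredient, first-order transition probabilities between coded partner behaviors can be estimated by counting adjacent pairs; the state codes and sequences below are hypothetical, not the DySeq implementation:

```python
from collections import Counter

def transition_matrix(sequences, states):
    """Estimate first-order Markov transition probabilities from a
    list of coded behavior sequences (maximum-likelihood counts)."""
    counts = Counter()
    for seq in sequences:
        # count every adjacent pair (state at t, state at t+1)
        for a, b in zip(seq, seq[1:]):
            counts[(a, b)] += 1
    matrix = {}
    for s in states:
        row_total = sum(counts[(s, t)] for t in states)
        # each row is the conditional distribution P(next | current = s)
        matrix[s] = {t: (counts[(s, t)] / row_total if row_total else 0.0)
                     for t in states}
    return matrix

# Hypothetical coding: 'S' = stress expression by one partner,
# 'C' = supportive (coping) reaction by the other
seqs = [list("SCSCSS"), list("SSCC"), list("CSCS")]
m = transition_matrix(seqs, ["S", "C"])
```

A large `m["S"]["C"]` relative to the base rate of `C` would indicate that stress expression tends to trigger an immediate supportive reaction, which is exactly the contingency question (II) asks.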
Mahdavi, Seyed Mohammad; Sahraei, Hedayat; Rezaei-Tavirani, Mostafa; Najafi Abedi, Akram
Naturally, the presence of electromagnetic waves in our living environment affects all components of organisms, particularly humans and animals, as a large part of their bodies consists of water. In the present study, we investigated the relation between exposure to an extremely low-frequency electromagnetic field (ELF-EMF) and common measures such as body weight, food and water intake, anorexia (poor appetite), plasma glucose concentration, movement, rearing and sniffing in rats. For this purpose, rats were exposed to a 40 Hz ELF-EMF once a day for 21 days; then, on days 1, 3, 7, 14 and 21 after exposure, changes in the above-mentioned items were assessed in the exposed rats and compared to a non-exposed control group. Body weight of irradiated rats significantly increased only a week after exposure and decreased after that. No significant change was observed in food and water intake of irradiated rats compared to the control, and the anorexia parameter in the group exposed to ELF-EMF was significantly decreased at one and two weeks after irradiation. A week after exposure, the level of glucose was significantly increased, but on other days these changes were not significant. Movement, rearing and sniffing of rats on day 1 after exposure were significantly decreased, and on other days these changes did not follow any particular pattern. Overall, the results of this study demonstrated that exposure to ELF-EMF can alter the normal condition of animals and may have a harmful impact on behavior.
Natalia T. Zadorozhna
A model of the website of a research institution of the NAPS of Ukraine (TSRI) is described. The goals and objectives of the TSRI model are defined. The Web 2.0 theoretical basis for site development is considered, and the information environment and the TSRI internet and intranet models are described. The site structure, design and layout for a research institution of the NAPS of Ukraine are substantiated. An example of the website of the Institute of Information Technologies and Learning Tools of the NAPS of Ukraine based on the TSRI model is shown. The technological aspects of the TSRI model are considered, and a scheme of the internet-intranet environment to support scientific publications in the pedagogical and psychological sciences is given. This paper is intended for web designers, researchers and administrative staff of research institutions.
Lanckriet, A; Timbermont, L; De Gussem, M; Marien, M; Vancraeynest, D; Haesebrouck, F; Ducatelle, R; Van Immerseel, F
Necrotic enteritis poses an important health risk to broilers. The ionophore anticoccidials lasalocid, salinomycin, maduramicin, narasin and a combination of narasin and nicarbazin were tested in feed for their prophylactic effect on the incidence of necrotic enteritis in a subclinical experimental infection model that uses coccidia as a predisposing factor. In addition, drinking water medication with the antibiotics amoxicillin, tylosin and lincomycin was evaluated as curative treatment in the same experimental model. The minimal inhibitory concentrations (MICs) of all antibiotics and anticoccidials were determined in vitro against 51 Clostridium perfringens strains isolated from broilers. The strains examined appeared uniformly susceptible to lasalocid, maduramicin, narasin, salinomycin, amoxicillin and tylosin, whereas an extended frequency distribution range of MICs for lincomycin was seen, indicating acquired resistance in 36 isolates in the higher range of MICs. Nicarbazin did not inhibit the in vitro growth of the C. perfringens strains even at a concentration of 128 microg/ml. Supplementation of the diet from day 1 onwards with lasalocid, salinomycin, narasin or maduramicin led to a reduction in birds with necrotic enteritis lesions as compared with the non-medicated infected control group. A combination product of narasin and nicarbazin had no significant protective effect. Treatment with amoxicillin, lincomycin and tylosin completely stopped the development of necrotic lesions.
D. Douglas; K. Beard; J. Eldred; P. Evtushenko; A. Jenkins; W. Moore; L. Osborne; D. Sexton; C. Tennant
Particle beam modeling in accelerators has been the focus of considerable effort since the 1950s. Many generations of tools have resulted from this process, each leveraging both prior experience and increases in computer power. However, continuing innovation in accelerator technology results in systems that are not well described by existing tools, so the software development process is ongoing. We discuss a novel response to this situation, which was encountered when Jefferson Lab began operation of its energy-recovering linacs. These machines were not readily described with legacy software; therefore a model was built using Microsoft Excel. This interactive simulation can query data from the accelerator, use it to compute machine parameters, analyze difference-orbit data, and evaluate beam properties. It can also derive new accelerator tunings and rapidly evaluate the impact of changes in machine configuration. As it is spreadsheet-based, it can easily be modified by users in response to changing requirements. Examples for the JLab IR Upgrade FEL are presented.
Gai, Ke; Hoelen, Thomas P; Hsu-Kim, Heileen; Lowry, Gregory V
Mercury (Hg) occurs as a myriad of species in environmental media, each with different physicochemical properties. The influence of Hg speciation on its transport in unsaturated soils is not well studied. Transport of four Hg species (dissolved inorganic Hg (II) species, a prepared Hg(II) and dissolved organic matter (DOM) complex, Hg(0), and HgS nanoparticles) was measured in sand and soil packed columns with partial water saturation under simulated rainfall (low ionic strength solution without DOM) and landfill leachate (high DOM content and high ionic strength) influent conditions. The Hg(II)-DOM species had the highest mobility among the four Hg species evaluated, and HgS particles (∼230 nm hydrodynamic diameter) had the poorest mobility, for all soil and influent conditions tested. The addition of 2 wt % clay particles to sand greatly retarded the transport of all Hg species, especially under simulated rainfall. DOM in the column influent facilitated the transport of all four Hg species in model and natural soils. For simulated rainfall, the transport trends observed in model sands were consistent with those measured in a sandy soil, except that the mobility of dissolved inorganic Hg(II) species was significantly lower in natural soils. For simulated rainfall, Hg transport was negligible in a high organic content (∼3.72 wt %) soil for all species except Hg-DOM. This work suggests that the Hg-DOM species presents the greatest potential for vertical migration to groundwater, especially with DOM in the influent solution.
A novel formula, easily applied with high precision, is proposed in this paper to fit the B-H curve of soft magnetic materials, and it is validated by comparison of predicted and experimental results. It can accurately describe the nonlinear magnetization process and magnetic saturation characteristics of soft magnetic materials. Based on the electromagnetic transient coupling principle, an electromagnetic mathematical model of a high-speed solenoid valve (HSV) is developed in Fortran that takes the saturation phenomena of the electromagnetic force into consideration. The accuracy of the model is validated by comparison of the simulated and experimental static electromagnetic forces. Through experiment, it is concluded that increasing the drive current improves the electromagnetic energy conversion efficiency of the HSV at a low drive current, but has little effect at a high drive current. Through simulation, it is discovered that the electromagnetic energy conversion characteristics of the HSV are affected by the drive current and the total reluctance, consisting of the gap reluctance and the reluctance of the iron core and armature soft magnetic materials. These two influence factors, within the scope of the different drive currents, have different contribution rates to the electromagnetic energy conversion efficiency.
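The paper's fitting formula is not reproduced in the abstract. The sketch below fits a generic saturation-type magnetization curve, B(H) = mu*H + Bs*tanh(H/a), to synthetic data by scanning the single nonlinear parameter a and solving the linear part (mu, Bs) by least squares. The functional form and every number here are illustrative assumptions, not the authors' formula:

```python
import math

def fit_bh(h, b, a_grid):
    """Fit B(H) = mu*H + Bs*tanh(H/a): scan the nonlinear parameter a
    over a grid, and for each a solve the linear coefficients (mu, Bs)
    from the 2x2 normal equations; keep the best-fitting triple."""
    best = None
    for a in a_grid:
        t = [math.tanh(x / a) for x in h]
        sxx = sum(x * x for x in h)
        sxt = sum(x * y for x, y in zip(h, t))
        stt = sum(y * y for y in t)
        sxb = sum(x * z for x, z in zip(h, b))
        stb = sum(y * z for y, z in zip(t, b))
        det = sxx * stt - sxt * sxt
        mu = (sxb * stt - stb * sxt) / det
        bs = (stb * sxx - sxb * sxt) / det
        err = sum((mu * x + bs * y - z) ** 2 for x, y, z in zip(h, t, b))
        if best is None or err < best[0]:
            best = (err, mu, bs, a)
    return best[1:]

# Synthetic, noiseless B-H data (arbitrary units, made-up parameters)
h = [1.0 + i * 5.0 for i in range(200)]
b = [1e-4 * x + 1.6 * math.tanh(x / 150.0) for x in h]
mu, bs, a = fit_bh(h, b, a_grid=[50.0 + i for i in range(251)])
```

Separating the one nonlinear parameter from the two linear ones keeps the fit a cheap one-dimensional search, which is often adequate for smooth single-knee saturation curves like B-H data.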
Emel'yanenko, V. V.; Asher, D. J.; Bailey, M. E.
A numerical simulation of the Oort cloud is used to explain the observed orbital distributions and numbers of Jupiter-family (JF) and Halley-type (HT) short-period (SP) comets. Comets are given initial orbits with perihelion distances between 5 and 36 au, and evolve under planetary, stellar and Galactic perturbations for 4.5 Gyr. This process leads to the formation of an Oort cloud (which we define as the region of semimajor axes a > 1,000 au), and to a flux of cometary bodies from the Oort cloud returning to the planetary region at the present epoch. The results are consistent with the dynamical characteristics of SP comets and other observed cometary populations: the near-parabolic flux, Centaurs, and high-eccentricity trans-Neptunian objects. To achieve this consistency with observations, the model requires that the number of comets versus initial perihelion distance is concentrated towards the outer planetary region. Moreover, the mean physical lifetime of observable comets in the inner planetary region (q …) … observed HT comets and nearly half of observed JF comets come from the Oort cloud, and initially (4.5 Gyr ago) from orbits concentrated near the outer planetary region. Comets that have been in the Oort cloud also return to the Centaur (5 …) … observers. The model provides a unified picture for the origin of JF and HT comets. It predicts that the mean physical lifetime of all comets in the region q < 1.5 au is less than ~200 revolutions.
Cadelli, N.; Cottone, G.; Bertozzi, G.; Girardi, F.
Since 1982 a joint study has been launched by the CEC with the participation of national institutions in the E.C., aiming at a Performance Assessment of Geological Isolation Systems (PAGIS) for HLW disposal. This document is a summary of the first phase of the study, which was devoted to the collection of data and models and to the choice of an appropriate methodology. To this purpose, real or national sites have been chosen which are representative of three types of continental geological formations in the E.C.: clay, granite and salt (although the choices imply no commitment of any kind about their final use). Moreover, sub-seabed areas have also been identified. The study covers the following items: basic data on waste characteristics, site data and repository designs; methodology, which allows sensitivity and uncertainty analyses to be performed, as well as the assessment of radiation doses to individuals and populations; preliminary modelling of radionuclide release and migration through the geosphere (near- and far-field) and the biosphere following their various pathways to man; and selection of the most relevant radionuclide release scenarios and their probability of occurrence. Reference values have been selected for the basic data, as well as variants covering the various options which are under consideration in the different countries of the E.C.
Hoyer, Annika; Kuss, Oliver
Meta-analysis of diagnostic studies is still a rapidly developing area of biostatistical research. Especially, there is an increasing interest in methods to compare different diagnostic tests to a common gold standard. Restricting to the case of two diagnostic tests, in these meta-analyses the parameters of interest are the differences of sensitivities and specificities (with their corresponding confidence intervals) between the two diagnostic tests while accounting for the various associations across single studies and between the two tests. We propose statistical models with a quadrivariate response (where sensitivity of test 1, specificity of test 1, sensitivity of test 2, and specificity of test 2 are the four responses) as a sensible approach to this task. Using a quadrivariate generalized linear mixed model naturally generalizes the common standard bivariate model of meta-analysis for a single diagnostic test. If information on several thresholds of the tests is available, the quadrivariate model can be further generalized to yield a comparison of full receiver operating characteristic (ROC) curves. We illustrate our model by an example where two screening methods for the diagnosis of type 2 diabetes are compared. © The Author(s) 2016.
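The quadrivariate generalized linear mixed model itself is not specified in the abstract above. As a minimal, hedged illustration of the quantities it targets, the per-study differences of sensitivities and specificities can be computed from 2x2 counts, here with a naive Wald interval that ignores the within-study and between-test correlations the quadrivariate model is designed to account for; all counts below are hypothetical:

```python
import math

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from one 2x2 diagnostic table."""
    return tp / (tp + fn), tn / (tn + fp)

def diff_with_ci(p1, n1, p2, n2, z=1.96):
    """Wald CI for a difference of two proportions. This treats the
    two tests as independent, which the quadrivariate model does not."""
    d = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return d, (d - z * se, d + z * se)

# Hypothetical single study: two tests against a common gold standard
se1, sp1 = sens_spec(tp=45, fn=5, tn=80, fp=20)   # test 1
se2, sp2 = sens_spec(tp=40, fn=10, tn=90, fp=10)  # test 2
d_sens, ci_sens = diff_with_ci(se1, 50, se2, 50)   # 50 diseased
d_spec, ci_spec = diff_with_ci(sp1, 100, sp2, 100) # 100 non-diseased
```

In a real meta-analysis these four per-study responses (two sensitivities, two specificities) would be modeled jointly with random effects, so the intervals above understate or misstate the uncertainty whenever the responses are correlated.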
Nethery, Rachel C.; Warren, Joshua L.; Herring, Amy H.; Moore, Kari A.B.; Evenson, Kelly R.; Diez-Roux, Ana V.
The purpose of this study was to reduce the dimensionality of a set of neighborhood-level variables collected on participants in the Multi-Ethnic Study of Atherosclerosis (MESA) while appropriately accounting for the spatial structure of the data. A common spatial factor analysis model in the Bayesian setting was utilized in order to properly characterize dependencies in the data. Results suggest that use of the spatial factor model can result in more precise estimation of factor scores, improved insight into the spatial patterns in the data, and the ability to more accurately assess associations between the neighborhood environment and health outcomes. PMID:26372887
Marshall, Martin; Pagel, Christina; French, Catherine; Utley, Martin; Allwood, Dominique; Fulop, Naomi; Pope, Catherine; Banks, Victoria; Goldmann, Allan
The traditional separation of the producers of research evidence in academia from the users of that evidence in healthcare organisations has not succeeded in closing the gap between what is known about the organisation and delivery of health services and what is actually done in practice. As a consequence, there is growing interest in alternative models of knowledge creation and mobilisation, ones which emphasise collaboration, active participation of all stakeholders, and a commitment to shared learning. Such models have robust historical, philosophical and methodological foundations but have not yet been embraced by many of the people working in the health sector. This paper presents an emerging model of participation, the Researcher-in-Residence. The model positions the researcher as a core member of a delivery team, actively negotiating a body of expertise which is different from, but complementary to, the expertise of managers and clinicians. Three examples of in-residence models are presented: an anthropologist working as a member of an executive team, operational researchers working in a front-line delivery team, and a Health Services Researcher working across an integrated care organisation. Each of these examples illustrates the contribution that an embedded researcher can make to a service-based team. They also highlight a number of unanswered questions about the model, including the required level of experience of the researcher and their areas of expertise, the institutional facilitators and barriers to embedding the model, and the risk that the independence of an embedded researcher might be compromised. The Researcher-in-Residence model has the potential to engage both academics and practitioners in the promotion of evidence-informed service improvement, but further evaluation is required before the model should be routinely used in practice. PMID:24894592
Sellers, Debra M.; Schainker, Lisa M.; Lockhart, Peggy; Yeh, Hsiu Chen
This article describes the development, implementation, and exploratory evaluation of a professional development series that addressed educators' knowledge and use of the terms "research-based" and "evidence-based" within Human Sciences Extension and Outreach at one university. Respondents to a follow-up survey were more likely…
Fengler, Silke; Luxbacher, Günther
After the end of the Great War, private as well as public research funding in Austria was anaemic and slow to develop. Whereas the German state-funded Deutsche Forschungsgemeinschaft (DFG) was established as early as 1920, first steps in that direction were only taken in Austria in the late 1920s. In 1929, the Österreichisch-deutsche Wissenschaftshilfe (ODW) was founded under the auspices of the Austrian Academy of Sciences and the DFG. Although prima facie on an equal footing, the new research funding organisation was in fact highly dependent on its German cooperation partner. The article explores for the first time ODW's position within the German and Austrian science and foreign policies, which aimed to promote the idea of unification of both states within the German Reich. A quantitative analysis of the subsidies policy in the first five years of existence shows that the ODW gave financial aid primarily to conservative research fields, affecting the intellectual balance of power in the First Austrian Republic. Policy continuities and discontinuities of the organisation in the course of the national-socialist rise to power in Germany after 1933 are examined in the second part of the article. The article thus both increases our knowledge about the most important German research funding organisation DFG, and identifies some of the fundamental structural features of Austrian science policy in the interwar years.
Leemans, R.; Asrar, G.; Canadell, J.G.; Ingram, J.; Larigauderie, A.; Mooney, H.; Nobre, C.; Patwardhan, A.; Rice, M.; Schmidt, F.; Seitzinger, S.; Virji, H.; Vörösmarthy, C.; Yuoung, O.
The Earth System Science Partnership (ESSP) was established in 2001 by four global environmental change (GEC) research programmes: DIVERSITAS, IGBP, IHDP and WCRP. ESSP facilitates the study of the Earth's environment as an integrated system in order to understand how and why it is changing, and to
Bovenkerk, Bernice; Kaldewaij, Frederike
Animal models are used in experiments in the behavioural neurosciences that aim to contribute to the prevention and treatment of cognitive and affective disorders in human beings, such as anxiety and depression. Ironically, those animals that are likely to be the best models for psychopathology are also likely to be considered the ones that are most morally problematic to use, if it seems probable that (and if indeed they are initially selected as models because) they have experiences that are similar to human experiences that we have strong reasons to avoid causing, and indeed aim to alleviate (such as pain, anxiety or sadness). In this paper, against the background of contemporary discussions in animal ethics and the philosophy of animal minds, we discuss the views that it is morally permissible to use animals in these kinds of experiments, and that it is better to use less cognitively complex animals (such as zebrafish) than more complex animals (such as dogs). First, we criticise some justifications for the claim that human beings and more complex animals have higher moral status. We argue that contemporary approaches that attribute equal moral status to all beings that are capable of conscious strivings (e.g. avoiding pain and anxiety; aiming to eat and play) are based on more plausible assumptions. Second, we argue that it is problematic to assume that less cognitively complex animals have a lesser sensory and emotional experience than more complex beings across the board. In specific cases, there might be good reasons to assume that more complex beings would be harmed more by a specific physical or environmental intervention, but it might also be that they sometimes are harmed less because of a better ability to cope. Determining whether a specific experiment is justified is therefore a complex issue. Our aim in this chapter is to stimulate further reflection on these common assumptions behind the use of animal models for psychopathologies. In
Franke, Sigrid K; van Kesteren, Ronald E; Hofman, Sam; Wubben, Jacqueline A M; Smit, August B; Philippens, Ingrid H C H M
Insight into susceptibility mechanisms underlying Parkinson's disease (PD) would aid the understanding of disease etiology, enable target finding and benefit the development of more refined disease-modifying strategies. We used intermittent low-dose MPTP (0.5 mg/kg/week) injections in marmosets and measured multiple behavioral and neurochemical parameters. Genetically diverse monkeys from different breeding families were selected to investigate inter- and intrafamily differences in susceptibility to MPTP treatment. We show that such differences exist in clinical signs, in particular nonmotor PD-related behaviors, and that they are accompanied by differences in neurotransmitter levels. In line with the contribution of a genetic component, different susceptibility phenotypes could be traced back through genealogy to individuals of the different families. Our findings show that low-dose MPTP treatment in marmosets represents a clinically relevant PD model, with a window of opportunity to examine the onset of the disease, allowing the detection of individual variability in disease susceptibility, which may be of relevance for the diagnosis and treatment of PD in humans. © 2016 S. Karger AG, Basel.
Terada, Maiko; Horisawa, Kenichi; Miura, Shizuka; Takashima, Yasuo; Ohkawa, Yasuyuki; Sekiya, Sayaka; Matsuda-Ito, Kanae; Suzuki, Atsushi
Intrahepatic cholangiocarcinoma (ICC) is a malignant epithelial neoplasm composed of cells resembling cholangiocytes that line the intrahepatic bile ducts in portal areas of the hepatic lobule. Although ICC has been defined as a tumor arising from cholangiocyte transformation, recent evidence from genetic lineage-tracing experiments has indicated that hepatocytes can be a cellular origin of ICC by directly changing their fate to that of biliary lineage cells. Notch signaling has been identified as an essential factor for hepatocyte conversion into biliary lineage cells at the onset of ICC. However, the mechanisms underlying Notch signal activation in hepatocytes remain unclear. Here, using a mouse model of ICC, we found that hepatic macrophages called Kupffer cells transiently congregate around the central veins in the liver and express the Notch ligand Jagged-1 coincident with Notch activation in pericentral hepatocytes. Depletion of Kupffer cells prevents the Notch-mediated cell-fate conversion of hepatocytes to biliary lineage cells, inducing hepatocyte apoptosis and increasing mortality in mice. These findings will be useful for uncovering the pathogenic mechanism of ICC and developing preventive and therapeutic strategies for this refractory disease. PMID:27698452
Daler K. Sharipov
Full Text Available The article presents a mathematical model for the study and management of aerosols released into the atmosphere, together with a numerical algorithm implemented as hardware and software systems for conducting computational experiments.
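The record does not reproduce the model itself; atmospheric aerosol transport models of this kind are typically built around an advection-diffusion equation, sketched here as a generic illustration rather than the authors' exact formulation:

```latex
\frac{\partial c}{\partial t}
  + u\,\frac{\partial c}{\partial x}
  + v\,\frac{\partial c}{\partial y}
  + (w - w_s)\,\frac{\partial c}{\partial z}
  = \frac{\partial}{\partial z}\!\left(\kappa\,\frac{\partial c}{\partial z}\right)
  + \mu\left(\frac{\partial^2 c}{\partial x^2} + \frac{\partial^2 c}{\partial y^2}\right)
  - \sigma c + f(x, y, z, t)
```

where c is the aerosol concentration, (u, v, w) the wind components, w_s the gravitational settling velocity, κ and μ the vertical and horizontal turbulent diffusion coefficients, σ a decay/deposition rate, and f the emission source term. Discretizing such an equation in space and time is what turns the mathematical model into the computational experiment the abstract mentions.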
Passner, Jeffrey E
... by the U.S. Army Research Laboratory (ARL) to determine how accurate and robust the model is under a variety of meteorological conditions, with an emphasis on fine resolution, short-range forecasts in complex terrain...
National Oceanic and Atmospheric Administration, Department of Commerce — Weather Research and Forecasting (WRF) mesoscale numerical weather prediction model 3.5-day hourly forecast for the region surrounding the Hawaiian island of Oahu at...
National Oceanic and Atmospheric Administration, Department of Commerce — Weather Research and Forecasting (WRF) mesoscale numerical weather prediction model 7-day hourly forecast for the region surrounding the Commonwealth of the Northern...
National Oceanic and Atmospheric Administration, Department of Commerce — Weather Research and Forecasting (WRF) mesoscale numerical weather prediction model 7-day hourly forecast for the region surrounding the island of Guam at...
National Oceanic and Atmospheric Administration, Department of Commerce — Weather Research and Forecasting (WRF) mesoscale numerical weather prediction model 7-day hourly forecast for the region surrounding the islands of Samoa at...
Gunduz, M.; Isikdag, U.; Basaraner, M.
Indoor modeling and mapping has been an active area of research in the last 20 years, tackling problems related to the positioning and tracking of people and objects indoors, and it provides many opportunities for several domains ranging from emergency response to logistics in micro urban spaces. The outputs of recent research in the field have been presented in several scientific publications and events primarily related to spatial information science and technology. This paper summarizes the outputs of the last 10 years of research on indoor modeling and mapping within a classification covering seven areas: Information Acquisition by Sensors, Model Definition, Model Integration, Indoor Positioning and LBS, Routing & Navigation Methods, Augmented and Virtual Reality Applications, and Ethical Issues. Finally, the paper outlines current and future research directions and offers concluding remarks.
Wong, L. T.; Mui, K. W.; Hui, P. S.
Maintaining acceptable indoor air quality (IAQ) for a healthy environment is a primary concern, and policymakers have developed different strategies to assess IAQ performance based on proper assessment methodologies and monitoring plans. It can be cost-prohibitive to sample all toxic pollutants in a building. In search of a more manageable number of parameters for cost-effective IAQ assessment, this study investigated the probable correlations among the 12 indoor environmental parameters listed in the IAQ certification scheme of the Hong Kong Environmental Protection Department (HKEPD) in 422 Hong Kong offices. These 12 parameters consist of nine indoor air pollutants: carbon dioxide (CO2), carbon monoxide (CO), respirable suspended particulates (RSP), nitrogen dioxide (NO2), ozone (O3), formaldehyde (HCHO), total volatile organic compounds (TVOC), radon (Rn) and airborne bacteria count (ABC); and three thermal comfort parameters: temperature (T), relative humidity (RH) and air velocity (V). The relative importance of the correlations derived, from largest to smallest loadings, was ABC, Rn, CO, RH, RSP, CO2, TVOC, O3, T, V, NO2 and HCHO. Together with the mathematical expressions derived, an alternative sampling protocol for IAQ assessment was proposed in which the three 'most representative and independent' parameters, namely RSP, CO2 and TVOC, are measured in an office environment. The model's validity was verified with on-site measurements from 43 other offices in Hong Kong. The measured CO2, RSP and TVOC concentrations were used to predict the probable levels of the other nine parameters, and good agreement was found between the predictions and measurements. This simplified protocol provides an easy tool for IAQ monitoring in workplaces and will be useful for determining appropriate mitigation measures to honor the certification scheme in a cost-effective way.
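The simplified protocol amounts to predicting each unmeasured parameter from the three measured ones through fitted expressions. A minimal sketch of the idea follows; the coefficient values and the linear form are invented placeholders, not the correlations actually derived in the study:

```python
# Hypothetical linear loadings mapping the three measured parameters
# (RSP in ug/m3, CO2 in ppm, TVOC in ppb) to one predicted parameter.
# The real study derives such expressions from 422 offices; these
# numbers are placeholders for illustration only.
LOADINGS = {
    "CO":  {"intercept": 0.1, "RSP": 0.002, "CO2": 0.0005, "TVOC": 0.001},
    "NO2": {"intercept": 5.0, "RSP": 0.050, "CO2": 0.0100, "TVOC": 0.002},
}

def predict_parameter(name, rsp, co2, tvoc):
    """Predict one unmeasured IAQ parameter from the three measured ones."""
    c = LOADINGS[name]
    return (c["intercept"] + c["RSP"] * rsp
            + c["CO2"] * co2 + c["TVOC"] * tvoc)

# Example office measurement: RSP 40 ug/m3, CO2 800 ppm, TVOC 200 ppb
co_level = predict_parameter("CO", 40.0, 800.0, 200.0)
```

The design point is that only three instruments are needed per office visit; the remaining nine parameters are screened against their certification limits via the stored expressions.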
Steel, Zachary; Silove, Derrick
Despite a strong historical record of resettling and providing care for refugee populations, the Australian Federal Government has increasingly implemented harsh and restrictive policies regarding the treatment and management of asylum seekers. Most controversial of these has been the mandatory detention of asylum seekers, a policy applied indiscriminately and without discretion where individual cases have not been subject to judicial review or time constraints. From the outset health professionals have raised concerns about the possible adverse mental health impacts of prolonged detention. In contrast, government representatives have characterized conditions in detention as benign and comfortable, and have consistently contested criticism of detention, often citing a lack of scientific evidence as tacit support for the continuation of the policy. Nevertheless, requests for access to the detention centres to undertake rigorous scientific investigations have gone unheeded. In this context we argue that the Australian Government has failed to uphold its commitment to good governance by allowing transparency, openness and a willingness to have the impact of its policies scrutinized by scientists. The manifest conflict of interest in the government position leads to a breach in the normal social contract between mental health researchers and those responsible for the policy of detention. There is, we argue, a legitimate moral imperative in such situations for clinical researchers to breach the walls of enforced silence and give a voice to those who are afflicted. This imperative, however, must be carefully balanced against the risks that may face detainees agreeing to participate in such research.
…Simulation (M&S) — 2010 Research," for the Assistant Secretary of Defense for Research and Engineering, Systems Engineering, Director, Modeling and Simulation. This TMSSP addresses: the Defense Training Environment (DTE), the Personal Learning Assistant (PLA), and more agile business models for training. http://ignoranceisfutile.wordpress.com/2008/10/15/darpa-backked-siri-nearing-launch-of-personal-artificial-intelligence/
AFRL-AFOSR-VA-TR-2015-0296, "Research in Supercritical Fuel Properties and Combustion Modeling," Gregory W. Faris and Gregory P. Smith, SRI International, Menlo Park, CA. Final Technical Report, September 14, 2015.
Sánchez-Johnsen, Lisa; Escamilla, Julia; Rodriguez, Erin M.; Vega, Susan; Bolaños, Liliana
Many behavioral health materials have not been translated into Spanish. Of those that are available in Spanish, some of them have not been translated correctly, many are only appropriate for a subgroup of Latinos, and/or multiple versions of the same materials exist. This article describes an innovative model of conducting bilingual English–Spanish translations as part of community-based participatory research studies and provides recommendations based on this model. In this article, the traditional process of conducting bilingual translations is reviewed, and an innovative model for conducting translations in collaboration with community partners is described. Finally, recommendations for conducting future health research studies with community partners are provided. Researchers, health care providers, educators, and community partners will benefit from learning about this innovative model that helps produce materials that are more culturally appropriate than those that are produced with the most commonly used method of conducting translations. PMID:25741929
Chen, Wansu; Shi, Jiaxiao; Qian, Lei; Azen, Stanley P
To estimate relative risks or risk ratios for common binary outcomes, the most popular model-based methods are the robust (also known as modified) Poisson and the log-binomial regression. Of the two methods, it is believed that the log-binomial regression yields more efficient estimators because it is maximum likelihood based, while the robust Poisson model may be less affected by outliers. Evidence to support the robustness of robust Poisson models in comparison with log-binomial models is very limited. In this study a simulation was conducted to evaluate the performance of the two methods in several scenarios where outliers existed. The findings indicate that for data coming from a population where the relationship between the outcome and the covariate was in a simple form (e.g. log-linear), the two models yielded comparable biases and mean square errors. However, if the true relationship contained a higher order term, the robust Poisson models consistently outperformed the log-binomial models even when the level of contamination is low. The robust Poisson models are more robust (or less sensitive) to outliers compared to the log-binomial models when estimating relative risks or risk ratios for common binary outcomes. Users should be aware of the limitations when choosing appropriate models to estimate relative risks or risk ratios.
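For a binary exposure, the exponentiated slope from a log-link Poisson regression is exactly the risk ratio (the ratio of the two group risks), which is why the Poisson approach works for binary outcomes at all. A self-contained sketch using plain Newton/IRLS updates; this illustrates only the point estimate, not the robust-variance correction the paper evaluates:

```python
import numpy as np

def poisson_rr(y, x, n_iter=50):
    """Estimate the risk ratio exp(b1) from a log-link Poisson
    regression of a binary outcome y on a binary exposure x."""
    X = np.column_stack([np.ones_like(x, dtype=float), x.astype(float)])
    beta = np.zeros(2)
    for _ in range(n_iter):          # Newton / IRLS iterations
        mu = np.exp(X @ beta)        # fitted means under the log link
        # Poisson score X'(y - mu) and information X' diag(mu) X
        beta += np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    return np.exp(beta[1])

# 20/100 events among unexposed, 50/100 among exposed:
# true risk ratio = 0.50 / 0.20 = 2.5
y = np.array([1] * 20 + [0] * 80 + [1] * 50 + [0] * 50, dtype=float)
x = np.array([0] * 100 + [1] * 100)
rr = poisson_rr(y, x)  # converges to 2.5
```

Because the outcome is binary rather than a true count, the naive Poisson variance is wrong, which is why the "robust" (sandwich) standard errors mentioned in the abstract are required in practice.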
O'Keefe, Christine M.; Head, Richard J.
It is the purpose of this article to discuss the development and application of a logic model in the context of a large scientific research program within the Commonwealth Scientific and Industrial Research Organisation (CSIRO). CSIRO is Australia's national science agency and is a publicly funded part of Australia's innovation system. It conducts…
Undergraduate research: an alternative model of doing science. The main workforce is undergraduate students, using research as a tool in education. Advantages: high risk tolerance, infinite energy, uninhibited lateral thinking. Problems: Japanese ...
To this end, the introductions of 59 RAs published in the Legon Journal of the Humanities from 2005 to 2010 were analyzed using the CARS model. The findings reveal that the authors of these RAs may perhaps not be exploiting Step 3 (reviewing items of previous research) under Move 1 in order to reinforce the research ...
Beck, Kirk A.
This article describes ethnographic decision tree modeling (EDTM; C. H. Gladwin, 1989) as a mixed method design appropriate for counseling psychology research. EDTM is introduced and located within a postpositivist research paradigm. Decision theory that informs EDTM is reviewed, and the 2 phases of EDTM are highlighted. The 1st phase, model…
Velden, van der M.J.; Vreke, J.; Wal, van der B.; Symons, A.
The project described here was aimed at evaluating the Capability Maturity Model (CMM) in the context of a research organization. Part of the evaluation was a standard CMM assessment. It was found that CMM could be applied to a research organization, although its five maturity levels were considered
One of the high-priority research areas of the Federal Highway Administration (FHWA) is the development of the Interactive Highway Safety Design Model (IHSDM). The goal of the IHSDM research program is to develop a systematic approach that will allow...
Jackson, Samuel J; Thomas, Gareth J
Mouse models, including patient-derived xenograft mice, are widely used to address questions in cancer research. However, there are documented flaws in these models that can result in the misrepresentation of human tumour biology and limit the suitability of the model for translational research. A coordinated effort to promote the more widespread development and use of 'non-animal human tissue' models could provide a clinically relevant platform for many cancer studies, maximising the opportunities presented by human tissue resources such as biobanks. A number of key factors limit the wide adoption of non-animal human tissue models in cancer research, including deficiencies in the infrastructure and the technical tools required to collect, transport, store and maintain human tissue for lab use. Another obstacle is the long-standing cultural reliance on animal models, which can make researchers resistant to change, often because of concerns about historical data compatibility and losing ground in a competitive environment while new approaches are embedded in lab practice. There are a wide range of initiatives that aim to address these issues by facilitating data sharing and promoting collaborations between organisations and researchers who work with human tissue. The importance of coordinating biobanks and introducing quality standards is gaining momentum. There is an exciting opportunity to transform cancer drug discovery by optimising the use of human tissue and reducing the reliance on potentially less predictive animal models. © 2017. Published by The Company of Biologists Ltd.
Ollie Yiru Yu
Full Text Available Dental caries form through a complex interaction over time among dental plaque, fermentable carbohydrate, and host factors (including teeth and saliva). As a key factor, dental plaque or biofilm substantially influences the characteristics of carious lesions. Laboratory microbial culture models are often used because they provide a controllable and constant environment for cariology research. Moreover, they do not have the ethical problems associated with clinical studies. The design of a microbial culture model varies from simple to sophisticated according to the purpose of the investigation. Each model is a compromise between the reality of the oral cavity and the simplification of the model. Researchers, however, can still obtain meaningful and useful results from the models they select. Laboratory microbial culture models can be categorized into closed systems and open systems. Models in the closed system have a finite supply of nutrients and are simple and cost-effective. Models in the open system enable the supply of fresh culture medium and the removal of metabolites and spent culture liquid simultaneously; they provide better regulation of the biofilm growth rate than models in the closed system. This review paper gives an overview of the dental plaque biofilm and the laboratory microbial culture models used for cariology research.
Berman, Jeanette; Smyth, Robyn
This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…
We present a conceptual model of design science research artifacts. The model views an artifact at three levels. At the artifact level a selected artifact is viewed as a combination of material and immaterial aspects and a set of representations hereof. At the design level the selected artifact...
E. Demerouti (Eva); A.B. Bakke (Arnold B.)
Motivation: The motivation of this overview is to present the state of the art of the Job Demands-Resources (JD-R) model whilst integrating the various contributions to the special issue. Research purpose: To provide an overview of the JD-R model, which incorporates many possible working
Crawley, Frank E.; Kobala, Thomas R., Jr.
Presents a summary of models and methods of attitude research which are embedded in the theoretical tenets of social psychology and in the broader framework of constructivism. Focuses on the construction of social reality rather than the construction of physical reality. Models include theory of reasoned action, theory of planned behavior, and…
This paper examines changing models of pedagogy by drawing on recent research with teachers and their students as well as theoretical developments. In relation to a participatory view of learning, the paper reviews existing pedagogical models that take little account of the use of information and communications technologies as well as those that…
Vol. 28, No. 6 (2011), pp. 2669-2673. ISSN 0264-9993. [Society for Nonlinear Dynamics and Econometrics Annual Conference, Washington DC, 16.03.2011-18.03.2011] R&D Projects: GA ČR GA402/09/0965. Institutional research plan: CEZ:AV0Z10750506. Keywords: Research and development * Growth * Bayesian model averaging. Subject RIV: AH - Economics. Impact factor: 0.701, year: 2011. http://library.utia.cas.cz/separaty/2011/E/horvath-research & development and growth a bayesian model averaging analysis.pdf
Sharma, Raju Prasad; Schuhmacher, Marta; Kumar, Vikas
Endocrine disruptor compounds (EDCs) are environmental chemicals that cause harmful effects through multiple mechanisms, interfering with the hormone system and resulting in alterations of homeostasis, reproduction and development. Many of these EDCs have concurrent exposure with crosstalk and common mechanisms, which may lead to dynamic interactions. To carry out risk assessment of EDC mixtures, it is important to know the detailed toxic pathways, receptor crosstalk and other factors such as the critical window of exposure. In this review, we summarize the major mechanisms of action of EDCs that interfere with the same or different classes of hormone in the same or different target organs by altering hormone synthesis, metabolism, binding and cellular action. To show the impact of EDCs on life-stage development, a case study on female fertility affecting germ cells is illustrated. Based on this discussion, major groups of EDCs are classified by target organ, mode of action and potential risk. Finally, a conceptual model of pharmacodynamic interaction is proposed to integrate the crosstalk and common mechanisms that modulate estrogen into a predictive mixture dosimetry model with dynamic interaction of the mixture. This review provides new insight for EDC risk assessment and can be used to develop next-generation PBPK/PD models for EDC mixture analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
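A common starting point for the kind of mixture dosimetry the authors propose is concentration addition, where each compound's concentration is rescaled by its own potency before a single shared dose-response curve is applied. The sketch below is a generic illustration of that principle, not the authors' PBPK/PD model; the EC50 values are arbitrary:

```python
def hill_effect(conc_eff, emax=1.0, hill=1.0):
    """Sigmoidal Emax (Hill) response to an effective concentration
    expressed in units of EC50."""
    return emax * conc_eff**hill / (1.0 + conc_eff**hill)

def mixture_effect(concentrations, ec50s):
    """Concentration addition: sum the potency-normalised
    concentrations, then apply one shared dose-response curve."""
    conc_eff = sum(c / ec50 for c, ec50 in zip(concentrations, ec50s))
    return hill_effect(conc_eff)

# Two compounds each at half its own EC50 act jointly like one
# compound at its EC50, i.e. they produce the half-maximal effect.
effect = mixture_effect([0.5, 1.0], [1.0, 2.0])  # conc_eff = 1.0
```

The dynamic interactions discussed in the review are precisely the cases where this additivity assumption breaks down, e.g. when one EDC alters the metabolism or receptor availability of another, which is why a pharmacodynamic interaction term is needed on top of plain concentration addition.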
Jetzer, Philippe; Tortora, Crescenzo
The thermodynamic and dynamical properties of a variable dark energy model with density scaling as ρ_x ∝ (1+z)^m, where z is the redshift, are discussed following the outline of Jetzer et al. [P. Jetzer, D. Puy, M. Signore, and C. Tortora, Gen. Relativ. Gravit. 43, 1083 (2011)]. These kinds of models are proven to lead to the creation/disruption of matter and radiation, which affect the cosmic evolution of both the matter and radiation components in the Universe. In particular, we have concentrated on the temperature-redshift relation of radiation, which has been constrained using a very recent collection of cosmic microwave background (CMB) temperature measurements up to z ∼ 3. For the first time, we have combined this observational probe with a set of independent measurements (Supernovae Ia distance moduli, CMB anisotropy, large-scale structure and observational data for the Hubble parameter), which are commonly adopted to constrain dark energy models. We find that, within the uncertainties, the model is indistinguishable from a cosmological constant which does not exchange any particles with other components. However, while temperature measurements and Supernovae Ia tend to predict slightly decaying models, the contrary happens if CMB data are included. Future observations, in particular measurements of the CMB temperature at large redshift, will allow firmer bounds to be placed on the effective equation-of-state parameter w_eff of this kind of dark energy model.
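The temperature-redshift test invoked here is usually phrased through a one-parameter deviation from adiabatic scaling; the following is a standard parametrization in the CMB-temperature literature, not necessarily the exact form derived by the authors:

```latex
T(z) = T_0\,(1+z)^{1-\beta}
```

Here β = 0 recovers the standard scaling T ∝ (1+z) of a photon gas that exchanges no particles with other components, while photon creation or disruption by a decaying dark energy component shifts β away from zero. Fitting measured T(z) values up to z ∼ 3 therefore bounds how strongly the dark energy sector can interact with radiation.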
Jetzer, Philippe; Tortora, Crescenzo
The thermodynamic and dynamical properties of a variable dark energy model with density scaling as ρx∝(1+z)m, z being the redshift, are discussed following the outline of Jetzer et al. [P. Jetzer, D. Puy, M. Signore, and C. Tortora, Gen. Relativ. Gravit. 43, 1083 (2011).GRGVA80001-770110.1007/s10714-010-1091-4]. These kinds of models are proven to lead to the creation/disruption of matter and radiation, which affect the cosmic evolution of both matter and radiation components in the Universe. In particular, we have concentrated on the temperature-redshift relation of radiation, which has been constrained using a very recent collection of cosmic microwave background (CMB) temperature measurements up to z˜3. For the first time, we have combined this observational probe with a set of independent measurements (Supernovae Ia distance moduli, CMB anisotropy, large-scale structure and observational data for the Hubble parameter), which are commonly adopted to constrain dark energy models. We find that, within the uncertainties, the model is indistinguishable from a cosmological constant which does not exchange any particles with other components. Anyway, while temperature measurements and Supernovae Ia tend to predict slightly decaying models, the contrary happens if CMB data are included. Future observations, in particular, measurements of CMB temperature at large redshift, will allow to give firmer bounds on the effective equation of state parameter weff of this kind of dark energy model.
Keyser, Daniel; Uccellini, Louis W.
A number of regional-scale numerical weather prediction models are discussed together with their application to the study of the structure and the dynamics of mesoscale phenomena. Consideration is given to investigations of natural phenomena (such as midlatitude cyclones and related baroclinic disturbances; upper-level jet-front systems; surface frontal zones, squall lines, and rain bands; mesoscale convective systems; and severe-storm environments) in which two operational models and four research models are used for regional-model studies. It is shown that these models provide investigators with four-dimensional dynamically consistent data sets to supplement and extend those available from observations.
Mehul S. Bhakta
Full Text Available The common bean is a tropical facultative short-day legume that is now grown in tropical and temperate zones. This observation underscores how domestication and modern breeding can change the adaptive phenology of a species. A key adaptive trait is the optimal timing of the transition from the vegetative to the reproductive stage. This trait is responsive to genetically controlled signal transduction pathways and local climatic cues. A comprehensive characterization of this trait can be started by assessing the quantitative contribution of the genetic and environmental factors, and their interactions. This study aimed to locate significant QTL (G) and environmental (E) factors controlling time-to-flower in the common bean, and to identify and measure G × E interactions. Phenotypic data were collected from a biparental [Andean × Mesoamerican] recombinant inbred population (F11:14, 188 genotypes) grown at five environmentally distinct sites. QTL analysis using a dense linkage map revealed 12 QTL, five of which showed significant interactions with the environment. Dissection of G × E interactions using a linear mixed-effect model revealed that temperature, solar radiation, and photoperiod play major roles in controlling common bean flowering time directly, and indirectly by modifying the effect of certain QTL. The model predicts flowering time across five sites with an adjusted r-square of 0.89 and root-mean-square error of 2.52 d. The model provides the means to disentangle the environmental dependencies of complex traits, and presents an opportunity to identify in silico QTL allele combinations that could yield desired phenotypes under different climatic conditions.
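The two fit statistics reported above (adjusted r-square and RMSE) are standard regression diagnostics. As a reminder of how they are computed for any fitted model, here is a minimal least-squares sketch on invented data; the single temperature covariate and its coefficients are illustrative, not the bean dataset or the paper's mixed-effect model:

```python
import numpy as np

def fit_and_score(X, y):
    """Ordinary least squares fit returning the two reported
    diagnostics: adjusted R-squared and root-mean-square error."""
    X1 = np.column_stack([np.ones(len(y)), X])      # add intercept
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, p = X1.shape
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p)   # penalise parameters
    rmse = float(np.sqrt(ss_res / n))
    return adj_r2, rmse

# Illustrative: flowering time driven linearly by temperature + noise
rng = np.random.default_rng(0)
temp = rng.uniform(10.0, 30.0, size=100)
days = 80.0 - 1.5 * temp + rng.normal(0.0, 2.0, size=100)
adj_r2, rmse = fit_and_score(temp.reshape(-1, 1), days)
```

Adjusted R-squared penalises the parameter count, which matters when comparing a 12-QTL model against smaller ones; RMSE in days gives the practically interpretable prediction error.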
Alzheimer's disease (AD) is a degenerative disease of the central nervous system, and its pathogenesis is complex. Animal models play an important role in studies of the pathogenesis and treatment of AD. This paper summarizes methods of building models, observations on animal models, and evaluation indices from recent years, so as to provide evidence for future basic and clinical research. DOI: 10.3969/j.issn.1672-6731.2015.08.003
Fried, Leanne; Mansfield, Caroline; Dobozy, Eva
This article reports on the development of a conceptual model of teacher emotion through a review of teacher emotion research published between 2003 and 2013. By examining 82 publications regarding teacher emotion, the main aim of the review was to identify how teacher emotion was conceptualised in the literature and develop a conceptual model to…
The 4th International Conference on Integrating GIS and Environmental Modeling (GIS/EM4) was convened in Banff, Canada, September 2-8, 2000 at The Banff Centre for Conferences. The meeting's purpose, like that of its predecessors, was to reformulate, every three to four years, the collaborative research agenda for integrating spatio-temporal analysis with environmental simulation modeling.
T.A. Knoch (Tobias); A. Abuseiris (Anis); M. Lesnussa (Michael); F.N. Kepper (Nick); R.M. de Graaf (Rob); F.G. Grosveld (Frank)
The amount of information is growing exponentially as ever-new technologies emerge, and is believed always to be at the limit. In contrast, huge resources are obviously available but underused in the IT sector, similar to, e.g., the renewable energy sector. Genome research is
Silva, Angélica Baptista; Morel, Carlos Médicis; Moraes, Ilara Hämmerli Sozzi de
To review the conceptual relationship between telehealth and translational research. A bibliographical search on telehealth was conducted in the Scopus, Cochrane BVS, LILACS and MEDLINE databases to find experiences of telehealth in conjunction with discussions of translational research in health. The search retrieved eight studies, which were analyzed using models of the five stages of translational research and the multiple strands of public health policy in the context of telehealth in Brazil. The models were applied to telehealth activities concerning the Network of Human Milk Banks within the Telemedicine University Network. The translational research cycle of human milk (collected, stored and distributed) involves several integrated telehealth initiatives, such as video conferencing and software and portals for synthesizing knowledge, composing elements of an information ecosystem mediated by information and communication technologies in the health system. Telehealth should comprise a set of activities in a computer-mediated network promoting the translation of knowledge between research and health services.
The article presents the first results of current research on works, series, and periodicals translated and adapted for the recently broadened Italian audience of the late 19th century, especially during the age of Positivism. It presents two case studies: 1) the translation and adaptation of geographical publications by Emilio Treves; 2) the reuse of images in educational publications for object lessons, mainly published by Hoepli, Vallardi, and Paravia. The two case studies are meant to give an account of publishers' strategies, influenced by the emergence of new readers and favoured by a still-undefined international copyright legislation. Publishers often translated and adapted texts from abroad in order to save money and satisfy their audience; by doing so, they acted as relevant transcultural mediators in an age of mass education. In the conclusions, the debate on the controversial reception of Positivism in Italy, which determines the time span of the article, is mentioned as likely to benefit from the study of the coeval book trade. Namely, it can be argued that the surprising diffusion of scientism at every rank of society during the so-called liberal age drew upon the described strategies of transcultural adaptation.
Douglas A Clark
From 1988 to 1992, wood bison (Bison bison athabascae) were transplanted to the southwest Yukon, inadvertently creating concerns among local First Nations about their impacts on other wildlife, habitat, and their members' traditional livelihoods. To understand these concerns, we conducted a participatory impact assessment based on a multistage analysis of existing and new qualitative data. We found that wood bison had since become a valued food resource, though there was a socially determined carrying capacity for this population. Study participants desire a population large enough to harvest sustainably while avoiding a threshold beyond which bison may alter the regional ecosystem. An alternative problem definition emerged that focuses on how wildlife and people alike are adapting to observed long-term changes in climate and landscape, suggesting that a wider range of acceptable policy alternatives likely exists than may have previously been thought. Collective identification of this new problem definition indicates that the assessment acted as a social learning process in which participants jointly discovered new perspectives on a problem at both the individual and organisational levels. Subsequent regulatory changes, based on this research, demonstrate the efficacy of participatory impact assessment for ameliorating human-wildlife conflicts.
The water uptake kinetics of cowpea seeds were studied under two water absorption treatments, common soaking and microwave treatment, to evaluate the effects of rehydration temperature and microwave output power on rehydration. Water uptake during soaking was studied at temperatures of 20-45 °C and at microwave output powers of 180-900 W. As the rehydration temperature and microwave output power increased, the water uptake of cowpea seeds increased and the rehydration time decreased. The Peleg and Richards models were capable of predicting the water uptake of cowpea seeds undergoing the common treatment and the microwave treatment, respectively. The effective diffusivity values were evaluated by fitting the experimental absorption data to Fick's second law of diffusion. The effective diffusivity coefficients for cowpea seeds varied from 7.75 × 10⁻¹¹ to 1.99 × 10⁻¹⁰ m²/s and from 2.23 × 10⁻⁹ to 9.78 × 10⁻⁹ m²/s for the common treatment and the microwave treatment, respectively. (author)
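The Peleg model used above, M(t) = M0 + t/(k1 + k2·t), can be fitted to soaking data by nonlinear least squares; the moisture curve and parameter values below are synthetic stand-ins for the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def peleg(t, m0, k1, k2):
    # Peleg model: moisture content M(t) = M0 + t / (k1 + k2 * t)
    # k1 is the Peleg rate constant, k2 the Peleg capacity constant
    return m0 + t / (k1 + k2 * t)

t = np.linspace(1.0, 600.0, 30)           # soaking time, min (illustrative)
m_obs = peleg(t, 0.12, 40.0, 0.9)         # synthetic moisture data (d.b.)
params, _ = curve_fit(peleg, t, m_obs, p0=[0.1, 30.0, 1.0])
print(np.round(params, 3))                # should recover ≈ [0.12, 40.0, 0.9]
```

With the true kinetics generated from known constants, the fit recovers them; with measured data the same call yields the treatment-specific k1 and k2 that determine the initial uptake rate and equilibrium moisture.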
Cassels, Susan; Goodreau, Steven M
HIV is transmitted within complex biobehavioral systems. Mathematical modeling can provide insight to complex population-level outcomes of various behaviors measured at an individual level. HIV models in the social and behavioral sciences can be categorized in a number of ways; here, we consider two classes of applications common in the field generally, and in the past year in particular: those models that explore significant behavioral determinants of HIV disparities within and between populations; and those models that seek to evaluate the potential impact of specific social and behavioral interventions. We discuss two overarching issues we see in the field: the need to further systematize effectiveness models of behavioral interventions, and the need for increasing investigation of the use of behavioral data in epidemic models. We believe that a recent initiative by the National Institutes of Health will qualitatively change the relationships between epidemic modeling and sociobehavioral prevention research in the coming years.
Holdaway, Daniel; Kent, James
The linearity of a selection of common advection schemes is tested and examined with a view to their use in the tangent linear and adjoint versions of an atmospheric general circulation model. The schemes are tested within a simple offline one-dimensional periodic domain as well as using a simplified and complete configuration of the linearised version of NASA's Goddard Earth Observing System version 5 (GEOS-5). All schemes which prevent the development of negative values and preserve the shape of the solution are confirmed to have nonlinear behaviour. The piecewise parabolic method (PPM) with certain flux limiters, including that used by default in GEOS-5, is found to support linear growth near the shocks. This property can cause the rapid development of unrealistically large perturbations within the tangent linear and adjoint models. It is shown that these schemes with flux limiters should not be used within the linearised version of a transport scheme. The results from tests using GEOS-5 show that the current default scheme (a version of PPM) is not suitable for the tangent linear and adjoint model, and that using a linear third-order scheme for the linearised model produces better behaviour. Using the third-order scheme for the linearised model improves the correlations between the linear and non-linear perturbation trajectories for cloud liquid water and cloud liquid ice in GEOS-5.
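The linearity test applied in the paper can be reproduced in miniature: a scheme S is linear if S(u1 + u2) = S(u1) + S(u2) for all inputs. The sketch below (independent of GEOS-5) contrasts first-order upwind advection, which is linear, with a minmod-limited MUSCL-type step, which is not.

```python
import numpy as np

def upwind_step(u, c):
    # First-order upwind on a periodic grid, Courant number c: linear in u.
    return u - c * (u - np.roll(u, 1))

def minmod(a, b):
    # Minmod slope limiter: nonlinear in its arguments.
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def limited_step(u, c):
    # MUSCL-type step with minmod-limited left face values: nonlinear in u.
    du = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    ul = u + 0.5 * (1.0 - c) * du        # reconstructed value at face i+1/2
    flux = c * ul                        # upwind flux through face i+1/2
    return u - (flux - np.roll(flux, 1))

rng = np.random.default_rng(0)
u1, u2 = rng.random(64), rng.random(64)
c = 0.5
lin_err_up = np.max(np.abs(upwind_step(u1 + u2, c)
                           - upwind_step(u1, c) - upwind_step(u2, c)))
lin_err_lim = np.max(np.abs(limited_step(u1 + u2, c)
                            - limited_step(u1, c) - limited_step(u2, c)))
print(lin_err_up, lin_err_lim)   # upwind ~ machine zero; limited scheme is not
```

The nonzero residual for the limited scheme is exactly the behaviour that makes shape-preserving limiters problematic inside tangent linear and adjoint models.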
Asmi, A.; Sorvari, S.; Kutsch, W. L.; Laj, P.
European long-term environmental research infrastructures (often referred to as ESFRI RIs) are the core facilities providing services for scientists in their quest to understand and predict the complex Earth system and its functioning, which requires long-term efforts to identify environmental changes (trends, thresholds and resilience, interactions and feedbacks). Many of the research infrastructures were originally developed to respond to the needs of their specific research communities; however, it is clear that strong collaboration among research infrastructures is needed to serve trans-boundary research, which requires exploring scientific questions at the intersection of different scientific fields, conducting joint research projects and developing concepts, devices, and methods that can be used to integrate knowledge. European environmental research infrastructures have already worked together successfully for many years and have established a cluster - the ENVRI cluster - for their collaborative work. The ENVRI cluster acts as a collaborative platform where the RIs can jointly agree on common solutions for their operations, draft strategies and policies, and share best practices and knowledge. The supporting project for the ENVRI cluster, the ENVRIplus project, brings together 21 European research infrastructures and infrastructure networks to work on joint technical solutions, data interoperability, access management, training, strategies and dissemination efforts. The ENVRI cluster acts as a one-stop shop for multidisciplinary RI users and other collaborative initiatives, projects and programmes, and coordinates and implements jointly agreed RI strategies.
Research purpose: To provide an overview of the JD–R model, which incorporates many possible working conditions and focuses on both negative and positive indicators of employee well-being. Moreover, the studies of the special issue are introduced. Research design: Qualitative and quantitative studies on the JD–R model were reviewed to elucidate the health and motivational processes suggested by the model. Main findings: In addition to confirming the two processes suggested by the JD–R model, the studies of the special issue showed that the model can be used to predict workplace bullying, incidences of upper respiratory tract infection, work-based identity, and early retirement intentions. Moreover, whilst psychological safety climate could be considered a hypothetical precursor of job demands and resources, compassion satisfaction moderated the health process of the model. Contribution/value-add: The findings of previous studies and the studies of the special issue were integrated into the JD–R model, which can be used to predict well-being and performance at work. New avenues for future research were suggested. Practical/managerial implications: The JD–R model is a framework that organisations can use to improve employee health and motivation, whilst simultaneously improving various organisational outcomes.
here. Conclusions: A strong, equitable collaboration between clinical and academic partners working towards a common outcome can enhance the use of research within the healthcare workforce and contribute actively to the research process. A set of propositions is specified to facilitate both the transferability of this partnership model to other professional groups and clinical teams and the evaluation of the model components.
• COGNAC: molecular dynamics program by NAgoya Cooperation
• PASTA: Polymer rheology Analyzer with Slip-link model of entanglement
• Simulation... FEM, and self-consistent field method.
Detailed descriptions of the four simulation programs are below:
• COGNAC: a molecular dynamics program that... code
2. Available on Windows, Linux and MacOSX operating systems
3. Common GUI
4. COGNAC
   a. Density-biased Monte Carlo and density-biased
Jing Ying Hoo
The utilization of zebrafish in biomedical research is now very common, and it has emerged as a favored vertebrate organism for research in reproductive science. There has been significant growth in the number of scientific publications pertaining to research discoveries in reproductive science in zebrafish, implying the importance of zebrafish in this particular field of research. In essence, the currently available literature covers everything from the specific brain regions and neurons of zebrafish responsible for reproductive regulation down to the gonadal level of the animal. The discoveries and findings have shown that this small animal shares a very similar reproductive system with mammals. More interestingly, the behavioral characteristics of zebrafish, along with the establishment of a categorization of animal courtship behavior, have laid an even stronger foundation and firmer rationale for the use of zebrafish in reproductive science research. In view of the immense importance of this small animal for the development of reproductive science, this review aims at compiling and describing the close similarity of reproductive regulation in zebrafish and humans, along with factors contributing to infertility, showing its versatility and its potential use in fertility research.
Electronic power transformers (EPTs) have been identified as emerging intelligent electronic devices in the future smart grid, e.g., the Energy Internet, especially in the application of renewable energy conversion and management. Considering that the EPT is directly connected to the medium-voltage grid, e.g., a 10 kV distribution system, and has a cascaded H-bridge structure, the common mode voltage (CMV) issue becomes more complex and severe. The CMV threatens the insulation of the entire EPT device and can even produce common mode current. This paper investigates the generation mechanism and characteristics of the CMV in a cascaded H-bridge EPT (CHB-EPT) under both balanced and faulted grid conditions. First, the CHB-EPT system is introduced. Then, a three-phase simplified circuit model of the high-voltage side of the EPT system is presented. Combined with a unipolar modulation strategy and carrier phase-shifting technology, the EPT internal CMV and its characteristics are obtained by rigorous mathematical analysis and derivation. Moreover, the influence of the sinusoidal pulse width modulation dead time is considered and discussed based on analytical calculation. Finally, simulation results are provided to verify the validity of the aforementioned model and the analysis results. The proposed theoretical analysis method is also suitable for other similar cascaded converters and can provide useful theoretical guidance for structural design and power density optimization.
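The common mode voltage concept can be illustrated with a simple two-level three-phase PWM sketch (not the paper's cascaded H-bridge analysis); here the CMV is taken as the average of the three leg voltages, and all waveform parameters are illustrative.

```python
import numpy as np

# Two-level three-phase sine-triangle PWM; CMV = (va + vb + vc) / 3.
# Parameters are illustrative: 50 Hz fundamental, 2 kHz carrier, 1 V bus.
fs, f0, fc, vdc = 200_000, 50, 2000, 1.0
t = np.arange(0.0, 0.02, 1.0 / fs)                     # one fundamental period
carrier = 4.0 * np.abs((fc * t) % 1.0 - 0.5) - 1.0     # triangle in [-1, 1]
refs = [0.8 * np.sin(2 * np.pi * f0 * t - k * 2 * np.pi / 3) for k in range(3)]
legs = [np.where(r > carrier, vdc / 2, -vdc / 2) for r in refs]
cmv = sum(legs) / 3
print(cmv.min(), cmv.max())   # CMV steps between levels ±vdc/6 and ±vdc/2
```

Each leg only outputs ±vdc/2, so the CMV is a stepped waveform at switching frequency taking the four levels ±vdc/6 and ±vdc/2; in a cascaded H-bridge with phase-shifted carriers the same averaging argument produces many more levels, which is what the paper analyzes.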
Zhang, Si-Qi; Lu, Jing-Bin; Li, Hong; Liu, Ji-Ping; Zhang, Xiao-Ru; Liu, Han; Liang, Yu; Ma, Ji; Liu, Xiao-Jing; Wu, Xiang-Yao
In this paper, we study the evolution of a two-level atomic system whose initial state is the excited state. For different environmental reservoir models (the single Lorentzian, ideal photonic band-gap, double Lorentzian and square Lorentzian reservoirs), we investigate the influence of the reservoir on the evolution of the energy-level population. With static (no) modulation, comparing the four environmental models, the population oscillation of the square Lorentzian reservoir model is fastest and the decoherence of the atomic system is slowest. Under dynamic modulation, comparing the photonic band-gap model with the single Lorentzian reservoir model, the time for the atom to decay to the ground state is longer for the photonic band-gap model, regardless of the form of dynamic modulation. These conclusions make it feasible to use changes in the environment to modulate the coherent evolution of an atomic system.
Equivalent circuit models are a hot research topic in the field of lithium-ion batteries for electric vehicles, and scholars have proposed a variety of equivalent circuit models, from simple to complex. On one hand, a simple model cannot simulate the dynamic characteristics of batteries; on the other hand, it is difficult to apply a complex model in a real-time system. At present, there are few systematic comparative studies on equivalent circuit models of lithium-ion batteries. The representative first-order resistor-capacitor (RC) model and second-order RC model commonly used in the literature are studied comparatively in this paper. Firstly, the parameters of the two models are identified experimentally; secondly, the simulation model is built in the Matlab/Simulink environment; and finally, the output precision of the two models is verified against experimental data. The results show that under constant current conditions, the maximum error of the first-order RC model is 1.65% and the maximum error of the second-order RC model is 1.22%. Under the urban dynamometer driving schedule (UDDS) condition, the maximum error of the first-order RC model is 1.88%, and for the second-order RC model the maximum error is 1.69%. These results provide practical guidance for applying equivalent circuit models of lithium-ion batteries in battery management systems for electric vehicles.
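A first-order RC model like the one compared above can be simulated in a few lines; the parameter values below are illustrative, not the identified values from the paper.

```python
import numpy as np

# Hypothetical first-order RC parameters (ohm, ohm, farad, volt); the OCV
# is held constant here for simplicity, ignoring its SOC dependence.
R0, R1, C1, ocv = 0.05, 0.02, 1000.0, 3.7
dt = 1.0                         # time step, s
a = np.exp(-dt / (R1 * C1))      # exact discretization of the RC branch

def simulate(current):
    # current: discharge current in A (positive = discharge).
    # Terminal voltage: v = OCV - i*R0 - v1, with v1 the RC polarization.
    v1, out = 0.0, []
    for i in current:
        out.append(ocv - i * R0 - v1)
        v1 = a * v1 + R1 * (1.0 - a) * i   # RC state update
    return np.array(out)

i_pulse = np.r_[np.zeros(5), np.full(60, 2.0)]   # 5 s rest, then 2 A pulse
v = simulate(i_pulse)
print(v[5], v[-1])   # instantaneous IR drop, then slow RC relaxation
```

The pulse response separates the two effects the paper's comparison hinges on: the ohmic resistance R0 gives the instantaneous voltage step, while the RC branch produces the slower exponential sag that a second-order model captures with one more branch.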
Tania, Mousumi; Khan, Md Asaduzzaman; Xia, Kun
Autism, a lifelong neurodevelopmental disorder, is a uniquely human condition. Animal models are not perfect tools for a full understanding of human development and behavior, but they can be an important place to start. This review focuses on recent updates in animal model research in autism. We have reviewed the publications of the last three decades that relate to animal model studies in autism. Animal models are important because they allow researchers to study the underlying neurobiology in a way that is not possible in humans. Improving the availability of better animal models will help the field to increase the development of medicines that can relieve disabling symptoms. Results from therapeutic approaches are remarkably encouraging, since some behavioral alterations could be reversed even when treatment was performed on adult mice. Finding an animal model system with behavioral tendencies similar to those of humans is thus vital for understanding the brain mechanisms supporting social motivation and attention, and the manner in which these mechanisms break down in autism. Ongoing studies should therefore increase the understanding of the biological alterations associated with autism as well as the development of knowledge-based treatments for those struggling with autism. In this review, we present recent advances in research based on animal models of autism, raising hope for understanding the disease biology and for potential therapeutic intervention to improve the quality of life of individuals with autism.
Purpose. To perform a numerical analysis of the distribution of the factual contributions of line sources of distortion to the voltage distortion at the point of common coupling, based on the principles of superposition and exclusion. Methodology. The numerical analysis was performed on the results of simulating the steady-state operation of a power supply system with seven electricity consumers. Results. The mathematical models for determining the factual contribution of line sources of distortion to the voltage distortion at the point of common coupling, based on the principles of superposition and exclusion, are equivalent. To assess the degree of participation of each source of distortion in the voltage distortion at the point of common coupling, and to distribute financial compensation to the injured party among all sources of distortion, a one-dimensional criterion based on the scalar product of vectors was developed. For a group of distortion sources belonging to one subject of the energy market that cannot be accounted for individually, their total factual contribution is determined as the residual between the total distortion and the contributions of all other sources. Originality. The simulation of the power supply system mode was carried out in the phase-components space, taking into account the distributed characteristics of the distortion sources. Practical value. The results of the research can be used to develop methods and tools for distributed measurement and analytical systems for power quality assessment.
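The scalar-product criterion can be illustrated with a toy calculation; the phasor values below are invented, and the projection formula is one plausible reading of the criterion, not necessarily the authors' exact definition.

```python
import numpy as np

# Hypothetical harmonic-voltage phasors contributed at the PCC by three
# distortion sources, each obtained by superposition (solving the linear
# network with only that source active).
u = np.array([2.0 + 1.0j, -0.5 + 0.8j, 1.0 - 0.3j])   # volts, one per source
u_pcc = u.sum()                                        # total PCC distortion

# Scalar-product criterion: project each contribution onto the total,
# giving real shares that sum to 1, used to apportion responsibility.
share = (u * np.conj(u_pcc)).real / abs(u_pcc) ** 2
print(np.round(share, 3), share.sum())
```

Note that a share can be negative (as for the second source here), meaning that source actually reduces the resulting distortion at the point of common coupling, which is exactly why a signed, one-dimensional criterion is useful for distributing compensation.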
Rice, K. M.; Fannin, J. C.; Gillette, C.; Blough, E. R.
Cardiovascular disease is the leading cause of death in women in the United States. Aging is a primary risk factor for the development of cardiovascular disease as well as cardiovascular-related morbidity and mortality. Aging is a universal process that all humans undergo; however, research in aging is limited by cost and time constraints. Therefore, most research in aging has been done in primates and rodents; however, it is unknown how well the effects of aging in rat models translate into h...
It is difficult for researchers of ad hoc networks to conduct actual deployments during the experimental stage, as the network topology is changeable and the locations of nodes are unfixed. Thus simulation remains the main research method for such networks. The mobility model is an important component of ad hoc network simulation. It describes the movement pattern of nodes (including location, velocity, etc.) and decides the movement trail of nodes, serving as an abstraction of node movement modes. Therefore, a mobility model that simulates node movement is an important foundation for simulation research. In ad hoc network research, the mobility model should reflect the movement law of nodes as truly as possible. In this paper, a node generally refers to the wireless equipment people carry. The main research contents include how nodes avoid obstacles during movement and the impacts of obstacles on the mutual relations among nodes, based on which a Node Self-Avoiding Obstacle (NASO) model is established for ad hoc networks.
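As a concrete baseline for such mobility models, here is a minimal random-waypoint sketch (a standard model without the obstacle avoidance that NASO adds; all parameters are illustrative):

```python
import random

# Random-waypoint mobility: each node repeatedly picks a random destination
# and speed, then moves toward it in discrete time steps. This is the
# common baseline model; NASO extends such models with obstacle handling.
def random_waypoint(steps, area=(100.0, 100.0), vmax=5.0, dt=1.0, seed=0):
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    dest, speed, trail = None, 0.0, []
    for _ in range(steps):
        if dest is None:
            dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
            speed = rng.uniform(0.1, vmax)
        dx, dy = dest[0] - x, dest[1] - y
        dist = (dx * dx + dy * dy) ** 0.5
        step = speed * dt
        if dist <= step:          # arrived: pick a new waypoint next tick
            x, y, dest = dest[0], dest[1], None
        else:                     # move toward the waypoint at constant speed
            x, y = x + step * dx / dist, y + step * dy / dist
        trail.append((x, y))
    return trail

trail = random_waypoint(200)
```

An obstacle-aware model such as NASO would replace the straight-line motion toward the waypoint with paths that detour around forbidden regions, which also changes which node pairs can be in radio contact.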
Janice L. Coen; Marques Cameron; John Michalakes; Edward G. Patton; Philip J. Riggan; Kara M. Yedinak
A wildland fire behavior module (WRF-Fire) was integrated into the Weather Research and Forecasting (WRF) public domain numerical weather prediction model. The fire module is a surface fire behavior model that is two-way coupled with the atmospheric model. Near-surface winds from the atmospheric model are interpolated to a finer fire grid and used, with fuel properties...
Davis, Sean D; Lebow, Jay L; Sprenkle, Douglas H
Though it is clear from meta-analytic research that couple therapy works well, it is less clear how couple therapy works. Efforts to attribute change to the unique ingredients of a particular model have consistently turned up short, leading many researchers to suggest that change is due to common factors that run through different treatment approaches and settings. The purpose of this article is to provide an empirically based case for several common factors in couple therapy, and discuss clinical, training, and research implications for a common factors couple therapy paradigm. Critical distinctions between model-driven and common factors paradigms are also discussed, and a moderate common factors approach is proposed as a more useful alternative to an extreme common factors approach. Copyright © 2011. Published by Elsevier Ltd.
Borkhardt, A.; Sanchez-Garcia, I.; Cobaleda, C.; Hauer, J.
The term "leukemia" encompasses a group of diseases with variable clinical and pathological presentations. Its cellular origin, its biology and the underlying molecular genetic alterations determine the highly variable and individual disease phenotype. The focus of this review is to discuss the most important guidelines to be taken into account when aiming to develop an "ideal" animal model for the study of leukemia. The animal model should mimic all the clinical, histological and molecular genetic characteristics of the human phenotype and should be applicable as a clinically predictive model. It should meet all the requirements of a standardized model adaptable to basic research as well as to pharmaceutical practice. Furthermore, it should fulfill all the criteria for investigating environmental risk factors and the role of genomic mutations, and be applicable for therapeutic testing. These constraints limit the usefulness of some existing animal models, which are nevertheless very valuable for basic research. Hence, in this review we primarily focus on genetically engineered mouse models (GEMMs) for studying the most frequent types of childhood leukemia. GEMMs are robust models with relatively low site-specific variability which can, with the help of the latest gene-modulating tools, be adapted to individual clinical and research questions. Moreover, they offer the possibility to restrict oncogene expression to a defined target population and to regulate its expression level as well as its timely activity. Until recently it was only possible in individual cases to develop a murine model which fulfills the above-mentioned requirements. Hence, the development of new regulatory elements to control targeted oncogene expression should be a priority. Tightly controlled and cell-specific oncogene expression can then be combined with a knock-in approach and will yield a robust murine model which enables almost physiologic oncogene
The paper presents the results of theoretical discussions and research findings in the field of designing sustainable business models that support the creation of value at various stages of the business life cycle. The paper presents selected findings of extensive research into the business models of Polish companies listed on the Warsaw Stock Exchange. Companies at various stages of development should build and adapt their business models in order to maintain the ability to create value for stakeholders. The characteristics of business models at the early stages of development differ from those at mature stages. The paper highlights the differences in business models in the context of the company life cycle and sustainability criteria. The research findings show that a company's development can be seen from the point of view of its business model. The research concentrated on identifying the key attributes and configurations of business models appropriate for the early stage of development as well as for the maturity stage. It was found that the business models of early-stage companies listed on the Warsaw Stock Exchange are oriented primarily to how the company shapes, delivers, and captures value from the market in order to generate profits for shareholders and increase the value of the company, while the business models of mature companies include management's intentions to balance objectives with respect to different groups of stakeholders and to carefully formulate and implement business objectives, with particular attention paid to preserving the sustainability of the business. The assessment of business models from the point of view of the life cycle shows that managers change their approach to configuring business models over time; at some point, they include management intentions aimed at a broader range of goals than merely
Caro, J Jaime; Briggs, Andrew H; Siebert, Uwe; Kuntz, Karen M
Models--mathematical frameworks that facilitate estimation of the consequences of health care decisions--have become essential tools for health technology assessment. Evolution of the methods since the first ISPOR Modeling Task Force reported in 2003 has led to a new Task Force, jointly convened with the Society for Medical Decision Making, and this series of seven articles presents the updated recommendations for best practices in conceptualizing models; implementing state-transition approaches, discrete event simulations, or dynamic transmission models; and dealing with uncertainty and validating and reporting models transparently. This overview article introduces the work of the Task Force, provides all the recommendations, and discusses some quandaries that require further elucidation. The audience for these articles includes those who build models, stakeholders who utilize their results, and, indeed, anyone concerned with the use of models to support decision making. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
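As a minimal illustration of the state-transition approach mentioned above, the sketch below runs a toy three-state Markov cohort model; the states, transition probabilities, and time horizon are invented for illustration, not drawn from the Task Force reports.

```python
import numpy as np

# Toy three-state Markov cohort model (Healthy, Sick, Dead) with
# hypothetical annual transition probabilities; each row sums to 1.
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
cohort = np.array([1.0, 0.0, 0.0])   # the whole cohort starts Healthy
life_years = 0.0
for _ in range(40):                  # 40 annual cycles
    life_years += cohort[:2].sum()   # alive states each accrue one year
    cohort = cohort @ P              # advance the cohort one cycle
print(round(life_years, 2))          # undiscounted life-years per person
```

Real health-technology-assessment models add costs and utilities per state, discounting, and comparator strategies, but the core mechanics (a cohort vector advanced by a transition matrix) are exactly this loop.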
Murthy, Mahadev; Ram, Jeffrey L
Invertebrate model systems, such as nematodes and fruit flies, have provided valuable information about the genetics and cellular biology involved in aging. However, limitations of these simple, genetically tractable organisms suggest the need for other model systems, some of them invertebrate, to facilitate further advances in the understanding of mechanisms of aging and longevity in mammals, including humans. This paper introduces 10 review articles about the use of invertebrate model systems for the study of aging by authors who participated in an 'NIA-NIH symposium on aging in invertebrate model systems' at the 2013 International Congress for Invertebrate Reproduction and Development. In contrast to the highly derived characteristics of nematodes and fruit flies as members of the superphylum Ecdysozoa, cnidarians, such as Hydra, are more 'basal' organisms that have a greater number of genetic orthologs in common with humans. Moreover, some other new model systems, such as the urochordate Botryllus schlosseri, the tunicate Ciona, and the sea urchins (Echinodermata), are members of the Deuterostomia, the same superphylum that includes all vertebrates, and thus have mechanisms that are likely to be more closely related to those occurring in humans. Additional characteristics of these new model systems, such as the recent development of new molecular and genetic tools and a pattern of regeneration and stem cell function more similar to that of humans, suggest that these new model systems may have unique advantages for the study of mechanisms of aging and longevity.
The common marmoset (Callithrix jacchus) is increasingly being utilised as a nonhuman primate model for human disease, ranging from autoimmune to infectious disease. In order to fully exploit these models, meaningful comparison to the human host response is necessary. Commercially available reagents, primarily targeted to human cells, were utilised to assess the phenotype and activation status of key immune cell types and cytokines in naive and infected animals. Single cell suspensions of blood, spleen, and lung were examined. Generally, the phenotype of cells was comparable between humans and marmosets, with approximately 63% of all lymphocytes in the blood of marmosets being T cells, 25% B cells, and 12% NK cells. The percentage of neutrophils in marmoset blood was more similar to human values than to mouse values. Comparison of the activation status of cells following experimental systemic or inhalational infection exhibited different trends in different tissues, most obvious in cell types active in the innate immune response. This work significantly enhances the ability to understand the immune response in these animals and fortifies their use as models of infectious disease.
Truth is for sale today, some critics claim. The increased commodification of science corrupts it, scientific fraud is rampant and the age-old trust in science is shattered. This cynical view, although gaining in prominence, does not explain very well the surprising motivation and integrity that is still central to the scientific life. Although…
Background: The work of Research Ethics Boards (REBs), especially when involving genetics research and biobanks, has become more challenging with the growth of biotechnology and biomedical research. Some REBs have even rejected research projects where the use of a biobank with coded samples was an integral part of the study, the greatest fear being the lack of participant protection and uncontrolled use of biological samples or related genetic data. The risks of discrimination and stigmatization are a recurrent issue. In light of the increasing interest in biomedical research and the resulting benefits to the health of participants, it is imperative that practical solutions be found to the problems associated with the management of biobanks: namely, protecting the integrity of the research participants, as well as guaranteeing the security and confidentiality of the participants' information. Methods: We aimed to devise a practical and efficient model for the management of biobanks in biomedical research in which a medical archivist plays the pivotal role as a data-protection officer. The model had to reduce the burden placed on REBs responsible for the evaluation of genetics projects and, at the same time, maximize the protection of research participants. Results: The proposed model (1) provides a means of protecting the information in biobanks, (2) offers ways to provide follow-up information requested about the participants, (3) protects the participants' confidentiality and (4) adequately deals with the ethical issues at stake in biobanking. Conclusion: Until a governmental governance body is established in Quebec to guarantee the protection of research participants and establish harmonized guidelines for the management of biobanks in medical research, it is definitely up to REBs to find solutions to the problems that the present lack of guidelines poses. The model presented in this article offers a practical day-to-day solution for REBs.
Online language teaching is continuing to grow in importance both in the academic world and the public sector, as language learners desire to increase their language skills within an increasingly digital landscape in both academia and the corporate world. While the current landscape of the online language learning community is still highly saturated with more commonly taught languages like Spanish, French, Chinese, etc., members of the less commonly taught languages (LCTL) community are looking to take advantage of the same benefits of providing online courses, such as increasing marketability to a larger learner audience, supporting self-paced learning, increasing learner autonomy, and increasing individual interactivity, as well as additional LCTL-specific benefits, such as helping sustain LCTLs with low enrollment. In this paper, we seek to describe one program's experience designing and developing three introductory online courses in Dari, Pashto, and Uyghur, detailing our process from research and inception to the prototype development phase.
This paper describes a framework used by the National Institute for Nursing in Oxford to integrate research, development and practice. With the increasing attention given to the topic of how research findings are implemented into clinical practice, it was felt important to share the challenges that have arisen in attempting to combine traditional research activities with more practice-based development work. The emerging conceptual framework, structures and functions are described, highlighting the variety of partnerships to be established in order to achieve the goal of integrating research into practice. While the underpinning principles of the framework--generating knowledge, implementing research into practice and evaluating the effectiveness of programmes--are not new, it is the way they have been combined within an organisational structure that could be helpful to others considering such a strategy. Both the strengths and weaknesses of the model are discussed, a number of conclusions drawn as to its robustness and consideration given to its replication.
Bernard, Michael Lewis [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
A common problem associated with the effort to better assess potential behaviors of various individuals within different countries is the sheer difficulty in comprehending the dynamic nature of populations, particularly over time and considering feedback effects. This paper discusses a theory-based analytical capability designed to enable analysts to better assess the influence of events on individuals interacting within a country or region. These events can include changes in policy, man-made or natural disasters, migration, war, or other changes in environmental/economic conditions. In addition, this paper describes potential extensions of this type of research to enable more timely and accurate assessments.
Ravid, Katya; Seta, Francesca; Center, David; Waters, Gloria; Coleman, David
Team science has been recognized as critical to solving increasingly complex biomedical problems and advancing discoveries in the prevention, diagnosis, and treatment of human disease. In 2009, the Evans Center for Interdisciplinary Biomedical Research (ECIBR) was established in the Department of Medicine at Boston University School of Medicine as a new organizational paradigm to promote interdisciplinary team science. The ECIBR is made up of affinity research collaboratives (ARCs), consisting of investigators from different departments and disciplines who come together to study biomedical problems that are relevant to human disease and not under interdisciplinary investigation at the university. Importantly, research areas are identified by investigators according to their shared interests. ARC proposals are evaluated by a peer review process, and collaboratives are funded annually for up to three years. Initial outcomes of the first 12 ARCs show the value of this model in fostering successful biomedical collaborations that lead to publications, extramural grants, research networking, and training. The most successful ARCs have been developed into more sustainable organizational entities, including centers, research cores, translational research projects, and training programs. To further expand team science at Boston University, the Interdisciplinary Biomedical Research Office was established in 2015 to more fully engage the entire university, not just the medical campus, in interdisciplinary research using the ARC mechanism. This approach to promoting team science may be useful to other academic organizations seeking to expand interdisciplinary research at their institutions.
Dziak, John J; Li, Runze; Zimmerman, Marc A; Buu, Anne
Ordinal responses are very common in longitudinal data collected from substance abuse research or other behavioral research. This study develops a new statistical model with free SAS macros that can be applied to characterize time-varying effects on ordinal responses. Our simulation study shows that the ordinal-scale time-varying effects model has very low estimation bias and sometimes offers considerably better performance when fitting data with ordinal responses than a model that treats the response as continuous. Contrary to a common assumption that an ordinal scale with several levels can be treated as continuous, our results indicate that it is not so much the number of levels on the ordinal scale but rather the skewness of the distribution that makes a difference on relative performance of linear versus ordinal models. We use longitudinal data from a well-known study on youth at high risk for substance abuse as a motivating example to demonstrate that the proposed model can characterize the time-varying effect of negative peer influences on alcohol use in a way that is more consistent with the developmental theory and existing literature, in comparison with the linear time-varying effect model. Copyright © 2014 John Wiley & Sons, Ltd.
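The proposed model is implemented with SAS macros; as an illustration only, the cumulative-logit (proportional-odds) machinery it builds on can be sketched in a few lines of Python. The Gaussian shape of the time-varying coefficient, the cutpoints, and all numbers below are assumptions for demonstration, not the paper's estimates.

```python
import math

def cumulative_logit_probs(eta, cutpoints):
    """Category probabilities under a proportional-odds (cumulative logit) model:
    P(Y <= k) = logistic(c_k - eta); per-category probabilities are the differences."""
    cdf = [0.0] + [1.0 / (1.0 + math.exp(-(c - eta))) for c in cutpoints] + [1.0]
    return [hi - lo for lo, hi in zip(cdf, cdf[1:])]

# Hypothetical time-varying coefficient: a peer-influence effect peaking in mid-adolescence.
def beta(t):
    return 0.8 * math.exp(-0.5 * (t - 16.0) ** 2)

eta = beta(16.0) * 1.0                       # linear predictor at age 16, standardized score 1
p = cumulative_logit_probs(eta, [-1.0, 0.5, 2.0])   # probabilities over 4 ordinal levels
```

Note that the model yields a full probability distribution over the ordinal levels; a linear time-varying effect model would instead predict a single mean on an artificial continuous scale, which is where the skewness-driven bias discussed above arises.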
Uslar, Mathias; Rohjans, Sebastian; Trefke, Jörn; Vasquez Gonzalez, Jose Manuel
Within the Smart Grid, the combination of automation equipment, communication technology and IT is crucial. Interoperability of devices and systems can be seen as the key enabler of smart grids. Therefore, international initiatives have been started in order to identify interoperability core standards for Smart Grids. IEC 62357, the so called Seamless Integration Architecture, is one of these very core standards, which has been identified by recent Smart Grid initiatives and roadmaps to be essential for building and managing intelligent power systems. The Seamless Integration Architecture provides an overview of the interoperability and relations between further standards from IEC TC 57 like the IEC 61970/61968: Common Information Model - CIM. CIM has proven to be a mature standard for interoperability and engineering; consequently, it is a cornerstone of the IEC Smart Grid Standardization Roadmap. This book provides an overview on how the CIM developed, in which international projects and roadmaps is h...
Rao, Narasimha D.; van Ruijven, Bas J.; Riahi, Keywan; Bosetti, Valentina
As climate change progresses, the risk of adverse impacts on vulnerable populations is growing. As governments seek increased and drastic action, policymakers are likely to seek quantification of climate-change impacts and the consequences of mitigation policies on these populations. Current models used in climate research have a limited ability to represent the poor and vulnerable, or the different dimensions along which they face these risks. Best practices need to be adopted more widely, and new model features that incorporate social heterogeneity and different policy mechanisms need to be developed. Increased collaboration between modellers, economists, and other social scientists could aid these developments.
Wender, Ben A.; Prado-Lopez, Valentina; Fantke, Peter
Product developers using life cycle toxicity characterization models to understand the potential impacts of chemical emissions face serious challenges related to large data demands and high input data uncertainty. This motivates greater focus on model sensitivity toward input parameter variability to guide research efforts in data refinement and design of experiments for existing and emerging chemicals alike. This study presents a sensitivity-based approach for estimating toxicity characterization factors given high input data uncertainty and using the results to prioritize data collection according to parameter influence on characterization factors (CFs). Proof of concept is illustrated with the UNEP-SETAC scientific consensus model USEtox.
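The idea of ranking inputs by their influence on a CF can be sketched with a toy multiplicative model. The formula, parameter names, and uncertainty spreads below are illustrative assumptions, not the USEtox equations.

```python
import math

def elasticity(f, params, name, rel=1e-6):
    """Finite-difference elasticity d ln f / d ln p: fractional change in
    output per fractional change in one input, all others held fixed."""
    base = f(params)
    bumped = dict(params, **{name: params[name] * (1.0 + rel)})
    return (f(bumped) - base) / (base * rel)

# Toy characterization-factor model: CF grows with environmental residence
# time and intake fraction, and shrinks with the effect dose (ec50).
def cf(p):
    return p["residence_time"] * p["intake_fraction"] / p["ec50"]

params = {"residence_time": 30.0, "intake_fraction": 1e-3, "ec50": 2.5}
spread = {"residence_time": 1.5, "intake_fraction": 3.0, "ec50": 10.0}  # GSD-like uncertainty

# Prioritize data collection where |sensitivity| x log-uncertainty is largest.
score = {n: abs(elasticity(cf, params, n)) * math.log(spread[n]) for n in params}
ranked = sorted(score, key=score.get, reverse=True)
```

In this sketch all elasticities are close to 1 (the model is multiplicative), so the ranking is driven by the uncertainty spread alone; in a nonlinear model the elasticities would differentiate the parameters as well.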
Garrett, B.C.; Dixon, D.A.; Dunning, T.H.
The Pacific Northwest National Laboratory (PNNL) has established the Environmental Molecular Sciences Laboratory (EMSL). In April 1994, construction began on the new EMSL, a collaborative research facility devoted to advancing the understanding of environmental molecular science. Research in the Theory, Modeling, and Simulation (TM and S) program will play a critical role in understanding molecular processes important in restoring DOE's research, development, and production sites, including understanding the migration and reactions of contaminants in soils and ground water, developing processes for isolation and processing of pollutants, developing improved materials for waste storage, understanding the enzymatic reactions involved in the biodegradation of contaminants, and understanding the interaction of hazardous chemicals with living organisms. The research objectives of the TM and S program are fivefold: to apply available electronic structure and dynamics techniques to study fundamental molecular processes involved in the chemistry of natural and contaminated systems; to extend current electronic structure and dynamics techniques to treat molecular systems of future importance and to develop new techniques for addressing problems that are computationally intractable at present; to apply available molecular modeling techniques to simulate molecular processes occurring in the multi-species, multi-phase systems characteristic of natural and polluted environments; to extend current molecular modeling techniques to treat ever more complex molecular systems and to improve the reliability and accuracy of such simulations; and to develop technologies for advanced parallel architectural computer systems. Research highlights of 82 projects are given.
Elbæk, Mikael Karstensen; Heller, Alfred; Pedersen, Gert Schmeltz
A re-implementation of the research database of the Technical University of Denmark, DTU, is based on Fedora. The backbone consists of content models for primary and secondary entities and their relationships, giving flexible and powerful extraction capabilities for interoperability and reporting...
Richards, Jef I.; Preston, Ivan L.
Noting the lack of a dispute mechanism for determining whether an advertising practice is truly deceptive without generating the costs and negative publicity produced by traditional Federal Trade Commission (FTC) procedures, this paper proposes a model based upon early termination of the issues through jointly commissioned behavioral research. The…
Nijsse, A.; Wamsteker-Andriessen, S.J.
A survey of research reports in transportation modelling in two parts. Part one is divided into reports concerning economic development and car mobility, analyzing large transportation data files, and transportation planning and spatial development. Part two consists of research reports concerning
Vodička, Petr; Smetana Jr., K.; Dvořánková, B.; Emerick, T.; Xu, Y.; Ourednik, J.; Ourednik, V.; Motlík, Jan
Vol. 1049 (2005), pp. 161-171. ISSN 0077-8923. R&D Projects: GA MŠk(CZ) LN00A065. Institutional research plan: CEZ:AV0Z50450515. Keywords: animal model * stem cell * transgenic pig. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 1.971, year: 2005
Martens, Matthew P.
Structural equation modeling (SEM) has become increasingly popular for analyzing data in the social sciences, although several broad reviews of psychology journals suggest that many SEM researchers engage in questionable practices when using the technique. The purpose of this study is to review and critique the use of SEM in counseling psychology…
Burggren, Warren W; Warburton, Stephen
The concept of animal models is well honored, and amphibians have played a prominent part in the success of using key species to discover new information about all animals. As animal models, amphibians offer several advantages that include a well-understood basic physiology, a taxonomic diversity well suited to comparative studies, tolerance to temperature and oxygen variation, and a greater similarity to humans than many other currently popular animal models. Amphibians now account for approximately 1/4 to 1/3 of lower vertebrate and invertebrate research, a proportion that is especially high in physiological research, as evident from the high profile of amphibians as animal models in Nobel Prize research. Currently, amphibians play prominent roles in research in the physiology of musculoskeletal, cardiovascular, renal, respiratory, reproductive, and sensory systems. Amphibians are also used extensively in physiological studies aimed at generating new insights in evolutionary biology, especially in the investigation of the evolution of air breathing and terrestriality. Environmental physiology also utilizes amphibians, ranging from studies of cryoprotectants for tissue preservation to physiological reactions to hypergravity and space exploration. Amphibians are also playing a key role in studies of environmental endocrine disruptors that are having disproportionately large effects on amphibian populations and where specific species can serve as sentinel species for environmental pollution. Finally, amphibian genera such as Xenopus, a genus relatively well understood metabolically and physiologically, will continue to contribute increasingly in this new era of systems biology and "X-omics."
Liang Peiyu; Xing Luping; Xuan Hui; Xue Wen
PCBs are a group of persistent organic pollutants (POPs) in the environment. The adsorption behavior of PCBs has received great attention because it affects their degradation and mobility. In this paper, the adsorption process was studied systematically to establish the adsorption model, the adsorption mechanism and the influencing factors, which provides a theoretical basis for further research.
In this study, online learning refers to organized learning that students carry out through an online learning platform under the guidance of teachers. Based on an analysis of related research results and of the existing problems, the main contents of this paper include the following aspects: (1) Analyze and study the current student engagement model.…
Kahn, Jeffrey H.
Multilevel modeling (MLM) is rapidly becoming the standard method of analyzing nested data, for example, data from students within multiple schools, data on multiple clients seen by a smaller number of therapists, and even longitudinal data. Although MLM analyses are likely to increase in frequency in counseling psychology research, many readers…
Macfarlane, A. J.; Barbarisi, I.; Rios, C.; Docasal, R.; Martinez, S.; Arviset, C.; Besse, S.; De Marchi, G.; Grotheer, E.; Gonzalez, J.; Lim, T.; Fraga, D.; Barthelemy, M.
The first of the European Space Agency's (ESA) planetary missions to make use of the latest release of the Planetary Data Standards (PDS4) are currently in advanced stages of development (ExoMars, BepiColombo). This occurs at a time when the Planetary Science Archive (PSA) has been undergoing a complete reengineering in order to increase the accessibility of ESA's planetary data holdings utilising the latest technologies and to significantly improve the user experience for both the specialist scientific community and general public alike. The PSA must also keep on handling PDS3 data arriving to the archive from active missions (Rosetta, Mars Express, Venus Express) as well as continuing to provide access to missions that have reached the legacy phase (Huygens, SMART1, Giotto). Therefore, as part of the reengineering of the PSA, an effort has been made to map the key metadata from PDS3 and PDS4 into a common data model with the intention of providing transparency to the services that make up the new PSA, and consequently to the end user. We present how this common mapping allows the PSA to support the data deliveries from the pipelines of existing missions without the need to reprocess the PDS3 data and in addition how it should simplify the data deliveries from PDS4 missions. We review how the implementation of this data model, involving a PostgreSQL database with the PostGIS extension, enables the new PSA to be able to provide multiple methods of interoperability used by the international community, such as PDAP (Planetary Data Access Protocol), EPN-TAP (EuroPlanet-Table Access Protocol), and GIS-enabled technologies without the user having to know in detail the underlying structure of the data format.
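The core of the common-model approach described above is a mapping from version-specific label keywords to shared metadata fields. The field names on both sides of this sketch are illustrative assumptions; the real PSA schema and the PDS3/PDS4 keyword sets are far richer.

```python
# Hypothetical keyword-to-common-field mappings (the actual PSA data model is larger).
PDS3_MAP = {
    "INSTRUMENT_NAME": "instrument",
    "TARGET_NAME": "target",
    "START_TIME": "start_time",
    "STOP_TIME": "stop_time",
}
PDS4_MAP = {
    "instrument_name": "instrument",
    "target_name": "target",
    "start_date_time": "start_time",
    "stop_date_time": "stop_time",
}

def to_common(label: dict, pds_version: int) -> dict:
    """Map key metadata from a PDS3 or PDS4 label into one common record, so
    downstream services (search, PDAP, EPN-TAP) need not know the source format."""
    mapping = PDS3_MAP if pds_version == 3 else PDS4_MAP
    return {common: label[src] for src, common in mapping.items() if src in label}

pds3_label = {"INSTRUMENT_NAME": "OSIRIS", "TARGET_NAME": "67P",
              "START_TIME": "2015-08-13T00:00:00"}
record = to_common(pds3_label, pds_version=3)
```

Because both standards funnel into the same record shape, queries against the common model work identically for legacy PDS3 products and new PDS4 deliveries, which is the transparency the reengineered PSA aims for.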
Herault, Yann; Delabar, Jean M; Fisher, Elizabeth M C; Tybulewicz, Victor L J; Yu, Eugene; Brault, Veronique
Down syndrome is caused by trisomy of chromosome 21. To date, a multiplicity of mouse models with Down-syndrome-related features has been developed to understand this complex human chromosomal disorder. These mouse models have been important for determining genotype-phenotype relationships and identification of dosage-sensitive genes involved in the pathophysiology of the condition, and in exploring the impact of the additional chromosome on the whole genome. Mouse models of Down syndrome have also been used to test therapeutic strategies. Here, we provide an overview of research in the last 15 years dedicated to the development and application of rodent models for Down syndrome. We also speculate on possible and probable future directions of research in this fast-moving field. As our understanding of the syndrome improves and genome engineering technologies evolve, it is necessary to coordinate efforts to make all Down syndrome models available to the community, to test therapeutics in models that replicate the whole trisomy and design new animal models to promote further discovery of potential therapeutic targets. © 2017. Published by The Company of Biologists Ltd.
Phillips, Anna C
This chapter explores the reasoning behind using the vaccination model to examine the influence of psychosocial factors on immunity. It then briefly discusses the mechanics of the vaccination response and the protocols used in Psychoneuroimmunology vaccine research, before giving examples from the research literature of the studies examining relationships such as the association between stress and the vaccination response. It also explores the ways the vaccination model can be used to answer key questions in Psychoneuroimmunology, such as: "does it matter when stressful life events occur relative to when the vaccine is received?" "what are the effects of prior exposure to the antigen?" and "do other psychosocial factors influence vaccine response besides stress?" Finally, it briefly considers the mechanisms underlying psychosocial factors and vaccination response associations and the future research needed to understand these better, and indeed to use current and future knowledge to improve and enhance vaccine responses in key at-risk populations.
Mazerolle, Stephanie M; Eason, Christianne M; Goodman, Ashley
Some anecdotal evidence has suggested that organizational infrastructure may affect the quality of life of athletic trainers (ATs). To compare ATs' perspectives on work-life balance, role strain, job satisfaction, and retention in collegiate practice settings within the various models. Cross-sectional and qualitative study. National Collegiate Athletic Association Divisions I, II, and III. Fifty-nine ATs from 3 models (athletics = 25, medical = 20, academic = 14) completed phase I. A total of 24 ATs (15 men, 9 women), 8 from each model, also completed phase II. Participants completed a Web-based survey for phase I and were interviewed via telephone for phase II. Quantitative data were analyzed using statistical software. Likert-scale answers (1 = strongly disagree, 5 = strongly agree) to the survey questions were analyzed using the Kruskal-Wallis, Mann-Whitney U, and Cohen f tests. Qualitative data were evaluated using a general inductive approach. Multiple-analyst triangulation and peer review were conducted to satisfy data credibility. Common themes were communication, social support, time management, and effective work-life balance strategies. Quantitative data revealed that ATs employed in the athletics model worked more hours (69.6 ± 11.8 hours) than those employed in the medical (57.6 ± 10.2 hours; P = .001) or academic (59.5 ± 9.5 hours; P = .02) model, were less satisfied with their pay (2.68 ± 1.1; χ2 = 7.757, P = .02; f = 0.394), believed that they had less support from their administrators (3.12 ± 1.1; χ2 = 9.512, P = .009; f = 0.443), and had fewer plans to remain in their current positions (3.20 ± 1.2; χ2 = 7.134, P = .03; f = 0.374). Athletic trainers employed in the academic model believed that they had less support from coworkers (3.71 ± 0.90; χ2 = 6.825, P = .03; f = 0.365) and immediate supervisors (3.43 ± 0.90; χ2 = 6.006, P = .050; f = 0.340). No differences in role conflict were found among the models
Kaptein, Ad A; van Korlaar, Inez M; Cameron, Linda D; Vossen, Carla Y; van der Meer, Felix J M; Rosendaal, Frits R
This study applied the Common-Sense Model (CSM) to predict risk perception and disease-related worry in 174 individuals with a genetic predisposition to venous thrombosis (thrombophilia). Participants completed an adapted version of the Illness Perception Questionnaire-Revised (IPQ-R) and measures assessing risk perception and worry. Regression analyses revealed that illness perceptions were predictors of risk perception and thrombosis worry. The hypothesis that illness perceptions mediate the relationship between a person's experience of venous thrombosis and perceived risk and thrombosis worry could not be confirmed. Further research should refine the IPQ-R for populations at risk of a disease and examine the value of the CSM in explaining the relationship between risk perception, worry, and health behavior. (PsycINFO Database Record (c) 2007 APA, all rights reserved).
Kapischke, Matthias; Pries, Alexandra
The operative and conservative results of therapy in pancreatic ductal adenocarcinoma remain appallingly poor. This underlines the demand for further research into effective anticancer drugs. The various animal models remain the essential method for determining the efficacy of substances during the preclinical phase. Unfortunately, most of the tested substances showed good efficacy against pancreatic carcinoma in animal models but were not confirmed during the clinical phase. The available literature in PubMed, Medline, Ovid and secondary literature was searched for the available animal models for drug testing against pancreatic cancer. The models were analyzed regarding their pros and cons in anticancer drug testing. The different modifications of the orthotopic model (especially in mice) seem at present to be the best model for anticancer testing in pancreatic carcinoma. The value of genetically engineered animal models (GEMs) and syngeneic models is still under debate. Careful selection of the model according to the questions to be addressed may improve the comparability of animal-experiment results with clinical trials.
Waszak, Martin R.; Barthelemy, Jean-Francois; Jones, Kenneth M.; Silcox, Richard J.; Silva, Walter A.; Nowaczyk, Ronald H.
Multidisciplinary analysis and design is inherently a team activity due to the variety of required expertise and knowledge. As a team activity, multidisciplinary research cannot escape the issues that affect all teams. The level of technical diversity required to perform multidisciplinary analysis and design makes the teaming aspects even more important. A study was conducted at the NASA Langley Research Center to develop a model of multidiscipline teams that can be used to help understand their dynamics and identify key factors that influence their effectiveness. The study sought to apply the elements of systems thinking to better understand the factors, both generic and Langley-specific, that influence the effectiveness of multidiscipline teams. The model of multidiscipline research teams developed during this study has been valuable in identifying means to enhance team effectiveness, recognize and avoid problem behaviors, and provide guidance for forming and coordinating multidiscipline teams.
The US EPA National Ambient Air Quality Standard (NAAQS) was not conceived to provide, nor does it provide, an accurate definition of the absorbed ozone dose or baseline exposure level needed to protect vegetation. This research presents a multiplicative modeling approach based not only on atmospheric but also on equally important physiological, phenological, and environmental parameters. Physiological constraints on ozone uptake demonstrate that actual absorption is substantially lower than that assumed by a simple interpretation of hourly atmospheric ozone concentrations. Coupled with the development of foliar injury expression, this provides evidence that tropospheric ozone is more toxic to vegetation than is currently understood.
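The multiplicative structure described above can be sketched as a maximum stomatal conductance scaled down by a product of 0-1 limiting factors. The function, factor names, and all numbers are illustrative assumptions, not the study's fitted parameters.

```python
def stomatal_uptake(o3_concentration, g_max, limiting_factors):
    """Multiplicative uptake sketch: conductance is a maximum rate scaled
    down by physiological/phenological/environmental factors in [0, 1]."""
    g = g_max
    for f in limiting_factors.values():
        g *= min(max(f, 0.0), 1.0)       # clamp each limiting factor into [0, 1]
    return o3_concentration * g          # absorbed-dose proxy: concentration x conductance

dose = stomatal_uptake(
    o3_concentration=60.0,               # hypothetical ambient ozone, ppb
    g_max=0.5,                           # hypothetical maximum conductance (relative units)
    limiting_factors={"phenology": 0.9, "light": 0.8, "temperature": 0.7, "vpd": 0.6},
)
```

Because every factor only reduces the conductance, the absorbed dose is always below the concentration-only estimate, which is the study's central point about physiological constraints on uptake.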