Salt Lakes of the African Rift System: A Valuable Research Opportunity for Insight into Nature's Concentrated Multi-Electrolyte Science. JYN Philip, DMS Mosha. Abstract. The Tanzanian rift system salt lakes present significant cultural, ecological, recreational and economic value. Beyond the wealth of minerals, resources ...
Bush-Bacelis, Jean L.
The American Assembly of Collegiate Schools of Business (AACSB), an accrediting agency, may be an overlooked tool for establishing rationale and credibility for globalization of business courses. The 245 member institutions are bound by the agency's accrediting requirements, and many others are influenced by the standards set in those…
De Waele, Jo
There is a growing need in the speleological community for tools that make the teaching of speleology and karst much easier. Despite the existence of a wide range of major academic textbooks, the caver community often has difficulty accessing such material. Therefore, to fill this gap, the Italian Speleological Society, under the umbrella of the Union Internationale de Spéléologie, has prepared a set of lectures, in presentation format, on several topics including geology, physics, chemistry, hydrogeology, mineralogy, palaeontology, biology, microbiology, history, archaeology, artificial caves, documentation, etc. These lectures constitute the “Teaching Resources in Speleology and Karst”, available online. Thanks to its easily manageable format, this educational tool can constantly be updated and enriched with new contents and topics.
Lavinghouze, Rene; Price, Ann Webb; Smith, Kisha-Ann
Success stories are evaluation tools that have been used by professionals across disciplines for quite some time. They are also proving useful in promoting health programs and their accomplishments. The increasing popularity of success stories is due to the innovative and effective way they increase a program's visibility while engaging potential participants, partners, and funders in public health efforts. From the community level to the federal level, program administrators are using success stories as vehicles for celebrating achievements, sharing challenges, and communicating lessons learned. Success stories are an effective means to move beyond the numbers and connect readers with a cause they can relate to and want to join. This article defines success stories and provides an overview of several types of story formats, how success stories can be systematically collected, and how they are used to communicate program success.
Zarzeczny, Amy; Caulfield, Timothy; Ogbogu, Ubaka; Bell, Peter; Crooks, Valorie A; Kamenova, Kalina; Master, Zubin; Rachul, Christen; Snyder, Jeremy; Toews, Maeghan; Zoeller, Sonja
The growing international market for unproven stem cell-based interventions advertised on a direct-to-consumer basis over the internet ("stem cell tourism") is a source of concern because of the risks it presents to patients as well as their supporters, domestic health care systems, and the stem cell research field. Emerging responses such as public and health provider-focused education and national regulatory efforts are encouraging, but the market continues to grow. Physicians play a number of roles in the stem cell tourism market and, in many jurisdictions, are members of a regulated profession. In this article, we consider the use of professional regulation to address physician involvement in stem cell tourism. Although it is not without its limitations, professional regulation is a potentially valuable tool that can be employed in response to problematic types of physician involvement in the stem cell tourism market. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Montenegro, Gil; Alves, Luiza; Zaninotto, Ana Luiza; Falcão, Denise Pinheiro; de Amorim, Rivadávio Fernandes Batista
Hypnosis is a valuable tool in the management of patients who undergo surgical procedures in the maxillofacial complex, particularly in reducing and eliminating pain during surgery and in aiding patients who have dental fear or are allergic to anesthesia. This case report demonstrates the efficacy of hypnosis in mitigating anxiety, bleeding, and pain during dental surgery without anesthesia, performed for implant placement at tooth 14, the upper left first molar.
Kriegsmann, Jörg; Kriegsmann, Mark; Casadonte, Rita
Matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) imaging mass spectrometry (IMS) is an evolving technique in cancer diagnostics that combines the advantages of mass spectrometry (proteomics), detection of numerous molecules, and spatial resolution in histological tissue sections and cytological preparations. The method allows the detection of proteins, peptides, lipids, carbohydrates or glycoconjugates, and small molecules. Formalin-fixed paraffin-embedded tissue can also be investigated by IMS; thus, the method seems to be an ideal tool for cancer diagnostics and biomarker discovery. It may add information to the identification of tumor margins and tumor heterogeneity. The technique allows tumor typing, especially identification of the tumor of origin in metastatic tissue, as well as grading, and may provide prognostic information. IMS is a valuable method for the identification of biomarkers and can complement histology, immunohistology and molecular pathology in various fields of histopathological diagnostics, especially with regard to the identification and grading of tumors.
Karzenowski, Abby; Puskar, Kathy
Motivational interviewing (MI) is well known and respected by many health care professionals. Developed by Miller and Rollnick (2002), it is a way to promote behavior change from within and resolve ambivalence. MI is individualized and is most commonly used in the psychiatric setting; it is a valuable tool for the psychiatric advanced practice nurse. There are many resources that discuss what MI is and the principles used to apply it. However, there is little information about how to incorporate MI into a clinical case. This article provides a summary of articles related to MI, discusses two case studies using MI, and explains why advanced practice nurses should use MI with their patients.
Garas, Monique; Vaccarezza, Mauro; Newland, George; McVay-Doornbusch, Kylie; Hasani, Jamila
Three-dimensional (3D) printing is a modern technique for creating 3D-printed models that allows the reproduction of human structures from MRI and CT scans via fusion of multiple layers of resin materials. To assess the feasibility of this innovative resource as an anatomy education tool, we conducted a preliminary study on Curtin University undergraduate students, with the use of 3D models for anatomy learning as the main goal and, as secondary aims, assessment of the effectiveness of different specimen types during the sessions and of students' personally preferred anatomy learning tools. The study consisted of a pre-test, exposure to the test (anatomical test), and a post-test survey. During the pre-test, all participants (both the group without prior experience and the experienced group) were given a brief introduction to laboratory safety and the study procedure; participants were then exposed to 3D, wet and plastinated specimens of the heart, shoulder and thigh to identify the pinned structures (anatomical test). Participants were then given a post-test survey containing five questions. In total, 23 participants completed the anatomical test and post-test survey. A larger proportion of participants (85%) gave correct answers for the 3D models compared with the wet and plastinated materials, 74% selected 3D models as the most usable tool for identification of pinned structures, and 45% chose 3D models as their preferred method of anatomy learning. This preliminary small-size study affirms the feasibility of 3D-printed models as a valuable asset in anatomy learning and shows their capability to be used alongside cadaveric materials and other widely used tools in anatomy education. Copyright © 2018 Elsevier GmbH. All rights reserved.
Patwardhan, Anjali; Patwardhan, Prakash
In the recent climate of consumerism and consumer-focused care, health and social care needs to be more responsive than ever before. Consumer needs and preferences can be elicited with accepted validity and reliability only by strict methodological control, customisation of the questionnaire, and skilled interpretation. To construct, conduct, interpret and implement improved service provision requires a trained workforce and infrastructure. This article aims to appraise various aspects of consumer surveys and to assess their value as effective service improvement tools. The customer is the sole reason organisations exist. Consumer surveys are used worldwide as service and quality-of-care improvement tools by all types of service providers, including health service providers. The article critically appraises the value of consumer surveys as service improvement tools in health services and their future applications. No one type of survey is the best or ideal; the key is the selection of the correct survey methodology, unique and customised for the particular type or aspect of care being evaluated. The method used should reflect the importance of the information required. Methodological rigour is essential for the effectiveness of consumer surveys as service improvement tools. Unfortunately, so far there is no universal consensus on the superiority of one particular methodology over another, or on any benefit of one specific methodology in a given situation. More training and some dedicated resource allocation are required to develop consumer surveys, and more research is needed to develop specific survey methodology and evaluation techniques for improved validity and reliability of the surveys as service improvement tools. Measurement of consumer preferences/priorities and evaluation of services and key performance scores are not easy. Consumer surveys seem impressive tools, as they give the customer a voice for change or modification. However, from a scientific point ...
Breuer, Henning; Schwarz, Heinrich; Feller, Kristina; Matsumoto, Mitsuji
This paper shows how to address technological, cultural and social transformations with empirically grounded innovation. Areas in transition, such as higher education and learning techniques today, bring about new needs and opportunities for innovative tools and services. But how do we find these tools? The paper argues for using a strategy of (user) value innovation that creatively combines ethnographic methods with strategic industry analysis. By focusing on unmet and emerging needs, ethnographic research identifies learner values, needs and challenges but does not determine solutions. Blue-ocean strategy tools can identify new opportunities that alter existing offerings but give weak guidance on what will be most relevant to users. The triangulation of both is illustrated through an innovation project in higher education.
Dwaik, Raghad A. A.
Digital technology has become an indispensable aspect of foreign language learning around the globe, especially for college students, who are often required to finish extensive reading assignments within a limited time period. Such pressure calls for the use of efficient tools, such as digital dictionaries, to help them achieve their…
Omar, Jone; Slowikowski, Boleslaw; Boix, Ana; von Holst, Christoph
Feed additives need to be authorised to be placed on the market according to Regulation (EU) No. 1831/2003. In addition to laying down the procedural requirements, the regulation establishes the European Union Reference Laboratory for Feed Additives (EURL-FA) and requires that applicants send samples to the EURL-FA. Once authorised, the characteristics of the marketed feed additives should correspond to those deposited in the sample bank of the EURL-FA. For this purpose, the submitted samples were subjected to near-infrared (NIR) and Raman spectroscopy for spectral characterisation. These techniques have the valuable potential of characterising the feed additives in a non-destructive manner without any complicated sample preparation. This paper describes the capability of spectroscopy for a rapid characterisation of products to establish whether specific authorisation criteria are met. The study is based on the analysis of feed additive samples from different categories and functional groups, namely products containing (1) selenium, (2) zinc and manganese, (3) vitamins and (4) essential oils such as oregano and thyme oil. The use of chemometrics turned out to be crucial, especially in cases where the differentiation of spectra by visual inspection was very difficult.
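As a toy illustration of the chemometric step, a minimal nearest-reference classifier over pre-processed spectra might look like this in Python; the data, names and distance choice are invented for illustration and are not the study's actual models:

```python
def classify_spectrum(sample, references):
    """Assign a feed-additive sample to the reference product whose
    (pre-processed) spectrum is closest in Euclidean distance - a
    minimal stand-in for the chemometric models the study relies on."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(references, key=lambda name: dist(sample, references[name]))

# Hypothetical three-channel reference spectra for two product groups
refs = {"selenium": [0.2, 0.8, 0.4], "zinc-manganese": [0.7, 0.1, 0.5]}
print(classify_spectrum([0.25, 0.75, 0.35], refs))  # -> selenium
```

In practice such models are built on full NIR/Raman spectra with methods such as PCA, but the nearest-reference idea above captures the matching step in miniature.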
Chomsky, D B; Lang, C C; Rayos, G H; Shyr, Y; Yeoh, T K; Pierson, R N; Davis, S F; Wilson, J R
Peak exercise oxygen consumption (Vo2), a noninvasive index of peak exercise cardiac output (CO), is widely used to select candidates for heart transplantation. However, peak exercise Vo2 can be influenced by noncardiac factors such as deconditioning, motivation, or body composition and may yield misleading prognostic information. Direct measurement of the CO response to exercise may avoid this problem and more accurately predict prognosis. Hemodynamic and ventilatory responses to maximal treadmill exercise were measured in 185 ambulatory patients with chronic heart failure who had been referred for cardiac transplantation (mean left ventricular ejection fraction, 22 +/- 7%; mean peak Vo2, 12.9 +/- 3.0 mL.min-1.kg-1). The CO response to exercise was normal in 83 patients and reduced in 102. By univariate analysis, patients with normal CO responses had a better 1-year survival rate (95%) than did those with reduced CO responses (72%) (P < .0001). In contrast, the 1-year survival rate of patients with peak Vo2 > 14 mL.min-1.kg-1 (88%) was not different from that of patients with peak Vo2 of 10 to 14 mL.min-1.kg-1 (89%). By Cox regression analysis, exercise CO response was the strongest independent predictor of survival (risk ratio, 4.3), with peak Vo2 dichotomized at 10 mL.min-1.kg-1 (risk ratio, 3.3) as the only other independent predictor. Patients with reduced CO responses and peak Vo2 of < or = 10 mL.min-1.kg-1 had an extremely poor 1-year survival rate (38%). Both the CO response to exercise and peak exercise Vo2 provide valuable independent prognostic information in ambulatory patients with heart failure. These variables should be used in combination to select potential heart transplantation candidates.
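The combined selection rule the authors advocate can be sketched as follows; this is a hedged illustration, not the paper's published algorithm: the thresholds come from the abstract, but the function name, labels and their interplay are assumptions:

```python
def transplant_risk(co_response_normal: bool, peak_vo2: float) -> str:
    """Stratify 1-year risk from the exercise cardiac-output (CO)
    response and peak VO2 (mL/min/kg), combining the two independent
    predictors reported in the abstract. Labels are illustrative."""
    if not co_response_normal and peak_vo2 <= 10:
        return "high"          # reported 1-year survival: 38%
    if not co_response_normal:
        return "intermediate"  # reduced CO response overall: 72%
    return "low"               # normal CO response: 95%

print(transplant_risk(False, 9.5))  # -> high
```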
Anjali Patwardhan (Nationwide Children's Hospital, Columbus), Charles H Spencer (Ohio State University, Columbus, OH, USA). Abstract: Improving the quality of care in international health services was made a high priority in 1977, when the World Health Assembly passed a resolution to greatly improve “Health for all” by the year 2000. Since 1977, the use of patient surveys for quality improvement has become a common practice in the health-care industry. The use of surveys reflects the concept that patient satisfaction is closely linked with organizational performance, which is in turn closely linked with organizational culture. This article is a review of the role of patient surveys as a quality-improvement tool in health care. The article explores the characteristics, types, merits, and pitfalls of various patient surveys, as well as the impact of their wide-ranging application in dissimilar scenarios to identify gaps in service provision. It is demonstrated that conducting patient surveys and using the results to improve the quality of care are two different processes. The value of patient surveys depends on the interplay between these two processes and several other factors that can influence the final outcome. The article also discusses the business aspect of patient surveys in detail. Finally, the authors make recommendations on how the patient survey tool can best be used to improve the quality of care in the health-care sector. Keywords: patient surveys, quality improvement, service gaps
Khawar, N.; Umber, A.
To determine the relation of the umbilical artery Doppler velocity parameter systolic:diastolic ratio (S/D ratio) with fetal wellbeing and outcome. Setting: Department of Obstetrics and Gynecology, Lady Willingdon Hospital, Lahore. Duration of study: six months, from 27-02-2008 to 26-08-2008. Subjects and methods: Sixty patients fulfilling the inclusion criteria were included in this study. They were subdivided into two groups: group 'A' included 30 normal pregnant women with no medical or obstetrical risk factors, and group 'B' included 30 pregnant women with risk factors such as hypertension, diabetes, Rhesus incompatibility, discordant twins, intrauterine growth restriction and non-immune hydrops fetalis. Results: In the comparison of the S/D ratio with risk factors, it was observed that an S/D ratio > 3 was present in 19 patients (31.6%) with hypertension/preeclampsia, 3 patients (5%) with diabetes mellitus, 11 patients (18.3%) with intrauterine growth restriction, 15 patients (25.0%) with oligohydramnios and only 1 patient (1.6%) with twin pregnancy. It was observed that women with an S/D ratio > 3 delivered 10 neonates (16.6%) with an Apgar score < 4 at 1 minute and 23 (38.3%) with a score < 6 at 5 minutes; 23 neonates (38.3%) needed resuscitation and 21 (35.0%) were admitted to the neonatal unit for asphyxia. Conclusion: Umbilical artery Doppler study is an integral tool in evaluating the health of high-risk pregnancies. However, it is not appropriate as a screening tool for low-risk pregnancies. (author)
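The abnormality cut-off used in the study reduces to a simple threshold check; the sketch below is purely illustrative (the function name and interface are invented), with the S/D ratio > 3 criterion taken from the abstract:

```python
def flag_umbilical_doppler(sd_ratio: float, threshold: float = 3.0) -> bool:
    """Flag an elevated umbilical-artery systolic/diastolic (S/D) ratio.
    The study treats an S/D ratio above 3 as abnormal; this helper is
    an illustration, not clinical software."""
    return sd_ratio > threshold

print(flag_umbilical_doppler(3.4))  # -> True
```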
English, Andrew; Azeem, Ayesha; Spanoudes, Kyriakos; Jones, Eleanor; Tripathi, Bhawana; Basu, Nandita; McNamara, Karrina; Tofail, Syed A M; Rooney, Niall; Riley, Graham; O'Riordan, Alan; Cross, Graham; Hutmacher, Dietmar; Biggs, Manus; Pandit, Abhay; Zeugolis, Dimitrios I
Controlling cell-substrate interactions at the bio-interface is becoming an inherent element in the design of implantable devices. Modulation of cellular adhesion in vitro through topographical cues is a well-documented process that offers control over subsequent cellular functions. However, it is still unclear whether surface topography can be translated into a clinically functional response in vivo at the tissue/device interface. Herein, we demonstrated that anisotropic substrates with a groove depth of ∼317 nm and ∼1988 nm promoted human tenocyte alignment parallel to the underlying topography in vitro. However, the rigid poly(lactic-co-glycolic acid) substrates used in this study upregulated the expression of chondrogenic and osteogenic genes, indicating possible tenocyte trans-differentiation. Of significant importance is that none of the topographies assessed (∼37 nm, ∼317 nm and ∼1988 nm groove depth) induced extracellular matrix orientation parallel to the substrate orientation in a rat patellar tendon model. These data indicate that two-dimensional imprinting technologies are useful tools for in vitro cell phenotype maintenance rather than for organised neotissue formation in vivo, should multifactorial approaches that consider both surface topography and substrate rigidity be established. Herein, we ventured to assess the influence of parallel grooves, ranging from the nano- to the micro-level, on tenocyte response in vitro and on the host response using a tendon and a subcutaneous model. In vitro analysis indicates that anisotropically ordered micro-scale grooves, as opposed to nano-scale grooves, maintain physiological cell morphology. The rather rigid PLGA substrates appeared to induce trans-differentiation towards a chondrogenic and/or osteogenic lineage, as evidenced by TILDA gene analysis. In vivo data in both tendon and subcutaneous models indicate that none of the substrates induced bidirectional host cell and tissue growth. Collectively, these ...
Esposito, Pasquale; Dal Canton, Antonio
Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences in nephrology settings. PMID:25374819
Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart
The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point to put the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex queries using federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.
Mulier, Michiel; Pastrav, Cesar; Van der Perre, Georges
Defining the stem insertion end point during total hip replacement still relies on the surgeon's feel. When a custom-made stem prosthesis with an optimal fit into the femoral canal is used, the risk of per-operative fractures is even greater than with standard prostheses. Vibration analysis is used in other clinical settings and has been tested in the laboratory as a means to detect optimal stem insertion. The first per-operative use of vibration analysis during non-cemented custom-made stem insertion in 30 patients is reported here. Thirty patients eligible for total hip replacement with an uncemented stem prosthesis were included. The neck of the stem was connected to a shaker that emitted white noise as the excitation signal and to an impedance head that measured the frequency response. The response signal was sent to a computer that analyzed the frequency response function after each insertion phase. A technician, present in the operating theatre but outside the laminar airflow, provided feedback to the surgeon. The correlation index between the frequency response functions measured during the last two insertion hammering sessions was >0.99 in 86.7% of the cases. In four cases the surgeon stopped the insertion procedure because of a perceived risk of fracture. Two special cases illustrating the potential benefit of per-operative vibration analysis are described. The results of intra-operative vibration analysis indicate that this technique may be a useful tool to assist the orthopaedic surgeon in defining the insertion end point of the stem. The development of a more user-friendly device is therefore warranted.
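The end-point criterion (a correlation index above 0.99 between the frequency response functions of consecutive hammering sessions) can be sketched in Python; the function names and sample data below are assumptions for illustration, not the authors' implementation:

```python
from math import sqrt

def correlation_index(frf_a, frf_b):
    """Pearson correlation between two sampled frequency-response
    magnitude curves; values near 1 suggest the stem has stopped
    seating further between consecutive hammering sessions."""
    n = len(frf_a)
    ma, mb = sum(frf_a) / n, sum(frf_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(frf_a, frf_b))
    va = sum((a - ma) ** 2 for a in frf_a)
    vb = sum((b - mb) ** 2 for b in frf_b)
    return cov / sqrt(va * vb)

def insertion_complete(prev_frf, curr_frf, cutoff=0.99):
    """End-point rule from the abstract: stop when consecutive FRFs
    correlate above the cutoff."""
    return correlation_index(prev_frf, curr_frf) > cutoff
```

With two nearly identical response curves, e.g. `insertion_complete([1, 2, 3, 4], [1, 2, 3, 4.01])`, the criterion is met.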
Energy and environment are top priorities for the EU's Europe 2020 Strategy. Both fields imply complex approaches and consistent investment. The paper presents an alternative to large investments for improving the efficiency of existing (outdated) power installations: namely, the use of data-mining techniques for analysing existing operational data. Data mining is based upon exhaustive analysis of operational records, inferring high-value information by simply processing records with advanced mathematical/statistical tools. Results can include: assessment of the consistency of measurements, identification of new hardware needed for improving the quality of data, deducing the most efficient level of operation (internal benchmarking), correlation of consumption with power/heat production and of technical parameters with environmental impact, scheduling of the optimal maintenance time, fuel stock optimization, simulation of scenarios for equipment operation, anticipation of periods of maximal stress on equipment, identification of medium- and long-term trends, and planning and decision support for new investment. The paper presents a data-mining process carried out at the TERMICA - Suceava power plant. The analysis calls for a multidisciplinary approach and a complex team (experts in power and heat production, mechanics, environmental protection, economists, and last but not least IT experts) and can be carried out with lower expenses than an investment in new equipment. Involvement of the company's top management is essential, as the driving force and source of motivation for the data-mining team. The approach presented is self-learning: once established, the data-mining analytical, modelling and simulation procedures and associated parameter databases can adjust themselves by absorbing and processing new relevant information, and can be used on a long-term basis for monitoring the performance of the installation, certifying the soundness of managerial measures taken ...
Duval, A.; Guicharnaud, H.; Dran, J.C.
For several years we have been carrying out research on metalpoint drawings, a graphic technique mainly employed by European artists during the 15th and 16th centuries. As a non-destructive and very sensitive analytical technique is required, particle-induced X-ray emission (PIXE) analysis with an external beam has been used for this purpose. More than 70 artworks drawn by Italian, Flemish and German artists have been analysed, including leadpoint and silverpoint drawings. Following a short description of the metalpoint technique, the results are compared with the recipes written by Cennino Cennini at the beginning of the 15th century, and specific examples are presented.
This article addresses the issue of using understandings of psychodynamic interrelations as a means to grasp how social and cultural dynamics are processed individually and collectively in narratives. I apply the two theoretically distinct concepts of inter- and intrasubjectivity to gain insight into how social and cultural dynamics are processed as subjective experiences and reflected in the interrelational space created in narrative interviews with trainee social educators. By using a combination of interactionist theory and psychosocial theory in the analysis of an interview with a student of social education, I demonstrate how often-conflicting demands and expectations are played out in the interrelational tension between the researcher (myself) and the interviewee or narrator. In a confrontation between “inner” expectations and concerns regarding a future profession and one's ability to cope, and the “outer” socially and culturally embedded discourses as played out in the objectives of self-development and education, the narrative about a forthcoming internship is filled with tension and contradiction. In this article I demonstrate how such tensions and contradictions are valuable sources of information in understanding the process of becoming a social educator.
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive web-based mind map, titled “Research Tools”, which is updated ...
reach 92% based on GUS detection. The combination of the highly efficient transformation and the regeneration system of Superroot provides a valuable tool for functional genomics studies in L. corniculatus.
Weng, Ruihui; Wei, Xiaobo; Yu, Bin; Zhu, Shuzhen; Yang, Xiaohua; Xie, Fen; Zhang, Mahui; Jiang, Ying; Feng, Zhong-Ping; Sun, Hong-Shuo; Xia, Ying; Jin, Kunlin; Chan, Piu; Wang, Qing; Gao, Xiaoya
Progressive supranuclear palsy (PSP) is a cause of atypical Parkinsonism. Although cystatin C (Cys C) and low-density lipoprotein cholesterol (LDL-C) are known to play critical roles in Parkinsonism, it is unknown whether they can be used as markers to distinguish PSP patients from healthy subjects and to determine disease severity. We conducted a cross-sectional study to determine the plasma Cys C, HDL-C and LDL-C levels of 40 patients with PSP and 40 healthy age-matched controls. An extended battery of motor and neuropsychological tests, including the PSP Rating Scale (PSPRS), the Non-Motor Symptoms Scale (NMSS), the Geriatric Depression Scale (GDS) and the Mini-Mental State Examination (MMSE), was used to evaluate disease severity. Receiver operating characteristic (ROC) curves were adopted to assess the accuracy of Cys C and LDL-C levels in distinguishing PSP from healthy subjects. Patients with PSP exhibited significantly higher plasma levels of Cys C and lower LDL-C. The levels of plasma Cys C were positively correlated with the PSPRS/NMSS scores and inversely correlated with the MMSE scores. The LDL-C/HDL-C ratio was positively associated with PSPRS/NMSS and GDS scores. The ROC curve for the combination of Cys C and LDL-C yielded a better accuracy for distinguishing PSP from healthy subjects than the separate curves for each parameter. Plasma Cys C and LDL-C may be valuable screening tools for differentiating PSP from healthy subjects and could be useful for evaluating PSP severity. A better understanding of Cys C and LDL-C may yield insights into the pathogenesis of PSP. Copyright © 2018 Elsevier Ltd. All rights reserved.
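The ROC comparison step can be illustrated with a minimal area-under-the-curve computation via the rank (Mann-Whitney) statistic; all data and names below are invented for illustration and are not the study's measurements:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank statistic: the probability
    that a randomly chosen patient scores higher than a randomly chosen
    control (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical combined marker, e.g. scaled Cys C minus scaled LDL-C
psp      = [1.8, 2.1, 1.3, 2.4]   # patients: higher Cys C, lower LDL-C
controls = [0.9, 1.2, 1.4, 1.5]
print(auc(psp, controls))          # -> 0.875
```

Comparing `auc` for Cys C alone, LDL-C alone, and a combined score is the same comparison the abstract reports, where the combination discriminated best.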
Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and to our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...
Håkansson, Anders; Lindberg, Eva Pettersson; Henriksson, Karin
At the Department of Community Medicine at Lund University we have given courses in basic research methodology since 1989. The course has yielded 20 points of university credit, the equivalent of one full-time semester of studies, and it has been run part-time over one and a half years. Our aim has been to provide a large number of physicians with basic training in research methods, and to stimulate the recruitment of new research students from the whole Southern Health Care Region. During the first ten years, 138 general practitioners (20% of the GPs of the region) and 202 specialists completed our courses. To date, 19 GPs (14%) and 19 specialists (9%) have begun PhD studies. During the last two years, another 100 physicians from southern Sweden have attended our courses, as well as GPs from Zealand in Denmark. We have been developing our course in basic research methods over a twelve-year period, and it is now well established in our health care region. We feel that we have succeeded in reaching the two goals we set: to give a large number of physicians a fundamental knowledge of research methods and to recruit and increase the number of PhD students. We believe that medical research and development must also flourish outside traditional university settings.
Christopher D. Risbrudt; Robert J. Ross; Julie J. Blankenburg; Charles A. Nelson
Founded in 1910 by the U.S. Forest Service to serve as a centralized, national wood research laboratory, the USDA Forest Products Laboratory (FPL) has a long history of providing technical services to other government agencies, including those within the Department of Defense (DoD). A recent search of FPL's library and correspondence files revealed that approximately 10,000...
Kjærsgaard, Mette Gislev; Smith, Rachel Charlotte
…as well as design, production and use, we might need to rethink the role of ethnography within user-centred design and business development. Here the challenge is less about "getting closer" to user needs and real-life contexts, through familiarization, mediation, and facilitation, and more about creating a critical, theoretically informed distance from which to perceive and reflect upon complex interconnections between people, technology, business and design, as well as our roles as researchers and designers within these.
Low, Paul J.; Bock, Sören
This review presents a highly selective summary of spectroelectrochemical methods used in the study of metal alkyne, acetylide, vinylidene and allenylidene complexes. The review is illustrated predominantly by selected examples from the authors' group that formed the basis of a lecture at the recent ISE Annual Meeting. Emphasis is placed on the use of spectroelectrochemical methods to study redox-induced ligand isomerisation reactions and to determine molecular electronic structure, which complement the conventional tools available to the synthetic chemist for the characterisation of molecular compounds. The role of computational studies in supporting the interpretation of spectroscopic data is also briefly discussed.
de Almeida-Leite, Camila Megale; Arantes, Rosa Maria Esteves
Central nervous system glial cells such as astrocytes and microglia have been investigated in vitro, and many intracellular pathways have been clarified upon various stimuli. Peripheral glial cells, however, are not as deeply investigated in vitro despite their important role in inflammatory and neurodegenerative diseases. Based on our previous experience of culturing neuronal cells, our objective was to standardize and morphologically characterize a primary culture of mouse superior cervical ganglion glial cells in order to obtain a useful tool for studying peripheral glial cell biology. Superior cervical ganglia from neonatal C57BL6 mice were enzymatically and mechanically dissociated and cells were plated on diluted Matrigel-coated wells at a final concentration of 10,000 cells/well. Five to 8 days post-plating, glial cell cultures were fixed for morphological and immunocytochemical characterization. Glial cells showed a flat and irregular shape, two or three long cytoplasmic processes, and round, oval or elongated nuclei with a regular outline. Cell proliferation and mitosis were detected both qualitatively and quantitatively. Glial cells were able to maintain their phenotype in our culture model, including immunoreactivity against the glial cell marker GFAP. This is the first description of the immunocytochemical characterization of mouse sympathetic cervical ganglion glial cells in primary culture. This work discusses the uses and limitations of our model as a tool to study many aspects of peripheral glial cell biology. Copyright © 2010 Elsevier B.V. All rights reserved.
Yara Dadalti Fragoso
Full Text Available CONTEXT: MIDAS was developed as a fast and efficient method for identifying migraine in need of medical evaluation and treatment. It was necessary to translate MIDAS, originally written in English, so as to apply it in Brazil and make it usable by individuals from a variety of socioeconomic and cultural backgrounds. OBJECTIVE: To translate and apply MIDAS in Brazil. DESIGN: Assessment of a sample of workers regularly employed by an oil refinery. SETTING: Refinaria Presidente Bernardes, Cubatão, São Paulo, Brazil. PARTICIPANTS: 404 workers of the company who correctly answered a questionnaire for the identification and evaluation of headache. When an individual considered it pertinent to his own needs, there was the option of answering MIDAS as well. METHODS: MIDAS, originally written in English, was translated into Brazilian Portuguese by a neurologist and by a translator specializing in medical texts. The final version of the translation was obtained when, for ten patients to whom it was applied, the text seemed clear and the results were consistent over three sessions. MAIN MEASUREMENTS: Prevalence and types of primary headaches; evaluation of MIDAS as a tool for identifying more severe cases. RESULTS: Of the 419 questionnaires given to the employees, 404 were returned correctly completed. From these, 160 persons were identified as presenting headaches, 44 of whom considered it worthwhile to answer MIDAS. Nine of the individuals who answered MIDAS were identified as severe cases of migraine because of the disability caused by the condition. An interview at a later date confirmed these results. Three were cases of chronic daily headache (transformed migraine) and six were cases of migraine. CONCLUSIONS: MIDAS translated into Brazilian Portuguese was a useful tool for identifying severe cases of migraine and of transformed migraine in a working environment. The workers did not consider MIDAS difficult to answer. Their…
Marinozzi, Maura; Pertusati, Fabrizio; Serpi, Michaela
The compounds characterized by the presence of a λ⁵-phosphorus functionality at the α-position with respect to the diazo moiety, here referred to as λ⁵-phosphorus-containing α-diazo compounds (PCDCs), represent a vast class of extremely versatile reagents in organic chemistry and are particularly useful in the preparation of phosphonate- and phosphine oxide-functionalized molecules. Indeed, thanks to the high reactivity of the diazo moiety, PCDCs can be induced to undergo a wide variety of chemical transformations, among them carbon-hydrogen as well as heteroatom-hydrogen insertion reactions, cyclopropanation, ylide formation, Wolff rearrangement, and cycloaddition reactions. PCDCs can be easily prepared from readily accessible precursors by a variety of methods, such as diazotization, Bamford-Stevens-type elimination, and diazo transfer reactions. These properties, along with their relative stability and ease of handling, make PCDCs appealing tools in organic synthesis. This Review aims to demonstrate the ongoing utility of PCDCs in the modern preparation of different classes of phosphorus-containing compounds, phosphonates in particular. Furthermore, to address the lack of previous collective papers, this Review also summarizes the methods for PCDC preparation.
Imanishi, Leandro; Vayssières, Alice; Franche, Claudine; Bogusz, Didier; Wall, Luis; Svistoonoff, Sergio
Among infection mechanisms leading to root nodule symbiosis, the intercellular infection pathway is probably the most ancestral but also one of the least characterized. Intercellular infection has been described in Discaria trinervis, an actinorhizal plant belonging to the Rosales order. To decipher the molecular mechanisms underlying intercellular infection with Frankia bacteria, we set up an efficient genetic transformation protocol for D. trinervis based on Agrobacterium rhizogenes. We showed that composite plants with transgenic roots expressing green fluorescent protein can be specifically and efficiently nodulated by Frankia strain BCU110501. Nitrogen fixation rates and feedback inhibition of nodule formation by nitrogen were similar in control and composite plants. In order to challenge the transformation system, the MtEnod11 promoter, a gene from Medicago truncatula widely used as a marker for early infection-related symbiotic events in model legumes, was introduced in D. trinervis. MtEnod11::GUS expression was related to infection zones in root cortex and in the parenchyma of the developing nodule. The ability to study intercellular infection with molecular tools opens new avenues for understanding the evolution of the infection process in nitrogen-fixing root nodule symbioses.
Li, Yizhou; Zimmermann, Gunther; Scheuermann, Jörg; Neri, Dario
Phage-display libraries and DNA-encoded chemical libraries (DECLs) represent useful tools for the isolation of specific binding molecules from large combinatorial sets of compounds. With both methods, specific binders are recovered at the end of affinity capture procedures by using target proteins of interest immobilized on a solid support. However, although the efficiency of phage-display selections is routinely quantified by counting the phage titer before and after the affinity capture step, no similar quantification procedures have been reported for the characterization of DECL selections. In this article, we describe the potential and limitations of quantitative PCR (qPCR) methods for the evaluation of selection efficiency by using a combinatorial chemical library with more than 35 million compounds. In the experimental conditions chosen for the selections, a quantification of DNA input/recovery over five orders of magnitude could be performed, revealing a successful enrichment of abundant binders, which could be confirmed by DNA sequencing. qPCR provided rapid information about the performance of selections, thus facilitating the optimization of experimental conditions. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
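The input/recovery quantification described above rests on the exponential relationship between a qPCR cycle-threshold (Ct) value and the amount of template DNA. A minimal sketch with hypothetical Ct values (the efficiency and Ct numbers are illustrative assumptions, not data from the article):

```python
def quantity(ct, efficiency=2.0):
    # relative template amount implied by a qPCR cycle-threshold (Ct) value,
    # assuming the reaction doubles the template each cycle (efficiency = 2)
    return efficiency ** (-ct)

# hypothetical Ct values measured before and after the affinity-capture step
ct_input, ct_recovered = 12.0, 21.0
recovery = quantity(ct_recovered) / quantity(ct_input)
print(f"fraction of library DNA recovered: {recovery:.2e}")
```

A 9-cycle shift in Ct corresponds to roughly a 500-fold loss of material; tracking such ratios across washing and elution conditions is what lets selection stringency be tuned.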
Full Text Available Abstract Background The cultivar Micro-Tom (MT) is regarded as a model system for tomato genetics due to its short life cycle and miniature size. However, efforts to improve tomato genetic transformation have led to protocols dependent on the costly hormone zeatin, combined with an excessive number of steps. Results Here we report the development of a MT near-isogenic genotype harboring the allele Rg1 (MT-Rg1), which greatly improves tomato in vitro regeneration. Regeneration was further improved in MT by including a two-day incubation of cotyledonary explants onto medium containing 0.4 μM 1-naphthaleneacetic acid (NAA) before cytokinin treatment. Both strategies allowed the use of 5 μM 6-benzylaminopurine (BAP), a cytokinin 100 times less expensive than zeatin. The use of MT-Rg1 and NAA pre-incubation, followed by BAP regeneration, resulted in high transformation frequencies (near 40%), in a shorter protocol with fewer steps, spanning approximately 40 days from Agrobacterium infection to transgenic plant acclimatization. Conclusions The genetic resource and the protocol presented here represent invaluable tools for routine gene expression manipulation and high-throughput functional genomics by insertional mutagenesis in tomato.
Khayet, Mohamed; Fernández, Victoria
Most aerial plant parts are covered with a hydrophobic lipid-rich cuticle, which is the interface between the plant organs and the surrounding environment. Plant surfaces may have a high degree of hydrophobicity because of the combined effects of surface chemistry and roughness. The physical and chemical complexity of the plant cuticle limits the development of models that explain its internal structure and interactions with surface-applied agrochemicals. In this article we introduce a thermodynamic method for estimating the solubilities of model plant surface constituents and relating them to the effects of agrochemicals. Following the van Krevelen and Hoftyzer method, we calculated the solubility parameters of three model plant species and eight compounds that differ in hydrophobicity and polarity. In addition, intact tissues were examined by scanning electron microscopy and the surface free energy, polarity, solubility parameter and work of adhesion of each were calculated from contact angle measurements of three liquids with different polarities. By comparing the affinities between plant surface constituents and agrochemicals derived from (a) theoretical calculations and (b) contact angle measurements we were able to distinguish the physical effect of surface roughness from the effect of the chemical nature of the epicuticular waxes. A solubility parameter model for plant surfaces is proposed on the basis of an increasing gradient from the cuticular surface towards the underlying cell wall. The procedure enabled us to predict the interactions among agrochemicals, plant surfaces, and cuticular and cell wall components, and promises to be a useful tool for improving our understanding of biological surface interactions.
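The contact-angle measurements mentioned above connect wettability to adhesion through the Young-Dupré equation, W_a = γ_L(1 + cos θ). A small illustrative calculation (the liquid and the contact angle below are hypothetical example values, not the paper's data):

```python
import math

def work_of_adhesion(gamma_l, theta_deg):
    # Young-Dupre equation: Wa = gamma_L * (1 + cos(theta)); units follow gamma_L
    return gamma_l * (1.0 + math.cos(math.radians(theta_deg)))

# water (surface tension ~72.8 mJ/m^2) on a hypothetical waxy, water-repellent
# plant surface with a contact angle of 140 degrees
wa = work_of_adhesion(72.8, 140.0)
print(f"work of adhesion: {wa:.1f} mJ/m^2")
```

The strongly hydrophobic angle yields a low work of adhesion, which is why agrochemical formulations for such surfaces typically need surfactants to lower γ_L and spread.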
Vettori, S.; Pecchioni, E.; Camaiti, M.; Garfagnoli, F.; Benvenuti, M.; Costagliola, P.; Moretti, S.
In the recent past, a wide range of protective products (in most cases, synthetic polymers) have been applied to the surfaces of ancient buildings/artefacts to preserve them from alteration. The lack of a detailed mapping of the permanence and efficacy of these treatments, in particular when applied on large surfaces such as building facades, may be particularly problematic when new restoration treatments are needed and the best choice of restoration protocol has to be made. The presence of protective compounds on stone surfaces may be detected in the laboratory by relatively simple diagnostic tests, which, however, normally require invasive (or micro-invasive) sampling methodologies and are time-consuming, thus limiting their use to a restricted number of samples and sampling sites. On the contrary, hyperspectral sensors are rapid, non-invasive and non-destructive tools capable of analyzing different materials on the basis of their different patterns of absorption at specific wavelengths, and are thus particularly suitable for the field of cultural heritage [2,3]. In addition, they can be successfully used to discriminate between inorganic (i.e. rocks and minerals) and organic compounds, as well as to acquire, in a short time, many spectra and compositional maps at relatively low cost. In this study we analyzed a number of stone samples (Carrara Marble and the biogenic calcarenites "Lecce Stone" and "Maastricht Stone") after treatment of their surfaces with synthetic polymers (synthetic wax, acrylic, perfluorinated and silicon-based polymers) in common use in conservation-restoration practice. The hyperspectral device used for this purpose was the ASD FieldSpec FR Pro spectroradiometer, a portable, high-resolution instrument designed to acquire Visible and Near-Infrared (VNIR: 350-1000 nm) and Short-Wave Infrared (SWIR: 1000-2500 nm) point reflectance spectra with a rapid data collection time (about 0.1 s for each spectrum). The reflectance spectra so far obtained in
Rasooly, Rebekah S; Akolkar, Beena; Spain, Lisa M; Guill, Michael H; Del Vecchio, Corey T; Carroll, Leslie E
The National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK) Central Repositories, part of the National Institutes of Health (NIH), are an important resource available to researchers and the general public. The Central Repositories house samples, genetic data, phenotypic data, and study documentation from >100 NIDDK-funded clinical studies, in areas such as diabetes, digestive disease, and liver disease research. The Central Repositories also have an exceptionally rich collection of studies related to kidney disease, including the Modification of Diet in Renal Disease landmark study and recent data from the Chronic Renal Insufficiency Cohort and CKD in Children Cohort studies. The data are carefully curated and linked to the samples from the study. The NIDDK is working to make the materials and data accessible to researchers. The Data Repositories continue to improve flexible online searching tools that help researchers identify the samples or data of interest, and NIDDK has created several different paths to access the data and samples, including some funding initiatives. Over the past several years, the Central Repositories have seen steadily increasing interest and use of the stored materials. NIDDK plans to make more collections available and do more outreach and education about use of the datasets to the nephrology research community in the future to enhance the value of this resource. Copyright © 2015 by the American Society of Nephrology.
Green plants are the ultimate source of all resources required for man's life, his food, his clothes, and almost all his energy requirements. Primitive prehistoric man could live from the abundance of nature surrounding him. Man today, dominating nature in terms of numbers and exploiting its limited resources, cannot exist without employing his intelligence to direct natural evolution. Plant sciences, therefore, are not a matter of curiosity but an essential requirement. From such considerations, the IAEA and FAO jointly organized a symposium to assess the value of mutation research for various kinds of plant science, which directly or indirectly might contribute to sustaining and improving crop production. The benefit through developing better cultivars that plant breeders can derive from using the additional genetic resources resulting from mutation induction has been assessed before at other FAO/IAEA meetings (Rome 1964, Pullman 1969, Bari 1974, Ibadan 1978) and is also monitored in the Mutation Breeding Newsletter, published by IAEA twice a year. Several hundred plant cultivars which carry economically important characters because their genes have been altered by ionizing radiation or other mutagens are grown by farmers and horticulturists in many parts of the world. But the benefit derived from such mutant varieties is without any doubt surpassed by the contribution which mutation research has made towards the advancement of genetics. For this reason, a major part of the papers and discussions at the symposium dealt with the role induced-mutation research played in providing insight into gene action and gene interaction, the organization of genes in plant chromosomes in view of homology and homoeology, the evolutionary role of gene duplication and polyploidy, the relevance of gene blocks, the possibilities for chromosome engineering, the functioning of cytoplasmic inheritance and the genetic dynamics of populations. In discussing the evolutionary role of
Full Text Available Research tools are the resources researchers need in experimental work. In biotechnology, these can include cell lines, monoclonal antibodies, reagents, animal models, growth factors, combinatorial chemistry libraries, drugs and drug targets, clones and cloning tools (such as PCR), methods, laboratory equipment and machines, databases and computer software. Research tools therefore serve as the basis for upstream research to improve an existing product or process. There are several challenges in the way of using patented research tools. IP issues with regard to research tools are important and may sometimes pose a hindrance for researchers; in the case of patented research tools, IPR issues can be a major hurdle for technology development. In the majority of instances, research tools are made available through MTAs for academic research and for imparting education. TRIPS provides for an exception to patent rights for experimental use of patented technology in scientific research, and several countries, including India, have included this provision in their patent legislation. For commercially important work, licensing of research tools can be based on royalties or a one-time lump-sum payment. Some patent owners of important high-end research tools for the development of platform technologies create problems in licensing, which can impede research. Usually the cost of a commercially available research tool is built into its price.
Mahajan Ashwini; Prof. B.V. Jain; Dr Surajj Sarode
A centrifuge is a critical piece of laboratory equipment. The purpose of this study was to examine the research centrifuge in detail: its applications, its uses in different branches, and its salient features. Two types of research centrifuge are studied here: the revolutionary research centrifuge and the microprocessor research centrifuge. A centrifuge is a device that separates particles from a solution through use of a rotor. In biology, the particles are usually cells, subcellular organelles, or large mo...
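The separation a centrifuge achieves is conventionally reported as relative centrifugal force (RCF), computed from the rotor radius and speed with the standard formula RCF = 1.118 × 10⁻⁵ × r(cm) × rpm². A quick sketch (the radius and speed are example values, not from the article):

```python
def rcf(radius_cm, rpm):
    # relative centrifugal force (in multiples of g) for a rotor of the given
    # radius spinning at the given speed: RCF = 1.118e-5 * r * rpm^2
    return 1.118e-5 * radius_cm * rpm ** 2

# example values: a 10 cm rotor radius at 3000 rpm
print(f"{rcf(10, 3000):.0f} x g")
```

Because RCF grows with the square of the speed, doubling rpm quadruples the force, which is why protocols specify RCF rather than rpm when rotors differ.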
Kohlmann, Rebekka; Hoffmann, Alexander; Geis, Gabriele; Gatermann, Sören
approach allowed an optimized treatment recommendation. MALDI-TOF MS following 4h pre-culture is a valuable tool for rapid pathogen identification from positive blood cultures, allowing easy integration in diagnostic routine and the opportunity of considerably earlier treatment adaptation. Copyright © 2015 Elsevier GmbH. All rights reserved.
Barnett, Stephen; Henderson, Joan; Hodgkins, Adam; Harrison, Christopher; Ghosh, Abhijeet; Dijkmans-Hadley, Bridget; Britt, Helena; Bonney, Andrew
Electronic medical data (EMD) from electronic health records of general practice computer systems have enormous research potential, yet many variables are unreliable. The aim of this study was to compare selected data variables from general practice EMD with a reliable, representative national dataset (Bettering the Evaluation and Care of Health (BEACH)) in order to validate their use for primary care research. EMD variables were compared with encounter data from the nationally representative BEACH program using χ² tests and robust 95% confidence intervals to test their validity (measure what they reportedly measure). The variables focused on for this study were patient age, sex, smoking status and medications prescribed at the visit. The EMD sample from six general practices in the Illawarra region of New South Wales, Australia, yielded data on 196,515 patient encounters. Details of 90,553 encounters were recorded in the 2013 BEACH dataset from 924 general practitioners. No significant differences in patient age (p = 0.36) or sex (p = 0.39) were found. EMD had a lower rate of current smokers and higher average scripts per visit, but similar prescribing distribution patterns. Validating EMD variables offers avenues for improving primary care delivery and measuring outcomes of care to inform clinical practice and health policy.
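The χ² comparisons described above operate on contingency tables of encounter counts. A minimal sketch using SciPy with invented counts (the numbers below are placeholders, not the EMD/BEACH data):

```python
from scipy.stats import chi2_contingency

# invented encounter counts by patient sex (rows) across two datasets (columns)
#                 EMD     BEACH
table = [[104_000, 52_000],   # female encounters
         [92_500, 38_500]]    # male encounters
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```

A p-value above the chosen threshold (as the study found for age and sex) supports the claim that the EMD variable's distribution matches the reference dataset.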
Knollmann, Björn C
As part of the series on Controversies in Cardiovascular Research, the article reviews the strengths and limitations of induced pluripotent stem cell-derived cardiomyocytes (iPSC-CM) as models of cardiac arrhythmias. Specifically, the article attempts to answer the following questions: Which clinical arrhythmias can be modeled by iPSC-CM? How well can iPSC-CM model adult ventricular myocytes? What are the strengths and limitations of published iPSC-CM arrhythmia models? What new mechanistic insight has been gained? What is the evidence that would support using iPSC-CM to personalize anti-arrhythmic drug therapy? The review also discusses the pros and cons of using the iPSC-CM technology for modeling specific genetic arrhythmia disorders such as long QT syndrome, Brugada Syndrome or Catecholaminergic Polymorphic Ventricular Tachycardia. PMID:23569106
Full Text Available The hermaphroditic Mangrove Killifish, Kryptolebias marmoratus, is the world's only vertebrate that routinely self-fertilizes. As such, highly inbred and presumably isogenic "clonal" lineages of this androdioecious species have long been maintained in several laboratories and used in a wide variety of experiments that require genetically uniform vertebrate specimens. Here we conduct a genetic inventory of essentially all laboratory stocks of the Mangrove Killifish held worldwide. At 32 microsatellite loci, these stocks proved to show extensive interline differentiation as well as some intraline variation, much of which can be attributed to post-origin de novo mutations and/or to the segregation of polymorphisms from wild progenitors. Our genetic findings also document that many of the surveyed laboratory strains are not what they have been labeled, apparently due to the rather frequent mishandling or unintended mixing of various laboratory stocks over the years. Our genetic inventory should help to clarify much of this confusion about the clonal identities and genetic relationships of laboratory lines, and thereby help to rejuvenate interest in K. marmoratus as a reliable vertebrate model for experimental research that requires or can capitalize upon "clonal" replicate specimens.
Lennox, Laura; Doyle, Cathal; Reed, Julie E; Bell, Derek
Although improvement initiatives show benefits to patient care, they often fail to sustain. Models and frameworks exist to address this challenge, but issues with design, clarity and usability have been barriers to use in healthcare settings. This work aimed to collaborate with stakeholders to develop a sustainability tool relevant to people in healthcare settings and practical for use in improvement initiatives. Tool development was conducted in six stages. A scoping literature review, group discussions and a stakeholder engagement event explored literature findings and their resonance with stakeholders in healthcare settings. Interviews, small-scale trialling and piloting explored the design and tested the practicality of the tool in improvement initiatives. National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care for Northwest London (CLAHRC NWL). CLAHRC NWL improvement initiative teams and staff. The iterative design process and engagement of stakeholders informed the articulation of the sustainability factors identified from the literature and guided tool design for practical application. Key iterations of factors and tool design are discussed. From the development process, the Long Term Success Tool (LTST) has been designed. The Tool supports those implementing improvements to reflect on 12 sustainability factors to identify risks to increase chances of achieving sustainability over time. The Tool is designed to provide a platform for improvement teams to share their own views on sustainability as well as learn about the different views held within their team to prompt discussion and actions. The development of the LTST has reinforced the importance of working with stakeholders to design strategies which respond to their needs and preferences and can practically be implemented in real-world settings. Further research is required to study the use and effectiveness of the tool in practice and assess engagement with
Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C
This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
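Steps ii)-iv) above amount to a per-pixel look-up and multiply. The original tool was written in C#; the following is a minimal Python/NumPy sketch of the same idea, with invented image sizes and correction-factor values (the look-up table entries are placeholders, not measured MLC-transmission data):

```python
import numpy as np

# step ii) output: fraction of beam-on time each pixel is shielded by MLC leaves
shielded_frac = np.linspace(0.0, 1.0, 16).reshape(4, 4)
pdip = np.full((4, 4), 100.0)   # predicted EPID image from the TPS (a.u.)

# step iii): look-up table mapping shielded fraction -> correction factor
lut_frac = np.array([0.0, 0.5, 1.0])
lut_factor = np.array([1.00, 0.92, 0.85])
factors = np.interp(shielded_frac, lut_frac, lut_factor)

# step iv): the corrected image is the pixelwise product
corrected = pdip * factors
print(corrected.min(), corrected.max())
```

Pixels that are never shielded keep the original predicted value, while fully shielded pixels are scaled down by the MLC-transmission correction.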
…such previous work, two case studies are presented in which drawings helped investigate the relationship between media technology users and two specific devices, namely television and mobile phones. The experiment generated useful data and opened the way for further consideration of the method as an appropriate HCI research tool.
Pantula, Sastry; Dickey, David
Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
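The core computation the book builds on, ordinary least squares, can be shown in a few lines. A minimal sketch with synthetic data (the true intercept and slope are chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, 50)
y = 2.0 + 0.5 * x + rng.normal(0.0, 0.1, 50)   # known line plus noise

# design matrix with an intercept column; solve min ||X b - y||^2
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # estimated intercept and slope, near 2.0 and 0.5
```

Recovering coefficients close to the generating values is the "appropriate use" baseline; the book's emphasis is on the diagnostics needed before trusting such estimates on real data.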
Rodriguez, W. J.; Chaudhury, S. R.
Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three-dimensions where the researcher can change the scales in the three-dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable
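The slice-and-average operations that tools like EzSAGE automate can be sketched generically; the dataset shape and variable names below are hypothetical and do not reflect the actual SAGE II data format:

```python
import numpy as np

# Hypothetical gridded dataset: dimensions (time, latitude, altitude)
data = np.random.default_rng(0).random((12, 18, 40))

# Filter undesirable values (here: anything flagged negative) before analysis
clean = np.where(data < 0, np.nan, data)

# Extract a two-dimensional latitude-altitude "atmospheric slice" for one month
slice_month0 = clean[0, :, :]

# Zonal-mean vertical profile: average over latitude, ignoring missing values
profile = np.nanmean(slice_month0, axis=0)
```

A visualization tool essentially wraps operations like these behind interactive controls for scale, color table, and smoothing.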
The social networking tool Twitter may soon be adopted by petroleum industry workers as a means of ensuring increased communications. Comprising social networks, link sharing, and live searching, the tool can be used to conduct subject searches as well as to link to quarterly reports and press releases. Twitter is also being used to manage crisis communications as well as to monitor activities on the Internet. Twitter may also provide a means for oil and gas operators to follow influential industry bloggers as well as to develop effective communications strategies. It was concluded that Twitter may offer an opportunity for companies to participate in non-traditional communications approaches such as online forums and other media-sharing tools. 1 fig.
І. К. Журавльова
Full Text Available The article describes the tasks that electronic collections of rare books fulfill: providing readers broad access to rare and valuable editions while ensuring preservation of the originals. The example of the electronic collection of the Central Scientific Library of the V.N. Karazin Kharkiv National University – «eScriptorium: electronic archive of rare books and manuscripts for research and education» – shows the possibilities of using full-text resources of valuable editions. The principles of creation, structure, chronological framework, and directions for adding documents to the archive are presented. The perspectives of the project's development are outlined, examples of digital libraries of European countries and Ukraine are provided, the pressing task of preserving the originals of the country's rare books is raised, and innovative approaches to serving users with electronic resources are considered. Evidence of cooperation of the Central Scientific Library of the V.N. Karazin Kharkiv National University with the largest world digital libraries, the World Digital Library and Europeana, is provided.
Van Hal, J.D.M.
Challenging and valuable. Inaugural speech given on May 7th 2008 on the occasion of the acceptance of the position of Professor of Sustainable Housing Transformation at the Faculty of Architecture of the Delft University of Technology by Prof. J.D.M. van Hal MSc PhD.
Bräutigam, Andrea; Gowik, Udo
Next generation sequencing (NGS) technologies have opened fascinating opportunities for the analysis of plants with and without a sequenced genome on a genomic scale. During the last few years, NGS methods have become widely available and cost effective. They can be applied to a wide variety of biological questions, from the sequencing of complete eukaryotic genomes and transcriptomes, to the genome-scale analysis of DNA-protein interactions. In this review, we focus on the use of NGS for pla...
Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim
Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing and accessing data, in collaborating and exchanging information, and in publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects provide their own solutions to manage research data. Within two projects, however - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). In order to fully exploit the potential of these developments for enhancing data management in the Geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the Geosciences developed in two projects. A sort of construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In future projects, specific RDM platforms can be set up much faster, customized to individual needs, and tools can be added during run-time.
Full Text Available Leishmania is the causative agent of leishmaniasis, a neglected tropical disease that affects more than 12 million people around the world. Current treatments are toxic and poorly effective due to the acquisition of resistance within Leishmania populations. Thus, the pursuit for new antileishmanial drugs is a priority. The available methods for drug screening based on colorimetric assays using vital dyes are time-consuming. Currently, the use of fluorescent reporter proteins is replacing the use of viability indicator dyes. We have constructed two plasmids expressing the red fluorescent protein mCherry with multiple cloning sites (MCS, adequate for N- and C-terminal fusion protein constructs. Our results also show that the improved pXG-mCherry plasmid can be employed for drug screening in vitro. The use of the red fluorescent protein, mCherry, is an easier tool for numerous assays, not only to test pharmacological compounds, but also to determine the subcellular localization of proteins.
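A fluorescence readout such as the mCherry signal described above is commonly converted to percent inhibition against untreated and blank controls; the sketch below uses the standard normalization formula, which is an assumption here and not taken from the paper:

```python
def percent_inhibition(f_treated, f_control, f_blank):
    """Percent inhibition from raw fluorescence readings.

    f_treated -- signal of parasites exposed to the test compound
    f_control -- signal of the untreated (growth) control
    f_blank   -- background signal with no parasites
    """
    # Background-subtracted growth relative to the untreated control
    growth = (f_treated - f_blank) / (f_control - f_blank)
    return 100.0 * (1.0 - growth)
```

For instance, a treated well reading halfway between blank and control corresponds to 50% inhibition.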
Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.
A process for recovering valuable liquid hydrocarbons from coking coal, mineral coal, or oil shale through treatment with hydrogen under pressure at elevated temperature is described. Catalysts and grinding oil may be used in the process if necessary. The process provides for deashing the coal prior to hydrogenation and for preventing the coking and swelling of the deashed material. During the treatment with hydrogen, the coal is either mixed with coal low in bituminous material, such as lean coal or active coal, as a diluent or the bituminous constituents which cause the coking and swelling are removed by extraction with solvents. (BLM)
Kevin R. Butt
Full Text Available Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
Butt, K.R.; Grigoropoulou, N.
Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.
Kirkpatrick, CJ; Otto, M; van Kooten, T; Krump; Kriegsmann, J; Bittinger, F
Progress in biocompatibility and tissue engineering would today be inconceivable without the aid of in vitro techniques. Endothelial cell cultures represent a valuable tool not just in haemocompatibility testing, but also in the concept of designing hybrid organs. In the past endothelial cells (EC)
Hüllemann, S; Schüpfer, G; Mauch, J
In naturally occurring numbers, the frequencies of the digits 1-9 in the leading position are counterintuitively distributed: the frequencies of occurrence are unequal. Benford-Newcomb's law describes the expected distribution of these frequencies. It was previously shown that known fraudulent articles consistently violated this law. The objective was to compare the features of 12 known fraudulent articles from a single Japanese author with the features of 13 articles in the same research field from other Japanese authors, published during the same time period and identified with a Medline database search. All 25 articles were assessed to determine whether their data violated the law. Formulas provided by the law were used to determine the frequencies of occurrence of the first two leading digits in manually extracted numbers. It was found that all the known fraudulent papers violated the law and that 6 of the 13 articles used for comparison followed the law. Assuming that the articles in the comparison group were not falsified or fabricated, the sensitivity of assessing articles with Benford-Newcomb's law was 100% (95% confidence interval [CI]: 73.54-100%), but the specificity was only 46.15% (95% CI: 19.22-74.87%) and the positive predictive value was 63.16% (95% CI: 38.36-83.71%). All 12 of the known falsified articles violated Benford-Newcomb's law, indicating that this analysis had a high sensitivity. The low specificity of the assessment may be explained by the assumptions made about the articles identified for comparison. Violations of Benford-Newcomb's law for the frequencies of the leading digits cannot serve as proof of falsification, but they may provide a basis for deeper discussions between the editor and author about a submitted work.
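The expected leading-digit frequencies under Benford-Newcomb's law are log10(1 + 1/d) for d = 1..9; the sketch below illustrates the idea for single leading digits only and is not the authors' exact two-digit procedure:

```python
import math
from collections import Counter

def leading_digit(x):
    """First significant digit of a positive number."""
    while x < 1:
        x *= 10
    while x >= 10:
        x /= 10
    return int(x)

def benford_expected(d):
    """Expected frequency of leading digit d (1-9) under Benford's law."""
    return math.log10(1 + 1 / d)

def chi_square_vs_benford(numbers):
    """Chi-square statistic of observed first-digit counts vs Benford."""
    counts = Counter(leading_digit(x) for x in numbers if x > 0)
    n = sum(counts.values())
    return sum(
        (counts.get(d, 0) - n * benford_expected(d)) ** 2
        / (n * benford_expected(d))
        for d in range(1, 10)
    )
```

A large chi-square statistic flags a set of numbers whose leading digits deviate from the expected distribution; as the abstract notes, this is a basis for discussion, not proof of fraud.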
Michael A. Langston
Full Text Available Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.
Katherine D. Seelman
Full Text Available The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into the main topics of telerehabilitation, policy, population of users, and policy-specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification of Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are paramount to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality-of-life items such as assistive technology and environmental factors. Keywords: Telerehabilitation, Telehomecare, Telepsychiatry, Telepractice
Voivonta, Theodora; Avraamidou, Lucy
This paper is concerned with the educational value of Facebook and specifically how it can be used in formal educational settings. As such, it provides a review of existing literature on how Facebook is used in higher education, with emphasis on the scope of its use and the outcomes achieved. As evident in existing literature, Facebook has been…
Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team
The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.
Mychasiuk, R; Benzies, K
Facebook is currently one of the world's most visited websites, and home to millions of users who access their accounts on a regular basis. Owing to the website's ease of accessibility and free service, demographic characteristics of users span all domains. As such, Facebook may be a valuable tool for locating and communicating with participants in longitudinal research studies. This article outlines the benefit gained in a longitudinal follow-up study, of an intervention programme for at-risk families, through the use of Facebook as a search engine. Using Facebook as a resource, we were able to locate 19 participants that were otherwise 'lost' to follow-up, decreasing attrition in our study by 16%. Additionally, analysis indicated that hard-to-reach participants located with Facebook differed significantly on measures of receptive language and self-esteem when compared to their easier-to-locate counterparts. These results suggest that Facebook is an effective means of improving participant retention in a longitudinal intervention study and may help improve study validity by reaching participants that contribute differing results. © 2011 Blackwell Publishing Ltd.
This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment. (author)
Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria
Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...
Andrea Calsamiglia Madurga
Full Text Available We present a theoretical and epistemological reflection on Forum Theater’s potential as a research tool. Our involvement in social action and research has led us to a double reflection: on qualitative research’s limitations in the study of affects, and on Forum Theater’s potential as a research tool for tackling research about affects. After specific experiences in action research (qualitative research on romantic love and gender violence) and the creation process of the Forum Theater “Is it a joke?”, we explore Forum Theater’s possibilities as a research tool within the feminist epistemology framework.
Nijssen, E.J.; Frambach, R.T.
This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client
Techniques to quantify ephemeral gully erosion have been identified by USDA Natural Resources Conservation Service (NRCS) as one of gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty to quantify changes in channel geometry to asses...
The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...
O V Verkhodanov
Full Text Available We describe the current status of CATS (astrophysical CATalogs Support system), a publicly accessible tool maintained at the Special Astrophysical Observatory of the Russian Academy of Sciences (SAO RAS; http://cats.sao.ru) allowing one to search hundreds of catalogs of astronomical objects discovered all along the electromagnetic spectrum. Our emphasis is mainly on catalogs of radio continuum sources observed from 10 MHz to 245 GHz, and secondly on catalogs of objects such as radio and active stars, X-ray binaries, planetary nebulae, HII regions, supernova remnants, pulsars, nearby and radio galaxies, AGN and quasars. CATS also includes the catalogs from the largest extragalactic surveys with non-radio waves. In 2008 CATS comprised a total of about 10⁹ records from over 400 catalogs in the radio, IR, optical and X-ray windows, including most source catalogs deriving from observations with the Russian radio telescope RATAN-600. CATS offers several search tools through different ways of access, e.g. via Web interface and e-mail. Since its creation in 1997 CATS has managed about 10⁵ requests. Currently CATS is used by external users about 1500 times per day and since its opening to the public in 1997 has received about 4000 requests for its selection and matching tasks.
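The positional matching task a system like CATS performs can be illustrated with a naive cross-match by angular separation; this O(n·m) sketch is purely illustrative and does not reflect CATS's actual algorithms:

```python
import math

def angular_separation(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees between two sky positions
    (right ascension, declination) given in degrees, via the haversine form."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    a = (math.sin((dec2 - dec1) / 2) ** 2
         + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
    return math.degrees(2 * math.asin(math.sqrt(a)))

def cross_match(cat_a, cat_b, radius_deg):
    """Pair every source in cat_a with all cat_b sources within radius_deg.
    Catalogs are lists of (ra, dec) tuples in degrees; returns index pairs."""
    return [
        (i, j)
        for i, (ra1, dec1) in enumerate(cat_a)
        for j, (ra2, dec2) in enumerate(cat_b)
        if angular_separation(ra1, dec1, ra2, dec2) <= radius_deg
    ]
```

Production systems replace the all-pairs loop with spatial indexing (e.g. sky pixelization) to match catalogs of millions of sources.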
Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter
In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 Speech-language Pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.
Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.
This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....
Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…
Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.
Lobashev, V.M.; Tavkhelidze, A.N.
A meson facility is being built at the Institute of Nuclear Research, USSR Academy of Sciences, in Troitsk, where the Scientific Center of the USSR Academy of Sciences is located. The facility will include a linear accelerator for protons and negative hydrogen ions with 600 MeV energy and 0.5-1 mA beam current. Some fundamental problems that can be studied at a meson facility are described in the areas of elementary particles, neutron physics, solid state physics, and applied research. The characteristics of the linear accelerator are given and the meson facility's experimental complex is described.
Ebrahim, Nader Ale
“Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...
View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy
This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…
Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.
The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component
BACKGROUND: We systematically analyzed multiple myeloma (MM) cell lines and patient bone marrow cells for their engraftment capacity in immunodeficient mice and validated the response of the resulting xenografts to antimyeloma agents. DESIGN AND METHODS: Using flow cytometry and near-infrared fluorescence in vivo imaging, growth kinetics of the MM cell lines L363 and RPMI8226 and of patient bone marrow cells were investigated with a murine subcutaneous bone implant, intratibial, and intravenous approach in NOD/SCID, NOD/SCID treated with CD122 antibody, and NOD/SCID IL-2Rγ(null) (NSG) mice. RESULTS: Myeloma growth was significantly increased in the absence of natural killer cell activity (NSG or αCD122-treated NOD/SCID). Comparison of NSG and αCD122-treated NOD/SCID revealed enhanced growth kinetics in the former, especially with respect to metastatic tumor sites, which were exclusively observed therein. In NSG, MM cells were more tumorigenic when injected intratibially than intravenously. In NOD/SCID, in contrast, the use of juvenile long bone implants was superior to intratibial or intravenous cancer cell injection. Using the intratibial NSG model, mice developed typical disease symptoms exclusively when implanted with human MM cell lines or patient-derived bone marrow cells, but not with healthy bone marrow cells or in mock-injected animals. Bortezomib and dexamethasone delayed myeloma progression in NSG bearing L363-derived as well as patient-derived MM cells. Antitumor activity could be quantified via flow cytometry and in vivo imaging analyses. CONCLUSIONS: Our results suggest that the intratibial NSG MM model mimics the clinical situation of disseminated disease and serves as a valuable tool in the development of novel anticancer strategies.
Jernberg, Tomas; Lindahl, Bertil
Electrocardiography (ECG) obtained on admission and a troponin T (tn-T) level measured early after admission are simple and accessible methods for predicting outcome in patients with suspected unstable angina or myocardial infarction without persistent ST-elevations. However, there are few studies of the combination of these 2 methods as a means of predicting long-term outcome. ECG was obtained on admission, and a tn-T level was analyzed on admission and after 6 hours in 710 consecutive patients admitted because of chest pain and no ST-elevations. Patients were observed for death over a median of 40 months. ST-segment depressions ≥0.05 mV were present in 266 patients (37%). These patients had a 9.7-fold increased risk of death compared with patients with normal ECG results. Isolated T-wave inversions or pathological signs other than ST-T changes were present in 196 patients (28%), who had a 4.5-fold increased risk of death compared with patients who had normal ECG results. At 6 hours after admission, 169 patients (24%) had at least 1 sample of tn-T ≥0.10 µg/L, which resulted in a 3.7-fold increased risk of death. In a multivariate analysis, both ECG on admission and tn-T level emerged as independent predictors of outcome. When these methods were combined, patients could be divided into low- (tn-T level <0.10 µg/L and no ST-segment depression), intermediate- (tn-T level ≥0.10 µg/L or ST-segment depression), and high-risk groups (tn-T level ≥0.10 µg/L and ST-segment depression). ECG and tn-T level are valuable tools to quickly risk-stratify patients with chest pain. The combination of these methods is superior to either one alone.
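The combined stratification described in this abstract reduces to a simple two-marker rule. A minimal sketch follows, assuming the three-group scheme with the abstract's thresholds (tn-T ≥0.10 µg/L, ST depression ≥0.05 mV); the function name and signature are illustrative, not from the paper:

```python
def risk_group(tnt_ug_per_l: float, st_depression_mv: float) -> str:
    """Classify a patient by the combined ECG/troponin-T rule.

    Thresholds follow the abstract: tn-T >= 0.10 ug/L and
    ST-segment depression >= 0.05 mV. Both markers positive ->
    high risk; exactly one -> intermediate; neither -> low.
    """
    tnt_high = tnt_ug_per_l >= 0.10
    st_dep = st_depression_mv >= 0.05
    if tnt_high and st_dep:
        return "high"
    if tnt_high or st_dep:
        return "intermediate"
    return "low"
```

The point of the rule is that each marker contributes independent prognostic information, so combining them separates risk better than either alone.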
This article explores the role of drawing as a tool for reflection. It reports on a PhD research project that aims to identify and analyse the value that co-design processes can bring to participants and their communities. The research is associated with Leapfrog, a three-year project funded by the UK Arts and Humanities Research Council (AHRC).…
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it feasible to use smartphones as a data collection tool for social research? What are the effects on data quality? Can we also incorporate reality mining tools in the smartphone app to replace traditional...
Price, Geoffrey P.; Wright, Vivian H.
Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…
Michael D. Coovert
Serious games are an attractive tool for education and training, but their utility is even broader. We argue that serious games provide a unique opportunity for research as well, particularly in areas where multiple players (groups or teams) are involved. In our paper we provide background in several substantive areas. First, we outline major constructs and challenges found in team research. Second, we discuss serious games, providing an overview and a description of their role in education, training, and research. Third, we describe the necessary characteristics of game engines utilized in team research, followed by a discussion of the value added by utilizing serious games. Our goal in this paper is to argue that serious games are an effective tool with demonstrated reliability and validity and should be part of a research program for those engaged in team research. Both team researchers and those involved in serious game development can benefit from a mutual, research-focused partnership.
Schou, Lone; Høstrup, Helle; Lyngsø, Elin
Schou L., Høstrup H., Lyngsø E.E., Larsen S. & Poulsen I. (2011) Validation of a new assessment tool for qualitative research articles. Journal of Advanced Nursing 00(0), 000-000. doi: 10.1111/j.1365-2648.2011.05898.x ABSTRACT: Aim. This paper presents the development and validation of a new assessment tool for qualitative research articles, which could assess the trustworthiness of qualitative research articles as defined by Guba and at the same time aid clinicians in their assessment. Background. There are more than 100 sets of proposals for quality criteria for qualitative research. However, we … is the Danish acronym for Appraisal of Qualitative Studies. Phase 1 was to develop the tool based on a literature review and on consultation with qualitative researchers. Phase 2 was an inter-rater reliability test in which 40 health professionals participated. Phase 3 was an inter-rater reliability test among …
Sarah E. Council
The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects.
Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao
High-speed machining technology can improve processing efficiency and precision while also reducing processing cost, so the technology is highly regarded in industry. With the extensive application of high-speed machining, high-speed tool systems place ever higher requirements on the tool chuck. At present, in high-speed precision machining, several new kinds of tool chucks are in use, including the heat-shrink tool-holder, the high-precision spring collet chuck, the hydraulic tool-holder, and the three-rib deformation chuck. Among them, the heat-shrink tool-holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is therefore widely used. It is thus of great significance to research the new requirements on the machining tool system. In order to meet the requirements of high-speed precision machining technology, this paper describes the common tool-holder technologies of high-precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics of, and existing problems in, tool clamping systems are analyzed.
A process is described for the recovery of valuable shale oils or tars, characterized in that the oil shale is heated to about 300 °C, or a temperature not substantially exceeding this, and is then treated with a solvent with utilization of this heat.
Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.
Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better understanding of the theory of spectroscopy and imaging spectroscopy through 'hands-on' activity. This tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool can visualize spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as spectra of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
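Spectral angle mapping, one of the algorithms the abstract names, compares two spectra by the angle between them as vectors, which makes the measure insensitive to overall illumination scaling. A minimal stdlib sketch of the core computation (not the EPIF Spec tool's actual code, whose implementation is not published here):

```python
import math

def spectral_angle(target, reference):
    """Spectral angle (radians) between two spectra, as used in
    spectral angle mapping (SAM): arccos of the normalized dot
    product. A smaller angle means a closer spectral match.
    Inputs are equal-length sequences of reflectance values."""
    dot = sum(t * r for t, r in zip(target, reference))
    norm_t = math.sqrt(sum(t * t for t in target))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_angle = max(-1.0, min(1.0, dot / (norm_t * norm_r)))
    return math.acos(cos_angle)
```

Because only the direction of the spectrum vector matters, a spectrum and a brighter copy of it (e.g. scaled by 2) give an angle of zero, which is exactly why SAM is popular for matching library spectra to field measurements taken under varying illumination.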
Ebrahim, Nader Ale
“Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. They are assembled as an interactive web-based mind map, titled “Research Tools”, which is updated...
This report describes follow-up research and development on the recovery of valuable resources, such as magnesium, bromine and boron, contained in the brackish water used for the manufacture of common salt in the coastal region of Mexico. For the field survey, a salt garden, an irrigation plant and a table salt manufacturing plant were inspected. The optimum site was examined by assuming a desalination plant and a solar pond. The groundwater in the coastal regions is becoming progressively salinized. Since the coastal region is a tourist resort visited by migrating whales, environmental protection is indispensable. For the joint research with invited researchers, the solar pond system and fresh water generation were studied. As a result, it was found that the solar pond system is an excellent method for storing thermal energy at low cost at a salt garden with abundant solar energy, and that a desalination system combined with distillation is the most suitable method. 7 refs., 8 figs., 1 tab.
The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer from large-scale proteogenomic datasets, and to advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.
Pritchett, Amy R.
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
At last, the first systematic guide to the growing jungle of citation indices and other bibliometric indicators. Written with the aim of providing a complete and unbiased overview of all available statistical measures for scientific productivity, the core of this reference is an alphabetical dictionary of indices and other algorithms used to evaluate the importance and impact of researchers and their institutions. In 150 major articles, the authors describe all indices in strictly mathematical terms without passing judgement on their relative merit. From widely used measures, such as the journal impact factor or the h-index, to highly specialized indices, all indicators currently in use in the sciences and humanities are described, and their application explained. The introductory section and the appendix contain a wealth of valuable supporting information on data sources, tools and techniques for bibliometric and scientometric analysis - for individual researchers as well as their funders and publishers.
A process is described for the preparation of valuable hydrocarbons by treatment of carbonaceous materials, such as coal, tars, mineral oils, and their distillation and conversion products, and for the refining of the liquid hydrocarbon mixtures obtained, at raised temperature and under pressure, preferably in the presence of catalysts, by the use of hydrogen-containing gases purified and obtained by distilling solid combustibles. The process is characterized in that the purification of the hydrogen-containing gases, for the purpose of practically complete removal of the oxygen, is accomplished by heating at ordinary or higher pressure in the presence of a catalyst containing silver and oxides of metals of group VI of the periodic system.
Having considered the varying estimates of future UK energy requirements which have been made, the impact on the environment arising from the use of valuable sites for energy production is examined. It is shown that energy installations of all kinds clash with areas of natural beauty or ecological importance. As an example, a recent investigation of potential sites for nuclear power stations found that most of them were on or next to sites of special scientific interest, and other areas officially designated to be regarded as special or to be protected in some way. (U.K.)
Hakim, Toufic M.; Garg, Shila
The National Science Foundation's 1996 report "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering and Technology" urged that in order to improve SME&T education, decisive action must be taken so that "all students have access to excellent undergraduate education in science .... and all students learn these subjects by direct experience with the methods and processes of inquiry." Research-related educational activities that integrate education and research have been shown to be valuable in improving the quality of education and enhancing the number of majors in physics departments. Student researchers develop a motivation to continue in science and engineering through an appreciation of how science is done and the excitement of doing frontier research. We will address some of the challenges of integrating research into the physics undergraduate curriculum effectively. The departmental and institutional policies and infrastructure required to help prepare students for this endeavor will be discussed as well as sources of support and the establishment of appropriate evaluation procedures.
Schell, Scott R
Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.
Kolbæk, Raymond; Steensgaard, Randi; Angel, Sanne
Abstract Content: Major challenges occur when trying to implement research in clinical practice. In the West Danish Center for Spinal Cord Injury, we are conducting a practice-based PhD project that involves the practice field's own members as co-researchers. In the management of the project we use … Furthermore, we try to evidence-base the concept of "Sample handlings" and examine whether this concept can be used as a flexible methodological tool for developing workflows that promote patient participation in their own rehabilitation. We use an action research design to identify actual problems and to develop, test, evaluate and implement specific actions to promote patient participation in rehabilitation. Four nurses and four social and health assistants have an active "co-researcher" role. The interaction with the researchers creates a reflexive and dynamic process with a learning and competence …
The new Internet technologies have infiltrated the academic environment in a stunning way, both at the individual and the institutional level. More and more teachers have started educational blogs, librarians are active on Twitter, other educational actors curate web content, students post on Instagram or Flickr, and university departments have Facebook pages and/or YouTube accounts. Today, the use of web technology has become “a legitimate activity in many areas of higher education” (Waycott, 2010) and a considerable shift to digital academic research has gradually occurred. Teachers are encouraging students to take up digital tools for research and writing, thus revealing new ways of using information and communication technologies for academic purposes and not just for socializing. The main objective of this paper is to investigate the effects of integrating diverse digital, Web 2.0 tools and resources and OERs/MOOCs in research and in the construction of students’ academic texts. We aim to stress the increasing influence of digital and online tools on academic research and writing. Teachers, specialists, and students alike are affected by this process. In order to show how, we explore the following issues: What is Research 2.0? Which digital/online tools have we used to assist our students? What are the challenges for academic research using digital/Web 2.0 tools? And how do digital tools shape academic research?
Abstract Background One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
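The annotation task the abstract uses as its example (mapping genotyped SNPs onto phenotype-associated genes) is, at its core, an interval-overlap join. A hypothetical stdlib sketch of that step follows; the data structures, names, and gene identifiers are all illustrative, and Genephony's real interface is not reproduced here:

```python
def annotate_snps(snps, gene_intervals, phenotype_genes):
    """Annotate SNPs with overlapping genes known to be associated
    with a phenotype of interest (illustrative sketch, not
    Genephony's API).

    snps:            list of (chrom, pos) tuples
    gene_intervals:  dict mapping gene name -> (chrom, start, end)
    phenotype_genes: set of gene names linked to the phenotype
    Returns a list of ((chrom, pos), [matching genes]) pairs.
    """
    annotated = []
    for chrom, pos in snps:
        hits = [gene for gene, (c, start, end) in gene_intervals.items()
                if c == chrom and start <= pos <= end
                and gene in phenotype_genes]
        annotated.append(((chrom, pos), hits))
    return annotated
```

A production tool would replace the linear scan with an interval tree or sorted-merge join to keep the operation efficient on genome-scale datasets, which is the kind of automation the abstract argues for.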
This book is about the rise and supposed fall of the mean value theorem. It discusses the evolution of the theorem and the concepts behind it, how the theorem relates to other fundamental results in calculus, and modern re-evaluations of its role in the standard calculus course. The mean value theorem is one of the central results of calculus. It was called “the fundamental theorem of the differential calculus” because of its power to provide simple and rigorous proofs of basic results encountered in a first-year course in calculus. In mathematical terms, the book is a thorough treatment of this theorem and some related results in the field; in historical terms, it is not a history of calculus or mathematics, but a case study in both. MVT: A Most Valuable Theorem is aimed at those who teach calculus, especially those setting out to do so for the first time. It is also accessible to anyone who has finished the first semester of the standard course in the subject and will be of interest to undergraduate mat...
Stender, V.; Schroeder, M.; Wächter, J.
Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, and to publish data and other results. The management of research data in particular is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective to establish concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance efforts are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a
This report is part of the scientific basis for the management plan for the North Sea and Skagerrak. The report focuses on the vulnerability of particularly valuable areas to petroleum activities, maritime transport, fisheries, land-based and coastal activities and long-range transboundary pollution. A working group with representatives from many different government agencies, headed by the Institute of Marine Research and the Directorate for Nature Management, has been responsible for drawing up the present report on behalf of the Expert Group for the North Sea and Skagerrak. The present report considers the 12 areas that were identified as particularly valuable during an earlier stage of the management plan process on the environment, natural resources and pollution. There are nine areas along the coast and three open sea areas in the North Sea that were identified according to the same predefined criteria as used for the management plans for the Barents Sea-Lofoten area and the Norwegian Sea. The most important criteria for particularly valuable areas are importance for biological production and importance for biodiversity. (Author)
Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement the mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
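The stoichiometric-network methods the abstract mentions generally rest on one constraint: at steady state, the stoichiometric matrix N (rows = metabolites, columns = reactions) times the flux vector v is zero. A minimal sketch of that check, as an illustration of the underlying mathematics rather than SBRT code:

```python
def is_steady_state(stoich, flux, tol=1e-9):
    """Check the steady-state condition N·v ≈ 0 for a stoichiometric
    matrix (list of rows, one per metabolite) and a flux vector
    (one entry per reaction). This balance constraint underlies
    stoichiometric analyses such as flux balance analysis;
    illustrative sketch only, not taken from the SBRT."""
    return all(
        abs(sum(row[j] * flux[j] for j in range(len(flux)))) <= tol
        for row in stoich
    )
```

For example, for a single reversible conversion A ⇌ B with forward and backward reactions, the matrix is [[-1, 1], [1, -1]], and any flux vector with equal forward and backward rates satisfies the balance.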
Madiedo, J. M.
The Virtual Museum for Meteorites (Figure 1) was created as a tool for students, educators and researchers [1, 2]. One of the aims of this online resource is to promote interest in meteorites. Thus, the role of meteorites in education and outreach is fundamental, as these are very valuable tools to promote the public's interest in Astronomy and Planetary Sciences. Meteorite exhibitions reveal the fascination of students, educators and even researchers for these extraterrestrial rocks and how they can answer many key questions related to the origin and evolution of our Solar System. However, despite the efforts of private collectors, museums and other institutions to organize meteorite exhibitions, the reach of these is usually limited. The Virtual Museum for Meteorites takes advantage of HTML and related technologies to overcome local boundaries and offer its contents to a global audience. A description of the recent developments performed in the framework of this virtual museum is given in this work.
Valeria Gisela Perez
This paper develops a reflection on the importance of research on accounting subjects in the training of professional accountants. This importance stems from research's capacity to increase the wealth of the discipline under investigation, which can be converted into a skill and/or competence that accountants are required to demonstrate in their professional practice. Furthermore, accounting is recognized by the authors as a science in constant development, and thus open to investigation. This constant change in knowledge motivates professionals to stay up to date, and this constant updating becomes the skill and competence that research can bring to professional training in university classrooms. The reflection is based on the study of documents developed by prestigious authors in accounting theory, teaching and research. Therefore, this paper concludes that research is a useful tool for professional accounting training and rewards important skills and competencies for professional practice; it can also be conceived as a strategy for technical and educational activities that allows students to recreate knowledge, preparing them for the future updating that their professional practice will require. Key words: Accounting research, university teaching, accounting education.
Weingart, R.C.; Chau, H.H.; Goosman, D.R.; Hofer, W.W.; Honodel, C.A.; Lee, R.S.; Steinberg, D.J.; Stroud, J.R.
We have developed a new tool for ultrahigh-pressure research at LLL. This system, which we call the electric gun, has already achieved thin flyer plate velocities in excess of 20 km/s and pressures of the order of 2 TPa in tantalum. We believe that the electric gun is competitive with laser- and nuclear-driven methods of producing shocks in the 1-to-5 TPa range because of its precision and its ease and economy of operation. Its development is recommended for shock initiation studies, dry runs for Site 300 hydroshots, and as a shock wave generator for surface studies.
Full Text Available Introduction: This paper describes the development of a ‘Research for Impact’ Tool against a background of concerns about the over-researching of Aboriginal and Torres Strait Islander people’s issues without demonstrable benefits. Material and Methods: A combination of literature reviews, workshops with researchers and reflections by project team members and partners using participatory snowball techniques. Results: Assessing research impact is difficult, akin to a so-called ‘wicked problem’, but not impossible. A heuristic and collaborative approach to research that takes in the expectations of research users, those being researched and the funders of research offers a pragmatic solution to evaluating research impact. The proposed ‘Research for Impact’ Tool is based on the understanding that the value of research is to create evidence and/or products that support smarter decisions so as to improve the human condition. Research is of limited value unless the evidence produced is used to inform smarter decisions. A practical way of approaching research impact is therefore to start with the decisions confronting decision makers, whether they are government policymakers, professional practitioners or households, and the extent to which the research supports smarter decisions and the knock-on consequences of such smart decisions. Embedded at each step in the impact planning, monitoring and evaluation process is the need for Indigenous leadership and participation, capacity enhancement, collaborative partnerships and participatory learning-by-doing approaches across partners. Discussion: The tool is designed in the context of Indigenous research, but the basic idea that the way to assess research impact is to start upfront by defining the users of research and their information needs, the decisions confronting them and the extent to which research informs smarter decisions is equally applicable to research in other settings, both applied and
Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretations and reporting the research findings. Statistical analysis gives meaning to otherwise meaningless numbers, thereby breathing life into lifeless data. The results and inferences are precise only if proper statistical tests are used. This article tries to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of sample size estimation, power analysis and statistical errors is given. Finally, there is a summary of the parametric and non-parametric tests used for data analysis.
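The choice between parametric and non-parametric tests that the article summarises can be sketched in a few lines. The following is an illustrative example (not from the article), using SciPy: a Shapiro-Wilk check of the normality assumption decides between an independent-samples t-test and its non-parametric counterpart, the Mann-Whitney U test. The data are invented for demonstration.

```python
# Illustrative sketch: test selection for two independent samples.
from scipy import stats

group_a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8, 5.3, 5.1]
group_b = [5.6, 5.8, 5.5, 5.9, 5.7, 5.6, 6.0, 5.8]

# Shapiro-Wilk checks the normality assumption behind the t-test.
_, p_norm_a = stats.shapiro(group_a)
_, p_norm_b = stats.shapiro(group_b)

if p_norm_a > 0.05 and p_norm_b > 0.05:
    # Parametric route: independent-samples t-test.
    stat, p = stats.ttest_ind(group_a, group_b)
    test_used = "t-test"
else:
    # Non-parametric fallback: Mann-Whitney U.
    stat, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
    test_used = "Mann-Whitney U"

print(test_used, round(p, 4))
```

In practice the choice also depends on sample size, measurement scale and study design, as the article's fuller discussion of variables and errors indicates.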
Roeder, L.; Jundt, R.
Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.
Hartley, D.S. III
This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.
Software Tools for Battery Design. Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to help design batteries. Knowledge of the interplay of multi-physics at varied scales is imperative
Kaczmarczyk, Lech; Jackson, Walker S
The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.
Kim, Jong-Won; Kim, Dogyun
Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.
Leggett, Graham J
Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide a powerful support for research into metamaterials.
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Stevenson, Jennifer; Gaarder, Marie
A range of organizations are engaged in the production of evidence on the effects of health, social, and economic development programs on human welfare outcomes. However, evidence is often scattered around different databases, web sites, and the gray literature and is often presented in inaccessible formats. Lack of overview of the evidence in a specific field can be a barrier to the use of existing research and prevent efficient use of limited resources for new research. Evidence & Gap Maps (EGMs) aim to address these issues and complement existing synthesis and mapping approaches. EGMs are a new addition to the tools available to support evidence-informed policymaking. To provide an accessible resource for researchers, commissioners, and decision makers, EGMs provide thematic collections of evidence structured around a framework which schematically represents the types of interventions and outcomes of relevance to a particular sector. By mapping the existing evidence using this framework, EGMs provide a visual overview of what we know and do not know about the effects of different programs. They make existing evidence available, and by providing links to user-friendly summaries of relevant studies, EGMs can facilitate the use of existing evidence for decision making. They identify key "gaps" where little or no evidence from impact evaluations and systematic reviews is available and can be a valuable resource to inform a strategic approach to building the evidence base in a particular sector. The article will introduce readers to the concept and methods of EGMs and present a demonstration of the EGM tool using existing examples. Copyright © 2016 Elsevier Inc. All rights reserved.
Full Text Available Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying some of the Web-accessible resources and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet or may be run locally if a large number of sequences are going to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, installation and application of annotation systems require experience and are time-consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. The minimum free disk space required is 2 MB.
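The cache-then-query pattern GATO describes (query a web resource once, keep the result in a local database for reuse) can be sketched in a few lines. This is a hypothetical illustration, not GATO's actual PHP/Perl code; `fetch_remote` is a placeholder standing in for the real web-resource queries.

```python
# Hypothetical sketch of a local annotation cache backed by SQLite.
import sqlite3

def fetch_remote(gene_id):
    # Placeholder for querying a web-accessible annotation resource.
    return f"annotation-for-{gene_id}"

def get_annotation(conn, gene_id):
    conn.execute("CREATE TABLE IF NOT EXISTS annotations "
                 "(gene_id TEXT PRIMARY KEY, annotation TEXT)")
    row = conn.execute("SELECT annotation FROM annotations WHERE gene_id = ?",
                       (gene_id,)).fetchone()
    if row:                          # cache hit: previous result reused
        return row[0]
    result = fetch_remote(gene_id)   # cache miss: query the web resource
    conn.execute("INSERT INTO annotations VALUES (?, ?)", (gene_id, result))
    conn.commit()
    return result

conn = sqlite3.connect(":memory:")
print(get_annotation(conn, "BRCA1"))  # first call fetches and stores
print(get_annotation(conn, "BRCA1"))  # second call is served locally
```

The design choice mirrors the abstract: keeping a record of all previous annotation results makes repeated access to a common set of genes cheap.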
Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa
In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small-batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and for evaluation of pre- and post-exposure processing. The tool utilizes a partially coherent discharge-produced plasma (DPP) source, follows the Talbot lithography approach, and minimizes the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component was first analyzed and optimized separately, and the relations between the components were identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron-beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure was performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks. The phase-shift masks offer higher 1st-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
Grace, Stephen C; Embry, Stephen; Luo, Heng
Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization options and supports non
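Haystack's core binning idea (converting the mass dimension into a set of interval variables that feed PCA) can be illustrated with a toy sketch. This is our own construction under stated assumptions, not Haystack's code: fixed-width m/z bins accumulate intensities into a feature vector per sample, and PCA is done here via a plain SVD.

```python
# Illustrative sketch of mass binning followed by PCA on the binned matrix.
import numpy as np

def bin_masses(mz, intensity, mz_min=100.0, mz_max=1000.0, width=1.0):
    """Sum intensities into fixed-width m/z bins, giving one feature
    vector per sample regardless of how many peaks it contains."""
    n_bins = int((mz_max - mz_min) / width)
    bins = np.zeros(n_bins)
    idx = ((np.asarray(mz) - mz_min) / width).astype(int)
    valid = (idx >= 0) & (idx < n_bins)
    np.add.at(bins, idx[valid], np.asarray(intensity)[valid])
    return bins

# Two toy samples: binned vectors become rows of the data matrix.
sample1 = bin_masses([150.2, 150.7, 420.1], [10.0, 5.0, 7.0])
sample2 = bin_masses([150.4, 610.3], [12.0, 3.0])
X = np.vstack([sample1, sample2])
X = X - X.mean(axis=0)                 # centre columns before PCA
_, s, vt = np.linalg.svd(X, full_matrices=False)
scores = X @ vt.T                      # principal-component scores
print(scores.shape)
```

With many samples, the score plot is what supports the class-assignment and discriminatory-feature analysis the abstract describes.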
Mora, J.C.; Robles, Beatriz [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Bradshaw, Clare; Stark, Karolina [Stockholm University (Sweden); Sweeck, Liev; Vives i Batlle, Jordi [Belgian Nuclear Research Centre SCK-CEN (Belgium); Beresford, Nick [Centre for Ecology and Hydrology - CEH (United Kingdom); Thoerring, Havard; Dowdall, Mark [Norwegian Radiation Protection Authority - NRPA (Norway); Outola, Iisa; Turtiainen, Tuukka; Vetikko, Virve [STUK - Radiation and Nuclear Safety Authority (Finland); Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Beaugelin-Seiller, Karine; Fevrier, Laureline; Hurtevent, Pierre; Boyer, Patrick [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)
Interaction Matrices as a Tool for Prioritizing Radioecology Research J.C. Mora CIEMAT In 2010 the Strategy for Allied Radioecology (STAR) was launched with several objectives aimed at integrating the radioecology research efforts of nine institutions in Europe. One of these objectives was the creation of European Radioecology Observatories. The Chernobyl Exclusion Zone (CEZ) and the Upper Silesian Coal Basin (USCB), a coal mining area in Poland, were chosen after a selection process. A second objective was to develop a system for improving and validating the capabilities of predicting the behaviour of the main radionuclides existing at these observatories. Interaction Matrices (IMs) have been used since the 1990s as a tool for developing ecological conceptual models and have also been used within radioecology. The Interaction Matrix system relies on expert judgement for structuring knowledge of a given ecosystem at the conceptual level and was selected for use in the STAR project. A group of experts, selected from each institution of STAR, designed two matrices with the main compartments for each ecosystem (a forest in the CEZ and a lake in the USCB). All the features, events and processes (FEPs) which could affect the behaviour of the considered radionuclides, focusing on radiocaesium in the Chernobyl forest and radium in the Rontok-Wielki lake, were also included in each IM. Two new sets of experts were appointed to review, improve and prioritize the processes included in each IM. A first processing of the various candidate interaction matrices produced a single interaction matrix for each ecosystem which incorporated all the experts' combined knowledge. The prioritization of processes in the IMs, directed towards developing a whole predictive model of radionuclide behaviour in those ecosystems, raised interesting issues related to the processes and parameters involved and the existing knowledge of them. This exercise revealed several processes
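The Interaction Matrix structure lends itself to a simple data representation: diagonal cells hold ecosystem compartments, and off-diagonal cell (i, j) lists the processes by which compartment i influences compartment j. The sketch below is a generic toy illustration; the compartments and processes are invented, not the STAR experts' actual matrices.

```python
# Toy Interaction Matrix: off-diagonal cells keyed by (source, target).
compartments = ["soil", "trees", "understorey"]

matrix = {
    ("soil", "trees"):       ["root uptake"],
    ("trees", "soil"):       ["litterfall", "throughfall"],
    ("soil", "understorey"): ["root uptake"],
    ("understorey", "soil"): ["litterfall"],
}

def processes_into(target):
    """Collect all processes transferring activity into one compartment,
    i.e. read down that compartment's column of the matrix."""
    return sorted({p for (src, dst), procs in matrix.items()
                   if dst == target for p in procs})

print(processes_into("soil"))
```

Reading rows gives the processes a compartment drives; reading columns gives the processes acting on it, which is what experts review and prioritize when building a predictive model.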
Millar, A. Z.; Perry, S.
Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns broke themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, and with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is a 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part
McClatchey, Will C; Mahady, Gail B; Bennett, Bradley C; Shiels, Laura; Savo, Valentina
The science of ethnobotany is reviewed in light of its multi-disciplinary contributions to natural product research for the development of pharmaceuticals and pharmacological tools. Some of the issues reviewed involve ethical and cultural perspectives of healthcare and medicinal plants. While these are not usually part of the discussion of pharmacology, cultural concerns potentially provide both challenges and insight for field and laboratory researchers. Plant evolutionary issues are also considered as they relate to development of plant chemistry and accessing this through ethnobotanical methods. The discussion includes presentation of a range of CNS-active medicinal plants that have been recently examined in the field, laboratory and/or clinic. Each of these plants is used to illustrate one or more aspects about the valuable roles of ethnobotany in pharmacological research. We conclude with consideration of mutually beneficial future collaborations between field ethnobotanists and pharmacologists.
Andreia Salvan Pagnan
Full Text Available Within the universe of women's clothing, underwear long remained an insignificant area with regard to the development of new textile materials, shapes and colors. Panties, once known as breeches or long underwear, only became a necessity around the twentieth century with Christian Dior's vaporous dresses of the 1950s. Technological advances in the textile industry brought spandex, created by the American laboratory DuPont and better known as lycra. The elasticity of the fabric gave comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand the desires of users, qualitative research was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared with a new one to be launched. Through the Quality Function Deployment (QFD) tool, the data obtained from users' answers were interpreted so as to prioritize targets for the development of a product based on analyses of desired characteristics, which are converted into technical attributes.
Muhammad Awais; Tanzila Samin; Muhammad Bilal
Nowadays it is very important for business persons to attract their target customers towards their products through valuable modes of promotion and communication. The increasing use of the World Wide Web has completely changed the scenario of the business sector. Customized products and services, customer preferences, and the @ and dot-com craze have elevated the importance of internet advertising. This research paper investigates valuable internet advertising which will help to enhance the value of intern...
To determine the positive predictive value (PPV) of FADA, the frequent causes of FPs in our laboratory and the demographic characteristics of tuberculous pleural effusions (TPEs) and non-tuberculous pleural effusions (NTPEs). Methods. High FADA results generated in the past year were extracted with corresponding TB ...
Lee Odden, "Best and Worst Practices Social Media Marketing," Top Rank® Online Market Blog, entry posted 12 February 2009, http://www.toprankblog.com...1001 (accessed 30 March 2011). Odden, Lee. "Best and Worst Practices Social Media Marketing." Top Rank® Online Market Blog, entry posted 12
Analysis Tools. NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and to find cost-competitive solutions. ADOPT: a vehicle simulator to analyze the performance and fuel economy of conventional and advanced light- and
McMahan, Tracy A.; Shea, Charlotte A.; Finckenor, Miria; Ferguson, Dale
As NASA plans and implements the Vision for Space Exploration, managers, engineers, and scientists need lunar environment information that is readily available and easily accessed. For this effort, lunar environment data was compiled from a variety of missions from Apollo to more recent remote sensing missions, such as Clementine. This valuable information comes not only in the form of measurements and images but also from the observations of astronauts who have visited the Moon and people who have designed spacecraft for lunar missions. To provide a research tool that makes the voluminous lunar data more accessible, the Space Environments and Effects (SEE) Program, managed at NASA's Marshall Space Flight Center (MSFC) in Huntsville, AL, organized the data into a DVD knowledgebase: the Lunar e-Library. This searchable collection of 1100 electronic (.PDF) documents and abstracts makes it easy to find critical technical data and lessons learned from past lunar missions and exploration studies. The SEE Program began distributing the Lunar e-Library DVD in 2006. This paper describes the Lunar e-Library development process (including a description of the databases and resources used to acquire the documents) and the contents of the DVD product, demonstrates its usefulness with focused searches, and provides information on how to obtain this free resource.
Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and ensure maximization of equipment production efficiency. This paper, based on SEMI standards for semiconductor equipment control, defines the transaction rules between different tool states and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully, yielding the parameter values used to measure equipment performance, along with advice for improvement.
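The tool-state transaction rules described above can be sketched as a small finite state machine. The state names and allowed transitions below are illustrative assumptions for demonstration, not the SEMI-defined set used by the TEA system.

```python
# Illustrative FSM: legal transitions between tool states, plus a walker
# that enforces them and tallies time per state for performance metrics.
ALLOWED = {
    "idle":       {"productive", "down"},
    "productive": {"idle", "down"},
    "down":       {"idle"},
}

def run(events, start="idle"):
    """Walk a sequence of (state, duration) events, rejecting any
    transition the rules do not allow."""
    totals = {}
    current = start
    for state, duration in events:
        if state not in ALLOWED[current]:
            raise ValueError(f"illegal transition {current} -> {state}")
        totals[state] = totals.get(state, 0) + duration
        current = state
    return totals

log = [("productive", 120), ("down", 15), ("idle", 30), ("productive", 90)]
print(run(log))   # per-state time, usable for utilisation metrics
```

Aggregating per-state times over a shift is one simple way such a model can yield the equipment-performance parameters the abstract mentions.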
Trevino, Victor; Falciani, Francesco; Barrera-Saldaña, Hugo A
Among the many benefits of the Human Genome Project are new and powerful tools such as the genome-wide hybridization devices referred to as microarrays. Initially designed to measure gene transcriptional levels, microarray technologies are now used for comparing other genome features among individuals and their tissues and cells. Results provide valuable information on disease subcategories, disease prognosis, and treatment outcome. Likewise, they reveal differences in genetic makeup, regulat...
Rector, Travis A.; Vogt, Nicole P.
Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However, relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes). Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database. We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
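Two of the operations these tools perform, evaluating a Planck blackbody as a continuum model and measuring an absorption line's equivalent width, can be sketched numerically. This is a back-of-the-envelope illustration on a synthetic Gaussian line over a flat normalised continuum (chosen so the answer is easy to check), not the GEAS code.

```python
# Illustrative sketch: Planck continuum model and equivalent width.
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23   # SI constants (approximate)

def planck(wavelength_m, temp_k):
    """Planck spectral radiance B_lambda(T): the blackbody curve the
    stellar tool fits to the continuum."""
    a = 2.0 * H * C**2 / wavelength_m**5
    return a / (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0)

def equivalent_width(wavelength, flux, continuum):
    """EW = integral of (1 - F/Fc) d(lambda); positive for absorption.
    Trapezoidal rule, written out to avoid version-specific helpers."""
    y = 1.0 - flux / continuum
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(wavelength)))

# Synthetic absorption line (depth 0.6, sigma 2 A) near H-alpha.
wl = np.linspace(6540.0, 6590.0, 500)            # Angstroms
cont = np.ones_like(wl)                          # flat normalised continuum
line = 1.0 - 0.6 * np.exp(-0.5 * ((wl - 6563.0) / 2.0) ** 2)
ew = equivalent_width(wl, line, cont)
print(round(ew, 2))   # approx depth * sigma * sqrt(2*pi) ≈ 3.01 A
```

For a Gaussian line the analytic EW is depth × sigma × sqrt(2π), which the numerical integral reproduces, a quick sanity check a student could do in the tool.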
Full Text Available In this paper, a test-bench for a sonic logging tool is proposed and designed to realize automatic calibration and testing of the tool. The test-bench system consists of a Host Computer, an Embedded Controlling Board, and functional boards. The Host Computer serves as the Human Machine Interface (HMI) and processes uploaded data. The software running on the Host Computer is developed in VC++ using multithreading, Dynamic Link Library (DLL) and Multiple Document Interface (MDI) techniques. The Embedded Controlling Board uses an ARM7 microcontroller and communicates with the Host Computer via Ethernet. The Embedded Controlling Board software is based on the embedded uClinux operating system with a layered architecture. The functional boards are designed around Field Programmable Gate Arrays (FPGAs) and provide test interfaces for the logging tool. The functional board software is divided into independent sub-modules that can be reused by various functional boards and are then integrated in the top layer. With the layered architecture and modularized design, the software system is highly reliable and extensible. With the help of the designed system, a test was conducted quickly and successfully on the electronic receiving cabin of the sonic logging tool, demonstrating that the system can greatly improve the production efficiency of the sonic logging tool.
Bruun Larsen, Lars; Skonnord, Trygve; Gjelstad, Svein
in primary care research. Examples of this are online randomisation, electronic questionnaires, automatic email scheduling, mobile phone applications and data extraction tools. The amount of data can be increased to a low cost, and this can help to reach adequate sample sizes. However, there are still...... challenges within the field. To secure a high response rate, you need to follow up manually or use another application. There are also practical and ethical problems, and the data security for sensitive data have to be followed carefully. Session content Oral presentations about some technological...
Dec 29, 2008 … It is on this premise that this article presents Bayes' theorem as a vital tool. A brief intuitive … diseased individual will be selected, or that a disease-free individual will be selected? … Ultrasound physics and Instruction, 3rd ed. …
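The diagnostic question in the Bayes' theorem abstract above reduces to computing a positive predictive value. A minimal sketch; the prevalence, sensitivity, and specificity values are illustrative assumptions, not figures from the article:

```python
# Positive predictive value via Bayes' theorem.
# All three input values below are illustrative assumptions.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test) by Bayes' theorem."""
    true_positive = sensitivity * prevalence
    false_positive = (1 - specificity) * (1 - prevalence)
    return true_positive / (true_positive + false_positive)

ppv = positive_predictive_value(prevalence=0.01, sensitivity=0.95, specificity=0.90)
print(round(ppv, 3))  # -> 0.088: low prevalence keeps PPV small despite a good test
```

At 1% prevalence even a fairly accurate test yields a positive predictive value below 10%, a classic counter-intuitive consequence of low disease prevalence.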
The present knowledge society requires statistical literacy: the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research
The types and uses of research reactors are reviewed. After an analysis of the world situation, the demand of new research reactors of about 20 MW is foreseen. The experience and competitiveness of INVAP S.E. as designer and constructor of research reactors is outlined and the general specifications of the reactors designed by INVAP for Egypt and Australia are given
Veeck, Ann; Hoger, Beth
Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…
This contribution describes 'Research Game', a game produced in a Lifelong Learning Programme Comenius project (The European Scientific Research Game), which aims to motivate secondary school students by letting them experience the excitement of scientific research. The project proposes practical and didactic work that combines theoretical activities with ICT in order to introduce students to scientific research. Students collaborated internationally across Europe to build hypotheses, carry out research, test the validity of their hypotheses, and finalize a theory based on their findings. On the project platform (www.researchgame.eu/platform) teachers and students registered, created a team, interacted in a forum space, and played and learned science in a new, innovative way. The students shared their research findings with other groups from all over Europe and finally competed online, playing a serious game and showing that they were able to apply the scientific method.
Rather than produce clear-cut answers to well-defined problems, research on future environmental policy issues requires a different approach whereby researchers are partners in joint learning processes among stakeholders, policy makers, NGOs (Non-Governmental Organisations) and industry. This
Abbott, Rodman P.; Stracener, Jerrell
This study investigates the relationship between the designated research project system independent variables of Labor, Travel, Equipment, and Contract total annual costs and the dependent variables of both the associated matching research project total annual academic publication output and thesis/dissertation number output. The Mahalanobis…
Noyons, Everard Christiaan Marie
Bibliometric maps of science are landscapes of scientific research fields created by quantitative analysis of bibliographic data. In such maps the 'cities' are, for instance, research topics. Topics with a strong cognitive relation are in each other's vicinity and topics with a weak relation are
Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox
Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...
Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…
Roo, A.P.J. de; Thielen, J.; Feyen, L.; Burek, P.; Salamon, P.
The floods in the rivers Meuse and Rhine in 1993 and 1995 made the European Commission realize that further research on floods, especially in transboundary river catchments, was also necessary at Commission level. This led to the start of a dedicated research project on floods at the European
The objective of this study is to develop an evidence-based research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT). A review was conducted drawing from the (1) implementati...
The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding: user response and market ...
Ivancic, William D.
Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies, often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access are addressed.
Klein, P. D.; Hachey, D. L.; Kreek, M. J.; Schoeller, D. A.
Recent developments in the use of the stable isotopes ¹³C, ¹⁵N, ¹⁷O, and ¹⁸O as tracers in research studies in the fields of biology, medicine, pharmacology, and agriculture are briefly reviewed. (CH)
Massi, Luciana; Santos, Gelson Ribeiro dos; Ferreira, Jerino Queiroz; Queiroz, Salete Linhares
Chemistry teachers increasingly use research articles in their undergraduate courses. This trend arises from current pedagogical emphasis on active learning and scientific process. In this paper, we describe some educational experiences on the use of research articles in chemistry higher education. Additionally, we present our own conclusions on the use of such methodology applied to a scientific communication course offered to undergraduate chemistry students at the University of São Paulo, ...
Fiscal 1996 research cooperation promotion project. Report on the Japan/Mexico international cooperation research on recovery of valuable elements in brine; 1996 nendo kenkyu kyoryoku suishin jigyo. Kansuichu no yuka shigen kaishu gijutsu ni kansuru kenkyu kyoryoku follow up hokokusho
In the research cooperation promotion projects carried out so far, R&D has been conducted on an overall recovery system that effectively and systematically recovers valuable resources such as magnesium, bromine, and boron contained in the brine left after salt manufacturing in the coastal region of Mexico. In this project, as research on the distillation process needed for commercialization of the system to recover valuable resources from the brine, Japan has been collecting information on various distillation processes and solar pond systems and studying, jointly with Mexico, a distillation process appropriate to the site. As a result, a seawater desalination process combining a solar pond using solar energy with the evaporation method was recommended as a low-cost distillation method. Moreover, a study of the structural design and operational method of more efficient solar stills indicated that the solar still method for condensing and recovering the water evaporated from the Guerrero Negro salt pan has little influence on the salt manufacturing process and is viable as the most economical distillation process
Otrel-Cass, Kathrin; Cowie, Bronwen
When practising teachers take time to exchange their experiences and reflect on their teaching realities as critical friends, they add meaning and depth to educational research. When peer talk is facilitated through video chat platforms, teachers can meet (virtually) face to face even when … recordings were transcribed and used to prompt further discussion. The recordings of the video chat meetings gave researchers an opportunity to listen in and follow up on points they felt needed further unpacking or clarification. The recorded peer video chat conversations provided an additional … opportunity to stimulate and support teacher participants in a process of critical analysis and reflection on practice. The discussions themselves were empowering because, in the absence of the researcher, the teachers, in negotiation with peers, chose what was important enough to them to take time to discuss. …
There are many ways in which pain in animals can be measured and these are based on a variety of phenomena that are related to either the perception of pain or alterations in physical or behavioural features of the animal that are caused by that pain. The features of pain that are most useful for assessment in clinical environments are not always the best to use in a research environment. This is because the aims and objectives of the two settings are different and so whilst particular techniques will have the same advantages and disadvantages in clinical and research environments, these considerations may become more or less of a drawback when moving from one environment to the other. For example, a simple descriptive pain scale has a number of advantages and disadvantages. In a clinical setting the advantages are very useful and the disadvantages are less relevant, but in a research environment the advantages are less important and the disadvantages can become more problematic. This paper will focus on pain in the research environment and after a brief revision of the pathophysiological systems involved will attempt to outline the major advantages and disadvantages of the more commonly used measurement techniques that have been used for studies in the area of pain perception and analgesia. This paper is expanded from a conference proceedings paper presented at the International Veterinary Emergency and Critical Care Conference in San Diego, USA.
Ramalho, A.J.G.; Marques, J.G.; Cardeira, F.M.
A short presentation is made of the Portuguese Research Reactor's utilisation, its problems, and the solutions found. Starting with the initial calibration and experiments, the routine operation at full power follows. The problems then encountered, which led to the refurbishment, are described. The present status of the system is then presented, and from that, conclusions for the future are derived. (author)
Nathalie Sonck; Henk Fernee
Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it
Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian
In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…
Brownell, Marni D.; Jutte, Douglas P.
Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…
Baran, Evrim; Chuang, Hsueh-Hua; Thompson, Ann
TPACK (technological pedagogical content knowledge) has emerged as a clear and useful construct for researchers working to understand technology integration in learning and teaching. Whereas first generation TPACK work focused upon explaining and interpreting the construct, TPACK has now entered a second generation where the focus is upon using…
Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.
Although much attention has been paid to business processes over the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research
Kumar, V; Petersen, J Andrew; Leone, Robert P
The customers who buy the most from you are probably not your best marketers. What's more, your best marketers may be worth far more to your company than your most enthusiastic consumers. Those are the conclusions of professors Kumar and Petersen at the University of Connecticut and professor Leone at Ohio State University, who analyzed thousands of customers in research focused on a telecommunications company and a financial services firm. In this article, the authors present a straightforward tool that can be used to calculate both customer lifetime value (CLV), the worth of your customers' purchases, and customer referral value (CRV), the value of their referrals. Knowing both enables you to segment your customers into four constituent parts: those that buy a lot but are poor marketers (which they term Affluents); those that don't buy much but are very strong salespeople for your firm (Advocates); those that do both well (Champions); and those that do neither well (Misers). In a series of one-year experiments, the authors demonstrated the effectiveness of this segmentation approach. Offering purchasing incentives to Advocates, referral incentives to Affluents, and both to Misers, they were able to move significant proportions of all three into the Champions category. Both companies reaped returns on their marketing investments greater than 12-fold--more than double the normal marketing ROI for their industries. The power of this tool is its ability to help marketers decide where to focus their efforts. Rather than waste funds encouraging big spenders to spend slightly more while overlooking the power of customer evangelists who don't buy enough to seem important, you can reap much higher rewards by nudging big spenders to make referrals and urging enthusiastic proponents of your wares to buy a bit more.
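The CLV/CRV segmentation described in the abstract above can be sketched as a simple two-way split. The median cutoffs and the sample data are assumptions of this illustration; the authors' actual tool may compute the values differently:

```python
# Four-way customer segmentation by customer lifetime value (CLV) and
# customer referral value (CRV), as described in the abstract.
# The median split and the sample data are assumptions of this sketch.

from statistics import median

def segment(customers):
    clv_cut = median(c["clv"] for c in customers)
    crv_cut = median(c["crv"] for c in customers)
    labels = {}
    for c in customers:
        high_clv = c["clv"] >= clv_cut
        high_crv = c["crv"] >= crv_cut
        if high_clv and high_crv:
            labels[c["id"]] = "Champion"   # buys a lot and refers a lot
        elif high_clv:
            labels[c["id"]] = "Affluent"   # heavy buyer, weak referrer
        elif high_crv:
            labels[c["id"]] = "Advocate"   # strong referrer, light buyer
        else:
            labels[c["id"]] = "Miser"
    return labels

customers = [
    {"id": "A", "clv": 1200, "crv": 40},
    {"id": "B", "clv": 150, "crv": 900},
    {"id": "C", "clv": 1100, "crv": 850},
    {"id": "D", "clv": 90, "crv": 30},
]
print(segment(customers))  # A: Affluent, B: Advocate, C: Champion, D: Miser
```

Knowing which quadrant a customer occupies is what lets a marketer target referral incentives at Affluents and purchase incentives at Advocates, as the experiments in the abstract describe.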
Yu, T.; Lu, R.; Bishop, L.
Biofilm processes are widely utilized in environmental engineering for biodegradation of contaminated waters, gases and soils. It is important to understand the structure and functions of biofilms. Microelectrodes are novel experimental tools for environmental biofilm studies. The authors reviewed the techniques of oxygen, sulfide, redox potential and pH microelectrodes. These microelectrodes have tip diameters of 3 to 20 μm, resulting in a high spatial resolution. They enable us to directly measure the chemical conditions that result from microbial activities in biofilms. The authors also reported the laboratory and field studies of wastewater biofilms using microelectrode techniques. The results of these studies provided experimental evidence on the stratification of microbial processes and the associated redox potential change in wastewater biofilms: (1) The oxygen penetration depth was only a fraction of the biofilm thickness. This observation, first made under laboratory conditions, has been confirmed under field conditions. (2) The biofilms with both aerobic oxidation and sulfate reduction had a clearly stratified structure. This was evidenced by a sharp decrease of redox potential near the interface between the aerobic zone and the sulfate reduction zone within the biofilm. In this type of biofilm, aerobic oxidation took place only in a shallow layer near the biofilm surface and sulfate reduction occurred in the deeper anoxic zone. (3) The redox potential changed with the shift of the primary microbial process in biofilms, indicating that it is possible to use redox potential to help illustrate the structure and functions of biofilms. (author)
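As an illustration of how such microprofiles are interpreted, the oxygen penetration depth mentioned in finding (1) can be read off a measured profile as the depth where concentration first drops below the sensor's detection limit. The profile values and the detection limit below are synthetic assumptions, not data from the studies:

```python
# Reading an oxygen penetration depth off a microelectrode profile.
# Depths, concentrations, and the detection limit are synthetic values
# for illustration only.

def penetration_depth(depths_um, o2_mg_per_l, detection_limit=0.1):
    """First depth at which O2 falls below the detection limit, else None."""
    for depth, conc in zip(depths_um, o2_mg_per_l):
        if conc < detection_limit:
            return depth
    return None

depths = [0, 50, 100, 150, 200, 250]        # micrometers below the biofilm surface
profile = [8.0, 5.2, 2.1, 0.6, 0.05, 0.0]   # dissolved O2 in mg/L
print(penetration_depth(depths, profile))   # -> 200: O2 is exhausted inside the film
```

A penetration depth well short of the total biofilm thickness is exactly the stratification evidence the abstract describes: everything below that depth is anoxic and available to processes such as sulfate reduction.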
Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin
First book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provide an outlook for the future development of the FGOALS model system. Offers brief introduction about how to run FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model), has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal prediction and scenario projections of future climate change. ''Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community'' is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook of future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.
Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others. There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...
Wright, J; Wagner, A
Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...
Blažková, Michaela; Člověčko, M.; Eltsov, V. B.; Gažo, E.; de Graaf, R.; Hosio, J.J.; Krusius, M.; Schmoranzer, D.; Schoepe, W.; Skrbek, Ladislav; Skyba, P.; Solntsev, R.E.; Vinen, W. F.
Vol. 150 (2008), pp. 525-535. ISSN 0022-2291. R&D Projects: GA ČR GA202/05/0218. Grants - others: GAUK (CZ) 7953/2007; Transnational Access Programme (XE) RITA-CT-2003-505313. Institutional research plan: CEZ:AV0Z10100520. Keywords: normal 3He * superfluid 3He * superfluid 4He * turbulence * cavitation * quartz tuning fork. Subject RIV: BK - Fluid Dynamics. Impact factor: 1.034, year: 2008
An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state of the art of analytical microplasma research is provided
Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils both for actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To
The demand for renewable energy is growing steadily, both from policy and from industry, which seeks environmentally friendly feedstocks. The recent policies enacted by the EU, USA, and other industrialized countries foresee increased interest in the cultivation of energy crops; there is clear evidence that switchgrass is one of the most promising biomass crops for energy production and for a bio-based economy and compounds. Switchgrass: A Valuable Biomass Crop for Energy provides a comprehensive guide to switchgrass in terms of agricultural practices, potential uses and markets, and environmental and social benefits. Considering this potential energy source from its biology, breeding and crop physiology to its growth and management to its economic, social and environmental impacts, Switchgrass: A Valuable Biomass Crop for Energy brings together chapters from a range of experts in the field, including a foreword from Kenneth P. Vogel, to collect and present the environmental benefits and characteristics of this a ...
This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth Sciences research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft, and is capable of conducting long-duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.
Lal, Shalini; Donnelly, Catherine; Shin, Jennifer
Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.
Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi
The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community, and has developed the "NASA Human Research Wiki" in an effort to make the ExMC information available to the general public and increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports that identify "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.
Alexander M Walker (World Health Information Science Consultants, Newton, MA), Amanda R Patrick and Sebastian Schneeweiss (Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women's Hospital, Boston, MA), Michael S Lauer (National Heart, Lung, and Blood Institute, National Institutes of Health, Bethesda, MD), Mark C Hornbrook (The Center for Health Research, Kaiser Permanente Northwest, Portland, OR), Matthew G Marin (Department of Medicine, New Jersey Medical School, Newark, NJ), Richard Platt (Department of Population Medicine, Harvard Pilgrim Health Care Institute and Harvard Medical School, Boston, MA), Véronique L Roger (Department of Health Sciences Research, Mayo Clinic, Rochester, MN), Paul Stang (Johnson and Johnson Pharmaceutical Research and Development, Titusville, NJ, USA). Background: Comparative effectiveness research (CER) provides actionable information for health care decision-making. Randomized clinical trials cannot provide the patients, time horizons, or practice settings needed for all required CER. The need for comparative assessments and the infeasibility of conducting randomized clinical trials in all relevant areas are leading researchers and policy makers to non-randomized, retrospective CER. Such studies are possible when rich data exist on large populations receiving alternative therapies that are used as if interchangeably in clinical practice. This setting we call "empirical equipoise." Objectives: This study sought to provide a method for the systematic identification of settings in which empirical equipoise offers promising non-randomized CER. Methods: We used a standardizing transformation of the propensity score called "preference" to assess pairs of common treatments for uncomplicated community-acquired pneumonia and new-onset heart failure in a population of low-income elderly people in Pennsylvania, for whom we had access to de-identified insurance records. Treatment
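The "preference" transformation mentioned in the Methods can be sketched as a shift of the propensity-score logit by the logit of overall treatment prevalence, with a mid-range band (roughly 0.3-0.7) taken as evidence of equipoise. Both the formula and the band are stated here as assumptions of this sketch, not as the authors' exact definitions:

```python
# "Preference" as a standardized propensity score: shift the propensity
# logit by the logit of overall treatment prevalence. This formula and
# the 0.3-0.7 equipoise band are assumptions of this sketch.

import math

def preference(propensity, prevalence):
    """Propensity score standardized against treatment prevalence."""
    logit = lambda p: math.log(p / (1 - p))
    shifted = logit(propensity) - logit(prevalence)
    return 1 / (1 + math.exp(-shifted))

def in_equipoise(preferences, low=0.3, high=0.7):
    """Do at least half of the patients fall in the mid-preference band?"""
    inside = sum(low <= f <= high for f in preferences)
    return inside / len(preferences) >= 0.5

# When half the population is treated, preference equals the raw propensity.
print(round(preference(0.7, 0.5), 3))  # -> 0.7
```

The point of the standardization is that a propensity score of, say, 0.2 means something different when only 20% of patients are treated (no preference either way) than when 80% are; the shift by prevalence makes the two settings comparable.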
Scanlon, J.J.; Rolader, G.E.; Jamison, K.A.; Petresky, H.
Electromagnetic Launcher (EML) research at the Air Force Armament Laboratory, Hypervelocity Launcher Branch (AFATL/SAH), Eglin AFB, has focused on developing the technologies required for repetitively launching several-kilogram payloads to high velocities. Previous AFATL/SAH experiments have been limited by the available power supply, resulting in small muzzle energies on the order of hundreds of kilojoules. In an effort to advance the development of EMLs, AFATL/SAH has designed and constructed a battery power supply (BPS) capable of providing several megaamperes of current for several seconds. This system consists of six modules, each containing 2288 automotive batteries, which may be connected in two different series-parallel arrangements. In this paper the authors define the electrical characteristics of the AFATL battery power supply at the component level
Rogers, Jan; SanSoucie, Mike
Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.
Ion Danut I. JUGANARU
This study aims at analyzing the distribution of tourist flows in 2014, from 25 European countries, across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists’ countries of residence and their trip purposes. “Purpose” is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups united by similar characteristics. Since the choice/purchase decision is based on purposes, knowing them proves useful in designing strategies to increase the level of satisfaction provided to the customer. The statistical method used in this paper is factorial correspondence analysis. In our opinion, the identification, by this method, of the existence of differences or similarities between the tourists’ countries of residence and their trip purposes can represent a useful step in studying the tourism market and the choice/reformulation of strategies.
D R Simmons
A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image that is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft-Palate Craniofacial Journal, in press). We have argued that “scarriness” is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information that allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Other potential descriptors for other image classes would be “metallic”, “natural”, or “liquid”. Having developed an automatic algorithm to locate scars in medical images, we then tested “ground truth” by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor “scarriness”. A simultaneous analysis of the observers' qualitative descriptions of the scarring revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other areas, such as botany, geology and archaeology.
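The consensus coding scheme described above can be sketched in a few lines: rasterise each observer's outline to a binary mask, average the masks to get a per-pixel agreement map, and keep the region where at least 50% of observers agree. The function name, mask shapes, and toy data below are illustrative only.

```python
import numpy as np

def consensus_region(observer_masks, threshold=0.5):
    """observer_masks: list of equal-shape binary arrays (1 = inside outline).
    Returns the boolean mask where the agreed fraction meets the threshold."""
    agreement = np.mean(np.stack(observer_masks), axis=0)  # fraction agreeing per pixel
    return agreement >= threshold                          # consensus mask

# Three observers outline overlapping regions on a tiny 1-D "image".
masks = [np.array([1, 1, 1, 0, 0]),
         np.array([0, 1, 1, 1, 0]),
         np.array([0, 1, 1, 0, 0])]
print(consensus_region(masks).astype(int))  # pixels where >= 50% of observers agree
```

The overlap between this consensus mask and the algorithm's detected region is the quantity whose variability led the authors to their conclusion.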
Verma, Ark; Brysbaert, Marc
Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.
Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to the Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...
Zhongqi Sheng; Lei Zhang; Hualong Xie; Changchun Liu
Assembly is the stage that accounts for the greatest workload and time consumed during the product design and manufacturing process. The CNC machine tool is key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information on transmission, and geometric constraint information were quantified to as...
Baggi, Fulvio; Mantegazza, Renato; Antozzi, Carlo; Sanders, Donald
Clinical registries may facilitate research on myasthenia gravis (MG) in several ways: as a source of demographic, clinical, biological, and immunological data on large numbers of patients with this rare disease; as a source of referrals for clinical trials; and by allowing rapid identification of MG patients with specific features. Physician-derived registries have the added advantage of incorporating diagnostic and treatment data that may allow comparison of outcomes from different therapeutic approaches, which can be supplemented with patient self-reported data. We report the demographic analysis of MG patients in two large physician-derived registries, the Duke MG Patient Registry, at the Duke University Medical Center, and the INNCB MG Registry, at the Istituto Neurologico Carlo Besta, as a preliminary study to assess the consistency of the two data sets. These registries share a common structure, with an inner core of common data elements (CDE) that facilitate data analysis. The CDEs are concordant with the MG-specific CDEs developed under the National Institute of Neurological Disorders and Stroke Common Data Elements Project. © 2012 New York Academy of Sciences.
Hunter, Jill V; Wilde, Elisabeth A; Tong, Karen A; Holshouser, Barbara A
This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. It attempts to address some of the potential uses of more advanced forms of imaging in TBI as well as highlight some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years, because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment that are readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective methods for large-scale image data analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin tag labeling (ASL), functional MRI (fMRI), including resting state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. However, we also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children.
Background: Policy makers, clinicians and researchers are demonstrating increasing interest in using data linked from multiple sources to support measurement of clinical performance and patient health outcomes. However, the utility of data linkage may be compromised by sub-optimal or incomplete linkage, leading to systematic bias. In this study, we synthesize the evidence identifying participant or population characteristics that can influence the validity and completeness of data linkage and may be associated with systematic bias in reported outcomes. Methods: A narrative review, using structured search methods, was undertaken. The key words "data linkage" and the MeSH term "medical record linkage" were applied to the Medline, EMBASE and CINAHL databases between 1991 and 2007. Abstract inclusion criteria were: the article attempted an empirical evaluation of methodological issues relating to data linkage and reported on patient characteristics; the study design included analysis of matched versus unmatched records; and the report was in English. Included articles were grouped thematically according to the patient characteristics that were compared between matched and unmatched records. Results: The search identified 1810 articles, of which 33 (1.8%) met the inclusion criteria. There was marked heterogeneity in study methods and factors investigated. Characteristics that were unevenly distributed among matched and unmatched records were: age (72% of studies), sex (50% of studies), race (64% of studies), geographical/hospital site (93% of studies), socio-economic status (82% of studies) and health status (72% of studies). Conclusion: A number of relevant patient or population factors may be associated with incomplete data linkage, resulting in systematic bias in reported clinical outcomes. Readers should consider these factors when interpreting the reported results of data linkage studies.
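The imbalance check this review describes, comparing the distribution of a patient characteristic between linked (matched) and unlinked records, can be sketched as follows. The record structure and field names are hypothetical, invented for illustration.

```python
from collections import Counter

def characteristic_rates(records, linked_key="linked", attr="sex"):
    """Compare the distribution of one characteristic between linked and
    unlinked records. Returns {True: {...}, False: {...}} proportions."""
    groups = {True: Counter(), False: Counter()}
    for r in records:
        groups[r[linked_key]][r[attr]] += 1
    rates = {}
    for linked, counts in groups.items():
        total = sum(counts.values())
        rates[linked] = {k: v / total for k, v in counts.items()} if total else {}
    return rates

# Toy cohort: an imbalance between the linked and unlinked groups (here
# 2/3 female among linked vs none among unlinked) is the kind of pattern
# that signals potential systematic bias in linkage-based outcomes.
records = [
    {"linked": True,  "sex": "F"}, {"linked": True,  "sex": "M"},
    {"linked": True,  "sex": "F"}, {"linked": False, "sex": "M"},
    {"linked": False, "sex": "M"},
]
print(characteristic_rates(records))
```

A formal analysis would add a significance test, but even raw proportions like these expose the uneven distributions the review reports for age, sex, race, site, socio-economic status and health status.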
McQuade, Sarah; Davis, Louise; Nash, Christine
Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…
The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…
Fancy, Steven G.; Pank, Larry F.; Douglas, David C.; Curby, Catherine H.; Garner, Gerald W.; Amstrup, Steven C.; Regelin, Wayne L.
operation, the UHF (ultra-high frequency) signal failed on three of 32 caribou transmitters and 10 of 36 polar bear transmitters.A geographic information system (GIS) incorporating other databases (e.g., land cover, elevation, slope, aspect, hydrology, ice distribution) was used to analyze and display detailed locational and behavioral data collected via satellite. Examples of GIS applications to research projects using satellite telemetry and examples of detailed movement patterns of caribou and polar bears are presented. This report includes documentation for computer software packages for processing Argos data and presents developments, as of March 1987, in transmitter design, data retrieval using a local user terminal, computer software, and sensor development and calibration.
Crossley, Scott A.
This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…
Søndergaard, Lene Vammen; Dagnæs-Hansen, Frederik; Herskin, Mette S
of the extent of welfare assessment in pigs used in biomedical research and to suggest a welfare assessment standard for research facilities based on an exposition of ethological considerations relevant for the welfare of pigs in biomedical research. The tools for porcine welfare assessment presented suggest...
This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…
The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to ...
Highlights:
→ A highly flexible neutronic core simulator was developed.
→ The tool estimates the static neutron flux, the eigenmodes, and the neutron noise.
→ The tool was successfully validated via many benchmark cases.
→ The tool can be used for research and education.
→ The tool is freely available.
Abstract: This paper deals with the development, validation, and demonstration of an innovative neutronic tool. The novelty of the tool resides in its versatility, since many different systems can be investigated and different kinds of calculations can be performed. More precisely, both critical systems and subcritical systems with an external neutron source can be studied, and static and dynamic cases in the frequency domain (i.e. for stationary fluctuations) can be considered. In addition, the tool can determine the different eigenfunctions of any nuclear core. For each situation, the static neutron flux, the different eigenmodes and eigenvalues, the first-order neutron noise, and their adjoint functions are estimated, as well as the effective multiplication factor of the system. The main advantages of the tool, which is entirely MATLAB based, lie in the robustness of the implemented numerical algorithms, its high portability between different computer platforms and operating systems, and its ease of use, since no input deck writing is required. The present version of the tool, which is based on two-group diffusion theory, is mostly suited to investigating thermal systems. The definition of both the static and dynamic core configurations directly from the static macroscopic cross-sections and their fluctuations, respectively, makes the tool particularly well suited for research and education. Some of the many benchmark cases used to validate the tool are briefly reported. The static and dynamic capabilities of the tool are also demonstrated for the following configurations: a vibrating control rod, a perturbation traveling upwards
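For readers unfamiliar with the underlying model, the static two-group diffusion equations such a tool solves can be written in standard textbook notation (this is a generic sketch, not the paper's own formulation):

```latex
% Fast (group 1) and thermal (group 2) static neutron balance equations
-\nabla \cdot D_1 \nabla \phi_1 + \left(\Sigma_{a,1} + \Sigma_{1\to 2}\right)\phi_1
  = \frac{1}{k_{\mathrm{eff}}}\left(\nu\Sigma_{f,1}\,\phi_1 + \nu\Sigma_{f,2}\,\phi_2\right)

-\nabla \cdot D_2 \nabla \phi_2 + \Sigma_{a,2}\,\phi_2 = \Sigma_{1\to 2}\,\phi_1
```

Here \(D_g\) are the group diffusion coefficients, \(\Sigma_{a,g}\) the absorption cross-sections, \(\Sigma_{1\to 2}\) the downscattering (removal) cross-section, \(\nu\Sigma_{f,g}\) the fission production terms, and \(k_{\mathrm{eff}}\) the effective multiplication factor the abstract mentions.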
Showcasing exemplars of how various aspects of design research were successfully transitioned into and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the Impact of Design Research on Industrial Practice. Chapters written by internationally acclaimed researchers of design analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform industrial practice of engineering design, whilst the chapters that are written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice; as well as guidelines and platforms for successf...
Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R
This report describes miRQuest, a novel middleware available on a Web server that allows the end user to conduct miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods to perform this task, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on the available tools for miRNA prediction. Three use cases of the tool are also described, one being miRNA identification analysis in 30 different species. Finally, miRQuest appears to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.
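Since miRQuest takes FASTA-formatted sequences as its input, a minimal sketch of handling that format is shown below. The parser and the candidate sequence names are illustrative only; they are not part of miRQuest itself.

```python
def parse_fasta(text):
    """Return {header: sequence} from a FASTA-formatted string.
    Multi-line sequences are concatenated under their '>' header."""
    records, header, chunks = {}, None, []
    for line in text.strip().splitlines():
        if line.startswith(">"):
            if header is not None:
                records[header] = "".join(chunks)
            header, chunks = line[1:].strip(), []
        else:
            chunks.append(line.strip())
    if header is not None:
        records[header] = "".join(chunks)
    return records

# Hypothetical miRNA candidate input, in the FASTA shape the tool expects.
fasta = """>candidate_1
UGAGGUAGUAGGUUGUAUAGUU
>candidate_2
CCUUGGAGUAAAGU"""
print(parse_fasta(fasta))
```

Validating input locally like this, before submitting it to a web service, is a common way to catch malformed headers or mixed alphabets early.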
Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.
Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.
Wight, Evelyn; Gardner, Gene; Harvey, Tony
As a reflection of its growing culture of openness, and in response to the public's need for accurate information about its activities, the U.S. Department of Energy (DOE) Office of the Assistant Secretary for Environmental Restoration and Waste Management (EM) has increased the amount of information available to the public through communication tools such as brochures, fact sheets, and a travelling exhibit with an interactive computer display. Our involvement with this effort has been to design, develop, and critique booklets, brochures, fact sheets and other communication tools for EM. This paper presents an evaluation of the effectiveness of two communication tools we developed: the EM Booklet and the EM Fact Sheets. We measured effectiveness using non-parametric testing. This paper describes DOE's culture change, EM's communication tools and their context within DOE's new open culture, our research, test methods and results, the significance of our research, and our plans for future research. (author)
Murawska, Jaclyn M.; Walker, David A.
In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…
Asselin, Marlene; Moayeri, Maryam
Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…
Kirishima, K.; Shibayama, H.; Nakahira, H.; Shimauchi, H.; Myochin, M.; Wada, Y.; Kawase, K.; Kishimoto, Y.
In the project "Recovery and Utilization of Valuable Metals from Spent Fuel," a mutual separation process for valuable metals recovered from spent fuel has been studied using a simulated solution containing Pb, Ru, Rh, Pd and Mo. Pd was separated successfully by the DHS (di-hexyl sulfide) solvent extraction method, while Pb was recovered selectively from the raffinate by neutralization precipitation of the other elements. On the other hand, Rh was roughly separated by washing the precipitate with alkaline solution and was then refined with the chelate resin CS-346. An outline of the mutual separation process flow sheet has been established by combining these techniques. The experimental results and the process flow sheet for the mutual separation of valuable metals are presented in this paper.
Yi, Youn Kyu; Kim, Hyun Soo; Tran, Tam; Hong, Sung Kil; Kim, Myong Jun
Recovering valuable metals such as Si, Ag, Cu, and Al has become a pressing issue as end-of-life photovoltaic modules need to be recycled in the near future to meet legislative requirements in most countries. Of major interest is the recovery and recycling of high-purity silicon (> 99.9%) for the production of wafers and semiconductors. The value of Si in crystalline-type photovoltaic modules is estimated to be approximately $95/kW at the 2012 metal price. At the current installed capacity of 30 GW/yr, the metal value in the PV modules represents valuable resources that should be recovered in the future. The recycling of end-of-life photovoltaic modules would supply > 88,000 and 207,000 tpa Si by 2040 and 2050, respectively. This represents more than 50% of the required Si for module fabrication. Experimental testwork on crystalline Si modules could recover a > 99.98%-grade Si product by HNO3/NaOH leaching to remove Al, Ag, Ti and other metal ions from the doped Si. A further pyrometallurgical smelting at 1520 degrees C using a CaO-CaF2-SiO2 slag mixture to scavenge the residual metals after acid leaching could finally produce > 99.998%-grade Si. A process based on HNO3/NaOH leaching and subsequent smelting is proposed for recycling Si from rejected or recycled photovoltaic modules. Implications: The photovoltaic industry is considering options for recycling PV modules to recover metals such as Si, Ag, Cu, Al, and others used in the manufacturing of PV cells. This is to retain its "green" image and to comply with current legislation in several countries. An evaluation of the potential resources made available from PV wastes and the technologies used for processing these materials is therefore of significant importance to the industry. Of interest are the costs of processing and the potential revenues gained from recycling, which should determine the viability of economic recycling of PV modules in the future.
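The headline figures can be checked with simple arithmetic, under the assumption that the quoted per-kW silicon value applies uniformly across the annual installed capacity:

```python
# Back-of-envelope check of the abstract's figures (assumption: ~$95 of
# Si per kW, at the 2012 metal price, applies to all installed modules).
value_per_kw = 95              # USD of Si per kW of module
installed_gw_per_year = 30     # current installed capacity, GW per year
kw_per_gw = 1_000_000
annual_si_value = value_per_kw * installed_gw_per_year * kw_per_gw
print(f"${annual_si_value / 1e9:.2f} billion of Si installed per year")
```

At roughly $2.85 billion of silicon value entering the field each year, the case for eventual recovery from end-of-life modules is easy to see.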
The massive growth of and access to information technology (IT) has enabled the integration of technology into classrooms. One such integration is the use of WebQuests as an instructional tool in teaching targeted learning activities such as writing abstracts of research articles in English for English as a Foreign Language (EFL) learners. In the academic world, writing an abstract of a research paper or final project in English can be challenging for EFL students. This article presents an action research project on the process and outcomes of using a WebQuest designed to help 20 Indonesian university IT students write a research article’s abstract in English. Findings reveal that despite positive feedback, changes need to be made to make the WebQuest a more effective instructional tool for the purpose it was designed.
van Duppen, Z; Summa, M; Fuchs, T
Film or film fragments are often used in psychopathology education. However, so far very few articles have discussed the benefits and limitations of using films to explain or illustrate psychopathology. Although numerous films involve psychopathology to varying degrees, it is not clear how we can use films for psychopathology education. We aim to examine the advantages, limitations and possible methods of using film as a means of increasing our knowledge and understanding of psychiatric illnesses. We discuss five examples that illustrate the interaction of film and psychopathology: on the one hand we explain how psychopathological concepts are used in each film, and on the other hand we explain which aspects of each film are valuable aids for teaching psychopathology. The use of film makes it possible to introduce the following topics into a psychopathology teaching programme: holistic psychiatric reasoning, phenomenology and subjective experience, the recognition of psychopathological prototypes, and the importance of context. There is undoubtedly an analogy between the method we have chosen for teaching psychopathology with the help of films and the holistic approach of the psychiatrist and his or her team. We believe psychopathology education can benefit from films, and we recommend that our colleagues use them in this way.
T. A. Sannikova
One of the main directions in the development of the food industry is the production of functional food products. Changes in the structure of the human diet mean that no population group receives the necessary amounts of vitamins, macro- and microelements from a routine healthy diet. To solve this problem, foodstuffs are enhanced with different ingredients to improve their biological and nutritional value. The pumpkin is a valuable source of such important substances as carotene and pectin. Adding garlic and hot pepper ingredients during pumpkin pickling enriches the product with carbohydrates, proteins and microelements that are present in low amounts, or absent, in the pumpkin fruit. Therefore, studying the influence of different quantities of garlic and hot pepper additions on the chemical composition of the finished product is very important. The influence of the plant additions used on the chemical composition of the finished product was determined. It was shown that with increased doses of garlic and hot pepper ingredients, as compared with the control, the carotene and dry matter content of pickled pumpkin decreased by 1.16%-3.43%, while the pectin content depended on the added component. The highest pectin content, 0.71%, was observed with the addition of 10 g of garlic ingredient per 1 kg of raw material, which was 4.1 times higher than the control. With increased addition of the hot pepper ingredient, pectin accumulation decreased from 0.58% in the control to 0.36% in the variant with 10 g per 1 kg of raw material.
Philip W. Gassman; Manuel R. Reyes; Colleen H. Green; Jeffrey G. Arnold
The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been ad...
The production lines used for manufacturing U-shaped profiles are very complex and must have high productivity. One of the most important stages of the fabrication process is cutting-off. This paper presents experimental research on, and analysis of, the durability of the cutting tools used for cutting off U-shaped steel profiles. The results of this work can be used to predict the durability of the cutting tools.
Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday
In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…
To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing for simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing for observation of group-level dynamics; and (3) heat maps of levels of indicators changing over time, allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further participation in open data standards like HICDEP by the HIV research community.
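The event probability curves paired with the longitudinal plots above are of the survival-curve family. As a hedged sketch (not the CCASAnet tool itself, which is R-based), a minimal Kaplan-Meier estimator of the kind that underlies such curves looks like this:

```python
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = event (e.g. death), 0 = censored.
    Returns [(time, survival_probability)] stepped at each event time."""
    pairs = sorted(zip(times, events))
    n_at_risk, survival, curve = len(pairs), 1.0, []
    for t, e in pairs:
        if e:
            # Standard product-limit update at an observed event.
            survival *= (n_at_risk - 1) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= 1  # censored subjects leave the risk set silently
    return curve

# Four hypothetical patients: events at t=2, 5, 8; one censored at t=3.
print(kaplan_meier([2, 3, 5, 8], [1, 0, 1, 1]))
```

Plotting these step values over time, stratified by patient class, yields the side-by-side event probability view the abstract describes.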
Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.
The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economical and flexible system for this purpose. The CAMAC system also facilitates the development of modular components controlling the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments.
Tahmasebi, Farhad; Pearce, Robert
A description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.
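The criteria roll-up the abstract describes can be mimicked with a simple weighted score. The paper's tool is built in Excel VBA; the sketch below only borrows the four criterion names from the abstract, and the weights and scores are invented for illustration.

```python
# Hypothetical weights over the four evaluation criteria named in the
# abstract; the real tool's weighting scheme is not stated there.
CRITERIA_WEIGHTS = {
    "stakeholder_buy_in": 0.25,
    "partnership_performance": 0.25,
    "technical_challenge_progress": 0.30,
    "enablement_forecast": 0.20,
}

def outcome_progress(scores):
    """scores: criterion -> value in [0, 1]; returns weighted progress."""
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

example = {"stakeholder_buy_in": 0.8, "partnership_performance": 0.6,
           "technical_challenge_progress": 0.5, "enablement_forecast": 0.9}
print(round(outcome_progress(example), 2))
```

Ranking Outcomes by such a composite score is one plausible way a portfolio view like the one described could surface where progress lags.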
Full Text Available Powder brazing filler metals (PBFMs) feature a number of comparative advantages, including low energy consumption, accurate dosage, good brazeability, short production time, and high production efficiency. These filler metals have been used in the aerospace, automobile, and electric appliance industries. PBFMs are especially suitable for bonding diamond tools, which involves complex workpiece shapes and requires accurate dosage. Recent research on PBFMs for diamond tools is reviewed in this paper, and current applications are discussed. CuSnTi and Ni-Cr-based PBFMs have been the two most commonly used monolayer PBFMs; thus, the bonding mechanisms at the interface between these monolayer PBFMs and a diamond tool are summarized first, and ways to improve the performance of monolayer PBFMs for diamond tools are analyzed. Next, research on PBFMs for impregnated diamond tools is reviewed, and the technical problems that urgently need solutions are discussed. Finally, the challenges and opportunities in PBFM research and development for diamond tools are summarized, and corresponding prospects are suggested.
Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles
This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.
Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf
In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community and to integrate them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command line and its output goes to the standard output, then in a few minutes one can m...
POP Nicolae Al.
Full Text Available Starting from the meaning of the communication process in marketing, the authors try to identify its role in assuring the continuity of the management process with respect to the long-term relationships between all the partners of the company. An emphasis is placed on the role of online communication and its tools in relationship marketing. In order to validate some of the mentioned ideas, the authors undertook a qualitative marketing research study among the managers of some Romanian tourism companies. The qualitative part of the study had as its purpose the identification of the main tools which form the basis of communication with the beneficiaries of touristic services, and of the way in which companies use online communication tools for attracting, keeping and developing long-term relationships with their customers in the virtual environment. The following tools were analyzed: websites, email marketing campaigns, e-newsletters, online advertising, search engines, sponsored links, blogs, RSS feeds, social networks, forums, online discussion groups, portals, infomediaries and instant messaging. The chosen investigation method was the selective survey, the research technique explorative interrogation, and the research instrument a semi-structured in-depth interview based on a conversation guide. A very important result is the ranking obtained when the respondents were asked to mention the most efficient tools for attracting customers and for maintaining relationships with them. Although the notoriety of online marketing tools is high, some tools are known by definition but are not used at all or are not used correctly, while others are not known by definition but are used in practice. The authors contributed by validating a performing methodology of qualitative research, a study which will open new ways and means for making the online communication tools used for touristic services in
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information alone is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants’ comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants’ ‘recall’ and ‘understanding’ between the first and second visits were statistically significant (F(1,41)=25.38). Conclusions The multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065
Macdermid, Joy C; Miller, Jordan; Gross, Anita R
Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain-a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.
Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E
A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi
For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.
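The working definition above (risk = probability of a successful attack × value of the resulting loss) is simple enough to sketch in code. The following is a minimal, hypothetical Python illustration; all function names, probabilities and loss figures are our own illustrative choices, not values or an API from the cited tool:

```python
import math

# Sketch of the risk definition quoted in the abstract:
#   risk = P(successful attack) * value of the resulting loss.
# Every number below is a made-up example.

def risk(p_attack: float, loss_value: float) -> float:
    """Expected loss for one attack scenario (probability times loss)."""
    return p_attack * loss_value

def risk_reduction(p_before: float, p_after: float, loss_value: float) -> float:
    """Risk removed by a mitigation that lowers the attack probability."""
    return risk(p_before, loss_value) - risk(p_after, loss_value)

def p_any_path(path_probs) -> float:
    """OR-gate of an attack tree: probability that at least one of
    several independent attack paths succeeds."""
    return 1.0 - math.prod(1.0 - p for p in path_probs)

# A mitigation that halves the probability of a $2M-loss attack:
reduction = risk_reduction(0.10, 0.05, 2_000_000)
# Combined success probability over two independent attack paths:
combined = p_any_path([0.10, 0.05])
```

A computation of this kind is what makes the cost-benefit analysis mentioned above possible: a mitigation is defensible when its cost is below the risk reduction it buys, and the OR-gate is the simplest attack-tree style aggregation a calculation engine might perform.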
Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly
Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms -- "sniffers", administrative reports, decision support and clinical research applications are presented.
Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de, E-mail: email@example.com, E-mail: firstname.lastname@example.org [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear
Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group quite diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group, and the diversified nature of their relations with the group and with IPEN. Then the technical competencies and current research lines (or programs) are described, as well as the research projects and the management scheme of the group. Next, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software in this area, but we will focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, like graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)
James S. Bates
Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesot...
Harbottle, Jennifer; Strangward, Patrick; Alnuamaani, Catherine; Lawes, Surita; Patel, Sanjai; Prokop, Andreas
The "droso4schools" project aims to introduce the fruit fly "Drosophila" as a powerful modern teaching tool to convey curriculum-relevant specifications in biology lessons. Flies are easy and cheap to breed and have been at the forefront of biology research for a century, providing unique conceptual understanding of biology and…
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Trexler, Grant Lewis
This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…
Narasimharao, B. Pandu, Ed.; Wright, Elizabeth, Ed.; Prasad, Shashidhara, Ed.; Joshi, Meghana, Ed.
Higher education institutions play a vital role in their surrounding communities. Besides providing a space for enhanced learning opportunities, universities can utilize their resources for social and economic interests. The "Handbook of Research on Science Education and University Outreach as a Tool for Regional Development" is a…
Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.
Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma-ray sources; (3) accuracy of the current and future γ-ray energy scale; and (4) new solid-state X- and γ-ray detectors.
Tahmasebi, Farhad; Pearce, Robert
Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.
The article is a reflection on the use of an oral diary as a qualitative research tool, the role that it played during fieldwork and the methodological issues that emerged. It draws on a small-scale empirical study into primary school teachers' use of group discussion, during which oral diaries were used to explore and document teacher reflective…
Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.
Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…
Roerig, S.; Evers, S.J.T.M.; Krabbendam, L.
The relation between theatre, or drama, and research is not novel, as illustrated by concepts such as role theory, theatre for development, or distancing in drama therapy. In various scientific fields theatre is used as a communicative and/or educative tool; however, in the realm of childhood
Pitsch, Karola; Neumann, Alexander; Schnier, Christian; Hermann, Thomas
We suggest that an Augmented Reality (AR) system for coupled interaction partners provides a new tool for linguistic research that allows researchers to manipulate the coparticipants’ real-time perception and action. It encompasses novel facilities for recording heterogeneous sensor-rich data sets to be accessed in parallel with qualitative/manual and quantitative/computational methods.
Full Text Available Annually, the Editorial Activity division of the Academy of Public Administration edits the proceedings of the scientific-practical conferences with international participation „Theory and Practice of Public Administration” in a separate volume. This year's collection contains 120 articles signed by researchers of the Academy, of other national higher education institutions and of similar institutions abroad, and of central and local public authorities. The most relevant scientific research presented in the plenary session of the Conference, as well as within six workshops, is emphasized in the article.
Human rights education (HRE) aims to achieve a change of mindsets and social attitudes that entails the construction of a culture of respect towards those values it teaches. Although HRE is a recent field of study, its consolidation in Latin America is a fact. During the latest decades several authors have carried out research related to HRE that…
Bacon, Donald R.; Paul, Pallab; Stewart, Kim A.; Mukhopadhyay, Kausiki
Much has been written about the evaluation of faculty research productivity in promotion and tenure decisions, including many articles that seek to determine the rank of various marketing journals. Yet how faculty evaluators combine journal quality, quantity, and author contribution to form judgments of a scholar's performance is unclear. A…
Virtual Globes are a paradigm shift in the way earth sciences are conducted. With these tools, nearly all aspects of earth science can be integrated, from field science, to remote sensing, to remote collaborations, to logistical planning, to data archival/retrieval, to PDF paper retrieval, to education and outreach. Here we present an example of how VGs can be fully exploited for field sciences, using research at McCall Glacier, in Arctic Alaska.
Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K
Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study attempts to further develop these areas and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration-related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
Seul, M.; Brazil, L.; Castronova, A. M.
CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration. Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, as it is a central need for many of these research topics. CUAHSI (The Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source CUAHSI-supported tools that enable enhanced data discovery online, using advanced searching capabilities and computational analysis run in virtual environments pre-designed for educators and scientists so they can focus their efforts on data analysis rather than IT set-up.
Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, Info-Tech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.
Full Text Available Gerald (Gerry) Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms.
's claim by fellow scientists, and (3) demonstrate the utility and value of the research contribution to any interested parties. However, turning an exploratory prototype into a “proper” tool for end-users often entails great effort. Heavyweight mainstream frameworks such as Eclipse do not address this issue; their steep learning curves constitute substantial entry barriers to such ecosystems. In this paper, we present the Model Analyzer/Checker (MACH), a stand-alone tool with a command-line interpreter. MACH integrates a set of research prototypes for analyzing UML models. By choosing a simple command-line interpreter rather than a (costly) graphical user interface, we achieved the core goal of quickly deploying research results to a broader audience while keeping the required effort to an absolute minimum. We analyze MACH as a case study of how requirements and constraints in an academic
Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret
Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.
Full Text Available Participatory-action research encourages the involvement of all key stakeholders in the research process and is especially well suited to mental health research. Previous literature outlines the importance of engaging stakeholders in the development of research questions and methodologies, but little has been written about ensuring the involvement of all stakeholders (especially non-academic members) in dissemination opportunities such as publication development. The Article Idea Chart was developed as a specific methodology for engaging all stakeholders in data analysis and publication development. It has been successfully utilised in a number of studies and is an effective tool for ensuring the dissemination process of participatory-action research results is both inclusive and transparent to all team members, regardless of stakeholder group. Keywords: participatory-action research, mental health, dissemination, community capacity building, publications, authorship
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.
Sacramento, Jose Miguel Noronha
This work aims to assist research institutes, notably IPEN, in improving their assertiveness in the process of defining their research lines. Evolutionary speeds have increased exponentially, requiring greater synchronism and multiple, coordinated action from the three fundamental elements that assure the development of contemporary society: Government, the Productive Structure, and the Infrastructure in Science and Technology. This increasingly dynamic and mutable environment imposes greater proximity with the socioeconomic environment, as the former client-consumer has become a co-creator of knowledge and a supplier of the energy now contained in a new standard of social relations, called the Networked Society. The difference in time scale for the University, the Productive Structure and the Government is a function of their main activities: Science, the Market and the winning of Public Opinion, respectively. The equation that will harmonize and find synergies between these three dimensions is the contemporary challenge for those who seek to innovate and advance knowledge in order to improve society's standard of living. This work shows that research institutes must believe in the words of Robert Plomin and start connecting to the several links in different chains in order to make use of a collective intelligence that continuously expands in speed and quality beyond any other time in human history. The comparison among the results obtained from the different methodologies of analysis proposed in this work allows identifying the strengths and weaknesses, threats and opportunities of IPEN, providing subsidies to find better ways to tailor its performance to the new demands. (author)
Anna Kirova PhD
Full Text Available In this article the authors explore the effect of word-image relationships on the collection of data and the reporting of research results for a study involving the development of a series of fotonovelas with immigrant children in an inner-city school. The central question explored in this article is: "Can experiences such as producing visual narratives in the form of fotonovelas stimulate multiple expressions of voice and position and bring awareness of embodied ways of communicating in a culture-rich school context?" The processes involved in collaboratively developing the photographic narrative format of the fotonovela combine visual elements and structures and embodied, reflective performance together with written text. As a research method, the fotonovela does not merely translate verbal into visual representations but constructs a hybrid photo-image-text that opens new spaces for dialogue, resistance, and representation of a new way of knowing that changes the way of seeing and has the potential to change the author's and the reader's self-understanding.
Hayward, Jake; Cheung, Amandy; Velji, Alkarim; Altarejos, Jenny; Gill, Peter; Scarfe, Andrew; Lewis, Melanie
Context/Setting: The script theory of diagnostic reasoning proposes that clinicians evaluate cases in the context of an "illness script," iteratively testing internal hypotheses against new information until eventually reaching a diagnosis. We present a novel tool for teaching diagnostic reasoning to undergraduate medical students based on an adaptation of script theory. We developed a virtual patient case that used clinically authentic audio and video, interactive three-dimensional (3D) body images, and a simulated electronic medical record. Next, we used interactive slide bars to record respondents' likelihood estimates of diagnostic possibilities at various stages of the case. Responses were dynamically compared to data from expert clinicians and peers. Comparative frequency distributions were presented to the learner, and final diagnostic likelihood estimates were analyzed. Detailed student feedback was collected. Over two academic years, 322 students participated. Student diagnostic likelihood estimates were similar year to year but were consistently different from expert clinician estimates. Student feedback was overwhelmingly positive: students found the case novel, innovative, clinically authentic, and a valuable learning experience. We demonstrate the successful implementation of a novel approach to teaching diagnostic reasoning. Future study may delineate reasoning processes associated with differences between novice and expert responses.
Skonieczny, Łukasz; Rybiński, Henryk; Kryszkiewicz, Marzena; Niezgódka, Marek
This book is a selection of results obtained within three years of research performed under SYNAT, a nation-wide scientific project aiming at creating an infrastructure for scientific content storage and sharing for academia, education and the open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled "Intelligent Tools for Building a Scientific Information Platform" and "Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions", were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its contents are based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview of and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building...
Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu
Thermal error is the main factor affecting the accuracy of precision machining. Through experiments, this paper studies thermal error testing and intelligent modeling for the spindle of a vertical high-speed CNC machine tool, a current focus of machine-tool thermal error research. Several thermal error testing devices are designed: 7 temperature sensors measure the temperature of the machine tool spindle system, and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion-prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network techniques.
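The modeling pipeline in the abstract above (relate measured spindle temperatures to measured displacement, then invert the prediction for compensation) can be illustrated with a deliberately simplified sketch: a one-variable least-squares thermal error model in Python, standing in for the PCA-plus-neural-network model the authors actually build. The linear form and all names are assumptions for illustration only.

```python
def fit_linear_thermal_model(temps, displacements):
    """Least-squares fit of displacement ~ a + b * temperature.

    A simplified stand-in for the PCA + neural-network model described
    in the abstract; a real model would use several temperature points.
    """
    n = len(temps)
    mean_t = sum(temps) / n
    mean_d = sum(displacements) / n
    b = sum((t - mean_t) * (d - mean_d) for t, d in zip(temps, displacements)) \
        / sum((t - mean_t) ** 2 for t in temps)
    a = mean_d - b * mean_t
    return a, b


def compensation_offset(temp, model):
    """Axis offset that cancels the predicted thermal displacement."""
    a, b = model
    return -(a + b * temp)
```

Fitting on a few (temperature, displacement) calibration pairs yields the coefficients; `compensation_offset` then returns the correction to apply at the current spindle temperature.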
Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley
Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas, or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of consistent national data measuring land use mix, at this stage it has not been possible to create a national walkability measure. The next stage of the project is to increase the usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia. It
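The sub-components named above (street connectivity, density and land-use mix) are commonly combined into a walkability index as a sum of z-scores, with land-use mix measured by entropy. The following Python sketch illustrates that general idea; it is not the AURIN tool's actual formula, and the field names and example figures are assumptions.

```python
import math


def entropy_mix(shares):
    """Land-use mix entropy: 1.0 = perfectly even mix of uses, 0.0 = single use."""
    shares = [s for s in shares if s > 0]
    if len(shares) <= 1:
        return 0.0
    return -sum(s * math.log(s) for s in shares) / math.log(len(shares))


def z_scores(values):
    """Standardize values relative to the set of neighbourhoods being compared."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd if sd else 0.0 for v in values]


def walkability_index(neighbourhoods):
    """neighbourhoods: dicts with 'connectivity' (e.g. intersections/km2),
    'density' (e.g. dwellings/ha) and 'mix' (land-use shares summing to 1).
    Returns one relative walkability score per neighbourhood.
    """
    conn = z_scores([n["connectivity"] for n in neighbourhoods])
    dens = z_scores([n["density"] for n in neighbourhoods])
    mix = z_scores([entropy_mix(n["mix"]) for n in neighbourhoods])
    return [c + d + m for c, d, m in zip(conn, dens, mix)]
```

Because the scores are z-score sums, they are relative to the set of neighbourhoods compared: a well-connected, dense, mixed-use area scores above zero, a car-dependent single-use area below.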
Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel
International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between first and second visits were statistically significant (F(1,41) = 25.38). Further research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.
Seltzer, Erica D.; Stolley, Melinda R.; Mensah, Edward K.; Sharp, Lisa K.
Purpose The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace and their perceptions of using SNS for recruitment into survivorship research. Methods Sixty White, Black and Hispanic adult childhood cancer survivors (range 18-48 years of age) who were randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study (CHLS), participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Results Seventy percent of participants reported SNS usage, of which 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. Conclusions and implications for cancer survivors The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for recruitment of research. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research. PMID:24532046
Costa, Fabricio F
Advances in information technology have improved our ability to gather, collect and analyze information from individuals online. Social networks can be seen as a nonlinear superposition of a multitude of complex connections between people where the nodes represent individuals and the links between them capture a variety of different social interactions. The emergence of different types of social networks has fostered connections between individuals, thus facilitating data exchange in a variety of fields. Therefore, the question posed now is "can these same tools be applied to life sciences in order to improve scientific and medical research?" In this article, I will review how social networks and other web-based tools are changing the way we approach and track diseases in biomedical research.
Martínez Ruiz, María Ángeles; Ávalos Ramos, María Alejandra; Merma Molina, Gladys
The aim of this study is to analyse the metaphorical expressions designed by Science of Sport and Physical Activity university students, as a tool for inquiring into two research questions: their perceptions of their physical education teachers, and the meaning physical activity has in the students' personal lives. 51 students from the University of Alicante participated in the study. The qualitative data analysis software AQUAD 6 was used for data processing. The results obtained from the analysis of ...
Full Text Available Aim/Purpose: These days educators are expected to integrate technological tools into classes. Although they acquire relevant skills, they are often reluctant to use these tools. Background: We incorporated online forums for generating a Community of Inquiry (CoI) in a faculty development program. Extending the Technology, Pedagogy, and Content Knowledge (TPACK) model with Assessment Knowledge, and using content analysis of forum discourse and reflection after each CoI, we offer the Diagnostic Tool for Learning, Assessment, and Research (DTLAR). Methodology: This study spanned two cycles of a development program for medical faculty. Contribution: This study demonstrates how the DTLAR supports in-depth examination of the benefits and challenges of using CoIs for learning and teaching. Findings: Before the program, participants had little experience with, and were reluctant to use, CoIs in classes. At the program's completion, many were willing to adopt CoIs and appreciated the method's contribution. Both CoI discourse and reflections included positive attitudes regarding cognitive and teacher-awareness categories. However, negative attitudes regarding affective aspects and the time-consuming nature of CoIs were exposed. Participants who experienced facilitating a CoI gained additional insights into its usefulness. Recommendations for Practitioners: The DTLAR allows analyzing the adoption of online forums for learning and teaching. Recommendations for Researchers: The DTLAR allows analyzing factors that affect the acceptance of online forums for learning and teaching. Impact on Society: While the tool was implemented in the context of medical education, it can be readily applied in other adult learning programs. Future Research: The study includes several design aspects that probably affected the improvements and challenges we found. Future research is called for providing guidelines for identifying boundary conditions and potential for further
Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia
Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those workflows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012.
Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A
Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
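The abstract above describes a web form that auto-generates recurring database queries from user-selected order criteria. A minimal sketch of that idea in Python (the original tool was built in PHP/MySQL) might look like the following; the table and column names are hypothetical, and a real implementation would add authentication and scheduling.

```python
def build_order_query(filters):
    """Translate (field, operator, value) filters on a hypothetical
    'orders' table into a parameterized SQL query, in the spirit of the
    order-entry 'pick off' tool described above.

    Field names and operators are whitelisted, and values are returned
    as parameters rather than interpolated, to avoid SQL injection.
    """
    allowed_ops = {"=", "!=", "<", ">", "LIKE"}
    clauses, params = [], []
    for field, op, value in filters:
        if op not in allowed_ops:
            raise ValueError(f"unsupported operator: {op}")
        if not field.isidentifier():
            raise ValueError(f"unsafe field name: {field}")
        clauses.append(f"{field} {op} %s")
        params.append(value)
    where = " AND ".join(clauses) or "1=1"
    sql = f"SELECT patient_id, order_type, order_time FROM orders WHERE {where}"
    return sql, params
```

The generated (sql, params) pair could then be executed against the clinical system's database at recurring intervals, as the tool does, to pick off patients matching the criteria (e.g. active 'isolation' orders).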
Waterlander, Wilma E; Scarpa, Michael; Lentz, Daisy; Steenhuis, Ingrid H M
Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers on configurable questionnaires. All data is digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Results from participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. The Virtual Supermarket is an innovative research tool with a great potential to assist in gaining insight into food purchasing behaviour. The application can be obtained via a URL
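The back-end bookkeeping described above (time spent shopping, products purchased, budget, total expenditures, upload to a web server) can be sketched as a small per-participant session record. This Python sketch is illustrative only; all field names are assumptions, not the Virtual Supermarket's actual data model.

```python
import json


class ShoppingSession:
    """Minimal sketch of the kind of record the Virtual Supermarket's
    back-end might keep per participant (field names are assumptions)."""

    def __init__(self, participant_id, budget):
        self.participant_id = participant_id
        self.budget = budget
        self.purchases = []          # (product, price) tuples
        self.seconds_shopping = 0

    def buy(self, product, price):
        self.purchases.append((product, price))

    @property
    def total_spent(self):
        return sum(price for _, price in self.purchases)

    def to_json(self):
        """Serialize the session summary for upload to a web server,
        as the abstract describes."""
        return json.dumps({
            "participant": self.participant_id,
            "budget": self.budget,
            "n_products": len(self.purchases),
            "total_spent": round(self.total_spent, 2),
            "seconds_shopping": self.seconds_shopping,
        })
```

Keeping the record as plain JSON makes the automatic transfer to a central server, and later analysis of purchasing behaviour across price conditions, straightforward.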
One of the main requirements in agricultural research is to analyse a large number of samples for one or more of their chemical constituents and physical properties. In plant breeding programmes and germplasm evaluation, the analysis must be fast, as many samples are to be analysed. Pulsed nuclear magnetic resonance (NMR) is a potential tool for developing rapid and nondestructive methods of analysis. Various applications of low-resolution pulsed NMR in agricultural research, generally used as screening methods, are briefly described. 25 refs., 2 figs., 2 tabs
The GLARE (Grease Lubrication Apparatus for Research and Education) was designed as a fourth-year thesis project with the University of Ontario Institute of Technology (UOIT). The purpose of the apparatus is to train Ontario Power Generation Nuclear (OPGN) staff to properly lubricate bearings with grease and to help detect early equipment failures. Proper re-lubrication is critical to the nuclear industry as equipment may be inaccessible for long periods of time. A secondary purpose for the tool is for UOIT research and undergraduate laboratories. This abstract provides an overview of the project and its application to the nuclear industry. (author)
Daim, Tugrul; Kim, Jisun
Technologies such as renewable energy alternatives including wind, solar and biomass, storage technologies and electric engines are creating a different landscape for the electricity industry. Using sources and ideas from technologies such as renewable energy alternatives, Research and Technology Management in the Electricity Industry explores a different landscape for this industry and applies it to the electric industry supported by real industry cases. Divided into three sections, Research and Technology Management in the Electricity Industry introduces a range of methods and tools includ
Effective mentoring is a critical component in the training of early-career researchers, cultivating more independent, productive and satisfied scientists. For example, mentoring has been shown by the 2005 Sigma Xi National Postdoc Survey to be a key indicator of a successful postdoctoral outcome. Mentoring takes many forms and can include support for maximizing research skills and productivity as well as assistance in preparing for a chosen career path. Yet, because there is no "one-size-fits-all" approach, mentoring can be an activity that is hard to define. In this presentation, a series of tips and tools will be offered to aid mentors in developing a plan for their mentoring activities. This will include: suggestions for how to get started; opportunities for mentoring activities within the research group, within the institution, and outside the institution; tools for communicating and assessing professional milestones; and resources for fostering the professional and career development of mentees. Special considerations will also be presented for mentoring international scholars and women. These strategies will be helpful to the PI responding to the new NSF mentoring plan requirement for postdocs as well as to the student, postdoc, researcher or professor overseeing the research and training of others.
A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.
Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), the James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, the James Lind Alliance method and the Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more transparency and replicability.
E. A. Zablotskaya
The study of the correlation relationships between traits and the informativeness of the indicators makes it possible to conduct a preliminary assessment of the plants and to identify forms with high economically valuable characteristics more objectively. Their integrated assessment will identify the best source material for further selection. Published reports on the correlations in broccoli between yield and its components are inconsistent. The purpose of our study was to analyze the contingency of various traits and to identify significant correlation links between quantitative traits in broccoli hybrids (42 samples). They were obtained using doubled haploid (DH) lines of early maturity at 2 planting dates (spring and summer). Studies were conducted in the Odintsovo district of the Moscow region in a field experiment in 2015 and 2016. Growth and development were significantly influenced by the weather conditions during the growing period. Humidity and temperature conditions differed significantly between the years of study and the planting dates, which is an important circumstance for analyzing the data obtained. Based on the results of the research, it was concluded that the value of the correlation coefficient and the strength of the correlation relationship between the characteristics (mass, diameter, head height, plant height, vegetation period) are different and depend on the set of test specimens and growing conditions. A significant, stable positive correlation was revealed across all years of research and planting dates between the diameter and mass of the head (r = 0.45-0.96). The variability of the correlations among other economically valuable traits was noted.
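The head diameter-mass relationship reported above (r = 0.45-0.96) is an ordinary Pearson product-moment correlation. A minimal computation, using invented illustrative measurements rather than the study's data:

```python
from math import sqrt

def pearson_r(x, y):
    # Pearson product-moment correlation between two equal-length samples.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented head diameter (cm) and head mass (g) for 6 broccoli plants.
diameter = [10.2, 12.5, 11.0, 14.1, 13.3, 9.8]
mass = [310, 400, 370, 500, 450, 300]
print(round(pearson_r(diameter, mass), 2))  # -> 0.99
```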
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, software, and derived products.
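A minimal sketch of the kind of provenance trace described above: entities (data, scripts, derived products) linked to an execution by PROV-style "used" and "wasGeneratedBy" relations. The identifiers and dictionary layout are illustrative only, not the actual ProvONE serialization:

```python
# Toy PROV-style trace: one script execution (activity) that used two
# entities and generated one derived product. All identifiers are invented.
prov = {
    "entities": {
        "doi:10.x/input.csv": {"type": "data"},
        "doi:10.x/analysis.R": {"type": "script"},
        "doi:10.x/figure1.png": {"type": "derived"},
    },
    "used": [("exec-001", "doi:10.x/input.csv"),
             ("exec-001", "doi:10.x/analysis.R")],
    "wasGeneratedBy": [("doi:10.x/figure1.png", "exec-001")],
}

def inputs_of(product, trace):
    # Walk back from a derived product to the entities its execution used.
    generated_by = {p: a for p, a in trace["wasGeneratedBy"]}
    activity = generated_by[product]
    return sorted(e for a, e in trace["used"] if a == activity)

print(inputs_of("doi:10.x/figure1.png", prov))
# -> ['doi:10.x/analysis.R', 'doi:10.x/input.csv']
```

This is the lookup a provenance portal performs when it displays which inputs and software produced a given figure or dataset.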
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. However, DICOM is not yet widely used in this domain, and images are frequently encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types thus far not considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
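The upload tools described above talk to XNAT through its REST API, which exposes a project/subject/experiment hierarchy under `/data`. The sketch below only builds an XNAT-style resource path for uploading a segmentation file; the host is a placeholder, and the exact path segments are an assumption to verify against the server's API documentation rather than the paper's actual scripts:

```python
from urllib.parse import quote

XNAT_BASE = "https://xnat.example.edu"  # placeholder host, not a real server

def resource_url(project, subject, experiment, resource, filename):
    """Build an XNAT-style REST path for storing a file as an extra resource
    (e.g. a segmentation) attached to a previously uploaded experiment."""
    parts = ["data", "projects", project, "subjects", subject,
             "experiments", experiment, "resources", resource,
             "files", filename]
    # Percent-encode each segment so spaces etc. survive in the URL.
    return XNAT_BASE + "/" + "/".join(quote(p) for p in parts)

url = resource_url("eye01", "S001", "OCT_2024", "SEGMENTATION", "macula seg.nii")
print(url)
```

A client would then issue an authenticated HTTP PUT to this URL with the file body; that part is omitted here since it depends on site credentials.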
Guertin, L. A.
VoiceThread has been utilized in an undergraduate research methods course for peer review and final research project dissemination. VoiceThread (http://www.voicethread.com) can be considered a social media tool, as it is a web-based technology with the capacity to enable interactive dialogue. VoiceThread is an application that allows a user to place a media collection online containing images, audio, videos, documents, and/or presentations in an interface that facilitates asynchronous communication. Participants in a VoiceThread can be passive viewers of the online content or engaged commenters via text, audio, video, with slide annotations via a doodle tool. The VoiceThread, which runs across browsers and operating systems, can be public or private for viewing and commenting and can be embedded into any website. Although few university students are aware of the VoiceThread platform (only 10% of the students surveyed by Ng (2012)), the 2009 K-12 edition of The Horizon Report (Johnson et al., 2009) lists VoiceThread as a tool to watch because of the opportunities it provides as a collaborative learning environment. In Fall 2011, eleven students enrolled in an undergraduate research methods course at Penn State Brandywine each conducted their own small-scale research project. Upon conclusion of the projects, students were required to create a poster summarizing their work for peer review. To facilitate the peer review process outside of class, each student-created PowerPoint file was placed in a VoiceThread with private access to only the class members and instructor. Each student was assigned to peer review five different student posters (i.e., VoiceThread images) with the audio and doodle tools to comment on formatting, clarity of content, etc. After the peer reviews were complete, the students were allowed to edit their PowerPoint poster files for a new VoiceThread. In the new VoiceThread, students were required to video record themselves describing their research
M. M. Aligadjiev
Aim. The paper discusses the improvement of methods of hydrobiological studies by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice of collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz to the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Upstein net (applied in shallow water) to collect zooplankton samples, with an upper inverted cone that will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and by using sensors to determine the tilt of the dredge and the position of the valves of the bucket.
Assembly accounts for the greatest workload and time consumed during the product design and manufacturing process. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information of transmission, and geometric constraint information were quantified into an assembly connection strength that depicts the level of assembly difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was conducted, and interferential assembly units were identified and revised. The assembly sequence planning and optimization of parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explores assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and realizes the optimized assembly sequence for the headstock of a CNC machine tool.
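The sequence-planning step above searches over part orderings with a genetic algorithm. The paper's actual cost model (assembly connection strength) is not reproduced here; the toy below uses invented parts and precedence constraints, and a simplified mutation-only evolutionary search (no crossover) to show the shape of the approach:

```python
import random

# Invented example: 6 parts, with pairs (a, b) meaning a must be
# assembled before b. A real cost would use connection strengths.
PARTS = list(range(6))
PRECEDENCE = [(0, 2), (1, 3), (2, 4), (3, 5)]

def cost(seq):
    # Number of violated precedence constraints in a candidate sequence.
    pos = {p: k for k, p in enumerate(seq)}
    return sum(1 for a, b in PRECEDENCE if pos[a] > pos[b])

def mutate(seq, rng):
    # Swap two positions: a standard permutation mutation operator.
    s = seq[:]
    i, j = rng.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def ga(generations=200, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(PARTS, len(PARTS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = [mutate(rng.choice(survivors), rng)
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=cost)

best = ga()
print(best, cost(best))  # cost 0 means every precedence constraint is satisfied
```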
The multi-disciplinary and international nature of large European projects requires powerful managerial and communicative tools to ensure the transmission of information to end-users. One such project is TRACE, entitled "Tracing Food Commodities in Europe". One of its objectives is to provide a communication system intended to serve as the central source of information on food authenticity and traceability in Europe. This paper explores the web tools used and communication vehicles offered to scientists involved in the TRACE project to communicate internally as well as with the public. Two main tools have been built: an Intranet and a public website. The TRACE website can be accessed at http://www.trace.eu.org. Particular emphasis was placed on the efficiency, relevance and accessibility of the information, the publicity of the website, and the use of the collaborative utilities. The rationale for the web space design, as well as the integration of proprietary software solutions, is presented. Perspectives on the use of web tools in research projects are discussed.
Argyropoulou, Eleftheria; Hatira, Kalliopi
This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…
Rodney R. Dietert
Academic preparation of science researchers and/or human or veterinary medicine clinicians through the science, technology, engineering, and mathematics (STEM) curriculum has usually focused on the students (1) acquiring increased disciplinary expertise, (2) learning needed methodologies and protocols, and (3) expanding their capacity for intense, persistent focus. Such educational training is effective until roadblocks or problems arise that resist this highly learned approach. Then, the health science trainee may have few tools available for effective problem solving. Training to achieve flexibility, adaptability, and broadened perspectives using contemplative practices has been rare among biomedical education programs. To address this gap, a Cornell University-based program involving formal biomedical science coursework and health science workshops has been developed to offer science students, researchers and health professionals a broader array of personal, contemplation-based, problem-solving tools. This STEM educational initiative includes first-person exercises designed to broaden perceptional awareness, decrease emotional drama, and mobilize whole-body strategies for creative problem solving. Self-calibration and journaling are used for students to evaluate the personal utility of each exercise. The educational goals are to increase student self-awareness and self-regulation and to provide trainees with value-added tools for career-long problem solving. Basic elements of this educational initiative are discussed using the framework of the Tree of Contemplative Practices.
Dubosarsky, Mia D.
How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.
The paper describes research and development on the casting and solidification of slab ingots from special tool steels by means of numerical modelling using the finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Problems with determining the thermophysical properties of materials and the heat transfer between the individual parts of the casting system are also discussed. Based on the grade of tool steel, the risk of final porosity is predicted. The results made it possible to improve the production technology of slab ingots, and also to verify the ratio, the chamfer and the external/internal shape of the wall of the newly designed slab ingots.
We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.
H. Valles; C.M.S. Carrington
There has been a recent proposal to change the way that biology is taught and learned in undergraduate biology programs in the USA so that students develop a better understanding of science and the natural world. Here, we use this new, recommended teaching-learning framework to assert that permanent forestry plots could be a valuable tool to help develop biology...
Over the last 60 years research reactors (RRs) have played an important role in the technological and socio-economical development of mankind, such as radioisotope production for medicine, industry, research and education. Neutron scattering has been widely used for research and development in materials science. The prospect of neutron scattering as a powerful tool for materials research is increasing in the 21st century. This can be seen from the investment in several new neutron sources all over the world, such as the Spallation Neutron Source (SNS) in the USA, the Japan Proton Accelerator Research Complex (J-PARC) in Japan, the new OPAL Reactor in Australia, and upgrades to existing sources at ISIS, Rutherford Appleton Laboratory, UK; the Institute of Laue-Langevin (ILL) in Grenoble, France; and the Berlin Reactor, Germany. Developing countries with moderate-flux research reactors have also been involved in this technique, such as India, Malaysia and Indonesia. The Siwabessy Multipurpose Reactor in Serpong, Indonesia, which also produces thermal neutrons, has contributed to research and development in the Asia-Pacific Region. However, international joint research among those countries plays an important role in optimizing the results. (author)
Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa
To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.
The UbuntuNet Alliance is well placed to facilitate interaction between education and research institutions and African academics and researchers in the Diaspora so that together they can strengthen research that will exploit new technological tools and increase the industrial base. It is envisaged that the Alliance will become an important vehicle for linkages that will facilitate the repatriation of scientific knowledge and skills to Africa and even help reduce and eventually eradicate the brain drain which has taken so many excellent intellectuals to the developed world. As organisational vehicles for inter-institutional collaboration, both established and emerging NRENs can play a critical role in reversing these trends and in mitigating the negative impact of the brain drain.
Darling, John A.; Frederick, Raymond M.
Understanding the risks of biological invasion posed by ballast water - whether in the context of compliance testing, routine monitoring, or basic research - is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high-throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport - and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes).
Sade, Christian; de Barros, Leticia Maria Renault; Melo, Jorge José Maciel; Passos, Eduardo
This paper seeks to assess a way of conducting interviews in line with the ideology of Brazilian Psychiatric Reform. In the methodology of participative intervention and research in mental health, the interview is less a data collection than a data harvesting procedure. It is designed to apply the principles of psychosocial care, autonomy as the basis for treatment, the predominance of the users and of their social networks and civic participation. Inspired by the Explicitation Interview technique, the contention is that the handling of the interview presupposes an open attitude able to promote and embrace different viewpoints. This attitude makes the interview a collective experience of sharing and belonging, allowing participants to reposition themselves subjectively in treatment with the emergence of groupality. As an example of using the interview as a methodological tool in mental health research, we examine research into adaptation of the tool of Autonomous Medication Management (GAM). It is an interventionist approach guided by principles that foster autonomy and the protagonist status of users of psychotropic medication, their quality of life, their rights and recognition of the multiple significances of medication, understood here as a collective interview technique.
Maimon, Eric; Samuni, Uri; Goldstein, Sara
Radicals are part of the chemistry of life, and ionizing radiation chemistry serves as an indispensable research tool for elucidation of the mechanism(s) underlying their reactions. The ever-increasing understanding of their involvement in diverse physiological and pathological processes has expanded the search for compounds that can diminish radical-induced damage. This review surveys the areas of research focusing on radical reactions and particularly with stable cyclic nitroxide radicals, which demonstrate unique antioxidative activities. Unlike common antioxidants that are progressively depleted under oxidative stress and yield secondary radicals, nitroxides are efficient radical scavengers yielding in most cases their respective oxoammonium cations, which are readily reduced back in the tissue to the nitroxide thus continuously being recycled. Nitroxides, which not only protect enzymes, cells, and laboratory animals from diverse kinds of biological injury, but also modify the catalytic activity of heme enzymes, could be utilized in chemical and biological systems serving as a research tool for elucidating mechanisms underlying complex chemical and biochemical processes.
Horvath, Monica M; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey
In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to facilitate this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust Web application for cohort identification and data extraction: the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides a Guided Query functionality using a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must devote sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. Copyright © 2010 Elsevier Inc. All rights reserved.
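A guided query of the kind described, where users compose filter criteria through a wizard instead of writing SQL, can be sketched as a predicate-based filter over records. Everything below is invented for illustration (field names, operator suffixes, and data); it is not DEDUCE's interface:

```python
# Hypothetical mini "guided query": criteria are keyword arguments, where a
# "__min" suffix means a numeric lower bound and "__has" means membership.
RECORDS = [
    {"mrn": "A1", "age": 67, "dx": ["I10", "E11.9"], "unit": "cardiology"},
    {"mrn": "A2", "age": 54, "dx": ["J45"], "unit": "pulmonology"},
    {"mrn": "A3", "age": 71, "dx": ["E11.9"], "unit": "endocrinology"},
]

def guided_query(records, **criteria):
    def keep(rec):
        for key, want in criteria.items():
            field, _, op = key.partition("__")
            if op == "min" and not rec[field] >= want:
                return False
            if op == "has" and want not in rec[field]:
                return False
            if op == "" and rec[field] != want:
                return False
        return True
    return [r["mrn"] for r in records if keep(r)]

# Cohort: patients aged 60+ with a type 2 diabetes code.
print(guided_query(RECORDS, age__min=60, dx__has="E11.9"))  # -> ['A1', 'A3']
```

The design point is the one the abstract makes: the user supplies structured criteria, and the tool, not the user, translates them into queries against the warehouse.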
Tomás, Concepción; Yago, Teresa; Eguiluz, Mercedes; Samitier, M A Luisa; Oliveros, Teresa; Palacios, Gemma
To validate the questionnaire "Gender Perspective in Health Research" (GPIHR) to assess the inclusion of gender perspective in research projects. Validation study in two stages: feasibility was analysed in the first, and reliability, internal consistency and validity in the second. Aragón Institute of Health Science, Aragón, Spain. GPIHR was applied to 118 research projects funded in national and international competitive tenders from 2003 to 2012. Analysis of inter- and intra-observer reliability with the Kappa index and internal consistency with Cronbach's alpha. Content validity was analysed through literature review and construct validity with an exploratory factor analysis. The validated GPIHR has 10 questions: 3 on the introduction, 1 on objectives, 3 on methodology and 3 on research purpose. Average time of application was 13 min. Inter-observer reliability (Kappa) varied between 0.35 and 0.94 and intra-observer between 0.40 and 0.94. The theoretical construct is supported in the literature. Factor analysis identified three levels of GP inclusion: "difference by sex", "gender sensitive" and "feminist research", with internal consistencies of 0.64, 0.87 and 0.81, respectively, which explain 74.78% of the variance. The GPIHR questionnaire is a valid tool to assess GP and useful for researchers who would like to include GP in their projects. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
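The two statistics reported above, Cohen's kappa for rater agreement and Cronbach's alpha for internal consistency, can be computed as follows. This is a minimal sketch with invented rating data, not the GPIHR study's analysis (the study also used a weighted Kappa variant and factor analysis not shown here).

```python
# Minimal implementations of Cohen's kappa (two raters, nominal codes)
# and Cronbach's alpha (internal consistency of a multi-item scale).
from collections import Counter
from statistics import pvariance

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on the same items."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n   # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))  # chance agreement
    return (po - pe) / (1 - pe)

def cronbach_alpha(items):
    """items: list of per-item score lists (one list per question)."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-respondent totals
    item_var = sum(pvariance(i) for i in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

a = [1, 1, 0, 1, 0, 1, 1, 0]   # rater 1 (invented codes)
b = [1, 0, 0, 1, 0, 1, 1, 1]   # rater 2
print(round(cohens_kappa(a, b), 2))  # → 0.47
```

Note that kappa corrects raw percent agreement (here 75%) for agreement expected by chance, which is why it lands near 0.47 rather than 0.75.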
Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A
Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884
Combination of bioaffinity and chromatography gave birth to affinity chromatography. A further combination with frontal analysis resulted in the creation of frontal affinity chromatography (FAC). This versatile research tool enables detailed analysis of the weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC is now the best method for the investigation of saccharide-binding proteins (lectins) from the viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology. It has opened a door to deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described. PMID:25169774
Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M
Only through concerted and well-executed research endeavors can we gain the requisite knowledge to advance pregnancy care and have a positive impact on maternal and newborn health. Yet the heterogeneity inherent in individual studies limits our ability to compare and synthesize study results, thus impeding the capacity to draw meaningful conclusions that can be trusted to inform clinical care. The PhenX Toolkit (http://www.phenxtoolkit.org), supported since 2007 by the National Institutes of Health, is a web-based catalog of standardized protocols for measuring phenotypes and exposures relevant for clinical research. In 2016, a working group of pregnancy experts recommended 15 measures for the PhenX Toolkit that are highly relevant to pregnancy research. The working group followed the established PhenX consensus process to recommend protocols that are broadly validated, well established, nonproprietary, and have a relatively low burden for investigators and participants. The working group considered input from the pregnancy experts and the broader research community and included measures addressing the mode of conception, gestational age, fetal growth assessment, prenatal care, the mode of delivery, gestational diabetes, behavioral and mental health, and environmental exposure biomarkers. These pregnancy measures complement the existing measures for other established domains in the PhenX Toolkit, including reproductive health, anthropometrics, demographic characteristics, and alcohol, tobacco, and other substances. The preceding domains influence a woman's health during pregnancy. For each measure, the PhenX Toolkit includes data dictionaries and data collection worksheets that facilitate incorporation of the protocol into new or existing studies. The measures within the pregnancy domain offer a valuable resource to investigators and clinicians and are well poised to facilitate collaborative pregnancy research with the goal to improve patient care. To achieve this
Park, Sinyoung; Nam, Chung Mo; Park, Sejung; Noh, Yang Hee; Ahn, Cho Rong; Yu, Wan Sun; Kim, Bo Kyung; Kim, Seung Min; Kim, Jin Seok; Rha, Sun Young
With the growing amount of clinical research, regulations and research ethics are becoming more stringent. This trend introduces a need for quality assurance measures to ensure adherence to research ethics and human research protection beyond Institutional Review Board approval. Audits, one of the most effective tools for assessing quality assurance, are used to evaluate Good Clinical Practice (GCP) and protocol compliance in clinical research. However, they are laborious, time consuming, and require expertise. Therefore, we developed a simple auditing process (a screening audit) and evaluated its feasibility and effectiveness. The screening audit was developed using a routine audit checklist based on the Severance Hospital's Human Research Protection Program policies and procedures. The measure includes 20 questions, and results are summarized in five categories of audit findings. We analyzed 462 studies that were reviewed by the Severance Hospital Human Research Protection Center between 2013 and 2017, retrospectively examining research characteristics, reply rate, audit findings, associated factors, and post-screening audit compliance. Investigator reply rates gradually increased, except for the first year (73% → 26% → 53% → 49% → 55%). The studies were graded as "critical," "major," "minor," and "not a finding" (11.9, 39.0, 42.9, and 6.3%, respectively), based on findings and number of deficiencies. The auditors' decisions showed fair agreement, with weighted kappa values of 0.316, 0.339, and 0.373. Low-risk studies, single-center studies, and non-phase clinical research were more frequently graded "major" or "critical" (p = 0.002), and audit grade was associated with the results of post-screening audit compliance checks in "non-responding" and "critical" studies upon applying the screening audit. Our screening audit is a simple and effective way to assess overall GCP compliance by institutions and to
This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, and executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
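The cost-minimizing search over candidate phase mappings described above can be sketched as a simple dynamic program over the phase sequence. This is an illustrative reconstruction, not DDT's actual algorithm; the mapping names ("block", "cyclic") and all cost numbers are invented.

```python
# Sketch of choosing one data mapping per computational phase so that
# total computation cost plus remapping (data-movement) cost is minimal.
def best_plan(candidates, comp_cost, remap_cost):
    """candidates[i]: mappings available for phase i;
    comp_cost[i][m]: cost of running phase i with mapping m;
    remap_cost[(m1, m2)]: cost of redistributing arrays from m1 to m2."""
    # best[m] = (total cost, mapping chosen per phase) for plans ending in m
    best = {m: (comp_cost[0][m], [m]) for m in candidates[0]}
    for i in range(1, len(candidates)):
        nxt = {}
        for m in candidates[i]:
            nxt[m] = min(
                (best[p][0] + remap_cost.get((p, m), 0) + comp_cost[i][m],
                 best[p][1] + [m])
                for p in best)
        best = nxt
    return min(best.values())

candidates = [["block", "cyclic"], ["block", "cyclic"]]
comp_cost = [{"block": 4, "cyclic": 6}, {"block": 5, "cyclic": 2}]
remap_cost = {("block", "cyclic"): 3, ("cyclic", "block"): 3}
print(best_plan(candidates, comp_cost, remap_cost))  # → (8, ['cyclic', 'cyclic'])
```

In this toy instance, paying more for phase 0 under the "cyclic" mapping beats the remapping cost that the cheaper per-phase choices would incur, which is exactly the trade-off the search space exploration weighs.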
Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C
River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
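The hierarchical classification step above, grouping river sections into classes by hydrogeomorphic character, can be illustrated with a minimal single-linkage agglomerative clustering. The two variables and their values below are invented; RESonate's actual procedure draws on 13 extracted variables and dedicated classification software.

```python
# Toy single-linkage hierarchical clustering of river sections described
# by (already standardized) hydrogeomorphic variables.
def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(points, n_clusters):
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)   # merge the closest pair
    return clusters

# Each section: (valley width, channel slope), invented values.
sections = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (1.0, 0.2)]
print(single_linkage(sections, 2))  # → [[0, 1], [2, 3]]
```

The two resulting groups would correspond to two candidate functional process zones: narrow steep valley sections versus wide low-gradient ones.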
Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data while another is sparse; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques standardly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.
Hein Irma M
Background Currently over 50% of drugs prescribed to children have not been evaluated properly for use in their age group. One key reason why children have been excluded from clinical trials is that they are not considered able to exercise meaningful autonomy over the decision to participate. Dutch law states that competence to consent can be presumed present at the age of 12 and above; however, in pediatric practice children's competence is not that clearly presented and the transition from assent to active consent is gradual. A gold standard for competence assessment in children does not exist. In this article we describe a study protocol on the development of a standardized tool for assessing competence to consent in research in children and adolescents. Methods/design In this study we modified the MacCAT-CR, the best evaluated competence assessment tool for adults, for use in children and adolescents. We will administer the tool prospectively to a cohort of pediatric patients from 6 to 18 years during the selection stages of ongoing clinical trials. The outcomes of the MacCAT-CR interviews will be compared to a reference standard, established by the judgments of clinical investigators, and an expert panel consisting of child psychiatrists, child psychologists and medical ethicists. The reliability, criterion-related validity and reproducibility of the tool will be determined. As MacCAT-CR is a multi-item scale consisting of 13 items, power was justified at 130–190 subjects, providing a minimum of 10–15 observations per item. MacCAT-CR outcomes will be correlated with age, life experience, IQ, ethnicity, socio-economic status and competence judgment of the parent(s). It is anticipated that 160 participants will be recruited over 2 years to complete enrollment. Discussion A validity study on an assessment tool of competence to consent is strongly needed in research practice, particularly in the child and adolescent population. In
Hege, Inga; Kononowicz, Andrzej A; Adler, Martin
Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Keeping our concept grounded in a former qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback, which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map when they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings, a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports the acquisition of clinical reasoning skills. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
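The stage-wise comparison of a learner's concept map against the expert map can be illustrated with a simple overlap score over shared node labels and edges. The precision/recall scoring and the MeSH-style labels below are assumptions for illustration, not the tool's actual metric.

```python
# Toy comparison of a student's illness-script concept map with an
# expert's: maps are sets of node labels plus directed (from, to) edges.
def map_overlap(student, expert):
    """student/expert: dicts with 'nodes' and 'edges' sets."""
    scores = {}
    for part in ("nodes", "edges"):
        s, e = student[part], expert[part]
        tp = len(s & e)                          # elements both maps share
        precision = tp / len(s) if s else 0.0    # how much of the student map is right
        recall = tp / len(e) if e else 0.0       # how much of the expert map is covered
        f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
        scores[part] = {"precision": precision, "recall": recall, "f1": f1}
    return scores

expert = {"nodes": {"Dyspnea", "Heart Failure", "Edema"},
          "edges": {("Dyspnea", "Heart Failure"), ("Edema", "Heart Failure")}}
student = {"nodes": {"Dyspnea", "Heart Failure"},
           "edges": {("Dyspnea", "Heart Failure")}}
print(round(map_overlap(student, expert)["nodes"]["recall"], 2))  # → 0.67
```

A controlled vocabulary such as MeSH is what makes this set comparison meaningful: both maps must label the same concept with the same term before intersection counts anything.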
Many researchers collect online survey data because it is cost-effective and less time-consuming than traditional research methods. This paper describes Twitter chats as a research tool vis-à-vis two other online research methods: providing links to electronic surveys to respondents and use of commercially available survey panels through vendors with readily available respondents. Similar to a face-to-face focus group, Twitter chats provide a synchronous environment for participants to answer a structured series of questions and to respond to both the chat facilitator and each other. This paper also reports representative responses from a Twitter chat that explored financial decisions of young adults. The chat was sponsored by a multi-state group of land-grant university researchers, in cooperation with WiseBread, a personal finance website targeted to millennials, to recruit respondents for a more extensive month-long online survey about the financial decisions of young adults. The Twitter chat responses suggest that student loans were the top concern of participants, and debt and housing rounded out the top three concerns. The internet, both websites and social media, was the most frequently cited source of financial information. The article concludes with a discussion of lessons learned from the Twitter chat experience and suggestions for professional practice.
Adeniyi Akingbade Waidi
A questionnaire consists of questions designed to gather information or data for analysis. It must be adequate, simple, focused, and related to the subject the research seeks to address, so as to test the hypotheses and questions formulated for the study. However, many questionnaires are constructed and administered without following proper guidelines, which undermines their end results. This paper assesses guidelines for constructing questionnaires, their uses, and the extent to which they enhance managers' access to reliable data and information. A descriptive method is employed for the study. Findings revealed that poorly prepared questionnaires do not produce effective results; managers and researchers who use such questionnaires rarely achieve their organisational and research objectives. The need for a good, well-prepared, and adequate questionnaire is exemplified by its role as the primary tool for analytical research. The study recommends that questionnaires be properly prepared for effective research outcomes.
Kopton, Isabella M.; Kenning, Peter
Over the last decade, the application of neuroscience to economic research has gained in importance and the number of neuroeconomic studies has grown extensively. The most common method for these investigations is fMRI. However, fMRI has limitations (particularly concerning situational factors) that should be countered with other methods. This review elaborates on the use of functional near-infrared spectroscopy (fNIRS) as a new and promising tool for investigating economic decision making both in field experiments and outside the laboratory. We describe results of studies investigating the reliability of prototype NIRS devices, as well as experiments using conventional, stationary fNIRS devices to analyze this potential. This review shows that further research using mobile fNIRS for studies on economic decision making outside the laboratory could be a fruitful avenue, helping to develop the potential of a new method for field experiments. PMID:25147517
Gelinas, Luke; Pierce, Robin; Winkler, Sabune; Cohen, I Glenn; Lynch, Holly Fernandez; Bierer, Barbara E
Convertino, V. A.
Lower body negative pressure (LBNP) has been extensively used for decades in aerospace physiological research as a tool to investigate cardiovascular mechanisms that are associated with or underlie performance in aerospace and military environments. In comparison with clinical stand and tilt tests, LBNP represents a relatively safe methodology for inducing highly reproducible hemodynamic responses during exposure to footward fluid shifts similar to those experienced under orthostatic challenge. By maintaining an orthostatic challenge in a supine posture, removal of leg support (muscle pump) and head motion (vestibular stimuli) during LBNP provides the capability to isolate cardiovascular mechanisms that regulate blood pressure. LBNP can be used for physiological measurements, clinical diagnoses and investigational research comparisons of subject populations and alterations in physiological status. The applications of LBNP to the study of blood pressure regulation in spaceflight, ground-based simulations of low gravity, and hemorrhage have provided unique insights and understanding for development of countermeasures based on physiological mechanisms underlying the operational problems.
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila
The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.
Amon, Krestina L; Campbell, Andrew J; Hawke, Catherine; Steinbeck, Katharine
Researchers are increasingly using social media to recruit participants to surveys and clinical studies. However, the evidence of the efficacy and validity of adolescent recruitment through Facebook is yet to be established. To conduct a systematic review of the literature on the use of Facebook to recruit adolescents for health research. Nine electronic databases and reference lists were searched for articles published between 2004 and 2013. Studies were included in the review if: 1) participants were aged ≥ 10 to ≤ 18 years, 2) studies addressed a physical or mental health issue, 3) Facebook was identified as a recruitment tool, 4) recruitment details using Facebook were outlined in the methods section and considered in the discussion, or information was obtained by contacting the authors, 5) results revealed how many participants were recruited using Facebook, and 6) studies addressed how adolescent consent and/or parental consent was obtained. Titles, abstracts, and keywords were scanned and duplicates removed by 2 reviewers. Full text was evaluated for inclusion criteria, and 2 reviewers independently extracted data. The search resulted in 587 publications, of which 25 full-text papers were analyzed. Six studies met all the criteria for inclusion in the review. Three recruitment methods using Facebook were identified: 1) paid Facebook advertising, 2) use of the Facebook search tool, and 3) creation and use of a Facebook Page. Eligible studies described the use of paid Facebook advertising and Facebook as a search tool as methods to successfully recruit adolescent participants. Online and verbal consent was obtained from participants recruited from Facebook. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
Miller, Brian W.; Morisette, Jeffrey T.
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Innovation, and thus the production of knowledge, has become a factor of competitiveness. In this context, quality management can be complemented by knowledge management aimed at improving the production of knowledge in research activities. To this end, after describing the knowledge and information typologies found in engineering activities, a knowledge management system is proposed. The goal is to support: (1) semi-structured information (e.g. reports) through the BASIC-Lab tool functions, which are based on attributing points of view and annotations to documents and document zones, and (2) non-structured information (such as mail and dialogues) through the MICA-Graph approach, which supports the exchange of technical messages concerning the common resolution of research problems within project teams and capitalises relevant knowledge. For both approaches, prototype tools have been developed and evaluated, primarily to feed back manufacturing knowledge in the EADS industrial environment.
Carlos Medicis Morel
BACKGROUND: New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. METHODOLOGY/PRINCIPAL FINDINGS: Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize author- and institution's names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. CONCLUSIONS/SIGNIFICANCE: Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building
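The co-authorship network analysis described above, identifying central hubs and authors at critical network cut-points, can be sketched in pure Python. The publication lists below are illustrative toy data; the study itself used dedicated social network analysis packages, and cut-points are the graph-theoretic articulation points.

```python
# Build a co-authorship graph from author lists and find articulation
# points: authors whose removal would disconnect the collaboration network.
from itertools import combinations
from collections import defaultdict

def coauthor_graph(papers):
    """papers: list of author-name lists; returns adjacency sets."""
    g = defaultdict(set)
    for authors in papers:
        for a, b in combinations(authors, 2):
            g[a].add(b)
            g[b].add(a)
    return g

def cut_points(g):
    """Articulation points via the standard low-link depth-first search."""
    visited, depth, low, cuts = set(), {}, {}, set()
    def dfs(u, parent, d):
        visited.add(u)
        depth[u] = low[u] = d
        children = 0
        for v in g[u]:
            if v not in visited:
                children += 1
                dfs(v, u, d + 1)
                low[u] = min(low[u], low[v])
                if parent is not None and low[v] >= depth[u]:
                    cuts.add(u)      # no back-edge from v's subtree above u
            elif v != parent:
                low[u] = min(low[u], depth[v])
        if parent is None and children > 1:
            cuts.add(u)              # root with multiple DFS subtrees
    for node in list(g):
        if node not in visited:
            dfs(node, None, 0)
    return cuts

papers = [["Morel", "Penna"], ["Morel", "Serruya"], ["Serruya", "Guimaraes"]]
g = coauthor_graph(papers)
print(sorted(cut_points(g)))  # → ['Morel', 'Serruya']
```

In a funding context, such cut-point authors are exactly the ones whose departure would fragment the collaboration network, which is why the analysis flags them for program planners.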
Ogao Patrick J
Full Text Available Abstract Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are found in urban growth and census mapping – all critical aspects of a country's socio-economic development. In this paper, a user test was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Results Three types of animation were used, namely passive, interactive and inference-based animation, the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status: the user has no control over its contents and dynamic variables. Interactive animation provides users with basic media-player controls and navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a think-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred
Bickerstaffe, Adrian; Ranaweera, Thilina; Endersby, Travis; Ellis, Christopher; Maddumarachchi, Sanjaya; Gooden, George E; White, Paul; Moses, Eric K; Hewitt, Alex W; Hopper, John L
The Ark is an open-source web-based tool that allows researchers to manage health and medical research data for humans and animals without specialized database skills or programming expertise. The system provides data management for core research information including demographic, phenotype, biospecimen and pedigree data, in addition to supporting typical investigator requirements such as tracking participant consent and correspondence, whilst also being able to generate custom data exports and reports. The Ark is 'study generic' by design and highly configurable via its web interface, allowing researchers to tailor the system to the specific data management requirements of their study. Source code for The Ark can be obtained freely from https://github.com/The-Ark-Informatics/ark/ . The source code can be modified and redistributed under the terms of the GNU GPL v3 license. Documentation and a pre-configured virtual appliance can be found at http://sphinx.org.au/the-ark/ . Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
Garaizar, Pablo; Reips, Ulf-Dietrich
Social networking has surpassed e-mail and instant messaging as the dominant form of online communication (Meeker, Devitt, & Wu, 2010). Currently, all large social networks are proprietary, making it difficult or impossible for researchers to make changes to such networks for the purpose of study design and to access user-generated data from the networks. To address this issue, the authors have developed and present Social Lab, an Internet-based free and open-source social network software system available from http://www.sociallab.es . Having full availability of navigation and communication data in Social Lab allows researchers to investigate behavior in social media on an individual and group level. Automated artificial users ("bots") are available to the researcher to simulate and stimulate social networking situations. These bots respond dynamically to situations as they unfold. The bots can easily be configured with scripts and can be used to experimentally manipulate social networking situations in Social Lab. Examples for setting up, configuring, and using Social Lab as a tool for research in social media are provided.
James S. Bates
Full Text Available Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information-gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesotho language, spoken in the African nation of Lesotho where the concept for this instrument was conceived. Ketso techniques fall into the participatory action research family of social science research methods (Tippett, Handley, & Ravetz, 2007). Ohio State University Extension professionals have used the Ketso toolkit and its techniques in numerous settings, including professional development, community needs/interests assessments, brainstorming, and data collection. As a toolkit, Ketso uses tactile and colorful leaves, branches, and icons to organize and display participants’ contributions on felt mats. As an array of techniques, Ketso is effective in engaging audiences because it is inclusive and provides each participant a platform for their perspective to be shared.
Jessani, Nasreen; Lewy, Daniela; Ekirapa-Kiracho, Elizabeth; Bennett, Sara
Despite significant investments in health systems research (HSR) capacity development, there is a dearth of information regarding how to assess HSR capacity. An alliance of schools of public health (SPHs) in East and Central Africa developed a tool for the self-assessment of HSR capacity with the aim of producing institutional capacity development plans. Between June and November 2011, seven SPHs across the Democratic Republic of Congo, Ethiopia, Kenya, Rwanda, Tanzania, and Uganda implemented this co-created tool. The objectives of the institutional assessments were to assess existing capacities for HSR and to develop capacity development plans to address prioritized gaps. A mixed-method approach was employed consisting of document analysis, self-assessment questionnaires, in-depth interviews, and institutional dialogues aimed at capturing individual perceptions of institutional leadership, collective HSR skills, knowledge translation, and faculty incentives to engage in HSR. Implementation strategies for the capacity assessment varied across the SPHs. This paper reports findings from semi-structured interviews with focal persons from each SPH, to reflect on the process used at each SPH to execute the institutional assessments as well as the perceived strengths and weaknesses of the assessment process. The assessment tool was robust enough to be utilized in its entirety across all seven SPHs resulting in a thorough HSR capacity assessment and a capacity development plan for each SPH. Successful implementation of the capacity assessment exercises depended on four factors: (i) support from senior leadership and collaborators, (ii) a common understanding of HSR, (iii) adequate human and financial resources for the exercise, and (iv) availability of data. Methods of extracting information from the results of the assessments, however, were tailored to the unique objectives of each SPH. This institutional HSR capacity assessment tool and the process for its utilization
Veller, van M.G.P.; Gerritsma, W.
Wageningen UR Library has developed a tool, based on co-citation analysis, that recommends alternative journals for any journal a researcher looks up. The recommendations can be tuned to reflect the citation preferences of each of the five science groups that comprise
Smartt, H.; Kuhn, M.; Krementz, D.
The U.S. National Nuclear Security Administration (NNSA) Office of Non-proliferation and Verification Research and Development currently funds research on advanced containment technologies to support Continuity of Knowledge (CoK) objectives for verification regimes. One effort in this area is the Advanced Tools for Maintaining Continuity of Knowledge (ATCK) project. Recognizing that CoK assurances must withstand potential threats from sophisticated adversaries, and that containment options must therefore keep pace with technology advances, the NNSA research and development on advanced containment tools is an important investment. The two ATCK efforts underway at present address the technical containment requirements for securing access points (loop seals) and protecting defined volumes. Multiple U.S. national laboratories are supporting this project: Sandia National Laboratories (SNL), Savannah River National Laboratory (SRNL), and Oak Ridge National Laboratory (ORNL). SNL and SRNL are developing the "Ceramic Seal," an active loop seal that integrates multiple advanced security capabilities and improved efficiency housed within a small-volume ceramic body. The development includes an associated handheld reader and interface software. Currently at the prototype stage, the Ceramic Seal will undergo a series of tests to determine operational readiness. It will be field tested in a representative verification trial in 2016. ORNL is developing the Whole Volume Containment Seal (WCS), a flexible conductive fabric capable of enclosing various sizes and shapes of monitored items. The WCS includes a distributed impedance measurement system for imaging the fabric surface area and passive tamper-indicating features such as permanent-staining conductive ink. With the expected technology advances from the Ceramic Seal and WCS, the ATCK project takes significant steps in advancing containment technologies to help maintain CoK for various verification
Galuvao, Akata Sisigafu'aapulematumua
This article introduces Tofa'a'anolasi, a novel Samoan research framework created by drawing on the work of other Samoan and Pacific education researchers, in combination with adapting the 'Foucauldian tool box' to use for research carried out from a Samoan perspective. The article starts with an account and explanation of the process of…
Laursen, S. L.; Hunter, A.; Weston, T.; Thiry, H.
Evidence-based thinking is essential both to science and to the development of effective educational programs. Thus assessment of student learning—gathering evidence about the nature and depth of students’ learning gains, and about how they arise—is a centerpiece of any effective undergraduate research (UR) program. Assessment data can be used to monitor progress, to diagnose problems, to strengthen program designs, and to report both good outcomes and strategies to improve them to institutional and financial stakeholders in UR programs. While the positive impact of UR on students’ educational, personal and professional development has long been a matter of faith, only recently have researchers and evaluators developed an empirical basis by which to identify and explain these outcomes. Based on this growing body of evidence, URSSA, the Undergraduate Research Student Self-Assessment, is a survey tool that departments and programs can use to assess student outcomes of UR. URSSA focuses on what students learn from their UR experience, rather than whether they liked it. Both multiple-choice and open-ended items focus on students’ gains from UR, including: (1) skills such as lab work and communication; (2) conceptual knowledge and linkages among ideas in their field and with other fields; (3) deepened understanding of the intellectual and practical work of science; (4) growth in confidence and adoption of the identity of scientist; (5) preparation for a career or graduate school in science; and (6) greater clarity in understanding what career or educational path they might wish to pursue. Other items probe students’ participation in important activities that have been shown to lead to these gains; and a set of optional items can be included to probe specific program features that may supplement UR (e.g. field trips, career seminars, housing arrangements). The poster will describe URSSA's content, development, validation, and use. For more information about
De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava
The world has witnessed growing complexity in the disease scenario, influenced by drastic changes in the host–pathogen–environment triadic relation. Pharmaceutical R&Ds are in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalency studies. Apart from the cost, time and ethical issues of animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards and biodiversity. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, further complicating the total scenario. Thus pharma R&Ds face a challenge to develop bio-waiver protocols. Lead optimization, drug candidate selection with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools like GastroPlus™, PK-Sim® and Simcyp find application for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development toolbox. Physiologically based pharmacokinetic (PBPK) modelling provides a state-of-the-art alternative to animal experimentation. PBPK models can successfully bypass bio-equivalency studies, predict bioavailability and drug interactions, and, when coupled with in vitro–in vivo correlation, can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly, bio-waiver predictive tool in drug development. Copyright© Bentham Science Publishers.
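Full PBPK models chain many tissue compartments, but the flavour of simulation-based PK prediction used by the tools named above can be shown with a far simpler one-compartment oral-absorption (Bateman) model; all parameter values are invented for illustration:

```python
# One-compartment PK with first-order absorption (Bateman equation):
# a toy stand-in for whole-body PBPK simulators. Parameters are invented.
import numpy as np

def concentration(t, dose=500.0, F=0.8, ka=1.2, ke=0.2, V=40.0):
    """Plasma concentration (mg/L) at time t (h) after an oral dose (mg).

    F  = bioavailable fraction, ka/ke = absorption/elimination rate (1/h),
    V  = volume of distribution (L).
    """
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0, 24, 241)          # 0-24 h in 0.1 h steps
c = concentration(t)
tmax = t[np.argmax(c)]               # time of peak concentration
cmax = c.max()
print(f"Tmax ~ {tmax:.1f} h, Cmax ~ {cmax:.2f} mg/L")
```

The analytic peak time is ln(ka/ke)/(ka − ke) ≈ 1.79 h for these parameters, which the gridded simulation reproduces; real PBPK software predicts such profiles per organ and per population.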
Chen, Chunpeng James; Zhang, Zhiwu
The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat.
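iPat wraps existing packages rather than implementing its own models, but the core of rrBLUP-style genomic selection is ridge regression of phenotypes on marker genotypes, which can be sketched on simulated data (all data and parameters below are invented):

```python
# Genomic prediction sketch in the rrBLUP spirit: ridge regression of a
# simulated phenotype on simulated SNP genotypes.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 500                                     # individuals, markers
X = rng.integers(0, 3, size=(n, m)).astype(float)   # genotypes coded 0/1/2
beta = np.zeros(m)
beta[:10] = rng.normal(0, 1, 10)                    # ten causal markers
y = X @ beta + rng.normal(0, 1, n)                  # phenotype = genetics + noise

train, test = slice(0, 150), slice(150, None)
lam = 10.0                                          # ridge penalty (shrinks effects)
A = X[train].T @ X[train] + lam * np.eye(m)
b_hat = np.linalg.solve(A, X[train].T @ y[train])   # estimated marker effects

pred = X[test] @ b_hat
r = np.corrcoef(pred, y[test])[0, 1]                # predictive ability
print(f"prediction accuracy r = {r:.2f}")
```

Shrinking all marker effects toward zero is what lets the model handle far more markers than individuals, the standard situation in genomic selection.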
Morguí, Josep-Anton; Font, Anna; Cañas, Lidia; Vázquez-García, Eusebi; Gini, Andrea; Corominas, Ariadna; Àgueda, Alba; Lobo, Agustin; Ferraz, Carlos; Nofuentes, Manel; Ulldemolins, Delmir; Roca, Alex; Kamnang, Armand; Grossi, Claudia; Curcoll, Roger; Batet, Oscar; Borràs, Silvia; Occhipinti, Paola; Rodó, Xavier
An educational tool was designed with the aim of making the research done on greenhouse gases (GHGs) in the ClimaDat Spanish network of atmospheric observation stations (www.climadat.es) more comprehensible. This tool is called the Air Enquirer, and it consists of a multi-sensor box. It is envisaged to build more than two hundred boxes and distribute them to Spanish high schools through the education department (www.educaixa.com) of the "Obra Social 'La Caixa'", which funds this research. The starting point for the development of the Air Enquirers was the experience at IC3 (www.ic3.cat) in the CarboSchools+ FP7 project (www.carboschools.cat, www.carboschools.eu). The Air Enquirer's multi-sensor box is based on the Arduino architecture and contains sensors for CO2, temperature, relative humidity, pressure, and both infrared and visible luminance. The Air Enquirer is designed to take continuous measurements. Every Air Enquirer ensemble of measurements is used to convert values to standard units (water content in ppmv, and CO2 in ppmv_dry). These values are referred to a calibration made with Cavity Ring-Down Spectrometry (Picarro®) under different temperature, pressure, humidity and CO2 concentrations. Multiple sets of Air Enquirers are intercalibrated for their use in parallel during the experiments. The different experiments proposed to the students will be outdoor (observational) or indoor (experimental, in the lab), focusing on understanding the biogeochemistry of GHGs in ecosystems (mainly CO2), the exchange (flux) of gases, organic matter production, respiration and decomposition processes, the influence of anthropogenic activities on gas (and particle) exchanges, and their interaction with the structure and composition of the atmosphere (temperature, water content, cooling and warming processes, radiative forcing, vertical gradients and horizontal patterns). In order to ensure Air Enquirers a high-profile research performance the experimental designs
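The conversion to ppmv_dry mentioned above is, at its core, a water-vapour dilution correction; a minimal sketch follows (the example values are invented, and the real units are calibrated against the Picarro reference):

```python
# Water-vapour dilution correction: convert a measured (wet) CO2 mole
# fraction to its dry-air equivalent. Example values are invented.
def co2_dry(co2_wet_ppmv, h2o_ppmv):
    """Remove the diluting effect of water vapour on a CO2 reading."""
    return co2_wet_ppmv / (1.0 - h2o_ppmv / 1e6)

reading = co2_dry(410.0, 20000.0)   # 2% water vapour by volume
print(round(reading, 1))            # -> 418.4
```

At 2% water vapour the dry-air value is about 8 ppmv higher than the raw reading, which is why the correction matters for classroom flux experiments.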
Blackledge, Matthew D; Collins, David J; Koh, Dow-Mu; Leach, Martin O
We present pyOsiriX, a plugin built for the already popular DICOM viewer OsiriX that provides users the ability to extend the functionality of OsiriX through simple Python scripts. This approach allows users to integrate the many cutting-edge scientific/image-processing libraries created for Python into a powerful DICOM visualisation package that is intuitive to use and already familiar to many clinical researchers. Using pyOsiriX we hope to bridge the apparent gap between basic imaging scientists and clinical practice in a research setting and thus accelerate the development of advanced clinical image processing. We provide arguments for the use of Python as a robust scripting language for incorporation into larger software solutions, outline the structure of pyOsiriX and how it may be used to extend the functionality of OsiriX, and provide three case studies that exemplify its utility. For our first case study we use pyOsiriX to provide a tool for smooth histogram display of voxel values within a user-defined region of interest (ROI) in OsiriX. We used a kernel density estimation (KDE) method available in Python through the scikit-learn library; the total number of lines of Python code required to generate this tool was 22. Our second example presents a scheme for segmentation of the skeleton from CT datasets. We have demonstrated that good segmentation can be achieved for two example CT studies by using a combination of Python libraries including scikit-learn, scikit-image, SimpleITK and matplotlib. Furthermore, this segmentation method was incorporated into an automatic analysis of quantitative PET-CT in a patient with bone metastases from primary prostate cancer. This enabled repeatable statistical evaluation of PET uptake values for each lesion, before and after treatment, providing estimates of maximum and median standardised uptake values (SUVmax and SUVmed, respectively). Following treatment we observed a reduction in lesion volume, SUVmax and SUVmed for
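The first case study's KDE-smoothed histogram can be approximated standalone with scikit-learn (this is not the pyOsiriX code itself; the voxel intensities here are simulated rather than read from an OsiriX ROI):

```python
# Smooth "histogram" of voxel values via kernel density estimation,
# in the spirit of the pyOsiriX ROI tool. Voxel values are simulated.
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(42)
voxels = np.concatenate([rng.normal(40, 5, 500),     # soft-tissue peak (toy units)
                         rng.normal(300, 40, 200)])  # bone peak

kde = KernelDensity(kernel="gaussian", bandwidth=8.0)
kde.fit(voxels[:, None])                             # sklearn expects 2-D input

grid = np.linspace(voxels.min(), voxels.max(), 400)[:, None]
density = np.exp(kde.score_samples(grid))            # score_samples returns log-density
peak = grid[np.argmax(density), 0]
print(f"dominant intensity ~ {peak:.0f}")
```

The resulting density curve is the smooth alternative to a binned histogram: no bin-width artefacts, at the cost of choosing a kernel bandwidth.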
Grigoriev, S. N.; Bobrovskij, N. M.; Melnikov, P. A.; Bobrovskij, I. N.
The modern vector of development in machining technologies is aimed at the transition to environmentally safe, “green” technologies. The concept of “green technology” covers a body of knowledge intended for practical use (“technology”). One way to improve the quality of production is the use of surface plastic deformation (SPD) processing methods. The advantage of SPD is its capability to combine the effects of finishing and strengthening treatment; SPD processing can replace fine turning, grinding or polishing operations. SPD is a forceful contact impact of an indentor on the workpiece’s surface under conditions of their relative motion. It is difficult to implement the core SPD technologies (burnishing, roller burnishing, etc.) while maintaining their core technological advantages without the use of lubricating and cooling technology (metalworking fluids, MWF). The “green” SPD technology developed by the authors for dry processing does not have such shortcomings. When processing with SPD without MWF, the requirements for tool durability are most significant, especially under mass-production conditions. It is important to determine the durability period of the tool at the design stage of the technological process in order to prevent wastage. This paper presents the results of durability research on natural and synthetic diamonds (polycrystalline diamond, ASPK) as well as the precision of polycrystalline superabrasive tools made of dense boron nitride (DBN) during SPD processing without application of MWF.
Satpathy, R; Konkimalla, V B; Ratha, J
Microbial dehalogenation is a biochemical process in which halogenated substances are enzymatically catalyzed into their non-halogenated form. Microorganisms have a wide range of organohalogen-degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds are pollutants that need to be remediated; therefore, current approaches explore the potential of microbes at a molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Thus, with the discovery of a microorganism, one can predict genes/proteins, perform sequence analysis and structural modelling, and carry out metabolic pathway analysis, biodegradation studies and so on. This review highlights bioinformatics methods, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modeling methods comprising gene finding, protein modelling, Quantitative Structure–Biodegradability Relationship (QSBR) studies and reconstruction of metabolic pathways employed in the dehalogenation research area.
Saadet Kuru Cetin
Full Text Available In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools, using alternative observation tools within the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years, and the class observations were made with six volunteer teachers in primary and secondary schools in provincial and district centers. In the classroom observations, the teacher's verbal-flow scheme, the teacher's movement scheme, and students' on-task and off-task behaviors were analyzed. Observations were made during two classes with the teachers' permission. After the first observation, an information meeting was held, and then the second observation was made. Following the observations, interviews were held with the teachers, in which the information from the class observations was shared and the teachers were asked for their opinions about the research. It was found that alternative observations, in general, have a positive effect on the professional development of teachers. It is concluded that this type of observation approach positively affects teachers' in-class activities, helps with classroom management and teaching arrangements, and helps address unwanted student behaviors.
Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T
The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with three simple tools: paper, pen, and a calculator. A computer program can quickly calculate the t-test for large samples. The ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
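As a concrete illustration of the "paper, pen, and calculator" route versus software, the snippet below computes the classic pooled-variance two-sample t statistic by hand and checks it against SciPy (the two samples are invented):

```python
# Two-sample (Student's) t-test: by hand and with SciPy. Data are invented.
import math

from scipy import stats

a = [5.1, 4.9, 6.0, 5.5, 5.8, 5.2]
b = [4.2, 4.8, 4.5, 4.0, 4.6, 4.4]

# By hand: pooled-variance t statistic
na, nb = len(a), len(b)
ma, mb = sum(a) / na, sum(b) / nb
va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
t_hand = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# With SciPy (equal_var=True matches the classic Student's test)
t_scipy, p = stats.ttest_ind(a, b, equal_var=True)
print(f"t = {t_hand:.3f} (hand) vs {t_scipy:.3f} (scipy), p = {p:.4f}")
```

The two routes agree exactly; the ease of the software route is precisely why the article warns that the test is easy to misuse, e.g. by ignoring its equal-variance and independence assumptions.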
Vecchiato, Giovanni; Astolfi, Laura; De Vico Fallani, Fabrizio; Toppi, Jlenia; Aloise, Fabio; Bez, Francesco; Wei, Daming; Kong, Wanzeng; Dai, Jounging; Cincotti, Febo; Mattia, Donatella; Babiloni, Fabio
Here we present an overview of some published papers of interest for marketing research employing electroencephalogram (EEG) and magnetoencephalogram (MEG) methods. The interest in these methodologies lies in their high temporal resolution, as opposed to functional Magnetic Resonance Imaging (fMRI), which is also widely used in marketing research. In addition, EEG and MEG technologies have greatly improved their spatial resolution in recent decades with the introduction of advanced signal processing methodologies. By presenting data gathered through MEG and high-resolution EEG, we show what kind of information can be gathered with these methodologies while people are watching marketing-relevant stimuli. Such information relates to the memorization of, and pleasantness associated with, those stimuli. We noted that temporal and frequency patterns of brain signals can provide descriptors conveying information about the cognitive and emotional processes of subjects observing commercial advertisements. This information may be unobtainable through common tools used in standard marketing research. We also show an example of how an EEG methodology can be used to analyze cultural differences in the reception of video commercials for carbonated beverages in Western and Eastern countries.
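The frequency patterns mentioned above are typically summarized as band power; a toy sketch with SciPy's Welch estimator on a simulated EEG trace follows (a 10 Hz "alpha" sinusoid plus noise; real EEG/MEG pipelines are far more elaborate):

```python
# Estimate alpha-band (8-12 Hz) power from a simulated EEG channel
# using Welch's PSD estimator. The signal is synthetic.
import numpy as np
from scipy.signal import welch

fs = 256.0                                  # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)                # 10 s of signal
rng = np.random.default_rng(1)
eeg = 10 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 2, t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=512)
band = (freqs >= 8) & (freqs <= 12)                    # alpha band mask
alpha_power = psd[band].sum() * (freqs[1] - freqs[0])  # integrate PSD over band
peak_freq = freqs[np.argmax(psd)]
print(f"alpha-band power = {alpha_power:.1f}, spectral peak at {peak_freq:.1f} Hz")
```

Band powers like this, tracked over time and electrodes, are the kind of frequency-domain descriptor the abstract associates with memorization and pleasantness.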
Here we provide an update on construction of the five NEON Mobile Deployment Platforms (MDPs) as well as a description of the infrastructure and sensors available to researchers in the near future. Additionally, we include information (i.e. timelines and procedures) on requesting MDPs for PI-led projects. The MDPs will provide the means to observe stochastic or spatially important events, gradients, or quantities that cannot be reliably observed using fixed-location sampling (e.g. fires and floods). Due to the transient temporal and spatial nature of such events, the MDPs are designed to accommodate rapid deployment for periods of up to one year. Broadly, the MDPs comprise infrastructure and instrumentation capable of functioning individually or in conjunction with one another to support observations of ecological change, as well as education, training and outreach. More specifically, the MDPs include the capability to make tower-based measurements of ecosystem exchange, radiation, and precipitation in conjunction with baseline soils data such as CO2 flux and soil temperature and moisture. An aquatics module is also available to facilitate research integrating terrestrial and aquatic processes. Ultimately, the NEON MDPs provide a tool for linking PI-led research to the continental-scale data sets collected by NEON.
Supreet Kaur Gill
Clinical research strives to promote the health and wellbeing of the population. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process in clinical research faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture and the electronic case report form (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After the drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.
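The randomization step mentioned above is commonly handled with permuted-block allocation. A minimal sketch in Python follows; the function name, block size, and arm labels are illustrative assumptions, not the interface of any specific clinical package:

```python
import random

def block_randomize(n_subjects, arms=("drug", "placebo"), block_size=4, seed=42):
    """Permuted-block randomization: within each block every arm appears
    equally often, so group sizes stay balanced throughout enrollment."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the arm count"
    rng = random.Random(seed)  # fixed seed -> reproducible allocation list
    allocation = []
    while len(allocation) < n_subjects:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)  # randomize order within the block
        allocation.extend(block)
    return allocation[:n_subjects]

schedule = block_randomize(12)
print(schedule.count("drug"), schedule.count("placebo"))  # 6 6
```

Because 12 subjects fill exactly three blocks of four, the two arms end up perfectly balanced, which is the property block randomization is designed to guarantee.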
Goldfarb, L.; Yang, A.
Leah Goldfarb, Paul Cutler, Andrew Yang*, Mustapha Mokrane, Jacinta Legg and Deliang Chen The scientific community has been engaged in developing an international strategy on Earth system research. The initial consultation in this “visioning” process focused on gathering suggestions for Earth system research priorities that are interdisciplinary and address the most pressing societal issues. This was implemented through a website that utilized Web 2.0 capabilities. The website (http://www.icsu-visioning.org/) collected input from 15 July to 1 September 2009. This consultation was the first in which the international scientific community was asked to help shape the future of a research theme. The site attracted over 7000 visitors from 133 countries, more than 1000 of whom registered and took advantage of the site’s functionality to contribute research questions (~300 questions), comment on posts, and/or vote on questions. To facilitate analysis of results, the site captured a small set of voluntary information about each contributor and their contribution. A group of ~50 international experts was invited to analyze the inputs at a “Visioning Earth System Research” meeting held in September 2009. The outcome of this meeting—a prioritized list of research questions to be investigated over the next decade—was then posted on the visioning website for additional comment from the community through an online survey tool. In general, many lessons were learned in the development and implementation of this website, both in terms of the opportunities offered by Web 2.0 capabilities and the application of these capabilities. It is hoped that this process may serve as a model for other scientific communities. The International Council for Science (ICSU) in cooperation with the International Social Science Council (ISSC) is responsible for organizing this Earth system visioning process.
Two phase III clinical trials of new therapies for patients with metastatic melanoma presented in June at the 2011 ASCO conference confirmed that vemurafenib and ipilimumab (Yervoy™) offer valuable new options for the disease.
Knowledge extraction from detected document images is a complex problem in the field of information technology. The problem becomes more intricate given that only a negligible percentage of the detected document images are valuable. In this paper, a segmentation-based classification algorithm is used to analyse the document image. In this algorithm, regions of the image are detected using a two-stage segmentation approach and then classified into document and non-document (pure) regions in a hierarchical classification. A novel definition of 'valuable' is proposed to classify document images into valuable or non-valuable categories. The proposed algorithm is evaluated on a database consisting of document and non-document images collected from the Internet. Experimental results show the efficiency of the proposed algorithm for semantic document image classification. The proposed algorithm achieves an accuracy of 98.8% on the valuable versus non-valuable document image classification problem.
Binello, E.; Mitchell, R.N.; Harling, O.K.
An immunologic tool based on manipulation of the boron neutron capture reaction was previously proposed in the context of heart transplantation research to examine the temporal relationship between parenchymal rejection (representing immune cell infiltration) and transplantation-associated arteriosclerosis (characterized by progressive vascular occlusion). Critical to the development of this method is the uptake of boron by specific cells of the immune system, namely T cells, without adverse effects on cell function, which may be assessed by the ability of boron-loaded cells to produce IFNγ, a protein with substantial impact on rejection. This work presents the evaluation of two carboranyl thymidine analogs. Advantages of this type of boron compound are a reduced risk of leakage and effective dose delivery based on their incorporation into cellular nuclear material. Results indicate that uptake of these boronated nucleosides is high, with no adverse effects on cell function, thereby warranting the continued development of this technique, which has potentially wide applicability in immunological models.
The report includes the following chapters: (1) Introduction: ozone in the atmosphere, anthropogenic influence on the ozone layer, polar stratospheric ozone loss; (2) Tracer-tracer relations in the stratosphere: tracer-tracer relations as a tool in atmospheric research; impact of cosmic-ray-induced heterogeneous chemistry on polar ozone; (3) Quantifying polar ozone loss from ozone-tracer relations: principles of tracer-tracer correlation techniques; reference ozone-tracer relations in the early polar vortex; impact of mixing on ozone-tracer relations in the polar vortex; impact of mesospheric intrusions on ozone-tracer relations in the stratospheric polar vortex; calculation of chemical ozone loss in the Arctic in March 2003 based on ILAS-II measurements; (4) Epilogue.
Nelson, Douglas G; Byus, Kent
Contemporary public health requires the support and participation of its constituency. This study assesses the capacity of consumption value theory to identify the basis of this support. A telephone survey design used simple random sampling of adult residents of Cherokee County, Oklahoma. Factor analysis and stepwise discriminant analysis were used to identify and classify personal and societal level support variables. Most residents base societal level support on epistemic values. Direct services clientele base their support on positive emotional values derived from personal contact and attractive programs. Residents are curious about public health and want to know more about the health department. Whereas marketing the effectiveness of public health programs would yield relatively little support, marketing health promotion activities may attract public opposition. This formative research tool suggests a marketing strategy for public health practitioners.
Single-molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon the introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single-molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure, and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single-molecule experiments. SMART also provides a standardized format to allow dissemination of single-molecule data and transparency in the analysis of reported data.
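The hidden Markov modeling at the core of such tools assigns each point of a noisy single-molecule trace to a hidden state. As a rough illustration of the idea (not SMART's actual algorithm or API), here is a two-state Viterbi decoder over a discretized signal trace; the state names, probabilities, and trace are invented for the example:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for a discrete-emission HMM
    (computed in log space to avoid numerical underflow)."""
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # best predecessor for state s given observation o
            prob, prev = max(
                (V[-2][p] + math.log(trans_p[p][s]) + math.log(emit_p[s][o]), p)
                for p in states
            )
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Toy single-molecule trace: "lo"/"hi" signal levels from two hidden states.
states = ("closed", "open")
start = {"closed": 0.5, "open": 0.5}
trans = {"closed": {"closed": 0.8, "open": 0.2},
         "open":   {"closed": 0.2, "open": 0.8}}
emit = {"closed": {"lo": 0.9, "hi": 0.1},
        "open":   {"lo": 0.1, "hi": 0.9}}
trace = ["lo", "lo", "hi", "hi", "hi", "lo"]
print(viterbi(trace, states, start, trans, emit))
```

The sticky transition matrix makes the decoder prefer runs of the same state, which is how dwell times in single-molecule traces are recovered.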
Human pluripotent stem cells (hPSCs), namely embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs), with their ability of indefinite self-renewal and capability to differentiate into cell types derived from all three germ layers, represent a powerful research tool in developmental biology, for drug screening, disease modelling, and potentially cell replacement therapy. Efficient differentiation protocols that result in the cell type of interest are needed for maximal exploitation of these cells. In the present work, we focus on protocols for the differentiation of hPSCs into functional cardiomyocytes in vitro, as well as achievements in heart disease modelling and drug testing on patient-specific iPSC-derived cardiomyocytes (iPSC-CMs).
Verma, A.K.; Varde, P.V.; Sankar, S.; Prakash, P.
A prototype knowledge-based (KB) Operator Adviser (OPAD) system has been developed for a 100 MW(th) heavy-water-moderated and -cooled, natural-uranium-fuelled research reactor. The objective of this system is to improve the reliability of operator action, and hence reactor safety, during crises as well as normal operation. The jobs performed by this system include alarm analysis, transient identification, reactor safety status monitoring, qualitative fault diagnosis and procedure generation for reactor operation. In order to address safety objectives at various stages of the OPAD system's development, the knowledge has been structured using PSA tools and information in a shell environment. To demonstrate the feasibility of combining the KB approach with PSA for an operator adviser system, salient features of some of the important modules (viz. FUELEX, LOOPEX and LOCAEX) are discussed. It has been found that this system can serve as an efficient operator support system.
Gholami, Jaleh; Majdzadeh, Reza; Nedjat, Saharnaz; Nedjat, Sima; Maleki, Katayoun; Ashoorkhani, Mahnaz; Yazdizadeh, Bahareh
The knowledge translation self-assessment tool for research institutes (SATORI) was designed to assess the status of knowledge translation in research institutes. The objective was to identify the weaknesses and strengths of knowledge translation in research centres and faculties associated with Tehran University of Medical Sciences (TUMS). The tool, consisting of 50 statements in four main domains, was used in 20 TUMS-affiliated research centres and departments after its reliability was established. It was completed in a group discussion by the members of the research council, researchers and research users' representatives from each centre and/or department. The mean scores obtained in the four domains of 'The question of research', 'Knowledge production', 'Knowledge transfer' and 'Promoting the use of evidence' were 2.26, 2.92, 2 and 1.89 (out of 5), respectively. Nine out of 12 interventional priorities with the lowest quartile score were related to knowledge transfer resources and strategies, whereas eight of them were in the highest quartile and related to 'The question of research' and 'Knowledge production'. The self-assessment tool identifies the gaps in capacity and infrastructure for knowledge translation support within research organizations. Assessment of research institutes using SATORI pointed out that strengthening knowledge translation through provision of financial support for knowledge translation activities, creating supportive and facilitating infrastructures, and facilitating interactions between researchers and target audiences to exchange questions and research findings are among the priorities of research centres and/or departments.
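The scoring described above (mean score per domain, with the lowest-quartile scores flagged as intervention priorities) can be sketched as follows. The domain names come from the abstract, but the item scores and the domain-level quartile cut are illustrative simplifications, not the study's data or exact procedure:

```python
# Hypothetical item scores (1-5 Likert scale) grouped by SATORI's four domains.
scores = {
    "The question of research": [2, 3, 2, 2],
    "Knowledge production": [3, 3, 3, 2.5],
    "Knowledge transfer": [2, 2, 2, 2],
    "Promoting the use of evidence": [2, 2, 1.5, 2],
}

def domain_means(scores):
    """Mean score for each domain."""
    return {d: sum(v) / len(v) for d, v in scores.items()}

means = domain_means(scores)
# Domains whose mean falls in the lowest quartile are intervention priorities.
threshold = sorted(means.values())[len(means) // 4]
priorities = [d for d, m in means.items() if m <= threshold]
print(priorities)
```

With these illustrative numbers the flagged domains are the transfer-related ones, mirroring the abstract's finding that most low-scoring priorities concerned knowledge transfer.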
Akl, Elie A; Fadlallah, Racha; Ghandour, Lilian; Kdouh, Ola; Langlois, Etienne; Lavis, John N; Schünemann, Holger; El-Jardali, Fadi
Groups or institutions funding or conducting systematic reviews in health policy and systems research (HPSR) should prioritise topics according to the needs of policymakers and stakeholders. The aim of this study was to develop and validate a tool to prioritise questions for systematic reviews in HPSR. We developed the tool following a four-step approach consisting of (1) definition of the purpose and scope of the tool, (2) item generation and reduction, (3) testing for content and face validity, and (4) pilot testing of the tool. The research team involved international experts in HPSR, systematic review methodology and tool development, led by the Center for Systematic Reviews on Health Policy and Systems Research (SPARK). We followed an inclusive approach in determining the final selection of items to allow customisation to the user's needs. The purpose of the SPARK tool was to prioritise questions in HPSR in order to address them in systematic reviews. In the item generation and reduction phase, an extensive literature search yielded 40 relevant articles, which were reviewed by the research team to create a preliminary list of 19 candidate items for inclusion in the tool. As part of testing for content and face validity, input from international experts led to the refining, changing, merging and addition of new items, and to organisation of the tool into two modules. Following pilot testing, we finalised the tool, with 22 items organised in two modules - the first module including 13 items to be rated by policymakers and stakeholders, and the second including 9 items to be rated by systematic review teams. Users can customise the tool to their needs, by omitting items that may not be applicable to their settings. We also developed a user manual that provides guidance on how to use the SPARK tool, along with signaling questions. We have developed and conducted initial validation of the SPARK tool to prioritise questions for systematic reviews in HPSR, along with
Torres, Samantha; de la Riva, Erika E; Tom, Laura S; Clayman, Marla L; Taylor, Chirisse; Dong, Xinqi; Simon, Melissa A
Despite increasing need to boost the recruitment of underrepresented populations into cancer trials and biobanking research, few tools exist for facilitating dialogue between researchers and potential research participants during the recruitment process. In this paper, we describe the initial processes of a user-centered design cycle to develop a standardized research communication tool prototype for enhancing research literacy among individuals from underrepresented populations considering enrollment in cancer research and biobanking studies. We present qualitative feedback and recommendations on the prototype's design and content from potential end users: five clinical trial recruiters and ten potential research participants recruited from an academic medical center. Participants were given the prototype (a set of laminated cards) and were asked to provide feedback about the tool's content, design elements, and word choices during semi-structured, in-person interviews. Results suggest that the prototype was well received by recruiters and patients alike. They favored the simplicity, lay language, and layout of the cards. They also noted areas for improvement, leading to card refinements that included the following: addressing additional topic areas, clarifying research processes, increasing the number of diverse images, and using alternative word choices. Our process for refining user interfaces and iterating content in early phases of design may inform future efforts to develop tools for use in clinical research or biobanking studies to increase research literacy.
Riggs, E. M.
beginners. Thus researchers must embrace the uncontrolled nature of the setting, the qualitative nature of the data collected, and the researcher's role in interpreting geologically appropriate actions as evidence of successful problem solving and investigation. Working to understand the role of diversity and culture in the geosciences also involves a wide array of theory, from affective issues through culturally and linguistically-influenced cognition, through gender, self-efficacy, and many other areas of inquiry. Research in understanding spatial skills draws heavily on techniques from cognition research but also must involve the field-specific knowledge of geoscientists to infuse these techniques with exemplars, a catalog of meaningful actions by students, and an understanding of how to recognize success. These examples illustrate briefly the wide array of tools from other fields that is being brought to bear to advance rigorous geoscience education research. We will illustrate a few of these and the insights we have gained, and the power of theory and method from other fields to enlighten us as we attempt to educate a broader array of earth scientists.
Fumagalli, E.; Verdelli, G.
ISMES (Experimental Institute for Models and Structures) is carrying out a series of tests on physical models as part of a research programme sponsored by DSR (Studies and Research Direction) of ENEL (Italian State Electricity Board) on behalf of CPN (Nuclear Design and Construction Centre) of ENEL, with the aim of investigating a thin-walled PCPV for a BWR. The physical model, together with the mathematical model and the rheological model of the materials, is intended as a meaningful design tool. The mathematical model covers the overall structural design phase (geometries) and the linear behaviour, whereas the physical model, besides global information to be compared with the results of the mathematical model, supplies data on the non-linear behaviour up to failure and on local conditions (penetration area, etc.). The aim of the first phase of this research programme is to compare calculation and experiment with regard to the thicknesses of the wall and the bottom slab, whereas the second phase deals with the behaviour of the removable lid and its connection with the main structure. To this end, a model in scale 1:10 has been designed which symmetrically reproduces, with respect to the equator, the bottom part of the structure. In the bottom slab the penetrations of the prototype design are reproduced, whereas the upper slab is plain. This paper describes the model and illustrates the main results, underlining the different behaviour of the upper and bottom slabs up to collapse.
Digital tool making offers many challenges, involving much trial and error. Developing machine learning and assistance in automated and semi-automated Internet resource discovery, metadata generation, and rich-text identification provides opportunities for great discovery, innovation, and the potential for transformation of the library community. The areas of computer science involved, as applied to the library applications addressed, are among that discipline’s leading edges. Making applied research practical and applicable, through placement within library/collection-management systems and services, involves equal parts computer scientist, research librarian, and legacy-systems archaeologist. Still, the early harvest is there for us now, with a large harvest pending. Data Fountains and iVia, the projects discussed, demonstrate this. Clearly, then, the present would be a good time for the library community to more proactively and significantly engage with this technology and research, to better plan for its impacts, to more proactively take up the challenges involved in its exploration, and to better and more comprehensively guide effort in this new territory. The alternative to doing this is that others will develop this territory for us, do it not as well, and sell it back to us at a premium. Awareness of this technology and its current capabilities, promises, limitations, and probable major impacts needs to be generalized throughout the library management, metadata, and systems communities. This article charts recent work, promising avenues for new research and development, and issues the library community needs to understand.
Holmes, Bruce J.; Sawhill, Bruce K.; Herriot, James; Seehart, Ken; Zellweger, Dres; Shay, Rick
The objective of this research by NextGen AeroSciences, LLC is twofold: 1) to deliver an initial "toolbox" of algorithms, agent-based structures, and method descriptions for introducing trajectory agency as a methodology for simulating and analyzing airspace states, including bulk properties of large numbers of heterogeneous 4D aircraft trajectories in a test airspace -- while maintaining or increasing system safety; and 2) to use these tools in a test airspace to identify possible phase transition structure to predict when an airspace will approach the limits of its capacity. These 4D trajectories continuously replan their paths in the presence of noise and uncertainty while optimizing performance measures and performing conflict detection and resolution. In this approach, trajectories are represented as extended objects endowed with pseudopotential, maintaining time and fuel-efficient paths by bending just enough to accommodate separation while remaining inside of performance envelopes. This trajectory-centric approach differs from previous aircraft-centric distributed approaches to deconfliction. The results of this project are the following: 1) we delivered a toolbox of algorithms, agent-based structures and method descriptions as pseudocode; and 2) we corroborated the existence of phase transition structure in simulation with the addition of "early warning" detected prior to "full" airspace. This research suggests that airspace "fullness" can be anticipated and remedied before the airspace becomes unsafe.
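The pseudopotential idea sketched above (trajectories bending just enough to restore separation) can be illustrated with a toy 2D example. The separation distance, gain, and iterative update rule below are illustrative assumptions, not the project's delivered algorithms:

```python
import math

SEP = 5.0  # required separation between waypoints (arbitrary units)

def repulsion(p, q, strength=1.0):
    """Force on waypoint p exerted by waypoint q; zero outside SEP.
    The pseudopotential activates only inside the separation minimum."""
    dx, dy = p[0] - q[0], p[1] - q[1]
    d = math.hypot(dx, dy)
    if d >= SEP or d == 0:
        return (0.0, 0.0)
    mag = strength * (SEP - d) / d  # grows as the separation shrinks
    return (mag * dx, mag * dy)

def deconflict(a, b, steps=50, dt=0.1):
    """Iteratively nudge two conflicting waypoints apart, bending each
    path just enough to approach the required separation."""
    for _ in range(steps):
        fa = repulsion(a, b)
        a = (a[0] + dt * fa[0], a[1] + dt * fa[1])
        b = (b[0] - dt * fa[0], b[1] - dt * fa[1])
    return a, b

a, b = deconflict((0.0, 0.0), (1.0, 0.5))
print(math.hypot(a[0] - b[0], a[1] - b[1]))  # approaches the 5.0 minimum
```

Because the force vanishes as the separation reaches SEP, the points drift apart geometrically toward the minimum rather than overshooting, which is the "bending just enough" behavior the trajectory-centric approach relies on.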
Wilson, G.E.; Boyack, B.E.
Best-estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best-estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant's behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology at the generic process level and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab
The purpose of this paper is to showcase the information literacy course for doctoral students called Information Resources and Tools for Research. Turku University Library organises this course in collaboration with the University of Turku Graduate School. The course, started in 2012, has been organised four times so far, twice in English and twice in Finnish. It offers training to the University's doctoral candidates across all doctoral programmes in the seven disciplines present at the University of Turku. In our presentation we describe the structure and contents of the course and share our experiences of the collaboration with the University of Turku Graduate School. In addition, we describe how the information specialists of Turku University Library have collaborated during the course. We also discuss the challenges of the course. Based on the course feedback, it can be stated that, in general, participants have found this course very useful for their research at the University of Turku.
Rutgers Cooperative Extension developed an online self-assessment tool called the Personal Health and Finance Quiz, available at http://njaes.rutgers.edu/money/health-finance-quiz/. Believed to be among the first public surveys to simultaneously query users about their health and personal finance practices, the quiz is part of Small Steps to Health and Wealth™ (SSHW), a Cooperative Extension program developed to motivate Americans to take action to improve both their health and personal finances (see http://njaes.rutgers.edu/sshw/). Respondents indicate one of four frequencies for the performance of 20 daily activities and receive a Health, a Finance, and a Total score indicating how often they perform activities that health and financial experts recommend. In addition to providing users with personalized feedback, the quiz collects data for research on the health and financial practices of Americans to inform future Extension outreach, and it can be used as a pre-/post-test to evaluate the impact of SSHW programs. Initial research analyses are planned for 2015.
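A quiz of this shape (20 items, four frequency choices, separate Health, Finance, and Total scores) can be scored with a simple lookup-and-sum. The point values, frequency labels, and the 10/10 split between health and finance items below are illustrative assumptions, not Rutgers' actual scale:

```python
# Hypothetical frequency-to-points mapping for each daily practice.
FREQUENCY_POINTS = {"never": 0, "sometimes": 1, "usually": 2, "always": 3}

def quiz_scores(health_answers, finance_answers):
    """Sum per-item points into Health, Finance, and Total scores,
    assuming 10 health items and 10 finance items (20 in all)."""
    assert len(health_answers) == 10 and len(finance_answers) == 10
    health = sum(FREQUENCY_POINTS[a] for a in health_answers)
    finance = sum(FREQUENCY_POINTS[a] for a in finance_answers)
    return {"health": health, "finance": finance, "total": health + finance}

s = quiz_scores(["always"] * 10, ["sometimes"] * 5 + ["usually"] * 5)
print(s)  # {'health': 30, 'finance': 15, 'total': 45}
```

Higher scores correspond to performing recommended practices more often, which is the personalized feedback the quiz returns to respondents.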
Roysri, Krisana; Chotipanich, Chanisa; Laopaiboon, Vallop; Khiewyoo, Jiraporn
Diagnostic nuclear medicine is being increasingly employed in clinical practice with the advent of new technologies and radiopharmaceuticals. Reporting the prevalence of the studied disease is important for assessing the quality of an article. Therefore, this study was performed to evaluate the quality of published nuclear medicine articles and determine the frequency of reporting the prevalence of studied diseases. We used the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2) checklists to evaluate the quality of articles published in the five nuclear medicine journals with the highest impact factors in 2012. The articles were retrieved from the Scopus database and were selected and assessed independently by two nuclear medicine physicians. Decisions concerning equivocal data were made by consensus between the reviewers. The average STARD score was approximately 17 points, and the highest score, 17.19 ± 2.38, was obtained by the European Journal of Nuclear Medicine. The QUADAS-2 tool showed that all journals had low bias regarding the study population. The Journal of Nuclear Medicine had the highest score in terms of index test, reference standard, and time interval. Lack of clarity regarding the index test, reference standard, and time interval was frequently observed in all journals, including Clinical Nuclear Medicine, in which 64% of the studies were unclear regarding the index test. The Journal of Nuclear Cardiology had the highest number of articles with an appropriate reference standard (83.3%), though it had the lowest frequency of reporting disease prevalence (zero reports). All five journals had the same STARD score, while the index test, reference standard, and time interval were very unclear according to the QUADAS-2 tool. Unfortunately, data were too limited to determine which journal had the lowest risk of bias. In fact, it is the author's responsibility to provide details of research methodology so that the
van Vught, Frans; Westerheijden, Don F.
This paper sets out to analyse the need for better "transparency tools" which inform university stakeholders about the quality of universities. First, we give an overview of what we understand by the concept of transparency tools and those that are currently available. We then critique current transparency tools' methodologies, looking in detail…
Shelley, Tia Renee; Dasgupta, Chandan; Silva, Alexandra; Lyons, Leilah; Moher, Tom
This paper presents a new mobile software tool, PhotoMAT (Photo Management and Analysis Tool), and students' experiences with this tool within a scaffolded curricular unit--Neighborhood Safari. PhotoMAT was designed to support learners' investigations of backyard animal behavior and works with image sets obtained using fixed-position field cameras…
Pescuma, Micaela; de Valdez, Graciela Font; Mozzi, Fernanda
Whey, the main by-product of the cheese industry, is considered as an important pollutant due to its high chemical and biological oxygen demand. Whey, often considered as waste, has high nutritional value and can be used to obtain value-added products, although some of them need expensive enzymatic synthesis. An economical alternative to transform whey into valuable products is through bacterial or yeast fermentations and by accumulation during algae growth. Fermentative processes can be applied either to produce individual compounds or to formulate new foods and beverages. In the first case, a considerable amount of research has been directed to obtain biofuels able to replace those derived from petrol. In addition, the possibility of replacing petrol-derived plastics by biodegradable polymers synthesized during bacterial fermentation of whey has been sought. Further, the ability of different organisms to produce metabolites commonly used in the food and pharmaceutical industries (i.e., lactic acid, lactobionic acid, polysaccharides, etc.) using whey as growth substrate has been studied. On the other hand, new low-cost functional whey-based foods and beverages leveraging the high nutritional quality of whey have been formulated, highlighting the health-promoting effects of fermented whey-derived products. This review aims to gather the multiple uses of whey as sustainable raw material for the production of individual compounds, foods, and beverages by microbial fermentation. This is the first work to give an overview on the microbial transformation of whey as raw material into a large repertoire of industrially relevant foods and products.
Nazem, Amir; Mansoori, G Ali
A century of research has passed since the discovery and definition of Alzheimer's disease (AD), the most common dementing disorder worldwide. However, AD lacks definite diagnostic approaches and an effective cure at present. Moreover, the currently available diagnostic tools are not sufficient for early screening of AD in order to start preventive approaches. Recently, the emerging field of nanotechnology has promised new techniques to solve some of the AD challenges. Nanotechnology refers to the techniques of designing and manufacturing nanosize (1-100 nm) structures through controlled positional and/or self-assembly of atoms and molecules. In this report, we present the promise that nanotechnology brings to research on AD diagnosis and therapy, including its potential for better understanding of the molecular mechanisms at the root of AD, its early diagnosis, and effective treatment. The advances in AD research offered by atomic force microscopy, single-molecule fluorescence microscopy and NanoSIMS microscopy are examined here. In addition, the recently proposed applications of nanotechnology for the early diagnosis of AD, including the bio-barcode assay, localized surface plasmon resonance nanosensors, quantum dots and nanomechanical cantilever arrays, are analyzed. Applications of nanotechnology in AD therapy, including neuroprotection against oxidative stress, anti-amyloid therapeutics, neuroregeneration and drug delivery beyond the blood-brain barrier (BBB), are discussed and analyzed. All of these applications could improve the treatment of AD and other neurodegenerative diseases. A complete cure for AD may become feasible through a combination of nanotechnology and other novel approaches, such as stem cell technology.
Amin, Waqas; Kang, Hyunseok P; Egloff, Ann Marie; Singh, Harpreet; Trent, Kerry; Ridge-Hetrick, Jennifer; Seethala, Raja R; Grandis, Jennifer; Parwani, Anil V
The Specialized Program of Research Excellence (SPORE) in Head and Neck Cancer neoplasm virtual biorepository is a bioinformatics-supported system that incorporates data from various clinical, pathological, and molecular systems into a single architecture based on a set of common data elements (CDEs) that provide semantic and syntactic interoperability of data sets. The components of this annotation tool include the CDEs themselves, derived from College of American Pathologists (CAP) checklists and North American Association of Central Cancer Registries (NAACCR) standards; the Data Entry Tool, a portable, flexible, and easily mastered Oracle-based web data entry device; and the Data Query Tool, which helps investigators and researchers search de-identified information within the warehouse/resource through a point-and-click interface, so that only the selected data elements are copied into a data mart using a multi-dimensional model built from the warehouse's relational structure. The SPORE Head and Neck Neoplasm Database contains multimodal datasets that are accessible to investigators via an easy-to-use query tool. The database currently holds 6,553 cases and 10,607 tumor accessions, among them 965 metastatic, 4,227 primary, 1,369 recurrent, and 483 new primary cases. Data disclosure is strictly regulated by user authorization. The SPORE Head and Neck Neoplasm Virtual Biorepository is a robust translational biomedical informatics tool that can facilitate basic science, clinical, and translational research. The Data Query Tool acts as a central source, providing a mechanism for researchers to efficiently find clinically annotated datasets and biospecimens relevant to their research areas. The tool protects patient privacy by revealing only de-identified data in accordance with regulations and approvals of the IRB and scientific review committee.
PET imaging has for many years been a versatile tool for non-invasive imaging of neurophysiology and, indeed, whole-body physiology. Quantitative PET imaging of trace amounts of radioactivity is scientifically elegant and can be very complex. This lecture focuses on whether and where this test is clinically useful. Because of its research tradition, PET imaging has been perceived as an 'expensive' test, as it costs more per scan than CT and MRI scans at most institutions. Such a superficial analysis is misleading, however: it is increasingly recognized that imaging costs, which in some circumstances will be increased by the use of PET, are only a relatively small component of patient care costs. Thus, PET may raise imaging costs and the number of imaging procedures in some settings, though it may reduce the number of imaging tests in others. The analysis must instead focus on the total costs of patient management. Analyses of total patient care costs, including the costs of hospitalization and surgery as well as imaging, have shown that PET can substantially reduce total patient care costs in several settings. This is achieved by providing a more accurate diagnosis, and thus fewer instances of an incorrect diagnosis leading to inappropriate surgery or investigations. Several institutions have described scenarios in which PET for tumor imaging is cost-effective. While the specific results of the analyses vary with disease prevalence, the cost inputs for each procedure, and the projected performance of PET, the consistent finding of total care cost savings in the management of several common cancers strongly supports the rationale for the use of PET in cancer management. In addition, promising clinical results are forthcoming in several other illnesses, suggesting PET will have broader utility than these uses alone. Thus, while PET is an 'expensive' imaging procedure and has considerable utility as a research
Hanekamp, E.; Vissers, P.; De Lint, S. [Partners for Innovation, Amsterdam (Netherlands)]
The Global Bio-Energy Partnership (GBEP) has developed a set of 24 sustainability indicators applicable to all forms of bio-energy and aimed at voluntary use by national governments. The GBEP indicators enable governments to assess the bio-energy sector and to develop new policies related to sustainable bio-energy production and use. These indicators have been piloted in Ghana. Modern bio-energy is a big opportunity for the region, which is why NL Agency adopted and supported the pilot together with GBEP. The pilot project was also supported by the ECOWAS Regional Centre for Renewable Energy and Energy Efficiency (ECREEE) and was coordinated by the Council for Scientific and Industrial Research (CSIR). The Ghana Energy Commission took responsibility for involving policymakers, and Partners for Innovation was commissioned by NL Agency to provide technical assistance for the pilot. The main aims of the project were: (a) enhancing the capacity of the host country Ghana (and ECOWAS) to use the GBEP indicators as a tool for assessing the sustainability of its bio-energy sector and/or developing sustainable bio-energy policies; and (b) learning lessons on how to apply the indicators and how to enhance their practicality as a tool for policymakers, and feeding this back to the GBEP community. Three Ghanaian research institutes (CSIR-FORIG, CSIR-IIR and UG-ISSER) studied 11 of the 24 GBEP indicators in the pilot. The pilot was a success: the GBEP sustainability indicators proved very valuable for Ghana, and they offer other governments a practical tool for assessing the sustainability of biomass sectors and policies as well. The report also presents important insights on data availability and quality and on the applicability of the GBEP indicators in Ghana. The final report provides concrete recommendations on: (1) how Ghana can proceed with the GBEP sustainability indicators; and (2) the lessons learned for
A descriptive qualitative research design was used to determine whether participants ... simulation as a teaching method; a manikin offering effective learning; confidence ..... Tesch R. Qualitative Research: Analysis Types and Software Tools.
Powers, Christina M.; Grieger, Khara D.; Hendren, Christine Ogilvie; Meacham, Connie A.; Gurevich, Gerald; Lassiter, Meredith Gooding; Money, Eric S.; Lloyd, Jennifer M.; Beaulieu, Stephen M.
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals).
Highlights:
• A web-based, interactive decision support tool was piloted for emerging materials.
• The tool (CEAWeb) was based on an established approach to prioritize research gaps.
• CEAWeb facilitates multi-stakeholder prioritization of research gaps.
• We provide recommendations for future versions and applications of CEAWeb.
Adeline Phaik Harn Chua; Kenneth R. Deans; Craig M. Parker
Blogs appear to be gaining momentum as a marketing tool that organisations can use for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to...
Eloranta, E. W.; Spuler, S.; Hayman, M. M.
Many aspects of air quality research require information on the vertical distribution of pollution. Traditional measurements, obtained from surface-based samplers or passive satellite remote sensing, do not provide vertical profiles. Lidar can provide profiles of aerosol properties; however, traditional backscatter lidar suffers from uncertain calibrations and poorly constrained algorithms. These problems are avoided with High Spectral Resolution Lidar (HSRL), which provides absolutely calibrated vertical profiles of aerosol properties. The University of Wisconsin HSRL systems measure 532 nm wavelength aerosol backscatter cross-sections, extinction cross-sections, depolarization, and attenuated 1064 nm backscatter. These instruments are designed for long-term deployment at remote sites with minimal local support. Processed data are provided for public viewing and download in real time at http://hsrl.ssec.wisc.edu. Air pollution applications of HSRL data are illustrated with examples acquired during air quality field programs including KORUS-AQ, DISCOVER-AQ, LAMOS, and FRAPPE. Observations include (1) long-range transport of dust, air pollution, and smoke; (2) fumigation episodes where elevated pollution is mixed down to the surface; (3) visibility restrictions by aerosols; and (4) diurnal variations in atmospheric optical depth. While HSRL is a powerful air quality research tool, its application in routine measurement networks is hindered by the high cost of current systems. Recent technical advances promise a next-generation HSRL using telecom components to greatly reduce system cost. This paper presents data generated by a prototype low-cost system constructed at NCAR. In addition to lower cost, operation at a non-visible near-infrared wavelength of about 780 nm removes all FAA restrictions on operation.
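The diurnal optical-depth observation mentioned above rests on a simple relation: aerosol optical depth is the altitude integral of the extinction profile that HSRL measures. A hedged sketch of that integration (trapezoidal rule; the variable names and units are illustrative, not the instrument's actual data format):

```python
# Hedged sketch: integrate an extinction profile over altitude to obtain a
# unitless aerosol optical depth. Illustrative only; not HSRL's processing code.
def optical_depth(altitude_m, extinction_per_m):
    """altitude_m: ascending altitudes (m); extinction_per_m: extinction at each (1/m)."""
    tau = 0.0
    for (z0, k0), (z1, k1) in zip(zip(altitude_m, extinction_per_m),
                                  zip(altitude_m[1:], extinction_per_m[1:])):
        tau += 0.5 * (k0 + k1) * (z1 - z0)  # trapezoid between adjacent range gates
    return tau
```

A uniform extinction of 1e-4 per metre over a 2 km column, for example, yields an optical depth of 0.2.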
Engholm, Gerda; Ferlay, Jacques; Christensen, Niels; Bray, Freddie; Gjerstorff, Marianne L; Klint, Asa; Køtlum, Jóanis E; Olafsdóttir, Elínborg; Pukkala, Eero; Storm, Hans H
The NORDCAN database and program ( www.ancr.nu ) include detailed information and results on cancer incidence, mortality and prevalence in each of the Nordic countries over five decades, and have recently been supplemented with predictions of cancer incidence and mortality; future extensions include the incorporation of cancer survival estimates. The data originate from the national cancer registries and causes-of-death registries in Denmark, Finland, Iceland, Norway, Sweden, and the Faroe Islands and are regularly updated. Presently 41 cancer entities are included in the common dataset, and conversions of the original national data according to international rules ensure comparability. With 25 million inhabitants in the Nordic countries, 130 000 incident cancers are reported yearly, alongside nearly 60 000 cancer deaths, with almost a million persons living with a cancer diagnosis. This web-based application is available in English and in each of the five Nordic national languages. It includes comprehensive and easy-to-use descriptive epidemiology tools that provide tabulations and graphs, with further user-specified options available. The NORDCAN database aims to provide comparable and timely data to serve the varying needs of policy makers, cancer societies, the public, and journalists, as well as the clinical and research community.
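One standard computation behind descriptive tabulations of this kind is the age-standardized incidence rate, which makes rates comparable across countries with different age structures. The sketch below is generic and hedged: the age bands and standard-population weights are illustrative, not NORDCAN's actual ones.

```python
# Hedged sketch of an age-standardized rate (per 100,000 person-years):
# weight each age-specific rate by a standard population's age fractions.
def age_standardized_rate(cases, person_years, weights):
    """cases/person_years: per-age-band counts; weights: standard-population fractions."""
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must describe a whole population
    age_specific = [c / py * 100_000 for c, py in zip(cases, person_years)]
    return sum(rate * w for rate, w in zip(age_specific, weights))
```

With two equal-sized age bands contributing rates of 10 and 20 per 100,000 and equal standard weights, the standardized rate is 15 per 100,000.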
We introduce the notion of Electric Field Encephalography (EFEG), based on measuring electric fields of the brain, and demonstrate, using computer modeling, that given appropriate electric field sensors this technique may have significant advantages over the current EEG technique. Unlike EEG, EFEG can be used to measure brain activity in a contactless and reference-free manner at significant distances from the head surface. Principal component analysis using simulated cortical sources demonstrated that electric field sensors positioned 3 cm away from the scalp and characterized by the same signal-to-noise ratio as EEG sensors provided the same number of uncorrelated signals as scalp EEG. When positioned on the scalp, EFEG sensors provided 2-3 times more uncorrelated signals. This significant increase in the number of uncorrelated signals can be used for more accurate assessment of brain states for non-invasive brain-computer interfaces and neurofeedback applications. It also may lead to major improvements in source localization precision. Source localization simulations for the spherical and Boundary Element Method (BEM) head models demonstrated that the localization errors are reduced two-fold when using electric fields instead of electric potentials. We have identified several techniques that could be adapted for the measurement of the electric field vector required for EFEG and anticipate that this study will stimulate new experimental approaches to utilize this new tool for functional brain research.
North, M. J. N.
Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system, but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition of marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital, as well as individual learning parameters. The links represent transmission lines with individual capacities drawn from a range of central distribution, outlying distribution, and feeder line types. The application of SMART II to electric power system studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
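The spot-market dynamics described above can be caricatured in a few lines. This is a hedged, illustrative sketch, not SMART II's actual Swarm implementation: generator agents bid their marginal costs, demand is served in merit order, and a single transmission limit stands in for the network constraints.

```python
# Hedged sketch of a one-shot spot-market clearing step (illustrative only):
# cheapest generators are dispatched first, capped by a transmission limit,
# and the uniform clearing price is set by the marginal (last dispatched) unit.
def clear_market(generators, demand, line_capacity):
    """generators: (marginal_cost, capacity) pairs; returns (dispatch, clearing_price)."""
    dispatch, price = [], 0.0
    remaining = min(demand, line_capacity)  # transmission constraint caps deliverable load
    for cost, capacity in sorted(generators):  # merit order: cheapest bids first
        if remaining <= 0:
            break
        take = min(capacity, remaining)
        dispatch.append((cost, take))
        price = cost  # marginal unit sets the uniform price
        remaining -= take
    return dispatch, price
```

For example, three generators bidding 10, 20, and 30 per MWh at 50 MW each, facing 80 MW of deliverable demand, clear at the second unit's bid of 20.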
Vernardos, G.; Fluke, C. J.; Croton, D.; Bate, N. F.
As synoptic all-sky surveys begin to discover new multiply lensed quasars, the flow of data will enable statistical cosmological microlensing studies of sufficient size to constrain quasar accretion disk and supermassive black hole properties. In preparation for this new era, we are undertaking the GPU-Enabled, High Resolution cosmological MicroLensing parameter survey (GERLUMPH). We present here the GERLUMPH Data Release 1, which consists of 12,342 high resolution cosmological microlensing magnification maps and provides the first uniform coverage of the convergence, shear, and smooth matter fraction parameter space. We use these maps to perform a comprehensive numerical investigation of the mass-sheet degeneracy, finding excellent agreement with its predictions. We study the effect of smooth matter on microlensing induced magnification fluctuations. In particular, in the minima and saddle-point regions, fluctuations are enhanced only along the critical line, while in the maxima region they are always enhanced for high smooth matter fractions (≈0.9). We describe our approach to data management, including the use of an SQL database with a Web interface for data access and online analysis, obviating the need for individuals to download large volumes of data. In combination with existing observational databases and online applications, the GERLUMPH archive represents a fundamental component of a new microlensing eResearch cloud. Our maps and tools are publicly available at http://gerlumph.swin.edu.au/
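The smooth-matter analysis above hinges on comparing map statistics with theory. As a hedged sketch (pure Python on a toy nested-list "map"; real maps are downloaded from the GERLUMPH archive, whose file format is not assumed here), the pixel-averaged magnification of a map with convergence kappa and shear gamma should approach the standard point-source expectation:

```python
# Hedged sketch: the theoretical mean magnification for given convergence and
# shear, and its empirical counterpart averaged over a toy magnification map.
def theoretical_mean_mu(kappa, gamma):
    """Point-source mean magnification 1 / |(1 - kappa)^2 - gamma^2|."""
    return 1.0 / abs((1.0 - kappa) ** 2 - gamma ** 2)

def empirical_mean_mu(mag_map):
    """Average per-pixel magnification over a nested-list map."""
    pixels = [mu for row in mag_map for mu in row]
    return sum(pixels) / len(pixels)
```

Comparing the two quantities on a real map is a common sanity check before studying magnification fluctuations.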
King, Stephanie L
Over the years, playback experiments have helped further our understanding of the wonderful world of animal communication. They have provided fundamental insights into animal behaviour and the function of communicative signals in numerous taxa. As important as these experiments are, however, there is strong evidence to suggest that the information conveyed in a signal may only have value when presented interactively. By their very nature, signalling exchanges are interactive and therefore, an interactive playback design is a powerful tool for examining the function of such exchanges. While researchers working on frog and songbird vocal interactions have long championed interactive playback, it remains surprisingly underused across other taxa. The interactive playback approach is not limited to studies of acoustic signalling, but can be applied to other sensory modalities, including visual, chemical and electrical communication. Here, I discuss interactive playback as a potent yet underused technique in the field of animal behaviour. I present a concise review of studies that have used interactive playback thus far, describe how it can be applied, and discuss its limitations and challenges. My hope is that this review will result in more scientists applying this innovative technique to their own study subjects, as a means of furthering our understanding of the function of signalling interactions in animal communication systems. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
Weber, Griffin M; Murphy, Shawn N; McMurry, Andrew J; Macfadden, Douglas; Nigrin, Daniel J; Churchill, Susanne; Kohane, Isaac S
The authors developed a prototype Shared Health Research Information Network (SHRINE) to identify the technical, regulatory, and political challenges of creating a federated query tool for clinical data repositories. Separate Institutional Review Boards (IRBs) at Harvard's three largest affiliated health centers approved use of their data, and the Harvard Medical School IRB approved building a Query Aggregator Interface that can simultaneously send queries to each hospital and display aggregate counts of the number of matching patients. Our experience creating three local repositories using the open source Informatics for Integrating Biology and the Bedside (i2b2) platform can be used as a road map for other institutions. The authors are actively working with the IRBs and regulatory groups to develop procedures that will ultimately allow investigators to obtain identified patient data and biomaterials through SHRINE. This will guide us in creating a future technical architecture that is scalable to a national level, compliant with ethical guidelines, and protective of the interests of the participating hospitals.
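The aggregate-count idea behind the Query Aggregator Interface can be sketched very simply: each site evaluates the query locally and returns only a patient count, so no row-level data leaves the hospital. The site callables below are stand-ins, not the real i2b2/SHRINE API.

```python
# Hedged sketch of federated aggregate counting (illustrative, not SHRINE code):
# fan a query out to per-site functions and sum the returned match counts.
def federated_count(sites, query):
    """sites: name -> callable returning a local match count for the query."""
    return {name: run_query(query) for name, run_query in sites.items()}

def total_count(per_site_counts):
    """Aggregate the per-site counts into a single number for display."""
    return sum(per_site_counts.values())
```

In the prototype described above, the aggregator would display both the per-hospital and total counts of matching patients.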
Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu
The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Fackrell, Kathryn; Fearnley, Constance; Hoare, Derek J; Sereda, Magdalena
Hypersensitivity to external sounds is often comorbid with tinnitus and may be significant for adherence to certain types of tinnitus management. Therefore, a clear measure of sensitivity to sound is important. The aim of this study was to evaluate the validity and reliability of the Hyperacusis Questionnaire (HQ) for use as a measurement tool using data from a sample of 264 adults who took part in tinnitus research. We evaluated the HQ factor structure, internal consistency, convergent and discriminant validity, and floor and ceiling effects. Internal consistency was high (Cronbach's alpha = 0.88) and moderate correlations were observed between the HQ, uncomfortable loudness levels, and other health questionnaires. Confirmatory factor analysis revealed that the original HQ three-factor solution and a one-factor solution were both a poor fit to the data. Four problematic items were removed and exploratory factor analysis identified a two-factor (attentional and social) solution. The original three-factor structure of the HQ was not confirmed. All fourteen items do not accurately assess hypersensitivity to sound in a tinnitus population. We propose a 10-item (2-factor) version of the HQ, which will need to be confirmed using a new tinnitus and perhaps nontinnitus population.
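The internal-consistency figure reported above (Cronbach's alpha = 0.88) comes from a standard formula that is easy to reproduce. The sketch below is generic and uses illustrative data, not the study's computation:

```python
from statistics import pvariance

# Hedged sketch of Cronbach's alpha: k/(k-1) * (1 - sum of item variances
# divided by the variance of the per-respondent total scores).
def cronbach_alpha(item_columns):
    """item_columns: one list of scores per questionnaire item (same respondents)."""
    k = len(item_columns)
    totals = [sum(scores) for scores in zip(*item_columns)]  # per-respondent totals
    item_variance = sum(pvariance(col) for col in item_columns)
    return k / (k - 1) * (1 - item_variance / pvariance(totals))
```

Perfectly correlated items give alpha = 1, while uncorrelated items drive it toward 0; values near 0.88, as reported here, indicate high internal consistency.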
Rulkens, W.H.; Klapwijk, A.; Willers, H.C.
Agricultural liquid livestock wastes are an important potential source of valuable nitrogen-containing compounds such as ammonia and proteins. Large volumetric quantities of these wastes are produced in areas with a high livestock production density. Much technological research has been carried out
In this article, issues associated with scientific institutions' use of content marketing strategy tools are described. The article shows the extent to which modern marketing tools are used by scientific institutions in their Internet communication. Currently, the content marketing concept is accepted not only as a fashionable trend in modern marketing but, above all, as an important tool for improving an institution's Internet message so that it effectively engages users. An optimal selection and use of content marketing tools provides opportunities for enhancing the efficiency with which the generated message is received and accepted.
Buildings need to be more environmentally benign, since the building sector is responsible for about 40% of all energy and material use in Sweden. For this reason, a unique cooperation between companies, municipalities and the Government, called "Building, Living and Property Management for the Future" ("The Building Living Dialogue" for short), has been going on since 2003. The project focuses on: (a) a healthy indoor environment, (b) efficient use of energy, and (c) efficient resource management. In accordance with the dialogue targets, two research projects were initiated aiming at developing an environmental rating tool that takes into account both building sector requirements and expectations and national and international research findings. This paper describes the first phase of the development work, in which stakeholders and researchers cooperate. It includes results from inventories and, based on this experience, discusses procedures for developing assessment tools and what the desirable features of a broadly accepted building rating tool could be.
Krakowka, Amy Richmond
Field trips have been acknowledged as valuable learning experiences in geography. This article uses Kolb's (1984) experiential learning model to discuss how students learn and how field trips can help enhance learning. Using Kolb's experiential learning theory as a guide in the design of field trips helps ensure that field trips contribute to…
It is observed that most of the timber trees producing valuable fruits and seeds have low ... sector of the economy by providing major raw materials (saw logs, ... the trees also produce industrial raw materials like latex, ... villagers while avoiding some of the ecological costs of ..... enzymes of rats with carbon tetrachloride.
Fraser, Orlaith N; Bugnyar, Thomas
Reconciliation, a post-conflict affiliative interaction between former opponents, is an important mechanism for reducing the costs of aggressive conflict in primates and some other mammals as it may repair the opponents' relationship and reduce post-conflict distress. Opponents who share a valuable relationship are expected to be more likely to reconcile as for such partners the benefits of relationship repair should outweigh the risk of renewed aggression. In birds, however, post-conflict behavior has thus far been marked by an apparent absence of reconciliation, suggested to result either from differing avian and mammalian strategies or because birds may not share valuable relationships with partners with whom they engage in aggressive conflict. Here, we demonstrate the occurrence of reconciliation in a group of captive subadult ravens (Corvus corax) and show that it is more likely to occur after conflicts between partners who share a valuable relationship. Furthermore, former opponents were less likely to engage in renewed aggression following reconciliation, suggesting that reconciliation repairs damage caused to their relationship by the preceding conflict. Our findings suggest not only that primate-like valuable relationships exist outside the pair bond in birds, but that such partners may employ the same mechanisms in birds as in primates to ensure that the benefits afforded by their relationships are maintained even when conflicts of interest escalate into aggression. These results provide further support for a convergent evolution of social strategies in avian and mammalian species.
This article disseminates the results of a programme of detailed archaeological survey and archive research on one of Europe's most important surviving late-medieval guild chapels: that of the Holy Cross Guild, Stratford-upon-Avon (Warwickshire). Today the building is part of Stratford-upon-Avon's tourist trail, located directly opposite William Shakespeare's home, 'New Place', and visited by thousands of tourists every year. However, its archaeological and historical significance has been overlooked owing to the extensive restoration of the building in the 19th and 20th centuries. This destroyed evidence for an internationally significant scheme of wall paintings within the Chapel, paid for by the London mayor and Stratford-upon-Avon merchant Hugh Clopton, an important member of the Holy Cross Guild and the original builder of 'New Place'. The paintings also have an important connection with Stratford-upon-Avon's most famous son, William Shakespeare, whose father may have been involved in their destruction and removal during the 16th century. Research by a team of historical archaeologists and digital heritage specialists at the Department of Archaeology, University of York, has revealed the significance of the Guild Chapel through the creation of a digital model and textual paradata, which form the focus of this article. The project is ground-breaking in that it moves beyond the traditional use of digital models as virtual reconstructions of past buildings, using the model itself as a research tool through which the user can explore and validate the evidence for the scheme directly. This is achieved through the creation of a palimpsest of antiquarian drawings of the paintings, made as they were revealed during restoration works in the 19th and 20th centuries, and set within their 3-dimensional architectural context. The model allows the user to compare and contrast differences in the recording methods, iconographies and interpretations of
Henry, Nancy L.
Technology and a variety of resources play an important role in students' educational lives. Vygotsky's (1987) theory of tool mediation suggests that cultural tools, such as computer software, influence individuals' thinking and action. However, it is not completely understood how technology and other resources influence student action. Middle…
Torelli, Francesca; Zander, Steffen; Ellerbrok, Heinz; Kochs, Georg; Ulrich, Rainer G; Klotz, Christian; Seeber, Frank
Rodent species like Myodes glareolus and Microtus spp. are natural reservoirs for many zoonotic pathogens causing human diseases and are gaining increasing interest in the field of eco-immunology as candidate animal models. Despite their importance, the lack of immunological reagents has hampered research in these animal species. Here we report the recombinant production and functional characterization of IFN-γ, a central mediator of the host's innate and adaptive immune responses, from the bank vole M. glareolus. Soluble dimeric recMgIFN-γ was purified in high yield from Escherichia coli. Its activity on M. glareolus and Microtus arvalis kidney cell lines was assessed by immunofluorescent detection of nuclear translocation and phosphorylation of the transcription factor STAT1. RecMgIFN-γ also induced expression of an IFN-γ-regulated innate immunity gene. Inhibition of vesicular stomatitis virus replication in vole cells upon recMgIFN-γ treatment provided further evidence of its biological activity. Finally, we established a recMgIFN-γ-responsive bank vole reporter cell line that allows the sensitive titration of the cytokine activity via a bioluminescence reporter assay. Taken together, we report valuable tools for future investigations on the immune response against zoonotic pathogens in their natural animal hosts, which might foster the development of novel animal models.
Smith, Des H.V.; Moehrenschlager, Axel; Christensen, Nancy; Knapik, Dwight; Gibson, Keith; Converse, Sarah J.
Worldwide, approximately 168 bird species are captive-bred for reintroduction into the wild. Programs tend to be initiated for species with a high level of endangerment. Depressed hatching success can be a problem for such programs and has been linked to artificial incubation. The need for artificial incubation is driven by the practice of multiclutching to increase egg production or by uncertainty over the incubation abilities of captive birds. There has been little attempt to determine how artificial incubation differs from bird-contact incubation. We describe a novel archive (data-logger) egg and use it to compare temperature, humidity, and egg-turning in 5 whooping crane (Grus americana) nests, 4 sandhill crane (G. canadensis) nests, and 3 models of artificial incubator, each of which is used to incubate eggs in whooping crane captive-breeding programs. Mean incubation temperature was 31.7 °C for whooping cranes and 32.83 °C for sandhill cranes. This is well below that of the artificial incubators (which were set based on a protocol of 37.6 °C). Humidity in crane nests varied considerably, but median humidity in all 3 artificial incubators was substantially different from that in the crane nests. Two artificial incubators failed to turn the eggs in a way that mimicked crane egg-turning. Archive eggs are an effective tool for guiding the management of avian conservation breeding programs, and can be custom-made for other species. They also have potential to be applied to research on wild populations.
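The nest-versus-incubator comparison above amounts to summary statistics over logged readings. A minimal Python sketch, in which the sample values are hypothetical; only the 37.6 °C incubator setpoint comes from the study:

```python
# Summarize archive-egg logger readings against an incubator setpoint.
# The nest_temps values below are hypothetical illustrations.
from statistics import mean, median

def summarize(samples, setpoint):
    """Return mean, median, and mean offset from the incubator setpoint."""
    m = mean(samples)
    return {"mean": round(m, 2),
            "median": round(median(samples), 2),
            "offset_from_setpoint": round(m - setpoint, 2)}

nest_temps = [31.2, 31.9, 31.6, 32.0, 31.8]  # hypothetical readings, deg C
print(summarize(nest_temps, setpoint=37.6))
```

The same reduction would apply per nest and per incubator model before comparing medians across devices.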
Benis, Arriel; Hoshen, Moshe
Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data, supporting public health knowledge discovery. It provides domain experts a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and ADHD filled-prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi's goals are to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes with similar trends. DisEpi combines hierarchical clustering graphics and a heatmap in which color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that: (1) relatively younger children, particularly the youngest in a class, are treated more often; (2) medication incidence increased between 2006 and 2011 but then stabilized; and (3) progression rates of medication incidence differ for each of the 3 main discovered clusters (i.e., profiles) of treated children. DisEpi delivered results similar to those previously published using classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.
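The clustering step behind a DisEpi-style heatmap can be sketched as grouping population strata by the similarity of their incidence trend vectors. The following is a hedged illustration, not DisEpi's actual code; the age groups and trend values are hypothetical, not Clalit data:

```python
# Naive agglomerative clustering (single linkage, Euclidean) of
# hypothetical medication-incidence trend vectors per age group.
from math import dist

def single_linkage(profiles, n_clusters):
    """Merge the two closest clusters until n_clusters remain."""
    clusters = [[name] for name in profiles]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return clusters

# Hypothetical incidence trends (arbitrary units, 5 yearly snapshots)
profiles = {
    "age 6-8":   [1.0, 1.4, 1.8, 1.9, 1.9],
    "age 9-11":  [1.1, 1.5, 1.9, 2.0, 2.0],
    "age 12-14": [0.5, 0.6, 0.7, 0.7, 0.7],
    "age 15-17": [0.4, 0.5, 0.6, 0.6, 0.6],
}
print(single_linkage(profiles, n_clusters=2))
```

In a heatmap view, the rows would then be reordered by cluster so that strata with similar time-trends sit together.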
Purpose: The aim of this study is to investigate the importance of knowledge management as a tool for improving business processes in a context different from industrial organizations: an archaeological museum. Design/methodology/approach: Using data collected from the National Museum of the Sultanate of Oman in Muscat, a methodology for the analysis and improvement of processes (the Business Cycle Management Process, CMP) is designed and validated. This application is described as an eight-phase process based on Six Sigma DMAIC. The model has a characteristic "P" shape. Findings: As the results obtained by the process improvement initiative show, we highlight the relevance of improvements in all aspects of showcase security in that context. Research limitations/implications: The complexity of implementing indicators and the partial vision of the project, as data were only obtained from part of one of the companies involved in the construction of the museum. An important implication of this paper is that it presents a methodology to improve museum processes, focusing on the reduction of errors while adding value for visitors. Practical implications: The relevance of intervening on certain variables at different levels of management performance is verified. Social implications: Improving the quality of leisure services through the identification of certain challenges regarding the nature and competitiveness of cultural services. Originality/value: The current work has served as a repository of knowledge applicable to new similar projects, in which the peculiarities of each case must be taken into account, in particular the level of quality demanded by the client in a cultural context. It is important to take into account the degree of avoidable dissatisfaction (the number of solvable problems that would lead to dissatisfaction), the opportunity for improvement, the reduction of operational waste and the need
Vitova, T.; Brendebach, B.; Dardenne, K.; Denecke, M. A.; Lebid, A.; Löble, M.; Rothe, J.; Batuk, O. N.; Hormes, J.; Liu, D.; Breher, F.; Geckeis, H.
High-resolution X-ray emission spectroscopy (HRXES) is becoming increasingly important for our understanding of electronic and coordination structures. The combination of such information with the development of quantum theoretical tools will advance our capability for predicting reactivity and physical behavior, especially of 5f elements. HRXES can be used to remove lifetime broadening by registering the partial fluorescence yield emitted by the sample (i.e., recording a windowed signal from the energy-dispersed fluorescence emission while varying the incident photon energy), thereby yielding highly resolved X-ray absorption fine structure (XAFS) spectra. Such spectra often display resonant features not observed in conventional XAFS. The spectrometer set-up can also be used for a wide range of other experiments, for example, resonant inelastic X-ray scattering (RIXS), where bulk electron configuration information in solids, liquids and gases is obtained. Also possible are valence-selective XAFS studies, where the local structure of a selected element's valence state present in a mixture of valence states can be obtained, as well as site-selective XAFS studies, where the coordination structure of a metal bound to selected elements can be differentiated from that of all the other ligating atoms. A HRXES spectrometer has been constructed and is presently being commissioned for use at the INE-Beamline for actinide research at the synchrotron source ANKA at FZK. We present the spectrometer's compact, modular design, optimized for attaining a wide range of energies, and first test measurement results. Examples from HRXES studies of lanthanides, counterparts of the actinides, are also shown.
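The partial fluorescence yield scheme described above (windowing the dispersed emission while scanning incident energy) can be illustrated with a toy example. This is not the INE-Beamline software; all energies and counts below are hypothetical:

```python
# Assemble a partial-fluorescence-yield (PFY) scan from a 2D emission map:
# for each incident-energy step, sum the counts inside a fixed emission window.
def pfy(emission_map, emission_energies, window):
    """emission_map[i][k]: counts at incident step i, emission channel k."""
    lo, hi = window
    keep = [k for k, e in enumerate(emission_energies) if lo <= e <= hi]
    return [sum(row[k] for k in keep) for row in emission_map]

# Hypothetical 3-step scan over 4 emission channels (counts)
emission_energies = [3100.0, 3102.0, 3104.0, 3106.0]  # eV
emission_map = [
    [5, 40, 38, 4],
    [6, 80, 75, 5],
    [5, 55, 50, 6],
]
print(pfy(emission_map, emission_energies, window=(3101.0, 3105.0)))
```

Sweeping the window over different emission lines is what enables the valence- and site-selective variants mentioned in the abstract.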
Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.
The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based point-of-access tools, ranging from the Runs-On-Request System, which provides unprecedented global access to the largest collection of state-of-the-art solar and space physics models; the Integrated Space Weather Analysis system (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; and standard data formats for simulation data downloads; to mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).
Duffy, Christopher; Leonard, Lorne; Shi, Yuning; Bhatt, Gopal; Hanson, Paul; Gil, Yolanda; Yu, Xuan
Using a series of recent examples and papers, we explore some progress and potential for virtual (cyber-) collaboration inspired by access to high-resolution, harmonized public-sector data at continental scales. The first example describes 7 meso-scale catchments in Pennsylvania, USA, where the watershed is forced by climate reanalysis and IPCC (Intergovernmental Panel on Climate Change) future climate scenarios. We show how existing public-sector data and community models are currently able to resolve fine-scale eco-hydrologic processes regarding wetland response to climate change. The results reveal that regional climate change is only part of the story, with large variations in flood and drought response associated with differences in terrain, physiography, land use and/or hydrogeology. The importance of community-driven virtual testbeds is demonstrated in the context of Critical Zone Observatories, where earth scientists from around the world are organizing hydro-geophysical data and model results to explore new processes that couple hydrologic models with land-atmosphere interaction, biogeochemical weathering, the carbon-nitrogen cycle, landscape evolution and ecosystem services. Critical Zone cyber-research demonstrates how data-driven model development requires a flexible computational structure where process modules are relatively easy to incorporate and where new data structures can be implemented. From the perspective of "Big Data", the paper points out that extrapolating results from virtual observatories to catchments at continental scales will require centralized or cloud-based cyberinfrastructure as a necessary condition for effectively sharing petabytes of data and model results. Finally, we outline how innovative cyber-science is supporting earth-science learning, sharing and exploration through the use of on-line tools where hydrologists and limnologists are sharing data and models for simulating the coupled impacts of catchment
Yang, Yingzhen; Costa, Alex; Leonhardt, Nathalie; Siegel, Robert S; Schroeder, Julian I
Background A common limitation in guard cell signaling research is that it is difficult to obtain consistent high expression of transgenes of interest in Arabidopsis guard cells using known guard cell promoters or the constitutive 35S cauliflower mosaic virus promoter. An additional drawback of the 35S promoter is that ectopically expressing a gene throughout the organism could cause pleiotropic effects. To improve available methods for targeted gene expression in guard cells, we isolated strong guard cell promoter candidates based on new guard cell-specific microarray analyses of 23,000 genes that are made available together with this report. Results A promoter, pGC1(At1g22690), drove strong and relatively specific reporter gene expression in guard cells including GUS (beta-glucuronidase) and yellow cameleon YC3.60 (GFP-based calcium FRET reporter). Reporter gene expression was weaker in immature guard cells. The expression of YC3.60 was sufficiently strong to image intracellular Ca2+ dynamics in guard cells of intact plants and resolved spontaneous calcium transients in guard cells. The GC1 promoter also mediated strong reporter expression in clustered stomata in the stomatal development mutant too-many-mouths (tmm). Furthermore, the same promoter::reporter constructs also drove guard cell specific reporter expression in tobacco, illustrating the potential of this promoter as a method for high level expression in guard cells. A serial deletion of the promoter defined a guard cell expression promoter region. In addition, anti-sense repression using pGC1 was powerful for reducing specific GFP gene expression in guard cells while expression in leaf epidermal cells was not repressed, demonstrating strong cell-type preferential gene repression. Conclusion The pGC1 promoter described here drives strong reporter expression in guard cells of Arabidopsis and tobacco plants. It provides a potent research tool for targeted guard cell expression or gene silencing. It is also
Altman, Eric I; Baykara, Mehmet Z; Schwarz, Udo D
Although atomic force microscopy (AFM) was rapidly adopted as a routine surface imaging apparatus after its introduction in 1986, it has not been widely used in catalysis research. The reason is that common AFM operating modes do not provide the atomic resolution required to follow catalytic processes; rather, the more complex noncontact (NC) mode is needed. Thus, scanning tunneling microscopy has been the principal tool for atomic-scale catalysis research. In this Account, recent developments in NC-AFM will be presented that offer significant advantages for gaining a complete atomic-level view of catalysis. The main advantage of NC-AFM is that the image contrast is due to the very short-range chemical forces that are of interest in catalysis. This motivated our development of 3D-AFM, a method that yields quantitative atomic-resolution images of the potential energy surfaces that govern how molecules approach, stick, diffuse, and rebound from surfaces. A variation of 3D-AFM allows the determination of the forces required to push atoms and molecules on surfaces, from which diffusion barriers and variations in adsorption strength may be obtained. Pushing molecules towards each other provides access to the intermolecular interactions between reaction partners. Following reaction, NC-AFM with CO-terminated tips yields textbook images of intramolecular structure that can be used to identify reaction intermediates and products. Because NC-AFM and STM contrast mechanisms are distinct, combining the two methods can produce unique insight. It is demonstrated for surface-oxidized Cu(100) that simultaneous 3D-AFM/STM yields resolution of both the Cu and O atoms. Moreover, atomic defects in the Cu sublattice lead to variations in the reactivity of the neighboring O atoms. It is shown that NC-AFM also allows straightforward imaging of work function variations, which has been used to identify defect charge states on catalytic surfaces and to map charge transfer within an individual
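As a toy illustration of how a diffusion barrier follows from a 3D-AFM potential energy profile (the Account's actual data pipeline is not shown here; the path energies below are hypothetical):

```python
# Estimate a diffusion barrier from a sampled potential-energy profile
# along a path between two adsorption minima. Values are hypothetical, in eV.
def diffusion_barrier(profile):
    """Energy difference between the path maximum and the deepest minimum."""
    return max(profile) - min(profile)

path_energy = [-0.42, -0.35, -0.18, -0.05, -0.17, -0.36, -0.41]  # eV along path
print(f"barrier = {diffusion_barrier(path_energy):.2f} eV")
```

In practice the profile itself would be extracted from the force-versus-distance data cube measured by 3D-AFM.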
Zhao, Y.; Zhao, Y. L.; Shao, YW; Hu, T. J.; Zhang, Q.; Ge, X. H.
Cutting force is an important factor that affects machining accuracy, cutting vibration and tool wear. Machining condition monitoring by cutting force measurement is a key technology for intelligent manufacturing. Current cutting force sensors suffer from large volume, complex structure and poor compatibility in practical applications; to address these problems, a smart cutting tool for cutting force measurement is proposed in this paper. Commercial MEMS (Micro-Electro-Mechanical System) strain gauges with high sensitivity and small size are adopted as the transducing element of the smart tool, and a structure-optimized cutting tool is fabricated for MEMS strain gauge bonding. Static calibration results show that the developed smart cutting tool is able to measure cutting forces in both the X and Y directions, and the cross-interference error is within 3%. Its general accuracy is 3.35% and 3.27% in the X and Y directions, and its sensitivity is 0.1 mV/N, which makes it very suitable for measuring small cutting forces in high-speed precision machining. The smart cutting tool is portable and reliable for practical application in CNC machine tools.
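The two-axis calibration arithmetic implied by the stated sensitivity (0.1 mV/N) and cross-interference (within 3%) can be sketched as inverting a 2x2 sensitivity matrix. The matrix entries and voltages below are illustrative assumptions, not the paper's calibration data:

```python
# Recover X/Y cutting forces from gauge voltages via a 2x2 sensitivity
# matrix: main sensitivity 0.1 mV/N, small off-diagonal cross-talk terms.
def solve_2x2(S, v):
    """Solve S @ f = v for f (forces in N); S in mV/N, v in mV."""
    (a, b), (c, d) = S
    det = a * d - b * c
    return [(d * v[0] - b * v[1]) / det,
            (-c * v[0] + a * v[1]) / det]

S = [[0.100, 0.003],   # X channel: 0.1 mV/N, ~3% cross-talk from Y
     [0.002, 0.100]]   # Y channel
voltages = [5.15, 10.1]  # measured, mV (hypothetical)
fx, fy = solve_2x2(S, voltages)
print(round(fx, 1), round(fy, 1))
```

Ignoring the off-diagonal terms would fold the cross-interference directly into the force estimate, which is why static calibration quantifies it.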
Abery, Philip; Kuys, Suzanne; Lynch, Mary; Low Choy, Nancy
Objective: To design and establish the reliability of a local stroke audit tool by engaging allied health clinicians within a privately funded hospital. Design: Two-stage study involving a modified Delphi process to inform stroke audit tool development, followed by inter-tester reliability testing. Participants: Allied health clinicians. Methods: A modified Delphi process to select stroke guideline recommendations for inclusion in the audit tool. Reliability study: 1 allied health representative from each discipline audited 10 clinical records with sequential admissions to acute and rehabilitation services. Recommendations were admitted to the audit tool when 70% agreement was reached, with 50% set as the reserve agreement. Inter-tester reliability was determined using intra-class correlation coefficients (ICCs) across the 10 clinical records. Results: Twenty-two participants (92% female, 50% physiotherapists, 17% occupational therapists) completed the modified Delphi process. Across 6 voting rounds, 8 recommendations reached 70% agreement and 2 reached 50% agreement. Two recommendations (nutrition/hydration; goal setting) were added to ensure representation for all disciplines. Substantial consistency across raters was established for the audit tool applied in acute stroke (ICC .71; range .48 to .90) and rehabilitation (ICC .78; range .60 to .93) services. Conclusions: Allied health clinicians within a privately funded hospital generally agreed in an audit process to develop a reliable stroke audit tool. Allied health clinicians agreed on stroke guideline recommendations to inform a stroke audit tool. The stroke audit tool demonstrated substantial consistency, supporting future use for service development. This process, which engages local clinicians, could be adopted by other facilities to design reliable audit tools to identify local service gaps and inform changes to clinical practice. © 2018 John Wiley & Sons, Ltd.
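The Delphi admission rule described above (70% agreement to admit, 50% as the reserve threshold) is easy to state in code. A minimal sketch with hypothetical vote counts:

```python
# Classify a Delphi recommendation by panel agreement:
# >= 70% admitted, >= 50% reserve, otherwise rejected.
def classify(votes_for, panel_size):
    pct = 100 * votes_for / panel_size
    if pct >= 70:
        return "admitted"
    if pct >= 50:
        return "reserve"
    return "rejected"

panel = 22  # panel size from the study; vote counts below are hypothetical
print(classify(17, panel), classify(12, panel), classify(9, panel))
```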
Wiche, Oliver; Székely, Balazs; Moschner, Christin; Heilmeier, Hermann
plots was randomized and every treatment was replicated fivefold. Soil solution was collected weekly with plastic suction cups. Concentrations of trace metals in shoots of oat and in soil solution were measured with ICP-MS. As a result, we found that both the concentrations of trace elements in oat plants and the mobility of P and trace metals in soil solution were increased by intercropping with white lupin. Mixed culture of oat with 11% white lupin significantly increased the concentrations of the trace nutrients Fe, Mn and Zn, as well as the concentrations of the trace metals Pb, La, Nd, Sc, Th and U in tissues of oat. Surprisingly, mixed cultures with 33% white lupin did not significantly affect trace metal concentrations in oat, which might be the consequence of increasing competition between the roots of white lupin and oat for nutrients and trace metals. In conclusion, we found that mixed cultures of white lupin with cereals might be a powerful tool for enhanced phytoremediation and phytomining. However, the processes involved in the physicochemical mechanism of element uptake as affected by the oat/white lupin co-cultivation remain unknown, and further studies on this topic are planned. These studies have been carried out in the framework of the PhytoGerm project, financed by the Federal Ministry of Education and Research, Germany. The authors are grateful to the students and laboratory assistants who contributed to the field work and sample preparation.
Maar, Marion; Yeates, Karen; Barron, Marcia; Hua, Diane; Liu, Peter; Moy Lum-Kwong, Margaret; Perkins, Nancy; Sleeth, Jessica; Tobe, Joshua; Wabano, Mary Jo; Williamson, Pamela; Tobe, Sheldon W
Non-communicable chronic diseases are the leading causes of mortality globally, and nearly 80% of these deaths occur in low- and middle-income countries (LMICs). In high-income countries (HICs), inequitable distribution of resources affects poorer and otherwise disadvantaged groups, including Aboriginal peoples. Cardiovascular mortality in HICs has recently begun to fall; however, these improvements are not realized among citizens of LMICs or those subgroups in HICs who are disadvantaged in the social determinants of health, including Aboriginal people. It is critical to develop multi-faceted, affordable and realistic health interventions in collaboration with groups who experience health inequalities. Based on community-based participatory research (CBPR), we aimed to develop implementation tools to guide complex interventions and ensure that health gains can be realized in low-resource environments. We developed the I-RREACH (Intervention and Research Readiness Engagement and Assessment of Community Health Care) tool to guide the implementation of interventions in low-resource environments. We employed CBPR and a consensus methodology to (1) develop the theoretical basis of the tool, (2) identify key implementation factor domains, and (3) collect participant evaluation data to validate the tool during implementation. The I-RREACH tool was successfully developed using a community-based consensus method and is rooted in participatory principles, equalizing the importance of the knowledge and perspectives of researchers and community stakeholders while encouraging respectful dialogue. The I-RREACH tool consists of three phases: fact finding, stakeholder dialogue and community member/patient dialogue. The evaluation of our first implementation of I-RREACH by participants was overwhelmingly positive, with 95% or more of participants indicating comfort with and support for the process and the dialogue it creates. The I
Collins, Sandra K; Collins, Kevin S
With the workforce growing older and the supply of younger workers diminishing, it is critical for health care managers to understand the factors necessary to capitalize on their vintage employees. Retaining this segment of the workforce has a multitude of benefits including the preservation of valuable intellectual capital, which is necessary to ensure that health care organizations maintain their competitive advantage in the consumer-driven market. Retaining the aging employee is possible if health care managers learn the motivators and training differences associated with this category of the workforce. These employees should be considered a valuable resource of human capital because without their extensive expertise, intense loyalty and work ethic, and superior customer service skills, health care organizations could suffer severe economic repercussions in the near future.
Vladimir I. Zagvyazinsky
The aim of the investigation is to show that in modern market conditions it is necessary to preserve the humanistic values and orientations of domestic education, and not to allow it to slide toward utilitarian, quickly achievable, but short-lived benefits. Theoretical significance: the author emphasizes the value of forming an ideal – the harmonious development of the personality – and of collectivist principles for disclosing the potential of each schoolchild, student, a...
JMBE Production Editor
Correction for Sarah E. Council and Julie E. Horvath, “Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom,” which appeared in the Journal of Microbiology & Biology Education, volume 17, number 1, March 2016, pages 38–40.
The aim of this study is to investigate whether Lego could be used as a tool for reflective practice with social care practitioners (SCPs) and student practitioners. This article outlines an action research study conducted in an institute of higher education in Ireland. Findings from this study suggest that Lego can be used to support student…
Ng, Wan; Gunstone, Richard
Investigates the use of the World Wide Web (WWW) as a research and teaching tool in promoting self-directed learning groups of 15-year-old students. Discusses the perceptions of students of the effectiveness of the WWW in assisting them with the construction of knowledge on photosynthesis and respiration. (Contains 33 references.) (Author/YDS)
Federated searching was once touted as the library world's answer to Google, but ten years since federated searching technology's inception, how does it actually compare? This study focuses on undergraduate student preferences and perceptions when doing research using both Google and a federated search tool. Students were asked about their…
Powers, Christina M. [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Grieger, Khara D. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Hendren, Christine Ogilvie [Center for the Environmental Implications of NanoTechnology, Duke University, Durham, NC 27708 (United States)]; Meacham, Connie A. [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Gurevich, Gerald [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Lassiter, Meredith Gooding [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States)]; Money, Eric S. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Lloyd, Jennifer M. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]; Beaulieu, Stephen M. [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)]
Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb.
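One simple way a tool like the one piloted here might aggregate multi-stakeholder input is to average each research gap's priority ratings across stakeholder groups and rank them. This is an assumption for illustration, not CEAWeb's documented algorithm; the gap names and ratings are hypothetical:

```python
# Rank research gaps by mean stakeholder priority rating (descending).
def rank_gaps(ratings):
    """ratings: {gap: [scores, one per stakeholder group]}"""
    means = {gap: sum(s) / len(s) for gap, s in ratings.items()}
    return sorted(means, key=means.get, reverse=True)

ratings = {
    "environmental fate":   [5, 4, 5],
    "exposure measurement": [3, 4, 3],
    "ecotoxicity data":     [4, 5, 4],
}
print(rank_gaps(ratings))
```

A real deployment would also need to weight stakeholder groups and surface disagreement, not just central tendency.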
Mishra, P; Patankar, A; Etmektzoglou, A; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States); Lewis, J [Brigham and Women’s Hospital, Boston, MA (United States)
Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate TrueBeam-readable XML files either from scratch or from pre-existing DICOM-RT plans. A DICOM-RT plan is first converted to XML format, and researchers can then interactively modify it or add control points. The delivered beam can be verified by reading the generated images and analyzing trajectory log files. The image reader handles both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are i) no software installation or maintenance is needed, ii) easy accessibility across all devices, iii) seamless upgrades, and iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.
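The expected-vs-actual comparison such a trajectory log analyzer performs can be sketched in a few lines. The axis name, sample values and helper functions below are invented for illustration; they do not reflect the actual Varian TrajectoryLog binary format.

```python
# Hypothetical sketch of an expected-vs-actual axis comparison, as a
# trajectory-log analyzer might perform per mechanical axis.

def axis_deviations(expected, actual):
    """Per-sample deviations (actual - expected) for one axis."""
    return [a - e for e, a in zip(expected, actual)]

def max_abs_deviation(expected, actual):
    """Largest absolute deviation over the delivery."""
    return max(abs(d) for d in axis_deviations(expected, actual))

# Illustrative gantry-angle samples (degrees), not real log data
expected_gantry = [0.0, 10.0, 20.0, 30.0]
actual_gantry   = [0.0, 10.2, 19.9, 30.1]

print(max_abs_deviation(expected_gantry, actual_gantry))  # approx. 0.2
```

The same per-axis summary could then be repeated over all 30 mechanical axes to produce the deviation plots described above.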
Good, Marjorie J; Hurley, Patricia; Woo, Kaitlin M; Szczepanek, Connie; Stewart, Teresa; Robert, Nicholas; Lyss, Alan; Gönen, Mithat; Lilenbaum, Rogerio
Clinical research program managers are regularly faced with the quandary of determining how much of a workload research staff members can manage while they balance clinical practice and still achieve clinical trial accrual goals, maintain data quality and protocol compliance, and stay within budget. A tool was developed to measure clinical trial-associated workload, to apply objective metrics toward documentation of work, and to provide clearer insight to better meet clinical research program challenges and aid in balancing staff workloads. A project was conducted to assess the feasibility and utility of using this tool in diverse research settings. Community-based research programs were recruited to collect and enter clinical trial-associated monthly workload data into a web-based tool for 6 consecutive months. Descriptive statistics were computed for self-reported program characteristics and workload data, including staff acuity scores and number of patient encounters. Fifty-one research programs that represented 30 states participated. Median staff acuity scores were highest for staff with patients enrolled in studies and receiving treatment, relative to staff with patients in follow-up status. Treatment trials typically resulted in higher median staff acuity, relative to cancer control, observational/registry, and prevention trials. Industry trials exhibited higher median staff acuity scores than trials sponsored by the National Institutes of Health/National Cancer Institute, academic institutions, or others. The results from this project demonstrate that trial-specific acuity measurement is a better measure of workload than simply counting the number of patients. The tool was shown to be feasible and useable in diverse community-based research settings. Copyright © 2016 by American Society of Clinical Oncology.
Curtis, Helen J; Goldacre, Ben
We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
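The two-stage matching described above (exact name first, then similar names) can be sketched as follows. The formulary entries and chemical codes are invented for illustration; the real pipeline maps drugs to their current classification in the national formulary.

```python
# Hedged sketch of the drug-name normalisation step: exact match first,
# then a close-name match. The dict below is a toy stand-in for the
# current formulary; codes are invented.
import difflib

formulary = {
    "Paracetamol": "0407010H0",
    "Ibuprofen": "1001010J0",
    "Amoxicillin": "0501013B0",
}

def match_drug(name, formulary, cutoff=0.8):
    """Return (chemical_code, method) or (None, 'unmatched')."""
    if name in formulary:                      # exact name match
        return formulary[name], "exact"
    close = difflib.get_close_matches(name, formulary, n=1, cutoff=cutoff)
    if close:                                  # similar-name match
        return formulary[close[0]], "similar"
    return None, "unmatched"

print(match_drug("Paracetamol", formulary))   # matched exactly
print(match_drug("Amoxycillin", formulary))   # matched on similar spelling
```

Counting the fraction of drugs resolved by each branch gives figures analogous to the 87.5% exact and 6.5% similar-name matches reported.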
Background: To the best of our knowledge, a strategic approach to define the contents of structured clinical documentation tools for both routine clinical patient care and research purposes has not been reported so far, although electronic health records will become more and more structured and detailed in the future. Objective: To achieve an interdisciplinary consensus on a checklist to be considered in the preparation of disease- and situation-specific clinical documentation tools. Methods: A 2-round Delphi consensus-based process was conducted with 19 physicians of different disciplines and 14 students from Austria, Switzerland, and Germany. Agreement was defined as 80% or more positive votes of the participants. Results: The participants agreed that a working group should be set up for the development of structured disease- or situation-specific documentation tools (97% agreement). The final checklist included 4 recommendations concerning the setup of the working group, 12 content-related recommendations, and 3 general and technical recommendations (mean agreement [standard deviation] = 97.4% [4.0%], ranging from 84.2% to 100.0%). Discussion and Conclusion: In the future, disease- and situation-specific structured documentation tools will provide an important bridge between registries and electronic health records. Clinical documentation tools defined according to this Delphi consensus-based checklist will provide data for registries while serving as high-quality data acquisition tools in routine clinical care.
Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka
Background A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-qu...
I am inspired to write this article after coming across some publications in The Physics Teacher that all hit on topics of personal interest and experience. Similarly to Christensen my goal in writing this is to encourage other physics educators to take advantage of modern technology in delivering content to students and to feel comfortable doing so. There are numerous ways in which to create screencasts and lecture videos, some of which have been addressed in other articles. I invite those interested in learning how to create these videos to contact their educational technology staff or perform some internet searches on the topic. I will focus this article on the technology that enhanced the content I was delivering to my students. I will share a bit of my journey towards creating video materials and introduce a vital piece of technology, the graphics tablet, which changed the way I communicate with my students.
Many patients with cardiovascular disease have low density lipoprotein cholesterol within the normal range. This raises the question of which lipoprotein is the most important marker of atherogenicity. In fact, small dense low density lipoprotein has recently been suggested as a strong predictor of cardiovascular ...
Swordfish Xiphias gladius is an oceanic-pelagic species. Its population structure in the Western Indian Ocean was studied from the shape of the sagittal otoliths of 391 individuals collected from 2009 to 2014. Normalised elliptical Fourier descriptors (EFDs) were extracted automatically using TNPC software. Principal ...
Biagioli, M.; Pinton, P.; Scudiero, R.; Ragghianti, M.; Bucci, S.; Rizzuto, R.
The ability of cadmium to disrupt calcium homeostasis has been known for a long time, but the precise cellular targets of its toxic action are still debated. A major problem in the interpretation of data has been the ability of cadmium to bind strongly to traditional calcium probes. Aequorin, the well-characterized calcium-sensitive photoprotein, was used as an intracellular calcium indicator during cadmium injury in NIH 3T3 murine fibroblasts. NIH 3T3 cells were transfected with a cDNA construct containing aequorin fused to a truncated glutamate receptor, which directs the probe to the outer surface of intracellular membranes. At first, we tested whether different cadmium concentrations were able to modify the rate of light emission by aequorin, to assess Cd2+/Ca2+ interference. To directly investigate the role of Cd2+ in Ca2+ homeostasis, we have started to selectively measure the free Ca2+ concentration in different cell compartments. Here, we report that cadmium reduces the transient free calcium signal after stimulation of cells with bradykinin. Further studies are in progress to clarify the role of mitochondria and the endoplasmic reticulum in cadmium-induced alterations of Ca2+ homeostasis, in order to link signal transduction modifications with the onset of apoptosis induced by cadmium exposure.
In order to avoid unnecessary, time consuming, and costly litigation, the Department of Defense, and more specifically the United States Navy, has adopted the use of alternative dispute resolution (ADR...
Mishra, A.; Ehtuish, Ehtuish F.
To evaluate the efficacy of multidetector (16-row) computed tomography (MDCT) in imaging the upper and lower limb arterial tree in trauma and peripheral vascular disease. Thirty-three patients underwent multislice computed tomography angiography (MSCTA) of the upper or lower limb on a multislice (16-slice) CT scanner between November 2004 and July 2005 in the Department of Radiology, National Organ Transplant Center, Tripoli, Libya. The findings were retrospectively compared with the surgical outcome in cases of trauma with suspected arterial injuries, or correlated with color Doppler for patients with peripheral vascular disease. Multislice computed tomography angiography allows a comprehensive diagnostic work-up in all trauma cases with suspected arterial injuries. In 23 cases of peripheral vascular disease, MSCTA adequately demonstrated the presence of any stenosis or occlusion, its degree and extent, the presence of collaterals and distal reformation if any, and the presence of plaques. Our experience of computed tomography angiography with a 16-row MDCT scanner has clearly demonstrated its efficacy as a promising, fast, accurate, safe and non-invasive imaging modality of choice in cases of trauma with suspected arterial injuries, and as a useful screening modality in cases of peripheral vascular disease for diagnosis and grading. (author)
Dawidowicz, Andrzej L; Czapczyńska, Natalia B
Essential oils are one of nature's most precious gifts with surprisingly potent and outstanding properties. Coniferous oils, for instance, are nowadays being used extensively to treat or prevent many types of infections, modify immune responses, soothe inflammations, stabilize moods, and to help ease all forms of non-acute pain. Given the broad spectrum of usage of coniferous essential oils, a fast, safe, simple, and efficient sample-preparation method is needed in the estimation procedure of essential oil components in fresh plant material. Generally, the time- and energy-consuming steam distillation (SD) is applied for this purpose. This paper will compare SD, pressurized liquid extraction (PLE), matrix solid-phase dispersion (MSPD), and the sea sand disruption method (SSDM) as isolation techniques to obtain aroma components from Scots pine (Pinus sylvestris), spruce (Picea abies), and Douglas fir (Pseudotsuga menziesii). According to the obtained data, SSDM is the most efficient sample preparation method in determining the essential oil composition of conifers. Moreover, SSDM requires small organic solvent amounts and a short extraction time, which makes it an advantageous alternative procedure for the routine analysis of coniferous oils. The superiority of SSDM over MSPD efficiency is ascertained, as there are no chemical interactions between the plant cell components and the sand. This fact confirms the reliability and efficacy of SSDM for the analysis of volatile oil components. Copyright © 2011 Verlag Helvetica Chimica Acta AG, Zürich.
Willumsen, Nicholas; Bager, Cecilie L; Leeming, Diana J; Smith, Victoria; Christiansen, Claus; Karsdal, Morten A; Dornan, David; Bay-Jensen, Anne-Christine
Extracellular matrix (ECM) proteins, such as collagen type I and elastin, and intermediate filament (IMF) proteins, such as vimentin, are modified and dysregulated as part of the malignant changes leading to disruption of tissue homeostasis. Noninvasive biomarkers that reflect such changes may have great potential in cancer. Levels of matrix metalloproteinase (MMP)-generated fragments of type I collagen (C1M), of elastin (ELM), and of citrullinated vimentin (VICM) were measured in serum from patients with lung cancer (n = 40), gastrointestinal cancer (n = 25), prostate cancer (n = 14), malignant melanoma (n = 7), chronic obstructive pulmonary disease (COPD) (n = 13), and idiopathic pulmonary fibrosis (IPF) (n = 10), as well as in age-matched controls (n = 33). The area under the receiver operating characteristic curve (AUROC) was calculated and a diagnostic decision tree generated from specific cutoff values. C1M and VICM were significantly elevated in lung cancer patients as compared with healthy controls (AUROC = 0.98, P < 0.0001) and other cancers (AUROC = 0.83, P < 0.0001). A trend was detected when comparing lung cancer with COPD+IPF. No difference could be seen for ELM. Interestingly, C1M and VICM were able to identify patients with lung cancer with a positive predictive value of 0.9 and an odds ratio of 40 (95% CI = 8.7–186, P < 0.0001). Biomarkers specifically reflecting degradation of collagen type I and citrullinated vimentin are applicable to lung cancer patients. Our data indicate that biomarkers reflecting ECM and IMF protein dysregulation are highly applicable in the lung cancer setting. We speculate that these markers may aid in diagnosing and characterizing patients with lung cancer.
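The AUROC statistic used above has a simple rank-based (Mann-Whitney) form: it is the probability that a randomly chosen case scores higher than a randomly chosen control. A minimal sketch with invented biomarker values:

```python
# Rank-based AUROC (Mann-Whitney formulation); ties count as half a win.
def auroc(scores_pos, scores_neg):
    """Probability a random positive scores above a random negative."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented biomarker levels, not the study's serum data
cancer   = [8.1, 7.4, 9.0, 6.8]
controls = [3.2, 4.1, 2.9, 5.0]

print(auroc(cancer, controls))  # 1.0: perfect separation in this toy data
```

An AUROC of 0.5 means no discrimination; the study's 0.98 for lung cancer vs. controls indicates near-perfect separation.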
profitability on behalf of its shareholders and/or owners—that's capitalism. Our job is to protect the interests of the taxpayers and the warfighter while...business, then it was not good. Profit is the fundamental reason that businesses exist: to make money for their owners or shareholders. Without...performance of the service, or we may only be interested in controlling cost at a set level of performance. As we emphasized in BBP 2.0, we have to start
Hudson, A P G; Harris, A M P; Mohamed, N
The mixed dentition pantomogram is routinely used in paediatric patients. This paper discusses the value of the pantomogram for early identification of problems in dental development during the mixed dentition stage. Aspects regarding dental maturity, leeway space, the sequence of eruption of the permanent teeth, anomalies and the development of the canines will be reviewed.
Kundu, B; Eltohamy, M; Yadavalli, V K; Reis, R L; Kim, H W
The assembly of natural proteinaceous biopolymers into macro-scale architectures is of great importance in synthetic biology, soft-material science and regenerative therapy. The self-assembly of proteins tends to be limited by anisotropic interactions among protein molecules, poor solubility and poor stability. Here, we introduce a unique platform to self-immobilize diverse proteins (fibrous and globular, positively and negatively charged, low and high molecular weight) using silicon surfaces with pendant -NH2 groups via a facile one-step diffusion-limited aggregation (DLA) method. All the experimental proteins (type I collagen, bovine serum albumin and cytochrome C) self-assemble into seaweed-like branched dendritic architectures via classical DLA in the absence of any electrolytes. The notable differences in branching architectures are due to dissimilarities in the protein colloidal sub-units, which are typical for each protein type, along with the heterogeneous distribution of surface -NH2 groups. Fractal analysis of the assembled structures is used to explain the underlying route of fractal deposition, showing how proteins with different functionality can yield similar assemblies. Further, the nano/micro-structured surfaces can be used to provide functional topographical cues to study cellular responses, as demonstrated using rat bone marrow stem cells. The results indicate that immobilization of proteins via DLA does not affect their functionality; instead, the deposits serve as topographical cues to guide cell morphology. This indicates a promising design strategy at the tissue-material interface and is anticipated to guide future surface modifications. A cost-effective standard templating strategy is therefore proposed for fundamental and applied particle aggregation studies, which can be used at multiple length scales for biomaterial design and surface reformation.
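The classical DLA growth mechanism invoked here can be illustrated with a toy lattice simulation: random walkers stick when they first touch the growing cluster, producing the branched dendritic morphology. This sketch makes no attempt to model the protein or surface chemistry.

```python
# Toy 2-D diffusion-limited aggregation on a square lattice.
import random

def dla(n_particles, size=41, seed=1):
    random.seed(seed)
    c = size // 2
    stuck = {(c, c)}                      # seed particle at the centre
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    for _ in range(n_particles):
        # launch a walker at a random unoccupied site
        x, y = random.randrange(size), random.randrange(size)
        while (x, y) in stuck:
            x, y = random.randrange(size), random.randrange(size)
        while True:
            # stick as soon as the walker is adjacent to the cluster
            if any((x + dx, y + dy) in stuck for dx, dy in steps):
                stuck.add((x, y))
                break
            # otherwise take one random-walk step, staying on the grid
            dx, dy = random.choice(steps)
            x = min(max(x + dx, 0), size - 1)
            y = min(max(y + dy, 0), size - 1)
    return stuck

cluster = dla(50)
print(len(cluster))   # 51: the seed plus 50 attached walkers
```

Fractal analysis of such clusters (e.g. box-counting the occupied sites) is the kind of measurement the paper applies to the assembled protein structures.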
Radwa Momtaz Abdelsamie Zaki Khalil
Lipoprotein Cholesterol; LDL I, large buoyant LDL; LDL II, intermediate density LDL; LDL III, smaller dense LDL; .... triglycerides ≥150 mg, high density lipoprotein (HDL) <40 mg/dl in men ... sion of phenotype B.4,12 For a given triglyceride level, women were .... that sdLDL/LDL ratio is a very strong predictor of CHD in men;.
Gjerris, Anne Cathrine Roslev; Staer-Jensen, Jette; Jørgensen, Jan Stener
The aim of the present study was (1) to evaluate the relationship between umbilical cord arterial blood lactate and pH, standard base excess (SBE), and actual base excess (ABE) at delivery and (2) to suggest a cut-off level of umbilical cord arterial blood lactate in predicting fetal asphyxia using ROC curves, where an ABE value less than -12 was used as the "gold standard" for significant intrapartum asphyxia.
Gjerris, A.C.; Staer-Jensen, J.; Jorgensen, J.S.
OBJECTIVE: The aim of the present study was (1) to evaluate the relationship between umbilical cord arterial blood lactate and pH, standard base excess (SBE), and actual base excess (ABE) at delivery and (2) to suggest a cut-off level of umbilical cord arterial blood lactate in predicting fetal asphyxia using ROC curves, where an ABE value less than -12 was used as the "gold standard" for significant intrapartum asphyxia. STUDY DESIGN: This is a descriptive study of umbilical cord arterial blood samples from 2554 singleton deliveries. The deliveries took place at the Department of Obstetrics...
Ouarezki, Yasmine; Cizmecioglu, Filiz Mine; Mansour, Chourouk; Jones, Jeremy Huw; Gault, Emma Jane; Mason, Avril; Donaldson, Malcolm D C
Early diagnosis of Turner syndrome (TS) is necessary to facilitate appropriate management, including growth promotion. Not all girls with TS have overt short stature, and comparison with parental height (Ht) is needed for appropriate evaluation. We examined both the prevalence and diagnostic sensitivity of measured parental Ht in a dedicated TS clinic between 1989 and 2013. The lower end of the parental target range (LTR) was calculated from mid-parental Ht (correction factor 12.5 cm) minus 8.5 cm and converted to standard deviation scores (SDS) using UK 1990 data, then compared with patient Ht SDS at first accurate measurement aged > 1 year. Information was available in 172 girls, of whom 142 (82.6%) were short at first measurement. However, both parents had been measured in only 94 girls (54.6%). In 92 of these girls, age at measurement was 6.93 ± 3.9 years, and Ht SDS vs LTR SDS was -2.63 ± 0.94 vs -1.77 ± 0.81. What is Known: • Girls with Turner syndrome are short in relation to parental heights, with untreated final height approximately 20 cm below the female population mean. • Measured parental height is more accurate than reported height. What is New: • In a dedicated Turner clinic, there was 85% sensitivity when comparing patient height standard deviation score at first accurate measurement beyond 1 year of age with the lower end of the parental target range standard deviation. • However, measured height in both parents had been recorded in only 54.6% of the Turner girls attending the clinic. This indicates the need to improve the quality of growth assessment in tertiary care.
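A hedged sketch of the target-range arithmetic described above, assuming the 12.5 cm correction is subtracted from the father's height before averaging (the abstract gives the two factors but not the exact formula layout); the heights are illustrative.

```python
# Illustrative mid-parental height and lower-target-range arithmetic
# for a girl; formula layout is an assumption, heights are invented.
def mid_parental_height_girl(father_cm, mother_cm, correction=12.5):
    """Average parental height after correcting the father's height
    to the female scale by the stated 12.5 cm factor."""
    return (father_cm - correction + mother_cm) / 2

def lower_target_range(father_cm, mother_cm, margin=8.5):
    """Lower end of the parental target range: MPH minus 8.5 cm."""
    return mid_parental_height_girl(father_cm, mother_cm) - margin

print(lower_target_range(178.0, 163.0))  # 155.75 cm for these heights
```

The study then converts both the patient's height and this LTR value to SDS against UK 1990 reference data before comparing them.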
Ertmer, David J.
Background: Newborn hearing screening, early intervention programs, and advancements in cochlear implant and hearing aid technology have greatly increased opportunities for children with hearing loss to become intelligible talkers. Optimizing speech intelligibility requires that progress be monitored closely. Although direct assessment of…
Miles, K.; Volpe, K.
The Millstone nuclear power station has begun an aggressive program to use robotics, which when properly used minimizes operating costs and exposure to personnel. This article describes several new ways of using existing robotic equipment to speed up work processes and provide solutions to difficult problems. The moisture separator pit and liquid radwaste are discussed
Milić Saša D.
The paper presents the SFRA (Sweep Frequency Response Analysis) method for analyzing the frequency response of transformer windings in order to identify potential defects in the geometry of the core and windings. The most frequent problems recognized by SFRA are core shift, shorted or open windings, unwanted contact between core and ground, etc. Comparative analysis of this method against conventional methods was carried out in situ on a transformer under harsh, real industrial conditions. The benefits of the SFRA method are its high reliability and the repeatability of its measurements. The method belongs to the non-invasive category. Due to its high reliability and repeatability, it is very suitable for detecting changes in the geometry of the coil and the core during prophylactic field testing, or after transporting the transformer.
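The comparison step at the heart of SFRA can be sketched simply: a measured frequency response is compared point by point against a reference "fingerprint", and large deviations flag a possible geometry change. The magnitudes and threshold below are invented for illustration.

```python
# Illustrative SFRA-style comparison of two magnitude responses (dB)
# sampled at the same swept frequencies; data and threshold invented.
def max_deviation_db(reference, measured):
    """Largest point-wise deviation between the two responses."""
    return max(abs(m - r) for r, m in zip(reference, measured))

reference = [-10.0, -12.5, -20.0, -35.0]   # dB, healthy fingerprint
measured  = [-10.1, -12.4, -21.5, -35.2]   # dB, after transport

dev = max_deviation_db(reference, measured)
print("suspect" if dev > 1.0 else "ok")    # 1.5 dB deviation -> "suspect"
```

Real SFRA practice compares responses band by band (different failure modes dominate different frequency ranges), but the repeatability the paper emphasizes is what makes such fingerprint comparisons meaningful.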
Filker, Phyllis J; Cook, Nicole; Kodish-Stav, Jodi
The objective of this study was to investigate if electronic patient records have utility in dental school strategic planning. Electronic health records (EHRs) have been used by all predoctoral students and faculty members at Nova Southeastern University's College of Dental Medicine (NSU-CDM) since 2006. The study analyzed patient demographic and caries risk assessment data from October 2006 to May 2011 extracted from the axiUm EHR database. The purpose was to determine if there was a relationship between high oral health care needs and patient demographics, including gender, age, and median income of the zip code where they reside in order to support dental school strategic planning including the locations of future satellite clinics. The results showed that about 51 percent of patients serviced by the Broward County-based NSU-CDM oral health care facilities have high oral health care needs and that about 60 percent of this population resides in zip codes where the average income is below the median income for the county ($41,691). The results suggest that EHR data can be used adjunctively by dental schools when proposing potential sites for satellite clinics and planning for future oral health care programming.
Siqueira, José F; Rôças, Isabela N
This paper reviews the principles of polymerase chain reaction (PCR) methodology, its application in identification of endodontic pathogens and the perspectives regarding the knowledge to be reached with the use of this highly sensitive, specific and accurate methodology as a microbial identification test. Studies published in the medical, dental and biological literature. Evaluation of published epidemiological studies examining the endodontic microbiota through PCR methodology. PCR technology has enabled the detection of bacterial species that are difficult or even impossible to culture as well as cultivable bacterial strains showing a phenotypically divergent or convergent behaviour. Moreover, PCR is more rapid, much more sensitive, and more accurate when compared with culture. Its use in endodontics to investigate the microbiota associated with infected root canals has expanded the knowledge on the bacteria involved in the pathogenesis of periradicular diseases. For instance, Tannerella forsythensis (formerly Bacteroides forsythus), Treponema denticola, other Treponema species, Dialister pneumosintes, and Prevotella tannerae were detected in infected root canals for the first time and in high prevalence when using PCR analysis. The diversity of endodontic microbiota has been demonstrated by studies using PCR amplification, cloning and sequencing of the PCR products. Moreover, other fastidious bacterial species, such as Porphyromonas endodontalis, Porphyromonas gingivalis and some Eubacterium spp., have been reported in endodontic infections at a higher prevalence than those reported by culture procedures.
Simmons, Aaron B; Bloomsburg, Samuel J; Billingslea, Samuel A; Merrill, Morgan M; Li, Shuai; Thomas, Marshall W; Fuerst, Peter G
superior colliculus. Pou4f2(Cre) provides multiple uses for the vision researcher's genetic toolkit. First, Pou4f2(Cre) is a knock-in allele that can be used to eliminate Pou4f2, resulting in depletion of RGCs. Second, expression of Cre in male germ cells makes this strain an efficient germline activator of recombination, for example, to target LoxP-flanked sequences in the whole mouse. Third, Pou4f2(Cre) efficiently targets RGCs, amacrine cells, bipolar cells, horizontal cells, and a small number of photoreceptors within the retina, as well as the visual centers in the brain. Unlike other Cre recombinase lines that target retinal neurons, no recombination was observed in Müller or other retinal glia. These properties make this Cre recombinase line a useful tool for vision researchers.
Adeline Phaik Harn Chua
Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, and most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to address the knowledge gap, this paper presents a future research agenda (in the form of research questions) which can guide the eBusiness research community in conducting much needed studies in this area. This paper is particularly novel in that it aims to demonstrate how the heterogeneity of SMEs and their specific business uses of eBusiness technology such as blogs can form the central plank of a future research agenda. This is important because the existing eBusiness literature tends to treat eBusiness collectively rather than focusing on the specific business uses of different eBusiness technologies, and to treat SMEs as a homogeneous group. The paper concludes with a discussion of how this research agenda can form the basis of studies which use a range of different research methods, and how this "big picture" agenda approach might help the eBusiness research community build theory which better explains SME adoption and use of eBusiness.
Shukla, Vaibhav; Varghese, Vinay Koshy; Kabekkodu, Shama Prasada; Mallya, Sandeep; Satyamoorthy, Kapaettu
Since the discovery of microRNAs (miRNAs), a class of noncoding RNAs that regulate gene expression posttranscriptionally in a sequence-specific manner, a number of tools useful for both basic and advanced applications have been released. This is because of the significance of miRNAs in many pathophysiological conditions, including cancer. The numerous bioinformatics tools developed for miRNA analysis cover detection, expression, function, target prediction and many other related features. This review provides a comprehensive assessment of web-based tools for miRNA analysis that do not require prior knowledge of any computing language. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please email: firstname.lastname@example.org.
Campbell, A. Malcolm; Eckdahl, Todd; Cronk, Brian; Andresen, Corinne; Frederick, Paul; Huckuntod, Samantha; Shinneman, Claire; Wacker, Annie; Yuan, Jason
The "Vision and Change" report recommended genuine research experiences for undergraduate biology students. Authentic research improves science education, increases the number of scientifically literate citizens, and encourages students to pursue research. Synthetic biology is well suited for undergraduate research and is a growing area…
Charged particle activation analysis, based on bombardment with 15 MeV protons from a cyclotron, was used to study frictional wear at the contact zones of cutting tools, roller bearings and gear teeth. The radioactivity of the resulting isotopes, such as Co-56, Co-58 and Re-183, serves as a measure of the mass changes on the tool surfaces. The method is suitable for studying the parameters affecting wear processes and the role of cutting fluid, and also for assessing the economic factors in production planning.
Froehlich, Peter; Lorenz, Tom; Martin, Gunther; Brett, Beate; Bertau, Martin [Institut fuer Technische Chemie, TU Bergakademie Freiberg, Leipziger Strasse 29, 09599, Freiberg (Germany)
This Review provides an overview of valuable metals, the supply of which has been classified as critical for Europe. Starting with a description of the current state of the art, novel approaches for their recovery from primary resources are presented as well as recycling processes. The focus lies on developments since 2005. Chemistry strategies which are used in metal recovery are summarized on the basis of the individual types of deposit and mineral. In addition, the economic importance as well as utilization of the metals is outlined. (copyright 2017 Wiley-VCH Verlag GmbH and Co. KGaA, Weinheim)
Vite T, J.
Valuable metals such as gold, platinum, silver, cobalt, germanium, nickel and zinc, among others, were extracted from foundry sands, as well as highly toxic metals such as chromium, lead, vanadium and arsenic. The extraction efficiency was up to 100% in some cases. For this reason, two United States patents were obtained: patent number 5,356,601, granted in October 1994 for the developed process, and patent number 5,376,000, granted in December 1994 for the equipment employed. The preliminary parameters for the installation of a pilot plant have also been developed. (Author)
Perception and Information Behaviour of Institutional Repository End-Users Provides Valuable Insight for Future Development. A Review of: St. Jean, B., Rieh, S. Y., Yakel, E., & Markey, K. (2011). Unheard voices: Institutional repository end-users. College & Research Libraries, 72(1), 21-42.
likened the IRs they used to a varying array of information resources and tools, including databases, interfaces, servers, online forums, and “static Wikipedia” (p. 27). Furthermore, six of the interviewees had never heard of the actual term “Institutional Repository” (p. 27). How do end-users access and use IRs? The most common methods of accessing IRs included selecting the link on their institution library’s website and Google searches. Many interviewees found out about the IRs they are using through recommendations from professors, peers, or library workshops. Other interviewees found out about particular IRs “simply because a Google search had landed them there” (p. 29). Interviewees’ preferred methods of interacting with an IR were divided between browsing and keyword searching. However, these preferences may have been the result of an IR’s content or interface limitations. For instance, some interviewees expressed difficulties with browsing a particular IR, while another interviewee preferred browsing because “there wasn’t much going on” when searching for a specific topic of interest (p. 30). For what purposes do end-users use IRs? Interviewees commonly cited keeping abreast of research projects from their own university as a reason to access their institutions’ IRs. Student interviewees also used IRs to find examples of theses and dissertations they would be expected to complete. Identifying people doing similar work across different departments in the same institution for collaboration and networking opportunities was another unique purpose for using IRs. How do end-users perceive the credibility of information from IRs? Many interviewees perceived IRs to be more “trustworthy” than Google Scholar (p. 33). In their view, an IR’s credibility was assured by the reputation of its affiliated institution. On the other hand, many interviewees viewed a lack of comprehensiveness in content negatively when judging the credibility of an information
Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Gaarder, Marie
Evidence-gap maps present a new addition to the tools available to support evidence-informed policy making. Evidence-gap maps are thematic evidence collections covering a range of issues such as maternal health, HIV/AIDS, and agriculture. They present a visual overview of existing systematic reviews or impact evaluations in a sector or subsector, schematically representing the types of int...
Jongeling, R.M.; Datta, S.; Serebrenik, A.; Koschke, R.; Krinke, J.; Robillard, M.
Recent years have seen an increasing attention to social aspects of software engineering, including studies of emotions and sentiments experienced and expressed by the software developers. Most of these studies reuse existing sentiment analysis tools such as SentiStrength and NLTK. However, these
Many of us nowadays invest significant amounts of time in sharing our activities and opinions with friends and family via social networking tools such as Facebook, Twitter or other related websites. However, despite the availability of many platforms for scientists to connect and...
Kuru Cetin, Saadet
In this study, in-class lesson observations were made with volunteer teachers working in primary and secondary schools using alternative observation tools regarding the scope of contemporary educational supervision. The study took place during the fall and spring semesters of the 2015-2016 and 2016-2017 academic years and the class observations…
Podhora, A.; Helming, K.; Adenauer, L.; Heckelei, T.; Kautto, P.; Reidsma, P.; Rennings, K.; Turnpenny, J.; Jansen, J.M.L.
Since 2002, the European Commission has employed the instrument of ex-ante impact assessments (IA) to help focus its policy-making process on implementing sustainable development. Scientific tools should play an essential role in providing the evidence base to assess the impacts of alternative
Pazos, Florencio; Chagoyen, Monica
Daily work in molecular biology presently depends on a large number of computational tools. An in-depth, large-scale study of that 'ecosystem' of Web tools, its characteristics, interconnectivity, patterns of usage/citation, temporal evolution and rate of decay is crucial for understanding the forces that shape it and for informing initiatives aimed at its funding, long-term maintenance and improvement. In particular, the long-term maintenance of these tools is compromised by their specific development model. Hundreds of published studies become irreproducible de facto, as the software tools used to conduct them become unavailable. In this study, we present a large-scale survey of >5400 publications describing Web servers within the two main bibliographic resources for disseminating new software developments in molecular biology. For all these servers, we studied their citation patterns, the subjects they address, their citation networks and the temporal evolution of these factors. We also analysed how these factors affect the availability of the servers (whether they are alive). Our results show that this ecosystem of tools is highly interconnected and adapts to the 'trendy' subjects of each moment. The servers present characteristic temporal patterns of citation/usage, and there is a worrying rate of server 'death', which is influenced by factors such as the server's popularity and the institution that hosts it. These results can inform initiatives aimed at the long-term maintenance of these resources.
Verhagen, Evert; Voogt, Nelly; Bruinsma, Anja; Finch, Caroline F
Evidence of effectiveness does not equal successful implementation. To progress the field, practical tools are needed to bridge the gap between research and practice and to truly unite effectiveness and implementation evidence. This paper describes the Knowledge Transfer Scheme integrating existing implementation research frameworks into a tool which has been developed specifically to bridge the gap between knowledge derived from research on the one side and evidence-based usable information and tools for practice on the other.
Wallis, Selina; Cole, Donald C; Gaye, Oumar; Mmbaga, Blandina T; Mwapasa, Victor; Tagbor, Harry; Bates, Imelda
Research is key to achieving global development goals. Our objectives were to develop and test an evidence-informed process for assessing health research management and support systems (RMSS) in four African universities and for tracking interventions to address capacity gaps. Four African universities. 83 university staff and students from 11 cadres. A literature-informed 'benchmark' was developed and used to itemise all components of a university's health RMSS. Data on all components were collected during site visits to four African universities using interview guides, document reviews and facilities observation guides. Gaps in RMSS capacity were identified against the benchmark and institutional action plans developed to remedy gaps. Progress against indicators was tracked over 15 months and common challenges and successes identified. Common gaps in operational health research capacity included no accessible research strategy, a lack of research e-tracking capability and inadequate quality checks for proposal submissions and contracts. Feedback indicated that the capacity assessment was comprehensive and generated practical actions, several of which were no-cost. Regular follow-up helped to maintain focus on activities to strengthen health research capacity in the face of challenges. Identification of each institution's strengths and weaknesses against an evidence-informed benchmark enabled them to identify gaps in their operational health research systems, to develop prioritised action plans, to justify resource requests to fulfil the plans and to track progress in strengthening RMSS. Use of a standard benchmark, approach and tools enabled comparisons across institutions, which has accelerated production of evidence about the science of research capacity strengthening. The tools could be used by institutions seeking to understand their strengths and to address gaps in research capacity. Research capacity gaps that were common to several institutions could be
Poster presented at the Research Bazaar 2015 at Melbourne University, Australia. Conference attendees were asked to share an overview of their project and the digital platforms they used in their research.
Dahl, Jan Erik
In the studied master's course, students participated both as research objects in a digital annotation experiment and as critical investigators of this technology in their semester projects. The students' role paralleled the researcher's role, opening an opportunity for researcher-student co-learning within what is often referred to as…
Sturzenegger, Susi; Johnsson, Kai; Riezman, Howard
Funded by the Swiss National Science Foundation to promote cutting edge research as well as the advancement of young researchers and women, technology transfer, outreach and education, the NCCR (Swiss National Centre of Competence in Research) Chemical Biology is co-led by Howard Riezman, University of Geneva and Kai Johnsson, École Polytechnique Fédérale de Lausanne (EPFL).
Highlights: • A GUI-based intuitive tool for data format analysis is presented. • Data can be viewed in any data type specified by the user in real time. • Analyzed formats are saved and reused as templates for other data of the same form. • Users can easily extract contents in any form by writing a simple script file. • The tool would be useful for exchanging data in collaborative fusion research. - Abstract: An intuitive tool with a graphical user interface (GUI) for analyzing formats and extracting contents of binary data in fusion research is presented. Users can examine structures of binary data at arbitrary addresses by selecting their type from a list of radio buttons in the data inspection window and checking their representations instantly on the computer screen. The result of the analysis is saved in a file which contains information such as the name, data type, start address, and array size of the data. If the array size of some data depends on other data that appear before it, and if the users specify this relation in the inspection window, the resultant file can also be used as a format template for the same series of data. By writing a simple script, users can extract the contents of data to either a text or binary file in the format of their preference. As a real-life example, the tool is applied to the MHD equilibrium data at JT-60U, where poloidal flux data are extracted and converted to a format suitable for contour plotting in another data visualization program. The tool would be useful in collaborative fusion research for exchanging relatively small-size data, which don't fit in well with the standard routine processes.
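The inspect-and-template workflow described above can be sketched in a few lines of Python. This is a minimal illustration of the idea only (interpret bytes at an address as a user-chosen type, save the layout, reuse it for extraction); the function names, type codes, and template format are assumptions, not the actual tool's API.

```python
import struct

# Hypothetical mapping from user-facing type names to struct format codes
# (little-endian, as is common for machine-generated binary data).
TYPE_CODES = {"int32": "<i", "float64": "<d", "uint16": "<H"}

def inspect(data: bytes, offset: int, dtype: str, count: int = 1):
    """Decode `count` values of `dtype` starting at byte `offset`."""
    fmt = TYPE_CODES[dtype]
    size = struct.calcsize(fmt)
    return [struct.unpack_from(fmt, data, offset + i * size)[0]
            for i in range(count)]

def extract(data: bytes, template):
    """Apply a saved layout template: a list of (name, offset, dtype, count)."""
    return {name: inspect(data, off, dt, n) for name, off, dt, n in template}

# Toy record: a 4-byte int header followed by two doubles.
blob = struct.pack("<idd", 2, 1.5, -3.25)
template = [("npoints", 0, "int32", 1), ("flux", 4, "float64", 2)]
out = extract(blob, template)
# out == {"npoints": [2], "flux": [1.5, -3.25]}
```

A real format file would additionally record which field holds the array size of a later field, mirroring the dependency mechanism the abstract describes.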
Kademani, B. S.; Vijai Kumar, *
This paper highlights the information explosion, the need for bibliographic control, and the need for information retrieval tools. It explains the emergence of the Citation Index, the concept of citation indexing, reasons for citing, its structure (print and electronic versions of the Science Citation Index and Social Science Citation Index), and the application of citation indexes. It also discusses search effectiveness, the factors taken into consideration for coverage of journals in citation indexes, Journal Cita...
Full Text Available Since the end of the 19th century the Calabria region in southern Italy has been known for an abundance of grooved stone axes and hammers used during late prehistory. These artefacts are characterized by a wide and often pronounced groove in the middle of the implement, thought to have aided securing the head to a wooden haft. Their widespread presence is known both in prehistoric archaeological literature and in the archaeological collections of various regional and extra-regional museums. At first, scholars did not relate these tools to the rich Calabrian ore deposits and to possible ancient mining activities; they were regarded simply as a variant of ground lithic industry of Neolithic tradition. However, between 1997 and 2012, about 50 tools were discovered in the prehistoric mine of Grotta della Monaca in northern Calabria where there are outcrops of copper and iron ore. This allowed us to recognize their specific mining value and to consider them as a sort of “guide fossil” for the identification of ancient mining districts. This paper presents the results of a study involving over 150 tools from the entire region, effectively demonstrating an almost perfect co-occurrence of grooved axes and hammers with areas rich in mineral resources, especially metalliferous ores.
Schoop, Eric; Kriaučiūnienė, Roma; Brundzaitė, Rasa
This article analyses the needs and possibilities for teaching a new type of virtual collaboration skill to university students currently studying in the business and information systems area. We investigate the possibility of incorporating problem-based group learning and computer-supported tools into university curricula. The empirical research results are presented, which summarize experiences of using the virtual collaborative learning (VCL) environment provided by Business informat...
Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita
Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaptation and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010), was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.
Wekerle, Christine; Vakili, Negar; Stewart, Sherry H; Black, Tara
Researchers in violence prevention areas seek to disseminate work for impact on practice and policy. Knowledge transfer, exchange, and mobilization are common terms for research knowledge utilization, where public communication platforms are playing an increasing role, having a unique capacity to connect stakeholders in advocacy and lived experience, academia, non-governmental organizations, government-supported organizations such as child welfare, and research funding bodies. Social networking platforms provide a communication intervention opportunity to test the effectiveness of research reach. A Canadian Institutes of Health Research-funded team grant in boys' and men's health, focusing on sexual violence (SV) victimization, health, and resilience, undertook an evaluation to examine whether a strategic approach involving a cadre of SV experts (n = 46) and their research increased engagement. Using a unique identifier (#CIHRTeamSV), content was shared on social media (Twitter) within an ABABAB experimental monthly format (A = no sharing; B = sharing content), following a baseline entry of researchers. Active Twitter engagement led to increases in the number of individual profile views, article downloads, and citations. These findings encourage further research into the utility of social media for disseminating sexual violence research, and suggest that social media has developed as a forum for evidence-based conversation on sensitive topics of public health import.
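The ABABAB comparison described above amounts to grouping monthly engagement metrics by condition and comparing the condition means. A toy sketch follows; all numbers are invented for illustration and are not the study's data.

```python
# Monthly conditions (A = no sharing, B = sharing content) and a fabricated
# engagement metric (profile views) for each month.
months = ["A", "B", "A", "B", "A", "B"]
profile_views = [120, 310, 140, 290, 130, 335]

def condition_mean(cond):
    """Mean of the metric over months assigned to condition `cond`."""
    vals = [v for m, v in zip(months, profile_views) if m == cond]
    return sum(vals) / len(vals)

mean_a = condition_mean("A")  # 130.0 for this toy data
mean_b = condition_mean("B")
lift = mean_b - mean_a        # positive lift suggests sharing increased views
```

A real analysis would also account for temporal trends across the alternating phases rather than comparing raw means alone.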
Advanced light sources peer into matter at the atomic and molecular scales, with applications ranging from physics, chemistry, materials science, and advanced energy research, to biology and medicine.
Mulligan, Angela A; Luben, Robert N; Bhaniani, Amit; Parry-Smith, David J; O'Connor, Laura; Khawaja, Anthony P; Forouhi, Nita G; Khaw, Kay-Tee
To describe the research methods for the development of a new open source, cross-platform tool which processes data from the European Prospective Investigation into Cancer and Nutrition Norfolk Food Frequency Questionnaire (EPIC-Norfolk FFQ). A further aim was to compare nutrient and food group values derived from the current tool (FETA, FFQ EPIC Tool for Analysis) with the previously validated but less accessible tool, CAFÉ (Compositional Analyses from Frequency Estimates). The effect of text matching on intake data was also investigated. Cross-sectional analysis of a prospective cohort study-EPIC-Norfolk. East England population (city of Norwich and its surrounding small towns and rural areas). Complete FFQ data from 11 250 men and 13 602 women (mean age 59 years; range 40-79 years). Nutrient and food group intakes derived from FETA and CAFÉ analyses of EPIC-Norfolk FFQ data. Nutrient outputs from FETA and CAFÉ were similar; mean (SD) energy intake from FETA was 9222 kJ (2633) in men, 8113 kJ (2296) in women, compared with CAFÉ intakes of 9175 kJ (2630) in men, 8091 kJ (2298) in women. The majority of differences resulted in one or less quintile change (98.7%). Only mean daily fruit and vegetable food group intakes were higher in women than in men (278 vs 212 and 284 vs 255 g, respectively). Quintile changes were evident for all nutrients, with the exception of alcohol, when text matching was not executed; however, only the cereals food group was affected. FETA produces similar nutrient and food group values to the previously validated CAFÉ but has the advantages of being open source, cross-platform and complete with a data-entry form directly compatible with the software. The tool will facilitate research using the EPIC-Norfolk FFQ, and can be customised for different study populations.
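The quintile-agreement check reported above (98.7% of differences within one quintile) can be illustrated with a small sketch. The intake values below are fabricated for illustration; real FETA and CAFÉ outputs would be loaded from their export files.

```python
# Assign each participant's energy intake to a within-distribution quintile
# under each tool, then count assignments shifting by at most one quintile.
def quintile(value, sorted_values):
    """0-based quintile of `value`, assumed drawn from `sorted_values`."""
    n = len(sorted_values)
    rank = sum(v <= value for v in sorted_values)  # values at or below
    return min(4, max(0, (rank - 1) * 5 // n))

# Fabricated per-participant energy intakes (kJ) from two hypothetical tools.
feta = [8100, 9200, 7400, 10500, 8800, 9900, 7900, 11200, 8500, 9100]
cafe = [8050, 9250, 7350, 10400, 8900, 9850, 7950, 11150, 8600, 9050]

sf, sc = sorted(feta), sorted(cafe)
shifts = [abs(quintile(a, sf) - quintile(b, sc)) for a, b in zip(feta, cafe)]
within_one = sum(s <= 1 for s in shifts) / len(shifts)
# For this toy data every assignment agrees, so within_one == 1.0.
```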
Full Text Available Possibilities of assessing a landscape using succession development stages, monitored with the value of the Mean Individual Biomass (MIB) of carabid beetles and the occurrence of bird species, are discussed on the basis of an example from Poland. Higher variability of the MIB value in space signifies greater biodiversity. Apart from the variability of MIB, it is suggested to adopt the occurrence of the following animals as indicators (in order of importance) representing particularly valuable landscapes: black stork, lesser spotted eagle, white-tailed eagle, wolf, crane and white stork. A higher number of these species and their greater density indicate a higher value of the landscape for biodiversity and ecosystem services, especially carbon sequestration. All these indicators may be useful for assessing measures for sustainable land use.
Jacquiod, Samuel Jehan Auguste; Stenbæk, Jonas; Santos, Susana
…providing microbiologists with substantial amounts of accessible information. We took advantage of public metagenomes in order to investigate microeukaryote communities in a well characterized grassland soil. The data gathered allowed the evaluation of several factors impacting the community structure, including the DNA extraction method, the database choice and also the annotation procedure. While most studies on soil microeukaryotes are based on sequencing of PCR-amplified taxonomic markers (18S rRNA genes, ITS regions), this work represents, to our knowledge, the first report based solely … has been identified. Our analyses suggest that publicly available metagenome data can provide valuable information on soil microeukaryotes for comparative purposes when handled appropriately, complementing the current view provided by ribosomal amplicon sequencing methods…
Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente
To introduce a program for the management of scientific research in a general hospital employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for the identification and implementation of solutions, with subsequent analysis based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and constitute a model that can contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.
complete the Master with a seminar, Nuclear Power Plants, and a Thesis. Within the academic plan, multiple activities are organized related to research reactors and also to nuclear power plants. From the very beginning, the performance of selected experiments in a nuclear reactor was recognized as an extraordinary tool for giving students insight into the principal phenomena associated with the chain reaction and the related engineering problems. These experiments have an intrinsically high cost, associated with the importance of the installation and with the specialized personnel involved. CNEA provides the program with this educational instrument through the RA-1 and RA-3 reactors, located at the Constituyentes and Ezeiza Atomic Centres respectively. Various activities are under way, but the most established, in the Reactor Physics Course, is the estimation of kinetic parameters in the RA-1 reactor. The practice includes three different experiments. Approach to critical and calibration of control rods by the compensation method: starting in a subcritical state with a source, control rod B1 is calibrated against B2 by insertion of the first and withdrawal of the second; the methods used are based on the Point Kinetic Model. Measurement of control rod effectiveness by the rod-drop method: separate rod drops of B1, B2 and B3 from the overall ensemble B1 B2 B3 B4, and a total scram starting with three rods withdrawn and one partially inserted, are used to estimate the reactivity worth of B1, B2, B3 and the scram; the Point Kinetic Model and the Modal Kinetic Model are used. Reactor noise technique for the estimation of the reactor parameters α and Λ.
The kinetic parameters are estimated ensuring that the Point Kinetic Model is valid (detection chambers near the core), that the fluctuation of the fission density is the dominant source of the correlated part of the neutron noise (measurement at low power, <10 kW), and the dominance of the fundamental harmonic (simultaneous use of
Du, Z C; Lv, C F; Hong, M S
A new error modelling and identification method based on the cross grid encoder is proposed in this paper. Generally, there are 21 error components in the geometric error of 3-axis NC machine tools. However, according to our theoretical analysis, the squareness error among the different guideways affects not only the translational error components but also the rotational ones. Therefore, a revised synthetic error model is developed, and the mapping relationship between the error components and the radial motion error of a round workpiece manufactured on the NC machine tool is deduced. This mapping relationship shows that the radial error of circular motion is the combined result of all the error components of the link, worktable, sliding table and main spindle block. Aiming to overcome the solution-singularity shortcoming of traditional error component identification methods, a new multi-step identification method for the error components using cross grid encoder measurement technology is proposed, based on the kinematic error model of the NC machine tool. Firstly, the 12 translational error components of the NC machine tool are measured and identified using the least squares method (LSM) while the machine performs linear motions in the three orthogonal planes: the XOY, XOZ and YOZ planes. Secondly, the circular error tracks are measured while the machine performs circular motions in the same orthogonal planes using the cross grid encoder Heidenhain KGM 182, from which the 9 rotational errors can be identified using LSM. Finally, experimental validation of the above modelling theory and identification method is carried out on a 3-axis CNC vertical machining centre, a Cincinnati 750 Arrow. All 21 error components were successfully measured by this method. The research shows that the multi-step modelling and identification method is well suited for on-machine measurement.
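The least-squares identification step can be illustrated on a deliberately simplified one-component model: a translational error along a guideway modelled as e(x) = e0 + s*x, an offset plus a slope standing in for a squareness-type contribution. The real method fits a 21-component kinematic model to cross grid encoder measurements, so the model and names below are illustrative assumptions only.

```python
def fit_line(xs, es):
    """Ordinary least squares for e(x) = e0 + s*x via the normal equations."""
    n = len(xs)
    sx, se = sum(xs), sum(es)
    sxx = sum(x * x for x in xs)
    sxe = sum(x * e for x, e in zip(xs, es))
    s = (n * sxe - sx * se) / (n * sxx - sx * sx)
    e0 = (se - s * sx) / n
    return e0, s

# Simulated probe positions (mm) and error readings (um), noise-free
# for clarity; real measurements would include noise.
xs = [0, 100, 200, 300, 400]
true_e0, true_s = 1.2, 0.004
es = [true_e0 + true_s * x for x in xs]

e0, s = fit_line(xs, es)
# Recovers e0 == 1.2 and s == 0.004 exactly on noise-free data.
```

With 21 components, the same idea becomes an overdetermined linear system solved by LSM across all measured trajectories, which is where the multi-step scheme avoids the singularity of solving everything at once.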
McCormick, Tyler H.; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S.
Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users--a key component of much social science research--remains a…
Sools, Anna Maria
In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that
Carver, Cynthia L.; Klein, C. Suzanne
This paper introduces the use of action research to examine the content and outcomes of university-based leadership preparation programs. Using examples drawn from an ongoing action research project with candidates in a master's level principal preparation program, we demonstrate how the collection and analysis of candidate's written reflections,…
Silva, Alcino J.; Müller, Klaus-Robert
The sheer volume and complexity of publications in the biological sciences are straining traditional approaches to research planning. Nowhere is this problem more serious than in molecular and cellular cognition, since in this neuroscience field, researchers routinely use approaches and information from a variety of areas in neuroscience and other…
NASA's strategic goals include advancing knowledge and opportunity in space and improving life on Earth. We support these goals through extensive programs in space and Earth science research accomplished via space-based missions and research funding. NASA's "system" is configured to conduct science using (1) in-house personnel and (2) grants, contracts, and agreements with external entities (academia, industry, international space agencies).
Full Text Available Many undergraduate laboratories are, too often, little more than an exercise in “cooking” where students are instructed step-by-step what to add, mix, and, most unfortunately, expect as an outcome. Although the shortcomings of “cookbook” laboratories are well known, they are considerably easier to manage than the more desirable inquiry-based laboratories. Thus the ability to quickly access, share, sort, and analyze research data would make a significant contribution towards the feasibility of teaching/mentoring large numbers of inexperienced students in an inquiry-based research environment, as well as facilitating research collaborations among students. Herein we report on a software tool (MicroTracker designed to address the educational problems that we experienced with inquiry-based research education due to constraints on data management and accessibility.
Ogao, Patrick J
Abstract Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, spatio-temporal maps have become vital tools in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they be accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed...
Qualitative evaluation of the implementation of the Interdisciplinary Management Tool: a reflective tool to enhance interdisciplinary teamwork using Structured, Facilitated Action Research for Implementation.
Nancarrow, Susan A; Smith, Tony; Ariss, Steven; Enderby, Pamela M
Reflective practice is used increasingly to enhance team functioning and service effectiveness; however, there is little evidence of its use in interdisciplinary teams. This paper presents the qualitative evaluation of the Interdisciplinary Management Tool (IMT), an evidence-based change tool designed to enhance interdisciplinary teamwork through structured team reflection. The IMT incorporates three components: an evidence-based resource guide; a reflective implementation framework based on Structured, Facilitated Action Research for Implementation methodology; and formative and summative evaluation components. The IMT was implemented with intermediate care teams supported by independent facilitators in England. Each intervention lasted 6 months and was evaluated over a 12-month period. Data sources include interviews, a focus group with facilitators, questionnaires completed by team members and documentary feedback from structured team reports. Data were analysed qualitatively using the Framework approach. The IMT was implemented with 10 teams, including 253 staff from more than 10 different disciplines. Team challenges included lack of clear vision; communication issues; limited career progression opportunities; inefficient resource use; need for role clarity and service development. The IMT successfully engaged staff in the change process, and resulted in teams developing creative strategies to address the issues identified. Participants valued dedicated time to focus on the processes of team functioning; however, some were uncomfortable with a focus on teamwork at the expense of delivering direct patient care. The IMT is a relatively low-cost, structured, reflective way to enhance team function. It empowers individuals to understand and value their own, and others' roles and responsibilities within the team; identify barriers to effective teamwork, and develop and implement appropriate solutions to these. To be successful, teams need protected time to take
Atkinson, Nancy L; Massett, Holly A; Mylks, Christy; McCormack, Lauren A; Kish-Doto, Julia; Hesse, Bradford W; Wang, Min Qi
Informatics applications have the potential to improve participation in clinical trials, but their design must be based on user-centered research. This research used a fully counterbalanced experimental design to investigate the effect of changes made to the original version of a website, http://BreastCancerTrials.org/, and confirm that the revised version addressed and reinforced patients' needs and expectations. Participants included women who had received a breast cancer diagnosis within the last 5 years (N=77). They were randomized into two groups: one group used and reviewed the original version first followed by the redesigned version, and the other group used and reviewed them in reverse order. The study used both quantitative and qualitative measures. During use, participants' click paths and general reactions were observed. After use, participants were asked to answer survey items and open-ended questions to indicate their reactions, which version they preferred, and which better met their needs and expectations. Overall, the revised version of the site was preferred and perceived to be clearer, easier to navigate, more trustworthy and credible, and more private and safe overall. However, users who viewed the original version last had similar attitudes toward both versions. By applying research findings to the redesign of a website for clinical trial searching, it was possible to re-engineer the interface to better support patients' decisions to participate in clinical trials. The mechanisms of action in this case appeared to revolve around creating an environment that supported a sense of personal control and decisional autonomy.
Reuter, Katja; Ukpolo, Francis; Ward, Edward; Wilson, Melissa L; Angyan, Praveen
Background Understanding the relationship between organizational context and research utilization is key to reducing the research-practice gap in health care. This is particularly true in the residential long term care (LTC) setting where relatively little work has examined the influence of context on research implementation. Reliable, valid measures and tools are a prerequisite for studying organizational context and research utilization. Few such tools exist in German. We thus translated three such tools (the Alberta Context Tool and two measures of research use) into German for use in German residential LTC. We point out challenges and strategies for their solution unique to German residential LTC, and demonstrate how resolving specific challenges in the translation of the health care aide instrument version streamlined the translation process of versions for registered nurses, allied health providers, practice specialists, and managers. Methods Our translation methods were based on best practices and included two independent forward translations, reconciliation of the forward translations, expert panel discussions, two independent back translations, reconciliation of the back translations, back translation review, and cognitive debriefing. Results We categorized the challenges in this translation process into seven categories: (1) differing professional education of Canadian and German care providers, (2) risk that German translations would become grammatically complex, (3) wordings at risk of being misunderstood, (4) phrases/idioms non-existent in German, (5) lack of corresponding German words, (6) limited comprehensibility of corresponding German words, and (7) target persons’ unfamiliarity with activities detailed in survey items. Examples of each challenge are described with strategies that we used to manage the challenge. Conclusion Translating an existing instrument is complex and time-consuming, but a rigorous approach is necessary to obtain instrument
Brevik, Eric C.; Lindbo, David L.; Belcher, Christopher
Several studies crossing numerous disciplinary boundaries have demonstrated that undergraduate students benefit from research experiences. These benefits include personal and intellectual development, more and closer contact with faculty, the use of active learning techniques, the creation of high expectations, the development of creative and problem-solving skills, and the development of greater independence and intrinsic motivation to learn. The discipline also gains in that studies show undergraduates who engage in research experiences are more likely to remain science majors and finish their degree program. Research experiences come as close as possible to allowing undergraduates to experience what it is like to be an academic or research member of their profession working to advance their discipline, therefore enhancing their professional socialization into their chosen field. If the goals achieved by undergraduate research include introducing these students to the advancement of their chosen field, it stands to reason the ultimate ending to this experience would be the publication of a peer-reviewed paper. While not all undergraduate projects will end with a product worthy of peer-reviewed publication, some definitely do, and the personal experience of the authors indicates that undergraduate students who achieve publication get great satisfaction and a sense of personal achievement from that publication. While a top-tier international journal probably isn't going to be the ultimate destination for many of these projects, there are several appropriate outlets. The SSSA journal Soil Horizons has published several undergraduate projects in recent years, and good undergraduate projects can often be published in state academy of science journals. Journals focused expressly on publishing undergraduate research include the Journal of Undergraduate Research and Scholarly Excellence, Reinvention, and the American Journal of Undergraduate Research. Case studies of
Hollins Martin, Caroline J; Forrest, Eleanor; Wylie, Linda; Martin, Colin R
The NMSF (2009) survey reported that bereavement midwife care was inadequate in a number of UK NHS Trusts. Using a small grant from the Scottish government, three experienced midwifery lecturers designed an interactive workbook called "Shaping bereavement care for midwives in clinical practice" for the purpose of improving delivery of bereavement education to student midwives. An instrument called the Understanding Bereavement Evaluation Tool (UBET) was designed to measure the effectiveness of the workbook at equipping students with essential knowledge. To assess the validity and reliability of the UBET at measuring midwives' self-perceptions of knowledge surrounding delivery of bereavement care to childbearing women, partners and families who have experienced childbirth-related bereavement, an evaluative audit using the UBET was undertaken to explore student midwives' (n=179) self-perceived knowledge levels before and after the workbook intervention. Validity tests have shown that the UBET (6-item version) could be considered a psychometrically robust instrument for assessing students' knowledge gain. PCA identified that the UBET comprised two sub-scales (theoretical knowledge base, Q1-3, and psychosocial elements of care delivery, Q4-6). Data have shown that the short, easy-to-administer 6-item UBET is a valid and reliable tool for educators to measure success at delivering education using the "Shaping bereavement care for midwives in clinical practice" workbook. Copyright © 2012 Elsevier Ltd. All rights reserved.
Maprelian, Eduardo; Cabral, Eduardo L.L.; Silva, Antonio T. e
Loss of coolant accidents (LOCA) in pool-type research reactors are normally considered limiting in the licensing process. This paper verifies the viability of the computer code 3D-AIRLOCA for analyzing LOCA in a pool-type research reactor, and also develops two computer codes, LOSS and TEMPLOCA. The computer code LOSS determines the time to draw the pool down to the level of the bottom of the core, and the computer code TEMPLOCA calculates the peak fuel element temperature during the transient. These two codes substitute for 3D-AIRLOCA in the LOCA analysis for pool-type research reactors. (author)
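As an illustration of the kind of drain-down estimate a code like LOSS performs, the sketch below applies Torricelli's law to a pool draining through a break. The function name, geometry parameters, and discharge coefficient are assumptions for illustration, not values taken from the paper:

```python
import math

def drain_time(pool_area, break_area, h0, h1, cd=0.6, g=9.81):
    """Estimate the time (s) for the pool level to fall from h0 to h1 (m)
    through a break of area break_area (m^2), assuming quasi-steady
    Torricelli outflow with discharge coefficient cd.

    Integrating A_pool * dh/dt = -cd * A_break * sqrt(2*g*h) gives:
    t = (A_pool / (cd * A_break)) * sqrt(2/g) * (sqrt(h0) - sqrt(h1))
    """
    return (pool_area / (cd * break_area)) * math.sqrt(2.0 / g) * (
        math.sqrt(h0) - math.sqrt(h1))

# Hypothetical example: a 10 m^2 pool draining through a 1 cm^2 break,
# level falling from 9 m to 4 m above the core.
t = drain_time(pool_area=10.0, break_area=1e-4, h0=9.0, h1=4.0)
```

A real LOCA analysis would of course account for break geometry, flow regime, and decay heat; this sketch only shows the closed-form drain-down integral that such a calculation builds on.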
Nikolian, Vahagn C; Ibrahim, Andrew M
Journals fill several important roles within academic medicine, including building knowledge, validating quality of methods, and communicating research. This section provides an overview of these roles and highlights innovative approaches journals have taken to enhance dissemination of research. As journals move away from print formats and embrace web-based content, design-centered thinking will allow for engagement of a larger audience. Examples of recent efforts in this realm are provided, as well as simplified strategies for developing visual abstracts to improve dissemination via social media. Finally, we home in on principles of learning and education that have driven these advances in multimedia-based communication in scientific research.
Thiel, William H; Giangrande, Paloma H
The development of DNA and RNA aptamers for research as well as diagnostic and therapeutic applications is a rapidly growing field. In the past decade, the process of identifying aptamers has been revolutionized with the advent of high-throughput sequencing (HTS). However, bioinformatics tools that enable the average molecular biologist to analyze these large datasets and expedite the identification of candidate aptamer sequences have been lagging behind the HTS revolution. The Galaxy Project was developed in order to efficiently analyze genome, exome, and transcriptome HTS data, and we have now applied these tools to aptamer HTS data. The Galaxy Project's public webserver is an open source collection of bioinformatics tools that are powerful, flexible, dynamic, and user friendly. The online nature of the Galaxy webserver and its graphical interface allow users to analyze HTS data without compiling code or installing multiple programs. Herein we describe how tools within the Galaxy webserver can be adapted to pre-process, compile, filter and analyze aptamer HTS data from multiple rounds of selection. Copyright © 2015 Elsevier Inc. All rights reserved.
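To illustrate the kind of filtering step described above (not part of the Galaxy toolset itself, and the function name is hypothetical), a minimal Python sketch can flag candidate aptamers as sequences whose relative frequency rises monotonically across successive rounds of selection:

```python
from collections import Counter

def enrichment(rounds):
    """Given one list of sequence reads per selection round, return the
    sequences from the final round whose relative frequency strictly
    increases from round to round (a simple enrichment criterion)."""
    # Per-round relative frequencies of each unique sequence.
    freqs = []
    for seqs in rounds:
        counts = Counter(seqs)
        total = sum(counts.values())
        freqs.append({s: c / total for s, c in counts.items()})
    # Keep final-round sequences with strictly rising frequency.
    candidates = []
    for seq in freqs[-1]:
        trajectory = [f.get(seq, 0.0) for f in freqs]
        if all(b > a for a, b in zip(trajectory, trajectory[1:])):
            candidates.append(seq)
    return candidates

# Toy example with two rounds: "A" is enriched, "B" is flat, "C" declines.
hits = enrichment([["A", "B", "C", "C"], ["A", "A", "B", "C"]])  # → ["A"]
```

Real aptamer HTS pipelines also collapse sequencing errors, trim constant primer regions, and cluster near-identical sequences before ranking; this sketch shows only the round-over-round frequency comparison at the core of such filters.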
Jahurul, M H A; Zaidul, I S M; Ghafoor, Kashif; Al-Juhaimi, Fahad Y; Nyam, Kar-Lin; Norulaini, N A N; Sahena, F; Mohd Omar, A K
The large amount of waste produced by the food industries causes serious environmental problems and also results in economic losses if not utilized effectively. Different research reports have revealed that food industry by-products can be good sources of potentially valuable bioactive compounds. As such, the mango juice industry uses only the edible portions of the mangoes, and a considerable amount of peels and seeds are discarded as industrial waste. These mango by-products come from the tropical or subtropical fruit processing industries. Mango by-products, especially seeds and peels, are considered to be cheap sources of valuable food and nutraceutical ingredients. The main uses of natural food ingredients derived from mango by-products are presented and discussed, and the mainstream sectors of application for these by-products, such as in the food, pharmaceutical, nutraceutical and cosmetic industries, are highlighted. Copyright © 2015 Elsevier Ltd. All rights reserved.
National Institutes of Health, Department of Health and Human Services — Research projects funded by the National Institutes of Health (NIH), other DHHS Operating Divisions (AHRQ, CDC, FDA, HRSA, SAMHSA), and the Department of Veterans...