WorldWideScience

Sample records for previously developed computer

  1. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that, despite many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potentials of cloud computing and its implementation on the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies’ previous experiences with cloud computing services.

  2. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

    Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive culture of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  3. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean were studied: 100 had antenatal CT pelvimetry for assessment of the pelvis, and 119 did not have CT pelvimetry and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section: 23 women (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 women (28%) underwent emergency cesarean section after a trial of labor. In the group who did not have CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight and Apgar scores in either group. There was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)

  4. Low-dose computed tomography image restoration using previous normal-dose scan

    Science.gov (United States)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (i.e., deliver less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise, and the noise will propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a previously scanned normal-dose CT image of high diagnostic quality may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of the nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use of
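
    For illustration, the following is a minimal 2-D sketch of the prior-scan-induced weighting idea (not the paper's implementation): patch similarity is measured between the low-dose image and a roughly registered normal-dose prior, and the prior's intensities are averaged with the resulting weights. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

def ndi_nlm(low_dose, prior, patch=3, search=5, h=0.02):
    """Toy previous-scan-induced nonlocal means on 2-D slices.

    For each pixel, patches around candidate pixels of the (roughly
    registered) normal-dose prior are compared with the local low-dose
    patch; similar prior pixels get larger weights. `h` controls the
    smoothing and, per the abstract, would be tied to the noise levels
    of the two protocols -- here it is just a free parameter.
    """
    pr, sr = patch // 2, search // 2
    pad_low = np.pad(low_dose, pr + sr, mode="reflect")
    pad_pri = np.pad(prior, pr + sr, mode="reflect")
    out = np.zeros_like(low_dose, dtype=float)
    H, W = low_dose.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + pr + sr, j + pr + sr
            ref = pad_low[ci - pr:ci + pr + 1, cj - pr:cj + pr + 1]
            wsum = vsum = 0.0
            for di in range(-sr, sr + 1):
                for dj in range(-sr, sr + 1):
                    ni, nj = ci + di, cj + dj
                    cand = pad_pri[ni - pr:ni + pr + 1, nj - pr:nj + pr + 1]
                    d2 = float(np.mean((ref - cand) ** 2))
                    w = np.exp(-d2 / (h * h))
                    wsum += w
                    vsum += w * pad_pri[ni, nj]
            out[i, j] = vsum / wsum
    return out
```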

  5. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  6. Green Computing In Developed And Developing Countries

    OpenAIRE

    Taruna, S; Singh, Pratibha; Joshi, Soshya

    2014-01-01

    Today e-waste is becoming a major problem for developing countries. E-waste is defined as discarded parts of electronic devices, such as computer components, which most of the time contain hazardous chemicals that are deadly to our environment. Green Computing is the study and practice of designing, using, disposing of, and manufacturing electronic components in an eco-friendly manner, and Green Computing is one of the solutions to tackle this hazardous e-waste problem w...

  7. Predictive factors for the development of diabetes in women with previous gestational diabetes mellitus

    DEFF Research Database (Denmark)

    Damm, P.; Kühl, C.; Bertelsen, Aksel

    1992-01-01

    OBJECTIVES: The purpose of this study was to determine the incidence of diabetes in women with previous dietary-treated gestational diabetes mellitus and to identify predictive factors for development of diabetes. STUDY DESIGN: Two to 11 years post partum, glucose tolerance was investigated in 241 women with previous dietary-treated gestational diabetes mellitus and 57 women without previous gestational diabetes mellitus (control group). RESULTS: Diabetes developed in 42 (17.4%) women with previous gestational diabetes mellitus (3.7% insulin-dependent diabetes mellitus and 13.7% non-insulin-dependent diabetes mellitus). Diabetes did not develop in any of the controls. Predictive factors for diabetes development were fasting glucose level at diagnosis (high glucose, high risk), preterm delivery, and an oral glucose tolerance test result that showed diabetes 2 months post partum. In a subgroup...

  8. Development and Validation of a Mobile Computer Anxiety Scale

    Science.gov (United States)

    Wang, Yi-Shun

    2007-01-01

    Although researchers have developed various scales for measuring users' computer anxiety or Internet anxiety, none of the literature has addressed the measurement of mobile computer anxiety (MCA). The purpose of this study is to develop and validate a multidimensional mobile computer anxiety scale (MCAS) based on previous research on computer…

  9. 78 FR 35263 - Freeport LNG Development, L.P.; Application for Blanket Authorization To Export Previously...

    Science.gov (United States)

    2013-06-12

    ... the LNG at the time of export. The Application was filed under section 3 of the Natural Gas Act (NGA... not prohibited by U.S. law or policy. Current Application The current Application is filed in... Freeport LNG Development, L.P.; Application for Blanket Authorization To Export Previously Imported...

  10. Developing Reading Comprehension through Metacognitive Strategies: A Review of Previous Studies

    Science.gov (United States)

    Channa, Mansoor Ahmed; Nordin, Zaimuariffudin Shukri; Siming, Insaf Ali; Chandio, Ali Asgher; Koondher, Mansoor Ali

    2015-01-01

    This paper has reviewed previous studies on metacognitive strategies based on planning, monitoring, and evaluating in order to develop reading comprehension. The main purpose of this review of the metacognition and reading domain is to help readers enhance their reading capabilities and power through these strategies. The researchers reviewed…

  11. Computed tomography in the evaluation of abdominal fat distribution associated with a hyperlipidic diet in previously undernourished rats

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Carlos Alberto Soares da [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Faculdade de Ciencias Medicas. Program of Post-graduation in Clinical and Experimental Physiopathology; Alves, Erika Gomes; Gonzalez, Gabriele Paula; Barbosa, Thais Barcellos Cortez; Lima, Veronica Demarco; Nascimento, Renata; Moura, Egberto Gaspar de; Saba, Celly Cristina Alves do Nascimento [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Inst. de Biologia Roberto Alcantara Gomes. Dept. of Physiological Sciences]. E-mail: cellysaba@terra.com.br; Monteiro, Alexandra Maria Vieira [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Faculdade de Ciencias Medicas

    2007-09-15

    Objective: To study, by means of computed tomography, the repercussion of post-weaning dietary supplementation with soy oil or canola oil on the abdominal fat distribution in previously undernourished rats. Materials and methods: Dams were submitted to 50% food restriction (FR) and compared with dams receiving a standard diet (C). After weaning, undernourished rats received a diet supplemented with 19% soy oil (19% FR-soy) or 19% canola oil (19% FR-canola). Rats in the control group received a diet with 7% soy oil (7% C-soy) until the end of the experimental period. At the age of 60 days, the rats were submitted to computed tomography for evaluation of total abdominal and visceral fat areas. The rats' length and body mass were evaluated and, after sacrifice, the abdominal fat depots were excised and weighed. The data are reported as mean ± standard error of the mean, with p < 0.05 considered as the significance level. Results: Rats in the 19% FR groups presented similar length, body weight and visceral fat mass, and these evaluations were significantly lower than in the control group (7% C-soy). However, computed tomography found significant differences in abdominal fat distribution between the 19% FR-soy and 19% FR-canola groups. Conclusion: Computed tomography demonstrated that the abdominal fat distribution may depend on the type of vegetable oil included in the diet. (author)
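
    The paper does not spell out its image-analysis steps; a common way to quantify fat on a CT slice is to threshold the adipose Hounsfield-unit range (about -190 to -30 HU) and count pixels. A hedged sketch along those lines; the function name and the HU window are assumptions, not values from the study:

```python
import numpy as np

def fat_areas(hu_slice, visceral_mask, pixel_mm2, lo=-190.0, hi=-30.0):
    """Estimate total abdominal and visceral fat areas on one CT slice.

    `hu_slice` is a 2-D array of Hounsfield units; adipose tissue is
    commonly taken as the -190..-30 HU range (an assumption here).
    `visceral_mask` flags pixels inside the abdominal muscle wall, and
    `pixel_mm2` is the in-plane area of one pixel.
    """
    fat = (hu_slice >= lo) & (hu_slice <= hi)
    total_mm2 = fat.sum() * pixel_mm2
    visceral_mm2 = (fat & visceral_mask).sum() * pixel_mm2
    return total_mm2, visceral_mm2
```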

  12. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package; task analysis was performed with custom software designed to interface with a commercial database management program; Job Performance Measures (tests) were generated by a custom program from data in the task analysis database; and training materials were drafted, edited, and produced using commercial word processing software

  13. Frequency and clinical significance of previously undetected incidental findings detected on computed tomography simulation scans for breast cancer patients.

    Science.gov (United States)

    Nakamura, Naoki; Tsunoda, Hiroko; Takahashi, Osamu; Kikuchi, Mari; Honda, Satoshi; Shikama, Naoto; Akahane, Keiko; Sekiguchi, Kenji

    2012-11-01

    To determine the frequency and clinical significance of previously undetected incidental findings found on computed tomography (CT) simulation images for breast cancer patients. All CT simulation images were first interpreted prospectively by radiation oncologists and then double-checked by diagnostic radiologists. The official reports of CT simulation images for 881 consecutive postoperative breast cancer patients from 2009 to 2010 were retrospectively reviewed. Potentially important incidental findings (PIIFs) were defined as any previously undetected benign or malignancy-related findings requiring further medical follow-up or investigation. For all patients in whom a PIIF was detected, we reviewed the clinical records to determine the clinical significance of the PIIF. If the findings from the additional studies prompted by a PIIF required a change in management, the PIIF was also recorded as a clinically important incidental finding (CIIF). There were a total of 57 (6%) PIIFs. The 57 patients in whom a PIIF was detected were followed for a median of 17 months (range, 3-26). Six cases of CIIFs (0.7% of total) were detected. Of the six CIIFs, three (50%) cases had not been noted by the radiation oncologist until the diagnostic radiologist detected the finding. On multivariate analysis, previous CT examination was an independent predictor for PIIF (p = 0.04). Patients who had not previously received chest CT examinations within 1 year had a statistically significantly higher risk of PIIF than those who had received CT examinations within 6 months (odds ratio, 3.54; 95% confidence interval, 1.32-9.50; p = 0.01). The rate of incidental findings prompting a change in management was low. However, radiation oncologists appear to have some difficulty in detecting incidental findings that require a change in management. Considering cost, it may be reasonable that routine interpretations be given to those who have not received previous chest CT examinations within 1 year.

  14. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
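
    Both ASP estimates mentioned in the abstract are simple ratios: the conventional diameter-based estimate treats the bone as a concentric tube, so the air fraction of the cross-section is the squared ratio of inner to outer diameter, while a CT-based estimate counts air versus bone voxels directly. A minimal sketch of both conventions:

```python
def asp_from_diameters(inner_d, outer_d):
    """Conventional cross-sectional ASP estimate for a tubular bone:
    air area / total area = (inner/outer)**2 for concentric circles."""
    return (inner_d / outer_d) ** 2

def asp_from_ct(air_voxels, bone_voxels):
    """CT-based ASP: fraction of the bone's volume that is air."""
    return air_voxels / (air_voxels + bone_voxels)

# A hypothetical phalanx section, 8 mm internal and 9 mm external
# diameter: asp_from_diameters(8, 9) ~= 0.79, i.e. highly pneumatized.
```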

  15. Mentoring to develop research self-efficacy, with particular reference to previously disadvantaged individuals

    OpenAIRE

    S. Schulze

    2010-01-01

    The development of inexperienced researchers is crucial. In response to the lack of research self-efficacy of many previously disadvantaged individuals, the article examines how mentoring can enhance the research self-efficacy of mentees. The study is grounded in the self-efficacy theory (SET) – an aspect of the social cognitive theory (SCT). Insights were gained from an in-depth study of SCT, SET and mentoring, and from a completed mentoring project. This led to the formulation of three basi...

  16. Project management for cloud computing development

    Directory of Open Access Journals (Sweden)

    Paul POCATILU

    2010-04-01

    This article deals with the impact of employing cloud computing architectures in the field of software systems development. We analyze the individual influence of the cloud computing model characteristics on the project development process.

  17. Mentoring to develop research self-efficacy, with particular reference to previously disadvantaged individuals

    Directory of Open Access Journals (Sweden)

    S. Schulze

    2010-07-01

    The development of inexperienced researchers is crucial. In response to the lack of research self-efficacy of many previously disadvantaged individuals, the article examines how mentoring can enhance the research self-efficacy of mentees. The study is grounded in the self-efficacy theory (SET) – an aspect of the social cognitive theory (SCT). Insights were gained from an in-depth study of SCT, SET and mentoring, and from a completed mentoring project. This led to the formulation of three basic principles. Firstly, institutions need to provide supportive environmental conditions that facilitate research self-efficacy. This implies a supportive and efficient collective system. The possible effects of performance ratings and reward systems at the institution also need to be considered. Secondly, mentoring needs to create opportunities for young researchers to experience successful learning as a result of appropriate action. To this end, mentees need to be involved in actual research projects in small groups. At the same time the mentor needs to facilitate skills development by coaching and encouragement. Thirdly, mentors need to encourage mentees to believe in their ability to successfully complete research projects. This implies encouraging positive emotional states, stimulating self-reflection and self-comparison with others in the group, giving positive evaluative feedback and being an intentional role model.

  18. Computer Graphics for Multimedia and Hypermedia Development.

    Science.gov (United States)

    Mohler, James L.

    1998-01-01

    Discusses several theoretical and technical aspects of computer-graphics development that are useful for creating hypermedia and multimedia materials. Topics addressed include primary bitmap attributes in computer graphics, the jigsaw principle, and raster layering. (MSE)

  19. Rate of torque and electromyographic development during anticipated eccentric contraction is lower in previously strained hamstrings.

    Science.gov (United States)

    Opar, David A; Williams, Morgan D; Timmins, Ryan G; Dear, Nuala M; Shield, Anthony J

    2013-01-01

    The effect of prior strain injury on myoelectrical activity of the hamstrings during tasks requiring high rates of torque development has received little attention. To determine whether recreational athletes with a history of unilateral hamstring strain injury would exhibit lower levels of myoelectrical activity during eccentric contraction, rate of torque development (RTD), and impulse (IMP) at 30, 50, and 100 milliseconds after the onset of myoelectrical activity or torque development in the previously injured limb compared with the uninjured limb. Case control study; Level of evidence, 3. Twenty-six recreational athletes were recruited. Of these, 13 athletes had a history of unilateral hamstring strain injury (all confined to biceps femoris long head), and 13 had no history of hamstring strain injury. Following familiarization, all athletes undertook isokinetic dynamometry testing and surface electromyography (integrated EMG; iEMG) assessment of the biceps femoris long head and medial hamstrings during eccentric contractions at -60 and -180 deg·s(-1). In the injured limb of the injured group, compared with the contralateral uninjured limb, RTD and IMP were lower during -60 deg·s(-1) eccentric contractions at 50 milliseconds (RTD: injured limb, 312.27 ± 191.78 N·m·s(-1) vs uninjured limb, 518.54 ± 172.81 N·m·s(-1), P = .008; IMP: injured limb, 0.73 ± 0.30 N·m·s vs uninjured limb, 0.97 ± 0.23 N·m·s, P = .005) and 100 milliseconds (RTD: injured limb, 280.03 ± 131.42 N·m·s(-1) vs uninjured limb, 460.54 ± 152.94 N·m·s(-1), P = .001; IMP: injured limb, 2.15 ± 0.89 N·m·s vs uninjured limb, 3.07 ± 0.63 N·m·s, P contraction. Biceps femoris long head muscle activation was lower at 100 milliseconds at both contraction speeds (-60 deg·s(-1), normalized iEMG activity [×1000]: injured limb, 26.25 ± 10.11 vs uninjured limb, 33.57 ± 8.29, P = .009; -180 deg·s(-1), normalized iEMG activity [×1000]: injured limb, 31.16 ± 10.01 vs uninjured limb, 39.64
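
    For illustration, RTD and impulse over fixed windows after contraction onset are typically computed from a sampled torque trace as a slope and a time integral, respectively. A minimal sketch under those common conventions (not the study's analysis code):

```python
import numpy as np

def rtd_and_impulse(torque, fs, onset_idx, windows_ms=(30, 50, 100)):
    """Rate of torque development and impulse after contraction onset.

    `torque` is a 1-D trace in N*m sampled at `fs` Hz; `onset_idx`
    marks onset (e.g. where torque first exceeds a small threshold
    above baseline). Returns {window_ms: (RTD in N*m/s, IMP in N*m*s)}.
    """
    out = {}
    for ms in windows_ms:
        n = int(round(fs * ms / 1000.0))
        seg = torque[onset_idx:onset_idx + n + 1]
        rtd = (seg[-1] - seg[0]) / (ms / 1000.0)   # slope from onset
        imp = np.trapz(seg, dx=1.0 / fs)           # time integral
        out[ms] = (rtd, imp)
    return out
```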

  20. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that businesses can gain from using these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – Qualitative case study of the results obtained via interviews. Three research questions were raised: Q1: How can a company benefit from using Cloud Computing compared to other solutions?; Q2: What are possible issues that occur with Cloud Computing?; Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Originality/value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  1. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  2. Physics Computer Development Project (PCDP), Progress Report.

    Science.gov (United States)

    Bork, Alfred M.

    This report discusses the development and implementation of computer-related teaching materials for undergraduate physics courses. A list of the computer dialogs developed, with a short description of each, is included. The types of dialog are: (1) development of an interactive proof, (2) assistance in problem solving, (3) diagnosing and filling…

  3. COMPUTER SCIENCE DEVELOPMENTS RELEVANT TO PSYCHOLOGY.

    Science.gov (United States)

    on-line control of experiments by man-machine interaction. The developments in computer science which make these applications possible are discussed ... in some detail. In addition, there are conceptual developments in computer science, particularly in the study of artificial intelligence, which may provide leads in the development of psychological theory. (Author)

  4. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    world in terms of revenue. Hitachi: Hitachi is a large conglomerate company that makes railroad locomotives, industrial cranes, and home appliances as ... of six sections: bionics, pattern processing, speech processing, mathematical engineering, computer vision, and machine inference. Kazuhiro Fuchi

  5. Automated assessment of heart chamber volumes and function in patients with previous myocardial infarction using multidetector computed tomography

    DEFF Research Database (Denmark)

    Fuchs, Andreas; Kühl, Jørgen Tobias; Lønborg, Jacob

    2013-01-01

    Left ventricular (LV), right ventricular (RV), and left atrial (LA) volumes and functions contain important prognostic information in ischemic heart disease. Because multidetector computed tomography (MDCT) has high spatial resolution, this method may be optimal for obtaining this information.

  6. [Analysis of single-photon emission computed tomography in patients with hypertensive encephalopathy complicated with previous hypertensive crisis].

    Science.gov (United States)

    Kustkova, H S

    2012-01-01

    In cerebrovascular diseases, perfusion single-photon emission computed tomography (SPECT) with lipophilic amines is used for the diagnosis of functional disorders of cerebral blood flow. Quantitative calculations help clarify the nature of the vascular disease and the adequacy and effectiveness of treatment. Modern SPECT software makes it possible to calculate not only relative blood flow but also the absolute values of cerebral blood flow.

  7. Effect of previous exhaustive exercise on metabolism and fatigue development during intense exercise in humans

    DEFF Research Database (Denmark)

    Iaia, F. M.; Perez-Gomez, J.; Nordsborg, Nikolai

    2010-01-01

    The present study examined how metabolic response and work capacity are affected by previous exhaustive exercise. Seven subjects performed an exhaustive cycle exercise (approximately 130%-max; EX2) after warm-up (CON) and 2 min after an exhaustive bout at a very high (VH; approximately 30 s), high … during a repeated high-intensity exercise lasting 1/2-2 min.

  8. Synchronous development of breast cancer and chest wall fibrosarcoma after previous mantle radiation for Hodgkin's disease

    International Nuclear Information System (INIS)

    Patlas, Michael; McCready, David; Kulkarni, Supriya; Dill-Macky, Marcus J.

    2005-01-01

    Survivors of Hodgkin's disease are at increased risk of developing a second malignant neoplasm, including breast carcinoma and sarcoma. We report the first case of synchronous development of chest wall fibrosarcoma and breast carcinoma after mantle radiotherapy for Hodgkin's disease. Mammographic, sonographic and MR features are demonstrated. (orig.)

  9. Sustainable development, tourism and territory. Previous elements towards a systemic approach

    Directory of Open Access Journals (Sweden)

    Pierre TORRENTE

    2009-01-01

    Today, tourism is one of the major challenges for many countries and territories. The balance of payments, an ever-increasing number of visitors and the significant development of the tourism offer clearly illustrate the booming trend in this sector. This macro-economic approach is often used by the organizations in charge of tourism, WTO for instance. Quantitative assessments which consider the satisfaction of customers’ needs as an end in itself have prevailed both in tourism development schemes and in prospective approaches since the sixties.

  10. Reactor safety computer code development at INEL

    International Nuclear Information System (INIS)

    Johnsen, G.W.

    1985-01-01

    This report provides a brief overview of the computer code development programs being conducted at EG and G Idaho, Inc. on behalf of US Nuclear Regulatory Commission and the Department of Energy, Idaho Operations Office. Included are descriptions of the codes being developed, their development status as of the date of this report, and resident code development expertise

  11. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity of inherent security concerns of the real thing. (1 page)

  12. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined. Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  13. Computed tomographic identification of dysplasia and progression of osteoarthritis in dog elbows previously assigned OFA grades 0 and 1.

    Science.gov (United States)

    Kunst, Chelsea M; Pease, Anthony P; Nelson, Nathan C; Habing, Greg; Ballegeer, Elizabeth A

    2014-01-01

    Elbow dysplasia is a heritable disease that is a common cause of lameness and progressive elbow osteoarthritis in young large breed dogs. The Orthopedic Foundation for Animals (OFA) screens elbow radiographs, and assigns grades 0-3 based on the presence and severity of bony proliferation on the anconeal process. Grade 1 is assigned when less than 3 mm of proliferation is present and is considered positive for dysplasia. We investigated the incidence of elbow dysplasia and progression of osteoarthritis in 46 elbows that had been assigned grades 0 and 1 at screening at least 1 year previously, using CT as a gold standard and with the addition of CT osteoabsorptiometry. The incidence of dysplasia based on CT was 62% in grade 0, and 75% in grade 1 elbows, all of which had medial coronoid disease. Progressive osteoarthritis at recheck was consistent with elbow dysplasia. The sensitivity and specificity of the OFA grade for elbow dysplasia compared to CT findings was 75% and 38%, respectively. Increased bone mineral density of the medial coronoid process as characterized by osteoabsorptiometry warrants further investigation with respect to elbow dysplasia. Proliferation on the anconeal process without CT evidence of dysplasia or osteoarthritis was present in 20% of the elbows, and is theorized to be an anatomic variant or enthesopathy of the olecranon ligament/synovium. Results of our study suggest that the "anconeal bump" used for elbow screening by the OFA is a relatively insensitive characteristic, and support the use of CT for identifying additional characteristics of elbow dysplasia. © 2014 American College of Veterinary Radiology.

  14. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision closer to, or even better than, human judgment. These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development.

  15. Computational Developments for Distance Determination of Stellar ...

    Indian Academy of Sciences (India)

    2016-01-27

    For the computational developments of the problem, a continued fraction evaluated by the Top–Down algorithm was developed and applied for the evaluation of the error function erf(·). The distance equation was solved by an iterative method of second order of convergence using homotopy continuation ...
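
    The record names two numerical ingredients: Top-Down evaluation of a continued fraction for the error function, and a second-order iterative root solver. The sketch below is illustrative only: it evaluates the complementary error function via the classical Laplace continued fraction (Abramowitz & Stegun 7.1.14) using the Top-Down convergent recurrence, with plain Newton iteration standing in for the paper's second-order homotopy-continuation solver; the paper's specific fraction and distance equation are not reproduced here.

```python
import math

def erfc_cf_topdown(x, n_terms=60):
    """erfc(x), x > 0 (best for x >~ 1), from the Laplace continued
    fraction  sqrt(pi)*exp(x^2)*erfc(x)
               = 1/(x + 1/(2x + 2/(x + 3/(2x + 4/(x + ...))))),
    evaluated Top-Down via the convergent recurrences
    A_n = b_n*A_{n-1} + a_n*A_{n-2}, and likewise for B_n."""
    A_prev, A = 1.0, 0.0   # A_{-1}, A_0 (leading term b_0 = 0)
    B_prev, B = 0.0, 1.0   # B_{-1}, B_0
    for n in range(1, n_terms + 1):
        a_n = 1.0 if n == 1 else float(n - 1)
        b_n = x if n % 2 == 1 else 2.0 * x
        A, A_prev = b_n * A + a_n * A_prev, A
        B, B_prev = b_n * B + a_n * B_prev, B
    return math.exp(-x * x) / math.sqrt(math.pi) * (A / B)

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Second-order (Newton) iteration for f(x) = 0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# erfc_cf_topdown(2.0) ~ 0.004678, matching math.erfc(2.0).
```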

  16. Computer code development plant for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design driven since the middle of the 1980s, various computer codes have been transferred into the Korean nuclear industry through technical transfer programs from the major worldwide pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design works. As a result, design-related technologies have been satisfactorily accumulated. However, the native code development activities to substitute some important computer codes, whose usage is limited by the original technique owners, have been carried out rather poorly. Thus, it is most preferentially required to secure native techniques on the computer code package and analysis methodology in order to establish the capability required for the independent design of our own model of reactor. Moreover, differently from the large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, helical steam generator, passive residual heat removal system, etc. Considering those peculiar design characteristics of SMART, part of the design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those computer codes are not directly applicable to the design of an integral reactor such as SMART. Thus, they should be modified to deal with the peculiar design characteristics of SMART. In addition to the modification efforts, various codes should be developed in several design areas. Furthermore, modified or newly developed codes should have their reliability verified through benchmarking or tests for the object design. Thus, it is necessary to proceed the design according to the

  17. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principle theory and computational modeling. Actinide compounds are challenging to computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities coupled with new experimental characterization techniques now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO2 and some large actinide compounds relevant to separation and environment science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling on the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  18. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  19. Computation and Learning in Visual Development

    Directory of Open Access Journals (Sweden)

    M Nardini

    2014-08-01

    In a special issue marking 30 years since the publication of Marr's Vision (Perception 41:9, 2012), Poggio proposed an update to Marr's influential “levels of understanding” framework. As well as understanding which algorithms are used for computations such as stereo or object recognition, we also need to understand how observers learn these algorithms, and how this learning is accomplished by neural circuits. I will describe research that addresses this problem in the domain of cue combination. In the last decade, linear cue combination has emerged as a common principle in visual and multisensory processing. In very many tasks, a computational goal (to minimise sensory uncertainty) is achieved by the algorithm of weighted averaging. This framework provides a good description of observers' behaviour when combining sensory estimates (e.g. multiple depth cues). However, research has repeatedly shown that the computations carried out by developing perceptual systems – up to 8 years or later in humans – are not those leading to uncertainty reduction via weighted averaging. I will describe results showing how developing and mature perceptual systems differ in their computations when combining sensory cues, and outline two key problems for current and future research: 1. understanding the reorganisation of neural information processing that underlies these computational changes, and 2. understanding the learning mechanisms by which we acquire cue combination abilities through perceptual experience.
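
    The weighted-averaging rule the abstract refers to has a standard closed form: each cue is weighted by its reliability (inverse variance), which minimises the variance of the fused estimate. A minimal sketch, with purely illustrative numbers:

```python
def combine_cues(estimates, variances):
    """Minimum-variance linear cue combination: weight each cue by its
    reliability (1/variance). Returns the fused estimate and its
    variance, which is never larger than the best single cue's."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    fused = sum(r * s for r, s in zip(reliabilities, estimates)) / total
    return fused, 1.0 / total

# Two depth cues: stereo says 50 cm (variance 4), texture says 56 cm
# (variance 12). combine_cues([50, 56], [4, 12]) -> (51.5, 3.0).
```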

  20. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  1. Cloud Computing and Agile Organization Development

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2014-01-01

    In the 3rd millennium economy, defined by globalization and continuous reduction of natural resources, the economic organization becomes the main actor in the phenomenon of transformation and adaptation to new conditions. Even more, the economic environment, which is closely related to the social environment, undergoes complex metamorphoses, especially in the management area. In this dynamic and complex social and environmental context, the economic organization must possess the ability to adapt, becoming a flexible and agile answer to new market opportunities. Considering the spectacular evolution of information and communications technology, one of the solutions to ensure organization agility is cloud computing. Just like the development of any science requires adaptation to theories and instruments specific to other fields, a cloud computing paradigm for the agile organization must appeal to models from management, cybernetics, mathematics, structuralism and information theory (or information systems theory).

  2. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  3. Validation of a Previously Developed Geospatial Model That Predicts the Prevalence of Listeria monocytogenes in New York State Produce Fields.

    Science.gov (United States)

    Weller, Daniel; Shiwakoti, Suvash; Bergholz, Peter; Grohn, Yrjo; Wiedmann, Martin; Strawn, Laura K

    2016-02-01

    Technological advancements, particularly in the field of geographic information systems (GIS), have made it possible to predict the likelihood of foodborne pathogen contamination in produce production environments using geospatial models. Yet, few studies have examined the validity and robustness of such models. This study was performed to test and refine the rules associated with a previously developed geospatial model that predicts the prevalence of Listeria monocytogenes in produce farms in New York State (NYS). Produce fields for each of four enrolled produce farms were categorized into areas of high or low predicted L. monocytogenes prevalence using rules based on a field's available water storage (AWS) and its proximity to water, impervious cover, and pastures. Drag swabs (n = 1,056) were collected from plots assigned to each risk category. Logistic regression, which tested the ability of each rule to accurately predict the prevalence of L. monocytogenes, validated the rules based on water and pasture. Samples collected near water (odds ratio [OR], 3.0) and pasture (OR, 2.9) showed a significantly increased likelihood of L. monocytogenes isolation compared to that for samples collected far from water and pasture. Generalized linear mixed models identified additional land cover factors associated with an increased likelihood of L. monocytogenes isolation, such as proximity to wetlands. These findings validated a subset of previously developed rules that predict L. monocytogenes prevalence in produce production environments. This suggests that GIS and geospatial models can be used to accurately predict L. monocytogenes prevalence on farms and can be used prospectively to minimize the risk of preharvest contamination of produce. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
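
    For illustration, odds ratios like those reported above for binary rules (e.g. "near surface water") come from a 2x2 contingency table of positive and negative samples by exposure; a Wald-type 95% confidence interval follows from the log odds ratio. A minimal sketch, with no counts from the study:

```python
import math

def odds_ratio(pos_near, neg_near, pos_far, neg_far):
    """Odds ratio for pathogen isolation given a binary geospatial
    rule, with a Wald 95% confidence interval on the log scale."""
    or_ = (pos_near * neg_far) / (neg_near * pos_far)
    se = math.sqrt(1.0 / pos_near + 1.0 / neg_near +
                   1.0 / pos_far + 1.0 / neg_far)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```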

  4. Computational intelligent data analysis for sustainable development computational intelligent data analysis for sustainable development

    CERN Document Server

    Yu, Ting; Simoff, Simeon

    2016-01-01

    Contents include: Computational Intelligent Data Analysis for Sustainable Development: An Introduction and Overview (Ting Yu, Nitesh Chawla, and Simeon Simoff); Integrated Sustainability Analysis – Tracing Embodied CO2 in Trade Using High-Resolution Input-Output Tables (Daniel Moran and Arne Geschke); Aggregation Effects in Carbon Footprint Accounting Using Multi-Region Input-Output Analysis (Xin Zhou, Hiroaki Shirakawa, and Manfred Lenzen); Computational Intelligent Data Analysis for Climate Change – Climate Informatics (Claire Monteleoni, Gavin A. Schmidt, Francis Alexander, Alexandru Niculescu-Mizil, Karsten Steinhaeuser, Michael Tippett, Arindam Banerjee, M. Benno Blumenthal, Auroop R. Ganguly, Jason E. Smerdon, and Marco Tedesco); Computational Data Sciences for Actionable Insights on Climate Extremes and Uncertainty (Auroop R. Ganguly, Evan Kodra, Snigdhansu Chatterjee, Arindam Banerjee, and Habib N. Najm); Computational Intelligent Data Analysis for Biodiversity and Species Conservation – Mathematical Programming Applications to Land Conservation an...

  5. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision closer to, or even better than, human judgment. However, existing approaches to assess interruptibility have been designed to rely on external sensors. In this paper, we present Approximator, a system that estimates the interruptibility of a user based exclusively on the sensing ability of commodity
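
    As a loose illustration of estimating interruptibility from commodity-computer sensing, the toy score below combines a few input-activity features. The features, thresholds, and weights are invented for this sketch; they are not the Approximator model:

```python
from dataclasses import dataclass

@dataclass
class ActivitySample:
    keystrokes_per_min: float
    mouse_events_per_min: float
    seconds_since_input: float

def interruptibility(s: ActivitySample) -> float:
    """Toy 0..1 interruptibility score: long idle time suggests
    availability, intense typing or mousing suggests focus."""
    idle = min(s.seconds_since_input / 120.0, 1.0)     # >= 2 min idle
    typing = min(s.keystrokes_per_min / 200.0, 1.0)
    mousing = min(s.mouse_events_per_min / 100.0, 1.0)
    busy = 0.6 * typing + 0.4 * mousing
    return max(0.0, min(1.0, 0.5 + 0.5 * idle - 0.5 * busy))
```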

  6. [Incidence and clinical risk factors for the development of diabetes mellitus in women with previous gestational diabetes].

    Science.gov (United States)

    Domínguez-Vigo, P; Álvarez-Silvares, E; Alves-Pérez M T; Domínguez-Sánchez, J; González-González, A

    2016-04-01

    Gestational diabetes is considered a variant of diabetes mellitus, as they share a common pathophysiological basis: insulin resistance in target tissues and insufficient insulin secretion by pancreatic β-cells. Pregnancy is a unique physiological situation that provides an opportunity to identify future risk of diabetes mellitus. To determine the long-term incidence of diabetes mellitus in women previously diagnosed with gestational diabetes, and to identify clinical risk factors for developing it, a nested case-control cohort study was performed. 671 patients diagnosed with gestational diabetes between 1996 and 2009 were selected. The incidence of diabetes mellitus was estimated and 2 subgroups were formed: Group A, or cases: women who developed diabetes mellitus after the diagnosis of gestational diabetes. Group B, or controls: a random sample of 71 women with a history of gestational diabetes who remained normoglycemic during the follow-up period. Both groups were studied up to 18 years postpartum. Kaplan-Meier survival analysis was used to study the influence of different gestational variables on the later development of diabetes mellitus, with time as the parameter, and Cox models were applied for categorical variables. Significant variables were studied by multivariate Cox analysis. In all analyses the hazard ratio was calculated with 95% confidence intervals. The incidence of diabetes mellitus was 10.3% in patients with a history of gestational diabetes. The following were identified as risk factors in the index pregnancy for the later development of diabetes mellitus: maternal age greater than 35 or younger than 27 years, BMI greater than 30 kg/m2, hypertensive disorders of pregnancy, insulin therapy, poor metabolic control, and more than one pregnancy complicated by gestational diabetes. Clinical factors have been identified in pregnancies complicated by gestational diabetes that determine a higher probability of progression to diabetes mellitus in the medium and long term.
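
    For illustration, the Kaplan-Meier survivor function used in follow-up analyses like this one can be computed directly from times-to-event with censoring. A minimal self-contained sketch, not the study's code:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survivor curve for time-to-diabetes data.

    `times` are years of follow-up; `events` flag progression to
    diabetes (True) versus censoring (False). Returns a list of
    (time, survival probability) pairs at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve, i = len(times), 1.0, [], 0
    while i < len(order):
        t = times[order[i]]
        d = n = 0
        while i < len(order) and times[order[i]] == t:
            n += 1
            d += bool(events[order[i]])
            i += 1
        if d:
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= n
    return curve

# kaplan_meier([2, 3, 3, 5, 8], [True, False, True, False, False])
# -> [(2, 0.8), (3, 0.6)]
```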

  7. Planning policy, sustainability and housebuilder practices: The move into (and out of?) the redevelopment of previously developed land.

    Science.gov (United States)

    Karadimitriou, Nikos

    2013-05-01

    This paper explores the transformations of the housebuilding industry under the policy requirement to build on previously developed land (PDL). This requirement was a key lever in promoting the sustainable urban development agenda of UK governments from the early 1990s to 2010 and has survived, albeit somewhat relaxed and permutated, in the latest National Planning Policy Framework (NPPF). The paper therefore looks at the way in which the policy push towards densification and mixed use affected housebuilders' business strategy and practices and their ability to cope with the 2007 downturn of the housing market and its aftermath. It also points out the eventual feedback of some of these practices into planning policy. Following the gradual shift of British urban policy focus towards sustainability which started in the early 1990s, new configurations of actors, new skills, strategies and approaches to managing risk emerged in property development and housebuilding. There were at least two ways in which housebuilders could have responded to the requirements of developing long term mixed use high density projects on PDL. One way was to develop new products and to employ practices and combinations of practices involving phasing, a flexible approach to planning applications and innovative production methods. Alternatively, they could approach PDL development as a temporary turn of policy or view mixed use high density schemes as a niche market to be explored without drastically overhauling the business model of the entire firm. These transformations of the UK housebuilding sector were unfolding during a long period of buoyancy in the housing market which came to an end in 2007. Very little is known both about how housebuilder strategies and production practices evolved during the boom years and about how these firms coped with the effects of the 2007 market downturn. The paper draws on published data (company annual reports, government statistics) and primary

  8. Planning policy, sustainability and housebuilder practices: The move into (and out of?) the redevelopment of previously developed land

    Science.gov (United States)

    Karadimitriou, Nikos

    2013-01-01

    This paper explores the transformations of the housebuilding industry under the policy requirement to build on previously developed land (PDL). This requirement was a key lever in promoting the sustainable urban development agenda of UK governments from the early 1990s to 2010 and has survived, albeit somewhat relaxed and permutated, in the latest National Planning Policy Framework (NPPF). The paper therefore looks at the way in which the policy push towards densification and mixed use affected housebuilders’ business strategy and practices and their ability to cope with the 2007 downturn of the housing market and its aftermath. It also points out the eventual feedback of some of these practices into planning policy. Following the gradual shift of British urban policy focus towards sustainability which started in the early 1990s, new configurations of actors, new skills, strategies and approaches to managing risk emerged in property development and housebuilding. There were at least two ways in which housebuilders could have responded to the requirements of developing long term mixed use high density projects on PDL. One way was to develop new products and to employ practices and combinations of practices involving phasing, a flexible approach to planning applications and innovative production methods. Alternatively, they could approach PDL development as a temporary turn of policy or view mixed use high density schemes as a niche market to be explored without drastically overhauling the business model of the entire firm. These transformations of the UK housebuilding sector were unfolding during a long period of buoyancy in the housing market which came to an end in 2007. Very little is known both about how housebuilder strategies and production practices evolved during the boom years and about how these firms coped with the effects of the 2007 market downturn. The paper draws on published data (company annual reports, government statistics) and primary

  9. Development of computer code in PNC, 8

    International Nuclear Information System (INIS)

    Ohhira, Mitsuru

    1990-01-01

    Private buildings that use base isolation systems are now at the practical stage. Accordingly, under the Construction and Maintenance Management Office, we are carrying out a study on the application of base isolation systems to nuclear fuel facilities. In the course of this study, we have developed the Dynamic Analysis Program - Base Isolation System (DAP-BS), which is able to run on a 32-bit personal computer. Using this program, we can analyze a 3-dimensional structure and evaluate the various properties of base isolation parts that are divided into a maximum of 16 blocks. From the results of some simulation analyses, we judged that DAP-BS had good reliability and marketability, and we have put DAP-BS on the market. (author)

  10. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest experience at self-locomotion with visual tracking as an important factor. Yet the mechanism underlying this development is unknown. We propose a robot that learns to mentally track a target object (i.e., to maintain a representation of an object's position when it is outside the field of view) as a model for spatial development. Mental tracking is considered as prediction of an object's position given the previous environmental state and motor commands, and the current environmental state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model from sensory input to desired motor commands, and a forward model from motor commands to desired sensory input (goals). The robot was tested on the 'three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are self-centered; when given the capacity for self-locomotion, the robot responds allocentrically.
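
    As an illustration of the forward-model idea, mental tracking can be sketched as maintaining an egocentric estimate of the hidden target's position and updating it purely from the robot's own motor commands. The following Python sketch is hypothetical; the kinematics, frame convention and walk-past test are ours, not the authors' implementation:

        import numpy as np

        def forward_model(target_ego, forward, turn):
            """Predict the target's egocentric (x, y) after a motor command.
            The robot's x-axis points ahead; `forward` is metres moved,
            `turn` is the robot's rotation in radians."""
            x, y = target_ego
            x -= forward                          # robot advances, target recedes
            c, s = np.cos(-turn), np.sin(-turn)   # world counter-rotates in ego frame
            return np.array([c * x - s * y, s * x + c * y])

        est = np.array([1.0, 0.0])                # target starts 1 m ahead
        for cmd in [(2.0, 0.0), (0.0, np.pi)]:    # walk past it, then about-face
            est = forward_model(est, *cmd)
        print(est)   # ~[1, 0]: the unseen target is again 1 m ahead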

  11. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on the developed system, we implemented a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various
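
    The abstract names Bayesian and Monte Carlo propagation as the core machinery; a minimal Monte Carlo sketch of the idea follows. The toy model (intake = measurement / retention, dose = intake x dose coefficient, all lognormal) and every numerical value are illustrative assumptions, not taken from the paper, whose code is written in MATLAB:

        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000

        # Lognormal uncertainty on each component (parameters are made up).
        measurement = rng.lognormal(np.log(50.0), 0.2, N)   # bioassay result, Bq
        retention   = rng.lognormal(np.log(0.05), 0.4, N)   # biokinetic m(t)
        dose_coeff  = rng.lognormal(np.log(1e-6), 0.3, N)   # Sv per Bq intake

        intake = measurement / retention                    # Bq
        dose = intake * dose_coeff                          # Sv

        for p in (2.5, 5, 50, 95, 97.5):
            print(f"{p:>5}th percentile: {np.percentile(dose, p):.2e} Sv")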

  12. Development of probabilistic internal dosimetry computer code

    Science.gov (United States)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was designed. Based on the developed system, we implemented a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of

  13. Development of quantitative computed tomography lung protocols.

    Science.gov (United States)

    Newell, John D; Sieren, Jered; Hoffman, Eric A

    2013-09-01

    The purpose of this review article is to describe the process of developing optimal computed tomography (CT) protocols for quantitative lung CT (QCT). In this review, we discuss the following important topics: QCT-derived metrics of lung disease; QCT scanning protocols; quality control; and QCT image processing software. We briefly discuss several QCT-derived metrics of lung disease that have been developed for the assessment of emphysema, small airway disease, and large airway disease. The CT scanning protocol is one of the most important elements of a successful QCT. We provide a detailed description of the current move toward optimizing the QCT protocol for the assessment of chronic obstructive pulmonary disease and asthma. Quality control of CT images is also a very important part of the QCT process. We discuss why it is necessary to use CT scanner test objects (phantoms) to provide frequent periodic checks of the CT scanner calibration to ensure precise and accurate CT numbers. We also discuss the use of QCT image processing software to segment the lung and extract the desired QCT metrics of lung disease, along with the practical issues of using this software. The data obtained from the image processing software are then combined with those from other clinical examinations, health status questionnaires, pulmonary physiology, and genomics to increase our understanding of obstructive lung disease and improve our ability to design new therapies for these diseases.
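
    One widely used QCT emphysema metric of the kind referred to here is the percentage of lung voxels below -950 HU (LAA%-950). A minimal sketch, assuming a Hounsfield-unit volume and a lung mask are already available as NumPy arrays (the -950 HU threshold is standard in the QCT literature; the synthetic data are ours):

        import numpy as np

        def laa_percent(hu_volume, lung_mask, thresh=-950.0):
            """Percentage of lung voxels with attenuation below `thresh` HU."""
            return 100.0 * np.mean(hu_volume[lung_mask] < thresh)

        # Toy volume: normal-ish lung tissue plus a low-attenuation region.
        rng = np.random.default_rng(0)
        vol = rng.normal(-850, 40, size=(64, 64, 64))
        vol[:, :, :6] = rng.normal(-970, 10, size=(64, 64, 6))
        mask = np.ones(vol.shape, dtype=bool)
        print(f"LAA%-950 = {laa_percent(vol, mask):.1f}%")   # ~10%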

  14. Developing a District-Wide Computer-Use Plan.

    Science.gov (United States)

    Fisher, Glenn

    1983-01-01

    Outlines a plan for computer use in grades K-12 in the Albany School District, California, noting the major goals of the computer literacy program, the approach to implementing those goals, the hardware needed, staff development, new personnel, location of hardware, and other prerequisites. An estimated budget plan and computer use framework (programming, computer assisted…

  15. Development and application of computational aerothermodynamics flowfield computer codes

    Science.gov (United States)

    Venkatapathy, Ethiraj

    1993-01-01

    Computations are presented for one-dimensional, strong shock waves that are typical of those that form in front of a reentering spacecraft. The fluid mechanics and thermochemistry are modeled using two different approaches. The first employs traditional continuum techniques in solving the Navier-Stokes equations. The second employs a particle simulation technique (the direct simulation Monte Carlo method, DSMC). The thermochemical models employed in these two techniques are quite different. The present investigation presents an evaluation of thermochemical models for nitrogen under hypersonic flow conditions. Four separate cases are considered. The cases are governed, respectively, by the following: vibrational relaxation; weak dissociation; strong dissociation; and weak ionization. In near-continuum, hypersonic flow, the nonequilibrium thermochemical models employed in continuum and particle simulations produce nearly identical solutions. Further, the two approaches are evaluated successfully against available experimental data for weakly and strongly dissociating flows.

  16. Development of computational interface for internal radiotherapy

    International Nuclear Information System (INIS)

    Damaso, Renato de Sousa; Campos, Tarcisio Passos R.

    1996-01-01

    The present paper is part of a research line which seeks the development of brachytherapy coupled with NCT (Neutron Capture Therapy), applied specifically to cervical cancer. This technique is based on the capability of placing a discrete mixed neutron and gamma-ray source close to a tumor region, making use of an intrauterine catheter. The dose received in this application is a high-level one, of the HDL type, delivered over a short period of time. Using chemical compounds, such as organic boron compounds, which are ingested by the patient and absorbed selectively by the tumor cells, the energy deposited in the malignant cells is selectively increased by nuclear reactions such as 10B(n,α)7Li; the healthy cells, in contrast, are preserved. The main goal of this paper is the elaboration of graphic displays used to visualize the anatomic parts of the human body in two or three dimensions. These graphic displays make possible a friendly interface between the medical staff and the process of planning the therapy. Preliminary results of computational simulations are presented, showing a radiotherapy plan dealing with a rectal tumor using brachytherapy coupled with BNCT. (author)

  17. Developing a Computer Laboratory for Undergraduate Sociology Courses.

    Science.gov (United States)

    Raymondo, James C.

    1996-01-01

    Discusses the development of a computer laboratory for sociology courses, as well as some advantages and disadvantages of incorporating computer technology into the classroom. Examines the proposal and proposal-review process. Provides tips for writing a successful proposal. (MJP)

  18. Developing a computer game to prepare children for surgery.

    Science.gov (United States)

    Rassin, Michal; Gutman, Yaira; Silner, Dina

    2004-12-01

    Computer games are a major part of the culture of children and teenagers in many developed countries. Research shows that children of the computer age prefer computer-assisted learning to any other teaching strategy. Health care workers traditionally have used dolls, games, drawings, creative arts, and even videotapes to prepare children for surgery. No studies have been conducted in Israel on using computers to help ailing children in general or to help children preparing for surgery in particular. This article discusses the potential for using computers to educate patients based on a review of the literature and interviews with children and describes the process of computer game development.

  19. Development of a computer based learning system for teaching and ...

    African Journals Online (AJOL)

    Computer based learning (CBL) refers to the use of computers as a key component of the educational environment. This computer-based learning package for teaching and assessing mathematics is a software package developed using the Java programming language and JavaServer Pages (JSP). It acts as a web application using ...

  20. Spectrophotometric determination of uranium with arsenazo previous liquid-liquid extraction and colour development in organic medium

    International Nuclear Information System (INIS)

    Palomares Delgado, F.; Vera Palomino, J.; Petrement Eguiluz, J. C.

    1964-01-01

    The determination of uranium with arsenazo is hindered by a great number of cations which form stable complexes with the reagent and may give rise to serious interferences. By studying the optimum conditions for the extraction of uranium by means of tributylphosphate solutions in methylisobutylketone, under conditions of prior masking of the interfering cations, an organic extract was obtained containing all the uranium together with small amounts of iron. The possible interference derived from the latter element is avoided by reduction with hydroxylammonium chloride followed by formation of the Fe(II)-orthophenanthroline complex in alcoholic medium. (Author) 17 refs

  1. Computer-aided design development transition for IPAD environment

    Science.gov (United States)

    Owens, H. G.; Mock, W. D.; Mitchell, J. C.

    1980-01-01

    The relationship of federally sponsored computer-aided design/computer-aided manufacturing (CAD/CAM) programs to the aircraft life cycle design process, an overview of NAAD's CAD development program, an evaluation of the CAD design process, a discussion of the current computing environment within which NAAD is developing its CAD system, some of the advantages/disadvantages of the NAAD-IPAD approach, and CAD developments during transition into the IPAD system are discussed.

  2. Estimating Computer-Based Training Development Times

    Science.gov (United States)

    1987-10-14

    [Abstract not recoverable from the scanned text. The surviving fragments are report front matter (credits: Sonia Gunderson, Scientific Systems, Inc.; ARI Field Unit at Fort Knox, Kentucky; Donald F. Haggard, Chief, Training Research Laboratory; Jack H. Hiller) and pieces of a table of CBT development activities: formative evaluation, programming routines, writing lessons, programming lessons, learning content meetings, computer down-time, and video production.]

  3. Preliminary Phase Field Computational Model Development

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hu, Shenyang Y. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xu, Ke [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Suter, Jonathan D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCloy, John S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Johnson, Bradley R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ramuhalli, Pradeep [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-15

    This interim report presents progress towards the development of meso-scale models of magnetic behavior that incorporate microstructural information. Modeling magnetic signatures in irradiated materials with complex microstructures (such as structural steels) is a significant challenge. The complexity is addressed incrementally, using monocrystalline Fe (i.e., ferrite) films as model systems to develop and validate initial models, followed by polycrystalline Fe films and by more complicated and representative alloys. In addition, the modeling incrementally addresses the inclusion of other major phases (e.g., martensite, austenite), minor magnetic phases (e.g., carbides, FeCr precipitates), and minor nonmagnetic phases (e.g., Cu precipitates, voids). The focus of the magnetic modeling is on phase-field models, based on the numerical solution of the Landau-Lifshitz-Gilbert equation. From the computational standpoint, phase-field modeling allows the simulation of systems large enough that relevant defect structures and their effects on functional properties like magnetism can be simulated. To date, two phase-field models have been generated in support of this work. First, a bulk iron model with periodic boundary conditions was generated as a proof of concept to investigate major-loop effects of single versus polycrystalline bulk iron and the effects of single non-magnetic defects. More recently, to support the experimental program herein using iron thin films, a new model was generated that uses finite boundary conditions representing surfaces and edges. This model has provided key insights into the domain structures observed in magnetic force microscopy (MFM) measurements. Simulation results for single-crystal thin-film iron indicate the feasibility of the model for determining magnetic domain wall thickness and mobility in an externally applied field. Because the phase-field model dimensions are limited relative to the size of most specimens used in
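
    For reference, the core dynamical law mentioned here, the Landau-Lifshitz-Gilbert equation, can be integrated for a single macrospin in a few lines. This is only an illustration of the equation itself (the report's models are full spatially resolved phase-field implementations), and the damping constant, field and time step below are made-up values:

        import numpy as np

        GAMMA = 1.76e11   # gyromagnetic ratio, rad s^-1 T^-1
        ALPHA = 0.1       # Gilbert damping (illustrative)

        def llg_rhs(m, H):
            """dm/dt for unit magnetization m in effective field H (tesla)."""
            mxH = np.cross(m, H)
            return -GAMMA / (1 + ALPHA**2) * (mxH + ALPHA * np.cross(m, mxH))

        m = np.array([1.0, 0.0, 0.1]); m /= np.linalg.norm(m)
        H = np.array([0.0, 0.0, 0.1])       # applied field along +z
        dt = 5e-13                          # s
        for _ in range(20_000):             # forward Euler + renormalization
            m += dt * llg_rhs(m, H)
            m /= np.linalg.norm(m)
        print(m)    # -> approximately [0, 0, 1]: relaxed onto the field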

  4. Synchronous development of breast cancer and chest wall fibrosarcoma after previous mantle radiation for Hodgkin's disease

    Energy Technology Data Exchange (ETDEWEB)

    Patlas, Michael [Hamilton General Hospital, Department of Radiology, Hamilton, ON (Canada); McCready, David [University Health Network and Mount Sinai Hospital, Department of Surgery, Toronto, ON (Canada); Kulkarni, Supriya; Dill-Macky, Marcus J. [University Health Network and Mount Sinai Hospital, Department of Medical Imaging, Toronto, ON (Canada)

    2005-09-01

    Survivors of Hodgkin's disease are at increased risk of developing a second malignant neoplasm, including breast carcinoma and sarcoma. We report the first case of synchronous development of chest wall fibrosarcoma and breast carcinoma after mantle radiotherapy for Hodgkin's disease. Mammographic, sonographic and MR features are demonstrated. (orig.)

  5. Executive Development through Asynchronous Computer Conferencing

    Science.gov (United States)

    1991-06-01

    [Abstract not recoverable from the scanned text, which contains only front-matter and reference-list fragments: Dick, J. R. (1987), "'Automating' your chairman," The Magazine of Bank Administration, pp. 36-38; Doise, W. (1986); mentions of the trade press (Government Computer News, Infoworld, Network World, Washington Technology); and a note that experts, users, suppliers, and researchers were canvassed for the study.]

  6. Inflatable Antenna for CubeSat: Extension of the Previously Developed S-Band Design to the X-Band

    Science.gov (United States)

    Babuscia, Alessandra; Choi, Thomas; Cheung, Kar-Ming; Thangavelautham, Jekan; Ravichandran, Mithun; Chandra, Aman

    2015-01-01

    The inflatable antenna for CubeSat is a 1-meter reflector built with reflective Mylar on one side and clear Mylar on the other, with a patch antenna at the focus. The development of this technology responds to the increasing need for more capable communication systems that allow CubeSats to operate autonomously in interplanetary missions. An initial version of the antenna for the S-band was developed and tested in both an anechoic chamber and a vacuum chamber. Recent developments in transceivers and amplifiers for CubeSats at X-band motivated the extension from the S-band to the X-band. This paper describes the process of extending the design of the antenna to the X-band, focusing on the patch antenna redesign, new manufacturing challenges, and initial results of experimental tests.
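
    The motivation for moving to a higher band can be made concrete with the standard aperture-gain formula G = eta*(pi*D/lambda)^2. The sketch below assumes representative frequencies (2.4 GHz for S-band, 8.4 GHz for X-band) and a 50% aperture efficiency; none of these figures come from the paper:

        import math

        def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.5):
            """Aperture gain G = eta * (pi * D / lambda)^2, in dBi."""
            lam = 3e8 / freq_hz
            return 10 * math.log10(efficiency * (math.pi * diameter_m / lam) ** 2)

        for band, f in [("S-band", 2.4e9), ("X-band", 8.4e9)]:
            print(f"{band}: {dish_gain_dbi(1.0, f):.1f} dBi")
        # S-band: ~25 dBi; X-band: ~36 dBi (about 11 dB more from the same 1 m dish)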

  7. The reliability of the Associate Platinum digital foot scanner in measuring previously developed footprint characteristics: a technical note.

    Science.gov (United States)

    Papuga, M Owen; Burke, Jeanmarie R

    2011-02-01

    An ink pad and paper, pressure-sensitive platforms, and photography have previously been used to collect footprint data for clinical assessment. More recently, digital scanners have been widely used to collect such data. The purpose of this study was to evaluate the intra- and interrater reliability of a flatbed digital image scanning technology for capturing footprint data. This study used a repeated-measures design on 32 (16 male, 16 female) healthy subjects. The following indices of footprint were measured from 2-dimensional images of the plantar surface of the foot recorded with an Associate Platinum (Foot Levelers Inc, Roanoke, VA) digital foot scanner: Staheli index, Chippaux-Smirak index, arch angle, and arch index. Intraclass correlation coefficient (ICC) values were calculated to evaluate intrarater, interday, and interclinician reliability. The ICC values for intrarater reliability were greater than or equal to .817, indicating an excellent level of reproducibility in assessing the collected images. Analyses of variance revealed no significant differences between raters for each index (P > .05). The ICC values also indicated excellent reliability (.881-.971) between days and clinicians for all but one of the footprint indices; the exception, arch angle (.689), showed good reliability between clinicians. The full-factorial analysis of variance model did not reveal any interaction effects (P > .05), indicating that the footprint indices did not change across days and clinicians. The scanning technology used in this study demonstrated good intra- and interrater reliability for measurements of footprint indices, as shown by high ICC values. Copyright © 2011 National University of Health Sciences. Published by Mosby, Inc. All rights reserved.
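
    For readers unfamiliar with the statistic, an ICC of the two-way random-effects, absolute-agreement, single-measure form (ICC(2,1) in Shrout-Fleiss terms) can be computed directly from a subjects-by-raters score matrix. The abstract does not state which ICC form the authors used, and the data below are simulated:

        import numpy as np

        def icc_2_1(X):
            """ICC(2,1) for an (n subjects x k raters) score matrix."""
            n, k = X.shape
            grand = X.mean()
            ms_r = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_c = n * ((X.mean(axis=0) - grand) ** 2).sum() / (k - 1)
            resid = X - X.mean(axis=1, keepdims=True) - X.mean(axis=0) + grand
            ms_e = (resid ** 2).sum() / ((n - 1) * (k - 1))
            return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

        rng = np.random.default_rng(3)
        truth = rng.normal(0.25, 0.03, size=(8, 1))         # subjects' true values
        scores = truth + rng.normal(0, 0.005, size=(8, 2))  # two raters, small noise
        print(f"ICC(2,1) = {icc_2_1(scores):.3f}")          # close to 1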

  8. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science gives substantial weight to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and ...

  9. International Computer Conferencing for Professional Development: The Bangkok Project.

    Science.gov (United States)

    Anderson, Terry; Mason, Robin

    1993-01-01

    Describes the Bangkok Project, a successful application of electronic mail and computer conferencing networks to support professional development in the field of distance education. The development of the computer conference to supplement a face-to-face conference in Bangkok is explained, and conference format and nature of interactions are…

  10. Cloud Computing: Key to IT Development in West Africa | Nwabuonu ...

    African Journals Online (AJOL)

    It has been established that Information Technology (IT) Development in West Africa has faced lots of challenges ranging from Cyber Threat to inadequate IT Infrastructure. Cloud Computing is a Revolution. It is creating a fundamental change in Computer Architecture, Software and Tools Development, in the way we Store, ...

  11. ATLAS computing activities and developments in the Italian Grid cloud

    International Nuclear Information System (INIS)

    Rinaldi, L; Ciocca, C; K, M; Annovi, A; Antonelli, M; Martini, A; Barberis, D; Brunengo, A; Corosu, M; Barberis, S; Carminati, L; Campana, S; Di, A; Capone, V; Carlino, G; Doria, A; Esposito, R; Merola, L; De, A; Luminari, L

    2012-01-01

    The large amount of data produced by the ATLAS experiment needs new computing paradigms for data processing and analysis, involving many computing centres spread around the world. The computing workload is managed by regional federations, called “clouds”. The Italian cloud consists of a main (Tier-1) center, located in Bologna, four secondary (Tier-2) centers, and a few smaller (Tier-3) sites. In this contribution we describe the Italian cloud facilities and the activities of data processing, analysis, simulation and software development performed within the cloud, and we discuss the tests of the new computing technologies contributing to the evolution of the ATLAS Computing Model.

  12. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future increases in computer power, and assessment of the possible advantages of special-purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration are being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special-purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  13. Hemoglobin-Based Oxygen Carrier (HBOC) Development in Trauma: Previous Regulatory Challenges, Lessons Learned, and a Path Forward.

    Science.gov (United States)

    Keipert, Peter E

    2017-01-01

    Historically, hemoglobin-based oxygen carriers (HBOCs) were being developed as "blood substitutes," despite their transient circulatory half-life (~24 h) vs. transfused red blood cells (RBCs). More recently, HBOC commercial development focused on "oxygen therapeutic" indications to provide a temporary oxygenation bridge until medical or surgical interventions (including RBC transfusion, if required) can be initiated. This included the early trauma trials with HemAssist® (BAXTER), Hemopure® (BIOPURE) and PolyHeme® (NORTHFIELD) for resuscitating hypotensive shock. These trials all failed due to safety concerns (e.g., cardiac events, mortality) and certain protocol design limitations. In 2008 the Food and Drug Administration (FDA) put all HBOC trials in the US on clinical hold due to the unfavorable benefit:risk profile demonstrated by various HBOCs in different clinical studies in a meta-analysis published by Natanson et al. (2008). During standard resuscitation in trauma, organ dysfunction and failure can occur due to ischemia in critical tissues, which can be detected by the degree of lactic acidosis. SANGART's Phase 2 trauma program with MP4OX therefore added lactate >5 mmol/L as an inclusion criterion, to enroll patients who had lost sufficient blood to cause a tissue oxygen debt. This was key to the successful conduct of their Phase 2 program (ex-US, from 2009 to 2012) evaluating MP4OX as an adjunct to standard fluid resuscitation and transfusion of RBCs. In 2013, SANGART shared their Phase 2b results with the FDA, and succeeded in getting the FDA to agree that a planned Phase 2c higher-dose comparison study of MP4OX in trauma could include clinical sites in the US. Unfortunately, SANGART failed to secure new funding and was forced to terminate development and operations in Dec 2013, even though a regulatory path forward in trauma, with FDA approval to proceed, had been achieved.

  14. Wide-angle display developments by computer graphics

    Science.gov (United States)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long existed, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions derive not from degrees in science, nor only from a degree in graphic design, but from a history of computer graphics innovations that laid groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  15. The Formal Approach to Computer Game Rule Development Automation

    OpenAIRE

    Elena, A.

    2009-01-01

    Computer game rules development is one of the weakly automated tasks in game development. This paper gives an overview of an ongoing research project which deals with the automation of rules development for turn-based strategy computer games, rules being the basic elements of these games. The paper proposes a new approach to automation, including visual formal rules model creation, model verification and model-based code generation.

  16. Computational morphodynamics of plants: integrating development over space and time.

    Science.gov (United States)

    Roeder, Adrienne H K; Tarr, Paul T; Tobin, Cory; Zhang, Xiaolan; Chickarmane, Vijay; Cunha, Alexandre; Meyerowitz, Elliot M

    2011-04-01

    The emerging field of computational morphodynamics aims to understand the changes that occur in space and time during development by combining three technical strategies: live imaging to observe development as it happens; image processing and analysis to extract quantitative information; and computational modelling to express and test time-dependent hypotheses. The strength of the field comes from the iterative and combined use of these techniques, which has provided important insights into plant development.

  17. Communications, Computers and Automation for Development.

    Science.gov (United States)

    Pool, Ithiel de Sola; And Others

    This paper includes three articles dealing with the application of science and technology to national development. In part, the first article attempts to answer the following questions: 1) what will be the costs and effects of communication technology in the coming decade; 2) how can the elements of communication systems be examined in terms of…

  18. Electronic Mail for Personal Computers: Development Issues.

    Science.gov (United States)

    Tomer, Christinger

    1994-01-01

    Examines competing, commercially developed electronic mail programs and how these technologies will affect the functionality and quality of electronic mail. How new standards for client-server mail systems are likely to enhance messaging capabilities and the use of electronic mail for information retrieval are considered. (Contains eight…

  19. Developing and validating an instrument for measuring mobile computing self-efficacy.

    Science.gov (United States)

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.
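
    Instrument development and validation of this kind typically reports internal-consistency reliability, commonly Cronbach's alpha. The sketch below is illustrative only; the abstract does not say which statistics the authors used, and the data are simulated:

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents x k_items) matrix of scale scores."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(7)
        trait = rng.normal(size=(100, 1))                    # one latent trait
        items = trait + rng.normal(scale=0.8, size=(100, 5)) # five noisy items
        print(f"alpha = {cronbach_alpha(items):.2f}")        # ~0.9 here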

  20. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test
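
    A digital reactivity computer of the kind described typically infers reactivity from the measured flux or power trace via inverse point kinetics. The one-delayed-group sketch below illustrates that principle only; the DRCS algorithm and kinetic parameters are not given in the abstract, so all values here are illustrative:

        import numpy as np

        BETA, GEN_TIME, LAM = 0.0065, 1.0e-4, 0.08   # illustrative parameters

        def inverse_kinetics(t, n):
            """Reactivity (in dollars) from a power trace, one delayed group."""
            C = BETA / (GEN_TIME * LAM) * n[0]        # equilibrium precursors
            rho = np.zeros_like(n)
            for i in range(1, len(n)):
                dt = t[i] - t[i - 1]
                C += dt * (BETA / GEN_TIME * n[i - 1] - LAM * C)
                dndt = (n[i] - n[i - 1]) / dt
                rho[i] = (BETA + GEN_TIME * (dndt - LAM * C) / n[i]) / BETA
            return rho

        # Self-check: simulate a +0.1 $ step with point kinetics, then invert it.
        t = np.linspace(0, 50, 5001)
        rho_true = 0.1 * BETA
        n = np.ones_like(t); C = BETA / (GEN_TIME * LAM)
        for i in range(1, len(t)):
            dt = t[i] - t[i - 1]
            n[i] = n[i - 1] + dt * ((rho_true - BETA) / GEN_TIME * n[i - 1] + LAM * C)
            C += dt * (BETA / GEN_TIME * n[i - 1] - LAM * C)
        print(inverse_kinetics(t, n)[-1])   # ~0.1 dollars recovered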

  1. Cloud Computing: Key to IT Development in West Africa

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Dec 1, 2013 ... Abstract. It has been established that Information Technology (IT) Development in West Africa has faced lots of challenges ranging from Cyber Threat to inadequate IT Infrastructure. Cloud Computing is a Revolution. It is creating a fundamental change in Computer Architecture, Software and.

  2. Developing a Distributed Computing Architecture at Arizona State University.

    Science.gov (United States)

    Armann, Neil; And Others

    1994-01-01

    Development of Arizona State University's computing architecture, designed to ensure that all new distributed computing pieces will work together, is described. Aspects discussed include the business rationale, the general architectural approach, characteristics and objectives of the architecture, specific services, and impact on the university…

  3. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
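
    The regression discontinuity logic (compare outcomes just above and below the eligibility cutoff, letting the outcome trend with the running variable on each side) can be sketched with a local linear fit. This is a generic illustration on simulated data, not the authors' specification; their running variable, bandwidth and controls differ:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 4000
        income = rng.uniform(-1, 1, n)             # running variable, cutoff at 0
        voucher = (income < 0).astype(float)       # eligibility rule
        grades = 7 + 0.5 * income - 0.4 * voucher + rng.normal(0, 1, n)

        df = pd.DataFrame({"grades": grades, "income": income, "voucher": voucher})
        local = df[df.income.abs() < 0.5]          # bandwidth around the cutoff
        fit = smf.ols("grades ~ voucher + income + voucher:income", local).fit()
        print(fit.params["voucher"])               # ~ -0.4, the jump at the cutoff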

  4. Development of computer aided engineering system for TRAC applications

    International Nuclear Information System (INIS)

    Arai, Kenji; Itoya, Seihiro; Uematsu, Hitoshi; Tsunoyama, Shigeaki

    1990-01-01

    TRAC, an advanced best-estimate computer program for nuclear reactor transient analysis, has been extensively used to carry out various thermal-hydraulic calculations in the nuclear engineering field because of its versatility. To perform a wide variety of TRAC calculations efficiently, efficient utilization of computers and a convenient environment for input and output processing are necessary. We have applied a computer network comprising a supercomputer, engineering workstations and personal computers to TRAC calculations and have assigned the appropriate functions to each computer. We have also been developing an interactive graphics system for input and output processing on an EWS. This hardware and software environment can improve the effectiveness of TRAC utilization for various thermal-hydraulic calculations. (author)

  5. Development of a Computational Model for Predicting Damage to Tankers

    DEFF Research Database (Denmark)

    Little, P.; Pippenger, D.; Simonsen, Bo Cerup

    1996-01-01

    A Windows-based computer program, DAMAGE, has been developed for the analysis of ship grounding on a pinnacle-shaped rock. The paper presents part of the theory and the overall ideas of the computer program.

  6. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In this report the role and purposes of computer simulation in the development of nuclear technologies are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of simulators for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models; simulation of isotope composition change and damage-dose accumulation in materials under irradiation; and simulation of reactor control structures. (authors)

  7. Development of a computer design system for HVAC

    International Nuclear Information System (INIS)

    Miyazaki, Y.; Yotsuya, M.; Hasegawa, M.

    1993-01-01

    The development of a computer design system for HVAC (Heating, Ventilating and Air Conditioning) systems is presented in this paper. It supports the air conditioning design for a nuclear power plant and a reprocessing plant. This system integrates various computer design systems which were developed separately for the various design phases of HVAC. The purposes include centralizing the HVAC data, optimizing design, and reducing design time. The centralized HVAC data are managed by a DBMS (Data Base Management System). The DBMS separates the computer design system into a calculation module and the data. The design system can thus be expanded easily in the future. 2 figs

  8. The computer code SEURBNUK-2: Recent developments

    International Nuclear Information System (INIS)

    Staniforth, R.; Yerkees, A.

    1979-01-01

    The SEURBNUK-2 code is now being developed jointly by AEE Winfrith and JRC Ispra for use in Fast Reactor Containment Studies. To meet the needs of such studies and the needs of the COVA programme, a number of improvements and extensions of the code have been made. A selection of these changes and illustrations of their use are given in this paper. The structural capability of SEURBNUK-2 was originally limited to the treatment of thin shells and shell junctions. Although this facility proved surprisingly useful, it was realised that a more versatile and powerful means of calculating the deformation of more complicated structural geometries would be required. The finite element code EURDYN, which employs convected coordinates, was adapted for the purpose, so that axially symmetric elements of the isoparametric, triangular and thin-shell families could be used to model various parts of the reactor structure. The method of coupling this finite element code to the fluid motion is described, the use of this new version of the code is illustrated, and the results are compared with those obtained by the original code and by experiment. A feature of many reactor designs which is being modelled in the later COVA experiments is the perforated plate or porous structure. For fixed perforated plates and porous structures, the additional pressure drop and inertia effects can be included in the momentum equations by the addition of suitable terms, and the original technique of solution is unaltered. Details of the finite difference equations are given in this paper together with the results of check calculations which were performed to ensure the correct functioning of the code. (orig.)
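
    The abstract does not give the form of the added terms; schematically, for a fixed perforated plate or porous region, the momentum balance described would gain a Darcy-type viscous drag and an inertial (form-drag) loss, e.g.

        $$\rho\left(\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right) = -\nabla p - \frac{\mu}{K}\,\mathbf{u} - \tfrac{1}{2}\,\rho\,\frac{C_D}{L}\,|\mathbf{u}|\,\mathbf{u},$$

    where K is a permeability, C_D a plate loss coefficient and L the plate thickness; these symbols are illustrative, not SEURBNUK-2's notation, and an added-mass coefficient on the time-derivative term would represent the extra inertia of flow accelerating through the perforations.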

  9. COMPUTER MODELING IN THE DEVELOPMENT OF ARTIFICIAL VENTRICLES OF HEART

    Directory of Open Access Journals (Sweden)

    L. V. Belyaev

    2011-01-01

    Full Text Available In this article, modern research on the development of artificial heart ventricles is described. The advantages of applying computer (CAD/CAE) technologies to the development of artificial heart ventricles are shown, and the systems developed with these technologies are presented.

  10. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

    This paper describes the experimental and computational development of a natural breast phantom, anthropomorphic and anthropometric, for dosimetry studies in breast brachytherapy and teletherapy. The natural breast phantom developed corresponds to the fibroadipose breasts of women aged 30 to 50 years, presenting medium radiographic density. The experimental breast phantom consists of three tissue equivalents (TEs): a glandular TE, an adipose TE and a skin TE. These TEs were developed according to the chemical composition of the human breast and present a radiological response to exposure. Once constructed, the experimental breast phantom was mounted on a thorax phantom previously developed by the NRI/UFMG research group. The computational breast phantom was then constructed by performing computed tomography (CT) in axial slices of the chest phantom. From the images generated by CT, a computational voxel model of the thorax phantom was developed with the SISCODES computational program, the computational breast phantom being represented by the same TEs as the experimental breast phantom. The images generated by CT allowed the radiological equivalence of the tissues to be evaluated. The breast phantom is being used in experimental dosimetry studies in both brachytherapy and teletherapy of the breast. Dosimetry studies with the MCNP-5 code using the computational model of the breast phantom are in progress. (author)

  11. Computational approaches to the development of perceptual expertise.

    Science.gov (United States)

    Palmeri, Thomas J; Wong, Alan C-N; Gauthier, Isabel

    2004-08-01

    Dog experts, ornithologists, radiologists and other specialists are noted for their remarkable abilities at categorizing, identifying and recognizing objects within their domain of expertise. A complete understanding of the development of perceptual expertise requires a combination of thorough empirical research and carefully articulated computational theories that formalize specific hypotheses about the acquisition of expertise. A comprehensive computational theory of the development of perceptual expertise remains elusive, but we can look to existing computational models from the object-recognition, perceptual-categorization, automaticity and related literatures for possible starting points. Arguably, hypotheses about the development of perceptual expertise should first be explored within the context of existing computational models of visual object understanding before considering the creation of highly modularized adaptations for particular domains of perceptual expertise.

  12. The role of customized computational tools in product development.

    Energy Technology Data Exchange (ETDEWEB)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  13. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    The report presents a description of the Onboard Computer Complex (CC) that was developed during 1994-1998 for the Russian Segment of the ISS. The system was developed in cooperation with NASA and ESA. ESA developed a new computation system, called DMS-R, under an RSC Energia Technical Assignment. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on software simulators and on verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.

  14. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics (CFD), operating over a network connecting a parallel computing server and a client terminal, was developed. Using the system, a user at the client terminal can visualize the results of a CFD simulation on the parallel computer while the computation is actually running on the server. Using a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer and then compresses the data; the amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that the user can enjoy swift image updates. Parallelization of the image data generation is based on the owner-computes rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on any client PC on which a Web browser is installed. (author)

  15. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  16. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small-scale cluster. Each processor involved is a multicore processor with four cores, so the cluster has eight processing cores in total. The cluster runs the Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test was done to make sure that the computers are able to pass the required information without any problem; it used a simple MPI Hello program written in the C language (a minimal equivalent is sketched below). Additionally, a performance test was done to show that the cluster's calculation performance is much better than that of a single-CPU computer. In this performance test, four runs were made with the same code, using a single processor, 2 processors, 4 processors, and 8 processors. The results show that with additional processors the time required to solve the problem decreases: the calculation time shortens to half when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer using common hardware which is capable of higher computing power than a single CPU, and this can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics.
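
    The paper's communication test used an MPI "Hello" program written in C; an equivalent minimal program using mpi4py (a Python MPI binding, chosen here for consistency with the other sketches in this listing, not the authors' code) looks like this:

        # hello_mpi.py (run with: mpiexec -n 8 python hello_mpi.py)
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()           # this process's id, 0..size-1
        size = comm.Get_size()           # total number of MPI processes
        node = MPI.Get_processor_name()  # hostname: shows which PC ran the rank

        print(f"Hello from rank {rank} of {size} on {node}")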

  17. A Development of Computer Controlled 5 Axis Ultrasonic Testing System

    International Nuclear Information System (INIS)

    Kim, Y. S.; Kim, J. G.; Park, J. C.; Kim, N. I.

    1990-01-01

    A computer-controlled 5-axis ultrasonic testing system was developed in order to detect flaws in special parts with complex shapes. The various kinds of ultrasonic tests can be performed automatically using a computer program developed by DHI (Daewoo Heavy Industries Ltd.). With this program, the detector location can be programmed and the echo amplitude signal can be processed digitally. The test results can be plotted graphically on a high-resolution display monitor in real time. The test data can also be saved to magnetic memory devices (HDD or FDD) as well as output as hard copy through a color printer. The software contains C-scan and C+A-scan processing programs as well as statistical analysis for test data

  18. Development of a Technical Library to Support Computer Systems Evaluation

    Directory of Open Access Journals (Sweden)

    Patricia Munson Malley

    1971-12-01

    Full Text Available This paper reports on the development and growth of the United States Army Computer Systems Support and Evaluation Command (USACSSEC) Technical Reference Library from a collection of miscellaneous documents related to only fifty computer systems to the present collection of approximately 10,000 hardware/software technical documents related to over 200 systems from 70 manufacturers. Special emphasis is given to the evolution of the filing system and retrieval techniques unique to the USACSSEC Technical Reference Library, i.e., computer listings of available documents in various sequences, and the development of a cataloging system adaptable to computer technology. It is hoped that this paper will be a contribution toward a standard approach to cataloging ADP collections.

  19. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, the focus being on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  20. Research on Computer-based Creative Industries Development

    Science.gov (United States)

    Shuqin, Sun

    In recent years, creative industries based on computer technology have been booming, leading a new trend in this field. These creative industries consider innovation a driving force. They combine various cultural and art resources with the latest computer technology, establish new production and consumption patterns, promote new industrial clusters, cultivate new consumer groups and generate enormous economic and social value. Therefore, computer-based creative industries are not only a cultural or educational philosophy, but also a development strategy with practical and sustainable features.

  1. Computational models of the development of perceptual expertise

    OpenAIRE

    Gobet, F; Campitelli, G; Lane, PCR

    2007-01-01

    In a recent article, Palmeri, Wong and Gauthier have argued that computational models may help direct hypotheses about the development of perceptual expertise. They support their claim by an analysis of models from the object-recognition and perceptual-categorization literatures. Surprisingly, however, they do not consider any computational models from traditional research into expertise, essentially the research deriving from Chase and Simon’s chunking theory, which itself was influenced by ...

  2. Computational Fluid Dynamics. [numerical methods and algorithm development

    Science.gov (United States)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It is an overview of CFD activities at NASA Lewis Research Center. The main thrust of computational work at Lewis is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling will also be presented. Examples of results obtained with the most recent algorithm development will also be presented.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  4. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

    A state-of-the-art UNIX network compatible controller and a UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. One objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably
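
    The abstract does not reproduce the algorithm, but a digital reactivity computer of this kind typically solves the inverse point-kinetics equations from a sampled neutron flux or power trace. Below is a minimal one-delayed-group sketch in Python; the group constants, time step, and flux signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Illustrative one-delayed-group constants (assumptions, not from the paper)
BETA = 0.0065      # delayed-neutron fraction
LAM = 0.08         # precursor decay constant (1/s)
GEN_TIME = 1e-4    # neutron generation time Lambda (s)

def inverse_point_kinetics(n, dt):
    """Estimate reactivity rho(t) from a sampled flux/power trace n(t).

    Uses the one-group inverse point-kinetics relation
        rho = beta + Lambda * (dn/dt - lambda * C) / n,
    advancing the precursor concentration C by explicit Euler.
    """
    c = BETA / (GEN_TIME * LAM) * n[0]   # equilibrium precursors at t = 0
    rho = np.zeros_like(n, dtype=float)
    for k in range(1, len(n)):
        dndt = (n[k] - n[k - 1]) / dt
        rho[k] = BETA + GEN_TIME * (dndt - LAM * c) / n[k]
        c += dt * (BETA / GEN_TIME * n[k] - LAM * c)
    return rho

# Sanity check: a constant power trace should give (near-)zero reactivity
flux = np.ones(1000)
print(inverse_point_kinetics(flux, dt=0.01)[-1])   # ~0.0
```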

  5. Development and Validation of Self Instructional Computer Based ...

    African Journals Online (AJOL)

    The study is on the development and validation of a self-instructional computer-based package for teaching social studies in senior primary schools. The study investigated the effect on the intellectual development and study habits of senior primary school students where social studies was taught with and without the ...

  6. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, N.M.; Gustafson, Kent

    2000-01-01

    Since the start of the early 90’s, an increasing number of people are interested in supporting the complex tasks of the curriculum development process with computer-based tools. ‘Curriculum development’ refers to an intentional process or activity directed at (re) designing, developing and

  7. Development of a computer aided learning system for graphical ...

    African Journals Online (AJOL)

    We present the development and deployment process of a computer-aided learning tool which serves as a training aid for undergraduate control engineering courses. We show the process of algorithm construction and implementation of the software, which is also aimed at teaching software development at undergraduate ...

  8. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, designed for the PSE in the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex, large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large, complicated software and for simulating complex, large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using this prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and biomechanical systems are planned

  9. Computational support in product development:applications from high temperature design and development

    OpenAIRE

    Isaksson, Ola

    1998-01-01

    In this thesis a new perspective to Computational Support in Product Development is presented. It is explained and discussed how computational simulation can be used to achieve shorter lead time and better quality in the product development process. The approach proposed is to use a generic and process based sequential decomposition of computational simulation activities in a product development project. This methodology seeks to ensure that the best methods and tools are used in all stages o...

  10. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost-effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors, each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN- or C-programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16-port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256-node, 5 GFlop system is under construction. 10 refs., 7 figs

  11. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  12. Development of the computer network of IFIN-HH

    International Nuclear Information System (INIS)

    Danet, A.; Mirica, M.; Constantinescu, S.

    1998-01-01

    The general computer network of the Horia Hulubei National Institute for Physics and Nuclear Engineering (IFIN-HH), as part of RNC (the Romanian National Computer Network for scientific research and technological development), offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research area. RNC is the national project co-ordinated and established by the Ministry of Research and Technology, targeted on the following main objectives: - setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; - providing a rapid and competitive tool for the exchange of information in the framework of the R-D community; - using the scientific and technical databases available in the country and offered by the national networks of other countries through international networks; - providing support for information, documentation, and scientific and technical co-operation. The guiding principle in elaborating the project of the general computer network of IFIN-HH was to implement an open system based on OSI standards, without technical barriers to communication between different communities using different computing hardware and software. The major objectives achieved in 1997 in developing the general computer network of IFIN-HH (over 250 computers connected) were: - connecting all the existing and newly installed computer equipment and providing adequate connectivity; - providing the usual Internet services: e-mail, ftp, telnet, finger, gopher; - providing access to World Wide Web resources; - providing on-line statistics of the IP traffic (input and output) of each node of the domain computer network; - improving the performance of the connection with the central RNC node. (authors)

  13. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  14. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  15. Laboratory Works Designed for Developing Student Motivation in Computer Architecture

    Directory of Open Access Journals (Sweden)

    Petre Ogrutan

    2017-02-01

    In light of the current difficulties related to maintaining students' interest and stimulating their motivation for learning, the authors have developed a range of new laboratory exercises intended for first-year students in Computer Science as well as for engineering students who have completed at least one course in computers. The educational goal of the proposed laboratory exercises is to enhance the students' motivation and creative thinking by organizing a relaxed yet competitive learning environment. The authors have developed a device including LEDs and switches, which is connected to a computer. By using assembly language, commands can be issued to flash several LEDs and read the states of the switches. The effectiveness of this idea was confirmed by a statistical study.

  16. The Role of Private Sector in Applying Computer Technology to Development in Developing Countries.

    Science.gov (United States)

    Rangnekar, Sharu S.

    In developing countries, the most important problem faced by the persons who are trying to force the pace of development is often a psychological one. The use of computers has spread in the private sector, public sector companies and government departments. However, propaganda against computers requires reconsideration of the role in developing…

  17. Development of an Intelligent Instruction System for Mathematical Computation

    Science.gov (United States)

    Kim, Du Gyu; Lee, Jaemu

    2013-01-01

    In this paper, we propose the development of a web-based, intelligent instruction system to help elementary school students for mathematical computation. We concentrate on the intelligence facilities which support diagnosis and advice. The existing web-based instruction systems merely give information on whether the learners' replies are…

  18. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  19. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    Science.gov (United States)

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  20. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  1. Development and Evaluation of a Computer-Aided Learning (CAL ...

    African Journals Online (AJOL)

    Against this background, this paper was aimed at teaching students the fundamentals of C++ programming language in higher institutions in Nigeria using Computer Aided Learning software system (CAL) developed for C++, which is a course being taught at the 2nd year, 3rd year and 4th year to students of Engineering ...

  2. Recent developments and applications in mathematics and computer science

    International Nuclear Information System (INIS)

    Churchhouse, R.F.; Tahir Shah, K.; Zanella, P.

    1991-01-01

    The book contains 8 invited lectures and 4 short seminars presented at the College on Recent Developments and Applications in Mathematics and Computer Science held in Trieste from 7 May to 1 June 1990. A separate abstract was prepared for each paper. Refs, figs and tabs

  3. Development of a Computer Program for the Design of Auger ...

    African Journals Online (AJOL)

    A computer program was developed for the above processes to remove the constraints of the classical approach. The program which is iterative and menu driven, accepts relevant input data (material to be conveyed, required capacity, elevations involved, etc) and does the required calculations, selection and optimization.

  4. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    This project concerns the use of graphics processing units (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, and dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  5. Development of computer-aided auto-ranging technique for a computed radiography system

    International Nuclear Information System (INIS)

    Ishida, M.; Shimura, K.; Nakajima, N.; Kato, H.

    1988-01-01

    For a computed radiography system, the authors developed a computer-aided auto-ranging technique in which the clinically useful image data are automatically mapped to the available display range. The preread image data are inspected to determine the location of collimation. A histogram of the pixels inside the collimation is evaluated for characteristic values such as maxima and minima, and the optimal density and contrast are then derived for the display image. The effect of the auto-ranging technique was investigated at several hospitals in Japan. The average rate of films lost due to undesirable density or contrast was about 0.5%
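
    The abstract does not give the exact mapping, so the following Python sketch only illustrates the general idea under stated assumptions: take the histogram of pixel values inside the collimated field, pick robust minimum/maximum characteristic values, and linearly rescale that range to the display range. The percentile choice and display depth are invented for the example.

```python
import numpy as np

def auto_range(preread, collimation_mask, display_max=1023, clip=(1.0, 99.0)):
    """Map the clinically useful data range to the available display range.

    Characteristic minima/maxima are approximated by robust percentiles of
    the histogram of pixels inside the collimation; the image is then
    linearly windowed to [0, display_max].
    """
    inside = preread[collimation_mask]
    lo, hi = np.percentile(inside, clip)
    scaled = (preread.astype(float) - lo) / max(hi - lo, 1e-9)
    return np.clip(scaled * display_max, 0, display_max).astype(np.uint16)

# Example with synthetic data: a circular collimated field in a 256x256 image
img = np.random.randint(100, 4000, size=(256, 256))
yy, xx = np.mgrid[:256, :256]
mask = (yy - 128) ** 2 + (xx - 128) ** 2 < 100 ** 2
display_img = auto_range(img, mask)
```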

  6. Development of the Shimadzu computed tomographic scanner SCT-200N

    International Nuclear Information System (INIS)

    Ishihara, Hiroshi; Yamaoka, Nobuyuki; Saito, Masahiro

    1982-01-01

    The Shimadzu Computed Tomographic Scanner SCT-200N has been developed as an ideal CT scanner for diagnosing the head and spine. Owing to the large aperture, moderate scan time and the Zoom Scan Mode, any part of the body can be scanned. High-quality images can be obtained by adopting a precisely stabilized X-ray unit and a densely packed array of 64 detectors. As for its operation, the capability of computed radiography (CR) prior to patient positioning and real-time reconstruction ensure efficient patient throughput. Details of the SCT-200N are described in this paper. (author)

  7. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of microcomputer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of microcomputer-based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  8. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah

    2006-01-01

    Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line-array detector. The recent development of X-ray flat panel detectors has made fast CT imaging feasible and practical. This paper therefore explains the arrangement of a new detection system, which uses the existing high-resolution (127 μm pixel size) flat panel detector at MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consists of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence the project is divided into two major tasks: firstly, to develop the image reconstruction algorithm, and secondly, to integrate the X-ray imaging components into one CT system. An image reconstruction algorithm using the filtered back-projection method is developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
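
    The abstract names filtered back-projection but gives no implementation details. As a minimal stand-in for the authors' MATLAB code, the scikit-image sketch below simulates projections of a synthetic phantom and reconstructs it with ramp-filtered back-projection; the phantom, angle count, and filter choice are assumptions for the example.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phantom = shepp_logan_phantom()                       # synthetic test object
angles = np.linspace(0.0, 180.0, 180, endpoint=False)

sinogram = radon(phantom, theta=angles)               # simulated CT projections
# Filtered back-projection: ramp-filter each projection, then back-project
# (the filter_name keyword requires scikit-image >= 0.19)
recon = iradon(sinogram, theta=angles, filter_name="ramp")

rms = np.sqrt(np.mean((recon - phantom) ** 2))
print(f"RMS reconstruction error: {rms:.4f}")
```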

  9. PREVIOUS SECOND TRIMESTER ABORTION

    African Journals Online (AJOL)

    PNLC

    PREVIOUS SECOND TRIMESTER ABORTION: A risk factor for third trimester uterine rupture in three ... for accurate diagnosis of uterine rupture. KEY WORDS: Induced second trimester abortion - Previous uterine surgery - Uterine rupture. ... scarred uterus during second trimester misoprostol-induced labour for a missed ...

  10. Ameloblastic fibroma: a stage in the development of a hamartomatous odontoma or a true neoplasm? Critical analysis of 162 previously reported cases plus 10 new cases.

    Science.gov (United States)

    Buchner, Amos; Vered, Marilena

    2013-11-01

    To analyze neoplastic and hamartomatous variants of ameloblastic fibromas (AFs), 172 cases were reviewed (162 previously reported, 10 new). AF emerged as a lesion primarily of children and adolescents (mean age, 14.9 years), with about 80% diagnosed before odontogenesis is completed (age 22 years). Lesions in patients older than 22 years are considered true neoplasms, while those in younger patients may be either true neoplasms or odontomas in early stages of development. Although the histopathology of the hamartomatous and neoplastic variants of AF is indistinguishable, clinical and radiologic features can be of some help in distinguishing between them. Asymptomatic small unilocular lesions with no or minimal bone expansion in young individuals are likely to be developing odontomas, while large, expansile lesions with extensive bone destruction are neoplasms. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Development of high-Reynolds-number-flow computation

    International Nuclear Information System (INIS)

    Kuwahara, K.

    1986-01-01

    It has become clear that high-Reynolds-number flows can be simulated by directly integrating the Navier-Stokes equations. Numerical diffusion plays a very important role in obtaining reasonable results. Numerical diffusion of the second-order-derivative type, which appears in the first-order upwind scheme or as an implicit diffusion in the Beam-Warming-Steger method, is similar to molecular diffusion; it conceals the dependence of the flow on the Reynolds number and is not suitable for high-Reynolds-number-flow computation. On the other hand, numerical diffusion of the fourth-order-derivative type is of short range, does not conceal the effect of molecular diffusion, and stabilizes the computation very well. At the present stage, this may be the best way to overcome numerical instability in high-Reynolds-number-flow computation. Even simulation of unsteady compressible flows at high Reynolds number is feasible. The numerical simulation of high-Reynolds-number flow is much simpler than previously believed

  12. Interpreting "Personality" Taxonomies: Why Previous Models Cannot Capture Individual-Specific Experiencing, Behaviour, Functioning and Development. Major Taxonomic Tasks Still Lay Ahead.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    As science seeks to make generalisations, a science of individual peculiarities encounters intricate challenges. This article explores these challenges by applying the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) and by exploring taxonomic "personality" research as an example. Analyses of researchers' interpretations of the taxonomic "personality" models, constructs and data that have been generated in the field reveal widespread erroneous assumptions about the abilities of previous methodologies to appropriately represent individual-specificity in the targeted phenomena. These assumptions, rooted in everyday thinking, fail to consider that individual-specificity and others' minds cannot be directly perceived, that abstract descriptions cannot serve as causal explanations, that between-individual structures cannot be isomorphic to within-individual structures, and that knowledge of compositional structures cannot explain the process structures of their functioning and development. These erroneous assumptions and serious methodological deficiencies in widely used standardised questionnaires have effectively prevented psychologists from establishing taxonomies that can comprehensively model individual-specificity in most of the kinds of phenomena explored as "personality", especially in experiencing and behaviour and in individuals' functioning and development. Contrary to previous assumptions, it is not universal models but rather different kinds of taxonomic models that are required for each of the different kinds of phenomena, variations and structures that are commonly conceived of as "personality". Consequently, to comprehensively explore individual-specificity, researchers have to apply a portfolio of complementary methodologies and develop different kinds of taxonomies, most of which have yet to be developed. In closing, the article derives some meta-desiderata for future research on individuals' "personality".

  13. Development of a computational methodology for internal dose calculations

    CERN Document Server

    Yoriyaz, H

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods, as in the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phanto...
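
    SCMS itself is not described in detail here; as a toy illustration of the general idea of turning tomographic images into a Monte Carlo geometry, the sketch below segments a stack of CT slices into a voxel material map by Hounsfield-unit thresholds. The thresholds and material IDs are invented for the example and are not those of SCMS or MCNP-4B.

```python
import numpy as np

# Illustrative Hounsfield-unit ranges mapped to material IDs (assumptions)
MATERIALS = [
    (-1000, -200, 0),   # air
    (-200, 200, 1),     # soft tissue
    (200, 3000, 2),     # bone
]

def build_voxel_phantom(ct_volume):
    """Segment a CT volume (z, y, x) of Hounsfield units into material IDs."""
    phantom = np.zeros(ct_volume.shape, dtype=np.uint8)
    for lo, hi, mat in MATERIALS:
        phantom[(ct_volume >= lo) & (ct_volume < hi)] = mat
    return phantom

volume = np.random.uniform(-1000, 1500, size=(16, 64, 64))  # fake CT data
phantom = build_voxel_phantom(volume)
ids, counts = np.unique(phantom, return_counts=True)
print(dict(zip(ids.tolist(), counts.tolist())))   # voxel count per material
```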

  14. Development of computer simulations for landfill methane recovery

    Energy Technology Data Exchange (ETDEWEB)

    Massmann, J.W.; Moore, C.A.; Sykes, R.M.

    1981-12-01

    Two- and three-dimensional finite-difference computer programs simulating methane recovery systems in landfills have been developed. These computer programs model multicomponent combined pressure and diffusional flow in porous media. Each program and the processes it models are described in this report. Examples of the capabilities of each program are also presented. The two-dimensional program was used to simulate methane recovery systems in a cylindrically shaped landfill. The effects of various pump locations, geometries, and extraction rates were determined. The three-dimensional program was used to model the Puente Hills landfill, a field test site in southern California. The biochemical and microbiological details of methane generation in landfills are also given. Effects of environmental factors, such as moisture, oxygen, temperature, and nutrients on methane generation are discussed and an analytical representation of the gas generation rate is developed.
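
    The report's programs are not reproduced in the abstract; the toy Python sketch below only shows the kind of explicit finite-difference update such a simulator rests on, for a single gas species with diffusive, pressure-driven flow toward one extraction well. The coefficients, grid, and boundary conditions are illustrative assumptions.

```python
import numpy as np

N = 50                  # grid points per side
D = 0.1                 # effective gas mobility/diffusivity (illustrative)
GEN = 0.01              # uniform methane generation source term
dt, dx = 0.01, 1.0      # satisfies the explicit stability limit dt*D/dx**2 < 0.25
well = (25, 25)         # extraction well location
p = np.zeros((N, N))    # gas overpressure field

for step in range(5000):
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p) / dx**2
    p += dt * (D * lap + GEN)     # explicit finite-difference update
    p[well] = -1.0                # pump held at constant suction
    p[0, :] = p[-1, :] = p[:, 0] = p[:, -1] = 0.0   # atmospheric boundary

print(f"peak overpressure after 5000 steps: {p.max():.3f}")
```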

  15. Innovation in nursing education: development of computer-assisted thinking.

    Science.gov (United States)

    Kanai-Pak, M; Hosoi, R; Arai, C; Ishii, Y; Seki, M; Kikuchi, Y; Kabasawa, K; Sato, K

    1997-01-01

    In order to enhance students' active thinking, faculty members at the International University of Health and Welfare developed the CAT (Computer Assisted Thinking) program. The CAT program is different from CAI (Computer Assisted Instruction), which mainly asks users to choose correct answers. Instead, the CAT program asks users to type in short sentences. There are two functions in the CAT program: one is to keep a log of the students' actions each time they use the program, and the other is to serve as a medical dictionary. An analysis of the action logs revealed that the students demonstrated little skill in inferential thinking. Their observations were very concrete. In order to help the students develop their abstract thinking skills, we need to review our curriculum.

  16. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  17. Development of a Very Dense Liquid Cooled Compute Platform

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.
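
    For reference, DCIE (data center infrastructure efficiency) is conventionally defined as the fraction of total facility power that reaches the IT equipment, so the reported figure implies about 7% overhead for cooling and power conversion:

```latex
\mathrm{DCIE} = \frac{P_{\text{IT}}}{P_{\text{total facility}}},
\qquad \mathrm{DCIE} = 0.93 \;\Rightarrow\; \text{overhead} = 1 - 0.93 = 7\%.
```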

  18. The influence of playing computer games on pupil's development

    OpenAIRE

    Pospíšilová, Lenka

    2008-01-01

    This thesis is about the effects of playing computer games on pupils' and students' behavior. It is divided into a theoretical and an investigative part. The theoretical part is dedicated to the historical development of technologies and principles of game systems in relation to technical progress. It adverts to the psychological, social and biological effects of long-term, intensive playing of games. It shows the positive and negative effects of this activity. The work analyses typical pathological eve...

  19. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

    Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US), but all over the world. For this reason, many game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  20. Multilink manipulator computer control: experience in development and commissioning

    International Nuclear Information System (INIS)

    Holt, J.E.

    1988-11-01

    This report describes development which has been carried out on the multilink manipulator computer control system. The system allows the manipulator to be driven using only two joysticks. The leading link is controlled and the other links follow its path into the reactor, thus avoiding any potential obstacles. The system has been fully commissioned and used with the Sizewell ''A'' reactor 2 Multilink T.V. manipulator. Experience of the use of the system is presented, together with recommendations for future improvements. (author)

  1. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axis. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypothesis.

  2. The Computational Development of Reinforcement Learning during Adolescence.

    Directory of Open Access Journals (Sweden)

    Stefano Palminteri

    2016-06-01

    Adolescence is a period of life characterised by changes in learning and decision-making. Learning and decision-making do not rely on a unitary system, but instead require the coordination of different cognitive processes that can be mathematically formalised as dissociable computational modules. Here, we aimed to trace the developmental time-course of the computational modules responsible for learning from reward or punishment, and learning from counterfactual feedback. Adolescents and adults carried out a novel reinforcement learning paradigm in which participants learned the association between cues and probabilistic outcomes, where the outcomes differed in valence (reward versus punishment) and feedback was either partial or complete (either the outcome of the chosen option only, or the outcomes of both the chosen and unchosen options, was displayed). Computational strategies changed during development: whereas adolescents' behaviour was better explained by a basic reinforcement learning algorithm, adults' behaviour integrated increasingly complex computational features, namely a counterfactual learning module (enabling enhanced performance in the presence of complete feedback) and a value contextualisation module (enabling symmetrical reward and punishment learning). Unlike adults, adolescent performance did not benefit from counterfactual (complete) feedback. In addition, while adults learned symmetrically from both reward and punishment, adolescents learned from reward but were less likely to learn from punishment. This tendency to rely on rewards and not to consider alternative consequences of actions might contribute to our understanding of decision-making in adolescence.
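
    The modules described above can be illustrated with a minimal delta-rule learner; the counterfactual module simply applies the same update to the unchosen option when complete feedback is available. The Python sketch below uses invented parameter values and a toy two-armed task, not the paradigm or fitted parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(0)
ALPHA, BETA_SM = 0.3, 5.0   # learning rate, softmax inverse temperature (illustrative)

def softmax(q):
    e = np.exp(BETA_SM * (q - q.max()))
    return e / e.sum()

# Toy two-armed task: option 0 pays +1 with p=0.75, option 1 with p=0.25
q = np.zeros(2)
for trial in range(200):
    choice = rng.choice(2, p=softmax(q))
    rewards = (rng.random(2) < np.array([0.75, 0.25])).astype(float)

    # Basic reinforcement learning module: update the chosen option only
    q[choice] += ALPHA * (rewards[choice] - q[choice])

    # Counterfactual module (complete feedback): also update the unchosen option
    other = 1 - choice
    q[other] += ALPHA * (rewards[other] - q[other])

print(q)   # values approach the true reward probabilities
```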

  3. Development of a Pamphlet Targeting Computer Workstation Ergonomics

    Science.gov (United States)

    Faraci, Jennifer S.

    1997-01-01

    With the increased use of computers throughout Goddard Space Flight Center, the Industrial Hygiene Office (IHO) has observed a growing trend in the number of health complaints attributed to poor computer workstation setup. A majority of the complaints has centered around musculoskeletal symptoms, including numbness, pain, and tingling in the upper extremities, shoulders, and neck. Eye strain and headaches have also been reported. In some cases, these symptoms can lead to chronic conditions such as repetitive strain injuries (RSI's). In an effort to prevent or minimize the frequency of these symptoms among the GSFC population, the IHO conducts individual ergonomic workstation evaluations and ergonomics training classes upon request. Because of the extensive number of computer workstations at GSFC, and the limited amount of manpower which the Industrial Hygiene staff could reasonably allocate to conduct workstation evaluations and employee training, a pamphlet was developed with a two-fold purpose: (1) to educate the GSFC population about the importance of ergonomically-correct computer workstation setup and the potential effects of a poorly configured workstation; and (2) to enable employees to perform a general assessment of their own workstations and make any necessary modifications for proper setup.

  4. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Science.gov (United States)

    Ferreira, C. C.; Galvão, L. A.; Vieira, J. W.; Maia, A. F.

    2011-04-01

    A way forward for the development of an exposure computational model (ECM) for computed tomography (CT) dosimetry is presented: an ECM has been developed and validated through comparison with experimental results. For the development of the ECM, X-ray spectra generator codes were evaluated and the head bow tie filter was modelled through a mathematical equation. EGS4 and EGSnrc were used by the ECM for simulating the radiation transport. Geometrical phantoms, commonly used in CT dosimetry, were modelled with the IDN software. MAX06 was also used to simulate an adult male patient submitted to CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed a dependence on tube filtration (or HVL value). In general, as total filtration (or HVL value) increases, X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination calculated C100,c in better concordance with the C100,c measured in two different CT scanners. For a Toshiba CT scanner, the average percentage difference between the calculated and measured C100,c values was 8.2%, whilst for a GE CT scanner the average percentage difference was 10.4%. From measurements of air kerma through a prototype head bow tie filter, a third-order exponential decay equation was found. C100,c and C100,p values calculated by the ECM are in good agreement with values measured on a specific CT scanner. A maximum percentage difference of 2% was found for the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work were compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient. For a head examination the absorbed dose values calculated by the ECM developed by Jones and Shrimpton
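
    The fitted equation itself is not reproduced in the abstract; a third-order exponential decay of transmitted air kerma K with position x across the filter is conventionally written as below, where the amplitudes A_i, decay constants t_i, and offset y_0 are the fit parameters (this notation is an assumption, not the authors'):

```latex
K(x) = y_0 + A_1 e^{-x/t_1} + A_2 e^{-x/t_2} + A_3 e^{-x/t_3}
```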

  5. Computer-Aided Sensor Development Focused on Security Issues.

    Science.gov (United States)

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality- and time-related advantages, because the most difficult and most laborious activities are software-supported and design reusability grows. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section, the computer-supported, pattern-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a pattern-based and computer-aided methodology for the development of sensors with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for sensor system development, and presents directions for future research.

  6. The development of AR book for computer learning

    Science.gov (United States)

    Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee

    2017-08-01

    Educators need to provide alternative educational tools to foster students' learning outcomes. By using AR technology to create exciting edutainment experiences, this paper presents how augmented reality (AR) can be applied in education. This study aims to develop an AR book for tenth grade students (age 15-16) and evaluate its quality. The AR book was developed following the ADDIE framework to support computer learning on software knowledge. The content accorded with the current Thai education curriculum. The AR book had 10 pages in three topics (the first was "Introduction," the second was "System Software" and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animations and video clips). The obtained data were analyzed in terms of average and standard deviation. The validity of the multimedia design of the AR book was assessed by three experts in multimedia design. A five-point Likert scale was used, and the values were x̄ = 4.84, S.D. = 1.27, which is considered very high. Moreover, three content experts, who specialize in computer teaching, evaluated the AR book's content validity. The values determined by the experts were x̄ = 4.69, S.D. = 0.29, which is considered very high. Implications for future study and education are discussed.

  7. Development of unstructured mesh generator on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Shimada, Akio; Murakami, Hiroyuki; Higashida, Akihiro; Wakatsuki, Shigeto

    2000-01-01

    A general-purpose unstructured mesh generator, 'GRID3D/UNST', has been developed on parallel computers. The high-speed operations and large-scale memory capacity of parallel computers enable the system to generate a large-scale mesh at high speed. In fact, the system generates a large-scale mesh composed of 2,400,000 nodes and 14,000,000 elements in about 1.5 hours on a HITACHI SR2201 with 64 PEs (Processing Elements), after a 2.5-hour pre-process on a SUN workstation. The system is also built on standard FORTRAN, C and Motif, and therefore has high portability. The system enables us to solve large-scale problems that were previously impossible to solve, and to break new ground in the fields of science and engineering. (author)

  8. Computational study of developing high-quality decision trees

    Science.gov (United States)

    Fu, Zhiwei

    2002-03-01

    Recently, decision tree algorithms have been widely used for data mining problems to find valuable rules and patterns. However, scalability, accuracy and efficiency are significant concerns regarding how to effectively deal with large and complex data sets. In this paper, we propose an innovative machine learning approach (which we call GAIT), combining a genetic algorithm, statistical sampling, and decision trees, to develop intelligent decision trees that can alleviate some of these problems. We design computational experiments and run GAIT on three different data sets (namely Socio-Olympic data, Westinghouse data, and FAA data) to test its performance against the standard decision tree algorithm, a neural network classifier, and a statistical discriminant technique, respectively. The computational results show that our approach substantially outperforms the standard decision tree algorithm at lower sampling levels, and achieves significantly better results with less effort than both the neural network and discriminant classifiers.
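
    GAIT's exact operators are not specified in this summary, so the sketch below is only a toy version of the idea: a genetic search over decision trees fitted to small statistical samples, scored on held-out data. The scikit-learn dataset, population sizes, and crossover/mutation operators are invented for the illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

SAMPLE, POP, GENS = 200, 20, 15   # sampling level and GA sizes (illustrative)

def fitness(idx):
    """Fitness = validation accuracy of a tree grown on the individual's sample."""
    tree = DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx])
    return tree.score(X_val, y_val)

# Each individual is a statistical sample of training-row indices
pop = [rng.choice(len(X_tr), SAMPLE, replace=False) for _ in range(POP)]
for gen in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]                       # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        i, j = rng.choice(len(parents), 2, replace=False)
        pool = np.union1d(parents[i], parents[j])   # crossover: pool the samples
        child = rng.choice(pool, SAMPLE, replace=False)
        n_mut = SAMPLE // 10                        # mutation: inject fresh rows
        child[:n_mut] = rng.choice(len(X_tr), n_mut, replace=False)
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(f"best sampled-tree validation accuracy: {fitness(best):.3f}")
```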

  9. Development of a proton Computed Tomography Detector System

    Energy Technology Data Exchange (ETDEWEB)

    Naimuddin, Md. [Delhi U.; Coutrakon, G. [Northern Illinois U.; Blazey, G. [Northern Illinois U.; Boi, S. [Northern Illinois U.; Dyshkant, A. [Northern Illinois U.; Erdelyi, B. [Northern Illinois U.; Hedin, D. [Northern Illinois U.; Johnson, E. [Northern Illinois U.; Krider, J. [Northern Illinois U.; Rukalin, V. [Northern Illinois U.; Uzunyan, S. A. [Northern Illinois U.; Zutshi, V. [Northern Illinois U.; Fordt, R. [Fermilab; Sellberg, G. [Fermilab; Rauch, J. E. [Fermilab; Roman, M. [Fermilab; Rubinov, P. [Fermilab; Wilson, P. [Fermilab

    2016-02-04

    Computed tomography is one of the most promising new methods to image abnormal tissues inside the human body. Tomography is also used to position the patient accurately before radiation therapy. Hadron therapy for treating cancer has become one of the most advantageous and safe options. In order to fully utilize the advantages of hadron therapy, it is necessary to perform radiography with hadrons as well. In this paper we present the development of a proton computed tomography system. Our second-generation proton tomography system consists of two upstream and two downstream trackers, made up of fibers as the active material, and a range detector consisting of plastic scintillators. We present details of the detector system, readout electronics, and data acquisition system, as well as the commissioning of the entire system. We also present preliminary results from the test beam of the range detector.

  10. Developing Activities for Teaching Cloud Computing and Virtualization

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2014-10-01

    Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example, storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems. It also results in reduced loads and energy savings in terms of the power and cooling infrastructure. It is therefore important to investigate the practical aspects of this topic, both for industry practice and for teaching purposes. This paper demonstrates some activities undertaken recently by students at the Eastern Institute of Technology New Zealand and concludes with general recommendations for IT educators, software developers, and other IT professionals

  11. Development of a Computer Application to Simulate Porous Structures

    Directory of Open Access Journals (Sweden)

    S.C. Reis

    2002-09-01

    Geometric modeling is an important tool for evaluating structural parameters as well as for following the application of stereological relationships. The obtention, visualization and analysis of volumetric images of the structure of materials, using computational geometric modeling, facilitates the determination of structural parameters of difficult experimental access, such as topological and morphological parameters. In this work, we developed a geometrical model, implemented in computer software, that simulates random pore structures. The number of nodes, the number of branches (connections between nodes) and the number of isolated parts are obtained, and the connectivity (C) is derived from them by this application. Using a list of elements, nodes and branches generated by the software in AutoCAD® command-line format, the obtained structure can be viewed and analyzed.
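
    The paper does not list its algorithms, so the sketch below shows one standard way to obtain the reported quantities from a branch (edge) list under stated assumptions: count isolated parts with a union-find and take the connectivity as the first Betti number C = b - n + p (branches minus nodes plus parts), i.e. the number of independent closed loops.

```python
def network_parameters(num_nodes, branches):
    """Return (num_branches, isolated_parts, connectivity) of a node/branch set.

    Connectivity is computed as the first Betti number C = b - n + p.
    """
    parent = list(range(num_nodes))

    def find(i):                       # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for a, b in branches:
        parent[find(a)] = find(b)      # merge the two ends of each branch

    parts = len({find(i) for i in range(num_nodes)})
    c = len(branches) - num_nodes + parts
    return len(branches), parts, c

# Example: a square loop (4 nodes, 4 branches) plus one isolated node
print(network_parameters(5, [(0, 1), (1, 2), (2, 3), (3, 0)]))   # (4, 2, 1)
```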

  12. Development of a mechanistically based computer simulation of nitrogen oxide absorption in packed towers

    International Nuclear Information System (INIS)

    Counce, R.M.

    1981-01-01

    A computer simulation for nitrogen oxide (NOx) scrubbing in packed towers was developed for use in process design and process control. This simulation implements a mechanistically based mathematical model, which was formulated from (1) an exhaustive literature review; (2) previous NOx scrubbing experience with sieve-plate towers; and (3) comparisons of sequential sets of experiments. Nitrogen oxide scrubbing is characterized by simultaneous absorption and desorption phenomena: the model development is based on experiments designed to feature these two phenomena. The model was then successfully tested in experiments designed to put it in jeopardy

  13. Computable general equilibrium model fiscal year 2013 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rivera, Michael Kelly [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-17

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), first developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State, to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.

  14. Model-Driven Development for scientific computing. Computations of RHEED intensities for a disordered surface. Part I

    Science.gov (United States)

    Daniluk, Andrzej

    2010-03-01

    Scientific computing is the field of study concerned with constructing mathematical models, numerical solution techniques and with using computers to analyse and solve scientific and engineering problems. Model-Driven Development (MDD) has been proposed as a means to support the software development process through the use of a model-centric approach. This paper surveys the core MDD technology that was used to develop an application that allows computation of the RHEED intensities dynamically for a disordered surface.
    New version program summary
    Program title: RHEED1DProcess
    Catalogue identifier: ADUY_v4_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADUY_v4_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 31 971
    No. of bytes in distributed program, including test data, etc.: 3 039 820
    Distribution format: tar.gz
    Programming language: Embarcadero C++ Builder
    Computer: Intel Core Duo-based PC
    Operating system: Windows XP, Vista, 7
    RAM: more than 1 GB
    Classification: 4.3, 7.2, 6.2, 8, 14
    Catalogue identifier of previous version: ADUY_v3_0
    Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2394
    Does the new version supersede the previous version?: No
    Nature of problem: An application that implements numerical simulations should be constructed according to the CSFAR rules: clear and well-documented, simple, fast, accurate, and robust. A clearly written, externally and internally documented program is much easier to understand and modify. A simple program is much less prone to error and is more easily modified than one that is complicated. Simplicity and clarity also help make the program flexible. Making the program fast has economic benefits. It also allows flexibility because some of the features that make a program efficient can be traded off for

  15. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of the Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During those operations, or while entering the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar in the umbilical region was performed, namely the technique of closed laparoscopy. In not a single patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  16. Conceptual aspects: analyses law, ethical, human, technical, social factors of development ICT, e-learning and intercultural development in different countries setting out the previous new theoretical model and preliminary findings

    NARCIS (Netherlands)

    Kommers, Petrus A.M.; Smyrnova-Trybulska, Eugenia; Morze, Natalia; Issa, Tomayess; Issa, Theodora

    2015-01-01

    This paper, prepared by an international team of authors, focuses on conceptual aspects: analyses of the legal, ethical, human, technical and social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary

  17. Discoveries and developments in human-computer interaction.

    Science.gov (United States)

    Boehm-Davis, Deborah A

    2008-06-01

    This paper describes contributions made to the science and practice of human-computer interaction (HCI), primarily through Human Factors and the society's annual proceedings. Research in HCI began to appear in publications associated with the Society around 1980 and has continued through the present. A search of the literature appearing in either the journal or the proceedings was done to identify the specific contributions made by researchers in this area. More than 2,300 papers were identified, some comparing the actual or predicted performance of a new device, display format, or computer-based system with an existing or alternative system. Other work describes methods for evaluating systems performance. This work has had a tremendous impact, particularly the work of Fitts, Smith and Mosier, and Virzi. Work on HCI has contributed to (a) current national and international guidelines, (b) the development of user interface management systems, (c) the provision of guidance as to where best to invest resources when evaluating computing systems, and (d) the prediction of human performance using those systems.

  18. A computer simulator for development of engineering system design methodologies

    Science.gov (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
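
    The simulator's core idea, replacing each subsystem's expensive analysis with a cheap analytical function while preserving the data couplings, can be sketched in a few lines. In the sketch below, the two coupled "subsystems" and their analytical forms are invented stand-ins, not the functions used in the actual simulator:

      import math

      # Two coupled "subsystems": each output feeds the other, as in a
      # multilevel engineering system. The analytical forms are cheap
      # stand-ins for the real (expensive) engineering analyses.
      def aero(x, s):          # returns a coupling variable, e.g. drag
          return 0.2 * x + 0.1 * math.sqrt(s)

      def structure(x, d):     # returns a coupling variable, e.g. mass
          return 1.0 + 0.5 * x + 0.3 * d

      def simulate(x, tol=1e-10, max_iter=100):
          """Fixed-point iteration over the subsystem couplings."""
          d, s = 0.0, 1.0
          for _ in range(max_iter):
              d_new = aero(x, s)
              s_new = structure(x, d_new)
              if abs(d_new - d) < tol and abs(s_new - s) < tol:
                  return d_new, s_new
              d, s = d_new, s_new
          raise RuntimeError("couplings did not converge")

      print(simulate(1.0))   # converged coupling variables for design x = 1.0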

  19. Computational techniques used in the development of coprocessing flowsheets

    International Nuclear Information System (INIS)

    Groenier, W.S.; Mitchell, A.D.; Jubin, R.T.

    1979-01-01

    The computer program SEPHIS, developed to aid in determining optimum solvent extraction conditions for the reprocessing of nuclear power reactor fuels by the Purex method, is described. The program employs a combination of approximate mathematical equilibrium expressions and a transient, stagewise-process calculational method to allow stage and product-stream concentrations to be predicted with accuracy and reliability. The possible applications to inventory control for nuclear material safeguards, nuclear criticality analysis, and process analysis and control are of special interest. The method is also applicable to other countercurrent liquid-liquid solvent extraction processes with known chemical kinetics, possibly involving multiple solutes, performed in conventional contacting equipment
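
    The transient, stagewise calculational method can be illustrated generically. The sketch below relaxes the stage-by-stage mass balances of a countercurrent cascade with a constant distribution coefficient D (illustrative only; SEPHIS's actual equilibrium expressions for Purex chemistry are more elaborate):

      # Steady-state countercurrent extraction cascade, linear equilibrium y = D*x.
      # Aqueous phase flows stage 1 -> N, organic phase flows stage N -> 1.
      def cascade(n_stages, A, O, D, x_feed, n_sweeps=2000):
          x = [x_feed] * n_stages                 # aqueous concentration per stage
          for _ in range(n_sweeps):               # relax the stage mass balances
              for n in range(n_stages):
                  x_in = x_feed if n == 0 else x[n - 1]
                  y_in = 0.0 if n == n_stages - 1 else D * x[n + 1]
                  # A*x_in + O*y_in = A*x[n] + O*D*x[n]  (equilibrium y_n = D*x_n)
                  x[n] = (A * x_in + O * y_in) / (A + O * D)
          return x

      x = cascade(n_stages=8, A=1.0, O=1.5, D=2.0, x_feed=1.0)
      print("raffinate:", x[-1])   # aqueous concentration leaving the cascade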

  20. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal

  1. Chaste: using agile programming techniques to develop computational biology software.

    Science.gov (United States)

    Pitt-Francis, Joe; Bernabeu, Miguel O; Cooper, Jonathan; Garny, Alan; Momtahan, Lee; Osborne, James; Pathmanathan, Pras; Rodriguez, Blanca; Whiteley, Jonathan P; Gavaghan, David J

    2008-09-13

    Cardiac modelling is the area of physiome modelling where the available simulation software is perhaps most mature, and it therefore provides an excellent starting point for considering the software requirements for the wider physiome community. In this paper, we will begin by introducing some of the most advanced existing software packages for simulating cardiac electrical activity. We consider the software development methods used in producing codes of this type, and discuss their use of numerical algorithms, relative computational efficiency, usability, robustness and extensibility. We then go on to describe a class of software development methodologies known as test-driven agile methods and argue that such methods are more suitable for scientific software development than the traditional academic approaches. As a case study we present a project of our own, the Cancer, Heart and Soft Tissue Environment, a library of computational biology software that began as an experiment in the use of agile programming methods. We present our experiences with a review of our progress thus far, focusing on the advantages and disadvantages of this new approach compared with the development methods used in some existing packages. We conclude by considering whether the likely wider needs of the cardiac modelling community are currently being met and suggest that, in order to respond effectively to changing requirements, it is essential that these codes should be more malleable. Such codes will allow for reliable extensions to include both detailed mathematical models of the heart and other organs and more efficient numerical techniques that are currently being developed by many research groups worldwide.
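
    Test-driven development, at the core of the agile approach described, means writing a failing test before the code that makes it pass. A minimal illustration using Python's standard unittest module (Chaste itself is a C++ library with its own test framework; the physiology helper below is a hypothetical example, not Chaste code):

      import math
      import unittest

      def nernst_potential(z, c_out, c_in, T=310.0):
          """Nernst equilibrium potential in volts (illustrative helper)."""
          R, F = 8.314, 96485.0
          return (R * T) / (z * F) * math.log(c_out / c_in)

      class TestNernst(unittest.TestCase):
          def test_potassium_is_negative(self):
              # written before the implementation: K+ reversal ~ -87 mV
              e_k = nernst_potential(z=1, c_out=5.4, c_in=140.0)
              self.assertLess(e_k, 0.0)
              self.assertAlmostEqual(e_k, -0.0870, places=3)

      if __name__ == "__main__":
          unittest.main()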

  2. New Developments and Geoscience Applications of Synchrotron Computed Microtomography (Invited)

    Science.gov (United States)

    Rivers, M. L.; Wang, Y.; Newville, M.; Sutton, S. R.; Yu, T.; Lanzirotti, A.

    2013-12-01

    Computed microtomography is the extension to micron spatial resolution of the CAT scanning technique developed for medical imaging. Synchrotron sources are ideal for the method, since they provide a monochromatic, parallel beam with high intensity. High energy storage rings such as the Advanced Photon Source at Argonne National Laboratory produce x-rays with high energy, high brilliance, and high coherence. All of these factors combine to produce an extremely powerful imaging tool for earth science research. Techniques that have been developed include:
    - Absorption and phase contrast computed tomography with spatial resolution below one micron.
    - Differential contrast computed tomography, imaging above and below the absorption edge of a particular element.
    - High-pressure tomography, imaging inside a pressure cell at pressures above 10 GPa.
    - High speed radiography and tomography, with 100 microsecond temporal resolution.
    - Fluorescence tomography, imaging the 3-D distribution of elements present at ppm concentrations.
    - Radiographic strain measurements during deformation at high confining pressure, combined with precise x-ray diffraction measurements to determine stress.
    These techniques have been applied to important problems in earth and environmental sciences, including:
    - The 3-D distribution of aqueous and organic liquids in porous media, with applications in contaminated groundwater and petroleum recovery.
    - The kinetics of bubble formation in magma chambers, which control explosive volcanism.
    - Studies of the evolution of the early solar system from 3-D textures in meteorites.
    - Accurate crystal size distributions in volcanic systems, important for understanding the evolution of magma chambers.
    - The equation-of-state of amorphous materials at high pressure, using both direct measurements of volume as a function of pressure and measurements of the change in x-ray absorption coefficient as a function of pressure.
    - The location and chemical speciation of toxic

  3. Cloud Computing as an Enabler of Agile Global Software Development

    Directory of Open Access Journals (Sweden)

    Maureen Tanner

    2016-05-01

    Full Text Available Agile global software development (AGSD) is an increasingly prevalent software development strategy, as organizations hope to realize the benefits of accessing a larger resource pool of skilled labor, at a potentially reduced cost, while at the same time delivering value incrementally and iteratively. However, the distributed nature of AGSD creates geographic, temporal, and socio-cultural distances that challenge collaboration between project stakeholders. The Cloud Computing (CC) service models of Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) are similar to the aspirant qualities of AGSD, as they provide services that are globally accessible, efficient, and stable, with lower predictable operating costs that scale to meet computational demand. This study focused on the 12 agile principles upon which all agile methodologies are based, thereby increasing the potential for the findings to be generalized. Domestication Theory was used to assist in understanding how cloud technologies were appropriated in support of AGSD. The research strategy took the form of case study research. The findings suggest that some of the challenges in applying the agile principles in AGSD may be overcome by using CC.

  4. Individual Stochastic Screening for the Development of Computer Graphics

    Directory of Open Access Journals (Sweden)

    Maja Turčić¹*

    2012-12-01

    Full Text Available With the emergence of new tools and media, art and design have developed into digital computer-generated works. This article presents a sequence for creating art graphics, since their original authors did not publish the procedures. The goal is to discover the mathematics of an image and the programming libretto with the purpose of organizing a structural base of computer graphics. We elaborate the procedures used to produce graphics known throughout the history of art, but that are nowadays also found in design and security graphics. The results are closely related graphics obtained by changing the parameters that initiate them. The aim is to control the graphics, i.e. to use controlled stochastics to achieve the desired solutions. Since the artists of the past never published the procedures of their screening methods, their ideas have remained “only” works of art. In this article we present the development of an algorithm that, more or less successfully, simulates those screening solutions. It has been proven that mathematically defined graphical elements serve as screening elements. New technological and mathematical solutions are introduced in the reproduction with individual screening elements to be used in printing.
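
    The essence of stochastic screening is comparing each grey value against a random threshold; fixing the random seed makes the stochastic result reproducible and therefore controllable. A minimal sketch with NumPy, using a synthetic grey wedge rather than the authors' graphics:

      import numpy as np

      rng = np.random.default_rng(seed=42)    # fixed seed = controlled stochastics

      # A synthetic grey wedge in [0, 1]; in practice this is the input image.
      gray = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))

      # First-order stochastic screen: a pixel is printed (1) when the grey
      # value exceeds a uniform random threshold, so the local dot density
      # follows the tone of the original.
      screen = (gray > rng.uniform(size=gray.shape)).astype(np.uint8)

      print(screen.mean())   # ~0.5, matching the mean tone of the wedge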

  5. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development have been the well-known fundamental limitations in the validity of J-integral, which limits its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to prediction of ductile failure behaviour of cracked structures. (author)

  6. Fundamental limitations in developing computer-aided detection for mammography

    Science.gov (United States)

    Nishikawa, Robert M.; Pesce, Lorenzo L.

    2011-08-01

    While asymptomatic screening with mammography has been proven to reduce breast cancer mortality, radiologists miss cancers when reading screening mammograms. Computer-aided detection (CADe) is being developed to help radiologists avoid overlooking a cancer. In this paper, we describe two overarching issues that limit the current development of CADe schemes. These are the inability to optimize a scheme for clinical impact - current methods only optimize for how well the CADe scheme works in the absence of a radiologist - and the lack of a figure of merit that quantifies the performance efficiency of the CADe scheme. Such a figure of merit could be used to determine how much better performance a CADe scheme could obtain, at least in theory, and which component of the several techniques employed in the CADe scheme is the weakest link.

  7. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  8. Development and quality assurance of computer-based assessment batteries.

    Science.gov (United States)

    Schlegel, Robert E; Gilliland, Kirby

    2007-02-01

    The purpose of this article is to outline critical elements in the development and quality assurance (QA) assessment of a computer-based assessment battery (CAB). The first section of the article provides an overview of the life cycle of a representative CAB, typical evolutionary stages, and many of the essential considerations for designing and developing a CAB. The second section of the article presents a model for conducting a quality assurance assessment of a CAB. A general narrative of several steps in the QA process is supported by a table of recommended QA assessment elements. Although this QA process model may not be definitive for all cases, it provides a general framework within which a systematic assessment of any CAB can be conducted.

  9. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  10. Computable general equilibrium model fiscal year 2014 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Laboratory; Boero, Riccardo [Los Alamos National Laboratory

    2016-05-11

    This report provides an overview of the development of the NISAC CGE economic modeling capability since 2012. This capability enhances NISAC's economic modeling and analysis capabilities to answer a broader set of questions than was possible with the previous economic analysis capability. In particular, CGE modeling captures how the different sectors of the economy (for example, households, businesses, and government) interact to allocate resources, and it captures these interactions when used to estimate the economic impacts of the kinds of events NISAC often analyzes.

  11. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  12. A perspective on computer documentation: System developer vs. technical editor

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, E.T.; Truett, L.F.

    1995-12-31

    Between the computer-knowledgeable "techie" and the technical writer is a chasm created by differences in knowledge bases and skills. Although this gap is widened by misunderstandings and misconceptions of system development roles, it is bridged by mutual need and dual appreciation. Often the editor/writer is "behind" from beginning to end. The writer normally joins the team after the programmers are well into system development and do not want to "waste time" discussing fundamentals. The writer is usually excluded from technical discussions because it is assumed that he/she would not understand anyway. Later in the system development cycle, the writer has no time to polish the documentation before a new version of the software is issued which implies that the documentation must be revised. Nevertheless, the editor/writer's product is critical for the end-user's appreciation of the software, a fact which promotes unity to complete the comprehensive package of software and documentation. This paper explores the planks in the bridge that spans the chasm between developers and their fundamental PR agents, the technical editors/writers. This paper defines approaches (e.g., The Circling Theory) and techniques (Bold Thrust!) employed for effective communication -- between software developer and technical writer as well as between the software and the end-user.

  13. Development of superconductor electronics technology for high-end computing

    Science.gov (United States)

    Silver, A.; Kleinsasser, A.; Kerber, G.; Herr, Q.; Dorojevets, M.; Bunyk, P.; Abelson, L.

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm⁻², 1.25 µm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and for data communication at speeds up to 60 Gb s⁻¹, both on and between chips, through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  14. Development of a Computer Program for the Integrated Control of the Fuel Homogeneity Measurement System

    Energy Technology Data Exchange (ETDEWEB)

    Shin, H. S.; Jang, J. W.; Lee, Y. H.; Oh, S. J.; Park, H. D.; Kim, C. K

    2005-11-15

    The computer program was developed based on Visual C++ and is equipped with a user-friendly input/output (I/O) interface and a display function for the measuring conditions. The program consists of three parts: the port communication, PLC (Programmable Logic Controller) and MCA (Multi Channel Analyzer) control parts. The communication between the CPU of the PLC module box and the computer is selected to be the asynchronous RS-232 type, and the thread method was adopted in the development of the first part of the program. The PLC-related program has been developed so that the data communication between the PLC CPU and the computer is harmonized with the unique commands already defined in the PLC. The measuring space and time intervals, the start and end ROI (region of interest) values, and the allowable error limit are input for each measurement. Finally, the MCA control program has been developed using Canberra's programming library, which contains several files, including the header files in which the C++ variables and functions are declared according to the MCA functions. A performance test was carried out by applying the developed computer program to the homogeneity measurement system. The gamma counts at 28 measuring points along a fuel rod 700 mm in length were measured for 50 s at each point. The measurement results were better than the previous ones in respect of measurement accuracy, and a saving in measurement time was achieved. It was concluded that the gamma measurement system can be improved by equipping it with the developed control program.
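
    The asynchronous serial link handled in a worker thread, as described above, is a standard pattern. A minimal sketch in Python using the third-party pyserial package (the port name, baud rate and PLC command below are hypothetical):

      import threading
      import serial   # pyserial

      def reader(port, stop):
          """Worker thread: read PLC replies without blocking the main program."""
          while not stop.is_set():
              line = port.readline()          # returns b"" on timeout
              if line:
                  print("PLC:", line.decode(errors="replace").strip())

      port = serial.Serial("COM1", baudrate=9600, timeout=1)  # hypothetical settings
      stop = threading.Event()
      threading.Thread(target=reader, args=(port, stop), daemon=True).start()

      port.write(b"START\r\n")   # hypothetical PLC command
      # ... main program continues: set ROI values, poll the MCA, etc. ...
      stop.set()
      port.close()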

  15. Development of a Computer Program for the Integrated Control of the Fuel Homogeneity Measurement System

    International Nuclear Information System (INIS)

    Shin, H. S.; Jang, J. W.; Lee, Y. H.; Oh, S. J.; Park, H. D.; Kim, C. K.

    2005-11-01

    The computer program was developed based on Visual C++ and is equipped with a user-friendly input/output (I/O) interface and a display function for the measuring conditions. The program consists of three parts: the port communication, PLC (Programmable Logic Controller) and MCA (Multi Channel Analyzer) control parts. The communication between the CPU of the PLC module box and the computer is selected to be the asynchronous RS-232 type, and the thread method was adopted in the development of the first part of the program. The PLC-related program has been developed so that the data communication between the PLC CPU and the computer is harmonized with the unique commands already defined in the PLC. The measuring space and time intervals, the start and end ROI (region of interest) values, and the allowable error limit are input for each measurement. Finally, the MCA control program has been developed using Canberra's programming library, which contains several files, including the header files in which the C++ variables and functions are declared according to the MCA functions. A performance test was carried out by applying the developed computer program to the homogeneity measurement system. The gamma counts at 28 measuring points along a fuel rod 700 mm in length were measured for 50 s at each point. The measurement results were better than the previous ones in respect of measurement accuracy, and a saving in measurement time was achieved. It was concluded that the gamma measurement system can be improved by equipping it with the developed control program.

  16. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special (not yet commercially available) software for analyzing multi-element spectra in a single analysis. The analysis had been carried out using single-spectrum analyzer software, comparing each result manually; this method significantly degrades the quality of the analysis. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter which improves the signal-to-noise ratio for the deconvolution operation. The software was developed using the G language and named PASAN-K. The developed software was benchmarked with the IAEA spectrum and operated well, with less than 10% deviation.
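
    The pre-processing described, a numerical filter to improve the signal-to-noise ratio followed by analysis over a user-defined ROI, can be sketched generically with NumPy/SciPy (the synthetic spectrum and ROI limits are hypothetical, not PASAN-K's actual algorithm):

      import numpy as np
      from scipy.signal import savgol_filter

      rng = np.random.default_rng(0)
      ch = np.arange(1024)
      # Synthetic gamma spectrum: exponential baseline + one photopeak + noise.
      spectrum = 200 * np.exp(-ch / 400) + 150 * np.exp(-((ch - 512) / 6.0) ** 2)
      spectrum = rng.poisson(spectrum).astype(float)

      # Numerical filter to raise the signal-to-noise ratio before analysis.
      smoothed = savgol_filter(spectrum, window_length=11, polyorder=3)

      roi_start, roi_end = 490, 534            # user-defined ROI limits
      gross = smoothed[roi_start:roi_end].sum()
      # Linear baseline estimated from the channels flanking the ROI.
      base = 0.5 * (smoothed[roi_start] + smoothed[roi_end]) * (roi_end - roi_start)
      print("net peak area:", gross - base)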

  17. Technical developments for computed tomography on the CENBG nanobeam line

    Energy Technology Data Exchange (ETDEWEB)

    Gordillo, N., E-mail: gordillo@cenbg.in2p3.fr [Universite Bordeaux 1, CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, CENBG, Chemin du Solarium, BP120, 33175 Gradignan (France); Habchi, C.; Daudin, L. [Universite Bordeaux 1, CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, CENBG, Chemin du Solarium, BP120, 33175 Gradignan (France); Sakellariou, A. [Research School of Physical Sciences and Engineering, The Australian National University, Canberra, ACT 0200 (Australia); Delalee, F.; Barberet, Ph.; Incerti, S.; Seznec, H.; Moretto, Ph. [Universite Bordeaux 1, CNRS/IN2P3, Centre d' Etudes Nucleaires de Bordeaux Gradignan, CENBG, Chemin du Solarium, BP120, 33175 Gradignan (France)

    2011-10-15

    The use of ion microbeams as probes for computed tomography has proven to be a powerful tool for the three-dimensional characterization of specimens a few tens of micrometers in size. Compared to other types of probes, the main advantage is that quantitative information about mass density and composition can be obtained directly, using specific reconstruction codes. At the Centre d'Etudes Nucleaires de Bordeaux Gradignan (CENBG), this technique was initially developed for applications in cellular biology. However, the observation of the cell ultrastructure requires a sub-micron resolution. The construction of the nanobeam line at the Applications Interdisciplinaires des Faisceaux d'Ions en Region Aquitaine (AIFIRA) irradiation facility has opened new perspectives for such applications. The implementation of computed tomography on the nanobeam line of CENBG has required a careful design of the analysis chamber, especially microscopes for precise sample visualization, and detectors for scanning transmission ion microscopy (STIM) and for particle induced X-ray emission (PIXE). The sample can be precisely positioned in the three directions X, Y, Z and a stepper motor coupled to a goniometer ensures the rotational motion. First images of 3D tomography were obtained on a reference sample containing microspheres of certified diameter, showing the good stability of the beam and the sample stage, and the precision of the motion.

  18. Development of computer based ultrasonic flaw detector for nondestructive testing

    International Nuclear Information System (INIS)

    Lee, Weon Heum; Kim, Jin Koo; Kim, Yang Rae; Choi, Kwan Sun; Kim, Sun Hyung; Lee, Sun Heum

    1996-01-01

    Ultrasonic testing is one of the most widely used methods of nondestructive testing for Pre-Service Inspection (PSI) and In-Service Inspection (ISI) of structures in bridges, power plants, chemical plants and heavy industry. It is a very important tool for estimating the safety, remaining life and quality control of a structure, and much research on the quantitative evaluation and analysis of inspection data is in progress. Traditional portable ultrasonic flaw detectors, however, had the following disadvantages: 1) analog ultrasonic flaw detectors decreased the credibility of ultrasonic testing, because saving data and digital signal processing were impossible; 2) stand-alone digital ultrasonic flaw detectors cannot effectively evaluate received signals because of their lack of storage memory. To overcome these shortcomings, we developed a computer-based ultrasonic flaw detector for nondestructive testing. It can store the received signal and evaluate it effectively, thus enhancing the reliability of the testing results.

  19. Application and development of Industrial Computed Tomography in China

    International Nuclear Information System (INIS)

    Kong Fangeng; Xian Wu

    1996-01-01

    Compared with traditional perspective radiography, ICT (Industrial Computed Tomography) is able to acquire tomographic images without the image overlapping and blurring that exist in traditional perspective radiography. By acquiring 2D tomographic images of the object at as many positions as needed, it is possible to obtain a 3D tomographic image. In China, the first γ-ray ICT equipment was developed at Chongqing University in May 1993. For this equipment, a 60Co radiation source with 1 Ci and 30 Ci was used; the spatial resolution is about 0.5 mm, the density resolution is about 0.5%, and the diameter of the test object can be up to 300 mm, while the price of the Chinese ICT equipment is only about half that of the same type of ICT equipment produced abroad. Besides γ-ray ICT, Chinese researchers are engaged in the research and development of X-ray ICT to meet foreign and domestic needs. (author)

  20. Cloud Computing in Higher Education Sector for Sustainable Development

    Science.gov (United States)

    Duan, Yuchao

    2016-01-01

    Cloud computing is considered a new frontier in the field of computing, as this technology comprises three major entities namely: software, hardware and network. The collective nature of all these entities is known as the Cloud. This research aims to examine the impacts of various aspects namely: cloud computing, sustainability, performance…

  1. Protein adsorption on nanoparticles: model development using computer simulation.

    Science.gov (United States)

    Shao, Qing; Hall, Carol K

    2016-10-19

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler-Guggenheim and Hill-de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter = 5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles.
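
    The Langmuir model named above relates the adsorbed amount q to the bulk concentration c as q = q_max*K*c / (1 + K*c). A minimal sketch of fitting it with SciPy (the data points are invented for illustration, not the simulation results):

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(c, q_max, K):
          """Langmuir isotherm: adsorbed amount vs. bulk concentration."""
          return q_max * K * c / (1.0 + K * c)

      c = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])        # protein conc. (mM)
      q = np.array([12.0, 20.0, 30.0, 35.0, 38.0, 40.0])  # adsorbed amount (a.u.)

      (q_max, K), _ = curve_fit(langmuir, c, q, p0=(50.0, 1.0))
      print(f"q_max = {q_max:.1f}, K = {K:.2f} 1/mM")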

  2. Magnetic fusion energy and computers: the role of computing in magnetic fusion energy research and development

    International Nuclear Information System (INIS)

    This report examines the role of computing in the Department of Energy magnetic confinement fusion program. The present status of the MFECC and its associated network is described. The third part of this report examines the role of computer models in the main elements of the fusion program and discusses their dependence on the most advanced scientific computers. A review of requirements at the National MFE Computer Center was conducted in the spring of 1976. The results of this review led to the procurement of the CRAY 1, the most advanced scientific computer available, in the spring of 1978. The utilization of this computer in the MFE program has been very successful and is also described in the third part of the report. A new study of computer requirements for the MFE program was conducted during the spring of 1979 and the results of this analysis are presented in the fourth part of this report

  3. Magnetic fusion energy and computers: the role of computing in magnetic fusion energy research and development

    Energy Technology Data Exchange (ETDEWEB)

    1979-10-01

    This report examines the role of computing in the Department of Energy magnetic confinement fusion program. The present status of the MFECC and its associated network is described. The third part of this report examines the role of computer models in the main elements of the fusion program and discusses their dependence on the most advanced scientific computers. A review of requirements at the National MFE Computer Center was conducted in the spring of 1976. The results of this review led to the procurement of the CRAY 1, the most advanced scientific computer available, in the spring of 1978. The utilization of this computer in the MFE program has been very successful and is also described in the third part of the report. A new study of computer requirements for the MFE program was conducted during the spring of 1979 and the results of this analysis are presented in the fourth part of this report.

  4. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

    Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium energy monochromatic X-rays (50 - 100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient, to consider iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author) [fr
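
    At the heart of any such photon-transport dose engine is sampling the free path from the Beer-Lambert law, s = -ln(U)/mu. A toy one-dimensional sketch with an invented attenuation coefficient (the real PENELOPE-based engine also tracks polarization, scattering angles and secondary particles):

      import numpy as np

      rng = np.random.default_rng(1)
      mu = 0.25          # hypothetical linear attenuation coefficient (1/cm)
      depth = 10.0       # slab thickness (cm)
      n = 100_000

      # Sample interaction depths: s = -ln(U)/mu (exponential free path).
      s = -np.log(rng.uniform(size=n)) / mu

      # Fraction of photons interacting inside the slab vs. transmitted.
      interacting = (s < depth).mean()
      print("interacting:", interacting, " analytic:", 1 - np.exp(-mu * depth))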

  5. Compute-to-Learn: Authentic Learning via Development of Interactive Computer Demonstrations within a Peer-Led Studio Environment

    Science.gov (United States)

    Jafari, Mina; Welden, Alicia Rae; Williams, Kyle L.; Winograd, Blair; Mulvihill, Ellen; Hendrickson, Heidi P.; Lenard, Michael; Gottfried, Amy; Geva, Eitan

    2017-01-01

    In this paper, we report on the implementation of a novel compute-to-learn pedagogy, which is based upon the theories of situated cognition and meaningful learning. The "compute-to-learn" pedagogy is designed to simulate an authentic research experience as part of the undergraduate curriculum, including project development, teamwork,…

  6. Development of Computational Procedure for Assessment of Patient Dose in Multi-Detector Computed Tomography

    International Nuclear Information System (INIS)

    Park, Dong Wook

    2007-02-01

    Technological developments improving the quality and speed with which images are obtained have fostered growth in the frequency and collective effective dose of CT examinations. In particular, the high-dose X-ray techniques of CT have increased concern about patient dose. However, CTDI and DLP in CT dosimetry leave something to be desired for evaluating patient dose. And even though evaluation of the effective dose in CT practice is required for comparison with other radiography, no adequate estimate is available because such evaluation is not performed for medical purposes. Therefore, the purpose of this work is the development of a computational procedure for the assessment of patient dose in MDCT, based on experiments providing the essential information. In order to obtain exact absorbed doses, normalization factors must be created to relate simulated dose values to CTDI air measurements. The normalization factors were applied to the calculation of CTDI100 using axial scanning and of organ effective dose using helical scanning. The calculation for helical scanning was compared with the experiment of Groves et al. (2004). The result agrees with the experiment to within about a factor of 2. This seems to be because AEC is not simulated. In several studies, when AEC was applied to a CT examination, a dose reduction of approximately 20-30% appeared. Therefore, a study of AEC simulation should be added and the model modified
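
    For reference, the standard descriptors the record finds insufficient are computed as CTDI_w = (1/3)*CTDI_center + (2/3)*CTDI_periphery, CTDI_vol = CTDI_w / pitch, and DLP = CTDI_vol * scan length. A small sketch with illustrative numbers:

      def ctdi_w(ctdi_center, ctdi_periphery):
          """Weighted CTDI from center and periphery phantom measurements (mGy)."""
          return ctdi_center / 3.0 + 2.0 * ctdi_periphery / 3.0

      def dlp(ctdi_vol, scan_length_cm):
          """Dose-length product in mGy*cm."""
          return ctdi_vol * scan_length_cm

      w = ctdi_w(ctdi_center=10.0, ctdi_periphery=14.0)   # illustrative values
      vol = w / 1.2                                       # helical scan, pitch 1.2
      print("CTDIw   =", w, "mGy")
      print("CTDIvol =", round(vol, 2), "mGy")
      print("DLP     =", round(dlp(vol, 30.0), 1), "mGy*cm")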

  7. Present state of computer-aided diagnosis (CAD) development

    International Nuclear Information System (INIS)

    Fujita, Hiroshi

    2007-01-01

    Topics in computer-aided detection (CAD) are reviewed. Commercially available, Food and Drug Administration (FDA)-approved CAD systems exist for breast cancer (mammography), the chest (plain X-ray and CT imaging) and the colon (polyp detection). In Japan, only mammography CAD is approved. The efficacy of CAD is controversial, and reliable databases, now under construction in various medical fields, are important for assessing it. Digitized images are now widespread, which conceivably improves the cost-effectiveness of diagnosis with CAD. As an incentive, approval for health insurance coverage would help, as seen in the increased CAD sales by R2 Technology Co., and the MHLW actually assists facilities in introducing mammography reading-aid systems by sharing half of the cost. There are two big projects for CAD study supported by MECSST in which the author is involved. One is the development of diagnostic aids for multi-dimensional medical images, where a multi-organ, multi-disease CAD system is considered. The other involves CAD for brain MRI, breast US and eyeground pictures. The future in which patients and doctors fully enjoy the benefits of CAD is not so far off. (R.T.)

  8. Development of computed tomography instrument for college teaching

    International Nuclear Information System (INIS)

    Liu Fenglin; Lu Yanping; Wang Jue

    2006-01-01

    Computed tomography (CT), which uses penetrating radiation from many directions to reconstruct cross-sectional or 3D images of an object, has been widely applied in medical diagnosis and treatment and in industrial NDT and NDE. It is therefore significant for college students to understand the fundamentals of CT. The authors describe the CD-50BG CT instrument developed for experimental teaching at colleges. With a 50 mm field-of-view and the translation-rotation scanning mode, the system makes use of a single plastic scintillator + photomultiplier detector and a 137Cs radioactive source with 0.74 GBq activity, which is housed in a tungsten alloy shield. An image processing software package has also been developed to process the acquired data, so that cross-sectional and 3D images can be reconstructed. High quality images with 1 lp·mm⁻¹ spatial resolution and 1% contrast sensitivity are obtained. So far in China, more than ten institutions including Tsinghua University and Peking University have applied the system to elementary teaching. (authors)
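
    The translate-rotate acquisition described produces a sinogram from which cross-sections are reconstructed, typically by filtered back-projection. A compact sketch with scikit-image, using a standard test phantom in place of the scanned object:

      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon

      image = shepp_logan_phantom()                    # test object
      theta = np.linspace(0.0, 180.0, 180, endpoint=False)

      sinogram = radon(image, theta=theta)             # simulate the scan
      recon = iradon(sinogram, theta=theta)            # filtered back-projection

      err = np.sqrt(np.mean((recon - image) ** 2))
      print("RMS reconstruction error:", round(float(err), 4))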

  9. Development of multimedia computer-based training for VXI integrated fuel monitors

    International Nuclear Information System (INIS)

    Keeffe, R.; Ellacott, T.; Truong, Q.S.

    1999-01-01

    The Canadian Safeguards Support Program has developed the VXI Integrated Fuel Monitor (VFIM) which is based on the international VXI instrument bus standard. This equipment is a generic radiation monitor which can be used in an integrated mode where several detection systems can be connected to a common system where information is collected, displayed, and analyzed via a virtual control panel with the aid of computers, trackball and computer monitor. The equipment can also be used in an autonomous mode as a portable radiation monitor with a very low power consumption. The equipment has been described at previous international symposia. Integration of several monitoring systems (bundle counter, core discharge monitor, and yes/no monitor) has been carried out at Wolsong 2. Performance results from one of the monitoring systems which was installed at CANDU nuclear stations are discussed in a companion paper at this symposium. This paper describes the development of an effective multimedia computer-based training package for the primary users of the equipment; namely IAEA inspectors and technicians. (author)

  10. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared them with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investment makes cloud computing an attractive alternative for scientists, especially those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: Number of events per month, for 2012] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  12. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
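
    The dynamic-weight idea, renormalizing the weights over whichever parameters are actually present in a sample, can be sketched as follows (the parameter names, weights and sub-index values are hypothetical; the record does not give the exact IWQIS formula):

      def wqi(sub_indices, weights):
          """Weighted-average WQI over the parameters present in the sample.
          sub_indices: dict parameter -> quality rating (0-100); may omit parameters.
          weights:     dict parameter -> relative weight.
          """
          present = [p for p in weights if p in sub_indices]
          total_w = sum(weights[p] for p in present)      # dynamic renormalization
          return sum(weights[p] * sub_indices[p] for p in present) / total_w

      weights = {"pH": 0.2, "turbidity": 0.15, "nitrate": 0.25, "coliform": 0.4}
      sample = {"pH": 85.0, "nitrate": 60.0, "coliform": 40.0}   # turbidity missing
      print("WQI:", round(wqi(sample, weights), 1))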

  13. On the impact of quantum computing technology on future developments in high-performance scientific computing

    NARCIS (Netherlands)

    Möller, M.; Vuik, C.

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to

  14. Ethics and the Computer: Children's Development of Moral Reasoning about Computer and Internet Use.

    Science.gov (United States)

    Burnam, Bruce; Kafai, Yasmin B.

    2001-01-01

    Describes a study of third and fifth grade students that investigated moral dilemmas involving computer and Internet use. Significant differences were found between children's moral reasoning in everyday situations compared to those involving computer and Internet use, but gender differences were not consistently detected. (Author/LRW)

  15. A case of cutaneous squamous cell carcinoma associated with small cell carcinoma of lung developing a skin metastasis on previously irradiated area

    International Nuclear Information System (INIS)

    Kohda, Mamoru; Takei, Yoji; Ueki, Hiroaki

    1983-01-01

    Squamous cell carcinoma which occurred in the penis of a 61-year-old male patient was treated surgically and by linac irradiation (a total of 10,400 rad). However, it was not cured. Abnormal shadows in the lung and multiple liver tumors were noted one month before death. Autopsy revealed generalized metastases of pulmonary small-cell carcinoma, and persistent squamous cell carcinoma of the penis with no metastases. Skin metastasis of the lung carcinoma occurred only in the previously irradiated area. (Ueda, J.)

  16. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, C. M., E-mail: christof.sommer@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Fritz, S., E-mail: stefan.fritz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Vollherbst, D., E-mail: dominikvollherbst@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Zelzer, S., E-mail: s.zelzer@dkfz-heidelberg.de [German Cancer Research Center (dkfz), Medical and Biological Informatics (Germany); Wachter, M. F., E-mail: fredericwachter@googlemail.com; Bellemann, N., E-mail: nadine.bellemann@med.uni-heidelberg.de; Gockner, T., E-mail: theresa.gockner@med.uni-heidelberg.de; Mokry, T., E-mail: theresa.mokry@med.uni-heidelberg.de; Schmitz, A., E-mail: anne.schmitz@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Aulmann, S., E-mail: sebastian.aulmann@mail.com [University Hospital Heidelberg, Department of General Pathology (Germany); Stampfl, U., E-mail: ulrike.stampfl@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Pereira, P., E-mail: philippe.pereira@slk-kliniken.de [SLK Kliniken Heilbronn GmbH, Clinic for Radiology, Minimally-invasive Therapies and Nuclear Medicine (Germany); Kauczor, H. U., E-mail: hu.kauczor@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany); Werner, J., E-mail: jens.werner@med.uni-heidelberg.de [University Hospital Heidelberg, Department of General Visceral and Transplantation Surgery (Germany); Radeleff, B. A., E-mail: boris.radeleff@med.uni-heidelberg.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology (Germany)

    2015-02-15

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.

  17. CT-guided Irreversible Electroporation in an Acute Porcine Liver Model: Effect of Previous Transarterial Iodized Oil Tissue Marking on Technical Parameters, 3D Computed Tomographic Rendering of the Electroporation Zone, and Histopathology

    International Nuclear Information System (INIS)

    Sommer, C. M.; Fritz, S.; Vollherbst, D.; Zelzer, S.; Wachter, M. F.; Bellemann, N.; Gockner, T.; Mokry, T.; Schmitz, A.; Aulmann, S.; Stampfl, U.; Pereira, P.; Kauczor, H. U.; Werner, J.; Radeleff, B. A.

    2015-01-01

    Purpose: To evaluate the effect of previous transarterial iodized oil tissue marking (ITM) on technical parameters, three-dimensional (3D) computed tomographic (CT) rendering of the electroporation zone, and histopathology after CT-guided irreversible electroporation (IRE) in an acute porcine liver model as a potential strategy to improve IRE performance. Methods: After Ethics Committee approval was obtained, in five landrace pigs, two IREs of the right and left liver (RL and LL) were performed under CT guidance with identical electroporation parameters. Before IRE, transarterial marking of the LL was performed with iodized oil. Nonenhanced and contrast-enhanced CT examinations followed. One hour after IRE, animals were killed and livers collected. Mean resulting voltage and amperage during IRE were assessed. For 3D CT rendering of the electroporation zone, parameters for size and shape were analyzed. Quantitative data were compared by the Mann–Whitney test. Histopathological differences were assessed. Results: Mean resulting voltage and amperage were 2,545.3 ± 66.0 V and 26.1 ± 1.8 A for RL, and 2,537.3 ± 69.0 V and 27.7 ± 1.8 A for LL, without significant differences. Short axis, volume, and sphericity index were 16.5 ± 4.4 mm, 8.6 ± 3.2 cm³, and 1.7 ± 0.3 for RL, and 18.2 ± 3.4 mm, 9.8 ± 3.8 cm³, and 1.7 ± 0.3 for LL, without significant differences. For RL and LL, the electroporation zone consisted of severely widened hepatic sinusoids containing erythrocytes and showed homogeneous apoptosis. For LL, iodized oil could be detected in the center and at the rim of the electroporation zone. Conclusion: There is no adverse effect of previous ITM on technical parameters, 3D CT rendering of the electroporation zone, and histopathology after CT-guided IRE of the liver.
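
    The group comparison reported above uses the Mann-Whitney U test on small samples (n = 5 animals). In Python this looks like the following sketch, with invented sample values rather than the study's data:

      from scipy.stats import mannwhitneyu

      # Hypothetical short-axis measurements (mm) for right- and left-liver lobes.
      rl = [14.1, 16.8, 12.9, 21.0, 17.7]
      ll = [18.5, 15.2, 22.1, 16.9, 18.3]

      stat, p = mannwhitneyu(rl, ll, alternative="two-sided")
      print(f"U = {stat}, p = {p:.3f}")   # p > 0.05 -> no significant difference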

  18. Extended precision data types for the development of the original computer aided engineering applications

    Science.gov (United States)

    Pescaru, A.; Oanta, E.; Axinte, T.; Dascalescu, A.-D.

    2015-11-01

    Computer aided engineering is based on models of phenomena which are expressed as algorithms. The implementations of the algorithms are usually software applications which process a large volume of numerical data, regardless of the size of the input data. In this way, finite element method applications used to have an input data generator which created the entire volume of geometrical data, starting from the initial geometrical information and the parameters stored in the input data file. Moreover, there were several data processing stages, such as: renumbering of the nodes, meant to minimize the bandwidth of the system of equations to be solved; computation of the equivalent nodal forces; computation of the element stiffness matrix; assembly of the system of equations; solving the system of equations; and computation of the secondary variables. Modern software applications use pre-processing and post-processing programs to handle the information easily. Beyond this example, CAE applications use various stages of complex computation, the accuracy of the final results being of particular interest. Over time, the development of CAE applications was a constant concern of the authors, and the accuracy of the results was a very important target. The paper presents the various computing techniques which were devised and implemented in the resulting applications: finite element method programs, finite difference element method programs, applied general numerical methods applications, data generators, graphical applications, and experimental data reduction programs. In this context, the use of extended precision data types was one of the solutions, the limitations being imposed by the size of the memory which may be allocated. To avoid the memory-related problems, the data was stored in files. To minimize the execution time, part of the file was accessed using the dynamic memory allocation facilities. One of the most important consequences of the
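
    The precision concern raised here is easy to demonstrate: a quantity that falls below the rounding threshold of a 64-bit double survives in an extended-precision type. A minimal sketch with NumPy's long double (an 80-bit type on most x86 platforms; availability is platform-dependent):

      import numpy as np

      big, small = 1.0e16, 1.0

      # In 64-bit doubles the small term is lost below the rounding threshold...
      print(np.float64(big) + np.float64(small) - np.float64(big))          # 0.0

      # ...while an extended-precision accumulator keeps it.
      print(np.longdouble(big) + np.longdouble(small) - np.longdouble(big)) # 1.0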

  19. Process engineering computer support-II, processes development

    International Nuclear Information System (INIS)

    Markovska, Liljana; Meshko, Vera

    1997-01-01

    Computer support for chemical reactor design is presented. Special attention is given to computer support for regression analysis, where the possibilities and characteristics of the nonlinear software package REPROCHE are analyzed. The creation of custom reactor models is also analyzed. The reactor simulator is intended for different reactor types, in such a way that every chemical reaction mechanism can be simulated
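
    REPROCHE itself is not documented here, but the kind of nonlinear regression it performs can be sketched with standard tools. The example below fits an Arrhenius-type rate expression to hypothetical rate-constant data using nonlinear least squares; the model form, starting guesses and data are illustrative assumptions, not the package's actual algorithm.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical rate-constant measurements at several temperatures (K)
        T = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
        k_obs = np.array([0.011, 0.035, 0.098, 0.245, 0.560])

        def arrhenius(T, A, Ea):
            """k = A * exp(-Ea / (R*T)), with Ea in J/mol."""
            R = 8.314
            return A * np.exp(-Ea / (R * T))

        # Nonlinear least-squares fit (initial guesses are assumptions)
        popt, pcov = curve_fit(arrhenius, T, k_obs, p0=[1e6, 5e4])
        perr = np.sqrt(np.diag(pcov))    # 1-sigma parameter uncertainties
        print(f"A = {popt[0]:.3g} ± {perr[0]:.2g}, Ea = {popt[1]:.3g} ± {perr[1]:.2g} J/mol")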

  20. Computer-Mediated Collaborative Projects: Processes for Enhancing Group Development

    Science.gov (United States)

    Dupin-Bryant, Pamela A.

    2008-01-01

    Groups are a fundamental part of the business world. Yet, as companies continue to expand internationally, a major challenge lies in promoting effective communication among employees who work in varying time zones. Global expansion often requires group collaboration through computer systems. Computer-mediated groups lead to different communicative…

  1. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  2. The development of mobile computation and the related formal description

    International Nuclear Information System (INIS)

    Jin Yan; Yang Xiaozong

    2003-01-01

    The description of and research on formal representation in mobile computation is very instructive for resolving status transmission, domain administration, and authentication. This paper presents the communicating process and the computational process from the viewpoint of formal calculus; what's more, it constructs a practical application using mobile ambients. Finally, the paper discusses future work and directions. (authors)

  3. New life for old computers | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    An IDRC-supported initiative in Latin America and the Caribbean is tackling the environmental problem of e-waste disposal, providing computers for schools, and creating jobs at the same time. The rapid pace of technology in this electronic age has created a global problem. Computers and their many accessories, ...

  4. Developing Activities for Teaching Cloud Computing and Virtualization

    OpenAIRE

    E. Erturk; B. Maharjan

    2014-01-01

    Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems....

  5. The Development of Sociocultural Competence with the Help of Computer Technology

    Science.gov (United States)

    Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.

    2017-01-01

    The article deals with the description of the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…

  6. Computational annotation of genes differentially expressed along olive fruit development

    Directory of Open Access Journals (Sweden)

    Martinelli Federico

    2009-10-01

    Full Text Available Abstract: Background: Olea europaea L. is a traditional tree crop of the Mediterranean basin with a high worldwide economic impact. In contrast to other fruit tree species, little is known about the physiological and molecular basis of olive fruit development, and only a few sequences of genes and gene products are available for olive in public databases. This study deals with the identification of large sets of differentially expressed genes in developing olive fruits and the subsequent computational annotation by means of different software. Results: mRNA from fruits of the cv. Leccino sampled at three different stages [i.e., initial fruit set (stage 1), completed pit hardening (stage 2) and veraison (stage 3)] was used for the identification of differentially expressed genes putatively involved in main processes along fruit development. Four subtractive hybridization libraries were constructed: forward and reverse between stages 1 and 2 (libraries A and B), and stages 2 and 3 (libraries C and D). All sequenced clones (1,132 in total) were analyzed through BlastX against non-redundant NCBI databases, and about 60% of them showed similarity to known proteins. A total of 89 out of 642 differentially expressed unique sequences were further investigated by Real-Time PCR, showing a validation of the SSH results as high as 69%. Library-specific cDNA repertories were annotated according to the three main vocabularies of the gene ontology (GO): cellular component, biological process and molecular function. BlastX analysis, GO term mapping and annotation analysis were performed using the Blast2GO software, a research tool designed with the main purpose of enabling GO-based data mining on sequence sets for which no GO annotation is yet available. Bioinformatic analysis pointed out a significantly different distribution of the annotated sequences for each GO category when comparing the three fruit developmental stages. The olive fruit-specific transcriptome dataset was
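
    The per-stage comparison of GO category distributions described above reduces, at its core, to counting annotated categories per library. A hedged sketch with a hypothetical sequence-to-GO mapping (not the Blast2GO output format):

        from collections import Counter

        # Hypothetical annotations: clone id -> GO vocabularies (illustrative only)
        annotations = {
            "libA_clone001": ["biological process", "molecular function"],
            "libA_clone002": ["cellular component"],
            "libC_clone001": ["biological process"],
        }

        def go_distribution(annotations, library_prefix):
            """Count GO-vocabulary hits for clones from one library."""
            counts = Counter()
            for clone_id, categories in annotations.items():
                if clone_id.startswith(library_prefix):
                    counts.update(categories)
            return counts

        print(go_distribution(annotations, "libA"))
        # Counter({'biological process': 1, 'molecular function': 1, 'cellular component': 1})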

  7. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Full Text Available High-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...

  8. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D of computational science in JAEA (Japan Atomic Energy Agency) is described. The computing environment, the R and D system in CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, the development of computer technologies, some examples of simulation research, the 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for the development of a steam generator, simulation research on fusion energy techniques, the development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research are explained. The organization of JAEA, the development of computational science in JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  9. Developing a multimodal biometric authentication system using soft computing methods.

    Science.gov (United States)

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize the applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.
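
    The decision stage described above (matching scores from two biometric modalities fused into one accept/reject decision) can be illustrated with a much simpler stand-in for the ANN and fuzzy logic engine: weighted score fusion with a threshold. The weights, threshold and score ranges below are assumptions, not values from the chapter.

        def fuse_and_decide(voice_score, finger_score,
                            w_voice=0.4, w_finger=0.6, threshold=0.7):
            """Score-level fusion of two biometric matchers.

            Scores are assumed normalized to [0, 1]; weights and threshold are
            hypothetical (the chapter learns the decision with an ANN plus a
            fuzzy logic engine instead of fixing them by hand).
            """
            fused = w_voice * voice_score + w_finger * finger_score
            return fused >= threshold

        # A strong fingerprint match can compensate for a weaker voiceprint
        print(fuse_and_decide(0.55, 0.90))   # True  (fused = 0.76)
        print(fuse_and_decide(0.40, 0.60))   # False (fused = 0.52)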

  10. X-ray Computed Tomography Image Quality Indicator (IQI) Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Phase one of the program is to identify suitable x-ray Computed Tomography (CT) Image Quality Indicator (IQI) design(s) that can be used to adequately capture CT...

  11. Widespread Piracy by Students Frustrates Developers of Computer Software.

    Science.gov (United States)

    DeLoughry, Thomas J.

    1987-01-01

    Computer software producers view students' illegal copying of programs as lost revenue and feel powerless to stop the piracy. Some propose to change student attitudes about copying, others suggest reducing software prices, and still others are calling for prosecution. (MSE)

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  13. Developing ontological model of computational linear algebra - preliminary considerations

    Science.gov (United States)

    Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Lirkov, I.

    2013-10-01

    The aim of this paper is to propose a method for application of ontologically represented domain knowledge to support Grid users. The work is presented in the context provided by the Agents in Grid system, which aims at development of an agent-semantic infrastructure for efficient resource management in the Grid. Decision support within the system should provide functionality beyond the existing Grid middleware, specifically, help the user to choose optimal algorithm and/or resource to solve a problem from a given domain. The system assists the user in at least two situations. First, for users without in-depth knowledge about the domain, it should help them to select the method and the resource that (together) would best fit the problem to be solved (and match the available resources). Second, if the user explicitly indicates the method and the resource configuration, it should "verify" if her choice is consistent with the expert recommendations (encapsulated in the knowledge base). Furthermore, one of the goals is to simplify the use of the selected resource to execute the job; i.e., provide a user-friendly method of submitting jobs, without required technical knowledge about the Grid middleware. To achieve the mentioned goals, an adaptable method of expert knowledge representation for the decision support system has to be implemented. The selected approach is to utilize ontologies and semantic data processing, supported by multicriterial decision making. As a starting point, an area of computational linear algebra was selected to be modeled, however, the paper presents a general approach that shall be easily extendable to other domains.

  14. A Brief Analysis of Development Situations and Trend of Cloud Computing

    Science.gov (United States)

    Yang, Wenyan

    2017-12-01

    In recent years, the rapid development of Internet technology has radically changed how people work, learn and live. More and more activities are completed by means of computers and networks. The amount of information and data generated grows day by day, and people rely increasingly on computers, so the computing power of a single machine fails to meet demands for accuracy and speed. Cloud computing technology has developed quickly and is widely applied in the computer industry thanks to its advantages of high precision, fast computation and ease of use; it has become a focus of information research. In this paper, the development situation and trends of cloud computing are analyzed.

  15. Mastering cognitive development theory in computer science education

    Science.gov (United States)

    Gluga, Richard; Kay, Judy; Lister, Raymond; Simon; Kleitman, Sabina

    2013-03-01

    To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that classified activities and assessments are comparable across the subjects of a degree, and, ideally, comparable across institutions. One widespread approach to supporting this is to write learning objects in terms of Bloom's Taxonomy. This, or other such classifications, is likely to be more effective if educators can use them consistently, in the way experts would use them. To this end, we present the design and evaluation of our online interactive web-based tutorial system, which can be configured and used to offer training in different classification schemes. We report on results from three evaluations. First, 17 computer science educators complete a tutorial on using Bloom's Taxonomy to classify programming examination questions. Second, 20 computer science educators complete a Neo-Piagetian tutorial. Third, we compared inter-rater reliability scores of computer science educators classifying programming questions using Bloom's Taxonomy, before and after taking our tutorial. Based on the results from these evaluations, we discuss the effectiveness of our tutorial system design for teaching computer science educators how to systematically and consistently classify programming examination questions. We also discuss the suitability of Bloom's Taxonomy and Neo-Piagetian theory for achieving this goal. The Bloom's and Neo-Piagetian tutorials are made available as a community resource. The contributions of this paper are the following: the tutorial system for learning classification schemes for the purpose of coding the difficulty of computing learning materials; its evaluation; new insights into the consistency

  16. Development and application of methods and computer codes of fuel management and nuclear design of reload cycles in PWR

    International Nuclear Information System (INIS)

    Ahnert, C.; Aragones, J.M.; Corella, M.R.; Esteban, A.; Martinez-Val, J.M.; Minguez, E.; Perlado, J.M.; Pena, J.; Matias, E. de; Llorente, A.; Navascues, J.; Serrano, J.

    1976-01-01

    Description of methods and computer codes for fuel management and nuclear design of reload cycles in PWRs, developed at JEN by adaptation of previous codes (LEOPARD, NUTRIX, CITATION, FUELCOST) and implementation of original codes (TEMP, SOTHIS, CICLON, NUDO, MELON, ROLLO, LIBRA, PENELOPE), and their application to the project of management and design of reload cycles of a 510 MWt PWR, including comparison with results of experimental operation and other calculations for validation of the methods. (author)

  17. Intelligent physical blocks for introducing computer programming in developing countries

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2007-05-01

    Full Text Available the usability and educational aspects are reported on. The author provides a brief overview on previous work in this field. Results obtained from field studies are given. The author concludes with recommendations for improvements and further research...

  18. Developments in medical image processing and computational vision

    CERN Document Server

    Jorge, Renato

    2015-01-01

    This book presents novel and advanced topics in Medical Image Processing and Computational Vision in order to solidify knowledge in the related fields and define their key stakeholders. It contains extended versions of selected papers presented in VipIMAGE 2013 – IV International ECCOMAS Thematic Conference on Computational Vision and Medical Image, which took place in Funchal, Madeira, Portugal, 14-16 October 2013.  The twenty-two chapters were written by invited experts of international recognition and address important issues in medical image processing and computational vision, including: 3D vision, 3D visualization, colour quantisation, continuum mechanics, data fusion, data mining, face recognition, GPU parallelisation, image acquisition and reconstruction, image and video analysis, image clustering, image registration, image restoring, image segmentation, machine learning, modelling and simulation, object detection, object recognition, object tracking, optical flow, pattern recognition, pose estimat...

  19. An Interactive Computer-Based Circulation System: Design and Development

    Directory of Open Access Journals (Sweden)

    James S. Aagaard

    1972-03-01

    Full Text Available An on-line computer-based circulation control system has been installed at the Northwestern University library. Features of the system include self-service book charge, remote terminal inquiry and update, and automatic production of notices for call-ins and books available. Fine notices are also prepared daily and overdue notices weekly. Important considerations in the design of the system were to minimize costs of operation and to include technical services functions eventually. The system operates on a relatively small computer in a multiprogrammed mode.

  20. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and this access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)
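
    The modular structure described (technical specifications, operating history, maintenance history and failure events, keyed to facility components) can be sketched as a relational schema. The sketch below uses Python's built-in sqlite3 purely for illustration; the real PSADB uses MySQL with PHP, and all table and column names here are assumptions.

        import sqlite3

        conn = sqlite3.connect(":memory:")   # stand-in for the MySQL database
        conn.executescript("""
        CREATE TABLE component (
            id   INTEGER PRIMARY KEY,
            name TEXT NOT NULL             -- main component of the facility
        );
        CREATE TABLE failure_event (       -- one of the database modules
            id           INTEGER PRIMARY KEY,
            component_id INTEGER NOT NULL REFERENCES component(id),
            occurred_on  TEXT NOT NULL,    -- ISO date
            description  TEXT
        );
        """)
        conn.execute("INSERT INTO component (name) VALUES ('primary pump')")
        conn.execute(
            "INSERT INTO failure_event (component_id, occurred_on, description) "
            "VALUES (1, '2014-03-02', 'seal leakage')"
        )
        for row in conn.execute(
            "SELECT c.name, f.occurred_on, f.description FROM failure_event f "
            "JOIN component c ON c.id = f.component_id"
        ):
            print(row)   # ('primary pump', '2014-03-02', 'seal leakage')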

  1. A new incision for unilateral cleft lip repair developed using animated simulation of repair on computer

    Directory of Open Access Journals (Sweden)

    Sahay A

    2007-01-01

    Full Text Available Background: Unilateral cleft lip repair continues to leave behind some amount of dissatisfaction, as scope for further improvement is always felt. Most surgeons do not like to deviate from the standard Millard's/triangular techniques, or their minor modifications, as no one likes to experiment on the face for fear of unfavourable outcomes. The computer can be utilized as a useful tool in the analysis and planning of surgery, and new methods can be developed and attempted subsequently with greater confidence. Aim: We decided to see if an improved lip repair could be developed with the use of computers. Materials and Methods: Analysis of previous lip repairs was done to determine where an improvement was required. Movement of tissues was studied in animation sequences by simulating an ideal repair, using image warping software, on digital images of cleft lip. A repair which could reproduce these movements was planned. A new incision emerged, which combined the principles of Millard's and Randall/Tennyson repairs, with additional features. The new method was performed on 30 cases. Conclusions: The results were encouraging as the shortcomings of these methods were minimized, and the advantages maximized.

  3. Development of a rapid lateral flow immunoassay test for detection of exosomes previously enriched from cell culture medium and body fluids.

    Science.gov (United States)

    Oliveira-Rodríguez, Myriam; López-Cobo, Sheila; Reyburn, Hugh T; Costa-García, Agustín; López-Martín, Soraya; Yáñez-Mó, María; Cernuda-Morollón, Eva; Paschen, Annette; Valés-Gómez, Mar; Blanco-López, Maria Carmen

    2016-01-01

    Exosomes are cell-secreted nanovesicles (40-200 nm) that represent a rich source of novel biomarkers in the diagnosis and prognosis of certain diseases. Despite the increasingly recognized relevance of these vesicles as biomarkers, their detection has been limited due in part to current technical challenges in the rapid isolation and analysis of exosomes. The complexity of developing analytical platforms stems from the heterogeneous composition of the exosome membrane. One of the most attractive test formats is the immunochromatographic strip, which allows rapid detection by unskilled operators. We have successfully developed a novel lateral flow immunoassay (LFIA) for the detection of exosomes based on the use of tetraspanins as targets. We have applied this platform for the detection of exosomes purified from different sources: cell culture supernatants, human plasma and urine. As proof of concept, we explored the analytical potential of this LFIA platform to accurately quantify exosomes purified from a human metastatic melanoma cell line. The one-step assay can be completed in 15 min, with a limit of detection of 8.54×10^5 exosomes/µL when a blend of anti-CD9 and anti-CD81 was selected as capture antibodies and anti-CD63 labelled with gold nanoparticles as the detection antibody. Based on our results, this platform could be well suited for use as a rapid exosome quantification tool, with promising diagnostic applications, bearing in mind that the detection of exosomes from different sources may require adaptation of the analytical settings to their specific composition.

  5. Development and applications of a new computer-controlled magnetic inspection system

    Science.gov (United States)

    Eichmann, A. R.

    1992-07-01

    The magnetic hysteresis inspection technique has been shown to obtain results that can be traced to the fundamental properties of the material. A new computer-controlled instrument known as the Magnescope Mark 2 was developed which can be used to make magnetic hysteresis inspections of materials in situ. These inspections can be used to evaluate the condition of steel components non-destructively, allowing the technique to be used in applications such as quality control and assurance in steel production and evaluation of the structural integrity of steel components. Previous inspection systems based on this technique were large, heavy, and hard to use, making measurements in the field difficult to obtain. The Magnescope Mark 2 was designed to be smaller, lighter, and easier to use, allowing field measurements to be obtained much more easily. The design and construction of the Magnescope Mark 2 are described, along with the improvements and additions made to the control and analysis software known as MAGNUM.
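
    Two of the fundamental properties such an instrument extracts from a measured B-H loop are the remanence (B at H = 0) and the coercive field (the H at which B crosses zero). The sketch below interpolates both from sampled loop data; the arrays are made-up measurements, and this is not the MAGNUM software's actual algorithm.

        import numpy as np

        # Hypothetical samples from the descending branch of a B-H loop
        H = np.array([50.0, 25.0, 10.0, 0.0, -10.0, -25.0, -50.0])   # A/m
        B = np.array([1.20, 1.10, 1.00, 0.85, 0.55, 0.00, -0.90])    # T

        # np.interp needs ascending x, so reverse the descending branch
        Br = np.interp(0.0, H[::-1], B[::-1])   # remanence: B where H = 0
        Hc = np.interp(0.0, B[::-1], H[::-1])   # coercive field: H where B = 0

        print(f"remanence Br = {Br:.2f} T, coercive field Hc = {Hc:.1f} A/m")
        # remanence Br = 0.85 T, coercive field Hc = -25.0 A/m (magnitude 25 A/m)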

  6. Cloud Computing: The Emergence of Application Service Providers (ASPs) in Developing Economies

    DEFF Research Database (Denmark)

    Yeboah-Boateng, Ezer Osei; Cudjoe-Seshie, Stephen

    2013-01-01

    for multi-tenant user model and immediate computing resource scalability. This paper provides an insight into the Cloud computing eco-system in a developing economy, with Ghana as a case example. It reveals that there is positive experience with this computing model from the consumer standpoint. It also...

  7. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    Science.gov (United States)

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements - is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  8. A computer literacy scale for newly enrolled nursing college students: development and validation.

    Science.gov (United States)

    Lin, Tung-Cheng

    2011-12-01

    Increasing application and use of information systems and mobile technologies in the healthcare industry require increasing nurse competency in computer use. Computer literacy is defined as basic computer skills, whereas computer competency is defined as the computer skills necessary to accomplish job tasks. Inadequate attention has been paid to computer literacy and computer competency scale validity. This study developed a computer literacy scale with good reliability and validity and investigated the current computer literacy of newly enrolled students to develop computer courses appropriate to students' skill levels and needs. This study referenced Hinkin's process to develop a computer literacy scale. Participants were newly enrolled first-year undergraduate students, with nursing or nursing-related backgrounds, currently attending a course entitled Information Literacy and Internet Applications. Researchers examined reliability and validity using confirmatory factor analysis. The final version of the developed computer literacy scale included six constructs (software, hardware, multimedia, networks, information ethics, and information security) and 22 measurement items. Confirmatory factor analysis showed that the scale possessed good content validity, reliability, convergent validity, and discriminant validity. This study also found that participants earned the highest scores for the network domain and the lowest score for the hardware domain. With increasing use of information technology applications, courses related to hardware topic should be increased to improve nurse problem-solving abilities. This study recommends that emphases on word processing and network-related topics may be reduced in favor of an increased emphasis on database, statistical software, hospital information systems, and information ethics.
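
    Scale reliability of the kind reported here is often summarized with Cronbach's alpha; the study itself relies on confirmatory factor analysis, so the sketch below is only a simple, related check with made-up item responses.

        import numpy as np

        def cronbach_alpha(items):
            """items: (respondents x items) matrix of scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
            total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
            return (k / (k - 1)) * (1 - item_vars / total_var)

        # Hypothetical responses of 5 students to 4 Likert-type items
        scores = [[4, 5, 4, 4],
                  [3, 3, 2, 3],
                  [5, 5, 4, 5],
                  [2, 2, 3, 2],
                  [4, 4, 4, 5]]
        print(f"alpha = {cronbach_alpha(scores):.2f}")   # high value -> consistent items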

  9. Human-Computer Interface Development: Concepts and Systems for its Management

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah

    1986-01-01

    Human-computer interface management, from a computer science viewpoint, focuses on the process of developing quality human computer interfaces, including their representation, design, implementation, execution, evaluation, and maintenance. This survey presents important concepts of interface management: dialogue independence, structural modeling, specification, rapid prototyping, holistic software engineering, control structures, and support environments, including User Interface Management S...

  10. Development and validation of a computational fluid dynamics methodology for simulation of pulsatile left ventricular assist devices.

    Science.gov (United States)

    Medvitz, Richard B; Kreider, James W; Manning, Keefe B; Fontaine, Arnold A; Deutsch, Steven; Paterson, Eric G

    2007-01-01

    An unsteady computational fluid dynamic methodology was developed so that design analyses could be undertaken for devices such as the 50cc Penn State positive-displacement left ventricular assist device (LVAD). The piston motion observed in vitro was modeled, yielding the physiologic flow waveform observed during pulsatile experiments. Valve closure was modeled numerically by locally increasing fluid viscosity during the closed phase. Computational geometry contained Bjork-Shiley Monostrut mechanical heart valves in mitral and aortic positions. Cases for computational analysis included LVAD operation under steady-flow and pulsatile-flow conditions. Computations were validated by comparing simulation results with previously obtained in vitro particle image velocimetry (PIV) measurements. The steady portion of the analysis studied effects of mitral valve orientation, comparing the computational results with in vitro data obtained from mock circulatory loop experiments. The velocity field showed good qualitative agreement with the in vitro PIV data. The pulsatile flow simulations modeled the unsteady flow phenomena associated with a positive-displacement LVAD operating through several beat cycles. Flow velocity gradients allowed computation of the scalar wall strain rate, an important factor for determining hemodynamics of the device. Velocity magnitude contours compared well with PIV data throughout the cycle. Computational wall shear rates over the pulsatile cycle were found to be in the same range as wall shear rates observed in vitro.
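
    The scalar wall strain rate mentioned above is, in its simplest form, the magnitude of the wall-normal velocity gradient at the wall. A minimal finite-difference sketch with made-up near-wall velocity samples (a real CFD post-processor would evaluate the full strain-rate tensor):

        import numpy as np

        # Hypothetical streamwise velocity along a wall-normal line (wall at y = 0)
        y = np.array([0.0, 0.0005, 0.0010, 0.0020])   # m
        u = np.array([0.0, 0.0450, 0.0850, 0.1500])   # m/s (no-slip at the wall)

        # One-sided difference at the wall: gamma_w = du/dy at y = 0
        wall_strain_rate = (u[1] - u[0]) / (y[1] - y[0])      # 1/s
        print(f"wall strain rate = {wall_strain_rate:.0f} 1/s")   # 90 1/s

        # For a Newtonian fluid, wall shear stress follows: tau = mu * gamma_w
        mu_blood = 3.5e-3   # Pa*s, assumed blood-analog viscosity
        print(f"wall shear stress = {mu_blood * wall_strain_rate:.3f} Pa")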

  11. Speech Development of Autistic Children by Interactive Computer Games

    Science.gov (United States)

    Rahman, Mustafizur; Ferdous, S. M.; Ahmed, Syed Ishtiaque; Anwar, Anika

    2011-01-01

    Purpose: Speech disorder is one of the most common problems found with autistic children. The purpose of this paper is to investigate the introduction of computer-based interactive games along with the traditional therapies in order to help improve the speech of autistic children. Design/methodology/approach: From analysis of the works of Ivar…

  12. X-Y plotter adapter developed for SDS-930 computer

    Science.gov (United States)

    Robertson, J. B.

    1968-01-01

    Graphical Display Adapter provides a real time display for digital computerized experiments. This display uses a memory oscilloscope which records a single trace until erased. It is a small hardware unit which interfaces with the J-box feature of the SDS-930 computer to either an X-Y plotter or a memory oscilloscope.

  13. Development of android application for computation of air pollutant ...

    African Journals Online (AJOL)

    Over the past few decades, humans have experienced a revolution in the computer sciences, not only in terms of capability but also in terms of use. The advancement of smartphone technology has produced rapid and incredible inventions in many sectors such as construction, agriculture, education, health and many more. This paper ...

  14. Recent developments and innovative applications in computational mechanics

    CERN Document Server

    Mueller-Hoeppe, Dana; Reese, Stefanie

    2011-01-01

    This Festschrift is dedicated to Professor Peter Wriggers on the occasion of his 60th birthday. It contains contributions from friends and collaborators as well as current and former PhD students, and covers the latest advances in computational mechanics.

  15. Recent development in methods for electron optical computations

    Czech Academy of Sciences Publication Activity Database

    Lencová, Bohumila

    2001-01-01

    Vol. 93, No. 6 (2001), pp. 434-435. ISSN 0248-4900. Institutional research plan: CEZ:AV0Z2065902. Keywords: electron optical computations * finite element method. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering. Impact factor: 1.829, year: 2001

  16. Computer tools for face seal analyses developed at John Crane

    Science.gov (United States)

    Wu, Shifeng

    1994-01-01

    The purposes of the computer tools for face seal analysis are new product optimization, existing seals on new applications, existing seals on off-duty conditions, and trouble-shooting. Discussed in this viewgraph presentation are interface forces, friction/heat generation, heat transfer/temperature distribution, axisymmetric pressure/thermal distortion, leakage, and an example case.

  17. Evaluating the Effectiveness of Computer Applications in Developing English Learning

    Science.gov (United States)

    Whitaker, James Todd

    2016-01-01

    I examined the effectiveness of self-directed learning and English learning with computer applications on college students in Bangkok, Thailand, in a control-group experimental-group pretest-posttest design. The hypothesis was tested using a t test: two-sample assuming unequal variances to establish the significance of mean scores between the two…
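
    The "two-sample assuming unequal variances" test named here is Welch's t-test. A hedged sketch with made-up pretest-posttest gain scores for the two groups:

        from scipy import stats

        # Hypothetical score gains (posttest - pretest) for each group
        control      = [4, 6, 5, 3, 7, 5, 4, 6]
        experimental = [8, 9, 7, 10, 8, 9, 11, 7]

        # Welch's t-test: two-sample t test not assuming equal variances
        t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # small p -> significant difference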

  18. Development of a Computer Simulation for a Car Deceleration ...

    African Journals Online (AJOL)

    This is very practical, technical, and it happens every day. In this paper, we studied the factors responsible for this event. Using a computer simulation that is based on a mathematical model, we implemented the simulation of a car braking model and showed how long it takes a car to come to rest while considering certain ...
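
    A minimal version of such a braking model treats the car as decelerating at a constant rate a = µg once braking begins, giving stopping time v0/(µg) and stopping distance v0²/(2µg). The sketch below checks a simple Euler integration against the closed form; the friction coefficient and initial speed are assumed values.

        # Constant-deceleration braking model: a = -mu * g
        mu, g = 0.7, 9.81      # friction coefficient (assumed), gravity (m/s^2)
        v0 = 27.8              # initial speed in m/s (about 100 km/h)

        # Closed form for comparison
        t_stop = v0 / (mu * g)             # time to come to rest
        d_stop = v0**2 / (2 * mu * g)      # stopping distance

        # Simple Euler integration of the same model
        dt, t, v, d = 0.001, 0.0, v0, 0.0
        while v > 0:
            v -= mu * g * dt
            d += max(v, 0.0) * dt
            t += dt

        print(f"closed form: t = {t_stop:.2f} s, d = {d_stop:.1f} m")
        print(f"simulation : t = {t:.2f} s, d = {d:.1f} m")   # should agree closely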

  20. INFLUENCE OF DEVELOPMENT OF COMPUTER TECHNOLOGIES ON TEACHING

    Directory of Open Access Journals (Sweden)

    Sead Rešić

    2012-09-01

    Full Text Available Our times are characterized by strong changes in technology that have become reality in many areas of society. Compared to production, transport, services and other sectors, education, as a rule, opens slowly to new technologies. However, children at home and outside school live in a technologically rich environment, and they expect education to change in accordance with the imperatives of education for the twenty-first century. In this sense, systems for automated data processing, multimedia systems, distance learning, virtual schools and other technologies are being introduced into education. They lead to an increase in students' activity, quality evaluation of their knowledge and, finally, their progress, all in accordance with individual abilities and knowledge. Mathematics and computers often appear together in the teaching process. In the teaching of mathematics, computers and software packages have a significant role, where the software requirements are not dominant; the emphasis is on mathematical content and the method of presentation. Computers are especially used in solving various mathematical tasks and in self-directed learning of mathematics. Still, many problems that require solutions appear in the process: how to organise lectures, practice, textbooks, collections of mathematical problems and written exams, and how to assign and check homework. The answers to these questions are not simple and will probably be sought continuously, with an increasing use of computers in the teaching process. In this paper I have tried to address some of the questions above.

  1. Computer Aided Model Development for Automatic Tool Wear ...

    African Journals Online (AJOL)

    The pre-processing operations on the images (taken on photographic cards) included scanning, in order to transfer them onto a computer and convert them to digital images. Thresholding and segmentation were done in order to convert the altered background of the scanned images to a pure white background; the images were ...
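
    Thresholding of this kind (separating the tool image from an uneven background) is commonly done with Otsu's method, which picks the threshold automatically from the image histogram. A hedged sketch using OpenCV; the file names are placeholders and this is not necessarily the authors' exact pipeline:

        import cv2

        # Load a scanned tool image as 8-bit grayscale ('tool_scan.png' is hypothetical)
        img = cv2.imread("tool_scan.png", cv2.IMREAD_GRAYSCALE)

        # Otsu's method: pixels above the chosen threshold become pure white (255)
        thresh_val, binary = cv2.threshold(
            img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU
        )
        print(f"Otsu threshold chosen: {thresh_val}")
        cv2.imwrite("tool_binary.png", binary)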

  2. Development of Adjustable 3D computational phantoms for breast radiotherapy

    International Nuclear Information System (INIS)

    Emam, Zohal Alnour Ahmed

    2016-06-01

    Radiotherapy has become an essential part of breast cancer treatment and has been given great attention during the last decades due to its role in managing breast cancer successfully and in reducing recurrence and breast cancer mortality. Monte Carlo simulation has been used heavily for this purpose. To use Monte Carlo methods, a suitable data set must be found for the study; this process is not straightforward, and effort is needed to obtain such data. In this work we aimed to develop a methodology for obtaining adjustable 3D computational phantoms with different breast sizes. First, the MakeHuman software was used to generate outer-surface models with the desired anthropomorphic features. Three breast cup sizes were developed: small (A), medium (C) and large (D), according to the European dress standardization system. The Blender software was then used to join the skeleton and the outer surfaces of the internal organs to the body models in correct anatomical positions. The result was a poly-mesh anthropomorphic phantom with three breast sizes that is easy to reposition and modify. The prepared models were voxelised into 3D matrices (256×256×256) using the Binvox software, and the voxelised models were then prepared in suitable formats for Gate (mhd/raw) in 70 axial slices with a voxel size of 1.394×1.394×5 mm³ (width, depth and length, respectively). The Gate Monte Carlo code was used to simulate the irradiation of a virtual tumor bed site in the left breast with a direct-field electron beam; each breast size was treated with five energies (6, 9, 12, 15 and 18 MeV), a 5×5 cm² field size, and 100 cm source-surface distance (SSD). The results were studied to evaluate the effect of breast size variation on dose distribution. According to the criteria of tumor bed coverage by 90-100% of the normalised maximum dose and of minimum dose to the heart and lung, which are considered the organs at risk, the results show that the 6 MeV energy gives insufficient coverage of the tumor bed in the small, medium

  3. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview: In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations: Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme: The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations: Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  5. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  6. The Eukaryotic Microbiome: Origins and Implications for Fetal and Neonatal Life (nota bene: previous titles were "The Microbiome in the Development of Terrestrial Life" and "The Origins and Development of the Neonatal Microbiome")

    Directory of Open Access Journals (Sweden)

    William B. Miller

    2016-09-01

    Full Text Available All eukaryotic organisms are holobionts representing complex collaborations between the entire microbiome of each eukaryote and its innate cells. These linked constituencies form complex localized and interlocking ecologies in which the specific microbial constituents and their relative abundance differ substantially according to age and environmental exposures. Rapid advances in microbiology and genetic research techniques have uncovered a significant previous underestimate of the extent of that microbial contribution and its metabolic and developmental impact on holobionts. Therefore, a re-calibration of the neonatal period is suggested as a transitional phase in development that includes the acquisition of consequential collaborative microbial life from extensive environmental influences. These co-dependent, symbiotic relationships formed in the fetal and neonatal stages extend into adulthood and even across generations.

  7. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in an MCC-learning system as a service. Components hosted by MCC are used to empower developers to create…

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. Development of a guidance guide for dosimetry in computed tomography

    International Nuclear Information System (INIS)

    Fontes, Ladyjane Pereira

    2016-01-01

    Due to frequent questions from users of pencil-type ionization chambers calibrated at the Instrument Calibration Laboratory of the Institute of Energy and Nuclear Research (LCI - IPEN) about how to properly apply the factors indicated in their calibration certificates, a guidance guide for dosimetry in computed tomography was prepared. The guide includes guidance on prior knowledge of the half value layer (HVL), since the effective beam energy must be known in order to apply the beam quality correction factor (kq). The evaluation of HVL in CT scanners is a difficult task due to the system geometry, and therefore a survey was conducted of existing methodologies for determining HVL in clinical computed tomography beams, taking into account technical, practical and economic factors. In this work it was decided to test a Tandem system consisting of absorbing covers made in the IPEN workshop, chosen on the basis of preliminary studies for its low cost and good response. The Tandem system consists of five cylindrical absorbing covers of 1 mm, 3 mm, 5 mm, 7 mm and 10 mm aluminum and three cylindrical absorbing covers of 15 mm, 25 mm and 35 mm acrylic (PMMA), coupled to a commercial pencil-type ionization chamber widely used in quality control tests and dosimetry in clinical computed tomography beams. Through the Tandem curves it was possible to assess HVL values and, from the calibration curve of the pencil-type ionization chamber, to find the appropriate beam kq. The guide provides information on how to build the calibration curve as a function of the HVL, in order to find the kq, and information for constructing the Tandem curve, in order to find values close to the HVL. (author)
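
    The HVL itself can be estimated from transmission measurements: for a roughly monoenergetic beam, transmission falls as exp(-µx), so a log-linear fit of transmission versus absorber thickness yields µ and HVL = ln 2 / µ. A sketch with made-up aluminum transmission data (this is the generic attenuation method, not the Tandem method described above):

        import numpy as np

        # Hypothetical transmission readings behind aluminum absorbers
        thickness_mm = np.array([0.0, 1.0, 2.0, 4.0, 6.0])
        transmission = np.array([1.00, 0.91, 0.83, 0.69, 0.57])

        # Fit ln(T) = -mu * x (straight line through the log of the data)
        mu = -np.polyfit(thickness_mm, np.log(transmission), 1)[0]   # 1/mm
        hvl = np.log(2) / mu
        print(f"mu = {mu:.4f} 1/mm, HVL = {hvl:.2f} mm Al")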

  10. Developing the human-computer interface for Space Station Freedom

    Science.gov (United States)

    Holden, Kritina L.

    1991-01-01

    For the past two years, the Human-Computer Interaction Laboratory (HCIL) at the Johnson Space Center has been involved in prototyping and prototype reviews in support of the definition phase of the Space Station Freedom program. On the Space Station, crew members will be interacting with multi-monitor workstations where interaction with several displays at one time will be common. The HCIL has conducted several experiments to begin to address design issues for this complex system. Experiments have dealt with the design of ON/OFF indicators, the movement of the cursor across multiple monitors, and the importance of various windowing capabilities for users performing multiple tasks simultaneously.

  11. COMPUTATIONAL SIMULATION OF FIRE DEVELOPMENT INSIDE A TRADE CENTRE

    Directory of Open Access Journals (Sweden)

    Constantin LUPU

    2015-07-01

    Full Text Available Real-scale fire experiments involve considerable costs compared to computational mathematical modelling. This paper presents the results of such a virtual simulation of a fire occurring in a hypothetical wholesale warehouse comprising a large number of trade stands. The analysis starts from the ignition source located inside a trade stand and follows the fire expansion over three groups of compartments, highlighting the heat transfer both in small spaces and over large distances. In order to confirm the accuracy of the simulation, the obtained values are compared to those from the specialized literature.

  12. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach to the development of a data flow control and investigation system for computer networks. This approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network. It allowed us to solve our current network problems successfully. A description of our approach is presented below, along with the most interesting results of our work. (author)

  13. The Development of Educational and/or Training Computer Games for Students with Disabilities

    Science.gov (United States)

    Kwon, Jungmin

    2012-01-01

    Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…

  14. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    Science.gov (United States)

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  15. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses why developers should consider algorithms, such as a neural net representation, that do not exhibit load-balancing problems

  16. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.
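
    At its core, computational steering means the simulation checks for externally modified control parameters between iterations instead of reading them once at start-up. A minimal single-process sketch of that pattern (a real deployment, as described above, routes updates through a steering library and relay server such as RealityGrid; all names here are illustrative):

        import threading
        import time

        # Shared, steerable control parameters (stand-in for the steering library)
        params = {"relaxation": 1.0, "running": True}
        lock = threading.Lock()

        def simulation():
            step = 0
            while True:
                with lock:                 # pick up steered values each iteration
                    if not params["running"]:
                        break
                    relax = params["relaxation"]
                time.sleep(0.1)            # placeholder for one solver iteration
                step += 1
                print(f"step {step}: relaxation={relax}")

        sim = threading.Thread(target=simulation)
        sim.start()
        time.sleep(0.25)
        with lock:                         # a 'steering client' changes a parameter
            params["relaxation"] = 0.5
        time.sleep(0.25)
        with lock:
            params["running"] = False      # steer the run to a stop
        sim.join()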

  17. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. Developing a Framework for Intuitive Human-Computer Interaction.

    Science.gov (United States)

    O'Brien, Marita A; Rogers, Wendy A; Fisk, Arthur D

    2008-09-01

    Many technology marketing materials tout the intuitive nature of products, but current human-computer interaction (HCI) guidelines provide limited methods to help designers create this experience beyond making them easy to use. This paper proposes a definition for intuitive interaction with specific attributes to allow designers to create products that elicit the target experience. Review of relevant literatures provides empirical evidence for the suggested working definition of intuitive HCI: interactions between humans and high technology in lenient learning environments that allow the human to use a combination of prior experience and feedforward methods to achieve an individual's functional and abstract goals. Core concepts supporting this definition were compiled into an organizational framework that includes: seeking user goals, performing well-learned behavior, determining what to do next, metacognition, knowledge in the head, and knowledge in the world. This paper describes these concepts and proposes design approaches that could facilitate intuitive behavior and suggests areas for further research.

  20. The Development of Word Processor-Mainframe Computer Interaction

    Science.gov (United States)

    Cain, M.; Stocker, T.

    1983-01-01

    This paper addresses how peripheral word processing units have been modified into total workstations, enabling a user to perform multiple functions. Manuscripts can be prepared and edited, information can be passed to and extracted from the mainframe computer, and the mainframe's superior processing capabilities can be utilized. This is a viable alternative to the so-called “Heinz 57” approach to information systems characterized in many institutions. Presented here will be a short background of data processing at The Children's Hospital, necessary to show the explosive growth of data processing in a medical institution, a description of how word processing configuration played a determining role in equipment selection, and a sampling of actual word processing—mainframe applications that have been accomplished.

  1. The role of computers in developing countries with reference to East Africa

    International Nuclear Information System (INIS)

    Shayo, L.K.

    1984-01-01

    The role of computers in economic and technological development is examined with particular reference to developing countries. It is stressed that these countries must exploit the potential of computers in their drive to catch up in the development race. The shortage of qualified EDP personnel is singled out as one of the most critical factors behind any unsatisfactory state of computer applications. A computerization policy based on the demands for information arising from the sophistication of the development process, and supported by a sufficient core of qualified local manpower, is recommended. The situation in East Africa is discussed and recommendations for training and production of telematics equipment are made. (author)

  2. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  3. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    Current radiation effect assessment requires skill in applying a variety of codes and a high level of field-specific expert knowledge. In practice, it is therefore very difficult for radiation users who lack such specialized knowledge to assess or interpret radiation effects properly. To address this, we previously developed five Windows-based computer codes, which together form a radiation effect assessment system for radiation-utilizing fields including nuclear power generation. What is still needed is a program that lets non-specialists use these five codes with ease. We therefore implemented an AI-based expert system that can infer the appropriate assessment approach by itself, according to the characteristics of a given problem. The expert program can guide users, search data, and forward inquiries directly to an administrator. Conceptually, considering the situations that a user of the five codes may actually encounter, we addressed the following aspects. First, access to the required concepts and data must be improved. Second, acquiring the underlying reference theory and using the corresponding code must be easy. Third, a Q and A function is needed to answer user questions that were not anticipated in advance. Finally, the database must be updated continuously. Concretely, we developed a client program to organize reference data, to provide a query-based access methodology for the organized data, and to display the retrieved data visually. An instruction method (an effective procedure and methodology for acquiring the theory underlying the five codes) was implemented, and a data-management program (DBMS) was developed so that the data can be updated continuously with ease. For the Q and A function, a Q and A board was embedded in the client program so that users can search the contents of previous questions and answers. (authors)
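
    As a hypothetical sketch of the inference idea (all code names and rules below are invented, not taken from the system described), a rule base can map characteristics of a user's problem to one of the assessment codes, falling back to the administrator-inquiry path when no rule matches:

        # Hypothetical rule-based code selector; rules and code names are invented.
        RULES = [
            (lambda p: p["pathway"] == "atmospheric", "CODE_AIR"),
            (lambda p: p["pathway"] == "aquatic", "CODE_WATER"),
            (lambda p: p["exposure"] == "occupational", "CODE_WORKER"),
        ]

        def select_code(problem):
            """Return the first assessment code whose rule matches the problem."""
            for rule, code in RULES:
                if rule(problem):
                    return code
            return "ASK_ADMINISTRATOR"   # falls through to the Q and A / inquiry path

        print(select_code({"pathway": "aquatic", "exposure": "public"}))  # CODE_WATER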

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. A Computational Model Predicting Disruption of Blood Vessel Development

    Science.gov (United States)

    Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a varie...

  12. Development of Python applications for learning computational physics

    Directory of Open Access Journals (Sweden)

    Jesús Daniel Arias-Hernández

    2016-01-01

    …are useful for generating GUIs to show data in tables and graphics. The GUIs were implemented using the Tkinter and PyQt4 libraries, where the latter facilitated development with the help of the Qt Designer software tools.
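
    As a minimal illustration of the kind of GUI described (a sketch using only the Tkinter standard library; the projectile-range physics example is invented):

        # Minimal Tkinter GUI showing computed physics data in a table.
        import math
        import tkinter as tk
        from tkinter import ttk

        root = tk.Tk()
        root.title("Projectile range vs. launch angle")  # invented example

        table = ttk.Treeview(root, columns=("angle", "range"), show="headings", height=10)
        table.heading("angle", text="Angle (deg)")
        table.heading("range", text="Range (m)")
        table.pack(fill="both", expand=True)

        v0, g = 20.0, 9.81  # initial speed (m/s) and gravity (m/s^2)
        for angle in range(5, 90, 5):
            r = v0**2 * math.sin(2 * math.radians(angle)) / g
            table.insert("", "end", values=(angle, f"{r:.2f}"))

        root.mainloop()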

  13. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1994-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, a fixed, uniform assignment of nodes to parallel processors will result in degraded computational efficiency due to the poor load balancing. A standard method for treating data-dependent models on vector architectures has been to use gather operations (or indirect addressing) to sort the nodes into subsets that (temporarily) share a common computational model. However, this method is not effective on distributed memory data parallel architectures, where indirect addressing involves expensive communication overhead. Another serious problem with this method involves software engineering challenges in the areas of maintainability and extensibility. For example, an implementation that was hand-tuned to achieve good computational efficiency would have to be rewritten whenever the decision tree governing the sorting was modified. Using an example based on the calculation of the wall-to-liquid and wall-to-vapor heat-transfer coefficients for three nonboiling flow regimes, we describe how the use of the Fortran 90 WHERE construct and automatic inlining of functions can be used to ameliorate this problem while improving both efficiency and software engineering. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. We discuss why developers should either wait for such solutions or consider alternative numerical algorithms, such as a neural network
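
    As a rough illustration of the masked-computation idea (a Python sketch using NumPy's where() as a stand-in for the Fortran 90 WHERE construct; the flow-regime codes and correlations are invented):

        # Every regime branch is evaluated on all nodes and then masked, keeping
        # the work uniform across data-parallel processors instead of gathering
        # and sorting nodes by regime.
        import numpy as np

        rng = np.random.default_rng(0)
        regime = rng.integers(0, 3, size=16)       # data-dependent flow regime per node
        velocity = rng.uniform(0.1, 2.0, size=16)  # some local state variable

        # Three invented constitutive correlations, one per regime.
        h0 = 100.0 * velocity          # e.g. single-phase liquid
        h1 = 250.0 * velocity**0.8     # e.g. bubbly flow
        h2 = 50.0 + 10.0 * velocity    # e.g. annular flow

        # WHERE-style selection of the heat-transfer coefficient per node.
        htc = np.where(regime == 0, h0, np.where(regime == 1, h1, h2))
        print(htc)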

  14. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units, the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 equipment from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of these two CPUs (hardware and software) enables the system to switch automatically to either the first CPU or the second one. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodical listing, and specific calculations concerning the process (balances, etc.), and, at a later stage, an automatic control of certain units of the process. [fr]

  15. Recent Developments in Complex Analysis and Computer Algebra

    CERN Document Server

    Kajiwara, Joji; Xu, Yongzhi

    1999-01-01

    This volume consists of papers presented in the special sessions on "Complex and Numerical Analysis", "Value Distribution Theory and Complex Domains", and "Use of Symbolic Computation in Mathematics Education" of the ISAAC'97 Congress held at the University of Delaware, during June 2-7, 1997. The ISAAC Congress coincided with a U.S.-Japan Seminar also held at the University of Delaware. The latter was supported by the National Science Foundation through Grant INT-9603029 and the Japan Society for the Promotion of Science through Grant MTCS-134. It was natural that the participants of both meetings should interact and consequently several persons attending the Congress also presented papers in the Seminar. The success of the ISAAC Congress and the U.S.-Japan Seminar has led to the ISAAC'99 Congress being held in Fukuoka, Japan during August 1999. Many of the same participants will return to this Seminar. Indeed, it appears that the spirit of the U.S.-Japan Seminar will be continued every second year as part of...

  16. Development of a stereoscopic haptic acoustic real-time computer (SHARC)

    Science.gov (United States)

    Chen, Tom; Young, Peter; Anderson, David; Yu, Jiang; Nagata, Shojiro

    1998-04-01

    The Stereoscopic Haptic Acoustic Real-Time Computer (SHARC) is a multi-sensory computer system which integrates technologies for autostereoscopic display, acoustic sensing and rendering, and haptic interfaces into the same computing environment. This paper describes the system organization and the interface between different sensory components. This paper also discusses our findings from developing and using the SHARC system in application to a virtual environment in terms of interface, speed, and bandwidth issues, together with recommendations for future work in this area.

  17. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  18. Computer literacy among first year medical students in a developing country: A cross sectional study

    Science.gov (United States)

    2012-01-01

    Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. The study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August-September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in 6 domains: common software packages, operating systems, database management and the usage of the Internet and e-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and other independent covariates. Results Sample size was 181 (response rate 95.3%); 49.7% were males. The majority of the students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge by a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). There were 47.9% of students who scored less than 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had a significantly higher mean score in comparison to other students (p < 0.05). Formal computer training was the strongest predictor of computer literacy (β = 13.034), followed by using
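
    As an illustration of the analysis described (a sketch on synthetic data, not the study's dataset), the regression of the literacy score on binary covariates can be set up as follows:

        # Linear regression sketch on invented data; covariate names are
        # modeled loosely on the study's predictors.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 181
        owns_pc = rng.integers(0, 2, n)
        formal_training = rng.integers(0, 2, n)
        score = 30 + 8 * owns_pc + 13 * formal_training + rng.normal(0, 15, n)

        X = np.column_stack([np.ones(n), owns_pc, formal_training])  # design matrix
        beta, *_ = np.linalg.lstsq(X, score, rcond=None)             # least squares
        print(dict(zip(["intercept", "owns_pc", "formal_training"], beta.round(2))))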

  19. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Full Text Available Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption as a way to derive four new best practices for software development and incorporates the identified best practices into currently-in-use processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in the SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS cloud oriented development process through analyzing the feedback data collected from its actual application to the development of a SaaS cloud service, Astation.

  20. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    Science.gov (United States)

    1979-12-01

    stimulation/inspiration throughout their lengthy service as members of my study committee. I am indebted beyond measure, however, to two people whose... quantitative characteristics of software development phenomena. It was predicted a priori that these confirmatory aspects would verify the study's basic premises

  1. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the different categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  2. Development of computational small animal models and their applications in preclinical imaging and therapy research

    International Nuclear Information System (INIS)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the different categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future

  3. Development of computational small animal models and their applications in preclinical imaging and therapy research

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, Groningen 9700 RB (Netherlands)

    2016-01-15

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the different categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  4. Brain-muscle-computer interface: mobile-phone prototype development and testing.

    Science.gov (United States)

    Vernon, Scott; Joshi, Sanjay S

    2011-07-01

    We report prototype development and testing of a new mobile-phone-based brain-muscle-computer interface for severely paralyzed persons, based on previous results from our group showing that humans may actively create specified power levels in two separate frequency bands of a single surface electromyography (sEMG) signal. EMG activity on the surface of a single face muscle site (auricularis superior) is recorded with a standard electrode. This analog electrical signal is imported into an Android-based mobile phone and digitized via an internal A/D converter. The digital signal is split and then simultaneously filtered with two band-pass filters to extract the total power within two separate frequency bands. The user-modulated power in each frequency band serves as two separate control channels for machine control. After signal processing, the Android phone sends commands to external devices via a Bluetooth interface. Users are trained to use the device via visually based operant conditioning, with simple cursor-to-target activities on the phone screen. The mobile-phone prototype interface was formally evaluated on a single subject with advanced spinal muscular atrophy, who has successfully used the interface in his home in evaluation trials and for remote control of a television. Development of this new device will not only guide future interface design for community use, but will also serve as an information technology bridge for in situ data collection to quantify human sEMG manipulation abilities for a relevant population.
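
    A minimal sketch (not the authors' code; band edges and the test signal are invented) of the two-band control idea, in which one sEMG channel yields two independent control signals:

        # One EMG channel is band-pass filtered into two bands; the power in
        # each band becomes an independent control signal.
        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        fs = 1000.0                      # sampling rate (Hz)
        t = np.arange(0, 1.0, 1 / fs)
        emg = np.sin(2*np.pi*60*t) + 0.5*np.sin(2*np.pi*150*t)  # synthetic signal

        def band_power(signal, low, high):
            sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
            filtered = sosfiltfilt(sos, signal)
            return float(np.mean(filtered**2))  # mean power in the band

        ch1 = band_power(emg, 40, 80)    # control channel 1
        ch2 = band_power(emg, 120, 180)  # control channel 2
        print(f"band 1 power: {ch1:.3f}, band 2 power: {ch2:.3f}")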

  5. Development and validation of a computer-based learning module for wrist arthroscopy.

    Science.gov (United States)

    Obdeijn, M C; Alewijnse, J V; Mathoulin, C; Liverneaux, P; Tuijthof, G J M; Schijven, M P

    2014-04-01

    The objective of this study was to develop and validate a computer-based module for wrist arthroscopy to which a group of experts could consent. The need for such a module was assessed with members of the European Wrist Arthroscopy Society (EWAS). The computer-based module was developed through several rounds of consulting experts on the content. The module's learning enhancement was tested in a randomized controlled trial with 28 medical students who were assigned to a computer-based module group or a lecture group. The design process led to a useful tool, which is supported by a panel of experts. Although the computer-based module did not enhance learning, the participants did find the module more pleasant to use. Developing learning tools such as this computer-based module can improve the teaching of wrist arthroscopy skills. Copyright © 2014. Published by Elsevier SAS.

  6. Summaries of research and development activities by using JAEA computer system in FY2006. April 1, 2006 - March 31, 2007

    International Nuclear Information System (INIS)

    2008-02-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2006 (April 1, 2006 - March 31, 2007). (author)

  7. Summaries of research and development activities by using JAERI computer system in FY2004 (April 1, 2004 - March 31, 2005)

    International Nuclear Information System (INIS)

    2005-08-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer systems including super-computers in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the big users' research and development activities by using the computer system in FY2004 (April 1, 2004 - March 31, 2005). (author)

  8. Summaries of research and development activities by using JAERI computer system in FY2003. April 1, 2003 - March 31, 2004

    International Nuclear Information System (INIS)

    2005-03-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer systems including super-computers in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the big users' research and development activities by using the computer system in FY2003 (April 1, 2003 - March 31, 2004). (author)

  9. Summaries of research and development activities by using JAEA computer system in FY2005. April 1, 2005 - March 31, 2006

    International Nuclear Information System (INIS)

    2006-10-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2005 (April 1, 2005 - March 31, 2006). (author)

  10. development of a computer program for the design of auger

    African Journals Online (AJOL)

    User

    program was developed for the above processes to remove the constraints of the classical ... Results of evaluation tests show that the program is efficient in the ... [fragment of the program's FORTRAN listing omitted: screw/auger conveyor design output for "Chart A" materials]

  11. Computer-Aided Template for Model Reuse, Development and Maintenance

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2014-01-01

    …For the particle scale, two alternative mechanisms to describe the diffusion inside catalyst pellets are available: a Fickian diffusion model and a dusty gas model. Moreover, the effects of isothermal and non-isothermal catalysts are also considered during the model development process. Thereby, any number…

  12. Development of a dynamic computational model of social cognitive theory.

    Science.gov (United States)

    Riley, William T; Martin, Cesar A; Rivera, Daniel E; Hekler, Eric B; Adams, Marc A; Buman, Matthew P; Pavel, Misha; King, Abby C

    2016-12-01

    Social cognitive theory (SCT) is among the most influential theories of behavior change and has been used as the conceptual basis of health behavior interventions for smoking cessation, weight management, and other health behaviors. SCT and other behavior theories were developed primarily to explain differences between individuals, but explanatory theories of within-person behavioral variability are increasingly needed as new technologies allow for intensive longitudinal measures and interventions adapted from these inputs. These within-person explanatory theoretical applications can be modeled as dynamical systems. SCT constructs, such as reciprocal determinism, are inherently dynamical in nature, but SCT has not been modeled as a dynamical system. This paper describes the development of a dynamical system model of SCT using fluid analogies and control systems principles drawn from engineering. Simulations of this model were performed to assess if the model performed as predicted based on theory and empirical studies of SCT. This initial model generates precise and testable quantitative predictions for future intensive longitudinal research. Dynamic modeling approaches provide a rigorous method for advancing health behavior theory development and refinement and for guiding the development of more potent and efficient interventions.
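
    A minimal sketch of the fluid-analogy idea (invented construct, gains and inputs; not the authors' published model), in which a construct such as self-efficacy is a reservoir filled by inputs and drained over time:

        # First-order "fluid level" dynamics: inflow minus proportional outflow.
        import numpy as np

        dt, T = 1.0, 100                 # time step (days) and horizon
        tau, gain = 10.0, 0.8            # drain time constant and input gain
        mastery = np.zeros(T)
        mastery[20:40] = 1.0             # pulse of mastery experience (invented input)

        efficacy = np.zeros(T)
        for k in range(T - 1):
            efficacy[k+1] = efficacy[k] + dt * (gain * mastery[k] - efficacy[k] / tau)

        print(efficacy.max().round(3))   # peak level of the simulated construct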

  13. Development of Innovative Computer Software to Facilitate the Setup and Computation of Water Quality Index

    OpenAIRE

    Samira Yousefzadeh; Amir Hossein Mahvi; Mahmood Alimohammadi; Kazem Naddafi; Maryam Valadi Amin; Ramin Nabizadeh

    2013-01-01

    Background Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. Findings In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to faci...
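
    Because the abstract is truncated, the following sketches only a common weighted-arithmetic form of a water quality index, not necessarily the IWQIS formulation (parameters, sub-indices and weights are invented):

        # Weighted-arithmetic WQI: each parameter gets a 0-100 sub-index and a
        # weight; the index is the weighted mean, a single summary number.
        sub_index = {"pH": 85.0, "dissolved_oxygen": 70.0, "turbidity": 60.0}
        weight = {"pH": 0.2, "dissolved_oxygen": 0.5, "turbidity": 0.3}

        wqi = sum(sub_index[p] * weight[p] for p in sub_index) / sum(weight.values())
        print(f"WQI = {wqi:.1f}")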

  14. Continuous development of schemes for parallel computing of the electrostatics in biological systems: implementation in DelPhi.

    Science.gov (United States)

    Li, Chuan; Petukh, Marharyta; Li, Lin; Alexov, Emil

    2013-08-15

    Due to the enormous importance of electrostatics in molecular biology, calculating the electrostatic potential and corresponding energies has become a standard computational approach for the study of biomolecules and nano-objects immersed in water and salt phase or other media. However, the electrostatics of large macromolecules and macromolecular complexes, including nano-objects, may not be obtainable via explicit methods and even the standard continuum electrostatics methods may not be applicable due to high computational time and memory requirements. Here, we report further development of the parallelization scheme reported in our previous work (Li, et al., J. Comput. Chem. 2012, 33, 1960) to include parallelization of the molecular surface and energy calculations components of the algorithm. The parallelization scheme utilizes different approaches such as space domain parallelization, algorithmic parallelization, multithreading, and task scheduling, depending on the quantity being calculated. This allows for efficient use of the computing resources of the corresponding computer cluster. The parallelization scheme is implemented in the popular software DelPhi and results in speedup of several folds. As a demonstration of the efficiency and capability of this methodology, the electrostatic potential, and electric field distributions are calculated for the bovine mitochondrial supercomplex illustrating their complex topology, which cannot be obtained by modeling the supercomplex components alone. Copyright © 2013 Wiley Periodicals, Inc.
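
    A toy sketch of the space-domain parallelization idea (not DelPhi's actual scheme; geometry and charges are invented): the charge set is split into slabs and each worker computes its slab's contribution to the potential at a probe point:

        # Domain-decomposed Coulomb sum using a process pool.
        import numpy as np
        from multiprocessing import Pool

        rng = np.random.default_rng(2)
        charges = rng.uniform(-1, 1, 10000)
        positions = rng.uniform(0, 50.0, (10000, 3))   # angstroms
        probe = np.array([25.0, 25.0, 25.0])

        def slab_potential(idx):
            lo, hi = idx
            r = np.linalg.norm(positions[lo:hi] - probe, axis=1)
            return float(np.sum(charges[lo:hi] / r))    # vacuum Coulomb sum

        if __name__ == "__main__":
            bounds = [(i, i + 2500) for i in range(0, 10000, 2500)]  # 4 slabs
            with Pool(4) as pool:
                print(sum(pool.map(slab_potential, bounds)))  # total potential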

  15. Relationship between Young Children's Habitual Computer Use and Influencing Variables on Socio-Emotional Development

    Science.gov (United States)

    Seo, Hyun Ah; Chun, Hui Young; Jwa, Seung Hwa; Choi, Mi Hyun

    2011-01-01

    This study investigates the relationship between young children's habitual computer use and influencing variables on socio-emotional development. The participants were 179 five-year-old children. The Internet Addiction Scale for Young Children (IASYC) was used to identify children with high and low levels of habituation to computer use. The data…

  16. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  17. Design, Development, and Evaluation of a Mobile Learning Application for Computing Education

    Science.gov (United States)

    Oyelere, Solomon Sunday; Suhonen, Jarkko; Wajiga, Greg M.; Sutinen, Erkki

    2018-01-01

    The study focused on the application of the design science research approach in the course of developing a mobile learning application, MobileEdu, for computing education in the Nigerian higher education context. MobileEdu facilitates the learning of computer science courses on mobile devices. The application supports ubiquitous, collaborative,…

  18. CONTRIBUTIONS FOR DEVELOPING OF A COMPUTER AIDED LEARNING ENVIRONMENT OF DESCRIPTIVE GEOMETRY

    Directory of Open Access Journals (Sweden)

    Antonescu Ion

    2009-07-01

    Full Text Available The paper presents the authors’ contributions for developing a computer code for teaching of descriptive geometry using the computer aided learning techniques. The program was implemented using the programming interface and the 3D modeling capabilities of the AutoCAD system.

  19. Placental complications after a previous cesarean section

    OpenAIRE

    Milošević Jelena; Lilić Vekoslav; Tasić Marija; Radović-Janošević Dragana; Stefanović Milan; Antić Vladimir

    2009-01-01

    Introduction The incidence of cesarean section has been rising in the past 50 years. With the increased number of cesarean sections, the number of pregnancies with the previous cesarean section rises as well. The aim of this study was to establish the influence of the previous cesarean section on the development of placental complications: placenta previa, placental abruption and placenta accreta, as well as to determine the influence of the number of previous cesarean sections on the complic...

  20. Algorithm development for Maxwell's equations for computational electromagnetism

    Science.gov (United States)

    Goorjian, Peter M.

    1990-01-01

    A new algorithm has been developed for solving Maxwell's equations for the electromagnetic field. It solves the equations in the time domain with central, finite differences. The time advancement is performed implicitly, using an alternating direction implicit procedure. The space discretization is performed with finite volumes, using curvilinear coordinates with electromagnetic components along those directions. Sample calculations are presented of scattering from a metal pin, a square and a circle to demonstrate the capabilities of the new algorithm.
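
    As a much-simplified illustration (explicit rather than the implicit ADI scheme of the paper, and in 1D Cartesian rather than curvilinear coordinates), the sketch below advances Maxwell's curl equations with central differences on a staggered grid in normalized units:

        # Explicit 1D FDTD analogue of a time-domain Maxwell solver.
        import numpy as np

        n, steps = 200, 150
        ez = np.zeros(n)       # electric field
        hy = np.zeros(n - 1)   # magnetic field, staggered half a cell

        for t in range(steps):
            hy += np.diff(ez)              # update H from spatial derivative of E
            ez[1:-1] += np.diff(hy)        # update E from spatial derivative of H
            ez[n // 2] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source

        print(float(np.abs(ez).max()))     # peak field after propagation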

  1. Computer aided design and development of mixed-propeller pumps

    International Nuclear Information System (INIS)

    Bhaoyal, B.C.

    1994-01-01

    This paper deals with the design principles of mixed-propeller hydraulics, aided by CADD software developed by the author for generating the hydraulic profiles of the mixed propeller and the diffuser geometry. The design methodology for plotting the vane profile of a mixed-propeller pump is discussed in detail, with special reference to conformal transformation in the cylindrical as well as the conical plane. (author). 10 refs., 11 figs
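
    A minimal sketch of the cylindrical conformal-mapping idea (invented numbers; not the author's CADD code): a camber line drawn in the developed (flat) plane is wrapped onto a cylinder, with arc length mapped to angle so that angles are preserved:

        # Wrap a flat-plane blade camber line onto a cylinder of radius R.
        import numpy as np

        R = 0.15                                  # cylinder radius (m)
        s = np.linspace(0.0, 0.12, 25)            # chordwise coordinate, flat plane
        y = 0.02 * np.sin(np.pi * s / s[-1])      # invented camber line height

        theta = s / R                             # conformal: arc length -> angle
        x3d, y3d, z3d = R * np.cos(theta), R * np.sin(theta), y
        print(np.column_stack([x3d, y3d, z3d])[:3])  # first few 3D profile points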

  2. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  7. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  8. Sternal development in the pediatric population: evaluation using computed tomography

    International Nuclear Information System (INIS)

    Delgado, Jorge; Jaimes, Camilo; Gwal, Kriti; Jaramillo, Diego; Ho-Fung, Victor

    2014-01-01

    The normal development of the sternum on CT imaging has not been characterized. To describe the normal development of the sternum in children on chest CT imaging. CT imaging of 300 patients (150 male, 150 female), mean age 4.97 years (range 0.01-9.9 years), was evaluated retrospectively. The presence and number of ossification centers in the manubrium, each individual mesosternal segment and the xiphoid were reviewed. Additionally, the vertical and horizontal fusion between ossification centers was evaluated. Differences across age and gender were calculated. Descriptive statistics, analysis of variance (ANOVA), chi-square and Fisher exact tests were performed for statistical analysis. Manubrium: a single ossification center was seen in 88% of cases and two or three ossification centers were seen in 12%. A greater number of manubrial ossification centers correlated with younger age (P < 0.05). Xiphoid: absence was seen in 67% of the patients. Bifid xiphoid was seen in 1% of the patients. The normal development of the different components of the sternum is a process with wide variation among children. The large variability of mesosternal ossification center types should not be confused with pathology. (orig.)

  9. Development of the operator training system using computer graphics. Pt. 1. Defining the system configuration and developing basic techniques

    International Nuclear Information System (INIS)

    Takana, Kenchi; Sasou, Kunihide; Sano, Toshiaki; Suzuki, Koichi; Noji, Kunio

    2001-01-01

    Efficient and concurrent operator training will be crucial in the near future because of an increase in the number of operators to be trained due to generational turnover. The previously developed Man-Machine Simulator (MMS) has several merits: (1) Operators' cognitive and behavioral activities within a team in an emergency can be simulated based on a concurrent mental model; (2) Simulated scenarios can be expanded to multiple-malfunction events, for all of which procedures cannot be stipulated in advance; (3) Standard behavior in coping with anomalies, including communication and operations, can be presented. This paper describes the development of an operator training system applying this MMS. Three-dimensional computer graphics (3D-CG) was adopted to improve the training effects and to attract operators' interest by visually presenting realistic operating-team behavior in the main control room. Towards the completion of the operator training system, the following system configuration design and basic techniques were developed: (1) Based on the envisioned use of the operator training system, the functions to be equipped and the system configuration realizing those functions were determined, and three scenarios were chosen to demonstrate the merits of the MMS and to raise training effects. (2) A knowledge base was completed to execute simulations, and the operator team model was connected to the plant simulator (the second-generation simulator of the BTC-4) to obtain simulation results (time-sequential log data of plant dynamics and operating-team behavior). (3) Operators' actions seen in VCR tapes of real training were classified into eighteen kinds of fundamental categories, and those fundamental actions were modeled in 3D-CG using the People Shop software. The 3D-CG of the main control panel was prepared using Multi Gen software. (author)

  10. Managing Computer Systems Development: Understanding the Human and Technological Imperatives.

    Science.gov (United States)

    1985-06-01

    ... John Wiley and Sons, Inc. 12. Beckhard, Richard and Harris, Reuben T. 13. DeMarco, T., Structured Analysis and System Specification. 14. Dickover... information system development. Beckhard and Harris [Ref. 12: pp. 16-19] identify two essential conditions for any change effort to be effectively managed... February 1984. 10. Nolan, Richard L., "Controlling the Costs of Data Services," Harvard Business Review, July-August 1977. 11. Tichy, Noel M. ...

  11. Cyclopentane combustion chemistry. Part I: Mechanism development and computational kinetics

    KAUST Repository

    Rachidi, Mariam El

    2017-06-23

    Cycloalkanes are significant constituents of conventional fossil fuels, in which they are one of the main contributors to soot formation, and they also significantly influence ignition characteristics below ∼900 K. This paper discusses the development of a detailed high- and low-temperature oxidation mechanism for cyclopentane, an important archetypical cycloalkane. The differences between cyclic and non-cyclic alkane chemistry, and thus the inapplicability of acyclic alkane analogies, required detailed theoretical investigation of the kinetics of important cyclopentane oxidation reactions as part of the mechanism development. The cyclopentyl + O2 reaction was investigated at the UCCSD(T)-F12a/cc-pVTZ-F12//M06-2X/6-311++G(d,p) level of theory in a time-dependent master equation framework. Comparisons with analogous cyclohexane and non-cyclic alkane reactions are presented. Our study suggests that, beyond accurate quantum chemistry, the inclusion of pressure dependence and especially of formally direct kinetics is crucial, even at pressures relevant for practical application.

  12. Development of a prototype gantry system for preclinical x-ray phase-contrast computed tomography

    International Nuclear Information System (INIS)

    Tapfer, Arne; Bech, Martin; Pauwels, Bart; Liu Xuan; Bruyndonckx, Peter; Sasov, Alexander; Kenntner, Johannes; Mohr, Juergen; Walter, Marco; Schulz, Joachim; Pfeiffer, Franz

    2011-01-01

    Purpose: To explore the potential of grating-based x-ray phase-contrast imaging for clinical applications, a first compact gantry system was developed. It is designed such that it can be implemented into an in-vivo small-animal phase-contrast computed tomography (PC-CT) scanner. The purpose of the present study is to assess the accuracy and quantitativeness of the described gantry in both absorption and phase-contrast. Methods: A phantom, containing six chemically well-defined liquids, was constructed. A tomography scan with cone-beam reconstruction of this phantom was performed yielding the spatial distribution of the linear attenuation coefficient μ and decrement δ of the complex refractive index. Theoretical values of μ and δ were calculated for each liquid from tabulated data and compared with the experimentally measured values. Additionally, a color-fused image representation is proposed to display the complementary absorption and phase-contrast information in a single image. Results: Experimental and calculated data of the phantom agree well confirming the quantitativeness and accuracy of the reconstructed spatial distributions of μ and δ. The proposed color-fused image representation, which combines the complementary absorption and phase information, considerably helps in distinguishing the individual substances. Conclusions: The concept of grating-based phase-contrast computed tomography (CT) can be implemented into a compact, cone-beam geometry gantry setup. The authors believe that this work represents an important milestone in translating phase-contrast x-ray imaging from previous proof-of-principle experiments to first preclinical biomedical imaging applications on small-animal models.
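
    The color-fused representation the abstract proposes can be approximated by assigning the two reconstructed quantities to separate color channels. A minimal sketch under that assumption (the paper's actual color mapping may differ, and the input arrays here are placeholders):

    ```python
    import numpy as np

    # Hedged sketch: fuse absorption (mu) and phase (delta) reconstructions
    # into one RGB image. The channel assignment is an assumption, not the
    # mapping used in the paper.
    def fuse(mu, delta):
        norm = lambda a: (a - a.min()) / (np.ptp(a) + 1e-12)
        rgb = np.zeros(mu.shape + (3,))
        rgb[..., 0] = norm(mu)       # red channel: linear attenuation coefficient
        rgb[..., 1] = norm(delta)    # green channel: refractive index decrement
        return rgb                   # blue left empty; overlaps appear yellow

    mu = np.random.rand(64, 64)      # placeholder reconstructed slices
    delta = np.random.rand(64, 64)
    print(fuse(mu, delta).shape)     # (64, 64, 3) image ready for display
    ```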

  13. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  14. The impact of home computer use on children's activities and development.

    Science.gov (United States)

    Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F

    2000-01-01

    The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions: The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations: Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and its components are now installed and deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  16. Developing computational model-based diagnostics to analyse clinical chemistry data

    NARCIS (Netherlands)

    Schalkwijk, D.B. van; Bochove, K. van; Ommen, B. van; Freidig, A.P.; Someren, E.P. van; Greef, J. van der; Graaf, A.A. de

    2010-01-01

    This article provides methodological and technical considerations to researchers starting to develop computational model-based diagnostics using clinical chemistry data.These models are of increasing importance, since novel metabolomics and proteomics measuring technologies are able to produce large

  17. Evolution of the Milieu Approach for Software Development for the Polymorphous Computing Architecture Program

    National Research Council Canada - National Science Library

    Dandass, Yoginder

    2004-01-01

    A key goal of the DARPA Polymorphous Computing Architectures (PCA) program is to develop reactive closed-loop systems that are capable of being dynamically reconfigured in order to respond to changing mission scenarios...

  18. Previously unknown organomagnesium compounds in astrochemical context

    OpenAIRE

    Ruf, Alexander

    2018-01-01

    We describe the detection of dihydroxymagnesium carboxylates (CHOMg) in an astrochemical context. CHOMg was detected in meteorites via ultrahigh-resolving chemical analytics and represents a novel, previously unreported chemical class. Its chemical stability was therefore probed via quantum chemical computations, in combination with experimental fragmentation techniques. The results propose the putative formation of green-chemical OH-Grignard-type molecules and triggered fundamental questions within chemica...

  19. Development of computer program for safety of nuclear power plant against tsunami

    International Nuclear Information System (INIS)

    Jin, S. B.; Choi, K. R.; Lee, S. K.; Cho, Y. S.

    2001-01-01

    The main objective of this study is the development of a computer program to check the safety of nuclear power plants along the coastline of the Korean Peninsula against tsunamis. The computer program describes the propagation and associated run-up process of tsunamis by solving the linear and nonlinear shallow-water equations with finite difference methods. The program has been applied to several idealized and simplified problems; the numerical solutions obtained were compared to existing solutions and measurements, and very good agreement was observed. The computer program developed in this study can be used for the safety analysis of nuclear power plants against tsunamis. It can also be used to study the propagation of tsunamis over long distances and the associated run-up and run-down processes along a shoreline. Furthermore, the computer program can be used to provide proper design criteria for coastal facilities and structures
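
    As a rough illustration of the kind of solver the abstract describes, the sketch below advances the linear shallow-water equations on a one-dimensional staggered grid with an explicit finite-difference scheme. The domain size, depth, initial hump and step count are all assumed values, not parameters from the study:

    ```python
    import numpy as np

    # Linear shallow-water equations, 1-D staggered explicit scheme (sketch).
    g = 9.81                                   # gravity [m/s^2]
    L, N = 400e3, 800                          # domain length [m], cells
    dx = L / N
    h = np.full(N, 4000.0)                     # still-water depth [m], flat bed
    dt = 0.5 * dx / np.sqrt(g * h.max())       # CFL-limited time step [s]

    x = np.linspace(0.0, L, N)
    eta = np.exp(-((x - L / 2) / 20e3) ** 2)   # initial free-surface hump [m]
    q = np.zeros(N + 1)                        # volume flux at cell faces

    for _ in range(2000):
        # momentum update at interior faces, then continuity at cell centres;
        # q[0] = q[N] = 0 gives reflective (wall) boundaries
        q[1:-1] -= g * 0.5 * (h[:-1] + h[1:]) * dt / dx * np.diff(eta)
        eta -= dt / dx * np.diff(q)

    print("free-surface elevation at the right wall:", eta[-1])
    ```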

  20. Development of computing code system for level 3 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jong Tae; Yu, Dong Han; Kim, Seung Hwan

    1997-07-01

    Among the various research areas of level 3 PSA, the effect of terrain on the transport of radioactive material was investigated through wind tunnel experiments. These results will give physical insight for the development of a new dispersion model. Because there are some discrepancies between the results from the Gaussian plume model and those from field tests, the effect of terrain on atmospheric dispersion was investigated using the CTDMPLUS code. Through this study we found that a model that can treat terrain effects is essential for the atmospheric dispersion of radioactive materials and that the CTDMPLUS model can be used as a useful tool; it is suggested that model modification and experimental study be pursued through continuous effort. A health effect assessment near the Yonggwang site, using IPE (Individual Plant Examination) results and site data, was performed. Health effect assessment is an important part of the consequence analysis of a nuclear power plant site. MACCS was used in the assessment. Based on the calculation of the CCDF for each risk measure, it is shown that the CCDF has a slow slope, and thus a wide probability distribution, for early fatality, early injury, total early fatality risk, and total weighted early fatality risk; for cancer fatality and population dose within 48 km and 80 km, the CCDF curves have a steep slope and thus a narrow probability distribution. Methodologies for the models needed for consequence analysis of a severe accident in a nuclear power plant were established, and a program for consequence analysis was developed. The models include atmospheric transport and diffusion, calculation of exposure doses for various pathways, and assessment of health effects and associated risks. Finally, the economic impact resulting from an accident in a nuclear power plant was investigated. In this study, estimation models for each cost term considered in economic
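
    Since the record contrasts the flat-terrain Gaussian plume model with the terrain-aware CTDMPLUS, a minimal Gaussian plume evaluation is sketched below for orientation. The dispersion-coefficient power laws and release parameters are assumptions for illustration, not values from the study or from CTDMPLUS:

    ```python
    import numpy as np

    # Ground-level Gaussian plume concentration with reflection at the ground.
    # sigma_y, sigma_z are illustrative power laws, not the Pasquill-Gifford
    # fits used in any particular code; Q, u, H below are assumed values.
    def gaussian_plume(Q, u, x, y=0.0, z=0.0, H=50.0):
        sigma_y = 0.08 * x / np.sqrt(1 + 0.0001 * x)   # assumed stability class
        sigma_z = 0.06 * x / np.sqrt(1 + 0.0015 * x)
        return (Q / (2 * np.pi * u * sigma_y * sigma_z)
                * np.exp(-y**2 / (2 * sigma_y**2))
                * (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                   + np.exp(-(z + H)**2 / (2 * sigma_z**2))))

    # concentration 1 km downwind per unit release rate, 5 m/s wind
    print(gaussian_plume(Q=1.0, u=5.0, x=1000.0))
    ```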

  1. Articles on Practical Cybernetics. Computer-Developed Computers; Heuristics and Modern Sciences; Linguistics and Practice; Cybernetics and Moral-Ethical Considerations; and Men and Machines at the Chessboard.

    Science.gov (United States)

    Berg, A. I.; And Others

    Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)

  2. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis and performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level of the X-ray spectrum is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam; higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand, and in support of ongoing research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large-scale multiple-source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large-area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open-tube microfocus X-ray source and a 450 kV closed-tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high-precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system can currently accommodate samples up to 0.5 x 0.5 x 0.5 m in size and up to 50 kg in weight. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in the future

  3. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  4. Highlights from the previous volumes

    Science.gov (United States)

    Vergini, Eduardo G.; Pan, Y.; Vardi, R., et al.; Akkermans, Eric, et al.; et al.

    2014-01-01

    Semiclassical propagation up to the Heisenberg time; Superconductivity and magnetic order in the half-Heusler compound ErPdBi; An experimental evidence-based computational paradigm for new logic-gates in neuronal activity; Universality in the symmetric exclusion process and diffusive systems

  5. Development of an X-ray Computed Tomography System for Non-Invasive Imaging of Industrial Materials

    International Nuclear Information System (INIS)

    Abdullah, J.; Sipaun, S. M.; Mustapha, I.; Zain, R. M.; Rahman, M. F. A.; Mustapha, M.; Shaari, M. R.; Hassan, H.; Said, M. K. M.; Mohamad, G. H. P.; Ibrahim, M. M.

    2008-01-01

    X-ray computed tomography is a powerful non-invasive imaging technique for viewing an object's inner structures in two-dimensional cross-section images without the need to physically section it. The invention of CT techniques revolutionised the field of medical diagnostic imaging because it provided more detailed and useful information than any previous non-invasive imaging technique. The method is increasingly being used in industry, aerospace, geosciences and archaeology. This paper describes the development of an X-ray computed tomography system for imaging of industrial materials. The theoretical aspects of the CT scanner, the system configuration and the adopted algorithm for image reconstruction are discussed. The penetrating rays from a 160 kV industrial X-ray machine were used to investigate structures that manifest in a manufactured component or product. Some results are presented in this paper
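
    The abstract does not name the adopted reconstruction algorithm, so as a stand-in the sketch below runs filtered back-projection, the textbook choice for such scanners, on a simulated phantom using scikit-image:

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    # Hedged sketch: filtered back-projection on a simulated sinogram.
    # Illustrative only; the system described may use a different algorithm.
    image = rescale(shepp_logan_phantom(), 0.5)          # ground-truth slice
    theta = np.linspace(0.0, 180.0, max(image.shape))    # projection angles [deg]
    sinogram = radon(image, theta=theta)                 # simulated projections
    recon = iradon(sinogram, theta=theta, filter_name="ramp")
    print("RMS reconstruction error:", np.sqrt(np.mean((recon - image) ** 2)))
    ```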

  6. Computer users' risk factors for developing shoulder, elbow and back symptoms

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Søgaard, Karen; Strøyer, Jesper

    2004-01-01

    to be afflicted than men in all regions. In the full-fit multivariate logistic regression analysis, little influence on the timing of a rest pause and being disturbed by glare or reflection were significant predictors of shoulder symptoms, a screen below eye height was a significant predictor of elbow symptoms, and previous symptoms were a significant predictor of symptoms in all regions. Computer work time and psychosocial dimensions were not significant predictors. CONCLUSIONS: Influence on work pauses, reduction of glare or reflection, and screen height are important factors in the design of future computer
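
    A full-fit multivariate logistic regression of the kind reported can be set up as below. The data, predictor names and coefficients are entirely hypothetical, not the study's dataset; the sketch only shows the modeling step and the odds-ratio readout:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical data mimicking the study's binary predictors.
    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.integers(0, 2, n),   # little influence on timing of rest pauses
        rng.integers(0, 2, n),   # disturbed by glare or reflection
        rng.integers(0, 2, n),   # screen below eye height
        rng.integers(0, 2, n),   # previous symptoms
    ])
    logit = -2.0 + X @ np.array([0.6, 0.5, 0.4, 1.2])    # assumed effects
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))        # shoulder symptoms

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
    print(np.exp(model.params[1:]))                      # odds ratios per predictor
    ```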

  7. Development of the two Korean adult tomographic computational phantoms for organ dosimetry

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lee, Choonik; Park, Sang-Hyun; Lee, Jai-Ki

    2006-01-01

    Following the previously developed Korean tomographic phantom, KORMAN, two additional whole-body tomographic phantoms of Korean adult males were developed from magnetic resonance (MR) and computed tomography (CT) images, respectively. Two healthy male volunteers, whose body dimensions were fairly representative of the average Korean adult male, were recruited and scanned for phantom development. Contiguous whole-body MR images were obtained from one subject exclusive of the arms, while whole-body CT images were acquired from the second individual. A total of 29 organs and tissues and 19 skeletal sites were segmented via image manipulation techniques such as gray-level thresholding, region growing, and manual drawing, and each segmented image slice was subsequently reviewed by an experienced radiologist for anatomical accuracy. The resulting phantoms, the MR-based KTMAN-1 (Korean Typical MAN-1) and the CT-based KTMAN-2 (Korean Typical MAN-2), consist of 300x150x344 voxels with a voxel resolution of 2 x 2 x 5 mm³ for both phantoms. Masses of segmented organs and tissues were calculated as the product of a nominal reference density, the per-voxel volume, and the cumulative number of voxels defining each organ or tissue. These organ masses were then compared with those of both the Asian and the ICRP reference adult male. Organ masses within both KTMAN-1 and KTMAN-2 showed differences within 40% of Asian and ICRP reference values, with the exception of the skin, gall bladder, and pancreas, which displayed larger differences. The resulting three-dimensional binary file was ported to the Monte Carlo code MCNPX2.4 to calculate organ doses following external irradiation for illustrative purposes. Colon, lung, liver, and stomach absorbed doses, as well as the effective dose, for idealized photon irradiation geometries (anterior-posterior and right lateral) were determined, and then compared with data from two other tomographic phantoms (Asian and Caucasian), and
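
    The organ-mass bookkeeping described above is simple arithmetic: mass = nominal reference density x per-voxel volume x voxel count. A worked example with an assumed voxel count and a nominal soft-tissue density (both illustrative, not values from the paper):

    ```python
    # mass = reference density x per-voxel volume x number of voxels
    voxel_volume_cm3 = 0.2 * 0.2 * 0.5       # 2 x 2 x 5 mm voxels -> 0.02 cm^3
    liver_voxels = 90_000                     # hypothetical segmentation count
    liver_density = 1.06                      # g/cm^3, nominal soft-tissue value

    mass_g = liver_density * voxel_volume_cm3 * liver_voxels
    print(f"liver mass = {mass_g:.0f} g")     # 1908 g for these assumed inputs
    ```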

  8. A History of Computer-Based Instruction and its Effects on Developing Instructional Technologies

    Directory of Open Access Journals (Sweden)

    Ömer Faruk Sözcü

    2013-01-01

    The purpose of this paper is to discuss instructional and technological developments based on the history of computer-based instruction (CBI). Historically, the development of the CBI movement began in earnest at the end of the 1960s and in the early 1970s. At that time computers began, for the first time, to be used in education, mainly for teaching language and mathematics. CBI emerged from the programmed instruction and teaching machines of the mid-1950s. Educational computing began with a few large, government-funded projects on mainframes and minicomputers, and several systems, such as PLATO and TICCIT, were developed for use in instruction. The paper then discusses developments in CBI after the 1970s, tracing new instructional and technological developments, as part of new learning technologies, from past to present for students and educators in schools.

  9. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    Science.gov (United States)

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

    To review recent advances in the research and application of computer-aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer-aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer-aided forming techniques for bone scaffold construction using various scaffold materials, based on computer-aided design (CAD) and bone scaffold rapid prototyping (RP). CAD includes medical CAD, STL, and reverse design; reverse design can fully simulate normal bone tissue and is very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer-aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  10. Computer literacy among first year medical students in a developing country: A cross sectional study

    Directory of Open Access Journals (Sweden)

    Ranasinghe Priyanga

    2012-09-01

    Abstract Background: The use of computer-assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem-solving skills and increases student satisfaction. This study evaluates computer literacy among first-year medical students in Sri Lanka. Methods: The study was conducted at the Faculty of Medicine, University of Colombo, Sri Lanka between August and September 2008. First-year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated, pre-tested, self-administered questionnaire. Computer literacy was evaluated by testing knowledge in six domains: common software packages, operating systems, database management, and the use of the Internet and e-mail. A linear regression was conducted using the total score for computer literacy as the continuous dependent variable and the other independent covariates. Results: The sample size was 181 (response rate 95.3%); 49.7% were male. The majority of the students (77.3%) owned a computer (males 74.4%, females 80.2%). Students had gained their present computer knowledge through a formal training programme (64.1%), self-learning (63.0%) or peer learning (49.2%). The students used computers predominantly for word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). The majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. The mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (males 47.8 ± 21.1, females 48.9 ± 19.6). 47.9% of students had a score below 50% on the computer literacy questionnaire. Students from Colombo district, Western Province, and students owning a computer had a significantly higher mean score in comparison to other students. Conclusion: Sri Lankan medical undergraduates had a low-to-intermediate level of computer literacy.

  11. Summaries of research and development activities by using JAEA computer system in FY2007. April 1, 2007 - March 31, 2008

    International Nuclear Information System (INIS)

    2008-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2007 (April 1, 2007 - March 31, 2008). (author)

  12. Summaries of research and development activities by using JAEA computer system in FY2009. April 1, 2009 - March 31, 2010

    International Nuclear Information System (INIS)

    2010-11-01

    Center for Computational Science and e-Systems (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2009 (April 1, 2009 - March 31, 2010). (author)

  13. Multimedia computer-assisted instruction for carers on exercise for older people: development and testing.

    Science.gov (United States)

    Ponpaipan, Muthita; Srisuphan, Wichit; Jitapunkul, Sutthichai; Panuthai, Sirirat; Tonmukayakul, Ouyporn; While, Alison

    2011-02-01

    This paper is a report of a study conducted to develop a multimedia computer-assisted instruction for informal carers and test its content validity, user difficulty and user satisfaction. Healthy ageing is an increasingly important public health target globally. Changes in technology offer the opportunity for e-health promotion as a means of educating populations and healthcare staff to meet public health targets. Computer-assisted instruction was developed and tested systematically in four phases during 2008, and these are outlined. Phase 1 consisted of topic and content identification using a literature review. Phase 2 comprised refinement of the content using an academic panel of experts. Phase 3 was the production of computer-assisted instruction comprising problem clarification, algorithm designing with reference to a cognitive theory of multimedia learning and program coding. Phase 4 consisted of testing for content validity, and writing a computer-assisted instruction manual and testing it for user difficulty and satisfaction. The data from each phase informed the development and refinement of the computer-assisted instruction. Content validity was confirmed and 'test' users reported few difficulties in its use and high satisfaction. This e-health promotion initiative is an example of how computer-assisted instruction may be developed to teach carers of older people. © 2010 Thailand Research Fund. Journal of Advanced Nursing © 2010 Blackwell Publishing Ltd.

  14. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D has been conducted first to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called the Information Technology Based Laboratory (ITBL), leading to the construction of an intelligent infrastructure for atomic energy research called the Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  15. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    methods introduce. The key prerequisite of computer-aided product-process engineering is however the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task......Model-based computer aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided....... The methodology has been implemented into a computer-aided modeling framework, which combines expert skills, tools, and database connections that are required for the different steps of the model development work-flow with the goal to increase the efficiency of the modeling process. The framework has two main...

  16. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
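
    The IGOAL's cloud algorithm is not specified in the abstract, but procedurally generated clouds are commonly built from fractal value noise summed over octaves and thresholded into a coverage mask. A small stand-alone sketch of that general technique, with all parameters assumed:

    ```python
    import numpy as np

    # Fractal value noise (bilinearly interpolated random lattice, summed
    # over octaves), a common basis for procedural clouds. Illustrative only.
    def value_noise(shape, res, rng):
        grid = rng.random((res + 1, res + 1))
        ys = np.linspace(0, res, shape[0], endpoint=False)
        xs = np.linspace(0, res, shape[1], endpoint=False)
        yi, xi = ys.astype(int), xs.astype(int)
        ty, tx = (ys - yi)[:, None], (xs - xi)[None, :]
        g = lambda dy, dx: grid[yi + dy][:, xi + dx]
        return ((1 - ty) * (1 - tx) * g(0, 0) + (1 - ty) * tx * g(0, 1)
                + ty * (1 - tx) * g(1, 0) + ty * tx * g(1, 1))

    rng = np.random.default_rng(42)
    size = (256, 256)
    clouds = sum(value_noise(size, 2 ** o, rng) / 2 ** o for o in range(1, 6))
    mask = np.clip((clouds - 0.5) * 4.0, 0.0, 1.0)   # soft threshold -> coverage
    print("cloud coverage:", (mask > 0).mean())
    ```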

  17. Computational fluid dynamics modelling of perfusion measurements in dynamic contrast-enhanced computed tomography: development, validation and clinical applications

    International Nuclear Information System (INIS)

    Peladeau-Pigeon, M; Coolens, C

    2013-01-01

    Dynamic contrast-enhanced computed tomography (DCE-CT) is an imaging tool that aids in evaluating functional characteristics of tissue at different stages of disease management: diagnostic, radiation treatment planning, treatment effectiveness, and monitoring. Clinical validation of DCE-derived perfusion parameters remains an outstanding problem to address prior to perfusion imaging becoming a widespread standard as a non-invasive quantitative measurement tool. One approach to this validation process has been the development of quality assurance phantoms in order to facilitate controlled perfusion ex vivo. However, most of these systems fail to establish and accurately replicate physiologically relevant capillary permeability and exchange performance. The current work presents the first step in the development of a prospective suite of physics-based perfusion simulations based on coupled fluid flow and particle transport phenomena with the goal of enhancing the understanding of clinical contrast agent kinetics. Existing knowledge about a controllable, two-compartmental fluid exchange phantom was used to validate the computational fluid dynamics (CFD) simulation model presented herein. The sensitivity of CFD-derived contrast uptake curves to contrast injection parameters, including injection duration and flow rate, were quantified and found to be within 10% accuracy. The CFD model was employed to evaluate two commonly used clinical kinetic algorithms used to derive perfusion parameters: Fick's principle and the modified Tofts model. Neither kinetic model was able to capture the true transport phenomena it aimed to represent but if the overall contrast concentration after injection remained identical, then successive DCE-CT evaluations could be compared and could indeed reflect differences in regional tissue flow. This study sets the groundwork for future explorations in phantom development and pharmaco-kinetic modelling, as well as the development of novel contrast
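
    For reference, the modified Tofts model named above relates the tissue contrast concentration C_t(t) to the arterial input function C_p(t) in its standard textbook form (shown for orientation; this is not the paper's implementation):

    ```latex
    % Modified Tofts model (standard form)
    C_t(t) = v_p\,C_p(t)
           + K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\, e^{-k_{\mathrm{ep}}(t-\tau)}\,\mathrm{d}\tau,
    \qquad k_{\mathrm{ep}} = K^{\mathrm{trans}} / v_e
    ```

    Here v_p and v_e are the plasma and extravascular extracellular volume fractions and K^trans is the transfer constant; Fick's principle, the other algorithm named, instead balances the tissue uptake rate against flow times the arteriovenous concentration difference, dC_t/dt = F (C_a(t) - C_v(t)).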

  18. Computational fluid dynamics modelling of perfusion measurements in dynamic contrast-enhanced computed tomography: development, validation and clinical applications.

    Science.gov (United States)

    Peladeau-Pigeon, M; Coolens, C

    2013-09-07

    Dynamic contrast-enhanced computed tomography (DCE-CT) is an imaging tool that aids in evaluating functional characteristics of tissue at different stages of disease management: diagnostic, radiation treatment planning, treatment effectiveness, and monitoring. Clinical validation of DCE-derived perfusion parameters remains an outstanding problem to address prior to perfusion imaging becoming a widespread standard as a non-invasive quantitative measurement tool. One approach to this validation process has been the development of quality assurance phantoms in order to facilitate controlled perfusion ex vivo. However, most of these systems fail to establish and accurately replicate physiologically relevant capillary permeability and exchange performance. The current work presents the first step in the development of a prospective suite of physics-based perfusion simulations based on coupled fluid flow and particle transport phenomena with the goal of enhancing the understanding of clinical contrast agent kinetics. Existing knowledge about a controllable, two-compartmental fluid exchange phantom was used to validate the computational fluid dynamics (CFD) simulation model presented herein. The sensitivity of CFD-derived contrast uptake curves to contrast injection parameters, including injection duration and flow rate, were quantified and found to be within 10% accuracy. The CFD model was employed to evaluate two commonly used clinical kinetic algorithms used to derive perfusion parameters: Fick's principle and the modified Tofts model. Neither kinetic model was able to capture the true transport phenomena it aimed to represent but if the overall contrast concentration after injection remained identical, then successive DCE-CT evaluations could be compared and could indeed reflect differences in regional tissue flow. This study sets the groundwork for future explorations in phantom development and pharmaco-kinetic modelling, as well as the development of novel contrast

  19. Computational fluid dynamics modelling of perfusion measurements in dynamic contrast-enhanced computed tomography: development, validation and clinical applications

    Science.gov (United States)

    Peladeau-Pigeon, M.; Coolens, C.

    2013-09-01

    Dynamic contrast-enhanced computed tomography (DCE-CT) is an imaging tool that aids in evaluating functional characteristics of tissue at different stages of disease management: diagnostic, radiation treatment planning, treatment effectiveness, and monitoring. Clinical validation of DCE-derived perfusion parameters remains an outstanding problem to address prior to perfusion imaging becoming a widespread standard as a non-invasive quantitative measurement tool. One approach to this validation process has been the development of quality assurance phantoms in order to facilitate controlled perfusion ex vivo. However, most of these systems fail to establish and accurately replicate physiologically relevant capillary permeability and exchange performance. The current work presents the first step in the development of a prospective suite of physics-based perfusion simulations based on coupled fluid flow and particle transport phenomena with the goal of enhancing the understanding of clinical contrast agent kinetics. Existing knowledge about a controllable, two-compartmental fluid exchange phantom was used to validate the computational fluid dynamics (CFD) simulation model presented herein. The sensitivity of CFD-derived contrast uptake curves to contrast injection parameters, including injection duration and flow rate, were quantified and found to be within 10% accuracy. The CFD model was employed to evaluate two commonly used clinical kinetic algorithms used to derive perfusion parameters: Fick's principle and the modified Tofts model. Neither kinetic model was able to capture the true transport phenomena it aimed to represent but if the overall contrast concentration after injection remained identical, then successive DCE-CT evaluations could be compared and could indeed reflect differences in regional tissue flow. This study sets the groundwork for future explorations in phantom development and pharmaco-kinetic modelling, as well as the development of novel contrast

  20. The design development and commissioning of two distributed computer based boiler control systems

    International Nuclear Information System (INIS)

    Collier, D.; Johnstone, L.R.; Pringle, S.T.; Walker, R.W.

    1980-01-01

    The CEGB N.E. Region has recently commissioned two major boiler control schemes using distributed computer control systems. Both systems have considerable development potential to allow modifications to meet changing operational requirements. The distributed approach to control was chosen in both instances so as to achieve high control system availability and as a method of easing the commissioning programmes. The experience gained with these two projects has reinforced the view that distributed computer systems show advantages over centralised single computers, especially if the software is designed for the distributed system. (auth)

  1. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    Science.gov (United States)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  2. The development of computer ethics: contributions from business ethics and medical ethics.

    Science.gov (United States)

    Wong, K; Steinke, G

    2000-04-01

    In this essay, we demonstrate that the field of computer ethics shares many core similarities with two other areas of applied ethics. Academicians writing and teaching in the area of computer ethics, along with practitioners, must address ethical issues that are qualitatively similar in nature to those raised in medicine and business. In addition, as academic disciplines, these three fields also share some similar concerns. For example, all face the difficult challenge of maintaining a credible dialogue with diverse constituents such as academicians of various disciplines, professionals, policymakers, and the general public. Given these similarities, the fields of bioethics and business ethics can serve as useful models for the development of computer ethics.

  3. Development of the computer-aided process planning (CAPP) system for polymer injection molds manufacturing

    Directory of Open Access Journals (Sweden)

    J. Tepić

    2011-10-01

    The beginning of production and sales of polymer products largely depends on mold manufacturing, and the costs of mold manufacturing have a significant share in the final price of a product. The best way to improve and rationalize the polymer injection mold production process is to automate mold design and manufacturing process planning. This paper reviews the development of a dedicated process planning system for manufacturing molds for injection molding, which integrates computer-aided design (CAD), computer-aided process planning (CAPP) and computer-aided manufacturing (CAM) technologies.

  4. Development of the computer code system for the analyses of PWR core

    International Nuclear Information System (INIS)

    Tsujimoto, Iwao; Naito, Yoshitaka.

    1992-11-01

    This report is one of the materials for the work titled 'Development of the computer code system for the analyses of PWR core phenomena', which is performed under contracts between Shikoku Electric Power Company and JAERI. In this report, the numerical methods adopted in our computer code system are described, namely: 'The basic course and the summary of the analysing method', 'Numerical method for solving the Boltzmann equation', 'Numerical method for solving the thermo-hydraulic equations' and 'Description of the computer code system'. (author)
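
    For orientation, the Boltzmann equation such a core-analysis code solves is, in its standard steady-state eigenvalue form for the angular neutron flux (textbook notation; the report's own discretization is not reproduced here):

    ```latex
    % Steady-state Boltzmann transport equation for the angular flux (standard form)
    \Omega \cdot \nabla \psi(\mathbf{r},\Omega,E)
      + \Sigma_t(\mathbf{r},E)\,\psi(\mathbf{r},\Omega,E)
      = \int_0^{\infty}\!\!\int_{4\pi} \Sigma_s(\mathbf{r},\,\Omega'\!\to\Omega,\,E'\!\to E)\,
          \psi(\mathbf{r},\Omega',E')\,\mathrm{d}\Omega'\,\mathrm{d}E'
      + \frac{\chi(E)}{4\pi k_{\mathrm{eff}}}
          \int_0^{\infty} \nu\Sigma_f(\mathbf{r},E')\,\phi(\mathbf{r},E')\,\mathrm{d}E'
    ```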

  5. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  6. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  7. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, R.J.; Steines, D.; Szibbo, D.; Kübler, A.; Schneider, M.J.; Haselager, W.F.G.; Nijboer, F.

    2012-01-01

    The steadily growing field of brain–computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  8. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, Rutger; Steines, David; Szibbo, Dyana; Kübler, Andrea; Schneider, Mary-Jane; Haselager, Pim; Nijboer, Femke

    The steadily growing field of brain-computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  9. [Etiological factors for developing carpal tunnel syndrome in people who work with computers].

    Science.gov (United States)

    Lewańska, Magdalena; Wagrowska-Koski, Ewa; Walusiak-Skorupa, Jolanta

    2013-01-01

    Carpal tunnel syndrome (CTS) is the most frequent mononeuropathy of the upper extremities. Since the early 1990s it has been suggested that intensive work with computers can result in CTS development; however, this relationship has not yet been proven. The aim of the study was to evaluate occupational and non-occupational risk factors for developing CTS in a population of computer users. The study group comprised 60 patients (58 women and 2 men; mean age: 53.8 ± 6.35 years) working with computers and suspected of occupational CTS. A survey as well as median and ulnar nerve conduction studies (NCS) were performed in all the subjects. The patients used a computer for 6.43 ± 1.71 h per day. The mean latency between the beginning of employment and the occurrence of first CTS symptoms was 12.09 ± 5.94 years. All patients met the clinical and electrophysiological diagnostic criteria of CTS. In the majority of patients the etiological factors for developing CTS were non-occupational: obesity, hypothyroidism, oophorectomy, past hysterectomy, hormonal replacement therapy or oral contraceptives, recent menopause, diabetes, and tendovaginitis. In 7 computer users no etiological factors were identified. The results of our study show that CTS is usually caused by factors not related to computer use at work.

  10. Development of point Kernel radiation shielding analysis computer program implementing recent nuclear data and graphic user interfaces

    International Nuclear Information System (INIS)

    Kang, S.; Lee, S.; Chung, C.

    2002-01-01

    As the number of nuclear and conventional facilities using radiation or radioisotopes rises, there is an increasing demand for the safe and efficient use of radiation, for radiological work planning, and for shielding analysis. Most Korean industries and research institutes, including Korea Power Engineering Company (KOPEC), have been using foreign computer programs for radiation shielding analysis. Korean nuclear regulations have introduced new laws regarding dose limits and radiological guides as prescribed in ICRP 60; radiation facilities should therefore be designed and operated to comply with these new regulations. In addition, the previous point kernel shielding computer code utilizes antiquated nuclear data (mass attenuation coefficients, buildup factors, etc.) developed in the 1950s and 1960s, whereas these nuclear data have been updated over the past few decades. KOPEC's strategic directive is to become a self-sufficient and independent nuclear design technology company, so KOPEC decided to develop a new radiation shielding computer program that incorporates the latest regulatory requirements and updated nuclear data. This new code was designed by KOPEC in cooperation with the Department of Nuclear Engineering, Hanyang University. VisualShield is designed with a graphical user interface to allow even users unfamiliar with radiation shielding theory to proficiently prepare input data sets and analyze output results.
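
    For context, a point kernel code evaluates, for each source point, an attenuated inverse-square kernel scaled by a buildup factor. A minimal sketch with an assumed attenuation coefficient, a crude buildup model, and a hypothetical flux-to-dose factor (production codes such as the one described interpolate tabulated nuclear data instead):

    ```python
    import numpy as np

    # Point-kernel response from an isotropic point source behind a slab:
    #   D(r) ~ B(mu*t) * S * k * exp(-mu*t) / (4*pi*r^2)
    # mu, t, r, k and the buildup model below are assumed illustration values.
    S = 3.7e10            # source strength [photons/s] (1 Ci, monoenergetic)
    mu = 0.06             # linear attenuation coefficient of shield [1/cm]
    t = 30.0              # shield thickness along the ray [cm]
    r = 200.0             # source-to-detector distance [cm]
    k = 1.0e-6            # hypothetical flux-to-dose conversion factor

    buildup = 1.0 + mu * t                            # crude linear buildup
    flux = S * np.exp(-mu * t) / (4.0 * np.pi * r**2) # uncollided flux
    print("dose rate (arbitrary units):", buildup * k * flux)
    ```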

  11. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr.; Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of relevant objective functions and constraints dictates the possible optimization algorithms. Often, a gradient-based approach is not possible, since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous, and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization these expenses can be a limiting factor, since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization algorithm (MFO) designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic and computation-time-saving optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of the high-fidelity model to that of a computationally cheaper low-fidelity model using space-mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient or non-gradient based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm. We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and
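
    The loop described above can be sketched in miniature. In this toy version the "high-fidelity" and "low-fidelity" models are stand-in functions and the space-mapping oracle is reduced to fitting a simple input shift; all of it is illustrative, not the authors' implementation:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Toy multifidelity loop: fit a crude "space mapping" (an input shift),
    # optimize the cheap model, map the candidate back, and accept it only
    # if the expensive model agrees it improved.
    f_hi = lambda x: (x - 1.3) ** 2 + 0.1 * np.sin(8 * x)   # "expensive" model
    f_lo = lambda x: (x - 1.0) ** 2                          # "cheap" model

    x = 0.0
    for _ in range(5):
        # oracle: shift the low-fidelity input so it reproduces f_hi near x
        p = minimize_scalar(lambda p: (f_lo(x + p) - f_hi(x)) ** 2).x
        y = minimize_scalar(f_lo).x      # optimum in low-fidelity space
        z = y - p                        # map candidate back to high-fidelity space
        if f_hi(z) < f_hi(x):            # accept only if truly better
            x = z
    print("approximate high-fidelity optimum:", x)
    ```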

  12. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    Science.gov (United States)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software, and expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer reviewed publication, was not designed for review or crediting scientific software, although emerging publication strategies such software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  13. Report on nuclear industry quality assurance procedures for safety analysis computer code development and use

    International Nuclear Information System (INIS)

    Sheron, B.W.; Rosztoczy, Z.R.

    1980-08-01

    As a result of a request from Commissioner V. Gilinsky to investigate in detail the causes of an error discovered in a vendor Emergency Core Cooling System (ECCS) computer code in March 1978, the staff undertook an extensive investigation of the vendor quality assurance practices applied to safety analysis computer code development and use. This investigation included inspections of the code development and use practices of the four major Light Water Reactor Nuclear Steam Supply System vendors and a major reload fuel supplier. The conclusion reached by the staff as a result of the investigation is that vendor practices for code development and use are basically sound. A number of areas were identified, however, where improvements to existing vendor procedures should be made. In addition, the investigation also addressed the quality assurance (QA) review and inspection process for computer codes and identified areas for improvement.

  14. Development of algorithm for continuous generation of a computer game in terms of usability and optimization of developed code in computer science

    Directory of Open Access Journals (Sweden)

    Tibor Skala

    2018-03-01

    Full Text Available As both hardware and software have become increasingly available and are constantly developed, they contribute globally to improvements in every field of technology and the arts. Digital tools for the creation and processing of graphical content are highly developed and have been designed to shorten the time required for content creation, which is, in this case, animation. Since contemporary animation has experienced a surge in various visual styles and visualization methods, programming is built into everything that is currently in use. There is no doubt that a variety of algorithms and software are the brain and the moving force behind any idea created for a specific purpose and applicability in society. Art and technology combined make a direct and oriented medium for publishing and marketing in every industry, including those that are not necessarily closely related to the ones that rely heavily on the visual aspect of the work. Additionally, the quality and consistency of an algorithm will depend both on the way the algorithm is designed and on its proper integration into the system that it powers. The development of an endless algorithm and its effective use are demonstrated through the use of the computer game. In order to assess the effect of various parameters, the endless algorithm was tested in the final phase of the computer game's development with a varying number of key input parameters (achieved time, score reached, pace of the game).
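
    As a rough illustration of how such key input parameters might drive continuous generation, the sketch below scales the difficulty of endlessly generated game segments from elapsed time, score, and pace. All names, weights, and the segment structure are hypothetical; the article does not specify the algorithm's internals.

```python
import random

def difficulty(elapsed_s: float, score: int, pace: float) -> float:
    """Map the three key input parameters to a difficulty level in [0, 1].
    The weights are invented for illustration."""
    return min(1.0, 0.002 * elapsed_s + 0.0005 * score + 0.1 * pace)

def next_segment(level: float) -> dict:
    """Endlessly generate the next piece of the level from the difficulty."""
    return {
        "obstacles": 1 + int(5 * level),
        "gap_px": int(120 - 60 * level),                 # tighter gaps later on
        "bonus": random.random() < (0.3 - 0.2 * level),  # fewer bonuses later
    }

# Trimmed endless-generation loop: each tick updates game state and appends
# a freshly generated segment ahead of the player.
state = {"elapsed_s": 0.0, "score": 0, "pace": 1.0}
for tick in range(3):
    state["elapsed_s"] += 10.0
    state["score"] += 150
    print(next_segment(difficulty(**state)))
```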

  15. Development of PHilMech Computer Vision System (CVS) for Quality Analysis of Rice and Corn

    OpenAIRE

    Andres Morales Tuates jr; Aileen R. Ligisan

    2016-01-01

    Manual analysis of rice and corn is done by visually inspecting each grain and classifying it into its respective category. This method is subjective and tedious, leading to errors in analysis. Computer vision could be used to analyze the quality of rice and corn by developing models that correlate shape and color features with the various classifications. The PhilMech low-cost computer vision system (CVS) was developed to analyze the quality of rice and corn. It is composed of an ordinary ...
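
    A minimal sketch of the kind of shape and color features such a model might correlate with grain classes is shown below, using OpenCV and scikit-learn for convenience; the feature set, the noise threshold, and the choice of classifier are assumptions for illustration, not PHilMech's actual pipeline.

```python
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def grain_features(bgr: np.ndarray) -> list:
    """Extract simple shape and color features for each grain in an image."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    feats = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 50:                                # skip specks of noise
            continue
        (_, _), (w, h), _ = cv2.minAreaRect(c)
        aspect = max(w, h) / max(min(w, h), 1e-6)    # grain elongation
        m = np.zeros(gray.shape, np.uint8)
        cv2.drawContours(m, [c], -1, 255, -1)
        mean_bgr = cv2.mean(bgr, mask=m)[:3]         # average grain color
        feats.append([area, aspect, *mean_bgr])
    return feats

# Train on grains from images of known quality categories, then predict:
# X, y = ..., ...                      # stacked features and class labels
# clf = RandomForestClassifier().fit(X, y)
```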

  16. Development of a portable computed tomographic scanner for on-line imaging of industrial piping systems

    International Nuclear Information System (INIS)

    Jaafar Abdullah; Mohd Arif Hamzah; Mohd Soyapi Mohd Yusof; Mohd Fitri Abdul Rahman; Fadil IsmaiI; Rasif Mohd Zain

    2003-01-01

    Computed tomography (CT) technology is increasingly being developed for industrial applications. This paper presents the development of a portable computed tomographic scanner for on-line imaging of industrial piping systems. The theoretical approach, the system hardware, the data acquisition system and the adopted algorithm for image reconstruction are discussed. The scanner has great potential to be used to determine the extent of corrosion under insulation (CUI), to detect blockages, to measure the thickness of deposits/materials built up on pipe walls, and to improve understanding of material flow in pipelines. (Author)
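
    The abstract does not detail the adopted reconstruction algorithm; purely as a generic illustration of reconstructing a pipe cross-section from projection data, the sketch below runs filtered back projection with scikit-image on an invented phantom of a pipe wall with a deposit.

```python
import numpy as np
from skimage.transform import iradon, radon

# Invented cross-section: a pipe wall with material built up on one side.
image = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
r = np.hypot(xx - 64, yy - 64)
image[(r > 40) & (r < 50)] = 1.0                    # pipe wall
image[(r > 30) & (r <= 40) & (xx > 64)] = 0.5       # deposit layer

# Forward projection (what the scanner measures), then reconstruction.
angles = np.linspace(0.0, 180.0, 180, endpoint=False)
sinogram = radon(image, theta=angles)
recon = iradon(sinogram, theta=angles, filter_name="ramp")
print("mean reconstruction error:", np.abs(recon - image).mean())
```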

  17. Moving To The Cloud Developing Apps in the New World of Cloud Computing

    CERN Document Server

    Sitaram, Dinkar

    2011-01-01

    Moving to the Cloud provides an in-depth introduction to cloud computing models, cloud platforms, application development paradigms, concepts and technologies. The authors particularly examine cloud platforms that are in use today. They also describe programming APIs and compare the technologies that underlie them. The basic foundations needed for developing both client-side and cloud-side applications covering compute/storage scaling, data parallelism, virtualization, MapReduce, RIA, SaaS and Mashups are covered. Approaches to address key challenges of a cloud infrastructure, such as scalability

  18. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Energy Technology Data Exchange (ETDEWEB)

    Brower, Richard [Boston U.; Christ, Norman [Columbia U.; DeTar, Carleton [Utah U.; Edwards, Robert [Jefferson Lab; Mackenzie, Paul [Fermilab

    2017-10-30

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community, with significant collaborators abroad, are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  19. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E., E-mail: amsantos@cdtn.b [Center for Development of Nuclear Technology (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

    Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds that are used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model that reproduces the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed on the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed iodine-125 source whose isodose curves had been obtained experimentally with dosimetric films in a previous work. For the film modeling, the compositions and densities of two types of dosimetric film were used: the Agfa Personal Monitoring 2/10 photographic film, manufactured by Agfa-Gevaert, and the EBT radiochromic film model, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The simulation results were in good agreement with the experimental results of the previous work. This indicates that the computational model can be used in future studies for other seed models. (author)
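
    As a generic illustration of isodose curves over a film plane (not the MCNP5 model itself), the sketch below contours a toy dose grid around a short line source; the source length, grid, and normalization point are invented for the example.

```python
import matplotlib.pyplot as plt
import numpy as np

# Toy dose grid on the film plane around a short line source (invented values).
x = np.linspace(-20, 20, 201)                        # mm
X, Y = np.meshgrid(x, x)
dose = np.zeros_like(X)
for sx in np.linspace(-2.25, 2.25, 9):               # discretized line source
    dose += 1.0 / np.maximum((X - sx) ** 2 + Y ** 2, 0.25)

# Isodose curves as percentages of the dose at a 10 mm reference point.
ref = dose[100, 150]                                 # (y = 0, x = +10 mm)
cs = plt.contour(X, Y, 100.0 * dose / ref, levels=[25, 50, 100, 200, 400])
plt.clabel(cs, fmt="%d%%")
plt.gca().set_aspect("equal")
plt.xlabel("x (mm)"); plt.ylabel("y (mm)")
plt.show()
```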

  20. Development of a Computer-Aided Diagnosis System for Early Detection of Masses Using Retrospectively Detected Cancers on Prior Mammograms

    National Research Council Canada - National Science Library

    Wei, Jun

    2006-01-01

    The goal of this project is to develop a computer-aided diagnosis (CAD) system for mass detection using advanced computer vision techniques that will be trained with retrospectively detected cancers on prior mammograms...

  1. Development of a Computer-Aided Diagnosis System for Early Detection of Masses Using Retrospectively Detected Cancers on Prior Mammograms

    National Research Council Canada - National Science Library

    Wei, Jun

    2007-01-01

    The goal of this project is to develop a computer-aided diagnosis (CAD) system for mass detection using advanced computer vision techniques that will be trained with retrospectively detected cancers on prior mammograms...

  2. Computation and Database Development for Flue Gas Treatment on Electron Beam Machine (EBM)

    International Nuclear Information System (INIS)

    Tono Wibowo; Slamet Santosa

    2007-01-01

    A computation and database development for the parameter calculations of SO2 and NOx flue-gas treatment has been carried out. This computation and database make it easier for researchers to calculate flue gas parameters for various specifications and to repeat such calculations in a way that saves time and apparatus. The analysis and calculation of flue gas treatment using an EBM is currently performed in Microsoft Excel and with a calculator; with a dedicated computation and database, it is expected that further parameter calculations of flue gas treatment can be developed with a user-friendly character. The computation for the parameter calculations of flue gas treatment was developed in Borland Delphi version 7.0 with active arithmetic and graphic components, and the database function uses dBase and Paradox through the Borland Database Engine (BDE). The developed calculations include the removal efficiency, the dose and time of irradiation, and the power of the EBM. For further calculations and larger applications, database functions have been prepared for SQL-Links. In operational tests, the program ran as expected. (author)
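
    The abstract lists removal efficiency, irradiation dose and time, and EBM power among the developed calculations. A hedged sketch of how such quantities can relate is given below; the exponential dose-response model and all coefficients are assumptions for illustration, not the program's actual equations.

```python
import math

def absorbed_dose_kGy(beam_power_kW: float, gas_flow_kg_s: float,
                      utilization: float = 0.7) -> float:
    """Dose = usable beam energy per unit mass of flue gas (1 kGy = 1 kJ/kg)."""
    return utilization * beam_power_kW / gas_flow_kg_s

def so2_removal_efficiency(dose_kGy: float, k: float = 0.35) -> float:
    """Toy saturating dose-response model: eta = 1 - exp(-k * D)."""
    return 1.0 - math.exp(-k * dose_kGy)

D = absorbed_dose_kGy(beam_power_kW=30.0, gas_flow_kg_s=2.5)
print(f"dose = {D:.2f} kGy, SO2 removal = {100 * so2_removal_efficiency(D):.1f}%")
```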

  3. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

    The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  4. Issues on the Development and Application of Computer Tools to Support Product Structuring and Configuring

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Riitahuhta, A.

    2001-01-01

    The aim of this article is to take stock of the results and challenges in the efforts to develop computer tools to support product structuring and configuring in product development projects. The balance will be made in two dimensions, a design science dimension and an industrial dimension. The design science dimension focuses on our understanding of product structure and product configuration. The industrial dimension presents findings from a number of projects regarding the implementation of computer tools to support engineering designers in industrial practice. The article concludes that there are large positive effects to be gained for industrial companies by consciously implementing computer tools based on the results of design science. The positive effects will be measured by e.g. predictable product quality, reduced lead time, and reuse of design solutions.

  5. Development and functional demonstration of a wireless intraoral inductive tongue computer interface for severely disabled persons.

    Science.gov (United States)

    N S Andreasen Struijk, Lotte; Lontis, Eugen R; Gaihede, Michael; Caltenco, Hector A; Lund, Morten Enemark; Schioeler, Henrik; Bentsen, Bo

    2017-08-01

    Individuals with tetraplegia depend on alternative interfaces in order to control computers and other electronic equipment. Current interfaces are often limited in the number of available control commands, and may compromise the social identity of an individual due to their undesirable appearance. The purpose of this study was to implement an alternative computer interface which was fully embedded into the oral cavity and which provided multiple control commands. The development of a wireless, intraoral, inductive tongue computer interface is described. The interface encompassed a 10-key keypad area and a mouse pad area, and the system was embedded wirelessly into the oral cavity of the user. The functionality of the system was demonstrated in two tetraplegic individuals and two able-bodied individuals. Results: The system was invisible during use and allowed the user to type on a computer using either the keypad area or the mouse pad. The best typing time was 1.8 s per correctly typed character when typing repetitively with the keypad area and 1.4 s per correctly typed character with the mouse pad area. The results suggest that this inductive tongue computer interface provides an esthetically acceptable and functionally efficient environmental control for a severely disabled user. Implications for rehabilitation: new design, implementation and detection methods for intraoral assistive devices; demonstration of wireless powering and encapsulation techniques suitable for intraoral embedment of assistive devices; and demonstration of the functionality of a rechargeable, fully embedded, intraoral tongue-controlled computer input device.

  6. Development of a wireless computer vision instrument to detect biotic stress in wheat.

    Science.gov (United States)

    Casanova, Joaquin J; O'Shaughnessy, Susan A; Evett, Steven R; Rush, Charles M

    2014-09-23

    Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications.
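
    A minimal sketch of the EM-based segmentation step the paper describes (splitting pixels into soil and vegetation classes, then summarizing vegetation hue and cover) is given below, with scikit-learn's Gaussian mixture EM standing in for the authors' implementation; the file name and the two-component assumption are illustrative.

```python
import cv2
import numpy as np
from sklearn.mixture import GaussianMixture

bgr = cv2.imread("wheat_plot.jpg")                   # hypothetical field image
hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
pixels = hsv.reshape(-1, 3).astype(np.float64)

# EM fits a two-component Gaussian mixture: one mode for soil, one for canopy.
gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(pixels)

# Treat the greener (higher-hue) component as vegetation.
veg = int(np.argmax(gmm.means_[:, 0]))
veg_pixels = pixels[labels == veg]
cover = veg_pixels.shape[0] / pixels.shape[0]        # fractional canopy cover
mean_hue = veg_pixels[:, 0].mean() * 2.0             # OpenCV hue is 0-179 (deg/2)
print(f"cover = {cover:.2f}, vegetation hue = {mean_hue:.1f} deg")
```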

  7. A Multidisciplinary Model for Development of Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Park, Ok-choon; Seidel, Robert J.

    1989-01-01

    Proposes a schematic multidisciplinary model to help developers of intelligent computer-assisted instruction (ICAI) identify the types of required expertise and integrate them into a system. Highlights include domain types and expertise; knowledge acquisition; task analysis; knowledge representation; student modeling; diagnosis of learning needs;…

  8. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Full Text Available Abstract Background There are several ways to conduct a job task analysis in medical work environments, including pencil-and-paper observations, interviews and questionnaires. However, these methods entail bias problems such as high inter-individual deviations and risks of misjudgement. Computer-based observation helps to reduce these problems. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data for this instrument will be demonstrated. Methods This instrument was developed in consecutive steps. First, lists comprising the tasks performed by physicians in different care settings were compiled and classified. Afterwards, the content validity of the task lists was verified. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated: two trained observers simultaneously recorded the tasks of the same physician. Results Content validity of the task lists was confirmed by observations and by experienced specialists in each medical area. The development process of the job task analysis instrument was completed successfully. Simultaneous records showed adequate inter-rater reliability. Conclusion Initial results of this analysis supported the validity and reliability of the developed method for assessing physicians' working routines as well as organizational context factors. Based on results obtained with this method, possible improvements to health professionals' work organisation can be identified.

  9. Teaching Web Application Development: A Case Study in a Computer Science Course

    Science.gov (United States)

    Del Fabro, Marcos Didonet; de Alimeda, Eduardo Cunha; Sluzarski, Fabiano

    2012-01-01

    Teaching web development in Computer Science undergraduate courses is a difficult task. Often, there is a gap between the students' experiences and the reality in the industry. As a consequence, the students are not always well-prepared once they get the degree. This gap is due to several reasons, such as the complexity of the assignments, the…

  10. Development of Letshoot Applications as an Instructional Media of Computer Network Troubleshooting

    Directory of Open Access Journals (Sweden)

    Fadila Aini Atista

    2017-12-01

    Full Text Available Based on the observations made in Class XII TKJ at SMK Negeri 1 Banyudono, it is known that students' interest in computer network troubleshooting material is low, because the learning materials are delivered through presentation media in one direction, from teacher to student. This makes it harder for students to understand the material presented by the teacher. Based on this background, we propose to build an Android-based learning medium that can help students learn independently; this application is called Letshoot. Letshoot was made as an instructional medium for computer network troubleshooting that can help students learn independently without being hindered by place, time or teacher attendance. This research aims to develop the Letshoot application and measure its feasibility level. The app will help students learn computer network troubleshooting. It uses Borg and Gall's research and development model, with the application developed in five stages: (1) data collection, (2) planning, (3) development of the preliminary form of the product, (4) preliminary field testing, and (5) main product revision. The feasibility of the Letshoot application, measured using a Likert scale, was rated 90.64% by the media expert, 82% by the material expert, and 82.11% by users, all of which fall into the 'very feasible' category for use as a learning medium. From these results it can be concluded that the Letshoot application is very feasible to use as a learning medium for computer network troubleshooting in vocational high school.

  11. Who Needs What: Recommendations for Designing Effective Online Professional Development for Computer Science Teachers

    Science.gov (United States)

    Qian, Yizhou; Hambrusch, Susanne; Yadav, Aman; Gretter, Sarah

    2018-01-01

    The new Advanced Placement (AP) Computer Science (CS) Principles course increases the need for quality CS teachers and thus the need for professional development (PD). This article presents the results of a 2-year study investigating how teachers teaching the AP CS Principles course for the first time used online PD material. Our results showed…

  12. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish/Subscribe (P/S) architecture [13], [14] to facilitate efficient

  13. Development of three-dimensional computed tomography system using TNRF2 of JRR-3M

    Energy Technology Data Exchange (ETDEWEB)

    Murata, Yutaka; Mochiki, Koh-ichi [Musashi Inst. of Tech., Tokyo (Japan); Matsubayashi, Masahito

    1998-01-01

    A three-dimensional filtering engine, a convolution engine, and a back projection engine were developed for real-time signal processing in three-dimensional computed tomography. The performance of the system was measured, and a throughput of 0.5 seconds per cross-sectional data set was attained. (author)

  14. The Development of an Audio Computer-Based Classroom Test of ESL Listening Skills.

    Science.gov (United States)

    Balizet, Sha; Treder, Dave; Parshall, Cynthia G.

    There are very few examples of audio-based computerized tests, but for many disciplines, such as foreign language and music, there appear to be many benefits to this type of testing. The purpose of the present study was to develop and compare computer-delivered and audiocassette/paper-and-pencil versions of a listening test. The test was a measure…

  15. The Development and Evaluation of a Computer-Simulated Science Inquiry Environment Using Gamified Elements

    Science.gov (United States)

    Tsai, Fu-Hsing

    2018-01-01

    This study developed a computer-simulated science inquiry environment, called the Science Detective Squad, to engage students in investigating an electricity problem that may happen in daily life. The environment combined the simulation of scientific instruments and a virtual environment, including gamified elements, such as points and a story for…

  16. Tangential scanning of hardwood logs: developing an industrial computer tomography scanner

    Science.gov (United States)

    Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson

    1999-01-01

    It is generally believed that noninvasive scanning of hardwood logs such as computer tomography (CT) scanning prior to initial breakdown will greatly improve the processing of logs into lumber. This belief, however, has not translated into rapid development and widespread installation of industrial CT scanners for log processing. The roadblock has been more operational...

  17. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

    Much research has already been performed over the last few decades to obtain information on the degree of fuel rod failure from the primary coolant activities of operating PWRs. The computer codes that are currently in use for domestic nuclear power plants, such as the CADE code and the ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating failed fuel rods. In addition, with the CADE code it is difficult to predict the degree of fuel rod failure during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult for end-users to use. In particular, the rapid progress made recently in computer hardware and software systems demands that such computer programs be more versatile and user-friendly. While the MS Windows system, centered on the graphical user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number and the degree of fuel rod failures using the radioactivity data obtained from the primary coolant activity of PWRs. Another objective is to combine this computer code with the on-line monitoring systems of the primary coolant radioactivity at the operating nuclear power plants Kori 3 and 4, and to enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  18. The development of an intelligent interface to a computational fluid dynamics flow-solver code

    Science.gov (United States)

    Williams, Anthony D.

    1988-01-01

    Researchers at NASA Lewis are currently developing an 'intelligent' interface to aid in the development and use of large, computational fluid dynamics flow-solver codes for studying the internal fluid behavior of aerospace propulsion systems. This paper discusses the requirements, design, and implementation of an intelligent interface to Proteus, a general purpose, three-dimensional, Navier-Stokes flow solver. The interface is called PROTAIS to denote its introduction of artificial intelligence (AI) concepts to the Proteus code.

  19. Development of a computer model to predict aortic rupture due to impact loading.

    Science.gov (United States)

    Shah, C S; Yang, K H; Hardy, W; Wang, H K; King, A I

    2001-11-01

    Aortic injuries during blunt thoracic impacts can lead to life-threatening hemorrhagic shock and potential exsanguination. Experimental approaches designed to study the mechanism of aortic rupture, such as the testing of cadavers, are not only expensive and time-consuming but have also been relatively unsuccessful. The objective of this study was to develop a computer model and to use it to predict the modes of loading that are most likely to produce aortic ruptures. Previously, a 3D finite element model of the human thorax was developed and validated against data obtained from lateral pendulum tests. The model included a detailed description of the heart, lungs, rib cage, sternum, spine, diaphragm, major blood vessels and intercostal muscles. However, the aorta was modeled as a hollow tube using shell elements with no fluid within, and its material properties were assumed to be linear and isotropic. In this study, fluid elements representing blood have been incorporated into the model in order to simulate pressure changes inside the aorta due to impact. The current model was globally validated against experimental data published in the literature for both frontal and lateral pendulum impact tests. Simulations of the validated model for thoracic impacts from a number of directions indicate that the ligamentum arteriosum, subclavian artery, parietal pleura and pressure changes within the aorta are factors that could influence aortic rupture. The model suggests that a right-sided impact to the chest is potentially more hazardous with respect to aortic rupture than any other impact direction simulated in this study. The aortic isthmus was the most likely site of aortic rupture regardless of impact direction. The reader is cautioned that this model could only be validated on a global scale; validation of the kinematics and dynamics of the aorta at the local level could not be done due to a lack of experimental data. It is hoped that this model will be used to design

  20. Development of a computational system for management of risks in radiosterilization processes of biological tissues

    International Nuclear Information System (INIS)

    Montoya, Cynara Viterbo

    2009-01-01

    Risk management can be understood as systematic management which aims to identify, record and control the risks of a process. Applying risk management is a complex activity, due to the variety of professionals involved. Executing risk management requires, above all, the experience, discernment and judgment of a multidisciplinary team, guided by quality tools, so as to standardize the process of investigating the causes and effects of risks and to pursue the desired objective, i.e. the reduction and control of the risk, dynamically. This work aims to develop a computational risk-management system (software) which makes it feasible to diagnose the risks of the processes of radiosterilization of biological tissues. The methodology adopted was action-research, in which the researcher plays an active role in establishing the problems found, in the follow-up, and in the evaluation of the actions taken in response to the problems. The scenario of this action-research was the Laboratory of Biological Tissues (LTB) in the Radiation Technology Center, IPEN/CNEN-SP, Sao Paulo/Brazil. The software was developed in PHP and Flash with MySQL and runs on a hosted server; it is available on the Internet (www.vcrisk.com.br), which the user can access from anywhere by means of the login and access password previously sent by e-mail to the team responsible for the tissue to be analyzed. The software presents friendly navigation whereby the user is directed step by step through the process of investigating the risk, up to the means of reducing it. The software 'makes' the user comply with the deadline and report the effectiveness of the actions taken to reduce the risk. Applying this system provided the organization (LTB/CTR/IPEN) with dynamic, effective communication between the members of the multidisciplinary team: (a) in decision-making; (b) in lessons learned; (c) in knowing the new risk

  1. Development of 3-D Radiosurgery Planning System Using IBM Personal Computer

    International Nuclear Information System (INIS)

    Suh, Tae Suk; Park, Charn Il; Ha, Sung Whan; Kang, Wee Saing; Suh, Doug Young; Park, Sung Hun

    1993-01-01

    Recently, stereotactic radiosurgery planning has come to require 3-D image information and dose distributions. A project to develop LINAC-based stereotactic radiosurgery has been under way since April 1991. The purpose of this research is to develop a 3-D radiosurgery planning system using a personal computer. The procedure of this research is based on two steps. The first step was to develop a 3-D localization system, which inputs the patient's image information, the coordinate transformation, the position and shape of the target, and the patient contour into the computer system using CT images and a stereotactic frame. The second step was to develop a 3-D dose planning system, which computes the dose distribution on the image plane, displays both the isodose distribution and the patient image simultaneously on a high-resolution monitor, and provides a menu-driven planning environment. This prototype radiosurgery planning system was recently applied to several clinical cases. It was shown that our planning system is fast, accurate and efficient, while making it possible to handle various kinds of image modalities such as angiography, CT and MRI. It also makes it possible to develop a general 3-D planning system using beam's eye view or CT simulation in radiation therapy in the future.
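
    The 3-D localization step maps coordinates measured in CT images into the stereotactic frame's coordinate system. A hedged numpy sketch of fitting such a rigid transformation from fiducial markers is shown below; the marker positions are invented, and the abstract does not describe the actual system's transformation chain.

```python
import numpy as np

def fit_rigid_transform(ct_pts: np.ndarray, frame_pts: np.ndarray):
    """Least-squares rigid transform (R, t) mapping CT fiducial coordinates
    onto stereotactic frame coordinates (Kabsch algorithm)."""
    c_ct, c_fr = ct_pts.mean(axis=0), frame_pts.mean(axis=0)
    H = (ct_pts - c_ct).T @ (frame_pts - c_fr)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, c_fr - R @ c_ct

# Invented fiducial positions seen in the CT image and known in the frame (mm).
ct = np.array([[10.0, 12.0, 30.0], [80.0, 15.0, 30.0],
               [45.0, 70.0, 30.0], [45.0, 40.0, 60.0]])
rng = np.random.default_rng(0)
frame = ct + np.array([5.0, -3.0, 12.0]) + rng.normal(0.0, 0.1, ct.shape)

R, t = fit_rigid_transform(ct, frame)
target_ct = np.array([42.0, 33.0, 45.0])   # target localized on the CT image
print("target in frame coordinates:", R @ target_ct + t)
```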

  2. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.
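
    As a concrete (if simplified) taste of cognitive diagnosis, the sketch below evaluates the probability of a correct item response under the DINA model, one common cognitive diagnostic model, from an examinee's attribute-mastery vector; the article surveys the field more broadly, and the Q-matrix and slip/guess values here are invented.

```python
import numpy as np

def dina_p_correct(alpha, q, slip, guess):
    """P(correct) per item under DINA: an examinee mastering every required
    attribute succeeds unless they 'slip'; others succeed only by 'guessing'."""
    eta = np.all(alpha >= q, axis=1).astype(float)   # has all required skills?
    return (1.0 - slip) ** eta * guess ** (1.0 - eta)

q = np.array([[1, 0, 0],        # item 1 requires attribute 1
              [1, 1, 0],        # item 2 requires attributes 1 and 2
              [0, 0, 1],
              [1, 1, 1]])
slip = np.array([0.10, 0.15, 0.10, 0.20])
guess = np.array([0.20, 0.10, 0.25, 0.05])
alpha = np.array([1, 1, 0])     # examinee masters attributes 1 and 2 only

print(dina_p_correct(alpha, q, slip, guess))   # high for items 1-2, low for 3-4
```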

  3. A Development Architecture for Serious Games Using BCI (Brain Computer Interface Sensors

    Directory of Open Access Journals (Sweden)

    Kyhyun Um

    2012-11-01

    Full Text Available Games that use brainwaves via brain–computer interface (BCI) devices to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  4. A development architecture for serious games using BCI (brain computer interface) sensors.

    Science.gov (United States)

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-11-12

    Games that use brainwaves via brain-computer interface (BCI) devices to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  5. Developing Tools for Research on School Leadership Development: An Illustrative Case of a Computer Simulation

    Science.gov (United States)

    Showanasai, Parinya; Lu, Jiafang; Hallinger, Philip

    2013-01-01

    Purpose: The extant literature on school leadership development is dominated by conceptual analysis, descriptive studies of current practice, critiques of current practice, and prescriptions for better ways to approach practice. Relatively few studies have examined impact of leadership development using experimental methods, among which even fewer…

  6. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD), the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution that integrates standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of formatting files for different software, managing the actual computation, keeping track of the activities and graphically rendering the structural outcomes. To showcase the potential of this approach, the performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
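
    A toy sketch of the kind of integration layer described (accepting a ligand file, dispatching a docking job, and keeping track of the run) is given below; the program name 'dock_prog', its flags, and the file layout are placeholders invented for illustration, not the system's actual components.

```python
import json
import subprocess
import time
from pathlib import Path

def run_docking(ligand: Path, receptor: Path, workdir: Path) -> dict:
    """Dispatch one docking job and record its provenance as JSON."""
    workdir.mkdir(parents=True, exist_ok=True)
    log = {"ligand": str(ligand), "receptor": str(receptor),
           "started": time.strftime("%Y-%m-%d %H:%M:%S")}
    result = subprocess.run(
        ["dock_prog", "--ligand", str(ligand), "--receptor", str(receptor),
         "--out", str(workdir / "poses.sdf")],        # hypothetical CLI
        capture_output=True, text=True)
    log["returncode"] = result.returncode
    (workdir / "run.json").write_text(json.dumps(log, indent=2))
    return log

# Batch all ligands against one target, tracking each run:
# for lig in Path("ligands").glob("*.sdf"):
#     run_docking(lig, Path("hiv1_protease.pdb"), Path("runs") / lig.stem)
```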

  7. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

    During the past few years, an increasing interest in large-scale computation has been developing. Several initiatives were taken to evaluate and exploit the potential of 'supercomputers' like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., the Lax report appeared first, in 1982, and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. In Europe, too, several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method: between theory and experiment a third methodology, 'computational science', has become or is becoming operational.

  8. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    Full Text Available Abstract. The design of computer-based learning environments has undergone a paradigm shift, moving students away from instruction that was considered to promote technical rationality grounded in objectivism, towards the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  9. Development of a fast running accident analysis computer program for use in a simulator

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    1985-01-01

    This paper describes how a nuclear reactor safety computer program can be modified and improved with the aim of producing a very fast running tool to be used as a physical model in a plant simulator, without penalizing the accuracy of the results. It also discusses some ideas on how the physical theoretical model can be combined with a driving statistical tool to build up the entire software package to be implemented in the simulator for risk and reliability analysis. The approach to the problem, although applied to a specific computer program, can be considered quite general if an already existing and well-tested code is used for the purpose. The computer program considered is ALMOD, originally developed for the analysis of the thermohydraulic and neutronic behaviour of the reactor core, primary circuit and steam generator during operational and special transients. (author)

  10. Guidelines for design and development of computer/microprocessor based systems in research and power reactors

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Chandra, A.K.

    1993-01-01

    Computer systems are being used in Indian research reactors and nuclear power plants in the areas of data acquisition, process monitoring and control, alarm annunciation and safety. The design and evaluation of these systems requires a special approach, particularly due to the unique nature of the software, which is an essential constituent of these systems. It was decided to evolve guidelines for the design and review of computer/microprocessor based systems for use in nuclear power plants in India. The present document addresses the various issues and presents guidelines which are as comprehensive as possible, covering all issues relating to the design and development of computer based systems. These guidelines are expected to be useful to the specifiers, designers and reviewers of such systems. (author). 6 refs., 1 fig

  11. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body.

  12. Development of a Wireless Computer Vision Instrument to Detect Biotic Stress in Wheat

    Directory of Open Access Journals (Sweden)

    Joaquin J. Casanova

    2014-09-01

    Full Text Available Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications.

  13. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between the characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters of a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.
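
    For reference, the two small-world statistics the model targets, the characteristic path length and the clustering coefficient, can be computed for a classic Watts-Strogatz graph as below (networkx; the parameters are invented). The paper's degree-distribution extension is not reproduced here.

```python
import networkx as nx

# Classic Watts-Strogatz ring: n nodes, k nearest neighbors, rewiring prob. p.
G = nx.watts_strogatz_graph(n=1000, k=10, p=0.05, seed=1)

L = nx.average_shortest_path_length(G)   # characteristic path length
C = nx.average_clustering(G)             # clustering coefficient
print(f"L = {L:.2f}, C = {C:.3f}")       # small L with high C: small-world

# A degree-distribution comparison against a real network would start from:
degrees = sorted(d for _, d in G.degree())
```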

  14. Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

    International Nuclear Information System (INIS)

    Kim, K. R.; Lee, H. S.; Hwang, I. S.

    2010-12-01

    The objective of this project is to develop multi-dimensional computational models in order to improve the operation of the uranium electrorefiners currently used in pyroprocessing technology. These 2-D (US) and 3-D (ROK) mathematical models are based on the fundamental physical and chemical properties of the electrorefiner processes. The models, validated against compiled and evaluated experimental data, could provide better information for developing advanced electrorefiners for uranium recovery. The research results in this period are as follows: - Successfully assessed a common computational platform for the modeling work and identified spatial characterization requirements. - Successfully developed a 3-D electro-fluid dynamic electrorefiner model. - Successfully validated and benchmarked the two multi-dimensional models against compiled experimental data sets.

  15. Benchmarking the CRBLASTER Computational Framework on the 350-MHz 49-core Maestro Development Board

    Science.gov (United States)

    Mighell, K. J.

    2012-09-01

    I describe the performance of the CRBLASTER computational framework on a 350-MHz 49-core Maestro Development Board (MDB). The 49-core Interim Test Chip (ITC) was developed by the U.S. Government and is based on the intellectual property of the 64-core TILE64 processor of the Tilera Corporation. The Maestro processor is intended for use in the high-radiation environments found in space; the ITC was fabricated using IBM 90-nm CMOS 9SF technology and Radiation-Hardening-by-Design (RHBD) rules. CRBLASTER is a parallel-processing cosmic-ray rejection application based on a simple computational framework that uses the high-performance computing industry standard Message Passing Interface (MPI) library. CRBLASTER was designed to let research scientists easily port image-analysis programs based on embarrassingly parallel algorithms to a parallel-processing environment such as a multi-node Beowulf cluster or multi-core processors using MPI. I describe my experience of porting CRBLASTER to the 64-core TILE64 processor, the Maestro simulator, and finally the 49-core Maestro processor itself. Performance comparisons using the ITC are presented between emulating all floating-point operations in software and doing all floating-point operations with hardware assist from an IEEE-754 compliant Aurora FPU (floating-point unit) attached to each of the 49 cores. Benchmarking of the CRBLASTER computational framework using the memory-intensive L.A.COSMIC cosmic-ray rejection algorithm and a computation-intensive Poisson noise generator reveals subtleties of the Maestro hardware design. Lastly, I describe the importance of using real scientific applications during the testing phase of next-generation computer hardware; complex real-world scientific applications can stress hardware in novel ways that may not necessarily be revealed while executing simple applications or unit tests.
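
    The embarrassingly parallel pattern the framework is built on (split an image into pieces, process each piece independently via MPI, reassemble) can be sketched with mpi4py as below; the strip decomposition and the simple outlier clip standing in for L.A.COSMIC are illustrative, not CRBLASTER's actual code.

```python
# Run with, e.g.: mpiexec -n 4 python tiles.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    rng = np.random.default_rng(0)
    image = rng.poisson(100.0, (1024, 1024)).astype(float)
    strips = np.array_split(image, size, axis=0)   # one strip per rank
else:
    strips = None

strip = comm.scatter(strips, root=0)               # distribute the strips

# Each rank cleans its own strip independently (a crude outlier clip
# stands in for the L.A.COSMIC cosmic-ray rejection algorithm).
med = np.median(strip)
cleaned = np.where(strip > med + 5.0 * strip.std(), med, strip)

gathered = comm.gather(cleaned, root=0)            # reassemble on rank 0
if rank == 0:
    result = np.vstack(gathered)
    print("cleaned image:", result.shape)
```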

  16. An analysis of computer-related patient safety incidents to inform the development of a classification.

    Science.gov (United States)

    Magrabi, Farah; Ong, Mei-Sing; Runciman, William; Coiera, Enrico

    2010-01-01

    To analyze patient safety incidents associated with computer use in order to develop the basis for a classification of problems reported by health professionals. Incidents submitted to a voluntary incident reporting database across one Australian state were retrieved and a subset (25%) was analyzed to identify 'natural categories' for classification. Two coders independently classified the remaining incidents into one or more categories. Free-text descriptions were analyzed to identify contributing factors. Where available, medical specialty, time of day and consequences were examined. Descriptive statistics; inter-rater reliability. A search of 42,616 incidents from 2003 to 2005 yielded 123 computer-related incidents. After removing duplicate and unrelated incidents, 99 incidents describing 117 problems remained. A classification with 32 types of computer use problems was developed. Problems were grouped into information input (31%), transfer (20%), output (20%) and general technical (24%). Overall, 55% of problems were machine-related and 45% were attributed to human-computer interaction. Delays in initiating and completing clinical tasks were a major consequence of machine-related problems (70%), whereas rework was a major consequence of human-computer interaction problems (78%). While 38% (n=26) of the incidents were reported to have a noticeable consequence but no harm, 34% (n=23) had no noticeable consequence. Only 0.2% of all incidents reported were computer-related. Further work is required to expand our classification using incident reports and other sources of information about healthcare IT problems. Evidence-based user interface design must focus on the safe entry and retrieval of clinical information and support users in detecting and correcting errors and malfunctions.

  17. Rapid and collaborative development of socially relevant computing solutions for developing communities

    CSIR Research Space (South Africa)

    Mtsweni, J

    2014-11-01

    Full Text Available Information and communication technologies (ICTs) have an immense potential as a tool for development. It is now common knowledge that advances in the use of technology can improve economic opportunities for the poor, improve service delivery...

  18. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been remarkable interest in making computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer science courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) the type of professional development organization and source of funding, (2) the professional development structure and participants, (3) the goal of professional development and type of evaluation used, (4) the specific computer science concepts and training tools used, and (5) their effectiveness in improving teacher practice and student learning.

  19. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    Science.gov (United States)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.
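
    The paper's generic simulation cell is a C++11 template class; purely to convey the idea in a compact form, the sketch below mimics the concept in Python: one cell type aggregates the variables of several submodels and declares which of them a domain-decomposed run must transfer between processes. All names are invented and nothing here reflects gensimcell's real API.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    """One grid cell shared by two independently developed submodels."""
    # hydrodynamics submodel variables
    density: float = 1.0
    momentum: tuple = (0.0, 0.0, 0.0)
    # electromagnetic submodel variables
    magnetic: tuple = (0.0, 0.0, 1.0)
    # per-process scratch data; never sent over MPI
    scratch: list = field(default_factory=list)

    # Each submodel states which of its variables neighboring processes need;
    # the union is what an MPI layer would pack into halo-exchange messages.
    MPI_TRANSFER = ("density", "momentum", "magnetic")

def pack_halo(cell: Cell) -> dict:
    """Serialize only the declared transfer variables of a boundary cell."""
    return {name: getattr(cell, name) for name in Cell.MPI_TRANSFER}

print(pack_halo(Cell()))
```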

  20. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • The software development methodology adopted for the computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for the instrumentation and control of PFBR. The RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for the RTC systems, starting from the requirements capture phase through to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software is carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.
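
    As a small illustration of two of the named test-design techniques, equivalence class partitioning and boundary value analysis, the sketch below derives test inputs for a hypothetical valid operating band of a monitored variable; the range and the expected trip behavior are invented and unrelated to PFBR's actual parameters.

```python
def boundary_values(lo: float, hi: float, eps: float = 0.001) -> list:
    """Boundary value analysis: probe just inside and outside each range edge."""
    return [lo - eps, lo, lo + eps, hi - eps, hi, hi + eps]

def equivalence_classes(lo: float, hi: float) -> dict:
    """Equivalence class partitioning: one representative input per class."""
    return {"below_range": lo - 10.0,
            "in_range": (lo + hi) / 2.0,
            "above_range": hi + 10.0}

LO, HI = 300.0, 550.0          # hypothetical valid band for a process variable
cases = equivalence_classes(LO, HI)
cases.update({f"boundary_{i}": v for i, v in enumerate(boundary_values(LO, HI))})
for name, value in cases.items():
    expect_trip = not (LO <= value <= HI)
    print(f"{name:>12}: input={value:8.3f} expect_trip={expect_trip}")
```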

  1. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium.
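
    The home-made voxel-to-MCNPX converter mentioned above is not reproduced here, but its core step (flattening a 3D voxel grid into the repeated-entry list used by an MCNP-style lattice FILL card, where a token 'v nR' repeats value v a further n times) can be sketched as follows. The array contents and universe numbers are invented, and a real input file additionally needs cell, surface and material cards.

```python
import numpy as np

def rle_fill_entries(voxels):
    """Run-length encode a voxel/universe map into MCNP-style FILL
    entries, where 'v nR' repeats the previous value n more times."""
    flat = np.asarray(voxels).ravel(order="F")  # x varies fastest
    tokens, run_val, run_len = [], flat[0], 1
    for v in flat[1:]:
        if v == run_val:
            run_len += 1
            continue
        tokens.append(str(run_val) if run_len == 1
                      else f"{run_val} {run_len - 1}R")
        run_val, run_len = v, 1
    tokens.append(str(run_val) if run_len == 1 else f"{run_val} {run_len - 1}R")
    return " ".join(tokens)

# Toy universe map: 1 = soft tissue, 2 = bone (illustrative numbers)
grid = np.array([1, 1, 1, 2, 2, 1])
print(rle_fill_entries(grid))  # -> "1 2R 2 1R 1"
```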

  2. The design and development of a computer game on insulin injection.

    Science.gov (United States)

    Ebrahimpour, Fatemeh; Najafi, Mostafa; Sadeghi, Narges

    2014-01-01

    Insulin therapy is of high importance in glycemic control and prevention of complications in type 1 diabetes in children. However, this treatment is unpleasant and stressful for many children, and it is difficult for them to accept. The purpose of the study was to design and develop an educational computer game for diabetic children to familiarize them with insulin injections. After a review of the literature and the collection of basic information, we discussed the purpose of this research with some diabetic children, their parents, and nurses. The findings that we acquired from the discussion were considered in designing and developing the game. Then, following the principles associated with the development of computer games, we developed seven different games related to insulin injections, and the games were evaluated in a pilot study. The games were developed in the Adobe Flash design and programming environment and stored on a compact disc (CD). The seven games were a pairs game, a puzzle game, a question and answer game, an insulin kit game, a drawing room game, a story game, and an insulin injection-room game. The idea was that diabetic children could become acquainted with insulin injections and the injection toolkit by playing a variety of entertaining and fun games. They also learned about some of the issues associated with insulin and experienced insulin injection in a simulated environment. It seems that the use of new technologies, such as computer games, can influence diabetic children's acquaintance with the correct method of insulin injection, psychological readiness to initiate insulin therapy, and reduction in stress, anxiety, and fear of insulin injection.

  3. A methodology to develop computational phantoms with adjustable posture for WBC calibration.

    Science.gov (United States)

    Fonseca, T C Ferreira; Bogaerts, R; Hunt, John; Vanhavere, F

    2014-11-21

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium.

  4. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium. (paper)

  5. Maize sugary enhancer1 (se1) is a presence-absence variant of a previously uncharacterized gene and development of educational videos to raise the profile of plant breeding and improve curricula

    Science.gov (United States)

    Haro von Mogel, Karl J.

    Carbohydrate metabolism is a biologically, economically, and culturally important process in crop plants. Humans have selected many crop species such as maize (Zea mays L.) in ways that have resulted in changes to carbohydrate metabolic pathways, and understanding the underlying genetics of this pathway is therefore exceedingly important. A previously uncharacterized starch metabolic pathway mutant, sugary enhancer1 (se1), is a recessive modifier of sugary1 (su1) sweet corn that increases the sugar content while maintaining an appealing creamy texture. This allele has been incorporated into many sweet corn varieties since its discovery in the 1970s; however, testing for the presence of this allele has been difficult. A genetic stock was developed that allowed the presence of se1 to be visually scored in segregating ears, which were used to genetically map se1 to the deletion of a single gene model located on the distal end of the long arm of chromosome 2. An analysis of homology found that this gene is specific to monocots, and the gene is expressed in the endosperm and developing leaf. The se1 allele increased water-soluble polysaccharide (WSP) and decreased amylopectin in maize endosperm, but there was no overall effect of se1 on starch content in mature leaves. This discovery will lead to a greater understanding of starch metabolism, and the marker developed will assist in breeding. There is a pressing need for increased training of plant breeders to meet the growing needs of the human population. To raise the profile of plant breeding among young students, a series of videos called Fields of Study was developed. These feature interviews with plant breeders who talk about what they do as plant breeders and what they enjoy about their chosen profession. To help broaden the education of students in college biology courses, and assist with the training of plant breeders, a second video series, Pollination Methods, was developed. Each video focuses on one or two

  6. Towards Test Driven Development for Computational Science with pFUnit

    Science.gov (United States)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
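
    pFUnit itself is a Fortran framework; as a language-neutral sketch of the fine-grained numerical testing advocated here (a small unit under test, an oracle with a known answer, and an explicit finite-precision tolerance), consider the following. All names and bounds are illustrative.

```python
# Fine-grained TDD-style test of a single numerical step with a known
# oracle and an explicit floating-point tolerance (language-neutral
# sketch; pFUnit provides the analogous assertions for Fortran).
import math
import unittest

def trapezoid(f, a, b, n):
    """Composite trapezoid rule: the small, testable unit."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

class TestTrapezoid(unittest.TestCase):
    def test_exact_for_linear(self):
        # Oracle: the rule is exact for linear integrands.
        self.assertAlmostEqual(trapezoid(lambda x: 2 * x, 0.0, 1.0, 7),
                               1.0, places=12)

    def test_sine_within_error_bound(self):
        # Oracle: integral of sin over [0, pi] is 2; the standard error
        # bound is (b - a) * h**2 / 12 * max|f''|, with |f''| <= 1 here.
        n = 100
        bound = math.pi * (math.pi / n) ** 2 / 12 + 1e-12
        self.assertLess(abs(trapezoid(math.sin, 0.0, math.pi, n) - 2.0), bound)

if __name__ == "__main__":
    unittest.main()
```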

  7. Using Computer-Assisted Argumentation Mapping to develop effective argumentation skills in high school advanced placement physics

    Science.gov (United States)

    Heglund, Brian

    Educators recognize the importance of reasoning ability for development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation; however, argumentation is not explicitly taught outside logic and philosophy---subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as self-reported sense of improvement in argument

  8. USE OF CLOUD COMPUTING FOR DEVELOPMENT OF TEACHERS’ INFORMATION AND COMMUNICATION COMPETENCE

    Directory of Open Access Journals (Sweden)

    N. Soroko

    2013-08-01

    Full Text Available The article deals with the problem of developing teachers' information and communication competence and the use of cloud computing for this purpose. An analysis of modern approaches to the use of cloud technologies and of projects for the professional development of teachers and the development of their information and communication competence is presented. The main characteristics of software-as-a-service offerings for education from the leading companies Google, Microsoft and IBM are given. Some of the actions these companies conduct to help teachers master cloud technology for improving professional activities and developing information and communication competence are described. Examples of ways to develop teachers' information and communication competence and to train teachers to use modern ICT in professional activity are given in the paper. A cloud-based model for the development of teachers' information and communication competence is proposed.

  9. Computer literacy of future teacher of physical culture, as one of basic elements of professional development.

    Directory of Open Access Journals (Sweden)

    Dragnev Y.V.

    2011-08-01

    Full Text Available The problem of the computer literacy of the future teacher of physical culture is examined in the article as one of the basic elements of professional development. The introduction of multimedia technologies opens new possibilities for the practice of athletic education, making it possible to combine the didactic functions of the computer as a teaching facility with the possibilities of traditional teaching methods and to renew the educational process with information technologies. It is specified that the professional development of the future teacher of physical culture must produce a new specialist in the field of knowledge "Physical education, sport and health of man" who will be competitive on the European and world labour markets under the conditions of the informatization and computerization of higher education.

  10. A development of computer code for evaluating internal radiation dose through ingestion and inhalation pathways

    International Nuclear Information System (INIS)

    Lee, Jeong Ho; Lee, Chang Woo; Choi, Yong Ho; Chun, Ki Jung; Kim, Kook Chan; Kim, Sang Bok; Kim, Jin Kyu

    1991-07-01

    Computer codes were developed to evaluate the internal radiation dose received when radioactive isotopes released from nuclear facilities are taken in through ingestion and inhalation pathways. Food chain models and a relevant database representing the agricultural and social environment of Korea were set up. Two codes were developed: an equilibrium model, KFOOD, which can deal with routine releases from a nuclear facility, and a dynamic model, ECOREA, which is suitable for describing acute radioactivity releases following a nuclear accident. (Author)
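
    The bookkeeping at the heart of such codes is the committed effective dose E = Σ_r I_r · e_r, where I_r is the activity intake of radionuclide r (Bq) via a given pathway and e_r its ingestion or inhalation dose coefficient (Sv/Bq). A minimal sketch follows; the coefficients are only of the order of published adult ingestion values, and the intakes are invented for illustration.

```python
# Committed effective dose from intakes: E = sum_r I_r * e_r
# Dose coefficients below are illustrative (order of ICRP adult
# ingestion values); a real code would use the full nuclide library.

DOSE_COEFF_SV_PER_BQ = {"Cs-137": 1.3e-8, "I-131": 2.2e-8}

def committed_dose(intakes_bq):
    return sum(activity * DOSE_COEFF_SV_PER_BQ[nuc]
               for nuc, activity in intakes_bq.items())

# Annual intake via the ingestion pathway (Bq), invented numbers:
intake = {"Cs-137": 5.0e3, "I-131": 1.0e3}
print(f"committed effective dose: {committed_dose(intake):.2e} Sv")
```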

  11. Factors Influencing the Adoption of Cloud Computing by Small and Medium Enterprises (SMEs) in Developing Economies

    DEFF Research Database (Denmark)

    Yeboah-Boateng, Ezer Osei; Essandoh, Kofi Asare

    2014-01-01

    The study identified communication, scalability and business continuity as the main drivers of cloud adoption, whereas lack of knowledge, poor internet connectivity, security of cloud services, lack of trust and interoperability with existing systems were identified as barriers to adoption. Top management support, trialability, competence of cloud vendors, resistance to new technology, compatibility and the existence of IT infrastructure were found to be key factors influencing cloud computing adoption. These findings will go a long way in helping service providers and technology policymakers to develop solutions and strategies for SMEs in developing economies.

  12. Development of Student Information Management System based on Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    Ibrahim A. ALAMERI

    2017-10-01

    Full Text Available The management and provision of information about the educational process is an essential part of effective management of the educational process in institutes of higher education. In this paper, the requirements of a reliable student management system are analyzed, a use-case model of the student information management system is formed, and the architecture of the application is designed and implemented. Regarding the implementation process, modern approaches were used to develop and deploy a reliable online application specifically in cloud computing environments.

  13. Development of the computer-aided process planning (CAPP) system for polymer injection molds manufacturing

    OpenAIRE

    J. Tepić; V. Todić; D. Lukić; M. Milošević; S. Borojević

    2011-01-01

    The beginning of production and sale of polymer products largely depends on mold manufacturing. The costs of mold manufacturing have a significant share in the final price of a product. The best way to improve and rationalize the production process for polymer injection molds is to automate mold design and manufacturing process planning. This paper reviews the development of a dedicated process planning system for manufacturing of the mold for injection molding, which integrates computer-...

  14. Development of a computer code for Dalat research reactor transient analysis

    International Nuclear Information System (INIS)

    Le Vinh Vinh; Nguyen Thai Sinh; Huynh Ton Nghiem; Luong Ba Vien; Pham Van Lam; Nguyen Kien Cuong

    2003-01-01

    The DRSIM (Dalat Reactor SIMulation) computer code has been developed for Dalat reactor transient analysis. It is basically a coupled neutronics-hydrodynamics-heat transfer code employing point kinetics, one-dimensional hydrodynamics and one-dimensional heat transfer. The work was financed by VAEC and DNRI in the framework of an institutional R and D programme. Some transient problems related to reactivity insertion and loss of coolant flow were analyzed with DRSIM using temperature and void coefficients calculated by the WIMS and HEXNOD2D codes. (author)
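
    To make the point-kinetics part concrete: with one effective delayed-neutron group the governing equations are dn/dt = ((ρ − β)/Λ)n + λC and dC/dt = (β/Λ)n − λC. A minimal solver sketch follows; the parameter values are generic illustrations, not DRSIM's.

```python
from scipy.integrate import solve_ivp

# One-group point kinetics: dn/dt = ((rho - beta)/Lambda) n + lam * C
#                           dC/dt = (beta/Lambda) n - lam * C
BETA, LAMBDA_GEN, LAM = 0.0065, 1e-4, 0.08  # generic parameters

def rhs(t, y, rho):
    n, c = y
    return [((rho - BETA) / LAMBDA_GEN) * n + LAM * c,
            (BETA / LAMBDA_GEN) * n - LAM * c]

# Steady state at n = 1 requires C = beta * n / (Lambda * lam)
y0 = [1.0, BETA / (LAMBDA_GEN * LAM)]
rho_step = 0.1 * BETA  # +0.1 dollar reactivity step at t = 0
sol = solve_ivp(rhs, (0.0, 10.0), y0, args=(rho_step,),
                method="Radau", rtol=1e-8)  # implicit method: stiff system
print(f"relative power after 10 s: {sol.y[0, -1]:.3f}")
```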

  15. Development changes of geometric layout product, developed by means of computer aided design

    Directory of Open Access Journals (Sweden)

    С.Г. Кєворков

    2007-01-01

    Full Text Available Contains the results of developing a methodology for forming modifications of a product geometrical mockup made by means of a CAD system. The process of changing CAD data (assembly structures, details) and its influence on the product structure is considered. The algorithm for creating assembly versions, which builds a product structure with a certain serial number, is analyzed. Algorithms for creating the CAD user environment, restricting CAD objects, and cancelling CAD objects are presented.

  16. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  17. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  18. The development of Sonic Pi and its use in educational partnerships: Co-creating pedagogies for learning computer programming

    OpenAIRE

    Aaron, S; Blackwell, Alan Frank; Burnard, Pamela Anne

    2017-01-01

    Sonic Pi is a new open source software tool and platform originally developed for the Raspberry Pi computer, designed to enable school children to learn programming by creating music. In this article we share insights from a scoping study on the development of Sonic Pi and its use in educational partnerships. Our findings draw attention to the importance of collaborative relationships between teacher and computer scientist and the value of creative pedagogies for learning computer programming...

  19. Biomechanical Stress and Strain Analysis of Mandibular Human Region from Computed Tomography to Custom Implant Development

    Directory of Open Access Journals (Sweden)

    Rafael Ferreira Gregolin

    2017-01-01

    Full Text Available Currently, computational tools are helping to improve processes and testing procedures in several areas of knowledge. Computed tomography (CT) is an already well-established diagnostic tool that is now beginning to be used for something even more innovative: creating biomodels. Biomodels are anatomical physical copies of human organs and tissues that are used for diagnosis and surgical planning. The use of tomographic images in the creation of biomodels has been arousing great interest in the medical and bioengineering areas. In addition to creating biomodels by computed tomography, it is also possible, using this process, to create mathematical models to perform computer simulations and analyses of regions of interest. This paper discusses the creation of a biomodel of the skull-mandibular region of a patient from CT for the study and evaluation of efforts in the area of the temporomandibular joint (TMJ), aiming at the design and development of a TMJ custom prosthesis. The evaluation of efforts in the TMJ region due to mastication forces was made using the finite element method, and the results were corroborated by comparison with mandibular models studied in similar works.

  20. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which puzzles engineers and designers at all times. At present, for calculating and categorizing the stress field of pressure vessel components, several computational methods of design by analysis have been developed and applied, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, the GLOSS R-Node method and so on. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results obtained with the different calculation and analysis methods mentioned above. As a consequence, this is the main reason limiting the wide application of the design by analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly on the basis of elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly. The computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are depicted compendiously. Furthermore, the characteristics of the computational methods of design by analysis are summarized to aid selection of the proper computational method when designing a pressure vessel component by analysis. (authors)
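
    As a concrete instance of the first method named above, stress equivalent linearization decomposes a through-thickness stress profile σ(x), x ∈ [0, t], into a membrane part σ_m = (1/t)∫σ dx and a bending part σ_b = (6/t²)∫σ(t/2 − x) dx, which are then compared against category-specific allowables. A minimal numerical sketch with an invented profile:

```python
import numpy as np

def linearize(x, sigma):
    """Membrane and bending components of a through-thickness
    stress profile sigma(x) on x in [0, t]."""
    t = x[-1] - x[0]
    sigma_m = np.trapz(sigma, x) / t
    sigma_b = 6.0 / t**2 * np.trapz(sigma * (t / 2.0 - (x - x[0])), x)
    return sigma_m, sigma_b

# Invented profile: linear variation plus a local peak near one surface
x = np.linspace(0.0, 0.02, 201)                       # 20 mm wall
sigma = 150e6 - 8e9 * x + 30e6 * np.exp(-x / 0.002)   # Pa
m, b = linearize(x, sigma)
print(f"membrane {m/1e6:.1f} MPa, bending {b/1e6:.1f} MPa")
```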

  1. [Placental complications after a previous cesarean section].

    Science.gov (United States)

    Milosević, Jelena; Lilić, Vekoslav; Tasić, Marija; Radović-Janosević, Dragana; Stefanović, Milan; Antić, Vladimir

    2009-01-01

    The incidence of cesarean section has been rising in the past 50 years. With the increased number of cesarean sections, the number of pregnancies with a previous cesarean section rises as well. The aim of this study was to establish the influence of a previous cesarean section on the development of placental complications: placenta previa, placental abruption and placenta accreta, as well as to determine the influence of the number of previous cesarean sections on the development of these complications. The research was conducted at the Clinic of Gynecology and Obstetrics in Nis over a 10-year period (from 1995 to 2005) covering 32,358 deliveries, 1,280 deliveries after a previous cesarean section, 131 cases of placenta previa and 118 cases of placental abruption. The experimental group comprised the cases of placenta previa or placental abruption with a prior cesarean section in the obstetric history, as opposed to the control group with the same conditions but without a cesarean section in the medical history. The incidence of placenta previa in the control group was 0.33%, compared with a 1.86% incidence after one cesarean section, rising with the number of previous cesarean sections and reaching as high as 14.28% after three cesarean sections in the obstetric history. Placental abruption was recorded as a placental complication in 0.33% of pregnancies in the control group, while its incidence was 1.02% after one cesarean section and higher after multiple cesarean sections. The difference in the incidence of intrapartal hysterectomy between the group with a prior cesarean section (0.86%) and without it (0.006%) is highly statistically significant. A previous cesarean section is thus an important risk factor for the development of placental complications.

  2. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Hector, Jr., Louis G. [General Motors, Warren, MI (United States); McCarty, Eric D. [United States Automotive Materials Partnership LLC (USAMP), Southfield, MI (United States)

    2017-07-31

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME Model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS, yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of up to $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME Model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.

  3. A Systematic Review of Computational Drug Discovery, Development, and Repurposing for Ebola Virus Disease Treatment.

    Science.gov (United States)

    Schuler, James; Hudson, Matthew L; Schwartz, Diane; Samudrala, Ram

    2017-10-20

    Ebola virus disease (EVD) is a deadly global public health threat, with no currently approved treatments. Traditional drug discovery and development is too expensive and inefficient to react quickly to the threat. We review published research studies that utilize computational approaches to find or develop drugs that target the Ebola virus, and synthesize their results. A variety of hypothesized and/or novel treatments are reported to have potential anti-Ebola activity. Approaches that utilize multi-targeting/polypharmacology have the most promise in treating EVD.

  4. A Systematic Review of Computational Drug Discovery, Development, and Repurposing for Ebola Virus Disease Treatment

    Directory of Open Access Journals (Sweden)

    James Schuler

    2017-10-01

    Full Text Available Ebola virus disease (EVD) is a deadly global public health threat, with no currently approved treatments. Traditional drug discovery and development is too expensive and inefficient to react quickly to the threat. We review published research studies that utilize computational approaches to find or develop drugs that target the Ebola virus, and synthesize their results. A variety of hypothesized and/or novel treatments are reported to have potential anti-Ebola activity. Approaches that utilize multi-targeting/polypharmacology have the most promise in treating EVD.

  5. A computer-aided framework for development, identification and management of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models, thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling workflow. Its use is illustrated by physiologically-based pharmacokinetic modelling of the distribution of the drug cyclosporin A in rats and humans. Four alternative candidate models for rats are derived and discriminated based on experimental data. The model candidate that is best supported by the experimental data is scaled up to a human.
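
    To give a flavour of what such candidate models look like, a flow-limited two-compartment (blood/tissue) PBPK sketch is shown below; the parameter values are invented for illustration and are not the cyclosporin A models of the paper.

```python
from scipy.integrate import solve_ivp

# Flow-limited two-compartment PBPK sketch (illustrative parameters):
Q, VB, VT = 1.2, 0.5, 2.0   # tissue blood flow (L/h), volumes (L)
KP, CL = 4.0, 0.3           # tissue:blood partition coeff., clearance (L/h)

def rhs(t, y):
    cb, ct = y  # blood and tissue concentrations (mg/L)
    return [(Q * (ct / KP - cb) - CL * cb) / VB,  # blood balance
            Q * (cb - ct / KP) / VT]              # tissue balance

sol = solve_ivp(rhs, (0.0, 24.0), [10.0, 0.0], rtol=1e-8)  # 10 mg/L bolus
print(f"blood conc. after 24 h: {sol.y[0, -1]:.3f} mg/L")
```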

  6. Development of a multimaterial, two-dimensional, arbitrary Lagrangian-Eulerian mesh computer program

    International Nuclear Information System (INIS)

    Barton, R.T.

    1982-01-01

    We have developed a large, multimaterial, two-dimensional Arbitrary Lagrangian-Eulerian (ALE) computer program. The special feature of an ALE mesh is that it can be either an embedded Lagrangian mesh, a fixed Eulerian mesh, or a partially embedded, partially remapped mesh. Remapping is used to remove Lagrangian mesh distortion. This general purpose program has been used for astrophysical modeling, under the guidance of James R. Wilson. The rationale behind the development of this program will be used to highlight several important issues in program design

  7. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    Science.gov (United States)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration and ground scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  8. Development of a new generation solid rocket motor ignition computer code

    Science.gov (United States)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Ciucci, Alessandro; Johnson, Shelby D.

    1994-01-01

    This report presents the results of experimental and numerical investigations of the flow field in the head-end star grain slots of the Space Shuttle Solid Rocket Motor. This work provided the basis for the development of an improved solid rocket motor ignition transient code which is also described in this report. The correlation between the experimental and numerical results is excellent and provides a firm basis for the development of a fully three-dimensional solid rocket motor ignition transient computer code.

  9. Theoretical Atomic Physics code development IV: LINES, A code for computing atomic line spectra

    International Nuclear Information System (INIS)

    Abdallah, J. Jr.; Clark, R.E.H.

    1988-12-01

    A new computer program, LINES, has been developed for simulating atomic line emission and absorption spectra using the accurate fine structure energy levels and transition strengths calculated by the Cowan Atomic Structure (CATS) code. Population distributions for the ion stages are obtained in LINES by using the Local Thermodynamic Equilibrium (LTE) model. LINES is also useful for displaying the pertinent atomic data generated by CATS. This report describes the use of LINES. Both CATS and LINES are part of the Theoretical Atomic PhysicS (TAPS) code development effort at Los Alamos. 11 refs., 9 figs., 1 tab
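
    For level populations within a single ion stage, the LTE model mentioned here reduces to the Boltzmann distribution n_i/n_j = (g_i/g_j) exp(−(E_i − E_j)/kT). A small sketch follows; the level data and temperature are illustrative, not CATS output.

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_ratio(g_i, e_i_ev, g_j, e_j_ev, temp_k):
    """LTE population ratio n_i/n_j of two levels in the same ion."""
    return (g_i / g_j) * math.exp(-(e_i_ev - e_j_ev) / (K_B_EV * temp_k))

# Illustrative: upper level g=6 at 12.1 eV vs ground g=2 at 0 eV
T = 2.0 / K_B_EV  # a 2 eV plasma expressed in kelvin
print(f"n_upper/n_ground = {boltzmann_ratio(6, 12.1, 2, 0.0, T):.3e}")
```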

  10. Development of a model and computer code to describe solar grade silicon production processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gould, R K; Srivastava, R

    1979-12-01

    Models and computer codes which may be used to describe flow reactors in which high purity, solar grade silicon is produced via reduction of gaseous silicon halides are described. A prominent example of the type of process which may be studied using the codes developed in this program is the SiCl/sub 4//Na reactor currently being developed by the Westinghouse Electric Corp. During this program two large computer codes were developed. The first is the CHEMPART code, an axisymmetric, marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. This code, based on the AeroChem LAPP (Low Altitude Plume Program) code, can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail than can be afforded using CHEMPART. These two codes have been used in this program to predict particle formation characteristics and wall collection efficiencies for SiCl/sub 4//Na flow reactors. Results are described.

  11. Development of an electrical impedance computed tomographic two-phase flows analyzer. Annual technical report for program renewal

    Energy Technology Data Exchange (ETDEWEB)

    Jones, O.C.

    1993-05-01

    This progress report details the theoretical development, numerical results, mechanical and electronic experimental design, and experimental results of the research program for the development of an electrical impedance computed tomographic two-phase flow analyzer.

  12. Design and development of a diversified real time computer for future FBRs

    International Nuclear Information System (INIS)

    Sujith, K.R.; Bhattacharyya, Anindya; Behera, R.P.; Murali, N.

    2014-01-01

    The current safety related computer system of the Prototype Fast Breeder Reactor (PFBR) under construction in Kalpakkam consists of two redundant Versa Module Europa (VME) bus based Real Time Computer systems with a Switch Over Logic Circuit (SOLC). Since both VME systems are identical, the dual redundant system is prone to common cause failure (CCF). The probability of CCF can be reduced by adopting diversity. Design diversity has long been used to protect redundant systems against common-mode failures. The conventional notion of diversity relies on 'independent' generation of 'different' implementations. This paper discusses the design and development of a diversified Real Time Computer which will replace one of the computer systems in the dual redundant architecture. Compact PCI (cPCI) bus systems are widely used in safety critical applications such as avionics, railways and defence, and use electrical signaling and logical specifications diverse from VME; cPCI was hence chosen for development of the diversified system. For the initial development, a CPU card based on an ARM-9 processor, a 16-channel Relay Output (RO) card and a 30-channel Analog Input (AI) card were developed. All the cards mentioned support hot-swap and geographic addressing capability. In order to mitigate the component obsolescence problem, the 32-bit PCI target controller and associated glue logic for the slave I/O cards were indigenously developed using VHDL. U-Boot was selected as the boot loader and ARM Linux 2.6 as the preliminary operating system for the CPU card. Board-specific initialization code for the CPU card was written in ARM assembly language and serial port initialization was written in the C language. The boot loader along with the Linux 2.6 kernel and a jffs2 file system was flashed into the CPU card. Test applications written in C were used to test the various peripherals of the CPU card. Device drivers for the AI and RO cards were developed as Linux kernel modules and an application library was also

  13. Magnetic fusion energy and computers. The role of computing in magnetic fusion energy research and development (second edition)

    International Nuclear Information System (INIS)

    1983-01-01

    This report documents the structure and uses of the MFE Network and presents a compilation of future computing requirements. Its primary emphasis is on the role of supercomputers in fusion research. One of its key findings is that with the introduction of each successive class of supercomputer, qualitatively improved understanding of fusion processes has been gained. At the same time, even the current Class VI machines severely limit the attainable realism of computer models. Many important problems will require the introduction of Class VII or even larger machines before they can be successfully attacked

  14. Development of a totally computer-controlled triple quadrupole mass spectrometer system

    International Nuclear Information System (INIS)

    Wong, C.M.; Crawford, R.W.; Barton, V.C.; Brand, H.R.; Neufeld, K.W.; Bowman, J.E.

    1983-01-01

    A totally computer-controlled triple quadrupole mass spectrometer (TQMS) is described. It has a number of unique features not available on current commercial instruments, including: complete computer control of source and all ion axial potentials; use of dual computers for data acquisition and data processing; and capability for self-adaptive control of experiments. Furthermore, it has been possible to produce this instrument at a cost significantly below that of commercial instruments. This triple quadrupole mass spectrometer has been constructed using components commercially available from several different manufacturers. The source is a standard Hewlett-Packard 5985B GC/MS source. The two quadrupole analyzers and the quadrupole CAD region contain Balzers QMA 150 rods with Balzers QMG 511 rf controllers for the analyzers and a Balzers QHS-511 controller for the CAD region. The pulsed-positive-ion-negative-ion-chemical ionization (PPINICI) detector is made by Finnigan Corporation. The mechanical and electronic designs were developed at LLNL to link these diverse elements into a functional TQMS as described. The computer design for total control of the system is unique in that two separate LSI-11/23 minicomputers and assorted I/O peripherals and interfaces from several manufacturers are used. The evolution of this design concept from totally computer-controlled instrumentation into future self-adaptive or "expert" systems for instrumental analysis is described. Operational characteristics of the instrument and initial results from experiments involving the analysis of the high explosive HMX (1,3,5,7-Tetranitro-1,3,5,7-Tetrazacyclooctane) are presented

  15. Development of food tables and use with computers. Review of nutrient data bases.

    Science.gov (United States)

    Hertzler, A A; Hoover, L W

    1977-01-01

    Numerous tables of food composition have been compiled since the 1890s to meet the needs for nutrient data by nutritionists, researchers, and consumers. Early tables included values for protein, fat, carbohydrate, and calories. By 1945, values for several vitamins and minerals were listed in tables of food composition. The contents of food composition tables have been updated and expanded as laboratory procedures for analyzing nutrients have been improved. A trend toward increased specificity for most nutrient classifications has resulted in consideration of up to one hundred nutritional components in the development of nutrient data bases. Tables of food composition have varied in number of food items, number of nutrients, and classification schemes. Food-group designations have been used in some tables as a technique to categorize data for similar food items. Computer-stored nutrient data bases tend to vary in much the same manner as printed tables of food composition. Computer-assisted diagnostic procedures for editing input data for validity and verifying reasonable relationships among nutrient data have been developed to detect data inconsistencies. As data for new food products are included and the effects of food processing methods are determined, food composition tables and computer-stored nutrient data bases are expected to become more comprehensive, reliable, and suitable for various uses.

  16. An Open Environment to Support the Development of Computational Chemistry Solutions

    Science.gov (United States)

    Bejarano, Bernardo Palacios; Ruiz, Irene Luque; Gómez-Nieto, Miguel Ángel

    2009-08-01

    In this paper we present an open software environment devoted to supporting investigations in computational chemistry. The software, named CoChiSE (Computational Chimica Software Environment), is fully developed in Java using Eclipse as the IDE; in this way, the system integrates different perspectives, each oriented to solving a different aspect of computational chemistry research. CoChiSE is able to manage large chemical databases, maintaining information about molecules and properties as well; this information can be exported to and imported from the most popular standard file formats. The system also allows the user to calculate different types of isomorphism and molecular similarity. In addition, CoChiSE incorporates a perspective in charge of calculating molecular descriptors, covering more than four hundred descriptors of different categories. All the information and system perspectives are integrated in the same environment, so a huge amount of information is managed by the user. The characteristics of the developed system permit the easy integration of user, proprietary and free software.
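
    CoChiSE's own API is not public here; as a stand-in for the fingerprint-based similarity calculation such systems typically perform, the following sketch computes Tanimoto similarity with the open-source RDKit toolkit.

```python
# Tanimoto similarity from Morgan fingerprints, using RDKit as a stand-in
# for the molecular similarity calculation a system like CoChiSE performs.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def tanimoto(smiles_a, smiles_b, radius=2, n_bits=2048):
    fps = [AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s),
                                                 radius, nBits=n_bits)
           for s in (smiles_a, smiles_b)]
    return DataStructs.TanimotoSimilarity(*fps)

print(tanimoto("CCO", "CCN"))             # ethanol vs ethylamine
print(tanimoto("c1ccccc1", "c1ccccc1O"))  # benzene vs phenol
```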

  17. Examining human behavior in video games: The development of a computational model to measure aggression.

    Science.gov (United States)

    Lamb, Richard; Annetta, Leonard; Hoston, Douglas; Shapiro, Marina; Matthews, Benjamin

    2018-06-01

    Video games with violent content have raised considerable concern in popular media and within academia. Recently, considerable attention has been paid to the claimed relationship between aggression and video game play. The authors of this study propose the use of a new class of tools, developed via computational models, to allow examination of the question of whether there is a relationship between violent video games and aggression. The purpose of this study is to computationally model and compare the General Aggression Model with the Diathesis Model of Aggression in relation to the play of violent content in video games. A secondary purpose is to provide a method of measuring and examining individual aggression arising from video game play. In total, N = 1065 participants were examined. This study occurs in three phases. Phase 1 is the development and quantification of the profile combination of traits via latent class profile analysis. Phase 2 is the training of the artificial neural network. Phase 3 is the comparison of each model as a computational model with and without the presence of video game violence. Results suggest that a combination of environmental factors and genetic predispositions triggers aggression related to video games.
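
    The profile data behind the paper's network are not reproduced here; as a generic sketch of the Phase 2 step (fitting a small feed-forward network to trait-profile features and a binary aggression label), consider the following, with entirely synthetic data.

```python
# Generic stand-in for Phase 2: train a small feed-forward network on
# (synthetic) trait-profile features with a binary aggression label.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1065, 8))             # 8 invented profile features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1065)) > 0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```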

  18. Computational Modelling of Cancer Development and Growth: Modelling at Multiple Scales and Multiscale Modelling.

    Science.gov (United States)

    Szymańska, Zuzanna; Cytowski, Maciej; Mitchell, Elaine; Macnamara, Cicely K; Chaplain, Mark A J

    2017-06-20

    In this paper, we present two mathematical models related to different aspects and scales of cancer growth. The first model is a stochastic spatiotemporal model of both a synthetic gene regulatory network (the example of a three-gene repressilator is given) and an actual gene regulatory network, the NF-[Formula: see text]B pathway. The second model is a force-based individual-based model of the development of a solid avascular tumour with specific application to tumour cords, i.e. a mass of cancer cells growing around a central blood vessel. In each case, we compare our computational simulation results with experimental data. In the final discussion section, we outline how to take the work forward through the development of a multiscale model focussed at the cell level. This would incorporate key intracellular signalling pathways associated with cancer within each cell (e.g. p53-Mdm2, NF-[Formula: see text]B) and through the use of high-performance computing be capable of simulating up to [Formula: see text] cells, i.e. the tissue scale. In this way, mathematical models at multiple scales would be combined to formulate a multiscale computational model.
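
    For reference, the deterministic core of the three-gene repressilator cited as the synthetic-network example is dm_i/dt = −m_i + α/(1 + p_{i−1}^n) + α0 and dp_i/dt = −β(p_i − m_i) with cyclic indices; the sketch below uses standard textbook parameter values, not those of the paper, whose model is moreover stochastic and spatiotemporal.

```python
import numpy as np
from scipy.integrate import solve_ivp

ALPHA, ALPHA0, BETA, N = 216.0, 0.216, 5.0, 2.0  # classic demo values

def repressilator(t, y):
    m, p = y[:3], y[3:]  # mRNA and protein levels of the three genes
    dm = [-m[i] + ALPHA / (1.0 + p[(i - 1) % 3] ** N) + ALPHA0
          for i in range(3)]
    dp = [-BETA * (p[i] - m[i]) for i in range(3)]
    return dm + dp

y0 = [1.0, 2.0, 3.0, 1.0, 1.0, 1.0]  # slightly asymmetric start
sol = solve_ivp(repressilator, (0.0, 100.0), y0, max_step=0.1)
print("protein 1 oscillates between",
      f"{sol.y[3].min():.2f} and {sol.y[3].max():.2f}")
```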

  19. Development of a Bacteria Computer: From in silico Finite Automata to in vitro and in vivo

    Science.gov (United States)

    Sakakibara, Yasubumi

    We overview a series of our research on implementing finite automata in vitro and in vivo in the framework of DNA-based computing [1,2]. First, we employ the length-encoding technique proposed and presented in [3,4] to implement finite automata in a test tube. In the length-encoding method, the states and state transition functions of a target finite automaton are effectively encoded into DNA sequences, a computation (accepting) process of the finite automaton is accomplished by self-assembly of encoded complementary DNA strands, and the acceptance of an input string is determined by the detection of a completely hybridized double-stranded DNA. Second, we report our intensive in vitro experiments in which we have implemented and executed several finite-state automata in a test tube. We have designed and developed practical laboratory protocols which combine several in vitro operations such as annealing, ligation, PCR, and streptavidin-biotin bonding to execute in vitro finite automata based on the length-encoding technique. We have carried out laboratory experiments on various finite automata with 2 up to 6 states for several input strings. Third, we present a novel framework to develop a programmable and autonomous in vivo computer using Escherichia coli (E. coli), and implement in vivo finite-state automata based on the framework by employing the protein-synthesis mechanism of E. coli. We show some successful experiments running an in vivo finite-state automaton on E. coli.
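
    The in silico starting point of this line of work, a deterministic finite automaton that accepts or rejects an input string, is summarized by the following sketch; the two-state machine shown (accepting words with an even number of 'b' symbols) is illustrative, not one of the paper's encoded automata.

```python
# Deterministic finite automaton: the in silico object whose accepting
# runs are realized in vitro by hybridization of length-encoded strands.
def run_dfa(transitions, start, accepting, word):
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# Illustrative 2-state machine over {a, b}: accepts words containing an
# even number of 'b' symbols.
T = {("s0", "a"): "s0", ("s0", "b"): "s1",
     ("s1", "a"): "s1", ("s1", "b"): "s0"}

for w in ["", "ab", "abb", "babab"]:
    print(repr(w), "->", "accept" if run_dfa(T, "s0", {"s0"}, w) else "reject")
```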

  20. Development of a computer code for dynamic analysis of the primary circuit of advanced reactors

    International Nuclear Information System (INIS)

    Rocha, Jussie Soares da; Lira, Carlos A.B.O.; Magalhaes, Mardson A. de Sa

    2011-01-01

    Currently, advanced reactors are being developed, seeking enhanced safety, better performance and low environmental impact. Reactor designs must follow several steps and numerous tests before a conceptual project can be certified. In this sense, computational tools become indispensable in the preparation of such projects. Thus, this study aimed at the development of a computational tool for thermal-hydraulic analysis by coupling two computer codes to evaluate the influence of transients caused by pressure variations and flow surges in the region of the IRIS reactor primary circuit between the core and the pressurizer. For the simulation, an 'insurge' situation was used, characterized by the entry of water into the pressurizer due to the expansion of the coolant in the primary circuit. This expansion was represented by a pressure disturbance in the form of a step, applied through the SIMULINK 'step' block, thus initiating the transient. The results showed that the dynamic tool obtained by coupling the codes generated very satisfactory responses within the model's limitations, preserving the most important phenomena in the process. (author)

  1. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less well-studied application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  2. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  3. Virtual environment and computer-aided technologies used for system prototyping and requirements development

    Science.gov (United States)

    Logan, Cory; Maida, James; Goldsby, Michael; Clark, Jim; Wu, Liew; Prenger, Henk

    1993-01-01

    The Space Station Freedom (SSF) Data Management System (DMS) consists of distributed hardware and software which monitor and control the many onboard systems. Virtual environment and off-the-shelf computer technologies can be used at critical points in project development to aid in objectives and requirements development. Geometric models (images) coupled with off-the-shelf hardware and software technologies were used in the Space Station Mockup and Trainer Facility (SSMTF) Crew Operational Assessment Project. Rapid prototyping is shown to be a valuable tool for operational procedure and system hardware and software requirements development. The project objectives, hardware and software technologies used, data gained, current activities, and future development and training objectives are discussed. The importance of defining prototyping objectives and staying focused while maintaining schedules is discussed, along with project pitfalls.

  4. A review of Green's function methods in computational fluid mechanics: Background, recent developments and future directions

    International Nuclear Information System (INIS)

    Dorning, J.

    1981-01-01

    The research and development over the past eight years on local Green's function methods for the high-accuracy, high-efficiency numerical solution of nuclear engineering problems is reviewed. The basic concepts and key ideas are presented by starting with an expository review of the original fully two-dimensional local Green's function methods developed for neutron diffusion and heat conduction, and continuing through the progressively more complicated and more efficient nodal Green's function methods for neutron diffusion, heat conduction and neutron transport to establish the background for the recent development of Green's function methods in computational fluid mechanics. Some of the impressive numerical results obtained via these classes of methods for nuclear engineering problems are briefly summarized. Finally, speculations are proffered on future directions in which the development of these types of methods in fluid mechanics and other areas might lead. (orig.)
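
    For orientation, local Green's function methods rest on the classical boundary-integral representation of a linear problem on each node or cell; schematically (a generic textbook form, not the specific discretizations reviewed):

    ```latex
    % Generic boundary-integral (Green's function) representation of a linear
    % problem L u = S on a node \Omega_k, with node-local Green's function G_k:
    u(\mathbf{r}) = \int_{\Omega_k} G_k(\mathbf{r},\mathbf{r}')\, S(\mathbf{r}')\, d\mathbf{r}'
      + \oint_{\partial\Omega_k} \left[ G_k(\mathbf{r},\mathbf{r}')\, \partial_{n'} u(\mathbf{r}')
      - u(\mathbf{r}')\, \partial_{n'} G_k(\mathbf{r},\mathbf{r}') \right] dS'
    ```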

  5. Development of a computational framework on fluid-solid mixture flow simulations for the COMPASS code

    International Nuclear Information System (INIS)

    Zhang, Shuai; Morita, Koji; Shirakawa, Noriyuki; Yamamoto, Yuichi

    2010-01-01

    The COMPASS code is designed based on the moving particle semi-implicit method to simulate various complex mesoscale phenomena relevant to core disruptive accidents of sodium-cooled fast reactors. In this study, a computational framework for fluid-solid mixture flow simulations was developed for the COMPASS code. The passively moving solid model was used to simulate hydrodynamic interactions between fluid and solids. Mechanical interactions between solids were modeled by the distinct element method. A multi-time-step algorithm was introduced to couple these two calculations. The proposed computational framework for fluid-solid mixture flow simulations was verified by the comparison between experimental and numerical studies on the water-dam break with multiple solid rods. (author)
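
    The multi-time-step coupling described above can be sketched as a loop in which each large fluid step encloses several small contact-dynamics sub-steps; the step ratio and the function names below are illustrative assumptions, not the COMPASS implementation.

    ```python
    # Sketch of a multi-time-step fluid/solid coupling loop (illustrative only).
    # One moving-particle fluid step of size DT encloses N_SUB distinct-element
    # sub-steps of size DT/N_SUB, since contact dynamics needs a finer time scale.

    DT, N_SUB, T_END = 1.0e-4, 10, 1.0e-2   # assumed step sizes (s)

    def fluid_step(state, dt):              # hypothetical MPS fluid update
        return state

    def dem_substep(state, dt):             # hypothetical DEM contact update
        return state

    state, t = {}, 0.0
    while t < T_END:
        state = fluid_step(state, DT)       # hydrodynamics, incl. passively
                                            # moving solids
        for _ in range(N_SUB):              # resolve solid-solid contacts
            state = dem_substep(state, DT / N_SUB)
        t += DT
    ```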

  6. Calculation and evaluation methodology of the flawed pipe and the compute program development

    International Nuclear Information System (INIS)

    Liu Chang; Qian Hao; Yao Weida; Liang Xingyun

    2013-01-01

    Background: In a pressurized pipe, a crack will grow gradually under alternating load, even when the load is below the fatigue strength limit. Purpose: A calculation and evaluation methodology for flaws detected during in-service inspection is elaborated here based on Elastic-Plastic Fracture Mechanics (EPFM) criteria. Methods: In the computation, the interaction of flaw depth and length is considered, and a computer program was developed in Visual C++. Results: The fluctuating load of the Reactor Coolant System transients, the initial flaw shape, and the initial flaw orientation are all accounted for here. Conclusions: The calculation and evaluation methodology presented here is an important basis for the decision on continued operation. (authors)
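
    For flavor, fatigue crack growth of this kind is often integrated cycle by cycle; the sketch below uses the simpler Paris law as a stand-in rather than the EPFM criteria of the paper, with made-up material constants.

    ```python
    import math

    # Illustrative fatigue crack growth via the Paris law, da/dN = C * (dK)^m.
    # The paper applies EPFM criteria, which this simple sketch does not
    # reproduce; constants and geometry factor below are made-up values.
    C, m = 1.0e-11, 3.0             # assumed Paris constants (dK in MPa*sqrt(m))
    d_sigma = 100.0                 # stress range per transient cycle (MPa)
    Y = 1.12                        # geometry factor for a shallow surface flaw
    a, a_crit = 1.0e-3, 10.0e-3     # initial and critical crack depths (m)

    cycles = 0
    while a < a_crit:
        dK = Y * d_sigma * math.sqrt(math.pi * a)  # stress intensity range
        a += C * dK**m                             # crack growth this cycle
        cycles += 1
    print(f"cycles to reach critical depth: {cycles}")
    ```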

  7. Development and validation of GWHEAD, a three-dimensional groundwater head computer code

    International Nuclear Information System (INIS)

    Beckmeyer, R.R.; Root, R.W.; Routt, K.R.

    1980-03-01

    A computer code has been developed to solve the groundwater flow equation in three dimensions. The code uses finite-difference approximations solved by the strongly implicit solution procedure. Input parameters to the code include hydraulic conductivity, specific storage, porosity, accretion (recharge), and initial hydraulic head. These parameters may vary spatially. The hydraulic conductivity may be input as isotropic or anisotropic. The boundaries either may permit flow across them or may be impermeable. The code has been used to model leaky confined groundwater conditions and spherical flow to a continuous point sink, both of which have exact analytical solutions. The results generated by the computer code compare well with those of the analytical solutions. The code was designed to model groundwater flow beneath fuel reprocessing and waste storage areas at the Savannah River Plant.
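
    The governing equation discretized by such a code is the standard transient groundwater flow equation (written here in its textbook form; the code's precise source-term treatment may differ):

    ```latex
    % Transient groundwater flow equation solved by finite differences:
    % S_s = specific storage, K = hydraulic conductivity tensor,
    % h = hydraulic head, W = accretion (recharge) source term.
    S_s \frac{\partial h}{\partial t}
      = \nabla \cdot \left( \mathbf{K}\, \nabla h \right) + W
    ```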

  8. Development of research reactor parameter measuring system based on personal computer

    International Nuclear Information System (INIS)

    Byung Jin Jun

    1992-11-01

    A PC-based system which can be applied to measure some important parameters of research reactors has been developed. It includes a multi-counter, a multi-scaler, a reactivity computer, control rod drop time measurement, and power calibration, and it resolves the practical difficulties that arise when counters are used. Analog data acquisition is relatively well known and has not been included in this work. The multi-counter and multi-scaler are basic tools for many kinds of research reactor experiments and replace conventional counter modules and MCAs. The reactivity computer can accommodate virtually all reactor power monitors. As the start-up channels can also be used, it can give reactivity values at the earliest stage of new reactor commissioning and can be used in critical assemblies as well. The control rod drop time measuring programme replaces the use of a memory oscilloscope. The power calibration programme offers improved accuracy and convenience. Refs, figs, 1 tab
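
    Reactivity computers of this kind conventionally evaluate reactivity from the measured power history by inverse point kinetics; the sketch below shows that standard calculation with six delayed-neutron groups, using typical thermal-fission kinetics data purely for illustration rather than any specific reactor's parameters.

    ```python
    import numpy as np

    # Inverse point kinetics with six delayed-neutron groups, the standard
    # algorithm behind reactivity computers. Kinetics data below are typical
    # U-235 thermal values, used purely for illustration.
    beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
    lam_i  = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])   # 1/s
    beta, LAMBDA = beta_i.sum(), 2.0e-5                              # s

    def reactivity(t, n):
        """Reactivity history (in dollars) from a measured power trace n(t)."""
        dt = np.diff(t)
        C = beta_i / (LAMBDA * lam_i) * n[0]       # equilibrium precursors
        rho = np.zeros_like(n)
        for k in range(1, n.size):
            # update precursors (implicit Euler), then invert the kinetics eq.
            C = (C + dt[k-1] * beta_i / LAMBDA * n[k]) / (1.0 + dt[k-1] * lam_i)
            dndt = (n[k] - n[k-1]) / dt[k-1]
            rho[k] = beta + LAMBDA * dndt / n[k] - LAMBDA / n[k] * (lam_i * C).sum()
        return rho / beta                           # dollars

    t = np.linspace(0.0, 10.0, 1001)
    n = np.exp(t / 50.0)                            # assumed measured power rise
    print(f"rho at t = 10 s: {reactivity(t, n)[-1]:.3f} $")
    ```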

  9. Development of COMPAS, computer aided process flowsheet design and analysis system of nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Homma, Shunji; Sakamoto, Susumu; Takanashi, Mitsuhiro; Nammo, Akihiko; Satoh, Yoshihiro; Soejima, Takayuki; Koga, Jiro; Matsumoto, Shiro

    1995-01-01

    A computer aided process flowsheet design and analysis system, COMPAS, has been developed in order to carry out flowsheet calculations on the process flow diagram of nuclear fuel reprocessing. All equipment units in the process flowsheet diagram, such as dissolvers and mixer-settlers, are graphically visualized as icons on the bitmap display of a UNIX workstation. Drawing a flowsheet can be carried out easily by mouse operation. Both published numerical simulation codes and a user's original code can be used with COMPAS. Specifications of the equipment and the concentrations of components in the streams are displayed as tables that the user can edit. Results of the calculation can also be displayed graphically. Two examples show that COMPAS is applicable to deciding operating conditions of the Purex process and to analyzing extraction behavior in a mixer-settler extractor. (author)

  10. Space Station Simulation Computer System (SCS) study for NASA/MSFC. Phased development plan

    Science.gov (United States)

    1990-01-01

    NASA's Space Station Freedom Program (SSFP) planning efforts have identified a need for a payload training simulator system to serve as both a training facility and as a demonstrator to validate operational concepts. The envisioned MSFC Payload Training Complex (PTC) required to meet this need will train the Space Station payload scientists, station scientists and ground controllers to operate the wide variety of experiments that will be onboard the Space Station Freedom. The Simulation Computer System (SCS) is made up of computer hardware, software, and workstations that will support the Payload Training Complex at MSFC. The purpose of this SCS Study is to investigate issues related to the SCS, alternative requirements, simulator approaches, and state-of-the-art technologies to develop candidate concepts and designs.

  11. The development of an oscilloscope visualization system for the hybrid computer E.A.I. 8900

    International Nuclear Information System (INIS)

    Djukanovic, Radojka

    1970-01-01

    This report formed the first subject of a thesis submitted to the Faculté des Sciences in Paris on 30 June 1970 by Mrs Radojka Djukanovic-Remsak in order to obtain the degree of Doctor-Engineer. A visualization system was studied and developed whereby various figures could be displayed, by means of points and segments, on an oscilloscope screen without a memory. This system was realized using the analog and logic elements of the E.A.I. 8800 analog computer and a series of programs intended to be used in conjunction with the E.A.I. 8400 digital computer. The second subject, 'The evolution of multiprogramming', was dealt with in note CEA-N-1346. (author)

  12. DCA++: A case for science driven application development for leadership computing platforms

    International Nuclear Information System (INIS)

    Summers, Michael S; Alvarez, Gonzalo; Meredith, Jeremy; Maier, Thomas A; Schulthess, Thomas C

    2009-01-01

    The DCA++ code was one of the early science applications that ran on Jaguar at the National Center for Computational Sciences, and the first application code to sustain a petaflop/s under production conditions on a general-purpose supercomputer. The code implements a quantum cluster method with a quantum Monte Carlo kernel to solve the 2D Hubbard model for high-temperature superconductivity. It is implemented in C++, making heavy use of the generic programming model. In this paper, we discuss how this code was developed, reaching scalability and high efficiency on the world's fastest supercomputer in only a few years. We show how the use of generic concepts combined with systematic refactoring of codes is a better strategy for computational sciences than a comprehensive upfront design.
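
    For reference, the model solved by DCA++ is the single-band 2D Hubbard Hamiltonian in its standard form:

    ```latex
    % Single-band 2D Hubbard model: t = nearest-neighbor hopping,
    % U = on-site Coulomb repulsion, n_{i\sigma} = c^{\dagger}_{i\sigma} c_{i\sigma}.
    H = -t \sum_{\langle i,j \rangle, \sigma}
          \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \mathrm{h.c.} \right)
        + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
    ```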

  13. The development of a computational platform to design and simulate on-board hydrogen storage systems

    DEFF Research Database (Denmark)

    Mazzucco, Andrea; Rokni, Masoud

    2017-01-01

    A computational platform is developed in the Modelica® language within the Dymola™ environment to provide a tool for the design and performance comparison of on-board hydrogen storage systems. The platform has been coupled with an open source library for hydrogen fueling stations to investigate the vehicular tank within the frame of a complete refueling system. The two technologies that are integrated in the platform are solid-state hydrogen storage in the form of metal hydrides and compressed gas systems. In this work the computational platform is used to compare the storage performance of two tank designs based on the tubular tank configuration with Ti1.1CrMn as the absorbing alloy. Results show that a shell and tube layout with metal hydride tubes of 2 mm inner diameter achieves the desired refueling time of 3 min and stores a maximum of 3.1 kg of hydrogen in a 126 L tank, corresponding...

  14. The development of a distributed computing environment for the design and modeling of plasma spectroscopy experiments

    International Nuclear Information System (INIS)

    Nash, J.K.; Eme, W.G.; Lee, R.W.; Salter, J.M.

    1994-10-01

    The design and analysis of plasma spectroscopy experiments can be significantly complicated by relatively routine computational tasks arising from the massive amount of data encountered in the experimental design and analysis stages of the work. Difficulties in obtaining, computing, manipulating and visualizing the information represent not simply an issue of convenience -- they have a very real limiting effect on the final quality of the data and on the potential for arriving at meaningful conclusions regarding an experiment. We describe ongoing work in developing a portable UNIX environment shell with the goal of simplifying and enabling these activities for the plasma-modeling community. Applications to the construction of atomic kinetics models and to the analysis of x-ray transmission spectroscopy will be shown

  15. Development and clinical study of mobile 12-lead electrocardiography based on cloud computing for cardiac emergency.

    Science.gov (United States)

    Fujita, Hideo; Uchimura, Yuji; Waki, Kayo; Omae, Koji; Takeuchi, Ichiro; Ohe, Kazuhiko

    2013-01-01

    To improve emergency services for accurate diagnosis of cardiac emergencies, we developed "Cloud Cardiology®", a new low-cost mobile electrocardiography system based upon cloud computing for prehospital diagnosis. It comprises a compact 12-lead ECG unit equipped with Bluetooth and an Android smartphone with an application for transmission. The cloud server enables us to share ECGs simultaneously inside and outside the hospital. We evaluated the clinical effectiveness of this system in a rapid response car in real emergency service settings by conducting a clinical trial with historical comparison. We found that this system has the ability to shorten the onset-to-balloon time of patients with acute myocardial infarction, resulting in better clinical outcomes. Here we propose that cloud-computing-based simultaneous data sharing could be a powerful solution for emergency cardiology services, in view of its significant clinical outcome.

  16. Usages of Computers and Smartphones to Develop Dementia Care Education Program for Asian American Family Caregivers.

    Science.gov (United States)

    Lee, Jung-Ah; Nguyen, Hannah; Park, Joan; Tran, Linh; Nguyen, Trang; Huynh, Yen

    2017-10-01

    Families of ethnic minority persons with dementia often seek help at later stages of the disease. Little is known about the effectiveness of various methods in supporting ethnic minority dementia patients' caregivers. The objective of the study was to identify smartphone and computer usage among family caregivers of dementia patients (i.e., Korean and Vietnamese Americans) to develop dementia-care education programs for them. Participants were asked various questions related to their computer or smartphone usage in conjunction with needs-assessment interviews. Flyers were distributed at two ethnic minority community centers in Southern California. Snowball recruitment was also utilized to reach out to the families of dementia patients dwelling in the community. Thirty-five family caregivers, including 20 Vietnamese and 15 Korean individuals, participated in this survey. Thirty participants (30 of 35, 85.7%) were computer users. Among those, 76.7% (23 of 30) reported daily usage and 53% (16 of 30) claimed to use social media. A majority of the participants (31 of 35, 88.6%) reported that they owned smartphones. More than half of smartphone users (18 of 29, 62%) claimed to use social media applications. Many participants claimed that they could not attend in-class education due to caregiving and/or transportation issues. Most family caregivers of dementia patients use smartphones more often than computers, and more than half of those caregivers communicate with others through social media apps. A smartphone-app-based caregiver intervention may serve as a more effective approach compared to the conventional in-class method. Multiple modalities for the development of caregiver interventions should be considered.

  17. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC); Rivera, Michael K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC)

    2014-04-01

    This report documents progress made on continued developments of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC improved the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.

  18. Development of computer-controlled ultrasonic image processing system for severe accidents research

    International Nuclear Information System (INIS)

    Koo, Kil Mo; Kang, Kyung Ho; Kim, Jong Tai; Kim, Jong Whan; Cho, Young Ro; Ha, Kwang Soon; Park, Rae Jun; Kim, Sang Baik; Kim, Hee Dong; Sim, Chul Moo

    2000-07-01

    In order to verify the in-vessel corium cooling mechanism, the LAVA (Lower-plenum Arrested Vessel Attack) experiment is being performed as a first-stage proof-of-principle test. The aims of this study are to find a gap formation between the corium melt and the reactor lower head vessel, to verify the principle of the gap formation, and to analyze the effect of the gap formation on the thermal behavior of the corium melt and the lower plenum. This report aims at developing a computer-controlled image signal processing system able to improve visualization and to measure the gap distribution as a 3-dimensional planar image, using a time-domain signal analysis method (one of the ultrasonic pulse-echo methods) together with a computerized position control system. The image signal processing system was developed by independently developing an ultrasonic image signal processing technique and a PC-controlled position control system and then combining the two.

  19. Computer Sciences Applied to Management at Open University of Catalonia: Development of Competences of Teamworks

    Science.gov (United States)

    Pisa, Carlos Cabañero; López, Enric Serradell

    Teamwork is considered one of the most important professional skills in today's business environment. More specifically, collaborative work between professionals and information technology managers from various functional areas is a strategic key in competitive business. Several university-level programs are focusing on developing these skills. This article presents the case of the course Computer Science Applied to Management (hereafter CSAM), which has been designed with the objective of developing the ability to work cooperatively in interdisciplinary teams. Its design and development address the key elements of efficiency that appear in the literature, most notably the establishment of shared objectives and a feedback system, the management of the harmony of the team, and the team's level of autonomy, independence, diversity, and supervision. The final result is a course in which, through a virtual working platform, interdisciplinary teams solve a problem raised by a case study.

  20. Development of indigenous USB based ICT-controller for industrial computed tomography scanner

    International Nuclear Information System (INIS)

    Walinjkar, Parag; Umesh Kumar

    2014-01-01

    In Industrial Computed Tomography (ICT), the quality of the tomographic image depends on the accuracy of the data/measurements. The Isotope Production and Applications Division (IP and AD) is a pioneer in this field and is equipped with an advanced ICT facility using gamma rays as well as X-rays. An ICT-controller has been developed indigenously for the parallel-beam scanning technique to control the scanning and data acquisition process automatically as per user requirements. The process of scanning and data collection has been automated using a commercially available USB module. The acquired raw data are then processed and a tomographic image of the specimen reconstructed to test the operational performance of the ICT-controller. This paper describes the development of the ICT-controller and the tests carried out to confirm its successful development. (author)

  1. The Impact of Computer Science on the Development of Oulu ICT during 1985-1990

    Science.gov (United States)

    Oinas-Kukkonen, Henry; Similä, Jouni; Pulli, Petri; Oinas-Kukkonen, Harri; Kerola, Pentti

    The region of Oulu has been emphasizing the importance of the electronics industry for its business growth since the 1960s. After a pitch-dark recession, the region developed in the 1990s into a new, well-established hub of information and communication technology (ICT) in Finland. The city, with its 100,000 inhabitants, was home to nearly 10,000 ICT professionals in 1995. This article contributes to the body of research knowledge by analyzing the role of computer science, in particular information systems and software engineering, in the development of the ICT industry in Oulu in the latter half of the 1980s. This analysis is based on a variety of both primary and secondary sources. The article suggests that system-theoretical and software-oriented research expertise played a key role in the rapid and successful ICT business development of the Oulu region.

  2. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    Directory of Open Access Journals (Sweden)

    Gervasio Varela

    2016-07-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location.
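
    A minimal sketch of the kind of abstraction such a framework provides is shown below in Python; the interface and the two device adapters are invented for illustration and do not reflect the framework's actual API.

    ```python
    from abc import ABC, abstractmethod

    # Illustrative UI abstraction: application code talks to one abstract
    # output interaction, and concrete adapters bind it to whatever device is
    # present. Class and method names are invented, not the framework's API.

    class OutputModality(ABC):
        @abstractmethod
        def present(self, message: str) -> None: ...

    class ScreenAdapter(OutputModality):
        def present(self, message: str) -> None:
            print(f"[screen] {message}")      # e.g. wall display in the room

    class SpeechAdapter(OutputModality):
        def present(self, message: str) -> None:
            print(f"[speech] {message}")      # e.g. TTS for a visually
                                              # impaired user

    def notify(user_prefers_audio: bool, message: str) -> None:
        # Binding to a concrete device is deferred to runtime/deployment.
        device: OutputModality = (SpeechAdapter() if user_prefers_audio
                                  else ScreenAdapter())
        device.present(message)

    notify(True, "Front door unlocked")
    ```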

  3. Development of a lack of appetite item bank for computer-adaptive testing (CAT)

    DEFF Research Database (Denmark)

    Thamsborg, Lise Laurberg Holst; Petersen, Morten Aa; Aaronson, Neil K

    2015-01-01

    PURPOSE: A significant proportion of oncological patients experiences lack of appetite. Precise measurement is relevant to improve the management of lack of appetite. The so-called computer-adaptive test (CAT) allows for adaptation of the questionnaire to the individual patient, thereby optimizing measurement precision. The EORTC Quality of Life Group is developing a CAT version of the widely used EORTC QLQ-C30 questionnaire. Here, we report on the development of the lack of appetite CAT. METHODS: The EORTC approach to CAT development comprises four phases: literature search, operationalization, pre-testing, and field testing. Phases 1-3 are described in this paper. First, a list of items was retrieved from the literature. This was refined, deleting redundant and irrelevant items. Next, new items fitting the "QLQ-C30 item style" were created. These were evaluated by international samples of experts and cancer patients.
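
    For orientation, a CAT engine typically re-estimates the respondent's latent score after each answer and administers the next item with the highest Fisher information at that score; the sketch below shows this selection step under a two-parameter logistic model, with invented item parameters (the EORTC instrument itself uses a different, polytomous model).

    ```python
    import numpy as np

    # Generic CAT item-selection step under a 2PL model (illustrative only;
    # item parameters are invented, and the EORTC CAT uses a polytomous model).

    def p_correct(theta, a, b):
        """2PL response probability at latent score theta."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def fisher_information(theta, a, b):
        p = p_correct(theta, a, b)
        return a**2 * p * (1.0 - p)

    # Item bank: discrimination a and difficulty b per item (made-up values).
    bank = {"appetite_1": (1.2, -0.5), "appetite_2": (0.8, 0.3),
            "appetite_3": (1.6, 1.0)}

    def next_item(theta_hat, administered):
        """Pick the unadministered item most informative at theta_hat."""
        candidates = {k: v for k, v in bank.items() if k not in administered}
        return max(candidates,
                   key=lambda k: fisher_information(theta_hat, *candidates[k]))

    print(next_item(theta_hat=0.8, administered={"appetite_1"}))  # appetite_3
    ```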

  5. Knowledge-based computational intelligence development for predicting protein secondary structures from sequences.

    Science.gov (United States)

    Shen, Hong-Bin; Yi, Dong-Liang; Yao, Li-Xiu; Yang, Jie; Chou, Kuo-Chen

    2008-10-01

    In the postgenomic age, with the avalanche of protein sequences generated and relatively slow progress in determining their structures by experiment, it is important to develop automated methods to predict the structure of a protein from its sequence. The membrane proteins are a special group in the protein family that accounts for approximately 30% of all proteins; however, solved membrane protein structures represent less than 1% of known protein structures to date. Although great success has been achieved in developing computational intelligence techniques to predict secondary structures in both globular and membrane proteins, much challenging work remains in this regard. In this review article, we first summarize recent progress in automated methodology for predicting protein secondary structures, especially in membrane proteins; we then give some future directions in this research field.

  6. Overcoming Chemical, Biological, and Computational Challenges in the Development of Inhibitors Targeting Protein-Protein Interactions

    Science.gov (United States)

    Laraia, Luca; McKenzie, Grahame; Spring, David R.; Venkitaraman, Ashok R.; Huggins, David J.

    2015-01-01

    Protein-protein interactions (PPIs) underlie the majority of biological processes, signaling, and disease. Approaches to modulate PPIs with small molecules have therefore attracted increasing interest over the past decade. However, there are a number of challenges inherent in developing small-molecule PPI inhibitors that have prevented these approaches from reaching their full potential. From target validation to small-molecule screening and lead optimization, identifying therapeutically relevant PPIs that can be successfully modulated by small molecules is not a simple task. Following the recent review by Arkin et al., which summarized the lessons learnt from prior successes, we focus in this article on the specific challenges of developing PPI inhibitors and detail the recent advances in chemistry, biology, and computation that facilitate overcoming them. We conclude by providing a perspective on the field and outlining four innovations that we see as key enabling steps for successful development of small-molecule inhibitors targeting PPIs. PMID:26091166

  7. Development of advanced computational fluid dynamics tools and their application to simulation of internal turbulent flows

    Science.gov (United States)

    Emelyanov, V. N.; Karpenko, A. G.; Volkov, K. N.

    2015-06-01

    Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of using GPUs for the simulation of internal fluid flows are discussed. The finite volume method is applied to solve three-dimensional (3D) unsteady compressible Euler and Navier-Stokes equations on unstructured meshes. Compute Unified Device Architecture (CUDA) technology is used for the programming implementation of parallel computational algorithms. Solutions of some fluid dynamics problems on GPUs are presented, and approaches to optimization of the CFD code related to the use of different types of memory are discussed. Speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared for different meshes and different methods of distributing input data into blocks. Performance measurements show that the numerical schemes developed achieve a 20 to 50 times speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.

  8. Development of tight-binding, chemical-reaction-dynamics simulator for combinatorial computational chemistry

    International Nuclear Information System (INIS)

    Kubo, Momoji; Ando, Minako; Sakahara, Satoshi; Jung, Changho; Seki, Kotaro; Kusagaya, Tomonori; Endou, Akira; Takami, Seiichi; Imamura, Akira; Miyamoto, Akira

    2004-01-01

    Recently, we have proposed a new concept called 'combinatorial computational chemistry' to realize theoretical, high-throughput screening of catalysts and materials. We have already applied our combinatorial computational-chemistry approach, mainly based on static first-principles calculations, to various catalyst and material systems, and its applicability to catalyst and material design was strongly confirmed. In order to realize more effective and efficient combinatorial computational-chemistry screening, a high-speed chemical-reaction-dynamics simulator based on the quantum-chemical molecular-dynamics method is essential. However, to the best of our knowledge, no existing chemical-reaction-dynamics simulator is fast enough to perform such high-throughput screening. In the present study, we have succeeded in developing a chemical-reaction-dynamics simulator based on our original tight-binding quantum-chemical molecular-dynamics method, which is more than 5000 times faster than the regular first-principles molecular-dynamics method. Moreover, its applicability and effectiveness for the atomistic clarification of methanol-synthesis dynamics at reaction temperature were demonstrated.

  9. Computational and experimental approaches for development of methotrexate nanosuspensions by bottom-up nanoprecipitation.

    Science.gov (United States)

    Dos Santos, Aline Martins; Carvalho, Flávia Chiva; Teixeira, Deiver Alessandro; Azevedo, David Lima; de Barros, Wander Miguel; Gremião, Maria Palmira Daflon

    2017-05-30

    Development of nanosuspensions offers a promising tool for formulations involving poorly water-soluble drugs. In this study, methotrexate (MTX) nanosuspensions were prepared using a bottom-up process based on acid-base neutralization reactions. Computational studies were performed to determine structural and electronic properties of isolated molecules and molecular clusters in order to evaluate the mechanism of MTX nanoparticle formation. Computational results indicated that the clusters in the zwitterionic and cationic states presented larger dimensions and higher energies of interaction between MTX molecules, which favored aggregation. In contrast, the clusters in the anionic state exhibited lower energies of interaction, indicating aggregation was less likely to occur. Experimental results indicated that the higher the HCl proportion during drug precipitation, the greater the particle size, resulting in micrometric particles (2874-7308 nm) (cationic and zwitterionic forms). However, MTX nanoparticles ranging in size from 132 to 186 nm were formed using the lowest HCl proportion during drug precipitation (anionic form). In vitro release profiles indicated that the drug release rate from the nanosuspension was increased (approximately 2.6 times) over that of the raw material. Overall, computational modeling and experimental analysis were complementary and assisted in the rational design of the nanosuspension based on acid-base reactions. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Development of easy-to-use interface for nuclear transmutation computing, VCINDER code

    Directory of Open Access Journals (Sweden)

    Oyeon Kum

    2018-02-01

    The CINDER code has about 60 years of development history and is thus one of the world's best transmutation computing codes to date. Unfortunately, it is complex and cumbersome to use. Preparing auxiliary input files for activation computation from MCNPX output and executing them using a Perl script (the activation script) is the first difficulty, and the separate gamma-source computing script (the gamma script), which analyzes the spectra files produced by the CINDER code and creates a source-definition format for the MCNPX code, is the second difficulty. In addition, for highly nonlinear problems, multiple human interventions may increase the possibility of errors. Postprocessing, such as making plots from large text outputs, is also time consuming. One way to overcome these limitations is to make a graphical user interface wrapper that includes all codes, such as MCNPX and CINDER, and all scripts, with a visual C#.NET tool. The graphical user interface merges all the codes and provides easy postprocessing of graphics data and Microsoft Office tools, such as Excel sheets, which make the CINDER code easy to use. This study describes the VCINDER code (written in visual C#.NET) and gives a typical application example.
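
    The workflow that such a wrapper automates can be pictured as a simple pipeline; the executable and file names below are placeholders for illustration, not the actual VCINDER internals or tool invocations.

    ```python
    import subprocess  # used when the run call below is enabled

    # Illustrative automation of the MCNPX -> activation -> CINDER -> gamma
    # loop that a GUI wrapper like VCINDER hides from the user. Command and
    # file names are placeholders, not the actual tool invocations.
    steps = [
        ["mcnpx", "i=transport.inp"],        # transport run producing flux output
        ["perl", "activation.pl", "out.m"],  # build CINDER activation input
        ["cinder", "activation.inp"],        # transmutation/activation solve
        ["perl", "gamma.pl", "spectra.dat"], # spectra -> MCNPX source definition
    ]

    for cmd in steps:
        print("running:", " ".join(cmd))
        # subprocess.run(cmd, check=True)    # uncomment where the codes exist
    ```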

  11. Developing a computer delivered, theory based intervention for guideline implementation in general practice

    Directory of Open Access Journals (Sweden)

    Ashworth Mark

    2010-11-01

    Background: Non-adherence to clinical guidelines has been identified as a consistent finding in general practice. The purpose of this study was to develop theory-informed, computer-delivered interventions to promote the implementation of guidelines in general practice. Specifically, our aim was to develop computer-delivered prompts to promote guideline adherence for antibiotic prescribing in respiratory tract infections (RTIs), and adherence to recommendations for secondary stroke prevention. Methods: A qualitative design was used involving 33 face-to-face interviews with general practitioners (GPs). The prompts used in the interventions were initially developed using aspects of social cognitive theory, drawing on nationally recommended standards for clinical content. The prompts were then presented to GPs during interviews, and iteratively modified and refined based on interview feedback. Inductive thematic analysis was employed to identify responses to the prompts and factors involved in the decision to use them. Results: GPs reported being more likely to use the prompts if they were perceived as offering support and choice, but less likely to use them if they were perceived as being a method of enforcement. Attitudes towards using the prompts were also related to anticipated patient outcomes, individual prescriber differences, accessibility and presentation of prompts, and acceptability of guidelines. Comments on the prompts were largely positive after modifying them based on participant feedback. Conclusions: Acceptability of and satisfaction with computer-delivered prompts to follow guidelines may be increased by working with practitioners to ensure that the prompts will be perceived as valuable tools that can support GPs' practice.

  12. Do more intelligent brains retain heightened plasticity for longer in development? A computational investigation.

    Science.gov (United States)

    Thomas, Michael S C

    2016-06-01

    Twin studies indicate that the heritability of general cognitive ability - the genetic contribution to individual differences - increases with age. Brant et al. (2013) reported that this increase in heritability occurs earlier in development for low ability children than for high ability children. Allied with structural brain imaging results that indicate faster thickening and thinning of cortex for high ability children (Shaw et al., 2006), Brant and colleagues argued that higher cognitive ability represents an extended sensitive period for brain development. However, they admitted that no coherent mechanistic account can currently reconcile the key empirical data. Here, computational methods are employed to demonstrate that the empirical data can be reconciled without recourse to variations in sensitive periods. These methods utilized population-based artificial neural network models of cognitive development. In the model, ability-related variations stemmed from the timing of the increases in the non-linearity of computational processes, causing dizygotic twins to diverge in their behavior. These occurred in a population where: (a) ability was determined by the combined small contributions of many neurocomputational factors, and (b) individual differences in ability were largely genetically constrained. The model's explanation of developmental increases in heritability contrasts with proposals that these increases represent emerging gene-environment correlations (Haworth et al., 2010). The article advocates simulating inherited individual differences within an explicitly developmental framework. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  13. Development of a computer program for the cost analysis of spent fuel management

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Lee, Jong Youl; Choi, Jong Won; Cha, Jeong Hun; Whang, Joo Ho

    2009-01-01

    So far, a substantial amount of spent fuel has been generated from the PWR and CANDU reactors. It is being temporarily stored at the nuclear power plant sites. It is expected that the temporary storage facilities will be full of spent fuel by around 2016. The government plans to solve the problem by constructing an interim storage facility soon. The radioactive waste management act was enacted in 2008 to manage spent fuel safely in Korea. According to the act, the radioactive waste management fund, which will be used for the transportation, interim storage, and final disposal of spent fuel, has been established. The cost for the management of spent fuel is surprisingly high and involves considerable uncertainty. KAERI and Kyunghee University have developed cost estimation tools to evaluate the cost of spent fuel management based on engineering design and calculation. It is not easy to develop a tool for cost estimation in a situation where the national policy on spent fuel management has not yet been fixed. Thus, the current version of the computer program is based on the current conceptual design of each management system. The main purpose of this paper is to introduce the computer program developed for the cost analysis of spent fuel management. In order to show the application of the program, a spent fuel management scenario is prepared and the cost for the scenario is estimated

  15. Developing a coding scheme for detecting usability and fun problems in computer games for young children.

    Science.gov (United States)

    Barendregt, W; Bekker, M M

    2006-08-01

    This article describes the development and assessment of a coding scheme for finding both usability and fun problems through observations of young children playing computer games during user tests. The proposed coding scheme is based on an existing list of breakdown indication types of the detailed video analysis method (DEVAN). This method was developed to detect usability problems in task-based products for adults. However, the new coding scheme for children's computer games takes into account that in games, fun, in addition to usability, is an important factor and that children behave differently from adults. Therefore, the proposed coding scheme uses 8 of the 14 original breakdown indications and has 7 new indications. The article first discusses the development of the new coding scheme. Subsequently, the article describes the reliability assessment of the coding scheme. The any-two agreement measure of 38.5% shows that thresholds for when certain user behavior is worth coding will be different for different evaluators. However, the any-two agreement of 92% for a fixed list of observation points shows that the distinction between the available codes is clear to most evaluators. Finally, a pilot study shows that training can increase any-two agreement considerably by decreasing the number of unique observations in comparison with the number of agreed-upon observations.

  16. Software of image processing system on the JINR basic computers and problems of its further development

    International Nuclear Information System (INIS)

    Ivanov, V.G.

    1978-01-01

    To process picture information on the BESM-6 and CDC-6500 computers, the Joint Institute for Nuclear Research has developed a set of programs which enable the user to reconstruct a spatial picture of measured events, calculate track parameters, kinematically identify the events, and select the most probable hypothesis for each event. Wide-scale use of programs which process picture data obtained via various track chambers requires many different versions of each program. For this purpose, a special program, the PATCHY editor, has been developed to update, edit, and assemble large programs. Accordingly, a partitioned structure of the programs has been chosen, which considerably reduces programming time. Basic problems of picture processing software are discussed, and it is pointed out that the availability of terminal equipment for the BESM-6 and CDC-6500 computers will help to increase processing speed and to implement an interactive mode. It is also planned to develop a training system to help the user learn how to use the programs of the system.

  17. Computational Investigation on Fully Developed Periodic Laminar Flow Structure in Baffled Circular Tube with Various BR

    Directory of Open Access Journals (Sweden)

    Withada Jedsadaratanachai

    2014-01-01

    This paper presents a 3D numerical analysis of fully developed periodic laminar flow in a circular tube fitted with 45° inclined baffles in an inline arrangement. The computations are based on a finite volume method, and the SIMPLE algorithm has been implemented. The characteristics of fluid flow are presented for Reynolds numbers Re = 100-1000, based on the hydraulic diameter (D) of the tube. The angled baffles were repeatedly inserted at the middle of the test tube in an inline arrangement to generate vortex flows over the tested tube. Effects of different Reynolds numbers and blockage ratios (b/D, BR) with a single pitch ratio of 1 on the flow structure in the tested tube were emphasized. The flows in the baffled tube become periodic at x/D ≈ 2-3 and reach fully developed periodic flow profiles at x/D ≈ 6-7, depending on Re, BR, and the transverse plane position. The computational results reveal that the higher the BR and the closer the turbulators, the sooner fully developed periodic flow profiles are reached.
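
    The Reynolds number here is based on the hydraulic diameter in the usual way (standard definition, stated for completeness; the paper's exact reference velocity is assumed to be the mean axial velocity):

    ```latex
    % Reynolds number based on hydraulic diameter D:
    % \rho = density, \bar{u} = mean axial velocity, \mu = dynamic viscosity.
    \mathrm{Re} = \frac{\rho \, \bar{u} \, D}{\mu}
    ```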

  18. Development of an educational partnership for enhancement of a computer risk assessment model

    International Nuclear Information System (INIS)

    Topper, K.

    1995-02-01

    The Multimedia Environmental Pollutant Assessment System (MEPAS) is a computer program which evaluates exposure pathways for chemical and radioactive releases according to their potential human health impacts. MEPAS simulates the exposure pathways through standard source-to-receptor transport principles using a multimedia approach (air, groundwater, overland flow, soil, surface water) in conjunction with specific chemical exposure considerations. This model was originally developed by Pacific Northwest Laboratory (PNL) to prioritize environmental concerns at potentially contaminated US Department of Energy (DOE) sites. Currently MEPAS is being used to evaluate a range of environmental problems which are not restricted to DOE sites. A partnership was developed between PNL and Mesa State College during 1991. This partnership involves the use of undergraduate students, faculty, and PNL personnel to complete enhancements to MEPAS. This has led to major refinements of the original MEPAS shell for DOE in a very cost-effective manner. PNL was awarded a 1993 Federal Laboratory Consortium Award, and Mesa State College was awarded an Environmental Restoration and Waste Management Distinguished Faculty Award from DOE in 1993 as a result of this collaboration. The college has benefited through the use of MEPAS within laboratories and through the applied experience gained by the students. The development of this partnership will be presented with the goal of allowing other DOE facilities to replicate this program. It is specifically recommended that DOE establish funded programs which support this type of relationship on an ongoing basis. Additionally, specific enhancements to MEPAS will be presented through computer display of the program.

  19. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called the Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts in finding multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts to conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.

  20. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  1. Metalloporphyrin catalysts for oxygen reduction developed using computer-aided molecular design

    Energy Technology Data Exchange (ETDEWEB)

    Ryba, G.N.; Hobbs, J.D.; Shelnutt, J.A. [and others]

    1996-04-01

    The objective of this project is the development of a new class of metalloporphyrin materials for use as catalysts in fuel cell applications. The metalloporphyrins are excellent candidates for use as catalysts at both the anode and the cathode. The catalysts reduce oxygen in 1 M potassium hydroxide as well as in 2 M sulfuric acid. Covalent attachment to carbon supports is being investigated. The computer-aided molecular design is an iterative process, in which experimental results feed back into the design of future catalysts.

  2. The model of localized business community economic development under limited financial resources: computer model and experiment

    Directory of Open Access Journals (Sweden)

    Berg Dmitry

    2016-01-01

    Globalization processes now affect and are affected by most organizations, many kinds of resources, and the natural environment. One of the main restrictions imposed by these processes is financial: money turnover in global markets leads to its concentration in certain financial centers, while local business communities suffer from a lack of money. This work discusses the advantages of introducing a complementary currency into a local economy. Computer simulation with the engineered program model and a real economic experiment proved that the complementary currency does not compete with the traditional currency; furthermore, it acts in concert with it, providing conditions for sustainable business community development.

  3. Considerations upon Testing the Children’s Somatic Development, Using the Computer

    Directory of Open Access Journals (Sweden)

    Carla Amira Karnyanszky

    2006-01-01

    The aim of this study is to analyze ways of assessing children's somatic-physical development. Two methods were evaluated: comparison of the measured dimensions against mean values obtained from the specialized literature, and comparison of calculated proportionality indices against the mean indices for the population. Furthermore, this paper is based on a computer program that stores the history of children's growth, suitable for use in surveys of large communities (schools, medical centers, sport selection).

  4. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of π-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and an electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure-morphology-property relationships when used in tandem with experimental results.

  5. The PRIMA (PRoton IMAging) collaboration: Development of a proton Computed Tomography apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Scaringella, M., E-mail: scaringella@gmail.com [Dipartimento di Ingegneria Industriale, Università di Firenze, Firenze (Italy); Brianzi, M. [INFN—Sezione di Firenze, Firenze (Italy); Bruzzi, M. [INFN—Sezione di Firenze, Firenze (Italy); Dipartimento di Fisica e Astronomia, Università di Firenze, Firenze (Italy); Bucciolini, M. [INFN—Sezione di Firenze, Firenze (Italy); Dipartimento di Scienze biomediche, sperimentali e cliniche, Università di Firenze, Firenze (Italy); SOD Fisica Medica, Azienda Ospedaliero-Universitaria Careggi, Firenze (Italy); Carpinelli, M. [Dipartimento di Chimica e Farmacia, Università di Sassari, Sassari (Italy); INFN sezione di Cagliari, Cagliari (Italy); Cirrone, G.A.P. [INFN—Laboratori Nazionali del Sud, Catania (Italy); Civinini, C. [INFN—Sezione di Firenze, Firenze (Italy); Cuttone, G. [INFN—Laboratori Nazionali del Sud, Catania (Italy); Lo Presti, D. [INFN—Sezione di Catania, Catania (Italy); Dipartimento di Fisica, Università di Catania, Catania (Italy); Pallotta, S. [INFN—Sezione di Firenze, Firenze (Italy); Dipartimento di Scienze biomediche, sperimentali e cliniche, Università di Firenze, Firenze (Italy); SOD Fisica Medica, Azienda Ospedaliero-Universitaria Careggi, Firenze (Italy); Pugliatti, C. [INFN—Sezione di Catania, Catania (Italy); Dipartimento di Fisica, Università di Catania, Catania (Italy); Randazzo, N. [INFN—Sezione di Catania, Catania (Italy); Romano, F. [Centro Studi e Ricerche e Museo Storico della Fisica, Rome (Italy); Sipala, V. [Dipartimento di Chimica e Farmacia, Università di Sassari, Sassari (Italy); INFN sezione di Cagliari, Cagliari (Italy); and others

    2013-12-01

    This paper describes the development of a proton Computed Tomography (pCT) apparatus able to reconstruct a stopping-power map useful for accurate proton therapy treatment planning and patient positioning. The system is based on two main components: a silicon microstrip tracker and a YAG:Ce crystal calorimeter. Each proton trajectory is sampled by the tracker at four points, two upstream and two downstream of the object under test, and the particle's residual energy is measured by the calorimeter. The apparatus is described in detail, together with a discussion of the characterization of the hardware under proton beams with energies up to 175 MeV.
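
    As a rough illustration of how a residual-energy measurement becomes a stopping-power estimate, the following sketch interpolates an assumed calibration curve to obtain a water-equivalent path length (WEPL); the calibration points are placeholders, not data from the PRIMA apparatus:

```python
import numpy as np

# Illustrative sketch only: convert the calorimeter's residual-energy reading
# into a water-equivalent path length (WEPL) using an assumed, pre-measured
# calibration curve. The calibration points below are hypothetical.
calib_residual_MeV = np.array([  0.0,  50.0, 100.0, 150.0, 175.0])
calib_wepl_mm      = np.array([210.0, 150.0,  80.0,  25.0,   0.0])

def wepl_from_residual(e_res_MeV: float) -> float:
    """Interpolate WEPL from residual energy (monotonically decreasing curve)."""
    return float(np.interp(e_res_MeV, calib_residual_MeV, calib_wepl_mm))

# For one proton event, the entry/exit points measured by the two tracker
# doublets give the path, and the residual energy gives the integrated
# stopping power (WEPL) along that path.
print(wepl_from_residual(120.0))
```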

  6. HIGH-FIDELITY SIMULATION-DRIVEN MODEL DEVELOPMENT FOR COARSE-GRAINED COMPUTATIONAL FLUID DYNAMICS

    Energy Technology Data Exchange (ETDEWEB)

    Hanna, Botros N.; Dinh, Nam T.; Bolotnov, Igor A.

    2016-06-01

    Nuclear reactor safety analysis requires identifying various credible accident scenarios and determining their consequences. For full-scale nuclear power plant system behavior, it is impossible to obtain sufficient experimental data covering the broad range of risk-significant accident scenarios. For single-phase convective flow problems, Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES) can provide high-fidelity results where physical data are unavailable. However, these methods are computationally expensive and cannot be afforded for simulating long transient scenarios in nuclear accidents, despite extraordinary advances in high-performance scientific computing over the past decades. The major issue is the inability to parallelize the transient computation in time, which makes the number of time steps required by high-fidelity methods unaffordable for long transients. In this work, we propose a high-fidelity simulation-driven approach to model the sub-grid-scale (SGS) effect in Coarse-Grained Computational Fluid Dynamics (CG-CFD). The approach aims to develop a statistical surrogate model instead of a deterministic SGS model. We chose to start with turbulent natural convection driven by volumetric heating in a horizontal fluid layer with a rigid, insulated lower boundary and an isothermal (cold) upper boundary. This unstably stratified scenario is relevant to turbulent natural convection in a molten corium pool during a severe nuclear reactor accident, as well as to containment mixing and passive cooling. The presented approach demonstrates how to correct the CG-CFD solution by modifying the energy balance equation; a global correction to the temperature equation achieves a significant improvement in the predicted steady-state temperature distribution through the fluid layer.
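
    A generic sketch of the surrogate idea, not the authors' actual model: a regression learned from high-fidelity data (synthetic stand-ins here) maps coarse-grid features to an SGS correction that would be added as a source term to the energy equation. The feature choices and the target function are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical features per coarse cell: local temperature, vertical
# temperature gradient, and cell size (all nondimensional stand-ins).
X = rng.uniform(size=(5000, 3))
# Target: the SGS correction, here a made-up nonlinear function plus noise,
# standing in for (high-fidelity heat flux) - (coarse-grid heat flux).
y = 0.5 * X[:, 1] ** 2 - 0.1 * X[:, 0] * X[:, 2] + 0.01 * rng.normal(size=5000)

# Fit the statistical surrogate on the "high-fidelity" training data.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# In a CG-CFD step, the predicted correction would enter the discretized
# energy balance of each coarse cell as an extra source term.
correction = surrogate.predict(X[:4])
print(correction)
```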

  7. Development of ANFIS models for air quality forecasting and input optimization for reducing the computational cost and time

    Science.gov (United States)

    Prasad, Kanchan; Gorai, Amit Kumar; Goyal, Pramila

    2016-03-01

    This study aims to develop adaptive neuro-fuzzy inference system (ANFIS) models for forecasting daily concentrations of five air pollutants [sulphur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3), and particulate matter (PM10)] in the atmosphere of a megacity (Howrah). Air pollution in Howrah is rising in parallel with economic growth, so monitoring, forecasting, and controlling it become increasingly important because of the health impact. ANFIS serves as a basis for constructing a set of fuzzy IF-THEN rules, with appropriate membership functions, to generate the stipulated input-output pairs. The ANFIS predictor takes the values of meteorological factors (pressure, temperature, relative humidity, dew point, visibility, wind speed, and precipitation) and the previous day's pollutant concentration, in different combinations, as inputs to predict the one-day-ahead and same-day pollutant concentrations. The concentrations of the five air pollutants and the seven meteorological parameters for Howrah during 2009 to 2011 were used to develop the ANFIS models. Collinearity tests were conducted to eliminate redundant input variables, and a forward selection (FS) method was used to select subsets of input variables; applying both techniques reduces the number of input variables and subsets, which in turn reduces computational cost and time. Model performance was evaluated using four statistical indices (coefficient of determination, normalized mean square error, index of agreement, and fractional bias).
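
    The four indices can be computed as follows; these are their commonly used textbook forms, which may differ in detail from the paper's exact definitions, and the observation/prediction values are made up:

```python
import numpy as np

def evaluation_indices(obs: np.ndarray, pred: np.ndarray) -> dict:
    """The four indices named in the abstract, in common textbook forms."""
    o_bar, p_bar = obs.mean(), pred.mean()
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2                  # coefficient of determination
    nmse = np.mean((obs - pred) ** 2) / (o_bar * p_bar)     # normalized mean square error
    ioa = 1 - np.sum((pred - obs) ** 2) / np.sum(
        (np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2)  # index of agreement
    fb = 2 * (o_bar - p_bar) / (o_bar + p_bar)              # fractional bias
    return {"R2": r2, "NMSE": nmse, "IOA": ioa, "FB": fb}

obs = np.array([42.0, 55.0, 61.0, 48.0, 70.0])   # e.g. observed PM10 (made-up)
pred = np.array([40.0, 58.0, 59.0, 50.0, 66.0])  # model predictions (made-up)
print(evaluation_indices(obs, pred))
```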

  8. Development of a simple photon emission computed tomography dedicated to the small animal

    International Nuclear Information System (INIS)

    Bekaert, V.

    2006-11-01

    The development of in vivo small-animal imaging is becoming essential for studying human pathologies. The ImaBio project of the Institut Pluridisciplinaire Hubert Curien (IPHC) fits into this process of developing new instruments for biomedical applications, through a multimodality imaging platform dedicated to small-animal imaging (AMISSA). This thesis presents the study, design, and development of a Single Photon Emission Computed Tomography (SPECT) system to be integrated into the AMISSA platform. These developments make it possible to obtain the spatial distribution of a molecule injected into the animal. The SPECT technical solutions build on the institute's acquired expertise, allowing the design of a device with cameras adapted to detecting the gamma photons emitted by the radiotracers used in single-photon imaging. To cover the entire transverse field of view, four gamma cameras are arranged in a ring around the volume of interest. Each camera consists of five individual modules based on a YAP:Ce crystal array, a multi-anode photomultiplier, and a dedicated multichannel electronic device; the 20 detection modules were calibrated to give the same response for an identical energy deposit. The data are acquired and then processed to extract the positions and energies deposited by gamma photons in the crystals, and this information is gathered to build the projections. The 3D image is reconstructed from the projections by two algorithms run in sequence, an analytical one and the iterative OS-EM, both modified to take into account the singular geometry of our detection system. Finally, the obtained image is fused with the anatomical information given by the micro Computed Tomography system. (author)
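
    For reference, a minimal textbook OS-EM update; the thesis uses a variant modified for the ring geometry, which is not reproduced here, and the tiny system below is synthetic:

```python
import numpy as np

def os_em(y, A, subsets, n_iter=10):
    """Textbook OS-EM for emission tomography.
    y: measured projections; A: system matrix (rows = projection bins,
    cols = voxels); subsets: row-index arrays partitioning the projections."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        for s in subsets:
            As = A[s]
            expected = As @ x
            ratio = y[s] / np.maximum(expected, 1e-12)  # avoid divide-by-zero
            sens = np.maximum(As.T @ np.ones(len(s)), 1e-12)
            x *= (As.T @ ratio) / sens                  # multiplicative update
    return x

# Tiny synthetic example: 2 voxels, 4 projection bins, 2 subsets.
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.2, 0.8]])
x_true = np.array([3.0, 7.0])
y = A @ x_true
print(os_em(y, A, subsets=[np.array([0, 2]), np.array([1, 3])]))
```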

  9. Development of Computer-Based Training to Supplement Lessons in Fundamentals of Electronics

    Directory of Open Access Journals (Sweden)

    Ian P. Benitez

    2016-05-01

    Full Text Available Teaching Fundamentals of Electronics allows students to familiarize themselves with basic electronics concepts, acquire skills in the use of the multimeter test instrument, and develop mastery in testing basic electronic components. Classroom observation during practical activities on component pin identification and testing showed that new students' lack of skill in testing components can lead to incorrect fault diagnosis and wrong pin connections during in-circuit replacement of defective parts. To reinforce students' concrete understanding of component concepts as applied in actual test and measurement, a computer-based training (CBT) system was developed. The proponent developed the learning modules (courseware) using concept-mapping and storyboarding instructional design; the primary goal was to make the courseware as simulated, activity-based, and interactive as possible, so as to resemble the real-world process. A local area network (LAN)-based learning management system was also developed for administering the learning modules. A paired-sample t-test on the pretest and post-test results was used to determine whether the students learned from the courseware; the result revealed a significant achievement gain after studying the learning modules. The e-learning content was validated by the instructors in terms of content, activities, assessment, and format, with a grand weighted mean of 4.35, interpreted as Sufficient. Based on the evaluation results, supplementing lessons with the proposed computer-based training can enhance the teaching-learning process in electronics fundamentals.
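
    The reported analysis corresponds to a standard paired-sample t-test; a sketch with made-up scores (not the study's data):

```python
from scipy import stats

# Paired-sample t-test on pretest vs. post-test scores of the same students.
pretest  = [12, 15,  9, 14, 11, 13, 10, 16]
posttest = [18, 20, 15, 19, 16, 21, 14, 22]

t_stat, p_value = stats.ttest_rel(posttest, pretest)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant gain: students improved after taking the courseware.")
```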

  10. Development of a computational system based in the code GEANT4 for dosimetric evaluation in radiotherapy

    International Nuclear Information System (INIS)

    Oliveira, Alex Cristovao Holanda de

    2016-01-01

    The incidence of cancer has grown in Brazil, as around the world, following the change in the age profile of the population. One of the most important and commonly used techniques in cancer treatment is radiotherapy; around 60% of new cancer cases involve radiation in at least one phase of treatment. The most widely used equipment for radiotherapy is the linear accelerator (Linac), which produces electron or X-ray beams in the energy range from 5 to 30 MeV. The most appropriate way to irradiate a patient is determined during treatment planning, and the treatment planning system (TPS) is currently the main and most important tool in that process. The main objective of this work is to develop a computational system based on the MC code Geant4 for dose evaluations in photon-beam radiotherapy; besides treatment planning, these dose evaluations can be performed for research and for quality control of equipment and TPSs. The computational system, called Quimera, consists of a graphical user interface (qGUI) and three MC applications (qLinacs, qMATphantoms, and qNCTphantoms). The qGUI serves as the interface to the MC applications, creating or editing input files, running simulations, and analyzing results. qLinacs is used for modeling Linacs and generating beam phase spaces; qMATphantoms and qNCTphantoms are used for dose calculations in virtual models of physical phantoms and in computed tomography (CT) images, respectively. From manufacturer data, models of a Varian Linac photon beam and a Varian multileaf collimator (MLC) were simulated in qLinacs, and the Linac and MLC models were validated against experimental data. qMATphantoms and qNCTphantoms were validated using IAEA phase spaces. In this first version, Quimera can be used for research, radiotherapy planning of simple treatments, and quality control in photon-beam radiotherapy. The MC applications work independently of the qGUI, and the qGUI can be used for
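
    Geant4 and Quimera are C++ codes; purely to illustrate the workflow in which a recorded phase space feeds a downstream dose tally, here is a simplified Python sketch with hypothetical record fields and no transport physics:

```python
import numpy as np

# Illustration only (not Quimera code): a phase space is a list of particles
# recorded on a scoring plane; downstream applications replay it to deposit
# energy in a phantom grid.
rng = np.random.default_rng(1)
n = 100_000

# Hypothetical phase-space records: (x, y) position in cm and energy in MeV.
xy = rng.normal(scale=2.0, size=(n, 2))
energy = rng.uniform(1.0, 6.0, size=n)

# Crude "dose" tally: histogram energy into a 2D grid (no transport physics).
grid, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=50,
                            range=[[-5, 5], [-5, 5]], weights=energy)
voxel_mass_g = 0.04                              # hypothetical voxel mass
dose_Gy = grid * 1.602e-13 / (voxel_mass_g * 1e-3)  # MeV -> J, divide by kg
print(dose_Gy.max())
```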

  11. Development of an Assistive Human-Computer Device Based on Electro-Oculogram for Disabled People

    Directory of Open Access Journals (Sweden)

    Sima Soltani

    2014-01-01

    Full Text Available Objective: In this study, a novel wearable, miniaturized human-computer interface system was designed and implemented. It allows disabled people who cannot move their limbs voluntarily or speak overtly to express their intentions and feelings just by moving their eyes. Materials & Methods: The developed system, installed on a pair of glasses, records the electrooculogram (EOG) signal and transfers the digitized data wirelessly to a laptop. Real-time analysis of the signals lets users operate two graphical user interfaces, a keypad and a game, using only their eye movements. The performance of the system was tested on six able-bodied people, who successfully typed a total of 1071 characters, to evaluate typing accuracy and rate; it was also tested by four people with quadriplegia and cerebral palsy, who played a computer game using their eye movements. Results: In the experiments with able-bodied participants, the accuracy of recognizing the user's intention was 94.1% and the average communication rate was 7.72 characters per minute. Evaluating the system's usability for disabled people showed that they were able to play the computer game using their eyes, with an average success rate of 58.7%. Conclusion: The proposed system recorded and processed electrooculogram signals with appropriate quality. The final prototype was 2.6 cm × 4.5 cm in size and weighed only 15 grams; total power consumption was measured at 123 mW. The designed keypad allows selection of each character with minimal eye movements. The system provides high communication performance as well as a high level of mobility and comfort for everyday use.
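
    The abstract does not detail the real-time algorithm, so the following is a simplified threshold-based sketch of EOG interpretation, with made-up channel conventions and thresholds, to convey the general approach:

```python
import numpy as np

# Simplified sketch (hypothetical threshold): a sustained deflection on the
# horizontal or vertical EOG channel is mapped to a gaze direction.
THRESH_UV = 150.0  # deflection threshold in microvolts (made up)

def classify(horizontal_uv: np.ndarray, vertical_uv: np.ndarray) -> str:
    """Classify one analysis window of the two EOG channels."""
    h, v = horizontal_uv.mean(), vertical_uv.mean()
    if abs(h) < THRESH_UV and abs(v) < THRESH_UV:
        return "center"
    if abs(h) >= abs(v):
        return "right" if h > 0 else "left"
    return "up" if v > 0 else "down"

window_h = np.full(64, 220.0)   # sustained positive horizontal deflection
window_v = np.full(64, 30.0)
print(classify(window_h, window_v))  # -> "right"
```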

  12. Development and Validation of the Computer Technology Literacy Self-Assessment Scale for Taiwanese Elementary School Students

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of a computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): the technology operation skills, the computer usage concepts, the…

  13. Effects of a Teacher Professional Development Program on Science Teachers' Views about Using Computers in Teaching and Learning

    Science.gov (United States)

    Çetin, Nagihan Imer

    2016-01-01

    The purpose of this study was to examine science teachers' level of computer use in teaching and the impact of a teacher professional development program (TPDP) on their views about using computers in science education. Forty-three in-service science teachers from different regions of Turkey attended a 5-day TPDP. The TPDP was…

  14. Design and Development of a Sample "Computer Programming" Course Tool via Story-Based E-Learning Approach

    Science.gov (United States)

    Kose, Utku; Koc, Durmus; Yucesoy, Suleyman Anil

    2013-01-01

    This study introduces a story-based e-learning oriented course tool that was designed and developed for using within "computer programming" courses. With this tool, students can easily adapt themselves to the subjects in the context of computer programming principles, thanks to the story-based, interactive processes. By using visually…

  15. Investigation into the development of computer aided design software for space based sensors

    Science.gov (United States)

    Pender, C. W.; Clark, W. L.

    1987-01-01

    The described effort is phase one of the development of Computer Aided Design (CAD) software for radiometric sensor design. The software package, referred to as SCAD, is directed toward the preliminary phase of the design of space-based sensor systems. The approach is to develop a modern, graphics-intensive, user-friendly software package using existing software as building blocks. The emphasis is on developing a shell containing menus, smart defaults, and interfaces that can accommodate a wide variety of existing application software packages. The shell will offer expected utilities such as graphics, tailored menus, and a variety of drivers for I/O devices. Following development of the shell, the development of SCAD is planned chiefly as the selection and integration of appropriate building blocks. Phase one activities included: selection of the hardware to be used with SCAD; determination of the scope of SCAD; preliminary evaluation of a number of software packages for applicability to SCAD; determination of a method for achieving required capabilities where voids exist; and establishment of a strategy for binding the software modules into an easy-to-use tool kit.

  16. Development of computer code models for analysis of subassembly voiding in the LMFBR

    International Nuclear Information System (INIS)

    Hinkle, W.

    1979-12-01

    The research program discussed in this report was started in FY1979 under the combined sponsorship of the US Department of Energy (DOE), General Electric (GE), and the Hanford Engineering Development Laboratory (HEDL). The objective of the program is to develop multi-dimensional computer codes for the analysis of subassembly voiding incoherence under postulated accident conditions in the LMFBR. Two codes are being developed in parallel. The first uses a two-fluid (6-equation) model, which is more difficult to develop but has the potential to provide a code with the utmost flexibility and physical consistency for long-term use. The other uses a mixture (fewer than 6 equations) model, which is less general but may be more amenable to the interpretation and use of experimental data and therefore easier to develop for near-term use. To ensure that the models developed are not design-dependent, geometries and transient conditions typical of both foreign and US designs are being considered.
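
    For reference, one common form of the two-fluid (6-equation) model writes mass, momentum, and energy balances for each phase k (liquid, vapor); notation and interfacial closure terms vary between codes, so this is a generic sketch rather than the equations of the two codes under development:

```latex
% Gamma_k, M_k, Q_k are interfacial mass, momentum, and energy exchange terms;
% alpha_k is the phase volume fraction, with alpha_l + alpha_v = 1.
\begin{aligned}
&\frac{\partial}{\partial t}(\alpha_k \rho_k)
  + \nabla\cdot(\alpha_k \rho_k \mathbf{u}_k) = \Gamma_k \\
&\frac{\partial}{\partial t}(\alpha_k \rho_k \mathbf{u}_k)
  + \nabla\cdot(\alpha_k \rho_k \mathbf{u}_k \mathbf{u}_k)
  = -\alpha_k \nabla p + \nabla\cdot(\alpha_k \boldsymbol{\tau}_k)
  + \alpha_k \rho_k \mathbf{g} + \mathbf{M}_k \\
&\frac{\partial}{\partial t}(\alpha_k \rho_k e_k)
  + \nabla\cdot(\alpha_k \rho_k e_k \mathbf{u}_k)
  = -p\,\frac{\partial \alpha_k}{\partial t}
  - \nabla\cdot(\alpha_k \mathbf{q}_k) + Q_k
\end{aligned}
```

    A mixture model collapses the two phases into shared balance equations (hence fewer than six), trading generality for easier closure against experimental data.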

  17. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)]

    2008-01-01

    Three major high-performance quantum chemistry packages, NWChem, GAMESS, and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges posed by their different communication patterns and software designs. Chemistry algorithms are hard and time-consuming to develop; integrating large quantum chemistry packages allows resource sharing and avoids reinventing the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing in the benefits of component-based software engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present the strategy and process used for interfacing two widely used and important computational chemistry methodologies: quantum mechanics and molecular mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible compared to the ease and potential of organizing and coping with large-scale software applications.

  18. Development of computer systems for planning and management of reactor decommissioning

    International Nuclear Information System (INIS)

    Yanagihara, Satoshi; Sukegawa, Takenori; Shiraishi, Kunio

    2001-01-01

    Computer systems for the planning and management of reactor decommissioning were developed for effective implementation of a decommissioning project. The systems are intended to be applied to the construction of work breakdown structures and the estimation of manpower needs, worker doses, etc., based on unit productivity and work difficulty factors, which were derived by analyzing actual data on the JPDR dismantling activities. In addition, the information necessary for project planning can be integrated effectively in graphical form on a computer screen by transferring the data produced by subprograms, such as radioactive inventory and dose rate calculation routines, among the systems. Expert systems with production rules were adopted for modeling a new decommissioning project by reconstructing work breakdown structures and work specifications. As a result, the systems are characterized by effective modeling of a decommissioning project, estimation of project management data based on feedback of past experience, and information integration through the graphical user interface. The systems were validated by comparing the calculated results with the actual manpower needs of the JPDR dismantling activities; it is expected that they will be applicable to the planning and evaluation of other decommissioning projects. (author)
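
    A small sketch of the estimation rule described, in which unit productivity is scaled by a work difficulty factor; the task names and numbers are hypothetical, not JPDR data:

```python
# Manpower estimate per task: quantity x unit productivity x difficulty factor.
tasks = [
    # (task, quantity, unit, person-hours per unit, difficulty factor)
    ("pipe cutting",        120, "m",  0.8, 1.3),
    ("vessel segmentation",  40, "t",  6.0, 1.8),
    ("concrete demolition", 300, "m3", 1.5, 1.1),
]

total = 0.0
for name, qty, unit, unit_prod, difficulty in tasks:
    hours = qty * unit_prod * difficulty
    total += hours
    print(f"{name}: {hours:,.0f} person-hours ({qty} {unit})")
print(f"total manpower: {total:,.0f} person-hours")
```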

  19. Development of a Computer Program for the Analysis Logistics of PWR Spent Fuels

    International Nuclear Information System (INIS)

    Choi, Heui Joo; Choi, Jong Won; Cha, Jeong Hun

    2008-01-01

    It is expected that the temporary storage facilities at Korean nuclear power plants will be full of spent fuel within 10 years. If a centralized interim storage facility is constructed along the coast of the Korean peninsula to solve this problem, a substantial amount of spent fuel will have to be transported by sea or by land every year. In this paper we develop a computer program for analyzing the transportation logistics of spent fuel from four nuclear power plant sites to a hypothetical centralized interim storage facility and on to the final repository. Mass balance equations are used to analyze the logistics between the nuclear power plants and the interim storage facility. To this end a computer program, CASK, was developed in the Visual Basic language. The annual transportation rates of spent fuel from the four nuclear power plant sites were determined using the CASK program, and a parameter study illustrated the ease of the logistics analysis. The program could also be used for cost analysis of spent fuel transportation.
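
    A minimal sketch of the mass-balance bookkeeping such an analysis rests on; the sites, capacities, and discharge rates are hypothetical, not the paper's data:

```python
# Mass balance per site and year: stock(t+1) = stock(t) + discharge - shipments,
# where shipments are whatever exceeds the at-reactor storage capacity.
sites = {  # site: (annual discharge tU, at-reactor capacity tU, initial stock tU)
    "site_A": (20.0, 400.0, 380.0),
    "site_B": (15.0, 300.0, 295.0),
}

for year in range(2025, 2028):
    for site, (discharge, capacity, stock) in sites.items():
        stock += discharge                    # fuel added to the at-reactor pool
        shipped = max(0.0, stock - capacity)  # overflow goes to interim storage
        stock -= shipped
        sites[site] = (discharge, capacity, stock)
        print(f"{year} {site}: ship {shipped:.1f} tU, stock {stock:.1f} tU")
```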

  20. Development of computer models for fuel element behaviour in water reactors

    International Nuclear Information System (INIS)

    Gittus, J.H.

    1987-03-01

    Describing fuel behaviour during normal operation, transients, and accident conditions has always been a most challenging and important problem. Reliable predictions are a basic requirement for safety calculations, for design purposes, and for fuel performance assessment; computer codes based on deterministic and probabilistic models have therefore been developed. A fully comprehensive description of the phenomena is precluded by the great number of individual processes, involving physical, chemical, thermal-hydraulic, and mechanical parameters, that must be considered over a wide range of situations. For fast thermal transients, predictive capability is limited by the kinetics of the system's evolution and its eventual dynamic behaviour; probabilistic approaches are likewise limited by the sparsity and limited breadth of the empirical data base. Code predictions have to be evaluated against power reactor data and results from simulation experiments and, if possible, include cross-validation of different codes and validation of sub-models. Progress on this subject is reviewed in this report, which completes the coordinated research programme on 'Development of Computer Models for Fuel Element Behaviour in Water Reactors' (D-COM), initiated under the auspices of the IAEA in 1981.