WorldWideScience

Sample records for computer access correlations

  1. Computer self efficacy as correlate of on-line public access ...

    African Journals Online (AJOL)

    The use of Online Public Access Catalogue (OPAC) by students has a lot of advantages and computer self-efficacy is a factor that could determine its effective utilization. Little appears to be known about colleges of education students' use of OPAC, computer self-efficacy and the relationship between OPAC and computer ...

  2. Computer access security code system

    Science.gov (United States)

    Collins, Earl R., Jr. (Inventor)

    1990-01-01

    A security code system for controlling access to computer and computer-controlled entry situations comprises a plurality of subsets of alpha-numeric characters disposed in random order in matrices of at least two dimensions forming theoretical rectangles, cubes, etc., such that when access is desired, at least one pair of previously unused character subsets not found in the same row or column of the matrix is chosen at random and transmitted by the computer. The proper response to gain access is transmittal of subsets which complete the rectangle, and/or a parallelepiped whose opposite corners were defined by first groups of code. Once used, subsets are not used again to absolutely defeat unauthorized access by eavesdropping, and the like.
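
The challenge-response geometry the patent describes can be sketched in a few lines. Everything below (matrix size, character set, function names) is a hypothetical illustration of the idea, not the patented implementation: the system sends two unused cells that share neither a row nor a column, and the valid reply is the pair of cells completing the rectangle.

```python
import random
import string

ROWS, COLS = 4, 4  # illustrative matrix dimensions

def make_matrix(seed=None):
    """Fill a ROWS x COLS matrix with distinct characters in random order."""
    rng = random.Random(seed)
    chars = rng.sample(string.ascii_uppercase + string.digits, ROWS * COLS)
    return [chars[r * COLS:(r + 1) * COLS] for r in range(ROWS)]

def challenge(matrix, used, rng):
    """Pick two previously unused cells not sharing a row or column."""
    while True:
        (r1, c1), (r2, c2) = rng.sample(
            [(r, c) for r in range(ROWS) for c in range(COLS)], 2)
        cells = frozenset({(r1, c1), (r2, c2)})
        if r1 != r2 and c1 != c2 and not (cells & used):
            used |= cells  # mark as used so eavesdropped pairs never repeat
            return matrix[r1][c1], matrix[r2][c2], (r1, c1), (r2, c2)

def expected_response(matrix, p1, p2):
    """The opposite corners of the rectangle defined by the challenge."""
    (r1, c1), (r2, c2) = p1, p2
    return {matrix[r1][c2], matrix[r2][c1]}
```

Because used subsets are retired after one round, a replayed response never matches a later challenge, which is the patent's defense against eavesdropping.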

  3. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

    An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo-two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.

  4. Concordance-based Kendall's Correlation for Computationally-Light vs. Computationally-Heavy Centrality Metrics: Lower Bound for Correlation

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2017-01-01

    Full Text Available We identify three different levels of correlation (pair-wise relative ordering, network-wide ranking and linear regression) that could be assessed between a computationally-light centrality metric and a computationally-heavy centrality metric for real-world networks. The Kendall's concordance-based correlation measure could be used to quantitatively assess how well we could consider the relative ordering of two vertices vi and vj with respect to a computationally-light centrality metric as the relative ordering of the same two vertices with respect to a computationally-heavy centrality metric. We hypothesize that the pair-wise relative ordering (concordance-based) assessment of the correlation between centrality metrics is the strictest of all three levels of correlation, and claim that the Kendall's concordance-based correlation coefficient will be lower than the correlation coefficients observed with the more relaxed levels of correlation (the linear regression-based Pearson's product-moment correlation coefficient and the network-wide ranking-based Spearman's correlation coefficient). We validate our hypothesis by evaluating the three correlation coefficients between two sets of centrality metrics: the computationally-light degree and local clustering coefficient complement-based degree centrality metrics, and the computationally-heavy eigenvector centrality, betweenness centrality and closeness centrality metrics, for a diverse collection of 50 real-world networks.
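
The three levels of correlation named in the abstract can be computed directly. The sketch below uses toy score vectors rather than the paper's 50 networks, and assumes no ties in the rankings; it illustrates the hypothesized ordering (Kendall's tau at or below Spearman's rho) on one example.

```python
from itertools import combinations
from math import sqrt

def pearson(x, y):
    """Linear-regression level: Pearson product-moment correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rank(v):
    """1-based ranks; tie handling omitted (values assumed distinct)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos + 1
    return r

def spearman(x, y):
    """Network-wide ranking level: Pearson correlation of the ranks."""
    return pearson(rank(x), rank(y))

def kendall(x, y):
    """Pair-wise relative ordering level: concordant minus discordant pairs."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)
```

On the toy vectors x = [1, 2, 3, 4, 5] and y = [1, 3, 2, 4, 5] (one swapped pair), Kendall's tau is 0.8 while Spearman's rho is 0.9, consistent with the paper's claim that the concordance-based measure is the strictest.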

  5. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  6. An SPSS Macro to Compute Confidence Intervals for Pearson’s Correlation

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-04-01

    Full Text Available In many disciplines, including psychology, medical research, epidemiology and public health, authors are required, or at least encouraged, to report confidence intervals (CIs) along with effect size estimates. Many students and researchers in these areas use IBM-SPSS for statistical analysis. Unfortunately, the CORRELATIONS procedure in SPSS does not provide CIs in the output. Various work-around solutions have been suggested for obtaining CIs for rho with SPSS, but most of them have been sub-optimal. Since release 18, it has been possible to compute bootstrap CIs, but only if users have the optional bootstrap module. The !rhoCI macro described in this article is accessible to all SPSS users with release 14 or later. It directs output from the CORRELATIONS procedure to another dataset, restructures that dataset to have one row per correlation, computes a CI for each correlation, and displays the results in a single table. Because the macro uses the CORRELATIONS procedure, it allows users to specify a list of two or more variables to include in the correlation matrix, to choose a confidence level, and to select either listwise or pairwise deletion. Thus, it offers substantial improvements over previous solutions to the problem of how to compute CIs for rho with SPSS.
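
The macro's internals are not reproduced in the abstract, but the standard way such tools compute a CI for a correlation is the Fisher z-transformation; a minimal sketch of that calculation, assuming the usual large-sample formula:

```python
from math import atanh, tanh, sqrt

def pearson_ci(r, n, z_crit=1.959964):
    """Fisher z-transform confidence interval for Pearson's r.

    r: sample correlation, n: sample size, z_crit: normal quantile
    (default ~95% two-sided). Returns (lower, upper) on the r scale.
    """
    z = atanh(r)              # Fisher's r-to-z transformation
    se = 1.0 / sqrt(n - 3)    # standard error of z
    lo, hi = z - z_crit * se, z + z_crit * se
    return tanh(lo), tanh(hi) # back-transform to the correlation scale
```

For example, r = 0.5 with n = 30 gives an asymmetric 95% CI of roughly (0.17, 0.73), reflecting the bounded, skewed sampling distribution of r.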

  7. Correlates of Access to Business Research Databases

    Science.gov (United States)

    Gottfried, John C.

    2010-01-01

    This study examines potential correlates of business research database access through academic libraries serving top business programs in the United States. Results indicate that greater access to research databases is related to enrollment in graduate business programs, but not to overall enrollment or status as a public or private institution.…

  8. Disk access controller for Multi 8 computer

    International Nuclear Information System (INIS)

    Segalard, Jean

    1970-01-01

    After having presented the initial characteristics and weaknesses of the software provided for the control of a memory disk coupled with a Multi 8 computer, the author reports the development and improvement of this controller software. He presents the different constitutive parts of the computer and the operation of the disk coupling and of the direct access to memory. He reports the development of the disk access controller: software organisation, loader, subprograms and statements.

  9. Towards an Approach of Semantic Access Control for Cloud Computing

    Science.gov (United States)

    Hu, Luokai; Ying, Shi; Jia, Xiangyang; Zhao, Kai

    With the development of cloud computing, the mutual understandability among distributed Access Control Policies (ACPs) has become an important issue in the security field of cloud computing. Semantic Web technology provides the solution to semantic interoperability of heterogeneous applications. In this paper, we analyze existing access control methods and present a new Semantic Access Control Policy Language (SACPL) for describing ACPs in the cloud computing environment. An Access Control Oriented Ontology System (ACOOS) is designed as the semantic basis of SACPL. The ontology-based SACPL language can effectively solve the interoperability issue of distributed ACPs. This study enriches the research applying Semantic Web technology to the field of security, and provides a new way of thinking about access control in cloud computing.

  10. Computer Security Systems Enable Access.

    Science.gov (United States)

    Riggen, Gary

    1989-01-01

    A good security system enables access and protects information from damage or tampering, but the most important aspects of a security system aren't technical. A security procedures manual addresses the human element of computer security. (MLW)

  11. Automated Computer Access Request System

    Science.gov (United States)

    Snook, Bryan E.

    2010-01-01

    The Automated Computer Access Request (AutoCAR) system is a Web-based account provisioning application that replaces the time-consuming paper-based computer-access request process at Johnson Space Center (JSC). AutoCAR combines rules-based and role-based functionality in one application to provide a centralized system that is easily and widely accessible. The system features a work-flow engine that facilitates request routing, a user registration directory containing contact information and user metadata, an access request submission and tracking process, and a system administrator account management component. This provides full, end-to-end disposition approval chain accountability from the moment a request is submitted. By blending both rules-based and role-based functionality, AutoCAR has the flexibility to route requests based on a user's nationality, JSC affiliation status, and other export-control requirements, while ensuring a user's request is addressed by either a primary or backup approver. All user accounts that are tracked in AutoCAR are recorded and mapped to the native operating system schema on the target platform where user accounts reside. This allows for future extensibility for supporting creation, deletion, and account management directly on the target platforms by way of AutoCAR. The system's directory-based lookup and day-to-day change analysis of directory information determines personnel moves, deletions, and additions, and automatically notifies a user via e-mail to revalidate his/her account access as a result of such changes. AutoCAR is a Microsoft classic active server page (ASP) application hosted on a Microsoft Internet Information Server (IIS).

  12. Computer Access and Flowcharting as Variables in Learning Computer Programming.

    Science.gov (United States)

    Ross, Steven M.; McCormick, Deborah

    Manipulation of flowcharting was crossed with in-class computer access to examine flowcharting effects in the traditional lecture/laboratory setting and in a classroom setting where online time was replaced with manual simulation. Seventy-two high school students (24 male and 48 female) enrolled in a computer literacy course served as subjects.…

  13. A National Study of the Relationship between Home Access to a Computer and Academic Performance Scores of Grade 12 U.S. Science Students: An Analysis of the 2009 NAEP Data

    Science.gov (United States)

    Coffman, Mitchell Ward

    The purpose of this dissertation was to examine the relationship between student access to a computer at home and academic achievement. The 2009 National Assessment of Educational Progress (NAEP) dataset was probed using the National Data Explorer (NDE) to investigate correlations in the subsets of SES, Parental Education, Race, and Gender as it relates to access of a home computer and improved performance scores for U.S. public school grade 12 science students. A causal-comparative approach was employed seeking clarity on the relationship between home access and performance scores. The influence of home access cannot overcome the challenges students of lower SES face. The achievement gap, or a second digital divide, for underprivileged classes of students, including minorities does not appear to contract via student access to a home computer. Nonetheless, in tests for significance, statistically significant improvement in science performance scores was reported for those having access to a computer at home compared to those not having access. Additionally, regression models reported evidence of correlations between and among subsets of controls for the demographic factors gender, race, and socioeconomic status. Variability in these correlations was high; suggesting influence from unobserved factors may have more impact upon the dependent variable. Having access to a computer at home increases performance scores for grade 12 general science students of all races, genders and socioeconomic levels. However, the performance gap is roughly equivalent to the existing performance gap of the national average for science scores, suggesting little influence from access to a computer on academic achievement. The variability of scores reported in the regression analysis models reflects a moderate to low effect, suggesting an absence of causation. These statistical results are accurate and confirm the literature review, whereby having access to a computer at home and the

  14. Matching and correlation computations in stereoscopic depth perception.

    Science.gov (United States)

    Doi, Takahiro; Tanabe, Seiji; Fujita, Ichiro

    2011-03-02

    A fundamental task of the visual system is to infer depth by using binocular disparity. To encode binocular disparity, the visual cortex performs two distinct computations: one detects matched patterns in paired images (matching computation); the other constructs the cross-correlation between the images (correlation computation). How the two computations are used in stereoscopic perception is unclear. We dissociated their contributions in near/far discrimination by varying the magnitude of the disparity across separate sessions. For small disparity (0.03°), subjects performed at chance level to a binocularly opposite-contrast (anti-correlated) random-dot stereogram (RDS) but improved their performance with the proportion of contrast-matched (correlated) dots. For large disparity (0.48°), the direction of perceived depth reversed with an anti-correlated RDS relative to that for a correlated one. Neither reversed nor normal depth was perceived when anti-correlation was applied to half of the dots. We explain the decision process as a weighted average of the two computations, with the relative weight of the correlation computation increasing with the disparity magnitude. We conclude that matching computation dominates fine depth perception, while both computations contribute to coarser depth perception. Thus, stereoscopic depth perception recruits different computations depending on the disparity magnitude.
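
The decision rule the authors propose, a weighted average of the two computations with the correlation weight increasing with disparity magnitude, can be caricatured in a few lines. The linear weighting rule and the signal values below are assumptions chosen only to illustrate the idea, not parameters fitted to the paper's data.

```python
def depth_decision(match_signal, corr_signal, disparity_deg):
    """Weighted average of the matching and correlation computations.

    The linear growth of the correlation weight with disparity is an
    assumed rule for illustration; the paper estimates weights from data.
    """
    w_corr = min(1.0, disparity_deg / 0.5)  # correlation weight grows with disparity
    return (1 - w_corr) * match_signal + w_corr * corr_signal

# For an anti-correlated RDS the matching computation yields no signal (0)
# while the correlation computation yields an inverted signal (-1).
near_chance = depth_decision(0.0, -1.0, 0.03)     # small disparity: ~chance
reversed_depth = depth_decision(0.0, -1.0, 0.48)  # large disparity: reversed
```

This toy model reproduces the qualitative pattern reported: chance-level responses to anti-correlated stereograms at 0.03° and reversed depth at 0.48°.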

  15. ACCESS TO A COMPUTER SYSTEM. BETWEEN LEGAL PROVISIONS AND TECHNICAL REALITY

    Directory of Open Access Journals (Sweden)

    Maxim DOBRINOIU

    2016-05-01

    Full Text Available Nowadays, with the rise of cybersecurity incidents and a very complex IT&C environment, national legal systems must adapt in order to properly address new and modern forms of criminality in cyberspace. Illegal access to a computer system remains one of the most important cyber-related crimes, due to its prevalence but also because it is a door opened to computer data and sometimes a vehicle for other tech crimes. At the same time, information society services have slightly changed the IT paradigm and represent the new interface between users and systems. It is true that services rely on computer systems, but accessing services now goes beyond the simple accessing of computer systems as commonly understood by most legislations. The article intends to explain other sides of access related to computer systems and services, with the purpose of advancing possible legal solutions to certain case scenarios.

  16. Household computer and Internet access: The digital divide in a pediatric clinic population

    Science.gov (United States)

    Carroll, Aaron E.; Rivara, Frederick P.; Ebel, Beth; Zimmerman, Frederick J.; Christakis, Dimitri A.

    2005-01-01

    Past studies have noted a digital divide, or inequality in computer and Internet access related to socioeconomic class. This study sought to measure how many households in a pediatric primary care outpatient clinic had household access to computers and the Internet, and whether this access differed by socio-economic status or other demographic information. We conducted a phone survey of a population-based sample of parents with children ages 0 to 11 years old. Analyses assessed predictors of having home access to a computer, the Internet, and high-speed Internet service. Overall, 88.9% of all households owned a personal computer, and 81.4% of all households had Internet access. Among households with Internet access, 48.3% had high speed Internet at home. There were statistically significant associations between parental income or education and home computer ownership and Internet access. However, the impact of this difference was lessened by the fact that over 60% of families with annual household income of $10,000–$25,000, and nearly 70% of families with only a high-school education had Internet access at home. While income and education remain significant predictors of household computer and internet access, many patients and families at all economic levels have access, and might benefit from health promotion interventions using these modalities. PMID:16779012

  17. Computer Security: When a person leaves - access rights remain!

    CERN Multimedia

    Computer Security Team

    2014-01-01

    We have been contacted recently by an embarrassed project manager who just figured out that a student who left at the end of 2013 still had access rights to read the whole project folder in February 2014: “How can that be?! In any other company, access rights would be purged at the same time as an employment contract terminates." Not so at CERN.   CERN has always been an open site with an open community. Physical access to the site is lightweight and you just need to have your CERN access card at hand. Further restrictions have only been put in place where safety or security really require them, and CERN does not require you to keep your access card on display. The same holds for the digital world. Once registered at CERN - either by contract, via your experiment or through the Users' office - you own a computing account that provides you with access to a wide variety of computing services. For example, last year 9,730 students/technicians/engineers/researchers/sta...

  18. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...

  19. Dynamic computing random access memory

    International Nuclear Information System (INIS)

    Traversa, F L; Bonani, F; Pershin, Y V; Di Ventra, M

    2014-01-01

    The present von Neumann computing paradigm involves a significant amount of information transfer between a central processing unit and memory, with concomitant limitations in the actual execution speed. However, it has been recently argued that a different form of computation, dubbed memcomputing (Di Ventra and Pershin 2013 Nat. Phys. 9 200–2) and inspired by the operation of our brain, can resolve the intrinsic limitations of present day architectures by allowing for computing and storing of information on the same physical platform. Here we show a simple and practical realization of memcomputing that utilizes easy-to-build memcapacitive systems. We name this architecture dynamic computing random access memory (DCRAM). We show that DCRAM provides massively-parallel and polymorphic digital logic, namely it allows for different logic operations with the same architecture, by varying only the control signals. In addition, by taking into account realistic parameters, its energy expenditures can be as low as a few fJ per operation. DCRAM is fully compatible with CMOS technology, can be realized with current fabrication facilities, and therefore can really serve as an alternative to the present computing technology. (paper)
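
The polymorphic-logic property described above, that the same hardware realizes different Boolean functions when only the control signals change, can be mimicked in software. This is purely a behavioural toy model; it says nothing about the memcapacitive physics, and the control-signal names are invented for illustration.

```python
def polymorphic_gate(a: int, b: int, ctrl: str) -> int:
    """One 'circuit', several logic functions, selected by a control signal.

    a, b are bits (0 or 1); ctrl is an illustrative control-signal label.
    """
    ops = {
        "AND": a & b,
        "OR":  a | b,
        "XOR": a ^ b,
    }
    return ops[ctrl]
```

In DCRAM the analogous selection happens by varying applied control pulses on the same memcapacitive cells, so no rewiring is needed between operations.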

  20. Equity and Computers for Mathematics Learning: Access and Attitudes

    Science.gov (United States)

    Forgasz, Helen J.

    2004-01-01

    Equity and computer use for secondary mathematics learning was the focus of a three year study. In 2003, a survey was administered to a large sample of grade 7-10 students. Some of the survey items were aimed at determining home access to and ownership of computers, and students' attitudes to mathematics, computers, and computer use for…

  1. Gender, Computer Access and Use as Predictors of Nigerian ...

    African Journals Online (AJOL)

    This study X-rayed the contributions of gender, access to computer and computer use to the Nigerian undergraduates' computer proficiency. Three hundred and fifteen (315) undergraduates from the Faculty of Education of Olabisi Onabanjo University, Nigeria served as the sample for this study. The instruments used for ...

  2. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design secure data access for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme for cloud computing; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  3. Secure Dynamic access control scheme of PHR in cloud computing.

    Science.gov (United States)

    Chen, Tzer-Shyong; Liu, Chia-Hui; Chen, Tzer-Long; Chen, Chin-Sheng; Bau, Jian-Guo; Lin, Tzu-Ching

    2012-12-01

    With the development of information technology and medical technology, medical information has been developed from traditional paper records into electronic medical records, which have now been widely applied. The new-style medical information exchange system "personal health records (PHR)" is gradually being developed. A PHR is a kind of health record maintained and recorded by individuals. An ideal personal health record could integrate personal medical information from different sources and provide a complete and correct personal health and medical summary through the Internet or portable media under the requirements of security and privacy. Many personal health records are already in use. The patient-centered PHR information exchange system allows the public to autonomously maintain and manage personal health records. Such management is convenient for storing, accessing, and sharing personal medical records. With the emergence of Cloud computing, PHR services have moved to storing data on Cloud servers, so that resources can be flexibly utilized and operation costs reduced. Nevertheless, patients face privacy problems when storing PHR data in the Cloud. Storing PHR on a Cloud server also requires a secure protection scheme to encrypt each patient's medical records. In the encryption process, it is a challenge to achieve accurate access to medical records while preserving flexibility and efficiency. A new PHR access control scheme under Cloud computing environments is proposed in this study. Using a Lagrange interpolation polynomial to establish a secure and effective PHR information access scheme, it allows accurate and secure access to PHR and is suitable for enormous numbers of users. Moreover, this scheme also dynamically supports multi-users in Cloud computing environments with personal privacy and offers legal authorities access to PHR. From security and effectiveness analyses, the proposed PHR access
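
The abstract does not give the construction, but Lagrange-interpolation access schemes typically follow the threshold pattern sketched below, in which a record key is split so that any t of n shares reconstruct it via interpolation at x = 0. All names, the field modulus, and the parameters here are illustrative, not the paper's exact scheme.

```python
P = 2 ** 127 - 1  # prime field modulus (assumed parameter)

def make_shares(secret, t, n, coeffs):
    """Shares are points on a degree-(t-1) polynomial with f(0) = secret."""
    poly = [secret] + list(coeffs[:t - 1])  # coeffs would normally be random
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(poly)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any t shares recover the key exactly, while fewer than t reveal nothing useful, which is how such schemes balance multi-user access with privacy.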

  4. Task-and-role-based access-control model for computational grid

    Institute of Scientific and Technical Information of China (English)

    LONG Tao; HONG Fan; WU Chi; SUN Ling-li

    2007-01-01

    Access control in a grid environment is a challenging issue because the heterogeneous nature and independent administration of geographically dispersed resources in the grid require access control to use fine-grained policies. We established a task-and-role-based access-control model for the computational grid (the CG-TRBAC model), integrating the concepts of role-based access control (RBAC) and task-based access control (TBAC). In this model, condition restrictions are defined, and concepts specifically tailored to Workflow Management Systems are simplified or omitted so that role assignment and security administration fit the computational grid better than traditional models; permissions are mutable with the task status and system variables, and can be dynamically controlled. The CG-TRBAC model is shown to be flexible and extensible. It can implement different control policies. It embodies the security principle of least privilege and executes active dynamic authorization. A task attribute can be extended to satisfy different requirements in a real grid system.
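
The core idea, permissions that depend on both a role and the current task status rather than on role alone, can be sketched as follows. The roles, statuses, and actions are illustrative names, not drawn from the CG-TRBAC paper.

```python
from dataclasses import dataclass

@dataclass
class Task:
    status: str  # e.g. "pending", "running", "done" (illustrative statuses)

# Permission table keyed by (role -> task status -> allowed actions).
# Because the lookup consults the task's *current* status, authorization
# changes dynamically as the task progresses: active dynamic authorization.
ROLE_PERMS = {
    "analyst":  {"running": {"read"}, "done": {"read"}},
    "operator": {"pending": {"read", "start"},
                 "running": {"read", "suspend"}},
}

def permitted(role, task, action):
    """Grant only if the role holds the action for the task's current status."""
    return action in ROLE_PERMS.get(role, {}).get(task.status, set())
```

Note how an operator can start a pending task but not suspend it, while the same role gains the suspend permission once the task is running: the permission set is mutable with task status, as the model requires.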

  5. Computer and internet access for long-term care residents: perceived benefits and barriers.

    Science.gov (United States)

    Tak, Sunghee H; Beck, Cornelia; McMahon, Ed

    2007-05-01

    In this study, the authors examined residents' computer and Internet access, as well as benefits and barriers to access in nursing homes. Administrators of 64 nursing homes in a national chain completed surveys. Fourteen percent of the nursing homes provided computers for residents to use, and 11% had Internet access. Some residents owned personal computers in their rooms. Administrators perceived the benefits of computer and Internet use for residents as facilitating direct communication with family and providing mental exercise, education, and enjoyment. Perceived barriers included cost and space for computer equipment and residents' cognitive and physical impairments. Implications of residents' computer activities were discussed for nursing care. Further research is warranted to examine therapeutic effects of computerized activities and their cost effectiveness.

  6. Legal, privacy, security, access and regulatory issues in cloud computing

    CSIR Research Space (South Africa)

    Dlodlo, N

    2011-04-01

    Full Text Available Among the gaps in reporting are legal, privacy, security, access and regulatory issues. This paper raises awareness of the legal, privacy, security, access and regulatory issues that are associated with the advent of cloud computing. An in...

  7. Computer Access and Computer Use for Science Performance of Racial and Linguistic Minority Students

    Science.gov (United States)

    Chang, Mido; Kim, Sunha

    2009-01-01

    This study examined the effects of computer access and computer use on the science achievement of elementary school students, with focused attention on the effects for racial and linguistic minority students. The study used the Early Childhood Longitudinal Study (ECLS-K) database and conducted statistical analyses with proper weights and…

  8. Towards ubiquitous access of computer-assisted surgery systems.

    Science.gov (United States)

    Liu, Hui; Lufei, Hanping; Shi, Weishong; Chaudhary, Vipin

    2006-01-01

    Traditional stand-alone computer-assisted surgery (CAS) systems impede the ubiquitous and simultaneous access by multiple users. With advances in computing and networking technologies, ubiquitous access to CAS systems becomes possible and promising. Based on our preliminary work, CASMIL, a stand-alone CAS server developed at Wayne State University, we propose a novel mobile CAS system, UbiCAS, which allows surgeons to retrieve, review and interpret multimodal medical images, and to perform some critical neurosurgical procedures on heterogeneous devices from anywhere at anytime. Furthermore, various optimization techniques, including caching, prefetching, pseudo-streaming-model, and compression, are used to guarantee the QoS of the UbiCAS system. UbiCAS enables doctors at remote locations to actively participate in remote surgeries and share patient information in real time before, during, and after the surgery.

  9. Embedded systems for supporting computer accessibility.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized assistive technology (AT) software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using the aforementioned AT equipment in order to access many different devices without assistive preferences. The solution takes advantage of open source hardware and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, then, after processing, it generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and it requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.

  10. Remote data access in computational jobs on the ATLAS data grid

    CERN Document Server

    Begy, Volodimir; The ATLAS collaboration; Lassnig, Mario

    2018-01-01

    This work describes the technique of remote data access from computational jobs on the ATLAS data grid. In comparison to traditional data movement and stage-in approaches it is well suited for data transfers which are asynchronous with respect to the job execution. Hence, it can be used for optimization of data access patterns based on various policies. In this study, remote data access is realized with the HTTP and WebDAV protocols, and is investigated in the context of intra- and inter-computing site data transfers. In both cases, the typical scenarios for application of remote data access are identified. The paper also presents an analysis of parameters influencing the data goodput between heterogeneous storage element - worker node pairs on the grid.
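
The remote-read pattern the study describes, where a job fetches only the byte ranges it needs over HTTP rather than staging in a whole file, rests on standard HTTP range requests. The helpers below sketch the header handling only (no network I/O); any storage endpoint URL you would attach them to is hypothetical.

```python
def range_header(offset, length):
    """Build an HTTP/1.1 Range header for one byte slice (RFC 7233 syntax).

    The end position is inclusive, hence offset + length - 1.
    """
    return {"Range": "bytes=%d-%d" % (offset, offset + length - 1)}

def parse_content_range(value):
    """Parse 'bytes start-end/total' from a 206 Partial Content reply."""
    unit, rng = value.split(" ", 1)
    span, total = rng.split("/")
    start, end = span.split("-")
    return int(start), int(end), int(total)
```

With `urllib.request`, for example, the dict from `range_header` can be passed as the `headers` argument of a `Request`; a server that honours the range answers 206 with a matching Content-Range, which is what makes reads asynchronous with respect to job execution practical.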

  11. Prospective evaluation of an internet-linked handheld computer critical care knowledge access system.

    Science.gov (United States)

    Lapinsky, Stephen E; Wax, Randy; Showalter, Randy; Martinez-Motta, J Carlos; Hallett, David; Mehta, Sangeeta; Burry, Lisa; Stewart, Thomas E

    2004-12-01

    Critical care physicians may benefit from immediate access to medical reference material. We evaluated the feasibility and potential benefits of a handheld computer-based knowledge access system linking a central academic intensive care unit (ICU) to multiple community-based ICUs. Four community hospital ICUs with 17 physicians participated in this prospective interventional study. Following training in the use of an internet-linked, updateable handheld computer knowledge access system, the physicians used the handheld devices in their clinical environment for a 12-month intervention period. Feasibility of the system was evaluated by tracking use of the handheld computers and by conducting surveys and focus group discussions. Before and after the intervention period, participants underwent simulated patient care scenarios designed to evaluate the information sources they accessed, as well as the speed and quality of their decision making. Participants generated admission orders during each scenario, which were scored by blinded evaluators. Ten physicians (59%) used the system regularly, predominantly for nonmedical applications (median 32.8 uses/month, interquartile range [IQR] 28.3-126.8), with medical software accessed less often (median 9 uses/month, IQR 3.7-13.7). Eight of the 13 physicians (62%) who completed the final scenarios chose to use the handheld computer for information access. The median time to access information on the handheld computer was 19 s (IQR 15-40 s). This group exhibited a significant improvement in admission order score compared with those who used other resources (P = 0.018). Benefits of and barriers to use of this technology were identified. An updateable handheld computer system is feasible as a means of point-of-care access to medical reference material and may improve clinical decision making. However, acceptance of the system during the study was variable; improved training and newer technology may overcome some of the barriers we identified.

  12. Global Pattern of Nasopharyngeal Cancer: Correlation of Outcome With Access to Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Lam, Ka-On [Clinical Oncology Center, University of Hong Kong-Shenzhen Hospital, Shenzhen (China); Lee, Anne W.M., E-mail: annelee@hku-szh.org [Clinical Oncology Center, University of Hong Kong-Shenzhen Hospital, Shenzhen (China); Choi, Cheuk-Wai [Department of Systems Engineering and Engineering Management, City University of Hong Kong, Hong Kong (China); Sze, Henry C.K. [Clinical Oncology Center, University of Hong Kong-Shenzhen Hospital, Shenzhen (China); Zietman, Anthony L. [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States); Hopkins, Kirsten I.; Rosenblatt, Eduardo [International Atomic Energy Agency, Vienna (Austria)

    2016-04-01

    Purpose: This study aimed to estimate the treatment outcome of nasopharyngeal cancer (NPC) across the world and its correlation with access to radiation therapy (RT). Methods and Materials: The age-standardized mortality (ASM) and age-standardized incidence (ASI) rates of NPC from GLOBOCAN (2012) were summarized, and [1−(ASM/ASI)] was computed to give the proxy relative survival (RS). Data from the International Atomic Energy Agency (IAEA) and the World Bank were used to assess the availability of RT in surrogate terms: the number of RT equipment units and radiation oncologists per million population. Results: A total of 112 countries with complete valid data were analyzed, and the proxy RS varied widely from 0% to 83% (median, 50%). Countries were categorized into Good, Median, and Poor outcome groups on the basis of their proxy RS (>55%, 45%-55%, and <45%, respectively). Eighty percent of new cases occurred in the Poor outcome group. Univariable linear regression showed a significant correlation between outcome and the availability of RT: proxy RS increased by 3.4% (P<.001) and 1.5% (P=.001) per unit increase in RT equipment and oncologists per million population, respectively. The median number of RT equipment units per million population increased significantly from 0.5 in the Poor, to 1.5 in the Median, to 4.6 in the Good outcome groups, and the corresponding number of oncologists increased from 1.1 to 3.3 to 7.1 (P<.001). Conclusions: Nasopharyngeal cancer is a highly treatable disease, but the outcome varies widely across the world. The current study shows a significant correlation between survival and access to RT based on available surrogate indicators. However, the possible reasons for poor outcome are likely to be multifactorial and complex. Concerted international efforts are needed not only to address the fundamental requirement for adequate RT access but also to obtain more comprehensive and accurate data for research to improve cancer outcomes.
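The survival proxy used in this study is simple to compute from the two GLOBOCAN rates. A minimal sketch; the example rates below are illustrative, not values from the paper:

```python
def proxy_relative_survival(asm, asi):
    """Proxy relative survival from age-standardized mortality (ASM) and
    incidence (ASI) rates: RS = 1 - ASM/ASI, expressed as a percentage."""
    if asi <= 0:
        raise ValueError("incidence rate must be positive")
    return 100.0 * (1.0 - asm / asi)

# Illustrative rates per 100,000: incidence 1.2, mortality 0.6 gives a
# 50% proxy RS, which would fall in the study's Median band (45%-55%).
rs = proxy_relative_survival(asm=0.6, asi=1.2)
```

The proxy relies only on routinely published registry rates, which is why it could be computed for 112 countries without country-level survival data.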

  13. Computer access and Internet use by urban and suburban emergency department customers.

    Science.gov (United States)

    Bond, Michael C; Klemt, Ryan; Merlis, Jennifer; Kopinski, Judith E; Hirshon, Jon Mark

    2012-07-01

    Patients are increasingly using the Internet (43% in 2000 vs. 70% in 2006) to obtain health information, but is there a difference in the ability of urban and suburban emergency department (ED) customers to access the Internet? To assess computer and Internet resources available to and used by people waiting to be seen in an urban ED and a suburban ED. Individuals waiting in the ED were asked survey questions covering demographics, type of insurance, access to a primary care provider, reason for their ED visit, computer access, and ability to access the Internet for health-related matters. There were 304 individuals who participated, 185 in the urban ED and 119 in the suburban ED. Urban subjects were more likely than suburban to be women, black, have low household income, and were less likely to have insurance. The groups were similar in regard to average age, education, and having a primary care physician. Suburban respondents were more likely to own a computer, but the majority in both groups had access to computers and the Internet. Their frequency of accessing the Internet was similar, as were their reasons for using it. Individuals from the urban ED were less willing to schedule appointments via the Internet but more willing to contact their health care provider via e-mail. The groups were equally willing to use the Internet to fill prescriptions and view laboratory results. Urban and suburban ED customers had similar access to the Internet. Both groups were willing to use the Internet to access personal health information. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. NCI Workshop Report: Clinical and Computational Requirements for Correlating Imaging Phenotypes with Genomics Signatures

    Directory of Open Access Journals (Sweden)

    Rivka Colen

    2014-10-01

    The National Cancer Institute (NCI) Cancer Imaging Program organized two related workshops on June 26–27, 2013, entitled “Correlating Imaging Phenotypes with Genomics Signatures Research” and “Scalable Computational Resources as Required for Imaging-Genomics Decision Support Systems.” The first workshop focused on clinical and scientific requirements, exploring our knowledge of the phenotypic characteristics of cancer biology to determine whether the field is sufficiently advanced to correlate them with the imaging phenotypes that underpin genomics and clinical outcomes, and exploring new scientific methods to extract phenotypic features from medical images and relate them to genomics analyses. The second workshop focused on computational methods, exploring the informatics and computational requirements needed to extract phenotypic features from medical images, relate them to genomics analyses, and improve the accessibility and speed of dissemination of existing NIH resources. These workshops linked the clinical and scientific requirements of currently known phenotypic and genotypic cancer biology characteristics with the imaging phenotypes that underpin genomics and clinical outcomes. The group generated a set of recommendations to NCI leadership and the research community that encourage and support development of the emerging radiogenomics research field to address short- and longer-term goals in cancer research.

  15. Youth with cerebral palsy with differing upper limb abilities: how do they access computers?

    Science.gov (United States)

    Davies, T Claire; Chau, Tom; Fehlings, Darcy L; Ameratunga, Shanthi; Stott, N Susan

    2010-12-01

    To identify the current level of awareness of different computer access technologies and the choices made regarding mode of access by youth with cerebral palsy (CP) and their families. Survey. Two tertiary-level rehabilitation centers in New Zealand and Canada. Youth (N=60) with CP, Manual Ability Classification System (MACS) levels I to V, age 13 to 25 years. Not applicable. Questionnaire. Fifty (83%) of the 60 youth were aware of at least 1 available assistive technology (AT), such as touch screens and joysticks. However, only 34 youth (57%) were familiar with the accessibility options currently available in the most common operating systems. Thirty-three (94%) of the 35 youth at MACS levels I and II used a standard mouse and keyboard, while few chose to use assistive technology or accessibility options. In contrast, 10 (40%) of the 25 youth at MACS levels III to V used a variety of assistive technologies such as touch screens, joysticks, trackballs, and scanning technologies. This group also had the highest use of accessibility options, although only 15 (60%) of the 25 were aware of them. Most youth with CP were aware of, and used, assistive technologies to enhance their computer access but were less knowledgeable about accessibility options. Accessibility options allow users to modify their own computer interface and can thus enhance computer access for youth with CP. Clinicians should be knowledgeable enough to give informed advice in this area of computer access, thus ensuring that all youth with CP can benefit from both AT and accessibility options, as required. Copyright © 2010 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  16. Path Not Found: Disparities in Access to Computer Science Courses in California High Schools

    Science.gov (United States)

    Martin, Alexis; McAlear, Frieda; Scott, Allison

    2015-01-01

    "Path Not Found: Disparities in Access to Computer Science Courses in California High Schools" exposes one of the foundational causes of underrepresentation in computing: disparities in access to computer science courses in California's public high schools. This report provides new, detailed data on these disparities by student body…

  17. An Annotated and Cross-Referenced Bibliography on Computer Security and Access Control in Computer Systems.

    Science.gov (United States)

    Bergart, Jeffrey G.; And Others

    This paper represents a careful study of published works on computer security and access control in computer systems. The study includes a selective annotated bibliography of some eighty-five important published results in the field and, based on these papers, analyzes the state of the art. In annotating these works, the authors try to be…

  18. Student Perceived Importance and Correlations of Selected Computer Literacy Course Topics

    Science.gov (United States)

    Ciampa, Mark

    2013-01-01

    Traditional college-level courses designed to teach computer literacy are in a state of flux. Today's students have high rates of access to computing technology and computer ownership, leading many policy decision makers to conclude that students already are computer literate and thus computer literacy courses are dinosaurs in a modern digital…

  19. Does Technical Success of Angioplasty in Dysfunctional Hemodialysis Accesses Correlate with Access Patency?

    Energy Technology Data Exchange (ETDEWEB)

    Sidhu, Arshdeep; Tan, Kong T.; Noel-Lamy, Maxime; Simons, Martin E.; Rajan, Dheeraj K., E-mail: dheeraj.rajan@uhn.ca [University Health Network, University of Toronto, Division of Vascular and Interventional Radiology, Peter Munk Cardiac Center (Canada)

    2016-10-15

    Purpose: To study whether <30% residual stenosis after angioplasty (PTA) correlates with primary access circuit patency, and whether any variables predict technical success. Materials and Methods: A prospective observational study was performed between January 2009 and December 2012, wherein 76 patients underwent 154 PTA events in 56 prosthetic grafts (AVG) and 98 autogenous fistulas (AVF). Data collected included patient age, gender, lesion location and laterality, access type and location, number of prior interventions, and transonic flow rates pre- and postintervention. The impact of technical outcome on access patency was assessed. Univariate logistic regression was used to assess the impact of variables on technical success, with significant factors assessed in a multiple-variable model. Results: Technical success rates of PTA in AVFs and AVGs were 79.6% and 76.7%, respectively. Technical failures of PTA were associated with an increased risk of patency loss among circuits with AVFs (p < 0.05), but not with AVGs (p = 0.7). In AVFs, primary access patency rates between technical successes and failures at 3 and 6 months were 74.4% versus 61.9% (p = 0.3) and 53.8% versus 23.8% (p < 0.05), respectively. In AVGs, primary access patency rates between technical successes and failures at 3 and 6 months were 72.1% versus 53.9% (p = 0.5) and 33.6% versus 38.5% (p = 0.8), respectively. Transonic flow rates did not differ significantly between technically successful and failed outcomes at 1 or 3 months. Conclusion: Technical failures of PTA had a significant impact on access patency among AVFs, with a trend toward poorer access patency within AVGs.

  20. Does Technical Success of Angioplasty in Dysfunctional Hemodialysis Accesses Correlate with Access Patency?

    International Nuclear Information System (INIS)

    Sidhu, Arshdeep; Tan, Kong T.; Noel-Lamy, Maxime; Simons, Martin E.; Rajan, Dheeraj K.

    2016-01-01

    Purpose: To study whether <30% residual stenosis after angioplasty (PTA) correlates with primary access circuit patency, and whether any variables predict technical success. Materials and Methods: A prospective observational study was performed between January 2009 and December 2012, wherein 76 patients underwent 154 PTA events in 56 prosthetic grafts (AVG) and 98 autogenous fistulas (AVF). Data collected included patient age, gender, lesion location and laterality, access type and location, number of prior interventions, and transonic flow rates pre- and postintervention. The impact of technical outcome on access patency was assessed. Univariate logistic regression was used to assess the impact of variables on technical success, with significant factors assessed in a multiple-variable model. Results: Technical success rates of PTA in AVFs and AVGs were 79.6% and 76.7%, respectively. Technical failures of PTA were associated with an increased risk of patency loss among circuits with AVFs (p < 0.05), but not with AVGs (p = 0.7). In AVFs, primary access patency rates between technical successes and failures at 3 and 6 months were 74.4% versus 61.9% (p = 0.3) and 53.8% versus 23.8% (p < 0.05), respectively. In AVGs, primary access patency rates between technical successes and failures at 3 and 6 months were 72.1% versus 53.9% (p = 0.5) and 33.6% versus 38.5% (p = 0.8), respectively. Transonic flow rates did not differ significantly between technically successful and failed outcomes at 1 or 3 months. Conclusion: Technical failures of PTA had a significant impact on access patency among AVFs, with a trend toward poorer access patency within AVGs.

  1. Growth rate correlates negatively with protein turnover in Arabidopsis accessions.

    Science.gov (United States)

    Ishihara, Hirofumi; Moraes, Thiago Alexandre; Pyl, Eva-Theresa; Schulze, Waltraud X; Obata, Toshihiro; Scheffel, André; Fernie, Alisdair R; Sulpice, Ronan; Stitt, Mark

    2017-08-01

    Previous studies with Arabidopsis accessions revealed that biomass correlates negatively with dusk starch content and total protein, and positively with the maximum activities of enzymes in photosynthesis. We hypothesized that large accessions have lower ribosome abundance and lower rates of protein synthesis, and that this is compensated by lower rates of protein degradation. This would increase growth efficiency and allow more investment in photosynthetic machinery. We analysed ribosome abundance and polysome loading in 19 accessions, modelled the rates of protein synthesis and compared them with the observed rate of growth. Large accessions contained fewer ribosomes than small accessions, due mainly to cytosolic ribosome abundance falling at night in large accessions. The modelled rates of protein synthesis resembled those required for growth in large accessions, but were up to 30% in excess in small accessions. We then employed 13CO2 pulse-chase labelling to measure the rates of protein synthesis and degradation in 13 accessions. Small accessions had a slightly higher rate of protein synthesis and much higher rates of protein degradation than large accessions. Protein turnover was negligible in large accessions but equivalent to up to 30% of synthesised protein per day in small accessions. We discuss to what extent the decrease in growth in small accessions can be quantitatively explained by known costs of protein turnover and what factors may lead to the altered diurnal dynamics and increase of ribosome abundance in small accessions, and propose that there is a trade-off between protein turnover and maximisation of growth rate. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.
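The budget described above, synthesis partly cancelled by degradation, can be made concrete with a toy first-order calculation; the rate constants below are illustrative, not measurements from the paper:

```python
def net_growth_rate(k_synthesis, k_degradation):
    """Net relative growth rate when part of newly synthesised protein
    is turned over again (toy first-order model, rates per day)."""
    return k_synthesis - k_degradation

def turnover_fraction(k_synthesis, k_degradation):
    """Share of synthesised protein lost to degradation."""
    return k_degradation / k_synthesis

# A 'small accession' that degrades 30% of what it synthesises each day
# grows more slowly than a 'large accession' with negligible turnover,
# even though its synthesis rate is slightly higher.
small = net_growth_rate(0.13, 0.13 * 0.30)  # 0.091 per day
large = net_growth_rate(0.12, 0.0)          # 0.120 per day
```

This is the quantitative sense in which turnover trades off against growth rate: the degradation term subtracts directly from the achievable relative growth rate.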

  2. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing.

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-07-24

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the volume of data are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively relieve the bottlenecks of data transmission and data storage. However, security and privacy challenges also arise in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient.
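The verifiability idea, checking work that an untrusted fog node performed on your behalf, can be illustrated with a deliberately simplified hash-commitment check. This is a generic sketch, not the actual VO-MAACS verification method, and all function names are invented for illustration:

```python
import hashlib

def commit(data: bytes) -> str:
    """Commitment published alongside the outsourced task."""
    return hashlib.sha256(data).hexdigest()

def fog_transform(data: bytes) -> bytes:
    """Stand-in for the outsourced computation (here a trivial transform).
    In a real scheme this would be the heavy ABE transformation work."""
    return data[::-1]

def verify(result: bytes, expected_commitment: str) -> bool:
    """User-side check: recompute the commitment over the returned result."""
    return commit(result) == expected_commitment

# For illustration we precompute the commitment ourselves; in a real scheme
# it would be derivable without the user redoing the outsourced work.
payload = b"ciphertext-component"
expected = commit(fog_transform(payload))
honest_ok = verify(fog_transform(payload), expected)  # honest fog node passes
tampered_ok = verify(b"tampered", expected)           # misbehaving node fails
```

The point mirrored from the abstract is that verification (one hash) is far cheaper for the end user than the outsourced computation it certifies.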

  3. A Secure and Verifiable Outsourced Access Control Scheme in Fog-Cloud Computing

    Science.gov (United States)

    Fan, Kai; Wang, Junxiong; Wang, Xin; Li, Hui; Yang, Yintang

    2017-01-01

    With the rapid development of big data and the Internet of Things (IoT), the number of networked devices and the volume of data are increasing dramatically. Fog computing, which extends cloud computing to the edge of the network, can effectively relieve the bottlenecks of data transmission and data storage. However, security and privacy challenges also arise in the fog-cloud computing environment. Ciphertext-policy attribute-based encryption (CP-ABE) can be adopted to realize data access control in fog-cloud computing systems. In this paper, we propose a verifiable outsourced multi-authority access control scheme, named VO-MAACS. In our construction, most encryption and decryption computations are outsourced to fog devices, and the computation results can be verified using our verification method. Meanwhile, to address the revocation issue, we design an efficient user and attribute revocation method. Finally, analysis and simulation results show that our scheme is both secure and highly efficient. PMID:28737733

  4. Prescription for trouble: Medicare Part D and patterns of computer and internet access among the elderly.

    Science.gov (United States)

    Wright, David W; Hill, Twyla J

    2009-01-01

    The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 specifically encourages Medicare enrollees to use the Internet to obtain information regarding the new prescription drug insurance plans and to enroll in a plan. This reliance on computer technology and the Internet leads to practical questions regarding implementation of the insurance coverage. For example, it seems unlikely that all Medicare enrollees have access to computers and the Internet or that they are all computer literate. This study uses the 2003 Current Population Survey to examine the effects of disability and income on computer access and Internet use among the elderly. Internet access declines with age and is exacerbated by disabilities. Also, decreases in income lead to decreases in computer ownership and use. Therefore, providing prescription drug coverage primarily through the Internet seems likely to maintain or increase stratification of access to health care, especially for low-income, disabled elderly, who are also a group most in need of health care access.

  5. No Special Equipment Required: The Accessibility Features Built into the Windows and Macintosh Operating Systems make Computers Accessible for Students with Special Needs

    Science.gov (United States)

    Kimball, Walter H.; Cohen, Libby G.; Dimmick, Deb; Mills, Rick

    2003-01-01

    The proliferation of computers and other electronic learning devices has made knowledge and communication accessible to people with a wide range of abilities. Both Windows and Macintosh computers have accessibility options to help with many different special needs. This document discusses solutions for: (1) visual impairments; (2) hearing…

  6. Secure Data Access Control for Fog Computing Based on Multi-Authority Attribute-Based Signcryption with Computation Outsourcing and Attribute Revocation.

    Science.gov (United States)

    Xu, Qian; Tan, Chengxiang; Fan, Zhijie; Zhu, Wenye; Xiao, Ya; Cheng, Fujia

    2018-05-17

    Nowadays, fog computing provides computation, storage, and application services to end users in the Internet of Things. One of the major concerns in fog computing systems is how fine-grained access control can be imposed. As a logical combination of attribute-based encryption and attribute-based signature, attribute-based signcryption (ABSC) can provide confidentiality and anonymous authentication for sensitive data and is more efficient than the traditional "encrypt-then-sign" or "sign-then-encrypt" strategies. Thus, ABSC is suitable for fine-grained access control in a semi-trusted cloud environment and has been gaining more and more attention recently. However, in many existing ABSC systems, the computation cost required of end users for signcryption and designcryption scales linearly with the complexity of the signing and encryption access policies. Moreover, previously proposed ABSC schemes have only a single authority responsible for attribute management and key generation, whereas in reality a user's attributes are usually monitored by different authorities. In this paper, we propose OMDAC-ABSC, a novel data access control scheme based on ciphertext-policy ABSC, to provide data confidentiality, fine-grained control, and anonymous authentication in a multi-authority fog computing system. The signcryption and designcryption overhead for the user is significantly reduced by outsourcing the undesirable computation operations to fog nodes. The proposed scheme is proven secure in the standard model and can provide attribute revocation and public verifiability. The security analysis, asymptotic complexity comparison, and implementation results indicate that our construction balances the security goals with practical efficiency in computation.

  7. Computer network access to scientific information systems for minority universities

    Science.gov (United States)

    Thomas, Valerie L.; Wakim, Nagi T.

    1993-08-01

    The evolution of computer networking technology has led to the establishment of a massive networking infrastructure that interconnects various types of computing resources at many government, academic, and corporate institutions. A large segment of this infrastructure has been developed to facilitate information exchange and resource sharing within the scientific community. The National Aeronautics and Space Administration (NASA) supports both the development and the application of computer networks which provide its community with access to many valuable multi-disciplinary scientific information systems and on-line databases. Recognizing the need to extend the benefits of this advanced networking technology to the under-represented community, the National Space Science Data Center (NSSDC) in the Space Data and Computing Division at the Goddard Space Flight Center has developed the Minority University-Space Interdisciplinary Network (MU-SPIN) Program: a major networking and education initiative for Historically Black Colleges and Universities (HBCUs) and Minority Universities (MUs). In this paper, we briefly explain the various components of the MU-SPIN Program while highlighting how, by providing access to scientific information systems and on-line data, it promotes a higher level of collaboration among faculty, students, and NASA scientists.

  8. Change in Computer Access and the Academic Achievement of Immigrant Children

    Science.gov (United States)

    Moon, Ui Jeong; Hofferth, Sandra

    2018-01-01

    Background/Context: Increased interest in the correlates of media devices available to children has led to research indicating that access to and use of technology are positively associated with children's academic achievement. However, the digital divide remains; not all children have access to digital technologies, and not all children can…

  9. “Future Directions”: m-government computer systems accessed via cloud computing – advantages and possible implementations

    OpenAIRE

    Daniela LIŢAN

    2015-01-01

    In recent years, the activities of companies and public administrations have been automated and adapted to current information systems. In this paper, I therefore present and exemplify the benefits of developing and implementing m-government computer systems (which can be accessed from mobile devices and which are specific to the workflow of public administrations), starting from the "experience" of e-government systems implemented in the context of their access and usage through ...

  10. Effect of correlated decay on fault-tolerant quantum computation

    Science.gov (United States)

    Lemberger, B.; Yavuz, D. D.

    2017-12-01

    We analyze noise in the circuit model of quantum computers when the qubits are coupled to a common bosonic bath and discuss the possible failure of scalability of quantum computation. Specifically, we investigate correlated (super-radiant) decay between the qubit energy levels in a two- or three-dimensional array of qubits, without imposing any restrictions on the size of the sample. We first show that, regardless of how the spacing between the qubits compares with the emission wavelength, correlated decay produces errors outside the applicability of the threshold theorem. This is because the sum of the norms of the two-body interaction Hamiltonians (which can be viewed as the upper bound on the single-qubit error) that decoheres each qubit scales with the total number of qubits and is unbounded. We then discuss two related results: (1) We show that the actual error (instead of the upper bound) on each qubit scales with the number of qubits. As a result, in the limit of a large number of qubits in the computer, N → ∞, correlated decay causes each qubit in the computer to decohere on ever shorter time scales. (2) We find the complete eigenvalue spectrum of the exchange Hamiltonian that causes correlated decay in the same limit. We show that the spread of the eigenvalue distribution grows faster with N than the spectrum of the unperturbed system Hamiltonian. As a result, as N → ∞, quantum evolution becomes completely dominated by the noise due to correlated decay. These results argue that scalable quantum computing may not be possible in the circuit model in a two- or three-dimensional geometry when the qubits are coupled to a common bosonic bath.
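The spectral claim in (2), that the exchange interaction's eigenvalue spread grows with the number of qubits, can be checked on the simplest toy case: a uniform all-to-all coupling matrix J with J_ij = 1 for i ≠ j and zeros on the diagonal. Its eigenvalues are N−1 (on the uniform vector) and −1 (on any zero-sum vector), so the spread equals N. A pure-Python check; this uniform-coupling matrix is an illustrative stand-in, not the paper's Hamiltonian:

```python
def apply_exchange(vec):
    """Apply J (all-ones off-diagonal, zero diagonal) to a vector:
    (J v)_i = sum(v) - v_i."""
    s = sum(vec)
    return [s - x for x in vec]

def eigenvalue_on(vec):
    """Return lambda if vec is an eigenvector of J; assert otherwise."""
    out = apply_exchange(vec)
    ratios = {round(o / v, 9) for o, v in zip(out, vec) if v != 0}
    assert len(ratios) == 1, "not an eigenvector"
    return ratios.pop()

for n in (4, 16, 64):
    top = eigenvalue_on([1.0] * n)                          # N - 1
    bottom = eigenvalue_on([1.0, -1.0] + [0.0] * (n - 2))   # -1
    spread = top - bottom                                   # equals n
```

The spread N grows without bound as qubits are added, which is the toy analogue of the unbounded collective-decay noise the abstract describes.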

  11. Experimental detection of nonclassical correlations in mixed-state quantum computation

    International Nuclear Information System (INIS)

    Passante, G.; Moussa, O.; Trottier, D. A.; Laflamme, R.

    2011-01-01

    We report on an experiment to detect nonclassical correlations in a highly mixed state. The correlations are characterized by the quantum discord and are observed using four qubits in a liquid-state nuclear magnetic resonance quantum information processor. The state analyzed is the output of a DQC1 computation, whose input is a single quantum bit accompanied by n maximally mixed qubits. This model of computation outperforms the best known classical algorithms and, although it contains vanishing entanglement, it is known to have quantum correlations characterized by the quantum discord. This experiment detects nonvanishing quantum discord, ensuring the existence of nonclassical correlations as measured by the quantum discord.

  12. An accessibility solution of cloud computing by solar energy

    Directory of Open Access Journals (Sweden)

    Zuzana Priščáková

    2013-01-01

    Cloud computing is a modern, innovative approach to data storage, data processing, company infrastructure building, and similar problems. Many companies worry about the changes that implementing this solution brings, because these changes could have a negative impact on the company or, in the case of newly established companies, because of the unfamiliar environment. Data accessibility, integrity, and security are among the basic problems of cloud computing. The aim of this paper is to propose, and scientifically support, a solution that improves cloud availability by using solar energy as the primary power source. Availability problems arise from power failures, during which data may be stolen or lost. Since a cloud typically runs on servers, those servers depend strongly on mains power. Modern conditions offer a more innovative solution that is both ecological and economical for the company. The Sun, as a steady source of energy, makes it possible to produce the necessary power with solar techniques such as solar panels. Connecting a solar panel as the primary energy source for a server would remove its dependence on mains power, together with the associated failures; the grid would remain as a secondary source. Such an ecological solution would also influence the economics of the company, because energy consumption from the grid would be lower. Besides the proposed availability solution, this paper includes a physical and mathematical treatment of the solar energy incident on the Earth, a calculation of the panel size by the cosine method, and a simulation of these calculations in MATLAB.
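The panel-sizing idea rests on the cosine law of incidence: collected power scales with the cosine of the angle between the sun direction and the panel normal. A small sketch of both directions of the calculation; the irradiance, efficiency, and server load figures are illustrative, not values from the paper:

```python
import math

def panel_power(irradiance_w_m2, area_m2, incidence_deg, efficiency):
    """Electrical power from a flat panel via the cosine law of incidence."""
    cos_theta = math.cos(math.radians(incidence_deg))
    if cos_theta <= 0:
        return 0.0  # sun at or behind the panel plane
    return irradiance_w_m2 * area_m2 * cos_theta * efficiency

def area_for_load(load_w, irradiance_w_m2, incidence_deg, efficiency):
    """Invert the same relation to size a panel for a given server load."""
    per_m2 = panel_power(irradiance_w_m2, 1.0, incidence_deg, efficiency)
    return load_w / per_m2

# Sizing a hypothetical 300 W server at 1000 W/m^2 irradiance,
# 30 degrees incidence, and 18% panel efficiency:
area = area_for_load(300.0, 1000.0, 30.0, 0.18)
```

In practice the worst-case (largest) incidence angle over the operating period would drive the sizing, since the server load must be met continuously.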

  13. Do Your School Policies Provide Equal Access to Computers? Are You Sure?

    Science.gov (United States)

    DuBois, Phyllis A.; Schubert, Jane G.

    1986-01-01

    Outlines how school policies can unintentionally perpetuate gender discrimination in student computer use and access. Describes four areas of administrative policies that can cause inequities and provides ways for administrators to counteract these policies. Includes discussion of a program to balance computer use, and an abstract of an article…

  14. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems.

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

Today, owing to developing communication technologies, computer games and other audio-visual media are, as social phenomena, very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents gives rise to public concern about their possible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. This was a descriptive-correlational study of 384 randomly chosen male guidance-school students, who were asked to complete the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). The results of this study indicated a direct significant correlation (at about the 95% confidence level) between the amount of time adolescents spent playing games and anxiety/depression, withdrawal/depression, rule-breaking behavior, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game use and physical complaints, thinking problems, or attention problems. In addition, there was a significant correlation between the students' place of residence and their parents' jobs, and their use of computer games. Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents.
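The statistic underlying such a correlational study can be computed directly. Below is a generic sample Pearson-correlation sketch with invented example scores, not the study's data:

```python
import math

def pearson_r(x, y):
    # sample Pearson correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical scores: weekly hours of play vs. an aggression scale
hours = [1, 2, 4, 6, 8, 10]
aggression = [3, 4, 6, 9, 11, 12]
r = pearson_r(hours, aggression)   # r ≈ 0.99 for this toy data
```

A survey study would additionally test r against a null distribution (e.g. a t-test with n − 2 degrees of freedom) before claiming significance.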

  15. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    Science.gov (United States)

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

Background Today, owing to developing communication technologies, computer games and other audio-visual media are, as social phenomena, very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents gives rise to public concern about their possible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. Methods This was a descriptive-correlational study of 384 randomly chosen male guidance-school students, who were asked to complete the researchers' questionnaire about computer games and Achenbach's Youth Self-Report (YSR). Findings The results of this study indicated a direct significant correlation (at about the 95% confidence level) between the amount of time adolescents spent playing games and anxiety/depression, withdrawal/depression, rule-breaking behavior, aggression, and social problems. However, there was no statistically significant correlation between the amount of computer game use and physical complaints, thinking problems, or attention problems. In addition, there was a significant correlation between the students' place of residence and their parents' jobs, and their use of computer games. Conclusion Computer games lead to anxiety, depression, withdrawal, rule-breaking behavior, aggression, and social problems in adolescents. PMID:24494157

  16. Efficient Reduction of Access Latency through Object Correlations in Virtual Environments

    Directory of Open Access Journals (Sweden)

    Damon Shing-Min Liu

    2007-01-01

Full Text Available Object correlations are common semantic patterns in virtual environments. They can be exploited to improve the effectiveness of storage caching, prefetching, data layout, and disk scheduling. However, few approaches exist for discovering object correlations in virtual environments to improve the performance of storage systems. Because a virtual environment is an interactive, feedback-driven paradigm, it is critical that users receive responses to their navigation requests with little or no time lag. We therefore propose a class of view-based projection-generation methods for mining various frequent sequential traversal patterns in virtual environments. The frequent sequential traversal patterns are used to predict user navigation behavior and, through a clustering scheme, help reduce disk access time by placing correlated patterns together in disk blocks. Finally, the effectiveness of these schemes is shown through simulation, demonstrating that the proposed techniques not only significantly cut disk access time but also enhance the accuracy of data prefetching.
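The transition-counting idea behind such traversal-pattern mining can be sketched with a simple first-order model. The function names and toy traversal logs below are hypothetical illustrations, not the paper's actual method:

```python
from collections import Counter, defaultdict

def build_model(traversals):
    # count object-to-object transitions observed in recorded traversals
    model = defaultdict(Counter)
    for seq in traversals:
        for cur, nxt in zip(seq, seq[1:]):
            model[cur][nxt] += 1
    return model

def predict_next(model, obj):
    # prefetch candidate: the most frequent successor of `obj`, if any
    followers = model.get(obj)
    return followers.most_common(1)[0][0] if followers else None

# toy traversal logs (hypothetical scene-object names)
logs = [["gate", "hall", "statue"],
        ["gate", "hall", "fountain"],
        ["gate", "hall", "statue"]]
model = build_model(logs)
```

Here `predict_next(model, "hall")` returns `"statue"`, the object most often visited next, which a storage layer could prefetch while the user is still viewing `"hall"`. The paper's view-based mining works on longer frequent sequences rather than single transitions, but the prediction principle is the same.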

  17. Paging memory from random access memory to backing storage in a parallel computer

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Inglett, Todd A; Ratterman, Joseph D; Smith, Brian E

    2013-05-21

    Paging memory from random access memory (`RAM`) to backing storage in a parallel computer that includes a plurality of compute nodes, including: executing a data processing application on a virtual machine operating system in a virtual machine on a first compute node; providing, by a second compute node, backing storage for the contents of RAM on the first compute node; and swapping, by the virtual machine operating system in the virtual machine on the first compute node, a page of memory from RAM on the first compute node to the backing storage on the second compute node.

  18. Distributed Database Access in the LHC Computing Grid with CORAL

    CERN Document Server

    Molnár, Z; Düllmann, D; Giacomo, G; Kalkhof, A; Valassi, A; CERN. Geneva. IT Department

    2009-01-01

The CORAL package is the LCG Persistency Framework foundation for accessing relational databases. From the start, CORAL has been designed to facilitate the deployment of the LHC experiment database applications in a distributed computing environment. In particular we cover: improvements to database service scalability through client connection management; platform-independent, multi-tier scalable database access through connection multiplexing and caching; and a secure authentication and authorisation scheme integrated with existing grid services. We summarize the deployment experience from several experiment productions using the distributed database infrastructure now available in LCG. Finally, we present perspectives for future developments in this area.

  19. SmartVeh: Secure and Efficient Message Access Control and Authentication for Vehicular Cloud Computing.

    Science.gov (United States)

    Huang, Qinlong; Yang, Yixian; Shi, Yuxiang

    2018-02-24

With the growing number of vehicles and popularity of various services in vehicular cloud computing (VCC), message exchanging among vehicles under traffic conditions and in emergency situations is one of the most pressing demands, and has attracted significant attention. However, it is an important challenge to authenticate the legitimate sources of broadcast messages and achieve fine-grained message access control. In this work, we propose SmartVeh, a secure and efficient message access control and authentication scheme in VCC. A hierarchical attribute-based encryption technique is utilized to achieve fine-grained and flexible message sharing, which ensures that vehicles whose persistent or dynamic attributes satisfy the access policies can access the broadcast message with equipped on-board units (OBUs). Message authentication is enforced by integrating an attribute-based signature, which achieves message authentication and maintains the anonymity of the vehicles. In order to reduce the computations of the OBUs in the vehicles, we outsource the heavy computations of encryption, decryption and signing to a cloud server and road-side units. The theoretical analysis and simulation results reveal that our secure and efficient scheme is suitable for VCC.

  20. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

Usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for the hybrid computer installed at JAERI. Functions of the hybrid computer and its terminal devices are used ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data, and to keep the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed as figures, and hard copies are taken when necessary; series messages from the code are shown on the terminal, so man-machine communication is possible; and data can also be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)

  1. The ENSDF radioactivity data base for IBM-PC and computer network access

    International Nuclear Information System (INIS)

    Ekstroem, P.; Spanier, L.

    1989-08-01

    A data base system for radioactivity gamma rays is described. A base with approximately 15000 gamma rays from 2777 decays is available for installation on the hard disk of a PC, and a complete system with approximately 73000 gamma rays is available for on-line access via the NORDic University computer NETwork (NORDUNET) and the Swedish University computer NETwork (SUNET)

  2. Forecasting Model for Network Throughput of Remote Data Access in Computing Grids

    CERN Document Server

    Begy, Volodimir; The ATLAS collaboration

    2018-01-01

Computing grids are one of the key enablers of eScience. Researchers from many fields (e.g. High Energy Physics, Bioinformatics, Climatology, etc.) employ grids to run computational jobs in a highly distributed manner. The current state-of-the-art approach for data access in the grid is data placement: a job is scheduled to run at a specific data center, and its execution starts only when the complete input data has been transferred there. This approach has two major disadvantages: (1) the jobs stay idle while waiting for the input data; (2) due to limited infrastructure resources, the distributed data management system handling the data placement may queue transfers for up to several days. An alternative approach is remote data access: a job may stream the input data directly from storage elements, which may be located at local or remote data centers. Remote data access brings two innovative benefits: (1) the jobs can be executed asynchronously with respect to the data transfer; (2) when combined...

  3. Lifelong mobile learning: Increasing accessibility and flexibility with tablet computers and ebooks

    OpenAIRE

    Kalz, Marco

    2011-01-01

    Kalz, M. (2011, 1 September). Lifelong mobile learning: Increasing accessibility and flexibility with tablet computers and ebooks. Presentation provided during the opening ceremony of the iPad pilot for schakelzone rechten, Utrecht, The Netherlands.

  4. Access to and use of computers among clinical dental students of ...

    African Journals Online (AJOL)

Access to and use of computers among clinical dental students of the University of Lagos. PO Ayanbadejo, OO Sofola, OG Uti. No Abstract.

  5. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  6. Paracoccidioidomycosis: High-resolution computed tomography-pathologic correlation

    International Nuclear Information System (INIS)

    Marchiori, Edson; Valiante, Paulo Marcos; Mano, Claudia Mauro; Zanetti, Glaucia; Escuissato, Dante L.; Souza, Arthur Soares; Capone, Domenico

    2011-01-01

    Objective: The purpose of this study was to describe the high-resolution computed tomography (HRCT) features of pulmonary paracoccidioidomycosis and to correlate them with pathologic findings. Methods: The study included 23 adult patients with pulmonary paracoccidioidomycosis. All patients had undergone HRCT, and the images were retrospectively analyzed by two chest radiologists, who reached decisions by consensus. An experienced lung pathologist reviewed all pathological specimens. The HRCT findings were correlated with histopathologic data. Results: The predominant HRCT findings included areas of ground-glass opacities, nodules, interlobular septal thickening, airspace consolidation, cavitation, and fibrosis. The main pathological features consisted of alveolar and interlobular septal inflammatory infiltration, granulomas, alveolar exudate, cavitation secondary to necrosis, and fibrosis. Conclusion: Paracoccidioidomycosis can present different tomography patterns, which can involve both the interstitium and the airspace. These abnormalities can be pathologically correlated with inflammatory infiltration, granulomatous reaction, and fibrosis.

  7. Evaluation of External Memory Access Performance on a High-End FPGA Hybrid Computer

    Directory of Open Access Journals (Sweden)

    Konstantinos Kalaitzis

    2016-10-01

Full Text Available The motivation of this research was to evaluate the main-memory performance of a hybrid supercomputer such as the Convey HC-x and to ascertain how the memory controller performs under several access scenarios, vis-à-vis hand-coded memory prefetches. Such memory patterns are very useful in stencil computations. The theoretical memory bandwidth of the Convey is compared with the results of our measurements. An accurate study of the memory subsystem is particularly useful for users developing their own application-specific personality. Experiments were performed to measure the bandwidth between the coprocessor and the memory subsystem, aimed mainly at measuring the read access speed of the memory from the Application Engines (FPGAs). Different ways of accessing data were used in order to find the most efficient way to access memory, and this way is proposed for future work on the Convey HC-x. When performing a series of memory accesses, non-uniform latencies occur, which the memory controller of the Convey HC-x coprocessor attempts to hide. We measure memory efficiency as the ratio of the number of memory accesses to the number of execution cycles; this ratio converges to one in most cases. In addition, we performed experiments with hand-coded memory accesses. The analysis of the experimental results shows how the memory subsystem and memory controllers work. From this work we conclude that the memory controllers do an excellent job, largely because (transparently to the user) they appear to cache large amounts of data, so hand-coding is not needed in most situations.
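As a rough host-side analogue of such a bandwidth measurement (this is an illustrative Python sketch, not the paper's FPGA-side instrumentation, and the working-set size is an arbitrary assumption), sequential read bandwidth can be estimated by timing full passes over an array:

```python
import time
import numpy as np

def read_bandwidth_bytes_per_s(arr, iters=10):
    # sequentially read the whole array `iters` times; arr.sum() forces
    # every element to be fetched from memory (or cache)
    t0 = time.perf_counter()
    acc = 0.0
    for _ in range(iters):
        acc += arr.sum()
    dt = time.perf_counter() - t0
    return arr.nbytes * iters / dt, acc

data = np.ones(1 << 20, dtype=np.float64)   # 8 MiB working set (assumed size)
bw, total = read_bandwidth_bytes_per_s(data)
```

Varying the working-set size around the cache capacities, or replacing the sequential pass with strided indexing, exposes the latency non-uniformities the abstract describes; a hardware study would additionally count controller-level accesses per cycle.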

  8. 76 FR 37111 - Access to Confidential Business Information by Computer Sciences Corporation and Its Identified...

    Science.gov (United States)

    2011-06-24

    ... Business Information by Computer Sciences Corporation and Its Identified Subcontractors AGENCY: Environmental Protection Agency (EPA). ACTION: Notice. SUMMARY: EPA has authorized its contractor, Computer Sciences Corporation of Chantilly, VA and Its Identified Subcontractors, to access information which has...

  9. Encapsulating peritonitis: computed tomography and surgical correlation

    Energy Technology Data Exchange (ETDEWEB)

    Kadow, Juliana Santos; Fingerhut, Carla Jeronimo Peres; Fernandes, Vinicius de Barros; Coradazzi, Klaus Rizk Stuhr; Silva, Lucas Marciel Soares; Penachim, Thiago Jose, E-mail: vinicius.barros.fernandes@gmail.com [Pontificia Universidade Catolica de Campinas (PUC-Campinas), Campinas, SP (Brazil). Hospital e Maternidade Celso Pierro

    2014-07-15

    Sclerosing encapsulating peritonitis is a rare and frequently severe entity characterized by total or partial involvement of small bowel loops by a membrane of fibrous tissue. The disease presents with nonspecific clinical features of intestinal obstruction, requiring precise imaging diagnosis to guide the treatment. The present report emphasizes the importance of computed tomography in the diagnosis of this condition and its confirmation by surgical correlation. (author)

  10. Understanding and Improving Blind Students' Access to Visual Information in Computer Science Education

    Science.gov (United States)

    Baker, Catherine M.

Teaching people with disabilities tech skills empowers them to create solutions to problems they encounter and prepares them for careers. However, computer science is typically taught in a highly visual manner which can present barriers for people who are blind. The goal of this dissertation is to understand and decrease those barriers. The first projects I present looked at the barriers that blind students face. I first present the results of my survey and interviews with blind students with degrees in computer science or related fields. This work highlighted the many barriers that these blind students faced. I then followed up on one of the barriers mentioned, access to technology, by doing a preliminary accessibility evaluation of six popular integrated development environments (IDEs) and code editors. I found that half were unusable and all had some inaccessible portions. As access to visual information is a barrier in computer science education, I present three projects I have done to decrease this barrier. The first project is Tactile Graphics with a Voice (TGV). This project investigated an alternative to Braille labels for those who do not know Braille and showed that TGV was a potential alternative. The next project was StructJumper, which created a modified abstract syntax tree that blind programmers could use to navigate through code with their screen reader. The evaluation showed that users could navigate more quickly and easily determine the relationships of lines of code when they were using StructJumper compared to when they were not. Finally, I present a tool for dynamic graphs (the type with nodes and edges) which had two different modes for handling focus changes when moving between graphs. I found that the modes support different approaches for exploring the graphs and therefore preferences are mixed based on the user's preferred approach. However, both modes had similar accuracy in completing the tasks. These projects are a first step towards

  11. On the computation of molecular surface correlations for protein docking using fourier techniques.

    Science.gov (United States)

    Sakk, Eric

    2007-08-01

    The computation of surface correlations using a variety of molecular models has been applied to the unbound protein docking problem. Because of the computational complexity involved in examining all possible molecular orientations, the fast Fourier transform (FFT) (a fast numerical implementation of the discrete Fourier transform (DFT)) is generally applied to minimize the number of calculations. This approach is rooted in the convolution theorem which allows one to inverse transform the product of two DFTs in order to perform the correlation calculation. However, such a DFT calculation results in a cyclic or "circular" correlation which, in general, does not lead to the same result as the linear correlation desired for the docking problem. In this work, we provide computational bounds for constructing molecular models used in the molecular surface correlation problem. The derived bounds are then shown to be consistent with various intuitive guidelines previously reported in the protein docking literature. Finally, these bounds are applied to different molecular models in order to investigate their effect on the correlation calculation.
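The circular-vs-linear distinction the abstract describes is easy to demonstrate numerically. The sketch below (in Python/NumPy; an illustration of the general principle, not code from the paper) shows that zero-padding the transforms to length at least 2n − 1 recovers the linear correlation, while unpadded transforms yield the cyclic one:

```python
import numpy as np

def linear_correlation_fft(a, b):
    # zero-pad to len(a)+len(b)-1 so the cyclic wrap-around of the
    # DFT product cannot alias into the result (convolution theorem)
    n = len(a) + len(b) - 1
    A = np.fft.rfft(a, n)
    B = np.fft.rfft(b[::-1], n)    # correlation = convolution with reversed b
    return np.fft.irfft(A * B, n)

a = np.array([1.0, 2.0, 3.0, 4.0])
b = np.array([0.0, 1.0, 0.5, 2.0])
lin = linear_correlation_fft(a, b)   # matches np.correlate(a, b, mode='full')

# Unpadded transforms give the *circular* correlation, which differs:
circ = np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b[::-1]), len(a))
```

For the docking application, the padded transform length is what drives the grid-size bounds the paper derives: the molecular grids must be embedded in a box large enough that the cyclic wrap-around cannot contaminate the scores of valid relative orientations.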

  12. Survey of Canadian Myotonic Dystrophy Patients' Access to Computer Technology.

    Science.gov (United States)

    Climans, Seth A; Piechowicz, Christine; Koopman, Wilma J; Venance, Shannon L

    2017-09-01

    Myotonic dystrophy type 1 is an autosomal dominant condition affecting distal hand strength, energy, and cognition. Increasingly, patients and families are seeking information online. An online neuromuscular patient portal under development can help patients access resources and interact with each other regardless of location. It is unknown how individuals living with myotonic dystrophy interact with technology and whether barriers to access exist. We aimed to characterize technology use among participants with myotonic dystrophy and to determine whether there is interest in a patient portal. Surveys were mailed to 156 participants with myotonic dystrophy type 1 registered with the Canadian Neuromuscular Disease Registry. Seventy-five participants (60% female) responded; almost half were younger than 46 years. Most (84%) used the internet; almost half of the responders (47%) used social media. The complexity and cost of technology were commonly cited reasons not to use technology. The majority of responders (76%) were interested in a myotonic dystrophy patient portal. Patients in a Canada-wide registry of myotonic dystrophy have access to and use technology such as computers and mobile phones. These patients expressed interest in a portal that would provide them with an opportunity to network with others with myotonic dystrophy and to access information about the disease.

  13. Neuroanatomical correlates of brain-computer interface performance.

    Science.gov (United States)

    Kasahara, Kazumi; DaSalla, Charles Sayo; Honda, Manabu; Hanakawa, Takashi

    2015-04-15

    Brain-computer interfaces (BCIs) offer a potential means to replace or restore lost motor function. However, BCI performance varies considerably between users, the reasons for which are poorly understood. Here we investigated the relationship between sensorimotor rhythm (SMR)-based BCI performance and brain structure. Participants were instructed to control a computer cursor using right- and left-hand motor imagery, which primarily modulated their left- and right-hemispheric SMR powers, respectively. Although most participants were able to control the BCI with success rates significantly above chance level even at the first encounter, they also showed substantial inter-individual variability in BCI success rate. Participants also underwent T1-weighted three-dimensional structural magnetic resonance imaging (MRI). The MRI data were subjected to voxel-based morphometry using BCI success rate as an independent variable. We found that BCI performance correlated with gray matter volume of the supplementary motor area, supplementary somatosensory area, and dorsal premotor cortex. We suggest that SMR-based BCI performance is associated with development of non-primary somatosensory and motor areas. Advancing our understanding of BCI performance in relation to its neuroanatomical correlates may lead to better customization of BCIs based on individual brain structure. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Internet Use and Access Among Pregnant Women via Computer and Mobile Phone: Implications for Delivery of Perinatal Care.

    Science.gov (United States)

    Peragallo Urrutia, Rachel; Berger, Alexander A; Ivins, Amber A; Beckham, A Jenna; Thorp, John M; Nicholson, Wanda K

    2015-03-30

    The use of Internet-based behavioral programs may be an efficient, flexible method to enhance prenatal care and improve pregnancy outcomes. There are few data about access to, and use of, the Internet via computers and mobile phones among pregnant women. We describe pregnant women's access to, and use of, computers, mobile phones, and computer technologies (eg, Internet, blogs, chat rooms) in a southern United States population. We describe the willingness of pregnant women to participate in Internet-supported weight-loss interventions delivered via computers or mobile phones. We conducted a cross-sectional survey among 100 pregnant women at a tertiary referral center ultrasound clinic in the southeast United States. Data were analyzed using Stata version 10 (StataCorp) and R (R Core Team 2013). Means and frequency procedures were used to describe demographic characteristics, access to computers and mobile phones, and use of specific Internet modalities. Chi-square testing was used to determine whether there were differences in technology access and Internet modality use according to age, race/ethnicity, income, or children in the home. The Fisher's exact test was used to describe preferences to participate in Internet-based postpartum weight-loss interventions via computer versus mobile phone. Logistic regression was used to determine demographic characteristics associated with these preferences. The study sample was 61.0% white, 26.0% black, 6.0% Hispanic, and 7.0% Asian with a mean age of 31.0 (SD 5.1). Most participants had access to a computer (89/100, 89.0%) or mobile phone (88/100, 88.0%) for at least 8 hours per week. Access remained high (>74%) across age groups, racial/ethnic groups, income levels, and number of children in the home. Internet/Web (94/100, 94.0%), email (90/100, 90.0%), and Facebook (50/100, 50.0%) were the most commonly used Internet technologies. Women aged less than 30 years were more likely to report use of Twitter and chat rooms
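The chi-square testing described above can be illustrated with a minimal sketch. The 2×2 table below is hypothetical, not the study's data, and the statistic is computed by hand rather than with a statistics library:

```python
import numpy as np

# Hypothetical 2x2 contingency table: rows = age group (<30, >=30),
# columns = computer access (yes, no); the counts are illustrative only.
obs = np.array([[44.0, 6.0],
                [45.0, 5.0]])
row = obs.sum(axis=1, keepdims=True)
col = obs.sum(axis=0, keepdims=True)
expected = row @ col / obs.sum()          # expected counts under independence
chi2 = float(((obs - expected) ** 2 / expected).sum())
# With 1 degree of freedom the 5% critical value is 3.84, so chi2 ≈ 0.10
# indicates no significant access difference between the age groups,
# mirroring the study's finding that access remained high across groups.
```

For small expected cell counts a study would instead use Fisher's exact test, as the authors did for the portal-preference comparison.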

  15. Spin-transfer torque magnetoresistive random-access memory technologies for normally off computing (invited)

    International Nuclear Information System (INIS)

    Ando, K.; Yuasa, S.; Fujita, S.; Ito, J.; Yoda, H.; Suzuki, Y.; Nakatani, Y.; Miyazaki, T.

    2014-01-01

Most parts of present computer systems are made of volatile devices, and the power supplied to them to avoid information loss causes huge energy losses. We can eliminate this meaningless energy loss by utilizing the non-volatile function of advanced spin-transfer torque magnetoresistive random-access memory (STT-MRAM) technology to create a new type of computer: the normally off computer. Critical tasks in achieving normally off computers are implementations of STT-MRAM technologies in the main memory and the low-level cache memories. STT-MRAM technology for application to the main memory has been successfully developed using perpendicular STT-MRAMs, and faster STT-MRAM technologies for application to the cache memory are now being developed. The present status of STT-MRAMs and the challenges that remain for normally off computers are discussed.

  16. CORRELATION AMONG DAMAGES CAUSED BY YELLOW BEETLE, CLIMATOLOGICAL ELEMENTS AND PRODUCTION OF GUAVA ACCESSES GROWN IN ORGANIC SYSTEM

    Directory of Open Access Journals (Sweden)

    JULIANA ALTAFIN GALLI

Full Text Available ABSTRACT The objective of this research was to evaluate the damage caused by the yellow beetle on 85 guava accessions, and the correlations of that damage with climatological elements and fruit production, in a guava orchard managed under an organic system. Ten leaves per accession showing insect-attack injury were analyzed. Each leaf had its foliar area measured by a leaf-area meter and, after the total area was obtained, the leaf was covered with duct tape and measured again. The averages were compared by the Scott-Knott test at 5% probability. For the 15 accessions with the highest average damage, the data were correlated with minimum and maximum temperature, precipitation, and relative humidity. Production was obtained as the number of fruits per plant. The damages are negatively correlated with the mean relative humidity at 7:00 h (local time) in the 14 days prior to the assessments, and they negatively affect production. The accessions Saito, L4P16, Monte Alto Comum 1, and L5P19 are promising in organic agriculture, because they present good production and minor damage from insect attack when compared to the others.

  17. IBM PC based real time photon correlator [Paper No.:D2

    International Nuclear Information System (INIS)

    Kumaravadivelu, C.; Nageswaran, A.; Weling, S. A.

    1993-01-01

The design and development of an IBM PC based real-time photon correlator are presented. The system computes 64 auto-correlation functions in real time; sample data are 4 bits wide. The correlation functions are computed in hard-wired logic using discrete components, with a combination of parallel and pipelined architecture adopted to compute the correlation in real time. A high-speed controller generates the control signals required by the computing hardware and also provides handshake signals so the IBM PC can access the computed results. The IBM PC bus is extended and interfaced to the correlation-computing hardware. The IBM PC collects the experimental parameters through a user-friendly menu, initiates the correlation hardware, and then continues to collect the correlation build-ups and display them on the screen. Extensive test and maintenance features are incorporated into the system. The system was developed for the Material Science Division of the Indira Gandhi Centre for Atomic Research (IGCAR) to study the static and dynamic properties of macromolecules and colloidal particles in dispersion using light-scattering techniques. It can also be used to study the flow characteristics of sodium in nuclear reactors and in dynamic neutron-scattering experiments. (author). 3 figs., 2 tabs
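What the hard-wired logic accumulates can be sketched in software: an unnormalized auto-correlation over a fixed number of lags (64 in the hardware; shortened to 3 here for illustration). This is a generic sketch of the computation, not the device's firmware:

```python
def photon_autocorrelation(counts, num_lags=64):
    # unnormalized auto-correlation G(tau) = sum_t n(t) * n(t + tau),
    # accumulated for each lag tau, as the correlator hardware does
    n = len(counts)
    return [sum(counts[t] * counts[t + lag] for t in range(n - lag))
            for lag in range(num_lags)]

# 4-bit samples (values 0..15), matching the hardware's sample width
stream = [1, 2, 3, 0, 4]
g = photon_autocorrelation(stream, num_lags=3)   # → [30, 8, 15]
```

The hardware's parallel/pipelined architecture computes all lags concurrently as each new sample arrives, whereas this sketch recomputes them over a stored stream; the resulting G(τ) values are the same.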

  18. Fingerprint authentication via joint transform correlator and its application in remote access control of a 3D microscopic system

    Science.gov (United States)

    He, Wenqi; Lai, Hongji; Wang, Meng; Liu, Zeyi; Yin, Yongkai; Peng, Xiang

    2014-05-01

We present a fingerprint authentication scheme based on an optical joint transform correlator (JTC) and describe its application to remote access control for a Network-based Remote Laboratory (NRL), built to share a 3D microscopy system in our laboratory at Shenzhen University with remote co-researchers at Stuttgart University. In this article we focus on the security issues involved, mainly the verification of the various remote visitors to our NRL. By making use of the JTC-based optical pattern-recognition technique together with a Personal Identification Number (PIN), we are able to authenticate and control access for any remote visitor. Only authorized remote visitors are guided to Virtual Network Computing (VNC), cross-platform software that allows a remote visitor to access the desktop applications and visually manipulate the instruments of our NRL through the internet. Specifically, when a remote visitor attempts to access our NRL, a PIN is required first, followed by fingerprint capture and verification. Only if both the PIN and the fingerprint are correct is the visitor regarded as authorized and granted access to our NRL through VNC. It is also worth noting that this "two-step verification" strategy can be further applied to verify the identity levels of various remote visitors, and thereby realize diversified visitor management.

  19. The correlation between a passion for computer games and the school performance of younger schoolchildren.

    Directory of Open Access Journals (Sweden)

    Maliy D.V.

    2015-07-01

    Full Text Available Today computer games occupy a significant place in children’s lives and fundamentally affect the process of the formation and development of their personalities. A number of present-day researchers assert that computer games have a developmental effect on players. Others share the point of view that computer games have negative effects on the cognitive and emotional spheres of a child and claim that children with low self-esteem who neglect their schoolwork and have difficulties in communication are particularly passionate about computer games. This article reviews theoretical and experimental pedagogical and psychological studies of the nature of the correlation between a passion for computer games and the school performance of younger schoolchildren. Our analysis of foreign and Russian psychology studies regarding the problem of playing activities mediated by information and computer technologies allowed us to single out the main criteria for children’s passion for computer games and school performance. This article presents the results of a pilot study of the nature of this correlation. The research involved 32 pupils (12 girls and 20 boys) aged 10-11 years in the 4th grade. The general hypothesis was that there are divergent correlations between the passion of younger schoolchildren for computer games and their school performance. A questionnaire survey administered to the pupils allowed us to obtain information about the amount of time they devoted to computer games, their preferences for computer-game genres, and the extent of their passion for games. To determine the level of school performance we analyzed class registers. To establish the correlation between a passion for computer games and the school performance of younger schoolchildren, as well as to determine the effect of a passion for computer games on the personal qualities of the children

  20. SuperB R&D computing program: HTTP direct access to distributed resources

    Science.gov (United States)

    Fella, A.; Bianchi, F.; Ciaschini, V.; Corvo, M.; Delprete, D.; Diacono, D.; Di Simone, A.; Franchini, P.; Donvito, G.; Giacomini, F.; Gianoli, A.; Longo, S.; Luitz, S.; Luppi, E.; Manzali, M.; Pardi, S.; Perez, A.; Rama, M.; Russo, G.; Santeramo, B.; Stroili, R.; Tomassetti, L.

    2012-12-01

    The SuperB asymmetric-energy e+e- collider and detector, to be built at the newly founded Nicola Cabibbo Lab, will provide a uniquely sensitive probe of New Physics in the flavor sector of the Standard Model. Studying minute effects in the heavy quark and heavy lepton sectors requires a data sample of 75 ab^-1 and a luminosity target of 10^36 cm^-2 s^-1. Increasing network performance, including in the Wide Area Network (WAN) environment, and the capability to read data remotely with good efficiency are providing new possibilities and opening new scenarios in the data access field. Data access and data availability in a distributed environment are key points in the definition of the computing model for an HEP experiment like SuperB. R&D efforts in this field have been carried out during the last year in view of releasing the Computing Technical Design Report by 2013. Direct WAN access to data has been identified as one of the more interesting viable options; robust and reliable protocols such as HTTP/WebDAV and xrootd are the subject of a specific R&D line in a mid-term scenario. In this work we present the R&D results obtained in the study of new data access technologies for typical HEP use cases, focusing on the HTTP and WebDAV protocols in Wide Area Network scenarios. Efficiency, performance, and reliability tests performed in a data analysis context are reported. Future R&D plans include comparison tests of the HTTP and xrootd protocols in terms of performance, efficiency, security, and available features.
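
    The partial remote reads such tests exercise can be sketched with the Python standard library; the URL and offsets below are illustrative, and a production client would add authentication, retries, and status checks:

```python
import urllib.request

def build_range_request(url, start, length):
    """Build an HTTP request for a single byte range, so an analysis job
    can read just the blocks it needs instead of copying the whole file."""
    req = urllib.request.Request(url)
    req.add_header("Range", "bytes=%d-%d" % (start, start + length - 1))
    return req

def read_remote_bytes(url, start, length):
    """Fetch the byte range over the network (the server must support
    HTTP Range requests, as WebDAV-enabled storage typically does)."""
    with urllib.request.urlopen(build_range_request(url, start, length)) as resp:
        return resp.read()
```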

  1. Workarounds to computer access in healthcare organizations: you want my password or a dead patient?

    Science.gov (United States)

    Koppel, Ross; Smith, Sean; Blythe, Jim; Kothari, Vijay

    2015-01-01

    Workarounds to computer access in healthcare are sufficiently common that they often go unnoticed. Clinicians focus on patient care, not cybersecurity. We argue and demonstrate that understanding workarounds to healthcare workers' computer access requires not only analyses of computer rules, but also interviews and observations with clinicians. In addition, we illustrate the value of shadowing clinicians and conducting focus groups to understand their motivations and tradeoffs for circumvention. Ethnographic investigation of the medical workplace emerges as a critical research method because in the inevitable conflict between even well-intentioned people and machines, it is the people who are the more creative, flexible, and motivated. We conducted interviews and observations with hundreds of medical workers and with 19 cybersecurity experts, CIOs, CMIOs, CTOs, and IT workers to obtain their perceptions of computer security. We also shadowed clinicians as they worked. We present dozens of ways workers ingeniously circumvent security rules. The clinicians we studied were not "black hat" hackers, but professionals seeking to accomplish their work despite the security technologies and regulations.

  2. Impact of the Digital Divide on Computer Use and Internet Access on the Poor in Nigeria

    Science.gov (United States)

    Tayo, Omolara; Thompson, Randall; Thompson, Elizabeth

    2016-01-01

    We recruited 20 community members in Ido Local Government Area, Oyo state and Yewa Local Government Area, Ogun state in Nigeria to explore experiences and perceptions of Internet access and computer use. Face-to-face interviews were conducted using open-ended questions to collect qualitative data regarding accessibility of information and…

  3. Negative correlates of computer game play in adolescents.

    Science.gov (United States)

    Colwell, J; Payne, J

    2000-08-01

    There is some concern that playing computer games may be associated with social isolation, lowered self-esteem, and aggression among adolescents. Measures of these variables were included in a questionnaire completed by 204 year eight students at a North London comprehensive school. Principal components analysis of a scale to assess needs fulfilled by game play provided some support for the notion of 'electronic friendship' among boys, but there was no evidence that game play leads to social isolation. Play was not linked to self-esteem in girls, but a negative relationship was obtained between self-esteem and frequency of play in boys. However, self-esteem was not associated with total exposure to game play. Aggression scores were not related to the number of games with aggressive content named among three favourite games, but they were positively correlated with total exposure to game play. A multiple regression analysis revealed that sex and total game play exposure each accounted for a significant but small amount of the variance in aggression scores. The positive correlation between playing computer games and aggression provides some justification for further investigation of the causal hypothesis, and possible methodologies are discussed.

  4. Internet Use and Access Among Pregnant Women via Computer and Mobile Phone: Implications for Delivery of Perinatal Care

    Science.gov (United States)

    Peragallo Urrutia, Rachel; Berger, Alexander A; Ivins, Amber A; Beckham, A Jenna; Thorp Jr, John M

    2015-01-01

    Background The use of Internet-based behavioral programs may be an efficient, flexible method to enhance prenatal care and improve pregnancy outcomes. There are few data about access to, and use of, the Internet via computers and mobile phones among pregnant women. Objective We describe pregnant women’s access to, and use of, computers, mobile phones, and computer technologies (eg, Internet, blogs, chat rooms) in a southern United States population. We describe the willingness of pregnant women to participate in Internet-supported weight-loss interventions delivered via computers or mobile phones. Methods We conducted a cross-sectional survey among 100 pregnant women at a tertiary referral center ultrasound clinic in the southeast United States. Data were analyzed using Stata version 10 (StataCorp) and R (R Core Team 2013). Means and frequency procedures were used to describe demographic characteristics, access to computers and mobile phones, and use of specific Internet modalities. Chi-square testing was used to determine whether there were differences in technology access and Internet modality use according to age, race/ethnicity, income, or children in the home. The Fisher’s exact test was used to describe preferences to participate in Internet-based postpartum weight-loss interventions via computer versus mobile phone. Logistic regression was used to determine demographic characteristics associated with these preferences. Results The study sample was 61.0% white, 26.0% black, 6.0% Hispanic, and 7.0% Asian with a mean age of 31.0 (SD 5.1). Most participants had access to a computer (89/100, 89.0%) or mobile phone (88/100, 88.0%) for at least 8 hours per week. Access remained high (>74%) across age groups, racial/ethnic groups, income levels, and number of children in the home. Internet/Web (94/100, 94.0%), email (90/100, 90.0%), and Facebook (50/100, 50.0%) were the most commonly used Internet technologies. Women aged less than 30 years were more likely to

  5. F2AC: A Lightweight, Fine-Grained, and Flexible Access Control Scheme for File Storage in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Wei Ren

    2016-01-01

    Full Text Available Current file storage service models for cloud servers assume that users either belong to a single layer with different privileges or cannot authorize privileges iteratively. Thus, the access control is neither fine-grained nor flexible. Besides, most access control methods at cloud servers rely mainly on computationally intensive cryptographic algorithms and, especially, may not be able to support highly dynamic ad hoc groups with addition and removal of group members. In this paper, we propose F2AC, a lightweight, fine-grained, and flexible access control scheme for file storage in mobile cloud computing. F2AC can not only achieve iterative authorization, authentication with tailored policies, and access control for dynamically changing accessing groups, but also provide access privilege transition and revocation. A new access control model called the directed tree with linked leaf model is proposed for further implementation in data structures and algorithms. An extensive analysis is given to justify the soundness and completeness of F2AC.
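
    The delegation-and-revocation behaviour described above can be illustrated with a toy tree (the class and method names below are hypothetical; F2AC's actual directed tree with linked leaf model is richer than this sketch):

```python
class AuthNode:
    """One user in a delegation tree: a node may iteratively re-grant
    the access it holds, and revoking a node cuts off its whole subtree."""

    def __init__(self, name, parent=None):
        self.name, self.parent, self.revoked = name, parent, False
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def delegate(self, name):
        # iterative authorization: any node can grant access further down
        return AuthNode(name, parent=self)

    def has_access(self):
        # access requires that no ancestor (including self) is revoked
        node = self
        while node is not None:
            if node.revoked:
                return False
            node = node.parent
        return True
```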

  6. Using speech recognition to enhance the Tongue Drive System functionality in computer access.

    Science.gov (United States)

    Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    The Tongue Drive System (TDS) is a wireless, tongue-operated assistive technology (AT) that can enable people with severe physical disabilities to access computers and drive powered wheelchairs using their volitional tongue movements. TDS offers six discrete commands, simultaneously available to the user, for pointing and typing as substitutes for the mouse and keyboard in computer access. To enhance TDS performance in typing, we have added a microphone, an audio codec, and a wireless audio link to its existing 3-axial magnetic sensor array, and combined it with commercially available speech recognition software, Dragon Naturally Speaking, which is regarded as one of the most efficient tools for text entry. Our preliminary evaluations indicate that the combined TDS and speech recognition technologies can provide end users with significantly higher performance than either technology alone, particularly in completing tasks that require both pointing and text entry, such as web surfing.

  7. Gender Differences in Availability, Internet Access and Rate of Usage of Computers among Distance Education Learners.

    Science.gov (United States)

    Atan, Hanafi; Sulaiman, Fauziah; Rahman, Zuraidah Abd; Idrus, Rozhan Mohammed

    2002-01-01

    Explores the level of availability of computers, Internet accessibility, and the rate of usage of computers both at home and at the workplace between distance education learners according to gender. Results of questionnaires completed at the Universiti Sains Malaysia indicate that distance education reduces the gender gap. (Author/LRW)

  8. DiFX: A software correlator for very long baseline interferometry using multi-processor computing environments

    OpenAIRE

    Deller, A. T.; Tingay, S. J.; Bailes, M.; West, C.

    2007-01-01

    We describe the development of an FX style correlator for Very Long Baseline Interferometry (VLBI), implemented in software and intended to run in multi-processor computing environments, such as large clusters of commodity machines (Beowulf clusters) or computers specifically designed for high performance computing, such as multi-processor shared-memory machines. We outline the scientific and practical benefits for VLBI correlation, these chiefly being due to the inherent flexibility of software ...

  9. Experiences of registered nurses with regard to accessing health information at the point-of-care via mobile computing devices

    Directory of Open Access Journals (Sweden)

    Esmeralda Ricks

    2015-11-01

    Full Text Available Background: The volume of health information necessary to provide competent health care today has become overwhelming. Mobile computing devices are fast becoming an essential clinical tool for accessing health information at the point-of-care of patients. Objectives: This study explored and described how registered nurses experienced accessing information at the point-of-care via mobile computing devices (MCDs). Method: A qualitative, exploratory, descriptive and contextual design was used. Ten in-depth interviews were conducted with purposively sampled registered nurses employed by a state hospital in the Nelson Mandela Bay Municipality (NMBM). Interviews were recorded, transcribed verbatim and analysed using Tesch’s data analysis technique. Ethical principles were adhered to throughout the study. Guba’s model of trustworthiness was used to confirm the integrity of the study. Results: Four themes emerged, revealing that the training the registered nurses received enabled them to develop and improve their computer literacy. Emphasis was placed on the benefits that the accessed information had for educational purposes for patients and the public, for colleagues and students. Furthermore, the ability to access information at the point-of-care was considered by registered nurses as valuable for improving patient care because of the wide range of accurate and readily accessible information available via the mobile computing device. Conclusion: The registered nurses in this study felt that being able to access information at the point-of-care increased their confidence and facilitated the provision of quality care because it assisted them in being accurate and sure of what they were doing.

  10. Experiences of registered nurses with regard to accessing health information at the point-of-care via mobile computing devices.

    Science.gov (United States)

    Ricks, Esmeralda; Benjamin, Valencia; Williams, Margaret

    2015-11-19

    The volume of health information necessary to provide competent health care today has become overwhelming. Mobile computing devices are fast becoming an essential clinical tool for accessing health information at the point-of-care of patients. This study explored and described how registered nurses experienced accessing information at the point-of-care via mobile computing devices (MCDs). A qualitative, exploratory, descriptive and contextual design was used. Ten in-depth interviews were conducted with purposively sampled registered nurses employed by a state hospital in the Nelson Mandela Bay Municipality (NMBM). Interviews were recorded, transcribed verbatim and analysed using Tesch's data analysis technique. Ethical principles were adhered to throughout the study. Guba's model of trustworthiness was used to confirm the integrity of the study. Four themes emerged, revealing that the training the registered nurses received enabled them to develop and improve their computer literacy. Emphasis was placed on the benefits that the accessed information had for educational purposes for patients and the public, for colleagues and students. Furthermore, the ability to access information at the point-of-care was considered by registered nurses as valuable for improving patient care because of the wide range of accurate and readily accessible information available via the mobile computing device. The registered nurses in this study felt that being able to access information at the point-of-care increased their confidence and facilitated the provision of quality care because it assisted them in being accurate and sure of what they were doing.

  11. Fast Computation of the Two-Point Correlation Function in the Age of Big Data

    Science.gov (United States)

    Pellegrino, Andrew; Timlin, John

    2018-01-01

    We present a new code which quickly computes the two-point correlation function for large sets of astronomical data. The code combines the ease of use of Python with the speed of parallel shared libraries written in C. It can compute both auto- and cross-correlation statistics, and allows the user to calculate three-dimensional and angular correlation functions. Additionally, the code automatically divides user-provided sky masks into contiguous subsamples of similar size, using the HEALPix pixelization scheme, for the purpose of resampling. Errors are computed using jackknife and bootstrap resampling in a way that adds negligible extra runtime, even with many subsamples. We demonstrate speed comparable to other clustering codes, and verify the code's accuracy against known analytic results.
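
    The pair-count statistic such codes accelerate is typically the Landy-Szalay estimator. The 1D toy below (not the paper's code, which pairs Python with parallel C libraries) shows the counting logic:

```python
import numpy as np

def pair_counts(a, b, bins):
    """Histogram of pairwise separations (naive O(n*m) count)."""
    d = np.abs(a[:, None] - b[None, :]).ravel()
    return np.histogram(d, bins=bins)[0].astype(float)

def landy_szalay(data, rand, bins):
    """Landy-Szalay estimator xi = (DD - 2DR + RR) / RR with pair
    counts normalized by the number of (ordered) pairs."""
    nd, nr = len(data), len(rand)
    dd = pair_counts(data, data, bins) / (nd * (nd - 1))
    dr = pair_counts(data, rand, bins) / (nd * nr)
    rr = pair_counts(rand, rand, bins) / (nr * (nr - 1))
    return (dd - 2.0 * dr + rr) / rr
```

    For an unclustered (uniform) sample, xi is consistent with zero in every bin; clustering shows up as xi > 0 at small separations.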

  12. DELTA - a computer program to analyze gamma-gamma angular correlations from unaligned states

    International Nuclear Information System (INIS)

    Ekstroem, L.P.

    1983-10-01

    A computer program to analyze gamma-gamma angular correlations from radioactive decay and from thermal-neutron capture is described. The program can, in addition to correlation data, handle mixing ratio and conversion coefficient data. (author)

  13. Malignant hemangiopericytoma. A correlative study of angiography, computed tomography, and pathology

    Energy Technology Data Exchange (ETDEWEB)

    Higa, Toshiaki; Kuroda, Yasumasa; Kobashi, Yoichiro; Ichijima, Kunio; Odori, Teruo; Torizuka, Kanji [Kyoto Univ. (Japan). Hospital

    1983-01-01

    Four cases of primary and secondary malignant hemangiopericytoma were studied correlatively using selective angiography, computed tomography, and pathologic specimens. One case was found in each of the peritoneal space (metastasis), the left middle cranial fossa, the left thigh, and the left retroperitoneal space. Basic angiographic features of the tumor were a few feeding arteries entering the tumor with a radially arranged fine network throughout, early visualization of veins due to A-V shunting, and a long-standing tumor stain with avascular area(s). Correlative evaluation of angiograms, computed tomograms, and pathologic specimens revealed hemorrhage, central necrosis, and cystic degeneration in the avascular area(s) of the tumor stains on the angiograms. The more prominent these secondary changes, the less characteristic the angiographic features.

  14. Correlation Educational Model in Primary Education Curriculum of Mathematics and Computer Science

    Science.gov (United States)

    Macinko Kovac, Maja; Eret, Lidija

    2012-01-01

    This article gives insight into a methodical correlation model for teaching mathematics and computer science. The model shows the way in which the related areas of computer science and mathematics can complement each other when the way of teaching is transformed to create "joint" lessons. Various didactic materials are designed, in which all…

  15. Computing Coherence Vectors and Correlation Matrices with Application to Quantum Discord Quantification

    Directory of Open Access Journals (Sweden)

    Jonas Maziero

    2016-01-01

    Full Text Available Coherence vectors and correlation matrices are important functions frequently used in physics. The numerical calculation of these functions directly from their definitions, which involves Kronecker products and matrix multiplications, may seem to be a reasonable option. Notwithstanding, as we demonstrate in this paper, some algebraic manipulation before programming can considerably reduce their computational complexity. Besides, we provide Fortran code to generate generalized Gell-Mann matrices and to compute the optimized and unoptimized versions of the associated Bloch vectors and correlation matrix in the case of bipartite quantum systems. As a code test and application example, we consider the calculation of Hilbert-Schmidt quantum discords.
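
    The paper's reference implementation is Fortran; the same constructions translate directly to NumPy. A sketch of the generalized Gell-Mann basis and the associated coherence (Bloch) vector, assuming the common normalization Tr(G_i G_j) = 2 delta_ij:

```python
import numpy as np

def gell_mann(d):
    """Return the d^2 - 1 generalized Gell-Mann matrices: traceless,
    Hermitian, and orthogonal with Tr(G_i G_j) = 2 delta_ij."""
    mats = []
    # symmetric and antisymmetric off-diagonal generators
    for j in range(d):
        for k in range(j + 1, d):
            s = np.zeros((d, d), dtype=complex)
            s[j, k] = s[k, j] = 1.0
            a = np.zeros((d, d), dtype=complex)
            a[j, k], a[k, j] = -1.0j, 1.0j
            mats += [s, a]
    # diagonal generators, normalized so Tr(G^2) = 2
    for l in range(1, d):
        diag = np.zeros(d)
        diag[:l] = 1.0
        diag[l] = -float(l)
        mats.append(np.sqrt(2.0 / (l * (l + 1))) * np.diag(diag).astype(complex))
    return mats

def bloch_vector(rho):
    """Coherence (Bloch) vector b_i = Tr(rho G_i) of a density matrix."""
    return np.array([np.trace(rho @ g).real for g in gell_mann(rho.shape[0])])
```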

  16. Soft-error tolerance and energy consumption evaluation of embedded computer with magnetic random access memory in practical systems using computer simulations

    Science.gov (United States)

    Nebashi, Ryusuke; Sakimura, Noboru; Sugibayashi, Tadahiko

    2017-08-01

    We evaluated the soft-error tolerance and energy consumption of an embedded computer with magnetic random access memory (MRAM) using two computer simulators. One is a central processing unit (CPU) simulator of a typical embedded computer system. We simulated the radiation-induced single-event-upset (SEU) probability in a spin-transfer-torque MRAM cell and also the failure rate of a typical embedded computer due to its main memory SEU error. The other is a delay tolerant network (DTN) system simulator. It simulates the power dissipation of wireless sensor network nodes of the system using a revised CPU simulator and a network simulator. We demonstrated that the SEU effect on the embedded computer with 1 Gbit MRAM-based working memory is less than 1 failure in time (FIT). We also demonstrated that the energy consumption of the DTN sensor node with MRAM-based working memory can be reduced to 1/11. These results indicate that MRAM-based working memory enhances the disaster tolerance of embedded computers.

  17. Neural correlates and neural computations in posterior parietal cortex during perceptual decision-making

    Directory of Open Access Journals (Sweden)

    Alexander eHuk

    2012-10-01

    Full Text Available A recent line of work has found remarkable success in relating perceptual decision-making and the spiking activity in the macaque lateral intraparietal area (LIP). In this review, we focus on questions about the neural computations in LIP that are not answered by demonstrations of neural correlates of psychological processes. We highlight three areas of limitation in our current understanding of the precise neural computations that might underlie neural correlates of decisions: (1) empirical questions not yet answered by existing data; (2) implementation issues related to how neural circuits could actually implement the mechanisms suggested by both physiology and psychology; and (3) ecological constraints related to the use of well-controlled laboratory tasks and whether they provide an accurate window on sensorimotor computation. These issues motivate the adoption of a more general encoding-decoding framework that will be fruitful for more detailed contemplation of how neural computations in LIP relate to the formation of perceptual decisions.

  18. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, provides yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings 0.01-second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy, within the resolution of the computer clock, as any official atomic time source.

  19. Electron correlation in molecules: concurrent computation Many-Body Perturbation Theory (ccMBPT) calculations using macrotasking on the NEC SX-3/44 computer

    International Nuclear Information System (INIS)

    Moncrieff, D.; Wilson, S.

    1992-06-01

    The ab initio determination of the electronic structure of molecules is a many-fermion problem involving the approximate description of the motion of the electrons in the field of fixed nuclei. It is an area of research which demands considerable computational resources but has enormous potential in fields as diverse as interstellar chemistry and drug design, catalysis and solid state chemistry, molecular biology and environmental chemistry. Electronic structure calculations almost invariably divide into two main stages: the approximate solution of an independent electron model, in which each electron moves in the average field created by the other electrons in the system, and then the more computationally demanding determination of a series of corrections to this model, the electron correlation effects. The many-body perturbation theory expansion affords a systematic description of correlation effects and leads directly to algorithms suitable for concurrent computation; we term this concurrent computation Many-Body Perturbation Theory (ccMBPT). The use of a dynamic load balancing technique on the NEC SX-3/44 computer in electron correlation calculations is investigated for the calculation of the most demanding energy component in the most accurate of contemporary ab initio studies. An application to the ground state of the nitrogen molecule is described. We also briefly discuss the extent to which the calculation of the dominant corrections to such studies can be made computationally tractable by exploiting both the vector processing and parallel processing capabilities of the NEC SX-3/44 computer. (author)

  20. Characterizing Computer Access Using a One-Channel EEG Wireless Sensor.

    Science.gov (United States)

    Molina-Cantero, Alberto J; Guerrero-Cubero, Jaime; Gómez-González, Isabel M; Merino-Monge, Manuel; Silva-Silva, Juan I

    2017-06-29

    This work studies the feasibility of using mental attention to access a computer. Brain activity was measured with an electrode placed at the Fp1 position and the reference on the left ear; seven normally developed people and three subjects with cerebral palsy (CP) took part in the experiments. They were asked to keep their attention high and low for as long as possible during several trials. We recorded the attention levels and power bands conveyed by the sensor, but only the former was used for feedback purposes. All of the information was statistically analyzed to find the most significant parameters, and a classifier based on linear discriminant analysis (LDA) was also set up. Overall, 60% of the participants were potential users of this technology, with an accuracy of over 70%. Including power bands in the classifier did not improve the accuracy in discriminating between the two attentional states; for most people, the best results were obtained by using only the attention indicator in classification. Tiredness was higher in the group with disabilities (2.7 on a scale of 3) than in the other group (1.5 on the same scale); and modulating attention to access a communication board requires that the board not contain many pictograms (between 4 and 7) on screen and that it use a relatively long scanning period, t_scan ≈ 10 s. The information transfer rate (ITR) is similar to that obtained by other brain-computer interfaces (BCI), such as those based on sensorimotor rhythms (SMR) or slow cortical potentials (SCP), which makes it suitable as an eye-gaze-independent BCI.
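
    A two-class LDA of the kind used in the study fits in a few lines of NumPy. The synthetic features below merely stand in for the attention indicator and band powers (the real recordings are the authors' data):

```python
import numpy as np

class TwoClassLDA:
    """Minimal two-class linear discriminant analysis: project onto
    w = Sw^-1 (m1 - m0) and threshold at the midpoint of the class means."""

    def fit(self, X, y):
        X0, X1 = X[y == 0], X[y == 1]
        m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
        # pooled within-class covariance, assumed shared by both classes
        Sw = np.atleast_2d(np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False))
        self.w = np.linalg.solve(Sw, m1 - m0)
        self.b = -0.5 * self.w @ (m0 + m1)
        return self

    def predict(self, X):
        return (X @ self.w + self.b > 0).astype(int)
```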

  1. Video game access, parental rules, and problem behavior: a study of boys with autism spectrum disorder.

    Science.gov (United States)

    Engelhardt, Christopher R; Mazurek, Micah O

    2014-07-01

    Environmental correlates of problem behavior among individuals with autism spectrum disorder remain relatively understudied. The current study examined the contribution of in-room (i.e. bedroom) access to a video game console as one potential correlate of problem behavior among a sample of 169 boys with autism spectrum disorder (ranging from 8 to 18 years of age). Parents of these children reported on (1) whether they had specific rules regulating their child's video game use, (2) whether their child had in-room access to a variety of screen-based media devices (television, computer, and video game console), and (3) their child's oppositional behaviors. Multivariate regression models showed that in-room access to a video game console predicted oppositional behavior while controlling for in-room access to other media devices (computer and television) and relevant variables (e.g. average number of video game hours played per day). Additionally, the association between in-room access to a video game console and oppositional behavior was particularly large when parents reported no rules on their child's video game use. The current findings indicate that both access and parental rules regarding video games warrant future experimental and longitudinal research as they relate to problem behavior in boys with autism spectrum disorder. © The Author(s) 2013.

  2. Computing the correlation between catalyst composition and its performance in the catalysed process

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin; Steinfeldt, N.; Baerns, M.; Štefka, David

    2012-01-01

    Roč. 43, 10 August (2012), s. 55-67 ISSN 0098-1354 R&D Projects: GA ČR GA201/08/0802 Institutional support: RVO:67985807 Keywords : catalysed process * catalyst performance * correlation measures * estimating correlation value * analysis of variance * regression trees Subject RIV: IN - Informatics, Computer Science Impact factor: 2.091, year: 2012

  3. A Study of the Correlation between Computer Games and Adolescent Behavioral Problems

    OpenAIRE

    Shokouhi-Moqhaddam, Solmaz; Khezri-Moghadam, Noshiravan; Javanmard, Zeinab; Sarmadi-Ansar, Hassan; Aminaee, Mehran; Shokouhi-Moqhaddam, Majid; Zivari-Rahman, Mahmoud

    2013-01-01

    Background Today, owing to developing communication technologies, computer games and other audio-visual media, as social phenomena, are very attractive and have a great effect on children and adolescents. The increasing popularity of these games among children and adolescents has raised public uncertainty about their possible harmful effects. This study aimed to investigate the correlation between computer games and behavioral problems among male guidance-school students. Methods Th...

  4. Computers in plasma physics: remote data access and magnetic configuration design

    International Nuclear Information System (INIS)

    Blackwell, B.D.; McMillan, B.F.; Searle, A.C.; Gardner, H.J.; Price, D.M.; Fredian, T.W.

    2000-01-01

    Full text: Two graphically intensive examples of the application of computers in plasma physics are described: remote data access for plasma confinement experiments, and a code for real-time magnetic field tracing and optimisation. The application for both of these is the H-1NF National Plasma Fusion Research Facility, a Commonwealth Major National Research Facility within the Research School of Physical Science, Institute of Advanced Studies, ANU. It is based on the 'flexible' heliac stellarator H-1, a plasma confinement device in which the confining fields are generated solely by external conductors. These complex, fully three-dimensional magnetic fields are used as examples for the magnetic design application, and data from plasma physics experiments are used to illustrate the remote access techniques. As plasma fusion experiments grow in size, increased remote access allows physicists to participate in experiments and data analysis from their home base. Three types of access will be described and demonstrated - a simple Java-based web interface, an example TCP client-server built around the widely used MDSPlus data system and the visualisation package IDL (RSI Inc), and a virtual desktop environment (VNC; AT&T Research) that simulates terminals local to the plasma facility. A client-server TCP/IP web interface to the programmable logic controller that provides the user interface to the programmable high-power magnet power supplies is described. A very general configuration file allows great flexibility, and allows new displays and interfaces to be created (usually) without changes to the underlying C++ and Java code. The magnetic field code BLINE provides accurate calculation of complex magnetic fields, and 3D visualisation in real time, using a low cost multiprocessor computer and an OpenGL-compatible graphics accelerator. A fast, flexible multi-mesh interpolation method is used for tracing vacuum magnetic field lines created by arbitrary filamentary

  5. The NILE system architecture: fault-tolerant, wide-area access to computing and data resources

    International Nuclear Information System (INIS)

    Ricciardi, Aleta; Ogg, Michael; Rothfus, Eric

    1996-01-01

    NILE is a multi-disciplinary project building a distributed computing environment for HEP. It provides wide-area, fault-tolerant, integrated access to processing and data resources for collaborators of the CLEO experiment, though the goals and principles are applicable to many domains. NILE has three main objectives: a realistic distributed system architecture design, the design of a robust data model, and a Fast-Track implementation providing a prototype design environment which will also be used by CLEO physicists. This paper focuses on the software and wide-area system architecture design and the computing issues involved in making NILE services highly-available. (author)

  6. Three-dimensional evaluation of human jaw bone microarchitecture: correlation between the microarchitectural parameters of cone beam computed tomography and micro-computer tomography.

    Science.gov (United States)

    Kim, Jo-Eun; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Huh, Kyung-Hoe

    2015-12-01

    To evaluate the potential feasibility of cone beam computed tomography (CBCT) in the assessment of trabecular bone microarchitecture. Sixty-eight specimens from four pairs of human jaw were scanned using both micro-computed tomography (micro-CT) of 19.37-μm voxel size and CBCT of 100-μm voxel size. The correlation of 3-dimensional parameters between CBCT and micro-CT was evaluated. All parameters, except bone-specific surface and trabecular thickness, showed linear correlations between the 2 imaging modalities (P < .05). Among the parameters, bone volume, percent bone volume, trabecular separation, and degree of anisotropy (DA) of CBCT images showed strong correlations with those of micro-CT images. DA showed the strongest correlation (r = 0.693). Most microarchitectural parameters from CBCT were correlated with those from micro-CT. Some microarchitectural parameters, especially DA, could be used as strong predictors of bone quality in the human jaw. Copyright © 2015 Elsevier Inc. All rights reserved.
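The linear correlations reported here are plain Pearson coefficients between paired parameter values from the two modalities. As a sketch of that computation (the paired specimen values below are invented for illustration, not taken from the study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired measurements of one microarchitectural parameter
# (e.g. degree of anisotropy) from micro-CT and CBCT -- illustrative only.
micro_ct = [1.10, 1.25, 1.40, 1.55, 1.70, 1.90]
cbct     = [1.05, 1.30, 1.35, 1.60, 1.65, 1.95]

r = pearson_r(micro_ct, cbct)
print(round(r, 3))  # close to 1 for these near-linear toy data
```

A coefficient such as the reported r = 0.693 would come from the same formula applied to the study's actual parameter pairs.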

  7. Higher order correlations in computed particle distributions

    International Nuclear Information System (INIS)

    Hanerfeld, H.; Herrmannsfeldt, W.; Miller, R.H.

    1989-03-01

    The rms emittances calculated for beam distributions using computer simulations are frequently dominated by higher order aberrations. Thus there are substantial open areas in the phase space plots. It has long been observed that the rms emittance is not an invariant to beam manipulations. The usual emittance calculation removes the correlation between transverse displacement and transverse momentum. In this paper, we explore the possibility of defining higher order correlations that can be removed from the distribution to result in a lower limit to the realizable emittance. The intent is that by inserting the correct combinations of linear lenses at the proper position, the beam may recombine in a way that cancels the effects of some higher order forces. An example might be the non-linear transverse space charge forces which cause a beam to spread. If the beam is then refocused so that the same non-linear forces reverse the inward velocities, the resulting phase space distribution may reasonably approximate the original distribution. The approach to finding the location and strength of the proper lens to optimize the transported beam is based on work by Bruce Carlsten of Los Alamos National Laboratory. 11 refs., 4 figs
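The abstract's central point, that the usual rms-emittance formula removes the linear x-x' correlation but not higher-order ones, can be checked numerically. A minimal sketch (the Gaussian ensemble and the shear/cubic coefficients are invented, not from the paper):

```python
import math
import random

def rms_emittance(xs, xps):
    """RMS emittance sqrt(<x^2><x'^2> - <x x'>^2), centred moments."""
    n = len(xs)
    mx = sum(xs) / n
    mp = sum(xps) / n
    x2 = sum((x - mx) ** 2 for x in xs) / n
    p2 = sum((p - mp) ** 2 for p in xps) / n
    xp = sum((x - mx) * (p - mp) for x, p in zip(xs, xps)) / n
    return math.sqrt(x2 * p2 - xp * xp)

# Illustrative uncorrelated Gaussian particle ensemble.
random.seed(1)
x  = [random.gauss(0.0, 1.0) for _ in range(2000)]
xp = [random.gauss(0.0, 1.0) for _ in range(2000)]
base = rms_emittance(x, xp)

# A linear correlation (shear x' -> x' + a*x) is removed by the rms
# formula itself: the emittance is algebraically unchanged.
sheared = [p + 0.8 * xi for p, xi in zip(xp, x)]
# A cubic (higher-order) correlation is NOT removed: the emittance grows,
# which is what leaves room for the correction scheme discussed above.
cubic = [p + 0.8 * xi ** 3 for p, xi in zip(xp, x)]

print(abs(rms_emittance(x, sheared) - base) < 1e-6 * base)  # True
print(rms_emittance(x, cubic) > base)                       # True
```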

  8. Spatial correlations in intense ionospheric scintillations - comparison between numerical computation and observation

    International Nuclear Information System (INIS)

    Kumagai, H.

    1987-01-01

    The spatial correlations in intense ionospheric scintillations were analyzed by comparing numerical results with observational ones. The observational results were obtained by spaced-receiver scintillation measurements of VHF satellite radiowave. The numerical computation was made by using the fourth-order moment equation with fairly realistic ionospheric irregularity models, in which power-law irregularities with spectral index 4, both thin and thick slabs, and both isotropic and anisotropic irregularities, were considered. Evolution of the S(4) index and the transverse correlation function was computed. The numerical result that the transverse correlation distance decreases with the increase in S(4) was consistent with that obtained in the observation, suggesting that multiple scattering plays an important role in the intense scintillations observed. The anisotropy of irregularities proved to act as if the density fluctuation increased. This effect, as well as the effect of slab thickness, was evaluated by the total phase fluctuations that the radiowave experienced in the slab. On the basis of the comparison, the irregularity height and electron-density fluctuation which is necessary to produce a particular strength of scintillation were estimated. 30 references

  9. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-01-01

    In the fog computing environment, the encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server. Since the fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource constrained end users. Compared to existing schemes only supporting either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute based encryption (CP-ABE) primitive and share the same key pair, thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of decryption task to fog nodes, and mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment. PMID:28629131

  10. A Hybrid Scheme for Fine-Grained Search and Access Authorization in Fog Computing Environment.

    Science.gov (United States)

    Xiao, Min; Zhou, Jing; Liu, Xuejiao; Jiang, Mingda

    2017-06-17

    In the fog computing environment, the encrypted sensitive data may be transferred to multiple fog nodes on the edge of a network for low latency; thus, fog nodes need to implement a search over encrypted data as a cloud server. Since the fog nodes tend to provide service for IoT applications often running on resource-constrained end devices, it is necessary to design lightweight solutions. At present, there is little research on this issue. In this paper, we propose a fine-grained owner-forced data search and access authorization scheme spanning user-fog-cloud for resource constrained end users. Compared to existing schemes only supporting either index encryption with search ability or data encryption with fine-grained access control ability, the proposed hybrid scheme supports both abilities simultaneously, and index ciphertext and data ciphertext are constructed based on a single ciphertext-policy attribute based encryption (CP-ABE) primitive and share the same key pair, thus the data access efficiency is significantly improved and the cost of key management is greatly reduced. Moreover, in the proposed scheme, the resource constrained end devices are allowed to rapidly assemble ciphertexts online and securely outsource most of decryption task to fog nodes, and mediated encryption mechanism is also adopted to achieve instantaneous user revocation instead of re-encrypting ciphertexts with many copies in many fog nodes. The security and the performance analysis show that our scheme is suitable for a fog computing environment.

  11. A service-oriented data access control model

    Science.gov (United States)

    Meng, Wei; Li, Fengmin; Pan, Juchen; Song, Song; Bian, Jiali

    2017-01-01

    The development of mobile computing, cloud computing and distributed computing meets growing individual service needs. Faced with complex application systems, ensuring real-time, dynamic, and fine-grained data access control is an urgent problem. After analyzing common data access control models, the paper proposes a service-oriented access control model built on the mandatory access control model. By regarding system services as subjects and database data as objects, the model defines access levels and access identifications for subjects and objects, and ensures that system services access databases securely.
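A toy sketch of the model's core check, in the mandatory-access-control style the abstract describes. The level names, fields and function names below are assumptions for illustration; the paper does not publish a concrete API:

```python
from dataclasses import dataclass

# Hypothetical ordered access levels -- illustrative only.
LEVELS = {"public": 0, "internal": 1, "confidential": 2}

@dataclass(frozen=True)
class Service:           # subject: a system service
    name: str
    clearance: str       # access level granted to the service

@dataclass(frozen=True)
class DataObject:        # object: data held in a database
    table: str
    classification: str  # access level required to read the data

def may_access(subject: Service, obj: DataObject) -> bool:
    """Mandatory-style check: the service's clearance must dominate
    the data object's classification."""
    return LEVELS[subject.clearance] >= LEVELS[obj.classification]

billing = Service("billing-service", "internal")
print(may_access(billing, DataObject("invoices", "internal")))      # True
print(may_access(billing, DataObject("salaries", "confidential")))  # False
```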

  12. Guided Endodontic Access in Maxillary Molars Using Cone-beam Computed Tomography and Computer-aided Design/Computer-aided Manufacturing System: A Case Report.

    Science.gov (United States)

    Lara-Mendes, Sônia T de O; Barbosa, Camila de Freitas M; Santa-Rosa, Caroline C; Machado, Vinícius C

    2018-05-01

    The aim of this study was to describe a guided endodontic technique that facilitates access to root canals of molars presenting with pulp calcifications. A 61-year-old woman presented to our service with pain in the upper left molar region. The second and third left molars showed signs of apical periodontitis confirmed by the cone-beam computed tomographic (CBCT) scans brought to us by the patient at the initial appointment. Conventional endodontic treatment was discontinued given the difficulty in locating the root canals. Intraoral scanning and the CBCT scans were used to plan the access to the calcified canals by means of implant planning software. Guides were fabricated through rapid prototyping and allowed for the correct orientation of a cylindrical drill used to provide access through the calcifications. Second to that, the root canals were prepared with reciprocating endodontic instruments and rested for 2 weeks with intracanal medication. Subsequently, canals were packed with gutta-percha cones using the hydraulic compression technique. Permanent restorations of the access cavities were performed. By comparing the tomographic images, the authors observed a drastic reduction of the periapical lesions as well as the absence of pain symptoms after 3 months. This condition was maintained at the 1-year follow-up. The guided endodontic technique in maxillary molars was shown to be a fast, safe, and predictable therapy and can be regarded as an excellent option for the location of calcified root canals, avoiding failures in complex cases. Copyright © 2018 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  13. Human- and computer-accessible 2D correlation data for a more reliable structure determination of organic compounds. Future roles of researchers, software developers, spectrometer managers, journal editors, reviewers, publisher and database managers toward artificial-intelligence analysis of NMR spectra.

    Science.gov (United States)

    Jeannerat, Damien

    2017-01-01

    The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge signals found in NMR 1D and 2D spectra and the assigned chemical structure. The record could be very compact, and human- and computer-readable, so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. In order to realize their full potential, the correlation data and the NMR spectra should therefore follow any manuscript in the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools increasing the reliability of past and future NMR data. This will facilitate the development of artificial-intelligence analysis of NMR spectra by providing a source of data that can be used efficiently because they have been validated or can be validated by future users. Copyright © 2016 John Wiley & Sons, Ltd.

  14. On the impossibility of creating the quantum correlations with computer

    International Nuclear Information System (INIS)

    Vinduska, M.

    1991-01-01

    It is indicated that Feynman's proof of the impossibility of creating quantum correlations with computers does not hold if the general transformations of the probability measure of the treated systems do not form a group. The paper considers the consequences of this fact in relation to the Bell inequalities and to models of relative probability measure on concave surfaces. 5 refs.; 4 figs.; 2 tabs

  15. Comparing alternative approaches to measuring the geographical accessibility of urban health services: Distance types and aggregation-error issues

    Directory of Open Access Journals (Sweden)

    Riva Mylène

    2008-02-01

    Full Text Available Abstract Background Over the past two decades, geographical accessibility of urban resources for populations living in residential areas has received increased focus in urban health studies. Operationalising and computing geographical accessibility measures depend on a set of four parameters, namely the definition of residential areas, a method of aggregation, a measure of accessibility, and a type of distance. Yet the choice of these parameters may generate different results, leading to significant measurement errors. The aim of this paper is to compare discrepancies in results for geographical accessibility of selected health care services for residential areas (i.e. census tracts) computed using different distance types and aggregation methods. Results First, the comparison of distance types demonstrates that Cartesian distances (Euclidean and Manhattan distances) are strongly correlated with more accurate network distances (shortest network and shortest network time distances) across the metropolitan area (Pearson correlation greater than 0.95). However, important local variations in correlation between Cartesian and network distances were observed, notably in suburban areas where Cartesian distances were less precise. Second, the choice of the aggregation method is also important: in comparison to the most accurate aggregation method (population-weighted mean of the accessibility measure for census blocks within census tracts), accessibility measures computed from census tract centroids, though not inaccurate, yield important measurement errors for 5% to 10% of census tracts. Conclusion Although errors associated with the choice of distance types and aggregation method are only important for about 10% of census tracts, located mainly in suburban areas, we should not avoid using the best estimation method possible for evaluating geographical accessibility. This is especially so if these measures are to be included as a dimension of the
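The two Cartesian distance types compared in this study differ in a simple way; a sketch with invented planar coordinates (in kilometres) shows Euclidean and Manhattan distances, which approximate the more accurate network distances without requiring a road graph:

```python
import math

def euclidean(p, q):
    """Straight-line (Cartesian) distance between two planar points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def manhattan(p, q):
    """City-block distance: sum of axis-aligned displacements."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

# Hypothetical census-tract centroid and health service location,
# planar coordinates in km -- illustrative values only.
tract_centroid = (2.0, 3.0)
clinic = (5.0, 7.0)

print(euclidean(tract_centroid, clinic))  # 5.0
print(manhattan(tract_centroid, clinic))  # 7.0
```

Manhattan distance always dominates Euclidean distance for the same pair of points; network distances, by contrast, must be computed by shortest-path search over a street network, which is why the cheaper Cartesian measures are attractive when they correlate well.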

  16. Sign Compute Resolve for Random Access

    DEFF Research Database (Denmark)

    Goseling, Jasper; Stefanovic, Cedomir; Popovski, Petar

    2014-01-01

    We present an approach to random access that is based on three elements: physical-layer network coding, signature codes and tree splitting. Upon occurrence of a collision, physical-layer network coding enables the receiver to decode the sum of the information that was transmitted by the individual users that collide. We measure the performance of the proposed method in terms of user resolution rate as well as overall throughput of the system. The results show that our approach significantly increases the performance of the system even compared to coded random access, where collisions are not wasted.

  17. Renormalization group improved computation of correlation functions in theories with nontrivial phase diagram

    DEFF Research Database (Denmark)

    Codello, Alessandro; Tonero, Alberto

    2016-01-01

    We present a simple and consistent way to compute correlation functions in interacting theories with a nontrivial phase diagram. As an example we show how to consistently compute the four-point function in three-dimensional Z2-scalar theories. The idea is to perform the path integral by weighting the momentum modes that contribute to it according to their renormalization group (RG) relevance, i.e. we weight each mode according to the value of the running couplings at that scale. In this way, we are able to encode in a loop computation the information regarding the RG trajectory along which we

  18. Lexical access in sign language: a computational model.

    Science.gov (United States)

    Caselli, Naomi K; Cohen-Goldberg, Ariel M

    2014-01-01

    Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.
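A heavily simplified sketch of the spreading-activation idea behind this kind of model: lexical units pass activation through shared sub-lexical units (such as a handshape). The network layout, weights and decay value below are invented for illustration and are not the paper's actual parameters:

```python
def spread(activations, weights, decay=0.1):
    """One update step: each unit sums weighted input from its
    neighbours, then the result decays toward zero."""
    new = {}
    for unit, act in activations.items():
        incoming = sum(weights.get((src, unit), 0.0) * a
                       for src, a in activations.items())
        new[unit] = (1 - decay) * (act + incoming)
    return new

# Two signs connected through a shared sub-lexical unit (a handshape):
# activation flows from SIGN-A through the handshape to SIGN-B.
acts = {"SIGN-A": 1.0, "SIGN-B": 0.0, "handshape": 0.0}
w = {("SIGN-A", "handshape"): 0.5, ("handshape", "SIGN-B"): 0.5}

acts = spread(acts, w)  # the shared handshape becomes active
acts = spread(acts, w)  # which then partially activates SIGN-B
print(acts["SIGN-B"] > 0.0)  # True: activation spread to the neighbour
```

Whether such neighbourhood activation helps or hinders recognition depends on the architecture's further details, which is exactly the locus of the density effects the abstract discusses.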

  19. Lexical access in sign language: A computational model

    Directory of Open Access Journals (Sweden)

    Naomi Kenney Caselli

    2014-05-01

    Full Text Available Psycholinguistic theories have predominantly been built upon data from spoken language, which leaves open the question: How many of the conclusions truly reflect language-general principles as opposed to modality-specific ones? We take a step toward answering this question in the domain of lexical access in recognition by asking whether a single cognitive architecture might explain diverse behavioral patterns in signed and spoken language. Chen and Mirman (2012) presented a computational model of word processing that unified opposite effects of neighborhood density in speech production, perception, and written word recognition. Neighborhood density effects in sign language also vary depending on whether the neighbors share the same handshape or location. We present a spreading activation architecture that borrows the principles proposed by Chen and Mirman (2012), and show that if this architecture is elaborated to incorporate relatively minor facts about either (1) the time course of sign perception or (2) the frequency of sub-lexical units in sign languages, it produces data that match the experimental findings from sign languages. This work serves as a proof of concept that a single cognitive architecture could underlie both sign and word recognition.

  20. Cops, Computers and the Right to Privacy in the Information Age: unauthorised access and inappropriate disclosure of information complaints in New South Wales

    Directory of Open Access Journals (Sweden)

    Mike Enders

    2001-05-01

    Full Text Available The term the 'information age' is particularly applicable to Australia. In a recent email, the Australian Institute of Criminology's Chief Librarian, John Myrtle, passed on statistics which showed that internet use and access in Australia increased by about 50% during the last year (Pers. Comm. 14 July 1999). Of greater interest is the fact that almost 20% of Australian households, 1.3 million, have internet access and over one third of the adult population accessed the internet at some time during the year ending February 1999. To further back these figures, the Sydney Morning Herald of 12 February 2000 carried statistics from the Australian Bureau of Statistics which showed that 22.6% of Australian families had home internet access (Anon., 2000a, p. 105). These figures firmly place Australians among the world's most computer literate societies. Of course computers weren't always that popular. The authors of this paper entered law enforcement at a time when computers were owned by universities and major corporations - not individuals - and a decent calculator cost about a week's wages. However, things changed quickly and by the 1980s computers were an established part of policing. Today, all major police services are committed to, and reliant on, some form of computerised information system. The two systems which the authors have had contact with are the Crime Reporting and Information System for Police (CRISP) (Queensland Police Service) and the Computerised Operational Policing System (COPS) (New South Wales Police Service). While many aspects of these two systems are different, they, and all the other police information systems in existence, share one major similarity: they store and provide access to personal and confidential information on every individual with whom police come into contact during their duties. Modern police investigation techniques rely on officers being able to access this information routinely to carry out their duties.

  1. Non-perturbative QCD correlation functions

    Energy Technology Data Exchange (ETDEWEB)

    Cyrol, Anton Konrad

    2017-11-27

    Functional methods provide access to the non-perturbative regime of quantum chromodynamics. Hence, they allow investigating confinement and chiral symmetry breaking. In this dissertation, correlation functions of Yang-Mills theory and unquenched two-flavor QCD are computed from the functional renormalization group. Employing a self-consistent vertex expansion of the effective action, Yang-Mills correlation functions are obtained in four as well as in three spacetime dimensions. To this end, confinement and Slavnov-Taylor identities are discussed. Our numerical results show very good agreement with corresponding lattice results. Next, unquenched two-flavor QCD is considered where it is shown that the unquenched two-flavor gluon propagator is insensitive to the pion mass. Furthermore, the necessity for consistent truncations is emphasized. Finally, correlation functions of finite-temperature Yang-Mills theory are computed in a truncation that includes the splitting of the gluon field into directions that are transverse and longitudinal to the heat bath. In particular, it includes the splitting of the three- and four-gluon vertices. The obtained gluon propagator allows to extract a Debye screening mass that coincides with the hard thermal loop screening mass at high temperatures, but is meaningful also at temperatures below the phase transition temperature.

  2. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    Full Text Available For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.

  3. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Science.gov (United States)

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301
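The acquisition step described here reduces to searching the circular correlation between the received samples and a local code replica over every code phase, then comparing peak ratios. A toy sketch with a Barker-7 code (the code, the shift, and the use of correlation magnitudes for the ratio are simplifying assumptions, not the paper's exact setup):

```python
def circular_correlation(received, code):
    """Brute-force circular correlation of received samples with a local
    code replica, evaluated at every candidate code-phase shift."""
    n = len(received)
    return [sum(received[(i + s) % n] * code[i] for i in range(n))
            for s in range(n)]

def acquisition_metric(corr):
    """Ratio of the largest to the second-largest correlation magnitude;
    the paper's threshold is the ratio of the highest and second-highest
    correlation results in the search space."""
    mags = sorted((abs(c) for c in corr), reverse=True)
    return mags[0] / mags[1]

# Toy example: a Barker-7 spreading code and a noise-free received
# signal that is the same code circularly shifted by two samples.
code = [1, 1, 1, -1, -1, 1, -1]
received = code[2:] + code[:2]

corr = circular_correlation(received, code)
print(corr.index(max(corr)))     # code-phase shift where the peak occurs
print(acquisition_metric(corr))  # 7.0 -- well above any sensible threshold
```

A real receiver would also search over carrier-frequency bins and, per the paper, shorten this work by resampling the main lobe at a lower rate and by adapting the circular correlation time to the signal strength.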

  4. Controlling user access to electronic resources without password

    Science.gov (United States)

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison in which the user-proximal environmental information is sufficiently similar to the computer-resource proximal environmental information. In at least some embodiments, the process further includes comparing user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
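A toy sketch of the comparison step the abstract describes: grant access only when the user's reported environmental indicia sufficiently match those associated with the resource. The field names, values, and match rule below are assumptions for illustration, not taken from the patent:

```python
def sufficiently_similar(resource_env, user_env, min_matches=2):
    """Favorable comparison: enough environmental indicia agree between
    the resource-proximal and user-proximal readings."""
    matches = sum(1 for key, value in resource_env.items()
                  if user_env.get(key) == value)
    return matches >= min_matches

# Hypothetical environmental indicia -- illustrative values only.
resource_env = {"wifi_ssid": "LAB-NET", "room_beacon": "B-17", "tz": "UTC-5"}
user_env     = {"wifi_ssid": "LAB-NET", "room_beacon": "B-17", "tz": "UTC-4"}

print(sufficiently_similar(resource_env, user_env))  # True (2 of 3 agree)
```

In the patented process a biometric comparison can gate access as a further step; the environmental check above only models the "sufficiently similar" clause.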

  5. Metal oxide resistive random access memory based synaptic devices for brain-inspired computing

    Science.gov (United States)

    Gao, Bin; Kang, Jinfeng; Zhou, Zheng; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan

    2016-04-01

    The traditional Boolean computing paradigm based on the von Neumann architecture is facing great challenges for future information technology applications such as big data, the Internet of Things (IoT), and wearable devices, due to the limited processing capability issues such as binary data storage and computing, non-parallel data processing, and the buses requirement between memory units and logic units. The brain-inspired neuromorphic computing paradigm is believed to be one of the promising solutions for realizing more complex functions with a lower cost. To perform such brain-inspired computing with a low cost and low power consumption, novel devices for use as electronic synapses are needed. Metal oxide resistive random access memory (ReRAM) devices have emerged as the leading candidate for electronic synapses. This paper comprehensively addresses the recent work on the design and optimization of metal oxide ReRAM-based synaptic devices. A performance enhancement methodology and optimized operation scheme to achieve analog resistive switching and low-energy training behavior are provided. A three-dimensional vertical synapse network architecture is proposed for high-density integration and low-cost fabrication. The impacts of the ReRAM synaptic device features on the performances of neuromorphic systems are also discussed on the basis of a constructed neuromorphic visual system with a pattern recognition function. Possible solutions to achieve the high recognition accuracy and efficiency of neuromorphic systems are presented.

  6. Correlation of computed tomography, sonography, and gross anatomy of the liver

    International Nuclear Information System (INIS)

    Sexton, C.F.; Zeman, R.K.

    1983-01-01

    Although the vascular and segmental anatomy of the liver is well defined in the sonographic and computed tomographic (CT) literature, the three-dimensional relations of hepatic structures remain conceptually complex. As an aid to understanding and teaching this anatomy, axially and sagittally sectioned gross liver specimens were correlated with appropriate sonographic and CT scans. Major hepatic landmarks and vascular structures are used to identify four distinct transverse planes and four sagittal planes.

  7. A Parallel Approach in Computing Correlation Immunity up to Six Variables

    Science.gov (United States)

    2015-07-24

    …second step, we specify that a condition hold across all assignments of values to the variables chosen in the first step. For pedagogical reasons, we could…table of the function whose correlation immunity is currently being computed. When this circuit is used in exhaustive enumeration, the Function…

  8. Living with Computers. Young Danes' Uses of and Thoughts on the Uses of Computers

    DEFF Research Database (Denmark)

    Stald, Gitte Bang

    1998-01-01

    Young Danes, computers, users, super users, non-users, computer access.

  9. Correlation of computed tomographic and magnetic resonance imaging findings in cerebral infarction

    International Nuclear Information System (INIS)

    Komatsubara, Chizuko; Chuda, Moriyoshi; Taka, Toshihiko

    1989-01-01

    We evaluated neurological findings in 75 patients with cerebral infarction, and correlated computed tomographic (CT) and magnetic resonance imaging (MRI) findings. MRI was found to have the advantage when the lesions were multiple or located in the posterior fossa. MRI demonstrates anatomical detail and is free of bony artifacts, so it is an excellent method for identification of cerebral infarction. (author)

  10. Examining Stakeholder Perceptions of Accessibility and Utilization of Computer and Internet Technology in the Selinsgrove Area School District

    Science.gov (United States)

    Krause, Lorinda M.

    2014-01-01

    This study utilized a mixed methods approach to examine the issue of how parents, students, and teachers (stakeholders) perceive accessibility and the utilization of computer and Internet technology within the Selinsgrove, Pennsylvania Area School District. Quantitative data was collected through the use of questionnaires distributed to the…

  11. Toward a computational theory of conscious processing.

    Science.gov (United States)

    Dehaene, Stanislas; Charles, Lucie; King, Jean-Rémi; Marti, Sébastien

    2014-04-01

    The study of the mechanisms of conscious processing has become a productive area of cognitive neuroscience. Here we review some of the recent behavioral and neuroscience data, with the specific goal of constraining present and future theories of the computations underlying conscious processing. Experimental findings imply that most of the brain's computations can be performed in a non-conscious mode, but that conscious perception is characterized by an amplification, global propagation and integration of brain signals. A comparison of these data with major theoretical proposals suggests that firstly, conscious access must be carefully distinguished from selective attention; secondly, conscious perception may be likened to a non-linear decision that 'ignites' a network of distributed areas; thirdly, information which is selected for conscious perception gains access to additional computations, including temporary maintenance, global sharing, and flexible routing; and finally, measures of the complexity, long-distance correlation and integration of brain signals provide reliable indices of conscious processing, clinically relevant to patients recovering from coma. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Correlation Filters for Detection of Cellular Nuclei in Histopathology Images.

    Science.gov (United States)

    Ahmad, Asif; Asif, Amina; Rajpoot, Nasir; Arif, Muhammad; Minhas, Fayyaz Ul Amir Afsar

    2017-11-21

    Nuclei detection in histology images is an essential part of computer aided diagnosis of cancers and tumors. It is a challenging task due to the diverse and complicated structures of cells. In this work, we present an automated technique for detection of cellular nuclei in hematoxylin and eosin stained histopathology images. Our proposed approach is based on kernelized correlation filters. Correlation filters have been widely used in object detection and tracking applications but their strength has not been explored in the medical imaging domain until now. Our experimental results show that the proposed scheme gives state-of-the-art accuracy and can learn complex nuclear morphologies. Like deep learning approaches, the proposed filters do not require engineering of image features as they can operate directly on histopathology images without significant preprocessing. However, unlike deep learning methods, the large-margin correlation filters developed in this work are interpretable, computationally efficient and do not require specialized or expensive computing hardware. A cloud based webserver of the proposed method and its python implementation can be accessed at the following URL: http://faculty.pieas.edu.pk/fayyaz/software.html#corehist .
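
    As a toy illustration of correlation-based detection (plain cross-correlation with a zero-mean template, not the kernelized large-margin filters of the paper; the synthetic image and all names are illustrative):

    ```python
    import numpy as np

    def detect_nuclei(image, template, threshold):
        # Correlate a zero-mean template with every image patch and return
        # the top-left positions where the response exceeds the threshold.
        th, tw = template.shape
        t = template - template.mean()
        out_h, out_w = image.shape[0] - th + 1, image.shape[1] - tw + 1
        resp = np.empty((out_h, out_w))
        for i in range(out_h):
            for j in range(out_w):
                resp[i, j] = np.sum(image[i:i + th, j:j + tw] * t)
        return np.argwhere(resp > threshold)

    # Synthetic image containing one blob that matches the template.
    tmpl = np.array([[0, 1, 0], [1, 2, 1], [0, 1, 0]], dtype=float)
    img = np.zeros((32, 32))
    img[10:13, 20:23] = tmpl
    hits = detect_nuclei(img, tmpl, threshold=3.0)  # single hit at (10, 20)
    ```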

  13. Explicitly-correlated ring-coupled-cluster-doubles theory: Including exchange for computations on closed-shell systems

    Energy Technology Data Exchange (ETDEWEB)

    Hehn, Anna-Sophia; Holzer, Christof; Klopper, Wim, E-mail: klopper@kit.edu

    2016-11-10

    Highlights: • Ring-coupled-cluster-doubles approach now implemented with exchange terms. • Ring-coupled-cluster-doubles approach now implemented with F12 functions. • Szabo–Ostlund scheme (SO2) implemented for use in SAPT. • Fast convergence to the limit of a complete basis. • Implementation in the TURBOMOLE program system. - Abstract: Random-phase-approximation (RPA) methods have proven to be powerful tools in electronic-structure theory, being non-empirical, computationally efficient and broadly applicable to a variety of molecular systems including small-gap systems, transition-metal compounds and dispersion-dominated complexes. Applications are however hindered due to the slow basis-set convergence of the electron-correlation energy with the one-electron basis. As a remedy, we present approximate explicitly-correlated RPA approaches based on the ring-coupled-cluster-doubles formulation including exchange contributions. Test calculations demonstrate that the basis-set convergence of correlation energies is drastically accelerated through the explicitly-correlated approach, reaching 99% of the basis-set limit with triple-zeta basis sets. When implemented in close analogy to early work by Szabo and Ostlund [36], the new explicitly-correlated ring-coupled-cluster-doubles approach including exchange has the perspective to become a valuable tool in the framework of symmetry-adapted perturbation theory (SAPT) for the computation of dispersion energies of molecular complexes of weakly interacting closed-shell systems.

  14. 36 CFR 1254.32 - What rules apply to public access use of the Internet on NARA-supplied computers?

    Science.gov (United States)

    2010-07-01

    ... access use of the Internet on NARA-supplied computers? 1254.32 Section 1254.32 Parks, Forests, and Public Property NATIONAL ARCHIVES AND RECORDS ADMINISTRATION PUBLIC AVAILABILITY AND USE USING RECORDS AND DONATED... for Internet use in all NARA research rooms. The number of workstations varies per location. We...

  15. Air trapping in sarcoidosis on computed tomography: Correlation with lung function

    International Nuclear Information System (INIS)

    Davies, C.W.H.; Tasker, A.D.; Padley, S.P.G.; Davies, R.J.O.; Gleeson, F.V.

    2000-01-01

    AIMS: To document the presence and extent of air trapping on high resolution computed tomography (HRCT) in patients with pulmonary sarcoidosis and correlate HRCT features with pulmonary function tests. METHODS: Twenty-one patients with pulmonary sarcoidosis underwent HRCT and pulmonary function assessment at presentation. Inspiratory and expiratory HRCT were assessed for the presence and extent of air trapping, ground-glass opacification, nodularity, septal thickening, bronchiectasis and parenchymal distortion. HRCT features were correlated with pulmonary function tests. RESULTS: Air trapping on expiratory HRCT was present in 20/21 (95%) patients. The extent of air trapping correlated with percentage predicted residual volume (RV)/total lung capacity (TLC) (r = 0.499; P < 0.05) and percentage predicted maximal mid-expiratory flow rate between 25 and 75% of the vital capacity (r = -0.54; P < 0.05). Ground-glass opacification was present in four of 21 (19%), nodularity in 18/21 (86%), septal thickening in 18/21 (86%), traction bronchiectasis in 14/21 (67%) and distortion in 12/21 (57%) of patients; there were no significant relationships between these CT features and pulmonary function results. CONCLUSION: Air trapping is a common feature in sarcoidosis and correlates with evidence of small airways disease on pulmonary function testing. Davies, C.W.H. (2000). Clinical Radiology 55, 217-221.

  16. Notified Access: Extending Remote Memory Access Programming Models for Producer-Consumer Synchronization

    KAUST Repository

    Belli, Roberto; Hoefler, Torsten

    2015-01-01

    Remote Memory Access (RMA) programming enables direct access to low-level hardware features to achieve high performance for distributed-memory programs. However, the design of RMA programming schemes focuses on the memory access and less on the synchronization. For example, in contemporary RMA programming systems, the widely used producer-consumer pattern can only be implemented inefficiently, incurring the overhead of an additional round-trip message. We propose Notified Access, a scheme where the target process of an access can receive a completion notification. This scheme enables direct and efficient synchronization with a minimum number of messages. We implement our scheme in an open source MPI-3 RMA library and demonstrate lower overheads (two cache misses) than other point-to-point synchronization mechanisms for each notification. We also evaluate our implementation on three real-world benchmarks: a stencil computation, a tree computation, and a Cholesky factorization implemented with tasks. Our scheme always performs better than traditional message passing and other existing RMA synchronization schemes, providing up to 50% speedup on small messages. Our analysis shows that Notified Access is a valuable primitive for any RMA system. Furthermore, we provide guidance for the design of low-level network interfaces to support Notified Access efficiently.

  17. Notified Access: Extending Remote Memory Access Programming Models for Producer-Consumer Synchronization

    KAUST Repository

    Belli, Roberto

    2015-05-01

    Remote Memory Access (RMA) programming enables direct access to low-level hardware features to achieve high performance for distributed-memory programs. However, the design of RMA programming schemes focuses on the memory access and less on the synchronization. For example, in contemporary RMA programming systems, the widely used producer-consumer pattern can only be implemented inefficiently, incurring the overhead of an additional round-trip message. We propose Notified Access, a scheme where the target process of an access can receive a completion notification. This scheme enables direct and efficient synchronization with a minimum number of messages. We implement our scheme in an open source MPI-3 RMA library and demonstrate lower overheads (two cache misses) than other point-to-point synchronization mechanisms for each notification. We also evaluate our implementation on three real-world benchmarks: a stencil computation, a tree computation, and a Cholesky factorization implemented with tasks. Our scheme always performs better than traditional message passing and other existing RMA synchronization schemes, providing up to 50% speedup on small messages. Our analysis shows that Notified Access is a valuable primitive for any RMA system. Furthermore, we provide guidance for the design of low-level network interfaces to support Notified Access efficiently.

  18. Assessment of Computer Technology Availability, Accessibility and Usage by Agricultural Education Student Teachers in Secondary Schools in Botswana

    Science.gov (United States)

    Hulela, K.; Rammolai, M.; Mpatane, W.

    2014-01-01

    This study examines the availability, accessibility and usability of computer as a form of information and communication technologies (ICTs) by student teachers in secondary schools. 44 out of 51 student teachers of Agriculture responded to the questionnaire. Means and percentages were used to analyze the data to establish the availability,…

  19. A Survey of Exemplar Teachers' Perceptions, Use, and Access of Computer-Based Games and Technology for Classroom Instruction

    Science.gov (United States)

    Proctor, Michael D.; Marks, Yaela

    2013-01-01

    This research reports and analyzes for archival purposes surveyed perceptions, use, and access by 259 United States based exemplar Primary and Secondary educators of computer-based games and technology for classroom instruction. Participating respondents were considered exemplary as they each won the Milken Educator Award during the 1996-2009…

  20. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Science.gov (United States)

    du Plessis, Anton; le Roux, Stephan Gerhard; Guelpa, Anina

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory's first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  1. The CT Scanner Facility at Stellenbosch University: An open access X-ray computed tomography laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Plessis, Anton du, E-mail: anton2@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Physics Department, Stellenbosch University, Stellenbosch (South Africa); Roux, Stephan Gerhard le, E-mail: lerouxsg@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa); Guelpa, Anina, E-mail: aninag@sun.ac.za [CT Scanner Facility, Central Analytical Facilities, Stellenbosch University, Stellenbosch (South Africa)

    2016-10-01

    The Stellenbosch University CT Scanner Facility is an open access laboratory providing non-destructive X-ray computed tomography (CT) and high performance image analysis services as part of the Central Analytical Facilities (CAF) of the university. Based in Stellenbosch, South Africa, this facility offers open access to the general user community, including local researchers, companies and also remote users (both local and international, via sample shipment and data transfer). The laboratory hosts two CT instruments, i.e. a micro-CT system as well as a nano-CT system. A workstation-based Image Analysis Centre is equipped with numerous computers with data analysis software packages, which are at the disposal of the facility users, along with expert supervision if required. All research disciplines are accommodated at the X-ray CT laboratory, provided that non-destructive analysis will be beneficial. During its first four years, the facility has accommodated more than 400 unique users (33 in 2012; 86 in 2013; 154 in 2014; 140 in 2015; 75 in the first half of 2016), with diverse industrial and research applications using X-ray CT as a means. This paper summarises the laboratory’s first four years by way of selected examples, both from published and unpublished projects. In the process a detailed description of the capabilities and facilities available to users is presented.

  2. Training to use a commercial brain-computer interface as access technology: a case study.

    Science.gov (United States)

    Taherian, Sarvnaz; Selitskiy, Dmitry; Pau, James; Davies, T Claire; Owens, R Glynn

    2016-01-01

    This case study describes how an individual with spastic quadriplegic cerebral palsy was trained over a period of four weeks to use a commercial electroencephalography (EEG)-based brain-computer interface (BCI). The participant spent three sessions exploring the system and seven sessions playing a game focused on EEG feedback training of left and right arm motor imagery; a customised training game paradigm was employed. The participant showed improvement in the production of two distinct EEG patterns. The participant's performance was influenced by motivation, fatigue and concentration. Six weeks post-training the participant could still control the BCI and used it to type a sentence using an augmentative and alternative communication application on a wirelessly linked device. The results from this case study highlight the importance of creating a dynamic, relevant and engaging training environment for BCIs. Implications for Rehabilitation: Customising a training paradigm to suit the user's interests can influence adherence to assistive technology training. Mood, fatigue, physical illness and motivation influence the usability of a brain-computer interface. Commercial brain-computer interfaces, which require little set up time, may be used as access technology for individuals with severe disabilities.

  3. 3D CFD computations of transitional flows using DES and a correlation based transition model

    DEFF Research Database (Denmark)

    Sørensen, Niels N.; Bechmann, Andreas; Zahle, Frederik

    2011-01-01

    The present article describes the application of the correlation based transition model of Menter et al. in combination with the Detached Eddy Simulation (DES) methodology to two cases with a large degree of flow separation, typically considered difficult to compute. Firstly, the flow is computed over a circular cylinder from Re = 10 to 1 × 10⁶, reproducing the cylinder drag crisis. The computations show good quantitative and qualitative agreement with the behaviour seen in experiments. This case shows that the methodology performs smoothly from the laminar cases at low Re to the turbulent cases at high Re.

  4. Computer self-efficacy and computer attitude as correlates of ...

    African Journals Online (AJOL)

    The Internet as a useful tool that supports teaching and learning is not in full use in most secondary schools in Nigeria hence limiting the students from maximizing the potentials of Internet in advancing their academic pursuits. This study, therefore, examined the extent to which computer self-efficacy and computer attitude ...

  5. Joint Hybrid Backhaul and Access Links Design in Cloud-Radio Access Networks

    KAUST Repository

    Dhifallah, Oussama Najeeb

    2015-09-06

    The cloud-radio access network (CRAN) is expected to be the core network architecture for next generation mobile radio systems. In this paper, we consider the downlink of a CRAN formed of one central processor (the cloud) and several base stations (BSs), where each BS is connected to the cloud via either a wireless or capacity-limited wireline backhaul link. The paper addresses the joint design of the hybrid backhaul links (i.e., designing the wireline and wireless backhaul connections from the cloud to the BSs) and the access links (i.e., determining the sparse beamforming solution from the BSs to the users). The paper formulates the hybrid backhaul and access link design problem by minimizing the total network power consumption. The paper solves the problem using a two-stage heuristic algorithm. At the first stage, the sparse beamforming solution is found using a weighted mixed ℓ1/ℓ2-norm minimization approach; the correlation matrix of the quantization noise of the wireline backhaul links is computed using classical rate-distortion theory. At the second stage, the transmit powers of the wireless backhaul links are found by solving a power minimization problem subject to quality-of-service constraints, based on the principle of conservation of rate, by utilizing the rates found in the first stage. Simulation results suggest that the performance of the proposed algorithm approaches the global optimum solution, especially at high signal-to-interference-plus-noise ratio (SINR).
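
    The group-sparsity penalty underlying this kind of sparse beamforming can be sketched as a weighted ℓ1 sum of per-BS ℓ2 norms: grouping each base station's beamforming coefficients so that minimizing the outer sum drives whole BSs (and hence their backhaul links) to switch off. Names and values below are illustrative, not from the paper:

    ```python
    import numpy as np

    def mixed_l1_l2(beams, weights):
        # Weighted mixed l1/l2 norm: l2 norm within each group (one group
        # per BS), weighted l1 sum across groups to induce group sparsity.
        return sum(w * np.linalg.norm(v) for w, v in zip(weights, beams))

    # Three BSs, two of them (nearly) silent: the penalty is dominated by
    # BS 0, whose beamforming vector has l2 norm 5.
    beams = [np.array([3.0, 4.0]), np.array([1e-6, 0.0]), np.zeros(2)]
    val = mixed_l1_l2(beams, weights=[1.0, 1.0, 1.0])
    ```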

  6. Effect Through Broadcasting System Access Point For Video Transmission

    Directory of Open Access Journals (Sweden)

    Leni Marlina

    2015-08-01

    Most universities already implement wired and wireless networks that are used to access integrated information systems and the Internet. It is therefore important to study the influence of a broadcasting system that transmits instructional video through an access point across a university area. A wired campus network requires cables to connect computers and carry data between them, whereas a wireless network connects computers through radio waves. This research tests and assesses how a WLAN access point performs when broadcasting instructional video from a server to clients. The study aims to show how to build a wireless network using an access point, and to configure a server with supporting video software whose instructional content is broadcast from the server to the clients through that access point.

  7. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory, and supports processor-to-processor data exchange over an enhanced MESH network. The ABC95 instruction system includes control instructions, scalar instructions and vector instructions; the network instructions are mainly introduced here. A programming environment for ABC95 array computer assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is advanced. It includes functions to load ABC95 array computer programs and data, store results, run programs, and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies allow programmers to develop software for the ABC95 array computer effectively.

  8. Increasing the computational efficiency of digital cross correlation by a vectorization method

    Science.gov (United States)

    Chang, Ching-Yuan; Ma, Chien-Ching

    2017-08-01

    This study presents a vectorization method for use in MATLAB programming aimed at increasing the computational efficiency of digital cross correlation in sound and images, resulting in speedups of 6.387 and 36.044 times compared with performance values obtained from looped expressions. This work bridges the gap between matrix operations and loop iteration, preserving flexibility and efficiency in program testing. This paper uses numerical simulation to verify the speedup of the proposed vectorization method as well as experiments to measure the quantitative transient displacement response subjected to dynamic impact loading. The experiment involved the use of a high speed camera as well as a fiber optic system to measure the transient displacement of a cantilever beam under impact from a steel ball. Experimental measurement data obtained from the two methods are in excellent agreement in both the time and frequency domains, with discrepancies of only 0.68%. Numerical and experimental results demonstrate the efficacy of the proposed vectorization method with regard to computational speed in signal processing and high precision in the correlation algorithm. We also present the source code with which to build MATLAB-executable functions on Windows as well as Linux platforms, and provide a series of examples to demonstrate the application of the proposed vectorization method.
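
    The paper's implementation is in MATLAB; as a language-neutral sketch of the same loop-versus-vectorized contrast (not the authors' code), the following NumPy fragment computes one-sided cross correlation both ways and confirms they agree:

    ```python
    import numpy as np

    def xcorr_loop(x, y):
        # Looped cross correlation, one lag at a time (slow reference).
        n = len(x)
        return np.array([np.dot(x[k:], y[:n - k]) for k in range(n)])

    def xcorr_vectorized(x, y):
        # Same quantity in a single library call; the 6.387x/36.044x
        # speedups reported above are the paper's MATLAB figures, not
        # measurements of this sketch.
        return np.correlate(x, y, mode='full')[len(x) - 1:]

    x = np.random.default_rng(1).random(256)
    y = np.random.default_rng(2).random(256)
    slow = xcorr_loop(x, y)
    fast = xcorr_vectorized(x, y)
    ```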

  9. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    Science.gov (United States)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large scale gravity data sets have been collected and employed to enhance the gravity problem-solving abilities of tectonics studies in China. Aiming at the large scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors added different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation time, instability and other shortcomings, 3-D physical property inversion has not been widely applied to large-scale data yet. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need to find a fast and efficient inversion method for large scale gravity data. As an entirely new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding a small amount of computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with real data, and

  10. Implementation of the Two-Point Angular Correlation Function on a High-Performance Reconfigurable Computer

    Directory of Open Access Journals (Sweden)

    Volodymyr V. Kindratenko

    2009-01-01

    We present a parallel implementation of an algorithm for calculating the two-point angular correlation function as applied in the field of computational cosmology. The algorithm has been specifically developed for a reconfigurable computer. Our implementation utilizes a microprocessor and two reconfigurable processors on a dual-MAP SRC-6 system. The two reconfigurable processors are used as two application-specific co-processors. Two independent computational kernels are simultaneously executed on the reconfigurable processors while data pre-fetching from disk and initial data pre-processing are executed on the microprocessor. The overall end-to-end algorithm execution speedup achieved by this implementation is over 90× as compared to a sequential implementation of the algorithm executed on a single 2.8 GHz Intel Xeon microprocessor.
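
    The computational kernel being accelerated — counting pairs of sky positions by angular separation — can be sketched in brute-force form (illustrative only; the paper's implementation maps this onto an SRC-6 reconfigurable computer):

    ```python
    import numpy as np

    def pair_separations(ra, dec):
        # All pairwise angular separations (radians) between sky positions,
        # via unit vectors and dot products; O(N^2) reference version of the
        # pair-counting kernel. Coordinates are in radians.
        x = np.cos(dec) * np.cos(ra)
        y = np.cos(dec) * np.sin(ra)
        z = np.sin(dec)
        pts = np.stack([x, y, z], axis=1)
        cosang = np.clip(pts @ pts.T, -1.0, 1.0)
        iu = np.triu_indices(len(ra), k=1)  # each unordered pair once
        return np.arccos(cosang[iu])

    # Histogram of data-data pair counts in angular bins: the DD term of a
    # two-point correlation estimator (random-random counts omitted here).
    rng = np.random.default_rng(3)
    ra = rng.uniform(0, 2 * np.pi, 100)
    dec = rng.uniform(-0.5, 0.5, 100)
    dd, edges = np.histogram(pair_separations(ra, dec), bins=16)
    ```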

  11. Pipeline leak detection and location by on-line-correlation with a process computer

    International Nuclear Information System (INIS)

    Siebert, H.; Isermann, R.

    1977-01-01

    A method for leak detection using a correlation technique in pipelines is described. For leak detection, and also for leak localisation and estimation of the leak flow, recursive estimation algorithms are used. The efficiency of the methods is demonstrated with a process computer and a pipeline model operating on-line. It is shown that very small leaks can be detected. (orig.) [de]

  12. Method and apparatus for managing access to a memory

    Science.gov (United States)

    DeBenedictis, Erik

    2017-08-01

    A method and apparatus for managing access to a memory of a computing system. A controller transforms a plurality of operations that represent a computing job into an operational memory layout that reduces a size of a selected portion of the memory that needs to be accessed to perform the computing job. The controller stores the operational memory layout in a plurality of memory cells within the selected portion of the memory. The controller controls a sequence by which a processor in the computing system accesses the memory to perform the computing job using the operational memory layout. The operational memory layout reduces an amount of energy consumed by the processor to perform the computing job.

  13. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General Processing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: 1. critical information can be provided faster and 2. more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs and (2) a CUDA enabled GPU workstation. The reference platform is a dual CPU-quad core workstation and the PANTEX workflow total computing time is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring various hardware solutions and the related software coding effort are presented.
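    The GLCM statistics behind PANTEX reduce, per window, to a co-occurrence matrix over pixel pairs at a given offset plus scalar features derived from it. A minimal sketch of one such feature (contrast) in pure Python; the actual PANTEX index combines many offsets anisotropically and is not reproduced here:

```python
def glcm(img, dx, dy, levels):
    # Normalized gray-level co-occurrence matrix for pixel pairs at offset (dx, dy)
    m = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    n = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                m[img[y][x]][img[y2][x2]] += 1
                n += 1
    return [[v / n for v in row] for row in m]

def contrast(m):
    # GLCM contrast feature: sum_ij (i - j)^2 * p(i, j)
    return sum((i - j) ** 2 * m[i][j]
               for i in range(len(m)) for j in range(len(m)))

flat = [[0, 0, 0, 0]] * 4        # homogeneous patch -> zero contrast
stripes = [[0, 1, 0, 1]] * 4     # alternating columns -> maximal contrast at dx=1
c_flat = contrast(glcm(flat, 1, 0, 2))
c_stripes = contrast(glcm(stripes, 1, 0, 2))
```

    Sliding this computation over every window of a large image is the embarrassingly parallel workload that maps well to both the blade cluster and the GPU.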

  14. Local Nucleosome Dynamics Facilitate Chromatin Accessibility in Living Mammalian Cells

    Directory of Open Access Journals (Sweden)

    Saera Hihara

    2012-12-01

    Full Text Available Genome information, which is three-dimensionally organized within cells as chromatin, is searched and read by various proteins for diverse cell functions. Although how the protein factors find their targets remains unclear, the dynamic and flexible nature of chromatin is likely crucial. Using a combined approach of fluorescence correlation spectroscopy, single-nucleosome imaging, and Monte Carlo computer simulations, we demonstrate local chromatin dynamics in living mammalian cells. We show that similar to interphase chromatin, dense mitotic chromosomes also have considerable chromatin accessibility. For both interphase and mitotic chromatin, we observed local fluctuation of individual nucleosomes (∼50 nm movement/30 ms), which is caused by confined Brownian motion. Inhibition of these local dynamics by crosslinking impaired accessibility in the dense chromatin regions. Our findings show that local nucleosome dynamics drive chromatin accessibility. We propose that this local nucleosome fluctuation is the basis for scanning genome information.
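    The confined Brownian motion reported here can be caricatured with a 1-D random walk whose excursions are clamped to a box: the mean squared displacement saturates instead of growing linearly with time. This is only an illustrative sketch (arbitrary units, clamped rather than properly reflecting walls), not the paper's Monte Carlo model:

```python
import random

def msd_after(steps, n_walkers, step_sd, box=None, seed=1):
    # Mean squared displacement of 1-D Gaussian random walkers,
    # optionally confined by clamping positions to [-box, box]
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0, step_sd)
            if box is not None:
                x = max(-box, min(box, x))
        total += x * x
    return total / n_walkers

free = msd_after(400, 500, 1.0)               # grows like steps * step_sd^2
confined = msd_after(400, 500, 1.0, box=5.0)  # saturates below box^2
```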

  15. Access Denied! Contrasting Data Access in the United States and Ireland

    Directory of Open Access Journals (Sweden)

    Grogan Samuel

    2016-07-01

    Full Text Available The ability of an Internet user to access data collected about himself as a result of his online activity is a key privacy safeguard. Online, data access has been overshadowed by other protections such as notice and choice. This paper describes attitudes about data access. 873 US and Irish Internet users participated in a survey designed to examine views on data access to information held by online companies and data brokers. We observed low levels of awareness of access mechanisms along with a high desire for access in both participant groups. We tested three proposed access systems in keeping with industry programs and regulatory proposals. User response was positive. We conclude that access remains an important privacy protection that is inadequately manifested in practice. Our study provides insight for lawmakers and policymakers, as well as computer scientists who implement these systems.

  16. Computers in Academic Architecture Libraries.

    Science.gov (United States)

    Willis, Alfred; And Others

    1992-01-01

    Computers are widely used in architectural research and teaching in U.S. schools of architecture. A survey of libraries serving these schools sought information on the emphasis placed on computers by the architectural curriculum, accessibility of computers to library staff, and accessibility of computers to library patrons. Survey results and…

  17. Ubiquitous Accessibility for People with Visual Impairments: Are We There Yet?

    Science.gov (United States)

    Billah, Syed Masum; Ashok, Vikas; Porter, Donald E; Ramakrishnan, I V

    2017-05-01

    Ubiquitous access is an increasingly common vision of computing, wherein users can interact with any computing device or service from anywhere, at any time. In the era of personal computing, users with visual impairments required special-purpose, assistive technologies, such as screen readers, to interact with computers. This paper investigates whether technologies like screen readers have kept pace with, or have created a barrier to, the trend toward ubiquitous access, with a specific focus on desktop computing as this is still the primary way computers are used in education and employment. Towards that, the paper presents a user study with 21 visually-impaired participants, specifically involving the switching of screen readers within and across different computing platforms, and the use of screen readers in remote access scenarios. Among the findings, the study shows that, even for remote desktop access, an early forerunner of true ubiquitous access, screen readers are too limited, if not unusable. The study also identifies several accessibility needs, such as uniformity of navigational experience across devices, and recommends potential solutions. In summary, assistive technologies have not made the jump into the era of ubiquitous access, and multiple, inconsistent screen readers create new practical problems for users with visual impairments.

  18. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible, without waiting for the access restriction to take effect. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (e.g. security problems, viruses, etc.) Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/regis...

  19. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: I. Principles and some general algorithms.

    Science.gov (United States)

    Langenbucher, Frieder

    2002-01-01

    Most computations in the field of in vitro/in vivo correlations can be handled directly by Excel worksheets, without the need for specialized software. Following a summary of Excel features, applications are illustrated for numerical computation of AUC and Mean, Wagner-Nelson and Loo-Riegelman absorption plots, and polyexponential curve fitting.
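    The numerical pieces named here are simple enough to state directly. A pure-Python sketch of cumulative trapezoidal AUC and the Wagner-Nelson fraction-absorbed plot (the elimination constant ke and the simulated one-compartment profile are invented for illustration; the article, of course, does this in Excel worksheets):

```python
import math

def auc_trapezoid(t, c):
    # Cumulative AUC(0..t_i) by the linear trapezoidal rule
    out = [0.0]
    for i in range(1, len(t)):
        out.append(out[-1] + 0.5 * (c[i] + c[i - 1]) * (t[i] - t[i - 1]))
    return out

def wagner_nelson(t, c, ke):
    # Fraction absorbed: F(t) = (C(t) + ke*AUC(0..t)) / (ke*AUC(0..inf))
    auc = auc_trapezoid(t, c)
    auc_inf = auc[-1] + c[-1] / ke   # tail extrapolated as C(t_last)/ke
    return [(ci + ke * ai) / (ke * auc_inf) for ci, ai in zip(c, auc)]

# Simulated oral one-compartment profile (ka, ke are invented illustration values)
ka, ke = 1.5, 0.2
t = [0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0, 24.0]
c = [ka / (ka - ke) * (math.exp(-ke * x) - math.exp(-ka * x)) for x in t]
f_abs = wagner_nelson(t, c, ke)   # rises from 0 toward 1
```

    Each line of this maps one-to-one onto an Excel column formula, which is the article's point.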

  20. Flexible Access Control for Dynamic Collaborative Environments

    NARCIS (Netherlands)

    Dekker, M.A.C.

    2009-01-01

    Access control is used in computer systems to control access to confidential data. In this thesis we focus on access control for dynamic collaborative environments where multiple users and systems access and exchange data in an ad hoc manner. In such environments it is difficult to protect

  1. Lexical semantic access and letter access are involved in different aspects of reading

    DEFF Research Database (Denmark)

    Poulsen, Mads

    Purpose: This study investigated the effects of lexical access speed and letter access speed on reading fluency and reading comprehension. We hypothesized that 1) letter access speed would correlate with reading fluency but not comprehension, while 2) lexical access speed would influence reading comprehension. For readers who are struggling with recoding, most of the reading effort is probably tied up with recoding, leaving little to be explained by lexical access. Therefore we expected that 3) lexical access speed would primarily predict reading fluency for readers who were no longer struggling (...). In this subset sample, both letter access and lexical access accounted for unique variance in reading fluency. The pattern of effects for lexical access did not change by controlling for serial rapid naming (RAN). Conclusions: The results suggest that letter access and lexical access are important for different (...)

  2. Predictive access control for distributed computation

    DEFF Research Database (Denmark)

    Yang, Fan; Hankin, Chris; Nielson, Flemming

    2013-01-01

    We show how to use aspect-oriented programming to separate security and trust issues from the logical design of mobile, distributed systems. The main challenge is how to enforce various types of security policies, in particular predictive access control policies — policies based on the future behavior of a program. A novel feature of our approach is that we can define policies concerning secondary use of data.

  3. Fencing direct memory access data transfers in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Mamidala, Amith R.

    2013-09-03

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to segments of shared random access memory through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and a segment of shared memory; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.

  4. Malignant pleural mesothelioma: Computed tomography and correlation with histology

    Energy Technology Data Exchange (ETDEWEB)

    Seely, Jean M. [Department of Diagnostic Imaging, Ottawa Hospital, 1053 Carling Avenue, Ottawa, Ontario K1Y 4E9 (Canada)], E-mail: jeseely@ottawahospital.on.ca; Nguyen, Elsie T., E-mail: nguyen_elsie@hotmail.com; Churg, Andrew M. [University of British Columbia, 2211 Wesbrook Mall, Vancouver, BC V6T 1W5 (Canada)], E-mail: achurg@interchange.ubc.ca; Mueller, Nestor L. [University of British Columbia, Vancouver Hospital and Health Sciences Centre, 855 West 12th Avenue, Vancouver, BC V5Z 1M9 (Canada)], E-mail: nmuller@vanhosp.bc.ca

    2009-06-15

    Objective: To review the computed tomography (CT) imaging findings of pleural mesothelioma at presentation and to correlate the CT with the histological subtype. Materials and methods: Pathology reports from 1997 to 2006 were reviewed at two academic institutions to identify patients with proven pleural mesothelioma. Diagnosis was based on histologic findings in specimens obtained by transthoracic needle biopsy, surgical biopsy or resection. All histology slides were reviewed by a lung pathologist. CT scans, available in 92 patients, were reviewed blindly and in random order by two independent radiologists. Kappa analysis was completed to assess inter-observer agreement. Eighty patients in whom there was no significant delay between CT imaging and histological diagnosis were assessed by logistic regression analysis to correlate CT and histologic findings. Results: Seventy-two of the 92 mesotheliomas were epithelial, 15 sarcomatous, and 5 of mixed histology. All patients (77 male, 15 female, mean age 68 years) had pleural thickening on CT; the thickening was nodular in 79 patients (86%) and mediastinal in 87 (95%). Ipsilateral volume loss was seen in 42 patients (46%). Pleural effusions were present in 80 patients (87%), being large (>2/3 hemithorax) in 19 patients (21%). Atypical features at presentation included bilateral disease in three patients (3%), and spontaneous pneumothoraces in nine patients (10%). Internal mammary lymphadenopathy was observed in 48 patients (52%) and cardiophrenic lymphadenopathy in 42 (46%). Inter-observer agreement was excellent (average kappa = 0.89). Ipsilateral volume loss was associated with sarcomatous or mixed mesothelioma (p = 0.004). Using logistic regression analysis, other CT findings did not correlate with histological subtype. Conclusions: Ipsilateral volume loss is most frequently associated with sarcomatous or mixed mesothelioma. The remaining imaging findings are not helpful in predicting the histological subtype of
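    The inter-observer agreement reported above (average kappa = 0.89) is Cohen's kappa. For reference, a minimal sketch of the statistic (pure Python, with made-up toy ratings, not the study's data):

```python
def cohens_kappa(rater1, rater2):
    # Chance-corrected agreement: (p_observed - p_expected) / (1 - p_expected)
    labels = sorted(set(rater1) | set(rater2))
    n = len(rater1)
    po = sum(1 for a, b in zip(rater1, rater2) if a == b) / n
    pe = sum((rater1.count(l) / n) * (rater2.count(l) / n) for l in labels)
    return (po - pe) / (1 - pe)

# Toy example: two readers scoring pleural thickening as nodular vs smooth
r1 = ["nodular", "nodular", "smooth", "nodular", "smooth", "nodular"]
r2 = ["nodular", "nodular", "smooth", "smooth", "smooth", "nodular"]
kappa = cohens_kappa(r1, r2)   # 5/6 observed agreement, 0.5 expected by chance
```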

  5. Malignant pleural mesothelioma: Computed tomography and correlation with histology

    International Nuclear Information System (INIS)

    Seely, Jean M.; Nguyen, Elsie T.; Churg, Andrew M.; Mueller, Nestor L.

    2009-01-01

    Objective: To review the computed tomography (CT) imaging findings of pleural mesothelioma at presentation and to correlate the CT with the histological subtype. Materials and methods: Pathology reports from 1997 to 2006 were reviewed at two academic institutions to identify patients with proven pleural mesothelioma. Diagnosis was based on histologic findings in specimens obtained by transthoracic needle biopsy, surgical biopsy or resection. All histology slides were reviewed by a lung pathologist. CT scans, available in 92 patients, were reviewed blindly and in random order by two independent radiologists. Kappa analysis was completed to assess inter-observer agreement. Eighty patients in whom there was no significant delay between CT imaging and histological diagnosis were assessed by logistic regression analysis to correlate CT and histologic findings. Results: Seventy-two of the 92 mesotheliomas were epithelial, 15 sarcomatous, and 5 of mixed histology. All patients (77 male, 15 female, mean age 68 years) had pleural thickening on CT; the thickening was nodular in 79 patients (86%) and mediastinal in 87 (95%). Ipsilateral volume loss was seen in 42 patients (46%). Pleural effusions were present in 80 patients (87%), being large (>2/3 hemithorax) in 19 patients (21%). Atypical features at presentation included bilateral disease in three patients (3%), and spontaneous pneumothoraces in nine patients (10%). Internal mammary lymphadenopathy was observed in 48 patients (52%) and cardiophrenic lymphadenopathy in 42 (46%). Inter-observer agreement was excellent (average kappa = 0.89). Ipsilateral volume loss was associated with sarcomatous or mixed mesothelioma (p = 0.004). Using logistic regression analysis, other CT findings did not correlate with histological subtype. Conclusions: Ipsilateral volume loss is most frequently associated with sarcomatous or mixed mesothelioma. The remaining imaging findings are not helpful in predicting the histological subtype of

  6. An energy efficient and high speed architecture for convolution computing based on binary resistive random access memory

    Science.gov (United States)

    Liu, Chen; Han, Runze; Zhou, Zheng; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng

    2018-04-01

    In this work we present a novel convolution computing architecture based on metal oxide resistive random access memory (RRAM) to process the image data stored in the RRAM arrays. The proposed image storage architecture offers better speed and device-consumption efficiency than the previous kernel storage architecture. Further, we improve the architecture for high-accuracy and low-power computing by utilizing binary storage and a series resistor. For a 28 × 28 image and 10 kernels with a size of 3 × 3, compared with the previous kernel storage approach, the newly proposed architecture shows excellent performance, including: 1) almost 100% accuracy within 20% LRS variation and 90% HRS variation; 2) more than 67 times speed boost; 3) 71.4% energy saving.
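    Stripped of the RRAM analog readout, the convolution itself is an inner product between each image patch and the kernel; in the proposed scheme the binary image bits live in LRS/HRS cells and the sum arrives as a current. A plain-arithmetic sketch (pure Python; the 5 × 5 image and edge kernel are invented, not the paper's 28 × 28 benchmark):

```python
def conv2d_valid(img, kernel):
    # 2-D "valid" convolution (correlation form) of a binary image with a kernel
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[y + i][x + j] * kernel[i][j]
                 for i in range(kh) for j in range(kw))
             for x in range(ow)] for y in range(oh)]

# Each binary pixel would be one LRS/HRS cell; here we only emulate the arithmetic
img = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
edge = [[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]]
out = conv2d_valid(img, edge)   # responds at the square's border, zero inside
```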

  7. The Cognitive Correlates of Third-Grade Skill in Arithmetic, Algorithmic Computation, and Arithmetic Word Problems

    Science.gov (United States)

    Fuchs, Lynn S.; Fuchs, Douglas; Compton, Donald L.; Powell, Sarah R.; Seethaler, Pamela M.; Capizzi, Andrea M.; Schatschneider, Christopher; Fletcher, Jack M.

    2006-01-01

    The purpose of this study was to examine the cognitive correlates of third-grade skill in arithmetic, algorithmic computation, and arithmetic word problems. Third graders (N = 312) were measured on language, nonverbal problem solving, concept formation, processing speed, long-term memory, working memory, phonological decoding, and sight word…

  8. Heat transfer, velocity-temperature correlation, and turbulent shear stress from Navier-Stokes computations of shock wave/turbulent boundary layer interaction flows

    Science.gov (United States)

    Wang, C. R.; Hingst, W. R.; Porro, A. R.

    1991-01-01

    The properties of 2-D shock wave/turbulent boundary layer interaction flows were calculated by using a compressible turbulent Navier-Stokes numerical computational code. Interaction flows caused by oblique shock wave impingement on the turbulent boundary layer flow were considered. The oblique shock waves were induced with shock generators at angles of attack less than 10 degs in supersonic flows. The surface temperatures were kept at near-adiabatic (ratio of wall static temperature to free stream total temperature) and cold wall (ratio of wall static temperature to free stream total temperature) conditions. The computational results were studied for the surface heat transfer, velocity temperature correlation, and turbulent shear stress in the interaction flow fields. Comparisons of the computational results with existing measurements indicated that (1) the surface heat transfer rates and surface pressures could be correlated with Holden's relationship, (2) the mean flow streamwise velocity components and static temperatures could be correlated with Crocco's relationship if flow separation did not occur, and (3) the Baldwin-Lomax turbulence model should be modified for turbulent shear stress computations in the interaction flows.
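    Crocco's relationship referred to in (2) expresses the mean static temperature as a quadratic in the streamwise velocity ratio. A sketch of the standard Crocco-Busemann form (pure Python; the recovery factor r = 0.89 and the Mach number and wall temperature below are illustrative values, not the paper's cases):

```python
def crocco_temperature(u_over_ue, Tw_over_Te, Me, r=0.89, gamma=1.4):
    # Crocco-Busemann: T/Te = Tw/Te + (Taw/Te - Tw/Te)*(u/ue) - r*(g-1)/2*Me^2*(u/ue)^2
    Taw_over_Te = 1.0 + r * (gamma - 1.0) / 2.0 * Me ** 2
    u = u_over_ue
    return (Tw_over_Te
            + (Taw_over_Te - Tw_over_Te) * u
            - r * (gamma - 1.0) / 2.0 * Me ** 2 * u ** 2)

# Cooled wall at Mach 3 (Tw/Te = 2.0 < Taw/Te ~ 2.6): T/Te returns to 1 at the edge
profile = [crocco_temperature(u / 10.0, 2.0, 3.0) for u in range(11)]
```

    At u/ue = 0 the relation recovers the wall temperature and at u/ue = 1 the free-stream static temperature, which is the consistency check the paper applies to its unseparated profiles.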

  9. Nephrocalcinosis in rabbits - correlation of ultrasound, computed tomography, pathology and renal function

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, B.; Pushpanathan, C. [Janeway Child Health Centre, St. John's (Canada). Radiology Dept.; Husa, L. [Memorial Univ. of Newfoundland, St. John's (Canada)

    1998-01-01

    Objective. The purpose of this study was to induce nephrocalcinosis (NC) in rabbits with phosphate, vitamin D, oxalate and furosemide, to determine the effect on renal function and to correlate detection by ultrasound (US) and computed tomography (CT) with pathology. Materials and methods. Seventy-five immature New Zealand white rabbits were divided into five groups of 15. In each group, 5 animals were controls and 10 were given oral phosphate, furosemide, vitamin D or oxalate. Unilateral nephrectomy was performed at 3-6 weeks, and 5 rabbits of each test group were withdrawn from the substance. Weekly US was performed, as well as US, CT and measurement of serum creatinine at the time of nephrectomy and prior to planned demise. Results. A total of 140 kidneys in 75 rabbits had both pathological and US correlation, with CT correlation in 126. Forty rabbits developed nephrocalcinosis, with early (post nephrectomy at 3-6 weeks) or late (post demise at 10-20 weeks) pathological correlation obtained in 53 kidneys. Forty-one of these kidneys were from test animals: 23 developed NC early, 18 late. Twelve controls developed NC; 4 early, 8 late. Comparing US and CT to pathology, the sensitivity was 96% for US, 64% for CT. Specificity was 85% for US and 96% for CT. In 109 kidneys, information on serum creatinine level was available to correlate with pathology. The mean creatinine level was 138 mmol/l for those with NC and 118 mmol/l for those without NC (P<0.001).

  10. Nephrocalcinosis in rabbits - correlation of ultrasound, computed tomography, pathology and renal function

    International Nuclear Information System (INIS)

    Cramer, B.; Pushpanathan, C.

    1998-01-01

    Objective. The purpose of this study was to induce nephrocalcinosis (NC) in rabbits with phosphate, vitamin D, oxalate and furosemide, to determine the effect on renal function and to correlate detection by ultrasound (US) and computed tomography (CT) with pathology. Materials and methods. Seventy-five immature New Zealand white rabbits were divided into five groups of 15. In each group, 5 animals were controls and 10 were given oral phosphate, furosemide, vitamin D or oxalate. Unilateral nephrectomy was performed at 3-6 weeks, and 5 rabbits of each test group were withdrawn from the substance. Weekly US was performed, as well as US, CT and measurement of serum creatinine at the time of nephrectomy and prior to planned demise. Results. A total of 140 kidneys in 75 rabbits had both pathological and US correlation, with CT correlation in 126. Forty rabbits developed nephrocalcinosis, with early (post nephrectomy at 3-6 weeks) or late (post demise at 10-20 weeks) pathological correlation obtained in 53 kidneys. Forty-one of these kidneys were from test animals: 23 developed NC early, 18 late. Twelve controls developed NC; 4 early, 8 late. Comparing US and CT to pathology, the sensitivity was 96% for US, 64% for CT. Specificity was 85% for US and 96% for CT. In 109 kidneys, information on serum creatinine level was available to correlate with pathology. The mean creatinine level was 138 mmol/l for those with NC and 118 mmol/l for those without NC (P<0.001)

  11. Guidelines for Outsourcing Remote Access.

    Science.gov (United States)

    Hassler, Ardoth; Neuman, Michael

    1996-01-01

    Discusses the advantages and disadvantages of outsourcing remote access to campus computer networks and the Internet, focusing on improved service, cost-sharing, partnerships with vendors, supported protocols, bandwidth, scope of access, implementation, support, network security, and pricing. Includes a checklist for a request for proposals on…

  12. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design

    Science.gov (United States)

    Bennett, R. A.; Lamb, D. A.

    2017-12-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the non-disabled, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp for all students, with a focus on modern geophysical observation systems, computational thinking, data science, and professional development. In this presentation, we will review common pedagogical approaches in geosciences and current efforts to make the field more inclusive. We will review curricular access and inclusivity relative to a wide range of learners and provide examples of accessible course design based on our experiences in teaching a study abroad course in central Italy, and our plans for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  13. The Education Value of Cloud Computing

    Science.gov (United States)

    Katzan, Harry, Jr.

    2010-01-01

    Cloud computing is a technique for supplying computer facilities and providing access to software via the Internet. Cloud computing represents a contextual shift in how computers are provisioned and accessed. One of the defining characteristics of cloud software service is the transfer of control from the client domain to the service provider.…

  14. Morphological measurements in computed tomography correlate with airflow obstruction in chronic obstructive pulmonary disease: systematic review and meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Xie, XueQian; Oudkerk, Matthijs; Vliegenthart, Rozemarijn [University of Groningen, University Medical Center Groningen, Center for Medical Imaging-North East Netherlands (CMI-NEN), Department of Radiology, Hanzeplein 1, P.O. Box 30.001, RB, Groningen (Netherlands); Jong, Pim A. de [University Medical Center Utrecht, University of Utrecht, Department of Radiology, Heidelberglaan 100, P.O. Box 85.500, CX, Utrecht (Netherlands); Wang, Ying [Tianjin Medical University General Hospital, Department of Radiology, Tianjin (China); Hacken, Nick H.T. ten [University of Groningen, University Medical Center Groningen, Department of Pulmonary Diseases, Hanzeplein 1, P.O. Box 30.001, RB, Groningen (Netherlands); Miao, Jingtao; Zhang, GuiXiang [Shanghai Jiao Tong University Affiliated First People's Hospital, Department of Radiology, Shanghai (China); Bock, Geertruida H. de [University of Groningen, University Medical Center Groningen, Department of Epidemiology, Hanzeplein 1, P.O. Box 30.001, RB, Groningen (Netherlands)

    2012-10-15

    To determine the correlation between CT measurements of emphysema or peripheral airways and airflow obstruction in chronic obstructive pulmonary disease (COPD). PubMed, Embase and Web of Knowledge were searched from 1976 to 2011. Two reviewers independently screened 1,763 citations to identify articles that correlated CT measurements to airflow obstruction parameters of the pulmonary function test in COPD patients, rated study quality and extracted information. Three CT measurements were assessed: lung attenuation area percentage < -950 Hounsfield units, mean lung density and airway wall area percentage. Two airflow obstruction parameters were assessed: forced expiratory volume in the first second as percentage from predicted (FEV{sub 1} %pred) and FEV{sub 1} divided by the forced vital capacity. Seventy-nine articles (9,559 participants) were included in the systematic review, demonstrating different methodologies, measurements and CT-airflow obstruction correlations. There were 15 high-quality articles (2,095 participants) in the meta-analysis. The absolute pooled correlation coefficients ranged from 0.48 (95 % CI, 0.40 to 0.54) to 0.65 (0.58 to 0.71) for inspiratory CT and 0.64 (0.53 to 0.72) to 0.73 (0.63 to 0.80) for expiratory CT. CT measurements of emphysema or peripheral airways are significantly related to airflow obstruction in COPD patients. CT provides a morphological method to investigate airway obstruction in COPD. (orig.)
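    Pooled correlation coefficients like those quoted above are conventionally obtained by averaging Fisher-z-transformed study correlations weighted by n − 3 and transforming back. A sketch of that fixed-effect pooling (pure Python; the r values echo the abstract but the per-study sample sizes are invented):

```python
import math

def pooled_correlation(rs, ns):
    # Fisher z transform each r, weight by n - 3 (inverse variance of z),
    # average, then back-transform with tanh
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]
    ws = [n - 3 for n in ns]
    z = sum(w * zi for w, zi in zip(ws, zs)) / sum(ws)
    return math.tanh(z)

r_pooled = pooled_correlation([0.48, 0.65, 0.73], [120, 200, 90])
```

    The pooled value necessarily lies between the smallest and largest study correlations, as in the ranges reported.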

  15. Visual computed tomographic scoring of emphysema and its correlation with its diagnostic electrocardiographic sign: the frontal P vector.

    Science.gov (United States)

    Chhabra, Lovely; Sareen, Pooja; Gandagule, Amit; Spodick, David H

    2012-03-01

    Verticalization of the frontal P vector in patients older than 45 years is virtually diagnostic of pulmonary emphysema (sensitivity, 96%; specificity, 87%). We investigated the correlation of the P vector and the computed tomographic visual score of emphysema (VSE) in patients with an established diagnosis of chronic obstructive pulmonary disease/emphysema. High-resolution computed tomographic scans of 26 patients with emphysema (age >45 years) were reviewed to assess the type and extent of emphysema using subjective visual scoring. Electrocardiograms were independently reviewed to determine the frontal P vector. The P vector and VSE were compared for statistical correlation. Both the P vector and VSE were also directly compared with the forced expiratory volume in 1 second. The VSE and the orientation of the P vector (ÂP) had an overall significant positive correlation (r = +0.68; P = .0001) in all patients, but the correlation was very strong in patients with predominant lower-lobe emphysema (r = +0.88; P = .0004). Forced expiratory volume in 1 second and ÂP had an almost linear inverse correlation in predominant lower-lobe emphysema (r = -0.92; P < …). A vertical ÂP and predominant lower-lobe emphysema reflect severe obstructive lung dysfunction. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Non-local correlations within dynamical mean field theory

    Energy Technology Data Exchange (ETDEWEB)

    Li, Gang

    2009-03-15

    The contributions from the non-local fluctuations to the dynamical mean field theory (DMFT) were studied using the recently proposed dual fermion approach. Straightforward cluster extensions of DMFT need the solution of a small cluster, where all the short-range correlations are fully taken into account. All the correlations beyond the cluster scope are treated at the mean-field level. In the dual fermion method, only a single impurity problem needs to be solved. Both the short- and long-range correlations can be considered on an equal footing in this method. The weak-coupling nature of the dual fermion ensures the validity of the finite-order diagram expansion. The one- and two-particle Green's functions calculated from the dual fermion approach agree well with the Quantum Monte Carlo solutions, and the computation time is considerably less than with the latter method. Access to the long-range order allows us to investigate the collective behavior of the electron system, e.g. spin wave excitations. (orig.)

  17. Non-local correlations within dynamical mean field theory

    International Nuclear Information System (INIS)

    Li, Gang

    2009-03-01

    The contributions of non-local fluctuations to dynamical mean field theory (DMFT) were studied using the recently proposed dual fermion approach. Straightforward cluster extensions of DMFT require the solution of a small cluster, in which all short-range correlations are fully taken into account, while all correlations beyond the cluster scope are treated at the mean-field level. In the dual fermion method, only a single impurity problem needs to be solved, and both short- and long-range correlations can be considered on an equal footing. The weak-coupling nature of the dual fermions ensures the validity of a finite-order diagram expansion. The one- and two-particle Green's functions calculated from the dual fermion approach agree well with Quantum Monte Carlo solutions, and the computation time is considerably less than with the latter method. Access to long-range order allows us to investigate the collective behavior of the electron system, e.g. spin wave excitations. (orig.)

  18. A physical implementation of the Turing machine accessed through Web

    Directory of Open Access Journals (Sweden)

    Marijo Maracic

    2008-11-01

    Full Text Available A Turing machine has an important role in education in the field of computer science, as it is a milestone in courses related to automata theory, theory of computation and computer architecture. Its value is also recognized in the Computing Curricula proposed by the Association for Computing Machinery (ACM) and the IEEE Computer Society. In this paper we present a physical implementation of the Turing machine accessed through the Web. To enable remote access to the Turing machine, an implementation of the client-server architecture is built. The web interface is described in detail, and illustrations of remote programming, initialization and computation of the Turing machine are given. The advantages of such an approach and the expected benefits of using a remotely accessible physical implementation of the Turing machine as an educational tool in the teaching process are discussed.
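    The control logic such a machine executes can be sketched in a few lines. Below is a minimal single-tape Turing machine interpreter; it is an illustrative sketch only, and the transition-table encoding is hypothetical, not the authors' physical implementation:

    ```python
    # Minimal single-tape Turing machine interpreter (illustrative sketch).
    # A program maps (state, symbol) -> (symbol to write, head move, next state).
    def run(program, tape, state="q0", max_steps=10_000):
        cells = dict(enumerate(tape))   # sparse tape; the blank symbol is "_"
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, "_")
            write, move, state = program[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells)).strip("_")

    # Hypothetical example program: unary increment (append one more "1").
    INC = {
        ("q0", "1"): ("1", "R", "q0"),     # scan right over the 1s
        ("q0", "_"): ("1", "R", "halt"),   # write a 1 on the first blank, halt
    }
    ```

    A remote-access layer like the one described in the paper essentially lets students upload such a transition table and watch the physical machine step through it.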

  19. Remote access to mathematical software

    International Nuclear Information System (INIS)

    Dolan, E.; Hovland, P.; More, J.; Norris, B.; Smith, B.

    2001-01-01

    The network-oriented application services paradigm is becoming increasingly common for scientific computing. The popularity of this approach can be attributed to the numerous advantages to both user and developer provided by network-enabled mathematical software. The burden of installing and maintaining complex systems is lifted from the user, while developers can provide frequent updates without disrupting service. Access to software with similar functionality can be unified under the same interface. Remote servers can utilize potentially more powerful computing resources than may be available locally. We discuss some of the application services developed by the Mathematics and Computer Science Division at Argonne National Laboratory, including the Network Enabled Optimization System (NEOS) Server and the Automatic Differentiation of C (ADIC) Server, as well as preliminary work on Web access to the Portable Extensible Toolkit for Scientific Computing (PETSc). We also provide a brief survey of related work.

  20. When can Empirical Green Functions be computed from Noise Cross-Correlations? Hints from different Geographical and Tectonic environments

    Science.gov (United States)

    Matos, Catarina; Silveira, Graça; Custódio, Susana; Domingues, Ana; Dias, Nuno; Fonseca, João F. B.; Matias, Luís; Krueger, Frank; Carrilho, Fernando

    2014-05-01

    Noise cross-correlations are now widely used to extract Green functions between station pairs. But do all the routinely computed cross-correlations produce successful Green functions? What is the relationship between the noise recorded at a pair of stations and the cross-correlation between them? During the last decade, we have been involved in the deployment of several temporary dense broadband (BB) networks within the scope of both national projects and international collaborations. From 2000 to 2002, a pool of 8 BB stations operated continuously in the Azores in the scope of the Memorandum of Understanding COSEA (COordinated Seismic Experiment in the Azores). Thanks to the project WILAS (West Iberia Lithosphere and Astenosphere Structure, PTDC/CTE-GIX/097946/2008) we temporarily increased the number of BB stations deployed in mainland Portugal to more than 50 (permanent + temporary) during the period 2010 - 2012. In 2011/12 a temporary pool of 12 seismometers continuously recorded BB data in the Madeira archipelago, as part of the DOCTAR (Deep Ocean Test Array Experiment) project. Project CV-PLUME (Investigation on the geometry and deep signature of the Cape Verde mantle plume, PTDC/CTE-GIN/64330/2006) covered the archipelago of Cape Verde, North Atlantic, with 40 temporary BB stations in 2007/08. Project MOZART (Mozambique African Rift Tomography, PTDC/CTE-GIX/103249/2008) covered Mozambique, East Africa, with 30 temporary BB stations in the period 2011 - 2013. These networks, located in very distinct geographical and tectonic environments, offer an interesting opportunity to study seasonal and spatial variations of noise sources and their impact on empirical Green functions computed from noise cross-correlation. Seismic noise recorded at the different seismic stations is evaluated by computing probability density functions of the power spectral density (PSD) of continuous data. 
To assess seasonal variations of ambient noise sources in frequency content, time-series of
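    The core operation these projects rely on, recovering an inter-station travel time from stacked noise cross-correlations, can be illustrated with a toy 1-D numerical experiment; the synthetic data and all parameters below are arbitrary and unrelated to the networks above:

    ```python
    import numpy as np

    # Toy illustration: two "stations" record the same random noise source,
    # one delayed by `lag` samples; stacking cross-correlations over many
    # windows makes the travel-time peak emerge at that lag.
    rng = np.random.default_rng(1)
    n, lag, windows = 1024, 25, 30
    stack = np.zeros(2 * n - 1)
    for _ in range(windows):
        src = rng.standard_normal(n + lag)                 # common noise wavefield
        sta_a = src[:n] + 0.3 * rng.standard_normal(n)     # plus local noise
        sta_b = src[lag:lag + n] + 0.3 * rng.standard_normal(n)
        stack += np.correlate(sta_a, sta_b, mode="full")
    # np.correlate output index i corresponds to a shift of i - (n - 1) samples
    estimated_lag = int(np.argmax(stack)) - (n - 1)
    ```

    The peak of the stacked correlation sits at the 25-sample delay, i.e. at the arrival time of the empirical Green function; in practice, months of data, spectral whitening and seasonal source variations complicate this picture, which is the subject of the study above.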

  1. Computation of antenna pattern correlation and MIMO performance by means of surface current distribution and spherical wave theory

    Directory of Open Access Journals (Sweden)

    O. Klemp

    2006-01-01

    Full Text Available In order to satisfy the stringent demand for an accurate prediction of MIMO channel capacity and diversity performance in wireless communications, more effective and suitable models that account for real antenna radiation behavior have to be taken into account. One of the main challenges is the accurate modeling of antenna correlation, which is directly related to the amount of channel capacity or diversity gain that can be achieved in multi-element antenna configurations. Spherical wave theory in electromagnetics is a well-known technique for expressing antenna far fields by means of a compact field expansion with a reduced number of unknowns, and it was recently applied to derive an analytical approach to the computation of antenna pattern correlation. In this paper we present a novel and efficient computational technique to determine antenna pattern correlation based on the evaluation of the surface current distribution by means of a spherical mode expansion.
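    For scalar far-field patterns, the pattern (envelope) correlation that the paper derives from surface currents can be approximated by a direct overlap integral on an angular grid. The sketch below uses two hypothetical dipole-like patterns, not the paper's antennas, and a simple Riemann sum:

    ```python
    import numpy as np

    # Envelope correlation coefficient (ECC) from two scalar far-field patterns:
    # ecc = |∮ F1 F2* dΩ|² / (∮ |F1|² dΩ · ∮ |F2|² dΩ)
    theta = np.linspace(0.0, np.pi, 181)
    phi = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
    TH, PH = np.meshgrid(theta, phi, indexing="ij")
    dOmega = np.sin(TH) * (theta[1] - theta[0]) * (phi[1] - phi[0])

    F1 = np.sin(TH)                                    # dipole-like pattern
    F2 = np.sin(TH) * np.exp(1j * np.pi * np.cos(TH))  # phase-steered variant

    num = np.abs(np.sum(F1 * np.conj(F2) * dOmega)) ** 2
    den = np.sum(np.abs(F1) ** 2 * dOmega) * np.sum(np.abs(F2) ** 2 * dOmega)
    ecc = num / den
    ```

    A low ECC, here well under the common 0.5 design threshold, indicates good diversity; the spherical-mode expansion in the paper reaches the same quantity with far fewer unknowns than a dense angular grid.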

  2. Ground-glass opacity in diffuse lung diseases: high-resolution computed tomography-pathology correlation

    International Nuclear Information System (INIS)

    Santos, Maria Lucia de Oliveira; Vianna, Alberto Domingues; Marchiori, Edson; Souza Junior, Arthur Soares; Moraes, Heleno Pinto de

    2003-01-01

    Ground-glass opacity is a finding frequently seen in high-resolution computed tomography examinations of the chest and is characterized by hazy increased attenuation of the lung without blurring of bronchial and vascular margins. Because of its lack of specificity, association with other radiological, clinical and pathological findings must be considered for an accurate diagnostic interpretation. In this paper we reviewed 62 computed tomography examinations of patients with diffuse pulmonary diseases of 14 different etiologies in which ground-glass opacity was the only or the most remarkable finding, and correlated these findings with the pathology abnormalities seen on specimens obtained from biopsies or necropsies. In pneumocystosis, ground-glass opacities correlated histologically with alveolar occupation by a foaming material containing parasites; in bronchioloalveolar cell carcinoma, with thickening of the alveolar septa and occupation of the lumen by mucus and tumoral cells; in paracoccidioidomycosis, with thickening of the alveolar septa, areas of fibrosis and alveolar bronchopneumonia exudate; in sarcoidosis, with fibrosis or clustering of granulomas; and in idiopathic pulmonary fibrosis, with alveolar septa thickening due to fibrosis. Alveolar occupation by blood was found in cases of leptospirosis, idiopathic hemosiderosis, metastatic kidney tumor and invasive aspergillosis, whereas oily vacuoles were seen in lipoid pneumonia, proteinaceous and lipoproteinaceous material in silicoproteinosis and pulmonary alveolar proteinosis, and edematous fluid in cardiac failure. (author)

  3. Authentication and Access: Accommodating Public Users in an Academic World

    Directory of Open Access Journals (Sweden)

    Lynne Weber

    2010-09-01

    Full Text Available In the fall of 2004, the Academic Computing Center, a division of the Information Technology Services Department (ITS) at Minnesota State University, Mankato, took over responsibility for the computers in the public areas of Memorial Library. For the first time, affiliated Memorial Library users were required to authenticate using a campus username and password, a change that effectively eliminated computer access for anyone not part of the university community. This posed a dilemma for the librarians. Because of its Federal Depository status, the library had a responsibility to provide general access to both print and online government publications for the general public. Furthermore, the library had a long tradition of providing guest access to most library resources, and there was reluctance to abandon the practice. Therefore, the librarians worked with ITS to retain a small group of six computers that did not require authentication and were clearly marked for community use, along with several stand-up, open-access computers on each floor used primarily for searching the library catalog. The additional need to provide computer access to high school students visiting the library for research and instruction led to more discussions with ITS and resulted in a means of generating temporary usernames and passwords through a Web form. These user accommodations were implemented in the library without creating a written policy governing the use of open-access computers.

  4. M2M massive wireless access

    DEFF Research Database (Denmark)

    Zanella, Andrea; Zorzi, Michele; Santos, André F.

    2013-01-01

    In order to make the Internet of Things a reality, ubiquitous coverage and low-complexity connectivity are required. Cellular networks are hence the most straightforward and realistic solution to enable a massive deployment of always connected Machines around the globe. Nevertheless, a paradigm ... shift in the conception and design of future cellular networks is called for. Massive access attempts, low-complexity and cheap machines, sporadic transmission and correlated signals are among the main properties of this new reality, whose main consequence is the disruption of the development ... Access Reservation, Coded Random Access and the exploitation of multiuser detection in random access. Additionally, we will show how the properties of machine originated signals, such as sparsity and spatial/time correlation can be exploited. The end goal of this paper is to provide motivation ...
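    A feel for why massive uncoordinated access is problematic comes from the classic slotted-ALOHA model that coded random access schemes improve upon. The following toy simulation, with arbitrary parameters, reproduces the well-known single-transmitter success probability:

    ```python
    import numpy as np

    # Slotted ALOHA with N machines, each transmitting in a slot with
    # probability p; a slot succeeds only when exactly one machine transmits.
    rng = np.random.default_rng(42)
    N, p, slots = 100, 0.01, 100_000
    tx = rng.random((slots, N)) < p            # who transmits in each slot
    success_rate = (tx.sum(axis=1) == 1).mean()
    theory = N * p * (1 - p) ** (N - 1)        # ≈ 1/e at the optimum N*p = 1
    ```

    At N·p = 1 the throughput saturates near 1/e ≈ 0.37 successful slots per slot, which is exactly the ceiling that access reservation, coded random access and multiuser detection, discussed in the paper, aim to break.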

  5. Efficient Unsteady Flow Visualization with High-Order Access Dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiang; Guo, Hanqi; Yuan, Xiaoru

    2016-04-19

    We present a novel model based on high-order access dependencies for efficient pathline computation in unsteady flow visualization. By taking longer access sequences into account to model more sophisticated data access patterns in particle tracing, our method greatly improves the accuracy and reliability of data access prediction. In our work, high-order access dependencies are calculated by tracing uniformly seeded pathlines in both forward and backward directions in a preprocessing stage. The effectiveness of our proposed approach is demonstrated through a parallel particle tracing framework with high-order data prefetching. Results show that our method achieves higher data locality and hence improves the efficiency of pathline computation.
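    The idea of high-order access dependencies can be sketched as an order-k transition table over data-block IDs: a history of the last k accessed blocks maps to the block most likely to be needed next, which a prefetcher then fetches. A minimal sketch with hypothetical block IDs, not the authors' code:

    ```python
    from collections import Counter, defaultdict

    def build_dependencies(traces, order=2):
        # High-order access dependencies: map each length-`order` history of
        # block accesses to a histogram of the block accessed next.
        table = defaultdict(Counter)
        for trace in traces:
            for i in range(len(trace) - order):
                history = tuple(trace[i:i + order])
                table[history][trace[i + order]] += 1
        return table

    def predict_next(table, history):
        # Prefetch candidate: the block that most often followed this history.
        counts = table.get(tuple(history))
        return counts.most_common(1)[0][0] if counts else None
    ```

    With order = 1 this degrades to pairwise access dependencies; longer histories disambiguate pathlines that traverse the same block from different directions.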

  6. Fast computation of molecular random phase approximation correlation energies using resolution of the identity and imaginary frequency integration

    Science.gov (United States)

    Eshuis, Henk; Yarkony, Julian; Furche, Filipp

    2010-06-01

    The random phase approximation (RPA) is an increasingly popular post-Kohn-Sham correlation method, but its high computational cost has limited molecular applications to systems with few atoms. Here we present an efficient implementation of RPA correlation energies based on a combination of resolution of the identity (RI) and imaginary frequency integration techniques. We show that the RI approximation to four-index electron repulsion integrals leads to a variational upper bound to the exact RPA correlation energy if the Coulomb metric is used. Auxiliary basis sets optimized for second-order Møller-Plesset (MP2) calculations are well suited for RPA, as is demonstrated for the HEAT [A. Tajti et al., J. Chem. Phys. 121, 11599 (2004)] and MOLEKEL [F. Weigend et al., Chem. Phys. Lett. 294, 143 (1998)] benchmark sets. Using imaginary frequency integration rather than diagonalization to compute the matrix square root necessary for RPA, evaluation of the RPA correlation energy requires O(N^4 log N) operations and O(N^3) storage only; the price for this dramatic improvement over existing algorithms is a numerical quadrature. We propose a numerical integration scheme that is exact in the two-orbital case and converges exponentially with the number of grid points. For most systems, 30-40 grid points yield μH accuracy in triple zeta basis sets, but much larger grids are necessary for small-gap systems. The lowest-order approximation to the present method is a post-Kohn-Sham frequency-domain version of opposite-spin Laplace-transform RI-MP2 [J. Jung et al., Phys. Rev. B 70, 205107 (2004)]. Timings for polyacenes with up to 30 atoms show speed-ups of two orders of magnitude over previous implementations. The present approach makes it possible to routinely compute RPA correlation energies of systems well beyond 100 atoms, as is demonstrated for the octapeptide angiotensin II.
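    The structure of the imaginary-frequency integration can be seen in the two-orbital (single-excitation) case that the authors use to calibrate their quadrature, where the RPA correlation energy has a closed plasmon form. The sketch below is a toy model with one excitation energy ω₀ and coupling K, not the paper's production code; it checks a Gauss-Legendre quadrature on a rational grid against the exact result:

    ```python
    import numpy as np

    def rpa_ec_quadrature(omega0, K, npts=40):
        # ACFDT-RPA correlation energy for a single excitation, evaluated as an
        # imaginary-frequency integral: (1/2π) ∫₀^∞ [ln(1 - χ₀v) + χ₀v] dω,
        # with -χ₀(iω)v = 2ω₀K / (ω² + ω₀²) for this toy model.
        x, w = np.polynomial.legendre.leggauss(npts)
        omega = omega0 * (1 + x) / (1 - x)        # map (-1, 1) -> (0, ∞)
        jac = 2 * omega0 / (1 - x) ** 2
        chi_v = 2 * omega0 * K / (omega**2 + omega0**2)
        integrand = np.log(1 + chi_v) - chi_v
        return np.sum(w * jac * integrand) / (2 * np.pi)

    def rpa_ec_exact(omega0, K):
        # Closed-form plasmon expression for the same two-orbital model.
        return 0.5 * (np.sqrt(omega0**2 + 2 * omega0 * K) - omega0 - K)
    ```

    With 40 nodes the quadrature agrees with the closed form to high precision, mirroring the paper's observation that 30-40 grid points suffice away from small-gap systems.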

  7. Cloud Computing with iPlant Atmosphere.

    Science.gov (United States)

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  8. Testing a bedside personal computer Clinical Care Classification System for nursing students using Microsoft Access.

    Science.gov (United States)

    Feeg, Veronica D; Saba, Virginia K; Feeg, Alan N

    2008-01-01

    This study tested a personal computer-based version of the Sabacare Clinical Care Classification System on students' performance of charting patient care plans. The application was designed as an inexpensive alternative to teach electronic charting for use on any laptop or personal computer with Windows and Microsoft Access. The data-based system was tested in a randomized trial with the control group using a type-in text-based-only system also mounted on a laptop at the bedside in the laboratory. Student care plans were more complete using the data-based system over the type-in text version. Students were more positive but not necessarily more efficient with the data-based system. The results demonstrate that the application is effective for improving student nursing care charting using the nursing process and capturing patient care information with a language that is standardized and ready for integration with other patient electronic health record data. It can be implemented on a bedside stand in the clinical laboratory or used to aggregate care planning over a student's clinical experience.

  9. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database (Ethernet and wireless cards) as soon as possible, without waiting for the access restriction to take force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (security problems, viruses, etc.) • Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) • Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/registerVisitorComp...

  10. Decoupled Access-Execute on ARM big.LITTLE

    OpenAIRE

    Weber, Anton

    2016-01-01

    Decoupled Access-Execute (DAE) presents a novel approach to improve power efficiency with a combination of compile-time transformations and Dynamic Voltage Frequency Scaling (DVFS). DAE splits regions of the program into two distinct phases: a memory-bound access phase and a compute-bound execute phase. DVFS is used to run the phases at different frequencies, thus conserving energy while caching data from main memory and performing computations at maximum performance. This project analyses th...

  11. CMS computing upgrade and evolution

    CERN Document Server

    Hernandez Calama, Jose

    2013-01-01

    The distributed Grid computing infrastructure has been instrumental in the successful exploitation of the LHC data leading to the discovery of the Higgs boson. The computing system will need to face new challenges from 2015 on when LHC restarts with an anticipated higher detector output rate and event complexity, but with only a limited increase in the computing resources. A more efficient use of the available resources will be mandatory. CMS is improving the data storage, distribution and access as well as the processing efficiency. Remote access to the data through the WAN, dynamic data replication and deletion based on the data access patterns, and separation of disk and tape storage are some of the areas being actively developed. Multi-core processing and scheduling is being pursued in order to make a better use of the multi-core nodes available at the sites. In addition, CMS is exploring new computing techniques, such as Cloud Computing, to get access to opportunistic resources or as a means of using wit...

  12. A Novel Multilayer Correlation Maximization Model for Improving CCA-Based Frequency Recognition in SSVEP Brain-Computer Interface.

    Science.gov (United States)

    Jiao, Yong; Zhang, Yu; Wang, Yu; Wang, Bei; Jin, Jing; Wang, Xingyu

    2018-05-01

    Multiset canonical correlation analysis (MsetCCA) has been successfully applied to optimize the reference signals by extracting common features from multiple sets of electroencephalogram (EEG) data for steady-state visual evoked potential (SSVEP) recognition in brain-computer interface applications. To avoid extracting possible noise components as common features, this study proposes a sophisticated extension of MsetCCA, called the multilayer correlation maximization (MCM) model, for further improving SSVEP recognition accuracy. MCM combines advantages of both CCA and MsetCCA by carrying out three layers of correlation maximization. The first layer extracts the stimulus frequency-related information using CCA between EEG samples and sine-cosine reference signals. The second layer learns reference signals by extracting the common features with MsetCCA. The third layer re-optimizes the reference signal set using CCA with sine-cosine reference signals again. An experimental study is implemented to validate the effectiveness of the proposed MCM model in comparison with the standard CCA and MsetCCA algorithms. The superior performance of MCM demonstrates its promising potential for the development of an improved SSVEP-based brain-computer interface.
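    The first layer, standard CCA between EEG epochs and sine-cosine references, can be sketched with scikit-learn. The data below are synthetic: a hypothetical 3-channel epoch flickering at 10 Hz, not the study's EEG:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    def canonical_corr(X, Y):
        # Largest canonical correlation between EEG (samples x channels)
        # and a reference set (samples x 2*harmonics).
        u, v = CCA(n_components=1).fit_transform(X, Y)
        return np.corrcoef(u[:, 0], v[:, 0])[0, 1]

    def reference(f, t):
        # Sine-cosine references at the stimulus frequency and 2nd harmonic.
        return np.column_stack([np.sin(2 * np.pi * f * t), np.cos(2 * np.pi * f * t),
                                np.sin(4 * np.pi * f * t), np.cos(4 * np.pi * f * t)])

    fs = 250.0
    t = np.arange(0, 2.0, 1 / fs)
    rng = np.random.default_rng(0)
    # Synthetic 3-channel SSVEP epoch driven at 10 Hz, plus noise.
    X = np.stack([np.sin(2 * np.pi * 10.0 * t + ph) + 0.5 * rng.standard_normal(t.size)
                  for ph in (0.0, 0.5, 1.0)], axis=1)

    scores = {f: canonical_corr(X, reference(f, t)) for f in (8.0, 10.0, 12.0)}
    recognized = max(scores, key=scores.get)
    ```

    MsetCCA replaces the idealized sine-cosine references with common components learned across training epochs, and the MCM model chains both steps plus a final CCA re-optimization.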

  13. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    International Nuclear Information System (INIS)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo

    2016-01-01

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal tracer uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) assessed agreement. We used Student's t-test to calculate statistically significant differences of counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64 % of patients with increased tracer uptake (TU, both focal and irregular) and 36 % with NTU showed a positive response. The conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. The conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed a positive treatment response under conservative treatment.

  14. Correlation of uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) and treatment response in patients with knee pain

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Geon; Hwang, Kyung Hoon; Lee, Hae Jin; Kim, Seog Gyun; Lee, Beom Koo [Gachon University Gil Hospital, Incheon (Korea, Republic of)

    2016-06-15

    To determine whether treatment response in patients with knee pain could be predicted using uptake patterns on single-photon emission computed tomography/computed tomography (SPECT/CT) images. Ninety-five patients with knee pain who had undergone SPECT/CT were included in this retrospective study. Subjects were divided into three groups: increased focal tracer uptake (FTU), increased irregular tracer uptake (ITU), and no tracer uptake (NTU). A numeric rating scale (NRS-11) assessed pain intensity. We analyzed the association between uptake patterns and treatment response using Pearson's chi-square test and Fisher's exact test. Uptake was quantified from SPECT/CT with region of interest (ROI) counting, and an intraclass correlation coefficient (ICC) assessed agreement. We used Student's t-test to calculate statistically significant differences of counts between groups and the Pearson correlation to measure the relationship between counts and initial NRS-11. Multivariate logistic regression analysis determined which variables were significantly associated with uptake. The FTU group included 32 patients; ITU, 39; and NTU, 24. With conservative management, 64 % of patients with increased tracer uptake (TU, both focal and irregular) and 36 % with NTU showed a positive response. The conservative treatment response of FTU was better than that of NTU, but did not differ from that of ITU. The conservative treatment response of TU was significantly different from that of NTU (OR 3.1; p = 0.036). A moderate positive correlation was observed between ITU and initial NRS-11. Age and initial NRS-11 significantly predicted uptake. Patients with uptake in their knee(s) on SPECT/CT showed a positive treatment response under conservative treatment.

  15. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  16. Development traumatic brain injury computer user interface for disaster area in Indonesia supported by emergency broadband access network.

    Science.gov (United States)

    Sutiono, Agung Budi; Suwa, Hirohiko; Ohta, Toshizumi; Arifin, Muh Zafrullah; Kitamura, Yohei; Yoshida, Kazunari; Merdika, Daduk; Qiantori, Andri; Iskandar

    2012-12-01

    Disasters bring negative impacts on the environment and human life. One common cause of critical condition is traumatic brain injury (TBI), namely epidural (EDH) and subdural hematoma (SDH), due to falling hard objects during an earthquake. We proposed and analyzed the responses of users, namely neurosurgeons, general doctors/surgeons and nurses, when they interacted with a TBI computer interface. The communication system was supported by TBI web-based applications using an emergency broadband access network with a tethered balloon and was simulated in a field trial to evaluate the coverage area. The interface consisted of demography data and multiple tabs for anamnesis, treatment, follow-up and teleconference interfaces. The interface allows neurosurgeons, surgeons/general doctors and nurses to enter EDH and SDH patients' data while referring them in the emergency simulation, and was evaluated based on the time needed and their understanding. The average time needed, measured on a Lenovo T500 notebook using a mouse, was 8-10 min for neurosurgeons, 12-15 min for surgeons/general doctors and 15-19 min for nurses. Using a ThinkPad X201 Tablet, the time needed for data entry was 5-7 min for neurosurgeons, 7-10 min for surgeons/general doctors and 12-16 min for nurses. We observed that the time difference depended on the computer type and user literacy qualification as well as their understanding of traumatic brain injury, particularly for the nurses. In conclusion, there are five data classifications for a simple TBI GUI, namely: 1) demography, 2) specific anamnesis for EDH and SDH, 3) treatment action and medicine of TBI, 4) follow-up data display and 5) teleneurosurgery for streaming video consultation. The tablet PC was more convenient and faster for data entry than the notebook with a mouse/touchpad. 
An emergency broadband access network using a tethered balloon can be employed to cover the communications systems in

  17. BioSPICE: access to the most current computational tools for biologists.

    Science.gov (United States)

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard, together with a methodology for continued software integration. These contributed software modules are accessed through the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  18. Tri-party agreement databases, access mechanism and procedures. Revision 2

    International Nuclear Information System (INIS)

    Brulotte, P.J.

    1996-01-01

    This document contains the information required for the Washington State Department of Ecology (Ecology) and the U.S. Environmental Protection Agency (EPA) to access databases related to the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement). It identifies the procedure required to obtain access to the Hanford Site computer networks and the Tri-Party Agreement related databases. It addresses security requirements, access methods, database availability dates, database access procedures, and the minimum computer hardware and software configurations required to operate within the Hanford Site networks. This document supersedes any previous agreements, including the Administrative Agreement to Provide Computer Access to the U.S. Environmental Protection Agency (EPA) and the Administrative Agreement to Provide Computer Access to the Washington State Department of Ecology (Ecology), agreements that were signed by the U.S. Department of Energy (DOE), Richland Operations Office (RL) in June 1990. Access approval to EPA and Ecology is extended by RL to include all Tri-Party Agreement relevant databases named in this document via the documented access method and date. Access to databases and systems not listed in this document will be granted as determined necessary and negotiated among Ecology, EPA, and RL through the Tri-Party Agreement Project Managers. The Tri-Party Agreement Project Managers are the primary points of contact for all activities to be carried out under the Tri-Party Agreement Action Plan. Access to the Tri-Party Agreement related databases and systems does not provide or imply any ownership, whether public or private, of either the database or the system on behalf of Ecology or EPA. Access to identified systems and databases does not include access to network/system administrative control information, network maps, etc.

  19. Access Agent Improving The Performance Of Access Control Lists

    Directory of Open Access Journals (Sweden)

    Thelis R. S.

    2015-08-01

Full Text Available The main focus of the proposed research is maintaining the security of a network. An extranet is a popular network configuration in many organizations, where network access is provided to a selected group of outsiders. Limiting access to an extranet can be carried out using the Access Control Lists (ACLs) method. However, handling the workload of ACLs is an onerous task for the router. The purpose of the proposed research is to improve the performance and solidify the security of the ACLs used in a small organization. Using a high-performance computer as a dedicated device to share and handle the router's workload is suggested in order to increase the performance of the router when handling ACLs. Methods of detecting and directing sensitive data are also discussed in this paper. A framework is provided to help increase the efficiency of the ACLs in an organization's network using the above-mentioned procedures, thus making the ACLs more secure and the system faster. Built-in methods of the Windows platform, or software on open-source platforms, can be used to make a computer function as a router. Extended ACL features allow determining the type of packets flowing through the router. Combining these mechanisms allows the ACLs to be improved and to perform in a more efficient manner.
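The first-match-wins evaluation described for extended ACLs can be sketched as follows. This is a minimal illustration of the general mechanism, not the paper's framework; the rule fields and function names are assumptions for the example.

```python
# Extended-ACL-style check (illustrative sketch): each rule matches on source
# address, destination port, and protocol, and the FIRST matching rule decides
# whether a packet is permitted. An unmatched packet is implicitly denied,
# as on most routers.

def acl_decision(rules, packet):
    """Return 'permit' or 'deny' for a packet dict."""
    for rule in rules:
        if (rule["src"] in (packet["src"], "any")
                and rule["port"] in (packet["port"], "any")
                and rule["proto"] in (packet["proto"], "any")):
            return rule["action"]
    return "deny"  # implicit deny at the end of the list

rules = [
    {"src": "10.0.0.5", "port": 22, "proto": "tcp", "action": "deny"},
    {"src": "any",      "port": 80, "proto": "tcp", "action": "permit"},
]

print(acl_decision(rules, {"src": "10.0.0.9", "port": 80, "proto": "tcp"}))  # permit
print(acl_decision(rules, {"src": "10.0.0.5", "port": 22, "proto": "tcp"}))  # deny
```

Offloading this loop to a dedicated machine, as the paper proposes, does not change the matching semantics, only where the per-packet work is done.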

  20. Using Palm Technology in Participatory Simulations of Complex Systems: A New Take on Ubiquitous and Accessible Mobile Computing

    Science.gov (United States)

    Klopfer, Eric; Yoon, Susan; Perry, Judy

    2005-09-01

This paper reports on teachers' perceptions of the educational affordances of a handheld application called Participatory Simulations. It presents evidence from five cases representing each of the populations who work with these computational tools. Evidence across multiple data sources yields results similar to previous research evaluations of handheld activities with respect to enhancing motivation, engagement, and self-directed learning. Three additional themes are discussed that provide insight into the curricular applicability of Participatory Simulations and suggest a new take on ubiquitous and accessible mobile computing. These themes generally point to the multiple layers of social and cognitive flexibility intrinsic to their design: ease of adaptation to subject-matter content knowledge and curricular integration; facility in attending to teacher-individualized goals; and encouragement of the adoption of learner-centered strategies.

  1. Respiratory impedance is correlated with airway narrowing in asthma using three-dimensional computed tomography.

    Science.gov (United States)

    Karayama, M; Inui, N; Mori, K; Kono, M; Hozumi, H; Suzuki, Y; Furuhashi, K; Hashimoto, D; Enomoto, N; Fujisawa, T; Nakamura, Y; Watanabe, H; Suda, T

    2018-03-01

Respiratory impedance comprises the resistance and reactance of the respiratory system and can provide detailed information on respiratory function. However, details of the relationship between impedance and morphological airway changes in asthma are unknown. We aimed to evaluate the correlation between imaging-based airway changes and respiratory impedance in patients with asthma. Respiratory impedance and spirometric data were evaluated in 72 patients with asthma and 29 reference subjects. We measured the intraluminal area (Ai) and wall thickness (WT) of third- to sixth-generation bronchi using three-dimensional computed tomographic analyses, and values were adjusted for body surface area (BSA): Ai/BSA and WT/√BSA. Asthma patients had significantly increased respiratory impedance, decreased Ai/BSA, and increased WT/√BSA; this was also the case in those without airflow limitation as assessed by spirometry. Ai/BSA was inversely correlated with respiratory resistance at 5 Hz (R5) and 20 Hz (R20). R20 had a stronger correlation with Ai/BSA than did R5. Ai/BSA was positively correlated with the forced expiratory volume in 1 second/forced vital capacity ratio, the percentage predicted forced expiratory volume in 1 second, and the percentage predicted mid-expiratory flow. WT/√BSA had no significant correlation with spirometry or respiratory impedance. Respiratory resistance is associated with airway narrowing. © 2018 John Wiley & Sons Ltd.

  2. Real-time optical correlator using computer-generated holographic filter on a liquid crystal light valve

    Science.gov (United States)

    Chao, Tien-Hsin; Yu, Jeffrey

    1990-01-01

    Limitations associated with the binary phase-only filter often used in optical correlators are presently circumvented in the writing of complex-valued data on a gray-scale spatial light modulator through the use of a computer-generated hologram (CGH) algorithm. The CGH encodes complex-valued data into nonnegative real CGH data in such a way that it may be encoded in any of the available gray-scale spatial light modulators. A CdS liquid-crystal light valve is used for the complex-valued CGH encoding; computer simulations and experimental results are compared, and the use of such a CGH filter as the synapse hologram in a holographic optical neural net is discussed.

  3. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD.

    Science.gov (United States)

    Dixon, Lance J; Luo, Ming-Xing; Shtabovenko, Vladyslav; Yang, Tong-Zhi; Zhu, Hua Xing

    2018-03-09

    The energy-energy correlation (EEC) between two detectors in e^{+}e^{-} annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.

  4. Correlates of screen time among 8-19-year-old students in China.

    Science.gov (United States)

    Ye, Sunyue; Chen, Lijian; Wang, Qineng; Li, Qinggong

    2018-04-10

Previous studies have shown that prolonged time spent on screen-based sedentary behavior was significantly associated with lower health status in children, independent of physical activity levels. The study aimed to explore the individual and environmental correlates of screen time (ST) among 8-19-year-old students in China. The study surveyed ST using a self-administered questionnaire in Chinese students aged 8-19 years; 1063 participants were included in the final analysis. Individual and environmental correlates of ST were assessed using a mixed-effects model (for continuous outcome variables) and a multiple logistic regression model (for binary outcome variables). Prolonged ST was observed in 14.7% of boys and 8.9% of girls. Weekend ST and mobile phone/tablet-based ST represented 80% and 40% of total ST, respectively. A positive relationship was observed between media accessibility and ST in both boys and girls, whereas the presence of parents/others was a negative factor for longer ST. Using a mobile phone/tablet or a computer rather than viewing a TV, along with increased media accessibility, increased ST. These results indicate that greater media accessibility was positively associated, and the presence of parents/others negatively associated, with prolonged ST in both Chinese boys and girls. Development of new and effective strategies against prolonged ST is required, especially for small-screen-device-based ST on weekends.

  5. bc-GenExMiner 3.0: new mining module computes breast cancer gene expression correlation analyses.

    Science.gov (United States)

    Jézéquel, Pascal; Frénel, Jean-Sébastien; Campion, Loïc; Guérin-Charbonnel, Catherine; Gouraud, Wilfried; Ricolleau, Gabriel; Campone, Mario

    2013-01-01

    We recently developed a user-friendly web-based application called bc-GenExMiner (http://bcgenex.centregauducheau.fr), which offered the possibility to evaluate prognostic informativity of genes in breast cancer by means of a 'prognostic module'. In this study, we develop a new module called 'correlation module', which includes three kinds of gene expression correlation analyses. The first one computes correlation coefficient between 2 or more (up to 10) chosen genes. The second one produces two lists of genes that are most correlated (positively and negatively) to a 'tested' gene. A gene ontology (GO) mining function is also proposed to explore GO 'biological process', 'molecular function' and 'cellular component' terms enrichment for the output lists of most correlated genes. The third one explores gene expression correlation between the 15 telomeric and 15 centromeric genes surrounding a 'tested' gene. These correlation analyses can be performed in different groups of patients: all patients (without any subtyping), in molecular subtypes (basal-like, HER2+, luminal A and luminal B) and according to oestrogen receptor status. Validation tests based on published data showed that these automatized analyses lead to results consistent with studies' conclusions. In brief, this new module has been developed to help basic researchers explore molecular mechanisms of breast cancer. DATABASE URL: http://bcgenex.centregauducheau.fr

  6. A Practical Computational Method for the Anisotropic Redshift-Space 3-Point Correlation Function

    Science.gov (United States)

    Slepian, Zachary; Eisenstein, Daniel J.

    2018-04-01

    We present an algorithm enabling computation of the anisotropic redshift-space galaxy 3-point correlation function (3PCF) scaling as N2, with N the number of galaxies. Our previous work showed how to compute the isotropic 3PCF with this scaling by expanding the radially-binned density field around each galaxy in the survey into spherical harmonics and combining these coefficients to form multipole moments. The N2 scaling occurred because this approach never explicitly required the relative angle between a galaxy pair about the primary galaxy. Here we generalize this work, demonstrating that in the presence of azimuthally-symmetric anisotropy produced by redshift-space distortions (RSD) the 3PCF can be described by two triangle side lengths, two independent total angular momenta, and a spin. This basis for the anisotropic 3PCF allows its computation with negligible additional work over the isotropic 3PCF. We also present the covariance matrix of the anisotropic 3PCF measured in this basis. Our algorithm tracks the full 5-D redshift-space 3PCF, uses an accurate line of sight to each triplet, is exact in angle, and easily handles edge correction. It will enable use of the anisotropic large-scale 3PCF as a probe of RSD in current and upcoming large-scale redshift surveys.

  7. Which journals do primary care physicians and specialists access from an online service?

    Science.gov (United States)

    McKibbon, K Ann; Haynes, R Brian; McKinlay, R James; Lokker, Cynthia

    2007-07-01

    The study sought to determine which online journals primary care physicians and specialists not affiliated with an academic medical center access and how the accesses correlate with measures of journal quality and importance. Observational study of full-text accesses made during an eighteen-month digital library trial was performed. Access counts were correlated with six methods composed of nine measures for assessing journal importance: ISI impact factors; number of high-quality articles identified during hand-searches of key clinical journals; production data for ACP Journal Club, InfoPOEMs, and Evidence-Based Medicine; and mean clinician-provided clinical relevance and newsworthiness scores for individual journal titles. Full-text journals were accessed 2,322 times by 87 of 105 physicians. Participants accessed 136 of 348 available journal titles. Physicians often selected journals with relatively higher numbers of articles abstracted in ACP Journal Club. Accesses also showed significant correlations with 6 other measures of quality. Specialists' access patterns correlated with 3 measures, with weaker correlations than for primary care physicians. Primary care physicians, more so than specialists, chose full-text articles from clinical journals deemed important by several measures of value. Most journals accessed by both groups were of high quality as measured by this study's methods for assessing journal importance.

  8. Statistical electron correlation coefficients for the five lowest states of the heliumlike ions

    International Nuclear Information System (INIS)

    Thakkar, A.J.; Smith, V.H. Jr.

    1981-01-01

Statistical correlation coefficients were introduced by Kutzelnigg, Del Re, and Berthier to provide overall measures of the difference between the electron pair density and the product of one-electron densities in atoms and molecules. Some properties of these coefficients are discussed, and it is shown that an angular correlation coefficient is experimentally accessible. Radial and angular correlation coefficients are computed from highly accurate wave functions for the 1^1S, 2^3S, 2^1S, 2^3P, and 2^1P states of the heliumlike ions from He through Mg^10+. It is found that positive angular correlation coefficients occur in the 2^1P state of the two-electron positive ions but not in neutral helium. Moreover, the angular correlation coefficients for the 2^1S and 2^3S states of the positively charged two-electron ions show that a previously proposed reformulation of Hund's rule is incorrect
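The coefficients discussed above are covariance-based, of the same general form as an ordinary correlation coefficient, τ = (⟨ab⟩ − ⟨a⟩⟨b⟩) / √((⟨a²⟩ − ⟨a⟩²)(⟨b²⟩ − ⟨b⟩²)), applied to electron-pair properties. A minimal sketch of that definition on sampled paired values (the data here are illustrative, not from the paper):

```python
import math

def correlation_coefficient(a, b):
    """tau = (<ab> - <a><b>) / sqrt((<a^2> - <a>^2) * (<b^2> - <b>^2))."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum(x * y for x, y in zip(a, b)) / n - ma * mb
    va = sum(x * x for x in a) / n - ma * ma
    vb = sum(y * y for y in b) / n - mb * mb
    return cov / math.sqrt(va * vb)

# Perfectly anticorrelated pair values give tau = -1, perfectly
# correlated values give tau = +1, independent values give tau near 0.
print(round(correlation_coefficient([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]), 3))  # -1.0
```

In the paper's setting, a and b would be properties of the two electrons (e.g. radial distances for the radial coefficient), with expectation values taken over the pair density rather than over samples.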

  9. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    Science.gov (United States)

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  10. Long sequence correlation coprocessor

    Science.gov (United States)

    Gage, Douglas W.

    1994-09-01

    A long sequence correlation coprocessor (LSCC) accelerates the bitwise correlation of arbitrarily long digital sequences by calculating in parallel the correlation score for 16, for example, adjacent bit alignments between two binary sequences. The LSCC integrated circuit is incorporated into a computer system with memory storage buffers and a separate general purpose computer processor which serves as its controller. Each of the LSCC's set of sequential counters simultaneously tallies a separate correlation coefficient. During each LSCC clock cycle, computer enable logic associated with each counter compares one bit of a first sequence with one bit of a second sequence to increment the counter if the bits are the same. A shift register assures that the same bit of the first sequence is simultaneously compared to different bits of the second sequence to simultaneously calculate the correlation coefficient by the different counters to represent different alignments of the two sequences.
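The parallel tally described above can be expressed in software. The sketch below is a plain-Python stand-in for what the LSCC's counters do in hardware, not the patented circuit itself; names and the toy sequences are illustrative.

```python
# For each of n_alignments adjacent bit alignments, count the positions where
# the reference sequence matches a shifted window of the longer signal. In the
# LSCC, one hardware counter per alignment performs these tallies in parallel,
# one bit per clock cycle; here the alignments are computed sequentially.

def correlation_scores(reference, signal, n_alignments):
    """reference, signal: strings of '0'/'1'. Returns one match count per
    alignment offset 0..n_alignments-1."""
    scores = []
    for offset in range(n_alignments):
        window = signal[offset:offset + len(reference)]
        scores.append(sum(r == s for r, s in zip(reference, window)))
    return scores

print(correlation_scores("1011", "0010110", 4))  # [2, 1, 4, 1]
```

The peak score (4, a perfect match at offset 2) identifies the alignment of the reference within the signal, which is the quantity the coprocessor accelerates.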

  11. Accessibility analysis in manufacturing processes using visibility cones

    Institute of Scientific and Technical Information of China (English)

    尹周平; 丁汉; 熊有伦

    2002-01-01

Accessibility is an important design feature of products, and accessibility analysis has been acknowledged as a powerful tool for solving computational manufacturing problems arising from different manufacturing processes. After exploring the relations among approachability, accessibility, and visibility, a general method for accessibility analysis using visibility cones (VC) is proposed. With the definition of the VC of a point, three kinds of visibility of a feature, namely the complete visibility cone (CVC), partial visibility cone (PVC), and local visibility cone (LVC), are defined. A novel approach to computing VCs is formulated by identifying C-obstacles in the C-space, for which a general and efficient algorithm is proposed and implemented by making use of visibility culling. Lastly, we discuss briefly how to realize accessibility analysis in numerically controlled (NC) machining planning, coordinate measuring machine (CMM) inspection planning, and assembly sequence planning with the proposed methods.

  12. Applying standardized uptake values in gallium-67-citrate single-photon emission computed tomography/computed tomography studies and their correlation with blood test results in representative organs.

    Science.gov (United States)

    Toriihara, Akira; Daisaki, Hiromitsu; Yamaguchi, Akihiro; Yoshida, Katsuya; Isogai, Jun; Tateishi, Ukihide

    2018-05-21

Recently, semiquantitative analysis using the standardized uptake value (SUV) has been introduced in bone single-photon emission computed tomography/computed tomography (SPECT/CT). Our purposes were to apply an SUV-based semiquantitative analytic method to gallium-67 (Ga)-citrate SPECT/CT and to evaluate the correlation between the SUV of physiological uptake and blood test results in representative organs. The accuracy of the semiquantitative method was validated using a National Electrical Manufacturers Association body phantom study (sphere-to-background radioactivity ratio = 4:1). Thereafter, 59 patients (34 male and 25 female; mean age, 66.9 years) who had undergone Ga-citrate SPECT/CT were retrospectively enrolled in the study. The mean SUV of physiological uptake was calculated for the following organs: the lungs, right atrium, liver, kidneys, spleen, gluteal muscles, and bone marrow. The correlation between physiological uptake and blood test results was evaluated using Pearson's correlation coefficient. The phantom study revealed only a 1% error between theoretical and actual SUVs in the background, suggesting sufficient accuracy of the scatter and attenuation corrections. However, a partial volume effect could not be overlooked, particularly in small spheres with a diameter of less than 28 mm. The highest mean SUV was observed in the liver (range: 0.44-4.64), followed by bone marrow (range: 0.33-3.60), spleen (range: 0.52-2.12), and kidneys (range: 0.42-1.45). There was no significant correlation between hepatic uptake and liver function, renal uptake and renal function, or bone marrow uptake and blood cell count (P>0.05). Physiological uptake in Ga-citrate SPECT/CT can be represented as SUVs, which are not significantly correlated with the corresponding blood test results.
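The SUV used in such semiquantitative analyses is conventionally the body-weight-normalized uptake. A minimal sketch of that standard definition (this is the common body-weight SUV formula, not code from this study; the function name, units, and example numbers are illustrative):

```python
def suv_bw(activity_conc_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight SUV: tissue activity concentration divided by the injected
    dose per gram of body weight, assuming tissue density of ~1 g/mL so that
    1 g of tissue corresponds to 1 mL. A value of 1.0 means uptake equal to
    a uniform distribution of the dose over the body."""
    dose_kbq = injected_dose_mbq * 1000.0          # MBq -> kBq
    weight_g = body_weight_kg * 1000.0             # kg -> g
    return activity_conc_kbq_per_ml / (dose_kbq / weight_g)

# e.g. 5 kBq/mL measured uptake, 100 MBq injected, 70 kg patient
print(round(suv_bw(5.0, 100.0, 70.0), 2))  # 3.5
```

In SPECT/CT the measured concentration must first be made quantitative via scatter and attenuation corrections, which is why the phantom validation above matters.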

  13. Information Accessibility and Utilization as Correlate of Quality Of ...

    African Journals Online (AJOL)

    The quality of life of people in developing countries, including Nigeria, is often adjudged to be lower than the expected standard. This is worse with women living in the rural areas whose lives are characterised by inadequate access and use of basic amenities of life. The situation is compounded by the women's lack of ...

  14. Fast-GPU-PCC: A GPU-Based Technique to Compute Pairwise Pearson's Correlation Coefficients for Time Series Data-fMRI Study.

    Science.gov (United States)

    Eslami, Taban; Saeed, Fahad

    2018-04-20

Functional magnetic resonance imaging (fMRI) is a non-invasive brain imaging technique, which has been regularly used for studying the brain's functional activities in the past few years. A widely used measure for capturing functional associations in the brain is Pearson's correlation coefficient, which is commonly used for constructing functional networks and studying the dynamic functional connectivity of the brain. These are useful measures for understanding the effects of brain disorders on connectivities among brain regions. fMRI scanners produce a huge number of voxels, and using traditional central processing unit (CPU)-based techniques for computing pairwise correlations is very time consuming, especially when a large number of subjects are being studied. In this paper, we propose a graphics processing unit (GPU)-based algorithm called Fast-GPU-PCC for computing pairwise Pearson's correlation coefficients. Based on the symmetric property of Pearson's correlation, this approach returns the N(N-1)/2 correlation coefficients located in the strictly upper triangular part of the correlation matrix. Storing correlations in a one-dimensional array with the order proposed in this paper is useful for further usage. Our experiments on real and synthetic fMRI data for different numbers of voxels and varying lengths of time series show that the proposed approach outperformed state-of-the-art GPU-based techniques as well as the sequential CPU-based versions. We show that Fast-GPU-PCC runs 62 times faster than the CPU-based version and about 2 to 3 times faster than two other state-of-the-art GPU-based methods.
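The output layout described above, N(N-1)/2 coefficients from the strictly upper triangle stored in a flat array, can be sketched in plain Python as a stand-in for the GPU kernel (the function names and toy series are illustrative, not the paper's implementation):

```python
import math

def pearson(x, y):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def pairwise_upper_triangle(series):
    """Row-major strictly-upper-triangle correlations, flattened to a
    1-D list of length n*(n-1)//2 (pair (i, j) with i < j)."""
    n = len(series)
    flat = []
    for i in range(n):
        for j in range(i + 1, n):
            flat.append(pearson(series[i], series[j]))
    return flat

series = [[1, 2, 3, 4], [2, 4, 6, 8], [4, 3, 2, 1]]
print([round(r, 2) for r in pairwise_upper_triangle(series)])  # [1.0, -1.0, -1.0]
```

Exploiting the symmetry r(i, j) = r(j, i) is what halves the work and storage; the GPU version parallelizes the double loop across threads.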

  15. Correlation of the clinical and physical image quality in chest radiography for average adults with a computed radiography imaging system.

    Science.gov (United States)

    Moore, C S; Wood, T J; Beavis, A W; Saunderson, J R

    2013-07-01

The purpose of this study was to examine the correlation between the quality of visually graded patient (clinical) chest images and a quantitative assessment of chest phantom (physical) images acquired with a computed radiography (CR) imaging system. The results of a previously published study, in which four experienced image evaluators graded computer-simulated postero-anterior chest images using a visual grading analysis scoring (VGAS) scheme, were used for the clinical image quality measurement. Contrast-to-noise ratio (CNR) and effective dose efficiency (eDE) were used as physical image quality metrics measured in a uniform chest phantom. Although optimal values of these physical metrics for chest radiography were not derived in this work, their correlation with VGAS in images acquired without an antiscatter grid across the diagnostic range of X-ray tube voltages was determined using Pearson's correlation coefficient. Clinical and physical image quality metrics increased with decreasing tube voltage. Statistically significant correlations between VGAS and CNR (R=0.87) were found for chest CR images acquired without an antiscatter grid. A statistically significant correlation has been found between the clinical and physical image quality in CR chest imaging. The results support the value of using CNR and eDE in the evaluation of quality in clinical thorax radiography.
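Contrast-to-noise ratio as used in phantom studies is typically the difference of ROI means over the background noise. A minimal sketch assuming that common definition (the paper's exact measurement geometry and noise estimator may differ; the pixel values here are illustrative):

```python
import math

def cnr(signal_pixels, background_pixels):
    """Contrast-to-noise ratio: (mean signal - mean background) divided by
    the background standard deviation (population SD)."""
    ms = sum(signal_pixels) / len(signal_pixels)
    mb = sum(background_pixels) / len(background_pixels)
    var_b = sum((p - mb) ** 2 for p in background_pixels) / len(background_pixels)
    return (ms - mb) / math.sqrt(var_b)

# Toy ROIs: a bright insert region against a noisy uniform background
print(cnr([15.0, 15.0], [9.0, 11.0]))  # 5.0
```

Because both the contrast and the noise vary with tube voltage, CNR tracks the same voltage dependence that the visual grading scores showed.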

  16. Neuropathological correlations with the computed tomograms in Creutzfeldt-Jakob disease

    Energy Technology Data Exchange (ETDEWEB)

    Nagura, Hiroshi; Tohgi, Hideo; Yamanouchi, Hiroshi (Tokyo Metropolitan Geriatric Medical Center (Japan)); Tomonaga, Masanori

    1983-03-01

Findings on computed tomograms were correlated with pathological changes in 3 autopsied cases of Creutzfeldt-Jakob disease who died at various stages of the disease. CTs were almost normal during the period when severe dementia and myoclonus had fully developed. The brain from a patient who died in this period showed slight nerve cell loss and spongiform changes, mainly in the cerebral cortex. CTs of two advanced cases showed that the atrophic processes of the brain progressed rapidly. In these cases severe nerve cell loss and status spongiosus were found in the cerebral cortex, basal nuclei, and cerebellum. Serial CTs showed that atrophic processes involved first the cerebral cortex, and then the basal nuclei and cerebellum. These observations pose the question of whether the difference in the distribution of lesions observed in cases of Creutzfeldt-Jakob disease is merely due to the different stages of the disease at the time of death or due to the variety of pathologic processes in individual cases.

  17. Neuropathological correlations with the computed tomograms in Creutzfeldt-Jakob disease

    International Nuclear Information System (INIS)

    Nagura, Hiroshi; Tohgi, Hideo; Yamanouchi, Hiroshi; Tomonaga, Masanori.

    1983-01-01

Findings on computed tomograms were correlated with pathological changes in 3 autopsied cases of Creutzfeldt-Jakob disease who died at various stages of the disease. CTs were almost normal during the period when severe dementia and myoclonus had fully developed. The brain from a patient who died in this period showed slight nerve cell loss and spongiform changes, mainly in the cerebral cortex. CTs of two advanced cases showed that the atrophic processes of the brain progressed rapidly. In these cases severe nerve cell loss and status spongiosus were found in the cerebral cortex, basal nuclei, and cerebellum. Serial CTs showed that atrophic processes involved first the cerebral cortex, and then the basal nuclei and cerebellum. These observations pose the question of whether the difference in the distribution of lesions observed in cases of Creutzfeldt-Jakob disease is merely due to the different stages of the disease at the time of death or due to the variety of pathologic processes in individual cases. (author)

  18. Computed tomography, lymphography, and staging laparotomy: correlations in initial staging of Hodgkin disease

    Energy Technology Data Exchange (ETDEWEB)

    Castellino, R.A.; Hoppe, R.T.; Blank, N.; Young, S.W.; Neumann, C.; Rosenberg, S.A.; Kaplan, H.S.

    1984-07-01

One hundred twenty-one patients with newly diagnosed, previously untreated Hodgkin disease underwent abdominal and pelvic computed tomographic (CT) scanning and bipedal lymphography. These studies were followed by staging laparotomy, which included biopsy of the liver, retroperitoneal and mesenteric lymph nodes, and splenectomy. Correlation of the results of the imaging studies with the histopathologic diagnoses revealed a small, but significant, increase in the accuracy of lymphography compared with CT in assessing the retroperitoneal lymph nodes. The theoretical advantages of CT scanning in detecting lymphomatous deposits in lymph nodes about the celiac axis and the mesentery, or in the liver and spleen, were not confirmed.

  19. MOFAC : model for fine grained access control

    OpenAIRE

    2014-01-01

    M.Sc. (Computer Science) Computer security is a key component in any computer system. Traditionally computers were not connected to one another. This centralized configuration made the implementation of computer security a relatively easy task. The closed nature of the system limited the number of unknown factors that could cause security breaches. The users and their access rights were generally well defined and the system was protected from outside threats through simple, yet effective c...

  20. Social Media Use and Access to Digital Technology in US Young Adults in 2016

    Science.gov (United States)

    Johnson, Amanda L; Ilakkuvan, Vinu; Jacobs, Megan A; Graham, Amanda L; Rath, Jessica M

    2017-01-01

    Background In 2015, 90% of US young adults with Internet access used social media. Digital and social media are highly prevalent modalities through which young adults explore identity formation, and by extension, learn and transmit norms about health and risk behaviors during this developmental life stage. Objective The purpose of this study was to provide updated estimates of social media use from 2014 to 2016 and correlates of social media use and access to digital technology in data collected from a national sample of US young adults in 2016. Methods Young adult participants aged 18-24 years in Wave 7 (October 2014, N=1259) and Wave 9 (February 2016, N=989) of the Truth Initiative Young Adult Cohort Study were asked about use frequency for 11 social media sites and access to digital devices, in addition to sociodemographic characteristics. Regular use was defined as using a given social media site at least weekly. Weighted analyses estimated the prevalence of use of each social media site, overlap between regular use of specific sites, and correlates of using a greater number of social media sites regularly. Bivariate analyses identified sociodemographic correlates of access to specific digital devices. Results In 2014, 89.42% (weighted n, 1126/1298) of young adults reported regular use of at least one social media site. This increased to 97.5% (weighted n, 965/989) of young adults in 2016. Among regular users of social media sites in 2016, the top five sites were Tumblr (85.5%), Vine (84.7%), Snapchat (81.7%), Instagram (80.7%), and LinkedIn (78.9%). Respondents reported regularly using an average of 7.6 social media sites, with 85% using 6 or more sites regularly. 
Overall, 87% of young adults reported access to or use of a smartphone with Internet access, 74% a desktop or laptop computer with Internet access, 41% a tablet with Internet access, 29% a smart TV or video game console with Internet access, 11% a cell phone without Internet access, and 3% none of these.

  1. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

Full Text Available Abstract Genomics is the study of the genome, which produces large amounts of data requiring substantial storage and computational power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, such as easy access to data, easy sharing and transfer, storage measured in hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus, and Globus Genomics. Features that cloud computing offers genomics include easy access to and sharing of data, data security, and lower cost for resources; there are still some drawbacks, however, such as the long time needed to transfer data and limited network bandwidth.

  2. Accessibility measures: review and applications. Evaluation of accessibility impacts of land-use transportation scenarios, and related social and economic impact

    OpenAIRE

    Geurs KT; Ritsema van Eck JR; Universiteit Utrecht-URU; LAE

    2001-01-01

    This report describes an extensive literature study and three case studies aimed at reviewing accessibility measures for their ability to evaluate the accessibility impacts of national land-use and transport scenarios, and related social and economic impacts. Several activity- and utility-based accessibility measures were computed to analyse job accessibility by car and public transport in the Netherlands for: (1) the (base) year 1995, (2) a Trend, or business-as-usual, scenario, representing...

  3. Correlates of Strengthening Lessons from HIV/AIDS Treatment and Care Services in Ethiopia Perceived Access and Implications for Health System.

    Directory of Open Access Journals (Sweden)

    Bereket Yakob

    care, health system responsiveness, perceived financial fairness, transportation convenience and satisfaction with services were correlates of perceived access and affected healthcare performance. Interventions targeted at improving access to HIV/AIDS treatment and care services should address these factors. Further studies may be needed to confirm the findings.

  4. Liver tumors, correlation of computed tomography (CT) and pathology

    Energy Technology Data Exchange (ETDEWEB)

    Okazaki, Atsushi; Niibe, Hideo; Mitsuhashi, Norio

    1984-09-01

Computed tomographic and pathologic correlation was studied in 12 autopsied cases: 11 cases of metastatic liver tumors and 1 case of hepatocellular carcinoma. Regardless of the proliferative patterns of the tumors, nodular low attenuations on CT corresponded macroscopically to scattered nodular lesions, and geographic low attenuations on CT to groups of multiple small nodular lesions. Abnormal areas of low attenuation were generally diminished by drip-infusion contrast enhancement, which was more significant in tumors of infiltrative proliferation. Tumors of infiltrative proliferation showed little degeneration of surrounding liver cells, and their abnormal areas of low attenuation were more distinct before contrast enhancement. Tumors of expansive proliferation showed obvious degeneration of surrounding liver cells, and a case having about 200 layers of degenerated liver cells was more distinct after contrast enhancement. The central lower-density areas within abnormal areas of low attenuation on CT corresponded histopathologically to liquefactive necroses with scanty capillary vessels and fibrotic changes. However, coagulative necroses without decreased surrounding blood flow were not visualized on CT. CT could not demonstrate liquefactive necroses in nodules smaller than 2 cm in diameter. (J.P.N.).

  5. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    Science.gov (United States)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general-purpose computing with inexpensive GPU hardware. These devices are better suited for emulating hardware autocorrelators than traditional CPU-based software applications, as they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics (PCI) card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
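The multi-tau binning described in the abstract can be sketched on the CPU with NumPy. This is an illustrative reimplementation of the general scheme only, not the authors' GPU code; the function name and parameter defaults are assumptions:

```python
import numpy as np

def multi_tau_autocorr(counts, points_per_bin=8, levels=4):
    """Multi-tau autocorrelation: lag spacing doubles at each level,
    trading lag resolution for dynamic range, as hardware correlators do."""
    counts = np.asarray(counts, dtype=float)
    lags, acf = [], []
    dt = 1  # current bin width in units of the original sampling interval
    for level in range(levels):
        n = len(counts)
        # level 0 covers the finest lags; later levels skip lags already covered
        start = 1 if level == 0 else points_per_bin // 2
        for k in range(start, points_per_bin):
            if k >= n:
                break
            num = np.mean(counts[:n - k] * counts[k:])
            den = np.mean(counts[:n - k]) * np.mean(counts[k:])
            lags.append(k * dt)
            acf.append(num / den if den > 0 else 0.0)
        # coarsen: merge adjacent bins for the next level
        counts = counts[:(n // 2) * 2].reshape(-1, 2).sum(axis=1)
        dt *= 2
    return np.array(lags), np.array(acf)
```

For an uncorrelated constant signal the normalized correlation is 1.0 at every lag, which makes a convenient sanity check for the binning logic.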

  6. Dynamically Authorized Role-Based Access Control for Grid Applications

    Institute of Scientific and Technical Information of China (English)

    YAO Hanbing; HU Heping; LU Zhengding; LI Ruixuan

    2006-01-01

    Grid computing is concerned with the sharing and coordinated use of diverse resources in distributed "virtual organizations". The heterogeneous, dynamic and multi-domain nature of these environments raises challenging security issues that demand new technical approaches. Despite recent advances in access control approaches applicable to Grid computing, there remain issues that impede the development of effective access control models for Grid applications. Among them are the lack of context-based models for access control and the reliance on identity- or capability-based access control schemes. An access control scheme that resolves these issues is presented, and a dynamically authorized role-based access control (D-RBAC) model extending RBAC with context constraints is proposed. The D-RBAC mechanisms dynamically grant permissions to users based on a set of contextual information collected from the system and the user's environment, while retaining the advantages of the RBAC model. The implementation architecture of D-RBAC for Grid applications is also described.
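The core idea of RBAC extended with context constraints can be sketched in a few lines (hypothetical names and a deliberately minimal design; the actual D-RBAC model is far richer):

```python
from dataclasses import dataclass, field

@dataclass
class Role:
    name: str
    permissions: set = field(default_factory=set)
    # context predicates that must all hold for the role to be active
    constraints: list = field(default_factory=list)

def active_permissions(roles, context):
    """Dynamically grant permissions: a role contributes its permissions
    only when every one of its context constraints is satisfied."""
    granted = set()
    for role in roles:
        if all(check(context) for check in role.constraints):
            granted |= role.permissions
    return granted

# Example: a 'researcher' role that is valid only during working hours
researcher = Role(
    "researcher",
    permissions={"submit_job", "read_results"},
    constraints=[lambda ctx: 8 <= ctx["hour"] < 18],
)
```

Evaluated at 10:00 the role grants both permissions; at 22:00 it grants none, which is the "dynamic authorization" the abstract refers to.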

  7. The Battle to Secure Our Public Access Computers

    Science.gov (United States)

    Sendze, Monique

    2006-01-01

    Securing public access workstations should be a significant part of any library's network and information-security strategy because of the sensitive information patrons enter on these workstations. As the IT manager for the Johnson County Library in Kansas City, Kan., this author is challenged to make sure that thousands of patrons get the access…

  8. Remote Forensics May Bring the Next Sea Change in E-discovery: Are All Networked Computers Now Readily Accessible Under the Revised Federal Rules of Civil Procedure?

    Directory of Open Access Journals (Sweden)

    Joseph J. Schwerha

    2008-09-01

    Full Text Available The recent amendments to Rule 26 of the Federal Rules of Civil Procedure created a two-tiered approach to discovery of electronically stored information (“ESI”). Responding parties must produce ESI that is relevant, not subject to privilege, and reasonably accessible. However, because some methods of storing ESI, such as on magnetic backup tapes and within enormous databases, require substantial cost to access and search their contents, the rules permit parties to designate those repositories as “not reasonably accessible” because of undue burden or cost. Yet despite the difficulty in searching for ESI, the party’s duty to preserve potentially responsive evidence remains; it simply gains the option to forgo poring over the material. Further, the court might nevertheless compel production if the requesting party demonstrates good cause. Regardless of whether the responding party believes certain documents to be reasonably accessible or not, courts may still require their production. In such cases, the court may then choose to order production, but shift the costs of doing so to the requesting party. Throughout this process, the burden and cost of production are central themes. Their determination is fluid, varying from case to case and even over time in the same situation. Nowhere is this more evident than where a responding party has numerous, geographically dispersed computers under its control that may contain ESI responsive to a request for production of documents. Traditionally, a responding party would be forced to decide whether or not to send computer forensic experts to all of these locations to make forensically sound copies of all of those computers and then analyze each. This process is time-consuming and costly. Recently, several companies have put forth substantial solutions that facially allow a responding party to capture and analyze data

  9. A global workspace model for phenomenal and access consciousness.

    Science.gov (United States)

    Raffone, Antonino; Pantani, Martina

    2010-06-01

    Both the global workspace theory and Block's distinction between phenomenal and access consciousness are central in the current debates about consciousness and the neural correlates of consciousness. In this article, a unifying global workspace model for phenomenal and access consciousness is proposed. In the model, recurrent neural interactions take place in distinct yet interacting access and phenomenal brain loops. The effectiveness of feedback signaling onto sensory cortical maps is emphasized for the neural correlates of phenomenal consciousness. Two forms of top-down attention, attention for perception and attention for access, play differential roles for phenomenal and access consciousness. The model is implemented in neural network form, with simulations of single and multiple visual object processing and of the attentional blink.

  10. SIEX3: A correlated computer code for prediction of fast reactor mixed oxide fuel and blanket pin performance

    International Nuclear Information System (INIS)

    Baker, R.B.; Wilson, D.R.

    1986-04-01

    The SIEX3 computer program was developed to calculate the fuel and cladding performance of oxide fuel and oxide blanket pins irradiated in the fast neutron environment of a liquid-metal-cooled reactor. The code is designed to be accurate yet quick-running, using a minimum of computer core storage. This was accomplished through the correlation of physically based models with very large databases of irradiation test results. Data from over 200 fuel pins and over 800 transverse fuel microscopy samples were used in the calibrations.

  11. Technological advancements and Internet sexuality: does private access to the Internet influence online sexual behavior?

    Science.gov (United States)

    Daneback, Kristian; Månsson, Sven-Axel; Ross, Michael W

    2012-08-01

    The aim of this study was to investigate whether demographic characteristics and online and offline sexual behavior were associated with private as opposed to nonprivate access to the Internet in a Web sample of people who use the Internet for sexual purposes. A total of 1,913 respondents completed an online questionnaire about Internet sexuality, and 1,614 reported using the Internet for sexual purposes. The majority of these respondents reported having access to an Internet-connected computer no one else had access to (62 percent of women and 70 percent of men). The results showed that it is possible to differentiate between those who have private access to an Internet-connected computer and those who have shared access. Not only did the two groups differ in demographic characteristics, but also in the sexual activities they engaged in on the Internet. Different patterns were found for women and men. For example, men who had private access to Internet-connected computers were more likely than those with shared access to seek information about sexual issues. Thus, having access to an Internet-connected computer no one else has access to may promote sexual knowledge and health for men. The results of this study, along with the technological development, imply that future research should pay attention to where and how people access the Internet in relation to online behavior in general and online sexual behavior in particular.

  12. Alexia without agraphia: Clinical-computed tomographic correlations

    International Nuclear Information System (INIS)

    Weisberg, L.A.; Wall, M.; Charity Hospital, New Orleans, LA; Veterans Administration Hospital, New Orleans, LA

    1987-01-01

    Four patients with alexia without agraphia had CT lesions which correlated with the clinical findings. All lesions were vascular; two were spontaneous intracerebral hematomas and two were ischemic infarctions in the posterior cerebral artery distribution. The lesions were located in the posterior portion of the dominant hemisphere. The location of the lesion correlated with the presence or absence of visual field abnormalities. (orig.)

  13. Web Based Remote Access Microcontroller Laboratory

    OpenAIRE

    H. Çimen; İ. Yabanova; M. Nartkaya; S. M. Çinar

    2008-01-01

    This paper presents a web based remote access microcontroller laboratory. Because of accelerated development in electronics and computer technologies, microcontroller-based devices and appliances are found in all aspects of our daily life. Before the implementation of remote access microcontroller laboratory an experiment set is developed by teaching staff for training microcontrollers. Requirement of technical teaching and industrial applications are considered when expe...

  14. Several Families of Sequences with Low Correlation and Large Linear Span

    Science.gov (United States)

    Zeng, Fanxin; Zhang, Zhenyu

    In DS-CDMA systems and DS-UWB radios, low correlation of spreading sequences can greatly help to minimize multiple-access interference (MAI), and large linear span of spreading sequences can reduce their predictability. In this letter, new sequence sets with low correlation and large linear span are proposed. Based on the construction Tr^m_1[Tr^n_m(α^{bt} + γ_i α^{dt})]^r for generating p-ary sequences of period p^n - 1, where n = 2m, d = up^m ± v, b = u ± v, γ_i ∈ GF(p^n), and p is an arbitrary prime number, several methods to choose the parameter d are provided. The obtained sequences, with family size p^n, have four-valued, five-valued, six-valued or seven-valued correlation, and the maximum nontrivial correlation value is (u+v-1)p^m - 1. Computer simulation shows that the linear span of the new sequences is larger than that of sequences with Niho-type and Welch-type decimations, and similar to that of [10].
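For the simplest case p = 2, the role of the trace function in such constructions can be illustrated with a toy example: the binary m-sequence s(t) = Tr^4_1(α^t) over GF(2^4) attains the ideal two-valued periodic autocorrelation {15, -1}. This is only a sketch of the underlying machinery (field arithmetic, trace, correlation), not the letter's construction:

```python
def gf16_mul(a, b):
    """Multiply in GF(2^4) with primitive polynomial x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= 0x13  # reduce modulo x^4 + x + 1
        b >>= 1
    return r

def trace(x):
    """Absolute trace Tr^4_1(x) = x + x^2 + x^4 + x^8; always 0 or 1."""
    t, y = 0, x
    for _ in range(4):
        t ^= y             # field addition is XOR in characteristic 2
        y = gf16_mul(y, y)  # square
    return t

def mseq():
    """Binary m-sequence of period 15: s(t) = trace(alpha^t), alpha = x."""
    s, a = [], 1
    for _ in range(15):
        s.append(trace(a))
        a = gf16_mul(a, 2)  # multiply by the primitive element alpha
    return s

def autocorr(s, tau):
    """Periodic autocorrelation with +/-1 mapping of the binary symbols."""
    n = len(s)
    return sum((-1) ** (s[t] ^ s[(t + tau) % n]) for t in range(n))
```

The sequence is balanced (eight 1s per period of 15), and the autocorrelation is 15 at lag 0 and -1 at every other lag, the "ideal" behavior that richer families such as those in the letter trade off against family size and linear span.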

  15. Computed tomography of delayed encephalopathy of acute carbon monoxide poisoning - correlation with clinical findings -

    International Nuclear Information System (INIS)

    Suh, Chang Hae; Chung, Sung Hoon; Choo, In Wook; Chang, Kee Hyun

    1986-01-01

    Cerebral computed tomography (CT) findings were described in twenty-six cases with the late sequelae of acute carbon monoxide poisoning and were compared with the neurological symptoms and signs. The CT findings included symmetrical periventricular white matter low density in five cases, globus pallidus low density in six cases, ventricular dilatation in seven cases, ventricular dilatation and sulcal widening in three cases, and normal findings in ten cases. Only one case showed low densities in both the periventricular white matter and the globus pallidus. Late sequelae of the interval form of carbon monoxide poisoning were clinically categorized as cortical dysfunction, parkinsonian features, and cerebellar dysfunction. The severity of the clinical symptoms and signs of neurological sequelae generally correlated with the presence and multiplicity of abnormal brain CT findings. However, of fourteen cases showing parkinsonian features, only five had low density of the globus pallidus on brain CT. Another case, showing a small unilateral low density of the globus pallidus, had no parkinsonian features but showed mild cortical dysfunction.

  16. Pulmonary arteriovenous malformations in hereditary haemorrhagic telangiectasia. Correlations between computed tomography findings and cerebral complications

    Energy Technology Data Exchange (ETDEWEB)

    Etievant, Johan; Si-Mohamed, Salim; Vinurel, Nicolas; Revel, Didier [Hospices Civils de Lyon, Hopital Cardiologique Louis Pradel, Departement d' Imagerie Cardiaque et Thoracique, Diagnostique et Interventionnelle, Bron (France); Universite Claude Bernard Lyon 1, Villeurbanne (France); Dupuis-Girod, Sophie [Hospices Civils de Lyon, Hopital Femme-Mere-Enfant, Service de Genetique, Centre de Reference pour la maladie de Rendu-Osler, Lyon (France); Decullier, Evelyne [Universite Claude Bernard Lyon 1, Villeurbanne (France); Hospices Civils de Lyon, Pole Information Medicale Evaluation Recherche, Lyon (France); Gamondes, Delphine [Hospices Civils de Lyon, Hopital Cardiologique Louis Pradel, Departement d' Imagerie Cardiaque et Thoracique, Diagnostique et Interventionnelle, Bron (France); Khouatra, Chahera [Hospices Civils de Lyon, Hopital Cardiologique Louis Pradel, Service de pneumologie - Centre des Maladies Orphelines Pulmonaires, Lyon (France); Cottin, Vincent [Universite Claude Bernard Lyon 1, Villeurbanne (France); Hospices Civils de Lyon, Hopital Cardiologique Louis Pradel, Service de pneumologie - Centre des Maladies Orphelines Pulmonaires, Lyon (France)

    2018-03-15

    Computed tomography (CT) is the modality of choice to characterise pulmonary arteriovenous malformations (PAVMs) in patients with hereditary haemorrhagic telangiectasia (HHT). Our objective was to determine if CT findings were associated with frequency of brain abscess and ischaemic stroke. This retrospective study included patients with HHT-related PAVMs. CT results, i.e. PAVM presentation (unique, multiple, disseminated or diffuse), the number of PAVMs and the largest feeding artery size, were correlated to prevalence of ischaemic stroke and brain abscess. All CTs were reviewed in consensus by two radiologists. Of 170 patients, 73 patients had unique (42.9 %), 49 multiple (28.8 %), 36 disseminated (21.2 %) and 12 diffuse (7.1 %) PAVMs. Fifteen patients presented with brain abscess; 26 patients presented with ischaemic stroke. The number of PAVMs was significantly correlated with brain abscess (11.5 vs. 6.2, respectively; p=0.025). The mean diameter of the largest feeding artery was significantly correlated with ischaemic stroke frequency (4.9 vs. 3.2 mm, respectively; p=0.0098). The number of PAVMs correlated significantly with risk of brain abscess, and a larger feeding artery significantly with more ischaemic strokes. These findings can lead to a better recognition and management of the PAVMs at risk of cerebral complications. (orig.)

  17. Computational experiment for the purpose of determining the probabilistic and temporal characteristics of information security systems against unauthorized access in automated information systems

    Directory of Open Access Journals (Sweden)

    A. V. Skrypnikov

    2017-01-01

    Full Text Available The article is devoted to a method for experimentally estimating the operating parameters of standard, certified information protection systems against unauthorized access that are widely used in organizations operating automated information systems. In the course of the experiment, statistical data on the dynamics of the functioning of these information security systems were evaluated. The execution times of the protective functions were recorded using the Process Monitor utility from the Sysinternals suite, which filters processes and threads. Loading the processor and main memory of the computer with special software, purpose-built for the experimental research, simulates the operation of the GIS under realistic working conditions. The load-simulation software was developed in Visual Studio 2015 as a console application ("ConsoleApplication"); it loads the processor to 50-70% and the main memory to 60-80%. The measured execution times of the protective functions under high utilization of computing resources make it possible to assess the conflict and dynamic properties of the GIS. The obtained experimental estimates can subsequently be used to develop a model of information security in automated information systems, as well as to formulate quality requirements (resource intensity, response time to a user's request, availability, etc.). The results of the computational experiment can also be used to develop a software package for assessing the dynamic performance of information security systems against unauthorized access in automated information systems.
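The core measurement loop of such an experiment — repeatedly timing a protective function and summarizing the samples — can be sketched as follows (illustrative only; the study used Process Monitor and purpose-built load software rather than this kind of in-process timer):

```python
import statistics
import time

def time_protected_function(fn, trials=100):
    """Collect execution-time samples for a protection function, as raw
    material for probabilistic and temporal characteristics of the system."""
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - t0)
    return {
        "mean": statistics.mean(samples),
        "stdev": statistics.stdev(samples),
        "max": max(samples),
    }
```

Running the same loop with and without a background load generator would reproduce, in miniature, the comparison between idle and 50-70% CPU / 60-80% memory utilization described above.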

  18. Controlling user access to electronic resources without password

    Science.gov (United States)

    Smith, Fred Hewitt

    2017-08-22

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes obtaining an image from a communication device of a user. An individual and a landmark are identified within the image. Determinations are made that the individual is the user and that the landmark is a predetermined landmark. Access to a restricted computing resource is granted based on the determining that the individual is the user and that the landmark is the predetermined landmark. Other embodiments are disclosed.
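The two-factor decision described in the claims — the person in the image must be the registered user and the scene must contain the predetermined landmark — can be sketched as a single predicate. The recognizer callables here are stand-ins for real face and scene classifiers, and all names are hypothetical:

```python
def grant_access(image, expected_user, expected_landmark,
                 identify_person, identify_landmark):
    """Grant access only if BOTH determinations succeed:
    the individual is the user, and the landmark is the predetermined one."""
    person = identify_person(image)      # e.g. a face-recognition component
    landmark = identify_landmark(image)  # e.g. a scene/landmark classifier
    return person == expected_user and landmark == expected_landmark
```

With stub classifiers, access is granted only when both checks pass; a correct user photographed at the wrong place (or vice versa) is rejected, which is the password-free property the patent describes.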

  19. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    Science.gov (United States)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are mastering physics concepts and cultivating a scientific attitude (including a critical attitude), as well as developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies showed that both competences are poorly achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase mastery of concepts and train CTS is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in experiments and also receive good explanations through the computer simulations. In research with a randomized control-group pretest-posttest design, we found that the inquiry learning model aided by computer simulations improved students' mastery of concepts significantly more than the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium and 16.7% low. CTS contributed greatly to students' mastery of concepts, with a correlation coefficient of 0.697, and contributed moderately to the enhancement of mastery of concepts, with a correlation coefficient of 0.603.
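The reported coefficients (0.697 and 0.603) are ordinary Pearson product-moment correlations, which can be computed as follows (illustrative data only, not the study's):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A perfectly linear increasing relationship gives r = 1 and a decreasing one gives r = -1; values such as 0.697 indicate a strong but imperfect positive association.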

  20. Inverse correlation between reactive oxygen species in unwashed semen and sperm motion parameters as measured by a computer-assisted semen analyzer.

    Science.gov (United States)

    Takeshima, Teppei; Yumura, Yasushi; Yasuda, Kengo; Sanjo, Hiroyuki; Kuroda, Shinnosuke; Yamanaka, Hiroyuki; Iwasaki, Akira

    2017-01-01

    This study investigated the correlation between sperm motion parameters obtained by a computer-assisted semen analyzer and levels of reactive oxygen species in unwashed semen. In total, 847 patients, excluding azoospermic patients, were investigated. At each patient's first consultation, semen parameters were measured using SMAS™ or CellSoft 3000™, and production of reactive oxygen species was measured using a computer-driven LKB Wallac Luminometer 1251 Analyzer. The patients were divided into two groups, reactive oxygen species-positive and -negative, and the semen parameters within each group, measured using one of the two computer-assisted semen analyzer systems, were compared. Correlations between reactive oxygen species levels and sperm motion parameters in semen from the reactive oxygen species-positive group were also investigated. Reactive oxygen species were detected in the semen samples of 282 cases (33.3%). Sperm concentration (P semen damage sperm concentration, motility, and other sperm motion parameters.

  1. A Theorem on Grid Access Control

    Institute of Scientific and Technical Information of China (English)

    XU ZhiWei(徐志伟); BU GuanYing(卜冠英)

    2003-01-01

    The current grid security research is mainly focused on the authentication of grid systems. A problem still to be solved by grid systems is ensuring consistent access control. This problem is complicated because the hosts in a grid computing environment usually span multiple autonomous administrative domains. This paper presents a grid access control model based on asynchronous automata theory and the classic Bell-LaPadula model. The model is useful for formally studying the confidentiality and integrity problems in a grid computing environment. A theorem is proved which gives the necessary and sufficient conditions for a grid to maintain confidentiality. These conditions are formalized descriptions of local (node) relations and of the relationship between grid subjects and node subjects.
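The Bell-LaPadula rules the model builds on reduce to the classic no-read-up and no-write-down checks, sketched here as a textbook illustration (not the paper's asynchronous-automata formalization; level names are hypothetical):

```python
# A totally ordered lattice of security levels (toy example)
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2}

def can_read(subject_level, object_level):
    """Simple security property: no read up.
    A subject may read only objects at or below its own level."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    """*-property: no write down.
    A subject may write only to objects at or above its own level."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

In a grid setting, the paper's contribution is characterizing when these per-node checks compose into grid-wide confidentiality across autonomous domains; the predicates above are only the single-node building blocks.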

  2. Accessible Earth: An accessible study abroad capstone for the geoscience curriculum

    Science.gov (United States)

    Bennett, R. A.; Lamb, D. A.

    2017-12-01

    International capstone field courses offer geoscience students opportunities to reflect upon their knowledge, develop intercultural competence, appreciate diversity, and recognize themselves as geoscientists on a global scale. Such experiences are often described as pivotal to a geoscientist's education, a rite of passage. However, field-based experiences present insurmountable barriers to many students, undermining the goal of inclusive excellence. Nevertheless, there remains a widespread belief that successful geoscientists are those able to traverse inaccessible terrain. One path forward from this apparent dilemma is emerging as we take steps to address a parallel challenge: as we move into the 21st century the geoscience workforce will require an ever increasing range of skills, including analysis, modeling, communication, and computational proficiency. Computer programming, laboratory experimentation, numerical simulation, etc., are inherently more accessible than fieldwork, yet equally valuable. Students interested in pursuing such avenues may be better served by capstone experiences that align more closely with their career goals. Moreover, many of the desirable learning outcomes attributed to field-based education are not unique to immersion in remote inaccessible locations. Affective and cognitive gains may also result from social bonding through extended time with peers and mentors, creative synthesis of knowledge, project-based learning, and intercultural experience. Developing an inclusive course for the geoscience curriculum requires considering all learners, including different genders, ages, physical abilities, familial dynamics, and a multitude of other attributes. The Accessible Earth Study Abroad Program endeavors to provide geoscience students an inclusive capstone experience focusing on modern geophysical observation systems (satellite based observations and permanent networks of ground-based instruments), computational thinking and methods of

  3. Overdiagnosing of femoroacetabular impingement: correlation between clinical presentation and computed tomography in symptomatic patients

    Science.gov (United States)

    Canella, Richard Prazeres; Adam, Guilherme Pradi; de Castillo, Roberto André Ulhôa; Codonho, Daniel; Ganev, Gerson Gandhi; de Vicenzi, Luiz Fernando

    2016-01-01

    Objective To correlate the angles between the acetabulum and the proximal femur in symptomatic patients with femoroacetabular impingement (FAI), using computed tomography (CT). Methods We retrospectively evaluated 103 hips from 103 patients, using multislice CT to measure the acetabular coverage, acetabular version (in its supraequatorial portion and in its middle third), femoral neck version, cervical-diaphyseal and alpha angles, and the acetabular depth. For the statistical analysis, we used the Pearson correlation coefficient. Results There were inverse correlations between the following angles: (1) acetabular coverage versus alpha angle (p = 0.019); (2) acetabular version (supraequatorial) versus alpha angle (p = 0.049). For patients with femoral anteversion lower than 15 degrees: (1) acetabular version (supraequatorial) versus alpha angle (p = 0.026); (2) acetabular version (middle third) versus alpha angle (p = 0.02). For patients with acetabular version (supraequatorial) lower than 10 degrees: (1) acetabular version (supraequatorial) versus alpha angle (p = 0.004); (2) acetabular version (middle third) versus alpha angle (p = 0.009). Conclusion There was a statistically significant inverse correlation between the acetabular version and alpha angles (the smaller the acetabular anteversion angle was, the larger the alpha angle was) in symptomatic patients, thus supporting the hypothesis that FAI occurs when cam and pincer findings due to acetabular retroversion are seen simultaneously, and that the latter alone does not cause FAI, which leads to overdiagnosis in these cases. PMID:27069890

  4. Assessing motor imagery in brain-computer interface training: Psychological and neurophysiological correlates.

    Science.gov (United States)

    Vasilyev, Anatoly; Liburkina, Sofya; Yakovlev, Lev; Perepelkina, Olga; Kaplan, Alexander

    2017-03-01

    Motor imagery (MI) is considered to be a promising cognitive tool for improving motor skills as well as for rehabilitation therapy of movement disorders. It is believed that MI training efficiency could be improved by using brain-computer interface (BCI) technology providing real-time feedback on a person's mental attempts. While BCI is indeed a convenient and motivating tool for practicing MI, it is not clear whether it could be used for predicting or measuring the potential positive impact of the training. In this study, we try to establish whether proficiency in BCI control is associated with any of the neurophysiological or psychological correlates of motor imagery, as well as to determine possible interrelations among them. For that purpose, we studied motor imagery in a group of 19 healthy BCI-trained volunteers and performed a correlation analysis across various quantitative assessment metrics. We examined subjects' sensorimotor event-related EEG events, corticospinal excitability changes estimated with single-pulse transcranial magnetic stimulation (TMS), BCI accuracy, and self-assessment reports obtained with specially designed questionnaires and an interview routine. Our results showed, expectedly, that BCI performance depends on the subject's capability to suppress EEG sensorimotor rhythms, which in turn is correlated with the idle-state amplitude of those oscillations. Neither BCI accuracy nor the EEG features associated with MI were found to correlate with the level of corticospinal excitability increase during motor imagery, or with assessed imagery vividness. Finally, a significant correlation was found between the level of corticospinal excitability increase and kinesthetic vividness of imagery (KVIQ-20 questionnaire). Our results suggest that two distinct neurophysiological mechanisms might mediate possible effects of motor imagery: non-specific cortical sensorimotor disinhibition and a focal corticospinal excitability increase.

  5. Social Media Use and Access to Digital Technology in US Young Adults in 2016.

    Science.gov (United States)

    Villanti, Andrea C; Johnson, Amanda L; Ilakkuvan, Vinu; Jacobs, Megan A; Graham, Amanda L; Rath, Jessica M

    2017-06-07

    In 2015, 90% of US young adults with Internet access used social media. Digital and social media are highly prevalent modalities through which young adults explore identity formation, and by extension, learn and transmit norms about health and risk behaviors during this developmental life stage. The purpose of this study was to provide updated estimates of social media use from 2014 to 2016 and correlates of social media use and access to digital technology in data collected from a national sample of US young adults in 2016. Young adult participants aged 18-24 years in Wave 7 (October 2014, N=1259) and Wave 9 (February 2016, N=989) of the Truth Initiative Young Adult Cohort Study were asked about use frequency for 11 social media sites and access to digital devices, in addition to sociodemographic characteristics. Regular use was defined as using a given social media site at least weekly. Weighted analyses estimated the prevalence of use of each social media site, overlap between regular use of specific sites, and correlates of using a greater number of social media sites regularly. Bivariate analyses identified sociodemographic correlates of access to specific digital devices. In 2014, 89.42% (weighted n, 1126/1298) of young adults reported regular use of at least one social media site. This increased to 97.5% (weighted n, 965/989) of young adults in 2016. Among regular users of social media sites in 2016, the top five sites were Tumblr (85.5%), Vine (84.7%), Snapchat (81.7%), Instagram (80.7%), and LinkedIn (78.9%). Respondents reported regularly using an average of 7.6 social media sites, with 85% using 6 or more sites regularly. Overall, 87% of young adults reported access or use of a smartphone with Internet access, 74% a desktop or laptop computer with Internet access, 41% a tablet with Internet access, 29% a smart TV or video game console with Internet access, 11% a cell phone without Internet access, and 3% none of these. 
Access to all digital devices with

  6. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  7. Access to electronic health knowledge in five countries in Africa: a descriptive study

    Directory of Open Access Journals (Sweden)

    Honorati Masanja

    2007-05-01

    Full Text Available Abstract Background Access to medical literature in developing countries is helped by open access publishing and initiatives to allow free access to subscription-only journals. The effectiveness of these initiatives in Africa has not been assessed. This study describes awareness, reported use and factors influencing use of on-line medical literature via free access initiatives. Methods Descriptive study in four teaching hospitals in Cameroon, Nigeria, Tanzania and Uganda plus one externally funded research institution in The Gambia. Survey with postgraduate doctors and research scientists to determine Internet access patterns, reported awareness of on-line medical information and free access initiatives; semi-structured interviews with a sub-sample of survey participants to explore factors influencing use. Results In the four African teaching hospitals, 70% of the 305 postgraduate doctors reported textbooks as their main source of information; 66% had used the Internet for health information in the last week. In two hospitals, Internet cafés were the main Internet access point. For researchers at the externally funded research institution, electronic resources were their main source, and almost all had used the Internet in the last week. Across all 333 respondents, 90% had heard of PubMed, 78% of BMJ online, 49% the Cochrane Library, 47% HINARI, and 19% BioMedCentral. HINARI use correlates with accessing the Internet on computers located in institutions. Qualitative data suggested there are difficulties logging into HINARI and that sometimes it is librarians that limit access to passwords. Conclusion Textbooks remain an important resource for postgraduate doctors in training. Internet use is common, but awareness of free-access initiatives is limited. HINARI and other initiatives could be more effective with strong institutional endorsement and management to promote and ensure access.

  8. Providing the Public with Online Access to Large Bibliographic Data Bases.

    Science.gov (United States)

    Firschein, Oscar; Summit, Roger K.

    DIALOG, an interactive, computer-based information retrieval language, consists of a series of computer programs designed to make use of direct access memory devices in order to provide the user with a rapid means of identifying records within a specific memory bank. Using the system, a library user can be provided access to sixteen distinct and…

  9. DIRAC distributed computing services

    International Nuclear Information System (INIS)

    Tsaregorodtsev, A

    2014-01-01

    The DIRAC project provides a general-purpose framework for building distributed computing systems. It is now used in several HEP and astrophysics experiments, as well as by user communities in other scientific domains. There is considerable interest from smaller user communities in having a simple tool like DIRAC for accessing grid and other types of distributed computing resources. However, small experiments cannot afford to install and maintain dedicated services. Therefore, several grid infrastructure projects are providing DIRAC services for their respective user communities. These services are used for user tutorials as well as to help port applications to the grid for practical day-to-day work. The services typically give access to several grid infrastructures as well as to standalone computing clusters accessible to the target user communities. In this paper we present the experience of running DIRAC services provided by the France-Grilles NGI and other national grid infrastructure projects.

  10. Kymogram detection and kymogram-correlated image reconstruction from subsecond spiral computed tomography scans of the heart

    International Nuclear Information System (INIS)

    Kachelriess, Marc; Sennst, Dirk-Alexander; Maxlmoser, Wolfgang; Kalender, Willi A.

    2002-01-01

    Subsecond single-slice, multi-slice or cone-beam spiral computed tomography (SSCT, MSCT, CBCT) offer great potential for improving heart imaging. Together with the newly developed phase-correlated cardiac reconstruction algorithms 180 deg. MCD and 180 deg. MCI [Med. Phys. 27, 1881-1902 (2000)] or related algorithms provided by the CT manufacturers, high image quality can be achieved. These algorithms require information about the cardiac motion, i.e., typically the simultaneously recorded electrocardiogram (ECG), to synchronize the reconstruction with the cardiac motion. Neither data acquired without ECG information (standard patients) nor acquisitions with corrupted ECG information can be handled adequately. We developed a method to extract the appropriate information about cardiac motion directly from the measured raw data (projection data). The so-called kymogram function is a measure of the cardiac motion as a function of time t or as a function of the projection angle α. In contrast to the ECG which is a global measure of the heart's electric excitation, the kymogram is a local measure of the heart motion at the z-position z(α) at projection angle α. The patient's local heart rate as well as the necessary synchronization information to be used with phase-correlated algorithms can be extracted from the kymogram by using a series of signal processing steps. The kymogram information is shown to be adequate to substitute the ECG information. Computer simulations with simulated ECG and patient measurements with simultaneously acquired ECG were carried out for a multislice scanner providing M=4 slices to evaluate these new approaches. Both the ECG function and the kymogram function were used for reconstruction. Both were highly correlated regarding the periodicity information used for reconstruction. In 21 out of 25 consecutive cases the kymogram approach was equivalent to the ECG-correlated reconstruction; only minor differences in image quality between both
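    The kymogram idea, recovering a cardiac motion signal directly from the measured projection data rather than from a recorded ECG, can be illustrated with a toy sketch. This is not the authors' algorithm: the center-of-mass surrogate, the simulated Gaussian "heart", and all names are illustrative assumptions only.

```python
import numpy as np

def kymogram(projections):
    """Center of mass of each projection profile, used here as a raw
    per-angle motion signal. projections: array (n_angles, n_channels);
    row i is the attenuation profile at projection angle alpha_i."""
    channels = np.arange(projections.shape[1])
    mass = projections.sum(axis=1)
    return (projections * channels).sum(axis=1) / mass

# Simulated raw data: a Gaussian object whose center oscillates
# periodically across the detector while the scanner rotates.
n_angles, n_channels, period = 720, 128, 90   # period in projections
alphas = np.arange(n_angles)
center = 64 + 5 * np.sin(2 * np.pi * alphas / period)
profiles = np.exp(-0.5 * ((np.arange(n_channels) - center[:, None]) / 8) ** 2)

signal = kymogram(profiles)
# Recover the motion period from the dominant frequency of the signal;
# this periodicity is what a phase-correlated reconstruction would use.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
est_period = n_angles / np.argmax(spectrum)
```

    In the real method the signal is extracted from attenuation data of the moving heart region at the z-position z(α), followed by further signal-processing steps; the toy above only shows why a projection-domain signal can substitute for the ECG's periodicity information.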

  11. Imaginary time density-density correlations for two-dimensional electron gases at high density

    Energy Technology Data Exchange (ETDEWEB)

    Motta, M.; Galli, D. E. [Dipartimento di Fisica, Università degli Studi di Milano, Via Celoria 16, 20133 Milano (Italy); Moroni, S. [IOM-CNR DEMOCRITOS National Simulation Center and SISSA, Via Bonomea 265, 34136 Trieste (Italy); Vitali, E. [Department of Physics, College of William and Mary, Williamsburg, Virginia 23187-8795 (United States)

    2015-10-28

    We evaluate imaginary time density-density correlation functions for two-dimensional homogeneous electron gases of up to 42 particles in the continuum using the phaseless auxiliary field quantum Monte Carlo method. We use periodic boundary conditions and up to 300 plane waves as basis set elements. We show that such a methodology, once equipped with suitable numerical stabilization techniques necessary to deal with exponentials, products, and inversions of large matrices, gives access to the calculation of imaginary time correlation functions for medium-sized systems. We discuss the numerical stabilization techniques and the computational complexity of the methodology, and we present the limitations related to the size of the systems on a quantitative basis. We perform the inverse Laplace transform of the obtained density-density correlation functions, assessing the ability of the phaseless auxiliary field quantum Monte Carlo method to evaluate dynamical properties of medium-sized homogeneous fermion systems.

  12. A sparse autoencoder-based deep neural network for protein solvent accessibility and contact number prediction.

    Science.gov (United States)

    Deng, Lei; Fan, Chao; Zeng, Zhiwen

    2017-12-28

    Direct prediction of the three-dimensional (3D) structures of proteins from one-dimensional (1D) sequences is a challenging problem. Significant structural characteristics such as solvent accessibility and contact number are essential for deriving restraints in modeling protein folding and protein 3D structure. Thus, accurately predicting these features is a critical step for 3D protein structure building. In this study, we present DeepSacon, a computational method that can effectively predict protein solvent accessibility and contact number by using a deep neural network, which is built based on a stacked autoencoder and a dropout method. The results demonstrate that our proposed DeepSacon achieves a significant improvement in prediction quality compared with the state-of-the-art methods. We obtain 0.70 three-state accuracy for solvent accessibility, and 0.33 15-state accuracy and 0.74 Pearson Correlation Coefficient (PCC) for the contact number, on the 5729 monomeric soluble globular protein dataset. We also evaluate the performance on the CASP11 benchmark dataset, where DeepSacon achieves 0.68 three-state accuracy and 0.69 PCC for solvent accessibility and contact number, respectively. We have shown that DeepSacon can reliably predict solvent accessibility and contact number with a stacked sparse autoencoder and a dropout approach.
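    The dropout component mentioned above can be illustrated with a generic inverted-dropout sketch. This is a standard formulation, not DeepSacon's actual implementation; all names are illustrative.

```python
import numpy as np

def dropout(h, rate, rng, train=True):
    """Inverted dropout: during training, zero a fraction `rate` of
    hidden units at random and rescale the survivors by 1/(1-rate),
    so the expected activation is unchanged and no rescaling is
    needed at test time."""
    if not train or rate == 0.0:
        return h
    mask = rng.random(h.shape) >= rate
    return h * mask / (1.0 - rate)

rng = np.random.default_rng(42)
h = np.ones((100, 100))              # a batch of hidden activations
out = dropout(h, rate=0.5, rng=rng)  # roughly half zeroed, rest doubled
```

    In a stacked-autoencoder network, a step like this would be applied to each hidden layer's activations during training to reduce co-adaptation of units.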

  13. ARCAS (ACACIA Regional Climate-data Access System) -- a Web Access System for Climate Model Data Access, Visualization and Comparison

    Science.gov (United States)

    Hakkarinen, C.; Brown, D.; Callahan, J.; hankin, S.; de Koningh, M.; Middleton-Link, D.; Wigley, T.

    2001-05-01

    A Web-based access system to climate model output data sets for intercomparison and analysis has been produced, using the NOAA-PMEL developed Live Access Server software as host server and Ferret as the data serving and visualization engine. Called ARCAS ("ACACIA Regional Climate-data Access System"), and publicly accessible at http://dataserver.ucar.edu/arcas, the site currently serves climate model outputs from runs of the NCAR Climate System Model for the 21st century, for Business as Usual and Stabilization of Greenhouse Gas Emission scenarios. Users can select, download, and graphically display single variables or comparisons of two variables from either or both of the CSM model runs, averaged for monthly, seasonal, or annual time resolutions. The time length of the averaging period, and the geographical domain for download and display, are fully selectable by the user. A variety of arithmetic operations on the data variables can be computed "on-the-fly", as defined by the user. Expansions of the user-selectable options for defining analysis options, and for accessing other DODS-compatible (Distributed Oceanographic Data System) data sets, residing at locations other than the NCAR hardware server on which ARCAS operates, are planned for this year. These expansions are designed to allow users quick and easy-to-operate web-based access to the largest possible selection of climate model output data sets available throughout the world.

  14. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming. Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can and cannot do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational phenomena.

  15. Modeling binary correlated responses using SAS, SPSS and R

    CERN Document Server

    Wilson, Jeffrey R

    2015-01-01

    Statistical tools to analyze correlated binary data are spread out in the existing literature. This book makes these tools accessible to practitioners in a single volume. Chapters cover recently developed statistical tools and statistical packages that are tailored to analyzing correlated binary data. The authors showcase both traditional and new methods for application to health-related research. Data and computer programs will be publicly available in order for readers to replicate model development, but learning a new statistical language is not necessary with this book. The inclusion of code for R, SAS, and SPSS allows for easy implementation by readers. For readers interested in learning more about the languages, though, there are short tutorials in the appendix. Accompanying data sets are available for download through the book's website. Data analysis presented in each chapter will provide step-by-step instructions so these new methods can be readily applied to projects.  Researchers and graduate stu...

  16. Computer aided diagnosis system for Alzheimer disease using brain diffusion tensor imaging features selected by Pearson's correlation.

    Science.gov (United States)

    Graña, M; Termenon, M; Savio, A; Gonzalez-Pinto, A; Echeveste, J; Pérez, J M; Besga, A

    2011-09-20

    The aim of this paper is to obtain discriminant features from two scalar measures of Diffusion Tensor Imaging (DTI) data, Fractional Anisotropy (FA) and Mean Diffusivity (MD), and to train and test classifiers able to discriminate Alzheimer's Disease (AD) patients from controls on the basis of features extracted from the FA or MD volumes. In this study, a support vector machine (SVM) classifier was trained and tested on FA and MD data. Feature selection is done by computing the Pearson's correlation between FA or MD values at each voxel site across subjects and the indicative variable specifying the subject class. Voxel sites with high absolute correlation are selected for feature extraction. Results are obtained from an ongoing study in Hospital de Santiago Apostol collecting anatomical T1-weighted MRI volumes and DTI data from healthy control subjects and AD patients. FA features and a linear SVM classifier achieve perfect accuracy, sensitivity and specificity in several cross-validation studies, supporting the usefulness of DTI-derived features as an image marker for AD and the feasibility of building Computer Aided Diagnosis systems for AD based on them. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
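    The voxel-wise selection step described here, ranking voxels by the absolute Pearson correlation between their FA/MD values and the class label, can be sketched as follows. This is a minimal illustration on synthetic data, not the study's pipeline; the array shapes and names are assumptions.

```python
import numpy as np

def pearson_feature_selection(X, y, k):
    """Rank features by |Pearson r| against the class label, keep top k.
    X: (n_subjects, n_voxels) values (e.g. FA or MD per voxel);
    y: per-subject class indicator (0 = control, 1 = patient)."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum()) + 1e-12)
    return np.argsort(-np.abs(r))[:k]

rng = np.random.default_rng(0)
y = np.array([0] * 10 + [1] * 10)       # 10 controls, 10 patients
X = rng.normal(size=(20, 100))          # 100 synthetic "voxels"
X[:, 7] += 3 * y                        # voxel 7 made class-dependent
selected = pearson_feature_selection(X, y.astype(float), 5)
```

    The selected voxel values would then be stacked into a feature vector per subject and passed to the SVM classifier.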

  17. Correlator receiver architecture with PnpN optical thyristor operating as optical hard-limiter

    Science.gov (United States)

    Kang, Tae-Gu; Ho Lee, Su; Park, Soonchul

    2011-07-01

    We propose a novel correlator receiver architecture with a PnpN optical thyristor operating as an optical hard-limiter, and demonstrate the multiple-access interference rejection of the proposed correlator receiver. The proposed correlator receiver is composed of a 1×2 splitter, an optical delay line, a 2×1 combiner, and a fabricated PnpN optical thyristor. The proposed correlator receiver enhances system performance because it prevents some combinations of multiple-access interference patterns from causing errors, as they do in optical code-division multiple-access systems with the conventional optical receivers shown in previous works. It is found that the proposed correlator receiver can fully reject the interference signals generated by the decoding processing and multiple access for two simultaneous users.
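    The benefit of placing an optical hard-limiter in front of the correlator can be seen in a toy chip-level simulation. The codes, the idealized unit-threshold limiter, and the scenario below are illustrative assumptions, not the fabricated PnpN device or the paper's codes.

```python
def hard_limit(chips):
    """Idealized optical hard-limiter: any chip power >= 1 is clipped
    to 1, so interferers cannot pile up power on a single chip."""
    return [1 if c >= 1 else 0 for c in chips]

def correlate(received, code):
    """Correlator output: total power collected at the desired user's
    code positions (splitter + delay lines + combiner, schematically)."""
    return sum(r for r, c in zip(received, code) if c)

code_a = [1, 0, 1, 0, 0, 1, 0]   # desired user's address code (weight 3)
code_b = [0, 1, 1, 0, 1, 0, 0]   # interfering user's code

# The desired user sends '0' while the interferer's pulses arrive with
# double power on its chips (e.g. two overlapping interfering pulses).
received = [a + 2 * b for a, b in zip([0] * 7, code_b)]

without_limiter = correlate(received, code_a)           # interference counted twice
with_limiter = correlate(hard_limit(received), code_a)  # clipped to at most 1 per chip
```

    With a decision threshold equal to the code weight (3 here), the clipped correlator output stays further below threshold, which is how the hard-limiter excludes interference patterns that would otherwise cause errors.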

  18. GPUs: An Emerging Platform for General-Purpose Computation

    Science.gov (United States)

    2007-08-01

    [Abstract garbled in extraction; recoverable fragments cite Folding@home (folding.stanford.edu), Fan et al., "GPU Cluster for High Performance Computing" (ACM/IEEE), and Goodnight et al., "Computation on Programmable Graphics Hardware" (IEEE Computer Graphics and Applications).]

  19. Cloud Computing: Exploring the scope

    OpenAIRE

    Maurya, Brajesh Kumar

    2010-01-01

    Cloud computing refers to a paradigm shift in overall IT solutions, raising accessibility, scalability and effectiveness through its enabling technologies. However, the cost benefits and performance of migrated cloud platforms and services are neither clear nor well summarized. Globalization and the recessionary economic times have not only raised the bar for better IT delivery models but have also given access to technology-enabled services via the Internet. Cloud computing has va...

  20. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  1. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment makes this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  2. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  3. Bridging the digital divide: mobile access to personal health records among patients with diabetes.

    Science.gov (United States)

    Graetz, Ilana; Huang, Jie; Brand, Richard J; Hsu, John; Yamin, Cyrus K; Reed, Mary E

    2018-01-01

    Some patients lack regular computer access and experience a digital divide that causes them to miss internet-based health innovations. The diffusion of smartphones has increased internet access across the socioeconomic spectrum, and increasing the channels through which patients can access their personal health records (PHRs) could help bridge the divide in PHR use. We examined PHR use through a computer-based Web browser or mobile device. Cross-sectional historical cohort analysis. Among adult patients in the diabetes registry of an integrated healthcare delivery system, we studied the devices used to access their PHR during 2016. Among 267,208 patients with diabetes, 68.1% used the PHR in 2016; 60.6% of all log-ins were via computer and 39.4% were via mobile device. Overall, 63.9% used it from both a computer and mobile device, 29.6% used only a computer, and 6.5% used only a mobile device. After adjustment, patients who were black, Hispanic, or Asian; lived in lower socioeconomic status (SES) neighborhoods; or had lower engagement were all significantly more likely to use the PHR only from a mobile device. Mobile access to PHRs may help bridge the digital divide in computer use, disproportionately reaching racial/ethnic minorities and lower SES patients. Nonetheless, even with a mobile-optimized and app-accessible PHR, differences in PHR use by race/ethnicity and SES remain. Continued efforts are needed to increase equitable access to PHRs among patients with chronic conditions.

  4. Correlation of pulmonary function and usual interstitial pneumonia computed tomography patterns in idiopathic pulmonary fibrosis.

    Science.gov (United States)

    Arcadu, Antonella; Byrne, Suzanne C; Pirina, Pietro; Hartman, Thomas E; Bartholmai, Brian J; Moua, Teng

    2017-08-01

    Little is known about presenting 'inconsistent' or 'possible' usual interstitial pneumonia (UIP) computed tomography (CT) patterns advancing to 'consistent' UIP as disease progresses in idiopathic pulmonary fibrosis (IPF). We hypothesized that if 'consistent' UIP represented more advanced disease, such a pattern on presentation should also correlate with more severe pulmonary function test (PFT) abnormalities. Consecutive IPF patients (2005-2013) diagnosed by international criteria with baseline PFT and CT were included. Presenting CTs were assessed by three expert radiologists for consensus UIP pattern ('consistent', 'possible', and 'inconsistent'). Approximation of individual and combined interstitial abnormalities was also performed, with interstitial abnormalities and UIP CT pattern correlated with PFT findings and survival. Three hundred and fifty patients (70% male) were included, with a mean age of 68.3 years. Mean percent predicted forced vital capacity (FVC%) and diffusion capacity (DLCO%) were 64% and 45.5%, respectively. Older age and male gender correlated more with a 'consistent' UIP CT pattern. FVC% was not associated with any UIP pattern but did correlate with the total volume of radiologist-assessed interstitial abnormalities. DLCO% was lower in those with a 'consistent' UIP pattern. A 'consistent' UIP CT pattern was also not independently predictive of survival after correction for age, gender, FVC%, and DLCO%. PFT findings appear to correlate with the extent of radiologic disease but not with specific morphologic patterns. Whether such UIP patterns represent different stages of disease severity or radiologic progression is not supported by coinciding pulmonary function decline. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Protein Solvent-Accessibility Prediction by a Stacked Deep Bidirectional Recurrent Neural Network

    Directory of Open Access Journals (Sweden)

    Buzhong Zhang

    2018-05-01

    Full Text Available Residue solvent accessibility is closely related to the spatial arrangement and packing of residues. Predicting the solvent accessibility of a protein is an important step to understand its structure and function. In this work, we present a deep learning method to predict residue solvent accessibility, which is based on a stacked deep bidirectional recurrent neural network applied to sequence profiles. To capture more long-range sequence information, a merging operator was proposed when bidirectional information from hidden nodes was merged for outputs. Three types of merging operators were used in our improved model, with a long short-term memory network performing as a hidden computing node. The trained database was constructed from 7361 proteins extracted from the PISCES server using a cut-off of 25% sequence identity. Sequence-derived features including position-specific scoring matrix, physical properties, physicochemical characteristics, conservation score and protein coding were used to represent a residue. Using this method, predictive values of continuous relative solvent-accessible area were obtained, and then, these values were transformed into binary states with predefined thresholds. Our experimental results showed that our deep learning method improved prediction quality relative to current methods, with mean absolute error and Pearson’s correlation coefficient values of 8.8% and 74.8%, respectively, on the CB502 dataset and 8.2% and 78%, respectively, on the Manesh215 dataset.

  6. Protein Solvent-Accessibility Prediction by a Stacked Deep Bidirectional Recurrent Neural Network.

    Science.gov (United States)

    Zhang, Buzhong; Li, Linqing; Lü, Qiang

    2018-05-25

    Residue solvent accessibility is closely related to the spatial arrangement and packing of residues. Predicting the solvent accessibility of a protein is an important step to understand its structure and function. In this work, we present a deep learning method to predict residue solvent accessibility, which is based on a stacked deep bidirectional recurrent neural network applied to sequence profiles. To capture more long-range sequence information, a merging operator was proposed when bidirectional information from hidden nodes was merged for outputs. Three types of merging operators were used in our improved model, with a long short-term memory network performing as a hidden computing node. The trained database was constructed from 7361 proteins extracted from the PISCES server using a cut-off of 25% sequence identity. Sequence-derived features including position-specific scoring matrix, physical properties, physicochemical characteristics, conservation score and protein coding were used to represent a residue. Using this method, predictive values of continuous relative solvent-accessible area were obtained, and then, these values were transformed into binary states with predefined thresholds. Our experimental results showed that our deep learning method improved prediction quality relative to current methods, with mean absolute error and Pearson's correlation coefficient values of 8.8% and 74.8%, respectively, on the CB502 dataset and 8.2% and 78%, respectively, on the Manesh215 dataset.
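    The abstract above names three merging operators for combining bidirectional information without specifying them. Common choices for merging the forward and backward hidden states of a bidirectional recurrent layer are concatenation, summation, and averaging, sketched below; which operators the authors actually used is an assumption here.

```python
import numpy as np

def merge(h_fwd, h_bwd, op="concat"):
    """Combine per-residue forward and backward hidden states of a
    bidirectional recurrent layer into one output vector per residue."""
    if op == "concat":
        return np.concatenate([h_fwd, h_bwd], axis=-1)
    if op == "sum":
        return h_fwd + h_bwd
    if op == "average":
        return (h_fwd + h_bwd) / 2.0
    raise ValueError(f"unknown merge operator: {op}")

# 10 residues, 64 hidden units per direction (illustrative sizes)
h_f = np.ones((10, 64))
h_b = np.zeros((10, 64))
merged = merge(h_f, h_b, "concat")   # doubles the feature dimension
```

    Note that concatenation doubles the per-residue feature dimension, while sum and average keep it fixed; this changes the size of the next layer's weights.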

  7. Biometrics: Accessibility challenge or opportunity?

    Science.gov (United States)

    Blanco-Gonzalo, Ramon; Lunerti, Chiara; Sanchez-Reillo, Raul; Guest, Richard Michael

    2018-01-01

    Biometric recognition is currently implemented in several authentication contexts, most recently in mobile devices where it is expected to complement or even replace traditional authentication modalities such as PIN (Personal Identification Number) or passwords. The assumed convenience characteristics of biometrics are transparency, reliability and ease-of-use, however, the question of whether biometric recognition is as intuitive and straightforward to use is open to debate. Can biometric systems make some tasks easier for people with accessibility concerns? To investigate this question, an accessibility evaluation of a mobile app was conducted where test subjects withdraw money from a fictitious ATM (Automated Teller Machine) scenario. The biometric authentication mechanisms used include face, voice, and fingerprint. Furthermore, we employed traditional modalities of PIN and pattern in order to check if biometric recognition is indeed a real improvement. The trial test subjects within this work were people with real-life accessibility concerns. A group of people without accessibility concerns also participated, providing a baseline performance. Experimental results are presented concerning performance, HCI (Human-Computer Interaction) and accessibility, grouped according to category of accessibility concern. Our results reveal links between individual modalities and user category establishing guidelines for future accessible biometric products.

  8. Biometrics: Accessibility challenge or opportunity?

    Science.gov (United States)

    Lunerti, Chiara; Sanchez-Reillo, Raul; Guest, Richard Michael

    2018-01-01

    Biometric recognition is currently implemented in several authentication contexts, most recently in mobile devices where it is expected to complement or even replace traditional authentication modalities such as PIN (Personal Identification Number) or passwords. The assumed convenience characteristics of biometrics are transparency, reliability and ease-of-use, however, the question of whether biometric recognition is as intuitive and straightforward to use is open to debate. Can biometric systems make some tasks easier for people with accessibility concerns? To investigate this question, an accessibility evaluation of a mobile app was conducted where test subjects withdraw money from a fictitious ATM (Automated Teller Machine) scenario. The biometric authentication mechanisms used include face, voice, and fingerprint. Furthermore, we employed traditional modalities of PIN and pattern in order to check if biometric recognition is indeed a real improvement. The trial test subjects within this work were people with real-life accessibility concerns. A group of people without accessibility concerns also participated, providing a baseline performance. Experimental results are presented concerning performance, HCI (Human-Computer Interaction) and accessibility, grouped according to category of accessibility concern. Our results reveal links between individual modalities and user category establishing guidelines for future accessible biometric products. PMID:29565989

  9. Neural Correlates of User-initiated Motor Success and Failure - A Brain-Computer Interface Perspective.

    Science.gov (United States)

    Yazmir, Boris; Reiner, Miriam

    2018-05-15

    Any motor action is, by nature, potentially accompanied by human errors. In order to facilitate development of error-tailored Brain-Computer Interface (BCI) correction systems, we focused on internal, human-initiated errors, and investigated EEG correlates of user outcome successes and errors during a continuous 3D virtual tennis game against a computer player. We used a multisensory, 3D, highly immersive environment. Missing and repelling the tennis ball were considered, as 'error' (miss) and 'success' (repel). Unlike most previous studies, where the environment "encouraged" the participant to perform a mistake, here errors happened naturally, resulting from motor-perceptual-cognitive processes of incorrect estimation of the ball kinematics, and can be regarded as user internal, self-initiated errors. Results show distinct and well-defined Event-Related Potentials (ERPs), embedded in the ongoing EEG, that differ across conditions by waveforms, scalp signal distribution maps, source estimation results (sLORETA) and time-frequency patterns, establishing a series of typical features that allow valid discrimination between user internal outcome success and error. The significant delay in latency between positive peaks of error- and success-related ERPs, suggests a cross-talk between top-down and bottom-up processing, represented by an outcome recognition process, in the context of the game world. Success-related ERPs had a central scalp distribution, while error-related ERPs were centro-parietal. The unique characteristics and sharp differences between EEG correlates of error/success provide the crucial components for an improved BCI system. The features of the EEG waveform can be used to detect user action outcome, to be fed into the BCI correction system. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.

  10. The design and implementation of access control management system in IHEP network

    International Nuclear Information System (INIS)

    Wang Yanming; An Dehai; Qi Fazhi

    2010-01-01

    In the campus network environment of the Institute of High Energy Physics, where the number of network devices and computers is large, verifying the access validity of network devices and users' computers and effectively controlling exceptional network communication are the technological means of keeping the network running normally. The access control system for the campus network of the Institute of High Energy Physics uses a MySQL database as the back end, with a front-end interface developed in CGI, PHP, and HTML. The system provides user information management, user computer access control, suppression of exceptional network communication, and alarm functions, increasing the effectiveness of network management and ensuring that the campus network runs safely and reliably. (authors)
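    The core admission decision of such a system reduces to looking up the requesting device in a registry of known devices. The sketch below is schematic only; the registry layout and all names are invented and do not reflect IHEP's actual MySQL schema.

```python
# Schematic stand-in for the database-backed device registry described
# above. Keys are device MAC addresses; all entries are invented.
registry = {
    "00:1a:2b:3c:4d:5e": {"user": "alice", "allowed": True},
    "00:1a:2b:3c:4d:5f": {"user": "bob",   "allowed": False},  # blocked
}

def admit(mac):
    """Admit a device onto the network only if it is registered
    and not currently blocked; unknown devices are rejected."""
    entry = registry.get(mac)
    return entry is not None and entry["allowed"]
```

    In a deployed system the lookup would be a database query, and a rejected or exceptional device would additionally trigger the alarm function mentioned in the abstract.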

  11. Task-role-based Access Control Model in Smart Health-care System

    OpenAIRE

    Wang Peng; Jiang Lingyun

    2015-01-01

    With the development of computer science and smart health-care technology, there is a trend for patients to enjoy medical care at home. Given the enormous number of users in the Smart Health-care System, access control is an important issue. Traditional access control models (discretionary access control, mandatory access control, and role-based access control) do not properly reflect the characteristics of the Smart Health-care System. This paper proposes an advanced access control model for...

  12. Student Engagement in a Computer Rich Science Classroom

    Science.gov (United States)

    Hunter, Jeffrey C.

    The purpose of this study was to examine the student lived experience when using computers in a rural science classroom. The overarching question the project sought to examine was: how do rural students relate to computers as a learning tool, in comparison to a traditional science classroom? Participant data were collected using a pre-study survey, experience sampling during class, and post-study interviews. Students want to use computers in their classrooms: they overwhelmingly preferred a computer-rich classroom (75%) to a traditional classroom (25%), and reported a higher level of engagement in classes that use technology/computers (83%) versus those that do not (17%). A computer-rich classroom increased student control and motivation, as reflected by a participant who shared: "by using computers I was more motivated to get the work done" (Maggie, April 25, 2014, survey). The researcher explored a rural school environment; rural populations represent a large number of students and appear to be underrepresented in current research. The participants, tenth-grade Biology students, were sampled in a traditional teacher-led class without computers for one week, followed by a week using computers daily. The data supported that there is a new gap that separates students: a device divide. This divide separates those who have access to devices robust enough to do high-level class work from those who do not. Although cellular phones have reduced the number of students who cannot access the Internet, they may have created a false feeling that access to a computer is no longer necessary at home. As this study shows, although most students have Internet access, fewer have access to a device that enables them to complete rigorous class work at home. Participants received little or no training at school in the proper, safe use of a computer and the Internet. It is clear that the majority of students are self-taught or receive guidance

  13. Optimal usage of computing grid network in the fields of nuclear fusion computing task

    International Nuclear Information System (INIS)

    Tenev, D.

    2006-01-01

    Nowadays nuclear power is becoming a main source of energy. To make its usage more efficient, scientists have created complicated simulation models, which require powerful computers. Grid computing is the answer to the need for powerful and accessible computing resources. The article examines and estimates the optimal configuration of the grid environment for complicated nuclear fusion computing tasks. (author)

  14. Multichannel analyzer using the direct-memory-access channel in a personal computer; Mnogokanal`nyj analizator v personal`nom komp`yutere, ispol`zuyushchij kanal pryamogo dostupa k pamyati

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, G; Vankov, I; Dimitrov, L [Inst. Yadernykh Issledovanij i Yadernoj Ehnergetiki Bolgarskoj Akademii Nauk, Sofiya (Bulgaria); Peev, I [Firma TOIVEL, Sofiya (Bulgaria)

    1996-12-31

    The paper describes a multichannel analyzer for spectrometry data, developed on the basis of personal computer memory and a direct-memory-access channel. The analyzer software, comprising a driver and a spectrum display control program, is described. 2 figs.

  15. Fast Computing for Distance Covariance

    OpenAIRE

    Huo, Xiaoming; Szekely, Gabor J.

    2014-01-01

    Distance covariance and distance correlation have been widely adopted for measuring the dependence of a pair of random variables or random vectors. If the computation of distance covariance and distance correlation is implemented directly according to its definition, its computational complexity is O($n^2$), a disadvantage compared to other, faster methods. In this paper we show that the computation of distance covariance and distance correlation of real-valued random variables can be...
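
    For reference, the direct O(n^2) computation that the paper improves upon can be sketched as follows; this is a generic implementation of the textbook definition, not the authors' fast algorithm, which avoids forming the two n-by-n distance matrices explicitly.

```python
import numpy as np

def distance_correlation(x, y):
    """Direct O(n^2) sample distance correlation of two real-valued samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Pairwise distance matrices.
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    # Double centering: subtract row and column means, add back the grand mean.
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()          # squared sample distance covariance
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

x = np.linspace(0, 1, 100)
print(distance_correlation(x, 2 * x + 1))   # perfectly linear relation: ≈ 1.0
```
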

  16. LHCb Distributed Data Analysis on the Computing Grid

    CERN Document Server

    Paterson, S; Parkes, C

    2006-01-01

    LHCb is one of the four Large Hadron Collider (LHC) experiments based at CERN, the European Organisation for Nuclear Research. The LHC experiments will start taking an unprecedented amount of data when they come online in 2007. Since no single institute has the compute resources to handle this data, resources must be pooled to form the Grid. Where the Internet has made it possible to share information stored on computers across the world, Grid computing aims to provide access to computing power and storage capacity on geographically distributed systems. LHCb software applications must work seamlessly on the Grid allowing users to efficiently access distributed compute resources. It is essential to the success of the LHCb experiment that physicists can access data from the detector, stored in many heterogeneous systems, to perform distributed data analysis. This thesis describes the work performed to enable distributed data analysis for the LHCb experiment on the LHC Computing Grid.

  17. Morphological measurements in computed tomography correlate with airflow obstruction in chronic obstructive pulmonary disease: systematic review and meta-analysis

    International Nuclear Information System (INIS)

    Xie, XueQian; Oudkerk, Matthijs; Vliegenthart, Rozemarijn; Jong, Pim A. de; Wang, Ying; Hacken, Nick H.T. ten; Miao, Jingtao; Zhang, GuiXiang; Bock, Geertruida H. de

    2012-01-01

    To determine the correlation between CT measurements of emphysema or peripheral airways and airflow obstruction in chronic obstructive pulmonary disease (COPD), PubMed, Embase and Web of Knowledge were searched from 1976 to 2011. Two reviewers independently screened 1,763 citations to identify articles that correlated CT measurements with airflow obstruction parameters of the pulmonary function test in COPD patients, rated study quality and extracted information. Three CT measurements were assessed, including lung attenuation area percentage; the airflow obstruction parameters were FEV 1 as a percentage of predicted (FEV 1 %pred) and FEV 1 divided by the forced vital capacity. Seventy-nine articles (9,559 participants) were included in the systematic review, demonstrating different methodologies, measurements and CT-airflow obstruction correlations. Fifteen high-quality articles (2,095 participants) entered the meta-analysis. The absolute pooled correlation coefficients ranged from 0.48 (95 % CI, 0.40 to 0.54) to 0.65 (0.58 to 0.71) for inspiratory CT and from 0.64 (0.53 to 0.72) to 0.73 (0.63 to 0.80) for expiratory CT. CT measurements of emphysema or peripheral airways are significantly related to airflow obstruction in COPD patients. CT provides a morphological method to investigate airway obstruction in COPD. (orig.)
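
    The pooling step in such a meta-analysis is commonly done by Fisher z-transforming the per-study correlation coefficients and averaging with inverse-variance weights. A sketch with made-up study values (the numbers below are illustrative, not the review's data):

```python
import numpy as np

# Hypothetical per-study correlations and sample sizes (not the paper's data).
r = np.array([0.55, 0.62, 0.48, 0.70])
n = np.array([120, 85, 200, 60])

# Fisher z-transform; the sampling variance of z is approximately 1/(n-3),
# so inverse-variance weights are simply n - 3.
z = np.arctanh(r)
w = n - 3
z_pooled = np.sum(w * z) / np.sum(w)
se = 1 / np.sqrt(np.sum(w))

# Back-transform the pooled estimate and its 95% confidence limits.
r_pooled = np.tanh(z_pooled)
ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
print(f"pooled r = {r_pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```
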

  18. Maxillary sinus augmentation by crestal access: a retrospective study on cavity size and outcome correlation.

    Science.gov (United States)

    Spinato, Sergio; Bernardello, Fabio; Galindo-Moreno, Pablo; Zaffe, Davide

    2015-12-01

    Cone-beam computed tomography (CBCT) and radiographic outcomes of crestal sinus elevation, performed using mineralized human bone allograft, were analyzed to correlate results with maxillary sinus size. A total of 60 sinus augmentations in 60 patients, with initial bone height ≤5 mm, were performed. Digital radiographs were taken from the time of surgical implant placement up to the post-prosthetic-loading follow-up (12-72 months), when CBCT evaluation was carried out. Marginal bone loss (MBL) was radiographically analyzed at 6 months and at the post-loading follow-up. Sinus size (BPD), implant distance from the palatal (PID) and buccal wall (BID), and absence of bone coverage of the implant (intra-sinus bone loss, IBL) were measured and evaluated statistically by ANOVA and linear regression analyses. MBL increased as a function of time, and MBL at final follow-up was statistically associated with MBL at 6 months. A statistically significant correlation of IBL with wall distance, and of IBL/mm with time, was identified, with greater values in wide sinuses (WS ≥ 13.27 mm) than in narrow sinuses (NS < 13.27 mm). This study is the first quantitative and statistically significant confirmation that the crestal technique with residual ridge height <5 mm is more appropriate and predictable, in terms of intra-sinus bone coverage, in narrow sinuses than in wide ones. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  19. Image-Guided Radiotherapy for Liver Cancer Using Respiratory-Correlated Computed Tomography and Cone-Beam Computed Tomography

    International Nuclear Information System (INIS)

    Guckenberger, Matthias; Sweeney, Reinhart A.; Wilbert, Juergen; Krieger, Thomas; Richter, Anne; Baier, Kurt; Mueller, Gerd; Sauer, Otto; Flentje, Michael

    2008-01-01

    Purpose: To evaluate a novel four-dimensional (4D) image-guided radiotherapy (IGRT) technique in stereotactic body RT for liver tumors. Methods and Materials: For 11 patients with 13 intrahepatic tumors, a respiratory-correlated 4D computed tomography (CT) scan was acquired at treatment planning. The target was defined using CT series reconstructed at end-inhalation and end-exhalation. The liver was delineated on these two CT series and served as a reference for image guidance. A cone-beam CT scan was acquired after patient positioning; the blurred diaphragm dome was interpreted as a probability density function showing the motion range of the liver. Manual contour matching of the liver structures from the planning 4D CT scan with the cone-beam CT scan was performed. Inter- and intrafractional uncertainties of target position and motion range were evaluated, and interobserver variability of the 4D-IGRT technique was tested. Results: The workflow of 4D-IGRT was successfully practiced in all patients. The absolute error in the liver position and the error in relation to the bony anatomy were 8 ± 4 mm and 5 ± 2 mm (three-dimensional vector), respectively. Margins of 4-6 mm were calculated for compensation of the intrafractional drifts of the liver. The motion range of the diaphragm dome was reproducible within 5 mm for 11 of 13 lesions, and the interobserver variability of the 4D-IGRT technique was small (standard deviation, 1.5 mm). In 4 patients, the position of the intrahepatic lesion was directly verified using a mobile in-room CT scanner after application of intravenous contrast. Conclusion: The results of our study have shown that 4D image guidance using liver contour matching between respiratory-correlated CT and cone-beam CT scans increased the accuracy compared with stereotactic positioning and with IGRT without consideration of breathing motion.

  20. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    Science.gov (United States)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems in a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) between the time-lead and time-lag portions of the signal causes tracking bias or instability in traditional correlating tracking loops such as the delay lock loop (DLL) or the modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock and near-far resistance than other tracking schemes, including the traditional DLL (T-DLL), the traditional MCTL (T-MCTL) and the modified de-correlated DLL (MD-DLL).
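
    The core de-correlating idea can be illustrated on synchronous CDMA detection: the conventional correlator outputs are corrupted by MAI through the code cross-correlation matrix R, and multiplying by R^-1 removes that interference exactly, at the price of some noise enhancement. A minimal sketch, using short illustrative codes rather than Gold codes and bit detection rather than the code-tracking loop itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Chips x users: three synchronous users with short, non-orthogonal +/-1 codes
# (illustrative codes, not the Gold codes used in the paper).
S = np.array([[ 1,  1,  1],
              [ 1, -1,  1],
              [ 1,  1, -1],
              [ 1, -1, -1],
              [ 1,  1,  1],
              [ 1, -1,  1],
              [ 1,  1, -1]], dtype=float)
R = S.T @ S                                   # code cross-correlation matrix

bits = np.array([1.0, -1.0, -1.0])
amplitudes = np.array([1.0, 5.0, 5.0])        # strong interferers: near-far scenario
received = S @ (amplitudes * bits) + rng.normal(0, 0.2, S.shape[0])

matched = S.T @ received                      # conventional correlator outputs
decorrelated = np.linalg.solve(R, matched)    # applying R^-1 removes the MAI

print("conventional signs: ", np.sign(matched))      # weak user's bit flipped by MAI
print("de-correlated signs:", np.sign(decorrelated)) # matches the transmitted bits
```
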

  1. Access Control Based on Trail Inference

    Directory of Open Access Journals (Sweden)

    ALBARELO, P. C.

    2015-06-01

    Full Text Available Professionals constantly seek qualifications and consequently increase their knowledge in their areas of expertise. It is therefore interesting to develop a computer system that knows its users and their work history: using this information, even in the case of a change of professional role, the system could renew authorization for activities based on previously authorized use. This article proposes a model for user access control embedded in a context-aware environment. The model applies the concept of trails to manage access control, recording activity usage in contexts and applying this history as a criterion for granting new accesses. Although previous related research works consider contexts, none of them uses the concept of trails; hence, the main contribution of this work is the use of a new access control criterion, namely the history of previous accesses (trails). A prototype was implemented and applied in a scenario-based evaluation. The results demonstrate the feasibility of the proposal, giving access control systems an alternative way to support access rights.
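
    The trail criterion can be sketched as a simple fallback rule in an access-control check. All names below are illustrative and not taken from the paper's model:

```python
from dataclasses import dataclass, field

@dataclass
class Trail:
    entries: list = field(default_factory=list)     # (context, activity) pairs

    def record(self, context, activity):
        self.entries.append((context, activity))

    def supports(self, context, activity):
        return (context, activity) in self.entries

def authorize(trails, user, context, activity, role_permissions, role):
    """Grant if the current role permits the activity, or if the user's trail
    shows prior authorized use of the same activity in the same context."""
    if activity in role_permissions.get(role, set()):
        trails.setdefault(user, Trail()).record(context, activity)
        return True
    trail = trails.get(user)
    return trail is not None and trail.supports(context, activity)

trails = {}
roles = {"nurse": {"read_chart"}, "clerk": set()}

print(authorize(trails, "ana", "ward-3", "read_chart", roles, "nurse"))  # True: role grants it
# After a role change, the recorded trail still supports renewed authorization.
print(authorize(trails, "ana", "ward-3", "read_chart", roles, "clerk"))  # True: via trail
print(authorize(trails, "bob", "ward-3", "read_chart", roles, "clerk"))  # False: no history
```
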

  2. Cloud Computing Cryptography "State-of-the-Art"

    OpenAIRE

    Omer K. Jasim; Safia Abbas; El-Sayed M. El-Horbaty; Abdel-Badeeh M. Salem

    2013-01-01

    Cloud computing technology is very useful in present-day life: it uses the Internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, end users' data files can be accessed and manipulated from any other computer using Internet services. Despite the flexibility of data and application access and usage that cloud computing e...

  3. Restricted access processor - An application of computer security technology

    Science.gov (United States)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  4. Remote direct memory access

    Science.gov (United States)

    Archer, Charles J.; Blocksome, Michael A.

    2012-12-11

    Methods, parallel computers, and computer program products are disclosed for remote direct memory access. Embodiments include transmitting, from an origin DMA engine on an origin compute node to a plurality of target DMA engines on target compute nodes, a request-to-send message specifying data to be transferred from the origin DMA engine to data storage on each target compute node; receiving, by each target DMA engine on each target compute node, the request-to-send message; preparing, by each target DMA engine, to store data according to the data storage reference and the data length, including assigning a base storage address for the data storage reference; sending, by one or more of the target DMA engines, an acknowledgment message acknowledging that all the target DMA engines are prepared to receive a data transmission from the origin DMA engine; receiving, by the origin DMA engine, the acknowledgement message from the one or more target DMA engines; and transferring, by the origin DMA engine, data to data storage on each of the target compute nodes according to the data storage reference using a single direct put operation.
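
    The claimed sequence of steps can be illustrated with a toy in-process simulation. Class and method names here are invented for illustration; a real implementation runs on DMA hardware across compute nodes:

```python
# Toy simulation of the request-to-send / acknowledge / direct-put sequence.

class TargetDMA:
    def __init__(self):
        self.storage = {}
        self.base = None

    def on_request_to_send(self, data_ref, data_len):
        # Prepare to store: assign a base address and buffer for the reference.
        self.base = 0x1000
        self.storage[data_ref] = bytearray(data_len)
        return "ack"                      # acknowledge readiness

    def on_direct_put(self, data_ref, payload):
        self.storage[data_ref][:] = payload

class OriginDMA:
    def broadcast(self, targets, data_ref, payload):
        # 1. Send the request-to-send message to every target DMA engine.
        acks = [t.on_request_to_send(data_ref, len(payload)) for t in targets]
        # 2. Proceed only once all targets acknowledge they are prepared.
        assert all(a == "ack" for a in acks)
        # 3. Transfer the data with a single direct put per target.
        for t in targets:
            t.on_direct_put(data_ref, payload)

targets = [TargetDMA() for _ in range(3)]
OriginDMA().broadcast(targets, "msg0", b"payload")
print(all(bytes(t.storage["msg0"]) == b"payload" for t in targets))   # True
```
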

  5. Effective correlator for RadioAstron project

    Science.gov (United States)

    Sergeev, Sergey

    This paper presents the implementation of a software FX correlator for Very Long Baseline Interferometry, adapted for the RadioAstron project. The correlator is implemented for heterogeneous computing systems using graphics accelerators, and it is shown that graphics hardware is highly efficient for the interferometry task. The host processor of the heterogeneous computing system forms the data flow for the graphics accelerators, whose number corresponds to the number of frequency channels; for the RadioAstron project there are seven such channels. Each accelerator computes the correlation matrix for all baselines in a single frequency channel. The initial data are converted to floating-point format and corrected with the corresponding delay function, and the entire correlation matrix is computed simultaneously using the sliding Fourier transform. Thanks to the match between the problem and the architecture of graphics accelerators, the performance obtained on a single Kepler-platform processor corresponds to the performance of a four-node Intel-platform computing cluster on the same task. The task scales well not only to a large number of graphics accelerators, but also to a large number of nodes with multiple accelerators.
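
    The FX principle referred to above (Fourier transform first, cross-multiplication second) can be sketched for a single baseline and frequency channel. A real correlator would also apply the per-station delay/phase correction, which is omitted in this synthetic-data sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
nfft, nseg, lag = 256, 64, 5

# Two "stations" observing the same noise signal, one delayed by `lag` samples.
station_a = rng.normal(size=nfft * nseg)
station_b = np.roll(station_a, lag) + 0.1 * rng.normal(size=nfft * nseg)

# F step: FFT each segment; X step: accumulate the cross-spectrum over segments.
acc = np.zeros(nfft, dtype=complex)
for k in range(nseg):
    sa = np.fft.fft(station_a[k * nfft:(k + 1) * nfft])
    sb = np.fft.fft(station_b[k * nfft:(k + 1) * nfft])
    acc += sa * np.conj(sb)

# The inverse FFT of the accumulated cross-spectrum is the cross-correlation;
# its peak position reveals the relative delay between the stations.
xcorr = np.fft.ifft(acc).real
print(np.argmax(xcorr))   # peak at nfft - lag = 251
```
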

  6. A computational approach to measuring the correlation between expertise and social media influence for celebrities on microblogs

    OpenAIRE

    Zhao, Wayne Xin; Liu, Jing; He, Yulan; Lin, Chin Yew; Wen, Ji-Rong

    2016-01-01

    Social media influence analysis, sometimes also called authority detection, aims to rank users based on their influence scores in social media. Existing approaches to social influence analysis usually focus on developing effective algorithms to quantify users' influence scores. They rarely consider a person's expertise levels, which are arguably important to influence measures. In this paper, we propose a computational approach to measuring the correlation between expertise and social medi...

  7. Biometrics: Accessibility challenge or opportunity?

    Directory of Open Access Journals (Sweden)

    Ramon Blanco-Gonzalo

    Full Text Available Biometric recognition is currently implemented in several authentication contexts, most recently in mobile devices, where it is expected to complement or even replace traditional authentication modalities such as the PIN (Personal Identification Number) or password. The assumed convenience characteristics of biometrics are transparency, reliability and ease of use; however, whether biometric recognition is as intuitive and straightforward to use is open to debate. Can biometric systems make some tasks easier for people with accessibility concerns? To investigate this question, an accessibility evaluation of a mobile app was conducted in which test subjects withdrew money in a fictitious ATM (Automated Teller Machine) scenario. The biometric authentication mechanisms used include face, voice, and fingerprint. Furthermore, we employed the traditional modalities of PIN and pattern in order to check whether biometric recognition is indeed a real improvement. The test subjects in this work were people with real-life accessibility concerns; a group of people without accessibility concerns also participated, providing a baseline performance. Experimental results are presented concerning performance, HCI (Human-Computer Interaction) and accessibility, grouped according to category of accessibility concern. Our results reveal links between individual modalities and user category, establishing guidelines for future accessible biometric products.

  8. Bladder transitional cell carcinoma: correlation of contrast enhancement on computed tomography with histological grade and tumour angiogenesis

    International Nuclear Information System (INIS)

    Xie, Q.; Zhang, J.; Wu, P.-H.; Jiang, X.-Q.; Chen, S.-L.; Wang, Q.-L.; Xu, J.; Chen, G.-D.; Deng, J.-H.

    2005-01-01

    AIM: To investigate the correlation between the degree of contrast enhancement of bladder cancer in the early enhanced phase of helical computed tomography (CT) and microvessel density (MVD), vascular endothelial growth factor (VEGF) and histological grade. MATERIALS AND METHODS: Sixty-five patients with transitional cell carcinoma of the bladder were examined by incremental unenhanced CT and helical CT at 40-45 s after initiation of intravenous administration of contrast medium before surgery. The CT density of the bladder carcinomas, in Hounsfield units, was measured in the middle of the maximum-diameter section of the cancer lesions on unenhanced and enhanced CT. The degree of contrast enhancement of the tumour was determined as the absolute increase in Hounsfield units. Histological grade, VEGF and MVD were analysed for each cancer. The Pearson and Spearman correlation tests were used to determine the strength of the relationships between CT enhancement and histological grade, VEGF expression and MVD. RESULTS: Different degrees of enhancement were observed in 91 cancers during the early enhanced phase of helical CT. Mean MVDs and mean CT enhancement values of the different histological grade groups were statistically different (p<0.001). A positive correlation was found between the CT enhancement value of bladder cancer and MVD (Pearson correlation test; r=0.938, p<0.001) and histological grade (Spearman rank correlation; r=0.734, p<0.001). VEGF of bladder cancer did not correlate with the change in CT attenuation (Spearman rank correlation; r=0.087, p=0.410) or with MVD (Spearman rank correlation; r=0.103, p=0.330). CONCLUSION: In bladder cancer, the degree of contrast enhancement during early enhanced helical CT is correlated with the MVD and histological grade of the tumour. It is possible that MVD is the histopathological basis of the early contrast enhancement of bladder cancer
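
    For illustration, the two correlation tests used here differ only in that Spearman's rank correlation applies Pearson's formula to ranks, making it sensitive to any monotone relationship. A sketch with hypothetical values (not the study's measurements; the rank helper assumes no ties):

```python
import numpy as np

def pearson(x, y):
    # Pearson's r: covariance normalized by the two standard deviations.
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

def spearman(x, y):
    # Spearman's rho: Pearson's r computed on ranks (no tie handling here).
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(x), rank(y))

enhancement = np.array([18, 25, 31, 40, 44, 52, 60, 63])   # HU increase (made up)
mvd = np.array([20, 28, 35, 37, 50, 55, 66, 70])           # microvessel density

print(round(pearson(enhancement, mvd), 3))
print(round(spearman(enhancement, mvd), 3))   # 1.0: both sequences are monotone
```
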

  9. An Intelligent Terminal for Access to a Medical Database

    Science.gov (United States)

    Womble, M. E.; Wilson, S. D.; Keiser, H. N.; Tworek, M. L.

    1978-01-01

    Very powerful data base management systems (DBMS) now exist which allow medical personnel access to patient record data bases. DBMS's make it easy to retrieve either complete or abbreviated records of patients with similar characteristics. In addition, statistics on data base records are immediately accessible. However, the price of this power is a large computer with the inherent problems of access, response time, and reliability. If a general-purpose, time-shared computer is used to get this power, the response time to a request can be either rapid or slow, depending upon loading by other users. Furthermore, if the computer is accessed via dial-up telephone lines, there is competition with other users for telephone ports. If either the DBMS or the host machine is replaced, the medical users, who are typically not sophisticated in computer usage, are forced to learn the new system. Microcomputers, because of their low cost and adaptability, lend themselves to a solution of these problems. A microprocessor-based intelligent terminal has been designed and implemented at the USAF School of Aerospace Medicine to provide a transparent interface between the user and his data base. The intelligent terminal system includes multiple microprocessors, floppy disks, a CRT terminal, and a printer. Users interact with the system at the CRT terminal using menu selection (framing). The system translates the menu selection into the query language of the DBMS and handles all actual communication with the DBMS and its host computer, including telephone dialing and sign-on procedures, as well as the actual data base query and response. Retrieved information is stored locally for CRT display, hard copy production, and/or permanent retention. Microprocessor-based communication units provide security for sensitive medical data through encryption/decryption algorithms and high-reliability error detection transmission schemes. Highly modular software design permits adaptation to a

  10. Computational time-resolved and resonant x-ray scattering of strongly correlated materials

    Energy Technology Data Exchange (ETDEWEB)

    Bansil, Arun [Northeastern Univ., Boston, MA (United States)

    2016-11-09

    Basic Energy Sciences of the Department of Energy (BES/DOE) has made large investments in x-ray sources in the U.S. (NSLS-II, LCLS, NGLS, ALS, APS) as powerful enabling tools for opening up unprecedented new opportunities for exploring the properties of matter at various length and time scales. The coming online of pulsed photon sources literally allows us to see and follow the dynamics of processes in materials at their natural timescales. There is therefore an urgent need to develop theoretical methodologies and computational models for understanding how x-rays interact with matter, and the related spectroscopies of materials. The present project addressed aspects of this grand challenge of x-ray science. In particular, our Collaborative Research Team (CRT) focused on developing viable computational schemes for modeling x-ray scattering and photoemission spectra of strongly correlated materials in the time domain. The vast arsenal of formal/numerical techniques and approaches encompassed by the members of our CRT was brought to bear, through appropriate generalizations and extensions, on modeling the pumped state and the dynamics of this non-equilibrium state, and on how it can be probed via x-ray absorption (XAS), emission (XES), resonant and non-resonant x-ray scattering, and photoemission processes. We explored the conceptual connections between time-domain problems and other second-order spectroscopies, such as resonant inelastic x-ray scattering (RIXS), because RIXS may be effectively thought of as a pump-probe experiment in which the incoming photon acts as the pump and the fluorescent decay is the probe. Alternatively, when the core-valence interactions are strong, one can view K-edge RIXS, for example, as the dynamic response of the material to the transient presence of a strong core-hole potential. Unlike an actual pump-probe experiment, here there is no mechanism for adjusting the time delay between the pump and the probe. However, the core hole

  11. NEW SOURCES OF GRAIN MOLD RESISTANCE AMONG SORGHUM ACCESSIONS FROM SUDAN

    Directory of Open Access Journals (Sweden)

    Louis Kajac Prom

    2009-05-01

    Full Text Available Fifty-nine sorghum accessions from Sudan were evaluated in replicated plots at Isabela, Puerto Rico, for resistance against Fusarium thapsinum, one of the causal agents of grain mold. The environmental conditions during this study, such as temperature, relative humidity, and rainfall, especially at and after physiological maturity, were optimal for grain mold development. Highly significant negative correlations of grain mold severity ratings, in the field and on threshed grains, with germination rate and seed weight were recorded, indicating that germination and seed weight were adversely affected when challenged with F. thapsinum. Temperature showed a significant negative correlation with grain mold severity and a significant positive correlation with germination rate. However, no significant correlation was observed between rainfall and grain mold severity or germination rate. Accessions PI570011, PI570027, PI569992, PI569882, PI571312, PI570759, and PI267548 exhibited the lowest grain mold severities and were among the accessions with the highest germination rates, indicating that these accessions may possess genetic resistance to grain mold and might be useful in sorghum enhancement programs. Four of these accessions had significantly higher germination rates than the resistant control genotypes, with PI267548 having the highest germination rate. PI267548 was the only white-seeded accession showing significantly better grain mold resistance than the control genotypes.

  12. Fade detector for the FODA-TDMA access scheme

    Science.gov (United States)

    Celandroni, Nedo; Ferro, Erina; Marzoli, Antonio

    1989-05-01

    The First-In First-Out Ordered Demand Assignment-Time Division Multiple Access (FODA-TDMA) satellite access scheme, designed for the simultaneous transmission of real-time data such as packetized voice and slow-scan images (stream traffic) and of data coming from standard EDP applications such as bulk data transfer, interactive computer access, mailing, and data base enquiry and updating (datagram traffic), is described. When deep fades are experienced due to rain attenuation, the system is able to counter the fade. Techniques to detect the fade are presented.

  13. Hypercard: Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  14. The ARAC client system: network-based access to ARAC

    International Nuclear Information System (INIS)

    Leach, M J; Sumikawa, D; Webster, C

    1999-01-01

    The ARAC Client System allows users (such as emergency managers and first responders) with commonly available desktop and laptop computers to utilize the central ARAC system over the Internet or any other communications link using Internet protocols. Providing cost-effective, fast access to the central ARAC system greatly expands the availability of the ARAC capability. The ARAC Client system consists of (1) local client applications running on the remote user's computer, and (2) "site servers" that provide secure access to selected central ARAC system capabilities and run on a scalable number of dedicated workstations residing at the central facility. The remote client applications allow users to describe a real or potential chem-bio event, electronically send this information to the central ARAC system, which performs model calculations, and quickly receive and visualize the resulting graphical products. The site servers will support simultaneous access to ARAC capabilities by multiple users. The ARAC Client system is based on object-oriented client/server and distributed computing technologies using CORBA and Java, and consists of a large number of interacting components.

  15. DATA SECURITY ISSUES IN CLOUD COMPUTING: REVIEW

    Directory of Open Access Journals (Sweden)

    Hussam Alddin Shihab Ahmed

    2016-02-01

    Full Text Available Cloud computing is an Internet-based model that enables on-demand access to a shared pool of network resources, with payment for each access. It is yet another innovation that fulfils a client's need for computing resources such as networks, storage, servers, services and applications. Securing the data is considered one of the principal challenges and concerns for cloud computing, and this persistent problem is becoming more acute with the ongoing evolution of cloud computing technology. From the clients' perspective, cloud computing poses a security hazard, especially with respect to assurance and data security issues, which remain the most basic brake on the adoption of cloud computing services. This paper reviews and analyses the essential issues of cloud computing and describes the data security and privacy-protection issues in the cloud.

  16. Assessing the Macro-Level Correlates of Malware Infections Using a Routine Activities Framework.

    Science.gov (United States)

    Holt, Thomas J; Burruss, George W; Bossler, Adam M

    2018-05-01

    The ability to gain unauthorized access to computer systems to engage in espionage and data theft poses a massive threat to individuals worldwide. There has been minimal focus, however, on the role of malicious software, or malware, which can automate this process. This study examined the macro-correlates of malware infection at the national level by using an open repository of known malware infections and a routine activities framework. Zero-inflated negative binomial models for counts indicated that nations with greater technological infrastructure, more political freedoms, and less organized-crime financial impact were more likely to report malware infections. The number of Computer Emergency Response Teams (CERTs) in a nation was not significantly related to reported malware infections. The implications of the study for the understanding of malware infection, routine activity theory, and target-hardening strategies are discussed.
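
    The count-model choice reflects the fact that cross-national infection counts are overdispersed (variance well above the mean), which violates the Poisson assumption of equal mean and variance. A minimal stdlib sketch, not the authors' analysis code, of why the negative binomial family suits such data; the parameter values are purely illustrative:

```python
from math import comb, exp, factorial

def poisson_pmf(k, mu):
    """Poisson probability of observing k events with mean mu (variance is also mu)."""
    return mu ** k * exp(-mu) / factorial(k)

def neg_binomial_pmf(k, mu, r):
    """Negative binomial pmf parameterized by mean mu and integer dispersion r;
    its variance, mu + mu**2 / r, always exceeds the Poisson variance mu."""
    p = r / (r + mu)  # success probability implied by (mu, r)
    return comb(k + r - 1, k) * p ** r * (1 - p) ** k

mu, r = 4.0, 2
nb_var = mu + mu ** 2 / r                 # 12.0, versus Poisson variance 4.0
total = sum(neg_binomial_pmf(k, mu, r) for k in range(200))  # ~1.0: a valid pmf
print(nb_var, round(total, 6))
```

    Fitting such a model to real per-country counts would add covariates (infrastructure, political freedom, and so on) through a log link, which is what regression packages automate.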

  17. Cloud Computing. Technology Briefing. Number 1

    Science.gov (United States)

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  18. Correlation between presumed sinusitis-induced pain and paranasal sinus computed tomographic findings.

    Science.gov (United States)

    Mudgil, Shikha P; Wise, Scott W; Hopper, Kenneth D; Kasales, Claudia J; Mauger, David; Fornadley, John A

    2002-02-01

    The correlation between facial and/or head pain in patients clinically suspected of having sinusitis and actual localized findings on sinus computed tomographic (CT) imaging is poorly understood. Our aim was to prospectively evaluate the relationship of paranasal sinus pain symptoms with CT imaging. Two hundred consecutive patients referred by otolaryngologists and internists for CT of the paranasal sinuses participated by completing a questionnaire immediately before undergoing CT. Three radiologists blinded to the patients' responses scored the degree of air/fluid level, mucosal thickening, bony reaction, and mucus retention cysts using a graded scale of severity (0 to 3 points). The osteomeatal complexes and nasolacrimal ducts were also evaluated for patency. Bivariate analysis was performed to evaluate the relationship between patients' localized symptoms and CT findings in the respective sinus. One hundred sixty-three patients (82%) reported having some form of facial pain or headache. The right temple/forehead was the most frequently reported region of maximal pain. On CT imaging, the maxillary sinus was the most frequently involved sinus. Bivariate analysis failed to show any relationship between patient symptoms and findings on CT. Patients with a normal CT reported a mean of 5.88 sites of facial or head pain versus 5.45 sites for patients with an abnormal CT. Patient-based reports of sinonasal pain symptoms fail to correlate with findings in the respective sinuses. CT should therefore be reserved for delineating the anatomy and degree of sinus disease before surgical intervention.

  19. How You Can Protect Public Access Computers "and" Their Users

    Science.gov (United States)

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  20. High-resolution computed tomography in silicosis: correlation with chest radiography and pulmonary function tests

    Energy Technology Data Exchange (ETDEWEB)

    Lopes, Agnaldo Jose [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Pedro Ernesto Univ. Hospital. Dept. of Respiratory Function]. E-mail: phel.lop@uol.com.br; Mogami, Roberto; Capone, Domenico; Jansen, Jose Manoel [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). School of Medical Sciences; Tessarollo, Bernardo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Dept. of Radiology and Diagnostic Image; Melo, Pedro Lopes de [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil). Inst. of Biology

    2008-05-15

    Objective: To correlate tomographic findings with pulmonary function findings, as well as to compare chest X-ray findings with high-resolution computed tomography (HRCT) findings, in patients with silicosis. Methods: A cross-sectional study was conducted in 44 non-smoking patients without a history of tuberculosis. Chest X-ray findings were classified according to the International Labour Organization recommendations. Using a semiquantitative system, the following HRCT findings were measured: the full extent of pulmonary involvement; parenchymal opacities; and emphysema. Spirometry and forced oscillation were performed. Pulmonary volumes were evaluated using the helium dilution method, and diffusing capacity of the lung for carbon monoxide was assessed. Results: Of the 44 patients studied, 41 were male. The mean age was 48.4 years. There were 4 patients who were classified as category 0 based on X-ray findings and as category 1 based on HRCT findings. Using HRCT scans, we identified progressive massive fibrosis in 33 patients, compared with only 23 patients when X-rays were used. Opacity score was found to correlate most closely with airflow, DLCO and compliance. Emphysema score correlated inversely with volume, DLCO and airflow. In this sample of patients presenting a predominance of large opacities (75% of the individuals), the deterioration of pulmonary function was associated with the extent of structural changes. Conclusions: In the early detection of silicosis and the identification of progressive massive fibrosis, HRCT scans are superior to X-rays. (author)

  1. High-resolution computed tomography in silicosis: correlation with chest radiography and pulmonary function tests

    International Nuclear Information System (INIS)

    Lopes, Agnaldo Jose; Mogami, Roberto; Capone, Domenico; Jansen, Jose Manoel; Tessarollo, Bernardo; Melo, Pedro Lopes de

    2008-01-01

    Objective: To correlate tomographic findings with pulmonary function findings, as well as to compare chest X-ray findings with high-resolution computed tomography (HRCT) findings, in patients with silicosis. Methods: A cross-sectional study was conducted in 44 non-smoking patients without a history of tuberculosis. Chest X-ray findings were classified according to the International Labour Organization recommendations. Using a semiquantitative system, the following HRCT findings were measured: the full extent of pulmonary involvement; parenchymal opacities; and emphysema. Spirometry and forced oscillation were performed. Pulmonary volumes were evaluated using the helium dilution method, and diffusing capacity of the lung for carbon monoxide was assessed. Results: Of the 44 patients studied, 41 were male. The mean age was 48.4 years. There were 4 patients who were classified as category 0 based on X-ray findings and as category 1 based on HRCT findings. Using HRCT scans, we identified progressive massive fibrosis in 33 patients, compared with only 23 patients when X-rays were used. Opacity score was found to correlate most closely with airflow, DLCO and compliance. Emphysema score correlated inversely with volume, DLCO and airflow. In this sample of patients presenting a predominance of large opacities (75% of the individuals), the deterioration of pulmonary function was associated with the extent of structural changes. Conclusions: In the early detection of silicosis and the identification of progressive massive fibrosis, HRCT scans are superior to X-rays. (author)

  2. Nurses' computer literacy and attitudes towards the use of computers in health care.

    Science.gov (United States)

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.

  3. Assessing mouse alternatives to access to computer: a case study of a user with cerebral palsy.

    Science.gov (United States)

    Pousada, Thais; Pareira, Javier; Groba, Betania; Nieto, Laura; Pazos, Alejandro

    2014-01-01

    The purpose of this study is to describe the process of assessing three assistive devices to meet the needs of a woman with cerebral palsy (CP) in order to provide her with computer access and use. The user has quadriplegic CP with anarthria and uses a syllabic keyboard. Devices were evaluated through a three-step approach: (a) use of a questionnaire to preselect potential assistive technologies, (b) use of the eTAO tool to determine the effectiveness of each device, and (c) a semi-structured interview to obtain qualitative data. The touch screen, joystick, and trackball were the preselected devices. The device that best met the user's needs and priorities was the joystick. This finding was corroborated by both the eTAO tool and the semi-structured interview. Computers are a basic means of social participation. It is important to consider the special needs and priorities of users and to try different devices when undertaking a device-selection process. Environmental and personal factors have to be considered as well. This leads to a need to evaluate new tools in order to provide the appropriate support. The eTAO could be a suitable instrument for this purpose. Additional research is also needed to understand how to better match devices with different user populations and how to comprehensively evaluate emerging technologies relative to users with disabilities.

  4. Accessible Earth: Enhancing diversity in the Geosciences through accessible course design and Experiential Learning Theory

    Science.gov (United States)

    Bennett, Rick; Lamb, Diedre

    2017-04-01

    The tradition of field-based instruction in the geoscience curriculum, which culminates in a capstone geological field camp, presents an insurmountable barrier to many disabled students who might otherwise choose to pursue geoscience careers. There is a widespread perception that success as a practicing geoscientist requires direct access to outcrops and vantage points available only to those able to traverse inaccessible terrain. Yet many modern geoscience activities are based on remotely sensed geophysical data, data analysis, and computation that take place entirely from within the laboratory. To challenge the perception of geoscience as a career option only for the able bodied, we have created the capstone Accessible Earth Study Abroad Program, an alternative to geologic field camp with a focus on modern geophysical observation systems, computational thinking, and data science. In this presentation, we will report on the theoretical bases for developing the course, our experiences in teaching the course to date, and our plan for ongoing assessment, refinement, and dissemination of the effectiveness of our efforts.

  5. Direct access to INIS

    International Nuclear Information System (INIS)

    Zheludev, I.S.; Romanenko, A.G.

    1981-01-01

    Librarians, researchers, and information specialists throughout the world now have the opportunity for direct access to coverage of almost 95% of the world's literature dealing with the peaceful uses of atomic energy and nuclear science. This opportunity has been provided by the International Nuclear Information System (INIS) of the IAEA. INIS, with the voluntary collaboration of more than 60 of the Agency's Member States, maintains a comprehensive, computer-resident data-base, containing the bibliographic details plus informative abstracts of the bulk of the world's literature on nuclear science and technology. Since this data-base is growing at a rate of 75,000 items per year, and already contains more than 500,000 items, it is obviously important to be able to search this collection conveniently and efficiently. The usefulness of this ability is enhanced when other data-bases on related subjects are made available on an information network. During the early 1970s, on-line interrogation of large bibliographic data-bases became the accepted method for searching this type of information resource. Direct interaction between the searcher and the data-base provides quick feed-back resulting in improved literature listings for launching research and development projects. On-line access enables organizations which cannot store a large data-base on their own computer to expand the information resources at their command. Because of these advantages, INIS undertook to extend to interested Member States on-line access to its data-base in Vienna

  6. Computer Operating System Maintenance.

    Science.gov (United States)

    1982-06-01

    The Computer Management Information Facility (CMIF) system was developed by Rapp Systems to fulfill the need at the CRF to record and report on...computer center resource usage and utilization. The foundation of the CMIF system is a System 2000 data base (CRFMGMT) which stores and permits access

  7. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
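
    The pacing handshake described above can be sketched, well outside any real DMA engine, as a pair of threads exchanging chunks and acknowledgements over queues. All names here are illustrative; this models only the control flow (send a chunk, issue a pacing request, block until the response arrives), not the remote-get DMA mechanics:

```python
import threading
from queue import Queue

def origin_node(message, chunk_size, to_target, from_target):
    """Origin compute node: transfer one chunk, then block on the pacing
    response before sending the next (the pacing handshake in miniature)."""
    for start in range(0, len(message), chunk_size):
        to_target.put(message[start:start + chunk_size])
        assert from_target.get() == "pacing-ack"  # wait for the target's DMA engine
    to_target.put(None)  # end-of-message marker

def target_node(to_target, from_target, received):
    """Target compute node: receive each chunk, then answer the pacing request."""
    while True:
        chunk = to_target.get()
        if chunk is None:
            return
        received.append(chunk)
        from_target.put("pacing-ack")

to_target, from_target, received = Queue(), Queue(), []
worker = threading.Thread(target=target_node, args=(to_target, from_target, received))
worker.start()
origin_node("ABCDEFGHIJ", 4, to_target, from_target)
worker.join()
print("".join(received))  # prints ABCDEFGHIJ
```

    Because the origin never has more than one unacknowledged chunk outstanding, the target paces the transfer to its own consumption rate, which is the point of the method.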

  8. Constructing a two bands optical code-division multiple-access network of bipolar optical access codecs using Walsh-coded liquid crystal modulators

    Science.gov (United States)

    Yen, Chih-Ta; Huang, Jen-Fa; Chih, Ping-En

    2014-08-01

    We propose and experimentally demonstrate a two-band optical code-division multiple-access (OCDMA) network built on bipolar Walsh-coded liquid-crystal modulators (LCMs) and driven by green-light and red-light lasers. System performance depends on constructing a decoder that implements a true bipolar correlation using only unipolar signals and intensity detection in each band. We took advantage of the phase-delay characteristics of LCMs to construct a prototype optical coder/decoder (codec). Matched and unmatched Walsh signature codes were evaluated to detect correlations among multiuser data in the access network. Using LCMs, the red and green laser light was spectrally encoded, and the summed light dots were complementarily decoded. Favorable contrast between the auto- and cross-correlations indicates that binary information symbols can be properly recovered using a balanced photodetector.
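
    The decoder's key trick is that a bipolar correlation can be recovered from two unipolar branches by subtraction, as a balanced photodetector pair does. A small numerical sketch of Walsh code generation and this balanced correlation; the code indices and signal below are illustrative, not taken from the experiment:

```python
def walsh_codes(order):
    """Build 2**order bipolar Walsh codes via the Sylvester-Hadamard recursion."""
    h = [[1]]
    for _ in range(order):
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

def balanced_correlation(code, signal):
    """Bipolar correlation from unipolar branches only: correlate against the
    code's +1 chips and -1 chips separately, then subtract, as a balanced
    photodetector pair would."""
    plus = sum(s for c, s in zip(code, signal) if c == 1)
    minus = sum(s for c, s in zip(code, signal) if c == -1)
    return plus - minus

codes = walsh_codes(3)                         # eight 8-chip Walsh codes
signal = [1 if c == 1 else 0 for c in codes[3]]  # unipolar on-off encoding of one bit
auto = balanced_correlation(codes[3], signal)    # matched code: strong peak (4)
cross = balanced_correlation(codes[5], signal)   # orthogonal code: rejected (0)
print(auto, cross)
```

    The nonzero auto-correlation against a zero cross-correlation is exactly the contrast the experiment measures optically.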

  9. Mapping soil deformation around plant roots using in vivo 4D X-ray Computed Tomography and Digital Volume Correlation.

    Science.gov (United States)

    Keyes, S D; Gillard, F; Soper, N; Mavrogordato, M N; Sinclair, I; Roose, T

    2016-06-14

    The mechanical impedance of soils inhibits the growth of plant roots, often being the most significant physical limitation to root system development. Non-invasive imaging techniques have recently been used to investigate the development of root system architecture over time, but the relationship with soil deformation is usually neglected. Correlative mapping approaches parameterised using 2D and 3D image data have recently gained prominence for quantifying physical deformation in composite materials including fibre-reinforced polymers and trabecular bone. Digital Image Correlation (DIC) and Digital Volume Correlation (DVC) are computational techniques which use the inherent material texture of surfaces and volumes, captured using imaging techniques, to map full-field deformation components in samples during physical loading. Here we develop an experimental assay and methodology for four-dimensional, in vivo X-ray Computed Tomography (XCT) and apply a Digital Volume Correlation (DVC) approach to the data to quantify deformation. The method is validated for a field-derived soil under conditions of uniaxial compression, and a calibration study is used to quantify thresholds of displacement and strain measurement. The validated and calibrated approach is then demonstrated for an in vivo test case in which an extending maize root in field-derived soil was imaged hourly using XCT over a growth period of 19h. This allowed full-field soil deformation data and 3D root tip dynamics to be quantified in parallel for the first time. This fusion of methods paves the way for comparative studies of contrasting soils and plant genotypes, improving our understanding of the fundamental mechanical processes which influence root system development. Copyright © 2016 Elsevier Ltd. All rights reserved.
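
    At its core, DVC (like DIC) finds the displacement that maximizes the correlation between a reference subvolume and candidate windows in the deformed image. A one-dimensional, stdlib-only sketch of that idea, using a synthetic texture profile rather than real XCT data:

```python
def ncc(a, b):
    """Zero-normalized cross-correlation between two equal-length windows."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0

def best_displacement(reference, deformed, start, size, search):
    """Slide a reference subvolume over the deformed image and return the
    displacement with the highest correlation score (the DVC core idea)."""
    window = reference[start:start + size]
    scores = {}
    for d in range(-search, search + 1):
        lo = start + d
        if 0 <= lo and lo + size <= len(deformed):
            scores[d] = ncc(window, deformed[lo:lo + size])
    return max(scores, key=scores.get)

# Synthetic "soil texture" profile shifted right by 3 voxels
reference = [0, 1, 0, 2, 5, 9, 4, 2, 1, 0, 3, 1, 0, 0, 2]
deformed = [7, 8, 6] + reference[:-3]
print(best_displacement(reference, deformed, start=3, size=6, search=5))  # 3
```

    Real DVC repeats this search for a grid of 3D subvolumes, yielding a full displacement field from which strains are derived.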

  10. High-Performance Secure Database Access Technologies for HEP Grids

    Energy Technology Data Exchange (ETDEWEB)

    Matthew Vranicar; John Weicher

    2006-04-17

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist’s computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that "Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications.” There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. We believe that an innovative database architecture where the

  11. High-Performance Secure Database Access Technologies for HEP Grids

    International Nuclear Information System (INIS)

    Vranicar, Matthew; Weicher, John

    2006-01-01

    The Large Hadron Collider (LHC) at the CERN Laboratory will become the largest scientific instrument in the world when it starts operations in 2007. Large Scale Analysis Computer Systems (computational grids) are required to extract rare signals of new physics from petabytes of LHC detector data. In addition to file-based event data, LHC data processing applications require access to large amounts of data in relational databases: detector conditions, calibrations, etc. U.S. high energy physicists demand efficient performance of grid computing applications in LHC physics research where world-wide remote participation is vital to their success. To empower physicists with data-intensive analysis capabilities a whole hyperinfrastructure of distributed databases cross-cuts a multi-tier hierarchy of computational grids. The crosscutting allows separation of concerns across both the global environment of a federation of computational grids and the local environment of a physicist's computer used for analysis. Very few efforts are on-going in the area of database and grid integration research. Most of these are outside of the U.S. and rely on traditional approaches to secure database access via an extraneous security layer separate from the database system core, preventing efficient data transfers. Our findings are shared by the Database Access and Integration Services Working Group of the Global Grid Forum, which states that 'Research and development activities relating to the Grid have generally focused on applications where data is stored in files. However, in many scientific and commercial domains, database management systems have a central role in data storage, access, organization, authorization, etc, for numerous applications'. There is a clear opportunity for a technological breakthrough, requiring innovative steps to provide high-performance secure database access technologies for grid computing. 
We believe that an innovative database architecture where the secure

  12. An Expressive, Lightweight and Secure Construction of Key Policy Attribute-Based Cloud Data Sharing Access Control

    Science.gov (United States)

    Lin, Guofen; Hong, Hanshu; Xia, Yunhao; Sun, Zhixin

    2017-10-01

    Attribute-based encryption (ABE) is an interesting cryptographic technique for flexible cloud data sharing access control. However, some open challenges hinder its practical application. In previous schemes, all attributes are treated as having equal status, which is not the case in most practical scenarios. Meanwhile, the size of the access policy increases dramatically as its expressiveness grows. In addition, current research largely overlooks the fact that mobile front-end devices, such as smartphones, have limited computational performance, while ABE demands a great deal of bilinear pairing computation. In this paper, we propose a key-policy weighted attribute-based encryption without bilinear pairing computation (KP-WABE-WB) for secure cloud data sharing access control. A simple weighted mechanism is presented to describe the different importance of each attribute. We introduce a novel construction of ABE that executes no bilinear pairing computation. Compared to previous schemes, our scheme performs better in both expressiveness of the access policy and computational efficiency.
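
    Setting the cryptography itself aside, the weighted-attribute idea can be illustrated as a threshold check over attribute weights. This sketch models only the access-structure logic, not the KP-WABE-WB construction; the attribute names, weights, and threshold are hypothetical:

```python
def satisfies(policy, user_attributes):
    """Weighted threshold policy: the total weight of the user's matching
    attributes must reach the policy threshold. In a weighted ABE scheme this
    structure would govern decryption; here it is only a plain check."""
    weight = sum(w for attr, w in policy["weights"].items()
                 if attr in user_attributes)
    return weight >= policy["threshold"]

# Hypothetical policy: attributes carry different importance
policy = {"weights": {"physician": 3, "cardiology": 2, "auditor": 1},
          "threshold": 4}
print(satisfies(policy, {"physician", "cardiology"}))  # True  (3 + 2 >= 4)
print(satisfies(policy, {"auditor", "cardiology"}))    # False (1 + 2 <  4)
```

    Giving attributes unequal weights lets a short policy express distinctions (a physician counts for more than an auditor) that an unweighted threshold gate would need many duplicated attributes to encode.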

  13. Computers in the Teaching of English as a Foreign Language: Access to the Diversity of Textual Genres and Language Skills

    Science.gov (United States)

    Dos Santos, Roberto-Márcio; Sobrinho, Jerônimo Coura

    In the area of language teaching, both language skills and textual genres can be worked on simultaneously (thus responding to the Brazilian Curricular Parameters and to trends in contemporary education, which emphasize contextualized teaching) by means of computers. Computers can make the teaching process dynamic and rich, since they enable access to the foreign language through virtual environments, which creates a larger number of learning contexts, each with its specific vocabulary and linguistic features in real communication. This study focuses on possible applications of this kind of approach. The computer, online, is a source of diverse textual genres and can be an important tool in the language classroom, as well as a gateway to authentic material produced in contextualized practice close to real-life communication. On the other hand, all these materials must be used appropriately, without ever worshipping the technology as if it were a miraculous solution. After all, the professional pedagogic skills of the teacher should never be forgotten or taken for granted. In this study, a series of interviews with teachers was carried out - both with Brazilian teachers of the public sector (basic education) and language institutes (private English courses) and with teacher trainers (university professors) - in order to verify whether the teachers were prepared to work with informatics in their teaching practices and to check these professionals' views on the subject. The ideas of Maingueneau and Marcuschi about textual genres provide a theoretical base for this work, along with the concept of cognitive economy. The text and its typology are the focus here as the basic material for teaching English through digital technologies and hypermedia. The study is also based on Sharma and Barrett's notion of blended learning as a balanced combination of technological resources and traditional practices in the classroom. Thus, this is an attempt to investigate the relevance of

  14. The Evolution of Teachers' Instructional Beliefs and Practices in High-Access-to-Technology Classrooms.

    Science.gov (United States)

    Dwyer, David C.; And Others

    Beginning in 1985, Apple Computer, Inc., and several school districts began a collaboration to examine the impact of computer saturation on instruction and learning in K-12 classrooms. The initial guiding question was simply put: What happens when teachers and students have constant access to technology? To provide "constant access,"…

  15. Enhancing access to health information in Africa: a librarian's perspective.

    Science.gov (United States)

    Gathoni, Nasra

    2012-01-01

    In recent years, tremendous progress has been made toward providing health information in Africa, in part because of technological advancements. Nevertheless, ensuring that information is accessible, comprehensible, and usable remains problematic, and there remain needs in many settings to address issues such as computer skills, literacy, and the infrastructure to access information. To determine how librarians might play a more strategic role in meeting information needs of health professionals in Africa, the author reviewed key components of information systems pertinent to knowledge management for the health sector, including access to global online resources, capacity to use computer technology for information retrieval, information literacy, and the potential for professional networks to play a role in improving access to and use of information. The author concluded that, in regions that lack adequate information systems, librarians could apply their knowledge and skills to facilitate access and use by information seekers. Ensuring access to and use of health information can also be achieved by engaging organizations and associations working to enhance access to health information, such as the Association for Health Information and Libraries in Africa. These groups can provide assistance through training, dissemination, information repackaging, and other approaches known to improve information literacy.

  16. Paraquat-poisoning in the rabbit lungs: high resolution computed tomographic findings and pathologic correlation

    International Nuclear Information System (INIS)

    Lee, Kyung Soo; Kim, Eui Han; Lee, Byoung Ho; Kim, Kun Sang

    1992-01-01

    The authors evaluated high resolution computed tomographic (HRCT) findings of isolated rabbit lungs with paraquat poisoning, and the findings were correlated with pathologic specimens. The purposes of this study were 1) to obtain the HRCT findings of the normal rabbit lung, 2) to determine whether pulmonary pathology can be induced in rabbits by paraquat, and 3) to correlate the HRCT findings with those of pathology. Thirty rabbits were divided into three groups: group I included four control rabbits; group II included 16 rabbits given paraquat intraperitoneally (IP group); and group III included 10 rabbits given paraquat intravenously (IV group). The rabbits were sacrificed seven, 10, and 14 days after injection of various amounts of paraquat, and the lungs were then isolated for HRCT and pathologic studies. Gross and microscopic findings of the control and paraquat-injected rabbit lungs were correlated with HRCT findings. Pulmonary congestion, mild thickening of alveolar walls and septae, and multifocal micro-atelectasis were the main pathologic findings of the lungs in both paraquat groups. Pulmonary hemorrhage was noted in five (31%) of the 16 rabbits in the IP group and three (30%) of the 10 in the IV group. Pulmonary edema was seen in one rabbit (6%) of the IP group and four (40%) of the IV group. Typical pulmonary fibrosis was seen in one rabbit of the IP group (6%) and one of the IV group (10%). There was no correlation between the amount of paraquat and the frequency of pulmonary pathology. Pulmonary fibrosis was seen at least one week after the paraquat injection. On HRCT, pulmonary hemorrhage and edema appeared as diffuse air-space consolidation, and pulmonary fibrosis as linear or band-like opacities. However, minimal changes such as mild congestion

  17. Drug-related stigma and access to care among people who inject drugs in Vietnam.

    Science.gov (United States)

    Lan, Chiao-Wen; Lin, Chunqing; Thanh, Duong Cong; Li, Li

    2018-03-01

    There are considerable challenges faced by people with a history of injecting drug use (PWID) in Vietnam, including drug-related stigma and lack of access to healthcare. Seeking and utilising healthcare, as well as harm reduction programs for PWID, are often hampered by drug-related stigma. This study aimed to examine the impacts of drug-related stigma on access to care and utilisation of harm reduction programs among PWID in Vietnam. A cross-sectional study was conducted in two provinces in Vietnam, Phú Thọ and Vinh Phúc. The study participants completed the survey by using Audio Computer-Assisted Self-Interview between late 2014 and early 2015. Linear multiple regression models and logistic regression models were used to assess the relationship among drug-related stigma, access to care and utilisation of harm reduction programs, including methadone maintenance treatment (MMT) and needle exchange programs (NEP). A total of 900 PWID participated in this study. Drug-related stigma was significantly associated with lower level of access to care, but not with utilisation of MMT or NEP. Older age was positively associated with higher levels of access to care. Levels of education were positively correlated with access to care, as well as utilisation of MMT and NEP. This study underscores the need for future interventions to reduce drug-related stigma in society and in health-care settings to improve PWID's utilisation of care services. Special attention should be paid to younger PWID and those with lower levels of education. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  18. Correlation of primary middle and distal esophageal cancers motion with surrounding tissues using four-dimensional computed tomography.

    Science.gov (United States)

    Wang, Wei; Li, Jianbin; Zhang, Yingjie; Shao, Qian; Xu, Min; Guo, Bing; Shang, Dongping

    2016-01-01

    To investigate the correlation of gross tumor volume (GTV) motion with structure of interest (SOI) motion and volume variation for middle and distal esophageal cancers using four-dimensional computed tomography (4DCT). Thirty-three patients with middle or distal esophageal carcinoma underwent a 4DCT simulation scan during free breathing. All image sets were registered with the 0% phase, and the GTV, apex of the diaphragm, lung, and heart were delineated on each phase of the 4DCT data. The position of the GTV and SOI was identified in all 4DCT phases, and the volumes of the lung and heart were also obtained. The phase relationship between the GTV and SOI was estimated through Pearson's correlation test. The mean peak-to-peak displacement of all primary tumors in the lateral (LR), anteroposterior (AP), and superoinferior (SI) directions was 0.13 cm, 0.20 cm, and 0.30 cm, respectively. The SI peak-to-peak motion of the GTV showed the greatest magnitude of motion. The displacement of the GTV correlated well with that of the heart in all three dimensions and was significantly associated with the bilateral lung in the LR and SI directions. A significant correlation was found between the GTV and the apex of the diaphragm in the SI direction (r left=0.918 and r right=0.928). A significant inverse correlation was found between GTV motion and varying lung volume, but the correlation with the heart was not significant (r LR=-0.530, r AP=-0.531, and r SI=-0.588) during the respiratory cycle. For middle and distal esophageal cancers, asymmetric internal margins should be applied to the GTV. The primary tumor motion correlates well with that of the diaphragm, heart, and lung.

  19. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  20. Database organization for computer-aided characterization of laser diode

    International Nuclear Information System (INIS)

    Oyedokun, Z.O.

    1988-01-01

    Computer-aided data logging involves a huge amount of data which must be properly managed for optimized storage space and easy access, retrieval and utilization. An organization method is developed to enhance the advantages of computer-based data logging of the testing of the semiconductor injection laser, one which optimizes storage space, permits authorized users easy access, and inhibits penetration. The method is based on a unique file-identification protocol, a tree structure, and command-file-oriented access procedures

  1. PARTICLE SWARM OPTIMIZATION OF TASK SCHEDULING IN CLOUD COMPUTING

    OpenAIRE

    Payal Jaglan*, Chander Diwakar

    2016-01-01

    Resource provisioning and pricing modeling in cloud computing make it an inevitable technology on both the developer and consumer end. Easy accessibility of software and freedom of hardware configuration increase its demand in the IT industry, as does its ability to provide a user-friendly environment, software independence, quality, a pricing index, and easy accessibility of infrastructure via the Internet. Task scheduling plays an important role in cloud computing systems. Task scheduling in cloud computing mea...

  2. Access to Strong Opioid Analgesics in the Context of Legal and Regulatory Barriers in Eleven Central and Eastern European Countries.

    Science.gov (United States)

    Vranken, Marjolein J M; Mantel-Teeuwisse, Aukje K; Schutjens, Marie-Hélène D B; Scholten, Willem K; Jünger, Saskia; Leufkens, Hubert G M

    2018-04-06

    In 2011-2013, >95% of the global opioid analgesics consumption occurred in three regions, accounting for 15% of the world population. Despite abundant literature on barriers to access, little is known on the correlation between actual access to opioid analgesics and barriers to access, including legal and regulatory barriers. This study aimed to evaluate the correlation between access to strong opioid analgesics and barriers to access in national legislation and regulations in 11 central and eastern European countries that participated in the Access to Opioid Medication in Europe (ATOME) project. Two variables were contrasted to assess their correlation: the country's level of access to strong opioid analgesics, indicated by the Adequacy of Consumption Measure (ACM), and the number of potential legal and regulatory barriers identified by an external review of legislation and regulations. A linear correlation was evaluated using a squared linear correlation coefficient. Evaluation of the correlation between the ACM and the number of potential barriers produces an R² value of 0.023 and a correlation plot trend-line gradient of -0.075, indicating no correlation between access to strong opioid analgesics and the number of potential barriers in national legislation and regulations in the countries studied. Because no correlation was found, other factors besides potential legal and regulatory barriers must play a critical role in withholding essential pain medication from prescribers and patients in the studied countries. More research is needed toward a better understanding of the complex interplay of factors that determine access to strong opioid analgesics.
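    The squared linear correlation coefficient and trend-line gradient reported above can be sketched in a few lines. This is a minimal illustration of the statistic, not the ATOME analysis itself; the per-country barrier counts and ACM values below are invented:

```python
def r2_and_slope(x, y):
    """Squared Pearson correlation (R^2) and least-squares trend-line gradient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx            # gradient of the correlation-plot trend line
    r2 = sxy ** 2 / (sxx * syy)  # squared linear correlation coefficient
    return r2, slope

# Hypothetical inputs: number of potential barriers vs. ACM level per country.
barriers = [4, 7, 2, 9, 5, 3]
acm = [0.42, 0.35, 0.51, 0.30, 0.44, 0.48]
r2, slope = r2_and_slope(barriers, acm)
```

    An R² near zero, as in the study, means the trend line explains almost none of the variation in access.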

  3. Correlation exploration of metabolic and genomic diversity in rice

    Directory of Open Access Journals (Sweden)

    Shinozaki Kazuo

    2009-12-01

    Full Text Available Abstract Background It is essential to elucidate the relationship between metabolic and genomic diversity to understand the genetic regulatory networks associated with the changing metabolo-phenotype among natural variation and/or populations. Recent innovations in metabolomics technologies allow us to grasp the comprehensive features of the metabolome. Metabolite quantitative trait analysis is a key approach for the identification of genetic loci involved in metabolite variation using segregated populations. Although several attempts have been made to find correlative relationships between genetic and metabolic diversity among natural populations in various organisms, it is still unclear whether it is possible to discover such correlations between each metabolite and the polymorphisms found at each chromosomal location. To assess the correlative relationship between the metabolic and genomic diversity found in rice accessions, we compared the distance matrices for these two "omics" patterns in the rice accessions. Results We selected 18 accessions from the world rice collection based on their population structure. To determine the genomic diversity of the rice genome, we genotyped 128 restriction fragment length polymorphism (RFLP) markers to calculate the genetic distance among the accessions. To identify the variations in the metabolic fingerprint, a soluble extract from the seed grain of each accession was analyzed with one-dimensional 1H-nuclear magnetic resonance (NMR). We found no correlation between global metabolic diversity and the phylogenetic relationships among the rice accessions (rs = 0.14) by analyzing the distance matrices (calculated from the pattern of the metabolic fingerprint in the 4.29- to 0.71-ppm 1H chemical shift) and the genetic distance on the basis of the RFLP markers. However, local correlation analysis between the distance matrices (derived from each 0.04-ppm integral region of the 1H chemical shift) against genetic

  4. Individual and family environmental correlates of television and computer time in 10- to 12-year-old European children: the ENERGY-project.

    Science.gov (United States)

    Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse

    2015-09-18

    The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8% girls, mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if children and their parents thought that the maximum recommended level for watching television and/or using the computer was higher, and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries, except for Greece. A higher level of parental television and computer time was only associated with a higher level of children's television and computer time in Hungary. Having rules

  5. Generating series for GUE correlators

    Science.gov (United States)

    Dubrovin, Boris; Yang, Di

    2017-11-01

    We extend to the Toda lattice hierarchy the approach of Bertola et al. (Phys D Nonlinear Phenom 327:30-57, 2016; IMRN, 2016) to computation of logarithmic derivatives of tau-functions in terms of the so-called matrix resolvents of the corresponding difference Lax operator. As a particular application we obtain explicit generating series for connected GUE correlators. On this basis an efficient recursive procedure for computing the correlators in full genera is developed.

  6. ATLAS Distributed Computing in LHC Run2

    CERN Document Server

    Campana, Simone; The ATLAS collaboration

    2015-01-01

    The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run2. An increased data rate and the computing demands of Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (ProdSys2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward the flexible computing model. Flexible utilization of opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model, the data access mechanisms have been enhanced with remote access, and the network topology and performance are deeply integrated into the core of the system. Moreover a new data management strategy, based on a defined lifetime for each dataset, has been defin...

  7. High Optical Access Trap 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Maunz, Peter Lukas Wilhelm [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-26

    The High Optical Access (HOA) trap was designed in collaboration with the Modular Universal Scalable Ion-trap Quantum Computer (MUSIQC) team, funded along with Sandia National Laboratories through IARPA's Multi Qubit Coherent Operations (MQCO) program. The design of version 1 of the HOA trap was completed in September 2012 and initial devices were completed and packaged in February 2013. The second version of the High Optical Access Trap (HOA-2) was completed in September 2014 and is at IARPA's disposal.

  8. An Access Control Framework for Reflective Middleware

    Institute of Scientific and Technical Information of China (English)

    Gang Huang; Lian-Shan Sun

    2008-01-01

    Reflective middleware opens up the implementation details of the middleware platform and applications at runtime for improving the adaptability of middleware-based systems. However, such openness brings new challenges to access control of the middleware-based systems. Some users can access the system via reflective entities, which sometimes cannot be protected by the access control mechanisms of traditional middleware. To deliver high adaptability securely, reflective middleware should be equipped with proper access control mechanisms for the potential access control holes induced by reflection. One reason for integrating these mechanisms in reflective middleware is that a goal of reflective middleware is to equip applications with reflection capabilities as transparently as possible. This paper studies how to design a reflective J2EE middleware, PKUAS, with access control in mind. First, a computation model of the reflective system is built to identify all possible access control points induced by reflection. Then a set of access control mechanisms, including the wrapper of MBeans and a hierarchy of Java class loaders, is provided to control the identified access control points. These mechanisms, together with the J2EE access control mechanism, form the access control framework for PKUAS. The paper evaluates the security and the performance overheads of the framework qualitatively and quantitatively.

  9. Diagnostic accuracy of computed tomography angiography and magnetic resonance angiography in the stenosis detection of autologous hemodialysis access: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Bin Li

    Full Text Available PURPOSE: To compare the diagnostic performances of computed tomography angiography (CTA) and magnetic resonance angiography (MRA) for detection and assessment of stenosis in patients with autologous hemodialysis access. MATERIALS AND METHODS: Search of PubMed, MEDLINE, EMBASE and Cochrane Library databases from January 1984 to May 2013 for studies comparing CTA or MRA with DSA or surgery for autologous hemodialysis access. Eligible studies were in the English language, aimed to detect more than 50% stenosis or occlusion of autologous vascular access in hemodialysis patients with CTA and MRA technology, and provided sufficient data about diagnostic performance. Methodological quality was assessed by the Quality Assessment of Diagnostic Studies (QUADAS) instrument. Sensitivities (SEN), specificities (SPE), positive likelihood ratios (PLR), negative likelihood ratios (NLR), diagnostic odds ratios (DOR) and areas under the receiver operator characteristic curve (AUC) were pooled statistically. Potential threshold effect, heterogeneity and publication bias were evaluated. The clinical utility of CTA and MRA in detection of stenosis was also investigated. RESULT: Sixteen eligible studies were included, with a total of 500 patients. Both CTA and MRA were accurate modalities (sensitivity, 96.2% and 95.4%, respectively; specificity, 97.1% and 96.1%, respectively; DOR, 393.69 and 211.47, respectively) for hemodialysis vascular access. No significant difference was detected between the diagnostic performance of CTA (AUC, 0.988) and MRA (AUC, 0.982). Meta-regression analyses and subgroup analyses revealed no statistical difference. The Deek's funnel plots suggested a publication bias. CONCLUSION: Diagnostic performance of CTA and MRA for detecting stenosis of hemodialysis vascular access had no statistical difference. Both techniques may function as an alternative or an important complement to conventional digital subtraction angiography (DSA) and may be
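    The sensitivity, specificity, and diagnostic odds ratio pooled above can be illustrated with a simple count-pooling sketch. Note that naive pooling of 2x2 tables is only an illustration; meta-analyses like the one summarized here typically fit bivariate random-effects models, and the per-study counts below are invented:

```python
def pooled_diagnostics(tables):
    """Pool 2x2 counts (tp, fp, fn, tn) across studies and derive
    sensitivity (SEN), specificity (SPE) and diagnostic odds ratio (DOR)."""
    tp = sum(t[0] for t in tables)
    fp = sum(t[1] for t in tables)
    fn = sum(t[2] for t in tables)
    tn = sum(t[3] for t in tables)
    sen = tp / (tp + fn)                # true-positive rate
    spe = tn / (tn + fp)                # true-negative rate
    dor = (tp * tn) / (fp * fn)         # odds of a positive test in diseased
    return sen, spe, dor                # vs. non-diseased patients

# Two hypothetical studies, each as (tp, fp, fn, tn):
sen, spe, dor = pooled_diagnostics([(90, 5, 10, 95), (80, 2, 20, 98)])
```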

  10. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open-source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  11. Attentional priorities and access to short-term memory

    DEFF Research Database (Denmark)

    Gillebert, Celine; Dyrholm, Mads; Vangkilde, Signe Allerup

    2012-01-01

    The intraparietal sulcus (IPS) has been implicated in selective attention as well as visual short-term memory (VSTM). To contrast mechanisms of target selection, distracter filtering, and access to VSTM, we combined behavioral testing, computational modeling and functional magnetic resonance......, thereby displaying a significant interaction between the two factors. The interaction between target and distracter set size in IPS could not be accounted for by a simple explanation in terms of number of items accessing VSTM. Instead, it led us to a model where items accessing VSTM receive differential...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  13. Mobile computing in critical care.

    Science.gov (United States)

    Lapinsky, Stephen E

    2007-03-01

    Handheld computing devices are increasingly used by health care workers, and offer a mobile platform for point-of-care information access. Improved technology, with larger memory capacity, higher screen resolution, faster processors, and wireless connectivity has broadened the potential roles for these devices in critical care. In addition to the personal information management functions, handheld computers have been used to access reference information, management guidelines and pharmacopoeias as well as to track the educational experience of trainees. They can act as an interface with a clinical information system, providing rapid access to patient information. Despite their popularity, these devices have limitations related to their small size, and acceptance by physicians has not been uniform. In the critical care environment, the risk of transmitting microorganisms by such a portable device should always be considered.

  14. Concentration of Access to Information and Communication Technologies in the Municipalities of the Brazilian Legal Amazon.

    Directory of Open Access Journals (Sweden)

    Silvana Rossy de Brito

    Full Text Available This study fills demand for data on access and use of information and communication technologies (ICT) in the Brazilian legal Amazon, a region of localities with identical economic, political, and social problems. We use the 2010 Brazilian Demographic Census to compile data on urban and rural households (i) with computers and Internet access, (ii) with mobile phones, and (iii) with fixed phones. To compare the concentration of access to ICT in the municipalities of the Brazilian Amazon with other regions of Brazil, we use a concentration index to quantify the concentration of households in the following classes: with computers and Internet access, with mobile phones, with fixed phones, and no access. These data are analyzed along with municipal indicators on income, education, electricity, and population size. The results show that for urban households, the average concentration in the municipalities of the Amazon for computers and Internet access and for fixed phones is lower than in other regions of the country; meanwhile, that for no access and mobile phones is higher than in any other region. For rural households, the average concentration in the municipalities of the Amazon for computers and Internet access, mobile phones, and fixed phones is lower than in any other region of the country; meanwhile, that for no access is higher than in any other region. In addition, the study shows that education and income are determinants of inequality in accessing ICT in Brazilian municipalities and that the existence of electricity in rural households is directly associated with the ownership of ICT resources.

  15. Concentration of Access to Information and Communication Technologies in the Municipalities of the Brazilian Legal Amazon.

    Science.gov (United States)

    de Brito, Silvana Rossy; da Silva, Aleksandra do Socorro; Cruz, Adejard Gaia; Monteiro, Maurílio de Abreu; Vijaykumar, Nandamudi Lankalapalli; da Silva, Marcelino Silva; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    This study fills demand for data on access and use of information and communication technologies (ICT) in the Brazilian legal Amazon, a region of localities with identical economic, political, and social problems. We use the 2010 Brazilian Demographic Census to compile data on urban and rural households (i) with computers and Internet access, (ii) with mobile phones, and (iii) with fixed phones. To compare the concentration of access to ICT in the municipalities of the Brazilian Amazon with other regions of Brazil, we use a concentration index to quantify the concentration of households in the following classes: with computers and Internet access, with mobile phones, with fixed phones, and no access. These data are analyzed along with municipal indicators on income, education, electricity, and population size. The results show that for urban households, the average concentration in the municipalities of the Amazon for computers and Internet access and for fixed phones is lower than in other regions of the country; meanwhile, that for no access and mobile phones is higher than in any other region. For rural households, the average concentration in the municipalities of the Amazon for computers and Internet access, mobile phones, and fixed phones is lower than in any other region of the country; meanwhile, that for no access is higher than in any other region. In addition, the study shows that education and income are determinants of inequality in accessing ICT in Brazilian municipalities and that the existence of electricity in rural households is directly associated with the ownership of ICT resources.
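    The abstract does not specify which concentration index the authors used, so as an illustration only, a Herfindahl-style index over the shares of households in each access class could be computed as below. The class names and household counts are hypothetical:

```python
def concentration_index(counts):
    """Herfindahl-style concentration over household access classes.

    `counts` maps class name -> number of households. The index is the sum
    of squared shares: 1/k for k equally populated classes, 1.0 when all
    households fall in a single class. This is an illustrative stand-in,
    not necessarily the index used in the study.
    """
    total = sum(counts.values())
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical municipality:
households = {
    "computer_internet": 120,
    "mobile_phone": 300,
    "fixed_phone": 80,
    "no_access": 500,
}
index = concentration_index(households)
```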

  16. Recent Progress in First-Principles Methods for Computing the Electronic Structure of Correlated Materials

    Directory of Open Access Journals (Sweden)

    Fredrik Nilsson

    2018-03-01

    Full Text Available Substantial progress has been achieved in the last couple of decades in computing the electronic structure of correlated materials from first principles. This progress has been driven by parallel development in theory and numerical algorithms. Theoretical development in combining ab initio approaches and many-body methods is particularly promising. A crucial role is also played by a systematic method for deriving a low-energy model, which bridges the gap between real and model systems. In this article, an overview is given tracing the development from the LDA+U to the latest progress in combining the GW method and (extended) dynamical mean-field theory (GW+EDMFT). The emphasis is on conceptual and theoretical aspects rather than technical ones.

  17. File access prediction using neural networks.

    Science.gov (United States)

    Patra, Prashanta Kumar; Sahu, Muktikanta; Mohapatra, Subasish; Samantray, Ronak Kumar

    2010-06-01

    One of the most vexing issues in the design of a high-speed computer is the wide gap of access times between the memory and the disk. To address this problem, static file access predictors have been used. In this paper, we propose dynamic file access predictors using neural networks that, with proper tuning, significantly improve upon the accuracy, success-per-reference, and effective-success-rate-per-reference. In particular, we verified that incorrect prediction was reduced from 53.11% to 43.63% for the proposed neural network prediction method with a standard configuration, compared with the recent popularity (RP) method. With manual tuning for each trace, we are able to improve upon the misprediction rate and effective-success-rate-per-reference using a standard configuration. Simulations on distributed file system (DFS) traces reveal that exact-fit radial basis function (RBF) networks give better prediction in high-end systems, whereas a multilayer perceptron (MLP) trained with Levenberg-Marquardt (LM) backpropagation outperforms in systems having good computational capability. Probabilistic and competitive predictors are the most suitable for workstations having limited resources, and the former predictor is more efficient than the latter for servers having the most system calls. Finally, we conclude that the MLP with the LM backpropagation algorithm has a better success rate of file prediction than the simple perceptron, last successor, stable successor, and best-k-out-of-m predictors.
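    Among the baseline predictors the abstract compares against, the "last successor" scheme is the simplest: predict that a file will be followed by whatever file followed it last time. A minimal sketch of that baseline (the trace and scoring convention are illustrative, not taken from the paper):

```python
def last_successor_accuracy(trace):
    """Score the 'last successor' file-access predictor on an access trace:
    for each file A, predict the file that followed A on its previous access.
    Returns hits / predictions (0.0 if no prediction could be made)."""
    successor = {}          # file -> file seen immediately after it last time
    hits = predictions = 0
    for prev, nxt in zip(trace, trace[1:]):
        if prev in successor:            # we have a prediction for `prev`
            predictions += 1
            hits += successor[prev] == nxt
        successor[prev] = nxt            # update the last-seen successor
    return hits / predictions if predictions else 0.0

# Toy trace: "a" is usually followed by "b", so most predictions hit.
accuracy = last_successor_accuracy(["a", "b", "a", "b", "a", "c"])
```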

  18. Thermodynamic equilibrium-air correlations for flowfield applications

    Science.gov (United States)

    Zoby, E. V.; Moss, J. N.

    1981-01-01

    Equilibrium-air thermodynamic correlations have been developed for flowfield calculation procedures. A comparison between the postshock results computed by the correlation equations and detailed chemistry calculations is very good. The thermodynamic correlations are incorporated in an approximate inviscid flowfield code with a convective heating capability for the purpose of defining the thermodynamic environment through the shock layer. Comparisons of heating rates computed by the approximate code and a viscous-shock-layer method are good. In addition to presenting the thermodynamic correlations, the impact of several viscosity models on the convective heat transfer is demonstrated.

  19. Heterogeneity in the WTP for recreational access

    DEFF Research Database (Denmark)

    Campbell, Danny; Vedel, Suzanne Elizabeth; Thorsen, Bo Jellesmark

    2014-01-01

    In this study we have addressed appropriate modelling of heterogeneity in willingness to pay (WTP) for environmental goods, and have demonstrated its importance using a case of forest access in Denmark. We compared WTP distributions for four models: (1) a multinomial logit model, (2) a mixed logit model assuming a univariate Normal distribution, (3) a mixed logit model assuming a multivariate Normal distribution allowing for correlation across attributes, and (4) a mixture of two truncated Normal distributions, allowing for correlation among attributes. In the first two models mean WTP for enhanced access was negative. However, models accounting for preference heterogeneity found a positive mean WTP, but a large sub-group with negative WTP. Accounting for preference heterogeneity can alter overall conclusions, which highlights the importance of this for policy recommendations.

  20. Evaluating mobile centric information access and interaction compatibility for learning websites

    CSIR Research Space (South Africa)

    Chipangura, B

    2013-11-01

    Full Text Available guidelines for One Web design, not all websites meet these standards. Research has shown that accessing websites that were designed for desktop computer access on mobile handheld devices results in a negative user experience [12]. The reasons... to identify mobile phone accessibility problems of university websites [14, 18]. At the organizational level, many universities are struggling with adapting their current desktop-based websites to be accessible on mobile devices [20]. A number...

  1. Computer networks and their implications for nuclear data

    International Nuclear Information System (INIS)

    Carlson, J.

    1992-01-01

    Computer networks represent a valuable resource for accessing information. Just as the computer has revolutionized the ability to process and analyze information, networks have and will continue to revolutionize data collection and access. A number of services are in routine use that would not be possible without the presence of an (inter)national computer network (which will be referred to as the internet). Services such as electronic mail, remote terminal access, and network file transfers are almost a required part of any large scientific/research organization. These services only represent a small fraction of the potential uses of the internet; however, the remainder of this paper discusses some of these uses and some technological developments that may influence these uses

  2. The equipment access software for a distributed UNIX-based accelerator control system

    International Nuclear Information System (INIS)

    Trofimov, Nikolai; Zelepoukine, Serguei; Zharkov, Eugeny; Charrue, Pierre; Gareyte, Claire; Poirier, Herve

    1994-01-01

    This paper presents a generic equipment access software package for a distributed control system using computers with UNIX or UNIX-like operating systems. The package consists of three main components, an application Equipment Access Library, Message Handler and Equipment Data Base. An application task, which may run in any computer in the network, sends requests to access equipment through Equipment Library calls. The basic request is in the form Equipment-Action-Data and is routed via a remote procedure call to the computer to which the given equipment is connected. In this computer the request is received by the Message Handler. According to the type of the equipment connection, the Message Handler either passes the request to the specific process software in the same computer or forwards it to a lower level network of equipment controllers using MIL1553B, GPIB, RS232 or BITBUS communication. The answer is then returned to the calling application. Descriptive information required for request routing and processing is stored in the real-time Equipment Data Base. The package has been written to be portable and is currently available on DEC Ultrix, LynxOS, HPUX, XENIX, OS-9 and Apollo domain. ((orig.))
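    The Equipment-Action-Data routing described above can be sketched in miniature: a Message Handler looks up the process software registered for a piece of equipment and forwards the request. This is a toy illustration of the pattern, not the actual CERN package API; all names and payload shapes are invented:

```python
class MessageHandler:
    """Routes Equipment-Action-Data requests to per-equipment handlers."""

    def __init__(self):
        self._handlers = {}

    def register(self, equipment, handler):
        """Attach the process software responsible for one equipment."""
        self._handlers[equipment] = handler

    def dispatch(self, equipment, action, data):
        """Route a request and return the answer to the calling application."""
        try:
            handler = self._handlers[equipment]
        except KeyError:
            return {"status": "error", "reason": f"unknown equipment {equipment!r}"}
        return handler(action, data)

def power_supply(action, data):
    # Toy equipment: supports only setting and reading a voltage set-point.
    state = power_supply.state
    if action == "set":
        state["voltage"] = data
        return {"status": "ok"}
    if action == "read":
        return {"status": "ok", "value": state["voltage"]}
    return {"status": "error", "reason": f"unsupported action {action!r}"}
power_supply.state = {"voltage": 0.0}

bus = MessageHandler()
bus.register("PS-1", power_supply)
bus.dispatch("PS-1", "set", 12.5)
reply = bus.dispatch("PS-1", "read", None)
```

    In the real package the dispatch crosses machine boundaries via a remote procedure call and the routing table lives in the Equipment Data Base; here both are collapsed into one in-process dictionary.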

  3. 2 December 2003: Registration of Computers Mandatory for the entire CERN Site

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation is already in place in the IT buildings, building 40 and the Prévessin site, and will cover the whole of CERN by 2 December 2003. We therefore strongly recommend that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible, without waiting for the access restriction to come into force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (security problems, viruses, etc.). - If you have a CERN NICE/mail computing account, register at: http://cern.ch/register/ (CERN Intranet page) - If you don't have a CERN NICE/mail computing account (e.g. short term visitors), register at: http://cern.ch/registerVisitorComputer/...

  4. Correlation of abdominopelvic computed tomography with clinical manifestations in methamphetamine body stuffers.

    Science.gov (United States)

    Bahrami-Motlagh, Hooman; Hassanian-Moghaddam, Hossein; Zamini, Hedieh; Zamani, Nasim; Gachkar, Latif

    2018-02-01

    Little is known about methamphetamine body stuffers and the correlation of clinical manifestations with imaging studies. The current study was performed to determine abdominopelvic computed tomography (CT) findings and clinical manifestations in methamphetamine body stuffers. In an IRB-approved routine database study, demographic characteristics, clinical findings, and CT results of 70 methamphetamine body stuffers were retrieved. According to the clinical manifestations, the patients were categorized into either a benign- or a severe-outcome group. They were also determined to have positive or negative CT results; in the group with positive results, the number and location of the baggies were determined as well. CT results were compared between the two groups. Almost 43% of the patients had positive abdominopelvic CT results. Mean density of the packs was 176.2 ± 152.7 Hounsfield units. On clinical grounds, 57% of the patients were in the benign- and 33% in the severe-outcome group. In the benign group, 45% of the patients had positive CTs, while in the severe-outcome group this figure was 40% (p > 0.05). Apart from the variables used to define severe outcome (seizure, intubation, creatinine level, aspartate aminotransferase level, creatine phosphokinase and troponin level), agitation, on-arrival pulse rate, lactate dehydrogenase, bicarbonate, base excess, loss of consciousness and hospitalization period were correlating factors; however, in regression analysis we could not find a significant variable that prognosticated severe outcome. It seems that there is no relationship between the CT findings and the clinical manifestations of methamphetamine body stuffers. Severe outcomes may be observed even in the face of negative CTs.

  5. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract Genomics is the study of the genome, which produces large amounts of data requiring large storage and computation power. These issues are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to the user, such as easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...

  6. Intelligent computer aided training systems in the real world: Making the technology accessible to the educational mainstream

    Science.gov (United States)

    Kovarik, Madeline

    1993-01-01

    Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact: teachers and educators. Simply handing technology to educators and expecting them to use it as an effective tool is not the answer. This paper provides background on the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.

  7. GRID : unlimited computing power on your desktop Conference MT17

    CERN Multimedia

    2001-01-01

    The Computational GRID is an analogy to the electrical power grid for computing resources. It decouples the provision of computing, data, and networking from their use, and it allows large-scale pooling and sharing of resources distributed world-wide. Every computer, from a desktop to a mainframe or supercomputer, can provide computing power or data for the GRID. The final objective is to plug your computer into the wall and have direct access to huge computing resources immediately, just like plugging in a lamp to get instant light. The GRID will facilitate world-wide scientific collaborations on an unprecedented scale. It will provide transparent access to major distributed resources of computer power, data, information, and collaborations.

  8. Computer network defense system

    Science.gov (United States)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    2017-08-22

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protection of the group of the virtual machines from actions performed by the adversary.
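The connection hand-off described above can be modeled as a toy sketch. All names and the dict-based bookkeeping are illustrative assumptions; the patent does not publish an implementation:

```python
# Toy model of the deception-network hand-off: when an adversary touches a VM,
# every network connection that reaches it is re-pointed at a freshly named
# clone, so the adversary keeps interacting with the deception network.

class DeceptionController:
    def __init__(self):
        self.connections = {}  # connection id -> name of the VM it reaches

    def connect(self, conn_id, vm):
        self.connections[conn_id] = vm

    def quarantine(self, vm):
        """Create a clone name and move every connection for `vm` onto it."""
        clone = f"{vm}-clone"
        for conn_id, target in self.connections.items():
            if target == vm:
                self.connections[conn_id] = clone
        return clone

ctl = DeceptionController()
ctl.connect("tcp-1443", "web-01")
ctl.connect("tcp-2022", "db-01")
clone = ctl.quarantine("web-01")
```

Connections to untouched machines (here "db-01") are left alone; only the accessed VM's traffic is migrated.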

  9. Lecture 7: Worldwide LHC Computing Grid Overview

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This presentation will introduce in an informal, but technically correct way the challenges that are linked to the needs of massively distributed computing architectures in the context of the LHC offline computing. The topics include technological and organizational aspects touching many aspects of LHC computing, from data access, to maintenance of large databases and huge collections of files, to the organization of computing farms and monitoring. Fabrizio Furano holds a Ph.D. in Computer Science and has worked in the field of Computing for High Energy Physics for many years. Some of his preferred topics include application architectures, system design and project management, with focus on performance and scalability of data access. Fabrizio has experience in a wide variety of environments, from private companies to academic research, in particular in object oriented methodologies, mainly using C++. He also has teaching experience at university level in Software Engineering and C++ Programming.

  10. ICT Oriented toward Nyaya: Community Computing in India's Slums

    Science.gov (United States)

    Byker, Erik J.

    2014-01-01

    In many schools across India, access to information and communication technology (ICT) is still a rare privilege. While the Annual Status of Education Report in India (2013) showed a marginal uptick in the number of computers, the opportunities for children to use those computers have remained stagnant. The lack of access to ICT is especially…

  11. First results of the SOAP project. Open access publishing in 2010

    CERN Document Server

    Dallmeier-Tiessen, Suenje; Goerner, Bettina; Hyppoelae, Jenni; Igo-Kemenes, Peter; Kahn, Deborah; Lambert, Simon; Lengenfelder, Anja; Leonard, Chris; Mele, Salvatore; Polydoratou, Panayiota; Ross, David; Ruiz-Perez, Sergio; Schimmer, Ralf; Swaisland, Mark; van der Stelt, Wim

    2010-01-01

    The SOAP (Study of Open Access Publishing) project has compiled data on the present offer for open access publishing in online peer-reviewed journals. Starting from the Directory of Open Access Journals, several sources of data are considered, including inspection of journal web site and direct inquiries within the publishing industry. Several results are derived and discussed, together with their correlations: the number of open access journals and articles; their subject area; the starting date of open access journals; the size and business models of open access publishers; the licensing models; the presence of an impact factor; the uptake of hybrid open access.

  12. The Status of Ubiquitous Computing.

    Science.gov (United States)

    Brown, David G.; Petitto, Karen R.

    2003-01-01

    Explains the prevalence and rationale of ubiquitous computing on college campuses--teaching with the assumption or expectation that all faculty and students have access to the Internet--and offers lessons learned by pioneering institutions. Lessons learned involve planning, technology, implementation and management, adoption of computer-enhanced…

  13. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin Nasaruddin

    2013-09-01

    Full Text Available This paper simulates and analyzes noise of multimedia transmission in a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by the optimal design. In developing multimedia transmission for computer networks, a simulation tool is essential in analyzing the effectiveness of various transmissions of services. In this paper, implementation models are proposed to analyze the multimedia transmission in representative OCDMA computer networks by using MATLAB Simulink tools. Simulation results of the models are discussed, including spectrums of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOC is practically evaluated. Furthermore, system performance is also evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that the system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters to design and implement multimedia transmission in OCDMA computer networks.
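The correlation property the record refers to can be checked directly: for optical orthogonal codes, any two distinct codewords overlap in at most one chip position at every cyclic shift. The two codewords below are toy examples, not from the paper:

```python
# Periodic correlation check for binary optical orthogonal codewords.

def periodic_cross_correlation(x, y, shift):
    """Periodic correlation of binary sequences x and y at a given cyclic shift."""
    n = len(x)
    return sum(x[i] * y[(i + shift) % n] for i in range(n))

def max_cross_correlation(x, y):
    """Worst-case correlation over all cyclic shifts."""
    return max(periodic_cross_correlation(x, y, s) for s in range(len(x)))

# Length-13, weight-3 codewords with ones at positions {0, 1, 3} and {0, 4, 9};
# their pairwise position differences are all distinct, so the maximum
# cross-correlation is one.
c1 = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
c2 = [1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0]
```

The same routine applied to a codeword against itself at nonzero shifts verifies the out-of-phase autocorrelation bound.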

  14. Simulation and Noise Analysis of Multimedia Transmission in Optical CDMA Computer Networks

    Directory of Open Access Journals (Sweden)

    Nasaruddin

    2009-11-01

    Full Text Available This paper simulates and analyzes noise of multimedia transmission in a flexible optical code division multiple access (OCDMA) computer network with different quality of service (QoS) requirements. To achieve multimedia transmission in OCDMA, we have proposed strict variable-weight optical orthogonal codes (VW-OOCs), which can guarantee the smallest correlation value of one by the optimal design. In developing multimedia transmission for computer networks, a simulation tool is essential in analyzing the effectiveness of various transmissions of services. In this paper, implementation models are proposed to analyze the multimedia transmission in representative OCDMA computer networks by using MATLAB Simulink tools. Simulation results of the models are discussed, including spectrums of transmitted signals, superimposed signals, received signals, and eye diagrams with and without noise. Using the proposed models, a multimedia OCDMA computer network using the strict VW-OOC is practically evaluated. Furthermore, system performance is also evaluated by considering avalanche photodiode (APD) noise and thermal noise. The results show that the system performance depends on code weight, received laser power, APD noise, and thermal noise, which should be considered as important parameters to design and implement multimedia transmission in OCDMA computer networks.

  15. Chronic rhinosinusitis: correlation of symptoms with computed ...

    African Journals Online (AJOL)

    Introduction: Symptomatology, nasal endoscopy and Computerised Tomographic (CT) scan have been used to diagnose chronic rhinosinusitis. The value of disease severity score in the assessment of chronic rhinosinusitis has not been well investigated. Hence, this study aims to correlate the pre-operative symptom ...

  16. Correlation between the signal-to-noise ratio improvement factor (KSNR) and clinical image quality for chest imaging with a computed radiography system

    International Nuclear Information System (INIS)

    Moore, C S; Wood, T J; Saunderson, J R; Beavis, A W

    2015-01-01

    This work assessed the appropriateness of the signal-to-noise ratio improvement factor (KSNR) as a metric for the optimisation of computed radiography (CR) of the chest. The results of a previous study, in which four experienced image evaluators graded computer simulated chest images using a visual grading analysis scoring (VGAS) scheme to quantify the benefit of using an anti-scatter grid, were used for the clinical image quality measurement (number of simulated patients = 80). The KSNR was used to calculate the improvement in physical image quality measured in a physical chest phantom. KSNR correlation with VGAS was assessed as a function of chest region (lung, spine and diaphragm/retrodiaphragm), and as a function of x-ray tube voltage in a given chest region. The correlation of the latter was determined by the Pearson correlation coefficient. VGAS and KSNR image quality metrics demonstrated no correlation in the lung region but did show correlation in the spine and diaphragm/retrodiaphragmatic regions. However, there was no correlation as a function of tube voltage in any region; a Pearson correlation coefficient (R) of −0.93 (p = 0.015) was found for lung, a coefficient (R) of −0.95 (p = 0.46) for spine, and a coefficient (R) of −0.85 (p = 0.015) for diaphragm. All demonstrate strong negative correlations indicating conflicting results, i.e. KSNR increases with tube voltage but VGAS decreases. Medical physicists should use the KSNR metric with caution when assessing any potential improvement in clinical chest image quality when introducing an anti-scatter grid for CR imaging, especially in the lung region. This metric may also be a limited descriptor of clinical chest image quality as a function of tube voltage when a grid is used routinely. (paper)
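The Pearson coefficient used in the study above is straightforward to compute; a minimal implementation follows, with made-up data (not the paper's measurements) illustrating a strong negative correlation of the kind reported:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative data: a physical image-quality metric rising with tube voltage
# while the visual grading score falls -> r close to -1.
kvp = [90, 100, 110, 120, 125]
vgas = [3.8, 3.5, 3.1, 2.8, 2.6]
r = pearson_r(kvp, vgas)
```

A coefficient near −1, as here, is exactly the "conflicting results" pattern the abstract describes: one metric improves with voltage while the other degrades.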

  17. Pulmonary mucormycosis. Serial morphologic changes on computed tomography correlate with clinical and pathologic findings

    International Nuclear Information System (INIS)

    Nam, Bo Da; Kim, Tae Jung; Lee, Kyung Soo; Kim, Tae Sung; Chung, Myung Jin; Han, Joungho

    2018-01-01

    To evaluate serial computed tomography (CT) findings of pulmonary mucormycosis correlated with peripheral blood absolute neutrophil count (ANC). Between February 1997 and June 2016, 20 immunocompromised patients (10 males, 10 females; mean age, 48.9 years) were histopathologically diagnosed with pulmonary mucormycosis. On initial (n=20) and follow-up (n=15) CT scans, the patterns of lung abnormalities and their changing features on follow-up scans were evaluated, and the pattern changes were correlated with ANC changes. All patients were immunocompromised. On initial CT scans, nodule (≤3cm)/mass (>3cm) or consolidation with surrounding ground-glass opacity halo (18/20, 90%) was the most common pattern. On follow-up CT, morphologic changes (13/15, 87%) could be seen, including the reversed halo (RH) sign, central necrosis, and air-crescent sign. Although not all cases demonstrated the regular morphologic changes on the same timeline, various combinations of pattern change could be seen in all patients. Sequential morphologic changes were related to recovery of ANC in 13 of 15 patients. Pulmonary mucormycosis most frequently presents as consolidation or nodule/mass with halo sign at CT. Morphologic changes into RH sign, central necrotic cavity or air-crescent sign occur with treatment and recovery of ANC. (orig.)

  18. Time-Shift Correlation Algorithm for P300 Event Related Potential Brain-Computer Interface Implementation

    Directory of Open Access Journals (Sweden)

    Ju-Chi Liu

    2016-01-01

    Full Text Available A highly efficient time-shift correlation algorithm was proposed to deal with the peak time uncertainty of the P300 evoked potential for a P300-based brain-computer interface (BCI). The time-shift correlation series data were collected as the input nodes of an artificial neural network (ANN), and the classification of four LED visual stimuli was selected as the output node. Two operating modes, including fast-recognition mode (FM) and accuracy-recognition mode (AM), were realized. The proposed BCI system was implemented on an embedded system for commanding an adult-size humanoid robot, to evaluate the performance from investigating the ground truth trajectories of the humanoid robot. When the humanoid robot walked in a spacious area, the FM was used to control the robot with a higher information transfer rate (ITR). When the robot walked in a crowded area, the AM was used for high accuracy of recognition to reduce the risk of collision. The experimental results showed that, in 100 trials, the accuracy rate of FM was 87.8% and the average ITR was 52.73 bits/min. In addition, the accuracy rate was improved to 92% for the AM, while the average ITR decreased to 31.27 bits/min due to strict recognition constraints.

  19. Time-Shift Correlation Algorithm for P300 Event Related Potential Brain-Computer Interface Implementation.

    Science.gov (United States)

    Liu, Ju-Chi; Chou, Hung-Chyun; Chen, Chien-Hsiu; Lin, Yi-Tseng; Kuo, Chung-Hsien

    2016-01-01

    A highly efficient time-shift correlation algorithm was proposed to deal with the peak time uncertainty of the P300 evoked potential for a P300-based brain-computer interface (BCI). The time-shift correlation series data were collected as the input nodes of an artificial neural network (ANN), and the classification of four LED visual stimuli was selected as the output node. Two operating modes, including fast-recognition mode (FM) and accuracy-recognition mode (AM), were realized. The proposed BCI system was implemented on an embedded system for commanding an adult-size humanoid robot, to evaluate the performance from investigating the ground truth trajectories of the humanoid robot. When the humanoid robot walked in a spacious area, the FM was used to control the robot with a higher information transfer rate (ITR). When the robot walked in a crowded area, the AM was used for high accuracy of recognition to reduce the risk of collision. The experimental results showed that, in 100 trials, the accuracy rate of FM was 87.8% and the average ITR was 52.73 bits/min. In addition, the accuracy rate was improved to 92% for the AM, while the average ITR decreased to 31.27 bits/min due to strict recognition constraints.
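The core idea of a time-shift correlation feature extractor, correlating a single-trial epoch against a response template over a range of lags, can be sketched as follows. The template and signal are toy data; the actual BCI used EEG epochs feeding an ANN classifier:

```python
def time_shift_correlations(signal, template, max_shift):
    """Correlate an epoch with a template at each shift in [-max_shift, max_shift];
    the resulting series would serve as the ANN input features."""
    feats = []
    for s in range(-max_shift, max_shift + 1):
        acc = 0.0
        for i, t in enumerate(template):
            j = i + s
            if 0 <= j < len(signal):
                acc += signal[j] * t
        feats.append(acc)
    return feats

# Toy data: the template occurs in the signal delayed by two samples,
# so the correlation series peaks at shift +2 despite the timing uncertainty.
template = [0.0, 1.0, 2.0, 1.0, 0.0]
signal = [0.0, 0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
feats = time_shift_correlations(signal, template, max_shift=3)
best_shift = list(range(-3, 4))[feats.index(max(feats))]
```

Feeding the whole correlation series to a classifier, rather than a single-lag value, is what absorbs the P300 peak-time uncertainty.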

  20. Pulmonary mucormycosis. Serial morphologic changes on computed tomography correlate with clinical and pathologic findings

    Energy Technology Data Exchange (ETDEWEB)

    Nam, Bo Da; Kim, Tae Jung; Lee, Kyung Soo; Kim, Tae Sung; Chung, Myung Jin [Samsung Medical Centre, Sungkyunkwan University School of Medicine, Department of Radiology and Centre for Imaging Science, Seoul (Korea, Republic of); Han, Joungho [Samsung Medical Centre, Sungkyunkwan University School of Medicine, Department of Pathology, Seoul (Korea, Republic of)

    2018-02-15

    To evaluate serial computed tomography (CT) findings of pulmonary mucormycosis correlated with peripheral blood absolute neutrophil count (ANC). Between February 1997 and June 2016, 20 immunocompromised patients (10 males, 10 females; mean age, 48.9 years) were histopathologically diagnosed with pulmonary mucormycosis. On initial (n=20) and follow-up (n=15) CT scans, the patterns of lung abnormalities and their changing features on follow-up scans were evaluated, and the pattern changes were correlated with ANC changes. All patients were immunocompromised. On initial CT scans, nodule (≤3cm)/mass (>3cm) or consolidation with surrounding ground-glass opacity halo (18/20, 90%) was the most common pattern. On follow-up CT, morphologic changes (13/15, 87%) could be seen, including the reversed halo (RH) sign, central necrosis, and air-crescent sign. Although not all cases demonstrated the regular morphologic changes on the same timeline, various combinations of pattern change could be seen in all patients. Sequential morphologic changes were related to recovery of ANC in 13 of 15 patients. Pulmonary mucormycosis most frequently presents as consolidation or nodule/mass with halo sign at CT. Morphologic changes into RH sign, central necrotic cavity or air-crescent sign occur with treatment and recovery of ANC. (orig.)

  1. Engineering and Computing Portal to Solve Environmental Problems

    Science.gov (United States)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources and makes it possible to carry out computational experiments, teach parallel technologies and solve computing tasks, including those related to technogenic safety.

  2. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline ASynchronous Interface to COOL (ONASIC) from the Information Service (IS) to LCG/COOL databases. ONASIC avoids possible backpressure from Online Database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases into an Offline Database with history record. The DBStressor application was developed to test and stress the access to the Conditions database using the LCG/COOL interface while operating in an integrated way as a TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests was performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN.

  3. A fuzzy expert system to Trust-Based Access Control in crowdsourcing environments

    Directory of Open Access Journals (Sweden)

    Olusegun Folorunso

    2015-07-01

    Full Text Available Crowdsourcing has been widely accepted across a broad range of application areas. In crowdsourcing environments, the possibility of performing human computation is characterized by risks due to the openness of their web-based platforms, where each crowd worker joins and participates in the process at any time, causing serious effects on the quality of the computation. In this paper, a combination of a Trust-Based Access Control (TBAC) strategy and fuzzy expert systems was used to enhance the quality of human computation in crowdsourcing environments. A TBAC-fuzzy algorithm was developed and implemented using MATLAB 7.6.0 to compute a trust value (Tvalue) and a priority value as evaluated by a fuzzy inference system (FIS), and finally to generate an access decision for each crowd worker. In conclusion, the use of TBAC is feasible in improving the quality of human computation in crowdsourcing environments.
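A trust-based access decision of the kind described above can be sketched minimally. Here a crisp weighted average stands in for the fuzzy inference system, and a fixed threshold stands in for the defuzzified decision; the inputs, weights and threshold are all assumptions, not the paper's:

```python
# Simplified stand-in for a TBAC pipeline: aggregate per-worker evidence into
# a trust value, then gate access on a threshold. The real system evaluates
# the trust and priority values with a fuzzy inference system instead.

def trust_value(accuracy, history, reputation, weights=(0.5, 0.3, 0.2)):
    """Aggregate per-worker evidence (each in [0, 1]) into a trust value."""
    wa, wh, wr = weights
    return wa * accuracy + wh * history + wr * reputation

def access_decision(tvalue, threshold=0.6):
    """Grant access only to workers whose trust value clears the threshold."""
    return "grant" if tvalue >= threshold else "deny"

decision = access_decision(trust_value(accuracy=0.9, history=0.8, reputation=0.7))
```

The fuzzy version replaces the weighted sum with membership functions and rules, but the gating structure is the same.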

  4. Correlation between computed tomographic and magnetic resonance imaging findings of parenchymal lung diseases

    Energy Technology Data Exchange (ETDEWEB)

    Barreto, Miriam Menna; Rafful, Patricia Piazza [Department of Radiology, Federal University of Rio de Janeiro, Rio de Janeiro (Brazil); Rodrigues, Rosana Souza [Department of Radiology, Federal University of Rio de Janeiro, Rio de Janeiro (Brazil); D’Or Institute for Research and Education, Rio de Janeiro, RJ (Brazil); Zanetti, Gláucia [Department of Radiology, Federal University of Rio de Janeiro, Rio de Janeiro (Brazil); Hochhegger, Bruno [Complexo Hospitalar Santa Casa de Misericórdia de Porto Alegre, Porto Alegre, RS (Brazil); Souza, Arthur Soares [Department of Radiology, Medical School of Rio Preto (FAMERP) and Ultra X, São José do Rio Preto, SP (Brazil); Guimarães, Marcos Duarte [Department of Imaging, Hospital AC Camargo, São Paulo, SP (Brazil); Marchiori, Edson, E-mail: edmarchiori@gmail.com [Department of Radiology, Federal University of Rio de Janeiro, Rio de Janeiro (Brazil)

    2013-09-15

    Computed tomography (CT) is considered to be the gold standard method for the assessment of morphological changes in the pulmonary parenchyma. Although its spatial resolution is lower than that of CT, MRI offers the advantage of characterizing different aspects of tissue based on the degree of contrast on T1-weighted image (WI) and T2-WI. In this article, we describe and correlate the MRI and CT features of several common patterns of parenchymal lung disease (air trapping, atelectasis, bronchiectasis, cavitation, consolidation, emphysema, ground-glass opacities, halo sign, interlobular septal thickening, masses, mycetoma, nodules, progressive massive fibrosis, reverse halo sign and tree-in-bud pattern). MRI may be an alternative modality for the collection of morphological and functional information useful for the management of parenchymal lung disease, which would help reduce the number of chest CT scans and radiation exposure required in patients with a variety of conditions.

  5. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  6. submitter Studies of CMS data access patterns with machine learning techniques

    CERN Document Server

    De Luca, Silvia

    This thesis presents a study of the Grid data access patterns in distributed analysis in the CMS experiment at the LHC accelerator. This study ranges from a deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised Machine Learning classification system to set up a machinery able to eventually predict future data access patterns - i.e. the so-called dataset "popularity" of the CMS datasets on the Grid - with focus on specific data types. All the CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centres (Tiers), and in particular the distributed analysis system sustains hundreds of users and applications submitted every day. These applications (or "jobs") access different data types hosted on disk storage systems at a large set of WLCG Tiers. The detailed study of how this data is accessed, in terms of data types, hosting Tiers, and different time periods, makes it possible to gain precious insight on storage occupancy ove...

  7. 3D CFD computations of transitional flows using DES and a correlation based transition model; Wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen, Niels N.

    2009-07-15

    The report describes the application of the correlation based transition model of Menter et al. [1, 2] to the cylinder drag crisis and the stalled flow over a DU-96-W-351 airfoil using the DES methodology. When predicting the flow over airfoils and rotors, the laminar-turbulent transition process can be important for the aerodynamic performance. Today, the most widespread approach is to use fully turbulent computations, where the transitional process is ignored and the entire boundary layer on the wings or airfoils is handled by the turbulence model. The correlation based transition model has lately shown promising results, and the present paper describes the application of the model to predict the drag and shedding frequency for flow around a cylinder from sub- to super-critical Reynolds numbers. Additionally, the model is applied to the flow around the DU-96 airfoil at high angles of attack. (au)

  8. Automated quantification of pulmonary emphysema from computed tomography scans: comparison of variation and correlation of common measures in a large cohort

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    The purpose of this work was to retrospectively investigate the variation of standard indices of pulmonary emphysema from helical computed tomographic (CT) scans as related to inspiration differences over a 1 year interval, and to determine the strength of the relationship between these measures in a large cohort. 626 patients that had 2 scans taken at an interval of 9 months to 15 months (μ: 381 days, σ: 31 days) were selected for this work. All scans were acquired at a 1.25mm slice thickness using a low dose protocol. For each scan, the emphysema index (EI), fractal dimension (FD), mean lung density (MLD), and 15th percentile of the histogram (HIST) were computed. The absolute and relative changes for each measure were computed and the empirical 95% confidence interval was reported on both non-normalized and normalized scales. Spearman correlation coefficients were computed between the relative change in each measure and the relative change in inspiration between each scan-pair, as well as between each pair-wise combination of the four measures. EI varied on a range of -10.5 to 10.5 on a non-normalized scale and -15 to 15 on a normalized scale, with FD and MLD showing slightly larger but comparable spreads, and HIST having a much larger variation. MLD was found to show the strongest correlation to inspiration change (r = 0.85). We conclude that emphysema index and fractal dimension have the least variability overall of the commonly used measures of emphysema and that they offer the most unique quantification of emphysema relative to each other.
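The Spearman coefficient used above can be computed with the no-ties rank formula, ρ = 1 − 6Σd²/(n(n²−1)). The data below are illustrative, not the study's measurements:

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation via rho = 1 - 6*sum(d^2) / (n*(n^2 - 1)),
    valid when neither sample contains ties."""
    n = len(xs)

    def ranks(v):
        order = sorted(range(n), key=lambda i: v[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Made-up relative changes: deeper inspiration -> lower mean lung density,
# a perfectly inverse monotone relationship, so rho comes out as -1.
insp_change = [0.02, -0.05, 0.10, 0.01, -0.08]
mld_change = [-0.01, 0.03, -0.06, 0.00, 0.05]
rho = spearman_rho(insp_change, mld_change)
```

Because Spearman works on ranks, it captures the monotone inspiration-density relationship regardless of whether it is linear.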

  9. Brain neuroimaging of domestic cats: correlation between computed tomography and cross-sectional anatomy

    International Nuclear Information System (INIS)

    Nepomuceno, A.C.; Zanatta, R.; Chung, D.G.; Costa, P.F.; Feliciano, M.A.R.; Avante, M.L.; Canola, J.C.; Lopes, L.S.

    2016-01-01

    Computed tomography of the brain is necessary as part of the diagnosis of lesions of the central nervous system. In this study we used six domestic cats, male and female, aged between one and five years and without clinical signs of central nervous system disorders, evaluated by computed tomography (CT). Two animals euthanized for a condition unrelated to the nervous system were also included in the study. The proposal consisted in establishing a detailed anatomical description of tomographic images of the normal brain of cats, using as reference anatomical images of cross sections of the stained brain and head, with thicknesses similar to the planes of the CT images. CT examinations of the live animals were performed with and without intravenous iodinated contrast media. In one euthanized animal, the brain was removed and immediately preserved in 10% formalin, then later cut into cross sections of approximately 4 mm and stained by the technique of Barnard and Robert Brown. The head of the other animal was disarticulated at the atlanto-occipital region, frozen at -20 deg C, and then sliced to a thickness of about 5 mm. The description of the anatomical structures visualized on tomography is useful as a guide: it allows identifying with relative accuracy the brain region affected by an injury and correlating it with the clinical signs of the patient, providing additional information to veterinarians in the clinical and surgical management of this species. (author)

  10. Brain neuroimaging of domestic cats: correlation between computed tomography and cross-sectional anatomy

    Energy Technology Data Exchange (ETDEWEB)

    Nepomuceno, A.C. [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Zanatta, R. [Universidade de Cuiaba, MT (Brazil); Chung, D.G.; Costa, P.F.; Feliciano, M.A.R.; Avante, M.L.; Canola, J.C., E-mail: marcusfeliciano@yahoo.com.br [Faculdade de Ciencias Agrarias e Veterinarias, Jaboticabal, SP (Brazil); Lopes, L.S. [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil)

    2016-09-15

    Computed tomography of the brain is necessary as part of the diagnosis of lesions of the central nervous system. In this study we used six domestic cats, male and female, aged between one and five years and without clinical signs of central nervous system disorders, evaluated by computed tomography (CT). Two animals euthanized for a condition unrelated to the nervous system were also included in the study. The proposal consisted in establishing a detailed anatomical description of tomographic images of the normal brain of cats, using as reference anatomical images of cross sections of the stained brain and head, with thicknesses similar to the planes of the CT images. CT examinations of the live animals were performed with and without intravenous iodinated contrast media. In one euthanized animal, the brain was removed and immediately preserved in 10% formalin, then later cut into cross sections of approximately 4 mm and stained by the technique of Barnard and Robert Brown. The head of the other animal was disarticulated at the atlanto-occipital region, frozen at -20 deg C, and then sliced to a thickness of about 5 mm. The description of the anatomical structures visualized on tomography is useful as a guide: it allows identifying with relative accuracy the brain region affected by an injury and correlating it with the clinical signs of the patient, providing additional information to veterinarians in the clinical and surgical management of this species. (author)

  11. The Problem of Subject Access to Visual Materials

    Directory of Open Access Journals (Sweden)

    Heather P. Jespersen

    2004-09-01

    Full Text Available This article discusses the problem of giving subject access to works of art. We survey both concept-based and content-based access, by indexers/catalogers and by computers respectively, as well as issues of interoperability, database and indexer consistency, and cataloging standards. The authors, both of whom are trained art historians, question attempts to mystify fine art subject matter through the creation of clever library science systems that are executed by the naive. Only when trained art historians and knowledgeable catalogers are finally responsible for providing subject access to works of art will true interoperability and consistency happen.

  12. The neural correlates of problem states: testing FMRI predictions of a computational model of multitasking.

    Directory of Open Access Journals (Sweden)

    Jelmer P Borst

    Full Text Available BACKGROUND: It has been shown that people can only maintain one problem state, or intermediate mental representation, at a time. When more than one problem state is required, for example in multitasking, performance decreases considerably. This effect has been explained in terms of a problem state bottleneck. METHODOLOGY: In the current study we use the complementary methodologies of computational cognitive modeling and neuroimaging to investigate the neural correlates of this problem state bottleneck. In particular, an existing computational cognitive model was used to generate a priori fMRI predictions for a multitasking experiment in which the problem state bottleneck plays a major role. Hemodynamic responses were predicted for five brain regions, corresponding to five cognitive resources in the model. Most importantly, we predicted the intraparietal sulcus to show a strong effect of the problem state manipulations. CONCLUSIONS: Some of the predictions were confirmed by a subsequent fMRI experiment, while others were not matched by the data. The experiment supported the hypothesis that the problem state bottleneck is a plausible cause of the interference in the experiment and that it could be located in the intraparietal sulcus.

  13. Access to augmentative and alternative communication: new technologies and clinical decision-making.

    Science.gov (United States)

    Fager, Susan; Bardach, Lisa; Russell, Susanne; Higginbotham, Jeff

    2012-01-01

    Children with severe physical impairments require a variety of access options to augmentative and alternative communication (AAC) and computer technology. Access technologies have continued to develop, allowing children with severe motor control impairments greater independence and access to communication. This article will highlight new advances in access technology, including eye and head tracking, scanning, and access to mainstream technology, as well as discuss future advances. Considerations for clinical decision-making and implementation of these technologies will be presented along with case illustrations.

  14. GeoBrain Computational Cyber-laboratory for Earth Science Studies

    Science.gov (United States)

    Deng, M.; di, L.

    2009-12-01

    Computational approaches (e.g., computer-based data visualization, analysis and modeling) are critical for conducting increasingly data-intensive Earth science (ES) studies to understand functions and changes of the Earth system. However, Earth scientists, educators, and students currently face two major barriers that prevent them from effectively using computational approaches in their learning, research and application activities: 1) difficulties in finding, obtaining, and using multi-source ES data; and 2) lack of analytic functions and computing resources (e.g., analysis software, computing models, and high-performance computing systems) to analyze the data. Taking advantage of recent advances in cyberinfrastructure, Web service, and geospatial interoperability technologies, GeoBrain, a project funded by NASA, has developed a prototype computational cyber-laboratory to effectively remove these two barriers. The cyber-laboratory makes ES data and computational resources at large organizations in distributed locations available to, and easily usable by, the Earth science community through 1) enabling seamless discovery, access and retrieval of distributed data, 2) federating and enhancing data discovery with a catalogue federation service and a semantically-augmented catalogue service, 3) customizing data access and retrieval at user request with interoperable, personalized, and on-demand data access and services, 4) automating or semi-automating multi-source geospatial data integration, 5) developing a large number of analytic functions as value-added, interoperable, and dynamically chainable geospatial Web services and deploying them in high-performance computing facilities, 6) enabling online geospatial process modeling and execution, and 7) building a user-friendly, extensible web portal for users to access the cyber-laboratory resources.
Users can interactively discover the needed data and perform on-demand data analysis and

  15. Access to, interest in and attitude toward e-learning for continuous education among Malaysian nurses.

    Science.gov (United States)

    Chong, Mei Chan; Francis, Karen; Cooper, Simon; Abdullah, Khatijah Lim; Hmwe, Nant Thin Thin; Sohod, Salina

    2016-01-01

    Continuous nursing education (CNE) courses delivered through e-learning are believed to be an effective mode of learning for nurses. Implementation of e-learning modules requires pre-assessment of infrastructure and learners' characteristics. Understanding the learners' needs and their perspectives would facilitate effective e-learning delivery by addressing the underlying issues and providing necessary support to learners. The aim of this study was to examine access to computer and Internet facilities, interest in and preferences regarding e-learning, and attitudes toward e-learning among nurses in Peninsular Malaysia. The study utilized a cross-sectional descriptive survey of 300 registered nurses at government hospitals and community clinics in four main regions of Peninsular Malaysia. Data were collected using questionnaires, which consisted of demographic and background items and questions on access to computer and Internet facilities, interest and preferences in e-learning, and attitudes toward e-learning. Descriptive analysis and a chi-squared test were used to identify associations between variables. Most Malaysian nurses had access to a personal or home computer (85.3%, n=256) and computer access at work (85.3%, n=256). The majority had Internet access at home (84%, n=252) and at work (71.8%, n=215); however, average hours of weekly computer use were low. Most nurses (83%, n=249) had no e-learning experience but were interested in e-learning activities. Most nurses displayed positive attitudes toward e-learning. Average weekly computer use and interest in e-learning were positively associated with attitudes toward e-learning. The findings suggest that organizational support is needed to promote the accessibility of information and communications technology (ICT) facilities for Malaysian nurses, to motivate their involvement in e-learning. Copyright © 2015. Published by Elsevier Ltd.

  16. Correlation of chest computed tomography findings with dyspnea and lung functions in post-tubercular sequelae

    Directory of Open Access Journals (Sweden)

    Ananya Panda

    2016-01-01

    Full Text Available Aims: To study the correlation between dyspnea, radiological findings, and pulmonary function tests (PFTs) in patients with sequelae of pulmonary tuberculosis (TB). Materials and Methods: Clinical history, chest computed tomography (CT), and PFTs of patients with post-TB sequelae were recorded. Dyspnea was graded according to the Modified Medical Research Council (mMRC) scale. CT scans were analyzed for fibrosis, cavitation, bronchiectasis, consolidation, nodules, and aspergilloma. Semi-quantitative analysis was done for these abnormalities. Scores were added to obtain a total morphological score (TMS). The lungs were also divided into three zones and scores added to obtain the total lung score (TLS). Spirometry was done for forced vital capacity (FVC), forced expiratory volume in 1 s (FEV1), and FEV1/FVC. Results: Dyspnea was present in 58/101 patients. A total of 22/58 patients had mMRC Grade 1, and 17/58 patients each had Grade 2 and Grade 3 dyspnea. There was a significant difference in median fibrosis, bronchiectasis, and nodule scores (P < 0.01), TMS, and TLS (P < 0.0001) between the dyspnea and nondyspnea groups. Significant correlations were obtained between grades of dyspnea and fibrosis (r = 0.34, P = 0.006), bronchiectasis (r = 0.35, P = 0.004), and nodule (r = 0.24, P = 0.016) scores, TMS (r = 0.398, P = 0.000), and TLS (r = 0.35, P = 0.0003). PFTs were impaired in 78/101 (77.2%) patients. A restrictive defect was most common (39.6%), followed by a mixed defect (34.7%). There was a negative but statistically insignificant trend between PFTs and fibrosis, bronchiectasis, and nodule scores, TMS, and TLS. However, there were significant differences in median fibrosis, cavitation, and bronchiectasis scores in patients with normal, mild to moderate, and severe respiratory defects. No difference was seen in TMS and TLS according to the severity of the respiratory defect. Conclusion: Both fibrosis and bronchiectasis correlated with dyspnea and with PFTs.
However, this correlation was not

  17. Empowering Middle School Teachers with Portable Computers.

    Science.gov (United States)

    Weast, Jerry D.; And Others

    1993-01-01

    A Sioux Falls (South Dakota) project that supplied middle school teachers with Macintosh computers and training to use them showed gratifying results. Easy access to portable notebook computers made teachers more active computer users, increased teacher interaction and collaboration, enhanced teacher productivity regarding management tasks and…

  18. Audio Quality Assurance : An Application of Cross Correlation

    DEFF Research Database (Denmark)

    Jurik, Bolette Ammitzbøll; Nielsen, Jesper Asbjørn Sindahl

    2012-01-01

    We describe algorithms for automated quality assurance on content of audio files in context of preservation actions and access. The algorithms use cross correlation to compare the sound waves. They are used to do overlap analysis in an access scenario, where preserved radio broadcasts are used in...

  19. Polyphenolic Composition and Antioxidant Activities of 6 New Turmeric (Curcuma Longa L.) Accessions.

    Science.gov (United States)

    Chinedum, Eleazu; Kate, Eleazu; Sonia, Chukwuma; Ironkwe, Adanma; Andrew, Igwe

    2015-01-01

    The phytochemical composition and antioxidant capacities of 6 new NRCRI turmeric (Curcuma longa L.) accessions (39, 35, 60, 30, 50 and 41) were determined using standard techniques. The moisture contents of the turmeric samples ranged from 15.75 to 47.80%, and the curcumin contents fell within the range reported for turmeric in other countries of the world. Furthermore, the turmeric accessions contained considerable amounts of antioxidants (measured using 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical and reducing power assays), alkaloids, flavonoids, anthocyanins, and phenolics. There was significant correlation between the anthocyanin contents of the turmeric accessions and their alkaloid (0.744) and flavonoid (0.986) contents, suggesting an additive effect between the anthocyanins and alkaloids in turmeric; significant correlation between the inhibition of DPPH radical by the turmeric accessions and their flavonoid (0.892) and anthocyanin (0.949) contents; and significant correlation between the reducing power of the turmeric accessions and their flavonoid (0.973) and anthocyanin (0.974) contents, suggesting that anthocyanins, as flavonoids, largely contribute to the antioxidant activities of turmeric. The positive regression recorded between inhibition of DPPH radical by the turmeric accessions and quercetin versus reducing power (R2 = 0.852) suggests that any of these methods could be used to assess the antioxidant activities of turmeric. Finally, the study indicated the potential of the turmeric accessions, especially accessions 30 and 50, as promising sources of antioxidants.

  20. Computer Architecture A Quantitative Approach

    CERN Document Server

    Hennessy, John L

    2011-01-01

    The computing world today is in the middle of a revolution: mobile clients and cloud computing have emerged as the dominant paradigms driving programming and hardware innovation today. The Fifth Edition of Computer Architecture focuses on this dramatic shift, exploring the ways in which software and technology in the cloud are accessed by cell phones, tablets, laptops, and other mobile computing devices. Each chapter includes two real-world examples, one mobile and one datacenter, to illustrate this revolutionary change.Updated to cover the mobile computing revolutionEmphasizes the two most im

  1. Comparison between film-screen and computed radiography systems in Brazilian mammography

    International Nuclear Information System (INIS)

    Vieira, L.A.; Oliveira, J.R.; Carvalho, L.A.P.; César, A.C.Z.; Nogueira, M.S.

    2015-01-01

    Since 2004 the Public Health Office of the State of Minas Gerais, Brazil, has run an Image Quality Control Program in Mammography. It evaluates image quality using a phantom accredited by the Brazilian College of Radiology (CBR), which follows international standards for test objects such as masses, specks, fibers, contrast details and spatial resolution. The contrast index (CI) is assessed through optical density (OD) measurements. Although OD is defined for film-screen (FS) systems, almost 80% of the mammographic systems under the health office's surveillance are based on computed radiography (CR). A need has therefore emerged to adapt the protocol so that OD can be used as a conformity parameter. Objective: To verify whether OD can be measured on printed CR films and whether the contrast index can be calculated from them, in comparison with FS. Results: A total of 56 images were evaluated with three different CBR phantoms, equally divided between FS and CR systems, and a densitometer was used to read out their OD values. The correlation between their contrast-to-noise ratios (CNR) was of the order of 0.77 (±0.14), and the samples were not significantly different (within 5% uncertainty) for every phantom. The CNR correlation coefficient was 0.871. For OD, the correlation coefficient was 0.989, and a log-fit function showed good agreement with the detector response. The OD-normalized standard deviation difference between CR and FS for the different phantoms was 36.6%, 2.8% and 20.2%. CI values for CR were found to lie between 0.13 and 0.69. Conclusions: Different phantoms were successfully tested in both CR and FS to evaluate the feasibility of using the contrast index as a conformity parameter, since the correlations are strictly related to the calibration curve provided by the phantom manufacturer. The relative CR-FS OD σ-difference provides a spreading indicator, by which the first and last phantoms are considerably out of expectation. Such differences are

  2. Generalized drift-flux correlation

    International Nuclear Information System (INIS)

    Takeuchi, K.; Young, M.Y.; Hochreiter, L.E.

    1991-01-01

    A one-dimensional drift-flux model with five conservation equations is frequently employed in major computer codes, such as TRAC-PD2, and in simulator codes. In this method, the relative velocity between liquid and vapor phases, or slip ratio, is given by correlations, rather than by direct solution of the phasic momentum equations, as in the case of the two-fluid model used in TRAC-PF1. The correlations for churn-turbulent bubbly flow and slug flow regimes were given in terms of drift velocities by Zuber and Findlay. For the annular flow regime, the drift velocity correlations were developed by Ishii et al., using interphasic force balances. Another approach is to define the drift velocity so that flooding and liquid hold-up conditions are properly simulated, as reported here. The generalized correlation is used to reanalyze the MB-2 test data for two-phase flow in a large-diameter pipe. The results are applied to the generalized drift flux velocity, whose relationship to the other correlations is discussed. Finally, the generalized drift flux correlation is implemented in TRAC-PD2. Flow reversal from countercurrent to cocurrent flow is computed in small-diameter U-shaped tubes and is compared with the flooding curve
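    For orientation, the Zuber–Findlay relation referenced in this abstract can be recalled in its standard textbook form (a sketch, not reproduced from the report itself; \(C_0\) is the distribution parameter, \(V_{gj}\) the drift velocity, \(j\) the total volumetric flux, and \(\alpha\) the void fraction):

```latex
% Zuber--Findlay drift-flux relation (standard textbook form; not from the report)
\[
  \langle\langle v_g \rangle\rangle = C_0\,\langle j \rangle + \langle\langle V_{gj} \rangle\rangle ,
  \qquad
  \langle \alpha \rangle
    = \frac{\langle j_g \rangle}{C_0\,\langle j \rangle + \langle\langle V_{gj} \rangle\rangle} ,
\]
% e.g., for the churn-turbulent bubbly flow regime:
\[
  \langle\langle V_{gj} \rangle\rangle
    = 1.41 \left[ \frac{\sigma\, g\, (\rho_l - \rho_g)}{\rho_l^{2}} \right]^{1/4} .
\]
```

Defining the drift velocity so that flooding and liquid hold-up are reproduced, as the abstract describes, amounts to choosing \(C_0\) and \(V_{gj}\) to match those limiting conditions rather than a particular flow regime.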

  3. Higher point spin field correlators in D=4 superstring theory

    International Nuclear Information System (INIS)

    Haertl, D.; Schlotterer, O.; Stieberger, S.

    2010-01-01

    Calculational tools are provided that allow one to determine general tree-level scattering amplitudes for processes involving bosons and fermions in heterotic and superstring theories in four space-time dimensions. We compute higher-point superstring correlators involving massless four-dimensional fermionic and spin fields. In D=4 these correlators boil down to a product of two pure spin field correlators of left- and right-handed spin fields. This observation greatly simplifies the computation of such correlators. The latter are basic ingredients for computing multi-fermion superstring amplitudes in D=4, whose underlying fermionic structure and fermionic couplings in the effective action are determined by these correlators.

  4. Fencing network direct memory access data transfers in a parallel active messaging interface of a parallel computer

    Science.gov (United States)

    Blocksome, Michael A.; Mamidala, Amith R.

    2015-07-07

    Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.

  5. Optimising LAN access to grid enabled storage elements

    International Nuclear Information System (INIS)

    Stewart, G A; Dunne, B; Elwell, A; Millar, A P; Cowan, G A

    2008-01-01

    When operational, the Large Hadron Collider experiments at CERN will collect tens of petabytes of physics data per year. The worldwide LHC computing grid (WLCG) will distribute this data to over two hundred Tier-1 and Tier-2 computing centres, enabling particle physicists around the globe to access the data for analysis. Although different middleware solutions exist for effective management of storage systems at collaborating institutes, the patterns of access envisaged for Tier-2s fall into two distinct categories. The first involves bulk transfer of data between different Grid storage elements using protocols such as GridFTP. This data movement will principally involve writing ESD and AOD files into Tier-2 storage. Secondly, once datasets are stored at a Tier-2, physics analysis jobs will read the data from the local SE. Such jobs require a POSIX-like interface to the storage so that individual physics events can be extracted. In this paper we consider the performance of POSIX-like access to files held in Disk Pool Manager (DPM) storage elements, a popular lightweight SRM storage manager from EGEE

  6. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    Full Text Available In this work, we present an innovative image recognition technique based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images into graphs and then compute Minimum Spanning Trees (MSTs) on them. We apply this framework and present an application that automatically computes efficient construction plans (for escalators or low-emission hot spots) connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
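    The MST step of the pipeline described above can be sketched with Kruskal's algorithm over a union-find structure; the graph here is invented toy data standing in for points of interest extracted from an image.

```python
def find(parent, x):
    """Union-find root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def kruskal_mst(n, edges):
    """n nodes labelled 0..n-1; edges given as (weight, u, v) tuples.
    Returns (total_weight, chosen_edges)."""
    parent = list(range(n))
    total, chosen = 0.0, []
    for w, u, v in sorted(edges):          # consider edges cheapest-first
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:                       # edge joins two components: keep it
            parent[ru] = rv
            total += w
            chosen.append((u, v, w))
    return total, chosen
```

For a connected graph this selects exactly n-1 edges of minimum total weight; in the application above, those edges would be the proposed construction plan linking the points of interest.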

  7. Large-scale particle simulations in a virtual-memory computer

    International Nuclear Information System (INIS)

    Gray, P.C.; Wagner, J.S.; Tajima, T.; Million, R.

    1982-08-01

    Virtual memory computers are capable of executing large-scale particle simulations even when the memory requirements exceed the computer core size. The required address space is automatically mapped onto slow disc memory by the operating system. When the simulation size is very large, frequent random accesses to slow memory occur during the charge accumulation and particle pushing processes. Accesses to slow memory significantly reduce the execution rate of the simulation. We demonstrate in this paper that with the proper choice of sorting algorithm, a nominal amount of sorting to keep physically adjacent particles near particles with neighboring array indices can reduce random access to slow memory, increase the efficiency of the I/O system, and hence, reduce the required computing time
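    The locality idea in this abstract can be illustrated with a minimal sketch (names and data invented, not the paper's code): sorting the particle array by grid-cell index makes particles that deposit charge into the same cell contiguous in memory, so the accumulation loop touches the density array sequentially instead of at random.

```python
def cell_index(x, dx):
    """Grid cell containing position x, for cell size dx."""
    return int(x / dx)

def sort_by_cell(particles, dx):
    """Reorder particle positions so physically adjacent particles
    get neighboring array indices (the sorting step in the abstract)."""
    return sorted(particles, key=lambda x: cell_index(x, dx))

def accumulate_charge(particles, dx, ncells, q=1.0):
    """Nearest-grid-point charge deposition; with sorted input, the
    writes into rho[] proceed cell by cell instead of randomly."""
    rho = [0.0] * ncells
    for x in particles:
        rho[cell_index(x, dx)] += q
    return rho
```

The deposited density is identical for sorted and unsorted input; only the memory-access pattern changes, which is exactly why a nominal amount of sorting can reduce page faults without affecting the physics.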

  8. Streaming movies, media, and instant access

    CERN Document Server

    Dixon, Wheeler Winston

    2013-01-01

    Film stocks are vanishing, but the iconic images of the silver screen remain -- albeit in new, sleeker formats. Today, viewers can instantly stream movies on televisions, computers, and smartphones. Gone are the days when films could only be seen in theaters or rented at video stores: movies are now accessible at the click of a button, and there are no reels, tapes, or discs to store. Any film or show worth keeping may be collected in the virtual cloud and accessed at will through services like Netflix, Hulu, and Amazon Instant.The movies have changed, and we are changing with them.

  9. Time division multiple access for vehicular communications

    CERN Document Server

    Omar, Hassan Aboubakr

    2014-01-01

    This brief focuses on medium access control (MAC) in vehicular ad hoc networks (VANETs), and presents VeMAC, a novel MAC scheme based on distributed time division multiple access (TDMA) for VANETs. The performance of VeMAC is evaluated via mathematical analysis and computer simulations in comparison with other existing MAC protocols, including the IEEE 802.11p standard. This brief aims at proposing TDMA as a suitable MAC scheme for VANETs, which can support the quality-of-service requirements of high priority VANET applications.
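    The TDMA principle underlying schemes like VeMAC can be sketched generically (this is not the VeMAC protocol itself, whose distributed slot-acquisition rules are not given in the abstract; frame length and names are assumptions): a frame of N slots repeats periodically, each node transmits only in its own slot, so nodes holding distinct slots never collide.

```python
FRAME_SLOTS = 8  # assumed frame length

def transmits(node_slot, t):
    """A node holding `node_slot` transmits whenever the repeating
    frame reaches that slot."""
    return t % FRAME_SLOTS == node_slot

def collisions(slot_assignment, horizon):
    """Count time steps in [0, horizon) where two or more nodes
    transmit simultaneously (i.e., share a slot)."""
    count = 0
    for t in range(horizon):
        senders = [n for n, s in slot_assignment.items() if transmits(s, t)]
        if len(senders) > 1:
            count += 1
    return count
```

With distinct slots the collision count is zero by construction; the hard part a real VANET MAC must solve, and what VeMAC addresses, is acquiring and keeping distinct slots in a distributed, mobile setting.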

  10. Security and Privacy in Fog Computing: Challenges

    OpenAIRE

    Mukherjee, Mithun; Matam, Rakesh; Shu, Lei; Maglaras, Leandros; Ferrag, Mohamed Amine; Choudhry, Nikumani; Kumar, Vikas

    2017-01-01

    Open access article. The fog computing paradigm extends the storage, networking, and computing facilities of cloud computing toward the edge of the network, offloading the cloud data centers and reducing service latency for end users. However, the characteristics of fog computing raise new security and privacy challenges. The existing security and privacy measures for cloud computing cannot be directly applied to fog computing due to its features, such as mobility, heteroge...

  11. Cloud computing: An innovative tool for library services

    OpenAIRE

    Sahu, R.

    2015-01-01

    Cloud computing is a new technique of information communication technology, attractive for its potential benefits such as reduced cost and accessibility anywhere at any time, as well as its elasticity and flexibility. This paper gives a definition of cloud computing, its essential characteristics, models of cloud computing, components of the cloud, and the advantages and drawbacks of cloud computing, and also describes cloud computing in libraries.

  12. Functional Multiple-Set Canonical Correlation Analysis

    Science.gov (United States)

    Hwang, Heungsun; Jung, Kwanghee; Takane, Yoshio; Woodward, Todd S.

    2012-01-01

    We propose functional multiple-set canonical correlation analysis for exploring associations among multiple sets of functions. The proposed method includes functional canonical correlation analysis as a special case when only two sets of functions are considered. As in classical multiple-set canonical correlation analysis, computationally, the…

  13. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  14. Octopus: LLL's computing utility

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The Laboratory's Octopus network constitutes one of the greatest concentrations of computing power in the world. This power derives from the network's organization as well as from the size and capability of its computers, storage media, input/output devices, and communication channels. Being in a network enables these facilities to work together to form a unified computing utility that is accessible on demand directly from the users' offices. This computing utility has made a major contribution to the pace of research and development at the Laboratory; an adequate rate of progress in research could not be achieved without it. 4 figures

  15. Telecommunication access to INIS

    International Nuclear Information System (INIS)

    Scheel, H.; Breitfeld, B.; Huebner, B.

    1983-01-01

    Proceeding from the features of on-line retrieval from the INIS data base, a description is given of the technical and organizational conditions established by the national INIS Centre of the GDR in using the INIS direct access service. Data are presented on the structure of search queries, retrieval precision, and connect time to the computer. Experience has shown that efficient dialogue searching necessitates the searcher's skill and familiarity with the system. (author)

  16. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet--school, afterschool programs, and community organizations--is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  17. Access control within military C4ISR systems

    Science.gov (United States)

    Maschino, Mike

    2003-07-01

    Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) tactical battlefield systems must provide the right information and resources to the right individuals at the right time. At the same time, the C4ISR system must enforce access controls to prevent the wrong individuals from obtaining sensitive information, or consuming scarce resources. Because lives, missions and property depend upon them, these access control mechanisms must be effective, reliable, efficient and flexible. The mechanisms employed must suit the nature of the items that are to be protected, as well as the varieties of access policies that must be enforced, and the types of access that will be made to these items. Some access control technologies are inherently centralized, while others are suitable for distributed implementation. The C4ISR architect must select from among the available technologies a combination of mechanisms that eases the burden of policy administration, but is inherently survivable, accurate, resource efficient, and which provides low latency. This paper explores various alternative access enforcement mechanisms, and assesses their effectiveness in managing policy-driven access control within the battlespace.

  18. CLOUD COMPUTING TECHNOLOGY TRENDS

    Directory of Open Access Journals (Sweden)

    Cristian IVANUS

    2014-05-01

    Full Text Available Cloud computing has been a tremendous innovation, through which applications became available online, accessible through an Internet connection and from any computing device (computer, smartphone or tablet). According to one of the most recent studies, conducted in 2012 by Everest Group and Cloud Connect, 57% of companies said they already use SaaS (Software as a Service) applications, and 38% reported using standard PaaS (Platform as a Service) tools. However, in most cases, users of these solutions highlighted that one of the main obstacles to the development of this technology is that, in the cloud, the application is not available without an Internet connection. The new challenge for cloud systems is therefore offline use, specifically accessing SaaS applications without being connected to the Internet. This topic is directly related to user productivity within companies, as productivity growth is one of the key promises of the transformation to cloud computing applications. The aim of this paper is to present some important aspects of offline cloud systems and regulatory trends in the European Union (EU).

  19. Paracoccidioidomycosis: high-resolution computed tomography - anatomo-pathological correlation

    Energy Technology Data Exchange (ETDEWEB)

    Marchiori, Edson; Muniz, Maria Angelica Soares; Santos, Maria Lucia de Oliveira [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Inst. de Radiologia; Moraes, Heleno Pinto de [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Inst. de Patologia; Capone, Domenico [Universidade do Estado, Rio de Janeiro, RJ (Brazil). Inst. de Pneumologia

    2000-12-01

    We reviewed the high-resolution computed tomography scans of 13 patients with paracoccidioidomycosis and correlated the findings with the anatomo-pathological findings of 5 patients. The most frequent findings observed were thickening of the interlobular septa, emphysema, ground-glass areas, thickening of bronchial walls, tracheal dilatation, nodules, cavities and evidence of fibrosing disease such as architectural distortion, parenchymal bands, spicular pleural thickening, intralobular reticulation and thickening with distortion of the axial interstitium. (author)

  20. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services…

  1. Making Spatial Statistics Service Accessible On Cloud Platform

    OpenAIRE

    Mu, X.; Wu, J.; Li, T; Zhong, Y.; Gao, X.

    2014-01-01

    Web services can bring together applications running on diverse platforms; users can access and share various data, information and models more effectively and conveniently from a given web service platform. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rampant growth of massive data and network restrictions, traditional web service platforms have some prominent problems existi...

  2. The effects of home computer access and social capital on mathematics and science achievement among Asian-American high school students in the NELS:88 data set

    Science.gov (United States)

    Quigley, Mark Declan

    The purpose of this research was to examine specific environmental, educational, and demographic factors and their influence on mathematics and science achievement. In particular, the researcher ascertained the interconnections of home computer access and social capital with Asian American students and the effect on mathematics and science achievement. Coleman's theory on social capital and parental influence was used as a basis for the analysis of data. Subjects for this study were the base-year students from the National Education Longitudinal Study of 1988 (NELS:88) and the subsequent follow-up survey data in 1990, 1992, and 1994. The approximate sample size for this study is 640 ethnic Asians from the NELS:88 database. The analysis was a longitudinal study based on the Student and Parent Base Year responses and the Second Follow-up survey of 1992, when the subjects were in 12th grade. Achievement test results from the NELS:88 data were used to measure achievement in mathematics and science. The NELS:88 test battery was developed to measure both individual status and a student's growth in a number of achievement areas. The subjects' responses were analyzed by principal components factor analysis, weights, effect sizes, hierarchical regression analysis, and PLSPath Analysis. The results of this study were that prior ability in mathematics and science is a major influence on the student's educational achievement. Findings from the study support the view that home computer access has a negative direct effect on mathematics and science achievement for both Asian American males and females. None of the social capital factors in the study had either a negative or positive direct effect on mathematics and science achievement, although some indirect effects were found. Suggestions were made toward increasing parental involvement in their children's academic endeavors. Computer access in the home should be considered related to television viewing and should be closely…

  3. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  4. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  5. Commercial counterboard for 10 ns software correlator for photon and fluorescence correlation spectroscopy

    Science.gov (United States)

    Molteni, Matteo; Ferri, Fabio

    2016-11-01

    A 10 ns time resolution, multi-tau software correlator, capable of computing simultaneous autocorrelation (A-A, B-B) and cross-correlation (A-B) functions at count rates up to ˜10 MHz, with no data loss, has been developed in LabVIEW and C++ by using the National Instruments timer/counterboard (NI PCIe-6612) and a fast Personal Computer (PC) (Intel Core i7-4790 Processor, 3.60 GHz). The correlator works by using two algorithms: for large lag times (τ ≳ 1 μs), a classical time-mode scheme, based on measuring the number of pulses per time interval, is used; conversely, for τ ≲ 1 μs a photon-mode (PM) scheme is adopted and the correlation function is retrieved from the sequence of photon arrival times. Single auto- and cross-correlation functions can be processed online in full real time up to count rates of ˜1.8 MHz and ˜1.2 MHz, respectively. Two autocorrelation (A-A, B-B) and a cross-correlation (A-B) function can be simultaneously processed in full real time only up to count rates of ˜750 kHz. At higher count rates, the online processing takes place in a delayed modality, but with no data loss. When tested with simulated correlation data and latex sphere solutions, the overall performance of the correlator is comparable with that of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments of PC and data acquisition technology.
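The multi-tau time-mode scheme described in this record can be sketched in a few lines of NumPy. This is a generic illustration of the algorithm (coarsening the photon-count series by summing adjacent bins at each level), not the authors' LabVIEW/C++ implementation; the channel count `m` and number of `levels` are arbitrary illustrative choices.

```python
import numpy as np

def multi_tau_autocorr(counts, m=8, levels=4):
    """Multi-tau autocorrelation g2(tau) of a photon-count time series.

    Level 0 evaluates lags 1..m on the raw series; each subsequent level
    halves the time resolution by summing adjacent pairs of bins and
    covers lags m/2+1..m at the coarser resolution.
    """
    x = np.asarray(counts, dtype=float)
    taus, g2 = [], []
    dt = 1  # current bin width, in units of the base sampling time
    for level in range(levels):
        lags = range(1, m + 1) if level == 0 else range(m // 2 + 1, m + 1)
        mean = x.mean()
        for k in lags:
            if k >= len(x):
                break
            # normalized intensity correlation: <I(t) I(t+tau)> / <I>^2
            num = np.mean(x[:-k] * x[k:])
            taus.append(k * dt)
            g2.append(num / mean**2)
        # coarsen: sum adjacent pairs of bins for the next level
        n = len(x) // 2
        x = x[:2 * n].reshape(n, 2).sum(axis=1)
        dt *= 2
    return np.array(taus), np.array(g2)
```

For an uncorrelated (shot-noise-only) count series, g2(tau) is flat at 1, which makes a convenient sanity check of the normalization.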

  6. Correlation Functions in Open Quantum-Classical Systems

    Directory of Open Access Journals (Sweden)

    Chang-Yu Hsieh

    2013-12-01

    Full Text Available Quantum time correlation functions are often the principal objects of interest in experimental investigations of the dynamics of quantum systems. For instance, transport properties, such as diffusion and reaction rate coefficients, can be obtained by integrating these functions. The evaluation of such correlation functions entails sampling from quantum equilibrium density operators and quantum time evolution of operators. For condensed phase and complex systems, where quantum dynamics is difficult to carry out, approximations must often be made to compute these functions. We present a general scheme for the computation of correlation functions, which preserves the full quantum equilibrium structure of the system and approximates the time evolution with quantum-classical Liouville dynamics. Several aspects of the scheme are discussed, including a practical and general approach to sample the quantum equilibrium density, the properties of the quantum-classical Liouville equation in the context of correlation function computations, simulation schemes for the approximate dynamics and their interpretation and connections to other approximate quantum dynamical methods.
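For reference, the central objects of this record can be written compactly. These are the textbook definitions of a quantum time correlation function and a Green-Kubo transport coefficient, not the paper's specific quantum-classical Liouville scheme:

```latex
% quantum time correlation function of operators A and B
C_{AB}(t) = \mathrm{Tr}\!\left[\hat{\rho}_{\mathrm{eq}}\,\hat{A}\,
            e^{i\hat{H}t/\hbar}\,\hat{B}\,e^{-i\hat{H}t/\hbar}\right],
\qquad
\hat{\rho}_{\mathrm{eq}} = e^{-\beta\hat{H}}/Z .

% example transport property obtained by integrating such a function:
% the Green-Kubo diffusion coefficient from the velocity autocorrelation
D = \frac{1}{3}\int_0^{\infty}
    \bigl\langle \hat{\mathbf{v}}(0)\cdot\hat{\mathbf{v}}(t) \bigr\rangle \, dt .
```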

  7. Object based data access at the D0 experiment

    International Nuclear Information System (INIS)

    Fuess, S.

    1995-11-01

    The D0 Experiment at Fermilab is currently participating in the FNAL Computing Division's ''Computing for Analysis Project'' (CAP) to investigate object based data storage and access. Following a short description of the CAP system architecture, the D0 data model is explored. A brief discussion of the method of operation of the CAP system leads into a concluding section.

  8. A computational study on outliers in world music

    Science.gov (United States)

    Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as ‘outliers’. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the ‘uniqueness’ of the music of each country. PMID:29253027

  9. A computational study on outliers in world music.

    Science.gov (United States)

    Panteli, Maria; Benetos, Emmanouil; Dixon, Simon

    2017-01-01

    The comparative analysis of world music cultures has been the focus of several ethnomusicological studies in the last century. With the advances of Music Information Retrieval and the increased accessibility of sound archives, large-scale analysis of world music with computational tools is today feasible. We investigate music similarity in a corpus of 8200 recordings of folk and traditional music from 137 countries around the world. In particular, we aim to identify music recordings that are most distinct compared to the rest of our corpus. We refer to these recordings as 'outliers'. We use signal processing tools to extract music information from audio recordings, data mining to quantify similarity and detect outliers, and spatial statistics to account for geographical correlation. Our findings suggest that Botswana is the country with the most distinct recordings in the corpus and China is the country with the most distinct recordings when considering spatial correlation. Our analysis includes a comparison of musical attributes and styles that contribute to the 'uniqueness' of the music of each country.
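The outlier-detection step described in these two records can be illustrated with a Mahalanobis-distance criterion over extracted feature vectors. This is a generic sketch under a roughly-Gaussian-features assumption, not the authors' pipeline; the chi-square cutoff uses the Wilson-Hilferty approximation to avoid a SciPy dependency.

```python
import numpy as np
from statistics import NormalDist

def mahalanobis_outliers(features, quantile=0.999):
    """Flag feature vectors whose squared Mahalanobis distance from the
    corpus mean exceeds a chi-square cutoff with df = n_features."""
    X = np.asarray(features, dtype=float)
    mu = X.mean(axis=0)
    inv_cov = np.linalg.pinv(np.cov(X, rowvar=False))  # pinv guards against a singular covariance
    diff = X - mu
    # quadratic form d2_i = (x_i - mu)^T C^-1 (x_i - mu), one value per row
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    # chi-square quantile via the Wilson-Hilferty approximation
    k = X.shape[1]
    z = NormalDist().inv_cdf(quantile)
    cutoff = k * (1.0 - 2.0 / (9 * k) + z * (2.0 / (9 * k)) ** 0.5) ** 3
    return d2, d2 > cutoff
```

A recording far from the bulk of the corpus in feature space is flagged, while typical recordings are not.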

  10. Cranium-brain trauma in computed tomographs - diagnosis and clinical correlation

    International Nuclear Information System (INIS)

    Wrasse, K.

    1982-01-01

    For the successful treatment of intracranial complications of cranium-brain trauma, a quick and exact diagnosis is necessary. The goal of this work was to test and evaluate the effectiveness of computed tomography for neurotraumatology. In 565 patients with acute or previous cranium-brain trauma, the high validity of computed tomography for these injuries was demonstrated. The value of computed tomography was compared with that of angiography, X-ray diagnostics, echoencephalography, brain scintigraphy, electroencephalography, and neurological-psychopathological findings in cranium-brain trauma. The diagnostic possibilities and difficulties of computed tomography are discussed for the following neurotraumatological diseases: extracranial hematomas; acute cranium-brain trauma; traumatic arachnoidal bleeding; diffuse brain edema; transtentorial herniation; and brain contusions. Finally, the diagnostic and therapeutic procedures in the case of cranium-brain trauma are presented. (orig.) [de

  11. Gambling accessibility: a scale to measure gambler preferences.

    Science.gov (United States)

    Moore, Susan M; Thomas, Anna C; Kyrios, Michael; Bates, Glen; Meredyth, Denise

    2011-03-01

    Geographic closeness of gambling venues is not the only aspect of accessibility likely to affect gambling frequency. Perceived accessibility of gambling venues may include other features such as convenience (e.g., opening hours) or "atmosphere". The aim of the current study was to develop a multidimensional measure of gamblers' perceptions of accessibility, and present evidence for its reliability and validity. We surveyed 303 gamblers with 43 items developed to measure different dimensions of accessibility. Factor analysis of the items produced a two factor solution. The first, Social Accessibility related to the level at which gambling venues were enjoyed because they were social places, provided varying entertainment options and had a pleasant atmosphere. The second factor, Accessible Retreat related to the degree to which venues were enjoyed because they were geographically and temporally available and provided a familiar and anonymous retreat with few interruptions or distractions. Both factors, developed as reliable subscales of the new Gambling Access Scale, demonstrated construct validity through their correlations with other gambling-related measures. Social Accessibility was moderately related to gambling frequency and amount spent, but not to problem gambling, while, as hypothesised, Accessible Retreat was associated with stronger urges to gamble and gambling problems.

  12. ATLAS Distributed Computing in LHC Run2

    International Nuclear Information System (INIS)

    Campana, Simone

    2015-01-01

    The ATLAS Distributed Computing infrastructure has evolved after the first period of LHC data taking in order to cope with the challenges of the upcoming LHC Run-2. An increase in both the data rate and the computing demands of the Monte-Carlo simulation, as well as new approaches to ATLAS analysis, dictated a more dynamic workload management system (Prodsys-2) and data management system (Rucio), overcoming the boundaries imposed by the design of the old computing model. In particular, the commissioning of new central computing system components was the core part of the migration toward a flexible computing model. Flexible computing utilization exploring the use of opportunistic resources such as HPC, cloud, and volunteer computing is embedded in the new computing model; the data access mechanisms have been enhanced with remote access, and the network topology and performance are deeply integrated into the core of the system. Moreover, a new data management strategy, based on a defined lifetime for each dataset, has been introduced to better manage the lifecycle of the data. In this note, an overview of the operational experience with the new system and its evolution is presented. (paper)

  13. Tracheomalacia before and after aortosternopexy: dynamic and quantitative assessment by electron-beam computed tomography with clinical correlation

    International Nuclear Information System (INIS)

    Kao, S.C.S.; Kimura, K.; Smith, W.L.; Sato, Y.

    1995-01-01

    To correlate the dynamics of tracheal collapse with clinical upper airway obstruction before and after aortosternopexy, seven boys and three girls (mean age, 10 months) underwent dynamic evaluation of the trachea by electron-beam computed tomography (EBCT). The site, extent, and severity of collapse were correlated with symptomatology and details of operative procedure. When >50% area collapse was used as the criterion for tracheomalacia, segmental involvement occurred above the aortic arch in all patients, extending to the aortic arch level in only four. Tracheomalacia involved two or fewer 8-mm levels in seven patients and more than two levels in three. Eight patients underwent one aortosternopexy procedure, resulting in clinical improvement in six and correlating well with EBCT findings. Of the remaining two patients who had single aortosternopexy and did not show clinical and radiographic improvement, one required operative repair of a vascular ring and the other continued to have recurrent respiratory tract infections. On the basis of EBCT findings, two patients required additional innominate arteriopexies: One improved, and the other remained symptomatic, requiring tracheostomy. EBCT is a noninvasive modality that allows preoperative diagnosis of tracheomalacia. More importantly, the operative decision and technique are guided by an objective and quantitative assessment of tracheal collapse. (orig.)

  14. Access to DIII-D data located in multiple files and multiple locations

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1993-10-01

    The General Atomics DIII-D tokamak fusion experiment is now collecting over 80 MB of data per discharge once every 10 min, and that quantity is expected to double within the next year. The size of the data files, even in compressed format, is becoming increasingly difficult to handle. Data is also being acquired now on a variety of UNIX systems as well as MicroVAX and MODCOMP computer systems. The existing computers collect all the data into a single shot file, and this data collection is taking an ever increasing amount of time as the total quantity of data increases. Data is not available to experimenters until it has been collected into the shot file, which is in conflict with the substantial need for data examination on a timely basis between shots. The experimenters are also spread over many different types of computer systems (possibly located at other sites). To improve data availability and handling, software has been developed to allow individual computer systems to create their own shot files locally. The data interface routine PTDATA that is used to access DIII-D data has been modified so that a user's code on any computer can access data from any computer where that data might be located. This data access is transparent to the user. Breaking up the shot file into separate files in multiple locations also impacts software used for data archiving, data management, and data restoration

  15. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    Science.gov (United States)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming the norm in geoscience domains. A platform capable of efficiently managing, accessing, analyzing, mining, and learning from big data to extract new information and knowledge is desired. This paper introduces our latest effort to develop such a platform, based on our past years' experience with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the 2nd layer is a cloud computing layer based on virtualization to provide on-demand computing services for upper layers; c) the 3rd layer consists of big data containers that are customized for dealing with different types of data and functionalities; d) the 4th layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.

  16. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  17. Task-role-based Access Control Model in Smart Health-care System

    Directory of Open Access Journals (Sweden)

    Wang Peng

    2015-01-01

    Full Text Available With the development of computer science and smart health-care technology, there is a trend for patients to enjoy medical care at home. Taking the enormous number of users in a Smart Health-care System into consideration, access control is an important issue. Traditional access control models, discretionary access control, mandatory access control, and role-based access control, do not properly reflect the characteristics of a Smart Health-care System. This paper proposes an advanced access control model for the medical health-care environment, the task-role-based access control model, which overcomes the disadvantages of traditional access control models. The task-role-based access control (T-RBAC) model introduces a task concept, dividing tasks into four categories. It also supports a supervision role hierarchy. T-RBAC is a proper access control model for a Smart Health-care System, and it improves the management of access rights. This paper also proposes an implementation of T-RBAC, a binary two-key-lock pair access control scheme using prime factorization.
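The flavor of a prime-factorization access scheme can be sketched as follows. This is a hypothetical illustration of the general idea (a distinct prime per permission, a product as the "lock", divisibility as the access check); the permission names and the encoding are invented here and are not the paper's binary two-key-lock construction.

```python
from math import prod

# Hypothetical permission-to-prime assignment (illustrative only):
# each permission maps to a distinct prime; a subject's "lock" is the
# product of the primes for the permissions its task grants, so an
# access check reduces to a single divisibility test.
PERMISSIONS = {'read_record': 2, 'write_record': 3, 'prescribe': 5, 'audit': 7}

def make_lock(granted):
    """Encode a set of granted permissions as one integer."""
    return prod(PERMISSIONS[p] for p in granted)

def has_permission(lock, permission):
    """The permission's prime divides the lock iff it was granted."""
    return lock % PERMISSIONS[permission] == 0

# e.g. a task that grants record reading and writing, but not prescribing
nurse_lock = make_lock(['read_record', 'write_record'])
```

Unique factorization guarantees the encoding is unambiguous: no combination of other permissions can accidentally produce a lock divisible by an ungranted permission's prime.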

  18. The influence of early computer use on educational achievement in mathematics

    Directory of Open Access Journals (Sweden)

    Andrés Fernández Aráuz

    2014-11-01

    Full Text Available Does early access to computer use improve student achievement? Empirical evidence from experimental designs shows little or no relationship between the use of ICT and academic performance. Using the data collected in PISA, this association can be analyzed some years after the student first had access to a computer. Performing an exploratory data analysis of Costa Rican students in the PISA 2012 mathematics assessment and estimating a linear regression model, even controlling for factors with high explanatory power for academic performance, it is shown that age at first access to a computer is a determinant of educational performance.
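The kind of regression described in this record can be sketched on synthetic data. Everything below is illustrative: the variable names, coefficients, and noise level are invented stand-ins, not the PISA 2012 estimates.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
# Synthetic stand-ins for the survey variables (illustrative only):
age_first_use = rng.uniform(5, 15, n)   # age at first computer access
ses = rng.normal(0, 1, n)               # socioeconomic-status control
# assumed data-generating process: later first access lowers the score
score = 500 - 4.0 * age_first_use + 20.0 * ses + rng.normal(0, 30, n)

# OLS with an intercept: score ~ b0 + b1*age_first_use + b2*ses
X = np.column_stack([np.ones(n), age_first_use, ses])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
b0, b1, b2 = beta
```

With the control included, the estimated coefficient on age at first access recovers the (negative) effect built into the synthetic data.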

  19. Edge computing technologies for Internet of Things: a primer

    Directory of Open Access Journals (Sweden)

    Yuan Ai

    2018-04-01

    Full Text Available With the rapid development of mobile internet and Internet of Things applications, conventional centralized cloud computing is encountering severe challenges, such as high latency, low Spectral Efficiency (SE), and non-adaptive machine-type communication. Motivated to solve these challenges, a new technology is driving a trend that shifts the function of centralized cloud computing to the edge devices of networks. Several edge computing technologies, originating from different backgrounds, have been emerging to decrease latency, improve SE, and support massive machine-type communication. This paper presents a comprehensive tutorial on three typical edge computing technologies, namely mobile edge computing, cloudlets, and fog computing. In particular, the standardization efforts, principles, architectures, and applications of these three technologies are summarized and compared. From the viewpoint of the radio access network, the differences between mobile edge computing and fog computing are highlighted, and the characteristics of fog computing-based radio access networks are discussed. Finally, open issues and future research directions are identified as well. Keywords: Internet of Things (IoT), Mobile edge computing, Cloudlets, Fog computing

  20. A scoping review of cloud computing in healthcare.

    Science.gov (United States)

    Griebel, Lena; Prokosch, Hans-Ulrich; Köpcke, Felix; Toddenroth, Dennis; Christoph, Jan; Leb, Ines; Engel, Igor; Sedlmayr, Martin

    2015-03-19

    Cloud computing is a recent and fast growing area of development in healthcare. Ubiquitous, on-demand access to virtually endless resources in combination with a pay-per-use model allows for new ways of developing, delivering and using services. Cloud computing is often used in an "OMICS-context", e.g. for computing in genomics, proteomics and molecular medicine, while other fields of application still seem to be underrepresented. Thus, the objective of this scoping review was to identify the current state and hot topics in research on cloud computing in healthcare beyond this traditional domain. MEDLINE was searched in July 2013 and in December 2014 for publications containing the terms "cloud computing" and "cloud-based". Each journal and conference article was categorized and summarized independently by two researchers who consolidated their findings. 102 publications were analyzed and 6 main topics were found: telemedicine/teleconsultation, medical imaging, public health and patient self-management, hospital management and information systems, therapy, and secondary use of data. Commonly used features are broad network access for sharing and accessing data and rapid elasticity to dynamically adapt to computing demands. Eight articles favor the pay-per-use characteristics of cloud-based services, avoiding upfront investments. Nevertheless, while 22 articles present very general potentials of cloud computing in the medical domain and 66 articles describe conceptual or prototypic projects, only 14 articles report on successful implementations. Further, in many articles cloud computing is seen as an analogy to internet-/web-based data sharing, and the characteristics of the particular cloud computing approach are unfortunately not really illustrated. Even though cloud computing in healthcare is of growing interest, only few successful implementations yet exist and many papers just use the term "cloud" synonymously for "using virtual machines" or "web…

  1. 15 CFR 740.7 - Computers (APP).

    Science.gov (United States)

    2010-01-01

    ... 4A003. (2) Technology and software. License Exception APP authorizes exports of technology and software... License Exception. (2) Access and release restrictions—(i)Computers and software. Computers and software... software eligible for License Exception APP may not be reexported or transferred (in country) without prior...

  2. Recent trends in grid computing

    International Nuclear Information System (INIS)

    Miura, Kenichi

    2004-01-01

    Grid computing is a technology which allows uniform and transparent access to geographically dispersed computational resources, such as computers, databases, and experimental and observational equipment, via high-speed, high-bandwidth networking. The commonly used analogy is that of the electrical power grid, whereby household electricity is made available from outlets on the wall, and little thought needs to be given to where the electricity is generated and how it is transmitted. The usage of grids also includes distributed parallel computing, high-throughput computing, data-intensive computing (data grid) and collaborative computing. This paper reviews the historical background, software structure, current status and on-going grid projects, including applications of grid technology to nuclear fusion research. (author)

  3. Correlation between Academic and Skills-Based Tests in Computer Networks

    Science.gov (United States)

    Buchanan, William

    2006-01-01

    Computing-related programmes and modules have many problems, especially related to large class sizes, large-scale plagiarism, module franchising, and an increased requirement from students for increased amounts of hands-on, practical work. This paper presents a practical computer networks module which uses a mixture of online examinations and a…

  4. The Role of Genome Accessibility in Transcription Factor Binding in Bacteria.

    Directory of Open Access Journals (Sweden)

    Antonio L C Gomes

    2016-04-01

    Full Text Available ChIP-seq enables genome-scale identification of regulatory regions that govern gene expression. However, the biological insights generated from ChIP-seq analysis have been limited to predictions of binding sites and cooperative interactions. Furthermore, ChIP-seq data often poorly correlate with in vitro measurements or predicted motifs, highlighting that binding affinity alone is insufficient to explain transcription factor (TF-binding in vivo. One possibility is that binding sites are not equally accessible across the genome. A more comprehensive biophysical representation of TF-binding is required to improve our ability to understand, predict, and alter gene expression. Here, we show that genome accessibility is a key parameter that impacts TF-binding in bacteria. We developed a thermodynamic model that parameterizes ChIP-seq coverage in terms of genome accessibility and binding affinity. The role of genome accessibility is validated using a large-scale ChIP-seq dataset of the M. tuberculosis regulatory network. We find that accounting for genome accessibility led to a model that explains 63% of the ChIP-seq profile variance, while a model based on motif score alone explains only 35% of the variance. Moreover, our framework enables de novo ChIP-seq peak prediction and is useful for inferring TF-binding peaks in new experimental conditions by reducing the need for additional experiments. We observe that the genome is more accessible in intergenic regions, and that increased accessibility is positively correlated with gene expression and anti-correlated with distance to the origin of replication. Our biophysically motivated model provides a more comprehensive description of TF-binding in vivo from first principles towards a better representation of gene regulation in silico, with promising applications in systems biology.
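The separation of binding affinity from accessibility can be illustrated with a toy occupancy model. This is a schematic stand-in for a thermodynamic parameterization, not the paper's fitted model; `mu` and `beta` are arbitrary illustrative parameters.

```python
import numpy as np

def predicted_coverage(motif_energy, accessibility, mu=0.0, beta=1.0):
    """Toy thermodynamic ChIP-signal model (illustrative only):
    predicted coverage is TF occupancy scaled by site accessibility.

    Occupancy takes a Fermi-Dirac form in the binding energy E:
        p_bound = 1 / (1 + exp(beta * (E - mu)))
    so lower (more favorable) E means higher occupancy.
    """
    E = np.asarray(motif_energy, dtype=float)
    a = np.asarray(accessibility, dtype=float)   # fraction in [0, 1]
    p_bound = 1.0 / (1.0 + np.exp(beta * (E - mu)))
    return a * p_bound
```

The point of the two-factor form: a strong motif buried in inaccessible chromatin can yield less signal than a weak motif in a fully accessible region.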

  5. Object detection by correlation coefficients using azimuthally averaged reference projections.

    Science.gov (United States)

    Nicholson, William V

    2004-11-01

A method of computing correlation coefficients for object detection that takes advantage of azimuthally averaged reference projections is described and compared with two alternative methods: computing a cross-correlation function, or a local correlation coefficient, versus the azimuthally averaged reference projections. Two examples of an application from structural biology involving the detection of projection views of biological macromolecules in electron micrographs are discussed. It is found that a novel approach to computing a local correlation coefficient versus azimuthally averaged reference projections, using a rotational correlation coefficient, outperforms using a cross-correlation function and a local correlation coefficient in object detection from simulated images with a range of levels of simulated additive noise. The three approaches perform similarly in detecting macromolecular views in electron microscope images of a globular macromolecular complex (the ribosome). The rotational correlation coefficient outperforms the other methods in detection of keyhole limpet hemocyanin macromolecular views in electron micrographs.

  6. Public computing options for individuals with cognitive impairments: survey outcomes.

    Science.gov (United States)

    Fox, Lynn Elizabeth; Sohlberg, McKay Moore; Fickas, Stephen; Lemoncello, Rik; Prideaux, Jason

    2009-09-01

    To examine availability and accessibility of public computing for individuals with cognitive impairment (CI) who reside in the USA. A telephone survey was administered as a semi-structured interview to 145 informants representing seven types of public facilities across three geographically distinct regions using a snowball sampling technique. An Internet search of wireless (Wi-Fi) hotspots supplemented the survey. Survey results showed the availability of public computer terminals and Internet hotspots was greatest in the urban sample, followed by the mid-sized and rural cities. Across seven facility types surveyed, libraries had the highest percentage of access barriers, including complex queue procedures, login and password requirements, and limited technical support. University assistive technology centres and facilities with a restricted user policy, such as brain injury centres, had the lowest incidence of access barriers. Findings suggest optimal outcomes for people with CI will result from a careful match of technology and the user that takes into account potential barriers and opportunities to computing in an individual's preferred public environments. Trends in public computing, including the emergence of widespread Wi-Fi and limited access to terminals that permit auto-launch applications, should guide development of technology designed for use in public computing environments.

  7. Spiral computed tomography during arterial portography of the liver: correlations between radiological and intraoperative findings and evaluation of operability

    International Nuclear Information System (INIS)

    Layer, G.; Runge, I.; Conrad, R.; Pauleit, D.; Jaeger, U.; Schild, H.H.; Gallkowski, U.; Wolff, M.; Hirner, A.

    1999-01-01

Purpose: To evaluate the accuracy of spiral computed tomography during arterial portography (SCTAP) in the detection, localization, and resectability of liver tumors in a correlative study between radiology and intraoperative findings. Method and Materials: Retrospectively, SCTAP images of 168 consecutive patients before liver tumor resection were analyzed. The SCTAP studies (100 ml Iopromid 300 by automated injector with a flow of 3 ml/s; slice thickness, table feed and reconstruction index 5 mm each; scan delay 30 s; 120 kV; 250 mAs) were evaluated for the detection, localization, and resectability of focal liver lesions by three experienced radiologists in consensus and were correlated with histopathological and intraoperative findings where available (59/168). Results: The sensitivity of SCTAP for the detection of liver tumors was 91% for all lesions and 84% for lesions [de

  8. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Science.gov (United States)

    Vostokin, Sergei; Artamonov, Yuriy; Tsarev, Daniil

    2018-03-01

This article presents the Templet Web cloud service. The service is designed to automate high-performance scientific computing. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, and machine learning. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of "on-demand" access; (b) source code deployment management; (c) automation of high-performance computing program development. The distinctive feature of the service is an approach mainly used in the field of volunteer computing, in which a person who has access to a computer system delegates their access rights to the requesting user. We developed an access procedure, algorithms, and software for utilizing the free computational resources of an academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  9. Correlation of primary middle and distal esophageal cancers motion with surrounding tissues using four-dimensional computed tomography

    Directory of Open Access Journals (Sweden)

    Wang W

    2016-06-01

Full Text Available Wei Wang,1 Jianbin Li,1 Yingjie Zhang,1 Qian Shao,1 Min Xu,1 Bing Guo,1 Dongping Shang2 1Department of Radiation Oncology, 2Department of Big Bore CT Room, Shandong Cancer Hospital Affiliated to Shandong University, Shandong Academy of Medical Sciences, Jinan, Shandong, People’s Republic of China. Purpose: To investigate the correlation of gross tumor volume (GTV) motion with structure of interest (SOI) motion and volume variation for middle and distal esophageal cancers using four-dimensional computed tomography (4DCT). Patients and methods: Thirty-three patients with middle or distal esophageal carcinoma underwent a 4DCT simulation scan during free breathing. All image sets were registered with the 0% phase, and the GTV, apex of the diaphragm, lung, and heart were delineated on each phase of the 4DCT data. The position of the GTV and each SOI was identified in all 4DCT phases, and the volumes of the lung and heart were also obtained. The phase relationship between the GTV and each SOI was estimated using Pearson’s correlation test. Results: The mean peak-to-peak displacement of all primary tumors in the lateral (LR), anteroposterior (AP), and superoinferior (SI) directions was 0.13 cm, 0.20 cm, and 0.30 cm, respectively. The SI peak-to-peak motion of the GTV was the greatest in magnitude. The displacement of the GTV correlated well with the heart in all three dimensions and significantly with the bilateral lung in the LR and SI directions. A significant correlation was found between the GTV and the apex of the diaphragm in the SI direction (rleft = 0.918 and rright = 0.928). A significant inverse correlation was found between GTV motion and varying lung volume (rLR = –0.530, rAP = –0.531, and rSI = –0.588), but the correlation with the heart was not significant during the respiratory cycle. Conclusion: For middle and distal esophageal cancers, asymmetric internal margins should be applied to the GTV. The primary tumor motion correlates quite well with the diaphragm, heart, and lung.
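The phase-by-phase Pearson analysis used in studies like this one can be sketched in a few lines; the displacement series below are synthetic sinusoids standing in for measured 4DCT positions, not the study's data.

```python
import numpy as np

# Hypothetical superoinferior (SI) displacements (cm) of the tumour (GTV)
# and the left diaphragm apex over the ten phases of a 4DCT breathing
# cycle; synthetic sinusoids stand in for measured positions.
phases = np.arange(10)
gtv_si = 0.15 * np.sin(2 * np.pi * phases / 10)        # small tumour excursion
diaphragm_si = 0.90 * np.sin(2 * np.pi * phases / 10)  # larger diaphragm excursion

# Pearson's correlation coefficient between the two phase series.
r = np.corrcoef(gtv_si, diaphragm_si)[0, 1]
print(f"Pearson r (GTV vs diaphragm, SI) = {r:.3f}")
```

Note that Pearson's r measures how tightly the two motions track each other in phase, not how large the excursions are: here the diaphragm moves six times as far as the tumour, yet r is 1.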

  10. Computer network for electric power control systems. Chubu denryoku (kabu) denryoku keito seigyoyo computer network

    Energy Technology Data Exchange (ETDEWEB)

    Tsuneizumi, T. (Chubu Electric Power Co. Inc., Nagoya (Japan)); Shimomura, S.; Miyamura, N. (Fuji Electric Co. Ltd., Tokyo (Japan))

    1992-06-03

A computer network for electric power control systems was developed that applies the Open Systems Interconnection (OSI) model, an international standard for communications protocols. In structuring the OSI network, the session layer was accessed directly from the operation functions when transmitting high-speed, small-capacity information. File transfer, access and control, which can transfer large volumes of data collectively, was applied when transmitting low-speed, large-capacity information. A verification test of the realtime computer network (RCN) mounting regulation was conducted on a verification model using a mini-computer, and the results satisfied practical performance requirements. For the application interface, kernel, health-check and two-route transmission functions were provided as connection control functions, along with a transmission verification function and a late-arrival discarding function. For the system mounting pattern, a dualized communication server (CS) structure was adopted. The hardware structure may either contain the CS function in a host computer or install it separately. 5 figs., 6 tabs.

  11. Projection of Anthropometric Correlation for Virtual Population Modelling

    DEFF Research Database (Denmark)

    Rasmussen, John; Waagepetersen, Rasmus Plenge; Rasmussen, Kasper Pihl

    2018-01-01

    , and therefore the correlations between parameters, are not accessible. This problem is solved by projecting correlation from a data set for which raw data are provided. The method is tested and validated by generation of pseudo females from males in the ANSUR anthropometric dataset. Results show...

  12. What drives individuals to access the internet mostly using a cell phone?

    OpenAIRE

    Yook, Seungyun; Jung, Yumi

    2012-01-01

    Mobile Internet users can access content, applications, and services using their cell phones. Recent PEW Internet research shows that more than half of U.S. cell phone owners have smartphones. Among them, some people have adopted a mobile phone and use it as a major Internet access medium; they may use other devices such as a desktop computer, notebook, netbook, or tablet PC, but those are not included in their Internet access medium repertoire. This paper examines who accesses the Internet m...

  13. Cationic agent contrast-enhanced computed tomography imaging of cartilage correlates with the compressive modulus and coefficient of friction.

    Science.gov (United States)

    Lakin, B A; Grasso, D J; Shah, S S; Stewart, R C; Bansal, P N; Freedman, J D; Grinstaff, M W; Snyder, B D

    2013-01-01

The aim of this study is to evaluate whether contrast-enhanced computed tomography (CECT) attenuation, using a cationic contrast agent (CA4+), correlates with the equilibrium compressive modulus (E) and coefficient of friction (μ) of ex vivo bovine articular cartilage. Correlations between CECT attenuation and E (Group 1, n = 12) and μ (Group 2, n = 10) were determined using 7 mm diameter bovine osteochondral plugs from the stifle joints of six freshly slaughtered, skeletally mature cows. The equilibrium compressive modulus was measured using a four-step, unconfined, compressive stress-relaxation test, and the coefficients of friction were determined from a torsional friction test. Following mechanical testing, samples were immersed in CA4+, imaged using μCT, rinsed, and analyzed for glycosaminoglycan (GAG) content using the 1,9-dimethylmethylene blue (DMMB) assay. The CECT attenuation was positively correlated with the GAG content of bovine cartilage (R² = 0.87) and with the coefficients of friction: CECT vs μ(static) (R² = 0.71, P = 0.002) and CECT vs μ(static_equilibrium) (R² = 0.79). Copyright © 2012 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
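The reported correlations are simple least-squares R² values; the computation can be sketched on synthetic attenuation/modulus pairs (all numbers below are invented, not the study's data).

```python
import numpy as np

# Synthetic CECT attenuation values (HU) and equilibrium compressive
# moduli E (MPa); an ordinary least-squares line and its R^2 illustrate
# the kind of correlation reported. All numbers are invented.
rng = np.random.default_rng(0)
cect = np.linspace(1000.0, 3000.0, 12)              # attenuation per plug
E = 4e-4 * cect + rng.normal(0.0, 0.05, cect.size)  # modulus tracks attenuation

slope, intercept = np.polyfit(cect, E, 1)
pred = slope * cect + intercept
r2 = 1.0 - np.sum((E - pred) ** 2) / np.sum((E - E.mean()) ** 2)
print(f"R^2 = {r2:.2f}")
```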

  14. Controlling Access to Input/Output Peripheral Devices

    Directory of Open Access Journals (Sweden)

    E. Y. Rodionov

    2010-03-01

Full Text Available In this paper the author proposes a system that manages information security policy in the enterprise. Problems related to managing enterprise information security policy and controlling access to peripheral devices in computer systems running Microsoft Windows NT operating systems are considered.

  15. Power plant process computer

    International Nuclear Information System (INIS)

    Koch, R.

    1982-01-01

The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but not closed-loop with respect to closed-loop control. The general scope of tasks is: - alarm annunciation on CRTs - data logging - data recording for post-trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personnel dose supervision and access monitoring. (orig.)

  16. New data access with HTTP/WebDAV in the ATLAS experiment

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Blunier, Sylvain; Lavorini, Vincenzo; Nilsson, Paul

    2015-01-01

    With the exponential growth of LHC (Large Hadron Collider) data in the years 2010-2012, distributed computing has become the established way to analyze collider data. The ATLAS experiment Grid infrastructure includes more than 130 sites worldwide, ranging from large national computing centres to smaller university clusters. So far the storage technologies and access protocols to the clusters that host this tremendous amount of data vary from site to site. HTTP/WebDAV offers the possibility to use a unified industry standard to access the storage. We present the deployment and testing of HTTP/WebDAV for local and remote data access in the ATLAS experiment for the new data management system Rucio and the PanDA workload management system. Deployment and large scale tests have been performed using the Grid testing system HammerCloud and the ROOT HTTP plugin Davix.

  17. New data access with HTTP/WebDAV in the ATLAS experiment

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Blunier, Sylvain; Lavorini, Vincenzo; Nilsson, Paul

    2015-01-01

    With the exponential growth of LHC (Large Hadron Collider) data in the years 2010-2012, distributed computing has become the established way to analyse collider data. The ATLAS experiment Grid infrastructure includes more than 130 sites worldwide, ranging from large national computing centres to smaller university clusters. So far the storage technologies and access protocols to the clusters that host this tremendous amount of data vary from site to site. HTTP/WebDAV offers the possibility to use a unified industry standard to access the storage. We present the deployment and testing of HTTP/WebDAV for local and remote data access in the ATLAS experiment for the new data management system Rucio and the PanDA workload management system. Deployment and large scale tests have been performed using the Grid testing system HammerCloud and the ROOT HTTP plugin Davix.

  18. Computer Security: “Hello World” - Welcome to CERN

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2015-01-01

    Welcome to the open, liberal and free academic computing environment at CERN. Thanks to your new (or long-established!) affiliation with CERN, you are eligible for a CERN computing account, which enables you to register your devices: computers, laptops, smartphones, tablets, etc. It provides you with plenty of disk space and an e-mail address. It allows you to create websites, virtual machines and databases on demand.   You can now access most of the computing services provided by the GS and IT departments: Indico, for organising meetings and conferences; EDMS, for the approval of your engineering specifications; TWiki, for collaboration with others; and the WLCG computing grid. “Open, liberal, and free”, however, does not mean that you can do whatever you like. While we try to make your access to CERN's computing facilities as convenient and easy as possible, there are a few limits and boundaries to respect. These boundaries protect both the Organization'...

  19. Correlative single photon emission computed tomography imaging of [123I]altropane binding in the rat model of Parkinson's

    International Nuclear Information System (INIS)

    Gleave, Jacqueline A.; Farncombe, Troy H.; Saab, Chantal; Doering, Laurie C.

    2011-01-01

Introduction: This study used the dopamine transporter (DAT) probe, [123I]-2β-carbomethoxy-3β-(4-fluorophenyl)-N-(3-iodo-E-allyl)nortropane ([123I]altropane), to assess DAT levels in the 6-hydroxydopamine rat model of Parkinson's disease. We sought to assess whether the right-to-left [123I]altropane striatal ratios correlated with dopamine content in the striatum and substantia nigra and with behavioural outcomes. Methods: [123I]altropane images taken pre- and postlesion were acquired before and after the transplantation of neural stem/progenitor cells. The images obtained using [123I]altropane and single photon emission computed tomography (SPECT) were compared with specific behavioural tests and with the dopamine content assessed by high-performance liquid chromatography. Results: [123I]altropane binding correlated with the dopamine content in the striatum; however, it did not correlate with the dopamine content in the substantia nigra. There was a significant correlation of altropane ratios with the cylinder test and the postural instability test, but not with amphetamine rotations. The low coefficient of determination (r²) for these correlations indicated that [123I]altropane SPECT was not a good predictor of behavioural outcomes. Conclusion: Our data reveal that [123I]altropane predicts the integrity of the striatal dopamine nerve terminals, but does not predict the integrity of the nigrostriatal system. [123I]altropane could be a useful marker to measure dopamine content in cell replacement therapies; however, it would not be able to evaluate outcomes for neuroprotective strategies.

  20. Computer Ethics Topics and Teaching Strategies.

    Science.gov (United States)

    DeLay, Jeanine A.

    An overview of six major issues in computer ethics is provided in this paper: (1) unauthorized and illegal database entry, surveillance and monitoring, and privacy issues; (2) piracy and intellectual property theft; (3) equity and equal access; (4) philosophical implications of artificial intelligence and computer rights; (5) social consequences…

  1. Quantum Correlations in Nonlocal Boson Sampling.

    Science.gov (United States)

    Shahandeh, Farid; Lund, Austin P; Ralph, Timothy C

    2017-09-22

Determination of the quantum nature of correlations between two spatially separated systems plays a crucial role in quantum information science. Of particular interest is the question of whether and how these correlations enable quantum information protocols to be more powerful. Here, we report on a distributed quantum computation protocol in which the input and output quantum states are considered to be classically correlated in quantum informatics. Nevertheless, we show that the correlations between the outcomes of the measurements on the output state cannot be efficiently simulated using classical algorithms. Crucially, at the same time, local measurement outcomes can be efficiently simulated on classical computers. We show that the only known classicality criterion violated by the input and output states in our protocol is the one used in quantum optics, namely, phase-space nonclassicality. As a result, we argue that the global phase-space nonclassicality inherent within the output state of our protocol represents true quantum correlations.

  2. elevatr: Access Elevation Data from Various APIs | Science ...

    Science.gov (United States)

Several web services are available that provide access to elevation data. This package provides access to several of those services and returns elevation data either as a SpatialPointsDataFrame from point elevation services or as a raster object from raster elevation services. Currently, the package supports access to the Mapzen Elevation Service, Mapzen Terrain Service, and the USGS Elevation Point Query Service. The R language for statistical computing is increasingly used for spatial data analysis. This R package, `elevatr`, responds to this trend by providing access to elevation data from various sources directly in R. The impact of `elevatr` is that it will 1) facilitate spatial analysis in R by providing access to a foundational dataset for many types of analyses (e.g. hydrology, limnology), 2) open up a new set of users and uses for APIs widely used outside of R, and 3) provide an excellent example of federal open source development as promoted by the Federal Source Code Policy (https://sourcecode.cio.gov/).

  3. Image storage, cataloguing and retrieval using a personal computer database software application

    International Nuclear Information System (INIS)

    Lewis, G.; Howman-Giles, R.

    1999-01-01

Full text: Interesting images and cases are collected and collated by most nuclear medicine practitioners throughout the world. Changing imaging technology has altered the way in which images are presented and reported, with less reliance on 'hard copy' for both reporting and archiving purposes. Digital image generation and storage are rapidly replacing film in both radiological and nuclear medicine practice. An interesting-case filing system based on personal computer database software is described and demonstrated. The digital image storage format allows instant access to both case information (e.g. history and examination, scan report or teaching point) and the relevant images. The database design allows rapid selection of cases and images appropriate to a particular diagnosis, scan type, age or other search criteria. Correlative X-ray, CT, MRI and ultrasound images can also be stored and accessed. The application is in use at The New Children's Hospital as an aid to postgraduate medical education, with new cases being regularly added to the database.

  4. Computer skills and computer anxiety as predictors of internet use ...

    African Journals Online (AJOL)

    The study investigated the extent to which computer skills and computer anxiety predict Internet use among distance learning students in University of Ibadan, Nigeria. The descriptive method of correlative type was used for the study and the sample comprised of one hundred and thirty four (134) distance learning students ...

  5. Electronic Information Access and Utilization by Makerere University Students in Uganda

    Directory of Open Access Journals (Sweden)

    Elisam Magara

    2008-09-01

Full Text Available Objectives – The objectives of this study were to establish the level of computer utilization skills of Makerere University (Uganda) Library and Information Science (LIS) students; to determine the use of electronic information resources by LIS students; to determine the attitudes of LIS students towards electronic information resources; and to establish the problems faced by LIS students in accessing electronic information resources. Methods – A questionnaire survey was used for data collection. Results – The majority of Library and Information Science students at Makerere University depend on university computers for their work, and very few of them access the library’s e-resources. The few who access e-resources are self-taught. The majority of students surveyed were unaware of the Emerald and EBSCO databases relevant to Library and Information Science students, and they found accessing e-resources time-consuming. Conclusion – The study concluded that a concerted effort is needed by both LIS lecturers and university librarians in promoting use of the library’s electronic resources.

  6. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  7. Comparison of apical centring ability between incisal-shifted access and traditional lingual access for maxillary anterior teeth.

    Science.gov (United States)

    Yahata, Yoshio; Masuda, Yoshiko; Komabayashi, Takashi

    2017-12-01

    The aim of this study was to compare the apical centring ability of incisal-shifted access (ISA) with that of traditional lingual access (TLA). Fifteen three-dimensional printed resin models were prepared from the computed tomography data for a human maxillary central incisor and divided into ISA (n = 7), TLA (n = 7) and control (n = 1) groups. After access preparation, these models were shaped to the working length using K-files up to #40, followed by step-back procedures. An apical portion of the model was removed at 0.5 mm coronal to the working length. Microscopic images of each cutting surface were taken to measure the preparation area and the distance of transportation. TLA created a larger preparation area than ISA (P < 0.05). The distance of transportation (mean ± standard deviation) was 0.4 ± 0.1 mm for ISA and 0.7 ± 0.1 mm for TLA (P < 0.05). Access cavity preparation has a significant effect on apical centring ability. ISA is beneficial to maintaining apical configuration. © 2017 Australian Society of Endodontology Inc.

  8. Intermittency analysis of correlated data

    International Nuclear Information System (INIS)

    Wosiek, B.

    1992-01-01

We describe a method for analysing the dependence of factorial moments on bin size in which the correlations between moments computed for different bin sizes are taken into account. For large-multiplicity nucleus-nucleus data, including the correlations does not change the value of the slope parameter, but yields errors significantly smaller than those obtained from fits that ignore the correlations. (author)
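Accounting for correlations between moments at different bin sizes amounts to a generalized least-squares fit with a full covariance matrix rather than a diagonal one. A minimal sketch with toy numbers follows; the AR(1)-style covariance model and all values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Generalised least-squares fit of ln F_q against -ln(bin size) when the
# factorial moments at different bin sizes are correlated. The covariance
# matrix C below is an assumed AR(1)-style toy model.
def gls_fit(x, y, C):
    """Minimise (y - Xb)^T C^{-1} (y - Xb); return slope and its std error."""
    X = np.column_stack([np.ones_like(x), x])
    Ci = np.linalg.inv(C)
    cov_params = np.linalg.inv(X.T @ Ci @ X)
    intercept, slope = cov_params @ (X.T @ Ci @ y)
    return slope, np.sqrt(cov_params[1, 1])

bins = np.array([1.0, 0.5, 0.25, 0.125])               # shrinking bin sizes
x = -np.log(bins)
y = 0.02 * x + np.array([0.001, -0.001, 0.0, 0.001])   # toy ln F_2 values
idx = np.arange(bins.size)
C = 1e-4 * 0.6 ** np.abs(np.subtract.outer(idx, idx))  # correlated moment errors
slope, err = gls_fit(x, y, C)
print(f"intermittency slope = {slope:.4f} +/- {err:.4f}")
```

Setting the off-diagonal entries of C to zero recovers an ordinary weighted fit; with the correlations included, the slope is essentially unchanged but its quoted error shrinks, which is the effect the abstract describes.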

  9. Archives: Journal of Computer Science and Its Application

    African Journals Online (AJOL)


  10. Quantitative computed tomography analysis of the airways in patients with cystic fibrosis using automated software: correlation with spirometry in the evaluation of severity

    International Nuclear Information System (INIS)

    Santos, Marcel Koenigkam; Cruvinel, Danilo Lemos; Menezes, Marcelo Bezerra de; Teixeira, Sara Reis; Vianna, Elcio de Oliveira; Elias Junior, Jorge; Martinez, Jose Antonio Baddini

    2016-01-01

Objective: To perform a quantitative analysis of the airways using automated software, in computed tomography images of patients with cystic fibrosis, correlating the results with spirometric findings. Materials and methods: Thirty-four patients with cystic fibrosis were studied (20 males and 14 females; mean age 18 ± 9 years), divided into two groups according to the spirometry findings: group I (n = 21), without severe airflow obstruction (forced expiratory volume in the first second [FEV1] > 50% of predicted), and group II (n = 13), with severe obstruction (FEV1 ≤ 50% of predicted). The following tracheobronchial tree parameters were obtained automatically: bronchial diameter, area, thickness, and wall attenuation. Results: On average, 52 bronchi per patient were studied. The number of bronchi analyzed was higher in group II. The correlation with spirometry findings, especially between the relative wall thickness of the third to eighth bronchial generations and the predicted FEV1, was better in group I. Conclusion: Quantitative analysis of the airways by computed tomography can be useful for assessing disease severity in cystic fibrosis patients. In patients with severe airflow obstruction, the number of bronchi studied by the method is higher, indicating more bronchiectasis. In patients without severe obstruction, the relative bronchial wall thickness showed a good correlation with the predicted FEV1. (author)

  11. Quantitative computed tomography analysis of the airways in patients with cystic fibrosis using automated software: correlation with spirometry in the evaluation of severity

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Marcel Koenigkam; Cruvinel, Danilo Lemos; Menezes, Marcelo Bezerra de; Teixeira, Sara Reis; Vianna, Elcio de Oliveira; Elias Junior, Jorge; Martinez, Jose Antonio Baddini, E-mail: marcelk46@yahoo.com.br [Universidade de Sao Paulo (HC/FMRP/USP), Ribeirao Preto, SP (Brazil). Faculdade de Medicina

    2016-11-15

Objective: To perform a quantitative analysis of the airways using automated software, in computed tomography images of patients with cystic fibrosis, correlating the results with spirometric findings. Materials and methods: Thirty-four patients with cystic fibrosis were studied (20 males and 14 females; mean age 18 ± 9 years), divided into two groups according to the spirometry findings: group I (n = 21), without severe airflow obstruction (forced expiratory volume in the first second [FEV1] > 50% of predicted), and group II (n = 13), with severe obstruction (FEV1 ≤ 50% of predicted). The following tracheobronchial tree parameters were obtained automatically: bronchial diameter, area, thickness, and wall attenuation. Results: On average, 52 bronchi per patient were studied. The number of bronchi analyzed was higher in group II. The correlation with spirometry findings, especially between the relative wall thickness of the third to eighth bronchial generations and the predicted FEV1, was better in group I. Conclusion: Quantitative analysis of the airways by computed tomography can be useful for assessing disease severity in cystic fibrosis patients. In patients with severe airflow obstruction, the number of bronchi studied by the method is higher, indicating more bronchiectasis. In patients without severe obstruction, the relative bronchial wall thickness showed a good correlation with the predicted FEV1. (author)

  12. Hyperacute stroke patients and catheter thrombolysis therapy. Correlation between computed tomography perfusion maps and final infarction

    International Nuclear Information System (INIS)

    Naito, Yukari; Tanaka, Shigeko; Inoue, Yuichi; Ota, Shinsuke; Sakaki, Saburo; Kitagaki, Hajime

    2008-01-01

    We investigated the correlation between abnormal perfusion areas by computed tomography perfusion (CTP) study of hyperacute stroke patients and the final infarction areas after intraarterial catheter thrombolysis. CTP study using the box-modulation transfer function (box-MTF) method based on the deconvolution analysis method was performed in 22 hyperacute stroke patients. Ischemic lesions were immediately treated with catheter thrombolysis after CTP study. Among them, nine patients with middle cerebral artery (MCA) occlusion were investigated regarding correlations of the size of the prolonged mean transit time (MTT) area, the decreased cerebral blood volume (CBV) area, and the final infarction area. Using the box-MTF method, the prolonged MTT area was almost identical to the final infarction area in the case of catheter thrombolysis failure. The decreased CBV areas resulted in infarction or hemorrhage, irrespective of the outcome of recanalization after catheter thrombolysis. The prolonged MTT areas, detected by the box-MTF method of CTP in hyperacute stroke patients, included the area of true prolonged MTT and the tracer delay. The prolonged MTT area was almost identical to the final infarction area when recanalization failed. We believe that a tracer delay area also indicates infarction in cases of thrombolysis failure. (author)
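Deconvolution-based CTP analysis of the kind described here can be sketched as follows: build a causal convolution matrix from the arterial input function (AIF), invert it by truncated SVD to recover the flow-scaled residue function, then take MTT = CBV/CBF. All curves and thresholds below are synthetic illustrations, not the clinical box-MTF implementation.

```python
import numpy as np

# Synthetic sketch of deconvolution-based CT perfusion: given an arterial
# input function (AIF) and a tissue time-density curve, recover the
# flow-scaled residue function k(t) = CBF * R(t) by truncated-SVD
# deconvolution, then MTT = CBV / CBF. Curves and cutoffs are invented.
dt = 1.0
t = np.arange(dt, 60.0 + dt, dt)
aif = (t / 4.0) ** 2 * np.exp(-t / 4.0)   # gamma-variate arterial input
residue = np.exp(-t / 6.0)                # true residue function (MTT ~ 6 s)
cbf_true = 0.01

# Causal (lower-triangular) convolution matrix: tissue = A @ (CBF * R).
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(t.size)] for i in range(t.size)])
tissue = A @ (cbf_true * residue)

# Truncated-SVD pseudo-inverse; the cutoff would be raised for noisy data.
U, s, Vt = np.linalg.svd(A)
s_inv = np.where(s > 1e-6 * s.max(), 1.0 / s, 0.0)
k = Vt.T @ (s_inv * (U.T @ tissue))       # estimate of CBF * R(t)

cbf_est = k.max()
mtt_est = (k.sum() * dt) / cbf_est        # MTT = CBV / CBF
print(f"estimated MTT ~ {mtt_est:.1f} s")
```

In a real stroke study, a delayed AIF shifts the apparent residue function and inflates MTT in exactly the "tracer delay" regions the abstract discusses.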

  13. Coordination processes in computer supported collaborative writing

    NARCIS (Netherlands)

    Kanselaar, G.; Erkens, Gijsbert; Jaspers, Jos; Prangsma, M.E.

    2005-01-01

    In the COSAR-project a computer-supported collaborative learning environment enables students to collaborate in writing an argumentative essay. The TC3 groupware environment (TC3: Text Composer, Computer supported and Collaborative) offers access to relevant information sources, a private notepad, a

  14. REACHING THE COMPUTING HELP DESK

    CERN Multimedia

    Miguel Marquina

    2000-01-01

    You may find it useful to glue the information below, e.g. near/at your computer, for those occasions when access to computer services is not possible. It presents the way to contact the Computing Help Desk (hosted by IT Division as an entry point for general computing issues). Do not hesitate to contact us (by email to User.Relations@cern.ch) for additional information or feedback regarding this matter.
    Your contact for general computing problems or queries:
    Phone number: (+41 22 76) 78888
    Opening hours: Monday to Friday, 8:30-17:30
    Email: Helpdesk@cern.ch
    Web: http://consult.cern.ch/service/helpdesk
    Miguel Marquina, IT Division/User Support

  15. Fluid Dynamics Theory, Computation, and Numerical Simulation

    CERN Document Server

    Pozrikidis, Constantine

    2009-01-01

    Fluid Dynamics: Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are: solution procedures and algorithms are developed immediately after problem formulations are presented, and numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for ...

  16. Correlation of radiation dose and heart rate in dual-source computed tomography coronary angiography.

    Science.gov (United States)

    Laspas, Fotios; Tsantioti, Dimitra; Roussakis, Arkadios; Kritikos, Nikolaos; Efthimiadou, Roxani; Kehagias, Dimitrios; Andreou, John

    2011-04-01

    Computed tomography coronary angiography (CTCA) has been widely used since the introduction of 64-slice scanners and dual-source CT technology, but the relatively high radiation dose remains a major concern. To evaluate the relationship between radiation exposure and heart rate (HR) in dual-source CTCA, data from 218 CTCA examinations, performed with a dual-source 64-slice scanner, were statistically evaluated. Effective radiation dose, expressed in mSv, was calculated as the product of the dose-length product (DLP) and a conversion coefficient for the chest (mSv = DLP × 0.017). The heart rate range and mean heart rate of each individual during CTCA, expressed in beats per minute (bpm), were also provided by the system. Statistical analysis of effective dose and heart rate data was performed using the Pearson correlation coefficient and the two-sample t-test. Mean HR and effective dose were found to have a borderline positive relationship. Individuals with a mean HR >65 bpm were observed to receive a statistically significantly higher effective dose than those with a mean HR ≤65 bpm. Moreover, a strong correlation between effective dose and HR variability of more than 20 bpm was observed. Dual-source CT scanners are considered capable of providing diagnostic examinations even with high HR and arrhythmias. However, it is desirable to keep the mean heart rate below 65 bpm and heart rate fluctuation below 20 bpm in order to reduce radiation exposure.
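The dose calculation described in the abstract is a single multiplication of the scanner-reported DLP by a chest conversion coefficient. A minimal sketch (the 0.017 coefficient is the value quoted in the abstract; the example DLP value is invented):

```python
# Effective dose from dose-length product (DLP), as described in the abstract:
# E [mSv] = DLP [mGy*cm] * k, with k = 0.017 mSv/(mGy*cm) for the chest.

CHEST_CONVERSION = 0.017  # mSv per mGy*cm (value given in the abstract)

def effective_dose(dlp_mgy_cm: float) -> float:
    """Convert a scanner-reported DLP into an effective dose in mSv."""
    return dlp_mgy_cm * CHEST_CONVERSION

# Example: a hypothetical CTCA scan with a DLP of 800 mGy*cm
dose = effective_dose(800)
print(f"{dose:.1f} mSv")  # -> 13.6 mSv
```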

  17. Mathematical structures for computer graphics

    CERN Document Server

    Janke, Steven J

    2014-01-01

    A comprehensive exploration of the mathematics behind the modeling and rendering of computer graphics scenes Mathematical Structures for Computer Graphics presents an accessible and intuitive approach to the mathematical ideas and techniques necessary for two- and three-dimensional computer graphics. Focusing on the significant mathematical results, the book establishes key algorithms used to build complex graphics scenes. Written for readers with various levels of mathematical background, the book develops a solid foundation for graphics techniques and fills in relevant grap

  18. Effects of the loss of correlation structure on Phase 1 dose estimates

    International Nuclear Information System (INIS)

    Simpson, J.C.

    1991-11-01

    In Phase I of the Hanford Environmental Dose Reconstruction Project, a step-by-step (modular) calculational structure was used. This structure was intended (1) to simplify the computational process, (2) to allow storage of intermediate calculations for later analyses, and (3) to guide the collection of data by presenting understandable structures for its use. The implementation of this modular structure resulted in the loss of correlation among inputs and outputs of the code, resulting in less accurate dose estimates than anticipated. The study documented in this report investigated two types of correlations in the Phase I model: temporal and spatial. Temporal correlations occur in the simulation when, in the calculation, data estimated for a previous time are used in a subsequent calculation. If the various portions of the calculation do not use the same realization of the earlier estimate, they are no longer correlated with respect to time. Similarly, spatial correlations occur in a simulation when, in the calculation, data estimated for a particular location are used in estimates for other locations. If the various calculations do not use the same value for the original location, they are no longer correlated with respect to location. The loss of the correlation structure in the Phase I code resulted in dose estimates that are biased. It is recommended that the air pathway dose model be restructured and the intermediate histograms eliminated. While the restructured code may still contain distinct modules, all input parameters to each module and all output from each module should be retained in a database such that subsequent modules can access all the information necessary to retain the correlation structure.

  19. Access control and confidentiality in radiology

    Science.gov (United States)

    Noumeir, Rita; Chafik, Adil

    2005-04-01

    A medical record contains a large amount of data about the patient such as height, weight and blood pressure. It also contains sensitive information such as fertility, abortion, psychiatric data, sexually transmitted diseases and diagnostic results. Access to this information must be carefully controlled. Information technology has greatly improved patient care. The recent extensive deployment of digital medical images made diagnostic images promptly available to healthcare decision makers, regardless of their geographic location. Medical images are digitally archived, transferred on telecommunication networks, and visualized on computer screens. However, with the widespread use of computing and communication technologies in healthcare, the issue of data security has become increasingly important. Most of the work until now has focused on the security of data communication to ensure its integrity, authentication, confidentiality and user accountability. The mechanisms that have been proposed to achieve the security of data communication are not specific to healthcare. Data integrity can be achieved with data signature. Data authentication can be achieved with certificate exchange. Data confidentiality can be achieved with encryption. User accountability can be achieved with audits. Although these mechanisms are essential to ensure data security during its transfer on the network, access control is needed in order to ensure data confidentiality and privacy within the information system application. In this paper, we present and discuss an access control mechanism that takes into account the notion of a care process. Radiology information is categorized and a model to enforce data privacy is proposed.

  20. Correlation model to analyze dependent failures for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Dezfuli, H.

    1985-01-01

    A methodology is formulated to study the dependent (correlated) failures of various abnormal events in nuclear power plants. This methodology uses correlation analysis as a means for predicting and quantifying dependent failures. Appropriate techniques are also developed to incorporate dependent failures in quantifying fault trees and accident sequences. The uncertainty associated with each estimation in all of the developed techniques is addressed and quantified. To identify the relative importance of the degree of dependency (correlation) among events and to incorporate these dependencies in the quantification phase of PRA, the interdependency between a pair of events is expressed with the aid of the correlation coefficient. For the purpose of demonstrating the methodology, the database used in the Accident Sequence Precursor (ASP) Study was adopted and simulated to obtain distributions for the correlation coefficients. A computer program entitled Correlation Coefficient Generator (CCG) was developed to generate a distribution for each correlation coefficient. The bootstrap technique was employed in the CCG computer code to determine confidence limits of the estimated correlation coefficients. A second computer program, designated CORRELATE, was also developed to obtain probability intervals for both fault trees and accident sequences with statistically correlated failure data.
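The CCG program's approach of using the bootstrap to place confidence limits on an estimated correlation coefficient can be sketched generically. This is a standard percentile-bootstrap illustration, not the CCG code itself, and the paired failure-rate observations below are invented:

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def bootstrap_corr_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence limits for the correlation of (x, y)."""
    rng = random.Random(seed)
    n = len(x)
    rs = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]   # resample pairs with replacement
        xs, ys = [x[i] for i in idx], [y[i] for i in idx]
        try:
            rs.append(pearson(xs, ys))
        except ZeroDivisionError:                    # degenerate resample (constant values)
            continue
    rs.sort()
    lo = rs[int(alpha / 2 * len(rs))]
    hi = rs[int((1 - alpha / 2) * len(rs)) - 1]
    return lo, hi

# Hypothetical paired failure-rate observations for two dependent events:
x = [0.12, 0.15, 0.10, 0.22, 0.18, 0.25, 0.09, 0.17, 0.21, 0.14]
y = [0.10, 0.16, 0.09, 0.24, 0.17, 0.23, 0.11, 0.16, 0.20, 0.15]
lo, hi = bootstrap_corr_ci(x, y)
```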

  1. Experimental quantum computing without entanglement.

    Science.gov (United States)

    Lanyon, B P; Barbieri, M; Almeida, M P; White, A G

    2008-11-14

    Deterministic quantum computation with one pure qubit (DQC1) is an efficient model of computation that uses highly mixed states. Unlike pure-state models, its power is not derived from the generation of a large amount of entanglement. Instead it has been proposed that other nonclassical correlations are responsible for the computational speedup, and that these can be captured by the quantum discord. In this Letter we implement DQC1 in an all-optical architecture, and experimentally observe the generated correlations. We find no entanglement, but large amounts of quantum discord, except in three cases where an efficient classical simulation is always possible. Our results show that even fully separable, highly mixed states can contain intrinsically quantum mechanical correlations and that these could offer a valuable resource for quantum information technologies.

  2. Concurrent use of data base and graphics computer workstations to provide graphic access to large, complex data bases for robotics control of nuclear surveillance and maintenance

    International Nuclear Information System (INIS)

    Dalton, G.R.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida is part of a multiuniversity research effort, sponsored by the US Department of Energy, which is under way to develop and deploy an advanced semi-autonomous robotic system for use in nuclear power stations. This paper reports on the development of the computer tools necessary to gain convenient graphic access to the intelligence implicit in a large, complex data base such as that in a nuclear reactor plant. This program is integrated as a man/machine interface within the larger context of the total computerized robotic planning and control system. The portion of the project described here addresses the connection between the three-dimensional displays on an interactive graphic workstation and a data-base computer running a large data-base server program. Programming the two computers to work together to accept graphic queries and return answers on the graphic workstation is a key part of the interactive capability developed.

  3. Geometric Algebra Computing

    CERN Document Server

    Corrochano, Eduardo Bayro

    2010-01-01

    This book presents contributions from a global selection of experts in the field. This useful text offers new insights and solutions for the development of theorems, algorithms and advanced methods for real-time applications across a range of disciplines. Written in an accessible style, the discussion of all applications is enhanced by the inclusion of numerous examples, figures and experimental analysis. Features: provides a thorough discussion of several tasks for image processing, pattern recognition, computer vision, robotics and computer graphics using the geometric algebra framework; int

  4. Templet Web: the use of volunteer computing approach in PaaS-style cloud

    Directory of Open Access Journals (Sweden)

    Vostokin Sergei

    2018-03-01

    Full Text Available This article presents the Templet Web cloud service. The service is designed for high-performance scientific computing automation. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives to achieve this cost reduction in the Templet Web service design are: (a) the implementation of “on-demand” access; (b) source code deployment management; (c) automation of high-performance computing program development. The distinctive feature of the service is the approach mainly used in the field of volunteer computing, when a person who has access to a computer system delegates his access rights to the requesting user. We developed an access procedure, algorithms, and software for utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.

  5. Preserving access to ALEPH computing environment via virtual machines

    International Nuclear Information System (INIS)

    Coscetti, Simone; Boccali, Tommaso; Arezzini, Silvia; Maggi, Marcello

    2014-01-01

    The ALEPH Collaboration [1] took data at the LEP (CERN) electron-positron collider in the period 1989-2000, producing more than 300 scientific papers. While most of the Collaboration's activities stopped in recent years, the data collected still have physics potential, with new theoretical models emerging that call for checks against data at the Z and WW production energies. An attempt to revive and preserve the ALEPH Computing Environment is presented; the aim is not only the preservation of the data files (usually called bit preservation), but of the full environment a physicist would need to perform brand new analyses. Technically, a Virtual Machine approach has been chosen, using the VirtualBox platform. Concerning simulated events, the full chain from event generators to physics plots is possible, and reprocessing of data events is also functioning. Interactive tools like the DALI event display can be used on both data and simulated events. The Virtual Machine approach is suited both for interactive usage and for massive computing using Cloud-like approaches.

  6. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in Cloud Computing is becoming a challenge for next generation Data Centers. This project will focus on investigating new security strategies for Cloud Computing systems. Cloud Computingisarecent paradigmto deliver services over Internet. Businesses grow drastically because of it. Researchers focus their work on it. The rapid access to exible and low cost IT resources on an on-demand fashion, allows the users to avoid planning ahead for provisioning, and enterprises to save money ...

  7. Access to the Arts through Assistive Technology.

    Science.gov (United States)

    Frame, Charles

    Personnel in the rehabilitation field have come to recognize the possibilities and implications of computers as assistive technology for disabled persons. This manual provides information on how to adapt the Unicorn Board, Touch Talker/Light Talker overlays, the Adaptive Firmware Card setup disk, and Trace-Transparent Access Module (T-TAM) to…

  8. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
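The estimation the abstract describes, predicting ray trace speed from LINPACK performance via the measured correlation, amounts to a simple regression. A minimal least-squares sketch with invented benchmark pairs (the paper's actual data and fitted coefficients are not reproduced here):

```python
def fit_line(x, y):
    """Ordinary least squares fit of y ~ a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical (LINPACK MFLOPS, measured rays/second) pairs for four machines:
linpack = [1.0, 4.0, 12.0, 25.0]
rays    = [150, 610, 1800, 3700]

a, b = fit_line(linpack, rays)
estimate = a + b * 8.0   # predicted ray trace speed for an 8-MFLOPS machine
```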

  9. Early Lung Adenocarcinoma in Mice: Micro-Computed Tomography Manifestations and Correlation with Pathology

    Directory of Open Access Journals (Sweden)

    Lin Deng

    2017-06-01

    Full Text Available Lung cancer is the most common fatal malignancy for both men and women, and adenocarcinoma is the most common histologic type. Early diagnosis of lung cancer can significantly improve the survival rate of patients. This study aimed to investigate the micro-computed tomography (micro-CT) manifestations of early lung adenocarcinoma (LAC) in mice and to provide a new perspective for early clinical diagnosis. Early LAC models in 10 mice were established by subcutaneously injecting 1-methyl-3-nitro-1-nitrosoguanidine (MNNG) solution. Micro-CT scanning and multiplanar reconstruction (MPR) were used for the mouse lungs. Micro-CT features of early LAC, especially the relationships between tumor and bronchus, were analyzed and correlated with pathology. Micro-CT findings of early LAC were divided into three types: non-solid (n = 8, 6%), partly solid (n = 85, 64%), and totally solid (n = 39, 30%). Tumor-bronchus relationships, which could be observed in 110 of 132 (83%) LACs, were classified into four patterns: type I (n = 16, 15%), bronchus truncated at the margin of the tumor; type II (n = 33, 30%), bronchus penetrating into the tumor with tapered narrowing and interruption; type III (n = 38, 35%), bronchus penetrating into the tumor with a patent and intact lumen; type IV (n = 99, 90%), bronchus running at the border of the tumor with an intact or compressed lumen. Micro-CT manifestations of early LAC correlated well with pathological findings. Micro-CT can clearly demonstrate the features of mouse early LAC and bronchus-tumor relationships, and can also provide a new tool and perspective for the study of early LAC.

  10. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  11. Computed tomography (CT) findings in 88 neurofibromatosis 1 (NF1) patients: Prevalence rates and correlations of thoracic findings

    International Nuclear Information System (INIS)

    Ueda, Ken; Honda, Osamu; Satoh, Yukihisa; Kawai, Misa; Gyobu, Tomoko; Kanazawa, Toru; Hidaka, Shojiro; Yanagawa, Masahiro; Sumikawa, Hiromitsu; Tomiyama, Noriyuki

    2015-01-01

    Highlights: • Various thoracic CT findings, including cysts, mediastinal masses, etc. were found. • Cysts show upper and peripheral dominant distribution. • The number, size, and distribution of the pulmonary cysts in NF-1 revealed significant correlation. • It is suspected that thoracic CT findings in NF-1 occur independently. - Abstract: Purpose: To evaluate the prevalence rates and the correlations of thoracic computed tomography (CT) findings of neurofibromatosis 1 (NF1) in 88 patients. Materials and methods: Chest CT images of 88 NF1 patients were independently reviewed by three observers, and the CT findings were evaluated. If abnormal findings were present, their number, size, and distribution were recorded. The prevalence rate of each CT finding was calculated, and the correlations between CT findings were analyzed. Results: Of the 88 cases, 13 were positive for cysts, 16 for emphysema, 8 for nodules, 8 for GGNs (ground glass nodules), 13 for mediastinal masses, 20 for scoliosis, 44 for subcutaneous nodules, and 34 for skin nodules. Cysts showed upper and peripheral dominant distributions. Regarding 13 mediastinal masses, 2 were diagnosed as malignant peripheral nerve sheath tumors (MPNSTs), 1 was diagnosed as primary lung cancer, 2 were diagnosed as lateral meningocele, 3 were diagnosed as neurofibromas, and the remaining 7 were considered neurofibromas. There was a significant correlation between the prevalence of subcutaneous nodules and that of skin nodules. Significant positive correlations were also seen between size and number, size and rate of central distribution, and number and rate of central distribution of cysts. Conclusion: Various CT findings were found in NF-1 patients, and the prevalence rates of subcutaneous and skin nodules were higher than other findings. Though the prevalence rates of subcutaneous nodules and skin nodules were significantly correlated, the other CT findings in NF-1 occurred independently. The number, size, and

  12. Computed tomography (CT) findings in 88 neurofibromatosis 1 (NF1) patients: Prevalence rates and correlations of thoracic findings

    Energy Technology Data Exchange (ETDEWEB)

    Ueda, Ken, E-mail: k-ueda@radiol.med.osaka-u.ac.jp [Department of Radiology, Osaka University Graduate School of Medicine (Japan); Honda, Osamu [Department of Radiology, Osaka University Graduate School of Medicine (Japan); Satoh, Yukihisa [Department of Diagnostic Radiology, Osaka Medical Center for Cancer and Cardiovascular Diseases (Japan); Kawai, Misa; Gyobu, Tomoko; Kanazawa, Toru; Hidaka, Shojiro; Yanagawa, Masahiro [Department of Radiology, Osaka University Graduate School of Medicine (Japan); Sumikawa, Hiromitsu [Department of Diagnostic Radiology, Osaka Rosai Hospital (Japan); Tomiyama, Noriyuki [Department of Radiology, Osaka University Graduate School of Medicine (Japan)

    2015-06-15

    Highlights: • Various thoracic CT findings, including cysts, mediastinal masses, etc. were found. • Cysts show upper and peripheral dominant distribution. • The number, size, and distribution of the pulmonary cysts in NF-1 revealed significant correlation. • It is suspected that thoracic CT findings in NF-1 occur independently. - Abstract: Purpose: To evaluate the prevalence rates and the correlations of thoracic computed tomography (CT) findings of neurofibromatosis 1 (NF1) in 88 patients. Materials and methods: Chest CT images of 88 NF1 patients were independently reviewed by three observers, and the CT findings were evaluated. If abnormal findings were present, their number, size, and distribution were recorded. The prevalence rate of each CT finding was calculated, and the correlations between CT findings were analyzed. Results: Of the 88 cases, 13 were positive for cysts, 16 for emphysema, 8 for nodules, 8 for GGNs (ground glass nodules), 13 for mediastinal masses, 20 for scoliosis, 44 for subcutaneous nodules, and 34 for skin nodules. Cysts showed upper and peripheral dominant distributions. Regarding 13 mediastinal masses, 2 were diagnosed as malignant peripheral nerve sheath tumors (MPNSTs), 1 was diagnosed as primary lung cancer, 2 were diagnosed as lateral meningocele, 3 were diagnosed as neurofibromas, and the remaining 7 were considered neurofibromas. There was a significant correlation between the prevalence of subcutaneous nodules and that of skin nodules. Significant positive correlations were also seen between size and number, size and rate of central distribution, and number and rate of central distribution of cysts. Conclusion: Various CT findings were found in NF-1 patients, and the prevalence rates of subcutaneous and skin nodules were higher than other findings. Though the prevalence rates of subcutaneous nodules and skin nodules were significantly correlated, the other CT findings in NF-1 occurred independently. The number, size, and

  13. Cross-Cultural adaptation of an instrument to computer accessibility evaluation for students with cerebral palsy

    Directory of Open Access Journals (Sweden)

    Gerusa Ferreira Lourenço

    2015-03-01

    Full Text Available The specific literature indicates that the successful education of children with cerebral palsy may require the implementation of appropriate assistive technology resources, allowing students to improve their performance and complete everyday tasks more efficiently and independently. To this end, these resources must be selected properly, emphasizing the importance of an appropriate initial assessment of the child and the possibilities of the resources available. The present study aimed to translate and theoretically adapt an American instrument that evaluates computer accessibility for people with cerebral palsy, in order to contextualize it for applicability to Brazilian students with cerebral palsy. The methodology involved the steps of translation and cross-cultural adaptation of this instrument, as well as the construction of a supplementary script for additional use of the instrument in the educational context. Translation procedures, theoretical and technical adaptation of the American instrument, and theoretical analysis (content and semantics) were carried out with the participation of professional experts of the special education area as adjudicators. The results pointed to the relevance of the proposal of the translated instrument in conjunction with the script built to the reality of professionals involved with the education of children with cerebral palsy, such as occupational therapists and special educators.

  14. Text accessibility by people with reduced contrast sensitivity.

    Science.gov (United States)

    Crossland, Michael D; Rubin, Gary S

    2012-09-01

    Contrast sensitivity is reduced in people with eye disease, and also in older adults without eye disease. In this article, we compare contrast of text presented in print and digital formats with contrast sensitivity values for a large cohort of subjects in a population-based study of older adults (the Salisbury Eye Evaluation). Contrast sensitivity values were recorded for 2520 adults aged 65 to 84 years living in Salisbury, Maryland. The proportion of the sample likely to be unable to read text of different formats (electronic books, newsprint, paperback books, laser print, and LED computer monitors) was calculated using published contrast reserve levels required to perform spot reading, to read with fluency, high fluency, and under optimal conditions. One percent of this sample had contrast sensitivity less than that required to read newsprint fluently. Text presented on an LED computer monitor had the highest contrast. Ninety-eight percent of the sample had contrast sensitivity sufficient for high fluent reading of text (at least 160 words/min) on a monitor. However, 29.6% were still unlikely to be able to read this text with optimal fluency. Reduced contrast of print limits text accessibility for many people in the developed world. Presenting text in a high-contrast format, such as black laser print on a white page, would increase the number of people able to access such information. Additionally, making text available in a format that can be presented on an LED computer monitor will increase access to written documents.
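The screening logic above, comparing each reader's contrast sensitivity with the contrast reserve a reading task demands, can be sketched as follows. The reserve is computed assuming linear (not log) contrast sensitivity, and the threshold values for each reading level are illustrative assumptions after the low-vision reading literature, not the study's published values:

```python
def contrast_reserve(text_contrast: float, contrast_sensitivity: float) -> float:
    """Reserve = text contrast divided by the reader's threshold contrast.

    Threshold contrast is taken as the reciprocal of (linear) contrast
    sensitivity, so e.g. CS = 100 means a threshold of 0.01.
    """
    threshold = 1.0 / contrast_sensitivity
    return text_contrast / threshold

def reading_level(reserve: float) -> str:
    """Classify reading ability for a given contrast reserve.

    The cut-offs below are assumed illustrative values; the abstract only
    says "published contrast reserve levels" were used.
    """
    if reserve >= 20:
        return "optimal"
    if reserve >= 10:
        return "high fluent"
    if reserve >= 4:
        return "fluent"
    if reserve >= 3:
        return "spot reading"
    return "inaccessible"

# Example: high-contrast laser print (contrast 0.9) read by someone with CS = 100
level = reading_level(contrast_reserve(0.9, 100))   # reserve = 90
```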

  15. SECURITY AND PRIVACY ISSUES IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Amina AIT OUAHMAN

    2014-10-01

    Full Text Available Today, cloud computing is defined and talked about across the ICT industry under different contexts and with different definitions attached to it. It is a new paradigm in the evolution of Information Technology, as it is one of the biggest revolutions in this field to have taken place in recent times. According to the National Institute for Standards and Technology (NIST), “cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction” [1]. The importance of Cloud Computing is increasing and it is receiving a growing attention in the scientific and industrial communities. A study by Gartner [2] ranked Cloud Computing first among the top 10 most important technologies, with even better prospects in successive years for companies and organizations. Clouds bring out tremendous benefits for both individuals and enterprises. Clouds support economic savings, outsourcing mechanisms, resource sharing, any-where any-time accessibility, on-demand scalability, and service flexibility. Clouds minimize the need for user involvement by masking technical details such as software upgrades, licenses, and maintenance from its customers. Clouds could also offer better security advantages over individual server deployments. Since a cloud aggregates resources, cloud providers charter expert security personnel while typical companies could be limited with a network administrator who might not be well versed in cyber security issues. The new concepts introduced by the clouds, such as computation outsourcing, resource sharing, and external data warehousing, increase the security and privacy concerns and create new security challenges. Moreover, the large scale of the clouds, the proliferation of mobile access devices (e

  16. Computed tomography of the anterior mediastinum in myasthenia gravis: a radiologic-pathologic correlative study

    Energy Technology Data Exchange (ETDEWEB)

    Fon, G.T.; Bein, M.E.; Mancuso, A.A.; Keesey, J.C.; Lupetin, A.R.; Wong, W.S.

    1982-01-01

    Chest radiographs and computed tomographic (CT) scans of the mediastinum were correlated with pathologic findings of the thymus following thymectomy in 57 patients with myasthenia gravis. Based on the patient's age and the overall morphology of the anterior mediastinum, CT scans were assigned one of four grades in an attempt to predict thymus pathologic findings. Using this grading, 14 of 16 cases of thymoma were suspected or definitely diagnosed. One of the two cases not diagnosed on CT was a microscopic tumor. There were no false-positive diagnoses in 11 cases graded as definitely thymoma. We conclude that thymoma can be sensitively diagnosed in patients older than 40 years of age. However, thymoma cannot be predicted with a high level of confidence in patients younger than 40 because of the difficulty in differentiating normal thymus or hyperplasia from thymoma. Recommendations for the use of CT in the preoperative evaluation of myasthenic patients are presented.

  17. Computed tomography of the anterior mediastinum in myasthenia gravis: a radiologic-pathologic correlative study

    International Nuclear Information System (INIS)

    Fon, G.T.; Bein, M.E.; Mancuso, A.A.; Keesey, J.C.; Lupetin, A.R.; Wong, W.S.

    1982-01-01

    Chest radiographs and computed tomographic (CT) scans of the mediastinum were correlated with pathologic findings of the thymus following thymectomy in 57 patients with myasthenia gravis. Based on the patient's age and the overall morphology of the anterior mediastinum, CT scans were assigned one of four grades in an attempt to predict thymus pathologic findings. Using this grading, 14 of 16 cases of thymoma were suspected or definitely diagnosed. One of the two cases not diagnosed on CT was a microscopic tumor. There were no false-positive diagnoses in 11 cases graded as definitely thymoma. We conclude that thymoma can be sensitively diagnosed in patients older than 40 years of age. However, thymoma cannot be predicted with a high level of confidence in patients younger than 40 because of the difficulty in differentiating normal thymus or hyperplasia from thymoma. Recommendations for the use of CT in the preoperative evaluation of myasthenic patients are presented.

  18. The potential use of mobile technology: enhancing accessibility and communication in a blended learning course

    OpenAIRE

    Mayisela, Tabisa

    2013-01-01

    Mobile technology is increasingly being used to support blended learning beyond computer centres. It has been considered as a potential solution to the problem of a shortage of computers for accessing online learning materials (courseware) in a blended learning course. The purpose of the study was to establish how the use of mobile technology could enhance accessibility and communication in a blended learning course. Data were solicited from a purposive convenience sample of 36 students engage...

  19. Computer Security: the value of your password

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2016-01-01

    Of course, your passwords have a value to you as they allow you to access your computer and your Facebook page, to buy on Amazon, to create a Twitter feed, and to use a multitude of computing services provided by CERN. But have you ever thought of their value to the malicious people of this world?    With your account password, I can take over your computer. I can install software allowing me to enable your microphone and listen to your communications and what is happening around you as long as your computer is turned on. I can take regular screenshots and monitor you while you work. With that, I can try to determine your working habits, your online behaviour, the way you write e-mails… Useful, if I want to impersonate you believably (e.g. to attack CERN and the systems you are working on at CERN). What’s more, with access to your computer, I can install a keylogger to record your every keystroke – including when you type all your other passwords: ...

  20. Captioning and Indian Sign Language as Accessibility Tools in Universal Design

    Directory of Open Access Journals (Sweden)

    John Mathew Martin Poothullil

    2013-06-01

    Full Text Available Universal Design in Media as a strategy to achieve accessibility in digital television started in Spain in 1997 with the digitalization of satellite platforms (MuTra, 2006). In India, a conscious effort toward a strategy for accessible media formats in digital television is yet to be made. Advertising in India is a billion dollar industry (Adam Smith, 2008) and digital television provides a majority of the space for it. This study investigated the effects of advertisements in accessible format, through the use of captioning and Indian sign language (ISL), on hearing and deaf people. “Deaf” (capital letter ‘D’ used for culturally Deaf) and hearing viewers watched two short recent advertisements with and without accessibility formats in a randomized order. Their reactions were recorded on a questionnaire developed for the purpose of the study. Eighty-four persons participated in this study, of which 42 were deaf persons. Analysis of the data showed that there was a difference in the effects of accessible and non-accessible formats of advertisement on the Deaf and hearing viewers. The study showed that accessible formats increased comprehension of the message of the advertisement and that the use of ISL helped deaf persons to understand concepts better. While captioning increased the ability of hearing persons to correlate listening with understanding the concept of the advertisement, deaf persons correlated watching the ISL interpreter with understanding the concept of the advertisement. The placement of the ISL interpreter on the screen and the color of the fonts used for captioning were also covered by the study. However, the placement of the ISL interpreter, the color of the fonts on the screen, and their correlation with comprehension of the advertisement by hearing and deaf persons did not show much significance in the results of the study.

  1. Assessment of Low-Income Adults' Access to Technology: Implications for Nutrition Education

    Science.gov (United States)

    Neuenschwander, Lauren M.; Abbott, Angela; Mobley, Amy R.

    2012-01-01

    Objective: The main objective of this study was to investigate access and use of technologies such as the Internet among Indiana's low-income population. The secondary objective was to determine whether access and use of computers significantly differed by age, race, and/or education level. Methods: Data were collected from low-income adult…

  2. Correlation between computer-aided dynamic gadolinium-enhanced MRI assessment of inflammation and semi-quantitative synovitis and bone marrow oedema scores of the wrist in patients with rheumatoid arthritis--a cohort study

    DEFF Research Database (Denmark)

    Boesen, Mikael; Kubassova, Olga; Bouert, Rasmus

    2012-01-01

    Objective. To test the correlation between assessment of inflammation using dynamic contrast-enhanced MRI (DCE-MRI) analysed by a novel computer-aided approach and semi-quantitative scores of synovitis and bone marrow oedema (BME) using the OMERACT-RA MRI Scoring (RAMRIS) system, in the wrist...... extended region of interest (ROI) placed around the wrist joint (semi-automated approach) and (iii) within a small ROI placed in the area with most visual enhancement (semi-automated approach). Time spent on each procedure was noted. Spearman's rank correlation test was applied to assess the correlation...... between RAMRIS and the computer-generated dynamic parameters. Results. RAMRIS synovitis (range 2-9), BME (range 0-39) and the dynamic parameters reflecting the number of enhancing voxels were significantly correlated, especially when an extended ROI around the wrist was used (ρ = 0.74; P ...
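
Spearman's rank correlation test, named in this abstract, is straightforward to reproduce. A self-contained sketch with invented scores (the lists below are hypothetical, not the study's data) might look like:

```python
# Minimal Spearman rank correlation: rank both variables (averaging over
# ties), then take the Pearson correlation of the rank vectors.
def ranks(values):
    """Assign ranks 1..n, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

ramris = [2, 3, 5, 4, 7, 9, 6]                 # hypothetical semi-quantitative scores
voxels = [110, 150, 400, 260, 600, 900, 380]   # hypothetical enhancing-voxel counts
print(round(spearman_rho(ramris, voxels), 3))
```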

  3. Statistical modeling of the Internet traffic dynamics: To which extent do we need long-term correlations?

    Science.gov (United States)

    Markelov, Oleg; Nguyen Duc, Viet; Bogachev, Mikhail

    2017-11-01

    Recently we have suggested a universal superstatistical model of user access patterns and aggregated network traffic. The model takes into account the irregular character of end user access patterns on the web via the non-exponential distributions of the local access rates, but neglects the long-term correlations between these rates. While the model is accurate for quasi-stationary traffic records, its performance under highly variable and especially non-stationary access dynamics remains questionable. In this paper, using an example of the traffic patterns from a highly loaded network cluster hosting the website of the 1998 FIFA World Cup, we suggest a generalization of the previously suggested superstatistical model by introducing long-term correlations between access rates. Using queueing system simulations, we show explicitly that this generalization is essential for modeling network nodes with highly non-stationary access patterns, where neglecting long-term correlations leads to the underestimation of the empirical average sojourn time by several decades under high throughput utilization.
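
The effect this abstract describes can be illustrated with a toy queueing simulation (not the paper's superstatistical model; all parameters are invented): a single-server FIFO queue via the Lindley recursion, where the access rate switches between a quiet and a busy regime. Long regime durations mimic long-term correlated rates; a regime length of one removes the correlation.

```python
# Single-server FIFO queue simulated with the Lindley recursion. The arrival
# rate is re-drawn every `regime_len` customers; long regimes produce
# long-term correlated access rates and much larger mean sojourn times.
import random

def mean_sojourn(rates, service_mean=1.0, regime_len=1, n=100_000, seed=1):
    rng = random.Random(seed)
    wait = 0.0
    total = 0.0
    for i in range(n):
        if i % regime_len == 0:            # re-draw the local access rate
            rate = rng.choice(rates)
        service = rng.expovariate(1.0 / service_mean)
        total += wait + service            # this customer's sojourn time
        inter = rng.expovariate(rate)      # time until the next arrival
        wait = max(0.0, wait + service - inter)  # Lindley recursion
    return total / n

rates = [0.5, 1.1]  # the busy regime is temporarily overloaded
print("uncorrelated:", round(mean_sojourn(rates, regime_len=1), 2))
print("correlated:  ", round(mean_sojourn(rates, regime_len=5_000), 2))
```

Even though both runs draw from the same rate distribution, the correlated run yields a far larger mean sojourn time, in line with the abstract's point that neglecting long-term correlations underestimates the empirical average sojourn time.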

  4. Neural Correlates of Direct Access Trading in a Real Stock Market: An fMRI Investigation.

    Science.gov (United States)

    Raggetti, GianMario; Ceravolo, Maria G; Fattobene, Lucrezia; Di Dio, Cinzia

    2017-01-01

    Background: While financial decision making has been barely explored, no study has previously investigated the neural correlates of individual decisions made by professional traders involved in real stock market negotiations, using their own financial resources. Aim: We sought to detect how different brain areas are modulated by factors like age, expertise, psychological profile (speculative risk seeking or aversion) and, eventually, size and type (Buy/Sell) of stock negotiations, made through Direct Access Trading (DAT) platforms. Subjects and methods: Twenty male traders underwent fMRI while negotiating in the Italian stock market using their own preferred trading platform. Results: At least 20 decision events were collected during each fMRI session. Risk averse traders performed a lower number of financial transactions with respect to risk seekers, with a lower average economic value, but with a higher rate of filled proposals. Activations were observed in cortical and subcortical areas traditionally involved in decision processes, including the ventrolateral and dorsolateral prefrontal cortex (vlPFC, dlPFC), the posterior parietal cortex (PPC), the nucleus accumbens (NAcc), and dorsal striatum. Regression analysis indicated an important role of age in modulating activation of left NAcc, while traders' expertise was negatively related to activation of vlPFC. High value transactions were associated with a stronger activation of the right PPC when subjects buy rather than sell. The success of the trading activity, based on a large number of filled transactions, was related with higher activation of vlPFC and dlPFC. Independent of chronological and professional age, traders differed in their attitude to DAT, with distinct brain activity profiles being detectable during fMRI sessions. Those subjects who described themselves as very self-confident showed a lower or absent activation of both the caudate nucleus and the dlPFC, while more reflexive traders showed

  5. Neural Correlates of Direct Access Trading in a Real Stock Market: An fMRI Investigation

    Directory of Open Access Journals (Sweden)

    GianMario Raggetti

    2017-09-01

    Full Text Available Background: While financial decision making has been barely explored, no study has previously investigated the neural correlates of individual decisions made by professional traders involved in real stock market negotiations, using their own financial resources. Aim: We sought to detect how different brain areas are modulated by factors like age, expertise, psychological profile (speculative risk seeking or aversion) and, eventually, size and type (Buy/Sell) of stock negotiations, made through Direct Access Trading (DAT) platforms. Subjects and methods: Twenty male traders underwent fMRI while negotiating in the Italian stock market using their own preferred trading platform. Results: At least 20 decision events were collected during each fMRI session. Risk averse traders performed a lower number of financial transactions with respect to risk seekers, with a lower average economic value, but with a higher rate of filled proposals. Activations were observed in cortical and subcortical areas traditionally involved in decision processes, including the ventrolateral and dorsolateral prefrontal cortex (vlPFC, dlPFC), the posterior parietal cortex (PPC), the nucleus accumbens (NAcc), and dorsal striatum. Regression analysis indicated an important role of age in modulating activation of left NAcc, while traders' expertise was negatively related to activation of vlPFC. High value transactions were associated with a stronger activation of the right PPC when subjects buy rather than sell. The success of the trading activity, based on a large number of filled transactions, was related with higher activation of vlPFC and dlPFC. Independent of chronological and professional age, traders differed in their attitude to DAT, with distinct brain activity profiles being detectable during fMRI sessions.
Those subjects who described themselves as very self-confident, showed a lower or absent activation of both the caudate nucleus and the dlPFC, while more reflexive traders

  6. Development of improved methods for remote access of DIII-D data and data analysis

    International Nuclear Information System (INIS)

    Greene, K.L.; McHarg, B.B. Jr.

    1997-11-01

    The DIII-D tokamak is a national fusion research facility. There is an increasing need to access data from remote sites in order to facilitate data analysis by collaborative researchers at remote locations, both nationally and internationally. In the past, this has usually been done by remotely logging into computers at the DIII-D site. With the advent of faster networking and powerful computers at remote sites, it is becoming possible to access and analyze data from anywhere in the world as if the remote user were actually at the DIII-D site. The general mechanism for accessing DIII-D data has always been via the PTDATA subroutine. Substantial enhancements are being made to that routine to make it more useful in a non-local environment. In particular, a caching mechanism is being built into PTDATA to make network data access more efficient. Studies are also being made of using Distributed File System (DFS) disk storage in a Distributed Computing Environment (DCE). A data server has been created that will migrate, on request, shot data from the DIII-D environment into the DFS environment
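
The caching mechanism described for PTDATA can be illustrated with a hypothetical sketch. The function and path names below are invented, and `fetch_remote` is a stand-in for the actual network transfer from the DIII-D data server:

```python
# Client-side cache sketch: fetched shot data are kept on local disk so
# repeated reads avoid the network round trip entirely.
import json
import os
import tempfile

CACHE_DIR = os.path.join(tempfile.gettempdir(), "ptdata_cache")

def fetch_remote(shot: int, pointname: str):
    """Placeholder for a network fetch of one signal from the data server."""
    return {"shot": shot, "pointname": pointname, "data": [0.0, 1.0, 2.0]}

def ptdata_cached(shot: int, pointname: str):
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, f"{shot}_{pointname}.json")
    if os.path.exists(path):                 # cache hit: no network access
        with open(path) as f:
            return json.load(f)
    record = fetch_remote(shot, pointname)   # cache miss: fetch, then store
    with open(path, "w") as f:
        json.dump(record, f)
    return record

first = ptdata_cached(87941, "ip")    # fetched over the "network"
second = ptdata_cached(87941, "ip")   # served from the local cache
print(first == second)
```

Because shot data are write-once, a cache like this never goes stale, which is what makes the approach attractive for remote collaborators on slow links.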

  7. Online Public Access Catalogs. ERIC Fact Sheet.

    Science.gov (United States)

    Cochrane, Pauline A.

    A listing is presented of 17 documents in the ERIC database concerning the Online Catalog (sometimes referred to as OPAC or Online Public Access Catalog), a computer-based and supported library catalog designed for patron use. The database usually represents recent acquisitions and often contains information about books on order and items in…

  8. The benefit of non contrast-enhanced magnetic resonance angiography for predicting vascular access surgery outcome: a computer model perspective.

    Directory of Open Access Journals (Sweden)

    Maarten A G Merkx

    Full Text Available INTRODUCTION: Vascular access (VA) surgery, a prerequisite for hemodialysis treatment of end-stage renal disease (ESRD) patients, is hampered by complication rates, which are frequently related to flow enhancement. To assist in VA surgery planning, a patient-specific computer model for postoperative flow enhancement was developed. The purpose of this study is to assess the benefit of non-contrast-enhanced magnetic resonance angiography (NCE-MRA) data as patient-specific geometrical input for the model-based prediction of surgery outcome. METHODS: 25 ESRD patients were included in this study. All patients received a NCE-MRA examination of the upper extremity blood vessels in addition to routine ultrasound (US). Local arterial radii were assessed from NCE-MRA and converted to model input using a linear fit per artery. Venous radii were determined with US. The effect of radius measurement uncertainty on model predictions was accounted for by performing Monte-Carlo simulations. The resulting flow prediction interval of the computer model was compared with the postoperative flow obtained from US. Patients with no overlap between model-based prediction and postoperative measurement were further analyzed to determine whether an increase in geometrical detail improved computer model prediction. RESULTS: Overlap between postoperative flows and model-based predictions was obtained for 71% of patients. Detailed inspection of non-overlapping cases revealed that the geometrical details that could be assessed from NCE-MRA explained most of the differences, and moreover, upon addition of these details in the computer model the flow predictions improved. CONCLUSIONS: The results demonstrate clearly that NCE-MRA does provide valuable geometrical information for VA surgery planning. Therefore, it is recommended to use this modality, at least for patients at risk for local or global narrowing of the blood vessels as well as for patients for whom an US-based model
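
The Monte-Carlo step in this abstract, propagating radius measurement uncertainty into a flow prediction interval, can be sketched as follows. The Poiseuille relation below stands in for the patient-specific computer model, and all numbers are invented for illustration:

```python
# Monte-Carlo propagation of radius uncertainty: sample radii from the
# measurement distribution, push each through the flow model, and report
# an empirical 95% prediction interval.
import random

def flow(radius_mm: float, dp: float = 60.0) -> float:
    """Poiseuille-like stand-in (arbitrary units): Q proportional to dp * r^4."""
    return dp * radius_mm ** 4

def prediction_interval(r_mean=2.0, r_sd=0.2, n=20_000, seed=7):
    rng = random.Random(seed)
    samples = sorted(flow(rng.gauss(r_mean, r_sd)) for _ in range(n))
    lo = samples[int(0.025 * n)]          # 2.5th percentile
    hi = samples[int(0.975 * n)]          # 97.5th percentile
    return lo, hi

lo, hi = prediction_interval()
print(f"95% flow prediction interval: [{lo:.0f}, {hi:.0f}]")
```

The study's criterion, whether the measured postoperative flow falls inside such an interval, then becomes a simple containment check.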

  9. Security Implications of Typical Grid Computing Usage Scenarios

    International Nuclear Information System (INIS)

    Humphrey, Marty; Thompson, Mary R.

    2001-01-01

    A Computational Grid is a collection of heterogeneous computers and resources spread across multiple administrative domains with the intent of providing users uniform access to these resources. There are many ways to access the resources of a Computational Grid, each with unique security requirements and implications for both the resource user and the resource provider. A comprehensive set of Grid usage scenarios is presented and analyzed with regard to security requirements such as authentication, authorization, integrity, and confidentiality. The main value of these scenarios and the associated security discussions is to provide a library of situations against which an application designer can match, thereby facilitating security-aware application use and development from the initial stages of the application design and invocation. A broader goal of these scenarios is to increase the awareness of security issues in Grid Computing.

  10. Security Implications of Typical Grid Computing Usage Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Humphrey, Marty; Thompson, Mary R.

    2001-06-05

    A Computational Grid is a collection of heterogeneous computers and resources spread across multiple administrative domains with the intent of providing users uniform access to these resources. There are many ways to access the resources of a Computational Grid, each with unique security requirements and implications for both the resource user and the resource provider. A comprehensive set of Grid usage scenarios is presented and analyzed with regard to security requirements such as authentication, authorization, integrity, and confidentiality. The main value of these scenarios and the associated security discussions is to provide a library of situations against which an application designer can match, thereby facilitating security-aware application use and development from the initial stages of the application design and invocation. A broader goal of these scenarios is to increase the awareness of security issues in Grid Computing.

  11. Does Access to Foreign Markets shape Internal Migration? Evidence from Brazil

    OpenAIRE

    Laura Hering; Rodrigo Paillacar

    2014-01-01

    This paper investigates how internal migration is affected by Brazil’s increased integration into the world economy. It analyzes the impact of regional differences in access to foreign demand on sector-specific bilateral migration rates between the Brazilian states for the years 1995 to 2003. Using international trade data, a foreign market access measure is computed at the sectoral level,...

  12. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
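
The correlative pattern recognition mentioned above, which an optical correlator performs at the speed of light, amounts to sliding a template over an image and scoring each position by cross-correlation; the peak marks the pattern's location. A tiny digital sketch with invented data:

```python
# Brute-force 2D cross-correlation for template matching: the position with
# the highest correlation score is reported as the match location.
def cross_correlate(image, template):
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("-inf"), None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            score = sum(image[y + j][x + i] * template[j][i]
                        for j in range(th) for i in range(tw))
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

image = [[0, 0, 0, 0],
         [0, 1, 2, 0],
         [0, 3, 4, 0],
         [0, 0, 0, 0]]
template = [[1, 2],
            [3, 4]]
print(cross_correlate(image, template))   # peak where the template sits, at (1, 1)
```

The optical version computes all positions in parallel via Fourier optics, which is exactly why such correlators could handle the throughput of synthetic-aperture radar.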

  13. 61 Gender, Computer Access and Use as Predictors of Nigerian ...

    African Journals Online (AJOL)

    Nekky Umera

    ... Awolowo University, Faculty of Education, Department of Educational Technology, Ile-Ife, Osun-State, ... that failure to adopt computer technology in schools results in students' incompetence in ..... southside elementary school. Retrieved on ...

  14. A Hybrid Verifiable and Delegated Cryptographic Model in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jaber Ibrahim Naser

    2018-02-01

    Full Text Available Access control is very important in cloud data sharing. Especially in domains like healthcare, it is essential to have access control mechanisms in place for confidentiality and secure data access. Attribute-based encryption has been around for many years to secure data and provide controlled access. In this paper, we proposed a framework that supports a circuit- and attribute-based encryption mechanism involving multiple parties: the data owner, data user, cloud server, and attribute authority. An important feature of the proposed system is the verifiable delegation of the decryption process to the cloud server. The data owner encrypts data and delegates the decryption process to the cloud. The cloud server performs partial decryption, and the final decrypted data are shared with users as per their privileges. The data owner thus reduces computational complexity by delegating the decryption process to the cloud server. We built a prototype application using the Microsoft .NET platform as a proof of concept. The empirical results revealed that there is controlled access with multiple user roles and access control rights for secure and confidential data access in cloud computing.
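
The delegation idea in this abstract, where the cloud does most of the decryption work but cannot read the data alone, can be sketched with toy split-key ElGamal. This is emphatically NOT the paper's circuit/attribute-based scheme, and the modulus below is far too small to be secure; it only illustrates partial decryption by an untrusted party (requires Python 3.8+ for the modular inverse via `pow(x, -1, p)`):

```python
# Split-key ElGamal over a small prime: the secret exponent is shared as
# x1 (cloud) + x2 (user). The cloud raises c1 to its share (the heavy,
# outsourced step); the user finishes with its own share. Neither party
# alone holds the full key.
import random

P = 0xFFFFFFFB  # 2**32 - 5, a small prime -- illustration only, not secure
G = 7

def keygen(rng):
    x1 = rng.randrange(2, P - 1)          # cloud's key share
    x2 = rng.randrange(2, P - 1)          # user's key share
    h = pow(G, (x1 + x2) % (P - 1), P)    # combined public key
    return x1, x2, h

def encrypt(m, h, rng):
    r = rng.randrange(2, P - 1)
    return pow(G, r, P), (m * pow(h, r, P)) % P

def cloud_partial_decrypt(c1, x1):
    return pow(c1, x1, P)                  # heavy exponentiation, outsourced

def user_finish(c1, c2, partial, x2):
    shared = (partial * pow(c1, x2, P)) % P   # reconstructs h^r
    return (c2 * pow(shared, -1, P)) % P      # divide out the mask

rng = random.Random(42)
x1, x2, h = keygen(rng)
msg = 123456789
c1, c2 = encrypt(msg, h, rng)
print(user_finish(c1, c2, cloud_partial_decrypt(c1, x1), x2) == msg)
```

A verifiable scheme like the paper's additionally lets the user check that the cloud's partial result is honest; that machinery is omitted here.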

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  16. Fluid dynamics theory, computation, and numerical simulation

    CERN Document Server

    Pozrikidis, C

    2001-01-01

    Fluid Dynamics Theory, Computation, and Numerical Simulation is the only available book that extends the classical field of fluid dynamics into the realm of scientific computing in a way that is both comprehensive and accessible to the beginner. The theory of fluid dynamics, and the implementation of solution procedures into numerical algorithms, are discussed hand-in-hand and with reference to computer programming. This book is an accessible introduction to theoretical and computational fluid dynamics (CFD), written from a modern perspective that unifies theory and numerical practice. There are several additions and subject expansions in the Second Edition of Fluid Dynamics, including new Matlab and FORTRAN codes. Two distinguishing features of the discourse are that solution procedures and algorithms are developed immediately after problem formulations are presented, and that numerical methods are introduced on a need-to-know basis and in increasing order of difficulty. Matlab codes are presented and discussed for a broad...

  17. Segmentation, access to finance constraints and the credit monopolistic power of financial institutions in Nicaragua

    NARCIS (Netherlands)

    Herrera Urbina, C.J.; Ruben, R.; Dijkstra, G.

    2015-01-01

    Access to finance has been the focus of significant interest in recent years as there are warning signs suggesting that lack of access to credit has an adverse effect on growth and poverty alleviation. Furthermore, recent studies have shown that access to finance is positively correlated with

  18. A flowsheet model of a coal-fired MHD/steam combined electricity generating cycle, using the access computer model

    International Nuclear Information System (INIS)

    Davison, J.E.; Eldershaw, C.E.

    1992-01-01

    This document forms the final report on a study of a coal-fired magnetohydrodynamic (MHD)/steam electric power generation system carried out by British Coal Corporation for the Commission of the European Communities. The study objective was to provide mass and energy balances and overall plant efficiency predictions for MHD to assist the Commission in their evaluation of advanced power generation technologies. In early 1990 the British Coal Corporation completed a study for the Commission in which a computer flowsheet modelling package was used to predict the performance of a conceptual air-blown MHD plant. Since that study was carried out, increasing emphasis has been placed on the possible need to reduce CO2 emissions to counter the so-called greenhouse effect. Air-blown MHD could greatly reduce CO2 emissions per kWh by virtue of its high thermal efficiency. However, if even greater reductions in CO2 emissions were required, the CO2 produced by coal combustion may have to be disposed of, for example into the deep ocean or underground caverns. To achieve this at minimum cost a concentrated CO2 flue gas would be required. This could be achieved in an MHD plant by using a mixture of high-purity oxygen and recycled CO2 flue gas in the combustor. To assess this plant concept, the European Commission awarded British Coal a contract to produce performance predictions using the access computer program.
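
The efficiency advantage behind this study comes from MHD acting as a topping cycle: the MHD generator extracts work from the hottest gas first, and its reject heat still drives the steam bottoming cycle. A back-of-envelope sketch with invented efficiency figures (not the report's predictions):

```python
# Standard topping/bottoming combination: the bottoming cycle converts a
# fraction eta_steam of the heat the topping cycle rejects.
def combined_efficiency(eta_mhd: float, eta_steam: float) -> float:
    """eta = eta_mhd + (1 - eta_mhd) * eta_steam."""
    return eta_mhd + (1.0 - eta_mhd) * eta_steam

print(f"steam alone: {combined_efficiency(0.0, 0.40):.2f}")
print(f"MHD + steam: {combined_efficiency(0.20, 0.40):.2f}")
```

Even a modest topping-cycle efficiency lifts the overall figure well above the steam plant alone, which is also why CO2 emissions per kWh fall.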

  19. Correlation functions of Coulomb branch operators

    Energy Technology Data Exchange (ETDEWEB)

    Gerchkovitz, Efrat [Weizmann Institute of Science,Rehovot 76100 (Israel); Gomis, Jaume [Perimeter Institute for Theoretical Physics,Waterloo, ON N2L 2Y5 (Canada); Ishtiaque, Nafiz [Perimeter Institute for Theoretical Physics,Waterloo, ON N2L 2Y5 (Canada); Department of Physics, University of Waterloo,Waterloo, ON N2L 3G1 (Canada); Karasik, Avner; Komargodski, Zohar [Weizmann Institute of Science,Rehovot 76100 (Israel); Pufu, Silviu S. [Joseph Henry Laboratories, Princeton University,Princeton, NJ 08544 (United States)

    2017-01-24

    We consider the correlation functions of Coulomb branch operators in four-dimensional N=2 Superconformal Field Theories (SCFTs) involving exactly one anti-chiral operator. These extremal correlators are the “minimal” non-holomorphic local observables in the theory. We show that they can be expressed in terms of certain determinants of derivatives of the four-sphere partition function of an appropriate deformation of the SCFT. This relation between the extremal correlators and the deformed four-sphere partition function is non-trivial due to the presence of conformal anomalies, which lead to operator mixing on the sphere. Evaluating the deformed four-sphere partition function using supersymmetric localization, we compute the extremal correlators explicitly in many interesting examples. Additionally, the representation of the extremal correlators mentioned above leads to a system of integrable differential equations. We compare our exact results with previous perturbative computations and with the four-dimensional tt* equations. We also use our results to study some of the asymptotic properties of the perturbative series expansions we obtain in N=2 SQCD.

  20. Computer hardware for radiologists: Part I

    International Nuclear Information System (INIS)

    Indrajit, IK; Alam, A

    2010-01-01

    Computers are an integral part of modern radiology practice. They are used in different radiology modalities to acquire, process, and postprocess imaging data. They have had a dramatic influence on contemporary radiology practice. Their impact has extended further with the emergence of Digital Imaging and Communications in Medicine (DICOM), Picture Archiving and Communication System (PACS), Radiology Information System (RIS) technology, and Teleradiology. A basic overview of computer hardware relevant to radiology practice is presented here. The key hardware components in a computer are the motherboard, central processing unit (CPU), the chipset, the random access memory (RAM), the memory modules, bus, storage drives, and ports. The personal computer (PC) has a rectangular case that contains important components called hardware, many of which are integrated circuits (ICs). The fiberglass motherboard is the main printed circuit board and has a variety of important hardware mounted on it, connected by electrical pathways called “buses”. The CPU is the largest IC on the motherboard and contains millions of transistors. Its principal function is to execute “programs”. A Pentium® 4 CPU executes a billion instructions per second. The chipset is completely different from the CPU in design and function; it controls data and the interaction of buses between the motherboard and the CPU. Memory (RAM) is fundamentally semiconductor chips storing data and instructions for access by a CPU. RAM is classified by storage capacity, access speed, data rate, and configuration.