WorldWideScience

Sample records for acpa automated cluster

  1. CCR6(+) Th cell populations distinguish ACPA positive from ACPA negative rheumatoid arthritis.

    Science.gov (United States)

    Paulissen, Sandra M J; van Hamburg, Jan Piet; Davelaar, Nadine; Vroman, Heleen; Hazes, Johanna M W; de Jong, Pascal H P; Lubberts, Erik

    2015-11-30

    Patients with rheumatoid arthritis (RA) can be separated into two major subpopulations based on the absence or presence of serum anti-citrullinated protein antibodies (ACPAs). The more severe disease course in ACPA(+) RA and differences in treatment outcome between these subpopulations suggest that ACPA(+) and ACPA(-) RA are different disease subsets. The identification of T-helper (Th) cells specifically recognizing citrullinated peptides, combined with the strong association between HLA-DRB1 and ACPA positivity, point toward a pathogenic role of Th cells in ACPA(+) RA. In this context we recently identified a potential pathogenic role for CCR6(+) Th cells in RA. Therefore, we examined whether Th cell population distributions differ by ACPA status. We performed a nested matched case-control study including 27 ACPA(+) and 27 ACPA(-) treatment-naive early RA patients matched for disease activity score in 44 joints, presence of rheumatoid factor, sex, age, duration of complaints and presence of erosions. CD4(+)CD45RO(+) (memory) Th cell distribution profiles from these patients were generated based on differential chemokine receptor expression and related with disease duration. ACPA status was not related to differences in total CD4(+) T cell or memory Th cell proportions. However, ACPA(+) patients had significantly higher proportions of Th cells expressing the chemokine receptors CCR6 and CXCR3. Similar proportions of CCR4(+) and CCR10(+) Th cells were found. Within the CCR6(+) cell population, four Th subpopulations were distinguished based on differential chemokine receptor expression: Th17 (CCR4(+)CCR10(-)), Th17.1 (CXCR3(+)), Th22 (CCR4(+)CCR10(+)) and CCR4/CXCR3 double-positive (DP) cells. In particular, higher proportions of Th22 (p = 0.02), Th17.1 (p = 0.03) and CCR4/CXCR3 DP (p = 0.01) cells were present in ACPA(+) patients. 
    In contrast, ACPA status was not associated with differences in Th1 (CCR6(-)CXCR3(+); p = 0.90), Th2 (CCR6(-)CCR4(+); p = 0.27) and T…

  2. Resolution, configurational assignment, and enantiopharmacology at glutamate receptors of 2-amino-3-(3-carboxy-5-methyl-4-isoxazolyl)propionic acid (ACPA) and demethyl-ACPA

    DEFF Research Database (Denmark)

    Johansen, T N; Stensbøl, T B; Nielsen, B

    2001-01-01

    We have previously described (RS)-2-amino-3-(3-carboxy-5-methyl-4-isoxazolyl)propionic acid (ACPA) as a potent agonist at the (RS)-2-amino-3-(3-hydroxy-5-methyl-4-isoxazolyl)propionic acid (AMPA) receptor subtype of (S)-glutamic acid (Glu) receptors. We now report the chromatographic resolution … of ACPA and (RS)-2-amino-3-(3-carboxy-4-isoxazolyl)propionic acid (demethyl-ACPA) using a Sumichiral OA-5000 column. The configuration of the enantiomers of both compounds has been assigned based on X-ray crystallographic analyses, supported by circular dichroism spectra and elution orders on chiral HPLC … columns. Furthermore, the enantiopharmacology of ACPA and demethyl-ACPA was investigated using radioligand binding and cortical wedge electrophysiological assay systems and cloned metabotropic Glu receptors. (S)-ACPA showed high affinity in AMPA binding (IC(50) = 0.025 microM), low affinity in kainic acid …

  3. Smoking is associated with an increased risk of developing ACPA-positive but not ACPA-negative rheumatoid arthritis in Asian populations: evidence from the Malaysian MyEIRA case-control study.

    Science.gov (United States)

    Yahya, Abqariyah; Bengtsson, Camilla; Lai, Too Chun; Larsson, Per T; Mustafa, Amal Nasir; Abdullah, Nor Aini; Muhamad, Norasiah; Hussein, Heselynn; Klareskog, Lars; Alfredsson, Lars; Murad, Shahnaz

    2012-08-01

    We investigated the association between cigarette smoking and the risk of developing rheumatoid arthritis (RA) in the Malaysian population. A total of 1,056 RA patients and 1,416 matched controls aged 18-70 years within a defined area of Peninsular Malaysia were evaluated in a case-control study between August 2005 and December 2009. A case was defined as a person with early RA diagnosed using the 1987 American College of Rheumatology criteria for RA. Controls were randomly selected and matched for sex, age, and residential area. Cases and controls answered a questionnaire on a broad range of issues, including lifestyle factors and smoking habits; current and former smokers were classified as ever-smokers. The presence of anti-citrullinated peptide antibodies (ACPA) was determined for cases and controls. We found that ever-smokers had an increased risk of developing ACPA-positive RA [odds ratio (OR) = 4.1, 95% confidence interval (CI) 1.9-9.2] but not ACPA-negative RA (OR = 0.7, 95% CI 0.3-2.0), compared with never-smokers. A significant dose-response relationship between cumulative dose of smoking and the risk of ACPA-positive RA was observed (<20 pack-years: OR = 3.3, 95% CI 1.1-9.8; at least 20 pack-years: OR = 5.2, 95% CI 1.6-17.6). Hence, smoking is associated with an increased risk of ACPA-positive RA in the Malaysian population, whose genetic context is similar to that of several other Asian countries.
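    The odds ratios quoted above come from standard 2×2 case-control tables. A minimal sketch of how an OR and its Wald-type 95% CI (Woolf's log method) are computed; the counts below are illustrative, not the MyEIRA data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 case-control table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR), Woolf's method
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (not from the study):
or_, lo, hi = odds_ratio_ci(40, 10, 20, 30)
```

With these made-up counts the point estimate is 6.0 and the CI excludes 1, the same pattern of reasoning used for the ever-smoker estimates above.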

  4. Inhibition of AcpA phosphatase activity with ascorbate attenuates Francisella tularensis intramacrophage survival.

    Science.gov (United States)

    McRae, Steven; Pagliai, Fernando A; Mohapatra, Nrusingh P; Gener, Alejandro; Mahmou, Asma Sayed Abdelgeliel; Gunn, John S; Lorca, Graciela L; Gonzalez, Claudio F

    2010-02-19

    Acid phosphatase activity in the highly infectious intracellular pathogen Francisella tularensis is directly related to the ability of these bacteria to survive inside host cells. Pharmacological inactivation of acid phosphatases could potentially help in the treatment of tularemia or even be used to neutralize the infection. In the present work, we report inhibitory compounds for three of the four major acid phosphatases produced by F. tularensis SCHU4: AcpA, AcpB, and AcpC. The inhibitors were identified using a catalytic screen of a library of chemicals approved for use in humans. The best results were obtained against AcpA. The two compounds identified, ascorbate (K(i) = 380 +/- 160 microM) and 2-phosphoascorbate (K(i) = 3.2 +/- 0.85 microM), inhibit AcpA in a noncompetitive, nonreversible fashion. A potential ascorbylation site in the proximity of the catalytic pocket of AcpA was identified using site-directed mutagenesis. The effects of the inhibitors identified in vitro were evaluated using bioassays measuring the ability of F. tularensis to survive inside infected cells. The presence of ascorbate or 2-phosphoascorbate impaired the intramacrophage survival of F. tularensis in an AcpA-dependent manner, as probed using knockout strains. The evidence presented herein indicates that ascorbate could be a good candidate for clinical use to improve treatments against tularemia.

  5. Identification of genes required for secretion of the Francisella oxidative burst-inhibiting acid phosphatase AcpA

    Directory of Open Access Journals (Sweden)

    John S Gunn

    2016-04-01

    Francisella tularensis is a Tier 1 bioterror threat and the intracellular pathogen responsible for tularemia in humans and animals. Upon entry into the host, Francisella uses multiple mechanisms to evade killing. Our previous studies have shown that after entering its primary cellular host, the macrophage, Francisella immediately suppresses the oxidative burst by secreting a series of acid phosphatases including AcpA-B-C and HapA, thereby evading the innate immune response of the macrophage and enhancing survival and further infection. However, the mechanism of acid phosphatase secretion by Francisella is still unknown. In this study, we screened for genes required for AcpA secretion in Francisella. We initially demonstrated that the known secretion systems, the putative Francisella pathogenicity island (FPI)-encoded Type VI secretion system and the Type IV pili, do not secrete AcpA. Using random transposon mutagenesis in conjunction with ELISA, Western blotting and acid phosphatase enzymatic assays, a transposon library of 5,450 mutants was screened for strains with a minimum 1.5-fold decrease in secreted (culture supernatant) AcpA but no defect in cytosolic AcpA. Three mutants with decreased supernatant AcpA were identified. The transposon insertion sites of these mutants were revealed by direct genomic sequencing or inverse PCR and sequencing. One of these mutants has a severe defect in AcpA secretion (at least an 85% decrease); the disrupted gene encodes a predicted hypothetical inner membrane protein. Interestingly, this mutation also affected the secretion of the FPI-encoded protein VgrG. Thus, this screen identified novel protein secretion factors involved in the subversion of host defenses.

  6. Automated Clustering of Similar Amendments

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The Italian Senate is clogged by computer-generated amendments. This talk will describe a simple strategy to cluster them in an automated fashion, so that the appropriate Senate procedures can be used to get rid of them in one sweep.
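    One simple strategy of the kind the talk describes could group near-duplicate amendments by token-set (Jaccard) similarity with union-find. This is a sketch under assumptions: the sample texts and the 0.7 threshold are invented for illustration, and the talk does not specify its actual algorithm:

```python
from itertools import combinations

def tokens(text):
    """Lowercase word set used as a cheap bag-of-words signature."""
    return set(text.lower().split())

def jaccard(a, b):
    inter = len(a & b)
    return inter / len(a | b) if inter else 0.0

def cluster_amendments(texts, threshold=0.7):
    """Greedy single-link clustering via union-find: amendments whose
    token sets overlap above `threshold` land in the same group."""
    parent = list(range(len(texts)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    sigs = [tokens(t) for t in texts]
    for i, j in combinations(range(len(texts)), 2):
        if jaccard(sigs[i], sigs[j]) >= threshold:
            parent[find(i)] = find(j)

    groups = {}
    for i in range(len(texts)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Two computer-generated variants ("Replace the figure 10 with the figure 11" vs "…12") fall into one group, while an unrelated amendment stays separate, which is exactly what a bulk Senate procedure would need.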

  7. The effect of BLA GABAB receptors on the anxiolytic-like effect and aversive memory deficit induced by ACPA

    Directory of Open Access Journals (Sweden)

    Katayoon Kangarlu Haghighi

    2016-07-01

    Background: As a psychoactive plant, Cannabis sativa (marijuana) is widely used throughout the world. Several investigations have indicated that administration of marijuana affects various cognitive and non-cognitive behaviors, including anxiety-like behaviors and learning and memory deficits. It has been shown that three main cannabinoid receptors (i.e. CB1, CB2 and CB3) are involved in cannabinoids' functions. CB1 receptors are abundantly expressed in central nervous system regions such as the hippocampus, amygdala, cerebellum and cortex. Therefore, the neuropsychological functions of endocannabinoids are thought to be more closely linked to CB1 receptors. Among other brain regions, CB1 is highly expressed in the amygdala, an integral component of the limbic circuitry. The amygdala plays a major role in the control of emotional behavior, including conditioned fear and anxiety. In the present study we examined the possible roles of basolateral amygdala (BLA) GABAB receptors in arachidonylcyclopropylamide (ACPA)-induced anxiolytic-like effect and aversive memory deficit in adult male mice. Methods: This experimental study was conducted from September 2013 to December 2014 at the Institute for Studies in Theoretical Physics and Mathematics, School of Cognitive Sciences, Tehran. Male albino NMRI mice (Pasteur Institute, Iran), weighing 27-30 g, were used. Bilateral guide cannulae were implanted to allow intra-BLA microinjection of the drugs. We used the Elevated Plus Maze (EPM) to examine memory and anxiety behavior (test-retest protocol). ACPA was administered intraperitoneally, and the GABAB agonist and antagonist were administered intra-amygdala. Results: Data showed that pre-test treatment with ACPA induced an anxiolytic-like effect and aversive memory deficit. The results revealed that pre-test intra-BLA infusion of baclofen (GABAB receptor agonist) impaired the aversive memory, while phaclofen (GABAB receptor antagonist) improved it. Interestingly, pretreatment with a sub…

  8. Pride and Progress? 30 Years of ACPA and NASPA LGBTQ Presentations

    Science.gov (United States)

    Pryor, Jonathan T.; Garvey, Jason C.; Johnson, Shonteria

    2017-01-01

    The purpose of this article is to examine themes of campus climate, LGBTQ identity construction, and visibility within LGBTQ presentations at ACPA and NASPA national conferences over the last 30 years. The authors frame their analysis utilizing prominent LGBT and queer issues scholarship in higher education research. Findings demonstrate the role…

  9. Shared epitope alleles remain a risk factor for anti-citrullinated protein antibody (ACPA)-positive rheumatoid arthritis in three Asian ethnic groups.

    Directory of Open Access Journals (Sweden)

    Too Chun-Lai

    BACKGROUND: To investigate the associations between HLA-DRB1 shared epitope (SE) alleles and rheumatoid arthritis in subsets of rheumatoid arthritis defined by autoantibodies in three Asian populations from Malaysia. METHODS: 1,079 rheumatoid arthritis patients and 1,470 healthy controls were included in the study. Levels of antibodies to citrullinated proteins (ACPA) and rheumatoid factors were assessed, and the PCR-SSO method was used for HLA-DRB1 genotyping. RESULTS: The proportions of ACPA positivity among Malay, Chinese and Indian rheumatoid arthritis patients were 62.9%, 65.2% and 68.6%, respectively. An increased frequency of SE alleles was observed in ACPA-positive rheumatoid arthritis in all three Asian ethnic groups. HLA-DRB1*10 was highly associated with rheumatoid arthritis susceptibility in these Asian populations. HLA-DRB1*0405 was significantly associated with susceptibility to rheumatoid arthritis in Malays and Chinese, but not in Indians. HLA-DRB1*01 did not show any independent effect as a risk factor for rheumatoid arthritis in this study, and HLA-DRB1*1202 was protective in Malays and Chinese. There was no association between SE alleles and ACPA-negative rheumatoid arthritis in any of the three Asian ethnic groups. CONCLUSION: The HLA-DRB1 SE alleles increase the risk of ACPA-positive rheumatoid arthritis in all three Asian populations from Malaysia.

  10. Automated analysis of organic particles using cluster SIMS

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Greg; Zeissler, Cindy; Mahoney, Christine; Lindstrom, Abigail; Fletcher, Robert; Chi, Peter; Verkouteren, Jennifer; Bright, David; Lareau, Richard T.; Boldman, Mike

    2004-06-15

    Cluster primary ion bombardment combined with secondary ion imaging is used on an ion microscope secondary ion mass spectrometer for the spatially resolved analysis of organic particles on various surfaces. Compared to the use of monoatomic primary ion beam bombardment, the use of a cluster primary ion beam (SF₅⁺ or C₈⁻) provides significant improvement in molecular ion yields and a reduction in beam-induced degradation of the analyte molecules. These characteristics of cluster bombardment, along with automated sample stage control and custom image analysis software, are utilized to rapidly characterize the spatial distribution of trace explosive particles, narcotics and inkjet-printed microarrays on a variety of surfaces.

  11. Automated detection of microcalcification clusters in mammograms

    Science.gov (United States)

    Karale, Vikrant A.; Mukhopadhyay, Sudipta; Singh, Tulika; Khandelwal, Niranjan; Sadhu, Anup

    2017-03-01

    Mammography is the most efficient modality for detection of breast cancer at an early stage. Microcalcifications are tiny bright spots in mammograms that can often be missed by the radiologist during diagnosis. The presence of microcalcification clusters in mammograms can act as an early sign of breast cancer. This paper presents a completely automated computer-aided detection (CAD) system for detection of microcalcification clusters in mammograms. Unsharp masking is used as a preprocessing step, enhancing the contrast between microcalcifications and the background. The preprocessed image is thresholded, and various shape- and intensity-based features are extracted. A support vector machine (SVM) classifier is used to reduce the false positives while preserving the true microcalcification clusters. The proposed technique is applied to two different databases, i.e. DDSM and a private database. The proposed technique shows good sensitivity with moderate false positives (FPs) per image on both databases.
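    The unsharp-masking preprocessing step amounts to `enhanced = original + amount * (original - blurred)`, which boosts small bright details such as microcalcifications against a smooth background. A minimal sketch, assuming a plain box blur in place of whatever smoothing kernel the paper actually uses, on an illustrative 5×5 image:

```python
def box_blur(img, radius=1):
    """Mean filter over a (2*radius+1)^2 neighbourhood, clipped at the edges."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def unsharp_mask(img, amount=1.0):
    """enhanced = original + amount * (original - blurred)."""
    blurred = box_blur(img)
    return [[img[y][x] + amount * (img[y][x] - blurred[y][x])
             for x in range(len(img[0]))]
            for y in range(len(img))]

# A single bright "microcalcification" on a flat background:
img = [[10.0] * 5 for _ in range(5)]
img[2][2] = 100.0
enhanced = unsharp_mask(img)
```

After enhancement the bright spot stands further above the background while flat regions are left unchanged, which is what makes the subsequent thresholding step effective.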

  12. Abatacept reduces disease activity and ultrasound power Doppler in ACPA-negative undifferentiated arthritis: a proof-of-concept clinical and imaging study.

    Science.gov (United States)

    Buch, Maya H; Hensor, Elizabeth M A; Rakieh, Chadi; Freeston, Jane E; Middleton, Edward; Horton, Sarah; Das, Sudipto; Peterfy, Charles; Tan, Ai Lyn; Wakefield, Richard J; Emery, Paul

    2017-01-01

    No proven treatment exists for ACPA-negative undifferentiated arthritis (UA). The aim of this study was to evaluate whether abatacept is effective in treating poor-prognosis, ACPA-negative UA, including its effect on power Doppler on US (PDUS). A proof-of-concept, open-label, prospective study of 20 patients with DMARD-naïve, ACPA-negative UA (⩾2 joint synovitis) and PDUS ⩾ 1, with clinical and 20-joint US (grey scale/PDUS) assessments at baseline, 6, 12, 18 and 24 months. All patients received 12 months of abatacept (monotherapy for a minimum of the first 6 months). The primary end point was a composite of the proportion of patients that at 6 months achieved DAS44 remission, a maximum of one swollen joint for at least 3 consecutive months and no radiographic progression (over 0-12 months). Twenty of the 23 patients screened were enrolled [14 female; mean (sd) age 53.4 (11.2) years, symptom duration 7.5 (0.9) months]. Two (10%) achieved the composite primary end point. A reduction in the mean (sd) DAS44 was observed from a baseline value of 2.66 (0.77) to 2.01 (0.81) at 6 months and to 1.78 (0.95) at 12 months. The DAS44 remission rates were 6/20 (30%; 95% CI: 15, 51%) at 6 months and 8/20 (40%; 95% CI: 22, 62%) at 12 months. A striking decrease in the median (interquartile range; IQR) total PDUS score was noted from 10 (4-23) at baseline to 3 (2-12) and 3 (0-5) at 6 and 12 months, respectively. This report is a first in potentially identifying an effective therapy, abatacept monotherapy, for poor-prognosis, ACPA-negative UA, supported by a clear reduction in PDUS. These data justify evaluation in a controlled study.

  13. UBO Detector - A cluster-based, fully automated pipeline for extracting white matter hyperintensities.

    Science.gov (United States)

    Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei

    2018-07-01

    We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) highly correlated (r² > 0.9) and steadily increasing WMH volumes over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of the multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline.
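    The k-NN decision step, labelling each candidate cluster as WMH or non-WMH from its features, can be sketched as follows. The two-dimensional features and training points below are invented placeholders; UBO Detector's actual feature set and training data are richer:

```python
from collections import Counter
from math import dist  # Euclidean distance, Python 3.8+

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k nearest neighbours."""
    nearest = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical (intensity, volume) features for labelled candidate clusters:
train = [((0.90, 5.0), "WMH"), ((0.80, 4.0), "WMH"), ((0.85, 6.0), "WMH"),
         ((0.20, 1.0), "non-WMH"), ((0.30, 2.0), "non-WMH"), ((0.25, 1.5), "non-WMH")]
```

A bright, large candidate near the WMH training examples is voted WMH; a dim, small one is rejected as non-WMH.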

  14. Automated three-dimensional morphology-based clustering of human erythrocytes with regular shapes: stomatocytes, discocytes, and echinocytes

    Science.gov (United States)

    Ahmadzadeh, Ezat; Jaferzadeh, Keyvan; Lee, Jieun; Moon, Inkyu

    2017-07-01

    We present unsupervised clustering methods for automatic grouping of human red blood cells (RBCs), extracted from RBC quantitative phase images obtained by digital holographic microscopy, into three RBC clusters with regular shapes: biconcave, stomatocyte, and sphero-echinocyte. We select features related to the RBC profile and morphology, such as RBC average thickness, sphericity coefficient, and mean corpuscular volume, and clustering methods, including density-based spatial clustering of applications with noise (DBSCAN), k-medoids, and k-means, are applied to the set of morphological features. The clustering results of RBCs using a set of three-dimensional features are compared against a set of two-dimensional features. Our experimental results indicate that by utilizing the introduced set of features, two groups of biconcave RBCs and old RBCs (suffering from the sphero-echinocyte process) can be perfectly clustered. In addition, by increasing the number of clusters, the three RBC types can be effectively clustered in an automated unsupervised manner with high accuracy. The performance evaluation of the clustering techniques reveals that they can assist hematologists in further diagnosis.
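    Of the clustering methods listed, k-means is the simplest to sketch. A minimal Lloyd's-algorithm implementation over morphological feature vectors, where the 2-D feature values are illustrative stand-ins rather than real RBC measurements:

```python
from math import dist

def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm; centroids seeded with the first k points
    (real use would prefer a smarter initialisation such as k-means++)."""
    centroids = [tuple(p) for p in points[:k]]
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist(p, centroids[i]))
            clusters[idx].append(p)
        # Update step: recompute each centroid as the cluster mean.
        centroids = [tuple(sum(vals) / len(cl) for vals in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Two well-separated hypothetical feature blobs (e.g. thickness vs sphericity):
points = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
          (5.0, 5.1), (5.2, 4.9), (5.1, 5.0)]
centroids, clusters = kmeans(points, 2)
```

On separable feature sets like this, the algorithm recovers the two groups exactly, mirroring the paper's observation that good features make the biconcave/echinocyte split trivial to cluster.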

  15. A new classification of HLA-DRB1 alleles based on acid-base properties of the amino acids located at positions 13, 70 and 71: impact on ACPA status or structural progression, and meta-analysis of 1235 patients with rheumatoid arthritis from two cohorts (ESPOIR and EAC).

    Science.gov (United States)

    Ruyssen-Witrand, Adeline; van Steenbergen, Hanna W; van Heemst, Jurgen; Gourraud, Pierre-Antoine; Nigon, Delphine; Lukas, Cédric; Miceli-Richard, Corinne; Jamard, Bénédicte; Cambon-Thomsen, Anne; Cantagrel, Alain; Dieudé, Philippe; van der Helm-van Mil, Annette H M; Constantin, Arnaud

    2015-01-01

    To group HLA-DRB1 alleles based on acid-base properties of the amino acids at positions 13, 70 and 71 and analyse their association with the presence of anticitrullinated peptide antibodies (ACPA) and structural progression in two cohorts of early rheumatoid arthritis (RA). Patients with RA from the ESPOIR cohort (n=612) and the EAC cohort (n=624) were genotyped for HLA-DRB1 alleles. The alleles containing the RAA sequence at positions 72-74 were classified into three groups according to the amino acids at positions 13, 70 and 71: BB, encoding basic amino acids at positions 13, 70 and 71; A, encoding acidic amino acids at positions 70 and 71; and BN, encoding either neutral amino acids at position 13 and basic amino acids at positions 70 and 71, or a basic amino acid at position 13 and neutral amino acids at positions 70 and 71. The associations between the different alleles and (1) ACPA presence and (2) structural progression were assessed by χ² test; a meta-analysis was performed on the two cohorts using the Mantel-Haenszel method. After meta-analysis, BB alleles were significantly associated with ACPA presence (OR (95% CI) 4.08 (3.14 to 5.31)) and structural progression (OR (95% CI) 2.33 (1.76 to 3.09)). The A alleles protected significantly against ACPA presence (OR (95% CI) 0.37 (0.28 to 0.50)) and structural progression (OR (95% CI) 0.34 (0.23 to 0.50)). The acid-base classification also separated a third group, BN, with an intermediate risk of ACPA production (OR (95% CI) 1.14 (0.91 to 1.44)) and structural progression (OR (95% CI) 1.01 (0.77 to 1.33)). This new classification establishes a hierarchy of HLA-DRB1 alleles in terms of association with ACPA presence or structural progression in early RA.
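    The Mantel-Haenszel method used for the meta-analysis pools stratum-specific 2×2 tables into a single odds ratio, OR_MH = Σ(aᵢdᵢ/nᵢ) / Σ(bᵢcᵢ/nᵢ). A minimal sketch; the stratum counts below are illustrative, not the ESPOIR/EAC data:

```python
def mantel_haenszel_or(strata):
    """Pooled odds ratio across 2x2 strata. Each stratum is (a, b, c, d):
    a/b = exposed/unexposed cases, c/d = exposed/unexposed controls."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Two illustrative strata (e.g. two cohorts), each with a within-stratum OR of 4:
pooled = mantel_haenszel_or([(10, 5, 4, 8), (20, 10, 6, 12)])
```

When every stratum shares the same odds ratio, the pooled estimate reproduces it exactly; when they differ, the method weights each stratum by its size, which is why it is the standard fixed-effect pooling for cohort meta-analyses like this one.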

  16. Application of FLIA to the evaluation of newly incorporated control panel. 2. Determination of balance of manipulated/automated phase by cluster analysis

    International Nuclear Information System (INIS)

    Sano, Norihide; Takahashi, Ryoichi.

    1996-01-01

    Human reliability in complex systems has been studied to establish safety systems by analyzing operator performance in the control room of a nuclear power plant. In this paper, results of a mathematical model and a questionnaire given to plant designers and operators led to the proposal of a fuzzy tool for evaluating the quality of recent automated control systems. The first report described a method capable of calculating human performance by summing the weighted utility of attributes. The modified fuzzy measures learning identification algorithm (FLIA) reduces a set of attributes until human tasks are represented clearly. A change in the performance is illustrated on a two-dimensional map of the dominant attributes as a function of the automation level. The designers and the operators determined the balance of the manipulated/automated phase on the map after careful individual interviews. In the present paper, we attempt to interpret the boundary with cluster-analysis theory, applying the Euclidean square distance and the nearest-neighbor method. The map can thus be divided by this boundary into the manipulated and automated phases. It is shown that the calculated boundary is the perpendicular bisector between the centers of gravity of the clusters. The analytical boundary agrees precisely with the questionnaire result. (author)
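    The reported equivalence between the phase boundary and the bisector of the cluster centers of gravity follows from nearest-centroid assignment: each point on the map is labelled by the closer center, so the decision boundary is exactly the perpendicular bisector of the two centers. A toy sketch with invented 2-D centers:

```python
from math import dist

def nearest_centroid(point, centroids):
    """Assign a map point to the phase whose cluster center of gravity
    is closest; the implied boundary is the perpendicular bisector."""
    return min(range(len(centroids)), key=lambda i: dist(point, centroids[i]))

# Hypothetical centers of gravity for the manipulated (0) and automated (1)
# phases; their perpendicular bisector is the vertical line x = 2.
centres = [(0.0, 0.0), (4.0, 0.0)]
```

Points just left of x = 2 fall in phase 0 and points just right of it in phase 1, regardless of their vertical position, which is the bisector behaviour the paper derives analytically.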

  17. Taxonomic revision of the Malagasy members of the Nesomyrmex angulatus species group using the automated morphological species delineation protocol NC-PART-clustering

    Directory of Open Access Journals (Sweden)

    Sándor Csősz

    2016-03-01

    Background. Applying quantitative morphological approaches in systematics research is a promising way to discover cryptic biological diversity. Information obtained through twenty-first century science poses new challenges to taxonomy by offering the possibility of increased objectivity in independent and automated hypothesis formation. In recent years a number of promising new algorithmic approaches have been developed to recognize morphological diversity among insects based on multivariate morphometric analyses. These algorithms objectively delimit components in the data by automatically assigning objects into clusters. Method. In this paper, hypotheses on the diversity of the Malagasy Nesomyrmex angulatus group are formulated via a highly automated protocol involving a fusion of two algorithms: (1) Nest Centroid clustering (NC clustering) and (2) Partitioning Algorithm based on Recursive Thresholding (PART). Both algorithms assign samples into clusters, making the class assignment results of the different algorithms readily inferable. The results were tested by confirmatory cross-validated Linear Discriminant Analysis (LOOCV-LDA). Results. Here we reveal the diversity of a unique and largely unexplored fragment of the Malagasy ant fauna using NC-PART-clustering on continuous morphological data, an approach that brings increased objectivity to taxonomy. We describe eight morphologically distinct species, including seven new species: Nesomyrmex angulatus (Mayr, 1862), N. bidentatus sp. n., N. clypeatus sp. n., N. devius sp. n., N. exiguus sp. n., N. fragilis sp. n., N. gracilis sp. n., and N. hirtellus sp. n. An identification key for their worker castes using morphometric data is provided. Conclusions. Combining the dimensionality reduction feature of NC clustering with the assignment of samples into clusters by PART advances the automatization of morphometry-based alpha taxonomy.

  18. The automated analysis of clustering behaviour of piglets from thermal images in response to immune challenge by vaccination.

    Science.gov (United States)

    Cook, N J; Bench, C J; Liu, T; Chabot, B; Schaefer, A L

    2018-01-01

    An automated method of estimating the spatial distribution of piglets within a pen was used to assess huddling behaviour under normal conditions and during a febrile response to vaccination. The automated method was compared with a manual assessment of clustering activity. Huddling behaviour was partly related to environmental conditions and clock time such that more huddling occurred during the night and at lower ambient air temperatures. There were no positive relationships between maximum pig temperatures and environmental conditions, suggesting that the narrow range of air temperatures in this study was not a significant factor for pig temperature. Spatial distribution affected radiated pig temperature measurements by IR thermography. Higher temperatures were recorded in groups of animals displaying huddling behaviour. Huddling behaviour was affected by febrile responses to vaccination with increased huddling occurring 3 to 8 h post-vaccination. The automated method of assessing spatial distribution from an IR image successfully identified periods of huddling associated with a febrile response, and to changing environmental temperatures. Infrared imaging could be used to quantify temperature and behaviour from the same images.
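    One simple way to turn animal positions into a huddling score, hypothetical here since the abstract does not give the paper's exact metric, is the mean nearest-neighbour distance between animal centroids: the lower the value, the tighter the huddle:

```python
from math import dist

def huddling_index(centroids):
    """Mean nearest-neighbour distance between animal centroids;
    lower values indicate tighter spatial clustering (huddling)."""
    nn_dists = [min(dist(p, q) for j, q in enumerate(centroids) if j != i)
                for i, p in enumerate(centroids)]
    return sum(nn_dists) / len(nn_dists)

# Illustrative piglet centroids (image coordinates, arbitrary units):
huddled = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]
spread  = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
```

Computed frame by frame from the thermal images, such an index would drop during the 3 to 8 h post-vaccination window when febrile huddling increases.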

  19. A WEB-BASED SOLUTION TO VISUALIZE OPERATIONAL MONITORING LINUX CLUSTER FOR THE PROTODUNE DATA QUALITY MONITORING CLUSTER

    CERN Document Server

    Mosesane, Badisa

    2017-01-01

    The Neutrino computing cluster, made of 300 Dell PowerEdge 1950 U1 nodes, serves an integral role in the CERN Neutrino Platform (CENF). It represents an effort to foster fundamental research in the field of neutrino physics by providing a data processing facility. The need for data quality monitoring, coupled with automated system configuration and remote monitoring of the cluster, cannot be overemphasized. To achieve these goals, a software stack has been chosen to implement automatic propagation of configurations across all the nodes in the cluster. The bulk of this work discusses the automated configuration management system on this cluster, which enables the fast online data processing and Data Quality Monitoring (DQM) process for the Neutrino Platform cluster (npcmp.cern.ch).

  20. Automated Clustering Analysis of Immunoglobulin Sequences in Chronic Lymphocytic Leukemia Based on 3D Structural Descriptors

    DEFF Research Database (Denmark)

    Marcatili, Paolo; Mochament, Konstantinos; Agathangelidis, Andreas

    2016-01-01

    … the experimental methods to determine it are extremely laborious and demanding. Hence, the ability to gain insight into the structure of Igs at large relies on the availability of tools and algorithms for producing accurate Ig structural models based on their primary sequence alone. These models can then be used to determine … to achieve an optimal solution to this task, yet their results were hindered mainly due to the lack of efficient clustering methods based on the similarity of 3D structure descriptors. Here, we present a novel workflow for robust Ig 3D modeling and automated clustering. In this study, we used the structure prediction tools PIGS and I-TASSER for creating the 3D models and the TM-align algorithm to superpose them. The innovation of the current methodology resides in the usage of methods adapted from 3D content-based search methodologies to determine the local structural … We validated our protocol in chronic …

  1. BioCluster: Tool for Identification and Clustering of Enterobacteriaceae Based on Biochemical Data

    Directory of Open Access Journals (Sweden)

    Ahmed Abdullah

    2015-06-01

    Presumptive identification of different Enterobacteriaceae species is routinely achieved based on biochemical properties. Traditional practice includes manual comparison of each biochemical property of the unknown sample with known reference samples and inference of its identity based on the maximum similarity pattern with the known samples. This process is labor-intensive, time-consuming, error-prone, and subjective. Therefore, automation of sorting and similarity calculation would be advantageous. Here we present a MATLAB-based graphical user interface (GUI) tool named BioCluster. This tool was designed for automated clustering and identification of Enterobacteriaceae based on biochemical test results. In this tool, we used two types of algorithms, i.e., traditional hierarchical clustering (HC) and Improved Hierarchical Clustering (IHC), a modified algorithm that was developed specifically for the clustering and identification of Enterobacteriaceae species. IHC takes into account the variability in the results of 1–47 biochemical tests within the Enterobacteriaceae family. This tool also provides different options to optimize the clustering in a user-friendly way. Using computer-generated synthetic data and some real data, we have demonstrated that BioCluster has high accuracy in clustering and identifying enterobacterial species based on biochemical test data. This tool can be freely downloaded at http://microbialgen.du.ac.bd/biocluster/.
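    Traditional hierarchical clustering (HC) over biochemical test profiles can be sketched as single-linkage merging with a Hamming distance between test-result vectors. The profiles and cutoff below are invented for illustration, and BioCluster's IHC additionally handles variable test results, which this sketch does not:

```python
from itertools import combinations

def hamming(a, b):
    """Number of biochemical tests with differing results."""
    return sum(x != y for x, y in zip(a, b))

def single_link_clusters(profiles, cutoff):
    """Agglomerative single-linkage clustering: repeatedly merge the two
    closest clusters until the closest pair exceeds `cutoff`."""
    clusters = [[i] for i in range(len(profiles))]

    def linkage(c1, c2):
        return min(hamming(profiles[i], profiles[j]) for i in c1 for j in c2)

    while len(clusters) > 1:
        d, i, j = min((linkage(c1, c2), i, j)
                      for (i, c1), (j, c2) in combinations(enumerate(clusters), 2))
        if d > cutoff:
            break
        clusters[i] += clusters.pop(j)  # i < j, so indices stay valid
    return clusters

# Hypothetical 0/1 results of six biochemical tests per isolate:
profiles = [(1, 1, 0, 0, 1, 0),   # species A, isolate 1
            (1, 1, 0, 0, 1, 1),   # species A, isolate 2 (one test differs)
            (0, 0, 1, 1, 0, 1)]   # clearly different species
groups = single_link_clusters(profiles, cutoff=1)
```

With a cutoff of one discordant test, the two near-identical isolates merge into one cluster while the distinct species stays separate.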

  2. Automated classification of mouse pup isolation syllables: from cluster analysis to an Excel based ‘mouse pup syllable classification calculator’

    Directory of Open Access Journals (Sweden)

    Jasmine eGrimsley

    2013-01-01

    Full Text Available Mouse pups vocalize at high rates when they are cold or isolated from the nest. The proportions of each syllable type produced carry information about disease state and are being used as behavioral markers for the internal state of animals. Manual classifications of these vocalizations identified ten syllable types based on their spectro-temporal features. However, manual classification of mouse syllables is time consuming and vulnerable to experimenter bias. This study uses an automated cluster analysis to identify acoustically distinct syllable types produced by CBA/CaJ mouse pups, and then compares the results to prior manual classification methods. The cluster analysis identified two syllable types, based on their frequency bands, that have continuous frequency-time structure, and two syllable types featuring abrupt frequency transitions. Although cluster analysis computed fewer syllable types than manual classification, the clusters represented well the probability distributions of the acoustic features within syllables. These probability distributions indicate that some of the manually classified syllable types are not statistically distinct. The characteristics of the four classified clusters were used to generate a Microsoft Excel-based mouse syllable classifier that rapidly categorizes syllables, with over a 90% match, into the syllable types determined by cluster analysis.

  3. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...

  4. Automatic insertion of simulated microcalcification clusters in a software breast phantom

    Science.gov (United States)

    Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.

    2014-03-01

    An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
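    The placement strategies above are governed by probability distribution functions over the phantom volume. As a hedged sketch (not the authors' code), a directed strategy reduces to sampling a voxel from a pdf restricted to fibroglandular tissue; the toy phantom and tissue labels below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy phantom: 0 = adipose, 1 = fibroglandular (labels are illustrative).
phantom = np.zeros((8, 8, 8), dtype=int)
phantom[2:6, 2:6, 2:6] = 1

def sample_center(phantom, directed=True, rng=rng):
    """Draw one cluster-center voxel.

    Undirected: uniform pdf over the whole volume.
    Directed:   uniform pdf restricted to fibroglandular voxels, mimicking
                the assumption that calcifications arise in epithelial tissue.
    """
    if directed:
        candidates = np.argwhere(phantom == 1)
    else:
        candidates = np.argwhere(np.ones_like(phantom, dtype=bool))
    return tuple(candidates[rng.integers(len(candidates))])

center = sample_center(phantom, directed=True)
print(center)
```

    A histology-informed strategy would simply replace the uniform weighting over candidate voxels with a nonuniform pdf.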

  5. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    Science.gov (United States)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades, further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared-memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
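    A throttling workload manager of the kind described can be approximated by a counting semaphore that caps the number of concurrent I/O-heavy jobs. This minimal Python sketch is illustrative only; the real CATALYST system is implemented in Java and Perl:

```python
import threading
import time

MAX_CONCURRENT_IO = 2          # throttle: at most two I/O-heavy jobs at once
io_slots = threading.Semaphore(MAX_CONCURRENT_IO)
peak = 0
running = 0
lock = threading.Lock()

def run_job(job_id):
    global peak, running
    with io_slots:             # block until an I/O slot is free
        with lock:
            running += 1
            peak = max(peak, running)
        time.sleep(0.05)       # stand-in for staging input granules from disk
        with lock:
            running -= 1

threads = [threading.Thread(target=run_job, args=(i,)) for i in range(6)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("peak concurrent I/O jobs:", peak)
```

    Six jobs are submitted, but the semaphore ensures no more than two ever stage data at the same time, which is the essence of throttling a contended I/O resource.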

  6. Introduction matters: Manipulating trust in automation and reliance in automated driving.

    Science.gov (United States)

    Körber, Moritz; Baseler, Eva; Bengler, Klaus

    2018-01-01

    Trust in automation is a key determinant for the adoption of automated systems and their appropriate use. Therefore, it constitutes an essential research area for the introduction of automated vehicles to road traffic. In this study, we investigated the influence of trust promoting (Trust promoted group) and trust lowering (Trust lowered group) introductory information on reported trust, reliance behavior and take-over performance. Forty participants encountered three situations in a 17-min highway drive in a conditionally automated vehicle (SAE Level 3). Situation 1 and Situation 3 were non-critical situations where a take-over was optional. Situation 2 represented a critical situation where a take-over was necessary to avoid a collision. A non-driving-related task (NDRT) was presented between the situations to record the allocation of visual attention. Participants reporting a higher trust level spent less time looking at the road or instrument cluster and more time looking at the NDRT. The manipulation of introductory information resulted in medium differences in reported trust and influenced participants' reliance behavior. Participants of the Trust promoted group looked less at the road or instrument cluster and more at the NDRT. The odds of participants of the Trust promoted group to overrule the automated driving system in the non-critical situations were 3.65 times (Situation 1) to 5 times (Situation 3) higher. In Situation 2, the Trust promoted group's mean take-over time was extended by 1154 ms and the mean minimum time-to-collision was 933 ms shorter. Six participants from the Trust promoted group compared to no participant of the Trust lowered group collided with the obstacle. The results demonstrate that the individual trust level influences how much drivers monitor the environment while performing an NDRT. Introductory information influences this trust level, reliance on an automated driving system, and if a critical take-over situation can be

  7. Automating the expert consensus paradigm for robust lung tissue classification

    Science.gov (United States)

    Rajagopalan, Srinivasan; Karwoski, Ronald A.; Raghunath, Sushravya; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Clinicians confirm the efficacy of dynamic multidisciplinary interactions in diagnosing lung disease/wellness from CT scans. However, routine clinical practice cannot readily accommodate such interactions. Current schemes for automating lung tissue classification are based on a single, elusive disease-differentiating metric; this undermines their reliability in routine diagnosis. We propose a computational workflow that uses a collection (#: 15) of probability density function (pdf)-based similarity metrics to automatically cluster pattern-specific (#patterns: 5) volumes of interest (#VOI: 976) extracted from the lung CT scans of 14 patients. The resultant clusters are refined for intra-partition compactness and subsequently aggregated into a super cluster using a cluster ensemble technique. The super clusters were validated against the consensus agreement of four clinical experts. The aggregations correlated strongly with expert consensus. By effectively mimicking the expertise of physicians, the proposed workflow could make automation of lung tissue classification a clinical reality.
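    The aggregation step, combining several metric-specific partitions into a super cluster, is commonly done with a co-association (evidence accumulation) matrix; the sketch below assumes that variant of cluster ensembling, with invented partitions of six VOIs:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical labelings of six VOIs by three metric-specific clusterers.
partitions = np.array([
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
])

n = partitions.shape[1]
# Co-association: fraction of partitions placing each pair of VOIs together.
coassoc = np.zeros((n, n))
for labels in partitions:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(partitions)

# Consensus "super clusters": average-linkage on 1 - co-association.
tree = linkage(squareform(1.0 - coassoc, checks=False), method="average")
consensus = fcluster(tree, t=2, criterion="maxclust")
print(consensus)
```

    Pairs that most base partitions agree on stay together in the consensus, which is the mechanism that lets an ensemble outvote any single unreliable metric.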

  8. Reanalysis of 24 Nearby Open Clusters using Gaia data

    Science.gov (United States)

    Yen, Steffi X.; Reffert, Sabine; Röser, Siegfried; Schilbach, Elena; Kharchenko, Nina V.; Piskunov, Anatoly E.

    2018-04-01

    We have developed a fully automated cluster characterization pipeline, which simultaneously determines cluster membership and fits the fundamental cluster parameters: distance, reddening, and age. We present results for 24 established clusters and compare them to literature values. Given the large amount of stellar data for clusters available from Gaia DR2 in 2018, this pipeline will be beneficial to analyzing the parameters of open clusters in our Galaxy.

  9. Dockomatic - automated ligand creation and docking.

    Science.gov (United States)

    Bullock, Casey W; Jacob, Reed B; McDougal, Owen M; Hampikian, Greg; Andersen, Tim

    2010-11-08

    The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to setup and run jobs, and collect results. This paper presents DockoMatic, a user friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high throughput screening of ligand to receptor interactions. DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  10. Dockomatic - automated ligand creation and docking

    Directory of Open Access Journals (Sweden)

    Hampikian Greg

    2010-11-01

    Full Text Available Background: The application of computational modeling to rationally design drugs and characterize macro biomolecular receptors has proven increasingly useful due to the accessibility of computing clusters and clouds. AutoDock is a well-known and powerful software program used to model ligand to receptor binding interactions. In its current version, AutoDock requires significant amounts of user time to set up and run jobs, and collect results. This paper presents DockoMatic, a user-friendly Graphical User Interface (GUI) application that eases and automates the creation and management of AutoDock jobs for high-throughput screening of ligand to receptor interactions. Results: DockoMatic allows the user to invoke and manage AutoDock jobs on a single computer or cluster, including jobs for evaluating secondary ligand interactions. It also automates the process of collecting, summarizing, and viewing results. In addition, DockoMatic automates creation of peptide ligand .pdb files from strings of single-letter amino acid abbreviations. Conclusions: DockoMatic significantly reduces the complexity of managing multiple AutoDock jobs by facilitating ligand and AutoDock job creation and management.

  11. Misty Mountain clustering: application to fast unsupervised flow cytometry gating

    Directory of Open Access Journals (Sweden)

    Sealfon Stuart C

    2010-10-01

    Full Text Available Background: There are many important clustering questions in computational biology for which no satisfactory method exists. Automated clustering algorithms, when applied to large, multidimensional datasets, such as flow cytometry data, prove unsatisfactory in terms of speed, problems with local minima or cluster shape bias. Model-based approaches are restricted by the assumptions of the fitting functions. Furthermore, model-based clustering requires serial clustering for all cluster numbers within a user-defined interval. The final cluster number is then selected by various criteria. These supervised serial clustering methods are time consuming and frequently different criteria result in different optimal cluster numbers. Various unsupervised heuristic approaches that have been developed, such as affinity propagation, are too expensive to be applied to datasets on the order of 10^6 points that are often generated by high-throughput experiments. Results: To circumvent these limitations, we developed a new, unsupervised density contour clustering algorithm, called Misty Mountain, that is based on percolation theory and that efficiently analyzes large data sets. The approach can be envisioned as a progressive top-down removal of clouds covering a data histogram relief map to identify clusters by the appearance of statistically distinct peaks and ridges. This is a parallel clustering method that finds every cluster after analyzing the cross sections of the histogram only once. The overall run time for the composite steps of the algorithm increases linearly with the number of data points. The clustering of 10^6 data points in 2D data space takes place within about 15 seconds on a standard laptop PC. Comparison of the performance of this algorithm with other state-of-the-art automated flow cytometry gating methods indicates that Misty Mountain provides substantial improvements in both run time and in the accuracy of cluster assignment. Conclusions
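    Misty Mountain itself analyzes histogram cross sections via percolation theory; a loose analogy, shown below, is to keep statistically dense histogram bins and take their connected components as clusters. The synthetic two-population data and the density threshold are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)

# Two synthetic 2D "populations", as in flow cytometry gating.
a = rng.normal(loc=(-3, -3), scale=0.5, size=(500, 2))
b = rng.normal(loc=(3, 3), scale=0.5, size=(500, 2))
data = np.vstack([a, b])

# Histogram the data and keep only bins dense enough to be above noise.
hist, xe, ye = np.histogram2d(data[:, 0], data[:, 1], bins=40)
dense = hist > 4

# Clusters = connected components of dense bins (a crude stand-in for the
# progressive cross-section analysis of the histogram relief map).
labels, n_clusters = ndimage.label(dense)
print("clusters found:", n_clusters)
```

    Because density is estimated once on a fixed histogram, the cost grows linearly with the number of points, which is the property the abstract emphasizes.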

  12. Automated clustering procedure for TJ-II experimental signals

    International Nuclear Information System (INIS)

    Duro, N.; Vega, J.; Dormido, R.; Farias, G.; Dormido-Canto, S.; Sanchez, J.; Santos, M.; Pajares, G.

    2006-01-01

    Databases in fusion experiments are made up of thousands of signals. For this reason, data analysis must be simplified by developing automatic mechanisms for fast search and retrieval of specific data in the waveform database. In particular, a method for finding similar waveforms would be very helpful. The term 'similar' implies the use of proximity measurements in order to quantify how close two signals are. In this way, it would be possible to define several categories (clusters) and to classify the waveforms according to them, where this classification can be a starting point for exploratory data analysis in large databases. The clustering process is divided into two stages. The first one is feature extraction, i.e., choosing the set of properties that encode as much information as possible about a signal. The second one establishes the number of clusters according to a proximity measure.

  13. Tidal radii of the globular clusters M 5, M 12, M 13, M 15, M 53, NGC 5053 and NGC 5466 from automated star counts.

    Science.gov (United States)

    Lehmann, I.; Scholz, R.-D.

    1997-04-01

    We present new tidal radii for seven Galactic globular clusters using the method of automated star counts on Schmidt plates of the Tautenburg, Palomar and UK telescopes. The plates were fully scanned with the APM system in Cambridge (UK). Special attention was paid to reliable background subtraction and to correcting crowding effects in the central cluster region. For the latter we used a new kind of crowding correction based on a statistical approach to the distribution of stellar images and the luminosity function of the cluster stars in the uncrowded area. The star counts were correlated with surface brightness profiles of different authors to obtain complete projected density profiles of the globular clusters. Fitting an empirical density law (King 1962) we derived the following structural parameters: tidal radius r_t_, core radius r_c_ and concentration parameter c. In the cases of NGC 5466, M 5, M 12, M 13 and M 15 we found an indication of a tidal tail around these objects (cf. Grillmair et al. 1995).

  14. A data-driven approach to estimating the number of clusters in hierarchical clustering [version 1; referees: 2 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Antoine E. Zambelli

    2016-12-01

    Full Text Available DNA microarray and gene expression problems often require a researcher to perform clustering on their data in a bid to better understand its structure. In cases where the number of clusters is not known, one can resort to hierarchical clustering methods. However, there currently exist very few automated algorithms for determining the true number of clusters in the data. We propose two new methods (mode and maximum difference) for estimating the number of clusters in a hierarchical clustering framework to create a fully automated process with no human intervention. These methods are compared to the established elbow and gap statistic algorithms using simulated datasets and the Biobase Gene ExpressionSet. We also explore a data mixing procedure inspired by cross-validation techniques. We find that the overall performance of the maximum difference method is comparable to or greater than that of the gap statistic in multi-cluster scenarios, and achieves that performance at a fraction of the computational cost. This method also responds well to our mixing procedure, which opens the door to future research. We conclude that both the mode and maximum difference methods warrant further study related to their mixing and cross-validation potential. We particularly recommend the use of the maximum difference method in multi-cluster scenarios given its accuracy and execution times, and present it as an alternative to existing algorithms.
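    One plausible reading of the maximum difference method (the paper's exact formulation may differ) is to cut the dendrogram at the largest jump between consecutive merge heights:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage

rng = np.random.default_rng(5)

# Three well-separated 2D blobs; the "true" number of clusters is 3.
data = np.vstack([
    rng.normal((0, 0), 0.2, (30, 2)),
    rng.normal((5, 0), 0.2, (30, 2)),
    rng.normal((0, 5), 0.2, (30, 2)),
])

tree = linkage(data, method="ward")
heights = tree[:, 2]              # merge heights, in ascending order

# Maximum-difference heuristic: the largest gap between consecutive merge
# heights marks the cut; cutting below that gap leaves k clusters.
jumps = np.diff(heights)
cut_index = np.argmax(jumps)
k = len(heights) - cut_index      # clusters remaining below the jump
print("estimated clusters:", k)
```

    Unlike the gap statistic, this needs no reference distribution or repeated resampling, which is why such dendrogram-gap methods run at a fraction of the cost.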

  15. Semi-Supervised Clustering for High-Dimensional and Sparse Features

    Science.gov (United States)

    Yan, Su

    2010-01-01

    Clustering is one of the most common data mining tasks, used frequently for data organization and analysis in various application domains. Traditional machine learning approaches to clustering are fully automated and unsupervised where class labels are unknown a priori. In real application domains, however, some "weak" form of side…

  16. Fast semi-automated lesion demarcation in stroke

    Directory of Open Access Journals (Sweden)

    Bianca de Haan

    2015-01-01

    Full Text Available Lesion–behaviour mapping analyses require the demarcation of the brain lesion on each (usually transverse) slice of the individual stroke patient's brain image. To date, this is generally thought to be most precise when done manually, which is, however, both time-consuming and potentially observer-dependent. Fully automated lesion demarcation methods have been developed to address these issues, but these are often not practicable in acute stroke research, where for each patient only a single image modality is available and the available image modality differs over patients. In the current study, we evaluated a semi-automated lesion demarcation approach, the so-called Clusterize algorithm, in acute stroke patients scanned in a range of common image modalities. Our results suggest that, compared to the standard of manual lesion demarcation, the semi-automated Clusterize algorithm is capable of significantly speeding up lesion demarcation in the most commonly used image modalities, without loss of either lesion demarcation precision or lesion demarcation reproducibility. For the three investigated acute datasets (CT, DWI, T2FLAIR), containing a total of 44 patient images obtained in a regular clinical setting at patient admission, the reduction in processing time was on average 17.8 min per patient and this advantage increased with increasing lesion volume (up to 60 min per patient for the largest lesion volumes in our datasets). Additionally, our results suggest that performance of the Clusterize algorithm in a chronic dataset with 11 T1 images was comparable to its performance in the acute datasets. We thus advocate the use of the Clusterize algorithm, integrated into a simple, freely available SPM toolbox, for the precise, reliable and fast preparation of imaging data for lesion–behaviour mapping analyses.

  17. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. The Impact of Automated Notification on Follow-up of Actionable Tests Pending at Discharge: a Cluster-Randomized Controlled Trial.

    Science.gov (United States)

    Dalal, Anuj K; Schaffer, Adam; Gershanik, Esteban F; Papanna, Ranganath; Eibensteiner, Katyuska; Nolido, Nyryan V; Yoon, Cathy S; Williams, Deborah; Lipsitz, Stuart R; Roy, Christopher L; Schnipper, Jeffrey L

    2018-03-12

    Follow-up of tests pending at discharge (TPADs) is poor. We previously demonstrated a twofold increase in awareness of any TPAD by attendings and primary care physicians (PCPs) using an automated email intervention. OBJECTIVE: To determine whether automated notification improves documented follow-up for actionable TPADs. DESIGN: Cluster-randomized controlled trial. SUBJECTS: Attendings and PCPs caring for adult patients discharged from general medicine and cardiology services with at least one actionable TPAD between June 2011 and May 2012. INTERVENTION: An automated system that notifies discharging attendings and network PCPs of finalized TPADs by email. MAIN MEASURES: The primary outcome was the proportion of actionable TPADs with documented action determined by independent physician review of the electronic health record (EHR). Secondary outcomes included documented acknowledgment, 30-day readmissions, and adjusted median days to documented follow-up. Of the 3378 TPADs sampled, 253 (7.5%) were determined to be actionable by physician review. Of these, 150 (123 patients discharged by 53 attendings) and 103 (90 patients discharged by 44 attendings) were assigned to intervention and usual care groups, respectively, and underwent chart review. The proportion of actionable TPADs with documented action was 60.7 vs. 56.3% (p = 0.82) in the intervention vs. usual care groups, similar for documented acknowledgment. The proportion of patients with actionable TPADs readmitted within 30 days was 22.8 vs. 31.1% in the intervention vs. usual care groups (p = 0.24). The adjusted median days [95% CI] to documented action was 9 [6.2, 11.8] vs. 14 [10.2, 17.8] (p = 0.04) in the intervention vs. usual care groups, similar for documented acknowledgment. In sub-group analysis, the intervention had greater impact on documented action for patients with network PCPs compared with usual care (70 vs. 50%, p = 0.03). Automated notification of actionable TPADs shortened time to

  19. Adoption of automated livestock production systems in Northern Europe

    DEFF Research Database (Denmark)

    Pedersen, Søren Marcus; Lind, Kim Martin Hjorth

    2014-01-01

    In the last decades the development of automated systems in livestock production has gained increasing interest among farmers. A combined use of computers and sensor systems has led the development into new research areas with automated milking systems, grain drying systems and automated feeding systems. The aim of this paper is to present the results of a farm survey and cluster analysis that have been made among 4 countries in Europe. This study is based on replies from 413 respondents in Germany (eastern part), Greece, Finland and Denmark, respectively, and the study comments... on the relationship and adoption patterns among these countries. The paper presents the results of the surveyed population, demography, farm structure with livestock production characteristics and farmers' use of selected automated systems in livestock production.

  20. Vertebra identification using template matching model and K-means clustering.

    Science.gov (United States)

    Larhmam, Mohamed Amine; Benjelloun, Mohammed; Mahmoudi, Saïd

    2014-03-01

    Accurate vertebra detection and segmentation are essential steps for automating the diagnosis of spinal disorders. This study is dedicated to vertebra alignment measurement, the first step in a computer-aided diagnosis tool for cervical spine trauma. Automated vertebral segment alignment determination is a challenging task due to low contrast imaging and noise. A software tool for segmenting vertebrae and detecting subluxations has clinical significance. A robust method was developed and tested for cervical vertebra identification and segmentation that extracts parameters used for vertebra alignment measurement. Our contribution involves a novel combination of a template matching method and an unsupervised clustering algorithm. In this method, we build a geometric vertebra mean model. To achieve vertebra detection, manual selection of the region of interest is performed initially on the input image. Subsequent preprocessing is done to enhance image contrast and detect edges. Candidate vertebra localization is then carried out by using a modified generalized Hough transform (GHT). Next, an adapted cost function is used to compute local voted centers and filter boundary data. Thereafter, a K-means clustering algorithm is applied to obtain the cluster distribution corresponding to the targeted vertebrae. These clusters are combined with the vote parameters to detect vertebra centers. Rigid segmentation is then carried out by using GHT parameters. Finally, cervical spine curves are extracted to measure vertebra alignment. The proposed approach was successfully applied to a set of 66 high-resolution X-ray images. Robust detection was achieved in 97.5% of the 330 tested cervical vertebrae. An automated vertebral identification method was developed and demonstrated to be robust to noise and occlusion. This work presents a first step toward an automated computer-aided diagnosis system for cervical spine trauma detection.
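    The K-means stage can be sketched in isolation: given noisy GHT vote locations (simulated below, since real votes come from image processing), clustering recovers one center per vertebra:

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(6)

# Hypothetical GHT "votes": noisy 2D candidate centers around five cervical
# vertebrae stacked along the spine axis.
true_centers = np.array([[50.0, 40.0 + 25.0 * i] for i in range(5)])
votes = np.vstack([c + rng.normal(0, 2.0, (40, 2)) for c in true_centers])

# K-means groups the votes; each cluster mean is one detected vertebra center.
means, labels = kmeans2(votes, k=5, seed=7, minit="++")
detected = means[np.argsort(means[:, 1])]   # sort along the spine axis
print(detected.round(1))
```

    Sorting the detected centers along the spine axis yields the ordered chain of vertebrae from which alignment curves can then be extracted.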

  1. NLRP1, PTPN22 and PADI4 gene polymorphisms and rheumatoid arthritis in ACPA-positive Singaporean Chinese.

    Science.gov (United States)

    Goh, Liuh Ling; Yong, Mei Yun; See, Wei Qiang; Chee, Edward Yu Wing; Lim, Pei Qi; Koh, Ee Tzun; Leong, Khai Pang

    2017-08-01

    Studies have shown that the genetic risk factors for rheumatoid arthritis (RA) differ substantially between Asian and Caucasian populations. Even among Asian populations, the genetic contributions of NLRP1, PTPN22 and PADI4 have been controversial. Consequently, we sought to address these separate findings and determine whether any of these proposed risk variants are associated with RA susceptibility, onset, DAS activity and erosion in a Singaporean Chinese cohort. We genotyped five SNPs within NLRP1 (rs878329 and rs6502867), PTPN22 (rs2488457 and rs6665194), and PADI4 (rs2240340) in 500 anti-cyclic citrullinated peptide antibody (ACPA)-positive patients with RA and 500 healthy controls using TaqMan assays. The CC genotype of NLRP1 rs878329 and TT genotype of PADI4 rs2240340 were associated with RA susceptibility. The risk association of the T allele of PADI4 rs2240340 with RA was confirmed through a meta-analysis based on previous reports in Asian populations. The GG genotype of PTPN22 rs6665194 (-3508A>G) was associated with significantly reduced risk of RA. No significant association was found for NLRP1 rs6502867 T/C and PTPN22 rs2488457 G/C polymorphisms. None of the five SNPs was associated with RA's clinical features. This work supports the association of the T allele of PADI4 rs2240340 with RA in Asians. The roles of NLRP1 rs878329 G/C and PTPN22 rs6665194 A/G polymorphisms were demonstrated for the first time. We also propose rs6665194 to be a promising candidate for RA risk evaluation between ethnicities.

  2. Automated flow cytometric analysis across large numbers of samples and cell types.

    Science.gov (United States)

    Chen, Xiaoyi; Hasan, Milena; Libri, Valentina; Urrutia, Alejandra; Beitz, Benoît; Rouilly, Vincent; Duffy, Darragh; Patin, Étienne; Chalmond, Bernard; Rogge, Lars; Quintana-Murci, Lluis; Albert, Matthew L; Schwikowski, Benno

    2015-04-01

    Multi-parametric flow cytometry is a key technology for characterization of immune cell phenotypes. However, robust high-dimensional post-analytic strategies for automated data analysis in large numbers of donors are still lacking. Here, we report a computational pipeline, called FlowGM, which minimizes operator input, is insensitive to compensation settings, and can be adapted to different analytic panels. A Gaussian Mixture Model (GMM)-based approach was utilized for initial clustering, with the number of clusters determined using Bayesian Information Criterion. Meta-clustering in a reference donor permitted automated identification of 24 cell types across four panels. Cluster labels were integrated into FCS files, thus permitting comparisons to manual gating. Cell numbers and coefficient of variation (CV) were similar between FlowGM and conventional gating for lymphocyte populations, but notably FlowGM provided improved discrimination of "hard-to-gate" monocyte and dendritic cell (DC) subsets. FlowGM thus provides rapid high-dimensional analysis of cell phenotypes and is amenable to cohort studies. Copyright © 2015. Published by Elsevier Inc.
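    The core of the FlowGM-style pipeline, GMM fitting with the component count chosen by the Bayesian Information Criterion, can be sketched with scikit-learn; the two-marker synthetic data below stand in for real cytometry intensities:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(8)

# Synthetic 2D "marker intensities" drawn from three cell populations.
data = np.vstack([
    rng.normal((0, 0), 0.3, (300, 2)),
    rng.normal((4, 0), 0.3, (300, 2)),
    rng.normal((0, 4), 0.3, (300, 2)),
])

# Model selection: fit GMMs with increasing component counts and keep the
# one minimizing the Bayesian Information Criterion.
bics = {}
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(data)
    bics[k] = gmm.bic(data)

best_k = min(bics, key=bics.get)
print("BIC-selected number of clusters:", best_k)
```

    Because the mixture model is probabilistic, the same fitted components can then be matched across donors (meta-clustering) and written back as cluster labels, which is how FlowGM replaces manual gating.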

  3. Scientific Cluster Deployment and Recovery - Using puppet to simplify cluster management

    Science.gov (United States)

    Hendrix, Val; Benjamin, Doug; Yao, Yushu

    2012-12-01

    Deployment, maintenance and recovery of a scientific cluster, which has complex, specialized services, can be a time consuming task requiring the assistance of Linux system administrators, network engineers as well as domain experts. Universities and small institutions that have a part-time FTE with limited time for and knowledge of the administration of such clusters can be strained by such maintenance tasks. This current work is the result of an effort to maintain a data analysis cluster (DAC) with minimal effort by a local system administrator. The realized benefit is the scientist, who is the local system administrator, is able to focus on the data analysis instead of the intricacies of managing a cluster. Our work provides a cluster deployment and recovery process (CDRP) based on the puppet configuration engine allowing a part-time FTE to easily deploy and recover entire clusters with minimal effort. Puppet is a configuration management system (CMS) used widely in computing centers for the automatic management of resources. Domain experts use Puppet's declarative language to define reusable modules for service configuration and deployment. Our CDRP has three actors: domain experts, a cluster designer and a cluster manager. The domain experts first write the puppet modules for the cluster services. A cluster designer would then define a cluster. This includes the creation of cluster roles, mapping the services to those roles and determining the relationships between the services. Finally, a cluster manager would acquire the resources (machines, networking), enter the cluster input parameters (hostnames, IP addresses) and automatically generate deployment scripts used by puppet to configure it to act as a designated role. In the event of a machine failure, the originally generated deployment scripts along with puppet can be used to easily reconfigure a new machine. 
The cluster definition produced in our CDRP is an integral part of automating cluster deployment.

  4. AUTOMATED UNSUPERVISED CLASSIFICATION OF THE SLOAN DIGITAL SKY SURVEY STELLAR SPECTRA USING k-MEANS CLUSTERING

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Almeida, J.; Allende Prieto, C., E-mail: jos@iac.es, E-mail: callende@iac.es [Instituto de Astrofisica de Canarias, E-38205 La Laguna, Tenerife (Spain)

    2013-01-20

    Large spectroscopic surveys require automated methods of analysis. This paper explores the use of k-means clustering as a tool for automated unsupervised classification of massive stellar spectral catalogs. The classification criteria are defined by the data and the algorithm, with no prior physical framework. We work with a representative set of stellar spectra associated with the Sloan Digital Sky Survey (SDSS) SEGUE and SEGUE-2 programs, which consists of 173,390 spectra from 3800 to 9200 Å sampled on 3849 wavelengths. We classify the original spectra as well as the spectra with the continuum removed. The second set only contains spectral lines, and it is less dependent on uncertainties of the flux calibration. The classification of the spectra with continuum renders 16 major classes. Roughly speaking, stars are split according to their colors, with enough finesse to distinguish dwarfs from giants of the same effective temperature, but with difficulty separating stars with different metallicities. There are classes corresponding to particular MK types, intrinsically blue stars, dust-reddened stars, stellar systems, and also classes collecting faulty spectra. Overall, there is no one-to-one correspondence between the classes we derive and the MK types. The classification of spectra without continuum renders 13 classes; the color separation is not so sharp, but it distinguishes stars of the same effective temperature and different metallicities. Some classes thus obtained present a fairly small range of physical parameters (200 K in effective temperature, 0.25 dex in surface gravity, and 0.35 dex in metallicity), so that the classification can be used to estimate the main physical parameters of some stars at a minimum computational cost. We also analyze the outliers of the classification. Most of them turn out to be failures of the reduction pipeline, but there are also high redshift QSOs, multiple stellar systems, dust-reddened stars, galaxies, and, finally, odd
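
The clustering loop such a survey classification relies on can be sketched in a few lines. The snippet below is an illustrative toy, not the authors' pipeline: it clusters tiny synthetic "spectra" with a plain k-means, and the deterministic farthest-point seeding is an assumption added here for reproducibility.

```python
import numpy as np

def init_centroids(X, k):
    # farthest-point ("maximin") seeding: deterministic, spreads seeds apart
    cents = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(axis=1) for c in cents], axis=0)
        cents.append(X[d.argmax()])
    return np.array(cents)

def kmeans(X, k, iters=100):
    """Plain k-means: assign each spectrum to its nearest centroid,
    recompute centroids, repeat until the assignment stops changing."""
    centroids = init_centroids(X, k)
    for _ in range(iters):
        # squared Euclidean distance from every spectrum to every centroid
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels

# toy "spectra": a blue-sloped and a red-sloped template plus noise
rng = np.random.default_rng(1)
blue = np.linspace(1, 0, 30) + rng.normal(0, 0.05, size=(20, 30))
red = np.linspace(0, 1, 30) + rng.normal(0, 0.05, size=(20, 30))
labels = kmeans(np.vstack([blue, red]), k=2)
```

On real catalogs the interesting work starts afterward: inspecting each class's mean spectrum and the outliers, exactly as the abstract describes.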

  5. DAFi: A directed recursive data filtering and clustering approach for improving and interpreting data clustering identification of cell populations from polychromatic flow cytometry data.

    Science.gov (United States)

    Lee, Alexandra J; Chang, Ivan; Burel, Julie G; Lindestam Arlehamn, Cecilia S; Mandava, Aishwarya; Weiskopf, Daniela; Peters, Bjoern; Sette, Alessandro; Scheuermann, Richard H; Qian, Yu

    2018-04-17

    Computational methods for identification of cell populations from polychromatic flow cytometry data are changing the paradigm of cytometry bioinformatics. Data clustering is the most common computational approach to unsupervised identification of cell populations from multidimensional cytometry data. However, interpretation of the identified data clusters is labor-intensive. Certain types of user-defined cell populations are also difficult to identify by fully automated data clustering analysis. Both are roadblocks before a cytometry lab can adopt the data clustering approach for cell population identification in routine use. We found that combining recursive data filtering and clustering with constraints converted from the user manual gating strategy can effectively address these two issues. We named this new approach DAFi: Directed Automated Filtering and Identification of cell populations. Design of DAFi preserves the data-driven characteristics of unsupervised clustering for identifying novel cell subsets, but also makes the results interpretable to experimental scientists through mapping and merging the multidimensional data clusters into the user-defined two-dimensional gating hierarchy. The recursive data filtering process in DAFi helped identify small data clusters which are otherwise difficult to resolve by a single run of the data clustering method due to the statistical interference of the irrelevant major clusters. Our experiment results showed that the proportions of the cell populations identified by DAFi, while being consistent with those by expert centralized manual gating, have smaller technical variances across samples than those from individual manual gating analysis and the nonrecursive data clustering analysis. Compared with manual gating segregation, DAFi-identified cell populations avoided the abrupt cut-offs on the boundaries. DAFi has been implemented to be used with multiple data clustering methods including K-means, FLOCK, FlowSOM, and

  6. Automating proliferation rate estimation from Ki-67 histology images

    Science.gov (United States)

    Al-Lahham, Heba Z.; Alomari, Raja S.; Hiary, Hazem; Chaudhary, Vipin

    2012-03-01

    Breast cancer is the second leading cause of cancer death in women and the most frequently diagnosed female cancer in the US. Proliferation rate estimation (PRE) is one of the prognostic indicators that guide treatment protocols, and it is clinically performed from Ki-67 histopathology images. Automating PRE substantially increases the efficiency of pathologists. Moreover, presenting a deterministic and reproducible proliferation rate value is crucial to reduce inter-observer variability. To that end, we propose a fully automated CAD system for PRE from Ki-67 histopathology images. This CAD system is based on a model of three steps: image pre-processing, image clustering, and nuclei segmentation and counting, finally followed by PRE. The first step is based on customized color modification and color-space transformation. Then, image pixels are clustered by K-means depending on the features extracted from the images derived from the first step. Finally, nuclei are segmented and counted using global thresholding, mathematical morphology and connected component analysis. Our experimental results on fifty Ki-67-stained histopathology images show a significant agreement between our CAD's automated PRE and the gold standard's, where the latter is an average of two observers' estimates. The paired t-test, for the automated and manual estimates, shows p = 0.86, 0.45, 0.8 for the brown nuclei count, blue nuclei count, and proliferation rate, respectively. Thus, our proposed CAD system is as reliable as the pathologist estimating the proliferation rate. Yet, its estimate is reproducible.
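
The "global thresholding + connected component analysis" counting step at the end of such a pipeline is easy to illustrate. The snippet below is a simplified sketch, not the paper's CAD system: a 4-connected blob count on a thresholded toy image, with the threshold, the pixel values, and the blue-nuclei count all invented for the example.

```python
from collections import deque

def count_components(mask):
    """Count 4-connected foreground blobs in a binary image --
    the connected-component step used to count nuclei."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                blobs += 1
                q = deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return blobs

# toy "brown channel" image: global threshold, then count blobs
img = [[0, 9, 9, 0, 0, 8],
       [0, 9, 0, 0, 0, 8],
       [0, 0, 0, 7, 0, 0],
       [6, 0, 0, 7, 7, 0]]
mask = [[px > 5 for px in row] for row in img]   # threshold of 5 is arbitrary
brown = count_components(mask)
rate = brown / (brown + 6)   # hypothetical: 6 blue (Ki-67-negative) nuclei counted elsewhere
```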

  7. Scientific Cluster Deployment and Recovery – Using puppet to simplify cluster management

    International Nuclear Information System (INIS)

    Hendrix, Val; Yao Yushu; Benjamin, Doug

    2012-01-01

    Deployment, maintenance and recovery of a scientific cluster, which has complex, specialized services, can be a time consuming task requiring the assistance of Linux system administrators, network engineers as well as domain experts. Universities and small institutions that have a part-time FTE with limited time for and knowledge of the administration of such clusters can be strained by such maintenance tasks. This current work is the result of an effort to maintain a data analysis cluster (DAC) with minimal effort by a local system administrator. The realized benefit is the scientist, who is the local system administrator, is able to focus on the data analysis instead of the intricacies of managing a cluster. Our work provides a cluster deployment and recovery process (CDRP) based on the puppet configuration engine allowing a part-time FTE to easily deploy and recover entire clusters with minimal effort. Puppet is a configuration management system (CMS) used widely in computing centers for the automatic management of resources. Domain experts use Puppet's declarative language to define reusable modules for service configuration and deployment. Our CDRP has three actors: domain experts, a cluster designer and a cluster manager. The domain experts first write the puppet modules for the cluster services. A cluster designer would then define a cluster. This includes the creation of cluster roles, mapping the services to those roles and determining the relationships between the services. Finally, a cluster manager would acquire the resources (machines, networking), enter the cluster input parameters (hostnames, IP addresses) and automatically generate deployment scripts used by puppet to configure it to act as a designated role. In the event of a machine failure, the originally generated deployment scripts along with puppet can be used to easily reconfigure a new machine. 
The cluster definition produced in our CDRP is an integral part of automating cluster deployment.

  8. AUTOMATION OF REMEDY TICKETS CATEGORIZATION USING BUSINESS INTELLIGENCE TOOLS

    OpenAIRE

    DR. M RAJASEKHARA BABU; ANKITA TIWARI

    2012-01-01

    The work log of an issue is often the primary source of information for predicting the cause. Mining patterns from work log is an important issue management task. This paper aims at developing an application which categorizes the issues into problem areas using a clustering algorithm. This algorithm helps one to cluster the issues by mining patterns from the work log files. Standard reports can be generated for the root cause analysis. The whole process is automated using Business Intelligenc...

  9. Towards automating the discovery of certain innovative design principles through a clustering-based optimization technique

    Science.gov (United States)

    Bandaru, Sunith; Deb, Kalyanmoy

    2011-09-01

    In this article, a methodology is proposed for automatically extracting innovative design principles which make a system or process (subject to conflicting objectives) optimal using its Pareto-optimal dataset. Such 'higher knowledge' would not only help designers to execute the system better, but also enable them to predict how changes in one variable would affect other variables if the system has to retain its optimal behaviour. This in turn would help solve other similar systems with different parameter settings easily without the need to perform a fresh optimization task. The proposed methodology uses a clustering-based optimization technique and is capable of discovering hidden functional relationships between the variables, objective and constraint functions and any other function that the designer wishes to include as a 'basis function'. A number of engineering design problems are considered for which the mathematical structure of these explicit relationships exists and has been revealed by a previous study. A comparison with the multivariate adaptive regression splines (MARS) approach reveals the practicality of the proposed approach due to its ability to find meaningful design principles. The success of this procedure for automated innovization is highly encouraging and indicates its suitability for further development in tackling more complex design scenarios.
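
To give the flavor of extracting a design principle from Pareto-optimal data, here is a deliberately simplified sketch (ordinary log-log least squares rather than the article's clustering-based optimization): it recovers a hidden power-law relation x2 = c * x1^b from synthetic Pareto points.

```python
import math

# synthetic Pareto-optimal points obeying a hidden rule x2 = c * x1^b
b_true, c_true = -0.5, 2.0
xs = [0.5, 1.0, 2.0, 4.0, 8.0]
ys = [c_true * x ** b_true for x in xs]

# closed-form least squares on log y = log c + b * log x
lx = [math.log(x) for x in xs]
ly = [math.log(y) for y in ys]
n = len(xs)
mx, my = sum(lx) / n, sum(ly) / n
b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / sum((u - mx) ** 2 for u in lx)
c = math.exp(my - b * mx)
# b and c now recover the hidden exponent and coefficient
```

The fitted (b, c) pair is exactly the kind of "higher knowledge" the article seeks: a rule a designer can reuse to stay near the Pareto front when parameters change.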

  10. Significant association of periodontal disease with anti-citrullinated peptide antibody in a Japanese healthy population - The Nagahama study.

    Science.gov (United States)

    Terao, Chikashi; Asai, Keita; Hashimoto, Motomu; Yamazaki, Toru; Ohmura, Koichiro; Yamaguchi, Akihiko; Takahashi, Katsu; Takei, Noriko; Ishii, Takanori; Kawaguchi, Takahisa; Tabara, Yasuharu; Takahashi, Meiko; Nakayama, Takeo; Kosugi, Shinji; Sekine, Akihiro; Fujii, Takao; Yamada, Ryo; Mimori, Tsuneyo; Matsuda, Fumihiko; Bessho, Kazuhisa

    2015-05-01

    Anti-citrullinated peptide antibody (ACPA) is a highly specific autoantibody for rheumatoid arthritis (RA). Recent studies have revealed that periodontal disease (PD) is closely associated with RA and with production of ACPA in RA. Analyses of associations between PD and ACPA production in a healthy population may deepen our understanding. Here, we analyzed a total of 9554 adult healthy subjects. ACPA and IgM-rheumatoid factor (RF) were quantified, and PD status was evaluated using the number of missing teeth (MT), the Community Periodontal Index (CPI) and Loss of Attachment (LA) for these subjects. PD status was analyzed for its association with the positivity and categorical levels of ACPA and RF, conditioned on covariates which were shown to be associated with PD, ACPA or RF. As a result, all of MT, CPI and LA showed suggestive or significant associations with positivity (p = 0.024, 0.0042 and 0.037, respectively) and levels of ACPA (p ≤ 0.00031), but none of the PD parameters were associated with those of RF. These association patterns were also observed when we analyzed the 6206 non-smokers among the participants. The significant associations between PD parameters and the positivity and levels of ACPA in a healthy population support the fundamental involvement of PD in ACPA production. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Intra Cluster Light properties in the CLASH-VLT cluster MACS J1206.2-0847

    CERN Document Server

    Presotto, V; Nonino, M; Mercurio, A; Grillo, C; Rosati, P; Biviano, A; Annunziatella, M; Balestra, I; Cui, W; Sartoris, B; Lemze, D; Ascaso, B; Moustakas, J; Ford, H; Fritz, A; Czoske, O; Ettori, S; Kuchner, U; Lombardi, M; Maier, C; Medezinski, E; Molino, A; Scodeggio, M; Strazzullo, V; Tozzi, P; Ziegler, B; Bartelmann, M; Benitez, N; Bradley, L; Brescia, M; Broadhurst, T; Coe, D; Donahue, M; Gobat, R; Graves, G; Kelson, D; Koekemoer, A; Melchior, P; Meneghetti, M; Merten, J; Moustakas, L; Munari, E; Postman, M; Regős, E; Seitz, S; Umetsu, K; Zheng, W; Zitrin, A

    2014-01-01

    We aim at constraining the assembly history of clusters by studying the intra cluster light (ICL) properties, estimating its contribution to the fraction of baryons in stars, f*, and understanding possible systematics/bias using different ICL detection techniques. We developed an automated method, GALtoICL, based on the software GALAPAGOS to obtain a refined version of typical BCG+ICL maps. We applied this method to our test case MACS J1206.2-0847, a massive cluster located at z=0.44, that is part of the CLASH sample. Using deep multi-band SUBARU images, we extracted the surface brightness (SB) profile of the BCG+ICL and we studied the ICL morphology, color, and contribution to f* out to R500. We repeated the same analysis using a different definition of the ICL, SBlimit method, i.e. a SB cut-off level, to compare the results. The most peculiar feature of the ICL in MACS1206 is its asymmetric radial distribution, with an excess in the SE direction and extending towards the 2nd brightest cluster galaxy which i...

  12. Automated cloud screening of AVHRR imagery using split-and-merge clustering

    Science.gov (United States)

    Gallaudet, Timothy C.; Simpson, James J.

    1991-01-01

    Previous methods to segment clouds from ocean in AVHRR imagery have shown varying degrees of success, with nighttime approaches being the most limited. An improved method of automatic image segmentation, the principal component transformation split-and-merge clustering (PCTSMC) algorithm, is presented and applied to cloud screening of both nighttime and daytime AVHRR data. The method combines spectral differencing, the principal component transformation, and split-and-merge clustering to sample objectively the natural classes in the data. This segmentation method is then augmented by supervised classification techniques to screen clouds from the imagery. Comparisons with other nighttime methods demonstrate its improved capability in this application. The sensitivity of the method to clustering parameters is presented; the results show that the method is insensitive to the split-and-merge thresholds.
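
The principal component transformation at the heart of the PCTSMC algorithm can be sketched with a covariance eigen-decomposition. This is an illustrative textbook reimplementation, not the authors' code; the two correlated synthetic "channels" stand in for AVHRR bands.

```python
import numpy as np

def principal_components(bands):
    """Project multi-band pixels onto the eigenvectors of their covariance
    matrix -- the principal component transformation, which packs most of
    the scene variance into the first component before clustering."""
    X = bands.reshape(bands.shape[0], -1).T          # pixels x bands
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    w, V = np.linalg.eigh(cov)                       # eigenvalues, ascending
    order = np.argsort(w)[::-1]                      # reorder to descending
    return Xc @ V[:, order], w[order]

# two toy AVHRR-like channels that are strongly correlated
rng = np.random.default_rng(0)
t = rng.normal(size=200)
bands = np.stack([(t + 0.1 * rng.normal(size=200)).reshape(20, 10),
                  (t + 0.1 * rng.normal(size=200)).reshape(20, 10)])
pcs, var = principal_components(bands)
# var[0] dominates: the shared signal lands in the first component
```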

  13. Analytical Energy Gradients for Excited-State Coupled-Cluster Methods

    Science.gov (United States)

    Wladyslawski, Mark; Nooijen, Marcel

    The equation-of-motion coupled-cluster (EOM-CC) and similarity transformed equation-of-motion coupled-cluster (STEOM-CC) methods have been firmly established as accurate and routinely applicable extensions of single-reference coupled-cluster theory to describe electronically excited states. An overview of these methods is provided, with emphasis on the many-body similarity transform concept that is the key to a rationalization of their accuracy. The main topic of the paper is the derivation of analytical energy gradients for such non-variational electronic structure approaches, with an ultimate focus on obtaining their detailed algebraic working equations. A general theoretical framework using Lagrange's method of undetermined multipliers is presented, and the method is applied to formulate the EOM-CC and STEOM-CC gradients in abstract operator terms, following the previous work in [P.G. Szalay, Int. J. Quantum Chem. 55 (1995) 151] and [S.R. Gwaltney, R.J. Bartlett, M. Nooijen, J. Chem. Phys. 111 (1999) 58]. Moreover, the systematics of the Lagrange multiplier approach is suitable for automation by computer, enabling the derivation of the detailed derivative equations through a standardized and direct procedure. To this end, we have developed the SMART (Symbolic Manipulation and Regrouping of Tensors) package of automated symbolic algebra routines, written in the Mathematica programming language. The SMART toolkit provides the means to expand, differentiate, and simplify equations by manipulation of the detailed algebraic tensor expressions directly. The Lagrangian multiplier formulation establishes a uniform strategy to perform the automated derivation in a standardized manner: A Lagrange multiplier functional is constructed from the explicit algebraic equations that define the energy in the electronic method; the energy functional is then made fully variational with respect to all of its parameters, and the symbolic differentiations directly yield the explicit
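
In schematic notation (ours, not the paper's), the Lagrange-multiplier strategy described above amounts to:

```latex
\mathcal{L}(\mathbf{t},\boldsymbol{\lambda})
  = E(\mathbf{t}) + \boldsymbol{\lambda}\cdot\mathbf{f}(\mathbf{t}),
\qquad \mathbf{f}(\mathbf{t}) = 0 \ \text{(amplitude equations)},
\qquad \frac{\partial\mathcal{L}}{\partial\mathbf{t}} = 0 \ \text{fixes } \boldsymbol{\lambda}.
```

Once the functional is stationary in all of its parameters, the gradient with respect to any external parameter χ (e.g. a nuclear coordinate) is simply ∂L/∂χ, with no amplitude-response equations required; that property is what makes the term-by-term symbolic differentiation performed by SMART mechanical.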

  14. Fine Mapping Seronegative and Seropositive Rheumatoid Arthritis to Shared and Distinct HLA Alleles by Adjusting for the Effects of Heterogeneity

    NARCIS (Netherlands)

    Han, Buhm; Diogo, Dorothee; Eyre, Steve; Kallberg, Henrik; Zhernakova, Alexandra; Bowes, John; Padyukov, Leonid; Okada, Yukinori; Gonzalez-Gay, Miguel A.; Rantapaa-Dahlqvist, Solbritt; Martin, Javier; Huizinga, Tom W. J.; Plenge, Robert M.; Worthington, Jane; Gregersen, Peter K.; Klareskog, Lars; de Bakker, Paul I. W.; Raychaudhuri, Soumya

    2014-01-01

    Despite progress in defining human leukocyte antigen (HLA) alleles for anti-citrullinated-protein-autoantibody-positive (ACPA(+)) rheumatoid arthritis (RA), identifying HLA alleles for ACPA-negative (ACPA(-)) RA has been challenging because of clinical heterogeneity within clinical cohorts. We

  15. Accommodating error analysis in comparison and clustering of molecular fingerprints.

    Science.gov (United States)

    Salamon, H; Segal, M R; Ponce de Leon, A; Small, P M

    1998-01-01

    Molecular epidemiologic studies of infectious diseases rely on pathogen genotype comparisons, which usually yield patterns comprising sets of DNA fragments (DNA fingerprints). We use a highly developed genotyping system, IS6110-based restriction fragment length polymorphism analysis of Mycobacterium tuberculosis, to develop a computational method that automates comparison of large numbers of fingerprints. Because error in fragment length measurements is proportional to fragment length and is positively correlated for fragments within a lane, an align-and-count method that compensates for relative scaling of lanes reliably counts matching fragments between lanes. Results of a two-step method we developed to cluster identical fingerprints agree closely with 5 years of computer-assisted visual matching among 1,335 M. tuberculosis fingerprints. Fully documented and validated methods of automated comparison and clustering will greatly expand the scope of molecular epidemiology.
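
A minimal sketch of the "align-and-count" idea follows. This is our illustration, with a made-up 1.5% tolerance and without the lane-scaling search the real method performs: two fragment lengths are declared matching when they differ by less than a fixed fraction of their size, reflecting measurement error that grows with fragment length.

```python
def matching_fragments(lane_a, lane_b, tol=0.015):
    """Count fragments shared by two fingerprint lanes, treating two
    lengths as equal when they differ by less than tol * max(a, b)."""
    used = set()
    matches = 0
    for a in lane_a:
        for i, b in enumerate(lane_b):
            if i not in used and abs(a - b) <= tol * max(a, b):
                used.add(i)   # each fragment in lane_b matches at most once
                matches += 1
                break
    return matches

# fragment lengths in base pairs: the 1000 bp and 4000 bp bands match
# within tolerance, the 2500 bp band does not
shared = matching_fragments([1000, 2500, 4000], [1010, 2400, 4030])
```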

  16. NMDA receptor adjusted co-administration of ecstasy and cannabinoid receptor-1 agonist in the amygdala via stimulation of BDNF/Trk-B/CREB pathway in adult male rats.

    Science.gov (United States)

    Ashabi, Ghorbangol; Sadat-Shirazi, Mitra-Sadat; Khalifeh, Solmaz; Elhampour, Laleh; Zarrindast, Mohammad-Reza

    2017-04-01

    Cannabinoid receptor-1 (CB-1) agonists such as cannabis are widely consumed by 3,4-methylenedioxymethamphetamine (MDMA, ecstasy) users; it has been hypothesized that co-consumption of a CB-1 agonist might protect neurons against MDMA toxicity. N-methyl-d-aspartate (NMDA) receptors regulate neuronal plasticity and firing rate in the brain through Tyrosine-kinase B (Trk-B) activation. The molecular and electrophysiological interplay between NMDA receptors and MDMA/arachidonylcyclopropylamide (ACPA, a selective CB-1 receptor agonist) co-consumption is not well understood. Here, neuronal spontaneous activity and Brain-derived neurotrophic factor (BDNF), Trk-B and cAMP response element binding protein (CREB) phosphorylation levels were assessed in ACPA- and MDMA-co-injected rats. In addition, we examined the role of the NMDA receptor in the effects of the MDMA and ACPA combination on neuronal spontaneous activity and the Trk-B/BDNF pathway in the central amygdala (CeA). Male rats were anesthetized with intra-peritoneal injections of urethane; MDMA and D-2-amino-5-phosphonopentanoate (D-AP5, an NMDA receptor antagonist) were injected into the CeA. ACPA was administered by intra-cerebroventricular injection. Thirty minutes following injections, the neuronal firing rate was recorded from the CeA. Two hours after drug injection, the amygdala was collected for molecular evaluations. Single administration of MDMA and/or ACPA dose-dependently reduced firing rates in the CeA compared with the sham group. Injection of D-AP5, ACPA and MDMA reduced the firing rate compared with the sham group (P<0.001). Interestingly, injection of ACPA+MDMA enhanced BDNF, Trk-B and CREB phosphorylation compared with the MDMA groups. D-AP5, ACPA and MDMA co-injection decreased BDNF, Trk-B and CREB phosphorylation levels compared with ACPA+MDMA in the amygdala (P<0.01). NMDA receptors are probably involved in the protective role of acute MDMA+ACPA co-injection via BDNF/Trk-B/CREB pathways. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. A Web service substitution method based on service cluster nets

    Science.gov (United States)

    Du, YuYue; Gai, JunJing; Zhou, MengChu

    2017-11-01

    Service substitution is an important research topic in the fields of Web services and service-oriented computing. This work presents a novel method to analyse and substitute Web services. A new concept, called a Service Cluster Net Unit, is proposed based on Web service clusters. A service cluster is converted into a Service Cluster Net Unit. Then it is used to analyse whether the services in the cluster can satisfy some service requests. Meanwhile, the substitution methods of an atomic service and a composite service are proposed. The correctness of the proposed method is proved, and the effectiveness is shown and compared with the state-of-the-art method via an experiment. It can be readily applied to e-commerce service substitution to meet business automation needs.

  18. Impact of an automated email notification system for results of tests pending at discharge: a cluster-randomized controlled trial.

    Science.gov (United States)

    Dalal, Anuj K; Roy, Christopher L; Poon, Eric G; Williams, Deborah H; Nolido, Nyryan; Yoon, Cathy; Budris, Jonas; Gandhi, Tejal; Bates, David W; Schnipper, Jeffrey L

    2014-01-01

    Physician awareness of the results of tests pending at discharge (TPADs) is poor. We developed an automated system that notifies responsible physicians of TPAD results via secure, network email. We sought to evaluate the impact of this system on self-reported awareness of TPAD results by responsible physicians, a necessary intermediary step to improve management of TPAD results. We conducted a cluster-randomized controlled trial at a major hospital affiliated with an integrated healthcare delivery network in Boston, Massachusetts. Adult patients with TPADs who were discharged from inpatient general medicine and cardiology services were assigned to the intervention or usual care arm if their inpatient attending physician and primary care physician (PCP) were both randomized to the same study arm. Patients of physicians randomized to discordant study arms were excluded. We surveyed these physicians 72 h after all TPAD results were finalized. The primary outcome was awareness of TPAD results by attending physicians. Secondary outcomes included awareness of TPAD results by PCPs, awareness of actionable TPAD results, and provider satisfaction. We analyzed data on 441 patients. We sent 441 surveys to attending physicians and 353 surveys to PCPs and received 275 and 152 responses from 83 different attending physicians and 112 different PCPs, respectively (attending physician survey response rate of 63%). Intervention attending physicians and PCPs were significantly more aware of TPAD results (76% vs 38%, adjusted/clustered OR 6.30 (95% CI 3.02 to 13.16), p<0.001). Automated email notification represents a promising strategy for managing TPAD results, potentially mitigating an unresolved patient safety concern. ClinicalTrials.gov (NCT01153451).

  19. Clinical Characteristics of Aldosterone- and Cortisol-Coproducing Adrenal Adenoma in Primary Aldosteronism

    Directory of Open Access Journals (Sweden)

    Lu Tang

    2018-01-01

    Aldosterone- and cortisol-coproducing adrenal adenoma (A/CPA) cases have been observed in patients with primary aldosteronism (PA). This study investigated the incidence, clinical characteristics, and molecular biological features of patients with A/CPAs. We retrospectively identified 22 A/CPA patients from 555 PA patients who visited the Chinese People’s Liberation Army General Hospital between 2004 and 2015. Analysis of clinical parameters revealed that patients with A/CPAs had larger tumors than those with pure APAs (P<0.05). Moreover, they had higher proportions of cardiovascular complications, glucose intolerance/diabetes, and osteopenia/osteoporosis compared to the pure APA patients (P<0.001). In the molecular biological findings, quantitative real-time PCR analysis revealed similar CYP11B1 and CYP17A1 mRNA expressions in resected A/CPA specimens and in pure APA specimens. Western blot and immunochemical analyses showed CYP11B1, CYP11B2, and CYP17A1 expressions in both A/CPAs and pure APAs. Seventeen cases with KCNJ5 mutations were detected among the 22 A/CPA DNA samples, but no PRKACA or other causative mutations were observed. Each patient improved following adrenalectomy. In conclusion, A/CPAs were not rare among PA patients. These patients had high incidences of cardiovascular events and metabolic disorders. Screening for excess cortisol secretion is necessary for PA patients.

  20. An automated three-dimensional detection and segmentation method for touching cells by integrating concave points clustering and random walker algorithm.

    Directory of Open Access Journals (Sweden)

    Yong He

    Characterizing cytoarchitecture is crucial for understanding brain functions and neural diseases. In neuroanatomy, it is an important task to accurately extract cell populations' centroids and contours. Recent advances have permitted imaging at single cell resolution for an entire mouse brain using the Nissl staining method. However, it is difficult to precisely segment numerous cells, especially those cells touching each other. As presented herein, we have developed an automated three-dimensional detection and segmentation method applied to the Nissl staining data, with the following two key steps: (1) concave points clustering to determine the seed points of touching cells; and (2) random walker segmentation to obtain cell contours. Also, we have evaluated the performance of our proposed method with several mouse brain datasets, which were captured with the micro-optical sectioning tomography imaging system, and the datasets include closely touching cells. Compared with traditional detection and segmentation methods, our approach shows promising detection accuracy and high robustness.

  1. Cluster processing for 16Mb DRAM production

    International Nuclear Information System (INIS)

    Bergendahl, A.; Horak, D.

    1989-01-01

    Multichamber and in-situ technology are used to meet the challenge of manufacturing 16-Mb cost/performance DRAMs. The 16-Mb fabrication process is more complex than those of earlier 1-Mb and 4-Mb chips. Clustering of sequential process steps effectively compensates for both manufacturing complexity and foreign-material (FM) related defect densities. The development time of clusters combining new processes and equipment in multiple automated steps is nearly as long as the product development cycle. This necessitates codevelopment of the manufacturing process clusters with technology integration while addressing the factors influencing FM defect generation, processing turnaround time (TAT), manufacturing costs, yield, and array cell and support device designs. The advantages of multichamber and in situ processing have resulted in their application throughout the entire 16-Mb DRAM process as appropriate equipment becomes available.

  2. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    Science.gov (United States)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  3. Parallel Density-Based Clustering for Discovery of Ionospheric Phenomena

    Science.gov (United States)

    Pankratius, V.; Gowanlock, M.; Blair, D. M.

    2015-12-01

    Ionospheric total electron content maps derived from global networks of dual-frequency GPS receivers can reveal a plethora of ionospheric features in real-time and are key to space weather studies and natural hazard monitoring. However, growing data volumes from expanding sensor networks are making manual exploratory studies challenging. As the community is heading towards Big Data ionospheric science, automation and Computer-Aided Discovery become indispensable tools for scientists. One problem of machine learning methods is that they require domain-specific adaptations in order to be effective and useful for scientists. Addressing this problem, our Computer-Aided Discovery approach allows scientists to express various physical models as well as perturbation ranges for parameters. The search space is explored through an automated system and parallel processing of batched workloads, which finds corresponding matches and similarities in empirical data. We discuss density-based clustering as a particular method we employ in this process. Specifically, we adapt Density-Based Spatial Clustering of Applications with Noise (DBSCAN). This algorithm groups geospatial data points based on density. Clusters of points can be of arbitrary shape, and the number of clusters is not predetermined by the algorithm; only two input parameters need to be specified: (1) a distance threshold, (2) a minimum number of points within that threshold. We discuss an implementation of DBSCAN for batched workloads that is amenable to parallelization on manycore architectures such as Intel's Xeon Phi accelerator with 60+ general-purpose cores. This manycore parallelization can cluster large volumes of ionospheric total electron content data quickly. Potential applications for cluster detection include the visualization, tracing, and examination of traveling ionospheric disturbances or other propagating phenomena. Acknowledgments. We acknowledge support from NSF ACI-1442997 (PI V. Pankratius).
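The record names DBSCAN's two input parameters but gives no code. As a minimal, illustrative sketch of the serial algorithm on toy 2-D points (not ionospheric TEC data, and without the manycore batching the abstract discusses), the core expansion loop can be written as:

```python
from math import hypot

def dbscan(points, eps, min_pts):
    """Density-based clustering of 2-D points; returns one label per point (-1 = noise)."""
    labels = [None] * len(points)

    def neighbors(i):
        # all points within the distance threshold eps (including i itself)
        return [j for j, q in enumerate(points)
                if hypot(q[0] - points[i][0], q[1] - points[i][1]) <= eps]

    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1            # not a core point: provisionally noise
            continue
        cluster += 1                  # i is a core point: start a new cluster
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:                  # expand through density-reachable points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # former noise becomes a border point
                continue
            if labels[j] is not None:
                continue
            labels[j] = cluster
            j_seeds = neighbors(j)
            if len(j_seeds) >= min_pts:   # j is itself core: keep expanding
                queue.extend(j_seeds)
    return labels
```

With eps = 0.5 and min_pts = 3 on two dense groups plus one isolated point, this returns two cluster labels and marks the isolated point as noise (-1); the production version in the abstract additionally batches and parallelizes the neighborhood queries.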

  4. Exploring relationships of human-automation interaction consequences on pilots: uncovering subsystems.

    Science.gov (United States)

    Durso, Francis T; Stearman, Eric J; Morrow, Daniel G; Mosier, Kathleen L; Fischer, Ute; Pop, Vlad L; Feigh, Karen M

    2015-05-01

    We attempted to understand the latent structure underlying the systems pilots use to operate in situations involving human-automation interaction (HAI). HAI is an important characteristic of many modern work situations. Of course, the cognitive subsystems are not immediately apparent by observing a functioning system, but correlations between variables may reveal important relations. The current report examined pilot judgments of 11 HAI dimensions (e.g., Workload, Task Management, Stress/Nervousness, Monitoring Automation, and Cross-Checking Automation) across 48 scenarios that required airline pilots to interact with automation on the flight deck. We found three major clusters of the dimensions identifying subsystems on the flight deck: a workload subsystem, a management subsystem, and an awareness subsystem. Relationships characterized by simple correlations cohered in ways that suggested underlying subsystems consistent with those that had previously been theorized. Understanding the relationship among dimensions affecting HAI is an important aspect in determining how a new piece of automation designed to affect one dimension will affect other dimensions as well. © 2014, Human Factors and Ergonomics Society.

  5. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
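The K-means step at the heart of this pipeline is not spelled out in the abstract. A bare-bones Lloyd's k-means over RGB tuples can be sketched as follows; the deterministic farthest-point seeding and the toy pixel values are assumptions for illustration, not details from the paper:

```python
def dist2(p, q):
    """Squared Euclidean distance between two colour tuples."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(pixels, k, iters=20):
    """Lloyd's k-means over RGB tuples with deterministic farthest-point seeding."""
    # seed: first pixel, then repeatedly the pixel farthest from all chosen centroids
    centroids = [pixels[0]]
    while len(centroids) < k:
        centroids.append(max(pixels, key=lambda p: min(dist2(p, c) for c in centroids)))
    assign = [0] * len(pixels)
    for _ in range(iters):
        # assignment step: each pixel joins its nearest centroid
        for i, p in enumerate(pixels):
            assign[i] = min(range(k), key=lambda c: dist2(p, centroids[c]))
        # update step: each centroid moves to the mean colour of its members
        for c in range(k):
            members = [p for i, p in enumerate(pixels) if assign[i] == c]
            if members:
                centroids[c] = tuple(sum(ch) / len(members) for ch in zip(*members))
    return centroids, assign
```

On a handful of invented "healthy green" and "lesion brown" pixels with k = 2, the two colour populations separate into distinct clusters, mirroring how the paper groups outlier regions by colour.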

  6. TOSCA-based orchestration of complex clusters at the IaaS level

    Science.gov (United States)

    Caballer, M.; Donvito, G.; Moltó, G.; Rocha, R.; Velten, M.

    2017-10-01

    This paper describes the adoption and extension of the TOSCA standard by the INDIGO-DataCloud project for the definition and deployment of complex computing clusters, together with the required support in both OpenStack and OpenNebula, carried out in close collaboration with industry partners such as IBM. Two examples of these clusters are described in this paper: the definition of an elastic computing cluster to support the Galaxy bioinformatics application, where nodes are dynamically added and removed from the cluster to adapt to the workload, and the definition of a scalable Apache Mesos cluster for the execution of batch jobs and support for long-running services. The coupling of TOSCA with Ansible Roles to perform automated installation has resulted in the definition of high-level, deterministic templates to provision complex computing clusters across different Cloud sites.

  7. Automatic detection of erythemato-squamous diseases using k-means clustering.

    Science.gov (United States)

    Ubeyli, Elif Derya; Doğdu, Erdoğan

    2010-04-01

    A new approach based on the implementation of k-means clustering is presented for automated detection of erythemato-squamous diseases. The purpose of clustering techniques is to find a structure for the given data by finding similarities between data according to data characteristics. The studied domain contained records of patients with known diagnoses. The k-means clustering algorithm's task was to classify the data points, in this case the patients with attribute data, into one of five clusters. The algorithm was used to detect the five erythemato-squamous diseases when 33 features defining five disease indications were used. The purpose is to determine an optimum classification scheme for this problem. The present research demonstrated that the features represent the erythemato-squamous diseases well and that the k-means clustering algorithm achieved high classification accuracies for the five erythemato-squamous diseases.

  8. Computer-aided diagnosis of mammographic microcalcification clusters

    International Nuclear Information System (INIS)

    Kallergi, Maria

    2004-01-01

    Computer-aided diagnosis techniques in medical imaging are developed for the automated differentiation between benign and malignant lesions and go beyond computer-aided detection by providing cancer likelihood for a detected lesion given image and/or patient characteristics. The goal of this study was the development and evaluation of a computer-aided detection and diagnosis algorithm for mammographic calcification clusters. The emphasis was on the diagnostic component, although the algorithm included automated detection, segmentation, and classification steps based on wavelet filters and artificial neural networks. Classification features were selected primarily from descriptors of the morphology of the individual calcifications and the distribution of the cluster. Thirteen such descriptors were selected and, combined with patient's age, were given as inputs to the network. The features were ranked and evaluated for the classification of 100 high-resolution, digitized mammograms containing biopsy-proven, benign and malignant calcification clusters. The classification performance of the algorithm reached a 100% sensitivity for a specificity of 85% (receiver operating characteristic area index Az = 0.98 ± 0.01). Tests of the algorithm under various conditions showed that the selected features were robust morphological and distributional descriptors, relatively insensitive to segmentation and detection errors such as false positive signals. The algorithm could exceed the performance of a similar visual analysis system that was used as basis for development and, combined with a simple image standardization process, could be applied to images from different imaging systems and film digitizers with similar sensitivity and specificity rates.

  9. Automated Dimension Determination for NMF-based Incremental Collaborative Filtering

    Directory of Open Access Journals (Sweden)

    Xiwei Wang

    2015-12-01

    Full Text Available The nonnegative matrix factorization (NMF) based collaborative filtering techniques have achieved great success in product recommendations. It is well known that in NMF, the dimensions of the factor matrices have to be determined in advance. Moreover, data is growing fast; thus in some cases, the dimensions need to be changed to reduce the approximation error. The recommender systems should be capable of updating new data in a timely manner without sacrificing the prediction accuracy. In this paper, we propose an NMF based data update approach with automated dimension determination for collaborative filtering purposes. The approach can determine the dimensions of the factor matrices and update them automatically. It exploits the nearest neighborhood based clustering algorithm to cluster users and items according to their auxiliary information, and uses the clusters as the constraints in NMF. The dimensions of the factor matrices are associated with the cluster quantities. When new data becomes available, the incremental clustering algorithm determines whether to increase the number of clusters or merge the existing clusters. Experiments on three different datasets (MovieLens, Sushi, and LibimSeTi) were conducted to examine the proposed approach. The results show that our approach can update the data quickly and provide encouraging prediction accuracy.
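For readers unfamiliar with the NMF step itself, a minimal sketch of the standard Lee-Seung multiplicative updates is shown below. This is the plain batch algorithm, not the paper's incremental, cluster-constrained variant, and the 3x3 rating matrix is a made-up example:

```python
import random

def matmul(A, B):
    """Dense matrix product of two lists-of-lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def nmf(V, k, iters=500, eps=1e-9):
    """Lee-Seung multiplicative updates minimizing ||V - W*H||_F^2 with W, H >= 0."""
    rng = random.Random(1)
    m, n = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]
    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H), elementwise
        Wt = [list(r) for r in zip(*W)]
        WtV = matmul(Wt, V)
        WtWH = matmul(matmul(Wt, W), H)
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps) for j in range(n)] for i in range(k)]
        # W <- W * (V H^T) / (W H H^T), elementwise
        Ht = [list(r) for r in zip(*H)]
        VHt = matmul(V, Ht)
        WHHt = matmul(W, matmul(H, Ht))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps) for j in range(k)] for i in range(m)]
    return W, H
```

The updates keep both factors nonnegative and monotonically reduce the reconstruction error; the paper's contribution is deciding k automatically from cluster counts and updating W and H incrementally as new ratings arrive.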

  10. Anti-citrullinated protein antibodies promote apoptosis of mature human Saos-2 osteoblasts via cell-surface binding to citrullinated heat shock protein 60.

    Science.gov (United States)

    Lu, Ming-Chi; Yu, Chia-Li; Yu, Hui-Chun; Huang, Hsien-Bin; Koo, Malcolm; Lai, Ning-Sheng

    2016-01-01

    We hypothesized that anti-citrullinated protein antibodies (ACPAs) react with osteoblast surface citrullinated proteins and affect cell function, leading to joint damage in patients with rheumatoid arthritis (RA). First, we purified ACPAs by cyclic citrullinated peptide (CCP)-conjugated affinity column chromatography. The cognate antigens of ACPAs on Saos-2 cells, a sarcoma osteogenic cell line generated from human osteoblasts, were probed by ACPAs, and the reactive bands were analyzed using proteomic analyses. We found that ACPAs bind to the Saos-2 cell membrane, and several protein candidates, including HSP60, were identified. We then cloned and purified recombinant heat shock protein 60 (HSP60) and citrullinated HSP60 (citHSP60) and investigated the effect of ACPAs on Saos-2 cells. We confirmed that HSP60 obtained from the Saos-2 cell membrane was citrullinated and reacted with ACPAs, which induced Saos-2 cell apoptosis via binding to surface-expressed citHSP60 through Toll-like receptor 4 signaling. ACPAs promoted interleukin (IL)-6 and IL-8 expression in Saos-2 cells. Finally, sera from patients with RA and healthy controls were examined for their titers of anti-HSP60 and anti-citHSP60 antibodies using an enzyme-linked immunosorbent assay. The radiographic change in patients with RA was evaluated using the Genant-modified Sharp scoring system. Patients with RA showed higher sera titers of anti-citHSP60, but not anti-HSP60, antibodies when compared with controls. In addition, the anti-citHSP60 level was positively associated with increased joint damage in patients with RA. In conclusion, Saos-2 cell apoptosis was mediated by ACPAs via binding to cell surface-expressed citHSP60, and the titer of anti-citHSP60 in patients with RA was positively associated with joint damage. Copyright © 2015 Elsevier GmbH. All rights reserved.

  11. Automated Parallel Computing Tools for Multicore Machines and Clusters, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to improve productivity of high performance computing for applications on multicore computers and clusters. These machines built from one or more chips...

  12. Automation model of sewerage rehabilitation planning.

    Science.gov (United States)

    Yang, M D; Su, T C

    2006-01-01

    The major steps of sewerage rehabilitation include inspection of sewerage, assessment of structural conditions, computation of structural condition grades, and determination of rehabilitation methods and materials. Conventionally, sewerage rehabilitation planning relies on experts with professional backgrounds, a process that is tedious and time-consuming. This paper proposes an automation model for planning optimal sewerage rehabilitation strategies for the sewer system by integrating image processing, clustering technology, optimization, and visualization display. Firstly, image processing techniques, such as wavelet transformation and co-occurrence feature extraction, were employed to extract various characteristics of structural failures from CCTV inspection images. Secondly, a classification neural network was established to automatically interpret the structural conditions by comparing the extracted features with the typical failures in a databank. Then, to achieve optimal rehabilitation efficiency, a genetic algorithm was used to determine appropriate rehabilitation methods and substitution materials for the pipe sections with a risk of malfunction and even collapse. Finally, the result from the automation model can be visualized in a geographic information system in which essential information on the sewer system and sewerage rehabilitation plans is graphically displayed. For demonstration, the automation model of optimal sewerage rehabilitation planning was applied to a sewer system in east Taichung, Chinese Taiwan.
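The genetic algorithm is described only at a high level. The toy sketch below shows the general shape of such an optimizer, choosing one of three hypothetical rehabilitation methods per pipe section under a budget penalty; the method costs, improvement scores, section count, and budget are invented for illustration and are not from the paper:

```python
import random

# Hypothetical rehabilitation options per pipe section: (cost, condition improvement).
# Think "patch", "reline", "replace" -- illustrative values only.
METHODS = [(1, 1), (3, 4), (5, 6)]
N_SECTIONS, BUDGET = 8, 20

def fitness(plan):
    """Total improvement, heavily penalized if the plan exceeds the budget."""
    cost = sum(METHODS[g][0] for g in plan)
    gain = sum(METHODS[g][1] for g in plan)
    return gain - 10 * max(0, cost - BUDGET)

def evolve(pop_size=30, generations=60, p_mut=0.1, seed=0):
    """Elitist genetic algorithm: truncation selection, one-point crossover, mutation."""
    rng = random.Random(seed)
    pop = [[rng.randrange(len(METHODS)) for _ in range(N_SECTIONS)]
           for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        history.append(fitness(pop[0]))
        parents = pop[:pop_size // 2]           # truncation selection
        children = [pop[0]]                     # elitism: keep the best plan
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, N_SECTIONS)  # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_SECTIONS):         # per-gene mutation
                if rng.random() < p_mut:
                    child[i] = rng.randrange(len(METHODS))
            children.append(child)
        pop = children
    pop.sort(key=fitness, reverse=True)
    return pop[0], history
```

Because the elite plan is carried over unchanged each generation, the best fitness never decreases; the paper applies the same scheme with fitness derived from the neural-network condition grades.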

  13. Affinity Purification and Comparative Biosensor Analysis of Citrulline-Peptide-Specific Antibodies in Rheumatoid Arthritis

    Directory of Open Access Journals (Sweden)

    Eszter Szarka

    2018-01-01

    Full Text Available Background: In rheumatoid arthritis (RA), anti-citrullinated protein/peptide antibodies (ACPAs) are responsible for disease onset and progression; however, our knowledge is limited on the ligand-binding affinities of autoantibodies with different citrulline-peptide specificity. Methods: Citrulline-peptide-specific ACPA IgGs were affinity purified and tested by ELISA. Binding affinities of ACPA IgGs and serum antibodies were compared by surface plasmon resonance (SPR) analysis. Bifunctional nanoparticles harboring a multi-epitope citrulline-peptide and a complement-activating peptide were used to induce selective depletion of ACPA-producing B cells. Results: KD values of affinity-purified ACPA IgGs varied between 10⁻⁶ and 10⁻⁸ M and inversely correlated with disease activity. Based on their cross-reaction with citrulline-peptides, we designed a novel multi-epitope peptide, containing Cit-Gly and Ala-Cit motifs in two copies each, separated by a short, neutral spacer. This peptide detected antibodies in RA sera with 66% sensitivity and 98% specificity in ELISA and was recognized by 90% of RA sera, but by none of the healthy samples, in SPR. When coupled to nanoparticles, the multi-epitope peptide specifically targeted and depleted ACPA-producing B cells ex vivo. Conclusions: The unique multi-epitope peptide designed based on ACPA cross-reactivity might be suitable to develop better diagnostics and novel therapies for RA.

  14. Comparative Investigation of Guided Fuzzy Clustering and Mean Shift Clustering for Edge Detection in Electrical Resistivity Tomography Images of Mineral Deposits

    Science.gov (United States)

    Ward, Wil; Wilkinson, Paul; Chambers, Jon; Bai, Li

    2014-05-01

    Geophysical surveying using electrical resistivity tomography (ERT) can be used as a rapid non-intrusive method to investigate mineral deposits [1]. One of the key challenges with this approach is to find a robust automated method to assess and characterise deposits on the basis of an ERT image. Recent research applying edge detection techniques has yielded a framework that can successfully locate geological interfaces in ERT images using a minimal-assumption data clustering technique, the guided fuzzy clustering method (gfcm) [2]. Non-parametric clustering techniques are statistically grounded methods of image segmentation that do not require any assumptions about the distribution of the data under investigation. This study is a comparison of two such methods to assess geological structure based on the resistivity images. In addition to gfcm, a method called mean-shift clustering [3] is investigated, with comparisons directed at accuracy, computational expense, and degree of user interaction. Neither approach requires the number of clusters as input (a common parameter and often impractical); rather, they are based on the similar theory that data can be clustered based on peaks in the probability density function (pdf) of the data. Each local maximum in these functions represents the modal value of a particular population corresponding to a cluster, and the data are assigned based on their relationships to these modal values. The two methods differ in that gfcm approximates the pdf using kernel density estimation and identifies population means, assigning cluster membership probabilities to each resistivity value in the model based on its distance from the distribution averages. In mean-shift clustering, by contrast, the density function is not calculated explicitly; instead, a gradient ascent method creates a vector that leads each datum towards high-density distributions iteratively, using weighted kernels to calculate locally dense regions.
The only parameter needed in both methods
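A compact way to see the gradient-ascent idea behind mean-shift is a flat-kernel implementation on scalar data, where each datum repeatedly moves to the mean of its neighbourhood until it settles on a density mode. The data values here are toy numbers, not resistivity models:

```python
def mean_shift(data, bandwidth, tol=1e-9, max_iter=200):
    """Flat-kernel mean shift on scalars: each datum climbs to a local density mode."""
    modes = []
    for x in data:
        for _ in range(max_iter):
            # neighbourhood window defined by the single bandwidth parameter
            window = [p for p in data if abs(p - x) <= bandwidth]
            shifted = sum(window) / len(window)
            if abs(shifted - x) < tol:
                break
            x = shifted
        modes.append(x)
    # merge modes that settled at (numerically) the same location
    centres, labels = [], []
    for m in modes:
        for i, c in enumerate(centres):
            if abs(m - c) < bandwidth / 2:
                labels.append(i)
                break
        else:
            centres.append(m)
            labels.append(len(centres) - 1)
    return centres, labels
```

Note that, exactly as the abstract emphasizes, the number of clusters is never specified: two modes emerge from two dense groups purely because the density function has two peaks, and the bandwidth is the only tuning parameter.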

  15. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10⁶) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for

  16. Bright galaxies in the Fornax cluster. Automated galaxy surface photometry: Pt. 7

    International Nuclear Information System (INIS)

    Disney, M.J.; Phillipps, S.; Davies, J.L.; Cawson, M.G.M.; Kibblewhite, E.J.

    1990-01-01

    We have determined surface-brightness profiles for all galaxies down to magnitude B = 16 in the central region of the Fornax cluster. Using existing redshift data, we have determined the distributions of surface brightness for both the whole sample and for cluster disc galaxies only. Although both distributions peak at extrapolated central surface brightness ∼ 21.7 B mag/arcsec² (the canonical result), it is shown that they are, in fact, consistent with very broad distributions of disc central surface brightness once selection effects and the effects of bulge contamination of the profile are taken into account. (author)

  17. Cerebral perfusion and automated individual analysis using SPECT among an obsessive-compulsive population

    Directory of Open Access Journals (Sweden)

    Euclides Timóteo da Rocha

    2011-01-01

    Full Text Available OBJECTIVE: To make individual assessments using automated quantification methodology in order to screen for perfusion abnormalities in cerebral SPECT examinations among a sample of subjects with OCD. METHODS: Statistical parametric mapping (SPM was used to compare 26 brain SPECT images from patients with OCD individually with an image bank of 32 normal subjects, using the statistical threshold of p < 0.05 (corrected for multiple comparisons at the level of individual voxels or clusters. The maps were analyzed, and regions presenting voxels that remained above this threshold were sought. RESULTS: Six patients from a sample of 26 OCD images showed abnormalities at cluster or voxel level, considering the criteria described above, which represented 23.07%. However, seven images from the normal group of 32 were also indicated as cases of perfusional abnormality, representing 21.8% of the sample. CONCLUSION: The automated quantification method was not considered to be a useful tool for clinical practice, for analyses complementary to visual inspection.

  18. Automated Classification and Analysis of Non-metallic Inclusion Data Sets

    Science.gov (United States)

    Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.

    2018-05-01

    The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions in two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
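The dimensionality reduction behind the 2D chemistry plot rests on extracting leading principal components of the centred data. A minimal sketch of the first component via power iteration on the sample covariance matrix follows; the 2-D toy data stand in for the real inclusion chemistry measurements:

```python
def first_pc(data, iters=100):
    """First principal component via power iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    X = [[x - m for x, m in zip(row, means)] for row in data]   # centre the data
    cov = [[sum(X[r][i] * X[r][j] for r in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        # repeatedly apply the covariance matrix and renormalize;
        # v converges to the dominant eigenvector (the first PC)
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

For data lying along the direction (3, 1), the returned unit vector is proportional to (3, 1); projecting each observation onto the first two such components yields the kind of 2D plot the study uses to reveal chemistry groupings.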

  19. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo

    2012-07-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  20. FPGA-Based Real-Time Motion Detection for Automated Video Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Sanjay Singh

    2016-03-01

    Full Text Available Design of automated video surveillance systems is one of the demanding tasks in the computer vision community because of their ability to automatically select frames of interest in incoming video streams based on motion detection. This research paper focuses on the real-time hardware implementation of a motion detection algorithm for such vision-based automated surveillance systems. A dedicated VLSI architecture has been proposed and designed for the clustering-based motion detection scheme. The working prototype of a complete standalone automated video surveillance system, including input camera interface, the designed motion detection VLSI architecture, and output display interface, with real-time relevant motion detection capabilities, has been implemented on the Xilinx ML510 (Virtex-5 FX130T) FPGA platform. The prototyped system robustly detects relevant motion in real time in live PAL (720 × 576) resolution video streams coming directly from the camera.

  1. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo; Huang, Jianhua Z.; Huitink, David; Kundu, Subrata; Mallick, Bani K.; Liang, Hong; Ding, Yu

    2012-01-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  2. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    Science.gov (United States)

    McIntosh, Dawn

    2006-01-01

    This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS), a tool to analyze text reports, such as aviation reports and maintenance records: (1) text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) the system identifies interconnected reports, automating the discovery of possible recurring anomalies; (3) it provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search.

  3. Automated image analysis for quantitative fluorescence in situ hybridization with environmental samples.

    Science.gov (United States)

    Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L

    2007-05-01

    When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
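The fuzzy c-means step used to split intensity data into target and non-target clusters can be sketched for scalar intensities as follows. The intensity values are invented, and c = 2 mirrors the positive/negative split described above but is an assumption of this sketch:

```python
def fuzzy_cmeans(data, c, m=2.0, iters=100):
    """Fuzzy c-means on scalars; returns cluster centres and soft memberships."""
    # deterministic init: spread centres evenly across the data range
    lo, hi = min(data), max(data)
    centres = [lo + (hi - lo) * (i + 0.5) / c for i in range(c)]
    u = [[0.0] * c for _ in data]
    for _ in range(iters):
        # membership update: closer centres get membership nearer 1
        for k, x in enumerate(data):
            d = [abs(x - cj) or 1e-12 for cj in centres]   # guard exact hits
            for i in range(c):
                u[k][i] = 1.0 / sum((d[i] / d[j]) ** (2 / (m - 1)) for j in range(c))
        # centre update: membership-weighted means
        for i in range(c):
            num = sum((u[k][i] ** m) * x for k, x in enumerate(data))
            den = sum(u[k][i] ** m for k in range(len(data)))
            centres[i] = num / den
    return centres, u
```

Unlike hard k-means, each cell's intensity receives a membership in every cluster; the study then classifies each resulting cluster as target or non-target and confirms the split with a manual quality control.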

  4. flowClust: a Bioconductor package for automated gating of flow cytometry data

    Directory of Open Access Journals (Sweden)

    Lo Kenneth

    2009-05-01

    Full Text Available Abstract Background As a high-throughput technology that offers rapid quantification of multidimensional characteristics for millions of cells, flow cytometry (FCM is widely used in health research, medical diagnosis and treatment, and vaccine development. Nevertheless, there is an increasing concern about the lack of appropriate software tools to provide an automated analysis platform to parallelize the high-throughput data-generation platform. Currently, to a large extent, FCM data analysis relies on the manual selection of sequential regions in 2-D graphical projections to extract the cell populations of interest. This is a time-consuming task that ignores the high-dimensionality of FCM data. Results In view of the aforementioned issues, we have developed an R package called flowClust to automate FCM analysis. flowClust implements a robust model-based clustering approach based on multivariate t mixture models with the Box-Cox transformation. The package provides the functionality to identify cell populations whilst simultaneously handling the commonly encountered issues of outlier identification and data transformation. It offers various tools to summarize and visualize a wealth of features of the clustering results. In addition, to ensure its convenience of use, flowClust has been adapted for the current FCM data format, and integrated with existing Bioconductor packages dedicated to FCM analysis. Conclusion flowClust addresses the issue of a dearth of software that helps automate FCM analysis with a sound theoretical foundation. It tends to give reproducible results, and helps reduce the significant subjectivity and human time cost encountered in FCM analysis. The package contributes to the cytometry community by offering an efficient, automated analysis platform which facilitates the active, ongoing technological advancement.

  5. clues: An R Package for Nonparametric Clustering Based on Local Shrinking

    Directory of Open Access Journals (Sweden)

    Fang Chang

    2010-02-01

    Full Text Available Determining the optimal number of clusters appears to be a persistent and controversial issue in cluster analysis. Most existing R packages targeting clustering require the user to specify the number of clusters in advance. However, if this subjectively chosen number is far from optimal, clustering may produce seriously misleading results. In order to address this vexing problem, we develop the R package clues to automate and evaluate the selection of an optimal number of clusters, which is widely applicable in the field of clustering analysis. Package clues uses two main procedures, shrinking and partitioning, to estimate an optimal number of clusters by maximizing an index function, either the CH index or the Silhouette index, rather than relying on guessing a pre-specified number. Five agreement indices (Rand index, Hubert and Arabie's adjusted Rand index, Morey and Agresti's adjusted Rand index, Fowlkes and Mallows index, and Jaccard index), which measure the degree of agreement between any two partitions, are also provided in clues. In addition to numerical evidence, clues also supplies a deeper insight into the partitioning process with trajectory plots.
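Of the agreement indices listed, the plain Rand index is the simplest: the fraction of point pairs on which two partitions agree (both place the pair together, or both place it apart). A short reference implementation in Python, independent of the R package:

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of point pairs on which two partitions agree."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = sum((labels_a[i] == labels_a[j]) == (labels_b[i] == labels_b[j])
                for i, j in pairs)
    return agree / len(pairs)
```

Identical partitions score 1.0 even when the cluster labels are permuted, which is why such indices, rather than raw label matching, are used to compare clusterings.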

  6. Altering users' acceptance of automation through prior automation exposure.

    Science.gov (United States)

    Bekier, Marek; Molesworth, Brett R C

    2017-06-01

    Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that, after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.

  7. Automated Essay Grading using Machine Learning Algorithm

    Science.gov (United States)

    Ramalingam, V. V.; Pandian, A.; Chetry, Prateek; Nigam, Himanshu

    2018-04-01

    Essays are paramount for assessing academic excellence, along with linking different ideas and the ability to recall, but they are notably time-consuming when assessed manually. Manual grading takes a significant amount of an evaluator's time and hence is an expensive process. Automated grading, if proven effective, will not only reduce the time for assessment but, when compared with human scores, will also make the scores realistic. The project aims to develop an automated essay assessment system using machine learning techniques, classifying a corpus of textual entities into a small number of discrete categories corresponding to possible grades. Linear regression will be utilized for training the model, along with various other classification and clustering techniques. We intend to train classifiers on the training set, run them over the downloaded dataset, and then measure performance by comparing the obtained values with the dataset values. We have implemented our model using Java.
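The regression step can be sketched in miniature: fit ordinary least squares from a single hypothetical feature (word count) to human-assigned grades. This is an illustration only, not the project's actual feature set, and in Python rather than the Java the authors mention:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical training pairs: (word count, human-assigned grade).
essays = [(120, 2), (250, 3), (400, 4), (520, 5)]
a, b = fit_line([w for w, _ in essays], [g for _, g in essays])

def predict(words):
    return a * words + b
```

A real grader would use many features (vocabulary, syntax, coherence) and validate against held-out human scores; the closed-form fit above only shows the training mechanics.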

  8. Detecting and extracting clusters in atom probe data: A simple, automated method using Voronoi cells

    International Nuclear Information System (INIS)

    Felfer, P.; Ceguerra, A.V.; Ringer, S.P.; Cairney, J.M.

    2015-01-01

    The analysis of the formation of clusters in solid solutions is one of the most common uses of atom probe tomography. Here, we present a method where we use the Voronoi tessellation of the solute atoms and its geometric dual, the Delaunay triangulation to test for spatial/chemical randomness of the solid solution as well as extracting the clusters themselves. We show how the parameters necessary for cluster extraction can be determined automatically, i.e. without user interaction, making it an ideal tool for the screening of datasets and the pre-filtering of structures for other spatial analysis techniques. Since the Voronoi volumes are closely related to atomic concentrations, the parameters resulting from this analysis can also be used for other concentration based methods such as iso-surfaces. - Highlights: • Cluster analysis of atom probe data can be significantly simplified by using the Voronoi cell volumes of the atomic distribution. • Concentration fields are defined on a single atomic basis using Voronoi cells. • All parameters for the analysis are determined by optimizing the separation probability of bulk atoms vs clustered atoms
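The Voronoi-volume idea can be illustrated in one dimension, where each atom's cell simply spans half the gap to each neighbour and a small cell signals high local solute concentration. This is only a 1-D sketch with made-up positions, not the authors' 3-D tessellation:

```python
def voronoi_lengths_1d(positions):
    """1-D Voronoi cells: each cell spans half the gap to each neighbour."""
    xs = sorted(positions)
    cells = []
    for i, x in enumerate(xs):
        left = (x - xs[i - 1]) / 2 if i > 0 else 0.0
        right = (xs[i + 1] - x) / 2 if i < len(xs) - 1 else 0.0
        cells.append((x, left + right))
    return cells

# Dense group near 0 (a "cluster") among sparse background atoms.
atoms = [0.0, 0.1, 0.2, 0.3, 5.0, 10.0, 15.0]
cells = voronoi_lengths_1d(atoms)
threshold = 1.0   # small cell volume = high local concentration
clustered = [x for x, v in cells if v < threshold]
```

Note that the edge atom at 0.3 is missed because its cell is inflated by the adjacent sparse region; handling such boundary effects (and choosing the threshold automatically) is exactly what the paper's method addresses.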

  9. Automated detection of very Low Surface Brightness galaxies in the Virgo Cluster

    Science.gov (United States)

    Prole, D. J.; Davies, J. I.; Keenan, O. C.; Davies, L. J. M.

    2018-04-01

    We report the automatic detection of a new sample of very low surface brightness (LSB) galaxies, likely members of the Virgo cluster. We introduce our new software, DeepScan, which has been designed specifically to detect extended LSB features automatically using the DBSCAN algorithm. We demonstrate the technique by applying it over a 5 deg² portion of the Next-Generation Virgo Survey (NGVS) data to reveal 53 low surface brightness galaxies that are candidate cluster members based on their sizes and colours. 30 of these sources are new detections, despite the region having previously been searched specifically for LSB galaxies. Our final sample contains galaxies with 26.0 ≤ ⟨μe⟩ ≤ 28.5 and 19 ≤ mg ≤ 21, making them some of the faintest known in Virgo. The majority of them have colours consistent with the red sequence, and have a mean stellar mass of 10^6.3 ± 0.5 M⊙ assuming cluster membership. After using ProFit to fit Sérsic profiles to our detections, we find that none of the new sources has an effective radius larger than 1.5 kpc; they therefore do not meet the criteria for ultra-diffuse galaxy (UDG) classification, and we classify them as ultra-faint dwarfs.
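DeepScan builds on the DBSCAN algorithm. A minimal pure-Python DBSCAN (Euclidean distance, toy 2-D points) sketches how density-reachable points are grown into clusters while isolated points are labelled noise:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a label per point; -1 marks noise."""
    def neighbours(i):
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1            # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster   # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:    # core point: expand the cluster
                seeds.extend(jn)
    return labels

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10), (50, 50)]
labels = dbscan(pts, eps=2.0, min_pts=2)
```

The production software adds the astronomy-specific machinery (sky estimation, deblending, photometry); the clustering core is the density-reachability idea above.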

  10. Multi-agent grid system Agent-GRID with dynamic load balancing of cluster nodes

    Science.gov (United States)

    Satymbekov, M. N.; Pak, I. T.; Naizabayeva, L.; Nurzhanov, Ch. A.

    2017-12-01

    This study presents a system designed for automated load balancing that analyses the load of compute nodes and then migrates virtual machines from heavily loaded nodes to less loaded ones. The system increases the performance of cluster nodes and helps in the timely processing of data. The grid system balances the work of the cluster nodes; the relevance of the system lies in applying multi-agent balancing to the solution of such problems.
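The migration policy described can be sketched as a greedy loop (hypothetical node/VM structure, not the paper's multi-agent implementation): repeatedly move the smallest VM from the busiest node to the idlest one while that narrows the load gap:

```python
def rebalance(nodes, max_moves=100):
    """Greedily migrate the smallest VM from the busiest node to the idlest
    node while doing so shrinks the load gap. `nodes` maps node name to a
    list of VM loads; mutates `nodes` and returns the final per-node loads."""
    for _ in range(max_moves):
        loads = {n: sum(vms) for n, vms in nodes.items()}
        busiest = max(loads, key=loads.get)
        idlest = min(loads, key=loads.get)
        if not nodes[busiest]:
            break
        vm = min(nodes[busiest])
        # Migrate only if the gap between the two nodes actually shrinks.
        if loads[busiest] - loads[idlest] <= vm:
            break
        nodes[busiest].remove(vm)
        nodes[idlest].append(vm)
    return {n: sum(vms) for n, vms in nodes.items()}

cluster = {"n1": [8, 4, 2], "n2": [1], "n3": [3]}
final = rebalance(cluster)
```

A real system would also weigh migration cost and use per-agent local decisions rather than this global greedy view.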

  11. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  12. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    International Nuclear Information System (INIS)

    Prieto, Elena; Peñuelas, Iván; Martí-Climent, Josep M; Lecumberri, Pablo; Gómez, Marisol; Pagola, Miguel; Bilbao, Izaskun; Ecay, Margarita

    2012-01-01

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering and non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information about the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on a clinical PET/CT and on a small-animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools. (paper)
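Of the algorithms named, Ridler's method (iterative selection, also known as Ridler-Calvard or ISODATA thresholding) is easy to sketch: start from the global mean and repeatedly move the threshold to the midpoint of the two class means until it stabilizes. A toy version over a flat list of voxel intensities (made-up values):

```python
def ridler_threshold(values, tol=1e-6):
    """Ridler-Calvard iterative selection: start at the global mean and
    repeatedly set the threshold to the midpoint of the two class means."""
    t = sum(values) / len(values)
    while True:
        lo = [v for v in values if v <= t]
        hi = [v for v in values if v > t]
        if not lo or not hi:
            return t
        new_t = (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2
        if abs(new_t - t) < tol:
            return new_t
        t = new_t

# Bimodal "uptake" values: background around 1, hot sphere around 10.
voxels = [1, 1, 2, 2, 1, 10, 11, 9, 10]
t = ridler_threshold(voxels)
```

Real implementations work on the image histogram for speed, but the fixed-point iteration is identical.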

  13. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    Science.gov (United States)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes through automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high-fidelity numerical analysis tools into an automated framework and to apply that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework, with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  14. "Clustering" Documents Automatically to Support Scoping Reviews of Research: A Case Study

    Science.gov (United States)

    Stansfield, Claire; Thomas, James; Kavanagh, Josephine

    2013-01-01

    Background: Scoping reviews of research help determine the feasibility and the resource requirements of conducting a systematic review, and the potential to generate a description of the literature quickly is attractive. Aims: To test the utility and applicability of an automated clustering tool to describe and group research studies to improve…

  15. A phylogenomic gene cluster resource: The phylogenetically inferred groups (PhIGs) database

    Energy Technology Data Exchange (ETDEWEB)

    Dehal, Paramvir S.; Boore, Jeffrey L.

    2005-08-25

    We present here the PhIGs database, a phylogenomic resource for sequenced genomes. Although many methods exist for clustering gene families, very few attempt to create truly orthologous clusters sharing descent from a single ancestral gene across a range of evolutionary depths. Although these non-phylogenetic gene family clusters have been used broadly for gene annotation, errors are known to be introduced by the artifactual association of slowly evolving paralogs and lack of annotation for those more rapidly evolving. A full phylogenetic framework is necessary for accurate inference of function and for many studies that address pattern and mechanism of the evolution of the genome. The automated generation of evolutionary gene clusters, creation of gene trees, determination of orthology and paralogy relationships, and the correlation of this information with gene annotations, expression information, and genomic context is an important resource to the scientific community.

  16. ICARES: a real-time automated detection tool for clusters of infectious diseases in the Netherlands.

    NARCIS (Netherlands)

    Groeneveld, Geert H; Dalhuijsen, Anton; Kara-Zaïtri, Chakib; Hamilton, Bob; de Waal, Margot W; van Dissel, Jaap T; van Steenbergen, Jim E

    2017-01-01

    Clusters of infectious diseases are frequently detected late. Real-time, detailed information about an evolving cluster and possible associated conditions is essential for local policy makers, travelers planning to visit the area, and the local population. This is currently illustrated in the Zika

  17. Automation, communication and cybernetics in science and engineering 2013/2014

    CERN Document Server

    Isenhardt, Ingrid; Hees, Frank; Henning, Klaus

    2014-01-01

    This book continues the tradition of its predecessors “Automation, Communication and Cybernetics in Science and Engineering 2009/2010 and 2011/2012” and includes a representative selection of scientific publications from researchers at the institute cluster IMA/ZLW & IfU.   IMA - Institute of Information Management in Mechanical Engineering
 ZLW - Center for Learning and Knowledge Management
 IfU - Associated Institute for Management Cybernetics e.V.
Faculty of Mechanical Engineering, RWTH Aachen University   The book presents a range of innovative fields of application, including: cognitive systems, cyber-physical production systems, robotics, automation technology, machine learning, natural language processing, data mining, predictive data analytics, visual analytics, innovation and diversity management, demographic models, virtual and remote laboratories, virtual and augmented realities, multimedia learning environments, organizational development and management cybernetics. The contributio...

  18. An Optimized Clustering Approach for Automated Detection of White Matter Lesions in MRI Brain Images

    Directory of Open Access Journals (Sweden)

    M. Anitha

    2012-04-01

    Full Text Available White Matter lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. The WML detection process includes the following stages: 1. image preprocessing; 2. clustering (Fuzzy c-means (FCM) clustering, Geostatistical Possibilistic clustering (GPC) and Geostatistical Fuzzy clustering (GFCM)); and 3. optimization using Particle Swarm Optimization (PSO). The proposed system is tested on a database of 208 MRI images. GFCM yields a high sensitivity of 89%, specificity of 94% and overall accuracy of 93% over FCM and GPC. The clustered brain images are then subjected to PSO. The optimized result obtained from GFCM-PSO provides sensitivity of 90%, specificity of 94% and accuracy of 95%. The detection results reveal that GFCM and GFCM-PSO better localize the large regions of lesions and give a lower false positive rate compared to GPC and GPC-PSO, which capture the largest loads of WMLs only in the upper ventral horns of the brain.
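As an illustration of the fuzzy c-means step of such a pipeline, here is a minimal 1-D implementation with the standard membership and centre updates; the data values are invented, and this is the plain algorithm rather than the paper's geostatistical variant:

```python
def fcm_1d(xs, c=2, m=2.0, iters=100):
    """Fuzzy c-means in 1-D: soft memberships u[i][k] with fuzzifier m."""
    centers = [min(xs), max(xs)]          # simple init at the extremes
    u = []
    for _ in range(iters):
        # Membership update from distances to the current centres.
        u = []
        for x in xs:
            d = [abs(x - ck) + 1e-12 for ck in centers]
            u.append([1.0 / sum((d[k] / d[j]) ** (2 / (m - 1))
                                for j in range(c)) for k in range(c)])
        # Centre update: membership-weighted means.
        centers = [sum((u[i][k] ** m) * xs[i] for i in range(len(xs))) /
                   sum(u[i][k] ** m for i in range(len(xs)))
                   for k in range(c)]
    return centers, u

vals = [1.0, 1.1, 0.9, 5.0, 5.1, 4.9]
centers, u = fcm_1d(vals)
```

Unlike hard k-means, every point keeps a graded membership in every cluster, which is what makes the method tolerant of the low WM/GM contrast described above.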

  19. Improving the driver-automation interaction: an approach using automation uncertainty.

    Science.gov (United States)

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  20. Low surface brightness galaxies in the Fornax Cluster: automated galaxy surface photometry

    International Nuclear Information System (INIS)

    Davies, J.I.; Phillipps, S.; Disney, M.J.

    1988-01-01

    A sample is presented of low surface brightness galaxies (with extrapolated central surface brightness fainter than 22.0 Bμ) in the Fornax Cluster region which has been measured by the APM machine. Photometric parameters, namely profile shape, scale length, central brightness and total magnitude, are derived for the sample galaxies and correlations between the parameters of low surface brightness dwarf galaxies are discussed, with particular reference to the selection limits. Contrary to previous authors we find no evidence for a luminosity-surface brightness correlation in the sense of lower surface brightness galaxies having lower luminosities and scale sizes. In fact, the present data suggest that it is the galaxies with the largest scale lengths which are more likely to be of very low surface brightness. In addition, the larger scale length galaxies occur preferentially towards the centre of the Cluster. (author)

  1. Contribution of peptide backbone to Anti-citrulline-dependent antibody reactivity

    DEFF Research Database (Denmark)

    Trier, Nicole Hartwig; Dam, Catharina; Olsen, Dorthe

    2015-01-01

    for ACPA reactivity and to be cross-reactive between the selected citrullinated peptides. The remaining amino acids within the citrullinated peptides were found to be of less importance for antibody reactivity. Moreover, these findings indicated that the Cit-Gly motif in combination with peptide backbone...... found in up to 70% of RA patients’ sera, have received much attention. Several citrullinated proteins are associated with RA, suggesting that ACPAs may react with different sequence patterns, separating them from traditional antibodies, whose reactivity usually is specific towards a single target...... homology rather than sequence homology are favored between citrullinated epitopes. These findings are important in relation to clarifying the etiology of RA and to determine the nature of ACPAs, e.g. why some Cit-Gly-containing sequences are not targeted by ACPAs....

  2. Clustering of near clusters versus cluster compactness

    International Nuclear Information System (INIS)

    Yu Gao; Yipeng Jing

    1989-01-01

    The clustering properties of near Zwicky clusters are studied by using the two-point angular correlation function. The angular correlation functions for compact and medium compact clusters, for open clusters, and for all near Zwicky clusters are estimated. The results show much stronger clustering for compact and medium compact clusters than for open clusters, and that open clusters have nearly the same clustering strength as galaxies. A detailed study of the compactness-dependence of correlation function strength is worth investigating. (author)

  3. Cluster-cluster clustering

    International Nuclear Information System (INIS)

    Barnes, J.; Dekel, A.; Efstathiou, G.; Frenk, C.S. (Yale Univ., New Haven, CT; California Univ., Santa Barbara; Cambridge Univ., England; Sussex Univ., Brighton, England)

    1985-01-01

    The cluster correlation function ξ_c(r) is compared with the particle correlation function ξ(r) in cosmological N-body simulations with a wide range of initial conditions. The experiments include scale-free initial conditions, pancake models with a coherence length in the initial density field, and hybrid models. Three N-body techniques and two cluster-finding algorithms are used. In scale-free models with white noise initial conditions, ξ_c and ξ are essentially identical. In scale-free models with more power on large scales, it is found that the amplitude of ξ_c increases with cluster richness; in this case the clusters give a biased estimate of the particle correlations. In the pancake and hybrid models (with n = 0 or 1), ξ_c is steeper than ξ, but the cluster correlation length exceeds that of the points by less than a factor of 2, independent of cluster richness. Thus the high amplitude of ξ_c found in studies of rich clusters of galaxies is inconsistent with white noise and pancake models and may indicate a primordial fluctuation spectrum with substantial power on large scales. 30 references

  4. Installing, Running and Maintaining Large Linux Clusters at CERN

    CERN Document Server

    Bahyl, Vladimir; Chardi, Benjamin; van Eldik, Jan; Fuchs, Ulrich; Kleinwort, Thorsten; Murth, Martin; Smith, Tim

    2003-01-01

    Having built up Linux clusters to more than 1000 nodes over the past five years, we already have practical experience confronting some of the LHC scale computing challenges: scalability, automation, hardware diversity, security, and rolling OS upgrades. This paper describes the tools and processes we have implemented, working in close collaboration with the EDG project [1], especially with the WP4 subtask, to improve the manageability of our clusters, in particular in the areas of system installation, configuration, and monitoring. In addition to the purely technical issues, providing shared interactive and batch services which can adapt to meet the diverse and changing requirements of our users is a significant challenge. We describe the developments and tuning that we have introduced on our LSF based systems to maximise both responsiveness to users and overall system utilisation. Finally, this paper will describe the problems we are facing in enlarging our heterogeneous Linux clusters, the progress we have ...

  5. Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix

    Science.gov (United States)

    Rogers, James L.; Korte, John J.; Bilardo, Vincent J.

    2006-01-01

    Much technology assessment and organization design data exists in Microsoft Excel spreadsheets. Tools are needed to put this data into a form that can be used by design managers to make design decisions. One need is to cluster data that is highly coupled. Tools such as the Dependency Structure Matrix (DSM) and a Genetic Algorithm (GA) can be of great benefit. However, no tool currently combines the DSM and a GA to solve the clustering problem. This paper describes a new software tool that interfaces a GA written as an Excel macro with a DSM in spreadsheet format. The results of several test cases are included to demonstrate how well this new tool works.
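The GA-plus-DSM combination can be sketched with a small genetic algorithm in which each chromosome assigns every DSM element to a cluster, and fitness rewards dependencies kept inside a cluster while penalizing unrelated pairs forced together. This is a Python toy (the paper's tool is an Excel macro) with a made-up 6-element DSM:

```python
import random

def ga_cluster_dsm(dsm, n_clusters=2, pop=30, gens=60, seed=1):
    """Tiny GA for DSM clustering: chromosomes assign elements to clusters."""
    rng = random.Random(seed)
    n = len(dsm)

    def fitness(assign):
        # +1 per dependency kept inside a cluster, -1 per unrelated pair
        # forced together (a crude penalty against one giant cluster).
        s = 0
        for i in range(n):
            for j in range(i + 1, n):
                if assign[i] == assign[j]:
                    s += 1 if dsm[i][j] else -1
        return s

    population = [[rng.randrange(n_clusters) for _ in range(n)]
                  for _ in range(pop)]
    best = max(population, key=fitness)
    for _ in range(gens):
        nxt = [best[:]]                        # elitism: keep the best so far
        while len(nxt) < pop:
            a, b = rng.sample(population, 2)   # pick two parents
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.2:             # mutation: reassign one element
                child[rng.randrange(n)] = rng.randrange(n_clusters)
            nxt.append(child)
        population = nxt
        best = max(population, key=fitness)
    return best, fitness(best)

# Hypothetical 6-element DSM with two obvious blocks {0,1,2} and {3,4,5}.
dsm = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 0, 0, 0],
       [0, 0, 0, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
assignment, score = ga_cluster_dsm(dsm)
```

Real DSM objectives weigh coordination cost by cluster size; the ±1 fitness here is just the simplest penalty that makes the one-big-cluster solution unattractive.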

  6. Automated identification of crystallographic ligands using sparse-density representations

    International Nuclear Information System (INIS)

    Carolan, C. G.; Lamzin, V. S.

    2014-01-01

    A novel procedure for identifying ligands in macromolecular crystallographic electron-density maps is introduced. Density clusters in such maps can be rapidly attributed to one of 82 different ligands in an automated manner. A novel procedure for the automatic identification of ligands in macromolecular crystallographic electron-density maps is introduced. It is based on the sparse parameterization of density clusters and the matching of the pseudo-atomic grids thus created to conformationally variant ligands using mathematical descriptors of molecular shape, size and topology. In large-scale tests on experimental data derived from the Protein Data Bank, the procedure could quickly identify the deposited ligand within the top-ranked compounds from a database of candidates. This indicates the suitability of the method for the identification of binding entities in fragment-based drug screening and in model completion in macromolecular structure determination

  7. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available Abstract Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  8. ASAP: an environment for automated preprocessing of sequencing data.

    Science.gov (United States)

    Torstenson, Eric S; Li, Bingshan; Li, Chun

    2013-01-04

    Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results, however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  9. ASAP: an environment for automated preprocessing of sequencing data

    Science.gov (United States)

    2013-01-01

    Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results, however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP. PMID:23289815

  10. Genome cluster database. A sequence family analysis platform for Arabidopsis and rice.

    Science.gov (United States)

    Horan, Kevin; Lauricha, Josh; Bailey-Serres, Julia; Raikhel, Natasha; Girke, Thomas

    2005-05-01

    The genome-wide protein sequences from Arabidopsis (Arabidopsis thaliana) and rice (Oryza sativa) spp. japonica were clustered into families using sequence similarity and domain-based clustering. The two fundamentally different methods resulted in separate cluster sets with complementary properties to compensate the limitations for accurate family analysis. Functional names for the identified families were assigned with an efficient computational approach that uses the description of the most common molecular function gene ontology node within each cluster. Subsequently, multiple alignments and phylogenetic trees were calculated for the assembled families. All clustering results and their underlying sequences were organized in the Web-accessible Genome Cluster Database (http://bioinfo.ucr.edu/projects/GCD) with rich interactive and user-friendly sequence family mining tools to facilitate the analysis of any given family of interest for the plant science community. An automated clustering pipeline ensures current information for future updates in the annotations of the two genomes and clustering improvements. The analysis allowed the first systematic identification of family and singlet proteins present in both organisms as well as those restricted to one of them. In addition, the established Web resources for mining these data provide a road map for future studies of the composition and structure of protein families between the two species.
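Sequence-similarity clustering of the kind described can be sketched as single linkage over a similarity cutoff using union-find; the k-mer Jaccard score below is a crude stand-in for a real alignment-based similarity, and the sequences are invented:

```python
def kmer_similarity(a, b, k=3):
    """Jaccard similarity of k-mer sets — a crude stand-in for an alignment score."""
    ka = {a[i:i + k] for i in range(len(a) - k + 1)}
    kb = {b[i:i + k] for i in range(len(b) - k + 1)}
    return len(ka & kb) / len(ka | kb)

def cluster_sequences(seqs, cutoff=0.3):
    """Single-linkage clustering: union any pair whose similarity passes the cutoff."""
    parent = list(range(len(seqs)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(seqs)):
        for j in range(i + 1, len(seqs)):
            if kmer_similarity(seqs[i], seqs[j]) >= cutoff:
                parent[find(j)] = find(i)   # merge the two families
    return [find(i) for i in range(len(seqs))]

seqs = ["MKVLAAGIT", "MKVLAAGIS", "GGGPPLWWQ", "GGGPPLWWH"]
families = cluster_sequences(seqs)
```

Note the weakness the abstract warns about: pure similarity linkage can chain slowly evolving paralogs into one family, which is why the PhIGs resource layers a phylogenetic framework on top of the initial clusters.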

  11. Validating clustering of molecular dynamics simulations using polymer models

    Directory of Open Access Journals (Sweden)

    Phillips Joshua L

    2011-11-01

    Full Text Available Abstract Background Molecular dynamics (MD simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our
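    The abstract does not reproduce the authors' implementation, but the core idea of spectral clustering can be illustrated with a minimal sketch: build an affinity matrix over items (here, a toy graph standing in for conformational similarity, not MD data), then split on the sign of the Fiedler vector, the eigenvector of the graph Laplacian belonging to the second-smallest eigenvalue.

```python
import numpy as np

def fiedler_bipartition(A):
    """Split a graph into two clusters using the sign of the Fiedler vector
    (eigenvector of the second-smallest eigenvalue of the Laplacian L = D - A)."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    vals, vecs = np.linalg.eigh(L)   # eigenvalues returned in ascending order
    fiedler = vecs[:, 1]
    return (fiedler > 0).astype(int)

# Two tightly connected triangles (nodes 0-2 and 3-5) joined by one weak edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    A[i, j] = A[j, i] = 1.0
A[2, 3] = A[3, 2] = 0.1  # weak bridge between the two communities

labels = fiedler_bipartition(A)
print(labels)  # nodes 0-2 end up in one cluster, 3-5 in the other
```

    In practice the affinity matrix would be computed from pairwise structural distances (e.g. RMSD after alignment), and a k-way extension with more eigenvectors replaces the simple sign split.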

  12. Determination of Autoantibody Isotypes Increases the Sensitivity of Serodiagnostics in Rheumatoid Arthritis

    Directory of Open Access Journals (Sweden)

    Daniela Sieghart

    2018-04-01

    Full Text Available Anti-citrullinated protein antibodies (ACPA) and rheumatoid factor (RF) are the most commonly used diagnostic markers of rheumatoid arthritis (RA). These antibodies are predominantly of the immunoglobulin (Ig) M (RF) or IgG (ACPA) isotype. Other subtypes of both antibodies, particularly IgA isotypes, and other autoantibodies, such as RA33 antibodies, have been repeatedly reported, but their diagnostic value has still not been fully elucidated. Here, we investigated the prevalence of IgA, IgG, and IgM subtypes of RF, ACPA, and RA33 antibodies in patients with RA. To determine diagnostic specificity and sensitivity, sera from 290 RA patients (165 with early and 125 with established disease), 261 disease controls, and 100 healthy subjects were tested for the presence of IgA, IgG, and IgM isotypes of RF, ACPA, and RA33 on the EliA™ platform (Phadia AB, Uppsala, Sweden). The most specific antibodies were IgG-ACPA, IgA-ACPA, and IgG-RF, showing specificities >98%, closely followed by IgG- and IgA-RA33, while IgM subtypes were somewhat less specific, ranging from 95.8% (RA33) to 90% (RF). On the other hand, IgM-RF was the most sensitive subtype (65%), followed by IgG-ACPA (59.5%) and IgA-RF (50.7%). Other subtypes were less sensitive, ranging from 35% (IgA-ACPA) to 6% (IgA-RA33). RA33 antibodies, as well as IgA-RF and IgA-ACPA, were found to increase the diagnostic sensitivity of serological testing since they were also detected in seronegative patients, reducing their number from 109 to 85. Moreover, analyzing IgM-RF by EliA™ proved more sensitive than measuring RF by nephelometry and further reduced the number of seronegative patients to 76 individuals. Importantly, among antibody-positive individuals, RA patients were found to have significantly more antibodies (≥3) than disease controls, which generally showed one or two antibody species. Thus, increasing the number of autoantibodies in serological routine testing provides valuable additional information allowing to better

  13. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods of building an automation plan and designing automation facilities; automation of cutting and chip processes, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; applications of oil pressure (hydraulics), covering characteristics and basic hydraulic circuits; applications of pneumatics; and kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  14. Automated Inventory and Monitoring of the ALICE HLT Cluster Resources with the SysMES Framework

    International Nuclear Information System (INIS)

    Ulrich, J; Lara, C; Böttger, S; Kebschull, U; Haaland, Ø; Röhrich, D

    2012-01-01

    The High-Level-Trigger (HLT) cluster of the ALICE experiment is a computer cluster with about 200 nodes and 20 infrastructure machines. In its current state, the cluster consists of nearly 10 different configurations of nodes in terms of installed hardware, software and network structure. In such a heterogeneous environment with a distributed application, information about the actual configuration of the nodes is needed to automatically distribute and adjust the application accordingly. An inventory database provides a unified interface to such information. To be useful, the data in the inventory has to be up to date, complete and consistent. Manual maintenance of such databases is error-prone and data tends to become outdated. The inventory module of the ALICE HLT cluster overcomes these drawbacks by automatically updating the actual state periodically and, in contrast to existing solutions, it allows the definition of a target state for each node. A target state can simply be a fully operational state, i.e. a state without malfunctions, or a dedicated configuration of the node. The target state is then compared to the actual state to detect deviations and malfunctions which could induce severe problems when running the application. The inventory module of the ALICE HLT cluster has been integrated into the monitoring and management framework SysMES in order to use existing functionality like transactionality and monitoring infrastructure. Additionally, SysMES allows detected problems to be solved automatically via its rule system. To describe the heterogeneous environment with all its specifics, like custom hardware, the inventory module uses an object-oriented model which is based on the Common Information Model. The inventory module provides an automatically updated actual state of the cluster, detects discrepancies between the actual and the target state and is able to solve detected problems automatically. This contribution presents the current implementation.
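    The target-versus-actual comparison described above can be sketched as a simple state diff. The attribute names and flat-dictionary layout below are hypothetical illustrations, not SysMES's actual CIM-based model:

```python
def find_deviations(actual, target):
    """Compare a node's actual state against its target state and report
    attributes that are missing or differ (hypothetical data layout)."""
    deviations = {}
    for key, wanted in target.items():
        got = actual.get(key)
        if got != wanted:
            deviations[key] = {"target": wanted, "actual": got}
    return deviations

# Invented example attributes for one cluster node.
actual = {"kernel": "2.6.32", "eth0_speed": "1G", "raid_status": "degraded"}
target = {"kernel": "2.6.32", "eth0_speed": "10G", "raid_status": "optimal"}

print(find_deviations(actual, target))
```

    In a rule-driven system such as the one described, each reported deviation would then be matched against remediation rules rather than merely printed.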

  15. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  16. Automation and remote handling activities in BARC: an overview

    International Nuclear Information System (INIS)

    Badodkar, D.N.

    2016-01-01

    Division of Remote Handling and Robotics, BARC has been working on design and development of various application specific remote handling and automation systems for nuclear front-end and back-end fuel cycle technologies. The Division is also engaged in pre-service and in-service inspection of coolant channels for Pressurized Heavy Water Reactors in India. Design and development of Reactor Control Mechanisms for Nuclear Research and Power Reactors (PHWRs and Compact LWRs) is another important activity carried out in this division. Robotic systems for indoor and outdoor surveillance in and around nuclear installations have also been developed. A line scan camera based system has been developed for measuring individual PHWR fuel pellet lengths as well as stack length. An industrial robot is used for autonomous exchange of pellets to achieve the desired stack length. The system can be extended to active fuel pellets as well. An automation system has been conceptualized for remote handling and transfer of spent fuel bundles from the storage pool directly to the chopper unit of the reprocessing plant. In the case of the Advanced Heavy Water Reactor, which uses mixed oxides of (Th-Pu) and (Th-²³³U) as fuel, an automation system for the front-end fuel cycle has been designed, which includes powder processing and pressing; pellet handling and inspection; pin handling and inspection; and cluster assembly and dis-assembly in shielded facilities. System demonstration through a full-scale mock-up facility is nearing completion. The above talk is presented on behalf of all the officers and staff of DRHR. The talk is mainly focused on development of an automated fuel fabrication facility for mixed oxides of (Th-Pu)/(Th-²³³U) fuel pins. An overview of the division's ongoing activities in the field of remote handling and automation is also covered. (author)

  17. FLOCK cluster analysis of mast cell event clustering by high-sensitivity flow cytometry predicts systemic mastocytosis.

    Science.gov (United States)

    Dorfman, David M; LaPlante, Charlotte D; Pozdnyakova, Olga; Li, Betty

    2015-11-01

    In our high-sensitivity flow cytometric approach for systemic mastocytosis (SM), we identified mast cell event clustering as a new diagnostic criterion for the disease. To objectively characterize mast cell gated event distributions, we performed cluster analysis using FLOCK, a computational approach to identify cell subsets in multidimensional flow cytometry data in an unbiased, automated fashion. FLOCK identified discrete mast cell populations in most cases of SM (56/75 [75%]) but only a minority of non-SM cases (17/124 [14%]). FLOCK-identified mast cell populations accounted for 2.46% of total cells on average in SM cases and 0.09% of total cells on average in non-SM cases (P < .0001) and were predictive of SM, with a sensitivity of 75%, a specificity of 86%, a positive predictive value of 76%, and a negative predictive value of 85%. FLOCK analysis provides useful diagnostic information for evaluating patients with suspected SM, and may be useful for the analysis of other hematopoietic neoplasms. Copyright© by the American Society for Clinical Pathology.
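    The reported performance figures can be re-derived from the counts given in the abstract (discrete mast cell populations in 56 of 75 SM cases and 17 of 124 non-SM cases). This is a small sketch of standard 2x2 diagnostic-test arithmetic, not part of FLOCK itself; the printed values closely reproduce the reported ones up to rounding:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard 2x2 diagnostic-test metrics from confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Counts from the abstract: 56/75 SM cases positive, 17/124 non-SM positive.
m = diagnostic_metrics(tp=56, fn=75 - 56, fp=17, tn=124 - 17)
for name, value in m.items():
    print(f"{name}: {value:.0%}")
```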

  18. MOLA: a bootable, self-configuring system for virtual screening using AutoDock4/Vina on computer clusters

    Directory of Open Access Journals (Sweden)

    Abreu Rui MV

    2010-10-01

    Full Text Available Abstract Background Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large scale virtual screening is time demanding and usually requires dedicated computer clusters. There are a number of software tools that perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters. Also, no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina in bootable non-dedicated computer clusters. Implementation MOLA automates several tasks including: ligand preparation, parallel AutoDock4/Vina job distribution and result analysis. When the virtual screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All result files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized Live CD GNU/Linux operating system, developed by us, that bypasses the operating system originally installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters the other computers as slave nodes via Ethernet connections. Conclusion MOLA is an ideal virtual screening tool for non-experienced users, with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can just be restarted to their original operating system. The originality of MOLA lies in the fact that any platform-independent computer available can be added to the cluster, without ever using the computer's hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a

  19. Automated Analysis of Flow Cytometry Data to Reduce Inter-Lab Variation in the Detection of Major Histocompatibility Complex Multimer-Binding T Cells

    DEFF Research Database (Denmark)

    Pedersen, Natasja Wulff; Chandran, P. Anoop; Qian, Yu

    2017-01-01

    Manual analysis of flow cytometry data and subjective gate-border decisions taken by individuals continue to be a source of variation in the assessment of antigen-specific T cells when comparing data across laboratories, and also over time in individual labs. Therefore, strategies to provide...... automated analysis of major histocompatibility complex (MHC) multimer-binding T cells represent an attractive solution to decrease subjectivity and technical variation. The challenge of using an automated analysis approach is that MHC multimer-binding T cell populations are often rare and therefore...... laboratories. We used three different methods, FLOw Clustering without K (FLOCK), Scalable Weighted Iterative Flow-clustering Technique (SWIFT), and ReFlow to analyze flow cytometry data files from 28 laboratories. Each laboratory screened for antigen-responsive T cell populations with frequency ranging from 0...

  20. Extraction of the number of peroxisomes in yeast cells by automated image analysis.

    Science.gov (United States)

    Niemistö, Antti; Selinummi, Jyrki; Saleem, Ramsey; Shmulevich, Ilya; Aitchison, John; Yli-Harja, Olli

    2006-01-01

    An automated image analysis method for extracting the number of peroxisomes in yeast cells is presented. Two images of the cell population are required for the method: a bright field microscope image from which the yeast cells are detected and the respective fluorescent image from which the number of peroxisomes in each cell is found. The segmentation of the cells is based on clustering the local mean-variance space. The watershed transformation is thereafter employed to separate cells that are clustered together. The peroxisomes are detected by thresholding the fluorescent image. The method is tested with several images of a budding yeast Saccharomyces cerevisiae population, and the results are compared with manually obtained results.
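    The final counting step (threshold the fluorescent image, then count connected bright regions) can be illustrated with a minimal 4-connected component count. The toy image and threshold below are invented stand-ins for a fluorescent micrograph, not the paper's data:

```python
def count_spots(image, threshold):
    """Count 4-connected bright regions (peroxisome-like spots) in a grayscale
    image by thresholding, then flood-filling each connected component."""
    h, w = len(image), len(image[0])
    mask = [[image[r][c] > threshold for c in range(w)] for r in range(h)]
    count = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:          # flood fill to erase this component
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x]:
                        mask[y][x] = False
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Toy fluorescent image: two bright spots on a dark background.
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 0, 0, 8, 0],
    [0, 0, 0, 0, 8, 0],
]
print(count_spots(img, threshold=5))  # → 2
```

    The per-cell counts in the paper additionally require restricting this count to each segmented cell region from the bright-field image.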

  1. Nucleus and cytoplasm segmentation in microscopic images using K-means clustering and region growing.

    Science.gov (United States)

    Sarrafzadeh, Omid; Dehnavi, Alireza Mehri

    2015-01-01

    Segmentation of leukocytes acts as the foundation for all automated image-based hematological disease recognition systems. Most of the time, hematologists are interested in evaluation of white blood cells only. Digital image processing techniques can help them in their analysis and diagnosis. The main objective of this paper is to detect leukocytes from a blood smear microscopic image and segment them into their two dominant elements, nucleus and cytoplasm. The segmentation is conducted using two stages of applying K-means clustering. First, the nuclei are segmented using K-means clustering. Then, a proposed method based on region growing is applied to separate the connected nuclei. Next, the nuclei are subtracted from the original image. Finally, the cytoplasm is segmented using the second stage of K-means clustering. The results indicate that the proposed method is able to extract the nucleus and cytoplasm regions accurately and works well even when there is no significant contrast between the components in the image. In this paper, a method based on K-means clustering and region growing is proposed in order to detect leukocytes from a blood smear microscopic image and segment its components, the nucleus and the cytoplasm. As the region growing step of the algorithm relies on edge information, it will not be able to separate connected nuclei accurately when edges are poor; it requires at least a weak edge to exist between the nuclei. The nucleus and cytoplasm segments of a leukocyte can be used for feature extraction and classification, which leads to automated leukemia detection.
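    The K-means clustering step can be sketched in its simplest form: 1-D K-means over pixel intensities with fixed initial centers. The intensity values and centers below are invented for illustration; the paper's actual feature space and color handling differ:

```python
def kmeans_1d(values, centers, iters=20):
    """Minimal 1-D K-means: assign each value to the nearest center, then
    recompute centers as cluster means (fixed initial centers, no randomness)."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Toy intensities: dark nucleus pixels, mid-gray cytoplasm, bright background.
pixels = [10, 12, 14, 90, 95, 100, 200, 210, 220]
centers, clusters = kmeans_1d(pixels, centers=[0, 128, 255])
print(centers)  # converged cluster centers for nucleus/cytoplasm/background
```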

  2. Automated Extraction of 3D Trees from Mobile LiDAR Point Clouds

    Directory of Open Access Journals (Sweden)

    Y. Yu

    2014-06-01

    Full Text Available This paper presents an automated algorithm for extracting 3D trees directly from 3D mobile light detection and ranging (LiDAR) data. To reduce both computational and spatial complexities, ground points are first filtered out from a raw 3D point cloud via block-based elevation filtering. Off-ground points are then grouped into clusters representing individual objects through Euclidean distance clustering and voxel-based normalized cut segmentation. Finally, a model-driven method is proposed to achieve the extraction of 3D trees based on a pairwise 3D shape descriptor. The proposed algorithm is tested using a set of mobile LiDAR point clouds acquired by a RIEGL VMX-450 system. The results demonstrate the feasibility and effectiveness of the proposed algorithm.
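    The Euclidean distance clustering step that groups off-ground points into candidate objects can be sketched in 2-D (the real algorithm works on 3-D point clouds; the radius and points here are arbitrary illustrations):

```python
def euclidean_cluster(points, radius):
    """Group points so that any two points within `radius` of each other
    (transitively) share a cluster -- Euclidean distance clustering in 2-D."""
    n = len(points)
    labels = [-1] * n
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        labels[i] = current
        frontier = [i]
        while frontier:                      # region-grow the current cluster
            p = frontier.pop()
            px, py = points[p]
            for j in range(n):
                if labels[j] == -1:
                    qx, qy = points[j]
                    if (px - qx) ** 2 + (py - qy) ** 2 <= radius ** 2:
                        labels[j] = current
                        frontier.append(j)
        current += 1
    return labels

# Two well-separated groups of off-ground points.
pts = [(0, 0), (0.5, 0.2), (1.0, 0.1), (10, 10), (10.4, 9.8)]
print(euclidean_cluster(pts, radius=1.0))  # → [0, 0, 0, 1, 1]
```

    Production implementations use a k-d tree for the neighbor search instead of the O(n²) scan shown here.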

  3. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present the design and implementation of a real-time, software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  4. The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data

    Science.gov (United States)

    Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.

    1998-12-01

    We present a new, objectively selected sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.

  5. Cluster-based analysis of multi-model climate ensembles

    Science.gov (United States)

    Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.

    2018-06-01

    Clustering - the automated grouping of similar data - can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of the application of advanced clustering techniques to climate data (both model output and observations) has yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone - an important greenhouse gas - from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ~20% in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ~62% of all locations, with the largest bias reductions occurring in the Northern Hemisphere - where ozone concentrations are relatively large. However, the bias is unchanged at 9% of all locations and increases at 29%, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations. We further demonstrate that clustering can provide a viable and
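    The idea of averaging only the most-populous cluster at each grid point, rather than all models, can be sketched in 1-D. The simple gap-based grouping below is a stand-in for the DDC algorithm, and the ozone values are invented for illustration:

```python
def populous_cluster_mean(values, tol):
    """At one grid point, group sorted model values whose successive gaps stay
    within `tol`, then average only the most-populous group (a simplified,
    1-D stand-in for cluster-based multi-model-mean subsampling)."""
    values = sorted(values)
    clusters, current = [], [values[0]]
    for v in values[1:]:
        if v - current[-1] <= tol:   # extend the running cluster
            current.append(v)
        else:
            clusters.append(current)
            current = [v]
    clusters.append(current)
    biggest = max(clusters, key=len)
    return sum(biggest) / len(biggest)

# Five models' tropospheric column ozone at one location (hypothetical, DU):
models = [29.0, 30.0, 30.5, 31.0, 45.0]        # one clear outlier model
print(populous_cluster_mean(models, tol=2.0))  # mean of the main group
print(sum(models) / len(models))               # plain all-model mean
```

    As the abstract notes, discarding the outlier can either reduce or increase the bias against observations, depending on whether the outlying model happens to be closer to the observed value.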

  6. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decreased rate of the working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant’s information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  7. A hybrid clustering approach to recognition of protein families in 114 microbial genomes

    Directory of Open Access Journals (Sweden)

    Gogarten J Peter

    2004-04-01

    Full Text Available Abstract Background Grouping proteins into sequence-based clusters is a fundamental step in many bioinformatic analyses (e.g., homology-based prediction of structure or function. Standard clustering methods such as single-linkage clustering capture a history of cluster topologies as a function of threshold, but in practice their usefulness is limited because unrelated sequences join clusters before biologically meaningful families are fully constituted, e.g. as the result of matches to so-called promiscuous domains. Use of the Markov Cluster algorithm avoids this non-specificity, but does not preserve topological or threshold information about protein families. Results We describe a hybrid approach to sequence-based clustering of proteins that combines the advantages of standard and Markov clustering. We have implemented this hybrid approach over a relational database environment, and describe its application to clustering a large subset of PDB, and to 328577 proteins from 114 fully sequenced microbial genomes. To demonstrate utility with difficult problems, we show that hybrid clustering allows us to constitute the paralogous family of ATP synthase F1 rotary motor subunits into a single, biologically interpretable hierarchical grouping that was not accessible using either single-linkage or Markov clustering alone. We describe validation of this method by hybrid clustering of PDB and mapping SCOP families and domains onto the resulting clusters. Conclusion Hybrid (Markov followed by single-linkage clustering combines the advantages of the Markov Cluster algorithm (avoidance of non-specific clusters resulting from matches to promiscuous domains and single-linkage clustering (preservation of topological information as a function of threshold. Within the individual Markov clusters, single-linkage clustering is a more-precise instrument, discerning sub-clusters of biological relevance. Our hybrid approach thus provides a computationally efficient
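    The hybrid scheme (coarse Markov-style clusters refined by single-linkage clustering) can be sketched as follows. The 1-D "distances" stand in for sequence dissimilarities and are not the paper's data; the Markov clustering step itself is assumed already done and is represented here only by a precomputed coarse partition:

```python
def single_linkage(items, dist, threshold):
    """Single-linkage clustering at a fixed threshold: two groups merge
    whenever some pair of members between them is closer than `threshold`."""
    groups = [[x] for x in items]
    merged = True
    while merged:
        merged = False
        for a in range(len(groups)):
            for b in range(a + 1, len(groups)):
                if any(dist(x, y) < threshold
                       for x in groups[a] for y in groups[b]):
                    groups[a] += groups.pop(b)
                    merged = True
                    break
            if merged:
                break
    return groups

def hybrid_cluster(coarse_clusters, dist, threshold):
    """Refine each coarse (e.g. Markov-derived) cluster with single linkage,
    recovering sub-cluster structure the coarse step does not expose."""
    return [sub for coarse in coarse_clusters
            for sub in single_linkage(coarse, dist, threshold)]

# Toy 1-D "sequence distances"; one coarse cluster hides two sub-families.
coarse = [[1.0, 1.2, 5.0, 5.3], [20.0, 20.5]]
subs = hybrid_cluster(coarse, dist=lambda x, y: abs(x - y), threshold=1.0)
print(subs)  # → [[1.0, 1.2], [5.0, 5.3], [20.0, 20.5]]
```

    Sweeping `threshold` within each coarse cluster recovers the topological (dendrogram-like) information that Markov clustering alone discards.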

  8. Automated Interpretation of Blood Culture Gram Stains by Use of a Deep Convolutional Neural Network.

    Science.gov (United States)

    Smith, Kenneth P; Kang, Anthony D; Kirby, James E

    2018-03-01

    Microscopic interpretation of stained smears is one of the most operator-dependent and time-intensive activities in the clinical microbiology laboratory. Here, we investigated application of an automated image acquisition and convolutional neural network (CNN)-based approach for automated Gram stain classification. Using an automated microscopy platform, uncoverslipped slides were scanned with a 40× dry objective, generating images of sufficient resolution for interpretation. We collected 25,488 images from positive blood culture Gram stains prepared during routine clinical workup. These images were used to generate 100,213 crops containing Gram-positive cocci in clusters, Gram-positive cocci in chains/pairs, Gram-negative rods, or background (no cells). These categories were targeted for proof-of-concept development as they are associated with the majority of bloodstream infections. Our CNN model achieved a classification accuracy of 94.9% on a test set of image crops. Receiver operating characteristic (ROC) curve analysis indicated a robust ability to differentiate between categories with an area under the curve of >0.98 for each. After training and validation, we applied the classification algorithm to new images collected from 189 whole slides without human intervention. Sensitivity and specificity were 98.4% and 75.0% for Gram-positive cocci in chains and pairs, 93.2% and 97.2% for Gram-positive cocci in clusters, and 96.3% and 98.1% for Gram-negative rods. Taken together, our data support a proof of concept for a fully automated classification methodology for blood-culture Gram stains. Importantly, the algorithm was highly adept at identifying image crops with organisms and could be used to present prescreened, classified crops to technologists to accelerate smear review. This concept could potentially be extended to all Gram stain interpretive activities in the clinical laboratory. Copyright © 2018 American Society for Microbiology.

  9. The use of synthetic peptides for detection of anti-citrullinated protein antibodies in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Trier, Nicole Hartwig; Holm, Bettina Eide; Heiden, Julie

    2018-01-01

    Rheumatoid arthritis (RA) is an autoimmune disease of unknown etiology. A characteristic feature of RA is the presence of anti-citrullinated protein antibodies (ACPA). Since ACPAs are highly specific for RA and are often present before the onset of RA symptoms, they have become valuable diagnostic...

  10. plantiSMASH: automated identification, annotation and expression analysis of plant biosynthetic gene clusters

    DEFF Research Database (Denmark)

    Kautsar, Satria A.; Suarez Duran, Hernando G.; Blin, Kai

    2017-01-01

    exploration of the nature and dynamics of gene clustering in plant metabolism. Moreover, spurred by the continuing decrease in costs of plant genome sequencing, they will allow genome mining technologies to be applied to plant natural product discovery. The plantiSMASH web server, precalculated results...

  11. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  12. A Container Horizontal Positioning Method with Image Sensors for Cranes in Automated Container Terminals

    Directory of Open Access Journals (Sweden)

    FU Yonghua

    2014-03-01

    Full Text Available Automation is a trend at large container terminals nowadays, and container positioning techniques are a key factor in the automation process. Vision-based positioning techniques are inexpensive and fairly accurate in nature, although their performance under insufficient illumination remains in question. This paper proposes a vision-based procedure using image sensors to determine the position of a container in the horizontal plane. The points found by the edge detection operator are clustered, and only the peak points in the parameter space of the Hough transform are selected, so that the effect of noise is greatly reduced. The effectiveness of the procedure is verified in experiments, in which its efficiency is also investigated.
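
    The peak-selection step described above can be sketched with a toy Hough accumulator. This is a minimal numpy illustration with made-up edge points, not the authors' implementation: each edge point votes for every (rho, theta) line it could lie on, and only the accumulator peak is kept.

```python
import numpy as np

def hough_line_peak(points, thetas=np.deg2rad(np.arange(0.0, 180.0))):
    """Vote in (rho, theta) space for every edge point and return the
    peak cell, i.e. the strongest straight-line candidate."""
    pts = np.asarray(points, dtype=float)
    # rho = x*cos(theta) + y*sin(theta), quantized to integer bins
    rho = np.round(pts[:, 0, None] * np.cos(thetas) +
                   pts[:, 1, None] * np.sin(thetas)).astype(int)
    offset = rho.min()
    acc = np.zeros((rho.max() - offset + 1, len(thetas)), dtype=int)
    cols = np.arange(len(thetas))
    for votes in rho:                 # one vote per point per theta bin
        acc[votes - offset, cols] += 1
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return int(r_idx + offset), float(thetas[t_idx])

# Clustered edge points on a vertical container edge x = 5, plus one noise point
edge_points = [(5, y) for y in range(10)] + [(2, 7)]
rho, theta = hough_line_peak(edge_points)
```

    Keeping only the peak means the single noise point cannot outvote the ten collinear edge points, which is the noise-suppression effect the abstract describes.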

  13. Automated segmentation of white matter fiber bundles using diffusion tensor imaging data and a new density based clustering algorithm.

    Science.gov (United States)

    Kamali, Tahereh; Stashuk, Daniel

    2016-10-01

    Robust and accurate segmentation of brain white matter (WM) fiber bundles assists in diagnosing and assessing progression or remission of neuropsychiatric diseases such as schizophrenia, autism and depression. Supervised segmentation methods are infeasible in most applications since generating gold standards is too costly. Hence, there is a growing interest in designing unsupervised methods. However, most conventional unsupervised methods require the number of clusters to be known in advance, which is not possible in most applications. The purpose of this study is to design an unsupervised segmentation algorithm for brain white matter fiber bundles which can automatically segment fiber bundles using intrinsic diffusion tensor imaging data information, without any prior information or assumptions about the data distribution. Here, a new density-based clustering algorithm called neighborhood distance entropy consistency (NDEC) is proposed, which discovers natural clusters within data by simultaneously utilizing both local and global density information. The performance of NDEC is compared with other state-of-the-art clustering algorithms, including chameleon, spectral clustering, DBSCAN and k-means, using Johns Hopkins University publicly available diffusion tensor imaging data. The performance of NDEC and the other clustering algorithms was evaluated using the dice ratio as an external evaluation criterion and the density-based clustering validation (DBCV) index as an internal evaluation metric. Across all employed clustering algorithms, NDEC obtained the highest average dice ratio (0.94) and DBCV value (0.71). NDEC can find clusters with arbitrary shapes and densities and consequently can be used for WM fiber bundle segmentation, where there is no distinct boundary between various bundles. NDEC may also be used as an effective tool in other pattern recognition and medical diagnostic systems in which discovering natural clusters within data is a necessity.
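
    The external evaluation metric used here, the dice ratio, is simple to state: twice the overlap of two segmentations divided by their total size. A minimal sketch with hypothetical voxel sets (not the study's data):

```python
def dice_ratio(seg_a, seg_b):
    """Dice similarity between two voxel-label sets: 2|A∩B| / (|A| + |B|)."""
    a, b = set(seg_a), set(seg_b)
    return 2 * len(a & b) / (len(a) + len(b))

# Toy "fiber bundle" segmentations as voxel index sets
auto = {(0, 0), (0, 1), (1, 0), (1, 1)}
manual = {(0, 1), (1, 0), (1, 1), (2, 1)}
score = dice_ratio(auto, manual)  # 2*3 / (4+4) = 0.75
```

    A value of 1.0 means perfect agreement with the reference segmentation; 0.94, as reported for NDEC, indicates near-complete overlap.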

  14. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  15. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
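
    As an illustration of one of the listed data-reduction steps, here is a minimal PCA via SVD on synthetic, strongly correlated "spectral bin" variables. This is not Automics code; the data and names are invented for the sketch.

```python
import numpy as np

def pca(X, n_components):
    """Center the data and project it onto its top principal components,
    computed via singular value decomposition."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained_var = s ** 2 / (X.shape[0] - 1)  # variance per component
    return Xc @ Vt[:n_components].T, explained_var

# Synthetic "spectral bins": two strongly correlated variables
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2.0 * t + 0.1 * rng.normal(size=(200, 1))])
scores, var = pca(X, 1)
```

    For metabonomics data, the same projection is applied to hundreds of binned spectral intensities rather than two columns; the first few components then summarize most of the between-sample variation.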

  16. HLA-DRB1 Analysis Identified a Genetically Unique Subset within Rheumatoid Arthritis and Distinct Genetic Background of Rheumatoid Factor Levels from Anticyclic Citrullinated Peptide Antibodies.

    Science.gov (United States)

    Hiwa, Ryosuke; Ikari, Katsunori; Ohmura, Koichiro; Nakabo, Shuichiro; Matsuo, Keitaro; Saji, Hiroh; Yurugi, Kimiko; Miura, Yasuo; Maekawa, Taira; Taniguchi, Atsuo; Yamanaka, Hisashi; Matsuda, Fumihiko; Mimori, Tsuneyo; Terao, Chikashi

    2018-04-01

    HLA-DRB1 is the most important locus associated with rheumatoid arthritis (RA) and anticitrullinated protein antibodies (ACPA). However, fluctuations of rheumatoid factor (RF) over the disease course have made it difficult to define fine subgroups according to consistent RF positivity for the analyses of genetic background and the levels of RF. A total of 2873 patients with RA and 2008 healthy controls were recruited. We genotyped HLA-DRB1 alleles for the participants and collected consecutive RF data for the case subjects. In addition to RF+ and RF- subsets, we classified the RF+ subjects into group 1 (constant RF+) and group 2 (seroconversion). We compared HLA-DRB1 alleles between the RA subsets and controls and performed linear regression analysis to identify HLA-DRB1 alleles associated with maximal RF levels. Omnibus tests were conducted to assess important amino acid positions. RF positivity was 88%, and 1372 and 970 RF+ subjects were classified into groups 1 and 2, respectively. The RF+ and RF- subsets showed genetic associations similar to those of ACPA+ and ACPA- RA, respectively. We found that the shared epitope (SE) was more enriched in group 2 than in group 1 (p = 2.0 × 10^-5), and that amino acid position 11 showed a significant association between groups 1 and 2 (p = 2.7 × 10^-5). These associations were independent of ACPA positivity. SE showed a tendency to be negatively correlated with RF titer (p = 0.012). HLA-DRB1*09:01, which reduces ACPA titer, was not associated with RF levels (p = 0.70). The seroconversion group was shown to have distinct genetic characteristics. The genetic architecture of RF levels is different from that of ACPA.

  17. AUTOMATED CELL SEGMENTATION WITH 3D FLUORESCENCE MICROSCOPY IMAGES.

    Science.gov (United States)

    Kong, Jun; Wang, Fusheng; Teodoro, George; Liang, Yanhui; Zhu, Yangyang; Tucker-Burden, Carol; Brat, Daniel J

    2015-04-01

    A large number of cell-oriented cancer investigations require an effective and reliable cell segmentation method on three dimensional (3D) fluorescence microscopic images for quantitative analysis of cell biological properties. In this paper, we present a fully automated cell segmentation method that can detect cells from 3D fluorescence microscopic images. Informed by fluorescence imaging techniques, we regulated the image gradient field by gradient vector flow (GVF) with an interpolated and smoothed data volume, and grouped voxels based on gradient modes identified by tracking the GVF field. Adaptive thresholding was then applied to voxels associated with the same gradient mode, where voxel intensities were enhanced by a multiscale cell filter. We applied the method to a large volume of 3D fluorescence imaging data of human brain tumor cells, achieving (1) low false detection and miss rates for individual cells, and (2) few over- and under-segmentation incidents for clustered cells. Additionally, the concordance of cell morphometry structure between automated and manual segmentation was encouraging. These results suggest a promising 3D cell segmentation method applicable to cancer studies.
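
    The key idea of thresholding each gradient-mode group separately, rather than the whole volume at once, can be sketched as follows. This is an illustrative simplification with invented intensities, not the paper's method: here each group's threshold is simply its own mean.

```python
import numpy as np

def adaptive_threshold_by_group(intensities, labels):
    """Threshold voxels separately within each gradient-mode group:
    a voxel is kept if it exceeds its own group's mean intensity."""
    intensities = np.asarray(intensities, dtype=float)
    labels = np.asarray(labels)
    mask = np.zeros(intensities.shape, dtype=bool)
    for g in np.unique(labels):
        in_group = labels == g
        mask[in_group] = intensities[in_group] > intensities[in_group].mean()
    return mask

# Two "cells": a dim one and a bright one. A single global threshold at the
# overall mean (115.5) would discard the dim cell entirely.
intens = np.array([10, 12, 30, 32, 100, 120, 300, 320], dtype=float)
labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
mask = adaptive_threshold_by_group(intens, labels)
```

    Per-group thresholds let both the dim and the bright cell keep their brighter voxels, which is why adaptive thresholding tolerates intensity variation across a volume.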

  18. Preferential decrease in IgG4 anti-citrullinated protein antibodies during treatment with tumour necrosis factor blocking agents in patients with rheumatoid arthritis

    NARCIS (Netherlands)

    Bos, W.H.; Bartelds, G.M.; Vis, M.; van der Horst, A.R.; Wolbink, G.J.; van de Stadt, R.J.; van Schaardenburg, D.; Dijkmans, B.A.C.; Lems, W.F.; Nurmohamed, M.T.; Aarden, L.; Hamann, D.

    2009-01-01

    Objective: To investigate the dynamics of IgG1 and IgG4 anti-citrullinated protein antibody (ACPA) subclasses during anti-tumour necrosis factor (TNF) treatment in patients with rheumatoid arthritis (RA). Methods: IgG, IgG1 and IgG4 ACPA levels were determined by ELISA on anti-citrullinated

  19. Preferential decrease in IgG4 anti-citrullinated protein antibodies during treatment with tumour necrosis factor blocking agents in patients with rheumatoid arthritis

    NARCIS (Netherlands)

    Bos, W.H.; Bartelds, G.M.; Vis, M.; Horst, A.; Wolbink, G.; van de Stadt, R.J.; van Schaardenburg, D.; Dijkmans, B.A.C.; Lems, W.F.; Nurmohamed, M.T.; Aarden, L.; Hamann, D.

    2009-01-01

    Objective: To investigate the dynamics of IgG1 and IgG4 anti-citrullinated protein antibody (ACPA) subclasses during anti-tumour necrosis factor (TNF) treatment in patients with rheumatoid arthritis (RA). Methods: IgG, IgG1 and IgG4 ACPA levels were determined by ELISA on anti-citrullinated

  20. Automated Micro-Object Detection for Mobile Diagnostics Using Lens-Free Imaging Technology

    Directory of Open Access Journals (Sweden)

    Mohendra Roy

    2016-05-01

    Full Text Available Lens-free imaging technology has been extensively used recently for microparticle and biological cell analysis because of its high throughput, low cost, and simple and compact arrangement. However, this technology still lacks a dedicated and automated detection system. In this paper, we describe a custom-developed automated micro-object detection method for a lens-free imaging system. In our previous work (Roy et al.), we developed a lens-free imaging system using low-cost components. This system was used to generate and capture the diffraction patterns of micro-objects, and a global threshold was used to locate the diffraction patterns. In this work, we used the same setup to develop an improved automated detection and analysis algorithm based on adaptive thresholding and clustering of signals. For this purpose, images from the lens-free system were used to understand the features and characteristics of the diffraction patterns of several types of samples. On the basis of this information, we custom-developed an automated algorithm for the lens-free imaging system. Next, all the lens-free images were processed using this custom-developed automated algorithm. The performance of this approach was evaluated by comparing the counting results with standard optical microscope results. We evaluated the counting results for polystyrene microbeads, red blood cells, and HepG2, HeLa, and MCF7 cells. The comparison shows good agreement between the systems, with a correlation coefficient of 0.91 and a linearity slope of 0.877. We also evaluated the automated size profiles of the microparticle samples. This Wi-Fi-enabled lens-free imaging system, along with the dedicated software, possesses great potential for telemedicine applications in resource-limited settings.
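
    The two agreement figures quoted above (correlation coefficient and linearity slope) come from comparing paired counts. A minimal sketch of how such figures are computed, using hypothetical counts rather than the paper's data:

```python
def pearson_and_slope(x, y):
    """Pearson correlation and least-squares slope between automated
    counts (x) and reference microscope counts (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5, sxy / sxx

# Hypothetical paired counts: lens-free system vs optical microscope
auto = [100, 150, 210, 260, 330]
scope = [95, 160, 200, 270, 320]
r, slope = pearson_and_slope(auto, scope)
```

    A slope near 1 with high correlation indicates the automated counter tracks the reference counts proportionally, not just monotonically.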

  1. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises a description of what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  2. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    Science.gov (United States)

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to pass multiple command line options, including vectors of values in the usual R format, easily into R. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or a local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate the results of multiple processes run on a cluster.
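
    batch is an R package, but the pass-in-parameters pattern it describes is language-agnostic. As a rough Python analogue (the parameter names here are invented for illustration), argparse lets one script be rerun with any subset of its parameters overridden:

```python
import argparse

def parse_run_args(argv):
    """Parse simulation parameters from the command line, batch-style:
    every rerun of the same script can override any subset of them."""
    parser = argparse.ArgumentParser(description="toy simulation driver")
    parser.add_argument("--n", type=int, default=100, help="sample size")
    parser.add_argument("--seeds", type=int, nargs="+", default=[1],
                        help="one seed per parallel task")
    parser.add_argument("--data", default="sim.csv", help="input dataset")
    return parser.parse_args(argv)

# One rerun overriding the sample size and supplying a vector of seeds
args = parse_run_args(["--n", "500", "--seeds", "1", "2", "3"])
```

    The vector-valued `--seeds` option mirrors batch's ability to pass R-style vectors on the command line, one element per parallel task.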

  3. Cluster Flow: A user-friendly bioinformatics workflow tool [version 1; referees: 3 approved

    Directory of Open Access Journals (Sweden)

    Philip Ewels

    2016-12-01

    Full Text Available Pipeline tools are becoming increasingly important within the field of bioinformatics. Using a pipeline manager to manage and run workflows comprised of multiple tools reduces workload and makes analysis results more reproducible. Existing tools require significant work to install and get running, typically needing pipeline scripts to be written from scratch before running any analysis. We present Cluster Flow, a simple and flexible bioinformatics pipeline tool designed to be quick and easy to install. Cluster Flow comes with 40 modules for common NGS processing steps, ready to work out of the box. Pipelines are assembled using these modules with a simple syntax that can be easily modified as required. Core helper functions automate many common NGS procedures, making running pipelines simple. Cluster Flow is available under the GNU GPLv3 license on GitHub. Documentation, examples and an online demo are available at http://clusterflow.io.

  4. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    International Nuclear Information System (INIS)

    Lee, Myung Eun; Kim, Jong Hyo; Woo, Bo Yeong; Ko, Micheal D.; Jamshidi, Neema

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to the contrast-enhancing lesion, necrotic portions, and the non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and the Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability. Most features showed good NDR (≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics
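
    The stability criterion above relies on the intra-class correlation coefficient between raters. One common form, ICC(3,1) (two-way mixed, single-measure consistency, after Shrout & Fleiss), can be computed from first principles; the feature values below are invented for illustration and are not from the study.

```python
def icc_3_1(ratings):
    """Two-way mixed, single-measure consistency ICC(3,1).

    ratings: list of rows, one row per subject (tumor), one column per rater.
    """
    n, k = len(ratings), len(ratings[0])
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    # Between-subject mean square and residual mean square from two-way ANOVA
    ms_rows = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ss_err = sum((ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
                 for i in range(n) for j in range(k))
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Rater 2 measures a constant offset above rater 1: perfectly consistent
volumes = [[10.0, 10.5], [12.0, 12.5], [9.0, 9.5], [15.0, 15.5]]
```

    Because ICC(3,1) measures consistency rather than absolute agreement, a constant offset between raters still yields an ICC of 1; the ≥ 0.8 cutoff used in the study flags features whose rank ordering across tumors survives a change of rater.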

  5. Quantum picturalism for topological cluster-state computing

    International Nuclear Information System (INIS)

    Horsman, Clare

    2011-01-01

    Topological quantum computing (QC) is a way of allowing precise quantum computations to run on noisy and imperfect hardware. One implementation uses surface codes created by forming defects in a highly-entangled cluster state. Such a method of computing is a leading candidate for large-scale QC. However, there has been a lack of sufficiently powerful high-level languages to describe computing in this form without resorting to single-qubit operations, which quickly become prohibitively complex as the system size increases. In this paper, we apply the category-theoretic work of Abramsky and Coecke to the topological cluster-state model of QC to give a high-level graphical language that enables direct translation between quantum processes and physical patterns of measurement in a computer: a 'compiler language'. We give the equivalence between the graphical and topological information flows, and show the applicable rewrite algebra for this computing model. We show that this gives us a native graphical language for the design and analysis of topological quantum algorithms, and finish by discussing the possibilities for automating this process on a large scale.

  6. VizieR Online Data Catalog: Tidal radii of 7 globular clusters (Lehmann+ 1997)

    Science.gov (United States)

    Lehmann, I.; Scholz, R.-D.

    1998-02-01

    We present new tidal radii for seven Galactic globular clusters using the method of automated star counts on Schmidt plates of the Tautenburg, Palomar and UK telescopes. The plates were fully scanned with the APM system in Cambridge (UK). Particular attention was paid to reliable background subtraction and to correcting crowding effects in the central cluster region. For the latter we used a new kind of crowding correction based on a statistical approach to the distribution of stellar images and the luminosity function of the cluster stars in the uncrowded area. The star counts were correlated with surface brightness profiles from different authors to obtain complete projected density profiles of the globular clusters. By fitting an empirical density law (King 1962AJ.....67..471K) we derived the following structural parameters: tidal radius rt, core radius rc and concentration parameter c. In the cases of NGC 5466, M 5, M 12, M 13 and M 15 we found an indication of a tidal tail around these objects (cf. Grillmair et al., 1995AJ....109.2553G). (1 data file).

  7. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  8. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application

  9. Preferential decrease in IgG4 anti-citrullinated protein antibodies during treatment with tumour necrosis factor blocking agents in patients with rheumatoid arthritis

    NARCIS (Netherlands)

    Bos, W. H.; Bartelds, G. M.; Vis, M.; van der Horst, A. R.; Wolbink, G. J.; van de Stadt, R. J.; van Schaardenburg, D.; Dijkmans, B. A. C.; Lems, W. F.; Nurmohamed, M. T.; Aarden, L.; Hamann, D.

    2009-01-01

    To investigate the dynamics of IgG1 and IgG4 anti-citrullinated protein antibody (ACPA) subclasses during anti-tumour necrosis factor (TNF) treatment in patients with rheumatoid arthritis (RA). IgG, IgG1 and IgG4 ACPA levels were determined by ELISA on anti-citrullinated fibrinogen (ACF) and IgG1 :

  10. Does information on novel identified autoantibodies contribute to predicting the progression from undifferentiated arthritis to rheumatoid arthritis: a study on anti-CarP antibodies as an example.

    Science.gov (United States)

    Boeters, Debbie M; Trouw, Leendert A; van der Helm-van Mil, Annette H M; van Steenbergen, Hanna W

    2018-05-03

    The presence of autoantibodies is considered an important characteristic of rheumatoid arthritis (RA); therefore, both anticitrullinated protein antibodies (ACPA) and rheumatoid factor (RF) are included in the 2010 classification criteria for RA. However, a considerable number of RA patients lack both these autoantibodies. Recently, several novel autoantibodies have been identified, but their value for the classification of RA patients is unclear. Therefore, we studied the value of novel autoantibodies, using the presence of anticarbamylated protein (anti-CarP) antibodies as an example, for predicting RA development in patients with undifferentiated arthritis (UA). There were 1352 UA patients included in the Leiden Early Arthritis Clinic (EAC) cohort according to the 1987 criteria. When the 2010 criteria were used, there were 838 UA patients. We evaluated whether these patients fulfilled the 1987 or 2010 criteria, respectively, after 1 year. Logistic regression analyses were performed with RA as outcome and ACPA, RF, and anti-CarP antibodies as predictors. Analyses were repeated after stratification for ACPA and RF. Thirty-three percent of the 1987-UA patients and 6% of the 2010-UA patients progressed to RA during the first year of follow-up. For the 1987-UA patients, anti-CarP antibodies were associated with progression to RA, an association which remained when a correction was made for the presence of ACPA and RF (odds ratio (OR) 1.7, 95% confidence interval (CI) 1.2-2.4). After stratification for ACPA and RF, anti-CarP antibodies were associated with progression to RA only for ACPA- and RF-negative patients (OR 2.1, 95% CI 1.3-3.7). For the 2010-UA patients, anti-CarP antibodies were associated with progression to RA; however, they were not when a correction was made for the presence of ACPA and RF (OR 0.8, 95% CI 0.3-2.1). Our finding that anti-CarP antibodies have no additional value when RA is defined according to the 2010 criteria might
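
    The odds ratios and confidence intervals quoted above come from logistic regression, but the unadjusted version is easy to compute directly from a 2x2 table with a Wald interval. The counts below are hypothetical, chosen only to illustrate the calculation, and do not reproduce the study's numbers.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald confidence interval from a 2x2 table:
    a = antibody-positive & progressed, b = antibody-positive & not,
    c = antibody-negative & progressed, d = antibody-negative & not."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: anti-CarP status vs 1-year progression to RA
or_, lo, hi = odds_ratio_ci(30, 70, 90, 360)
```

    A confidence interval whose lower bound stays above 1 (as in the study's OR 1.7, 95% CI 1.2-2.4) is what supports calling the association significant.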

  11. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  12. Sifting through genomes with iterative-sequence clustering produces a large, phylogenetically diverse protein-family resource.

    Science.gov (United States)

    Sharpton, Thomas J; Jospin, Guillaume; Wu, Dongying; Langille, Morgan G I; Pollard, Katherine S; Eisen, Jonathan A

    2012-10-13

    New computational resources are needed to manage the increasing volume of biological data from genome sequencing projects. One fundamental challenge is the ability to maintain a complete and current catalog of protein diversity. We developed a new approach for the identification of protein families that focuses on the rapid discovery of homologous protein sequences. We implemented fully automated and high-throughput procedures to de novo cluster proteins into families based upon global alignment similarity. Our approach employs an iterative clustering strategy in which homologs of known families are sifted out of the search for new families. The resulting reduction in computational complexity enables us to rapidly identify novel protein families found in new genomes and to perform efficient, automated updates that keep pace with genome sequencing. We refer to protein families identified through this approach as "Sifting Families," or SFams. Our analysis of ~10.5 million protein sequences from 2,928 genomes identified 436,360 SFams, many of which are not represented in other protein family databases. We validated the quality of SFam clustering through statistical as well as network topology-based analyses. We describe the rapid identification of SFams and demonstrate how they can be used to annotate genomes and metagenomes. The SFam database catalogs protein-family quality metrics, multiple sequence alignments, hidden Markov models, and phylogenetic trees. Our source code and database are publicly available and will be subject to frequent updates (http://edhar.genomecenter.ucdavis.edu/sifting_families/).
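
    The iterative "sifting" idea, assigning sequences to known families first so that only the remainder enters de novo clustering, can be sketched on toy string data. This is a loose illustration of the strategy with invented sequences and a simple similarity measure, not the SFam pipeline, which uses global alignments and HMMs.

```python
from difflib import SequenceMatcher

def sift_cluster(sequences, known_reps, threshold=0.8):
    """Sifting sketch: sequences similar to a known family representative
    are removed first; only the remainder is clustered de novo."""
    def sim(a, b):
        return SequenceMatcher(None, a, b).ratio()

    assigned, leftover = {}, []
    for seq in sequences:
        hit = next((name for name, rep in known_reps.items()
                    if sim(seq, rep) >= threshold), None)
        if hit:
            assigned.setdefault(hit, []).append(seq)
        else:
            leftover.append(seq)

    new_families = []  # greedy single-linkage clustering of the remainder
    for seq in leftover:
        for fam in new_families:
            if any(sim(seq, member) >= threshold for member in fam):
                fam.append(seq)
                break
        else:
            new_families.append([seq])
    return assigned, new_families

known = {"famA": "MKTAYIAKQR"}  # one known family representative
seqs = ["MKTAYIAKQR", "MKTAYIAKQK", "GGGGSSSSPP", "GGGGSSSSPA"]
old, new = sift_cluster(seqs, known)
```

    The payoff described in the abstract is the reduction in computational complexity: each new genome only pays the full clustering cost for sequences that fall through the sift.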

  13. 78 FR 53466 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-08-29

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... National Customs Automation Program (NCAP) tests concerning document imaging, known as the Document Image... the National Customs Automation Program (NCAP) tests concerning document imaging, known as the...

  14. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    Science.gov (United States)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences with the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement, dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset, and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis.
The final step of the algorithm is a fuzzy c-means clustering procedure applied to
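The correlation step described above can be sketched with the Modal Assurance Criterion (MAC), a standard metric for comparing mode-shape vectors. The threshold, function names, and greedy grouping below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def mac(phi_i, phi_j):
    """Modal Assurance Criterion: squared, normalized correlation
    between two (possibly complex) mode-shape vectors, in [0, 1]."""
    num = abs(np.vdot(phi_i, phi_j)) ** 2
    den = np.vdot(phi_i, phi_i).real * np.vdot(phi_j, phi_j).real
    return num / den

def cluster_modes(modes, threshold=0.9):
    """Greedily group mode shapes whose MAC with a cluster
    representative exceeds `threshold`; the rest seed new clusters.
    Modes that stay alone across model orders would be candidates
    for a 'Trashbox' cluster of noise modes."""
    clusters = []  # each cluster: list of mode indices
    reps = []      # one representative shape per cluster
    for idx, phi in enumerate(modes):
        for c, rep in enumerate(reps):
            if mac(phi, rep) > threshold:
                clusters[c].append(idx)
                break
        else:
            clusters.append([idx])
            reps.append(phi)
    return clusters
```

Because the MAC is invariant to scaling of the mode shapes, identified duplicates of the same physical mode (which differ only by normalization) group together regardless of model order.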

  15. Towards Automated Binding Affinity Prediction Using an Iterative Linear Interaction Energy Approach

    Directory of Open Access Journals (Sweden)

    C. Ruben Vosmeer

    2014-01-01

    Full Text Available Binding affinity prediction of potential drugs to target and off-target proteins is an essential asset in drug development. These predictions require the calculation of binding free energies. In such calculations, it is a major challenge to properly account for both the dynamic nature of the protein and the possible variety of ligand-binding orientations, while keeping computational costs tractable. Recently, an iterative Linear Interaction Energy (LIE) approach was introduced, in which results from multiple simulations of a protein-ligand complex are combined into a single binding free energy using a Boltzmann weighting-based scheme. This method was shown to reach experimental accuracy for flexible proteins while retaining the computational efficiency of the general LIE approach. Here, we show that the iterative LIE approach can be used to predict binding affinities in an automated way. A workflow was designed using preselected protein conformations, automated ligand docking and clustering, and a (semi-)automated molecular dynamics simulation setup. We show that using this workflow, binding affinities of aryloxypropanolamines to the malleable Cytochrome P450 2D6 enzyme can be predicted without a priori knowledge of dominant protein-ligand conformations. In addition, we provide an outlook for an approach to assess the quality of the LIE predictions, based on simulation outcomes only.
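The Boltzmann-weighted combination of per-conformation free energy estimates can be sketched as below. The constant and the exact weighting expression are illustrative assumptions for a generic exponential average, not the paper's published formula:

```python
import numpy as np

KB = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def combine_lie(dG_list, T=300.0):
    """Combine per-conformation binding free energy estimates into a
    single value via Boltzmann weighting: conformations with lower
    (more favourable) dG dominate the exponential average."""
    dG = np.asarray(dG_list, dtype=float)
    beta = 1.0 / (KB * T)
    # log-sum-exp trick for numerical stability
    m = dG.min()
    return m - np.log(np.mean(np.exp(-beta * (dG - m)))) / beta
```

By construction the combined value lies between the most favourable single estimate and that estimate plus kT ln N, so one dominant binding mode effectively sets the result.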

  16. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed that automation promises greater efficiency, lower workloads, and fewer operator errors by enhancing operator and system performance. However, the excessive introduction of automation has degraded operator performance due to its side effects, referred to as Out-of-the-Loop (OOTL), and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method for optimizing automation is proposed in this paper. To derive appropriate automation levels that enable the best human performance, the automation rate and ostracism rate, estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of Situation Awareness Recovery (SAR), under the premise that the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the
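The idea of an automation rate that minimizes working time can be illustrated with a toy model. The functional forms and all constants below are invented for illustration and are not the paper's actual estimation methods:

```python
def working_time(r, t_manual=100.0, t_auto=20.0, sar_penalty=60.0):
    """Toy working-time model for automation rate r in [0, 1]:
    automated work is faster, but a Situation Awareness Recovery
    (SAR) cost grows with the amount of automation the operator
    must re-enter the loop for. All constants are illustrative."""
    return (1.0 - r) * t_manual + r * t_auto + sar_penalty * r ** 2

def optimal_rate(step=0.001):
    """Grid-search the automation rate with the shortest working time."""
    rates = [i * step for i in range(int(1.0 / step) + 1)]
    return min(rates, key=working_time)
```

Under these toy constants the interior optimum (about r = 0.67) beats both full manual operation and full automation, which is the qualitative point the abstract makes.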

  17. World-wide distribution automation systems

    International Nuclear Information System (INIS)

    Devaney, T.M.

    1994-01-01

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include distribution management systems; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; and computer modeling of distribution systems.

  18. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  19. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  20. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Eun; Kim, Jong Hyo [Center for Medical-IT Convergence Technology Research, Advanced Institutes of Convergence Technology, Seoul National University, Suwon (Korea, Republic of); Woo, Bo Yeong [Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon (Korea, Republic of); Ko, Micheal D.; Jamshidi, Neema [Dept. of Radiological Sciences, University of California, Los Angeles, Los Angeles (United States)

    2017-06-15

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and the Rand statistic. Our results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first-order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while over 35% of the texture features showed poor NDR (< 1). Features clustered into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  1. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in the cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response times for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).

  2. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase in the software development life cycle. In this project, an existing AutoTester framework and iMacros test automation tools were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  3. Distribution-based fuzzy clustering of electrical resistivity tomography images for interface detection

    Science.gov (United States)

    Ward, W. O. C.; Wilkinson, P. B.; Chambers, J. E.; Oxby, L. S.; Bai, L.

    2014-04-01

    A novel method for the effective identification of bedrock subsurface elevation from electrical resistivity tomography images is described. Identifying subsurface boundaries in the topographic data can be difficult due to smoothness constraints used in inversion, so a statistical population-based approach is used that extends previous work in calculating isoresistivity surfaces. The analysis framework involves a procedure for guiding a clustering approach based on the fuzzy c-means algorithm. An approximation of resistivity distributions, found using kernel density estimation, was utilized as a means of guiding the cluster centroids used to classify data. A fuzzy method was chosen over hard clustering due to uncertainty in hard edges in the topography data, and a measure of clustering uncertainty was identified based on the reciprocal of cluster membership. The algorithm was validated using a direct comparison of known observed bedrock depths at two 3-D survey sites, using real-time GPS information of exposed bedrock by quarrying on one site, and borehole logs at the other. Results show similarly accurate detection as a leading isosurface estimation method, and the proposed algorithm requires significantly less user input and prior site knowledge. Furthermore, the method is effectively dimension-independent and will scale to data of increased spatial dimensions without a significant effect on the runtime. A discussion on the results by automated versus supervised analysis is also presented.
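A minimal sketch of the idea, assuming 1-D resistivity values, a smoothed-histogram stand-in for kernel density estimation, and the textbook fuzzy c-means updates; the paper's implementation differs in detail:

```python
import numpy as np

def density_peaks(x, n_peaks, bins=100, smooth=5):
    """Crude kernel-density estimate via a smoothed histogram; returns
    the `n_peaks` most prominent, well-separated maxima as seeds."""
    hist, edges = np.histogram(x, bins=bins)
    dens = np.convolve(hist, np.ones(smooth) / smooth, mode="same")
    centers = 0.5 * (edges[:-1] + edges[1:])
    # local maxima of the smoothed density (strict on the left to skip plateaus)
    is_peak = (dens[1:-1] > dens[:-2]) & (dens[1:-1] >= dens[2:])
    idx = np.where(is_peak)[0] + 1
    order = idx[np.argsort(dens[idx])[::-1]]
    seeds = []
    for i in order:                     # keep peaks far enough apart
        if all(abs(i - j) > smooth for j in seeds):
            seeds.append(i)
        if len(seeds) == n_peaks:
            break
    return centers[np.sort(np.array(seeds))]

def fuzzy_cmeans_1d(x, seeds, m=2.0, n_iter=100):
    """Standard fuzzy c-means in 1-D, initialised at the density peaks."""
    v = np.asarray(seeds, dtype=float)
    for _ in range(n_iter):
        d = np.abs(x[:, None] - v[None, :]) + 1e-12   # point-centroid distances
        u = 1.0 / (d ** (2.0 / (m - 1.0)))            # unnormalised memberships
        u /= u.sum(axis=1, keepdims=True)
        v = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
    return v, u
```

Seeding the centroids at density peaks is what "guiding the cluster centroids" amounts to here, and the soft memberships `u` give the per-point clustering uncertainty the abstract mentions (via their reciprocal).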

  4. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  5. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  6. Automated pulmonary lobar ventilation measurements using volume-matched thoracic CT and MRI

    Science.gov (United States)

    Guo, F.; Svenningsen, S.; Bluemke, E.; Rajchl, M.; Yuan, J.; Fenster, A.; Parraga, G.

    2015-03-01

    Objectives: To develop and evaluate an automated registration and segmentation pipeline for regional lobar pulmonary structure-function measurements, using volume-matched thoracic CT and MRI in order to guide therapy. Methods: Ten subjects underwent pulmonary function tests and volume-matched 1H and 3He MRI and thoracic CT during a single 2-hr visit. CT was registered to 1H MRI using an affine method that incorporated block-matching, followed by a deformable step using free-form deformation. The resultant deformation field was used to deform the associated CT lobe mask, which was generated using commercial software. 3He-1H image registration used the same two-step registration method, and 3He ventilation was segmented using hierarchical k-means clustering. Whole lung and lobar 3He ventilation and ventilation defect percent (VDP) were generated by mapping ventilation defects to CT-defined whole lung and lobe volumes. Target CT-3He registration accuracy was evaluated using region-, surface distance- and volume-based metrics. Automated whole lung and lobar VDP was compared with semi-automated and manual results using paired t-tests. Results: The proposed pipeline yielded regional spatial agreement of 88.0+/-0.9% and a surface distance error of 3.9+/-0.5 mm. Automated and manual whole lung and lobar ventilation and VDP were not significantly different and were significantly correlated (r = 0.77). The pipeline generates pulmonary structural-functional maps with high accuracy and robustness, providing an important tool for image-guided pulmonary interventions.
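The ventilation clustering step can be illustrated as follows; the cluster count, deterministic percentile initialisation, and lowest-cluster defect definition are simplifying assumptions, not the published pipeline:

```python
import numpy as np

def kmeans_1d(x, k=4, n_iter=50):
    """Plain 1-D k-means with deterministic percentile initialisation."""
    cents = np.percentile(x, np.linspace(0, 100, k + 2)[1:-1])
    for _ in range(n_iter):
        labels = np.argmin(np.abs(x[:, None] - cents[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):     # skip empty clusters
                cents[j] = x[labels == j].mean()
    return labels, cents

def ventilation_defect_percent(signal, lung_mask, k=4):
    """Cluster the ventilation signal inside the lung mask; voxels in
    the lowest-intensity cluster are treated as ventilation defect."""
    x = signal[lung_mask]
    labels, cents = kmeans_1d(x, k)
    defect = labels == np.argmin(cents)
    return 100.0 * defect.sum() / x.size
```

Mapping the defect voxels onto CT-defined lobe masks instead of the whole-lung mask would yield the lobar VDP values the abstract reports.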

  7. Galaxy CloudMan: delivering cloud compute clusters.

    Science.gov (United States)

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  8. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  9. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy

  10. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    Science.gov (United States)

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.

  11. Dose-dependent effects of celecoxib on CB-1 agonist-induced antinociception in the mice

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Zarrindast

    2009-04-01

    Full Text Available Objective: Endocannabinoids produce analgesia comparable with that of opioids. The antinociceptive effect of Δ9-tetrahydrocannabinol (THC) is suggested to act through the cyclooxygenase (COX) pathway. In the present work, the effect of two extreme dose ranges of celecoxib (mg/kg and ng/kg), a cyclooxygenase-2 (COX-2) antagonist, on antinociception induced by arachidonylcyclopropylamide (ACPA), a selective CB1 agonist, was examined in mice. Methods: We investigated the interaction between celecoxib, at doses of 50, 100, 200 and 400 mg/kg (i.p.) and at ultra-low doses (ULD; 25 and 50 ng/kg, i.p.), and the antinociceptive effect of intracerebroventricular (i.c.v.) administration of ACPA (0.004, 0.0625 and 1 μg/mouse), using the formalin test in mice. Results: I.c.v. administration of ACPA induced antinociception. Intraperitoneal administration of celecoxib at mg/kg doses attenuated, and at ULD (ng/kg) potentiated, the antinociceptive effect of ACPA. Conclusion: The mg/kg doses of the COX-2 antagonist showed effects opposite to those of the ultra-low doses of the drug.

  12. Multidisciplinary Cleft Palate Program at BC Children's Hospital: Are We Meeting the Standards of Care?

    Science.gov (United States)

    Dahiya, Anita; Courtemanche, Rebecca; Courtemanche, Douglas J

    2018-05-01

    To characterize current Cleft Palate Program (CPP) practices and evaluate the timeliness of appointments with respect to patient age and diagnosis, based on American Cleft Palate-Craniofacial Association (ACPA) population guidelines and CPP patient-specific recommendations. A retrospective review of CPP patient appointments from November 6, 2012, to March 31, 2015, was done. Data were analyzed using descriptive and inferential statistics. The study was conducted using data from the CPP at BC Children's Hospital. A total of 1214 appointments were considered in the analysis, including syndromic and nonsyndromic patients of 0 to 27 years of age. The main outcome measure was the percentage of patients meeting follow-up targets by ACPA standards and CPP team recommendations. Our results showed that patients aged 5 years and younger, or nonsyndromic patients, were more likely to be seen on time, with approximately half of appointments meeting ACPA guidelines for timeliness and 32% of all appointments meeting CPP recommendations. Timely care for the cleft/craniofacial patient populations represents a challenge for the CPP. Although half of patients may meet the general ACPA guidelines, only 32% of patients are meeting the CPP patient-specific recommendations. To provide better patient care, future adjustments are needed, which may include improved resource allotment and program support.

  13. Segmentation of clustered cells in negative phase contrast images with integrated light intensity and cell shape information.

    Science.gov (United States)

    Wang, Y; Wang, C; Zhang, Z

    2018-05-01

    Automated cell segmentation plays a key role in characterising cell behaviours for both biology research and clinical practice. Currently, the segmentation of clustered cells remains a challenge and is the main source of false segmentation. In this study, the emphasis was placed on the segmentation of clustered cells in negative phase contrast images. A new method was proposed to combine both light intensity and cell shape information through the construction of a grey-weighted distance transform (GWDT) within preliminarily segmented areas. With the constructed GWDT, clustered cells can be detected and then separated with a modified region skeleton-based method. Moreover, a contour expansion operation was applied to obtain optimised detection of cell boundaries. In this paper, the working principle and detailed procedure of the proposed method are described, followed by an evaluation of the method on clustered cell segmentation. Results show that the proposed method achieves improved performance in clustered cell segmentation compared with other methods, with accuracy rates of 85.8% for clustered cells and 97.16% for all cells. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  14. BlastNeuron for Automated Comparison, Retrieval and Clustering of 3D Neuron Morphologies.

    Science.gov (United States)

    Wan, Yinan; Long, Fuhui; Qu, Lei; Xiao, Hang; Hawrylycz, Michael; Myers, Eugene W; Peng, Hanchuan

    2015-10-01

    Characterizing the identity and types of neurons in the brain, as well as their associated function, requires a means of quantifying and comparing 3D neuron morphology. Presently, neuron comparison methods are based on statistics of neuronal morphology, such as size and number of branches, which are not fully suitable for detecting local similarities and differences in detailed structure. We developed BlastNeuron to compare neurons in terms of their global appearance, detailed arborization patterns, and topological similarity. BlastNeuron first compares and clusters 3D neuron reconstructions based on global morphology features and moment invariants, independent of their orientations, sizes, level of reconstruction and other variations. Subsequently, BlastNeuron performs local alignment between any pair of retrieved neurons via a tree-topology-driven dynamic programming method. A 3D correspondence map can thus be generated at the resolution of single reconstruction nodes. We applied BlastNeuron to three datasets: (1) 10,000+ neuron reconstructions from a public morphology database, (2) 681 newly and manually reconstructed neurons, and (3) neuron reconstructions produced using several independent reconstruction methods. Our approach was able to accurately and efficiently retrieve morphologically and functionally similar neuron structures from a large morphology database, identify the local common structures, and find clusters of neurons that share similarities in both morphology and molecular profiles.
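The first, global stage of such a comparison can be sketched as a nearest-neighbour search over normalised morphology feature vectors. The feature columns in the example (e.g. total length, branch count, depth) are hypothetical stand-ins for the global features and moment invariants BlastNeuron actually uses:

```python
import numpy as np

def zscore(features):
    """Column-wise z-scoring so that no single morphology feature
    (e.g. total length vs. branch count) dominates the distance."""
    f = np.asarray(features, dtype=float)
    return (f - f.mean(axis=0)) / (f.std(axis=0) + 1e-12)

def retrieve_similar(features, query_idx, top_k=3):
    """Return indices of the `top_k` neurons closest to the query in
    normalised global-feature space. This is the coarse first stage;
    a local tree alignment would then refine these candidates."""
    z = zscore(features)
    d = np.linalg.norm(z - z[query_idx], axis=1)
    order = np.argsort(d)
    return [i for i in order if i != query_idx][:top_k]
```

Because the features are scale-normalised before the distance is taken, retrieval is insensitive to units and to features with very different ranges, mirroring the orientation- and size-invariance goal stated above.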

  15. Sifting through genomes with iterative-sequence clustering produces a large, phylogenetically diverse protein-family resource

    Directory of Open Access Journals (Sweden)

    Sharpton Thomas J

    2012-10-01

    Full Text Available Background: New computational resources are needed to manage the increasing volume of biological data from genome sequencing projects. One fundamental challenge is the ability to maintain a complete and current catalog of protein diversity. We developed a new approach for the identification of protein families that focuses on the rapid discovery of homologous protein sequences. Results: We implemented fully automated and high-throughput procedures to de novo cluster proteins into families based upon global alignment similarity. Our approach employs an iterative clustering strategy in which homologs of known families are sifted out of the search for new families. The resulting reduction in computational complexity enables us to rapidly identify novel protein families found in new genomes and to perform efficient, automated updates that keep pace with genome sequencing. We refer to protein families identified through this approach as “Sifting Families,” or SFams. Our analysis of ~10.5 million protein sequences from 2,928 genomes identified 436,360 SFams, many of which are not represented in other protein family databases. We validated the quality of SFam clustering through statistical as well as network topology–based analyses. Conclusions: We describe the rapid identification of SFams and demonstrate how they can be used to annotate genomes and metagenomes. The SFam database catalogs protein-family quality metrics, multiple sequence alignments, hidden Markov models, and phylogenetic trees. Our source code and database are publicly available and will be subject to frequent updates (http://edhar.genomecenter.ucdavis.edu/sifting_families/).
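The sifting idea can be sketched as follows, with Jaccard similarity over k-mers as a crude stand-in for the global alignment similarity the SFam pipeline actually uses:

```python
def kmers(seq, k=3):
    """Set of overlapping k-mers of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard similarity of k-mer sets -- a toy stand-in for a
    global alignment score defining family membership."""
    ka, kb = kmers(a, k), kmers(b, k)
    return len(ka & kb) / len(ka | kb)

def sift_families(sequences, threshold=0.5):
    """Iterative 'sifting': in each round, sequences homologous to an
    existing family representative are assigned to that family and
    removed from the search; the remainder seed new families."""
    families = []
    remaining = list(sequences)
    while remaining:
        rep = remaining.pop(0)        # next unassigned sequence seeds a family
        members = [rep]
        still = []
        for s in remaining:
            if similarity(rep, s) >= threshold:
                members.append(s)
            else:
                still.append(s)
        families.append(members)
        remaining = still
    return families
```

Each round shrinks the remaining search set, which is exactly the "reduction in computational complexity" the abstract attributes to sifting known homologs out of subsequent rounds.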

  16. Comparison of five cluster validity indices performance in brain [18F]FET-PET image segmentation using k-means.

    Science.gov (United States)

    Abualhaj, Bedor; Weng, Guoyang; Ong, Melissa; Attarwala, Ali Asgar; Molina, Flavia; Büsing, Karen; Glatting, Gerhard

    2017-01-01

    Dynamic [18F]fluoro-ethyl-L-tyrosine positron emission tomography ([18F]FET-PET) is used to identify tumor lesions for radiotherapy treatment planning, to differentiate glioma recurrence from radiation necrosis and to classify glioma grading. To segment different regions in the brain, k-means cluster analysis can be used. The main disadvantage of k-means is that the number of clusters must be pre-defined. In this study, we therefore compared different cluster validity indices for automated and reproducible determination of the optimal number of clusters based on the dynamic PET data. The k-means algorithm was applied to dynamic [18F]FET-PET images of 8 patients. The Akaike information criterion (AIC), WB, I, modified Dunn's and Silhouette indices were compared on their ability to determine the optimal number of clusters based on requirements for an adequate cluster validity index. To check the reproducibility of k-means, the coefficients of variation (CVs) of the objective function values (OFVs; sum of squared Euclidean distances within each cluster) were calculated using 100 random centroid initialization replications (RCI100) for 2 to 50 clusters. k-means was performed independently on three neighboring tumor-containing slices for each patient to investigate the stability of the optimal number of clusters across them. To check the independence of the validity indices from the number of voxels, cluster analysis was applied after duplication of a slice selected from each patient. CVs of the index values were calculated at the optimal number of clusters using RCI100 to investigate the reproducibility of the validity indices. To check whether the indices have a single extremum, visual inspection was performed on the replication with minimum OFV from RCI100. The maximum CV of the OFVs was 2.7 × 10⁻² over all patients. The optimal number of clusters given by the modified Dunn's and Silhouette indices was 2 or 3, leading to a very poor segmentation. The WB and I indices suggested in
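For a sense of how such an index picks the number of clusters, here is a self-contained Silhouette-based selection on 1-D data; this is a deliberately simplified stand-in for the dynamic-PET setting, with a deterministic k-means rather than the RCI replications used in the study:

```python
import numpy as np

def kmeans_1d(x, k, n_iter=50):
    """Deterministic 1-D k-means with percentile initialisation."""
    cents = np.percentile(x, np.linspace(0, 100, k + 2)[1:-1])
    for _ in range(n_iter):
        labels = np.argmin(np.abs(x[:, None] - cents[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                cents[j] = x[labels == j].mean()
    return labels

def mean_silhouette(x, labels):
    """Average silhouette coefficient s_i = (b_i - a_i) / max(a_i, b_i),
    where a_i is the mean distance to the point's own cluster and b_i
    the smallest mean distance to any other cluster."""
    d = np.abs(x[:, None] - x[None, :])   # pairwise distances
    scores = []
    for i in range(len(x)):
        own = labels == labels[i]
        a = d[i, own].sum() / max(own.sum() - 1, 1)
        b = min(d[i, labels == c].mean()
                for c in np.unique(labels) if c != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

def best_k(x, k_range=range(2, 7)):
    """Pick the cluster count that maximises the mean silhouette."""
    return max(k_range, key=lambda k: mean_silhouette(x, kmeans_1d(x, k)))
```

On well-separated data the silhouette peaks sharply at the true cluster count; the abstract's finding that the Silhouette index chose only 2 or 3 clusters for PET data reflects how poorly separated real tissue classes are compared with this toy case.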

  17. 78 FR 66039 - Modification of National Customs Automation Program Test Concerning Automated Commercial...

    Science.gov (United States)

    2013-11-04

    ... Customs Automation Program Test Concerning Automated Commercial Environment (ACE) Cargo Release (Formerly...) plan to both rename and modify the National Customs Automation Program (NCAP) test concerning the... data elements required to obtain release for cargo transported by air. The test will now be known as...

  18. Efficiency Sustainability Resource Visual Simulator for Clustered Desktop Virtualization Based on Cloud Infrastructure

    Directory of Open Access Journals (Sweden)

    Jong Hyuk Park

    2014-11-01

    Full Text Available Following IT innovations, manual operations have been automated, improving the overall quality of life. This has been possible because an organic topology has formed among the many diverse smart devices grafted onto real life. To provide services to these smart devices, enterprises or users use the cloud. Cloud services are divided into infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS). SaaS is operated on PaaS, and PaaS is operated on IaaS. Since IaaS is the foundation of all services, algorithms for the efficient operation of virtualized resources are required. Among these algorithms, desktop resource virtualization is used for high resource availability when existing desktop PCs are unavailable. For this high resource availability, clustering for hierarchical structures is important. In addition, since many clustering algorithms show different percentages of the main resources depending on desktop PC distribution rates and environments, selecting appropriate algorithms is very important. If diverse attempts are made to find algorithms suitable for an operating environment's desktop resource virtualization, huge costs are incurred for the related power, time and labor. Therefore, in the present paper, a desktop resource virtualization clustering simulator (DRV-CS), a simulator for selecting the desktop virtualization clusters to be maintained sustainably, is proposed. The DRV-CS provides simulations, so that clustering algorithms can be selected and elements can be properly applied in different desktop PC environments.

  19. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    International Nuclear Information System (INIS)

    Lee, Seungmin; Seong, Poonghyun; Kim, Jonghyun

    2013-01-01

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the human operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load

  20. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    Science.gov (United States)

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to becoming fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving with automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could

  1. Procedure automation: the effect of automated procedure execution on situation awareness and human performance

    International Nuclear Information System (INIS)

    Andresen, Gisle; Svengren, Haakan; Heimdal, Jan O.; Nilsen, Svein; Hulsund, John-Einar; Bisio, Rossella; Debroise, Xavier

    2004-04-01

    As advised by the procedure workshop convened in Halden in 2000, the Halden Project conducted an experiment on the effect of automation of Computerised Procedure Systems (CPS) on situation awareness and human performance. The expected outcome of the study was to provide input for guidance on CPS design, and to support the Halden Project's ongoing research on human reliability analysis. The experiment was performed in HAMMLAB using the HAMBO BWR simulator and the COPMA-III CPS. Eight crews of operators from Forsmark 3 and Oskarshamn 3 participated. Three research questions were investigated: 1) Does procedure automation create Out-Of-The-Loop (OOTL) performance problems? 2) Does procedure automation affect situation awareness? 3) Does procedure automation affect crew performance? The independent variable, 'procedure configuration', had four levels: paper procedures, manual CPS, automation with breaks, and full automation. The results showed that the operators experienced OOTL problems in full automation, but that situation awareness and crew performance (response time) were not affected. One possible explanation is that the operators monitored the automated procedure execution conscientiously, which may have prevented the OOTL problems from degrading situation awareness and crew performance. In a debriefing session, the operators clearly expressed their dislike for the full automation condition, but said that automation with breaks could be suitable for some tasks. The main reason the operators did not like full automation was that they did not feel in control. A qualitative analysis addressing factors contributing to response time delays revealed that OOTL problems did not seem to cause delays, but that some delays could be explained by the operators having problems with the freeze function of the CPS. Other factors, such as teamwork and operator tendencies, were also of importance. Several design implications were drawn

  2. Automation-aided Task Loads Index based on the Automation Rate Reflecting the Effects on Human Operators in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seungmin; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Kim, Jonghyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-05-15

    Many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs) was suggested. These suggested measures express how much the automation supports human operators, but they cannot express the change in the human operators' workload, i.e., whether the workload is increased or decreased. Before considering automation rates, whether the adopted automation is good or bad should be estimated in advance. In this study, to estimate the appropriateness of automation according to the change in the human operators' task loads, an automation-aided task load index is suggested based on the concept of the suggested automation rate. To ensure plant safety and efficiency on behalf of human operators, various automation systems have been installed in NPPs, and many tasks that were previously conducted by human operators can now be supported by computer-based operator aids. According to the characteristics of the automation types, estimation methods for the system automation rate and the cognitive automation rate were suggested. The proposed estimation method concentrates on the effects of introducing automation, so it directly expresses how much the automated system supports human operators. Based on the suggested automation rates, a way to estimate how much the automated system affects the human operators' cognitive task load is suggested in this study. When there is no automation, the calculated index is 1, meaning there is no change in the human operators' task load.

  3. Brightest Cluster Galaxies in REXCESS Clusters

    Science.gov (United States)

    Haarsma, Deborah B.; Leisman, L.; Bruch, S.; Donahue, M.

    2009-01-01

    Most galaxy clusters contain a Brightest Cluster Galaxy (BCG) which is larger than the other cluster ellipticals and has a more extended profile. In the hierarchical model, the BCG forms through many galaxy mergers in the crowded center of the cluster, and thus its properties give insight into the assembly of the cluster as a whole. In this project, we are working with the Representative XMM-Newton Cluster Structure Survey (REXCESS) team (Boehringer et al 2007) to study BCGs in 33 X-ray luminous galaxy clusters, 0.055 < z < 0.183. We are imaging the BCGs in R band at the Southern Observatory for Astrophysical Research (SOAR) in Chile. In this poster, we discuss our methods and give preliminary measurements of the BCG magnitudes, morphology, and stellar mass. We compare these BCG properties with the properties of their host clusters, particularly of the X-ray emitting gas.

  4. Segmentation of the Clustered Cells with Optimized Boundary Detection in Negative Phase Contrast Images.

    Science.gov (United States)

    Wang, Yuliang; Zhang, Zaicheng; Wang, Huimin; Bi, Shusheng

    2015-01-01

    Cell image segmentation plays a central role in numerous biology studies and clinical applications. As a result, the development of cell image segmentation algorithms with high robustness and accuracy is attracting more and more attention. In this study, an automated cell image segmentation algorithm is developed to improve cell boundary detection and segmentation of clustered cells for all cells in the field of view in negative phase contrast images. A new method combining thresholding and an edge-based active contour method was proposed to optimize cell boundary detection. In order to segment clustered cells, the geographic peaks of cell light intensity were utilized to detect the number and locations of the clustered cells. In this paper, the working principles of the algorithms are described. The influence of the parameters in cell boundary detection and the selection of the threshold value on the final segmentation results are investigated. Finally, the proposed algorithm is applied to negative phase contrast images from different experiments and its performance is evaluated. Results show that the proposed method can achieve optimized cell boundary detection and highly accurate segmentation of clustered cells.
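The clustered-cell step described above, using local intensity peaks to count and locate the cells in a clump, reduces in one dimension to simple local-maximum detection. The sketch below is a toy analogue, not the paper's 2-D implementation; `min_height`, a threshold that suppresses noise peaks, is our hypothetical parameter.

```python
def count_intensity_peaks(profile, min_height=0.0):
    """Count strict local maxima of a 1-D intensity profile that reach min_height.

    In a peak-based splitting scheme, each surviving peak would seed one cell
    inside a clustered region.
    """
    peaks = [i for i in range(1, len(profile) - 1)
             if profile[i] > profile[i - 1]
             and profile[i] > profile[i + 1]
             and profile[i] >= min_height]
    return len(peaks), peaks
```

For example, the profile `[0, 2, 1, 3, 0]` with `min_height=1` yields two peaks, at indices 1 and 3, i.e., two cells in the clump.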

  5. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems for measuring Australian antigen by radioimmunoassay under development were discussed. Samples were processed as follows: blood serum being dispensed by automated sampler to the test tube, and then incubated under controlled time and temperature; first counting being omitted; labelled antibody being dispensed to the serum after washing; samples being incubated and then centrifuged; radioactivities in the precipitate being counted by auto-well counter; measurements being tabulated by automated typewriter. Not only well-type counter but also position counter was studied. (Kanao, N.)

  6. A survey of energy conservation mechanisms for dynamic cluster based wireless sensor networks

    International Nuclear Information System (INIS)

    Enam, R.N.; Tahir, M.; Ahmed, S.; Qureshi, R.

    2018-01-01

    WSN (Wireless Sensor Network) is an emerging technology with unlimited potential for numerous application areas, including military, crisis management, environmental, transportation, medical, home/city automation and smart spaces. But the energy-constrained nature of WSNs necessitates that their architectures and communication protocols be designed in an energy-aware manner. Sensor data collection through clustering mechanisms has become a common strategy in WSNs. This paper presents a survey of the major perspectives from which energy conservation mechanisms have been proposed for dynamic cluster-based WSNs so far. All the solutions discussed in this paper focus on cluster-based protocols only. We have covered a wide range of existing energy-efficient protocols and have categorized them into six categories. At the beginning of this paper the fundamentals of the energy constraint issues of WSNs are discussed and an overview of the causes of energy consumption at all layers of a WSN is given. Later in the paper, several previously proposed energy-efficient protocols for WSNs are presented. (author)

  7. A Survey of Energy Conservation Mechanisms for Dynamic Cluster Based Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Rabia Noor Enam

    2018-04-01

    Full Text Available WSN (Wireless Sensor Network) is an emerging technology with unlimited potential for numerous application areas, including military, crisis management, environmental, transportation, medical, home/city automation and smart spaces. But the energy-constrained nature of WSNs necessitates that their architectures and communication protocols be designed in an energy-aware manner. Sensor data collection through clustering mechanisms has become a common strategy in WSNs. This paper presents a survey of the major perspectives from which energy conservation mechanisms have been proposed for dynamic cluster-based WSNs so far. All the solutions discussed in this paper focus on cluster-based protocols only. We have covered a wide range of existing energy-efficient protocols and have categorized them into six categories. At the beginning of this paper the fundamentals of the energy constraint issues of WSNs are discussed and an overview of the causes of energy consumption at all layers of a WSN is given. Later in the paper, several previously proposed energy-efficient protocols for WSNs are presented.

  8. 77 FR 48527 - National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2012-08-14

    ... National Customs Automation Program (NCAP) test concerning the simplified entry functionality in the... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation Program (NCAP) Test Concerning Automated Commercial Environment (ACE) Simplified Entry: Modification of...

  9. GraphTeams: a method for discovering spatial gene clusters in Hi-C sequencing data.

    Science.gov (United States)

    Schulz, Tizian; Stoye, Jens; Doerr, Daniel

    2018-05-08

    Hi-C sequencing offers novel, cost-effective means to study the spatial conformation of chromosomes. We use data obtained from Hi-C experiments to provide new evidence for the existence of spatial gene clusters. These are sets of genes with associated functionality that exhibit close proximity to each other in the spatial conformation of chromosomes across several related species. We present the first gene cluster model capable of handling spatial data. Our model generalizes a popular computational model for gene cluster prediction, called δ-teams, from sequences to graphs. Following previous lines of research, we subsequently extend our model to allow for several vertices being associated with the same label. The model, called δ-teams with families, is particularly suitable for our application as it enables the handling of gene duplicates. We develop algorithmic solutions for both models. We implemented the algorithm for discovering δ-teams with families and integrated it into a fully automated workflow for discovering gene clusters in Hi-C data, called GraphTeams. We applied it to human and mouse data to find intra- and interchromosomal gene cluster candidates. The results include intrachromosomal clusters that seem to exhibit a closer proximity in space than on their chromosomal DNA sequence. We further discovered interchromosomal gene clusters that contain genes from different chromosomes within the human genome, but are located on a single chromosome in mouse. By identifying δ-teams with families, we provide a flexible model to discover gene cluster candidates in Hi-C data. Our analysis of Hi-C data from human and mouse reveals several known gene clusters (thus validating our approach), but also a few sparsely studied or possibly unknown gene cluster candidates that could be the source of further experimental investigations.

  10. A computational linguistic measure of clustering behavior on semantic verbal fluency task predicts risk of future dementia in the nun study.

    Science.gov (United States)

    Pakhomov, Serguei V S; Hemmy, Laura S

    2014-06-01

    Generative semantic verbal fluency (SVF) tests show early and disproportionate decline relative to other abilities in individuals developing Alzheimer's disease. Optimal performance on SVF tests depends on the efficiency of using the clustered organization of semantically related items and the ability to switch between clusters. Traditional approaches to clustering and switching have relied on manual determination of clusters. We evaluated a novel automated computational linguistic approach for quantifying clustering behavior. Our approach is based on Latent Semantic Analysis (LSA) for computing the strength of semantic relatedness between pairs of words produced in response to the SVF test. The mean size of semantic clusters (MCS) and semantic chains (MChS) is calculated based on pairwise relatedness values between words. We evaluated the predictive validity of these measures on a set of 239 participants in the Nun Study, a longitudinal study of aging. All were cognitively intact at baseline assessment, measured with the Consortium to Establish a Registry for Alzheimer's Disease (CERAD) battery, and were followed in 18-month waves for up to 20 years. The onset of either dementia or memory impairment was used as the outcome in Cox proportional hazards models adjusted for age and education and censored at follow-up waves 5 (6.3 years) and 13 (16.96 years). Higher MCS was associated with a 38% reduction in dementia risk at wave 5 and a 26% reduction at wave 13, but not with the onset of memory impairment. Higher [+1 standard deviation (SD)] MChS was associated with a 39% dementia risk reduction at wave 5 but not wave 13, and the association with memory impairment was not significant. Higher traditional SVF scores were associated with 22-29% memory impairment and 35-40% dementia risk reductions. SVF scores were not correlated with either MCS or MChS. Our study suggests that an automated approach to measuring clustering behavior can be used to estimate dementia risk in cognitively normal
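The clustering measure described above, grouping consecutively produced words whose pairwise semantic relatedness exceeds a threshold and then averaging group sizes, can be sketched as follows. Cosine similarity over toy word vectors stands in for the LSA relatedness scores, and the 0.5 threshold is our illustrative choice, not a value from the study.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity, the usual relatedness measure over LSA word vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def mean_cluster_size(word_vectors, threshold=0.5):
    # Walk the response in production order: a cluster continues while each
    # adjacent word pair is related above the threshold, then restarts.
    sizes, current = [], 1
    for u, v in zip(word_vectors, word_vectors[1:]):
        if cosine(u, v) >= threshold:
            current += 1
        else:
            sizes.append(current)
            current = 1
    sizes.append(current)
    return sum(sizes) / len(sizes)
```

With four toy vectors forming two related pairs, the function returns a mean cluster size of 2.0; in the study, larger values of this kind of statistic were associated with reduced dementia risk.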

  11. Spatial-Temporal Clustering of Tornadoes

    Science.gov (United States)

    Malamud, Bruce D.; Turcotte, Donald L.; Brooks, Harold E.

    2017-04-01

    The standard measure of the intensity of a tornado is the Enhanced Fujita scale, which is based qualitatively on the damage caused by a tornado. An alternative measure of tornado intensity is the tornado path length, L. Here we examine the spatial-temporal clustering of severe tornadoes, which we define as having path lengths L ≥ 10 km. Of particular concern are tornado outbreaks, when a large number of severe tornadoes occur in a day in a restricted region. We apply a spatial-temporal clustering analysis developed for earthquakes. We take all pairs of severe tornadoes in observed and modelled outbreaks, and for each pair plot the spatial lag (distance between touchdown points) against the temporal lag (time between touchdown points). We apply our spatial-temporal lag methodology to the intense tornado outbreaks in the central United States on 26 and 27 April 2011, which resulted in over 300 fatalities and produced 109 severe (L ≥ 10 km) tornadoes. The patterns of spatial-temporal lag correlations that we obtain for the 2 days are strikingly different. On 26 April 2011, there were 45 severe tornadoes and our clustering analysis is dominated by a complex sequence of linear features. We associate the linear patterns with the tornadoes generated in either a single cell thunderstorm or a closely spaced cluster of single cell thunderstorms moving at a near-constant velocity. Our study of a derecho tornado outbreak of six severe tornadoes on 4 April 2011 along with modelled outbreak scenarios confirms this association. On 27 April 2011, there were 64 severe tornadoes and our clustering analysis is predominantly random with virtually no embedded linear patterns. We associate this pattern with a large number of interacting supercell thunderstorms generating tornadoes randomly in space and time. In order to better understand these associations, we also applied our approach to the Great Plains tornado outbreak of 3 May 1999. Careful studies by others have associated
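The core of the clustering analysis described above, taking all pairs of severe tornadoes in an outbreak and computing the distance between touchdown points against the time between touchdowns, is straightforward to sketch. The event-tuple layout and units below are our assumptions for illustration.

```python
import math
from itertools import combinations

def spatial_temporal_lags(events):
    """events: list of (t_hours, x_km, y_km) touchdown records for one outbreak.

    Returns one (temporal_lag, spatial_lag) pair per tornado pair; these are
    the quantities plotted against each other in the lag analysis, where
    linear features suggest a single moving storm cell and a diffuse scatter
    suggests many independently acting supercells.
    """
    return [(abs(t2 - t1), math.hypot(x2 - x1, y2 - y1))
            for (t1, x1, y1), (t2, x2, y2) in combinations(events, 2)]
```

Three touchdowns produce three pairs; a storm moving at constant velocity makes spatial lag grow linearly with temporal lag, which is exactly the linear pattern the authors associate with single-cell thunderstorm sequences.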

  12. Automated profiling of individual cell-cell interactions from high-throughput time-lapse imaging microscopy in nanowell grids (TIMING).

    Science.gov (United States)

    Merouane, Amine; Rey-Villamizar, Nicolas; Lu, Yanbin; Liadi, Ivan; Romain, Gabrielle; Lu, Jennifer; Singh, Harjeet; Cooper, Laurence J N; Varadarajan, Navin; Roysam, Badrinath

    2015-10-01

    There is a need for effective automated methods for profiling dynamic cell-cell interactions with single-cell resolution from high-throughput time-lapse imaging data, especially, the interactions between immune effector cells and tumor cells in adoptive immunotherapy. Fluorescently labeled human T cells, natural killer cells (NK), and various target cells (NALM6, K562, EL4) were co-incubated on polydimethylsiloxane arrays of sub-nanoliter wells (nanowells), and imaged using multi-channel time-lapse microscopy. The proposed cell segmentation and tracking algorithms account for cell variability and exploit the nanowell confinement property to increase the yield of correctly analyzed nanowells from 45% (existing algorithms) to 98% for wells containing one effector and a single target, enabling automated quantification of cell locations, morphologies, movements, interactions, and deaths without the need for manual proofreading. Automated analysis of recordings from 12 different experiments demonstrated automated nanowell delineation accuracy >99%, automated cell segmentation accuracy >95%, and automated cell tracking accuracy of 90%, with default parameters, despite variations in illumination, staining, imaging noise, cell morphology, and cell clustering. An example analysis revealed that NK cells efficiently discriminate between live and dead targets by altering the duration of conjugation. The data also demonstrated that cytotoxic cells display higher motility than non-killers, both before and during contact. broysam@central.uh.edu or nvaradar@central.uh.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. Metal cluster compounds - chemistry and importance; clusters containing isolated main group element atoms, large metal cluster compounds, cluster fluxionality

    International Nuclear Information System (INIS)

    Walther, B.

    1988-01-01

    This part of the review on metal cluster compounds deals with clusters containing isolated main group element atoms, with high nuclearity clusters and with metal cluster fluxionality. It will be obvious that main group element atoms strongly influence the geometry, stability and reactivity of the clusters. High nuclearity clusters are of interest in their own right due to the diversity of the structures adopted, but their intermediate position between molecules and the metallic state makes them a fascinating research object too. Both these sides of metal cluster chemistry, as well as the frequently observed ligand and core fluxionality, are related to the cluster and metal surface analogy. (author)

  14. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimension, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  16. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  17. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun; Kim, Man Cheol

    2015-01-01

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. We should consider the positive and negative effects of automation at the same time to determine the appropriate level of its introduction. Thus, in this paper, we suggest an estimation method that considers the positive and negative effects of automation simultaneously to determine the appropriate introduction of automation. The previous concept is limited in that it does not consider the effects of automation on human operators; thus, a new estimation method for the automation rate was suggested to overcome this problem

  18. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  19. The Science of Home Automation

    Science.gov (United States)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
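The second unsupervised alternative mentioned above, DBSCAN with Dynamic Time Warping (DTW) as the distance metric, can be sketched with a minimal DTW and a small DBSCAN over a precomputed distance matrix. These are our simplified stand-ins for illustration, not the dissertation's implementation, and the `eps`/`min_pts` values below are arbitrary.

```python
def dtw(a, b):
    # Classic dynamic-programming DTW with absolute-difference step cost.
    n, m = len(a), len(b)
    D = [[float("inf")] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def dbscan(dist, eps, min_pts):
    # Minimal DBSCAN over a precomputed distance matrix; -1 marks noise.
    n = len(dist)
    labels, cid = [None] * n, 0
    for p in range(n):
        if labels[p] is not None:
            continue
        nbrs = [q for q in range(n) if dist[p][q] <= eps]
        if len(nbrs) < min_pts:
            labels[p] = -1
            continue
        labels[p] = cid
        seeds = list(nbrs)
        while seeds:
            q = seeds.pop()
            if labels[q] == -1:
                labels[q] = cid          # noise point becomes a border point
            if labels[q] is not None:
                continue
            labels[q] = cid
            qn = [r for r in range(n) if dist[q][r] <= eps]
            if len(qn) >= min_pts:       # q is a core point: keep expanding
                seeds.extend(qn)
        cid += 1
    return labels
```

Because DTW tolerates shifts and stretches along the time axis, two sensor traces of the same activity performed at slightly different speeds land in the same cluster, which is exactly why it is a natural metric for discovering activity categories without labels.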

  20. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation ... of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated ... between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  1. Layered distributed architecture for plant automation

    International Nuclear Information System (INIS)

    Aravamuthan, G.; Verma, Yachika; Ranjan, Jyoti; Chachondia, Alka S.; Ganesh, G.

    2005-01-01

    The development of plant automation systems and associated software remains one of the greatest challenges to the widespread implementation of highly adaptive, re-configurable automation technology. This paper presents a layered distributed architecture for a plant automation system designed to support rapid reconfiguration and redeployment of automation components. The paper first presents the evolution of automation architectures and their associated environments over the past few decades, and then presents the concept of a layered system architecture and the use of automation components to support the construction of a wide variety of automation systems. It also highlights the role of standards and technology, which can be used in the development of automation components. We have attempted to adhere to open standards and technology for the development of automation components at the various layers. It also highlights the application of this concept in the development of an Operator Information System (OIS) for the Advanced Heavy Water Reactor (AHWR). (author)

  2. Semantics-based Automated Web Testing

    Directory of Open Access Journals (Sweden)

    Hai-Feng Guo

    2015-08-01

    Full Text Available We present TAO, a software testing tool performing automated test and oracle generation based on a semantic approach. TAO entangles grammar-based test generation with automated semantics evaluation using a denotational semantics framework. We show how TAO can be incorporated with the Selenium automation tool for automated web testing, and how TAO can be further extended to support automated delta debugging, where a failing web test script can be systematically reduced based on grammar-directed strategies. A real-life parking website is adopted throughout the paper to demonstrate the effectiveness of our semantics-based web testing approach.
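Grammar-based test generation of the kind TAO builds on can be illustrated with a short sketch. The toy arithmetic grammar and all names below are hypothetical illustrations, not TAO's actual input format: test inputs are produced by randomly expanding nonterminals until only terminals remain.

```python
import random

GRAMMAR = {  # hypothetical toy grammar for arithmetic test inputs
    "expr": [["term", "+", "expr"], ["term"]],
    "term": [["num"], ["(", "expr", ")"]],
    "num":  [["1"], ["2"], ["42"]],
}

def generate(symbol="expr", rng=random.Random(0), depth=0, max_depth=6):
    """Randomly derive a string from GRAMMAR, starting at `symbol`."""
    if symbol not in GRAMMAR:      # terminal symbol: emit as-is
        return symbol
    rules = GRAMMAR[symbol]
    # once the depth budget is spent, force the shortest production
    # so the derivation is guaranteed to terminate
    rule = rng.choice(rules) if depth < max_depth else min(rules, key=len)
    return "".join(generate(s, rng, depth + 1, max_depth) for s in rule)
```

Pass a freshly seeded `random.Random` to make a test run reproducible; every generated string is a syntactically valid input by construction, which is what makes grammar-directed reduction of failing scripts possible.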

  3. PREFACE: Nuclear Cluster Conference; Cluster'07

    Science.gov (United States)

    Freer, Martin

    2008-05-01

    The Cluster Conference is a long-running conference series dating back to the 1960s, the first being initiated by Wildermuth in Bochum, Germany, in 1969. The most recent meeting was held in Nara, Japan, in 2003, and in 2007 the 9th Cluster Conference was held in Stratford-upon-Avon, UK. As the name suggests, the town of Stratford lies upon the River Avon and, shortly before the conference, due to unprecedented rainfall in the area (approximately 10 cm within half a day), lay in the River Avon! Stratford is the birthplace of the 'Bard of Avon' William Shakespeare, and this formed an intriguing conference backdrop. The meeting was attended by some 90 delegates and the programme contained 65-70 oral presentations; it was opened by a historical perspective presented by Professor Brink (Oxford) and closed by Professor Horiuchi (RCNP) with an overview of the conference and future perspectives. In between, the conference covered aspects of clustering in exotic nuclei (both neutron- and proton-rich), molecular structures in which valence neutrons are exchanged between cluster cores, condensates in nuclei, neutron clusters, superheavy nuclei, clusters in nuclear astrophysical processes, and exotic cluster decays such as 2p and ternary cluster decay. The field of nuclear clustering has become strongly influenced by the physics of radioactive-beam facilities (reflected in the programme), and by the excitement that clustering may have an important impact on the structure of nuclei at the neutron drip-line. It was clear that since Nara the field had progressed substantially, that new themes had emerged and others had crystallized. Two particular topics resonated strongly: condensates and nuclear molecules. These topics are thus likely to be central in the next cluster conference, which will be held in 2011 in the Hungarian city of Debrecen. Martin Freer

  4. Pre-crash scenarios at road junctions: A clustering method for car crash data.

    Science.gov (United States)

    Nitsche, Philippe; Thomas, Pete; Stuetz, Rainer; Welsh, Ruth

    2017-10-01

    Given the recent advancements in autonomous driving functions, one of the main challenges is safe and efficient operation in complex traffic situations such as road junctions. There is a need for comprehensive testing, either in virtual simulation environments or on real-world test tracks. This paper presents a novel data analysis method including the preparation, analysis and visualization of car crash data, to identify the critical pre-crash scenarios at T- and four-legged junctions as a basis for testing the safety of automated driving systems. The presented method employs k-medoids to cluster historical junction crash data into distinct partitions and then applies the association rules algorithm to each cluster to specify the driving scenarios in more detail. The dataset used consists of 1056 junction crashes in the UK, which were exported from the in-depth "On-the-Spot" database. The study resulted in thirteen crash clusters for T-junctions, and six crash clusters for crossroads. Association rules revealed common crash characteristics, which were the basis for the scenario descriptions. The results support existing findings on road junction accidents and provide benchmark situations for safety performance tests in order to reduce the possible number of parameter combinations. Copyright © 2017 Elsevier Ltd. All rights reserved.
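The clustering step described above can be sketched in a few lines. The snippet below is a minimal pure-Python k-medoids (PAM-style) loop on toy 2-D points; the paper's actual input is crash attribute data with an appropriate dissimilarity measure, which is abstracted here as a caller-supplied `dist` function.

```python
import random

def k_medoids(points, k, dist, iters=100, seed=0):
    """Plain PAM-style k-medoids: alternate assignment and medoid update."""
    rng = random.Random(seed)
    medoids = rng.sample(points, k)
    clusters = {}
    for _ in range(iters):
        # assign every point to its nearest medoid
        clusters = {m: [] for m in medoids}
        for p in points:
            nearest = min(medoids, key=lambda m: dist(p, m))
            clusters[nearest].append(p)
        # each cluster's new medoid is the member minimising total
        # distance to the other members (medoids stay actual data points)
        new_medoids = [min(members,
                           key=lambda c: sum(dist(c, q) for q in members))
                       for members in clusters.values()]
        if set(new_medoids) == set(medoids):
            break
        medoids = new_medoids
    return medoids, clusters
```

Because a medoid is always an observed case rather than an averaged centroid, each cluster representative corresponds to a real crash record, which is what makes k-medoids attractive for categorical accident data.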

  5. Automation in organizations: Eternal conflict

    Science.gov (United States)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  6. Structuring heterogeneous biological information using fuzzy clustering of k-partite graphs

    Directory of Open Access Journals (Sweden)

    Theis Fabian J

    2010-10-01

    Full Text Available Abstract Background Extensive and automated data integration in bioinformatics facilitates the construction of large, complex biological networks. However, the challenge lies in the interpretation of these networks. While most research focuses on the unipartite or bipartite case, we address the more general but common situation of k-partite graphs. These graphs contain k different node types and links are only allowed between nodes of different types. In order to reveal their structural organization and describe the contained information in a more coarse-grained fashion, we ask how to detect clusters within each node type. Results Since entities in biological networks regularly have more than one function and hence participate in more than one cluster, we developed a k-partite graph partitioning algorithm that allows for overlapping (fuzzy) clusters. It determines for each node a degree of membership to each cluster. Moreover, the algorithm estimates a weighted k-partite graph that connects the extracted clusters. Our method is fast and efficient, mimicking the multiplicative update rules commonly employed in algorithms for non-negative matrix factorization. It facilitates the decomposition of networks on a chosen scale and therefore allows for analysis and interpretation of structures on various resolution levels. Applying our algorithm to a tripartite disease-gene-protein complex network, we were able to structure this graph on a large scale into clusters that are functionally correlated and biologically meaningful. Locally, smaller clusters enabled reclassification or annotation of the clusters' elements. We exemplified this for the transcription factor MECP2. Conclusions In order to cope with the overwhelming amount of information available from biomedical literature, we need to tackle the challenge of finding structures in large networks with nodes of multiple types. To this end, we presented a novel fuzzy k-partite graph partitioning
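The multiplicative update rules mentioned above can be illustrated for the simplest (bipartite) case with a plain non-negative matrix factorization. This is a hedged sketch of the general idea, not the authors' algorithm: the bipartite adjacency matrix V is factored as V ≈ W·H, and row-normalizing W would give each type-1 node a fuzzy degree of membership in each of the k clusters.

```python
import random

def nmf(V, k, iters=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF: V (n x m) ~= W (n x k) . H (k x m)."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
                 for j in range(len(B[0]))] for i in range(len(A))]

    def T(A):  # transpose
        return [list(row) for row in zip(*A)]

    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H)   (elementwise)
        num, den = matmul(T(W), V), matmul(T(W), matmul(W, H))
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        # W <- W * (V H^T) / (W H H^T)   (elementwise)
        num, den = matmul(V, T(H)), matmul(matmul(W, H), T(H))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H
```

The multiplicative form keeps every factor entry non-negative, which is why soft memberships fall out of the factors directly.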

  7. Automated grouping of action potentials of human embryonic stem cell-derived cardiomyocytes.

    Science.gov (United States)

    Gorospe, Giann; Zhu, Renjun; Millrod, Michal A; Zambidis, Elias T; Tung, Leslie; Vidal, Rene

    2014-09-01

    Methods for obtaining cardiomyocytes from human embryonic stem cells (hESCs) are improving at a significant rate. However, the characterization of these cardiomyocytes (CMs) is evolving at a relatively slower rate. In particular, there is still uncertainty in classifying the phenotype (ventricular-like, atrial-like, nodal-like, etc.) of an hESC-derived cardiomyocyte (hESC-CM). While previous studies identified the phenotype of a CM based on electrophysiological features of its action potential, the criteria for classification were typically subjective and differed across studies. In this paper, we use techniques from signal processing and machine learning to develop an automated approach to discriminate the electrophysiological differences between hESC-CMs. Specifically, we propose a spectral grouping-based algorithm to separate a population of CMs into distinct groups based on the similarity of their action potential shapes. We applied this method to a dataset of optical maps of cardiac cell clusters dissected from human embryoid bodies. While some of the nine cell clusters in the dataset are presented with just one phenotype, the majority of the cell clusters are presented with multiple phenotypes. The proposed algorithm is generally applicable to other action potential datasets and could prove useful in investigating the purification of specific types of CMs from an electrophysiological perspective.
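As a much-simplified stand-in for the spectral grouping method used in the paper, action-potential traces can be grouped by thresholded shape similarity: compute pairwise correlation of the waveforms and take connected components of the resulting similarity graph. The traces and the threshold in the sketch below are illustrative assumptions, not the study's data.

```python
def correlation(x, y):
    """Pearson correlation of two equal-length traces (assumes non-constant)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def group_by_shape(traces, threshold=0.9):
    """Union-find over a thresholded shape-similarity graph."""
    parent = list(range(len(traces)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(traces)):
        for j in range(i + 1, len(traces)):
            if correlation(traces[i], traces[j]) >= threshold:
                parent[find(i)] = find(j)  # merge the two groups
    groups = {}
    for i in range(len(traces)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

With traces that have a long plateau ("ventricular-like") versus a fast repolarisation, the two shape families separate cleanly at a high correlation threshold.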

  8. Mobile home automation-merging mobile value added services and home automation technologies

    OpenAIRE

    Rosendahl, Andreas; Hampe, Felix J.; Botterweck, Goetz

    2007-01-01

    non-peer-reviewed In this paper we study mobile home automation, a field that emerges from an integration of mobile application platforms and home automation technologies. In a conceptual introduction we first illustrate the need for such applications by introducing a two-dimensional conceptual model of mobility. Subsequently we suggest an architecture and discuss different options of how a user might access a mobile home automation service and the controlled devices. As another contrib...

  9. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  10. Dorsal hippocampal NMDA receptors mediate the interactive effects of arachidonylcyclopropylamide and MDMA/ecstasy on memory retrieval in rats.

    Science.gov (United States)

    Ghaderi, Marzieh; Rezayof, Ameneh; Vousooghi, Nasim; Zarrindast, Mohammad-Reza

    2016-04-03

    A combination of cannabis and ecstasy may change cognitive functions more than either drug alone. The present study was designed to investigate the possible involvement of dorsal hippocampal NMDA receptors in the interactive effects of arachidonylcyclopropylamide (ACPA) and ecstasy/MDMA on memory retrieval. Adult male Wistar rats were cannulated into the CA1 regions of the dorsal hippocampus (intra-CA1), and memory retrieval was examined using the step-through type of passive avoidance task. Intra-CA1 microinjection of a selective CB1 receptor agonist, ACPA (0.5-4 ng/rat), immediately before the testing phase (pre-test), but not after the training phase (post-training), impaired memory retrieval. In addition, pre-test intra-CA1 microinjection of MDMA (0.5-1 μg/rat) dose-dependently decreased step-through latency, indicating an amnesic effect of the drug by itself. Interestingly, pre-test microinjection of a higher dose of MDMA into the CA1 regions significantly improved ACPA-induced memory impairment. Moreover, pre-test intra-CA1 microinjection of a selective NMDA receptor antagonist, D-AP5 (1 and 2 μg/rat), inhibited the reversal effect of MDMA on the impairment of memory retrieval induced by ACPA. Pre-test intra-CA1 microinjection of the same doses of D-AP5 alone had no effect on memory retrieval. These findings suggest that ACPA or MDMA consumption can impair memory retrieval, while their co-administration counteracts this amnesic effect through an interaction with the hippocampal glutamatergic NMDA receptor mechanism. Thus, it seems that the tendency to abuse cannabis together with ecstasy may serve to avoid cognitive dysfunction. Copyright © 2015. Published by Elsevier Inc.

  11. A mode of error: Immunoglobulin binding protein (a subset of anti-citrullinated proteins) can cause false-positive tuberculosis test results in rheumatoid arthritis

    Directory of Open Access Journals (Sweden)

    Maria Greenwald

    2017-12-01

    Full Text Available Citrullinated Immunoglobulin Binding Protein (BiP) is a newly described autoimmune target in rheumatoid arthritis (RA), one of many cyclic citrullinated peptides (CCP) or ACPA. BiP is over-expressed in RA patients, causing T cell expansion and increased interferon levels during incubation for the QuantiFERON-Gold tuberculosis test (QFT-G TB). The QFT-G TB has never been validated where interferon is increased by underlying disease, such as RA. Of ACPA-positive RA patients (n = 126), we found a 13% false-positive TB test rate by QFT-G TB. Despite subsequent biologic therapy for 3 years in all 126 RA patients, none showed evidence of TB without INH. Most of the false-positive RA patients reverted to a negative QFT-G test after treatment with biologic therapy. False TB tests correlated with ACPA level (p < 0.02). Three healthy women without arthritis or TB exposure had negative QFT-G TB results. In vitro, all three tested positive every time for TB, correlating with the dose of BiP or anti-BiP added at 2 μg/ml, 5 μg/ml, 10 μg/ml, and 20 μg/ml. BiP, naturally found in the majority of ACPA-positive RA patients, can result in a false-positive QFT-G TB. Subsequent undertreatment of RA, if biologic therapy is withheld, and overtreatment of presumed latent TB may harm patients. Keywords: Tuberculosis, IGRA, Rheumatoid arthritis, Interferon, Anti-citrullinated peptide antibody (ACPA), Immunoglobulin binding protein (BiP)

  12. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  13. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  14. 76 FR 34246 - Automated Commercial Environment (ACE); Announcement of National Customs Automation Program Test...

    Science.gov (United States)

    2011-06-13

    ... Environment (ACE); Announcement of National Customs Automation Program Test of Automated Procedures for In... Customs Automation Program (NCAP) test relating to highway movements of commercial goods that are transported in-bond through the United States from one point in Canada to another point in Canada. The NCAP...

  15. Automated cloning methods.; TOPICAL

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  16. Formation of stable products from cluster-cluster collisions

    International Nuclear Information System (INIS)

    Alamanova, Denitsa; Grigoryan, Valeri G; Springborg, Michael

    2007-01-01

    The formation of stable products from copper cluster-cluster collisions is investigated by using classical molecular-dynamics simulations in combination with an embedded-atom potential. The dependence of the product clusters on impact energy, relative orientation of the clusters, and size of the clusters is studied. The structures and total energies of the product clusters are analysed and compared with those of the colliding clusters before impact. These results, together with the internal temperature, are used in obtaining an increased understanding of cluster fusion processes

  17. Abdominal adipose tissue quantification on water-suppressed and non-water-suppressed MRI at 3T using semi-automated FCM clustering algorithm

    Science.gov (United States)

    Valaparla, Sunil K.; Peng, Qi; Gao, Feng; Clarke, Geoffrey D.

    2014-03-01

    Accurate measurements of human body fat distribution are desirable because excessive body fat is associated with impaired insulin sensitivity, type 2 diabetes mellitus (T2DM) and cardiovascular disease. In this study, we hypothesized that the performance of water suppressed (WS) MRI is superior to non-water suppressed (NWS) MRI for volumetric assessment of abdominal subcutaneous (SAT), intramuscular (IMAT), visceral (VAT), and total (TAT) adipose tissues. We acquired T1-weighted images on a 3T MRI system (TIM Trio, Siemens), which were analyzed using semi-automated segmentation software that employs a fuzzy c-means (FCM) clustering algorithm. Sixteen contiguous axial slices, centered at the L4-L5 level of the abdomen, were acquired in eight T2DM subjects with water suppression (WS) and without (NWS). Histograms from WS images show improved separation of non-fatty tissue pixels from fatty tissue pixels, compared to NWS images. Paired t-tests of WS versus NWS showed a statistically significant lower volume of lipid in the WS images for VAT (145.3 cc less, p=0.006) and IMAT (305 cc less, p1), but not SAT (14.1 cc more, NS). WS measurements of TAT also resulted in lower fat volumes (436.1 cc less, p=0.002). There is strong correlation between WS and NWS quantification methods for SAT measurements (r=0.999), but poorer correlation for VAT studies (r=0.845). These results suggest that NWS pulse sequences may overestimate adipose tissue volumes and that WS pulse sequences are more desirable due to the higher contrast generated between fatty and non-fatty tissues.
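The FCM step can be sketched for one-dimensional pixel intensities. This minimal version (fuzziness exponent m = 2 and two classes, fatty versus non-fatty, both assumptions for illustration) is not the actual segmentation software used in the study; it alternates the standard membership and centre updates.

```python
def fuzzy_cmeans_1d(values, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means on pixel intensities.

    Returns (centers, u) where u[i][j] is the degree of membership of
    values[i] in cluster j (rows sum to 1).
    """
    # initialise centres spread over the intensity range
    lo, hi = min(values), max(values)
    centers = [lo + i * (hi - lo) / (c - 1) for i in range(c)]
    u = [[0.0] * c for _ in values]
    for _ in range(iters):
        # membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        for i, x in enumerate(values):
            d = [abs(x - ck) or 1e-12 for ck in centers]  # avoid /0
            for j in range(c):
                u[i][j] = 1.0 / sum((d[j] / d[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
        # centre update: membership-weighted mean with weights u^m
        centers = [sum((u[i][j] ** m) * x for i, x in enumerate(values)) /
                   sum(u[i][j] ** m for i in range(len(values)))
                   for j in range(c)]
    return centers, u
```

On a bimodal intensity histogram the two centres settle near the fatty and non-fatty peaks, and the membership matrix gives each pixel a soft tissue assignment rather than a hard label.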

  18. Automation, Performance and International Competition

    DEFF Research Database (Denmark)

    Kromann, Lene; Sørensen, Anders

    This paper presents new evidence on trade‐induced automation in manufacturing firms using unique data combining a retrospective survey that we have assembled with register data for 2005‐2010. In particular, we establish a causal effect where firms that have specialized in product types for which...... the Chinese exports to the world market has risen sharply invest more in automated capital compared to firms that have specialized in other product types. We also study the relationship between automation and firm performance and find that firms with high increases in scale and scope of automation have faster...... productivity growth than other firms. Moreover, automation improves the efficiency of all stages of the production process by reducing setup time, run time, and inspection time and increasing uptime and quantity produced per worker. The efficiency improvement varies by type of automation....

  19. Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture

    Science.gov (United States)

    Sanfilippo, Antonio [Richland, WA; Calapristi, Augustin J [West Richland, WA; Crow, Vernon L [Richland, WA; Hetzler, Elizabeth G [Kennewick, WA; Turner, Alan E [Kennewick, WA

    2009-12-22

    Document clustering methods, document cluster label disambiguation methods, document clustering apparatuses, and articles of manufacture are described. In one aspect, a document clustering method includes providing a document set comprising a plurality of documents, providing a cluster comprising a subset of the documents of the document set, using a plurality of terms of the documents, providing a cluster label indicative of subject matter content of the documents of the cluster, wherein the cluster label comprises a plurality of word senses, and selecting one of the word senses of the cluster label.
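A much-simplified version of term-based cluster labelling (omitting the word-sense disambiguation step the patent describes) scores candidate terms by their frequency inside the cluster against their frequency elsewhere; the helper below and its contrast score are illustrative assumptions, not the patented method.

```python
from collections import Counter

def cluster_label(cluster_docs, all_docs, top_n=2):
    """Label a cluster with terms frequent inside it but rare elsewhere."""
    def terms(doc):
        return [w.lower().strip('.,') for w in doc.split()]

    inside = Counter(t for d in cluster_docs for t in terms(d))
    outside = Counter(t for d in all_docs if d not in cluster_docs
                      for t in terms(d))
    # simple contrast score; +1 smooths terms unseen outside the cluster
    score = {t: inside[t] / (1 + outside[t]) for t in inside}
    return sorted(score, key=score.get, reverse=True)[:top_n]
```

The selected label terms would then be the input to the disambiguation step, which picks one word sense per label term based on the cluster's subject matter.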

  20. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  1. The Automation-by-Expertise-by-Training Interaction.

    Science.gov (United States)

    Strauch, Barry

    2017-03-01

    I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval demonstrated identical automation-related operator errors, suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.

  2. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements

  3. Nuclear clustering - a cluster core model study

    International Nuclear Information System (INIS)

    Paul Selvi, G.; Nandhini, N.; Balasubramaniam, M.

    2015-01-01

    Nuclear clustering, like other clustering phenomena in nature, is a much-warranted study, since it would help us understand the nature of the binding of nucleons inside the nucleus, closed-shell behaviour when the system is highly deformed, and dynamics and structure at the extremes. Several models account for the clustering phenomenon of nuclei. We present in this work a cluster core model study of nuclear clustering in light mass nuclei

  4. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  5. Cluster fusion algorithm: application to Lennard-Jones clusters

    DEFF Research Database (Denmark)

    Solov'yov, Ilia; Solov'yov, Andrey V.; Greiner, Walter

    2006-01-01

    paths up to the cluster size of 150 atoms. We demonstrate that in this way all known global minima structures of the Lennard-Jones clusters can be found. Our method provides an efficient tool for the calculation and analysis of atomic cluster structure. With its use we justify the magic number sequence......We present a new general theoretical framework for modelling the cluster structure and apply it to description of the Lennard-Jones clusters. Starting from the initial tetrahedral cluster configuration, adding new atoms to the system and absorbing its energy at each step, we find cluster growing...... for the clusters of noble gas atoms and compare it with experimental observations. We report the striking correspondence of the peaks in the dependence of the second derivative of the binding energy per atom on cluster size calculated for the chain of the Lennard-Jones clusters based on the icosahedral symmetry...
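The binding energy per atom analysed above is straightforward to compute for a given configuration. A minimal sketch in reduced Lennard-Jones units (epsilon = sigma = 1) follows, with the dimer as a built-in check: the pair minimum sits at r = 2^(1/6) sigma with energy -epsilon.

```python
def lj_energy(coords, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy: sum over pairs of 4*eps*((s/r)^12 - (s/r)^6)."""
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j])) ** 0.5
            e += 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    return e

def binding_energy_per_atom(coords):
    """Binding energy per atom, positive for a bound configuration."""
    return -lj_energy(coords) / len(coords)
```

Scanning this quantity as a function of cluster size, and taking its second discrete derivative, is how the magic-number peaks discussed in the abstract are identified.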

  6. Cluster fusion algorithm: application to Lennard-Jones clusters

    DEFF Research Database (Denmark)

    Solov'yov, Ilia; Solov'yov, Andrey V.; Greiner, Walter

    2008-01-01

    We present a new general theoretical framework for modelling the cluster structure and apply it to the description of the Lennard-Jones clusters. Starting from the initial tetrahedral cluster configuration, adding new atoms to the system and absorbing its energy at each step, we find cluster growing paths up to the cluster size of 150 atoms. We demonstrate that in this way all known global minima structures of the Lennard-Jones clusters can be found. Our method provides an efficient tool for the calculation and analysis of atomic cluster structure. With its use we justify the magic number sequence for the clusters of noble gas atoms and compare it with experimental observations. We report the striking correspondence of the peaks in the dependence of the second derivative of the binding energy per atom on cluster size calculated for the chain of the Lennard-Jones clusters based on the icosahedral symmetry...

  7. Human-centred automation: an explorative study

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Miberg, Ann Britt

    1999-05-01

    The purpose of the programme activity on human-centred automation at the HRP is to develop knowledge (in the form of models and theories) and tools (in the form of techniques and simulators) to support design of automation that ensures effective human performance and comprehension. This report presents the work done on both the analytical and experimental sides of this project. The analytical work has surveyed common definitions of automation and traditional design principles. A general finding is that human-centred automation usually is defined in terms of what it is not, partly due to a lack of adequate models of human-automation interaction. Another result is a clarification of the consequences of automation, in particular with regard to situation awareness and workload. The experimental work has taken place as an explorative experiment in HAMMLAB in collaboration with IPSN (France). The purpose of this experiment was to increase the understanding of how automation influences operator performance in NPP control rooms. Two different types of automation (extensive and limited) were considered in scenarios having two different degrees of complexity (high and low), and involving diagnostic and procedural tasks. Six licensed crews from the Loviisa NPP, Finland, participated in the experiment. The dependent variables applied were plant performance, operator performance, self-rated crew performance, situation awareness, workload, and operator trust in the automation. The results from the diagnostic scenarios indicated that operators' judgement of crew efficiency was related to their level of trust in the automation, and further that operators trusted the automation least and rated crew performance lowest in situations where crew performance was efficient, and vice versa. The results from the procedural scenarios indicated that extensive automation efficiently supported operators' performance, and further that operators' judgement of crew performance efficiency

  8. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  9. Pep2Path: automated mass spectrometry-guided genome mining of peptidic natural products.

    Directory of Open Access Journals (Sweden)

    Marnix H Medema

    2014-09-01

    Full Text Available Nonribosomally and ribosomally synthesized bioactive peptides constitute a source of molecules of great biomedical importance, including antibiotics such as penicillin, immunosuppressants such as cyclosporine, and cytostatics such as bleomycin. Recently, an innovative mass-spectrometry-based strategy, peptidogenomics, has been pioneered to effectively mine microbial strains for novel peptidic metabolites. Even though mass-spectrometric peptide detection can be performed quite fast, true high-throughput natural product discovery approaches have still been limited by the inability to rapidly match the identified tandem mass spectra to the gene clusters responsible for the biosynthesis of the corresponding compounds. With Pep2Path, we introduce a software package to fully automate the peptidogenomics approach through the rapid Bayesian probabilistic matching of mass spectra to their corresponding biosynthetic gene clusters. Detailed benchmarking of the method shows that the approach is powerful enough to correctly identify gene clusters even in data sets that consist of hundreds of genomes, which also makes it possible to match compounds from unsequenced organisms to closely related biosynthetic gene clusters in other genomes. Applying Pep2Path to a data set of compounds without known biosynthesis routes, we were able to identify candidate gene clusters for the biosynthesis of five important compounds. Notably, one of these clusters was detected in a genome from a different subphylum of Proteobacteria than that in which the molecule had first been identified. All in all, our approach paves the way towards high-throughput discovery of novel peptidic natural products. Pep2Path is freely available from http://pep2path.sourceforge.net/, implemented in Python, licensed under the GNU General Public License v3 and supported on MS Windows, Linux and Mac OS X.

  10. Toward designing for trust in database automation

    Energy Technology Data Exchange (ETDEWEB)

    Duez, P. P.; Jamieson, G. A. [Cognitive Engineering Laboratory, Univ. of Toronto, 5 King' s College Rd., Toronto, Ont. M5S 3G8 (Canada)

    2006-07-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation; one of their models identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serves, both individually and through inferences between the categories, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  11. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    Duez, P. P.; Jamieson, G. A.

    2006-01-01

    Appropriate reliance on system automation is imperative for safe and productive work, especially in safety-critical systems. It is unsafe to rely on automation beyond its designed use; conversely, it can be both unproductive and unsafe to manually perform tasks that are better relegated to automated tools. Operator trust in automated tools mediates reliance, and trust appears to affect how operators use technology. As automated agents become more complex, the question of trust in automation is increasingly important. In order to achieve proper use of automation, we must engender an appropriate degree of trust that is sensitive to changes in operating functions and context. In this paper, we present research concerning trust in automation in the domain of automated tools for relational databases. Lee and See have provided models of trust in automation; one of their models identifies three key categories of information about the automation that lie along a continuum of attributional abstraction. Purpose-, process-, and performance-related information serves, both individually and through inferences between the categories, to describe automation in such a way as to engender properly calibrated trust. Thus, one can look at information from different levels of attributional abstraction as a general requirements analysis for information key to appropriate trust in automation. The model of information necessary to engender appropriate trust in automation [1] is a general one. Although it describes categories of information, it does not provide insight on how to determine the specific information elements required for a given automated tool. We have applied the Abstraction Hierarchy (AH) to this problem in the domain of relational databases. The AH serves as a formal description of the automation at several levels of abstraction, ranging from a very abstract purpose-oriented description to a more concrete description of the resources involved in the automated process.

  12. Dynamic Fuzzy Clustering Method for Decision Support in Electricity Markets Negotiation

    Directory of Open Access Journals (Sweden)

    Ricardo FAIA

    2016-10-01

    Full Text Available Artificial Intelligence (AI) methods contribute to the construction of systems in which tasks need to be automated. They are typically used for problems that would otherwise have long response times, or when no exact mathematical method can solve the problem. However, the application of AI adds complexity to the development of such applications. AI has frequently been applied in the power systems field, namely in Electricity Markets (EM). In this area, AI applications are essentially used to forecast or estimate the prices of electricity or to search for the best opportunity to sell the product. This paper proposes a clustering methodology that is combined with fuzzy logic in order to estimate EM prices. The proposed method is based on the application of a clustering methodology that groups historic energy contracts according to their price similarity. The optimal number of groups is automatically calculated taking into account the preference for the balance between the estimation error and the number of groups. The centroids of each cluster are used to define a dynamic fuzzy variable that approximates the tendency of the contracts' history. The resulting fuzzy variable allows estimating expected prices for contracts instantaneously and approximating missing values in the historic contracts.
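
The cluster-then-fuzzify estimation pipeline can be sketched roughly as below. The 1-D k-means, the inverse-distance fuzzy memberships, and all function names are hypothetical simplifications of the paper's method, which additionally selects the number of groups automatically.

```python
def kmeans_1d(values, k, iters=50):
    """Tiny 1-D k-means (a stand-in for the paper's clustering step)."""
    step = max(1, len(values) // k)
    centroids = sorted(values)[::step][:k]
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for v in values:
            i = min(range(len(centroids)), key=lambda j: abs(v - centroids[j]))
            groups[i].append(v)
        centroids = [sum(g) / len(g) if g else c
                     for g, c in zip(groups, centroids)]
    return centroids

def estimate_price(contracts, amount, k=2):
    """Fuzzy price estimate from clustered history: cluster contracts
    by traded amount, then blend per-cluster mean prices with
    inverse-distance memberships over the centroids (hypothetical rule)."""
    amounts = [a for a, _ in contracts]
    centroids = kmeans_1d(amounts, k)
    prices = []
    for c in centroids:
        members = [p for a, p in contracts
                   if c == min(centroids, key=lambda cc: abs(a - cc))]
        prices.append(sum(members) / len(members))
    eps = 1e-9
    weights = [1.0 / (abs(amount - c) + eps) for c in centroids]
    total = sum(weights)
    return sum(w / total * p for w, p in zip(weights, prices))
```

A contract whose amount sits near one cluster's centroid receives an estimate close to that cluster's mean price, while intermediate amounts get a blended value.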

  13. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished, along with automated size, shape, and composition analysis over a large relative area.

  14. Dynamic PROOF clusters with PoD: architecture and user experience

    Science.gov (United States)

    Manafov, Anar

    2011-12-01

    PROOF on Demand (PoD) is a tool-set which sets up a PROOF cluster on any resource management system. PoD is a user-oriented product with an easy-to-use GUI and a command-line interface. It is fully automated; no administrative privileges or special knowledge is required to use it. PoD utilizes a plug-in system to use different job submission front-ends. The current PoD distribution is shipped with LSF, Torque (PBS), Grid Engine, Condor, gLite, and SSH plug-ins. The product is to be extended; we therefore plan to implement a plug-in for the AliEn Grid as well. Recently developed algorithms made it possible to efficiently maintain two types of connections: packet-forwarding and native PROOF connections. This helps to properly handle most kinds of workers, with and without firewalls. PoD maintains the PROOF environment automatically and, for example, prevents resource misuse when workers idle for too long. As PoD matures as a product and provides more plug-ins, it is used as a standard for setting up dynamic PROOF clusters in many different institutions. The GSI Analysis Facility (GSIAF) has been in production since 2007. The static PROOF cluster was phased out at the end of 2009; GSIAF is now completely based on PoD. Users create private dynamic PROOF clusters on the general purpose batch farm. This provides easier resource sharing between interactive, local batch, and Grid usage. The main user communities are FAIR and ALICE.

  15. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  16. Driver Psychology during Automated Platooning

    NARCIS (Netherlands)

    Heikoop, D.D.

    2017-01-01

    With the rapid increase in vehicle automation technology, the call for understanding how humans behave while driving in an automated vehicle becomes more urgent. Vehicles that have automated systems such as Lane Keeping Assist (LKA) or Adaptive Cruise Control (ACC) not only support drivers in their

  17. Automated delineation and characterization of drumlins using a localized contour tree approach

    Science.gov (United States)

    Wang, Shujie; Wu, Qiusheng; Ward, Dylan

    2017-10-01

    Drumlins are ubiquitous landforms in previously glaciated regions, formed through a series of complex subglacial processes operating underneath the paleo-ice sheets. Accurate delineation and characterization of drumlins are essential for understanding the formation mechanism of drumlins as well as the flow behaviors and basal conditions of paleo-ice sheets. Automated mapping of drumlins is particularly important for examining the distribution patterns of drumlins across large spatial scales. This paper presents an automated vector-based approach to mapping drumlins from high-resolution light detection and ranging (LiDAR) data. The rationale is to extract sets of concentric contours by building localized contour trees and establishing topological relationships. This automated method can overcome the shortcomings of previous manual and automated methods for mapping drumlins, for instance, the azimuthal biases introduced during the generation of shaded relief images. A case study was carried out over a portion of the New York Drumlin Field. Overall, 1181 drumlins were identified from the LiDAR-derived DEM across the study region, a number that had been underestimated in previous literature. The delineation results were visually and statistically compared to manual digitization results. The morphology of the drumlins was characterized by quantifying length, width, elongation ratio, height, area, and volume. Statistical and spatial analyses were conducted to examine the distribution pattern and spatial variability of drumlin size and form. The drumlins and their morphologic characteristics exhibit significant spatial clustering rather than randomly distributed patterns. The form of drumlins varies from ovoid to spindle shapes towards the downstream direction of paleo-ice flows, along with a decrease in width, area, and volume. This observation is in line with previous studies and may be explained by variations in sediment thickness and/or increases in ice-flow velocity.
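
The morphometric characterization step (length, width, elongation ratio, height, area, volume) can be sketched as below. The half-ellipsoid area and volume formulas and the spindle/ovoid cutoff are assumptions for illustration only; the paper derives these quantities from the delineated contours themselves.

```python
import math

def drumlin_metrics(length_m, width_m, height_m):
    """Morphometric summary for one drumlin, idealised as a
    half-ellipsoid (hypothetical simplification)."""
    elongation = length_m / width_m
    area = math.pi * (length_m / 2) * (width_m / 2)   # ellipse footprint
    volume = (2.0 / 3.0) * area * height_m            # half-ellipsoid
    form = "spindle" if elongation > 3 else "ovoid"   # assumed cutoff
    return {"elongation": elongation, "area_m2": area,
            "volume_m3": volume, "form": form}
```

For example, a 400 m by 100 m drumlin of 10 m relief yields an elongation ratio of 4 and would fall on the spindle side of the assumed cutoff.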

  18. Ectopic lymphoid structures support ongoing production of class-switched autoantibodies in rheumatoid synovium.

    Directory of Open Access Journals (Sweden)

    Frances Humby

    2009-01-01

    Full Text Available Follicular structures resembling germinal centres (GCs) that are characterized by follicular dendritic cell (FDC) networks have long been recognized in chronically inflamed tissues in autoimmune diseases, including the synovium of rheumatoid arthritis (RA). However, it is debated whether these ectopic structures promote autoimmunity and chronic inflammation driving the production of pathogenic autoantibodies. Anti-citrullinated protein/peptide antibodies (ACPA) are highly specific markers of RA, predict a poor prognosis, and have been suggested to be pathogenic. Therefore, the main study objectives were to determine whether ectopic lymphoid structures in RA synovium: (i) express activation-induced cytidine deaminase (AID), the enzyme required for somatic hypermutation and class-switch recombination (CSR) of Ig genes; (ii) support ongoing CSR and ACPA production; and (iii) remain functional in a RA/severe combined immunodeficiency (SCID) chimera model devoid of new immune cell influx into the synovium. Using immunohistochemistry (IHC) and quantitative Taqman real-time PCR (QT-PCR) in synovial tissue from 55 patients with RA, we demonstrated that FDC+ structures invariably expressed AID with a distribution resembling secondary lymphoid organs. Further, AID+/CD21+ follicular structures were surrounded by ACPA+/CD138+ plasma cells, as demonstrated by immune reactivity to citrullinated fibrinogen. Moreover, we identified a novel subset of synovial AID+/CD20+ B cells outside GCs resembling interfollicular large B cells. In order to gain direct functional evidence that AID+ structures support CSR and in situ manufacturing of class-switched ACPA, 34 SCID mice were transplanted with RA synovium and humanely killed at 4 wk for harvesting of transplants and sera. Persistent expression of AID and Igamma-Cmu circular transcripts (identifying ongoing IgM-IgG class-switching) was observed in synovial grafts expressing FDCs/CD21L. Furthermore, synovial mRNA levels of AID

  19. Ectopic lymphoid structures support ongoing production of class-switched autoantibodies in rheumatoid synovium.

    Science.gov (United States)

    Humby, Frances; Bombardieri, Michele; Manzo, Antonio; Kelly, Stephen; Blades, Mark C; Kirkham, Bruce; Spencer, Jo; Pitzalis, Costantino

    2009-01-13

    Follicular structures resembling germinal centres (GCs) that are characterized by follicular dendritic cell (FDC) networks have long been recognized in chronically inflamed tissues in autoimmune diseases, including the synovium of rheumatoid arthritis (RA). However, it is debated whether these ectopic structures promote autoimmunity and chronic inflammation driving the production of pathogenic autoantibodies. Anti-citrullinated protein/peptide antibodies (ACPA) are highly specific markers of RA, predict a poor prognosis, and have been suggested to be pathogenic. Therefore, the main study objectives were to determine whether ectopic lymphoid structures in RA synovium: (i) express activation-induced cytidine deaminase (AID), the enzyme required for somatic hypermutation and class-switch recombination (CSR) of Ig genes; (ii) support ongoing CSR and ACPA production; and (iii) remain functional in a RA/severe combined immunodeficiency (SCID) chimera model devoid of new immune cell influx into the synovium. Using immunohistochemistry (IHC) and quantitative Taqman real-time PCR (QT-PCR) in synovial tissue from 55 patients with RA, we demonstrated that FDC+ structures invariably expressed AID with a distribution resembling secondary lymphoid organs. Further, AID+/CD21+ follicular structures were surrounded by ACPA+/CD138+ plasma cells, as demonstrated by immune reactivity to citrullinated fibrinogen. Moreover, we identified a novel subset of synovial AID+/CD20+ B cells outside GCs resembling interfollicular large B cells. In order to gain direct functional evidence that AID+ structures support CSR and in situ manufacturing of class-switched ACPA, 34 SCID mice were transplanted with RA synovium and humanely killed at 4 wk for harvesting of transplants and sera. Persistent expression of AID and Igamma-Cmu circular transcripts (identifying ongoing IgM-IgG class-switching) was observed in synovial grafts expressing FDCs/CD21L. Furthermore, synovial mRNA levels of AID were

  20. Hybrid Swarm Intelligence Energy Efficient Clustered Routing Algorithm for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar

    2016-01-01

    Full Text Available Currently, wireless sensor networks (WSNs) are used in many applications, namely, environment monitoring, disaster management, industrial automation, and medical electronics. Sensor nodes have many limitations, such as low battery life, small memory space, and limited computing capability. To make a wireless sensor network more energy efficient, swarm intelligence techniques have been applied to resolve many optimization issues in WSNs. In many existing clustering techniques an artificial bee colony (ABC) algorithm is utilized to collect information from the field periodically. Nevertheless, in event-based applications, ant colony optimization (ACO) is a good solution to enhance the network lifespan. In this paper, we combine both algorithms (i.e., ABC and ACO) and propose a new hybrid ABCACO algorithm to solve a Nondeterministic Polynomial (NP)-hard and finite problem of WSNs. The ABCACO algorithm is divided into three main parts: (i) selection of the optimal number of subregions and further subregion parts, (ii) cluster head selection using the ABC algorithm, and (iii) efficient data transmission using the ACO algorithm. We use a hierarchical clustering technique for data transmission; the data is transmitted from member nodes to the subcluster heads and then from subcluster heads to the elected cluster heads based on some threshold value. Cluster heads use an ACO algorithm to discover the best route for data transmission to the base station (BS). The proposed approach is very useful in designing the framework for forest fire detection and monitoring. The simulation results show that the ABCACO algorithm enhances the stability period by 60% and also improves the goodput by 31% against LEACH and WSNCABC, respectively.
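
The hierarchical, threshold-gated transmission scheme can be sketched as follows. Electing the head by maximum residual energy is a greedy stand-in for the paper's ABC-based selection, and the reading-threshold filter is a hypothetical rule for illustration.

```python
def elect_head(nodes):
    """Pick the node with the highest residual energy as cluster head
    (greedy stand-in for the ABC-based selection in the paper)."""
    return max(nodes, key=lambda n: n["energy"])

def forward_readings(members, threshold):
    """Members report to their sub-cluster head; only readings at or
    above the threshold are forwarded upward (hypothetical filter)."""
    head = elect_head(members)
    forwarded = [m["reading"] for m in members
                 if m is not head and m["reading"] >= threshold]
    return head, forwarded
```

In an event-driven deployment such as fire detection, the threshold keeps routine readings local and pushes only anomalous ones toward the base station.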

  1. Automated detection of microcalcification clusters in digital mammograms based on wavelet domain hidden Markov tree modeling

    International Nuclear Information System (INIS)

    Regentova, E.; Zhang, L.; Veni, G.; Zheng, J.

    2007-01-01

    A system is designed for detecting microcalcification clusters (MCC) in digital mammograms. The system is intended for computer-aided diagnostic prompting. Further discrimination of MCC as benign or malignant is assumed to be performed by radiologists. Processing of mammograms is based on statistical modeling by means of wavelet domain hidden Markov trees (WHMT). Segmentation is performed by weighted likelihood evaluation, followed by classification based on spatial filters for detection of a single microcalcification (MC) and of a cluster of MCs. The analysis is carried out on FROC curves for 40 mammograms from the mini-MIAS database and for 100 mammograms with 50 cancerous and 50 benign cases from the DDSM database. The designed system is capable of detecting 100% of true positive cases in these sets. The rate of false positives is 2.9 per case for the mini-MIAS dataset and 0.01 for the DDSM images. (orig.)

  2. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.

  3. Segmentation of the Clustered Cells with Optimized Boundary Detection in Negative Phase Contrast Images.

    Directory of Open Access Journals (Sweden)

    Yuliang Wang

    Full Text Available Cell image segmentation plays a central role in numerous biology studies and clinical applications. As a result, the development of cell image segmentation algorithms with high robustness and accuracy is attracting more and more attention. In this study, an automated cell image segmentation algorithm is developed to improve cell boundary detection and the segmentation of clustered cells for all cells in the field of view in negative phase contrast images. A new method that combines a thresholding method and an edge-based active contour method is proposed to optimize cell boundary detection. In order to segment clustered cells, the geographic peaks of cell light intensity are utilized to detect the number and locations of the clustered cells. In this paper, the working principles of the algorithms are described. The influence of parameters in cell boundary detection and the selection of the threshold value on the final segmentation results are investigated. Finally, the proposed algorithm is applied to negative phase contrast images from different experiments, and its performance is evaluated. Results show that the proposed method can achieve optimized cell boundary detection and highly accurate segmentation for clustered cells.
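
The peak-counting idea for splitting clustered cells can be illustrated on a toy intensity grid. This is a minimal local-maximum counter on a plain list-of-lists, not the paper's implementation, which operates on real phase contrast images.

```python
def count_cells(intensity, background=0):
    """Count touching cells in a segmented region by counting local
    intensity peaks (8-connected neighbourhood), echoing the paper's
    use of intensity peaks to split clustered cells."""
    rows, cols = len(intensity), len(intensity[0])
    peaks = 0
    for r in range(rows):
        for c in range(cols):
            v = intensity[r][c]
            if v <= background:
                continue
            neighbours = [intensity[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c)]
            if all(v > nb for nb in neighbours):
                peaks += 1
    return peaks
```

Two touching cells appear as one connected foreground blob but two intensity peaks, so the peak count recovers the cell count.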

  4. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Science.gov (United States)

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.
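
The model-based clustering idea can be sketched with a minimal two-component 1-D Gaussian mixture fitted by expectation-maximization. The paper fits richer multivariate models to CPT logs, so this is an illustration of the principle only.

```python
import math

def em_gmm_1d(data, iters=50):
    """Two-component 1-D Gaussian mixture fitted by EM: a minimal
    sketch of model-based clustering."""
    mu = [min(data), max(data)]      # crude initialisation
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                    for k in range(2)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return mu, var, w
```

Unlike hard unsupervised partitioning, the fitted model assigns each sounding a probability of belonging to each soil class, which is what makes the approach "model-based".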

  5. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Directory of Open Access Journals (Sweden)

    Bart Rogiers

    Full Text Available Cone penetration testing (CPT is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.

  6. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates large datasets that require time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with a high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
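
The extremum-detection-and-pruning step can be approximated in one dimension with a prominence filter: find local maxima of a density trace, then keep only those that rise sufficiently above their surrounding valleys. This is a simplified, hypothetical analogue of the paper's lifetime diagram built on a minimum spanning tree.

```python
def prominent_maxima(density, min_prominence):
    """Indices of local maxima whose prominence (height above the
    higher of the two flanking valleys) meets a threshold."""
    n = len(density)
    maxima = [i for i in range(1, n - 1)
              if density[i] > density[i - 1] and density[i] >= density[i + 1]]
    kept = []
    for i in maxima:
        h = density[i]

        def side_valley(step):
            # walk outward until a sample at least as high as the peak,
            # tracking the lowest value seen along the way
            lo, j = h, i + step
            while 0 <= j < n and density[j] < h:
                lo = min(lo, density[j])
                j += step
            return lo

        prominence = h - max(side_valley(-1), side_valley(1))
        if prominence >= min_prominence:
            kept.append(i)
    return kept
```

Raising the prominence threshold prunes shallow extrema, leaving only density peaks persistent enough to indicate a candidate electron bunch.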

  7. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second step of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipette station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  8. Comprehensive cluster analysis with Transitivity Clustering.

    Science.gov (United States)

    Wittkop, Tobias; Emig, Dorothea; Truss, Anke; Albrecht, Mario; Böcker, Sebastian; Baumbach, Jan

    2011-03-01

Transitivity Clustering is a method for the partitioning of biological data into groups of similar objects, such as genes. It provides integrated access to various functions addressing each step of a typical cluster analysis. To facilitate this, Transitivity Clustering is accessible online and offers three user-friendly interfaces: a powerful stand-alone version, a web interface, and a collection of Cytoscape plug-ins. In this paper, we describe three major workflows: (i) protein (super)family detection with Cytoscape, (ii) protein homology detection with incomplete gold standards and (iii) clustering of gene expression data. This protocol guides the user through the most important features of Transitivity Clustering and takes ∼1 h to complete.
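As a rough, hypothetical stand-in for what such a similarity-based partitioning produces, the sketch below groups objects into connected components of a thresholded similarity graph using union-find; the object names, similarity scores and threshold are all invented. Transitivity Clustering itself goes further by adding/removing edges so the graph becomes transitive (a union of cliques), which this sketch does not attempt.

```python
# Invented pairwise similarities between five objects and a similarity cutoff
sims = {("A", "B"): 0.9, ("B", "C"): 0.8, ("A", "C"): 0.7,
        ("D", "E"): 0.95, ("C", "D"): 0.1}
threshold = 0.5

parent = {}

def find(x):
    """Union-find representative with path halving."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

for (u, v), s in sims.items():
    if s >= threshold:              # keep only sufficiently similar pairs
        parent[find(u)] = find(v)

# Collect the partition: objects sharing a representative form one cluster
objects = {o for pair in sims for o in pair}
clusters = {}
for o in objects:
    clusters.setdefault(find(o), set()).add(o)
```

Here {A, B, C} and {D, E} come out as separate groups because the C-D similarity falls below the cutoff.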

  9. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly to maliciously alter click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. A multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
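A minimal sketch of the binary-classification step, assuming two invented per-session features (query rate and click-through rate) and a hand-rolled logistic regression; the features, synthetic data and decision threshold are illustrative only, not the paper's feature set or model.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical per-session features: [queries per minute, click-through rate]
humans = np.c_[rng.normal(0.5, 0.2, 300), rng.normal(0.6, 0.15, 300)]
bots = np.c_[rng.normal(5.0, 1.0, 300), rng.normal(0.05, 0.05, 300)]
X = np.vstack([humans, bots])
y = np.r_[np.zeros(300), np.ones(300)]      # 1 = automated traffic

# Binary logistic-regression classifier trained by plain gradient descent
Xb = np.c_[np.ones(len(X)), X]              # prepend a bias column
wts = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ wts))
    wts -= 0.1 * Xb.T @ (p - y) / len(y)

pred = (1.0 / (1.0 + np.exp(-Xb @ wts)) > 0.5).astype(int)
accuracy = float((pred == y).mean())
```

The learned positive weight on query rate reflects the behavioral intuition in the abstract: sessions issuing queries far faster than a human plausibly could are flagged as automated.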

  10. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  11. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a

  12. Cluster-cluster correlations and constraints on the correlation hierarchy

    Science.gov (United States)

    Hamilton, A. J. S.; Gott, J. R., III

    1988-01-01

The hypothesis that galaxies cluster around clusters at least as strongly as they cluster around galaxies imposes constraints on the hierarchy of correlation amplitudes in hierarchical clustering models. The distributions which saturate these constraints are the Rayleigh-Levy random walk fractals proposed by Mandelbrot; for these fractal distributions cluster-cluster correlations are all identically equal to galaxy-galaxy correlations. If correlation amplitudes exceed the constraints, as is observed, then cluster-cluster correlations must exceed galaxy-galaxy correlations, as is observed.

  13. CONSTRAINING CLUSTER PHYSICS WITH THE SHAPE OF X-RAY CLUSTERS: COMPARISON OF LOCAL X-RAY CLUSTERS VERSUS ΛCDM CLUSTERS

    International Nuclear Information System (INIS)

    Lau, Erwin T.; Nagai, Daisuke; Kravtsov, Andrey V.; Vikhlinin, Alexey; Zentner, Andrew R.

    2012-01-01

Recent simulations of cluster formation have demonstrated that condensation of baryons into central galaxies during cluster formation can drive the shape of the gas distribution in galaxy clusters significantly rounder out to their virial radius. These simulations generally predict stellar fractions within cluster virial radii that are ∼2-3 times larger than the stellar masses deduced from observations. In this paper, we compare ellipticity profiles of simulated clusters performed with varying input physics (radiative cooling, star formation, and supernova feedback) to the cluster ellipticity profiles derived from Chandra and ROSAT observations, in an effort to constrain the fraction of gas that cools and condenses into the central galaxies within clusters. We find that local relaxed clusters have an average ellipticity of ε = 0.18 ± 0.05 in the radial range 0.04 ≤ r/r_500 ≤ 1. At larger radii, r > 0.1 r_500, the observed ellipticity profiles agree well with the predictions of non-radiative simulations. In contrast, the ellipticity profiles of simulated clusters that include dissipative gas physics deviate significantly from the observed ellipticity profiles at all radii. The dissipative simulations overpredict (underpredict) ellipticity in the inner (outer) regions of galaxy clusters. By comparing simulations with and without dissipative gas physics, we show that gas cooling causes the gas distribution to be more oblate in the central regions, but makes the outer gas distribution more spherical. We find that late-time gas cooling and star formation are responsible for the significantly oblate gas distributions in cluster cores, but the gas shapes outside of cluster cores are set primarily by baryon dissipation at high redshift (z ≥ 2). Our results indicate that the shapes of X-ray emitting gas in galaxy clusters, especially at large radii, can be used to place constraints on cluster gas physics, making them potential probes of the history of baryonic

  14. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner.

  15. Application of synthetic peptides for detection of anti-citrullinated peptide antibodies

    DEFF Research Database (Denmark)

    Trier, Nicole Hartwig; Holm, Bettina Eide; Slot, Ole

    2016-01-01

    Anti-citrullinated protein antibodies (ACPAs) are a hallmark of rheumatoid arthritis (RA) and represent an important tool for the serological diagnosis of RA. In this study, we describe ACPA reactivity to overlapping citrullinated Epstein-Barr virus nuclear antigen-1 (EBNA-1)-derived peptides...... (n=40), systemic lupus erythematosus (n=20), Sjögren's syndrome (n=40)) were screened for antibody reactivity. Antibodies to a panel of five citrullinated EBNA-1 peptides were found in 67% of RA sera, exclusively of the IgG isotype, while 53% of the patient sera reacted with a single peptide......, ARGGSRERARGRGRG-Cit-GEKR, accounting for more than half of the ACPA reactivity alone. Moreover, these antibodies were detected in 10% of CCP2-negative RA sera. In addition, 47% of the RA sera reacted with two or three citrullinated EBNA-1 peptides from the selected peptide panel. Furthermore, a negative...

  16. Spike sorting using locality preserving projection with gap statistics and landmark-based spectral clustering.

    Science.gov (United States)

    Nguyen, Thanh; Khosravi, Abbas; Creighton, Douglas; Nahavandi, Saeid

    2014-12-30

Understanding neural functions requires knowledge from analysing electrophysiological data. The process of assigning spikes of a multichannel signal into clusters, called spike sorting, is one of the important problems in such analysis. There have been various automated spike sorting techniques with both advantages and disadvantages regarding accuracy and computational costs. Therefore, developing spike sorting methods that are highly accurate and computationally inexpensive is always a challenge in the biomedical engineering practice. An automatic unsupervised spike sorting method is proposed in this paper. The method uses features extracted by the locality preserving projection (LPP) algorithm. These features afterwards serve as inputs for the landmark-based spectral clustering (LSC) method. Gap statistics (GS) is employed to evaluate the number of clusters before the LSC can be performed. The proposed LPP-LSC is a highly accurate and computationally inexpensive spike sorting approach. LPP spike features are very discriminative, thereby boosting the performance of clustering methods. Furthermore, the LSC method exhibits its efficiency when integrated with the cluster evaluator GS. The proposed method's accuracy is approximately 13% superior to that of the benchmark combination of wavelet transformation and superparamagnetic clustering (WT-SPC). Additionally, LPP-LSC computing time is six times less than that of the WT-SPC. LPP-LSC obviously demonstrates a win-win spike sorting solution meeting both accuracy and computational cost criteria. LPP and LSC are linear algorithms that help reduce the computational burden and thus their combination can be applied to real-time spike analysis. Copyright © 2014 Elsevier B.V. All rights reserved.
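The gap-statistic step for choosing the number of clusters can be sketched as follows, with a tiny k-means on synthetic 2-D features standing in for the LPP features and landmark-based spectral clustering used in the paper; all data, cluster positions and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
# Three well-separated synthetic 2-D "spike feature" clusters (invented)
X = np.vstack([rng.normal(c, 0.3, (100, 2)) for c in [(0, 0), (4, 0), (2, 4)]])

def kmeans_inertia(X, k, iters=30, restarts=5):
    """Tiny restarted k-means; returns the best within-cluster sum of squares."""
    best = np.inf
    for _ in range(restarts):
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            lab = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
            centers = np.array([X[lab == j].mean(0) if (lab == j).any() else centers[j]
                                for j in range(k)])
        best = min(best, ((X - centers[lab]) ** 2).sum())
    return best

def gap(X, k, n_ref=10):
    """Gap statistic: mean log dispersion of uniform reference data minus observed."""
    lo, hi = X.min(0), X.max(0)
    ref = [np.log(kmeans_inertia(rng.uniform(lo, hi, X.shape), k)) for _ in range(n_ref)]
    return float(np.mean(ref) - np.log(kmeans_inertia(X, k)))

gaps = {k: gap(X, k) for k in (1, 2, 3, 4)}
```

The gap peaks at the true number of groups because real clusters shrink the within-cluster dispersion far more than structure-free reference data do.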

  17. Building the library of RNA 3D nucleotide conformations using the clustering approach

    Directory of Open Access Journals (Sweden)

    Zok Tomasz

    2015-09-01

Full Text Available An increasing number of known RNA 3D structures contributes to the recognition of various RNA families and identification of their features. These tasks are based on an analysis of RNA conformations conducted at different levels of detail. On the other hand, the knowledge of native nucleotide conformations is crucial for structure prediction and understanding of RNA folding. However, this knowledge is stored in structural databases in a rather distributed form. Therefore, only automated methods for sampling the space of RNA structures can reveal plausible conformational representatives useful for further analysis. Here, we present a machine learning-based approach to inspect the dataset of RNA three-dimensional structures and to create a library of nucleotide conformers. A median neural gas algorithm is applied to cluster nucleotide structures upon their trigonometric description. The clustering procedure is two-stage: (i) backbone-driven and (ii) ribose-driven. We show the resulting library that contains RNA nucleotide representatives over the entire data, and we evaluate its quality by computing normal distribution measures and the average RMSD between data points as well as the prototype within each cluster.

  18. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  19. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  20. Work Planning Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

Work planning automation, installation possibilities and future outlook at the mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  1. Convex Clustering: An Attractive Alternative to Hierarchical Clustering

    Science.gov (United States)

    Chen, Gary K.; Chi, Eric C.; Ranola, John Michael O.; Lange, Kenneth

    2015-01-01

    The primary goal in cluster analysis is to discover natural groupings of objects. The field of cluster analysis is crowded with diverse methods that make special assumptions about data and address different scientific aims. Despite its shortcomings in accuracy, hierarchical clustering is the dominant clustering method in bioinformatics. Biologists find the trees constructed by hierarchical clustering visually appealing and in tune with their evolutionary perspective. Hierarchical clustering operates on multiple scales simultaneously. This is essential, for instance, in transcriptome data, where one may be interested in making qualitative inferences about how lower-order relationships like gene modules lead to higher-order relationships like pathways or biological processes. The recently developed method of convex clustering preserves the visual appeal of hierarchical clustering while ameliorating its propensity to make false inferences in the presence of outliers and noise. The solution paths generated by convex clustering reveal relationships between clusters that are hidden by static methods such as k-means clustering. The current paper derives and tests a novel proximal distance algorithm for minimizing the objective function of convex clustering. The algorithm separates parameters, accommodates missing data, and supports prior information on relationships. Our program CONVEXCLUSTER incorporating the algorithm is implemented on ATI and nVidia graphics processing units (GPUs) for maximal speed. Several biological examples illustrate the strengths of convex clustering and the ability of the proximal distance algorithm to handle high-dimensional problems. CONVEXCLUSTER can be freely downloaded from the UCLA Human Genetics web site at http://www.genetics.ucla.edu/software/ PMID:25965340
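A minimal sketch of the convex clustering objective, minimized here by plain gradient descent on a smoothed penalty rather than the paper's proximal distance algorithm; the data, unit pair weights and regularization strength are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# Two tight groups of five points each; every point i gets its own centroid u_i
X = np.vstack([rng.normal(0.0, 0.1, (5, 2)), rng.normal(3.0, 0.1, (5, 2))])
gamma, eps, step = 0.2, 1e-4, 0.01

# Minimize 0.5 * sum_i ||x_i - u_i||^2 + gamma * sum_{i != j} ||u_i - u_j||
# (ordered pairs, unit weights) by gradient descent on a smoothed norm
U = X.copy()
for _ in range(3000):
    diff = U[:, None, :] - U[None, :, :]          # u_i - u_j for all pairs
    norms = np.sqrt((diff ** 2).sum(-1) + eps)    # smoothed pairwise norms
    U = U - step * ((U - X) + gamma * (diff / norms[:, :, None]).sum(axis=1))

d_same = float(np.linalg.norm(U[0] - U[1]))       # within-group centroids fuse
d_diff = float(np.linalg.norm(U[0] - U[5]))       # groups shrink together but stay apart
```

Centroids of similar points fuse as gamma grows, which is exactly what produces the hierarchical-looking solution path described in the abstract; the proximal distance algorithm handles the non-smooth penalty directly instead of smoothing it.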

  2. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
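A sketch of the adaptive-allocation idea, assuming the commonly used EEG engagement index beta/(alpha + theta) and a hypothetical threshold rule; the band powers and cut-offs below are invented, not those of the study.

```python
def engagement_index(beta, alpha, theta):
    """EEG engagement index, commonly computed as beta / (alpha + theta);
    exact bands and electrode sites vary between studies (assumed here)."""
    return beta / (alpha + theta)

def allocate_mode(index, low=0.35, high=0.65):
    """Hypothetical allocation rule: low engagement returns the task to manual
    control to re-engage the operator; high engagement hands it to automation."""
    if index < low:
        return "manual"
    if index > high:
        return "automatic"
    return "adaptive aiding"

# Three invented (beta, alpha, theta) band-power readings:
modes = [allocate_mode(engagement_index(b, a, t))
         for b, a, t in [(0.5, 1.5, 1.0), (2.0, 1.5, 1.0), (1.5, 2.0, 1.0)]]
```

The closed loop cycles task difficulty against the operator's measured state, which is the mechanism the self-regulation training in the study is meant to exploit.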

  3. Determination of proper motions in the Pleiades cluster

    Science.gov (United States)

    Schilbach, E.

    1991-04-01

    For 458 stars in the Pleiades field from the catalog of Eichhorn et al. (1970) proper motions were derived on Tautenburg and CERGA Schmidt telescope plates measured with the automated measuring machine MAMA in Paris. The catalog positions were considered as first epoch coordinates with an epoch difference of ca. 33 years to the observations. The results show good coincidence of proper motions derived with both Schmidt telescopes within the error bars. Comparison with proper motions determined by Vasilevskis et al. (1979) displays some significant differences but no systematic effects depending on plate coordinates or magnitudes could be found. An accuracy of 0.3 arcsec/100a for one proper motion component was estimated. According to the criterion of common proper motion 34 new cluster members were identified.
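The proper-motion and membership computations reduce to simple arithmetic; the sketch below uses invented positions, the study's ~33-year epoch difference, and a hypothetical common-proper-motion criterion (the cluster motion and error values are made up).

```python
def proper_motion(pos_epoch1, pos_epoch2, delta_years):
    """One proper-motion component in arcsec per 100 years from two epochs."""
    return (pos_epoch2 - pos_epoch1) / delta_years * 100.0

# A star displaced by 0.66 arcsec over the ~33-year baseline (invented values):
mu = proper_motion(10.00, 10.66, 33.0)

# Hypothetical common-proper-motion membership test against the cluster mean:
cluster_mu, sigma = 2.1, 0.3   # assumed cluster motion and per-star error
is_member = abs(mu - cluster_mu) < 3 * sigma
```

With the quoted accuracy of 0.3 arcsec/100 a per component, a tolerance of a few sigma around the cluster's mean motion is the kind of criterion that yields the new member identifications mentioned above.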

  4. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchase system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  5. Exposure to passive smoking and rheumatoid arthritis risk: results from the Swedish EIRA study.

    Science.gov (United States)

    Hedström, Anna Karin; Klareskog, Lars; Alfredsson, Lars

    2018-07-01

    Smoking has consistently been associated with increased risk of developing rheumatoid arthritis (RA). The aim of this study was to estimate the influence of passive smoking on the risk of developing anti-cyclic citrullinated peptide antibodies (ACPA)-positive and ACPA-negative RA. A population-based case-control study using incident cases of RA was performed in Sweden, and the study population in this report was restricted to include never-smokers (589 cases, 1764 controls). The incidence of RA among never-smokers who had been exposed to passive smoking was compared with that of never-smokers who had never been exposed, by calculating the OR with a 95% CI employing logistic regression. No association was observed between exposure to passive smoking and RA risk (OR 1.0, 95% CI 0.8 to 1.2 for ACPA-positive RA, and OR 0.9, 95% CI 0.7 to 1.2, for ACPA-negative RA). No suggestion of a trend between duration of passive smoking and RA risk was observed. No association was observed between exposure to passive smoking and RA risk, which may be explained by a threshold below which no association between smoke exposure and RA occurs. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
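For a crude (unadjusted) version of the reported effect measure, an odds ratio with a Wald 95% CI can be computed directly from a 2x2 case-control table; the study itself used logistic regression, and the counts below are invented, not the EIRA data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and Wald 95% CI from a 2x2 table:
    a/b = exposed/unexposed cases, c/d = exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Invented counts (not the EIRA data), chosen to give an OR near 1.0:
or_, lo, hi = odds_ratio_ci(200, 389, 600, 1164)
```

A confidence interval straddling 1.0, as in the abstract's OR 1.0 (0.8 to 1.2), is what "no association observed" means operationally.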

  6. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, with stimulus size modulation during testing, based on stimulus intensity and conventional standard automated perimetry, with that of the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated (p 33.40, p modulation standard automated perimetry than with conventional standard automated perimetry, but the visual-field defect size was smaller (p modulation-standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry (p modulation standard automated perimetry than with conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold value of the two testing modalities correlated well. However, the potential of a large stimulus presented at an area with a decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  7. Cluster management.

    Science.gov (United States)

    Katz, R

    1992-11-01

Cluster management is a management model that fosters decentralization of management, develops leadership potential of staff, and creates ownership of unit-based goals. Unlike shared governance models, there is no formal structure created by committees and it is less threatening for managers. There are two parts to the cluster management model. One is the formation of cluster groups, consisting of all staff and facilitated by a cluster leader. The cluster groups function for communication and problem-solving. The second part of the cluster management model is the creation of task forces. These task forces are designed to work on short-term goals, usually in response to solving one of the unit's goals. Sometimes the task forces are used for quality improvement or system problems. Clusters are groups of not more than five or six staff members, facilitated by a cluster leader. A cluster is made up of individuals who work the same shift. For example, staff of all job titles who work days would form one cluster. There would be registered nurses, licensed practical nurses, nursing assistants, and unit clerks in the cluster. The cluster leader is chosen by the manager based on certain criteria and is trained for this specialized role. The concept of cluster management, criteria for choosing leaders, training for leaders, using cluster groups to solve quality improvement issues, and the learning process necessary for manager support are described.

  8. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  9. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  10. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 × 10⁹ bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  11. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  12. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system as per the need of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration of a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision, followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  13. Portfolio of automated trading systems: complexity and learning set size issues.

    Science.gov (United States)

    Raudys, Sarunas

    2013-03-01

    In this paper, we consider using profit/loss histories of multiple automated trading systems (ATSs) as N input variables in portfolio management. By means of multivariate statistical analysis and simulation studies, we analyze the influences of sample size (L) and input dimensionality on the accuracy of determining the portfolio weights. We find that degradation in portfolio performance due to inexact estimation of N means and N(N - 1)/2 correlations is proportional to N/L; however, estimation of N variances does not worsen the result. To reduce unhelpful sample size/dimensionality effects, we perform a clustering of the N time series and split them into a small number of blocks. Each block is composed of mutually correlated ATSs and generates an expert trading agent based on a nontrainable 1/N portfolio rule. To increase the diversity of the expert agents, we use training sets of different lengths for clustering. In the output of the portfolio management system, a regularized mean-variance framework-based fusion agent is developed in each walk-forward step of an out-of-sample portfolio validation experiment. Experiments with real financial data (2003-2012) confirm the effectiveness of the suggested approach.
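
The block construction described in this abstract can be sketched in a few lines: cluster the N strategies by the correlation of their profit/loss histories, then apply the nontrainable 1/N rule within each block (blocks weighted equally). This is an illustrative sketch under our own assumptions, not the author's implementation; the farthest-point seeding and function names are invented for the example.

```python
import numpy as np

def block_portfolio_weights(pnl, n_blocks):
    """Split N strategies (columns of pnl) into correlated blocks and
    weight each block's members by the 1/N rule: equal weight within a
    block, blocks weighted equally. Illustrative sketch only."""
    corr = np.corrcoef(pnl.T)                 # N x N strategy correlations
    feats = 1.0 - corr                        # distance-like feature rows
    # deterministic farthest-point seeding, then a few k-means sweeps
    centers = [feats[0]]
    for _ in range(n_blocks - 1):
        d = np.min([((feats - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(feats[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(25):
        labels = np.argmin(((feats[:, None] - centers) ** 2).sum(-1), axis=1)
        for k in range(n_blocks):
            if np.any(labels == k):
                centers[k] = feats[labels == k].mean(axis=0)
    # 1/N inside each block, nonempty blocks weighted equally overall
    weights = np.zeros(pnl.shape[1])
    for k in range(n_blocks):
        members = labels == k
        if members.any():
            weights[members] = 1.0 / (n_blocks * members.sum())
    return labels, weights
```

With mutually correlated groups of strategies, the clustering step recovers the blocks and the resulting weights sum to one across the portfolio.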

  14. Lifting to cluster-tilting objects in higher cluster categories

    OpenAIRE

    Liu, Pin

    2008-01-01

    In this note, we consider the $d$-cluster-tilted algebras, the endomorphism algebras of $d$-cluster-tilting objects in $d$-cluster categories. We show that a tilting module over such an algebra lifts to a $d$-cluster-tilting object in this $d$-cluster category.

  15. Data Clustering

    Science.gov (United States)

    Wagstaff, Kiri L.

    2012-03-01

    On obtaining a new data set, the researcher is immediately faced with the challenge of obtaining a high-level understanding from the observations. What does a typical item look like? What are the dominant trends? How many distinct groups are included in the data set, and how is each one characterized? Which observable values are common, and which rarely occur? Which items stand out as anomalies or outliers from the rest of the data? This challenge is exacerbated by the steady growth in data set size [11] as new instruments push into new frontiers of parameter space, via improvements in temporal, spatial, and spectral resolution, or by the desire to "fuse" observations from different modalities and instruments into a larger-picture understanding of the same underlying phenomenon. Data clustering algorithms provide a variety of solutions for this task. They can generate summaries, locate outliers, compress data, identify dense or sparse regions of feature space, and build data models. It is useful to note up front that "clusters" in this context refer to groups of items within some descriptive feature space, not (necessarily) to "galaxy clusters" which are dense regions in physical space. The goal of this chapter is to survey a variety of data clustering methods, with an eye toward their applicability to astronomical data analysis. In addition to improving the individual researcher’s understanding of a given data set, clustering has led directly to scientific advances, such as the discovery of new subclasses of stars [14] and gamma-ray bursts (GRBs) [38]. All clustering algorithms seek to identify groups within a data set that reflect some observed, quantifiable structure. Clustering is traditionally an unsupervised approach to data analysis, in the sense that it operates without any direct guidance about which items should be assigned to which clusters. There has been a recent trend in the clustering literature toward supporting semisupervised or constrained
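
As a concrete illustration of the tasks this survey lists (summaries, outlier location, group discovery), here is a minimal k-means pass over a small feature matrix, with per-cluster means as the "summaries" and the point farthest from its own center flagged as an anomaly. The data, seeding strategy, and parameter choices are invented for the example; no specific method from the chapter is reproduced.

```python
import numpy as np

def kmeans(X, k, iters=30):
    """Plain k-means with deterministic farthest-point seeding.
    Returns cluster labels and per-cluster mean vectors."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def most_anomalous(X, labels, centers):
    """Index of the item farthest from its own cluster center (an outlier)."""
    d = np.linalg.norm(X - centers[labels], axis=1)
    return int(np.argmax(d))
```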

  16. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation.
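
The two ingredients named in the abstract, a linear discriminant that merges candidate features into one score and ROC analysis of that score, can be sketched generically. Fisher's linear discriminant gives the merging direction, and the area under the ROC curve equals the Mann-Whitney rank statistic of the scores. The features and data below are synthetic; this is a sketch of the technique, not the authors' code.

```python
import numpy as np

def fisher_lda_scores(X, y):
    """Merge feature columns into one score along Fisher's discriminant:
    w = Sw^-1 (mu1 - mu0), with Sw the pooled within-class scatter."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(X.shape[1]), m1 - m0)
    return X @ w

def auc(scores, y):
    """Area under the ROC curve via the Mann-Whitney rank statistic."""
    pos, neg = scores[y == 1], scores[y == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties
```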

  17. Automation of Electrical Cable Harnesses Testing

    Directory of Open Access Journals (Sweden)

    Zhuming Bi

    2017-12-01

    Full Text Available Traditional automated systems, such as industrial robots, are applied in well-structured environments, and many automated systems have a limited adaptability to deal with complexity and uncertainty; therefore, the applications of industrial robots in small- and medium-sized enterprises (SMEs are very limited. The majority of manual operations in SMEs are too complicated for automation. The rapidly developed information technologies (IT has brought new opportunities for the automation of manufacturing and assembly processes in the ill-structured environments. Note that an automation solution should be designed to meet the given requirements of the specified application, and it differs from one application to another. In this paper, we look into the feasibility of automated testing for electric cable harnesses, and our focus is on some of the generic strategies for the improvement of the adaptability of automation solutions. Especially, the concept of modularization is adopted in developing hardware and software to maximize system adaptability in testing a wide scope of products. A proposed system has been implemented, and the system performances have been evaluated by executing tests on actual products. The testing experiments have shown that the automated system outperformed manual operations greatly in terms of cost-saving, productivity and reliability. Due to the potential of increasing system adaptability and cost reduction, the presented work has its theoretical and practical significance for an extension for other automation solutions in SMEs.

  18. Dense Fe cluster-assembled films by energetic cluster deposition

    International Nuclear Information System (INIS)

    Peng, D.L.; Yamada, H.; Hihara, T.; Uchida, T.; Sumiyama, K.

    2004-01-01

    High-density Fe cluster-assembled films were produced at room temperature by an energetic cluster deposition. Though cluster-assemblies are usually sooty and porous, the present Fe cluster-assembled films are lustrous and dense, revealing a soft magnetic behavior. Size-monodispersed Fe clusters with the mean cluster size d=9 nm were synthesized using a plasma-gas-condensation technique. Ionized clusters are accelerated electrically and deposited onto the substrate together with neutral clusters from the same cluster source. Packing fraction and saturation magnetic flux density increase rapidly and magnetic coercivity decreases remarkably with increasing acceleration voltage. The Fe cluster-assembled film obtained at the acceleration voltage of -20 kV has a packing fraction of 0.86±0.03, saturation magnetic flux density of 1.78±0.05 Wb/m², and coercivity value smaller than 80 A/m. The resistivity at room temperature is ten times larger than that of bulk Fe metal

  19. Clustering-based Feature Learning on Variable Stars

    Science.gov (United States)

    Mackenzie, Cristóbal; Pichara, Karim; Protopapas, Pavlos

    2016-04-01

    The success of automatic classification of variable stars depends strongly on the lightcurve representation. Usually, lightcurves are represented as a vector of many descriptors designed by astronomers called features. These descriptors are expensive in terms of computing, require substantial research effort to develop, and do not guarantee a good classification. Today, lightcurve representation is not entirely automatic; algorithms must be designed and manually tuned up for every survey. The amounts of data that will be generated in the future mean astronomers must develop scalable and automated analysis pipelines. In this work we present a feature learning algorithm designed for variable objects. Our method works by extracting a large number of lightcurve subsequences from a given set, which are then clustered to find common local patterns in the time series. Representatives of these common patterns are then used to transform lightcurves of a labeled set into a new representation that can be used to train a classifier. The proposed algorithm learns the features from both labeled and unlabeled lightcurves, overcoming the bias using only labeled data. We test our method on data sets from the Massive Compact Halo Object survey and the Optical Gravitational Lensing Experiment; the results show that our classification performance is as good as and in some cases better than the performance achieved using traditional statistical features, while the computational cost is significantly lower. With these promising results, we believe that our method constitutes a significant step toward the automation of the lightcurve classification pipeline.
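
The pipeline this abstract describes, extracting many lightcurve subsequences, clustering them to find common local patterns, and re-encoding each lightcurve as its distribution over the learned patterns, can be sketched as follows. The window length, cluster count, synthetic curves, and helper names are our choices for illustration, not the paper's.

```python
import numpy as np

def subsequences(x, w):
    """All length-w sliding windows of a lightcurve, each zero-mean."""
    s = np.lib.stride_tricks.sliding_window_view(x, w)
    return s - s.mean(axis=1, keepdims=True)

def learn_patterns(curves, w=8, k=4, iters=25, seed=0):
    """Cluster the pooled subsequences; the centers are the common
    local patterns learned from labeled and unlabeled curves alike."""
    rng = np.random.default_rng(seed)
    pool = np.vstack([subsequences(c, w) for c in curves])
    centers = pool[rng.choice(len(pool), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((pool[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(lab == j):
                centers[j] = pool[lab == j].mean(axis=0)
    return centers

def represent(curve, centers, w=8):
    """New representation: histogram of nearest learned patterns,
    usable as a feature vector for a downstream classifier."""
    s = subsequences(curve, w)
    lab = np.argmin(((s[:, None] - centers) ** 2).sum(-1), axis=1)
    return np.bincount(lab, minlength=len(centers)) / len(lab)
```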

  20. CLUSTERING-BASED FEATURE LEARNING ON VARIABLE STARS

    International Nuclear Information System (INIS)

    Mackenzie, Cristóbal; Pichara, Karim; Protopapas, Pavlos

    2016-01-01

    The success of automatic classification of variable stars depends strongly on the lightcurve representation. Usually, lightcurves are represented as a vector of many descriptors designed by astronomers called features. These descriptors are expensive in terms of computing, require substantial research effort to develop, and do not guarantee a good classification. Today, lightcurve representation is not entirely automatic; algorithms must be designed and manually tuned up for every survey. The amounts of data that will be generated in the future mean astronomers must develop scalable and automated analysis pipelines. In this work we present a feature learning algorithm designed for variable objects. Our method works by extracting a large number of lightcurve subsequences from a given set, which are then clustered to find common local patterns in the time series. Representatives of these common patterns are then used to transform lightcurves of a labeled set into a new representation that can be used to train a classifier. The proposed algorithm learns the features from both labeled and unlabeled lightcurves, overcoming the bias using only labeled data. We test our method on data sets from the Massive Compact Halo Object survey and the Optical Gravitational Lensing Experiment; the results show that our classification performance is as good as and in some cases better than the performance achieved using traditional statistical features, while the computational cost is significantly lower. With these promising results, we believe that our method constitutes a significant step toward the automation of the lightcurve classification pipeline

  1. CLUSTERING-BASED FEATURE LEARNING ON VARIABLE STARS

    Energy Technology Data Exchange (ETDEWEB)

    Mackenzie, Cristóbal; Pichara, Karim [Computer Science Department, Pontificia Universidad Católica de Chile, Santiago (Chile); Protopapas, Pavlos [Institute for Applied Computational Science, Harvard University, Cambridge, MA (United States)

    2016-04-01

    The success of automatic classification of variable stars depends strongly on the lightcurve representation. Usually, lightcurves are represented as a vector of many descriptors designed by astronomers called features. These descriptors are expensive in terms of computing, require substantial research effort to develop, and do not guarantee a good classification. Today, lightcurve representation is not entirely automatic; algorithms must be designed and manually tuned up for every survey. The amounts of data that will be generated in the future mean astronomers must develop scalable and automated analysis pipelines. In this work we present a feature learning algorithm designed for variable objects. Our method works by extracting a large number of lightcurve subsequences from a given set, which are then clustered to find common local patterns in the time series. Representatives of these common patterns are then used to transform lightcurves of a labeled set into a new representation that can be used to train a classifier. The proposed algorithm learns the features from both labeled and unlabeled lightcurves, overcoming the bias using only labeled data. We test our method on data sets from the Massive Compact Halo Object survey and the Optical Gravitational Lensing Experiment; the results show that our classification performance is as good as and in some cases better than the performance achieved using traditional statistical features, while the computational cost is significantly lower. With these promising results, we believe that our method constitutes a significant step toward the automation of the lightcurve classification pipeline.

  2. Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.

    Science.gov (United States)

    Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P

    2016-04-01

    Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.

  3. Cluster Physics with Merging Galaxy Clusters

    Directory of Open Access Journals (Sweden)

    Sandor M. Molnar

    2016-02-01

    Full Text Available Collisions between galaxy clusters provide a unique opportunity to study matter in a parameter space which cannot be explored in our laboratories on Earth. In the standard LCDM model, where the total density is dominated by the cosmological constant ($\Lambda$) and the matter density by cold dark matter (CDM), structure formation is hierarchical, and clusters grow mostly by merging. Mergers of two massive clusters are the most energetic events in the universe after the Big Bang, hence they provide a unique laboratory to study cluster physics. The two main mass components in clusters behave differently during collisions: the dark matter is nearly collisionless, responding only to gravity, while the gas is subject to pressure forces and dissipation, and shocks and turbulence are developed during collisions. In the present contribution we review the different methods used to derive the physical properties of merging clusters. Different physical processes leave their signatures on different wavelengths, thus our review is based on a multifrequency analysis. In principle, the best way to analyze multifrequency observations of merging clusters is to model them using N-body/HYDRO numerical simulations. We discuss the results of such detailed analyses. New high spatial and spectral resolution ground and space based telescopes will come online in the near future. Motivated by these new opportunities, we briefly discuss methods which will be feasible in the near future in studying merging clusters.

  4. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities
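
As a toy illustration of the idea, automation takes certain cognitive activities away from the operator and thereby lowers the overall human failure probability, one might write the following. The activity list, the error-probability numbers, and the independence assumption are all invented for the example; this is not the authors' estimation method.

```python
# Illustrative human-error probabilities per cognitive activity
# (made-up numbers, not from the paper).
HEPS = {"detection": 1e-2, "diagnosis": 5e-3, "decision": 2e-3, "action": 1e-3}

def human_failure_probability(automated):
    """Probability that at least one activity left to the human fails,
    assuming independent activities (a toy model)."""
    p_ok = 1.0
    for activity, hep in HEPS.items():
        if activity not in automated:
            p_ok *= (1.0 - hep)
    return 1.0 - p_ok
```

Under this toy model, each added level of automation strictly reduces the computed human failure probability, which is the monotonic effect the estimation aims to quantify.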

  5. The Distribution and Ages of Star Clusters in the Small Magellanic Cloud: Constraints on the Interaction History of the Magellanic Clouds

    Science.gov (United States)

    Bitsakis, Theodoros; González-Lópezlira, R. A.; Bonfini, P.; Bruzual, G.; Maravelias, G.; Zaritsky, D.; Charlot, S.; Ramírez-Siordia, V. H.

    2018-02-01

    We present a new study of the spatial distribution and ages of the star clusters in the Small Magellanic Cloud (SMC). To detect and estimate the ages of the star clusters we rely on the new fully automated method developed by Bitsakis et al. Our code detects 1319 star clusters in the central 18 deg2 of the SMC we surveyed (1108 of which have never been reported before). The age distribution of those clusters suggests enhanced cluster formation around 240 Myr ago. It also implies significant differences in the cluster distribution of the bar with respect to the rest of the galaxy, with the younger clusters being predominantly located in the bar. Having used the same setup, and data from the same surveys as for our previous study of the LMC, we are able to robustly compare the cluster properties between the two galaxies. Our results suggest that the bulk of the clusters in both galaxies were formed approximately 300 Myr ago, probably during a direct collision between the two galaxies. On the other hand, the locations of the young (≤50 Myr) clusters in both Magellanic Clouds, found where their bars join the H I arms, suggest that cluster formation in those regions is a result of internal dynamical processes. Finally, we discuss the potential causes of the apparent outside-in quenching of cluster formation that we observe in the SMC. Our findings are consistent with an evolutionary scheme where the interactions between the Magellanic Clouds constitute the major mechanism driving their overall evolution.

  6. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.

  7. Ask the experts: automation: part I.

    Science.gov (United States)

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  8. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  9. System for Automated Calibration of Vector Modulators

    Science.gov (United States)

    Lux, James; Boas, Amy; Li, Samuel

    2009-01-01

    Vector modulators are used to impose baseband modulation on RF signals, but non-ideal behavior limits the overall performance. The non-ideal behavior of the vector modulator is compensated using data collected with the use of an automated test system driven by a LabVIEW program that systematically applies thousands of control-signal values to the device under test and collects RF measurement data. The technology innovation automates several steps in the process. First, an automated test system, using computer-controlled digital-to-analog converters (DACs) and a computer-controlled vector network analyzer (VNA), can systematically apply different I and Q signals (which represent the complex number by which the RF signal is multiplied) to the vector modulator under test (VMUT), while measuring the RF performance, specifically gain and phase. The automated test system uses the LabVIEW software to control the test equipment, collect the data, and write it to a file. The input to the LabVIEW program is either user input for systematic variation, or is provided in a file containing specific test values that should be fed to the VMUT. The output file contains both the control signals and the measured data. The second step is to post-process the file to determine the correction functions as needed. The result of the entire process is a tabular representation, which allows translation of a desired I/Q value to the required analog control signals to produce a particular RF behavior. In some applications, corrected performance is needed only for a limited range. If the vector modulator is being used as a phase shifter, there is only a need to correct I and Q values that represent points on a circle, not the entire plane. This innovation has been used to calibrate 2-GHz MMIC (monolithic microwave integrated circuit) vector modulators in the High EIRP Cluster Array project (EIRP is high effective isotropic radiated power). These calibrations were then used to create
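
The end product described, a table translating a desired I/Q value into the control signals that actually produce it, can be emulated with a nearest-neighbor inverse lookup over a measured sweep. The modulator model below is a made-up linear impairment (gain/phase error plus offset); the real device behavior, table format, and LabVIEW driver are not given in the abstract, so everything here is illustrative.

```python
import numpy as np

def calibrate(controls, modulator):
    """Sweep the control grid once and record the measured complex
    response for each (I, Q) control pair (the data-collection step)."""
    measured = np.array([modulator(c) for c in controls])
    return np.asarray(controls), measured

def correct(desired, table):
    """Inverse lookup: return the control pair whose measured response
    is nearest the desired complex value (the correction step)."""
    controls, measured = table
    return controls[int(np.argmin(np.abs(measured - desired)))]
```

A denser control grid tightens the worst-case residual error of the lookup; interpolation between neighboring table entries would tighten it further.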

  10. Demands on digital automation; Anforderungen an die Digitale Automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

    In chapter 12 of the anthology about building control the demands on digital automation are presented. The following aspects are discussed: variety of the companies' philosophies, demands of the customer/investor, demands of the use of buildings/rooms, the user, and the point of view of manufacturers of technical plants. (BWI) [Deutsch] Kapitel 12 des Sammelbandes ueber Building Control stellt die Anforderungen an die Digitale Automation vor. In diesem Zusammenhang wird auf folgende Themenbereiche eingegangen: Spektrum der Firmenphilosophien, Forderungen der Auftraggeber/Investoren, der Gebaeude-/Raumnutzung, der Betreiber sowie Sicht der Ersteller betriebstechnischer Anlagen. (BWI)

  11. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  12. Smoking and polymorphisms of genes encoding mannose-binding lectin and surfactant protein-D in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Kristiansen, Malthe; Frisch, Morten; Madsen, Hans Ole

    2014-01-01

    genotype at codon 11, and HLA-shared epitope were determined in 456 patients with rheumatoid arthritis and 533 sex- and age-matched controls. Patients were grouped according to the presence of ACPA antibodies and RA-associated bone erosions and sub-stratified according to smoking status as never or ever...... smokers. Odds ratios with 95% confidence interval (OR, 95% CI) were calculated using multiple logistic regression analyses controlling for shared epitope. The low-producing SFTPD genotype was not associated with risk of RA or ACPA positive RA, but with erosive disease in the RA patients (OR = 1.8; 95% CI...... 1.1-3.0) particularly in RA ever smokers (OR = 2.4; 95% CI 1.3-4.3). The high-producing MBL2 genotype YA/YA was associated with ACPA positive RA (OR = 1.4; 95% CI 1.0-1.9) and erosive joint disease in RA ever smokers (OR = 1.8; 95% CI 1.1-3.0). Genetic disposition for low SP-D was not associated...

  13. Influence of 4,4’-azobis (4-cyanopentanoic acid in Transmission and Reflection Gratings Stored in a PVA/AA Photopolymer

    Directory of Open Access Journals (Sweden)

    Elena Fernandez

    2016-03-01

    Full Text Available Holographic transmission gratings with a spatial frequency of 2658 lines/mm and reflection gratings with a spatial frequency of 4553 lines/mm were stored in a polyvinyl alcohol (PVA)/acrylamide (AA) based photopolymer. This material can reach diffraction efficiencies close to 100% for spatial frequencies of about 1000 lines/mm. However, for higher spatial frequencies, the diffraction efficiency decreases considerably as the spatial frequency increases. To enhance the material response at high spatial frequencies, a chain transfer agent, 4,4'-azobis(4-cyanopentanoic acid) (ACPA), is added to the composition of the material. Different concentrations of ACPA are incorporated into the main composition of the photopolymer to find the concentration value that provides the highest diffraction efficiency. Moreover, the refractive index modulation and the optical thickness of the transmission and reflection gratings were obtained, evaluated and compared to provide more information about the influence of the ACPA on them.

  14. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    Science.gov (United States)

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient, sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPPs). Automation performs tasks such as assessing the status of the plant's operations as well as making real-time, life-critical, situation-specific decisions. While the advantages and disadvantages of automation are well studied in a variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed to prevent these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPPs in order to determine where better use of automation could have resulted in a more desirable outcome.

  15. Are clusters of dietary patterns and cluster membership stable over time? Results of a longitudinal cluster analysis study.

    Science.gov (United States)

    Walthouwer, Michel Jean Louis; Oenema, Anke; Soetens, Katja; Lechner, Lilian; de Vries, Hein

    2014-11-01

    Developing nutrition education interventions based on clusters of dietary patterns can only be done adequately when it is clear if distinctive clusters of dietary patterns can be derived and reproduced over time, if cluster membership is stable, and if it is predictable which type of people belong to a certain cluster. Hence, this study aimed to: (1) identify clusters of dietary patterns among Dutch adults, (2) test the reproducibility of these clusters and stability of cluster membership over time, and (3) identify sociodemographic predictors of cluster membership and cluster transition. This study had a longitudinal design with online measurements at baseline (N=483) and 6 months follow-up (N=379). Dietary intake was assessed with a validated food frequency questionnaire. A hierarchical cluster analysis was performed, followed by a K-means cluster analysis. Multinomial logistic regression analyses were conducted to identify the sociodemographic predictors of cluster membership and cluster transition. At baseline and follow-up, a comparable three-cluster solution was derived, distinguishing a healthy, moderately healthy, and unhealthy dietary pattern. Male and lower educated participants were significantly more likely to have a less healthy dietary pattern. Further, 251 (66.2%) participants remained in the same cluster, 45 (11.9%) participants changed to an unhealthier cluster, and 83 (21.9%) participants shifted to a healthier cluster. Men and people living alone were significantly more likely to shift toward a less healthy dietary pattern. Distinctive clusters of dietary patterns can be derived. Yet, cluster membership is unstable and only few sociodemographic factors were associated with cluster membership and cluster transition. These findings imply that clusters based on dietary intake may not be suitable as a basis for nutrition education interventions. Copyright © 2014 Elsevier Ltd. All rights reserved.
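
    The two-stage procedure described above (a hierarchical cluster analysis followed by a K-means cluster analysis) can be sketched as follows. This is a minimal illustration on synthetic data, not the study's code: the average-linkage seeding, the toy vectors and all names are assumptions made for the example.

```python
import numpy as np

def agglomerative_seeds(X, k):
    # Average-linkage agglomerative clustering down to k groups; the
    # group means then seed K-means, mirroring the two-stage design.
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > k:
        best, pair = np.inf, None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.mean([np.linalg.norm(X[i] - X[j])
                             for i in clusters[a] for j in clusters[b]])
                if d < best:
                    best, pair = d, (a, b)
        a, b = pair
        clusters[a] += clusters.pop(b)  # merge the closest pair
    return np.array([X[c].mean(axis=0) for c in clusters])

def kmeans(X, seeds, iters=50):
    # Standard K-means refinement starting from the hierarchical seeds.
    centroids = seeds.copy()
    for _ in range(iters):
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)
        for j in range(len(centroids)):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids
```

    On food-frequency-style vectors, the returned labels would play the role of the three dietary-pattern clusters; cluster transition over time is then simply a comparison of each participant's label at baseline and at follow-up.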

  16. Automated synovium segmentation in doppler ultrasound images for rheumatoid arthritis assessment

    Science.gov (United States)

    Yeung, Pak-Hei; Tan, York-Kiat; Xu, Shuoyu

    2018-02-01

    We need better clinical tools to improve the monitoring of synovitis (synovial inflammation in the joints) in rheumatoid arthritis (RA) assessment. Given its economical, safe and fast characteristics, ultrasound (US), especially Doppler ultrasound, is frequently used. However, manual scoring of synovitis in US images is subjective and prone to observer variation. In this study, we propose a new and robust method for automated synovium segmentation in the commonly affected joints, i.e. the metacarpophalangeal (MCP) and metatarsophalangeal (MTP) joints, which would facilitate automation in quantitative RA assessment. The bone contour in the US image is first detected with a modified dynamic programming method, incorporating angular information to detect curved bone surfaces and using image fuzzification to identify missing bone structure. K-means clustering is then performed to initialize potential synovium areas, utilizing the identified bone contour as a boundary reference. After excluding invalid candidate regions, the final segmented synovium is identified by reconnecting the remaining candidate regions using level set evolution. 15 MCP and 15 MTP US images were analyzed in this study. For each image, segmentations by our proposed method as well as two sets of annotations performed by an experienced clinician at different time points were acquired. Dice's coefficient is 0.77 ± 0.12 between the two sets of annotations. Similar Dice's coefficients are achieved between the automated segmentation and either the first set of annotations (0.76 ± 0.12) or the second set (0.75 ± 0.11), with no significant difference (P = 0.77). These results verify that the accuracy of segmentation by our proposed method is comparable to that of the clinician. Therefore, reliable synovium identification can be made by our proposed method.
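
    Agreement between segmentations in this record is reported as Dice's coefficient. A small sketch of that metric on binary masks follows; the example masks are hypothetical, not data from the study:

```python
import numpy as np

def dice(mask_a, mask_b):
    # Dice's coefficient between two binary segmentation masks:
    # 2*|A intersect B| / (|A| + |B|); 1.0 means perfect overlap.
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom
```

    A value of 0.77 between two annotation rounds, as reported above, thus indicates substantial but imperfect overlap, while identical masks give 1.0.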

  17. Revealing the ecological content of long-duration audio-recordings of the environment through clustering and visualisation.

    Science.gov (United States)

    Phillips, Yvonne F; Towsey, Michael; Roe, Paul

    2018-01-01

    Audio recordings of the environment are an increasingly important technique to monitor biodiversity and ecosystem function. While the acquisition of long-duration recordings is becoming easier and cheaper, the analysis and interpretation of that audio remains a significant research area. The issue addressed in this paper is the automated reduction of environmental audio data to facilitate ecological investigations. We describe a method that first reduces environmental audio to vectors of acoustic indices, which are then clustered. This can reduce the audio data by six to eight orders of magnitude yet retain useful ecological information. We describe techniques to visualise sequences of cluster occurrence (using, for example, diel plots and rose plots) that assist interpretation of environmental audio. Colour-coding acoustic clusters allows months and years of audio data to be visualised in a single image. These techniques are useful in identifying and indexing the contents of long-duration audio recordings. They could also play an important role in monitoring long-term changes in species abundance brought about by habitat degradation and/or restoration.
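
    The diel plots mentioned above start from a simple summary: how often each acoustic cluster occurs in each hour of the day. A minimal sketch, with hypothetical cluster labels standing in for the clustered acoustic-index vectors:

```python
from collections import Counter

def diel_counts(minute_labels, minutes_per_hour=60):
    # Given one cluster label per recorded minute (a full day = 1440
    # labels), count how often each cluster occurs in each hour of the
    # day -- the raw data behind a diel plot of cluster occurrence.
    counts = {h: Counter() for h in range(24)}
    for minute, label in enumerate(minute_labels):
        counts[(minute // minutes_per_hour) % 24][label] += 1
    return counts
```

    Colour-coding each label and stacking these hourly counts by day is then enough to render months of audio as a single image.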

  18. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  19. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  20. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  1. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  2. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio with a definite efficiency. (orig.) [de

  3. Clusters and how to make it work : Cluster Strategy Toolkit

    NARCIS (Netherlands)

    Manickam, Anu; van Berkel, Karel

    2014-01-01

    Clusters are the magic answer to regional economic development. Firms in clusters are more innovative; cluster policy dominates EU policy; ‘top-sectors’ and excellence are the choice of national policy makers; clusters are ‘in’. But clusters are complex, clusters are ‘messy’; there is no clear

  4. Cluster dynamics at different cluster size and incident laser wavelengths

    International Nuclear Information System (INIS)

    Desai, Tara; Bernardinello, Andrea

    2002-01-01

    X-ray emission spectra from aluminum clusters of diameter ∼0.4 μm and gold clusters of diameter ∼1.25 μm are experimentally studied by irradiating cluster foil targets with a 1.06 μm laser, 10 ns (FWHM), at an intensity of ∼10¹² W/cm². Aluminum clusters show different spectra compared to the bulk material, whereas gold clusters evolve towards bulk gold. Experimental data are analyzed on the basis of cluster dimension, laser wavelength and pulse duration. PIC simulations are performed to study the behavior of the clusters at higher intensities, I ≥ 10¹⁷ W/cm², for different cluster sizes irradiated at different laser wavelengths. Results indicate the dependence of cluster dynamics on cluster size and incident laser wavelength

  5. Automated recognition of cell phenotypes in histology images based on membrane- and nuclei-targeting biomarkers

    International Nuclear Information System (INIS)

    Karaçalı, Bilge; Vamvakidou, Alexandra P; Tözeren, Aydın

    2007-01-01

    Three-dimensional in vitro cultures of cancer cells are used to predict the effects of prospective anti-cancer drugs in vivo. In this study, we present an automated image analysis protocol for detailed morphological protein marker profiling of tumoroid cross-section images. Histologic cross sections of breast tumoroids developed in co-culture suspensions of breast cancer cell lines, stained for E-cadherin (Ecad) and progesterone receptor (PR), were digitized, and pixels in these images were classified into five categories using k-means clustering. Automated segmentation was used to identify image regions composed of cells expressing a given biomarker. Synthesized images were created to check the accuracy of the image processing system. Accuracy of automated segmentation was over 95% in identifying regions of interest in synthesized images. Image analysis of adjacent histology slides stained, respectively, for Ecad and PR accurately predicted regions of different cell phenotypes. Image analysis of tumoroid cross sections from different tumoroids obtained under the same co-culture conditions indicated the variation of cellular composition from one tumoroid to another. Variations in the compositions of cross sections obtained from the same tumoroid were established by parallel analysis of Ecad- and PR-stained cross-section images. The proposed image analysis methods offer standardized high-throughput profiling of the molecular anatomy of tumoroids based on both membrane and nuclei markers, suitable for rapid large-scale investigations of anti-cancer compounds for drug development

  6. Future Computer, Communication, Control and Automation

    CERN Document Server

    2011 International Conference on Computer, Communication, Control and Automation

    2012-01-01

    The volume includes a set of selected papers extended and revised from the 2011 International Conference on Computer, Communication, Control and Automation (3CA 2011), held in Zhuhai, China, November 19-20, 2011. Topics covered in this volume include wireless communications, advances in wireless video, wireless sensor networking, security in wireless networks, network measurement and management, hybrid and discrete-event systems, internet analytics and automation, robotic systems and applications, reconfigurable automation systems, and machine vision in automation. We hope that researchers, graduate students and other interested readers benefit scientifically from the proceedings and also find it stimulating in the process.

  7. "First generation" automated DNA sequencing technology.

    Science.gov (United States)

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  8. Text Clustering Algorithm Based on Random Cluster Core

    Directory of Open Access Journals (Sweden)

    Huang Long-Jun

    2016-01-01

    Nowadays clustering has become a popular text mining algorithm, but huge volumes of data place higher requirements on the accuracy and performance of text mining. In view of the performance bottleneck of traditional text clustering algorithms, this paper proposes a text clustering algorithm with random features. It is a clustering algorithm based on text density that, by using neighboring heuristic rules, introduces the concept of a random cluster core, which effectively reduces the complexity of the distance calculation.
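
    The abstract gives few algorithmic details, so the following is only a sketch of the general idea of randomly chosen cluster cores with neighbourhood assignment, written as a leader-style single pass; the radius parameter and the random visiting order are assumptions for illustration, not the paper's algorithm:

```python
import random

def random_core_clusters(points, radius, dist, seed=0):
    # Sketch of leader-style clustering with randomly chosen cores:
    # each point, visited in random order, either joins an existing
    # core within `radius` or becomes a new core itself. Distances are
    # computed only against cores, never between all point pairs.
    rng = random.Random(seed)
    order = list(range(len(points)))
    rng.shuffle(order)  # random visiting order -> random cores
    cores, assignment = [], {}
    for i in order:
        for c in cores:
            if dist(points[i], points[c]) <= radius:
                assignment[i] = c
                break
        else:
            cores.append(i)
            assignment[i] = i
    return assignment, cores
```

    Because each point is compared only against the current cores, the number of distance computations stays far below the all-pairs count that a naive density-based pass would need.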

  9. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  10. GUI test automation for Qt application

    OpenAIRE

    Wang, Lei

    2015-01-01

    GUI test automation is a popular and interesting subject in the testing industry. Many companies plan to start test automation projects in order to implement efficient, less expensive software testing. However, there are challenges for the testing team who lack experience performing GUI tests automation. Many GUI test automation projects have ended in failure due to mistakes made during the early stages of the project. The major work of this thesis is to find a solution to the challenges of e...

  11. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on technological development will also be considered

  12. 76 FR 69755 - National Customs Automation Program Test Concerning Automated Commercial Environment (ACE...

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF HOMELAND SECURITY U.S. Customs and Border Protection National Customs Automation... announces U.S. Customs and Border Protection's (CBP's) plan to conduct a National Customs Automation Program... conveyance transporting the cargo to the United States. This data will fulfill merchandise entry requirements...

  13. On clusters and clustering from atoms to fractals

    CERN Document Server

    Reynolds, PJ

    1993-01-01

    This book attempts to answer why there is so much interest in clusters. Clusters occur on all length scales, and as a result occur in a variety of fields. Clusters are interesting scientifically, but they also have important consequences technologically. The division of the book into three parts roughly separates the field into small, intermediate, and large-scale clusters. Small clusters are the regime of atomic and molecular physics and chemistry. The intermediate regime is the transitional regime, with its characteristics including the onset of bulk-like behavior, growth and aggregation, a

  14. I trust it, but I don't know why: effects of implicit attitudes toward automation on trust in an automated system.

    Science.gov (United States)

    Merritt, Stephanie M; Heimbaugh, Heather; LaChapell, Jennifer; Lee, Deborah

    2013-06-01

    This study is the first to examine the influence of implicit attitudes toward automation on users' trust in automation. Past empirical work has examined explicit (conscious) influences on user level of trust in automation but has not yet measured implicit influences. We examine concurrent effects of explicit propensity to trust machines and implicit attitudes toward automation on trust in an automated system. We examine differential impacts of each under varying automation performance conditions (clearly good, ambiguous, clearly poor). Participants completed both a self-report measure of propensity to trust and an Implicit Association Test measuring implicit attitude toward automation, then performed an X-ray screening task. Automation performance was manipulated within-subjects by varying the number and obviousness of errors. Explicit propensity to trust and implicit attitude toward automation did not significantly correlate. When the automation's performance was ambiguous, implicit attitude significantly affected automation trust, and its relationship with propensity to trust was additive: Increments in either were related to increases in trust. When errors were obvious, a significant interaction between the implicit and explicit measures was found, with those high in both having higher trust. Implicit attitudes have important implications for automation trust. Users may not be able to accurately report why they experience a given level of trust. To understand why users trust or fail to trust automation, measurements of implicit and explicit predictors may be necessary. Furthermore, implicit attitude toward automation might be used as a lever to effectively calibrate trust.

  15. Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  16. GibbsCluster: unsupervised clustering and alignment of peptide sequences

    DEFF Research Database (Denmark)

    Andreatta, Massimo; Alvarez, Bruno; Nielsen, Morten

    2017-01-01

    motif characterizing each cluster. Several parameters are available to customize the cluster analysis, including adjustable penalties for small clusters and overlapping groups, and a trash cluster to remove outliers. As an example application, we used the server to deconvolute multiple specificities in large-scale peptidome data generated by mass spectrometry. The server is available at http://www.cbs.dtu.dk/services/GibbsCluster-2.0.

  17. Evaluation of an Automated Keywording System.

    Science.gov (United States)

    Malone, Linda C.; And Others

    1990-01-01

    Discussion of automated indexing techniques focuses on ways to statistically document improvements in the development of an automated keywording system over time. The system developed by the Joint Chiefs of Staff to automate the storage, categorization, and retrieval of information from military exercises is explained, and performance measures are…

  18. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation (Volume 2), includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. Based on the classification of the manuscripts considered, this volume can be divided into six sessions: Mathematical Modeling; Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; Automation and Mechatronics.

  19. Automation for a base station stability testing

    OpenAIRE

    Punnek, Elvis

    2016-01-01

    This Bachelor's thesis was commissioned by Oy LM Ericsson Ab Oulu. Its aim was to help investigate and create a test automation solution for the stability testing of the LTE base station. The main objective was to create test automation for a predefined test set. This test automation solution had to be created for specific environments and equipment. The work included creating the automation for the test cases and adding them to daily test automation jobs. The key factor...

  20. Diametrical clustering for identifying anti-correlated gene clusters.

    Science.gov (United States)

    Dhillon, Inderjit S; Marcotte, Edward M; Roshan, Usman

    2003-09-01

    Clustering genes based upon their expression patterns allows us to predict gene function. Most existing clustering algorithms cluster genes together when their expression patterns show high positive correlation. However, it has been observed that genes whose expression patterns are strongly anti-correlated can also be functionally similar. Biologically, this is not unintuitive: genes responding to the same stimuli, regardless of the nature of the response, are more likely to operate in the same pathways. We present a new diametrical clustering algorithm that explicitly identifies anti-correlated clusters of genes. Our algorithm proceeds by iteratively (i) re-partitioning the genes and (ii) computing the dominant singular vector of each gene cluster, each singular vector serving as the prototype of a 'diametric' cluster. We empirically show the effectiveness of the algorithm in identifying diametrical or anti-correlated clusters. Testing the algorithm on yeast cell cycle data, fibroblast gene expression data, and DNA microarray data from yeast mutants reveals that opposed cellular pathways can be discovered with this method. We present systems whose mRNA expression patterns, and likely their functions, oppose the yeast ribosome and proteasome, along with evidence for the inverse transcriptional regulation of a number of cellular systems.
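
    The iteration described above, (i) re-partitioning the genes and (ii) taking the dominant singular vector of each cluster as its prototype, can be sketched directly; assigning each profile by its squared inner product with the prototypes is what lets anti-correlated profiles share a cluster. A minimal sketch on synthetic profiles, not the authors' code:

```python
import numpy as np

def diametrical_clustering(X, k, iters=20, seed=0):
    # X: rows are expression profiles (normalized to unit length here).
    # Each cluster prototype is the dominant right singular vector of
    # its member rows; profiles join the prototype with the largest
    # squared inner product, so strongly anti-correlated profiles land
    # in the same 'diametric' cluster.
    rng = np.random.default_rng(seed)
    X = X / np.linalg.norm(X, axis=1, keepdims=True)
    labels = rng.integers(0, k, size=len(X))
    for _ in range(iters):
        protos = []
        for j in range(k):
            members = X[labels == j]
            if len(members) == 0:  # reseed an empty cluster
                members = X[rng.integers(0, len(X), size=1)]
            # dominant right singular vector of the member matrix
            protos.append(np.linalg.svd(members)[2][0])
        P = np.array(protos)
        labels = np.argmax((X @ P.T) ** 2, axis=1)
    return labels, P
```

    Squaring the projection is the key design choice: a profile and its negation score identically against every prototype, which ordinary correlation-based K-means would instead push into different clusters.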

  1. Partitional clustering algorithms

    CERN Document Server

    2015-01-01

    This book summarizes the state-of-the-art in partitional clustering. Clustering, the unsupervised classification of patterns into groups, is one of the most important tasks in exploratory data analysis. Primary goals of clustering include gaining insight into, classifying, and compressing data. Clustering has a long and rich history that spans a variety of scientific disciplines including anthropology, biology, medicine, psychology, statistics, mathematics, engineering, and computer science. As a result, numerous clustering algorithms have been proposed since the early 1950s. Among these algorithms, partitional (nonhierarchical) ones have found many applications, especially in engineering and computer science. This book provides coverage of consensus clustering, constrained clustering, large scale and/or high dimensional clustering, cluster validity, cluster visualization, and applications of clustering. Examines clustering as it applies to large and/or high-dimensional data sets commonly encountered in reali...

  2. SONG-China Project: A Global Automated Observation Network

    Science.gov (United States)

    Yang, Z. Z.; Lu, X. M.; Tian, J. F.; Zhuang, C. G.; Wang, K.; Deng, L. C.

    2017-09-01

    Driven by advancements in technology and scientific objectives, data acquisition in observational astronomy has changed greatly in recent years. Fully automated, or even autonomous, ground-based networks of telescopes have now become a trend for time-domain observational projects. The Stellar Observations Network Group (SONG) is an international collaboration with the participation and contribution of the Chinese astronomy community. The scientific goal of SONG is time-domain astrophysics such as asteroseismology and open cluster research. The SONG project aims to build a global network of 1 m telescopes equipped with high-precision, high-resolution spectrographs and two-channel lucky-imaging cameras. The Chinese initiative is to install a 50 cm binocular photometry telescope at each SONG node, sharing the network platform and infrastructure. This work focuses on the design and implementation, in technology and methodology, of SONG/50BiN, a typical ground-based network composed of multiple sites and a variety of instruments.

  3. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  4. MADIBA: A web server toolkit for biological interpretation of Plasmodium and plant gene clusters

    Directory of Open Access Journals (Sweden)

    Louw Abraham I

    2008-02-01

    Background: Microarray technology makes it possible to identify changes in gene expression of an organism under various conditions. Data mining is thus essential for deducing significant biological information such as the identification of new biological mechanisms or putative drug targets. While many algorithms and software packages have been developed for analysing gene expression, the extraction of relevant information from experimental data is still a substantial challenge, requiring significant time and skill. Description: MADIBA (MicroArray Data Interface for Biological Annotation) facilitates the assignment of biological meaning to gene expression clusters by automating the post-processing stage. A relational database has been designed to store the data from gene to pathway for Plasmodium, rice and Arabidopsis. Tools within the web interface allow rapid analyses for the identification of the Gene Ontology terms relevant to each cluster; visualising the metabolic pathways where the genes are implicated, their genomic localisations, putative common transcriptional regulatory elements in the upstream sequences, and an analysis specific to the organism being studied. Conclusion: MADIBA is an integrated, online tool that will assist researchers in interpreting their results and understanding the meaning of the co-expression of a cluster of genes. The functionality of MADIBA was validated by analysing a number of gene clusters from several published experiments: expression profiling of the Plasmodium life cycle, and salt stress treatments of Arabidopsis and rice. In most cases, the same conclusions found by the authors were quickly and easily obtained after analysing the gene clusters with MADIBA.
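
    Identifying the Gene Ontology terms relevant to a cluster, as MADIBA's tools do, typically reduces to an over-representation test. The sketch below uses the hypergeometric upper tail, a standard choice for this task that is assumed here rather than taken from the MADIBA paper:

```python
from math import comb

def enrichment_p(cluster_size, cluster_hits, genome_size, genome_hits):
    # Hypergeometric upper-tail p-value for term over-representation:
    # the probability of seeing >= cluster_hits annotated genes in a
    # random gene set of the same size drawn from the genome.
    total = comb(genome_size, cluster_size)
    p = 0.0
    for x in range(cluster_hits, min(cluster_size, genome_hits) + 1):
        p += comb(genome_hits, x) * comb(genome_size - genome_hits,
                                         cluster_size - x) / total
    return p
```

    In practice such p-values are computed per GO term and corrected for multiple testing before a term is reported as relevant to a cluster.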

  5. Application of the dynamically allocated virtual clustering management system to emulated tactical network experimentation

    Science.gov (United States)

    Marcus, Kelvin

    2014-06-01

    The U.S. Army Research Laboratory (ARL) has built a "Network Science Research Lab" to support research that aims to improve the ability to analyze, predict, design, and govern complex systems that interweave the social/cognitive, information, and communication network genres. Researchers at ARL and the Network Science Collaborative Technology Alliance (NS-CTA), a collaborative research alliance funded by ARL, conducted experimentation to determine whether automated network monitoring tools and task-aware agents deployed within an emulated tactical wireless network could increase the retrieval of relevant data from heterogeneous distributed information nodes. ARL and the NS-CTA required the capability to perform this experimentation over clusters of heterogeneous nodes with emulated wireless tactical networks, where each node could contain different operating systems, application sets, and physical hardware attributes. Researchers utilized the Dynamically Allocated Virtual Clustering Management System (DAVC) to address each of the infrastructure support requirements necessary to conduct their experimentation. The DAVC is an experimentation infrastructure that provides the means to dynamically create, deploy, and manage virtual clusters of heterogeneous nodes within a cloud computing environment based upon resource utilization such as CPU load, available RAM, and hard disk space. The DAVC uses 802.1Q Virtual LANs (VLANs) to prevent experimentation crosstalk and to allow for complex private networks. Clusters created by the DAVC system can be utilized for software development, experimentation, and integration with existing hardware and software. The goal of this paper is to explore how ARL and the NS-CTA leveraged the DAVC to create, deploy, and manage multiple experimentation clusters to support their experimentation goals.
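
    The DAVC places virtual nodes 'based upon resource utilization such as CPU load, available RAM and hard disk space'. A greedy placement step in that spirit might look like the sketch below; the field names and the most-free-RAM policy are hypothetical illustrations, not the DAVC's actual scheduler:

```python
def place_vm(hosts, cpu, ram_gb, disk_gb):
    # Greedy placement: among hosts that can satisfy every resource
    # request, pick the one with the most free RAM, then reserve the
    # requested resources on it. `hosts` maps name -> free resources.
    candidates = [
        (name, res) for name, res in hosts.items()
        if res["cpu_free"] >= cpu
        and res["ram_free_gb"] >= ram_gb
        and res["disk_free_gb"] >= disk_gb
    ]
    if not candidates:
        return None  # no host can take this virtual node
    name, res = max(candidates, key=lambda nr: nr[1]["ram_free_gb"])
    res["cpu_free"] -= cpu
    res["ram_free_gb"] -= ram_gb
    res["disk_free_gb"] -= disk_gb
    return name
```

    A real scheduler would also track CPU load over time and isolate each placed cluster on its own VLAN, as the DAVC does with 802.1Q.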

  6. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. During the past few decades automation systems have become more common and are used today in everything from large industrial solutions to the homes of private customers. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating etc. Auto...

  7. Guidelines for Automation Project Execution

    OpenAIRE

    Takkinen, Heidi

    2011-01-01

    The purpose of this Master’s thesis was to create instructions for executing an automation project. Sarlin Oy Ab needed directions on how to execute an automation project. Sarlin is starting up a new business area offering total project solutions for customers, focusing on small and minor automation projects in domestic markets. The thesis presents issues related to project execution, starting from project theory and proceeding to kick-off and termination. Site work is one importan...

  8. Organizational changes and automation: Towards a customer-oriented automation: Part 3

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1994-01-01

    Automation offers great opportunities in the efforts of energy utilities in the Netherlands to reorganize into more customer-oriented businesses. However, automation in itself is not enough. First, the organizational structure has to be changed considerably. Various energy utilities have already started on this. The restructuring principle is the same everywhere, but the way it is implemented differs widely. In this article attention is paid to the necessity of realizing an integrated computerized system, which, however, is not feasible at the moment. The second-best alternative is to use various computerized systems capable of two-way data exchange. Two viable approaches are discussed: (1) one operating system on which all automated systems within a company run, or (2) selective system linking on the basis of the required speed of information exchange. Option (2) offers more freedom in selecting systems. 2 figs

  9. You're a What? Automation Technician

    Science.gov (United States)

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  10. Does Automated Feedback Improve Writing Quality?

    Science.gov (United States)

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  11. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  12. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of stretched continuity defects in samples with a plane surface in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors, one with a built-in microcomputer and one in the form of a computer attachment, are described. Directions for further research and development are discussed. 35 refs., 6 figs

  13. Cluster Matters

    DEFF Research Database (Denmark)

    Gulati, Mukesh; Lund-Thomsen, Peter; Suresh, Sangeetha

    2018-01-01

    sell their products successfully in international markets, but there is also an increasingly large consumer base within India. Indeed, Indian industrial clusters have contributed to a substantial part of this growth process, and there are several hundred registered clusters within the country...... of this handbook, which focuses on the role of CSR in MSMEs. Hence we contribute to the literature on CSR in industrial clusters and specifically CSR in Indian industrial clusters by investigating the drivers of CSR in India’s industrial clusters....

  14. Weighted Clustering

    DEFF Research Database (Denmark)

    Ackerman, Margareta; Ben-David, Shai; Branzei, Simina

    2012-01-01

    We investigate a natural generalization of the classical clustering problem, considering clustering tasks in which different instances may have different weights.We conduct the first extensive theoretical analysis on the influence of weighted data on standard clustering algorithms in both...... the partitional and hierarchical settings, characterizing the conditions under which algorithms react to weights. Extending a recent framework for clustering algorithm selection, we propose intuitive properties that would allow users to choose between clustering algorithms in the weighted setting and classify...

  15. Towards automated diffraction tomography. Part II-Cell parameter determination

    International Nuclear Information System (INIS)

    Kolb, U.; Gorelik, T.; Otten, M.T.

    2008-01-01

    Automated diffraction tomography (ADT) allows the collection of three-dimensional (3d) diffraction data sets from crystals down to a size of only a few nanometres. Imaging is done in STEM mode, and diffraction data are collected with quasi-parallel-beam nanoelectron diffraction (NED). Here, we present a set of processing steps necessary for automatic unit-cell parameter determination from the collected 3d diffraction data. Cell parameter determination is done via extraction of peak positions from a recorded data set (called the data reduction path) followed by subsequent cluster analysis of difference vectors. The procedure of lattice parameter determination is presented in detail for a beam-sensitive organic material. Independently, we demonstrate a potential (called the full integration path) based on 3d reconstruction of the reciprocal space, visualising special structural features of materials such as partial disorder. Furthermore, we describe new features implemented into the acquisition part
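    The cell-parameter step (peak extraction followed by cluster analysis of difference vectors) can be sketched with synthetic data. This is not the ADT software; it is a numpy illustration with a hypothetical orthorhombic lattice, where deduplicating and sorting the difference vectors recovers the shortest cell axis:

```python
import numpy as np

# Synthetic 3D diffraction peaks on a known orthorhombic lattice (assumed axes).
basis = np.diag([1.0, 1.2, 0.8])
ijk = np.array([(i, j, k) for i in range(3) for j in range(3) for k in range(3)])
peaks = ijk @ basis                      # 27 reciprocal-lattice peak positions

# All pairwise difference vectors between extracted peak positions.
diffs = (peaks[:, None, :] - peaks[None, :, :]).reshape(-1, 3)

# Crude "cluster analysis": deduplicate by rounding, drop the zero vector,
# and take the shortest survivor as a candidate cell vector.
uniq = np.unique(np.round(diffs, 3), axis=0)
uniq = uniq[np.linalg.norm(uniq, axis=1) > 1e-6]
shortest = uniq[np.argsort(np.linalg.norm(uniq, axis=1))][0]
# The shortest difference vector has length 0.8: the shortest assumed cell axis.
```

    A real data set requires tolerance-based clustering rather than rounding, since measured peak positions carry noise, but the difference-vector idea is the same.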

  16. Automated spike sorting algorithm based on Laplacian eigenmaps and k-means clustering.

    Science.gov (United States)

    Chah, E; Hok, V; Della-Chiesa, A; Miller, J J H; O'Mara, S M; Reilly, R B

    2011-02-01

    This study presents a new automatic spike sorting method based on feature extraction by Laplacian eigenmaps combined with k-means clustering. The performance of the proposed method was compared against previously reported algorithms such as principal component analysis (PCA) and amplitude-based feature extraction. Two types of classifier (namely k-means and classification expectation-maximization) were incorporated within the spike sorting algorithms, in order to find a suitable classifier for the feature sets. Simulated data sets and in-vivo tetrode multichannel recordings were employed to assess the performance of the spike sorting algorithms. The results show that the proposed algorithm yields significantly improved performance, with a mean sorting accuracy of 73% and sorting error of 10%, compared to PCA, which combined with k-means had a sorting accuracy of 58% and sorting error of 10%. A correction was made to this article on 22 February 2011. The spacing of the title was amended on the abstract page. No changes were made to the article PDF and the print version was unaffected.

  17. Segmentation of Brain Lesions in MRI and CT Scan Images: A Hybrid Approach Using k-Means Clustering and Image Morphology

    Science.gov (United States)

    Agrawal, Ritu; Sharma, Manisha; Singh, Bikesh Kumar

    2018-04-01

    Manual segmentation and analysis of lesions in medical images is time consuming and subject to human error. Automated segmentation has thus gained significant attention in recent years. This article presents a hybrid approach for brain lesion segmentation in different imaging modalities, combining a median filter, k-means clustering, Sobel edge detection and morphological operations. The median filter is an essential pre-processing step and is used to remove impulsive noise from the acquired brain images, followed by k-means segmentation, Sobel edge detection and morphological processing. The performance of the proposed automated system is tested on standard datasets using performance measures such as segmentation accuracy and execution time. The proposed method achieves a high accuracy of 94% when compared with manual delineation performed by an expert radiologist. Furthermore, statistical significance tests between lesions segmented using the automated approach and expert delineation, using ANOVA and the correlation coefficient, achieved high significance values of 0.986 and 1 respectively. The experimental results obtained are discussed in light of some recently reported studies.
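    The hybrid pipeline (median filter, then k-means, then morphology and Sobel edges) can be sketched on a synthetic image. This is an illustrative reconstruction using `scipy.ndimage`, not the authors' code; the filter sizes and the one-dimensional two-class k-means are assumptions:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
img = 0.2 + 0.05 * rng.standard_normal((64, 64))
img[20:40, 25:45] = 0.9                                 # synthetic "lesion"

smooth = ndimage.median_filter(img, size=3)             # impulsive-noise removal

# 1-D k-means (k = 2) on intensities to split lesion from background.
c = np.array([smooth.min(), smooth.max()])
for _ in range(20):
    lab = np.abs(smooth[..., None] - c).argmin(-1)
    c = np.array([smooth[lab == k].mean() for k in range(2)])
mask = lab == np.argmax(c)                              # brighter cluster = lesion

mask = ndimage.binary_closing(ndimage.binary_opening(mask))  # morphology clean-up
edges = np.hypot(ndimage.sobel(mask.astype(float), 0),
                 ndimage.sobel(mask.astype(float), 1)) > 0   # lesion boundary
```

    On real MRI/CT data the clustering runs on multi-feature input and the morphology parameters matter much more, but the sequencing of the four stages is the same.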

  18. Support Policies in Clusters: Prioritization of Support Needs by Cluster Members According to Cluster Life Cycle

    Directory of Open Access Journals (Sweden)

    Gulcin Salıngan

    2012-07-01

    Full Text Available Economic development has always been a moving target. Both national and local governments face the challenge of implementing effective and efficient economic policies and programs in order to best utilize their limited resources. One of the recent approaches in this area is cluster-based economic analysis and strategy development. This study reviews key literature and some of the cluster-based economic policies adopted by different governments. Based on this review, it proposes “the cluster life cycle” as a determining factor in identifying the support requirements of clusters. A survey, designed on the basis of a literature review of international cluster support programs, was conducted with 30 participants from 3 clusters at different maturity stages. This paper discusses the results of this study, conducted among cluster members in the Eskişehir-Bilecik-Kütahya Region in Turkey, on the support required to foster the development of the related clusters.

  19. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  20. Automated Image Analysis of HER2 Fluorescence In Situ Hybridization to Refine Definitions of Genetic Heterogeneity in Breast Cancer Tissue.

    Science.gov (United States)

    Radziuviene, Gedmante; Rasmusson, Allan; Augulis, Renaldas; Lesciute-Krilaviciene, Daiva; Laurinaviciene, Aida; Clim, Eduard; Laurinavicius, Arvydas

    2017-01-01

    Human epidermal growth factor receptor 2 gene- (HER2-) targeted therapy for breast cancer relies primarily on HER2 overexpression established by immunohistochemistry (IHC) with borderline cases being further tested for amplification by fluorescence in situ hybridization (FISH). Manual interpretation of HER2 FISH is based on a limited number of cells and rather complex definitions of equivocal, polysomic, and genetically heterogeneous (GH) cases. Image analysis (IA) can extract high-capacity data and potentially improve HER2 testing in borderline cases. We investigated statistically derived indicators of HER2 heterogeneity in HER2 FISH data obtained by automated IA of 50 IHC borderline (2+) cases of invasive ductal breast carcinoma. Overall, IA significantly underestimated the conventional HER2, CEP17 counts, and HER2/CEP17 ratio; however, it collected more amplified cells in some cases below the lower limit of GH definition by manual procedure. Indicators for amplification, polysomy, and bimodality were extracted by factor analysis and allowed clustering of the tumors into amplified, nonamplified, and equivocal/polysomy categories. The bimodality indicator provided independent cell diversity characteristics for all clusters. Tumors classified as bimodal only partially coincided with the conventional GH heterogeneity category. We conclude that automated high-capacity nonselective tumor cell assay can generate evidence-based HER2 intratumor heterogeneity indicators to refine GH definitions.

  1. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2014-12-01

    Full Text Available Myths in the automation of software testing are an issue of discussion that echoes around the software validation service industry. Probably the first thought that occurs to a knowledgeable reader would be: Why this old topic again? What is new to discuss about the matter? But everyone agrees that automation testing today is not what it used to be ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks to facilitate the testing of applications developed with various platforms and technologies. Undoubtedly automation has advanced, but so did the myths associated with it. The change in perspective and knowledge of people on automation has altered the terrain. This article reflects the points of view and experience of the author on the transformation of the original myths into new versions, and how they are derived; it also provides his thoughts on the new generation of myths.

  3. BARD: Better Automated Redistricting

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    Full Text Available BARD is the first (and at time of writing, only) open source software package for general redistricting and redistricting analysis. BARD provides methods to create, display, compare, edit, automatically refine, evaluate, and profile political districting plans. BARD aims to provide a framework for scientific analysis of redistricting plans and to facilitate wider public participation in the creation of new plans. BARD facilitates map creation and refinement through command-line, graphical user interface, and automatic methods. Since redistricting is a computationally complex partitioning problem not amenable to an exact optimization solution, BARD implements a variety of selectable metaheuristics that can be used to refine existing or randomly generated redistricting plans based on user-determined criteria. Furthermore, BARD supports automated generation of redistricting plans and profiling of plans by assigning different weights to various criteria, such as district compactness or equality of population. This functionality permits exploration of trade-offs among criteria. The intent of a redistricting authority may be explored by examining these trade-offs and inferring which reasonably observable plans were not adopted. Redistricting is a computationally intensive problem for even modest-sized states. Performance is thus an important consideration in BARD's design and implementation. The program implements performance enhancements such as evaluation caching, explicit memory management, and distributed computing across snow clusters.

  4. Spiral waves characterization: Implications for an automated cardiodynamic tissue characterization.

    Science.gov (United States)

    Alagoz, Celal; Cohen, Andrew R; Frisch, Daniel R; Tunç, Birkan; Phatharodom, Saran; Guez, Allon

    2018-07-01

    Spiral waves are phenomena observed in cardiac tissue, especially during fibrillatory activity. Spiral waves are revealed through in-vivo and in-vitro studies using high-density mapping that requires a special experimental setup. Also, in-silico spiral wave analysis and classification is performed using membrane potentials from the entire tissue. In this study, we report a characterization approach that identifies spiral wave behaviors using intracardiac electrogram (EGM) readings obtained with commonly used multipolar diagnostic catheters that perform localized but high-resolution readings. Specifically, the algorithm is designed to distinguish between stationary, meandering, and break-up rotors. The clustering and classification algorithms are tested on simulated data produced using a phenomenological 2D model of cardiac propagation. For EGM measurements, unipolar and bipolar EGM readings from various locations on the tissue using two catheter types are modeled. The distance between spiral behaviors is assessed using the normalized compression distance (NCD), an information-theoretical distance. NCD is a universal metric in the sense that it is based solely on the compressibility of the dataset and requires no feature extraction. We also introduce the normalized FFT distance (NFFTD), where compressibility is replaced with an FFT parameter. Overall, outstanding clustering performance was achieved across varying EGM reading configurations. We found that NCD was more effective than NFFTD at distinguishing behaviors. We demonstrated that distinct spiral activity identification on a behaviorally heterogeneous tissue is also possible. This report demonstrates a theoretical validation of clustering and classification approaches that provide an automated mapping from EGM signals to an assessment of spiral wave behaviors, and hence offers a potential mapping and analysis framework for cardiac tissue wavefront propagation patterns. Copyright © 2018 Elsevier B.V. All rights reserved.
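    The NCD itself is easy to state concretely: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C is the compressed length and xy the concatenation. A minimal stdlib sketch with zlib as the compressor; the toy traces below are assumptions standing in for EGM signals:

```python
import math
import random
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance via zlib; no feature extraction needed."""
    cx, cy, cxy = (len(zlib.compress(s, 9)) for s in (x, y, x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy stand-ins for EGM traces: a periodic (stationary-rotor-like) signal
# versus an unstructured one.
random.seed(0)
periodic = bytes(int(127 + 100 * math.sin(2 * math.pi * i / 32)) % 256
                 for i in range(2000))
noisy = bytes(random.randrange(256) for _ in range(2000))

assert ncd(periodic, periodic) < ncd(periodic, noisy)   # similar signals score lower
```

    Because the measure depends only on compressed lengths, the same function applies unchanged to any serialized EGM configuration, which is what makes NCD attractive as a "universal" distance.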

  5. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verified the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  6. Physical characteristics of a citrullinated pro-filaggrin epitope recognized by anti-citrullinated protein antibodies in rheumatoid arthritis sera

    DEFF Research Database (Denmark)

    Trier, Nicole Hartwig; Holm, Bettina Eide; Slot, Ole

    2016-01-01

    whether biotin labelling influence antibody recognition. The full-length cyclic pro-filaggrin peptide and a linear form with a N-terminal biotin, was recognized to the same level, whereas, a notable difference in ACPA reactivity to the linear peptides with a C-terminal biotin was found, probably due...... amino acid in position 4 C-terminal to citrulline. Collectively, peptide structure, length, the presence of charged amino acids and biotin labelling markedly influence antibody reactivity. In relation to the clinical diagnostics of ACPA, these findings may reflect the differences in diagnostic assays...

  7. Clusters and how to make it work : toolkit for cluster strategy

    NARCIS (Netherlands)

    Manickam, Anu; van Berkel, Karel

    2013-01-01

    Clusters are the magic answer to regional economic development. Firms in clusters are more innovative; cluster policy dominates EU policy; ‘top-sectors’ and excellence are the choice of national policy makers; clusters are ‘in’. But, clusters are complex, clusters are ‘messy’; there is no clear

  8. Automated Test-Form Generation

    Science.gov (United States)

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
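    The mixed-integer selection can be illustrated at toy scale. The item bank, the content constraint, and the exhaustive search below are all hypothetical stand-ins (a real ATA system hands the same objective and constraints to a dedicated MIP solver):

```python
from itertools import combinations

# Toy item bank: (information at the cut score, content area) -- both assumed.
bank = [(0.9, "algebra"), (0.7, "algebra"), (0.8, "geometry"),
        (0.6, "geometry"), (0.5, "algebra"), (0.4, "geometry")]

def assemble(n_items=3, min_geometry=1):
    """Pick the form maximizing summed information under a content constraint."""
    best, best_info = None, -1.0
    for form in combinations(range(len(bank)), n_items):
        if sum(bank[i][1] == "geometry" for i in form) < min_geometry:
            continue                      # content-balance constraint violated
        info = sum(bank[i][0] for i in form)
        if info > best_info:
            best, best_info = form, info
    return best, best_info

form, info = assemble()
# Best feasible form: items 0, 1, 2 (total information 2.4, one geometry item).
```

    Exhaustive search is only feasible for a handful of items; the point of the mixed-integer formulation is that the identical objective and constraints remain solvable for realistic bank sizes.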

  9. Context-Aware user interfaces in automation

    DEFF Research Database (Denmark)

    Olsen, Mikkel Holm

    2007-01-01

    Automation is deployed in a great range of different domains such as the chemical industry, the production of consumer goods, the production of energy (both in terms of power plants and in the petrochemical industry), transportation and several others. Through several decades the complexity...... of automation systems and the level of automation have been rising. This has caused problems regarding the operator's ability to comprehend the overall situation and state of the automation system, in particular in abnormal situations. The amount of data available to the operator results in information overload....... Since context-aware applications have been developed in other research areas it seems natural to analyze the findings of this research and examine how this can be applied to the domain of automation systems. By evaluating existing architectures for the development of context-aware applications we find...

  10. Automated transit planning, operation, and applications

    CERN Document Server

    Liu, Rongfang

    2016-01-01

    This book analyzes the successful implementations of automated transit in various international locations, such as Paris, Toronto, London, and Kuala Lumpur, and investigates the apparent lack of automated transit applications in the urban environment in the United States. The book begins with a brief definition of automated transit and its historical development. After a thorough description of the technical specifications, the author highlights a few applications from each sub-group of the automated transit spectrum. International case studies display various technologies and their applications, and identify vital factors that affect each system and performance evaluations of existing applications. The book then discusses the planning and operation of automated transit applications at both macro and micro levels. Finally, the book covers a number of less successful concepts, as well as the lessons learned, allowing readers to gain a comprehensive understanding of the topic.

  11. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response (DR) programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off equipment or changing comfort set points at each switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal, which triggers pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum savings simultaneously, a total of approximately 2 MW of DR is available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
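    The demand-density figures in the abstract check out arithmetically; a one-line verification of the W/ft² numbers:

```python
# Checking the abstract's arithmetic: 2 MW of coincident DR over roughly
# two million square feet is 1 W/ft2; the ~1 MW observed average is half that.
peak_dr_w = 2_000_000        # ~2 MW if all twelve sites shed simultaneously
avg_dr_w = 1_000_000         # ~1 MW observed on average
floor_area_ft2 = 2_000_000   # ~two million square feet across the sites

print(peak_dr_w / floor_area_ft2)   # 1.0 W/ft2
print(avg_dr_w / floor_area_ft2)    # 0.5 W/ft2
```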

  12. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  13. LIBRARY AUTOMATION IN NIGERIAN UNIVERSITIES

    African Journals Online (AJOL)

    facilitate services and access to information in libraries is widely acceptable. ... Moreover, Ugah (2001) reports that the automation process at the Abubakar ... blueprint in 1987 and a turn-key system of automation was suggested for the library.

  14. Automated 741 document preparation: Oak Ridge National Laboratory's Automated Safeguards Information System (OASIS)

    International Nuclear Information System (INIS)

    Austin, H.C.; Gray, L.M.

    1982-01-01

    OASIS has been providing for Oak Ridge National Laboratory's total safeguards needs since being placed on line in April 1980. The system supports near real-time nuclear materials safeguards and accountability control. The original design of OASIS called for an automated facsimile of a 741 document to be prepared as a functional by-product of updating the inventory. An attempt was made to utilize, intact, DOE-Albuquerque's automated 741 system to generate the facsimile; however, the five-page document produced proved too cumbersome. Albuquerque's programs were modified to print an original 741 document utilizing standard DOE/NRC 741 forms. It is felt that the best features of both the automated and manually generated 741 documents have been incorporated. Automation of the source data for 741 shipping documents produces greater efficiency while reducing possible errors. Through utilization of the standard DOE/NRC form, continuity within the NMMSS system is maintained, thus minimizing the confusion and redundancy associated with facsimiles. OASIS now fulfills the original concept of near real-time accountability by furnishing a viable 741 document as a function of updating the inventory

  15. The prevalence of ANA antibodies, anticentromere antibodies, and anti-cyclic citrullinated peptide antibodies in patients with primary Sjögren’s syndrome compared to patients with dryness symptoms without primary Sjögren’s syndrome confirmation

    Directory of Open Access Journals (Sweden)

    Maria Maślińska

    2017-07-01

    Full Text Available Objectives: Our study analyses the prevalence of ANA, anti-SS-A, anti-SS-B, ACA and ACPA antibodies in patients with pSS and in patients with dryness symptoms without pSS confirmation, and the association of ACPA and ACA antibodies with specific clinical symptoms. Materials and methods: 113 patients were divided into two groups: I – with diagnosed pSS (N = 75) and II – with dryness without pSS evidence (N = 38). Diagnostics: indirect immunofluorescence (IF; Hep-2 cell line) determination of antinuclear antibodies (ANA), anti-SS-A and anti-SS-B antibodies determined with a semi-quantitative method, autoantibody profile (14 antigens, ANA Profil 3 EUROLINE), basic laboratory tests, ophthalmic examination tests, minor salivary gland biopsy with focus score (FS), joint and lung evaluation, and the ESSDAI questionnaire (pSS activity). Results: 88% of group I had ANA antibodies (1 : 320 titre), 5.3% at 1 : 160. Anti-SS-A antibodies were present in 88% of group I, including all ANA 1 : 160. Anti-SS-A antibodies positively correlated with greater and moderate activity (ESSDAI ≥ 5; p = 0.046) and FS. The presence of anti-SS-B antibodies significantly affected disease activity. ACPA were present in 13% of group I (associated with a higher arthritis incidence; p = 0.003) and in 8% of group II. ACA antibodies were present in 4% of group I, but not in group II. No association of ACA with interstitial lung changes was found (the small ACA+ group precludes full conclusions). Conclusions: ANA antibodies should also be considered at a titre of less than 1 : 320, but the presence of anti-SS-A antibodies is still the most important immunological marker for pSS. Anti-SS-A antibodies correlate with higher disease activity (ESSDAI ≥ 5) and higher FS. The presence of anti-SS-B antibodies was significantly affected by higher activity of the disease. The incidence of arthritis was higher in patients with ACPA+ pSS compared to ACPA– (p = 0.003). There was no relationship between ACPA and arthritis in patients with dry-type syndrome without

  16. Determination of atomic cluster structure with cluster fusion algorithm

    DEFF Research Database (Denmark)

    Obolensky, Oleg I.; Solov'yov, Ilia; Solov'yov, Andrey V.

    2005-01-01

    We report an efficient scheme of global optimization, called cluster fusion algorithm, which has proved its reliability and high efficiency in determination of the structure of various atomic clusters.

  17. Large-Scale Multi-Dimensional Document Clustering on GPU Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Mueller, Frank [North Carolina State University; Zhang, Yongpeng [ORNL; Potok, Thomas E [ORNL

    2010-01-01

    Document clustering plays an important role in data mining systems. Recently, a flocking-based document clustering algorithm has been proposed to solve the problem through a simulation resembling the flocking behavior of birds in nature. This method is superior to other clustering algorithms, including k-means, in the sense that the outcome is not sensitive to the initial state. One limitation of this approach is that the algorithmic complexity is inherently quadratic in the number of documents, so execution time becomes a bottleneck with a large number of documents. In this paper, we assess the benefits of exploiting the computational power of Beowulf-like clusters equipped with contemporary Graphics Processing Units (GPUs) as a means to significantly reduce the runtime of flocking-based document clustering. Our framework scales up to over one million documents processed simultaneously in a sixteen-node GPU cluster. Results are also compared to a four-node cluster with higher-end GPUs. On these clusters, we observe 30X-50X speedups, which demonstrates the potential of GPU clusters to efficiently solve massive data mining problems. Such speedups, combined with the scalability potential and accelerator-based parallelization, are unique in the domain of document-based data mining, to the best of our knowledge.

  18. Membership determination of open clusters based on a spectral clustering method

    Science.gov (United States)

    Gao, Xin-Hua

    2018-06-01

    We present a spectral clustering (SC) method aimed at segregating reliable members of open clusters in multi-dimensional space. The SC method is a non-parametric clustering technique that performs cluster division using eigenvectors of the similarity matrix; no prior knowledge of the clusters is required. This method is more flexible in dealing with multi-dimensional data compared to other methods of membership determination. We use this method to segregate the cluster members of five open clusters (Hyades, Coma Ber, Pleiades, Praesepe, and NGC 188) in five-dimensional space; fairly clean cluster members are obtained. We find that the SC method can capture a small number of cluster members (weak signal) from a large number of field stars (heavy noise). Based on these cluster members, we compute the mean proper motions and distances for the Hyades, Coma Ber, Pleiades, and Praesepe clusters, and our results are in general quite consistent with the results derived by other authors. The test results indicate that the SC method is highly suitable for segregating cluster members of open clusters based on high-precision multi-dimensional astrometric data such as Gaia data.
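    The eigenvector-based cluster division the abstract describes can be sketched in a few lines. The following is a generic Ng-Jordan-Weiss-style spectral clustering on a Gaussian similarity matrix, not the paper's exact pipeline; the kernel width `sigma` and the simple farthest-point k-means seeding are illustrative assumptions.

```python
import numpy as np

def spectral_clusters(X, k, sigma=1.0, iters=50):
    """Toy spectral clustering: Gaussian similarity matrix, symmetric
    normalized Laplacian, then k-means on the leading eigenvectors."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))              # similarity matrix
    np.fill_diagonal(W, 0.0)
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(1))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    _, vecs = np.linalg.eigh(L)                       # ascending eigenvalues
    U = vecs[:, :k]                                   # k smallest -> embedding
    U /= np.linalg.norm(U, axis=1, keepdims=True) + 1e-12
    # deterministic farthest-point seeding, then plain Lloyd iterations
    centers = [U[0]]
    for _ in range(1, k):
        dmin = np.min([((U - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(U[dmin.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        labels = ((U[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = U[labels == j].mean(0)
    return labels
```

    In the membership-determination setting, `X` would hold the five astrometric dimensions per star; as the abstract notes, no prior knowledge of the clusters is required beyond the number of groups.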

  19. Individual differences in the calibration of trust in automation.

    Science.gov (United States)

    Pop, Vlad L; Shrewsbury, Alex; Durso, Francis T

    2015-06-01

    The objective was to determine whether operators with an expectancy that automation is trustworthy are better at calibrating their trust to changes in the capabilities of automation, and if so, why. Studies suggest that individual differences in automation expectancy may be able to account for why changes in the capabilities of automation lead to a substantial change in trust for some, yet only a small change for others. In a baggage screening task, 225 participants searched for weapons in 200 X-ray images of luggage. Participants were assisted by an automated decision aid exhibiting different levels of reliability. Measures of expectancy that automation is trustworthy were used in conjunction with subjective measures of trust and perceived reliability to identify individual differences in trust calibration. Operators with high expectancy that automation is trustworthy were more sensitive to changes (both increases and decreases) in automation reliability. This difference was eliminated by manipulating the causal attribution of automation errors. Attributing the cause of automation errors to factors external to the automation fosters an understanding of tasks and situations in which automation differs in reliability and may lead to more appropriate trust. The development of interventions can lead to calibrated trust in automation. © 2014, Human Factors and Ergonomics Society.

  20. Automation of Taxiing

    Directory of Open Access Journals (Sweden)

    Jaroslav Bursík

    2017-01-01

    Full Text Available The article focuses on the possibility of automating taxiing, the part of a flight that, under adverse weather conditions, greatly reduces the operational usability of an airport, and the only part of a flight that has not yet been affected by automation. Taxiing is currently handled manually by the pilot, who controls the airplane based on visual perception. The article primarily deals with possible ways of obtaining navigational information and transferring it automatically to the controls. Currently available technologies useful for navigation, such as computer vision, Light Detection and Ranging, and Global Navigation Satellite System, were analyzed and assessed, and their general implementation into an airplane was designed. Obstacles to the implementation were identified as well. The result is a proposed combination of systems, along with their installation into the airplane’s systems, that makes automated taxiing possible.

  1. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1997-06-01

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set on different levels. The reliability analysis is more problematic when critical safety functions are realized by applying programmable automation systems. Conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method with a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) to the analysis of programmable automation systems. The problem of how to decompose programmable automation systems for reliability modeling purposes is discussed. In addition to the qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements, and the principles for applying expert judgements are discussed in the paper. A framework to apply expert judgements is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)
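    For independent basic events, the fault-tree quantification mentioned above reduces to simple gate arithmetic: AND gates multiply failure probabilities, and OR gates combine them via complements. A minimal sketch; the event names and probabilities below are invented for illustration, not taken from any plant PSA.

```python
def p_and(*ps):
    """AND gate: all inputs must fail (independent basic events)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """OR gate: at least one input fails; 1 - prod(1 - p_i)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical top event: (software fault OR hardware fault) AND loss of a backup channel
p_top = p_and(p_or(1e-3, 5e-4), 2e-2)
```

    Real PSA quantification must additionally treat common-cause failures and dependencies between channels, which is exactly where this independence assumption (and hence the sketch) breaks down; that is one motivation for the expert-judgement framework the report outlines.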

  2. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoid sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  3. Cluster headache

    Science.gov (United States)

    Histamine headache; Headache - histamine; Migrainous neuralgia; Headache - cluster; Horton's headache; Vascular headache - cluster ... Doctors do not know exactly what causes cluster headaches. They ... (chemical in the body released during an allergic response) or ...

  4. Automation of coal mining equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Ryuji

    1986-12-25

    Major machines used in the working face include the shearer and the self-advancing frame. The shearer has been changed from the radio-controlled model to a microcomputer-operated machine, while various functions have been automated. In addition, a system for comprehensively examining operating conditions and natural conditions in the working face is being developed for further automation. The self-advancing frame has been modified from the sequence-controlled model to a microcomputer-aided electrohydraulic control system. In order to proceed further with automation and introduce robotics, detectors, control units and valves must be made smaller, with higher reliability. The system will be controlled from above ground in the future, with the machines in the working face remote-controlled at the gate while relevant data are transmitted above ground from this system. Thus, an automated working face will be realized. (2 figs, 1 photo)

  5. Controls and automation in the SPIRAL project

    International Nuclear Information System (INIS)

    Bothner, U.; Boulot, A.; Maherault, J.; Martial, L.

    1999-01-01

    The control and automation team of the R and D of Accelerator-Exotic Beam Department has had the following tasks in the framework of the SPIRAL collaboration: 1. automation of the resonator high-frequency equipment of the CIME cyclotron; 2. automation of the vacuum equipment, i.e. the low energy line (TBE), the CIME cyclotron, and the low energy line (BE); 3. automation of load safety for the power supply; 4. for each of these tasks, a circuitry file based on the SCHEMA software has been worked out. The programs required for the automation of load safety for the power supply (STEP5, PROTOOL, DESIGNER 4.1) were developed and implemented on a PC

  6. Automated controlled-potential coulometric determination of uranium

    International Nuclear Information System (INIS)

    Knight, C.H.; Clegg, D.E.; Wright, K.D.; Cassidy, R.M.

    1982-06-01

    A controlled-potential coulometer has been automated in our laboratory for routine determination of uranium in solution. The CRNL-designed automated system controls degassing, prereduction, and reduction of the sample. The final result is displayed on a digital coulometer readout. Manual and automated modes of operation are compared to show the precision and accuracy of the automated system. Results are also shown for the coulometric titration of typical uranium-aluminum alloy samples

  7. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  8. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  9. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

    Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since many factors characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I: Agent-based Complex Automated Negotiations and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper after peer review by three PC members. Part II covers ANAC 2011 (The Second Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, automatically negotiate in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  10. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  11. Automated segmentation of ventricles from serial brain MRI for the quantification of volumetric changes associated with communicating hydrocephalus in patients with brain tumor

    Science.gov (United States)

    Pura, John A.; Hamilton, Allison M.; Vargish, Geoffrey A.; Butman, John A.; Linguraru, Marius George

    2011-03-01

    Accurate ventricle volume estimates could improve the understanding and diagnosis of postoperative communicating hydrocephalus. For this category of patients, associated changes in ventricle volume can be difficult to identify, particularly over short time intervals. We present an automated segmentation algorithm that evaluates ventricle size from serial brain MRI examinations. The technique combines serial T1-weighted images to increase SNR and segments the mean image to generate a ventricle template. After pre-processing, the segmentation is initiated by a fuzzy c-means clustering algorithm to find the seeds used in a combination of fast marching methods and geodesic active contours. Finally, the ventricle template is propagated onto the serial data via non-linear registration. Serial volume estimates were obtained in an automated, robust, and accurate manner from difficult data.
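    The fuzzy c-means step that seeds the segmentation can be sketched with the textbook Bezdek updates. This is a generic implementation on feature vectors, not the paper's image pipeline; the fuzzifier `m = 2` and the iteration count are conventional defaults, assumed here for illustration.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Fuzzy c-means: returns soft memberships U (n x c) and cluster centers,
    alternating the two standard update equations."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(1, keepdims=True)                   # each row sums to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(0)[:, None]  # membership-weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(1, keepdims=True)        # u_ij proportional to d_ij^(-2/(m-1))
    return U, centers
```

    In the segmentation context, the rows of `X` would be voxel intensities and the highest-membership voxels of the ventricle-like cluster would serve as seeds for the fast marching and geodesic active contour stages.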

  12. Single-cluster dynamics for the random-cluster model

    NARCIS (Netherlands)

    Deng, Y.; Qian, X.; Blöte, H.W.J.

    2009-01-01

    We formulate a single-cluster Monte Carlo algorithm for the simulation of the random-cluster model. This algorithm is a generalization of the Wolff single-cluster method for the q-state Potts model to noninteger values q>1. Its results for static quantities are in a satisfactory agreement with those
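    For orientation, the q = 2 case of the method (the original Wolff update for the Ising model) can be sketched as follows. The generalization to noninteger q > 1 that the paper develops is not implemented here; the lattice size and inverse temperature in the usage below are illustrative.

```python
import math
import random

def wolff_step(spins, L, beta):
    """One Wolff single-cluster update on an L x L periodic Ising lattice.
    Aligned neighbors join the cluster with probability 1 - exp(-2*beta);
    the grown cluster is then flipped as a whole."""
    p_add = 1.0 - math.exp(-2.0 * beta)
    seed = (random.randrange(L), random.randrange(L))
    s0 = spins[seed]
    cluster, stack = {seed}, [seed]
    while stack:
        x, y = stack.pop()
        for n in (((x + 1) % L, y), ((x - 1) % L, y),
                  (x, (y + 1) % L), (x, (y - 1) % L)):
            if spins[n] == s0 and n not in cluster and random.random() < p_add:
                cluster.add(n)
                stack.append(n)
    for site in cluster:
        spins[site] = -s0       # flip the whole cluster
    return len(cluster)
```

    Because an entire correlated cluster is updated at once, this kind of move suppresses critical slowing down relative to single-spin Metropolis updates, which is what makes the random-cluster generalization attractive.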

  13. Human-centered automation: Development of a philosophy

    Science.gov (United States)

    Graeber, Curtis; Billings, Charles E.

    1990-01-01

    Information on human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant the human management and control function in civil air transport.

  14. A system-level approach to automation research

    Science.gov (United States)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  15. International Conference Automation : Challenges in Automation, Robotics and Measurement Techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2016-01-01

    This book presents the set of papers accepted for presentation at the International Conference Automation, held in Warsaw on 2-4 March 2016. It contains the research results of top experts in the fields of industrial automation, control, robotics and measurement techniques. Each chapter presents a thorough analysis of a specific technical problem, usually followed by numerical analysis, simulation, and a description of the results of implementing the solution to a real-world problem. The presented theoretical results, practical solutions and guidelines will be valuable both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  16. Automated road network extraction from high spatial resolution multi-spectral imagery

    Science.gov (United States)

    Zhang, Qiaoping

    For the last three decades, the Geomatics Engineering and Computer Science communities have considered automated road network extraction from remotely-sensed imagery to be a challenging and important research topic. The main objective of this research is to investigate the theory and methodology of automated feature extraction for image-based road database creation, refinement or updating, and to develop a series of algorithms for road network extraction from high resolution multi-spectral imagery. The proposed framework for road network extraction from multi-spectral imagery begins with an image segmentation using the k-means algorithm. This step mainly concerns the exploitation of the spectral information for feature extraction. The road cluster is automatically identified using a fuzzy classifier based on a set of predefined road surface membership functions. These membership functions are established based on the general spectral signature of road pavement materials and the corresponding normalized digital numbers on each multi-spectral band. Shape descriptors of the Angular Texture Signature are defined and used to reduce the misclassifications between roads and other spectrally similar objects (e.g., crop fields, parking lots, and buildings). An iterative and localized Radon transform is developed for the extraction of road centerlines from the classified images. The purpose of the transform is to accurately and completely detect the road centerlines. It is able to find short, long, and even curvilinear lines. The input image is partitioned into a set of subset images called road component images. An iterative Radon transform is locally applied to each road component image. At each iteration, road centerline segments are detected based on an accurate estimation of the line parameters and line widths. Three localization approaches are implemented and compared using qualitative and quantitative methods. 
Finally, the road centerline segments are grouped into a
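    The first stage described above, k-means over per-pixel spectral vectors, can be sketched generically. The band values in the usage below are invented toy data; the fuzzy road-cluster identification and the Radon-transform centerline extraction are not reproduced here.

```python
import numpy as np

def kmeans_pixels(pixels, k, iters=20):
    """Plain Lloyd k-means on an (n_pixels, n_bands) array of spectral
    vectors; returns a cluster label per pixel and the band-space centers."""
    pixels = np.asarray(pixels, dtype=float)
    # deterministic farthest-point seeding
    centers = [pixels[0]]
    for _ in range(1, k):
        dmin = np.min([((pixels - c) ** 2).sum(1) for c in centers], axis=0)
        centers.append(pixels[dmin.argmax()])
    centers = np.array(centers)
    labels = np.zeros(len(pixels), dtype=int)
    for _ in range(iters):
        d = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(0)
    return labels, centers
```

    In the full framework, the resulting cluster centers would then be scored against road-surface membership functions to pick out the road cluster before shape-based filtering.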

  17. Psi4 1.1: An Open-Source Electronic Structure Program Emphasizing Automation, Advanced Libraries, and Interoperability.

    Science.gov (United States)

    Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David

    2017-07-11

    Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.

  18. clusterMaker: a multi-algorithm clustering plugin for Cytoscape

    Directory of Open Access Journals (Sweden)

    Morris John H

    2011-11-01

    Full Text Available Abstract Background In the post-genomic era, the rapid increase in high-throughput data calls for computational tools capable of integrating data of diverse types and facilitating recognition of biologically meaningful patterns within them. For example, protein-protein interaction data sets have been clustered to identify stable complexes, but scientists lack easily accessible tools to facilitate combined analyses of multiple data sets from different types of experiments. Here we present clusterMaker, a Cytoscape plugin that implements several clustering algorithms and provides network, dendrogram, and heat map views of the results. The Cytoscape network is linked to all of the other views, so that a selection in one is immediately reflected in the others. clusterMaker is the first Cytoscape plugin to implement such a wide variety of clustering algorithms and visualizations, including the only implementations of hierarchical clustering, dendrogram plus heat map visualization (tree view), k-means, k-medoid, SCPS, AutoSOME, and native (Java) MCL. Results Results are presented in the form of three scenarios of use: analysis of protein expression data using a recently published mouse interactome and a mouse microarray data set of nearly one hundred diverse cell/tissue types; the identification of protein complexes in the yeast Saccharomyces cerevisiae; and the cluster analysis of the vicinal oxygen chelate (VOC) enzyme superfamily. For scenario one, we explore functionally enriched mouse interactomes specific to particular cellular phenotypes and apply fuzzy clustering. For scenario two, we explore the prefoldin complex in detail using both physical and genetic interaction clusters. For scenario three, we explore the possible annotation of a protein as a methylmalonyl-CoA epimerase within the VOC superfamily. Cytoscape session files for all three scenarios are provided in the Additional Files section. Conclusions The Cytoscape plugin cluster

  19. Relevant Subspace Clustering

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering aims at detecting clusters in any subspace projection of a high dimensional space. As the number of possible subspace projections is exponential in the number of dimensions, the result is often tremendously large. Recent approaches fail to reduce results to relevant subspace...... clusters. Their results are typically highly redundant, i.e. many clusters are detected multiple times in several projections. In this work, we propose a novel model for relevant subspace clustering (RESCU). We present a global optimization which detects the most interesting non-redundant subspace clusters...... achieves top clustering quality while competing approaches show greatly varying performance....

  20. Comparison of Automated Graphical User Interface Testing Tools

    OpenAIRE

    Gaber, Domen

    2018-01-01

    The thesis presents an analysis of modern tools for automated testing of various web-based user interfaces. The purpose of the work is to compare specific test automation solutions and point out the most suitable test automation tool amongst them. One of the main goals of test automation is to gain faster execution when compared to manual testing, along with overall cost reduction. There are multiple test automation solutions available on the market, which differ in complexity of use, type of o...

  1. Efficient Messaging through Cluster Coordinators in Decentralized Controlled Material Flow Systems

    Directory of Open Access Journals (Sweden)

    Lieberoth-Leden Christian

    2016-01-01

    Full Text Available The modularization of hardware and software is one approach to handling the demand for increasing flexibility and changeability of automated material flow systems. Control that is distributed across several different hardware controllers creates a great demand for coordination between the modules when planning, for example, transports, especially if the modules mutually depend on each other's executing tasks. Short-term changes in planning often initiate a rescheduling chain reaction, which causes a high communication load in the system. In the presented approach, module clusters with a centralized coordinator are automatically formed out of multiple modules, and the coordinator takes over the surrounding communication on the modules' behalf. As a result, clusters minimize the number of exchanged messages by focusing on the essential information.

  2. Wireless Android Based Home Automation System

    Directory of Open Access Journals (Sweden)

    Muhammad Tanveer Riaz

    2017-01-01

    Full Text Available This manuscript presents a prototype and design implementation of an advanced home automation system that uses Wi-Fi technology as a network infrastructure connecting its parts. The proposed system consists of two main components. The first part is the server, which presents the system core that manages and controls the user's home. Users and the system administrator can manage and control the system locally (Local Area Network) or remotely (Internet). The second part is the hardware interface module, which provides an appropriate interface to the sensors and actuators of the home automation system. Unlike most of the home automation systems available on the market, the proposed system is scalable: one server can manage many hardware interface modules as long as they exist within network coverage. The system supports a wide range of home automation devices such as appliances, power management components, and security components. The proposed system is better in terms of flexibility and scalability than the commercially available home automation systems

  3. Horticultural cluster

    OpenAIRE

    SHERSTIUK S.V.; POSYLAYEVA K.I.

    2013-01-01

    In the article, theoretical and methodological approaches to the nature and existence of the cluster are presented, and the cluster's differences from other kinds of cooperative and integration associations are outlined. Scientific and practical recommendations for forming a competitive horticultural cluster were developed.

  4. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  5. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  6. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  7. TreeCluster: Massively scalable transmission clustering using phylogenetic trees

    OpenAIRE

    Moshiri, Alexander

    2018-01-01

    Background: The ability to infer transmission clusters from molecular data is critical to designing and evaluating viral control strategies. Viral sequencing datasets are growing rapidly, but standard methods of transmission cluster inference do not scale well beyond thousands of sequences. Results: I present TreeCluster, a cross-platform tool that performs transmission cluster inference on a given phylogenetic tree orders of magnitude faster than existing inference methods and supports multi...

  8. Automated Testing Infrastructure and Result Comparison for Geodynamics Codes

    Science.gov (United States)

    Heien, E. M.; Kellogg, L. H.

    2013-12-01

    The geodynamics community uses a wide variety of codes on a wide variety of both software and hardware platforms to simulate geophysical phenomena. These codes are generally variants of finite difference or finite element calculations involving Stokes flow or wave propagation. A significant problem is that codes of even low complexity will return different results depending on the platform due to slight differences in hardware, software, compiler, and libraries. Furthermore, changes to the codes during development may affect solutions in unexpected ways such that previously validated results are altered. The Computational Infrastructure for Geodynamics (CIG) is funded by the NSF to enhance the capabilities of the geodynamics community through software development. CIG has recently done extensive work in setting up an automated testing and result validation system based on the BaTLab system developed at the University of Wisconsin, Madison. This system uses 16 variants of Linux and Mac platforms on both 32 and 64-bit processors to test several CIG codes, and has also recently been extended to support testing on the XSEDE TACC (Texas Advanced Computing Center) Stampede cluster. In this work we overview the system design and demonstrate how automated testing and validation occur and how results are reported. We also examine several results from the system for different codes and discuss how changes in compilers and libraries affect the results. Finally, we detail some result comparison tools for different types of output (scalar fields, velocity fields, seismogram data), and discuss within what margins different results can be considered equivalent.
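    In the simplest case, the scalar-field comparison described at the end reduces to an element-wise tolerance check; the margins below are illustrative defaults, not CIG's actual acceptance thresholds.

```python
import numpy as np

def fields_equivalent(a, b, rtol=1e-6, atol=1e-12):
    """Decide whether two runs of the same code (different platform,
    compiler, or library versions) produced equivalent scalar fields:
    same shape, and element-wise within relative/absolute tolerance."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return a.shape == b.shape and bool(np.allclose(a, b, rtol=rtol, atol=atol))
```

    Velocity fields and seismograms need richer comparisons (norms over vector components, time alignment), but the same tolerance principle applies.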

  9. Voting-based consensus clustering for combining multiple clusterings of chemical structures

    Directory of Open Access Journals (Sweden)

    Saeed Faisal

    2012-12-01

    Background: Although many consensus clustering methods have been successfully used for combining multiple classifiers in areas such as machine learning, applied statistics, pattern recognition and bioinformatics, few consensus clustering methods have been applied to combining multiple clusterings of chemical structures. It is known that no individual clustering method will always give the best results for all types of applications. So, in this paper, three voting and graph-based consensus clustering methods were used for combining multiple clusterings of chemical structures, to enhance the ability of each cluster to separate biologically active molecules from inactive ones. Results: The cumulative voting-based aggregation algorithm (CVAA), cluster-based similarity partitioning algorithm (CSPA) and hyper-graph partitioning algorithm (HGPA) were examined. The F-measure and Quality Partition Index (QPI) were used to evaluate the clusterings, and the results were compared to Ward's clustering method. The MDL Drug Data Report (MDDR) dataset was used for the experiments and was represented by two 2D fingerprints, ALOGP and ECFP_4. The voting-based consensus clustering method outperformed Ward's method on both F-measure and QPI for both the ALOGP and ECFP_4 fingerprints, while the graph-based consensus clustering methods outperformed Ward's method only for ALOGP under QPI. The Jaccard and Euclidean distance measures were the methods of choice for generating the ensembles, giving the highest values for both criteria. Conclusions: The results of the experiments show that consensus clustering methods can improve the effectiveness of clustering chemical structures. The cumulative voting-based aggregation algorithm (CVAA) was the method of choice among the consensus clustering methods.
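
    The CVAA, CSPA and HGPA algorithms themselves are not described in the abstract; a minimal sketch of the general consensus idea they build on — counting how often a pair of molecules is co-clustered across base runs, then merging pairs that appear together in a majority of runs — might look like this (illustrative labels, not MDDR data):

```python
from itertools import combinations

def co_association(partitions, n_items):
    """Fraction of base clusterings that place each pair of items together."""
    counts = [[0.0] * n_items for _ in range(n_items)]
    for labels in partitions:
        for i, j in combinations(range(n_items), 2):
            if labels[i] == labels[j]:
                counts[i][j] += 1
                counts[j][i] += 1
    m = len(partitions)
    return [[c / m for c in row] for row in counts]

def consensus_clusters(partitions, n_items, vote=0.5):
    """Merge items co-clustered in more than a `vote` fraction of partitions."""
    sim = co_association(partitions, n_items)
    parent = list(range(n_items))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for i, j in combinations(range(n_items), 2):
        if sim[i][j] > vote:
            parent[find(i)] = find(j)
    groups = {}
    for i in range(n_items):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Three base clusterings of five molecules (cluster labels are arbitrary per run):
runs = [[0, 0, 1, 1, 2],
        [1, 1, 0, 0, 0],
        [0, 0, 1, 1, 1]]
print(consensus_clusters(runs, 5))  # → [[0, 1], [2, 3, 4]]
```

    CVAA proper aligns labels across runs and accumulates votes rather than thresholding a co-association matrix; the sketch shows only the shared intuition that the ensemble, not any single run, decides cluster membership.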

  10. Automation in airport security X-ray screening of cabin baggage: Examining benefits and possible implementations of automated explosives detection.

    Science.gov (United States)

    Hättenschwiler, Nicole; Sterchi, Yanik; Mendes, Marcia; Schwaninger, Adrian

    2018-10-01

    Bomb attacks on civil aviation make detecting improvised explosive devices and explosive material in passenger baggage a major concern. In the last few years, explosive detection systems for cabin baggage screening (EDSCB) have become available. Although used by a number of airports, most countries have not yet implemented these systems on a wide scale. We investigated the benefits of EDSCB with two different levels of automation currently being discussed by regulators and airport operators: automation as a diagnostic aid with on-screen alarm resolution by the airport security officer (screener), or EDSCB with an automated decision by the machine. The two experiments reported here tested and compared both scenarios, with a condition without automation as baseline. Participants were screeners at two international airports who differed in both years of work experience and familiarity with automation aids. Results showed that experienced screeners were good at detecting improvised explosive devices even without EDSCB; EDSCB increased only their detection of bare explosives. In contrast, for screeners with less experience, EDSCB with automated decision provided better human-machine detection performance than on-screen alarm resolution and no automation. This came at the cost of slightly higher false alarm rates at the human-machine system level, which would still be acceptable from an operational point of view. Results indicate that a wide-scale implementation of EDSCB would increase the detection of explosives in passenger bags, and that automated decision, rather than automation as a diagnostic aid with on-screen alarm resolution, should be considered. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
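
    The exact performance measure is not stated in the abstract, but studies of this kind commonly summarize hit and false-alarm rates at the human-machine system level with the sensitivity index d′. A sketch with illustrative rates (not the paper's data):

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Sensitivity index d' = z(hit rate) - z(false alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Illustrative rates: automation raises hits slightly while also
# raising false alarms slightly, and d' still improves.
print(round(d_prime(0.80, 0.10), 2))  # baseline → 2.12
print(round(d_prime(0.90, 0.15), 2))  # with automated decision → 2.32
```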

  11. 21 CFR 864.5600 - Automated hematocrit instrument.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hematocrit instrument. 864.5600 Section 864.5600 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  12. 21 CFR 862.2900 - Automated urinalysis system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated urinalysis system. 862.2900 Section 862....2900 Automated urinalysis system. (a) Identification. An automated urinalysis system is a device... that duplicate manual urinalysis systems. This device is used in conjunction with certain materials to...

  13. Resins production: batch plant automation

    International Nuclear Information System (INIS)

    Banti, M.; Mauri, G.

    1996-01-01

    Companies seeking to automate their plants without external resources have at their disposal flexible, customizable, easy-to-use DCSs that are open towards PLCs. This article explains why Hoechst followed this route in automating its new resin production plants.

  14. OBSERVED SCALING RELATIONS FOR STRONG LENSING CLUSTERS: CONSEQUENCES FOR COSMOLOGY AND CLUSTER ASSEMBLY

    International Nuclear Information System (INIS)

    Comerford, Julia M.; Moustakas, Leonidas A.; Natarajan, Priyamvada

    2010-01-01

    Scaling relations of observed galaxy cluster properties are useful tools for constraining cosmological parameters as well as cluster formation histories. One of the key cosmological parameters, σ8, is constrained using observed clusters of galaxies, although current estimates of σ8 from the scaling relations of dynamically relaxed galaxy clusters are limited by the large scatter in the observed cluster mass-temperature (M-T) relation. With a sample of eight strong lensing clusters at 0.3 8, but combining the cluster concentration-mass relation with the M-T relation enables the inclusion of unrelaxed clusters as well. Thus, the resultant gains in the accuracy of σ8 measurements from clusters are twofold: the errors on σ8 are reduced and the cluster sample size is increased. Therefore, the statistics on σ8 determination from clusters are greatly improved by the inclusion of unrelaxed clusters. Exploring cluster scaling relations further, we find that the correlation between brightest cluster galaxy (BCG) luminosity and cluster mass offers insight into the assembly histories of clusters. We find preliminary evidence for a steeper BCG luminosity-cluster mass relation for strong lensing clusters than for the general cluster population, hinting that strong lensing clusters may have had more active merging histories.

  15. Small cities face greater impact from automation

    Science.gov (United States)

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  16. Small cities face greater impact from automation.

    Science.gov (United States)

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  17. Cluster Headache

    OpenAIRE

    Pearce, Iris

    1985-01-01

    Cluster headache is the most severe primary headache, with recurrent pain attacks described as worse than giving birth. The aim of this paper was to give an overview of current knowledge on cluster headache with a focus on pathophysiology and treatment. This paper presents hypotheses of cluster headache pathophysiology, current treatment options and possible future therapy approaches. For years, the hypothalamus was regarded as the key structure in cluster headache, but is now thought to be pa...

  18. Space power subsystem automation technology

    Science.gov (United States)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  19. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for ⁹⁹Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of ⁹⁰Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  20. Modeling and clustering water demand patterns from real-world smart meter data

    Directory of Open Access Journals (Sweden)

    N. Cheifetz

    2017-08-01

    Nowadays, drinking water utilities need an acute comprehension of the water demand on their distribution network, in order to efficiently operate the optimization of resources, manage billing and propose new customer services. With the emergence of smart grids, based on automated meter reading (AMR), a better understanding of the consumption modes is now accessible for smart cities with more granularities. In this context, this paper evaluates a novel methodology for identifying relevant usage profiles from the water consumption data produced by smart meters. The methodology is fully data-driven using the consumption time series, which are seen as functions or curves observed with an hourly time step. First, a Fourier-based additive time series decomposition model is introduced to extract seasonal patterns from time series. These patterns are intended to represent the customer habits in terms of water consumption. Two functional clustering approaches are then used to classify the extracted seasonal patterns: the functional version of K-means, and the Fourier REgression Mixture (FReMix) model. The K-means approach produces a hard segmentation and K representative prototypes. On the other hand, the FReMix is a generative model and also produces K profiles as well as a soft segmentation based on the posterior probabilities. The proposed approach is applied to a smart grid deployed on the largest water distribution network (WDN) in France. The two clustering strategies are evaluated and compared. Finally, a realistic interpretation of the consumption habits is given for each cluster. The extensive experiments and the qualitative interpretation of the resulting clusters allow one to highlight the effectiveness of the proposed methodology.

  1. Modeling and clustering water demand patterns from real-world smart meter data

    Science.gov (United States)

    Cheifetz, Nicolas; Noumir, Zineb; Samé, Allou; Sandraz, Anne-Claire; Féliers, Cédric; Heim, Véronique

    2017-08-01

    Nowadays, drinking water utilities need an acute comprehension of the water demand on their distribution network, in order to efficiently operate the optimization of resources, manage billing and propose new customer services. With the emergence of smart grids, based on automated meter reading (AMR), a better understanding of the consumption modes is now accessible for smart cities with more granularities. In this context, this paper evaluates a novel methodology for identifying relevant usage profiles from the water consumption data produced by smart meters. The methodology is fully data-driven using the consumption time series which are seen as functions or curves observed with an hourly time step. First, a Fourier-based additive time series decomposition model is introduced to extract seasonal patterns from time series. These patterns are intended to represent the customer habits in terms of water consumption. Two functional clustering approaches are then used to classify the extracted seasonal patterns: the functional version of K-means, and the Fourier REgression Mixture (FReMix) model. The K-means approach produces a hard segmentation and K representative prototypes. On the other hand, the FReMix is a generative model and also produces K profiles as well as a soft segmentation based on the posterior probabilities. The proposed approach is applied to a smart grid deployed on the largest water distribution network (WDN) in France. The two clustering strategies are evaluated and compared. Finally, a realistic interpretation of the consumption habits is given for each cluster. The extensive experiments and the qualitative interpretation of the resulting clusters allow one to highlight the effectiveness of the proposed methodology.
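
    A minimal sketch of the pipeline described above — extracting low-order Fourier coefficients as seasonal-pattern features, then hard-clustering them with K-means — under simplifying assumptions (24-hour synthetic patterns, naive initialization from the first k points rather than the paper's method):

```python
import math

def fourier_features(series, n_harmonics=2):
    """Project a daily pattern (hourly, length 24) onto low-order Fourier modes."""
    n = len(series)
    feats = [sum(series) / n]  # mean level
    for k in range(1, n_harmonics + 1):
        a = sum(x * math.cos(2 * math.pi * k * t / n) for t, x in enumerate(series)) * 2 / n
        b = sum(x * math.sin(2 * math.pi * k * t / n) for t, x in enumerate(series)) * 2 / n
        feats += [a, b]
    return feats

def kmeans(points, k, iters=20):
    """Tiny K-means with naive init (first k points); real use wants k-means++."""
    centers = [list(p) for p in points[:k]]
    def nearest(p):
        return min(range(k), key=lambda c: sum((u - v) ** 2 for u, v in zip(p, centers[c])))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[nearest(p)].append(p)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return [nearest(p) for p in points]

# Two synthetic habits: a morning consumption peak vs. an evening peak.
morning = [math.exp(-((h - 8) ** 2) / 8.0) for h in range(24)]
evening = [math.exp(-((h - 20) ** 2) / 8.0) for h in range(24)]
meters = [morning, evening, morning, evening]
labels = kmeans([fourier_features(s) for s in meters], 2)
print(labels)  # → [0, 1, 0, 1]
```

    The FReMix side of the paper would replace the hard assignment with posterior probabilities from a mixture of Fourier regressions; this sketch covers only the K-means branch.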

  2. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  3. 21 CFR 864.5200 - Automated cell counter.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  4. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  5. 21 CFR 864.5850 - Automated slide spinner.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  6. Novel insights in agent-based complex automated negotiation

    CERN Document Server

    Lopez-Carmona, Miguel; Ito, Takayuki; Zhang, Minjie; Bai, Quan; Fujita, Katsuhide

    2014-01-01

    This book focuses on all aspects of complex automated negotiations, which are studied in the field of autonomous agents and multi-agent systems. The book consists of two parts: I, Agent-Based Complex Automated Negotiations; and II, Automated Negotiation Agents Competition. The chapters in Part I are extended versions of papers presented at the 2012 international workshop on Agent-Based Complex Automated Negotiation (ACAN), after peer review by three Program Committee members. Part II examines in detail ANAC 2012 (The Third Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of universities and research institutes across the world, are evaluated in tournament style. The purpose of the competition is to steer the research in the area of bilate...

  7. Properties of an ionised-cluster beam from a vaporised-cluster ion source

    International Nuclear Information System (INIS)

    Takagi, T.; Yamada, I.; Sasaki, A.

    1978-01-01

    A new type of ion source, the vaporised-metal cluster ion source, has been developed for deposition and epitaxy. A cluster consisting of 10² to 10³ atoms coupled loosely together is formed by adiabatic expansion, ejecting the vapour of materials into a high-vacuum region through the nozzle of a heated crucible. The clusters are ionised by electron bombardment and accelerated with neutral clusters toward a substrate. In this paper, the mechanisms of cluster formation, experimental results on the cluster size (atoms/cluster) and its distribution, and characteristics of the cluster ion beams are reported. The size is calculated from the kinetic equation E = (1/2)mNV_ej², where E is the cluster beam energy, V_ej is the ejection velocity, m is the mass of an atom and N is the cluster size. The energy and the velocity of the cluster are measured by an electrostatic 127° energy analyser and a rotating disc system, respectively. The cluster size obtained for Ag is about 5×10² to 2×10³ atoms. The retarding potential method is used to confirm the results for Ag. The same dependence on cluster size for metals such as Ag, Cu and Pb has been obtained in previous experiments. In the cluster state the cluster ion beam is easily produced by electron bombardment; about 50% of the clusters are ionised under typical operating conditions, because of the large ionisation cross sections of the clusters. To obtain a uniform spatial distribution, the ionising electrode system is also discussed. The new techniques are termed ionised-cluster beam deposition (ICBD) and epitaxy (ICBE). (author)
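
    Rearranging the kinetic equation above gives N = 2E/(mV_ej²). With hypothetical values (not the paper's measurements), a silver cluster beam of about 140 eV ejected at 500 m/s works out to roughly 10³ atoms per cluster, in the range reported for Ag:

```python
EV = 1.602176634e-19   # joules per electronvolt
U = 1.66053906660e-27  # kg per atomic mass unit
M_AG = 107.8682 * U    # mass of one Ag atom, kg

def cluster_size(energy_ev, v_ej):
    """Atoms per cluster, N = 2E / (m * V_ej^2), for E in eV and V_ej in m/s."""
    return 2 * energy_ev * EV / (M_AG * v_ej ** 2)

# Illustrative: 140 eV beam energy, 500 m/s ejection velocity.
print(round(cluster_size(140.0, 500.0)))  # → roughly 1000 atoms
```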

  8. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP proj...

  9. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  10. Automation Revolutionize the Business Service Industry

    OpenAIRE

    Marciniak, Róbert

    2017-01-01

    In the last decades, significant disruptive changes began with the extended use of automation. Many jobs have changed or disappeared, and entirely new ones have been created by automation. With the progress of technology, automation first spread in the industrial sector, mostly on production and assembly lines. The growth may continue further in the future; researchers expect more than 35 million industrial robots globally by 2018. But it shades the situati...

  11. Feasibility Study of Parallel Finite Element Analysis on Cluster-of-Clusters

    Science.gov (United States)

    Muraoka, Masae; Okuda, Hiroshi

    With the rapid growth of WAN infrastructure and the development of Grid middleware, it has become a realistic and attractive methodology to connect cluster machines over a wide-area network for the execution of computation-demanding applications. Many existing parallel finite element (FE) applications have been, however, designed and developed with a single computing resource in mind, since such applications require frequent synchronization and communication among processes. There have been few FE applications that can exploit the distributed environment so far. In this study, we explore the feasibility of FE applications on the cluster-of-clusters. First, we classify FE applications into two types, tightly coupled applications (TCA) and loosely coupled applications (LCA), based on their communication pattern. A prototype of each application is implemented on the cluster-of-clusters. We perform numerical experiments executing TCA and LCA on both the cluster-of-clusters and a single cluster. Through these experiments, by comparing the performances and communication costs in each case, we evaluate the feasibility of FEA on the cluster-of-clusters.

  12. Interplay between experiments and calculations for organometallic clusters and caged clusters

    International Nuclear Information System (INIS)

    Nakajima, Atsushi

    2015-01-01

    Clusters consisting of 10-1000 atoms exhibit size-dependent electronic and geometric properties. In particular, composite clusters consisting of several elements and/or components provide a promising bottom-up approach for designing functional advanced materials, because the functionality of the composite clusters can be optimized not only by the cluster size but also by their composition. In the formation of composite clusters, their geometric symmetry and dimensionality are emphasized to control the physical and chemical properties, because selective and anisotropic enhancements of optical, chemical, and magnetic properties can be expected. Organometallic clusters and caged clusters are demonstrated as representative examples of designing the functionality of composite clusters. Organometallic vanadium-benzene clusters form a one-dimensional sandwich structure showing ferromagnetic behavior and anomalously large HOMO-LUMO gap differences between the two spin orbitals, which can be regarded as spin-filter components for cluster-based spintronic devices. Caged clusters of aluminum (Al) are well stabilized both geometrically and electronically at Al₁₂X, behaving as a “superatom”.

  13. The interaction between hippocampal GABA-B and cannabinoid receptors upon spatial change and object novelty discrimination memory function.

    Science.gov (United States)

    Nasehi, Mohammad; Alaghmandan-Motlagh, Niyousha; Ebrahimi-Ghiri, Mohaddeseh; Nami, Mohammad; Zarrindast, Mohammad-Reza

    2017-10-01

    Previous studies have postulated functional links between the GABA and cannabinoid systems in the hippocampus. The aim of the present study was to investigate any possible interaction between these systems in spatial change and object novelty discrimination memory consolidation in the dorsal hippocampus (CA1 region) of NMRI mice. Assessment of spatial change and object novelty discrimination memory function was carried out in a non-associative task. The experiment comprised exposing mice to an open field containing five objects, followed by examination of their reactivity to object displacement (spatial change) and object substitution (object novelty) after three sessions of habituation. Our results showed that post-training intraperitoneal administration of the higher dose of ACPA (0.02 mg/kg) impaired both spatial change and novelty discrimination memory functions. Meanwhile, the higher dose of the GABA-B receptor agonist, baclofen, impaired spatial change memory by itself. Moreover, post-training intra-CA1 microinjection of a subthreshold dose of baclofen increased the ACPA effect on spatial change and novelty discrimination memory at the lower and higher dose, respectively. On the other hand, the lower and higher, but not mid-level, doses of the GABA-B receptor antagonist, phaclofen, could reverse the memory deficits induced by ACPA. However, phaclofen at its mid-level dose impaired novelty discrimination memory, whereas the higher dose impaired spatial change memory. Based on our findings, GABA-B receptors in the CA1 region appear to modulate ACPA-induced cannabinoid CB1 signaling upon spatial change and novelty discrimination memory functions.

  14. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following describe key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  15. Categorias Cluster

    OpenAIRE

    Queiroz, Dayane Andrade

    2015-01-01

    In this work we present cluster categories, which were introduced by Aslak Bakke Buan, Robert Marsh, Markus Reineke, Idun Reiten and Gordana Todorov with the aim of categorifying the cluster algebras created in 2002 by Sergey Fomin and Andrei Zelevinsky. In [4], the above authors showed that there is a close relationship between cluster algebras and cluster categories for quivers whose underlying graph is a Dynkin diagram. For this they developed a tilting theory in the triang...

  16. BRIGHTEST CLUSTER GALAXIES AND CORE GAS DENSITY IN REXCESS CLUSTERS

    International Nuclear Information System (INIS)

    Haarsma, Deborah B.; Leisman, Luke; Donahue, Megan; Bruch, Seth; Voit, G. Mark; Boehringer, Hans; Pratt, Gabriel W.; Pierini, Daniele; Croston, Judith H.; Arnaud, Monique

    2010-01-01

    We investigate the relationship between brightest cluster galaxies (BCGs) and their host clusters using a sample of nearby galaxy clusters from the Representative XMM-Newton Cluster Structure Survey. The sample was imaged with the Southern Observatory for Astrophysical Research in R band to investigate the mass of the old stellar population. Using a metric radius of 12 h⁻¹ kpc, we found that the BCG luminosity depends weakly on overall cluster mass as L_BCG ∝ M_cl^(0.18±0.07), consistent with previous work. We found that 90% of the BCGs are located within 0.035 r_500 of the peak of the X-ray emission, including all of the cool core (CC) clusters. We also found an unexpected correlation between the BCG metric luminosity and the core gas density for non-cool-core (non-CC) clusters, following a power law of n_e ∝ L_BCG^(2.7±0.4) (where n_e is measured at 0.008 r_500). The correlation is not easily explained by star formation (which is weak in non-CC clusters) or overall cluster mass (which is not correlated with core gas density). The trend persists even when the BCG is not located near the peak of the X-ray emission, so proximity is not necessary. We suggest that, for non-CC clusters, this correlation implies that the same process that sets the central entropy of the cluster gas also determines the central stellar density of the BCG, and that this underlying physical process is likely to be mergers.
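
    The relations quoted above are power laws, whose exponents (such as the 2.7±0.4 slope of the core-density relation) are conventionally estimated as a least-squares slope in log-log space. A sketch on synthetic data (not the survey's measurements):

```python
import math

def powerlaw_slope(xs, ys):
    """Least-squares slope of log10(y) vs log10(x): the exponent b in y ∝ x**b."""
    lx = [math.log10(x) for x in xs]
    ly = [math.log10(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic check: data generated from y = 3 * x**2.7 recovers the exponent.
xs = [1.0, 2.0, 5.0, 10.0, 30.0]
ys = [3.0 * x ** 2.7 for x in xs]
print(round(powerlaw_slope(xs, ys), 3))  # → 2.7
```

    Quoted uncertainties like ±0.4 would come from the scatter about this fit, which the sketch omits.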

  17. Levels of automation and user control - evaluation of a turbine automation interface

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Jonas (Chalmers Univ. of Technology (Sweden))

    2008-10-15

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operation lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to difficulty following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (author)

  18. Levels of automation and user control - evaluation of a turbine automation interface

    International Nuclear Information System (INIS)

    Andersson, Jonas

    2008-10-01

    The study was performed during the annual operator training at the Studsvik nuclear power plant simulator facility in Nykoeping, Sweden. The participating operators came from the Oskarshamn 3 nuclear power plant. In the study, seven nuclear power plant turbine operators were interviewed concerning their use of the automatic turbine system. A field study approach together with a heuristic usability evaluation was made to assess how the operators are affected by use of automation in the control room setting. The purpose of the study was to examine how operator performance is affected by varying levels of automation in nuclear power plant turbine operation. The Automatic Turbine System (ATS) was evaluated to clarify how the ATS interface design supports the operators' work. The results show that during manual control the operators experience loss of speed and accuracy in performing actions, together with difficulty dividing attention between performing a task and overall monitoring, as the major problems. The positive aspects of manual operation lie in an increased feeling of being in control when performing actions by hand. With higher levels of automation the problems shift to difficulty following the automatic sequences and losing track in procedures. As the level of automation gets higher, the need for feedback increases, which means that information presentation also becomes more important. The semiautomatic step mode is often preferred by the operators since it combines the speed and accuracy of the automation with the ability to maintain the feeling of being in control. Further, a number of usability-related concerns were found in the ATS interface. The operators especially experience the presentation of the conditions that manage the automatic sequences as difficult to perceive. (au)

  19. Future Autonomous and Automated Systems Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Trust is the greatest obstacle to implementing greater autonomy and automation (A&A) in the human spaceflight program. The Future Autonomous and Automated...

  20. Fatigue and voluntary utilization of automation in simulated driving.

    Science.gov (United States)

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  1. Aprendizaje automático

    OpenAIRE

    Moreno, Antonio

    1994-01-01

    This book introduces the basic concepts of one of the most actively studied branches of artificial intelligence today: machine learning. It covers topics such as inductive learning, analogical reasoning, explanation-based learning, neural networks, genetic algorithms, case-based reasoning and theoretical approaches to machine learning.

  2. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUI testing. Some Python programming experience is assumed.

  3. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
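
    The abstract does not show Taxi's interface, but the general pattern it describes, a workflow manager that runs each step of the lattice pipeline only after its dependencies complete, can be sketched as follows. All class and task names here are hypothetical, not Taxi's actual API:

```python
class Task:
    def __init__(self, name, action, depends_on=()):
        self.name = name              # unique task identifier
        self.action = action          # zero-argument callable doing the work
        self.depends_on = list(depends_on)

def run_workflow(tasks):
    """Run tasks in dependency order; each task runs exactly once."""
    done_names = set()
    log = []
    pending = list(tasks)
    while pending:
        progressed = False
        for task in list(pending):
            if all(d in done_names for d in task.depends_on):
                task.action()
                done_names.add(task.name)
                log.append(task.name)
                pending.remove(task)
                progressed = True
        if not progressed:
            raise RuntimeError("circular dependency among remaining tasks")
    return log

# Example: generate a gauge ensemble, then measure two observables on it.
results = []
gen = Task("generate_ensemble", lambda: results.append("cfg"))
m1 = Task("measure_plaquette", lambda: results.append("plaq"),
          ["generate_ensemble"])
m2 = Task("measure_spectrum", lambda: results.append("spec"),
          ["generate_ensemble"])
order = run_workflow([m2, m1, gen])
```

    However the task list is ordered, generation always runs before either measurement, which is the property that makes such pipelines less error-prone than hand sequencing.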

  4. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  5. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  6. Cluster-cluster correlations in the two-dimensional stationary Ising-model

    International Nuclear Information System (INIS)

    Klassmann, A.

    1997-01-01

    In numerical integration of the Cahn-Hilliard equation, which describes Ostwald ripening in a two-phase matrix, N. Masbaum showed that spatial correlations between clusters scale with respect to the mean cluster size (itself a function of time). T. B. Liverpool showed by Monte Carlo simulations for the Ising model that the analogous correlations have a similar form. Both demonstrated that immediately around each cluster there is a depletion area followed by something like a ring of clusters of the same size as the original one. More precisely, it has been shown that the distribution of clusters around a given cluster looks like a sine curve decaying exponentially with distance to a constant value
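
    The kind of cluster-cluster correlation described (a depletion zone, then a ring at a preferred spacing) is measured from a histogram of pairwise distances between cluster centres. A minimal, illustrative sketch; the centres below are toy data, not Ising-model output:

```python
import math

def pair_distance_histogram(centres, bin_width, n_bins):
    """Unnormalised radial distribution: pair counts per distance bin."""
    counts = [0] * n_bins
    for i in range(len(centres)):
        for j in range(i + 1, len(centres)):
            d = math.dist(centres[i], centres[j])
            k = int(d / bin_width)
            if k < n_bins:
                counts[k] += 1
    return counts

# Four cluster centres on a unit square: spacings of 1.0 and about 1.41,
# so all pairs fall in the third bin; the empty first bins play the role
# of the depletion zone around each cluster.
centres = [(0, 0), (1, 0), (0, 1), (1, 1)]
hist = pair_distance_histogram(centres, bin_width=0.5, n_bins=4)
```

    Normalising each bin by the expected count for a uniform random arrangement would turn this into the usual pair correlation function g(r).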

  7. Application of Novel Software Algorithms to Spectral-Domain Optical Coherence Tomography for Automated Detection of Diabetic Retinopathy.

    Science.gov (United States)

    Adhi, Mehreen; Semy, Salim K; Stein, David W; Potter, Daniel M; Kuklinski, Walter S; Sleeper, Harry A; Duker, Jay S; Waheed, Nadia K

    2016-05-01

    To present novel software algorithms applied to spectral-domain optical coherence tomography (SD-OCT) for automated detection of diabetic retinopathy (DR). Thirty-one diabetic patients (44 eyes) and 18 healthy, nondiabetic controls (20 eyes) who underwent volumetric SD-OCT imaging and fundus photography were retrospectively identified. A retina specialist independently graded DR stage. Trained automated software generated a retinal thickness score signifying macular edema and a cluster score signifying microaneurysms and/or hard exudates for each volumetric SD-OCT. Of 44 diabetic eyes, 38 had DR and six eyes did not have DR. Leave-one-out cross-validation using a linear discriminant at a missed-detection/false-alarm ratio of 3.00 yielded software sensitivity and specificity of 92% and 69%, respectively, for DR detection when compared to clinical assessment. Novel software algorithms applied to commercially available SD-OCT can successfully detect DR and may have potential as a viable screening tool for DR in the future. [Ophthalmic Surg Lasers Imaging Retina. 2016;47:410-417.]. Copyright 2016, SLACK Incorporated.
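
    Leave-one-out cross-validation with a two-score linear discriminant can be sketched as below. The equal-weight score combination, the midpoint threshold rule, and the toy data are assumptions for illustration, not the study's actual classifier or measurements:

```python
def loo_linear_discriminant(samples):
    """samples: list of (thickness_score, cluster_score, has_dr).
    Returns (sensitivity, specificity) under leave-one-out CV."""
    tp = tn = fp = fn = 0
    for i, (t, c, label) in enumerate(samples):
        train = samples[:i] + samples[i + 1:]
        # Combine the two scores with equal weight (an assumption), then
        # place the decision threshold midway between the class means.
        pos = [x + y for x, y, lab in train if lab]
        neg = [x + y for x, y, lab in train if not lab]
        threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
        predicted = (t + c) > threshold
        if label and predicted: tp += 1
        elif label and not predicted: fn += 1
        elif not label and predicted: fp += 1
        else: tn += 1
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: DR eyes tend to have higher thickness and cluster scores.
eyes = [(0.9, 0.8, True), (0.8, 0.9, True), (0.7, 0.7, True),
        (0.2, 0.1, False), (0.1, 0.2, False), (0.3, 0.2, False)]
sens, spec = loo_linear_discriminant(eyes)
```

    Because every eye is scored by a model trained without it, the resulting sensitivity/specificity estimates are less optimistic than resubstitution on the full sample.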

  8. Meaningful Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Calapristi, Augustin J.; Crow, Vernon L.; Hetzler, Elizabeth G.; Turner, Alan E.

    2004-05-26

    We present an approach to the disambiguation of cluster labels that capitalizes on the notion of semantic similarity to assign WordNet senses to cluster labels. The approach provides interesting insights on how document clustering can provide the basis for developing a novel approach to word sense disambiguation.

  9. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial to improving a robot workcell. Design automation of multi-function fingers is in high demand in the robot industry to overcome the current iterative, time-consuming and complex manual design process. However, the existing approaches to multi-function finger design automation are unable to fully meet the industry's needs. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  10. Identifying Requirements for Effective Human-Automation Teamwork

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; John O'Hara; Heather D. Medema; Johanna H. Oxstrand

    2014-06-01

    Previous studies have shown that poorly designed human-automation collaboration, such as poorly designed communication protocols, often leads to problems for the human operators, such as: lack of vigilance, complacency, and loss of skills. These problems often lead to suboptimal system performance. To address this situation, a considerable amount of research has been conducted to improve human-automation collaboration and to make automation function better as a “team player.” Much of this research is based on an understanding of what it means to be a good team player from the perspective of a human team. However, the research is often based on a simplified view of human teams and teamwork. In this study, we sought to better understand the capabilities and limitations of automation from the standpoint of human teams. We first examined human teams to identify the principles for effective teamwork. We next reviewed the research on integrating automation agents and human agents into mixed agent teams to identify the limitations of automation agents to conform to teamwork principles. This research resulted in insights that can lead to more effective human-automation collaboration by enabling a more realistic set of requirements to be developed based on the strengths and limitations of all agents.

  11. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  12. Macroeconomic Dimensions in the Clusterization Processes: Lithuanian Biomass Cluster Case

    Directory of Open Access Journals (Sweden)

    Navickas Valentinas

    2017-03-01

    The increasing significance of future production systems will impose work organised not on a competitive but on a collaborative basis, with concentrated resources and expertise serving a common purpose. One form of collaboration among medium-sized business organisations is work in clusters. Clusterization as a phenomenon has been known for quite a long time, but it has mainly been investigated at the micro and meso levels; the evaluation of clusterization processes in their macroeconomic dimensions has received comparatively little attention. Accordingly, in this article clusterization processes are analysed with attention concentrated on macroeconomic factors. The authors analyse clusterization's influence on a country's macroeconomic growth, apply a structural research methodology to evaluate that influence, and propose that clusterization processes benefit from macroeconomic analysis. The theoretical model of clusterization processes was validated with reference to a biomass cluster case. Because the biomass cluster is a new phenomenon, there are currently no other scientific approaches to it. The authors' research shows that clusterization can produce a large positive shift in macroeconomic terms, leading to higher value-added creation, faster economic growth of the country, and amelioration of the social situation.

  13. Automated Assessment in Massive Open Online Courses

    Science.gov (United States)

    Ivaniushin, Dmitrii A.; Shtennikov, Dmitrii G.; Efimchick, Eugene A.; Lyamin, Andrey V.

    2016-01-01

    This paper describes an approach to using automated assessments in online courses. The Open edX platform is used as the online course platform. The new assessment type uses Scilab as the learning and solution validation tool. This approach allows automated individual variant generation and automated solution checks without involving the course…

  14. Clustering Dycom

    KAUST Repository

    Minku, Leandro L.

    2017-10-06

    Background: Software Effort Estimation (SEE) can be formulated as an online learning problem, where new projects are completed over time and may become available for training. In this scenario, a Cross-Company (CC) SEE approach called Dycom can drastically reduce the number of Within-Company (WC) projects needed for training, saving the high cost of collecting such training projects. However, Dycom relies on splitting CC projects into different subsets in order to create its CC models. Such splitting can have a significant impact on Dycom's predictive performance. Aims: This paper investigates whether clustering methods can be used to help find good CC splits for Dycom. Method: Dycom is extended to use clustering methods for creating the CC subsets. Three different clustering methods are investigated, namely Hierarchical Clustering, K-Means, and Expectation-Maximisation (EM). Clustering Dycom is compared against the original Dycom with CC subsets of different sizes, based on four SEE databases. A baseline WC model is also included in the analysis. Results: Clustering Dycom with K-Means can potentially help to split the CC projects, managing to achieve similar or better predictive performance than Dycom. However, K-Means still requires the number of CC subsets to be pre-defined, and a poor choice can negatively affect predictive performance. EM enables Dycom to automatically set the number of CC subsets while still maintaining or improving predictive performance with respect to the baseline WC model. Clustering Dycom with Hierarchical Clustering did not offer a significant advantage in terms of predictive performance. Conclusion: Clustering methods can be an effective way to automatically generate Dycom's CC subsets.
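
    The clustering step, grouping CC projects into subsets before building per-subset models, can be illustrated with a tiny one-dimensional K-Means. The single feature (project size) and the choice of k are assumptions for the example, not the paper's actual setup:

```python
def kmeans_1d(values, k, iters=20):
    """Cluster scalar values into k groups; returns a label per value."""
    # Seed centroids by picking spread-out values from the sorted list.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    labels = [0] * len(values)
    for _ in range(iters):
        # Assignment step: each value goes to its nearest centroid.
        labels = [min(range(k), key=lambda j: abs(v - centroids[j]))
                  for v in values]
        # Update step: each centroid moves to the mean of its members.
        for j in range(k):
            members = [v for v, lab in zip(values, labels) if lab == j]
            if members:
                centroids[j] = sum(members) / len(members)
    return labels

# Toy CC project sizes (e.g., KLOC): two small, two medium, two large.
sizes = [1.0, 1.2, 10.0, 11.0, 50.0, 52.0]
labels = kmeans_1d(sizes, k=3)
```

    As the abstract notes, k must still be chosen up front with K-Means, which is exactly the limitation that motivates the EM variant.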

  15. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.

  16. LMC clusters: young

    International Nuclear Information System (INIS)

    Freeman, K.C.

    1980-01-01

    The young globular clusters of the LMC have ages of 10^7-10^8 yr. Their masses and structure are similar to those of the smaller galactic globular clusters. Their stellar mass functions (in the mass range 6 solar masses to 1.2 solar masses) vary greatly from cluster to cluster, although the clusters are similar in total mass, age, structure and chemical composition. It would be very interesting to know why these clusters are forming now in the LMC and not in the Galaxy. The author considers the 'young globular' or 'blue populous' clusters of the LMC. The ages of these objects are 10^7 to 10^8 yr, and their masses are 10^4 to 10^5 solar masses, so they are populous enough to be really useful for studying the evolution of massive stars. The author concentrates on the structure and stellar content of these young clusters. (Auth.)

  17. Major cluster mergers and the location of the brightest cluster galaxy

    International Nuclear Information System (INIS)

    Martel, Hugo; Robichaud, Fidèle; Barai, Paramita

    2014-01-01

    Using a large N-body cosmological simulation combined with a subgrid treatment of galaxy formation, merging, and tidal destruction, we study the formation and evolution of the galaxy and cluster population in a comoving volume (100 Mpc)^3 in a ΛCDM universe. At z = 0, our computational volume contains 1788 clusters with mass M_cl > 1.1 × 10^12 M_☉, including 18 massive clusters with M_cl > 10^14 M_☉. It also contains 1,088,797 galaxies with mass M_gal ≥ 2 × 10^9 M_☉ and luminosity L > 9.5 × 10^5 L_☉. For each cluster, we identified the brightest cluster galaxy (BCG). We then computed two separate statistics: the fraction f_BNC of clusters in which the BCG is not the closest galaxy to the center of the cluster in projection, and the ratio Δv/σ, where Δv is the difference in radial velocity between the BCG and the whole cluster and σ is the radial velocity dispersion of the cluster. We found that f_BNC increases from 0.05 for low-mass clusters (M_cl ∼ 10^12 M_☉) to 0.5 for high-mass clusters (M_cl > 10^14 M_☉) with very little dependence on cluster redshift. Most of this result turns out to be a projection effect, and when we consider three-dimensional distances instead of projected distances, f_BNC increases only to 0.2 at high cluster mass. The values of Δv/σ vary from 0 to 1.8, with median values in the range 0.03-0.15 when considering all clusters, and 0.12-0.31 when considering only massive clusters. These results are consistent with previous observational studies and indicate that the central galaxy paradigm, which states that the BCG should be at rest at the center of the cluster, is usually valid, but exceptions are too common to be ignored. We built merger trees for the 18 most massive clusters in the simulation. Analysis of these trees reveals that 16 of these clusters have experienced 1 or several major or semi-major mergers in the past. These mergers leave each cluster in a non-equilibrium state, but eventually the cluster

  18. Monitoring by Use of Clusters of Sensor-Data Vectors

    Science.gov (United States)

    Iverson, David L.

    2007-01-01

    The inductive monitoring system (IMS) is a system of computer hardware and software for automated monitoring of the performance, operational condition, physical integrity, and other aspects of the health of a complex engineering system (e.g., an industrial process line or a spacecraft). The input to the IMS consists of streams of digitized readings from sensors in the monitored system. The IMS determines the type and amount of any deviation of the monitored system from a nominal or normal ("healthy") condition on the basis of a comparison between (1) vectors constructed from the incoming sensor data and (2) corresponding vectors in a database of nominal or normal behavior. The term "inductive" reflects the use of a process reminiscent of traditional mathematical induction to learn about normal operation and build the nominal-condition database. The IMS offers two major advantages over prior computational monitoring systems: the computational burden of the IMS is significantly smaller, and there is no need for abnormal-condition sensor data for training the IMS to recognize abnormal conditions. The figure schematically depicts the relationships among the computational processes effected by the IMS. Training sensor data are gathered during normal operation of the monitored system, detailed computational simulation of operation of the monitored system, or both. The training data are formed into vectors that are used to generate the database. The vectors in the database are clustered into regions that represent normal or nominal operation. Once the database has been generated, the IMS compares the vectors of incoming sensor data with vectors representative of the clusters. The monitored system is deemed to be operating normally or abnormally, depending on whether the vector of incoming sensor data is or is not, respectively, sufficiently close to one of the clusters. For this purpose, a distance between two vectors is calculated by a suitable metric (e.g., Euclidean
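
    The scheme described, clustering nominal training vectors and then flagging incoming vectors that are far from every cluster, can be sketched as follows. The greedy clustering and fixed radius are simplifications for illustration, not the actual IMS algorithm:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build_clusters(training_vectors, radius):
    """Greedily group nominal vectors into clusters of the given radius."""
    clusters = []
    for v in training_vectors:
        for c in clusters:
            if euclidean(v, c["center"]) <= radius:
                c["members"].append(v)
                n = len(c["members"])
                # Incrementally update the centroid.
                c["center"] = tuple(sum(m[d] for m in c["members"]) / n
                                    for d in range(len(v)))
                break
        else:
            clusters.append({"center": v, "members": [v]})
    return clusters

def is_nominal(vector, clusters, radius):
    """Nominal if the vector lies within `radius` of any cluster centroid."""
    return any(euclidean(vector, c["center"]) <= radius for c in clusters)

# Nominal operation hovers around two operating points.
training = [(1.0, 1.0), (1.1, 0.9), (5.0, 5.0), (5.1, 4.9)]
clusters = build_clusters(training, radius=0.5)
```

    A vector near either operating point is accepted; one between them, never seen in training, is flagged as off-nominal, which mirrors the accept/reject decision in the last sentence above.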

  19. Changing cluster composition in cluster randomised controlled trials: design and analysis considerations

    Science.gov (United States)

    2014-01-01

    Background: There are many methodological challenges in the conduct and analysis of cluster randomised controlled trials, but one that has received little attention is that of post-randomisation changes to cluster composition. To illustrate this, we focus on the issue of cluster merging, considering the impact on the design, analysis and interpretation of trial outcomes. Methods: We explored the effects of merging clusters on study power using standard methods of power calculation. We assessed the potential impacts on study findings of both homogeneous cluster merges (involving clusters randomised to the same arm of a trial) and heterogeneous merges (involving clusters randomised to different arms of a trial) by simulation. To determine the impact on bias and precision of treatment effect estimates, we applied standard methods of analysis to different populations under analysis. Results: Cluster merging produced a systematic reduction in study power. This effect depended on the number of merges and was most pronounced when variability in cluster size was at its greatest. Simulations demonstrate that the impact on analysis was minimal when cluster merges were homogeneous, with the impact on study power being balanced by a change in the observed intracluster correlation coefficient (ICC). We found a decrease in study power when cluster merges were heterogeneous, and the estimate of treatment effect was attenuated. Conclusions: Examples of cluster merges found in previously published reports of cluster randomised trials were typically homogeneous rather than heterogeneous. Simulations demonstrated that trial findings in such cases would be unbiased. However, simulations also showed that any heterogeneous cluster merges would introduce bias that would be hard to quantify, as well as having negative impacts on the precision of estimates obtained. Further methodological development is warranted to better determine how to analyse such trials appropriately. Interim recommendations
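
    The power effects described follow from the standard design effect for cluster designs, DEFF = 1 + ((cv^2 + 1)*m_bar - 1)*ICC, where m_bar is the mean cluster size and cv its coefficient of variation: merging clusters raises m_bar (and often cv), inflating DEFF and shrinking the effective sample size. A sketch of that calculation (the cluster counts and ICC below are illustrative):

```python
import statistics

def design_effect(cluster_sizes, icc):
    """Design effect allowing for variable cluster sizes."""
    m_bar = statistics.mean(cluster_sizes)
    cv = statistics.pstdev(cluster_sizes) / m_bar  # coefficient of variation
    return 1 + ((cv ** 2 + 1) * m_bar - 1) * icc

def effective_n(cluster_sizes, icc):
    """Equivalent number of independent participants."""
    return sum(cluster_sizes) / design_effect(cluster_sizes, icc)

# The same 160 participants: eight clusters of 20 before merging,
# four clusters of 40 after homogeneous pairwise merges.
before = [20] * 8
after = [40] * 4
```

    With ICC = 0.05, the effective sample size drops from about 82 to about 54 after merging, which is the systematic power loss reported in the Results.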

  20. [Automated analyzer of enzyme immunoassay].

    Science.gov (United States)

    Osawa, S

    1995-09-01

    Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.

  1. Aviation Safety/Automation Program Conference

    Science.gov (United States)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange of current operational problems and program results to date. The Aviation Safety/Automation Program has as its primary goal to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  2. Programmable Automated Welding System (PAWS)

    Science.gov (United States)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  3. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    Science.gov (United States)

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation, identifying the specific requirements of a laboratory, and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  4. Measuring Technology and Mechatronics Automation in Electrical Engineering

    CERN Document Server

    2012-01-01

    Measuring Technology and Mechatronics Automation in Electrical Engineering includes select presentations on measuring technology and mechatronics automation related to electrical engineering, originally presented during the International Conference on Measuring Technology and Mechatronics Automation (ICMTMA2012). This fourth ICMTMA, held at Sanya, China, offered a prestigious international forum for scientists, engineers, and educators to present the state of the art of measuring technology and mechatronics automation research.

  5. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  6. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  7. Translation: Aids, Robots, and Automation.

    Science.gov (United States)

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  8. Automated recognition system for ELM classification in JET

    International Nuclear Information System (INIS)

    Duro, N.; Dormido, R.; Vega, J.; Dormido-Canto, S.; Farias, G.; Sanchez, J.; Vargas, H.; Murari, A.

    2009-01-01

    Edge localized modes (ELMs) are instabilities occurring in the edge of H-mode plasmas. Considerable efforts are being devoted to understanding the physics behind this non-linear phenomenon. A first characterization of ELMs is usually their identification as type I or type III. An automated pattern recognition system has been developed in JET for off-line ELM recognition and classification. The empirical method presented in this paper analyzes each individual ELM instead of starting from a temporal segment containing many ELM bursts. The ELM recognition and isolation is carried out using three signals: Dα, line integrated electron density and stored diamagnetic energy. A reduced set of characteristics (such as diamagnetic energy drop, ELM period or Dα shape) has been extracted to build supervised and unsupervised learning systems for classification purposes. The former are based on support vector machines (SVM). The latter have been developed with hierarchical and K-means clustering methods. The success rate of the classification systems is about 98% for a database of almost 300 ELMs.
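    The unsupervised route described above (the paper also uses SVMs for the supervised route) can be illustrated with a toy k-means clustering of ELMs in a reduced feature space. This is not the JET implementation: the feature values and the two synthetic ELM populations below are invented for illustration.

```python
# Toy sketch: cluster ELMs in a reduced feature space
# (energy drop, ELM period, D-alpha shape descriptor) with plain k-means.
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic ELM populations standing in for type I and type III
type_I = rng.normal([8.0, 20.0, 1.5], [1.0, 3.0, 0.3], size=(100, 3))
type_III = rng.normal([2.0, 5.0, 0.5], [0.5, 1.0, 0.1], size=(100, 3))
X = np.vstack([type_I, type_III])

def kmeans(X, k, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each ELM to the nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centers as cluster means, keeping a center if its
        # cluster happens to be empty
        new = []
        for j in range(k):
            pts = X[labels == j]
            new.append(pts.mean(axis=0) if len(pts) else centers[j])
        centers = np.array(new)
    return labels, centers

labels, centers = kmeans(X, k=2)
# With well-separated populations the two clusters recover the two types
purity = max((labels[:100] == 0).mean(), (labels[:100] == 1).mean())
print(f"cluster purity on first population: {purity:.2f}")
```

    The real system reports about 98% classification success; the point of the sketch is only that a small, physically motivated feature set can already separate the two ELM types.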

  9. Toward a human-centered aircraft automation philosophy

    Science.gov (United States)

    Billings, Charles E.

    1989-01-01

    The evolution of automation in civil aircraft is examined in order to discern trends in the respective roles and functions of automation technology and the humans who operate these aircraft. The effects of advances in automation technology on crew reaction is considered and it appears that, though automation may well have decreased the frequency of certain types of human errors in flight, it may also have enabled new categories of human errors, some perhaps less obvious and therefore more serious than those it has alleviated. It is suggested that automation could be designed to keep the pilot closer to the control of the vehicle, while providing an array of information management and aiding functions designed to provide the pilot with data regarding flight replanning, degraded system operation, and the operational status and limits of the aircraft, its systems, and the physical and operational environment. The automation would serve as the pilot's assistant, providing and calculating data, watching for the unexpected, and keeping track of resources and their rate of expenditure.

  10. Automated packing systems: review of industrial implementations

    Science.gov (United States)

    Whelan, Paul F.; Batchelor, Bruce G.

    1993-08-01

    A rich theoretical background to the problems that occur in the automation of material handling can be found in the operations research, production engineering, systems engineering, and automation (more specifically, machine vision) literature. This work has contributed towards the design of intelligent handling systems. This paper reviews the application of these automated material handling and packing techniques to industrial problems. The discussion also highlights the systems integration issues involved in these applications. One such industrial application, the automated placement of shape templates onto leather hides, is also discussed. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize leather waste, before they are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high-quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper outlines the problems involved in the full automation of such a procedure.

  11. Cluster evolution

    International Nuclear Information System (INIS)

    Schaeffer, R.

    1987-01-01

    The galaxy and cluster luminosity functions are constructed from a model of the mass distribution based on hierarchical clustering at an epoch where the matter distribution is non-linear. These luminosity functions are seen to reproduce the present distribution of objects as can be inferred from the observations. They can be used to deduce the redshift dependence of the cluster distribution and to extrapolate the observations towards the past. The predicted evolution of the cluster distribution is quite strong, although somewhat less rapid than predicted by the linear theory.

  12. Automated assessment and tracking of human body thermal variations using unsupervised clustering.

    Science.gov (United States)

    Yousefi, Bardia; Fleuret, Julien; Zhang, Hai; Maldague, Xavier P V; Watt, Raymond; Klein, Matthieu

    2016-12-01

    The presented approach reviews the overheating that occurs during radiological examinations, such as magnetic resonance imaging, and a series of thermal experiments to determine a thermally suitable fabric material for radiological gowns. Moreover, an automatic system for detecting and tracking thermal fluctuations is presented. It applies hue-saturation-value (HSV)-based kernelled k-means clustering, which initializes and controls the points that lie on the region-of-interest (ROI) boundary. Afterward, a particle filter tracks the targeted ROI during the video sequence independently of previous locations of overheating spots. The proposed approach was tested during experiments and under conditions very similar to those used during real radiology exams. Six subjects voluntarily participated in these experiments. To simulate the hot spots occurring during radiology, a controllable heat source was placed near the subject's body. The results indicate promising accuracy for the proposed approach to track hot spots. Approximations were used for the transmittance of the atmosphere, and the emissivity of the fabric could be neglected, because the proposed approach is independent of these parameters. The approach can track the heating spots continuously and correctly, even for moving subjects, and provides considerable robustness against motion artifacts, which occur during most medical radiology procedures.
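    As a greatly simplified stand-in for the clustering step described above (the paper works on HSV representations of real thermograms with a kernelled k-means; here a synthetic intensity frame and plain 1-D k-means are assumptions for illustration), the hot region can be isolated by clustering pixel values into two groups and taking the bounding box of the hotter cluster:

```python
# Simplified stand-in for hot-spot segmentation: cluster the pixels of a
# synthetic thermal frame into "hot spot" and "background" with 1-D
# k-means on intensity, then take the hot cluster's bounding box as ROI.
import numpy as np

rng = np.random.default_rng(1)
frame = rng.normal(30.0, 0.5, size=(64, 64))      # body-temperature background
frame[20:30, 40:50] += 8.0                        # injected hot spot

def kmeans_1d(values, k=2, iters=30):
    # Initialize centers spread across the intensity range
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

labels, centers = kmeans_1d(frame.ravel())
hot = labels.reshape(frame.shape) == np.argmax(centers)
rows, cols = np.where(hot)
roi = (rows.min(), rows.max(), cols.min(), cols.max())
print("hot-spot ROI (r0, r1, c0, c1):", roi)
```

    A tracker such as the paper's particle filter would then follow this ROI from frame to frame rather than re-segmenting from scratch.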

  13. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation, Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of the manuscripts considered, listed as follows: Identification and Control; Navigation, Guidance and Sensor; Simulation Technology; Future Telecommunications and Control.

  14. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
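    The abstract does not show Taxi's API, so the following is a generic, minimal sketch (all names invented, not Taxi's interface) of the core idea a lattice workflow manager automates: declaring tasks such as "generate configuration" and "measure observable" with dependencies, then running them in dependency order while skipping work already completed.

```python
# Minimal generic workflow-manager sketch: tasks with dependencies,
# executed in dependency order, with completed tasks skipped on resume.
class Workflow:
    def __init__(self):
        self.tasks, self.deps, self.done = {}, {}, set()

    def add(self, name, fn, deps=()):
        self.tasks[name], self.deps[name] = fn, tuple(deps)

    def run(self, name):
        if name in self.done:
            return                      # resume: skip completed tasks
        for dep in self.deps[name]:
            self.run(dep)               # ensure dependencies ran first
        self.tasks[name]()
        self.done.add(name)

log = []
wf = Workflow()
for i in range(3):
    # Each measurement depends on the configuration it measures on
    wf.add(f"config_{i}", lambda i=i: log.append(f"generate config {i}"))
    wf.add(f"measure_{i}", lambda i=i: log.append(f"measure on config {i}"),
           deps=(f"config_{i}",))
for i in range(3):
    wf.run(f"measure_{i}")
print(log)
```

    A real manager like Taxi adds persistence, job-queue submission, and failure recovery on top of this dependency-ordering core.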

  15. Studying human-automation interactions: methodological lessons learned from the human-centred automation experiments 1997-2001

    International Nuclear Information System (INIS)

    Massaiu, Salvatore; Skjerve, Ann Britt Miberg; Skraaning, Gyrd Jr.; Strand, Stine; Waeroe, Irene

    2004-04-01

    This report documents the methodological lessons learned from the Human Centred Automation (HCA) programme, both in terms of psychometric evaluation of the measurement techniques developed for the study of human-automation interaction and in terms of the application of advanced statistical methods for the analysis of experiments. The psychometric evaluation is based on data from the four experiments performed within the HCA programme. The result is a single-source reference text of measurement instruments for the study of human-automation interaction, part of which was specifically developed by the programme. The application of advanced statistical techniques is exemplified by additional analyses performed on the IPSN-HCA experiment of 1998. Special importance is given to the statistical technique of structural equation modeling, for the possibility it offers to advance, and empirically test, comprehensive explanations of human-automation interactions. The additional analyses of the IPSN-HCA experiment investigated how the operators formed judgments about their own performance. The issue is of substantive interest for human-automation interaction research because the operators' over- or underestimation of their own performance could be seen as a symptom of human-machine mismatch, and a potential latent failure. These analyses concluded that it is the interplay between several factors that determines the operators' bias in performance self-estimation: (1) the level of automation, (2) the nature of the task, (3) the level of scenario complexity, and (4) the level of trust in the automatic system. A structural model that expresses the interplay of all these factors was empirically evaluated and was found to provide a concise and elegant explanation of the intricate pattern of relationships between the identified factors. (Author)

  16. Comparison of vehicle types at an automated container terminal

    NARCIS (Netherlands)

    Vis, I.F.A.; Harika, I.

    2004-01-01

    At automated container terminals, containers are transshipped from one mode of transportation to another. Automated vehicles transport containers from the stack to the ship and vice versa. Two different types of automated vehicles are studied in this paper, namely automated lifting vehicles and

  17. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    Science.gov (United States)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.

  18. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    Science.gov (United States)

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  19. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    Science.gov (United States)

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  20. Automation of orders in taxi service

    OpenAIRE

    Simčič, Matej

    2012-01-01

    Automation has grown rapidly in recent years. The advantages it brings are cost reduction and faster, better performance of tasks that would otherwise be done by humans. It began in the manufacturing industry and later expanded to other sectors. Today's technology allows the implementation of automation in a wide range of areas. The thesis deals with the implementation of a system that allows automated ordering of a taxi. The system consists of four components. They are two mobile app...

  1. Cluster-cluster aggregation of Ising dipolar particles under thermal noise

    KAUST Repository

    Suzuki, Masaru

    2009-08-14

    The cluster-cluster aggregation processes of Ising dipolar particles under thermal noise are investigated in the dilute condition. As the temperature increases, changes in the typical structures of clusters are observed from chainlike (D = 1) to crystalline (D = 2) through fractal structures (D = 1.45), where D is the fractal dimension. By calculating the bending energy of the chainlike structure, it is found that the transition temperature is associated with the energy gap between the chainlike and crystalline configurations. The aggregation dynamics changes from being dominated by attraction to being dominated by diffusion, with the dynamic exponent changing from z=0.2 to 0.5. In the region of temperature where the fractal clusters grow, different growth rates are observed between charged and neutral clusters. Using the Smoluchowski equation with a twofold kernel, this hetero-aggregation process is found to result from two types of dynamics: the diffusive motion of neutral clusters and the weak attractive motion between charged clusters. The fact that changes in structures and dynamics take place at the same time suggests that transitions in the structure of clusters involve marked changes in the dynamics of the aggregation processes. © 2009 The American Physical Society.
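    For reference, the Smoluchowski coagulation equation invoked above has the standard mean-field form (this is the textbook form; the paper's specific twofold kernel, distinguishing charged-charged from neutral pairs, is described only qualitatively in the abstract):

```latex
\frac{\partial n_k}{\partial t}
  = \frac{1}{2}\sum_{i+j=k} K_{ij}\, n_i n_j
  \;-\; n_k \sum_{j \ge 1} K_{kj}\, n_j
```

    Here \(n_k\) is the number density of clusters of size \(k\) and \(K_{ij}\) is the aggregation kernel; the gain term counts mergers producing size \(k\) (the factor 1/2 avoids double counting) and the loss term counts mergers consuming size \(k\). A twofold kernel assigns different rates to the two pair types: diffusion-limited for neutral clusters and attraction-driven for charged ones.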

  2. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  3. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  4. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses on H&E stains are counted manually in hot spots. Yet, its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which may also enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm(2) square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm(2) (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77 % of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95 % CI, 1.3-24, P = 0.024) as opposed to 1.3 (95 % CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated with their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm is required for prognostic purposes. Thus, automated
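    A hypothetical sketch of the automated hot-spot selection step as the abstract describes it: slide a fixed square (standing in for the 1-mm(2) field) over the tumor area and keep the position containing the most positive cells. The slide size, window size, and cell coordinates below are invented; this is not the study's algorithm.

```python
# Sketch: densest fixed-size window over detected positive-cell positions,
# found in one pass with a 2-D summed-area table.
import numpy as np

rng = np.random.default_rng(2)
# Synthetic positive-cell coordinates on a 1000 x 1000 pixel slide,
# with an artificially dense pocket acting as the true hot spot
cells = np.vstack([rng.uniform(0, 1000, size=(200, 2)),
                   rng.uniform(600, 700, size=(80, 2))])

def hot_spot(cells, slide=1000, window=100):
    """Return (x0, y0, count) of the densest window x window square."""
    grid = np.zeros((slide, slide))
    xi = cells[:, 0].astype(int).clip(0, slide - 1)
    yi = cells[:, 1].astype(int).clip(0, slide - 1)
    np.add.at(grid, (xi, yi), 1)          # histogram cells onto the grid
    # Summed-area table gives every window sum cheaply
    s = np.pad(grid.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    counts = (s[window:, window:] - s[:-window, window:]
              - s[window:, :-window] + s[:-window, :-window])
    x0, y0 = np.unravel_index(np.argmax(counts), counts.shape)
    return x0, y0, int(counts[x0, y0])

x0, y0, n = hot_spot(cells)
print(f"hot spot at ({x0}, {y0}) with {n} positive cells")
```

    The study's pathologists did the same thing by eye; the 77% overlap reported above is between such an exhaustive automated search and the manually chosen square.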

  5. Adaptive Automation Based on Air Traffic Controller Decision-Making

    NARCIS (Netherlands)

    IJtsma (Student TU Delft), Martijn; Borst, C.; Mercado Velasco, G.A.; Mulder, M.; van Paassen, M.M.; Tsang, P.S.; Vidulich, M.A.

    2017-01-01

    Through smart scheduling and triggering of automation support, adaptive automation has the potential to balance air traffic controller workload. The challenge in the design of adaptive automation systems is to decide how and when the automation should provide support. This paper describes the design

  6. Diversity among galaxy clusters

    International Nuclear Information System (INIS)

    Struble, M.F.; Rood, H.J.

    1988-01-01

    The classification of galaxy clusters is discussed. Consideration is given to the classification schemes of Abell (1950s), Zwicky (1950s), Morgan, Matthews, and Schmidt (1964), and Morgan-Bautz (1970). Galaxies can be classified based on morphology, chemical composition, spatial distribution, and motion. The correlation between a galaxy's environment and morphology is examined. The classification scheme of Rood-Sastry (1971), which is based on cluster morphology and galaxy population, is described. The six types of clusters they define are: (1) a cD-cluster dominated by a single large galaxy, (2) a cluster dominated by a binary, (3) a core-halo cluster, (4) a cluster dominated by several bright galaxies, (5) a cluster appearing flattened, and (6) an irregularly shaped cluster. Attention is also given to the evolution of cluster structures, which is related to initial density and cluster motion.

  7. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  8. Inventory management and reagent supply for automated chemistry.

    Science.gov (United States)

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  9. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...... is applied to nearly all types of measurements today....

  10. Cluster-cluster aggregation of Ising dipolar particles under thermal noise

    KAUST Repository

    Suzuki, Masaru; Kun, Ferenc; Ito, Nobuyasu

    2009-01-01

    The cluster-cluster aggregation processes of Ising dipolar particles under thermal noise are investigated in the dilute condition. As the temperature increases, changes in the typical structures of clusters are observed from chainlike (D1

  11. Re-estimating sample size in cluster randomized trials with active recruitment within clusters

    NARCIS (Netherlands)

    van Schie, Sander; Moerbeek, Mirjam

    2014-01-01

    Often only a limited number of clusters can be obtained in cluster randomised trials, although many potential participants can be recruited within each cluster. Thus, active recruitment is feasible within the clusters. To obtain an efficient sample size in a cluster randomised trial, the cluster
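    The sample-size logic behind such trials can be illustrated with the standard design effect for cluster randomisation. This is the textbook formula, not the re-estimation procedure the paper proposes, and the numbers are invented for illustration.

```python
# Textbook design-effect calculation for a cluster randomised trial:
# DEFF = 1 + (m - 1) * ICC, where m is the average cluster size and
# ICC is the intracluster correlation coefficient.
def clustered_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomised sample size by the design effect."""
    deff = 1 + (cluster_size - 1) * icc
    return n_individual * deff, deff

# 400 participants would suffice under individual randomisation;
# with clusters of 20 and ICC = 0.05 the requirement nearly doubles.
n, deff = clustered_sample_size(n_individual=400, cluster_size=20, icc=0.05)
print(f"design effect = {deff:.2f}, required participants = {round(n)}")
```

    Active recruitment within clusters changes the achievable cluster size m during the trial, which is exactly why re-estimating the sample size mid-trial, as the paper studies, can matter.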

  12. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.

    Science.gov (United States)

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2016-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from the ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that makes integration difficult. Recently, there has been a push to apply the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper is the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility is studied, with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.
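    The OEE metric mentioned above has a standard three-factor decomposition (availability x performance x quality); the shift numbers below are invented for illustration, not taken from the paper's aerospace case study.

```python
# Standard OEE decomposition: availability x performance x quality.
def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time              # uptime fraction
    performance = (ideal_cycle_time * total_count) / run_time  # speed fraction
    quality = good_count / total_count                  # first-pass yield
    return availability * performance * quality

# An 8-hour (480 min) shift: 60 min of downtime, 1-min ideal cycle time,
# 380 parts produced, 361 of them good.
value = oee(planned_time=480, run_time=420, ideal_cycle_time=1.0,
            total_count=380, good_count=361)
print(f"OEE = {value:.1%}")
```

    Real-time MTConnect status streams supply exactly the inputs this formula needs (downtime, cycle counts, reject counts), which is what makes continuous OEE monitoring feasible.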

  13. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  14. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major changes in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about the customers is gathered ...

  15. Benefits of Ilizarov automated bone distraction for nerves and articular cartilage in experimental leg lengthening.

    Science.gov (United States)

    Shchudlo, Nathalia; Varsegova, Tatyana; Stupina, Tatyana; Shchudlo, Michael; Saifutdinov, Marat; Yemanov, Andrey

    2017-09-18

    To determine peculiarities of tissue responses in nerves and articular cartilage to manual and automated Ilizarov bone distraction. Twenty-nine dogs were divided into two experimental groups, Group M (leg lengthening with manual distraction, 1 mm/d in 4 steps) and Group A (automated distraction, 1 mm/d in 60 steps), plus an intact control group. Animals were euthanized at the end of distraction, on the 30th day of fixation in the apparatus, and 30 d after fixator removal. M-responses in the gastrocnemius and tibialis anterior muscles were recorded; numerical histology of the peroneal and tibial nerves and of knee cartilage semi-thin sections, scanning electron microscopy, and X-ray electron probe microanalysis were performed. Better restoration of M-response amplitudes in leg muscles was noted in the A-group. Fibrosis of the epineurium with adipocyte loss in the peroneal nerve, and subperineurial edema and fibrosis of the endoneurium in some fascicles of both nerves, were noted only in the M-group; the proportions of nerve fibers with atrophic and degenerative changes were larger in the M-group than in the A-group. At the end of the experiment, morphometric parameters of nerve fibers in the peroneal nerve were comparable with the intact nerve only in the A-group. Quantitative parameters of articular cartilage (thickness, volumetric densities of chondrocytes, percentages of isogenic clusters and empty cellular lacunae, contents of sulfur and calcium) were markedly altered in the M-group and less affected in the A-group. Automated Ilizarov distraction is a safer method of orthopedic leg lengthening than manual distraction with respect to nerve fiber survival and arthrotic changes in articular cartilage.

  16. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Full Text Available The automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.

  17. 12 CFR 205.16 - Disclosures at automated teller machines.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine at...

  18. HICOSMO - cosmology with a complete sample of galaxy clusters - I. Data analysis, sample selection and luminosity-mass scaling relation

    Science.gov (United States)

    Schellenberger, G.; Reiprich, T. H.

    2017-08-01

    The X-ray regime, where the most massive visible component of galaxy clusters, the intracluster medium, is visible, offers directly measured quantities, like the luminosity, and derived quantities, like the total mass, to characterize these objects. The aim of this project is to analyse a complete sample of galaxy clusters in detail and constrain cosmological parameters, like the matter density, Ωm, or the amplitude of initial density fluctuations, σ8. The purely X-ray flux-limited sample (HIFLUGCS) consists of the 64 X-ray brightest galaxy clusters, which are excellent targets for studying the systematic effects that can bias results. We analysed in total 196 Chandra observations of the 64 HIFLUGCS clusters, with a total exposure time of 7.7 Ms. Here, we present our data analysis procedure (including an automated substructure detection and an energy band optimization for surface brightness profile analysis) that gives individually determined, robust total mass estimates. These masses are tested against dynamical and Planck Sunyaev-Zeldovich (SZ) derived masses of the same clusters, with good overall agreement found with the dynamical masses. The Planck SZ masses seem to show a mass-dependent bias relative to our hydrostatic masses; possible biases in this mass-mass comparison are discussed, including the Planck selection function. Furthermore, we show the results for the (0.1-2.4) keV luminosity versus mass scaling relation. The overall slope of the sample (1.34) is in agreement with expectations and values from the literature. Splitting the sample into galaxy groups and clusters reveals, even after a selection bias correction, that galaxy groups exhibit a significantly steeper slope (1.88) than clusters (1.06).
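
The slopes quoted above come from fitting the luminosity-mass relation as a power law, i.e. a straight line in log-log space. A hedged sketch of such a fit follows; the cluster masses and the 1.34 slope used to generate the luminosities are synthetic illustrations, not HIFLUGCS measurements:

```python
import math

def fit_loglog_slope(masses, luminosities):
    """Least-squares slope of log10(L) against log10(M)."""
    xs = [math.log10(m) for m in masses]
    ys = [math.log10(l) for l in luminosities]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

masses = [1e13, 5e13, 1e14, 5e14, 1e15]        # solar masses (synthetic)
lums = [m ** 1.34 * 1e-10 for m in masses]     # exact power law, slope 1.34
slope = fit_loglog_slope(masses, lums)
```

A real analysis would additionally account for measurement errors and the selection bias correction discussed in the abstract.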

  19. Automated data acquisition technology development: Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this report, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  20. Logistic control in automated transportation networks

    NARCIS (Netherlands)

    Ebben, Mark

    2001-01-01

    Increasing congestion problems lead to a search for alternative transportation systems. Automated transportation networks, possibly underground, are an option. Logistic control systems are essential for future implementations of such automated transportation networks. This book contributes to the

  1. Text Mining in Biomedical Domain with Emphasis on Document Clustering.

    Science.gov (United States)

    Renganathan, Vinaitheerthan

    2017-07-01

    With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise.
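
The clustering step described above typically starts from TF-IDF weighting and a vector-space similarity measure. A toy sketch on a made-up three-document corpus (whitespace tokenisation is a simplifying assumption; real pipelines add stemming and stop-word removal):

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors (sparse dicts) for a list of whitespace-tokenised docs."""
    df = Counter(t for d in docs for t in set(d.split()))  # document frequency
    n = len(docs)
    vecs = []
    for d in docs:
        tf = Counter(d.split())
        vecs.append({t: c * math.log(n / df[t]) for t, c in tf.items()})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["gene expression cancer", "tumor gene cancer", "heart rate exercise"]
vecs = tfidf(docs)
```

Here the two oncology-flavoured documents score a higher mutual similarity than either does against the cardiology one, which is exactly the signal a clustering algorithm exploits.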

  2. Multi-Optimisation Consensus Clustering

    Science.gov (United States)

    Li, Jian; Swift, Stephen; Liu, Xiaohui

    Ensemble Clustering has been developed to provide an alternative way of obtaining more stable and accurate clustering results. It aims to avoid the biases of individual clustering algorithms. However, it is still a challenge to develop an efficient and robust method for Ensemble Clustering. Based on an existing ensemble clustering method, Consensus Clustering (CC), this paper introduces an advanced Consensus Clustering algorithm called Multi-Optimisation Consensus Clustering (MOCC), which utilises an optimised Agreement Separation criterion and a Multi-Optimisation framework to improve the performance of CC. Fifteen different data sets are used for evaluating the performance of MOCC. The results reveal that MOCC can generate more accurate clustering results than the original CC algorithm.
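
The consensus idea behind CC (and hence MOCC) can be sketched with a co-association matrix: items that land in the same cluster across a majority of base partitions are merged. The toy partitions and the 0.5 threshold below are illustrative assumptions; MOCC's Agreement Separation criterion and multi-optimisation framework are not reproduced:

```python
def co_association(partitions, n):
    """co[i][j] = fraction of base partitions placing items i and j together."""
    co = [[0.0] * n for _ in range(n)]
    for labels in partitions:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    co[i][j] += 1.0 / len(partitions)
    return co

def consensus(partitions, n, threshold=0.5):
    """Merge items whose co-association exceeds the threshold (single link)."""
    co = co_association(partitions, n)
    labels = list(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if co[i][j] > threshold:
                old, new = labels[j], labels[i]
                labels = [new if l == old else l for l in labels]
    return labels

# three base clusterings of four items; the third disagrees on item 1
parts = [[0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]]
final = consensus(parts, 4)
```

The majority vote recovers the {0,1} / {2,3} split despite the dissenting base partition.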

  3. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  4. Aviation safety/automation program overview

    Science.gov (United States)

    Morello, Samuel A.

    1990-01-01

    The goal is to provide a technology base leading to improved safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers. Information on the problems, specific objectives, human-automation interaction, intelligent error-tolerant systems, and air traffic control/cockpit integration is given in viewgraph form.

  5. Electron: Cluster interactions

    International Nuclear Information System (INIS)

    Scheidemann, A.A.; Knight, W.D.

    1994-02-01

    Beam depletion spectroscopy has been used to measure absolute total inelastic electron-sodium cluster collision cross sections in the energy range from E ∼ 0.1 to E ∼ 6 eV. The investigation focused on the closed-shell clusters Na8, Na20 and Na40. The measured cross sections show an increase for the lowest collision energies, where electron attachment is the primary scattering channel. The electron attachment cross section can be understood in terms of Langevin scattering, connecting this measurement with the polarizability of the cluster. For energies above the dissociation energy the measured electron-cluster cross section is energy independent, thus defining an electron-cluster interaction range. This interaction range increases with the cluster size.

  6. Lighting Automation Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Toni A.; Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting, and future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.
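
The DMX512 data packet discussed above is, at the slot level, a NULL start code followed by up to 512 eight-bit channel values. A hedged sketch of that slot layout follows; the physical-layer break/mark-after-break signalling and any fixture channel map are omitted, and the helper name is an assumption:

```python
def dmx_frame(levels, channels=512):
    """Build the slot bytes of a DMX512 packet: NULL start code + channel levels."""
    if len(levels) > channels:
        raise ValueError("a DMX512 universe carries at most 512 slots")
    body = bytes(levels) + bytes(channels - len(levels))  # pad unused slots with 0
    return bytes([0x00]) + body                           # 0x00 = NULL start code

# e.g. dim channel 1 to 50% and drive channel 2 to full
frame = dmx_frame([128, 255])
```

A controller would transmit this byte sequence (preceded by the break) on the RS-485 line at 250 kbit/s.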

  7. Lighting Automation - Flying an Earthlike Habitat

    Science.gov (United States)

    Clark, Tori A. (Principal Investigator); Kolomenski, Andrei

    2017-01-01

    Currently, spacecraft lighting systems are not demonstrating innovations in automation due to perceived costs in designing circuitry for the communication and automation of lights. The majority of spacecraft lighting systems employ lamps or zone specific manual switches and dimmers. This type of 'hardwired' solution does not easily convert to automation. With advances in solid state lighting, the potential to enhance a spacecraft habitat is lost if the communication and automation problem is not tackled. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. This project researched the use of the DMX512 communication protocol originally developed for high channel count lighting systems. DMX512 is an internationally governed, industry-accepted, lighting communication protocol with wide industry support. The lighting industry markets a wealth of hardware and software that utilizes DMX512, and there may be incentive to space certify the system. Our goal in this research is to enable the development of automated spacecraft habitats for long duration missions. To transform how spacecraft lighting environments are automated, our project conducted a variety of tests to determine a potential scope of capability. We investigated utilization and application of an industry accepted lighting control protocol, DMX512 by showcasing how the lighting system could help conserve power, assist with lighting countermeasures, and utilize spatial body tracking. We hope evaluation and the demonstrations we built will inspire other NASA engineers, architects and researchers to consider employing DMX512 "smart lighting" capabilities into their system architecture. 
By using DMX512 we will prove the 'wheel' does not need to be reinvented in terms of smart lighting, and future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.

  8. Automated exchange transfusion and exchange rate.

    Science.gov (United States)

    Funato, M; Shimada, S; Tamai, H; Taki, H; Yoshioka, Y

    1989-10-01

    An automated blood exchange transfusion (BET) with a two-site technique has been devised by Goldmann et al and by us, using an infusion pump. With this method, we successfully performed exchange transfusions 189 times in the past four years on 110 infants with birth weights ranging from 530 g to 4,000 g. The exchange rate by the automated method was compared with the rate by Diamond's method. Serum bilirubin (SB) levels before and after BET and the maximal SB rebound within 24 hours after BET were: 21.6 +/- 2.4, 11.5 +/- 2.2, and 15.0 +/- 1.5 mg/dl in the automated method, and 22.0 +/- 2.9, 11.2 +/- 2.5, and 17.7 +/- 3.2 mg/dl in Diamond's method, respectively. The result showed that the maximal rebound of the SB level within 24 hours after BET was significantly lower in the automated method than in Diamond's method (p less than 0.01), though SB levels before and after BET were not significantly different between the two methods. The exchange rate was also measured by means of staining the fetal red cells (F cells) both in the automated method and in Diamond's method, and comparing them. The exchange rate of F cells in Diamond's method went down along the theoretical exchange curve proposed by Diamond, while the rate in the automated method was significantly better than in Diamond's, especially in the early stage of BET (p less than 0.01). We believe that the use of this automated method may give better results than Diamond's method in the rate of exchange, because this method is performed with a two-site technique using a peripheral artery and vein.
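
The "theoretical exchange curve" contrast above can be sketched with the classic dilution models: repeated discrete aliquot exchanges versus continuous simultaneous withdrawal and infusion. These are textbook mixing formulas offered only as an illustration of the shape of such curves; the volumes are made up and nothing here is a clinical parameter from the study:

```python
import math

def residual_discrete(blood_volume, aliquot, total_exchanged):
    """Fraction of original blood left after repeated push-pull aliquot exchanges."""
    n = total_exchanged / aliquot
    return (1 - aliquot / blood_volume) ** n

def residual_continuous(blood_volume, total_exchanged):
    """Fraction left under continuous simultaneous withdrawal and infusion."""
    return math.exp(-total_exchanged / blood_volume)

# illustrative double-volume exchange: 240 mL blood volume, 10 mL aliquots
discrete = residual_discrete(240, 10, 480)
continuous = residual_continuous(240, 480)
```

In the aliquot model the residual fraction tends to exp(-v/V) as the aliquot size shrinks, which is why the two curves nearly coincide for small exchange steps.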

  9. Semantic based cluster content discovery in description first clustering algorithm

    International Nuclear Information System (INIS)

    Khan, M.W.; Asif, H.M.S.

    2017-01-01

    In the field of data analytics, grouping similar documents in textual data is a challenging problem. Much work has been done in this field and many algorithms have been proposed. One category of algorithms first groups the documents on the basis of similarity and then assigns meaningful labels to those groups. Description-first clustering algorithms belong to the category in which a meaningful description is deduced first and then relevant documents are assigned to that description. LINGO (Label Induction Grouping Algorithm) is a description-first clustering algorithm used for the automatic grouping of documents obtained from search results. It uses LSI (Latent Semantic Indexing), an IR (Information Retrieval) technique, for the induction of meaningful cluster labels, and VSM (Vector Space Model) for cluster content discovery. In this paper we present LINGO using LSI during both the cluster label induction and cluster content discovery phases. Finally, we compare the results obtained when the algorithm uses VSM with those obtained using latent semantic analysis during the cluster content discovery phase. (author)

  10. The clustered nucleus-cluster structures in stable and unstable nuclei

    International Nuclear Information System (INIS)

    Freer, Martin

    2007-01-01

    The subject of clustering has a lineage which runs throughout the history of nuclear physics. Its attraction is the simplification of the often uncorrelated behaviour of independent particles to organized and coherent quasi-crystalline structures. In this review the ideas behind the development of clustering in light nuclei are investigated, mostly from the stand-point of the harmonic oscillator framework. This allows a unifying description of alpha-conjugate and neutron-rich nuclei, alike. More sophisticated models of clusters are explored, such as antisymmetrized molecular dynamics. A number of contemporary topics in clustering are touched upon; the 3α-cluster state in 12C, nuclear molecules and clustering at the drip-line. Finally, an understanding of the 12C+12C resonances in 24Mg, within the framework of the theoretical ideas developed in the review, is presented

  11. Regional Innovation Clusters

    Data.gov (United States)

    Small Business Administration — The Regional Innovation Clusters serve a diverse group of sectors and geographies. Three of the initial pilot clusters, termed Advanced Defense Technology clusters,...

  12. Choosing the Number of Clusters in K-Means Clustering

    Science.gov (United States)

    Steinley, Douglas; Brusco, Michael J.

    2011-01-01

    Steinley (2007) provided a lower bound for the sum-of-squares error criterion function used in K-means clustering. In this article, on the basis of the lower bound, the authors propose a method to distinguish between 1 cluster (i.e., a single distribution) versus more than 1 cluster. Additionally, conditional on indicating there are multiple…
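
The 1-cluster-versus-more decision can be illustrated by comparing the k-means sum-of-squares error (SSE) at k = 1 and k = 2: on clearly bimodal data the drop is dramatic. This is only a toy illustration on synthetic 1-D data, not Steinley's lower bound itself:

```python
def kmeans_sse(data, centers, iters=20):
    """Run Lloyd's algorithm on 1-D data and return the final SSE."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for x in data:
            i = min(range(len(centers)), key=lambda c: (x - centers[c]) ** 2)
            groups[i].append(x)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return sum(min((x - c) ** 2 for c in centers) for x in data)

data = [1.0, 1.2, 0.8, 9.0, 9.3, 8.7]            # two well-separated groups
sse1 = kmeans_sse(data, [sum(data) / len(data)])  # k = 1: SSE = 96.26
sse2 = kmeans_sse(data, [min(data), max(data)])   # k = 2: SSE = 0.26
```

A single-distribution hypothesis would predict no such collapse in SSE when moving from one to two centers.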

  13. Lighting Automation - Flying an Earthlike Habit Project

    Science.gov (United States)

    Falker, Jay; Howard, Ricky; Culbert, Christopher; Clark, Toni Anne; Kolomenski, Andrei

    2017-01-01

    Our proposal will enable the development of automated spacecraft habitats for long duration missions. The majority of spacecraft lighting systems employ lamps or zone-specific switches and dimmers; automation is not in the picture. If we are to build long duration environments, which provide earth-like habitats, minimize crew time, and optimize spacecraft power reserves, innovation in lighting automation is a must. To transform how spacecraft lighting environments are automated, we will provide performance data on a standard lighting communication protocol. We will investigate the utilization and application of an industry-accepted lighting control protocol, DMX512. We will demonstrate how lighting automation can conserve power, assist with lighting countermeasures, and utilize spatial body tracking. By using DMX512 we will prove the "wheel" does not need to be reinvented in terms of smart lighting and future spacecraft can use a standard lighting protocol to produce an effective, optimized and potentially earthlike habitat.

  14. Personalized PageRank Clustering: A graph clustering algorithm based on random walks

    Science.gov (United States)

    A. Tabrizi, Shayan; Shakery, Azadeh; Asadpour, Masoud; Abbasi, Maziar; Tavallaie, Mohammad Ali

    2013-11-01

    Graph clustering has been an essential part in many methods and thus its accuracy has a significant effect on many applications. In addition, exponential growth of real-world graphs such as social networks, biological networks and electrical circuits demands clustering algorithms with nearly-linear time and space complexity. In this paper we propose Personalized PageRank Clustering (PPC) that employs the inherent cluster exploratory property of random walks to reveal the clusters of a given graph. We combine random walks and modularity to precisely and efficiently reveal the clusters of a graph. PPC is a top-down algorithm so it can reveal inherent clusters of a graph more accurately than other nearly-linear approaches that are mainly bottom-up. It also gives a hierarchy of clusters that is useful in many applications. PPC has a linear time and space complexity and has been superior to most of the available clustering algorithms on many datasets. Furthermore, its top-down approach makes it a flexible solution for clustering problems with different requirements.
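
The random-walk ingredient of PPC can be sketched as personalized PageRank computed by power iteration with restart at a seed vertex; PPC's modularity-based cluster extraction is omitted here. The graph below (two triangles joined by one bridge edge) and all parameter values are illustrative assumptions:

```python
def personalized_pagerank(adj, seed, alpha=0.15, iters=200):
    """Power iteration for PageRank with restart probability alpha at `seed`."""
    n = len(adj)
    p = [0.0] * n
    p[seed] = 1.0
    for _ in range(iters):
        nxt = [alpha if v == seed else 0.0 for v in range(n)]  # restart mass
        for u, nbrs in enumerate(adj):
            share = (1 - alpha) * p[u] / len(nbrs)             # spread the rest
            for v in nbrs:
                nxt[v] += share
        p = nxt
    return p

# two triangles (0-1-2 and 3-4-5) joined by the bridge edge 2-3
adj = [[1, 2], [0, 2], [0, 1, 3], [2, 4, 5], [3, 5], [3, 4]]
scores = personalized_pagerank(adj, seed=0)
```

The probability mass concentrates in the seed's triangle, which is precisely the "cluster exploratory property" of random walks that PPC exploits.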

  15. Home automation as an example of construction innovation

    NARCIS (Netherlands)

    Vlies, R.D. van der; Bronswijk, J.E.M.H. van

    2009-01-01

    Home automation can contribute to the health of (older) adults. Home automation covers a broad field of ‘intelligent’ electronic or mechanical devices in the home (domestic) environment. Realizing home automation is technically possible, though still not common. In this paper main influential

  16. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that does attempt to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself is presented along with a description of how its automated functionality was implemented to facilitate model testing.

  17. Computerized automated remote inspection system

    International Nuclear Information System (INIS)

    The automated inspection system utilizes a computer to control the location of the ultrasonic transducer, the actual inspection process, the display of the data, and the storage of the data on IBM magnetic tape. This automated inspection equipment provides two major advantages. First, it provides a cost savings, because of the reduced inspection time, made possible by the automation of the data acquisition, processing, and storage equipment. This reduced inspection time is also made possible by a computerized data evaluation aid which speeds data interpretation. In addition, the computer control of the transducer location drive allows the exact duplication of a previously located position or flaw. The second major advantage is that the use of automated inspection equipment also allows a higher-quality inspection, because of the automated data acquisition, processing, and storage. This storage of data, in accurate digital form on IBM magnetic tape, for example, facilitates retrieval for comparison with previous inspection data. The equipment provides a multiplicity of scan data which will provide statistical information on any questionable volume or flaw. An automatic alarm for location of all reportable flaws reduces the probability of operator error. This system has the ability to present data on a cathode ray tube as numerical information, a three-dimensional picture, or ''hard-copy'' sheet. One important advantage of this system is the ability to store large amounts of data in compact magnetic tape reels

  18. Clusters in nuclei

    CERN Document Server

    Following the pioneering discovery of alpha clustering and of molecular resonances, the field of nuclear clustering is today one of those domains of heavy-ion nuclear physics that faces the greatest challenges, yet also contains the greatest opportunities. After many summer schools and workshops, in particular over the last decade, the community of nuclear molecular physicists has decided to collaborate in producing a comprehensive collection of lectures and tutorial reviews covering the field. This third volume follows the successful Lect. Notes Phys. 818 (Vol. 1) and 848 (Vol. 2), and comprises six extensive lectures covering the following topics:  - Gamma Rays and Molecular Structure - Faddeev Equation Approach for Three Cluster Nuclear Reactions - Tomography of the Cluster Structure of Light Nuclei Via Relativistic Dissociation - Clustering Effects Within the Dinuclear Model : From Light to Hyper-heavy Molecules in Dynamical Mean-field Approach - Clusterization in Ternary Fission - Clusters in Light N...

  19. Spatial cluster modelling

    CERN Document Server

    Lawson, Andrew B

    2002-01-01

    Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space and space-time, spatial and spatio-temporal process modelling, nonparametric methods for clustering, and spatio-temporal ...

  20. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    Science.gov (United States)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; for example, diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. To analyze vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for the identification and separation of retinal vessel trees, i.e., structural mapping. We therefore propose artery-venous classification based on structural mapping and on the identification of color properties prominent in each vessel type. The mean and standard deviation of green-channel intensity and of hue-channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using the vector of color properties extracted from each centerline pixel, the pixel is classified into one of two clusters (artery and vein) obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is assigned a label of artery or vein. The classification results are compared with a manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in an accuracy of 88.28% correctly classified vessel pixels. The automated classification results match the gold standard well, suggesting the method's potential for artery-venous classification and the respective morphology analysis.
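
The fuzzy C-means step used above assigns each colour-feature vector a soft membership in both clusters. A rough sketch over synthetic 2-D features follows; the (green, hue) values and the naive initialisation from the first two points are assumptions, not the paper's data or exact procedure:

```python
def fcm(points, c=2, m=2.0, iters=50):
    """Fuzzy C-means: returns (memberships, centers) for c soft clusters."""
    centers = [list(p) for p in points[:c]]            # naive initialisation
    for _ in range(iters):
        u = []
        for p in points:
            # Euclidean distances to each center, clamped away from zero
            dists = [max(sum((a - b) ** 2 for a, b in zip(p, ctr)) ** 0.5, 1e-12)
                     for ctr in centers]
            # standard FCM membership update: u_i = 1 / sum_j (d_i/d_j)^(2/(m-1))
            u.append([1.0 / sum((dists[i] / dj) ** (2 / (m - 1)) for dj in dists)
                      for i in range(c)])
        # center update: weighted mean with weights u^m
        centers = [[sum(u[k][i] ** m * points[k][j] for k in range(len(points)))
                    / sum(u[k][i] ** m for k in range(len(points)))
                    for j in range(len(points[0]))]
                   for i in range(c)]
    return u, centers

# synthetic (green intensity, hue) features: two bright/red-ish, two dark/blue-ish
points = [(0.9, 0.1), (0.8, 0.2), (0.2, 0.8), (0.1, 0.9)]
memberships, centers = fcm(points)
```

Each centerline pixel would then be assigned to the cluster with the larger membership, as in the vessel-level voting described above.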