Diffeomorphic Statistical Deformation Models
DEFF Research Database (Denmark)
Hansen, Michael Sass; Hansen, Mads Fogtmann; Larsen, Rasmus
2007-01-01
In this paper we present a new method for constructing diffeomorphic statistical deformation models in arbitrary dimensional images with a nonlinear generative model and a linear parameter space. Our deformation model is a modified version of the diffeomorphic model introduced by Cootes et al....... The modifications ensure that no boundary restriction has to be enforced on the parameter space to prevent folds or tears in the deformation field. For straightforward statistical analysis, principal component analysis and sparse methods, we assume that the parameters for a class of deformations lie on a linear...... with ground truth in form of manual expert annotations, and compared to Cootes's model. We anticipate applications in unconstrained diffeomorphic synthesis of images, e.g. for tracking, segmentation, registration or classification purposes....
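A toy illustration of why flow-based parameterisations avoid folds: integrating a smooth stationary velocity field yields an invertible, order-preserving map. This 1-D sketch is illustrative only and is not the construction used in the paper; the velocity field and step count are arbitrary choices.

```python
import math

def flow_1d(v, x, steps=100):
    """Integrate dx/dt = v(x) with forward Euler for unit time.
    For a smooth velocity field and small steps the resulting map is
    monotone (order-preserving), i.e. free of folds or tears."""
    h = 1.0 / steps
    for _ in range(steps):
        x += h * v(x)
    return x

# an arbitrary smooth velocity field (illustrative choice)
v = lambda x: 0.5 * math.sin(math.pi * x)
```

Because the deformation is obtained by integrating a field rather than by displacing points directly, no boundary restriction on the parameters is needed to keep the map invertible.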
International Nuclear Information System (INIS)
Ipach, Ingmar; Mittag, F.; Sachsenmaier, S.; Kluba, T.; Heinrich, P.
2011-01-01
Purpose: Two types of femoroacetabular impingement (FAI) are described as reasons for the early development of osteoarthritis of the hip. Cam impingement develops from contact between an abnormal head-neck junction and the acetabular rim. Pincer impingement is characterized by local or general overcoverage of the femoral head by the acetabular rim. Both forms might cause early osteoarthritis of the hip. A decreased head/neck offset has been recognized on AP pelvic views and labeled as 'pistol grip deformity'. The aim of the study was to develop a classification for this deformity with regard to the stage of osteoarthritis of the hip. Materials and Methods: 76 pelvic and axial views were analyzed for alpha angle and head ratio. 22 of them had a normal shape in the head-neck region and no osteoarthritis signs, 27 had a 'pistol grip deformity' and osteoarthritis I, and 27 had a 'pistol grip deformity' and osteoarthritis II-IV. The CART method was used to develop a classification. Results: There was a statistically significant correlation between alpha angle and head ratio. A statistically significant difference in alpha angle and head ratio was seen between the three groups. Using the CART method, we developed a three-step classification system for the 'pistol grip deformity' with very high accuracy. This deformity was aggravated by increasing age. Conclusion: Using this model it is possible to differentiate between normal shapes of the head-neck junction and different severities of the pistol grip deformity. (orig.)
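A minimal sketch of the kind of threshold-based (CART-style) grading the abstract describes. The cut-off values and the function name below are invented for illustration; the study derives its own thresholds from the measured alpha angles and head ratios.

```python
# Hypothetical illustration of a CART-style three-step grading.
# The cut-offs alpha_cut and ratio_cut are placeholders, not the
# thresholds fitted in the paper.

def classify_head_neck(alpha_angle_deg, head_ratio,
                       alpha_cut=55.0, ratio_cut=1.27):
    """Grade 0 = normal, 1 = mild deformity, 2 = advanced deformity."""
    if alpha_angle_deg < alpha_cut:
        return 0          # normal head-neck junction
    if head_ratio < ratio_cut:
        return 1          # pistol grip deformity, early osteoarthritis
    return 2              # pistol grip deformity, osteoarthritis II-IV
```

A fitted CART model is exactly such a cascade of axis-aligned threshold tests, which is why it yields an easily communicated clinical classification.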
14 CFR Section 19 - Uniform Classification of Operating Statistics
2010-01-01
... Statistics Section 19 Section 19 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... AIR CARRIERS Operating Statistics Classifications Section 19 Uniform Classification of Operating Statistics ...
A Classification of Statistics Courses (A Framework for Studying Statistical Education)
Turner, J. C.
1976-01-01
A classification of statistics courses is presented, with main categories of "course type," "methods of presentation," "objectives," and "syllabus." Examples and suggestions for uses of the classification are given. (DT)
High-temperature behavior of a deformed Fermi gas obeying interpolating statistics.
Algin, Abdullah; Senay, Mustafa
2012-04-01
An outstanding idea originally introduced by Greenberg is to investigate whether there is equivalence between intermediate statistics, which may be different from anyonic statistics, and q-deformed particle algebra. Also, a model to be studied for addressing such an idea could possibly provide us some new consequences about the interactions of particles as well as their internal structures. Motivated mainly by this idea, in this work, we consider a q-deformed Fermi gas model whose statistical properties enable us to effectively study interpolating statistics. Starting with a generalized Fermi-Dirac distribution function, we derive several thermostatistical functions of a gas of these deformed fermions in the thermodynamical limit. We study the high-temperature behavior of the system by analyzing the effects of q deformation on the most important thermostatistical characteristics of the system such as the entropy, specific heat, and equation of state. It is shown that such a deformed fermion model in two and three spatial dimensions exhibits the interpolating statistics in a specific interval of the model deformation parameter 0 < q < 1. In particular, for two and three spatial dimensions, it is found from the behavior of the third virial coefficient of the model that the deformation parameter q interpolates completely between attractive and repulsive systems, including the free boson and fermion cases. From the results obtained in this work, we conclude that such a model could provide much physical insight into some interacting theories of fermions, and could be useful to further study the particle systems with intermediate statistics.
An anthropometric classification of body contour deformities after massive weight loss.
Iglesias, Martin; Butron, Patricia; Abarca, Leonardo; Perez-Monzo, Mario F; de Rienzo-Madero, Beatriz
2010-08-01
Deformities caused by massive weight loss were originally subsidized at the Instituto Nacional de Ciencias Médicas y Nutrición "Salvador Zubirán." This caused great economic losses, which led to the development of a classification to select patients with functional problems secondary to massive weight loss. The parameter used is the size of the pannus in relation to fixed anatomic structures within the following anatomic regions: abdomen, arms, thighs, mammary glands, lateral thoracic area, back, lumbar region, gluteal region, sacrum, and mons pubis. Grade 3 deformities are candidates for body contouring surgery because they constitute a functional problem. Grade 2 deformities are reevaluated according to whether the patient has comorbidities. Lesser grades are considered aesthetic procedures and are not candidates for surgical rehabilitation at the Instituto Nacional de Ciencias Médicas y Nutrición "Salvador Zubirán." This classification allowed an improvement in communication between the different surgical-medical specialties; therefore, we suggest its application not only for surgical-administrative reasons but also for academic purposes.
Oostenbroek, Hubert J; Brand, Ronald; van Roermund, Peter M; Castelein, René M
2014-01-01
Limb length discrepancy (LLD) and other patient factors are thought to influence the complication rate in (paediatric) limb deformity correction. In the literature, information is conflicting. This study was performed to identify clinical factors that affect the complication rate in paediatric lower-limb lengthening. A consecutive group of 37 children was analysed. The median proportionate LLD was 15 (4-42)%. An analysis was carried out on several patient factors that may complicate the treatment or end result using logistic regression in a polytomous logistic regression model. The factors analysed were proportionate LLD, cause of deformity, location of corrected bone, and the classification of the deformity according to an overall classification that includes the LLD and all concomitant deformity factors. The median age at the start of the treatment was 11 (6-17) years. The median lengthening index was 1.5 (0.8-3.8) months per centimetre lengthening. The obstacle and complication rate was 69% per lengthened bone. Proportionate LLD was the only statistically significant predictor for the occurrence of complications. Concomitant deformities did not influence the complication rate. From these data we constructed a simple graph that shows the relationship between proportionate LLD and risk for complications. This study shows that only relative LLD is a predictor of the risk for complications. The additional value of this analysis is the production of a simple graph. Construction of this graph using data of a patient group (for example, your own) may allow a more realistic comparison with results in the literature than has been possible before.
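The graph the authors construct, risk of complications as a function of proportionate LLD, comes from a logistic regression. A generic sketch of such a risk curve, with placeholder coefficients rather than the fitted values from the study:

```python
import math

def complication_risk(lld_percent, b0=-2.0, b1=0.12):
    """Logistic model: probability of at least one complication as a
    function of proportionate limb length discrepancy (%).
    b0 and b1 are illustrative placeholders, not the study's estimates."""
    z = b0 + b1 * lld_percent
    return 1.0 / (1.0 + math.exp(-z))
```

Plotting this function over the observed LLD range reproduces the kind of simple risk graph the authors propose for patient counselling.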
2006-01-01
The author develops a deformation theory for degenerations of complex curves; specifically, he treats deformations which induce splittings of the singular fiber of a degeneration. He constructs a deformation of the degeneration in such a way that a subdivisor is "barked" (peeled) off from the singular fiber. These "barking deformations" are related to deformations of surface singularities (in particular, cyclic quotient singularities) as well as the mapping class groups of Riemann surfaces (complex curves) via monodromies. Important applications, such as the classification of atomic degenerations, are also explained.
DEFF Research Database (Denmark)
Hallager, Dennis Winge; Hansen, Lars Valentin; Dragsted, Casper Rokkjær
2016-01-01
STUDY DESIGN: Cross-sectional analyses on a consecutive, prospective cohort. OBJECTIVE: To evaluate the ability of the Scoliosis Research Society (SRS)-Schwab Adult Spinal Deformity Classification to group patients by widely used health-related quality-of-life (HRQOL) scores and examine possible...... to confounding. However, age group and aetiology had individual significant effects. CONCLUSION: The SRS-Schwab sagittal modifiers reliably grouped patients graded 0 versus + / + + according to the most widely used HRQOL scores and the effects of increasing grade level on odds for worse ODI scores remained...... confounding variables. SUMMARY OF BACKGROUND DATA: The SRS-Schwab Adult Spinal Deformity Classification includes sagittal modifiers considered important for HRQOL and the clinical impact of the classification has been validated in patients from the International Spine Study Group database; however, equivocal...
Deformation of log-likelihood loss function for multiclass boosting.
Kanamori, Takafumi
2010-09-01
The purpose of this paper is to study loss functions in multiclass classification. In classification problems, the decision function is estimated by minimizing an empirical loss function, and then, the output label is predicted by using the estimated decision function. We propose a class of loss functions which is obtained by a deformation of the log-likelihood loss function. There are four main reasons why we focus on the deformed log-likelihood loss function: (1) this is a class of loss functions which has not been deeply investigated so far, (2) in terms of computation, a boosting algorithm with a pseudo-loss is available to minimize the proposed loss function, (3) the proposed loss functions provide a clear correspondence between the decision functions and conditional probabilities of output labels, (4) the proposed loss functions satisfy the statistical consistency of the classification error rate which is a desirable property in classification problems. Based on (3), we show that the deformed log-likelihood loss provides a model of mislabeling which is useful as a statistical model of medical diagnostics. We also propose a robust loss function against outliers in multiclass classification based on our approach. The robust loss function is a natural extension of the existing robust loss function for binary classification. A model of mislabeling and a robust loss function are useful to cope with noisy data. Some numerical studies are presented to show the robustness of the proposed loss function. A mathematical characterization of the deformed log-likelihood loss function is also presented. Copyright 2010 Elsevier Ltd. All rights reserved.
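For reference, the undeformed baseline: the multiclass log-likelihood (softmax cross-entropy) loss that the paper's family of losses deforms. A minimal sketch:

```python
import math

def softmax(scores):
    """Convert decision-function scores into class probabilities."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def log_likelihood_loss(scores, label):
    """Standard multiclass log-loss; the paper studies deformations of
    this function that keep the score/probability correspondence."""
    return -math.log(softmax(scores)[label])
```

Property (3) in the abstract is visible here: the decision function determines the conditional class probabilities through the softmax link, and a deformation of the loss changes that link in a controlled way.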
Texture classification by texton: statistical versus binary.
Directory of Open Access Journals (Sweden)
Zhenhua Guo
Full Text Available Using statistical textons for texture classification has shown great success recently. The maximal response 8 (Statistical_MR8), image patch (Statistical_Joint) and locally invariant fractal (Statistical_Fractal) are typical statistical texton algorithms and state-of-the-art texture classification methods. However, there are two limitations when using these methods. First, they need a training stage to build a texton library, so the recognition accuracy is highly dependent on the training samples; second, during feature extraction, each local feature is assigned to a texton by searching for the nearest texton in the whole library, which is time consuming when the library is big and the feature dimension is high. To address these two issues, in this paper three binary texton counterpart methods are proposed: Binary_MR8, Binary_Joint, and Binary_Fractal. These methods do not require any training step but encode the local feature into a binary representation directly. The experimental results on the CUReT, UIUC and KTH-TIPS databases show that binary textons can achieve sound results with fast feature extraction, especially when the image size is not big and the image quality is not poor.
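The two feature-extraction regimes contrasted in the abstract can be sketched in a few lines: the statistical texton needs a learned library and a nearest-neighbour search, while the binary texton thresholds the feature directly. Function names and the threshold are illustrative assumptions, not the paper's exact encodings (Binary_MR8, for example, is tied to a specific filter bank).

```python
def nearest_texton(feature, library):
    """Statistical texton: exhaustive nearest-neighbour search over the
    learned library, O(|library|) per feature."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(library)), key=lambda i: d2(feature, library[i]))

def binary_code(feature, threshold=0.0):
    """Binary texton: threshold each dimension directly; no training
    stage and no library search are needed."""
    return tuple(1 if x > threshold else 0 for x in feature)
```

The speed advantage is structural: `binary_code` is O(dim) per feature regardless of how many distinct codes exist, whereas `nearest_texton` scales with the library size.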
Classification, (big) data analysis and statistical learning
Conversano, Claudio; Vichi, Maurizio
2018-01-01
This edited book focuses on the latest developments in classification, statistical learning, data analysis and related areas of data science, including statistical analysis of large datasets, big data analytics, time series clustering, integration of data from different sources, as well as social networks. It covers both methodological aspects as well as applications to a wide range of areas such as economics, marketing, education, social sciences, medicine, environmental sciences and the pharmaceutical industry. In addition, it describes the basic features of the software behind the data analysis results, and provides links to the corresponding codes and data sets where necessary. This book is intended for researchers and practitioners who are interested in the latest developments and applications in the field. The peer-reviewed contributions were presented at the 10th Scientific Meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in Santa Margherita di Pul...
Lepore, Natasha; Brun, Caroline; Chou, Yi-Yu; Chiang, Ming-Chang; Dutton, Rebecca A.; Hayashi, Kiralee M.; Luders, Eileen; Lopez, Oscar L.; Aizenstein, Howard J.; Toga, Arthur W.; Becker, James T.; Thompson, Paul M.
2008-01-01
This paper investigates the performance of a new multivariate method for tensor-based morphometry (TBM). Statistics on Riemannian manifolds are developed that exploit the full information in deformation tensor fields. In TBM, multiple brain images are warped to a common neuroanatomical template via 3-D nonlinear registration; the resulting deformation fields are analyzed statistically to identify group differences in anatomy. Rather than study the Jacobian determinant (volume expansion factor...
Classification of Specialized Farms Applying Multivariate Statistical Methods
Directory of Open Access Journals (Sweden)
Zuzana Hloušková
2017-01-01
Full Text Available The paper is aimed at application of advanced multivariate statistical methods to classifying cattle breeding farming enterprises by their economic size. An advantage of the model is its ability to use a few selected indicators compared to the complex methodology of the current classification model, which requires knowledge of the detailed structure of the herd turnover and the structure of cultivated crops. The output of the paper is intended to be applied within farm structure research focused on the future development of Czech agriculture. As the data source, the farming enterprises database for 2014 from the FADN CZ system has been used. The predictive model proposed exploits knowledge of the actual size classes of the farms tested. Outcomes of the linear discriminant analysis multifactor classification method have supported correct classification of farming enterprises in the group of Small farms (98% classified correctly) and the Large and Very Large enterprises (100% classified correctly). The Medium Size farms have been correctly classified at only 58.11%. Partial shortcomings of the process presented have been found when discriminating Medium and Small farms.
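A minimal two-class, two-feature Fisher discriminant, the building block of the multifactor linear discriminant analysis used in the paper. The toy data in the test are invented; the real model uses the selected FADN CZ indicators and more than two size classes.

```python
def fisher_lda_2d(class_a, class_b):
    """Two-class Fisher discriminant for 2-D feature vectors.
    Returns the projection direction w = Sw^-1 (m_a - m_b)."""
    def mean(pts):
        n = len(pts)
        return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]
    def scatter(pts, m):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
        return s
    ma, mb = mean(class_a), mean(class_b)
    sa, sb = scatter(class_a, ma), scatter(class_b, mb)
    # pooled within-class scatter and its explicit 2x2 inverse
    sw = [[sa[i][j] + sb[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]
```

Projecting samples onto `w` and thresholding the projection is exactly the per-pair decision rule that a multiclass discriminant analysis composes.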
Shape-correlated deformation statistics for respiratory motion prediction in 4D lung
Liu, Xiaoxiao; Oguz, Ipek; Pizer, Stephen M.; Mageras, Gig S.
2010-02-01
4D image-guided radiation therapy (IGRT) for free-breathing lungs is challenging due to the complicated respiratory dynamics. Effective modeling of respiratory motion is crucial to account for the motion effects on the dose to tumors. We propose a shape-correlated statistical model on dense image deformations for patient-specific respiratory motion estimation in 4D lung IGRT. Using the shape deformations of the high-contrast lungs as the surrogate, the statistical model trained from the planning CTs can be used to predict the image deformation during delivery verification time, with the assumption that the respiratory motion at both times is similar for the same patient. Dense image deformation fields obtained by diffeomorphic image registrations characterize the respiratory motion within one breathing cycle. A point-based particle optimization algorithm is used to obtain the shape models of lungs with group-wise surface correspondences. Canonical correlation analysis (CCA) is adopted in training to maximize the linear correlation between the shape variations of the lungs and the corresponding dense image deformations. Both intra- and inter-session CT studies are carried out on a small group of lung cancer patients and evaluated in terms of the tumor location accuracies. The results suggest potential applications using the proposed method.
The ability of current statistical classifications to separate services and manufacturing
DEFF Research Database (Denmark)
Christensen, Jesper Lindgaard
2013-01-01
This paper explores the performance of current statistical classification systems in classifying firms and, in particular, their ability to distinguish between firms that provide services and firms that provide manufacturing. We find that a large share of firms, almost 20%, are not classified...... as expected based on a comparison of their statements of activities with the assigned industry codes. This result is robust to analyses on different levels of aggregation and is validated in an additional survey. It is well known from earlier literature that industry classification systems are not perfect....... This paper provides a quantification of the flaws in classifications of firms. Moreover, it is explained why the classifications of firms are imprecise. The increasing complexity of production, inertia in changes to statistical systems and the increasing integration of manufacturing products and services...
A statistical approach to root system classification.
Directory of Open Access Journals (Sweden)
Gernot eBodner
2013-08-01
Full Text Available Plant root systems have a key role in ecology and agronomy. Despite a fast increase in root studies, there is still no classification that allows distinguishing among distinctive characteristics within the diversity of rooting strategies. Our hypothesis is that a multivariate approach for plant functional type identification in ecology can be applied to the classification of root systems. We demonstrate that combining principal component and cluster analysis yields a meaningful classification of rooting types based on morphological traits. The classification method presented is based on a data-defined statistical procedure without an a priori decision on the classifiers. Biplot inspection is used to determine key traits and to ensure stability in cluster-based grouping. The classification method is exemplified with simulated root architectures and morphological field data. Simulated root architectures showed that morphological attributes with spatial distribution parameters capture most distinctive features within root system diversity. While developmental type (tap vs. shoot-borne systems) is a strong but coarse classifier, topological traits provide the most detailed differentiation among distinctive groups. The adequacy of commonly available morphological traits for classification is supported by field data. Three rooting types emerged from the measured data, distinguished by diameter/weight, density and spatial distribution, respectively. Similarity of root systems within distinctive groups was the joint result of phylogenetic relation and environmental as well as human selection pressure. We conclude that the data-defined classification is appropriate for integration of knowledge obtained with different root measurement methods and at various scales. Currently root morphology is the most promising basis for classification due to widely used common measurement protocols. To capture details of root diversity efforts in architectural measurement
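The first step of the proposed procedure, principal component analysis of morphological traits, reduces in the two-trait case to the leading eigenvector of a 2x2 covariance matrix, which has a closed form. A minimal sketch (the two traits are arbitrary stand-ins for measured root attributes):

```python
import math

def principal_axis_2d(points):
    """First principal component of 2-D trait data: the unit leading
    eigenvector of the 2x2 covariance matrix, in closed form."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # leading eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    v = (lam - syy, sxy) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm)
```

Clustering the scores along such axes, rather than the raw traits, is what makes the resulting grouping "data-defined" instead of relying on classifiers chosen a priori.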
Lepore, N; Brun, C; Chou, Y Y; Chiang, M C; Dutton, R A; Hayashi, K M; Luders, E; Lopez, O L; Aizenstein, H J; Toga, A W; Becker, J T; Thompson, P M
2008-01-01
This paper investigates the performance of a new multivariate method for tensor-based morphometry (TBM). Statistics on Riemannian manifolds are developed that exploit the full information in deformation tensor fields. In TBM, multiple brain images are warped to a common neuroanatomical template via 3-D nonlinear registration; the resulting deformation fields are analyzed statistically to identify group differences in anatomy. Rather than study the Jacobian determinant (volume expansion factor) of these deformations, as is common, we retain the full deformation tensors and apply a manifold version of Hotelling's $T^2$ test to them, in a Log-Euclidean domain. In 2-D and 3-D magnetic resonance imaging (MRI) data from 26 HIV/AIDS patients and 14 matched healthy subjects, we compared multivariate tensor analysis versus univariate tests of simpler tensor-derived indices: the Jacobian determinant, the trace, geodesic anisotropy, and eigenvalues of the deformation tensor, and the angle of rotation of its eigenvectors. We detected consistent, but more extensive patterns of structural abnormalities, with multivariate tests on the full tensor manifold. Their improved power was established by analyzing cumulative p-value plots using false discovery rate (FDR) methods, appropriately controlling for false positives. This increased detection sensitivity may empower drug trials and large-scale studies of disease that use tensor-based morphometry.
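The multivariate statistic at the core of the method is Hotelling's T^2 applied to log-transformed deformation tensors. A one-sample 2-D sketch (e.g. the two log-eigenvalues of a diagonal tensor); the paper itself works with full 3-D tensors on the Log-Euclidean manifold and a two-sample group comparison:

```python
def hotelling_t2(samples, mu=(0.0, 0.0)):
    """One-sample Hotelling T^2 for 2-D feature vectors, with the 2x2
    covariance inverse written out explicitly. Simplified sketch of the
    statistic used on log-deformation-tensor features."""
    n = len(samples)
    mx = sum(s[0] for s in samples) / n
    my = sum(s[1] for s in samples) / n
    # unbiased sample covariance entries
    sxx = sum((s[0] - mx) ** 2 for s in samples) / (n - 1)
    syy = sum((s[1] - my) ** 2 for s in samples) / (n - 1)
    sxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / (n - 1)
    det = sxx * syy - sxy * sxy
    dx, dy = mx - mu[0], my - mu[1]
    # T^2 = n * d' S^-1 d
    return n * (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
```

Unlike a univariate test on the Jacobian determinant alone, this statistic is sensitive to correlated changes across tensor components, which is the source of the extra detection power reported.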
A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals
Mohamed, Mamdouh S.
2015-05-18
The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.
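Of the two statistical tools used here, the probability distribution function of a strain or rotation component is simply a normalised histogram; the pair correlation function additionally bins products of values at given spatial separations. A minimal sketch of the first:

```python
def histogram_pdf(samples, bins=10):
    """Normalised histogram as an estimate of the probability density
    of a scalar field component (e.g. one lattice rotation component)."""
    lo, hi = min(samples), max(samples)
    w = (hi - lo) / bins or 1.0   # bin width; guard against equal samples
    counts = [0] * bins
    for s in samples:
        i = min(int((s - lo) / w), bins - 1)
        counts[i] += 1
    n = len(samples)
    return [c / (n * w) for c in counts]
```

The density normalisation (dividing by n*w rather than n) is what allows PDFs from simulations at different strain levels, or from experiment, to be compared on the same axes.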
A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals
Mohamed, Mamdouh S.; Larson, Ben C.; Tischler, Jon Z.; El-Azab, Anter
2015-01-01
The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.
Incidence of spinal deformity in adults and its distribution according SRS-Schwab classification
Directory of Open Access Journals (Sweden)
Marcus Vinicius Amaral Barreto
2015-06-01
Full Text Available OBJECTIVE: To evaluate the incidence of spinal deformity in adults, as well as its distribution according to the curve type and the occurrence of sagittal modifiers of the SRS-Schwab classification. METHODS: Radiographs in frontal and lateral views of the entire column were performed and radiographic parameters were used to diagnose the vertebral deformity for the classification according to the SRS-Schwab system. RESULTS: We included 302 patients in the study, 236 (78.1%) women and 66 (21.9%) men. Fifty-six of the participants were diagnosed with ASD, 50 women and 6 men. The incidence of ASD was 18.5% in the total population, ranging from 9.1% in males to 21.2% in females (p=0.04). As to age group, the incidence was 11.9% in patients between 18 and 39 years, 12% between 40 and 59 years and 28.8% in patients 60 years of age or older, significantly higher in the oldest group (p=0.002). When analyzing the correlation between age and progression of sagittal modifiers, there was no significant difference in the PI-LL and PT modifiers, but there was a significant difference in the SVA modifier (p=0.008), with a higher age in individuals "++". CONCLUSION: This study presented demographic data on ASD in a Brazilian population sample. There was a higher incidence of ASD in females and individuals aged ≥ 60 years. As for the sagittal modifiers of the SRS-Schwab classification, there was a correlation between increasing age and degree of progression of SVA.
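The incidence figures quoted in the abstract are straightforward proportions and can be checked directly against the reported counts:

```python
def incidence_percent(cases, total):
    """Incidence as a percentage of a study (sub)population."""
    return 100.0 * cases / total
```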
Statistical methods for segmentation and classification of images
DEFF Research Database (Denmark)
Rosholm, Anders
1997-01-01
The central matter of the present thesis is Bayesian statistical inference applied to classification of images. An initial review of Markov Random Fields relates to the modeling aspect of the indicated main subject. In that connection, emphasis is put on the relatively unknown sub-class of Pickard...... with a Pickard Random Field modeling of a considered (categorical) image phenomemon. An extension of the fast PRF based classification technique is presented. The modification introduces auto-correlation into the model of an involved noise process, which previously has been assumed independent. The suitability...... of the extended model is documented by tests on controlled image data containing auto-correlated noise....
The κ parameter and κ-distribution in κ-deformed statistics for the systems in an external field
International Nuclear Information System (INIS)
Guo, Lina; Du, Jiulin
2007-01-01
It is a naturally important question to ask under what physical situation the κ-deformed statistics should be suitable for the statistical description of a system, and what the κ parameter should stand for. In this Letter, a formula expression for the κ parameter is derived on the basis of the κ-H theorem, the κ-velocity distribution and the generalized Boltzmann equation in the framework of κ-deformed statistics. We thus obtain a physical interpretation for the parameter κ ≠ 0 with regard to the temperature gradient and the external force field. We show that, like the q-statistics based on Tsallis entropy, the κ-deformed statistics may also be a candidate suitable for the statistical description of systems in external fields when they are in a nonequilibrium stationary state, but with different physical characteristics. Namely, the κ-distribution is found to describe the nonequilibrium stationary state of a system where the external force is perpendicular to the temperature gradient.
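For context, κ-deformed (Kaniadakis) statistics is built on the κ-exponential, which smoothly recovers the ordinary exponential as κ → 0; the κ-distribution replaces the Maxwellian factor exp(-E/kT) with this deformed function. A minimal numerical sketch:

```python
import math

def kappa_exp(x, kappa):
    """Kaniadakis kappa-exponential exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k).
    Recovers math.exp(x) in the limit kappa -> 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1 + kappa * kappa * x * x) + kappa * x) ** (1.0 / kappa)
```

For negative arguments the κ-exponential decays as a power law rather than exponentially, which is what gives κ-distributions their heavy tails relative to the Maxwellian.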
Goyal, Shrigopal; Balhara, Yatan Pal Singh; Khandelwal, S K
2012-07-01
Two of the most commonly used nosological systems, the International Statistical Classification of Diseases and Related Health Problems (ICD)-10 and the Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV, are under revision. This process has generated a lot of interesting debate with regard to the future of the current diagnostic categories. In fact, the status of the categorical approach in the upcoming versions of ICD and DSM is also being debated. The current article focuses on the debate with regard to the eating disorders. The existing classification of eating disorders has been criticized for its limitations. A host of new diagnostic categories have been recommended for inclusion in the upcoming revisions. The structure of the existing categories has also been put under scrutiny.
Cameron, Enrico; Pilla, Giorgio; Stella, Fabio A.
2018-01-01
The application of statistical classification methods is investigated—in comparison also to spatial interpolation methods—for predicting the acceptability of well-water quality in a situation where an effective quantitative model of the hydrogeological system under consideration cannot be developed. In the example area in northern Italy, in particular, the aquifer is locally affected by saline water and the concentration of chloride is the main indicator of both saltwater occurrence and groundwater quality. The goal is to predict if the chloride concentration in a water well will exceed the allowable concentration so that the water is unfit for the intended use. A statistical classification algorithm achieved the best predictive performances and the results of the study show that statistical classification methods provide further tools for dealing with groundwater quality problems concerning hydrogeological systems that are too difficult to describe analytically or to simulate effectively.
DEFF Research Database (Denmark)
Nest, Ryszard; Tsygan, Boris
2001-01-01
Recently Kontsevich solved the classification problem for deformation quantizations of all Poisson structures on a manifold. In this paper we study those Poisson structures for which the explicit methods of Fedosov can be applied, namely the Poisson structures coming from symplectic Lie algebroids......, as well as holomorphic symplectic structures. For deformations of these structures we prove classification theorems and a general index theorem....
Visual Tracking of Deformation and Classification of Non-Rigid Objects with Robot Hand Probing
Directory of Open Access Journals (Sweden)
Fei Hui
2017-03-01
Full Text Available Performing tasks with a robot hand often requires a complete knowledge of the manipulated object, including its properties (shape, rigidity, surface texture) and its location in the environment, in order to ensure safe and efficient manipulation. While well-established procedures exist for the manipulation of rigid objects, as well as several approaches for the manipulation of linear or planar deformable objects such as ropes or fabric, research addressing the characterization of deformable objects occupying a volume remains relatively limited. The paper proposes an approach for tracking the deformation of non-rigid objects under robot hand manipulation using RGB-D data. The purpose is to automatically classify deformable objects as rigid, elastic, plastic, or elasto-plastic, based on the material they are made of, and to support recognition of the category of such objects through a robotic probing process in order to enhance manipulation capabilities. The proposed approach advantageously combines classical color and depth image processing techniques and proposes a novel combination of the fast level set method with a log-polar mapping of the visual data to robustly detect and track the contour of a deformable object in an RGB-D data stream. Dynamic time warping is employed to characterize the object properties independently of the varying length of the tracked contour as the object deforms. The proposed solution achieves a classification rate over all categories of material of up to 98.3%. When integrated in the control loop of a robot hand, it can contribute to ensuring a stable grasp and a safe manipulation capability that preserves the physical integrity of the object.
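Dynamic time warping, used here to compare tracked contours of varying length, can be sketched in its classic O(nm) dynamic-programming form:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping between two 1-D sequences of possibly
    different lengths, with absolute difference as the local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because the warping path may stretch or compress either sequence, two contours sampled at different lengths but with the same shape compare as identical, which is exactly the length-invariance the abstract relies on.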
Identification of AE Bursts by Classification of Physical and Statistical Parameters
International Nuclear Information System (INIS)
Mieza, J.I.; Oliveto, M.E.; Lopez Pumarega, M.I.; Armeite, M.; Ruzzante, J.E.; Piotrkowski, R.
2005-01-01
Physical and statistical parameters extracted with the Principal Components method from Acoustic Emission bursts recorded during triaxial deformation tests were analyzed. The samples came from seamless steel tubes used in the petroleum industry, some of which were provided with a protective coating. The purpose of our work was to distinguish bursts originating in the breakage of the coating from those originating in damage mechanisms in the bulk steel matrix. The analysis was performed using statistical distributions, fractal analysis and clustering methods.
Deformations of superconformal theories
Energy Technology Data Exchange (ETDEWEB)
Córdova, Clay [School of Natural Sciences, Institute for Advanced Study,1 Einstein Drive, Princeton, NJ 08540 (United States); Dumitrescu, Thomas T. [Department of Physics, Harvard University,17 Oxford Street, Cambridge, MA 02138 (United States); Intriligator, Kenneth [Department of Physics, University of California,9500 Gilman Drive, San Diego, La Jolla, CA 92093 (United States)
2016-11-22
We classify possible supersymmetry-preserving relevant, marginal, and irrelevant deformations of unitary superconformal theories in d≥3 dimensions. Our method only relies on symmetries and unitarity. Hence, the results are model independent and do not require a Lagrangian description. Two unifying themes emerge: first, many theories admit deformations that reside in multiplets together with conserved currents. Such deformations can lead to modifications of the supersymmetry algebra by central and non-central charges. Second, many theories with a sufficient amount of supersymmetry do not admit relevant or marginal deformations, and some admit neither. The classification is complicated by the fact that short superconformal multiplets display a rich variety of sporadic phenomena, including supersymmetric deformations that reside in the middle of a multiplet. We illustrate our results with examples in diverse dimensions. In particular, we explain how the classification of irrelevant supersymmetric deformations can be used to derive known and new constraints on moduli-space effective actions.
Hallager, Dennis Winge; Hansen, Lars Valentin; Dragsted, Casper Rokkjær; Peytz, Nina; Gehrchen, Martin; Dahl, Benny
2016-05-01
Cross-sectional analyses on a consecutive, prospective cohort. To evaluate the ability of the Scoliosis Research Society (SRS)-Schwab Adult Spinal Deformity Classification to group patients by widely used health-related quality-of-life (HRQOL) scores and examine possible confounding variables. The SRS-Schwab Adult Spinal Deformity Classification includes sagittal modifiers considered important for HRQOL, and the clinical impact of the classification has been validated in patients from the International Spine Study Group database; however, equivocal results were reported for the Pelvic Tilt modifier and potential confounding variables were not evaluated. Between March 2013 and May 2014, all adult spinal deformity patients from our outpatient clinic with sufficient radiographs were prospectively enrolled. Analyses of HRQOL variance and post hoc analyses were performed for each SRS-Schwab modifier. Age, history of spine surgery, and aetiology of spinal deformity were considered potential confounders, and their influence on the association between SRS-Schwab modifiers and aggregated Oswestry Disability Index (ODI) scores was evaluated with multivariate proportional odds regressions. P values were adjusted for multiple testing. Two hundred ninety-two of 460 eligible patients were included for analyses. The SRS-Schwab Classification significantly discriminated HRQOL scores between normal and abnormal sagittal modifier classifications. Individual grade comparisons showed equivocal results; however, Pelvic Tilt grade + versus ++ did not discriminate patients according to any HRQOL score. All modifiers showed significant proportional odds for worse aggregated ODI scores with increasing grade levels and the effects were robust to confounding. However, age group and aetiology had individual significant effects. The SRS-Schwab sagittal modifiers reliably grouped patients graded 0 versus +/++ according to the most widely used HRQOL scores and the
Statistical Emulator for Expensive Classification Simulators
Ross, Jerret; Samareh, Jamshid A.
2016-01-01
Expensive simulators prevent any kind of meaningful analysis from being performed on the phenomena they model. To get around this problem, the concept of using a statistical emulator as a surrogate representation of the simulator was introduced in the 1980s. Presently, simulators have become more and more complex, and as a result running a single example on them is very expensive and can take days, weeks or even months. Many new techniques, termed criteria, have been introduced that sequentially select the next best (most informative to the emulator) point to run on the simulator. These methods allow the creation of an emulator from only a small number of simulator runs. We follow and extend this framework to expensive classification simulators.
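The sequential selection idea, picking the next simulator run where the emulator is least informed, can be illustrated with a deliberately simple distance-based criterion (the toy simulator and the criterion below are hypothetical stand-ins; real criteria use the emulator's predictive uncertainty):

```python
import math
import random

def expensive_simulator(x):
    # Stand-in for a costly classification simulator (hypothetical):
    # label 1 inside the unit circle, 0 outside.
    return 1 if x[0] ** 2 + x[1] ** 2 < 1.0 else 0

def most_informative(candidates, labeled):
    # Toy criterion: pick the candidate farthest from any already-run
    # point, i.e. where the emulator is least informed.
    def dist_to_labeled(c):
        return min(math.dist(c, p) for p, _ in labeled)
    return max(candidates, key=dist_to_labeled)

random.seed(0)
candidates = [(random.uniform(-2, 2), random.uniform(-2, 2)) for _ in range(200)]
labeled = [((0.0, 0.0), expensive_simulator((0.0, 0.0)))]
for _ in range(10):  # only 10 expensive runs instead of 200
    x = most_informative(candidates, labeled)
    labeled.append((x, expensive_simulator(x)))
print(len(labeled))  # 11 simulator evaluations in total
```

An emulator trained on the 11 labeled points then stands in for the simulator on the remaining candidates.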
Directory of Open Access Journals (Sweden)
Emmanouil Styvaktakis
2007-01-01
Full Text Available This paper presents the two main types of classification methods for power quality disturbances based on underlying causes: deterministic classification, with an expert system as an example, and statistical classification, with support vector machines (a novel method) as an example. An expert system is suitable when one has a limited amount of data and sufficient power-system expert knowledge; however, its application requires a set of threshold values. Statistical methods are suitable when a large amount of data is available for training. Two issues important for the effectiveness of a classifier, data segmentation and feature extraction, are discussed. Segmentation is a preprocessing step that partitions a recorded data sequence into segments, each representing a duration containing either an event or a transition between two events. Feature extraction is then applied to each segment individually. Some useful features and their effectiveness are discussed. Experimental results are included to demonstrate the effectiveness of both systems. Finally, conclusions are given together with a discussion of some future research directions.
2010-07-08
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention National Center for Health Statistics (NCHS), Classifications and Public Health Data Standards Staff, Announces the... Prevention, Classifications and Public Health Data Standards, 3311 Toledo Road, Room 2337, Hyattsville, MD...
2013-08-28
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention National Center for Health Statistics (NCHS), Classifications and Public Health Data Standards Staff, Announces the... Administrator, Classifications and Public Health Data Standards Staff, NCHS, 3311 Toledo Road, Room 2337...
2013-02-07
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention National Center for Health Statistics (NCHS), Classifications and Public Health Data Standards Staff, Announces the..., Medical Systems Administrator, Classifications and Public Health Data Standards Staff, NCHS, 3311 Toledo...
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
International Nuclear Information System (INIS)
Brock, Kristy K.; Dawson, Laura A.; Sharpe, Michael B.; Moseley, Douglas J.; Jaffray, David A.
2006-01-01
Purpose: To investigate the feasibility of a biomechanical-based deformable image registration technique for the integration of multimodality imaging, image-guided treatment, and response monitoring. Methods and Materials: A multiorgan deformable image registration technique based on finite element modeling (FEM) and surface projection alignment of selected regions of interest, with biomechanical material and interface models, has been developed. FEM also provides an inherent method for directly tracking specified regions through treatment and follow-up. Results: The technique was demonstrated on 5 liver cancer patients. Differences of up to 1 cm of motion were seen between the diaphragm and the tumor center of mass after deformable image registration of exhale and inhale CT scans. Spatial differences of 5 mm or more were observed for up to 86% of the surface of the defined tumor after deformable image registration of the computed tomography (CT) and magnetic resonance images. Up to 6.8 mm of motion was observed for the tumor after deformable image registration of the CT and cone-beam CT scan following rigid registration of the liver. Deformable registration of the CT to the follow-up CT allowed a more accurate assessment of tumor response. Conclusions: This biomechanical-based deformable image registration technique incorporates classification, targeting, and monitoring of tumor and normal tissue using one methodology.
Statistical methods of discrimination and classification advances in theory and applications
Choi, Sung C
1986-01-01
Statistical Methods of Discrimination and Classification: Advances in Theory and Applications is a collection of papers that tackles the multivariate problems of discriminating and classifying subjects into exclusive populations. The book presents 13 papers that cover advances in the statistical procedures of discrimination and classification. The studies in the text primarily focus on various methods of discriminating and classifying variables, such as multiple discriminant analysis in the presence of mixed continuous and categorical data; choice of the smoothing parameter and efficiency o
The ability of current statistical classifications to separate services and manufacturing
DEFF Research Database (Denmark)
Christensen, Jesper Lindgaard
The paper explores how well our statistical classification systems perform in classifying firms and in particular how they distinguish firms doing services and/or manufacturing. It is found that a large share, almost 20%, of firms can be said to be misclassified based on their statements on activ...
International Nuclear Information System (INIS)
Dai, Wu-Sheng; Xie, Mi
2013-01-01
In this paper, we give a general discussion on the calculation of the statistical distribution from a given operator relation of creation, annihilation, and number operators. Our result shows that as long as the relation between the number operator and the creation and annihilation operators can be expressed as a†b = Λ(N) or N = Λ⁻¹(a†b), where N, a†, and b denote the number, creation, and annihilation operators, i.e., N is a function of a quadratic product of the creation and annihilation operators, the corresponding statistical distribution is the Gentile distribution, a statistical distribution in which the maximum occupation number is an arbitrary integer. As examples, we discuss the statistical distributions corresponding to various operator relations. In particular, besides the Bose-Einstein and Fermi-Dirac cases, we discuss the statistical distributions for various schemes of intermediate statistics, especially various q-deformation schemes. Our result shows that the statistical distributions corresponding to various q-deformation schemes are various Gentile distributions with different maximum occupation numbers, determined by the deformation parameter q. This shows that the results given in much of the literature on the q-deformation distribution are inaccurate or incomplete. -- Highlights: ► A general discussion on calculating the statistical distribution from relations of creation, annihilation, and number operators. ► A systematic study of the statistical distributions corresponding to various q-deformation schemes. ► An argument that many results on q-deformation distributions in the literature are inaccurate or incomplete.
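The Gentile distribution interpolates between the familiar limits, and its standard mean-occupation formula can be checked numerically (a short sketch; x stands for β(ε − μ) and the example values are arbitrary):

```python
import math

def gentile_occupation(x, n_max):
    # Mean occupation number for Gentile statistics with maximum
    # occupation n_max, where x = beta * (epsilon - mu) > 0.
    # n_max = 1 gives Fermi-Dirac; n_max -> infinity gives Bose-Einstein.
    return (1.0 / (math.exp(x) - 1.0)
            - (n_max + 1) / (math.exp((n_max + 1) * x) - 1.0))

x = 0.7
fermi_dirac = 1.0 / (math.exp(x) + 1.0)
bose_einstein = 1.0 / (math.exp(x) - 1.0)
print(abs(gentile_occupation(x, 1) - fermi_dirac))     # ~0
print(abs(gentile_occupation(x, 100) - bose_einstein)) # ~0
```

For n_max = 1 the two terms combine algebraically to 1/(e^x + 1), and for large n_max the second term is exponentially suppressed, leaving the Bose-Einstein form.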
Vile, Douglas J.
In radiation therapy, interfraction organ motion introduces a level of geometric uncertainty into the planning process. Plans, which are typically based upon a single instance of anatomy, must be robust against daily anatomical variations. For this problem, a model of the magnitude, direction, and likelihood of deformation is useful. In this thesis, principal component analysis (PCA) is used to statistically model the 3D organ motion for 19 prostate cancer patients, each with 8-13 fractional computed tomography (CT) images. Deformable image registration and the resultant displacement vector fields (DVFs) are used to quantify the interfraction systematic and random motion. By applying the PCA technique to the random DVFs, principal modes of random tissue deformation were determined for each patient, and a method for sampling synthetic random DVFs was developed. The PCA model was then extended to describe the principal modes of systematic and random organ motion for the population of patients. A leave-one-out study tested both the systematic and random motion models' ability to represent PCA training-set DVFs. The random and systematic DVF PCA models allowed the reconstruction of these data with absolute mean errors between 0.5-0.9 mm and 1-2 mm, respectively. To the best of the author's knowledge, this study is the first successful effort to build a fully 3D statistical PCA model of systematic tissue deformation in a population of patients. By sampling synthetic systematic and random errors, organ occupancy maps were created for bony and prostate-centroid patient setup processes. By thresholding these maps, a PCA-based planning target volume (PTV) was created and tested against conventional margin recipes (van Herk for bony alignment and a 5 mm fixed [3 mm posterior] margin for centroid alignment) in a virtual clinical trial for low-risk prostate cancer. Deformably accumulated delivered dose served as a surrogate for clinical outcome. For the bony landmark setup
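The sampling step, drawing synthetic DVFs from a PCA motion model, amounts to perturbing the mean field along each principal mode by a standard normal coefficient scaled by that mode's standard deviation (a minimal sketch with toy dimensions and values, not the thesis data):

```python
import random

def sample_dvf(mean, modes, stdevs, rng):
    # Synthetic DVF from a PCA motion model:
    #   d = mean + sum_k z_k * sigma_k * mode_k,  z_k ~ N(0, 1),
    # where sigma_k is the square root of the k-th PCA eigenvalue.
    d = list(mean)
    for mode, sigma in zip(modes, stdevs):
        z = rng.gauss(0.0, 1.0)
        for i in range(len(d)):
            d[i] += z * sigma * mode[i]
    return d

rng = random.Random(42)
mean = [0.0, 0.0, 0.0, 0.0]            # toy 4-component DVF (illustrative)
modes = [[1.0, 0.0, -1.0, 0.0],        # unit-norm principal modes (assumed)
         [0.0, 1.0, 0.0, -1.0]]
stdevs = [2.0, 0.5]                    # sqrt of eigenvalues, in mm
print(sample_dvf(mean, modes, stdevs, rng))
```

Repeating the draw many times yields the kind of occupancy statistics that the thesis thresholds into a PCA-based PTV.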
Craniofacial Statistical Deformation Models of Wild-type mice and Crouzon mice
DEFF Research Database (Denmark)
Ólafsdóttir, Hildur; Darvann, Tron Andre; Ersbøll, Bjarne Kjær
2007-01-01
Crouzon syndrome is characterised by the premature fusion of cranial sutures and synchondroses, leading to craniofacial growth disturbances. The gene causing the syndrome was discovered approximately a decade ago, and recently the first mouse model of the syndrome was generated. In this study, a set of micro-CT scans of the heads of wild-type (normal) mice and Crouzon mice was investigated. Statistical deformation models were built to assess the anatomical differences between the groups, as well as the within-group anatomical variation. Following the approach by Rueckert et al. we built an atlas...
Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal
2013-04-01
The application of space-borne synthetic aperture radar interferometry has progressed, over the last two decades, from the pioneering use of single interferograms for analyzing changes on the earth's surface to the development of advanced multi-interferogram techniques to analyze any sort of natural phenomenon that involves movement of the ground. The success of multi-interferogram techniques in the analysis of natural hazards such as landslides and subsidence is widely documented in the scientific literature and demonstrated by the consensus among end-users. Despite the great potential of this technique, radar interpretation of slope movements is generally based on the sole analysis of average displacement velocities, while the information embraced in multi-interferogram time series is often overlooked if not completely neglected. The underuse of PS time series is probably due to the detrimental effect of residual atmospheric errors, which leave the PS time series with erratic, irregular fluctuations that are often difficult to interpret, and also to the difficulty of performing a visual, supervised analysis of the time series for a large dataset. In this work we present a procedure for automatic classification of PS time series based on a series of statistical characterization tests. The procedure classifies the time series into six distinctive target trends (0 = uncorrelated; 1 = linear; 2 = quadratic; 3 = bilinear; 4 = discontinuous without constant velocity; 5 = discontinuous with change in velocity) and retrieves for each trend a series of descriptive parameters which can be efficiently used to characterize the temporal changes of ground motion. The classification algorithms were developed and tested using an ENVISAT dataset available in the frame of the EPRS-E project (Extraordinary Plan of Environmental Remote Sensing) of the Italian Ministry of Environment (track "Modena", Northern Apennines). This dataset was generated using standard processing, then the
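A simplified version of such trend assignment can be sketched by fitting polynomials of increasing degree and comparing penalized goodness-of-fit scores (a toy stand-in covering only the first three trend classes; the paper's actual statistical tests and thresholds are not reproduced):

```python
import math

def polyfit(t, y, deg):
    # Least-squares polynomial fit via normal equations and Gaussian
    # elimination with partial pivoting (adequate for the low degrees here).
    n = deg + 1
    A = [[sum(ti ** (i + j) for ti in t) for j in range(n)] for i in range(n)]
    b = [sum(yi * ti ** i for ti, yi in zip(t, y)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def classify_trend(t, y):
    # Pick the trend class whose polynomial fit minimizes a BIC-like
    # score (residual term plus a penalty on the number of parameters).
    n = len(y)
    best_label, best_score = None, float("inf")
    for label, deg in (("uncorrelated", 0), ("linear", 1), ("quadratic", 2)):
        coef = polyfit(t, y, deg)
        rss = sum((yi - sum(c * ti ** k for k, c in enumerate(coef))) ** 2
                  for ti, yi in zip(t, y))
        score = n * math.log(rss / n + 1e-12) + (deg + 1) * math.log(n)
        if score < best_score:
            best_label, best_score = label, score
    return best_label

t = list(range(20))
print(classify_trend(t, [3.0 * ti + 1.0 for ti in t]))      # linear
print(classify_trend(t, [0.2 * ti ** 2 - ti for ti in t]))  # quadratic
```

The penalty term keeps a higher-degree fit from winning merely by absorbing noise, mirroring the role of significance thresholds in the statistical tests.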
Energy Technology Data Exchange (ETDEWEB)
Sobhani, Hadi; Hassanabadi, Hassan [Shahrood University of Technology, Faculty of Physics, Shahrood (Iran, Islamic Republic of); Chung, Won Sang [Gyeongsang National University, Department of Physics and Research Institute of Natural Science, College of Natural Science, Jinju (Korea, Republic of)
2018-02-15
In this article, we determine the thermodynamical properties of the anharmonic canonical ensemble within the cosmic-string framework. We use the ordinary statistics and the q-deformed superstatistics for this study. The q-deformed superstatistics is derived by modifying the probability density in the original superstatistics. The Schroedinger equation is rewritten in the cosmic-string framework. Next, the anharmonic oscillator is investigated in detail. The wave function and the energy spectrum of the considered system are derived using the bi-confluent Heun functions. In the next step, we first determine the thermodynamical properties for the canonical ensemble of the anharmonic oscillator in the cosmic-string framework using the ordinary statistics approach. Also, these quantities have been obtained in the q-deformed superstatistics. For vanishing deformation parameter, the ordinary results are obtained. (orig.)
On deformations of linear differential systems
Gontsov, R.R.; Poberezhnyi, V.A.; Helminck, G.F.
2011-01-01
This article concerns deformations of meromorphic linear differential systems. Problems relating to their existence and classification are reviewed, and the global and local behaviour of solutions to deformation equations in a neighbourhood of their singular set is analysed. Certain classical
Directory of Open Access Journals (Sweden)
Sheng Wang
2007-10-01
Full Text Available The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requi
Deformation of shape memory alloys associated with twinned domain re-configurations
International Nuclear Information System (INIS)
Liu Yong; Van Humbeeck, J.; Xie Zeliang; Delaey, L.
1999-01-01
Most applications of shape memory alloys (SMAs) imply deformation of martensite; it is therefore one of the fundamental research topics on the shape memory effect. So far, several classifications of the deformation mechanisms have been made as a function of deformation amplitude. However, the deformation details of martensitic SMAs are still not satisfactorily understood, and these classifications need to be refined, because several inconsistencies have recently been found by mechanical testing and transmission electron microscopy (TEM) observations. The present work summarizes some new results on the deformation mechanisms of martensitic NiTi SMAs under tension. As a result, the description of the deformation process of martensite twins as a function of the strain amplitude is refined. (orig.)
Muthu Rama Krishnan, M; Shah, Pratik; Chakraborty, Chandan; Ray, Ajoy K
2012-04-01
The objective of this paper is to provide an improved technique that can assist oncopathologists in correct screening of oral precancerous conditions, especially oral submucous fibrosis (OSF), with significant accuracy on the basis of collagen fibres in the sub-epithelial connective tissue. The proposed scheme is composed of collagen fibre segmentation, textural feature extraction and selection, screening performance enhancement under Gaussian transformation, and finally classification. In this study, collagen fibres are segmented on R, G, B color channels using a back-propagation neural network from 60 normal and 59 OSF histological images, followed by histogram specification to reduce stain intensity variation. Textural features of the collagen area are then extracted using fractal approaches, viz. differential box counting and the Brownian motion curve. Feature selection is done using the Kullback-Leibler (KL) divergence criterion, and the screening performance is evaluated with various statistical tests to confirm Gaussian nature. Here, the screening performance is enhanced under Gaussian transformation of the non-Gaussian features using a hybrid distribution. Moreover, the routine screening is designed based on two statistical classifiers, viz. Bayesian classification and support vector machines (SVM), to classify normal and OSF. It is observed that SVM with a linear kernel function provides better classification accuracy (91.64%) compared to the Bayesian classifier. The addition of fractal features of collagen under Gaussian transformation improves the Bayesian classifier's performance from 80.69% to 90.75%. Results are studied and discussed.
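Box counting, the idea behind the fractal features mentioned above, estimates a fractal dimension from the slope of log N(ε) against log(1/ε). A minimal binary variant on an exactly representable Cantor set illustrates the principle (the paper applies the differential variant to grayscale texture, which this sketch does not reproduce):

```python
import math

def box_dimension(numerators, denom_pow, ks):
    # Box-counting dimension: regress log N(eps) on log(1/eps), where the
    # points are exact rationals n / 3**denom_pow and eps = 3**-k.
    xs, ys = [], []
    for k in ks:
        boxes = {n // 3 ** (denom_pow - k) for n in numerators}
        xs.append(k * math.log(3.0))      # log(1 / eps)
        ys.append(math.log(len(boxes)))   # log N(eps)
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Level-8 Cantor set as exact integer numerators over 3**8 (integer
# arithmetic avoids floating-point box misassignment at box edges).
pts = [0]
for _ in range(8):
    pts = [3 * p for p in pts] + [3 * p + 2 for p in pts]

d = box_dimension(pts, 8, range(1, 7))
print(round(d, 4))  # ~0.6309, i.e. log 2 / log 3
```

For texture classification, the same slope computed over local image blocks becomes a feature value rather than a single global estimate.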
Statistical Fractal Models Based on GND-PCA and Its Application on Classification of Liver Diseases
Directory of Open Access Journals (Sweden)
Huiyan Jiang
2013-01-01
Full Text Available A new method is proposed to establish a statistical fractal model for liver disease classification. Firstly, fractal theory is used to construct a high-order tensor, and then Generalized N-dimensional Principal Component Analysis (GND-PCA) is used to establish the statistical fractal model and select features from the liver region, with different features given different weights; finally, a Support Vector Machine optimized by Ant Colony Optimization (ACO-SVM) is used to build the classifier for the recognition of liver disease. In order to verify the effectiveness of the proposed method, the PCA eigenface method and a standard SVM are chosen as comparison methods. The experimental results show that the proposed method reconstructs liver volume better and improves the classification accuracy of liver diseases.
2010-09-16
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Disease Control and Prevention National Center for Health Statistics (NCHS), Classifications and Public Health Data Standards Staff, Announces the... Public Health Data Standards Staff, NCHS, 3311 Toledo Road, Room 2337, Hyattsville, Maryland 20782, e...
Directory of Open Access Journals (Sweden)
R. Jegadeeshwaran
2015-03-01
Full Text Available In an automobile, the brake system is an essential part responsible for control of the vehicle. Any failure in the brake system impacts the vehicle's motion and can generate catastrophic effects on vehicle and passenger safety. Thus the brake system plays a vital role in an automobile, and condition monitoring of the brake system is essential. Vibration-based condition monitoring using machine learning techniques is gaining momentum. This study is one such attempt to perform condition monitoring of a hydraulic brake system through vibration analysis. In this research, the performance of a Clonal Selection Classification Algorithm (CSCA) for brake fault diagnosis has been reported. A hydraulic brake system test rig was fabricated. Under good and faulty conditions of the brake system, vibration signals were acquired using a piezoelectric transducer. Statistical parameters were extracted from the vibration signal. The best feature set was identified for classification using an attribute evaluator. The selected features were then classified using CSCA. The classification accuracy of this artificial intelligence technique has been compared with other machine learning approaches and discussed. The Clonal Selection Classification Algorithm performs better and gives the maximum classification accuracy (96%) for the fault diagnosis of a hydraulic brake system.
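Extracting statistical parameters from the raw vibration signal is the standard first step in such pipelines; a minimal sketch of a typical feature vector follows (the exact feature set used in the study is not reproduced, and the signal values are arbitrary):

```python
import math

def statistical_features(signal):
    # Common statistical features of a vibration signal (mean, standard
    # deviation, skewness, kurtosis, RMS): the kind of feature vector a
    # classifier such as CSCA would consume.
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    std = math.sqrt(var)
    skew = sum((x - mean) ** 3 for x in signal) / (n * std ** 3)
    kurt = sum((x - mean) ** 4 for x in signal) / (n * std ** 4)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    return {"mean": mean, "std": std, "skew": skew, "kurt": kurt, "rms": rms}

feats = statistical_features([0.1, -0.2, 0.4, -0.1, 0.3, -0.3])
print(sorted(feats))
```

An attribute evaluator would then rank these features and keep only the most discriminative subset before classification.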
Hohn, M. Ed; Nuhfer, E.B.; Vinopal, R.J.; Klanderman, D.S.
1980-01-01
Classifying very fine-grained rocks through fabric elements provides information about depositional environments, but is subject to the biases of visual taxonomy. To evaluate the statistical significance of an empirical classification of very fine-grained rocks, samples from Devonian shales in four cored wells in West Virginia and Virginia were measured for 15 variables: quartz, illite, pyrite and expandable clays determined by X-ray diffraction; total sulfur, organic content, inorganic carbon, matrix density, bulk density, porosity, silt, as well as density, sonic travel time, resistivity, and ??-ray response measured from well logs. The four lithologic types comprised: (1) sharply banded shale, (2) thinly laminated shale, (3) lenticularly laminated shale, and (4) nonbanded shale. Univariate and multivariate analyses of variance showed that the lithologic classification reflects significant differences for the variables measured, difference that can be detected independently of stratigraphic effects. Little-known statistical methods found useful in this work included: the multivariate analysis of variance with more than one effect, simultaneous plotting of samples and variables on canonical variates, and the use of parametric ANOVA and MANOVA on ranked data. ?? 1980 Plenum Publishing Corporation.
Cartwright-Taylor, Alexis; Vallianatos, Filippos; Sammonds, Peter
2014-05-01
We have conducted room-temperature, triaxial compression experiments on samples of Carrara marble, recording concurrently the acoustic and electric current signals emitted during the deformation process as well as mechanical loading information and ultrasonic wave velocities. Our results reveal that in a dry non-piezoelectric rock under simulated crustal pressure conditions, a measurable electric current (nA) is generated within the stressed sample. The current is detected only in the region beyond (quasi-)linear elastic deformation; i.e. in the region of permanent deformation beyond the yield point of the material and in the presence of microcracking. Our results extend to shallow crustal conditions previous observations of electric current signals in quartz-free rocks undergoing uniaxial deformation, and support the idea of a universal electrification mechanism related to deformation. Confining pressure conditions of our slow strain rate (10⁻⁶ s⁻¹) experiments range from the purely brittle regime (10 MPa) to the semi-brittle transition (30-100 MPa), where cataclastic flow is the dominant deformation mechanism. Electric current is generated under all confining pressures, implying the existence of a current-producing mechanism during both microfracture and frictional sliding. Some differences are seen in the current evolution between these two regimes, possibly related to crack localisation. In all cases, the measured electric current exhibits episodes of strong fluctuations over short timescales: calm periods punctuated by bursts of strong activity. For the analysis, we adopt an entropy-based statistical physics approach (Tsallis, 1988), particularly suited to the study of fracture-related phenomena. We find that the probability distribution of normalised electric current fluctuations over short time intervals (0.5 s) can be well described by a q-Gaussian distribution of a form similar to that which describes turbulent flows. This approach yields different entropic
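The q-Gaussian shape referred to above is built from the Tsallis q-exponential, and its heavy-tailed behaviour for q > 1 is easy to verify numerically (a sketch of the standard unnormalized form; the fitted parameters from the experiments are not reproduced):

```python
import math

def q_exponential(x, q):
    # Tsallis q-exponential: e_q(x) = [1 + (1 - q) x]_+ ** (1 / (1 - q)),
    # recovering the ordinary exponential as q -> 1.
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def q_gaussian(x, q, beta):
    # Unnormalized q-Gaussian: the heavy-tailed shape used to describe
    # the electric-current fluctuations (normalization constant omitted).
    return q_exponential(-beta * x * x, q)

# q -> 1 recovers the ordinary Gaussian shape; q > 1 gives heavier tails.
print(q_gaussian(2.0, 1.0, 1.0))                              # exp(-4)
print(q_gaussian(2.0, 1.5, 1.0) > q_gaussian(2.0, 1.0, 1.0))  # True
```

Fitting q and beta to the normalized fluctuation histogram is what yields the entropic index reported in such analyses.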
Statistical-mechanics analysis of Gaussian labeled-unlabeled classification problems
International Nuclear Information System (INIS)
Tanaka, Toshiyuki
2013-01-01
The labeled-unlabeled classification problem in semi-supervised learning is studied via a statistical-mechanics approach. We analytically investigate the performance of a learner with an equal-weight mixture of two symmetrically-located Gaussians, performing posterior mean estimation of the parameter vector on the basis of a dataset consisting of labeled and unlabeled data generated from the same probability model as that assumed by the learner. Under the assumption of replica symmetry, we have analytically obtained a set of saddle-point equations, which allows us to numerically evaluate the performance of the learner. On the basis of the analytical result we have observed interesting phenomena, in particular the coexistence of good and bad solutions, which may happen when the number of unlabeled data is relatively large compared with that of labeled data
Sreejith, Sreevarsha; Pereverzyev, Sergiy, Jr.; Kelvin, Lee S.; Marleau, Francine R.; Haltmeier, Markus; Ebner, Judith; Bland-Hawthorn, Joss; Driver, Simon P.; Graham, Alister W.; Holwerda, Benne W.; Hopkins, Andrew M.; Liske, Jochen; Loveday, Jon; Moffett, Amanda J.; Pimbblet, Kevin A.; Taylor, Edward N.; Wang, Lingyu; Wright, Angus H.
2018-03-01
We apply four statistical learning methods to a sample of 7941 galaxies (z < …) to test the feasibility of using automated algorithms to classify galaxies. Using 10 features measured for each galaxy (sizes, colours, shape parameters, and stellar mass), we apply the techniques of Support Vector Machines, Classification Trees, Classification Trees with Random Forest (CTRF) and Neural Networks, returning True Prediction Ratios (TPRs) of 75.8 per cent, 69.0 per cent, 76.2 per cent, and 76.0 per cent, respectively. Those occasions whereby all four algorithms agree with each other yet disagree with the visual classification (`unanimous disagreement') serve as a potential indicator of human error in classification, occurring in ˜9 per cent of ellipticals, ˜9 per cent of little blue spheroids, ˜14 per cent of early-type spirals, ˜21 per cent of intermediate-type spirals, and ˜4 per cent of late-type spirals and irregulars. We observe that the choice of parameters rather than that of algorithms is more crucial in determining classification accuracy. Due to its simplicity in formulation and implementation, we recommend the CTRF algorithm for classifying future galaxy data sets. Adopting the CTRF algorithm, the TPRs of the five galaxy types are: E, 70.1 per cent; LBS, 75.6 per cent; S0-Sa, 63.6 per cent; Sab-Scd, 56.4 per cent; and Sd-Irr, 88.9 per cent. Further, we train a binary classifier using this CTRF algorithm that divides galaxies into spheroid-dominated (E, LBS, and S0-Sa) and disc-dominated (Sab-Scd and Sd-Irr), achieving an overall accuracy of 89.8 per cent. This translates into an accuracy of 84.9 per cent for spheroid-dominated systems and 92.5 per cent for disc-dominated systems.
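As a minimal illustration of the True Prediction Ratio (TPR), the fraction of objects whose predicted class matches the reference classification, here is a toy nearest-centroid classifier on synthetic two-class "galaxy" features. The features and class separations are invented for illustration; the paper's actual classifiers are SVMs, trees, random forests and neural networks:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two synthetic classes in a 3-feature space (toy stand-ins for size, colour, mass).
n = 200
X0 = rng.normal([0.0, 0.0, 0.0], 1.0, (n, 3))
X1 = rng.normal([2.5, 2.5, 2.5], 1.0, (n, 3))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n, int), np.ones(n, int)]

# Random train/test split, then classify by nearest class centroid.
idx = rng.permutation(2 * n)
tr, te = idx[:300], idx[300:]
cent = np.array([X[tr][y[tr] == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[te, None, :] - cent[None]) ** 2).sum(-1), axis=1)

tpr = (pred == y[te]).mean()   # True Prediction Ratio: fraction classified correctly
```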
Indian Academy of Sciences (India)
Pramana – Journal of Physics, Volume 64, Issue 3. Keywords: nonlinear dynamics; logistic map; q-deformation; Tsallis statistics. ... As a specific example, a q-deformation procedure is applied to the logistic map. Compared ...
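As an illustration of the idea, one can q-deform the logistic map by replacing x with a deformed number [x]_q before applying the map. The Heine q-number used below is one common choice, assumed here for illustration; the specific deformation scheme in the article may differ:

```python
import numpy as np

def q_number(x, q):
    """Heine q-deformation of x: [x]_q = (1 - q**x) / (1 - q); [x]_q -> x as q -> 1."""
    return (1.0 - q**x) / (1.0 - q)

def q_logistic(x, a, q):
    """Logistic map applied to the q-deformed value of x."""
    xs = q_number(x, q)
    return a * xs * (1.0 - xs)

# Iterate the deformed map; for q in (0, 1) and x in [0, 1], [x]_q stays in [0, 1],
# so the orbit remains confined to the unit interval for a <= 4.
x = 0.3
orbit = []
for _ in range(1000):
    x = q_logistic(x, 3.7, 0.9)
    orbit.append(x)
orbit = np.array(orbit)
```

Varying q then deforms the bifurcation structure of the ordinary logistic map, which is the kind of behaviour such studies examine.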
Schildkrout, Barbara
2016-10-01
A new nosology for mental disorders is needed as a basis for effective scientific inquiry. Diagnostic and Statistical Manual of Mental Disorders and International Classification of Diseases diagnoses are not natural, biological categories, and these diagnostic systems do not address mental phenomena that exist on a spectrum. Advances in neuroscience offer the hope of breakthroughs for diagnosing and treating major mental illness in the future. At present, a neuroscience-based understanding of brain/behavior relationships can reshape clinical thinking. Neuroscience literacy allows psychiatrists to formulate biologically informed psychological theories, to follow neuroscientific literature pertinent to psychiatry, and to embark on a path toward neurologically informed clinical thinking that can help move the field away from Diagnostic and Statistical Manual of Mental Disorders and International Classification of Diseases conceptualizations. Psychiatrists are urged to work toward attaining neuroscience literacy to prepare for and contribute to the development of a new nosology.
Directory of Open Access Journals (Sweden)
Hua-Jun Wang
Full Text Available BACKGROUND: Abnormal posture and spinal mobility have been demonstrated to cause functional impairment in quality of life, especially in the postmenopausal osteoporotic population. Most literature studies focus on either thoracic kyphosis or lumbar lordosis, but not on changes of the entire spinal alignment. Very few articles have reported the spinal alignment of Chinese people. The purpose of this study was threefold: to classify the spinal curvature based on the classification system defined by Satoh, which considers the entire spine alignment; to identify changes of trunk mobility; and to relate spinal curvature to balance disorder in a Chinese population. METHODOLOGY/PRINCIPAL FINDINGS: 450 osteoporotic volunteers were recruited for this study. Spinal range of motion and global curvature were evaluated noninvasively using the Spinal-Mouse® system, and sagittal postural deformities were characterized. RESULTS: We found a new spine postural alignment, consisting of an increased thoracic kyphosis and decreased lumbar lordosis, which we classified as our modified round back. We did not find any of Satoh's type 5 classification in our population. Type 2 sagittal alignment was the most common spinal deformity (38.44%). In standing, thoracic kyphosis angles in types 2 (58.34°) and 3 (58.03°) were the largest, and lumbar lordosis angles in types 4 (13.95°) and 5 (−8.61°) were the smallest. The range of flexion (ROF) and range of flexion-extension (ROFE) of types 2 and 3 were usually greater than those of types 4 and 5, with type 1 being the largest. CONCLUSIONS/SIGNIFICANCE: The present study classified and compared for the first time the mobility, curvature and balance in a Chinese population based on the entire spine alignment, and found types 4 and 5 to present the worst balance and mobility. This study included a new spine postural alignment classification that should be considered in future population studies.
Coding and classification in drug statistics – From national to global application
Directory of Open Access Journals (Sweden)
Marit Rønning
2009-11-01
SUMMARY: The Anatomical Therapeutic Chemical (ATC) classification system and the defined daily dose (DDD) were developed in Norway in the early seventies. The creation of the ATC/DDD methodology was an important basis for presenting drug utilisation statistics in a sensible way. In 1977, Norway was also the first country to publish national drug utilisation statistics from wholesalers on an annual basis. The combination of these activities in Norway in the seventies made us a pioneer country in the area of drug utilisation research. Over the years, the use of the ATC/DDD methodology has gradually increased in countries outside Norway. Since 1996, the methodology has been recommended by WHO for use in international drug utilisation studies. The WHO Collaborating Centre for Drug Statistics Methodology in Oslo handles the maintenance and development of the ATC/DDD system. The Centre is now responsible for the global co-ordination. After nearly 30 years of experience with ATC/DDD, the methodology has demonstrated its suitability in drug use research. The main challenge in the coming years is to educate users worldwide in how to use the methodology properly.
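The DDD methodology reduces drug utilisation to a simple, comparable rate. A minimal sketch of the standard calculation, with all figures hypothetical:

```python
# DDDs per 1000 inhabitants per day for a drug with an assumed DDD of 10 mg.
total_mg_sold = 5_000_000_000     # hypothetical annual wholesale figure, in mg
ddd_mg = 10.0                     # assumed defined daily dose for this drug
population = 5_000_000            # hypothetical population
days = 365

n_ddd = total_mg_sold / ddd_mg                                  # number of DDDs sold
ddd_per_1000_per_day = n_ddd / (population * days) * 1000.0     # utilisation rate
```

Expressing sales as DDDs per 1000 inhabitants per day is what makes wholesale statistics comparable across drugs, formulations and countries.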
Stable classification of the energy-momentum tensor. Summary
International Nuclear Information System (INIS)
Guzman-Sanchez, A.R.; Przanowski, M.; Plevansky, J.
1990-01-01
Starting from the algebraic classification of the energy-momentum tensor given by Plebansky, it is established that this classification is unstable under versal deformations, and a new (stable) classification is given. In order to keep the text to a reasonable length, we present only the basic ideas and some results. (Author)
Ranaie, Mehrdad; Soffianian, Alireza; Pourmanafi, Saeid; Mirghaffari, Noorollah; Tarkesh, Mostafa
2018-03-01
In the past decade, analysis of remotely sensed imagery has become one of the most common and widely used procedures in environmental studies, with supervised image classification techniques playing a central role. Hence, using a high-resolution Worldview-3 image over a mixed urbanized landscape in Iran, three less commonly applied image classification methods, namely bagged CART, a stochastic gradient boosting model and a neural network with feature extraction, were tested and compared with two prevalent methods: random forest and a support vector machine with a linear kernel. To do so, each method was run ten times, and three validation techniques were used to estimate the accuracy statistics: cross-validation, independent validation and validation with the total training data. Moreover, using ANOVA and Tukey's test, the statistical significance of the differences between the classification methods was assessed. In general, the results showed that random forest, with a marginal difference compared to bagged CART and the stochastic gradient boosting model, is the best performing method, whilst based on independent validation there was no significant difference between the performances of the classification methods. It should finally be noted that the neural network with feature extraction and the linear support vector machine had better processing speed than the others.
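The comparison strategy described, repeated runs per classifier followed by an analysis of variance across methods, can be sketched as follows. The accuracy values are synthetic stand-ins; a Tukey HSD test would follow the same pattern on the same arrays:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Ten accuracy scores per method (hypothetical values mimicking repeated runs).
rf   = rng.normal(0.90, 0.01, 10)   # e.g. "random forest"
cart = rng.normal(0.89, 0.01, 10)   # e.g. "bagged CART"
svm  = rng.normal(0.85, 0.01, 10)   # e.g. "linear SVM"

# One-way ANOVA: do the mean accuracies differ across methods?
f_stat, p_value = stats.f_oneway(rf, cart, svm)
```

A small p-value only says that at least one method differs; a post-hoc pairwise test (such as Tukey's) is then needed to say which pairs differ, which is exactly the two-stage procedure the study uses.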
Liu, Yuan; D'Haese, Pierre-Francois; Dawant, Benoit M.
2014-03-01
Deep brain stimulation, which is used to treat various neurological disorders, involves implanting a permanent electrode into precise targets deep in the brain. Accurate pre-operative localization of the targets on pre-operative MRI sequences is challenging, as these are typically located in homogeneous regions with poor contrast. Population-based statistical atlases can assist with this process. Such atlases are created by acquiring the location of efficacious regions from numerous subjects and projecting them onto a common reference image volume using some normalization method. In previous work, we presented results concluding that non-rigid registration provided the best result for such normalization. However, this process could be biased by the choice of the reference image and/or registration approach. In this paper, we have qualitatively and quantitatively compared the performance of six recognized deformable registration methods at normalizing such data in poorly contrasted regions onto three different reference volumes, using a unique set of data from 100 patients. We study various metrics designed to measure the centroid, spread, and shape of the normalized data. This study leads to a total of 1800 deformable registrations, and results show that statistical atlases constructed using different deformable registration methods share comparable centroids and spreads, with marginal differences in their shape. Among the six methods being studied, Diffeomorphic Demons produces the largest spreads and centroids that are, in general, the furthest apart from the others. Among the three atlases, one atlas consistently outperforms the other two, with smaller spreads for each algorithm. However, none of the differences in the spreads were found to be statistically significant, across different algorithms or across different atlases.
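The centroid and spread metrics for a cloud of normalized points can be computed directly. This is a sketch of the general idea; the paper's exact metric definitions may differ:

```python
import numpy as np

def centroid_and_spread(points):
    """Centroid of a point cloud and the mean Euclidean distance to that centroid."""
    c = points.mean(axis=0)
    spread = np.linalg.norm(points - c, axis=1).mean()
    return c, spread

# Toy example: four normalized target locations (mm) in a common reference space.
pts_a = np.array([[0., 0., 0.], [2., 0., 0.], [0., 2., 0.], [0., 0., 2.]])
c_a, s_a = centroid_and_spread(pts_a)
```

Comparing atlases then amounts to comparing these summary numbers: a registration method that maps efficacious locations more consistently yields a smaller spread around the centroid.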
Hoell, Simon; Omenzetter, Piotr
2017-04-01
The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows controlling the performance by varying the number of samples which represent the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and principal component analysis scores from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring of WTBs.
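Partial autocorrelation coefficients, the damage-sensitive features used above, can be estimated from a response signal with the Durbin-Levinson recursion. The sketch below (illustrative, not the authors' implementation) checks itself on a synthetic AR(1) process, whose PACC is large at lag 1 and near zero beyond:

```python
import numpy as np

def pacf(x, nlags):
    """Partial autocorrelation coefficients via the Durbin-Levinson recursion."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(nlags + 1)])
    rho = r / r[0]                       # autocorrelation function
    phi = np.zeros((nlags + 1, nlags + 1))
    pac = np.zeros(nlags + 1)
    pac[0] = 1.0
    phi[1, 1] = pac[1] = rho[1]
    for k in range(2, nlags + 1):
        num = rho[k] - np.dot(phi[k - 1, 1:k], rho[k - 1:0:-1])
        den = 1.0 - np.dot(phi[k - 1, 1:k], rho[1:k])
        phi[k, k] = pac[k] = num / den
        phi[k, 1:k] = phi[k - 1, 1:k] - pac[k] * phi[k - 1, k - 1:0:-1]
    return pac

# Synthetic AR(1) "vibration response" with coefficient 0.6.
rng = np.random.default_rng(3)
e = rng.standard_normal(5000)
sig = np.zeros(5000)
for t in range(1, 5000):
    sig[t] = 0.6 * sig[t - 1] + e[t]
pac_est = pacf(sig, 5)
```

Because the PACC of an AR process cuts off sharply after the model order, shifts in these coefficients are a compact signature of a change in the structure's dynamics.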
Zhang, Y.; Li, F.; Zhang, S.; Hao, W.; Zhu, T.; Yuan, L.; Xiao, F.
2017-09-01
In this paper, a Statistical-Distribution-based Conditional Random Field (STA-CRF) algorithm is exploited to improve marginal ice-water classification. Pixel-level ice concentration is presented as part of the comparison of the CRF-based methods. Furthermore, in order to find an effective statistical distribution model to integrate into the STA-CRF, five statistical distribution models are investigated. The STA-CRF methods are tested on two scenes around Prydz Bay and the Adélie Depression, which contain a variety of ice types during the melt season. Experimental results indicate that the proposed method can resolve the sea ice edge well in the Marginal Ice Zone (MIZ) and shows a robust distinction between ice and water.
Alves, Gelio; Wang, Guanghui; Ogurtsov, Aleksey Y; Drake, Steven K; Gucek, Marjan; Sacks, David B; Yu, Yi-Kuo
2018-06-05
Rapid and accurate identification and classification of microorganisms is of paramount importance to public health and safety. With the advance of mass spectrometry (MS) technology, the speed of identification can be greatly improved. However, the increasing number of sequenced microbes is complicating correct microbial identification, even in a simple sample, due to the large number of candidates present. To properly disentangle candidate microbes in samples containing one or more microbes, one needs to go beyond apparent morphology or simple "fingerprinting"; to correctly prioritize the candidate microbes, one needs accurate statistical significance in microbial identification. We meet these challenges by using peptide-centric representations of microbes to better separate them, and by augmenting our earlier analysis method that yields accurate statistical significance. Here, we present an updated analysis workflow that uses tandem MS (MS/MS) spectra for microbial identification or classification. We have demonstrated, using 226 publicly available MS/MS data files (each containing from 2500 to nearly 100,000 MS/MS spectra) and 4000 additional MS/MS data files, that the updated workflow can correctly identify multiple microbes at the genus, and often the species, level for samples containing more than one microbe. We have also shown that the proposed workflow computes accurate statistical significances, i.e., E values for identified peptides and unified E values for identified microbes. Our updated analysis workflow MiCId, freely available software for Microorganism Classification and Identification, is available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads.html .
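The E value mentioned above is, in its simplest form, the expected number of equally good random matches given the size of the candidate pool; MiCId's actual computation is more elaborate, so the sketch below only illustrates the p-to-E conversion idea:

```python
def e_value(p_value, n_candidates):
    """Expected number of false hits scoring this well by chance: E = p * N."""
    return p_value * n_candidates

# A match with p = 1e-6 searched against 4000 candidate organisms (hypothetical numbers).
e = e_value(1e-6, 4000)
```

An E value well below 1 means the match is unlikely to be a chance hit even after accounting for the number of candidates searched, which is why E values, not raw p-values, are used to prioritize candidate microbes.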
Leydesdorff, Loet; Kogler, Dieter Franz; Yan, Bowen
2017-01-01
The Cooperative Patent Classification (CPC), recently developed cooperatively by the European and US Patent Offices, provides a new basis for mapping patents and portfolio analysis. CPC replaces the International Patent Classification (IPC) of the World Intellectual Property Organization. In this study, we update our routines previously based on IPC to CPC and use the occasion to rethink various parameter choices. The new maps are significantly different from the previous ones, although this may not always be obvious on visual inspection. We provide nested maps online and a routine for generating portfolio overlays on the maps; a new tool is provided for "difference maps" between the patent portfolios of organizations or firms. This is illustrated by comparing the portfolios of patents granted to two competing firms, Novartis and MSD, in 2016. Furthermore, the data are organized for the purpose of statistical analysis.
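A "difference map" between two portfolios can be sketched as the difference of their normalized class-share vectors. The counts below are hypothetical, and real portfolios are broken down far more finely than the five CPC sections used here:

```python
import numpy as np

# Hypothetical counts of granted patents per CPC section for two firms.
sections = ["A", "B", "C", "G", "H"]
firm_1 = np.array([120., 30., 400., 80., 10.])
firm_2 = np.array([200., 25., 250., 60., 90.])

share_1 = firm_1 / firm_1.sum()
share_2 = firm_2 / firm_2.sum()
difference_map = share_1 - share_2          # positive: over-represented in firm 1

# Overall portfolio similarity as cosine similarity of the raw count vectors.
cosine = firm_1 @ firm_2 / (np.linalg.norm(firm_1) * np.linalg.norm(firm_2))
```

Normalizing to shares before differencing removes the effect of overall portfolio size, so the map highlights where the two firms' technological emphases diverge.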
International Nuclear Information System (INIS)
Vaganov, P.A.; Kol'tsov, A.A.; Kulikov, V.D.; Mejer, V.A.
1983-01-01
Multi-element instrumental neutron activation analysis was performed on rock samples (sandstones, aleurolites (siltstones) and shales from a gold deposit). The spectra of the irradiated samples were measured with a Ge(Li) detector with a volume of 35 mm³. The content of 22 chemical elements was determined in each sample. The results of the analysis serve as a reliable basis for multi-dimensional statistical information processing; they constitute the basis for generalized characteristics of the rocks, which leads to the solution of the classification problem for rocks from different deposits.
Deformed exterior algebra, quons and their coherent states
International Nuclear Information System (INIS)
El Baz, M.; Hassouni, Y.
2002-08-01
We review the notion of the deformation of the exterior wedge product. This allows us to construct the deformation of the algebra of exterior forms over a vector space and also over an arbitrary manifold. We relate this approach to generalized statistics, and we study quons as a particular case of these generalized statistics. We also give their statistical properties. A large part of the work is devoted to the problem of constructing coherent states for the deformed oscillators. We review all the approaches existing in the literature concerning this point and illustrate them with many examples. (author)
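The q-deformed integers and factorials underlying such deformed oscillators are easy to compute, and the coherent-state normalisation is the corresponding q-exponential series. A sketch, using one common convention for [n]_q (assumed here; other conventions exist):

```python
import numpy as np

def q_int(n, q):
    """q-deformed integer [n]_q = (1 - q**n) / (1 - q); reduces to n as q -> 1."""
    return (1.0 - q**n) / (1.0 - q)

def q_factorial(n, q):
    """q-deformed factorial [n]_q! = [1]_q [2]_q ... [n]_q."""
    out = 1.0
    for k in range(1, n + 1):
        out *= q_int(k, q)
    return out

def coherent_norm_sq(z, q, nmax=200):
    """<z|z> = sum_n |z|^(2n) / [n]_q!  (truncated q-exponential series)."""
    return sum(abs(z) ** (2 * n) / q_factorial(n, q) for n in range(nmax + 1))

# In the undeformed limit q -> 1 this recovers the bosonic result exp(|z|^2).
val = coherent_norm_sq(1.0, 0.999999)
```

Replacing n by [n]_q in the factorial is exactly how the bosonic coherent-state construction is carried over to the deformed oscillator.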
An On-Chip RBC Deformability Checker Significantly Improves Velocity-Deformation Correlation
Directory of Open Access Journals (Sweden)
Chia-Hung Dylan Tsai
2016-10-01
An on-chip deformability checker is proposed to improve the velocity-deformation correlation for red blood cell (RBC) evaluation. RBC deformability has been found to be related to human diseases, and can be evaluated from RBC velocity through a microfluidic constriction, as in conventional approaches. The correlation between transit velocity and amount of deformation provides statistical information on RBC deformability. However, such correlations are usually only moderate, or even weak, in practical evaluations, due to the limited range of RBC deformation. To solve this issue, we implemented three constrictions of different widths in the proposed checker, so that three different deformation regions can be applied to the RBCs. By considering the cell responses from the three regions as a whole, we practically extend the range of cell deformation in the evaluation, and could resolve the issue of the limited range of RBC deformation. RBCs from five volunteer subjects were tested using the proposed checker. The results show that the correlation between cell deformation and transit velocity is significantly improved by the proposed deformability checker. The absolute values of the correlation coefficients are increased from an average of 0.54 to 0.92. The effects of cell size, shape and orientation on the evaluation are discussed according to the experimental results. The proposed checker is expected to be useful for RBC evaluation in medical practice.
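The statistical effect the checker exploits, that pooling responses from several deformation regions widens the deformation range and strengthens the velocity-deformation correlation, can be reproduced on synthetic data (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_band(lo, hi, n=100):
    """Cells probed in one constriction: deformation limited to a narrow band."""
    deformation = rng.uniform(lo, hi, n)
    velocity = 2.0 * deformation + rng.normal(0.0, 0.5, n)   # linear trend + noise
    return deformation, velocity

# Three constrictions of different widths -> three separated deformation bands.
bands = [simulate_band(0.0, 0.5), simulate_band(1.0, 1.5), simulate_band(2.0, 2.5)]
r_single = [np.corrcoef(d, v)[0, 1] for d, v in bands]

d_all = np.concatenate([d for d, _ in bands])
v_all = np.concatenate([v for _, v in bands])
r_pooled = np.corrcoef(d_all, v_all)[0, 1]
```

Within each narrow band the trend is masked by noise (range restriction attenuates the correlation), while the pooled data span a wide deformation range and recover a strong correlation, mirroring the improvement from 0.54 to 0.92 reported above.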
Classification of Malaysia aromatic rice using multivariate statistical analysis
Energy Technology Data Exchange (ETDEWEB)
Abdullah, A. H.; Adom, A. H.; Shakaff, A. Y. Md; Masnan, M. J.; Zakaria, A.; Rahim, N. A. [School of Mechatronic Engineering, Universiti Malaysia Perlis, Kampus Pauh Putra, 02600 Arau, Perlis (Malaysia); Omar, O. [Malaysian Agriculture Research and Development Institute (MARDI), Persiaran MARDI-UPM, 43400 Serdang, Selangor (Malaysia)
2015-05-15
Aromatic rice (Oryza sativa L.) is considered the best quality premium rice. These varieties are preferred by consumers because of criteria such as shape, colour, distinctive aroma and flavour. The price of aromatic rice is higher than that of ordinary rice due to the special growth conditions it requires, for instance a specific climate and soil. Presently, aromatic rice quality is identified using its key elements and isotopic variables. The rice can also be classified via Gas Chromatography Mass Spectrometry (GC-MS) or by human sensory panels. However, the use of human sensory panels has significant drawbacks, such as lengthy training time, proneness to fatigue as the number of samples increases, and inconsistency. The GC-MS analysis techniques, on the other hand, require detailed procedures and lengthy analysis, and are quite costly. This paper presents the application of an in-house developed Electronic Nose (e-nose) to classify new aromatic rice varieties. The e-nose is used to classify the variety of aromatic rice based on the samples' odour. The samples were taken from several rice varieties. The instrument utilizes multivariate statistical data analysis, including Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA) and K-Nearest Neighbours (KNN), to classify the unknown rice samples. The Leave-One-Out (LOO) validation approach is applied to evaluate the ability of KNN to perform recognition and classification of the unspecified samples. Visual observation of the PCA and LDA plots of the rice shows that the instrument was able to separate the samples into different clusters accordingly. The results of LDA and KNN, with low misclassification error, support the above findings, and we may conclude that the e-nose is successfully applied to the classification of the aromatic rice varieties.
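The KNN-with-LOO part of such a pipeline is easy to sketch in a self-contained way. Synthetic "sensor" data stand in for real e-nose readings, and the actual study also uses PCA and LDA:

```python
import numpy as np

rng = np.random.default_rng(5)
# Toy e-nose readings: 3 rice classes, 20 samples each, 8 sensor channels.
n_per, n_sensors = 20, 8
centers = rng.normal(0.0, 3.0, (3, n_sensors))
X = np.vstack([c + rng.normal(0.0, 0.5, (n_per, n_sensors)) for c in centers])
y = np.repeat(np.arange(3), n_per)

def knn_loo_accuracy(X, y, k=3):
    """Leave-one-out validation of a k-nearest-neighbour classifier."""
    correct = 0
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # leave sample i out
        votes = y[np.argsort(d)[:k]]           # labels of the k nearest neighbours
        correct += np.bincount(votes).argmax() == y[i]
    return correct / len(X)

acc = knn_loo_accuracy(X, y)
```

LOO validation is attractive for small chemometric datasets like this one because every sample is used for testing exactly once without sacrificing training data.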
A Statistical Model of Head Asymmetry in Infants with Deformational Plagiocephaly
DEFF Research Database (Denmark)
Lanche, Stéphanie; Darvann, Tron Andre; Ólafsdóttir, Hildur
2007-01-01
Deformational plagiocephaly is a term describing cranial asymmetry and deformation commonly seen in infants. The purpose of this work was to develop a methodology for assessment and modelling of head asymmetry. The clinical population consisted of 38 infants for whom 3-dimensional surface scans...... quantitative description of the asymmetry present in the dataset....
International Nuclear Information System (INIS)
Bojtsov, V.V.; Tsepin, M.A.; Karpilyanskij, N.N.; Ershov, A.N.
1982-01-01
Results are given of a statistical analysis of how accurately the S-shaped superplasticity curve is described by different analytic expressions suggested on the basis of phenomenological and metallophysical concepts of the nature of superplastic deformation. Experimental investigations into the dependence of flow stress on deformation rate were conducted on the VT3-1 two-phase titanium alloy. Test samples were cut from a rod, 30 mm in diameter, produced by lengthwise rolling in the α+β region. The optimal temperature for the manifestation of superplasticity was determined by the stress relaxation method, from the value of the relaxation time to a given stress. It was established that the Smirnov phenomenological equation best describes the rate dependence of the flow stress of a superplastic material. This equation can be used in problems of studying the mechanism and physical nature of superplastic deformation, and in analysing the strain-stress state and the structure of the deformation zone during pressure shaping of superplastic materials, where a considerably wide range (within 7-8 orders of magnitude) of deformation rate variation takes place.
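Fitting an S-shaped relation between log flow stress and log strain rate, the kind of comparison performed above, can be sketched with a generic logistic form. This is a generic sigmoid for illustration, not the Smirnov equation itself, and all parameter values are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(log_rate, s_min, s_max, centre, width):
    """Generic S-shaped log(stress) vs log(strain rate) relation."""
    return s_min + (s_max - s_min) / (1.0 + np.exp(-(log_rate - centre) / width))

# Synthetic measurements over ~5 decades of strain rate, with small scatter.
rng = np.random.default_rng(6)
log_rate = np.linspace(-6, -1, 30)            # log10 of strain rate, 1/s
true = (0.5, 2.5, -3.5, 0.6)
log_stress = sigmoid(log_rate, *true) + rng.normal(0, 0.02, log_rate.size)

# Least-squares fit; goodness of fit across candidate equations is the
# kind of statistical comparison the study performs.
popt, _ = curve_fit(sigmoid, log_rate, log_stress, p0=(0.0, 3.0, -3.0, 1.0))
```

Comparing residuals of several such candidate expressions over a wide strain-rate range is what distinguishes the best-fitting phenomenological equation.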
International Nuclear Information System (INIS)
Thörnqvist, Sara; Hysing, Liv B.; Zolnay, Andras G.; Söhn, Matthias; Hoogeman, Mischa S.; Muren, Ludvig P.; Bentzen, Lise; Heijmen, Ben J.M.
2013-01-01
Background and purpose: Deformation and correlated target motion remain challenges for margin recipes in radiotherapy (RT). This study presents a statistical deformable motion model for multiple targets and applies it to margin evaluations for locally advanced prostate cancer, i.e. RT of the prostate (CTV-p), seminal vesicles (CTV-sv) and pelvic lymph nodes (CTV-ln). Material and methods: The 19 patients included in this study all had 7–10 repeat CT-scans available, which were rigidly aligned with the planning CT-scan using intra-prostatic implanted markers, followed by deformable registrations. The displacement vectors from the deformable registrations were used to create patient-specific statistical motion models. The models were applied in treatment simulations to determine probabilities for adequate target coverage, e.g. by establishing distributions of the accumulated dose to 99% of the target volumes (D99) for various CTV-PTV expansions in the planning-CTs. Results: The method allowed for estimation of the expected accumulated dose and its variance for different DVH parameters for each patient. Simulations of inter-fractional motion resulted in 7, 10, and 18 patients with an average D99 > 95% of the prescribed dose for CTV-p expansions of 3 mm, 4 mm and 5 mm, respectively. For CTV-sv and CTV-ln, expansions of 3 mm, 5 mm and 7 mm resulted in 1, 11 and 15 vs. 8, 18 and 18 patients, respectively, with an average D99 > 95% of the prescription. Conclusions: Treatment simulations of target motion revealed large individual differences in accumulated dose, mainly for CTV-sv, which demanded the largest margins, whereas those required for CTV-p and CTV-ln were comparable.
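The treatment-simulation idea, sampling per-fraction target displacements and asking how often a given CTV-PTV margin keeps the target covered, can be illustrated with a one-dimensional toy model. The real study uses patient-specific deformable motion models and dose accumulation, not this simple geometric criterion:

```python
import numpy as np

rng = np.random.default_rng(7)

def coverage_probability(margin, sigma, n_fractions=35, n_sim=2000):
    """Fraction of simulated treatment courses in which the target centre lies
    inside the margin in at least 99% of fractions (1-D toy model).

    margin, sigma in mm; per-fraction shifts drawn as N(0, sigma)."""
    shifts = rng.normal(0.0, sigma, (n_sim, n_fractions))
    inside = (np.abs(shifts) <= margin).mean(axis=1)   # per-course coverage fraction
    return (inside >= 0.99).mean()

# Coverage for 3 mm vs 5 mm margins, assuming 2 mm random inter-fraction motion.
p3 = coverage_probability(3.0, 2.0)
p5 = coverage_probability(5.0, 2.0)
```

Even this toy model reproduces the qualitative finding above: coverage probability rises steeply with margin size, and the margin needed depends strongly on the motion amplitude of the structure in question.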
Rotary deformity in degenerative spondylolisthesis
International Nuclear Information System (INIS)
Kang, Sung Gwon; Kim, Jeong; Kho, Hyen Sim; Yun, Sung Su; Oh, Jae Hee; Byen, Ju Nam; Kim, Young Chul
1994-01-01
We conducted this study to determine whether degenerative spondylolisthesis involves rotary deformity in addition to forward displacement. We statistically analyzed the difference in rotary deformity between a study group of 31 patients with symptomatic degenerative spondylolisthesis and a control group of 31 subjects without any symptoms. We also reviewed the CT findings in 15 members of the study group. The mean rotary deformity in the study group was 6.1° (standard deviation 5.20), and the mean rotary deformity in the control group was 2.52° (standard deviation 2.16) (p < 0.01). Rotary deformity can accompany degenerative spondylolisthesis. We may consider rotary deformity as a cause of symptomatic degenerative spondylolisthesis in cases where no other cause is detected.
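The group comparison reported above can be reproduced in outline with a two-sample test on simulated angles matching the published means and standard deviations. The individual values are synthetic; only the summary statistics come from the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# Simulated rotary deformity angles (degrees) matching the reported group statistics.
study   = rng.normal(6.10, 5.20, 31)   # symptomatic degenerative spondylolisthesis
control = rng.normal(2.52, 2.16, 31)   # asymptomatic controls

# Welch's t-test: appropriate here because the group variances clearly differ.
t_stat, p = stats.ttest_ind(study, control, equal_var=False)
```

Welch's variant is the natural choice when, as here, one group's standard deviation is more than twice the other's, since the pooled-variance t-test assumes equal variances.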
Directory of Open Access Journals (Sweden)
Tristan J Webb
2014-04-01
When we see a human sitting down, standing up, or walking, we can recognise one of these poses independently of the individual, or we can recognise the individual person independently of the pose. The same issues arise for deforming objects. For example, if we see a flag deformed by the wind, either blowing out or hanging languidly, we can usually recognise the flag independently of its deformation; or we can recognise the deformation independently of the identity of the flag. We hypothesize that these types of recognition can be implemented by the primate visual system using the temporo-spatial continuity of objects as they transform as a learning principle. In particular, we hypothesize that pose or deformation can be learned under conditions in which large numbers of different people are successively seen in the same pose, or objects in the same deformation. We also hypothesize that person-specific representations that are independent of pose, and object-specific representations that are independent of deformation and view, can be built when individual people or objects are observed successively transforming from one pose or deformation and view to another. These hypotheses were tested in a simulation of the ventral visual system, VisNet, which uses temporal continuity, implemented in a synaptic learning rule with a short-term memory trace of previous neuronal activity, to learn invariant representations. It was found that, depending on the statistics of the visual input, either pose-specific or deformation-specific representations could be built that were invariant with respect to individual and view; or identity-specific representations could be built that were invariant with respect to pose or deformation and view. We propose that this is how pose-specific and pose-invariant, and deformation-specific and deformation-invariant, perceptual representations are built in the brain.
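The short-term memory trace rule at the heart of VisNet can be sketched in a few lines: the weight update is driven by a trace of recent postsynaptic activity, so inputs that follow each other in time (e.g. the same pose shown on successive individuals) are bound onto the same output unit. This is a single-unit toy under simplifying assumptions, not the full multi-layer VisNet model:

```python
import numpy as np

rng = np.random.default_rng(9)

def trace_learning(patterns, alpha=0.1, eta=0.8):
    """Hebbian learning with a short-term memory trace of postsynaptic activity."""
    n_in = patterns.shape[1]
    w = rng.normal(0.0, 0.01, n_in)
    ybar = 0.0
    for x in patterns:
        y = float(w @ x)
        ybar = (1.0 - eta) * y + eta * ybar   # trace of recent activity
        w += alpha * ybar * x                 # trace rule weight update
        w /= np.linalg.norm(w)                # normalise to keep weights bounded
    return w

# One "pose" presented across successive individuals: a shared input component
# (the first 10 inputs) plus individual-specific noise.
pose = np.zeros(50)
pose[:10] = 1.0
patterns = np.array([pose + rng.normal(0.0, 0.1, 50) for _ in range(100)])
w = trace_learning(patterns)
```

Because the trace carries over between successive presentations, the unit's weights come to emphasise the component that persists across individuals (the pose), which is the mechanism proposed above for pose-specific, identity-invariant representations.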
Classification of lung sounds using higher-order statistics: A divide-and-conquer approach.
Naves, Raphael; Barbosa, Bruno H G; Ferreira, Danton D
2016-06-01
Lung sound auscultation is one of the most commonly used methods to evaluate respiratory diseases. However, the effectiveness of this method depends on the physician's training. If the physician does not have the proper training, he/she will be unable to distinguish between normal and abnormal sounds generated by the human body. Thus, the aim of this study was to implement a pattern recognition system to classify lung sounds. We used a dataset composed of five types of lung sounds: normal, coarse crackle, fine crackle, monophonic and polyphonic wheezes. We used higher-order statistics (HOS) to extract features (second-, third- and fourth-order cumulants), Genetic Algorithms (GA) and Fisher's Discriminant Ratio (FDR) to reduce dimensionality, and k-Nearest Neighbors and Naive Bayes classifiers to recognize the lung sound events in a tree-based system. We used a cross-validation procedure to analyze the classifiers' performance and Tukey's Honestly Significant Difference criterion to compare the results. Our results showed that the Genetic Algorithms outperformed Fisher's Discriminant Ratio for feature selection. Moreover, each lung class had a different signature pattern according to its cumulants, showing that HOS is a promising feature extraction tool for lung sounds. In addition, the proposed divide-and-conquer approach can accurately classify different types of lung sounds. The classification accuracy obtained by the best tree-based classifier was 98.1% on training data and 94.6% on validation data. The proposed approach achieved good results even using only one feature extraction tool (higher-order statistics). Additionally, the implementation of the proposed classifier in an embedded system is feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
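Cumulant features of the kind described can be estimated with k-statistics, which are unbiased cumulant estimators. A minimal sketch (the use of `scipy.stats.kstat` and the Gaussian-noise example are illustrative assumptions, not the paper's implementation):

```python
import numpy as np
from scipy.stats import kstat

def hos_features(signal):
    """Second-, third- and fourth-order cumulant estimates of a signal.

    For pure Gaussian noise the 3rd and 4th cumulants vanish, so
    nonzero values flag the non-Gaussian structure characteristic
    of crackles and wheezes.
    """
    return np.array([kstat(signal, n) for n in (2, 3, 4)])

rng = np.random.default_rng(0)
feats = hos_features(rng.standard_normal(10_000))  # close to [1, 0, 0]
```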
Formality theory from Poisson structures to deformation quantization
Esposito, Chiara
2015-01-01
This book is a survey of the theory of formal deformation quantization of Poisson manifolds, in the formalism developed by Kontsevich. It is intended as an educational introduction for mathematical physicists who are dealing with the subject for the first time. The main topics covered are the theory of Poisson manifolds, star products and their classification, deformations of associative algebras and the formality theorem. Readers will also be familiarized with the relevant physical motivations underlying the purely mathematical construction.
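For readers meeting the subject for the first time, the prototypical star product (standard material, not specific to this book) is the Moyal product for a constant Poisson bivector $\pi^{ij}$ on $\mathbb{R}^{2n}$:

```latex
(f \star g)(x) \;=\; \sum_{k=0}^{\infty} \frac{1}{k!}\left(\frac{i\hbar}{2}\right)^{k}
\pi^{i_1 j_1}\cdots\pi^{i_k j_k}\,
\bigl(\partial_{i_1}\cdots\partial_{i_k} f\bigr)\bigl(\partial_{j_1}\cdots\partial_{j_k} g\bigr)
\;=\; fg + \frac{i\hbar}{2}\{f,g\} + O(\hbar^2),
```

so the star commutator reproduces the Poisson bracket to first order in $\hbar$; Kontsevich's formality theorem, the book's centerpiece, extends such quantizations to arbitrary Poisson manifolds.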
A connectionist-geostatistical approach for classification of deformation types in ice surfaces
Goetz-Weiss, L. R.; Herzfeld, U. C.; Hale, R. G.; Hunke, E. C.; Bobeck, J.
2014-12-01
Deformation is a class of highly non-linear geophysical processes from which one can infer other geophysical variables in a dynamical system. For example, in an ice-dynamic model, deformation is related to velocity, basal sliding, surface elevation changes, and the stress field at the surface as well as internal to a glacier. While many of these variables cannot be observed, deformation state can be an observable variable, because deformation in glaciers (once a viscosity threshold is exceeded) manifests itself in crevasses. Given the amount of information that can be inferred from observing surface deformation, an automated method for classifying surface imagery becomes increasingly desirable. In this paper a Neural Network is used to recognize classes of crevasse types over the Bering Bagley Glacier System (BBGS) during a surge (2011-2013-?). A surge is a spatially and temporally highly variable and rapid acceleration of the glacier. Therefore, many different crevasse types occur in a short time frame and in close proximity, and these crevasse fields hold information on the geophysical processes of the surge. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network can recognize. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we have developed semi-automated pre-training software to adapt the Neural Network to changing conditions. The method is applied to airborne and satellite imagery to classify surge crevasses from the BBGS surge. This method works well for classifying spatially repetitive images such as the crevasses over Bering Glacier. We expand the network for less repetitive images in order to analyze imagery collected over the Arctic sea ice, to assess the percentage of deformed ice for model calibration.
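A directional experimental variogram of the sort used to parameterize the imagery can be computed directly from pixel values. A minimal sketch restricted to axis-aligned lags (the paper's actual parameterization is not detailed in the abstract):

```python
import numpy as np

def directional_variogram(img, lags, axis=0):
    """Experimental semivariogram of an image along one pixel axis.

    gamma(h) = 0.5 * mean((z(s) - z(s+h))**2) over all pairs at lag h.
    Crevasse fields produce direction-dependent ranges and sills,
    which is what makes directional variograms useful inputs for
    classification.
    """
    z = np.asarray(img, dtype=float)
    n = z.shape[axis]
    gammas = []
    for h in lags:
        a = np.take(z, range(0, n - h), axis=axis)   # z(s)
        b = np.take(z, range(h, n), axis=axis)       # z(s + h)
        gammas.append(0.5 * np.mean((a - b) ** 2))
    return np.array(gammas)
```

A vector of such values over several lags and directions gives a compact, translation-invariant signature of the crevasse texture that a network can consume.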
Brenčič, Mihael
2016-01-01
Northern hemisphere elementary circulation mechanisms, defined with the Dzerdzeevski classification and published on a daily basis from 1899-2012, are analysed with statistical methods as continuous categorical time series. The classification consists of 41 elementary circulation mechanisms (ECM), which are assigned to calendar days. Empirical marginal probabilities of each ECM were determined. Seasonality and the periodicity effect were investigated with moving dispersion filters and a randomisation procedure on the ECM categories as well as with time analyses of the ECM mode. The time series were determined to be non-stationary with strong time-dependent trends. During the investigated period, intervals of periodicity interchange with periods when no seasonality is present. In the time series structure, the strongest division is visible at the milestone of 1986, showing that the atmospheric circulation pattern reflected in the ECM has significantly changed. This change is a result of a change in the frequency of ECM categories; before 1986, the appearance of ECM was more diverse, and afterwards fewer ECMs appear. The statistical approach applied to the categorical climatic time series opens up new potential insight into climate variability and change studies that have to be performed in the future.
Linear deformations of discrete groups and constructions of multivalued groups
International Nuclear Information System (INIS)
Yagodovskii, Petr V
2000-01-01
We construct deformations of discrete multivalued groups described as special deformations of their group algebras in the class of finite-dimensional associative algebras. We show that the deformations of ordinary groups producing multivalued groups are defined by cocycles with coefficients in the group algebra of the original group and obtain classification theorems on these deformations. We indicate a connection between the linear deformations of discrete groups introduced in this paper and the well-known constructions of multivalued groups. We describe the manifold of three-dimensional associative commutative algebras with identity element, fixed basis, and a constant number of values. The group algebras of n-valued groups of order three (three-dimensional n-group algebras) form a discrete set in this manifold
Directory of Open Access Journals (Sweden)
S Anantha Sivaprakasam
2017-02-01
Full Text Available Cervical cancer, a disease in which malignant (cancer) cells form in the tissues of the cervix, is the fourth leading cause of cancer death among women worldwide. Cervical cancer can be prevented and/or cured if it is diagnosed in the pre-cancerous lesion stage or earlier. A common physical examination technique widely used in screening, called the Papanicolaou test or Pap test, is used to detect abnormality of the cell. Due to the intricacy of the cell nature, automating this procedure is still a herculean task for the pathologist. This paper addresses these challenges with a simple and novel method to segment and classify the cervical cell automatically. The primary step of this procedure is pre-processing, in which de-noising, de-correlation and segregation of colour components are carried out. Then, two new techniques, called Morphological and Statistical Edge Based Segmentation and Morphological and Statistical Region Based Segmentation, are put forward in this paper and applied to each component of the image to segment the nuclei from the cervical image. Finally, all segmented colour components are combined to produce the final segmentation result. After extracting the nuclei, morphological features are extracted from them. The two proposed techniques outperformed standard segmentation techniques, and Morphological and Statistical Edge Based Segmentation outperformed Morphological and Statistical Region Based Segmentation. Finally, the nuclei are classified based on the morphological values. The segmentation accuracy is echoed in the classification accuracy. The overall segmentation accuracy is 97%.
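The abstract does not specify the two proposed techniques in detail, but the general flavour of morphology-plus-statistics edge-based segmentation can be sketched as a morphological gradient followed by a statistical threshold. All parameters below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy import ndimage as ndi

def morph_edge_segment(channel, size=3, k=1.0):
    """Morphological gradient + statistical threshold on one colour
    channel, followed by closing and hole filling.

    Generic illustration only: the paper's actual edge/region methods
    are not given in the abstract, and `size`/`k` are arbitrary.
    """
    # Morphological gradient highlights nucleus boundaries
    g = ndi.grey_dilation(channel, size=size) - ndi.grey_erosion(channel, size=size)
    mask = g > g.mean() + k * g.std()     # statistical edge threshold
    # Close the boundary and fill the enclosed nucleus region
    return ndi.binary_fill_holes(ndi.binary_closing(mask))
```

Running this per colour component and combining the masks mirrors the per-component-then-combine structure described above.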
Rizvi, Mohd Suhail; Pal, Anupam
2014-09-01
The fibrous matrices are widely used as scaffolds for the regeneration of load-bearing tissues due to their structural and mechanical similarities with the fibrous components of the extracellular matrix. These scaffolds not only provide the appropriate microenvironment for the residing cells but also act as a medium for the transmission of the mechanical stimuli, essential for tissue regeneration, from the macroscopic scale of the scaffolds to the microscopic scale of the cells. Because tissue regeneration requires mechanical loading, the fibrous scaffolds must be able to sustain complex three-dimensional mechanical loading conditions. In order to gain insight into the mechanical behavior of fibrous matrices under large amounts of elongation as well as shear, a statistical model has been formulated to study the macroscopic mechanical behavior of the electrospun fibrous matrix and the transmission of the mechanical stimuli from scaffolds to cells via the constituting fibers. The study establishes the load-deformation relationships for the fibrous matrices for different structural parameters. It also quantifies the changes in the fiber arrangement and the tension generated in the fibers with the deformation of the matrix. The model reveals that the tension generated in the fibers on matrix deformation is not homogeneous, and hence cells located in different regions of the fibrous scaffold might experience different mechanical stimuli. The mechanical response of fibrous matrices was also found to be dependent on the aspect ratio of the matrix. Therefore, the model establishes a structure-mechanics interdependence of the fibrous matrices under large deformation, which can be utilized in identifying the appropriate structure and external mechanical loading conditions for the regeneration of load-bearing tissues. Copyright © 2014 Elsevier Ltd. All rights reserved.
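A toy version of such a statistical fiber-network model can illustrate the inhomogeneous fiber tensions. This sketch assumes affine deformation and tension-only linear fibers; both are assumptions of the sketch, not necessarily of the paper's model:

```python
import numpy as np

def fiber_tensions(n_fibers, stretch, shear, k=1.0, seed=0):
    """Tensions in an affinely deforming 2D random fiber network.

    Each fiber is a unit vector at a random in-plane angle; a combined
    stretch + simple-shear deformation gradient F maps it to a new
    length, and tension is k*(length - 1) for stretched fibers only
    (fibers are assumed to buckle, carrying no load in compression).
    """
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0.0, np.pi, n_fibers)
    d = np.stack([np.cos(theta), np.sin(theta)])        # fiber directions
    F = np.array([[stretch, shear],
                  [0.0, 1.0 / stretch]])                # isochoric (assumption)
    lengths = np.linalg.norm(F @ d, axis=0)
    return k * np.maximum(lengths - 1.0, 0.0)
```

Even under a uniform macroscopic deformation, the spread of tensions across fiber orientations is wide, which is the kind of heterogeneity the paper links to position-dependent mechanical stimuli on cells.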
Waltho, Daniel; Hatchell, Alexandra; Thoma, Achilleas
2017-03-01
Gynecomastia is a common deformity of the male breast, where certain cases warrant surgical management. There are several surgical options, which vary depending on the breast characteristics. To guide surgical management, several classification systems for gynecomastia have been proposed. A systematic review was performed to (1) identify all classification systems for the surgical management of gynecomastia, and (2) determine the adequacy of these classification systems to appropriately categorize the condition for surgical decision-making. The search yielded 1012 articles, and 11 articles were included in the review. Eleven classification systems in total were ascertained, and a total of 10 unique features were identified: (1) breast size, (2) skin redundancy, (3) breast ptosis, (4) tissue predominance, (5) upper abdominal laxity, (6) breast tuberosity, (7) nipple malposition, (8) chest shape, (9) absence of sternal notch, and (10) breast skin elasticity. On average, classification systems included two or three of these features. Breast size and ptosis were the most commonly included features. Based on their review of the current classification systems, the authors believe the ideal classification system should be universal and cater to all causes of gynecomastia; be surgically useful and easy to use; and should include a comprehensive set of clinically appropriate patient-related features, such as breast size, breast ptosis, tissue predominance, and skin redundancy. None of the current classification systems appears to fulfill these criteria.
Weiss, Hans-Rudolf; Werkmann, Mario
2009-02-17
Up to now, chronic low back pain without radicular symptoms has not been classified in the international literature and is attributed as being "unspecific". For specific bracing of this patient group we use simple physical tests to predict the brace type the patient is most likely to benefit from. Based on these physical tests we have developed a simple functional classification of "unspecific" low back pain in patients with spinal deformities. Between January 2006 and July 2007 we tested 130 patients (116 females and 14 males) with spinal deformities (average age 45 years, range 14 to 69 years) and chronic unspecific low back pain (pain for > 24 months) along with the indication for brace treatment for chronic unspecific low back pain. Some of the patients had symptoms of spinal claudication (n = 16). The "sagittal realignment test" (SRT), a lumbar hyperextension test, and the "sagittal delordosation test" (SDT) were applied. Additionally 3 female patients with spondylolisthesis were tested, including one with symptoms of spinal claudication; 2 of these patients were 14 years of age and the other was 43 years old at the time of testing. 117 patients reported significant pain release in the SRT and 13 in the SDT (> or = 2 steps in the Roland & Morris VRS). 3 patients had no significant pain release in either of the tests. On manual investigation we found hypermobility at L5/S1 or a spondylolisthesis at level L5/S1. In the other patients who responded well to the SRT, loss of lumbar lordosis was the main issue, a finding which, according to the scientific literature, correlates well with low back pain. The 3 patients who did not respond to either test had a fair pain reduction in a generally delordosing brace with an isolated small foam pad inserted at the level of L2/3, leading to a lordosation of this region. With the exception of 3 patients (2.3%) a clear assignment to one of the two classes was possible. 117 patients were supplied successfully with a sagittal
Collisions of deformed nuclei and superheavy-element production
International Nuclear Information System (INIS)
Iwamoto, Akira; Moeller, P.; Univ. of Aizu, Fukushima; P. Moller Scientific Computing and Graphics, Inc., Los Alamos, NM; Los Alamos National Lab., NM; Nix, J.R.; Sagawa, Hiroyuki
1995-01-01
A detailed understanding of complete fusion cross sections in heavy-ion collisions requires a consideration of the effects of the deformation of the projectile and target. The aim here is to show that deformation and orientation of the colliding nuclei have a very significant effect on the fusion-barrier height and on the compactness of the touching configuration. To facilitate discussions of fusion configurations of deformed nuclei, the authors develop a classification scheme and introduce a notation convention for these configurations. They discuss particular deformations and orientations that lead to compact touching configurations and to fusion-barrier heights that correspond to fairly low excitation energies of the compound systems. Such configurations should be the most favorable for producing superheavy elements. They analyze a few projectile-target combinations whose deformations allow favorable entrance-channel configurations and whose proton and neutron numbers lead to compound systems in a part of the superheavy region where half-lives are calculated to be observable, that is, longer than 1 μs
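The orientation dependence exploited above can be illustrated with the standard axially symmetric quadrupole-deformed nuclear shape. The radius constant and the β₂ value used in the example are conventional illustrative assumptions, not values from the paper:

```python
import numpy as np

def radius(theta, A, beta2):
    """Radius (fm) of an axially symmetric quadrupole-deformed nucleus.

    R0 = 1.2 * A**(1/3) fm is a common parameterization (an assumption
    of this sketch); Y20 is the usual spherical harmonic.
    """
    R0 = 1.2 * A ** (1.0 / 3.0)
    Y20 = np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(theta) ** 2 - 1.0)
    return R0 * (1.0 + beta2 * Y20)

def touching_distance(A1, b1, A2, b2, tip=True):
    """Centre separation at touching for pole-pole ('tip') versus
    equator-equator ('side') orientations along the collision axis."""
    th = 0.0 if tip else np.pi / 2.0
    return radius(th, A1, b1) + radius(th, A2, b2)

# For prolate nuclei (beta2 > 0), side contact gives a smaller centre
# separation, i.e. a more compact touching configuration.
```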
Wen, Hongwei; Liu, Yue; Wang, Jieqiong; Zhang, Jishui; Peng, Yun; He, Huiguang
2016-03-01
Tourette syndrome (TS) is a developmental neuropsychiatric disorder with the cardinal symptoms of motor and vocal tics which emerges in early childhood and fluctuates in severity in later years. To date, the neural basis of TS is not fully understood, and TS has a long-term prognosis that is difficult to estimate accurately. Few studies have looked at the potential of using diffusion tensor imaging (DTI) in conjunction with machine learning algorithms to automate the classification of healthy children and TS children. Here we apply the Tract-Based Spatial Statistics (TBSS) method to 44 TS children and 48 age- and gender-matched healthy children in order to extract the diffusion values from each voxel in the white matter (WM) skeleton, and a feature selection algorithm (ReliefF) was used to select the most salient voxels for subsequent classification with a support vector machine (SVM). We used nested cross-validation to yield an unbiased assessment of the classification method and prevent overestimation. Peak performance of the SVM classifier was achieved using the axial diffusion (AD) metric, with accuracy of 88.04%, sensitivity of 88.64% and specificity of 87.50%, demonstrating the potential of a joint TBSS and SVM pipeline for fast, objective classification of healthy and TS children. These results suggest that our method may be useful for the early identification of subjects with TS, and hold promise for predicting prognosis and treatment outcome for individuals with TS.
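A nested cross-validation pipeline of the kind described can be sketched with scikit-learn. Since ReliefF has no scikit-learn implementation, a univariate F-score filter stands in for it here, and the data are synthetic placeholders for the skeleton-voxel features:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV, cross_val_score

# Feature filter + SVM; SelectKBest is a stand-in for ReliefF.
pipe = Pipeline([
    ("select", SelectKBest(f_classif, k=50)),
    ("svm", SVC(kernel="linear")),
])
# Inner loop tunes hyperparameters; outer loop scores the tuned model
# on folds it never saw, giving an unbiased accuracy estimate.
inner = GridSearchCV(pipe, {"svm__C": [0.1, 1, 10]}, cv=3)

rng = np.random.default_rng(0)
X = rng.standard_normal((92, 500))   # 92 subjects, 500 voxels (toy data)
y = np.array([0] * 48 + [1] * 44)    # 48 controls, 44 TS (as in the study)
scores = cross_val_score(inner, X, y, cv=5)
```

The key point is that feature selection and C-tuning happen strictly inside the inner folds, which is what prevents the overestimation the abstract warns about.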
On the Fedosov deformation quantization beyond the regular Poisson manifolds
International Nuclear Information System (INIS)
Dolgushev, V.A.; Isaev, A.P.; Lyakhovich, S.L.; Sharapov, A.A.
2002-01-01
A simple iterative procedure is suggested for the deformation quantization of (irregular) Poisson brackets associated to the classical Yang-Baxter equation. The construction is shown to admit a pure algebraic reformulation giving the Universal Deformation Formula (UDF) for any triangular Lie bialgebra. A simple proof of the classification theorem for inequivalent UDF's is given. As an example, the explicit quantization formula is presented for the quasi-homogeneous Poisson brackets on the two-plane
Statistical Shape Modelling and Markov Random Field Restoration (invited tutorial and exercise)
DEFF Research Database (Denmark)
Hilger, Klaus Baggesen
This tutorial focuses on statistical shape analysis using point distribution models (PDM) which is widely used in modelling biological shape variability over a set of annotated training data. Furthermore, Active Shape Models (ASM) and Active Appearance Models (AAM) are based on PDMs and have proven...... deformation field between shapes. The tutorial demonstrates both generative active shape and appearance models, and MRF restoration on 3D polygonized surfaces. ''Exercise: Spectral-Spatial classification of multivariate images'' From annotated training data this exercise applies spatial image restoration...... using Markov random field relaxation of a spectral classifier. Keywords: the Ising model, the Potts model, stochastic sampling, discriminant analysis, expectation maximization....
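At its core, a point distribution model reduces to PCA over aligned landmark vectors. A minimal sketch, assuming landmark alignment (e.g. Procrustes) has already been done on the annotated training data:

```python
import numpy as np

def build_pdm(shapes, var_kept=0.95):
    """Point distribution model: PCA over aligned landmark shapes.

    shapes: (n_samples, 2*n_landmarks) array of pre-aligned coordinates.
    Returns the mean shape, the retained eigenmodes and eigenvalues.
    """
    mean = shapes.mean(axis=0)
    cov = np.cov(shapes - mean, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]                  # largest variance first
    evals, evecs = evals[order], evecs[:, order]
    keep = np.searchsorted(np.cumsum(evals) / evals.sum(), var_kept) + 1
    return mean, evecs[:, :keep], evals[:keep]

def synthesize(mean, modes, evals, b):
    """New shape = mean + sum_i b_i * sqrt(lambda_i) * mode_i."""
    return mean + modes @ (b * np.sqrt(evals))
```

Constraining each b_i to a few standard deviations keeps synthesized shapes plausible, which is the mechanism ASMs and AAMs build on.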
Mean template for tensor-based morphometry using deformation tensors.
Leporé, Natasha; Brun, Caroline; Pennec, Xavier; Chou, Yi-Yu; Lopez, Oscar L; Aizenstein, Howard J; Becker, James T; Toga, Arthur W; Thompson, Paul M
2007-01-01
Tensor-based morphometry (TBM) studies anatomical differences between brain images statistically, to identify regions that differ between groups, over time, or correlate with cognitive or clinical measures. Using a nonlinear registration algorithm, all images are mapped to a common space, and statistics are most commonly performed on the Jacobian determinant (local expansion factor) of the deformation fields. In earlier work, it was shown that the detection sensitivity of the standard TBM approach could be increased by using the full deformation tensors in a multivariate statistical analysis. Here we set out to improve the common space itself, by choosing the shape that minimizes a natural metric on the deformation tensors from that space to the population of control subjects. This method avoids statistical bias and should ease nonlinear registration of new subjects' data to a template that is 'closest' to all subjects' anatomies. As deformation tensors are symmetric positive-definite matrices and do not form a vector space, all computations are performed in the log-Euclidean framework. The control brain B that is already the closest to 'average' is found. A gradient descent algorithm is then used to perform the minimization that iteratively deforms this template and obtains the mean shape. We apply our method to map the profile of anatomical differences in a dataset of 26 HIV/AIDS patients and 14 controls, via a log-Euclidean Hotelling's T2 test on the deformation tensors. These results are compared to the ones found using the 'best' control, B. Statistics on both shapes are evaluated using cumulative distribution functions of the p-values in maps of inter-group differences.
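In the log-Euclidean framework, averaging SPD matrices reduces to ordinary Euclidean averaging after a matrix logarithm. A minimal sketch of the mean of deformation tensors:

```python
import numpy as np
from scipy.linalg import logm, expm

def log_euclidean_mean(tensors):
    """Mean of symmetric positive-definite deformation tensors in the
    log-Euclidean framework: expm(mean(logm(S_i))).

    Because logm maps SPD matrices into a vector space, the arithmetic
    mean there is well defined, and expm maps the result back to an
    SPD matrix.
    """
    logs = [logm(S) for S in tensors]
    return expm(np.mean(logs, axis=0)).real
```

For example, the log-Euclidean mean of 2I and 8I is 4I (the geometric, not arithmetic, mean), which is the behaviour that avoids the bias of naive averaging of tensors.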
Random forests for classification in ecology
Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.
2007-01-01
Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
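The RF workflow advocated here, cross-validated accuracy plus variable importance, can be sketched with scikit-learn. The data below are a synthetic stand-in for presence/absence records, with only the first two of eight hypothetical environmental predictors actually driving presence:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 8))                    # 300 sites, 8 predictors
y = (X[:, 0] + 0.5 * X[:, 1]                         # informative predictors
     + 0.3 * rng.standard_normal(300) > 0).astype(int)

rf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(rf, X, y, cv=5).mean()         # cross-validated accuracy
rf.fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]  # variable importance order
```

On real ecological data the importance ranking is the part worth inspecting, since (as the study found for invasive plants) it can be checked against ecological expectations.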
Statistical symmetries in physics
International Nuclear Information System (INIS)
Green, H.S.; Adelaide Univ., SA
1994-01-01
Every law of physics is invariant under some group of transformations and is therefore the expression of some type of symmetry. Symmetries are classified as geometrical, dynamical or statistical. At the most fundamental level, statistical symmetries are expressed in the field theories of the elementary particles. This paper traces some of the developments from the discovery of Bose statistics, one of the two fundamental symmetries of physics. A series of generalizations of Bose statistics is described. A supersymmetric generalization accommodates fermions as well as bosons, and further generalizations, including parastatistics, modular statistics and graded statistics, accommodate particles with properties such as 'colour'. A factorization of elements of gl(n_b, n_f) can be used to define truncated boson operators. A general construction is given for q-deformed boson operators, and explicit constructions of the same type are given for various 'deformed' algebras. A summary is given of some of the applications and potential applications. 39 refs., 2 figs
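A common construction of q-deformed boson operators, given here only as a standard example of the 'deformed' algebras mentioned (the Biedenharn-Macfarlane oscillator, not necessarily the paper's specific construction), is:

```latex
a\,a^{\dagger} - q\,a^{\dagger}a = q^{-N}, \qquad
[N, a^{\dagger}] = a^{\dagger}, \qquad
[N, a] = -a,
```

with $a^{\dagger}a = [N]_q$, where $[n]_q = \dfrac{q^{n} - q^{-n}}{q - q^{-1}}$ is the q-number; ordinary Bose statistics is recovered in the limit $q \to 1$.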
Statistical control chart and neural network classification for improving human fall detection
Harrou, Fouzi; Zerrouki, Nabil; Sun, Ying; Houacine, Amrane
2017-01-01
This paper proposes a statistical approach to detect and classify human falls based on both visual data from a camera and accelerometric data captured by an accelerometer. Specifically, we first use a Shewhart control chart to detect the presence of potential falls by using accelerometric data. Unfortunately, this chart cannot distinguish real falls from fall-like actions, such as lying down. To bypass this difficulty, a neural network classifier is then applied, through visual data, only on the detected cases. To assess the performance of the proposed method, experiments are conducted on a publicly available fall detection database: the University of Rzeszow's fall detection (URFD) dataset. Results demonstrate that the detection phase plays a key role in reducing the number of sequences used as input into the neural network classifier, significantly reducing the computational burden and achieving better accuracy.
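The Shewhart detection stage amounts to flagging samples that fall outside control limits estimated from fall-free data. A minimal sketch (the 3-sigma limits and the accelerometer-magnitude framing are conventional control-chart choices, not necessarily the paper's exact settings):

```python
import numpy as np

def shewhart_detect(signal, baseline, k=3.0):
    """Indices of samples outside mean ± k*sigma control limits.

    Limits are estimated from a fall-free baseline segment of the
    accelerometer magnitude; excursions beyond them are candidate
    falls to be passed on to the visual classifier.
    """
    mu, sigma = np.mean(baseline), np.std(baseline)
    upper, lower = mu + k * sigma, mu - k * sigma
    return np.flatnonzero((signal > upper) | (signal < lower))
```

Only the flagged indices need video analysis, which is exactly how the detection phase cuts the classifier's workload.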
Mercier, Lény; Darnaude, Audrey M; Bruguier, Olivier; Vasconcelos, Rita P; Cabral, Henrique N; Costa, Maria J; Lara, Monica; Jones, David L; Mouillot, David
2011-06-01
Reliable assessment of fish origin is of critical importance for exploited species, since nursery areas must be identified and protected to maintain recruitment to the adult stock. During the last two decades, otolith chemical signatures (or "fingerprints") have been increasingly used as tools to discriminate between coastal habitats. However, correct assessment of fish origin from otolith fingerprints depends on various environmental and methodological parameters, including the choice of the statistical method used to assign fish of unknown origin. Among the available methods of classification, Linear Discriminant Analysis (LDA) is the most frequently used, although it assumes data are multivariate normal with homogeneous within-group dispersions, conditions that are not always met by otolith chemical data, even after transformation. Other less constrained classification methods are available, but there is a current lack of comparative analysis in applications to otolith microchemistry. Here, we assessed stock identification accuracy for four classification methods (LDA, Quadratic Discriminant Analysis [QDA], Random Forests [RF], and Artificial Neural Networks [ANN]), through the use of three distinct data sets. In each case, all possible combinations of chemical elements were examined to identify the elements to be used for optimal accuracy in fish assignment to their actual origin. Our study shows that accuracy differs according to the model and the number of elements considered. Best combinations did not include all the elements measured, and it was not possible to define an ad hoc multielement combination for accurate site discrimination. Among all the models tested, RF and ANN performed best, especially for complex data sets (e.g., with numerous fish species and/or chemical elements involved). However, for these data, RF was less time-consuming and more interpretable than ANN, and far more efficient and less demanding in terms of assumptions than LDA or QDA.
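A comparison of the four classifiers on multielement data can be set up in a few lines with scikit-learn. Synthetic "fingerprints" with three well-separated sites stand in for real otolith data, and all model settings are illustrative:

```python
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Toy fingerprints: 3 nursery sites, 40 fish each, 5 element ratios
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(40, 5))
               for m in ([0.0] * 5, [1.5] * 5, [-1.5] * 5)])
y = np.repeat([0, 1, 2], 40)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "ANN": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                         random_state=0),
}
accuracy = {name: cross_val_score(m, X, y, cv=5).mean()
            for name, m in models.items()}
```

On real otolith data the interesting differences emerge when the normality and homoscedasticity assumptions of LDA/QDA break down, which is where RF and ANN were found to pull ahead.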
Directory of Open Access Journals (Sweden)
Nadja Stumberg
2014-05-01
Full Text Available The vegetation in the forest-tundra ecotone zone is expected to be highly affected by climate change and requires effective monitoring techniques. Airborne laser scanning (ALS) has been proposed as a tool for the detection of small pioneer trees for such vast areas using laser height and intensity data. The main objective of the present study was to assess a possible improvement in the performance of classifying tree and nontree laser echoes from high-density ALS data. The data were collected along a 1000 km long transect stretching from southern to northern Norway. Different geostatistical and statistical measures derived from laser height and intensity values were used to extend and potentially improve simpler models that ignore the spatial context. Generalised linear models (GLM) and support vector machines (SVM) were employed as classification methods. Total accuracies and Cohen's kappa coefficients were calculated and compared to those of simpler models from a previous study. For both classification methods, all models revealed total accuracies similar to the results of the simpler models. Concerning classification performance, however, the comparison of the kappa coefficients indicated a significant improvement for some models both using GLM and SVM, with classification accuracies >94%.
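Cohen's kappa matters here because tree echoes are rare: a classifier that always predicts "non-tree" can reach high total accuracy yet zero kappa. A small illustration (the class proportions are invented for the example):

```python
import numpy as np
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Suppose ~94% of echoes are non-tree (invented proportion).
y_true = np.array([0] * 94 + [1] * 6)
y_majority = np.zeros(100, dtype=int)   # always predict "non-tree"
y_informative = y_true.copy()
y_informative[[0, 95]] = [1, 0]         # two errors, but actually informative

acc_maj = accuracy_score(y_true, y_majority)        # high total accuracy
kap_maj = cohen_kappa_score(y_true, y_majority)     # no skill beyond chance
kap_inf = cohen_kappa_score(y_true, y_informative)  # substantial agreement
```

This is why the kappa comparison, rather than total accuracy alone, is what revealed the improvement of the spatial-context models.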
Cooper, Rachel
2015-06-01
The latest edition of the Diagnostic and Statistical Manual of Mental Disorders, the D.S.M.-5, was published in May 2013. In the lead up to publication, radical changes to the classification were anticipated; there was widespread dissatisfaction with the previous edition and it was accepted that a "paradigm shift" might be required. In the end, however, and despite huge efforts at revision, the published D.S.M.-5 differs far less than originally envisaged from its predecessor. This paper considers why it is that revising the D.S.M. has become so difficult. The D.S.M. is such an important classification that this question is worth asking in its own right. The case of the D.S.M. can also serve as a study for considering stasis in classification more broadly; why and how can classifications become resistant to change? I suggest that classifications like the D.S.M. can be thought of as forming part of the infrastructure of science, and have much in common with material infrastructure. In particular, as with material technologies, it is possible for "path dependent" development to cause a sub-optimal classification to become "locked in" and hard to replace. Copyright © 2015 Elsevier Ltd. All rights reserved.
Calisto, A; Bramanti, A; Galeano, M; Angileri, F; Campobello, G; Serrano, S; Azzerboni, B
2009-01-01
The objective of this study is to investigate Idiopathic Normal Pressure Hydrocephalus (INPH) through a multidimensional and multiparameter analysis of statistical data obtained from accurate analysis of Intracranial Pressure (ICP) recordings. Such a study could permit the detection of new factors, correlated with therapeutic response, that validate the predictive significance of the infusion test. The algorithm developed by the authors computes 13 ICP parameter trends on each recording; afterwards, 9 statistical measures are determined from each trend. All data are transferred to the data-mining software WEKA. According to the exploited feature-selection techniques, WEKA revealed that the most significant statistical parameter is the maximum of the Single-Wave-Amplitude: setting a 27 mmHg threshold leads to over 90% correct classification.
Scaling theory and the classification of phase transitions
International Nuclear Information System (INIS)
Hilfer, R.
1992-01-01
In this paper, the recent classification theory for phase transitions and its relation to the foundations of statistical physics is reviewed. First it is outlined how Ehrenfest's classification scheme can be generalized into a general thermodynamic classification theory for phase transitions. The classification theory implies scaling and multiscaling, thereby eliminating the need to postulate the scaling hypothesis as a fourth law of thermodynamics. The new classification has also led to the discovery and distinction of nonequilibrium transitions within equilibrium statistical physics. Nonequilibrium phase transitions are distinguished from equilibrium transitions by orders less than unity and by the fact that equilibrium thermodynamics and statistical mechanics become inapplicable at the critical point. The latter fact requires a change in the Gibbs assumption underlying the canonical and grand canonical ensembles in order to recover the thermodynamic description in the critical limit.
Kawata, Masaaki; Sato, Chikara
2007-06-01
In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images from a huge number of raw images is key to high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method, in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that aligned raw images of similar projections have a dense distribution around the correctly aligned coordinates in image space. The newly developed method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 and the sodium channel. In every data set, the newly developed method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
The paradox of atheoretical classification
DEFF Research Database (Denmark)
Hjørland, Birger
2016-01-01
A distinction can be made between “artificial classifications” and “natural classifications,” where artificial classifications may adequately serve some limited purposes, but natural classifications are overall most fruitful by allowing inference and thus many different purposes. There is strong...... support for the view that a natural classification should be based on a theory (and, of course, that the most fruitful theory provides the most fruitful classification). Nevertheless, atheoretical (or “descriptive”) classifications are often produced. Paradoxically, atheoretical classifications may...... be very successful. The best example of a successful “atheoretical” classification is probably the prestigious Diagnostic and Statistical Manual of Mental Disorders (DSM) since its third edition from 1980. Based on such successes one may ask: Should the claim that classifications ideally are natural...
Stumpe, B; Engel, T; Steinweg, B; Marschner, B
2012-04-03
In the past, different slag materials were often used for landscaping and construction purposes or simply dumped. Nowadays German environmental laws strictly control the use of slags, but a remaining 35% is still dumped uncontrolled in landfills. Since some slags have high heavy-metal contents, and different slag types have characteristic chemical and physical properties that influence the risk potential and other characteristics of the deposits, an identification of the slag types is needed. We developed an FT-IR-based statistical method to identify different slag classes. Slag samples were collected at different sites throughout various cities within the industrial Ruhr area. Then, spectra of 35 samples from four different slag classes, ladle furnace (LF), blast furnace (BF), oxygen furnace steel (OF), and zinc furnace slags (ZF), were determined in the mid-infrared region (4000-400 cm(-1)). The spectral data sets were subjected to statistical classification methods to separate the spectral data of the different slag classes. Principal component analysis (PCA) models for each slag class were developed and further used for soft independent modeling of class analogy (SIMCA). Precise classification of slag samples into the four slag classes was achieved by using two different SIMCA models stepwise. First, SIMCA 1 was used for classification of ZF as well as OF slags over the total spectral range. If no correct classification was found, the spectrum was analyzed with SIMCA 2 at reduced wavenumbers for the classification of LF as well as BF spectra. As a result, we provide a time- and cost-efficient method based on FT-IR spectroscopy for processing and identifying large numbers of environmental slag samples.
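The core SIMCA idea, one PCA model per class with class membership judged by the reconstruction residual, can be sketched as follows (a minimal sketch, not the authors' exact procedure; the synthetic "spectra" and all names are assumptions):

```python
import numpy as np

# Minimal SIMCA-style sketch: fit a PCA model per class, then assign a
# spectrum to the class whose principal subspace reconstructs it best.

def fit_class_model(X, n_components=2):
    """PCA model of one class: mean spectrum and leading components."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def residual(spectrum, model):
    """Reconstruction error of a spectrum under a class PCA model."""
    mean, comps = model
    centered = spectrum - mean
    recon = comps.T @ (comps @ centered)   # project onto class subspace
    return np.linalg.norm(centered - recon)

rng = np.random.default_rng(0)
# Hypothetical "spectra": two classes with different baseline shapes
wavenumbers = np.linspace(0, 1, 50)
class_a = np.sin(2 * np.pi * wavenumbers) + 0.05 * rng.standard_normal((20, 50))
class_b = np.cos(2 * np.pi * wavenumbers) + 0.05 * rng.standard_normal((20, 50))

models = {"A": fit_class_model(class_a), "B": fit_class_model(class_b)}
query = np.sin(2 * np.pi * wavenumbers)  # clearly class-A shaped
best = min(models, key=lambda c: residual(query, models[c]))
print(best)
```

Full SIMCA additionally sets a per-class acceptance threshold on the residual (typically via an F-test), so a sample can be rejected by every class; the stepwise scheme in the abstract chains two such models over different wavenumber ranges.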
Seismic texture classification. Final report
Energy Technology Data Exchange (ETDEWEB)
Vinther, R.
1997-12-31
The seismic texture classification method is a seismic attribute that can both recognize general reflectivity styles and locate variations from them. The seismic texture classification performs a statistical analysis of the seismic section (or volume) aimed at describing the reflectivity. Based on a set of reference reflectivities, the seismic textures are classified. The result of the seismic texture classification is a display of seismic texture categories showing both the styles of reflectivity from the reference set and interpolations and extrapolations from these. The display is interpreted as statistical variations in the seismic data. The seismic texture classification is applied to seismic sections and volumes from the Danish North Sea representing both horizontal stratifications and salt diapirs. The attribute succeeded in recognizing both the general structure of successions and variations from these. Also, the seismic texture classification is not only able to display variations in prospective areas (1-7 sec. TWT) but can also be applied to deep seismic sections. The seismic texture classification was tested on a deep reflection seismic section (13-18 sec. TWT) from the Baltic Sea. Applied to this section, the seismic texture classification succeeded in locating the Moho, which could not be located using conventional interpretation tools. The seismic texture classification is thus a seismic attribute that can display general reflectivity styles and deviations from these, and enhance variations not found by conventional interpretation tools. (LN)
Geometric quantization of vector bundles and the correspondence with deformation quantization
International Nuclear Information System (INIS)
Hawkins, E.
2000-01-01
I repeat my definition for quantization of a vector bundle. For the cases of the Toeplitz and geometric quantizations of a compact Kaehler manifold, I give a construction for quantizing any smooth vector bundle, which depends functorially on a choice of connection on the bundle. Using this, the classification of formal deformation quantizations, and the formal, algebraic index theorem, I give a simple proof as to which formal deformation quantization (modulo isomorphism) is derived from a given geometric quantization. (orig.)
High Dimensional Classification Using Features Annealed Independence Rules.
Fan, Jianqing; Fan, Yingying
2008-01-01
Classification using high-dimensional features arises frequently in many contemporary statistical studies, such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is still poorly understood. In a seminal paper, Bickel and Levina (2004) show that the Fisher discriminant performs poorly due to diverging spectra, and they propose to use the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as random guessing, due to noise accumulation in estimating population centroids in high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. Thus, it is of paramount importance to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently, the threshold value of the test statistic, is proposed based on an upper bound of the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of our new classification procedure.
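A simplified sketch of the FAIR recipe described above (an assumption-laden reading of the abstract, not the authors' exact estimator): rank features by the absolute two-sample t-statistic, keep the top m, and apply the independence (diagonal-variance) rule on the retained features only:

```python
import numpy as np

# Sketch of FAIR: t-statistic feature screening + independence rule.
# All variable names and the synthetic data are illustrative assumptions.

def two_sample_t(X0, X1):
    """Absolute two-sample t-statistic per feature (Welch form)."""
    n0, n1 = len(X0), len(X1)
    se = np.sqrt(X0.var(axis=0, ddof=1) / n0 + X1.var(axis=0, ddof=1) / n1)
    return np.abs(X0.mean(axis=0) - X1.mean(axis=0)) / se

def fair_classifier(X0, X1, m):
    """Keep the m features with largest |t|; independence rule on those."""
    keep = np.argsort(two_sample_t(X0, X1))[-m:]
    mu0, mu1 = X0[:, keep].mean(axis=0), X1[:, keep].mean(axis=0)
    var = np.vstack([X0[:, keep], X1[:, keep]]).var(axis=0, ddof=1)
    def predict(x):
        d0 = np.sum((x[keep] - mu0) ** 2 / var)  # variance-scaled distance
        d1 = np.sum((x[keep] - mu1) ** 2 / var)
        return 0 if d0 < d1 else 1
    return predict

rng = np.random.default_rng(1)
p = 500                       # high-dimensional: only first 10 features differ
X0 = rng.standard_normal((40, p))
X1 = rng.standard_normal((40, p)); X1[:, :10] += 2.0
predict = fair_classifier(X0, X1, m=10)
print(predict(np.zeros(p)), predict(np.r_[2.0 * np.ones(10), np.zeros(p - 10)]))
```

Using all 500 features in the distance would drown the 10 informative coordinates in accumulated noise, which is exactly the failure mode the abstract describes; screening by |t| first keeps the centroid estimates useful.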
Deformations of classical Lie algebras with homogeneous root system in characteristic two. I
International Nuclear Information System (INIS)
Chebochko, N G
2005-01-01
Spaces of local deformations of classical Lie algebras with a homogeneous root system over a field K of characteristic 2 are studied. By a classical Lie algebra over a field K we mean the Lie algebra of a simple algebraic Lie group or its quotient algebra by the centre. The description of deformations of Lie algebras is interesting in connection with the classification of the simple Lie algebras.
Classification of high resolution satellite images
Karlsson, Anders
2003-01-01
In this thesis the Support Vector Machine (SVM) is applied to classification of high-resolution satellite images. Several different measures for classification, including texture measures, 1st-order statistics, and simple contextual information, were evaluated. Additionally, the image was segmented, using an enhanced watershed method, in order to improve the classification accuracy.
Analysis of regional deformation and strain accumulation data adjacent to the San Andreas fault
Turcotte, Donald L.
1991-01-01
A new approach to the understanding of crustal deformation was developed under this grant. This approach combined aspects of fractals, chaos, and self-organized criticality to provide a comprehensive theory for deformation on distributed faults. It is hypothesized that crustal deformation is an example of comminution: Deformation takes place on a fractal distribution of faults resulting in a fractal distribution of seismicity. Our primary effort under this grant was devoted to developing an understanding of distributed deformation in the continental crust. An initial effort was carried out on the fractal clustering of earthquakes in time. It was shown that earthquakes do not obey random Poisson statistics, but can be approximated in many cases by coupled, scale-invariant fractal statistics. We applied our approach to the statistics of earthquakes in the New Hebrides region of the southwest Pacific because of the very high level of seismicity there. This work was written up and published in the Bulletin of the Seismological Society of America. This approach was also applied to the statistics of the seismicity on the San Andreas fault system.
Spino-pelvic sagittal balance of spondylolisthesis: a review and classification.
Labelle, Hubert; Mac-Thiong, Jean-Marc; Roussouly, Pierre
2011-09-01
In L5-S1 spondylolisthesis, it has been clearly demonstrated over the past decade that sacro-pelvic morphology is abnormal and that it can be associated with an abnormal sacro-pelvic orientation as well as with a disturbed global sagittal balance of the spine. The purpose of this article is to review the work done within the Spinal Deformity Study Group (SDSG) over the past decade, which has led to a classification incorporating this recent knowledge. The evidence presented has been derived from the analysis of the SDSG database, a multi-center radiological database of patients with L5-S1 spondylolisthesis collected from 43 spine surgeons in North America and Europe. The classification defines 6 types of spondylolisthesis based on features that can be assessed on sagittal radiographs of the spine and pelvis: (1) grade of slip, (2) pelvic incidence, and (3) spino-pelvic alignment. A reliability study has demonstrated substantial intra- and inter-observer reliability, similar to other currently used classifications for spinal deformity. Furthermore, health-related quality-of-life measures were found to be significantly different between the 6 types, thus supporting the value of a classification based on spino-pelvic alignment. The clinical relevance is that clinicians need to keep in mind when planning treatment that subjects with L5-S1 spondylolisthesis are a heterogeneous group with various adaptations of their posture. In the current controversy on whether high-grade deformities should or should not be reduced, it is suggested that reduction techniques should preferably be used in subjects with evidence of abnormal posture, in order to restore global spino-pelvic balance and improve the biomechanical environment for fusion.
Couinaud's classification vs. Cho's classification: their feasibility in the right hepatic lobe
International Nuclear Information System (INIS)
Shioyama, Yasukazu; Ikeda, Hiroaki; Sato, Motohito; Yoshimi, Fuyo; Kishi, Kazushi; Sato, Morio; Kimura, Masashi
2008-01-01
The objective of this study was to investigate whether the new classification system proposed by Cho is feasible for clinical use, compared with the classical Couinaud classification. One hundred consecutive cases of abdominal CT were studied using a 64- or an 8-slice multislice CT, and three-dimensional portal vein images were created for analysis on the workstation. We applied both Cho's classification and the classical Couinaud classification to each case according to their definitions. Three diagnostic radiologists assessed their feasibility from category one (unable to classify) to five (clear classification in full accordance with the original classification criteria). In each case, we also tried to judge whether Cho's or the classical Couinaud classification could more easily transmit the anatomical information. The analyzers could classify the portal veins clearly (category 5) in 77-80% of cases, and clearly (category 5) or almost clearly (category 4) in 86-93%, under both classifications. In the feasibility of classification, there was no statistically significant difference between the two classifications. In 15 cases we felt that using Couinaud's classification was more convenient for transmitting anatomical information to physicians than using Cho's, because in these cases we noticed two large portal veins ramifying from the right main portal vein cranially and caudally, so that we could not classify P5 as a branch of the antero-ventral segment (AVS). Conversely, in 17 cases we felt Cho's classification was more convenient, because we could not divide the right posterior branch into P6 and P7; in these cases the right posterior portal vein ramified into several small branches. The anterior fissure vein was clearly noticed in only 60 cases. Comparing the classical Couinaud classification and Cho's in feasibility of classification, there was no statistically significant difference. We propose that we routinely report hepatic anatomy with the classical Couinaud classification and in the preoperative cases we
[Research progress on real-time deformable models of soft tissues for surgery simulation].
Xu, Shaoping; Liu, Xiaoping; Zhang, Hua; Luo, Jie
2010-04-01
Biological tissues generally exhibit nonlinearity, anisotropy, quasi-incompressibility and viscoelasticity in their material properties. Simulating the behaviour of elastic objects in real time is one of the current objectives of virtual surgery simulation, and it remains a challenge for researchers to accurately depict the behaviour of human tissues. In this paper, we present a classification of the different deformable models that have been developed, with the advantages and disadvantages of each. Finally, we compare the deformable models and evaluate the state of the art and the future of deformable models.
Directory of Open Access Journals (Sweden)
G. Shanmugam
2017-10-01
Problems that hinder our understanding of SSDS still remain. They are: (1) vague definitions of the phrase "soft-sediment deformation"; (2) complex factors that govern the origin of SSDS; (3) omission of vital empirical data in documenting vertical changes in facies using measured sedimentological logs; (4) difficulties in distinguishing depositional processes from tectonic events; (5) a model-driven interpretation of SSDS (i.e., earthquake being the singular cause); (6) routine application of the genetic term "seismites" to the "SSDS", thus undermining the basic tenet of process sedimentology (i.e., separation of interpretation from observation); (7) the absence of objective criteria to differentiate the 21 triggering mechanisms of liquefaction and related SSDS; (8) application of the process concept "high-density turbidity currents", a process that has never been documented in modern oceans; (9) application of the process concept "sediment creep" with a velocity connotation that cannot be inferred from the ancient record; (10) classification of pockmarks, which are hollow spaces (i.e., without sediments), as SSDS, with their problematic origins by fluid expulsion, sediment degassing, fish activity, etc.; (11) application of the Earth's climate-change model; and most importantly, (12) an arbitrary distinction between depositional process and sediment deformation. Despite a profusion of literature on SSDS, our understanding of their origin remains muddled. A solution to the chronic SSDS problem is to utilize the robust core dataset from scientific drilling at sea (DSDP/ODP/IODP) with a constrained definition of SSDS.
On the deformed Einstein equations and quantum black holes
International Nuclear Information System (INIS)
Dil, E; Ersanli, C C; Kolay, E
2016-01-01
Recently, q-deformed Einstein equations have been studied for extremal quantum black holes, which have been proposed by Strominger to obey deformed statistics. In this study, we give the solutions of the deformed Einstein equations by considering these equations for the charged black holes. We also present the implications of the solutions, such as that the deformation parameters cause the charged black holes to have a smaller mass than the classical Reissner-Nordström black holes. The reduction in mass of a classical black hole can be viewed as a transition from the classical to the quantum black hole regime. (paper)
Stavrianaki, K.; Vallianatos, F.; Sammonds, P. R.; Ross, G. J.
2014-12-01
Fracturing is the most prevalent deformation mechanism in rocks deformed in the laboratory under simulated upper-crustal conditions. Fracturing produces acoustic emissions (AE) at the laboratory scale and earthquakes on a crustal scale. The AE technique provides a means to analyse microcracking activity inside the rock volume, and since experiments can be performed under confining pressure to simulate depth of burial, AE can be used as a proxy for natural processes such as earthquakes. Experimental rock deformation provides us with several ways to investigate time-dependent brittle deformation. Two main types of experiments can be distinguished: (1) "constant strain rate" experiments, in which stress varies as a result of deformation, and (2) "creep" experiments, in which deformation and deformation rate vary over time as a result of an imposed constant stress. We conducted constant strain rate experiments on air-dried Darley Dale sandstone samples at a variety of confining pressures (30 MPa, 50 MPa, 80 MPa) and on water-saturated samples with 20 MPa initial pore-fluid pressure. The results from these experiments were used to determine the initial loading in the creep experiments. A non-extensive statistical physics approach was applied to the AE data in order to investigate the spatio-temporal pattern of cracks close to failure. A more detailed study was performed on the data from the creep experiments. When axial strain is plotted against time we obtain the trimodal creep curve. The Tsallis entropic index q is calculated for each stage of the curve and the results are compared with those from the constant strain rate experiments. The Epidemic Type Aftershock Sequence (ETAS) model is also applied to each stage of the creep curve and the ETAS parameters are calculated. We investigate whether these parameters are constant across all stages of the curve, or whether there are interesting patterns of variation. This research has been co-funded by the European Union
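For reference, the ETAS model mentioned above treats seismicity (or AE activity) as a background rate plus self-excited aftershock cascades; its standard temporal conditional intensity (a textbook form, not quoted from this abstract) is

```latex
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{i\,:\,t_i < t} K \, e^{\alpha (m_i - m_0)} \, (t - t_i + c)^{-p},
```

where $\mu$ is the background rate, $t_i$ and $m_i$ are the times and magnitudes of past events, $m_0$ is the reference magnitude, and $K$, $\alpha$, $c$, $p$ are the productivity, magnitude-sensitivity, and Omori-Utsu decay parameters that would be estimated separately for each stage of the creep curve.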
Learning statistical correlation for fast prostate registration in image-guided radiotherapy
International Nuclear Information System (INIS)
Shi Yonghong; Liao Shu; Shen Dinggang
2011-01-01
Purpose: In adaptive radiation therapy of prostate cancer, fast and accurate registration between the planning image and treatment images of the patient is of essential importance. With the authors' recently developed deformable surface model, prostate boundaries in each treatment image can be rapidly segmented, and their correspondences (or relative deformations) to the prostate boundaries in the planning image are also established automatically. However, the dense correspondences in the nonboundary regions, which are especially important for transforming the treatment plan designed in the planning image space to each treatment image space, remain unresolved. This paper presents a novel approach to learning the statistical correlation between deformations of the prostate boundary and nonboundary regions, for rapidly estimating deformations of the nonboundary regions given the deformations of the prostate boundary in a new treatment image. Methods: The main contributions of the proposed method lie in the following aspects. First, the statistical deformation correlation is learned from both the current patient and other training patients, and is further updated adaptively during the radiotherapy. Specifically, in the initial treatment stage, when the number of treatment images collected from the current patient is small, the statistical deformation correlation is mainly learned from other training patients. As more treatment images are collected from the current patient, the patient-specific information plays a more important role in learning the patient-specific statistical deformation correlation, to effectively reflect prostate deformation of the current patient during the treatment. Eventually, only the patient-specific statistical deformation correlation is used to estimate dense correspondences once a sufficient number of treatment images have been acquired from the current patient. Second, the statistical deformation correlation will be learned by using a
Reconstruction of early phase deformations by integrated magnetic and mesotectonic data evaluation
Sipos, András A.; Márton, Emő; Fodor, László
2018-02-01
Markers of brittle faulting are widely used for recovering past deformation phases. Rocks often have oriented magnetic fabrics, which can be interpreted as connected to ductile deformation before cementation of the sediment. This paper reports a novel statistical procedure for the simultaneous evaluation of AMS (Anisotropy of Magnetic Susceptibility) and fault-slip data. The new method analyzes the AMS data without linearization techniques, so that weak AMS lineation and rotational AMS, which are beyond the scope of classical methods, can be assessed. This idea is extended to the evaluation of fault-slip data. While the traditional assumptions of stress inversion are not rejected, the method recovers the stress field via statistical hypothesis testing. In addition, it provides the statistical information needed for the combined evaluation of the AMS and the mesotectonic (0.1 to 10 m) data. In the combined evaluation a statistical test is carried out that helps to decide whether the AMS lineation and the mesotectonic markers (in case of repeated deformation of the oldest set of markers) were formed in the same or in different deformation phases. If this condition is met, the combined evaluation can improve the precision of the reconstruction. When the two data sets do not have a common solution for the direction of extension, the deformational origin of the AMS is questionable. In this case the orientation of the stress field responsible for the AMS lineation might be different from that which caused the brittle deformation. Although most of the examples demonstrate the reconstruction of weak deformations in sediments, the new method is readily applicable to investigating the ductile-brittle transition of any rock formation as long as AMS and fault-slip data are available.
Toward the classification of differential calculi on κ-Minkowski space and related field theories
Energy Technology Data Exchange (ETDEWEB)
Jurić, Tajron; Meljanac, Stjepan; Pikutić, Danijel [Ruđer Bošković Institute, Theoretical Physics Division,Bijenička c.54, HR-10002 Zagreb (Croatia); Štrajn, Rina [Dipartimento di Matematica e Informatica, Università di Cagliari,viale Merello 92, I-09123 Cagliari (Italy); INFN, Sezione di Cagliari,Cagliari (Italy)
2015-07-13
Classification of differential forms on κ-Minkowski space, particularly, the classification of all bicovariant differential calculi of classical dimension is presented. By imposing super-Jacobi identities we derive all possible differential algebras compatible with the κ-Minkowski algebra for time-like, space-like and light-like deformations. Embedding into the super-Heisenberg algebra is constructed using non-commutative (NC) coordinates and one-forms. Particularly, a class of differential calculi with an undeformed exterior derivative and one-forms is considered. Corresponding NC differential calculi are elaborated. Related class of new Drinfeld twists is proposed. It contains twist leading to κ-Poincaré Hopf algebra for light-like deformation. Corresponding super-algebra and deformed super-Hopf algebras, as well as the symmetries of differential algebras are presented and elaborated. Using the NC differential calculus, we analyze NC field theory, modified dispersion relations, and discuss further physical applications.
MULTISCALE SPARSE APPEARANCE MODELING AND SIMULATION OF PATHOLOGICAL DEFORMATIONS
Directory of Open Access Journals (Sweden)
Rami Zewail
2017-08-01
Machine learning and statistical modeling techniques have drawn much interest within the medical imaging research community. However, clinically relevant modeling of anatomical structures continues to be a challenging task. This paper presents a novel method for multiscale sparse appearance modeling in medical images, with application to the simulation of pathological deformations in X-ray images of the human spine. The proposed appearance model benefits from the non-linear approximation power of Contourlets and their ability to capture higher-order singularities, achieving a sparse representation while preserving the accuracy of the statistical model. Independent Component Analysis is used to extract statistically independent modes of variation from the sparse Contourlet-based domain. The new model is then used to simulate clinically relevant pathological deformations in radiographic images.
Statistically motivated model of mechanisms controlling evolution of deformation band substructure
Czech Academy of Sciences Publication Activity Database
Kratochvíl, J.; Kružík, Martin
2016-01-01
Roč. 81, č. 1 (2016), s. 196-208 ISSN 0749-6419 Grant - others:GA ČR(CZ) GAP107/12/0121 Institutional support: RVO:67985556 Keywords: Crystal plasticity * Microstructures * Deformation bands Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 5.702, year: 2016 http://library.utia.cas.cz/separaty/2016/MTR/kruzik-0457407.pdf
Solution of Deformed Einstein Equations and Quantum Black Holes
International Nuclear Information System (INIS)
Dil, Emre; Kolay, Erdinç
2016-01-01
Recently, one- and two-parameter deformed Einstein equations have been studied for extremal quantum black holes, which have been proposed by Strominger to obey deformed statistics. In this study, we give a deeper insight into the deformed Einstein equations and consider their solutions for the extremal quantum black holes. We then present the implications of the solutions, namely that the deformation parameters cause the charged black holes to have a smaller mass than the usual Reissner-Nordström black holes. This reduction in the mass of a usual black hole can be considered as a transition from the classical to the quantum black hole regime.
Plasticity margin recovery during annealing after cold deformation
International Nuclear Information System (INIS)
Bogatov, A.A.; Smirnov, S.V.; Kolmogorov, V.L.
1978-01-01
Restoration of the plasticity margin in steel 20 after cold deformation and annealing at 550-750 °C with soaking for 5-300 min was investigated. The conditions of cold deformation under which the metal acquires microdefects unhealed by subsequent annealing were determined. It was established that if the degree of utilization of the plasticity margin is ψ < 0.5, the plasticity margin in steel 20 can be completely restored by annealing. A mathematical model of the restoration of the plasticity margin by annealing after cold deformation was constructed. A statistical analysis showed good agreement between model and experiment.
Relative scale and the strength and deformability of rock masses
Schultz, Richard A.
1996-09-01
The strength and deformation of rocks depend strongly on the degree of fracturing, which can be assessed in the field and related systematically to these properties. Appropriate Mohr envelopes obtained from the Rock Mass Rating (RMR) classification system and the Hoek-Brown criterion for outcrops and other large-scale exposures of fractured rocks show that rock-mass cohesive strength, tensile strength, and unconfined compressive strength can be reduced by as much as a factor of ten relative to values for the unfractured material. The rock-mass deformation modulus is also reduced relative to Young's modulus. A "cook-book" example illustrates the use of RMR in field applications. The smaller values of rock-mass strength and deformability imply that there is a particular scale of observation whose identification is critical to applying laboratory measurements and associated failure criteria to geologic structures.
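The Hoek-Brown criterion referred to above relates the principal stresses at failure; in its common generalized form (standard notation from the rock-mechanics literature, not quoted from this paper) it reads

```latex
\sigma_1' \;=\; \sigma_3' \;+\; \sigma_{ci}\left( m_b\,\frac{\sigma_3'}{\sigma_{ci}} + s \right)^{a},
```

where $\sigma_{ci}$ is the unconfined compressive strength of the intact rock and $m_b$, $s$, $a$ are rock-mass constants derived from field classification (for intact rock $s = 1$ and $a \approx 0.5$). Increasing fracturing, i.e. lower RMR, drives $m_b$ and $s$ down, which is how the criterion produces the order-of-magnitude strength reductions described in the abstract.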
Advances in statistical models for data analysis
Minerva, Tommaso; Vichi, Maurizio
2015-01-01
This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.
Central limit theorem and deformed exponentials
International Nuclear Information System (INIS)
Vignat, C; Plastino, A
2007-01-01
The central limit theorem (CLT) can be ranked among the most important ones in probability theory and statistics and plays an essential role in several basic and applied disciplines, notably in statistical thermodynamics. We show that there exists a natural extension of the CLT from exponentials to so-called deformed exponentials (also denoted as q-Gaussians). Our proposal applies exactly in the usual conditions in which the classical CLT is used. (fast track communication)
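The deformed exponentials in question are commonly written as follows (a standard definition from the nonextensive-statistics literature, not quoted from the abstract):

```latex
e_q(x) \;=\; \bigl[\, 1 + (1-q)\,x \,\bigr]_{+}^{\frac{1}{1-q}}, \qquad
G_q(x) \;\propto\; e_q\!\left(-\beta x^2\right),
```

where $[\,\cdot\,]_{+}$ denotes the positive part; as $q \to 1$, $e_q(x) \to e^x$ and the q-Gaussian $G_q$ reduces to the ordinary Gaussian, recovering the classical CLT as a limiting case.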
Directory of Open Access Journals (Sweden)
Werkmann Mario
2009-02-01
Full Text Available Abstract Background Up to now, chronic low back pain without radicular symptoms is not classified and is attributed in the international literature as being "unspecific". For specific bracing of this patient group we use simple physical tests to predict the brace type the patient is most likely to benefit from. Based on these physical tests we have developed a simple functional classification of "unspecific" low back pain in patients with spinal deformities. Methods Between January 2006 and July 2007 we tested 130 patients (116 females and 14 males) with spinal deformities (average age 45 years, range 14 to 69 years) and chronic unspecific low back pain (pain for > 24 months) along with the indication for brace treatment for chronic unspecific low back pain. Some of the patients had symptoms of spinal claudication (n = 16). The "sagittal realignment test" (SRT), a lumbar hyperextension test, and the "sagittal delordosation test" (SDT) were applied. Additionally, 3 female patients with spondylolisthesis were tested, including one with symptoms of spinal claudication; 2 of these patients were 14 years of age and the other 43 years old at the time of testing. Results 117 patients reported significant pain release in the SRT and 13 in the SDT (≥ 2 steps in the Roland & Morris VRS). 3 patients had no significant pain release in either of the tests (…). Pain intensity was high (3.29) before performing the physical tests (VRS scale 0–5) and low (1.37) while performing the physical test for the whole sample of patients. The differences were highly significant in the Wilcoxon test (z = -3.79; p …). In the 16 patients who did not respond to the SRT in the manual investigation we found hypermobility at L5/S1 or a spondylolisthesis at level L5/S1. In the other patients who responded well to the SRT, loss of lumbar lordosis was the main issue, a finding which, according to the scientific literature, correlates well with low back pain. The 3 patients who did not
Frequency of foot deformity in preschool girls
Directory of Open Access Journals (Sweden)
Mihajlović Ilona
2010-01-01
Full Text Available Background/Aim. In order to determine the moment of onset of postural disorders, regardless of their causes, it is necessary to examine the moment of entry of children into a new environment, i.e. into kindergarten or school. There is weak evidence about the age period when foot deformity occurs and about the types of these deformities. The aim of this study was to establish the relationship between the occurrence of foot deformities and age characteristics of girls. Methods. The research was conducted in the preschools 'Radosno detinjstvo' in the region of Novi Sad, using the method of random selection, on a sample of 272 girls, 4-7 years of age, classified into four strata according to the year of birth. To determine foot deformities, a measurement technique using computerized digitized pedography (CDP) was applied. Results. In the preschool girl population, the deformities pes transversoplanus and calcanei valga occurred in a very high percentage (over 90%). A disturbed longitudinal instep, i.e. flat feet, also appeared in a high percentage, but we noted improvement of this deformity with increasing age. Namely, there was a statistically significant correlation between age and this deformity: as a child grows older, the deformity diminishes. Conclusion. This study confirmed that the formation of foot arches probably does not end at the age of 3-4 years but lasts until school age.
Classification of Noisy Data: An Approach Based on Genetic Algorithms and Voronoi Tessellation
DEFF Research Database (Denmark)
Khan, Abdul Rauf; Schiøler, Henrik; Knudsen, Torben
Classification is one of the major constituents of the data-mining toolkit. The well-known methods for classification are built on either the principle of logic or statistical/mathematical reasoning. In this article we propose (1) a different strategy, which is based on the po…
Arabic text classification using Polynomial Networks
Directory of Open Access Journals (Sweden)
Mayy M. Al-Tahrawi
2015-10-01
Full Text Available In this paper, an Arabic statistical learning-based text classification system has been developed using Polynomial Neural Networks. Polynomial Networks have been recently applied to English text classification, but they were never used for Arabic text classification. In this research, we investigate the performance of Polynomial Networks in classifying Arabic texts. Experiments are conducted on a widely used Arabic dataset in text classification: Al-Jazeera News dataset. We chose this dataset to enable direct comparisons of the performance of Polynomial Networks classifier versus other well-known classifiers on this dataset in the literature of Arabic text classification. Results of experiments show that Polynomial Networks classifier is a competitive algorithm to the state-of-the-art ones in the field of Arabic text classification.
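The paper's Polynomial Neural Networks are not reproduced here; as a rough, hypothetical analogue, a second-order polynomial expansion of TF-IDF features feeding a linear classifier captures the same idea of polynomial decision surfaces over term features. The toy documents and labels below are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Invented two-class toy corpus standing in for Arabic news documents.
docs = ["economy market trade", "match goal team", "market stocks economy",
        "team player goal", "trade economy stocks", "goal match player"]
labels = ["econ", "sport", "econ", "sport", "econ", "sport"]

# TF-IDF terms -> degree-2 polynomial (pairwise term interactions) -> linear model.
clf = make_pipeline(TfidfVectorizer(), PolynomialFeatures(degree=2),
                    LogisticRegression(max_iter=1000))
clf.fit(docs, labels)
preds = clf.predict(["stocks and trade", "goal by the team"])
print(preds)
```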
International Nuclear Information System (INIS)
Oh, Jung Su; Lee, Jae Sung; Kim, Yu Kyeong; Chung, June Key; Lee, Myung Chul; Lee, Dong Soo
2005-01-01
In statistical probabilistic mapping, differences between two or more groups of subjects are commonly analyzed statistically following spatial normalization. However, to the best of our knowledge, there are few studies that have performed the statistical mapping in the individual brain space rather than in the stereotaxic brain space, i.e., the template space. Therefore, in the current study, a new method for mapping statistical results in the template space onto the individual brain space has been developed. Four young subjects with epilepsy and thirty age-matched normal healthy subjects were recruited. Both FDG PET and T1 structural MRI were scanned in these groups. Statistical analysis of the decreased FDG metabolism in epilepsy was performed with SPM using a two-sample t-test (p < 0.001, intensity threshold 100). To map the statistical results onto individual space, inverse deformation was performed as follows. With the SPM deformation toolbox, DCT (discrete cosine transform) basis-encoded deformation fields between individual T1 images and the T1 MNI template were obtained. Afterward, the inverses of those fields, i.e., inverse deformation fields, were obtained. Since both PET and T1 images had already been normalized into the same MNI space, the inversely deformed PET results lie in the individual brain MRI space. By applying the inverse deformation field to the statistical results of the PET, the statistical map of decreased metabolism in individual spaces was obtained. With the statistical results in the template space, the decreased metabolism was localized in the inferior temporal lobe, slightly inferior to the hippocampus. The statistical results in the individual space were commonly located in the hippocampus, where the metabolism should be decreased according to a priori neuroscientific knowledge. With our newly developed statistical mapping onto the individual spaces, the localization of the brain functional mapping became more appropriate in the neuroscientific sense.
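The SPM machinery is not reproduced here; as a toy analogue of the final step (resampling a template-space statistical map through an inverse displacement field into subject space), `scipy.ndimage.map_coordinates` can be used. The 5×5 map and the constant unit shift are invented for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Toy analogue of pulling a template-space statistical map back into
# subject space by resampling through an inverse displacement field.
stat_map = np.zeros((5, 5))
stat_map[2, 2] = 1.0                         # one "significant" voxel
dy = np.full((5, 5), 1.0)                    # invented inverse field:
dx = np.zeros((5, 5))                        # sample one row below
rows, cols = np.mgrid[0:5, 0:5]
warped = map_coordinates(stat_map, [rows + dy, cols + dx], order=0)
print(np.argwhere(warped == 1.0))            # voxel lands at (1, 2)
```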
Anyons, deformed oscillator algebras and projectors
International Nuclear Information System (INIS)
Engquist, Johan
2009-01-01
We initiate an algebraic approach to the many-anyon problem based on deformed oscillator algebras. The formalism utilizes a generalization of the deformed Heisenberg algebras underlying the operator solution of the Calogero problem. We define a many-body Hamiltonian and an angular momentum operator which are relevant for a linearized analysis in the statistical parameter ν. There exists a unique ground state and, in spite of the presence of defect lines, the anyonic weight lattices are completely connected by the application of the oscillators of the algebra. This is achieved by supplementing the oscillator algebra with a certain projector algebra.
Classification and Analysis of Computer Network Traffic
DEFF Research Database (Denmark)
Bujlow, Tomasz
2014-01-01
…various classification modes (decision trees, rulesets, boosting, softening thresholds) regarding the classification accuracy and the time required to create the classifier. We showed how to use our VBS tool to obtain per-flow, per-application, and per-content statistics of traffic in computer networks…
Generalized field quantization and statistics of elementary particles
International Nuclear Information System (INIS)
Govorkov, A.V.
1994-01-01
Generalized schemes for the quantization of free fields based on the deformed trilinear relations of Green are investigated. A theorem shows that continuous deformation is in fact impossible. In particular, it is shown that a "small" violation of the ordinary Fermi and Bose statistics is impossible both in the framework of local field theory, corresponding to parastatistics of finite orders, and in the framework of nonlocal field theory, corresponding to infinite statistics. The existence of antiparticles plays a decisive role in establishing this result. 23 refs
Statistical classification of road pavements using near field vehicle rolling noise measurements.
Paulo, Joel Preto; Coelho, J L Bento; Figueiredo, Mário A T
2010-10-01
Low noise surfaces have been increasingly considered as a viable and cost-effective alternative to acoustical barriers. However, road planners and administrators frequently lack information on the correlation between the type of road surface and the resulting noise emission profile. To address this problem, a method to identify and classify different types of road pavements was developed, whereby near field road noise is analyzed using statistical learning methods. The vehicle rolling sound signal near the tires and close to the road surface was acquired by two microphones in a special arrangement which implements the Close-Proximity method. A set of features, characterizing the properties of the road pavement, was extracted from the corresponding sound profiles. A feature selection method was used to automatically select those that are most relevant in predicting the type of pavement, while reducing the computational cost. A set of different types of road pavement segments were tested and the performance of the classifier was evaluated. Results of pavement classification performed during a road journey are presented on a map, together with geographical data. This procedure leads to a considerable improvement in the quality of road pavement noise data, thereby increasing the accuracy of road traffic noise prediction models.
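The paper's acoustic feature set and classifier are not specified in this abstract; the sketch below only mirrors the described workflow (feature extraction, automatic feature selection, supervised classification) on synthetic data, with scikit-learn's `SelectKBest` and an SVM as stand-ins.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for sound-profile features of 3 pavement types;
# the real study extracted features from CPX microphone signals.
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# Select the most discriminative features before classification,
# reducing computational cost as in the paper's workflow.
clf = make_pipeline(SelectKBest(f_classif, k=8), SVC(kernel="rbf"))
score = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {score:.2f}")
```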
Computerized Classification Testing with the Rasch Model
Eggen, Theo J. H. M.
2011-01-01
If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
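A minimal sketch of Wald's SPRT as used in classification testing, simplified to i.i.d. Bernoulli responses with assumed cut probabilities p0 and p1; a real CCT would use Rasch-model item likelihoods instead.

```python
import math

def sprt(responses, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
    """Wald's SPRT: decide between success probability p0 (fail)
    and p1 (pass) from a sequence of 0/1 item responses."""
    a = math.log((1 - beta) / alpha)    # upper threshold -> accept p1
    b = math.log(beta / (1 - alpha))    # lower threshold -> accept p0
    llr = 0.0                           # cumulative log-likelihood ratio
    for i, x in enumerate(responses, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= a:
            return "pass", i            # classified after i items
        if llr <= b:
            return "fail", i
    return "undecided", len(responses)

print(sprt([1] * 12))                   # → ('pass', 9)
```

Sequential testing stops as soon as the evidence crosses a threshold, which is why such CCTs need fewer items than estimation-based CATs.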
A pentatonic classification of extreme events
International Nuclear Information System (INIS)
Eliazar, Iddo; Cohen, Morrel H.
2015-01-01
In this paper we present a classification of the extreme events – very small and very large outcomes – of positive-valued random variables. The classification distinguishes five different categories of randomness, ranging from the very ‘mild’ to the very ‘wild’. In analogy with the common five-tone musical scale we term the classification ‘pentatonic’. The classification is based on the analysis of the inherent Gibbsian ‘forces’ and ‘temperatures’ existing on the logarithmic scale of the random variables under consideration, and provides a statistical-physics insight regarding the nature of these random variables. The practical application of the pentatonic classification is remarkably straightforward: it can be performed by non-experts and is demonstrated via an array of examples.
Cloud field classification based on textural features
Sengupta, Sailes Kumar
1989-01-01
An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and structural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution. These are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed based on this matrix in much the same manner as is done in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near-IR visible channel. The classification algorithm used is the well-known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features and at any given spatial resolution, give approximately the same classification accuracy. A neural network based classifier with a feed-forward architecture and a back-propagation training algorithm is used to increase the classification accuracy, using these two classes…
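A minimal sketch of the GLDV idea, assuming a horizontal displacement d and a few common summary statistics (mean, contrast, entropy); the exact feature set used in the study is not specified in this abstract.

```python
import numpy as np

def gldv_features(img, d=1):
    """Grey-level difference vector for horizontal displacement d,
    summarized by statistics usable as texture features."""
    diff = np.abs(img[:, d:].astype(int) - img[:, :-d].astype(int)).ravel()
    hist = np.bincount(diff) / diff.size      # estimate of P(D = k)
    levels = np.arange(hist.size)
    mean = (levels * hist).sum()
    contrast = (levels**2 * hist).sum()
    entropy = -(hist[hist > 0] * np.log(hist[hist > 0])).sum()
    return {"mean": mean, "contrast": contrast, "entropy": entropy}

# Perfectly alternating texture: every horizontal difference equals 1.
img = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0]])
feats = gldv_features(img)
print(feats)
```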
Statistical mechanics in the context of special relativity.
Kaniadakis, G
2002-11-01
In Ref. [Physica A 296, 405 (2001)], starting from the one-parameter deformation of the exponential function exp_κ(x) = (√(1 + κ²x²) + κx)^(1/κ), a statistical mechanics has been constructed which reduces to the ordinary Boltzmann-Gibbs statistical mechanics as the deformation parameter κ approaches zero. The distribution f = exp_κ(−βE + βμ) obtained within this statistical mechanics shows a power-law tail and depends on the unspecified parameter β, containing all the information about the temperature of the system. On the other hand, the entropic form S_κ = ∫ d³p (c_κ f^(1+κ) + c_{−κ} f^(1−κ)), which after maximization produces the distribution f and reduces to the standard Boltzmann-Shannon entropy S₀ as κ → 0, contains the coefficient c_κ whose expression involves, besides the Boltzmann constant, another unspecified parameter α. In the present effort we show that S_κ is the unique existing entropy obtained by a continuous deformation of S₀ that preserves unaltered its fundamental properties of concavity, additivity, and extensivity. These properties of S_κ permit us to determine unequivocally the values of the above-mentioned parameters β and α. Subsequently, we explain the origin of the deformation mechanism introduced by κ and show that this deformation emerges naturally within Einstein's special relativity. Furthermore, we extend the theory in order to treat statistical systems in a time-dependent and relativistic context. Then, we show that it is possible to determine, in a self-consistent scheme within special relativity, the value of the free parameter κ, which turns out to depend on the light speed c and to reduce to zero as c → ∞, recovering in this way the ordinary statistical mechanics and thermodynamics. The statistical mechanics presented here does not contain free parameters and preserves unaltered the mathematical and epistemological structure of
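The κ-deformed exponential is easy to check numerically; a small sketch showing that it reduces to the ordinary exponential as κ → 0:

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-deformed exponential:
    exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k), -> exp(x) as k -> 0."""
    if kappa == 0:
        return math.exp(x)
    return (math.sqrt(1 + kappa**2 * x**2) + kappa * x) ** (1 / kappa)

# The deformation vanishes with kappa: values approach exp(2) ~ 7.389.
for k in (0.5, 0.1, 0.01):
    print(k, round(exp_kappa(2.0, k), 3))
```

For negative arguments the deformed exponential decays as a power law rather than exponentially, which is the origin of the power-law tail of the distribution f.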
Segmentation and classification of biological objects
DEFF Research Database (Denmark)
Schultz, Nette
1995-01-01
The present thesis is on segmentation and classification of biological objects using statistical methods. It is based on case studies dealing with different kinds of pork meat images, and we introduce appropriate statistical methods to solve the tasks in the case studies. The case studies concern...
Generalized t-statistic for two-group classification.
Komori, Osamu; Eguchi, Shinto; Copas, John B
2015-06-01
In the classic discriminant model of two multivariate normal distributions with equal variance matrices, the linear discriminant function is optimal both in terms of the log likelihood ratio and in terms of maximizing the standardized difference (the t-statistic) between the means of the two distributions. In a typical case-control study, normality may be sensible for the control sample but heterogeneity and uncertainty in diagnosis may suggest that a more flexible model is needed for the cases. We generalize the t-statistic approach by finding the linear function which maximizes a standardized difference but with data from one of the groups (the cases) filtered by a possibly nonlinear function U. We study conditions for consistency of the method and find the function U which is optimal in the sense of asymptotic efficiency. Optimality may also extend to other measures of discriminatory efficiency such as the area under the receiver operating characteristic curve. The optimal function U depends on a scalar probability density function which can be estimated non-parametrically using a standard numerical algorithm. A lasso-like version for variable selection is implemented by adding L1-regularization to the generalized t-statistic. Two microarray data sets in the study of asthma and various cancers are used as motivating examples. © 2014, The International Biometric Society.
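For reference, the classical pooled two-sample t-statistic that the paper generalizes can be written directly; the filter function U and its nonparametric estimation are not reproduced here, and the normal samples below are invented.

```python
import numpy as np

def t_statistic(x, y):
    """Pooled two-sample t-statistic: the classical special case
    (identity filter U) of the paper's generalized statistic."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(sp2 * (1 / nx + 1 / ny))

rng = np.random.default_rng(1)
controls = rng.normal(0.0, 1.0, 50)
cases = rng.normal(1.0, 1.0, 50)        # cases shifted by one unit
tval = t_statistic(cases, controls)
print(round(tval, 2))                   # clearly separated groups
```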
Classification Methods for High-Dimensional Genetic Data
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2014-01-01
Vol. 34, No. 1 (2014), pp. 10-18. ISSN 0208-5216. Institutional support: RVO:67985807. Keywords: multivariate statistics * classification analysis * shrinkage estimation * dimension reduction * data mining. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.646, year: 2014
Leydesdorff, L.; Kogler, D.F.; Yan, B.
The Cooperative Patent Classifications (CPC) recently developed cooperatively by the European and US Patent Offices provide a new basis for mapping patents and portfolio analysis. CPC replaces International Patent Classifications (IPC) of the World Intellectual Property Organization. In this study,
All of statistics a concise course in statistical inference
Wasserman, Larry
2004-01-01
This book is for people who want to learn probability and statistics quickly. It brings together many of the main ideas in modern statistics in one place. The book is suitable for students and researchers in statistics, computer science, data mining and machine learning. It covers a much wider range of topics than a typical introductory text on mathematical statistics, including modern topics like nonparametric curve estimation, bootstrapping and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bi…
Coherent states for oscillators of non-conventional statistics
International Nuclear Information System (INIS)
Dao Vong Duc; Nguyen Ba An
1998-12-01
In this work we consider systematically the concept of coherent states for oscillators of non-conventional statistics - parabose oscillator, infinite statistics oscillator and generalised q-deformed oscillator. The expressions for the quadrature variances and particle number distribution are derived and displayed graphically. The obtained results show drastic changes when going from one statistics to another. (author)
Hydrologic landscape regionalisation using deductive classification and random forests.
Directory of Open Access Journals (Sweden)
Stuart C Brown
Full Text Available Landscape classification and hydrological regionalisation studies are being increasingly used in ecohydrology to aid in the management and research of aquatic resources. We present a methodology for classifying hydrologic landscapes based on spatial environmental variables by employing non-parametric statistics and hybrid image classification. Our approach differed from previous classifications, which have required the use of an a priori spatial unit (e.g. a catchment) that necessarily results in the loss of variability that is known to exist within those units. The use of a simple statistical approach to identify an appropriate number of classes eliminated the need for large amounts of post-hoc testing with different numbers of groups, or the selection and justification of an arbitrary number. Using statistical clustering, we identified 23 distinct groups within our training dataset. The use of a hybrid classification employing random forests extended this statistical clustering to an area of approximately 228,000 km² of south-eastern Australia without the need to rely on catchments, landscape units or stream sections. This extension resulted in a highly accurate regionalisation at both 30-m and 2.5-km resolution, and a less-accurate 10-km classification that would be more appropriate for use at a continental scale. A smaller case study, of an area covering 27,000 km², demonstrated that the method preserved the intra- and inter-catchment variability that is known to exist in local hydrology, based on previous research. Preliminary analysis linking the regionalisation to streamflow indices is promising, suggesting that the method could be used to predict streamflow behaviour in ungauged catchments. Our work therefore simplifies current classification frameworks that are becoming more popular in ecohydrology, while better retaining small-scale variability in hydrology, thus enabling future attempts to explain and visualise broad-scale hydrologic…
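A schematic of the hybrid approach under invented environmental variables: statistical clustering labels a training sample, and a random forest extends those labels to unlabelled cells. The study used 23 clusters; 4 are used here for brevity.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Invented stand-ins for climate/terrain variables at sampled locations.
train = rng.normal(size=(500, 3))

# Step 1: unsupervised statistical clustering defines the classes.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(train)

# Step 2: a random forest learns the clusters and extends the
# classification to the full (unlabelled) landscape grid.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(train, labels)
grid = rng.normal(size=(10000, 3))
print(rf.predict(grid[:5]))
```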
Classification differences and maternal mortality
DEFF Research Database (Denmark)
Salanave, B; Bouvier-Colle, M H; Varnoux, N
1999-01-01
OBJECTIVES: To compare the ways maternal deaths are classified in national statistical offices in Europe and to evaluate the ways classification affects published rates. METHODS: Data on pregnancy-associated deaths were collected in 13 European countries. Cases were classified by a European panel of experts into obstetric or non-obstetric causes. An ICD-9 code (International Classification of Diseases) was attributed to each case. These were compared to the codes given in each country. Correction indices were calculated, giving new estimates of maternal mortality rates. SUBJECTS: There were … This change was substantial in three countries (P …); statistical offices appeared to attribute fewer deaths to obstetric causes. In the other countries, no differences were detected. According to official published data, the aggregated maternal mortality rate for participating countries was 7.7 per …
A Novel Texture Classification Procedure by using Association Rules
Directory of Open Access Journals (Sweden)
L. Jaba Sheela
2008-11-01
Full Text Available Texture can be defined as a local statistical pattern of texture primitives in observer’s domain of interest. Texture classification aims to assign texture labels to unknown textures, according to training samples and classification rules. Association rules have been used in various applications during the past decades. Association rules capture both structural and statistical information, and automatically identify the structures that occur most frequently and relationships that have significant discriminative power. So, association rules can be adapted to capture frequently occurring local structures in textures. This paper describes the usage of association rules for texture classification problem. The performed experimental studies show the effectiveness of the association rules. The overall success rate is about 98%.
Video genre classification using multimodal features
Jin, Sung Ho; Bae, Tae Meon; Choo, Jin Ho; Ro, Yong Man
2003-12-01
We propose a video genre classification method using multimodal features. The proposed method is applied for the preprocessing of automatic video summarization or the retrieval and classification of broadcasting video contents. Through a statistical analysis of low-level and middle-level audio-visual features in video, the proposed method can achieve good performance in classifying several broadcasting genres such as cartoon, drama, music video, news, and sports. In this paper, we adopt MPEG-7 audio-visual descriptors as multimodal features of video contents and evaluate the performance of the classification by feeding the features into a decision tree-based classifier which is trained by CART. The experimental results show that the proposed method can recognize several broadcasting video genres with a high accuracy and the classification performance with multimodal features is superior to the one with unimodal features in the genre classification.
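The MPEG-7 descriptors are replaced here by synthetic features; this sketch only shows the CART-style decision-tree stage, with five classes standing in for the five broadcast genres studied.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-ins for low/middle-level audio-visual features;
# 5 classes mirror cartoon, drama, music video, news, and sports.
X, y = make_classification(n_samples=500, n_features=12, n_informative=6,
                           n_classes=5, n_clusters_per_class=1, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# CART grows a binary tree on Gini impurity; depth is capped for readability.
tree = DecisionTreeClassifier(criterion="gini", max_depth=6,
                              random_state=0).fit(Xtr, ytr)
print(f"held-out accuracy: {tree.score(Xte, yte):.2f}")
```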
Directory of Open Access Journals (Sweden)
J. V. Lukovich
2017-07-01
Full Text Available A framework is developed to assess the directional changes in sea ice drift paths and associated deformation processes in response to atmospheric forcing. The framework is based on Lagrangian statistical analyses leveraging particle dispersion theory, which tells us whether ice drift is in a subdiffusive, diffusive, ballistic, or superdiffusive dynamical regime using single-particle (absolute) dispersion statistics. In terms of sea ice deformation, the framework uses two- and three-particle dispersion to characterize along- and across-shear transport as well as differential kinematic parameters. The approach is tested with GPS beacons deployed in triplets on sea ice in the southern Beaufort Sea at varying distances from the coastline in fall of 2009, with eight individual events characterized. One transition in particular follows the sea level pressure (SLP) high on 8 October 2009, while the sea ice drift was in a superdiffusive dynamic regime. In this case, the dispersion scaling exponent (the slope between single-particle absolute dispersion of sea ice drift and elapsed time) changed from superdiffusive (α ∼ 3) to ballistic (α ∼ 2) as the SLP was rounding its maximum pressure value. Following this shift between regimes, there was a loss in synchronicity between sea ice drift and atmospheric motion patterns. While this is only one case study, the outcomes suggest similar studies be conducted on more buoy arrays to test momentum-transfer linkages between storms and sea ice responses as a function of dispersion regime states using scaling exponents. The tools and framework developed in this study provide a unique characterization technique to evaluate these states with respect to sea ice processes in general. Application of these techniques can aid ice hazard assessments and weather forecasting in support of marine transportation and indigenous use of near-shore Arctic areas.
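A minimal sketch of estimating the scaling exponent α from single-particle absolute dispersion, here for one invented trajectory rather than a buoy-ensemble average: straight-line drift gives the ballistic value α ≈ 2.

```python
import numpy as np

def dispersion_exponent(x, y):
    """Scaling exponent alpha from single-particle absolute dispersion
    A2(t) = |r(t) - r(0)|^2 ~ t^alpha (slope on a log-log plot)."""
    t = np.arange(1, len(x))
    a2 = (x[1:] - x[0])**2 + (y[1:] - y[0])**2
    alpha, _ = np.polyfit(np.log(t), np.log(a2), 1)
    return alpha

# Invented ballistic (straight-line) drift: dispersion grows as t^2.
t = np.arange(100, dtype=float)
alpha = dispersion_exponent(1.5 * t, 0.5 * t)
print(round(alpha, 2))   # → 2.0
```

A superdiffusive event would show up as α between 2 and 3 on the same fit; a diffusive random walk gives α ≈ 1.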
A novel deformation mechanism for superplastic deformation
Energy Technology Data Exchange (ETDEWEB)
Muto, H.; Sakai, M. (Toyohashi Univ. of Technology (Japan). Dept. of Materials Science)
1999-01-01
Uniaxial compressive creep tests with strain values up to -0.1 for a β-spodumene glass ceramic were conducted at 1060 °C. From the observation of microstructural changes before and after the creep deformation, it is shown that grain-boundary sliding takes place via cooperative movement of groups of grains rather than of individual grains under large-scale deformation. The deformation process and the surface technique used in this work are applicable not only to explaining the deformation and flow of two-phase ceramics but also to superplastic deformation. (orig.) 12 refs.
25 Years of Quantum Groups: from Definition to Classification
Directory of Open Access Journals (Sweden)
A. Stolin
2008-01-01
Full Text Available In mathematics and theoretical physics, quantum groups are certain non-commutative, non-cocommutative Hopf algebras, which first appeared in the theory of quantum integrable models and were later formalized by Drinfeld and Jimbo. In this paper we present a classification scheme for quantum groups whose classical limit is a polynomial Lie algebra. As a consequence we obtain deformed XXX and XXZ Hamiltonians.
Directory of Open Access Journals (Sweden)
Jocelyn E Bolin
2014-02-01
Full Text Available Statistical classification of phenomena into observed groups is very common in the social and behavioral sciences. Statistical classification methods, however, are affected by the characteristics of the data under study. Statistical classification can be further complicated by initial misclassification of the observed groups. The purpose of this study is to investigate the impact of initial training-data misclassification on several statistical classification and data mining techniques. Misclassification conditions in the three-group case are simulated, and results are presented in terms of overall as well as subgroup classification accuracy. Results show decreased classification accuracy as sample size, group separation and group size ratio decrease, and as misclassification percentage increases, with random forests demonstrating the highest accuracy across conditions.
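A sketch of the simulation design under invented data: flip a fraction of training labels in a three-group problem and record how random-forest test accuracy responds. The feature counts, noise rates, and seeds are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Three-group problem; test labels stay clean so the accuracy drop
# reflects training-data misclassification only.
X, y = make_classification(n_samples=1500, n_features=10, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.4, random_state=0)
rng = np.random.default_rng(0)

accs = {}
for rate in (0.0, 0.2, 0.4):
    noisy = ytr.copy()
    flip = rng.random(len(noisy)) < rate
    noisy[flip] = rng.integers(0, 3, flip.sum())  # random (wrong-ish) labels
    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    accs[rate] = rf.fit(Xtr, noisy).score(Xte, yte)
    print(f"misclassification rate {rate:.0%}: test accuracy {accs[rate]:.2f}")
```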
Iris-based medical analysis by geometric deformation features.
Ma, Lin; Zhang, D; Li, Naimin; Cai, Yan; Zuo, Wangmeng; Wang, Kuanguan
2013-01-01
Iris analysis studies the relationship between human health and changes in the anatomy of the iris. Whereas iris recognition focuses on modeling the overall structure of the iris, iris diagnosis emphasizes detecting and analyzing local variations in the characteristics of irises. This paper focuses on studying the geometrical structure changes in irises that are caused by gastrointestinal diseases, and on measuring the observable deformations in the geometrical structures of irises that are related to the roundness, diameter and other geometric forms of the pupil and the collarette. Pupil- and collarette-based features are defined and extracted. A series of experiments is implemented on our experimental pathological iris database, including manual clustering of both normal and pathological iris images, manual classification by non-specialists, manual classification by individuals with a medical background, verification of the classification ability of the proposed features, and disease recognition by applying the proposed features. The results demonstrate the effectiveness and clinical diagnostic significance of the proposed features and a reliable recognition performance for automatic disease diagnosis. Our research results offer a novel systematic perspective for iridology studies and promote progress in both theoretical and practical work in iris diagnosis.
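The roundness feature of a pupil or collarette contour can be sketched with the classic isoperimetric ratio 4πA/P² (the contours below are synthetic; the authors' exact feature definitions may differ):

```python
import math

def polygon_area_perimeter(pts):
    """Shoelace area and perimeter of a closed polygon given as (x, y) points."""
    area = 0.0
    perim = 0.0
    for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
        area += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    return abs(area) / 2.0, perim

def roundness(pts):
    """4*pi*A / P**2: 1.0 for a perfect circle, smaller for elongated shapes."""
    area, perim = polygon_area_perimeter(pts)
    return 4.0 * math.pi * area / perim ** 2

# Synthetic contours standing in for extracted pupil boundaries.
ts = [2.0 * math.pi * k / 360 for k in range(360)]
circle = [(math.cos(t), math.sin(t)) for t in ts]
ellipse = [(math.cos(t), 0.5 * math.sin(t)) for t in ts]
```

A deformed (elongated) pupil yields a lower roundness score than a circular one, which is the kind of geometric deviation the features are designed to capture.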
Formation of disorientations in dislocation structures during plastic deformation
DEFF Research Database (Denmark)
Pantleon, W.
2002-01-01
Disorientations developing during plastic deformation in dislocation structures are investigated. Based on expected mechanisms for the formation of different types of dislocation boundaries (statistical trapping of dislocations or differently activated slip systems) the formation of the disorient...
Classification of astrocytomas and meningiomas using statistical discriminant analysis on MRI data
International Nuclear Information System (INIS)
Siromoney, Anna; Prasad, G.N.S.; Raghuram, Lakshminarayan; Korah, Ipeson; Siromoney, Arul; Chandrasekaran, R.
2001-01-01
The objective of this study was to investigate the usefulness of multivariate discriminant analysis for classifying two groups of primary brain tumours, astrocytomas and meningiomas, from magnetic resonance images. Discriminant analysis is a multivariate technique concerned with separating distinct sets of objects and with allocating new objects to previously defined groups. Allocation or classification rules are usually developed from learning examples in a supervised learning environment. Data from signal intensity measurements in the multiple scans performed on each patient during routine clinical scanning were analysed using Fisher's classification, which is one method of discriminant analysis.
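Fisher's two-class discriminant can be sketched for 2-D feature vectors: the direction w = S_w⁻¹(m₁ − m₂) maximizes between-class separation relative to within-class scatter, and new cases are allocated by projecting onto w. The signal-intensity pairs below are invented for illustration, not the study's data:

```python
def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def scatter(vs, m):
    """Within-class scatter matrix (2x2) around mean m."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(a, b):
    """w = Sw^-1 (m_a - m_b) for two 2-D classes."""
    ma, mb = mean(a), mean(b)
    sa, sb = scatter(a, ma), scatter(b, mb)
    sw = [[sa[i][j] + sb[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    d = [ma[0] - mb[0], ma[1] - mb[1]]
    return [inv[0][0] * d[0] + inv[0][1] * d[1],
            inv[1][0] * d[0] + inv[1][1] * d[1]]

# Hypothetical signal-intensity pairs (e.g. two MR sequences) for two tumour groups.
astro = [(1.0, 2.0), (1.2, 2.1), (0.9, 1.8), (1.1, 2.2)]
menin = [(3.0, 4.0), (3.1, 4.2), (2.9, 3.9), (3.2, 4.1)]
w = fisher_direction(astro, menin)

# Allocate by projecting onto w and comparing against the midpoint of the class means.
proj = lambda p: w[0] * p[0] + w[1] * p[1]
threshold = (proj(mean(astro)) + proj(mean(menin))) / 2.0
```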
Classification Using Markov Blanket for Feature Selection
DEFF Research Database (Denmark)
Zeng, Yifeng; Luo, Jian
2009-01-01
Selecting relevant features is in demand when a large data set is of interest in a classification task. It produces a tractable number of features that are sufficient and possibly improve the classification performance. This paper studies a statistical method of Markov blanket induction algorithm...... for filtering features and then applies a classifier using the Markov blanket predictors. The Markov blanket contains a minimal subset of relevant features that yields optimal classification performance. We experimentally demonstrate the improved performance of several classifiers using a Markov blanket...... induction as a feature selection method. In addition, we point out an important assumption behind the Markov blanket induction algorithm and show its effect on the classification performance....
On classification of N=2 supersymmetric theories
International Nuclear Information System (INIS)
Cecotti, S.; Vafa, C.
1993-01-01
We find a relation between the spectrum of solitons of massive N=2 quantum field theories in d=2 and the scaling dimensions of chiral fields at the conformal point. The condition that the scaling dimensions be real imposes restrictions on the soliton numbers and leads to a classification program for symmetric N=2 conformal theories and their massive deformations in terms of a suitable generalization of Dynkin diagrams (which coincides with the A-D-E Dynkin diagrams for minimal models). The Landau-Ginzburg theories are a proper subset of this classification. In the particular case of LG theories we relate the soliton numbers with intersection of vanishing cycles of the corresponding singularity; the relation between soliton numbers and the scaling dimensions in this particular case is a well known application of Picard-Lefschetz theory. (orig.)
Using machine learning, neural networks and statistics to predict bankruptcy
Pompe, P.P.M.; Feelders, A.J.
1997-01-01
Recent literature strongly suggests that machine learning approaches to classification outperform "classical" statistical methods. We make a comparison between the performance of linear discriminant analysis, classification trees, and neural networks in predicting corporate bankruptcy. Linear
International Nuclear Information System (INIS)
Levine, Lyle E.; Geantil, Peter; Larson, Bennett C.; Tischler, Jonathan Z.; Kassner, Michael E.; Liu, Wenjun; Stoudt, Mark R.; Tavazza, Francesca
2011-01-01
Highlights: → Axial elastic strains were measured from numerous individual, contiguous dislocation cell walls and cell interiors. → The mean stresses for the cell walls and cell interiors were of opposite sign, in agreement with theoretical predictions. → The separation between the mean cell wall and cell interior stresses was about 20% of the flow stress. → Broad distributions of dipolar stresses were observed that are consistent with a simple size-scaling model. - Abstract: The strength of wavy glide metals increases dramatically during deformation as dislocations multiply and entangle, forming dense dislocation wall structures. Numerous competing models have been proposed for this process but experimental validation and guidance for further model development require new experimental approaches capable of resolving local stresses within the dislocation microstructure. We use three-dimensional X-ray microscopy combining submicrometer spatial resolution with diffracted-beam masking to make direct measurements of axial elastic strain (and thus stress) in individual dislocation cell walls and their adjacent cell interiors in heavily deformed copper. These spatially resolved measurements show broad, asymmetric distributions of dipolar stresses that directly discriminate between long-standing deformation models and demonstrate that the distribution of local stresses is statistically connected to the global behavior through simple rules.
Guo, Yanrong; Gao, Yaozong; Shao, Yeqin; Price, True; Oto, Aytekin; Shen, Dinggang
2014-07-01
prostate surface and trained to adaptively capture the appearance in different prostate zones, thus achieving better local tissue differentiation. For each local region, multiple classifiers are trained based on the randomly selected samples and finally assembled by a specific fusion method. In addition to this nonparametric appearance model, a prostate shape model is learned from the shape statistics using a novel approach, sparse shape composition, which can model nonGaussian distributions of shape variation and regularize the 3D mesh deformation by constraining it within the observed shape subspace. The proposed method has been evaluated on two datasets consisting of T2-weighted MR prostate images. For the first (internal) dataset, the classification effectiveness of the authors' improved dictionary learning has been validated by comparing it with three other variants of traditional dictionary learning methods. The experimental results show that the authors' method yields a Dice Ratio of 89.1% compared to the manual segmentation, which is more accurate than the three state-of-the-art MR prostate segmentation methods under comparison. For the second dataset, the MICCAI 2012 challenge dataset, the authors' proposed method yields a Dice Ratio of 87.4%, which also achieves better segmentation accuracy than other methods under comparison. A new magnetic resonance image prostate segmentation method is proposed based on the combination of deformable model and dictionary learning methods, which achieves more accurate segmentation performance on prostate T2 MR images.
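The Dice Ratio used above to score segmentations against the manual reference is straightforward to compute; a minimal sketch on toy voxel sets (the masks below are illustrative):

```python
def dice_ratio(auto_mask, manual_mask):
    """Dice overlap between two voxel sets: 2|A∩B| / (|A| + |B|)."""
    a, b = set(auto_mask), set(manual_mask)
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy voxel-index masks standing in for an automatic segmentation and its manual reference.
auto = {(0, 0), (0, 1), (1, 0), (1, 1)}
manual = {(0, 1), (1, 0), (1, 1), (2, 1)}
overlap = dice_ratio(auto, manual)
```

A Dice Ratio of 1.0 means perfect agreement; the 89.1% reported in the paper corresponds to a value of 0.891 on this scale.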
A two-dimensional deformable phantom for quantitatively verifying deformation algorithms
Energy Technology Data Exchange (ETDEWEB)
Kirby, Neil; Chuang, Cynthia; Pouliot, Jean [Department of Radiation Oncology, University of California San Francisco, San Francisco, California 94143-1708 (United States)
2011-08-15
Purpose: The incorporation of deformable image registration into the treatment planning process is rapidly advancing. For this reason, the methods used to verify the underlying deformation algorithms must evolve equally fast. This manuscript proposes a two-dimensional deformable phantom, which can objectively verify the accuracy of deformation algorithms, as the next step for improving these techniques. Methods: The phantom represents a single plane of the anatomy for a head and neck patient. Inflation of a balloon catheter inside the phantom simulates tumor growth. CT and camera images of the phantom are acquired before and after its deformation. Nonradiopaque markers reside on the surface of the deformable anatomy and are visible through an acrylic plate, which enables an optical camera to measure their positions, thus establishing the ground-truth deformation. This measured deformation is directly compared to the predictions of deformation algorithms using several similarity metrics. The ratio of the number of points with more than a 3 mm deformation error over the number that are deformed by more than 3 mm is used as an error metric to evaluate algorithm accuracy. Results: An optical method of characterizing deformation has been successfully demonstrated. For the tests of this method, the balloon catheter deforms 32 out of the 54 surface markers by more than 3 mm. Different deformation errors result from the different similarity metrics. The most accurate deformation predictions had an error of 75%. Conclusions: The results presented here demonstrate the utility of the phantom for objectively verifying deformation algorithms and determining which is the most accurate. They also indicate that the phantom would benefit from more electron density heterogeneity. The reduction of the deformable anatomy to a two-dimensional system allows for the use of nonradiopaque markers, which do not influence deformation algorithms. This is the fundamental advantage of this
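The error metric described, the ratio of markers with more than a 3 mm deformation error to markers deformed by more than 3 mm, can be sketched directly (the marker data below are invented for illustration):

```python
def phantom_error_metric(records, tol=3.0):
    """Fraction of markers with prediction error > tol, among markers whose
    ground-truth deformation exceeds tol.
    Each record is (measured_deformation_mm, algorithm_error_mm)."""
    deformed = [(d, e) for d, e in records if d > tol]
    bad = sum(1 for d, e in deformed if e > tol)
    return bad / len(deformed) if deformed else 0.0

# Hypothetical marker data: (ground-truth deformation, prediction error), in mm.
markers = [(5.0, 4.0), (6.0, 2.0), (8.0, 3.5), (2.0, 5.0), (1.0, 0.5), (4.0, 1.0)]
metric = phantom_error_metric(markers)
```

Conditioning on markers deformed by more than the tolerance prevents an algorithm from scoring well simply because most of the anatomy barely moved.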
REAL-TIME INTELLIGENT MULTILAYER ATTACK CLASSIFICATION SYSTEM
Directory of Open Access Journals (Sweden)
T. Subbhulakshmi
2014-01-01
Full Text Available Intrusion Detection Systems (IDS) take the lion’s share of the current security infrastructure. Detection of intrusions is vital for initiating the defensive procedures. Intrusion detection has traditionally been done by statistical and distance-based methods, which use a threshold value to indicate the level of normalcy; when the network traffic crosses this threshold, it is flagged as anomalous. When new intrusion events occur, which are increasingly a key part of system security, such statistical techniques cannot detect them. To overcome this issue, learning techniques are used, which help in identifying new intrusion activities in a computer system. The objective of the system proposed in this paper is to classify intrusions using an Intelligent Multi-Layered Attack Classification System (IMLACS), which helps in detecting and classifying intrusions with improved classification accuracy. The intelligent multi-layered approach contains three intelligent layers. The first layer uses binary Support Vector Machine classification to distinguish normal traffic from attacks. The second layer uses neural network classification to assign attacks to attack classes. The third layer uses a fuzzy inference system to classify the attacks into various subclasses. The proposed IMLACS is able to detect intrusive network behavior since the system combines three intelligent classification layers with a better set of rules. Feature selection is also used to improve the detection time. The experimental results show that IMLACS achieves a classification rate of 97.31%.
Comparison of Danish dichotomous and BI-RADS classifications of mammographic density.
Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My; Vejborg, Ilse; Andersen, Zorana Jovanovic
2014-06-01
In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or as mixed/dense. This dichotomous mammographic density classification system is unique internationally and had not been validated before. Our aim was to compare the Danish dichotomous mammographic density classification system from 1991 to 2001 with the BI-RADS density classification, in an attempt to validate the Danish classification system. The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive, and which were re-assessed in 2012 and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification as fatty or mixed/dense and the four-level BI-RADS classification by the linearly weighted kappa statistic. Of the 120 women, 32 (26.7%) were classified as having fatty and 88 (73.3%) as having mixed/dense mammographic density according to the Danish dichotomous classification. According to the BI-RADS density classification, 12 (10.0%) women were classified as having predominantly fatty (BI-RADS code 1), 46 (38.3%) as having scattered fibroglandular (BI-RADS code 2), 57 (47.5%) as having heterogeneously dense (BI-RADS code 3), and five (4.2%) as having extremely dense (BI-RADS code 4) mammographic density. The inter-rater agreement assessed by the weighted kappa statistic was substantial (0.75). The dichotomous mammographic density classification system utilized in the early years of Copenhagen's mammographic screening program (1991-2001) agreed well with the BI-RADS density classification system.
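The linearly weighted kappa used above can be computed directly from an inter-rater confusion matrix. A minimal sketch in pure Python (the 2×2 table below is illustrative, not the paper's data; note that for two categories the linear weighting reduces to unweighted kappa):

```python
def linear_weighted_kappa(confusion):
    """Linearly weighted kappa for a k x k inter-rater confusion matrix."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    rows = [sum(row) for row in confusion]
    cols = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    d_obs = d_exp = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)  # linear disagreement weight
            d_obs += w * confusion[i][j] / n          # observed disagreement
            d_exp += w * rows[i] * cols[j] / n ** 2   # chance disagreement
    return 1.0 - d_obs / d_exp

# Illustrative 2x2 table: Danish fatty/dense ratings vs. BI-RADS collapsed to two levels.
table = [[30, 2],   # rated fatty
         [10, 78]]  # rated mixed/dense
kappa = linear_weighted_kappa(table)
```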
Volunteer-Based System for classification of traffic in computer networks
DEFF Research Database (Denmark)
Bujlow, Tomasz; Balachandran, Kartheepan; Riaz, M. Tahir
2011-01-01
To overcome the drawbacks of existing methods for traffic classification (by ports, Deep Packet Inspection, statistical classification) a new system was developed, in which the data are collected from client machines. This paper presents design of the system, implementation, initial runs and obta...
Kerr, Laura T.; Adams, Aine; O'Dea, Shirley; Domijan, Katarina; Cullen, Ivor; Hennelly, Bryan M.
2014-05-01
Raman microspectroscopy can be applied to the urinary bladder for highly accurate classification and diagnosis of bladder cancer. This technique can be applied in vitro to bladder epithelial cells obtained from urine cytology or in vivo as an "optical biopsy" to provide results in real-time with higher sensitivity and specificity than current clinical methods. However, there exists a high degree of variability across experimental parameters which need to be standardised before this technique can be utilized in an everyday clinical environment. In this study, we investigate different laser wavelengths (473 nm and 532 nm), sample substrates (glass, fused silica and calcium fluoride) and multivariate statistical methods in order to gain insight into how these various experimental parameters impact on the sensitivity and specificity of Raman cytology.
Directory of Open Access Journals (Sweden)
S. Arivazhagan
2011-11-01
Full Text Available Texture classification is important in applications of computer image analysis for the characterization or classification of images based on local spatial variations of intensity or color. Texture can be defined as consisting of mutually related elements. This paper proposes an experimental approach for identifying a suitable multi-resolution transform for the characterization and classification of different texture groups based on statistical and co-occurrence features derived from multi-resolution transformed subbands. The statistical and co-occurrence feature sets are extracted for various multi-resolution transforms such as the Discrete Wavelet Transform (DWT), Stationary Wavelet Transform (SWT), Double Density Wavelet Transform (DDWT) and Dual Tree Complex Wavelet Transform (DTCWT), and then the transform that maximizes the texture classification performance for the particular texture group is identified.
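A minimal sketch of extracting statistical features from multi-resolution subbands, using a one-level 2-D Haar transform (the simplest DWT) in place of the transform family compared in the paper:

```python
def haar_pairs(v):
    """One-level 1-D Haar split: (approximation, detail) coefficient lists."""
    approx = [(v[i] + v[i + 1]) / 2.0 for i in range(0, len(v), 2)]
    detail = [(v[i] - v[i + 1]) / 2.0 for i in range(0, len(v), 2)]
    return approx, detail

def haar2d(img):
    """One-level 2-D Haar transform of an even-sized grayscale image (list of rows).
    Returns the LL, LH, HL, HH subbands."""
    lo_rows, hi_rows = [], []
    for row in img:
        a, d = haar_pairs(row)
        lo_rows.append(a)
        hi_rows.append(d)

    def cols(mat):
        return [list(c) for c in zip(*mat)]

    def split_cols(mat):
        a_cols, d_cols = [], []
        for col in cols(mat):
            a, d = haar_pairs(col)
            a_cols.append(a)
            d_cols.append(d)
        return cols(a_cols), cols(d_cols)  # transpose back to row-major

    ll, lh = split_cols(lo_rows)
    hl, hh = split_cols(hi_rows)
    return ll, lh, hl, hh

def energy(band):
    """A simple statistical subband feature: total energy."""
    return sum(x * x for row in band for x in row)

# A perfectly flat texture: all detail-subband energies vanish.
flat = [[7.0] * 4 for _ in range(4)]
ll, lh, hl, hh = haar2d(flat)
```

Statistics such as the mean, standard deviation, or energy of each subband form the feature vector fed to the classifier; co-occurrence features would be computed on the subbands in the same way.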
Bazan, Carlos; Hawkins, Trevor; Torres-Barba, David; Blomgren, Peter; Paolini, Paul
2011-08-22
We are exploring the viability of a novel approach to cardiocyte contractility assessment based on biomechanical properties of the cardiac cells, energy conservation principles, and information content measures. We define our measure of cell contraction as being the distance between the shapes of the contracting cell, assessed by the minimum total energy of the domain deformation (warping) of one cell shape into another. To guarantee a meaningful vis-à-vis correspondence between the two shapes, we employ both a data fidelity term and a regularization term. The data fidelity term is based on nonlinear features of the shapes while the regularization term enforces the compatibility between the shape deformations and that of a hyper-elastic material. We tested the proposed approach by assessing the contractile responses in isolated adult rat cardiocytes and contrasted these measurements against two different methods for contractility assessment in the literature. Our results show good qualitative and quantitative agreements with these methods as far as frequency, pacing, and overall behavior of the contractions are concerned. We hypothesize that the proposed methodology, once appropriately developed and customized, can provide a framework for computational cardiac cell biomechanics that can be used to integrate both theory and experiment. For example, besides giving a good assessment of contractile response of the cardiocyte, since the excitation process of the cell is a closed system, this methodology can be employed in an attempt to infer statistically significant model parameters for the constitutive equations of the cardiocytes.
A hazard and risk classification system for catastrophic rock slope failures in Norway
Hermanns, R.; Oppikofer, T.; Anda, E.; Blikra, L. H.; Böhme, M.; Bunkholt, H.; Dahle, H.; Devoli, G.; Eikenæs, O.; Fischer, L.; Harbitz, C. B.; Jaboyedoff, M.; Loew, S.; Yugsi Molina, F. X.
2012-04-01
outburst floods. It became obvious that large rock slope failures cannot be evaluated on a slope scale with frequency analyses of historical and prehistorical events only, as multiple rockslides have occurred within one century on a single slope that, prior to the recent failures, had been inactive for several thousand years. In addition, a systematic analysis of the temporal distribution indicates that rockslide activity following deglaciation after the Last Glacial Maximum was much higher than throughout the Holocene. Therefore the classification system has to be based primarily on the geological conditions on the deforming slope and on the deformation rates, and only to a lesser extent on frequency analyses. Our hazard classification is therefore based on several criteria: 1) development of the back-scarp, 2) development of the lateral release surfaces, 3) development of the potential basal sliding surface, 4) morphologic expression of the basal sliding surface, 5) kinematic feasibility tests for different displacement mechanisms, 6) landslide displacement rates, 7) change of displacement rates (acceleration), 8) increase of rockfall activity on the unstable rock slope, 9) presence of post-glacial events of similar size along the affected slope and in its vicinity. For each of these criteria several conditions are possible to choose from (e.g. different velocity classes for the displacement rate criterion). A score is assigned to each condition and the sum of all scores gives the total susceptibility score. Since many of these observations are somewhat uncertain, the classification system is organized in a decision tree where probabilities can be assigned to each condition. All possibilities in the decision tree are computed and the individual probabilities giving the same total score are summed. Basic statistics show the minimum and maximum total scores of a scenario, as well as the mean and modal value.
The final output is a cumulative frequency distribution of
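The probabilistic scoring described above, where each uncertain observation contributes a probability-weighted score and all branches of the decision tree are enumerated, can be sketched as follows. The criteria names, scores, and probabilities below are hypothetical, not the system's actual values:

```python
from itertools import product

# Hypothetical hazard criteria: each maps to (score, probability) options that an
# assessor can weight when the field observation is uncertain.
criteria = {
    "back_scarp":        [(0, 0.2), (1, 0.8)],
    "displacement_rate": [(1, 0.5), (2, 0.5)],
    "rockfall_activity": [(0, 0.7), (2, 0.3)],
}

def score_distribution(criteria):
    """Enumerate all branches of the decision tree; sum probabilities per total score."""
    dist = {}
    for combo in product(*criteria.values()):
        total = sum(score for score, _ in combo)
        prob = 1.0
        for _, p in combo:
            prob *= p
        dist[total] = dist.get(total, 0.0) + prob
    return dist

dist = score_distribution(criteria)
mode_score = max(dist, key=dist.get)
mean_score = sum(s * p for s, p in dist.items())
```

Sorting `dist` by score and accumulating the probabilities yields the cumulative frequency distribution that forms the final output.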
Using Context Variety and Students' Discussions in Recognizing Statistical Situations
Silva, José Luis Ángel Rodríguez; Aguilar, Mario Sánchez
2016-01-01
We present a proposal for helping students to cope with statistical word problems related to the classification of different cases of confidence intervals. The proposal promotes an environment where students can explicitly discuss the reasons underlying their classification of cases.
Smolin, I. Yu.; Kulkov, A. S.; Makarov, P. V.; Tunda, V. A.; Krasnoveikin, V. A.; Eremin, M. O.; Bakeev, R. A.
2017-12-01
The aim of the paper is to analyze experimental data on the dynamic response of a marble specimen in uniaxial compression. To do so, we use methods of mathematical statistics. The data for analysis are the evolution of the lateral surface velocity recorded by a laser Doppler vibrometer. The recorded data were regarded as a time series reflecting the deformation evolution of the specimen loaded up to failure. The revealed changes in statistical parameters were considered precursors of failure. It is shown that before failure the deformation response is autocorrelated and reflects states of dynamic chaos and self-organized criticality.
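The autocorrelation precursor described can be illustrated by comparing the lag-1 autocorrelation of an uncorrelated (chaotic-looking) signal with that of a strongly correlated one. The synthetic series below stand in for windows of the vibrometer record; they are not the experimental data:

```python
import math
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation coefficient of a time series."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

rng = random.Random(42)
noise = [rng.gauss(0.0, 1.0) for _ in range(400)]               # uncorrelated stage
smooth = [math.sin(2 * math.pi * t / 200) for t in range(400)]  # correlated stage
```

Computing this statistic over a sliding window and watching it rise toward 1 is one simple way to flag the transition toward correlated, pre-failure deformation.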
Development of zircaloy deformation model to describe the zircaloy-4 cladding tube during accidents
International Nuclear Information System (INIS)
Raff, S.
1978-01-01
The development of a high-temperature deformation model for Zircaloy-4 cans is based primarily on numerous well-parametrized tensile tests that capture the material behaviour, including its statistical variance. It is shown that plastic deformation may be described by a power creep law whose coefficients show a strong dependence on temperature in the relevant temperature region. These coefficients have been determined. A model based on these coefficients has been established which, apart from a best-estimate deformation, gives upper and lower bounds of the possible deformation. The model, derived from isothermal uniaxial tests, is being verified against isothermal and transient tube burst tests. The influence of preoxidation and of increased oxygen concentration during deformation is modeled on the basis of the pseudobinary Zircaloy-oxygen phase diagram. (author)
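A power creep law with temperature-dependent coefficients can be sketched in a Norton/Arrhenius form; the coefficients `a0`, `n`, and `q` below are illustrative placeholders, not the fitted Zircaloy-4 values from the report:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def creep_rate(stress_mpa, temp_k, a0=1.0e5, n=5.0, q=2.5e5):
    """Norton-type power creep law with Arrhenius temperature dependence:
    strain_rate = a0 * exp(-q / (R*T)) * stress^n.
    a0 (1/s), n (dimensionless), and q (J/mol) are illustrative only."""
    return a0 * math.exp(-q / (R * temp_k)) * stress_mpa ** n
```

Best-estimate, upper-bound, and lower-bound predictions then correspond to evaluating the law with the mean coefficients and with coefficients shifted by their statistical variance.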
Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment
Directory of Open Access Journals (Sweden)
Rashmi Mukherjee
2014-01-01
Full Text Available The aim of this paper was to develop a computer-assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images grabbed by a normal digital camera were first transformed into the HSI (hue, saturation, and intensity) color space, and subsequently the “S” component of the HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely Bayesian classification and support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that SVM with a 3rd order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53%, for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, that is, 87.61%, with the highest kappa statistic value (0.793).
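The step that selects the "S" (saturation) component can be sketched with the standard RGB-to-HSI saturation formula (this is the textbook formula, not necessarily the authors' exact implementation):

```python
def saturation(r, g, b):
    """'S' component of HSI for RGB channel values: S = 1 - 3*min(r,g,b)/(r+g+b).
    Returns 0 for achromatic (gray) pixels and 1 for fully saturated ones."""
    total = r + g + b
    if total == 0:
        return 0.0  # pure black: saturation is undefined, treat as 0
    return 1.0 - 3.0 * min(r, g, b) / total
```

Applying this per pixel yields the single-channel image on which the fuzzy divergence thresholding is performed; saturated wound tissue contrasts strongly with desaturated skin.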
Niemann, Brand Lee
A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear
Prediction and classification of respiratory motion
Lee, Suk Jin
2014-01-01
This book describes recent radiotherapy technologies, including tools for measuring target position during radiotherapy and tracking-based delivery systems. It presents a customized prediction of respiratory motion with clustering from multiple patient interactions. The proposed method contributes to the improvement of patient treatment by considering the breathing pattern for accurate dose calculation in radiotherapy systems. Real-time tumor tracking, where the prediction of irregularities becomes relevant, has yet to be clinically established. Statistical quantitative modeling for irregular breathing classification, in which commercial respiration traces are retrospectively categorized into several classes based on breathing pattern, is discussed as well. The proposed statistical classification may provide clinical advantages for adjusting the dose rate before and during external beam radiotherapy to minimize the safety margin. In the first chapter following the Introduction to this book, we...
Sugawara, Kotaro; Yamashita, Hiroharu; Uemura, Yukari; Mitsui, Takashi; Yagi, Koichi; Nishida, Masato; Aikou, Susumu; Mori, Kazuhiko; Nomura, Sachiyo; Seto, Yasuyuki
2017-10-01
The current eighth tumor node metastasis lymph node category pathologic lymph node staging system for esophageal squamous cell carcinoma is based solely on the number of metastatic nodes and does not consider anatomic distribution. We aimed to assess the prognostic capability of the eighth tumor node metastasis pathologic lymph node staging system (numeric-based) compared with the 11th Japan Esophageal Society (topography-based) pathologic lymph node staging system in patients with esophageal squamous cell carcinoma. We retrospectively reviewed the clinical records of 289 patients with esophageal squamous cell carcinoma who underwent esophagectomy with extended lymph node dissection during the period from January 2006 through June 2016. We compared discrimination abilities for overall survival, recurrence-free survival, and cancer-specific survival between these 2 staging systems using C-statistics. The median number of dissected and metastatic nodes was 61 (25% to 75% quartile range, 45 to 79) and 1 (25% to 75% quartile range, 0 to 3), respectively. The eighth tumor node metastasis pathologic lymph node staging system had a greater ability to accurately determine overall survival (C-statistics: tumor node metastasis classification, 0.69, 95% confidence interval, 0.62-0.76; Japan Esophageal Society classification, 0.65, 95% confidence interval, 0.58-0.71; P = .014) and cancer-specific survival (C-statistics: tumor node metastasis classification, 0.78, 95% confidence interval, 0.70-0.87; Japan Esophageal Society classification, 0.72, 95% confidence interval, 0.64-0.80; P = .018). Rates of total recurrence rose as the eighth tumor node metastasis pathologic lymph node stage increased, while stratification of patients according to the topography-based node classification system was not feasible. Numeric nodal staging is an essential tool for stratifying the oncologic outcomes of patients with esophageal squamous cell carcinoma even in the cohort in which adequate
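The C-statistics compared above are concordance indices for survival data. A minimal sketch of how such an index can be computed (Harrell's form; the function name and toy data are illustrative, not from the study):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C-statistic: among comparable pairs (a subject with an
    observed event vs. one followed at least that long), the fraction whose
    predicted risks are ordered the same way; risk ties count 0.5."""
    concordant, tied, comparable = 0, 0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # a censored subject cannot anchor a comparable pair
        for j in range(n):
            if times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    tied += 1
    return (concordant + 0.5 * tied) / comparable

# toy data: higher nodal burden (risk score) should mean earlier death
c = concordance_index([2, 4, 6], [1, 1, 1], [3, 2, 1])  # 1.0, perfectly concordant
```

A value of 0.5 indicates no discrimination; the 0.69 vs. 0.65 comparison in the abstract is a difference between two such indices.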
Chung, Ka-Fai; Yeung, Wing-Fai; Ho, Fiona Yan-Yee; Yung, Kam-Ping; Yu, Yee-Man; Kwok, Chi-Wa
2015-04-01
To compare the prevalence of insomnia according to symptoms, quantitative criteria, and Diagnostic and Statistical Manual of Mental Disorders, 4th and 5th Edition (DSM-IV and DSM-5), International Classification of Diseases, 10th Revision (ICD-10), and International Classification of Sleep Disorders, 2nd Edition (ICSD-2), and to compare the prevalence of insomnia disorder between Hong Kong and the United States by adopting a similar methodology used by the America Insomnia Survey (AIS). Population-based epidemiological survey respondents (n = 2011) completed the Brief Insomnia Questionnaire (BIQ), a validated scale generating DSM-IV, DSM-5, ICD-10, and ICSD-2 insomnia disorder. The weighted prevalence of difficulty falling asleep, difficulty staying asleep, waking up too early, and non-restorative sleep that occurred ≥3 days per week was 14.0%, 28.3%, 32.1%, and 39.9%, respectively. When quantitative criteria were included, the prevalence dropped the most from 39.9% to 8.4% for non-restorative sleep, and the least from 14.0% to 12.9% for difficulty falling asleep. The weighted prevalence of DSM-IV, ICD-10, ICSD-2, and any of the three insomnia disorders was 22.1%, 4.7%, 15.1%, and 22.1%, respectively; for DSM-5 insomnia disorder, it was 10.8%. Compared with 22.1%, 3.9%, and 14.7% for DSM-IV, ICD-10, and ICSD-2 in the AIS, cross-cultural difference in the prevalence of insomnia disorder is less than what is expected. The prevalence is reduced by half from DSM-IV to DSM-5. ICD-10 insomnia disorder has the lowest prevalence, perhaps because excessive concern and preoccupation, one of its diagnostic criteria, is not always present in people with insomnia. Copyright © 2014 Elsevier B.V. All rights reserved.
Deformable segmentation via sparse shape representation.
Zhang, Shaoting; Zhan, Yiqiang; Dewan, Maneesh; Huang, Junzhou; Metaxas, Dimitris N; Zhou, Xiang Sean
2011-01-01
Appearance and shape are two key elements exploited in medical image segmentation. However, in some medical image analysis tasks, appearance cues are weak or misleading due to disease or artifacts and often lead to erroneous segmentation. In this paper, a novel deformable model is proposed for robust segmentation in the presence of weak/misleading appearance cues. Because the appearance information is less trustworthy, this method focuses on effective shape modeling, with two contributions. First, a shape composition method is designed to incorporate shape priors on the fly. Based on two sparsity observations, this method is robust to false appearance information and adaptive to statistically insignificant shape modes. Second, shape priors are modeled and used in a hierarchical fashion. More specifically, using the affinity propagation method, our deformable surface is divided into multiple partitions, on which local shape models are built independently. This scheme facilitates more compact shape prior modeling and hence more robust and efficient segmentation. Our deformable model is applied to two very diverse segmentation problems, liver segmentation in PET-CT images and rodent brain segmentation in MR images. Compared to state-of-the-art methods, our method achieves better performance in both studies.
A comparison of Landsat point and rectangular field training sets for land-use classification
Tom, C. H.; Miller, L. D.
1984-01-01
Rectangular training fields of homogeneous spectroreflectance are commonly used in supervised pattern recognition efforts. Trial image classification with manually selected training sets gives irregular and misleading results due to statistical bias. A self-verifying, grid-sampled training point approach is proposed as a more statistically valid feature extraction technique. A systematic pixel sampling network of every ninth row and ninth column efficiently replaced the full image scene with smaller statistical vectors which preserved the necessary characteristics for classification. The composite second- and third-order average classification accuracy of 50.1 percent for 331,776 pixels in the full image substantially agreed with the 51 percent value predicted by the grid-sampled, 4,100-point training set.
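The every-ninth-row, every-ninth-column sampling scheme reduces to a simple array stride; a minimal sketch assuming a NumPy image array (the offset and scene size are illustrative):

```python
import numpy as np

# stand-in for a classified scene; real Landsat bands would be loaded here
scene = np.arange(81 * 81).reshape(81, 81)

# keep every ninth row and ninth column: ~1/81 of the pixels,
# a systematic grid that preserves scene-wide class statistics for training
grid_samples = scene[8::9, 8::9]
print(grid_samples.size)  # 81 of 6561 pixels survive
```

The same stride logic underlies the abstract's replacement of the full image with "smaller statistical vectors".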
Analytic-graphic testing of deformities at the waterworks Pod Bukovcom
Directory of Open Access Journals (Sweden)
Ječný Miloš
2001-09-01
Full Text Available The paper presents some geodetic measurement results from the deformation survey of the bulk dam at the waterworks Pod Bukovcom near Košice. Periodic geodetic position and levelling measurements have been carried out on the dam since 1999, and test statistics are applied in the deformation survey. Geodetic data obtained from the individual measurements in the geodetic network on the bulk dam at the waterworks Pod Bukovcom are adjusted using the Gauss-Markov model. The geodetic measurements are complemented by an accuracy analysis by means of relative and confidence ellipses.
Error calculations statistics in radioactive measurements
International Nuclear Information System (INIS)
Verdera, Silvia
1994-01-01
Basic approach and procedures frequently used in the practice of radioactive measurements. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. The concept of error and its classification into systematic and random errors. Statistical fundamentals: probability theory, population distributions (Bernoulli, Poisson, Gauss), the t-test distribution, the χ² test, and error propagation based on analysis of variance. Bibliography. z table, t-test table, Poisson index, χ² test table
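For the counting-statistics part, a small illustrative sketch (names assumed): the Poisson variance of a count equals the count itself, and independent uncertainties propagate in quadrature when a background rate is subtracted.

```python
import math

def net_rate_and_uncertainty(gross_counts, t_gross, bkg_counts, t_bkg):
    """Net count rate and its 1-sigma uncertainty.
    Poisson: var(N) = N, so sigma(rate) = sqrt(N)/t; terms add in quadrature."""
    net = gross_counts / t_gross - bkg_counts / t_bkg
    sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
    return net, sigma

# 400 gross counts in 100 s over a 1 cps background: net 3.0 cps
rate, err = net_rate_and_uncertainty(400, 100.0, 100, 100.0)
```

This is the elementary case; the analysis-of-variance propagation in the lecture generalizes it to correlated terms.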
Investigation into the Moessbauer state of iron atoms in deformed iron-manganese alloys
International Nuclear Information System (INIS)
Mints, R.I.; Semenkin, V.A.; Shevchenko, Yu.A.
1977-01-01
A plastically deformed Fe + 12 at.% Mn alloy was investigated by the method of nuclear gamma-resonance on Fe-57 nuclei. The specimens were deformed by 5 to 57%. The obtained nuclear gamma-resonance spectra, which are a superposition of the paramagnetic single line (ν-phase) and the Zeeman splitting lines (α-phase), were statistically processed with the aid of a computer. The behaviour of the Moessbauer parameters with the least dispersion, such as the isomer (chemical) shift, the quadrupole interaction constant, the effective magnetic field, and the area of the nuclear gamma-resonance spectrum, points to their connection with the degree of deformation-induced decomposition of the initial solid solution
Collaborative classification of hyperspectral and visible images with convolutional neural network
Zhang, Mengmeng; Li, Wei; Du, Qian
2017-10-01
Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.
Atanasov, Nenad; Poposka, Anastasika; Samardziski, Milan; Kamnar, Viktor
2014-01-01
Radiographic examination of extremities in surgical lengthening and/or correction of deformities is of crucial importance for the assessment of new bone formation. The purpose of this study is to confirm the diagnostic value of radiography in the precise detection of bone parameters in various lengthening or correction stages in patients treated by limb lengthening and deformity correction. 50 patients were treated by the Ilizarov method of limb lengthening or deformity correction at the University Orthopaedic Surgery Clinic in Skopje, and analysed over the period from 2006 to 2012. The patients were divided into two groups. The first group consisted of 27 patients with limb lengthening because of congenital shortening. The second group consisted of 23 patients treated for acquired limb deformities. The results in both groups were recorded in three stages of new bone formation and were based on the appearance of 3 radiographic parameters at the distraction/compression site. The differences in the presence of all radiographic bone parameters between different stages of new bone formation were statistically significant in both groups, especially the presence of the cortical margin in the first group (Cochran Q=34.43, df=2, p=0.00000). The comparative analysis between the two groups showed a statistically significant difference in the presence of initial bone elements and cystic formations only in the first stage. The near-absence of statistically significant differences between the two groups of patients with regard to the 3 radiographic parameters in the 3 stages of new bone formation indicates a minor influence of the etiopathogenetic background on new bone formation in patients treated by gradual lengthening or correction of limb deformities.
Protein structure: geometry, topology and classification
Energy Technology Data Exchange (ETDEWEB)
Taylor, William R.; May, Alex C.W.; Brown, Nigel P.; Aszodi, Andras [Division of Mathematical Biology, National Institute for Medical Research, London (United Kingdom)
2001-04-01
The structural principles of proteins are reviewed and analysed from a geometric perspective with a view to revealing the underlying regularities in their construction. Computer methods for the automatic comparison and classification of these structures are then reviewed, with an analysis of the statistical significance of comparing different shapes. Following an analysis of the current state of the classification of proteins, more abstract geometric and topological representations are explored, including the occurrence of knotted topologies. The review concludes with a consideration of the origin of higher-level symmetries in protein structure. (author)
Particles with small violations of Fermi or Bose statistics
International Nuclear Information System (INIS)
Greenberg, O.W.
1991-01-01
I discuss the statistics of ''quons'' (pronounced to rhyme with muons), particles whose annihilation and creation operators obey the q-deformed commutation relation (the quon algebra or q-mutator) which interpolates between fermions and bosons. Topics discussed include representations of the quon algebra, proof of the TCP theorem, violation of the usual locality properties, and experimental constraints on violations of the Pauli exclusion principle (i.e., Fermi statistics) and of Bose statistics
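The q-mutator mentioned above has a standard explicit form (given here for context; this is the textbook relation, not quoted from the paper):

```latex
a_k\, a_l^{\dagger} \;-\; q\, a_l^{\dagger} a_k \;=\; \delta_{kl},
\qquad -1 \le q \le 1 .
```

At q = 1 this reduces to the bosonic commutator, and at q = −1 to the fermionic anticommutator; intermediate values of q parameterize small violations of Bose or Fermi statistics.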
Outcomes of a Stepcut Lengthening Calcaneal Osteotomy for Adult-Acquired Flatfoot Deformity.
Demetracopoulos, Constantine A; Nair, Pallavi; Malzberg, Andrew; Deland, Jonathan T
2015-07-01
Lateral column lengthening is used to correct abduction deformity at the midfoot and improve talar head coverage in patients with flatfoot deformity. It was our hypothesis that following a stepcut lengthening calcaneal osteotomy (SLCO), patients would have adequate correction of the deformity, a high union rate of the osteotomy, and improvement in clinical outcome scores. We retrospectively reviewed 37 consecutive patients who underwent SLCO for the treatment of stage IIB flatfoot deformity with a minimum 2-year follow-up. Deformity correction was assessed using preoperative and postoperative weight-bearing radiographs. Healing of the osteotomy was assessed by computed tomography. Clinical outcomes included the FAOS and SF-36 questionnaires. The Wilcoxon signed-rank test was used to compare clinical outcome scores. An alpha level of .05 was deemed statistically significant. Healing of the osteotomy occurred at a mean of 7.7 weeks postoperatively. The talonavicular (TN) coverage angle improved from 34.0 to 8.8 (P lengthening. Level IV, retrospective case review. © The Author(s) 2015.
Automatic Genre Classification of Musical Signals
Barbedo, Jayme Garcia Arnal; Lopes, Amauri
2006-12-01
We present a strategy to perform automatic genre classification of musical signals. The technique divides the signals into frames of 21.3 milliseconds, from which 4 features are extracted. The values of each feature are treated over 1-second analysis segments. Some statistical results of the features along each analysis segment are used to determine a vector of summary features that characterizes the respective segment. Next, a classification procedure uses those vectors to differentiate between genres. The classification procedure has two main characteristics: (1) a very wide and deep taxonomy, which allows a very meticulous comparison between different genres, and (2) a wide pairwise comparison of genres, which emphasizes the differences between each pair of genres. The procedure points out the genre that best fits the characteristics of each segment. The final classification of the signal is given by the genre that appears most often across all signal segments. The approach has shown very good accuracy even for the lowest layers of the hierarchical structure.
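The final voting step lends itself to a short sketch (names assumed, not the authors' code): each 1-second segment gets a genre label, and the signal takes the most frequent one.

```python
from collections import Counter

def classify_signal(segment_labels):
    """Final genre = the label predicted most often across the segments."""
    return Counter(segment_labels).most_common(1)[0][0]

print(classify_signal(["rock", "rock", "jazz", "rock", "blues"]))  # rock
```

Majority voting of this kind makes the per-segment classifier's occasional errors harmless as long as they do not dominate the segment count.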
Automotive System for Remote Surface Classification.
Bystrov, Aleksandr; Hoare, Edward; Tran, Thuy-Yung; Clarke, Nigel; Gashinova, Marina; Cherniakov, Mikhail
2017-04-01
In this paper we discuss a novel approach to road surface recognition, based on the analysis of backscattered microwave and ultrasonic signals. The novelty of our method lies in the fusion of sonar and polarimetric radar data, the extraction of features for separate swathes of the illuminated surface (segmentation), and the use of a multi-stage artificial neural network for surface classification. The developed system consists of a 24 GHz radar and a 40 kHz ultrasonic sensor. The features are extracted from backscattered signals, and then the procedures of principal component analysis and supervised classification are applied to the feature data. Special attention is paid to the multi-stage artificial neural network, which allows an overall increase in classification accuracy. The proposed technique was tested on a large number of real surfaces in different weather conditions, with an average correct-classification accuracy of 95%. The obtained results thereby demonstrate that the use of the proposed system architecture and statistical methods allows for reliable discrimination of various road surfaces in real conditions.
Tahir, Muhammad; Jan, Bismillah; Hayat, Maqsood; Shah, Shakir Ullah; Amin, Muhammad
2018-04-01
Discriminative and informative feature extraction is the core requirement for accurate and efficient classification of protein subcellular localization images so that drug development could be more effective. The objective of this paper is to propose a novel modification in the Threshold Adjacency Statistics technique and enhance its discriminative power. In this work, we utilized Threshold Adjacency Statistics from a novel perspective to enhance its discrimination power and efficiency. In this connection, we utilized seven threshold ranges to produce seven distinct feature spaces, which are then used to train seven SVMs. The final prediction is obtained through the majority voting scheme. The proposed ETAS-SubLoc system is tested on two benchmark datasets using 5-fold cross-validation technique. We observed that our proposed novel utilization of TAS technique has improved the discriminative power of the classifier. The ETAS-SubLoc system has achieved 99.2% accuracy, 99.3% sensitivity and 99.1% specificity for Endogenous dataset outperforming the classical Threshold Adjacency Statistics technique. Similarly, 91.8% accuracy, 96.3% sensitivity and 91.6% specificity values are achieved for Transfected dataset. Simulation results validated the effectiveness of ETAS-SubLoc that provides superior prediction performance compared to the existing technique. The proposed methodology aims at providing support to pharmaceutical industry as well as research community towards better drug designing and innovation in the fields of bioinformatics and computational biology. The implementation code for replicating the experiments presented in this paper is available at: https://drive.google.com/file/d/0B7IyGPObWbSqRTRMcXI2bG5CZWs/view?usp=sharing. Copyright © 2018 Elsevier B.V. All rights reserved.
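A sketch of the underlying Threshold Adjacency Statistics feature in its standard 9-bin form (the function name and the toy image are illustrative; in the paper's scheme each of the seven threshold ranges would yield one such vector, feeding one of the seven SVMs whose votes are then combined):

```python
import numpy as np

def threshold_adjacency_stats(img, lo, hi):
    """9-bin TAS: for each pixel whose value lies in [lo, hi], count how many
    of its 8 neighbours also lie in range; return the normalised histogram."""
    mask = ((img >= lo) & (img <= hi)).astype(int)
    padded = np.pad(mask, 1)
    # neighbour count for every pixel (roll wrap-around lands in the
    # padded border, which the final slice strips off)
    neigh = sum(np.roll(np.roll(padded, dr, 0), dc, 1)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0))[1:-1, 1:-1]
    counts = np.bincount(neigh[mask == 1], minlength=9)[:9].astype(float)
    return counts / counts.sum() if counts.sum() else counts

features = threshold_adjacency_stats(np.array([[1, 1], [1, 0]]), 1, 1)
```

Concatenating the vectors from several threshold ranges, as ETAS-SubLoc does, widens the feature space without changing this per-range computation.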
Multivariate Approaches to Classification in Extragalactic Astronomy
Directory of Open Access Journals (Sweden)
Didier Fraix-Burnet
2015-08-01
Full Text Available Clustering objects into synthetic groups is a natural activity of any science. Astrophysics is not an exception and is now facing a deluge of data. For galaxies, the one-century old Hubble classification and the Hubble tuning fork are still largely in use, together with numerous mono- or bivariate classifications most often made by eye. However, a classification must be driven by the data, and sophisticated multivariate statistical tools are used more and more often. In this paper we review these different approaches in order to situate them in the general context of unsupervised and supervised learning. We insist on the astrophysical outcomes of these studies to show that multivariate analyses provide an obvious path toward a renewal of our classification of galaxies and are invaluable tools to investigate the physics and evolution of galaxies.
Multiscale modeling of large deformations in 3-D polycrystals
International Nuclear Information System (INIS)
Lu Jing; Maniatty, Antoinette; Misiolek, Wojciech; Bandar, Alexander
2004-01-01
An approach for modeling 3-D polycrystals, linking to the macroscale, is presented. A Potts-type model is used to generate statistically representative grain structures with periodicity to allow scale-linking. The grain structures are compared to experimentally observed grain structures to validate that they are representative. A macroscale model of a compression test is compared against an experimental compression test for an Al-Mg-Si alloy to determine various deformation paths at different locations in the samples. These deformation paths are then applied to the experimental grain structure using a scale-bridging technique. Preliminary results from this work will be presented and discussed
Lee, Do-Youl; Kim, Se-Hoon; Suh, Jung-Keun; Cho, Tai-Hyoung; Chung, Yong-Gu
2012-09-01
This study was designed to investigate the correlation between the insertion depth of the artificial disc and postoperative kyphotic deformity after Prodisc-C total disc replacement surgery, and the range of artificial disc insertion depth that is effective in preventing postoperative whole-cervical or segmental kyphotic deformity. A retrospective radiological analysis was performed in 50 patients who had undergone single-level total disc replacement surgery. Records were reviewed to obtain demographic data. Preoperative and postoperative radiographs were assessed to determine the C2-7 Cobb's angle and segmental angle and to investigate postoperative kyphotic deformity. A formula was introduced to calculate the insertion depth of the Prodisc-C artificial disc. Statistical analysis was performed to assess the correlation between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity, and to estimate the insertion depth of the Prodisc-C artificial disc that would prevent postoperative kyphotic deformity. No significant statistical correlation was observed between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity with regard to the C2-7 Cobb's angle. A statistical correlation between insertion depth of the Prodisc-C artificial disc and postoperative kyphotic deformity was observed with regard to the segmental angle (p<0.05). However, a proper insertion depth of the Prodisc-C artificial disc effective in preventing postoperative kyphotic deformity could not be estimated. Postoperative segmental kyphotic deformity is associated with the insertion depth of the Prodisc-C artificial disc. An anteriorly located artificial disc leads to a lordotic segmental angle and a posteriorly located artificial disc leads to a kyphotic segmental angle postoperatively. But the C2-7 Cobb's angle is not affected by the artificial disc location after the surgery.
Street-side vehicle detection, classification and change detection using mobile laser scanning data
Xiao, Wen; Vallet, Bruno; Schindler, Konrad; Paparoditis, Nicolas
2016-04-01
Statistics on street-side car parks, e.g. occupancy rates, parked vehicle types, and parking durations, are of great importance for urban planning and policy making. Related studies, e.g. vehicle detection and classification, mostly focus on static images or video, whereas mobile laser scanning (MLS) systems are increasingly utilized for urban street environment perception due to their direct 3D information acquisition, high accuracy and movability. In this paper, we design a complete system for car park monitoring, including vehicle recognition, localization, classification and change detection, from laser scanning point clouds. The experimental data are acquired by an MLS system using a high-frequency laser scanner which scans the streets vertically along the system's moving trajectory. The point clouds are firstly classified as ground, building façade, and street objects, which are then segmented using state-of-the-art methods. Each segment is treated as an object hypothesis, and its geometric features are extracted. Moreover, a deformable vehicle model is fitted to each object. By fitting an explicit model to the vehicle points, detailed information, such as precise position and orientation, can be obtained. The model parameters are also treated as vehicle features. Together with the geometric features, they are applied to a supervised learning procedure for vehicle or non-vehicle recognition. The classes of detected vehicles are also investigated. Whether vehicles have changed across two datasets acquired at different times is detected to estimate parking durations. Here, vehicles are trained pairwise: two vehicles, either the same or different, are paired up as training samples. As a result, the vehicle recognition, classification and change detection accuracies are 95.9%, 86.0% and 98.7%, respectively. Vehicle modelling improves not only the recognition rate, but also the localization precision compared to bounding boxes.
Applied Statistics Using SPSS, STATISTICA, MATLAB and R
De Sá, Joaquim P Marques
2007-01-01
This practical reference provides a comprehensive introduction and tutorial on the main statistical analysis topics, demonstrating their solution with the most common software packages. Intended for anyone needing to apply statistical analysis to a large variety of science and engineering problems, the book explains and shows how to use SPSS, MATLAB, STATISTICA and R for analyses such as data description, statistical inference, classification and regression, factor analysis, survival data and directional statistics. It concisely explains key concepts and methods, illustrated by practical examples.
Statistical inference for remote sensing-based estimates of net deforestation
Ronald E. McRoberts; Brian F. Walters
2012-01-01
Statistical inference requires expression of an estimate in probabilistic terms, usually in the form of a confidence interval. An approach to constructing confidence intervals for remote sensing-based estimates of net deforestation is illustrated. The approach is based on post-classification methods using two independent forest/non-forest classifications because...
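A hedged sketch of the confidence-interval construction (variable names assumed; the paper's post-classification estimators are more involved): with independent estimates of forest loss and forest gain, the net change and its standard error combine in quadrature.

```python
import math

def net_change_ci(loss_est, loss_se, gain_est, gain_se, z=1.96):
    """95% CI for net deforestation = loss - gain, assuming the two
    remote sensing-based estimates are independent."""
    net = loss_est - gain_est
    se = math.sqrt(loss_se**2 + gain_se**2)
    return net - z * se, net + z * se

low, high = net_change_ci(100.0, 5.0, 40.0, 5.0)  # CI around a net loss of 60
```

Expressing the map-derived estimate with such an interval is exactly the "probabilistic terms" the abstract calls for.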
Dynamic control of knee axial deformities
Directory of Open Access Journals (Sweden)
E. E. Malyshev
2013-01-01
Full Text Available The authors evaluated the clinical examination of patients with axial malalignment of the knee using an original method and device named the varovalgometer. Measurements were conducted by tensioning a cord from the spina iliaca anterior superior through the middle of the lower pole of the patella. The deviation of the center of the ankle was estimated with a metal ruler positioned perpendicular to the lower-leg axis at the level of the ankle joint line. A comparison of our method with computer navigation in 53 patients during TKA showed no statistically significant differences, but the clinical measurements differed by an average of 5° of valgus from the mechanical axis identified by computer navigation. Dynamic control of axial malalignment can be used in clinical practice to evaluate the results of treatment of pathology with axial deformities of the knee; to control reduction and secondary displacement of fractures around the knee; to assess instability; in planning corrective osteotomies and for intraoperative control of deformity correction; to estimate the Q angle in subluxation and recurrent dislocation of the patella; and in planning of TKA; during the growth of a child it allows assessment of the progression of deformity.
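Geometrically, the cord-and-ruler measurement reduces to the perpendicular distance of the ankle centre from the hip-patella line; a hypothetical sketch (coordinates and sign convention are assumptions, not from the paper):

```python
import math

def axis_deviation(hip, patella, ankle):
    """Signed perpendicular distance of the ankle centre from the
    hip-patella line (the tensioned cord); units follow the inputs."""
    (x1, y1), (x2, y2), (x0, y0) = hip, patella, ankle
    num = (x2 - x1) * (y1 - y0) - (x1 - x0) * (y2 - y1)
    return num / math.hypot(x2 - x1, y2 - y1)

# ankle centre 3 units off a vertical hip-patella axis
d = axis_deviation((0.0, 0.0), (0.0, 10.0), (3.0, 20.0))  # 3.0
```

The sign distinguishes medial from lateral deviation, i.e. varus from valgus, once a convention is fixed.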
Population Based Analysis of Directional Information in Serial Deformation Tensor Morphometry
Studholme, Colin; Cardenas, Valerie
2012-01-01
Deformation morphometry provides a sensitive approach to detecting and mapping subtle volume changes in the brain. Population based analyses of this data have been used successfully to detect characteristic changes in different neurodegenerative conditions. However, most studies have been limited to statistical mapping of the scalar volume change at each point in the brain, by evaluating the determinant of the Jacobian of the deformation field. In this paper we describe an approach to spatial normalisation and analysis of the full deformation tensor. The approach employs a spatial relocation and reorientation of tensors of each subject. Using the assumption of small changes, we use a linear modeling of effects of clinical variables on each deformation tensor component across a population. We illustrate the use of this approach by examining the pattern of significance and orientation of the volume change effects in recovery from alcohol abuse. Results show new local structure which was not apparent in the analysis of scalar volume changes. PMID:18044583
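The scalar volume-change analysis the paper contrasts with can be sketched as the determinant of the deformation Jacobian (NumPy, 2-D for brevity; the function name and field are illustrative):

```python
import numpy as np

def jacobian_determinant_2d(phi):
    """phi: array of shape (2, H, W); phi[0] holds the mapped x coordinate
    and phi[1] the mapped y coordinate at each pixel. det J > 1 marks local
    expansion, det J < 1 local contraction."""
    d0_dy, d0_dx = np.gradient(phi[0])  # np.gradient returns (d/axis0, d/axis1)
    d1_dy, d1_dx = np.gradient(phi[1])
    return d0_dx * d1_dy - d0_dy * d1_dx

# a uniform 2x scaling doubles lengths, i.e. quadruples areas
yy, xx = np.mgrid[0:5, 0:5]
det = jacobian_determinant_2d(np.stack([2.0 * xx, 2.0 * yy]))
```

The paper's point is that this scalar discards the directional information in the full tensor, which its method retains.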
Risk classification and cream skimming on the deregulated German insurance market
Beschorner, Patrick F. E.
2003-01-01
In a two-stage model, insurance companies first decide upon risk classification and then compete in prices. I show that the observed heterogeneous behavior of similar firms is compatible with rational behavior. On the deregulated German insurance market, individual application of classification schemes induces welfare losses due to cream skimming. Classification costs and pricing above marginal cost can be prevented by common industry-wide loss statistics, which already exist to a rudimentary extent.
New proposals for the international classification of diseases-11 revision of pain diagnoses
DEFF Research Database (Denmark)
Rief, Winfried; Kaasa, Stein; Jensen, Rigmor
2012-01-01
The representation of pain diagnoses in current classification systems like International Classification of Diseases (ICD)-10 and Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV does not adequately reflect the state of the art of pain research, and does not sufficiently support...... the clinical management and research programs for pain conditions. Moreover, there is an urgent need to harmonize classification of pain syndromes of special expert groups (eg, International Classification of Headache Disorders) and general classification systems (eg, ICD-11, DSM-V). Therefore, this paper...
Comparison of Danish dichotomous and BI-RADS classifications of mammographic density
DEFF Research Database (Denmark)
Hodge, Rebecca; Hellmann, Sophie Sell; von Euler-Chelpin, My
2014-01-01
BACKGROUND: In the Copenhagen mammography screening program from 1991 to 2001, mammographic density was classified either as fatty or mixed/dense. This dichotomous mammographic density classification system is unique internationally, and has not been validated before. PURPOSE: To compare the Danish...... dichotomous mammographic density classification system from 1991 to 2001 with the density BI-RADS classifications, in an attempt to validate the Danish classification system. MATERIAL AND METHODS: The study sample consisted of 120 mammograms taken in Copenhagen in 1991-2001, which tested false positive......, and which were in 2012 re-assessed and classified according to the BI-RADS classification system. We calculated inter-rater agreement between the Danish dichotomous mammographic classification as fatty or mixed/dense and the four-level BI-RADS classification by the linear weighted Kappa statistic. RESULTS...
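The linear weighted Kappa used to compare the dichotomous and BI-RADS readings can be sketched as follows (confusion-matrix input assumed; not the authors' code):

```python
import numpy as np

def linear_weighted_kappa(conf):
    """conf: k x k matrix of counts, rows = rater A category, cols = rater B.
    Agreement weights fall off linearly with the distance between categories."""
    conf = np.asarray(conf, dtype=float)
    n, k = conf.sum(), conf.shape[0]
    idx = np.arange(k)
    w = 1.0 - np.abs(idx[:, None] - idx[None, :]) / (k - 1)
    p_obs = (w * conf / n).sum()                                # observed
    p_exp = (w * np.outer(conf.sum(1) / n, conf.sum(0) / n)).sum()  # chance
    return (p_obs - p_exp) / (1.0 - p_exp)

print(linear_weighted_kappa([[10, 0], [0, 10]]))  # 1.0 for perfect agreement
```

With linear weights, near-miss disagreements between adjacent BI-RADS density categories are penalized less than distant ones.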
Particle deformation during stirred media milling
Hamey, Rhye Garrett
Production of high aspect ratio metal flakes is an important part of the paint and coating industry. The United States Army also uses high aspect ratio metal flakes of a specific dimension in obscurant clouds to attenuate infrared radiation. The most common method for their production is by milling a metal powder. Ductile metal particles are initially flattened in the process, increasing the aspect ratio. As the process continues, cold-welding of metal flakes can take place, increasing the particle size and decreasing the aspect ratio. Extended milling times may also result in fracture, leading to a further decrease in the particle size and aspect ratio. Both the cold-welding of the particles and the breakage of the particles are ultimately detrimental to the material's performance. This study utilized characterization techniques such as light scattering and image analysis to determine the change in particle size as a function of milling time and parameters. This study proved that a fundamental relationship between the milling parameters and particle deformation could be established by using Hertz's theory to calculate the stress acting on the aluminum particles. The study also demonstrated a method by which milling efficiency could be calculated, based on the amount of energy required to cause particle deformation. The study found that the particle deformation process could be an energy-efficient process at short milling times, with milling efficiency as high as 80%. Finally, statistical design of experiments was used to obtain a model that related particle deformation to milling parameters such as rotation rate and milling media size.
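As a rough illustration of the kind of Hertzian contact-stress estimate the abstract refers to, the sketch below computes the peak contact pressure for a spherical milling bead pressed against a flat particle. The material parameters and collision force are assumed for illustration, not taken from the thesis.

```python
import math

def hertz_max_pressure(force, radius, E1, nu1, E2, nu2):
    """Maximum Hertzian contact pressure (Pa) for a sphere pressed
    against a flat surface; force in N, radius in m, moduli in Pa."""
    # Effective contact modulus E*
    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)
    # Contact radius for sphere-on-flat geometry
    a = (3.0 * force * radius / (4.0 * E_star)) ** (1.0 / 3.0)
    # Peak pressure at the centre of the contact circle
    return 3.0 * force / (2.0 * math.pi * a**2)

# Illustrative values (not from the thesis): a 1 mm steel media bead
# pressed against an aluminium particle with a 0.1 N collision force.
p0 = hertz_max_pressure(force=0.1, radius=0.5e-3,
                        E1=210e9, nu1=0.30,   # steel media
                        E2=70e9, nu2=0.33)    # aluminium particle
```

A useful property of this geometry is that the peak pressure scales as force to the one-third power, so an eight-fold increase in collision force only doubles the contact stress.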
Porter, David; Michael, Shona; Kirkwood, Craig
2007-12-01
To investigate: (a) associations between the direction of scoliosis, direction of pelvic obliquity, direction of windswept deformity and side of hip subluxation/dislocation in non-ambulant people with cerebral palsy; and (b) the lateral distribution of these postural asymmetries. Cross-sectional observational study. Posture management services in three centres in the UK. Non-ambulant people at level five on the Gross Motor Function Classification System for cerebral palsy. Direction of pelvic obliquity and lateral spinal curvature determined from physical examination, direction of windswept hip deformity derived from range of hip abduction/adduction, and presence/side of unilateral hip subluxation defined by hip migration percentage. A total of 747 participants were included in the study, aged 6-80 years (median 18 years 10 months). Associations between the direction of scoliosis and direction of pelvic obliquity, and between the direction of windswept hip deformity and the side of hip subluxation/dislocation, were confirmed. A significant association was also seen between the direction of scoliosis and the direction of the windswept hip deformity (P<0.001), such that the convexity of the lateral spinal curve was more likely to be opposite to the direction of windsweeping. Furthermore, significantly more windswept deformities to the right (P=0.007), hips subluxed on the left (P=0.002) and lateral lumbar/lower thoracic spinal curves convex to the left (P=0.03) were observed. The individual asymmetrical postural deformities are related in terms of direction and are not equally distributed to the left/right. A pattern of postural deformity was observed.
Genton, Marc G.
2015-04-14
This paper explores the use of visualization through animations, coined visuanimation, in the field of statistics. In particular, it illustrates the embedding of animations in the paper itself and the storage of larger movies in the online supplemental material. We present results from statistics research projects using a variety of visuanimations, ranging from exploratory data analysis of image data sets to spatio-temporal extreme event modelling; these include a multiscale analysis of classification methods, the study of the effects of a simulated explosive volcanic eruption and an emulation of climate model output. This paper serves as an illustration of visuanimation for future publications in Stat. Copyright © 2015 John Wiley & Sons, Ltd.
Maheras, Panagiotis; Tolika, Konstantia; Tegoulias, Ioannis; Anagnostopoulou, Christina; Szpirosz, Klicász; Károssy, Csaba; Makra, László
2018-04-01
The aim of the study is to compare the performance of the two classification methods, based on the atmospheric circulation types over the Pannonian basin in Central Europe. Moreover, relationships including seasonal occurrences and correlation coefficients, as well as comparative diagrams of the seasonal occurrences of the circulation types of the two classification systems, are presented. When comparing the automated (objective) and empirical (subjective) classification methods, it was found that the frequency of the empirical anticyclonic (cyclonic) types is much higher (lower) than that of the automated anticyclonic (cyclonic) types on both an annual and a seasonal basis. The highest and statistically significant correlations between the circulation types of the two classification systems, as well as those between the cumulated seasonal anticyclonic and cyclonic types, occur in winter for both classifications, since the weather-influencing effect of the atmospheric circulation is most prevalent in this season. Precipitation amounts in Budapest display a decreasing trend, in accordance with the decrease in the occurrence of the automated cyclonic types. In contrast, the occurrence of the empirical cyclonic types displays an increasing trend. Certain types in a given classification are usually accompanied by high ratios of particular types in the other classification.
Tsallis p, q-deformed Touchard polynomials and Stirling numbers
Herscovici, O.; Mansour, T.
2017-01-01
In this paper, we develop and investigate a new two-parametrized deformation of the Touchard polynomials, based on the definition of the NEXT q-exponential function of Tsallis. We obtain new generalizations of the Stirling numbers of the second kind and of the binomial coefficients and represent two new statistics for the set partitions.
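For orientation, the classical objects being deformed here are the Touchard polynomials and the Stirling numbers of the second kind, which satisfy the standard identities (undeformed versions, not the paper's p,q-deformed generalizations):

```latex
T_n(x) = \sum_{k=0}^{n} S(n,k)\, x^k,
\qquad
S(n,k) = k\, S(n-1,k) + S(n-1,k-1),
\qquad
e^{x(e^t - 1)} = \sum_{n \ge 0} T_n(x)\, \frac{t^n}{n!}.
```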
15th Conference of the International Federation of Classification Societies
Montanari, Angela; Vichi, Maurizio
2017-01-01
This edited volume on the latest advances in data science covers a wide range of topics in the context of data analysis and classification. In particular, it includes contributions on classification methods for high-dimensional data, clustering methods, multivariate statistical methods, and various applications. The book gathers a selection of peer-reviewed contributions presented at the Fifteenth Conference of the International Federation of Classification Societies (IFCS2015), which was hosted by the Alma Mater Studiorum, University of Bologna, from July 5 to 8, 2015.
MAXIMUM LIKELIHOOD CLASSIFICATION OF HIGH-RESOLUTION SAR IMAGES IN URBAN AREA
Directory of Open Access Journals (Sweden)
M. Soheili Majd
2012-09-01
Full Text Available In this work, we propose a state-of-the-art statistical analysis of polarimetric synthetic aperture radar (SAR) data, through the modeling of several indices. We concentrate on eight ground classes which have been derived from amplitudes, the co-polarisation ratio, depolarization ratios, and other polarimetric descriptors. To study their different statistical behaviours, we consider Gauss, log-normal, Beta I, Weibull, Gamma, and Fisher statistical models and estimate their parameters using three methods: the method of moments (MoM), the maximum-likelihood (ML) methodology, and the method of log-cumulants (MoML). Then, we study the opportunity of introducing this information in an adapted supervised classification scheme based on the maximum-likelihood and Fisher pdf. Our work relies on an image of a suburban area, acquired by the airborne RAMSES SAR sensor of ONERA. The results prove the potential of such data to discriminate urban surfaces; however, classification maps present a persistent class confusion between flat gravelled or concrete roofs and trees.
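Of the three estimation methods listed, the method of moments is the simplest to sketch. Below it is applied to a Gamma model on synthetic data standing in for SAR amplitudes; the true parameters are invented for the example.

```python
import random

def gamma_mom(samples):
    """Method-of-moments estimates (shape, scale) for a Gamma model,
    one of the candidate amplitude distributions in the paper.

    Matches the first two sample moments: mean = shape * scale,
    variance = shape * scale**2.
    """
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    shape = mean ** 2 / var
    scale = var / mean
    return shape, scale

random.seed(0)
# Synthetic stand-in for SAR amplitude data (true shape=4, scale=2)
data = [random.gammavariate(4.0, 2.0) for _ in range(20000)]
shape_hat, scale_hat = gamma_mom(data)
```

With 20,000 samples the estimates land close to the true (4, 2); ML or log-cumulant estimators would typically be more efficient for heavy-tailed SAR statistics.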
Secondary structure classification of amino-acid sequences using state-space modeling
Brunnert, Marcus; Krahnke, Tillmann; Urfer, Wolfgang
2001-01-01
The secondary structure classification of amino acid sequences can be carried out by a statistical analysis of sequence and structure data using state-space models. Aiming at this classification, a modified filter algorithm programmed in S is applied to data of three proteins. The application leads to correct classifications of two proteins even when using relatively simple estimation methods for the parameters of the state-space models. Furthermore, it has been shown that the assumed initial...
q-deformed Weinberg-Salam model and q-deformed Maxwell equations
International Nuclear Information System (INIS)
Alavi, S.A.; Sarbishaei, M.; Mokhtari, A.
2000-01-01
We study the q-deformation of the gauge part of the Weinberg-Salam model and show that the q-deformed theory involves new interactions. We then obtain q-deformed Maxwell equations from which magnetic monopoles appear naturally. (author)
Classification of rigid and deformable objects using a novel tactile sensor
DEFF Research Database (Denmark)
Drimus, Alin; Kootstra, Gert; Bilberg, Arne
2011-01-01
In this paper, we present a novel array tactile sensor for use in robotic grippers based on a flexible piezoresistive rubber. We start by describing the physical principles of piezoresistive materials and continue by outlining how to build a flexible array tactile sensor using stitch electrodes. A real-time acquisition system scans the data from the array, which is then further processed. We validate the properties of the sensor in an application that classifies a number of household objects while performing a palpation procedure with a robotic gripper. Based on the haptic feedback, we classify the objects and compare the results with the ones obtained from an experimental setup that uses a Weiss Robotics tactile sensor with similar characteristics, and we conclude by exemplifying how the results of the classification can be used in different industrial applications.
Speech emotion recognition based on statistical pitch model
Institute of Scientific and Technical Information of China (English)
WANG Zhiping; ZHAO Li; ZOU Cairong
2006-01-01
A modified Parzen-window method, which keeps high resolution at low frequencies and smoothness at high frequencies, is proposed to obtain the statistical model. A gender classification method utilizing this statistical model is then proposed, which achieves 98% accuracy when long sentences are processed. After separating male and female voices, the means and standard deviations of speech training samples with different emotions are used to create the corresponding emotion models. The Bhattacharyya distances between the test sample and the statistical pitch models are then utilized for emotion recognition in speech. Normalization of pitch for male and female voices is also considered, in order to map them into a uniform space. Finally, a speech emotion recognition experiment based on K-nearest neighbors shows that a correct rate of 81% is achieved, compared with only 73.85% if the traditional parameters are utilized.
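The Bhattacharyya distance between two univariate Gaussian pitch models has a closed form. The sketch below uses hypothetical pitch means and standard deviations, not values from the paper.

```python
import math

def bhattacharyya_gauss(mu1, sd1, mu2, sd2):
    """Bhattacharyya distance between two univariate Gaussians,
    e.g. pitch statistics of a test utterance vs. an emotion model."""
    v1, v2 = sd1 ** 2, sd2 ** 2
    # Variance-mismatch term plus mean-separation term
    return (0.25 * math.log(0.25 * (v1 / v2 + v2 / v1 + 2.0))
            + 0.25 * (mu1 - mu2) ** 2 / (v1 + v2))

# Hypothetical pitch statistics (Hz): a test utterance compared with
# a "neutral" and an "angry" emotion model
d_neutral = bhattacharyya_gauss(210.0, 30.0, 215.0, 32.0)
d_angry = bhattacharyya_gauss(210.0, 30.0, 280.0, 55.0)
```

The distance is zero only for identical distributions, so a nearest-model rule assigns the utterance to the emotion whose model gives the smallest distance.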
Argyres, Philip; Lotito, Matteo; Lü, Yongchao; Martone, Mario
2018-02-01
We initiate a systematic study of four dimensional N = 2 superconformal field theories (SCFTs) based on the analysis of their Coulomb branch geometries. Because these SCFTs are not uniquely characterized by their scale-invariant Coulomb branch geometries we also need information on their deformations. We construct all inequivalent such deformations preserving N = 2 supersymmetry and additional physical consistency conditions in the rank 1 case. These not only include all the ones previously predicted by S-duality, but also 16 additional deformations satisfying all the known N = 2 low energy consistency conditions. All but two of these additional deformations have recently been identified with new rank 1 SCFTs; these identifications are briefly reviewed. Some novel ingredients which are important for this study include: a discussion of RG-flows in the presence of a moduli space of vacua; a classification of local N = 2 supersymmetry-preserving deformations of unitary N = 2 SCFTs; and an analysis of charge normalizations and the Dirac quantization condition on Coulomb branches. This paper is the first in a series of three. The second paper [1] gives the details of the explicit construction of the Coulomb branch geometries discussed here, while the third [2] discusses the computation of central charges of the associated SCFTs.
Collecting operational event data for statistical analysis
International Nuclear Information System (INIS)
Atwood, C.L.
1994-09-01
This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, events to be counted, and demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis.
International Nuclear Information System (INIS)
Hardcastle, Nicholas; Bender, Edward T.; Tomé, Wolfgang A.
2014-01-01
It has previously been shown that deformable image registrations (DIRs) often result in deformation maps that are neither inverse-consistent nor transitive, and that the dose accumulation based on these deformation maps can be inconsistent if different image pathways are used for dose accumulation. A previously presented method for reducing inverse consistency and transitivity errors has been shown to result in more consistent dose accumulation, regardless of the image pathway selected for dose accumulation. The present study investigates the effect of deformation maps processed to reduce inverse consistency and transitivity errors on dose accumulation accuracy. A set of lung 4DCT phases was analysed, consisting of four images on which a dose grid was created. Dose to 75 corresponding anatomical locations was manually tracked. Dose accumulation was performed between all image sets with Demons-derived deformation maps as well as deformation maps processed to reduce inverse consistency and transitivity errors. The ground truth accumulated dose was then compared with the accumulated dose derived from DIR. Two dose accumulation image pathways were considered. The post-processing method to reduce inverse consistency and transitivity errors had minimal effect on the dose accumulation accuracy. There was a statistically significant improvement in dose accumulation accuracy for one pathway, but for the other pathway there was no statistically significant difference. A post-processing technique to reduce inverse consistency and transitivity errors has a positive, yet minimal, effect on the dose accumulation accuracy. Thus the post-processing technique improves consistency of dose accumulation with minimal effect on dose accumulation accuracy.
Energy Technology Data Exchange (ETDEWEB)
Trishkina, L., E-mail: trishkina.53@mail.ru; Zboykova, N.; Koneva, N., E-mail: koneva@tsuab.ru; Kozlov, E. [Tomsk State University of Architecture and Building, 2 Solyanaya St., Tomsk, 634003 (Russian Federation); Cherkasova, T. [Tomsk State University of Architecture and Building, 2 Solyanaya St., Tomsk, 634003 (Russian Federation); National Research Tomsk Polytechnic University, 50 Lenin Ave., Tomsk, 634050 (Russian Federation)
2016-01-15
The aim of the investigation was to determine a statistical description of the dislocation distribution in each dislocation substructure component forming after different deformation degrees in Cu-Al alloys. The dislocation structures were investigated by transmission diffraction electron microscopy. In this work, a statistical description of the distance distribution between dislocations, dislocation barriers and dislocation tangles in the deformed Cu-Al alloys with different concentrations of Al and test temperatures at a grain size of 100 µm was carried out. It was established that the above parameters influence the dislocation distribution in different types of dislocation substructures (DSS): dislocation chaos, dislocation networks without disorientation, nondisoriented and disoriented cells, and in the walls and inside the cells. The distributions of the distances between dislocations in the investigated alloys for each DSS type formed at certain deformation degrees and various test temperatures were plotted.
National Forum on Education Statistics, 2011
2011-01-01
In this handbook, "Prior-to-Secondary School Course Classification System: School Codes for the Exchange of Data" (SCED), the National Center for Education Statistics (NCES) and the National Forum on Education Statistics have extended the existing secondary course classification system with codes and descriptions for courses offered at…
Steinberg, P. D.; Brener, G.; Duffy, D.; Nearing, G. S.; Pelissier, C.
2017-12-01
Hyperparameterization of statistical models, i.e. automated model scoring and selection via evolutionary algorithms, grid searches, and randomized searches, can improve forecast model skill by reducing errors associated with model parameterization, model structure, and statistical properties of training data. Ensemble Learning Models (Elm), and the related Earthio package, provide a flexible interface for automating the selection of parameters and model structure for machine learning models common in climate science and land cover classification, offering convenient tools for loading NetCDF, HDF, Grib, or GeoTiff files, decomposition methods like PCA and manifold learning, and parallel training and prediction with unsupervised and supervised classification, clustering, and regression estimators. Continuum Analytics is using Elm to experiment with statistical soil moisture forecasting based on meteorological forcing data from NASA's North American Land Data Assimilation System (NLDAS). There, Elm uses the NSGA-2 multiobjective optimization algorithm to optimize statistical preprocessing of forcing data to improve goodness-of-fit for statistical models (i.e. feature engineering). This presentation will discuss Elm and its components, including dask (distributed task scheduling), xarray (data structures for n-dimensional arrays), and scikit-learn (statistical preprocessing, clustering, classification, regression), and it will show how NSGA-2 is being used to automate the selection of soil moisture forecast statistical models for North America.
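A minimal sketch of the randomized-search idea (not Elm's or NSGA-2's actual API): draw hyperparameter candidates at random, score each with a validation objective, and keep the best. The objective function and parameter ranges below are invented for illustration.

```python
import random

def validation_score(params):
    """Stand-in scoring objective (invented); in practice this would be
    a cross-validated model fit. Lower is better."""
    lr, reg = params
    return (lr - 0.1) ** 2 + (reg - 0.01) ** 2

def random_search(n_trials, seed=42):
    """Draw random hyperparameter candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        params = (rng.uniform(0.0, 1.0),   # e.g. a learning rate
                  rng.uniform(0.0, 0.1))   # e.g. a regularisation strength
        score = validation_score(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search(n_trials=500)
```

Grid search enumerates a fixed lattice instead of sampling, while NSGA-2 additionally maintains a population and trades off several objectives at once; the score-and-keep-the-best loop is the common core.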
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline that is representative of the time it takes the software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provides an actual time that is representative of the time the known software application runs on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
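The core idea, comparing execution-time statistics of instrumented functions against a known-clean baseline, can be caricatured in a few lines. The decision rule and the simulated "malware" delay below are invented for the sketch; the patent does not specify this particular statistic.

```python
import statistics
import time

def time_function(fn, runs=30):
    """Collect wall-clock execution times for an instrumented function."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return samples

def flag_anomaly(baseline, observed, k=5.0):
    """Flag `observed` if its mean lies more than k baseline standard
    deviations from the baseline mean (hypothetical decision rule)."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline) or 1e-12
    return abs(statistics.mean(observed) - mu) > k * sd

def clean_fn():                    # known-pedigree workload
    sum(i * i for i in range(1000))

def tampered_fn():                 # same workload plus simulated overhead
    sum(i * i for i in range(1000))
    time.sleep(0.02)

baseline = time_function(clean_fn)
```

A real deployment would need many instrumented functions and a more robust statistic, since wall-clock timings are noisy across machines and loads.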
A dynamical classification of the cosmic web
Forero-Romero, J. E.; Hoffman, Y.; Gottlöber, S.; Klypin, A.; Yepes, G.
2009-07-01
In this paper, we propose a new dynamical classification of the cosmic web. Each point in space is classified in one of four possible web types: voids, sheets, filaments and knots. The classification is based on the evaluation of the deformation tensor (i.e. the Hessian of the gravitational potential) on a grid. The classification is based on counting the number of eigenvalues above a certain threshold, λth, at each grid point, where the case of zero, one, two or three such eigenvalues corresponds to void, sheet, filament or a knot grid point. The collection of neighbouring grid points, friends of friends, of the same web type constitutes voids, sheets, filaments and knots as extended web objects. A simple dynamical consideration of the emergence of the web suggests that the threshold should not be null, as in previous implementations of the algorithm. A detailed dynamical analysis would have found different threshold values for the collapse of sheets, filaments and knots. Short of such an analysis a phenomenological approach has been opted for, looking for a single threshold to be determined by analysing numerical simulations. Our cosmic web classification has been applied and tested against a suite of large (dark matter only) cosmological N-body simulations. In particular, the dependence of the volume and mass filling fractions on λth and on the resolution has been calculated for the four web types. We also study the percolation properties of voids and filaments. Our main findings are as follows. (i) Already at λth = 0.1 the resulting web classification reproduces the visual impression of the cosmic web. (ii) Between 0.2 net of interconnected filaments. This suggests a reasonable choice for λth as the parameter that defines the cosmic web. (iii) The dynamical nature of the suggested classification provides a robust framework for incorporating environmental information into galaxy formation models, and in particular to semi-analytical models.
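The counting rule described above is easy to state in code: evaluate the eigenvalues of the deformation tensor at each grid point and count how many exceed λth. The toy tensors below are hand-built diagonal matrices, not output of an N-body simulation.

```python
import numpy as np

def classify_web(hessian, lam_th=0.1):
    """Classify each grid point by counting deformation-tensor
    eigenvalues above lam_th: 0 -> void, 1 -> sheet, 2 -> filament,
    3 -> knot.  `hessian` has shape (..., 3, 3) and must be symmetric.
    """
    eigvals = np.linalg.eigvalsh(hessian)        # shape (..., 3)
    return (eigvals > lam_th).sum(axis=-1)

# Toy field: one diagonal deformation tensor per "grid point"
grid = np.array([
    np.diag([-0.5, -0.2, -0.1]),   # all below threshold -> void (0)
    np.diag([-0.5, -0.2, 0.3]),    # one above           -> sheet (1)
    np.diag([-0.5, 0.2, 0.3]),     # two above           -> filament (2)
    np.diag([0.15, 0.2, 0.3]),     # three above         -> knot (3)
])
web_type = classify_web(grid, lam_th=0.1)
```

Grouping neighbouring grid points of the same type with a friends-of-friends pass then yields the extended voids, sheets, filaments and knots described in the paper.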
A decision-theoretic approach for segmental classification
Yau, Christopher; Holmes, Christopher C.
2013-01-01
This paper is concerned with statistical methods for the segmental classification of linear sequence data where the task is to segment and classify the data according to an underlying hidden discrete state sequence. Such analysis is commonplace in the empirical sciences including genomics, finance and speech processing. In particular, we are interested in answering the following question: given data $y$ and a statistical model $\pi(x,y)$ of the hidden states $x$, what should we report as the ...
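As a point of reference for the segmentation task described (the paper argues for decision-theoretic alternatives to it), the standard MAP segmentation of a hidden state sequence is given by the Viterbi algorithm. The two-state model below is a toy example, not from the paper.

```python
import math

def viterbi(obs, states, log_start, log_trans, log_emit):
    """Most probable hidden state sequence (MAP segmentation) under a
    simple HMM, computed by dynamic programming."""
    V = [{s: log_start[s] + log_emit[s][obs[0]] for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, lp = max(((r, V[-1][r] + log_trans[r][s]) for r in states),
                           key=lambda t: t[1])
            col[s] = lp + log_emit[s][o]
            ptr[s] = prev
        V.append(col)
        back.append(ptr)
    # Trace back from the best final state
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

lg = math.log
states = ("low", "high")
log_start = {"low": lg(0.5), "high": lg(0.5)}
log_trans = {"low": {"low": lg(0.9), "high": lg(0.1)},
             "high": {"low": lg(0.1), "high": lg(0.9)}}
log_emit = {"low": {0: lg(0.8), 1: lg(0.2)},
            "high": {0: lg(0.2), 1: lg(0.8)}}
segmentation = viterbi([0, 0, 0, 1, 1, 1], states,
                       log_start, log_trans, log_emit)
```

The sticky transition probabilities (0.9 of staying put) encourage long segments, so the noisy observation sequence is segmented into one "low" run followed by one "high" run.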
IRIS COLOUR CLASSIFICATION SCALES--THEN AND NOW.
Grigore, Mariana; Avram, Alina
2015-01-01
Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale, developed in 1843, there have been numerous attempts to classify iris colour. In past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual's eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability over time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris and dedicated iris colour analysis software have all accomplished objective, accurate iris colour classification, but are quite expensive and limited in use to research environments. Iris colour classification systems have evolved continuously due to their use in a wide range of studies, especially in the fields of anthropology, epidemiology and genetics. Despite the wide range of existing scales, up until the present there has been no generally accepted iris colour classification scale.
Cochran, Susan D; Drescher, Jack; Kismödi, Eszter; Giami, Alain; García-Moreno, Claudia; Atalla, Elham; Marais, Adele; Vieira, Elisabeth Meloni; Reed, Geoffrey M
2014-09-01
The World Health Organization is developing the 11th revision of the International Statistical Classification of Diseases and Related Health Problems (ICD-11), planned for publication in 2017. The Working Group on the Classification of Sexual Disorders and Sexual Health was charged with reviewing and making recommendations on disease categories related to sexuality in the chapter on mental and behavioural disorders in the 10th revision (ICD-10), published in 1990. This chapter includes categories for diagnoses based primarily on sexual orientation even though ICD-10 states that sexual orientation alone is not a disorder. This article reviews the scientific evidence and clinical rationale for continuing to include these categories in the ICD. A review of the evidence published since 1990 found little scientific interest in these categories. In addition, the Working Group found no evidence that they are clinically useful: they neither contribute to health service delivery or treatment selection nor provide essential information for public health surveillance. Moreover, use of these categories may create unnecessary harm by delaying accurate diagnosis and treatment. The Working Group recommends that these categories be deleted entirely from ICD-11. Health concerns related to sexual orientation can be better addressed using other ICD categories.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes
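The relevance-redundancy trade-off behind MRMR-style selection can be sketched with absolute Pearson correlation standing in for mutual information. This is a simplification for illustration, not the paper's Boot-MRMR procedure, and the synthetic "genes" below are invented.

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy max-relevance, min-redundancy feature selection using
    |Pearson correlation| as a stand-in for mutual information."""
    n_features = X.shape[1]
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(n_features)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = -1, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Average redundancy against already-selected features
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

rng = np.random.default_rng(7)
a = rng.normal(size=300)          # informative signal 1
b = rng.normal(size=300)          # informative signal 2
y = a + b                         # "trait" driven by both signals
X = np.column_stack([a,                                  # gene 0
                     a + 0.001 * rng.normal(size=300),   # gene 1: near-copy of 0
                     b])                                 # gene 2
picked = mrmr_select(X, y, k=2)
```

Pure relevance ranking would pick the near-duplicate pair (genes 0 and 1); the redundancy penalty instead selects one of them plus the complementary signal.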
A discriminant function approach to ecological site classification in northern New England
James M. Fincher; Marie-Louise Smith
1994-01-01
Describes one approach to ecologically based classification of upland forest community types of the White and Green Mountain physiographic regions. The classification approach is based on an intensive statistical analysis of the relationship between the communities and soil-site factors. Discriminant functions useful in distinguishing between types based on soil-site...
A classification of growth friendly spine implants.
Skaggs, David L; Akbarnia, Behrooz A; Flynn, John M; Myung, Karen S; Sponseller, Paul D; Vitale, Michael G
2014-01-01
Various types of spinal implants have been used with the objective of minimizing spinal deformities while maximizing the spine and thoracic growth in a growing child with a spinal deformity. The aim of this study was to describe a classification system of growth friendly spinal implants to allow researchers and clinicians to have a common language and facilitate comparative studies. Growth friendly spinal implant systems fall into 3 categories based upon the forces of correction the implants exert on the spine, which are as follows: Distraction-based systems correct spinal deformities by mechanically applying a distractive force across a deformed segment with anchors at the top and bottom of the implants, which commonly attach to the spine, rib, and/or the pelvis. The present examples of distraction-based implants are spine-based or rib-based growing rods, vertical expandable titanium rib prosthesis, and remotely expandable devices. Compression-based systems correct spinal deformities with a compressive force applied to the convexity of the curve causing convex growth inhibition. This compressive force may be generated both mechanically at the time of implantation, as well as over time resulting from longitudinal growth of vertebral endplates hindered by the spinal implants. Examples of compression-based systems are vertebral staples and tethers. Guided growth systems correct spinal deformity by anchoring multiple vertebrae (usually including the apical vertebrae) to rods with mechanical forces including translation at the time of the initial implant. The majority of the anchors are not rigidly attached to the rods, thus permitting longitudinal growth over time as the anchors slide over the rods. Examples of guided growth systems include the Luque trolley and Shilla. Each system has its benefits and shortcomings. Knowledge of the fundamental principles upon which these systems are based may aid the clinician to choose an appropriate treatment for patients. Having a
DEFF Research Database (Denmark)
Biering-Sorensen, Tor; Jensen, Jan Skov; Pedersen, Sune H
2016-01-01
deformation in comparison to GLS, conventional echocardiography, and clinical information. Method: In total, 391 patients were admitted with ST-segment elevation myocardial infarction (STEMI), treated with primary percutaneous coronary intervention and subsequently examined by echocardiography. All patients were...... information to clinical and conventional echocardiographic information (Harrell's c-statistics: 0.63 vs. 0.67, p = 0.032). In addition, impaired longitudinal deformation outside the culprit lesion perfusion region was significantly associated with an adverse outcome (p...). Conclusion: Regional longitudinal myocardial deformation measures, regardless of whether determined by TDI or 2DSE, are superior prognosticators to GLS. In addition, impaired longitudinal deformation in the inferior myocardial segment provides prognostic information over and above clinical and conventional...
Deformation twinning in a creep-deformed nanolaminate structure
International Nuclear Information System (INIS)
Hsiung, Luke L
2010-01-01
The underlying mechanism of deformation twinning occurring in a TiAl-(γ)/Ti3Al-(α2) nanolaminate creep deformed at elevated temperatures has been studied. Since the multiplication and propagation of lattice dislocations in both γ and α2 thin lamellae are very limited, the total flow of lattice dislocations becomes insufficient to accommodate the accumulated creep strains. Consequently, the movement of interfacial dislocations along the laminate interfaces, i.e., interface sliding, becomes an alternative deformation mode of the nanolaminate structure. Pile-ups of interfacial dislocations occur when interfacial ledges and impinged lattice dislocations act as obstacles to impede the movement of interfacial dislocations. Deformation twinning can accordingly take place to relieve a stress concentration resulting from the pile-up of interfacial dislocations. An interface-controlled twinning mechanism driven by the pile-up and dissociation of interfacial dislocations is accordingly proposed.
[The importance of classifications in psychiatry].
Lempérière, T
1995-12-01
The classifications currently used in psychiatry have different aims: to facilitate communication between researchers and clinicians at national and international levels through the use of a common language, or at least a clearly and precisely defined nomenclature; to provide a nosographical reference system which can be used in practice (diagnosis, prognosis, treatment); to optimize research by ensuring that sample cases are as homogeneous as possible; to facilitate statistical records for public health institutions. A classification is of practical interest only if it is reliable, valid and acceptable to all potential users. In recent decades, there has been a considerable systematic and coordinated effort to improve the methodological approach to classification and categorization in the field of psychiatry, including attempts to create operational definitions, field trials of inter-assessor reliability, attempts to validate the selected nosological categories by analysis of correlation between progression, treatment response, family history and additional examinations. The introduction of glossaries, and particularly of diagnostic criteria, marked a decisive step in this new approach. The key problem remains that of the validity of diagnostic criteria. Ideally, these should be based on demonstrable etiologic or pathogenic data, but such information is rarely available in psychiatry. Current classifications rely on the use of extremely diverse elements in differing degrees: descriptive criteria, evolutive criteria, etiopathogenic criteria, psychopathogenic criteria, etc. Certain syndrome-based classifications such as DSM III and its successors aim to be atheoretical and pragmatic. Others, such as ICD-10, while more eclectic than the different versions of DSM, follow suit by abandoning the terms "disease" and "illness" in favor of the more consensual "disorder". The legitimacy of classifications in the field of psychiatry has been fiercely contested, being
International Nuclear Information System (INIS)
Tsusaka, Kimikazu
2010-01-01
Japan Atomic Energy Agency has been excavating deep shafts through sedimentary soft rocks in Horonobe, Hokkaido. From the viewpoint of observational construction, site engineers need a practical guide for evaluating the field measurements conducted during shaft sinking. The author analyzed the relationship among the initial deformation rate, the observed deformation, the ratio of the modulus of elasticity of the rock mass to the initial stress, and the magnitude of inelastic behavior of the rock, based on convergence measurements and investigation of rock mass properties on shaft walls. As a result, a rock mass behavior classification for shaft sinking consisting of three classes was proposed. (author)
Highly Robust Statistical Methods in Medical Image Analysis
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2012-01-01
Roč. 32, č. 2 (2012), s. 3-16 ISSN 0208-5216 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : robust statistics * classification * faces * robust image analysis * forensic science Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.208, year: 2012 http://www.ibib.waw.pl/bbe/bbefulltext/BBE_32_2_003_FT.pdf
Differential Classification of Dementia
Directory of Open Access Journals (Sweden)
E. Mohr
1995-01-01
In the absence of biological markers, dementia classification remains complex both in terms of characterization as well as early detection of the presence or absence of dementing symptoms, particularly in diseases with possible secondary dementia. An empirical, statistical approach using neuropsychological measures was therefore developed to distinguish demented from non-demented patients and to identify differential patterns of cognitive dysfunction in neurodegenerative disease. Age-scaled neurobehavioral test results (Wechsler Adult Intelligence Scale—Revised and Wechsler Memory Scale) from Alzheimer's (AD) and Huntington's (HD) patients, matched for intellectual disability, as well as normal controls were used to derive a classification formula. Stepwise discriminant analysis accurately (99% correct) distinguished controls from demented patients, and separated the two patient groups (79% correct). Variables discriminating between HD and AD patient groups consisted of complex psychomotor tasks, visuospatial function, attention and memory. The reliability of the classification formula was demonstrated with a new, independent sample of AD and HD patients which yielded virtually identical results (classification accuracy for dementia: 96%; AD versus HD: 78%). To validate the formula, the discriminant function was applied to Parkinson's (PD) patients, 38% of whom were classified as demented. The validity of the classification was demonstrated by significant PD subgroup differences on measures of dementia not included in the discriminant function. Moreover, a majority of demented PD patients (65%) were classified as having an HD-like pattern of cognitive deficits, in line with previous reports of the subcortical nature of PD dementia. This approach may thus be useful in classifying presence or absence of dementia and in discriminating between dementia subtypes in cases of secondary or coincidental dementia.
Fractal dimension and image statistics of anal intraepithelial neoplasia
International Nuclear Information System (INIS)
Ahammer, H.; Kroepfl, J.M.; Hackl, Ch.; Sedivy, R.
2011-01-01
Research Highlights: → Human papillomaviruses cause anal intraepithelial neoplasia (AIN). → Digital image processing was carried out to classify the grades of AIN quantitatively. → The fractal dimension as well as grey value statistics was calculated. → Higher grades of AIN yielded higher values of the fractal dimension. → An automatic detection system is feasible. - Abstract: It is well known that human papillomaviruses (HPV) induce a variety of tumorous lesions of the skin. HPV-subtypes also cause premalignant lesions which are termed anal intraepithelial neoplasia (AIN). The clinical classification of AIN is of growing interest in clinical practice, due to increasing HPV infection rates throughout human population. The common classification approach is based on subjective inspections of histological slices of anal tissues with all the drawbacks of depending on the status and individual variances of the trained pathologists. Therefore, a nonlinear quantitative classification method including the calculation of the fractal dimension and first order as well as second order image statistical parameters was developed. The absolute values of these quantitative parameters reflected the distinct grades of AIN very well. The quantitative approach has the potential to decrease classification errors significantly and it could be used as a widely applied screening technique.
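Box-counting is the standard way to estimate a fractal dimension from a segmented binary image of the kind this paper analyzes; a minimal pure-Python sketch (the grid and box sizes below are illustrative, not from the study's data):

```python
import math

def box_count_dimension(grid, box_sizes=(1, 2, 4, 8)):
    """Estimate the box-counting (Minkowski) fractal dimension of a
    binary image given as a square list of equal-length rows of 0/1."""
    n = len(grid)
    xs, ys = [], []
    for s in box_sizes:
        count = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                # a box is "occupied" if any pixel inside it is set
                if any(grid[i + di][j + dj]
                       for di in range(min(s, n - i))
                       for dj in range(min(s, n - j))):
                    count += 1
        xs.append(math.log(1.0 / s))
        ys.append(math.log(count))
    # ordinary least-squares slope of log N(s) against log(1/s)
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

filled = [[1] * 16 for _ in range(16)]  # a solid square should give D ≈ 2
print(round(box_count_dimension(filled), 3))  # → 2.0
```

Higher AIN grades with more irregular tissue boundaries would show up as larger slopes in this log-log fit.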
Modeling of 3D Aluminum Polycrystals during Large Deformations
International Nuclear Information System (INIS)
Maniatty, Antoinette M.; Littlewood, David J.; Lu Jing; Pyle, Devin
2007-01-01
An approach for generating, meshing, and modeling 3D polycrystals, with a focus on aluminum alloys, subjected to large deformation processes is presented. A Potts type model is used to generate statistically representative grain structures with periodicity to allow scale-linking. The grain structures are compared to experimentally observed grain structures to validate that they are representative. A procedure for generating a geometric model from the voxel data is developed, allowing for adaptive meshing of the generated grain structure. Material behavior is governed by an appropriate crystal elasto-viscoplastic constitutive model. The elastic-viscoplastic model is implemented in a three-dimensional, finite deformation, mixed, finite element program. In order to handle the large-scale problems of interest, a parallel implementation is utilized. A multiscale procedure is used to link larger scale models of deformation processes to the polycrystal model, where periodic boundary conditions on the fluctuation field are enforced. Finite-element models of 3D polycrystal grain structures are presented, along with observations made from these simulations.
Solid state conformational classification of eight-membered rings
DEFF Research Database (Denmark)
Pérez, J.; García, L.; Kessler, M.
2005-01-01
A statistical classification of the solid state conformation in the title complexes using data retrieved from the Cambridge Structural Database (CSD) has been made. Phosphate and phosphinate complexes show a chair conformation preferably. In phosphonate complexes, the most frequent conformations...
Re-Irradiation of Hepatocellular Carcinoma: Clinical Applicability of Deformable Image Registration.
Lee, Dong Soo; Woo, Joong Yeol; Kim, Jun Won; Seong, Jinsil
2016-01-01
This study aimed to evaluate whether the deformable image registration (DIR) method is clinically applicable to the safe delivery of re-irradiation in hepatocellular carcinoma (HCC). Between August 2010 and March 2012, 12 eligible HCC patients received re-irradiation using helical tomotherapy. The median total prescribed radiation doses at first irradiation and re-irradiation were 50 Gy (range, 36-60 Gy) and 50 Gy (range, 36-58.42 Gy), respectively. Most re-irradiation therapies (11 of 12) were administered to previously irradiated or marginal areas. Dose summation results were reproduced using DIR by rigid and deformable registration methods, and doses of organs-at-risk (OARs) were evaluated. Treatment outcomes were also assessed. Thirty-six dose summation indices were obtained for three OARs (bowel, duodenum, and stomach doses in each patient). There was no statistical difference between the two different types of DIR methods (rigid and deformable) in terms of calculated summation ΣD (0.1 cc, 1 cc, 2 cc, and max) in each OAR. The median total mean remaining liver doses (M(RLD)) in rigid- and deformable-type registration were not statistically different for all cohorts (p=0.248), although a large difference in M(RLD) was observed when there was a significant difference in spatial liver volume change between radiation intervals. One duodenal ulcer perforation developed 20 months after re-irradiation. Although current dose summation algorithms and uncertainties do not warrant accurate dosimetric results, OARs-based DIR dose summation can be usefully utilized in the re-irradiation of HCC. Appropriate cohort selection, watchful interpretation, and selective use of DIR methods are crucial to enhance the radio-therapeutic ratio.
Statistics on Lie groups: A need to go beyond the pseudo-Riemannian framework
Miolane, Nina; Pennec, Xavier
2015-01-01
Lie groups appear in many fields from Medical Imaging to Robotics. In Medical Imaging and particularly in Computational Anatomy, an organ's shape is often modeled as the deformation of a reference shape, in other words: as an element of a Lie group. In this framework, if one wants to model the variability of the human anatomy, e.g. in order to help diagnosis of diseases, one needs to perform statistics on Lie groups. A Lie group G is a manifold that carries an additional group structure. Statistics on Riemannian manifolds have been well studied with the pioneering work of Fréchet, Karcher and Kendall [1, 2, 3, 4] followed by others [5, 6, 7, 8, 9]. In order to use such a Riemannian structure for statistics on Lie groups, one needs to define a Riemannian metric that is compatible with the group structure, i.e. a bi-invariant metric. However, it is well known that general Lie groups which cannot be decomposed into the direct product of compact and abelian groups do not admit a bi-invariant metric. One may wonder if removing the positivity of the metric, thus asking only for a bi-invariant pseudo-Riemannian metric, would be sufficient for most of the groups used in Computational Anatomy. In this paper, we provide an algorithmic procedure that constructs bi-invariant pseudo-metrics on a given Lie group G. The procedure relies on a classification theorem of Medina and Revoy. However, in doing so, we prove that most Lie groups do not admit any bi-invariant (pseudo-)metric. We conclude that the (pseudo-)Riemannian setting is not the richest setting if one wants to perform statistics on Lie groups. One may have to rely on another framework, such as affine connection spaces.
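The compatibility condition invoked above can be stated concretely: a scalar product on the Lie algebra extends to a bi-invariant (pseudo-)metric on the group exactly when it is invariant under the adjoint action, i.e.

```latex
\langle [x, y], z \rangle + \langle y, [x, z] \rangle = 0
\qquad \text{for all } x, y, z \in \mathfrak{g}.
```

For compact groups a positive-definite invariant product always exists (e.g. minus the Killing form on a compact semisimple group), which is why the obstruction discussed in the paper only arises for groups that are not direct products of compact and abelian factors.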
Etiology, pathophysiology and classifications of the diabetic Charcot foot
Papanas, Nikolaos; Maltezos, Efstratios
2013-01-01
In people with diabetes mellitus, the Charcot foot is a specific manifestation of peripheral neuropathy that may involve autonomic neuropathy with high blood flow to the foot, leading to increased bone resorption. It may also involve peripheral somatic polyneuropathy with loss of protective sensation and high risk of unrecognized acute or chronic minor trauma. In both cases, there is excess local inflammatory response to foot injury, resulting in local osteoporosis. In the Charcot foot, the acute and chronic phases have been described. The former is characterized by local erythema, edema, and marked temperature elevation, while pain is not a prominent symptom. In the latter, signs of inflammation gradually recede and deformities may develop, increasing the risk of foot ulceration. The most common anatomical classification describes five patterns, according to the localization of bone and joint pathology. This review article aims to provide a brief overview of the diabetic Charcot foot in terms of etiology, pathophysiology, and classification. PMID:23705058
Statistical text classifier to detect specific type of medical incidents.
Wong, Zoie Shui-Yee; Akiyama, Masanori
2013-01-01
WHO Patient Safety has focused on increasing the coherence and expressiveness of patient safety classification with the foundation of the International Classification for Patient Safety (ICPS). Text classification and statistical approaches have been shown to be successful in identifying safety problems in the aviation industry using incident text information. It has been challenging to comprehend the taxonomy of medical incidents in a structured manner. Independent reporting mechanisms for patient safety incidents have been established in the UK, Canada, Australia, Japan, Hong Kong, etc. This research demonstrates the potential to construct statistical text classifiers to detect specific types of medical incidents using incident text data. An illustrative example for classifying look-alike sound-alike (LASA) medication incidents using structured text from 227 advisories related to medication errors from Global Patient Safety Alerts (GPSA) is shown in this poster presentation. The classifier was built using a logistic regression model. The ROC curve and the AUC value indicated that this is a satisfactorily good model.
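The classifier described is logistic regression over incident text; a self-contained bag-of-words sketch of the same idea (the example incident texts are invented for illustration and are not from the GPSA data):

```python
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def train_logreg(docs, labels, epochs=200, lr=0.5):
    """Bag-of-words logistic regression trained by per-sample gradient descent."""
    vocab = sorted({t for d in docs for t in tokenize(d)})
    idx = {t: i for i, t in enumerate(vocab)}
    w, b = [0.0] * len(vocab), 0.0
    for _ in range(epochs):
        for doc, y in zip(docs, labels):
            counts = Counter(tokenize(doc))
            z = b + sum(w[idx[t]] * c for t, c in counts.items() if t in idx)
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y                      # gradient of the log-loss
            b -= lr * g
            for t, c in counts.items():
                w[idx[t]] -= lr * g * c
    return idx, w, b

def predict(model, doc):
    """Probability that a new incident text belongs to class 1."""
    idx, w, b = model
    counts = Counter(tokenize(doc))
    z = b + sum(w[idx[t]] * c for t, c in counts.items() if t in idx)
    return 1.0 / (1.0 + math.exp(-z))

train_docs = [
    "dispensed celexa instead of celebrex",         # LASA incident
    "hydroxyzine confused with hydralazine",        # LASA incident
    "patient received zantac instead of zyrtec",    # LASA incident
    "patient fell while transferring from bed",     # other incident
    "iv pump alarm ignored during night shift",     # other incident
    "wrong surgical site marked before operation",  # other incident
]
train_labels = [1, 1, 1, 0, 0, 0]
model = train_logreg(train_docs, train_labels)
```

A production system would add regularization and evaluate with an ROC curve and AUC, as the poster reports.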
Relation between epistaxis, external nasal deformity, and septal deviation following nasal trauma
Daniel, M; Raghavan, U
2005-01-01
Objectives: To find if the presence of epistaxis after nasal trauma can be used to predict post-traumatic external nasal deformity or a symptomatic deviated nasal septum. Methods: Retrospective analysis of all patients seen in the fractured nose clinic by the first author between 17 October 2003 and 27 February 2004. Presence of epistaxis, newly developed external nasal deformity, and the presence of a deviated nasal septum with new symptoms of nasal obstruction were noted. Results: A total of 139 patients were included in the study. Epistaxis following injury was noted in 106 (76%). Newly developed external nasal deformity was noted in 71 (51%), and 33 (24%) had a deviated nasal septum with new symptoms of nasal obstruction. Of the 106 patients with post-trauma epistaxis, 50 (67%) had newly developed external nasal deformity and of the 33 patients without post-traumatic epistaxis, 11 (33%) had nasal deformity (p...). Epistaxis was not associated with the presence of a newly symptomatic deviated septum (25% in patients with epistaxis after injury versus 18% if there was no epistaxis). Conclusions: Presence of epistaxis after nasal trauma is associated with a statistically significant increase in external nasal deformity. However, one third of patients without epistaxis following nasal trauma also had external nasal deformity and hence all patients with a swollen nose after injury, irrespective of post-trauma epistaxis, still need to be referred to the fractured nose clinic. PMID:16244333
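The significance claim above is a classic two-proportion comparison; a stdlib-only sketch (the counts below are hypothetical round numbers, not the study's exact figures):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for H0: the two group proportions are equal."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

def two_sided_p(z):
    """Two-sided p-value from the standard normal distribution."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# hypothetical counts: deformity in 60/100 with epistaxis vs 11/33 without
z = two_proportion_z(60, 100, 11, 33)
```

Any |z| above 1.96 corresponds to p < 0.05 in the two-sided test.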
Families of vector-like deformations of relativistic quantum phase spaces, twists and symmetries
Energy Technology Data Exchange (ETDEWEB)
Meljanac, Daniel [Ruder Boskovic Institute, Division of Materials Physics, Zagreb (Croatia); Meljanac, Stjepan; Pikutic, Danijel [Ruder Boskovic Institute, Division of Theoretical Physics, Zagreb (Croatia)
2017-12-15
Families of vector-like deformed relativistic quantum phase spaces and corresponding realizations are analyzed. A method for a general construction of the star product is presented. The corresponding twist, expressed in terms of phase space coordinates, in the Hopf algebroid sense is presented. General linear realizations are considered and corresponding twists, in terms of momenta and Poincare-Weyl generators or gl(n) generators are constructed and R-matrix is discussed. A classification of linear realizations leading to vector-like deformed phase spaces is given. There are three types of spaces: (i) commutative spaces, (ii) κ-Minkowski spaces and (iii) κ-Snyder spaces. The corresponding star products are (i) associative and commutative (but non-local), (ii) associative and non-commutative and (iii) non-associative and non-commutative, respectively. Twisted symmetry algebras are considered. Transposed twists and left-right dual algebras are presented. Finally, some physical applications are discussed. (orig.)
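Of the three space types listed, the κ-Minkowski case is the most widely studied; its vector-like deformation is fixed by a constant vector a^μ (of the order of the inverse deformation scale 1/κ), with noncommutative coordinates satisfying

```latex
[\hat{x}^{\mu}, \hat{x}^{\nu}] = i \left( a^{\mu} \hat{x}^{\nu} - a^{\nu} \hat{x}^{\mu} \right).
```

Setting a^μ = 0 recovers the commutative case (i), and the usual timelike choice is a^μ = (1/κ, 0, …, 0).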
International Nuclear Information System (INIS)
Kharlamov, V.V.; Dvinskij, V.M.; Vashlyaev, Eh.V.; Dyblenko, Z.A.; Khamatov, R.I.; Zverev, K.P.
1981-01-01
On the basis of approximation of the experimental curves, partial differential equations relating the ABM-1 alloy deformation resistance to the deformation parameters are obtained. Using statistical processing of the experimental data, regression equations for the dependence of the deformation resistance on temperature, rate and relative reduction of the samples are found. In the deformation rate range of 2.1-23.6 1/s, the hardening and weakening rates of the AMB-1 alloy increase with increasing deformation rate. The data obtained permit calculation of the deformation parameters of the studied alloy for different processes of plastic metal working in the studied temperature range. [ru]
Deformed Materials: Towards a Theory of Materials Morphology Dynamics
Energy Technology Data Exchange (ETDEWEB)
Sethna, James P [Laboratory of Atomic and Solid State Physics, Cornell University
2017-06-28
This grant supported work on the response of crystals to external stress. Our primary work described how disordered structural materials break in two (statistical models of fracture in disordered materials), studied models of deformation bursts (avalanches) that mediate deformation on the microscale, and developed continuum dislocation dynamics models for plastic deformation (as when scooping ice cream bends a spoon, Fig. 9). Glass is brittle -- it breaks with almost atomically smooth fracture surfaces. Many metals are ductile -- when they break, the fracture surface is locally sheared and stretched, and it is this damage that makes them hard to break. Bone and seashells are made of brittle material, but they are strong because they are disordered -- lots of little cracks form as they are sheared and near the fracture surface, diluting the external force. We have studied materials like bone and seashells using simulations, mathematical tools, and statistical mechanics models from physics. In particular, we studied the extreme values of fracture strengths (how likely will a beam in a bridge break far below its design strength), and found that the traditional engineering tools could be improved greatly. We also studied fascinating crackling-noise precursors -- systems which formed microcracks of a broad range of sizes before they broke. Ductile metals under stress undergo irreversible plastic deformation -- the planes of atoms must slide across one another (through the motion of dislocations) to change the overall shape in response to the external force. Microscopically, the dislocations in crystals move in bursts of a broad range of sizes (termed 'avalanches' in the statistical mechanics community, whose motion is deemed 'crackling noise'). In this grant period, we resolved a longstanding mystery about the average shape of avalanches of fixed duration (using tools related to an emergent scale invariance), we developed the fundamental theory
Signal classification using global dynamical models, Part I: Theory
International Nuclear Information System (INIS)
Kadtke, J.; Kremliovsky, M.
1996-01-01
Detection and classification of signals is one of the principal areas of signal processing, and the utilization of nonlinear information has long been considered as a way of improving performance beyond standard linear (e.g. spectral) techniques. Here, we develop a method for using global models of chaotic dynamical systems theory to define a signal classification processing chain, which is sensitive to nonlinear correlations in the data. We use it to demonstrate classification in high noise regimes (negative SNR), and argue that classification probabilities can be directly computed from ensemble statistics in the model coefficient space. We also develop a modification for non-stationary signals (i.e. transients) using non-autonomous ODEs. In Part II of this paper, we demonstrate the analysis on actual open ocean acoustic data from marine biologics. copyright 1996 American Institute of Physics
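The core step of such global modeling is fitting the coefficients of an assumed dynamical model to the observed signal and then classifying in the model-coefficient space; a pure-Python sketch for a scalar model dx/dt ≈ a0 + a1·x + a2·x² (the polynomial basis is an illustrative choice, not the paper's exact model set):

```python
def fit_global_model(x, dxdt):
    """Least-squares fit of dx/dt = a0 + a1*x + a2*x**2.

    The coefficient vector (a0, a1, a2) is the signature used as a
    feature for classification in model-coefficient space."""
    basis = [[1.0, xi, xi * xi] for xi in x]
    # normal equations: (B^T B) a = B^T y
    bt_b = [[sum(r[i] * r[j] for r in basis) for j in range(3)] for i in range(3)]
    bt_y = [sum(r[i] * yi for r, yi in zip(basis, dxdt)) for i in range(3)]
    # solve the 3x3 system by Gaussian elimination with partial pivoting
    m = [row[:] + [rhs] for row, rhs in zip(bt_b, bt_y)]
    n = 3
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    a = [0.0] * n
    for r in reversed(range(n)):
        a[r] = (m[r][n] - sum(m[r][c] * a[c] for c in range(r + 1, n))) / m[r][r]
    return a

# synthetic check: samples generated exactly from dx/dt = 1 + 2x - 0.5x^2
xs = [0.1 * k for k in range(20)]
ys = [1.0 + 2.0 * x - 0.5 * x * x for x in xs]
a0, a1, a2 = fit_global_model(xs, ys)
```

In practice dx/dt would be estimated from the sampled signal by finite differences, and ensembles of fitted coefficient vectors would supply the classification statistics.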
Simple Fully Automated Group Classification on Brain fMRI
International Nuclear Information System (INIS)
Honorio, J.; Goldstein, R.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.
2010-01-01
We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, imperfect registration, and capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block design data sets that capture brain function under distinct monetary rewards for cocaine addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
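The pipeline of one thresholded feature per condition combined by majority vote can be caricatured in a few lines; here each threshold is placed at the midpoint of the two class means, which is an assumption of this sketch rather than the authors' threshold-split rule:

```python
def fit_thresholds(X, y):
    """Per-feature threshold at the midpoint of the two class means;
    direction records which side of the threshold class 1 lies on."""
    n_pos = sum(y)
    n_neg = len(y) - n_pos
    thresholds, directions = [], []
    for j in range(len(X[0])):
        m1 = sum(row[j] for row, lab in zip(X, y) if lab == 1) / n_pos
        m0 = sum(row[j] for row, lab in zip(X, y) if lab == 0) / n_neg
        thresholds.append((m0 + m1) / 2)
        directions.append(m1 > m0)
    return thresholds, directions

def majority_vote(sample, thresholds, directions):
    """Each feature votes independently; the majority decides the class."""
    votes = sum(int((x > t) == d)
                for x, t, d in zip(sample, thresholds, directions))
    return int(2 * votes > len(sample))

X = [[2.0, 5.0, 1.0], [2.2, 5.5, 0.8],   # class 1 (hypothetical fMRI features)
     [1.0, 3.0, 2.0], [0.8, 2.5, 2.2]]   # class 0
y = [1, 1, 0, 0]
t, d = fit_thresholds(X, y)
```

Because each feature contributes a single independent vote, the design stays robust when the number of features far exceeds the number of subjects.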
The effect of person misfit on classification decisions
Hendrawan, I.; Glas, Cornelis A.W.; Meijer, R.R.
2001-01-01
The effect of person misfit to an item response theory (IRT) model on a mastery/nonmastery decision was investigated. Also investigated was whether the classification precision can be improved by identifying misfitting respondents using person-fit statistics. A simulation study was conducted to
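A standard person-fit statistic of the kind referenced here is the standardized log-likelihood l_z; a sketch under a Rasch model (the ability value and item difficulties below are illustrative):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def lz_statistic(responses, theta, difficulties):
    """Standardized log-likelihood person-fit statistic l_z.

    Values far below 0 flag response patterns that are unlikely
    under the IRT model (possible person misfit)."""
    probs = [rasch_p(theta, b) for b in difficulties]
    l0 = sum(u * math.log(p) + (1 - u) * math.log(1 - p)
             for u, p in zip(responses, probs))
    expected = sum(p * math.log(p) + (1 - p) * math.log(1 - p) for p in probs)
    variance = sum(p * (1 - p) * math.log(p / (1 - p)) ** 2 for p in probs)
    return (l0 - expected) / math.sqrt(variance)

b = [-2.0, -1.0, 0.0, 1.0, 2.0]  # item difficulties, easy to hard
consistent = [1, 1, 1, 0, 0]     # expected pattern at theta = 0
aberrant = [0, 0, 1, 1, 1]       # easy items wrong, hard items right
```

Flagging examinees whose l_z falls below a cutoff before making the mastery/nonmastery decision is the kind of screening whose effect the simulation study evaluates.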
Appraisal of transport and deformation in shale reservoirs using natural noble gas tracers
Energy Technology Data Exchange (ETDEWEB)
Heath, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kuhlman, Kristopher L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bauer, Stephen J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gardner, William Payton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Univ. of Montana, Missoula, MT (United States)
2015-09-01
This report presents efforts to develop the use of in situ naturally-occurring noble gas tracers to evaluate transport mechanisms and deformation in shale hydrocarbon reservoirs. Noble gases are promising as shale reservoir diagnostic tools due to their sensitivity of transport to: shale pore structure; phase partitioning between groundwater, liquid, and gaseous hydrocarbons; and deformation from hydraulic fracturing. Approximately 1.5-year time-series of wellhead fluid samples were collected from two hydraulically-fractured wells. The noble gas compositions and isotopes suggest a strong signature of atmospheric contribution to the noble gases that mix with deep, old reservoir fluids. Complex mixing and transport of fracturing fluid and reservoir fluids occurs during production. Real-time laboratory measurements were performed on triaxially-deforming shale samples to link deformation behavior, transport, and gas tracer signatures. Finally, we present improved methods for production forecasts that borrow statistical strength from production data of nearby wells to reduce uncertainty in the forecasts.
A new improved method for assessing brain deformation after decompressive craniectomy.
Directory of Open Access Journals (Sweden)
Tim L Fletcher
Decompressive craniectomy (DC) is a surgical intervention used following traumatic brain injury to prevent or alleviate raised intracranial pressure. However, the clinical effectiveness of the intervention remains in doubt. The location of the craniectomy (unilateral or bifrontal) might be expected to change the brain deformation associated with the operation and hence the clinical outcome. As existing methods for assessing brain deformation have several limitations, we sought to develop and validate a new improved method. Computed tomography (CT) scans were taken from 27 patients who underwent DC (17 bifrontal patients and 10 unilateral patients). Pre-operative and post-operative images were processed and registered to determine the change in brain position associated with the operation. The maximum deformation in the herniated brain, the change in volume and estimates of the craniectomy area were determined from the images. Statistical comparison was made using the Pearson's correlation coefficient r and a Welch's two-tailed T-test, with statistical significance reported at the 5% level. There was a reasonable correlation between the volume increase and the maximum brain displacement (r = 0.64), a low correlation between the volume increase and the craniectomy area (r = 0.30) and no correlation between the maximum displacement and the craniectomy area (r = -0.01). The maximum deformation was significantly lower (P = 0.023) in the bifrontal patients (mean = 22.5 mm) compared with the unilateral patients (mean = 29.8 mm). Herniation volume was significantly lower (P = 0.023) in bifrontal (mean = 50.0 ml) than unilateral patients (mean = 107.3 ml). Craniectomy area was not significantly different for the two craniectomy locations (P = 0.29). A method has been developed to quantify changes in brain deformation due to decompressive craniectomy from CT images and allow comparison between different craniectomy locations.
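Both statistics used in this study are straightforward to reproduce; a stdlib-only sketch of Pearson's r and Welch's t statistic (a p-value would additionally require the t-distribution CDF, omitted here; the sample values in the test are synthetic):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

Welch's form is the appropriate choice here because the bifrontal and unilateral groups differ in size and need not share a variance.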
Benediktsson, J. A.; Swain, P. H.; Ersoy, O. K.
1993-01-01
Application of neural networks to classification of remote sensing data is discussed. Conventional two-layer backpropagation is found to give good results in classification of remote sensing data but is not efficient in training. A more efficient variant, based on conjugate-gradient optimization, is used for classification of multisource remote sensing and geographic data and very-high-dimensional data. The conjugate-gradient neural networks give excellent performance in classification of multisource data, but do not compare as well with statistical methods in classification of very-high-dimensional data.
Nasca classification of hemivertebra in five dogs
Directory of Open Access Journals (Sweden)
Besalti Omer
2005-12-01
Full Text Available Five dogs, four small mixed breed and a Doberman Pinscher, presented in our clinic with hemivertebra. Complete physical, radiological and neurological examinations were done and the spinal deformities were characterized in accordance with the Nasca classification used in human medicine. Two dogs had multiple hemivertebrae (round, oval or wedge-shaped: Type 3) in the thoracic region; one dog had an individual surplus half vertebral body (Type 1) plus a wedge-shaped hemivertebra (Type 2b) in the lumbar region; one dog had multiple hemivertebrae which were fused on one side (Type 4a) in the thoracic region; and one dog had a wedge-shaped hemivertebra (Type 2a) in the cervical region.
A Comparison of Machine Learning Methods in a High-Dimensional Classification Problem
Directory of Open Access Journals (Sweden)
Zekić-Sušac Marijana
2014-09-01
Full Text Available Background: Large-dimensional data modelling often relies on variable reduction methods in the pre-processing and in the post-processing stage. However, such a reduction usually provides less information and yields a lower accuracy of the model. Objectives: The aim of this paper is to assess the high-dimensional classification problem of recognizing entrepreneurial intentions of students by machine learning methods. Methods/Approach: Four methods were tested: artificial neural networks, CART classification trees, support vector machines, and k-nearest neighbour, on the same dataset in order to compare their classification accuracy. The performance of each method was compared on ten subsamples in a 10-fold cross-validation procedure, computing the sensitivity and specificity of each model. Results: The artificial neural network model based on a multilayer perceptron yielded a higher classification rate than the models produced by the other methods. The pairwise t-test showed a statistically significant difference between the artificial neural network and the k-nearest neighbour model, while the differences among the other methods were not statistically significant. Conclusions: The tested machine learning methods are able to learn fast and achieve high classification accuracy. However, further advancement can be assured by testing a few additional methodological refinements in machine learning methods.
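The 10-fold cross-validation comparison described above can be sketched as follows; the toy one-feature dataset and fixed-threshold "classifier" are assumptions for illustration, not the study's models or data.

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds of near-equal size."""
    folds, start = [], 0
    for i in range(k):
        size = n // k + (1 if i < n % k else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_val_accuracy(X, y, fit, predict, k=10):
    """Mean held-out accuracy over k folds for a fit/predict pair."""
    folds = kfold_indices(len(X), k)
    accs = []
    for test_idx in folds:
        test = set(test_idx)
        Xtr = [x for i, x in enumerate(X) if i not in test]
        ytr = [t for i, t in enumerate(y) if i not in test]
        model = fit(Xtr, ytr)
        correct = sum(predict(model, X[i]) == y[i] for i in test_idx)
        accs.append(correct / len(test_idx))
    return sum(accs) / k

# Toy data: label is 1 exactly when the single feature exceeds 0.5.
X = [[i / 20] for i in range(20)]
y = [1 if x[0] > 0.5 else 0 for x in X]
fit = lambda X, y: 0.5                        # "model" is a fixed threshold
predict = lambda thr, x: 1 if x[0] > thr else 0
acc = cross_val_accuracy(X, y, fit, predict, k=10)   # 1.0 on this toy data
```

The per-fold accuracies collected in `accs` are what a pairwise t-test between two competing methods would be run on.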
International Nuclear Information System (INIS)
Abbas, Afsar
1992-01-01
The surprising answer to the question 'Is the nucleon deformed?' is: yes. The evidence comes from a study of the quark model of the single nucleon and of the nucleon when it is found in a nucleus. It turns out that many of the long-standing problems of the Naive Quark Model are taken care of if the nucleon is assumed to be deformed. Only one value of the parameter P_D ∼ 1/4 (which specifies deformation) fits g_A (the axial vector coupling constant) for all the semileptonic decays of baryons, the F/D ratio, the pion-nucleon-delta coupling constant f_πNΔ, the double delta coupling constant f_πΔΔ, the M1 transition moment μ_ΔN and g_1^p, the spin structure function of the proton. All this gives a strong hint that both neutron and proton are deformed. It is important to look for further signatures of this deformation. When this deformed nucleon finds itself in a nuclear medium, its deformation decreases, so much so that in a heavy nucleus the nucleons are actually spherical. We look into the Gamow-Teller strengths, magnetic moments and magnetic transition strengths in nuclei to study this property. (author). 15 refs
Nuss bar migrations: occurrence and classification
International Nuclear Information System (INIS)
Binkovitz, Lauren E.; Binkovitz, Larry A.; Zendejas, Benjamin; Moir, Christopher R.
2016-01-01
Pectus excavatum results from dorsal deviation of the sternum causing narrowing of the anterior-posterior diameter of the chest. It can result in significant cosmetic deformities and cardiopulmonary compromise if severe. The Nuss procedure is a minimally invasive technique that involves placing a thin horizontally oriented metal bar below the dorsal sternal apex for correction of the pectus deformity. To identify the frequency and types of Nuss bar migrations, to present a new categorization of bar migrations, and to present examples of true migrations and pseudomigrations. We retrospectively reviewed the electronic medical records and all pertinent radiologic studies of 311 pediatric patients who underwent a Nuss procedure. We evaluated the frequency and type of bar migrations. Bar migration was demonstrated in 23 of 311 patients (7%) and occurred within a mean period of 26 days after surgery. Bar migrations were subjectively defined as deviation of the bar from the position demonstrated on the immediate postoperative radiographs and categorized as superior, inferior, rotation, lateral or flipped using a new classification system. Sixteen of the 23 migrations required re-operation. Nuss bar migration can be diagnosed with careful evaluation of serial radiographs. Nuss bar migration has a wide variety of appearances and requires exclusion of pseudomigration resulting from changes in patient positioning between radiologic examinations. (orig.)
Wu, Chong; Liu, Liping; Wei, Ming; Xi, Baozhu; Yu, Minghui
2018-03-01
A modified hydrometeor classification algorithm (HCA) is developed in this study for Chinese polarimetric radars. This algorithm is based on the U.S. operational HCA. In addition, a methodology of statistics-based optimization is proposed, comprising calibration checking, dataset selection, membership function modification, computation threshold modification, and effect verification. These procedures were applied to Zhuhai radar, the first operational polarimetric radar in South China. The systematic calibration bias is corrected; the reliability of radar measurements deteriorates when the signal-to-noise ratio is low; and the correlation coefficient within the melting layer is usually lower than that of the U.S. WSR-88D radar. Through modification based on statistical analysis of polarimetric variables, an HCA localized specifically for Zhuhai is obtained, and it performs well over a one-month test in comparison with sounding and surface observations. The algorithm is then utilized for analysis of a squall line process on 11 May 2014 and is found to provide reasonable details with respect to horizontal and vertical structures, and the HCA results, especially in the mixed rain-hail region, can reflect the life cycle of the squall line. In addition, the kinematic and microphysical processes of cloud evolution and the differences between radar-detected hail and surface observations are also analyzed. The results of this study provide evidence for the improvement of this HCA developed specifically for China.
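Fuzzy-logic HCAs of this family typically score each polarimetric variable against per-class membership functions and aggregate the scores with weights; modifying those membership functions is exactly the kind of localization step described above. A minimal sketch with a trapezoidal function (the breakpoints and weights are invented for illustration, not Zhuhai's values):

```python
def trapezoid(x, x1, x2, x3, x4):
    """Trapezoidal membership: 0 outside (x1, x4), 1 on [x2, x3], linear ramps between."""
    if x <= x1 or x >= x4:
        return 0.0
    if x2 <= x <= x3:
        return 1.0
    if x < x2:
        return (x - x1) / (x2 - x1)
    return (x4 - x) / (x4 - x3)

def aggregate(memberships, weights):
    """Weighted mean of per-variable memberships for one hydrometeor class."""
    return sum(w * m for w, m in zip(weights, memberships)) / sum(weights)

# Illustrative: membership of a reflectivity value in a hypothetical "rain" class.
mu = trapezoid(32.0, 20.0, 30.0, 45.0, 55.0)   # inside the flat top -> 1.0
```

Localizing the algorithm then amounts to shifting the breakpoints (x1..x4) per variable based on the local statistics of the polarimetric measurements.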
Schizophrenia classification using functional network features
Rish, Irina; Cecchi, Guillermo A.; Heuton, Kyle
2012-03-01
This paper focuses on discovering statistical biomarkers (features) that are predictive of schizophrenia, with a particular focus on topological properties of fMRI functional networks. We consider several network properties, such as node (voxel) strength, clustering coefficients and local efficiency, as well as just a subset of pairwise correlations. While all types of features demonstrate highly significant statistical differences in several brain areas, and close to 80% classification accuracy, the most remarkable results of 93% accuracy are achieved by using a small subset of only a dozen of the most informative (lowest p-value) correlation features. Our results suggest that voxel-level correlations and the functional network features derived from them are highly informative about schizophrenia and can be used as statistical biomarkers for the disease.
Directory of Open Access Journals (Sweden)
Casey Olives
Full Text Available Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%; >10% and <50%; ≥50%), and semi-curtailed sampling has been shown to effectively reduce the number of observations needed to reach a decision. To date the statistical underpinnings for Multiple Category-LQAS (MC-LQAS) have not received full treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
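The operating characteristics of an MC-LQAS design follow directly from binomial probabilities of the decision rule. A sketch for a three-category design (the decision thresholds d1 and d2 below are illustrative assumptions, not the paper's published rules):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability of exactly k positives in n samples with prevalence p."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def classification_probs(n, p, d1, d2):
    """Probability of each MC-LQAS category for true prevalence p.

    Classify 'low' if positives <= d1, 'high' if positives > d2, else 'medium'.
    Sweeping p over [0, 1] traces the operating characteristic curves.
    """
    low = sum(binom_pmf(k, n, p) for k in range(0, d1 + 1))
    high = sum(binom_pmf(k, n, p) for k in range(d2 + 1, n + 1))
    return low, 1.0 - low - high, high

# Illustrative decision rules for a design with n = 15 children per school.
low, med, high = classification_probs(15, 0.05, d1=2, d2=7)
```

With a true prevalence of 5%, such a design classifies the school as low-prevalence with high probability, which is the behaviour the OC curves quantify.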
Texture classification using autoregressive filtering
Lawton, W. M.; Lee, M.
1984-01-01
A general theory of image texture models is proposed and its applicability to the problem of scene segmentation using texture classification is discussed. An algorithm, based on half-plane autoregressive filtering, which optimally utilizes second order statistics to discriminate between texture classes represented by arbitrary wide sense stationary random fields is described. Empirical results of applying this algorithm to natural and synthesized scenes are presented and future research is outlined.
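The paper's filters are two-dimensional half-plane autoregressive models; as a much-reduced illustration of the same prediction-error idea, a 1-D AR(1) fit along a scan line might look like this (the helper names are hypothetical, not from the paper):

```python
def ar1_coefficient(x):
    """Least-squares AR(1) coefficient a in the model x[t] ≈ a * x[t-1]."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    return num / den

def prediction_error_variance(x, a):
    """Variance of AR(1) residuals; a simple per-texture discriminant statistic."""
    resid = [x[t] - a * x[t - 1] for t in range(1, len(x))]
    return sum(r * r for r in resid) / len(resid)

# A perfectly smooth scan line is predicted exactly: zero residual variance.
line = [1.0, 1.0, 1.0, 1.0, 1.0]
a = ar1_coefficient(line)
err = prediction_error_variance(line, a)
```

Different textures yield different AR coefficients and residual variances, and classifying on those statistics is the 1-D analogue of the half-plane discrimination scheme.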
Haldane's statistical interactions and universal properties of anyon systems
International Nuclear Information System (INIS)
Protogenov, A.
1995-03-01
The exclusion principle of fractional statistics proposed by Haldane is applied to systems with internal degrees of freedom. The symmetry of these systems is included in the statistical interaction matrix, which contains the Cartan matrix of Lie algebras. The solutions of the equations for the statistical weights, which coincide with the thermodynamic Bethe ansatz equations, are determined in the high temperature limit by the squares of q-deformed dimensions of irreducible representations. The entropy and other thermodynamic properties of anyon systems in this limit are completely characterized by the algebraic structure of symmetry in the universal form. (author). 39 refs
Tongue Images Classification Based on Constrained High Dispersal Network
Directory of Open Access Journals (Sweden)
Dan Meng
2017-01-01
Full Text Available Computer aided tongue diagnosis has a great potential to play important roles in traditional Chinese medicine (TCM). However, the majority of the existing tongue image analyses and classification methods are based on low-level features, which may not provide a holistic view of the tongue. Inspired by deep convolutional neural networks (CNN), we propose a novel feature extraction framework called constrained high dispersal neural networks (CHDNet) to extract unbiased features and reduce human labor for tongue diagnosis in TCM. Previous CNN models have mostly focused on learning convolutional filters and adapting weights between them, but these models have two major issues: redundancy and insufficient capability in handling unbalanced sample distributions. We introduce high dispersal and local response normalization operations to address the issue of redundancy. We also add multiscale feature analysis to avoid the problem of sensitivity to deformation. Our proposed CHDNet learns high-level features and provides more classification information during training time, which may result in higher accuracy when predicting testing samples. We tested the proposed method on a set of 267 gastritis patients and a control group of 48 healthy volunteers. Test results show that CHDNet is a promising method in tongue image classification for the TCM study.
On the statistical assessment of classifiers using DNA microarray data
Directory of Open Access Journals (Sweden)
Carella M
2006-08-01
Full Text Available Abstract Background In this paper we present a method for the statistical assessment of cancer predictors which make use of gene expression profiles. The methodology is applied to a new data set of microarray gene expression data collected in Casa Sollievo della Sofferenza Hospital, Foggia, Italy. The data set is made up of normal (22) and tumor (25) specimens extracted from 25 patients affected by colon cancer. We propose to give answers to some questions which are relevant for the automatic diagnosis of cancer, such as: Is the size of the available data set sufficient to build accurate classifiers? What is the statistical significance of the associated error rates? In what ways can accuracy be considered dependent on the adopted classification scheme? How many genes are correlated with the pathology and how many are sufficient for an accurate colon cancer classification? The method we propose answers these questions whilst avoiding the potential pitfalls hidden in the analysis and interpretation of microarray data. Results We estimate the generalization error, evaluated through the Leave-K-Out Cross Validation error, for three different classification schemes by varying the number of training examples and the number of genes used. The statistical significance of the error rate is measured by using a permutation test. We provide a statistical analysis in terms of the frequencies of the genes involved in the classification. Using the whole set of genes, we found that the Weighted Voting Algorithm (WVA) classifier learns the distinction between normal and tumor specimens with 25 training examples, providing e = 21% (p = 0.045) as an error rate. This remains constant even when the number of examples increases. Moreover, Regularized Least Squares (RLS) and Support Vector Machines (SVM) classifiers can learn with only 15 training examples, with an error rate of e = 19% (p = 0.035) and e = 18% (p = 0.037) respectively. Moreover, the error rate
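The permutation test used above to attach p-values to error rates can be sketched generically: shuffle the class labels many times and count how often the shuffled-label error is at least as good as the observed one. The toy error function below is an assumption for illustration, not the paper's classifiers.

```python
import random

def permutation_p_value(labels, error_fn, observed_error, n_perm=1000, seed=0):
    """Fraction of label permutations whose error is <= the observed error.

    The +1 correction keeps the p-value strictly positive, as is conventional.
    """
    rng = random.Random(seed)
    count = 0
    for _ in range(n_perm):
        shuffled = labels[:]
        rng.shuffle(shuffled)
        if error_fn(shuffled) <= observed_error:
            count += 1
    return (count + 1) / (n_perm + 1)

# Toy check: fixed predictions that match the true labels exactly.
true = [0] * 10 + [1] * 10
preds = true[:]
error_fn = lambda labs: sum(p != l for p, l in zip(preds, labs)) / len(labs)
p_val = permutation_p_value(true, error_fn, observed_error=error_fn(true))
```

Because shuffled labels almost never reproduce a zero error rate, the resulting p-value is very small, mirroring how the paper judges whether e = 21% is better than chance.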
Data classification and MTBF prediction with a multivariate analysis approach
International Nuclear Information System (INIS)
Braglia, Marcello; Carmignani, Gionata; Frosolini, Marco; Zammori, Francesco
2012-01-01
The paper presents a multivariate statistical approach that supports the classification of mechanical components, subjected to specific operating conditions, in terms of the Mean Time Between Failure (MTBF). Assessing the influence of working conditions and/or environmental factors on the MTBF is a prerequisite for the development of an effective preventive maintenance plan. However, this task may be demanding and it is generally performed with ad-hoc experimental methods lacking statistical rigor. To solve this common problem, a step-by-step multivariate data classification technique is proposed. Specifically, a set of structured failure data are classified in a meaningful way by means of: (i) cluster analysis, (ii) multivariate analysis of variance, (iii) feature extraction and (iv) predictive discriminant analysis. This makes it possible not only to define the MTBF of the analyzed components, but also to identify the working parameters that explain most of the variability of the observed data. The approach is finally demonstrated on 126 centrifugal pumps installed in an oil refinery plant; the obtained results demonstrate the quality of the final discrimination, in terms of data classification and failure prediction.
IRIS COLOUR CLASSIFICATION SCALES – THEN AND NOW
Grigore, Mariana; Avram, Alina
2015-01-01
Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale developed in 1843, there have been numerous attempts to classify iris colour. In past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual's eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability over time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris and dedicated iris colour analysis software have all accomplished objective, accurate iris colour classification, but are quite expensive and limited in use to the research environment. Iris colour classification systems have evolved continuously due to their use in a wide range of studies, especially in the fields of anthropology, epidemiology and genetics. Despite the wide range of existing scales, up until the present there has been no generally accepted iris colour classification scale. PMID:27373112
Featureless classification of light curves
Kügler, S. D.; Gianniotis, N.; Polsterer, K. L.
2015-08-01
In the era of rapidly increasing amounts of time series data, classification of variable objects has become the main objective of time-domain astronomy. Classification of irregularly sampled time series is particularly difficult because the data cannot be represented naturally as a vector which can be directly fed into a classifier. In the literature, various statistical features serve as vector representations. In this work, we represent time series by a density model. The density model captures all the information available, including measurement errors. Hence, we view this model as a generalization of the static features which can be derived directly from the density, e.g. as moments. Similarity between each pair of time series is quantified by the distance between their respective models. Classification is performed on the obtained distance matrix. In the numerical experiments, we use data from the OGLE (Optical Gravitational Lensing Experiment) and ASAS (All Sky Automated Survey) surveys and demonstrate that the proposed representation performs on par with the best currently used feature-based approaches. The density representation preserves all static information present in the observational data, in contrast to a less complete description by features. The density representation is an upper boundary in terms of information made available to the classifier. Consequently, the predictive power of the proposed classification depends only on the choice of similarity measure and classifier. Due to its principled nature, we advocate that this new approach of representing time series has potential in tasks beyond classification, e.g. unsupervised learning.
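As a much-simplified stand-in for the paper's density models (which also encode measurement errors), one can fit a 1-D Gaussian to each light curve's magnitudes and use a symmetrised Kullback-Leibler divergence as the pairwise distance that fills the distance matrix:

```python
from math import log

def fit_gaussian(series):
    """Crude density model: mean and (biased) variance of the observed magnitudes."""
    n = len(series)
    mu = sum(series) / n
    var = sum((x - mu) ** 2 for x in series) / n
    return mu, var

def kl_gauss(p, q):
    """Closed-form KL divergence between two 1-D Gaussians given as (mu, var)."""
    (mu1, v1), (mu2, v2) = p, q
    return 0.5 * (log(v2 / v1) + (v1 + (mu1 - mu2) ** 2) / v2 - 1.0)

def symmetric_distance(a, b):
    """Symmetrised KL between the density models of two light curves."""
    pa, pb = fit_gaussian(a), fit_gaussian(b)
    return kl_gauss(pa, pb) + kl_gauss(pb, pa)
```

A distance-based classifier (e.g. nearest neighbour) then operates directly on the matrix of `symmetric_distance` values, which is the structure the abstract describes.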
Classification and regression trees
Breiman, Leo; Olshen, Richard A; Stone, Charles J
1984-01-01
The methodology used to construct tree structured rules is the focus of this monograph. Unlike many other statistical procedures, which moved from pencil and paper to calculators, this text's use of trees was unthinkable before computers. Both the practical and theoretical sides have been developed in the authors' study of tree methods. Classification and Regression Trees reflects these two sides, covering the use of trees as a data analysis method, and in a more mathematical framework, proving some of their fundamental properties.
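The core CART step, exhaustively searching for the split that minimises an impurity measure such as the Gini index, can be sketched for a single numeric feature (toy data and binary labels, for illustration only):

```python
def gini(labels):
    """Gini impurity of a label multiset: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    probs = [labels.count(c) / n for c in set(labels)]
    return 1.0 - sum(p * p for p in probs)

def best_split(xs, ys):
    """Exhaustive search for the threshold minimising weighted Gini impurity."""
    best = (None, float("inf"))
    for thr in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= thr]
        right = [y for x, y in zip(xs, ys) if x > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (thr, score)
    return best

# Two well-separated groups: the split at 3 yields pure children.
xs = [1, 2, 3, 10, 11, 12]
ys = [0, 0, 0, 1, 1, 1]
thr, score = best_split(xs, ys)   # thr == 3, score == 0.0
```

A full tree is grown by applying this search recursively to each child node, with pruning governing the bias-variance trade-off the monograph analyses.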
Hill, C. L.
1984-01-01
A computer-implemented classification has been derived from Landsat-4 Thematic Mapper data acquired over Baldwin County, Alabama on January 15, 1983. One set of spectral signatures was developed from the data by utilizing a 3x3 pixel sliding window approach. An analysis of the classification produced from this technique identified forested areas. Additional information regarding only the forested areas was extracted by employing a pixel-by-pixel signature development program which derived spectral statistics only for pixels within the forested land covers. The spectral statistics from both approaches were integrated and the data classified. This classification was evaluated by comparing the spectral classes produced from the data against corresponding ground verification polygons. This iterative data analysis technique resulted in an overall classification accuracy of 88.4 percent correct for slash pine, young pine, loblolly pine, natural pine, and mixed hardwood-pine. An accuracy assessment matrix has been produced for the classification.
Application of TOPEX Altimetry for Solid Earth Deformation Studies
Directory of Open Access Journals (Sweden)
Hyongki Lee
2008-01-01
Full Text Available This study demonstrates the use of satellite radar altimetry to detect solid Earth deformation signals such as Glacial Isostatic Adjustment (GIA). Our study region covers moderately flat land surfaces seasonally covered by snow/ice/vegetation. The maximum solid Earth uplift of ~10 mm yr-1 is primarily due to the incomplete glacial isostatic rebound that occurs around Hudson Bay, North America. We use decadal (1992-2002) surface height measurements from TOPEX/POSEIDON radar altimetry to generate height change time series for 12 selected locations in the study region. Due to the seasonally varying surface characteristics, we first perform radar waveform shape classification and have found that most of the waveforms are quasi-diffuse during winter/spring and specular during summer/fall. As a result, we used the NASA β-retracker for the quasi-diffuse waveforms and the Offset Center of Gravity or the threshold retracker for the specular waveforms to generate the surface height time series. The TOPEX height change time series exhibit coherent seasonal signals (higher amplitude during the winter and lower amplitude during the summer), and the estimated deformation rates agree qualitatively well with GPS vertical velocities, and with altimeter/tide gauge combined vertical velocities around the Great Lakes. The TOPEX observations also agree well with various GIA model predictions, especially with the ICE-5G (VM2) model, with differences of 0.2 ± 1.4 mm yr-1, indicating that TOPEX has indeed observed solid Earth deformation signals manifested as crustal uplift over the former Laurentide Ice Sheet region.
A new classification for cochleovestibular malformations.
Sennaroglu, Levent; Saatci, Isil
2002-12-01
The report proposes a new classification system for inner ear malformations, based on radiological features of inner ear malformations reviewed in 23 patients. The investigation took the form of a retrospective review of computerized tomography findings relating to the temporal bone in 23 patients (13 male and 10 female patients) with inner ear malformations. The subjects were patients with profound bilateral sensorineural hearing loss who had all had high-resolution computed tomography (CT) with contiguous 1-mm-thick images obtained through the petrous bone in axial sections. The CT results were reviewed for malformations of the bony otic capsule under the following subgroups: cochlear, vestibular, semicircular canal, internal auditory canal (IAC), and vestibular and cochlear aqueduct malformations. Cochlear malformations were classified as Michel deformity, common cavity deformity, cochlear aplasia, hypoplastic cochlea, and incomplete partition types I (IP-I) and II (IP-II, the Mondini deformity). Incomplete partition type I (cystic cochleovestibular malformation) is defined as a malformation in which the cochlea lacks the entire modiolus and cribriform area, resulting in a cystic appearance, and there is an accompanying large cystic vestibule. In IP-II (the Mondini deformity), there is a cochlea consisting of 1.5 turns (in which the middle and apical turns coalesce to form a cystic apex) accompanied by a dilated vestibule and enlarged vestibular aqueduct. Four patients demonstrated anomalies involving only one inner ear component. All the remaining patients had diseases or conditions affecting more than one inner ear component. Eight ears had IP-I, and 10 patients had IP-II. Ears with IP-I had large cystic vestibules, whereas the amount of dilation was minimal in patients with IP-II. The majority of the semicircular canals (67%) were normal. Semicircular canal aplasia accompanied cases of Michel deformity, cochlear hypoplasia, and common cavity. In 14 ears, the IAC had a
Huda, M. M.; Siregar, E.; Ismah, N.
2017-08-01
Stainless steel bracket slot deformation affects the force applied to teeth, and it can impede tooth movement and prolong orthodontic treatment time. The aim of this study is to determine the slot deformation due to torque of a 0.021 × 0.025 inch Beta Titanium wire with a torsional angle of 30° and 45° for five different bracket brands: 3M, Biom, Versadent, Ormco, and Shinye. The research also aims to compare the deformation and amount of torque among all five bracket brands at torsional angles of 30° and 45°. Fifty stainless steel edgewise brackets from the five bracket brand groups (n=10) were attached to acrylic plates. The bracket slot measurements were carried out in two stages. In the first stage, the deformation was measured by calculating the average bracket slot height using a stereoscopic microscope before and after application of torque. In the second stage, the torque was measured using a torque measurement apparatus. The statistical analysis shows that slot deformations were found on all five bracket brands, with clinically permanent deformation on the Biom (2.79 μm) and Shinye (2.29 μm) brackets. The most torque was observed on the 3M bracket, followed by the Ormco, Versadent, Shinye, and Biom brackets. When the brands were compared, a correlation between bracket slot deformation and the amount of torque was found, but the correlation was not statistically significant for the 3M and Ormco brackets and the Biom and Shinye brackets. There is a difference in the amount of torque between the five brands with a torsional angle of 30° (except the 3M and Ormco brackets) and those with a torsional angle of 45°. The composition of the metal and the manufacturing process are the factors that influence the occurrence of bracket slot deformation and the amount of torque. A manufacturing process using metal injection molding (MIM) and metal compositions of AISI 303 and 17-4 PH stainless steel reduce the risk of deformation.
Deformed statistics Kullback–Leibler divergence minimization within a scaled Bregman framework
International Nuclear Information System (INIS)
Venkatesan, R.C.; Plastino, A.
2011-01-01
The generalized Kullback–Leibler divergence (K–Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K–Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K–Ld is a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle using the dual generalized K–Ld as the measure of uncertainty, with constraints defined by normal averages. The minimization of the dual generalized K–Ld, with normal averages constraints, is shown to exhibit distinctly unique features. -- Highlights: ► Dual generalized Kullback–Leibler divergence (K–Ld) proven to be scaled Bregman divergence in continuous measure-theoretic framework. ► Minimum dual generalized K–Ld condition established with normal averages constraints. ► Pythagorean theorem derived.
Hyperspectral Image Classification Using Discriminative Dictionary Learning
International Nuclear Information System (INIS)
Zongze, Y; Hao, S; Kefeng, J; Huanxin, Z
2014-01-01
The hyperspectral image (HSI) processing community has witnessed a surge of papers focusing on the utilization of sparse priors for effective HSI classification. In sparse representation based HSI classification, there are two phases: sparse coding with an over-complete dictionary, and classification. In this paper, we first apply a novel Fisher discriminative dictionary learning method, which captures the relative differences between classes. The competitive selection strategy ensures that atoms in the resulting over-complete dictionary are the most discriminative. Secondly, motivated by the assumption that spatially adjacent samples are statistically related and even belong to the same material (same class), we propose a majority voting scheme incorporating contextual information to predict the category label. Experimental results show that the proposed method can effectively strengthen the relative discrimination of the constructed dictionary, and incorporating the majority voting scheme generally achieves improved prediction performance
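The contextual majority-voting step can be sketched directly; the neighbourhood labels below are placeholders for the per-pixel predictions of spatially adjacent samples, not output from the paper's dictionary-learning stage.

```python
from collections import Counter

def majority_vote(center_label, neighbor_labels):
    """Predict the most common class among a pixel and its spatial neighbours.

    Ties are broken by first-encountered order in the vote list, with the
    centre pixel's own prediction counted first.
    """
    votes = Counter([center_label] + list(neighbor_labels))
    return votes.most_common(1)[0][0]

# A pixel initially labelled 2 is overruled by three class-1 neighbours.
smoothed = majority_vote(2, [1, 1, 1, 2])
```

Applied over a sliding window, this replaces isolated misclassified pixels with the locally dominant class, which is how contextual information improves the final label map.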
Statistical model for prediction of hearing loss in patients receiving cisplatin chemotherapy.
Johnson, Andrew; Tarima, Sergey; Wong, Stuart; Friedland, David R; Runge, Christina L
2013-03-01
This statistical model might be used to predict cisplatin-induced hearing loss, particularly in patients undergoing concomitant radiotherapy. To create a statistical model based on pretreatment hearing thresholds to provide an individual probability for hearing loss from cisplatin therapy and, secondarily, to investigate the use of hearing classification schemes as predictive tools for hearing loss. Retrospective case-control study. Tertiary care medical center. A total of 112 subjects receiving chemotherapy and audiometric evaluation were evaluated for the study. Of these subjects, 31 met inclusion criteria for analysis. The primary outcome measurement was a statistical model providing the probability of hearing loss following the use of cisplatin chemotherapy. Fifteen of the 31 subjects had significant hearing loss following cisplatin chemotherapy. American Academy of Otolaryngology-Head and Neck Society and Gardner-Robertson hearing classification schemes revealed little change in hearing grades between pretreatment and posttreatment evaluations for subjects with or without hearing loss. The Chang hearing classification scheme could effectively be used as a predictive tool in determining hearing loss with a sensitivity of 73.33%. Pretreatment hearing thresholds were used to generate a statistical model, based on quadratic approximation, to predict hearing loss (C statistic = 0.842, cross-validated = 0.835). The validity of the model improved when only subjects who received concurrent head and neck irradiation were included in the analysis (C statistic = 0.91). A calculated cutoff of 0.45 for predicted probability has a cross-validated sensitivity and specificity of 80%. Pretreatment hearing thresholds can be used as a predictive tool for cisplatin-induced hearing loss, particularly with concomitant radiotherapy.
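The reported probability cutoff can be read operationally as follows; a minimal sketch with toy probabilities and outcomes (the data are invented — only the 0.45 cutoff comes from the abstract):

```python
def sens_spec(probs, outcomes, cutoff=0.45):
    """Sensitivity and specificity of a probability cutoff: predict
    hearing loss when the model's predicted probability meets the cutoff.
    The 0.45 default mirrors the cutoff reported in the abstract."""
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= cutoff and y == 1)
    fn = sum(1 for p, y in zip(probs, outcomes) if p < cutoff and y == 1)
    tn = sum(1 for p, y in zip(probs, outcomes) if p < cutoff and y == 0)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= cutoff and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# toy predicted probabilities and observed hearing-loss outcomes
sens, spec = sens_spec([0.9, 0.2, 0.6, 0.1], [1, 1, 0, 0])
```

In the study both quantities were 80% at this cutoff under cross-validation; the toy data above merely demonstrate the computation.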
Liu, Ling; Onck, Patrick R.
2017-08-01
Azobenzene-embedded liquid crystal polymers can undergo mechanical deformation in response to ultraviolet (UV) light. The naturally rodlike trans-state azobenzene absorbs UV light and isomerizes to a bentlike cis state, which disturbs the order of the polymer network, leading to an anisotropic deformation. The current consensus is that the magnitude of the photoinduced deformation is related to the statistical accumulation of molecules in the cis state. However, a recent experimental study [Liu and Broer, Nat. Commun. 6, 8334 (2015), doi:10.1038/ncomms9334] shows that a drastic (fourfold) increase of the photoinduced deformation can be generated by exposing the samples simultaneously to 365 nm (UV) and 455 nm (visible) light. To elucidate the physical mechanism that drives this increase, we develop a two-light attenuation model and an optomechanical constitutive relation that accounts not only for the statistical accumulation of cis azobenzenes, but also for the dynamic trans-cis-trans oscillatory isomerization process. Our experimentally calibrated model predicts that the optimal single-wavelength exposure is 395 nm light, a pronounced shift towards the visible spectrum. In addition, we identify a range of optimal combinations of two-wavelength lights that generate a favorable response for a given amount of injected energy. Our model provides mechanistic insight into the different (multi)wavelength exposures used in experiments and, at the same time, opens new avenues towards enhanced, multiwavelength optomechanical behavior.
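The competition between UV pumping and visible-light back-conversion can be illustrated with a two-level steady-state balance; this is a deliberately simplified sketch (all rate constants are hypothetical placeholders) that omits the depth attenuation and the oscillatory trans-cis-trans dynamics of the full model:

```python
def cis_fraction(i_uv, i_vis, k_tc=1.0, k_ct=0.5, k_th=0.01):
    """Steady-state cis fraction when trans->cis pumping (proportional to
    the 365 nm intensity i_uv) competes with cis->trans back-conversion
    driven by the 455 nm intensity i_vis plus thermal relaxation k_th.
    Rate constants are hypothetical, not fitted values."""
    forward = k_tc * i_uv
    backward = k_ct * i_vis + k_th
    return forward / (forward + backward)
```

Adding visible light lowers the steady-state cis fraction but speeds up the trans-cis-trans cycling, which is the effect the paper's model captures beyond this static picture.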
Sitter, de L.U.
1937-01-01
§ 1. Plastic deformation of solid matter under high confining pressures has been insufficiently studied. Jeffreys [1] devotes a few paragraphs to the deformation of solid matter as a preface to his chapter on the isostasy problem. He distinguishes two properties of solid matter with regard to its
Classification across gene expression microarray studies
Directory of Open Access Journals (Sweden)
Kuner Ruprecht
2009-12-01
Full Text Available Abstract Background The increasing number of gene expression microarray studies represents an important resource in biomedical research. As a result, gene expression based diagnosis has entered clinical practice for patient stratification in breast cancer. However, the integration and combined analysis of microarray studies still remains a challenge. We assessed the potential benefit of data integration on the classification accuracy and systematically evaluated the generalization performance of selected methods on four breast cancer studies comprising almost 1000 independent samples. To this end, we introduced an evaluation framework which aims to establish good statistical practice and a graphical way to monitor differences. The classification goal was to correctly predict estrogen receptor status (negative/positive) and histological grade (low/high) of each tumor sample in an independent study which was not used for the training. For the classification we chose support vector machines (SVM), predictive analysis of microarrays (PAM), random forest (RF) and k-top scoring pairs (kTSP). Guided by considerations relevant for classification across studies we developed a generalization of kTSP which we evaluated in addition. Our derived version (DV) aims to improve the robustness of the intrinsic invariance of kTSP with respect to technologies and preprocessing. Results For each individual study the generalization error was benchmarked via complete cross-validation and was found to be similar for all classification methods. The misclassification rates were substantially higher in classification across studies, when each single study was used as an independent test set while all remaining studies were combined for the training of the classifier. However, with an increasing number of independent microarray studies used in the training, the overall classification performance improved. DV performed better than the average and showed slightly less variance. In
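The cross-study protocol described above (each study held out in turn as the test set, the remaining studies combined for training) can be sketched generically; `train_fn` and `test_fn` are placeholders for any of the four classifiers:

```python
def leave_one_study_out(studies, train_fn, test_fn):
    """Classification across studies: hold out each study as the test set
    and train on the union of the others. `studies` maps study name to
    its data; `train_fn` builds a model, `test_fn` returns its error."""
    errors = {}
    for name in studies:
        train = [s for n, s in studies.items() if n != name]
        model = train_fn(train)
        errors[name] = test_fn(model, studies[name])
    return errors
```

This is exactly the setting in which the abstract reports substantially higher misclassification rates than within-study cross-validation.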
A novel Neuro-fuzzy classification technique for data mining
Directory of Open Access Journals (Sweden)
Soumadip Ghosh
2014-11-01
Full Text Available In our study, we proposed a novel Neuro-fuzzy classification technique for data mining. The inputs to the Neuro-fuzzy classification system were fuzzified by applying a generalized bell-shaped membership function. The proposed method utilized a fuzzification matrix in which the input patterns were associated with a degree of membership to different classes. Based on the degree of membership, a pattern is attributed to a specific category or class. We applied our method to ten benchmark data sets from the UCI machine learning repository for classification. Our objective was to analyze the proposed method and compare its performance with two powerful supervised classification algorithms: Radial Basis Function Neural Network (RBFNN) and Adaptive Neuro-fuzzy Inference System (ANFIS). We assessed the performance of these classification methods in terms of different performance measures such as accuracy, root-mean-square error, kappa statistic, true positive rate, false positive rate, precision, recall, and F-measure. In every aspect the proposed method proved to be superior to the RBFNN and ANFIS algorithms.
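The generalized bell-shaped membership function used for fuzzification has a standard closed form; this sketch uses the common three-parameter version (the paper's exact parameterization may differ):

```python
def gbell(x, a, b, c):
    """Generalized bell-shaped membership function
    1 / (1 + |(x - c) / a|^(2b)): value 1 at the centre c, half-height
    at x = c ± a, with b controlling the steepness of the shoulders."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))
```

Evaluating each input feature against one such function per class yields the fuzzification matrix of membership degrees described in the abstract.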
Energy Technology Data Exchange (ETDEWEB)
Desbat, L. [Universite Joseph Fourier, UMR CNRS 5525, 38 - Grenoble (France); Roux, S. [Universite Joseph Fourier, TIMC-IMAG, In3S, Faculte de Medecine, 38 - Grenoble (France)]|[CEA Grenoble, Lab. d' Electronique et de Technologie de l' Informatique (LETI), 38 (France); Grangeat, P. [CEA Grenoble, Lab. d' Electronique et de Technologie de l' Informatique (LETI), 38 (France)
2005-07-01
This work is a contribution to the compensation of motion in tomography. New classes of deformation are proposed that can be compensated analytically within a filtered back-projection (FBP) type reconstruction algorithm. This work generalizes the known results for affine deformations, in parallel and fan-beam geometries, to deformation classes of infinite dimension able to include strong nonlinearities. (N.C.)
Classification d'images RSO polarimétriques à haute résolution spatiale sur site urbain.
Soheili Majd , Maryam
2014-01-01
In this research, our aim is to assess the potential of a single-look, high spatial resolution polarimetric radar image for the classification of urban areas. For that purpose, we concentrate on classes corresponding to different kinds of roofs, objects and ground surfaces. First, we propose a uni-variate statistical analysis of polarimetric and texture attributes that can be used in a classification algorithm. We perform a statistical analysis of descriptors and show that the Fisher di...
Arpacı, Hande; Çomu, Faruk Metin; Küçük, Ayşegül; Kösem, Bahadır; Kartal, Seyfi; Şıvgın, Volkan; Turgut, Hüseyin Cihad; Aydın, Muhammed Enes; Koç, Derya Sebile; Arslan, Mustafa
2016-01-01
Change in blood supply is held responsible for anesthesia-related abnormal tissue and organ perfusion. Decreased erythrocyte deformability and increased aggregation may be detected after surgery performed under general anesthesia. It has been shown that nonsteroidal anti-inflammatory drugs decrease erythrocyte deformability. Lornoxicam and/or intravenous (iv) ibuprofen are commonly preferred analgesic agents for postoperative pain management. In this study, we aimed to investigate the effects of lornoxicam (2 mg/kg, iv) and ibuprofen (30 mg/kg, iv) on erythrocyte deformability, as well as hepatic and renal blood flows, in male rats. Eighteen male Wistar albino rats were randomly divided into three groups as follows: iv lornoxicam-treated group (Group L), iv ibuprofen-treated group (Group I), and control group (Group C). Drug administration was carried out by the iv route in all groups except Group C. Hepatic and renal blood flows were studied by laser Doppler, and euthanasia was performed via intra-abdominal blood withdrawal. Erythrocyte deformability was measured using a constant-flow filtrometry system. Lornoxicam and ibuprofen increased the relative resistance, an indicator of erythrocyte deformability, in rats (P=0.016). Comparison of the results from Group L and Group I revealed no statistically significant differences (P=0.694), although the relative resistance levels in Group L and Group I were statistically higher than those observed in Group C (P=0.018 and P=0.008, respectively). Hepatic and renal blood flows in the treated groups were significantly lower than those in Group C. We believe that lornoxicam and ibuprofen may lead to functional disorders related to renal and liver tissue perfusion secondary to both decreased blood flow and decreased erythrocyte deformability. Further studies regarding these issues are thought to be essential.
Tumor Classification Using High-Order Gene Expression Profiles Based on Multilinear ICA
Directory of Open Access Journals (Sweden)
Ming-gang Du
2009-01-01
Full Text Available Motivation. Independent Components Analysis (ICA) maximizes the statistical independence of the representational components of a training gene expression profile (GEP) ensemble, but it cannot distinguish relations between the different factors, or different modes, and it is not applicable to high-order GEP data mining. In order to generalize ICA, we introduce Multilinear-ICA and apply it to tumor classification using high-order GEP. Firstly, we introduce the basic concepts and operations of tensors, the Support Vector Machine (SVM) classifier, and Multilinear-ICA. Secondly, the higher-scoring genes of the original high-order GEP are selected using t-statistics and tabulated as tensors. Thirdly, the tensors are processed by Multilinear-ICA. Finally, the SVM is used to classify the tumor subtypes. Results. To show the validity of the proposed method, we apply it to tumor classification using high-order GEP. Though we only use three datasets, the experimental results show that the method is effective and feasible. Through this survey, we hope to gain some insight into the problem of high-order GEP tumor classification, in aid of further developing more effective tumor classification algorithms.
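The gene-selection step can be illustrated with the standard two-sample t-statistic (Welch form; the abstract does not specify which variant the authors used):

```python
import math

def t_statistic(group_a, group_b):
    """Two-sample (Welch) t-statistic for one gene's expression values in
    two tumor classes; genes with the largest |t| are retained before the
    retained profiles are tabulated as tensors."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)
```

Ranking all genes by |t| and keeping the top scorers is the filter that shrinks the profiles to a tractable tensor size.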
DEFF Research Database (Denmark)
Debus, Michael S.
2017-01-01
This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007) to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications’ internal consistency, the abstraction of classification criteria and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors...... into the topic of game classifications....
A Classification-based Review Recommender
O'Mahony, Michael P.; Smyth, Barry
Many online stores encourage their users to submit product/service reviews in order to guide future purchasing decisions. These reviews are often listed alongside product recommendations but, to date, limited attention has been paid to how best to present these reviews to the end-user. In this paper, we describe a supervised classification approach that is designed to identify and recommend the most helpful product reviews. Using the TripAdvisor service as a case study, we compare the performance of several classification techniques using a range of features derived from hotel reviews. We then describe how these classifiers can be used as the basis for a practical recommender that automatically suggests the most helpful contrasting reviews to end-users. We present an empirical evaluation which shows that our approach achieves a statistically significant improvement over alternative review ranking schemes.
Olives, Casey; Valadez, Joseph J; Brooker, Simon J; Pagano, Marcello
2012-01-01
Originally a binary classifier, Lot Quality Assurance Sampling (LQAS) has proven to be a useful tool for classification of the prevalence of Schistosoma mansoni into multiple categories (≤10%, >10% and <50%, ≥50%), but multiple-category LQAS (MC-LQAS) has not received full analytical treatment. We explore the analytical properties of MC-LQAS, and validate its use for the classification of S. mansoni prevalence in multiple settings in East Africa. We outline MC-LQAS design principles and formulae for operating characteristic curves. In addition, we derive the average sample number for MC-LQAS when utilizing semi-curtailed sampling and introduce curtailed sampling in this setting. We also assess the performance of MC-LQAS designs with maximum sample sizes of n=15 and n=25 via a weighted kappa-statistic using S. mansoni data collected in 388 schools from four studies in East Africa. Overall performance of MC-LQAS classification was high (kappa-statistic of 0.87). In three of the studies, the kappa-statistic for a design with n=15 was greater than 0.75. In the fourth study, where these designs performed poorly (kappa-statistic less than 0.50), the majority of observations fell in regions where potential error is known to be high. Employment of semi-curtailed and curtailed sampling further reduced the sample size by as many as 0.5 and 3.5 observations per school, respectively, without increasing classification error. This work provides the needed analytics to understand the properties of MC-LQAS for assessing the prevalence of S. mansoni and shows that in most settings a sample size of 15 children provides a reliable classification of schools.
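An operating characteristic for a single LQAS decision threshold follows directly from the binomial distribution; the decision rule and thresholds below are illustrative, not the study's design values:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_operating_characteristic(n, d, p):
    """Probability that a lot (school) with true prevalence p is classified
    'low' under the illustrative rule: classify low if at most d of the
    n sampled children test positive."""
    return binom_cdf(d, n, p)
```

Plotting this probability against p traces one operating characteristic curve; MC-LQAS stacks several such rules to separate the multiple prevalence categories.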
Studies of deformation-induced texture development in sheet materials using diffraction techniques
International Nuclear Information System (INIS)
Banovic, S.W.; Vaudin, M.D.; Gnaeupel-Herold, T.H.; Saylor, D.M.; Rodbell, K.P
2004-01-01
Crystallographic texture measurements were made on a series of rolled aluminum sheet specimens deformed in equi-biaxial tension up to a strain level of 0.11. The measurement techniques used were neutron diffraction with a 4-circle goniometer, electron backscatter diffraction, conventional powder X-ray diffraction (XRD), and XRD using an area detector. Results indicated a complex texture orientation distribution function which altered in response to the applied plastic deformation. Increased deformation caused the {1 1 0} planes to align parallel to the plane of the sheet. The different techniques produced results that were very consistent with each other. The advantages and disadvantages of the various methods are discussed, with particular consideration of the time taken for each method, the range of orientation space accessible, the density of data that can be obtained, and the statistical significance of each data set with respect to rolled sheet product.
On the High Temperature Deformation Behaviour of 2507 Super Duplex Stainless Steel
Mishra, M. K.; Balasundar, I.; Rao, A. G.; Kashyap, B. P.; Prabhu, N.
2017-02-01
High temperature deformation behaviour of 2507 super duplex stainless steel was investigated by conducting isothermal hot compression tests. The dominant restoration processes in the ferrite and austenite phases present in the material were found to be distinct. The possible causes for these differences are discussed. Based on the dynamic materials model, a processing map was developed to identify the optimum processing parameters. The microstructural mechanisms operating in the material were identified. A unified strain-compensated constitutive equation was established to describe the high temperature deformation behaviour of the material under the identified processing conditions. Standard statistical parameters such as the correlation coefficient were used to validate the established equation.
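Strain-compensated constitutive equations for hot deformation are typically built on the sinh-Arrhenius (Zener-Hollomon) law; the sketch below inverts that law for the flow stress, with purely illustrative material constants rather than the values fitted in this study:

```python
import math

def flow_stress(strain_rate, T, A=1e12, alpha=0.01, n=4.5, Q=460e3, R=8.314):
    """Invert the sinh-Arrhenius law
        strain_rate = A * sinh(alpha * sigma)**n * exp(-Q / (R * T))
    for the flow stress sigma. A, alpha, n, Q are illustrative constants;
    in a strain-compensated model they become polynomials in strain."""
    Z = strain_rate * math.exp(Q / (R * T))   # Zener-Hollomon parameter
    return math.asinh((Z / A) ** (1.0 / n)) / alpha
```

The expected trends hold: flow stress rises with strain rate and falls with temperature, which is what the processing map exploits when locating safe deformation domains.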
Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.
2018-04-01
The Klein-Gordon equation is extended in the presence of an Aharonov-Bohm magnetic field for the Cornell potential, and the corresponding wave functions as well as the spectra are obtained. After introducing superstatistics in the statistical mechanics, we first derive the effective Boltzmann factor in the deformed formalism with a modified Dirac delta distribution. We then use the concepts of superstatistics to calculate the thermodynamic properties of the system. The well-known results are recovered when the deformation parameter vanishes, and some graphs are plotted for clarity.
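In Beck-Cohen superstatistics, the effective Boltzmann factor is the average of the ordinary factor over a distribution of inverse temperatures; for a χ² (Gamma) distribution one obtains the familiar Tsallis-type form (a general sketch of the framework, not this paper's modified-delta construction):

```latex
% Superstatistical (Beck--Cohen) effective Boltzmann factor:
B(E) = \int_0^\infty f(\beta)\, e^{-\beta E}\, \mathrm{d}\beta .

% For a \chi^2 (Gamma) distribution f(\beta) with mean \beta_0,
% this evaluates to the Tsallis-type factor:
B(E) = \bigl[\, 1 + (q - 1)\,\beta_0 E \,\bigr]^{-1/(q-1)} .
```

In the limit $q \to 1$ the second expression reduces to the ordinary factor $e^{-\beta_0 E}$, mirroring the abstract's statement that the standard results are recovered when the deformation parameter vanishes.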
DEFF Research Database (Denmark)
Wang, Ting; Guan, Sheng-Uei; Puthusserypady, Sadasivan
2014-01-01
Feature ordering is a significant data preprocessing method in Incremental Attribute Learning (IAL), a novel machine learning approach which gradually trains features according to a given order. Previous research has shown that, similar to feature selection, feature ordering is also important based...... estimation. Moreover, a criterion that summarizes all the produced values of AD is employed with a GA (Genetic Algorithm)-based approach to obtain the optimum feature ordering for classification problems based on neural networks by means of IAL. Compared with the feature ordering obtained by other approaches...
Estimation of Lithological Classification in Taipei Basin: A Bayesian Maximum Entropy Method
Wu, Meng-Ting; Lin, Yuan-Chien; Yu, Hwa-Lung
2015-04-01
In environmental and other scientific applications, we must have a certain understanding of lithological composition. Because of restrictions of real conditions, only a limited amount of data can be acquired. To find out the lithological distribution in the study area, many spatial statistical methods have been used to estimate the lithological composition at unsampled points or grids. This study applied the Bayesian Maximum Entropy (BME) method, an emerging method in the field of geological spatiotemporal statistics. The BME method can identify the spatiotemporal correlation of the data, and combine not only hard data but also soft data to improve estimation. Lithological classification data are discrete categorical data; therefore, this research applied Categorical BME to establish a complete three-dimensional lithological estimation model. We applied the limited hard data from the cores and the soft data generated from the geological dating data and the virtual wells to estimate the three-dimensional lithological classification in the Taipei Basin. Keywords: Categorical Bayesian Maximum Entropy method, Lithological Classification, Hydrogeological Setting
Applied multivariate statistics with R
Zelterman, Daniel
2015-01-01
This book brings the power of multivariate statistics to graduate-level practitioners, making these analytical methods accessible without lengthy mathematical derivations. Using the open source, shareware program R, Professor Zelterman demonstrates the process and outcomes for a wide array of multivariate statistical applications. Chapters cover graphical displays, linear algebra, univariate, bivariate and multivariate normal distributions, factor methods, linear regression, discrimination and classification, clustering, time series models, and additional methods. Zelterman uses practical examples from diverse disciplines to welcome readers from a variety of academic specialties. Those with backgrounds in statistics will learn new methods while they review more familiar topics. Chapters include exercises, real data sets, and R implementations. The data are interesting, real-world topics, particularly from health and biology-related contexts. As an example of the approach, the text examines a sample from the B...
Kumar, Jagadish; Ananthakrishna, G
2018-01-01
Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and in dilute metallic alloys exhibiting discontinuous flow. This is surprising since the underlying dislocation mechanisms in these two types of deformations are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow is even more complex, requiring multifractal measures for a proper characterization. There has been no model that explains the complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of the Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with a plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh-dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. The so-calculated acoustic emission signals are used for further statistical characterization. Our results show that the model predicts power-law statistics for all the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands with the exponent values increasing with increasing strain rate. The calculated multifractal spectra corresponding to the acoustic emission signals associated with the three band types have a maximum
Deformation of Prostate and Seminal Vesicles Relative to Intraprostatic Fiducial Markers
International Nuclear Information System (INIS)
Wielen, Gerard J. van der; Mutanga, Theodore F.; Incrocci, Luca; Kirkels, Wim J.; Vasquez Osorio, Eliana M.; Hoogeman, Mischa S.; Heijmen, Ben J.M.; Boer, Hans C.J. de
2008-01-01
Purpose: To quantify the residual geometric uncertainties after on-line corrections with intraprostatic fiducial markers, this study analyzed the deformation of the prostate and, in particular, the seminal vesicles relative to such markers. Patients and Methods: A planning computed tomography (CT) scan and three repeat CT scans were obtained for 21 prostate cancer patients who had had three to four cylindrical gold markers placed. The prostate and whole seminal vesicles (clinical target volume [CTV]) were delineated on each scan at a slice thickness of 1.5 mm. Rigid body transformations (translation and rotation) mapping the markers onto the planning scan positions were obtained. Either the translation only (T-only) or both translation and rotation were applied to the delineated CTVs. Next, the residual CTV surface displacements were determined using nonrigid registration of the delineated contours. For translation and rotation of the CTV, the residues represented deformation; for T-only, the residues stemmed from deformation and rotation. T-only represented the residues for most currently applied on-line protocols. The patient and population statistics of the CTV surface displacements were calculated. The intraobserver delineation variation was similarly quantified using repeat delineations for all patients and corrected for. Results: The largest CTV deformations were observed at the anterior and posterior side of the seminal vesicles (population average standard deviation ≤3 mm). Prostate deformation was small (standard deviation ≤1 mm). The increase in these deviations when neglecting rotation (T-only) was small. Conclusion: Although prostate deformation with respect to implanted fiducial markers was small, the corresponding deformation of the seminal vesicles was considerable. Adding marker-based rotational corrections to on-line translation corrections provided a limited reduction in the estimated planning margins.
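The rigid-body step (mapping the marker positions onto their planning-scan positions) is commonly solved in closed form with the Kabsch algorithm; a minimal sketch, assuming paired marker coordinates are available (the paper does not state which solver was used):

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid-body transform (rotation R, translation t)
    mapping marker positions P (n x 3) onto planning-scan positions Q
    (n x 3), i.e. q_i ~ R p_i + t, via the Kabsch algorithm."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Applying only `t` corresponds to the T-only protocol in the abstract; applying both `R` and `t` corresponds to the full translation-plus-rotation correction.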
Statistical Yearbook of Norway 2012
Energy Technology Data Exchange (ETDEWEB)
NONE
2012-07-01
The Statistical Yearbook of Norway 2012 contains statistics on Norway and main figures for the Nordic countries and other countries selected from international statistics. The international over-views are integrated with the other tables and figures. The selection of tables in this edition is mostly the same as in the 2011 edition. The yearbook's 480 tables and figures present the main trends in official statistics in most areas of society. The list of tables and figures and an index at the back of the book provide easy access to relevant information. In addition, source information and Internet addresses below the tables make the yearbook a good starting point for those who are looking for more detailed statistics. The statistics are based on data gathered in statistical surveys and from administrative data, which, in cooperation with other public institutions, have been made available for statistical purposes. Some tables have been prepared in their entirety by other public institutions. The statistics follow approved principles, standards and classifications that are in line with international recommendations and guidelines. Content: 00. General subjects; 01. Environment; 02. Population; 03. Health and social conditions; 04. Education; 05. Personal economy and housing conditions; 06. Labour market; 07. Recreational, cultural and sporting activities; 08. Prices and indices; 09. National Economy and external trade; 10. Industrial activities; 11. Financial markets; 12. Public finances; Geographical survey.(eb)
Xu, Sai; Zhou, Zhiyan; Lu, Huazhong; Luo, Xiwen; Lan, Yubin
2014-03-19
Principal Component Analysis (PCA) is one of the main methods used for electronic nose pattern recognition. However, poor classification performance is common when using regular PCA. This paper aims to improve the classification performance of regular PCA based on the existing Wilks Λ-statistic (i.e., PCA combined with the Wilks distribution). The improved algorithms, which combine regular PCA with the Wilks Λ-statistic, were developed after analysing the functionality and defects of PCA. Verification tests were conducted using a PEN3 electronic nose. The collected samples consisted of the volatiles of six varieties of rough rice (Zhongxiang1, Xiangwan13, Yaopingxiang, WufengyouT025, Pin 36, and Youyou122), grown in the same area and season. The first two principal components used as analysis vectors cannot perform the rough rice variety classification task based on regular PCA. Using the improved algorithms, which combine regular PCA with the Wilks Λ-statistic, different principal components were selected as analysis vectors. The set of Mahalanobis distances between each of the varieties of rough rice was used to estimate the performance of the classification. The results illustrate that the rough rice variety classification task is achieved well using the improved algorithm. A Probabilistic Neural Network (PNN) was also established to test the effectiveness of the improved algorithms. The first two principal components (PC1 and PC2) and the first and fifth principal components (PC1 and PC5) were selected as the inputs of the PNN for the classification of the six rough rice varieties. The results indicate that the classification accuracy based on the improved algorithm was improved by 6.67% compared to the results of the regular method. These results prove the effectiveness of using the Wilks Λ-statistic to improve the classification accuracy of the regular PCA approach.
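The Mahalanobis-distance criterion used to judge how well two varieties separate in the selected principal-component space can be sketched as follows (pooled-covariance form; the paper's exact computation may differ):

```python
import numpy as np

def mahalanobis_between(a, b):
    """Mahalanobis distance between the means of two sample groups
    (e.g. two rice varieties in the chosen PC coordinates), using their
    pooled sample covariance."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean(axis=0) - b.mean(axis=0)
    pooled = ((len(a) - 1) * np.cov(a.T) + (len(b) - 1) * np.cov(b.T)) \
             / (len(a) + len(b) - 2)
    return float(np.sqrt(diff @ np.linalg.inv(pooled) @ diff))
```

Larger pairwise distances indicate better-separated variety clusters, which is the sense in which the improved PC selection outperforms always taking PC1 and PC2.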
Multisensor multiresolution data fusion for improvement in classification
Rubeena, V.; Tiwari, K. C.
2016-04-01
The rapid advancements in technology have facilitated easy availability of multisensor and multiresolution remote sensing data. Multisensor, multiresolution data contain complementary information, and fusion of such data may yield application-dependent significant information which may otherwise remain trapped within. The present work aims at improving classification by fusing features of coarse resolution hyperspectral (1 m) LWIR and fine resolution (20 cm) RGB data. The classification map comprises eight classes: Road, Trees, Red Roof, Grey Roof, Concrete Roof, Vegetation, Bare Soil and Unclassified. The processing methodology for the hyperspectral LWIR data comprises dimensionality reduction, resampling of the data by an interpolation technique for registering the two images at the same spatial resolution, and extraction of spatial features to improve classification accuracy. In the case of the fine resolution RGB data, the vegetation index is computed for classifying the vegetation class and the morphological building index is calculated for buildings. In order to extract textural features, occurrence and co-occurrence statistics are considered, and the features are extracted from all three bands of the RGB data. After extracting the features, Support Vector Machines (SVM) are used for training and classification. To increase the classification accuracy, post-processing steps such as removal of spurious noise (e.g., salt and pepper noise) are performed, followed by a majority-voting filtering process within the objects for better object classification.
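The co-occurrence statistics mentioned above are usually computed as a grey-level co-occurrence matrix (GLCM); a minimal single-offset sketch (the offset and number of grey levels are illustrative choices):

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix for one pixel offset (dx, dy >= 0):
    counts of grey-level pairs (img[i, j], img[i + dy, j + dx]). Texture
    features such as contrast or homogeneity are derived from this matrix."""
    h, w = img.shape
    M = np.zeros((levels, levels), dtype=int)
    for i in range(h - dy):
        for j in range(w - dx):
            M[img[i, j], img[i + dy, j + dx]] += 1
    return M
```

In practice the matrix is computed per band and per offset direction, and summary statistics of M become the textural features fed to the SVM.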
q-Deformed KP Hierarchy and q-Deformed Constrained KP Hierarchy
He, Jingsong; Li, Yinghua; Cheng, Yi
2006-01-01
Using the determinant representation of the gauge transformation operator, we have shown that the general form of the $\tau$-function of the $q$-KP hierarchy is a $q$-deformed generalized Wronskian, which includes the $q$-deformed Wronskian as a special case. On this basis, we study the $q$-deformed constrained KP ($q$-cKP) hierarchy, i.e. $l$-constraints of the $q$-KP hierarchy. Similar to the ordinary constrained KP (cKP) hierarchy, a large class of solutions of the $q$-cKP hierarchy can be represent...
Land Cover Classification Using ALOS Imagery For Penang, Malaysia
International Nuclear Information System (INIS)
Sim, C K; Abdullah, K; MatJafri, M Z; Lim, H S
2014-01-01
This paper presents the potential of integrating optical and radar remote sensing data to improve automatic land cover mapping. The analysis involved standard image processing, and consists of spectral signature extraction and application of a statistical decision rule to identify land cover categories. A maximum likelihood classifier is utilized to determine different land cover categories. Ground reference data from sites throughout the study area are collected for training and validation. The land cover information was extracted from the digital data using the PCI Geomatica 10.3.2 software package. The variations in classification accuracy due to a number of radar imaging processing techniques are studied. The relationship between the processing window and the land classification is also investigated. The classification accuracies from the optical and radar feature combinations are studied. Our research finds that fusion of radar and optical data significantly improved classification accuracies. This study indicates that land cover/use can be mapped accurately by using this approach.
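A maximum likelihood classifier of the kind used here fits one multivariate Gaussian per land cover class from training pixels and assigns each pixel to the class with the highest log-likelihood. A minimal sketch (class and variable names are illustrative, not from the paper):

```python
import numpy as np

class GaussianMLClassifier:
    """Maximum likelihood classifier: one multivariate Gaussian per class;
    each sample is assigned to the class with the highest log-likelihood."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # small ridge keeps the covariance invertible
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params_[c] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, icov, logdet = self.params_[c]
            d = X - mu
            # log-likelihood up to a constant: -(Mahalanobis^2 + log|cov|)/2
            scores.append(-0.5 * (np.einsum('ij,jk,ik->i', d, icov, d) + logdet))
        return self.classes_[np.argmax(scores, axis=0)]
```

With spectral signatures extracted per class, `fit` estimates the class statistics and `predict` performs the statistical decision rule pixel by pixel.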
A Comparative Analysis of Classification Algorithms on Diverse Datasets
Directory of Open Access Journals (Sweden)
M. Alghobiri
2018-04-01
Full Text Available Data mining involves the computational process of finding patterns in large data sets. Classification, one of the main domains of data mining, involves generalizing a known structure to apply to a new dataset and predict its class. There are various classification algorithms being used to classify various data sets. They are based on different methods such as probability, decision trees, neural networks, nearest neighbor, Boolean and fuzzy logic, kernel-based methods, etc. In this paper, we apply three diverse classification algorithms to ten datasets. The datasets have been selected based on their size and/or number and nature of attributes. Results have been discussed using performance evaluation measures such as precision, accuracy, F-measure, Kappa statistic, mean absolute error, relative absolute error, ROC area, etc. Comparative analysis has been carried out using the performance evaluation measures of accuracy, precision, and F-measure. We specify features and limitations of the classification algorithms for datasets of diverse nature.
International Nuclear Information System (INIS)
Hartley, G.
1998-01-01
The revised Basic Safety Standard (BSS), Directive 96/29/Euratom, continues to promote the principles of classifying workplaces into different areas and workers into different categories. These requirements were present in the previous BSS and are enshrined in current United Kingdom Legislation. The opportunity is taken to look at the relationship between the classification of areas and the classification of persons in the context of a Commercial Advanced Gas Reactor (CAGR). The radiation exposure statistics for eight years of operation have been reviewed to determine if the criteria for classifying Controlled Areas and for classifying Category A workers are in alignment. The data shows that there is a significant misalignment and the reasons for this are explored. The likely impact on the classification of areas and of workers through the implementation of the revised BSS is also considered in the CAGR context. (author)
Generalized ensemble theory with non-extensive statistics
Shen, Ke-Ming; Zhang, Ben-Wei; Wang, En-Ke
2017-12-01
The non-extensive canonical ensemble theory is reconsidered with the method of Lagrange multipliers by maximizing the Tsallis entropy, with the constraint that the normalization term of Tsallis' q-average of physical quantities, the sum ∑_j p_j^q, is independent of the probability p_i for Tsallis parameter q. The self-referential problem in the deduced probability and thermal quantities in non-extensive statistics is thus avoided, and thermodynamical relationships are obtained in a consistent and natural way. We also extend the study to the non-extensive grand canonical ensemble theory and obtain the q-deformed Bose-Einstein distribution as well as the q-deformed Fermi-Dirac distribution. The theory is further applied to the generalized Planck law to demonstrate the distinct behaviors of the various generalized q-distribution functions discussed in the literature.
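For reference, the central objects of the abstract can be written out. Conventions for the sign of q−1 and the form of the q-exponential vary in the literature, so the following is one common choice rather than necessarily the authors' notation:

```latex
S_q \;=\; k\,\frac{1-\sum_i p_i^{\,q}}{q-1},
\qquad
\lim_{q\to 1} S_q \;=\; -k\sum_i p_i \ln p_i \quad \text{(Boltzmann--Gibbs limit)},
```

with the q-exponential $e_q(x) \equiv \bigl[\,1+(1-q)\,x\,\bigr]^{1/(1-q)}$, the q-deformed Bose-Einstein and Fermi-Dirac distributions then take the form

```latex
\langle n \rangle_q^{\mathrm{BE}}
  \;=\; \frac{1}{\bigl[\,1+(q-1)\,\beta(\varepsilon-\mu)\,\bigr]^{1/(q-1)} - 1},
\qquad
\langle n \rangle_q^{\mathrm{FD}}
  \;=\; \frac{1}{\bigl[\,1+(q-1)\,\beta(\varepsilon-\mu)\,\bigr]^{1/(q-1)} + 1},
```

both reducing to the standard quantum distributions as q → 1.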
Directory of Open Access Journals (Sweden)
J. Sunil Rao
2007-01-01
Full Text Available In gene selection for cancer classification using microarray data, we define an eigenvalue-ratio statistic to measure a gene's contribution to the joint discriminability when this gene is included in a set of genes. Based on this eigenvalue-ratio statistic, we define a novel hypothesis test for gene statistical redundancy and propose two gene selection methods. Simulation studies illustrate the agreement between statistical redundancy testing and gene selection methods. Real data examples show the proposed gene selection methods can select a compact gene subset which can not only be used to build high-quality cancer classifiers but also shows biological relevance.
Nonlinear estimation and classification
Hansen, Mark; Holmes, Christopher; Mallick, Bani; Yu, Bin
2003-01-01
Researchers in many disciplines face the formidable task of analyzing massive amounts of high-dimensional and highly-structured data. This is due in part to recent advances in data collection and computing technologies. As a result, fundamental statistical research is being undertaken in a variety of different fields. Driven by the complexity of these new problems, and fueled by the explosion of available computer power, highly adaptive, non-linear procedures are now essential components of modern "data analysis," a term that we liberally interpret to include speech and pattern recognition, classification, data compression and signal processing. The development of new, flexible methods combines advances from many sources, including approximation theory, numerical analysis, machine learning, signal processing and statistics. The proposed workshop intends to bring together eminent experts from these fields in order to exchange ideas and forge directions for the future.
Fetit, Ahmed E; Novak, Jan; Peet, Andrew C; Arvanitits, Theodoros N
2015-09-01
The aim of this study was to assess the efficacy of three-dimensional texture analysis (3D TA) of conventional MR images for the classification of childhood brain tumours in a quantitative manner. The dataset comprised pre-contrast T1- and T2-weighted MRI series obtained from 48 children diagnosed with brain tumours (medulloblastoma, pilocytic astrocytoma and ependymoma). 3D and 2D TA were carried out on the images using first-, second- and higher-order statistical methods. Six supervised classification algorithms were trained with the most influential 3D and 2D textural features, and their performances in the classification of tumour types, using the two feature sets, were compared. Model validation was carried out using the leave-one-out cross-validation (LOOCV) approach, as well as stratified 10-fold cross-validation, in order to provide additional reassurance. McNemar's test was used to test the statistical significance of any improvements demonstrated by 3D-trained classifiers. Supervised learning models trained with 3D textural features showed improved classification performance compared with those trained with conventional 2D features. For instance, a neural network classifier showed a 12% improvement in the area under the receiver operator characteristics curve (AUC) and 19% in overall classification accuracy. These improvements were statistically significant for four of the tested classifiers, as per McNemar's tests. This study shows that 3D textural features extracted from conventional T1- and T2-weighted images can improve the diagnostic classification of childhood brain tumours. Long-term benefits of accurate, yet non-invasive, diagnostic aids include a reduction in surgical procedures, improvement in surgical and therapy planning, and support of discussions with patients' families. It remains necessary, however, to extend the analysis to a multicentre cohort in order to assess the scalability of the techniques used. Copyright © 2015 John Wiley & Sons, Ltd.
Kunina-Habenicht, Olga; Rupp, André A.; Wilhelm, Oliver
2017-01-01
Diagnostic classification models (DCMs) hold great potential for applications in summative and formative assessment by providing discrete multivariate proficiency scores that yield statistically driven classifications of students. Using data from a newly developed diagnostic arithmetic assessment that was administered to 2032 fourth-grade students…
Textual information access statistical models
Gaussier, Eric
2013-01-01
This book presents statistical models that have recently been developed within several research communities to access information contained in text collections. The problems considered are linked to applications aiming at facilitating information access: information extraction and retrieval; text classification and clustering; opinion mining; and comprehension aids (automatic summarization, machine translation, visualization). In order to give the reader as complete a description as possible, the focus is placed on the probability models used in the applications.
[The inadequacy of official classification of work accidents in Brazil].
Cordeiro, Ricardo
2018-02-19
Traditionally, work accidents in Brazil have been categorized in government documents and legal and academic texts as typical work accidents and commuting accidents. Given the increase in urban violence and the increasingly precarious work conditions in recent decades, this article addresses the conceptual inadequacy of this classification and its implications for the underestimation of work accidents in the country. An alternative classification is presented as an example and a contribution to the discussion on the improvement of statistics on work-related injuries in Brazil.
SVM-based Partial Discharge Pattern Classification for GIS
Ling, Yin; Bai, Demeng; Wang, Menglin; Gong, Xiaojin; Gu, Chao
2018-01-01
Partial discharges (PD) occur when there are localized dielectric breakdowns in small regions of gas insulated substations (GIS). It is of high importance to recognize PD patterns, through which we can diagnose the defects caused by different sources so that predictive maintenance can be conducted to prevent unplanned power outages. In this paper, we propose an approach to perform partial discharge pattern classification. It first recovers the PRPD matrices from the PRPD2D images; then statistical features are extracted from the recovered PRPD matrices and fed into an SVM for classification. Experiments conducted on a dataset containing thousands of images demonstrate the high effectiveness of the method.
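The statistical features extracted from a recovered PRPD matrix are typically moments of its phase and magnitude profiles. A minimal numpy sketch (the paper does not specify its exact feature set here, so these moments are illustrative); the resulting vectors would then be fed to a classifier such as an SVM:

```python
import numpy as np

def prpd_features(prpd):
    """Statistical features from a PRPD matrix (rows: phase bins over 0-360
    degrees, cols: discharge-magnitude bins): mean, std, skewness and
    kurtosis of the phase and magnitude count profiles."""
    prpd = np.asarray(prpd, dtype=float)
    phase_profile = prpd.sum(axis=1)   # discharge count per phase bin
    mag_profile = prpd.sum(axis=0)     # discharge count per magnitude bin
    feats = []
    for prof in (phase_profile, mag_profile):
        p = prof / (prof.sum() + 1e-12)      # treat profile as a distribution
        idx = np.arange(len(p))
        mean = (idx * p).sum()
        std = np.sqrt(((idx - mean) ** 2 * p).sum()) + 1e-12
        skew = (((idx - mean) / std) ** 3 * p).sum()
        kurt = (((idx - mean) / std) ** 4 * p).sum()
        feats += [mean, std - 1e-12, skew, kurt]
    return np.array(feats)
```

Different defect sources concentrate discharges at different phase positions, so these profile moments separate the patterns before any classifier is applied.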
Statistical properties of spectra in harmonically trapped spin-orbit coupled systems
DEFF Research Database (Denmark)
Marchukov, O. V.; Volosniev, A. G.; Fedorov, D. V.
2014-01-01
We compute single-particle energy spectra for a one-body Hamiltonian consisting of a two-dimensional deformed harmonic oscillator potential, the Rashba spin-orbit coupling and the Zeeman term. To investigate the statistical properties of the obtained spectra as functions of deformation, spin-orbit and Zeeman strengths we examine the distributions of the nearest neighbor spacings. We find that the shapes of these distributions depend strongly on the three potential parameters. We show that the obtained shapes in some cases can be well approximated with the standard Poisson, Brody and Wigner distributions. The Brody and Wigner distributions characterize irregular motion and help identify quantum chaotic systems. We present special choices of deformation and spin-orbit strengths without the Zeeman term which provide a fair reproduction of the fourth-power repelling Wigner distribution. By adding...
International Nuclear Information System (INIS)
Lima, A.F. de
2003-01-01
The q-deformed kink of the λφ⁴-model is obtained via the normalisable ground state eigenfunction of a fluctuation operator associated with the q-deformed hyperbolic functions. The kink mass, the bosonic zero-mode and the q-deformed potential in 1+1 dimensions are found. (author)
Text document classification based on mixture models
Czech Academy of Sciences Publication Activity Database
Novovičová, Jana; Malík, Antonín
2004-01-01
Roč. 40, č. 3 (2004), s. 293-304 ISSN 0023-5954 R&D Projects: GA AV ČR IAA2075302; GA ČR GA102/03/0049; GA AV ČR KSK1019101 Institutional research plan: CEZ:AV0Z1075907 Keywords : text classification * text categorization * multinomial mixture model Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.224, year: 2004
Rahman, Md Mostafizur; Fattah, Shaikh Anowarul
2017-01-01
In view of the recent increase in brain-computer interface (BCI) based applications, the importance of efficient classification of various mental tasks has increased considerably. To obtain effective classification, an efficient feature extraction scheme is necessary, for which, in the proposed method, the interchannel relationship among electroencephalogram (EEG) data is utilized. It is expected that the correlation obtained from different combinations of channels will differ across mental tasks, which can be exploited to extract distinctive features. The empirical mode decomposition (EMD) technique is employed on a test EEG signal obtained from a channel, which provides a number of intrinsic mode functions (IMFs), and correlation coefficients are extracted from interchannel IMF data. Simultaneously, different statistical features are also obtained from each IMF. Finally, the feature matrix is formed utilizing interchannel correlation features and intrachannel statistical features of the selected IMFs of the EEG signal. Different kernels of the support vector machine (SVM) classifier are used to carry out the classification task. An EEG dataset containing ten different combinations of five different mental tasks is utilized to demonstrate the classification performance, and a very high level of accuracy is achieved by the proposed scheme compared to existing methods.
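The interchannel correlation features can be sketched as below, assuming the EMD step has already produced an array of IMFs per channel (e.g. via a third-party EMD library); the array shape and function name are illustrative, not from the paper:

```python
import numpy as np

def interchannel_correlation_features(imfs_per_channel):
    """Correlation-coefficient features between corresponding IMFs of
    different EEG channels. `imfs_per_channel` has shape
    (n_channels, n_imfs, n_samples); the EMD decomposition producing the
    IMFs is assumed to have been run already."""
    X = np.asarray(imfs_per_channel, dtype=float)
    n_ch, n_imf, _ = X.shape
    feats = []
    for k in range(n_imf):                     # for each IMF index
        for i in range(n_ch):                  # for each channel pair
            for j in range(i + 1, n_ch):
                c = np.corrcoef(X[i, k], X[j, k])[0, 1]
                feats.append(c)
    return np.array(feats)
```

For C channels and M selected IMFs this yields M·C(C−1)/2 correlation features, to which the intrachannel statistical features of each IMF would be appended before classification.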
Study on MPGA-BP of Gravity Dam Deformation Prediction
Directory of Open Access Journals (Sweden)
Xiaoyu Wang
2017-01-01
Full Text Available Displacement is an important physical quantity in deformation monitoring of hydraulic structures, and its prediction accuracy is the premise of ensuring safe operation. Most existing metaheuristic methods have three problems: (1) easily falling into local minima, (2) slow convergence, and (3) sensitivity to initial values. Resolving these three problems and improving the prediction accuracy necessitate the application of a genetic algorithm-based backpropagation (GA-BP) neural network and a multiple population genetic algorithm (MPGA). A hybrid multiple population genetic algorithm backpropagation (MPGA-BP) neural network algorithm is put forward to optimize deformation prediction from periodic monitoring surveys of hydraulic structures. This hybrid model is employed for analyzing the displacement of a gravity dam in China. The results show the proposed model is superior to an ordinary BP neural network and a statistical regression model in terms of global search, convergence speed, and prediction accuracy.
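The GA part of a GA-BP scheme searches weight space globally before backpropagation fine-tunes locally. A minimal single-population sketch under assumed hyperparameters (population size, mutation scale and network size are invented, and the BP fine-tuning step is omitted):

```python
import numpy as np

def mlp_forward(w, X, n_hidden):
    """Tiny one-hidden-layer MLP; `w` is the flat weight vector."""
    n_in = X.shape[1]
    i = 0
    W1 = w[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = w[i:i + n_hidden]; i += n_hidden
    W2 = w[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return (h @ W2).ravel() + b2

def ga_init_weights(X, y, n_hidden=6, pop=40, gens=60, seed=0):
    """Genetic algorithm searching for good MLP weights (fitness = -MSE);
    BP fine-tuning would normally start from the returned winner."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    P = rng.normal(0.0, 1.0, size=(pop, dim))
    def fitness(w):
        e = mlp_forward(w, X, n_hidden) - y
        return -float(np.mean(e ** 2))
    best_mse = []
    for _ in range(gens):
        f = np.array([fitness(w) for w in P])
        best_mse.append(-f.max())
        elite = P[np.argsort(-f)[: pop // 4]]            # selection (elitism)
        children = []
        while len(children) < pop - len(elite):
            a, b = elite[rng.integers(len(elite), size=2)]
            mask = rng.random(dim) < 0.5                 # uniform crossover
            child = np.where(mask, a, b)
            child += rng.normal(0.0, 0.1, size=dim)      # mutation
            children.append(child)
        P = np.vstack([elite] + children)
    f = np.array([fitness(w) for w in P])
    return P[np.argmax(f)], best_mse
```

An MPGA extends this by evolving several such populations in parallel and migrating elites between them, which is what counters premature convergence to local minima.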
The solubility and diffusivity of hydrogen in well-annealed and deformed iron
International Nuclear Information System (INIS)
Kiuchi, K.; McLellan, R.B.
1983-01-01
It has been shown that a large volume of data for the solubility of hydrogen in iron is affected by spurious surface conditions. Arrhenius plots of solubility data in the temperature range 300-1750 K, which are free of such effects, exhibit a temperature variation which, despite the low H-solubility in the entire temperature range, is not consistent with regular mixing statistics. This departure from regular behavior is consistent with the thermal activation of H atoms into energetically less favorable octahedral sites as the temperature is increased. The enhancement in H-solubility caused by the cold deformation of iron can be understood in terms of a simple Maxwell-Boltzmann distribution of H atoms between "normal" lattice sites and "trapping" sites of depth 34 kJ/mol. The 62 currently existing sets of data for the diffusivity of hydrogen through b.c.c. iron exhibit a large degree of mutual inconsistency. Exhaustive statistical analysis of this large body of data has shown that only those data obtained by electrochemical methods and H₂-gas equilibration methods using UHV techniques and Pd-coated membranes are reliable. The problem of H-diffusion in deformed iron has been analysed using a semi-quantitative model in which the retarding effect of trapping sites on the diffusivity is partially compensated by a "pipe" diffusion contribution along dislocations. It is shown that this model is in accord with the diffusivities measured in deformed iron when data not encumbered by spurious surface effects are considered.
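In the dilute limit, the Maxwell-Boltzmann partition of hydrogen between "normal" lattice sites and traps quoted above can be written as follows (this is the standard dilute-occupancy form, with site densities $N_L$, $N_T$ and trap depth $E_b$; the paper's own derivation may carry additional terms):

```latex
\frac{n_T}{n_L} \;=\; \frac{N_T}{N_L}\,\exp\!\left(\frac{E_b}{RT}\right),
\qquad E_b \approx 34\ \mathrm{kJ\,mol^{-1}},
```

so the total measured solubility is enhanced over the lattice solubility by

```latex
c_{\mathrm{tot}} \;=\; c_L\left[\,1 + \frac{N_T}{N_L}\,
\exp\!\left(\frac{E_b}{RT}\right)\right],
```

with the enhancement growing as deformation raises the trap density $N_T$ and as the temperature is lowered.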
The effects of needle deformation during lumbar puncture
Directory of Open Access Journals (Sweden)
Hasan Hüseyin Özdemir
2015-01-01
Full Text Available Objective: The aim of this study is to assess deformation of the tip and deflection from the axis of 22-gauge Quincke needles when they are used for diagnostic lumbar puncture (LP). Thus, it can be determined whether constructional alterations of needles are important for predicting clinical problems after diagnostic LP. Materials and Methods: The 22-gauge Quincke needles used for diagnostic LP were evaluated. A specially designed protractor was used for measurement and evaluation. Waist circumference was measured in each patient. Patients were questioned about headaches occurring after LP. Results: A total of 115 Quincke-type spinal needles used in 113 patients were evaluated. No deflection was detected in 38 (33.1%) of the needles. Deflection between 0.1° and 5° occurred in 43 (37.3%) of the needles and deflection ≥ 5.1° occurred in 34 patients (29.6%). Forty-seven (41.5%) patients experienced post lumbar puncture headache (PLPH) and 13 (11.5%) patients experienced intracranial hypotension (IH). No statistically significant correlation between the degree of deflection and headache was found (P > 0.05). Epidural blood patch was performed for three patients. Deformity in the form of bending like a hook occurred in seven needles and IH occurred in six patients using these needles. Two of the needles used in three patients requiring blood patch were found to be bent. Conclusion: Deformation of needles may increase complications after LP. Needle deformation may lead to IH. In case of deterioration in the structure of the needle, termination of the puncture procedure and the use of a new needle could reduce undesirable clinical consequences, especially IH.
The effects of needle deformation during lumbar puncture
Özdemir, Hasan Hüseyin; Demir, Caner F.; Varol, Sefer; Arslan, Demet; Yıldız, Mustafa; Akil, Eşref
2015-01-01
Objective: The aim of this study is to assess deformation of the tip and deflection from the axis of 22-gauge Quincke needles when they are used for diagnostic lumbar puncture (LP). Thus, it can be determined whether constructional alterations of needles are important for predicting clinical problems after diagnostic LP. Materials and Methods: The 22-gauge Quincke needles used for diagnostic LP were evaluated. A specially designed protractor was used for measurement and evaluation. Waist circumference was measured in each patient. Patients were questioned about headaches occurring after LP. Results: A total of 115 Quincke-type spinal needles used in 113 patients were evaluated. No deflection was detected in 38 (33.1%) of the needles. Deflection between 0.1° and 5° occurred in 43 (37.3%) of the needles and deflection ≥ 5.1° occurred in 34 patients (29.6%). Forty-seven (41.5%) patients experienced post lumbar puncture headache (PLPH) and 13 (11.5%) patients experienced intracranial hypotension (IH). No statistically significant correlation between the degree of deflection and headache was found (P > 0.05). Epidural blood patch was performed for three patients. Deformity in the form of bending like a hook occurred in seven needles and IH occurred in six patients using these needles. Two of the needles used in three patients requiring blood patch were found to be bent. Conclusion: Deformation of needles may increase complications after LP. Needle deformation may lead to IH. In case of deterioration in the structure of the needle, termination of the puncture procedure and the use of a new needle could reduce undesirable clinical consequences, especially IH. PMID:25883480
Marti, Sina; Stünitz, Holger; Heilbronner, Renée; Plümper, Oliver; Drury, Martyn
2016-04-01
Deformation experiments were performed on natural Maryland Diabase (~55% Plg, 42% Px, 3% accessories, 0.18 wt.-% H2O added) in a Griggs-type deformation apparatus in order to explore the brittle-viscous transition and the interplay between deformation and mineral reactions. Shear experiments at strain rates of ~2e-5 /s are performed, at T = 600, 700 and 800°C and confining pressures Pc = 1.0 and 1.5 GPa. Deformation localizes in all experiments. Below 700°C, the microstructure is dominated by brittle deformation with a foliation formed by cataclastic flow and high strain accommodated along 3-5 major ultracataclasite shear bands. At 700°C, the bulk of the material still exhibits abundant microfractures; however, deformation localizes into an anastomosing network of shear bands (SB) formed from a fine-grained (<< 1 μm) mixture of newly formed Plg and Amph. These reaction products occur almost exclusively along syn-kinematic structures such as fractures and SB. Experiments at 800°C show extensive mineral reactions, with the main reaction products Amph+Plg (+Zo). Deformation is localized in broad C' and C SB formed by a fine-grained (0.1 - 0.8 μm) mixture of Plg+Amph (+Zo). The onset of mineral reactions in the 700°C experiments shows that reaction kinetics and diffusional mass transport are fast enough to keep up with the short experimental timescales. While in the 700°C experiments brittle processes kinematically contribute to deformation, fracturing is largely absent at 800°C. Diffusive mass transfer dominates. The very small grain size within SB favours a grain size sensitive deformation mechanism. Due to the presence of water (and relatively high supported stresses), dissolution-precipitation creep is interpreted to be the dominant strain accommodating mechanism. From the change of Amph coronas around Px clasts with strain, we can determine that Amph is re-dissolved at high stress sites while growing in low stress sites, showing the ability of Amph to
Deformation of second and third quantization
Faizal, Mir
2015-03-01
In this paper, we will deform the second and third quantized theories by deforming the canonical commutation relations in such a way that they become consistent with the generalized uncertainty principle. Thus, we will first deform the second quantized commutator and obtain a deformed version of the Wheeler-DeWitt equation. Then we will further deform the third quantized theory by deforming the third quantized canonical commutation relation. This way we will obtain a deformed version of the third quantized theory for the multiverse.
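A commonly used deformed commutator consistent with the generalized uncertainty principle is shown below; the specific deformation adopted in the paper may differ in form or parametrization:

```latex
[\,x,\,p\,] \;=\; i\hbar\left(1+\beta p^{2}\right)
\;\;\Longrightarrow\;\;
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta\,(\Delta p)^{2}\right),
\qquad
\Delta x_{\min} \;=\; \hbar\sqrt{\beta}\,,
```

i.e. the deformation parameter β introduces a minimal measurable length, and applying the analogous deformation to the second and third quantized canonical commutators is what produces the deformed Wheeler-DeWitt equation and the deformed multiverse theory described above.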
Directory of Open Access Journals (Sweden)
Vladimir A. Yanchuk
2017-09-01
Full Text Available Introduction: a distinctive feature of modern psychological knowledge is an extreme degree of disintegration manifested in an infinite array of publications describing local fragments of the studied reality outside the context of integrity. Simultaneously, development of knowledge without its metatheoretical interpretation gives it a sporadic character and, consequently, restricts the optimal solutions. The author’s attempt to solve this urgent problem is presented in the framework of sociocultural-interdeterminist dialogical metatheory of integration of psychological knowledge. Methodological foundations with substantial characterisation of metatheory are described. A research objective is to present innovative and heuristic potential of the meta-approach demonstration illustrated through the teacher’s psychological anti-deforming model. Materials and Methods: the methodological basis of research is presented by the sociocultural interdeterminist dialogical approach to education phenomenology analysis which innovative potential is illustrated by the example of teacher’s personality deformation. System analysis, comparative method, systematisation and conceptualisation of scientific ideas, classification and typifications, research object and subject modeling were used during the study. Results: the foundations and innovative potential of the sociocultural-interdeterminist dialogical meta-approach to social phenomenology analysis are given a thorough account. The teacher’s personality professional deformation main criteria are given (authoritativeness, rigidity, self-perception non-criticality, role expansionism and pedagogical indifference, the personality deformation operational definition is formulated. The concept of psychological interdeterminants of professional deformation is introduced, the process of interdetermination of personality’s deformation phenomenon is revealed, that is: interdependence of per¬sonal, environmental
WIX: statistical nuclear multifragmentation with collective expansion and Coulomb forces
Randrup, Jørgen
1993-10-01
By suitable augmentation of the event generator FREESCO, a code WIX has been constructed with which it is possible to simulate the statistical multifragmentation of a specified nuclear source, which may be both hollow and deformed, in the presence of a collective expansion and with the interfragment Coulomb forces included.
Deformations of the Almheiri-Polchinski model
Energy Technology Data Exchange (ETDEWEB)
Kyono, Hideki; Okumura, Suguru; Yoshida, Kentaroh [Department of Physics, Kyoto University, Kitashirakawa Oiwake-cho, Kyoto 606-8502 (Japan)
2017-03-31
We study deformations of the Almheiri-Polchinski (AP) model by employing the Yang-Baxter deformation technique. The general deformed AdS₂ metric becomes a solution of a deformed AP model. In particular, the dilaton potential is deformed from a simple quadratic form to a hyperbolic function-type potential, similarly to integrable deformations. A specific solution is a deformed black hole solution. Because the deformation makes the spacetime structure around the boundary change drastically and a new naked singularity appears, the holographic interpretation is far from trivial. The Hawking temperature is the same as in the undeformed case, but the Bekenstein-Hawking entropy is modified by the deformation. This entropy can also be reproduced by evaluating the renormalized stress tensor with an appropriate counter-term on the regularized screen close to the singularity.
New Statistics for Texture Classification Based on Gabor Filters
Directory of Open Access Journals (Sweden)
J. Pavlovicova
2007-09-01
Full Text Available The paper introduces a new method of texture segmentation efficiency evaluation. One of the well-known texture segmentation methods is based on Gabor filters, because of their orientation and spatial-frequency character. Several statistics are used to extract more information from the results obtained by Gabor filtering. The large number of input parameters produces a wide set of results that needs to be evaluated. The evaluation method is based on assessing the intersection of the Gaussian curves of the normal distributions, and provides a new point of view on segmentation method selection.
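A real Gabor kernel and the statistics of its filter response can be sketched as follows (pure numpy, FFT-based circular convolution; the parameter choices are illustrative, not those of the paper):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma, gamma=0.5):
    """Real (cosine) Gabor kernel: a Gaussian envelope times a sinusoid
    with the given spatial wavelength, rotated by `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    env = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2.0 * sigma ** 2))
    return env * np.cos(2.0 * np.pi * xr / wavelength)

def gabor_response_stats(img, kernel):
    """Convolve image and kernel via FFT and return the statistics used as
    texture features: mean and std of the response magnitude."""
    H, W = img.shape
    K = np.fft.rfft2(kernel, s=(H, W))
    I = np.fft.rfft2(img, s=(H, W))
    mag = np.abs(np.fft.irfft2(I * K, s=(H, W)))
    return float(mag.mean()), float(mag.std())
```

A texture oscillating along x responds strongly to the θ = 0 kernel of matching wavelength and weakly to the orthogonal one, which is the orientation selectivity the segmentation method relies on.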
Fault Diagnosis for Rotating Machinery Using Vibration Measurement Deep Statistical Feature Learning
Directory of Open Access Journals (Sweden)
Chuan Li
2016-06-01
Full Text Available Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM. The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults.
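The time-domain part of such a statistical feature set is typically a handful of moments and shape factors of the raw vibration signal; a minimal sketch (the exact feature list used in the paper may differ):

```python
import numpy as np

def vibration_features(x):
    """Time-domain statistical features commonly extracted from a vibration
    signal before feature learning or classification."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    std = x.std()
    rms = np.sqrt(np.mean(x ** 2))           # root mean square
    peak = np.max(np.abs(x))
    skew = np.mean(((x - mean) / (std + 1e-12)) ** 3)
    kurt = np.mean(((x - mean) / (std + 1e-12)) ** 4)
    crest = peak / (rms + 1e-12)             # crest factor
    return np.array([mean, std, rms, peak, skew, kurt, crest])
```

Analogous statistics computed on the spectrum and on time-frequency coefficients give the frequency- and time-frequency-domain feature sets, and the stacked GRBMs then learn higher-level representations from the concatenation.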
Li, Chuan; Sánchez, René-Vinicio; Zurita, Grover; Cerrada, Mariela; Cabrera, Diego
2016-06-17
Fault diagnosis is important for the maintenance of rotating machinery. The detection of faults and fault patterns is a challenging part of machinery fault diagnosis. To tackle this problem, a model for deep statistical feature learning from vibration measurements of rotating machinery is presented in this paper. Vibration sensor signals collected from rotating mechanical systems are represented in the time, frequency, and time-frequency domains, each of which is then used to produce a statistical feature set. For learning statistical features, real-value Gaussian-Bernoulli restricted Boltzmann machines (GRBMs) are stacked to develop a Gaussian-Bernoulli deep Boltzmann machine (GDBM). The suggested approach is applied as a deep statistical feature learning tool for both gearbox and bearing systems. The fault classification performances in experiments using this approach are 95.17% for the gearbox, and 91.75% for the bearing system. The proposed approach is compared to such standard methods as a support vector machine, GRBM and a combination model. In experiments, the best fault classification rate was detected using the proposed model. The results show that deep learning with statistical feature extraction has an essential improvement potential for diagnosing rotating machinery faults.
Statistical inference for template aging
Schuckers, Michael E.
2006-04-01
A change in classification error rates for a biometric device is often referred to as template aging. Here we offer two methods for determining whether the effect of time is statistically significant. The first of these is the use of a generalized linear model to determine if these error rates change linearly over time. This approach generalizes previous work assessing the impact of covariates using generalized linear models. The second approach uses likelihood ratio test methodology. The focus here is on statistical methods for estimation, not on the underlying cause of the change in error rates over time. These methodologies are applied to data from the National Institute of Standards and Technology Biometric Score Set Release 1. The results of these applications are discussed.
Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf
Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.
2017-12-01
We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize the seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found that strong correlations between backscatter intensity and sediment texture exist. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slopes with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our method worked well
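The accuracy figures quoted (95% total accuracy, 93% Kappa) come from comparing the automated map against a manual one; both statistics can be computed from a confusion matrix. A minimal sketch with a hypothetical 3-class agreement table:

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows: manual classes, columns: automated classes)."""
    k = len(confusion)
    total = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(k)) / total
    # Chance agreement: product of marginal row and column proportions.
    expected = sum(
        sum(confusion[i]) * sum(row[i] for row in confusion) for i in range(k)
    ) / total ** 2
    return observed, (observed - expected) / (1 - expected)

# Hypothetical agreement table for three seafloor classes (ridge/flat/swale).
cm = [[90, 5, 5],
      [4, 88, 8],
      [2, 6, 92]]
acc, kappa = accuracy_and_kappa(cm)  # acc = 0.90, kappa = 0.85
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside raw accuracy.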
Weighted statistical parameters for irregularly sampled time series
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
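The paper's weighting scheme adapts to both sampling density and noise level; as a simplified sketch of the density part only, each observation can be weighted by the time interval it covers (half the gap to each neighbour), which de-emphasizes clumps of measurements:

```python
def interval_weights(times):
    """Weight each observation by the time interval it 'covers':
    half the gap to each neighbour (a simple proxy for sampling density).
    This is an illustrative scheme, not the paper's exact formulation."""
    n = len(times)
    w = []
    for i in range(n):
        left = times[i] - times[i - 1] if i > 0 else times[1] - times[0]
        right = times[i + 1] - times[i] if i < n - 1 else times[-1] - times[-2]
        w.append(0.5 * (left + right))
    total = sum(w)
    return [x / total for x in w]

def weighted_mean(times, values):
    return sum(wi * v for wi, v in zip(interval_weights(times), values))

# A clump of 5 nearly simultaneous points plus 2 isolated points:
t = [0.0, 0.01, 0.02, 0.03, 0.04, 5.0, 10.0]
y = [1.0, 1.0, 1.0, 1.0, 1.0, 3.0, 3.0]
plain = sum(y) / len(y)        # clump dominates the unweighted mean
weighted = weighted_mean(t, y)  # clump down-weighted; closer to the time-average
```

The same weights can be reused in higher moments (variance, skewness, kurtosis) to stabilize descriptive statistics on clumpy samplings.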
International Nuclear Information System (INIS)
Ogievetsky, O.; Pillin, M.; Schmidke, W.B.; Wess, J.; Zumino, B.
1993-01-01
In this lecture I discuss the algebraic structure of a q-deformed four-vector space. It serves as a good example of quantizing Minkowski space. To give a physical interpretation of such a quantized Minkowski space we construct the Hilbert space representation and find that the relevant time and space operators have a discrete spectrum. Thus the q-deformed Minkowski space has a lattice structure. Nevertheless this lattice structure is compatible with the operation of q-deformed Lorentz transformations. The generators of the q-deformed Lorentz group can be represented as linear operators in the same Hilbert space. (orig.)
Directory of Open Access Journals (Sweden)
Branislava Gemovic
2013-01-01
Full Text Available There are more than 500 amino acid substitutions in each human genome, and bioinformatics tools contribute irreplaceably to the determination of their functional effects. We have developed a feature-based algorithm for the detection of mutations outside conserved functional domains (CFDs) and compared its classification efficacy with the most commonly used phylogeny-based tools, PolyPhen-2 and SIFT. The new algorithm is based on the informational spectrum method (ISM), a feature-based technique, and statistical analysis. Our dataset contained neutral polymorphisms and mutations associated with myeloid malignancies from the epigenetic regulators ASXL1, DNMT3A, EZH2, and TET2. PolyPhen-2 and SIFT had significantly lower accuracies in predicting the effects of amino acid substitutions outside CFDs than expected, with especially low sensitivity. On the other hand, only the ISM algorithm showed statistically significant classification of these sequences. It outperformed PolyPhen-2 and SIFT by 15% and 13%, respectively. These results suggest that feature-based methods, like the ISM, are more suitable than phylogeny-based tools for the classification of amino acid substitutions outside CFDs.
CLASSIFICATION OF LEARNING MANAGEMENT SYSTEMS
Directory of Open Access Journals (Sweden)
Yu. B. Popova
2016-01-01
Full Text Available The use of information technologies, and learning management systems in particular, increases the opportunities of teachers and students to reach their goals in education. Such systems provide learning content, help organize and monitor training, collect progress statistics, and take into account the individual characteristics of each user. Currently, there is a huge inventory of both paid and free systems, physically located either on college servers or in the cloud, offering different feature sets, licensing schemes and costs. This creates the problem of choosing the best system, a problem partly due to the lack of a comprehensive classification of such systems. Analysis of more than 30 of the most common automated learning management systems has shown that a classification of such systems should be carried out according to certain criteria under which systems of the same type can be considered together. The classification features offered by the author are: cost, functionality, modularity, fulfilment of customer requirements, content integration, the physical location of the system, and adaptability of training. Considering learning management systems within these classifications, and taking into account current trends in their development, it is possible to identify the main requirements for them: functionality, reliability, ease of use, low cost, support for the SCORM standard or the Tin Can API, modularity and adaptability. In line with these requirements, the Software Department of FITR BNTU, under the guidance of the author, has been developing, using and continuously improving its own learning management system since 2009.
Comparison analysis for classification algorithm in data mining and the study of model use
Chen, Junde; Zhang, Defu
2018-04-01
As a key technique in data mining, classification algorithms have received extensive attention. Through experiments with classification algorithms on UCI data sets, we give a comparative analysis method for the different algorithms, using statistical tests. Beyond that, an adaptive diagnosis model for the prevention of electricity stealing and leakage is given as a specific case in the paper.
Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias
2018-05-16
There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
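The core classification idea, voting over randomly sampled image regions placed in a discretized embedding, can be sketched without a CNN or a real t-SNE. The class centroids, sample coordinates and cutoff below are all hypothetical stand-ins for the learned embedding:

```python
import math
from collections import Counter

# Hypothetical 2-D "t-SNE" class centroids for three tissue classes.
centroids = {"glioma": (0.0, 0.0), "meningioma": (10.0, 0.0), "normal": (0.0, 10.0)}

def nearest_class(point):
    return min(centroids, key=lambda c: math.dist(point, centroids[c]))

def classify_by_vote(sampled_points, min_fraction=0.5):
    """Assign each sampled region to its nearest class region in the
    embedding, then classify by the vote distribution instead of softmax
    scores. Returns 'anomaly' when no class reaches the a priori cutoff."""
    votes = Counter(nearest_class(p) for p in sampled_points)
    label, count = votes.most_common(1)[0]
    return label if count / len(sampled_points) >= min_fraction else "anomaly"

# 8 of 10 sampled regions embed near the glioma cluster:
samples = [(0.5, 0.4)] * 8 + [(9.5, 0.2), (0.3, 9.8)]
decision = classify_by_vote(samples)  # -> "glioma"
```

An a priori vote-fraction cutoff is what makes this behave as an anomaly detector: images whose regions scatter across clusters fail every class threshold.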
Deep-learnt classification of light curves
DEFF Research Database (Denmark)
Mahabal, Ashish; Gieseke, Fabian; Pai, Akshay Sadananda Uppinakudru
2017-01-01
Astronomy light curves are sparse, gappy, and heteroscedastic. As a result, standard time series methods regularly used for financial and similar datasets are of little help, and astronomers are usually left to their own instruments and techniques to classify light curves. A common approach is to derive statistical features from the time series and to use machine learning methods, generally supervised, to separate objects into a few of the standard classes. In this work, we transform the time series to two-dimensional light curve representations in order to classify them using modern deep learning techniques. In particular, we show that convolutional neural network based classifiers work well for broad characterization and classification. We use labeled datasets of periodic variables from the CRTS survey and show how this opens doors for a quick classification of diverse classes with several...
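One way to turn an irregular light curve into a two-dimensional image, in the spirit of this work, is a histogram of all pairwise (Δt, Δm) values; the bin edges below are illustrative, not the ones used in the paper:

```python
def dmdt_image(times, mags, dt_edges, dm_edges):
    """Map a light curve to a 2-D histogram of all pairwise
    (time difference, magnitude difference) values -- one way to turn an
    irregularly sampled time series into an image a CNN can ingest."""
    nx, ny = len(dt_edges) - 1, len(dm_edges) - 1
    img = [[0] * ny for _ in range(nx)]
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):  # ordered pairs, j after i
            dt, dm = times[j] - times[i], mags[j] - mags[i]
            for a in range(nx):
                if dt_edges[a] <= dt < dt_edges[a + 1]:
                    for b in range(ny):
                        if dm_edges[b] <= dm < dm_edges[b + 1]:
                            img[a][b] += 1
    return img

# A tiny 4-point light curve; real curves have hundreds of epochs.
t = [0.0, 1.0, 3.0, 7.0]
m = [14.0, 14.5, 13.8, 14.2]
img = dmdt_image(t, m, dt_edges=[0, 2, 4, 8], dm_edges=[-1, 0, 1])
```

Because the representation depends only on differences, it is invariant to time and magnitude offsets, which suits gappy heteroscedastic survey data.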
International Nuclear Information System (INIS)
Kapur, G.S.; Sastry, M.I.S.; Jaiswal, A.K.; Sarpal, A.S.
2004-01-01
The present paper describes various classification techniques like cluster analysis and principal component (PC)/factor analysis to classify different types of base stocks. The API classification of base oils (Group I-III) has been compared to a more detailed NMR-derived classification based on chemical compositional and molecular structural parameters, in order to point out the similarities of the base oils in the same group and the differences between the oils placed in different groups. The detailed compositional parameters have been generated using ¹H and ¹³C nuclear magnetic resonance (NMR) spectroscopic methods. Further, oxidation stability, measured in terms of rotating bomb oxidation test (RBOT) life, of non-conventional base stocks and their blends with conventional base stocks, has been quantitatively correlated with their ¹H NMR and elemental (sulphur and nitrogen) data with the help of multiple linear regression (MLR) and artificial neural network (ANN) techniques. The MLR-based model developed using NMR and elemental data showed a high correlation between the 'measured' and 'estimated' RBOT values for both training (R=0.859) and validation (R=0.880) data sets. The ANN-based model, developed using a smaller number of input variables (only ¹H NMR data), also showed high correlation between the 'measured' and 'estimated' RBOT values for the training (R=0.881), validation (R=0.860) and test (R=0.955) data sets
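The MLR step can be sketched as ordinary least squares solved via the normal equations. The sulphur/nitrogen values and the exact linear relation below are synthetic, used only to check that the solver recovers known coefficients:

```python
def fit_mlr(X, y):
    """Least-squares fit of y = b0 + b1*x1 + ... via the normal equations,
    solved with Gaussian elimination (fine for a handful of predictors)."""
    rows = [[1.0] + list(x) for x in X]  # prepend intercept column
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for i in range(k):  # forward elimination with partial pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for j in range(i, k):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    b = [0.0] * k
    for i in reversed(range(k)):  # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

# Synthetic data: RBOT life modelled from sulphur and nitrogen content.
X = [(0.1, 0.02), (0.3, 0.05), (0.5, 0.04), (0.7, 0.08), (0.9, 0.10)]
y = [100 + 50 * s - 200 * n for s, n in X]  # exact linear relation
b0, b_s, b_n = fit_mlr(X, y)
```

On real data the fit is not exact, and the reported R values measure the correlation between measured and model-estimated RBOT life.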
Directory of Open Access Journals (Sweden)
Ganchimeg Ganbold
2017-03-01
Full Text Available There are several statistical classification algorithms available for land use/land cover classification. However, each has a certain bias or compromise. Some methods, like the parallel piped approach in supervised classification, cannot classify continuous regions within a feature. On the other hand, while the unsupervised classification method takes maximum advantage of spectral variability in an image, the maximally separable clusters in spectral space may not do much for our perception of important classes in a given study area. In this research, the output of an ANN algorithm was compared with the Possibilistic c-Means (PCM), an improvement of the fuzzy c-Means, on both moderate resolution Landsat 8 and high resolution Formosat 2 images. The Formosat 2 image comes with an 8 m spectral resolution on the multispectral data. This multispectral image data was resampled to 10 m in order to maintain a uniform ratio of 1:3 against the Landsat 8 image. Six classes were chosen for analysis: dense forest, eucalyptus, water, grassland, wheat and riverine sand. Using a standard false color composite (FCC), the six features reflected differently in the infrared region, with wheat producing the brightest pixel values. Signature collection per class was therefore easily obtained for all classifications. The outputs of both ANN and PCM were analyzed separately for accuracy, and an error matrix was generated to assess the quality and accuracy of the classification algorithms. Comparing the results of the two methods on a per-class basis, ANN had a crisper output compared to PCM, which yielded clusters with pixels especially on the moderate resolution Landsat 8 imagery.
Extremely deformable structures
2015-01-01
Recently, a new research stimulus has derived from the observation that soft structures, such as biological systems, but also rubber and gel, may work in a post critical regime, where elastic elements are subject to extreme deformations, though still exhibiting excellent mechanical performances. This is the realm of ‘extreme mechanics’, to which this book is addressed. The possibility of exploiting highly deformable structures opens new and unexpected technological possibilities. In particular, the challenge is the design of deformable and bi-stable mechanisms which can reach superior mechanical performances and can have a strong impact on several high-tech applications, including stretchable electronics, nanotube serpentines, deployable structures for aerospace engineering, cable deployment in the ocean, but also sensors and flexible actuators and vibration absorbers. Readers are introduced to a variety of interrelated topics involving the mechanics of extremely deformable structures, with emphasis on ...
Retrieval and classification of food images.
Farinella, Giovanni Maria; Allegra, Dario; Moltisanti, Marco; Stanco, Filippo; Battiato, Sebastiano
2016-10-01
Automatic food understanding from images is an interesting challenge with applications in different domains. In particular, food intake monitoring is becoming more and more important because of the key role it plays in health and market economies. In this paper, we address the study of food image processing from the perspective of Computer Vision. As a first contribution we present a survey of the studies in the context of food image processing, from the early attempts to the current state-of-the-art methods. Since retrieval and classification engines able to work on food images are required to build automatic systems for diet monitoring (e.g., to be embedded in wearable cameras), we focus our attention on the representation of food images, because it plays a fundamental role in the understanding engines. Food retrieval and classification is a challenging task since food presents high variability and an intrinsic deformability. To properly study the peculiarities of different image representations we propose the UNICT-FD1200 dataset. It is composed of 4754 food images of 1200 distinct dishes acquired during real meals. Each food plate is acquired multiple times and the overall dataset presents both geometric and photometric variability. The images of the dataset have been manually labeled considering 8 categories: Appetizer, Main Course, Second Course, Single Course, Side Dish, Dessert, Breakfast, Fruit. We have performed tests employing different state-of-the-art representations to assess the related performances on the UNICT-FD1200 dataset. Finally, we propose a new representation based on the perceptual concept of Anti-Textons, which is able to encode spatial information between Textons and outperforms other representations in the context of food retrieval and classification. Copyright © 2016 Elsevier Ltd. All rights reserved.
Multiview Discriminative Geometry Preserving Projection for Image Classification
Directory of Open Access Journals (Sweden)
Ziqiang Wang
2014-01-01
Full Text Available In many image classification applications, it is common to extract multiple visual features from different views to describe an image. Since different visual features have their own specific statistical properties and discriminative powers for image classification, the conventional solution for multiple view data is to concatenate these feature vectors as a new feature vector. However, this simple concatenation strategy not only ignores the complementary nature of different views, but also ends up with “curse of dimensionality.” To address this problem, we propose a novel multiview subspace learning algorithm in this paper, named multiview discriminative geometry preserving projection (MDGPP for feature extraction and classification. MDGPP can not only preserve the intraclass geometry and interclass discrimination information under a single view, but also explore the complementary property of different views to obtain a low-dimensional optimal consensus embedding by using an alternating-optimization-based iterative algorithm. Experimental results on face recognition and facial expression recognition demonstrate the effectiveness of the proposed algorithm.
Deforming tachyon kinks and tachyon potentials
International Nuclear Information System (INIS)
Afonso, Victor I.; Bazeia, Dionisio; Brito, Francisco A.
2006-01-01
In this paper we investigate deformation of tachyon potentials and tachyon kink solutions. We consider the deformation of a DBI type action with gauge and tachyon fields living on D1-brane and D3-brane world-volume. We deform tachyon potentials to get other consistent tachyon potentials by using properly a deformation function depending on the gauge field components. Resolutions of singular tachyon kinks via deformation and applications of deformed tachyon potentials to scalar cosmology scenario are discussed
Intelligent Computer Vision System for Automated Classification
International Nuclear Information System (INIS)
Jordanov, Ivan; Georgieva, Antoniya
2010-01-01
In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.
Statistical methods for searching for inundated radioactive entities
International Nuclear Information System (INIS)
Dubasov, Yu.V.; Krivokhatskij, A.S.; Khramov, N.N.
1993-01-01
The problem of searching for a flooded radioactive object in a given area is considered. Various models for plotting the search route are discussed. It is shown that a spiral route through random points from the centre of the examined area is the most efficient one. The conclusion is made that, when searching for flooded radioactive objects, it is advisable to use multidimensional statistical methods of classification
A population based statistical model for daily geometric variations in the thorax
Szeto, Yenny Z.; Witte, Marnix G.; van Herk, Marcel; Sonke, Jan-Jakob
2017-01-01
To develop a population based statistical model of the systematic interfraction geometric variations between the planning CT and first treatment week of lung cancer patients for inclusion as uncertainty term in future probabilistic planning. Deformable image registrations between the planning CT and
Classification of Grassland Successional Stages Using Airborne Hyperspectral Imagery
Directory of Open Access Journals (Sweden)
Thomas Möckel
2014-08-01
Full Text Available Plant communities differ in their species composition, and, thus, also in their functional trait composition, at different stages in the succession from arable fields to grazed grassland. We examine whether aerial hyperspectral (414–2501 nm) remote sensing can be used to discriminate between grazed vegetation belonging to different grassland successional stages. Vascular plant species were recorded in 104.1 m² plots on the island of Öland (Sweden) and the functional properties of the plant species recorded in the plots were characterized in terms of the ground-cover of grasses, specific leaf area and Ellenberg indicator values. Plots were assigned to three different grassland age-classes, representing 5–15, 16–50 and >50 years of grazing management. Partial least squares discriminant analysis models were used to compare classifications based on aerial hyperspectral data with the age-class classification. The remote sensing data successfully classified the plots into age-classes: the overall classification accuracy was higher for a model based on a pre-selected set of wavebands (85%, Kappa statistic value = 0.77) than one using the full set of wavebands (77%, Kappa statistic value = 0.65). Our results show that nutrient availability and grass cover differences between grassland age-classes are detectable by spectral imaging. These techniques may potentially be used for mapping the spatial distribution of grassland habitats at different successional stages.
Robert Spitzer and psychiatric classification: technical challenges and ethical dilemmas.
Jacob, K S
2016-01-01
Dr Robert Leopold Spitzer (May 22, 1932-December 25, 2015), the architect of modern psychiatric diagnostic criteria and classification, died recently at the age of 83 in Seattle. Under his leadership, the American Psychiatric Association's (APA) Diagnostic and Statistical Manuals (DSM) became the international standard.
Iannacone, J.; Berti, M.; Allievi, J.; Del Conte, S.; Corsini, A.
2013-12-01
Space borne InSAR has proven to be very valuable for landslide detection. In particular, extremely slow landslides (Cruden and Varnes, 1996) can now be clearly identified, thanks to the millimetric precision reached by recent multi-interferometric algorithms. The typical approach in radar interpretation for landslide mapping is based on the average annual velocity of the deformation, calculated over the entire time series. The Hotspot and Cluster Analysis (Lu et al., 2012) and the PSI-based matrix approach (Cigna et al., 2013) are examples of landslide mapping techniques based on average annual velocities. However, slope movements can be affected by non-linear deformation trends (i.e. reactivation of dormant landslides, deceleration due to natural or man-made slope stabilization, seasonal activity, etc.). Therefore, analyzing deformation time series is crucial in order to fully characterize slope dynamics. While this is relatively simple to carry out manually when dealing with small datasets, time series analysis over regional-scale datasets requires automated classification procedures. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The analysis classifies the time series into six distinctive target trends (0=uncorrelated; 1=linear; 2=quadratic; 3=bilinear; 4=discontinuous without constant velocity; 5=discontinuous with change in velocity) which are likely to represent different slope processes. The analysis also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. All the classification algorithms were integrated into a Graphical User Interface called PSTime. We investigated an area of about 2000 km² in the Northern Apennines of Italy by using the SqueeSAR™ algorithm (Ferretti et al., 2011): two Radarsat-1 data stacks, comprising 112 scenes in descending orbit and 124 scenes in ascending orbit
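Berti et al.'s procedure is a sequence of statistical tests; a heavily simplified stand-in for the first three target trends (uncorrelated, linear, quadratic) can be built from correlation coefficients alone. The thresholds and series below are illustrative, not the paper's actual tests:

```python
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va > 0 and vb > 0 else 0.0

def classify_trend(t, y, r_min=0.5):
    """Crude stand-in for a sequence of statistical tests:
    returns 0 (uncorrelated), 1 (linear) or 2 (quadratic)."""
    if abs(pearson(t, y)) < r_min:
        return 0
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    slope = (sum((a - tbar) * (b - ybar) for a, b in zip(t, y))
             / sum((a - tbar) ** 2 for a in t))
    resid = [b - (ybar + slope * (a - tbar)) for a, b in zip(t, y)]
    if math.sqrt(sum(r * r for r in resid)) < 1e-9:  # perfect straight line
        return 1
    # Curvature left in the residuals indicates acceleration/deceleration.
    curvature = [(a - tbar) ** 2 for a in t]
    return 2 if abs(pearson(curvature, resid)) > r_min else 1

times = [float(i) for i in range(21)]
linear_series = [-2.0 * i for i in times]          # steady motion
quadratic_series = [-0.1 * i * i for i in times]   # accelerating motion
```

The real procedure adds bilinear and discontinuous models plus significance tests, but the structure is the same: fit nested models and keep the simplest one the data supports.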
EEG Eye State Identification Using Incremental Attribute Learning with Time-Series Classification
Directory of Open Access Journals (Sweden)
Ting Wang
2014-01-01
Full Text Available Eye state identification is a common time-series classification problem and a hot spot in recent research. Electroencephalography (EEG) is widely used in eye state classification to detect a human's cognitive state. Previous research has validated the feasibility of machine learning and statistical approaches for EEG eye state classification. This paper proposes a novel approach for EEG eye state identification using incremental attribute learning (IAL) based on neural networks. IAL is a novel machine learning strategy which gradually imports and trains features one by one. Previous studies have verified that such an approach is applicable to a number of pattern recognition problems. However, in these previous works, little research on IAL focused on its application to time-series problems, so it was still unknown whether IAL could be employed to cope with time-series problems like EEG eye state classification. Experimental results in this study demonstrate that, with proper feature extraction and feature ordering, IAL can not only efficiently cope with time-series classification problems, but also exhibit better classification performance, in terms of classification error rates, than conventional and some other approaches.
Passias, Peter G; Horn, Samantha R; Jalai, Cyrus M; Poorman, Gregory; Bono, Olivia J; Ramchandran, Subaraman; Smith, Justin S; Scheer, Justin K; Sciubba, Daniel M; Hamilton, D Kojo; Mundis, Gregory; Oh, Cheongeun; Klineberg, Eric O; Lafage, Virginie; Shaffrey, Christopher I; Ames, Christopher P
2017-11-01
Complication rates for adult cervical deformity are poorly characterized given the complexity and heterogeneity of cases. To compare perioperative complication rates following adult cervical deformity corrective surgery between a prospective multicenter database for patients with cervical deformity (PCD) and the Nationwide Inpatient Sample (NIS). Retrospective review of prospective databases. A total of 11,501 adult patients with cervical deformity (11,379 patients from the NIS and 122 patients from the PCD database). Perioperative medical and surgical complications. The NIS was queried (2001-2013) for cervical deformity discharges for patients ≥18 years undergoing cervical fusions using International Classification of Disease, Ninth Revision (ICD-9) coding. Patients ≥18 years from the PCD database (2013-2015) were selected. Equivalent complications were identified and rates were compared. Bonferroni correction (pdatabases. A total of 11,379 patients from the NIS database and 122 patients from the PCD database were identified. Patients from the PCD database were older (62.49 vs. 55.15, pdatabase. The PCD database had an increased risk of reporting overall complications than the NIS (odds ratio: 2.81, confidence interval: 1.81-4.38). Only device-related complications were greater in the NIS (7.1% vs. 1.1%, p=.007). Patients from the PCD database displayed higher rates of the following complications: peripheral vascular (0.8% vs. 0.1%, p=.001), gastrointestinal (GI) (2.5% vs. 0.2%, pdatabases (p>.004). Based on surgical approach, the PCD reported higher GI and neurologic complication rates for combined anterior-posterior procedures (pdatabase revealed higher overall and individual complication rates and higher data granularity. The nationwide database may underestimate complications of patients with adult cervical deformity (ACD), particularly in regard to perioperative surgical details, owing to coding and deformity generalizations. The surgeon-maintained database
Empirical evaluation of data normalization methods for molecular classification.
Huang, Huei-Chung; Qin, Li-Xuan
2018-01-01
Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers, an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in an independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.
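This abstract does not name its three normalization methods; as an illustrative example of correcting a location-shift handling effect, a simple per-array median normalization removes a constant offset between handling batches:

```python
def median_normalize(arrays):
    """Subtract each array's median so that location-shift handling
    effects do not masquerade as biological signal. This is one common
    normalization, not necessarily one of the three methods in the paper."""
    out = []
    for arr in arrays:
        s = sorted(arr)
        n = len(s)
        med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
        out.append([v - med for v in arr])
    return out

# Two "arrays" measuring the same sample under different handling batches:
batch1 = [1.0, 2.0, 3.0, 10.0]
batch2 = [2.5, 3.5, 4.5, 11.5]  # same expression profile, shifted by +1.5
norm1, norm2 = median_normalize([batch1, batch2])
```

After normalization the two batches are identical, so a classifier trained on one batch no longer keys on the handling offset; scale-change effects, by contrast, need a spread-based correction.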
Man'ko, V I
1993-01-01
Brownian motion may be embedded in the Fock space of a bosonic free field in one dimension. Extending this correspondence to a family of creation and annihilation operators satisfying a q-deformed algebra, the notion of q-deformation is carried from the algebra to the domain of stochastic processes. The properties of q-deformed Brownian motion, in particular its non-Gaussian nature and cumulant structure, are established.
Effective Packet Number for 5G IM WeChat Application at Early Stage Traffic Classification
Directory of Open Access Journals (Sweden)
Muhammad Shafiq
2017-01-01
Full Text Available Accurate network traffic classification at an early stage is very important for 5G network applications. During the last few years, researchers have worked hard to propose effective machine learning models for classifying Internet traffic applications at an early stage with few packets. Nevertheless, this essential problem still needs to be studied profoundly to find both an effective packet number and an effective machine learning (ML) model. In this paper, we address this problem. For this purpose, five Internet traffic datasets are utilized. Initially, we extract the sizes of the first 20 packets, and then mutual information analysis is carried out to find the mutual information of each packet with the flow type. Thereafter, we run 10 well-known machine learning algorithms using a crossover classification method. Two statistical tests, the Friedman and Wilcoxon pairwise tests, are applied to the experimental results. Moreover, we also apply statistical tests to the classifiers to find the most effective ML classifier. Our experimental results show that 13–19 packets are the effective packet numbers for early-stage network traffic classification of the 5G IM WeChat application. We also find that the Random Forest classifier is the most effective at early-stage Internet traffic classification.
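The per-packet mutual information analysis can be sketched as follows (a generic discrete MI estimate on toy flows; the binning of packet sizes and the flow labels below are illustrative assumptions, not the paper's datasets):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits from paired samples of a discrete feature
    (e.g. a binned packet size) and a flow-type label."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log2( p_joint / (p_x * p_y) )
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# toy flows: packet-size bin 0 occurs only in 'chat', bin 1 only in 'video'
sizes  = [0, 0, 0, 1, 1, 1, 0, 1]
labels = ['chat', 'chat', 'chat', 'video', 'video', 'video', 'chat', 'video']
print(round(mutual_information(sizes, labels), 3))  # 1.0
```

A perfectly informative packet feature yields MI equal to the label entropy (here 1 bit); in the study this quantity is computed per packet position to rank which of the first 20 packets carry the most information about the flow type.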
Deformation properties of sedimentary rocks in the process of underground coal gasification
Directory of Open Access Journals (Sweden)
Mirosława Bukowska
2015-01-01
Full Text Available The article presents results of research into changes in deformation properties of rocks, under the influence of temperature, during the process of underground coal gasification. Samples of carboniferous sedimentary rocks (claystones and sandstones, collected in different areas of the Upper Silesian Coal Basin (GZW, were heated at temperatures between 100 and 1000–1200 °C, and then subjected to uniaxial compression tests to obtain full stress–strain curves of the samples and determine values of residual strain and Poisson's ratio. To compare the obtained values of deformation parameters of rocks, tested in the dry-air state and after heating in a given range of temperature, normalised values of residual strain and Poisson's ratio were determined. Based on them, a coefficient of the influence of temperature on the tested deformation parameters was determined. The obtained values of the coefficient can be applied in mining practice to forecast the deformability of gangue during underground coal gasification, when claystones or sandstones are in the direct surroundings of a georeactor. The obtained results were analysed based on a classification of the uniaxial compression strength of GZW gangue, which formed the basis for dividing claystones and sandstones into very low, low, medium and high uniaxial compression strength rocks. Based on the conducted tests, it was concluded that the influence of uniaxial compression strength on the value of residual strain, unlike the influence of the grain size of sandstones, is unambiguous within the range of changes in the parameter. Among claystones, changes in the value of Poisson's ratio depending on their initial strength were observed. Sandstones of different grain sizes showed either an increase or a decrease in Poisson's ratio in comparison with the value determined at room temperature in dry-air conditions.
Nikitin, S. Yu.; Priezzhev, A. V.; Lugovtsov, A. E.; Ustinov, V. D.; Razgulin, A. V.
2014-10-01
The paper is devoted to the development of the laser ektacytometry technique for evaluating the statistical characteristics of inhomogeneous ensembles of red blood cells (RBCs). We have theoretically analyzed laser beam scattering by inhomogeneous ensembles of elliptical discs, modeling red blood cells in the ektacytometer. The analysis shows that the laser ektacytometry technique allows for quantitative evaluation of such population characteristics of RBCs as the mean cell shape, the variance of cell deformability, and the asymmetry of the deformability distribution. Moreover, we show that the deformability distribution itself can be retrieved by solving a specific Fredholm integral equation of the first kind. At this stage we do not take into account the scatter in RBC sizes.
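The retrieval step mentioned above amounts to an equation of the following general form (the symbols are illustrative; the paper's specific kernel is not reproduced here):

```latex
g(s) \;=\; \int_{0}^{\infty} K(s,\mu)\, f(\mu)\, d\mu ,
```

where $g(s)$ is the measured diffraction profile, $K(s,\mu)$ is the single-cell scattering kernel for deformability $\mu$, and $f(\mu)$ is the sought deformability distribution. As a Fredholm equation of the first kind it is ill-posed, so the inversion requires regularization.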
Statistical yearbook 2005. Data available as of March 2006. 50 ed
International Nuclear Information System (INIS)
2006-08-01
The Statistical Yearbook is an annual compilation of a wide range of international economic, social and environmental statistics on over 200 countries and areas, compiled from sources including UN agencies and other international, national and specialized organizations. The 50th issue contains data available to the Statistics Division as of March 2006 and presents them in 76 tables. The number of years of data shown in the tables varies from one to ten, with the ten-year tables covering 1994 to 2003 or 1995 to 2004. Accompanying the tables are technical notes providing brief descriptions of major statistical concepts, definitions and classifications.
DEFF Research Database (Denmark)
Jakobsen, Bo
2006-01-01
The main goal of the study presented in this thesis was to perform in-situ investigations on deformation structures in plastically deformed polycrystalline copper at low degrees of tensile deformation (model system for cell-forming pure fcc metals. A novel synchrotron...... grains in polycrystalline samples during tensile deformation. We have shown that the resulting 3D reciprocal space maps from tensile deformed copper comprise a pronounced structure, consisting of bright sharp peaks superimposed on a cloud of enhanced intensity. Based on the integrated intensity......, the width of the peaks, and spatial scanning experiments, it is concluded that the individual peaks arise from individual dislocation-free regions (the subgrains) in the dislocation structure. The cloud is attributed to the dislocation-rich walls. Samples deformed to 2% tensile strain were investigated under...
Direct Learning of Systematics-Aware Summary Statistics
CERN. Geneva
2018-01-01
Complex machine learning tools, such as deep neural networks and gradient boosting algorithms, are increasingly being used to construct powerful discriminative features for High Energy Physics analyses. These methods are typically trained with simulated or auxiliary data samples by optimising some classification or regression surrogate objective. The learned feature representations are then used to build a sample-based statistical model to perform inference (e.g. interval estimation or hypothesis testing) over a set of parameters of interest. However, the effectiveness of this approach can be reduced by the presence of known uncertainties that cause differences between training and experimental data, included in the statistical model via nuisance parameters. This work presents an end-to-end algorithm, which leverages existing deep learning technologies but directly aims to produce inference-optimal sample-summary statistics. By including the statistical model and a differentiable approximation of ...
Household Classification Using Smart Meter Data
Directory of Open Access Journals (Sweden)
Carroll Paula
2018-03-01
Full Text Available This article describes a project conducted in conjunction with the Central Statistics Office of Ireland in response to a planned national rollout of smart electricity metering in Ireland. We investigate how this new data source might be used for the purpose of official statistics production. This study specifically looks at the question of determining household composition from electricity smart meter data using both Neural Networks (a supervised machine learning approach) and Elastic Net logistic regression. An overview of both classification techniques is given. Results for both approaches are presented with analysis. We find that the smart meter data alone is limited in its capability to distinguish between household categories but that it does provide some useful insights.
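A minimal sketch of elastic-net logistic regression of the kind named above, fit by proximal gradient descent on synthetic data (this is a generic illustration, not the CSO study's model, features, or data):

```python
import numpy as np

def elastic_net_logistic(x, y, lam=0.1, alpha=0.5, lr=0.1, n_iter=2000):
    """Logistic regression with an elastic-net penalty
        lam * ( alpha * ||w||_1 + (1 - alpha)/2 * ||w||_2^2 ),
    fit by proximal gradient descent; the soft-threshold step
    handles the non-smooth L1 part and induces sparsity."""
    n, p = x.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-(x @ w + b)))
        grad_w = x.T @ (prob - y) / n + lam * (1 - alpha) * w
        grad_b = (prob - y).mean()
        w -= lr * grad_w
        b -= lr * grad_b
        # proximal (soft-thresholding) step for the L1 term
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam * alpha, 0.0)
    return w, b

rng = np.random.default_rng(1)
x = rng.normal(size=(200, 5))
y = (x[:, 0] - x[:, 1] > 0).astype(float)  # only two features matter
w, b = elastic_net_logistic(x, y)
pred = (x @ w + b > 0).astype(float)
print((pred == y).mean())
```

The L1 component tends to zero out the three irrelevant coefficients while the L2 component stabilizes the two informative ones, which is the usual motivation for the elastic net over a pure lasso.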
Effect of alloy deformation on the average spacing parameters of non-deforming particles
International Nuclear Information System (INIS)
Fisher, J.; Gurland, J.
1980-02-01
It is shown on the basis of stereological definitions and a few simple experiments that the commonly used average dispersion parameters, the area fraction (A_A)_β, the areal particle density N_Aβ and the mean free path λ_α, remain invariant during plastic deformation in the case of non-deforming equiaxed particles. Directional effects on the spacing parameters N_Aβ and λ_α arise during uniaxial deformation by rotation and preferred orientation of non-equiaxed particles. Particle arrangement in stringered or layered structures and the effect of deformation on nearest-neighbor distances of particles and voids are briefly discussed in relation to strength and fracture theories.
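For reference, the standard stereological relation behind the invariance claim links these parameters; in the usual notation (e.g., Underwood's), the mean free path between particles along a test line is

```latex
\lambda_{\alpha} \;=\; \frac{1 - (A_A)_{\beta}}{N_L} ,
```

where $N_L$ is the number of particle interceptions per unit length of test line. Since plastic deformation of a matrix containing non-deforming equiaxed particles leaves both the area fraction and $N_L$ unchanged on average, $\lambda_{\alpha}$ is invariant as well.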
Modeling time-to-event (survival) data using classification tree analysis.
Linden, Ariel; Yarnold, Paul R
2017-12-01
Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (i.e., easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
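The Brier score machinery used above for model comparison can be sketched for the uncensored case (the paper's integrated Brier score additionally reweights for censoring via inverse probability weights, which this toy version omits):

```python
def brier_score(pred_surv, times, horizon):
    """Brier score at one horizon for uncensored data: mean squared
    difference between the predicted survival probability and the
    observed survival indicator 1{T > horizon}."""
    return sum((float(t > horizon) - p) ** 2
               for p, t in zip(pred_surv, times)) / len(times)

def integrated_brier_score(pred_curves, times, grid):
    """Trapezoidal average of the Brier score over a grid of horizons.
    pred_curves[j][i] is the predicted P(T_i > grid[j]) for subject i."""
    scores = [brier_score(p, times, h) for p, h in zip(pred_curves, grid)]
    area = sum((b - a) * (sa + sb) / 2.0
               for a, b, sa, sb in zip(grid, grid[1:], scores, scores[1:]))
    return area / (grid[-1] - grid[0])

times = [2.0, 4.0, 6.0, 8.0]          # observed event times
grid = [1.0, 3.0, 5.0, 7.0]           # evaluation horizons
perfect = [[float(t > h) for t in times] for h in grid]
print(integrated_brier_score(perfect, times, grid))  # 0.0
```

A perfect oracle scores 0, while an uninformative constant prediction of 0.5 scores 0.25, which bounds how the Cox and CTA models' integrated Brier scores are read.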
A New Classification Approach Based on Multiple Classification Rules
Zhongmei Zhou
2014-01-01
A good classifier can correctly predict new data for which the class label is unknown, so it is important to construct a high-accuracy classifier. Hence, classification techniques are very useful in ubiquitous computing. Associative classification achieves higher classification accuracy than some traditional rule-based classification approaches. However, the approach also has two major deficiencies. First, it generates a very large number of association classification rules, especially when t...
Friese, Uwe; Meindl, Thomas; Herpertz, Sabine C; Reiser, Maximilian F; Hampel, Harald; Teipel, Stefan J
2010-01-01
We report evidence that multivariate analyses of deformation-based morphometry and diffusion tensor imaging (DTI) data can be used to discriminate between healthy participants and patients with Alzheimer's disease (AD) with comparable diagnostic accuracy. In contrast to other studies on MRI-based biomarkers, which usually focus on a single modality, we derived deformation maps from high-dimensional normalization of T1-weighted images, as well as mean diffusivity maps and fractional anisotropy maps from DTI of the same group of 21 patients with AD and 20 healthy controls. Using an automated multivariate analysis of the entire brain volume, widespread decreased white matter integrity and atrophy effects were found in cortical and subcortical regions of AD patients. Mean diffusivity maps and deformation maps were equally effective in discriminating between AD patients and controls (AUC = 0.88 vs. AUC = 0.85), while fractional anisotropy maps performed slightly worse. Combining the maps from different modalities in a logistic regression model resulted in a classification accuracy of AUC = 0.86 after leave-one-out cross-validation. It remains to be shown if this automated multivariate analysis of DTI measures can improve early diagnosis of AD in predementia stages.
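Leave-one-out cross-validated AUC, as reported above, can be computed along these lines (the mean-difference projection below is an illustrative stand-in classifier, not the study's logistic model or imaging data):

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity;
    assumes continuous scores (no ties)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def loo_scores(x, y):
    """Leave-one-out score for each subject: refit on the others and
    project the held-out sample on the difference of class means."""
    out = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        m1 = x[mask & (y == 1)].mean(axis=0)
        m0 = x[mask & (y == 0)].mean(axis=0)
        out[i] = x[i] @ (m1 - m0)
    return out

rng = np.random.default_rng(2)
y = np.repeat([0, 1], 20)
x = rng.normal(size=(40, 10))
x[y == 1] += 0.8  # synthetic group effect across 10 "features"
print(round(auc(loo_scores(x, y), y), 3))
```

Refitting inside each leave-one-out fold, as here, is what keeps the AUC estimate honest; scoring with a model fit on all subjects would inflate it.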
An innovative and efficient method to control the shape of push-pull membrane deformable mirror
Polo, A.; Haber, A.; Pereira, S.F.; Verhaegen, M.H.G.; Urbach, H.P.
2012-01-01
We carry out performance characterisation of a commercial push and pull deformable mirror with 48 actuators (Adaptica Srl). We present a detailed description of the system as well as a statistical approach on the identification of the mirror influence function. A new efficient control algorithm to
Cosmetic and Functional Nasal Deformities
... nasal complaints. Nasal deformity can be categorized as “cosmetic” or “functional.” Cosmetic deformity of the nose results in a less ... taste , nose bleeds and/or recurrent sinusitis . A cosmetic or functional nasal deformity may occur secondary to ...
78 FR 68983 - Cotton Futures Classification: Optional Classification Procedure
2013-11-18
...-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing... regulations to allow for the addition of an optional cotton futures classification procedure--identified and... response to requests from the U.S. cotton industry and ICE, AMS will offer a futures classification option...
Directory of Open Access Journals (Sweden)
Jasmine eGrimsley
2013-01-01
Full Text Available Mouse pups vocalize at high rates when they are cold or isolated from the nest. The proportions of each syllable type produced carry information about disease state and are being used as behavioral markers for the internal state of animals. Manual classifications of these vocalizations identified ten syllable types based on their spectro-temporal features. However, manual classification of mouse syllables is time consuming and vulnerable to experimenter bias. This study uses an automated cluster analysis to identify acoustically distinct syllable types produced by CBA/CaJ mouse pups, and then compares the results to prior manual classification methods. The cluster analysis identified two syllable types, based on their frequency bands, that have continuous frequency-time structure, and two syllable types featuring abrupt frequency transitions. Although cluster analysis computed fewer syllable types than manual classification, the clusters represented well the probability distributions of the acoustic features within syllables. These probability distributions indicate that some of the manually classified syllable types are not statistically distinct. The characteristics of the four classified clusters were used to generate a Microsoft Excel-based mouse syllable classifier that rapidly categorizes syllables, with over a 90% match, into the syllable types determined by cluster analysis.
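An automated cluster analysis along the lines described can be sketched with plain k-means (the two features, the cluster layout, and the deterministic initialization are simplifications for the demo; the study clustered richer spectro-temporal features):

```python
import numpy as np

def kmeans(feats, k, init_idx, n_iter=50):
    """Plain k-means over per-syllable acoustic features (here just a
    carrier frequency and a frequency-jump size). init_idx picks the
    starting centres from the data for a deterministic demo."""
    centers = feats[init_idx].astype(float).copy()
    assign = np.zeros(len(feats), dtype=int)
    for _ in range(n_iter):
        d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            if (assign == j).any():
                centers[j] = feats[assign == j].mean(axis=0)
    return assign, centers

# toy syllables: two frequency bands x {flat, abrupt jump} -> 4 clusters,
# mirroring the two continuous and two frequency-jump types in the study
rng = np.random.default_rng(3)
base = np.array([[40.0, 0.0], [40.0, 20.0], [80.0, 0.0], [80.0, 20.0]])
feats = np.repeat(base, 30, axis=0) + rng.normal(scale=1.0, size=(120, 2))
assign, _ = kmeans(feats, 4, init_idx=[0, 30, 60, 90])
```

With well-separated feature clouds the assignments recover the four generating types almost exactly; on real syllable data the number and identity of stable clusters is the empirical question the study addresses.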
DEFF Research Database (Denmark)
Colone, L.; Hovgaard, K.; Glavind, Lars
2018-01-01
A method for mass change detection on wind turbine blades using natural frequencies is presented. The approach is based on two statistical tests. The first test decides if there is a significant mass change and the second test is a statistical group classification based on Linear Discriminant Ana...
Rigo, Manuel
2011-01-01
Progressive adolescent idiopathic scoliosis (AIS) produces specific signs and symptoms, including trunk and spinal deformity and imbalance, impairment of breathing function, pain, progression during adult life, and psychological problems, as a whole resulting in an alteration of the health-related quality of life. A scoliosis-specific rehabilitation program attempts to prevent, improve, or minimize these signs and symptoms by using exercises and braces as the main tools in the rehabilitation treatment. Patient evaluation is an essential point in the decision-making process and determines the selection of the specific exercises and the specifications of the brace design. However, this article is not addressed to scoliosis management. In this present article, a complete definition and discussion of radiological aspects, such as the Cobb angle, axial rotation, curve pattern classifications, and sagittal configuration, follow a short description of the three-dimensional nature of AIS. The relationship between AIS and growth is also discussed. There is also a section dedicated to the assessment of trunk deformity and back asymmetry. Other important clinical aspects, such as pain and disability, changes in other regions of the body, muscular balance, breathing function, and health-related quality of life, are not discussed in this present article.
International Nuclear Information System (INIS)
Sabaton, M.; Viollet, P.L.; Darles, A.; Gland, H.
1980-07-01
The PANACH three-dimensional calculation code, developed from tests on a small-scale model and validated by full-scale measurement campaigns, was used to estimate a three-dimensional statistic of plumes. Since computation times do not permit a separate calculation for each radiosonde sounding, a classification method was adopted. This method, developed by the French National Meteorological Office, is based on a double classification comprising basic classes, in which the plumes are assumed to be dynamically similar, and a sub-classification to take better account of the true moisture profiles. This statistical method was then applied to the case of 2 or 4 1300 MWe units fitted with natural draught cooling towers of the wet, dry or wet-dry types.
The ANACONDA algorithm for deformable image registration in radiotherapy
International Nuclear Information System (INIS)
Weistrand, Ola; Svensson, Stina
2015-01-01
Purpose: The purpose of this work was to describe a versatile algorithm for deformable image registration with applications in radiotherapy and to validate it on thoracic 4DCT data as well as CT/cone beam CT (CBCT) data. Methods: The ANAtomically CONstrained Deformation Algorithm (ANACONDA) combines image information (i.e., intensities) with anatomical information as provided by contoured image sets. The registration problem is formulated as a nonlinear optimization problem and solved with an in-house developed solver, tailored to this problem. The objective function, which is minimized during optimization, is a linear combination of four nonlinear terms: 1. an image similarity term; 2. a grid regularization term, which aims at keeping the deformed image grid smooth and invertible; 3. a shape-based regularization term, which works to keep the deformation anatomically reasonable when regions of interest are present in the reference image; and 4. a penalty term, which is added to the optimization problem when controlling structures are used, aimed at deforming the selected structure in the reference image to the corresponding structure in the target image. Results: To validate ANACONDA, the authors have used 16 publicly available thoracic 4DCT data sets for which target registration errors from several algorithms have been reported in the literature. On average for the 16 data sets, the target registration error is 1.17 ± 0.87 mm, the Dice similarity coefficient is 0.98 for the two lungs, and image similarity, measured by the correlation coefficient, is 0.95. The authors have also validated ANACONDA using two pelvic cases and one head and neck case with planning CT and daily acquired CBCT. Each image has been contoured by a physician (radiation oncologist) or experienced radiation therapist. The results are an improvement with respect to rigid registration. However, for the head and neck case, the sample set is too small to show statistical significance. Conclusions: ANACONDA
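The four-term objective described in the Methods can be summarized schematically (the weights and symbols below are illustrative, not the paper's notation):

```latex
f(v) \;=\; w_1\, C(v) \;+\; w_2\, R_{\mathrm{grid}}(v)
       \;+\; w_3\, R_{\mathrm{shape}}(v) \;+\; w_4\, P(v) ,
```

where $v$ is the deformation field, $C$ the image similarity term, $R_{\mathrm{grid}}$ the smoothness/invertibility regularizer, $R_{\mathrm{shape}}$ the anatomy-based regularizer from contoured regions of interest, and $P$ the penalty term for controlling structures.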
Nishihara, Yu; Ohuchi, Tomohiro; Kawazoe, Takaaki; Seto, Yusuke; Maruyama, Genta; Higo, Yuji; Funakoshi, Ken-ichi; Tange, Yoshinori; Irifune, Tetsuo
2018-05-01
Shear and uniaxial deformation experiments on hexagonal close-packed iron (hcp-Fe) were conducted using a deformation-DIA apparatus at a pressure of 13-17 GPa and a temperature of 723 K to determine its deformation-induced crystallographic preferred orientation (CPO). Development of the CPO in the deforming sample is determined in situ based on two-dimensional X-ray diffraction using monochromatic synchrotron X-rays. In the shear deformation geometry, the c and a axes gradually align to be sub-parallel to the shear plane normal and shear direction, respectively, from the initial random texture. In the uniaxial compression and tensile geometries, the c and a axes, respectively, gradually align along the direction of the uniaxial deformation axis. These results suggest that basal slip (0001) is the dominant slip system in hcp-Fe under the studied deformation conditions. The P-wave anisotropy for a shear-deformed sample was calculated using elastic constants at inner core conditions from recent ab initio calculations. The strength of the calculated anisotropy was comparable to or higher than the axisymmetric anisotropy in Earth's inner core.
Neutron halo in deformed nuclei
International Nuclear Information System (INIS)
Zhou Shangui; Meng Jie; Ring, P.; Zhao Enguang
2010-01-01
Halo phenomena in deformed nuclei are investigated within a deformed relativistic Hartree-Bogoliubov (DRHB) theory. These weakly bound quantum systems present interesting examples for the study of the interdependence between the deformation of the core and the particles in the halo. Contributions of the halo, deformation effects, and large spatial extensions of these systems are described in a fully self-consistent way by the DRHB equations in a spherical Woods-Saxon basis with the proper asymptotic behavior at a large distance from the nuclear center. Magnesium and neon isotopes are studied and detailed results are presented for the deformed neutron-rich and weakly bound nucleus 44Mg. The core of this nucleus is prolate, but the halo has a slightly oblate shape. This indicates a decoupling of the halo orbitals from the deformation of the core. The generic conditions for the occurrence of this decoupling effect are discussed.
Mechanics of deformable bodies
Sommerfeld, Arnold Johannes Wilhelm
1950-01-01
Mechanics of Deformable Bodies: Lectures on Theoretical Physics, Volume II covers topics on the mechanics of deformable bodies. The book discusses the kinematics, statics, and dynamics of deformable bodies; the vortex theory; as well as the theory of waves. The text also describes the flow with given boundaries. Supplementary notes on selected hydrodynamic problems and supplements to the theory of elasticity are provided. Physicists, mathematicians, and students taking related courses will find the book useful.
Quantum deformed magnon kinematics
Gómez, César; Hernández Redondo, Rafael
2007-01-01
The dispersion relation for planar N=4 supersymmetric Yang-Mills is identified with the Casimir of a quantum deformed two-dimensional kinematical symmetry, E_q(1,1). The quantum deformed symmetry algebra is generated by the momentum, energy and boost, with deformation parameter q=e^{2\\pi i/\\lambda}. Representing the boost as the infinitesimal generator for translations on the rapidity space leads to an elliptic uniformization with crossing transformations implemented through translations by t...
Inter- and intraobserver reliability of the MTM-classification for proximal humeral fractures
DEFF Research Database (Denmark)
Bahrs, Christian; Schmal, Hagen; Lingenfelter, Erich
2008-01-01
tool. METHODS: Three observers classified plain radiographs of 22 fractures using both a simple version (fracture displacement, number of parts) and an extensive version (individual topographic fracture type and morphology) of the MTM classification. Kappa-statistics were used to determine reliability....... RESULTS: An acceptable reliability was found for the simple version classifying fracture displacement and fractured main parts. Fair interobserver agreement was found for the extensive version with individual topographic fracture type and morphology. CONCLUSION: Although the MTM-classification covers...
Audio Classification in Speech and Music: A Comparison between a Statistical and a Neural Approach
Directory of Open Access Journals (Sweden)
Alessandro Bugatti
2002-04-01
Full Text Available We focus attention on the problem of audio classification into speech and music for multimedia applications. In particular, we present a comparison between two different techniques for speech/music discrimination. The first method is based on the zero-crossing rate and Bayesian classification. It is very simple from a computational point of view and gives good results in the case of pure music or speech. The simulation results show that some performance degradation arises when the music segment also contains speech superimposed on music, or strong rhythmic components. To overcome these problems, we propose a second method that uses more features and is based on neural networks (specifically a multi-layer perceptron). In this case we obtain better performance, at the expense of a modest increase in computational complexity. In practice, the proposed neural network is simple to implement if a suitable polynomial is used as the activation function, and a real-time implementation is possible even if low-cost embedded systems are used.
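The zero-crossing rate feature underlying the first method is cheap to compute, which is why it suits low-cost embedded discrimination (the toy signals below are illustrative):

```python
import math
import random

def zero_crossing_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ -- the
    zero-crossing rate (ZCR) feature used for speech/music discrimination."""
    crossings = sum((a >= 0) != (b >= 0) for a, b in zip(frame, frame[1:]))
    return crossings / (len(frame) - 1)

# toy frames: a slow tone (music-like) vs broadband noise (unvoiced-speech-like)
tone = [math.sin(2 * math.pi * 2 * i / 100) for i in range(100)]
random.seed(0)
noise = [random.uniform(-1.0, 1.0) for _ in range(100)]
print(zero_crossing_rate(tone) < zero_crossing_rate(noise))  # True
```

Tonal music frames produce low ZCR while noise-like unvoiced speech produces ZCR near 0.5, so thresholding (or a Bayesian decision on) this single statistic already separates clean speech from clean music.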
Organ transplant AN-DRGs: modifying the exceptions hierarchy in casemix classification.
Antioch, K; Zhang, X
2000-01-01
The study described in this article sought to develop AN-DRG Version 3 classification revisions for organ transplantation through statistical analyses of recommendations formulated by the Australian Casemix Clinical Committee. Two separate analyses of variance were undertaken for AN-DRG Version 2 and for the proposed Version 3 AN-DRGs, using average length of stay as the dependent variable. The committee made four key recommendations which were accepted and incorporated into AN-DRG Versions 3 and 3.1. This article focuses on the classification revisions for organ transplantation.
Statistical Challenges of Big Data Analysis in Medicine
Czech Academy of Sciences Publication Activity Database
Kalina, Jan
2015-01-01
Roč. 3, č. 1 (2015), s. 24-27 ISSN 1805-8698 R&D Projects: GA ČR GA13-23940S Grant - others:CESNET Development Fund(CZ) 494/2013 Institutional support: RVO:67985807 Keywords : big data * variable selection * classification * cluster analysis Subject RIV: BB - Applied Statistics, Operational Research http://www.ijbh.org/ijbh2015-1.pdf
Deformable mirror study. Final report, 21 July 1980-15 May 1981
International Nuclear Information System (INIS)
Budgor, A.B.
1981-03-01
The beam quality of a baseline system similar to the Helios system at Los Alamos Scientific Laboratory was analyzed with a two-dimensional beam train code based on a Fresnel propagator. The other components of the code include: (a) characterization of phase aberrations, either in terms of Zernike polynomials synthesized directly from optical component interferograms when available, or by constructing a random wave front with specified statistics; (b) non-diffractive linear amplification via the Frantz-Nodvik equations; and (c) correction of accumulated phase aberration with continuous deformable mirrors whose surface is modeled by bicubic splines through the actuator points. The technical contents of this report are presented in four sections. Section II describes the physical optics of beam train propagation. A heuristic physical argument defining the zeroth-order efficacy of adaptive optics in correcting phase aberration is then derived. The results of applying the diffraction computer code to one beam line of the Helios laser system are described. The wavelength scalability of deformable mirrors and the efficacy of deformable-mirror adaptive optics in correcting phase aberration at UV wavelengths are then described.
Motor Oil Classification using Color Histograms and Pattern Recognition Techniques.
Ahmadi, Shiva; Mani-Varnosfaderani, Ahmad; Habibi, Biuck
2018-04-20
Motor oil classification is important for quality control and the identification of oil adulteration. In this work, we propose a simple, rapid, inexpensive and nondestructive approach based on image analysis and pattern recognition techniques for the classification of nine different types of motor oils according to their corresponding color histograms. For this, we applied color histograms in different color spaces such as red green blue (RGB), grayscale, and hue saturation intensity (HSI) in order to extract features that can help with the classification procedure. These color histograms and their combinations were used as input for model development and then were statistically evaluated by using linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and support vector machine (SVM) techniques. Here, two common solutions for solving a multiclass classification problem were applied: (1) transformation to a binary classification problem using a one-against-all (OAA) approach and (2) extension from binary classifiers to a single globally optimized multilabel classification model. In the OAA strategy, LDA, QDA, and SVM reached up to 97% in terms of accuracy, sensitivity, and specificity for both the training and test sets. In the extension from the binary case, despite good performances by the SVM classification model, QDA and LDA provided better results, up to 92% for RGB-grayscale-HSI color histograms and up to 93% for the HSI color map, respectively. In order to reduce the number of independent variables for modeling, a principal component analysis algorithm was used. Our results suggest that the proposed method is promising for the identification and classification of different types of motor oils.
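Extracting a color-histogram feature vector of the kind fed to the LDA/QDA/SVM classifiers can be sketched as follows (the bin count and the synthetic "oil" images are assumptions for illustration):

```python
import numpy as np

def rgb_histogram(img, bins=8):
    """Concatenated per-channel histograms, each normalised to sum to 1.
    img: H x W x 3 uint8 array; returns a (3 * bins,) feature vector."""
    feats = []
    for c in range(3):
        h, _ = np.histogram(img[..., c], bins=bins, range=(0, 256))
        feats.append(h / h.sum())
    return np.concatenate(feats)

# two synthetic 'oil' images with different dominant colours
dark = np.zeros((16, 16, 3), dtype=np.uint8)
dark[..., 0] = 40                         # dark reddish-brown oil
amber = np.zeros((16, 16, 3), dtype=np.uint8)
amber[..., 0] = 200
amber[..., 1] = 120                       # lighter amber oil
f1, f2 = rgb_histogram(dark), rgb_histogram(amber)
print(f1.shape, float(np.abs(f1 - f2).sum()))
```

Histograms in other color spaces (grayscale, HSI) follow the same pattern on the converted channels, and the concatenated vectors are what the study reduces with PCA before classification.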
Thorax deformity, joint hypermobility and anxiety disorder
International Nuclear Information System (INIS)
Gulsun, M.; Dumlu, K.; Erbas, M.; Yilmaz, Mehmet B.; Pinar, M.; Tonbul, M.; Celik, C.; Ozdemir, B.
2007-01-01
The objective was to evaluate the association between thorax deformities, panic disorder and joint hypermobility. The study included 52 males diagnosed with thorax deformity and 40 healthy male controls without thorax deformity, in Tatvan Bitlis and Isparta, Turkey. The study was carried out from 2004 to 2006. The teleradiographic and thoracic lateral images of the subjects were evaluated to obtain Beighton scores; the subjects' psychiatric conditions were evaluated using the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I), and the Hamilton Anxiety Scale (HAM-A) was applied in order to determine anxiety levels. Subjects and controls were compared on sociodemographic characteristics, anxiety levels and joint mobility levels. In addition, males with joint hypermobility and thorax deformity were compared to the group with thorax deformity without joint hypermobility. A significant difference in HAM-A scores was found between the groups with and without thorax deformity. In addition, according to Beighton scoring, 21 subjects in the group with thorax deformity and 7 subjects in the group without thorax deformity met the joint hypermobility criteria. The Beighton scores of subjects with thorax deformity were significantly different from those of the group without deformity. Additionally, anxiety scores of males with thorax deformity and joint hypermobility were found to be higher than those of males with thorax deformity without joint hypermobility. Anxiety disorders, particularly panic disorder, have a significantly higher distribution in male subjects with thorax deformity compared to the healthy control group. In addition, the anxiety level of males with thorax deformity and joint hypermobility is higher than that of males with thorax deformity without joint hypermobility. (author)
Deformed configurations, band structures and spectroscopic ...
Indian Academy of Sciences (India)
2014-03-20
The deformed configurations and rotational band structures in N = 50 Ge and Se nuclei are studied by deformed Hartree-Fock with quadrupole constraint and angular momentum projection. Apart from the 'almost' spherical HF solution, a well-deformed configuration occurs at low excitation. A deformed ...
Topics in statistical and theoretical physics
Dobrushin, R L; Shubin, M A
1996-01-01
This is the second of two volumes dedicated to the scientific heritage of F. A. Berezin (1931-1980). Before his untimely death, Berezin had an important influence on physics and mathematics, discovering new ideas in mathematical physics, representation theory, analysis, geometry, and other areas of mathematics. His crowning achievements were the introduction of a new notion of deformation quantization and Grassmannian analysis ("supermathematics"). Collected here are papers by many of his colleagues and others who worked in related areas, representing a wide spectrum of topics in statistical and theoretical physics.
On the classification and prediction of characteristics of heat resisting materials durability
International Nuclear Information System (INIS)
Krivenyuk, V.V.
1976-01-01
The proposed methods, one of which is based on the direct or indirect use of comparable temperature and load conditions, while the other additionally takes into account structural features of the material that are governed by the short-term ductility characteristics, are practically equivalent to the Larson-Miller method as regards accuracy and reliability of prediction. The classification of materials employed in the theory of high-temperature strength may promote the development of rapid methods of predicting long-term strength and deformation properties by also taking into consideration the state of the material as characterized by its short-term mechanical properties.
Interactive classification and content-based retrieval of tissue images
Aksoy, Selim; Marchisio, Giovanni B.; Tusk, Carsten; Koperski, Krzysztof
2002-11-01
We describe a system for interactive classification and retrieval of microscopic tissue images. Our system models tissues in pixel, region and image levels. Pixel level features are generated using unsupervised clustering of color and texture values. Region level features include shape information and statistics of pixel level feature values. Image level features include statistics and spatial relationships of regions. To reduce the gap between low-level features and high-level expert knowledge, we define the concept of prototype regions. The system learns the prototype regions in an image collection using model-based clustering and density estimation. Different tissue types are modeled using spatial relationships of these regions. Spatial relationships are represented by fuzzy membership functions. The system automatically selects significant relationships from training data and builds models which can also be updated using user relevance feedback. A Bayesian framework is used to classify tissues based on these models. Preliminary experiments show that the spatial relationship models we developed provide a flexible and powerful framework for classification and retrieval of tissue images.
A method for classification of network traffic based on C5.0 Machine Learning Algorithm
DEFF Research Database (Denmark)
Bujlow, Tomasz; Riaz, M. Tahir; Pedersen, Jens Myrup
2012-01-01
current network traffic. To overcome the drawbacks of existing methods for traffic classification, usage of C5.0 Machine Learning Algorithm (MLA) was proposed. On the basis of statistical traffic information received from volunteers and C5.0 algorithm we constructed a boosted classifier, which was shown...... and classification, an algorithm for recognizing flow direction and the C5.0 itself. Classified applications include Skype, FTP, torrent, web browser traffic, web radio, interactive gaming and SSH. We performed subsequent tries using different sets of parameters and both training and classification options...
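As a rough stand-in for the proposed C5.0 classifier (C5.0 itself is distributed as an R/C package), a CART-style decision tree over hypothetical per-flow statistics illustrates the training-and-prediction flow; the feature names and values are assumptions, not the paper's feature set:

```python
# Decision-tree traffic classification sketch. C5.0 is not available in
# scikit-learn, so a CART-style tree serves as an approximate stand-in.
# The per-flow features and values below are invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# [mean packet size (bytes), mean inter-arrival time (ms), payload entropy]
flows = [
    [120, 20, 7.5],    # Skype-like: small, frequent, encrypted packets
    [130, 25, 7.4],
    [1400, 300, 4.0],  # FTP-like: large, bursty, low-entropy packets
    [1380, 280, 4.2],
    [900, 150, 6.0],   # web-browsing-like traffic
    [880, 140, 6.1],
]
labels = ["skype", "skype", "ftp", "ftp", "web", "web"]

clf = DecisionTreeClassifier(random_state=0).fit(flows, labels)
print(clf.predict([[125, 22, 7.5]]))  # prints ['skype']
```

Boosting, as used in the paper, would train an ensemble of such trees; the single tree above only shows the per-flow classification step.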
van Diggelen, Esther; Holdsworth, Robert; de Bresser, Hans; Spiers, Chris
2010-05-01
The San Andreas Fault (SAF) in California marks the boundary between the Pacific plate and the North American plate. The San Andreas Fault Observatory at Depth (SAFOD) is located 9 km northwest of the town of Parkfield, CA and provides an extensive set of samples through the SAF. The SAFOD drill hole encountered different lithologies, including arkosic sediments from the Salinian block (Pacific plate) and claystones and siltstones from the Great Valley block (North American plate). Fault deformation in the area occurs mainly by a combination of micro-earthquakes and fault creep. Deformation of the borehole casing indicated that the SAFOD drill hole crosscuts two actively deforming strands of the SAF. In order to determine the deformation mechanisms in the actively creeping fault segments, we have studied thin sections obtained from SAFOD phase 3 core material using optical and electron microscopy, and we have compared these natural SAFOD microstructures with microstructures developed in simulated fault gouges deformed in laboratory shear experiments. The phase 3 core material is divided into three core intervals consisting of different lithologies. Core interval 1 consists of mildly deformed Salinian rocks that show evidence of cataclasis, pressure solution and reaction of feldspar to form phyllosilicates, all common processes in upper crustal rocks. Most of Core interval 3 (Great Valley) is also only mildly deformed and very similar to Core interval 1. Bedding and some sedimentary features are still visible, together with limited evidence for cataclasis and pressure solution, and reaction of feldspar to form phyllosilicates. However, in between the relatively undeformed rocks, Core interval 3 encountered a zone of foliated fault gouge, consisting mostly of phyllosilicates. This zone is correlated with one of the zones of localized deformation of the borehole casing, i.e. with an actively deforming strand of the SAF. The fault gouge zone shows a strong, chaotic
Mathematics and Statistics Research Department progress report for period ending June 30, 1977
International Nuclear Information System (INIS)
Lever, W.E.; Shepherd, D.E.; Ward, R.C.; Wilson, D.G.
1977-09-01
Brief descriptions are given of work done in mathematical and statistical research (moving-boundary problems; numerical analysis; continuum mechanics; matrices and other operators; experiment design; statistical testing; multivariate, multipopulation classification; statistical estimation) and statistical and mathematical collaboration (analytical chemistry, biological research, chemistry and physics research, energy research, engineering technology research, environmental sciences research, health physics research, materials research, sampling inspection and quality control, uranium resource evaluation research). Most of the descriptions are a page or less in length. Educational activities, publications, seminar titles, etc., are also included.
Statistical learning in high energy and astrophysics
International Nuclear Information System (INIS)
Zimmermann, J.
2005-01-01
This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of 'learning from examples': the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have proven convincing by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot be controlled in a
Statistical learning in high energy and astrophysics
Energy Technology Data Exchange (ETDEWEB)
Zimmermann, J.
2005-06-16
This thesis studies the performance of statistical learning methods in high energy and astrophysics, where they have become a standard tool in physics analysis. They are used to perform complex classification or regression by intelligent pattern recognition. This kind of artificial intelligence is achieved by the principle of 'learning from examples': the examples describe the relationship between detector events and their classification. The application of statistical learning methods is motivated either by the lack of knowledge about this relationship or by tight time restrictions. In the first case, learning from examples is the only possibility, since no theory is available which would allow one to build an algorithm in the classical way. In the second case, a classical algorithm exists but is too slow to cope with the time restrictions; it is therefore replaced by a pattern recognition machine which implements a fast statistical learning method. But even in applications where some kind of classical algorithm had done a good job, statistical learning methods have proven convincing by their remarkable performance. This thesis gives an introduction to statistical learning methods and how they are applied correctly in physics analysis. Their flexibility and high performance are discussed by showing intriguing results from high energy and astrophysics. These include the development of highly efficient triggers, powerful purification of event samples and exact reconstruction of hidden event parameters. The presented studies also show typical problems in the application of statistical learning methods: they should be only second choice in all cases where an algorithm based on prior knowledge exists. Some examples in physics analyses are found where these methods are not used in the right way, leading either to wrong predictions or bad performance. Physicists also often hesitate to profit from these methods because they fear that statistical learning methods cannot
An Outlyingness Matrix for Multivariate Functional Data Classification
Dai, Wenlin
2017-08-25
The classification of multivariate functional data is an important task in scientific research. Unlike point-wise data, functional data are usually classified by their shapes rather than by their scales. We define an outlyingness matrix by extending directional outlyingness, an effective measure of the shape variation of curves that combines the direction of outlyingness with conventional statistical depth. We propose two classifiers based on directional outlyingness and the outlyingness matrix, respectively. Our classifiers provide better performance compared with existing depth-based classifiers when applied to both univariate and multivariate functional data in simulation studies. We also test our methods on two data problems: speech recognition and gesture classification, and obtain results that are consistent with the findings from the simulated data.
Design of a flexible tactile sensor for classification of rigid and deformable objects
DEFF Research Database (Denmark)
Drimus, Alin; Kootstra, Gert; Bilberg, Arne
2014-01-01
of the sensor in an active object-classification system. A robotic gripper with two sensors mounted on its fingers performs a palpation procedure on a set of objects. By squeezing an object, the robot actively explores the material properties, and the system acquires tactile information corresponding......For both humans and robots, tactile sensing is important for interaction with the environment: it is the core sensing used for exploration and manipulation of objects. In this paper, we present a novel tactile-array sensor based on flexible piezoresistive rubber. We describe the design of the sensor...... and data acquisition system. We evaluate the sensitivity and robustness of the sensor, and show that it is consistent over time with little relaxation. Furthermore, the sensor has the benefit of being flexible, having a high resolution, it is easy to mount, and simple to manufacture. We demonstrate the use...
Lausen, Berthold; Seidel, Wilfried; Ultsch, Alfred
2010-01-01
Data Analysis, Data Handling and Business Intelligence are research areas at the intersection of computer science, artificial intelligence, mathematics, and statistics. They cover general methods and techniques that can be applied to a vast set of applications such as in marketing, finance, economics, engineering, linguistics, archaeology, musicology, medical science, and biology. This volume contains the revised versions of selected papers presented during the 32nd Annual Conference of the German Classification Society (Gesellschaft für Klassifikation, GfKl). The conference, which was organized in cooperation with the British Classification Society (BCS) and the Dutch/Flemish Classification Society (VOC), was hosted by Helmut-Schmidt-University, Hamburg, Germany, in July 2008.
Interfacial Bubble Deformations
Seymour, Brian; Shabane, Parvis; Cypull, Olivia; Cheng, Shengfeng; Feitosa, Klebert
Soap bubbles floating at an air-water interface experience deformations as a result of surface tension and hydrostatic forces. In this experiment, we investigate the nature of such deformations by taking cross-sectional images of bubbles of different volumes. The results show that as their volume increases, bubbles transition from a spherical to a hemispherical shape. The deformation of the interface also changes with bubble volume, with the capillary rise converging to the capillary length as volume increases. The profiles of the top and bottom of the bubble and the capillary rise are completely determined by the volume and pressure differences. James Madison University Department of Physics and Astronomy, 4VA Consortium, Research Corporation for Advancement of Science.
Orientation correlation in tensile deformed [0 1 1] Cu single crystals
International Nuclear Information System (INIS)
Borbely, Andras; Szabo, Peter J.; Groma, Istvan
2005-01-01
Local crystallographic orientation of tensile deformed copper single crystals was investigated by the electron backscattering technique. Statistical evaluation of the data reveals the presence of an increased crystallographic correlation at the transition point between stages II and III of work-hardening. The transition state has the lowest probability of finding geometrically necessary dislocations in circular regions of radius smaller than 8 μm. According to the present results and other data showing that the relative fluctuation of the dislocation density has a maximum at the transition point, we conclude that the transition from stages II to III of work-hardening is similar to a second-order phase transformation of the statistical dislocation system
Deformation behaviour of turbine foundations
International Nuclear Information System (INIS)
Koch, W.; Klitzing, R.; Pietzonka, R.; Wehr, J.
1979-01-01
The effects of foundation deformation on alignment in turbine generator sets have gained significance with the transition to modern units at the limit of design possibilities. It is therefore necessary to obtain clarification about the remaining operational variations of turbine foundations. Static measurement programmes, which cover both deformation processes as well as individual conditions of deformation, are described in the paper. In order to explain the measured deformations, structural engineering model calculations are being undertaken which indicate the effect of limiting factors. (orig.) [de
Zinoviev, Sergei
2014-05-01
The Kuznetsk-Altai region is a part of the Central Asian Orogenic Belt. The nature and formation mechanisms of the observed structure of the Kuznetsk-Altai region are interpreted by the author as the consequence of convergence of the Tuva-Mongolian and Junggar lithospheric block structures and the energy of collision interaction between the blocks of crust in the Late Paleozoic-Mesozoic period. Tectonic zoning of the Kuznetsk-Altai region is based on the principle of adequate description of the geological medium (without methods of 'primary' state recovery). The initial indication of this convergence is crust thickening in the zone of collision. On the surface, the mechanisms of lateral compression form a regional elevation; as this elevation grows, the 'mountain roots' start growing. As the blocks approach, the interblock elevation is divided into various fragments, and these fragments interact in the manner of collision. The physical expression of the collision mechanisms is periodic pulses of seismic activity. The main tectonic consequence of the block convergence and collision of interblock units is the formation of an ensemble of regional structures of the deformation type on the basis of the previous 'pre-collision' geological substratum [Chikov et al., 2012]. This ensemble includes: 1) allochthonous and autochthonous blocks of weakly deformed substratum; 2) folded (folded-thrust) systems; 3) dynamic metamorphism zones of regional shears and main faults. Characterization of the main structures includes: the position of sedimentary, magmatic and PT-metamorphic rocks, the degree of rock dynamometamorphism and the variety of rock body deformation, as well as the styles and concentrations of mechanical deformations. 1) Block terranes have a weakly elongated or isometric shape in plane, and they are systems of block structures of pre-collision substratum separated by younger zones of interblock deformations. They stand out among the main deformation systems, and the smallest are included into the
Is overall similarity classification less effortful than single-dimension classification?
Wills, Andy J; Milton, Fraser; Longmore, Christopher A; Hester, Sarah; Robinson, Jo
2013-01-01
It is sometimes argued that the implementation of an overall similarity classification is less effortful than the implementation of a single-dimension classification. In the current article, we argue that the evidence securely in support of this view is limited, and report additional evidence in support of the opposite proposition--overall similarity classification is more effortful than single-dimension classification. Using a match-to-standards procedure, Experiments 1A, 1B and 2 demonstrate that concurrent load reduces the prevalence of overall similarity classification, and that this effect is robust to changes in the concurrent load task employed, the level of time pressure experienced, and the short-term memory requirements of the classification task. Experiment 3 demonstrates that participants who produced overall similarity classifications from the outset have larger working memory capacities than those who produced single-dimension classifications initially, and Experiment 4 demonstrates that instructions to respond meticulously increase the prevalence of overall similarity classification.
Development of Bake Hardening Effect by Plastic Deformation and Annealing Conditions
Directory of Open Access Journals (Sweden)
Kvačkaj, T.
2006-01-01
Full Text Available The paper deals with the classification of steel sheets for the automotive industry on the basis of strength and structural characteristics. Experimental work was aimed at obtaining the best possible strengthening parameters, as well as work hardening and solid-solution ferrite hardening, which result from the thermal activation of interstitial carbon atoms during paint-baking of the auto body. The hardening arising from interstitial atoms is a two-step process. The first step is the BH (bake hardening) effect, achieved by the interaction of interstitial atoms with dislocations, forming Cottrell atmospheres. The second step of the BH effect is hardening from precipitation of the carbon atoms in ε-carbides, or formation of Fe32C4 carbides. The WH (work hardening) effect is obtained as dislocation hardening from plastic deformation during sheet deep drawing. The experiments aimed to achieve such plastic material properties after cold rolling, annealing and skin-pass rolling as would classify the material ZStE220BH into the drawing categories at the level of DQ-DDQ. As the experimental results show, the optimal treatment conditions for the maximal sum (WH + BH = 86 MPa) are: total cold rolling deformation ε_cold = 65 %, annealing temperature T_anneal = 700 °C.
An introductory review on gravitational-deformation induced structures, fabrics and modeling
Jaboyedoff, Michel; Penna, Ivanna; Pedrazzini, Andrea; Baroň, Ivo; Crosta, Giovanni B.
2013-10-01
Recent studies have pointed out a similarity between tectonic and slope-tectonic-induced structures. Numerous studies have demonstrated that structures and fabrics previously interpreted as of purely geodynamical origin are instead the result of large slope deformation, which in the past led to erroneous interpretations. Nevertheless, the limit between the two is not clearly defined and seems transitional. Some studies point out continuity between failures developing at the surface and upper-crust movements. In this contribution, the main studies examining the link between rock structures and slope movements are reviewed. Aspects regarding model and scale of observation are discussed together with the role of pre-existing weaknesses in the rock mass. As slope failures can develop through progressive failure, structures and their changes in time and space can be recognized. Furthermore, recognizing the origin of these structures can help avoid misinterpretations of regional geology. This also suggests the importance of integrating different slope-movement classifications based on the distribution and pattern of deformation, and of applying structural geology techniques. A structural geology approach in the landslide community is a tool that can greatly support hazard quantification and related risk assessment, because most of the physical parameters used for landslide modeling are derived from geotechnical tests or emerging geophysical approaches.
Sargolzaeipor, S.; Hassanabadi, H.; Chung, W. S.
2018-04-01
In this paper, we study the T-fluctuated form of superstatistics. In this form, some thermodynamic quantities, such as the Helmholtz energy, the entropy and the internal energy, are expressed in terms of the T-fluctuated form for a canonical ensemble. In addition, the partition functions in this formalism for 2-level and 3-level distributions are derived. Then we make use of the T-fluctuated superstatistics for the quantum harmonic oscillator problem, and the thermal properties of the system are calculated for Bose-Einstein, Maxwell-Boltzmann and Fermi-Dirac statistics. The effect of the deformation parameter on these properties is examined. All the results recover the well-known results when the deformation parameter is removed.
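The construction the abstract builds on can be summarized by the generic superstatistical averaging over a fluctuating intensive parameter; the Beck-Cohen form below is a sketch of the framework, not the paper's specific T-fluctuated 2- and 3-level distributions:

```latex
% Generic superstatistics: the inverse temperature \beta fluctuates with
% distribution f(\beta); the effective Boltzmann factor and the partition
% function follow by averaging over \beta (Beck-Cohen form, assumed here).
B(E) = \int_0^{\infty} f(\beta)\, e^{-\beta E}\, \mathrm{d}\beta ,
\qquad
Z = \sum_n B(E_n).
```

For the harmonic oscillator treated in the paper, the energies $E_n$ would be the (possibly deformed) oscillator spectrum.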
DEFF Research Database (Denmark)
Hansen, N.; Huang, X.; Hughes, D.A.
2004-01-01
Microstructural characterization and modeling has shown that a variety of metals deformed by different thermomechanical processes follows a general path of grain subdivision, by dislocation boundaries and high angle boundaries. This subdivision has been observed to very small structural scales...... of the order of 10 nm, produced by deformation under large sliding loads. Limits to the evolution of microstructural parameters during monotonic loading have been investigated based on a characterization by transmission electron microscopy. Such limits have been observed at an equivalent strain of about 10...
Twist deformations leading to κ-Poincaré Hopf algebra and their application to physics
International Nuclear Information System (INIS)
Jurić, Tajron; Meljanac, Stjepan; Samsarov, Andjelo
2016-01-01
We consider two twist operators that lead to the kappa-Poincaré Hopf algebra, the first being an Abelian one and the second corresponding to a light-like kappa-deformation of the Poincaré algebra. The advantage of the second one is that it is expressed solely in terms of Poincaré generators. In contrast, the Abelian twist goes beyond the boundaries of the Poincaré algebra and runs into the enveloping algebra of the general linear algebra. Some physical applications of these two different twist operators are considered. In particular, we use the Abelian twist to construct the statistics flip operator compatible with the action of the deformed symmetry group. Furthermore, we use the light-like twist operator to define a star product and subsequently to formulate a free scalar field theory compatible with the kappa-Poincaré Hopf algebra and appropriate for considering the interacting ϕ⁴ scalar field model on kappa-deformed space. (paper)
Characterization of gait in female patients with moderate to severe hallux valgus deformity.
Chopra, S; Moerenhout, K; Crevoisier, X
2015-07-01
Hallux valgus is one of the most common forefoot problems in females. Studies have looked at gait alterations due to hallux valgus deformity, assessing temporal, kinematic or plantar pressure parameters individually. The present study, however, aims to assess all listed parameters at once and to isolate the most clinically relevant gait parameters for moderate to severe hallux valgus deformity, with the intent of improving post-operative patient prognosis and rehabilitation. The study included 26 feet with moderate to severe hallux valgus deformity and 30 feet with no sign of hallux valgus in female participants. Initially, weight-bearing radiographs and foot and ankle clinical scores were assessed. Gait assessment was then performed utilizing pressure insoles (PEDAR) and inertial sensors (Physilog), and the two groups were compared using a non-parametric statistical hypothesis test (Wilcoxon rank sum, P < 0.05). Gait was altered in the hallux valgus group compared to controls, and 9 gait parameters (effect size between 1.03 and 1.76) were successfully isolated that best describe the altered gait in hallux valgus deformity (r² = 0.71) and showed good correlation with clinical scores. Our results, and the nine listed parameters, could serve as a benchmark for characterization of hallux valgus and objective evaluation of treatment efficacy. Copyright © 2015 Elsevier Ltd. All rights reserved.
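The statistical comparison described above (Wilcoxon rank sum plus effect sizes) can be sketched as follows; the stance-time values are synthetic stand-ins, not the study's measurements:

```python
# Illustrative sketch of the analysis style: a Wilcoxon rank-sum test plus a
# Cohen's d effect size for one gait parameter. Values are synthetic.
import statistics
from scipy import stats

stance_hv = [0.68, 0.71, 0.70, 0.73, 0.69, 0.72]    # hallux valgus group (s)
stance_ctrl = [0.60, 0.62, 0.59, 0.61, 0.63, 0.60]  # control group (s)

_, p = stats.ranksums(stance_hv, stance_ctrl)

# Cohen's d using a pooled standard deviation
pooled_sd = ((statistics.stdev(stance_hv) ** 2 +
              statistics.stdev(stance_ctrl) ** 2) / 2) ** 0.5
d = (statistics.mean(stance_hv) - statistics.mean(stance_ctrl)) / pooled_sd
print(f"p = {p:.4f}, Cohen's d = {d:.2f}")
```

The study would repeat such a test for each temporal, kinematic and plantar pressure parameter and retain those with the largest effect sizes.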
Directory of Open Access Journals (Sweden)
Fan Wu
2015-05-01
Full Text Available Vessel monitoring is one of the most important maritime applications of Synthetic Aperture Radar (SAR) data. Because of the dihedral reflections between the vessel hull and sea surface and the trihedral reflections among superstructures, vessels usually have strong backscattering in SAR images. Furthermore, in high-resolution SAR images, detailed information on vessel structures can be observed, allowing for vessel classification in high-resolution SAR images. This paper focuses on the feature analysis of merchant vessels, including bulk carriers, container ships and oil tankers, in 3 m resolution COSMO-SkyMed stripmap HIMAGE mode images and proposes a method for vessel classification. After preprocessing, a feature vector is estimated by calculating the average value of the kernel density estimation, three structural features and the mean backscattering coefficient. A support vector machine (SVM) classifier is used for the vessel classification, and the results are compared with traditional methods, such as the K-nearest neighbor algorithm (K-NN) and the minimum distance classifier (MDC). In situ investigations were conducted during the SAR data acquisition. Corresponding Automatic Identification System (AIS) reports were also obtained as ground truth to evaluate the effectiveness of the classifier. The preliminary results show that the combination of the average value of the kernel density estimation and the mean backscattering coefficient has good ability for classifying the three types of vessels. When the three structural features are added, the results slightly improve. The result of the SVM classifier is better than that of K-NN and MDC. However, the SVM requires more time when the kernel parameters are estimated.
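The classification stage above can be sketched with an SVM over five-dimensional feature vectors (average kernel-density value, three structural features, mean backscattering coefficient); the layout mirrors the paper's feature vector, but every value below is invented:

```python
# Hedged sketch of SVM vessel classification. Feature vectors follow the
# paper's layout (kernel-density average, three structural features, mean
# backscattering coefficient in dB) but all numbers are hypothetical.
from sklearn.svm import SVC

features = [
    [0.20, 0.8, 0.1, 0.3, -12.0],  # bulk-carrier-like
    [0.22, 0.7, 0.2, 0.3, -11.5],
    [0.50, 0.2, 0.9, 0.6, -8.0],   # container-ship-like
    [0.48, 0.3, 0.8, 0.5, -8.5],
    [0.30, 0.5, 0.4, 0.9, -10.0],  # oil-tanker-like
    [0.32, 0.4, 0.5, 0.8, -10.2],
]
labels = ["bulk", "bulk", "container", "container", "tanker", "tanker"]

clf = SVC(kernel="rbf", gamma="scale").fit(features, labels)
print(clf.predict([[0.49, 0.25, 0.85, 0.55, -8.2]]))
```

The K-NN and MDC baselines mentioned in the abstract would be trained on the same feature vectors for comparison.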
Montoya, Isaac D.
2008-01-01
Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…
International Nuclear Information System (INIS)
Berrada, K.; Benmoussa, A.; Hassouni, Y.
2010-07-01
Using linear entropy as a measure of entanglement, we investigate the entanglement generated via a beam splitter using deformed Barut-Girardello coherent states. We show that the degree of entanglement depends strongly on the q-deformation parameter and amplitude Z of the states. We compute the Mandel Q parameter to examine the quantum statistical properties of these coherent states and make a comparison with the Glauber coherent states. It is shown that these states are useful to describe the states of real and ideal lasers by a proper choice of their characterizing parameters, using an alteration of the Holstein-Primakoff realization. (author)
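The Mandel Q parameter used above to probe quantum statistics is Q = (⟨n²⟩ − ⟨n⟩² − ⟨n⟩)/⟨n⟩. The sketch below evaluates it for a Poissonian photon-number distribution (a Glauber coherent state), for which Q = 0; the deformed Barut-Girardello states of the paper would deviate from this value:

```python
# Mandel Q parameter from a photon-number distribution {n: p(n)}.
# Q = 0 for Poissonian (coherent-state) statistics, Q < 0 for sub-Poissonian.
import math

def mandel_q(probs):
    """Compute Q = (<n^2> - <n>^2 - <n>) / <n> from {n: p(n)}."""
    mean_n = sum(n * p for n, p in probs.items())
    mean_n2 = sum(n * n * p for n, p in probs.items())
    return (mean_n2 - mean_n ** 2 - mean_n) / mean_n

# Poisson distribution with mean |Z|^2 = 2, truncated at large n
z2 = 2.0
poisson = {n: math.exp(-z2) * z2 ** n / math.factorial(n) for n in range(60)}
q = mandel_q(poisson)
print(f"Q = {q:.6f}")  # ~ 0 for a Glauber coherent state
```

For the q-deformed states the distribution p(n) would depend on the deformation parameter, shifting Q away from zero.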
Use of prior mammograms in the classification of benign and malignant masses
International Nuclear Information System (INIS)
Varela, Celia; Karssemeijer, Nico; Hendriks, Jan H.C.L.; Holland, Roland
2005-01-01
The purpose of this study was to determine the importance of using prior mammograms for classification of benign and malignant masses. Five radiologists and one resident classified mass lesions in 198 mammograms obtained from a population-based screening program. Cases were interpreted twice, once without and once with comparison of previous mammograms, in a sequential reading order using soft copy image display. The radiologists' performances in classifying benign and malignant masses without and with previous mammograms were evaluated with receiver operating characteristic (ROC) analysis. The statistical significance of the difference in performances was calculated using analysis of variance. The use of prior mammograms improved the classification performance of all participants in the study. The mean area under the ROC curve of the readers increased from 0.763 to 0.796. This difference in performance was statistically significant (P = 0.008)
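The reader-performance comparison above rests on the area under the ROC curve; a minimal sketch with invented malignancy ratings (not the study's data) shows how such AUC values are computed:

```python
# ROC AUC for one hypothetical reader's malignancy ratings, scored without
# and with prior mammograms. All ratings below are invented for illustration.
from sklearn.metrics import roc_auc_score

truth = [0, 0, 0, 0, 1, 1, 1, 1]  # 0 = benign, 1 = malignant
ratings_without_priors = [0.2, 0.4, 0.6, 0.3, 0.5, 0.7, 0.4, 0.9]
ratings_with_priors = [0.2, 0.3, 0.5, 0.3, 0.45, 0.8, 0.7, 0.9]

auc_without = roc_auc_score(truth, ratings_without_priors)
auc_with = roc_auc_score(truth, ratings_with_priors)
print(f"AUC without priors: {auc_without:.3f}, with priors: {auc_with:.3f}")
```

The study's analysis additionally tests whether the AUC increase across readers is significant (analysis of variance, P = 0.008).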
Energy Technology Data Exchange (ETDEWEB)
Moenkkoenen, H.; Rantanen, T.; Kuula, H. [WSP Finland Oy, Helsinki (Finland)
2012-05-15
In this report, the rock mechanics parameters of fractures and brittle deformation zones have been estimated in the vicinity of the ONKALO area at the Olkiluoto site, western Finland. This report is an extension of the previously published report: Geometrical and mechanical properties of the fractures and brittle deformation zones based on ONKALO tunnel mapping, 0-2400 m tunnel chainage (Kuula 2010). In this updated report, mapping data are from 2400-4390 m tunnel chainage. The defined rock mechanics parameters of the fractures are associated with the rock engineering classification quality index, Q', which incorporates the RQD, Jn, Jr and Ja values. The friction angle of the fracture surfaces is estimated from the Jr and Ja numbers. There are no new data from laboratory joint shear and normal tests. The fracture wall compressive strength (JCS) data are available from the chainage range 1280-2400 m. Estimation of the mechanical properties of the 24 brittle deformation zones (BDZ) is based on the mapped Q' value, which is transformed to the GSI value in order to estimate strength and deformability properties. One component of the mapped Q' values is from the ONKALO and another component is from the drill cores. In this study, 24 BDZs have been parameterized. The location and size of the brittle deformation zones are based on the latest interpretation. New data on the intact rock strength of the brittle deformation zones are not available. (orig.)
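The report transforms mapped Q' values to GSI to estimate strength and deformability. The exact transformation used is not stated; one commonly quoted correlation (Hoek et al.), assumed here purely for illustration, is GSI = 9 ln Q' + 44:

```python
# Illustrative Q' -> GSI mapping. The correlation GSI = 9 ln Q' + 44 is a
# widely cited rule of thumb, assumed here; the report may use another form.
import math

def gsi_from_q_prime(q_prime):
    """Estimate GSI from the rock engineering quality index Q'."""
    return 9.0 * math.log(q_prime) + 44.0

for q in (0.1, 1.0, 10.0):
    print(f"Q' = {q:>5} -> GSI = {gsi_from_q_prime(q):.1f}")
```

The resulting GSI would then feed a strength criterion (e.g. Hoek-Brown) to derive deformability and strength parameters for each BDZ.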
Habitat classification modelling with incomplete data: Pushing the habitat envelope
Phoebe L. Zarnetske; Thomas C. Edwards; Gretchen G. Moisen
2007-01-01
Habitat classification models (HCMs) are invaluable tools for species conservation, land-use planning, reserve design, and metapopulation assessments, particularly at broad spatial scales. However, species occurrence data are often lacking and typically limited to presence points at broad scales. This lack of absence data precludes the use of many statistical...
A hierarchical approach of hybrid image classification for land use and land cover mapping
Directory of Open Access Journals (Sweden)
Rahdari Vahid
2018-01-01
Full Text Available Remote sensing data analysis can provide thematic maps describing land use and land cover (LULC) in a short period. Choosing a proper image classification method for an area is important for overcoming the possible limitations of satellite imagery in producing land-use and land-cover maps. In the present study, a hierarchical hybrid image classification method was used to produce LULC maps from Landsat Thematic Mapper (TM) imagery for 1998 and Operational Land Imager (OLI) imagery for 2016. Images were classified using the proposed hybrid image classification method, a vegetation crown cover percentage map derived from the normalized difference vegetation index, Fisher supervised classification and object-based image classification methods. Accuracy assessment showed that the hybrid classification method produced maps with a total accuracy of up to 84 percent and a kappa statistic of 0.81. The results also showed that the proposed classification method worked better with the OLI sensor than with TM. Although OLI has a higher radiometric resolution than TM, the LULC map produced from TM is almost as accurate as the OLI map, owing to the LULC definitions and image classification methods used.
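The vegetation layer in such a hierarchy comes from the standard band ratio NDVI = (NIR − Red)/(NIR + Red). A minimal sketch of the first vegetation/non-vegetation split; the band values and the 0.3 threshold are illustrative assumptions, not taken from the study:

```python
import numpy as np

# Hypothetical red and near-infrared reflectance bands (values in [0, 1])
red = np.array([[0.10, 0.30], [0.05, 0.25]])
nir = np.array([[0.60, 0.35], [0.55, 0.27]])

# NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1]
ndvi = (nir - red) / (nir + red)

# First-level split of a hierarchical classification: vegetation mask
veg_mask = ndvi > 0.3   # threshold chosen for illustration only
print(np.round(ndvi, 2))
print(veg_mask)
```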
Anomalous behavior of q-averages in nonextensive statistical mechanics
International Nuclear Information System (INIS)
Abe, Sumiyoshi
2009-01-01
A generalized definition of average, termed the q-average, is widely employed in the field of nonextensive statistical mechanics. Recently, it has however been pointed out that such an average value may behave unphysically under specific deformations of probability distributions. Here, the following three issues are discussed and clarified. Firstly, the deformations considered are physical and may be realized experimentally. Secondly, in view of the thermostatistics, the q-average is unstable in both finite and infinite discrete systems. Thirdly, a naive generalization of the discussion to continuous systems misses a point, and a norm better than the L¹-norm should be employed for measuring the distance between two probability distributions. Consequently, stability of the q-average is shown not to be established in all of the cases.
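The q-average in question is the escort-distribution average ⟨A⟩_q = Σᵢ pᵢ^q Aᵢ / Σᵢ pᵢ^q, which reduces to the ordinary mean at q = 1. A minimal sketch with illustrative numbers; the instability discussed above concerns specific deformations of the distribution, not this toy evaluation:

```python
import numpy as np

def q_average(p, a, q):
    """Escort q-average of observable values a under distribution p:
    <A>_q = sum_i p_i**q * a_i / sum_i p_i**q. q = 1 gives the usual mean."""
    w = p ** q
    return float((w * a).sum() / w.sum())

p = np.array([0.1, 0.2, 0.3, 0.4])   # a normalized probability distribution
a = np.array([1.0, 2.0, 3.0, 4.0])   # observable values

print(q_average(p, a, 1.0))          # ordinary expectation value
print(round(q_average(p, a, 2.0), 3))  # q = 2 weights high-probability states more
```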
Deformation twinning: Influence of strain rate
Energy Technology Data Exchange (ETDEWEB)
Gray, G.T. III
1993-11-01
Twins in most crystal structures, including advanced materials such as intermetallics, form more readily as the temperature of deformation is decreased or the rate of deformation is increased. Both parameters lead to the suppression of thermally-activated dislocation processes which can result in stresses high enough to nucleate and grow deformation twins. Under high-strain-rate or shock-loading/impact conditions deformation twinning is observed to be promoted even in high stacking fault energy FCC metals and alloys, composites, and ordered intermetallics which normally do not readily deform via twinning. Under such conditions, and in particular under the extreme loading rates typical of shock wave deformation, the competition between slip and deformation twinning can be examined in detail. In this paper, examples of deformation twinning in the intermetallics TiAl, Ti-48Al-1V and Ni{sub 3}Al, as well as in the cermet Al-B{sub 4}C, as a function of strain rate will be presented. Discussion includes: (1) the microstructural and experimental variables influencing twin formation in these systems and twinning topics related to high-strain-rate loading, (2) the high velocity of twin formation, and (3) the influence of deformation twinning on the constitutive response of advanced materials.
Energy Technology Data Exchange (ETDEWEB)
Dossing, T.; Khoo, T.L.; Lauritsen, T. [and others]
1995-08-01
The decay out of superdeformed states occurs by coupling to compound nuclear states of normal deformation. The coupling is very weak, resulting in mixing of the SD state with one or two normal compound states. With a high energy available for decay, a statistical spectrum ensues. The shape of this statistical spectrum contains information on the level densities of the excited states below the SD level. The level densities are sensitively affected by the pair correlations. Thus decay-out of a SD state (which presents us with a means to start a statistical cascade from a highly-excited sharp state) provides a method for investigating the reduction of pairing with increasing thermal excitation energy.
Deformation of Man Made Objects
Ibrahim, Mohamed
2012-07-01
We introduce a framework for 3D object deformation with primary focus on man-made objects. Our framework enables a user to deform a model while preserving its defining characteristics. Moreover, our framework enables a user to set constraints on a model to keep its most significant features intact after the deformation process. Our framework supports a semi-automatic constraint setting environment, where some constraints could be automatically set by the framework while others are left for the user to specify. Our framework has several advantages over some state of the art deformation techniques in that it enables a user to add new features to the deformed model while keeping its general look similar to the input model. In addition, our framework enables the rotation and extrusion of different parts of a model.
Large-scale gene function analysis with the PANTHER classification system.
Mi, Huaiyu; Muruganujan, Anushya; Casagrande, John T; Thomas, Paul D
2013-08-01
The PANTHER (protein annotation through evolutionary relationship) classification system (http://www.pantherdb.org/) is a comprehensive system that combines gene function, ontology, pathways and statistical analysis tools that enable biologists to analyze large-scale, genome-wide data from sequencing, proteomics or gene expression experiments. The system is built with 82 complete genomes organized into gene families and subfamilies, and their evolutionary relationships are captured in phylogenetic trees, multiple sequence alignments and statistical models (hidden Markov models or HMMs). Genes are classified according to their function in several different ways: families and subfamilies are annotated with ontology terms (Gene Ontology (GO) and PANTHER protein class), and sequences are assigned to PANTHER pathways. The PANTHER website includes a suite of tools that enable users to browse and query gene functions, and to analyze large-scale experimental data with a number of statistical tests. It is widely used by bench scientists, bioinformaticians, computer scientists and systems biologists. In the 2013 release of PANTHER (v.8.0), in addition to an update of the data content, we redesigned the website interface to improve both user experience and the system's analytical capability. This protocol provides a detailed description of how to analyze genome-wide experimental data with the PANTHER classification system.
Statistics of ductile fracture surfaces: the effect of material parameters
DEFF Research Database (Denmark)
Ponson, Laurent; Cao, Yuanyuan; Bouchaud, Elisabeth
2013-01-01
distributed. The three dimensional analysis permits modeling of a three dimensional material microstructure and of the resulting three dimensional stress and deformation states that develop in the fracture process region. Material parameters characterizing void nucleation are varied and the statistics...... of the resulting fracture surfaces is investigated. All the fracture surfaces are found to be self-affine over a size range of about two orders of magnitude with a very similar roughness exponent of 0.56 ± 0.03. In contrast, the full statistics of the fracture surfaces is found to be more sensitive to the material...
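Self-affinity here means the RMS height difference grows as a power law, σ(δ) ∝ δ^ζ, with roughness exponent ζ ≈ 0.56 ± 0.03. A sketch of how such an exponent is extracted from a profile, using a synthetic random walk (exact exponent 0.5) in place of real fracture data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1D "fracture profile": a random walk has roughness exponent 0.5,
# close to the ~0.56 reported for real ductile fracture surfaces.
h = np.cumsum(rng.standard_normal(100_000))

# Height-height correlation: sigma(delta) = rms of h(x+delta) - h(x)
deltas = np.array([1, 2, 4, 8, 16, 32, 64, 128])
sigma = np.array([np.std(h[d:] - h[:-d]) for d in deltas])

# Roughness exponent = slope of the log-log scaling law sigma ~ delta**zeta
zeta = np.polyfit(np.log(deltas), np.log(sigma), 1)[0]
print(round(zeta, 2))   # close to 0.5 for an uncorrelated random walk
```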
International Nuclear Information System (INIS)
Dobrev, V.K.
1992-01-01
We review and explain a canonical procedure for the q-deformation of the real forms G of complex Lie (super-) algebras associated with (generalized) Cartan matrices. Our procedure gives different q-deformations for the non-conjugate Cartan subalgebras of G. We give several in detail the q-deformed Lorentz and conformal (super-) algebras. The q-deformed conformal algebra contains as a subalgebra a q-deformed Poincare algebra and as Hopf subalgebras two conjugate 11-generator q-deformed Weyl algebras. The q-deformed Lorentz algebra in Hopf subalgebra of both Weyl algebras. (author). 24 refs
Le, Trang T; Simmons, W Kyle; Misaki, Masaya; Bodurka, Jerzy; White, Bill C; Savitz, Jonathan; McKinney, Brett A
2017-09-15
Classification of individuals into disease or clinical categories from high-dimensional biological data with low prediction error is an important challenge of statistical learning in bioinformatics. Feature selection can improve classification accuracy but must be incorporated carefully into cross-validation to avoid overfitting. Recently, feature selection methods based on differential privacy, such as differentially private random forests and reusable holdout sets, have been proposed. However, for domains such as bioinformatics, where the number of features is much larger than the number of observations (p ≫ n), these differential privacy methods are susceptible to overfitting. We introduce private Evaporative Cooling, a stochastic privacy-preserving machine learning algorithm that uses Relief-F for feature selection and random forest for privacy-preserving classification, and that also prevents overfitting. We relate the privacy-preserving threshold mechanism to a thermodynamic Maxwell-Boltzmann distribution, where the temperature represents the privacy threshold. We use the thermal statistical physics concept of Evaporative Cooling of atomic gases to perform backward stepwise privacy-preserving feature selection. On simulated data with main effects and statistical interactions, we compare accuracies on holdout and validation sets for three privacy-preserving methods: the reusable holdout, reusable holdout with random forest, and private Evaporative Cooling, which uses Relief-F feature selection and random forest classification. In simulations where interactions exist between attributes, private Evaporative Cooling provides higher classification accuracy without overfitting based on an independent validation set. In simulations without interactions, thresholdout with random forest and private Evaporative Cooling give comparable accuracies. We also apply these privacy methods to human brain resting-state fMRI data from a study of major depressive disorder. Code
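The Maxwell-Boltzmann flavor of the method can be sketched as backward stepwise elimination in which a low-scoring attribute "evaporates" with probability exp(-score/T), the temperature T playing the privacy-threshold role. This is a loose illustration under stated assumptions, not the authors' algorithm: a simple mean-difference score stands in for Relief-F, and T is set by hand.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: features 0 and 1 are informative, the rest are noise.
n, p = 60, 6
X = rng.standard_normal((n, p))
y = (X[:, 0] + X[:, 1] + 0.5 * rng.standard_normal(n) > 0).astype(int)

# Univariate class-mean-difference score (a stand-in for Relief-F)
score = np.abs(X[y == 1].mean(axis=0) - X[y == 0].mean(axis=0))

# Backward stepwise "evaporation": the currently worst attribute is removed
# with Maxwell-Boltzmann probability exp(-score/T).
T = 0.3          # illustrative temperature / privacy threshold
keep = list(range(p))
for _ in range(p - 2):
    worst = min(keep, key=lambda j: score[j])
    if rng.random() < np.exp(-score[worst] / T):
        keep.remove(worst)
print(sorted(keep))   # informative attributes survive with high probability
```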
Reliability of a four-column classification for tibial plateau fractures.
Martínez-Rondanelli, Alfredo; Escobar-González, Sara Sofía; Henao-Alzate, Alejandro; Martínez-Cano, Juan Pablo
2017-09-01
A four-column classification system offers a different way of evaluating tibial plateau fractures. The aim of this study is to compare the intra-observer and inter-observer reliability between four-column and classic classifications. This is a reliability study, which included patients presenting with tibial plateau fractures between January 2013 and September 2015 in a level-1 trauma centre. Four orthopaedic surgeons blindly classified each fracture according to four different classifications: AO, Schatzker, Duparc and four-column. Kappa, intra-observer and inter-observer concordance were calculated for the reliability analysis. Forty-nine patients were included. The mean age was 39 ± 14.2 years, with no gender predominance (men: 51%; women: 49%), and 67% of the fractures included at least one of the posterior columns. The intra-observer and inter-observer concordance were calculated for each classification: four-column (84%/79%), Schatzker (60%/71%), AO (50%/59%) and Duparc (48%/58%), with a statistically significant difference among them (p = 0.001/p = 0.003). Kappa coefficients for intra-observer and inter-observer evaluations: Schatzker 0.48/0.39, four-column 0.61/0.34, Duparc 0.37/0.23, and AO 0.34/0.11. The proposed four-column classification showed the highest intra- and inter-observer agreement. When taking into account the agreement that occurs by chance, the Schatzker classification showed the highest inter-observer kappa, but again the four-column had the highest intra-observer kappa value. The proposed classification is a more inclusive classification for the posteromedial and posterolateral fractures. We suggest, therefore, that it be used in addition to one of the classic classifications in order to better understand the fracture pattern, as it allows more attention to be paid to the posterior columns, improves the surgical planning and allows the surgical approach to be chosen more accurately.
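The chance-corrected agreement reported here is Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), where p_o is observed agreement and p_e the agreement expected by chance from the marginal frequencies. A self-contained sketch with hypothetical repeated gradings, not the study's data:

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Chance-corrected agreement between two ratings of the same cases:
    kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in c1) / n ** 2          # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical Schatzker grades assigned twice by one observer (illustrative)
first  = ["II", "IV", "II", "VI", "IV", "II", "I", "IV"]
second = ["II", "IV", "II", "IV", "IV", "II", "I", "VI"]
print(round(cohen_kappa(first, second), 2))
```

Raw percent concordance ignores the chance component, which is why the ranking of the classifications can change once kappa is used, exactly as observed above.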
Directory of Open Access Journals (Sweden)
Jianning Wu
2015-01-01
Full Text Available The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, under the assumption that the degree of intrinsic change in the dynamical system of gait is associated with different statistical distributions of the gait variables from the left and right lower limbs; that is, a small difference in similarity between the lower limbs is recognized from the difference between their probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method, built on an advanced statistical learning algorithm, the support vector machine for binary classification, was adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in the gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that the traditional symmetry index method for gait missed. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis.
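The core idea can be sketched as left-vs-right limb classification with an SVM: if the classifier separates the limbs well above chance, their gait-variable distributions differ, i.e. the gait is asymmetric. A minimal sketch using scikit-learn and synthetic stance-phase features; all feature names and numbers are illustrative assumptions:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic per-step features [peak vertical force (body weights), loading
# rate]; the right limb is shifted slightly to mimic an asymmetric gait.
n = 100
left = rng.normal([1.10, 80.0], [0.05, 5.0], size=(n, 2))
right = rng.normal([1.18, 86.0], [0.05, 5.0], size=(n, 2))
X = np.vstack([left, right])
y = np.repeat([0, 1], n)          # 0 = left limb, 1 = right limb

# Above-chance cross-validated accuracy => distinguishable distributions
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
acc = cross_val_score(clf, X, y, cv=5).mean()
print(acc > 0.6)
```

For a truly symmetric gait the two classes would be drawn from the same distribution and the accuracy would hover near 0.5.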
Building an asynchronous web-based tool for machine learning classification.
Weber, Griffin; Vinterbo, Staal; Ohno-Machado, Lucila
2002-01-01
Various unsupervised and supervised learning methods including support vector machines, classification trees, linear discriminant analysis and nearest neighbor classifiers have been used to classify high-throughput gene expression data. Simpler and more widely accepted statistical tools have not yet been used for this purpose, hence proper comparisons between classification methods have not been conducted. We developed free software that implements logistic regression with stepwise variable selection as a quick and simple method for initial exploration of important genetic markers in disease classification. To implement the algorithm and allow our collaborators in remote locations to evaluate and compare its results against those of other methods, we developed a user-friendly asynchronous web-based application with a minimal amount of programming using free, downloadable software tools. With this program, we show that classification using logistic regression can perform as well as other more sophisticated algorithms, and it has the advantages of being easy to interpret and reproduce. By making the tool freely and easily available, we hope to promote the comparison of classification methods. In addition, we believe our web application can be used as a model for other bioinformatics laboratories that need to develop web-based analysis tools in a short amount of time and on a limited budget.
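Stepwise logistic regression of the kind described can be sketched as a greedy forward search over cross-validated accuracy. This is an illustration of the generic technique with scikit-learn on toy "expression" data, not the authors' web application; the data, gene indices and stopping rule are assumptions of the sketch:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy expression matrix: 40 samples x 20 genes; genes 3 and 7 carry signal.
X = rng.standard_normal((40, 20))
y = (X[:, 3] - X[:, 7] + 0.5 * rng.standard_normal(40) > 0).astype(int)

# Forward stepwise selection: greedily add the gene that most improves
# cross-validated accuracy; stop when no candidate improves the score.
selected, best = [], 0.0
improved = True
while improved:
    improved = False
    scores = {j: cross_val_score(LogisticRegression(),
                                 X[:, selected + [j]], y, cv=5).mean()
              for j in range(X.shape[1]) if j not in selected}
    j, s = max(scores.items(), key=lambda kv: kv[1])
    if s > best:
        selected.append(j)
        best = s
        improved = True
print(sorted(selected), round(best, 2))
```

The fitted coefficients of the final model remain directly interpretable as log odds ratios per gene, which is the ease-of-interpretation advantage the abstract highlights.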
Application of In-Segment Multiple Sampling in Object-Based Classification
Directory of Open Access Journals (Sweden)
Nataša Đurić
2014-12-01
Full Text Available When object-based analysis is applied to very high-resolution imagery, pixels within the segments reveal large spectral inhomogeneity; their distribution can be considered complex rather than normal. When normality is violated, classification methods that rely on the assumption of normally distributed data are less successful and less accurate. It is hard to detect normality violations in small samples. The segmentation process produces segments that vary highly in size; samples can be very big or very small. This paper investigates whether the complexity within the segment can be addressed using multiple random sampling of segment pixels and multiple calculations of similarity measures. In order to analyze the effect sampling has on classification results, the test statistics and probability values of the non-parametric two-sample Kolmogorov-Smirnov test and the parametric Student's t-test are selected as similarity measures in the classification process. The performance of both classifiers was assessed on a WorldView-2 image for four land cover classes (roads, buildings, grass and trees) and compared to two commonly used object-based classifiers: k-Nearest Neighbor (k-NN) and Support Vector Machine (SVM). Both proposed classifiers showed a slight improvement in overall classification accuracy and produced more accurate classification maps when compared to the ground truth image.
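The in-segment multiple-sampling idea can be sketched with the KS test: draw several small random samples from a segment, compute the similarity measure against each class reference sample, and aggregate. The reflectance distributions and sample sizes below are hypothetical, and SciPy's two-sample KS test stands in for the full classifier (Student's t-test would be the parametric alternative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-class training pixels in one reflectance band
grass = rng.normal(0.35, 0.10, 400)
road = rng.normal(0.10, 0.05, 400)

def median_p(segment, reference, n_draws=25, k=50):
    """In-segment multiple sampling: median KS p-value over several small
    random pixel samples drawn from the segment."""
    return float(np.median([
        stats.ks_2samp(rng.choice(segment, k, replace=False),
                       reference).pvalue
        for _ in range(n_draws)]))

# A large, inhomogeneous segment that actually belongs to the grass class
segment = rng.normal(0.35, 0.10, 5000)
print(median_p(segment, grass) > median_p(segment, road))  # grass wins
```

Assigning the segment to the class with the highest aggregated p-value avoids relying on a single, possibly unrepresentative, pixel sample.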
SAW Classification Algorithm for Chinese Text Classification
Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang
2015-01-01
Considering the explosive growth of data, the increasing volume of text data places ever higher demands on the performance of text categorization, demands that existing classification methods cannot satisfy. Based on a study of existing text classification technology and semantics, this paper puts forward a SAW (Structural Auxiliary Word) algorithm oriented to Chinese text classification. The algorithm uses the special spatial effect of Chinese text, where words...
Modeling plastic deformation of post-irradiated copper micro-pillars
Energy Technology Data Exchange (ETDEWEB)
Crosby, Tamer, E-mail: tcrosby@ucla.edu; Po, Giacomo, E-mail: gpo@ucla.edu; Ghoniem, Nasr M., E-mail: ghoniem@ucla.edu
2014-12-15
We present here an application of a fundamentally new theoretical framework for description of the simultaneous evolution of radiation damage and plasticity that can describe both in situ and ex situ deformation of structural materials [1]. The theory is based on the variational principle of maximum entropy production rate; with constraints on dislocation climb motion that are imposed by point defect fluxes as a result of irradiation. The developed theory is implemented in a new computational code that facilitates the simulation of irradiated and unirradiated materials alike in a consistent fashion [2]. Discrete Dislocation Dynamics (DDD) computer simulations are presented here for irradiated fcc metals that address the phenomenon of dislocation channel formation in post-irradiated copper. The focus of the simulations is on the role of micro-pillar boundaries and the statistics of dislocation pinning by stacking-fault tetrahedra (SFTs) on the onset of dislocation channel and incipient surface crack formation. The simulations show that the spatial heterogeneity in the distribution of SFTs naturally leads to localized plastic deformation and incipient surface fracture of micro-pillars.
Spectral deformation techniques applied to the study of quantum statistical irreversible processes
International Nuclear Information System (INIS)
Courbage, M.
1978-01-01
A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)
DEFF Research Database (Denmark)
Mai, Jens Erik
2004-01-01
This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary...... classification research focus on contextual information as the guide for the design and construction of classification schemes....
International Nuclear Information System (INIS)
Combs, Stephanie E.; Schulz-Ertner, Daniela; Debus, Jürgen; Deimling, Andreas von; Hartmann, Christian
2011-01-01
Purpose: To evaluate the correlation between the 1993 and 2000/2007 World Health Organization (WHO) classification with the outcome in patients with high-grade meningiomas. Patients and Methods: Between 1985 and 2004, 73 patients diagnosed with atypical or anaplastic meningiomas were treated with radiotherapy. Sections from the paraffin-embedded tumor material from 66 patients (90%) from 13 different pathology departments were re-evaluated according to the first revised WHO classification from 1993 and the revised classifications from 2000/2007. In 4 cases, the initial diagnosis meningioma was not reproducible (5%). Therefore, 62 patients with meningiomas were analyzed. Results: All 62 tumors were reclassified according to the 1993 and 2000/2007 WHO classification systems. Using the 1993 system, 7 patients were diagnosed with WHO grade I meningioma (11%), 23 with WHO grade II (37%), and 32 with WHO grade III meningioma (52%). After scoring using the 2000/2007 system, we found 17 WHO grade I meningiomas (27%), 32 WHO grade II meningiomas (52%), and 13 WHO grade III meningiomas (21%). According to the 1993 classification, the difference in overall survival was not statistically significant among the histologic subgroups (p = .96). Using the 2000/2007 WHO classifications, the difference in overall survival became significant (p = .02). Of the 62 reclassified patients 29 developed tumor progression (47%). No difference in progression-free survival was observed among the histologic subgroups (p = .44). After grading according to the 2000/2007 WHO classifications, significant differences in progression-free survival were observed among the three histologic groups (p = .005). Conclusion: The new 2000/2007 WHO classification for meningiomas showed an improved correlation between the histologic grade and outcome. This classification therefore provides a useful basis to determine the postoperative indication for radiotherapy. According to our results, a comparison of the
Constructions of quantum fields with anyonic statistics
International Nuclear Information System (INIS)
Plaschke, M.
2015-01-01
From the principles of algebraic quantum field theory it follows that in low dimensions particles are not necessarily bosons or fermions, but their statistics can in general be governed by the braid group. Such particles are called anyons, and their possible statistics is intimately related to their localization properties and their covariance with respect to rotations. This work is concerned with the explicit construction of quantum fields with anyonic statistics which are localized in various regions of two- and three-dimensional Minkowski space, and we will analyze the connection between localization, statistics and spin. The reason why this is considerably more difficult than for bosons or fermions is the no-go theorem regarding free cone-localized anyons in d=2+1. This problem is approached in this work from different directions, leaving out some of the underlying assumptions one makes in abstract algebraic quantum field theory. Despite a similar no-go theorem for free local anyons, it is in two dimensions possible to construct compactly localized quantum field nets with anyonic commutation relations for every mass m ≥ 0 and every statistics parameter by using the theory of loop groups and implementable Bogoliubov transformations. This does not work in higher dimensions, so in d=2+1 we will first construct polarization-free generators, which are only wedge-local, using a recent work about multiplicative deformations of free quantum fields on the Fock space. By generalizing this procedure to the charged case it is possible to extend the set of admissible deformations and end up with fields satisfying anyonic commutation relations, which are covariant w.r.t. a Poincaré group representation with arbitrary real-valued spin. Another approach, which further demonstrates the connection between localization, statistics and spin of quantum field nets, is to focus first only on the rotational degrees of freedom and construct field operators on the circle
International Nuclear Information System (INIS)
Dahmen, Karin A.; Ben-Zion, Yehuda; Uhl, Jonathan T.
2009-01-01
A basic micromechanical model for deformation of solids with only one tuning parameter (weakening ε) is introduced. The model can reproduce observed stress-strain curves, acoustic emissions and related power spectra, event statistics, and geometrical properties of slip, with a continuous phase transition from brittle to ductile behavior. Exact universal predictions are extracted using mean field theory and renormalization group tools. The results agree with recent experimental observations and simulations of related models for dislocation dynamics, material damage, and earthquake statistics.
DEFF Research Database (Denmark)
Hjørland, Birger
2017-01-01
This article presents and discusses definitions of the term “classification” and the related concepts “Concept/conceptualization,”“categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification including the influences of Aristotle...... and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly...
Palano, Mimmo; Imprescia, Paola; Agnon, Amotz; Gresta, Stefano
2018-04-01
We present an improved picture of the ongoing crustal deformation field for the Zagros Fold-and-Thrust Belt continental collision zone by using an extensive combination of both novel and published GPS observations. The main results define the significant amount of oblique Arabia-Eurasia convergence currently being absorbed within the Zagros: right-lateral shear along the NW trending Main Recent fault in NW Zagros, and convergence accommodated between fold-and-thrust structures and NS right-lateral strike-slip faults in Southern Zagros. In addition, taking into account the 1909-2016 instrumental seismic catalogue, we provide a statistical evaluation of the seismic/geodetic deformation-rate ratio for the area. On Northern Zagros and on the Turkish-Iranian Plateau, a moderate to large fraction (˜49 and >60 per cent, respectively) of the crustal deformation occurs seismically. On the Sanandaj-Sirjan zone, the seismic/geodetic deformation-rate ratio suggests that a small to moderate fraction (<40 per cent) of crustal deformation occurs seismically; locally, the occurrence of large historic earthquakes (M ≥ 6) coupled with the high geodetic deformation could indicate overdue M ≥ 6 earthquakes. On Southern Zagros, aseismic strain dominates crustal deformation (the ratio ranges from 15 to 33 per cent). Such aseismic deformation is probably related to the presence of the weak evaporitic Hormuz Formation, which allows the occurrence of large aseismic motion on both subhorizontal faults and surfaces of décollement. These results, framed within the seismotectonic framework of the investigated region, confirm that the fold-and-thrust-dominated deformation is driven by buoyancy forces; by contrast, the shear-dominated deformation is primarily driven by plate stresses.
Analysis and classification of ECG-waves and rhythms using circular statistics and vector strength
Directory of Open Access Journals (Sweden)
Janßen Jan-Dirk
2017-09-01
Full Text Available The most common way to analyse heart rhythm is to calculate the RR-interval and the heart rate variability. For further evaluation, descriptive statistics are often used. Here we introduce a new and more natural heart rhythm analysis tool that is based on circular statistics and vector strength. Vector strength is a tool to measure the periodicity or lack of periodicity of a signal. We divide the signal into non-overlapping window segments and project the detected R-waves around the unit circle using the complex exponential function and the median RR-interval. In addition, we calculate the vector strength and apply circular statistics as well as an angular histogram to the R-wave vectors. This approach enables an intuitive visualization and analysis of rhythmicity. Our results show that ECG-waves and rhythms can be easily visualized, analysed and classified by circular statistics and vector strength.
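The vector-strength computation described above can be sketched in a few lines (a minimal illustration, not the authors' implementation; the windowing step is omitted):

```python
import cmath
import statistics

def vector_strength(r_peak_times):
    """Project R-peak times around the unit circle using the median
    RR-interval as the period, then return the resultant vector length.
    Values near 1.0 indicate a highly periodic rhythm; values near 0.0
    indicate a lack of periodicity."""
    rr = [b - a for a, b in zip(r_peak_times, r_peak_times[1:])]
    period = statistics.median(rr)
    # the complex exponential maps each peak time to a point on the unit circle
    phasors = [cmath.exp(2j * cmath.pi * t / period) for t in r_peak_times]
    mean_phasor = sum(phasors) / len(phasors)
    return abs(mean_phasor)

# perfectly regular rhythm (RR = 0.8 s) -> vector strength 1.0
regular = [0.8 * k for k in range(10)]
print(round(vector_strength(regular), 3))  # → 1.0
```

An irregular rhythm scatters the phasors around the circle, so their mean shrinks toward the origin and the vector strength drops toward zero.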
On infinitesimal conformal deformations of surfaces
Directory of Open Access Journals (Sweden)
Юлия Степановна Федченко
2014-11-01
Full Text Available A new form of basic equations for conformal deformations is found. The equations involve tensor fields of the displacement vector only. Conditions for trivial deformations as well as infinitesimal conformal deformations are studied.
An Outlyingness Matrix for Multivariate Functional Data Classification
Dai, Wenlin; Genton, Marc G.
2017-01-01
outlyingness with conventional statistical depth. We propose two classifiers based on directional outlyingness and the outlyingness matrix, respectively. Our classifiers provide better performance compared with existing depth-based classifiers when applied to both univariate and multivariate functional data from simulation studies. We also test our methods on two data problems: speech recognition and gesture classification, and obtain results that are consistent with the findings from the simulated data.
Intracrystalline deformation of calcite
Bresser, J.H.P. de
1991-01-01
It is well established from observations on natural calcite tectonites that intracrystalline plastic mechanisms are important during the deformation of calcite rocks in nature. In this thesis, new data are presented on fundamental aspects of deformation behaviour of calcite under conditions where
Directory of Open Access Journals (Sweden)
Brion Philippe
2015-12-01
Full Text Available Using as much administrative data as possible is a general trend among most national statistical institutes. Different kinds of administrative sources, from tax authorities or other administrative bodies, are very helpful material in the production of business statistics. However, these sources often have to be completed by information collected through statistical surveys. This article describes the way Insee has implemented such a strategy in order to produce French structural business statistics. The originality of the French procedure is that administrative and survey variables are used jointly for the same enterprises, unlike the majority of multisource systems, in which the two kinds of sources generally complement each other for different categories of units. The idea is to use, as much as possible, the richness of the administrative sources combined with the timeliness of a survey, even if the latter is conducted only on a sample of enterprises. One main issue is the classification of enterprises within the NACE nomenclature, which is a cornerstone variable in producing the breakdown of the results by industry. At a given date, two values of the corresponding code may coexist: the value of the register, not necessarily up to date, and the value resulting from the data collected via the survey, but only from a sample of enterprises. Using all this information together requires the implementation of specific statistical estimators combining some properties of the difference estimators with calibration techniques. This article presents these estimators, as well as their statistical properties, and compares them with those of other methods.
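The combined estimator described above can be illustrated with a toy difference estimator (a simplified sketch under assumed Horvitz-Thompson weighting, not Insee's actual estimator, which additionally involves calibration):

```python
def difference_estimator(x_all, sample):
    """Estimate the population total of a survey variable y using an
    administrative variable x that is known for every unit.

    x_all  : dict unit -> administrative value (known exhaustively)
    sample : dict unit -> (y_value, inclusion_probability) for surveyed units

    T_hat = sum(x over all units) + sum((y_i - x_i) / pi_i over the sample)
    """
    total_x = sum(x_all.values())
    # the sampled discrepancies, expanded by the inverse inclusion probabilities,
    # correct the administrative total toward the survey concept
    correction = sum((y - x_all[u]) / pi for u, (y, pi) in sample.items())
    return total_x + correction

# toy population: administrative values slightly off the survey truth
x_all = {"a": 10, "b": 20, "c": 30, "d": 40}
sample = {"a": (12, 0.5), "c": (33, 0.5)}  # (y, inclusion probability)
print(difference_estimator(x_all, sample))  # → 110.0
```

The estimator is unbiased whenever the inclusion probabilities are correct, and its variance shrinks as the administrative variable tracks the survey variable more closely, which is exactly why combining the two sources pays off.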
A systematic review of definitions and classification systems of adjacent segment pathology.
Kraemer, Paul; Fehlings, Michael G; Hashimoto, Robin; Lee, Michael J; Anderson, Paul A; Chapman, Jens R; Raich, Annie; Norvell, Daniel C
2012-10-15
Systematic review. To undertake a systematic review to determine how "adjacent segment degeneration," "adjacent segment disease," or clinical pathological processes that serve as surrogates for adjacent segment pathology are classified and defined in the peer-reviewed literature. Adjacent segment degeneration and adjacent segment disease are terms referring to degenerative changes known to occur after reconstructive spine surgery, most commonly at an immediately adjacent functional spinal unit. These can include disc degeneration, instability, spinal stenosis, facet degeneration, and deformity. The true incidence and clinical impact of degenerative changes at the adjacent segment is unclear because there is a lack of a universally accepted classification system that rigorously addresses clinical and radiological issues. A systematic review of the English language literature was undertaken and articles were classified using the Grades of Recommendation Assessment, Development, and Evaluation criteria. Results: Seven classification systems of spinal degeneration, including degeneration at the adjacent segment, were identified. None have been evaluated for reliability or validity specific to patients with degeneration at the adjacent segment. The ways in which terms related to adjacent segment "degeneration" or "disease" are defined in the peer-reviewed literature are highly variable. On the basis of the systematic review presented in this article, no formal classification system for either cervical or thoracolumbar adjacent segment disorders currently exists. No recommendations regarding the use of current classification of degeneration at any segments can be made based on the available literature. A new comprehensive definition for adjacent segment pathology (ASP, the now preferred terminology) has been proposed in this Focus Issue, which reflects the diverse pathology observed at functional spinal units adjacent to previous spinal reconstruction and balances
Advanced data analysis in neuroscience integrating statistical and computational models
Durstewitz, Daniel
2017-01-01
This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering. Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...
Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri
2018-03-01
It has become very important to visualize time series big data. In the paper we shall discuss a new analysis method called “statistical shape analysis” or “geometry driven statistics” on time series statistical data in economics. In the paper, we analyse the changes in agriculture, value added and industry, value added (percentage of GDP) from 2000 to 2010 in Asia. We handle the data as a set of landmarks on a two-dimensional image to see the deformation using the principal components. The point of the analysis method is the principal components of the given formation, which are eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations. The transformations give us information about the local differences between 2000 and 2010. Because the non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology but, in economics, no application can be found. In the paper, we investigate its potential to analyse economic data.
Deformable paper origami optoelectronic devices
He, Jr-Hau
2017-01-19
Deformable optoelectronic devices are provided, including photodetectors, photodiodes, and photovoltaic cells. The devices can be made on a variety of paper substrates, and can include a plurality of fold segments in the paper substrate creating a deformable pattern. Thin electrode layers and semiconductor nanowire layers can be attached to the substrate, creating the optoelectronic device. The devices can be highly deformable, e.g. capable of undergoing strains of 500% or more, bending angles of 25° or more, and/or twist angles of 270° or more. Methods of making the deformable optoelectronic devices and methods of using, e.g. as a photodetector, are also provided.
Horizontal and Vertical Intra-Industry Trade: Is the Empirical Classification Usable?
DEFF Research Database (Denmark)
Nielsen, Jørgen Ulff-Møller; Lüthje, Teit
2002-01-01
On the basis of OECD trade statistics at SITC 5 digit level for the period 1961-1999 this paper shows the classification of international trade in (1) inter-industry trade; (2) horizontal intra-industry; and (3) vertical intra-industry trade used in the empirical trade literature to be unstable...
Kozanecka, Anna; Sarul, Michał; Kawala, Beata; Antoszewska-Smith, Joanna
2016-01-01
Orthodontic classifications make it possible to give an accurate diagnosis but do not indicate an objective orthodontic treatment need. In order to evaluate the need for treatment, it is necessary to use such indicators as the IOTN. The aim of the study was to find (i) relationships between individual diagnosis and objective recommendations for treatment and (ii) an answer to the question whether and which occlusal anomalies play an important role in the objectification of treatment needs. Two hundred three 18-year-old adolescents (104 girls, 99 boys) were examined. In order to recognize occlusal anomalies, the classifications proposed by Orlik-Grzybowska and Ackerman-Proffit were used. The occlusal anomalies were divided into three categories: belonging to both classifications, typical of the Orlik-Grzybowska classification and typical of the Ackerman-Proffit classification. In order to determine the objective need for orthodontic treatment, the Dental Health Component (DHC) of the IOTN was used. The occurrence of the following malocclusions covered by both classifications, namely abnormal overjet, crossbite and Angle's class, had a statistically significant (p < 0.05) influence on the objective need for treatment (DHC > 3). As for the classification by Orlik-Grzybowska, dental malpositions and canine class significantly affected the need for orthodontic treatment, while in the case of the Ackerman-Proffit scheme, it was asymmetry and crowding. There was no statistically significant correlation between past orthodontic treatment and current orthodontic treatment need. IOTN may be affected by a greater number of occlusal anomalies than previously assumed. Orthodontic treatment received in the past slightly reduces the need for treatment in 18-year-olds.
Stochastic deformation of a thermodynamic symplectic structure
Kazinski, P. O.
2008-01-01
A stochastic deformation of a thermodynamic symplectic structure is studied. The stochastic deformation procedure is analogous to the deformation of an algebra of observables like deformation quantization, but for an imaginary deformation parameter (the Planck constant). Gauge symmetries of thermodynamics and corresponding stochastic mechanics, which describes fluctuations of a thermodynamic system, are revealed and gauge fields are introduced. A physical interpretation to the gauge transform...
78 FR 54970 - Cotton Futures Classification: Optional Classification Procedure
2013-09-09
... Service 7 CFR Part 27 [AMS-CN-13-0043] RIN 0581-AD33 Cotton Futures Classification: Optional Classification Procedure AGENCY: Agricultural Marketing Service, USDA. ACTION: Proposed rule. SUMMARY: The... optional cotton futures classification procedure--identified and known as ``registration'' by the U.S...
Nonaffine deformation under compression and decompression of a flow-stabilized solid
Ortiz, Carlos P.; Riehn, Robert; Daniels, Karen E.
2016-08-01
Understanding the particle-scale transition from elastic deformation to plastic flow is central to making predictions about the bulk material properties and response of disordered materials. To address this issue, we perform experiments on flow-stabilized solids composed of micron-scale spheres within a microfluidic channel, in a regime where particle inertia is negligible. Each solid heap exists within a stress field imposed by the flow, and we track the positions of particles in response to single impulses of fluid-driven compression or decompression. We find that the resulting deformation field is well-decomposed into an affine field, with a constant strain profile throughout the solid, and a non-affine field. The magnitude of this non-affine response decays with the distance from the free surface in the long-time limit, suggesting that the distance from jamming plays a significant role in controlling the length scale of plastic flow. Finally, we observe that compressive pulses create more rearrangements than decompressive pulses, an effect that we quantify using the D^2_min statistic for non-affine motion. Unexpectedly, the time scale for the compression response is shorter than for decompression at the same strain (but unequal pressure), providing insight into the coupling between deformation and cage-breaking.
Directory of Open Access Journals (Sweden)
Noman Naseer
2016-01-01
Full Text Available We analyse and compare the classification accuracies of six different classifiers for a two-class mental task (mental arithmetic and rest) using functional near-infrared spectroscopy (fNIRS) signals. The signals of the mental arithmetic and rest tasks from the prefrontal cortex region of the brain for seven healthy subjects were acquired using a multichannel continuous-wave imaging system. After removal of the physiological noises, six features were extracted from the oxygenated hemoglobin (HbO) signals. Two- and three-dimensional combinations of those features were used for classification of mental tasks. In the classification, six different modalities, linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), k-nearest neighbour (kNN), the Naïve Bayes approach, support vector machine (SVM), and artificial neural networks (ANN), were utilized. With these classifiers, the average classification accuracies among the seven subjects for the 2- and 3-dimensional combinations of features were 71.6, 90.0, 69.7, 89.8, 89.5, and 91.4% and 79.6, 95.2, 64.5, 94.8, 95.2, and 96.3%, respectively. ANN showed the maximum classification accuracies: 91.4 and 96.3%. In order to validate the results, a statistical significance test was performed, which confirmed that the p values were statistically significant relative to all of the other classifiers (p < 0.005) using HbO signals.
Deformation aspects of time dependent fracture
International Nuclear Information System (INIS)
Li, C.Y.; Turner, A.P.L.; Diercks, D.R.; Laird, C.; Langdon, T.G.; Nix, W.D.; Swindeman, R.; Wolfer, W.G.; Woodford, D.A.
1979-01-01
For all metallic materials, particularly at elevated temperatures, deformation plays an important role in fracture. On the macro-continuum level, the inelastic deformation behavior of the material determines how stress is distributed in the body and thus determines the driving force for fracture. At the micro-continuum level, inelastic deformation alters the elastic stress singularity at the crack tip and so determines the local environment in which crack advance takes place. At the microscopic and mechanistic level, there are many possibilities for the mechanisms of deformation to be related to those for crack initiation and growth. At elevated temperatures, inelastic deformation in metallic systems is time dependent so that the distribution of stress in a body will vary with time, affecting conditions for crack initiation and propagation. Creep deformation can reduce the tendency for fracture by relaxing the stresses at geometric stress concentrations. It can also, under suitable constraints, cause a concentration of stresses at specific loading points as a result of relaxation elsewhere in the body. A combination of deformation and unequal heating, as in welding, can generate large residual stress which cannot be predicted from the external loads on the body. Acceleration of deformation by raising the temperature can be an effective way to relieve such residual stresses
Acute leukemia classification by ensemble particle swarm model selection.
Escalante, Hugo Jair; Montes-y-Gómez, Manuel; González, Jesús A; Gómez-Gil, Pilar; Altamirano, Leopoldo; Reyes, Carlos A; Reta, Carolina; Rosales, Alejandro
2012-07-01
Acute leukemia is a malignant disease that affects a large proportion of the world population. Different types and subtypes of acute leukemia require different treatments. In order to assign the correct treatment, a physician must identify the leukemia type or subtype. Advanced and precise methods are available for identifying leukemia types, but they are very expensive and not available in most hospitals in developing countries. Thus, alternative methods have been proposed. An option explored in this paper is based on the morphological properties of bone marrow images, where features are extracted from medical images and standard machine learning techniques are used to build leukemia type classifiers. This paper studies the use of ensemble particle swarm model selection (EPSMS), which is an automated tool for the selection of classification models, in the context of acute leukemia classification. EPSMS is the application of particle swarm optimization to the exploration of the search space of ensembles that can be formed by heterogeneous classification models in a machine learning toolbox. EPSMS does not require prior domain knowledge and it is able to select highly accurate classification models without user intervention. Furthermore, specific models can be used for different classification tasks. We report experimental results for acute leukemia classification with real data and show that EPSMS outperformed the best results obtained using manually designed classifiers with the same data. The highest performance using EPSMS was of 97.68% for two-type classification problems and of 94.21% for more than two types problems. To the best of our knowledge, these are the best results reported for this data set. Compared with previous studies, these improvements were consistent among different type/subtype classification tasks, different features extracted from images, and different feature extraction regions. The performance improvements were statistically significant
Defining a core outcome set for adolescent and young adult patients with a spinal deformity.
de Kleuver, Marinus; Faraj, Sayf S A; Holewijn, Roderick M; Germscheid, Niccole M; Adobor, Raphael D; Andersen, Mikkel; Tropp, Hans; Dahl, Benny; Keskinen, Heli; Olai, Anders; Polly, David W; van Hooff, Miranda L; Haanstra, Tsjitske M
2017-12-01
Background and purpose - Routine outcome measurement has been shown to improve performance in several fields of healthcare. National spine surgery registries have been initiated in 5 Nordic countries. However, there is no agreement on which outcomes are essential to measure for adolescent and young adult patients with a spinal deformity. The aim of this study was to develop a core outcome set (COS) that will facilitate benchmarking within and between the 5 countries of the Nordic Spinal Deformity Society (NSDS) and other registries worldwide. Material and methods - From August 2015 to September 2016, 7 representatives (panelists) of the national spinal surgery registries from each of the NSDS countries participated in a modified Delphi study. With a systematic literature review as a basis and the International Classification of Functioning, Disability and Health framework as guidance, 4 consensus rounds were held. Consensus was defined as agreement between at least 5 of the 7 representatives. Data were analyzed qualitatively and quantitatively. Results - Consensus was reached on the inclusion of 13 core outcome domains: "satisfaction with overall outcome of surgery", "satisfaction with cosmetic result of surgery", "pain interference", "physical functioning", "health-related quality of life", "recreation and leisure", "pulmonary fatigue", "change in deformity", "self-image", "pain intensity", "physical function", "complications", and "re-operation". Panelists agreed that the SRS-22r, EQ-5D, and a pulmonary fatigue questionnaire (yet to be developed) are the most appropriate set of patient-reported measurement instruments that cover these outcome domains. Interpretation - We have identified a COS for a large subgroup of spinal deformity patients for implementation and validation in the NSDS countries. This is the first study to further develop a COS in a global perspective.
A New Feature Ensemble with a Multistage Classification Scheme for Breast Cancer Diagnosis
Directory of Open Access Journals (Sweden)
Idil Isikli Esener
2017-01-01
Full Text Available A new and effective feature ensemble with a multistage classification scheme is proposed for implementation in a computer-aided diagnosis (CAD) system for breast cancer diagnosis. A publicly available mammogram image dataset collected during the Image Retrieval in Medical Applications (IRMA) project is utilized to verify the suggested feature ensemble and multistage classification. In building the CAD system, feature extraction is performed on the mammogram region of interest (ROI) images, which are preprocessed by applying a histogram equalization followed by a nonlocal means filtering. The proposed feature ensemble is formed by concatenating the local configuration pattern-based, statistical, and frequency domain features. The classification process of these features is implemented in three cases: a one-stage study, a two-stage study, and a three-stage study. Eight well-known classifiers are used in all cases of this multistage classification scheme. Additionally, the results of the classifiers that provide the top three performances are combined via a majority voting technique to improve the recognition accuracy on both the two- and three-stage studies. Maximum classification accuracies of 85.47%, 88.79%, and 93.52% are attained by the one-, two-, and three-stage studies, respectively. The proposed multistage classification scheme is more effective than single-stage classification for breast cancer diagnosis.
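The majority-voting step used to combine the top three classifiers can be sketched as follows (a generic illustration with hypothetical labels, not the paper's code):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-sample labels from several classifiers by majority vote.
    `predictions` is a list of per-classifier label lists; ties are broken
    by whichever label Counter.most_common encounters first."""
    n_samples = len(predictions[0])
    fused = []
    for i in range(n_samples):
        votes = Counter(clf[i] for clf in predictions)
        fused.append(votes.most_common(1)[0][0])
    return fused

# three classifiers labelling four mammogram ROIs (hypothetical labels)
clf_a = ["benign", "malign", "benign", "malign"]
clf_b = ["benign", "benign", "benign", "malign"]
clf_c = ["malign", "malign", "benign", "benign"]
print(majority_vote([clf_a, clf_b, clf_c]))
# → ['benign', 'malign', 'benign', 'malign']
```

With an odd number of voters and two classes, the fused decision is always well defined, which is one reason a top-three (rather than top-two) combination is convenient.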
Quantifying the Erlenmeyer flask deformity
Carter, A; Rajan, P S; Deegan, P; Cox, T M; Bearcroft, P
2012-01-01
Objective Erlenmeyer flask deformity is a common radiological finding in patients with Gaucher's disease; however, no definition of this deformity exists and the reported prevalence of the deformity varies widely. To devise an easily applied definition of this deformity, we investigated a cohort of knee radiographs in which there was consensus between three experienced radiologists as to the presence or absence of Erlenmeyer flask morphology. Methods Using the presence or absence of Erlenmeyer flask morphology as a benchmark, we measured the diameter of the femur at the level of the physeal scar and serially at defined intervals along the metadiaphysis. Results A measured ratio in excess of 0.57 between the diameter of the femoral shaft 4 cm from the physis to the diameter of the physeal baseline itself on a frontal radiograph of the knee predicted the Erlenmeyer flask deformity with 95.6% sensitivity and 100% specificity in our series of 43 independently diagnosed adults with Gaucher's disease. Application of this method to the distal femur detected the Erlenmeyer flask deformity reproducibly and was simple to carry out. Conclusion Unlike diagnostic assignments based on subjective review, our simple procedure for identifying the modelling deformity is based on robust quantitative measurement: it should facilitate comparative studies between different groups of patients, and may allow more rigorous exploration of the pathogenesis of the complex osseous manifestations of Gaucher's disease to be undertaken. PMID:22010032
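The reported 0.57 ratio criterion is simple enough to express directly (a sketch using hypothetical measurements; only the threshold and the 4 cm landmark are taken from the abstract):

```python
def erlenmeyer_flask_positive(shaft_diam_4cm, physeal_baseline_diam,
                              threshold=0.57):
    """Apply the reported criterion: the deformity is predicted when the
    femoral shaft diameter measured 4 cm from the physis exceeds 0.57
    times the diameter of the physeal baseline on a frontal knee view.
    Returns (prediction, ratio)."""
    ratio = shaft_diam_4cm / physeal_baseline_diam
    return ratio > threshold, round(ratio, 3)

# hypothetical measurements in millimetres
print(erlenmeyer_flask_positive(48.0, 80.0))  # ratio 0.6  -> positive
print(erlenmeyer_flask_positive(40.0, 80.0))  # ratio 0.5  -> negative
```

Using a ratio of two diameters from the same radiograph sidesteps magnification differences between films, which is presumably what makes the criterion easy to apply across departments.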
Use of seismic attributes for sediment classification
Directory of Open Access Journals (Sweden)
Fabio Radomille Santana
2015-04-01
Full Text Available This study seeks to understand the relationships between seismic attributes extracted from 2D high-resolution seismic data and the seafloor sediments of the surveyed area. As seismic attributes are features highly influenced by the medium through which the seismic waves propagate, one can assume that it is possible to characterise the geological nature of the seafloor by using these attributes. Herein, a survey was performed on the continental margin of the South Shetland Islands in Antarctica, where 2D high-resolution seismic data and sediment gravity core samples were acquired simultaneously. A computational script was written to extract the seismic attributes from the data, which were statistically analysed with clustering analyses, such as principal component analysis, dendrograms and k-means classification. The extracted seismic attributes are the amplitude, the instantaneous phase, the instantaneous frequency, the envelope, the time derivative of the envelope, the second derivative of the envelope and the acceleration of phase. Statistical evaluation showed that geological classification of the seafloor sediments is possible by associating these attributes according to their coherence. The methodology developed here seems appropriate for the glacio-marine environment and coarse-to-medium silt sediment found in the study area and may be applied to other regions under the same geological conditions.
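The clustering step can be illustrated with a minimal k-means over toy attribute vectors (a generic sketch, not the authors' script; the attribute values are invented):

```python
import math
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for small attribute vectors (pure-Python sketch).
    Returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        # move each centroid to the mean of its members
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(c) / len(members)
                                     for c in zip(*members))
    return centroids, labels

# toy (amplitude, envelope) attribute pairs for two sediment types
attrs = [(1.0, 0.9), (1.1, 1.0), (0.9, 1.1),
         (5.0, 4.8), (5.2, 5.1), (4.9, 5.0)]
_, labels = kmeans(attrs, 2)
print(labels)  # the first three traces share one cluster, the last three the other
```

In practice the attribute vectors would first be standardised, since amplitude and frequency attributes live on very different scales; k-means is distance-based and otherwise lets the largest-scale attribute dominate.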
The Statistics of Health and Longevity
DEFF Research Database (Denmark)
Zarulli, Virginia
Increases in human longevity have made it critical to distinguish healthy longevity from longevity without regard to health. We present a new method for calculating the statistics of healthy longevity which extends, in several directions, current calculations of health expectancy (HE) and disability-adjusted life years (DALYs), from data on prevalence of health conditions. Current methods focus on binary conditions (e.g., disabled or not disabled) or on categorical classifications (e.g., in good, poor, or very bad health) and report only expectations. Our method, based on Markov chain theory...
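For contrast with the proposed Markov chain approach, the conventional prevalence-based calculation of health expectancy that reports only an expectation (the Sullivan method) can be sketched as follows (toy life-table numbers, not data from the study):

```python
def sullivan_health_expectancy(person_years, prevalence, radix=100_000):
    """Conventional Sullivan-method health expectancy: weight the life-table
    person-years lived in each age interval (L_x) by the fraction of that
    interval spent free of the health condition, then divide by the radix."""
    healthy_years = sum(L * (1.0 - p)
                        for L, p in zip(person_years, prevalence))
    return healthy_years / radix

# toy 3-age-group life table (L_x per 100,000 births) and disability prevalence
L_x = [2_500_000, 4_000_000, 1_500_000]
prev = [0.02, 0.10, 0.40]
print(round(sullivan_health_expectancy(L_x, prev), 2))  # → 69.5
```

Because the result is a single expectation, it says nothing about the variance or distribution of healthy lifespan across individuals, which is precisely the gap the Markov chain extension addresses.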
International Nuclear Information System (INIS)
Wang, T.S.; Hou, R.J.; Lv, B.; Zhang, M.; Zhang, F.C.
2007-01-01
The microstructure evolution and the change in deformation mechanism in 0.98C-8.3Mn-0.04N steel during compressive deformation at room temperature have been studied as a function of the reduction in the range of 20-60%. Experimental results show that, as the reduction increases, the microstructure of the deformed sample changes from dislocation substructures into dominant twins plus dislocations. This suggests that the plastic deformation mechanism changes from dislocation slip to dominant deformation twinning. The minimum reduction for the onset of deformation twinning is estimated to lie between 30 and 40%. As the reduction increases further beyond 40%, deformation twinning remains operative, the thickness of the deformation twins gradually decreases to the nanoscale, and shear bands occur. These high-density twins can be curved by the formation of shear bands. In addition, both transmission electron microscopy and X-ray diffraction examinations confirm the absence of deformation-induced martensite in these deformed samples.
Efficient Feature Selection and Classification of Protein Sequence Data in Bioinformatics
Faye, Ibrahima; Samir, Brahim Belhaouari; Md Said, Abas
2014-01-01
Bioinformatics has been an emerging area of research for the last three decades. The ultimate aims of bioinformatics are to store and manage biological data, and to develop and analyze computational tools to enhance their understanding. The size of the data accumulated under various sequencing projects is increasing exponentially, which presents difficulties for experimental methods. To reduce the gap between newly sequenced proteins and proteins with known functions, many computational techniques involving classification and clustering algorithms have been proposed. The classification of protein sequences into existing superfamilies is helpful in predicting the structure and function of the large amount of newly discovered proteins. Existing classification results are unsatisfactory due to the huge number of features obtained through various feature encoding methods. In this work, a statistical metric-based feature selection technique is proposed in order to reduce the size of the extracted feature vector. The proposed method of protein classification shows significant improvement in terms of performance measure metrics: accuracy, sensitivity, specificity, recall, F-measure, and so forth. PMID:25045727
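One common statistical metric for this kind of feature selection is the Fisher score; the sketch below ranks two toy features by it (an illustrative choice of metric, not necessarily the one the paper proposes):

```python
import statistics

def fisher_score(feature_values, labels):
    """Fisher score for one feature: between-class separation divided by
    within-class spread. Higher scores mark more discriminative features."""
    classes = sorted(set(labels))
    overall_mean = statistics.fmean(feature_values)
    between, within = 0.0, 0.0
    for c in classes:
        vals = [v for v, lab in zip(feature_values, labels) if lab == c]
        between += len(vals) * (statistics.fmean(vals) - overall_mean) ** 2
        within += len(vals) * statistics.pvariance(vals)
    return between / within if within else float("inf")

# toy protein-sequence features: f0 separates the classes, f1 does not
labels = ["kinase", "kinase", "globin", "globin"]
f0 = [0.9, 1.1, 4.9, 5.1]
f1 = [2.0, 3.0, 2.1, 2.9]
scores = [fisher_score(f, labels) for f in (f0, f1)]
print([round(s, 2) for s in scores])  # → [400.0, 0.0]
```

Keeping only the top-scoring features shrinks the encoded feature vector before training, which is the size reduction the abstract is after.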
Digitisation of films and texture analysis for digital classification of pulmonary opacities
International Nuclear Information System (INIS)
Desaga, J.F.; Dengler, J.; Wolf, T.; Engelmann, U.; Scheppelmann, D.; Meinzer, H.P.
1988-01-01
The study aimed at evaluating the effect of different methods of digitisation of radiographic films on the digital classification of pulmonary opacities. Test sets from the standard of the International Labour Office (ILO) Classification of Radiographs of Pneumoconiosis were prepared by film digitisation using a scanning microdensitometer or a video digitiser based on a personal computer equipped with a real-time digitiser board and a vidicon or a charge-coupled device (CCD) camera. Seven different algorithms were used for texture analysis, resulting in 16 texture parameters for each region. All methods used for texture analysis were independent of the mean grey value level and the size of the image analysed. Classification was performed by discriminant analysis using the classes from the ILO classification. A hit ratio of at least 85% was achieved for digitisation by the scanning microdensitometer or the vidicon, while the corresponding results of the CCD camera were significantly worse. Classification by texture analysis of opacities in chest X-rays of pneumoconiosis digitised by a personal-computer-based video digitiser and a vidicon is of equal quality to digitisation by a scanning microdensitometer. Correct classification of 90% was achieved via the described statistical approach. (orig.)
Plastic deformation of indium nanostructures
International Nuclear Information System (INIS)
Lee, Gyuhyon; Kim, Ju-Young; Burek, Michael J.; Greer, Julia R.; Tsui, Ting Y.
2011-01-01
Highlights:
- Indium nanopillars display two different deformation mechanisms.
- ∼80% exhibited low flow stresses near that of bulk indium.
- Low-strength nanopillars have strain rate sensitivity similar to bulk indium.
- ∼20% of compressed indium nanopillars deformed at nearly theoretical strengths.
- Low-strength samples do not exhibit strength size effects.

Abstract: Mechanical properties and morphology of cylindrical indium nanopillars, fabricated by electron beam lithography and electroplating, are characterized in uniaxial compression. Time-dependent deformation and the influence of size on nanoscale indium mechanical properties were investigated. The results show two fundamentally different deformation mechanisms which govern plasticity in these indium nanostructures. We observed that the majority of indium nanopillars deform at engineering stresses near the bulk values (Type I), with a small fraction sustaining flow stresses approaching the theoretical limit for indium (Type II). The results also show that the strain rate sensitivity and flow stresses in Type I indium nanopillars are similar to bulk indium, with no apparent size effects.
Interactive Character Deformation Using Simplified Elastic Models
Luo, Z.
2016-01-01
This thesis describes the results of our research into realistic skin and model deformation methods aimed at the field of character deformation and animation. The main contributions lie in the properties of our deformation scheme. Our approach preserves the volume of the deformed object while
Classification of refrigerants; Classification des fluides frigorigenes
Energy Technology Data Exchange (ETDEWEB)
NONE
2001-07-01
This document is based on the US standard ANSI/ASHRAE 34, published in 2001 and entitled 'Designation and safety classification of refrigerants'. This classification organizes, in a clear and internationally consistent way, all the refrigerants used in the world, through a codification corresponding to their chemical composition. This note explains the codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and safety classification (toxicity, flammability, case of mixtures). (J.S.)
Nonlinear Deformable-body Dynamics
Luo, Albert C J
2010-01-01
"Nonlinear Deformable-body Dynamics" mainly consists in a mathematical treatise of approximate theories for thin deformable bodies, including cables, beams, rods, webs, membranes, plates, and shells. The intent of the book is to stimulate more research in the area of nonlinear deformable-body dynamics not only because of the unsolved theoretical puzzles it presents but also because of its wide spectrum of applications. For instance, the theories for soft webs and rod-reinforced soft structures can be applied to biomechanics for DNA and living tissues, and the nonlinear theory of deformable bodies, based on the Kirchhoff assumptions, is a special case discussed. This book can serve as a reference work for researchers and a textbook for senior and postgraduate students in physics, mathematics, engineering and biophysics. Dr. Albert C.J. Luo is a Professor of Mechanical Engineering at Southern Illinois University, Edwardsville, IL, USA. Professor Luo is an internationally recognized scientist in the field of non...
Developing a Virtual Rock Deformation Laboratory
Zhu, W.; Ougier-simonin, A.; Lisabeth, H. P.; Banker, J. S.
2012-12-01
Experimental rock physics plays an important role in advancing earthquake research. Despite its importance in geophysics, reservoir engineering, waste deposits and energy resources, most geology departments in U.S. universities don't have rock deformation facilities. A virtual deformation laboratory can serve as an efficient tool to help geology students, nationally and internationally, learn about rock deformation. Working with computer science engineers, we built a virtual deformation laboratory that aims at fostering user interaction to facilitate classroom and outreach teaching and learning. The virtual lab is centered around a triaxial deformation apparatus in which laboratory measurements of mechanical and transport properties such as stress, axial and radial strains, acoustic emission activities, wave velocities, and permeability are demonstrated. A student user can create her avatar to enter the virtual lab. In the virtual lab, the avatar can browse and choose among various rock samples, determine the testing conditions (pressure, temperature, strain rate, loading paths), then operate the virtual deformation machine to observe how deformation changes the physical properties of rocks. Actual experimental results on the mechanical, frictional, sonic, acoustic and transport properties of different rocks at different conditions are compiled. The data acquisition system in the virtual lab is linked to the compiled experimental data. Structural and microstructural images of deformed rocks are uploaded and linked to different deformation tests. The integration of the microstructural images and the deformation data allows the student to visualize how forces reshape the structure of the rock and change its physical properties. The virtual lab is built using the Game Engine. The geological background, outstanding questions related to the geological environment, and physical and mechanical concepts associated with the problem will be illustrated on the web portal. In
Evaluation of interobserver agreement in Albertoni's classification for mallet finger
Directory of Open Access Journals (Sweden)
Vinícius Alexandre de Souza Almeida
ABSTRACT Objective: To measure the reliability of Albertoni's classification for mallet finger. Methods: Agreement study. Forty-three radiographs of patients with mallet finger were assessed by 19 responders (12 hand surgeons and seven residents). Injuries were classified by Albertoni's classification. For agreement comparison, lesions were grouped as: (A) tendon avulsion; (B) avulsion fracture; (C) fracture of the dorsal lip; and (D) physis injury, with each group divided into two subgroups. Agreement was assessed by Fleiss's modification of the kappa statistic. Results: Agreement was excellent for Group A (k = 0.95 [0.93-0.97]) and remained good when separated into A1 and A2. Group B was moderate (k = 0.42 [0.39-0.44]) and poor when separated into B1 and B2. In Group C, agreement was good (k = 0.72 [0.70-0.74]), but when separated into C1 and C2 it became moderate. Group D was always poor (k = 0.16 [0.14-0.19]). The overall agreement was moderate (k = 0.57 [0.56-0.58]). Conclusion: By the method used in this research, Albertoni's classification is considered reproducible for interobserver agreement.
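Fleiss's kappa, the multi-rater agreement statistic used in studies like this one, can be computed directly from a table of rating counts. A minimal sketch; the rating data below is a toy example, not the study's:

```python
# Minimal Fleiss' kappa. `ratings` rows = subjects (e.g. radiographs),
# columns = number of raters assigning each category to that subject.

def fleiss_kappa(ratings):
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])          # raters per subject (constant)
    total = n_subjects * n_raters
    # proportion of all assignments falling in each category
    p_j = [sum(row[j] for row in ratings) / total
           for j in range(len(ratings[0]))]
    # per-subject observed agreement
    p_i = [(sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
           for row in ratings]
    p_bar = sum(p_i) / n_subjects       # mean observed agreement
    p_e = sum(p * p for p in p_j)       # chance agreement
    return (p_bar - p_e) / (1 - p_e)

# 4 subjects rated by 3 raters into 2 categories
ratings = [[3, 0], [3, 0], [0, 3], [1, 2]]
print(round(fleiss_kappa(ratings), 3))  # → 0.657
```

Perfect agreement on every subject yields kappa = 1.0; agreement no better than chance yields values near 0.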
Energy Technology Data Exchange (ETDEWEB)
Bugiak, B.; Weber, L. [Saskatchewan Univ., Saskatoon, SK (Canada)
2009-07-01
Exposure to aryl hydrocarbon receptor (AhR) agonists in fish causes lethal disturbances in fish development, but the effects of acute AhR agonist exposure on the cardiovascular system and on deformities remain unclear. This study addressed this issue by performing a series of experiments on zebrafish (Danio rerio). The authors hypothesized that genes needed for cardiovascular regulation (PTGS) would exhibit a stronger link to deformities than detoxification enzymes (CYPs). Zebrafish eggs were exposed aqueously until 4 days post-fertilization (dpf) to the AhR agonists benzo(a)pyrene (BaP) or 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), alone and in combination with the putative AhR antagonists resveratrol or alpha-naphthoflavone (ANF). Gene expression was measured using real-time, reverse transcriptase PCR in zebrafish at 5 and 10 dpf. Although mortalities did not differ considerably among groups at 10 dpf, deformities increased significantly after BaP-ANF at 5 dpf and after BaP at 10 dpf, but not after TCDD treatment. CYP and PTGS isozymes exhibited small but statistically significant changes at 5 dpf. By 10 dpf, expression returned to control values. In general, CYP1A and PTGS-1 expression at 5 dpf were positively correlated with deformities, while all other genes were negatively correlated with deformities. It was concluded that changes in CYP1A, CYP1C2, and PTGS-1 gene expression at 5 dpf are associated with developmental deformities, but additional work is needed to determine which has the most important mechanistic link.
FPGA-Based Online PQD Detection and Classification through DWT, Mathematical Morphology and SVD
Directory of Open Access Journals (Sweden)
Misael Lopez-Ramirez
2018-03-01
Power quality disturbances (PQD) in electric distribution systems can be produced by the use of non-linear loads or by environmental circumstances, causing electrical equipment malfunction and reducing its useful life. Detecting and classifying different PQDs implies great effort in planning and structuring the monitoring system. The main disadvantage of most works in the literature is that they treat a limited number of electrical disturbances through personal computer (PC)-based computation techniques, which makes online PQD classification difficult. In this work, the novel contribution is a methodology for PQD recognition and classification through the discrete wavelet transform, mathematical morphology, singular value decomposition, and statistical analysis. Furthermore, timely and reliable classification of different disturbances is necessary; hence, a field-programmable gate array (FPGA)-based integrated circuit is developed to offer a portable hardware processing unit that performs fast, online PQD classification. The obtained numerical and experimental results demonstrate that the proposed method guarantees high effectiveness during online PQD detection and classification of real voltage/current signals.
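As a hedged sketch of the DWT stage only: a single-level Haar transform (the paper's actual wavelet and thresholds are not given here) can flag a disturbed segment of a sampled voltage signal by the energy of its detail coefficients, which spikes when a transient is present. The synthetic signal and spike are illustrative.

```python
# One-level Haar DWT; transient disturbances concentrate energy in the
# detail band, which a simple energy measure can detect.
import math

def haar_dwt(signal):
    """One Haar decomposition level: (approximation, detail) coefficients."""
    approx = [(signal[i] + signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / math.sqrt(2)
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detail_energy(signal):
    """Energy of the detail band; transients raise it sharply."""
    _, detail = haar_dwt(signal)
    return sum(d * d for d in detail)

fs, f = 3200, 50                       # sampling and fundamental frequency (Hz)
clean = [math.sin(2 * math.pi * f * n / fs) for n in range(256)]
disturbed = clean[:]
for n in range(121, 126):              # inject a short transient (a PQD stand-in)
    disturbed[n] += 1.0

print(detail_energy(clean) < detail_energy(disturbed))  # → True
```

A real detector would repeat this over several decomposition levels and feed the energies, with morphological and SVD features, to a classifier.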
McDermott, P A; Hale, R L
1982-07-01
Tested diagnostic classifications of child psychopathology produced by a computerized technique known as multidimensional actuarial classification (MAC) against the criterion of expert psychological opinion. The MAC program applies a series of statistical decision rules to assess the importance of, and relationships among, several dimensions of classification, i.e., intellectual functioning, academic achievement, adaptive behavior, and social and behavioral adjustment, to perform differential diagnosis of children's mental retardation, specific learning disabilities, behavioral and emotional disturbance, possible communication or perceptual-motor impairment, and academic under- and overachievement in reading and mathematics. Classifications rendered by MAC are compared to those offered by two expert child psychologists for cases of 73 children referred for psychological services. The experts' agreement with MAC was significant for all classification areas, as was MAC's agreement with the experts held as a conjoint reference standard. Whereas the experts' agreement with MAC averaged 86.0% above chance, their agreement with one another averaged 76.5% above chance. Implications of the findings are explored and potential advantages of the systems-actuarial approach are discussed.
The Spherical Deformation Model
DEFF Research Database (Denmark)
Hobolth, Asgar
2003-01-01
Miller et al. (1994) describe a model for representing spatial objects with no obvious landmarks. Each object is represented by a global translation and a normal deformation of a sphere. The normal deformation is defined via the orthonormal spherical-harmonic basis. In this paper we analyse the s...
Classification in Astronomy: Past and Present
Feigelson, Eric
2012-03-01
used today with many refinements by Gerard de Vaucouleurs and others. Supernovae, nearly all of which are found in external galaxies, have a complicated classification scheme: Type I with subtypes Ia, Ib, Ic, Ib/c pec and Type II with subtypes IIb, IIL, IIP, and IIn (Turatto 2003). The classification is based on elemental abundances in optical spectra and on optical light curve shapes. Tadhunter (2009) presents a three-dimensional classification of active galactic nuclei involving radio power, emission line width, and nuclear luminosity. These taxonomies have played enormously important roles in the development of astronomy, yet all were developed using heuristic methods. Many are based on qualitative and subjective assessments of spatial, temporal, or spectral properties. A qualitative, morphological approach to astronomical studies was explicitly promoted by Zwicky (1957). Other classifications are based on quantitative criteria, but these criteria were developed by subjective examination of training datasets. For example, starburst galaxies are discriminated from narrow-line Seyfert galaxies by a curved line in a diagram of the ratios of four emission lines (Veilleux and Osterbrock 1987). Class II young stellar objects have been defined by a rectangular region in a mid-infrared color-color diagram (Allen et al. 2004). Short and hard gamma-ray bursts are discriminated by a dip in the distribution of burst durations (Kouveliotou et al. 2000). In no case was a statistical or algorithmic procedure used to define the classes.
Representations of the q-deformed algebras U_q(so(2,1)) and U_q(so(3,1))
International Nuclear Information System (INIS)
Gavrilik, O.M.; Klimyk, A.U.
1993-01-01
Representations of the algebra U_q(so(2,1)) are studied. This algebra is a q-deformation of the universal enveloping algebra U(so(2,1)) of the Lie algebra of the group SO_0(2,1) and differs from the quantum algebra U_q(su(1,1)). Classifications of irreducible representations and of infinitesimally unitary irreducible representations of U_q(so(2,1)) are given. The sets of irreducible representations and of infinitesimally unitary irreducible representations of the algebra U_q(so(3,1)) are also given. We also consider representations of U_q(so(n,1)) which are of class 1 with respect to the subalgebra U_q(so(n)). (author). 22 refs
Deformation-induced structural changes of amorphous Ni0.5Zr0.5 in molecular-dynamic simulations
International Nuclear Information System (INIS)
Brinkmann, K.
2006-01-01
The present work investigates the plastic deformation of metallic glasses with the aid of molecular-dynamics simulations. The parameters of the model system are adapted to those of a NiZr alloy; in particular, the composition Ni0.5Zr0.5 is used. The analyzed deformation simulations are conducted for small systems with 5184 atoms and large systems with 17500 atoms in a periodic simulation cell. The deformation simulations of pre-deformed samples are carried out either at constant shear rate or at constant load, the latter mode modeling a creep experiment. Stress-strain curves for pre-deformed samples show a less pronounced stress-overshoot phenomenon. Creep simulations of samples deformed beyond the yield region indicate a drastically reduced viscosity in these systems when compared to samples pre-deformed only up to the linear regime of the stress-strain curve. From analyzing the local atomic topology, it is found that the transition from the highly viscous, hard-to-deform state of the undeformed or only weakly strained system into the easy-to-deform flow state, present when the system is strained far beyond the yielding regime of the stress-strain curve, is connected with the formation of a region containing atoms with massive changes in their topology, oriented along a diagonal plane of the simulation cell. The degree of localization of these deformation bands is influenced by temperature and shear rate. In subsequent deformations of pre-deformed samples, the regions with massive changes in the atomic topology are again susceptible to changes in the local atomic topology. Using statistical methods, a significant difference between the distribution of atomic properties for the group of atoms with massive topology changes and that for the group of atoms without changes in their topology becomes quantitatively ascertainable. From the differences in structural properties, e.g. potential energy, cage volumes, angular order parameters or atomic
Directory of Open Access Journals (Sweden)
Yaoying Huang
2018-01-01
The main objective of this study is to present a method of determining the viscoelastic deformation monitoring index of a roller-compacted concrete (RCC) gravity dam in an alpine region. By focusing on a modified deformation monitoring model considering frost heave, together with back-analyzed mechanical parameters of the dam, the viscoelastic working state of the dam is illustrated, followed by an investigation and designation of adverse load cases using the orthogonal test method. The water pressure component is then calculated by the finite element method, while temperature, time-effect, and frost-heave components are obtained through a deformation statistical model considering frost heave. The viscoelastic deformation monitoring index is eventually determined by the small probability and maximum entropy methods. The results show that (a) with an abnormal probability of 1%, the dam deformation monitoring index for the small probability and maximum entropy methods is 23.703 mm and 22.981 mm, respectively; the maximum measured displacement of the dam is thus less than the deformation monitoring index, which indicates that the dam is currently in a state of safe operation; and (b) the deformation monitoring index obtained using the orthogonal test method is more accurate owing to the full consideration of more random factors. The method gained from this study will likely be of use in diagnosing the working state of RCC dams in alpine regions.
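A minimal sketch of the small-probability method in its simplest form: fit a normal distribution to historical displacement extremes and take the quantile exceeded with the abnormal probability (1% here, as in the study). The displacement values below are invented for illustration and are not the dam's data; the study's actual model also separates water-pressure, temperature, time-effect, and frost-heave components first.

```python
# Hedged sketch: deformation monitoring index as the (1 - p) quantile of a
# normal fit to historical extreme displacements (small-probability method).
from statistics import NormalDist, mean, stdev

def monitoring_index(extremes_mm, abnormal_prob=0.01):
    dist = NormalDist(mean(extremes_mm), stdev(extremes_mm))
    # displacement exceeded with probability `abnormal_prob`
    return dist.inv_cdf(1 - abnormal_prob)

# illustrative annual extreme displacements (mm)
annual_extremes = [18.2, 19.1, 17.6, 20.3, 18.9, 19.7, 18.4, 20.0]
print(round(monitoring_index(annual_extremes), 2))  # → 21.21
```

A measured displacement below this index is read as normal operation; exceeding it triggers further diagnosis.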
Passive sorting of capsules by deformability
Haener, Edgar; Juel, Anne
We study passive sorting according to deformability of liquid-filled ovalbumin-alginate capsules. We present results for two sorting geometries: a straight channel with a half-cylindrical obstruction, and a pinched-flow fractionation (PFF) device adapted for use with capsules. In the half-cylinder device, the capsules deform as they encounter the obstruction and travel around the half-cylinder. The distance from the capsule's centre of mass to the surface of the half-cylinder depends on deformability, and separation between capsules of different deformability is amplified by diverging streamlines in the channel expansion downstream of the obstruction. We show experimentally that capsules can be sorted according to deformability, with their downstream position depending on capillary number only, and we establish the sensitivity of the device to experimental variability. In the PFF device, particles are compressed against a wall using a strong pinching flow. We show that capsule deformation increases with the intensity of the pinching flow, but that the downstream capsule position is not set by deformation in the device. However, when using the PFF device like a T-junction, we achieve improved sorting resolution compared to the half-cylinder device.
Image Analysis and Classification Based on Soil Strength
2016-08-01
Impact Hammer, which is light, easy to operate, and cost-effective. The Clegg Impact Hammer measures stiffness of the soil surface by dropping a... effect on out-of-scene classifications. More statistical analysis should, however, be done to compare the measured field spectra, the WV2 training...
Dissimilarity Application in Digitized Mammographic Images Classification
Directory of Open Access Journals (Sweden)
Ubaldo Bottigli
2006-06-01
Purpose of this work is the development of an automatic classification system which could be useful for radiologists in the investigation of breast cancer. The software has been designed in the framework of the MAGIC-5 collaboration. In the traditional way of learning from examples of objects, the classifiers are built in a feature space. However, an alternative can be found by constructing decision rules on dissimilarity (distance) representations. In such a recognition process a new object is described by its distances to (a subset of) the training samples. The use of dissimilarities is especially of interest when features are difficult to obtain or when they have little discriminative power. In the automatic classification system, the suspicious regions with high probability of including a lesion are extracted from the image as regions of interest (ROIs). Each ROI is characterized by features extracted from a co-occurrence matrix containing spatial statistical information on the ROI pixel grey tones. A dissimilarity representation of these features is made before classification. A feed-forward neural network is employed to distinguish pathological records from non-pathological ones using the new features. The results obtained in terms of sensitivity and specificity will be presented.
Pancreatic neuroendocrine tumours: correlation between MSCT features and pathological classification
Energy Technology Data Exchange (ETDEWEB)
Luo, Yanji; Dong, Zhi; Li, Zi-Ping; Feng, Shi-Ting [The First Affiliated Hospital, Sun Yat-Sen University, Department of Radiology, Guangzhou, Guangdong (China); Chen, Jie [The First Affiliated Hospital, Sun Yat-Sen University, Department of Gastroenterology, Guangzhou, Guangdong (China); Chan, Tao; Chen, Minhu [Union Hospital, Hong Kong, Medical Imaging Department, Shatin, N.T. (China); Lin, Yuan [The First Affiliated Hospital, Sun Yat-Sen University, Department of Pathology, Guangzhou, Guangdong (China)
2014-11-15
We aimed to evaluate the multi-slice computed tomography (MSCT) features of pancreatic neuroendocrine neoplasms (P-NENs) and analyse the correlation between the MSCT features and pathological classification of P-NENs. Forty-one patients, preoperatively investigated by MSCT and subsequently operated on with a histological diagnosis of P-NENs, were included. Various MSCT features of the primary tumour, lymph node, and distant metastasis were analysed. The relationship between MSCT features and pathologic classification of P-NENs was analysed with univariate and multivariate models. Contrast-enhanced images showed significant differences among the three grades of tumours in the absolute enhancement (P = 0.013) and relative enhancement (P = 0.025) at the arterial phase. Univariate analysis revealed statistically significant differences among the tumours of different grades (based on World Health Organization [WHO] 2010 classification) in tumour size (P = 0.001), tumour contour (P < 0.001), cystic necrosis (P = 0.001), tumour boundary (P = 0.003), dilatation of the main pancreatic duct (P = 0.001), peripancreatic tissue or vascular invasion (P < 0.001), lymphadenopathy (P = 0.011), and distant metastasis (P = 0.012). Multivariate analysis suggested that only peripancreatic tissue or vascular invasion (HR 3.934, 95 % CI, 0.426-7.442, P = 0.028) was significantly associated with WHO 2010 pathological classification. MSCT is helpful in evaluating the pathological classification of P-NENs. (orig.)
Perceptual transparency from image deformation.
Kawabe, Takahiro; Maruya, Kazushi; Nishida, Shin'ya
2015-08-18
Human vision has a remarkable ability to perceive two layers at the same retinal locations, a transparent layer in front of a background surface. Critical image cues to perceptual transparency, studied extensively in the past, are changes in luminance or color that could be caused by light absorptions and reflections by the front layer, but such image changes may not be clearly visible when the front layer consists of a pure transparent material such as water. Our daily experiences with transparent materials of this kind suggest that an alternative potential cue of visual transparency is image deformations of a background pattern caused by light refraction. Although previous studies have indicated that these image deformations, at least static ones, play little role in perceptual transparency, here we show that dynamic image deformations of the background pattern, which could be produced by light refraction on a moving liquid's surface, can produce a vivid impression of a transparent liquid layer without the aid of any other visual cues as to the presence of a transparent layer. Furthermore, a transparent liquid layer perceptually emerges even from a randomly generated dynamic image deformation as long as it is similar to real liquid deformations in its spatiotemporal frequency profile. Our findings indicate that the brain can perceptually infer the presence of "invisible" transparent liquids by analyzing the spatiotemporal structure of dynamic image deformation, for which it uses a relatively simple computation that does not require high-level knowledge about the detailed physics of liquid deformation.
Impact of dentofacial deformity and motivation for treatment: a qualitative study.
Ryan, Fiona S; Barnard, Matthew; Cunningham, Susan J
2012-06-01
Satisfaction with the outcome of orthognathic treatment is generally high; however, an important minority remains dissatisfied with the results. The reasons for this could be inadequate patient understanding and preparation, external motivation, and unrealistic expectations. In-depth appreciation of these issues can be obtained using qualitative research methods, but there is a paucity of qualitative research in this field. This was a cross-sectional qualitative study of orthognathic patients conducted at a teaching hospital. In-depth interviews were conducted with 18 prospective orthognathic patients. The data were managed by using the framework approach and analyzed by using the critical qualitative theory. Two main themes were explored in the interviews: the impact of the dentofacial deformity and the motivation for treatment. Both the everyday problems of living with a dentofacial deformity and the motivation for seeking treatment could be classified either as exclusively practical (including functional and structural), exclusively psychological (including psychosocial and esthetic), or a combination. Different coping strategies were also described. The sources of motivation ranged from purely external to purely internal, with most subjects falling between these two extremes. In this article, we present a classification of the impact of dentofacial deformity that refines the traditional one of esthetic, functional, and psychosocial factors. The motivating factors, together with the triggers for accessing treatment and the source of motivation, are generally linked directly or indirectly to the problem and the impact of the condition. However, in a few patients, the motivation might not relate to the impact of the problem but to a complex array of other factors such as personality, upbringing, and relationships. Therefore, clinicians should not make assumptions but explore these factors on an individual basis without preconceived ideas. Copyright
A neural network approach for radiographic image classification in NDT
International Nuclear Information System (INIS)
Lavayssiere, B.
1993-05-01
Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized in a well-defined protocol. The aim of EDF is to develop a non-destructive testing system for recognizing defects. In this note, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, characterize the poor quality of the images under study, and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having shown that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighbourhood sizes, the shape of variograms, and the location of regional minima. We demonstrate that the anisotropy and surface of the series of 'comet tails' associated with probability matrices, the variogram slope and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then realized by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (author). 5 figs., 21 refs
Hu, Li; Jiang, Shuyong; Zhou, Tao; Tu, Jian; Shi, Laixin; Chen, Qiang; Yang, Mingbo
2017-10-13
Numerical modeling of microstructure evolution in various regions during uniaxial compression and canning compression of NiTi shape memory alloy (SMA) is studied through combined macroscopic and microscopic finite element simulation in order to investigate plastic deformation of NiTi SMA at 400 °C. In this approach, the macroscale material behavior is modeled with a relatively coarse finite element mesh, and then the corresponding deformation history in some selected regions in this mesh is extracted by the sub-model technique of the finite element code ABAQUS and subsequently used as boundary conditions for the microscale simulation by means of the crystal plasticity finite element method (CPFEM). Simulation results show that NiTi SMA exhibits an inhomogeneous plastic deformation at the microscale. Moreover, regions that suffered canning compression sustain more homogeneous plastic deformation by comparison with the corresponding regions subjected to uniaxial compression. The mitigation of inhomogeneous plastic deformation contributes to reducing the statistically stored dislocation (SSD) density in polycrystalline aggregation and also to reducing the difference of stress level in various regions of the deformed NiTi SMA sample, and therefore to sustaining large plastic deformation in the canning compression process.
Chen, Yifei; Sun, Yuxing; Han, Bing-Qing
2015-01-01
Protein interaction article classification is a text classification task in the biological domain to determine which articles describe protein-protein interactions. Since the feature space in text classification is high-dimensional, feature selection is widely used to reduce the dimensionality of features and speed up computation without sacrificing classification performance. Many existing feature selection methods are based on the statistical measures of document frequency and term frequency. One potential drawback of these methods is that they treat features separately. Hence, we first design a similarity measure over context information that takes into account word co-occurrences and phrase chunks around the features. Then we introduce the similarity of context information into the importance measure of the features, in place of document and term frequency, and thereby propose new context similarity-based feature selection methods. Their performance is evaluated on two protein interaction article collections and compared against the frequency-based methods. The experimental results reveal that the context similarity-based methods perform better in terms of the F1 measure and the dimension reduction rate. Benefiting from the context information surrounding the features, the proposed methods can select distinctive features effectively for protein interaction article classification.
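A hedged sketch in the spirit of the context-similarity idea above: represent each candidate feature word by the counts of words occurring in a small window around it, and compare features by cosine similarity of those context vectors. The corpus, window size, and words are toy examples, not the authors' implementation (which also uses phrase chunks).

```python
# Illustrative context-vector similarity between candidate feature words.
from collections import Counter
import math

def context_vector(corpus_tokens, feature, window=2):
    """Counts of words co-occurring with `feature` within +/- window."""
    vec = Counter()
    for i, tok in enumerate(corpus_tokens):
        if tok == feature:
            lo = max(0, i - window)
            hi = min(len(corpus_tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vec[corpus_tokens[j]] += 1
    return vec

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in u if w in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

tokens = ("kinase binds receptor in vitro "
          "kinase interacts receptor in vivo "
          "buffer stored cold room overnight").split()
# "binds" and "interacts" share contexts; "stored" does not
v_binds = context_vector(tokens, "binds")
v_inter = context_vector(tokens, "interacts")
v_store = context_vector(tokens, "stored")
print(cosine(v_binds, v_inter) > cosine(v_binds, v_store))  # → True
```

Features whose contexts resemble those of known interaction terms would then receive a higher importance score than frequency alone would give them.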
SoFoCles: feature filtering for microarray classification based on gene ontology.
Papachristoudis, Georgios; Diplaris, Sotiris; Mitkas, Pericles A
2010-02-01
Marker gene selection has been an important research topic in the classification analysis of gene expression data. Current methods try to reduce the "curse of dimensionality" by using statistical intra-feature set calculations, or classifiers that are based on the given dataset. In this paper, we present SoFoCles, an interactive tool that enables semantic feature filtering in microarray classification problems with the use of external, well-defined knowledge retrieved from the Gene Ontology. The notion of semantic similarity is used to derive genes that are involved in the same biological pathway during the microarray experiment, by enriching a feature set that has been initially produced with legacy methods. Among its other functionalities, SoFoCles offers a large repository of semantic similarity methods that are used to derive feature sets and marker genes. The structure and functionality of the tool are discussed in detail, as well as its ability to improve classification accuracy. Through experimental evaluation, SoFoCles is shown to outperform other classification schemes in terms of classification accuracy on two real datasets using different semantic similarity computation approaches.
Associative and Lie deformations of Poisson algebras
Remm, Elisabeth
2011-01-01
Considering a Poisson algebra as a nonassociative algebra satisfying the Markl-Remm identity, we study deformations of Poisson algebras as deformations of this nonassociative algebra. This gives a natural interpretation of deformations that preserve the underlying associative structure, and we also study deformations that preserve the underlying Lie algebra.
Statistics of Monte Carlo methods used in radiation transport calculation
International Nuclear Information System (INIS)
Datta, D.
2009-01-01
Radiation transport calculations can be carried out using either deterministic or statistical methods. Radiation transport calculation based on statistical methods is the basic theme of the Monte Carlo methods. The aim of this lecture is to describe the fundamental statistics required to build the foundations of the Monte Carlo technique for radiation transport calculation. The lecture notes are organized as follows. Section 1 introduces basic Monte Carlo and its classification within the field. Section 2 describes random sampling methods, a key component of Monte Carlo radiation transport calculation. Section 3 presents the statistical uncertainty of Monte Carlo estimates. Section 4 briefly describes the importance of variance reduction techniques when sampling particles, such as photons or neutrons, in the process of radiation transport.
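The statistical uncertainty mentioned for Section 3 can be illustrated on a toy problem. The sketch below (the example and names are mine, not from the lecture notes) estimates π by rejection sampling: each sample is a Bernoulli trial, so the standard error of the estimate follows from the sample mean and shrinks as 1/√n, the characteristic convergence rate of Monte Carlo methods.

```python
import math
import random

def mc_pi(n, seed=0):
    """Monte Carlo estimate of pi with its one-sigma statistical uncertainty.

    A point uniform in the unit square lies inside the quarter circle with
    probability pi/4; the hit indicator is a Bernoulli variable, so the
    variance of the sample mean is p * (1 - p) / n.
    """
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    p = hits / n                 # sample mean of the hit indicator
    var = p * (1.0 - p) / n      # variance of that sample mean
    return 4.0 * p, 4.0 * math.sqrt(var)

est, sigma = mc_pi(100_000)
print(f"pi ~ {est:.3f} +/- {sigma:.3f}")
```

Quadrupling n halves the reported uncertainty, which is exactly why the variance reduction techniques of Section 4 matter: they shrink the constant in front of 1/√n rather than the rate itself.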
Deformation Models Tracking, Animation and Applications
Torres, Arnau; Gómez, Javier
2013-01-01
The computational modelling of deformations has been actively studied for the last thirty years, mainly owing to its large range of applications, which include computer animation, medical imaging, shape estimation, deformation of the face and other parts of the human body, and object tracking. In addition, these advances have been supported by the evolution of computer processing capabilities, enabling realism in a more sophisticated way. This book encompasses relevant work by expert researchers in the field of deformation models and their applications. The book is divided into two main parts. The first part presents recent object deformation techniques from the point of view of computer graphics and computer animation. The second part presents six works that study deformations from a computer vision point of view, with a common characteristic: the deformations are applied in real-world applications. The primary audience for this work are researchers from different multidisciplinary fields, s...
A Pruning Neural Network Model in Credit Classification Analysis
Directory of Open Access Journals (Sweden)
Yajiao Tang
2018-01-01
Full Text Available Nowadays, credit classification models are widely applied because they help financial decision-makers handle credit classification issues. Among them, artificial neural networks (ANNs) have been widely accepted as convincing methods in the credit industry. In this paper, we propose a pruning neural network (PNN) and apply it to the credit classification problem using the well-known Australian and Japanese credit datasets. The model is inspired by the synaptic nonlinearity of a dendritic tree in a biological neural model, and it is trained by an error back-propagation algorithm. The model realizes a neuronal pruning function by removing superfluous synapses and useless dendrites, forming a tidy dendritic morphology at the end of learning. Furthermore, we successfully utilize logic circuits (LCs) to simulate the dendritic structures, which allows PNN to be implemented effectively in hardware. The statistical results of our experiments verify that PNN obtains superior performance in comparison with other classical algorithms in terms of accuracy and computational efficiency.
Document representations for classification of short web-page descriptions
Directory of Open Access Journals (Sweden)
Radovanović Miloš
2008-01-01
Full Text Available Motivated by applying Text Categorization to the classification of Web search results, this paper describes an extensive experimental study of the impact of bag-of-words document representations on the performance of five major classifiers: Naïve Bayes, SVM, Voted Perceptron, kNN and C4.5. The texts, representing short Web-page descriptions sorted into a large hierarchy of topics, are taken from the dmoz Open Directory Web-page ontology, and classifiers are trained to automatically determine the topics which may be relevant to a previously unseen Web-page. Different transformations of the input data (stemming, normalization, logtf and idf), together with dimensionality reduction, are found to have a statistically significant improving or degrading effect on classification performance measured by classical metrics: accuracy, precision, recall, F1 and F2. The emphasis of the study is not on determining the best document representation for each classifier, but rather on describing the effects of every individual transformation on classification, together with their mutual relationships.
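The logtf and idf transformations studied above can be sketched on raw bag-of-words counts. This is a generic illustration under one common formulation (log(1 + tf) damping and a natural-log idf), not necessarily the exact variant used in the paper:

```python
import math

def logtf_idf(term_counts_per_doc):
    """Apply logtf and idf transformations to raw bag-of-words counts.

    logtf dampens raw term frequency with log(1 + tf); idf down-weights
    terms that occur in many documents via log(N / df).
    """
    n_docs = len(term_counts_per_doc)
    df = {}
    for counts in term_counts_per_doc:
        for term in counts:
            df[term] = df.get(term, 0) + 1
    return [
        {term: math.log(1 + tf) * math.log(n_docs / df[term])
         for term, tf in counts.items()}
        for counts in term_counts_per_doc
    ]

docs = [{"web": 3, "page": 1}, {"web": 1, "topic": 2}]
weights = logtf_idf(docs)
print(weights[0])  # "web" occurs in every document, so its idf is zero
```

The interplay the study measures is visible even here: a term present in all documents is zeroed out by idf regardless of its raw frequency, which can help or hurt depending on the classifier.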
Classification of ADHD children through multimodal Magnetic Resonance Imaging
Directory of Open Access Journals (Sweden)
Dai, Dai
2012-09-01
Full Text Available Attention deficit/hyperactivity disorder (ADHD) is one of the most common disorders in school-age children. To date, the diagnosis of ADHD is mainly subjective, and studies of objective diagnostic methods are of great importance. Although many efforts have been made recently to investigate the use of structural and functional brain images for diagnostic purposes, few of them are related to ADHD. In this paper, we introduce an automatic classification framework based on brain imaging features of ADHD patients, and present in detail the feature extraction, feature selection and classifier training methods. The effects of using different features are compared against each other. In addition, we integrate multimodal image features using multi-kernel learning (MKL). The performance of our framework has been validated in the ADHD-200 Global Competition, a worldwide classification contest on the ADHD-200 datasets. In this competition, our classification framework using features of resting-state functional connectivity ranked 6th out of 21 participants under the competition scoring policy, and performed best in terms of sensitivity and J-statistic.
Structured Literature Review of Electricity Consumption Classification Using Smart Meter Data
Directory of Open Access Journals (Sweden)
Alexander Martin Tureczek
2017-04-01
Full Text Available Smart meters for measuring electricity consumption are fast becoming prevalent in households. The meters measure consumption on a very fine scale, usually on a 15 min basis, and the data give unprecedented granularity of consumption patterns at the household level. A multitude of papers have emerged utilizing smart meter data to deepen our knowledge of consumption patterns. This paper applies a modification of Okoli’s method for conducting structured literature reviews to generate an overview of research in electricity customer classification using smart meter data. The process assessed 2099 papers before identifying 34 significant papers, and highlights three key points: prominent methods, datasets and applications. Three important findings are outlined. First, only a few papers contemplate future applications of the classification, rendering the papers relevant only in a classification setting. Second, the encountered classification methods do not consider correlation or time series analysis when classifying; the identified papers fail to thoroughly analyze the statistical properties of the data, investigations that could potentially improve classification performance. Third, the description of the data utilized is of varying quality, with only 50% acknowledging the impact of missing values on the final sample size. A data description score for assessing the quality of data description has been developed and applied to all papers reviewed.
Should Social Workers Use "Diagnostic and Statistical Manual of Mental Disorders-5?"
Frances, Allen; Jones, K. Dayle
2014-01-01
Up until now, social workers have depended on the "Diagnostic and Statistical Manual of Mental Disorders" ("DSM") as the primary diagnostic classification for mental disorders. However, the "DSM-5" revision includes scientifically unfounded, inadequately tested, and potentially dangerous diagnoses that may lead them…
Directory of Open Access Journals (Sweden)
Bethany Melville
2018-02-01
Full Text Available This paper presents a case study for the analysis of endangered lowland native grassland communities in the Tasmanian Midlands region using field spectroscopy and spectral convolution techniques. The aim of the study was to determine whether there was significant improvement in classification accuracy for lowland native grasslands and other vegetation communities based on hyperspectral resolution datasets over multispectral equivalents. A spectral dataset was collected using an ASD Handheld-2 spectroradiometer at Tunbridge Township Lagoon. The study then employed a k-fold cross-validation approach for repeated classification of a full hyperspectral dataset, a reduced hyperspectral dataset, and two convoluted multispectral datasets. Classification was performed on each of the four datasets a total of 30 times, based on two different class configurations. The classes analysed were Themeda triandra grassland, Danthonia/Poa grassland, Wilsonia rotundifolia/Selliera radicans, saltpan, and a simplified C3 vegetation class. The results of the classifications were then tested for statistically significant differences using ANOVA and Tukey’s post-hoc comparisons. The results of the study indicated that hyperspectral resolution provides small but statistically significant increases in classification accuracy for Themeda and Danthonia grasslands. For other classes, differences in classification accuracy for all datasets were not statistically significant. The results obtained here indicate that there is some potential for enhanced detection of major lowland native grassland community types using hyperspectral resolution datasets, and that future analysis should prioritise good performance in these classes over others. This study presents a method for identification of optimal spectral resolution across multiple datasets, and constitutes an important case study for lowland native grassland mapping in Tasmania.
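The repeated k-fold protocol described above can be sketched generically. This is an illustrative fold-generation routine under assumed names, not the study's actual pipeline: each fold serves once as the test set while the remaining folds train the classifier, and repeating the classification per dataset yields the accuracy distributions later compared with ANOVA and Tukey's post-hoc tests.

```python
def k_fold_indices(n_samples, k):
    """Partition sample indices into k folds and yield (train, test) splits.

    Samples are assigned to folds round-robin; each fold is used once as
    the held-out test set.
    """
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    for i, test in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        yield train, test

# Six spectra split into three folds: every sample is tested exactly once.
for train, test in k_fold_indices(6, 3):
    print("test:", test, "train:", train)
```

Running the classifier once per split, and repeating the whole procedure (30 times per dataset in the study), is what makes the per-dataset accuracy differences amenable to a significance test rather than a single-number comparison.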
International Nuclear Information System (INIS)
Saedtler, E.
1981-01-01
The dissertation discusses: 1. Approximative filter algorithms for identification of systems and hierarchical structures. 2. Adaptive statistical pattern recognition and classification. 3. Parameter selection, extraction, and modelling for an automatic control system. 4. Design of a decision tree and an adaptive diagnostic system. (orig./RW) [de
Android Malware Classification Using K-Means Clustering Algorithm
Hamid, Isredza Rahmi A.; Syafiqah Khalid, Nur; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Chai Wen, Chuah
2017-08-01
Malware is designed to gain access to or damage a computer system without the user's knowledge, and attackers exploit malware to commit crime or fraud. This paper proposes an Android malware classification approach based on the K-Means clustering algorithm. We evaluate the proposed model in terms of accuracy using machine learning algorithms. Two datasets, Virus Total and Malgenome, were selected to demonstrate the application of the K-Means clustering algorithm. We classify the Android malware into three clusters: ransomware, scareware and goodware. Nine features were considered for each type of dataset: Lock Detected, Text Detected, Text Score, Encryption Detected, Threat, Porn, Law, Copyright and Moneypak. We used IBM SPSS Statistics software for data classification and WEKA tools to evaluate the built clusters. The proposed K-Means clustering approach shows promising results, with high accuracy when tested using the Random Forest algorithm.
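The K-Means step above can be sketched in a few lines. This is a minimal generic implementation with made-up 2-D feature vectors, not the paper's SPSS pipeline or its nine-feature data: points are repeatedly assigned to the nearest centroid, and each centroid is moved to the mean of its assigned points.

```python
import random

def k_means(points, k, iters=20, seed=0):
    """Minimal K-Means: group feature vectors into k clusters.

    Alternates nearest-centroid assignment with centroid recomputation,
    in the spirit of the three-cluster (ransomware / scareware / goodware)
    grouping described above.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])),
            )
            clusters[nearest].append(p)
        centers = [
            tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters

# Two well-separated 2-D blobs stand in for malware feature vectors.
pts = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
centers, clusters = k_means(pts, 2)
print(sorted(len(c) for c in clusters))
```

Because K-Means is unsupervised, the resulting clusters carry no labels by themselves; that is why the study validates them afterwards with a supervised learner (Random Forest in WEKA).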