WorldWideScience

Sample records for sentence processing statistical

  1. Incremental Sentence Processing in Japanese: A Maze Investigation into Scrambled and Control Sentences

    Witzel, Jeffrey; Witzel, Naoko

    2016-01-01

    This study investigates preverbal structural and semantic processing in Japanese, a head-final language, using the maze task. Two sentence types were tested--simple scrambled sentences (Experiment 1) and control sentences (Experiment 2). Experiment 1 showed that even for simple, mono-clausal Japanese sentences, (1) there are online processing…

  2. Sentence processing and grammaticality in functional linguistics

    Poulsen, Mads

    The dissertation presents a functional linguistic model of grammaticality and investigates methods for applying this notion in empirical work. The use of the notion of grammaticality in generative grammar has been criticized by functionalists (Harder, 1996; Lakoff & Johnson, 1999), but attempts … finding from research on sentence processing that sentences are processed incrementally. Empirical methods for establishing grammaticality status are discussed and applied in relation to non-WH extraction phenomena in Danish. In Chapter 2, I discuss the use of the notions of grammaticality … It is concluded that the intuitions of linguists should in principle be considered hypotheses of grammaticality, and that such hypotheses need to be tested with independent data. Such data can for example take the form of corpus data or acceptability judgment experiments. It is furthermore argued…

  3. Sentence processing: linking language to motor chains

    Fabian Chersi

    2010-05-01

    Full Text Available A growing body of evidence in cognitive science and neuroscience is pointing towards the existence of a deep interconnection between cognition, perception and action. According to this embodied perspective, language understanding is based on a mental simulation process involving a sensory-motor matching system known as the mirror neuron system. However, the precise dynamics underlying the relation between language and action are not yet well understood. In fact, experimental studies are not always coherent, as some report that language processing interferes with action execution while others find facilitation. In this work we present a detailed neural network model capable of reproducing experimentally observed influences of the processing of action-related sentences on the execution of motor sequences. The proposed model is based on three main points. The first is that the processing of action-related sentences causes the resonance of motor and mirror neurons encoding the corresponding actions. The second is that there exists a varying degree of crosstalk between neuronal populations depending on whether they encode the same motor act, the same effector or the same action-goal. The third is the fact that neuronal populations’ internal dynamics, which result from the combination of multiple processes taking place at different time scales, can facilitate or interfere with successive activations of the same or of partially overlapping pools.

  4. Effects of surprisal and locality on Danish sentence processing

    Balling, Laura Winther; Kizach, Johannes

    2017-01-01

    An eye-tracking experiment in Danish investigates two dominant accounts of sentence processing: locality-based theories that predict a processing advantage for sentences where the distance between the major syntactic heads is minimized, and the surprisal theory which predicts that processing time...
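    For readers unfamiliar with the two accounts contrasted in this record, the formulas below give the standard definition of surprisal and one common way of quantifying locality (total dependency length). These are textbook formulations offered for orientation, not necessarily the exact metrics computed in this study.

```latex
% Surprisal of word w_i given the preceding words, in bits; surprisal theory
% predicts longer processing times for words with higher surprisal:
S(w_i) = -\log_2 P\left(w_i \mid w_1, \ldots, w_{i-1}\right)

% One common locality metric: total dependency length, the sum of linear
% distances between each syntactic head h and its dependent d; locality-based
% theories predict easier processing when this sum is small:
\mathrm{DL} = \sum_{(h,\,d)} \bigl|\, \mathrm{pos}(h) - \mathrm{pos}(d) \,\bigr|
```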

  5. The Effects of Syntactic Complexity on Processing Sentences in Noise

    Carroll, Rebecca; Ruigendijk, Esther

    2013-01-01

    This paper discusses the influence of stationary (non-fluctuating) noise on processing and understanding of sentences, which vary in their syntactic complexity (with the factors canonicity, embedding, ambiguity). It presents data from two RT-studies with 44 participants testing processing of German sentences in silence and in noise. Results show a…

  6. Probabilistic modeling of discourse-aware sentence processing.

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.
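    The record above does not reproduce its model equations, so the sketch below is only a toy stand-in for the general idea of a probabilistic processing model: a smoothed bigram language model whose per-word surprisal serves as a reading-time predictor. The corpus, smoothing constant, and test sentence are invented for illustration; the models described in the record additionally condition on syntactic structure and discourse co-reference.

```python
# Toy probabilistic processing model: per-word bigram surprisal as a
# reading-time predictor (illustrative stand-in, not the paper's models).
import math
from collections import Counter, defaultdict

corpus = [
    "the senator met the reporter".split(),
    "the reporter admired the senator".split(),
    "the senator who the reporter met left".split(),
]

unigrams, bigrams = Counter(), defaultdict(Counter)
for sent in corpus:
    tokens = ["<s>"] + sent
    unigrams.update(tokens)
    for prev, word in zip(tokens, tokens[1:]):
        bigrams[prev][word] += 1

vocab = set(unigrams)

def surprisal(prev: str, word: str, alpha: float = 0.5) -> float:
    """-log2 P(word | prev) with add-alpha smoothing, in bits."""
    p = (bigrams[prev][word] + alpha) / (unigrams[prev] + alpha * len(vocab))
    return -math.log2(p)

sentence = "the reporter who the senator met left".split()
for prev, word in zip(["<s>"] + sentence, sentence):
    print(f"{word:10s} surprisal = {surprisal(prev, word):.2f} bits")
```

    In models of this family, elevated surprisal at a word (for instance at a disambiguating or otherwise unexpected word) is the quantity that is mapped onto longer reading times.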

  7. A Computational Evaluation of Sentence Processing Deficits in Aphasia

    Patil, Umesh; Hanne, Sandra; Burchert, Frank; De Bleser, Ria; Vasishth, Shravan

    2016-01-01

    Individuals with agrammatic Broca's aphasia experience difficulty when processing reversible non-canonical sentences. Different accounts have been proposed to explain this phenomenon. The Trace Deletion account (Grodzinsky, 1995, 2000, 2006) attributes this deficit to an impairment in syntactic representations, whereas others (e.g., Caplan,…

  8. Effects of reading speed on second-language sentence processing

    Kaan, Edith; Ballantyne, Jocelyn C.; Wijnen, Frank

    2014-01-01

    To test the effects of reading speed on second-language (L2) sentence processing and the potential influence of conflicting native language word order, we compared advanced L2 learners of English with native English speakers on a self-paced reading task. L2 learners read faster overall than native speakers…

  9. Semantic Priming During Sentence Processing by Young and Older Adults.

    Burke, Deborah M.; Yee, Penny L.

    1984-01-01

    Compares the semantic processing skills of younger adults (mean age 25) and older adults (mean age 68). After reading a sentence, subjects performed a task in which responses did not depend on retention. Results provided no evidence for age-related changes, including those associated with access to implied information. (Author/RH)

  10. Cultural considerations in the criminal law: the sentencing process.

    Boehnlein, James K; Schaefer, Michele N; Bloom, Joseph D

    2005-01-01

    In forensic psychiatry, there is increasing recognition of the importance of culture and ethnicity in the criminal justice process as the population becomes more culturally diverse. However, there has been little consideration of the role of cultural factors in the trial process for criminal defendants, particularly in the sentencing phase of trial. Using a capital murder case study, this article explores the role of cultural forensic psychiatric consultation, focusing on the sentencing phase of trial as the place where the full scope and power of a cultural evaluation can be brought most effectively to the attention of the court. Cultural psychiatric perspectives can enrich a core forensic evaluation and be maximally helpful to the court, by exploring family dynamics and psychological health influenced by cultural history, immigrant and refugee experiences, and sociocultural environment. Specific recommendations and cautions for effective cultural consultation in forensic psychiatry are discussed.

  11. Impact of Background Noise and Sentence Complexity on Processing Demands during Sentence Comprehension

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2016-01-01

    Speech comprehension in adverse listening conditions can be effortful even when speech is fully intelligible. Acoustical distortions typically make speech comprehension more effortful, but effort also depends on linguistic aspects of the speech signal, such as its syntactic complexity. … In the present study, pupil dilations and subjective effort ratings were recorded in 20 normal-hearing participants while performing a sentence comprehension task. The sentences were either syntactically simple (subject-first sentence structure) or complex (object-first sentence structure) and were presented … and less by syntactic complexity. Conversely, pupil dilations increased with syntactic complexity but only showed a small effect of the noise level. Participants with higher WMC showed increased pupil responses in the higher-level noise condition but rated sentence comprehension as being less effortful…

  12. Distinct frontal regions for processing sentence syntax and story grammar.

    Sirigu, A; Cohen, L; Zalla, T; Pradat-Diehl, P; Van Eeckhout, P; Grafman, J; Agid, Y

    1998-12-01

    Time is a fundamental dimension of cognition. It is expressed in the sequential ordering of individual elements in a wide variety of activities such as language, motor control or in the broader domain of long range goal-directed actions. Several studies have shown the importance of the frontal lobes in sequencing information. The question addressed in this study is whether this brain region hosts a single supramodal sequence processor, or whether separate mechanisms are required for different kinds of temporally organised knowledge structures such as syntax and action knowledge. Here we show that so-called agrammatic patients, with lesions in Broca's area, ordered word groups correctly to form a logical sequence of actions but they were severely impaired when similar word groups had to be ordered as a syntactically well-formed sentence. The opposite performance was observed in patients with dorsolateral prefrontal lesions, that is, while their syntactic processing was intact at the sentence level, they demonstrated a pronounced deficit in producing temporally coherent sequences of actions. Anatomical reconstruction of lesions from brain scans revealed that the sentence and action grammar deficits involved distinct, non-overlapping sites within the frontal lobes. Finally, in a third group of patients whose lesions encompassed both Broca's area and the prefrontal cortex, the two types of deficits were found. We conclude that sequence processing is specific to knowledge domains and involves different networks within the frontal lobes.

  13. Localizing components of a complex task : sentence processing and working memory

    Stowe, L.A.; Broere, C.A.J.; Paans, A.MJ; Wijers, A.A.; Mulder, G.; Vaalburg, W.; Zwarts, Frans

    1998-01-01

    Three areas of the left hemisphere play different roles in sentence comprehension. An area of posterior middle and superior temporal gyrus shows activation correlated with the structural complexity of a sentence, suggesting that this area supports processing of sentence structure. The lateral…

  14. Reduced Syntactic Processing Efficiency in Older Adults During Sentence Comprehension

    Zude Zhu

    2018-03-01

    Full Text Available Researchers have frequently reported an age-related decline in semantic processing during sentence comprehension. However, it remains unclear whether syntactic processing also declines or whether it remains constant as people age. In the present study, 26 younger adults and 20 older adults were recruited and matched in terms of working memory, general intelligence, verbal intelligence and fluency. They were then asked to make semantic acceptability judgments while completing a Chinese sentence reading task. The behavioral results revealed that the older adults had significantly lower accuracy on measures of semantic and syntactic processing compared to younger adults. Event-related potential (ERP) results showed that during semantic processing, older adults had a significantly reduced amplitude and delayed peak latency of the N400 compared to the younger adults. During syntactic processing, older adults also showed delayed peak latency of the P600 relative to younger adults. Moreover, while P600 amplitude was comparable between the two age groups, larger P600 amplitude was associated with worse performance only in the older adults. Together, the behavioral and ERP data suggest that there is an age-related decline in both semantic and syntactic processing, with a trend toward lower efficiency in syntactic ability.

  15. Elaboration over a discourse facilitates retrieval in sentence processing

    Melissa eTroyer

    2016-03-01

    Full Text Available Language comprehension requires access to stored knowledge and the ability to combine knowledge in new, meaningful ways. Previous work has shown that processing linguistically more complex expressions (‘Texas cattle rancher’ vs. ‘rancher’) leads to slow-downs in reading during initial processing, possibly reflecting effort in combining information. Conversely, when this information must subsequently be retrieved (as in filler-gap constructions), processing is facilitated for more complex expressions, possibly because more semantic cues are available during retrieval. To follow up on this hypothesis, we tested whether information distributed across a short discourse can similarly provide effective cues for retrieval. Participants read texts introducing two referents (e.g., two senators), one of whom was described in greater detail than the other (e.g., ‘The Democrat had voted for one of the senators, and the Republican had voted for the other, a man from Ohio who was running for president’). The final sentence (e.g., ‘The senator who the {Republican / Democrat} had voted for…’) contained a relative clause picking out either the Many-Cue referent (with ‘Republican’) or the One-Cue referent (with ‘Democrat’). We predicted facilitated retrieval (faster reading times) for the Many-Cue condition at the verb region (‘had voted for’), where readers could understand that ‘The senator’ is the object of the verb. As predicted, this pattern was observed at the retrieval region and continued throughout the rest of the sentence. Participants also completed the Author/Magazine Recognition Tests (ART/MRT; Stanovich & West, 1989), providing a proxy for world knowledge. Since higher ART/MRT scores may index (a) greater experience accessing relevant knowledge and/or (b) richer/more highly-structured representations in semantic memory, we predicted it would be positively associated with effects of elaboration on retrieval. We did not observe…

  16. Impact of Background Noise and Sentence Complexity on Processing Demands during Sentence Comprehension.

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2016-01-01

    Speech comprehension in adverse listening conditions can be effortful even when speech is fully intelligible. Acoustical distortions typically make speech comprehension more effortful, but effort also depends on linguistic aspects of the speech signal, such as its syntactic complexity. In the present study, pupil dilations and subjective effort ratings were recorded in 20 normal-hearing participants while performing a sentence comprehension task. The sentences were either syntactically simple (subject-first sentence structure) or complex (object-first sentence structure) and were presented in two levels of background noise, both corresponding to high intelligibility. A digit span and a reading span test were used to assess individual differences in the participants' working memory capacity (WMC). The results showed that the subjectively rated effort was mostly affected by the noise level and less by syntactic complexity. Conversely, pupil dilations increased with syntactic complexity but only showed a small effect of the noise level. Participants with higher WMC showed increased pupil responses in the higher-level noise condition but rated sentence comprehension as being less effortful compared to participants with lower WMC. Overall, the results demonstrate that pupil dilations and subjectively rated effort represent different aspects of effort. Furthermore, the results indicate that effort can vary in situations with high speech intelligibility.

  17. Impact of background noise and sentence complexity on processing demands during sentence comprehension

    Dorothea eWendt

    2016-03-01

    Full Text Available Speech comprehension in adverse listening conditions can be effortful even when speech is fully intelligible. Acoustical distortions typically make speech comprehension more effortful, but effort also depends on linguistic aspects of the speech signal, such as its syntactic complexity. In the present study, pupil dilations and subjective effort ratings were recorded in 20 normal-hearing participants while performing a sentence comprehension task. The sentences were either syntactically simple (subject-first sentence structure) or complex (object-first sentence structure) and were presented in two levels of background noise, both corresponding to high intelligibility. A digit span and a reading span test were used to assess individual differences in the participants' working memory capacity. The results showed that the subjectively rated effort was mostly affected by the noise level and less by syntactic complexity. Conversely, pupil dilations increased with syntactic complexity but only showed a small effect of the noise level. Participants with higher working memory capacity showed increased pupil responses in the higher-level noise condition but rated sentence comprehension as being less effortful compared to participants with lower working memory capacity. Overall, the results demonstrate that pupil dilations and subjectively rated effort represent different aspects of effort. Furthermore, the results indicate that effort can vary in situations with high speech intelligibility.

  18. Developmental differences in beta and theta power during sentence processing

    Julie M. Schneider

    2016-06-01

    Full Text Available Although very young children process ongoing language quickly and effortlessly, research indicates that they continue to improve and mature in their language skills through adolescence. This prolonged development may be related to differing engagement of semantic and syntactic processes. This study used event-related potentials and time-frequency analysis of EEG to identify developmental differences in neural engagement as children (ages 10–12) and adults performed an auditory verb agreement grammaticality judgment task. Adults and children revealed very few differences in comprehending grammatically correct sentences. When identifying grammatical errors, however, adults displayed widely distributed beta and theta power decreases that were significantly less pronounced in children. Adults also demonstrated a significant P600 effect, while children exhibited an apparent N400 effect. Thus, when identifying subtle grammatical errors in real time, adults display greater neural activation that is traditionally associated with syntactic processing whereas children exhibit greater activity more commonly associated with semantic processing. These findings support previous claims that the cognitive and neural underpinnings of syntactic processing are still developing in adolescence, and add to them by more clearly identifying developmental changes in the neural oscillations underlying grammatical processing.

  19. Constructive processes in skilled and less skilled comprehenders' memory for sentences.

    Oakhill, J

    1982-02-01

    An experiment was carried out to investigate seven- to eight-year-old children's memory for aurally presented sentences. A recognition-memory task was used to probe constructive memory processes in two groups differentiated by their ability at comprehending printed text. The recognition errors of both groups indicated that they constructed meanings implied by the original input sentences, whilst demonstrating poor memory for the syntactic form of the sentences. The tendency to construct meanings implied by the original input sentences was greater in children who scored higher on tests of reading comprehension. These results indicate that constructive memory processes are related to comprehension ability in young readers.

  20. [Short-term sentence memory in children with auditory processing disorders].

    Kiese-Himmel, C

    2010-05-01

    To compare the sentence repetition performance of different groups of children with Auditory Processing Disorders (APD) and to examine the relationship of age and of nonverbal intelligence with sentence recall. Nonverbal intelligence was measured with the COLOURED MATRICES; in addition, the children completed a standardized test of SENTENCE REPETITION (SR), which requires them to repeat spoken sentences (subtest of the HEIDELBERGER SPRACHENTWICKLUNGSTEST). Three clinical groups (n=49 with monosymptomatic APD; n=29 with APD+developmental language impairment; n=14 with APD+developmental dyslexia) and two control groups (n=13 typically developing peers without any clinical developmental disorder; n=10 children with slightly reduced nonverbal intelligence) were included. The analysis showed a significant group effect (p=0.0007). The best performance was achieved by the normal controls (T-score 52.9; SD 6.4; Min 42; Max 59), followed by children with monosymptomatic APD (43.2; SD 9.2), children with the co-morbid conditions APD+developmental dyslexia (43.1; SD 10.3), and APD+developmental language impairment (39.4; SD 9.4). The clinical control group showed the lowest performance on average (38.6; SD 9.6). Accordingly, language-impaired children and children with slightly reduced intelligence were poorly able to use their grammatical knowledge for SR. A statistically significant improvement in SR with increasing age was verified, with the exception of children belonging to the small group with lowered intelligence; this group comprised the oldest children. Nonverbal intelligence correlated positively with SR only in children with below-average-range intelligence (0.62; p=0.054). The absence of APD and SLI, as well as the presence of normal intelligence, facilitated the use of phonological information for SR.

  1. Children's Verbal Working Memory: Role of Processing Complexity in Predicting Spoken Sentence Comprehension

    Magimairaj, Beula M.; Montgomery, James W.

    2012-01-01

    Purpose: This study investigated the role of processing complexity of verbal working memory tasks in predicting spoken sentence comprehension in typically developing children. Of interest was whether simple and more complex working memory tasks have similar or different power in predicting sentence comprehension. Method: Sixty-five children (6- to…

  2. Processing Advantages of Lexical Bundles: Evidence from Self-Paced Reading and Sentence Recall Tasks

    Tremblay, Antoine; Derwing, Bruce; Libben, Gary; Westbury, Chris

    2011-01-01

    This article examines the extent to which lexical bundles (LBs; i.e., frequently recurring strings of words that often span traditional syntactic boundaries) are stored and processed holistically. Three self-paced reading experiments compared sentences containing LBs (e.g., "in the middle of the") and matched control sentence fragments (e.g., "in…

  3. Verb Agreements during On-Line Sentence Processing in Alzheimer's Disease and Frontotemporal Dementia

    Price, C.C.; Grossman, M.

    2005-01-01

    An on-line "word detection" paradigm was used to assess the comprehension of thematic and transitive verb agreements during sentence processing in individuals diagnosed with probable Alzheimer's Disease (AD, n=15) and Frontotemporal Dementia (FTD, n=14). AD, FTD, and control participants (n=17) were asked to listen for a word in a sentence.…

  4. Early referential context effects in sentence processing: Evidence from event-related brain potentials

    Berkum, J.J.A. van; Brown, C.M.; Hagoort, P.

    1999-01-01

    An event-related brain potentials experiment was carried out to examine the interplay of referential and structural factors during sentence processing in discourse. Subjects read (Dutch) sentences beginning like “David told the girl that … ” in short story contexts that had introduced either one or two referents…

  5. Processing Mechanisms in Hearing-Impaired Listeners: Evidence from Reaction Times and Sentence Interpretation.

    Carroll, Rebecca; Uslar, Verena; Brand, Thomas; Ruigendijk, Esther

    The authors aimed to determine whether hearing impairment affects sentence comprehension beyond phoneme or word recognition (i.e., on the sentence level), and to distinguish grammatically induced processing difficulties in structurally complex sentences from perceptual difficulties associated with listening to degraded speech. Effects of hearing impairment or speech in noise were expected to reflect hearer-specific speech recognition difficulties. Any additional processing time caused by the sustained perceptual challenges across the sentence may either be independent of or interact with top-down processing mechanisms associated with grammatical sentence structure. Forty-nine participants listened to canonical subject-initial or noncanonical object-initial sentences that were presented either in quiet or in noise. Twenty-four participants had mild-to-moderate hearing impairment and received hearing-loss-specific amplification. Twenty-five participants were age-matched peers with normal hearing status. Reaction times were measured on-line at syntactically critical processing points as well as two control points to capture differences in processing mechanisms. An off-line comprehension task served as an additional indicator of sentence (mis)interpretation, and enforced syntactic processing. The authors found general effects of hearing impairment and speech in noise that negatively affected perceptual processing, and an effect of word order, where complex grammar locally caused processing difficulties for the noncanonical sentence structure. Listeners with hearing impairment were hardly affected by noise at the beginning of the sentence, but were affected markedly toward the end of the sentence, indicating a sustained perceptual effect of speech recognition. Comprehension of sentences with noncanonical word order was negatively affected by degraded signals even after sentence presentation. Hearing impairment adds perceptual processing load during sentence processing

  6. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Kizach, Johannes; Nyvad, Anne Mette; Christensen, Ken Ramshøj

    2013-01-01

    Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  7. Structure before meaning: sentence processing, plausibility, and subcategorization.

    Johannes Kizach

    Full Text Available Natural language processing is a fast and automatized process. A crucial part of this process is parsing, the online incremental construction of a syntactic structure. The aim of this study was to test whether a wh-filler extracted from an embedded clause is initially attached as the object of the matrix verb with subsequent reanalysis, and if so, whether the plausibility of such an attachment has an effect on reaction time. Finally, we wanted to examine whether subcategorization plays a role. We used a method called G-Maze to measure response time in a self-paced reading design. The experiments confirmed that there is early attachment of fillers to the matrix verb. When this attachment is implausible, the off-line acceptability of the whole sentence is significantly reduced. The on-line results showed that G-Maze was highly suited for this type of experiment. In accordance with our predictions, the results suggest that the parser ignores (or has no access to information about) implausibility and attaches fillers as soon as possible to the matrix verb. However, the results also show that the parser uses the subcategorization frame of the matrix verb. In short, the parser ignores semantic information and allows implausible attachments but adheres to information about which type of object a verb can take, ensuring that the parser does not make impossible attachments. We argue that the evidence supports a syntactic parser informed by syntactic cues, rather than one guided by semantic cues or one that is blind, or completely autonomous.

  8. How Hearing Impairment Affects Sentence Comprehension: Using Eye Fixations to Investigate the Duration of Speech Processing

    Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas

    2015-01-01

    … this measure uses eye fixations recorded while the participant listens to a sentence. Eye fixations toward a target picture (which matches the aurally presented sentence) were measured in the presence of a competitor picture. Based on the recorded eye fixations, the single target detection amplitude, which … reflects the tendency of the participant to fixate the target picture, was used as a metric to estimate the duration of sentence processing. The single target detection amplitude was calculated for sentence structures with different levels of linguistic complexity and for different listening conditions …: in quiet and in two different noise conditions. Participants with hearing impairment spent more time processing sentences, even at high levels of speech intelligibility. In addition, the relationship between the proposed online measure and listener-specific factors, such as hearing aid use and cognitive…

  9. Priming sentence planning

    Konopka, A.E.; Meyer, A.S.

    2014-01-01

    Sentence production requires mapping preverbal messages onto linguistic structures. Because sentences are normally built incrementally, the information encoded in a sentence-initial increment is critical for explaining how the mapping process starts and for predicting its timecourse. Two experiments

  10. Task Difficulty Differentially Affects Two Measures of Processing Load: The Pupil Response during Sentence Processing and Delayed Cued Recall of the Sentences

    Zekveld, Adriana A.; Festen, Joost M.; Kramer, Sophia E.

    2013-01-01

    Purpose: In this study, the authors assessed the influence of masking level (29% or 71% sentence perception) and test modality on the processing load during language perception as reflected by the pupil response. In addition, the authors administered a delayed cued stimulus recall test to examine whether processing load affected the encoding of…

  11. Impact of background noise and sentence complexity on cognitive processing demands

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions places demands on cognitive processing. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results showed…

  12. Sentence processing in an artificial language: Learning and using combinatorial constraints.

    Amato, Michael S; MacDonald, Maryellen C

    2010-07-01

    A study combining artificial grammar and sentence comprehension methods investigated the learning and online use of probabilistic, nonadjacent combinatorial constraints. Participants learned a small artificial language describing cartoon monsters acting on objects. Self-paced reading of sentences in the artificial language revealed comprehenders' sensitivity to nonadjacent combinatorial constraints, without explicit awareness of the probabilities embedded in the language. These results show that even newly-learned constraints have an identifiable effect on online sentence processing. The rapidity of learning in this paradigm relative to others has implications for theories of implicit learning and its role in language acquisition. 2010 Elsevier B.V. All rights reserved.

  13. Multivariate Statistical Process Control

    Kulahci, Murat

    2013-01-01

    As sensor and computer technology continues to improve, it has become a normal occurrence to be confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the “out-of-control” state of a process using control charts, in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with an excessive amount of cross-correlation, practitioners are often recommended to use latent structure methods such as Principal Component Analysis to summarize the data in only a few linear combinations of the original variables that capture most of the variation in the data. Applications of these control charts…
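    As a concrete illustration of the monitoring statistic mentioned above, the sketch below computes Hotelling's T2 for new observations against an in-control reference sample and flags points exceeding an approximate control limit. It is a minimal example of the general technique under stated assumptions, not the specific procedure evaluated in this record; the data, the chi-square limit, and the 99% level are illustrative choices.

```python
# Minimal sketch of a Hotelling's T^2 control chart (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Phase I: in-control reference data (500 samples, 5 correlated variables).
reference = rng.multivariate_normal(mean=np.zeros(5),
                                    cov=0.5 * np.eye(5) + 0.5, size=500)
mean_vec = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def hotelling_t2(x):
    """T^2 distance of a new observation from the in-control mean."""
    d = x - mean_vec
    return float(d @ cov_inv @ d)

# Approximate upper control limit via the chi-square distribution
# (assumes a large reference sample, parameters treated as known).
ucl = stats.chi2.ppf(0.99, df=reference.shape[1])

# Phase II: monitor new observations and flag out-of-control points.
new_obs = rng.multivariate_normal(mean=np.array([0, 0, 0, 2.5, 0]),
                                  cov=0.5 * np.eye(5) + 0.5, size=3)
for i, x in enumerate(new_obs):
    t2 = hotelling_t2(x)
    print(f"sample {i}: T2 = {t2:.2f}", "OUT OF CONTROL" if t2 > ucl else "ok")
```

    For strongly cross-correlated, high-dimensional data one would typically first project onto a few principal components and monitor T2 in that reduced space, which is the latent-structure approach the abstract refers to.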

  14. EVALUATION OF SEMANTIC SIMILARITY FOR SENTENCES IN NATURAL LANGUAGE BY MATHEMATICAL STATISTICS METHODS

    A. E. Pismak

    2016-03-01

    Full Text Available Subject of Research. The paper is focused on the structural organization of Wiktionary articles with regard to their use as the basis for a semantic network. Wiktionary community references, article templates and article markup features are analyzed. The problem of numerical estimation of semantic similarity for structural elements of Wiktionary articles is considered. An analysis of existing software for estimating the semantic similarity of such elements is carried out; their algorithms are studied, and their advantages and disadvantages are shown. Methods. Mathematical statistics methods were used to analyze Wiktionary article markup features. A method for computing semantic similarity based on statistics for the compared structural elements is proposed. Main Results. We conclude that Wiktionary articles cannot be used directly as the source for a semantic network. We propose to find hidden similarity between article elements and, for that purpose, have developed an algorithm that calculates confidence coefficients indicating that a pair of sentences is semantically close. A study of the quantitative and qualitative characteristics of the developed algorithm shows a major performance advantage over existing solutions, at the cost of an insignificantly higher error rate. Practical Relevance. The resulting algorithm may be useful in developing tools for automatic parsing of Wiktionary articles. The developed method could be used to compute semantic similarity for short natural-language text fragments in cases where performance requirements are stricter than accuracy requirements.
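    The record above does not spell out its confidence-coefficient algorithm, so the sketch below shows only a generic statistical baseline for the same task: scoring a sentence pair by the cosine similarity of term-frequency vectors. The function name and the 0.5 threshold are illustrative assumptions, not the authors' method.

```python
# Generic statistical baseline for sentence-pair similarity (not the authors' algorithm).
import math
from collections import Counter

def cosine_similarity(sentence_a: str, sentence_b: str) -> float:
    """Cosine similarity between term-frequency vectors of two sentences."""
    a, b = Counter(sentence_a.lower().split()), Counter(sentence_b.lower().split())
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

pair = ("the cat sat on the mat", "a cat is sitting on the mat")
score = cosine_similarity(*pair)
print(f"similarity = {score:.2f}", "semantically close" if score > 0.5 else "distant")
```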

  15. Sentence processing and verbal working memory in a white-matter-disconnection patient.

    Meyer, Lars; Cunitz, Katrin; Obleser, Jonas; Friederici, Angela D

    2014-08-01

    The Arcuate Fasciculus/Superior Longitudinal Fasciculus (AF/SLF) is the white-matter bundle that connects posterior superior temporal and inferior frontal cortex. Its causal functional role in sentence processing and verbal working memory is currently under debate. While impairments of sentence processing and verbal working memory often co-occur in patients suffering from AF/SLF damage, it is unclear whether these impairments result from shared white-matter damage to the verbal-working-memory network. The present study sought to specify the behavioral consequences of focal AF/SLF damage for sentence processing and verbal working memory, which were assessed in a single patient suffering from a cleft-like lesion spanning the deep left superior temporal gyrus, sparing most surrounding gray matter. While tractography suggests that the ventral fronto-temporal white-matter bundle is intact in this patient, the AF/SLF was not visible to tractography. In line with the hypothesis that the AF/SLF is causally involved in sentence processing, the patient's performance was selectively impaired on sentences that jointly involve both complex word orders and long word-storage intervals. However, the patient was unimpaired on sentences that only involved long word-storage intervals without involving complex word orders. On the contrary, the patient performed generally worse than a control group across standard verbal-working-memory tests. We conclude that the AF/SLF not only plays a causal role in sentence processing, linking regions of the left dorsal inferior frontal gyrus to the temporo-parietal region, but moreover plays a crucial role in verbal working memory, linking regions of the left ventral inferior frontal gyrus to the left temporo-parietal region. Together, the specific sentence-processing impairment and the more general verbal-working-memory impairment may imply that the AF/SLF subserves both sentence processing and verbal working memory, possibly pointing to the AF…

  16. Analyzing processing effort during sentence comprehension in quiet and in noise: Evidence from eye-fixations and pupil size

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    … structures. Here, we compare both methods, i.e. processing speed and pupil size, as indicators of the required effort when processing sentences that differ in their level of syntactic complexity. Furthermore, an interaction of background noise and syntactic complexity is examined by analyzing processing effort for sentences presented in quiet and in noise. Moreover, it is investigated whether both measures provide similar or complementary information about sentence processing and the required effort.

  17. Lexical ambiguity resolution during sentence processing in Parkinson's disease: An event-related potential study.

    Anthony J Angwin

    Full Text Available Event-related potentials (ERPs) were recorded to investigate lexical ambiguity resolution during sentence processing in 16 people with Parkinson's disease (PD) and 16 healthy controls. Sentences were presented word-by-word on a computer screen, and participants were required to decide if a subsequent target word was related to the meaning of the sentence. The task consisted of related, unrelated and ambiguous trials. For the ambiguous trials, the sentence ended with an ambiguous word and the target was related to one of the meanings of that word, but not the one captured by the sentence context (e.g., 'He dug with the spade', Target 'ACE'). Both groups demonstrated slower reaction times and lower accuracy for the ambiguous condition relative to the unrelated condition; however, accuracy was impacted by the ambiguous condition to a larger extent in the PD group. These results suggested that PD patients experience increased difficulties with contextual ambiguity resolution. The ERP results did not reflect increased ambiguity resolution difficulties in PD, as a similar N400 effect was evident for the unrelated and ambiguous condition in both groups. However, the magnitude of the N400 for these conditions was correlated with a measure of inhibition in the PD group, but not the control group. The ERP results suggest that semantic processing may be more compromised in PD patients with increased response inhibition deficits.

  18. Impact of background noise and sentence complexity on cognitive processing effort

    Wendt, Dorothea; Dau, Torsten; Hjortkjær, Jens

    2015-01-01

    Speech comprehension in adverse listening conditions places demands on cognitive processing. Processing demands can increase with acoustically degraded speech but also depend on linguistic aspects of the speech signal, such as syntactic complexity. In the present study, pupil dilations were recorded in 19 normal-hearing participants while processing sentences that were either syntactically simple or complex and presented in either high- or low-level background noise. Furthermore, the participants were asked to rate the subjectively perceived difficulty of sentence comprehension. The results…

  19. Experience and Sentence Processing: Statistical Learning and Relative Clause Comprehension

    Wells, Justine B.; Christiansen, Morten H.; Race, David S.; Acheson, Daniel J.; MacDonald, Maryellen C.

    2009-01-01

    Many explanations of the difficulties associated with interpreting object relative clauses appeal to the demands that object relatives make on working memory. MacDonald and Christiansen [MacDonald, M. C., & Christiansen, M. H. (2002). "Reassessing working memory: Comment on Just and Carpenter (1992) and Waters and Caplan (1996)." "Psychological…

  20. Understanding bit by bit: information theory and the role of inflections in sentence processing

    Manika, S.

    2014-01-01

    What makes a sentence hard to process? Apart from the meanings of the words it contains, their number, and the way these words combine into constituents, words also contribute to processing difficulty on the basis of their accessibility in lexical retrieval. Apart from their frequency of use or
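    Since the record above is truncated before its method is described, the snippet below only illustrates the information-theoretic quantities it alludes to: the entropy of a hypothetical inflectional paradigm and the information, in bits, conveyed by observing one particular inflected form. The frequency counts are invented for illustration and are not taken from this dissertation.

```python
# Illustrative only: entropy of a hypothetical inflectional paradigm and the
# information (in bits) carried by each inflected form.
import math

# Hypothetical corpus counts for the inflected forms of a single verb lemma.
paradigm_counts = {"walk": 60, "walks": 25, "walked": 10, "walking": 5}

total = sum(paradigm_counts.values())
probs = {form: count / total for form, count in paradigm_counts.items()}

# Entropy of the paradigm: average uncertainty about which form will occur.
entropy = -sum(p * math.log2(p) for p in probs.values())

# Surprisal of each form: how much information it conveys when encountered.
for form, p in probs.items():
    print(f"{form:8s} p = {p:.2f}  surprisal = {-math.log2(p):.2f} bits")

print(f"paradigm entropy = {entropy:.2f} bits")
```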

  1. Children's and adults' on-line processing of syntactically ambiguous sentences during reading.

    Holly S S L Joseph

    Full Text Available While there has been a fair amount of research investigating children's syntactic processing during spoken language comprehension, and a wealth of research examining adults' syntactic processing during reading, as yet very little research has focused on syntactic processing during text reading in children. In two experiments, children and adults read sentences containing a temporary syntactic ambiguity while their eye movements were monitored. In Experiment 1, participants read sentences such as, 'The boy poked the elephant with the long stick/trunk from outside the cage' in which the attachment of a prepositional phrase was manipulated. In Experiment 2, participants read sentences such as, 'I think I'll wear the new skirt I bought tomorrow/yesterday. It's really nice' in which the attachment of an adverbial phrase was manipulated. Results showed that adults and children exhibited similar processing preferences, but that children were delayed relative to adults in their detection of initial syntactic misanalysis. It is concluded that children and adults have the same sentence-parsing mechanism in place, but that it operates with a slightly different time course. In addition, the data support the hypothesis that the visual processing system develops at a different rate than the linguistic processing system in children.

  2. Prediction in a visual language: real-time sentence processing in American Sign Language across development.

    Lieberman, Amy M; Borovsky, Arielle; Mayberry, Rachel I

    2018-01-01

    Prediction during sign language comprehension may enable signers to integrate linguistic and non-linguistic information within the visual modality. In two eyetracking experiments, we investigated American Sign Language (ASL) semantic prediction in deaf adults and children (aged 4-8 years). Participants viewed ASL sentences in a visual world paradigm in which the sentence-initial verb was either neutral or constrained relative to the sentence-final target noun. Adults and children made anticipatory looks to the target picture before the onset of the target noun in the constrained condition only, showing evidence for semantic prediction. Crucially, signers alternated gaze between the stimulus sign and the target picture only when the sentential object could be predicted from the verb. Signers therefore engage in prediction by optimizing visual attention between divided linguistic and referential signals. These patterns suggest that prediction is a modality-independent process, and theoretical implications are discussed.

  3. Experimental Designs in Sentence Processing Research: A Methodological Review and User's Guide

    Keating, Gregory D.; Jegerski, Jill

    2015-01-01

    Since the publication of Clahsen and Felser's (2006) keynote article on grammatical processing in language learners, the online study of sentence comprehension in adult second language (L2) learners has quickly grown into a vibrant and prolific subfield of SLA. As online methods begin to establish a foothold in SLA research, it is important…

  4. On-Line Sentence Processing in Swedish: Cross-Linguistic Developmental Comparisons with French

    Kail, Michele; Kihlstedt, Maria; Bonnet, Philippe

    2012-01-01

    This study examined on-line processing of Swedish sentences in a grammaticality-judgement experiment within the framework of the Competition Model. Three age groups from 6 to 11 and an adult group were asked to detect grammatical violations as quickly as possible. Three factors concerning cue cost were studied: violation position (early vs. late),…

  5. The Influence of Emotional Words on Sentence Processing: Electrophysiological and Behavioral Evidence

    Martin-Loeches, Manuel; Fernandez, Anabel; Schacht, Annekathrin; Sommer, Werner; Casado, Pilar; Jimenez-Ortega, Laura; Fondevila, Sabela

    2012-01-01

    Whereas most previous studies on emotion in language have focussed on single words, we investigated the influence of the emotional valence of a word on the syntactic and semantic processes unfolding during sentence comprehension, by means of event-related brain potentials (ERP). Experiment 1 assessed how positive, negative, and neutral adjectives…

  6. Automatic and Controlled Processing in Sentence Recall: The Role of Long-Term and Working Memory

    Jefferies, E.; Lambon Ralph, M.A.; Baddeley, A.D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley,…

  7. Respecting Relations: Memory Access and Antecedent Retrieval in Incremental Sentence Processing

    Kush, Dave W.

    2013-01-01

    This dissertation uses the processing of anaphoric relations to probe how linguistic information is encoded in and retrieved from memory during real-time sentence comprehension. More specifically, the dissertation attempts to resolve a tension between the demands of a linguistic processor implemented in a general-purpose cognitive architecture and…

  8. Electrophysiology of prosodic and lexical-semantic processing during sentence comprehension in aphasia.

    Sheppard, Shannon M; Love, Tracy; Midgley, Katherine J; Holcomb, Phillip J; Shapiro, Lewis P

    2017-12-01

    Event-related potentials (ERPs) were used to examine how individuals with aphasia and a group of age-matched controls use prosody and thematic fit information in sentences containing temporary syntactic ambiguities. Two groups of individuals with aphasia were investigated: those demonstrating relatively good sentence comprehension whose primary language difficulty is anomia (Individuals with Anomic Aphasia (IWAA)), and those who demonstrate impaired sentence comprehension whose primary diagnosis is Broca's aphasia (Individuals with Broca's Aphasia (IWBA)). The stimuli had early closure syntactic structure and contained a temporary early closure (correct)/late closure (incorrect) syntactic ambiguity. The prosody was manipulated to either be congruent or incongruent, and the temporarily ambiguous NP was also manipulated to either be a plausible or an implausible continuation for the subordinate verb (e.g., "While the band played the song/the beer pleased all the customers."). It was hypothesized that an implausible NP in sentences with incongruent prosody may provide the parser with a plausibility cue that could be used to predict syntactic structure. The results revealed that incongruent prosody paired with a plausibility cue resulted in an N400-P600 complex at the implausible NP (the beer) in both the controls and the IWAAs, yet incongruent prosody without a plausibility cue resulted in an N400-P600 at the critical verb (pleased) only in healthy controls. IWBAs did not show evidence of N400 or P600 effects at the ambiguous NP or critical verb, although they did show evidence of a delayed N400 effect at the sentence-final word in sentences with incongruent prosody. These results suggest that IWAAs have difficulty integrating prosodic cues with underlying syntactic structure when lexical-semantic information is not available to aid their parse. IWBAs have difficulty integrating both prosodic and lexical-semantic cues with syntactic structure, likely due to a…

  9. Processing Rhythmic Pattern during Chinese Sentence Reading: An Eye Movement Study.

    Luo, Yingyi; Duan, Yunyan; Zhou, Xiaolin

    2015-01-01

    Prosodic constraints play a fundamental role during both spoken sentence comprehension and silent reading. In Chinese, the rhythmic pattern of the verb-object (V-O) combination has been found to rapidly affect the semantic access/integration process during sentence reading (Luo and Zhou, 2010). Rhythmic pattern refers to the combination of words with different syllabic lengths, with certain combinations disallowed (e.g., [2 + 1]; numbers standing for the number of syllables of the verb and the noun respectively) and certain combinations preferred (e.g., [1 + 1] or [2 + 2]). This constraint extends to the situation in which the combination is used to modify other words. A V-O phrase could modify a noun by simply preceding it, forming a V-O-N compound; when the verb is disyllabic, however, the word order has to be O-V-N and the object is preferred to be disyllabic. In this study, we investigated how the reader processes the rhythmic pattern and word order information by recording the reader's eye-movements. We created four types of sentences by crossing rhythmic pattern and word order in compounding. The compound, embedding a disyllabic verb, could be in the correct O-V-N or the incorrect V-O-N order; the object could be disyllabic or monosyllabic. We found that the reader spent more time and made more regressions on and after the compounds when either type of anomaly was detected during the first pass reading. However, during re-reading (after all the words in the sentence have been viewed), less regressive eye movements were found for the anomalous rhythmic pattern, relative to the correct pattern; moreover, only the abnormal rhythmic pattern, not the violated word order, influenced the regressive eye movements. These results suggest that while the processing of rhythmic pattern and word order information occurs rapidly during the initial reading of the sentence, the process of recovering from the rhythmic pattern anomaly may ease the reanalysis processing at the

  10. Processing rhythmic pattern during Chinese sentence reading: An eye movement study

    Yingyi eLuo

    2015-12-01

    Full Text Available Prosodic constraints play a fundamental role during both spoken sentence comprehension and silent reading. In Chinese, the rhythmic pattern of the verb-object (V-O) combination has been found to rapidly affect the semantic access/integration process during sentence reading (Luo and Zhou, 2010). Rhythmic pattern refers to the combination of words with different syllabic lengths, with certain combinations disallowed (e.g., [2+1]; numbers standing for the number of syllables of the verb and the noun respectively) and certain combinations preferred (e.g., [1+1] or [2+2]). This constraint extends to the situation in which the combination is used to modify other words. A V-O phrase could modify a noun by simply preceding it, forming a V-O-N compound; when the verb is disyllabic, however, the word order has to be O-V-N and the object is preferred to be disyllabic. In this study, we investigated how the reader processes the rhythmic pattern and word order information by recording the reader’s eye-movements. We created four types of sentences by crossing rhythmic pattern and word order in compounding. The compound, embedding a disyllabic verb, could be in the correct O-V-N or the incorrect V-O-N order; the object could be disyllabic or monosyllabic. We found that the reader spent more time and made more regressions on and after the compounds when either type of anomaly was detected during the first pass reading. However, during re-reading (after all the words in the sentence have been viewed), less regressive eye movements were found for the anomalous rhythmic pattern, relative to the correct pattern; moreover, only the abnormal rhythmic pattern, not the violated word order, influenced the regressive eye movements. These results suggest that while the processing of rhythmic pattern and word order information occurs rapidly during the initial reading of the sentence, the process of recovering from the rhythmic pattern anomaly may ease the reanalysis…

  11. Applying pause analysis to explore cognitive processes in the copying of sentences by second language users

    Zulkifli, Putri Afzan Maria Binti

    2013-01-01

    Pause analysis is a method that investigates processes of writing by measuring the amount of time between pen strokes. It provides the field of second language studies with a means to explore the cognitive processes underpinning the nature of writing. This study examined the potential of using free handwritten copying of sentences as a means of investigating components of the cognitive processes of adults who have English as their Second Language (ESL). A series of one pilot and three ...

  12. The processing of blend words in naming and sentence reading.

    Johnson, Rebecca L; Slate, Sarah Rose; Teevan, Allison R; Juhasz, Barbara J

    2018-04-01

    Research exploring the processing of morphologically complex words, such as compound words, has found that they are decomposed into their constituent parts during processing. Although much is known about the processing of compound words, very little is known about the processing of lexicalised blend words, which are created from parts of two words, often with phoneme overlap (e.g., brunch). In the current study, blends were matched with non-blend words on a variety of lexical characteristics, and blend processing was examined using two tasks: a naming task and an eye-tracking task that recorded eye movements during reading. Results showed that blend words were processed more slowly than non-blend control words in both tasks. Blend words led to longer reaction times in naming and longer processing times on several eye movement measures compared to non-blend words. This was especially true for blends that were long, rated low in word familiarity, but were easily recognisable as blends.

  13. Effect of Visual Support on the Processing of Multiclausal Sentences

    Hagiwara, Akiko

    2015-01-01

    Processing morphemic elements is one of the most difficult parts of second language acquisition (DeKeyser, 2005; Larsen-Freeman, 2010). This difficulty gains prominence when second language (L2) learners must perform under time pressure, and difficulties arise in using grammatical knowledge. To solve the problem, the current study used the tenets…

  14. The Real-Time Processing of Sluiced Sentences

    Poirier, Josee; Wolfinger, Katie; Spellman, Lisa; Shapiro, Lewis P.

    2010-01-01

    Ellipsis refers to an element that is absent from the input but whose meaning can nonetheless be recovered from context. In this cross-modal priming study, we examined the online processing of Sluicing, an ellipsis whose antecedent is an entire clause: "The handyman threw a book to the programmer but I don't know which book" the handyman threw to…

  15. Sentence processing in anterior superior temporal cortex shows a social-emotional bias.

    Mellem, Monika S; Jasmin, Kyle M; Peng, Cynthia; Martin, Alex

    2016-08-01

    The anterior region of the left superior temporal gyrus/superior temporal sulcus (aSTG/STS) has been implicated in two very different cognitive functions: sentence processing and social-emotional processing. However, the vast majority of the sentence stimuli in previous reports have been of a social or social-emotional nature suggesting that sentence processing may be confounded with semantic content. To evaluate this possibility we had subjects read word lists that differed in phrase/constituent size (single words, 3-word phrases, 6-word sentences) and semantic content (social-emotional, social, and inanimate objects) while scanned in a 7T environment. This allowed us to investigate if the aSTG/STS responded to increasing constituent structure (with increased activity as a function of constituent size) with or without regard to a specific domain of concepts, i.e., social and/or social-emotional content. Activity in the left aSTG/STS was found to increase with constituent size. This region was also modulated by content, however, such that social-emotional concepts were preferred over social and object stimuli. Reading also induced content type effects in domain-specific semantic regions. Those preferring social-emotional content included aSTG/STS, inferior frontal gyrus, posterior STS, lateral fusiform, ventromedial prefrontal cortex, and amygdala, regions included in the "social brain", while those preferring object content included parahippocampal gyrus, retrosplenial cortex, and caudate, regions involved in object processing. These results suggest that semantic content affects higher-level linguistic processing and should be taken into account in future studies. Copyright © 2016. Published by Elsevier Ltd.

  16. Automatic and controlled processing in sentence recall: The role of long-term and working memory

    Jefferies, Elizabeth; Lambon Ralph, Matthew A.; Baddeley, Alan D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley, 2000) proposes that the executive component of working memory plays a crucial role in the formation of links between different representational formats...

  17. Mathematical statistics and stochastic processes

    Bosq, Denis

    2013-01-01

    Generally, books on mathematical statistics are restricted to the case of independent identically distributed random variables. In this book however, both this case AND the case of dependent variables, i.e. statistics for discrete and continuous time processes, are studied. This second case is very important for today's practitioners.Mathematical Statistics and Stochastic Processes is based on decision theory and asymptotic statistics and contains up-to-date information on the relevant topics of theory of probability, estimation, confidence intervals, non-parametric statistics and rob

  18. The role of the left ventrolateral prefrontal cortex in online sentence processing

    Nazbanou Nozari

    2014-04-01

    Introduction: Patients with damage to the left ventrolateral prefrontal cortex (VLPFC) are often not impaired in understanding simple sentences. It is, however, possible that the damage may cause subclinical effects. If VLPFC has a role in biasing competition towards what is relevant to the task, we would expect patients with VLPFC damage to be slower in using the relevant information and discarding the irrelevant information when they process sentences online. Methods: Nine patients, five with lesions limited to VLPFC, and four with lesions sparing VLPFC, participated. The groups were matched in age, education, WAB-AQ and total lesion volume. Two experiments explored processing of online cues during sentence comprehension by tracking eye fixations in a Visual World paradigm with four pictures. Participants only listened to the sentences and looked at the pictures. Experiment 1 investigated how quickly cues can be used for target identification using a simple “She will [verb] the [target].” sentence structure. The verbs in the restrictive condition were compatible with only one of the four pictures (e.g., “eat”; target “apple” + three inedible competitors). The verbs in the control conditions were matched to the restrictive verbs in length and frequency, but did not point to a unique target (e.g., “see”). If VLPFC is critical for quickly biasing competition towards the relevant target, the VLPFC patients should be slower than the non-VLPFC patients in fixating the noun when the verb is restrictive. Experiment 2 probed how effectively irrelevant cues are suppressed. A similar Visual World paradigm was used, but all verbs were restrictive, and one of the distractors was also compatible with the verb (e.g., “banana”). The sentences contained an adjective that ruled out one of the verb-compatible pictures (e.g., “red”). The critical manipulation involved a third picture (the adjective competitor), which was compatible with the

  19. Children's assignment of grammatical roles in the online processing of Mandarin passive sentences.

    Huang, Yi Ting; Zheng, Xiaobei; Meng, Xiangzhi; Snedeker, Jesse

    2013-11-01

    Children's difficulty understanding passives in English has been attributed to the syntactic complexity, overall frequency, cue reliability, and/or incremental processing of this construction. To understand the role of these factors, we used the visual-world paradigm to examine comprehension in Mandarin Chinese where passives are infrequent but signaled by a highly valid marker (BEI). Eye-movements during sentences indicated that these markers triggered incremental role assignments in adults and 5-year-olds. Actions after sentences indicated that passives were often misinterpreted as actives when markers appeared after the referential noun (“Seal BEI it eat” → The seal is eaten by it). However, they were more likely to be interpreted correctly when markers appeared before (“It BEI seal eat” → It is eaten by the seal). The actions and the eye-movements suggest that for both adults and children, interpretations of passive are easier when they do not require revision of an earlier role assignment.

  20. Delta, theta, beta, and gamma brain oscillations index levels of auditory sentence processing.

    Mai, Guangting; Minett, James W; Wang, William S-Y

    2016-06-01

    A growing number of studies indicate that multiple ranges of brain oscillations, especially the delta (δ), theta (θ), beta (β), and gamma (γ) bands, are involved in sentence processing. It is not clear, however, how these oscillations relate to functional processing at different linguistic hierarchical levels. Using scalp electroencephalography (EEG), the current study tested the hypothesis that phonological and the higher-level linguistic (semantic/syntactic) organizations during auditory sentence processing are indexed by distinct EEG signatures derived from the δ, θ, β, and γ oscillations. We analyzed specific EEG signatures while subjects listened to Mandarin speech stimuli in three different conditions in order to dissociate phonological and semantic/syntactic processing: (1) sentences comprising valid disyllabic words assembled in a valid syntactic structure (real-word condition); (2) utterances with morphologically valid syllables, but not constituting valid disyllabic words (pseudo-word condition); and (3) backward versions of the real-word and pseudo-word conditions. We tested four signatures: band power, EEG-acoustic entrainment (EAE), cross-frequency coupling (CFC), and inter-electrode renormalized partial directed coherence (rPDC). The results show significant effects of band power and EAE of δ and θ oscillations for phonological, rather than semantic/syntactic processing, indicating the importance of tracking δ- and θ-rate phonetic patterns during phonological analysis. We also found significant β-related effects, suggesting tracking of EEG to the acoustic stimulus (high-β EAE), memory processing (θ-low-β CFC), and auditory-motor interactions (20-Hz rPDC) during phonological analysis. For semantic/syntactic processing, we obtained a significant effect of γ power, suggesting lexical memory retrieval or processing grammatical word categories. Based on these findings, we confirm that scalp EEG signatures relevant to δ, θ, β, and γ oscillations can index phonological and semantic/syntactic organizations
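    As a concrete illustration of the signature types listed above (band power, EEG-acoustic entrainment, and cross-frequency coupling), the sketch below computes simplified versions of each on toy signals. It is a minimal example under stated assumptions, not the authors' pipeline: the sampling rate, band edges, Welch/coherence settings, and the mean-vector-length coupling measure are all illustrative choices.

```python
import numpy as np
from scipy.signal import welch, coherence, butter, filtfilt, hilbert

fs = 250.0                                   # sampling rate (Hz), assumed
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
eeg = rng.standard_normal(t.size)            # toy stand-in for one EEG channel
envelope = rng.standard_normal(t.size)       # toy stand-in for the speech envelope

bands = {"delta": (1, 4), "theta": (4, 8), "beta": (13, 30), "gamma": (30, 45)}

# 1) Band power: integrate the Welch power spectrum over each band.
freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
df = freqs[1] - freqs[0]
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs <= hi)
    print(name, "power:", float(psd[mask].sum() * df))

# 2) A crude proxy for EEG-acoustic entrainment: magnitude-squared coherence
#    between the EEG and the speech envelope, averaged over the delta band.
f_c, coh = coherence(eeg, envelope, fs=fs, nperseg=int(2 * fs))
delta = (f_c >= 1) & (f_c <= 4)
print("delta EEG-envelope coherence:", float(coh[delta].mean()))

# 3) Theta-phase / beta-amplitude coupling via the mean vector length of
#    band-limited Hilbert signals (one of many possible CFC measures).
def bandpass(x, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

theta_phase = np.angle(hilbert(bandpass(eeg, 4, 8)))
beta_amp = np.abs(hilbert(bandpass(eeg, 13, 30)))
mvl = np.abs(np.mean(beta_amp * np.exp(1j * theta_phase)))
print("theta-beta coupling (mean vector length):", float(mvl))
```

    The fourth signature mentioned in the record, rPDC, requires fitting a multivariate autoregressive model across electrodes and is omitted from this sketch.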

  1. Is the comprehension of idiomatic sentences indeed impaired in paranoid Schizophrenia? A window into semantic processing deficits

    Pesciarelli, Francesca; Gamberoni, Tania; Ferlazzo, Fabio; Lo Russo, Leo; Pedrazzi, Francesca; Melati, Ermanno; Cacciari, Cristina

    2014-01-01

    Schizophrenia patients have been reported to be more impaired in comprehending non-literal than literal language since early studies on proverbs. Preference for literal rather than figurative interpretations continues to be documented. The main aim of this study was to establish whether patients are indeed able to use combinatorial semantic processing to comprehend literal sentences and both combinatorial analysis, and retrieval of pre-stored meanings to comprehend idiomatic sentences. The study employed a sentence continuation task in which subjects were asked to decide whether a target word was a sensible continuation of a previous sentence fragment to investigate idiomatic and literal sentence comprehension in patients with paranoid schizophrenia. Patients and healthy controls were faster in accepting sensible continuations than in rejecting non-sensible ones in both literal and idiomatic sentences. Patients were as accurate as controls in comprehending literal and idiomatic sentences, but they were overall slower than controls in all conditions. Once the contribution of cognitive covariates was partialled out, the response times (RTs) to sensible idiomatic continuations of patients did not significantly differ from those of controls. This suggests that the state of residual schizophrenia did not contribute to slower processing of sensible idioms above and beyond the cognitive deficits that are typically associated with schizophrenia. PMID:25346676

  2. Influence of Second Language Proficiency and Syntactic Structure Similarities on the Sensitivity and Processing of English Passive Sentence in Late Chinese-English Bilinguists: An ERP Study

    Chang, Xin; Wang, Pei

    2016-01-01

    To investigate the influence of L2 proficiency and syntactic similarity on English passive sentence processing, the present ERP study asked 40 late Chinese-English bilinguals (27 females and 13 males, mean age = 23.88) with high or intermediate L2 proficiency to read the sentences carefully and to indicate for each sentence whether or not it was…

  3. Individual differences in executive control relate to metaphor processing: an eye movement study of sentence reading.

    Columbus, Georgie; Sheikh, Naveed A; Côté-Lecaldare, Marilena; Häuser, Katja; Baum, Shari R; Titone, Debra

    2014-01-01

    Metaphors are common elements of language that allow us to creatively stretch the limits of word meaning. However, metaphors vary in their degree of novelty, which determines whether people must create new meanings on-line or retrieve previously known metaphorical meanings from memory. Such variations affect the degree to which general cognitive capacities such as executive control are required for successful comprehension. We investigated whether individual differences in executive control relate to metaphor processing using eye movement measures of reading. Thirty-nine participants read sentences including metaphors or idioms, another form of figurative language that is more likely to rely on meaning retrieval. They also completed the AX-CPT, a domain-general executive control task. In Experiment 1, we examined sentences containing metaphorical or literal uses of verbs, presented with or without prior context. In Experiment 2, we examined sentences containing idioms or literal phrases for the same participants to determine whether the link to executive control was qualitatively similar or different to Experiment 1. When metaphors were low familiar, all people read verbs used as metaphors more slowly than verbs used literally (this difference was smaller for high familiar metaphors). Executive control capacity modulated this pattern in that high executive control readers spent more time reading verbs when a prior context forced a particular interpretation (metaphorical or literal), and they had faster total metaphor reading times when there was a prior context. Interestingly, executive control did not relate to idiom processing for the same readers. Here, all readers had faster total reading times for high familiar idioms than literal phrases. Thus, executive control relates to metaphor but not idiom processing for these readers, and for the particular metaphor and idiom reading manipulations presented.

  4. Individual Differences in Executive Control Relate to Metaphor Processing: An Eye Movement Study of Sentence Reading

    Georgie eColumbus

    2015-01-01

    Metaphors are common elements of language that allow us to creatively stretch the limits of word meaning. However, metaphors vary in their degree of novelty, which determines whether people must create new meanings on-line or retrieve previously known metaphorical meanings from memory. Such variations affect the degree to which general cognitive capacities such as executive control are required for successful comprehension. We investigated whether individual differences in executive control relate to metaphor processing using eye movement measures of reading. Thirty-nine participants read sentences including metaphors or idioms, another form of figurative language that is more likely to rely on meaning retrieval. They also completed the AX-CPT, a domain-general executive control task. In Experiment 1, we examined sentences containing metaphorical or literal uses of verbs, presented with or without prior context. In Experiment 2, we examined sentences containing idioms or literal phrases for the same participants to determine whether the link to executive control was qualitatively similar or different to Experiment 1. When metaphors were low familiar, all people read verbs used as metaphors more slowly than verbs used literally (this difference was smaller for high familiar metaphors). Executive control capacity modulated this pattern in that high executive control readers spent more time reading verbs when a prior context forced a particular interpretation (metaphorical or literal), and they had faster total metaphor reading times when there was a prior context. Interestingly, executive control did not relate to idiom processing for the same readers. Here, all readers had faster total reading times for high familiar idioms than literal phrases. Thus, executive control relates to metaphor but not idiom processing for these readers, and for the particular metaphor and idiom reading manipulations presented.

  5. Probability, Statistics, and Stochastic Processes

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  6. Fundamentals of statistical signal processing

    Kay, Steven M

    1993-01-01

    A unified presentation of parameter estimation for those involved in the design and implementation of statistical signal processing algorithms. Covers important approaches to obtaining an optimal estimator and analyzing its performance; and includes numerous examples as well as applications to real- world problems. MARKETS: For practicing engineers and scientists who design and analyze signal processing systems, i.e., to extract information from noisy signals — radar engineer, sonar engineer, geophysicist, oceanographer, biomedical engineer, communications engineer, economist, statistician, physicist, etc.
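    A canonical example of the parameter-estimation problems treated in this kind of text is estimating a constant (DC) level in white Gaussian noise, where the sample mean attains the Cramer-Rao lower bound. The sketch below is a minimal illustration of that textbook case with simulated data and arbitrary parameter values; it is not material reproduced from the book.

```python
import numpy as np

# Canonical estimation problem: a constant (DC) level A observed in white
# Gaussian noise, x[n] = A + w[n], w[n] ~ N(0, sigma^2), n = 0..N-1.
rng = np.random.default_rng(1)
A_true, sigma, N = 2.0, 1.5, 500
x = A_true + sigma * rng.standard_normal(N)

# The sample mean is the minimum-variance unbiased (and ML) estimator of A.
A_hat = x.mean()

# Its variance achieves the Cramer-Rao lower bound, sigma^2 / N.
crlb = sigma ** 2 / N
print(f"A_hat = {A_hat:.3f}, CRLB on var(A_hat) = {crlb:.4f}")

# Monte Carlo check: the empirical variance of A_hat across trials ~ CRLB.
trials = np.array([(A_true + sigma * rng.standard_normal(N)).mean()
                   for _ in range(2000)])
print("empirical var of A_hat:", trials.var())
```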

  7. Priming sentence planning

    Konopka, A.; Meyer, A.

    2014-01-01

    Sentence production requires mapping preverbal messages onto linguistic structures. Because sentences are normally built incrementally, the information encoded in a sentence-initial increment is critical for explaining how the mapping process starts and for predicting its timecourse. Two experiments tested whether and when speakers prioritize encoding of different types of information at the outset of formulation by comparing production of descriptions of transitive events (e.g., A dog is cha...

  8. Young children with ASD use lexical and referential information during on-line sentence processing

    Edith L. Bavin

    2016-02-01

    Research with adults and older children indicates that verb biases are strong influences on listeners’ interpretations when processing sentences, but they can be overruled. In this paper we ask two questions: (i) are children with ASD who are high functioning sensitive to verb biases like their same-age typically developing peers?, and (ii) do young children with ASD and young children with typical development override strong verb biases to consider alternative interpretations of ambiguous sentences? Participants were aged 5-9 years (mean age 6.65 years): children with ASD who were high functioning and children with typical development. In task 1 biasing and neutral verbs were included (e.g., eat cake versus move cake). In task 2 the focus was on whether the prepositional phrase occurring with an instrument biasing verb (e.g., ‘chop the tree with the axe’) was interpreted as an instrument even if the named item was an implausible instrument (e.g., candle in ‘Cut the cake with the candle’). Overall, the results showed similarities between groups but the ASD group was generally slower. In task 1, both groups looked at the named object faster in the biasing than the non-biasing condition, and in the biasing condition the ASD group looked away from the target more quickly than the TD group. In task 2, both groups identified the target in the prepositional phrase. They were more likely to override the verb instrument bias and consider the alternative (modification) interpretation in the implausible condition (e.g., looking at the picture of a cake with a candle on it). Our findings indicate that children of age 5 years and above can use context to override verb biases. Additionally, an important component of the sentence processing mechanism is largely intact for young children with ASD who are high functioning. Like children with typical development, they draw on verb semantics and plausibility in integrating information. However, they are likely

  9. A comparison of Danish listeners’ processing cost in judging the truth value of Norwegian, Swedish, and English sentences

    Bohn, Ocke-Schwen; Askjær-Jørgensen, Trine

    2017-01-01

    The present study used a sentence verification task to assess the processing cost involved in native Danish listeners’ attempts to comprehend true/false statements spoken in Danish, Norwegian, Swedish, and English. Three groups of native Danish listeners heard 40 sentences each which were translation equivalents, and assessed the truth value of these statements. Group 1 heard sentences in Danish and Norwegian, Group 2 in Danish and Swedish, and Group 3 in Danish and English. Response time and proportion of correct responses were used as indices of processing cost. Both measures indicate that the processing cost for native Danish listeners in comprehending Danish and English statements is equivalent, whereas Norwegian and Swedish statements incur a much higher cost, both in terms of response time and correct assessments. The results are discussed with regard to the costs of inter...

  10. Empirical Descriptions of Criminal Sentencing Decision-Making

    Rasmus H. Wandall

    2014-05-01

    The article addresses the widespread use of statistical causal modelling to describe criminal sentencing decision-making empirically in Scandinavia. The article describes the characteristics of this model, and on this basis discusses three aspects of sentencing decision-making that the model does not capture: (1) the role of law and legal structures in sentencing, (2) the processes of constructing law and facts as they occur in the processes of handling criminal cases, and (3) reflecting newer organisational changes to sentencing decision-making. The article argues for a stronger empirically based design of sentencing models and for a more balanced use of different social scientific methodologies and models of sentencing decision-making.

  11. Statistical thermodynamics of nonequilibrium processes

    Keizer, Joel

    1987-01-01

    The structure of the theory of thermodynamics has changed enormously since its inception in the middle of the nineteenth century. Shortly after Thomson and Clausius enunciated their versions of the Second Law, Clausius, Maxwell, and Boltzmann began actively pursuing the molecular basis of thermodynamics, work that culminated in the Boltzmann equation and the theory of transport processes in dilute gases. Much later, Onsager undertook the elucidation of the symmetry of transport coefficients and, thereby, established himself as the father of the theory of nonequilibrium thermodynamics. Combining the statistical ideas of Gibbs and Langevin with the phenomenological transport equations, Onsager and others went on to develop a consistent statistical theory of irreversible processes. The power of that theory is in its ability to relate measurable quantities, such as transport coefficients and thermodynamic derivatives, to the results of experimental measurements. As powerful as that theory is, it is linear and...

  12. Higher Language Ability is Related to Angular Gyrus Activation Increase During Semantic Processing, Independent of Sentence Incongruency

    Van Ettinger-Veenstra, Helene; McAllister, Anita; Lundberg, Peter; Karlsson, Thomas; Engström, Maria

    2016-01-01

    This study investigates the relation between individual language ability and neural semantic processing abilities. Our aim was to explore whether high-level language ability would correlate to decreased activation in language-specific regions or rather increased activation in supporting language regions during processing of sentences. Moreover, we were interested if observed neural activation patterns are modulated by semantic incongruency similarly to previously observed changes upon syntactic congruency modulation. We investigated 27 healthy adults with a sentence reading task—which tapped language comprehension and inference, and modulated sentence congruency—employing functional magnetic resonance imaging (fMRI). We assessed the relation between neural activation, congruency modulation, and test performance on a high-level language ability assessment with multiple regression analysis. Our results showed increased activation in the left-hemispheric angular gyrus extending to the temporal lobe related to high language ability. This effect was independent of semantic congruency, and no significant relation between language ability and incongruency modulation was observed. Furthermore, there was a significant increase of activation in the inferior frontal gyrus (IFG) bilaterally when the sentences were incongruent, indicating that processing incongruent sentences was more demanding than processing congruent sentences and required increased activation in language regions. The correlation of high-level language ability with increased rather than decreased activation in the left angular gyrus, a region specific for language processing, is opposed to what the neural efficiency hypothesis would predict. We can conclude that no evidence is found for an interaction between semantic congruency related brain activation and high-level language performance, even though the semantic incongruent condition is shown to be more demanding and to evoke more neural activation.

  13. Higher language ability is related to angular gyrus activation increase during semantic processing, independent of sentence incongruency

    Helene eVan Ettinger-Veenstra

    2016-03-01

    This study investigates the relation between individual language ability and neural semantic processing abilities. Our aim was to explore whether high-level language ability would correlate to decreased activation in language-specific regions or rather increased activation in supporting language regions during processing of sentences. Moreover, we were interested if observed neural activation patterns are modulated by semantic incongruency similarly to previously observed changes upon syntactic congruency modulation. We investigated 27 healthy adults with a sentence reading task - which tapped language comprehension and inference, and modulated sentence congruency - employing functional magnetic resonance imaging. We assessed the relation between neural activation, congruency modulation, and test performance on a high-level language ability assessment with multiple regression analysis. Our results showed increased activation in the left-hemispheric angular gyrus extending to the temporal lobe related to high language ability. This effect was independent of semantic congruency, and no significant relation between language ability and incongruency modulation was observed. Furthermore, there was a significant increase of activation in the inferior frontal gyrus bilaterally when the sentences were incongruent, indicating that processing incongruent sentences was more demanding than processing congruent sentences and required increased activation in language regions. The correlation of high-level language ability with increased rather than decreased activation in the left angular gyrus, a region specific for language processing, is opposed to what the neural efficiency hypothesis would predict. We can conclude that there is no evidence found for an interaction between semantic congruency related brain activation and high-level language performance, even though the semantic incongruent condition is shown to be more demanding and to evoke more neural activation.

  14. "The Drawer Is Still Closed": Simulating Past and Future Actions when Processing Sentences that Describe a State

    Kaup, Barbara; Ludtke, Jana; Maienborn, Claudia

    2010-01-01

    In two experiments using the action-sentence-compatibility paradigm we investigated the simulation processes that readers undertake when processing state descriptions with adjectives (e.g., "Die Schublade ist offen/zu". ["The drawer is open/shut"]) or adjectival passives (e.g., "Die Schublade ist…

  15. Sentence Processing in an Artificial Language: Learning and Using Combinatorial Constraints

    Amato, Michael S.; MacDonald, Maryellen C.

    2010-01-01

    A study combining artificial grammar and sentence comprehension methods investigated the learning and online use of probabilistic, nonadjacent combinatorial constraints. Participants learned a small artificial language describing cartoon monsters acting on objects. Self-paced reading of sentences in the artificial language revealed comprehenders'…

  16. EEG theta and gamma responses to semantic violations in online sentence processing

    Hald, L.A.; Bastiaansen, M.C.M.; Hagoort, P.

    2006-01-01

    We explore the nature of the oscillatory dynamics in the EEG of subjects reading sentences that contain a semantic violation. More specifically, we examine whether increases in theta (≈3–7 Hz) and gamma (around 40 Hz) band power occur in response to sentences that were either semantically correct or

  17. Deeper than Shallow: Evidence for Structure-Based Parsing Biases in Second-Language Sentence Processing

    Witzel, Jeffrey; Witzel, Naoko; Nicol, Janet

    2012-01-01

    This study examines the reading patterns of native speakers (NSs) and high-level (Chinese) nonnative speakers (NNSs) on three English sentence types involving temporarily ambiguous structural configurations. The reading patterns on each sentence type indicate that both NSs and NNSs were biased toward specific structural interpretations. These…

  18. Statistical estimation of process holdup

    Harris, S.P.

    1988-01-01

    Estimates of potential process holdup and their random and systematic error variances are derived to improve the inventory difference (ID) estimate and its associated measure of uncertainty for a new process at the Savannah River Plant. Since the process is in a start-up phase, data have not yet accumulated for statistical modelling. The material produced in the facility will be a very pure, highly enriched 235U with very small isotopic variability. Therefore, data published in LANL's unclassified report on Estimation Methods for Process Holdup of Special Nuclear Materials were used as a starting point for the modelling process. LANL's data were gathered through a series of designed measurements of special nuclear material (SNM) holdup at two of their materials-processing facilities. Also, they had taken steps to improve the quality of data through controlled, larger scale, experiments outside of LANL at highly enriched uranium processing facilities. The data they have accumulated are on an equipment component basis. Our modelling has been restricted to the wet chemistry area. We have developed predictive models for each of our process components based on the LANL data. 43 figs
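    As a rough illustration of how per-component holdup estimates and their random and systematic error variances might be combined into an overall uncertainty, consider the hedged sketch below. The component names and values are hypothetical, and the assumption that systematic errors are fully correlated across components is an illustrative choice; none of this is taken from the LANL data or the Savannah River models described above.

```python
import math

# Hypothetical per-component holdup estimates (kg) with random and
# systematic standard deviations; the values are illustrative only.
components = [
    # (name, holdup, sigma_random, sigma_systematic)
    ("dissolver",   0.12, 0.03, 0.02),
    ("evaporator",  0.05, 0.02, 0.01),
    ("column_bank", 0.30, 0.08, 0.05),
    ("tank_heels",  0.09, 0.04, 0.03),
]

total_holdup = sum(h for _, h, _, _ in components)

# Random errors treated as independent across components: variances add.
var_random = sum(sr ** 2 for _, _, sr, _ in components)

# Systematic errors assumed fully correlated across components (e.g., a
# shared calibration bias): standard deviations add before squaring.
var_systematic = sum(ss for _, _, _, ss in components) ** 2

sigma_total = math.sqrt(var_random + var_systematic)
print(f"holdup = {total_holdup:.2f} kg, sigma = {sigma_total:.3f} kg")

# The holdup variance would then fold into the inventory-difference (ID)
# variance alongside measurement variances for receipts, shipments, and
# physical inventories.
```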

  19. Action verbs are processed differently in metaphorical and literal sentences depending on the semantic match of visual primes

    Melissa eTroyer

    2014-12-01

    Language comprehension requires rapid and flexible access to information stored in long-term memory, likely influenced by activation of rich world knowledge and by brain systems that support the processing of sensorimotor content. We hypothesized that while literal language about biological motion might rely on neurocognitive representations of biological motion specific to the details of the actions described, metaphors rely on more generic representations of motion. In a priming and self-paced reading paradigm, participants saw video clips or images of (a) an intact point-light walker or (b) a scrambled control and read sentences containing literal or metaphoric uses of biological motion verbs either closely or distantly related to the depicted action (walking). We predicted that reading times for literal and metaphorical sentences would show differential sensitivity to the match between the verb and the visual prime. In Experiment 1, we observed interactions between the prime type (walker or scrambled video) and the verb type (close or distant match) for both literal and metaphorical sentences, but with strikingly different patterns. We found no difference in the verb region of literal sentences for Close-Match verbs after walker or scrambled motion primes, but Distant-Match verbs were read more quickly following walker primes. For metaphorical sentences, the results were roughly reversed, with Distant-Match verbs being read more slowly following a walker compared to scrambled motion. In Experiment 2, we observed a similar pattern following still image primes, though critical interactions emerged later in the sentence. We interpret these findings as evidence for shared recruitment of cognitive and neural mechanisms for processing visual and verbal biological motion information. Metaphoric language using biological motion verbs may recruit neurocognitive mechanisms similar to those used in processing literal language but be represented in a less

  20. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Boris Kotchoubey

    2012-08-01

    This study compared automatic and controlled cognitive processes that underlie event-related potential (ERP) effects during speech perception. Sentences were presented to French native speakers, and the final word could be congruent or incongruent, and presented at one of four levels of degradation (using a modulation with pink noise): no degradation, mild degradation (2 levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent words minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from the results of experiments that used word-pair or word-list paradigms.

  1. Probability, Statistics, and Stochastic Processes

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  2. Statistical Inference at Work: Statistical Process Control as an Example

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  3. Statistical Process Control for KSC Processing

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course complete with animation and video excerpts from the course when it was taught at KSC was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished as well as an evaluation of SPC software for KSC use in the future. A final accomplishment of the orientation of the author to NASA changes, terminology, data format, and new NASA task definitions will allow future consultation when the needs arise.
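    For readers unfamiliar with the basic SPC tool referred to throughout these records, the sketch below builds an individuals (X) control chart with 3-sigma limits estimated from the average moving range. The measurements are invented for illustration; this is a generic textbook chart, not a KSC procedure.

```python
import statistics

# Hypothetical processing-time measurements (hours) for a recurring task.
x = [4.2, 3.9, 4.5, 4.1, 4.0, 4.8, 3.7, 4.3, 4.4, 4.1, 5.6, 4.2]

# Individuals (X) chart: estimate short-term variation from the average
# moving range, then set 3-sigma control limits around the mean.
mr = [abs(a - b) for a, b in zip(x[1:], x[:-1])]
mr_bar = statistics.mean(mr)
sigma_hat = mr_bar / 1.128          # d2 constant for moving ranges of size 2
center = statistics.mean(x)
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for i, value in enumerate(x, start=1):
    flag = " <-- out of control" if value > ucl or value < lcl else ""
    print(f"obs {i:2d}: {value:.1f} (LCL {lcl:.2f}, UCL {ucl:.2f}){flag}")
```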

  4. Processing Interrogative Sentence Mood at the Semantic-Syntactic Interface: An Electrophysiological Research in Chinese, German, and Polish

    Kao, Chung-Shan; Dietrich, Rainer; Sommer, Werner

    2010-01-01

    Background: Languages differ in the marking of the sentence mood of a polar interrogative (yes/no question). For instance, the interrogative mood is marked at the beginning of the surface structure in Polish, whereas the marker appears at the end in Chinese. In order to generate the corresponding sentence frame, the syntactic specification of the interrogative mood is early in Polish and late in Chinese. In this respect, German belongs to an interesting intermediate class. The yes/no question is expressed by a shift of the finite verb from its final position in the underlying structure into the utterance initial position, a move affecting, hence, both the sentence's final and the sentence's initial constituents. The present study aimed to investigate whether during generation of the semantic structure of a polar interrogative, i.e., the processing preceding the grammatical formulation, the interrogative mood is encoded according to its position in the syntactic structure at distinctive time points in Chinese, German, and Polish. Methodology/Principal Findings: In a two-choice go/nogo experimental design, native speakers of the three languages responded to pictures by pressing buttons and producing utterances in their native language while their brain potentials were recorded. The emergence and latency of lateralized readiness potentials (LRP) in nogo conditions, in which speakers asked a yes/no question, should indicate the time point of processing the interrogative mood. The results revealed that Chinese, German, and Polish native speakers did not differ from each other in the electrophysiological indicator. Conclusions/Significance: The findings suggest that the semantic encoding of the interrogative mood is temporally consistent across languages despite its disparate syntactic specification. The consistent encoding may be ascribed to economic processing of interrogative moods at various sentential positions of the syntactic structures in languages or, more

  5. Statistical processing of experimental data

    NAVRÁTIL, Pavel

    2012-01-01

    This thesis covers the theory of probability and statistical sets: solved and unsolved problems on probability, random variables and their distributions, random vectors, statistical sets, and regression and correlation analysis. Solutions are provided for the unsolved problems.

  6. Rape sentencing

    Ó Cathaoir, Katharina Eva

    This handbook conducts an analysis of the sentences imposed for rape by Irish courts. Part I examines The People (DPP) v. WD [2007] IEHC 310 by outlining the salient points of the decision, in particular the separation of rape sentences into categories of punishments. The mitigating and aggravating factors are also laid out. Part II analyses recent sentences for rape since 2007. All reported Court of Criminal Appeal (CCA) cases post The People (DPP) v. WD are included as well as a survey of two years of Irish Times reports (covering the period November 2010 to November 2012).

  7. From Utterance to Example Sentence

    Kristoffersen, Jette Hedegaard

    This poster will address some of the problems on excerption of example sentences for the online dictionary of Danish Sign Language (DTS) from a raw corpus of dialogues and monologues. In the Danish Sign Language Dictionary every meaning is illustrated by one or more sentences showing the sign ... lexicographers. The sentences were excerpted by hand from a raw corpus of dialogues and monologues given to us by our group of consultants. The poster describes the process from utterance in a corpus in a larger context to an example sentence in a dictionary, where the purpose of having example sentences ... for use in the dictionary consists of 11 stages in the DTS dictionary project. Special focus will be on the stage in the process where the sentence is judged suitable for dictionary use. A set of guidelines for what makes up a good example sentence has been developed for the DTS dictionary project...

  8. Electrophysiology of Sentence Processing in Aphasia: Prosodic Cues and Thematic Fit

    Shannon M. Sheppard

    2015-05-01

    Methods: Twenty-four healthy college-age control participants (YNCs) and ten adults with Broca’s aphasia participated in this study. Each sentence was presented aurally to the participants over headphones. ERP Data Recording & Analysis: ERPs were recorded from 32 electrode sites across the scalp according to the 10-20 system. ERPs were averaged (100 ms prestimulus baseline) from artifact-free trials time-locked to critical words (i.e., the point of disambiguation “pleased” in the prosodic comparison, and the NP “the song”/“the beer” in the semantic comparison). Mean amplitudes were calculated in two windows: 300-500 ms for the N400 effects and 500-1000 ms for the P600 effects. Results: The data from our YNCs revealed a biphasic N400-P600 complex in the prosody comparison (Figure 1A). We also found an N400 effect immediately at the NP in the incongruent relative to congruent thematic fit comparison. For the prosodic comparison in the PWA group, a delayed N400 effect was found one word downstream relative to the YNC data (Figure 1B). Additionally, an N400 effect was observed in the thematic fit comparison. Discussion: The results suggest that PWA possess a delayed sensitivity to prosodic cues, which then may affect their ability to recover from misanalysis of an incorrect parse. The results also indicate that PWA are sensitive to thematic fit information and have the capacity to process this information similarly to YNCs.
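    The windowed mean-amplitude measure described above is easy to sketch. The toy example below (simulated numbers, hypothetical trial counts and sampling rate; not the study's data or pipeline) baseline-corrects epochs to a 100 ms prestimulus interval and computes condition differences in the 300-500 ms and 500-1000 ms windows.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy epochs for one electrode: trials x samples, -100 ms to 1000 ms at 500 Hz.
fs = 500
time = -0.1 + np.arange(int(1.1 * fs)) / fs
control = rng.standard_normal((40, time.size))    # simulated placeholder data
anomalous = rng.standard_normal((40, time.size))  # simulated placeholder data

def baseline_correct(epochs):
    """Subtract each trial's mean over the 100 ms prestimulus interval."""
    base = epochs[:, time < 0].mean(axis=1, keepdims=True)
    return epochs - base

control = baseline_correct(control)
anomalous = baseline_correct(anomalous)

def mean_amplitude(epochs, lo, hi):
    """Average over trials, then over a latency window given in seconds."""
    window = (time >= lo) & (time <= hi)
    return epochs.mean(axis=0)[window].mean()

# Effects as condition differences in the two analysis windows.
n400 = mean_amplitude(anomalous, 0.3, 0.5) - mean_amplitude(control, 0.3, 0.5)
p600 = mean_amplitude(anomalous, 0.5, 1.0) - mean_amplitude(control, 0.5, 1.0)
print(f"N400 window difference: {n400:.3f}, P600 window difference: {p600:.3f} (arbitrary units)")
```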

  9. A Classification of Sentences Used in Natural Language Processing in the Military Services.

    Wittrock, Merlin C.

    Concepts in cognitive psychology are applied to the language used in military situations, and a sentence classification system for use in analyzing military language is outlined. The system is designed to be used, in part, in conjunction with a natural language query system that allows a user to access a database. The discussion of military…

  10. Parallel language activation during word processing in bilinguals: Evidence from word production in sentence context

    Starreveld, P.A.; de Groot, A.M.B.; Rossmark, B.M.M.; van Hell, J.G.

    2014-01-01

    In two picture-naming experiments we examined whether bilinguals co-activate the non-target language during word production in the target language. The pictures were presented out-of-context (Experiment 1) or in visually presented sentence contexts (Experiment 2). In both experiments different

  11. Strong systematicity through sensorimotor conceptual grounding: an unsupervised, developmental approach to connectionist sentence processing

    Jansen, Peter A.; Watter, Scott

    2012-03-01

    Connectionist language modelling typically has difficulty with syntactic systematicity, or the ability to generalise language learning to untrained sentences. This work develops an unsupervised connectionist model of infant grammar learning. Following the semantic bootstrapping hypothesis, the network distils word category using a developmentally plausible infant-scale database of grounded sensorimotor conceptual representations, as well as a biologically plausible semantic co-occurrence activation function. The network then uses this knowledge to acquire an early benchmark clausal grammar using correlational learning, and further acquires separate conceptual and grammatical category representations. The network displays strongly systematic behaviour indicative of the general acquisition of the combinatorial systematicity present in the grounded infant-scale language stream, outperforms previous contemporary models that contain primarily noun and verb word categories, and successfully generalises broadly to novel untrained sensorimotor grounded sentences composed of unfamiliar nouns and verbs. Limitations as well as implications to later grammar learning are discussed.

  12. Statistical aspects of determinantal point processes

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical infer...

  13. Statistical inference for Cox processes

    Møller, Jesper; Waagepetersen, Rasmus Plenge

    2002-01-01

    Research has generated a number of advances in methods for spatial cluster modelling in recent years, particularly in the area of Bayesian cluster modelling. Along with these advances has come an explosion of interest in the potential applications of this work, especially in epidemiology and genome research. In one integrated volume, this book reviews the state-of-the-art in spatial clustering and spatial cluster modelling, bringing together research and applications previously scattered throughout the literature. It begins with an overview of the field, then presents a series of chapters that illuminate the nature and purpose of cluster modelling within different application areas, including astrophysics, epidemiology, ecology, and imaging. The focus then shifts to methods, with discussions on point and object process modelling, perfect sampling of cluster processes, partitioning in space...

  14. The effects of context on processing words during sentence reading among adults varying in age and literacy skill.

    Steen-Baker, Allison A; Ng, Shukhan; Payne, Brennan R; Anderson, Carolyn J; Federmeier, Kara D; Stine-Morrow, Elizabeth A L

    2017-08-01

    The facilitation of word processing by sentence context reflects the interaction between the build-up of message-level semantics and lexical processing. Yet, little is known about how this effect varies through adulthood as a function of reading skill. In this study, participants 18-64 years old with a range of literacy competence read simple sentences as their eye movements were monitored. We manipulated the predictability of a sentence-final target word, operationalized as cloze probability. First fixation durations showed an interaction between age and literacy skill, decreasing with age among more skilled readers but increasing among less skilled readers. This pattern suggests that age-related slowing may impact reading when not buffered by skill, but with continued practice, automatization of reading can continue to develop in adulthood. In absolute terms, readers were sensitive to predictability, regardless of age or literacy, in both early and later measures. Older readers showed differential contextual sensitivity in regression patterns, effects not moderated by literacy skill. Finally, comprehension performance increased with age and literacy skill, but performance among less skilled readers was especially reduced when predictability was low, suggesting that low-literacy adults (regardless of age) struggle when creating mental representations under weaker semantic constraints. Collectively, these findings suggest that aging readers (regardless of reading skill) are more sensitive to context for meaning-integration processes; that less skilled adult readers (regardless of age) depend more on a constrained semantic representation for comprehension; and that the capacity for literacy engagement enables continued development of efficient lexical processing in adult reading development. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Improving Instruction Using Statistical Process Control.

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  16. Predicting complex syntactic structure in real time: Processing of negative sentences in Russian.

    Kazanina, Nina

    2017-11-01

    In Russian negative sentences the verb's direct object may appear either in the accusative case, which is licensed by the verb (as is common cross-linguistically), or in the genitive case, which is licensed by the negation (Russian-specific "genitive-of-negation" phenomenon). Such sentences were used to investigate whether case marking is employed for anticipating syntactic structure, and whether lexical heads other than the verb can be predicted on the basis of a case-marked noun phrase. Experiment 1, a completion task, confirmed that genitive-of-negation is part of Russian speakers' active grammatical repertoire. In Experiments 2 and 3, the genitive/accusative case manipulation on the preverbal object led to shorter reading times at the negation and verb in the genitive versus accusative condition. Furthermore, Experiment 3 manipulated linear order of the direct object and the negated verb in order to distinguish whether the abovementioned facilitatory effect was predictive or integrative in nature, and concluded that the parser actively predicts a verb and (otherwise optional) negation on the basis of a preceding genitive-marked object. Similarly to a head-final language, case-marking information on preverbal noun phrases (NPs) is used by the parser to enable incremental structure building in a free-word-order language such as Russian.

  17. Individual differences in processing emotional images after reading disgusting and neutral sentences.

    Hartigan, Alex; Richards, Anne

    2017-11-21

    The present study examined the extent to which Event Related Potentials (ERPs) evoked by disgusting, threatening and neutral photographic images were influenced by disgust propensity, disgust sensitivity and attentional control following exposure to disgusting information. Emotional cognition was manipulated by instructing participants to remember either disgusting or neutral sentences; participants in both groups then viewed emotional images while ERPs were recorded. Disgust propensity was associated with a reduced Late Positive Potential (LPP) gap between threatening and neutral stimuli (an effect driven by a rise in the LPP for neutral images) but only amongst individuals who were exposed to disgusting sentences. The typical LPP increase for disgust over neutral was reduced by attentional shifting capacity but only for individuals who were not previously exposed to disgust. There was also a persistent occipital shifted late positivity that was enhanced for disgust for the entire LPP window and was independent of exposure. Results suggest that emotion specific ERP effects can emerge within the broad unpleasant emotional category in conjunction with individual differences and prior emotional exposure. These results have important implications for the ways in which the perception of emotion is impacted by short term cognitive influences and longer term individual differences. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. A FUNCTIONAL NEUROIMAGING INVESTIGATION OF THE ROLES OF STRUCTURAL COMPLEXITY AND TASK-DEMAND DURING AUDITORY SENTENCE PROCESSING

    Love, Tracy; Haist, Frank; Nicol, Janet; Swinney, David

    2009-01-01

    Using functional magnetic resonance imaging (fMRI), this study directly examined an issue that bridges the potential language processing and multi-modal views of the role of Broca’s area: the effects of task-demands in language comprehension studies. We presented syntactically simple and complex sentences for auditory comprehension under three different (differentially complex) task-demand conditions: passive listening, probe verification, and theme judgment. Contrary to many language imaging findings, we found that both simple and complex syntactic structures activated left inferior frontal cortex (L-IFC). Critically, we found activation in these frontal regions increased together with increased task-demands. Specifically, tasks that required greater manipulation and comparison of linguistic material recruited L-IFC more strongly; independent of syntactic structure complexity. We argue that much of the presumed syntactic effects previously found in sentence imaging studies of L-IFC may, among other things, reflect the tasks employed in these studies and that L-IFC is a region underlying mnemonic and other integrative functions, on which much language processing may rely. PMID:16881268

  19. Statistical aspects of determinantal point processes

    Lavancier, Frédéric; Møller, Jesper; Rubak, Ege Holger

    The statistical aspects of determinantal point processes (DPPs) seem largely unexplored. We review the appealing properties of DPPs, demonstrate that they are useful models for repulsiveness, detail a simulation procedure, and provide freely available software for simulation and statistical inference. We pay special attention to stationary DPPs, where we give a simple condition ensuring their existence, construct parametric models, describe how they can be well approximated so that the likelihood can be evaluated and realizations can be simulated, and discuss how statistical inference...
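    For a finite ground set, the kind of exact simulation alluded to above can be sketched with the standard spectral algorithm for discrete DPPs (due to Hough and colleagues): sample eigenvectors of the marginal kernel independently, then draw points one at a time. The kernel choice, point grid, and parameter values below are illustrative assumptions, and the code is a minimal sketch rather than the authors' software; it also prints the pair-inclusion probability to show the built-in repulsiveness.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground set: n points on a line; Gaussian (RBF) L-ensemble kernel.
n = 60
x = np.linspace(0.0, 1.0, n)
L = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05 ** 2))

# Marginal kernel K = L (I + L)^{-1}; its eigenvalues lie in [0, 1).
evals_L, evecs = np.linalg.eigh(L)
evals_K = evals_L / (1.0 + evals_L)
K = (evecs * evals_K) @ evecs.T

# Repulsiveness: P({i, j} both sampled) = K_ii K_jj - K_ij^2, never more
# than the independent product K_ii K_jj.
i, j = 10, 12
print("independent:", K[i, i] * K[j, j], "| DPP:", K[i, i] * K[j, j] - K[i, j] ** 2)

def sample_dpp(evals, evecs, rng):
    """Exact DPP sample via the spectral algorithm (eigenvalues in [0, 1])."""
    # Phase 1: keep each eigenvector independently with probability lambda_n.
    keep = rng.random(len(evals)) < evals
    V = evecs[:, keep]
    sample = []
    while V.shape[1] > 0:
        # Phase 2: pick an item with probability proportional to its squared row norm.
        p = np.sum(V ** 2, axis=1)
        p /= p.sum()
        item = rng.choice(len(p), p=p)
        sample.append(int(item))
        # Project the basis onto the subspace whose row `item` is zero, then
        # re-orthonormalize so squared row norms stay valid probabilities.
        j0 = np.argmax(np.abs(V[item, :]))
        Vj = V[:, j0].copy()
        V = np.delete(V, j0, axis=1)
        V -= np.outer(Vj, V[item, :] / Vj[item])
        if V.shape[1] > 0:
            V, _ = np.linalg.qr(V)
    return sorted(sample)

print("sampled items:", sample_dpp(evals_K, evecs, rng))
```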

  20. Proactive interference effects on sentence production

    FERREIRA, VICTOR S.; FIRATO, CARLA E.

    2002-01-01

    Proactive interference refers to recall difficulties caused by prior similar memory-related processing. Information-processing approaches to sentence production predict that retrievability affects sentence form: Speakers may word sentences so that material that is difficult to retrieve is spoken later. In this experiment, speakers produced sentence structures that could include an optional that, thereby delaying the mention of a subsequent noun phrase. This subsequent noun phrase was either (...

  1. Applicability of statistical process control techniques

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some

  2. Statistical process control for serially correlated data

    Wieringa, Jakob Edo

    1999-01-01

    Statistical Process Control (SPC) aims at quality improvement through reduction of variation. The best known tool of SPC is the control chart. Over the years, the control chart has proved to be a successful practical technique for monitoring process measurements. However, its usefulness in practice
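    One common way to handle serial correlation in control charting is to fit a time-series model to an in-control baseline and chart the one-step residuals instead of the raw observations. The sketch below (simulated AR(1) data, arbitrary parameters, a deliberately injected mean shift) is a generic illustration of that residual-chart idea, not the method developed in the thesis above.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated autocorrelated process: AR(1) with phi = 0.7, plus a large mean
# shift injected at t = 150 so the chart has something to look for.
n, phi = 200, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, 1.0)
x[150:] += 4.0

# Estimate phi from an in-control baseline, then chart one-step residuals,
# which are approximately independent and suitable for Shewhart-type limits.
baseline = x[:100]
phi_hat = baseline[1:] @ baseline[:-1] / (baseline[:-1] @ baseline[:-1])
resid = x[1:] - phi_hat * x[:-1]

center = resid[:99].mean()
sigma = resid[:99].std(ddof=1)
out = np.where(np.abs(resid - center) > 3 * sigma)[0] + 1   # +1: resid[i] belongs to x[i+1]
print(f"phi_hat = {phi_hat:.2f}; signals at t =", out)

# Note: on a residual chart a sustained step shift shows up mostly at its
# onset, because later residuals partly absorb the new level.
```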

  3. On statistical analysis of compound point process

    Volf, Petr

    2006-01-01

    Vol. 35, No. 2-3 (2006), pp. 389-396. ISSN 1026-597X. R&D Projects: GA ČR (CZ) GA402/04/1294. Institutional research plan: CEZ:AV0Z10750506. Keywords: counting process; compound process; hazard function; Cox model. Subject RIV: BB - Applied Statistics, Operational Research
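    The compound (marked) counting processes studied in this line of work can be simulated in a few lines. The sketch below draws a compound Poisson process on [0, T] with an exponential mark distribution; the rate and mark parameters are arbitrary illustrative choices, and the sanity check uses the standard identity E[S(T)] = lambda * T * E[mark].

```python
import numpy as np

rng = np.random.default_rng(5)

# Compound (marked) Poisson process on [0, T]: event times from a Poisson
# process with rate lam, each event carrying an independent mark.
T, lam = 10.0, 2.0
n_events = rng.poisson(lam * T)
times = np.sort(rng.uniform(0.0, T, size=n_events))
marks = rng.exponential(scale=1.5, size=n_events)   # mark distribution assumed

def S(t):
    """Running sum of marks up to time t (the compound process)."""
    return marks[times <= t].sum()

print("events:", n_events, "| S(T):", round(float(S(T)), 2))
print("theoretical mean of S(T):", lam * T * 1.5)
```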

  4. Statistical process control for residential treated wood

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...
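    A simple way to probe the kind of industry-versus-agency distributional difference described above is a two-sample test plus a bootstrap interval on the mean difference. The numbers below are invented placeholders (the real normalized retention data are not reproduced here), and the KS-test/bootstrap combination is only one reasonable choice, not the paper's analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical normalized retention values from industry QC samples and
# from an independent auditing agency (illustrative numbers only).
industry = rng.normal(loc=1.00, scale=0.10, size=300)
agency = rng.normal(loc=0.95, scale=0.14, size=120)

# Two-sample Kolmogorov-Smirnov test: do the two pdfs differ?
ks_stat, p_value = stats.ks_2samp(industry, agency)
print(f"KS = {ks_stat:.3f}, p = {p_value:.4f}")

# Bootstrap resampling of batches to interval-estimate the mean difference.
boot = [rng.choice(industry, industry.size).mean()
        - rng.choice(agency, agency.size).mean() for _ in range(2000)]
print("95% CI for mean difference:", np.percentile(boot, [2.5, 97.5]).round(3))
```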

  5. Nonparametric predictive inference in statistical process control

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2000-01-01

    New methods for statistical process control are presented, where the inferences have a nonparametric predictive nature. We consider several problems in process control in terms of uncertainties about future observable random quantities, and we develop inferences for these random quantities based on

  6. Nonparametric predictive inference in statistical process control

    Arts, G.R.J.; Coolen, F.P.A.; Laan, van der P.

    2004-01-01

    Statistical process control (SPC) is used to decide when to stop a process as confidence in the quality of the next item(s) is low. Information to specify a parametric model is not always available, and as SPC is of a predictive nature, we present a control chart developed using nonparametric

  7. Multivariate Statistical Process Control Charts: An Overview

    Bersimis, Sotiris; Psarakis, Stelios; Panaretos, John

    2006-01-01

    In this paper we discuss the basic procedures for the implementation of multivariate statistical process control via control charting. Furthermore, we review multivariate extensions for all kinds of univariate control charts, such as multivariate Shewhart-type control charts, multivariate CUSUM control charts and multivariate EWMA control charts. In addition, we review unique procedures for the construction of multivariate control charts, based on multivariate statistical techniques such as p...
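
    As a pointer to the univariate building block behind the multivariate EWMA charts surveyed in this record, a minimal EWMA sketch follows (the smoothing constant, limit width and data are illustrative assumptions, not taken from the paper).

        import numpy as np

        def ewma_chart(x, lam=0.2, L=3.0):
            # EWMA recursion z_i = lam*x_i + (1-lam)*z_{i-1}, with limits
            # mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2i))).
            x = np.asarray(x, dtype=float)
            mu0, sigma = x.mean(), x.std(ddof=1)   # in practice from a phase-I sample
            z, signals = mu0, []
            for i, xi in enumerate(x, start=1):
                z = lam * xi + (1 - lam) * z
                width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
                signals.append(abs(z - mu0) > width)
            return signals

        rng = np.random.default_rng(0)
        data = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.0, 1, 10)])  # small mean shift
        print(ewma_chart(data))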

  8. Do Resources, Justice Administration Practices And Federalism Have An Impact On Registered And Sentenced Crime Prevalence?

    Christophe Koller

    2014-06-01

    Full Text Available This contribution, based on a statistical approach, undertakes to link data on resources (personnel and financial means) with the working of the administration of penal justice (prosecution, sentencing), taking into account the nationality of those prosecuted. In order to distinguish the prosecution and sentencing practices of judicial authorities and possible processes of discrimination, diverse sources have been used, such as data from court administrations, public finances and police forces, collected by the Swiss Federal Statistical Office and the Swiss Federal administration of finances. The authors discuss discrimination in prosecution and sentencing between Swiss residents and foreigners, taking into account localization and resources regarding personnel and public finances.

  9. Applied Behavior Analysis and Statistical Process Control?

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  10. Analysis of Variance in Statistical Image Processing

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
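
    As a reminder of the statistic at the heart of the ANOVA-based detection described in this record, a minimal one-way ANOVA sketch follows (the pixel intensities are toy numbers, not the book's examples).

        import numpy as np

        def one_way_anova_F(groups):
            # One-way ANOVA: F = (between-group mean square) / (within-group mean square).
            groups = [np.asarray(g, dtype=float) for g in groups]
            all_data = np.concatenate(groups)
            grand_mean = all_data.mean()
            ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
            ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
            df_between = len(groups) - 1
            df_within = len(all_data) - len(groups)
            return (ss_between / df_between) / (ss_within / df_within)

        # e.g. intensities sampled from a background region and a candidate edge region
        background = [10, 12, 11, 13, 12, 11]
        edge = [18, 17, 19, 20, 18, 19]
        print(one_way_anova_F([background, edge]))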

  11. Statistical process control in nursing research.

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.
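
    The article's core idea, separating common cause from special cause variation with a control chart, can be sketched with a Shewhart individuals (I) chart; the moving-range constant 2.66 is the standard textbook value, and the weekly rates below are invented, not the study's data.

        import numpy as np

        def individuals_chart(x):
            # Individuals (I) chart: centre line at the mean, control limits at
            # mean +/- 2.66 * average moving range (2.66 = 3/d2 with d2 = 1.128).
            x = np.asarray(x, dtype=float)
            mr_bar = np.abs(np.diff(x)).mean()
            centre = x.mean()
            lcl, ucl = centre - 2.66 * mr_bar, centre + 2.66 * mr_bar
            special = [(i, v) for i, v in enumerate(x) if v < lcl or v > ucl]
            return centre, (lcl, ucl), special

        # weekly outcome rates before and after a practice-improvement intervention
        rates = [12, 14, 13, 15, 13, 14, 12, 13, 21, 22, 20]
        print(individuals_chart(rates))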

  12. Effects of Children's Working Memory Capacity and Processing Speed on Their Sentence Imitation Performance

    Poll, Gerard H.; Miller, Carol A.; Mainela-Arnold, Elina; Adams, Katharine Donnelly; Misra, Maya; Park, Ji Sook

    2013-01-01

    Background: More limited working memory capacity and slower processing for language and cognitive tasks are characteristics of many children with language difficulties. Individual differences in processing speed have not

  13. Who is who? Interpretation of multiple occurrences of the Chinese reflexive: evidence from real-time sentence processing.

    Lan Shuai

    Full Text Available Theoretical linguists claim that the notorious reflexive ziji 'self' in Mandarin Chinese, if occurring more than once in a single sentence, can take distinct antecedents. This study tackles possibly the most interesting puzzle in the linguistic literature, investigating how two occurrences of ziji in a single sentence are interpreted and whether or not there are mixed readings, i.e., these zijis are interpretively bound by distinct antecedents. Using 15 Chinese sentences each having two zijis, we conducted two sentence reading experiments based on a modified self-paced reading paradigm. The general interpretation patterns observed showed that the majority of participants associated both zijis with the same local antecedent, which was consistent with Principle A of the Standard Binding Theory and previous experimental findings involving a single ziji. In addition, mixed readings also occurred, but did not pattern as claimed in the theoretical linguistic literature (i.e., one ziji is bound by a long-distance antecedent and the other by a local antecedent). Based on these results, we argue that: (i) mixed readings were due to manifold, interlocking and conflicting perspectives taken by the participants; and (ii) cases of multiple occurrences of ziji taking distinct antecedents are illicit in Chinese syntax, since the speaker, when expressing a sentence, can select only one P(erspective)-Center that referentially denotes the psychological perspective in which the sentence is situated.

  14. Machine Learning from Garden Path Sentences: The Application of Computational Linguistics

    Jiali Du

    2014-12-01

    Full Text Available This paper discusses the application of computational linguistics in the machine learning (ML) system for the processing of garden path sentences. ML is closely related to artificial intelligence and linguistic cognition. The rapid and efficient processing of complex structures is an effective method to test the system. By means of parsing the garden path sentence, we draw the conclusion that the integration of theoretical and statistical methods is helpful for the development of the ML system.

  15. Semantic Structure in Vocabulary Knowledge Interacts with Lexical and Sentence Processing in Infancy

    Borovsky, Arielle; Ellis, Erica M.; Evans, Julia L.; Elman, Jeffrey L.

    2016-01-01

    Although the size of a child's vocabulary associates with language-processing skills, little is understood regarding how this relation emerges. This investigation asks whether and how the structure of vocabulary knowledge affects language processing in English-learning 24-month-old children (N = 32; 18 F, 14 M). Parental vocabulary report was used…

  16. The statistical process control methods - SPC

    Floreková Ľubica

    1998-03-01

    Full Text Available Methods of statistical evaluation of quality – SPC (item 20 of the documentation system of quality control of ISO norm, series 900) of various processes, products and services belong amongst the basic qualitative methods that enable us to analyse and compare data pertaining to various quantitative parameters. They also enable us, based on the latter, to propose suitable interventions with the aim of improving these processes, products and services. The theoretical basis and applicability of the principles of cause-and-effect diagnostics, Pareto analysis and the Lorenz curve, number distributions and frequency curves of random variable distributions, and Shewhart control charts are presented in the contribution.
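
    Of the tools listed in this record, Pareto analysis is the easiest to sketch in a few lines: sort the causes by frequency and accumulate their share to expose the "vital few". The defect categories and counts below are hypothetical.

        from collections import Counter

        def pareto(cause_counts):
            # Pareto analysis: rank causes by frequency and report the
            # cumulative percentage of defects they account for.
            total = sum(cause_counts.values())
            cumulative, rows = 0.0, []
            for cause, n in sorted(cause_counts.items(), key=lambda kv: -kv[1]):
                cumulative += n / total
                rows.append((cause, n, round(100 * cumulative, 1)))
            return rows

        defects = Counter({"porosity": 52, "misalignment": 31, "scratch": 9, "stain": 5, "other": 3})
        for row in pareto(defects):
            print(row)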

  17. Statistical image processing and multidimensional modeling

    Fieguth, Paul

    2010-01-01

    Images are all around us! The proliferation of low-cost, high-quality imaging devices has led to an explosion in acquired images. When these images are acquired from a microscope, telescope, satellite, or medical imaging device, there is a statistical image processing task: the inference of something - an artery, a road, a DNA marker, an oil spill - from imagery, possibly noisy, blurry, or incomplete. A great many textbooks have been written on image processing. However this book does not so much focus on images, per se, but rather on spatial data sets, with one or more measurements taken over

  18. Statistical process control for alpha spectroscopy

    Richardson, W; Majoras, R E [Oxford Instruments, Inc. P.O. Box 2560, Oak Ridge TN 37830 (United States); Joo, I O; Seymour, R S [Accu-Labs Research, Inc. 4663 Table Mountain Drive, Golden CO 80403 (United States)

    1995-10-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.

  19. Statistical process control for alpha spectroscopy

    Richardson, W.; Majoras, R.E.; Joo, I.O.; Seymour, R.S.

    1995-01-01

    Statistical process control (SPC) allows for the identification of problems in alpha spectroscopy processes before they occur, unlike standard laboratory QC which only identifies problems after a process fails. SPC tools that are directly applicable to alpha spectroscopy include individual X-charts and X-bar charts, process capability plots, and scatter plots. Most scientists are familiar with the concepts and methods employed by SPC. These tools allow analysis of process bias, precision, accuracy and reproducibility as well as process capability. Parameters affecting instrument performance are monitored and analyzed using SPC methods. These instrument parameters can also be compared to sampling, preparation, measurement, and analysis QC parameters, permitting the evaluation of cause-effect relationships. Three examples of SPC, as applied to alpha spectroscopy, are presented. The first example investigates background contamination using averaging to show trends quickly. A second example demonstrates how SPC can identify sample processing problems, analyzing both how and why this problem occurred. A third example illustrates how SPC can predict when an alpha spectroscopy process is going to fail. This allows for an orderly and timely shutdown of the process to perform preventative maintenance, avoiding the need to repeat costly sample analyses. 7 figs., 2 tabs.

  20. Statistical analysis of non-homogeneous Poisson processes. Statistical processing of a particle multidetector

    Lacombe, J.P.

    1985-12-01

    The statistical study of non-homogeneous and spatial Poisson processes forms the first part of this thesis. A Neyman-Pearson type test concerning the intensity measurement of these processes is defined. Conditions are given under which the consistency of the test is assured, and others under which the test statistics are asymptotically normal. Then some techniques for the statistical processing of Poisson fields and their application to the study of a particle multidetector are given. Quality tests of the device are proposed together with signal extraction methods [fr

  1. Statistical processing of technological and radiochemical data

    Lahodova, Zdena; Vonkova, Kateřina

    2011-01-01

    The project described in this article had two goals. The main goal was to compare technological and radiochemical data from two units of nuclear power plant. The other goal was to check the collection, organization and interpretation of routinely measured data. Monitoring of analytical and radiochemical data is a very valuable source of knowledge for some processes in the primary circuit. Exploratory analysis of one-dimensional data was performed to estimate location and variability and to find extreme values, data trends, distribution, autocorrelation etc. This process allowed for the cleaning and completion of raw data. Then multiple analyses such as multiple comparisons, multiple correlation, variance analysis, and so on were performed. Measured data was organized into a data matrix. The results and graphs such as Box plots, Mahalanobis distance, Biplot, Correlation, and Trend graphs are presented in this article as statistical analysis tools. Tables of data were replaced with graphs because graphs condense large amounts of information into easy-to-understand formats. The significant conclusion of this work is that the collection and comprehension of data is a very substantial part of statistical processing. With well-prepared and well-understood data, its accurate evaluation is possible. Cooperation between the technicians who collect data and the statistician who processes it is also very important. (author)
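
    Among the multivariate tools mentioned in this record, the Mahalanobis distance is the simplest to sketch: it measures how far each observation lies from the sample mean relative to the sample covariance. The (pH, conductivity) pairs below are invented for illustration and are not the plant data.

        import numpy as np

        def mahalanobis_distances(X):
            # Squared Mahalanobis distance of each row of X from the sample mean,
            # using the inverse of the sample covariance matrix.
            X = np.asarray(X, dtype=float)
            mu = X.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
            diffs = X - mu
            return np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs)

        X = np.array([[7.1, 20.0], [7.0, 21.0], [7.2, 19.5], [7.1, 20.5],
                      [7.0, 20.2], [7.3, 19.8], [8.4, 35.0]])   # last row anomalous
        print(np.round(mahalanobis_distances(X), 2))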

  2. PROCESS VARIABILITY REDUCTION THROUGH STATISTICAL PROCESS CONTROL FOR QUALITY IMPROVEMENT

    B.P. Mahesh

    2010-09-01

    Full Text Available Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on processes rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.

  3. FILLED GAP EFFECT AND SEMANTIC PLAUSIBILITY IN BRAZILIAN PORTUGUESE SENTENCE PROCESSING

    Marcus Maia

    2014-01-01

    The Filled Gap Effect (FGE) is investigated in Brazilian Portuguese through eye-tracking and self-paced reading experiments. Results detect the presence of the FGE, indicating that the parser is strictly syntactic in the early stage of processing. The final measures in the two experiments present discrepant results, motivating a discussion of possible good-enough effects.

  4. FILLED GAP EFFECT AND SEMANTIC PLAUSIBILITY IN BRAZILIAN PORTUGUESE SENTENCE PROCESSING

    Marcus Maia

    2014-12-01

    Full Text Available The Filled Gap Effect (FGE) is investigated in Brazilian Portuguese through eye-tracking and self-paced reading experiments. Results detect the presence of the FGE, indicating that the parser is strictly syntactic in the early stage of processing. The final measures in the two experiments present discrepant results, motivating a discussion of possible good-enough effects.

  5. Statistical process control for radiotherapy quality assurance

    Pawlicki, Todd; Whitaker, Matthew; Boyer, Arthur L.

    2005-01-01

    Every quality assurance process uncovers random and systematic errors. These errors typically consist of many small random errors and a small number of large errors that dominate the result. Quality assurance practices in radiotherapy do not adequately differentiate between these two sources of error. The ability to separate these types of errors would allow the dominant source(s) of error to be efficiently detected and addressed. In this work, statistical process control is applied to quality assurance in radiotherapy for the purpose of setting action thresholds that differentiate between random and systematic errors. The theoretical development and implementation of process behavior charts are described. We report on a pilot project in which these techniques are applied to daily output and flatness/symmetry quality assurance for a 10 MV photon beam in our department. This clinical case was followed over 52 days. As part of our investigation, we found that action thresholds set using process behavior charts were able to identify systematic changes in our daily quality assurance process. This is in contrast to action thresholds set using the standard deviation, which did not identify the same systematic changes in the process. The process behavior thresholds calculated from a subset of the data detected a 2% change in the process whereas with a standard deviation calculation, no change was detected. Medical physicists must make decisions on quality assurance data as it is acquired. Process behavior charts help decide when to take action and when to acquire more data before making a change in the process

  6. Remediation of context-processing deficits in schizophrenia: preliminary data with ambiguous sentences

    Besche-Richard C

    2014-12-01

    Full Text Available Chrystel Besche-Richard,1,2 Sarah Terrien,1 Marion Lesgourgues,3,4 Célia Béchiri-Payet,5 Fabien Gierski,1,3 Frédéric Limosin6–8 1Laboratory Cognition, Santé, Socialisation, University of Reims Champagne-Ardenne, France; 2Institut Universitaire de France, Paris, France; 3Centre Hospitalier Universitaire, Pôle de Psychiatrie des Adultes, Reims, France; 4Service Universitaire de Médecine Préventive et de Promotion de la Santé, University of Reims Champagne-Ardenne, Reims, France; 5Etablissement Public de Santé mentale départemental de l’Aisne, Prémontré, France; 6Department of Adult and Geriatric Psychiatry, Hôpitaux Universitaires Paris, Ouest (Assistance publique-Hôpitaux de Paris, Hôpital Corentin-Celton, Issy-les-Moulineaux, France; 7Faculty of Medicine, University Paris Descartes, Sorbonne Paris Cité, Paris, France; 8Psychiatry and Neurosciences Center, French National Institute of Health and Medical Research, Institut National de la Santé et de la Recherche Médicale U894, Paris, France Background: Processing of contextual information is essential for the establishment of good interpersonal relations and communicational interactions. Nevertheless, it is known that schizophrenic patients present impairments in the processing of contextual information. The aim of this study is to explore the influence of the remediation of context processing in schizophrenic patients. Methods: Thirty-one schizophrenic patients and 28 matched healthy participants were included in this study. All participants were assessed on verbal knowledge (Mill-Hill test and depression intensity (Beck Depression Scale 21 items. Schizophrenic patients were also assessed on thought, language, and communication disorders (Thought, Language and Communication scale. All participants completed a disambiguation task with two different levels of contextualization (high or low context and a context-processing remediation task containing social scenarios that

  7. From arrest to sentencing: A comparative analysis of the criminal justice system processing for rape crimes

    Joana Domingues Vargas

    2008-01-01

    Full Text Available The current article is intended to demonstrate the advantages of prioritizing an analysis of court caseload processing for a given type of crime and proceeding to a comparison of the results obtained from empirical studies in different countries. The article draws on a study I performed on rape cases tried by the court system in Campinas, São Paulo State, and the study by Gary LaFree on rape cases in the United States, based on data in Indianapolis, Indiana. The comparative analysis of determinants of victims' and law enforcement agencies' decisions concerning the pursuit of legal action proved to be productive, even when comparing two different systems of justice. This allowed greater knowledge of how the Brazilian criminal justice system operates, both in its capacity to identify, try, and punish sex offenders, and in terms of the importance it ascribes to formal legal rules in trying rape cases, in comparison to the American criminal justice system.

  8. NOTE TAKING PAIRS TO IMPROVE STUDENTS' SENTENCE BASED WRITING ACHIEVEMENT

    Testiana Deni Wijayatiningsih

    2017-04-01

    Full Text Available Students have the skill to actualize their imagination and interpret their knowledge through writing, which can be combined with good writing structure. However, their motivation to write was still low and their writing had not reached the standard structure. Against this background, this research aims to determine the influence of Note Taking Pairs on improving students' sentence-based writing achievement. The subjects of this research were second-semester students of the English Department at Muhammadiyah University of Semarang. Non-parametric statistics were used to analyze the students' writing achievement. The results showed that the Note Taking Pairs strategy could improve students' sentence-based writing achievement. This strategy is therefore recommended for the learning process to improve students' writing skills, especially in the sentence-based writing subject.

  9. Radiographic rejection index using statistical process control

    Savi, M.B.M.B.; Camozzato, T.S.C.; Soares, F.A.P.; Nandi, D.M.

    2015-01-01

    The Repeat Analysis Index (IRR) is one of the items contained in the Quality Control Program dictated by the Brazilian law on radiological protection and should be performed frequently, at least every six months. In order to extract more and better information from the IRR, this study applies statistical quality control to the reject rate through a control chart for attributes (p chart - GC) and the Pareto chart (GP). Data collection was performed for 9 months, and during the last four months data were collected on a daily basis. The control limits (LC) were established and the Minitab 16 software was used to create the charts. The IRR obtained for the period was 8.8% ± 2.3%, and the generated charts were analyzed. Relevant information, such as orders for X-ray equipment and processors, was cross-checked to identify the relationship between the points that exceeded the control limits and the state of the equipment at the time. The GC demonstrated the ability to predict equipment failures, while the GP showed clearly which causes are recurrent in the IRR. (authors) [pt
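
    The attribute (p) chart used in this study has a simple recipe: a centre line at the pooled reject fraction and 3-sigma limits that vary with the daily workload. A minimal sketch follows; the daily counts are invented, not the service's data.

        import numpy as np

        def p_chart(rejected, produced):
            # p chart: centre line p_bar, limits p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n_i),
            # clipped to the [0, 1] range.
            rejected, produced = np.asarray(rejected), np.asarray(produced)
            p_bar = rejected.sum() / produced.sum()
            p = rejected / produced
            sigma = np.sqrt(p_bar * (1 - p_bar) / produced)
            ucl = np.clip(p_bar + 3 * sigma, 0, 1)
            lcl = np.clip(p_bar - 3 * sigma, 0, 1)
            out = [(i, round(pi, 3)) for i, pi in enumerate(p) if pi > ucl[i] or pi < lcl[i]]
            return round(p_bar, 3), out

        rejected = [9, 7, 11, 8, 25, 10, 9]         # rejected films per day
        produced = [100, 95, 110, 90, 105, 100, 98]
        print(p_chart(rejected, produced))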

  10. Sentence Processing as a Function of Syntax, Short Term Memory Capacity, the Meaningfulness of the Stimulus and Age

    Gamlin, Peter J.

    1971-01-01

    Examines the effects of short term memory (STM) capacity, meaningfulness of stimuli, and age upon listeners' structuring of sentences. Results show that the interaction between STM capacity and meaningfulness (1) approached significance when data were collapsed over both age levels, and (2) was significant for one age level. Tables and references.…

  11. Accounting for Regressive Eye-Movements in Models of Sentence Processing: A Reappraisal of the Selective Reanalysis Hypothesis

    Mitchell, Don C.; Shen, Xingjia; Green, Matthew J.; Hodgson, Timothy L.

    2008-01-01

    When people read temporarily ambiguous sentences, there is often an increased prevalence of regressive eye-movements launched from the word that resolves the ambiguity. Traditionally, such regressions have been interpreted at least in part as reflecting readers' efforts to re-read and reconfigure earlier material, as exemplified by the Selective…

  12. Mathematical SETI Statistics, Signal Processing, Space Missions

    Maccone, Claudio

    2012-01-01

    This book introduces the Statistical Drake Equation where, from a simple product of seven positive numbers, the Drake Equation is turned into the product of seven positive random variables. The mathematical consequences of this transformation are demonstrated and it is proven that the new random variable N for the number of communicating civilizations in the Galaxy must follow the lognormal probability distribution when the number of factors in the Drake equation is allowed to increase at will. Mathematical SETI also studies the proposed FOCAL (Fast Outgoing Cyclopean Astronomical Lens) space mission to the nearest Sun Focal Sphere at 550 AU and describes its consequences for future interstellar precursor missions and truly interstellar missions. In addition the author shows how SETI signal processing may be dramatically improved by use of the Karhunen-Loève Transform (KLT) rather than Fast Fourier Transform (FFT). Finally, he describes the efforts made to persuade the United Nations to make the central part...
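
    The book's central claim, that the product of many positive random factors is approximately lognormal, is easy to check by simulation. The sketch below uses arbitrary uniform distributions for the seven factors (not Maccone's actual inputs) and inspects the moments of log N.

        import numpy as np

        rng = np.random.default_rng(1)
        n_sims, n_factors = 100_000, 7

        # Seven positive random factors; by the central limit theorem applied to
        # log N, their product N is approximately lognormally distributed.
        factors = rng.uniform(0.1, 10.0, size=(n_factors, n_sims))
        N = factors.prod(axis=0)
        logN = np.log(N)

        print("mean(log N)     =", logN.mean())
        print("std(log N)      =", logN.std())
        print("skewness(log N) =", ((logN - logN.mean()) ** 3).mean() / logN.std() ** 3)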

  13. Spherical Process Models for Global Spatial Statistics

    Jeong, Jaehong

    2017-11-28

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture the spatial and temporal behavior of these global data sets. Though the geodesic distance is the most natural metric for measuring distance on the surface of a sphere, mathematical limitations have compelled statisticians to use the chordal distance to compute the covariance matrix in many applications instead, which may cause physically unrealistic distortions. Therefore, covariance functions directly defined on a sphere using the geodesic distance are needed. We discuss the issues that arise when dealing with spherical data sets on a global scale and provide references to recent literature. We review the current approaches to building process models on spheres, including the differential operator, the stochastic partial differential equation, the kernel convolution, and the deformation approaches. We illustrate realizations obtained from Gaussian processes with different covariance structures and the use of isotropic and nonstationary covariance models through deformations and geographical indicators for global surface temperature data. To assess the suitability of each method, we compare their log-likelihood values and prediction scores, and we end with a discussion of related research problems.
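
    The contrast drawn in this record between the geodesic and the chordal metric is easy to make concrete: on the unit sphere the two distances nearly coincide for nearby points but diverge towards antipodality (pi versus 2), which is where chordal-based covariances distort most. The points below are arbitrary.

        import numpy as np

        def to_xyz(lat_deg, lon_deg):
            # Cartesian coordinates of a point on the unit sphere.
            lat, lon = np.radians(lat_deg), np.radians(lon_deg)
            return np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])

        def chordal(p, q):
            # Straight-line distance through the interior of the sphere.
            return np.linalg.norm(to_xyz(*p) - to_xyz(*q))

        def geodesic(p, q):
            # Great-circle arc length between the two points.
            return np.arccos(np.clip(np.dot(to_xyz(*p), to_xyz(*q)), -1.0, 1.0))

        for p, q in [((0, 0), (0, 1)), ((0, 0), (0, 90)), ((0, 0), (0, 180))]:
            print(p, q, round(geodesic(p, q), 4), round(chordal(p, q), 4))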

  14. Statistical process control for electron beam monitoring.

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
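
    The process capability ratios reported above follow the usual Cp / Cpk recipe: compare the specification window (here a +/- 2% tolerance) with the 6-sigma spread of the measurements. A minimal sketch with simulated daily output values follows; the numbers are illustrative, not the department's records.

        import numpy as np

        def capability(x, lsl, usl):
            # Cp compares the specification width to the natural 6-sigma spread;
            # Cpk additionally penalises a process that is off-centre.
            x = np.asarray(x, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        rng = np.random.default_rng(2)
        daily_output = rng.normal(100.0, 0.4, size=120)        # percent of reference dose
        print(capability(daily_output, lsl=98.0, usl=102.0))   # +/- 2% tolerance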

  15. Statistical physics of media processes: Mediaphysics

    Kuznetsov, Dmitri V.; Mandel, Igor

    2007-04-01

    The processes of mass communication in complicated social or sociobiological systems such as marketing, economics, politics, and animal populations, considered as the subject of a special scientific subbranch (“mediaphysics”), are examined in relation to sociophysics. A new statistical physics approach to analyze these phenomena is proposed. A keystone of the approach is an analysis of population distribution between two or many alternatives: brands, political affiliations, or opinions. Relative distances between a state of a “person's mind” and the alternatives are measures of propensity to buy (to affiliate, or to have a certain opinion). The distribution of population by those relative distances is time dependent and affected by external (economic, social, marketing, natural) and internal (influential propagation of opinions, “word of mouth”, etc.) factors, considered as fields. Specifically, the interaction and opinion-influence field can be generalized to incorporate important elements of Ising-spin-based sociophysical models and kinetic-equation ones. The distributions were described by a Schrödinger-type equation in terms of Green's functions. The developed approach has been applied to a real mass-media efficiency problem for a large company and generally demonstrated very good results despite low initial correlations of factors and the target variable.

  16. Statistical Processing Algorithms for Human Population Databases

    Camelia COLESCU

    2012-01-01

    Full Text Available The article describes some algorithms for statistical functions applied to a human population database. The samples are specific to the most interesting periods, when the evolution of the statistical data shows spectacular values. The article describes the most useful forms of graphical presentation of the results.

  17. Working Memory and Binding in Sentence Recall

    Baddeley, A. D.; Hitch, G. J.; Allen, R. J.

    2009-01-01

    A series of experiments explored whether chunking in short-term memory for verbal materials depends on attentionally limited executive processes. Secondary tasks were used to disrupt components of working memory and chunking was indexed by the sentence superiority effect, whereby immediate recall is better for sentences than word lists. To…

  18. The sentence wrap-up dogma.

    Stowe, Laurie A; Kaan, Edith; Sabourin, Laura; Taylor, Ryan C

    2018-03-30

    Current sentence processing research has focused on early effects of the on-line incremental processes that are performed at each word or constituent during processing. However, less attention has been devoted to what happens at the end of the clause or sentence. More specifically, over the last decade and a half, a lot of effort has been put into avoiding measuring event-related brain potentials (ERPs) at the final word of a sentence, because of the possible effects of sentence wrap-up. This article reviews the evidence on how and when sentence wrap-up impacts behavioral and ERP results. Even though the end of the sentence is associated with a positive-going ERP wave, thus far this effect has not been associated with any factors hypothesized to affect wrap-up. In addition, ERP responses to violations have not been affected by this positivity. "Sentence-final" negativities reported in the literature are not unique to sentence final positions, nor do they obscure or distort ERP effects associated with linguistic manipulations. Finally, the empirical evidence used to argue that sentence-final ERPs are different from those recorded at sentence-medial positions is weak at most. Measuring ERPs at sentence-final positions is therefore certainly not to be avoided at all costs, especially not in cases where the structure of the language under investigation requires it. More importantly, researchers should follow rigorous method in their experimental design, avoid decision tasks which may induce ERP confounds, and ensure all other possible explanations for results are considered. Although this article is directed at a particular dogma from a particular literature, this review shows that it is important to reassess what is regarded as "general knowledge" from time to time. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Spherical Process Models for Global Spatial Statistics

    Jeong, Jaehong; Jun, Mikyoung; Genton, Marc G.

    2017-01-01

    Statistical models used in geophysical, environmental, and climate science applications must reflect the curvature of the spatial domain in global data. Over the past few decades, statisticians have developed covariance models that capture

  20. Parametric statistical inference for discretely observed diffusion processes

    Pedersen, Asger Roer

    Part 1: Theoretical results. Part 2: Statistical applications of Gaussian diffusion processes in freshwater ecology.

  1. A Comparison of Alzheimer's Patients and Healthy Elders in Relationship between Components of Working Memory and Sentence Comprehension

    Raziyeh A'lemi

    2010-10-01

    Full Text Available Objective: The aim of this study was to investigate the relation between working memory and sentence comprehension. Patients with dementia of the Alzheimer's type (DAT) and matched older volunteers were tested on a battery of working memory tests, as well as on a test of sentence comprehension. Materials & Methods: This is a cross-sectional study. The statistical population included all patients with Alzheimer's disease who were registered in the Alzheimer's center (Imam Ali) in Tehran. Ten patients were randomly selected among them, according to inclusion and exclusion criteria. For data collection, an epidemiological information questionnaire, the Mini-Mental State Examination (MMSE), working memory tests and a sentence comprehension test were applied. Data were analyzed by independent t-tests and correlation analyses. Results: Patients had impaired central executive processes in working memory (P=0.006), but showed normal effects of phonological and articulatory variables on span (P=0.480). On the sentence comprehension tasks (simple and complicated sentences), DAT patients showed significant differences from their peers (simple s. P=0.001, complicated s. P=0.004). Impairment in the central executive processes of working memory in DAT patients was correlated with the complicated sentences on the sentence comprehension tasks. Conclusion: The results suggest that patients with DAT have working memory impairments that are related to their ability to map the meaning of sentences onto depictions of events in the world.

  2. When novel sentences spoken or heard for the first time in the history of the universe are not enough: toward a dual-process model of language.

    Van Lancker Sidtis, Diana

    2004-01-01

    Although interest in the language sciences was previously focused on newly created sentences, more recently much attention has turned to the importance of formulaic expressions in normal and disordered communication. Also referred to as formulaic expressions and made up of speech formulas, idioms, expletives, serial and memorized speech, slang, sayings, clichés, and conventional expressions, non-propositional language forms a large proportion of every speaker's competence, and may be differentially disturbed in neurological disorders. This review aims to examine non-propositional speech with respect to linguistic descriptions, psycholinguistic experiments, sociolinguistic studies, child language development, clinical language disorders, and neurological studies. Evidence from numerous sources reveals differentiated and specialized roles for novel and formulaic verbal functions, and suggests that generation of novel sentences and management of prefabricated expressions represent two legitimate and separable processes in language behaviour. A preliminary model of language behaviour that encompasses unitary and compositional properties and their integration in everyday language use is proposed. Integration and synchronizing of two disparate processes in language behaviour, formulaic and novel, characterizes normal communicative function and contributes to creativity in language. This dichotomy is supported by studies arising from other disciplines in neurology and psychology. Further studies are necessary to determine in what ways the various categories of formulaic expressions are related, and how these categories are processed by the brain. Better understanding of how non-propositional categories of speech are stored and processed in the brain can lead to better informed treatment strategies in language disorders.

  3. Statistical data processing with automatic system for environmental radiation monitoring

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical data processing for radiation monitoring is exemplified, and some of the results obtained are presented. Experience in the practical application of mathematical statistics methods to radiation monitoring data processing allowed a concrete algorithm of statistical processing to be developed and implemented on an M-6000 minicomputer. The suggested algorithm is, by its content, divided into 3 parts: parametric data processing and hypothesis testing, and pair and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The above algorithm was used to process data observed over a radioactive waste disposal control region. Results of processing the surface water monitoring data are presented

  4. The development of the Athens Emotional States Inventory (AESI): collection, validation and automatic processing of emotionally loaded sentences.

    Chaspari, Theodora; Soldatos, Constantin; Maragos, Petros

    2015-01-01

    This work addresses the development of ecologically valid procedures for collecting reliable and unbiased emotional data, towards computer interfaces with social and affective intelligence targeting patients with mental disorders. The Athens Emotional States Inventory (AESI), presented here, proposes the design, recording and validation of an audiovisual database for five emotional states: anger, fear, joy, sadness and neutral. The items of the AESI consist of sentences, each having content indicative of the corresponding emotion. Emotional content was assessed through a survey of 40 young participants with a questionnaire following a Latin square design. The emotional sentences that were correctly identified by 85% of the participants were recorded in a soundproof room with microphones and cameras. A preliminary validation of the AESI is performed through automatic emotion recognition experiments from speech. The resulting database contains 696 recorded utterances in the Greek language by 20 native speakers and has a total duration of approximately 28 min. Speech classification results yield accuracy up to 75.15% for automatically recognizing the emotions in the AESI. These results indicate the usefulness of our approach for collecting emotional data with reliable content, balanced across classes and with reduced environmental variability.

  5. Statistical properties of several models of fractional random point processes

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
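
    The reduced-variance criterion mentioned in this record is essentially the variance-to-mean ratio of the counts: it equals one for a Poisson process and falls below one for sub-Poissonian (nonclassical) counting statistics. A minimal sketch with simulated counts follows; the two distributions are illustrative stand-ins, not the paper's fractional models.

        import numpy as np

        def variance_to_mean(counts):
            # Variance-to-mean ratio (Fano factor) of the counting statistics.
            counts = np.asarray(counts, dtype=float)
            return counts.var(ddof=1) / counts.mean()

        rng = np.random.default_rng(3)
        poisson_counts = rng.poisson(lam=8, size=50_000)            # ratio close to 1
        binomial_counts = rng.binomial(n=10, p=0.8, size=50_000)    # sub-Poissonian, ratio < 1
        print(variance_to_mean(poisson_counts), variance_to_mean(binomial_counts))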

  6. Prisons and Sentencing Reform.

    Galvin, Jim

    1983-01-01

    Reviews current themes in sentencing and prison policy. The eight articles of this special issue discuss selective incapacitation, prison bed allocation models, computer-scored classification systems, race and gender relations, commutation, parole, and a historical review of sentencing reform. (JAC)

  7. [Psychiatric treatment sentences].

    Stevens, Hanne; Nordentoft, Merete; Agerbo, Esben

    2010-01-01

    INTRODUCTION: Previous Danish studies of the increasing number of sentences to psychiatric treatment (SPT) have compared prevalent populations of persons undergoing treatment with incident measures of reported crimes. Examining the period 1990-2006, we studied incident sentences, taking the type...

  8. STATISTICAL OPTIMIZATION OF PROCESS VARIABLES FOR ...

    2012-11-03

    Nov 3, 2012 ... The osmotic dehydration process was optimized for water loss and solutes gain. ... basis) with safe moisture content for storage (10% wet basis) [3]. Due to ... sucrose, glucose, fructose, corn syrup and sodium chloride have ...

  9. Rational integration of noisy evidence and prior semantic expectations in sentence interpretation.

    Gibson, Edward; Bergen, Leon; Piantadosi, Steven T

    2013-05-14

    Sentence processing theories typically assume that the input to our language processing mechanisms is an error-free sequence of words. However, this assumption is an oversimplification because noise is present in typical language use (for instance, due to a noisy environment, producer errors, or perceiver errors). A complete theory of human sentence comprehension therefore needs to explain how humans understand language given imperfect input. Indeed, like many cognitive systems, language processing mechanisms may even be "well designed"--in this case for the task of recovering intended meaning from noisy utterances. In particular, comprehension mechanisms may be sensitive to the types of information that an idealized statistical comprehender would be sensitive to. Here, we evaluate four predictions about such a rational (Bayesian) noisy-channel language comprehender in a sentence comprehension task: (i) semantic cues should pull sentence interpretation towards plausible meanings, especially if the wording of the more plausible meaning is close to the observed utterance in terms of the number of edits; (ii) this process should asymmetrically treat insertions and deletions due to the Bayesian "size principle"; such nonliteral interpretation of sentences should (iii) increase with the perceived noise rate of the communicative situation and (iv) decrease if semantically anomalous meanings are more likely to be communicated. These predictions are borne out, strongly suggesting that human language relies on rational statistical inference over a noisy channel.
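
    Prediction (i) above can be illustrated with a toy noisy-channel calculation: a prior over candidate intended sentences is weighed against a likelihood that decays with the number of word insertions or deletions separating the intended sentence from the observed string. The sentences, prior values and per-edit probability below are invented for illustration and are not the authors' materials or model.

        from itertools import product

        # Toy prior: an implausible literal meaning versus a plausible alternative.
        PRIOR = {
            "the mother gave the candle the daughter": 0.05,    # implausible reading
            "the mother gave the candle to the daughter": 0.95  # plausible, one insertion away
        }
        EDIT_PROB = 0.1   # assumed probability of each single word insertion/deletion

        def edit_distance(a, b):
            # Word-level Levenshtein distance between two sentences.
            a, b = a.split(), b.split()
            d = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)] for i in range(len(a) + 1)]
            for i, j in product(range(1, len(a) + 1), range(1, len(b) + 1)):
                d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                              d[i - 1][j - 1] + (a[i - 1] != b[j - 1]))
            return d[len(a)][len(b)]

        def posterior(observed):
            # P(intended | observed) proportional to P(intended) * P(observed | intended),
            # with the likelihood decaying in the number of edits.
            scores = {s: p * EDIT_PROB ** edit_distance(s, observed) for s, p in PRIOR.items()}
            z = sum(scores.values())
            return {s: round(v / z, 3) for s, v in scores.items()}

        # Semantic plausibility pulls the interpretation towards the one-edit alternative.
        print(posterior("the mother gave the candle the daughter"))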

  10. Modern Statistics for Spatial Point Processes

    Møller, Jesper; Waagepetersen, Rasmus

    2007-01-01

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  11. Modern statistics for spatial point processes

    Møller, Jesper; Waagepetersen, Rasmus

    We summarize and discuss the current state of spatial point process theory and directions for future research, making an analogy with generalized linear models and random effect models, and illustrating the theory with various examples of applications. In particular, we consider Poisson, Gibbs...

  12. Robust control charts in statistical process control

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust

  13. Statistical process control in wine industry using control cards

    Dimitrieva, Evica; Atanasova-Pacemska, Tatjana; Pacemska, Sanja

    2013-01-01

    This paper is based on research into the technological process of automatic filling of wine bottles in a winery in Stip, Republic of Macedonia. Statistical process control using statistical control cards is applied. The results and recommendations for improving the process are discussed.

  14. Sentence Recognition Prediction for Hearing-impaired Listeners in Stationary and Fluctuation Noise With FADE: Empowering the Attenuation and Distortion Concept by Plomp With a Quantitative Processing Model.

    Kollmeier, Birger; Schädler, Marc René; Warzybok, Anna; Meyer, Bernd T; Brand, Thomas

    2016-09-07

    To characterize the individual patient's hearing impairment as obtained with the matrix sentence recognition test, a simulation Framework for Auditory Discrimination Experiments (FADE) is extended here using the Attenuation and Distortion (A+D) approach by Plomp as a blueprint for setting the individual processing parameters. FADE has been shown to predict the outcome of both speech recognition tests and psychoacoustic experiments based on simulations using an automatic speech recognition system requiring only few assumptions. It builds on the closed-set matrix sentence recognition test which is advantageous for testing individual speech recognition in a way comparable across languages. Individual predictions of speech recognition thresholds in stationary and in fluctuating noise were derived using the audiogram and an estimate of the internal level uncertainty for modeling the individual Plomp curves fitted to the data with the Attenuation (A-) and Distortion (D-) parameters of the Plomp approach. The "typical" audiogram shapes from Bisgaard et al with or without a "typical" level uncertainty and the individual data were used for individual predictions. As a result, the individualization of the level uncertainty was found to be more important than the exact shape of the individual audiogram to accurately model the outcome of the German Matrix test in stationary or fluctuating noise for listeners with hearing impairment. The prediction accuracy of the individualized approach also outperforms the (modified) Speech Intelligibility Index approach which is based on the individual threshold data only. © The Author(s) 2016.
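
    A rough reading of the A+D decomposition referred to above, under the simplifying assumption that the attenuation component A only limits speech recognition in quiet while the distortion component D shifts the speech reception threshold (SRT) both in quiet and in noise, can be sketched as follows. All parameter values are illustrative assumptions and this is not the FADE implementation.

        def plomp_style_srt(noise_level_db, A=20.0, D=4.0, srt_quiet_normal=20.0, snr_normal=-8.0):
            # Simplified Plomp-style model: the higher of the quiet-limited and
            # noise-limited thresholds applies at a given noise level.
            srt_quiet = srt_quiet_normal + A + D             # hearing loss for speech in quiet
            srt_noise = noise_level_db + snr_normal + D      # masked threshold elevated by D
            return max(srt_quiet, srt_noise)

        for noise in (0, 30, 50, 70):
            print(noise, plomp_style_srt(noise))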

  15. Statistical Inference for Partially Observed Diffusion Processes

    Jensen, Anders Christian

    This thesis is concerned with parameter estimation for multivariate diffusion models. It gives a short introduction to diffusion models and related mathematical concepts. We then introduce the method of prediction-based estimating functions and describe in detail the application for a two......-Uhlenbeck process, while chapter eight describes the details of an R-package that was developed in relation to the application of the estimation procedure of chapters five and six....
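
    As a minimal, self-contained illustration of parameter estimation for a discretely observed diffusion, the sketch below simulates an Ornstein-Uhlenbeck process from its exact AR(1) transition and maps a least-squares fit back to the diffusion parameters. It is not the thesis's prediction-based estimating-function method, and all parameter values are arbitrary.

        import numpy as np

        rng = np.random.default_rng(4)

        def simulate_ou(theta, mu, sigma, dt, n):
            # Exact discretisation of dX = -theta*(X - mu)*dt + sigma*dW.
            a = np.exp(-theta * dt)
            sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))
            x = np.empty(n); x[0] = mu
            for i in range(1, n):
                x[i] = mu + a * (x[i - 1] - mu) + sd * rng.standard_normal()
            return x

        def estimate_ou(x, dt):
            # Regress X_{t+dt} on X_t (AR(1)) and map the coefficients back to (theta, mu, sigma).
            y, z = x[1:], x[:-1]
            a, b = np.polyfit(z, y, 1)
            resid_var = np.var(y - (a * z + b), ddof=2)
            theta = -np.log(a) / dt
            mu = b / (1 - a)
            sigma = np.sqrt(resid_var * 2 * theta / (1 - a ** 2))
            return theta, mu, sigma

        x = simulate_ou(theta=0.8, mu=2.0, sigma=0.5, dt=0.1, n=20_000)
        print(estimate_ou(x, dt=0.1))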

  16. Representing sentence information

    Perkins, Walton A., III

    1991-03-01

    This paper describes a computer-oriented representation for sentence information. Whereas many Artificial Intelligence (AI) natural language systems start with a syntactic parse of a sentence into the linguist's components: noun, verb, adjective, preposition, etc., we argue that it is better to parse the input sentence into 'meaning' components: attribute, attribute value, object class, object instance, and relation. AI systems need a representation that will allow rapid storage and retrieval of information and convenient reasoning with that information. The attribute-of-object representation has proven useful for handling information in relational databases (which are well known for their efficiency in storage and retrieval) and for reasoning in knowledge-based systems. On the other hand, the linguist's syntactic representation of the words in sentences has not been shown to be useful for information handling and reasoning. We think it is an unnecessary and misleading intermediate form. Our sentence representation is semantically based, in terms of attribute, attribute value, object class, object instance, and relation. Every sentence is segmented into one or more components with the form: 'attribute' of 'object' 'relation' 'attribute value'. Using only one format for all information gives the system simplicity and good performance as a RISC architecture does for hardware. The attribute-of-object representation is not new; it is used extensively in relational databases and knowledge-based systems. However, we will show that it can be used as a meaning representation for natural language sentences with minor extensions. In this paper we describe how a computer system can parse English sentences into this representation and generate English sentences from this representation. Much of this has been tested with a computer implementation.
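
    The 'attribute' of 'object' 'relation' 'attribute value' format described above is easy to mirror in a small data structure. The sketch below is purely illustrative; the class name, the example sentence and its segmentation are assumptions, not the paper's implementation.

        from dataclasses import dataclass

        @dataclass
        class AttributeOfObject:
            # One sentence component of the form:
            # 'attribute' of 'object' 'relation' 'attribute value'
            attribute: str
            obj: str
            relation: str
            value: str

            def render(self) -> str:
                # Regenerate a simple English sentence from the representation.
                return f"The {self.attribute} of the {self.obj} {self.relation} {self.value}."

        # "The color of the car is red." segmented into meaning components:
        fact = AttributeOfObject(attribute="color", obj="car", relation="is", value="red")
        print(fact)
        print(fact.render())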

  17. Using Statistical Process Control to Enhance Student Progression

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  18. Applying Statistical Process Control to Clinical Data: An Illustration.

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  19. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…

  20. Do not resonate with actions: sentence polarity modulates cortico-spinal excitability during action-related sentence reading.

    Marco Tullio Liuzza

    Full Text Available BACKGROUND: Theories of embodied language suggest that the motor system is differentially called into action when processing motor-related versus abstract content words or sentences. It has been recently shown that processing negative polarity action-related sentences modulates neural activity of premotor and motor cortices. METHODS AND FINDINGS: We sought to determine whether reading negative polarity sentences brought about differential modulation of cortico-spinal motor excitability depending on processing hand-action related or abstract sentences. Facilitatory paired-pulse Transcranial Magnetic Stimulation (pp-TMS) was applied to the primary motor representation of the right hand, and the recorded amplitude of induced motor-evoked potentials (MEPs) was used to index M1 activity during passive reading of either hand-action related or abstract content sentences presented in both negative and affirmative polarity. Results showed that the cortico-spinal excitability was affected by sentence polarity only in the hand-action related condition. Indeed, in keeping with previous TMS studies, reading positive polarity, hand action-related sentences suppressed cortico-spinal reactivity. This effect was absent when reading hand action-related negative polarity sentences. Moreover, no modulation of cortico-spinal reactivity was associated with either negative or positive polarity abstract sentences. CONCLUSIONS: Our results indicate that grammatical cues prompting motor negation reduce the cortico-spinal suppression associated with affirmative action sentence reading and thus suggest that motor simulative processes underlying the embodiment may involve even syntactic features of language.

  1. Reaming process improvement and control: An application of statistical engineering

    Müller, Pavel; Genta, G.; Barbato, G.

    2012-01-01

    A reaming operation had to be performed within given technological and economical constraints. Process improvement under realistic conditions was the goal of a statistical engineering project, supported by a comprehensive experimental investigation providing detailed information on single...

  2. Using Statistical Process Control Methods to Classify Pilot Mental Workloads

    Kudo, Terence

    2001-01-01

    .... These include cardiac, ocular, respiratory, and brain activity measures. The focus of this effort is to apply statistical process control methodology on different psychophysiological features in an attempt to classify pilot mental workload...

  3. Statistic techniques of process control for MTR type

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims at introducing some improvements in the fabrication of MTR type fuel plates by applying statistical techniques of process control. The work was divided into four single steps and their data were analyzed for: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; applying statistical tools and standard specifications to perform a comparative study of these processes. (author)

  4. Sentence Level Information Patterns for Novelty Detection

    Li, Xiaoyan

    2006-01-01

    .... Given a user's information need, some information patterns in sentences such as combinations of query words, sentence lengths, named entities and phrases, and other sentence patterns, may contain...

  5. Sentencing Multiple Crimes

    Most people assume that criminal offenders have only been convicted of a single crime. However, in reality almost half of offenders stand to be sentenced for more than one crime. The high proportion of multiple crime offenders poses a number of practical and theoretical challenges for the criminal......, and psychology offer their perspectives to the volume. A comprehensive examination of the dynamics involved with sentencing multiple offenders has the potential to be a powerful tool for legal scholars and professionals, particularly given the practical importance of the topic and the relative dearth of research...

  6. Statistical Data Processing with R – Metadata Driven Approach

    Rudi SELJAK

    2016-06-01

    In recent years the Statistical Office of the Republic of Slovenia has put a lot of effort into re-designing its statistical process. We replaced the classical stove-pipe oriented production system with general software solutions, based on the metadata-driven approach. This means that one general program code, which is parametrized with process metadata, is used for data processing for a particular survey. Currently, the general program code is entirely based on SAS macros, but in the future we would like to explore how successfully the statistical software R can be used for this approach. The paper describes the metadata-driven principle for data validation, the generic software solution and the main issues connected with the use of the statistical software R for this approach.
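
A minimal sketch of the metadata-driven idea described above, assuming a hypothetical rule schema (variable name, expected type, allowed range); it is written in Python for illustration rather than the Office's actual SAS-macro or R code.

```python
# Metadata-driven validation sketch: one generic function, parametrized by
# process metadata (RULES), validates records for any survey.
from typing import Any

RULES = [  # hypothetical process metadata: one entry per variable to validate
    {"var": "turnover", "type": float, "min": 0.0, "max": 1e9},
    {"var": "employees", "type": int, "min": 0, "max": 100_000},
]

def validate(record: dict[str, Any], rules=RULES) -> list[str]:
    """Return a list of rule violations for one survey record."""
    errors = []
    for rule in rules:
        value = record.get(rule["var"])
        if value is None:
            errors.append(f"{rule['var']}: missing")
            continue
        if not isinstance(value, rule["type"]):
            errors.append(f"{rule['var']}: expected {rule['type'].__name__}")
            continue
        if not (rule["min"] <= value <= rule["max"]):
            errors.append(f"{rule['var']}: {value} outside [{rule['min']}, {rule['max']}]")
    return errors

if __name__ == "__main__":
    print(validate({"turnover": 1250.5, "employees": -3}))
    # -> ['employees: -3 outside [0, 100000]']
```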

  7. The development of an automated sentence generator for the assessment of reading speed

    Legge Gordon E

    2008-03-01

    Full Text Available Abstract Reading speed is an important outcome measure for many studies in neuroscience and psychology. Conventional reading speed tests have a limited corpus of sentences and usually require observers to read sentences aloud. Here we describe an automated sentence generator which can create over 100,000 unique sentences, scored using a true/false response. We propose that an estimate of the minimum exposure time required for observers to categorise the truth of such sentences is a good alternative to reading speed measures that guarantees comprehension of the printed material. Removing one word from the sentence reduces performance to chance, indicating minimal redundancy. Reading speed assessed using rapid serial visual presentation (RSVP of these sentences is not statistically different from using MNREAD sentences. The automated sentence generator would be useful for measuring reading speed with button-press response (such as within MRI scanners and for studies requiring many repeated measures of reading speed.
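
A toy illustration of the generator idea, assuming invented word lists and a single template rather than the published corpus: each sentence is assembled at random and carries a truth value for the true/false categorisation response.

```python
# Toy true/false sentence generator: sentences are assembled from small
# word lists, and each sentence carries a truth value for the response.
import random

ANIMALS = {"dogs": "bark", "cats": "meow", "cows": "moo"}
random.seed(1)

def make_sentence() -> tuple[str, bool]:
    subject = random.choice(list(ANIMALS))
    verb = random.choice(list(ANIMALS.values()))
    sentence = f"Most {subject} {verb} when they are excited."
    return sentence, ANIMALS[subject] == verb  # True only if the pairing is correct

if __name__ == "__main__":
    for _ in range(3):
        text, is_true = make_sentence()
        print(is_true, text)
```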

  8. The application of bayesian statistic in data fit processing

    Guan Xingyin; Li Zhenfu; Song Zhaohui

    2010-01-01

    The rationale and disadvantages of least-squares fitting, which is usually used in data processing, are analyzed, and the theory and common methods for applying Bayesian statistics to data processing are presented in detail. As the analysis shows, the Bayesian approach avoids the restrictive hypotheses that least-squares fitting imposes in data processing, and its results are more scientific and more easily understood; it may therefore replace least-squares fitting in data processing. (authors)
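
A hedged sketch of the contrast the authors draw, for the simple case of fitting a slope with known noise variance and a Gaussian prior; the prior width and noise level are illustrative choices, not values from the paper.

```python
# Least squares vs. a simple Bayesian fit of y = a*x + noise.
# With a Gaussian prior on the slope and known noise variance, the
# posterior is available in closed form and shrinks the estimate
# toward the prior mean.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + rng.normal(scale=3.0, size=x.size)   # true slope = 2

sigma2 = 3.0 ** 2        # assumed known noise variance
tau2 = 1.0 ** 2          # prior variance of the slope (prior mean 0)

# Least-squares estimate of the slope (no intercept, for brevity)
a_ls = np.sum(x * y) / np.sum(x * x)

# Posterior: precision-weighted combination of prior and data
post_prec = 1.0 / tau2 + np.sum(x * x) / sigma2
a_post_mean = (np.sum(x * y) / sigma2) / post_prec
a_post_sd = np.sqrt(1.0 / post_prec)

print(f"least squares: {a_ls:.3f}")
print(f"posterior    : {a_post_mean:.3f} +/- {a_post_sd:.3f}")
```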

  9. Using Paper Helicopters to Teach Statistical Process Control

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  10. Memory-type control charts in statistical process control

    Abbas, N.

    2012-01-01

    The control chart is the most important statistical tool for managing business processes. It is a graph of measurements on a quality characteristic of the process on the vertical axis plotted against time on the horizontal axis. The graph is completed with control limits that mark the expected range of variation. Once
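
Memory-type charts such as the EWMA and CUSUM weight past observations as well as the current one. The sketch below computes EWMA statistics and their time-varying 3-sigma limits for a simulated process with a small mean shift; the target mean, sigma, smoothing constant and limit multiplier are illustrative, not taken from the thesis.

```python
# EWMA (memory-type) control chart sketch with exact time-varying limits.
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """Return EWMA statistics and (lower, upper) control limits."""
    z = np.empty(len(x))
    prev = mu0
    for t, xt in enumerate(x):
        prev = lam * xt + (1 - lam) * prev   # weighted memory of past data
        z[t] = prev
    t = np.arange(1, len(x) + 1)
    width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
    return z, mu0 - width, mu0 + width

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    data = np.concatenate([rng.normal(10, 1, 20), rng.normal(10.8, 1, 10)])  # small shift
    z, lcl, ucl = ewma_chart(data, mu0=10.0, sigma=1.0)
    signals = np.where((z > ucl) | (z < lcl))[0]
    print("out-of-control at samples:", signals)
```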

  11. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  12. Prototypicality in Sentence Production

    Onishi, Kristine H.; Murphy, Gregory L.; Bock, Kathryn

    2008-01-01

    Three cued-recall experiments examined the effect of category typicality on the ordering of words in sentence production. Past research has found that typical items tend to be mentioned before atypical items in a phrase--a pattern usually associated with lexical variables (like word frequency), and yet typicality is a conceptual variable.…

  13. What's statistical about learning? Insights from modelling statistical learning as a set of memory processes.

    Thiessen, Erik D

    2017-01-05

    Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274: , 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105: , 2745-2750; Thiessen & Yee 2010 Child Development 81: , 1287-1303; Saffran 2002 Journal of Memory and Language 47: , 172-196; Misyak & Christiansen 2012 Language Learning 62: , 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39: , 246-263; Thiessen et al. 2013 Psychological Bulletin 139: , 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik

  14. Proficiency and sentence constraint effects on second language word learning.

    Ma, Tengfei; Chen, Baoguo; Lu, Chunming; Dunlap, Susan

    2015-07-01

    This paper presents an experiment that investigated the effects of L2 proficiency and sentence constraint on semantic processing of unknown L2 words (pseudowords). All participants were Chinese native speakers who learned English as a second language. In the experiment, we used a whole sentence presentation paradigm with a delayed semantic relatedness judgment task. Both higher and lower-proficiency L2 learners could make use of the high-constraint sentence context to judge the meaning of novel pseudowords, and higher-proficiency L2 learners outperformed lower-proficiency L2 learners in all conditions. These results demonstrate that both L2 proficiency and sentence constraint affect subsequent word learning among second language learners. We extended L2 word learning into a sentence context, replicated the sentence constraint effects previously found among native speakers, and found proficiency effects in L2 word learning. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Sentence Comprehension as Mental Simulation: An Information-Theoretic Perspective

    Gabriella Vigliocco

    2011-11-01

    It has been argued that the mental representation resulting from sentence comprehension is not (just) an abstract symbolic structure but a “mental simulation” of the state-of-affairs described by the sentence. We present a particular formalization of this theory and show how it gives rise to quantifications of the amount of syntactic and semantic information conveyed by each word in a sentence. These information measures predict simulated word-processing times in a dynamic connectionist model of sentence comprehension as mental simulation. A quantitatively similar relation between information content and reading time is known to be present in human reading-time data.
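
The paper derives its word-by-word syntactic and semantic information measures from a dynamic connectionist simulation model; the sketch below only illustrates the generic notion of per-word information (surprisal) using a toy bigram language model with an invented corpus.

```python
# Per-word surprisal from a toy bigram model with add-one smoothing.
import math
from collections import Counter

corpus = "the boy ate the apple . the girl ate the pear .".split()
bigrams = Counter(zip(corpus, corpus[1:]))
unigrams = Counter(corpus)
vocab = len(unigrams)

def surprisal(prev: str, word: str) -> float:
    """-log2 P(word | prev) with add-one smoothing."""
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
    return -math.log2(p)

sentence = "the boy ate the pear".split()
for prev, word in zip(sentence, sentence[1:]):
    print(f"{word:>6s}: {surprisal(prev, word):.2f} bits")
```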

  16. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  17. Statistical Process Control: Going to the Limit for Quality.

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  18. Statistical Process Control in the Practice of Program Evaluation.

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  19. Statistical Process Control. Impact and Opportunities for Ohio.

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  20. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  1. Electrophysiological signatures of phonological and semantic maintenance in sentence repetition.

    Meltzer, Jed A; Kielar, Aneta; Panamsky, Lilia; Links, Kira A; Deschamps, Tiffany; Leigh, Rosie C

    2017-08-01

    Verbal short-term memory comprises resources for phonological rehearsal, which have been characterized anatomically, and for maintenance of semantic information, which are less understood. Sentence repetition tasks tap both processes interactively. To distinguish brain activity involved in phonological vs. semantic maintenance, we recorded magnetoencephalography during a sentence repetition task, incorporating three manipulations emphasizing one mechanism over the other. Participants heard sentences or word lists and attempted to repeat them verbatim after a 5-second delay. After MEG, participants completed a cued recall task testing how much they remembered of each sentence. Greater semantic engagement relative to phonological rehearsal was hypothesized for 1) sentences vs. word lists, 2) concrete vs. abstract sentences, and 3) well recalled vs. poorly recalled sentences. During auditory perception and the memory delay period, we found highly left-lateralized activation in the form of 8-30 Hz event-related desynchronization. Compared to abstract sentences, concrete sentences recruited posterior temporal cortex bilaterally, demonstrating a neural signature for the engagement of visual imagery in sentence maintenance. Maintenance of arbitrary word lists recruited right hemisphere dorsal regions, reflecting increased demands on phonological rehearsal. Sentences that were ultimately poorly recalled in the post-test also elicited extra right hemisphere activation when they were held in short-term memory, suggesting increased demands on phonological resources. Frontal midline theta oscillations also reflected phonological rather than semantic demand, being increased for word lists and poorly recalled sentences. These findings highlight distinct neural resources for phonological and semantic maintenance, with phonological maintenance associated with stronger oscillatory modulations. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Limiting processes in non-equilibrium classical statistical mechanics

    Jancel, R.

    1983-01-01

    After a recall of the basic principles of statistical mechanics, the results of ergodic theory, the passage to the thermodynamic limit and its link with transport theory near equilibrium are analyzed. The fundamental problems posed by the description of non-equilibrium macroscopic systems are investigated and the kinetic methods are stated. The problems of non-equilibrium statistical mechanics are analyzed: irreversibility and coarse-graining, macroscopic variables and kinetic description, autonomous reduced descriptions, limit processes, BBGKY hierarchy, limit theorems [fr

  3. A new instrument for statistical process control of thermoset molding

    Day, D.R.; Lee, H.L.; Shepard, D.D.; Sheppard, N.F.

    1991-01-01

    The recent development of a rugged ceramic mold-mounted dielectric sensor and high-speed dielectric instrumentation now enables monitoring and statistical process control of production molding over thousands of runs. In this work, special instrumentation and software (ICAM-1000) were utilized that automatically extract critical points during the molding process, including flow point, viscosity minimum, gel inflection, and reaction endpoint. In addition, other sensors were incorporated to measure temperature and pressure. The critical points as well as temperature and pressure were then recorded during normal production and plotted in the form of statistical process control (SPC) charts. Experiments have been carried out in RIM, SMC, and RTM type molding operations. The influence of temperature, pressure, chemistry, and other variables has been investigated. In this paper examples of both RIM and SMC are discussed

  4. Asyndetic sentences with a concretiser

    Tanasić Sreto Z.

    2015-01-01

    The paper discusses asyndetic sentences, compound sentences without a conjunction between the clauses. Slavic scholars pay considerable attention to these sentences. They predominantly consider asyndetic sentences to be a model of compound sentences, apart from the model of compound conjunctional sentences, and plead that they should be described separately. Asyndetic sentences in contemporary Serbian have not been studied sufficiently. There are few specific papers dedicated to asyndetic sentences, and one can say that there are virtually no papers giving them an in-depth treatment. Therefore, we are so far left without a full insight into how widespread that compound sentence model is in contemporary Serbian and in what variants it occurs, not to mention our even lesser knowledge of its distribution in certain functional styles. This paper describes one type of asyndetic sentences in the contemporary Standard Serbian language. It includes such sentences that have a word or a phrase functioning as the verifier of the semantic relation between the clauses of asyndetic sentences. The paper demonstrates that such sentences take up a sizeable portion of the asyndetic sentence corpus, and that a large number of concretisers occur functioning as the verifiers of different meanings which are established between the clauses. The concretisers, similarly to conjunctions in syndetic sentences, serve the purpose of reducing the typical polysemy of asyndetic sentences to monosemy by assigning a monosemic relation between the clauses while foregrounding one of the possible meanings, and suppressing the others. The paper indicates that coordinate asyndetic sentences express a number of different semantic relations between the clauses. Some of them are expressed in complex sentences, some in compound sentences, and there are also those that can be expressed in both types of conjunctional sentences. The paper presents examples of sentences which have in their

  5. Impact of SNR, masker type and noise reduction processing on sentence recognition performance and listening effort as indicated by the pupil dilation response

    Ohlenforst, Barbara; Wendt, Dorothea; Kramer, Sophia E

    2018-01-01

    Recent studies have shown that activating the noise reduction scheme in hearing aids results in a smaller peak pupil dilation (PPD), indicating reduced listening effort, at 50% and 95% correct sentence recognition with a 4-talker masker. The objective of this study was to measure the effect...... of the noise reduction scheme (on or off) on PPD and sentence recognition across a wide range of signal-to-noise ratios (SNRs) from +16 dB to -12 dB and two masker types (4-talker and stationary noise). Relatively low PPDs were observed at very low (-12 dB) and very high (+16 dB to +8 dB) SNRs presumably due...... to 'giving up' and 'easy listening', respectively. The maximum PPD was observed with SNRs at approximately 50% correct sentence recognition. Sentence recognition with both masker types was significantly improved by the noise reduction scheme, which corresponds to the shift in performance from SNR function...

  6. Statistical convergence of a non-positive approximation process

    Agratini, Octavian

    2011-01-01

    Highlights: → A general class of approximation processes is introduced. → The A-statistical convergence is studied. → Applications in quantum calculus are delivered. - Abstract: Starting from a general sequence of linear and positive operators of discrete type, we associate its r-th order generalization. This construction involves high-order derivatives of a signal and it loses the positivity property. Considering that the initial approximation process is A-statistically uniformly convergent, we prove that the property is inherited by the new sequence. Also, our result includes information about the uniform convergence. Two applications in q-Calculus are presented. We study q-analogues both of Meyer-Koenig and Zeller operators and Stancu operators.

  7. Statistical Process Control in a Modern Production Environment

    Windfeldt, Gitte Bjørg

    Paper 1 is aimed at practitioners to help them test the assumption that the observations in a sample are independent and identically distributed, an assumption that is essential when using classical Shewhart charts. The test can easily be performed in the control chart setup using the samples gathered here and standard statistical software. In Paper 2 a new method for process monitoring is introduced. The method uses a statistical model of the quality characteristic and a sliding window of observations to estimate the probability that the next item will not respect the specifications. If the estimated probability exceeds a pre-determined threshold, the process will be stopped. The method is flexible, allowing a complexity in modeling that remains invisible to the end user. Furthermore, the method allows building diagnostic plots based on the parameter estimates that can provide valuable insight...
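
A sketch of the sliding-window idea from Paper 2, assuming a plain normal model for the quality characteristic (the thesis leaves the model choice flexible); the specification limits, window length and threshold are invented for the example.

```python
# Sliding-window monitoring: estimate P(next item outside the spec limits)
# from the last w observations and stop when it exceeds a threshold.
import numpy as np
from scipy import stats

def prob_next_out_of_spec(window, lsl, usl):
    mu, sd = np.mean(window), np.std(window, ddof=1)
    inside = stats.norm.cdf(usl, mu, sd) - stats.norm.cdf(lsl, mu, sd)
    return 1.0 - inside

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    x = np.concatenate([rng.normal(50, 1, 60), rng.normal(52, 1.5, 20)])  # drifting process
    w, threshold = 25, 0.05
    for t in range(w, len(x)):
        p = prob_next_out_of_spec(x[t - w:t], lsl=47.0, usl=53.0)
        if p > threshold:
            print(f"stop at item {t}: P(out of spec) = {p:.3f}")
            break
```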

  8. Statistical features of pre-compound processes in nuclear reactions

    Hussein, M.S.; Rego, R.A.

    1983-04-01

    Several statistical aspects of multistep compound processes are discussed. The connection between the cross-section auto-correlation function and the average number of maxima is emphasized. The restrictions imposed by the non-zero value of the energy step used in measuring the excitation function and by the experimental error are discussed. Applications are made to the system ²⁵Mg(³He,p)²⁷Al. (Author) [pt

  9. Application of statistical process control to qualitative molecular diagnostic assays.

    O'Brien, Cathal P

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control. Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply statistical process control to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers, with a resultant protracted time to detection. Modelled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of statistical process control to qualitative laboratory data.
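
A sketch of the frequency-plus-confidence-interval check, using the Wilson score interval as one reasonable choice (the article does not prescribe a particular interval); the counts and expected frequency are invented.

```python
# Flag an assay if the expected mutation frequency falls outside the 95% CI
# of the observed frequency.
import math

def wilson_ci(k: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

def out_of_control(k: int, n: int, expected: float) -> bool:
    lo, hi = wilson_ci(k, n)
    return not (lo <= expected <= hi)

if __name__ == "__main__":
    # e.g. 12 mutation-positive results in 150 samples vs. an expected 15%
    print(wilson_ci(12, 150))           # interval around the observed frequency
    print(out_of_control(12, 150, 0.15))
```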

  10. Some properties of point processes in statistical optics

    Picinbono, B.; Bendjaballah, C.

    2010-01-01

    The analysis of the statistical properties of the point process (PP) of photon detection times can be used to determine whether or not an optical field is classical, in the sense that its statistical description does not require the methods of quantum optics. This determination is, however, more difficult than ordinarily admitted and the first aim of this paper is to illustrate this point by using some results of the PP theory. For example, it is well known that the analysis of the photodetection of classical fields exhibits the so-called bunching effect. But this property alone cannot be used to decide the nature of a given optical field. Indeed, we have presented examples of point processes for which a bunching effect appears and yet they cannot be obtained from a classical field. These examples are illustrated by computer simulations. Similarly, it is often admitted that for fields with very low light intensity the bunching or antibunching can be described by using the statistical properties of the distance between successive events of the point process, which simplifies the experimental procedure. We have shown that, while this property is valid for classical PPs, it has no reason to be true for nonclassical PPs, and we have presented some examples of this situation also illustrated by computer simulations.

  11. An introduction to statistical process control in research proteomics.

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general purpose, and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3] making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier

  12. Tracking sentence planning and production.

    Kemper, Susan; Bontempo, Daniel; McKedy, Whitney; Schmalzried, RaLynn; Tagliaferri, Bruno; Kieweg, Doug

    2011-03-01

    To assess age differences in the costs of language planning and production. A controlled sentence production task was combined with digital pursuit rotor tracking. Participants were asked to track a moving target while formulating a sentence using specified nouns and verbs and to continue to track the moving target while producing their response. The length of the critical noun phrase (NP) as well as the type of verb provided were manipulated. The analysis indicated that sentence planning was more costly than sentence production, and sentence planning costs increased when participants had to incorporate a long NP into their sentence. The long NPs also tended to be shifted to the end of the sentence, whereas short NPs tended to be positioned after the verb. Planning or producing responses with long NPs was especially difficult for older adults, although verb type and NP shift had similar costs for young and older adults. Pursuit rotor tracking during controlled sentence production reveals the effects of aging on sentence planning and production.

  13. Design of short Italian sentences to assess near vision performance.

    Calossi, Antonio; Boccardo, Laura; Fossetti, Alessandro; Radner, Wolfgang

    2014-01-01

    To develop and validate 28 short Italian sentences for the construction of the Italian version of the Radner Reading Chart to simultaneously measure near visual acuity and reading speed. 41 sentences were constructed in Italian language, following the procedure defined by Radner, to obtain "sentence optotypes" with comparable structure and with the same lexical and grammatical difficulty. Sentences were statistically selected and used in 211 normal, non-presbyopic, native Italian-speaking persons. The most equally matched sentences in terms of reading speed and number of reading errors were selected. To assess the validity of the reading speed results obtained with the 28 selected short sentences, we compared the reading speed and reading errors with the average obtained by reading two long 4th-grade paragraphs (97 and 90 words) under the same conditions. The overall mean reading speed of the tested persons was 189±26wpm. The 28 sentences more similar in terms of reading times were selected, achieving a coefficient of variation (the relative SD) of 2.2%. The reliability analyses yielded an overall Cronbach's alpha coefficient of 0.98. The correlation between the short sentences and the long paragraph was high (r=0.85, P<0.0001). The 28 short single Italian sentences optotypes were highly comparable in syntactical structure, number, position, and length of words, lexical difficulty, and reading length. The resulting Italian Radner Reading Chart is precise (high consistency) and practical (short sentences) and therefore useful for research and clinical practice to simultaneously measure near reading acuity and reading speed. Copyright © 2013 Spanish General Council of Optometry. Published by Elsevier Espana. All rights reserved.

  14. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  15. Competent statistical programmer: Need of business process outsourcing industry

    Khan, Imran

    2014-01-01

    Over the last two decades Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing because of cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require very niche skills for service delivery. The demand and supply ratios are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes. PMID:24987578

  16. Competent statistical programmer: Need of business process outsourcing industry.

    Khan, Imran

    2014-07-01

    Over the last two decades Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing because of cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require very niche skills for service delivery. The demand and supply ratios are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

  17. Competent statistical programmer: Need of business process outsourcing industry

    Imran Khan

    2014-01-01

    Over the last two decades Business Process Outsourcing (BPO) has evolved into a mature practice. India is seen as a preferred destination for pharmaceutical outsourcing because of cost arbitrage. Within biometrics outsourcing, statistical programming and analysis require very niche skills for service delivery. The demand and supply ratios are imbalanced due to a high churn rate and a limited supply of competent programmers. The industry is moving from task delivery to ownership and accountability. The paradigm shift from outsourcing to consulting is triggering the need for competent statistical programmers. Programmers should be trained in technical, analytical, problem-solving, decision-making and soft skills, as customer expectations are changing from task delivery to accountability for the project. This paper highlights the common issues the SAS programming service industry is facing and the skills programmers need to develop to cope with these changes.

  18. Design and Statistics in Quantitative Translation (Process) Research

    Balling, Laura Winther; Hvelplund, Kristian Tangsgaard

    2015-01-01

    Traditionally, translation research has been qualitative, but quantitative research is becoming increasingly important, especially in translation process research but also in other areas of translation studies. This poses problems to many translation scholars since this way of thinking...... is unfamiliar. In this article, we attempt to mitigate these problems by outlining our approach to good quantitative research, all the way from research questions and study design to data preparation and statistics. We concentrate especially on the nature of the variables involved, both in terms of their scale...... and their role in the design; this has implications for both design and choice of statistics. Although we focus on quantitative research, we also argue that such research should be supplemented with qualitative analyses and considerations of the translation product....

  19. Statistical representation of a spray as a point process

    Subramaniam, S.

    2000-01-01

    The statistical representation of a spray as a finite point process is investigated. One objective is to develop a better understanding of how single-point statistical information contained in descriptions such as the droplet distribution function (ddf), relates to the probability density functions (pdfs) associated with the droplets themselves. Single-point statistical information contained in the droplet distribution function (ddf) is shown to be related to a sequence of single surrogate-droplet pdfs, which are in general different from the physical single-droplet pdfs. It is shown that the ddf contains less information than the fundamental single-point statistical representation of the spray, which is also described. The analysis shows which events associated with the ensemble of spray droplets can be characterized by the ddf, and which cannot. The implications of these findings for the ddf approach to spray modeling are discussed. The results of this study also have important consequences for the initialization and evolution of direct numerical simulations (DNS) of multiphase flows, which are usually initialized on the basis of single-point statistics such as the droplet number density in physical space. If multiphase DNS are initialized in this way, this implies that even the initial representation contains certain implicit assumptions concerning the complete ensemble of realizations, which are invalid for general multiphase flows. Also the evolution of a DNS initialized in this manner is shown to be valid only if an as yet unproven commutation hypothesis holds true. Therefore, it is questionable to what extent DNS that are initialized in this manner constitute a direct simulation of the physical droplets. Implications of these findings for large eddy simulations of multiphase flows are also discussed. (c) 2000 American Institute of Physics

  20. Using sentence combining in technical writing classes

    Rosner, M.; Paul, T.

    1981-01-01

    Sentence combining exercises are advanced as a way to teach technical writing style without reliance upon abstractions, from which students do not learn. Such exercises (1) give students regular writing practice; (2) teach the logic of sentence structure, sentence editing, and punctuation; (3) teach paragraph development and organization; and (4) teach rhetorical stance. Typical sentence-, paragraph-, and discourse-level sentence combining exercises are described.

  1. Effects of Word Frequency and Modality on Sentence Comprehension Impairments in People with Aphasia

    DeDe, Gayle

    2012-01-01

    Purpose: It is well known that people with aphasia have sentence comprehension impairments. The present study investigated whether lexical factors contribute to sentence comprehension impairments in both the auditory and written modalities using online measures of sentence processing. Method: People with aphasia and non brain-damaged controls…

  2. Sentence Learning in Children and Adults: The Production of Forms and Transforms.

    Ehri, Linnea C.

    This investigation was intended to study the effects of some linguistic variables on child and adult memories for sentences when recall was prompted by nouns embedded in the sentences. Its purpose was to examine for developmental differences in sentence processing systems expected by psycholinguistic theory and research. A group of 64 subjects,…

  3. On the joint statistics of stable random processes

    Hopcraft, K I; Jakeman, E

    2011-01-01

    A utilitarian continuous bi-variate random process whose first-order probability density function is a stable random variable is constructed. Results paralleling some of those familiar from the theory of Gaussian noise are derived. In addition to the joint-probability density for the process, these include fractional moments and structure functions. Although the correlation functions for stable processes other than Gaussian do not exist, we show that there is coherence between values adopted by the process at different times, which identifies a characteristic evolution with time. The distribution of the derivative of the process, and the joint-density function of the value of the process and its derivative measured at the same time are evaluated. These enable properties to be calculated analytically such as level crossing statistics and those related to the random telegraph wave. When the stable process is fractal, the proportion of time it spends at zero is finite and some properties of this quantity are evaluated, an optical interpretation for which is provided. (paper)

  4. Statistical characterization of pitting corrosion process and life prediction

    Sheikh, A.K.; Younas, M.

    1995-01-01

    In order to prevent corrosion failures of machines and structures, it is desirable to know in advance when corrosion damage will take place, and appropriate measures are needed to mitigate the damage. Corrosion predictions are needed both at the development and at the operational stage of machines and structures. There are several forms of corrosion process through which varying degrees of damage can occur. Under certain conditions these corrosion processes act alone, and under other sets of conditions several of these processes may occur simultaneously. Certain types of machine elements and structures, such as gears, bearings, tubes, pipelines, containers, storage tanks, etc., are particularly prone to pitting corrosion, which is an insidious form of corrosion. Corrosion predictions are usually based on experimental results obtained from test coupons and/or field experience with similar machines or parts of a structure. Considerable scatter is observed in corrosion processes. The probabilistic nature and kinetics of the pitting process make it necessary to use statistical methods to forecast the residual life of machines or structures. The focus of this paper is to characterize pitting as a time-dependent random process, and using this characterization the prediction of the life to reach a critical level of pitting damage can be made. Using several data sets from the literature on pitting corrosion, the extreme value modeling of the pitting corrosion process, the evolution of the extreme value distribution in time, and their relationship to the reliability of machines and structures are explained. (author)

  5. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    Williams Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.

  6. Statistical process control using optimized neural networks: a case study.

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, the probabilistic neural network and the radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Advanced statistics to improve the physical interpretation of atomization processes

    Panão, Miguel R.O.; Radu, Lucian

    2013-01-01

    Highlights: ► Finite pdf mixtures improve the physical interpretation of sprays. ► A Bayesian approach using an MCMC algorithm is used to find the best finite mixture. ► The statistical method identifies multiple droplet clusters in a spray. ► Multiple drop clusters are eventually associated with multiple atomization mechanisms. ► The spray is described by its drop size distribution and not only its moments. -- Abstract: This paper reports an analysis of the physics of atomization processes using advanced statistical tools, namely finite mixtures of probability density functions, whose best fit is found using a Bayesian approach based on a Markov chain Monte Carlo (MCMC) algorithm. This approach accounts for possible multimodality and heterogeneities in drop size distributions. Therefore, it provides information about the complete probability density function of multimodal drop size distributions and allows the identification of subgroups in the heterogeneous data. This improves the physical interpretation of atomization processes. Moreover, it also overcomes the limitations induced by analyzing the spray droplet characteristics through moments alone, particularly the obscuring of the different natures of droplet formation. Finally, the method is applied to physically interpret a case study based on multijet atomization processes
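
An illustrative stand-in for the approach: fitting finite mixtures with one to three components to simulated (log) drop diameters and comparing them. sklearn's EM-based GaussianMixture and the BIC are used here in place of the paper's Bayesian MCMC fitting, and the data are invented.

```python
# Finite-mixture fit of simulated drop sizes; the number of well-supported
# components hints at multiple droplet clusters (atomization mechanisms).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# two hypothetical droplet clusters, e.g. from two atomization mechanisms
diam = np.concatenate([rng.lognormal(3.0, 0.25, 600),
                       rng.lognormal(4.0, 0.30, 400)])
X = np.log(diam).reshape(-1, 1)

for k in (1, 2, 3):
    gm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(f"k={k}: BIC={gm.bic(X):.1f}, weights={np.round(gm.weights_, 2)}")
```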

  8. Statistical process control charts for monitoring military injuries.

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
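
A minimal u-chart sketch in the spirit of the installation charts described above: quarterly injury counts per unit of person-time, a historical average as the centre line, and limits three standard deviations above and below it. The counts and person-years are made up for illustration.

```python
# Shewhart u-chart for injury rates: centre line u_bar and limits
# u_bar +/- 3*sqrt(u_bar / n_i), where n_i is the person-time at risk.
import numpy as np

injuries = np.array([310, 295, 330, 342, 360, 401, 288, 305])      # per quarter
person_years = np.array([950, 940, 955, 960, 948, 952, 945, 958]) / 4.0

u = injuries / person_years                    # observed quarterly rates
u_bar = injuries.sum() / person_years.sum()    # historical average rate

ucl = u_bar + 3 * np.sqrt(u_bar / person_years)
lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / person_years), 0)

for q, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
    flag = "signal" if (rate > hi or rate < lo) else "in control"
    print(f"Q{q}: rate={rate:.2f} per person-year, limits=({lo:.2f}, {hi:.2f}) -> {flag}")
```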

  9. Statistical process control applied to the manufacturing of beryllia ceramics

    Ferguson, G.P.; Jech, D.E.; Sepulveda, J.L.

    1991-01-01

    To compete effectively in an international market, scrap and re-work costs must be minimized. Statistical Process Control (SPC) provides powerful tools to optimize production performance. These techniques are currently being applied to the forming, metallizing, and brazing of beryllia ceramic components. This paper describes specific examples of applications of SPC to dry-pressing of beryllium oxide 2x2 substrates, to Mo-Mn refractory metallization, and to metallization and brazing of plasma tubes used in lasers where adhesion strength is critical

  10. Statistical dynamics of transient processes in a gas discharge plasma

    Smirnov, G.I.; Telegin, G.G.

    1991-01-01

    The properties of a gas discharge plasma to a great extent depend on random processes whose study has recently become particularly important. The present work is concerned with analyzing the statistical phenomena that occur during the prebreakdown stage in a gas discharge. Unlike other studies of breakdown in the discharge gap, in which secondary electron effects and photon processes at the electrodes must be considered, here the authors treat the case of an electrodeless rf discharge or a laser photoresonant plasma. The analysis is based on the balance between the rates of electron generation and recombination in the plasma. The fluctuation kinetics for ionization of atoms in the hot plasma may also play an important role when the electron temperature changes abruptly, as occurs during adiabatic pinching of the plasma or during electron cyclotron heating

  11. Russian Sentence Adverbials

    Lorentzen, Elena; Durst-Andersen, Per

    2015-01-01

    Sentence adverbials (SA) in Russian are analyzed in their totality, i.e. from a lexical, syntactic, semantic and pragmatic point of view. They are classified according to Hare’s three utterance components, which yields (1) neustic, (2) tropic and (3) phrastic SAs. These components are used to represent semantic paraphrases of Russian SAs in utterances from various types of discourse in order to show their exact contribution to the meaning conveyed by the entire utterance. They are further subdivided according to their function: (1) into connectives and non-connectives; (2) into attitudinal ... way or the other to take their starting point in the previous discourse. It is, however, stressed that the specificity of the Russian language is found in modal adverbials, where a division between external and internal reality exists. We end the examination by discussing the function of word order...

  12. Errors in patient specimen collection: application of statistical process control.

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
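
A sketch of how mislabelled-sample proportions could be monitored on 3-sigma limits, in the spirit of the spreadsheet approach described; a p-chart is used here as one standard choice, and the monthly counts are invented.

```python
# p-chart for the monthly proportion of mislabelled samples.
import numpy as np

mislabelled = np.array([14, 11, 18, 9, 32, 13])       # events per month
samples     = np.array([5200, 4900, 5300, 5000, 5100, 5150])

p = mislabelled / samples
p_bar = mislabelled.sum() / samples.sum()              # overall error proportion
sigma = np.sqrt(p_bar * (1 - p_bar) / samples)         # per-month standard error

for month, (pi, s) in enumerate(zip(p, sigma), start=1):
    status = "investigate" if abs(pi - p_bar) > 3 * s else "stable"
    print(f"month {month}: {pi:.4f} (limits {p_bar - 3*s:.4f}-{p_bar + 3*s:.4f}) {status}")
```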

  13. A case study: application of statistical process control tool for determining process capability and sigma level.

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation process. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of process at the sigma level with respect to out-of-specification attributes produced. Finally, the study will point to an area where the application of quality improvement and quality risk assessment principles for achievement of six sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
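
A minimal sketch of the capability calculation this kind of case study relies on: Cp, Cpk and the corresponding short-term sigma level for one critical quality attribute. The specification limits and simulated tablet weights are illustrative only; note that some authors add a 1.5-sigma shift when quoting a sigma level.

```python
# Process capability indices and short-term sigma level for tablet weight.
import numpy as np

rng = np.random.default_rng(11)
weights = rng.normal(250.4, 1.8, 200)      # mg, simulated production data
lsl, usl = 245.0, 255.0                    # hypothetical specification limits

mu, sigma = weights.mean(), weights.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)                      # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)         # accounts for off-centring
sigma_level = 3 * cpk                               # short-term Z

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}, sigma level ~ {sigma_level:.1f}")
```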

  14. THE FUNCTION OF ALBANIAN AND ENGLISH SENTENCE

    Shkelqim Millaku

    2017-01-01

    A simple sentence consists of a single independent clause. A multiple sentence contains one or more clauses as its immediate constituents. Multiple sentences are either compound or complex. In a compound sentence the immediate constituents are two or more coordinate clause. In a complex sentence one or more of its elements, such as direct object or adverbial, are realized by a subordinate.[1] Simple sentence may be divided into four major syntactic classes, whose use correlates with different...

  15. Graphene growth process modeling: a physical-statistical approach

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-band semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires the understanding of growth mechanisms, and methods of characterization and control of grain size of graphene flakes, analytical modeling of graphene growth process is therefore essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of graphene growth process on Cu foil.

  16. Single photon laser altimeter simulator and statistical signal processing

    Vacek, Michael; Prochazka, Ivan

    2013-05-01

    Spaceborne altimeters are common instruments onboard the deep space rendezvous spacecrafts. They provide range and topographic measurements critical in spacecraft navigation. Simultaneously, the receiver part may be utilized for Earth-to-satellite link, one way time transfer, and precise optical radiometry. The main advantage of single photon counting approach is the ability of processing signals with very low signal-to-noise ratio eliminating the need of large telescopes and high power laser source. Extremely small, rugged and compact microchip lasers can be employed. The major limiting factor, on the other hand, is the acquisition time needed to gather sufficient volume of data in repetitive measurements in order to process and evaluate the data appropriately. Statistical signal processing is adopted to detect signals with average strength much lower than one photon per measurement. A comprehensive simulator design and range signal processing algorithm are presented to identify a mission specific altimeter configuration. Typical mission scenarios (celestial body surface landing and topographical mapping) are simulated and evaluated. The high interest and promising single photon altimeter applications are low-orbit (˜10 km) and low-radial velocity (several m/s) topographical mapping (asteroids, Phobos and Deimos) and landing altimetry (˜10 km) where range evaluation repetition rates of ˜100 Hz and 0.1 m precision may be achieved. Moon landing and asteroid Itokawa topographical mapping scenario simulations are discussed in more detail.

  17. Statistics

    Hayslett, H T

    1991-01-01

    Statistics covers the basic principles of Statistics. The book starts by tackling the importance and the two kinds of statistics; the presentation of sample data; the definition, illustration and explanation of several measures of location; and the measures of variation. The text then discusses elementary probability, the normal distribution and the normal approximation to the binomial. Testing of statistical hypotheses and tests of hypotheses about the theoretical proportion of successes in a binomial population and about the theoretical mean of a normal population are explained. The text the

  18. Statistical reliability analyses of two wood plastic composite extrusion processes

    Crookston, Kevin A.; Mark Young, Timothy; Harper, David; Guess, Frank M.

    2011-01-01

    Estimates of the reliability of wood plastic composites (WPC) are explored for two industrial extrusion lines. The goal of the paper is to use parametric and non-parametric analyses to examine potential differences in the WPC metrics of reliability for the two extrusion lines that may be helpful for use by the practitioner. A parametric analysis of the extrusion lines reveals some similarities and disparities in the best models; however, a non-parametric analysis reveals unique and insightful differences between Kaplan-Meier survival curves for the modulus of elasticity (MOE) and modulus of rupture (MOR) of the WPC industrial data. The distinctive non-parametric comparisons indicate the source of the differences in strength between the 10.2% and 48.0% fractiles [3,183-3,517 MPa] for MOE and for MOR between the 2.0% and 95.1% fractiles [18.9-25.7 MPa]. Distribution fitting as related to selection of the proper statistical methods is discussed with relevance to estimating the reliability of WPC. The ability to detect statistical differences in the product reliability of WPC between extrusion processes may benefit WPC producers in improving product reliability and safety of this widely used house-decking product. The approach can be applied to many other safety and complex system lifetime comparisons.
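
    For the non-parametric side of such an analysis, the sketch below computes a Kaplan-Meier-type survival curve by hand for two hypothetical sets of MOE values treated as pseudo-lifetimes; the numbers and censoring flags are invented and do not come from the WPC data above:

      import numpy as np

      def kaplan_meier(times, events):
          """Return (event times, survival estimates) for right-censored data."""
          order = np.argsort(times)
          times = np.asarray(times, float)[order]
          events = np.asarray(events, bool)[order]
          surv, out_t, out_s = 1.0, [], []
          for t in np.unique(times):
              d = np.sum((times == t) & events)   # failures observed at t
              n = np.sum(times >= t)              # units still at risk just before t
              if d > 0:
                  surv *= 1.0 - d / n
                  out_t.append(t)
                  out_s.append(surv)
          return np.array(out_t), np.array(out_s)

      moe_line1 = [3100, 3250, 3300, 3400, 3450, 3500]   # hypothetical MOE values (MPa)
      moe_line2 = [3000, 3150, 3200, 3350, 3380, 3420]
      events = [1, 1, 1, 1, 0, 1]                        # 0 marks a censored observation
      for name, data in [("line 1", moe_line1), ("line 2", moe_line2)]:
          t, s = kaplan_meier(data, events)
          print(name, list(zip(t.tolist(), np.round(s, 3).tolist())))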

  19. Application of statistical process control to qualitative molecular diagnostic assays

    O'Brien, Cathal P.

    2014-11-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
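
    A minimal sketch of the frequency-plus-confidence-interval rule described above, here using a Wilson score interval; the expected mutation frequency and the observed counts are hypothetical:

      import math

      def wilson_ci(successes, n, z=1.96):
          """Wilson score confidence interval for a binomial proportion."""
          p = successes / n
          denom = 1 + z**2 / n
          centre = (p + z**2 / (2 * n)) / denom
          half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
          return centre - half, centre + half

      # Hypothetical monitoring rule: flag the assay if the expected mutation
      # frequency falls outside the CI of the observed frequency.
      expected_freq = 0.40                 # assumed long-run mutation rate for this assay
      observed_mutant, n_samples = 28, 100
      lo, hi = wilson_ci(observed_mutant, n_samples)
      in_control = lo <= expected_freq <= hi
      print(f"observed {observed_mutant}/{n_samples}, 95% CI ({lo:.3f}, {hi:.3f}), "
            f"{'in control' if in_control else 'investigate'}")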

  20. Application of statistical process control to qualitative molecular diagnostic assays.

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories and in particular high throughput laboratories such as clinical chemistry have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and where present is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data was also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.

  1. The application of statistical process control in linac quality assurance

    Li Dingyu; Dai Jianrong

    2009-01-01

    Objective: To improve the linac quality assurance (QA) program with the statistical process control (SPC) method. Methods: SPC is applied to set the control limits of QA data, draw charts and differentiate between random and systematic errors. An SPC quality assurance software package named QA MANAGER has been developed in VB programming for clinical use. Two clinical cases are analyzed with SPC to study the daily output QA of a 6 MV photon beam. Results: In the clinical cases, SPC is able to identify the systematic errors. Conclusion: The SPC application may assist in detecting systematic errors in linac quality assurance; it alarms on abnormal trends so that systematic errors can be eliminated, and thus improves quality control. (authors)
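
    A generic individuals-chart (I-MR) sketch for daily output data of the kind analyzed above; this is not the QA MANAGER software, the measurements are simulated, and 2.66 is the standard 3/d2 constant for individuals charts:

      import numpy as np

      rng = np.random.default_rng(2)
      # Hypothetical daily 6 MV output measurements, normalised to 1.000.
      baseline = 1.0 + rng.normal(0, 0.004, 20)           # in-control history
      new_days = 1.0 + rng.normal(0, 0.004, 10) + 0.012   # later days with a systematic shift

      # Individuals-chart limits from the baseline: centre +/- 2.66 * mean moving range.
      centre = baseline.mean()
      mr_bar = np.abs(np.diff(baseline)).mean()
      ucl, lcl = centre + 2.66 * mr_bar, centre - 2.66 * mr_bar

      for day, x in enumerate(new_days, start=21):
          status = "OK" if lcl <= x <= ucl else "out of control"
          print(f"day {day}: output {x:.4f} ({status})")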

  2. Monitoring a PVC batch process with multivariate statistical process control charts

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these
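
    A minimal sketch of one common MSPC ingredient, assuming unfolded batch summary data: a PCA model fitted on normal batches, a Hotelling's T^2 statistic per batch, and a rough per-variable contribution for a suspect batch. The data, number of components and control limit are illustrative only, not the authors' method:

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      # Hypothetical batch summary data: 40 batches x 6 process variables.
      X = rng.normal(size=(40, 6))
      X[-1, 2] += 8.0                          # the last batch is strongly abnormal on variable 3

      mu, sd = X[:-1].mean(axis=0), X[:-1].std(axis=0)
      Z = (X - mu) / sd                        # scale with the normal (training) batches only
      pca = PCA(n_components=3).fit(Z[:-1])
      scores = pca.transform(Z)
      lam = pca.explained_variance_

      t2 = np.sum(scores**2 / lam, axis=1)     # Hotelling's T^2 per batch
      limit = np.percentile(t2[:-1], 99)       # crude empirical control limit from normal batches
      print("flagged batches:", np.where(t2 > limit)[0])

      # Rough contribution breakdown for the last batch: which variables drive its T^2?
      contrib = (scores[-1] / lam) @ pca.components_ * Z[-1]
      print("variable contributions:", np.round(contrib, 2))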

  3. Does verbatim sentence recall underestimate the language competence of near-native speakers?

    Judith Schweppe

    2015-02-01

    Full Text Available Verbatim sentence recall is widely used to test the language competence of native and non-native speakers since it involves comprehension and production of connected speech. However, we assume that, to maintain surface information, sentence recall relies particularly on attentional resources, which differentially affects native and non-native speakers. Since even in near-natives, language processing is less automatized than in native speakers, processing a sentence in a foreign language plus retaining its surface may result in a cognitive overload. We contrasted the sentence recall performance of German native speakers with that of highly proficient non-natives. Non-natives recalled the sentences significantly more poorly than the natives, but performed equally well on a cloze test. This implies that sentence recall underestimates the language competence of good non-native speakers in mixed groups with native speakers. The findings also suggest that theories of sentence recall need to consider both its linguistic and its attentional aspects.

  4. Statistics

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  5. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in its simplicity and adaptability for developing monitoring applica

  6. Multiresolution, Geometric, and Learning Methods in Statistical Image Processing, Object Recognition, and Sensor Fusion

    Willsky, Alan

    2004-01-01

    .... Our research blends methods from several fields: statistics and probability, signal and image processing, mathematical physics, scientific computing, statistical learning theory, and differential...

  7. Guideline implementation in clinical practice: Use of statistical process control charts as visual feedback devices

    Fahad A Al-Hussein

    2009-01-01

    Conclusions: A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.

  8. Exploring Methods to Investigate Sentencing Decisions

    Merrall, Elizabeth L. C.; Dhami, Mandeep K.; Bird, Sheila M.

    2010-01-01

    The determinants of sentencing are of much interest in criminal justice and legal research. Understanding the determinants of sentencing decisions is important for ensuring transparent, consistent, and justifiable sentencing practice that adheres to the goals of sentencing, such as the punishment, rehabilitation, deterrence, and incapacitation of…

  9. Sentence retrieval for abstracts of randomized controlled trials

    Chung Grace Y

    2009-02-01

    Full Text Available Abstract Background The practice of evidence-based medicine (EBM) requires clinicians to integrate their expertise with the latest scientific research. But this is becoming increasingly difficult with the growing numbers of published articles. There is a clear need for better tools to improve clinicians' ability to search the primary literature. Randomized clinical trials (RCTs) are the most reliable source of evidence documenting the efficacy of treatment options. This paper describes the retrieval of key sentences from abstracts of RCTs as a step towards helping users find relevant facts about the experimental design of clinical studies. Method Using Conditional Random Fields (CRFs), a popular and successful method for natural language processing problems, sentences referring to Intervention, Participants and Outcome Measures are automatically categorized. This is done by extending a previous approach for labeling sentences in an abstract for general categories associated with scientific argumentation or rhetorical roles: Aim, Method, Results and Conclusion. Methods are tested on several corpora of RCT abstracts. First, structured abstracts with headings specifically indicating Intervention, Participant and Outcome Measures are used. A manually annotated corpus of structured and unstructured abstracts is also prepared for testing a classifier that identifies sentences belonging to each category. Results Using CRFs, sentences can be labeled for the four rhetorical roles with F-scores from 0.93-0.98. This outperforms the use of Support Vector Machines. Furthermore, sentences can be automatically labeled for Intervention, Participant and Outcome Measures, in unstructured and structured abstracts where the section headings do not specifically indicate these three topics. F-scores of up to 0.83 and 0.84 are obtained for Intervention and Outcome Measure sentences. Conclusion Results indicate that some of the methodological elements of RCTs are
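
    Because a full CRF pipeline is too long to reproduce here, the sketch below deliberately substitutes a simpler independent-sentence classifier (TF-IDF features with logistic regression) for the CRF sequence model used in the paper; the tiny training sentences and labels are invented:

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Tiny invented training set: sentences labelled with rhetorical roles.
      sentences = [
          "We aimed to compare drug A with placebo in adults with asthma.",
          "Participants were randomised to receive drug A or placebo for 12 weeks.",
          "The primary outcome measure was change in FEV1 at week 12.",
          "Drug A improved FEV1 significantly more than placebo.",
          "These findings support the use of drug A in moderate asthma.",
      ]
      labels = ["Aim", "Intervention", "Outcome", "Results", "Conclusion"]

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
      clf.fit(sentences, labels)
      print(clf.predict(["The outcome was measured as the change in symptom score."]))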

  10. Statistics

    2005-01-01

    For the years 2004 and 2005 the figures shown in the tables of Energy Review are partly preliminary. The annual statistics published in Energy Review are presented in more detail in a publication called Energy Statistics that comes out yearly. Energy Statistics also includes historical time-series over a longer period of time (see e.g. Energy Statistics, Statistics Finland, Helsinki 2004.) The applied energy units and conversion coefficients are shown in the back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes, precautionary stock fees and oil pollution fees

  11. Statistics

    2001-01-01

    For the year 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions from the use of fossil fuels, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in 2000, Energy exports by recipient country in 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  12. Statistics

    2000-01-01

    For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g., Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-March 2000, Energy exports by recipient country in January-March 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  13. Statistics

    1999-01-01

    For the year 1998 and the year 1999, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy Review appear in more detail from the publication Energiatilastot - Energy Statistics issued annually, which also includes historical time series over a longer period (see e.g. Energiatilastot 1998, Statistics Finland, Helsinki 1999, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 1999, Energy exports by recipient country in January-June 1999, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  14. Statistical Analysis of CMC Constituent and Processing Data

    Fornuff, Jonathan

    2004-01-01

    observed using statistical analysis software. The ultimate purpose of this study is to determine what variations in material processing can lead to the most critical changes in the material's properties. The work I have taken part in this summer explores, in general, the key properties needed. In this study, SiC/SiC composites of varying architectures, utilizing a boron-nitride (BN)

  15. Discussion of "Modern statistics for spatial point processes"

    Jensen, Eva Bjørn Vedel; Prokesová, Michaela; Hellmund, Gunnar

    2007-01-01

    ABSTRACT. The paper ‘Modern statistics for spatial point processes’ by Jesper Møller and Rasmus P. Waagepetersen is based on a special invited lecture given by the authors at the 21st Nordic Conference on Mathematical Statistics, held at Rebild, Denmark, in June 2006. At the conference, Antti...

  16. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  17. Statistics

    2003-01-01

    For the year 2002, part of the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot 2001, Statistics Finland, Helsinki 2002). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supply and total consumption of electricity GWh, Energy imports by country of origin in January-June 2003, Energy exports by recipient country in January-June 2003, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees on oil pollution fees on energy products

  18. Statistics

    2004-01-01

    For the year 2003 and 2004, the figures shown in the tables of the Energy Review are partly preliminary. The annual statistics of the Energy Review also includes historical time-series over a longer period (see e.g. Energiatilastot, Statistics Finland, Helsinki 2003, ISSN 0785-3165). The applied energy units and conversion coefficients are shown in the inside back cover of the Review. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in GDP, energy consumption and electricity consumption, Carbon dioxide emissions from fossile fuels use, Coal consumption, Consumption of natural gas, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices in heat production, Fuel prices in electricity production, Price of electricity by type of consumer, Average monthly spot prices at the Nord pool power exchange, Total energy consumption by source and CO 2 -emissions, Supplies and total consumption of electricity GWh, Energy imports by country of origin in January-March 2004, Energy exports by recipient country in January-March 2004, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Price of natural gas by type of consumer, Price of electricity by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Excise taxes, precautionary stock fees on oil pollution fees

  19. Statistics

    2000-01-01

    For the year 1999 and 2000, part of the figures shown in the tables of the Energy Review are preliminary or estimated. The annual statistics of the Energy also includes historical time series over a longer period (see e.g., Energiatilastot 1999, Statistics Finland, Helsinki 2000, ISSN 0785-3165). The inside of the Review's back cover shows the energy units and the conversion coefficients used for them. Explanatory notes to the statistical tables can be found after tables and figures. The figures presents: Changes in the volume of GNP and energy consumption, Changes in the volume of GNP and electricity, Coal consumption, Natural gas consumption, Peat consumption, Domestic oil deliveries, Import prices of oil, Consumer prices of principal oil products, Fuel prices for heat production, Fuel prices for electricity production, Carbon dioxide emissions, Total energy consumption by source and CO 2 -emissions, Electricity supply, Energy imports by country of origin in January-June 2000, Energy exports by recipient country in January-June 2000, Consumer prices of liquid fuels, Consumer prices of hard coal, natural gas and indigenous fuels, Average electricity price by type of consumer, Price of district heating by type of consumer, Excise taxes, value added taxes and fiscal charges and fees included in consumer prices of some energy sources and Energy taxes and precautionary stock fees on oil products

  20. Statistical 21-cm Signal Separation via Gaussian Process Regression Analysis

    Mertens, F. G.; Ghosh, A.; Koopmans, L. V. E.

    2018-05-01

    Detecting and characterizing the Epoch of Reionization and Cosmic Dawn via the redshifted 21-cm hyperfine line of neutral hydrogen will revolutionize the study of the formation of the first stars, galaxies, black holes and intergalactic gas in the infant Universe. The wealth of information encoded in this signal is, however, buried under foregrounds that are many orders of magnitude brighter. These must be removed accurately and precisely in order to reveal the feeble 21-cm signal. This requires not only the modeling of the Galactic and extra-galactic emission, but also of the often stochastic residuals due to imperfect calibration of the data caused by ionospheric and instrumental distortions. To stochastically model these effects, we introduce a new method based on 'Gaussian Process Regression' (GPR) which is able to statistically separate the 21-cm signal from most of the foregrounds and other contaminants. Using simulated LOFAR-EoR data that include strong instrumental mode-mixing, we show that this method is capable of recovering the 21-cm signal power spectrum across the entire range k = 0.07-0.3 h cMpc^-1. The GPR method is most optimal, having minimal and controllable impact on the 21-cm signal, when the foregrounds are correlated on frequency scales ≳ 3 MHz and the rms of the signal has σ_21cm ≳ 0.1 σ_noise. This signal separation improves the 21-cm power-spectrum sensitivity by a factor ≳ 3 compared to foreground avoidance strategies and enables the sensitivity of current and future 21-cm instruments such as the Square Kilometre Array to be fully exploited.
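
    The central idea (a Gaussian process whose kernel only allows long frequency-coherence scales absorbs the smooth foreground, leaving the rapidly fluctuating signal in the residual) can be sketched with an off-the-shelf GP regressor. This is not the LOFAR-EoR pipeline; the kernel, length scales and data below are invented for illustration:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(4)
      freq = np.linspace(120, 160, 200)[:, None]               # MHz
      foreground = 50 * np.exp(-freq.ravel() / 60)              # smooth in frequency
      signal = 0.05 * np.sin(2 * np.pi * freq.ravel() / 1.5)    # fluctuates on ~MHz scales
      data = foreground + signal + rng.normal(0, 0.02, freq.size)

      # A kernel restricted to long frequency-coherence scales captures the foreground only.
      kernel = RBF(length_scale=10.0, length_scale_bounds=(5.0, 50.0)) + WhiteKernel(0.1)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(freq, data)
      residual = data - gpr.predict(freq)                       # ~ signal + noise
      print("residual rms:", residual.std())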

  1. Chinese Sentence Classification Based on Convolutional Neural Network

    Gu, Chengwei; Wu, Ming; Zhang, Chuang

    2017-10-01

    Sentence classification is one of the significant issues in Natural Language Processing (NLP). Feature extraction is often regarded as the key point for natural language processing. Traditional approaches based on machine learning, such as the Naive Bayesian model, cannot take high-level features into consideration. A neural network for sentence classification can make use of contextual information to achieve better results in sentence classification tasks. In this paper, we focus on classifying Chinese sentences. Most importantly, we propose a novel Convolutional Neural Network (CNN) architecture for Chinese sentence classification. In particular, while most previous methods use a softmax classifier for prediction, we embed a linear support vector machine in place of softmax in the deep neural network model, minimizing a margin-based loss to obtain better results, and we use tanh as the activation function instead of ReLU. The CNN model improves the results of Chinese sentence classification tasks. Experimental results on the Chinese news title database validate the effectiveness of our model.
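
    A toy PyTorch version of this kind of architecture (embedding, 1-D convolution with tanh, max-over-time pooling, and a linear output layer trained with a multi-class hinge, i.e. SVM-style, loss) is sketched below; the vocabulary size, dimensions and random batch are placeholders, not the configuration used in the paper:

      import torch
      import torch.nn as nn

      class SentenceCNN(nn.Module):
          """Toy CNN text classifier: embeddings -> 1-D convolution -> max-pool -> linear scores."""
          def __init__(self, vocab_size=1000, emb_dim=32, n_filters=16, n_classes=4):
              super().__init__()
              self.emb = nn.Embedding(vocab_size, emb_dim)
              self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
              self.fc = nn.Linear(n_filters, n_classes)   # linear scores, trained with a margin loss

          def forward(self, tokens):                       # tokens: (batch, seq_len) integer ids
              x = self.emb(tokens).transpose(1, 2)         # (batch, emb_dim, seq_len)
              x = torch.tanh(self.conv(x))                 # tanh activation, as in the abstract
              x = x.max(dim=2).values                      # max-over-time pooling
              return self.fc(x)

      model = SentenceCNN()
      loss_fn = nn.MultiMarginLoss()                       # multi-class hinge (SVM-style) loss
      optim = torch.optim.Adam(model.parameters(), lr=1e-3)

      tokens = torch.randint(0, 1000, (8, 20))             # a fake batch of tokenised titles
      labels = torch.randint(0, 4, (8,))
      optim.zero_grad()
      loss = loss_fn(model(tokens), labels)
      loss.backward()
      optim.step()
      print("batch loss:", float(loss))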

  2. Sentence comprehension following moderate closed head injury in adults.

    Leikin, Mark; Ibrahim, Raphiq; Aharon-Peretz, Judith

    2012-09-01

    The current study explores sentence comprehension impairments among adults following moderate closed head injury. It was hypothesized that if the factor of syntactic complexity significantly affects sentence comprehension in these patients, it would testify to the existence of syntactic processing deficit along with working-memory problems. Thirty-six adults (18 closed head injury patients and 18 healthy controls matched in age, gender, and IQ) participated in the study. A picture-sentence matching task together with various tests for memory, language, and reading abilities were used to explore whether sentence comprehension impairments exist as a result of a deficit in syntactic processing or of working-memory dysfunction. Results indicate significant impairment in sentence comprehension among adults with closed head injury compared with their non-head-injured peers. Results also reveal that closed head injury patients demonstrate considerable decline in working memory, short-term memory, and semantic knowledge. Analysis of the results shows that memory impairment and syntactic complexity contribute significantly to sentence comprehension difficulties in closed head injury patients. At the same time, the presentation mode (spoken or written language) was found to have no effect on comprehension among adults with closed head injury, and their reading abilities appear to be relatively intact.

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C_pc) index. The C_pc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should
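
    Of the three charts mentioned, the EWMA chart is the least standard to set up by hand; a generic sketch with the usual time-dependent control limits is given below. The deviation data and the choices lambda = 0.2 and L = 3 are illustrative, not the values used in the study:

      import numpy as np

      rng = np.random.default_rng(5)
      # Hypothetical per-patient QC deviations (%) between measured and calculated dose.
      dev = rng.normal(0, 1.2, 40)
      dev[25:] += 1.5                              # simulate a slow drift of the delivery process

      lam, L = 0.2, 3.0                            # EWMA weight and control-limit width
      mu, sigma = dev[:20].mean(), dev[:20].std(ddof=1)   # estimated from early, stable data

      z = mu
      for i, x in enumerate(dev, start=1):
          z = lam * x + (1 - lam) * z
          half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
          if abs(z - mu) > half:
              print(f"QC #{i}: EWMA {z:.2f}% outside +/-{half:.2f}% around {mu:.2f}%")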

  4. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the

  5. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    2018-01-09

    Frequency ranges mentioned include 100 kHz, 1 MHz, and 100 MHz-1 GHz. From the statistical processing section: statistical analysis is the mathematical science ... quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques ... are mainly used for alarm ... Balakrishnan N, editors. Handbook of Statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555-602 (order statistics and their applications).

  6. Computationally efficient algorithms for statistical image processing : implementation in R

    Langovoy, M.; Wittich, O.

    2010-01-01

    In the series of our earlier papers on the subject, we proposed a novel statistical hypothesis testing method for detection of objects in noisy images. The method uses results from percolation theory and random graph theory. We developed algorithms that allowed to detect objects of unknown shapes in

  7. Spatio-temporal statistical models with applications to atmospheric processes

    Wikle, C.K.

    1996-01-01

    This doctoral dissertation is presented as three self-contained papers. An introductory chapter considers traditional spatio-temporal statistical methods used in the atmospheric sciences from a statistical perspective. Although this section is primarily a review, many of the statistical issues considered have not been considered in the context of these methods and several open questions are posed. The first paper attempts to determine a means of characterizing the semiannual oscillation (SAO) spatial variation in the northern hemisphere extratropical height field. It was discovered that the midlatitude SAO in 500hPa geopotential height could be explained almost entirely as a result of spatial and temporal asymmetries in the annual variation of stationary eddies. It was concluded that the mechanism for the SAO in the northern hemisphere is a result of land-sea contrasts. The second paper examines the seasonal variability of mixed Rossby-gravity waves (MRGW) in lower stratospheric over the equatorial Pacific. Advanced cyclostationary time series techniques were used for analysis. It was found that there are significant twice-yearly peaks in MRGW activity. Analyses also suggested a convergence of horizontal momentum flux associated with these waves. In the third paper, a new spatio-temporal statistical model is proposed that attempts to consider the influence of both temporal and spatial variability. This method is mainly concerned with prediction in space and time, and provides a spatially descriptive and temporally dynamic model

  8. The Pearson diffusions: A class of statistically tractable diffusion processes

    Forman, Julie Lyng; Sørensen, Michael

    The Pearson diffusions are a flexible class of diffusions defined by having linear drift and quadratic squared diffusion coefficient. It is demonstrated that for this class explicit statistical inference is feasible. Explicit optimal martingale estimating functions are found, and the corresponding...

  9. [AN OVERALL SOUND PROCESS] Syntactic parameters, statistic parameters, and universals

    Nicolas Meeùs

    2016-05-01

    My paper intends to show that comparative musicology, in fact if not in principle, appears inherently linked to the syntactic elements of music – and so, too, is any encyclopedic project aiming at uncovering universals in music. Not that statistical elements cannot be universal, but that they cannot be commented on as such, because they remain largely unquantifiable.

  10. Aerodynamic Characteristics of Syllable and Sentence Productions in Normal Speakers.

    Thiel, Cedric; Yang, Jin; Crawley, Brianna; Krishna, Priya; Murry, Thomas

    2018-01-08

    Aerodynamic measures of subglottic air pressure (Ps) and airflow rate (AFR) are used to select behavioral voice therapy versus surgical treatment for voice disorders. However, these measures are usually taken during a series of syllables, which differs from conversational speech. Repeated syllables do not share the variation found in even simple sentences, and patients may use their best rather than typical voice unless specifically instructed otherwise. This study examined the potential differences in estimated Ps and AFR in syllable and sentence production and their effects on a measure of vocal efficiency in normal speakers. Prospective study. Measures of estimated Ps, AFR, and aerodynamic vocal efficiency (AVE) were obtained from 19 female and four male speakers ages 22-44 years with no history of voice disorders. Subjects repeated a series of /pa/ syllables and a sentence at comfortable effort level into a face mask with a pressure-sensing tube between the lips. AVE varies as a function of the speech material in normal subjects. Ps measures were significantly higher for the sentence-production samples than for the syllable-production samples. AFR was higher during sentence production than syllable production, but the difference was not statistically significant. AVE values were significantly higher for syllable versus sentence productions. The results suggest that subjects increase Ps and AFR in sentence compared with syllable production. Speaking task is a critical factor when considering measures of AVE, and this preliminary study provides a basis for further aerodynamic studies of patient populations. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  11. Multivariate statistical analysis of a multi-step industrial processes

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    Monitoring and quality control of industrial processes often produce information on how the data have been obtained. In batch processes, for instance, the process is carried out in stages; some process or control parameters are set at each stage. However, the obtained data might not be utilized...... efficiently, even if this information may reveal significant knowledge about process dynamics or ongoing phenomena. When studying the process data, it may be important to analyse the data in the light of the physical or time-wise development of each process step. In this paper, a unified approach to analyse...... multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...

  12. About statistical process contribution to elastic diffraction scattering

    Ismanov, E.I.; Dzhuraev, Sh. Kh.; Paluanov, B.K.

    1999-01-01

    The experimental data on angular distributions show two basic properties. The first is the presence of backward and forward peaks. The second is the nearly isotropic angular distribution near 90 degrees, which has a strong energy dependence. Different models for the partial amplitudes a_dl of diffraction statistical scattering, in particular the models with Gaussian and exponential density distributions, were considered. The experimental data on pp-scattering were analyzed using the examined models

  13. Bayesian Nonparametric Statistical Inference for Shock Models and Wear Processes.

    1979-12-01

    also note that the results in Section 2 do not depend on the support of F.) This shock model has been studied by Esary, Marshall and Proschan (1973)...Barlow and Proschan (1975), among others. The analogy of the shock model in risk and actuarial analysis has been given by Bühlmann (1970, Chapter 2)... Mathematical Statistics, Vol. 4, pp. 894-906. Billingsley, P. (1968), Convergence of Probability Measures, John Wiley, New York. Bühlmann, H. (1970

  14. Statistical data processing of mobility curves of univalent weak bases

    Šlampová, Andrea; Boček, Petr

    2008-01-01

    Vol. 29, No. 2 (2008), pp. 538-541. ISSN 0173-0835. R&D Projects: GA AV ČR IAA400310609; GA ČR GA203/05/2106. Institutional research plan: CEZ:AV0Z40310501. Keywords: mobility curve * univalent weak bases * statistical evaluation. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 3.509, year: 2008

  15. Oscillatory brain dynamics during sentence reading: A Fixation-related spectral perturbation analysis.

    Lorenzo Vignali

    2016-04-01

    Full Text Available The present study investigated oscillatory brain dynamics during self-paced sentence-level processing. Participants read fully correct sentences, sentences containing a semantic violation and sentences in which the order of the words was randomized. At the target word level, fixations on semantically unrelated words elicited a lower-beta band (13-18 Hz) desynchronization. At the sentence level, gamma power (31-55 Hz) increased linearly for syntactically correct sentences, but not when the order of the words was randomized. In the 300 to 900 ms time window after sentence onsets, theta power (4-7 Hz) was greater for syntactically correct sentences as compared to sentences where no syntactic structure was preserved (random words condition). We interpret our results as conforming with a recently formulated predictive-coding framework for oscillatory neural dynamics during sentence-level language comprehension. Additionally, we discuss how our results relate to previous findings with serial visual presentation versus self-paced reading.

  16. Statistical tests for power-law cross-correlated processes

    Podobnik, Boris; Jiang, Zhi-Qiang; Zhou, Wei-Xing; Stanley, H. Eugene

    2011-12-01

    For stationary time series, the cross-covariance and the cross-correlation as functions of time lag n serve to quantify the similarity of two time series. The latter measure is also used to assess whether the cross-correlations are statistically significant. For nonstationary time series, the analogous measures are detrended cross-correlations analysis (DCCA) and the recently proposed detrended cross-correlation coefficient, ρ_DCCA(T,n), where T is the total length of the time series and n the window size. For ρ_DCCA(T,n), we numerically calculated the Cauchy inequality -1≤ρ_DCCA(T,n)≤1. Here we derive -1≤ρ_DCCA(T,n)≤1 for a standard variance-covariance approach and for a detrending approach. For overlapping windows, we find the range of ρ_DCCA within which the cross-correlations become statistically significant. For overlapping windows we numerically determine—and for nonoverlapping windows we derive—that the standard deviation of ρ_DCCA(T,n) tends with increasing T to 1/T. Using ρ_DCCA(T,n) we show that the Chinese financial market's tendency to follow the U.S. market is extremely weak. We also propose an additional statistical test that can be used to quantify the existence of cross-correlations between two power-law correlated time series.
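
    The coefficient itself is the ratio of the detrended covariance to the two detrended variances; a non-overlapping-box sketch on synthetic correlated series is given below (implementations differ in box overlap and detrending order):

      import numpy as np

      def rho_dcca(x, y, n):
          """Detrended cross-correlation coefficient for window size n (non-overlapping boxes)."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # integrated profiles
          n_boxes = len(x) // n
          f_xy = f_xx = f_yy = 0.0
          t = np.arange(n)
          for b in range(n_boxes):
              xs, ys = X[b*n:(b+1)*n], Y[b*n:(b+1)*n]
              # Remove a linear trend from each profile inside the box.
              rx = xs - np.polyval(np.polyfit(t, xs, 1), t)
              ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
              f_xy += np.mean(rx * ry)
              f_xx += np.mean(rx * rx)
              f_yy += np.mean(ry * ry)
          return f_xy / np.sqrt(f_xx * f_yy)

      rng = np.random.default_rng(6)
      common = rng.normal(size=4000)
      a = common + rng.normal(size=4000)        # two series sharing a common component
      b = common + rng.normal(size=4000)
      print("rho_DCCA(n=100):", round(rho_dcca(a, b, 100), 3))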

  17. Alternating event processes during lifetimes: population dynamics and statistical inference.

    Shinohara, Russell T; Sun, Yifei; Wang, Mei-Cheng

    2018-01-01

    In the literature studying recurrent event data, a large amount of work has been focused on univariate recurrent event processes where the occurrence of each event is treated as a single point in time. There are many applications, however, in which univariate recurrent events are insufficient to characterize the feature of the process because patients experience nontrivial durations associated with each event. This results in an alternating event process where the disease status of a patient alternates between exacerbations and remissions. In this paper, we consider the dynamics of a chronic disease and its associated exacerbation-remission process over two time scales: calendar time and time-since-onset. In particular, over calendar time, we explore population dynamics and the relationship between incidence, prevalence and duration for such alternating event processes. We provide nonparametric estimation techniques for characteristic quantities of the process. In some settings, exacerbation processes are observed from an onset time until death; to account for the relationship between the survival and alternating event processes, nonparametric approaches are developed for estimating exacerbation process over lifetime. By understanding the population dynamics and within-process structure, the paper provide a new and general way to study alternating event processes.

  18. Verbal semantics drives early anticipatory eye movements during the comprehension of verb-initial sentences

    Sebastian Sauppe

    2016-01-01

    Studies on anticipatory processes during sentence comprehension often focus on the prediction of postverbal direct objects. In subject-initial languages (the target of most studies so far), however, the position in the sentence, the syntactic function, and the semantic role of arguments are often conflated. For example, in the sentence The frog will eat the fly the syntactic object (fly) is at the same time also the last word and the patient argument of the verb. It is therefore not apparent ...

  19. Statistical process control support during Defense Waste Processing Facility chemical runs

    Brown, K.G.

    1994-01-01

    The Product Composition Control System (PCCS) has been developed to ensure that the wasteforms produced by the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will satisfy the regulatory and processing criteria that will be imposed. The PCCS provides rigorous, statistically defensible management of a noisy, multivariate system subject to multiple constraints. The system has been successfully tested and has been used to control the production of the first two melter feed batches during DWPF Chemical Runs. These operations will demonstrate the viability of the DWPF process. This paper provides a brief discussion of the technical foundation for the statistical process control algorithms incorporated into PCCS, and describes the results obtained and lessons learned from DWPF Cold Chemical Run operations. The DWPF will immobilize approximately 130 million liters of high-level nuclear waste currently stored at the Site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive sludge and precipitate streams and less radioactive water-soluble salts. (In a separate facility, soluble salts are disposed of as low-level waste in a mixture of cement, slag, and fly ash.) In DWPF, the precipitate stream (Precipitate Hydrolysis Aqueous, or PHA) is blended with the insoluble sludge and ground glass frit to produce a melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository

  20. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component, while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T^2 statistic and corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although OPLS advantages have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, it was found that the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
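
    A reduced sketch of the PLS side of such a comparison (OPLS is not available in scikit-learn, so only a PLS-based Hotelling's T^2 monitoring step is shown): fit a PLS model on normal batches, compute T^2 from the scores, and compare new batches against an empirical limit. The data and limit are invented:

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(7)
      # Hypothetical NIR-like batch data: 30 normal batches x 50 wavelengths and a response y.
      X = rng.normal(size=(30, 50))
      y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, 30)

      pls = PLSRegression(n_components=2).fit(X, y)
      scores = pls.transform(X)                      # training-batch scores
      s2 = scores.var(axis=0, ddof=1)

      def hotelling_t2(new_X):
          t = pls.transform(new_X)                   # project new batches onto the PLS components
          return np.sum(t**2 / s2, axis=1)

      limit = np.percentile(hotelling_t2(X), 99)     # crude empirical limit from the normal batches
      abnormal = X[:1] + 3.0                         # a batch with a broad offset on every wavelength
      print("T^2 of abnormal batch:", round(float(hotelling_t2(abnormal)[0]), 2),
            "limit:", round(float(limit), 2))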

  1. Multivariate Statistical Process Optimization in the Industrial Production of Enzymes

    Klimkiewicz, Anna

    of product yield. The potential of NIR technology to monitor the activity of the enzyme has been the subject of a feasibility study presented in PAPER I. It included (a) evaluation of which of the two real-time NIR flow cell configurations is the preferred arrangement for monitoring of the retentate stream downstream...... strategies for the organization of these datasets, with varying numbers of timestamps, into data structures fit for latent variable (LV) modeling have been compared. The ultimate aim of the data mining steps is the construction of statistical 'soft models' which capture the principal or latent behavior

  2. Signal processing and statistical analysis of spaced-based measurements

    Iranpour, K.

    1996-05-01

    The report deals with data obtained by the ROSE rocket project. This project was designed to investigate the low-altitude auroral instabilities in the electrojet region. The spectral and statistical analyses indicate the existence of unstable waves in the ionized gas in the region. An experimentally obtained dispersion relation for these waves was established. It was demonstrated that the characteristic phase velocities are much lower than what is expected from the standard theoretical results. The analysis of the ROSE data indicates a cascading of energy from lower to higher frequencies. 44 refs., 54 figs

  3. Statistical and signal-processing concepts in surface metrology

    Church, E.L.; Takacs, P.Z.

    1986-03-01

    This paper proposes the use of a simple two-scale model of surface roughness for testing and specifying the topographic figure and finish of synchrotron-radiation mirrors. In this approach the effects of figure and finish are described in terms of their slope distribution and power spectrum, respectively, which are then combined with the system point spread function to produce a composite image. The result can be used to predict mirror performance or to translate design requirements into manufacturing specifications. Pacing problems in this approach are the development of a practical long-trace slope-profiling instrument and realistic statistical models for figure and finish errors

  5. Ready-to-Use Simulation: Demystifying Statistical Process Control

    Sumukadas, Narendar; Fairfield-Sonn, James W.; Morgan, Sandra

    2005-01-01

    Business students are typically introduced to the concept of process management in their introductory course on operations management. A very important learning outcome here is an appreciation that the management of processes is a key to the management of quality. Some of the related concepts are qualitative, such as strategic and behavioral…

  6. An easy and low cost option for economic statistical process control ...

    An easy and low cost option for economic statistical process control using Excel. ... in both economic and economic statistical designs of the X-control chart. ... in this paper and the numerical examples illustrated are executed on this program.

  7. Distinctiveness and encoding effects in online sentence comprehension

    Philip Hofmeister

    2014-12-01

    Full Text Available In explicit memory recall and recognition tasks, elaboration and contextual isolation both facilitate memory performance. Here, we investigate these effects in the context of sentence processing: targets for retrieval during online sentence processing of English object relative clause constructions differ in the amount of elaboration associated with the target noun phrase, or the homogeneity of superficial features (text color). Experiment 1 shows that greater elaboration for targets during the encoding phase reduces reading times at retrieval sites, but elaboration of non-targets has considerably weaker effects. Experiment 2 illustrates that processing isolated superficial features of target noun phrases --- here, a green word in a sentence with words colored white --- does not lead to enhanced memory performance, despite triggering longer encoding times. These results are interpreted in the light of the memory models of Nairne 1990, 2001, 2006, which state that encoding remnants contribute to the set of retrieval cues that provide the basis for similarity-based interference effects.

  8. The Use of Statistical Methods in Dimensional Process Control

    Krajcsik, Stephen

    1985-01-01

    ... erection. To achieve this high degree of unit accuracy, we have begun a pilot dimensional control program that has set the guidelines for systematically monitoring each stage of the production process prior to erection...

  9. Automatic sentence extraction for the detection of scientific paper relations

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, the performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, the methods developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, with paper relations identified on the basis of citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
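
    A minimal sketch of the citation-sentence extraction step, using simple regular expressions for sentence splitting and citation markers; the patterns and the example text are illustrative, not the system described above:

      import re

      CITATION = re.compile(r"\[\d+\]|\([A-Z][A-Za-z-]+(?: et al\.)?,? \d{4}\)")

      def citation_sentences(text):
          """Return sentences that contain a numeric or author-year citation marker."""
          sentences = re.split(r"(?<=[.!?])\s+", text)
          return [s for s in sentences if CITATION.search(s)]

      paper = ("Deep parsers remain slow. A faster transition system was proposed "
               "(Smith et al., 2014). We extend the model of [12] with beam search. "
               "Our evaluation follows standard practice.")
      for s in citation_sentences(paper):
          print("relation candidate:", s)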

  10. Extrinsic Cognitive Load Impairs Spoken Word Recognition in High- and Low-Predictability Sentences.

    Hunter, Cynthia R; Pisoni, David B

    -predictability sentences. Under mild spectral degradation (eight-channel vocoding), the effect of load was present for low-predictability sentences but not for high-predictability sentences. There were also reliable downstream effects of speech degradation and sentence predictability on recall of the preload digit sequences. Long digit sequences were more easily recalled following spoken sentences that were less spectrally degraded. When digits were reported after identification of sentence-final words, short digit sequences were recalled more accurately when the spoken sentences were predictable. Extrinsic cognitive load can impair recognition of spectrally degraded spoken words in a sentence recognition task. Cognitive load affected word identification in both high- and low-predictability sentences, suggesting that load may impact both context use and lower-level perceptual processes. Consistent with prior work, LE also had downstream effects on memory for visual digit sequences. Results support the proposal that extrinsic cognitive load and LE induced by signal degradation both draw on a central, limited pool of cognitive resources that is used to recognize spoken words in sentences under adverse listening conditions.

  11. Counting statistics of non-Markovian quantum stochastic processes

    Flindt, Christian; Novotny, T.; Braggio, A.

    2008-01-01

    We derive a general expression for the cumulant generating function (CGF) of non-Markovian quantum stochastic transport processes. The long-time limit of the CGF is determined by a single dominating pole of the resolvent of the memory kernel from which we extract the zero-frequency cumulants...
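
    For orientation, the standard full-counting-statistics relations assumed here (they are background definitions, not the truncated record's derivation) connect the cumulant generating function to the distribution of the number of transferred particles and give the zero-frequency cumulants as derivatives at zero counting field:

```latex
% Standard full-counting-statistics definitions (assumed background, not the
% paper's derivation): S(\chi,t) is the cumulant generating function of the
% number n of transferred particles, and \chi is the counting field.
\[
  e^{S(\chi,t)} = \sum_{n} P(n,t)\,e^{in\chi},
  \qquad
  \langle\langle n^{k}\rangle\rangle
    = \left.\frac{\partial^{k} S(\chi,t)}{\partial(i\chi)^{k}}\right|_{\chi=0}.
\]
% In the long-time limit S(\chi,t) \simeq t\,\Theta(\chi), so the zero-frequency
% cumulants grow linearly in time with rates given by derivatives of \Theta.
```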

  12. Statistical optimization of process parameters for the production of ...

    In this study, optimization of process parameters such as moisture content, incubation temperature and initial pH (fixed) for the improvement of citric acid production from oil palm empty fruit bunches through solid state bioconversion was carried out using traditional one-factor-at-a-time (OFAT) method and response surface ...

  13. Bilinguals Show Weaker Lexical Access during Spoken Sentence Comprehension

    Shook, Anthony; Goldrick, Matthew; Engstler, Caroline; Marian, Viorica

    2015-01-01

    When bilinguals process written language, they show delays in accessing lexical items relative to monolinguals. The present study investigated whether this effect extended to spoken language comprehension, examining the processing of sentences with either low or high semantic constraint in both first and second languages. English-German…

  14. Types of Sentences in EFL Students' Paragraph Assignments: A Quantitative Study on Teaching and Learning Writing at Higher Education Level

    Syayid Sandi Sukandi

    2017-08-01

    This research investigates Indonesian EFL students' writing of four types of English sentences in paragraph writing assignments that were posted online in the Writing 1 course of English Education at STKIP PGRI Sumatera Barat. The analysed types of sentences are the Simple Sentence (code: S.S.), Compound Sentence (code: C.S.1), Complex Sentence (code: C.S.2), and Compound-Complex Sentence (code: C.C.S.). The percentage of each sentence type appearing in the students' writing within each of the five genres represents the students' syntactical composition. Moreover, this research focuses on quantitatively analysing the above four types of sentences as they appear in students' assignments in each of the following genres: argumentative, descriptive, process, cause-effect, and comparison-contrast. Data are taken from a 10% sample of the whole population. The findings show that the Simple Sentence is a common sentence type in the students' paragraphs. This indicates that guiding students to write paragraphs with varied sentence types is important for the further development of the teaching of writing.

  15. Statistical properties of antisymmetrized molecular dynamics for non-nucleon-emission and nucleon-emission processes

    Ono, A.; Horiuchi, H.

    1996-01-01

    Statistical properties of antisymmetrized molecular dynamics (AMD) are classical in the case of nucleon-emission processes, while they are quantum mechanical for processes without nucleon emission. In order to understand this situation, we first clarify that two mutually opposite statistics coexist in the AMD framework: one is the classical statistics of the motion of wave packet centroids, and the other is the quantum statistics of the motion of wave packets, which is described by the AMD wave function. We prove the classical statistics of wave packet centroids by using the framework of the microcanonical ensemble of the nuclear system with a realistic effective two-nucleon interaction. We show that the relation between the classical statistics of wave packet centroids and the quantum statistics of wave packets can be obtained by taking into account the effects of the wave packet spread. This relation clarifies how the quantum statistics of wave packets emerges from the classical statistics of wave packet centroids. It is emphasized that the temperature of the classical statistics of wave packet centroids is different from the temperature of the quantum statistics of wave packets. We then explain that the statistical properties of AMD for nucleon-emission processes are classical because nucleon-emission processes in AMD are described by the motion of wave packet centroids. We further show that when we improve the description of the nucleon-emission process so as to take into account the momentum fluctuation due to the wave packet spread, the AMD statistical properties for nucleon-emission processes change drastically into quantum statistics. Our study of nucleon-emission processes can conversely be regarded as another kind of proof that the statistics of wave packets is quantum mechanical while that of wave packet centroids is classical. copyright 1996 The American Physical Society

  16. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
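
    As a reference point for the kind of process capability analysis mentioned, the sketch below computes the usual Cp and Cpk indices for measurement data against specification limits; the data and limits are invented placeholders, not values from the Thermal Protection System case study.

```python
# Hedged sketch of a process capability calculation; the measurement data and
# specification limits are invented placeholders, not values from the
# Thermal Protection System case study.
import numpy as np

def capability_indices(x: np.ndarray, lsl: float, usl: float):
    """Cp ignores centering; Cpk penalizes a mean drifting toward a limit."""
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

rng = np.random.default_rng(1)
measurements = rng.normal(loc=0.51, scale=0.02, size=200)   # hypothetical gap, inches
cp, cpk = capability_indices(measurements, lsl=0.45, usl=0.55)
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")
```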

  17. A statistical approach to define some tofu processing conditions

    Vera de Toledo Benassi

    2011-12-01

    The aim of this work was to make tofu from soybean cultivar BRS 267 under different processing conditions in order to evaluate the influence of each treatment on product quality. A fractional factorial 2^(5-1) design was used, in which the independent variables (thermal treatment, coagulant concentration, coagulation time, curd cutting, and draining time) were tested at two different levels. The response variables studied were hardness, yield, total solids, and protein content of the tofu. Polynomial models were generated for each response. To obtain tofu with desirable characteristics (hardness ~4 N, yield of 306 g of tofu per 100 g of soybeans, 12 g of protein per 100 g of tofu, and 22 g of solids per 100 g of tofu), the following processing conditions were selected: heating until boiling plus 10 minutes in a water bath, 2% w/w dihydrated CaSO4, 10 minutes of coagulation, curd cutting, and 30 minutes of draining time.
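
    To make the design concrete, here is a minimal sketch of a 2^(5-1) half-fraction in coded (-1/+1) units with a first-order polynomial model fitted by least squares. The factor names follow the abstract, but the generator choice, the simulated response values, and the resulting coefficients are illustrative assumptions only.

```python
# Sketch under stated assumptions: a 2^(5-1) half-fraction in coded (-1/+1)
# units (generator E = ABCD) with a first-order polynomial response model
# fitted by least squares. Factor names follow the abstract; the generator,
# the simulated hardness values, and the coefficients are illustrative only.
import itertools
import numpy as np

factors = ["thermal", "coagulant", "coag_time", "curd_cut", "drain_time"]

base = np.array(list(itertools.product([-1, 1], repeat=4)))   # 16 runs x 4 factors
design = np.column_stack([base, base.prod(axis=1)])           # 5th column = ABCD

rng = np.random.default_rng(2)
hardness = 4.0 + 0.6 * design[:, 1] - 0.4 * design[:, 2] + rng.normal(0, 0.2, 16)

X = np.column_stack([np.ones(16), design])                    # intercept + main effects
coef, *_ = np.linalg.lstsq(X, hardness, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name:>10s}: {b:+.3f}")
```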

  18. Use of statistical process control in evaluation of academic performance

    Ezequiel Gibbon Gautério

    2014-05-01

    The aim of this article was to study some indicators of academic performance (number of students per class, dropout rate, failure rate, and scores obtained by the students) in order to identify a pattern of behavior that would enable improvements in the teaching-learning process. The sample was composed of five classes of undergraduate courses in Engineering. The data were collected over three years. Initially, an exploratory analysis with analytical and graphical techniques was performed. An analysis of variance and Tukey's test investigated some sources of variability. This information was used in the construction of control charts. We found evidence that classes with more students are associated with higher failure rates and lower mean scores. Moreover, when the course was later in the curriculum, the students had higher scores. The results showed that although some special causes interfering with the process were detected, it was possible to stabilize and monitor it.

  19. Statistical and dynamical aspects in fission process: The rotational ...

    the fission process, during the evolution from compound nucleus to the ... For fission induced by light particles like n, p, and α, the total angular momenta ... [figure: SaddleTSM comparison for 96 MeV 16O+232Th and 72 MeV 10B+232Th] ... Systematic investigations in both light- and heavy-ion-induced fissions have shown that.

  20. Sentence-Level Attachment Prediction

    Albakour, M.-Dyaa; Kruschwitz, Udo; Lucas, Simon

    Attachment prediction is the task of automatically identifying email messages that should contain an attachment. This can be useful to tackle the problem of sending out emails but forgetting to include the relevant attachment (something that happens all too often). A common Information Retrieval (IR) approach in analyzing documents such as emails is to treat the entire document as a bag of words. Here we propose a finer-grained analysis to address the problem. We aim at identifying individual sentences within an email that refer to an attachment. If we detect any such sentence, we predict that the email should have an attachment. Using part of the Enron corpus for evaluation we find that our finer-grained approach outperforms previously reported document-level attachment prediction in similar evaluation settings.
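
    As a toy illustration of the finer-grained idea (not the authors' Enron setup or feature set), the sketch below trains a sentence-level classifier on a handful of invented sentences and flags an email as needing an attachment if any of its sentences is classified as attachment-related.

```python
# Toy illustration of the sentence-level approach (training sentences invented):
# classify individual sentences, then flag the email if any sentence is positive.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_sentences = [
    "Please find the report attached.",
    "I have attached the slides for tomorrow.",
    "See the attached spreadsheet for details.",
    "Let's meet at 3pm to discuss the budget.",
    "Thanks for your help on this.",
    "The meeting room has been changed to B12.",
]
labels = [1, 1, 1, 0, 0, 0]   # 1 = sentence refers to an attachment

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(train_sentences, labels)

email = ["Hi team, the figures are attached.", "Let me know if anything is missing."]
needs_attachment = any(clf.predict(email))   # document-level decision from sentence hits
print(needs_attachment)
```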

  1. Incremental phonological encoding during unscripted sentence production

    Florian T Jaeger

    2012-11-01

    We investigate phonological encoding during unscripted sentence production, focusing on the effect of phonological overlap on phonological encoding. Previous work on this question has almost exclusively employed isolated word production or highly scripted multiword production. These studies have led to conflicting results: some studies found that phonological overlap between two words facilitates phonological encoding, while others found inhibitory effects. One worry with many of these paradigms is that they involve processes that are not typical of everyday language use, which calls into question to what extent their findings speak to the architectures and mechanisms underlying language production. We present a paradigm to investigate the consequences of phonological overlap between words in a sentence while leaving speakers much of the lexical and structural choice typical of everyday language use. Adult native speakers of English described events in short video clips. We annotated the presence of disfluencies and the speech rate at various points throughout the sentence, as well as the constituent order. We find that phonological overlap has an inhibitory effect on phonological encoding. Specifically, if adjacent content words share their phonological onset (e.g., hand the hammer), they are preceded by production difficulty, as reflected in fluency and speech rate. We also find that this production difficulty affects speakers' constituent order preferences during grammatical encoding. We discuss our results and previous work to isolate the properties of other paradigms that resulted in facilitatory or inhibitory results. The data from our paradigm also speak to questions about the scope of phonological planning in unscripted speech and to whether phonological and grammatical encoding interact.

  2. Intertime jump statistics of state-dependent Poisson processes.

    Daly, Edoardo; Porporato, Amilcare

    2007-01-01

    A method to obtain the probability distribution of the interarrival times of jump occurrences in systems driven by state-dependent Poisson noise is proposed. The method uses the survivor function obtained from a modified version of the master equation associated with the stochastic process under analysis. A model for the timing of human activities shows the capability of state-dependent Poisson noise to generate power-law distributions. The application of the method to a model for neuron dynamics and to a hydrological model accounting for land-atmosphere interaction elucidates the origin of characteristic recurrence intervals and possible persistence in state-dependent Poisson models.
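
    The analytical route via the modified master equation is not reproduced here; instead, the sketch below simply simulates a jump process whose intensity depends on the current state and collects the interarrival times, which could then be compared against a survivor function. The rate function, decay constant, and time step are all assumptions.

```python
# Simulation sketch only (the paper derives the interarrival distribution
# analytically via a modified master equation). Between jumps the state x
# decays deterministically; unit jumps arrive at a state-dependent rate lambda(x).
import numpy as np

rng = np.random.default_rng(3)

def rate(x):
    return 0.5 + x          # higher state -> more frequent jumps (assumed form)

def simulate(T=500.0, k=1.0, dt=1e-3):
    x, t, last_jump, waits = 0.0, 0.0, 0.0, []
    while t < T:
        if rng.random() < rate(x) * dt:     # Bernoulli approximation of the jump
            waits.append(t - last_jump)
            last_jump = t
            x += 1.0                        # unit jump
        x -= k * x * dt                     # deterministic decay between jumps
        t += dt
    return np.array(waits)

waits = simulate()
print(f"{waits.size} jumps, mean interarrival time = {waits.mean():.3f}")
# the empirical survivor function of `waits` could be compared to an analytical one
```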

  3. Nonlinear Statistical Signal Processing: A Particle Filtering Approach

    Candy, J.

    2007-01-01

    An introduction to particle filtering is discussed, starting with an overview of Bayesian inference from batch to sequential processors. Once the evolving Bayesian paradigm is established, simulation-based methods using sampling theory and Monte Carlo realizations are discussed. Here the usual limitations imposed by nonlinear approximations and non-Gaussian processes in classical nonlinear processing algorithms (e.g., Kalman filters) are no longer a restriction on performing Bayesian inference. It is shown how the underlying hidden or state variables are easily assimilated into this Bayesian construct. Importance sampling methods are then discussed and shown to extend to sequential solutions implemented using Markovian state-space models as a natural evolution. With this in mind, the idea of a particle filter, which is a discrete representation of a probability distribution, is developed, and it is shown how it can be implemented using sequential importance sampling/resampling methods. Finally, an application is briefly discussed comparing the performance of the particle filter designs with classical nonlinear filter implementations.
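
    A minimal bootstrap (sequential importance sampling/resampling) particle filter on a standard toy nonlinear state-space model is sketched below; it illustrates the propagate-weight-resample cycle described above and is not the report's implementation. The model, noise levels, and particle count are assumptions.

```python
# Minimal bootstrap particle filter on a toy nonlinear state-space model;
# a sketch of the idea discussed above, not the report's implementation.
import numpy as np

rng = np.random.default_rng(4)
T, N = 50, 1000                               # time steps, particles

# simulate the hidden state and the observations
x_true, ys = 0.1, []
for _ in range(T):
    x_true = 0.5 * x_true + 25 * x_true / (1 + x_true**2) + rng.normal(0, 1.0)
    ys.append(x_true**2 / 20 + rng.normal(0, 1.0))

particles = rng.normal(0, 2, size=N)
estimates = []
for y in ys:
    # 1) propagate through the nonlinear transition model
    particles = (0.5 * particles + 25 * particles / (1 + particles**2)
                 + rng.normal(0, 1.0, size=N))
    # 2) weight by the likelihood of the observation
    w = np.exp(-0.5 * (y - particles**2 / 20) ** 2)
    w /= w.sum()
    estimates.append(np.sum(w * particles))   # posterior-mean estimate
    # 3) multinomial resampling
    particles = particles[rng.choice(N, size=N, p=w)]

print(f"posterior mean estimate at final step: {estimates[-1]:.2f}")
```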

  4. Statistical problems raised by data processing of food surveys

    Lacourly, Nancy

    1974-01-01

    The methods used for the analysis of the dietary habits of national populations - food surveys - have been studied. S. Lederman's linear model for estimating average individual consumption from total family diets was examined in the light of a food survey carried out with 250 Roman families in 1969. An important bias in the estimates thus obtained was revealed by a simulation assuming a 'housewife's dictatorship'; these assumptions should contribute to setting up an unbiased model. Several techniques of multidimensional analysis were therefore used, and the theoretical aspects of linear regression for some particular situations had to be investigated: quasi-collinear 'independent variables', measurements with errors, and positive constraints on regression coefficients. A new survey methodology was developed taking account of the new 'Integrated Information Systems', which affect all the stages of a consumption survey: organization, data collection, constitution of an information bank, and data processing. (author) [fr]

  5. Reproducing American Sign Language Sentences: Cognitive Scaffolding in Working Memory

    Ted Supalla

    2014-08-01

    The American Sign Language Sentence Reproduction Test (ASL-SRT) requires the precise reproduction of a series of ASL sentences increasing in complexity and length. Error analysis of such tasks provides insight into working memory and scaffolding processes. Data were collected from three groups expected to differ in fluency: deaf children, deaf adults, and hearing adults, all users of ASL. Quantitative (correct/incorrect recall) and qualitative error analyses were performed. Percent correct on the reproduction task supports its sensitivity to fluency, as test performance clearly differed across the three groups studied. A linguistic analysis of errors further documented differing strategies and biases across groups. Subjects' recall projected the affordances and constraints of deep linguistic representations to differing degrees, with subjects resorting to alternate processing strategies in the absence of linguistic knowledge. A qualitative error analysis allows us to capture generalizations about the relationship between error patterns and the cognitive scaffolding that governs the sentence reproduction process. Highly fluent signers and less-fluent signers share common chokepoints on particular words in sentences. However, they diverge in heuristic strategy. Fluent signers, when they make an error, tend to preserve semantic details while altering morpho-syntactic domains. They produce syntactically correct sentences with meaning equivalent to the to-be-reproduced one, but these are not verbatim reproductions of the original sentence. In contrast, less-fluent signers tend to use a more linear strategy, preserving lexical status and word ordering while omitting local inflections, and occasionally resorting to visuo-motoric imitation. Thus, whereas fluent signers readily use top-down scaffolding in their working memory, less fluent signers fail to do so. Implications for current models of working memory across spoken and signed modalities are discussed.

  6. Predicting Sentencing for Low-Level Crimes: Comparing Models of Human Judgment

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    Laws and guidelines regulating legal decision making are often imposed without taking the cognitive processes of the legal decision maker into account. In the case of sentencing, this raises the question of whether the sentencing decisions of prosecutors and judges are consistent with legal policy. Especially in handling low-level crimes, legal…

  7. Distinct contributions of attention and working memory to visual statistical learning and ensemble processing.

    Hall, Michelle G; Mattingley, Jason B; Dux, Paul E

    2015-08-01

    The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved).

  8. Multiplicative Process in Turbulent Velocity Statistics: A Simplified Analysis

    Chillà, F.; Peinke, J.; Castaing, B.

    1996-04-01

    Many models of turbulence link the energy cascade process and intermittency, whose characteristic signature is the shape evolution of the probability density functions (pdf) of longitudinal velocity increments. Using recent models and experimental results, we show that the flatness factor of these pdf gives a simple and direct estimate of what is called the deepness of the cascade. We analyse in this way the published data of a Direct Numerical Simulation and show that the deepness of the cascade presents the same Reynolds number dependence as in laboratory experiments.
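
    The flatness factor itself is a one-line computation; the sketch below evaluates it for longitudinal increments of a synthetic signal, which is only a stand-in for a measured velocity record, so the values serve merely as a sanity check against the Gaussian reference of 3.

```python
# Flatness factor of longitudinal increments delta_u(r), used above as a direct
# estimate of the deepness of the cascade. The synthetic signal is a stand-in
# for a measured velocity record; a Gaussian signal has flatness 3.
import numpy as np

def flatness(u: np.ndarray, r: int) -> float:
    du = u[r:] - u[:-r]                           # increments at separation r
    return float(np.mean(du**4) / np.mean(du**2) ** 2)

rng = np.random.default_rng(5)
u = np.cumsum(rng.normal(size=100_000))           # toy correlated "velocity" signal
for r in (1, 10, 100):
    print(f"r={r:3d}  flatness={flatness(u, r):.2f}")   # ~3 here (Gaussian increments)
```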

  9. How the conceptions of Chinese rhetorical expressions are derived from the corresponding generic sentences

    Zhao, Wenhui

    2018-04-01

    Generic sentences are a simple and intuitive recognition and objective description of the external world in terms of "class". In the long evolutionary process of human language, the concepts represented by generic sentences have been internalized as default knowledge in people's minds. In Chinese, some rhetorical expressions supported by corresponding generic sentences can be accepted by people. The derivation of these rhetorical expressions from the corresponding generic sentences is an important way for language to evolve, and it reflects humans' creative cognitive competence. From the perspective of conceptual blending theory and the theory of categorization in cognitive linguistics, the goal of this paper is to analyze the process by which rhetorical expressions are derived from the corresponding generic sentences, which can facilitate Chinese metaphorical information processing and the construction of a corpus of Chinese emotion metaphors.

  10. A system for classifying wood-using industries and recording statistics for automatic data processing.

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  11. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    2016-05-12

    Final Report (15-May-2014 to 14-Feb-2015): Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. Sponsor: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: mathematical statistics; time series; Markov chains; random... Three areas

  12. Initial uncertainty impacts statistical learning in sound sequence processing.

    Todd, Juanita; Provost, Alexander; Whitson, Lisa; Mullens, Daniel

    2016-11-01

    This paper features two studies confirming a lasting impact of first learning on how subsequent experience is weighted in early relevance-filtering processes. In both studies participants were exposed to sequences of sound that contained a regular pattern on two different timescales. Regular patterning in sound is readily detected by the auditory system and used to form "prediction models" that define the most likely properties of sound to be encountered in a given context. The presence and strength of these prediction models is inferred from changes in automatically elicited components of auditory evoked potentials. Both studies employed sound sequences that contained both a local and longer-term pattern. The local pattern was defined by a regular repeating pure tone occasionally interrupted by a rare deviating tone (p=0.125) that was physically different (a 30 ms vs. 60 ms duration difference in one condition and a 1000 Hz vs. 1500 Hz frequency difference in the other). The longer-term pattern was defined by the rate at which the two tones alternated probabilities (i.e., the tone that was first rare became common and the tone that was first common became rare). There was no task related to the tones and participants were asked to ignore them while focussing attention on a movie with subtitles. Auditory-evoked potentials revealed long lasting modulatory influences based on whether the tone was initially encountered as rare and unpredictable or common and predictable. The results are interpreted as evidence that probability (or indeed predictability) assigns a differential information-value to the two tones that in turn affects the extent to which prediction models are updated and imposed. These effects are exposed for both common and rare occurrences of the tones. The studies contribute to a body of work that reveals that probabilistic information is not faithfully represented in these early evoked potentials and instead exposes that predictability (or conversely

  13. Broca's area, sentence comprehension, and working memory: an fMRI study

    Corianne Rogalsky

    2008-10-01

    The role of Broca's area in sentence processing remains controversial. According to one view, Broca's area is involved in processing a subcomponent of syntactic processing. Another view holds that it contributes to sentence processing via verbal working memory. Sub-regions of Broca's area have been identified that are more active during the processing of complex (object-relative clause) sentences compared to simple (subject-relative clause) sentences. The present study aimed to determine if this complexity effect can be accounted for in terms of the articulatory rehearsal component of verbal working memory. In a behavioral experiment, subjects were asked to comprehend sentences during concurrent speech articulation, which minimizes articulatory rehearsal as a resource for sentence comprehension. A finger-tapping task was used as a control concurrent task. Only the object-relative clause sentences were more difficult to comprehend during speech articulation than during the manual task, showing that articulatory rehearsal does contribute to sentence processing. A second experiment used fMRI to document the brain regions underlying this effect. Subjects judged the plausibility of sentences during speech articulation, a finger-tapping task, or without a concurrent task. In the absence of a secondary task, Broca's area (pars triangularis and pars opercularis) demonstrated an increase in activity as a function of syntactic complexity. However, during concurrent speech articulation (but not finger-tapping) this complexity effect was eliminated in the pars opercularis, suggesting that this region supports sentence comprehension via its role in articulatory rehearsal. Activity in the pars triangularis was modulated by the finger-tapping task, but not the speech articulation task.

  14. The extraction and integration framework: a two-process account of statistical learning.

    Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G

    2013-07-01

    The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved

  15. Listening to Sentences in Noise: Revealing Binaural Hearing Challenges in Patients with Schizophrenia.

    Abdul Wahab, Noor Alaudin; Zakaria, Mohd Normani; Abdul Rahman, Abdul Hamid; Sidek, Dinsuhaimi; Wahab, Suzaily

    2017-11-01

    This case-control study investigates binaural hearing performance in schizophrenia patients for sentences presented in quiet and in noise. Participants were twenty-one healthy controls and sixteen schizophrenia patients with normal peripheral auditory function. Binaural hearing was examined in four listening conditions using the Malay version of the hearing in noise test. Syntactically and semantically correct sentences were presented via headphones to the randomly selected subjects. In each condition, the adaptively obtained reception thresholds for speech (RTS) were used to determine the RTS noise composite and the spatial release from masking. Schizophrenia patients demonstrated a significantly higher mean RTS value relative to healthy controls (p=0.018). The large effect sizes found in three listening conditions, i.e., in quiet (d=1.07), noise right (d=0.88), and noise composite (d=0.90), indicate a statistically significant difference between the groups, whereas the noise front and noise left conditions show medium (d=0.61) and small (d=0.50) effect sizes, respectively. No statistical difference between groups was noted with regard to spatial release from masking in the right (p=0.305) and left (p=0.970) ear. The present findings suggest abnormal unilateral auditory processing in the central auditory pathway in schizophrenia patients. Future studies exploring the role of binaural and spatial auditory processing are recommended.

  16. Sentence Complexity and Working Memory Effects in Ambiguity Resolution

    Kim, Ji Hyon; Christianson, Kiel

    2013-01-01

    Two self-paced reading experiments using a paraphrase decision task paradigm were performed to investigate how sentence complexity contributed to the relative clause (RC) attachment preferences of speakers of different working memory capacities (WMCs). Experiment 1 (English) showed working memory effects on relative clause processing in both…

  17. Sensing the Sentence: An Embodied Simulation Approach to Rhetorical Grammar

    Rule, Hannah J.

    2017-01-01

    This article applies the neuroscientific concept of embodied simulation--the process of understanding language through visual, motor, and spatial modalities of the body--to rhetorical grammar and sentence-style pedagogies. Embodied simulation invigorates rhetorical grammar instruction by attuning writers to the felt effects of written language,…

  18. Number Attraction Effects in Near-Native Spanish Sentence Comprehension

    Jegerski, Jill

    2016-01-01

    Grammatical agreement phenomena such as verbal number have long been of fundamental interest in the study of second language (L2) acquisition. Previous research from the perspective of sentence processing has documented nativelike behavior among nonnative participants but has also relied almost exclusively on grammar violation paradigms. The…

  19. Conceptual Combination During Sentence Comprehension

    Swinney, David; Love, Tracy; Walenski, Matthew; Smith, Edward E.

    2008-01-01

    This experiment examined the time course of integration of modifier-noun (conceptual) combinations during auditory sentence comprehension using cross-modal lexical priming. The study revealed that during ongoing comprehension, there is initial activation of features of the noun prior to activation of (emergent) features of the entire conceptual combination. These results support compositionality in conceptual combination; that is, they indicate that features of the individual words constituting a conceptual combination are activated prior to combination of the words into a new concept. PMID:17576278

  20. [Cognitive aging mechanism of signaling effects on the memory for procedural sentences].

    Yamamoto, Hiroki; Shimada, Hideaki

    2006-08-01

    The aim of this study was to clarify the cognitive aging mechanism of signaling effects on the memory for procedural sentences. Participants were 60 younger adults (college students) and 60 older adults. Both age groups were assigned into two groups; half of each group was presented with procedural sentences with signals that highlighted their top-level structure and the other half with procedural sentences without them. Both groups were requested to perform the sentence arrangement task and the reconstruction task. Each task was composed of procedural sentences with or without signals. Results indicated that signaling supported changes in strategy utilization during the successive organizational processes and that changes in strategy utilization resulting from signaling improved the memory for procedural sentences. Moreover, age-related factors interfered with these signaling effects. This study clarified the cognitive aging mechanism of signaling effects in which signaling supports changes in the strategy utilization during organizational processes at encoding and this mediation promotes memory for procedural sentences, though disuse of the strategy utilization due to aging restrains their memory for procedural sentences.

  1. INFLUENCE OF LENGTH OF SENTENCES ON THE FREQUENCY OF SPEECH DISFLUENCIES IN CHILDREN WHO STUTTER

    Leila Begić

    2017-04-01

    The main purpose of this study was to investigate whether the length of sentences influences the frequency of speech disfluencies in children who stutter. The participants were 30 children who stutter (19 male and 13 female), whose ages ranged from 4 years and 8 months to 6 years and 11 months (56 to 83 months). The research was conducted in kindergartens and primary schools in Tuzla Canton in Bosnia and Herzegovina. The test consisted of 36 sentences. In relation to length, the sentences were divided into three groups: the first group contained 9 sentences of 3 to 5 words, the second group contained 14 sentences of 6 to 8 words, and the third group contained 13 sentences of 9 to 11 words. Testing was conducted with the examiner pronouncing one sentence, after which the participant repeated it; each participant was asked to repeat exactly what he or she had heard. A speech-language pathologist recorded all speech disfluencies in all sentences. The results showed that the sentences containing 9 to 11 words had the greatest effect on the overall dynamics of speech disfluencies in children who stutter. The results suggest that during the process of assessment and diagnosis of children who stutter, the child's ability to use complex linguistic statements and the frequency of disfluencies in relation to sentence complexity should be assessed. Precise diagnostics would provide guidelines for the treatment of stuttering in terms of implementing approaches and strategies that include language treatment and gradually increasing the length and complexity of the statements of children who stutter during speech.

  2. A shared neural substrate for mentalizing and the affective component of sentence comprehension.

    Pierre-Yves Hervé

    Using event-related fMRI in a sample of 42 healthy participants, we compared the cerebral activity maps obtained when classifying spoken sentences based on the mental content of the main character (belief, deception, or empathy) or on the emotional tonality of the sentence (happiness, anger, or sadness). To control for the effects of different syntactic constructions (such as embedded clauses in belief sentences), we subtracted from each map the BOLD activations obtained during plausibility judgments on structurally matching sentences devoid of emotions or ToM. The obtained theory of mind (ToM) and emotional speech comprehension networks overlapped in the bilateral temporo-parietal junction, posterior cingulate cortex, right anterior temporal lobe, dorsomedial prefrontal cortex, and left inferior frontal sulcus. These regions form a ToM network, which contributes to the emotional component of spoken sentence comprehension. Compared with the ToM task, in which the sentences were enounced in a neutral tone, the emotional sentence classification task, in which the sentences were play-acted, was associated with greater activity in the bilateral superior temporal sulcus, in line with the presence of emotional prosody. Besides, the ventromedial prefrontal cortex was more active during emotional than ToM sentence processing. This region may link mental state representations with verbal and prosodic emotional cues. Compared with emotional sentence classification, ToM was associated with greater activity in the caudate nucleus, paracingulate cortex, and superior frontal and parietal regions, in line with behavioral data showing that ToM sentence comprehension was a more demanding task.

  3. A Mouse with a Roof? Effects of Phonological Neighbors on Processing of Words in Sentences in a Non-Native Language

    Ruschemeyer, Shirley-Ann; Nojack, Agnes; Limbach, Maxi

    2008-01-01

    The architecture of the language processing system for speakers of more than one language remains an intriguing topic of research. A common finding is that speakers of multiple languages are slower at responding to language stimuli in their non-native language (L2) than monolingual speakers. This may simply reflect participants' unfamiliarity with…

  4. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
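
    The 'p-charts' mentioned are attribute control charts for event proportions; the sketch below computes 3-sigma binomial limits around an overall adverse-event rate and flags months that fall outside them. The monthly counts are invented, not the authors' 65,170-anesthetic data set.

```python
# Hedged sketch of a p-chart: monthly adverse-event proportions with 3-sigma
# binomial limits around the overall rate. Counts are invented placeholders.
import numpy as np

events = np.array([31, 28, 40, 35, 29, 52, 33, 30, 27, 38, 36, 61])       # per month
cases = np.array([180, 175, 190, 200, 170, 195, 185, 178, 169, 188, 192, 198])

p = events / cases
pbar = events.sum() / cases.sum()                  # center line
sigma = np.sqrt(pbar * (1 - pbar) / cases)         # limits vary with subgroup size
ucl = pbar + 3 * sigma
lcl = np.clip(pbar - 3 * sigma, 0, None)

for month, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
    flag = "  <-- outside limits (possible special cause)" if (pi > u or pi < l) else ""
    print(f"month {month:2d}: p={pi:.3f}  limits=[{l:.3f}, {u:.3f}]{flag}")
```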

  5. Discourse, Paragraph, and Sentence Structure in Selected Philippine Languages. Final Report. Volume II, Sentence Structure.

    Longacre, Robert E.

    Volume II of "Discourse, Paragraph, and Sentence Structure in Selected Philippine Languages" begins with an explanation of certain assumptions and postulates regarding sentence structure. A detailed treatment of systems of sentence structure and the parameters of such systems follows. Data in the various indigenous languages are…

  6. A Study of the Speed of Understanding Sentences as a Function of Sentence Structure. Final Report.

    Halamandaris, Pandelis G.

    On the basis of the grammatical theory developed by Noam Chomsky, it is reasonable to presume that the different parts of a sentence may not all be understood with equal facility and speed. One purpose of this study was to determine whether some of the grammatical relations within a sentence were understood more readily than others. Sentences of…

  7. Paper Quality Control Using Statistical Process Control in Paper Machine 3 [Pengendalian Kualitas Kertas Dengan Menggunakan Statistical Process Control di Paper Machine 3]

    Vera Devani

    2017-01-01

    The purpose of this research is to determine the types and causes of defects commonly found in Paper Machine 3 by using the statistical process control (SPC) method. Statistical process control (SPC) is a technique for solving problems that is used to monitor, control, analyze, manage, and improve products and processes using statistical methods. Based on Pareto diagrams, the wavy defect is found to be the most frequent defect, accounting for 81.7%. The human factor, meanwhile, is found to be the main cause of defects, primarily due to a lack of understanding of the machinery and a lack of training, both leading to errors in data input.
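
    A Pareto tabulation of defect categories is straightforward to reproduce; in the sketch below the counts are invented, with the "wavy" share chosen to match the 81.7% figure reported above.

```python
# Illustrative Pareto tabulation; the defect counts are invented, with the
# "wavy" share chosen to match the 81.7% figure reported above.
from collections import Counter

defects = ["wavy"] * 817 + ["wrinkle"] * 90 + ["hole"] * 55 + ["dirt"] * 38
counts = Counter(defects)

total, cum = sum(counts.values()), 0
for defect, n in counts.most_common():
    cum += n
    print(f"{defect:8s} {n:4d}  {100 * n / total:5.1f}%  cumulative {100 * cum / total:5.1f}%")
```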

  8. Listening to factually incorrect sentences activates classical language areas and thalamus.

    Yu, Tao; Lang, Simone; Birbaumer, Niels; Kotchoubey, Boris

    2011-12-07

    Neurophysiological underpinnings of the integration of information during sentence comprehension have been studied since 1980. However, little is known about integrative processes in sentences containing a word that is semantically congruent, but factually incompatible with the context. In this study, we aimed at investigating the differences between the brain regions involved in responses to factually correct and incorrect sentences. Eighteen healthy volunteers underwent functional MRI while listening passively to 40 correct and 40 incorrect sentences. The contrast between factually correct and incorrect sentence endings revealed large activation areas in the left inferior frontal gyrus, the left middle/superior temporal gyrus, and smaller activations of these areas' homologs in the right hemisphere, in the thalamus, and Brodmann area 6.

  9. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  10. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
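
    As a point of reference for the cumulative sum chart mentioned (the method explored in the earlier work), here is a tabular one-sided CUSUM on standardized rating statistics; the reference value k and decision interval h are the conventional defaults, not values from the article, and the rating data are simulated with a drift halfway through.

```python
# Tabular one-sided CUSUM on standardized rating statistics (conventional
# k = 0.5 and h = 4; data simulated with a drift halfway through).
import numpy as np

def cusum_signals(z: np.ndarray, k: float = 0.5, h: float = 4.0):
    """z: standardized statistics. Returns indices where a drift is signaled."""
    hi = lo = 0.0
    signals = []
    for i, zi in enumerate(z):
        hi = max(0.0, hi + zi - k)     # accumulates evidence of upward drift
        lo = max(0.0, lo - zi - k)     # accumulates evidence of downward drift
        if hi > h or lo > h:
            signals.append(i)
            hi = lo = 0.0              # restart after a signal
    return signals

rng = np.random.default_rng(6)
ratings = np.concatenate([rng.normal(0, 1, 30), rng.normal(0.8, 1, 30)])  # drift at t=30
print("signals at indices:", cusum_signals(ratings))
```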

  11. Protecting the Force: Application of Statistical Process Control for Force Protection in Bosnia

    Finken, Paul

    2000-01-01

    .... In Operations Other Than War (OOTW), environments where the enemy is disorganized and incapable of mounting a deception plan, staffs could model hostile events as stochastic events and use statistical methods to detect changes to the process...

  12. Penultimate modeling of spatial extremes: statistical inference for max-infinitely divisible processes

    Huser, Raphaël; Opitz, Thomas; Thibaud, Emeric

    2018-01-01

    Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability

  13. Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process

    Keithley, Michael G.

    2009-01-01

    The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…

  14. Impact of Autocorrelation on Principal Components and Their Use in Statistical Process Control

    Vanhatalo, Erik; Kulahci, Murat

    2015-01-01

    A basic assumption when using principal component analysis (PCA) for inferential purposes, such as in statistical process control (SPC), is that the data are independent in time. In many industrial processes, frequent sampling and process dynamics make this assumption unrealistic rendering sampled...

  15. Encoding and Retrieval Interference in Sentence Comprehension: Evidence from Agreement

    Sandra Villata

    2018-01-01

    Long-distance verb-argument dependencies generally require the integration of a fronted argument when the verb is encountered for sentence interpretation. Under a parsing model that handles long-distance dependencies through a cue-based retrieval mechanism, retrieval is hampered when retrieval cues also resonate with non-target elements (retrieval interference). However, similarity-based interference may also stem from interference arising during the encoding of elements in memory (encoding interference), an effect that is not directly accounted for by a cue-based retrieval mechanism. Although encoding and retrieval interference are clearly distinct at the theoretical level, it is difficult to disentangle the two on empirical grounds, since encoding interference may also manifest at the retrieval region. We report two self-paced reading experiments aimed at teasing apart the role of each component in gender and number subject-verb agreement in Italian and English object relative clauses. In Italian, the verb does not agree in gender with the subject, thus providing no cue for retrieval. In English, although present tense verbs agree in number with the subject, past tense verbs do not, allowing us to test the role of number as a retrieval cue within the same language. Results from both experiments converge, showing similarity-based interference at encoding, and some evidence for an effect at retrieval. After having pointed out the non-negligible role of encoding in sentence comprehension, and noting that Lewis and Vasishth's (2005) ACT-R model of sentence processing, the most fully developed cue-based retrieval approach to sentence processing, does not predict encoding effects, we propose an augmentation of this model that predicts these effects. We then also propose a self-organizing sentence processing model (SOSP), which has the advantage of accounting for retrieval and encoding interference with a single mechanism.

  17. Evaluation of context effects in sentence recognition

    Bronkhorst, A.W.; Brand, T.; Wagener, K.

    2002-01-01

    It was investigated whether the model for context effects, developed earlier by Bronkhorst et al. [J. Acoust. Soc. Am. 93, 499-509 (1993)], can be applied to results of sentence tests, used for the evaluation of speech recognition. Data for two German sentence tests, that differed with respect to

  18. Creating Hope for Life-Sentenced Offenders

    Ruddell, Rick; Broom, Ian; Young, Matthew

    2010-01-01

    Offenders sentenced to terms of life imprisonment pose special challenges for correctional systems. The Correctional Service of Canada collaborated with nongovernmental agencies to develop programmatic interventions to better prepare this population to survive their prison sentences and transition to the community. This study describes the…

  19. THE CHILD JUSTICE ACT: PROCEDURAL SENTENCING ISSUES

    Stephan

    2012-08-08

    Aug 8, 2012 ... research visits, and the Max Planck Institute for Foreign and International Criminal Law, Freiburg, Germany ... Whether or not a pre-sentence report should be obtained before a child offender is sentenced has ... the Criminal Procedure Act. It is important to read the quoted part of section 85(1) as a single ...

  20. Phonological Advance Planning in Sentence Production

    Oppermann, Frank; Jescheniak, Jorg D.; Schriefers, Herbert

    2010-01-01

    Our study addresses the scope of phonological advance planning during sentence production using a novel experimental procedure. The production of German sentences in various syntactic formats (SVO, SOV, and VSO) was cued by presenting pictures of the agents of previously memorized agent-action-patient scenes. To tap the phonological activation of…

  1. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
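
    To make the comparison concrete, the sketch below fits the usual interrupted-time-series segmented regression (baseline level and trend plus a level change and a trend change at the intervention) by ordinary least squares on simulated monthly data; it does not use the decision-support-system data from the article.

```python
# Minimal segmented-regression sketch for an interrupted time series, fitted by
# ordinary least squares on simulated monthly data.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(48)                        # e.g. 48 months
t0 = 24                                  # intervention at month 24
post = (t >= t0).astype(float)

y = 50 + 0.2 * t - 6 * post - 0.4 * post * (t - t0) + rng.normal(0, 1.5, t.size)

X = np.column_stack([np.ones_like(t, dtype=float), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["baseline level", "baseline trend", "level change", "trend change"], beta):
    print(f"{name:>15s}: {b:+.2f}")
```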

  2. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    Pulsipher, B.A.; Kuhn, W.L.

    1987-01-01

    Current planning for liquid high-level nuclear wastes existing in the United States includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product

  3. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate it into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs

  4. Predicting Neural Activity Patterns Associated with Sentences Using a Neurobiologically Motivated Model of Semantic Representation.

    Anderson, Andrew James; Binder, Jeffrey R; Fernandino, Leonardo; Humphries, Colin J; Conant, Lisa L; Aguilar, Mario; Wang, Xixi; Doko, Donias; Raizada, Rajeev D S

    2017-09-01

    We introduce an approach that predicts neural representations of word meanings contained in sentences then superposes these to predict neural representations of new sentences. A neurobiological semantic model based on sensory, motor, social, emotional, and cognitive attributes was used as a foundation to define semantic content. Previous studies have predominantly predicted neural patterns for isolated words, using models that lack neurobiological interpretation. Fourteen participants read 240 sentences describing everyday situations while undergoing fMRI. To connect sentence-level fMRI activation patterns to the word-level semantic model, we devised methods to decompose the fMRI data into individual words. Activation patterns associated with each attribute in the model were then estimated using multiple-regression. This enabled synthesis of activation patterns for trained and new words, which were subsequently averaged to predict new sentences. Region-of-interest analyses revealed that prediction accuracy was highest using voxels in the left temporal and inferior parietal cortex, although a broad range of regions returned statistically significant results, showing that semantic information is widely distributed across the brain. The results show how a neurobiologically motivated semantic model can decompose sentence-level fMRI data into activation features for component words, which can be recombined to predict activation patterns for new sentences. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
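
    The sketch below mirrors the shape of the pipeline described above, purely conceptually: regress word-level activation patterns on semantic attribute ratings, synthesize patterns for new words from the fitted attribute maps, and average word patterns to predict a sentence-level pattern. All dimensions and data are random placeholders, not the study's attribute model or fMRI recordings.

```python
# Conceptual sketch of attribute-based prediction of sentence-level patterns;
# all sizes and data are random placeholders.
import numpy as np

rng = np.random.default_rng(8)
n_words, n_attrs, n_voxels = 200, 65, 500          # placeholder sizes

A = rng.random((n_words, n_attrs))                 # attribute ratings per word
W_true = rng.normal(size=(n_attrs, n_voxels))
Y = A @ W_true + rng.normal(0, 0.5, (n_words, n_voxels))   # word-level activation

# multiple regression: one voxel-wise weight map per semantic attribute
W_hat, *_ = np.linalg.lstsq(A, Y, rcond=None)

# predict a "new" four-word sentence as the average of its predicted word patterns
sentence_attrs = rng.random((4, n_attrs))
predicted_pattern = (sentence_attrs @ W_hat).mean(axis=0)
print(predicted_pattern.shape)                     # (500,) voxels
```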

  5. The role of working memory in inferential sentence comprehension.

    Pérez, Ana Isabel; Paolieri, Daniela; Macizo, Pedro; Bajo, Teresa

    2014-08-01

    Existing literature on inference making is large and varied. Trabasso and Magliano (Discourse Process 21(3):255-287, 1996) proposed the existence of three types of inferences: explicative, associative and predictive. In addition, the authors suggested that these inferences were related to working memory (WM). In the present experiment, we investigated whether WM capacity plays a role in our ability to answer comprehension sentences that require text information based on these types of inferences. Participants with high and low WM span read two narratives with four paragraphs each. After each paragraph was read, they were presented with four true/false comprehension sentences. One required verbatim information and the other three implied explicative, associative and predictive inferential information. Results demonstrated that only the explicative and predictive comprehension sentences required WM: participants with high verbal WM were more accurate in giving explanations and also faster at making predictions relative to participants with low verbal WM span; in contrast, no WM differences were found in the associative comprehension sentences. These results are interpreted in terms of the causal nature underlying these types of inferences.

  6. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...

  7. Use of statistical process control in the production of blood components

    Magnussen, K; Quere, S; Winkel, P

    2008-01-01

    Introduction of statistical process control in the setting of a small blood centre was tested, both on the regular red blood cell production and specifically to test if a difference was seen in the quality of the platelets produced when a change was made from a relatively large inexperienced occasional component manufacturing staff to an experienced regular manufacturing staff. Production of blood products is a semi-automated process in which the manual steps may be difficult to control. This study was performed in an ongoing effort to improve the control and optimize the quality of the blood … by an experienced staff with four technologists. We applied statistical process control to examine if time series of quality control values were in statistical control. Leucocyte count in red blood cells was out of statistical control. Platelet concentration and volume of the platelets produced by the occasional …

  8. The Effect of Number and Presentation Order of High-Constraint Sentences on Second Language Word Learning.

    Ma, Tengfei; Chen, Ran; Dunlap, Susan; Chen, Baoguo

    2016-01-01

    This paper presents the results of an experiment that investigated the effects of number and presentation order of high-constraint sentences on semantic processing of unknown second language (L2) words (pseudowords) through reading. All participants were Chinese native speakers who learned English as a foreign language. In the experiment, sentence constraint and the order of sentences with different constraints were manipulated in English sentences, as well as the L2 proficiency level of the participants. We found that a greater number of high-constraint sentences supported L2 word learning, except in the condition in which the high-constraint exposure was presented first. Moreover, when the number of high-constraint sentences was the same, learning was significantly better when the first exposure was a high-constraint exposure. No proficiency-level effects were found. Our results provided direct evidence that L2 word learning benefited from high quality language input and from first presentations of high quality language input.

  9. Effects of syntactic structure in the memory of concrete and abstract Chinese sentences.

    Ho, C S; Chen, H C

    1993-09-01

    Smith (1981) found that concrete English sentences were better recognized than abstract sentences and that this concreteness effect was potent only when the concrete sentence was also affirmative, but that the effect reversed when the concrete sentence was negative. These results were partially replicated in Experiment 1 by using materials from a very different language (i.e., Chinese): concrete-affirmative sentences were better remembered than concrete-negative and abstract sentences, but no reliable difference was found between the latter two types. In Experiment 2, the task was modified by using a visual presentation instead of an oral one as in Experiment 1. Both concrete-affirmative and concrete-negative sentences were better memorized than abstract ones in Experiment 2. The findings in the two experiments are explained by a combination of the dual-coding model and Marschark's (1985) item-specific and relational processing. The differential effects of experience with different language systems on processing verbal materials in memory are also discussed.

  10. Statistical relation between particle contaminations in ultra pure water and defects generated by process tools

    Wali, F.; Knotter, D. Martin; Wortelboer, Ronald; Mud, Auke

    2007-01-01

    Ultra pure water supplied inside the Fab is used in different tools at different stages of processing. Data on the particles measured in ultra pure water were compared with the defect density on wafers processed on these tools, and a statistical relation was found.

  11. Ionization processes in a transient hollow cathode discharge before electric breakdown: statistical distribution

    Zambra, M.; Favre, M.; Moreno, J.; Wyndham, E.; Chuaqui, H.; Choi, P.

    1998-01-01

    The charge formation processes in the hollow cathode region (HCR) of a transient hollow cathode discharge have been studied in the final phase. The statistical distributions that describe the different ionization processes have been represented by Gaussian distributions. Nevertheless, a better representation of these distributions was observed when the pressure is near a minimum value, just before breakdown.

  12. Hazard rate model and statistical analysis of a compound point process

    Volf, Petr

    2005-01-01

    Vol. 41, No. 6 (2005), pp. 773-786. ISSN 0023-5954. R&D Projects: GA ČR(CZ) GA402/04/1294. Institutional research plan: CEZ:AV0Z10750506. Keywords: counting process; compound process; Cox regression model; intensity. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.343, year: 2005

  13. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  14. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  15. Statistical error in simulations of Poisson processes: Example of diffusion in solids

    Nilsson, Johan O.; Leetmaa, Mikael; Vekilova, Olga Yu.; Simak, Sergei I.; Skorodumova, Natalia V.

    2016-08-01

    Simulations of diffusion in solids often produce poor statistics of diffusion events. We present an analytical expression for the statistical error in ion conductivity obtained in such simulations. The error expression is not restricted to any particular computational method, but is valid in the context of simulation of Poisson processes in general. This analytical error expression is verified numerically for the case of Gd-doped ceria by running a large number of kinetic Monte Carlo calculations.
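
    The scaling behind this kind of error estimate can be illustrated with a short simulation. The sketch below is not the paper's analytical expression; it only demonstrates, with made-up rate and time values, that a rate estimated from a Poisson-distributed number of events has a relative statistical error of roughly 1/sqrt(N).

```python
import numpy as np

# Illustrative sketch (not the paper's expression): the relative statistical
# error of a rate estimated from a Poisson-distributed number of events
# scales as 1/sqrt(N), which is why short kinetic Monte Carlo runs with few
# diffusion events give poor statistics.  Rate and time values are made up.
rng = np.random.default_rng(0)

true_rate = 5.0          # hypothetical hop rate (events per unit time)
observation_time = 20.0  # hypothetical simulation length

n_runs = 10_000
counts = rng.poisson(true_rate * observation_time, size=n_runs)
estimated_rates = counts / observation_time

empirical_rel_error = estimated_rates.std() / true_rate
predicted_rel_error = 1.0 / np.sqrt(true_rate * observation_time)

print(f"empirical relative error: {empirical_rel_error:.4f}")
print(f"1/sqrt(N) prediction:     {predicted_rel_error:.4f}")
```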

  16. Animacy or case marker order?: priority information for online sentence comprehension in a head-final language.

    Yokoyama, Satoru; Takahashi, Kei; Kawashima, Ryuta

    2014-01-01

    It is well known that case marker information and animacy information are incrementally used to comprehend sentences in head-final languages. However, it is still unclear how these two kinds of information are processed when they are in competition in a sentence's surface expression. The current study used sentences conveying the potentiality of some event (henceforth, potential sentences) in the Japanese language with theoretically canonical word order (dative-nominative/animate-inanimate order) and with scrambled word order (nominative-dative/inanimate-animate order). In Japanese, nominative-first case order and animate-inanimate animacy order are preferred to their reversed patterns in simplex sentences. Hence, in these potential sentences, case information and animacy information are in competition. The experiment consisted of a self-paced reading task testing two conditions (that is, canonical and scrambled potential sentences). Forty-five native speakers of Japanese participated. In our results, the canonical potential sentences showed a scrambling cost at the second argument position (the nominative argument). This result indicates that the theoretically scrambled case marker order (nominative-dative) is processed as a mentally canonical case marker order, suggesting that case information is used preferentially over animacy information when the two are in competition. The implications of our findings are discussed with regard to incremental simplex sentence comprehension models for head-final languages.

  17. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor

  18. Verbal Semantics Drives Early Anticipatory Eye Movements during the Comprehension of Verb-Initial Sentences

    Sauppe, Sebastian

    2016-01-01

    Studies on anticipatory processes during sentence comprehension often focus on the prediction of postverbal direct objects. In subject-initial languages (the target of most studies so far), however, the position in the sentence, the syntactic function, and the semantic role of arguments are often conflated. For example, in the sentence “The frog will eat the fly” the syntactic object (“fly”) is at the same time also the last word and the patient argument of the verb. It is therefore not appar...

  19. Parametric analysis of the statistical model of the stick-slip process

    Lima, Roberta; Sampaio, Rubens

    2017-06-01

    In this paper, a parametric analysis of the statistical model of the response of a dry-friction oscillator is performed. The oscillator is a spring-mass system which moves over a base with a rough surface. Due to this roughness, the mass is subject to a dry-friction force modeled as Coulomb friction. The system is stochastically excited by an imposed bang-bang base motion. The base velocity is modeled by a Poisson process for which a probabilistic model is fully specified. The excitation induces in the system stochastic stick-slip oscillations. The system response is composed of a random sequence alternating between stick and slip modes. With realizations of the system, a statistical model is constructed for this sequence. In this statistical model, the variables of interest of the sequence are modeled as random variables, for example, the number of time intervals in which stick or slip occurs, the instants at which they begin, and their durations. Samples of the system response are computed by integration of the dynamic equation of the system using independent samples of the base motion. Statistics and histograms of the random variables which characterize the stick-slip process are estimated for the generated samples. The objective of the paper is to analyze how these estimated statistics and histograms vary with the system parameters, i.e., to make a parametric analysis of the statistical model of the stick-slip process.

  20. Statistical test data selection for reliability evaluation of process computer software

    Volkmann, K.P.; Hoermann, H.; Ehrenberger, W.

    1976-01-01

    The paper presents a concept for converting knowledge about the characteristics of process states into practicable procedures for the statistical selection of test cases in testing process computer software. Process states are defined as vectors whose components consist of values of input variables lying in discrete positions or within given limits. Two approaches for test data selection, based on knowledge about cases of demand, are outlined referring to a purely probabilistic method and to the mathematics of stratified sampling. (orig.)
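
    The stratified-sampling idea mentioned above can be sketched in a few lines. The strata, weights, and input ranges in the following example are purely hypothetical and are not taken from the paper; the sketch only shows how test cases could be allocated across demand classes in proportion to their operational frequency.

```python
import random

# Hypothetical sketch of stratified test-case selection: process states are
# grouped into strata (e.g. by demand case), and test inputs are drawn from
# each stratum in proportion to how often that demand occurs in operation.
# Strata names, ranges and weights below are illustrative only.
strata = {
    "low_load":  {"weight": 0.6, "range": (0.0, 10.0)},
    "high_load": {"weight": 0.3, "range": (10.0, 50.0)},
    "fault":     {"weight": 0.1, "range": (50.0, 100.0)},
}

def draw_test_cases(n_total, strata, seed=42):
    """Allocate n_total test cases across strata and sample each range uniformly."""
    rng = random.Random(seed)
    cases = []
    for name, spec in strata.items():
        n_stratum = round(n_total * spec["weight"])
        lo, hi = spec["range"]
        cases += [(name, rng.uniform(lo, hi)) for _ in range(n_stratum)]
    return cases

for stratum, value in draw_test_cases(10, strata):
    print(f"{stratum:9s} -> input value {value:6.2f}")
```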

  1. Is human sentence parsing serial or parallel? Evidence from event-related brain potentials.

    Hopf, Jens-Max; Bader, Markus; Meng, Michael; Bayer, Josef

    2003-01-01

    In this ERP study we investigate the processes that occur in syntactically ambiguous German sentences at the point of disambiguation. Whereas most psycholinguistic theories agree on the view that processing difficulties arise when parsing preferences are disconfirmed (so-called garden-path effects), important differences exist with respect to theoretical assumptions about the parser's recovery from a misparse. A key distinction can be made between parsers that compute all alternative syntactic structures in parallel (parallel parsers) and parsers that compute only a single preferred analysis (serial parsers). To distinguish empirically between parallel and serial parsing models, we compare ERP responses to garden-path sentences with ERP responses to truly ungrammatical sentences. Garden-path sentences contain a temporary and ultimately curable ungrammaticality, whereas truly ungrammatical sentences remain so permanently--a difference which gives rise to different predictions in the two classes of parsing architectures. At the disambiguating word, ERPs in both sentence types show negative shifts of similar onset latency, amplitude, and scalp distribution in an initial time window between 300 and 500 ms. In a following time window (500-700 ms), the negative shift to garden-path sentences disappears at right central parietal sites, while it continues in permanently ungrammatical sentences. These data are taken as evidence for a strictly serial parser. The absence of a difference in the early time window indicates that temporary and permanent ungrammaticalities trigger the same kind of parsing responses. Later differences can be related to successful reanalysis in garden-path but not in ungrammatical sentences.

  2. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    The aim was to establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a formula medicinal material of Yiqi Fumai lyophilized injection, by combining near infrared spectroscopy with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had good monitoring ability for the extraction process. Application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction of Schisandrae Chinensis Fructus and reflect changes in material properties in real time. The established process monitoring method could serve as a reference for the application of process analysis technology to the process quality control of traditional Chinese medicine injections.
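
    A minimal sketch of the kind of multivariate monitoring described here (PCA scores with a Hotelling T2 statistic) is given below. The synthetic "spectra", the number of retained components, and the empirical 99% control limit are illustrative simplifications, not the authors' NIR model, and the DModX chart is not reproduced.

```python
import numpy as np

# Minimal sketch of PCA-based multivariate monitoring with a Hotelling T2
# statistic.  The random "spectra" stand in for NIR measurements of normal
# batches, and the empirical 99% quantile replaces the F-distribution limit
# usually used; both are illustrative assumptions.
rng = np.random.default_rng(4)
n_train, n_vars = 50, 20
train = rng.normal(size=(n_train, n_vars))          # "normal batch" spectra

mean = train.mean(axis=0)
centered = train - mean
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
k = 3                                               # retained principal components
P = Vt[:k].T                                        # loadings
score_var = (s[:k] ** 2) / (n_train - 1)            # variance of each PCA score

def hotelling_t2(x):
    t = (x - mean) @ P                              # project onto the PCA model
    return float(np.sum(t ** 2 / score_var))

limit = np.quantile([hotelling_t2(x) for x in train], 0.99)
new_point = mean + rng.normal(scale=3.0, size=n_vars)   # simulated abnormal sample
print(f"T2 = {hotelling_t2(new_point):.1f}, empirical 99% limit = {limit:.1f}")
```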

  3. Statistical analysis and digital processing of the Mössbauer spectra

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work is focused on using the statistical methods and development of the filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in the measured spectra are used in many scientific areas. The use of a pure statistical approach in accumulated Mössbauer spectra filtration is described. In Mössbauer spectroscopy, the noise can be considered as a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the non-resonant photons counting with electronic noise (from γ-ray detection and discrimination units), and the velocity system quality that can be characterized by the velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to assign the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The estimation of the theoretical correlation coefficient level which corresponds to the spectrum resolution is performed. Correlation coefficient test is based on comparison of the theoretical and the experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio and the applicability of this method has binding conditions.
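
    The periodogram-thresholding step can be illustrated with a small sketch. The synthetic absorption line, the Poisson noise level, and the median-based threshold below are illustrative only; the correlation-coefficient feedback loop described in the abstract is not reproduced.

```python
import numpy as np

# Sketch of periodogram-style filtering of a noisy (Mössbauer-like) spectrum:
# keep only spectral components whose power exceeds a noise threshold, then
# transform back.  Synthetic data and threshold are illustrative assumptions.
rng = np.random.default_rng(1)
channels = np.arange(512)

# Synthetic absorption line on a flat baseline, with Poisson counting noise.
baseline = 1.0e4
signal = baseline - 800.0 * np.exp(-0.5 * ((channels - 256) / 8.0) ** 2)
counts = rng.poisson(signal).astype(float)

spectrum = np.fft.rfft(counts - counts.mean())
power = np.abs(spectrum) ** 2

# Keep components well above the median power level (illustrative criterion).
threshold = 10.0 * np.median(power)
filtered_spectrum = np.where(power > threshold, spectrum, 0.0)
filtered = np.fft.irfft(filtered_spectrum, n=channels.size) + counts.mean()

noisy_rms = np.sqrt(np.mean((counts - signal) ** 2))
residual_rms = np.sqrt(np.mean((filtered - signal) ** 2))
print(f"RMS deviation from the true line: raw {noisy_rms:.1f}, filtered {residual_rms:.1f}")
```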

  4. Statistical analysis and digital processing of the Mössbauer spectra

    Prochazka, Roman; Tucek, Jiri; Mashlan, Miroslav; Pechousek, Jiri; Tucek, Pavel; Marek, Jaroslav

    2010-01-01

    This work is focused on using the statistical methods and development of the filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in the measured spectra are used in many scientific areas. The use of a pure statistical approach in accumulated Mössbauer spectra filtration is described. In Mössbauer spectroscopy, the noise can be considered as a Poisson statistical process with a Gaussian distribution for high numbers of observations. This noise is a superposition of the non-resonant photons counting with electronic noise (from γ-ray detection and discrimination units), and the velocity system quality that can be characterized by the velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to assign the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The estimation of the theoretical correlation coefficient level which corresponds to the spectrum resolution is performed. Correlation coefficient test is based on comparison of the theoretical and the experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio and the applicability of this method has binding conditions

  5. Rapid L2 Word Learning through High Constraint Sentence Context: An Event-Related Potential Study

    Baoguo Chen

    2017-12-01

    Previous studies have found that quantity of exposure, i.e., frequency of exposure (Horst et al., 1998; Webb, 2008; Pellicer-Sánchez and Schmitt, 2010), is important for second language (L2) contextual word learning. Besides this factor, context constraint and L2 proficiency level have also been found to affect contextual word learning (Pulido, 2003; Tekmen and Daloglu, 2006; Elgort et al., 2015; Ma et al., 2015). In the present study, we adopted the event-related potential (ERP) technique and chose high constraint sentences as reading materials to further explore the effects of quantity of exposure and proficiency on L2 contextual word learning. Participants were Chinese learners of English with different English proficiency levels. For each novel word, there were four high constraint sentences with the critical word at the end of the sentence. Learners read sentences and made semantic relatedness judgments afterwards, with ERPs recorded. Results showed that in the high constraint condition, where each pseudoword was embedded in four sentences with consistent meaning, N400 amplitude for the pseudoword decreased significantly as learners read the first two sentences. High proficiency learners responded faster in the semantic relatedness judgment task. These results suggest that in high quality sentence contexts, L2 learners could rapidly acquire word meaning without multiple exposures, and L2 proficiency facilitated this learning process.

  6. THE CHILD JUSTICE ACT: PROCEDURAL SENTENCING ISSUES

    Stephan

    principles in terms of which the appropriate sentence should be established,1 ... Republic of South Africa, 1996, the theory of the best interests of the child as a ..... different forms of imprisonment under South African law.29 The Act expressly.

  7. Management of Uncertainty by Statistical Process Control and a Genetic Tuned Fuzzy System

    Stephan Birle

    2016-01-01

    In the food industry, bioprocesses like fermentation are often a crucial part of the manufacturing process and decisive for the final product quality. In general, they are characterized by highly nonlinear dynamics and uncertainties that make it difficult to control these processes with traditional control techniques. In this context, fuzzy logic controllers offer quite a straightforward way to control processes that are affected by nonlinear behavior and uncertain process knowledge. However, in order to maintain process safety and product quality it is necessary to specify the controller performance and to tune the controller parameters. In this work, an approach is presented to establish an intelligent control system for oxidoreductive yeast propagation as a representative process biased by the aforementioned uncertainties. The presented approach is based on statistical process control and fuzzy logic feedback control. Because the cognitive uncertainty among different experts about the limits that define still-acceptable control performance may differ considerably, a data-driven design method is performed. Based upon a historic data pool, statistical process corridors are derived for the controller inputs control error and change in control error. This approach follows the hypothesis that if the control performance criteria stay within predefined statistical boundaries, the final process state meets the required quality definition. In order to keep the process on its optimal growth trajectory (model-based reference trajectory), a fuzzy logic controller is used that alternates the process temperature. Additionally, in order to stay within the process corridors, a genetic algorithm was applied to tune the input and output fuzzy sets of a preliminarily parameterized fuzzy controller. The presented experimental results show that the genetic tuned fuzzy controller is able to keep the process within its allowed limits. The average absolute error to the

  8. Statistical process control: separating signal from noise in emergency department operations.

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine.
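
    The moving range chart mentioned above is straightforward to compute. The sketch below uses the conventional XmR constants (2.66 and 3.267) on made-up daily waiting times; it is not the authors' data or software.

```python
import numpy as np

# Sketch of an individuals and moving-range (XmR) chart of the kind described
# above, using the conventional constants 2.66 and 3.267.  The daily
# "door-to-doctor" times below are made-up example data.
times = np.array([42, 38, 55, 47, 51, 44, 39, 62, 48, 45,
                  50, 41, 58, 46, 43, 90, 49, 52, 44, 47], dtype=float)

moving_range = np.abs(np.diff(times))
x_bar = times.mean()
mr_bar = moving_range.mean()

ucl_x = x_bar + 2.66 * mr_bar      # upper control limit for individual values
lcl_x = x_bar - 2.66 * mr_bar      # lower control limit for individual values
ucl_mr = 3.267 * mr_bar            # upper control limit for moving ranges

print(f"centre line {x_bar:.1f}, control limits [{lcl_x:.1f}, {ucl_x:.1f}]")
for day, value in enumerate(times, start=1):
    if value > ucl_x or value < lcl_x:
        print(f"day {day}: {value:.0f} min -> special-cause signal")
```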

  9. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  10. Use of statistic control of the process as part of a quality assurance plan

    Acosta, S.; Lewis, C.

    2013-01-01

    One of the technical requirements of the standard IRAM ISO 17025 for the accreditation of testing laboratories is the assurance of the quality of the results through the control and monitoring of the factors influencing their reliability. The degree to which these factors contribute to the total measurement uncertainty determines which of them should be considered when developing a quality assurance plan. The laboratory for environmental measurements of strontium-90, in the accreditation process, performs most of its determinations on samples with values close to the detection limit. For this reason the correct characterization of the blank is a critical parameter, and it is verified through a statistical process control chart. The scope of the present work is the control of blanks, so a statistically significant amount of data was collected over a period of time covering different conditions. This made it possible to take into account significant variables in the process, such as temperature and humidity, and to build a blank control chart, which forms the basis of a statistical process control. The data obtained provided the lower and upper limits for the preparation of the blank control chart. In this way the blank characterization process was considered to operate under statistical control, and it is concluded that it can be used as part of a quality assurance plan.

  11. Minho Affective Sentences (MAS): Probing the roles of sex, mood, and empathy in affective ratings of verbal stimuli.

    Pinheiro, Ana P; Dias, Marcelo; Pedrosa, João; Soares, Ana P

    2017-04-01

    During social communication, words and sentences play a critical role in the expression of emotional meaning. The Minho Affective Sentences (MAS) were developed to respond to the lack of a standardized sentence battery with normative affective ratings: 192 neutral, positive, and negative declarative sentences were strictly controlled for psycholinguistic variables such as numbers of words and letters and per-million word frequency. The sentences were designed to represent examples of each of the five basic emotions (anger, sadness, disgust, fear, and happiness) and of neutral situations. These sentences were presented to 536 participants who rated the stimuli using both dimensional and categorical measures of emotions. Sex differences were also explored. Additionally, we probed how personality, empathy, and mood from a subset of 40 participants modulated the affective ratings. Our results confirmed that the MAS affective norms are valid measures to guide the selection of stimuli for experimental studies of emotion. The combination of dimensional and categorical ratings provided a more fine-grained characterization of the affective properties of the sentences. Moreover, the affective ratings of positive and negative sentences were not only modulated by participants' sex, but also by individual differences in empathy and mood state. Together, our results indicate that, in their quest to reveal the neurofunctional underpinnings of verbal emotional processing, researchers should consider not only the role of sex, but also of interindividual differences in empathy and mood states, in responses to the emotional meaning of sentences.

  12. 28 CFR 2.10 - Date service of sentence commences.

    2010-07-01

    ... imposed. (b) The imposition of a sentence of imprisonment for civil contempt shall interrupt the running of any sentence of imprisonment being served at the time the sentence of civil contempt is imposed... civil contempt is lifted. (c) Service of the sentence of a committed youth offender or person committed...

  13. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    The study deals with an analysis of data intended to improve the quality of statistical tools in the processes of assembly of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are constantly more approaches to handling non-normal data. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution with the theoretical normal distribution on the basis of hypothesis testing, using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of 1st-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study closely examines the measured data of an airbag's assembly and aims to obtain normally distributed data and apply statistical process control to them. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow the normal distribution); therefore it is necessary to begin to work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to a quality output of the whole statistical control of manufacturing processes.
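
    The normality-testing and transformation steps can be sketched with SciPy instead of StatGraphics or Minitab. The right-skewed sample below is synthetic, and the Box-Cox step is only one of several possible transformations.

```python
import numpy as np
from scipy import stats

# Sketch (using SciPy rather than StatGraphics/Minitab) of the two steps the
# study describes: test the measured characteristic for normality, and if the
# test rejects, try a Box-Cox transformation before building control charts.
# The right-skewed sample below is synthetic.
rng = np.random.default_rng(7)
measurements = rng.lognormal(mean=0.5, sigma=0.4, size=200)

stat, p_raw = stats.shapiro(measurements)
print(f"raw data: Shapiro-Wilk p = {p_raw:.4f}")

if p_raw < 0.05:
    transformed, lam = stats.boxcox(measurements)   # requires positive data
    _, p_transformed = stats.shapiro(transformed)
    print(f"Box-Cox lambda = {lam:.3f}, Shapiro-Wilk p after transform = {p_transformed:.4f}")
```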

  14. The Interaction of Eye-Voice Span with Syntactic Chunking and Predictability in Right- and Left-Embedded Sentences.

    Balajthy, Ernest P., Jr.

    Sixty tenth graders participated in this study of relationships between eye/voice span, phrase and clause boundaries, reading ability, and sentence structure. Results indicated that sentences apparently are "chunked" into surface constituents during processing. Better tenth grade readers had longer eye/voice spans than did poorer readers and…

  15. [Personality traits of drivers serving a custodial sentence for drink driving].

    Pawłowska, Beata; Rzeszutko, Ewa

    2015-01-01

    The aim of the work was to analyse the personality traits of men serving a custodial sentence for driving under the influence of alcohol. The study included 44 males serving a custodial sentence for drink driving, 45 males serving a custodial sentence for assault and robbery, as well as 32 men with no criminal record who had never driven a motor vehicle under the influence of alcohol. The following research methods were used during the study: the Socio-demographic Questionnaire designed by the authors, the KRS, Cattell's IPAT, the NI, the ACL and the Lifestyle Questionnaire. The obtained results indicate statistically significant differences for the men serving a custodial sentence for drink driving as regards stress coping, anxiety level, an intensified need to look for new experiences, and anti-social personality traits. The men serving a custodial sentence for drink driving show intensified traits of antisocial personality, a higher level of anxiety, intensified impulsiveness, irritability, distrust, aggression, egocentrism, eccentricity, an intensified need for recognition, breaking of social standards, and experiencing various stimuli and new impressions, as well as greater adaptation difficulties, less self-discipline, lower self-esteem, and more frequently used destructive, escapist and emotional stress coping strategies, as compared to the people with no criminal record who had never driven under the influence of alcohol. As regards the intensity of personality disorders, stress coping strategies and self-image, no significant differences were found between the men serving a custodial sentence for drink driving and those imprisoned for assault and robbery.

  16. On-line statistical processing of radiation detector pulse trains with time-varying count rates

    Apostolopoulos, G.

    2008-01-01

    Statistical analysis is of primary importance for the correct interpretation of nuclear measurements, due to the inherent random nature of radioactive decay processes. This paper discusses the application of statistical signal processing techniques to the random pulse trains generated by radiation detectors. The aims of the presented algorithms are: (i) continuous, on-line estimation of the underlying time-varying count rate θ(t) and its first-order derivative dθ/dt; (ii) detection of abrupt changes in both of these quantities and estimation of their new value after the change point. Maximum-likelihood techniques, based on the Poisson probability distribution, are employed for the on-line estimation of θ and dθ/dt. Detection of abrupt changes is achieved on the basis of the generalized likelihood ratio statistical test. The properties of the proposed algorithms are evaluated by extensive simulations and possible applications for on-line radiation monitoring are discussed
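
    The two aims can be illustrated with a toy example: a maximum-likelihood rate estimate per counting window, and a generalized likelihood ratio statistic comparing two windows. This is a simplified sketch, not the paper's on-line algorithm; the rates, window length, and decision threshold are arbitrary.

```python
import numpy as np

def poisson_loglik(n, t):
    """Poisson log-likelihood at the MLE rate n/t (constants dropped); 0*log(0) := 0."""
    if n == 0:
        return 0.0
    return n * np.log(n / t) - n

def glr_change_statistic(n1, t1, n2, t2):
    """Generalized likelihood ratio for a rate change between two count windows."""
    l_split = poisson_loglik(n1, t1) + poisson_loglik(n2, t2)
    l_joint = poisson_loglik(n1 + n2, t1 + t2)
    return 2.0 * (l_split - l_joint)

# Illustrative use: the count rate of a detector doubles halfway through.
rng = np.random.default_rng(3)
window = 10.0                            # seconds per window (hypothetical)
n_before = rng.poisson(100.0 * window)   # ~100 counts/s
n_after = rng.poisson(200.0 * window)    # ~200 counts/s

rate_before = n_before / window          # maximum-likelihood rate estimates
rate_after = n_after / window
glr = glr_change_statistic(n_before, window, n_after, window)

print(f"estimated rates: {rate_before:.1f} -> {rate_after:.1f} counts/s")
print(f"GLR statistic = {glr:.1f} (large values indicate an abrupt change)")
```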

  17. On the Complexity of Chinese Sentences in Singapore Primary Textbooks

    Goh Saye Wee

    2016-12-01

    This paper uses sentences from Singapore primary school Chinese textbooks as research material and, taking the sentence as the unit of analysis, analyses sentence composition and sentence patterns in terms of quantity, distribution, characteristics and semantic type to examine the progression of sentence complexity in Chinese. The paper describes how sentences develop systematically in complexity across textbooks of various levels. It suggests 7 grades of sentence pattern in terms of complexity progression and proposes a formula to calculate the complexity index of a sentence. The findings provide a set of valuable data for characterizing the complexity of a sentence and for discussing the factors influencing the complexity of sentences used in primary school Chinese textbooks.

  18. A case for the sentence in reading comprehension.

    Scott, Cheryl M

    2009-04-01

    This article addresses sentence comprehension as a requirement of reading comprehension within the framework of the narrow view of reading that was advocated in the prologue to this forum. The focus is on the comprehension requirements of complex sentences, which are characteristic of school texts. Topics included in this discussion are (a) evidence linking sentence comprehension and syntax with reading, (b) syntactic properties of sentences that make them difficult to understand, (c) clinical applications for the assessment of sentence comprehension as it relates to reading, and (d) evidence and methods for addressing sentence complexity in treatment. Sentence complexity can create comprehension problems for struggling readers. The contribution of sentence comprehension to successful reading has been overlooked in models that emphasize domain-general comprehension strategies at the text level. The author calls for the evaluation of sentence comprehension within the context of content domains where complex sentences are found.

  19. Large-Deviation Results for Discriminant Statistics of Gaussian Locally Stationary Processes

    Junichi Hirukawa

    2012-01-01

    This paper discusses the large-deviation principle of discriminant statistics for Gaussian locally stationary processes. First, large-deviation theorems for quadratic forms and the log-likelihood ratio for a Gaussian locally stationary process with a mean function are proved. Their asymptotics are described by the large-deviation rate functions. Second, we consider situations where the processes are misspecified as stationary. In these misspecified cases, we formally construct the log-likelihood ratio discriminant statistics and derive large-deviation theorems for them. Since these are complicated, they are evaluated and illustrated by numerical examples. We find that misspecifying the process as stationary seriously affects the discrimination.

  20. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
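
    An automated interpreter of this kind ultimately encodes run rules. The sketch below implements two classic Western Electric-style rules on synthetic data; it is not the AISC prototype, and real systems encode many more patterns.

```python
import numpy as np

# The hybrid system itself is not reproduced here; this is just a sketch of
# two classic run rules (of the Western Electric type) that an automated
# control-chart interpreter would encode.  The data are synthetic.
def rule_beyond_3sigma(x, centre, sigma):
    """Rule: any point beyond three sigma from the centre line."""
    return [i for i, v in enumerate(x) if abs(v - centre) > 3 * sigma]

def rule_run_of_eight(x, centre):
    """Rule: eight or more consecutive points on the same side of the centre line."""
    flags, run_start, side = [], 0, 0
    for i, v in enumerate(x):
        s = 1 if v > centre else -1
        if s != side:
            run_start, side = i, s
        if i - run_start + 1 >= 8:
            flags.append(i)
    return flags

rng = np.random.default_rng(5)
data = rng.normal(10.0, 1.0, size=30)
data[15:] += 2.5          # simulate a sustained upward shift of the process

print("3-sigma violations at indices:", rule_beyond_3sigma(data, centre=10.0, sigma=1.0))
print("run-of-eight signals at indices:", rule_run_of_eight(data, centre=10.0))
```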

  1. Statistical modeling in phenomenological description of electromagnetic cascade processes produced by high-energy gamma quanta

    Slowinski, B.

    1987-01-01

    A description of a simple phenomenological model of electromagnetic cascade process (ECP) initiated by high-energy gamma quanta in heavy absorbents is given. Within this model spatial structure and fluctuations of ionization losses of shower electrons and positrons are described. Concrete formulae have been obtained as a result of statistical analysis of experimental data from the xenon bubble chamber of ITEP (Moscow)

  2. Learning Curves and Bootstrap Estimates for Inference with Gaussian Processes: A Statistical Mechanics Study

    Malzahn, Dorthe; Opper, Manfred

    2003-01-01

    We employ the replica method of statistical physics to study the average case performance of learning systems. The new feature of our theory is that general distributions of data can be treated, which enables applications to real data. For a class of Bayesian prediction models which are based on Gaussian processes, we discuss Bootstrap estimates for learning curves.

  3. Spectral deformation techniques applied to the study of quantum statistical irreversible processes

    Courbage, M.

    1978-01-01

    A procedure of analytic continuation of the resolvent of Liouville operators for quantum statistical systems is discussed. When applied to the theory of irreversible processes of the Brussels School, this method supports the idea that the restriction to a class of initial conditions is necessary to obtain an irreversible behaviour. The general results are tested on the Friedrichs model. (Auth.)

  4. Statistical Language Models and Information Retrieval: Natural Language Processing Really Meets Retrieval

    Hiemstra, Djoerd; de Jong, Franciska M.G.

    2001-01-01

    Traditionally, natural language processing techniques for information retrieval have always been studied outside the framework of formal models of information retrieval. In this article, we introduce a new formal model of information retrieval based on the application of statistical language models.
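
    The basic idea of scoring documents with statistical language models can be sketched in a few lines. The toy corpus, the query, and the Jelinek-Mercer smoothing weight below are illustrative and do not reproduce the authors' model.

```python
import math
from collections import Counter

# Toy sketch of language-model retrieval in the spirit described above (not
# the authors' exact formulation): each document is scored by the probability
# that its unigram language model, smoothed with the collection model
# (Jelinek-Mercer interpolation), generates the query.
docs = {
    "d1": "statistical language models for information retrieval".split(),
    "d2": "natural language processing of sentences".split(),
    "d3": "statistical process control for manufacturing".split(),
}
collection = [w for words in docs.values() for w in words]
coll_counts, coll_len = Counter(collection), len(collection)

def score(query, doc_words, lam=0.5):
    """log P(query | doc) with Jelinek-Mercer smoothing parameter lam."""
    counts, length = Counter(doc_words), len(doc_words)
    logp = 0.0
    for term in query:
        p_doc = counts[term] / length
        p_coll = coll_counts[term] / coll_len
        logp += math.log(lam * p_doc + (1 - lam) * p_coll + 1e-12)
    return logp

query = "statistical language retrieval".split()
for name, words in sorted(docs.items(), key=lambda kv: -score(query, kv[1])):
    print(f"{name}: {score(query, words):.3f}")
```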

  5. Eliciting and Developing Teachers' Conceptions of Random Processes in a Probability and Statistics Course

    Smith, Toni M.; Hjalmarson, Margret A.

    2013-01-01

    The purpose of this study is to examine prospective mathematics specialists' engagement in an instructional sequence designed to elicit and develop their understandings of random processes. The study was conducted with two different sections of a probability and statistics course for K-8 teachers. Thirty-two teachers participated. Video analyses…

  6. Improving Usage Statistics Processing for a Library Consortium: The Virtual Library of Virginia's Experience

    Matthews, Tansy E.

    2009-01-01

    This article describes the development of the Virtual Library of Virginia (VIVA). The VIVA statistics-processing system remains a work in progress. Member libraries will benefit from the ability to obtain the actual data from the VIVA site, rather than just the summaries, so a project to make these data available is currently being planned. The…

  7. Reducing lumber thickness variation using real-time statistical process control

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  8. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  9. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  10. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  11. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  12. Making Women Count: Gender-Typing, Technology and Path Dependencies in Dutch Statistical Data Processing

    van den Ende, Jan; van Oost, Elizabeth C.J.

    2001-01-01

    This article is a longitudinal analysis of the relation between gendered labour divisions and new data processing technologies at the Dutch Central Bureau of Statistics (CBS). Following social-constructivist and evolutionary economic approaches, the authors hold that the relation between technology

  13. Study of film data processing systems by means of a statistical simulation

    Deart, A.F.; Gromov, A.I.; Kapustinskaya, V.I.; Okorochenko, G.E.; Sychev, A.Yu.; Tatsij, L.I.

    1974-01-01

    A statistical model of the film information processing system is considered. The given time diagrams illustrate the model's operation algorithm. The program realizing this model of the system is described in detail. The developed program model has been tested on the film information processing system, which consists of a group of measuring devices operating on-line with a BESM computer. The quantitative characteristics obtained for the functioning of the system under test make it possible to estimate the operating efficiency of the system.

  14. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored thanks to statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for the absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, coming from industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, have been established and have revealed a special cause that was introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to
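
    The capability study mentioned in the abstract boils down to comparing process spread with tolerance limits. The sketch below computes Cp and Cpk for synthetic dose-difference data with hypothetical ±5% tolerances; the clinic's actual tolerances and acceptance threshold may differ.

```python
import numpy as np

# Sketch of the capability indices used in such a study.  The "dose
# difference" data and the +/-5 % tolerance limits below are illustrative
# values, not those of the clinic.
rng = np.random.default_rng(11)
dose_diff = rng.normal(0.5, 1.2, size=100)   # measured minus planned dose, in %

lsl, usl = -5.0, 5.0                          # hypothetical tolerance limits
mu, sigma = dose_diff.mean(), dose_diff.std(ddof=1)

cp = (usl - lsl) / (6 * sigma)                    # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)       # capability accounting for centring

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
print("process judged capable" if cpk >= 1.33 else "process not capable")   # 1.33 is a common benchmark
```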

  15. Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry

    Villani, N.; Noel, A.; Villani, N.; Gerard, K.; Marchesi, V.; Huger, S.; Noel, A.; Francois, P.

    2010-01-01

    Purpose: The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (I.M.R.T.) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored thanks to statistical process control tools). The second objective was to state whether it is possible to substitute portal dosimetry for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. Patients and methods: At the Alexis-Vautrin center, pretreatment quality controls in I.M.R.T. for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for the absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, coming from industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Results: Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, have been established and have revealed a special cause that was introduced (a manual shift of the leaf gap of the multi-leaf collimator). The correlation between the dose measured at one point with the E.P.I.D. and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. Conclusion: The study allowed to

  16. Relatedness of content and sentence formation in Japanese

    Andrej Bekeš

    1993-12-01

    Leech (1983: 63-70) distinguishes two kinds of pragmatics, interpersonal pragmatics and textual pragmatics. Our article is concerned with textual pragmatics, specifically with the textual motivations behind a format such as a sentence in Japanese. Studying spontaneous spoken discourse, Chafe (1980) proposed two units of spoken discourse on the basis of phonetic and intonational criteria, i.e. the "idea unit" and the "intonation sentence". He finds justification for both units in cognitive processes as follows. Idea units, most often verbalized as clauses, are the linguistic expression of cognitive units that Chafe calls "foci of consciousness". A focus of consciousness is a chunk of information small enough to be processed and verbalized in one step. Next, an intonation sentence, consisting usually of several idea units (or sometimes just one), is the verbal expression of a larger cognitive unit, the "center of interest", a chunk of information too large to be verbalized in one step. Concerning the center of interest, Chafe puts forward the following hypothesis.

  17. Context updating during sentence comprehension: the effect of aboutness topic.

    Burmester, Juliane; Spalek, Katharina; Wartenburger, Isabell

    2014-10-01

    To communicate efficiently, speakers typically link their utterances to the discourse environment and adapt their utterances to the listener's discourse representation. Information structure describes how linguistic information is packaged within a discourse to optimize information transfer. The present study investigates the nature and time course of context integration (i.e., aboutness topic vs. neutral context) on the comprehension of German declarative sentences with either subject-before-object (SO) or object-before-subject (OS) word order using offline comprehensibility judgments and online event-related potentials (ERPs). Comprehensibility judgments revealed that the topic context selectively facilitated comprehension of stories containing OS (i.e., non-canonical) sentences. In the ERPs, the topic context effect was reflected in a less pronounced late positivity at the sentence-initial object. In line with the Syntax-Discourse Model, we argue that these context-induced effects are attributable to reduced processing costs for updating the current discourse model. The results support recent approaches of neurocognitive models of discourse processing.

  18. Statistical process control applied to the liquid-fed ceramic melter process

    Pulsipher, B.A.; Kuhn, W.L.

    1987-09-01

    In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated by using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of uncorrelated observations over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site. These data directly affect all variance estimates used to develop control charts. 64 refs., 3 figs., 2 tabs
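
    The report's own adjustment for correlated batches is not reproduced here; purely as a generic illustration of one common remedy, the sketch below fits a first-order autoregressive model to simulated feed-composition values and places individuals-chart limits on the residuals, which are approximately uncorrelated when the AR(1) description is adequate. All numbers are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulated batch compositions with carry-over ("heel") correlation
      n, phi = 60, 0.6
      x = np.empty(n)
      x[0] = 10.0
      for t in range(1, n):
          x[t] = 10.0 + phi * (x[t - 1] - 10.0) + rng.normal(0, 0.2)

      # Estimate the AR(1) coefficient by least squares and chart the residuals
      xc = x - x.mean()
      phi_hat = np.dot(xc[1:], xc[:-1]) / np.dot(xc[:-1], xc[:-1])
      resid = x[1:] - x.mean() - phi_hat * (x[:-1] - x.mean())

      centre = resid.mean()
      sigma = resid.std(ddof=1)
      ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
      out = np.where((resid > ucl) | (resid < lcl))[0]
      print(f"phi_hat = {phi_hat:.2f}, {len(out)} residuals outside the 3-sigma limits")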

  19. Effective application of statistical process control (SPC) on the lengthwise tonsure rolled plates process

    Noskievičová, Darja; Kucharczyk, Radim

    2012-01-01

    This paper deals with the effective application of SPC to the lengthwise tonsure rolled plates process on double-side scissors. After an explanation of the SPC fundamentals, goals, and mistakes made during SPC implementation, the methodical framework for effective SPC application is defined. In the next part of the paper, the practical application of SPC is described and analysed from the point of view of this framework. This article describes the effective appli...

  20. Spatial statistics of pitting corrosion patterning: Quadrat counts and the non-homogeneous Poisson process

    Lopez de la Cruz, J.; Gutierrez, M.A.

    2008-01-01

    This paper presents a stochastic analysis of spatial point patterns as an effect of localized pitting corrosion. The Quadrat Counts method is studied with two empirical pit patterns. The results are dependent on the quadrat size, and bias is introduced when empty quadrats are included in the analysis. The spatially inhomogeneous Poisson process is used to improve the performance of the Quadrat Counts method. This approach combines Quadrat Counts with distance-based statistics in the analysis of pit patterns. The Inter-Event and the Nearest-Neighbour statistics are implemented here in order to compare their results. Further, the treatment of patterns in irregular domains is discussed
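
    As a minimal sketch of the Quadrat Counts idea (not the paper's implementation), the code below bins simulated pit coordinates into a grid and compares the counts with the constant intensity expected under a homogeneous Poisson process using a chi-square statistic; the grid size and the point pattern are assumptions.

      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(0)
      # Hypothetical pit coordinates on a unit square (complete spatial randomness here)
      pts = rng.uniform(0, 1, size=(200, 2))

      k = 4                                    # k x k quadrats (the choice of k biases the test)
      counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=k, range=[[0, 1], [0, 1]])
      expected = pts.shape[0] / k**2           # equal intensity under a homogeneous Poisson process

      # Index-of-dispersion / chi-square statistic on the quadrat counts
      stat = ((counts - expected) ** 2 / expected).sum()
      dof = k**2 - 1
      p_value = chi2.sf(stat, dof)
      print(f"chi2 = {stat:.1f}, df = {dof}, p = {p_value:.3f}")  # small p suggests clustering/inhomogeneity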

  1. Ideology, Social Threat, and the Death Sentence: Capital Sentences across Time and Space

    Jacobs, David; Carmichael, Jason T.

    2004-01-01

    Capital punishment is the most severe criminal penalty, yet we know little about the factors that produce jurisdictional differences in the use of the death sentence. Political explanations emphasize conservative values and the strength of more conservative political parties. Threat accounts suggest that this sentence will be more likely in…

  2. A Frequency-List of Sentence Structures: Distribution of Kernel Sentences

    Geens, Dirk

    1974-01-01

    A corpus of 10,000 sentences extracted from British theatrical texts was used to construct a frequency list of kernel sentence structures. Thirty-one charts illustrate the analyzed results. The procedures used and an interpretation of the frequencies are given. Such lists might aid foreign language teachers in course organization. Available from…

  3. Motor activation in literal and non literal sentences: does time matter?

    Cristina Cacciari

    2013-05-01

    Full Text Available Despite the impressive amount of evidence showing involvement of the sensorimotor systems in language processing, important questions remain unsolved, among them the relationship between non-literal uses of language and sensorimotor activation. The literature has not yet provided a univocal answer on whether the comprehension of non-literal, abstract motion sentences engages the same neural networks recruited for literal sentences. A previous TMS study using the same experimental materials as the present study showed activation for literal, fictive and metaphoric motion sentences but not for idiomatic ones. To evaluate whether this may depend on insufficient time for elaborating the idiomatic meaning, we conducted a behavioural experiment that used a sensibility judgment task performed by pressing a button either with a hand finger or with a foot. Motor activation is known to be sensitive to the action-congruency of the effector used for responding. Therefore, all other things being equal, significant differences between responses emitted with an action-congruent or incongruent effector (foot vs. hand) may be attributed to motor activation. Foot-related action verbs were embedded in sentences conveying literal motion, fictive motion, metaphoric motion or idiomatic motion. Mental sentences were employed as a control condition. Foot responses were significantly faster than finger responses, but only in literal motion sentences. We hypothesize that motor activation may arise in early phases of the comprehension process (i.e., upon reading the verb) and then decay as a function of the strength of the semantic motion component of the verb.

  4. DEVELOPMENT OF A METHOD FOR STATISTICAL ANALYSIS OF THE ACCURACY AND STABILITY OF THE EPOXY RESIN ED-20 PRODUCTION PROCESS

    N. V. Zhelninskaya

    2015-01-01

    Full Text Available Statistical methods play an important role in the objective evaluation of the quantitative and qualitative characteristics of a process and are one of the most important elements of the production quality assurance system and the total quality management process. To produce a quality product, one must know the real accuracy of the existing equipment, determine whether a selected technological process can deliver the specified product accuracy, and assess process stability. Most random events, particularly in manufacturing and scientific research, are characterized by a large number of random factors and are described by a normal distribution, which is the principal distribution in many practical studies. Modern statistical methods are quite difficult to grasp and to use widely in practice without in-depth mathematical training of all participants in the process. When the distribution of a random variable is known, all characteristics of a batch of products can be obtained, and the mean value and the variance can be determined. Statistical control and quality control methods were used in the analysis of the accuracy and stability of the technological process of production of epoxy resin ED-20. The numerical characteristics of the distribution law of the controlled parameters were estimated, and the percentage of defects of the investigated products was determined. To assess the stability of the manufacturing process of epoxy resin ED-20, Shewhart control charts for quantitative data were selected, namely charts of individual values (X) and moving range (R). Pareto charts were used to identify the causes that affect low dynamic viscosity to the largest extent. The causes of defects responsible for low values of dynamic viscosity were analysed using Ishikawa diagrams, which show the most typical factors behind the variability of the process results. To resolve the problem, it is recommended to modify the polymer composition with carbon fullerenes and to use the developed method for the production of

  5. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, in which 25,757 patients were reported and 6,798 patients met the inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and the centre line therefore underwent inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  6. iSentenizer-μ: multilingual sentence boundary detection model.

    Wong, Derek F; Chao, Lidia S; Zeng, Xiaodong

    2014-01-01

    A sentence boundary detection (SBD) system is normally quite sensitive to the genres of data that the system is trained on, where genre refers to shifts of text topic and to new language domains. Although new detection models can be retrained for different languages or new text genres, the previous model has to be discarded and the creation process restarted from scratch. In this paper, we present a multilingual sentence boundary detection system (iSentenizer-μ) for the Danish, German, English, Spanish, Dutch, French, Italian, Portuguese, Greek, Finnish, and Swedish languages. The proposed system is able to detect the sentence boundaries of a mixture of different text genres and languages with high accuracy. We employ the i+Learning algorithm, an incremental tree learning architecture, for constructing the system. iSentenizer-μ, under the incremental learning framework, is adaptable to text of different topics and to Roman-alphabet languages by merging new data into the existing model, learning the new knowledge incrementally by revision instead of retraining. The system has been extensively evaluated on different languages and text genres and has been compared against two state-of-the-art SBD systems, Punkt and MaxEnt. The experimental results show that the proposed system outperforms the other systems on all datasets.

  7. iSentenizer-μ: Multilingual Sentence Boundary Detection Model

    Derek F. Wong

    2014-01-01

    Full Text Available A sentence boundary detection (SBD) system is normally quite sensitive to the genres of data that the system is trained on, where genre refers to shifts of text topic and to new language domains. Although new detection models can be retrained for different languages or new text genres, the previous model has to be discarded and the creation process restarted from scratch. In this paper, we present a multilingual sentence boundary detection system (iSentenizer-μ) for the Danish, German, English, Spanish, Dutch, French, Italian, Portuguese, Greek, Finnish, and Swedish languages. The proposed system is able to detect the sentence boundaries of a mixture of different text genres and languages with high accuracy. We employ the i+Learning algorithm, an incremental tree learning architecture, for constructing the system. iSentenizer-μ, under the incremental learning framework, is adaptable to text of different topics and to Roman-alphabet languages by merging new data into the existing model, learning the new knowledge incrementally by revision instead of retraining. The system has been extensively evaluated on different languages and text genres and has been compared against two state-of-the-art SBD systems, Punkt and MaxEnt. The experimental results show that the proposed system outperforms the other systems on all datasets.

  8. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    Al-Hussein, Fahad A

    2009-01-01

    To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and to reduce the variations in prescription processing in primary health care. A series of audits was done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by the failure to record patient's weight (16.4%), pharmacist's name (14.3%), duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months, this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
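
    The p-chart described in the abstract can be outlined in a few lines; the audit counts below are invented (30 prescriptions per fortnightly audit), and the construction shown is the textbook 3-sigma p-chart rather than the authors' own software.

      import numpy as np

      n = 30                                            # prescriptions sampled per audit
      nonconforming = np.array([30, 27, 24, 20, 18, 15, 13, 12, 12, 11, 10, 12])  # hypothetical counts
      p = nonconforming / n

      p_bar = p.mean()                                  # centre line from the pooled data
      sigma_p = np.sqrt(p_bar * (1 - p_bar) / n)
      ucl = min(1.0, p_bar + 3 * sigma_p)
      lcl = max(0.0, p_bar - 3 * sigma_p)

      for i, pi in enumerate(p, start=1):
          flag = "out of control" if (pi > ucl or pi < lcl) else ""
          print(f"audit {i:2d}: p = {pi:.2f} {flag}")
      print(f"centre = {p_bar:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")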

  9. Statistical trajectory of an approximate EM algorithm for probabilistic image processing

    Tanaka, Kazuyuki; Titterington, D M

    2007-01-01

    We calculate analytically a statistical average of trajectories of an approximate expectation-maximization (EM) algorithm with generalized belief propagation (GBP) and a Gaussian graphical model for the estimation of hyperparameters from observable data in probabilistic image processing. A statistical average with respect to observed data corresponds to a configuration average for the random-field Ising model in spin glass theory. In the present paper, hyperparameters which correspond to interactions and external fields of spin systems are estimated by an approximate EM algorithm. A practical algorithm is described for gray-level image restoration based on a Gaussian graphical model and GBP. The GBP approach corresponds to the cluster variation method in statistical mechanics. Our main result in the present paper is to obtain the statistical average of the trajectory in the approximate EM algorithm by using loopy belief propagation and GBP with respect to degraded images generated from a probability density function with true values of hyperparameters. The statistical average of the trajectory can be expressed in terms of recursion formulas derived from some analytical calculations

  10. A flexible statistics web processing service--added value for information systems for experiment data.

    Heimann, Dennis; Nieschulze, Jens; König-Ries, Birgitta

    2010-04-20

    Data management in the life sciences has evolved from simple storage of data to complex information systems providing additional functionalities like analysis and visualization capabilities, demanding the integration of statistical tools. In many cases the used statistical tools are hard-coded within the system. That leads to an expensive integration, substitution, or extension of tools because all changes have to be done in program code. Other systems are using generic solutions for tool integration but adapting them to another system is mostly rather extensive work. This paper shows a way to provide statistical functionality over a statistics web service, which can be easily integrated in any information system and set up using XML configuration files. The statistical functionality is extendable by simply adding the description of a new application to a configuration file. The service architecture as well as the data exchange process between client and service and the adding of analysis applications to the underlying service provider are described. Furthermore a practical example demonstrates the functionality of the service.

  11. Automation in Siemens fuel manufacturing - the basis for quality improvement by statistical process control (SPC)

    Drecker, St.; Hoff, A.; Dietrich, M.; Guldner, R.

    1999-01-01

    Statistical Process Control (SPC) is one of the systematic tools that make a valuable contribution to the control and planning of manufacturing processes and product quality. Advanced Nuclear Fuels GmbH (ANF) started a program to introduce SPC in all sections of the manufacturing process of fuel assemblies. The concept phase is based on a realization of SPC in 3 pilot projects. The existing manufacturing devices were reviewed for the utilization of SPC, and subsequent modifications were made to provide the necessary interfaces. The processes 'powder/pellet manufacturing', 'cladding tube manufacturing' and 'laser-welding of spacers' are located at the different sites of ANF. Due to the completion of the first steps and the experience obtained in the pilot projects, the introduction program for SPC has already been extended to other manufacturing processes. (authors)

  12. ANALYSIS OF OIL LOSSES IN CRUDE PALM OIL (CPO) USING THE STATISTICAL PROCESS CONTROL METHOD

    Vera Devani

    2014-06-01

    Full Text Available PKS "XYZ" is a company engaged in palm oil processing. Its products are Crude Palm Oil (CPO) and Palm Kernel Oil (PKO). The aim of this study was to analyse oil losses and their contributing factors using the Statistical Process Control method. Statistical Process Control is a set of strategies, techniques, and actions taken by an organization to ensure that it delivers quality products or provides quality services. The CPO oil-loss samples examined were empty fruit bunches (tankos), nuts, fibre, and final sludge. Based on the I-MR control charts, it can be concluded that all four types of CPO oil losses are within the control limits and consistent. The Cpk value of the total oil losses, however, lies outside the control limits of the process mean; this means that the CPO produced has met customer requirements, with total oil losses below the maximum limit of 1.65% set by the company.
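
    The I-MR (individuals and moving range) chart used in the study can be sketched as follows; the oil-loss figures are invented, and the constants 2.66 and 3.267 are the standard factors for a moving range of two consecutive observations.

      import numpy as np

      # Hypothetical daily total oil losses (%) in CPO production
      loss = np.array([1.42, 1.51, 1.38, 1.47, 1.55, 1.44, 1.40, 1.49, 1.46, 1.52])

      mr = np.abs(np.diff(loss))            # moving ranges of consecutive observations
      x_bar, mr_bar = loss.mean(), mr.mean()

      # Standard I-MR control limits (moving range of size 2: E2 = 2.66, D4 = 3.267)
      i_ucl, i_lcl = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
      mr_ucl = 3.267 * mr_bar

      print(f"Individuals chart: centre {x_bar:.2f}, limits [{i_lcl:.2f}, {i_ucl:.2f}]")
      print(f"Moving range chart: centre {mr_bar:.3f}, UCL {mr_ucl:.3f}")
      print("out-of-control points:", np.where((loss > i_ucl) | (loss < i_lcl))[0])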

  13. Morphology of Laplacian growth processes and statistics of equivalent many-body systems

    Blumenfeld, R.

    1994-01-01

    The author proposes a theory for the nonlinear evolution of two dimensional interfaces in Laplacian fields. The growing region is conformally mapped onto the unit disk, generating an equivalent many-body system whose dynamics and statistics are studied. The process is shown to be Hamiltonian, with the Hamiltonian being the imaginary part of the complex electrostatic potential. Surface effects are introduced through the Hamiltonian as an external field. An extension to a continuous density of particles is presented. The results are used to study the morphology of the interface using statistical mechanics for the many-body system. The distribution of the curvature and the moments of the growth probability along the interface are calculated exactly from the distribution of the particles. In the dilute limit, the distribution of the curvature is shown to develop algebraic tails, which may, for the first time, explain the origin of fractality in diffusion controlled processes

  14. Application of Statistical Process Control (SPC) in its Quality Control

    Carlos Hernández-Pedrera

    2015-12-01

    Full Text Available The overall objective of this paper is to use SPC to assess the possibility of improving the process of obtaining a sanitary device. The specific objectives were to identify the variables to be analysed and brought under statistical process control (SPC), to analyse possible errors and variations indicated by the control charts, and to evaluate and compare the results achieved with SPC before and after direct monitoring on the production line. Sampling methods and laboratory replacement were used to determine the quality of the finished product, and statistical methods were then applied to emphasize the importance and contribution of their application to monitoring corrective actions and supporting production processes. It was shown that the process is under control because the results were found to be within the established control limits. There is, however, a tendency for the distribution to be displaced toward one end of the limits and to exceed them, creating the possibility that under certain conditions the process goes out of control; the results also showed that, although the process is within the quality control limits, it is operating far from optimal conditions. In none of the study situations were products obtained outside the weight and discoloration limits, but defective products were obtained.

  15. Review of the patient positioning reproducibility in head-and-neck radiotherapy using Statistical Process Control.

    Moore, Sarah J; Herst, Patries M; Louwe, Robert J W

    2018-05-01

    A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on the process continuity was also evaluated using Hotelling's T2 and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
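
    As a generic sketch of the two monitoring statistics named above, Hotelling's T2 and the Q (SPE) residual, the code below fits a PCA model to simulated normal-operating data and evaluates both statistics for a new observation; the data, the scaling and the number of retained components are assumptions, not details of the ConsiGma study.

      import numpy as np

      rng = np.random.default_rng(2)
      X_noc = rng.normal(size=(200, 35))                 # simulated normal operating data (35 variables)
      mu, sd = X_noc.mean(axis=0), X_noc.std(axis=0, ddof=1)
      Xs = (X_noc - mu) / sd

      # PCA via SVD, retaining a components
      U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
      a = 5
      P = Vt[:a].T                                       # loadings (35 x a)
      lam = (S[:a] ** 2) / (Xs.shape[0] - 1)             # variances of the retained scores

      def t2_and_q(x_new):
          """Hotelling's T2 and Q (SPE) for one new observation."""
          z = (x_new - mu) / sd
          t = z @ P                                      # scores
          t2 = np.sum(t**2 / lam)
          resid = z - t @ P.T                            # part not explained by the PCA model
          q = resid @ resid
          return t2, q

      t2, q = t2_and_q(X_noc[0] + 0.5)                   # slightly perturbed observation as a toy test
      print(f"T2 = {t2:.2f}, Q = {q:.2f}")               # compare against limits derived from the NOC runs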

  17. Optimization Model for Uncertain Statistics Based on an Analytic Hierarchy Process

    Yongchao Hou

    2014-01-01

    Full Text Available Uncertain statistics is a methodology for collecting and interpreting the expert's experimental data by uncertainty theory. In order to estimate uncertainty distributions, an optimization model based on the analytic hierarchy process (AHP) and an interpolation method is proposed in this paper. In addition, the principle of the least squares method is presented to estimate uncertainty distributions with known functional form. Finally, the effectiveness of this method is illustrated by an example.

  18. THE CHILD JUSTICE ACT: PROCEDURAL SENTENCING ISSUES

    Stephan S Terblanche

    2013-04-01

    Full Text Available In this contribution a number of procedural issues related to the sentencing of child offenders and emanating from the Child Justice Act 75 of 2008 are considered in some detail. As a general rule, the Act requires pre-sentence reports to be obtained from probation officers before sentencing any child offender, with only a limited number of exceptions. The article argues that the peremptory nature of the Act means that a probation report is always required, even if reports by other experts are also available. The exceptions are limited to instances other than those where the child offender is sentenced to any form of imprisonment or to residence in a care centre. The article addresses the question of whether or not the reference to imprisonment includes alternative imprisonment which is imposed only as an alternative to a fine. It suggests that alternative imprisonment should, generally, not be imposed on child offenders. When an exception is not prevented because of the sentence, a pre-sentence report may be dispensed with only when the offence is a schedule-1 offence (the least serious class of offences) or when obtaining a report would prejudice the child. It is argued that these exceptions are likely to occur rather rarely. A final aspect of the Act's provisions on pre-sentence reports is the requirement that reasons be given for a departure from the recommendations in a pre-sentence report. This requirement merely confirms the status quo. The Act permits the prosecutor to provide the court with a victim impact statement. Such a statement is defined in the Act. It is a sworn statement by a victim or someone authorised by the victim explaining the consequences to the victim of the commission of the crime. The article also addresses the issue of whether or not the child justice court might mero motu obtain a victim impact statement when the prosecution does not do so. Finally, the article addresses appeals against and reviews of the trial

  19. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  20. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
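
    A Pareto-style summary of the kind SProCoP produces can be approximated as below: rank quality control metrics by their relative variability across repeated injections of a QC standard. The metric names and values are invented, and the snippet is not the published R code.

      import numpy as np

      rng = np.random.default_rng(3)
      # Hypothetical values of four QC metrics over 20 injections of a QC standard
      metrics = {
          "retention_time_min": 30.0 + rng.normal(0, 0.15, 20),
          "peak_fwhm_s":        12.0 + rng.normal(0, 1.2, 20),
          "peak_asymmetry":      1.1 + rng.normal(0, 0.05, 20),
          "precursor_area":      1e7 * (1 + rng.normal(0, 0.25, 20)),
      }

      # Pareto-style ranking of metrics by relative variability (coefficient of variation)
      cv = {name: np.std(vals, ddof=1) / np.mean(vals) for name, vals in metrics.items()}
      total = sum(cv.values())
      cumulative = 0.0
      for name, value in sorted(cv.items(), key=lambda kv: kv[1], reverse=True):
          cumulative += value
          print(f"{name:18s} CV = {100 * value:5.1f} %   cumulative share = {100 * cumulative / total:5.1f} %")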

  1. Tungsten Ions in Plasmas: Statistical Theory of Radiative-Collisional Processes

    Alexander V. Demura

    2015-05-01

    Full Text Available A statistical model for calculations of the collisional-radiative processes in plasmas with a tungsten impurity was developed. The electron structure of tungsten multielectron ions is considered in terms of both the Thomas-Fermi model and the Brandt-Lundquist model of collective oscillations of atomic electron density. The excitation or ionization of atomic electrons by plasma electron impacts is represented as photo-processes under the action of a flux of equivalent photons, as introduced by E. Fermi. The total electron impact single ionization cross-sections of ions Wk+, with the respective rates, have been calculated and compared with the available experimental and modeling data (e.g., CADW). Plasma radiative losses on the tungsten impurity were also calculated over a wide range of electron temperatures, 1 eV–20 keV. The numerical code TFATOM was developed for calculations of radiative-collisional processes involving tungsten ions. The computational resources needed for the TFATOM code are orders of magnitude less than for other conventional numerical codes. The transition from the corona to the Boltzmann limit was investigated in detail. The results of the statistical approach have been tested by comparison with the extensive experimental and conventional-code data for a set of ions Wk+. It is shown that the accuracy of the universal statistical model for the ionization cross-sections and radiation losses is within the data scatter of significantly more complex quantum numerical codes, which use different approximations for the calculation of atomic structure and electronic cross-sections.

  2. Word Order and Voice Influence the Timing of Verb Planning in German Sentence Production

    Sebastian Sauppe

    2017-09-01

    Full Text Available Theories of incremental sentence production make different assumptions about when speakers encode information about described events and when verbs are selected, accordingly. An eye tracking experiment on German testing the predictions from linear and hierarchical incrementality about the timing of event encoding and verb planning is reported. In the experiment, participants described depictions of two-participant events with sentences that differed in voice and word order. Verb-medial active sentences and actives and passives with sentence-final verbs were compared. Linear incrementality predicts that sentences with verbs placed early differ from verb-final sentences because verbs are assumed to only be planned shortly before they are articulated. By contrast, hierarchical incrementality assumes that speakers start planning with relational encoding of the event. A weak version of hierarchical incrementality assumes that only the action is encoded at the outset of formulation and selection of lexical verbs only occurs shortly before they are articulated, leading to the prediction of different fixation patterns for verb-medial and verb-final sentences. A strong version of hierarchical incrementality predicts no differences between verb-medial and verb-final sentences because it assumes that verbs are always lexically selected early in the formulation process. Based on growth curve analyses of fixations to agent and patient characters in the described pictures, and the influence of character humanness and the lack of an influence of the visual salience of characters on speakers' choice of active or passive voice, the current results suggest that while verb planning does not necessarily occur early during formulation, speakers of German always create an event representation early.

  3. Word Order and Voice Influence the Timing of Verb Planning in German Sentence Production.

    Sauppe, Sebastian

    2017-01-01

    Theories of incremental sentence production make different assumptions about when speakers encode information about described events and when verbs are selected, accordingly. An eye tracking experiment on German testing the predictions from linear and hierarchical incrementality about the timing of event encoding and verb planning is reported. In the experiment, participants described depictions of two-participant events with sentences that differed in voice and word order. Verb-medial active sentences and actives and passives with sentence-final verbs were compared. Linear incrementality predicts that sentences with verbs placed early differ from verb-final sentences because verbs are assumed to only be planned shortly before they are articulated. By contrast, hierarchical incrementality assumes that speakers start planning with relational encoding of the event. A weak version of hierarchical incrementality assumes that only the action is encoded at the outset of formulation and selection of lexical verbs only occurs shortly before they are articulated, leading to the prediction of different fixation patterns for verb-medial and verb-final sentences. A strong version of hierarchical incrementality predicts no differences between verb-medial and verb-final sentences because it assumes that verbs are always lexically selected early in the formulation process. Based on growth curve analyses of fixations to agent and patient characters in the described pictures, and the influence of character humanness and the lack of an influence of the visual salience of characters on speakers' choice of active or passive voice, the current results suggest that while verb planning does not necessarily occur early during formulation, speakers of German always create an event representation early.

  4. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal

  5. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. The suspended sentence in French Criminal Law

    Jovašević Dragan

    2016-01-01

    Full Text Available From ancient times until today, criminal law has provided different criminal sanctions as measures of social control. These coercive measures are imposed on the criminal offender by the competent court and are aimed at limiting the offender's rights and freedoms or depriving the offender of certain rights and freedoms. These sanctions are applied to natural or legal persons who violate the norms of the legal order and injure or endanger other legal goods that enjoy legal protection. In order to protect social values effectively, criminal legislation in all countries provides for a number of criminal sanctions. These are: 1) imprisonment, 2) precautions, 3) safety measures, 4) penalties for juveniles, and 5) sanctions for legal persons. Apart from and instead of punishment, warning measures have a significant role in jurisprudence. Since they emerged in the system of criminal sanctions in the early 20th century, there has been an increase in their application to criminal offenders, especially first-time offenders who committed a negligent or accidental criminal act. Warnings are applied in the case of crimes that do not have serious consequences and whose perpetrators are not hardened and incorrigible criminals. All contemporary criminal legislation (including the French legislation) provides for the warning measure of the suspended sentence. A suspended sentence is a conditional stay of execution of a sentence of imprisonment for a specified time, provided that the convicted person does not commit another criminal offense and fulfills other obligations. This sanction applies if the following two conditions are fulfilled: a) formal, which is attached to the sentence of imprisonment; and b) material, which is the court's assessment that the application of this sanction is justified and necessary in the particular case. In many modern criminal legislations, there are two different types of suspended (conditional) sentence: 1) ordinary (classical) suspended

  7. Process simulation and statistical approaches for validating waste form qualification models

    Kuhn, W.L.; Toland, M.R.; Pulsipher, B.A.

    1989-05-01

    This report describes recent progress toward one of the principal objectives of the Nuclear Waste Treatment Program (NWTP) at the Pacific Northwest Laboratory (PNL): to establish relationships between vitrification process control and glass product quality. During testing of a vitrification system, it is important to show that departures affecting the product quality can be sufficiently detected through process measurements to prevent an unacceptable canister from being produced. Meeting this goal is a practical definition of a successful sampling, data analysis, and process control strategy. A simulation model has been developed and preliminarily tested by applying it to approximate operation of the West Valley Demonstration Project (WVDP) vitrification system at West Valley, New York. Multivariate statistical techniques have been identified and described that can be applied to analyze large sets of process measurements. Information on components, tanks, and time is then combined to create a single statistic through which all of the information can be used at once to determine whether the process has shifted away from a normal condition

  8. Pierre Gy's sampling theory and sampling practice: heterogeneity, sampling correctness, and statistical process control

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  9. Auditory Magnetoencephalographic Frequency-Tagged Responses Mirror the Ongoing Segmentation Processes Underlying Statistical Learning.

    Farthouat, Juliane; Franco, Ana; Mary, Alison; Delpouve, Julie; Wens, Vincent; Op de Beeck, Marc; De Tiège, Xavier; Peigneux, Philippe

    2017-03-01

    Humans are highly sensitive to statistical regularities in their environment. This phenomenon, usually referred to as statistical learning, is most often assessed using post-learning behavioural measures that are limited by a lack of sensitivity and do not monitor the temporal dynamics of learning. In the present study, we used magnetoencephalographic frequency-tagged responses to investigate the neural sources and temporal development of the ongoing brain activity that supports the detection of regularities embedded in auditory streams. Participants passively listened to statistical streams in which tones were grouped as triplets, and to random streams in which tones were randomly presented. Results show that during exposure to statistical (vs. random) streams, tritone frequency-related responses reflecting the learning of regularities embedded in the stream increased in the left supplementary motor area and left posterior superior temporal sulcus (pSTS), whereas tone frequency-related responses decreased in the right angular gyrus and right pSTS. Tritone frequency-related responses rapidly developed to reach significance after 3 min of exposure. These results suggest that the incidental extraction of novel regularities is subtended by a gradual shift from rhythmic activity reflecting individual tone succession toward rhythmic activity synchronised with triplet presentation, and that these rhythmic processes are subtended by distinct neural sources.

  10. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts, p charts, displayed for all employees. Data extraction, archiving, entry, analysis, plotting and the design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. Audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year of process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.

  11. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after the process had been stabilized. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
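
    The in-control average run length (ARL0) mentioned in the fourth step has a simple closed form for a Shewhart chart with independent, normally distributed observations; the sketch below computes it for 3-sigma limits and checks it by simulation. This is a textbook illustration, not the authors' procedure.

      import numpy as np
      from scipy.stats import norm

      # Probability that an in-control, normalized observation falls outside +/- L sigma
      L = 3.0
      alpha = 2 * norm.sf(L)
      arl0 = 1 / alpha                       # geometric waiting time until a false alarm
      print(f"theoretical ARL0 at {L}-sigma limits: {arl0:.0f}")   # about 370

      # Quick simulation check
      rng = np.random.default_rng(4)
      run_lengths = []
      for _ in range(2000):
          t = 0
          while True:
              t += 1
              if abs(rng.normal()) > L:
                  run_lengths.append(t)
                  break
      print(f"simulated ARL0: {np.mean(run_lengths):.0f}")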

  12. The product composition control system at Savannah River: Statistical process control algorithm

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in an geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means which guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs

  13. Statistical Analysis of Deep Drilling Process Conditions Using Vibrations and Force Signals

    Syafiq Hazwan

    2016-01-01

    Full Text Available Cooling systems are a key point of the hot forming process for Ultra High Strength Steels (UHSS). Normally, cooling systems are made using a deep drilling technique. Although the deep twist drill offers higher productivity than other drilling techniques, its main problem is premature tool breakage, which affects production quality. In this paper, an analysis of deep twist drill process parameters such as cutting speed, feed rate and depth of cut, using statistical analysis to identify the tool condition, is presented. A comparison between two different tool geometries is also studied. Measured data from vibration and force sensors are analyzed through several statistical parameters such as root mean square (RMS), mean, kurtosis, standard deviation and skewness. The results show that kurtosis and skewness are the most appropriate parameters to represent the condition of the deep twist drill tool from the vibration and force data. The condition of the deep twist drill process was classified as good, blunt or fractured. It was also found that different tool geometry parameters affect the performance of the drill. The results of this study are believed to be useful in determining a suitable analysis method for developing an online tool condition monitoring system that identifies the tertiary tool life stage and helps to avoid tool fracture during the drilling process.
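
    The signal features named in the abstract (mean, RMS, standard deviation, skewness, kurtosis) can be extracted from a vibration or force record as sketched below; the synthetic signal, sampling rate and spike pattern merely stand in for real sensor data.

      import numpy as np
      from scipy.stats import kurtosis, skew

      rng = np.random.default_rng(5)
      fs, dur = 5000, 1.0                                   # assumed sampling rate (Hz) and duration (s)
      t = np.arange(0, dur, 1 / fs)
      # Synthetic vibration signal: tool-passing harmonic plus noise and a few impacts
      signal = 0.5 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 0.2, t.size)
      signal[::1000] += 3.0                                 # sparse spikes mimic incipient breakage

      features = {
          "mean": np.mean(signal),
          "rms": np.sqrt(np.mean(signal**2)),
          "std": np.std(signal, ddof=1),
          "skewness": skew(signal),
          "kurtosis": kurtosis(signal),                     # excess kurtosis; spikes drive it up
      }
      for name, value in features.items():
          print(f"{name:9s} = {value: .3f}")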

  14. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but is also to perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.

  15. An integrated model of statistical process control and maintenance based on the delayed monitoring

    Yin, Hui; Zhang, Guojun; Zhu, Haiping; Deng, Yuhao; He, Fei

    2015-01-01

    This paper develops an integrated model of statistical process control and maintenance decision-making. The proposed delayed monitoring policy postpones the sampling process until a scheduled time and gives rise to ten scenarios of the production process, in which equipment failure may occur in addition to a quality shift. Equipment failure and a control chart alert trigger corrective maintenance and predictive maintenance, respectively. The occurrence probability, the cycle time and the cycle cost of each scenario are obtained by integral calculation; a mathematical model is then established to minimize the expected cost by using a genetic algorithm. A Monte Carlo simulation experiment is conducted and compared with the integral calculation in order to validate the analysis of the ten-scenario model. Another ordinary integrated model without delayed monitoring is also established as a comparison. The results of a numerical example indicate satisfactory economic performance of the proposed model. Finally, a sensitivity analysis is performed to investigate the effect of the model parameters. - Highlights: • We develop an integrated model of statistical process control and maintenance. • We propose a delayed monitoring policy and derive an economic model with 10 scenarios. • We consider two deterioration mechanisms, quality shift and equipment failure. • The delayed monitoring policy will help reduce the expected cost

  16. Multivariate Statistical Process Control Charts and the Problem of Interpretation: A Short Overview and Some Applications in Industry

    Bersimis, Sotiris; Panaretos, John; Psarakis, Stelios

    2005-01-01

    Woodall and Montgomery [35], in a discussion paper, state that multivariate process control is one of the most rapidly developing sections of statistical process control. Nowadays, in industry, there are many situations in which the simultaneous monitoring or control of two or more related quality-process characteristics is necessary. Process monitoring problems in which several related variables are of interest are collectively known as Multivariate Statistical Process Control (MSPC). This ...

  17. On the structure of dynamic principal component analysis used in statistical process monitoring

    Vanhatalo, Erik; Kulahci, Murat; Bergquist, Bjarne

    2017-01-01

    When principal component analysis (PCA) is used for statistical process monitoring it relies on the assumption that data are time independent. However, industrial data will often exhibit serial correlation. Dynamic PCA (DPCA) has been suggested as a remedy for high-dimensional and time...... for determining the number of principal components to retain. The number of retained principal components is determined by visual inspection of the serial correlation in the squared prediction error statistic, Q (SPE), together with the cumulative explained variance of the model. The methods are illustrated using...... driven method to determine the maximum number of lags in DPCA with a foundation in multivariate time series analysis. The method is based on the behavior of the eigenvalues of the lagged autocorrelation and partial autocorrelation matrices. Given a specific lag structure we also propose a method...

  18. ZnO crystals obtained by electrodeposition: Statistical analysis of most important process variables

    Cembrero, Jesus; Busquets-Mataix, David

    2009-01-01

    In this paper a comparative study by means of a statistical analysis of the main process variables affecting ZnO crystal electrodeposition is presented. ZnO crystals were deposited on two different substrates, silicon wafer and indium tin oxide. The control variables were substrate type, electrolyte concentration, temperature, deposition time and current density. The morphologies of the different substrates were observed using scanning electron microscopy. The percentage of substrate area covered by ZnO deposit was calculated by computational image analysis. The design of the applied experiments was based on a two-level factorial analysis involving a series of 32 experiments and an analysis of variance. Statistical results reveal that the variables exerting a significant influence on the area covered by ZnO deposit are electrolyte concentration, substrate type and time of deposition, together with a combined two-factor interaction between temperature and current density. However, morphology is also influenced by the surface roughness of the substrates
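
    As a minimal sketch of how effects are estimated in a two-level factorial analysis of this kind, the snippet below computes main effects and one two-factor interaction from coded ±1 factor levels. The three factors, the coded runs and the coverage responses are invented for illustration; the actual study used five factors and 32 experiments.

```python
import numpy as np

# Coded (-1/+1) levels for three illustrative factors in a full 2^3 factorial:
# C = electrolyte concentration, T = temperature, J = current density.
# 'coverage' holds a hypothetical response (% of substrate covered by ZnO).
levels = np.array([[c, t, j] for c in (-1, 1) for t in (-1, 1) for j in (-1, 1)])
coverage = np.array([12, 35, 14, 38, 11, 33, 20, 55], dtype=float)  # illustrative data

def effect(contrast):
    """Average response at +1 minus average response at -1 for a contrast column."""
    return coverage[contrast == 1].mean() - coverage[contrast == -1].mean()

C, T, J = levels.T
print("main effect of concentration:", effect(C))
print("main effect of temperature:  ", effect(T))
print("main effect of current dens.:", effect(J))
print("T x J interaction effect:    ", effect(T * J))   # two-factor interaction contrast
```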

  19. Bootstrap-based confidence estimation in PCA and multivariate statistical process control

    Babamoradi, Hamid

    be used to detect outliers in the data since the outliers can distort the bootstrap estimates. Bootstrap-based confidence limits were suggested as alternative to the asymptotic limits for control charts and contribution plots in MSPC (Paper II). The results showed that in case of the Q-statistic......Traditional/Asymptotic confidence estimation has limited applicability since it needs statistical theories to estimate the confidences, which are not available for all indicators/parameters. Furthermore, in case the theories are available for a specific indicator/parameter, the theories are based....... The goal was to improve process monitoring by improving the quality of MSPC charts and contribution plots. Bootstrapping algorithm to build confidence limits was illustrated in a case study format (Paper I). The main steps in the algorithm were discussed where a set of sensible choices (plus...

  20. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcers prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system) were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
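
    The P charts referred to above are straightforward to construct; the sketch below computes a pooled centre line and variable-width 3-sigma limits for yearly prevalence proportions and flags years showing special-cause variation. The counts are invented for illustration and are not the Dutch survey data.

```python
import numpy as np

def p_chart_limits(cases, at_risk):
    """Per-period P-chart limits for a proportion, with variable subgroup sizes."""
    cases, at_risk = np.asarray(cases, float), np.asarray(at_risk, float)
    p = cases / at_risk                      # yearly prevalence proportions
    p_bar = cases.sum() / at_risk.sum()      # pooled centre line
    se = np.sqrt(p_bar * (1 - p_bar) / at_risk)
    ucl = p_bar + 3 * se
    lcl = np.clip(p_bar - 3 * se, 0, None)   # proportions cannot be negative
    return p, p_bar, lcl, ucl

# Illustrative yearly counts for one hypothetical hospital
ulcers  = [22, 18, 25, 15, 12, 10, 9, 8, 7, 6, 5]
at_risk = [150, 160, 170, 155, 140, 150, 160, 145, 150, 155, 160]
p, centre, lcl, ucl = p_chart_limits(ulcers, at_risk)
for year, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1998):
    flag = "special cause" if (pi < lo or pi > hi) else "common cause"
    print(f"{year}: p={pi:.3f}  limits=({lo:.3f}, {hi:.3f})  -> {flag}")
```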

  1. When Language Switching has No Apparent Cost: Lexical Access in Sentence Context

    Gullifer, Jason W.; Kroll, Judith F.; Dussias, Paola E.

    2013-01-01

    We report two experiments that investigate the effects of sentence context on bilingual lexical access in Spanish and English. Highly proficient Spanish-English bilinguals read sentences in Spanish and English that included a marked word to be named. The word was either a cognate with similar orthography and/or phonology in the two languages, or a matched non-cognate control. Sentences appeared in one language alone (i.e., Spanish or English) and target words were not predictable on the basis of the preceding semantic context. In Experiment 1, we mixed the language of the sentence within a block such that sentences appeared in an alternating run in Spanish or in English. These conditions partly resemble normally occurring inter-sentential code-switching. In these mixed-language sequences, cognates were named faster than non-cognates in both languages. There were no effects of switching the language of the sentence. In Experiment 2, with Spanish-English bilinguals matched closely to those who participated in the first experiment, we blocked the language of the sentences to encourage language-specific processes. The results were virtually identical to those of the mixed-language experiment. In both cases, target cognates were named faster than non-cognates, and the magnitude of the effect did not change according to the broader context. Taken together, the results support the predictions of the Bilingual Interactive Activation + Model (Dijkstra and van Heuven, 2002) in demonstrating that bilingual lexical access is language non-selective even under conditions in which language-specific cues should enable selective processing. They also demonstrate that, in contrast to lexical switching from one language to the other, inter-sentential code-switching of the sort in which bilinguals frequently engage, imposes no significant costs to lexical processing. PMID:23750141

  2. When language switching has no apparent cost: Lexical access in sentence context

    Jason W. Gullifer

    2013-05-01

    Full Text Available We report two experiments that investigate the effects of sentence context on bilingual lexical access in Spanish and English. Highly proficient Spanish-English bilinguals read sentences in Spanish and English that included a marked word to be named. The word was either a cognate with similar orthography and/or phonology in the two languages, or a matched non-cognate control. Sentences appeared in one language alone (i.e., Spanish or English) and target words were not predictable on the basis of the preceding semantic context. In Experiment 1, we mixed the language of the sentence within a block such that sentences appeared in an alternating run in Spanish or in English. These conditions partly resemble normally occurring inter-sentential code-switching. In these mixed language sequences, cognates were named faster than non-cognates in both languages. There were no effects of switching the language of the sentence. In Experiment 2, with Spanish-English bilinguals matched closely to those who participated in the first experiment, we blocked the language of the sentences to encourage language-specific processes. The results were virtually identical to those of the mixed language experiment. In both cases, target cognates were named faster than non-cognates, and the magnitude of the effect did not change according to the broader context. Taken together, the results support the predictions of the Bilingual Interactive Activation + Model (Dijkstra & Van Heuven, 2002) in demonstrating that bilingual lexical access is language nonselective even under conditions in which language-specific cues should enable selective processing. They also demonstrate that, in contrast to lexical switching from one language to the other, inter-sentential code-switching of the sort in which bilinguals frequently engage, imposes no significant costs to lexical processing.

  3. 32 CFR 16.3 - Available sentences.

    2010-07-01

    .... Any lawful punishment or condition of punishment is authorized, including death, so long as the... sentence given to those who violate the law. Such reasons include: punishment of the wrongdoer; protection of society from the wrongdoer; deterrence of the wrongdoer and those who know of his crimes and...

  4. Categorising Example Sentences in Dictionaries for Research ...

    able contextual or grammatical support. I have constructed a table to classify example sentences according to different criteria. I filled in this table with randomly selected words and their examples which have been taken from five different South African school dictionaries. The goal of this research is to present characteristics ...

  5. Working memory and planning during sentence production.

    Martin, Randi C; Yan, Hao; Schnur, Tatiana T

    2014-10-01

    Speakers retrieve conceptual, syntactic and lexical information in advance of articulation during sentence production. What type of working memory (WM) store is used to hold the planned information before speaking? To address this question, we measured onset latencies when subjects produced sentences that began with either a complex or a simple initial noun phrase, while holding semantic, phonological or spatial information in WM. Although we found that subjects had longer onset latencies for sentences beginning with a complex noun phrase, showing a phrasal scope of planning, the magnitude of this complexity effect was not affected by any type of WM load. However, subjects made more syntactic errors (but not lexical errors) for sentences beginning with a complex noun phrase, suggesting that advance planning for these phrases occurs at a syntactic rather than lexical-semantic level, which may account for the lack of effect with various types of WM load in the current study. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Categorising Example Sentences in Dictionaries for Research ...

    ers the grammatical support that they provide is more important. While there is ... The goal of this research is to present characteristics of examples in a way that makes them easier to .... the headword is simple or inflected in the example. The final .... I have also included whether the sentence is a command as some teachers.

  7. Foregrounding awareness of sentence construction-types in the ...

    ... a grasp of the principles of phrase and sentence formation and the kinds of structure ... In this article, I have demonstrated through the content analysis of the ... within sentences whose construction typifies the desirative and instrumental ...

  8. Federal Sentencing Guidelines: Background, Legal Analysis, and Policy Options

    Seghetti, Lisa M; Smith, Alison M

    2007-01-01

    In United States v. Booker, an unusual two-part opinion transformed federal criminal sentencing by restoring to judges much of the discretion that Congress took away when it put mandatory sentencing guidelines in place...

  9. UK: Welsh court reduces sentence, cites HIV status.

    Marceau, Emmanuelle

    2003-08-01

    A Welsh appeal court has reduced the sentence handed down to an offender because of his HIV status, despite his lengthy criminal record. The court reduced the sentence from five to three-and-a-half years' imprisonment.

  10. Enhancing Possible Sentence through Cooperative Learning (Open to Suggestion).

    Jensen, Sharon J.; Duffelmeyer, Frederick A.

    1996-01-01

    Describes using Think-Pair-Share (a three-step cooperative learning activity) to complement the sentence-generation phase of the Possible Sentences Activity, a highly recommended prereading vocabulary strategy. (SR)

  11. Numerical consideration for multiscale statistical process control method applied to nuclear material accountancy

    Suzuki, Mitsutoshi; Hori, Masato; Asou, Ryoji; Usuda, Shigekazu

    2006-01-01

    The multiscale statistical process control (MSSPC) method is applied to clarify the elements of material unaccounted for (MUF) in large scale reprocessing plants using numerical calculations. Continuous wavelet functions are used to decompose the process data, which simulate batch operation superimposed with various types of disturbance, and the disturbance components included in the data are divided into time and frequency spaces. The diagnosis of MSSPC is applied to distinguish abnormal events from the process data and shows how to detect abrupt and protracted diversions using principal component analysis. Quantitative performance of MSSPC for the time series data is shown with average run lengths given by Monte-Carlo simulation, to be compared to the non-detection probability β. Recent discussion about bias corrections in material balances is introduced and another approach is presented to evaluate MUF without assuming the measurement error model. (author)

  12. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
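
    The process capability indices referred to above can be sketched as follows for a two-sided tolerance. The ±2% specification comes from the abstract, but the measured range differences are simulated for illustration, and the conventional Cp/Cpk formulas assume approximate normality.

```python
import numpy as np

def capability(values, lsl, usl):
    """Cp and Cpk for a two-sided specification, assuming approximate normality."""
    x = np.asarray(values, float)
    mu, sigma = x.mean(), x.std(ddof=1)
    cp  = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative range differences (measured - calculated, in %), tolerance +/-2%
range_diff_pct = np.random.default_rng(1).normal(0.1, 0.4, size=30)
cp, cpk = capability(range_diff_pct, lsl=-2.0, usl=2.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}  (values >= 1.33 are commonly taken as capable)")
```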

  13. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-01-01

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety

  14. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that

  15. The suspended sentence in German criminal law

    Jovašević Dragan

    2017-01-01

    Full Text Available From ancient times until today, criminal law in all countries has provided different criminal sanctions as social control measures. These are court-imposed coercive measures that take away or limit certain rights and freedoms of criminal offenders. Sanctions are applied to natural or legal persons who violate the norms of the legal order and cause damage or endanger other legal goods that enjoy legal protection. In order to effectively protect social values jeopardized by the commission of crime, state legislations prescribe several kinds of criminal sanctions: 1) penalties, 2) precautions, 3) safety measures, 4) penalties for juvenile offenders, and 5) sanctions for legal persons. Penalties are the basic, the oldest and the most important type of criminal sanctions. They are prescribed for the largest number of criminal offences. Imposed instead of or alongside penalties, warning measures have a particularly important role in jurisprudence. Since they were introduced in the system of criminal sanctions in the early 20th century, there has been a notable increase in the application of these measures, particularly in cases involving negligent and accidental offences, and minor offences that do not cause serious consequences, whose perpetrators are not persons with criminal characteristics. Warning measures (suspended sentences) are envisaged in all contemporary criminal legislations, including the German legislation. A suspended sentence is a conditional stay of execution of the sentence of imprisonment for a specified time, provided that the convicted person fulfills the imposed obligations and does not commit another criminal offense. Two conditions must be fulfilled for the application of these sanctions: a) the formal requirement, which is attached to the sentence of imprisonment; and b) the substantive requirement, which implies the court assessment that the application of these sanctions is justified and necessary in a particular case. Many

  16. Materials of acoustic analysis: sustained vowel versus sentence.

    Moon, Kyung Ray; Chung, Sung Min; Park, Hae Sang; Kim, Han Su

    2012-09-01

    Sustained vowel is a widely used material of acoustic analysis. However, vowel phonation does not sufficiently demonstrate sentence-based real-life phonation, and biases may occur depending on the test subject's intent during pronunciation. The purpose of this study was to investigate the differences between the results of acoustic analysis using each material. An individual prospective study. Two hundred two individuals (87 men and 115 women) with normal findings in videostroboscopy were enrolled. Acoustic analysis was done using the speech pattern element acquisition and display program. Fundamental frequency (Fx), amplitude (Ax), contact quotient (Qx), jitter, and shimmer were measured with sustained vowel-based acoustic analysis. Average fundamental frequency (FxM), average amplitude (AxM), average contact quotient (QxM), Fx perturbation (CFx), and amplitude perturbation (CAx) were measured with sentence-based acoustic analysis. Corresponding data of the two methods were compared with each other. SPSS (Statistical Package for the Social Sciences, Version 12.0; SPSS, Inc., Chicago, IL) software was used for statistical analysis. FxM was higher than Fx in men (Fx, 124.45 Hz; FxM, 133.09 Hz; P=0.000). In women, FxM seemed to be lower than Fx, but the results were not statistically significant (Fx, 210.58 Hz; FxM, 208.34 Hz; P=0.065). There was no statistical significance between Ax and AxM in either group. QxM was higher than Qx in men and women. Jitter was lower in men, but CFx was lower in women. Both shimmer and CAx were higher in men. Sustained vowel phonation could not be a complete substitute for real-time phonation in acoustic analysis. Characteristics of acoustic materials should be considered when choosing the material for acoustic analysis and interpreting the results. Copyright © 2012 The Voice Foundation. Published by Mosby, Inc. All rights reserved.

  17. Multivariate statistical process control in product quality review assessment - A case study.

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, Annual Product Review (APR) is a mandatory requirement in GMP. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulation Part 11 (21 CFR 211.180), all finished products should be reviewed annually for the quality standards to determine the need of any change in specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining only the effect of a single factor at a time using a Shewhart chart. It neglects to take into account the interaction between the variables. In order to overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, where 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch has been checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches are under control, while the MSPC, based on Principal Component Analysis (PCA), for the data being either autoscaled or robust scaled, showed four and seven batches, respectively, out of the Hotelling T² 95% ellipse. Also, an improvement of the capability of the process is observed without the most extreme batches. The MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
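
    A minimal sketch of the PCA-based Hotelling T² monitoring described above, assuming autoscaled batch data and an F-distribution-based 95% limit. The 164 × 6 data matrix is simulated here, and the choice of two retained components is arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.normal(size=(164, 6))            # placeholder: 164 batches x 6 assay values
X = (X - X.mean(0)) / X.std(0, ddof=1)   # autoscale each active ingredient

# PCA via SVD on the scaled data
U, S, Vt = np.linalg.svd(X, full_matrices=False)
A = 2                                    # number of retained components (illustrative)
scores = U[:, :A] * S[:A]
n = X.shape[0]
lam = (S[:A] ** 2) / (n - 1)             # score variances

# Hotelling T^2 per batch and its 95% limit based on the F-distribution
T2 = np.sum(scores**2 / lam, axis=1)
limit = A * (n - 1) * (n + 1) / (n * (n - A)) * stats.f.ppf(0.95, A, n - A)
print("batches above the 95% T2 limit:", np.where(T2 > limit)[0])
```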

  18. Impact analysis of critical success factors on the benefits from statistical process control implementation

    Fabiano Rodrigues Soriano

    Full Text Available Abstract Statistical Process Control (SPC) is a set of statistical techniques focused on process control, monitoring and analyzing causes of variation in the quality characteristics and/or in the parameters used to control and improve processes. Implementing SPC in organizations is a complex task. The reasons for its failure are related to organizational or social factors such as lack of top management commitment and little understanding about its potential benefits. Other aspects concern technical factors such as lack of training on and understanding about the statistical techniques. The main aim of the present article is to understand the interrelations between conditioning factors associated with top management commitment (Support), SPC Training and Application, as well as to understand the relationships between these factors and the benefits associated with the implementation of the program. The Partial Least Squares Structural Equation Modeling (PLS-SEM) was used in the analysis since the main goal is to establish the causal relations. A cross-sectional survey was used as the research method to collect information from samples of Brazilian auto-parts companies, which were selected according to guides from the auto-parts industry associations. A total of 170 companies were contacted by e-mail and by phone in order to be invited to participate in the survey. However, just 93 industries agreed to participate, and only 43 answered the questionnaire. The results showed that senior management support considerably affects the way companies develop their training programs. In turn, these trainings affect the way companies apply the techniques. Thus, this will be reflected in the benefits obtained from implementing the program. It was observed that the managerial and technical aspects are closely connected to each other and that they are represented by the ratio between top management and training support. The technical aspects observed through SPC

  19. Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2018-02-01

    In this study, the statistical post-processing methods that include bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: mean bias-corrected forecast, mean and variance bias-corrected forecast, decaying averaging forecast, mean absolute bias-corrected forecast, and the alternative implementations of the ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, namely the EMOS and BMA exchangeable models (assuming exchangeable ensemble members) and simplified versions of the EMOS and BMA models. Observations for wind speed were obtained from the 26 stations in PyeongChang and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) that were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, reliability analysis was conducted by using rank histograms to identify the statistical consistency of the ensemble forecast and corresponding observations. Based on the results of our study, we found that the prediction skills of the probabilistic forecasts of the EMOS and BMA models were superior to the bias-corrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction, BMA models showed better prediction skill than EMOS. Even though the simplified version of the BMA model exhibited the best prediction skill among the six methods, the results showed that the differences in prediction skill between the versions of EMOS and BMA were negligible.
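
    The first of the listed methods, the mean bias-corrected forecast, is simple enough to sketch directly. The ensemble and observation values below are simulated, and the 51-member shape merely mirrors the ECMWF ensemble size mentioned in the abstract.

```python
import numpy as np

def mean_bias_corrected(ens_train, obs_train, ens_new):
    """Subtract the mean training bias of the ensemble mean from new ensemble-mean forecasts."""
    bias = np.mean(np.mean(ens_train, axis=1) - obs_train)
    return np.mean(ens_new, axis=1) - bias

# Illustrative wind-speed data (m/s): 100 training days x 51 members, 5 new days
rng = np.random.default_rng(2)
obs_train = rng.gamma(2.0, 2.0, size=100)
ens_train = obs_train[:, None] + 1.5 + rng.normal(0, 1.0, size=(100, 51))  # biased ensemble
ens_new = 1.5 + np.repeat(np.array([[4.0], [6.0], [3.0], [8.0], [5.0]]), 51, axis=1)
print(mean_bias_corrected(ens_train, obs_train, ens_new))  # roughly recovers 4, 6, 3, 8, 5
```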

  20. The role of nonequilibrium thermo-mechanical statistics in modern technologies and industrial processes: an overview

    Rodrigues, Clóves G.; Silva, Antônio A. P.; Silva, Carlos A. B.; Vasconcellos, Áurea R.; Ramos, J. Galvão; Luzzi, Roberto

    2010-01-01

    The nowadays notable development of all the modern technology, fundamental for the progress and well being of world society, imposes a great deal of stress in the realm of basic Physics, more precisely on Thermo-Statistics. We do face situations in electronics and optoelectronics involving physical-chemical systems far-removed-from equilibrium, where ultrafast (in pico- and femto-second scale) and non-linear processes are present. Further, we need to be aware of the rapid unfolding of nano-te...

  1. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to the differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
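
    A rough sketch of the bootstrap idea described above, under several simplifying assumptions: the PCA model is refit on each resample of (already autoscaled) normal-operating-condition data, contributions are taken as squared residuals toward the Q statistic, and the per-variable limit is an averaged 95th percentile. The data, the two-component model and the helper names are all illustrative.

```python
import numpy as np

def q_contributions(X, A):
    """Per-variable squared-residual contributions to the Q statistic
    for a PCA model with A components fitted on centred/scaled X."""
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:A].T                       # loadings
    E = X - X @ P @ P.T                # residual matrix
    return E**2                        # one contribution per sample and variable

def bootstrap_contribution_limits(X_noc, A=2, n_boot=500, alpha=0.05, seed=0):
    """Bootstrap (percentile) upper limits for Q contributions under NOC."""
    rng = np.random.default_rng(seed)
    n, p = X_noc.shape
    limits = np.empty((n_boot, p))
    for b in range(n_boot):
        Xb = X_noc[rng.integers(0, n, size=n)]          # resample batches with replacement
        Xb = (Xb - Xb.mean(0)) / Xb.std(0, ddof=1)      # re-autoscale the resample
        limits[b] = np.quantile(q_contributions(Xb, A), 1 - alpha, axis=0)
    return limits.mean(axis=0)                          # averaged percentile limit per variable

# Illustrative NOC batch data: 60 batches x 8 process variables
X_noc = np.random.default_rng(3).normal(size=(60, 8))
X_noc = (X_noc - X_noc.mean(0)) / X_noc.std(0, ddof=1)
print(bootstrap_contribution_limits(X_noc))
```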

  2. Statistical Analysis of the First Passage Path Ensemble of Jump Processes

    von Kleist, Max; Schütte, Christof; Zhang, Wei

    2018-02-01

    The transition mechanism of jump processes between two different subsets in state space reveals important dynamical information of the processes and therefore has attracted considerable attention in the past years. In this paper, we study the first passage path ensemble of both discrete-time and continuous-time jump processes on a finite state space. The main approach is to divide each first passage path into nonreactive and reactive segments and to study them separately. The analysis can be applied to jump processes which are non-ergodic, as well as continuous-time jump processes where the waiting time distributions are non-exponential. In the particular case that the jump processes are both Markovian and ergodic, our analysis elucidates the relations between the study of the first passage paths and the study of the transition paths in transition path theory. We provide algorithms to numerically compute statistics of the first passage path ensemble. The computational complexity of these algorithms scales with the complexity of solving a linear system, for which efficient methods are available. Several examples demonstrate the wide applicability of the derived results across research areas.
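
    As a small illustration of the "solving a linear system" step mentioned above, the sketch below computes mean first passage (hitting) times of a target set for a discrete-time Markov chain. The 4-state transition matrix is a made-up example, not one from the paper.

```python
import numpy as np

def mean_first_passage_times(P, target):
    """Expected hitting times of a target set B for a discrete-time Markov chain.
    Solves (I - P_AA) t = 1 on the complement A of B; t = 0 on B itself."""
    n = P.shape[0]
    A = np.array([i for i in range(n) if i not in set(target)])
    t = np.zeros(n)
    I = np.eye(len(A))
    t[A] = np.linalg.solve(I - P[np.ix_(A, A)], np.ones(len(A)))
    return t

# A small 4-state chain; state 3 is the (absorbing) target set B
P = np.array([[0.5, 0.3, 0.2, 0.0],
              [0.2, 0.5, 0.2, 0.1],
              [0.1, 0.2, 0.5, 0.2],
              [0.0, 0.0, 0.0, 1.0]])
print(mean_first_passage_times(P, target=[3]))
```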

  3. Sentence comprehension in Swahili-English bilingual agrammatic speakers

    Abuom, Tom O.; Shah, Emmah; Bastiaanse, Roelien

    For this study, sentence comprehension was tested in Swahili-English bilingual agrammatic speakers. The sentences were controlled for four factors: (1) order of the arguments (base vs. derived); (2) embedding (declarative vs. relative sentences); (3) overt use of the relative pronoun "who"; (4)

  4. 75 FR 13680 - Commutation of Sentence: Technical Change

    2010-03-23

    ... Sentence: Technical Change AGENCY: Bureau of Prisons, Justice. ACTION: Interim rule. SUMMARY: This document makes a minor technical change to the Bureau of Prisons (Bureau) regulations on sentence commutation to.... Commutation of Sentence: Technical Change This document makes a minor technical change to the Bureau...

  5. Statistical methods to assess and control processes and products during nuclear fuel fabrication

    Weidinger, H.

    1999-01-01

    Very good statistical tools and techniques are available today to assess the quality and the reliability of fabrication processes as the original sources of a good and reliable quality of the fabricated products. Quality control charts of different types play a key role, and the high capability of modern electronic data acquisition technologies provides, at least potentially, a high efficiency in the more or less online application of these methods. These techniques focus mainly on the stability and the reliability of the fabrication process. In addition, relatively simple statistical tools are available to assess the capability of fabrication processes, assuming they are stable, to fulfill the product specifications. All these techniques can only result in as good a product as the product design is able to describe through the product requirements necessary for good performance. Therefore it is essential that product design is strictly and closely performance-oriented. However, performance orientation is only successful through an open and effective cooperation with the customer who uses or applies those products. During the last one to two decades in the west, a multi-vendor strategy has been developed by the utilities, sometimes leading to three different fuel vendors for one reactor core. This development resulted in better economic conditions for the user but did not necessarily increase an open attitude of the vendor toward the using utility. The responsibility of the utility to ensure an adequate quality of the fuel it receives increased considerably. As a matter of fact, sometimes the utilities had to pay a high price because of unexpected performance problems. Thus the utilities are now learning that they need to increase their knowledge and experience in the area of nuclear fuel quality management and technology. This process started some time ago in the west. However, it is now also reaching the utilities in the eastern countries. (author)

  6. A bibliometric analysis of 50 years of worldwide research on statistical process control

    Fabiane Letícia Lizarelli

    Full Text Available Abstract An increasing number of papers on statistical process control (SPC) has emerged in the last fifty years, especially in the last fifteen years. This may be attributed to the increased global competitiveness generated by innovation and the continuous improvement of products and processes. In this sense, SPC has a fundamentally important role in quality and production systems. The research in this paper considers the context of technological improvement and innovation of products and processes to increase corporate competitiveness. There are several other statistical techniques and tools for assisting continuous improvement and innovation of products and processes but, despite the limitations in their use in improvement projects, there is growing concern about the use of SPC. A gap between the SPC techniques taught in engineering courses and their practical applications to industrial problems is observed in empirical research; thus, it is important to understand what has been done and identify the trends in SPC research. The bibliometric study in this paper is proposed in this direction and uses the Web of Science (WoS) database. Data analysis indicates that there was a growth rate of more than 90% in the number of publications on SPC after 1990. Our results reveal the countries where these publications have come from, the authors with the highest number of papers and their networks. Main sources of publications are also identified; it is observed that the publications of SPC papers are concentrated in some of the international research journals, not necessarily those with the highest impact factors. Furthermore, the papers are focused on the industrial engineering, operations research and management science fields. The most common term found in the papers was cumulative sum control charts, but new topics have emerged and have been researched in the past ten years, such as multivariate methods for process monitoring and nonparametric methods.

  7. The Syntax and Semantics of Russian Non-Sentence Adverbials

    Lorentzen, Elena; Durst-Andersen, Per

    2015-01-01

    For the first time non-sentence adverbials in Russian are analyzed in their totality, i.e., from a lexical, syntactic and propositional-semantic point of view. They are classified, defined and interpreted according to four propositional structures identified in Russian: (1) state descriptions...... and (2) activity descriptions – both created by simplex verbs; (3) event descriptions and (4) process descriptions – both involving complex verbs. All four structures function as statement models and are used to represent semantic paraphrases of utterances in order to be able to show the exact...

  8. Who gets a second chance? An investigation of Ohio's blended juvenile sentence.

    Cheesman, Fred L; Waters, Nicole L; Hurst, Hunter

    2010-01-01

    Factors differentiating blended sentencing cases (Serious Youthful Offenders or SYOs) from conventional juvenile cases and cases transferred to the adult criminal court in Ohio were investigated using a two-stage probit. Conventional juvenile cases differed from cases selected for non-conventional processing (i.e., SYO or transfer) according to offense seriousness, number of prior Ohio Department of Youth Services placements, age and gender. Controlling for probability of selection for nonconventional processing, transfers differed from SYOs according to age, gender, and race. Minorities were significantly more likely than Whites to be transfers rather than SYOs, suggesting possible bias in the decision-making process. Objective risk and needs assessments should be used to identify the most suitable candidates for blended sentences and adult transfer and enhanced services should be provided to juvenile offenders given blended sentences.

  9. Semantic Models of Sentences with Verbs of Motion in Standard Language and in Scientific Language Used in Biology

    Vita Banionytė

    2016-06-01

    Full Text Available The semantic models of sentences with verbs of motion in German standard language and in the scientific language used in biology are analyzed in the article. In its theoretical part it is stated that the article is based on the semantic theory of the sentence. This theory, in its turn, is grounded in the correlation of semantic predicative classes and semantic roles. The combination of semantic predicative classes and semantic roles is expressed by the main semantic formula – the proposition. In its practical part the differences between the semantic models of standard language and of the scientific language used in biology are explained. While modelling sentences with verbs of motion, two groups of semantic models of sentences are singled out: that of action (Handlung) and that of process (Vorgang). The analysis shows that the semantic models of sentences with semantic action predicatives dominate in the texts of standard language while the semantic models of sentences with semantic process predicatives dominate in the texts of the scientific language used in biology. The differences in how the doer and the direction are expressed in standard and in scientific language are clearly seen, and the semantic cases (Agens, Patiens, Direktiv1) help to determine them. It is observed that in scientific texts with a high level of specialization (biology science), in contrast to popular scientific literature, models of sentences with verbs of motion are seldom found. They are substituted by denominative constructions. In conclusion it is shown that this analysis can be important for teaching methodology, especially when planning material for teaching professional-scientific language.

  10. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques: tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method with a detection accuracy of 82% and an average detection time of 9.64 days.
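
    Of the three techniques named above, EWMA is reported as the best suited; a minimal sketch of an EWMA chart on daily transfer times follows. The smoothing constant λ = 0.2, the control-limit width L = 2.7, the 20-day baseline and the simulated times are all assumptions for illustration, not the study's settings or data.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=2.7, mu0=None, sigma0=None):
    """EWMA control chart; flags both upward and downward shifts in transfer times."""
    x = np.asarray(x, float)
    mu0 = x[:20].mean() if mu0 is None else mu0          # baseline period
    sigma0 = x[:20].std(ddof=1) if sigma0 is None else sigma0
    z = np.empty_like(x)
    z[0] = lam * x[0] + (1 - lam) * mu0
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    i = np.arange(1, len(x) + 1)
    width = L * sigma0 * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    alarms = np.where((z > mu0 + width) | (z < mu0 - width))[0]
    return z, alarms

# Illustrative daily sit-to-stand transfer times (s), drifting slowly upward after day 30
rng = np.random.default_rng(4)
times = np.concatenate([rng.normal(3.0, 0.3, 30),
                        rng.normal(3.0, 0.3, 30) + np.linspace(0, 1.0, 30)])
z, alarms = ewma_chart(times)
print("first alarm on day:", alarms[0] if alarms.size else "none")
```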

  11. Statistical modeling of copper losses in the silicate slag of the sulfide concentrate smelting process

    Savic Marija V.

    2015-09-01

    Full Text Available This article presents the results of the statistical modeling of copper losses in the silicate slag of the sulfide concentrates smelting process. The aim of this study was to define the correlation dependence of the degree of copper losses in the silicate slag on the following parameters of the technological process: SiO2, FeO, Fe3O4, CaO and Al2O3 content in the slag and copper content in the matte. Multiple linear regression analysis (MLRA), artificial neural networks (ANNs) and an adaptive network based fuzzy inference system (ANFIS) were used as tools for the mathematical analysis of the indicated problem. The best correlation coefficient (R2 = 0.719) of the final model was obtained using the ANFIS modeling approach.
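
    The first of the three tools, MLRA, reduces to an ordinary least-squares fit; the sketch below regresses copper loss on the six listed variables. The compositions, coefficients and noise level are synthetic stand-ins, so only the workflow (not the numbers) reflects the paper.

```python
import numpy as np

# Illustrative slag compositions (wt.%) and copper content in matte vs. copper loss in slag.
rng = np.random.default_rng(5)
n = 80
X = np.column_stack([
    rng.uniform(28, 36, n),   # SiO2
    rng.uniform(35, 45, n),   # FeO
    rng.uniform(2, 8, n),     # Fe3O4
    rng.uniform(2, 6, n),     # CaO
    rng.uniform(3, 7, n),     # Al2O3
    rng.uniform(55, 75, n),   # Cu in matte
])
beta_true = np.array([0.02, -0.01, 0.08, -0.03, 0.01, 0.015])
y = 0.5 + X @ beta_true + rng.normal(0, 0.05, n)      # Cu loss in slag (%)

# Ordinary least squares fit and coefficient of determination
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
print("intercept and coefficients:", np.round(coef, 3))
print("R^2 =", round(r2, 3))
```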

  12. Neural networks mediating sentence reading in the deaf

    Elizabeth Ann Hirshorn

    2014-06-01

    Full Text Available The present work addresses the neural bases of sentence reading in deaf populations. To better understand the relative role of deafness and English knowledge in shaping the neural networks that mediate sentence reading, three populations with different degrees of English knowledge and depth of hearing loss were included – deaf signers, oral deaf and hearing individuals. The three groups were matched for reading comprehension and scanned while reading sentences. A similar neural network of left perisylvian areas was observed, supporting the view of a shared network of areas for reading despite differences in hearing and English knowledge. However, differences were observed, in particular in the auditory cortex, with deaf signers and oral deaf showing greatest bilateral superior temporal gyrus (STG) recruitment as compared to hearing individuals. Importantly, within deaf individuals, the same STG area in the left hemisphere showed greater recruitment as hearing loss increased. To further understand the functional role of such auditory cortex re-organization after deafness, connectivity analyses were performed from the STG regions identified above. Connectivity from the left STG toward areas typically associated with semantic processing (BA45 and thalami) was greater in deaf signers and in oral deaf as compared to hearing. In contrast, connectivity from left STG toward areas identified with speech-based processing was greater in hearing and in oral deaf as compared to deaf signers. These results support the growing literature indicating recruitment of auditory areas after congenital deafness for visually-mediated language functions, and establish that both auditory deprivation and language experience shape its functional reorganization. Implications for differential reliance on semantic vs. phonological pathways during reading in the three groups are discussed.

  13. A generic statistical methodology to predict the maximum pit depth of a localized corrosion process

    Jarrah, A.; Bigerelle, M.; Guillemot, G.; Najjar, D.; Iost, A.; Nianga, J.-M.

    2011-01-01

    Highlights: → We propose a methodology to predict the maximum pit depth in a corrosion process. → Generalized Lambda Distribution and the Computer Based Bootstrap Method are combined. → GLD fits a large variety of distributions both in their central and tail regions. → Minimum thickness preventing perforation can be estimated with a safety margin. → Considering its applications, this new approach can help to size industrial pieces. - Abstract: This paper outlines a new methodology to predict accurately the maximum pit depth related to a localized corrosion process. It combines two statistical methods: the Generalized Lambda Distribution (GLD), to determine a model of distribution fitting with the experimental frequency distribution of depths, and the Computer Based Bootstrap Method (CBBM), to generate simulated distributions equivalent to the experimental one. In comparison with conventionally established statistical methods that are restricted to the use of inferred distributions constrained by specific mathematical assumptions, the major advantage of the methodology presented in this paper is that both the GLD and the CBBM enable a statistical treatment of the experimental data without making any preconceived choice either about the unknown theoretical parent distribution of pit depth, which characterizes the global corrosion phenomenon, or about the unknown associated theoretical extreme value distribution, which characterizes the deepest pits. Considering an experimental distribution of depths of pits produced on an aluminium sample, estimations of maximum pit depth using a GLD model are compared to similar estimations based on usual Gumbel and Generalized Extreme Value (GEV) methods proposed in the corrosion engineering literature. The GLD approach is shown to have smaller bias and dispersion in the estimation of the maximum pit depth than the Gumbel approach, both for its realization and its mean. This leads to comparing the GLD approach to the GEV one
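
    The GLD itself is not available in common scientific Python libraries, so the sketch below illustrates only the bootstrap half of the methodology combined with the Gumbel fit that the abstract uses as a comparison: resampled maxima of a synthetic pit-depth sample are fitted with scipy's gumbel_r. All data and parameters are illustrative.

```python
import numpy as np
from scipy import stats

# Synthetic "parent" pit depths (micrometres); not experimental data
rng = np.random.default_rng(6)
depths = rng.weibull(1.5, size=500) * 40

def bootstrap_max_depth(depths, n_boot=1000, n_pits=500, seed=0):
    """Bootstrap the deepest pit among n_pits pits and fit a Gumbel law to the maxima."""
    rng = np.random.default_rng(seed)
    maxima = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(depths, size=n_pits, replace=True)
        maxima[b] = resample.max()
    loc, scale = stats.gumbel_r.fit(maxima)
    return loc, scale, np.quantile(maxima, [0.5, 0.95, 0.99])

loc, scale, quantiles = bootstrap_max_depth(depths)
print(f"Gumbel fit to bootstrap maxima: loc={loc:.1f}, scale={scale:.1f}")
print("bootstrap 50/95/99% maximum pit depths:", np.round(quantiles, 1))
```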

  14. Squeezing, photon bunching, photon antibunching and nonclassical photon statistics in degenerate hyper Raman processes

    Sen, Biswajit; Mandal, Swapan

    2007-01-01

    An initially prepared coherent state coupled to a second-order nonlinear medium is responsible for stimulated and spontaneous hyper Raman processes. By using an intuitive approach based on perturbation theory, the Hamiltonian corresponding to the hyper Raman processes is analytically solved to obtain the temporal development of the field operators. It is true that these analytical solutions are valid for small coupling constants. However, the interesting part is that these solutions are valid for reasonably large time. Hence, the present analytical solutions are quite general and are fresh compared to those solutions under short-time approximations. By exploiting the analytical solutions of field operators for various modes, we investigate the squeezing, photon antibunching and nonclassical photon statistics for pure modes of the input coherent light responsible for hyper Raman processes. At least in one instance (stimulated hyper Raman processes for vibration phonon mode), we report the simultaneous appearance of classical (photon bunching) and nonclassical (squeezing) effects of the radiation field responsible for hyper Raman processes

  15. IMPROVING KNITTED FABRICS BY A STATISTICAL CONTROL OF DIMENSIONAL CHANGES AFTER THE DYEING PROCESS

    LLINARES-BERENGUER Jorge

    2017-05-01

    Full Text Available One of the most important problems that cotton knitted fabrics present during the manufacturing process is their dimensional instability, which needs to be minimised. Some of the variables that intervene in fabric shrinkage are related to its structural characteristics, the fiber used to produce the yarn, the yarn count used or the dyeing process employed. Conducted under real factory conditions, the present study attempted to model the behaviour of a fabric structure after a dyeing process by contributing several algorithms that calculate dyed fabric stability after the first wash cycle. Small-diameter circular machines are used to produce garments with no side seams. This is the reason why a list of machines that produce the same fabrics for different widths needs to be made available to produce all the sizes of a given garment. Two relaxation states were distinguished for interlock fabric: dyed and dry relaxation, and dyed and wash relaxation. The linear density of the yarn employed to produce the sample fabric was combed cotton Ne 30. The machines used for optic bleaching were Overflow. To obtain knitting structures with optimum dimensional stability, different statistical tools were used to help us evaluate all the production process variables (raw material, machines and process) responsible for this variation. This made it possible to guarantee product quality without creating costs and losses.

  16. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.

  17. Using multitype branching processes to quantify statistics of disease outbreaks in zoonotic epidemics.

    Singh, Sarabjeet; Schneider, David J; Myers, Christopher R

    2014-03-01

    Branching processes have served as a model for chemical reactions, biological growth processes, and contagion (of disease, information, or fads). Through this connection, these seemingly different physical processes share some common universalities that can be elucidated by analyzing the underlying branching process. In this work we focus on coupled branching processes as a model of infectious diseases spreading from one population to another. An exceedingly important example of such coupled outbreaks are zoonotic infections that spill over from animal populations to humans. We derive several statistical quantities characterizing the first spillover event from animals to humans, including the probability of spillover, the first passage time distribution for human infection, and disease prevalence in the animal population at spillover. Large stochastic fluctuations in those quantities can make inference of the state of the system at the time of spillover difficult. Focusing on outbreaks in the human population, we then characterize the critical threshold for a large outbreak, the distribution of outbreak sizes, and associated scaling laws. These all show a strong dependence on the basic reproduction number in the animal population and indicate the existence of a novel multicritical point with altered scaling behavior. The coupling of animal and human infection dynamics has crucial implications, most importantly allowing for the possibility of large human outbreaks even when human-to-human transmission is subcritical.
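
    A Monte Carlo sketch of the coupled-outbreak picture described above: a Galton-Watson outbreak with Poisson offspring in the animal population, with each animal case independently seeding a human infection with a small probability. The reproduction number R0 = 1.5, the spillover probability q = 0.01 and the generation-based bookkeeping are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

def simulate_spillover(R0_animal=1.5, q=0.01, max_gen=200, seed=None):
    """One outbreak in the animal population (Galton-Watson, Poisson offspring).
    Each animal infection spills over to humans independently with probability q.
    Returns (spillover_happened, generation_of_spillover, cumulative_animal_cases)."""
    rng = np.random.default_rng(seed)
    infected, total = 1, 1
    for gen in range(max_gen):
        if rng.random() < 1 - (1 - q) ** infected:   # at least one spillover this generation
            return True, gen, total
        infected = rng.poisson(R0_animal * infected) # next generation of animal cases
        total += infected
        if infected == 0:
            return False, gen, total                 # animal outbreak died out first
    return False, max_gen, total

# Monte Carlo estimate of the spillover probability
runs = [simulate_spillover(seed=s) for s in range(5000)]
p_spill = np.mean([r[0] for r in runs])
print(f"estimated probability of at least one spillover: {p_spill:.3f}")
```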

  19. The Initial Regression Statistical Characteristics of Intervals Between Zeros of Random Processes

    V. K. Hohlov

    2014-01-01

    Full Text Available The article substantiates the initial regression statistical characteristics of intervals between zeros of realizations of random processes, and studies their properties allowing the use of these features in the autonomous information systems (AIS) of near location (NL). Coefficients of the initial regression (CIR) to minimize the residual sum of squares of multiple initial regression views are justified on the basis of vector representations associated with a random vector notion of analyzed signal parameters. It is shown that even with no covariance-based private CIR it is possible to predict one random variable through another with respect to the deterministic components. The paper studies dependences of CIR interval sizes between zeros of the narrowband stationary in wide-sense random process with its energy spectrum. Particular CIR for random processes with Gaussian and rectangular energy spectra are obtained. It is shown that the considered CIRs do not depend on the average frequency of spectra, are determined by the relative bandwidth of the energy spectra, and weakly depend on the type of spectrum. CIR properties enable its use as an informative parameter when implementing temporal regression methods of signal processing, invariant to the average rate and variance of the input implementations. We consider estimates of the average energy spectrum frequency of the random stationary process by calculating the length of the time interval corresponding to the specified number of intervals between zeros. It is shown that the relative variance in estimation of the average energy spectrum frequency of a stationary random process with increasing relative bandwidth ceases to depend on the last process implementation in processing above ten intervals between zeros. The obtained results can be used in the AIS NL to solve the tasks of detection and signal recognition, when a decision is made in conditions of unknown mathematical expectations on a limited observation

  20. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with a three dimensional (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI
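
    A sketch of the core registration-quality metric and the patient-specific limit described above: normalized cross-correlation between a daily CBCT sub-volume and the reference image within a tracking volume, with a lower control limit set from the first five fractions. The array shapes, synthetic volumes, and the mean-minus-three-sigma limit are illustrative assumptions rather than the authors' implementation.

      import numpy as np

      def ncc(a, b):
          """Normalized cross-correlation of two equally shaped image volumes."""
          a = (a - a.mean()) / a.std()
          b = (b - b.mean()) / b.std()
          return float(np.mean(a * b))

      # illustrative data: reference RTV and daily CBCT RTVs for 30 fractions
      rng = np.random.default_rng(1)
      ref = rng.normal(size=(32, 32, 32))
      daily = [ref + rng.normal(scale=0.2, size=ref.shape) for _ in range(30)]

      scores = np.array([ncc(ref, d) for d in daily])
      baseline = scores[:5]                       # first five fractions
      lcl = baseline.mean() - 3 * baseline.std()  # patient-specific lower control limit
      flagged = np.where(scores < lcl)[0]
      print(f"LCL = {lcl:.3f}; flagged fractions: {flagged}")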

  1. Analysis of statistical misconception in terms of statistical reasoning

    Maryati, I.; Priatna, N.

    2018-05-01

    Reasoning skills are needed by everyone in the era of globalization, because every person has to be able to manage and use the information that can now be obtained easily from all over the world. Statistical reasoning is the ability to collect, group, process, and interpret information and to draw conclusions from it. This skill can be developed at various levels of education. However, it remains low because many people, including students, assume that statistics is just counting and applying formulas, and students often have a negative attitude toward research-related courses. The purpose of this research is to analyze students' misconceptions in a descriptive statistics course in relation to statistical reasoning skill. The analysis was based on a misconception test and a statistical reasoning skill test, and on observing how students' misconceptions affect their statistical reasoning. The sample consisted of 32 students of a mathematics education department who had taken the descriptive statistics course. The mean score on the misconception test was 49.7 with a standard deviation of 10.6, whereas the mean score on the statistical reasoning skill test was 51.8 with a standard deviation of 8.5. Taking 65 as the minimum score for standard achievement of course competence, the students' mean scores are below the standard. The misconception results indicate which subtopics need attention. Based on the assessment, students' misconceptions occur in: 1) writing mathematical sentences and symbols correctly, 2) understanding basic definitions, and 3) determining which concept to use in solving a problem. For statistical reasoning skill, the assessment measured reasoning about: 1) data, 2) representation, 3) statistical format, 4) probability, 5) samples, and 6) association.

  2. Syntactic flexibility and planning scope: The effect of verb bias on advance planning during sentence recall

    Maartje van de Velde

    2014-10-01

    In sentence production, grammatical advance planning scope depends on contextual factors (e.g., time pressure), linguistic factors (e.g., ease of structural processing), and cognitive factors (e.g., production speed). The present study tests the influence of the availability of multiple syntactic alternatives (i.e., syntactic flexibility) on the scope of advance planning during the recall of Dutch dative phrases. We manipulated syntactic flexibility by using verbs with a strong bias or a weak bias towards one structural alternative in sentence frames accepting both verbs (e.g., strong/weak bias: De ober schotelt/serveert de klant de maaltijd [voor] 'The waiter dishes out/serves the customer the meal'). To assess lexical planning scope, we varied the frequency of the first post-verbal noun (N1, Experiment 1) or the second post-verbal noun (N2, Experiment 2). In each experiment, 36 speakers produced the verb phrases in a Rapid Serial Visual Presentation (RSVP) paradigm. On each trial, they read a sentence presented one word at a time, performed a short distractor task, and then saw a sentence preamble (e.g., De ober…) which they had to complete to form the presented sentence. Onset latencies were compared using linear mixed effects models. N1 frequency did not produce any effects. N2 frequency only affected sentence onsets in the weak verb bias condition and especially in slow speakers. These findings highlight the dependency of planning scope during sentence recall on the grammatical properties of the verb and the frequency of post-verbal nouns. Implications for utterance planning in everyday speech are discussed.

  3. QUALITY IMPROVEMENT USING STATISTICAL PROCESS CONTROL TOOLS IN GLASS BOTTLES MANUFACTURING COMPANY

    Yonatan Mengesha Awaj

    2013-03-01

    In order to survive in a competitive market, improving the quality and productivity of its products and processes is a must for any company. This study applies statistical process control (SPC) tools to the production line and to the final product in order to reduce defects, by identifying where the highest waste occurs and by giving suggestions for improvement. The approach combined direct observation, thorough examination of the production lines, brainstorming sessions, and fishbone diagrams; information was collected from potential customers and the company's workers through interviews and questionnaires, and a Pareto analysis and a control chart (p-chart) were constructed. It was found that the company has many problems; specifically, there is high rejection or waste in the production line. The highest waste occurs in the melting line, which causes loss due to trickle, and in the forming line, which causes loss due to rejection of defective products. The vital few problems were identified as blisters, double seams, stones, pressure failures, and overweight. The principal aim of the study is to make the quality team aware of how to use SPC tools in problem analysis, especially how to hold an effective brainstorming session and how to exploit the resulting data in constructing cause-and-effect diagrams, Pareto analyses, and control charts. The major causes of non-conformities and the root causes of the quality problems were specified, and possible remedies were proposed. Although the company faces many constraints to implementing all of the suggestions within a short period of time, it recognized that they will provide significant productivity improvement in the long run.
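
    As an illustration of the p-chart step mentioned above, here is a short sketch with invented daily inspection counts; the control limits follow the standard binomial form p-bar ± 3·sqrt(p-bar·(1 − p-bar)/n), not the company's actual data or software.

      import numpy as np

      # illustrative daily inspection data: sample size and defective bottles per day
      n = np.array([500, 480, 520, 500, 510, 495, 505, 500])
      defects = np.array([23, 19, 31, 27, 22, 40, 25, 21])

      p = defects / n
      p_bar = defects.sum() / n.sum()
      sigma = np.sqrt(p_bar * (1 - p_bar) / n)          # varies with sample size
      ucl, lcl = p_bar + 3 * sigma, np.clip(p_bar - 3 * sigma, 0, None)

      for i, (pi, u, l) in enumerate(zip(p, ucl, lcl), start=1):
          flag = "OUT" if (pi > u or pi < l) else "ok"
          print(f"day {i}: p={pi:.3f}  LCL={l:.3f}  UCL={u:.3f}  {flag}")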

  4. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six megavolts with nine fields were used for the IMRT plan and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are good quality because Cpml values are higher than 1.0.
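
    A sketch of how control limits and a target-based capability index can be computed from the first 50 QA results, as described above. The synthetic % gamma pass values, the specification limit, and the particular one-sided Cpml formula used here are assumptions and may differ from the paper's exact definitions.

      import numpy as np

      rng = np.random.default_rng(2)
      gamma_pass = rng.normal(96.7, 2.2, size=159)   # illustrative VMAT % gamma pass values

      baseline = gamma_pass[:50]                     # first 50 plans set the limits
      mu, sigma = baseline.mean(), baseline.std(ddof=1)
      lcl = mu - 3 * sigma                           # individuals-chart lower control limit

      # one-sided, target-based capability index (a common form; the paper's exact
      # definition may differ): Cpml = (mu - LSL) / (3 * sqrt(sigma^2 + (mu - T)^2))
      LSL, T = 90.0, 100.0
      cpml = (mu - LSL) / (3 * np.sqrt(sigma**2 + (mu - T)**2))

      print(f"LCL = {lcl:.1f}%  Cpml = {cpml:.2f}")
      print("out-of-control plans:", np.where(gamma_pass < lcl)[0])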

  5. Conceptual similarity effects on working memory in sentence contexts: testing a theory of anaphora.

    Cowles, H Wind; Garnham, Alan; Simner, Julia

    2010-06-01

    The degree of semantic similarity between an anaphoric noun phrase (e.g., the bird) and its antecedent (e.g., a robin) is known to affect the anaphor resolution process, but the mechanisms that underlie this effect are not known. One proposal (Almor, 1999) is that semantic similarity triggers interference effects in working memory and makes two crucial assumptions: First, semantic similarity impairs working memory just as phonological similarity does (e.g., Baddeley, 1992), and, second, this impairment interferes with processes of sentence comprehension. We tested these assumptions in two experiments that compared recall accuracy between phonologically similar, semantically similar, and control words in sentence contexts. Our results do not provide support for Almor's claims: Phonological overlap decreased recall accuracy in sentence contexts, but semantic similarity did not. These results shed doubt on the idea that semantic interference in working memory is an underlying mechanism in anaphor resolution.

  6. Penultimate modeling of spatial extremes: statistical inference for max-infinitely divisible processes

    Huser, Raphaël

    2018-01-09

    Extreme-value theory for stochastic processes has motivated the statistical use of max-stable models for spatial extremes. However, fitting such asymptotic models to maxima observed over finite blocks is problematic when the asymptotic stability of the dependence does not prevail in finite samples. This issue is particularly serious when data are asymptotically independent, such that the dependence strength weakens and eventually vanishes as events become more extreme. We here aim to provide flexible sub-asymptotic models for spatially indexed block maxima, which more realistically account for discrepancies between data and asymptotic theory. We develop models pertaining to the wider class of max-infinitely divisible processes, extending the class of max-stable processes while retaining dependence properties that are natural for maxima: max-id models are positively associated, and they yield a self-consistent family of models for block maxima defined over any time unit. We propose two parametric construction principles for max-id models, emphasizing a point process-based generalized spectral representation, that allows for asymptotic independence while keeping the max-stable extremal-$t$ model as a special case. Parameter estimation is efficiently performed by pairwise likelihood, and we illustrate our new modeling framework with an application to Dutch wind gust maxima calculated over different time units.

  7. Effects of Tasks on BOLD Signal Responses to Sentence Contrasts: Review and Commentary

    Caplan, David; Gow, David

    2012-01-01

    Functional neuroimaging studies of syntactic processing have been interpreted as identifying the neural locations of parsing and interpretive operations. However, current behavioral studies of sentence processing indicate that many operations occur simultaneously with parsing and interpretation. In this review, we point to issues that arise in…

  8. Hierarchical Rhetorical Sentence Categorization for Scientific Papers

    Rachman, G. H.; Khodra, M. L.; Widyantoro, D. H.

    2018-03-01

    Important information in scientific papers is often carried by rhetorical sentences that belong to certain categories, and extracting it requires text categorization. Previous work on this task has employed word frequency, semantic word similarity, hierarchical classification, and other techniques. This paper presents rhetorical sentence categorization for scientific papers using TF-IDF and Word2Vec to capture word frequency and semantic word similarity, combined with hierarchical classification. Every experiment is tested with two classifiers, namely Naïve Bayes and linear SVM. The paper shows that the hierarchical classifier outperforms the flat classifier with either TF-IDF or Word2Vec, although the improvement is only about 2 percentage points, from 27.82% with the flat classifier to 29.61% with the hierarchical one. It also shows that the hierarchical classifier allows a separate learning model to be built for each child category.
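
    A minimal sketch of a flat TF-IDF + linear SVM baseline extended to a two-level hierarchical classifier; the toy sentences, category names, and two-level split are invented for illustration and do not reproduce the paper's taxonomy or its Word2Vec features.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.svm import LinearSVC
      from sklearn.pipeline import make_pipeline

      # toy data: (sentence, parent category, child category) -- invented examples
      data = [
          ("We propose a new segmentation method", "method", "approach"),
          ("The corpus contains 500 annotated papers", "method", "data"),
          ("Accuracy improved by two percent", "result", "finding"),
          ("These results confirm our hypothesis", "result", "discussion"),
      ] * 10  # repeat so each class has several examples

      texts  = [t for t, _, _ in data]
      parent = [p for _, p, _ in data]
      child  = [c for _, _, c in data]

      # level 1: predict the parent rhetorical category
      clf_parent = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(texts, parent)

      # level 2: one child-category classifier per parent category
      clf_child = {}
      for p in set(parent):
          idx = [i for i, lab in enumerate(parent) if lab == p]
          clf_child[p] = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(
              [texts[i] for i in idx], [child[i] for i in idx])

      sent = "The dataset was split into train and test sets"
      p_hat = clf_parent.predict([sent])[0]
      print(p_hat, clf_child[p_hat].predict([sent])[0])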

  9. The two sides of sensory-cognitive interactions: effects of age, hearing acuity, and working memory span on sentence comprehension

    Renee eDeCaro

    2016-02-01

    Reduced hearing acuity is among the most prevalent of chronic medical conditions among older adults. An experiment is reported in which comprehension of spoken sentences was tested for older adults with good hearing acuity or with a mild-to-moderate hearing loss, and young adults with age-normal hearing. Comprehension was measured by participants' ability to determine the agent of an action in sentences that expressed this relation with a syntactically less complex subject-relative construction or a syntactically more complex object-relative construction. Agency determination was further challenged by inserting a prepositional phrase into sentences between the person performing an action and the action being performed. As a control, prepositional phrases of equivalent length were also inserted into sentences in a non-disruptive position. Effects on sentence comprehension of age, hearing acuity, prepositional phrase placement and sound level of stimulus presentations appeared only for comprehension of sentences with the more syntactically complex object-relative structures. Working memory as tested by reading span scores accounted for a significant amount of the variance in comprehension accuracy. Once working memory capacity and hearing acuity were taken into account, chronological age among the older adults contributed no further variance to comprehension accuracy. Results are discussed in terms of the positive and negative effects of sensory-cognitive interactions in comprehension of spoken sentences and lend support to a framework in which domain-general executive resources, notably verbal working memory, play a role in both linguistic and perceptual processing.

  10. Word Embedding Perturbation for Sentence Classification

    Zhang, Dongxu; Yang, Zhichao

    2018-01-01

    In this technical report, we aim to mitigate the overfitting problem of natural language models by applying data augmentation methods. Specifically, we attempt several types of noise to perturb the input word embeddings, such as Gaussian noise, Bernoulli noise, and adversarial noise. We also apply several constraints on the different types of noise. By implementing these proposed data augmentation methods, the baseline models gain improvements on several sentence classification tasks.
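
    A sketch of the Gaussian and Bernoulli noise variants applied to an input embedding matrix; the noise scales, array shapes, and function name are illustrative assumptions, and the adversarial variant is not shown.

      import numpy as np

      rng = np.random.default_rng(3)

      def perturb_embeddings(emb, sigma=0.1, p_keep=0.9, mode="gaussian"):
          """Data augmentation on a (seq_len, dim) embedding matrix."""
          if mode == "gaussian":
              return emb + rng.normal(scale=sigma, size=emb.shape)
          if mode == "bernoulli":        # randomly drop whole word vectors
              mask = rng.binomial(1, p_keep, size=(emb.shape[0], 1))
              return emb * mask
          raise ValueError(mode)

      embeddings = rng.normal(size=(12, 300))        # e.g. a 12-token sentence, 300-d vectors
      augmented  = perturb_embeddings(embeddings, sigma=0.1)
      print(np.abs(augmented - embeddings).mean())   # average perturbation magnitude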

  11. The ICSI+ Multilingual Sentence Segmentation System

    2006-01-01

    …these steps the ASR output needs to be enriched with information additional to words, such as speaker diarization, sentence segmentation, or story… and the output of a speaker diarization is considered as well. We first detail the extraction of the prosodic features, and then describe the classification… also takes into account the speaker turns estimated by the diarization system. In addition to the Max… 1) model speaker turn unigrams, trigram…

  12. GENERATIVE WORDS OF ALBANIAN AND ENGLISH SENTENCE

    Shkelqim Millaku

    2017-01-01

    The aim of this research is to deal with generative morphemes, words, and “simple or compound” sentences. The full contrast between Albanian and English in this generative phenomenon lies in morphology and in syntactic structure. These aspects are compared, contrasted, and generated between the two languages. The study deals with the noun (noun phrase) and the verb (verb phrase) in the syntactic structure of Albanian and English. In both languages, most linguis…

  13. Statistical media and process optimization for biotransformation of rice bran to vanillin using Pediococcus acidilactici.

    Kaur, Baljinder; Chakraborty, Debkumar

    2013-11-01

    A strain of P. acidilactici capable of producing vanillin from rice bran was isolated from a milk product. Response Surface Methodology was employed for statistical media and process optimization for the production of biovanillin. Statistical medium optimization was done in two steps involving a Plackett-Burman design and a central composite design. The RSM-optimized vanillin production medium consisted of 15% (w/v) rice bran, 0.5% (w/v) peptone, 0.1% (w/v) ammonium nitrate, 0.005% (w/v) ferulic acid, 0.005% (w/v) magnesium sulphate, and 0.1% (v/v) Tween-80, pH 5.6, at a temperature of 37 °C under shaking at 180 rpm. A vanillin yield of 1.269 g/L was obtained within 24 h of incubation in the optimized culture medium. This is the first report indicating such a high vanillin yield obtained during biotransformation of ferulic acid to vanillin using a Pediococcal isolate.

  14. GPR Raw-Data Order Statistic Filtering and Split-Spectrum Processing to Detect Moisture

    Gokhan Kilic

    2014-05-01

    Considerable research into the area of bridge health monitoring has been undertaken; however, information is still lacking on the effects of certain defects, such as moisture ingress, on the results of ground penetrating radar (GPR) surveying. In this paper, this issue is addressed by examining the results of a GPR bridge survey, specifically the effect of moisture on the predicted position of the rebars. It was found that moisture ingress alters the radargram to indicate distortion or skewing of the steel reinforcements, when in fact destructive testing was able to confirm that no such distortion or skewing had occurred. Additionally, split-spectrum processing with order statistic filters was utilized to detect moisture ingress from the GPR raw data.

  15. Damage localization by statistical evaluation of signal-processed mode shapes

    Ulriksen, Martin Dalgaard; Damkilde, Lars

    2015-01-01

    Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the … (…) and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.

  16. A commercial microbial enhanced oil recovery process: statistical evaluation of a multi-project database

    Portwood, J.T.

    1995-12-31

    This paper discusses a database of information collected and organized during the past eight years from 2,000 producing oil wells in the United States, all of which have been treated with special applications techniques developed to improve the effectiveness of MEOR technology. The database, believed to be the first of its kind, has been generated for the purpose of statistically evaluating the effectiveness and economics of the MEOR process in a wide variety of oil reservoir environments, and is a tool that can be used to improve the predictability of treatment response. The information in the database has also been evaluated to determine which, if any, reservoir characteristics are dominant factors in determining the applicability of MEOR.

  17. Self-Organized Criticality in Astrophysics The Statistics of Nonlinear Processes in the Universe

    Aschwanden, Markus

    2011-01-01

    The concept of ‘self-organized criticality’ (SOC) has been applied to a variety of problems, ranging from population growth and traffic jams to earthquakes, landslides and forest fires. The technique is now being applied to a wide range of phenomena in astrophysics, such as planetary magnetospheres, solar flares, cataclysmic variable stars, accretion disks, black holes and gamma-ray bursts, and also to phenomena in galactic physics and cosmology. Self-organized Criticality in Astrophysics introduces the concept of SOC and shows that, due to its universality and ubiquity, it is a law of nature. The theoretical framework and specific physical models are described, together with a range of applications in various aspects of astrophysics. The mathematical techniques, including the statistics of random processes, time series analysis, time scale and waiting time distributions, are presented and the results are applied to specific observations of astrophysical phenomena.

  18. Statistical learning problem of artificial neural network to control roofing process

    Lapidus Azariy

    2017-01-01

    Software developed on the basis of artificial neural networks (ANN) is now being actively implemented in construction companies to support decision-making in the organization and management of construction processes. ANN learning is the main stage of its development. A key question for supervised learning is how many training examples are needed to approximate the true relationship between network inputs and outputs with the desired accuracy. The design of the ANN architecture is also related to the learning problem known as the “curse of dimensionality”. This problem is important for the study of construction process management because of the difficulty of obtaining training data from construction sites. In previous studies the authors designed a 4-layer feedforward ANN with a 12-5-4-1 unit model to approximate estimation and prediction of the roofing process. This paper presents the statistical learning side of the created ANN with a simple error-minimization algorithm. The sample size for efficient training and the confidence interval of the network outputs are defined. In conclusion, the authors predict successful ANN learning in a large construction business company within a short space of time.
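
    A sketch of a 12-5-4-1 feedforward network trained by plain squared-error minimization, matching the unit model mentioned above; the synthetic data, tanh activation, and learning rate are placeholders rather than the authors' setup.

      import numpy as np

      rng = np.random.default_rng(6)
      sizes = [12, 5, 4, 1]                      # the 12-5-4-1 unit model
      W = [rng.normal(scale=0.5, size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
      b = [np.zeros(n) for n in sizes[1:]]

      def forward(x):
          acts = [x]
          for Wi, bi in zip(W, b):
              acts.append(np.tanh(acts[-1] @ Wi + bi))
          return acts

      # synthetic training set: 12 project features -> normalized roofing duration
      X = rng.normal(size=(200, 12))
      y = np.tanh(X @ rng.normal(size=(12, 1)) * 0.3)

      lr = 0.05
      for epoch in range(500):                   # plain gradient descent on squared error
          acts = forward(X)
          delta = (acts[-1] - y) * (1 - acts[-1] ** 2)
          for i in range(len(W) - 1, -1, -1):
              gW = acts[i].T @ delta / len(X)
              gb = delta.mean(0)
              if i > 0:
                  delta = (delta @ W[i].T) * (1 - acts[i] ** 2)
              W[i] -= lr * gW
              b[i] -= lr * gb

      print("final MSE:", float(np.mean((forward(X)[-1] - y) ** 2)))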

  19. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Sutikno Sutikno

    2010-08-01

    One of the models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model consisting of numerical, deterministic equations that follow the laws of physics. It is a main tool for predicting climate and weather and is used as a primary information source for assessing climate change effects. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM and the small scale of the study area. GCM data are spatial and temporal, so spatial correlation between data on different grid points in a single domain is likely. Multicollinearity problems therefore require pre-processing of the predictor data X. Continuum Regression (CR) with Principal Component Analysis (PCA) pre-processing is an alternative approach to SD modelling. CR is a method developed by Stone and Brooks (1990); it is a generalization of Ordinary Least Squares (OLS), Principal Component Regression (PCR), and Partial Least Squares (PLS), used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that, in terms of RMSEP and predictive R2 in the 8x8 and 12x12 domains, the CR method produces better results than PCR and PLS.
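
    A sketch of the pre-processing-plus-regression idea, with principal component regression standing in for the continuum regression family; the grid size, synthetic predictors, and number of retained components are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)

      # X: GCM output on an 8x8 grid (64 correlated predictors), y: station rainfall
      n, grid = 240, 64
      base = rng.normal(size=(n, 1))
      X = base + 0.3 * rng.normal(size=(n, grid))     # strongly collinear columns
      y = 2.0 * base[:, 0] + rng.normal(scale=0.5, size=n)

      # PCA pre-processing: project standardized X onto its leading components
      Xs = (X - X.mean(0)) / X.std(0)
      _, _, Vt = np.linalg.svd(Xs, full_matrices=False)
      k = 3
      scores = Xs @ Vt[:k].T                          # component scores

      # ordinary least squares on the component scores (principal component regression)
      design = np.column_stack([np.ones(n), scores])
      beta, *_ = np.linalg.lstsq(design, y, rcond=None)
      y_hat = design @ beta
      rmsep = np.sqrt(np.mean((y - y_hat) ** 2))
      print(f"RMSEP = {rmsep:.3f}")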

  20. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Alireza Raygan Shirazinezhad

    2015-06-01

    Background: Wastewater treatment involves very complex and interrelated physical, chemical and biological processes which, using data analysis techniques, can be rigorously modeled by relatively simple mathematical models. Materials and Methods: In this study, data on wastewater treatment processes from the water and wastewater company of Kohgiluyeh and Boyer Ahmad were used. A total of 3306 data points for COD, TSS, pH and turbidity were collected and then analyzed with SPSS-16 (descriptive statistics) and IBM SPSS Modeler 14.2 (data analysis through nine algorithms). Results: Logistic regression, neural networks, Bayesian networks, discriminant analysis, the C5 decision tree, C&R tree, CHAID, QUEST and SVM had accuracies of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92%, respectively. Discussion and conclusion: The C5 algorithm was chosen as the best and most applicable algorithm for modeling wastewater treatment processes, with an accuracy of 97.899%, and the most influential variables in this model were pH, COD, TSS and turbidity.

  1. Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.

    Drever, L; Salomons, G

    2012-07-01

    Statistical process control (SPC) methods were used to analyze 239 measurement based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70Gy in 35 fractions, and all prostate cases with 76Gy in 38 fractions planned between March 2009 and 2012. The results were used to determine if the tolerance limits currently being used for IMRT QA were able to indicate if the process was under control. The SPC calculations were repeated for IMRT QA of the same type of cases that were planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system the SPC analysis found that the a priori limits were no longer capable of indicating control for 2 of the 3 metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.

  2. Statistical optimization of microencapsulation process for coating of magnesium particles with Viton polymer

    Pourmortazavi, Seied Mahdi, E-mail: pourmortazavi@yahoo.com [Faculty of Material and Manufacturing Technologies, Malek Ashtar University of Technology, P.O. Box 16765-3454, Tehran (Iran, Islamic Republic of); Babaee, Saeed; Ashtiani, Fatemeh Shamsi [Faculty of Chemistry & Chemical Engineering, Malek Ashtar University of Technology, Tehran (Iran, Islamic Republic of)

    2015-09-15

    Highlights: • The surface of magnesium particles was modified with Viton via a solvent/non-solvent method. • FT-IR, SEM, EDX, map analysis, and TG/DSC techniques were employed to characterize the coated particles. • Coating process factors were optimized by Taguchi robust design. • The importance of coating conditions for the resistance of coated magnesium against oxidation was studied. Abstract: The surface of magnesium particles was modified by coating with Viton as an energetic polymer using a solvent/non-solvent technique. The Taguchi robust method was utilized as a statistical experiment design to evaluate the role of the coating process parameters. The coated magnesium particles were characterized by various techniques, i.e., Fourier transform infrared (FT-IR) spectroscopy, scanning electron microscopy (SEM), energy-dispersive X-ray spectroscopy (EDX), thermogravimetry (TG), and differential scanning calorimetry (DSC). The results showed that coating the magnesium powder with Viton leads to a higher resistance of the metal against oxidation in air. Meanwhile, tuning of the coating process parameters (i.e., percent of Viton, flow rate of non-solvent addition, and type of solvent) influences the resistance of the metal particles against thermal oxidation. Coating yields Viton-coated particles with higher thermal stability (632 °C) in comparison with the pure magnesium powder, which commences oxidation in air at the lower temperature of 260 °C.

  3. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.

  4. The modality-switch effect: Visually and aurally presented prime sentences activate our senses

    Elisa eScerrati

    2015-10-01

    Verifying different sensory modality properties for concepts results in a processing cost known as the Modality-Switch Effect. It has been argued that this cognitive cost is the result of a perceptual simulation. This paper extends this argument and reports an experiment investigating whether the effect is the result of an activation of sensory information which can also be triggered by perceptual, linguistically described stimuli. Participants were first exposed to a prime sentence describing a light or a sound's perceptual property (e.g. The light is flickering, The sound is echoing), then required to perform a property-verification task on a target sentence (e.g. Butter is yellowish, Leaves rustle). The content modalities of the prime and target sentences could be compatible (i.e. in the same modality: e.g. visual-visual) or not (i.e. in different modalities). Crucially, we manipulated the stimuli's presentation modality such that half of the participants were faced with written sentences while the other half were faced with aurally presented sentences. Results show a cost when two different modalities alternate, compared to when the same modality is repeated, with both visual and aural stimulus presentations. This result supports the embodied and grounded cognition view, which claims that conceptual knowledge is grounded in the perceptual system. Specifically, this evidence suggests that sensory modalities can be pre-activated through the simulation of either read or heard linguistic stimuli describing visual or acoustic perceptual properties.

  5. Semantic and pragmatic factors influencing deaf and hearing students' comprehension of English sentences containing numeral quantifiers.

    Kelly, Ronald R; Berent, Gerald P

    2011-01-01

    This research contrasted deaf and hearing students' interpretive knowledge of English sentences containing numeral quantifier phrases and indefinite noun phrases. A multiple-interpretation picture task methodology was used to assess 305 participants' judgments of the compatibility of sentence meanings with depicted discourse contexts. Participants' performance was assessed on the basis of hearing level (deaf, hearing) and grade level (middle school, high school, college). The deaf students were predicted to have differential access to specific sentence interpretations in accordance with the relative derivational complexity of the targeted sentence types. Hypotheses based on the pressures of derivational economy on acquisition were largely supported. The results also revealed that the deaf participants tended to overactivate pragmatic processes that yielded principled, though non-target, sentence interpretations. Collectively, the results not only contribute to the understanding of English acquisition under conditions of restricted access to spoken language input, they also suggest that pragmatic factors may play a broad role in influencing, and compromising, deaf students' reading comprehension and written expression.

  6. The Role of Working Memory in Planning and Generating Written Sentences

    Ronald T. Kellogg

    2016-02-01

    Planning a sentence with concrete concepts whose referents can be mentally imaged has been shown in past work to require the limited resources of visual working memory. By contrast, grammatically encoding such concepts as lexical items in a syntactic structure requires verbal working memory. We report an experiment designed to demonstrate a double dissociation of these two stores of working memory by manipulating the difficulty of (1) planning, by comparing related concepts to unrelated concepts, and (2) grammatical encoding of an English sentence in active voice versus the more complex structure of the passive voice. College students (N = 46) composed sentences that were to include two noun prompts (related versus unrelated) while concurrently performing either a visual or a verbal distracting task. Instructions to produce either active or passive sentences were manipulated between groups. The results surprisingly indicated that the supposedly easier planning with related concepts made a large demand on verbal working memory, rather than unrelated concepts demanding more visual working memory. The temporal dynamics of the sentence production process appear to best account for the unexpected findings.

  7. Musical metaphors: evidence for a spatial grounding of non-literal sentences describing auditory events.

    Wolter, Sibylla; Dudschig, Carolin; de la Vega, Irmgard; Kaup, Barbara

    2015-03-01

    This study investigated whether the spatial terms high and low, when used in sentence contexts implying a non-literal interpretation, trigger similar spatial associations as would have been expected from the literal meaning of the words. In three experiments, participants read sentences describing either a high or a low auditory event (e.g., The soprano sings a high aria vs. The pianist plays a low note). In all Experiments, participants were asked to judge (yes/no) whether the sentences were meaningful by means of up/down (Experiments 1 and 2) or left/right (Experiment 3) key press responses. Contrary to previous studies reporting that metaphorical language understanding differs from literal language understanding with regard to simulation effects, the results show compatibility effects between sentence implied pitch height and response location. The results are in line with grounded models of language comprehension proposing that sensory motor experiences are being elicited when processing literal as well as non-literal sentences. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Sentence-Level Effects of Literary Genre: Behavioral and Electrophysiological Evidence

    Stefan Blohm

    2017-11-01

    The current study used event-related brain potentials (ERPs) and behavioral measures to examine effects of genre awareness on sentence processing and evaluation. We hypothesized that genre awareness modulates effects of genre-typical manipulations. We manipulated instructions between participants, either specifying a genre (poetry) or not (neutral). Sentences contained genre-typical variations of semantic congruency (congruent/incongruent) and morpho-phonological features (archaic/contemporary inflections). Offline ratings of meaningfulness (n = 64/group) showed higher average ratings for semantically incongruent sentences in the poetry vs. neutral condition. ERPs during sentence reading (n = 24/group; RSVP presentation at a fixed per-constituent rate; probe task) showed a left-lateralized N400-like effect for contemporary vs. archaic inflections. Semantic congruency elicited a bilateral posterior N400 effect for incongruent vs. congruent continuations, followed by a centro-parietal positivity (P600). While N400 amplitudes were insensitive to the genre, the latency of the P600 was delayed by the poetry instruction. From these results, we conclude that during real-time sentence comprehension, readers are sensitive to subtle morphological manipulations and the implicit prosodic differences that accompany them. By contrast, genre awareness affects later stages of comprehension.

  9. Wind gust estimation by combining numerical weather prediction model and statistical post-processing

    Patlakas, Platon; Drakaki, Eleni; Galanis, George; Spyrou, Christos; Kallos, George

    2017-04-01

    The continuous rise of off-shore and near-shore activities, as well as the development of structures such as wind farms and various offshore platforms, requires the employment of state-of-the-art risk assessment techniques. Such analysis is used to set the safety standards and can be characterized as a climatologically oriented approach. Nevertheless, reliable operational support is also needed in order to minimize cost drawbacks and human danger during the construction and functioning stages as well as during maintenance activities. One of the most important parameters for this kind of analysis is wind speed intensity and variability. A critical measure associated with this variability is the presence and magnitude of wind gusts as estimated at the reference level of 10 m. The latter can be attributed to different processes, ranging from boundary-layer turbulence and convective activity to mountain waves and wake phenomena. The purpose of this work is the development of a wind gust forecasting methodology combining a Numerical Weather Prediction model and a dynamical statistical tool based on Kalman filtering. To this end, the Wind Gust Estimate parameterization method was implemented within the framework of the atmospheric model SKIRON/Dust. The new modeling tool combines the atmospheric model with a statistical local adaptation methodology based on Kalman filters. It has been tested over the offshore west coastline of the United States. The main purpose is to provide a useful tool for wind analysis and prediction and for applications related to offshore wind energy (power prediction, operation and maintenance). The results have been evaluated using observational data from NOAA's buoy network. The predicted output shows good behavior that is further improved after the local adjustment post-processing.
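
    A sketch of the Kalman-filter bias-correction step on a synthetic gust series; the scalar random-walk bias model and the noise variances are assumptions, and the operational system's state formulation is more elaborate.

      import numpy as np

      rng = np.random.default_rng(5)

      # synthetic series: NWP gust forecasts with a slowly drifting bias versus buoy observations
      T = 200
      truth = 12 + 3 * np.sin(np.linspace(0, 8, T))
      bias = 2.0 + np.cumsum(rng.normal(scale=0.05, size=T))
      forecast = truth + bias + rng.normal(scale=0.8, size=T)
      obs = truth + rng.normal(scale=0.5, size=T)

      # scalar Kalman filter tracking the forecast bias b_t, modelled as a random walk
      q, r = 0.01, 1.0            # assumed process / observation noise variances
      b, P = 0.0, 1.0             # initial bias estimate and its variance
      corrected = np.empty(T)
      for t in range(T):
          P += q                              # predict step (random-walk state)
          corrected[t] = forecast[t] - b      # debias with the bias known so far
          innovation = (forecast[t] - obs[t]) - b
          K = P / (P + r)                     # Kalman gain
          b += K * innovation                 # update the bias estimate
          P *= 1 - K

      rmse = lambda x: float(np.sqrt(np.mean((x - obs) ** 2)))
      print(f"raw RMSE: {rmse(forecast):.2f}  corrected RMSE: {rmse(corrected):.2f}")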

  10. Trial-Based Thought Record (TBTR): preliminary data on a strategy to deal with core beliefs by combining sentence reversion and the use of analogy with a judicial process.

    Oliveira, Irismar Reis de

    2008-03-01

    To propose the Trial-Based Thought Record, a modified, 7-column thought record addressing core beliefs by sentence reversion and the analogy to a trial. Clients (n = 30) participated in a simulation of a trial and exhibited shifts in their adherence to core beliefs and in the intensity of corresponding emotions after each step (investigation, prosecutor's plea, defense attorney's plea, prosecutor's second plea, defense attorney's second plea, and jury verdict) during a session. Significant mean reductions existed between percent values after investigation (taken as baseline) and defense attorney's plea (p < 0.001), and after the jury's verdict, either in beliefs (p < 0.001) or in intensity of emotions (p < 0.001). Significant differences also emerged between the defense attorney's first and second pleas (p = 0.009) and between the defense attorney's second plea and jury's verdict concerning core beliefs (p = 0.005) and emotions (p = 0.02). Trial-Based Thought Record may at least temporarily help patients constructively reduce attachment to negative core beliefs and corresponding emotions.

  11. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  12. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility to improve outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed--83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in elective open (57.1 vs. 20.0%) and elective lap groups (77.6 vs. 26.1%). Outliers also had higher readmission rates for emergent open (11.4 vs. 5.4%), emergent lap (14.3 vs. 9.2%), and elective lap (32.8 vs. 6.9%). Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification, during quality improvement efforts, and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.

  13. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management, (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weakness to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement

  14. Infant Statistical-Learning Ability Is Related to Real-Time Language Processing

    Lany, Jill; Shoaib, Amber; Thompson, Abbie; Estes, Katharine Graf

    2018-01-01

    Infants are adept at learning statistical regularities in artificial language materials, suggesting that the ability to learn statistical structure may support language development. Indeed, infants who perform better on statistical learning tasks tend to be more advanced in parental reports of infants' language skills. Work with adults suggests…

  15. Dispensing processes impact apparent biological activity as determined by computational and statistical analyses.

    Sean Ekins

    Dispensing and dilution processes may profoundly influence estimates of biological activity of compounds. Published data show Ephrin type-B receptor 4 IC50 values obtained via tip-based serial dilution and dispensing versus acoustic dispensing with direct dilution differ by orders of magnitude with no correlation or ranking of datasets. We generated computational 3D pharmacophores based on data derived by both acoustic and tip-based transfer. The computed pharmacophores differ significantly depending upon dispensing and dilution methods. The acoustic dispensing-derived pharmacophore correctly identified active compounds in a subsequent test set where the tip-based method failed. Data from acoustic dispensing generates a pharmacophore containing two hydrophobic features, one hydrogen bond donor and one hydrogen bond acceptor. This is consistent with X-ray crystallography studies of ligand-protein interactions and automatically generated pharmacophores derived from this structural data. In contrast, the tip-based data suggest a pharmacophore with two hydrogen bond acceptors, one hydrogen bond donor and no hydrophobic features. This pharmacophore is inconsistent with the X-ray crystallographic studies and automatically generated pharmacophores. In short, traditional dispensing processes are another important source of error in high-throughput screening that impacts computational and statistical analyses. These findings have far-reaching implications in biological research.

  16. Statistical process control analysis for patient quality assurance of intensity modulated radiation therapy

    Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun

    2017-11-01

    This study applied statistical process control to set and verify quality assurance (QA) tolerance standards suited to our hospital's characteristics, using criteria that are applied to all the treatment sites in this analysis. The gamma test factor of the delivery quality assurance (DQA) was based on 3%/3 mm. Head and neck, breast, and prostate cases treated with intensity modulated radiation therapy (IMRT) or volumetric arc radiation therapy (VMAT) were selected as the QA treatment sites for analysis. The numbers of data sets used in the analysis were 73 and 68 for head and neck patients, and 49 and 152 for prostate and breast, measured with MapCHECK and ArcCHECK, respectively. The Cp values of the head and neck and prostate QA were above 1.0, and the Cpml values were 1.53 and 1.71, respectively, which is close to the target value of 100%. The Cpml value of breast IMRT QA was 1.67, so the data values are close to the target value of 95%, but the Cp value was 0.90, which means that the data values are widely distributed. The Cp and Cpml of breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.

  17. Healthy Aging and Compensation of Sentence Comprehension Auditory Deficits

    Marcela Lima Silagi

    2015-01-01

    Objectives. To analyze the effect of aging on sentence auditory comprehension and to study the relationship between this language skill and cognitive functions (attention, working memory, and executive functions). Methods. A total of 90 healthy subjects were divided into three groups: adults (50–59 years), young-old (60–69 years), and old-old (70–80 years). Subjects were assessed using the Revised Token Test. The measures used for performance analysis were number of correct answers (accuracy) and execution time of commands on the different subtests. Results. Regarding accuracy, groups showed similar performance on the first blocks, but the young-old and old-old performed worse than adults on blocks 9 and 10. With respect to execution time, groups differed from block 2 (i.e., the groups differed for all blocks, except for block 1), with the worst performance observed in the old-old group, followed by that of the young-old group. Therefore, the elderly required more time to attain performance similar to that of adults, showing that time measurements are more sensitive for detecting the effects of age. Sentence comprehension ability is correlated with cognitive test performance, especially for global cognition and working memory tests. Conclusions. Healthy aging is characterized by the ability to compensate for difficulties in linguistic processing, which allows the elderly to maintain functional communication.

  18. A Computerized Version of the Scrambled Sentences Test

    Roberto Viviani

    2018-01-01

    The scrambled sentences test (SST), an experimental procedure that involves participants writing down their cognitions, has been used to elicit individual differences in depressiveness and vulnerability to depression. We describe here a modification of the SST to adapt it to computerized administration, with a particular view to its use in large samples and functional neuroimaging applications. In a first study with the computerized version, we reproduce the preponderance of positive cognitions in the healthy and the inverse association of these cognitions with individual measures of depressiveness. We also report a tendency of self-referential cognitions to elicit higher positive cognition rates. In a second study, we describe the patterns of neural activations elicited by emotional and neutral sentences in a functional neuroimaging study, showing that it replicates and extends previous findings obtained with the original version of the SST. During the formation of emotional cognitions, ventral areas such as the ventral anterior cingulus and the supramarginal gyrus were relatively activated. This activation pattern speaks for the recruitment of mechanisms coordinating motivational and associative processes in the formation of value-based decisions.

  19. Artificial intelligence versus statistical modeling and optimization of continuous bead milling process for bacterial cell lysis

    Shafiul Haque

    2016-11-01

    For a commercially viable recombinant intracellular protein production process, efficient cell lysis and protein release is a major bottleneck. The recovery of a recombinant protein, cholesterol oxidase (COD), was studied in a continuous bead milling process. A full factorial Response Surface Model (RSM) design was employed and compared to Artificial Neural Networks coupled with a Genetic Algorithm (ANN-GA). Significant process variables, cell slurry feed rate (A), bead load (B), cell load (C), and run time (D), were investigated and optimized for maximizing COD recovery. RSM predicted an optimum at a feed rate of 310.73 mL/h, bead loading of 79.9% (v/v), cell loading OD600 nm of 74, and run time of 29.9 min, with a recovery of ~3.2 g/L. ANN coupled with GA predicted a maximum COD recovery of ~3.5 g/L at an optimum feed rate of 258.08 mL/h, bead loading of 80% (v/v), cell loading OD600 nm of 73.99, and run time of 32 min. An overall 3.7-fold increase in productivity is obtained when compared to a batch process. Optimization and comparison of statistical vs. artificial intelligence techniques in a continuous bead milling process has been attempted for the very first time in our study. We were able to successfully represent the complex non-linear multivariable dependence of enzyme recovery on bead milling parameters. Quadratic second-order response functions are not flexible enough to represent such complex non-linear dependence, whereas an ANN, being a summation function of multiple layers, is capable of representing the complex non-linear dependence of the variables, in this case enzyme recovery as a function of bead milling parameters. Since GA can optimize even discontinuous functions, the present study is an example of using machine learning (ANN) in combination with evolutionary optimization (GA) to represent undefined biological functions, which is the case for common industrial processes involving biological moieties.

  20. Statistical evaluation of tablet coating processes: influence of pan design and solvent type

    Valdomero Pereira de Melo Junior

    2010-12-01

    Partially and fully perforated pan coaters are among the most relevant types of equipment currently used in the process of coating tablets. The goal of this study was to assess the performance differences among these types of equipment employing a factorial design. This statistical approach allowed the simultaneous study of the process variables and verification of interactions among them. The study included partially perforated and fully perforated pan coaters, aqueous and organic solvents, as well as a hypromellose-based immediate-release coating. The dependent variables were process time, energy consumption, mean weight of tablets, and process yield. For the tests, placebo tablets with a mean weight of 250 mg were produced, divided into eight lots of two kilograms each, and coated in duplicate using both partially perforated and fully perforated pan coaters. The results showed a significant difference between the types of equipment used (partially and fully perforated pan coaters) with regard to process time and energy consumption, whereas no significant difference was identified for mean weight of the coated tablets and process yield.