Sample records for 510(k) process CBER

  1. 76 FR 45825 - Center for Devices and Radiological Health 510(k) Clearance Process; Institute of Medicine Report...


    ... Process; Institute of Medicine Report: "Medical Devices and the Public's Health, The FDA 510(k) Clearance... Medicine (IOM) report entitled: "Medical Devices and the Public's Health, The FDA 510(k) Clearance Process.... ADDRESSES: See the SUPPLEMENTARY INFORMATION section for electronic access to the document....

  2. 75 FR 4402 - Strengthening the Center for Devices and Radiological Health's 510(k) Review Process; Public...


    ... requirement), that the device meets the comparative standard of "substantial equivalence" to a "predicate... are safe and effective, and to promote innovation in the medical device industry. The 510(k) premarket... different "intended uses" and explain your reasoning. b. What are the advantages and disadvantages...

  3. CBERS-02 Application Assessment

    GUO Jianning; YU Jin; MIN Xiangjun; LI Xingchao; HOU Minghui


    As the successor of CBERS-01, CBERS-02 was launched successfully on 21 October 2003 and transmitted its first downlink data the next day. According to the OBT (On Board Test) outline, CRESDA cooperated with CAST and many remote sensing organizations in China in implementing the test of the satellite payloads, the ground processing system and data application. In this paper, the test is briefly illustrated in three parts: Ⅰ. Satellite Parameters Test (especially the test for payloads), Ⅱ. Payload Image Quality Assessment, and Ⅲ. Data Application Assessment. The results of the test show that the image quality of CBERS-02 is much improved over CBERS-01 and that the satellite will continue to play an important role in the social and economic development of China.

  4. Center for Biologics Evaluation and Research (CBER)

    Federal Laboratory Consortium — CBER is the Center within FDA that regulates biological products for human use under applicable federal laws, including the Public Health Service Act and the Federal...

  5. CBERS-2B Monitored Forest Fires In Yunnan Province



    Several forest fires hit Yunnan Province, in the southwest of China, from April 6 to 9. Two disastrous fires occurred near Shangri-La County, Yunnan Province. At the request of the Land and Surveying Department of Yunnan Province, the China Center for Resources Satellite Data & Application (CRESDA) provided satellite monitoring images to detect the events. The processed CBERS-2B images were delivered to the relevant departments for decision making and disaster relief.

  6. Prototype Development of the Second-Generation CBERS Initiated



    It is reported that the concept for the second-generation CBERS satellites, CBERS-03 & 04, to be jointly developed by the China Academy of Space Technology (CAST) and INPE, passed assessment by the China Aerospace Science and Technology Corporation (CASC) at the beginning of the year. This means that prototype development of CBERS-03 & 04 has been initiated.

  7. Applications Of CBERS Data In Oceanic Research

    Li Xiaomin; Zhang Jie; Ma Yi; Song Pingjian


    CBERS series satellite data have been widely used by the First Institute of Oceanography, State Oceanic Administration (SOA), for studies related to the monitoring of coral islands, land-use changes of islands and the coastal zone, green tide, implementation of ocean functional zoning, suspended particulate matter, sea ice, raft cultivation and so on. The data cover islands, the coastal zone and inshore sea areas.

  8. The Application Achievements And Perspective Of CBERS Series Satellite Imagery

    Li Xingchao; Qi Xueyong; Lu Yilin


    Since the launch of the first China-Brazil Earth Resources Satellite (CBERS-1) in 1999, CBERS data have been applied extensively in many fields, and remarkable social and economic benefits have been achieved. This article presents the application achievements of the past nine years and gives a perspective on the future. These applications demonstrate that CBERS data have become an important data source for resources investigation and monitoring.

  9. Classification of CBERS-2 Imagery with Fuzzy ARTMAP Classifier

    骆成凤; 刘正军; 燕琴


    A fuzzy ARTMAP classifier is adopted for a classification experiment on CBERS-2 imagery. The fundamental theory and processing steps of the algorithm are first introduced, followed by a land-use classification experiment in Shihezi County on CBERS-2 high-resolution imagery. Three classifiers are compared: the maximum likelihood classifier (MLC), the error back propagation (BP) classifier, and the fuzzy ARTMAP classifier. The comparison shows comparably better results for the fuzzy ARTMAP classifier, whose overall classification accuracy is 9.9% and 4.6% higher than that of MLC and BP, respectively. The results also show that the fuzzy ARTMAP classifier has better discernment in identifying bare soil on CBERS-2 imagery.

  10. CBERS-2B Brazilian remote sensing satellite to help to monitor the Bolivia-Brazil gas pipeline

    Hernandes, Gilberto Luis Sanches [TBG Transportadora Brasileira Gasoduto Bolivia-Brasil, Rio de Janeiro, RJ (Brazil)]


    This paper presents the results of using the CBERS-2B Brazilian remote sensing satellite to help monitor the Bolivia-Brazil gas pipeline. CBERS-2B, launched in 2007, is the third satellite of the CBERS Program (China-Brazil Earth Resources Satellite); its innovation was the HRC camera, which produces high-resolution images and makes it possible to obtain one complete coverage of the country every 130 days. In this study, 2 images from different parts of the Bolivia-Brazil gas pipeline were selected. Image processing involved geometric registration of the CBERS-2B satellite images with airborne images, contrast stretch transformation and pseudo color. Analysis of the satellite and airborne images in GIS software to detect third-party encroachment was effective in detecting native vegetation removal, street construction, growth of urban areas, farming and residential/industrial land development. Although still new, CBERS-2B shows good promise for helping to inspect the areas along the pipelines. (author)

  11. The investigation of special information distilling method of land use in karst area based on CBERS-02B and analysis on application: a case study of Duyun, Guizhou

    Hu, Juan; Luo, Miao; An, Yulun


    This paper explores optimal methods for processing CBERS-02B images and using them to classify land use in karst mountain areas with 3S technologies, especially RS digital image processing. Through multiple experiments and analyses, the difficulties CBERS-02B images have in distinguishing water from mountain shadows, and construction land from dry land and paddy fields, are satisfactorily resolved. The band combination 4-2-1, selected with the OIF method, proves optimal for classifying land use in karst areas. After comparing and evaluating IHS-, PCA- and HPC-based image fusion methods, IHS-transform fusion is found best for CBERS-02B HR and CCD data fusion in the case of karst highland mountains. Based on the experiments, this paper shows that CBERS images are capable of supporting large-scale land-use classification for karst areas and are a competent substitute for TM images in karst mountain land-use surveys.
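
    The OIF-based band selection mentioned above can be sketched generically: the Optimum Index Factor of a band triplet is the sum of the bands' standard deviations divided by the sum of the absolute pairwise correlation coefficients. This is a standard formulation, not the authors' code, and the 4-band stack below is synthetic.

```python
import itertools
import numpy as np

def oif(bands):
    """Optimum Index Factor for a set of band images (list of 2-D arrays):
    sum of band standard deviations / sum of absolute pairwise correlations."""
    flat = [b.ravel().astype(float) for b in bands]
    std_sum = sum(a.std() for a in flat)
    corr_sum = sum(abs(np.corrcoef(a, b)[0, 1])
                   for a, b in itertools.combinations(flat, 2))
    return std_sum / corr_sum

# Rank all 3-band combinations of a hypothetical 4-band stack
rng = np.random.default_rng(0)
stack = {i + 1: rng.integers(0, 255, (100, 100)) for i in range(4)}
best = max(itertools.combinations(stack, 3),
           key=lambda combo: oif([stack[i] for i in combo]))
print("highest-OIF combination:", best)
```

    Higher OIF favors triplets with high variance and low redundancy, which is why a combination such as 4-2-1 can emerge as optimal.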

  12. Atmospheric correction of CBERS CCD images with MODIS data

    LI Junsheng; ZHANG Bing; CHEN Zhengchao; SHEN Qian


    China-Brazil Earth Resources Satellite (CBERS) CCD images have much potential for inland water environmental monitoring; however, the accuracy of their atmospheric correction can affect their quantitative applications. This paper presents an atmospheric correction algorithm for CBERS CCD images using MODIS data from the same day; it improves the ocean color atmospheric correction algorithm developed by Gordon (1993, 1994) and makes it applicable to inland waters. The improved algorithm retrieves atmospheric parameters from MODIS data and uses them to perform the atmospheric correction of CBERS CCD images. Experimental results show that the MODIS-assisted atmospheric correction of CBERS CCD images is reliable. Furthermore, MODIS data can be freely obtained on a daily basis, making the algorithm useful for environmental monitoring of inland waters.

  13. Pre-Launch Absolute Calibration of CCD/CBERS-2B Sensor

    Ponzoni, Flávio Jorge; Albuquerque, Bráulio Fonseca Carneiro


    Pre-launch absolute calibration coefficients for the CCD/CBERS-2B sensor have been calculated from radiometric measurements performed in a satellite integration and test hall at the Chinese Academy of Space Technology (CAST) headquarters in Beijing, China. An illuminated integrating sphere was positioned in the test hall to allow CCD/CBERS-2B imaging of the entire sphere aperture. Calibration images were recorded, and a relative calibration procedure adopted exclusively in Brazil was applied to equalize the detectors' responses. Averages of digital numbers (DN) from these images were determined and correlated with their respective radiance levels in order to calculate the absolute calibration coefficients. This is the first time these pre-launch absolute calibration coefficients have been calculated following the Brazilian image-processing criteria. It will now be possible to compare them with coefficients calculated from vicarious calibration campaigns; this comparison will permit monitoring of CCD/CBERS-2B and frequent updating of the data delivered to the user community. PMID:27873886
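
    The calibration step described above, correlating mean DN with known sphere radiance levels, amounts to a linear least-squares fit. A minimal sketch with made-up numbers (not the actual CCD/CBERS-2B measurements):

```python
import numpy as np

# Hypothetical calibration data: mean DN recorded at several known
# integrating-sphere radiance levels (illustrative values only).
radiance = np.array([20.0, 60.0, 100.0, 140.0, 180.0])  # W m^-2 sr^-1 um^-1
mean_dn  = np.array([31.0, 84.0, 139.0, 192.0, 246.0])

# Absolute calibration: L = gain * DN + offset, fitted by least squares
gain, offset = np.polyfit(mean_dn, radiance, 1)
print(f"gain = {gain:.4f}, offset = {offset:.2f}")

# Convert an arbitrary DN to radiance with the fitted coefficients
dn = 150
print(f"DN {dn} -> L = {gain * dn + offset:.1f}")
```

    Once fitted, the same two coefficients convert every DN in an image to at-sensor radiance, which is what vicarious campaigns would later re-check.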

  14. Direct Determination of the Rate Coefficient for the Reaction of OH Radicals with Monoethanol Amine (MEA) from 296 to 510 K.

    Onel, L; Blitz, M A; Seakins, P W


    Monoethanol amine (H₂NCH₂CH₂OH, MEA) has been proposed for large-scale use in carbon capture and storage. We present the first absolute, temperature-dependent determination of the rate coefficient for the reaction of OH with MEA using laser flash photolysis for OH generation, monitoring OH removal by laser-induced fluorescence. The room-temperature rate coefficient is determined to be (7.61 ± 0.76) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹, and the rate coefficient decreases by about 40% by 510 K. The temperature dependence of the rate coefficient is given by k₁ = (7.73 ± 0.24) × 10⁻¹¹ (T/295)^(−(0.79 ± 0.11)) cm³ molecule⁻¹ s⁻¹. The high rate coefficient shows that gas-phase processing in the atmosphere will be competitive with uptake onto aerosols.
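
    The reported rate expression can be evaluated directly. The sketch below uses only the central parameter values quoted in the abstract (uncertainties omitted):

```python
# k1(T) = 7.73e-11 * (T/295)**(-0.79)  [cm^3 molecule^-1 s^-1]
def k_oh_mea(T):
    """Central-value rate coefficient for OH + MEA at temperature T (K)."""
    return 7.73e-11 * (T / 295.0) ** (-0.79)

k_room = k_oh_mea(296.0)
k_hot = k_oh_mea(510.0)
print(f"k(296 K) = {k_room:.2e}")
print(f"k(510 K) = {k_hot:.2e}")
# With central values the drop is ~35%, consistent with the quoted
# "about 40%" within the stated uncertainties.
print(f"decrease by 510 K: {(1 - k_hot / k_room) * 100:.0f}%")
```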

  15. 78 FR 100 - Guidance for Industry and Food and Drug Administration Staff; Refuse To Accept Policy for 510(k)s...


    ... HUMAN SERVICES Food and Drug Administration Guidance for Industry and Food and Drug Administration Staff; Refuse To Accept Policy for 510(k)s; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is announcing the availability of the...


    E. Amraei


    The CCD camera is a multi-spectral sensor carried by the CBERS-2 satellite; it images in push-broom mode. Images acquired by the CCD camera show vertical striping noise, caused by detector mismatch, inter-detector variability, improper detector calibration and low signal-to-noise ratio. These noises are more pronounced in images of homogeneous surfaces processed at level 2, and they make interpreting the data and extracting information from the images difficult. In this work, a spatial moment matching method is proposed to correct these images: the statistical moments, such as the mean and standard deviation of the columns in each band, are used to match the statistics of the detector array to reference values. After removal of this noise, some periodic diagonal stripes remain in the image, and their removal with the aforementioned method is not possible; to remove them, a frequency-domain Butterworth notch filter was applied. Finally, to evaluate the results, image statistical moments such as the mean and standard deviation were used. The study demonstrates the effectiveness of the method in noise removal.
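
    The column-wise moment matching described above can be sketched as follows. Taking the whole-band mean and standard deviation as the reference statistics is one common choice and an assumption here, not necessarily the authors'; the striped scene is synthetic.

```python
import numpy as np

def destripe_moment_matching(band):
    """Scale and shift each detector column so its mean/std match
    reference values (here: the whole-band statistics)."""
    band = band.astype(float)
    ref_mean, ref_std = band.mean(), band.std()
    col_mean = band.mean(axis=0)
    col_std = band.std(axis=0)
    col_std[col_std == 0] = 1.0          # guard against flat columns
    gain = ref_std / col_std
    offset = ref_mean - gain * col_mean
    return band * gain + offset          # broadcasts per column

# Synthetic example: smooth scene plus per-column gain/offset striping
rng = np.random.default_rng(1)
scene = np.tile(np.linspace(50, 200, 64), (64, 1)).T  # columns identical
striped = scene * rng.uniform(0.8, 1.2, 64) + rng.uniform(-10, 10, 64)
clean = destripe_moment_matching(striped)
print("column-mean spread before:", striped.mean(axis=0).std().round(2))
print("column-mean spread after: ", clean.mean(axis=0).std().round(2))
```

    The residual diagonal stripes mentioned in the abstract are periodic rather than column-aligned, which is why a frequency-domain notch filter is needed as a second step.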



    The CRESDA Application System for CBERS-1 was established in 1999. During more than two years of operation, about 240,000 scenes of CBERS-1 Level 0 data have been archived and more than 13,000 scenes of Level 2 products have been ordered by end users from different application fields. In this paper, typical application examples in crop yield estimation, disaster mitigation, resources survey and protection, environmental monitoring and sustainable development, and urban planning are described.

  18. Discrimination of citrus varieties using CCD/CBERS-2 satellite imagery

    Ieda Del'Arco Sanches


    This study evaluated CCD/CBERS-2 images for their ability to discriminate citrus varieties. The study area is located in Itirapina, São Paulo State, Brazil; three CCD images from 2004 were used (May 30, August 16, and September 11). A model integrating the elements of the sensed citrus scene is proposed and discussed in order to explain the variability of the responses of citrus parcels in orbital images such as CCD/CBERS-2. The images were classified with the Isoseg and MaxVer algorithms and, according to the kappa index, it is possible to obtain accuracies rated as very good; the best classifications were obtained with dry-season images.

  19. Identifying Spatial Units of Human Occupation in the Brazilian Amazon Using Landsat and CBERS Multi-Resolution Imagery

    Maria Isabel Sobral Escada


    Every spatial unit of human occupation is part of a network structuring an extensive process of urbanization in the Amazon territory. Multi-resolution remote sensing data were used to identify and map human presence and activities in the Sustainable Forest District of Cuiabá-Santarém highway (BR-163), west of Pará, Brazil. The limits of spatial units of human occupation were mapped based on digital classification of a Landsat-TM5 (Thematic Mapper 5) image (30 m spatial resolution). High-spatial-resolution CBERS-HRC (China-Brazil Earth Resources Satellite-High-Resolution Camera) images (5 m) merged with CBERS-CCD (Charge Coupled Device) images (20 m) were used to map spatial arrangements inside each populated unit, describing intra-urban characteristics. Fieldwork data validated and refined the classification maps that supported the categorization of the units. A total of 133 spatial units were individualized, comprising population centers as municipal seats, villages and communities, and units of human activities, such as sawmills, farmhouses, landing strips, etc. From the high-resolution analysis, 32 population centers were grouped in four categories, described according to their level of urbanization and spatial organization as: structured, recent, established and dependent on connectivity. This multi-resolution approach provided spatial information about the urbanization process and organization of the territory. It may be extended into other areas or be further used to devise a monitoring system, contributing to the discussion of public policy priorities for sustainable development in the Amazon.

  20. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.


    Land cover classification is often based on differing characteristics between classes but great homogeneity within each of them. This cover is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative for this task. However, in some developing countries, and particularly in Casacoima municipality in Venezuela, geographic information systems are lacking due to the lack of updated information and the high cost of software licenses. This research proposes a low-cost methodology to develop thematic mapping of local land use and cover types in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised classification, per pixel and per region, was applied using different algorithms, which were compared among themselves. Per-pixel classification was based on the Maxver (maximum likelihood) and Euclidean distance (minimum distance) algorithms, while per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per-region classification, with overall reliability of 83.93% and a kappa index of 0.81. The Maxver algorithm showed a reliability of 73.36% and a kappa index of 0.69, while Euclidean distance obtained 67.17% and 0.61 for reliability and kappa index, respectively. The proposed methodology proved very useful for cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools are an economically viable alternative not only for forestry organizations but for the general public, allowing projects in economically depressed and/or environmentally threatened areas.
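
    The per-pixel minimum-distance (Euclidean) step used above can be sketched generically; the two-band class means and sample pixels below are illustrative, not the study's training data.

```python
import numpy as np

def min_distance_classify(pixels, class_means):
    """Per-pixel minimum Euclidean distance classification.
    pixels: (n, bands); class_means: (k, bands). Returns a class index per pixel."""
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return d.argmin(axis=1)

# Illustrative 2-band class means for three hypothetical cover classes
means = np.array([[30.0, 120.0],   # class 0: water
                  [90.0,  60.0],   # class 1: bare soil
                  [60.0, 160.0]])  # class 2: vegetation
pixels = np.array([[32.0, 118.0], [88.0, 65.0], [58.0, 150.0]])
print(min_distance_classify(pixels, means))  # -> [0 1 2]
```

    Maxver differs in that it weights distances by per-class covariances and priors; minimum distance is the special case of equal, spherical class covariances.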

  1. Performance evaluation of different classifiers (Isoseg, Bhattacharyya, Maxver and Maxver-ICM) using CCD/CBERS-1 and ETM+/Landsat-7 fused images

    Wilson Lins de Mello Filho


    The aim of this study was to compare the performance of image classifiers (Isoseg, Bhattacharyya, Maxver and Maxver-ICM) based on an accuracy analysis (set percentage, area determination and Kappa coefficient), using an edited thematic map as ground truth. For this, pre-processing (atmospheric, geometric and radiometric corrections), contrast enhancement, IHS data fusion, principal component analysis, and classification of CCD/CBERS-1 and ETM+/Landsat-7 images were performed. Among all classifiers tested, Isoseg and Bhattacharyya presented the best performance for the studied classes and the study area. These results are expected to be relevant to environmental analyses based on orbital satellite data.
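
    The Kappa coefficient used in accuracy analyses like this one is computed from a confusion matrix; the formula below is the standard one, and the matrix is an illustrative example, not the study's data.

```python
import numpy as np

def kappa(confusion):
    """Cohen's Kappa coefficient from a confusion matrix
    (rows: reference classes, columns: mapped classes)."""
    confusion = np.asarray(confusion, dtype=float)
    n = confusion.sum()
    po = np.trace(confusion) / n                              # observed agreement
    pe = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

cm = np.array([[40,  5],
               [10, 45]])
print(f"overall accuracy = {np.trace(cm) / cm.sum():.2f}")
print(f"kappa = {kappa(cm):.2f}")
```

    Kappa discounts the agreement expected by chance, which is why it can be much lower than overall accuracy when class proportions are unbalanced.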

  2. Features information extraction of the mining area based on CBERS-02B

    王飞红; 任晓敏


    An object-oriented classification method is used to extract information on the Pingshuo surface coal mine in Shanxi Province, with China-Brazil Earth Resources Satellite (CBERS-02B) imagery as the data source. Multi-scale segmentation and a segmentation hierarchy are created with the object-oriented method, and the final segmentation scales are obtained by comparing different segmentation results. Using the spectral and spatial characteristics of specific surface features, the image is classified into vegetation, roads, mine construction, coal piles, mining faces and waste-rock dumps through fuzzy classification with membership functions. Finally, the classification result is evaluated with an error matrix: the overall classification accuracy reaches 88.03% and the Kappa coefficient is 0.88.
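
    A membership-function classification of the kind described can be sketched generically. The trapezoidal functions and NIR thresholds below are hypothetical illustrations, not the paper's actual membership definitions.

```python
import numpy as np

def trapezoid_membership(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, ramps up to 1 on [b, c],
    ramps back to 0 above d."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# Hypothetical NIR-reflectance memberships for two classes
nir = np.array([0.05, 0.45, 0.25])
mu_water = trapezoid_membership(nir, -0.01, 0.00, 0.08, 0.15)
mu_veg   = trapezoid_membership(nir,  0.25, 0.35, 0.60, 0.70)

# Assign each pixel to the class with the larger membership value
labels = np.where(mu_water > mu_veg, "water", "vegetation")
print(labels)
```

    In practice each class would combine several such memberships (spectral, shape, texture) over segmented objects rather than raw pixels.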

  3. Energetic particle radiations measured by particle detector on board CBERS-1 satellite

    HAO YongQiang; XIAO Zuo; ZOU Hong; ZHANG DongHe


    Using data measured by the energetic particle detectors on board CBERS-01 and -02 over the past five years, statistics were compiled to show the general features of MeV electrons and protons along a sun-synchronous orbit at an altitude of 780 km, a height in the bottom region of the Earth's radiation belts. The detectors are inside the satellite cabin, and such continuous monitoring of the particle radiation environment inside a satellite has seldom been conducted. After proper and careful treatment, the data from inside the satellite are shown to correlate well with the radiation environment outside. Besides the agreement of the general distribution characteristics of energetic electrons and protons with similar observations from other satellites, attention is paid particularly to disturbed conditions. Variations of particle fluxes are closely related to solar proton events; in general, electron fluxes of the outer belt correlate well with the Dst index after a three-day delay, while electron injection occurred almost on the same day during great magnetic storms. It is confirmed that both energetic electrons and protons appear in the polar cap region only after solar proton events.
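
    The delayed correlation with Dst described above can be illustrated with a generic lagged-correlation search. The daily series below are synthetic, with a three-day shift built in to mimic the reported delay.

```python
import numpy as np

def lag_corr(flux, dst, lag):
    """Correlation of flux[t] with dst[t - lag], for a non-negative lag in days."""
    if lag == 0:
        return np.corrcoef(flux, dst)[0, 1]
    return np.corrcoef(flux[lag:], dst[:-lag])[0, 1]

# Synthetic daily series: flux follows dst with a 3-day delay plus noise
rng = np.random.default_rng(2)
dst = rng.normal(size=200)
flux = np.empty(200)
flux[3:] = dst[:-3]
flux[:3] = 0.0
flux = flux + 0.1 * rng.normal(size=200)

best = max(range(8), key=lambda lag: abs(lag_corr(flux, dst, lag)))
print("best-correlated lag (days):", best)
```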

  4. CCD CBERS and ASTER data in dasometric characterization of Pinus radiata D. Don (north-western Spain)

    Eva Sevillano-Marco


    A China-Brazil Earth Resources Satellite (CBERS) scene and an Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) scene, coupled with ancillary georeferenced data and a field survey, were employed to examine the potential of remote sensing data for assessing stand basal area, volume and aboveground biomass over large areas of Pinus radiata D. Don plantations in north-western Spain. Statistical analysis showed that the near-infrared band and the shade-fraction image had significant correlation coefficients with all stand variables considered. Predictive models were accordingly selected and used to map the spatial distribution of stand variables in radiata stands delimited by the National Forestry Map. The study reinforces the potential of remote sensing techniques for cost-effective assessment of forest systems.

  5. Identification of small areas of semideciduous forest, by different analysts, in the Lavras region, MG, using Landsat and CBERS satellite images

    Elizabeth Ferreira


    In this work, images from the Landsat 7 and CBERS 2 satellites were analyzed in order to identify small areas of semideciduous forest and to evaluate the accuracy of classifications made by different analysts and interpretation techniques. The study was carried out in the Lavras region, MG, Brazil, using the SPRING GIS, which provides functions for digital classification and visual interpretation. The comparisons between classifications and the accuracy assessment employed overall accuracy, user's accuracy, producer's accuracy and the Kappa coefficient. The results showed that overall accuracy was higher than 90% and the Kappa coefficient ranged from 50% to 77% in comparisons between different analysts on Landsat and CBERS images. The forest-fragment maps produced by digital classification of the CBERS and Landsat images showed a high percentage of common areas, and the analysts produced different maps; however, those generated from the CBERS image showed the best agreement between classifications.

  6. Analysis of dam maps published on the web using orbital CCD/CBERS images in Minas Gerais State, Brazil

    Elizabeth Ferreira


    In this work, free public databases available on the World Wide Web (WEB) were used to assess the water-surface areas of the Furnas and Funil dams in Minas Gerais State, Brazil. The purpose was to compare the information in the WEB databases with areas calculated from images of the CCD sensor on board the CBERS2 and CBERS2B satellites. The area of the Furnas dam obtained from CCD/CBERS2B images in 2008 was 1,138 km², but in the consulted databases this area ranged from 1,182 to 1,503 km². The Funil dam, built in 2003, with a water surface of 29.37 km² and an island of 1.93 km², did not appear in the Atlas, Geominas, IGAM and IBGE databases. The results revealed some discrepancies in the databases published on the WEB, such as differences in areas and suppression or extrapolation of the water-surface limits. It was concluded that, so far, those responsible for some database publications in Minas Gerais State have not been sufficiently rigorous about updates. The CCD/CBERS images, which are also public data available on the WEB, proved to be adequate products for verifying, updating and improving the published information.
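
    Deriving a water-surface area from a classified CCD scene reduces to counting water pixels and multiplying by the pixel footprint (nominally 20 m for the CBERS CCD). A minimal sketch with a synthetic mask, not the actual Furnas classification:

```python
import numpy as np

PIXEL_SIZE_M = 20.0  # nominal CBERS CCD ground sample distance

def water_area_km2(water_mask):
    """Area of the water class (boolean mask) in km^2."""
    return water_mask.sum() * PIXEL_SIZE_M**2 / 1e6

# Illustrative mask: a 2000 x 2000 px scene with ~28% water pixels
rng = np.random.default_rng(3)
mask = rng.random((2000, 2000)) < 0.28
print(f"water area: {water_area_km2(mask):.1f} km^2")
```

    At 20 m resolution each pixel covers 400 m², so area estimates are sensitive to classification errors along shorelines, one plausible source of the database discrepancies noted above.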

  7. Assessing the Relative Ecological Importance and Deforestation Risks of Unprotected Areas in Western Brazil Using Landsat, CBERS and Quantum GIS

    Smith, A.; Sevilla, C.; Lanclos, A.; Carson, C.; Larson, J.; Sankaran, M.; Saad, M.


    In addition to understanding Brazilian policies and currently utilized methodologies, the measurement of the impacts of deforestation is essential for enhancing techniques to reduce deforestation in the future. Adverse impacts of deforestation include biodiversity loss, increased carbon dioxide emissions, and a reduced rate of evapotranspiration, all of which contribute directly or indirectly to global warming. With the continual growth in population in developing countries such as Brazil, increased demands are placed on infrastructural development and food production. As a result, forested areas are cleared for agricultural production. Recently, exploration for hydrocarbons in Western Brazil has also intensified as a means to stimulate the economy, as abundant oil and gas is believed to be found in these regions. Unfortunately, hydrocarbon-rich regions of Western Brazil are also home to thousands of species. Many of these regions are as of yet untapped but are at risk of ecological disruption as a result of impending human activity. This project utilized Landsat 5 TM to monitor deforestation in a subsection of the Brazilian states of Rondônia and Amazonas. A risk map identifying areas susceptible to future deforestation, based on factors such as proximity to roads, bodies of water, cities, and proposed hydrocarbon activities such as pipeline construction, was created. Areas at higher risk of clearance were recommended to be a target for enhanced monitoring and law enforcement. In addition, an importance map was created based on biodiversity and location of endangered species. This map was used to identify potential areas for future protection. A Chinese-Brazilian satellite, CBERS 2B CCD was also utilized for comparison. The NDVI model was additionally replicated in Quantum GIS, an open source software, so that local communities and policymakers could benefit without having to pay for expensive ArcGIS software. The capabilities of VIIRS were also investigated to

  8. Remote Sensing of the EnviSat and Cbers-2B satellites rotation around the centre of mass by photometry

    Koshkin, N.; Korobeynikova, E.; Shakun, L.; Strakhova, S.; Tang, Z. H.


    During 2013-2015, photometric observations were performed of the EnviSat satellite, which became space debris after its failure in April 2012 in low Earth orbit. The rotation pole position and the slow change in sidereal rotation period were updated on the basis of an analysis of specular glints observed in 222 light curves, using reduction of synodic periods. Apparently, there are minor oscillations of the rotation pole near the normal to the orbital plane. The sense of EnviSat's spin is opposite to the sense of its orbital rotation. The sidereal period is best approximated by the second-order polynomial Psid (s) = 0.000021534·T² + 0.04936003·T + 121.18195, where T is measured in days from the beginning of 2013. Applied to another representative of space debris, the Cbers-2B satellite, this method gives a similar result: there is no precise solution for the rotation pole either, as it undergoes oscillations on time scales from several hours to several months. In 2014, the spin axis made a 10° angle with the normal to the orbital plane, while the sense of spin in this case is direct, i.e. it coincides with the sense of orbital rotation. The sidereal period is best approximated by Psid (s) = 0.000029543·T² + 0.08094931·T + 81.31775, where T is measured in days from March 10, 2014. This method allows controlling slow changes in the spatial orientation of the rotation axis of satellites that exhibit specular reflection of light from flat faces of the surface.
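
    The two fitted period polynomials can be evaluated directly; the coefficients below are exactly those quoted in the abstract.

```python
def p_sid_envisat(T):
    """EnviSat sidereal rotation period (s); T in days from 2013-01-01."""
    return 0.000021534 * T**2 + 0.04936003 * T + 121.18195

def p_sid_cbers2b(T):
    """Cbers-2B sidereal rotation period (s); T in days from 2014-03-10."""
    return 0.000029543 * T**2 + 0.08094931 * T + 81.31775

print(f"EnviSat at T=0:    {p_sid_envisat(0):.2f} s")
print(f"EnviSat at T=365:  {p_sid_envisat(365):.2f} s")
print(f"Cbers-2B at T=100: {p_sid_cbers2b(100):.2f} s")
```

    Both polynomials have positive linear and quadratic terms, so each satellite's rotation period grows steadily, i.e. both debris objects are spinning down.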

  9. Orthopaedic Device Approval Through the Premarket Approval Process: A Financial Feasibility Analysis for a Single Center.

    Yang, Brian W; Iorio, Matthew L; Day, Charles S


    The 2 main routes of medical device approval through the U.S. Food and Drug Administration are the premarket approval (PMA) process, which requires clinical trials, and the 510(k) premarket notification, which exempts devices from clinical trials if they are substantially equivalent to an existing device. Recently, there has been growing concern regarding the safety of devices approved through the 510(k) premarket notification. The PMA process decreases the potential for device recall; however, it is substantially more costly and time-consuming. Investors and medical device companies are only willing to invest in devices if they can expect to recoup their investment within a timeline of roughly 7 years. Our study utilizes financial modeling to assess the financial feasibility of approving various orthopaedic medical devices through the 510(k) and PMA processes. The expected time to recoup investment through the 510(k) process ranged from 0.585 years to 7.715 years, with an average time of 2.4 years; the expected time to recoup investment through the PMA route ranged from 2.9 years to 24.5 years, with an average time of 8.5 years. Six of the 13 orthopaedic device systems that we analyzed would require longer than our 7-year benchmark to recoup the investment costs of the PMA process. With the 510(k) premarket notification, only 1 device system would take longer than 7 years to recoup its investment costs. Although the 510(k) premarket notification has demonstrated safety concerns, broad requirements for PMA authorization may limit device innovation for less-prevalent orthopaedic conditions. As a result, new approval frameworks may be beneficial. Our report demonstrates how current regulatory policies can potentially influence orthopaedic device innovation.
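
    The feasibility comparison above can be sketched as a simple payback-period model: a fixed regulatory cost recovered by a constant annual cash flow, checked against the 7-year benchmark. All dollar figures below are hypothetical placeholders, not the study's inputs.

```python
# Simple payback-period sketch (assumes constant annual cash flow).
def years_to_recoup(approval_cost, annual_cash_flow):
    return approval_cost / annual_cash_flow

COST_510K = 0.5e6   # hypothetical 510(k) pathway cost, USD
COST_PMA  = 30e6    # hypothetical PMA cost (clinical trials dominate)
cash_flow = 4e6     # hypothetical annual net cash flow for the device

for label, cost in [("510(k)", COST_510K), ("PMA", COST_PMA)]:
    t = years_to_recoup(cost, cash_flow)
    flag = "OK" if t <= 7 else "exceeds 7-year benchmark"
    print(f"{label}: {t:.2f} years ({flag})")
```

    The study's actual model presumably discounts cash flows and varies them by device market size; this sketch only shows why the same device can clear the benchmark under 510(k) yet fail it under PMA.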

  10. Spectral characterization of forage grass areas infected with the disease "mela-das-sementes da braquiária" through CCD/CBERS-2 images

    José C. Rosatti


    CCD/CBERS-2 images in spectral bands CCD2, CCD3 and CCD4, from 2004 and 2005, of Mirante do Paranapanema - SP (Brazil), were transformed into surface reflectance using the 5S atmospheric correction model and radiometrically normalized. The main objective was to spectrally characterize pastures of Brachiaria brizantha in the flowering phase, both free of and infected with the disease "mela-das-sementes da braquiária", making its detection possible through comparison of Surface Bidirectional Reflectance Factor (SBRF) values. A further objective was to evaluate the effectiveness of CCD/CBERS-2 images for obtaining spectral responses of pastures. Healthy and diseased canopies of Brachiaria brizantha were identified through analysis of the reflectance values and of the Accumulative Relative Crop Water Stress Index (ACWSI) data obtained in the study area. The results indicated that the main differences were a decrease of reflectance in band CCD3 and an increase of reflectance in band CCD4 in the diseased areas. The methodology employed, using CCD/CBERS-2 sensor data together with the ACWSI, proved effective in discriminating canopies infected with "mela-das-sementes da braquiária".

  11. Application of CBERS HR Image Data in the Investigation of Surface Collapse in a Coal Mine Zone

    陈文平; 范英霞; 韩小明; 薛磊; 李少贞


    We apply, for the first time, CBERS HR image data, whose independent intellectual property rights are owned by China, to investigate surface collapse in a coal mining zone by remote sensing. Surface collapse over mined-out areas grows more serious with time: its area keeps expanding, and it changes both gradually and suddenly, so conventional fixed-station observation can hardly keep up with its development, and delineating the collapse boundary is very difficult. Taking the Liudaowan-Tiechanggou coal mine zone as a case study and starting from the known basic features of the collapse areas, we build a system of remote sensing image interpretation keys for the collapse zones, interpret the CBERS HR high-resolution imagery through human-computer interaction, delineate the boundaries of the surface collapse, and obtain the relevant data, providing a basis for the management of environmental geological hazards in the region.

  12. 78 FR 14097 - Pulse Oximeters-Premarket Notification Submissions [510(k)s]; Guidance for Industry and Food and...


    ... HUMAN SERVICES Food and Drug Administration Pulse Oximeters--Premarket Notification Submissions... availability of the guidance entitled ``Pulse Oximeters--Premarket Notification Submissions .'' This guidance document pertains to non-invasive pulse oximeters intended for prescription use to measure arterial...

  13. Connectivity processes and riparian vegetation of the upper Paraná River, Brazil

    Stevaux, José C.; Corradini, Fabrício A.; Aquino, Samia


    In fluvial systems, the relationship between a dominant variable (e.g. the flood pulse) and its dependent ones (e.g. riparian vegetation) is called connectivity. This paper analyzes the connectivity elements and processes controlling riparian vegetation for a reach of the upper Paraná River (Brazil) and estimates future changes in the channel-vegetation relationship as a consequence of the management of a large dam. The studied reach is situated 30 km downstream from the Porto Primavera Dam (construction finished in 1999). Aerial photography (1:25,000, 1996), RGB-CBERS satellite imagery and a previous field botany survey made it possible to produce a map of the five major morpho-vegetation units: 1) tree-dominated natural levee, 2) shrubby upper floodplain, 3) shrub-herbaceous mid floodplain, 4) grass-herbaceous lower floodplain and 5) shrub-herbaceous flood runoff channel. Using a detailed topographic survey and statistical tools, each morpho-vegetation type was analyzed according to its connectivity parameters (frequency, recurrence, permanence, seasonality, potamophase, limnophase and FCQ index) in the pre- and post-dam-closure periods of the historical series. The data showed that most morpho-vegetation units are expected to undergo changes in connectivity parameter values after dam closure, and the new regime could affect, with varying intensity, the river ecology and particularly the riparian vegetation. The methods used in this study can be useful for dam impact studies in other South American tropical rivers.

  14. 21 CFR 878.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  15. 21 CFR 874.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  16. 21 CFR 864.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  17. 21 CFR 882.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  18. 21 CFR 890.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  19. 21 CFR 880.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  20. 21 CFR 884.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  1. 21 CFR 872.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  2. 21 CFR 892.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  3. 21 CFR 870.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  4. 21 CFR 866.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  5. 21 CFR 886.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  6. 21 CFR 862.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... deficiency syndrome (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor... intended: (1) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired...

  7. 21 CFR 888.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  8. 21 CFR 868.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4) For assessing the risk of... immunoassay technology; or (c) The device is an in vitro device that is intended: (1) For use in the diagnosis... use in screening or diagnosis of familial or acquired genetic disorders, including inborn errors...

  9. 21 CFR 876.9 - Limitations of exemptions from section 510(k) of the Federal Food, Drug, and Cosmetic Act (the act).


    ... (AIDS), chronic or active hepatitis, tuberculosis, or myocardial infarction or to monitor therapy; (4...) For use in the diagnosis, monitoring, or screening of neoplastic diseases with the exception of immunohistochemical devices; (2) For use in screening or diagnosis of familial or acquired genetic...

  10. Study of the degradation process of polyimide induced by high energetic ion irradiation

    Severin, Daniel


    The dissertation focuses on the radiation hardness of Kapton under extreme radiation environment conditions. To study ion-beam induced modifications, Kapton foils were irradiated at the GSI linear accelerator UNILAC using several projectiles (e.g. Ti, Mo, Au, and U) within a large fluence regime (1×10^10 to 5×10^12 ions/cm^2). The irradiated Kapton foils were analysed by means of infrared and UV/Vis spectroscopy, tensile strength measurement, mass loss analysis, and dielectric relaxation spectroscopy. For testing the radiation stability of Kapton at the cryogenic operating temperature (5-10 K) of the superconducting magnets, additional irradiation experiments were performed at the Grand Accélérateur National d'Ions Lourds (GANIL, France), focusing on the online analysis of the outgassing of small volatile degradation fragments. The investigations of the electrical properties by dielectric relaxation spectroscopy exhibit a different trend: high-fluence irradiation with light ions (e.g. Ti) leads to a slight increase of the conductivity, whereas heavy ions (e.g. Sm, Au) cause a drastic change already in the fluence regime of non-overlapping tracks (5×10^10 ions/cm^2). Online analysis of the outgassing during irradiation at cryogenic temperatures shows the release of a variety of small gaseous molecules (e.g. CO, CO2, and short hydrocarbons). A small amount of large polymer fragments is also identified. The results allow the following conclusions, which are of special interest for the application of Kapton as an insulating material in a high-energy particle radiation environment: a) the material degradation measured with optical spectroscopy and tensile strength tests scales with the dose deposited by the ions. The high correlation of the results allows the mechanical degradation to be predicted with simple, non-destructive infrared spectroscopy. The degradation curve points to a

  11. Analysis of the observation of particle detector inside ‘CBERS-1’ satellite under solar quiet conditions

    ZOU; Hong; XIAO; Zuo; HAO; Yongqiang; ZOU; Jiqing; ZHU; Wenming; WU; Zhongxiang


    Based on the knowledge and related theory of the Earth's radiation belts, the data from the particle detector aboard the CBERS-1 satellite in its sun-synchronous orbit were analyzed. It is shown that the observational results agree with the theoretical description of the radiation belt structure. Analysis of more than 3 years of data showed clearly that under quiet solar conditions, at a height of about 800 km, the energetic particles are mainly located in three regions: the northern auroral belt (40°-80°), the southern auroral belt (−40° to −80°) and the South Atlantic Geomagnetic Anomaly Region (SAA). This is the global distribution; at each longitude the latitudinal coverage is much narrower, with particles concentrated along geomagnetic latitudes of about ±60°. The particle species and their counting rates differ among the regions. In the SAA, both electrons and protons are usually observed, which should come from the inner radiation belt; in the polar regions only energetic electrons are observed under quiet conditions, belonging to the outer radiation belt. The distribution of outer-belt electrons is asymmetric in longitude as well as between the northern and southern polar regions. These asymmetries can be explained by the reflecting altitudes of the mirror points of charged particles on the same L shell.

  12. Remote sensing images commonly used in daily forestry work and the essentials of their processing



    Starting from the definition of remote sensing images (RSI), four basic characteristics of RSI are discussed. The medium-resolution images commonly used in forestry work, Landsat TM5 and CBERS, and the commonly used high-resolution images Spot5, Rapideye, Alos, QuickBird, Worldview I, Worldview II, etc., are introduced and compared, and the merits and drawbacks of each are summarized. The preparation of basic RSI data and of DOM and DEM reference data, as well as the ortho-rectification of RSI, are also discussed. Finally, taking Alos as a case, image enhancement and the associated algorithms are examined.

  13. Primary Processing

    Mulder, W.J.; Harmsen, P.F.H.; Sanders, J.P.M.; Carre, P.; Kamm, B.; Schoenicke, P.


    Primary processing of oil-containing material involves pre-treatment processes, oil recovery processes and the extraction and valorisation of valuable compounds from waste streams. Pre-treatment processes, e.g. thermal, enzymatic, electrical and radio frequency, have an important effect on the oil r

  14. Elektrokemiske Processer

    Bech-Nielsen, Gregers


    Electrochemical processes in: power sources, electrosynthesis, corrosion. Pourbaix diagrams. Decontamination of industrial waste water for heavy metals.

  15. Data processing

    Fry, T F


    Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  16. Design Processes

    Ovesen, Nis


    Inspiration for most research on and optimisation of design processes still seems to focus within the narrow field of traditional design practice. This study turns instead to businesses associated with the design professions in order to learn from their development processes. Through interviews, the advantages and challenges of agile processes in mobile software and web businesses are identified. The applicability of these agile processes is discussed in regard to design educations and product development in the domain of Industrial Design, and is briefly seen in relation to the concept of dromology.

  17. Magnetics Processing

    Federal Laboratory Consortium — The Magnetics Processing Lab equipped to perform testing of magnetometers, integrate them into aircraft systems, and perform data analysis, including noise reduction...

  18. Stochastic processes

    Parzen, Emanuel


    Well-written and accessible, this classic introduction to stochastic processes and related mathematics is appropriate for advanced undergraduate students of mathematics with a knowledge of calculus and continuous probability theory. The treatment offers examples of the wide variety of empirical phenomena for which stochastic processes provide mathematical models, and it develops the methods of probability model-building.Chapter 1 presents precise definitions of the notions of a random variable and a stochastic process and introduces the Wiener and Poisson processes. Subsequent chapters examine
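As a small illustration of the two processes the book introduces first, a Wiener process and a Poisson process can be simulated in a few lines. A minimal sketch; the step size, rate, horizon, and seed are arbitrary choices of ours:

```python
# Simulating the Wiener and Poisson processes introduced in Chapter 1.
import random

def wiener_path(n_steps=1000, dt=0.01, seed=0):
    """Discretized Wiener process: independent N(0, dt) increments."""
    rng = random.Random(seed)
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += rng.gauss(0.0, dt ** 0.5)  # increment with variance dt
        path.append(w)
    return path

def poisson_arrivals(rate=2.0, horizon=10.0, seed=0):
    """Arrival times of a rate-`rate` Poisson process on [0, horizon]."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate)  # i.i.d. exponential inter-arrival times
        if t > horizon:
            return arrivals
        arrivals.append(t)

if __name__ == "__main__":
    print(len(wiener_path()))       # 1001 points, including the origin
    print(len(poisson_arrivals()))  # about rate * horizon arrivals on average
```

The exponential inter-arrival construction and the independent-Gaussian-increment construction are exactly the model-building moves the book's opening chapter formalizes.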

  19. Development of a Conceptual Process for Selective CO 2 Capture from Fuel Gas Streams Using [hmim][Tf 2 N] Ionic Liquid as a Physical Solvent

    Basha, Omar M.; Keller, Murphy J.; Luebke, David R.; Resnik, Kevin P.; Morsi, Badie I.


    The Ionic Liquid (IL) [hmim][Tf2N] was used as a physical solvent in an Aspen Plus simulation, employing the Peng-Robinson Equation of State (P-R EOS) with the Boston-Mathias (BM) alpha function and standard mixing rules, to develop a conceptual process for CO2 capture from a shifted warm fuel gas stream produced from Pittsburgh #8 coal for a 400 MWe power plant. The physical properties of the IL, including density, viscosity, surface tension, vapor pressure and heat capacity, were obtained from the literature and modeled as functions of temperature. Also, available experimental solubility values for CO2, H2, H2S, CO, and CH4 in this IL were compiled, and their binary interaction parameters (Δij and lij) were optimized and correlated as functions of temperature. The Span-Wagner Equation of State was also employed to generate CO2 solubilities in [hmim][Tf2N] at high pressures (up to 10 MPa) and temperatures (up to 510 K). The conceptual process developed consisted of 4 adiabatic absorbers (2.4 m ID, 30 m high) arranged in parallel and packed with 0.025 m Plastic Pall Rings for CO2 capture; 3 flash drums arranged in series for solvent (IL) regeneration with the pressure-swing option; and a pressure-intercooling system for separating and pumping CO2 up to 153 bar to the sequestration sites. The compositions of all process streams, the CO2 capture efficiency, and the net power were calculated using the Aspen Plus simulator. The results showed that, based on the composition of the inlet gas stream to the absorbers, 95.67 mol% of the CO2 was captured and sent to sequestration sites; 99.5 mol% of the H2 was separated and sent to turbines; the solvent exhibited a minimum loss of 0.31 mol%; and the net power balance of the entire system was 30.81 MW. These results indicated that [hmim][Tf2N] IL could be used as a physical solvent.

  20. Image processing

    Heijden, van der F.; Spreeuwers, L.J.; Blanken, H.M.; Vries de, A.P.; Blok, H.E.; Feng, L


    The field of image processing addresses handling and analysis of images for many purposes using a large number of techniques and methods. The applications of image processing range from enhancement of the visibility of cer- tain organs in medical images to object recognition for handling by industri

  1. Sustainable processing

    Kristensen, Niels Heine


    Kristensen NH and Beck A: Sustainable processing. In Otto Schmid, Alexander Beck and Ursula Kretzschmar (Editors) (2004): Underlying Principles in Organic and "Low-Input Food" Processing - Literature Survey. Research Institute of Organic Agriculture FiBL, CH-5070 Frick, Switzerland. ISBN 3-906081-58-3

  2. Process mining

    van der Aalst, W.M.P.; Rubin, V.; Verbeek, H.M.W.


    Process mining includes the automated discovery of processes from event logs. Based on observed events (e.g., activities being executed or messages being exchanged) a process model is constructed. One of the essential problems in process mining is that one cannot assume to have seen all possible behavior. At best, one has seen a representative subset. Therefore, classical synthesis techniques are not suitable as they aim at finding a model that is able to exactly reproduce the log. Existing process mining techniques try to avoid such "overfitting" by generalizing the model to allow for more behavior. This generalization is often driven by the representation language and very crude assumptions about completeness. As a result, parts of the model are "overfitting" (allow only for what has actually been observed) while other parts may be "underfitting" (allow for much more behavior without strong...
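The discovery step described above typically starts by counting "directly-follows" relations in the event log. A toy sketch with an invented log (real miners, such as the alpha algorithm, build models on top of these counts):

```python
# Counting directly-follows relations, the usual first step of
# process discovery from an event log. The log below is invented
# purely for illustration.
from collections import Counter

def directly_follows(log):
    """Count how often activity a is immediately followed by b in any trace."""
    df = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "check", "approve"],
]
print(directly_follows(log))
# ("register", "check") occurs 3 times, ("check", "approve") twice, etc.
```

The overfitting/underfitting tension in the abstract shows up even here: the self-loop ("check", "check") was observed once, and a discovery algorithm must decide whether to generalize it to arbitrarily many repetitions.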

  3. Organizing Process

    Hull Kristensen, Peer; Bojesen, Anders

    This paper invites discussion of the processes of individualization and organizing being carried out under what we might see as an emerging regime of change. The underlying argument is that in certain processes of change, competence becomes questionable at all times. The hazy characteristics of this regime of change are pursued through a discussion of competencies as opposed to qualifications, illustrated by distinct cases from the Danish public sector in the search for repetitive mechanisms. The cases are put into a general perspective by drawing upon experiences from similar change processes...

  4. Electrochemical Processes

    Bech-Nielsen, Gregers


    The notes describe in detail primary and secondary galvanic cells, fuel cells, electrochemical synthesis and electroplating processes, corrosion: measurments, inhibitors, cathodic and anodic protection, details of metal dissolution reactions, Pourbaix diagrams and purification of waste water from...

  5. Processing Proteases

    Ødum, Anders Sebastian Rosenkrans

    Processing proteases are proteases which proteolytically activate proteins and peptides into their biologically active form. Processing proteases play an important role in biotechnology as tools in protein fusion technology. Fusion strategies in which helper proteins or peptide tags are fused to the protein of interest are an elaborate method to optimize expression or purification systems. It is, however, critical that fusion partners can be removed, and processing proteases can facilitate this in a highly specific manner. The commonly used proteases all have substrate specificities to the N-terminal side of the scissile bond; LysN is one of few known proteases to have substrate specificity for the C-terminal side of the scissile bond. LysN exhibits specificity for lysine and has primarily been used to complement trypsin in proteomic studies. A working hypothesis during this study was the potential of LysN as a processing protease...

  6. Grants Process

    The NCI Grants Process provides an overview of the end-to-end lifecycle of grant funding. Learn about the types of funding available and the basics for application, review, award, and on-going administration within the NCI.

  7. Sewer Processes

    Hvitved-Jacobsen, Thorkild; Vollertsen, Jes; Nielsen, Asbjørn Haaning

    Odor nuisance caused by hydrogen sulfide and other volatile organic compounds, as well as other potential health issues, has caused environmental concerns to rise. Reflecting the most current developments, Sewer Processes: Microbial and Chemical Process Engineering of Sewer Networks, Second Edition, offers the reader updated and valuable information on the sewer as a chemical and biological reactor. It focuses on how to predict critical impacts and control adverse effects, and provides an integrated description of sewer processes in modeling terms. This second edition is full of illustrative examples and figures, covers microbial and chemical processes, and demonstrates how this knowledge can be applied to the design, operation, and maintenance of wastewater collection systems. The authors add chemical and microbial dimensions to the design and management of sewer networks with an overall aim of improved sustainability.

  8. In process...

    LI Xin


    Architecture is a wonderful world. As a student of architecture, time and time again I am impressed by its powerful images. The more I study and learn, however, the more I question. What is the truth behind those fantastic images? What is the nature of Architecture? Is there any basic way or process to approach the work of Architecture? With these questions, I begin my thesis project and the process of looking for answers. MArch

  10. Renewal processes

    Mitov, Kosto V


    This monograph serves as an introductory text to classical renewal theory and some of its applications for graduate students and researchers in mathematics and probability theory. Renewal processes play an important part in modeling many phenomena in insurance, finance, queuing systems, inventory control and other areas. In this book, an overview of univariate renewal theory is given and renewal processes in the non-lattice and lattice case are discussed. A pre-requisite is a basic knowledge of probability theory.

  11. Macdonald processes

    Borodin, Alexei


    Macdonald processes are probability measures on sequences of partitions defined in terms of nonnegative specializations of the Macdonald symmetric functions and two Macdonald parameters q,t in [0,1). We prove several results about these processes, which include the following. (1) We explicitly evaluate expectations of a rich family of observables for these processes. (2) In the case t=0, we find a Fredholm determinant formula for a q-Laplace transform of the distribution of the last part of the Macdonald-random partition. (3) We introduce Markov dynamics that preserve the class of Macdonald processes and lead to new "integrable" 2d and 1d interacting particle systems. (4) In a large time limit transition, and as q goes to 1, the particles of these systems crystallize on a lattice, and fluctuations around the lattice converge to O'Connell's Whittaker process that describe semi-discrete Brownian directed polymers. (5) This yields a Fredholm determinant for the Laplace transform of the polymer partition function...

  12. Offshoring Process

    Slepniov, Dmitrij; Sørensen, Brian Vejrum; Katayama, Hiroshi


    The purpose of this chapter is to contribute to the knowledge on how production offshoring and international operations management vary across cultural contexts. The chapter attempts to shed light on how companies approach the process of offshoring in different cultural contexts. In order...... of globalisation. Yet there are clear differences in how offshoring is conducted in Denmark and Japan. The main differences are outlined in a framework and explained employing cultural variables. The findings lead to a number of propositions suggesting that the process of offshoring is not simply a uniform...

  13. Processing Branches

    Schindler, Christoph; Tamke, Martin; Tabatabai, Ali;


    Angled and forked wood, a desired material until the 19th century, was swept away by industrialization and its standardization of processes and materials. Contemporary information technology has the potential for the capturing and recognition of individual geometries through laser scanning and compu...

  14. Innovation process

    Kolodovski, A.


    Purpose of this report: this report was prepared for the RISO team involved in the design of the innovation system. The report provides an innovation methodology to establish a common understanding of the process concepts and related terminology. The report does not include RISO- or Denmark-specific cultural, economic...

  15. Bentonite processing

    Anamarija Kutlić


    Bentonite has a wide variety of uses. A special use of bentonite, where its absorbing properties are employed to provide water-tight sealing, is for an underground repository in granite. In this paper, bentonite processing and beneficiation are described.

  16. Optical Processing.


    Since the "perceptron" (F. Rosenblatt, Principles of Neurodynamics, Spartan, 1962), workers in the neural network field have been seeking to understand how neural networks can perform... W. Stoner, "Incoherent optical processing via spatially offset pupil...

  17. Photobiomodulation Process

    Yang-Yi Xu


    Photobiomodulation (PBM) is the modulation of biosystems by laser irradiation or monochromatic light (LI). There is little research on PBM dynamics, although its phenomena and mechanism have been widely studied. In this paper the PBM was discussed from a dynamic viewpoint. It was found that the primary process of cellular PBM might be its key process, so that the transition rate of cellular molecules can be extended to discuss the dose relationship of PBM. There may be a dose zone in which low-intensity LI (LIL) at different doses has similar biological effects, so that a biological information model of PBM might hold. LIL may self-adaptively modulate a chronic stress until it becomes successful.

  18. Boolean process

    闵应骅; 李忠诚; 赵著行


    Boolean algebra successfully describes the logical behavior of a digital circuit, and has been widely used in electronic circuit design and test. With the development of high-speed VLSIs it is a drawback of Boolean algebra that it cannot describe circuit timing behavior. Therefore a Boolean process is defined as a family of Boolean variables indexed by the time parameter t. A real-valued sample of a Boolean process is a waveform. Waveform functions can be manipulated formally using mathematical tools. The distance, difference and limit of a waveform polynomial are defined, and a sufficient and necessary condition for the existence of the limit is presented. Based on this, the concept of sensitization is redefined precisely to demonstrate its potential and wide applicability. The new definition is very different from the traditional one, and has an impact on determining the sensitizable paths with maximum or minimum length, and false paths, and thus on designing and testing high-performance circuits.
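    The abstract's basic objects are easy to sketch. Below, a waveform (a real-valued sample of a Boolean process) is represented as a piecewise-constant 0/1 signal given by its toggle times, and a crude sample-point distance between two waveforms is computed. Both the representation and the `hamming_distance` measure are illustrative assumptions, not the paper's formal definitions.

```python
from bisect import bisect_right

class Waveform:
    """Piecewise-constant 0/1 signal: an illustrative stand-in for a
    real-valued sample of a Boolean process (a family of Boolean
    variables indexed by the time parameter t)."""

    def __init__(self, initial, transitions):
        # transitions: times at which the value toggles
        self.initial = initial
        self.transitions = sorted(transitions)

    def value(self, t):
        # The number of toggles at or before t determines the value.
        flips = bisect_right(self.transitions, t)
        return self.initial ^ (flips & 1)

def hamming_distance(w1, w2, times):
    """Crude distance: count of sample points where the waveforms
    disagree (an assumption, not the paper's metric)."""
    return sum(w1.value(t) != w2.value(t) for t in times)

w1 = Waveform(0, [1.0, 3.0])   # 0 until t=1, then 1 until t=3, then 0
w2 = Waveform(0, [1.5, 3.0])
print(w1.value(2.0))                                    # 1
print(hamming_distance(w1, w2, [0.5, 1.2, 2.0, 3.5]))   # 1 (they differ only at t=1.2)
```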

  19. Purification process

    Marshall, A.


    A process for the removal of hydrogen sulphide from gases or liquid hydrocarbons, comprises contacting the gas or liquid hydrocarbon with an aqueous alkaline solution, preferably having a pH value of 8 to 10, comprising (A) an anthraquinone disulphonic acid or a water-soluble sulphonamide thereof (B) a compound of a metal which can exist in at least two valency states and (C) a sequestering agent.

  20. Ceramic Processing



    Ceramics represent a unique class of materials that are distinguished from common metals and plastics by their: (1) high hardness, stiffness, and good wear properties (i.e., abrasion resistance); (2) ability to withstand high temperatures (i.e., refractoriness); (3) chemical durability; and (4) electrical properties that allow them to be electrical insulators, semiconductors, or ionic conductors. Ceramics can be broken down into two general categories, traditional and advanced ceramics. Traditional ceramics include common household products such as clay pots, tiles, pipe, and bricks, porcelain china, sinks, and electrical insulators, and thermally insulating refractory bricks for ovens and fireplaces. Advanced ceramics, also referred to as "high-tech" ceramics, include products such as spark plug bodies, piston rings, catalyst supports, and water pump seals for automobiles, thermally insulating tiles for the space shuttle, sodium vapor lamp tubes in streetlights, and the capacitors, resistors, transducers, and varistors in the solid-state electronics we use daily. The major differences between traditional and advanced ceramics are in the processing tolerances and cost. Traditional ceramics are manufactured with inexpensive raw materials, are relatively tolerant of minor process deviations, and are relatively inexpensive. Advanced ceramics are typically made with more refined raw materials and processing to optimize a given property or combination of properties (e.g., mechanical, electrical, dielectric, optical, thermal, physical, and/or magnetic) for a given application. Advanced ceramics generally have improved performance and reliability over traditional ceramics, but are typically more expensive. Additionally, advanced ceramics are typically more sensitive to the chemical and physical defects present in the starting raw materials, or those that are introduced during manufacturing.

  1. Hydrocarbon processing

    Hill, S.G.; Seddon, D.


    A process for the catalytic conversion of synthesis gas into a product which comprises naphtha, kerosene and distillate is characterized in that the catalyst is a Fischer-Tropsch catalyst also containing a zeolite, the naphtha fraction contains 60% or less linear paraffins, and the kerosene and distillate fractions contain more linear paraffins and olefins than found in the naphtha fraction. Reduction of the relative amount of straight-chain material in the naphtha fraction increases the octane number and so enhances the quality of the gasoline product, while the high quality of the kerosene and distillate fractions is maintained.

  2. Crystallization process

    Adler, Robert J.; Brown, William R.; Auyang, Lun; Liu, Yin-Chang; Cook, W. Jeffrey


    An improved crystallization process is disclosed for separating a crystallizable material and an excluded material which is at least partially excluded from the solid phase of the crystallizable material obtained upon freezing a liquid phase of the materials. The solid phase is more dense than the liquid phase, and it is separated therefrom by relative movement with the formation of a packed bed of solid phase. The packed bed is continuously formed adjacent its lower end and passed from the liquid phase into a countercurrent flow of backwash liquid. The packed bed extends through the level of the backwash liquid to provide a drained bed of solid phase adjacent its upper end which is melted by a condensing vapor.

  3. Lithospheric processes

    Baldridge, W. [and others]


    The authors used geophysical, geochemical, and numerical modeling to study selected problems related to Earth's lithosphere. We interpreted seismic waves to better characterize the thickness and properties of the crust and lithosphere. In the southwestern US and Tien Shan, crust of high elevation is dynamically supported above buoyant mantle. In California, mineral fabrics in the mantle correlate with regional strain history. Although plumes of buoyant mantle may explain surface deformation and magmatism, our geochemical work does not support this mechanism for Iberia. Generation and ascent of magmas remains puzzling. Our work in Hawaii constrains the residence of magma beneath Hualalai to be a few hundred to about 1000 years. In the crust, heat drives fluid and mass transport. Numerical modeling yielded robust and accurate predictions of these processes. This work is important fundamental science, and applies to mitigation of volcanic and earthquake hazards, Test Ban Treaties, nuclear waste storage, environmental remediation, and hydrothermal energy.

  4. Image Processing


    Electronic Imagery, Inc.'s ImageScale Plus software, developed through a Small Business Innovation Research (SBIR) contract with Kennedy Space Center for use on the space shuttle Orbiter in 1991, enables astronauts to conduct image processing, prepare electronic still camera images in orbit, display them, and downlink images to ground-based scientists for evaluation. Electronic Imagery, Inc.'s ImageCount, a spin-off product of ImageScale Plus, is used to count trees in Florida orange groves. Other applications include x-ray and MRI imagery, textile designs and special effects for movies. As of 1/28/98, the company could not be located, therefore contact/product information is no longer valid.

  5. Dynamic Processes

    Klingshirn, C.

    . Phys. Lett. 92:211105, 2008). For this point, recall Figs. 6.16 and 6.33. Since the polarisation amplitude is gone in any case after the recombination process, there is an upper limit for T2 given by T2 ≤ 2T1. The factor of two comes from the fact that T2 describes the decay of an amplitude and T1 the decay of a population, which is proportional to the amplitude squared. Sometimes T2 is subdivided into a term due to recombination described by T1 and another called 'pure dephasing', T2*, with the relation 1/T2 = 1/(2T1) + 1/T2*. The quantity T2* can considerably exceed 2T1. In the part on relaxation processes, that is on processes contributing to T3, we also give examples for the capture of excitons into bound, localized, or deep states. For more details on dynamics in semiconductors in general see for example the (text-)books [Klingshirn, Semiconductor Optics, 3rd edn. (Springer, Berlin, 2006); Haug and Koch, Quantum Theory of the Optical and Electronic Properties of Semiconductors, 4th edn. (World Scientific, Singapore, 2004); Haug and Jauho, Quantum Kinetics in Transport and Optics of Semiconductors, Springer Series in Solid State Sciences vol. 123 (Springer, Berlin, 1996); J. Shah, Ultrafast Spectroscopy of Semiconductors and of Semiconductor Nanostructures, Springer Series in Solid State Sciences vol. 115 (Springer, Berlin, 1996); Schäfer and Wegener, Semiconductor Optics and Transport Phenomena (Springer, Berlin, 2002)]. We present selected data for free, bound and localized excitons, biexcitons and electron-hole pairs in an EHP, and examples for bulk materials, epilayers, quantum wells, nanorods and nanocrystals, with the restriction that, to the knowledge of the author, data are not available for all these systems, density ranges and temperatures. Therefore, we subdivide the topic below only according to the three time constants T2, T3 and T1.
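    The relation between the dephasing time T2, the recombination time T1 and the pure dephasing time T2* can be checked with a few lines of arithmetic; the sample values below are arbitrary picosecond figures, not data from the book.

```python
def total_dephasing_time(t1, t2_star):
    """Combine recombination and pure dephasing:
    1/T2 = 1/(2*T1) + 1/T2*  =>  T2 = 1 / (1/(2*T1) + 1/T2*)."""
    return 1.0 / (1.0 / (2.0 * t1) + 1.0 / t2_star)

# Arbitrary sample values in picoseconds (not taken from the book).
t1, t2_star = 100.0, 50.0
print(round(total_dephasing_time(t1, t2_star), 2))   # 40.0
```

    As T2* grows large the result approaches the upper limit T2 = 2*T1, consistent with the inequality quoted in the text.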

  6. Data Processing

    Grangeat, P.

    A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

  7. Information Processing - Administrative Data Processing

    Bubenko, Janis

    A three semester, 60-credit course package in the topic of Administrative Data Processing (ADP), offered in 1966 at Stockholm University (SU) and the Royal Institute of Technology (KTH) is described. The package had an information systems engineering orientation. The first semester focused on datalogical topics, while the second semester focused on the infological topics. The third semester aimed to deepen the students’ knowledge in different parts of ADP and at writing a bachelor thesis. The concluding section of this paper discusses various aspects of the department’s first course effort. The course package led to a concretisation of our discipline and gave our discipline an identity. Our education seemed modern, “just in time”, and well adapted to practical needs. The course package formed the first concrete activity of a group of young teachers and researchers. In a forty-year perspective, these people have further developed the department and the topic to an internationally well-reputed body of knowledge and research. The department has produced more than thirty professors and more than one hundred doctoral degrees.

  8. Electrotechnologies to process foods

    Electrical energy is being used to process foods. In conventional food processing plants, electricity drives mechanical devices and controls the degree of process. In recent years, several processing technologies are being developed to process foods directly with electricity. Electrotechnologies use...

  9. Studies on process synthesis and process integration

    Fien, Gert-Jan A. F.


    This thesis discusses topics in the field of process engineering that have received much attention over the past twenty years: (1) conceptual process synthesis using heuristic shortcut methods and (2) process integration through heat-exchanger networks and energy-saving power and refrigeration systems. The shortcut methods for conceptual process synthesis presented in Chapter 2, utilize Residue Curve Maps in ternary diagrams and are illustrated with examples of processes...

  10. Extensible packet processing architecture

    Robertson, Perry J.; Hamlet, Jason R.; Pierson, Lyndon G.; Olsberg, Ronald R.; Chun, Guy D.


    A technique for distributed packet processing includes sequentially passing packets associated with packet flows between a plurality of processing engines along a flow-through data bus linking the plurality of processing engines in series. At least one packet within a given packet flow is marked by a given processing engine to signify to the other processing engines that the given processing engine has claimed the given packet flow for processing. A processing function is applied to each of the packet flows within the processing engines and the processed packets are output on a time-shared, arbitrated data bus coupled to the plurality of processing engines.
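    The claim-marking scheme can be sketched as follows. The per-engine capacity rule and the packet representation are assumptions made for illustration; the abstract does not specify how an engine decides which flows to claim.

```python
class ProcessingEngine:
    """One engine on the flow-through bus. An engine claims any flow not
    already claimed upstream, up to an assumed capacity, and marks the
    packets of its claimed flows to signal downstream engines."""

    def __init__(self, name, capacity=2):
        self.name = name
        self.capacity = capacity
        self.claimed = set()

    def handle(self, packet):
        flow = packet["flow"]
        if packet["claimed_by"] is None and (
            flow in self.claimed or len(self.claimed) < self.capacity
        ):
            self.claimed.add(flow)
            packet["claimed_by"] = self.name   # the mark seen by downstream engines
        return packet

def run_bus(engines, packets):
    # Every packet passes every engine in series, as on the flow-through bus.
    for pkt in packets:
        for engine in engines:
            engine.handle(pkt)
    return packets

engines = [ProcessingEngine("E0"), ProcessingEngine("E1")]
packets = [{"flow": f, "claimed_by": None} for f in ("A", "B", "C", "A")]
run_bus(engines, packets)
print([p["claimed_by"] for p in packets])   # ['E0', 'E0', 'E1', 'E0']
```

    E0 fills its capacity with flows A and B, so flow C passes through unclaimed and is picked up by E1; the second packet of flow A is again marked by E0, which already owns that flow.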

  11. Process mineralogy IX

    Petruk, W.; Hagni, R.D.; Pignolet-Brandom, S.; Hausen, D.M. (eds.) (Canada Centre for Mineral and Energy Technology, Ottawa, ON (Canada))


    54 papers are presented under the headings: keynote address; process mineralogy applications to mineral processing; process mineralogy applications to gold; process mineralogy applications to pyrometallurgy and hydrometallurgy; process mineralogy applications to environment and health; and process mineralogy applications to synthetic materials. Subject and author indexes are provided. Three papers have been abstracted separately.

  12. Thinning spatial point processes into Poisson processes

    Møller, Jesper; Schoenberg, Frederic Paik


    are identified, and where we simulate backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and...
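    For the Poisson case mentioned here, independent thinning is easy to simulate: retaining each point of a homogeneous Poisson process independently with probability p yields a Poisson process of intensity p times the original. A minimal sketch on the unit square, using standard constructions rather than the authors' code:

```python
import random

def poisson_points(lam, rng):
    """Homogeneous Poisson process on the unit square: Poisson-distributed
    point count (via unit-rate exponential inter-arrival sums), then
    independent uniform locations."""
    n, acc = 0, rng.expovariate(1.0)
    while acc < lam:
        n += 1
        acc += rng.expovariate(1.0)
    return [(rng.random(), rng.random()) for _ in range(n)]

def independent_thinning(points, p, rng):
    """Retain each point with probability p, independently of all others.
    Applied to a Poisson process this yields a Poisson process of
    intensity p * lam."""
    return [pt for pt in points if rng.random() < p]

rng = random.Random(42)
pts = poisson_points(lam=200.0, rng=rng)
kept = independent_thinning(pts, p=0.3, rng=rng)
print(len(pts), len(kept))   # thinned count is Binomial(len(pts), 0.3)
```

    The paper's point is the converse direction for general point processes: thinning with anything other than the true Papangelou conditional intensity does not produce a Poisson process.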

  13. Thinning spatial point processes into Poisson processes

    Møller, Jesper; Schoenberg, Frederic Paik

    , and where one simulates backwards and forwards in order to obtain the thinned process. In the case of a Cox process, a simple independent thinning technique is proposed. In both cases, the thinning results in a Poisson process if and only if the true Papangelou conditional intensity is used, and thus can...

  14. Refactoring Process Models in Large Process Repositories.

    Weber, B.; Reichert, M.U.


    With the increasing adoption of process-aware information systems (PAIS), large process model repositories have emerged. Over time respective models have to be re-aligned to the real-world business processes through customization or adaptation. This bears the risk that model redundancies are introduced...

  15. Process Intensification: A Perspective on Process Synthesis

    Lutze, Philip; Gani, Rafiqul; Woodley, John


    In recent years, process intensification (PI) has attracted considerable academic interest as a potential means for process improvement, to meet the increasing demands for sustainable production. A variety of intensified operations developed in academia and industry creates a large number...... of options to potentially improve the process but to identify the set of feasible solutions for PI in which the optimal can be found takes considerable resources. Hence, a process synthesis tool to achieve PI would potentially assist in the generation and evaluation of PI options. Currently, several process...... of the main concepts is illustrated through an example involving the operation of a membrane reactor....

  16. The permanental process

    McCullagh, Peter; Møller, Jesper


    We extend the boson process first to a large class of Cox processes and second to an even larger class of infinitely divisible point processes. Density and moment results are studied in detail. These results are obtained in closed form as weighted permanents, so the extension is called a permanental...... process. Temporal extensions and a particularly tractable case of the permanental process are also studied. Extensions of the fermion process along similar lines, leading to so-called determinantal processes, are discussed....

  17. Food processing and allergenicity

    Verhoeckx, K.C.M.; Vissers, Y.M.; Baumert, J.L.; Faludi, R.; Feys, M.; Flanagan, S.; Herouet-Guicheney, C.; Holzhauser, T.; Shimojo, R.; Bolt, N. van der; Wichers, H.; Kimber, I.


    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat tre

  18. Food Processing and Allergenicity

    Verhoeckx, K.; Vissers, Y.; Baumert, J.L.; Faludi, R.; Fleys, M.; Flanagan, S.; Herouet-Guicheney, C.; Holzhauser, T.; Shimojo, R.; Bolt, van der Nieke; Wichers, H.J.; Kimber, I.


    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed.

    In this review the impact of processing (heat and non

  19. SAR processing using SHARC signal processing systems

    Huxtable, Barton D.; Jackson, Christopher R.; Skaron, Steve A.


    Synthetic aperture radar (SAR) is uniquely suited to help solve the Search and Rescue problem since it can be utilized day or night and through dense fog or thick cloud cover. Other papers in this session, and in this session in 1997, describe the various SAR image processing algorithms that are being developed and evaluated within the Search and Rescue Program. All of these approaches to using SAR data require substantial amounts of digital signal processing: for the SAR image formation, and possibly for the subsequent image processing. In recognition of the demanding processing that will be required for an operational Search and Rescue Data Processing System (SARDPS), NASA/Goddard Space Flight Center and NASA/Stennis Space Center are conducting a technology demonstration utilizing SHARC multi-chip modules from Boeing to perform SAR image formation processing.


    Suyono


    The marginal distribution of the integrated renewal process is derived in this paper. Our approach is based on the theory of point processes, especially Poisson point processes. The results are presented in the form of Laplace transforms.
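    The object studied here, the integrated renewal process I(t) = integral of N(s) ds over [0, t], can be simulated directly in the Poisson special case: each arrival at time s <= t contributes (t - s) to the integral, and E[I(t)] = lam * t^2 / 2. A sketch using this standard construction (not the paper's transform method):

```python
import random

def integrated_poisson(lam, t, rng):
    """One sample of I(t) = integral of N(s) ds over [0, t] for a Poisson
    (renewal) process with rate lam: each arrival at time s <= t
    contributes (t - s) to the integral."""
    total, s = 0.0, rng.expovariate(lam)
    while s < t:
        total += t - s
        s += rng.expovariate(lam)
    return total

rng = random.Random(1)
lam, t, runs = 2.0, 5.0, 20000
mean = sum(integrated_poisson(lam, t, rng) for _ in range(runs)) / runs
print(round(mean, 2))   # theory: E[I(t)] = lam * t**2 / 2 = 25.0
```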

  1. Integrated Process Capability Analysis

    Chen, H.T.; Huang, M.L.; Hung, Y.H.; Chen, K.S.


    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. A larger process capability index implies a higher process yield, and a larger process capability index also indicates a lower expected process loss. Chen et al. (2001) applied the indices Cpu, Cpl, and Cpk for evaluating the process capability of a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
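    The indices mentioned have standard definitions: Cpu = (USL - mu) / (3*sigma), Cpl = (mu - LSL) / (3*sigma), and Cpk = min(Cpu, Cpl). A minimal sketch with invented toy data and spec limits:

```python
from statistics import mean, stdev

def capability_indices(samples, lsl, usl):
    """Standard process-capability indices:
    Cpu = (USL - mu) / (3*sigma), Cpl = (mu - LSL) / (3*sigma),
    Cpk = min(Cpu, Cpl)."""
    mu, sigma = mean(samples), stdev(samples)
    cpu = (usl - mu) / (3 * sigma)
    cpl = (mu - lsl) / (3 * sigma)
    return cpu, cpl, min(cpu, cpl)

# Toy data: a nominal-the-best characteristic with spec limits 8..12.
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.95, 10.05]
cpu, cpl, cpk = capability_indices(data, lsl=8.0, usl=12.0)
print(cpk > 1.33)   # True: well-centered, low-variance sample
```

    A process centered between the limits gives Cpu = Cpl, and Cpk drops as the mean drifts toward either spec limit.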

  2. Product Development Process Modeling


    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming even more difficult to achieve. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build one base (multi-view) process model.

  3. From Process Understanding to Process Control

    Streefland, M.


    A licensed pharmaceutical process is required to be executed within the validated ranges throughout the lifetime of product manufacturing. Changes to the process usually require the manufacturer to demonstrate that the safety and efficacy of the product remains unchanged. Recent changes in the regul

  4. Business Process Redesign: Design the Improved Process


    Table-of-contents fragments: C. Multivoting; D. Electronic Voting Technology; E. Paired...; Process Improvement Process (PIP) diagram of each activity (A1-A4); Appendix D: Products and Vendors which Support Electronic Voting. Nunamaker [1992] suggests that traditional voting usually happens at the end of a discussion, to close

  5. Idaho Chemical Processing Plant Process Efficiency improvements

    Griebenow, B.


    In response to decreasing funding levels available to support activities at the Idaho Chemical Processing Plant (ICPP) and a desire to be cost competitive, the Department of Energy Idaho Operations Office (DOE-ID) and Lockheed Idaho Technologies Company have increased their emphasis on cost-saving measures. The ICPP Effectiveness Improvement Initiative involves many activities to improve cost effectiveness and competitiveness. This report documents the methodology and results of one of those cost-cutting measures, the Process Efficiency Improvement Activity. The Process Efficiency Improvement Activity performed a systematic review of major work processes at the ICPP to increase productivity and to identify non-value-added requirements. A two-phase approach was selected for the activity to allow for near-term implementation of relatively easy process modifications in the first phase while obtaining long-term continuous improvement in the second phase and beyond. Phase I of the initiative included a concentrated review of processes that had a high potential for cost savings with the intent of realizing savings in Fiscal Year 1996 (FY-96). Phase II consists of implementing long-term strategies too complex for Phase I implementation and evaluation of processes not targeted for Phase I review. The Phase II effort is targeted at realizing cost savings in FY-97 and beyond.

  6. Technology or Process First?

    Siurdyban, Artur Henryk; Svejvig, Per; Møller, Charles

    between them using strategic alignment, Enterprise Systems and Business Process Management theories. We argue that the insights from these cases can lead to a better alignment between process and technology. Implications for practice include the direction towards a closer integration of process...... and technology factors in organizations. Theoretical implications call for a design-oriented view of technology and process alignment....

  7. Thin film processes

    Vossen, John L


    Remarkable advances have been made in recent years in the science and technology of thin film processes for deposition and etching. It is the purpose of this book to bring together tutorial reviews of selected film deposition and etching processes from a process viewpoint. Emphasis is placed on the practical use of the processes to provide working guidelines for their implementation, a guide to the literature, and an overview of each process.

  8. Metallurgical process engineering

    Yin, Ruiyu [Central Iron and Steel Research Institute (CISRI), Beijing (China)]


    "Metallurgical Process Engineering" discusses large-scale integrated theory at the level of manufacturing production processes, putting forward concepts for exploring non-equilibrium and irreversible complex systems. It emphasizes the dynamic and orderly operation of the steel plant manufacturing process, the major elements of which are the flow, process network and program. The book aims at establishing a quasi-continuous and continuous process system for improving several techno-economic indices, minimizing dissipation and enhancing the market competitiveness and sustainability of steel plants. The book is intended for engineers, researchers and managers in the fields of metallurgical engineering, industrial design, and process engineering. (orig.)

  9. Acoustic signal processing toolbox for array processing

    Pham, Tien; Whipps, Gene T.


    The US Army Research Laboratory (ARL) has developed an acoustic signal processing toolbox (ASPT) for acoustic sensor array processing. The intent of this document is to describe the toolbox and its uses. The ASPT is GUI-based software developed in and running under MATLAB. The current version, ASPT 3.0, requires MATLAB 6.0 and above. ASPT contains a variety of narrowband (NB) and incoherent and coherent wideband (WB) direction-of-arrival (DOA) estimation and beamforming algorithms that have been researched and developed at ARL. Currently, ASPT contains 16 DOA and beamforming algorithms. It contains several different NB and WB versions of the MVDR, MUSIC and ESPRIT algorithms. In addition, a variety of pre-processing, simulation and analysis tools are available in the toolbox. The user can perform simulation or real-data analysis for all algorithms with user-defined signal model parameters and array geometries.

  10. News: Process intensification

    Conservation of materials and energy is a major objective to the philosophy of sustainability. Where production processes can be intensified to assist these objectives, significant advances have been developed to assist conservation as well as cost. Process intensification (PI) h...

  11. Group Decision Process Support

    Gøtze, John; Hijikata, Masao


    Introducing the notion of Group Decision Process Support Systems (GDPSS) to traditional decision-support theorists.

  12. Business Process Inventory

    Office of Personnel Management — Inventory of maps and descriptions of the business processes of the U.S. Office of Personnel Management (OPM), with an emphasis on the processes of the Office of the...

  13. Towards better process understanding

    Matero, Sanni Elina; van der Berg, Franciscus Winfried J; Poutiainen, Sami


    The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between the material characteristics and process-related variation are presently not comprehensively analyzed, due to univariate detection methods. As a consequence......, current best practice to control a typical process is to not allow process-related factors to vary, i.e. to lock the production parameters. The problem related to the lack of sufficient process understanding is still there: the variation within process and material properties is an intrinsic feature...... and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools that aim...

  14. Secure Processing Lab

    Federal Laboratory Consortium — The Secure Processing Lab is the center of excellence for new and novel processing techniques for the formation, calibration and analysis of radar. In addition, this...

  15. Radiochemical Processing Laboratory (RPL)

    Federal Laboratory Consortium — The Radiochemical Processing Laboratory (RPL) is a scientific facility funded by DOE to create and implement innovative processes for environmental clean-up and...

  16. Infrared processing of foods

    Infrared (IR) processing of foods has been gaining popularity over conventional processing in several unit operations, including drying, peeling, baking, roasting, blanching, pasteurization, sterilization, disinfection, disinfestation, cooking, and popping. It has shown advantages over conventional...

  17. Dairy processing: improving quality

    Smit, G.


    This book discusses raw milk composition, production and quality, and reviews developments in processing from hygiene and HACCP systems to automation, high-pressure processing and modified atmosphere packaging.

  18. Stochastic processes - quantum physics

    Streit, L. (Bielefeld Univ. (Germany, F.R.))


    The author presents an elementary introduction to stochastic processes. He starts from simple quantum mechanics and considers problems in probability, finally presenting quantum dynamics in terms of stochastic processes.

  19. Drug Development Process

    ... Device Approvals. The Drug Development Process. Step 1: Discovery and Development. Research for a new drug ...

  20. Polyamines in tea processing.

    Palavan-Unsal, Narcin; Arisan, Elif Damla; Terzioglu, Salih


    The distribution of the dietary polyamines putrescine, spermidine and spermine was determined during processing of Camellia sinensis. Black tea manufacture is carried out by a series of processes on fresh tea leaves involving withering, rolling, fermentation, drying and sieving. The aim of this research was to determine the effect of tea processing on the polyamine content in relation to the antioxidant enzymes superoxide dismutase, lipid peroxidase and glutathione peroxidase. Before processing, the spermine content was much higher than the putrescine and spermidine content in green tea leaves. Spermine decreased significantly during processing, while the putrescine and spermidine contents increased during withering and rolling and decreased in the following stages. The superoxide dismutase activity increased at the withering stage and declined during processing. The transcript level of ornithine decarboxylase, the enzyme responsible for polyamine biosynthesis, was reduced during each processing step. This study reveals the importance of protecting nutritional compounds that are essential for health during the manufacturing process.

  1. Grind hardening process

    Salonitis, Konstantinos


    This book presents the grind-hardening process and the main studies published since it was introduced in the 1990s. The modelling of the various aspects of the process, such as the process forces, the temperature profile developed, hardness profiles and residual stresses, is described in detail. The book is of interest to the research community working with mathematical modelling and optimization of manufacturing processes.

  2. Dosimetry for radiation processing

    Miller, Arne


    During the past few years significant advances have taken place in the different areas of dosimetry for radiation processing, mainly stimulated by the increased interest in radiation for food preservation, plastic processing and sterilization of medical products. Reference services ... and sterilization dosimetry, optichromic dosimeters in the shape of small tubes for food processing, and ESR spectroscopy of alanine for reference dosimetry. In this paper the special features of radiation processing dosimetry are discussed, several commonly used dosimeters are reviewed, and factors leading ...

  3. Software Process Improvement Defined

    Aaen, Ivan


    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

  4. Auditory processing models

    Dau, Torsten


    The Handbook of Signal Processing in Acoustics will compile the techniques and applications of signal processing as they are used in the many varied areas of Acoustics. The Handbook will emphasize the interdisciplinary nature of signal processing in acoustics. Each Section of the Handbook will pr...

  5. Semisolid Metal Processing Consortium



    Mathematical modeling and simulation of semisolid filling processes remains a critical issue in understanding and optimizing the process. Semisolid slurries are non-Newtonian materials that exhibit complex rheological behavior, so the way these slurries flow in cavities is very different from the way liquid fills cavities in classical casting. Filling in semisolid processing is therefore often counter-intuitive.

  6. Clinical Process Intelligence

    Vilstrup Pedersen, Klaus


    ... i.e. local guidelines. From a knowledge management point of view, this externalization of generalized processes gives the opportunity to learn from, evaluate and optimize the processes. "Clinical Process Intelligence" (CPI) will denote the goal of getting generalized insight into patient-centered health ...

  7. Biomass process handbook


    Descriptions are given of 42 processes which use biomass to produce chemical products. Marketing and economic background, process description, flow sheets, costs, major equipment, and availability of technology are given for each of the 42 processes. Some of the chemicals discussed are: ethanol, ethylene, acetaldehyde, butanol, butadiene, acetone, citric acid, gluconates, itaconic acid, lactic acid, xanthan gum, sorbitol, starch polymers, fatty acids, fatty alcohols, glycerol, soap, azelaic acid, pelargonic acid, nylon-11, jojoba oil, furfural, furfuryl alcohol, tetrahydrofuran, cellulose polymers, products from pulping wastes, and methane. Processes include acid hydrolysis, enzymatic hydrolysis, fermentation, distillation, the Purox process, and anaerobic digestion.

  8. Thin film processes II

    Kern, Werner


    This sequel to the 1978 classic, Thin Film Processes, gives a clear, practical exposition of important thin film deposition and etching processes that have not yet been adequately reviewed. It discusses selected processes in tutorial overviews with implementation guidelines and an introduction to the literature. Though edited to stand alone, when taken together, Thin Film Processes II and its predecessor present a thorough grounding in modern thin film techniques. Key Features: provides an all-new sequel to the 1978 classic, Thin Film Processes; introduces new topics, and sever...

  9. Colloid process engineering

    Peukert, Wolfgang; Rehage, Heinz; Schuchmann, Heike


    This book deals with colloidal systems in technical processes and the influence of technical processes on colloidal systems. It explores how new measurement capabilities can offer the potential for a dynamic development of science and engineering, and examines the origin of colloidal systems and their use for new products. The future challenges to colloidal process engineering are the development of appropriate equipment and processes for the production and obtainment of multi-phase structures and energetic interactions in market-relevant quantities. The book explores the relevant processes for controlled production and how they can be used across all scales.

  10. Silicon production process evaluations


    Chemical engineering analyses involving the preliminary process design of a plant (1,000 metric tons/year capacity) to produce silicon via the technology under consideration were accomplished. Major activities in the chemical engineering analyses included base case conditions, reaction chemistry, process flowsheet, material balance, energy balance, property data, equipment design, major equipment list, production labor, and inputs carried forward for economic analysis. The process design package provided detailed data for raw materials, utilities, major process equipment and production labor requirements necessary for polysilicon production in each process.

  11. Data processing made simple

    Wooldridge, Susan


    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The ...

  12. Dynamical laser spike processing

    Shastri, Bhavin J; Tait, Alexander N; Rodriguez, Alejandro W; Wu, Ben; Prucnal, Paul R


    Novel materials and devices in photonics have the potential to revolutionize optical information processing, beyond conventional binary-logic approaches. Laser systems offer a rich repertoire of useful dynamical behaviors, including the excitable dynamics also found in the time-resolved "spiking" of neurons. Spiking reconciles the expressiveness and efficiency of analog processing with the robustness and scalability of digital processing. We demonstrate that graphene-coupled laser systems offer a unified low-level spike optical processing paradigm that goes well beyond previously studied laser dynamics. We show that this platform can simultaneously exhibit logic-level restoration, cascadability and input-output isolation---fundamental challenges in optical information processing. We also implement low-level spike-processing tasks that are critical for higher level processing: temporal pattern detection and stable recurrent memory. We study these properties in the context of a fiber laser system, but the addit...

  13. A secondary fuel removal process: plasma processing

    Min, J. Y.; Kim, Y. S. [Hanyang Univ., Seoul (Korea, Republic of); Bae, K. K.; Yang, M. S. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)


    A plasma etching process for UO₂ using a fluorine-containing gas plasma is studied as a secondary fuel removal process for the DUPIC (Direct Use of PWR spent fuel Into CANDU) process, which is under consideration as a potential future fuel cycle in Korea. A CF₄/O₂ gas mixture is chosen as the reactant gas, and the etching rates of UO₂ by the gas plasma are investigated as functions of the CF₄/O₂ ratio, plasma power, substrate temperature, and plasma gas pressure. It is found that the optimum CF₄/O₂ ratio is around 4:1 at all temperatures up to 400 °C, and that the etching rate increases with increasing r.f. power and substrate temperature. Under 150 W r.f. power the etching rate reaches 1100 monolayers/min at 400 °C, which is equivalent to about 0.5 μm/min. (author).

  14. Business Model Process Configurations

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter


    Purpose – The paper aims: 1) to develop systematically a structured list of various business model process configurations and to group (deductively) these selected configurations in a structured typological categorization list; 2) to facilitate companies in the process of BM innovation, by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in business model studies (e.g. definitions, configurations, classifications), we adopted the analytical induction method of data analysis. Findings – A comprehensive literature review and analysis resulted in a list of business model process configurations systematically organized under five classification groups, namely: revenue model; value proposition; value configuration; target customers; and strategic ...

  15. Decomposability for stable processes

    Wang, Yizao; Roy, Parthanil


    We characterize all possible independent symmetric $\alpha$-stable (S$\alpha$S) components of a non-Gaussian S$\alpha$S process, $0<\alpha<2$. In particular, we characterize the independent stationary S$\alpha$S components of a stationary S$\alpha$S process. One simple consequence of our characterization is that all stationary components of the S$\alpha$S moving average processes are trivial. As a main application, we show that the standard Brown–Resnick process has a moving average representation. This complements a result of Kabluchko et al. (2009), who obtained mixed moving average representations for these processes. We also develop a parallel characterization theory for max-stable processes.
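For readers less familiar with the notation, the S$\alpha$S property used throughout the abstract can be stated via the characteristic function (a standard textbook definition, not taken from the paper itself):

```latex
% Characteristic function of a symmetric alpha-stable (S\alpha S) random variable X:
\mathbb{E}\, e^{i\theta X} \;=\; \exp\!\left( -\sigma^{\alpha} \lvert\theta\rvert^{\alpha} \right),
\qquad 0 < \alpha < 2, \quad \sigma > 0.
% The boundary case alpha = 2 recovers the Gaussian law, which the paper excludes.
```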

  16. Badge Office Process Analysis

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.
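The kind of what-if analysis such a simulation model supports can be sketched with a toy single-server queue; the parameters below are invented for illustration and are not the Badge Office data or model.

```python
import random

# Toy single-server queue (hypothetical parameters, not the Badge Office
# model): Poisson arrivals at 0.8 customers/min, exponential service at
# 1.0 customers/min. We estimate the mean time a customer waits in line,
# the throughput/wait trade-off a process simulation is built to probe.
random.seed(1)

arrival_rate = 0.8          # customers per minute (assumed)
service_rate = 1.0          # customers per minute (assumed)
n_customers = 100_000

clock = 0.0                 # arrival time of the current customer
server_free_at = 0.0
total_wait = 0.0
for _ in range(n_customers):
    clock += random.expovariate(arrival_rate)       # next arrival
    start = max(clock, server_free_at)              # wait if server is busy
    total_wait += start - clock
    server_free_at = start + random.expovariate(service_rate)

mean_wait = total_wait / n_customers
print(round(mean_wait, 2))  # M/M/1 theory: lambda / (mu * (mu - lambda)) = 4.0 min
```

Re-running with a second server or a faster service rate is the simulation analogue of the process changes the analysis was built to assess.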

  17. Linearity in Process Languages

    Nygaard, Mikkel; Winskel, Glynn


    The meaning and mathematical consequences of linearity (managing without a presumed ability to copy) are studied for a path-based model of processes which is also a model of affine-linear logic. This connection yields an affine-linear language for processes, automatically respecting open-map bisimulation, in which a range of process operations can be expressed. An operational semantics is provided for the tensor fragment of the language. Different ways to make assemblies of processes lead to different choices of exponential, some of which respect bisimulation.

  18. Process Improvement Essentials

    Persse, James R


    Process Improvement Essentials combines the foundation needed to understand process improvement theory with the best practices to help individuals implement process improvement initiatives in their organization. The three leading programs: ISO 9001:2000, CMMI, and Six Sigma--amidst the buzz and hype--tend to get lumped together under a common label. This book delivers a combined guide to all three programs, compares their applicability, and then sets the foundation for further exploration.

  19. TEP process flow diagram

    Wilms, R Scott [Los Alamos National Laboratory; Carlson, Bryan [Los Alamos National Laboratory; Coons, James [Los Alamos National Laboratory; Kubic, William [Los Alamos National Laboratory


    This presentation describes the development of the proposed Process Flow Diagram (PFD) for the Tokamak Exhaust Processing System (TEP) of ITER. A brief review of design efforts leading up to the PFD is followed by a description of the hydrogen-like, air-like, and water-like processes. Two new design values are described: the most-common and most-demanding design values. The proposed PFD is shown to meet specifications under both the most-common and most-demanding design values.

  20. Business process transformation

    Grover, Varun


    Featuring contributions from prominent thinkers and researchers, this volume in the ""Advances in Management Information Systems"" series provides a rich set of conceptual, empirical, and introspective studies that epitomize fundamental knowledge in the area of Business Process Transformation. Processes are interpreted broadly to include operational and managerial processes within and between organizations, as well as those involved in knowledge generation. Transformation includes radical and incremental change, its conduct, management, and outcome. The editors and contributing authors pay clo

  1. Jointly Poisson processes

    Johnson, D H


    What constitutes jointly Poisson processes remains an unresolved issue. This report reviews the current state of the theory and indicates how the accepted but unproven model equals that resulting from the small time-interval limit of jointly Bernoulli processes. One intriguing consequence of these models is that jointly Poisson processes can only be positively correlated as measured by the correlation coefficient defined by cumulants of the probability generating functional.
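A quick way to see the positive-correlation claim is to build a pair of jointly Bernoulli counting processes with a shared component and pass to many small intervals, in the spirit of the report's limit argument. The rates and the shared-component construction below are our own illustrative choices, not the report's model:

```python
import random
from statistics import fmean

random.seed(2)

# Illustrative construction (invented rates): in each of m tiny intervals,
# three independent Bernoulli streams may fire; stream C is shared by both
# counting processes, so N1 = A + C and N2 = B + C. In the small-interval
# limit the pair is jointly Poisson, and the shared component can only
# push the correlation upward, never below zero.
m, trials = 200, 4000
pa, pb, pc = 2.0 / m, 3.0 / m, 1.5 / m    # slot probabilities (rates 2, 3, 1.5)

pairs = []
for _ in range(trials):
    n1 = n2 = 0
    for _ in range(m):
        if random.random() < pc:          # shared event counts in both
            n1 += 1
            n2 += 1
        if random.random() < pa:
            n1 += 1
        if random.random() < pb:
            n2 += 1
    pairs.append((n1, n2))

mx = fmean(x for x, _ in pairs)
my = fmean(y for _, y in pairs)
cov = fmean((x - mx) * (y - my) for x, y in pairs)
var_x = fmean((x - mx) ** 2 for x, _ in pairs)
var_y = fmean((y - my) ** 2 for _, y in pairs)
corr = cov / (var_x * var_y) ** 0.5
print(round(corr, 2))   # theory: 1.5 / sqrt(3.5 * 4.5) ≈ 0.38
```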

  2. Multiphoton processes: conference proceedings

    Lambropoulos, P.; Smith, S.J. (eds.)


    The chapters of this volume represent the invited papers delivered at the conference. They are arranged according to thematic proximity, beginning with atoms and continuing with molecules and surfaces. Section headings include multiphoton processes in atoms; field fluctuations and collisions in multiphoton processes; and multiphoton processes in molecules and surfaces. Abstracts of individual items from the conference were prepared separately for the data base. (GHT)

  3. Polygon mesh processing

    Botsch, Mario; Pauly, Mark; Alliez, Pierre; Levy, Bruno


    Geometry processing, or mesh processing, is a fast-growing area of research that uses concepts from applied mathematics, computer science, and engineering to design efficient algorithms for the acquisition, reconstruction, analysis, manipulation, simulation, and transmission of complex 3D models. Applications of geometry processing algorithms already cover a wide range of areas from multimedia, entertainment, and classical computer-aided design, to biomedical computing, reverse engineering, and scientific computing. Over the last several years, triangle meshes have become increasingly popular,
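As a minimal taste of the primitives such algorithms are built from (our own toy example, not taken from the book), per-face normals and total surface area of a triangle mesh fall out of one cross product per face:

```python
# Toy mesh: a unit square in the z = 0 plane split into two triangles,
# stored as a vertex list V and a face list F of vertex indices.
V = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
F = [(0, 1, 2), (0, 2, 3)]

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def length(v):
    return sum(x * x for x in v) ** 0.5

areas, normals = [], []
for i, j, k in F:
    c = cross(sub(V[j], V[i]), sub(V[k], V[i]))     # edge1 x edge2
    areas.append(0.5 * length(c))                   # triangle area
    normals.append(tuple(x / length(c) for x in c)) # unit face normal

print(sum(areas), normals)  # 1.0, both normals (0.0, 0.0, 1.0)
```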

  4. Basic digital signal processing

    Lockhart, Gordon B


    Basic Digital Signal Processing describes the principles of digital signal processing and presents experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC language, continuous- and discrete-time signals including analog signals, Fourier analysis, the discrete Fourier transform, and signal energy and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.
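The book's experiments are written in BASIC; as a rough modern sketch of the same core exercise (with assumed sampling rate and tone frequency, and a direct O(n²) DFT in place of the FFT for brevity), a discrete Fourier transform locates a sinusoid's frequency:

```python
import cmath
import math

# Sample one second of a 50 Hz sine at 256 Hz, then find the peak of the
# magnitude spectrum. An FFT computes the same result in O(n log n).
fs = 256                                   # sampling rate in Hz (assumed)
x = [math.sin(2 * math.pi * 50 * t / fs) for t in range(fs)]

n = len(x)
mags = [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
        for k in range(n // 2)]            # magnitude spectrum up to Nyquist

peak_hz = max(range(n // 2), key=lambda k: mags[k]) * fs / n
print(peak_hz)  # 50.0
```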

  5. NASA Hazard Analysis Process

    Deckert, George


    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  6. Scientific information processing procedures

    García, Maylin


    The paper systematizes several theoretical viewpoints on scientific information processing skills and decomposes these skills into sub-skills. Several methods, such as analysis, synthesis, induction, deduction and document analysis, were used to build up a theoretical framework. Interviews, a survey of professionals in training, and a case study were carried out to evaluate the results. All professionals in the sample improved their performance in scientific information processing.

  7. Multimodal Processes Rescheduling

    Bocewicz, Grzegorz; Banaszak, Zbigniew A.; Nielsen, Peter


    Cyclic scheduling problems concerning multimodal processes are usually observed in FMSs producing multi-type parts, where the Automated Guided Vehicles System (AGVS) plays the role of a material handling system. Schedulability analysis of concurrently flowing cyclic processes (SCCP) executed in the ...

  8. Fuels Processing Laboratory

    Federal Laboratory Consortium — NETL’s Fuels Processing Laboratory in Morgantown, WV, provides researchers with the equipment they need to thoroughly explore the catalytic issues associated with...

  9. Process modeling style

    Long, John


    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ...

  10. Living olefin polymerization processes

    Schrock, Richard R.; Baumann, Robert


    Processes for the living polymerization of olefin monomers with terminal carbon-carbon double bonds are disclosed. The processes employ initiators that include a metal atom and a ligand having two group 15 atoms and a group 16 atom or three group 15 atoms. The ligand is bonded to the metal atom through two anionic or covalent bonds and a dative bond. The initiators are particularly stable under reaction conditions in the absence of olefin monomer. The processes provide polymers having low polydispersities, especially block copolymers having low polydispersities. It is an additional advantage of these processes that, during block copolymer synthesis, a relatively small amount of homopolymer is formed.

  11. Financial information processing

    Shuo BAI; Shouyang WANG; Lean YU; Aoying ZHOU


    The rapid growth in financial data volume has made financial information processing more and more difficult due to the increase in complexity, which has forced businesses and academics alike to turn to sophisticated information processing technologies for better solutions. A typical feature is that high-performance computers and advanced computational techniques play ever-increasingly important roles for business and industries to have competitive advantages. Accordingly, financial information processing has emerged as a new cross-disciplinary field integrating computer science, mathematics, financial economics, intelligent techniques, and computer simulations to make different decisions based on processed financial information.

  12. The Recruitment Process:

    Holm, Anna

    The aim of this research was to determine whether the introduction of e-recruitment has an impact on the process and underlying tasks, subtasks and activities of recruitment. Three large organizations with well-established e-recruitment practices were included in the study. The three case studies, which were carried out in Denmark in 2008-2009 using qualitative research methods, revealed changes in the sequence, divisibility and repetitiveness of a number of recruitment tasks and subtasks. The new recruitment process design was identified and presented in the paper. The study concluded that the main task of the process shifted from processing applications to communicating with candidates.

  13. Biased predecision processing.

    Brownstein, Aaron L


    Decision makers conduct biased predecision processing when they restructure their mental representation of the decision environment to favor one alternative before making their choice. The question of whether biased predecision processing occurs has been controversial since L. Festinger (1957) maintained that it does not occur. The author reviews relevant research in sections on theories of cognitive dissonance, decision conflict, choice certainty, action control, action phases, dominance structuring, differentiation and consolidation, constructive processing, motivated reasoning, and groupthink. Some studies did not find evidence of biased predecision processing, but many did. In the Discussion section, the moderators are summarized and used to assess the theories.

  14. Acoustic Signal Processing

    Hartmann, William M.; Candy, James V.

    Signal processing refers to the acquisition, storage, display, and generation of signals - also to the extraction of information from signals and the re-encoding of information. As such, signal processing in some form is an essential element in the practice of all aspects of acoustics. Signal processing algorithms enable acousticians to separate signals from noise, to perform automatic speech recognition, or to compress information for more efficient storage or transmission. Signal processing concepts are the building blocks used to construct models of speech and hearing. Now, in the 21st century, all signal processing is effectively digital signal processing. Widespread access to high-speed processing, massive memory, and inexpensive software make signal processing procedures of enormous sophistication and power available to anyone who wants to use them. Because advanced signal processing is now accessible to everybody, there is a need for primers that introduce basic mathematical concepts that underlie the digital algorithms. The present handbook chapter is intended to serve such a purpose.

  15. Cooperative internal conversion process

    Kálmán, Péter


    A new phenomenon, called the cooperative internal conversion process, is discussed theoretically: the coupling of bound-free electron and neutron transitions due to the dipole term of their Coulomb interaction permits the cooperation of two nuclei, leading to neutron exchange if it is allowed by energy conservation. A general expression for the cross section of the process is reported in one-particle nuclear and spherical shell models, as well as in the case of free atoms (e.g. noble gases). A half-life characteristic of the process is also determined. The case of $Ne$ is investigated numerically. The process may have significance in the fields of nuclear waste disposal and nuclear energy production.

  16. Kuhlthau's Information Search Process.

    Shannon, Donna


    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  17. The Analytical Hierarchy Process

    Barfod, Michael Bruhn


    The technical note gathers the theory behind the Analytical Hierarchy Process (AHP) and presents its advantages and disadvantages in practical use.
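The core AHP computation such a note covers can be sketched as follows; the 3x3 pairwise comparison matrix is an invented example, and the priority weights come from power iteration toward the principal eigenvector:

```python
# Hedged AHP sketch: derive priority weights from a pairwise comparison
# matrix and check Saaty's consistency ratio. A[i][j] encodes how strongly
# criterion i is preferred over criterion j (the matrix is reciprocal).
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]

n = len(A)
w = [1.0 / n] * n
for _ in range(50):                        # power iteration -> principal eigenvector
    w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    s = sum(w)
    w = [x / s for x in w]                 # normalize so weights sum to 1

Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
lam_max = sum(Aw[i] / w[i] for i in range(n)) / n   # principal eigenvalue
ci = (lam_max - n) / (n - 1)                        # consistency index
cr = ci / 0.58                                      # Saaty's random index for n = 3

print([round(x, 3) for x in w], round(cr, 3))
```

A consistency ratio below 0.1 is the conventional threshold for accepting the judgments in the matrix.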

  18. Mineral Processing Technology Roadmap



    This document represents the roadmap for Processing Technology Research in the US Mining Industry. It was developed based on the results of a Processing Technology Roadmap Workshop sponsored by the National Mining Association in conjunction with the US Department of Energy, Office of Energy Efficiency and Renewable Energy, Office of Industrial Technologies. The Workshop was held January 24 - 25, 2000.

  19. Technologies for Optical Processing

    Stubkjær, Kristian


    The article consists of a PowerPoint presentation on technologies for optical processing. The paper concludes that nonlinear elements based on SOAs, fibers and waveguide structures are capable of simple processing at data rates of 100-600 Gb/s. Switching powers comparable to electronics ...

  20. Cognitive Processes and Achievement.

    Hunt, Dennis; Randhawa, Bikkar S.

    For a group of 165 fourth- and fifth-grade students, four achievement test scores were correlated with success on nine tests designed to measure three cognitive functions: sustained attention, successive processing, and simultaneous processing. This experiment was designed in accordance with Luria's model of the three functional units of the…

  1. The process of entrepreneurship:

    Neergaard, Helle


    Growing a technology-based new venture is a complex process because these ventures are embedded in turbulent environments that require fast organisational and managerial transformation. This chapter addresses the evolutionary process of such ventures. It seeks to provide insight into the link...

  2. Process Writing Checklist.

    Jenks, Christopher J.

    This checklist is designed to help develop writing strategies for English language learners (ELLs), focusing on a variety of linguistic strategies inherent in the writing process. It provides them with a graphical representation of the cognitive process involved in complex writing, promoting self-assessment strategies and integrating oral…

  3. Food processing and allergenicity.

    Verhoeckx, Kitty C M; Vissers, Yvonne M; Baumert, Joseph L; Faludi, Roland; Feys, Marcel; Flanagan, Simon; Herouet-Guicheney, Corinne; Holzhauser, Thomas; Shimojo, Ryo; van der Bolt, Nieke; Wichers, Harry; Kimber, Ian


    Food processing can have many beneficial effects. However, processing may also alter the allergenic properties of food proteins. A wide variety of processing methods is available and their use depends largely on the food to be processed. In this review the impact of processing (heat and non-heat treatment) on the allergenic potential of proteins, and on the antigenic (IgG-binding) and allergenic (IgE-binding) properties of proteins has been considered. A variety of allergenic foods (peanuts, tree nuts, cows' milk, hens' eggs, soy, wheat and mustard) have been reviewed. The overall conclusion drawn is that processing does not completely abolish the allergenic potential of allergens. Currently, only fermentation and hydrolysis may have potential to reduce allergenicity to such an extent that symptoms will not be elicited, while other methods might be promising but need more data. Literature on the effect of processing on allergenic potential and the ability to induce sensitisation is scarce. This is an important issue since processing may impact on the ability of proteins to cause the acquisition of allergic sensitisation, and the subject should be a focus of future research. Also, there remains a need to develop robust and integrated methods for the risk assessment of food allergenicity.

  4. Process innovation laboratory

    Møller, Charles


    Most organizations today are required not only to operate effective business processes but also to allow for changing business conditions at an increasing rate. Today nearly every business relies on their enterprise information systems (EIS) for process integration and future generations of EIS...

  5. Hyperspectral image processing methods

    Hyperspectral image processing refers to the use of computer algorithms to extract, store and manipulate both spatial and spectral information contained in hyperspectral images across the visible and near-infrared portion of the electromagnetic spectrum. A typical hyperspectral image processing work...

  6. Flow generating processes

    Lanen, van H.A.J.; Fendeková, M.; Kupczyk, E.; Kasprzyk, A.; Pokojski, W.


    This chapter starts with an overview of how climatic water deficits affect hydrological processes in different types of catchments. It then continues with a more comprehensive description of drought-relevant processes. Two catchments in climatologically contrasting regions are used for illustrative p...

  7. WWTP Process Tank Modelling

    Laursen, Jesper

    ... hydrofoil-shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements ...

  8. Uranium processing and properties


    Covers a broad spectrum of topics and applications that deal with uranium processing and the properties of uranium. Offers extensive coverage of both new and established practices for dealing with uranium supplies in nuclear engineering. Promotes the documentation of the state-of-the-art processing techniques utilized for uranium and other specialty metals.

  9. Relational Processing Following Stroke

    Andrews, Glenda; Halford, Graeme S.; Shum, David; Maujean, Annick; Chappell, Mark; Birney, Damian


    The research examined relational processing following stroke. Stroke patients (14 with frontal, 30 with non-frontal lesions) and 41 matched controls completed four relational processing tasks: sentence comprehension, Latin square matrix completion, modified Dimensional Change Card Sorting, and n-back. Each task included items at two or three…

  10. Monitoring Business Processes

    Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Frati, Fulvio

    In this chapter, we introduce the TEKNE Metrics Framework that performs services to monitor business processes. This framework was designed to support the prescription and explanation of these processes. TEKNE's most innovative contribution is managing data expressed in declarative form. To face this challenge, the TEKNE project implemented an infrastructure that relies on declarative Semantic Web technologies designed to be used in distributed systems.

  11. Image processing mini manual

    Matthews, Christine G.; Posenau, Mary-Anne; Leonard, Desiree M.; Avis, Elizabeth L.; Debure, Kelly R.; Stacy, Kathryn; Vonofenheim, Bill


    The intent is to provide an introduction to the image processing capabilities available at the Langley Research Center (LaRC) Central Scientific Computing Complex (CSCC). Various image processing software components are described. Information is given concerning the use of these components in the Data Visualization and Animation Laboratory at LaRC.

  12. Hybrid quantum information processing

    Furusawa, Akira [Department of Applied Physics, School of Engineering, The University of Tokyo (Japan)]


    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  13. Organic food processing

    Kahl, Johannes; Alborzi, Farnaz; Beck, Alexander


    In 2007 EU Regulation (EC) 834/2007 introduced principles and criteria for organic food processing. These regulations have been analysed and discussed in several scientific publications and research project reports. Recently, organic food quality was described by principles, aspects and criteria....... These principles from organic agriculture were verified and adapted for organic food processing. Different levels for evaluation were suggested. In another document, underlying paradigms and consumer perception of organic food were reviewed against functional food, resulting in identifying integral product...... to evaluate processing methods. Therefore the goal of this paper is to describe and discuss the topic of organic food processing to make it operational. A conceptual background for organic food processing is given by verifying the underlying paradigms and principles of organic farming and organic food as well...

  14. Beryllium chemistry and processing

    Walsh, Kenneth A


    This book introduces beryllium; its history, its chemical, mechanical, and physical properties including nuclear properties. The 29 chapters include the mineralogy of beryllium and the preferred global sources of ore bodies. The identification and specifics of the industrial metallurgical processes used to form oxide from the ore and then metal from the oxide are thoroughly described. The special features of beryllium chemistry are introduced, including analytical chemical practices. Beryllium compounds of industrial interest are identified and discussed. Alloying, casting, powder processing, forming, metal removal, joining and other manufacturing processes are covered. The effect of composition and process on the mechanical and physical properties of beryllium alloys assists the reader in material selection. The physical metallurgy chapter brings conformity between chemical and physical metallurgical processing of beryllium, metal, alloys, and compounds. The environmental degradation of beryllium and its all...

  15. Posttranslational processing of progastrin

    Bundgaard, Jens René; Rehfeld, Jens F.


    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts...... processing progastrin is often greatly disturbed in neoplastic cells.The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is now reviewed here. In addition, the individual contributions of the processing enzymes are discussed......, as are structural features of progastrin that are involved in the precursor activation process. Thus, the review describes how the processing depends on the cell-specific expression of the processing enzymes and kinetics in the secretory pathway....

  16. Managing Software Process Evolution

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  17. Business Process Management

    Mendling, Jan

    The recent progress of Business Process Management (BPM) is reflected by the figures of the related industry. Wintergreen Research estimates that the international market for BPM-related software and services accounted for more than USD 1 billion in 2005 with a tendency towards rapid growth in the subsequent couple of years [457]. The relevance of business process modeling to general management initiatives has been previously studied in the 1990s [28]. Today, Gartner finds that organizations that had the best results in implementing business process management spent more than 40 percent of the total project time on discovery and construction of their initial process model [265]. As a consequence, Gartner considers Business Process Modeling to be among the Top 10 Strategic Technologies for 2008.

  18. New Processes for Annulation

    Liu Hsing-Jang


    Making use of the high propensity of 2-cyano-2-cycloalkenones to undergo conjugate addition with various carbanions and the high reactivity of the ensuing α-cyano ketone system, a number of new annulation processes have been developed recently in our laboratories. As shown in Eq. 1 (n=1) with a specific example, one such process involves the addition of 3-butenylmagnesium bromide, followed by a palladium(II) acetate mediated oxidative cyclization, to facilitate methylenecyclopentane ring formation. This annulation process could be readily extended to effect methylenecyclohexane ring formation (Eq. 1, n=2), using 4-pentenylmagnesium bromide as the initial reagent, and to install the carbomethoxy-substituted methylenecyclopentane and methylenecyclohexane rings, using the carbanions derived from methyl 4-pentenoate and methyl 5-hexenoate, respectively (Eq. 2). In another annulation process, the addition of the enolate of methyl 5-chloropentanoate is involved initially, and the ring formation is readily effected by an intramolecular alkylation process. A specific example is given in Eq. 3.

  19. Business process support

    Carle, Adriana; Fiducia, Daniel [Transportadora de Gas del Sur S.A. (TGS), Buenos Aires (Argentina)]


    This paper is about the in-house development of business support software. The developed applications are used to support two business processes: one of them is the process of gas transportation and the other is natural gas processing. This software has interfaces with the ERP SAP, SCADA software and on-line gas transportation simulation software. The main functionalities of the applications are: on-line real-time entry of clients' transport nominations, transport programming, allocation of the clients' transport nominations, transport control, measurements, pipeline balance, allocation of gas volume to the gas processing plants, calculation of product tons processed in each plant and tons of product distributed to clients. All the developed software generates information for the internal staff, regulatory authorities and clients. (author)

  20. Nonhomogeneous fractional Poisson processes

    Wang Xiaotian [School of Management, Tianjin University, Tianjin 300072 (China)]. E-mail:; Zhang Shiying [School of Management, Tianjin University, Tianjin 300072 (China)]; Fan Shen [Computer and Information School, Zhejiang Wanli University, Ningbo 315100 (China)]


    In this paper, we propose a class of non-Gaussian stationary-increment processes, named nonhomogeneous fractional Poisson processes W_H^(j)(t), which permit the study of the effects of long-range dependence in a large number of fields including quantum physics and finance. The processes W_H^(j)(t) are self-similar in a wide sense, exhibit fatter tails than Gaussian processes, and converge to Gaussian processes in distribution in some cases. In addition, we also show that the intensity function λ(t) strongly influences the existence of the highest finite moment of W_H^(j)(t) and the behaviour of the tail probability of W_H^(j)(t).
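
    The abstract does not give the construction of W_H^(j)(t), but the role of the intensity function λ(t) can be illustrated with an ordinary nonhomogeneous Poisson process. A minimal sketch using Lewis-Shedler thinning (the function name and the example intensity are illustrative assumptions, not taken from the paper):

```python
import math
import random

def sample_nhpp(intensity, t_max, lam_max, rng=None):
    """Sample arrival times of a nonhomogeneous Poisson process on (0, t_max]
    by Lewis-Shedler thinning: propose candidates from a homogeneous Poisson
    process of rate lam_max >= intensity(t), and accept each candidate with
    probability intensity(t) / lam_max."""
    rng = rng or random.Random(0)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam_max)            # next candidate point
        if t > t_max:
            return arrivals
        if rng.random() < intensity(t) / lam_max:
            arrivals.append(t)

# Example intensity: lambda(t) = 1 + sin(t)^2, bounded above by lam_max = 2
arrivals = sample_nhpp(lambda t: 1.0 + math.sin(t) ** 2, t_max=100.0, lam_max=2.0)
```

    The expected number of arrivals is the integral of λ(t) over the window (about 150 here), which is how the intensity shapes the counting process.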

  1. Conceptualizing operations strategy processes

    Rytter, Niels Gorm; Boer, Harry; Koch, Christian


    Purpose - The purpose of this paper is to present insights into operations strategy (OS) in practice. It outlines a conceptualization and model of OS processes and, based on findings from an in-depth and longitudinal case study, contributes to further development of extant OS models and methods......; taking place in five dimensions of change - technical-rational, cultural, political, project management, and facilitation; and typically unfolding as a sequential and parallel, ordered and disordered, planned and emergent as well as top-down and bottom-up process. The proposed OS conceptualization...... outcomes for an OS process in practice, change agents may need to moderate their outcome ambitions, manage process dimensions and agendas in a situational manner, balance inherent process paradoxes, strive at bridging both language and reality, as well as mobilizing key stakeholders, especially middle...

  2. Branching processes in biology

    Kimmel, Marek


    This book provides a theoretical background of branching processes and discusses their biological applications. Branching processes are a well-developed and powerful set of tools in the field of applied probability. The range of applications considered includes molecular biology, cellular biology, human evolution and medicine. The branching processes discussed include Galton-Watson, Markov, Bellman-Harris, Multitype, and General Processes. As an aid to understanding specific examples, two introductory chapters, and two glossaries are included that provide background material in mathematics and in biology. The book will be of interest to scientists who work in quantitative modeling of biological systems, particularly probabilists, mathematical biologists, biostatisticians, cell biologists, molecular biologists, and bioinformaticians. The authors are a mathematician and cell biologist who have collaborated for more than a decade in the field of branching processes in biology for this new edition. This second ex...
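
    The simplest process treated in the book, the Galton-Watson process, can be sketched in a few lines (the offspring distribution below is an arbitrary critical example, not one from the text):

```python
import random

def galton_watson(offspring, generations, z0=1, rng=None):
    """Simulate generation sizes Z_0..Z_n of a Galton-Watson branching process:
    each individual independently produces offspring(rng) children."""
    rng = rng or random.Random(1)
    sizes = [z0]
    for _ in range(generations):
        if sizes[-1] == 0:
            sizes.append(0)                      # extinction is absorbing
        else:
            sizes.append(sum(offspring(rng) for _ in range(sizes[-1])))
    return sizes

# Critical example: 0, 1 or 2 offspring with probabilities 1/4, 1/2, 1/4 (mean 1)
sizes = galton_watson(lambda r: r.choices([0, 1, 2], weights=[1, 2, 1])[0],
                      generations=20)
```

    Running this repeatedly shows the qualitative behaviour the theory predicts: a critical process (mean offspring 1) goes extinct with probability 1, but extinction times vary widely.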

  3. Formed HIP Can Processing

    Clarke, Kester Diederik [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]


    The intent of this report is to document a procedure used at LANL for HIP bonding aluminum cladding to U-10Mo fuel foils using a formed HIP can for the Domestic Reactor Conversion program in the NNSA Office of Material Management and Minimization, and to provide some details that may not have been published elsewhere. The HIP process is based on the procedures that have been used to develop the formed HIP can process, including the baseline process developed at Idaho National Laboratory (INL). The HIP bonding cladding process development is summarized in the listed references. Further iterations with Babcock & Wilcox (B&W) to refine the process to meet production and facility requirements are expected.

  4. Heavy oils processing materials requirements crude processing

    Sloley, Andrew W. [CH2M Hill, Englewood, CO (United States)]


    Over time, recommended best practices for crude unit materials selection have evolved to accommodate new operating requirements, feed qualities, and product qualities. The shift to heavier oil processing is one of the major changes in crude feed quality occurring over the last 20 years. The three major types of crude unit corrosion include sulfidation attack, naphthenic acid attack, and corrosion resulting from hydrolyzable chlorides. Heavy oils processing makes all three areas worse. Heavy oils have higher sulfur content; higher naphthenic acid content; and are more difficult to desalt, leading to higher chloride corrosion rates. Materials selection involves two major criteria, meeting required safety standards, and optimizing economics of the overall plant. Proper materials selection is only one component of a plant integrity approach. Materials selection cannot eliminate all corrosion. Proper materials selection requires appropriate support from other elements of an integrity protection program. The elements of integrity preservation include: materials selection (type and corrosion allowance); management limits on operating conditions allowed; feed quality control; chemical additives for corrosion reduction; and preventive maintenance and inspection (PMI). The following discussion must be taken in the context of the application of required supporting work in all the other areas. Within that context, specific materials recommendations are made to minimize corrosion due to the most common causes in the crude unit. (author)

  5. 75 FR 54343 - Center for Biologics Evaluation and Research eSubmitter Pilot Evaluation Program for Blood...


    ...The Food and Drug Administration (FDA), Center for Biologics Evaluation and Research (CBER) is announcing an invitation to participate in a pilot evaluation program for CBER's eSubmitter Program (eSubmitter). CBER's eSubmitter has been customized as an automated biologics license application (BLA) and BLA supplement (BLS) submission system for blood and blood components. Participation in the......

  6. The Isasmelt process

    Barrett, K.R. (MIM Technology Marketing Ltd., Northfleet (United Kingdom))


    The Isasmelt process was developed at Mt Isa Mines Ltd. in Queensland. The process was initially developed for the treatment of lead concentrate. After successful application of the process to lead production, a pilot plant was built for the treatment of copper concentrate to produce copper matte. This was successful and as a result Mt Isa Mines decided to build a new copper smelter with a capacity of 180,000 t copper/a in copper matte. Further commercialisation of the process has resulted in the construction of further plants for lead, nickel and copper production. Mt Isa Mines Ltd. has been associated with CSIRO (Commonwealth Scientific Industrial Research Organisation) in the development of the Isasmelt process since 1977, when a Sirosmelt lance was tested for reducing copper converter slags. Thermodynamic modelling and crucible-scale investigations on a lead smelting process were initiated in 1978. After further work on a 120 kg/h pilot plant, the Isasmelt process for lead concentrate smelting was patented jointly by Mt Isa Mines and CSIRO. Since then a 5 t/h demonstration plant was commissioned in 1983/85 for smelting and reduction. Finally, in 1991 a commercial-scale plant with a capacity of 60,000 t/a was commissioned. (orig.)

  7. Revealing the programming process

    Bennedsen, Jens; Caspersen, Michael Edelgaard


    One of the most important goals of an introductory programming course is that the students learn a systematic approach to the development of computer programs. Revealing the programming process is an important part of this; however, textbooks do not address the issue -- probably because the textbook medium is static and therefore ill-suited to expose the process of programming. We have found that process recordings in the form of captured narrated programming sessions are a simple, cheap, and efficient way of providing the revelation. We identify seven different elements of the programming process for which process recordings are a valuable communication medium in order to enhance the learning process. Student feedback indicates both high learning outcome and superior learning potential compared to traditional classroom teaching.

  8. States in Process Calculi

    Christoph Wagner


    Formal reasoning about distributed algorithms (like Consensus) typically requires analyzing global states in a traditional state-based style. This is in contrast to the traditional action-based reasoning of process calculi. Nevertheless, we use domain-specific variants of the latter, as they are convenient modeling languages in which the local code of processes can be programmed explicitly, with the local state information usually managed via parameter lists of process constants. However, domain-specific process calculi are often equipped with (unlabeled) reduction semantics, building upon a rich and convenient notion of structural congruence. Unfortunately, the price for this convenience is that the analysis is cumbersome: the set of reachable states is only defined modulo structural congruence, and the processes' state information is very hard to identify. We extract from congruence classes of reachable states individual state-informative representatives that we supply with a proper formal semantics. As a result, we can now freely switch between the process calculus terms and their representatives, and we can use the stateful representatives to perform assertional reasoning on process calculus models.

  9. Semi-Markov processes



    Semi-Markov Processes: Applications in System Reliability and Maintenance is a modern view of discrete state space and continuous time semi-Markov processes and their applications in reliability and maintenance. The book explains how to construct semi-Markov models and discusses the different reliability parameters and characteristics that can be obtained from those models. The book is a useful resource for mathematicians, engineering practitioners, and PhD and MSc students who want to understand the basic concepts and results of semi-Markov process theory. Clearly defines the properties and
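
    What distinguishes a semi-Markov process from an ordinary Markov chain is that sojourn times need not be exponential. A small reliability-flavoured sketch (state names, holding-time distributions, and the availability example are assumptions for illustration, not from the book):

```python
import random

def simulate_semi_markov(holding, transition, state0, t_end, rng=None):
    """Simulate a semi-Markov process: the sojourn time in each state is drawn
    from a state-dependent, not necessarily exponential, distribution; the
    embedded chain then picks the next state. Returns total time per state."""
    rng = rng or random.Random(2)
    t, state, time_in = 0.0, state0, {}
    while t < t_end:
        stay = min(holding[state](rng), t_end - t)   # truncate at the horizon
        time_in[state] = time_in.get(state, 0.0) + stay
        t += stay
        state = transition[state](rng)
    return time_in

# Two-state up/down reliability model with illustrative distributions
holding = {"up": lambda r: r.weibullvariate(100.0, 1.5),   # time to failure
           "down": lambda r: r.uniform(1.0, 5.0)}          # repair time
transition = {"up": lambda r: "down", "down": lambda r: "up"}
times = simulate_semi_markov(holding, transition, "up", t_end=10_000.0)
availability = times["up"] / 10_000.0    # long-run estimate of E[up]/(E[up]+E[down])
```

    The long-run availability estimated this way is one of the reliability characteristics such models are built to deliver.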

  10. Transnational Learning Processes

    Nedergaard, Peter

    This paper analyses and compares the transnational learning processes in the employment field in the European Union and among the Nordic countries. Based theoretically on a social constructivist model of learning and methodologically on a questionnaire distributed to the relevant participants......, a number of hypotheses concerning transnational learning processes are tested. The paper closes with a number of suggestions regarding an optimal institutional setting for facilitating transnational learning processes. Key words: Transnational learning, Open Method of Coordination, Learning, Employment, European Employment Strategy, European Union, Nordic countries.

  11. Plasma processing for VLSI

    Einspruch, Norman G


    VLSI Electronics: Microstructure Science, Volume 8: Plasma Processing for VLSI (Very Large Scale Integration) discusses the utilization of plasmas for general semiconductor processing. It also includes expositions on advanced deposition of materials for metallization, lithographic methods that use plasmas as exposure sources and for multiple resist patterning, and device structures made possible by anisotropic etching.This volume is divided into four sections. It begins with the history of plasma processing, a discussion of some of the early developments and trends for VLSI. The second section

  12. Getting Started with Processing

    Reas, Casey


    Learn computer programming the easy way with Processing, a simple language that lets you use code to create drawings, animation, and interactive graphics. Programming courses usually start with theory, but this book lets you jump right into creative and fun projects. It's ideal for anyone who wants to learn basic programming, and serves as a simple introduction to graphics for people with some programming skills. Written by the founders of Processing, this book takes you through the learning process one step at a time to help you grasp core programming concepts. You'll learn how to sketch wi

  13. Chemical Processing Manual

    Beyerle, F. J.


    Chemical processes presented in this document include cleaning, pickling, surface finishes, chemical milling, plating, dry film lubricants, and polishing. All types of chemical processes applicable to aluminum, for example, are to be found in the aluminum alloy section. There is a separate section for each category of metallic alloy plus a section for non-metals, such as plastics. The refractories, super-alloys and titanium, are prime candidates for the space shuttle, therefore, the chemical processes applicable to these alloys are contained in individual sections of this manual.

  14. The image processing handbook

    Russ, John C


    Now in its fifth edition, John C. Russ's monumental image processing reference is an even more complete, modern, and hands-on tool than ever before. The Image Processing Handbook, Fifth Edition is fully updated and expanded to reflect the latest developments in the field. Written by an expert with unequalled experience and authority, it offers clear guidance on how to create, select, and use the most appropriate algorithms for a specific application. What's new in the Fifth Edition? ·       A new chapter on the human visual process that explains which visual cues elicit a response from the vie

  15. Study on Glulam Process

    PENG Limin; WANG Haiqing; HE Weili


    This paper selected lumbers of Manchurian ash (Fraxinus mandshurica), Manchurian walnut (Juglans mandshurica) and spruce (Picea jezoensis var. komarovii) for manufacturing glulam with water-borne polymeric-isocyanate adhesive to determine process variables. The process variables influencing the shear strength of the glulam, which include specific pressure, pressing time and adhesive application amount, were investigated through an orthogonal test. The results indicated that the optimum process variables for glulam manufacturing were as follows: specific pressure of 1.5 MPa for spruce and 2.0 MPa for both Manchurian ash and Manchurian walnut, pressing time of 60 min and adhesive application amount of 250 g/m2.
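
    The paper selects its optimum levels from an orthogonal test. A minimal sketch of how such a range-style analysis picks the best level per factor; the run data below are invented placeholders, not the paper's measurements:

```python
# Hypothetical orthogonal-test results: each run sets a level for
# (pressure_MPa, time_min, glue_g_per_m2) and yields a shear strength.
runs = [
    ((1.0, 40, 200), 5.1), ((1.0, 50, 250), 5.9), ((1.0, 60, 300), 6.0),
    ((1.5, 40, 250), 6.4), ((1.5, 50, 300), 6.6), ((1.5, 60, 200), 7.1),
    ((2.0, 40, 300), 6.2), ((2.0, 50, 200), 6.5), ((2.0, 60, 250), 7.0),
]

def best_levels(runs):
    """For each factor, pick the level with the highest mean response."""
    n_factors = len(runs[0][0])
    best = []
    for f in range(n_factors):
        by_level = {}
        for levels, y in runs:
            by_level.setdefault(levels[f], []).append(y)
        best.append(max(by_level,
                        key=lambda lv: sum(by_level[lv]) / len(by_level[lv])))
    return tuple(best)

print(best_levels(runs))  # → (1.5, 60, 250)
```

    Because each level appears the same number of times in an orthogonal array, comparing per-level means is enough to rank the levels of each factor independently.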

  16. Digital Differential Geometry Processing

    Xin-Guo Liu; Hu-Jun Bao; Qun-Sheng Peng


    The theory and methods of digital geometry processing have been a hot research area in computer graphics, as geometric models serve as the core data for 3D graphics applications. The purpose of this paper is to introduce some recent advances in digital geometry processing, particularly mesh fairing, surface parameterization and mesh editing, that heavily use differential geometry quantities. Some related concepts from differential geometry, such as normal, curvature, gradient, Laplacian and their counterparts on digital geometry are also reviewed for understanding the strengths and weaknesses of various digital geometry processing methods.
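
    As a sketch of the Laplacian-based mesh fairing the survey mentions, here is a toy umbrella-operator smoother; the data layout and the example polyline are illustrative assumptions, not the paper's method:

```python
def laplacian_smooth(verts, neighbors, lam=0.5, iters=10):
    """Simple mesh fairing: move each vertex a fraction lam toward the average
    of its neighbors (the 'umbrella' discretization of the Laplacian)."""
    verts = [list(v) for v in verts]
    for _ in range(iters):
        new = []
        for i, v in enumerate(verts):
            ns = neighbors[i]
            if not ns:                       # pinned vertex: keep in place
                new.append(v[:])
                continue
            avg = [sum(verts[j][k] for j in ns) / len(ns) for k in range(len(v))]
            new.append([v[k] + lam * (avg[k] - v[k]) for k in range(len(v))])
        verts = new
    return verts

# Noisy polyline as a degenerate "mesh": interior points relax toward a line
pts = [(0.0, 0.0), (1.0, 0.8), (2.0, -0.6), (3.0, 0.5), (4.0, 0.0)]
nbrs = {0: [], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: []}   # endpoints pinned
smoothed = laplacian_smooth(pts, nbrs)
```

    The same update, applied to the one-ring neighborhoods of a triangle mesh, is the basic fairing step that curvature-aware methods then refine.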

  17. Nano integrated circuit process

    Yoon, Yung Sup


    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, such as dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, including the diffusion process, diffusion equation, diffusion coefficient and diffusion systems; ion implantation, including ion distribution, channeling, multi-implantation, masking and its systems; sputtering, such as CVD and PVD; lithography; wet etch and dry etch; interconnection and flattening, such as metal-silicon connection, silicide, multiple-layer metal processes and flattening; and integrated circuit processes, including MOSFET and CMOS.

  18. Irreversible processes kinetic theory

    Brush, Stephen G


    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s
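
    For readers unfamiliar with the H-theorem mentioned above, its standard statement for the one-particle velocity distribution f(v, t) is:

```latex
H(t) = \int f(\mathbf{v}, t)\,\ln f(\mathbf{v}, t)\,\mathrm{d}^3 v,
\qquad
\frac{\mathrm{d}H}{\mathrm{d}t} \le 0
```

    with equality only when f is the Maxwell-Boltzmann distribution; this monotone decrease of H is the kinetic-theory expression of the irreversibility the volume discusses.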

  19. Ultrasonic Processing of Materials

    Han, Qingyou


    Irradiation of high-energy ultrasonic vibration in metals and alloys generates oscillating strain and stress fields in solids, and introduces nonlinear effects such as cavitation, acoustic streaming, and radiation pressure in molten materials. These nonlinear effects can be utilized to assist conventional material processing. This article describes recent research at Oak Ridge National Laboratory and Purdue University on using high-intensity ultrasonic vibrations for degassing molten aluminum, processing particulate-reinforced metal matrix composites, refining metals and alloys during solidification and welding, and producing bulk nanostructures in solid metals and alloys. Research results suggest that high-intensity ultrasonic vibration is capable of degassing and dispersing small particles in molten alloys, reducing grain size during alloy solidification, and inducing nanostructures in solid metals.

  20. Processer i undervisningen

    Bundsgaard, Jeppe

    The study focuses on processes in teaching, and thereby on how digital learning materials can support or be integrated into typical processes. The study builds on participant observation at Abildgårdskolen in Odense. Through the observations, a number of examples of ... challenges in carrying out the teaching processes were identified, and suggestions are given for digital learning materials that are expected to support these processes. The study also shows how a focus on processes can serve as a method for user-driven innovation.

  1. Processed Products Database System

    National Oceanic and Atmospheric Administration, Department of Commerce — Collection of annual data on processed seafood products. The Division provides authoritative advice, coordination and guidance on matters related to the collection,...

  2. Reconfigurable network processing platforms

    Kachris, C.


    This dissertation presents our investigation of how to efficiently exploit reconfigurable hardware to design flexible, high-performance, and power-efficient network devices capable of adapting to the varying processing requirements of network applications and traffic. The proposed reconfigurable network pr

  3. Quantum processes in semiconductors

    Ridley, B K


    Aimed at graduate students, this is a guide to quantum processes of importance in the physics and technology of semiconductors. The fifth edition includes new chapters that expand the coverage of semiconductor physics relevant to its accompanying technology.


    Evaporation has been an established technology in the metal finishing industry for many years. In this process, wastewaters containing reusable materials, such as copper, nickel, or chromium compounds are heated, producing a water vapor that is continuously removed and condensed....

  5. Phenol removal pretreatment process

    Hames, Bonnie R.


    A process for removing phenols from an aqueous solution is provided, which comprises the steps of contacting a mixture comprising the solution and a metal oxide, forming a phenol metal oxide complex, and removing the complex from the mixture.

  6. Logistics Innovation Process Revisited

    Gammelgaard, Britta; Su, Shong-Iee Ivan; Yang, Su-Lan


    Purpose – The purpose of this paper is to learn more about logistics innovation processes and their implications for the focal organization as well as the supply chain, especially suppliers. Design/methodology/approach – The empirical basis of the study is a longitudinal action research project...... that was triggered by the practical needs of new ways of handling material flows of a hospital. This approach made it possible to revisit theory on logistics innovation process. Findings – Apart from the tangible benefits reported to the case hospital, five findings can be extracted from this study: the logistics...... innovation process model may include not just customers but also suppliers; logistics innovation in buyer-supplier relations may serve as an alternative to outsourcing; logistics innovation processes are dynamic and may improve supplier partnerships; logistics innovations in the supply chain are as dependent...


    Yu. Taranenko


    The article deals with the theoretical basis of the simulation. The study shows that the simulation of logistics processes in industrial countries is an integral part of many economic projects aimed at the creation or improvement of logistics systems. The paper uses the Beer Game model for the management of logistics processes in the enterprise. The simulation model is implemented in the AnyLogic package. The AnyLogic product allows us to consider logistics processes as an integrated system, which makes it possible to reach better solutions. Logistics process management involves pooling the sales market, production and distribution to ensure the target level of customer service at the lowest overall cost. This made it possible to conduct experiments and to determine the optimal size of the warehouse at the lowest cost.
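
    AnyLogic is the tool actually used in the article; as a tool-independent illustration of the kind of feedback loop the Beer Game models, here is a toy single-echelon order-up-to sketch (all parameter values and the demand series are invented):

```python
def beer_game(demand, lead_time=2, target=12):
    """Toy single-echelon Beer-Game-style loop: each period, satisfy demand
    (plus backlog) from stock, then order up to a target inventory position;
    orders arrive after lead_time periods."""
    inventory, backlog = 12, 0
    pipeline = [0] * lead_time                      # orders in transit
    history = []
    for d in demand:
        inventory += pipeline.pop(0)                # shipment arrives
        need = d + backlog
        shipped = min(inventory, need)
        inventory -= shipped
        backlog = need - shipped
        position = inventory - backlog + sum(pipeline)
        order = max(0, target - position)           # order-up-to policy
        pipeline.append(order)
        history.append((inventory, backlog, order))
    return history

hist = beer_game([4, 4, 4, 8, 8, 8, 8, 8])          # a step increase in demand
```

    Even this stripped-down loop shows the delay-driven oscillation (the "bullwhip" effect) that full Beer Game simulations are used to study.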

  8. Ultrahigh bandwidth signal processing

    Oxenløwe, Leif Katsuo


    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral ... regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described.

  9. Radiation processing in Japan

    Makuuchi, Keizo [Japan Atomic Energy Research Inst., Takasaki, Gunma (Japan). Takasaki Radiation Chemistry Research Establishment]


    The economic scale of radiation applications in the fields of industry, agriculture and medicine in Japan in 1997 was investigated to compare its economic impact with that of the nuclear energy industry. The total production value of radiation applications accounted for 54% of the nuclear industry, including the nuclear energy industry and radiation applications in the three fields above. Industrial radiation applications were further divided into five groups, namely nondestructive testing, RI instruments, radiation facilities, radiation processing and ion beam processing. More than 70% of the total production value was brought about by ion beam processing for use with ICs and semiconductors. The future economic prospects of radiation processing of polymers, for example cross-linking, EB curing, graft polymerization and degradation, are reviewed. Particular attention was paid to radiation vulcanization of natural rubber latex and also to degradation of natural polymers. (S. Ohno)

  10. IT Project Prioritization Process

    Shollo, Arisa; Constantiou, Ioanna


    In most large companies, the IT project prioritization process is designed based on principles of evidence-based management. We investigate a case of IT project prioritization in a financial institution, and in particular, how managers practice evidence-based management during this process. We use a rich dataset built from a longitudinal study of the prioritization process for the IT projects. Our findings indicate that managers reach a decision not only by using evidence but from the interplay between the evidence and the judgment devices that managers employ. The interplay between evidence and judgment devices is manifested in three ways: supplementing, substituting, and interpreting evidence. We show that while evidence does not fully determine the decision, it plays a central role in discussions, reflections, and negotiations during the IT prioritization process.

  11. Cooperative processing data bases

    Hasta, Juzar


    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.


    Igor G. Fedorov


    Full Text Available If we formulate the basic concepts of process management incorrectly, we risk solving the wrong problems: instead of process management we may end up with mere automation, and instead of a process-oriented system we may introduce a function-oriented one. Without a clear idea of the model to be executed, we may design it as a purely analytical one and fail to include all the necessary tools for management at the planning stage. The article is targeted at analysts who have skills in analytical modeling of business processes and would like to take a step forward to the implementation of these models. To become professionals in this field, it is necessary first of all to learn the terminology.

  13. Essentials of stochastic processes

    Durrett, Richard


    Building upon the previous editions, this textbook is a first course in stochastic processes taken by undergraduate and graduate students (MS and PhD students from math, statistics, economics, computer science, engineering, and finance departments) who have had a course in probability theory. It covers Markov chains in discrete and continuous time, Poisson processes, renewal processes, martingales, and option pricing. One can only learn a subject by seeing it in action, so there are a large number of examples and more than 300 carefully chosen exercises to deepen the reader’s understanding. Drawing from teaching experience and student feedback, there are many new examples and problems with solutions that use TI-83 to eliminate the tedious details of solving linear equations by hand, and the collection of exercises is much improved, with many more biological examples. Originally included in previous editions, material too advanced for this first course in stochastic processes has been eliminated while treatm...
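    The "tedious details of solving linear equations by hand" that the book delegates to a TI-83 are the kind of Markov-chain computation sketched below. This is an illustrative sketch, not taken from the book; the 3-state transition matrix is hypothetical. It finds the stationary distribution by solving pi P = pi together with the normalization sum(pi) = 1.

```python
# Sketch: stationary distribution of a discrete-time Markov chain.
# The transition matrix P is a made-up example (rows sum to 1).
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# pi P = pi  <=>  (P^T - I) pi = 0.  The system is singular, so we
# replace one equation with the normalization constraint sum(pi) = 1.
A = P.T - np.eye(3)
A[-1, :] = 1.0                      # last row now enforces sum(pi) = 1
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)          # stationary probabilities
```

    Replacing one row of the singular system with the normalization constraint is the standard trick that makes the system uniquely solvable for an irreducible chain.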

  14. Assessing Process and Product

    Bennedsen, Jens; Caspersen, Michael E.


    The final assessment of a course must reflect its goals and contents. An important goal of our introductory programming course is that the students learn a systematic approach for the development of computer programs. Having the programming process as a learning objective naturally raises the question of how to include this in assessments. Traditional assessments (e.g. oral, written, or multiple choice) are unsuitable for testing the programming process. We describe and evaluate a practical lab examination that assesses the students' programming process as well as the developed programs...

  15. Markovian risk process

    WANG Han-xing; YAN Yun-zhi; ZHAO Fei; FANG Da-fan


    A Markovian risk process is considered in this paper, which is a generalization of the classical risk model. It is appropriate to model a risk process with large claims as a Markovian risk model. In such a model, the occurrence of claims is described by a point process {N(t)}t≥0, with N(t) being the number of jumps of a Markov jump process during the interval (0, t]. The ruin probability Ψ(u) of a company facing such a risk model is mainly studied. An integral equation satisfied by the ruin probability function Ψ(u) is obtained, and bounds for the convergence rate of the ruin probability Ψ(u) are given by using a generalized renewal technique developed in the paper.
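    The ruin probability Ψ(u) rarely has a closed form, and Monte Carlo estimation is a common sanity check. The sketch below illustrates the quantity for the *classical* compound Poisson special case with exponential claims, not the paper's Markov-modulated model; all parameter values (premium rate, claim intensity, horizon) are hypothetical.

```python
# Sketch: finite-horizon Monte Carlo estimate of the ruin probability
# for the classical risk model R(t) = u + c*t - S(t), where S(t) is a
# compound Poisson sum of exponential claims.
import random

def ruin_probability(u, c=1.5, lam=1.0, mean_claim=1.0,
                     horizon=200.0, n_sim=2000, seed=42):
    """Fraction of simulated reserve paths that fall below zero before
    `horizon` (a finite-horizon approximation of Psi(u))."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(n_sim):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)                    # next claim arrival
            if t >= horizon:
                break                                    # survived the horizon
            claims += rng.expovariate(1.0 / mean_claim)  # exponential claim size
            if u + c * t - claims < 0:
                ruins += 1                               # reserve went negative
                break
    return ruins / n_sim
```

    With a positive safety loading (c greater than lam * mean_claim, as in the defaults), the estimate decreases in the initial reserve u, matching the qualitative behaviour of Ψ(u) bounded in the paper.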

  16. Fractional Pure Birth Processes

    Orsingher, Enzo; 10.3150/09-BEJ235


    We consider a fractional version of the classical non-linear birth process, of which the Yule-Furry model is a particular case. Fractionality is obtained by replacing the first-order time derivative in the difference-differential equations which govern the probability law of the process with the Dzherbashyan-Caputo fractional derivative. We derive the probability distribution of the number $\mathcal{N}_\nu(t)$ of individuals at an arbitrary time $t$...
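    For orientation, the non-fractional Yule-Furry process that the paper generalizes is easy to simulate directly, since the sojourn time in state n is exponential with rate lambda * n (the fractional version, governed by Caputo-derivative equations, has no such naive simulation). A sketch with hypothetical parameters:

```python
# Sketch: classical (non-fractional) Yule-Furry linear birth process.
import random

def yule_population(t, lam=1.0, rng=None):
    """Population at time t, starting from one individual.  The birth
    rate in state n is lam * n, so the waiting time to the next birth
    is Exp(lam * n)."""
    rng = rng or random.Random()
    n, clock = 1, 0.0
    while True:
        clock += rng.expovariate(lam * n)   # epoch of the next birth
        if clock > t:
            return n                        # no more births before t
        n += 1
```

    A quick sanity check for the simulator: E[N(t)] = exp(lam * t) for the classical process.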

  17. Harmonizable Processes: Structure.


    ... a related result of Thomas ([39], p. 146). However, the Bourbaki set-up of these papers is inconvenient here, and they will be converted to the set ... of processes. 1. Introduction. Recently there have been significant attempts at extending the well-understood theory of stationary processes ... characterizations of the respective classes. This involves a free use of some elementary aspects of vector measure theory; and it already raises some interesting ...

  18. Process for compound transformation

    Basset, Jean-Marie


    Embodiments of the present disclosure provide for methods of using a catalytic system to chemically transform a compound (e.g., a hydrocarbon). In an embodiment, the method does not employ grafting the catalyst prior to catalysis. In particular, embodiments of the present disclosure provide for a process of hydrocarbon (e.g., C1 to C20 hydrocarbon) metathesis (e.g., alkane, olefin, or alkyne metathesis) transformation, where the process can be conducted without employing grafting prior to catalysis.

  19. Digital signal processing: Handbook

    Goldenberg, L. M.; Matiushkin, B. D.; Poliak, M. N.

    The fundamentals of the theory and design of systems and devices for the digital processing of signals are presented. Particular attention is given to algorithmic methods of synthesis and digital processing equipment in communication systems (e.g., selective digital filtering, spectral analysis, and variation of the signal discretization frequency). Programs for the computer-aided analysis of digital filters are described. Computational examples are presented, along with tables of transfer function coefficients for recursive and nonrecursive digital filters.
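    The "transfer function coefficients for recursive and nonrecursive digital filters" that such handbooks tabulate plug directly into the direct-form difference equation. A minimal, hedged sketch (the coefficients below are illustrative, not taken from the handbook):

```python
def lfilter(b, a, x):
    """Apply a digital filter given transfer-function coefficients
    H(z) = B(z)/A(z) via the direct-form I difference equation:
        a[0]*y[n] = sum_k b[k]*x[n-k] - sum_{k>=1} a[k]*y[n-k]
    Nonrecursive (FIR) filters are the special case a = [1]."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y.append(acc / a[0])
    return y

impulse = [1.0, 0.0, 0.0, 0.0, 0.0]
fir = lfilter([1/3, 1/3, 1/3], [1.0], impulse)   # nonrecursive moving average
iir = lfilter([0.2], [1.0, -0.8], impulse)       # recursive one-pole smoother
```

    The recursive impulse response decays geometrically (0.2, 0.16, 0.128, ...) while the nonrecursive one is finite, which is exactly the recursive/nonrecursive distinction the handbook draws.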

  20. Diasporic Relationships and Processes

    Singla, Rashmi


    How does moving across geographical borders affect the relationships of diaspora members both here, in the country of residence, and there, in the country of origin? The article delineates some of these processes through the gendered experiences of young adults, perceived as active actors, based on an empirical longitudinal study. The results indicate transformations in belongings and longings, pointing to a reinterpretation of the self, others and home in the context of exclusion processes at various levels.

  1. Bank Record Processing


    Barnett Banks of Florida, Inc. operates 150 banking offices in 80 Florida cities. Banking offices have computerized systems for processing deposits or withdrawals in checking/savings accounts, and for handling commercial and installment loan transactions. In developing a network engineering design for the terminals used in record processing, an affiliate, Barnett Computing Company, used COSMIC's STATCOM program. This program provided a reliable network design tool and avoided the cost of developing new software.

  2. Poststroke neuroplasticity processes

    I. V. Damulin


    Full Text Available The paper considers different aspects of neuroplasticity in patients with stroke. It underlines the dynamism of this process and the ambiguity of involvement of the structures of the contralateral cerebral hemisphere in the restorative process. It considers the periods after onset of stroke and the activation of different brain regions (of both the involved and the intact hemisphere) in the poststroke period. Particular emphasis is placed on the issues of neurorehabilitation in this category of patients. Delay in rehabilitation measures leads to a worse outcome, and the patients must stay at hospital longer. It is emphasized that the neurorehabilitation measures should use strategies aimed at improving plasticity processes at the level of synaptic transmission and neuronal communications. At the same time, of great importance are the processes of structural and functional remodeling of neuronal communications with the involvement of surviving neurons that are located in the peri-infarct area and partially damaged during ischemia. To recover stroke-induced lost motor functions, measures are implemented to modulate the ipsilateral motor cortex, contralateral motor cortex, and sensory afferentation. Remodeling processes, one of the manifestations of neuroplasticity, vary with the size and location of an ischemic focus. The specific features of this process with subcortical and cortical foci are considered. It is stressed that there are genetically determined neurotrophic factors that may enhance remodeling processes in the peri-infarct area, as well as factors that inhibit these processes. The sensory system is noted to have a high potential for compensation, which is appreciably associated with the considerable extent of sensory fibers even at the level of the cerebral cortex.

  3. Cognitive Processes in Writing



    Writing has become an important topic of discussion in the new age. Its theories can be learnt in general terms, but its nature must be handled in specific contexts. In other words, everyone who writes must engage his or her thinking, or cognitive processes. Because writing is a meaningful activity, writing problems can be addressed through the cognitive process.

  4. Apple Image Processing Educator

    Gunther, F. J.


    A software system design is proposed and demonstrated with pilot-project software. The system permits the Apple II microcomputer to be used for personalized computer-assisted instruction in the digital image processing of LANDSAT images. The programs provide data input, menu selection, graphic and hard-copy displays, and both general and detailed instructions. The pilot-project results are considered to be successful indicators of the capabilities and limits of microcomputers for digital image processing education.

  5. The Analytic Network Process Method


    The thesis is concerned with multi-criteria decision making, in particular the Analytic Network Process (ANP) method. The introductory part compiles all the theory necessary to understand the method, which is utilized throughout the paper. The Analytic Hierarchy Process method is described and later generalized in the form of the ANP. Part of the paper is a description of available software products that are able to solve ANP models. The main focus is on the application of the method, ...

  6. Biomedical signal processing

    Akay, Metin


    Sophisticated techniques for signal processing are now available to the biomedical specialist! Written in an easy-to-read, straightforward style, Biomedical Signal Processing presents techniques to eliminate background noise, enhance signal detection, and analyze computer data, making results easy to comprehend and apply. In addition to examining techniques for electrical signal analysis, filtering, and transforms, the author supplies an extensive appendix with several computer programs that demonstrate techniques presented in the text.

  7. Cauchy cluster process

    Ghorbani, Mohammad


    In this paper we introduce an instance of the well-known Neyman–Scott cluster process model with clusters having long-tail behaviour. In our model the offspring points are distributed around the parent points according to a circular Cauchy distribution. Using a modified Cramér–von Mises test statistic and simulated pointwise envelopes, it is shown that this model fits the frequently analyzed longleaf pine data-set better than the Thomas process.
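    A hedged sketch of the kind of model described above, with hypothetical parameter values: a Neyman–Scott process on the unit square whose offspring are displaced from each parent by an isotropic bivariate Cauchy draw. One standard construction (assumed here, not taken from the paper) generates the displacement as a multivariate t variate with one degree of freedom, i.e. a Gaussian pair divided by an independent |Gaussian|.

```python
# Sketch: Neyman-Scott cluster process with Cauchy-distributed offspring.
# kappa, mu and scale are made-up illustrative parameters.
import math
import random

def cauchy_cluster(kappa=10, mu=5, scale=0.02, seed=1):
    """Points near the unit square: Poisson(kappa) parents, each with a
    Poisson(mu) number of offspring displaced by an isotropic bivariate
    Cauchy offset."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; fine for small lam.
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    pts = []
    for _ in range(poisson(kappa)):
        px, py = rng.random(), rng.random()          # parent location
        for _ in range(poisson(mu)):
            # bivariate Cauchy offset = Gaussian pair / |Gaussian|
            g1, g2, g3 = (rng.gauss(0, 1) for _ in range(3))
            pts.append((px + scale * g1 / abs(g3),
                        py + scale * g2 / abs(g3)))
    return pts
```

    The heavy Cauchy tail is what produces the occasional far-flung offspring that distinguishes this model from the Gaussian-offspring Thomas process.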

  8. Scramjet Combustion Processes


    The plan for these flights is as follows: HyShot 5 – A Free-Flying Hypersonic Glider. HyShot 5 will be a hypersonic glider designed to fly at Mach 8. It will separate from its rocket booster in space and perform controlled manoeuvres as it... Michael Smart and Ray Stalker, Centre for Hypersonics, The University of Queensland (RTO-EN-AVT-185).

  9. Solution Processing - Rodlike Polymers


    Keywords: para-ordered polymers; high-modulus fibers and films; polybenzobisoxazoles; polybenzobisthiazoles. Considerations important in solution processing are considered, with special emphasis on the dry-jet wet spinning process used to form fibers. Contents include an introduction and remarks on dry-jet wet spun fiber.

  10. Image Processing Software


    To convert raw data into environmental products, the National Weather Service and other organizations use the Global 9000 image processing system marketed by Global Imaging, Inc. The company's GAE software package is an enhanced version of the TAE, developed by Goddard Space Flight Center to support remote sensing and image processing applications. The system can be operated in three modes and is combined with HP Apollo workstation hardware.

  11. The Integrated Renovation Process

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user-customized methodology based on judiciously selected constructivist and interactive multi-criteria decision-making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied to home renovation, the Integrated Renovation Process … for the quantitative analyses and the generation of the renovation scenarios, so they get more time for the cost optimisation and the qualitative analysis of the homeowners' needs, wishes and behaviours.

  12. Privatisation Process in Kosovo

    Dr.Sc. Hysni Terziu


    Full Text Available This paper aims at analysing activities of the privatisation process in Kosovo, seeing that privatisation is treated as a fundamental factor of the overall transformation of the whole society. It may be established that the primary aim of the privatisation process is increasing economic efficiency, reflecting the current state and directions of development in general. Privatisation as a process has as its primary aim the opening of new areas of freedom, economic efficiency and individualism. The key aim of the privatisation process in Kosovo must be the increase of economic efficiency, the preservation of the healthy economic potential created to date, and the ensuring of a long-term concept which enables growth and macroeconomic stability. The policy of privatisation should give a response related to strategic aspects of privatisation in these sectors: models, procedures, potential investors, technological modernisation and the overcoming of social barriers. The process of privatisation and transition, which has now covered the countries of Eastern and Central Europe, aims at a profound economic and political transformation of these countries. To achieve this, some basic preconditions are required, related to the incitement of the general efficiency of enterprises, expansion of the capital market, introduction of competition, development of a business culture in private property and freedom of entrepreneurship. The impact of privatisation on the economic development of Kosovo is considerable compared to other countries; therefore our aim in this paper is to analyse the factors and methods of implementation of this process.

  13. PALSAR ground data processing

    Frick, Heinrich; Palsetia, Marzban; Carande, Richard; Curlander, James C.


    The upcoming launches of new satellites like ALOS, Envisat, Radarsat2 and ECHO will pose a significant challenge for many ground stations, namely to integrate new SAR processing software into their existing systems. Vexcel Corporation in Boulder, Colorado, has built a SAR processing system, named APEX-Suite, for spaceborne SAR satellites that can easily be expanded for the next generation of SAR satellites. APEX-Suite includes an auto-satellite-detecting Level 0 processor that provides bit-error correction, data quality characterization, and, as a unique feature, a sophisticated and very accurate Doppler centroid estimator. The Level 1 processing is divided into the strip-mode processor FOCUS, based on the well-proven range-Doppler algorithm, and the SWATH ScanSAR processor that uses the Chirp Z Transform algorithm. A high-accuracy ortho-rectification processor produces systematic and precision-corrected Level 2 SAR image products. The PALSAR instrument is an L-band SAR with multiple fine and standard resolution beams in strip mode, and several wide-swath ScanSAR modes. We will address the adaptation process of Vexcel's APEX-Suite processing system for the PALSAR sensor and discuss image quality characteristics based on processed simulated point-target phase history data.

  14. Helium process cycle

    Ganni, Venkatarao


    A unique process cycle and apparatus design separates the consumer (cryogenic) load return flow from most of the recycle return flow of a refrigerator and/or liquefier process cycle. The refrigerator and/or liquefier process recycle return flow is recompressed by a multi-stage compressor set, and the consumer load return flow is recompressed by an independent consumer load compressor set that maintains a desirable constant suction pressure using a consumer load bypass control valve and the consumer load return pressure control valve that controls the consumer load compressor's suction pressure. The discharge pressure of this consumer load compressor is thereby allowed to float at the intermediate pressure between the first- and second-stage recycle compressor sets. With this gas management valve regulation and the separation of the consumer load return flow from the recycle return flow, the pressure ratios of each recycle compressor stage and all main pressures associated with the recycle return flow are allowed to vary naturally, providing a naturally regulated and balanced floating-pressure process cycle that automatically maintains optimal efficiency at design and off-design process cycle capacity and conditions.

  15. Novel food processing techniques

    Vesna Lelas


    Full Text Available Recently, many investigations have focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is presumed also that they could substitute some of the traditional processes in the food industry. The investigations are primarily directed to the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves, and pulsed electrical fields. The results of the scientific research indicate that application of some of these processes in a particular food industry can bring many benefits: significant energy savings, shortening of process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional values. As some of these techniques also act on the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, the improvement of some functional properties of these components may occur. Common characteristics of all of these techniques are treatment at ambient or insignificantly higher temperatures and a short processing time (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms and successfully modify molecular conformation, and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; therefore, significant improvement of some rheological and functional properties of materials occurred. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food), or as an extraction process for various components

  16. Carbon dioxide reducing processes; Koldioxidreducerande processer

    Svensson, Fredrik


    This thesis discusses different technologies to reduce or eliminate the carbon dioxide emissions, when a fossil fuel is used for energy production. Emission reduction can be accomplished by separating the carbon dioxide for storage or reuse. There are three different ways of doing the separation. The carbon dioxide can be separated before the combustion, the process can be designed so that the carbon dioxide can be separated without any energy consumption and costly systems or the carbon dioxide can be separated from the flue gas stream. Two different concepts of separating the carbon dioxide from a combined cycle are compared, from the performance and the economical point of view, with a standard natural gas fired combined cycle where no attempts are made to reduce the carbon dioxide emissions. One concept is to use absorption technologies to separate the carbon dioxide from the flue gas stream. The other concept is based on a semi-closed gas turbine cycle using carbon dioxide as working fluid and combustion with pure oxygen, generated in an air-separating unit. The calculations show that the efficiency (power) drop is smaller for the first concept than for the second, 8.7 % points compared to 13.7 % points, when power is produced. When both heat and power are produced, the relation concerning the efficiency (power) remains. Regarding the overall efficiency (heat and power) the opposite relation is present. A possible carbon dioxide tax must exceed 0.21 SEK/kg CO{sub 2} for it to be profitable to separate carbon dioxide with any of these technologies.

  17. Customer Innovation Process Leadership

    Lindgren, Peter; Jørgensen, Jacob Høj; Goduscheit, René Chester


    Innovation leadership has traditionally been focused on leading the companies' product development fast, cost-effectively and with an optimal performance, driven by technological inventions or by customers' needs. To improve the efficiency of the product development process, focus has been on ... handling shorter and faster product life cycles. Continuously changing customer needs are pushing companies' competence of continuous innovation to a maximum - but still this seems not to be enough to stay competitive on the global market and reach the goals of growth. This article therefore suggests another outlook on future innovation leadership - Customer Innovation Process Leadership (CIP-leadership). CIP-leadership moves the company's innovation process closer to the customer innovation process and discusses how companies can be involved and innovate in customers' future needs and lead...

  18. VLSI signal processing technology

    Swartzlander, Earl


    This book is the first in a set of forthcoming books focussed on state-of-the-art development in the VLSI Signal Processing area. It is a response to the tremendous research activities taking place in that field. These activities have been driven by two factors: the dramatic increase in demand for high speed signal processing, especially in consumer electronics, and the evolving microelectronic technologies. The available technology has always been one of the main factors in determining algorithms, architectures, and design strategies to be followed. With every new technology, signal processing systems go through many changes in concepts, design methods, and implementation. The goal of this book is to introduce the reader to the main features of VLSI Signal Processing and the ongoing developments in this area. The focus of this book is on: • Current developments in Digital Signal Processing (DSP) processors and architectures - several examples and case studies of existing DSP chips are discussed in...

  19. Due process traditionalism.

    Sunstein, Cass R


    In important cases, the Supreme Court has limited the scope of "substantive due process" by reference to tradition, but it has yet to explain why it has done so. Due process traditionalism might be defended in several distinctive ways. The most ambitious defense draws on a set of ideas associated with Edmund Burke and Friedrich Hayek, who suggested that traditions have special credentials by virtue of their acceptance by many minds. But this defense runs into three problems. Those who have participated in a tradition may not have accepted any relevant proposition; they might suffer from a systematic bias; and they might have joined a cascade. An alternative defense sees due process traditionalism as a second-best substitute for two preferable alternatives: a purely procedural approach to the Due Process Clause, and an approach that gives legislatures the benefit of every reasonable doubt. But it is not clear that in these domains, the first-best approaches are especially attractive; and even if they are, the second-best may be an unacceptably crude substitute. The most plausible defense of due process traditionalism operates on rule-consequentialist grounds, with the suggestion that even if traditions are not great, they are often good, and judges do best if they defer to traditions rather than attempting to specify the content of "liberty" on their own. But the rule-consequentialist defense depends on controversial and probably false assumptions about the likely goodness of traditions and the institutional incapacities of judges.

  20. Beyond the search process

    Hyldegård, Jette


    This paper reports on the findings from a longitudinal case study exploring Kuhlthau's information search process (ISP) model in a group-based academic setting. The research focus is on group members' activities and cognitive and emotional experiences during the task process of writing an assignment. It is investigated whether group members' information behavior differs from that of the individual information seeker in the ISP model, and to what extent this behavior is influenced by contextual (work task) and social (group work) factors. Three groups of LIS students were followed during a 14-week period... It is concluded that the ISP model does not fully comply with group members' problem-solving process and the involved information-seeking behavior. Further, complex academic problem solving seems to be even more complex when it is performed in a group-based setting. The study contributes with a new conceptual...

  1. Laser Processing and Chemistry

    Bäuerle, Dieter


    This book gives an overview of the fundamentals and applications of laser-matter interactions, in particular with regard to laser material processing. Special attention is given to laser-induced physical and chemical processes at gas-solid, liquid-solid, and solid-solid interfaces. Starting with the background physics, the book proceeds to examine applications of lasers in “standard” laser machining and laser chemical processing (LCP), including the patterning, coating, and modification of material surfaces. This fourth edition has been enlarged to cover the rapid advances in the understanding of the dynamics of materials under the action of ultrashort laser pulses, and to include a number of new topics, in particular the increasing importance of lasers in various different fields of surface functionalizations and nanotechnology. In two additional chapters, recent developments in biotechnology, medicine, art conservation and restoration are summarized. Graduate students, physicists, chemists, engineers, a...

  2. Processes for xanthomonas biopolymers

    Engelskirchen, K.; Stein, W.; Bahn, M.; Schieferstein, L.; Schindler, J.


    A process is described for producing xanthan gum in which the use of a stable, water-in-oil emulsion in the fermentation medium markedly lowers the viscosity of the medium, resulting in lower energy requirements for the process, and also resulting in enhanced yields of the biopolymer. In such an emulsion, the aqueous fermentation phase, with its microbial growth and metabolic processes, takes place in a finely dispersed homogeneous oil phase. The viscosity increase in each droplet of the aqueous nutrient solution will not noticeably affect this mixture in the fermenter because the viscosity of the reaction mixture in the fermenter is determined primarily by the viscosity of the oil phase. 45 claims

  3. Identification of wastewater processes

    Carstensen, Niels Jacob

    The introduction of on-line sensors for monitoring of nutrient salt concentrations on wastewater treatment plants with nutrient removal opens a wide new area of modelling wastewater processes. The subject of this thesis is the formulation of operational dynamic models based on time series ... well-known theory of the processes with the significant effects found in data. These models are called grey box models, and they contain rate expressions for the processes of influent load of nutrients, transport of nutrients between the aeration tanks, hydrolysis and growth of biomass, nitrification ... function. The grey box models are estimated on data sets from the Lundtofte pilot-scale plant and the Aalborg West wastewater treatment plant. Estimation of Monod-kinetic expressions is made possible through the application of large data sets. Parameter estimates from the two plants show a reasonable...

  4. Stochastic processes inference theory

    Rao, Malempati M


    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  5. Process window metrology

    Ausschnitt, Christopher P.; Chu, William; Hadel, Linda M.; Ho, Hok; Talvi, Peter


    This paper is the third of a series that defines a new approach to in-line lithography control. The first paper described the use of optically measurable line-shortening targets to enhance signal-to-noise and reduce measurement time. The second described the dual-tone optical critical dimension (OCD) measurement and analysis necessary to distinguish dose and defocus. Here we describe the marriage of dual-tone OCD to SEM-CD metrology that comprises what we call 'process window metrology' (PWM), the means to locate each measured site in dose and focus space relative to the allowed process window. PWM provides in-line process tracking and control essential to the successful implementation of low-k lithography.

  6. Quartz resonator processing system

    Peters, Roswell D. M.


    Disclosed is a single chamber ultra-high vacuum processing system for the production of hermetically sealed quartz resonators wherein electrode metallization and sealing are carried out along with cleaning and bake-out without any air exposure between the processing steps. The system includes a common vacuum chamber in which is located a rotatable wheel-like member which is adapted to move a plurality of individual component sets of a flat pack resonator unit past discretely located processing stations in said chamber whereupon electrode deposition takes place followed by the placement of ceramic covers over a frame containing a resonator element and then to a sealing stage where a pair of hydraulic rams including heating elements effect a metallized bonding of the covers to the frame.

  7. Process of timbral composing

    Withrow, Sam

    In this paper, I discuss the techniques and processes of timbral organization I developed while writing my chamber work, Afterimage. I compare my techniques with illustrative examples by other composers to place my work in historical context. I examine three elements of my composition process. The first is the process of indexing and cataloging basic sonic materials. The second consists of the techniques and mechanics of manipulating and assembling these collections into larger-scale phrases, textures, and overall form in a musical work. The third element is the more elusive, and often extra-musical, source of inspiration and motivation. The evocative power of tone color is immediately evident yet difficult to explain. What is timbre? This question cannot be answered solely in scientific terms; subjective factors affect our perception of it.

  8. The Nursing Process

    M. Hammond


    Full Text Available The essence of the nursing process can be summed up in this quotation by Sir Francis Bacon: “Human knowledge and human powers meet in one; for where the cause is not known the effect cannot be produced.” Arriving at a concise, accurate definition of the nursing process was, for me, an impossible task. It is altogether too vast and too personal a topic to contract down into a nifty-looking, we-pay-lip-service-to-it cliché. So what I propose to do is to present my understanding of the nursing process throughout this essay, and then to leave the reader with some overall, general impression of what it all entails.

  9. Quantum independent increment processes

    Franz, Uwe


    This volume is the first of two volumes containing the revised and completed notes of lectures given at the school "Quantum Independent Increment Processes: Structure and Applications to Physics". This school was held at the Alfried-Krupp-Wissenschaftskolleg in Greifswald during the period March 9 – 22, 2003, and supported by the Volkswagen Foundation. The school gave an introduction to current research on quantum independent increment processes aimed at graduate students and non-specialists working in classical and quantum probability, operator algebras, and mathematical physics. The present first volume contains the following lectures: "Lévy Processes in Euclidean Spaces and Groups" by David Applebaum, "Locally Compact Quantum Groups" by Johan Kustermans, "Quantum Stochastic Analysis" by J. Martin Lindsay, and "Dilations, Cocycles and Product Systems" by B.V. Rajarama Bhat.

  10. A support design process

    Arthur, J.; Scott, P.B. [Health and Safety Executive (United Kingdom)


    A workman suffered a fatal injury due to a fall of ground from the face of a development drivage, which was supported by passive supports supplemented with roof bolts. A working party was set up to review the support process and evaluate how protection of the workmen could be improved whilst setting supports. The working party included representatives from the trade unions, the mines inspectorate and mine operators. Visits were made to several mines and discussions were held with the workmen and management at these mines. The paper describes the results of the visits and how a support design process was evolved. The process will ensure that the support system is designed to reduce the inherent hazards associated with setting supports using either conventional or mixed support systems.

  11. Topological signal processing

    Robinson, Michael


    Signal processing is the discipline of extracting information from collections of measurements. To be effective, the measurements must be organized and then filtered, detected, or transformed to expose the desired information.  Distortions caused by uncertainty, noise, and clutter degrade the performance of practical signal processing systems. In aggressively uncertain situations, the full truth about an underlying signal cannot be known.  This book develops the theory and practice of signal processing systems for these situations that extract useful, qualitative information using the mathematics of topology -- the study of spaces under continuous transformations.  Since the collection of continuous transformations is large and varied, tools which are topologically-motivated are automatically insensitive to substantial distortion. The target audience comprises practitioners as well as researchers, but the book may also be beneficial for graduate students.

  12. COTS software selection process.

    Watkins, William M. (Strike Wire Technologies, Louisville, CO); Lin, Han Wei; McClelland, Kelly (U.S. Security Associates, Livermore, CA); Ullrich, Rebecca Ann; Khanjenoori, Soheil; Dalton, Karen; Lai, Anh Tri; Kuca, Michal; Pacheco, Sandra; Shaffer-Gant, Jessica


    Today's need for rapid software development has generated a great interest in employing Commercial-Off-The-Shelf (COTS) software products as a way of managing cost, development time, and effort. With an abundance of COTS software packages to choose from, the problem now is how to systematically evaluate, rank, and select a COTS product that best meets the software project requirements and at the same time can leverage off the current corporate information technology architectural environment. This paper describes a systematic process for decision support in evaluating and ranking COTS software. Performed right after the requirements analysis, this process provides the evaluators with more concise, structured, and step-by-step activities for determining the best COTS software product with manageable risk. In addition, the process is presented in phases that are flexible to allow for customization or tailoring to meet various projects' requirements.
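The ranking step of such an evaluation is commonly implemented as a weighted scoring matrix. A minimal sketch of that idea (the criteria, weights, candidate names, and scores below are hypothetical, not taken from the paper):

```python
# Weighted-criteria ranking of COTS candidates (illustrative values only).
weights = {"functionality": 0.4, "cost": 0.2, "vendor_support": 0.2, "fit": 0.2}

# Scores per candidate on a 1-5 scale for each criterion.
candidates = {
    "PackageA": {"functionality": 5, "cost": 2, "vendor_support": 4, "fit": 3},
    "PackageB": {"functionality": 3, "cost": 5, "vendor_support": 3, "fit": 4},
}

def weighted_score(scores):
    # Sum of (criterion weight * candidate score) over all criteria.
    return sum(weights[c] * s for c, s in scores.items())

ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]),
                 reverse=True)
print(ranking)  # candidates ordered by weighted score, highest first
```

In a real selection the weights would come from the requirements analysis the paper says precedes the process, and risk factors would enter as additional criteria.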

  13. NTP comparison process

    Corban, Robert

    The systems engineering process for the concept definition phase of the program involves requirements definition, system definition, and consistent concept definition. The requirements definition process involves obtaining a complete understanding of the system requirements based on customer needs, mission scenarios, and nuclear thermal propulsion (NTP) operating characteristics. A system functional analysis is performed to provide a comprehensive traceability and verification of top-level requirements down to detailed system specifications and provides significant insight into the measures of system effectiveness to be utilized in system evaluation. The second key element in the process is the definition of system concepts to meet the requirements. This part of the process involves engine system and reactor contractor teams to develop alternative NTP system concepts that can be evaluated against specific attributes, as well as a reference configuration against which to compare system benefits and merits. Quality function deployment (QFD), as an excellent tool within Total Quality Management (TQM) techniques, can provide the required structure and provide a link to the voice of the customer in establishing critical system qualities and their relationships. The third element of the process is the consistent performance comparison. The comparison process involves validating developed concept data and quantifying system merits through analysis, computer modeling, simulation, and rapid prototyping of the proposed high-risk NTP subsystems. The maximum amount possible of quantitative data will be developed and/or validated to be utilized in the QFD evaluation matrix. If, upon evaluation, a new concept or its associated subsystems are determined to have substantial merit, those features will be incorporated into the reference configuration for subsequent system definition and comparison efforts.

  14. AERONET Version 3 processing

    Holben, B. N.; Slutsker, I.; Giles, D. M.; Eck, T. F.; Smirnov, A.; Sinyuk, A.; Schafer, J.; Rodriguez, J.


    The Aerosol Robotic Network (AERONET) database has evolved in measurement accuracy, data quality, products, and availability to the scientific community over the course of 21 years with the support of NASA, PHOTONS, and all federated partners. This evolution is periodically manifested as a new data version release, by carefully reprocessing the entire database with the most current algorithms that fundamentally change the database and ultimately the data products used by the community. The newest processing, Version 3, will be released in 2015 after the entire database is reprocessed and real-time data processing becomes operational. All Version 3 algorithms have been developed and individually vetted, and represent four main categories: aerosol optical depth (AOD) processing, inversion processing, database management, and new products. The primary trigger for release of Version 3 lies with cloud screening of the direct-sun observations and computation of AOD, which will fundamentally change all data available for analysis and all subsequent retrieval products. This presentation will illustrate the innovative approach used for cloud screening and assess the elements of Version 3 AOD relative to the current version. We will also present the advances in the inversion product processing, with emphasis on the random and systematic uncertainty estimates. This processing will be applied to the new hybrid measurement scenario intended to provide inversion retrievals for all solar zenith angles. We will introduce automatic quality assurance criteria that will allow near-real-time quality-assured aerosol products necessary for real-time satellite and model validation and assimilation. Last, we will introduce the new management structure that will improve access to the database. The current Version 2 will be supported for at least two years after the initial release of Version 3 to maintain continuity for ongoing investigations.
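Direct-sun AOD computation of the kind described above rests on the Beer–Lambert law: the measured signal falls off exponentially with optical depth times airmass. A deliberately simplified sketch (ignoring the wavelength-dependent calibration, Rayleigh and trace-gas correction details of the actual AERONET algorithms; all numbers are made up):

```python
import math

def total_optical_depth(V, V0, m):
    """Beer-Lambert law for a direct-sun measurement: V = V0 * exp(-m * tau),
    so tau = ln(V0 / V) / m, where m is the optical airmass."""
    return math.log(V0 / V) / m

def aerosol_optical_depth(V, V0, m, tau_rayleigh, tau_gas):
    # AOD is what remains after subtracting the molecular (Rayleigh)
    # and trace-gas contributions from the total optical depth.
    return total_optical_depth(V, V0, m) - tau_rayleigh - tau_gas

# Hypothetical calibration signal V0 and measured signal V.
tau = aerosol_optical_depth(V=0.55, V0=1.0, m=1.5,
                            tau_rayleigh=0.05, tau_gas=0.01)
print(round(tau, 3))
```

Cloud screening then operates on time series of such retrievals, flagging the rapid temporal variability that clouds produce but aerosol usually does not.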

  15. Orchestrator Telemetry Processing Pipeline

    Powell, Mark; Mittman, David; Joswig, Joseph; Crockett, Thomas; Norris, Jeffrey


    Orchestrator is a software application infrastructure for telemetry monitoring, logging, processing, and distribution. The architecture has been applied to support operations of a variety of planetary rovers. Built in Java with the Eclipse Rich Client Platform, Orchestrator can run on most commonly used operating systems. The pipeline supports configurable parallel processing that can significantly reduce the time needed to process a large volume of data products. Processors in the pipeline implement a simple Java interface and declare their required input from upstream processors. Orchestrator is programmatically constructed by specifying a list of Java processor classes that are initiated at runtime to form the pipeline. Input dependencies are checked at runtime. Fault tolerance can be configured to attempt continuation of processing in the event of an error or failed input dependency if possible, or to abort further processing when an error is detected. This innovation also provides support for Java Message Service broadcasts of telemetry objects to clients and provides file system and relational database logging of telemetry. Orchestrator supports remote monitoring and control of the pipeline using browser-based JMX controls and provides several integration paths for pre-compiled legacy data processors. At the time of this reporting, the Orchestrator architecture has been used by four NASA customers to build telemetry pipelines to support field operations. Example applications include high-volume stereo image capture and processing, and simultaneous data monitoring and logging from multiple vehicles. Example telemetry processors used in field test operations support include vehicle position, attitude, articulation, GPS location, power, and stereo images.
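The pipeline pattern described above (processors declare their required upstream products; the pipeline checks those dependencies at runtime and can abort on a failed dependency) can be sketched compactly. All class and field names below are hypothetical illustrations, not the actual Orchestrator Java API:

```python
# Minimal telemetry-pipeline sketch: each processor declares the upstream
# products it requires; the pipeline checks dependencies before running it.
class Processor:
    requires: tuple = ()   # product names this processor needs
    provides: str = ""     # product name this processor produces
    def run(self, products: dict):
        raise NotImplementedError

class PositionDecoder(Processor):
    provides = "position"
    def run(self, products):
        return tuple(products["raw"][:2])   # toy decode: first two fields

class RangeFromOrigin(Processor):
    requires = ("position",)
    provides = "range"
    def run(self, products):
        x, y = products["position"]
        return (x * x + y * y) ** 0.5

def run_pipeline(processors, telemetry):
    products = {"raw": telemetry}
    for p in processors:
        missing = [r for r in p.requires if r not in products]
        if missing:  # failed input dependency: abort (one configurable policy)
            raise RuntimeError(f"{type(p).__name__} missing inputs: {missing}")
        products[p.provides] = p.run(products)
    return products

out = run_pipeline([PositionDecoder(), RangeFromOrigin()], [3.0, 4.0, 9.9])
print(out["range"])  # 5.0
```

The "continue if possible" fault-tolerance mode mentioned in the abstract would replace the `raise` with skipping the processor and recording the failure.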

  16. Biomedical Image Processing

    Deserno, Thomas Martin


    In modern medicine, imaging is the most effective tool for diagnostics, treatment planning, and therapy. Almost all modalities have moved to direct digital acquisition techniques, and processing of this image data has become an important option for health care in the future. This book is written by a team of internationally recognized experts from all over the world. It provides a brief but complete overview of medical image processing and analysis, highlighting recent advances made in academia. Color figures are used extensively to illustrate the methods and help the reader to understand the complex topics.

  17. Exoplanet atmospheres physical processes

    Seager, Sara


    Over the past twenty years, astronomers have identified hundreds of extrasolar planets--planets orbiting stars other than the sun. Recent research in this burgeoning field has made it possible to observe and measure the atmospheres of these exoplanets. This is the first textbook to describe the basic physical processes--including radiative transfer, molecular absorption, and chemical processes--common to all planetary atmospheres, as well as the transit, eclipse, and thermal phase variation observations that are unique to exoplanets. In each chapter, Sara Seager offers a conceptual introduction...

  18. Semantic and Process Interoperability

    Félix Oscar Fernández Peña


    Knowledge management systems support education at different levels of education. This is very important for the process in which the higher education of Cuba is involved: structural transformations of teaching are focused on supporting the foundation of the information society in the country. This paper describes technical aspects of the design of a model for the integration of multiple knowledge-management tools supporting teaching. The proposal is based on the definition of an ontology for the explicit formal description of the semantics of the motivations of students and teachers in the learning process. Its target is to facilitate knowledge spreading.

  19. Advanced Polymer Processing Facility

    Muenchausen, Ross E. [Los Alamos National Laboratory


    Some conclusions of this presentation are: (1) Radiation-assisted nanotechnology applications will continue to grow; (2) The APPF will provide a unique focus for radiolytic processing of nanomaterials in support of DOE-DP, other DOE and advanced manufacturing initiatives; (3) {gamma}, X-ray, e-beam and ion beam processing will increasingly be applied for 'green' manufacturing of nanomaterials and nanocomposites; and (4) Biomedical science and engineering may ultimately be the biggest application area for radiation-assisted nanotechnology development.

  20. Hard exclusive QCD processes

    Kugler, W.


    Hard exclusive processes in high-energy electron-proton scattering offer the opportunity to access a new generation of parton distributions, the so-called generalized parton distributions (GPDs). These functions provide more detailed information about the structure of the nucleon than the usual PDFs obtained from DIS. In this work we present a detailed analysis of exclusive processes, especially of hard exclusive meson production. We investigated the influence of exclusively produced mesons on the semi-inclusive production of mesons at fixed-target experiments like HERMES. Furthermore, we give a detailed analysis of higher-order corrections (NLO) for the exclusive production of mesons in a very broad range of kinematics. (orig.)

  1. An Integrated Design Process

    Petersen, Mads Dines; Knudstrup, Mary-Ann


    The present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture: their experiences with it, as well as the requirements they meet in terms of how to approach the design process, especially in the early stages such as a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...

  2. Research Planning Process

    Lofton, Rodney


    This presentation describes the process used to collect, review, integrate, and assess research requirements desired to be a part of research and payload activities conducted on the ISS. The presentation provides a description of: where the requirements originate, to whom they are submitted, how they are integrated into a requirements plan, and how that integrated plan is formulated and approved. It is hoped that from completing the review of this presentation, one will get an understanding of the planning process that formulates payload requirements into an integrated plan used for specifying research activities to take place on the ISS.

  3. Multivariate Statistical Process Control

    Kulahci, Murat


    As sensor and computer technology continues to improve, it becomes a normal occurrence that we are confronted with high-dimensional data sets. As in many areas of industrial statistics, this brings forth various challenges in statistical process control (SPC) and monitoring, for which the aim is to identify the “out-of-control” state of a process using control charts, in order to reduce the excessive variation caused by so-called assignable causes. In practice, the most common method of monitoring multivariate data is through a statistic akin to Hotelling’s T2. For high-dimensional data with excessive...
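For a multivariate observation x, the Hotelling statistic mentioned above is T² = (x − μ)ᵀ S⁻¹ (x − μ), with μ and S estimated from in-control reference data. A minimal numpy sketch on synthetic data (the data and dimensions are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Phase I: "in-control" reference data (synthetic, 200 samples x 3 variables).
X = rng.normal(size=(200, 3))
mu = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

def hotelling_t2(x):
    """T^2 = (x - mu)^T S^{-1} (x - mu) for a single new observation x."""
    d = x - mu
    return float(d @ S_inv @ d)

# A point at the in-control mean scores 0; a point far from it scores high,
# and would be flagged once T^2 exceeds the control limit.
print(hotelling_t2(mu))
print(hotelling_t2(mu + np.array([5.0, 5.0, 5.0])))
```

In an SPC chart, each new T² value is plotted against an upper control limit derived from an F (or chi-square) distribution; exceeding it signals a possible assignable cause.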

  4. Solar industrial process heat

    Lumsdaine, E.


    The aim of the assessment reported is to candidly examine the contribution that solar industrial process heat (SIPH) is realistically able to make in the near- and long-term energy futures of the United States. The performance history of government and privately funded SIPH demonstration programs, 15 of which are briefly summarized, and the present status of SIPH technology are discussed. The technical and performance characteristics of solar industrial process heat plants and equipment are reviewed, as well as how the operating experience of over a dozen SIPH demonstration projects is influencing institutional acceptance and economic projections. Implications for domestic energy policy and international implications are briefly discussed. (LEW)

  5. Thermal stir welding process

    Ding, R. Jeffrey (Inventor)


    A welding method is provided for forming a weld joint between first and second elements of a workpiece. The method includes heating the first and second elements to form an interface of material in a plasticized or melted state interface between the elements. The interface material is then allowed to cool to a plasticized state if previously in a melted state. The interface material, while in the plasticized state, is then mixed, for example, using a grinding/extruding process, to remove any dendritic-type weld microstructures introduced into the interface material during the heating process.

  6. Genomic signal processing

    Shmulevich, Ilya


    Genomic signal processing (GSP) can be defined as the analysis, processing, and use of genomic signals to gain biological knowledge, and the translation of that knowledge into systems-based applications that can be used to diagnose and treat genetic diseases. Situated at the crossroads of engineering, biology, mathematics, statistics, and computer science, GSP requires the development of both nonlinear dynamical models that adequately represent genomic regulation, and diagnostic and therapeutic tools based on these models. This book facilitates these developments by providing rigorous mathematical...

  7. Anaerobic Digestion: Process

    Angelidaki, Irini; Batstone, Damien J.


    …with very little dry matter may also be called a digest. The digest should not be termed compost unless it specifically has been composted in an aerated step. This chapter describes the basic processes of anaerobic digestion. Chapter 9.5 describes the anaerobic treatment technologies, and Chapter 9.6 addresses the mass balances and environmental aspects of anaerobic digestion.

  8. Authenticizing the Research Process

    Nora Elizondo-Schmelkes, MA, Ph.D. Candidate


    This study reflects the main concern of students (national and international) who are trying to get a postgraduate degree in a third-world (or “developing”) country. The emergent problem found is that students have to finish their thesis or dissertation but do not really know how to accomplish this goal. They resolve this problem by authenticizing the process as their own. The theory of authenticizing involves compassing their way to solve the problem of advancing in the research process. Compassing allows the student to authenticize his/her research process, making it a personal and “owned” process. The main categories of compassing are the intellectual, physical, and emotional dimension patterns that the student has, learns, and follows in order to finish the project and get a degree. Authenticizing implies authoring their thesis or dissertation with authenticity. Compassing allows them to do this in their own way, at their own pace, and with their own internal resources, strengths, and weaknesses.

  9. Performance Evaluation Process.


    This document contains four papers from a symposium on the performance evaluation process and human resource development (HRD). "Assessing the Effectiveness of OJT (On the Job Training): A Case Study Approach" (Julie Furst-Bowe, Debra Gates) is a case study of the effectiveness of OJT in one of a high-tech manufacturing company's product…

  10. Advanced Biosignal Processing

    Nait-Ali, Amine


    Presents the principles of many advanced biosignal processing techniques. This title introduces the main biosignal properties and the acquisition techniques. It concerns some of the most intensively used biosignals in the clinical routine, namely the Electrocardiogram, the Electroencephalogram, the Electromyogram and the Evoked Potential.

  11. Matchmaking for business processes

    Wombacher, Andreas; Fankhauser, Peter; Mahleko, Bendick; Neuhold, Erich


    Web services have a potential to enhance B2B ecommerce over the Internet by allowing companies and organizations to publish their business processes on service directories where potential trading partners can find them. This can give rise to new business paradigms based on ad-hoc trading relations a...

  12. Flax shive thermocatalytic processing

    Sulman, E. M.; Lugovoy, Yu. V.; Chalov, K. V.; Kosivtsov, Yu. Yu.; Stepacheva, A. A.; Shimanskaya, E. I.


    In the paper, a thermogravimetric study of the biomass waste thermodestruction process is presented. Metal chlorides have the highest influence on flax shive thermodestruction. Results of kinetic modeling are also shown, on the basis of thermogravimetric analysis of both flax shive samples and flax shive with the addition of 10 wt.% nickel chloride, at different heating rates.
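Kinetic modelling from thermogravimetric data of this kind typically rests on an Arrhenius rate law, k(T) = A · exp(−Ea / RT), with the catalyst (here a metal chloride) effectively lowering the activation energy. A sketch with made-up parameters, not values from the paper:

```python
import math

R = 8.314  # J/(mol K), universal gas constant

def arrhenius(T, A, Ea):
    """Arrhenius rate constant k(T) = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * T))

# Hypothetical pre-exponential factor and activation energy, for illustration.
A, Ea = 1.0e8, 120_000.0   # 1/s, J/mol

k_500 = arrhenius(500.0, A, Ea)
k_600 = arrhenius(600.0, A, Ea)
# A catalyst that lowers Ea raises the rate at the same temperature.
k_cat = arrhenius(500.0, A, Ea - 20_000.0)
print(k_600 > k_500, k_cat > k_500)  # True True
```

Fitting A and Ea to TGA mass-loss curves at several heating rates is what "kinetic modeling on the basis of thermogravimetric analysis" amounts to in practice.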

  13. Ultrahigh bandwidth signal processing

    Oxenløwe, Leif Katsuo


    Optical time lenses have proven to be very versatile for advanced optical signal processing. Based on a controlled interplay between dispersion and phase-modulation by e.g. four-wave mixing, the processing is phase-preserving, and hence useful for all types of data signals including coherent multi-level modulation formats. This has enabled processing of phase-modulated spectrally efficient data signals, such as orthogonal frequency division multiplexed (OFDM) signals. In that case, a spectral telescope system was used, using two time lenses with different focal lengths (chirp rates), yielding a spectral magnification of the OFDM signal. Utilising such telescopic arrangements, it has become possible to perform a number of interesting functionalities, which will be described in the presentation. This includes conversion from OFDM to Nyquist WDM, compression of WDM channels to a single Nyquist channel and WDM regeneration. These operations require a broad bandwidth nonlinear platform, and novel photonic integrated nonlinear platforms like aluminum gallium arsenide nano-waveguides used for 1.28 Tbaud optical signal processing will be described.

  14. Obsolescence: the underlying processes

    Thomsen, A.F.; Nieboer, N.E.T.; Van der Flier, C.L.


    Obsolescence, defined as the process of declining performance of buildings, is a serious threat for the value, the usefulness and the life span of housing properties. Thomsen and van der Flier (2011) developed a model in which obsolescence is categorised on the basis of two distinctions, namely betw...

  15. Cascaded Poisson processes

    Matsuo, Kuniaki; Saleh, Bahaa E. A.; Teich, Malvin Carl


    We investigate the counting statistics for stationary and nonstationary cascaded Poisson processes. A simple equation is obtained for the variance-to-mean ratio in the limit of long counting times. Explicit expressions for the forward-recurrence and inter-event-time probability density functions are also obtained. The results are expected to be of use in a number of areas of physics.
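The super-Poissonian counting statistics of a cascaded Poisson process can be checked by simulation: a Poisson number of primary events each triggers a Poisson-distributed number of secondaries, and in the long-counting-time limit the variance-to-mean (Fano) ratio is 1 plus the mean secondary multiplicity. All rates below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def cascaded_poisson_counts(n_trials, primary_mean, secondary_mean):
    """Counts from a cascaded Poisson process: a Poisson number of primary
    events per trial, each spawning a Poisson number of secondary events."""
    primaries = rng.poisson(primary_mean, size=n_trials)
    # Sum of `primaries` i.i.d. Poisson(secondary_mean) variables is
    # Poisson(secondary_mean * primaries), so one draw per trial suffices.
    return rng.poisson(secondary_mean * primaries)

counts = cascaded_poisson_counts(200_000, primary_mean=4.0, secondary_mean=3.0)
fano = counts.var() / counts.mean()
# Theory for long counting times: F = 1 + secondary_mean = 4.0,
# i.e. super-Poissonian (an ordinary Poisson process has F = 1).
print(round(fano, 2))
```

The simple variance-to-mean equation the abstract refers to is exactly this F = 1 + mean multiplicity relation for the stationary long-time limit.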

  16. Photonic curvilinear data processing

    Browning, Clyde; Quaglio, Thomas; Figueiro, Thiago; Pauliac, Sébastien; Belledent, Jérôme; Fay, Aurélien; Bustos, Jessy; Marusic, Jean-Christophe; Schiavone, Patrick


    With more and more photonic data presence in e-beam lithography, the need for efficient and accurate data fracturing is required to meet acceptable manufacturing cycle time. Large photonic based layouts now create high shot count patterns for VSB based tools. Multiple angles, sweeping curves, and non-orthogonal data create a challenge for today's e-beam tools that are more efficient on Manhattan style data. This paper describes techniques developed and used for creating fractured data for VSB based pattern generators. Proximity Effect Correction is also applied during the fracture process, taking into account variable shot sizes to apply for accuracy and design style. Choosing different fracture routines for pattern data on-the-fly allows for fast and efficient processing. Data interpretation is essential for processing curvilinear data as to its size, angle, and complexity. Fracturing complex angled data into "efficient" shot counts is no longer practical as shot creation now requires knowledge of the actual data content as seen in photonic based pattern data. Simulation and physical printing results prove the implementations for accuracy and write times compared to traditional VSB writing strategies on photonic data. Geometry tolerance is used as part of the fracturing algorithm for controlling edge placement accuracy and tuning to different e-beam processing parameters.

  17. The magnetization process: Hysteresis

    Balsamel, Richard


    The magnetization process, hysteresis (the difference in the path of magnetization for an increasing and decreasing magnetic field), hysteresis loops, and hard magnetic materials are discussed. The fabrication of classroom projects for demonstrating hysteresis and the hysteresis of common magnetic materials is described in detail.
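The two branches of a hysteresis loop can be illustrated numerically with a simple shifted-tanh model: the ascending- and descending-field branches cross M = 0 at +Hc and −Hc respectively, so they disagree at H = 0 by twice the remanent magnetization. This is a purely illustrative toy model, not a physical magnetization law, and all parameter values are made up:

```python
import math

Ms, Hc, a = 1.0, 0.3, 0.2   # saturation, coercive field, shape (illustrative)

def M_ascending(H):
    # Branch traced while H increases: crosses M = 0 at H = +Hc.
    return Ms * math.tanh((H - Hc) / a)

def M_descending(H):
    # Branch traced while H decreases: crosses M = 0 at H = -Hc.
    return Ms * math.tanh((H + Hc) / a)

# Hysteresis: at H = 0 the two branches disagree; the descending branch
# retains a positive remanent magnetization.
remanence = M_descending(0.0)
print(remanence > 0 and M_ascending(0.0) < 0)  # True
```

Plotting both branches over a swept H produces the familiar loop; hard magnetic materials correspond to a large Hc (a wide loop).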

  18. Sustainability of abrasive processes

    Aurich, J.C.; Linke, B.; Hauschild, Michael Zwicky


    This paper presents an overview of research on sustainability of abrasive processes. It incorporates results from a round robin study on ‘‘energy-efficiency of abrasive processes’’ which has been carried out within the scientific technical committee ‘‘abrasive processes’’ (STC G) of CIRP...


    Magdalena LUCA (DEDIU)


    Business process reengineering determines the change of organizational functions from an orientation focused on operations towards a multidimensional approach. Former employees who were mere executors are now expected to take their own decisions, and as a result the functional departments lose their reason to exist. Managers no longer act as supervisors, but mainly as mentors, while employees focus more attention on customer needs and less on their superiors’. Under these conditions, new organizational paradigms are required, the most important being that of learning organizations. In order to implement a reengineering of economic processes and promote a new organizational paradigm, information technology plays a decisive role. The article presents some results obtained in a research theme funded by ANSTI under contract no. 501/2000. Economic and financial analysis is performed in order to know the current situation and to achieve better results in the future. One of its objectives is production, analyzed as a labour process, together with the interacting elements of this process. The indicators investigated in the analysis of the financial and economic activity of production reflect the development directions, the means and resources to accomplish predetermined objectives, and express the results and effectiveness of what is expected.

  20. Attentional Processes in Autism.

    Goldstein, Gerald; Johnson, Cynthia R.; Minshew, Nancy J.


    Attention processes in 103 children and adults with high functioning autism were compared with a matched control group using a battery of attention measures. Differences were found only on tasks which placed demands on cognitive flexibility or psychomotor speed, suggesting that purported attention deficits in autism may actually be primary…

  1. Normal modified stable processes

    Barndorff-Nielsen, Ole Eiler; Shephard, N.


    …Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  2. Communicating Process Architectures 2005

    Broenink, Jan F.; Roebbers, Herman W.; Sunters, Johan P.E.; Welch, Peter H.; Wood, David C.


    The awareness of the ideas characterized by Communicating Processes Architecture and their adoption by industry beyond their traditional base in safety-critical systems and security is growing. The complexity of modern computing systems has become so great that no one person – maybe not even a small...

  3. Software Process Improvement

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen


    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...


  4. Ergonomics in the automation of processes

    Carrión Muñoz, Rolando; Docente de la FII - UNMSM


    The article shows the role that ergonomics plays in the automation of processes, and its importance for Industrial Engineering.

  5. Students' Differentiated Translation Processes

    Bossé, Michael J.; Adu-Gyamfi, Kwaku; Chandler, Kayla


    Understanding how students translate between mathematical representations is of both practical and theoretical importance. This study examined students' processes in their generation of symbolic and graphic representations of given polynomial functions. The purpose was to investigate how students perform these translations. The result of the study…

  6. Restricted broadcast process theory

    Ghassemi, F.; Fokkink, W.J.; Movaghar, A.; Cerone, A.; Gruner, S.


    We present a process algebra for modeling and reasoning about Mobile Ad hoc Networks (MANETs) and their protocols. Our algebra captures the essential modeling concepts of ad hoc networks, i.e. local broadcast, connectivity of nodes, and connectivity changes. Connectivity and connectivity changes a

  7. Governing Knowledge Processes

    Foss, Nicolai Juul; Husted, Kenneth; Michailova, Snejina;


    An under-researched issue in work within the `knowledge movement' is the relation between organizational issues and knowledge processes (i.e., sharing and creating knowledge). We argue that managers can shape formal organization structure and organization forms and can influence the more informal...

  8. Audio Spectral Processing


    This report is being submitted by L-3 Global Security & Engineering Solutions Division, 1300-B Floyd Avenue, Rome, NY 13440-4615. ... Utilized the Avid Xpress video enhancement system to process the Group 2, Phase II competency test A. This was done to attempt to recreate

  9. Qualitative Process Theory.


    write a heat flow process that violates energy conservation and transfers "caloric fluid" between the source and destination. The assumptions made about... Second, if the program is drawing conclusions that rely critically on an assumption, then it must test...

  10. Food processing in action

    Radio frequency (RF) heating is a commonly used food processing technology that has been applied for drying and baking as well as thawing of frozen foods. Its use in pasteurization, as well as for sterilization and disinfection of foods, is more limited. This column will review various RF heating ap...

  11. Udfordringer for transkulturelle processer

    Petersen, Karen Bjerg


    ...to restrict the space of possibilities for transcultural processes and for learning from a terra nullius position. The focus is on empirical studies of the views of culture in the 2010 legislation on residence permits, in the 2006 legislation on citizenship, and in the mandatory ... introduced in 2003...

  12. Sparsity and Information Processing

    Ikeda, Shiro


    Recently, many information processing methods utilizing the sparsity of the information source have been studied. We have reported some results along this line of research. Here we pick up two results from our own work. One is an image reconstruction method for radio interferometry and the other is a motor command computation method for a two-joint arm.
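
    As a toy illustration of the sparsity idea mentioned above (not the authors' actual reconstruction method), the sketch below recovers a sparse vector from linear measurements by iterative soft thresholding (ISTA); the matrix, data, and parameter values are invented for the example.

```python
def soft_threshold(v, t):
    """Proximal step for the L1 penalty: shrink each entry toward zero by t."""
    return [(abs(x) - t) * (1 if x > 0 else -1) if abs(x) > t else 0.0 for x in v]

def ista(A, y, lam=0.1, step=0.5, iters=500):
    """Minimize 0.5*||Ax - y||^2 + lam*||x||_1 by iterative soft thresholding."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - y and gradient g = A^T r of the quadratic term
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = soft_threshold([x[j] - step * g[j] for j in range(n)], step * lam)
    return x

# 3 measurements of a 4-dimensional signal whose only nonzero entry is x[1] = 2
A = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
y = [0.0, 2.0, 0.0]
x_hat = ista(A, y)
```

    With these made-up numbers the estimate concentrates on the second coordinate (slightly shrunk by the L1 penalty), while the unobserved and zero coordinates stay at zero.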

  13. Image Processing for Teaching.

    Greenberg, R.; And Others


    The Image Processing for Teaching project provides a powerful medium to excite students about science and mathematics, especially children from minority groups and others whose needs have not been met by traditional teaching. Using professional-quality software on microcomputers, students explore a variety of scientific data sets, including…

  14. Image-Processing Program

    Roth, D. J.; Hull, D. R.


    IMAGEP manipulates digital image data to effect various processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within the subroutines are sub-subroutines, also selected via keyboard. The algorithm has possible scientific, industrial, and biomedical applications in the study of flows in materials, the analysis of steels and ores, and pathology, respectively.

  15. The Serendipitous Research Process

    Nutefall, Jennifer E.; Ryder, Phyllis Mentzell


    This article presents the results of an exploratory study asking faculty in the first-year writing program and instruction librarians about their research process focusing on results specifically related to serendipity. Steps to prepare for serendipity are highlighted as well as a model for incorporating serendipity into a first-year writing…

  16. Automated process planning system

    Mann, W.


    Program helps process engineers set up manufacturing plans for machined parts. The system allows one to develop and store a library of similar parts' characteristics, as related to a particular facility. The information is then used in an interactive system to help develop manufacturing plans that meet required standards.

  17. Quantum image processing?

    Mastriani, Mario


    This paper presents a number of problems concerning the practical (real) implementation of the techniques known as quantum image processing. The most serious problem is the recovery of the outcomes after the quantum measurement, which, as this work demonstrates, is equivalent to a noise measurement and is not considered in the literature on the subject. It is noteworthy that this is due to several factors: (1) a classical algorithm that uses Dirac's notation and is then coded in MATLAB does not constitute a quantum algorithm; (2) the literature emphasizes the internal representation of the image but says nothing about the classical-to-quantum and quantum-to-classical interfaces and how these are affected by decoherence; (3) the literature does not mention how to implement these proposed internal representations in a practical way (in the laboratory); (4) given that quantum image processing works with generic qubits, it requires measurements along all axes of the Bloch sphere; and (5) others. In return, the technique known as quantum Boolean image processing is mentioned, which works exclusively with computational basis states (CBS). This methodology allows us to avoid the problem of quantum measurement, which alters the measured results except in the case of CBS. What has been said so far extends to quantum algorithms outside image processing too.
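
    The contrast drawn here between generic qubits and computational basis states can be illustrated with a tiny classical simulation of projective measurement statistics (an illustrative sketch only; the amplitudes, trial count, and seed are invented for the example):

```python
import math
import random

def measure_fraction_of_ones(alpha, beta, trials=10000, seed=42):
    """Simulate repeated Z-measurements of the state alpha|0> + beta|1>.

    Returns the observed fraction of '1' outcomes; by the Born rule the
    probability of a '1' outcome is |beta|^2.
    """
    rng = random.Random(seed)
    p1 = abs(beta) ** 2
    return sum(1 for _ in range(trials) if rng.random() < p1) / trials

# A computational basis state |1> gives a deterministic outcome ...
cbs = measure_fraction_of_ones(0.0, 1.0)
# ... while a generic superposition yields noisy, probabilistic outcomes.
plus = measure_fraction_of_ones(1 / math.sqrt(2), 1 / math.sqrt(2))
```

    The CBS run returns exactly 1.0 on every trial, whereas the superposition run scatters around 0.5, which is the sense in which measurement "adds noise" except for CBS.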

  18. Electrochemical Discharge Machining Process

    Anjali V. Kulkarni


    The electrochemical discharge machining process is evolving as a promising micromachining process. The experimental investigations in the present work substantiate this trend. In the present work, in situ, synchronised, transient temperature and current measurements have been carried out. The need for the transient measurements arose due to the time-varying nature of the discharge formation and the time-varying circuit current. Synchronised and transient measurements revealed the discrete nature of the process. They also helped in formulating the basic mechanism for the discharge formation and the material removal in the process. The temperature profile on the workpiece and in the electrochemical discharge machining cell is experimentally measured using a pyrometer and two varieties of K-type thermocouples. Surface topography of the discharge-affected zones on the workpiece has been studied using a scanning electron microscope. The measurements and surface topographical studies reveal the potential use of this process for machining in the micron regime. With careful experimental set-up design and a suitable supply voltage and polarity, the process can be applied for both micromachining and micro-deposition. It can be extended for machining and/or deposition of a wide range of materials.

  19. Food Process Engineering

    Friis, Alan; Jensen, Bo Boye Busk; Risum, Jørgen

    to calculate the requirements of heat processing. Our goal is to put food engineering into a production context. Other courses teach food chemistry, food microbiology and food technology. Topics of great importance and all have to be seen in a broader context of producing good and safe food in a large scale...
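
    The heat-processing requirement mentioned above is commonly quantified via the accumulated lethality (F0) integral; the sketch below applies the standard formula to an invented temperature profile (the reference temperature 121.1 °C and z = 10 °C are the conventional values for low-acid canned foods, but the sampled profile is hypothetical):

```python
def lethality_f0(temps_c, dt_min, t_ref=121.1, z=10.0):
    """Accumulated lethality F0 (equivalent minutes at t_ref) for a sampled
    product temperature history: F0 = sum 10**((T - t_ref)/z) * dt."""
    return sum(10.0 ** ((t - t_ref) / z) * dt_min for t in temps_c)

# Hypothetical retort profile sampled once per minute:
# come-up, hold at reference temperature, then cooling
profile = [100.0] * 5 + [121.1] * 3 + [111.1] * 4
f0 = lethality_f0(profile, dt_min=1.0)
```

    Each minute spent 10 °C below the reference temperature contributes only a tenth of a lethality minute, which is why the hold phase dominates F0 here.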

  20. Pattern evaporation process

    Z. Żółkiewicz


    The paper discusses the process of thermal evaporation of a foundry pattern. At several research-development centres, studies have been carried out to examine the physico-chemical phenomena that take place in a foundry mould filled with a polystyrene pattern when it is poured with molten metal. In the technique of evaporative patterns, the process of filling the mould (which holds a polystyrene pattern inside) with molten metal is interrelated with the process of thermal decomposition of this pattern. The transformation of an evaporative pattern (e.g. made from foamed polystyrene) from the solid into the liquid and then the gaseous state occurs as a result of the thermal effect that the liquid metal exerts on this pattern. Consequently, at the liquid metal-pattern-mould phase boundary some physico-chemical phenomena take place which until now have not been fully explained. When the pattern is evaporating, solid and gaseous products are evolved, e.g. CO, CO2, H2, N2, and hydrocarbons such as styrene, toluene, ethane, methane and benzene [16, 23]. The process of polystyrene pattern evaporation in a foundry mould under the effect of molten metal is of a very complex nature and depends on many different factors, still not fully investigated. The kinetics of pattern evaporation is also affected by the technological properties of the foundry mould, e.g. permeability, thermophysical properties, parameters of the gating system, temperature of pouring, properties of the pattern material, and the size of the pattern-liquid metal contact surface.

  1. Biosphere Process Model Report

    J. Schmitt


    To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  2. PROcess Based Diagnostics PROBE

    Clune, T.; Schmidt, G.; Kuo, K.; Bauer, M.; Oloso, H.


    Many of the aspects of the climate system that are of the greatest interest (e.g., the sensitivity of the system to external forcings) are emergent properties that arise via the complex interplay between disparate processes. This is also true for climate models: most diagnostics are not a function of an isolated portion of source code, but rather are affected by multiple components and procedures. Thus any model-observation mismatch is hard to attribute to any specific piece of code or imperfection in a specific model assumption. An alternative approach is to identify diagnostics that are more closely tied to specific processes -- implying that if a mismatch is found, it should be much easier to identify and address the specific algorithmic choices that will improve the simulation. However, this approach requires looking at model output and observational data in a more sophisticated way than the more traditional production of monthly or annual mean quantities. The data must instead be filtered in time and space for examples of the specific process being targeted. We are developing a data analysis environment called PROcess-Based Explorer (PROBE) that seeks to enable efficient and systematic computation of process-based diagnostics on very large sets of data. In this environment, investigators can define arbitrarily complex filters and then seamlessly perform computations in parallel on the filtered output from their model. The same analysis can be performed on additional related data sets (e.g., reanalyses), thereby enabling routine comparisons between model and observational data. PROBE also incorporates workflow technology to automatically update computed diagnostics for subsequent executions of a model. In this presentation, we will discuss the design and current status of PROBE as well as share results from some preliminary use cases.
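
    The filter-then-diagnose idea can be sketched generically (this is not PROBE's actual API; the record structure, predicate, and numbers below are invented for illustration):

```python
def process_diagnostic(records, predicate, metric):
    """Average a diagnostic metric over only the samples that match the
    targeted process, rather than over a blanket monthly/annual mean."""
    selected = [r for r in records if predicate(r)]
    if not selected:
        raise ValueError("no samples matched the process filter")
    return sum(metric(r) for r in selected) / len(selected)

# Toy model output: hourly samples with a rain rate and a cloud-top height.
samples = [
    {"rain_mm_h": 0.0, "cloud_top_km": 2.0},
    {"rain_mm_h": 5.0, "cloud_top_km": 9.0},
    {"rain_mm_h": 7.0, "cloud_top_km": 11.0},
]
# Diagnostic: mean cloud-top height, but only during raining samples.
mean_top = process_diagnostic(
    samples,
    predicate=lambda r: r["rain_mm_h"] > 1.0,
    metric=lambda r: r["cloud_top_km"],
)
```

    The same `predicate`/`metric` pair could then be applied unchanged to a reanalysis data set, giving the model-versus-observation comparison described above.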

  3. Retinomorphic image processing.

    Ghosh, Kuntal; Bhaumik, Kamales; Sarkar, Sandip


    The present work is aimed at understanding and explaining some of the aspects of visual signal processing at the retinal level while exploiting the same towards the development of some simple techniques in the domain of digital image processing. Classical studies on retinal physiology revealed the nature of contrast sensitivity of the receptive field of bipolar or ganglion cells, which lie in the outer and inner plexiform layers of the retina. To explain these observations, a difference of Gaussian (DOG) filter was suggested, which was subsequently modified to a Laplacian of Gaussian (LOG) filter for computational ease in handling two-dimensional retinal inputs. To date, almost all image processing algorithms used in various branches of science and engineering have followed LOG or one of its variants. Recent observations in retinal physiology, however, indicate that the retinal ganglion cells receive input from a larger area than the classical receptive fields. We have proposed an isotropic model for the non-classical receptive field of the retinal ganglion cells, corroborated by these recent observations, by introducing higher order derivatives of Gaussian expressed as linear combinations of Gaussians only. In digital image processing, this provides a new mechanism of edge detection on one hand and image half-toning on the other. It has also been found that living systems may sometimes prefer to "perceive" the external scenario by adding noise to the received signals in the pre-processing level for arriving at better information on light and shade in the edge map. The proposed model also provides explanations for many brightness-contrast illusions hitherto unexplained not only by the classical isotropic model but also by some other Gestalt and Constructivist models or by non-isotropic multi-scale models. The proposed model is easy to implement both in the analog and digital domains. A scheme for implementation in the analog domain generates a new silicon retina.
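
    A minimal sketch of the classical DOG idea described above, reduced to one dimension (the kernel widths and the test signal are invented; the authors' non-classical model and real 2-D images add considerable detail):

```python
import math

def gaussian_kernel(sigma, radius):
    """Discrete 1-D Gaussian, normalized to sum to 1."""
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def dog_kernel(sigma_c, sigma_s, radius):
    """Center-surround (difference of Gaussians) kernel; sums to ~0."""
    c = gaussian_kernel(sigma_c, radius)
    s = gaussian_kernel(sigma_s, radius)
    return [a - b for a, b in zip(c, s)]

def convolve(signal, kernel):
    """Plain convolution with edge values clamped at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

# A luminance step edge: dark region followed by a bright region
edge = [0.0] * 20 + [1.0] * 20
response = convolve(edge, dog_kernel(sigma_c=1.0, sigma_s=2.0, radius=6))
```

    Because the kernel integrates to zero, the response vanishes in uniform regions and changes sign across the luminance step; the zero crossing marks the edge, which is the standard DOG/LOG edge-detection mechanism.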

  4. Privatization Process in Kosovo

    Ing. Florin Aliu


    Privatization is considered an initial step toward a market economy, restructuring the financial and economic sector and enabling competition in the economy. Privatization is the most painful process in an economy; beside legal establishment and political will, it also includes the aspect of fairness and honesty. The analysis of this process is based on models and comparisons between Kosovo and the countries of Central and Eastern Europe, in order to give a clearer picture of the overall process of privatization in Kosovo. The methodology used to analyze this issue is based on empirical results, on qualitative interpretation of the models, and on studying a particular asset privatization process. A widely discussed case of privatization in Kosovo is that of Post and Telecom of Kosovo (PTK). Since each company has its own value, I have focused my appraising analysis on the financial statements, with special attention to Cash Flow from Operations as the most significant indicator of how a company uses its physical and human resources to generate money. I have based my research on the methodology of discounted cash flow from operations analysis, even though the company valuation was done using net cash flow from operations analysis. The cash flow valuation was then discounted by the T-bond interest rate. This paper concludes that the privatization process in Kosovo has not brought the expected results, firstly through the setting of an inappropriate price for assets and lastly through the restructuring of the overall privatized sector and industry. Kosovo consequently lost a big opportunity to create a competitive financial industry: starting from the banking industry, followed by the pension trust, which remained at its initial steps of development.
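
    The valuation approach described, discounting operating cash flows at a T-bond rate, reduces to the standard present-value formula; the sketch below uses invented cash flows and an invented 5% discount rate, not figures from the PTK case:

```python
def discounted_cash_flow(cash_flows, rate):
    """Present value of a stream of yearly cash flows discounted at `rate`.

    `rate` stands in for the T-bond yield used as the discount rate in
    the valuation described above; cash_flows[0] arrives in year 1.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Hypothetical operating cash flows (in millions) over three years, at 5%
value = discounted_cash_flow([40.0, 45.0, 50.0], 0.05)
```

    Raising the discount rate lowers the valuation, which is why the choice of the T-bond rate (rather than a riskier benchmark) directly affects the asset price set in a privatization.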

  5. Cassini science planning process

    Paczkowski, Brian G.; Ray, Trina L.


    The mission design for Cassini-Huygens calls for a four-year orbital survey of the Saturnian system and the descent into the Titan atmosphere and eventual soft-landing of the Huygens probe. The Cassini orbiter tour consists of 76 orbits around Saturn with 44 close Titan flybys and 8 targeted icy satellite flybys. The Cassini orbiter spacecraft carries twelve scientific instruments that will perform a wide range of observations on a multitude of designated targets. The science opportunities, frequency of encounters, the length of the Tour, and the use of distributed operations pose significant challenges for developing the science plan for the orbiter mission. The Cassini Science Planning Process is the process used to develop and integrate the science and engineering plan that incorporates an acceptable level of science required to meet the primary mission objectives for the orbiter. The bulk of the integrated science and engineering plan will be developed prior to Saturn Orbit Insertion (SOI). The Science Planning Process consists of three elements: 1) the creation of the Tour Atlas, which identifies the science opportunities in the tour, 2) the development of the Science Operations Plan (SOP), which is the conflict-free timeline of all science observations and engineering activities, a constraint-checked spacecraft pointing profile, and data volume allocations to the science instruments, and 3) an Aftermarket and SOP Update process, which is used to update the SOP while in tour with the latest information on spacecraft performance, science opportunities, and ephemerides. This paper will discuss the various elements of the Science Planning Process used on the Cassini Mission to integrate, implement, and adapt the science and engineering activity plans for the Tour.

  6. Vaccine process technology.

    Josefsberg, Jessica O; Buckland, Barry


    The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. From a regulatory

  7. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip;


    Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occurs within the porous adsorbent...
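
    As one concrete example of such an adsorption model (a generic textbook choice, not necessarily the model selected by the authors), the Langmuir isotherm relates free and bound solute concentration on the resin; the parameter values below are invented:

```python
def langmuir(c, q_max, k_eq):
    """Langmuir isotherm: equilibrium bound concentration at free
    concentration c, i.e. q* = q_max * K * c / (1 + K * c), where q_max
    is the adsorbent capacity and K the equilibrium association constant."""
    return q_max * k_eq * c / (1.0 + k_eq * c)

# Hypothetical resin: capacity 50 mg/mL of resin, K = 2 mL/mg
loadings = [langmuir(c, 50.0, 2.0) for c in (0.0, 0.5, 5.0, 500.0)]
```

    The curve is linear at low concentration (slope ≈ q_max·K) and saturates at q_max, which is the qualitative behavior a chromatography model has to capture before mass-transfer effects are layered on top.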

  8. Process and Post-Process: A Discursive History.

    Matsuda, Paul Kei


    Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…

  9. Managing Process Variants in the Process Life Cycle

    Hallerbach, A.; Bauer, Th.; Reichert, M.U.


    When designing process-aware information systems, often variants of the same process have to be specified. Each variant then constitutes an adjustment of a particular process to specific requirements building the process context. Current Business Process Management (BPM) tools do not adequately supp

  10. 5 CFR 1653.13 - Processing legal processes.


    ... TSP is notified in writing that the legal process has been appealed, and that the effect of the filing... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Processing legal processes. 1653.13... PROCESSES AFFECTING THRIFT SAVINGS PLAN ACCOUNTS Legal Process for the Enforcement of a Participant's...

  11. An Integrated Design Process

    Petersen, Mads Dines; Knudstrup, Mary-Ann


    Present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well...... as the requirements they meet in terms of how to approach the design process – especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show...... that there is a difference in the experiences of the different offices. Architects taking an active part in the development of projects and tools in general have a better understanding of how to approach this. It is of course not surprising, because of a focused strategy towards this. However the most important thing...

  12. Instabilities in sensory processes

    Balakrishnan, J.


    In any organism there are different kinds of sensory receptors for detecting the various, distinct stimuli through which its external environment may impinge upon it. These receptors convey these stimuli in different ways to an organism's information processing region, enabling it to distinctly perceive the varied sensations and to respond to them. The behavior of cells and their response to stimuli may be captured through simple mathematical models employing regulatory feedback mechanisms. We argue that sensory processes such as olfaction function optimally by operating in close proximity to dynamical instabilities. In the case of coupled neurons, we point out that random disturbances and fluctuations can move their operating point close to certain dynamical instabilities, triggering synchronous activity.
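
    A minimal caricature of operating near a dynamical instability (purely illustrative; the gain values and rotation angle are invented, not taken from the models in the article):

```python
import cmath

def feedback_amplitudes(gain, freq=0.3, steps=200, x0=1.0):
    """Iterate a linear oscillatory feedback loop x <- gain * e^{i*freq} * x.

    |gain| < 1 makes the response decay, |gain| > 1 makes it diverge;
    a gain near 1 sustains the oscillation and makes the loop maximally
    sensitive to small perturbations and fluctuations.
    """
    multiplier = gain * cmath.exp(1j * freq)
    x = complex(x0)
    amps = []
    for _ in range(steps):
        x *= multiplier
        amps.append(abs(x))
    return amps

sub = feedback_amplitudes(0.9)     # below threshold: response dies out
crit = feedback_amplitudes(1.0)    # at threshold: sustained oscillation
super_ = feedback_amplitudes(1.1)  # above threshold: runaway growth
```

    Sitting just below the threshold gives a large, long-lived response to a small input, which is the intuition behind sensory systems operating close to an instability.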

  13. The Player Engagement Process

    Schoenau-Fog, Henrik


    Engagement is an essential element of the player experience, and the concept is described in various ways in the literature. To gain a more detailed comprehension of this multifaceted concept, and in order to better understand what aspects can be used to evaluate engaging game play and to design...... engaging user experiences, this study investigates one dimension of player engagement by empirically identifying the components associated with the desire to continue playing. Based on a description of the characteristics of player engagement, a series of surveys were developed to discover the components......, categories and triggers involved in this process. By applying grounded theory to the analysis of the responses, a process-oriented player engagement framework was developed and four main components consisting of objectives, activities, accomplishments and affects as well as the corresponding categories...

  14. Plant hydrocarbon recovery process

    Dzadzic, P.M.; Price, M.C.; Shih, C.J.; Weil, T.A.


    A process for production and recovery of hydrocarbons from hydrocarbon-containing whole plants in a form suitable for use as chemical feedstocks or as hydrocarbon energy sources, which process comprises: (A) pulverizing by grinding or chopping hydrocarbon-containing whole plants selected from the group consisting of the euphorbiaceae, apocynaceae, asclepiadaceae, compositae, cactaceae and pinaceae families to a suitable particle size, (B) drying and preheating said particles in a reducing atmosphere under positive pressure, (C) passing said particles through a thermal conversion zone containing a reducing atmosphere and with a residence time of 1 second to about 30 minutes at a temperature within the range of from about 200 °C to about 1000 °C, (D) separately recovering the condensable vapors as liquids and the noncondensable gases in a condition suitable for use as chemical feedstocks or as hydrocarbon fuels.

  15. A Logical Process Calculus

    Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)


    This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.

  16. Integral Politics as Process

    Tom Atlee


    Using the definition proposed here, integral politics can be a process of integrating diverse perspectives into wholesome guidance for a community or society. Characteristics that follow from this definition have ramifications for understanding what such political processes involve. Politics becomes integral as it transcends partisan battle and nurtures generative conversation toward the common good. Problems, conflicts and crises become opportunities for new (or renewed) social coherence. Conversational methodologies abound that can help citizen awareness temporarily expand during policy-making, thus helping raise society’s manifested developmental stage. Convening archetypal stakeholders or randomly selected citizens in conversations designed to engage the broader public enhances democratic legitimacy. With minimal issue- and candidate-advocacy, integral political leaders would develop society’s capacity to use integral conversational tools to improve its health, resilience, and collective intelligence. This both furthers and manifests evolution becoming conscious of itself.

  17. Yeast nuclear RNA processing

    Jade Bernstein; Eric A. Toth


    Nuclear RNA processing requires dynamic and intricately regulated machinery composed of multiple enzymes and their cofactors. In this review, we summarize recent experiments using Saccharomyces cerevisiae as a model system that have yielded important insights regarding the conversion of pre-RNAs to functional RNAs, and the elimination of aberrant RNAs and unneeded intermediates from the nuclear RNA pool. Much progress has been made recently in describing the 3D structure of many elements of the nuclear degradation machinery and its cofactors. Similarly, the regulatory mechanisms that govern RNA processing are gradually coming into focus. Such advances invariably generate many new questions, which we highlight in this review.

  18. Posttranslational processing of progastrin

    Bundgaard, Jens René; Rehfeld, Jens F.


    Gastrin and cholecystokinin (CCK) are homologous hormones with important functions in the brain and the gut. Gastrin is the main regulator of gastric acid secretion and gastric mucosal growth, whereas cholecystokinin regulates gall bladder emptying, pancreatic enzyme secretion and besides acts...... as a major neurotransmitter in the central and peripheral nervous systems. The tissue-specific expression of the hormones is regulated at the transcriptional level, but the posttranslational phase is also decisive and is highly complex in order to ensure accurate maturation of the prohormones in a cell...... processing progastrin is often greatly disturbed in neoplastic cells.The posttranslational phase of the biogenesis of gastrin and the various progastrin products in gastrin gene-expressing tissues is now reviewed here. In addition, the individual contributions of the processing enzymes are discussed...

  19. Process Improvement: Customer Service.

    Cull, Donald


    Utilizing the comment section of patient satisfaction surveys, Clark Memorial Hospital in Jeffersonville, IN went through a thoughtful process to arrive at an experience that patients said they wanted. Two Lean Six Sigma tools were used--the Voice of the Customer (VoC) and the Affinity Diagram. Even when using these tools, a facility will not be able to accomplish everything the patient may want. Guidelines were set and rules were established for the Process Improvement Team in order to lessen frustration, increase focus, and ultimately be successful. The project's success is driven by the team members carrying its message back to their areas. It's about ensuring that everyone is striving to improve the patients' experience by listening to what they say is being done right and what they say can be done better. And then acting on it.

  20. Thin film interconnect processes

    Malik, Farid

    Interconnects and associated photolithography and etching processes play a dominant role in the feature shrinkage of electronic devices. Most interconnects are fabricated by use of thin film processing techniques. Planarization of dielectrics and novel metal deposition methods are the focus of current investigations. Spin-on glass, polyimides, etch-back, bias-sputtered quartz, and plasma-enhanced conformal films are being used to obtain planarized dielectrics over which metal films can be reliably deposited. Recent trends have been towards chemical vapor depositions of metals and refractory metal silicides. Interconnects of the future will be used in conjunction with planarized dielectric layers. Reliability of devices will depend to a large extent on the quality of the interconnects.

  1. The aluminum smelting process.

    Kvande, Halvor


    This introduction to the industrial primary aluminum production process presents a short description of the electrolytic reduction technology, the history of aluminum, and the importance of this metal and its production process to modern society. Aluminum's special qualities have enabled advances in technologies coupled with energy and cost savings. Aircraft capabilities have been greatly enhanced, and increases in size and capacity are made possible by advances in aluminum technology. The metal's flexibility for shaping and extruding has led to architectural advances in energy-saving building construction. The high strength-to-weight ratio has meant a substantial reduction in energy consumption for trucks and other vehicles. The aluminum industry is therefore a pivotal one for ecological sustainability and strategic for technological development.

  2. [In Process Citation].

    Yildirim, Ayhan; Metzler, Philipp; Lanzer, Martin; Lübbers, Heinz-Theo; Yildirim, Vedat


    Solcoseryl® is a protein-free haemodialysate, containing a broad spectrum of low molecular components of cellular mass and blood serum obtained from veal calves. Solcoseryl® improves the transport of oxygen and glucose to cells that are under hypoxic conditions. It increases the synthesis of intracellular ATP and contributes to an increase in the level of aerobic glycolysis and oxidative phosphorylation. It activates the reparative and regenerative processes in tissues by stimulating fibroblast proliferation and repair of the collagen vascular wall. The formulations of Solcoseryl® are infusion, injection, gel and ointment, and it is also available as a dental paste for inflammatory processes of the mouth cavity, gums and lips.

  3. Plutonium dissolution process

    Vest, Michael A.; Fink, Samuel D.; Karraker, David G.; Moore, Edwin N.; Holcomb, H. Perry


    A two-step process for dissolving plutonium metal, which two steps can be carried out sequentially or simultaneously. Plutonium metal is exposed to a first mixture containing approximately 1.0M-1.67M sulfamic acid and 0.0025M-0.1M fluoride, the mixture having been heated to a temperature between C. and C. The mixture will dissolve a first portion of the plutonium metal but leave a portion of the plutonium in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid in a concentration between approximately 0.05M and 0.067M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution process is diluted with nitrogen.

  4. Youpi: YOUr processing PIpeline

    Monnerville, Mathias; Sémah, Gregory


    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX, Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

  5. Image processing occupancy sensor

    Brackney, Larry J.


    A system and method of detecting occupants in a building automation system environment using image based occupancy detection and position determinations. In one example, the system includes an image processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the position and location of the occupants, the system can finely control the elements to optimize conditions for the occupants, optimize energy usage, among other advantages.

  6. Fast Data Processing with Spark

    Karau, Holden


    This book will be a basic, step-by-step tutorial, which will help readers take advantage of all that Spark has to offer. Fast Data Processing with Spark is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had problems that were too much to be dealt with on a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.

  7. Sample Data Processing.


    the relative practicality of compensating the channel with an approach of predistorting the masking sequence, by processing in a filter that ... replicates the channel response, with a conventional approach of equalizing the channel with an inverse filter. The predistortion method demonstrated a ... compensate for the channel distortion is to predistort the encryption stream in the receiver by means of a filter which replicates the impulse response of

  8. Near Shore Wave Processes


    given the offshore wave conditions. OBJECTIVES: We hypothesize that the wave-induced kinematic, sediment and morphologic processes are nonlinearly ... morphology, which acts as hydraulic roughness for the mean flows and perturbs the velocity-sediment fields, is measured as a function of time and over ...

  9. Aluminum powder metallurgy processing

    Flumerfelt, J.F.


    The objective of this dissertation is to explore the hypothesis that there is a strong linkage between gas atomization processing conditions, as-atomized aluminum powder characteristics, and the consolidation methodology required to make components from aluminum powder. The hypothesis was tested with pure aluminum powders produced by commercial air atomization, commercial inert gas atomization, and gas atomization reaction synthesis (GARS). A comparison of the GARS aluminum powders with the commercial aluminum powders showed the former to exhibit superior powder characteristics. The powders were compared in terms of size and shape, bulk chemistry, surface oxide chemistry and structure, and oxide film thickness. Minimum explosive concentration measurements assessed the dependence of explosibility hazard on surface area, oxide film thickness, and gas atomization processing conditions. The GARS aluminum powders were exposed to different relative humidity levels, demonstrating the effect of atmospheric conditions on post-atomization oxidation of aluminum powder. An Al-Ti-Y GARS alloy exposed in ambient air at different temperatures revealed the effect of reactive alloy elements on post-atomization powder oxidation. The pure aluminum powders were consolidated by two different routes, a conventional consolidation process for fabricating aerospace components with aluminum powder and a proposed alternative. The consolidation procedures were compared by evaluating the consolidated microstructures and the corresponding mechanical properties. A low temperature solid state sintering experiment demonstrated that tap densified GARS aluminum powders can form sintering necks between contacting powder particles, unlike the total resistance to sintering of commercial air atomization aluminum powder.

  10. Topology and mental processes.

    McLeay, H


    The study reported here considers the effect of rotation on the decision time taken to compare nonrigid objects, presented as like and unlike pairs of knots and unknots. The results for 48 subjects, 21 to 45 years old, support the notion that images which have a characteristic 'foundation part' are more easily stored and accessed in the brain. Also, there is evidence that the comparison of deformable objects is processed by mental strategies other than self-evident mental rotation.

  11. Processing Nanostructured Structural Ceramics


    aspects of the processing of nanostructured ceramics, viz.: the production of a flowable and compactable dry nanopowder suitable for use in ... composition due to the different synthesis routes used. Therefore, ‘industry-standard’ dispersants can cause flocculation rather than dispersion ... stabilised zirconia (3-YSZ) were no higher than for conventional, micron-sized material of the same composition. However, detailed crystallographic

  12. Inelastic Light Scattering Processes

    Fouche, Daniel G.; Chang, Richard K.


    Five different inelastic light scattering processes will be denoted by: ordinary Raman scattering (ORS), resonance Raman scattering (RRS), off-resonance fluorescence (ORF), resonance fluorescence (RF), and broad fluorescence (BF). A distinction between fluorescence (including ORF and RF) and Raman scattering (including ORS and RRS) will be made in terms of the number of intermediate molecular states which contribute significantly to the scattered amplitude, and not in terms of excited-state lifetimes or virtual versus real processes. The theory of these processes will be reviewed, including the effects of pressure, laser wavelength, and laser spectral distribution on the scattered intensity. The application of these processes to the remote sensing of atmospheric pollutants will be discussed briefly. It will be pointed out that the poor sensitivity of the ORS technique cannot be increased by going toward resonance without also compromising the advantages it has over the RF technique. Experimental results on inelastic light scattering from I₂ vapor will be presented. As a single-longitudinal-mode 5145 Å argon-ion laser line was tuned away from an I₂ absorption line, the scattering was observed to change from RF to ORF; the basis of the distinction is the different pressure dependence of the scattered intensity. Nearly three orders of magnitude enhancement of the scattered intensity was measured in going from ORF to RF. Forty-seven overtones were observed and their relative intensities measured. The ORF cross section of I₂ compared to the ORS cross section of N₂ was found to be 3 × 10^6, with I₂ at its room temperature vapor pressure.

  13. Pyrolysis process and apparatus

    Lee, Chang-Kuei


    This invention discloses a process and apparatus for pyrolyzing particulate coal by heating with a particulate solid heating media in a transport reactor. The invention tends to dampen fluctuations in the flow of heating media upstream of the pyrolysis zone, and by so doing forms a substantially continuous and substantially uniform annular column of heating media flowing downwardly along the inside diameter of the reactor. The invention is particularly useful for bituminous or agglomerative type coals.

  14. The Caroline interrogatory process

    Degagne, D. [Alberta Energy and Utilities Board, Calgary, AB (Canada); Gibson, T. [Gecko Management, Calgary, AB (Canada)


    Using the specific case study of the Caroline interrogatory process, an example is given of how an effective communications and public involvement program can re-establish trust and credibility levels within a community after an incident. The public is nervous about sour gas, especially about blowouts of gas from a pipeline. The post-approval period was marked by high expectations and a community consultation program which included a community advisory board, an emergency planning committee, socio-economic factors, and environmental monitoring and studies. Information and education involve newspaper articles, newsletters, tours, public consultation meetings, and weekly e-mail. Mercury was detected as a potential hazard at the site, and company actions are illustrated. Overall lessons learned included: starting early paid off, face-to-face resident contacts were the most effective, the willingness to make changes was the key to success, the community helped, knowing all the answers is not essential, and there is a need for empathy. The interrogatory process includes a hybrid technique that comprises four stages: 1) process review and public input, 2) identification and clarification of issues, 3) responses by industry and government, and 4) a public forum and follow-up action.

  16. Processing of lateritic ores

    Collier, D.E.; Ring, R.J. [Environment Division, Australian Nuclear Science and Technology Organisation, Menai, New South Wales (Australia); McGill, J.; Russell, H. [Energy Resources of Australia Ltd., Ranger Mine, Jabiru, Northern Territory (Australia)


    Highly weathered or lateritic ores that contain high proportions of fine clay minerals present specific problems when they are processed to extract uranium. Of perhaps the greatest significance is the potential of the fine minerals to adsorb dissolved uranium (preg-robbing) from leach liquors produced by processing laterites or blends of laterite and primary ores. These losses can amount to 25% of the readily soluble uranium. The clay components can also restrict practical slurry densities to relatively low values in order to avoid rheology problems in pumping and agitation. The fine fractions also contribute to relatively poor solid-liquid separation characteristics in settling and/or filtration. Studies at ANSTO have characterised the minerals believed to be responsible for these problems and quantified the effects of the fines in these types of ores. Processing strategies were also examined, including roasting, resin-in-leach and separate leaching of the laterite fines to overcome potential problems. The incorporation of the preferred treatment option into an existing mill circuit is discussed. (author)

  17. Advanced microwave processing concepts

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Laboratory, TN (United States)


    The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymer composites. The variable frequency microwave furnace, whose initial conception and design was funded by the AIM Materials Program, will allow us, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.

  18. Advanced microwave processing concepts

    Lauf, R.J.; McMillan, A.D.; Paulauskas, F.L. [Oak Ridge National Lab., TN (United States)


    The purpose of this work is to explore the feasibility of several advanced microwave processing concepts to develop new energy-efficient materials and processes. The project includes two tasks: (1) commercialization of the variable-frequency microwave furnace; and (2) microwave curing of polymeric materials. The variable frequency microwave furnace, whose initial conception and design was funded by the AIM Materials Program, allows the authors, for the first time, to conduct microwave processing studies over a wide frequency range. This novel design uses a high-power traveling wave tube (TWT) originally developed for electronic warfare. By using this microwave source, one can not only select individual microwave frequencies for particular experiments, but also achieve uniform power densities over a large area by the superposition of many different frequencies. Microwave curing of various thermoset resins will be studied because it holds the potential of in-situ curing of continuous-fiber composites for strong, lightweight components or in-situ curing of adhesives, including metal-to-metal. Microwave heating can shorten curing times, provided issues of scaleup, uniformity, and thermal management can be adequately addressed.

  19. Laser processing of materials

    J Dutta Majumdar; I Manna


    Light amplification by stimulated emission of radiation (laser) is a coherent and monochromatic beam of electromagnetic radiation that can propagate in a straight line with negligible divergence and occurs over a wide range of wavelengths, energies/powers and beam modes/configurations. As a result, lasers find wide applications in the mundane to the most sophisticated devices, in commercial to purely scientific purposes, and in life-saving as well as life-threatening causes. In the present contribution, we provide an overview of the application of lasers for material processing. The processes covered are broadly divided into four major categories; namely, laser-assisted forming, joining, machining and surface engineering. Apart from briefly introducing the fundamentals of these operations, we present an updated review of the relevant literature to highlight the recent advances and open questions. We begin our discussion with the general applications of lasers, fundamentals of laser-matter interaction and classification of laser material processing. A major part of the discussion focuses on laser surface engineering, which has attracted a good deal of attention from the scientific community for its technological significance and scientific challenges. In this regard, a special mention is made of laser surface vitrification or amorphization, which remains a very attractive but unaccomplished proposition.

  20. Basic Social Processes

    Barney G. Glaser, PhD, Hon. PhD


    The goal of grounded theory is to generate a theory that accounts for a pattern of behavior that is relevant and problematic for those involved. The goal is not voluminous description, nor clever verification. As with all grounded theory, the generation of a basic social process (BSP) theory occurs around a core category. While a core category is always present in a grounded research study, a BSP may not be. BSPs are ideally suited to generation by grounded theory from qualitative research because qualitative research can pick up process through fieldwork that continues over a period of time. BSPs are a delight to discover and formulate since they give so much movement and scope to the analyst’s perception of the data. BSPs such as cultivating, defaulting, centering, highlighting or becoming give the feeling of process, change and movement over time. They also have clear, amazing general implications; so much so that it is hard to contain them within the confines of a single substantive study. The tendency is to refer to them as a formal theory without the necessary comparative development of formal theory. They are labeled by a “gerund” (“-ing”), which both stimulates their generation and the tendency to over-generalize them.

  1. Adaptive Signal Processing Testbed

    Parliament, Hugh A.


    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.
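
    Tone-excision algorithms of the kind developed on the ASPT are commonly built on LMS adaptive filters. The sketch below is a generic adaptive line enhancer in Python, not code from the ASPT itself (which ran on TMS320C30 hardware); the function name and parameters are illustrative. The idea is that a narrowband interfering tone is predictable from delayed samples, while the wideband signal is not, so subtracting the filter's prediction suppresses the tone.

    ```python
    import math

    def lms_notch(x, n_taps=8, mu=0.01, delay=1):
        """Adaptive line enhancer: predict the narrowband (predictable) part
        of x from delayed samples with an LMS filter and subtract the
        prediction. Returns the error signal, i.e. x with tones suppressed."""
        w = [0.0] * n_taps          # filter weights
        buf = [0.0] * n_taps        # delayed reference samples, newest first
        out = []
        for n, sample in enumerate(x):
            ref = x[n - delay] if n >= delay else 0.0
            buf = [ref] + buf[:-1]
            y = sum(wi * bi for wi, bi in zip(w, buf))   # tone estimate
            e = sample - y                               # broadband residual
            w = [wi + 2.0 * mu * e * bi for wi, bi in zip(w, buf)]
            out.append(e)
        return out
    ```

    On a pure sinusoid the residual decays toward zero as the weights converge, which is the behavior a tone exciser needs; the step size `mu` trades convergence speed against misadjustment.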

  2. Spacelab Ground Processing

    Scully, Edward J.; Gaskins, Roger B.


    Spacelab (SL) ground processing is active at the Kennedy Space Center (KSC). The palletized payload for the second Shuttle launch is staged and integrated with interface verification active. The SL Engineering Model is being assembled for subsequent test and checkout activities. After delivery of SL flight elements from Europe, prelaunch operations for the first SL flight start with receipt of the flight experiment packages and staging of the SL hardware. Experiment operations consist of integrating the various experiment elements into the SL racks, floors and pallets. Rack and floor assemblies with the experiments installed, are integrated into the flight module. Aft end-cone installation, pallet connections, and SL subsystems interface verifications are accomplished, and SL-Orbiter interfaces verified. The Spacelab cargo is then transferred to the Orbiter Processing Facility (OPF) in a controlled environment using a canister/transporter. After the SL is installed into the Orbiter payload bay, physical and functional integrity of all payload-to-Orbiter interfaces are verified and final close-out operations conducted. Spacelab payload activities at the launch pad are minimal with the payload bay doors remaining closed. Limited access is available to the module through the Spacelab Transfer Tunnel. After mission completion, the SL is removed from the Orbiter in the OPF and returned to the SL processing facility for experiment equipment removal and reconfiguration for the subsequent mission.

  3. Process for protein PEGylation.

    Pfister, David; Morbidelli, Massimo


    PEGylation is a versatile drug delivery technique that presents a particularly wide range of conjugation chemistry and polymer structure. The conjugated protein can be tuned to specifically meet the needs of the desired application. In the area of drug delivery this typically means to increase the persistency in the human body without affecting the activity profile of the original protein. On the other hand, because of the high costs associated with the production of therapeutic proteins, subsequent operations imposed by PEGylation must be optimized to minimize the costs inherent to the additional steps. The closest attention has to be given to the PEGylation reaction engineering and to the subsequent purification processes. This review article focuses on these two aspects and critically reviews the current state of the art with a clear focus on the development of industrial scale processes which can meet the market requirements in terms of quality and costs. The possibility of using continuous processes, with integration between the reaction and the separation steps is also illustrated.

  4. Process measuring techniques; Prozessmesstechnik

    Freudenberger, A.


    This introduction to measurement techniques for chemical and process-technical plants in science and industry describes in detail the methods used to measure basic quantities. Most prominent are modern measuring techniques based on ultrasound, microwaves and the Coriolis effect. Alongside physical and measuring-technique fundamentals, the practical applications of measuring devices are described. Calculation examples are given to illustrate the subject matter. The book addresses students of physical engineering, process engineering and environmental engineering at technical schools, as well as engineers of other disciplines wishing to familiarize themselves with process measurement techniques. (orig.)

  5. RACORO aerosol data processing

    Elisabeth Andrews


    The RACORO aerosol data (cloud condensation nuclei (CCN), condensation nuclei (CN) and aerosol size distributions) need further processing to be useful for model evaluation (e.g., GCM droplet nucleation parameterizations) and other investigations. These tasks include: (1) identification and flagging of 'splash'-contaminated Twin Otter aerosol data; (2) calculation of actual supersaturation (SS) values in the two CCN columns flown on the Twin Otter; (3) interpolation of CCN spectra from SGP and the Twin Otter to 0.2% SS; (4) processing of data for spatial variability studies; (5) calculation of light scattering from measured aerosol size distributions. Below we first briefly describe the measurements and then describe the results of several data processing tasks that have been completed, paving the way for the scientific analyses for which the campaign was designed. The end result of this research will be several aerosol data sets which can be used to achieve some of the goals of the RACORO mission, including enhanced understanding of cloud-aerosol interactions and improved cloud simulations in climate models.
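
    Interpolating a CCN spectrum to a common supersaturation, as in task (3), is commonly done by assuming the Twomey power-law form N_CCN = C·SS^k, which makes the interpolation linear in log-log space. The Python sketch below illustrates that assumption only; the function name and the sample values in the usage note are illustrative, not RACORO data.

    ```python
    import math

    def ccn_at_ss(ss_meas, n_meas, ss_target=0.2):
        """Interpolate a measured CCN spectrum to a target supersaturation (%).

        Assumes the Twomey power-law form N_CCN = C * SS**k, i.e. linear
        interpolation in log-log space. ss_meas must be positive and
        strictly increasing; end segments are used for extrapolation.
        """
        logs = [math.log(s) for s in ss_meas]
        logn = [math.log(n) for n in n_meas]
        t = math.log(ss_target)
        # index of the segment bracketing the target (clamped to the ends)
        i = max(0, min(len(logs) - 2, sum(1 for s in logs if s < t) - 1))
        slope = (logn[i + 1] - logn[i]) / (logs[i + 1] - logs[i])
        return math.exp(logn[i] + slope * (t - logs[i]))
    ```

    For data that actually follow a power law the result is exact; e.g. with N = 100·SS^0.5 measured at SS = 0.1, 0.4 and 0.8%, the value returned at 0.2% equals 100·0.2^0.5.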

  6. Approximate simulation of Hawkes processes

    Møller, Jesper; Rasmussen, Jakob Gulddahl


    Hawkes processes are important in point process theory and its applications, and simulation of such processes is often needed for various statistical purposes. This article concerns a simulation algorithm for unmarked and marked Hawkes processes, exploiting that the process can be constructed...
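
    The core idea of simulating a Hawkes process can be illustrated with Ogata-style thinning for the unmarked case with an exponential kernel; this is a generic sketch, not the authors' algorithm, and all parameter names are illustrative.

    ```python
    import math
    import random

    def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
        """Simulate an unmarked Hawkes process on [0, horizon] by thinning.

        Conditional intensity (exponential kernel assumed):
            lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
        Stationarity requires the branching ratio alpha / beta < 1.
        """
        rng = random.Random(seed)
        events = []
        t = 0.0
        while True:
            # The intensity only decays between events, so its current value
            # is a valid upper bound until the next accepted point.
            lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
            t += rng.expovariate(lam_bar)          # propose a candidate point
            if t > horizon:
                break
            lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
            if rng.random() * lam_bar <= lam_t:    # accept with prob lam_t/lam_bar
                events.append(t)
        return events
    ```

    With alpha = 0 this reduces to an ordinary Poisson process of rate mu; each accepted event raises the intensity by alpha, producing the self-exciting clustering that characterizes Hawkes processes.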


    Blaženka Piuković Babičković


    Modern business process orientation is bound up primarily with process thinking and a process-based organizational structure. Although business processes are increasingly written and spoken about, they remain a major problem in the business world, especially in countries in transition, where a lack of understanding of the concept of business process management has been found. The aim of this paper is to make a specific contribution to overcoming the identified problem by pointing out the significance of the concept of business process management, as well as by presenting a model for reviewing process maturity and tools recommended for use in process management.

  8. Discovery as a process

    Loehle, C.


    The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. They also inhibit the study of science as a process. Finally, they inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.

  9. Solar Flares: Magnetohydrodynamic Processes

    Kazunari Shibata


    Full Text Available This paper outlines the current understanding of solar flares, mainly focused on the magnetohydrodynamic (MHD) processes responsible for producing a flare. Observations show that flares are among the most explosive phenomena in the atmosphere of the Sun, releasing a huge amount of energy, up to about 10^32 erg, on a timescale of hours. Flares involve the heating of plasma, mass ejection, and particle acceleration that generates high-energy particles. The key physical processes for producing a flare are: the emergence of magnetic field from the solar interior to the solar atmosphere (flux emergence), local enhancement of electric current in the corona (formation of a current sheet), and rapid dissipation of electric current (magnetic reconnection), which causes shock heating, mass ejection, and particle acceleration. The evolution toward the onset of a flare is rather quasi-static while free energy is accumulated in the form of coronal electric current (field-aligned current, more precisely), whereas the dissipation of coronal current proceeds rapidly, producing various dynamic events that affect lower atmospheric layers such as the chromosphere and photosphere. Flares manifest such rapid dissipation of coronal current, and their theoretical modeling has developed in accordance with observations, in which numerical simulations have proved to be a strong tool for reproducing the time-dependent, nonlinear evolution of a flare. We review the models proposed to explain the physical mechanism of flares, giving a comprehensive explanation of the key processes mentioned above. We start with the basic properties of flares, then go into the details of energy build-up, release, and transport in flares, where magnetic reconnection works as the central engine producing a flare.

  10. Time processing in dyscalculia

    Marinella Cappelletti


    Full Text Available To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether (1) number and time may be sub-served by a common quantity system or decision mechanisms, in which case they may both be impaired, or (2) number and time are distinct, and therefore may dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime (‘1’ or ‘9’) or by a neutral symbol (‘#’), or in a third task decided which of two Arabic numbers (either ‘1’, ‘5’, or ‘9’) lasted longer. Results showed that (i) DDs’ temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however, (ii) task-irrelevant numbers dramatically disrupted DDs’ temporal discriminability, the more so as their salience increased, though the actual magnitude of the numbers had no effect; and in contrast, (iii) controls’ time perception was robust to the presence of numbers but modulated by numerical quantity, such that small number primes or numerical stimuli made durations appear shorter than veridical, and the opposite for large number primes or numerical stimuli. This study is the first to investigate a continuous quantity such as time in a population with a congenital number impairment, and to show that atypical development of numerical competence leaves continuous quantity processing spared. Our data support the idea of a partially shared quantity system across numerical and temporal dimensions, which allows dissociations and interactions among dimensions; furthermore, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing such as time.

  11. Stone dusting process advance

    Matt Ryan; David Humphreys [Mining Attachments (Qld.) Pty Ltd. (Australia)]


    The coal mining industry has, for many years, used dry stone dust, or calcium carbonate (CaCO3), to prevent the propagation of coal dust explosions throughout underground mines in Australia. In the last decade wet stone dusting has been introduced, in which stone dust and water are mixed together to form a paste-like slurry. This mixture is pumped and sprayed onto the underground roadway surfaces. This method solved the contamination of the intake airways but brought with it a new problem known as 'caking'. Caking is the hardened layer that forms as the stone dust slurry dries. It was proven that this hardened layer compromises the dispersal characteristics of the stone dust and therefore its ability to suppress a coal dust explosion. This project set out to prove a specially formulated, non-toxic slurry additive and process that could overcome the caking effect. The slurry additive process combines dry stone dust with water to form a slurry. The slurry is then treated with the additive and compressed air to create a highly vesicular, foam-like stone-dusted surface. The initial testing of a range of additives and their effectiveness in minimising the caking effect of wet dusting was performed at Applied Chemical's research laboratory in Melbourne, Victoria, and independently tested at the SGS laboratory in Paget, Queensland. The results from these tests provided the platform to conduct full-scale spraying trials at the Queensland Mines Rescue Station and Caledon Coal's Cook Colliery, Blackwater. The project moved into the final stage of completion with the collection of data. The intent was to compare the slurry additive process to dry stone dusting in full-scale methane explosions at the CSIR Kloppersbos explosion facility in Kloppersbos, South Africa.

  12. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    Ge, Zhiqiang


      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
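
    As a minimal illustration of the MSPC idea described above, the sketch below builds a Hotelling's T² statistic from simulated in-control data for two correlated process variables; all parameter values and variable names are illustrative, not taken from the book:

    ```python
    import random
    import statistics

    random.seed(0)

    # Simulate in-control training data: two correlated process variables.
    n = 500
    x1 = [random.gauss(10.0, 1.0) for _ in range(n)]
    x2 = [0.8 * v + random.gauss(2.0, 0.5) for v in x1]  # correlated with x1

    # Sample mean and covariance of the in-control data.
    m1, m2 = statistics.fmean(x1), statistics.fmean(x2)
    s11 = statistics.fmean([(a - m1) ** 2 for a in x1])
    s22 = statistics.fmean([(b - m2) ** 2 for b in x2])
    s12 = statistics.fmean([(a - m1) * (b - m2) for a, b in zip(x1, x2)])
    det = s11 * s22 - s12 ** 2  # determinant of the 2x2 covariance matrix

    def t_squared(a, b):
        """Hotelling's T^2 = d' S^-1 d for one observation (a, b),
        using the closed-form inverse of the 2x2 covariance matrix."""
        d1, d2 = a - m1, b - m2
        return (s22 * d1 ** 2 - 2 * s12 * d1 * d2 + s11 * d2 ** 2) / det

    # An observation near the process mean scores low; an observation that
    # breaks the learned correlation structure scores high and would be flagged.
    print(t_squared(10.0, 10.0))  # small: consistent with the in-control model
    print(t_squared(13.0, 6.0))   # large: violates the x1-x2 correlation
    ```

    In practice MSPC methods work in a reduced latent space (e.g. PCA scores) with control limits derived from an F-distribution; the fixed threshold idea is the same.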

  13. Process for soil consolidation

    Herrick, F.W.; Brandstrom, R.I.


    In this process for the formation of a consolidated aggregate, a mass of solid particles is combined with an aqueous alkaline consolidating composition which forms a gel. The gel consists principally of a vegetable polyphenolic material of the group of catechins, condensed tannins, and extracts of the bark of coniferous trees; 1-10% by weight of formaldehyde; and a catalyst from the group of water-soluble salts of chromium, iron, and aluminum, which serves to catalyze the gel-forming reaction.

  14. Phonocardiography Signal Processing

    Abbas, Abbas K


    The auscultation method is an important diagnostic indicator for hemodynamic anomalies. Heart sound classification and analysis play an important role in auscultative diagnosis. The term phonocardiography refers to the tracing technique for heart sounds and the recording of cardiac acoustic vibrations by means of a microphone-transducer. Understanding the nature and source of this signal is therefore important for developing competent tools for further analysis and processing, in order to enhance and optimize the cardiac clinical diagnostic approach. This book gives the

  15. Bismuth vanadate process

    Sullivan, R.M.


    This patent describes the process for the preparation of bismuth vanadate and bismuth vanadate-containing compounds wherein the precursor materials are calcined in the solid state at temperatures sufficient to react the precursor materials to prepare the vanadate compounds. It comprises: wet grinding the calcined product, contacting the calcined product with sufficient alkaline material to provide a pH level of 7.0-13.0 and recovering the treated product, the wet grinding of the calcined product being conducted either in the presence of the alkaline material or prior to the contacting with the alkaline material.

  16. Medical image processing

    Dougherty, Geoff


    This book is designed for end users in the field of digital imaging, who wish to update their skills and understanding with the latest techniques in image analysis. This book emphasizes the conceptual framework of image analysis and the effective use of image processing tools. It uses applications in a variety of fields to demonstrate and consolidate both specific and general concepts, and to build intuition, insight and understanding. Although the chapters are essentially self-contained they reference other chapters to form an integrated whole. Each chapter employs a pedagogical approach to e

  17. FHR Process Instruments

    Holcomb, David Eugene [ORNL]


    Fluoride salt-cooled High temperature Reactors (FHRs) are entering into early-phase engineering development. Initial candidate technologies have been identified to measure all of the required process variables. The purpose of this paper is to describe the proposed measurement techniques in sufficient detail to enable assessment of the proposed instrumentation suite and to support development of the component technologies. This paper builds upon the instrumentation chapter of the recently published FHR technology development roadmap. Locating instruments outside of the intense core radiation and high-temperature fluoride salt environment significantly decreases their environmental tolerance requirements. Under operating conditions, FHR primary coolant salt is a transparent, low-vapor-pressure liquid. Consequently, FHRs can employ standoff optical measurements from above the salt pool to assess in-vessel conditions. For example, the core outlet temperature can be measured by observing the fuel's blackbody emission. Similarly, the intensity of the core's Cerenkov glow indicates the fission power level. Short-lived activation of the primary coolant provides another means for standoff measurements of process variables. The primary coolant flow and neutron flux can be measured using gamma spectroscopy along the primary coolant piping. FHR operation entails a number of process measurements. Reactor thermal power and core reactivity are the most significant variables for process control. Thermal power can be determined by measuring the primary coolant mass flow rate and temperature rise across the core. The leading candidate technologies for primary coolant temperature measurement are Au-Pt thermocouples and Johnson noise thermometry. Clamp-on ultrasonic flow measurement, with high-temperature-tolerant standoffs, is a potential coolant flow measurement technique. Also, the salt redox condition will be monitored as an indicator of its corrosiveness. Both
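
    The thermal-power determination described above is a simple heat balance, P = ṁ·c_p·ΔT. The sketch below illustrates it; the salt heat capacity and the flow and temperature figures are assumed nominal values for illustration, not design data from the paper:

    ```python
    # Core thermal power from primary-loop measurements: P = m_dot * c_p * dT.
    # The heat capacity below is an approximate figure for FLiBe salt, and the
    # example flow/temperature values are illustrative assumptions.

    C_P_SALT = 2400.0  # J/(kg*K), approximate specific heat of FLiBe

    def thermal_power_mw(mass_flow_kg_s: float, t_in_c: float, t_out_c: float) -> float:
        """Core thermal power in MW(t) from coolant mass flow and temperature rise."""
        delta_t = t_out_c - t_in_c  # temperature rise across the core, K
        return mass_flow_kg_s * C_P_SALT * delta_t / 1e6

    # Example: 1000 kg/s of salt heated from 600 C to 700 C across the core.
    print(thermal_power_mw(1000.0, 600.0, 700.0))  # -> 240.0 MW(t)
    ```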

  18. Coupled Diffusion Processes



    Coupled diffusion processes (CDP for short) model the systems of molecular motors, which have attracted much interest from physicists and biologists in recent years[1,2,9,14,4,7,21]. The protein moves along a filament called the track, and it is crucial that there are several inner states of the protein; the underlying chemical reaction causes transitions among the different inner states, while chemical energy can be converted to mechanical energy by ratchet effects[5,3,2,14,12].

  19. Image Processing Research


    [Abstract unrecoverable due to OCR damage. Legible fragments cite the Bell System Technical Journal, Vol. 36, pp. 653-709, May 1957, and refer to image restoration and enhancement projects and to color image scanner calibration.]

  20. Process for treating biomass

    Campbell, Timothy J; Teymouri, Farzaneh


    This invention is directed to a process for treating biomass. The biomass is treated with a biomass swelling agent within the vessel to swell or rupture at least a portion of the biomass. A portion of the swelling agent is removed from a first end of the vessel following the treatment. Then steam is introduced into a second end of the vessel different from the first end to further remove swelling agent from the vessel in such a manner that the swelling agent exits the vessel at a relatively low water content.

  1. Industrial Information Processing

    Svensson, Carsten


    This paper demonstrates how cross-functional business processes may be aligned with product specification systems in an intra-organizational environment by integrating planning systems and expert systems, thereby providing an end-to-end integrated and automated solution to the “build-to-order” challenge. An outcome of this capability is that the potential market for customized products will expand, resulting in a reduction in administrative and manufacturing costs. This potential for cost reduction, simultaneous with market expansion, is a source of competitive advantage; hence manufacturers have...

  2. Software Process Improvement

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter


    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out...... directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only a few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  3. Process Principle of Information

    张高锋; 任君


    Ⅰ. Introduction. Information structure is the organization model of given and new information in the course of information transmission. A discourse contains a variety of information, and not all the information listed in the discourse is necessary and useful to us. When we decode a discourse, usually we do not need to read every word in the discourse or text, but skim or scan the discourse or text to search for what we think is important or useful to us as quickly as possible. Ⅱ. Process Principles of Informati...

  4. Process Analytical Chemistry

    Veltkamp, David J.; Doherty, Steve D.; Anderson, B. B.; Koch, Mel (University of Washington); Bond, Leonard J. (Battelle Pacific Northwest Laboratory); Burgess, Lloyd W.; Ullman, Alan H.; Bamberger, Judith A. (Battelle Pacific Northwest Laboratory); Greenwood, Margaret S. (Battelle Pacific Northwest Laboratory)


    This review of process analytical chemistry is an update to the previous review on this subject published in 1995(A2). The time period covered for this review includes publications written or published from late 1994 until early 1999, with the addition of a few classic references pointing to background information critical to an understanding of a specific topic area. These older references have been critically included as established fundamental works. New topics covered in this review not previously treated as separate subjects in past reviews include sampling systems, imaging (via optical spectroscopy), and ultrasonic analysis.

  5. Sea Ice Processes


    [Abstract unrecoverable due to OCR damage. Legible fragments concern ice-velocity predictions and the operating characteristics of PIPS, processes and their scales, the vertical grid, and horizontal compression being compensated by vertical motion; further investigation is noted as needed.]

  6. Introduction to information processing

    Deitel, Harvey M


    An Introduction to Information Processing provides an informal introduction to the computer field. This book introduces computer hardware, which is the actual computing equipment. Organized into three parts encompassing 12 chapters, this book begins with an overview of the evolution of personal computing and includes detailed case studies on two of the most essential personal computers of the 1980s, namely, the IBM Personal Computer and Apple's Macintosh. This text then traces the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapte

  7. Computers and data processing

    Deitel, Harvey M


    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems. Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  8. Stochastic conditional intensity processes

    Bauwens, Luc; Hautsch, Nikolaus


    In this article, we introduce the so-called stochastic conditional intensity (SCI) model by extending Russell’s (1999) autoregressive conditional intensity (ACI) model by a latent common dynamic factor that jointly drives the individual intensity components. We show by simulations that the proposed model allows for a wide range of (cross-)autocorrelation structures in multivariate point processes. The model is estimated by simulated maximum likelihood (SML) using the efficient importance sampling (EIS) technique. By modeling price intensities based on NYSE trading, we provide significant evidence...

  9. Hyperspectral image processing

    Wang, Liguo


    Based on the authors’ research, this book introduces the main processing techniques in hyperspectral imaging. In this context, SVM-based classification, distance comparison-based endmember extraction, SVM-based spectral unmixing, spatial attraction model-based sub-pixel mapping, and MAP/POCS-based super-resolution reconstruction are discussed in depth. Readers will gain a comprehensive understanding of these cutting-edge hyperspectral imaging techniques. Researchers and graduate students in fields such as remote sensing, surveying and mapping, geosciences and information systems will benefit from this valuable resource.

  10. Digital signal processing

    O'Shea, Peter; Hussain, Zahir M


    In three parts, this book contributes to the advancement of engineering education and serves as a general reference on digital signal processing. Part I presents the basics of analog and digital signals and systems in the time and frequency domain. It covers the core topics: convolution, transforms, filters, and random signal analysis. It also treats important applications including signal detection in noise, radar range estimation for airborne targets, binary communication systems, channel estimation, banking and financial applications, and audio effects production. Part II considers sel
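
    Of the core topics listed, convolution is the most fundamental. A minimal direct-form implementation (an illustrative sketch, not code from the book) applied as a 3-point moving-average filter:

    ```python
    def convolve(x, h):
        """Direct-form discrete convolution: y[n] = sum_k x[k] * h[n-k].
        The output length is len(x) + len(h) - 1 (full convolution)."""
        y = [0.0] * (len(x) + len(h) - 1)
        for n in range(len(y)):
            for k in range(len(x)):
                if 0 <= n - k < len(h):
                    y[n] += x[k] * h[n - k]
        return y

    # A 3-point moving-average filter smoothing a noiseless step input:
    # the sharp 0 -> 1 transition is spread over the filter length.
    h = [1 / 3, 1 / 3, 1 / 3]
    x = [0, 0, 1, 1, 1]
    print(convolve(x, h))
    ```

    The same operation is what an FIR filter computes sample by sample; fast implementations use the FFT, which Part I's transform material covers.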

  11. Digital signal processing laboratory

    Kumar, B Preetham


    INTRODUCTION TO DIGITAL SIGNAL PROCESSING: Brief Theory of DSP Concepts; Problem Solving; Computer Laboratory: Introduction to MATLAB®/SIMULINK®; Hardware Laboratory: Working with Oscilloscopes, Spectrum Analyzers, Signal Sources; Digital Signal Processors (DSPs); References. DISCRETE-TIME LTI SIGNALS AND SYSTEMS: Brief Theory of Discrete-Time Signals and Systems; Problem Solving; Computer Laboratory: Simulation of Continuous-Time and Discrete-Time Signals and Systems; References. TIME AND FREQUENCY ANALYSIS OF COMMUNICATION SIGNALS: Brief Theory of Discrete-Time Fourier Transform (DTFT), Discrete Fourier Transform

  12. Koenigs function and branching processes

    Chikilev, O G


    An explicit solution of time-homogeneous pure birth branching processes is described. It gives alternative extensions for the negative binomial distribution (branching processes with immigration) and for the Furry-Yule distribution (branching processes without immigration).
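
    The Furry-Yule distribution mentioned here arises from the simplest pure-birth process. The following sketch simulates it via exponential waiting times; this is the standard construction, not the paper's explicit solution, and the parameter values are illustrative. The population at time t is geometric with mean e^(λt):

    ```python
    import math
    import random

    random.seed(1)

    def yule_population(rate: float, t_end: float) -> int:
        """Simulate a pure-birth (Yule-Furry) process started from one
        individual. With n individuals alive, each giving birth at the given
        rate, the next birth arrives after an Exp(n * rate) waiting time."""
        n, t = 1, 0.0
        while True:
            t += random.expovariate(n * rate)
            if t > t_end:
                return n
            n += 1

    # For rate = 1 and t = 1 the population is geometric with mean e.
    samples = [yule_population(1.0, 1.0) for _ in range(20000)]
    mean = sum(samples) / len(samples)
    print(mean, math.e)  # the sample mean should be close to e
    ```

    Adding an independent immigration stream to this construction yields the negative binomial case the abstract refers to.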

  13. Perfect simulation of Hawkes processes

    Møller, Jesper; Rasmussen, Jakob Gulddahl

    This article concerns a perfect simulation algorithm for unmarked and marked Hawkes processes. The usual straightforward simulation algorithm suffers from edge effects, whereas our perfect simulation algorithm does not. By viewing Hawkes processes as Poisson cluster processes and using...
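
    The Poisson-cluster view mentioned here also yields a simple, non-perfect simulation recipe. The sketch below (plain Python, illustrative parameters) implements it and deliberately retains the edge effect at time 0 that the perfect algorithm removes:

    ```python
    import math
    import random

    random.seed(3)

    def poisson_sample(lam: float) -> int:
        """Knuth's method for a Poisson(lam) draw (adequate for small lam)."""
        target, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= target:
                return k
            k += 1

    def hawkes_cluster(mu: float, alpha: float, beta: float, t_end: float) -> list:
        """Hawkes process on [0, t_end] with exponential excitation kernel
        alpha * exp(-beta * t), simulated as a Poisson cluster process:
        immigrants arrive at rate mu, and each event spawns Poisson(alpha/beta)
        children at Exp(beta) delays. Events before time 0 are ignored, which
        is exactly the edge effect a perfect algorithm avoids."""
        gen, t = [], 0.0
        while True:  # immigrants: homogeneous Poisson(mu) on [0, t_end]
            t += random.expovariate(mu)
            if t > t_end:
                break
            gen.append(t)
        events = list(gen)
        while gen:  # successive generations of offspring (subcritical: alpha < beta)
            nxt = []
            for parent in gen:
                for _ in range(poisson_sample(alpha / beta)):
                    child = parent + random.expovariate(beta)
                    if child <= t_end:
                        nxt.append(child)
            events.extend(nxt)
            gen = nxt
        return sorted(events)

    events = hawkes_cluster(mu=1.0, alpha=0.5, beta=1.0, t_end=100.0)
    print(len(events))  # roughly mu * t_end / (1 - alpha/beta) = 200, minus edge losses
    ```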


  14. Nanoscale process engineering

    Qixiang Wang; Fei Wei


    The research of nanoscale process engineering (NPE) is based on the interdisciplinary nature of nanoscale science and technology. It mainly deals with the transformation of materials and energy into nanostructured materials and nanodevices, and synergizes the multidisciplinary convergence between materials science and technology, biotechnology, and information technology. The core technologies of NPE concern all aspects of nanodevice construction and operation, such as the manufacture of nanomaterials "by design", concepts and design of nanoarchitectures, and the manufacture and control of customizable nanodevices. The two main targets of NPE at present are nanoscale manufacture and the concept design of nanodevices. The research progress of nanoscale manufacturing processes, focused on creating nanostructures and assembling them into nanosystems and larger-scale architectures, has built the interdiscipline of NPE. The concepts and design of smart, multi-functional, environmentally compatible and customizable nanodevice prototypes built from nanostructured systems of nanocrystalline, nanoporous and microemulsion systems are the most challenging tasks of NPE. The development of NPE may also impel us to consider curriculum and educational reform of chemical engineering in universities.

  15. Oxytocin and emotion processing.

    Di Simplicio, Martina; Harmer, Catherine J


    Since the observation that oxytocin has key effects on social decision making, research on this exciting neuropeptide has doubled in volume: hundreds of studies have pursued the promise of a specific oxytocin action on high-level cognition and social function, with wide potential translational implications (from autism to social anxiety to dementia). Here we review the evidence on whether the complex behavioural effects observed in humans after exogenous oxytocin administration build on changes in basic emotional information processing, in particular the recognition of emotional facial expressions and attention and memory for emotionally valenced stimuli. We observe that recent studies confirm a facilitatory effect of oxytocin on more accurate emotion processing, irrespective of emotion type. However, it remains unclear whether this action precedes, is independent of, or is even secondary to the neuropeptide promoting a greater salience of social stimuli. Overall, this growing research area has shown that oxytocin produces behavioural and neurofunctional outcomes that are highly dependent on the experimental context and on individual differences (gender, personality, life experiences). This poses an exciting challenge for future experimental medicine designs to address and unpack the complex interactions between individual and context characteristics, which is needed for the development of more precise clinical applications.

  16. Mastering the diesel process

    Antila, E.; Kaario, O.; Lahtinen, T. (and others)


    This is the final report of the research project 'Mastering the Diesel Process'. The project has been a joint research effort of the Helsinki University of Technology, the Tampere University of Technology, the Technical Research Centre of Finland, and Aabo Akademi University. Moreover, the contribution of Michigan Technological University has been important. 'Mastering the Diesel Process' has been a computational research project on the physical phenomena of diesel combustion. The theoretical basis of the project lies in computational fluid dynamics. Various submodels for computational fluid dynamics have been developed or tested within engine simulation, and various model combinations in three diesel engines of different sizes have been studied. The most important submodels cover fuel spray drop breakup, fuel evaporation, gas-fuel interaction in the spray, the mixing model of combustion, heat transfer, and emission mechanisms. The boundary conditions and flow-field modelling have been studied as well. The main simulation tool has been Star-CD; the KIVA code has also been used in model development. With the help of simulation, we are able to investigate the effect of various design or operational parameters on diesel combustion and emission formation. (orig.)

  17. ARM Mentor Selection Process

    Sisterson, D. L. [Argonne National Lab. (ANL), Argonne, IL (United States)


    The Atmospheric Radiation Measurement (ARM) Program was created in 1989 with funding from the U.S. Department of Energy (DOE) to develop several highly instrumented ground stations to study cloud formation processes and their influence on radiative transfer. In 2003, the ARM Program became a national scientific user facility, known as the ARM Climate Research Facility. This scientific infrastructure provides fixed sites, mobile facilities, an aerial facility, and a data archive available for use by scientists worldwide. The ARM Climate Research Facility currently operates more than 300 instrument systems that provide ground-based observations of the atmospheric column. To keep ARM at the forefront of climate observations, the ARM infrastructure depends heavily on instrument scientists and engineers, also known as lead mentors. Lead mentors must have an excellent understanding of in situ and remote-sensing instrumentation theory and operation and comprehensive knowledge of critical scale-dependent atmospheric processes. They must also possess the technical and analytical skills to develop new data retrievals that provide innovative approaches for creating research-quality data sets. The ARM Climate Research Facility is seeking the best overall qualified candidate who can fulfill the lead mentor requirements in a timely manner.

  18. The anaerobic digestion process

    Rivard, C.J. [National Renewable Energy Lab., Golden, CO (United States); Boone, D.R. [Oregon Graduate Inst., Portland, OR (United States)


    The microbial process of converting organic matter into methane and carbon dioxide is so complex that anaerobic digesters have long been treated as "black boxes." Research into this process during the past few decades has gradually unraveled this complexity, but many questions remain. The major biochemical reactions for forming methane by methanogens are largely understood, and evolutionary studies indicate that these microbes are as different from bacteria as they are from plants and animals. In anaerobic digesters, methanogens are at the terminus of a metabolic web, in which the reactions of myriads of other microbes produce a very limited range of compounds - mainly acetate, hydrogen, and formate - on which the methanogens grow and from which they form methane. "Interspecies hydrogen transfer" and "interspecies formate transfer" are major mechanisms by which methanogens obtain their substrates and by which volatile fatty acids are degraded. Present understanding of these reactions and other complex interactions among the bacteria involved in anaerobic digestion is only now to the point where anaerobic digesters need no longer be treated as black boxes.

  19. Ambiguity in sentence processing.

    Altmann, G T


    As listeners and readers, we rarely notice the ambiguities that pervade our everyday language. When we hear the proverb `Time flies like an arrow' we might ponder its meaning, but not the fact that there are almost 100 grammatically permissible interpretations of this short sentence. On occasion, however, we do notice sentential ambiguity: headlines, such as `Two Sisters Reunited After 18 Years in Checkout Counter', are amusing because they so consistently lead to the unintended interpretation (presumably, the sisters did not spend 18 years at the checkout). It is this consistent preference for one interpretation-and one grammatical structure-rather than another that has fuelled research into sentence processing for more than 20 years. Until relatively recently, the dominant belief had been that these preferences arise from general principles that underlie our use of grammar, with certain grammatical constructions being preferred over others. There has now accrued, however, a considerable body of evidence demonstrating that these preferences are not absolute, but can change in particular circumstances. With this evidence have come new theories of sentence processing, some of which, at first glance, radically question the standard notions of linguistic representation, grammar and understanding.

  20. Multidimensional diffusion processes

    Stroock, Daniel W


    From the reviews: "… Both the Markov-process approach and the Itô approach … have been immensely successful in diffusion theory. The Stroock-Varadhan book, developed from the historic 1969 papers by its authors, presents the martingale-problem approach as a more powerful - and, in certain regards, more intrinsic - means of studying the foundations of the subject. […] … the authors make the uncompromising decision not "to proselytise by intimidating the reader with myriad examples demonstrating the full scope of the techniques", but rather to persuade the reader "with a careful treatment of just one problem to which they apply". […] Most of the main tools of stochastic-processes theory are used, but it is the formidable combination of probability theory with analysis … which is the core of the work. […] I have emphasized the great importance of the Stroock-Varadhan book. It contains a lot more than I have indicated; in particular, its many exercises contain much interesting material. For immediat...

  1. Mindfulness and psychological process.

    Williams, J Mark G


    The author reviews the articles in the Special Section on Mindfulness, starting from the assumption that emotions evolved as signaling systems that need to be sensitive to environmental contingencies. Failure to switch off emotion is due to the activation of mental representations of present, past, and future that are created independently of external contingencies. Mindfulness training can be seen as one way to teach people to discriminate such "simulations" from objects and contingencies as they actually are. The articles in this Special Section show how even brief laboratory training can have effects on processing affective stimuli; that long-term meditation practitioners show distinct reactions to pain; that longer meditation training is associated with differences in brain structure; that 8 weeks' mindfulness practice brings about changes in the way emotion is processed showing that participants can learn to uncouple the sensory, directly experienced self from the "narrative" self; that mindfulness training can affect working memory capacity, and enhance the ability of participants to talk about past crises in a way that enables them to remain specific and yet not be overwhelmed. The implications of these findings for understanding emotion and for further research is discussed.

  2. Turbulence and Stochastic Processes

    Celani, Antonio; Mazzino, Andrea; Pumir, Alain

    In 1931 the monograph Analytical Methods in Probability Theory appeared, in which A.N. Kolmogorov laid the foundations for the modern theory of Markov processes [1]. According to Gnedenko: "In the history of probability theory it is difficult to find other works that changed the established points of view and basic trends in research work in such a decisive way". Ten years later, his article on fully developed turbulence provided the framework within which most, if not all, of the subsequent theoretical investigations have been conducted [2] (see e.g. the review by Biferale et al. in this volume [3]). Remarkably, the greatest advances made in the last few years towards a thorough understanding of turbulence developed from the successful marriage between the theory of stochastic processes and the phenomenology of turbulent transport of scalar fields. In this article we will summarize these recent developments, which expose the direct link between the intermittency of transported fields and the statistical properties of particle trajectories advected by the turbulent flow (see also [4], and, for a more thorough review, [5]). We also discuss the perspectives of the Lagrangian approach beyond passive scalars, especially for the modeling of hydrodynamic turbulence.

  3. Natural gas conversion process


    The experimental apparatus was dismantled and transferred to a laboratory space provided by Lawrence Berkeley Laboratory (LBL), which is already equipped with a high-ventilation fume hood. This will enable us to make tests at higher gas flow rates in a safe environment. Three papers presented at the ACS meeting in San Francisco (Symposium on Natural Gas Upgrading II), April 5--10, 1992, show that the goal of direct catalytic conversion of methane into heavier hydrocarbons in a reducing atmosphere is actively pursued in three other laboratories. Their general concepts are similar to our own approach, but the temperature range of the experiments reported in these recent papers is much lower, and this leads to uneconomic conversion rates. This illustrates the advantage of methane activation by a hydrogen plasma in reaching commercial conversion rates. A preliminary process flow diagram was established for the Integrated Process, which was outlined in the previous Quarterly Report. The flow diagram also includes all the required auxiliary facilities for product separation and recycle of the unconverted feed, as well as for the preparation and compression of the syngas by-product.

  4. Grants Process Overview

    This infographic shows the steps in the National Institutes of Health and National Cancer Institute grants process. The graphic shows which steps are done by the Principal Investigator, the Grantee Institution, and NIH. The process is represented as a circular flow of steps. Starting from the top and reading clockwise: the Principal Investigator “Initiates Research Idea and Prepares Application”; the Grantee Institution “Submits Application”; NIH “NIH Center for Scientific Review Assigns to NCI and to Study Section”; NIH “Scientific Review Group (NCI or CSR) Evaluates for Scientific Merit”; NIH “National Cancer Advisory Board Recommends Action”; NIH “NCI Evaluates Program Relevance and Need”; NIH “NCI Makes Funding Selections and Issues Grant Awards”; NIH “NCI Monitors Programmatic and Business Management Performance of the Grant”; the Grantee Institution “Manages Funds”; the Principal Investigator “Conducts Research”. Source: Icons made by Freepik, licensed under CC BY 3.0.

  5. Poultry Slaughtering and Processing Facilities

    Department of Homeland Security — Agriculture Production Poultry Slaughtering and Processing in the United States This dataset consists of facilities which engage in slaughtering, processing, and/or...

  6. Beryllium Manufacturing Processes

    Goldberg, A


    This report is one of a number of reports that will be combined into a handbook on beryllium. Each report covers a specific topic. To date, the following reports have been published: (1) Consolidation and Grades of Beryllium; (2) Mechanical Properties of Beryllium and the Factors Affecting these Properties; (3) Corrosion and Corrosion Protection of Beryllium; (4) Joining of Beryllium; (5) Atomic, Crystal, Elastic, Thermal, Nuclear, and other Properties of Beryllium; and (6) Beryllium Coating (Deposition) Processes and the Influence of Processing Parameters on Properties and Microstructure. The conventional method of using ingot-cast material is unsuitable for manufacturing a beryllium product. Beryllium is a highly reactive metal with a high melting point, making it susceptible to react with mold-wall materials, forming beryllium compounds (BeO, etc.) that become entrapped in the solidified metal. In addition, the grain size is excessively large, being 50 to 100 µm in diameter, while grain sizes of 15 µm or less are required to meet acceptable strength and ductility requirements. Attempts at refining the as-cast grain size have been unsuccessful. Because of the large grain size and limited slip systems, the casting will invariably crack during a hot-working step, which is an important step in the microstructural-refining process. The high reactivity of beryllium together with its high viscosity (even with substantial superheat) also makes it an unsuitable candidate for precision casting. In order to overcome these problems, alternative methods have been developed for the manufacturing of beryllium. The vast majority of these methods involve the use of beryllium powders. The powders are consolidated under pressure in vacuum at an elevated temperature to produce vacuum hot-pressed (VHP) blocks and vacuum hot-isostatic-pressed (HIP) forms and billets. The blocks (typically cylindrical), which are produced over a wide range of sizes (up to 183 cm dia. by 61

  7. Fundamentals of process intensification: A process systems engineering view

    Babi, Deenesh Kavi; Sales Cruz, Alfonso Mauricio; Gani, Rafiqul


    at different scales of size, that is, the unit operation scale, the task scale, and the phenomena scale. The roles of process intensification with respect to process improvements and the generation of more sustainable process designs are discussed and questions related to when to apply process intensification...

  8. Brownian semi-stationary processes, turbulence and smooth processes

    Urbina, José Ulises Márquez

    process a bounded variation process with differentiable paths. It is natural to inquire if it is possible to obtain an asymptotic theory for this class of BSS processes. This problem is investigated and some partial results are presented. The asymptotic theory for BSS processes naturally leads...

  9. Particle processing technology

    Yoshio, Sakka


    In recent years, there has been strong demand for the development of novel devices and equipment that support advanced industries including IT/semiconductors, the environment, energy and aerospace along with the achievement of higher efficiency and reduced environmental impact. Many studies have been conducted on the fabrication of innovative inorganic materials with novel individual properties and/or multifunctional properties including electrical, dielectric, thermal, optical, chemical and mechanical properties through the development of particle processing. The fundamental technologies that are key to realizing such materials are (i) the synthesis of nanoparticles with uniform composition and controlled crystallite size, (ii) the arrangement/assembly and controlled dispersion of nanoparticles with controlled particle size, (iii) the precise structural control at all levels from micrometer to nanometer order and (iv) the nanostructural design based on theoretical/experimental studies of the correlation between the local structure and the functions of interest. In particular, it is now understood that the application of an external stimulus, such as magnetic energy, electrical energy and/or stress, to a reaction field is effective in realizing advanced particle processing [1-3]. This special issue comprises 12 papers including three review papers. Among them, seven papers are concerned with phosphor particles, such as silicon, metals, Si3N4-related nitrides, rare-earth oxides, garnet oxides, rare-earth sulfur oxides and rare-earth hydroxides. In these papers, the effects of particle size, morphology, dispersion, surface states, dopant concentration and other factors on the optical properties of phosphor particles and their applications are discussed. These nanoparticles are classified as zero-dimensional materials. Carbon nanotubes (CNT) and graphene are well-known one-dimensional (1D) and two-dimensional (2D) materials, respectively. This special issue also

  10. Business process modeling for processing classified documents using RFID technology

    Koszela Jarosław


    The article outlines the application of the processing approach to the functional description of the designed IT system supporting the operations of the secret office, which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually (“as is”) and of the target processes (“to be”) that use RFID technology for their automation. Additionally, examples of applying methods of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. An extension of the process analysis method is the possibility of applying a warehouse of processes and process mining methods.

  11. Process dissociation, single-process theories, and recognition memory.

    Ratcliff, R; Van Zandt, T; McKoon, G


    According to the assumptions of L. L. Jacoby's (1991) process dissociation method, performance in recognition memory is determined by the combination of an unconscious familiarity process and a conscious intentional recollection process. The process dissociation method is used to produce estimates of the contributions of the 2 components to recognition performance. This article investigates whether the method provides the correct estimates of components if performance actually depends on only a single process or on 2 processes different from those assumed by the method. The SAM model (G. Gillund & R. M. Shiffrin, 1984) was used to produce simulated data based on a single process. Variants of SAM with 2 processes and R. C. Atkinson and J. F. Juola's (1973) 2-process model were used to produce data based on 2 processes.
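The two-process estimates described above follow from Jacoby's (1991) standard equations, P(inclusion) = R + F(1 − R) and P(exclusion) = F(1 − R), where R is recollection and F is familiarity. A minimal sketch of the resulting estimators (function name and inputs are illustrative, not from the article):

```python
def process_dissociation_estimates(p_inclusion, p_exclusion):
    """Estimate recollection (R) and familiarity (F) from inclusion
    and exclusion performance, per Jacoby's (1991) equations:
        P(inclusion) = R + F*(1 - R)
        P(exclusion) = F*(1 - R)
    Subtracting gives R; dividing out (1 - R) gives F.
    """
    R = p_inclusion - p_exclusion
    if R >= 1.0:
        raise ValueError("R = 1 leaves familiarity undefined")
    F = p_exclusion / (1.0 - R)
    return R, F

# Illustrative data: 80% inclusion, 30% exclusion performance.
R, F = process_dissociation_estimates(0.80, 0.30)  # R = 0.5, F = 0.6
```

The simulation studies in the article ask precisely when these algebraic estimates remain meaningful if the generating model is not the assumed two-process mixture.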

  12. Titan's global geologic processes

    Malaska, Michael; Lopes, Rosaly M. C.; Schoenfeld, Ashley; Birch, Samuel; Hayes, Alexander; Williams, David A.; Solomonidou, Anezina; Janssen, Michael A.; Le Gall, Alice; Soderblom, Jason M.; Neish, Catherine; Turtle, Elizabeth P.; Cassini RADAR Team


    We have mapped the Cassini SAR imaged areas of Saturn's moon Titan in order to determine the geological properties that modify the surface [1]. We used the SAR dataset for mapping, but incorporated data from radiometry, VIMS, ISS, and SARTopo for terrain unit determination. This work extends our analyses of the mid-latitude/equatorial Afekan Crater region [2] and of the southern and northern polar regions [3]. We placed Titan terrains into six broad terrain classes: craters, mountain/hummocky, labyrinth, plains, dunes, and lakes. We also extended the fluvial mapping done by Burr et al. [4], and defined areas as potential cryovolcanic features [5]. We found that hummocky/mountainous and labyrinth areas are the oldest units on Titan, and that lakes and dunes are among the youngest. Plains units are the largest unit in terms of surface area, followed by the dunes unit. Radiometry data suggest that most of Titan's surface is covered in high-emissivity materials, consistent with organic materials, with only minor exposures of low-emissivity materials that are consistent with water ice, primarily in the mountain and hummocky areas and crater rims and ejecta [6, 7]. From examination of terrain orientation, we find that landscape evolution in the mid-latitude and equatorial regions is driven by aeolian processes, while polar landscapes are shaped by fluvial, lacustrine, and possibly dissolution or volatilization processes involving cycling organic materials [3, 8]. Although important in deciphering Titan's terrain evolution, impact processes play a very minor role in the modification of Titan's landscape [9]. We find no evidence for large-scale aqueous cryovolcanic deposits. References: [1] Lopes, R.M.C. et al. (2010) Icarus, 205, 540–558. [2] Malaska, M.J. et al. (2016) Icarus, 270, 130–161. [3] Birch et al., in revision. [4] Burr et al. (2013) GSA Bulletin 125, 299–321. [5] Lopes et al. JGR: Planets, 118, 1–20. [6] Janssen et al. (2009) Icarus, 200, 222–239. [7

  13. Sensors in Spray Processes

    Fauchais, P.; Vardelle, M.


    This paper presents what is our actual knowledge about sensors, used in the harsh environment of spray booths, to improve the reproducibility and reliability of coatings sprayed with hot or cold gases. First are described, with their limitations and precision, the different sensors following the in-flight hot particle parameters (trajectories, temperatures, velocities, sizes, and shapes). A few comments are also made about techniques, still under development in laboratories, to improve our understanding of coating formation, such as plasma jet temperature measurements in non-symmetrical conditions, hot gas heat flux, particle flattening and splat formation, and particle evaporation. Then are described the illumination techniques by laser flash of either cold particles (those injected in hot gases, or in a cold spray gun) or liquid injected into hot gases (suspensions or solutions). The possibilities they open to determine the flux and velocities of cold particles or visualize liquid penetration in the core of hot gases are discussed. Afterwards are presented sensors to follow, when spraying hot particles, substrate and coating temperature evolution, the stress development within coatings during the spray process, and the coating thickness. The different uses of these sensors are then described, successively: (i) Measurements limited to particle trajectories, velocities, temperatures, and sizes in different spray conditions: plasma (including transient conditions due to arc root fluctuations in d.c. plasma jets), HVOF, wire arc, cold spray. It is then discussed how such sensor data can be used to achieve a better understanding of the different spray processes, compare experiments to calculations, and improve the reproducibility and reliability of the spray conditions. (ii) Coating monitoring through in-flight measurements coupled with those devoted to coating formation. This is achieved by either maintaining at their set point both in-flight and

  14. Process technology implications of procurement process: some initial observations

    Ellmer, E.; Emmerich, W.; Finkelstein, A


    We report on a study of procurement processes in a large organization. The purpose of the study was to identify problems in the organization's procurement processes and to suggest improvement actions. Procurement processes determine the characteristics of software processes. Procurement processes are themselves complex and amenable to process technology. Cost and scheduling benefits can be realised if procurement and contracting organizations integrate their respective processes...

  15. Learning Determinantal Point Processes

    Kulesza, Alex


    Determinantal point processes (DPPs), which arise in random matrix theory and quantum physics, are natural models for subset selection problems where diversity is preferred. Among many remarkable properties, DPPs offer tractable algorithms for exact inference, including computing marginal probabilities and sampling; however, an important open question has been how to learn a DPP from labeled training data. In this paper we propose a natural feature-based parameterization of conditional DPPs, and show how it leads to a convex and efficient learning formulation. We analyze the relationship between our model and binary Markov random fields with repulsive potentials, which are qualitatively similar but computationally intractable. Finally, we apply our approach to the task of extractive summarization, where the goal is to choose a small subset of sentences conveying the most important information from a set of documents. In this task there is a fundamental tradeoff between sentences that are highly relevant to th...
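A property the abstract relies on is that DPP marginals are tractable determinants: writing K for the marginal kernel, P(S ⊆ Y) = det(K_S). A minimal NumPy sketch under the usual L-ensemble construction (the feature matrix here is made up for illustration; this is not the paper's learned model):

```python
import numpy as np

# Hypothetical item features; L = B B^T is a PSD L-ensemble kernel.
rng = np.random.default_rng(0)
B = rng.normal(size=(5, 3))
L = B @ B.T

# Marginal kernel of the DPP defined by L: K = L (L + I)^{-1}.
K = L @ np.linalg.inv(L + np.eye(5))

def inclusion_prob(K, S):
    """P(S is a subset of the random draw Y) = det(K_S)."""
    idx = np.ix_(list(S), list(S))
    return float(np.linalg.det(K[idx]))

p1 = inclusion_prob(K, [0])      # marginal probability of item 0
p12 = inclusion_prob(K, [0, 1])  # joint marginal of items 0 and 1
```

Because det(K_{ij}) = K_ii K_jj − K_ij², the joint marginal is always at most the product of singleton marginals: the repulsion that makes DPPs natural for diverse subset selection.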

  16. Signal processing unit

    Boswell, J.


    The architecture of the signal processing unit (SPU) comprises a ROM connected to a program bus, and an input-output bus connected to a data bus and register through a pipeline multiplier-accumulator (PMAC) and a pipeline arithmetic logic unit (PALU), each associated with a random access memory (RAM1, RAM2). The system pulse frequency is 20 MHz. The PMAC is further detailed, and has a capability of 20 mega-operations per second. There is also a block diagram for the PALU, showing interconnections between the register block (RBL), separator for bus (BS), register (REG), shifter (SH) and combination unit. The first and second RAMs have formats of 64*16 and 32*32 bits, respectively. Further data are a 5-V power supply and 2.5-micron n-channel silicon-gate MOS technology with about 50000 transistors.

  17. Forward Osmosis Process

    Duan, Jintang


    A process that can alleviate the internal concentration polarization and can enhance membrane performance of a forward osmosis system includes the steps of passing a fluid in a forward osmosis system from a feed solution with a first osmotic pressure, through a membrane into a draw solution comprising a draw solute with a second osmotic pressure, where the first osmotic pressure is lower than the second osmotic pressure, the membrane includes an active layer and a support layer, and the membrane is oriented such that the active layer of the membrane faces a draw side, and the support layer faces a feed side; and applying an external force to the fluid on the feed side of the membrane.
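The driving force described in the claim is the osmotic pressure difference across the membrane. A back-of-the-envelope sketch of the idealized water flux, assuming van 't Hoff osmotic pressures (π = iCRT) and a hypothetical water permeability A; this neglects the internal concentration polarization the patent aims to alleviate:

```python
R_BAR = 0.083145  # gas constant, L·bar/(mol·K)

def vant_hoff_pressure(molarity, ions_per_formula, temp_k=298.0):
    """Osmotic pressure in bar via van 't Hoff: pi = i * C * R * T."""
    return ions_per_formula * molarity * R_BAR * temp_k

def water_flux(A, pi_draw, pi_feed):
    """Idealized forward-osmosis water flux J_w = A * (pi_draw - pi_feed),
    neglecting internal concentration polarization and hydraulic pressure."""
    return A * (pi_draw - pi_feed)

pi_draw = vant_hoff_pressure(0.6, 2)   # e.g. 0.6 M NaCl draw solution
pi_feed = vant_hoff_pressure(0.1, 2)   # e.g. 0.1 M NaCl feed solution
Jw = water_flux(1.0, pi_draw, pi_feed)  # A = 1 L/(m^2·h·bar), hypothetical
```

With the draw pressure higher than the feed pressure, water flows from feed to draw; the claimed external force on the feed side adds to this driving force.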

  18. Catalyzing alignment processes

    Lauridsen, Erik Hagelskjær; Jørgensen, Ulrik


    This paper describes how environmental management systems (EMS) spur the circulation of processes that support the constitution of environmental issues as specific environmental objects and objectives. EMS catalyzes alignment processes that produce coherence among the different elements involved…, the networks of environmental professionals that work in the environmental organisation, in consulting and regulatory enforcement, and dominating business cultures. These have previously been identified in the literature as individually significant in relation to the evolving environmental agendas…. They are here used to describe the context in which environmental management is implemented. Based on findings from contributions to a research program studying the implementation and impact of EMS in different settings, we highlight the diverse roles that these systems play in the Thai context. EMS may over…

  19. Plutonium dissolution process

    Vest, M.A.; Fink, S.D.; Karraker, D.G.; Moore, E.N.; Holcomb, H.P.


    A two-step process for dissolving Pu metal is disclosed in which the two steps can be carried out sequentially or simultaneously. Pu metal is exposed to a first mixture of 1.0-1.67 M sulfamic acid and 0.0025-0.1 M fluoride, the mixture having been heated to 45-70 °C. The mixture will dissolve a first portion of the Pu metal but leave a portion of the Pu in an oxide residue. Then, a mineral acid and additional fluoride are added to dissolve the residue. Alternatively, nitric acid between 0.05 and 0.067 M is added to the first mixture to dissolve the residue as it is produced. Hydrogen released during the dissolution is diluted with nitrogen.

  20. Additive Gaussian Processes

    Duvenaud, David; Rasmussen, Carl Edward


    We introduce a Gaussian process model of functions which are additive. An additive function is one which decomposes into a sum of low-dimensional functions, each depending on only a subset of the input variables. Additive GPs generalize both Generalized Additive Models, and the standard GP models which use squared-exponential kernels. Hyperparameter learning in this model can be seen as Bayesian Hierarchical Kernel Learning (HKL). We introduce an expressive but tractable parameterization of the kernel function, which allows efficient evaluation of all input interaction terms, whose number is exponential in the input dimension. The additional structure discoverable by this model results in increased interpretability, as well as state-of-the-art predictive power in regression tasks.
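The simplest member of the model family described above is the first-order additive kernel: a sum of one-dimensional squared-exponential kernels, one per input dimension. A minimal sketch (lengthscales and data are illustrative; the paper's full parameterization covers all interaction orders, not just first order):

```python
import numpy as np

def se_1d(a, b, lengthscale=1.0):
    """1-D squared-exponential kernel between vectors a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def additive_kernel(X, Z, lengthscales):
    """First-order additive kernel: a sum of 1-D SE kernels, one per
    input dimension, as in additive GPs / Generalized Additive Models."""
    K = np.zeros((X.shape[0], Z.shape[0]))
    for d, ls in enumerate(lengthscales):
        K += se_1d(X[:, d], Z[:, d], ls)
    return K

rng = np.random.default_rng(1)
X = rng.normal(size=(6, 2))
K = additive_kernel(X, X, lengthscales=[1.0, 0.5])
```

A sum of valid kernels is a valid kernel, so K is symmetric positive semidefinite; each diagonal entry equals the number of dimensions, since every 1-D SE kernel evaluates to 1 at zero distance.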

  1. The Integrated Renovation Process

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user-customized methodology based on judiciously selected constructivist and interactive multi-criteria decision-making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). The IRP supports, informs and reassures building owners to decide… they get more time for the cost optimization and the qualitative analysis of the users' needs and behaviours. In order to reach a fossil-free-energy building stock within an acceptable time frame, it is essential that researchers, politicians and the building industry work hand in hand. Indeed, in order… to overcome the financial barriers to energy renovation and bring a new type of building expert into the building renovation sector, cost optimization tools for building renovation have been and can be developed, but new legislation and politico-economic support are still much needed. We present in this report…


    Calkins, G.D.; Bohlmann, E.G.


    A process for the recovery of thorium, uranium, and rare earths from monazite sands is presented. The sands are first digested and dissolved in concentrated NaOH, and the solution is then diluted, causing precipitation of uranium, thorium and rare earth hydroxides. The precipitate is collected and dissolved in HCl, and the pH of this solution is adjusted to about 6, precipitating the hydroxides of thorium and uranium but leaving the rare earths in solution. The rare earths are then separated from the solution by precipitation at a still higher pH. The thorium- and uranium-containing precipitate is redissolved in HNO3 and the two elements are separated by extraction into tributyl phosphate and back extraction with a weakly acidic solution to remove the thorium.

  3. Fractal Poisson processes

    Eliazar, Iddo; Klafter, Joseph


    The Central Limit Theorem (CLT) and Extreme Value Theory (EVT) study, respectively, the stochastic limit-laws of sums and maxima of sequences of independent and identically distributed (i.i.d.) random variables via an affine scaling scheme. In this research we study the stochastic limit-laws of populations of i.i.d. random variables via nonlinear scaling schemes. The stochastic population-limits obtained are fractal Poisson processes which are statistically self-similar with respect to the scaling scheme applied, and which are characterized by two elemental structures: (i) a universal power-law structure common to all limits, and independent of the scaling scheme applied; (ii) a specific structure contingent on the scaling scheme applied. The sum-projection and the maximum-projection of the population-limits obtained are generalizations of the classic CLT and EVT results - extending them from affine to general nonlinear scaling schemes.

  4. Paretian Poisson Processes

    Eliazar, Iddo; Klafter, Joseph


    Many random populations can be modeled as a countable set of points scattered randomly on the positive half-line. The points may represent magnitudes of earthquakes and tornados, masses of stars, market values of public companies, etc. In this article we explore a specific class of such random populations we coin 'Paretian Poisson processes'. This class is elemental in statistical physics—connecting together, in a deep and fundamental way, diverse issues including: the Poisson distribution of the Law of Small Numbers; Paretian tail statistics; the Fréchet distribution of Extreme Value Theory; the one-sided Lévy distribution of the Central Limit Theorem; scale-invariance, renormalization and fractality; resilience to random perturbations.
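A Poisson process on the positive half-line with a power-law intensity can be sampled directly: taking λ(x) = c·x^(−1−α) (one common parameterization, assumed here, not fixed by the abstract), the number of points above a cutoff ε is Poisson with mean Λ(ε) = (c/α)·ε^(−α), and each point is an independent Pareto(α) variable scaled by ε:

```python
import numpy as np

def sample_paretian_poisson(c, alpha, eps, rng):
    """Sample a Poisson process on (eps, inf) with power-law intensity
    lam(x) = c * x**(-1 - alpha).  The count above eps is Poisson with
    mean (c/alpha) * eps**(-alpha); positions are i.i.d. Pareto via
    inverse-CDF sampling of the tail P(X > x) = (eps/x)**alpha."""
    mean_count = (c / alpha) * eps ** (-alpha)
    n = rng.poisson(mean_count)
    u = 1.0 - rng.uniform(size=n)  # in (0, 1], avoids division by zero
    return eps * u ** (-1.0 / alpha)

rng = np.random.default_rng(42)
points = sample_paretian_poisson(c=2.0, alpha=1.0, eps=0.5, rng=rng)
```

With c = 2, α = 1, ε = 0.5, the expected count is Λ(ε) = 4; the heavy Pareto tail of the sampled points is the Paretian structure the abstract connects to Fréchet extremes and one-sided Lévy sums.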

  5. Supplier Evaluation Processes

    Hald, Kim Sundtoft; Ellegaard, Chris


    Purpose – The purpose of this paper is to illuminate how supplier evaluation practices are linked to supplier performance improvements. Specifically, the paper investigates how performance information travelling between the evaluating buyer and the evaluated suppliers is shaped and reshaped...... in the evaluation process. Design/methodology/approach – The paper relies on a multiple, longitudinal case research methodology. The two cases show two companies' efforts in designing, implementing, and using supplier evaluation in order to improve supplier performance. Findings – The findings show how the dynamics...... of representing, reducing, amplifying, dampening, and directing shape and reshape supplier evaluation information. In both companies, evaluation practices were defined, redefined, and re-directed by the involved actors' perception and decision making, as well as organisational structures, IT systems...

  6. Process and plant safety

    Hauptmanns, Ulrich


    Accidents in technical installations are random events. Hence they cannot be totally avoided; only the probability of their occurrence may be reduced and their consequences mitigated. The book proceeds from hazards caused by materials and process conditions to indicating technical and organizational measures for achieving the objectives of reduction and mitigation. Qualitative methods for identifying weaknesses of design and increasing safety, as well as models for assessing accident consequences, are presented. The quantitative assessment of the effectiveness of safety measures is explained. The treatment of uncertainties plays a role there; they stem from the random character of the accident and from lack of knowledge of some of the phenomena to be addressed. The reader is acquainted with the simulation of accidents, safety and risk analyses, and learns how to judge the potential and limitations of mathematical modelling. Risk analysis is applied amongst others to “functional safety” and the determinat...

  7. Advanced powder processing

    Janney, M.A. [Oak Ridge National Lab., TN (United States)


    Gelcasting is an advanced powder forming process. It is most commonly used to form ceramic or metal powders into complex, near-net shapes. Turbine rotors, gears, nozzles, and crucibles have been successfully gelcast in silicon nitride, alumina, nickel-based superalloy, and several steels. Gelcasting can also be used to make blanks that can be green machined to near-net shape and then high fired. Green machining has been successfully applied to both ceramic and metal gelcast blanks. Recently, the authors have used gelcasting to make tooling for metal casting applications. Most of the work has centered on H13 tool steel. They have demonstrated an ability to gelcast and sinter H13 to near net shape for metal casting tooling. Also, blanks of H13 have been cast, green machined into complex shape, and fired. Issues associated with forming, binder burnout, and sintering are addressed.

  8. Welding processes handbook

    Weman, Klas


    Offers an introduction to the range of available welding technologies. This title includes chapters on individual techniques that cover principles, equipment, consumables and key quality issues. It includes material on such topics as the basics of electricity in welding, arc physics, and distortion, and the weldability of particular metals. The first edition of Welding processes handbook established itself as a standard introduction and guide to the main welding technologies and their applications. This new edition has been substantially revised and extended to reflect the latest developments. After an initial introduction, the book first reviews gas welding before discussing the fundamentals of arc welding, including arc physics and power sources. It then discusses the range of arc welding techniques including TIG, plasma, MIG/MAG, MMA and submerged arc welding. Further chapters cover a range of other important welding technologies such as resistance and laser welding, as well as the use of welding techniqu...

  9. Electro Processing Research


    Electroprocessing research, concerned with the fluid dynamics of the electroreduction process and with how the process may be modified to improve the quality of the deposit, was studied. Experimental techniques used in this research include laser Schlieren photography, laser Doppler velocimetry, and frequency spectrum analysis. Projects involve fluid flow studies of zinc plating in aqueous and molten salt electrolytes, study of cell design for magnesium chloride electrolysis, digital signal analysis of manganese electrodeposition in molten chlorides, and electroplating of molybdenum from low-melting salts. It is anticipated that the use of refractory metals as construction materials in engineering will increase. Their electrodeposition from molten salt electrolytes is important in the extraction metallurgy of refractory metals.

  10. Experiencing Historical Processes

    Marchetti, Emanuela


    …are involved in guided tours: the visitors (in this case primary school children), the guides, and museum practitioners responsible for planning exhibitions. Current studies tend to focus on one user group; this means that the proposed solutions do not take into account the needs of the other groups. Instead…, which is discussed in the second paper, based on the framework of apprenticeship in thinking (Rogoff 1990) and play as a resource for conceptual thinking (Vygotsky 1978). Play is also seen as a state of mind (Apter 2007; Sutton-Smith 1997) allowing children to reconfigure the hierarchical relationship… emerging with the guides. Moreover, as discussed in the third paper presented in this thesis, the design process takes into account children's individual needs regarding play and museum experience. Final evaluations with MicroCulture (fourth paper) show that digital technologies allow for compelling…

  11. Continuous coal processing method

    Ryason, P. R.


    A coal pump is provided in which solid coal is heated in the barrel of an extruder under pressure to a temperature at which the coal assumes plastic properties. The coal is continuously extruded, without static zones, using, for example, screw extrusion preferably without venting through a reduced diameter die to form a dispersed spray. As a result, the dispersed coal may be continuously injected into vessels or combustors at any pressure up to the maximum pressure developed in the extrusion device. The coal may be premixed with other materials such as desulfurization aids or reducible metal ores so that reactions occur, during or after conversion to its plastic state. Alternatively, the coal may be processed and caused to react after extrusion, through the die, with, for example, liquid oxidizers, whereby a coal reactor is provided.

  12. Fluorination process using catalysts

    Hochel, R.C.; Saturday, K.A.


    A process is given for converting an actinide compound selected from the group consisting of uranium oxides, plutonium oxides, uranium tetrafluorides, plutonium tetrafluorides and mixtures of said oxides and tetrafluorides, to the corresponding volatile actinide hexafluoride by fluorination with a stoichiometric excess of fluorine gas. The improvement involves conducting the fluorination of the plutonium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3, AgF2 and NiF2, whereby the fluorination is significantly enhanced. The improvement also involves conducting the fluorination of one of the uranium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF3 and AgF2, whereby the fluorination is significantly enhanced.

  13. Fluorination process using catalyst

    Hochel, Robert C.; Saturday, Kathy A.


    A process for converting an actinide compound selected from the group consisting of uranium oxides, plutonium oxides, uranium tetrafluorides, plutonium tetrafluorides and mixtures of said oxides and tetrafluorides, to the corresponding volatile actinide hexafluoride by fluorination with a stoichiometric excess of fluorine gas. The improvement involves conducting the fluorination of the plutonium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF₃, AgF₂ and NiF₂, whereby the fluorination is significantly enhanced. The improvement also involves conducting the fluorination of one of the uranium compounds in the presence of a fluoride catalyst selected from the group consisting of CoF₃ and AgF₂, whereby the fluorination is significantly enhanced.

  14. Catalyst Alloys Processing

    Tan, Xincai


    Catalysts are one of the key materials used for diamond formation at high pressures. Several such catalyst products have been developed and applied in China and around the world. The catalyst alloy most widely used in China is Ni70Mn25Co5 developed at Changsha Research Institute of Mining and Metallurgy. In this article, detailed techniques for manufacturing such a typical catalyst alloy will be reviewed. The characteristics of the alloy will be described. Detailed processing of the alloy will be presented, including remelting and casting, hot rolling, annealing, surface treatment, cold rolling, blanking, finishing, packaging, and waste treatment. An example use of the catalyst alloy will also be given. Industrial experience shows that for the catalyst alloy products, a vacuum induction remelt furnace can be used for remelting, a metal mold can be used for casting, hot and cold rolling can be used for forming, and acid pickling can be used for metal surface cleaning.

  15. Evaluating Discourse Processing Algorithms

    Walker, M A


    In order to take steps towards establishing a methodology for evaluating Natural Language systems, we conducted a case study. We attempted to evaluate two different approaches to anaphoric processing in discourse by comparing the accuracy and coverage of two published algorithms for finding the co-specifiers of pronouns in naturally occurring texts and dialogues. We present the quantitative results of hand-simulating these algorithms; this analysis naturally gives rise to both a qualitative evaluation and recommendations for performing such evaluations in general. We illustrate the general difficulties encountered with quantitative evaluation. These are problems with: (a) allowing for underlying assumptions, (b) determining how to handle underspecifications, and (c) evaluating the contribution of false positives and error chaining.
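    The accuracy/coverage comparison described above is easy to make concrete. A minimal sketch with invented data (not the published algorithms or corpora): accuracy is the fraction of attempted resolutions that are correct, coverage is the fraction of pronouns the algorithm attempts to resolve at all.

```python
# Hypothetical sketch (invented data, not the published algorithms):
# accuracy = correct resolutions / resolutions attempted,
# coverage = resolutions attempted / all pronouns in the gold standard.
def evaluate(predictions, gold):
    """predictions maps pronoun id -> proposed co-specifier, or None
    when the algorithm declines to resolve the pronoun."""
    attempted = {p: a for p, a in predictions.items() if a is not None}
    coverage = len(attempted) / len(gold)
    correct = sum(1 for p, a in attempted.items() if gold[p] == a)
    accuracy = correct / len(attempted) if attempted else 0.0
    return accuracy, coverage

gold = {1: "Mary", 2: "the report", 3: "John"}
algo_a = {1: "Mary", 2: "the report", 3: None}  # cautious: declines item 3
algo_b = {1: "Mary", 2: "John", 3: "John"}      # always answers

print(evaluate(algo_a, gold))  # perfect accuracy, partial coverage
print(evaluate(algo_b, gold))  # full coverage, lower accuracy
```

    A cautious algorithm can score perfect accuracy simply by declining hard cases, which is why reporting both measures, as the case study does, matters.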

  16. Entrepreneurship and Process Studies

    Hjorth, Daniel; Holt, Robin; Steyaert, Chris


    Process studies put movement, change and flow first; to study processually is to consider the world as restless, something underway, becoming and perishing, without end. To understand firms processually is to accept but also – and this is harder perhaps – to absorb this fluidity, to treat a variable as just that, a variable. The resonance with entrepreneurship studies is obvious. If any field is alive to, and fully resonant with, a processual understanding of, for example, the creation of firms, it is entrepreneurship studies. This special issue is an attempt to consider the promise and potential of processual approaches to studying, researching and practising entrepreneurship. The articles in the issue attest to an increasing sensitivity to processual thinking. We argue that appreciating entrepreneurial phenomena processually opens up the field to an understanding of entrepreneurship …

  17. Perception and information processing

    Scholderer, Joachim


    … as consumers, we can only respond to a stimulus if our senses are actually stimulated by it. Psychologically speaking, a stimulus only exists for us once we have formed an internal representation of it. The objective of this chapter is to introduce the systems that are involved in this processing of perceptual information and to characterise the operations they perform. To avoid confusion, it should be stressed that the term "perception" is often used in a colloquial sense in consumer research. In concepts like perceived quality, perceived value, or perceived risk, the modifier "perceived" simply highlights … ("psychophysics") can be considered the birth of experimental psychology. Today, most perception research is carried out in the interdisciplinary field of cognitive neuroscience. Only selected issues have made their way into consumer research. After a short general introduction, we will therefore focus …



    Introduction. The main goal of studying a foreign language is to be able to communicate. The essence of communication is sending and receiving messages and negotiating meaning. During the communication process, learners may meet problems which hinder their understanding. In order to overcome these limitations, it is very important to know and use certain strategies involved in the communication processes. There are three basic activities in the communication process: expressing intentions, interpretation and negotiation. Expressing intentions is giving information. During communication, every speaker has to first send his or her messages, and the listener must decode what he or she has heard. This activity may be called interpretation. During conversation, both listener and speaker must do some negotiation in order to make sure that they understand each other. Negotiation could be called communication exchange.

  19. Business Process Outsourcing

    Doina FOTACHE


    Business Process Outsourcing (BPO) is gaining widespread acceptance throughout the US, Europe, South America and Asia Pacific as the top executives of leading multinationals turn to outsourcing as a strategic management tool for improving corporate performance, profitability and shareholder value. BPO started to emerge a few years ago as a follow-on to IT outsourcing. The concept is not new; BPO is the contracting out of a specific business task. Outsourcing focuses on adding value, typically to non-core and non-complex activities, by buying in best practices and economies of scale. Because it reduces costs, allows focus on core strategic activities and improves customer service, an increasing number of organizations in both the public and the private sector are looking toward BPO as a solution to their needs.

  20. Experimental adaptive process tomography

    Pogorelov, I. A.; Struchalin, G. I.; Straupe, S. S.; Radchenko, I. V.; Kravtsov, K. S.; Kulik, S. P.


    Adaptive measurements were recently shown to significantly improve the performance of quantum state tomography. Utilizing information about the system for the online choice of optimal measurements allows one to reach the ultimate bounds of precision for state reconstruction. In this article we generalize an adaptive Bayesian approach to the case of process tomography and experimentally show its superiority in the task of learning unknown quantum operations. Our experiments with photonic polarization qubits cover all types of single-qubit channels. We also discuss instrumental errors and the criteria for evaluation of the ultimate achievable precision in an experiment. It turns out that adaptive tomography provides a lower noise floor in the presence of strong technical noise.
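    The Bayesian machinery behind adaptive tomography can be illustrated with a toy one-parameter channel. The sketch below is an invented example, not the authors' photonic experiment: it estimates an unknown bit-flip probability by sequential posterior updates on a discrete grid, which is the core update step; the adaptive element in the paper additionally chooses each next measurement to be maximally informative.

```python
# Toy illustration (invented, not the authors' experiment): the Bayesian
# update at the heart of adaptive process tomography, estimating an unknown
# bit-flip probability p of a channel on a discrete grid of candidates.
import random

random.seed(0)
p_true = 0.3
grid = [i / 100 for i in range(101)]      # candidate values of p
posterior = [1 / len(grid)] * len(grid)   # flat prior

for _ in range(500):
    flip = random.random() < p_true       # simulated measurement outcome
    likelihood = [p if flip else 1 - p for p in grid]
    posterior = [w * l for w, l in zip(posterior, likelihood)]
    norm = sum(posterior)
    posterior = [w / norm for w in posterior]

estimate = sum(p * w for p, w in zip(grid, posterior))
print(round(estimate, 2))  # posterior mean, concentrated near p_true
```

    Each outcome reweights the candidate parameters by their likelihood; after enough measurements the posterior mean converges on the true channel parameter.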

  1. Epoxidation catalyst and process

    Linic, Suljo; Christopher, Phillip


    Disclosed herein is a catalytic method of converting alkenes to epoxides. This method generally includes reacting alkenes with oxygen in the presence of a specific silver catalyst under conditions suitable to produce a yield of the epoxides. The specific silver catalyst is a silver nanocrystal having a plurality of surface planes, a substantial portion of which is defined by Miller indices of (100). The reaction is performed by charging a suitable reactor with this silver catalyst and then feeding the reactants to the reactor under conditions to carry out the reaction. The reaction may be performed in batch, or as a continuous process that employs a recycle of any unreacted alkenes. The specific silver catalyst has unexpectedly high selectivity for epoxide products. Consequently, this general method (and its various embodiments) will result in extraordinarily high epoxide yields heretofore unattainable.
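    Why selectivity translates directly into yield can be seen from the elementary relation between conversion, selectivity and yield; the numbers below are generic illustrations, not data from the patent.

```python
# Generic relation, not data from the patent: fractional epoxide yield is
# alkene conversion times selectivity to the epoxide, so a catalyst with
# unusually high selectivity raises the attainable yield at any conversion.
def epoxide_yield(conversion, selectivity):
    return conversion * selectivity

# Same conversion, different selectivity (illustrative numbers):
print(epoxide_yield(0.50, 0.90))  # 0.45
print(epoxide_yield(0.50, 0.25))  # 0.125
```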

  2. Innovation Processes and Closure

    Darsø, Lotte; Austin, Robert


    The article describes, discusses and illustrates a wide range of innovation-process models, focusing on how (and when) innovative teams handle stopping the process. Sometimes closure is forced, for example by deadlines; at other times a clear crystallisation of a concept occurs. New types of models are also identified in which the development process is kept open because the product never becomes "finished". Finally, the innovation process is compared with artistic and creative processes.

  3. Uncloaking the Scientific Process

    Leitzell, K.; Meier, W.


    Since April 2008, NSIDC has offered daily updates of sea ice data on our Arctic Sea Ice News & Analysis Web page. The images provide near-real-time data to the general public and policy makers, accompanied by monthly or more frequent analysis updates. In February 2009, a crucial channel of the Special Sensor Microwave/Imager (SSM/I) sensor on the Defense Meteorological Satellite Program (DMSP) F15 satellite, from which NSIDC was obtaining near-real-time Arctic sea ice data, suddenly failed. The daily image, which is automatically updated, showed a sudden drop in ice extent of over 50,000 square kilometers. Even after we took the images down, skeptical blogs jumped on the event, posting headlines such as “Errors in publicly presented data - Worth blogging about?” and “NSIDC pulls the plug on sea ice data.” In fact, NSIDC data managers and scientists were well aware that the F15 satellite sensor would eventually fail. NSIDC switched to a previously used back-up sensor, F13, and work to transition to a newer sensor on the F17 satellite had been underway for several weeks. While the deluge of questions from readers and bloggers was frustrating to NSIDC communications staff and scientists, it also presented a chance to give readers a window into the scientific process, and specifically into the collection of satellite data. We decided to publish a clear account of the process used to transition between sensors, as well as a basic explanation of the satellites used to measure sea ice data. While most scientists are familiar with the limitations of near-real-time data, the concept is unfamiliar to many in the general public. The Web page includes links to information on near-real-time data, including notes that images sometimes contain missing or erroneous data, and that delays can occur. However, to a skeptical person, the words that scientists use to describe the processing of final data, including “adjustment,”

  4. Foam process models.

    Moffat, Harry K.; Noble, David R.; Baer, Thomas A. (Procter & Gamble Co., West Chester, OH); Adolf, Douglas Brian; Rao, Rekha Ranjana; Mondy, Lisa Ann


    In this report, we summarize our work on developing a production level foam processing computational model suitable for predicting the self-expansion of foam in complex geometries. The model is based on a finite element representation of the equations of motion, with the movement of the free surface represented using the level set method, and has been implemented in SIERRA/ARIA. An empirically based time- and temperature-dependent density model is used to encapsulate the complex physics of foam nucleation and growth in a numerically tractable model. The change in density with time is at the heart of the foam self-expansion as it creates the motion of the foam. This continuum-level model uses an homogenized description of foam, which does not include the gas explicitly. Results from the model are compared to temperature-instrumented flow visualization experiments giving the location of the foam front as a function of time for our EFAR model system.

  5. Mulighedsbetingelser for transkulturelle processer

    Petersen, Karen Bjerg


    An increasingly neoliberal discourse, not only in international education policy over recent decades but also in Scandinavian countries such as Denmark in the 2000s, indicates significant changes in the understanding of the purpose of cultural processes such as learning and teaching. Exemplified through … learning processes, but also appears to have changed concrete cultural learning processes in an undesirable direction. Since a neoliberal education policy is relatively new in Denmark, the purpose of the article is 1. to discuss the development in the concrete case and compare it with international tendencies in e.g. …

  6. Weather Information Processing


    Science Communications International (SCI), formerly General Science Corporation, has developed several commercial products based upon experience acquired as a NASA Contractor. Among them are METPRO, a meteorological data acquisition and processing system, which has been widely used, RISKPRO, an environmental assessment system, and MAPPRO, a geographic information system. METPRO software is used to collect weather data from satellites, ground-based observation systems and radio weather broadcasts to generate weather maps, enabling potential disaster areas to receive advance warning. GSC's initial work for NASA Goddard Space Flight Center resulted in METPAK, a weather satellite data analysis system. METPAK led to the commercial METPRO system. The company also provides data to other government agencies, U.S. embassies and foreign countries.

  7. Irradiation and food processing.

    Sigurbjörnsson, B; Loaharanu, P


    After more than four decades of research and development, food irradiation has been demonstrated to be safe, effective and versatile as a process of food preservation, decontamination or disinfection. Its various applications cover: inhibition of sprouting of root crops; insect disinfestation of stored products, fresh and dried food; shelf-life extension of fresh fruits, vegetables, meat and fish; destruction of parasites and pathogenic micro-organisms in food of animal origin; decontamination of spices and food ingredients, etc. Such applications provide consumers with an increase in the variety, volume and value of food. Although regulations on food irradiation in different countries are largely unharmonized, national authorities have shown increasing recognition and acceptance of this technology based on the Codex Standard for Irradiated Foods and its associated Code of Practice. Harmonization of national legislation represents an important prerequisite to international trade in irradiated food. Consumers at large are still not aware of the safety and benefits that food irradiation has to offer. Thus, national and international organizations, the food industry, trade associations and consumer unions have important roles to play in introducing this technology based on its scientific merits. Public acceptance of food irradiation may be slow at the beginning, but should increase at a faster rate in the foreseeable future when consumers are well informed of the safety and benefits of this technology in comparison with existing ones. Commercial application of food irradiation has already started in 18 countries. The volume of food or ingredients treated on a commercial scale varies from country to country, ranging from several tons of spices to hundreds of thousands of tons of grains per annum. With the increasing interest of national authorities and the food industry in applying the process, it is anticipated that some 25 countries will use some 55 commercial …

  8. Time processing in dyscalculia.

    Cappelletti, Marinella; Freeman, Elliot D; Butterworth, Brian L


    To test whether atypical number development may affect other types of quantity processing, we investigated temporal discrimination in adults with developmental dyscalculia (DD). This also allowed us to test whether number and time may be sub-served by a common quantity system or decision mechanisms: if they do, both should be impaired in dyscalculia, but if number and time are distinct they should dissociate. Participants judged which of two successively presented horizontal lines was longer in duration, the first line being preceded by either a small or a large number prime ("1" or "9") or by a neutral symbol ("#"), or in a third task participants decided which of two Arabic numbers (either "1," "5," "9") lasted longer. Results showed that (i) DD's temporal discriminability was normal as long as numbers were not part of the experimental design, even as task-irrelevant stimuli; however (ii) task-irrelevant numbers dramatically disrupted DD's temporal discriminability the more their salience increased, though the actual magnitude of the numbers had no effect; in contrast (iii) controls' time perception was robust to the presence of numbers but modulated by numerical quantity: therefore small number primes or numerical stimuli seemed to make durations appear shorter than veridical, but longer for larger numerical prime or numerical stimuli. This study is the first to show spared temporal discrimination - a dimension of continuous quantity - in a population with a congenital number impairment. Our data reinforce the idea of a partially shared quantity system across numerical and temporal dimensions, which supports both dissociations and interactions among dimensions; however, they suggest that impaired number in DD is unlikely to originate from systems initially dedicated to continuous quantity processing like time.

  9. Laundry process intensification by ultrasound

    Warmoeskerken, M.M.C.G.; Vlist, van der P.; Moholkar, V.S.; Nierstrasz, V.A.


    In domestic textile laundering processes, mass transfer and mass transport are often rate limiting. Therefore, these processes require a long processing time, large amounts of water and chemicals, and they are energy consuming. In most of these processes, diffusion and convection in the inter-yarn a

  10. Process algebra for Hybrid systems

    Bergstra, J.A.; Middelburg, C.A.


    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg [Process Algebra with Timing, Springer, Chap. 4, 2002] and the process algebra with propositional signals from Baeten and Bergstra [Theoretical Computer

  11. Process algebra for hybrid systems

    Bergstra, J.A.; Middelburg, C.A.


    We propose a process algebra obtained by extending a combination of the process algebra with continuous relative timing from Baeten and Middelburg (Process Algebra with Timing, Springer, Berlin, 2002, Chapter 4) and the process algebra with propositional signals from Baeten and Bergstra (Theoret. Com…

  12. Multivariate supOU processes

    Barndorff-Nielsen, Ole Eiler; Stelzer, Robert


    Univariate superpositions of Ornstein–Uhlenbeck-type processes (OU), called supOU processes, provide a class of continuous time processes capable of exhibiting long memory behavior. This paper introduces multivariate supOU processes and gives conditions for their existence and finiteness of momen...

  13. Perspectives on Multienzyme Process Technology

    Santacoloma, Paloma A.; Woodley, John M.


    There is little doubt that chemical processing of the future will involve an increasing number of biocatalytic processes using more than one enzyme. There are good reasons for developing such innovative biocatalytic processes and interesting new biocatalyst and process options will be introduced....


    O. Honcharova


    The article is devoted to the analysis of the process management approach. The main understandings of the process management approach are researched, and definitions of process and process management are given. The methods of business process improvement are also analyzed, among them fast analysis solution technology (FAST), benchmarking, reprojecting and reengineering. The main results of applying business process improvement are described in terms of reduced cycle time, costs and errors. The tasks and main stages of business process reengineering are also outlined, and its main efficiency results and success factors are determined.

  15. Process correlation analysis model for process improvement identification.

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong


    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used as the base throughout the software process improvement effort. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  16. Process of Petri Nets Extension


    To describe the dynamic semantics of network computing, the concept of process is presented based on a semantic model with variables, resources and relations. Accordingly, the formal definition of process and the mapping rules from the specification of the Petri nets extension to process are discussed in detail. Based on these concepts of process, the specification of dynamic semantics is also constructed as a net system. Finally, to illustrate process intuitively, an example is specified completely.

  17. Process-oriented evaluation of agile business processes


    Agile enterprises are built on agile business processes. At the same time, agile enterprises must be able to use agile business processes to respond rapidly to market opportunities and maintain their competitiveness. But how to evaluate and choose agile business processes is a key problem in building agile enterprises. The paper proposes a goal-driven method and an evaluation architecture for business processes' agility. Furthermore, a four-layer configuring model for agile business processes is developed based on the evaluation architecture; it can evaluate and configure agile business processes among alternatives.

  18. Event-driven process execution model for process virtual machine

    WU Dong-yao; WEI Jun; GAO Chu-shu; DOU Wen-shen


    Current orchestration and choreography process engines only work with dedicated process languages. To address this problem, an Event-driven Process Execution Model (EPEM) was developed. Formalization and mapping principles of the model are presented to guarantee the correctness and efficiency of process transformation. As a case study, the EPEM descriptions of the Web Services Business Process Execution Language (WS-BPEL) are presented, and a Process Virtual Machine (PVM), OncePVM, was implemented in compliance with the EPEM.

  19. Data Processing for Scientists.

    Heumann, K F


    This brief survey of integrated and electronic data processing has touched on such matters as the origin of the concepts, their use in business, machines that are available, indexing problems, and, finally, some scientific uses that surely foreshadow further development. The purpose of this has been to present for the consideration of scientists a point of view and some techniques which have had a phenomenal growth in the business world and to suggest that these are worth consideration in scientific data-handling problems (30). To close, let me quote from William Bamert on the experience of the C. and O. Railroad once more (8, p. 121): "Frankly, we have been asked whether we weren't planning for Utopia-the implication being that everyone except starry-eyed visionaries knows that Utopia is unattainable. Our answer is that of course we are! Has anyone yet discovered a better way to begin program planning of this nature? Our feeling is that compromise comes early enough in the normal order of things."

  20. Controlled processing during sequencing

    Malathi Thothathiri


    Longstanding evidence has identified a role for the frontal cortex in sequencing within both linguistic and non-linguistic domains. More recently, neuropsychological studies have suggested a specific role for the left premotor-prefrontal junction (BA 44/6) in selection between competing alternatives during sequencing. In this study, we used neuroimaging with healthy adults to confirm and extend knowledge about the neural correlates of sequencing. Participants reproduced visually presented sequences of syllables and words using manual button presses. Items in the sequence were presented either consecutively or concurrently. Concurrent presentation is known to trigger the planning of multiple responses, which might compete with one another. Therefore, we hypothesized that regions involved in controlled processing would show greater recruitment during the concurrent than the consecutive condition. Whole-brain analysis showed concurrent > consecutive activation in sensory, motor and somatosensory cortices and notably also in rostral-dorsal anterior cingulate cortex (ACC). Region of interest analyses showed increased activation within left BA 44/6 and correlation between this region’s activation and behavioral response times. Functional connectivity analysis revealed increased connectivity between left BA 44/6 and the posterior lobe of the cerebellum during the concurrent than the consecutive condition. These results corroborate recent evidence and demonstrate the involvement of BA 44/6 and other control regions when ordering co-activated representations.

  1. Coal liquefaction processes

    Baker, N.R.; Blazek, C.F.; Tison, R.R.


    Coal liquefaction is an emerging technology receiving great attention as a possible liquid fuel source. Currently, four general methods of converting coal to liquid fuel are under active development: direct hydrogenation; pyrolysis/hydrocarbonization; solvent extraction; and indirect liquefaction. This work is being conducted at the pilot plant stage, usually with a coal feed rate of several tons per day. Several conceptual design studies have been published recently for large commercial liquefaction plants (with coal feed rates measured in tens of thousands of tons per day), and these reports form the data base for this evaluation. Products from a liquefaction facility depend on the particular method and plant design selected, and these products range from synthetic crude oils up through the lighter hydrocarbon gases, and, in some cases, electricity. Various processes are evaluated with respect to product compositions, thermal efficiency, environmental effects, operating and maintenance requirements, and cost. Because of the large plant capacities of current conceptual designs, it is not clear how, and on what scale, coal liquefaction may be considered appropriate as an energy source for Integrated Community Energy Systems (CES). Development work, both currently under way and planned for the future, should help to clarify and quantify the question of applicability.

  2. Controlled processing during sequencing.

    Thothathiri, Malathi; Rattinger, Michelle


    Longstanding evidence has identified a role for the frontal cortex in sequencing within both linguistic and non-linguistic domains. More recently, neuropsychological studies have suggested a specific role for the left premotor-prefrontal junction (BA 44/6) in selection between competing alternatives during sequencing. In this study, we used neuroimaging with healthy adults to confirm and extend knowledge about the neural correlates of sequencing. Participants reproduced visually presented sequences of syllables and words using manual button presses. Items in the sequence were presented either consecutively or concurrently. Concurrent presentation is known to trigger the planning of multiple responses, which might compete with one another. Therefore, we hypothesized that regions involved in controlled processing would show greater recruitment during the concurrent than the consecutive condition. Whole-brain analysis showed concurrent > consecutive activation in sensory, motor and somatosensory cortices and notably also in rostral-dorsal anterior cingulate cortex. Region of interest analyses showed increased activation within left BA 44/6 and correlation between this region's activation and behavioral response times. Functional connectivity analysis revealed increased connectivity between left BA 44/6 and the posterior lobe of the cerebellum during the concurrent than the consecutive condition. These results corroborate recent evidence and demonstrate the involvement of BA 44/6 and other control regions when ordering co-activated representations.

  3. Radiation processing of polyethylene

    Barlow, A.; Biggs, J. W.; Meeks, L. A.

    This paper covers two areas: (a) the use of high-energy radiation for the synthesis and improvement of polymer properties and (b) the formulation of radiation-curable compounds for automotive/appliance wire applications and high-voltage insulation. The first part discusses the use of gamma radiation for the bulk polymerization of ethylene and the properties of the polymer produced. The use of low-dose radiation to increase polymer molecular weight and modify polydispersity is also described, together with its projected operational cost. An update is provided of the cost savings that can be realized when using radiation-crosslinked heavy duty film, which expands its applications, compared with noncrosslinked materials. The second section of the paper considers the advantages and disadvantages of radiation vs. peroxide curing of wire and cable compounds. The formulation of a radiation-curable, automotive/appliance wire compound is discussed together with the interactions between the various ingredients; i.e., base resin, antioxidants, flame retardant filler, coupling agents, processing aids and radiation to achieve the desired product. In addition, the general property requirements of a radiation-curable polyethylene for high-voltage insulation are discussed; these include crosslinking efficiency, thermal stability, wet tree resistance and satisfactory dielectric properties. Preliminary data generated in the development of a 230 kV radiation-crosslinked polyethylene insulation are included.

  4. Processing of food wastes.

    Kosseva, Maria R


    Every year almost 45 billion kg of fresh vegetables, fruits, milk, and grain products is lost to waste in the United States. According to the EPA, disposing of this costs approximately $1 billion. In the United Kingdom, 20 million tonnes of food waste is produced annually. Every tonne of food waste means 4.5 tonnes of CO₂ emissions. The food wastes are generated largely by the fruit-and-vegetable/olive oil, fermentation, dairy, meat, and seafood industries. The aim of this chapter is to highlight trends in food waste processing technologies over the last 15 years. The chapter consists of three major parts, covering the recovery of added-value products (the upgrading concept), food waste treatment technologies, and food chain management for sustainable food system development. The aim of the final part is to summarize recent research on user-oriented innovation in the food sector, emphasizing the circular structure of a sustainable economy.
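    The UK figures quoted above imply a simple back-of-envelope total:

```python
# Back-of-envelope check of the figures quoted above: 20 million tonnes of
# UK food waste per year at 4.5 tonnes of CO2 per tonne of waste.
uk_food_waste_tonnes = 20_000_000
co2_per_tonne_waste = 4.5
total_co2_tonnes = uk_food_waste_tonnes * co2_per_tonne_waste
print(f"{total_co2_tonnes:,.0f} tonnes CO2 per year")
```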

  5. Maintenance Process Strategic Analysis

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.


The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increased costs, lost market opportunities, and lower profits. These pressures have motivated firms worldwide to explore and embrace proactive maintenance strategies in place of traditional reactive firefighting methods. The traditional view of maintenance has shifted to an overall view that encompasses Overall Equipment Efficiency, stakeholder management and life cycle assessment. From a practical point of view this requires changes in managers' approach to maintenance and in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairing and conserving machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis, based on SWOT analysis, that identifies the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as its weaknesses and threats, so that they can be eliminated or minimized.

  6. Food Processing Control


When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: how and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To solve these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. They developed the hazard analysis and critical control point (HACCP) concept to guard against bacterial contamination. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP into the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.

  7. Hardware/Software Issues for Video Guidance Systems: The Coreco Frame Grabber

    Bales, John W.


The F64 frame grabber is a high-performance video image acquisition and processing board built around the TMS320C40 and TMS34020 processors. The hardware is designed for the 16-bit ISA bus and supports multiple digital or analog cameras. It has an acquisition rate of 40 million pixels per second, with a variable sampling frequency of 510 kHz to 40 MHz. The board has a 4 MB frame buffer memory expandable to 32 MB, and supports simultaneous acquisition and processing. It supports both VGA and RGB displays, and accepts all analog and digital video input standards.

  8. Innovating in the medical device industry - challenges & opportunities ESB 2015 translational research symposium.

    Bayon, Y; Bohner, M; Eglin, D; Procter, P; Richards, R G; Weber, J; Zeugolis, D I


The European Society for Biomaterials 2015 Translational Research Symposium focused on 'Innovating in the Medical Device Industry - Challenges & Opportunities' from different perspectives, i.e., from a non-profit research organisation to a syndicate of small and medium-sized companies and large companies. Lecturers from regulatory consultancies, industry and research institutions described the innovation process and the regulatory pathways (e.g., 510(k), PMA, combination product) towards market approval. The aim of the present article is to summarise and explain the main statements made during the symposium, in terms of challenges and opportunities for medical device industries, in a constantly changing customer and regulatory environment.

  9. Sustainable Process Synthesis-Intensification

    Babi, Deenesh Kavi

…materials (feedstock) and the use of sustainable technologies or processes, which directly impacts and improves sustainability/LCA factors. Process intensification is a concept by which processes, whether conceptual or existing, can be designed or redesigned to achieve more efficient and sustainable designs. Therefore sustainable process design can be achieved by performing process synthesis and process intensification together. The main contribution of this work is the development of a systematic computer-aided multi-scale, multi-level framework for performing process synthesis-intensification that aims to make a process more sustainable than a base case design, which represents either a new or existing process. The framework consists of eight steps (step 1 to step 8) that operate at the unit operation scale and task scale, and four integrated task-phenomena-based steps (IT-PBS.1 to IT-PBS.4)…

  10. Locally Stationary Processes - A Review

    Dahlhaus, Rainer


The article contains an overview of locally stationary processes. At the beginning, time-varying autoregressive processes are discussed in detail, both as a deep example and as an important class of locally stationary processes. In the next section a general framework for time series with time-varying finite-dimensional parameters is discussed, with special emphasis on nonlinear locally stationary processes. Then the paper focuses on linear processes, where a more general theory is possible. First a general definition for linear processes is given and time-varying spectral densities are discussed in detail. Then the Gaussian likelihood theory is presented for locally stationary processes. In the next section the relevance of empirical spectral processes for locally stationary time series is discussed. Empirical spectral processes play a major role in proving theoretical results and provide a deeper understanding of many techniques. The article concludes with an overview of other results for locally stationary processes.
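A time-varying AR(1) process of the kind discussed above is easy to simulate. The sketch below is illustrative only (the coefficient schedule, seed, and window sizes are arbitrary assumptions, not from the review); it shows the local variance changing along rescaled time:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
# Time-varying AR(1): X_t = a(t/T) * X_{t-1} + eps_t, with the coefficient
# a(u) moving smoothly from 0.1 to 0.9 over rescaled time u in [0, 1].
a = np.linspace(0.1, 0.9, T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = a[t] * x[t - 1] + rng.normal()

# Locally the process behaves like a stationary AR(1) with variance
# 1 / (1 - a(u)^2), so the local variance grows as a(u) approaches 1.
early, late = x[:200].var(), x[-200:].var()
print(round(early, 2), round(late, 2))
```

Windowed estimates like `early` and `late` are exactly the kind of local quantities that the locally stationary framework makes rigorous.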

  11. Transforming spatial point processes into Poisson processes using random superposition

    Møller, Jesper; Berthelsen, Kasper Klitgaaard

A spatial point process X is superposed with a complementary spatial point process Y to obtain a Poisson process X∪Y with intensity function β. Underlying this is a bivariate spatial birth-death process (Xt, Yt) which converges towards the distribution of (X, Y). We study the joint distribution of X and Y, and their marginal and conditional distributions. In particular, we introduce a fast and easy simulation procedure for Y conditional on X. This may be used for model checking: given a model for the Papangelou intensity of the original spatial point process, this model is used to generate the complementary process, and the resulting superposition is a Poisson process with intensity function β if and only if the true Papangelou intensity is used. Whether the superposition is actually such a Poisson process can easily be examined using well known results and fast simulation procedures for Poisson processes. We illustrate this approach to model checking.

  12. Design Process Optimization Based on Design Process Gene Mapping

    LI Bo; TONG Shu-rong


The idea of genetic engineering is introduced into the area of product design to improve design efficiency. A method for design process optimization based on the design process gene is proposed through analyzing the correlation between the design process gene and the characteristics of the design process. The concept of the design process gene is analyzed and categorized into five categories, namely the task specification gene, the concept design gene, the overall design gene, the detailed design gene and the processing design gene, in line with the five design phases. The elements and their interactions involved in each kind of design process gene are analyzed, and a design process gene map is drawn, with its structure disclosed on the basis of its function.

  13. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    Mansouri, Seyed Soheil

Process design and process control have been considered as independent problems for many years. In this context, a sequential approach is used where the process is designed first, followed by the control design. However, this sequential approach has its limitations related to dynamic constraint violations, for example, infeasible operating points, process overdesign or under-performance. Therefore, by using this approach, a robust performance is not always guaranteed. Furthermore, process design decisions can influence process control and operation. To overcome these limitations, an alternative approach considers design and control together for chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems…

  14. Business Process Innovation using the Process Innovation Laboratory

    Møller, Charles

…process innovation (BPI) in future organizations. There is a significant body of knowledge on various aspects of process innovation, e.g. on conceptual modeling, business processes, supply chains and enterprise systems. Still, an overall comprehensive and consistent theoretical framework with guidelines for practical applications has not been identified. The aim of this paper is to establish a conceptual framework for business process innovation in the supply chain based on advanced enterprise systems. The main approach to business process innovation in this context is to create a new methodology for exploring process models and patterns of applications. The paper thus presents a new concept for business process innovation called the process innovation laboratory, a.k.a. the ?-Lab. The ?-Lab is a comprehensive framework for BPI using advanced enterprise systems. The ?-Lab is a collaborative workspace…

  15. Markov reward processes

    Smith, R. M.


Numerous applications in the area of computer system analysis can be effectively studied with Markov reward models. These models describe the behavior of the system with a continuous-time Markov chain, where a reward rate is associated with each state. In a reliability/availability model, up states may have reward rate 1 and down states reward rate 0 associated with them. In a queueing model, the number of jobs of a certain type in a given state may be the reward rate attached to that state. In a combined model of performance and reliability, the reward rate of a state may be the computational capacity, or a related performance measure. Expected steady-state reward rate and expected instantaneous reward rate are clearly useful measures of the Markov reward model. More generally, the distribution of accumulated reward or time-averaged reward over a finite time interval may be determined from the solution of the Markov reward model. This information is of great practical significance in situations where the workload can be well characterized (deterministically, or by continuous functions, e.g., distributions). The design process in the development of a computer system is an expensive and long-term endeavor. For aerospace applications the reliability of the computer system is essential, as is the ability to complete critical workloads in a well defined real time interval. Consequently, effective modeling of such systems must take into account both performance and reliability. This fact motivates our use of Markov reward models to aid in the development and evaluation of fault tolerant computer systems.
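For the two-state availability model described above (an up state with reward rate 1, a down state with reward rate 0), the expected steady-state reward rate can be computed directly from the generator matrix. A minimal sketch, with hypothetical failure and repair rates:

```python
import numpy as np

# Hypothetical 2-state availability model: state 0 = up (reward 1),
# state 1 = down (reward 0); lam = failure rate, mu = repair rate.
lam, mu = 0.01, 0.5
Q = np.array([[-lam, lam],
              [  mu, -mu]])       # CTMC generator matrix
reward = np.array([1.0, 0.0])

# Steady-state distribution pi solves pi @ Q = 0 with pi summing to 1;
# stack the normalization row and solve by least squares (exact here).
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi @ reward        # expected steady-state reward rate
print(round(availability, 4))     # → 0.9804
```

For this model the result reduces to the familiar closed form mu / (lam + mu).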

  16. Flash Lidar Data Processing

    Bergkoetter, M. D.; Ruppert, L.; Weimer, C. S.; Ramond, T.; Lefsky, M. A.; Burke, I. C.; Hu, Y.


Late last year, a prototype Flash LIDAR instrument flew on a series of airborne tests to demonstrate its potential for improved vegetation measurements. The prototype is a precursor to the Electronically Steerable Flash LIDAR (ESFL) currently under development at Ball Aerospace and Technology Corp. with funding from the NASA Earth Science Technology Office. ESFL may soon significantly expand our ability to measure vegetation and forests and better understand the extent of their role in global climate change and the carbon cycle - all critical science questions relating to the upcoming NASA DESDynI and ESA BIOMASS missions. In order to more efficiently exploit data returned from the experimental Flash Lidar system and plan for data exploitation from future flights, Ball funded a graduate student project (through the Ball Summer Intern Program, summer 2009) to develop and implement algorithms for post-processing of the 3-Dimensional Flash Lidar data. This effort included developing autonomous algorithms to resample the data to a uniform rectangular grid, geolocation of the data, and visual display of large swaths of data. The resampling, geolocation, surface hit detection, and aggregation of frame data are implemented with new MATLAB code, and the efficient visual display is achieved with free commercial viewing software. These efforts directly support additional test flights planned as early as October 2009, including possible flights over Niwot Ridge, CO, for which there is ICESat data, and a sea-level coastal area in California to test the effect of higher altitude (above ground level) on the divergence of the beams and the beam spot sizes.

  17. Representative process sampling - in practice

    Esbensen, Kim; Friis-Pedersen, Hans Henrik; Julius, Lars Petersen


Didactic data sets representing a range of real-world processes are used to illustrate "how to do" representative process sampling and process characterisation. The selected process data lead to diverse variogram expressions with different systematics (no range vs. important ranges; trends and/or periodicity; different nugget effects; and process variations ranging from less than one lag to the full variogram lag). Variogram data analysis leads to a fundamental decomposition into 0-D sampling vs. 1-D process variances, based on the three principal variogram parameters: range, sill and nugget effect…
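A minimal sketch of the variogram computation behind this kind of process characterisation (the data are simulated and the lag range is an arbitrary choice, not from the paper): a flat variogram signals a pure nugget effect with no range, while an ever-rising one signals a trend-like process.

```python
import numpy as np

def empirical_variogram(x, max_lag):
    """Semivariogram gamma(h) = 0.5 * mean((x[i+h] - x[i])^2) for lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    return np.array([0.5 * np.mean((x[h:] - x[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
noise = rng.normal(size=2000)             # no range: flat at the process variance
walk = np.cumsum(rng.normal(size=2000))   # trend-like: keeps rising with lag

g_noise = empirical_variogram(noise, 20)
g_walk = empirical_variogram(walk, 20)
print(g_noise.round(2))
print(g_walk.round(1))
```

Reading off where `g_walk`-style curves flatten (the range) and where `g_noise`-style curves sit (sill and nugget) is exactly the decomposition into 0-D sampling vs. 1-D process variance discussed above.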

  18. An improved approach for process monitoring in laser material processing

    König, Hans-Georg; Pütsch, Oliver; Stollenwerk, Jochen; Loosen, Peter


Process monitoring is used in many different laser material processes due to the demand for reliable and stable processes. Among different methods, on-axis process monitoring offers multiple advantages. To observe a laser material process it is unavoidable to choose an observation wavelength different from the one used for material processing; otherwise the light of the processing laser would outshine the image of the process. With a different wavelength, lateral chromatic aberration occurs in optical systems with scanning units and f-theta lenses that are not chromatically corrected. These aberrations lead to a truncated image of the process on the camera or the pyrometer, respectively, and hence to adulterated measurements and unsatisfactory images of the process. A new approach for solving the problem of field-dependent lateral chromatic aberration in process monitoring is presented. The scanner-based optical system is reproduced in a simulation environment to predict the occurring lateral chromatic aberrations, and a second deflecting system is integrated into the system. Using the simulation, a predictive control is designed that uses the additional deflecting system to introduce reverse lateral deviations in order to compensate the lateral effect of chromatic aberration. This paper illustrates the concept and the implementation of the predictive control used to eliminate lateral chromatic aberrations in process monitoring, the simulation on which the system is based, the optical system, as well as the control concept.

  19. Process mining discovery, conformance and enhancement of business processes

    van der Aalst, Wil M P


    The first to cover this missing link between data mining and process modeling, this book provides real-world techniques for monitoring and analyzing processes in real time. It is a powerful new tool destined to play a key role in business process management.

  20. Composable Data Processing in Environmental Science - A Process View

    Wombacher, A.


Data processing in environmental science is essential for doing science. The heterogeneity of data sources, data processing operations and infrastructures results in a lot of manual data and process integration work done by each scientist individually. This is very inefficient and time consuming…

  1. Process intensification: a balance between product and process innovation

    Graaff, M.P. de; Swinkels, P.


Martijn de Graaff of TNO and Pieter Swinkels of TU Delft discuss the challenges of implementing process intensification in new product and process innovation. The Delft Product & Process Design Institute at Delft University of Technology in the Netherlands (TU Delft) has seen over 100 case studies…

  2. Arctic Summer Ice Processes

    Holt, Benjamin


The primary objective of this study is to estimate the flux of heat and freshwater resulting from sea ice melt in the polar seas. The approach taken is to examine the decay of sea ice in the summer months primarily through the use of spaceborne Synthetic Aperture Radar (SAR) imagery. The improved understanding of the dynamics of the melt process can be usefully combined with ice thermodynamic and upper ocean models to form more complete models of ice melt. Models indicate that more heat is absorbed in the upper ocean when the ice cover is composed of smaller rather than larger floes and when there is more open water. Over the course of the summer, floes disintegrate by physical forcing and heating, melting into smaller and smaller sizes. By measuring the change in distribution of floes together with open water over a summer period, we can make estimates of the amount of heating by region and time. In a climatic sense, these studies are intended to improve the understanding of the Arctic heat budget, which can then eventually be incorporated into improved global climate models. This work has two focus areas. The first is examining the detailed effect of storms on floe size and open water. A strong Arctic low pressure storm has been shown to loosen up the pack ice, increase the open water concentration well into the pack ice, and change the distribution of floes toward fewer and smaller floes. This suggests episodic melting and the increased importance of horizontal (lateral) melt during storms. The second focus area is related to an extensive ship-based experiment that recently took place in the Arctic called Surface Heat Budget of the Arctic (SHEBA). An icebreaker was placed purposely into the older pack ice north of Alaska in September 1997. The ship served as the base for experimenters who deployed extensive instrumentation to measure the atmosphere, ocean, and ice during a one-year period. My experiment will be to derive similar measurements (floe size, open water…

  3. Human Assisted Assembly Processes



Automatic assembly sequencing and visualization tools are valuable in determining the best assembly sequences, but without Human Factors and Figure Models (HFFMs) it is difficult to evaluate or visualize human interaction. In industry, accelerating technological advances and shorter market windows have forced companies to turn to an agile manufacturing paradigm. This trend has promoted computerized automation of product design and manufacturing processes, such as automated assembly planning. However, all automated assembly planning software tools assume that the individual components fly into their assembled configuration and generate what appear to be perfectly valid operations, but in reality the operations cannot physically be carried out by a human. Similarly, human figure modeling algorithms may indicate that assembly operations are not feasible and consequently force design modifications; however, if they had the capability to quickly generate alternative assembly sequences, they might have identified a feasible solution. To solve this problem HFFMs must be integrated with automated assembly planning to allow engineers to verify that assembly operations are possible and to see ways to make the designs even better. Factories will very likely put humans and robots together in cooperative environments to meet the demands for customized products, for purposes including robotic and automated assembly. For robots to work harmoniously within an integrated environment with humans, the robots must have cooperative operational skills. For example, in a human-only environment, humans may tolerate collisions with one another if they do not cause much pain. This level of tolerance may or may not apply to robot-human environments. Humans expect that robots will be able to operate and navigate in their environments without collisions or interference. The ability to accomplish this is linked to the sensing capabilities available. Current work in the field of cooperative…

  4. [In Process Citation].

    Stahnisch, Frank W


Since the middle of the nineteenth century, neurophysiological researchers such as Theodor Fechner (1801-1887), Wilhelm Wundt (1832-1920), or Maximilian Ruppert Franz von Frey (1852-1932) started to analyze the causes, propagation, and perception of "pain" in the nervous system through the systematic use of experimental laboratory investigations. In particular, Theodor Fechner's groundbreaking works made contemporary neurophysiologists aware of the potential inclusion of psychological and subjective perceptions as a respectable object of experimental study in mid-nineteenth-century laboratories and clinical wards. Wilhelm Wundt frequently crossed the intersections between animal and human subject research and opened up many theoretical discussions, which also incorporated pluridisciplinary perspectives. On the research side, Wundt worked with many experimental physiological methods, developed theoretical psychophysiological considerations, and provided a detailed philosophical analysis of the new experimental findings and the subjective accounts of pain perceptions in his test persons, among many other experimental and investigative approaches. While each of these neurophysiologists' research programs has been extensively studied in its own right, their mutual contributions to modern pain research and their impact on this emerging interdisciplinary field of biomedical, psychophysiological and philosophical studies have so far not been sufficiently analyzed from a historiographical perspective. This also applies to the highly sophisticated instruments and apparatuses that they applied to the study of pain, which Maximilian von Frey used further in the medical wards at the Fin de Siècle. These instruments were applied to many patients with acute or chronic pain disorders. In a way, the substantial time lag between early laboratory research and the application of these findings in the medical clinics of the time could also be explained as a process of newly…

  5. Process Correlation Analysis Model for Process Improvement Identification

    Su-jin Choi


…software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. The model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  6. Process waste assessment: Color print processing (RA-4)

    Catlett, P.


The Kodak RA-4 process is used to develop prints and overhead transparencies from photographic negatives. The assessment was based on usage, effluent discharge, and final disposition of waste generated by the process. Two options explored were bleach-fix regeneration and conversion to a digital image processing system. The RA-4 process is environmentally sound and generates a relatively small amount of waste. The bleach-fix option would provide only a small effluent reduction. The digital imaging conversion option, if fully implemented, could greatly reduce waste generated in the photo lab.

  7. Nontraditional machining processes research advances


    Nontraditional machining employs processes that remove material by various methods involving thermal, electrical, chemical and mechanical energy or even combinations of these. Nontraditional Machining Processes covers recent research and development in techniques and processes which focus on achieving high accuracies and good surface finishes, parts machined without burrs or residual stresses especially with materials that cannot be machined by conventional methods. With applications to the automotive, aircraft and mould and die industries, Nontraditional Machining Processes explores different aspects and processes through dedicated chapters. The seven chapters explore recent research into a range of topics including laser assisted manufacturing, abrasive water jet milling and hybrid processes. Students and researchers will find the practical examples and new processes useful for both reference and for developing further processes. Industry professionals and materials engineers will also find Nontraditional M...

  8. National Automated Conformity Inspection Process -

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  9. Sustainable development through process innovation

    Sleeman, E.; Oonk, J.; Krist-Spit, mw. C.E.


Innovation of processes is one of the cornerstones of sustainable development. Innovation may be characterized by the degree of intervention in the existing base materials-processes-products chains. Four types of innovation are distinguished [1]: …

  10. Chemometrics applications in biotech processes: assessing process comparability.

    Bhushan, Nitish; Hadpe, Sandip; Rathore, Anurag S


    A typical biotech process starts with the vial of the cell bank, ends with the final product and has anywhere from 15 to 30 unit operations in series. The total number of process variables (input and output parameters) and other variables (raw materials) can add up to several hundred variables. As the manufacturing process is widely accepted to have significant impact on the quality of the product, the regulatory agencies require an assessment of process comparability across different phases of manufacturing (Phase I vs. Phase II vs. Phase III vs. Commercial) as well as other key activities during product commercialization (process scale-up, technology transfer, and process improvement). However, assessing comparability for a process with such a large number of variables is nontrivial and often companies resort to qualitative comparisons. In this article, we present a quantitative approach for assessing process comparability via use of chemometrics. To our knowledge this is the first time that such an approach has been published for biotech processing. The approach has been applied to an industrial case study involving evaluation of two processes that are being used for commercial manufacturing of a major biosimilar product. It has been demonstrated that the proposed approach is able to successfully identify the unit operations in the two processes that are operating differently. We expect this approach, which can also be applied toward assessing product comparability, to be of great use to both the regulators and the industry which otherwise struggle to assess comparability.
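One common chemometric route to such comparability assessments is projection onto a principal component (PCA) model of the reference process, flagging batches with large Hotelling-T²- and Q-type distances. The sketch below is a generic illustration of that idea, not the authors' actual method; the data, dimensions and component count are all made up:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical batch records: rows = batches, columns = process variables
# (titer, pH, temperature, ...), already standardized.
ref = rng.normal(size=(30, 8))   # reference process (e.g. Phase III)
new = rng.normal(size=(10, 8))   # candidate process (e.g. commercial)
new[:, 3] += 4.0                 # one unit operation runs differently

# Two-component PCA model of the reference process via SVD on centred data.
mu = ref.mean(0)
U, S, Vt = np.linalg.svd(ref - mu, full_matrices=False)
P = Vt[:2].T                     # loadings of the first two components

def distance_stats(batches):
    """Mean Hotelling-like T^2 (inside the model plane) and Q/SPE (residual)."""
    Z = batches - mu
    scores = Z @ P
    resid = Z - scores @ P.T
    t2 = (scores ** 2 / (S[:2] ** 2 / (len(ref) - 1))).sum(1)
    q = (resid ** 2).sum(1)
    return t2.mean(), q.mean()

print(distance_stats(ref))
print(distance_stats(new))       # larger distances -> processes differ
```

In practice the per-variable loadings of the flagged direction would then point to which unit operation is operating differently, mirroring the diagnostic use described in the abstract.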


    B.P. Mahesh


Quality has become one of the most important customer decision factors in the selection among competing products and services. Consequently, understanding and improving quality is a key factor leading to business success, growth and an enhanced competitive position. Hence a quality improvement program should be an integral part of the overall business strategy. According to TQM, the effective way to improve the quality of a product or service is to improve the process used to build the product. Hence, TQM focuses on process rather than results, as the results are driven by the processes. Many techniques are available for quality improvement. Statistical Process Control (SPC) is one such TQM technique which is widely accepted for analyzing quality problems and improving the performance of the production process. This article illustrates the step-by-step procedure adopted at a soap manufacturing company to improve quality by reducing process variability using Statistical Process Control.
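As a minimal illustration of the SPC technique referred to above (not the plant's actual data), a Shewhart X-bar chart can be built from subgroup means and ranges; the weights and subgroup size below are invented, and A2 = 0.577 is the standard control-chart constant for subgroups of 5:

```python
import numpy as np

# Hypothetical subgroup data: 20 samples of 5 consecutive soap-bar weights (grams).
rng = np.random.default_rng(7)
subgroups = rng.normal(loc=100.0, scale=0.8, size=(20, 5))

xbar = subgroups.mean(axis=1)             # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()   # average subgroup range

# Shewhart X-bar chart limits: centre line +/- A2 * R-bar (A2 = 0.577 for n = 5).
A2 = 0.577
centre = xbar.mean()
ucl = centre + A2 * rbar
lcl = centre - A2 * rbar

out_of_control = (xbar > ucl) | (xbar < lcl)
print(f"CL={centre:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  signals={out_of_control.sum()}")
```

Points outside the limits (or systematic runs inside them) indicate special-cause variation, which is exactly what the variability-reduction effort at the soap plant would target.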

  12. Analysis of Hospital Processes with Process Mining Techniques.

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises


Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions in the management of hospital activities.
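The first step of event-log analysis can be sketched in a few lines: group events into per-case traces, then count the process variants. The hospital-style log below is invented for illustration and unrelated to the HIS discussed in the paper:

```python
from collections import Counter, defaultdict

# Hypothetical event log: (case_id, activity) pairs, already in timestamp order.
log = [
    (1, "admit"), (1, "triage"), (1, "treat"), (1, "discharge"),
    (2, "admit"), (2, "triage"), (2, "lab"),   (2, "treat"), (2, "discharge"),
    (3, "admit"), (3, "triage"), (3, "treat"), (3, "discharge"),
]

# Group events into one trace per case, then count distinct process variants.
traces = defaultdict(list)
for case, activity in log:
    traces[case].append(activity)

variants = Counter(tuple(t) for t in traces.values())
for variant, n in variants.most_common():
    print(n, " -> ".join(variant))
# prints:
# 2 admit -> triage -> treat -> discharge
# 1 admit -> triage -> lab -> treat -> discharge
```

Discovery, conformance and enhancement algorithms all start from variant counts of this kind, which is why log quality is the central concern of the component described above.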

  13. Electrochemical process of titanium extraction



A wide variety of processes are being pursued by researchers for cost effective extraction of titanium metal. Electrochemical processes are promising due to their simplicity and lower capital intensity. Some of the promising electrochemical processes for titanium extraction are reviewed, and the results of laboratory-scale experiments on electrochemical reduction of TiO2 granules are presented. Some kinetic parameters of the reduction process are discussed, along with the quality improvements achieved in the experimentation.

  14. GPU applications for data processing

Vladymyrov, Mykhailo (LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow, Russian Federation); Aleksandrov, Andrey (LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow, Russian Federation; INFN sezione di Napoli, I-80125 Napoli, Italy); Tioukov, Valeri (INFN sezione di Napoli, I-80125 Napoli, Italy)


    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. New approaches to developing scanning systems require real-time processing of large amounts of data. Methods that use Graphics Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us raise the scanning speed by a factor of nine.

  15. Advanced methods for processing ceramics

    Carter, W.B. [Georgia Institute of Technology, Atlanta, GA (United States)


    Combustion chemical vapor deposition (CCVD) is a flame assisted, open air chemical vapor deposition (CVD) process. The process is capable of producing textured, epitaxial coatings on single crystal substrates using low cost reagents. Combustion chemical vapor deposition is a relatively inexpensive, alternative thin film deposition process with potential to replace conventional coating technologies for certain applications. The goals of this project are to develop the CCVD process to the point that potential industrial applications can be identified and reliably assessed.

  16. Project management of business process

    Jovanović, Slaviša


    This thesis addresses one of the hottest areas in today's software industry: business process management (BPM). Business process management has become an important tool for companies to hone their capabilities and business processes, to adjust to rapid and sudden changes in the marketplace, and to achieve their plans. The thesis is divided into two parts. The first section provides an overview of the scope of business process management through a project approach, historically defined its...

  17. PSE in Pharmaceutical Process Development

    Gernaey, Krist; Cervera Padrell, Albert Emili; Woodley, John


    The pharmaceutical industry is under growing pressure to increase efficiency, both in production and in process development. This paper discusses the use of Process Systems Engineering (PSE) methods in pharmaceutical process development and searches for answers to questions such as: Which PSE...

  18. Rapid thermal processing of semiconductors

    Borisenko, Victor E


    Rapid thermal processing has contributed to the development of single-wafer cluster processing tools and other innovations in integrated circuit manufacturing environments. Borisenko and Hesketh review theoretical and experimental progress in the field, discussing a wide range of materials, processes, and conditions. They thoroughly cover the work of international investigators in the field.

  19. Dynamic similarity in erosional processes

    Scheidegger, A.E.


    A study is made of the dynamic similarity conditions obtaining in a variety of erosional processes. The pertinent equations for each type of process are written in dimensionless form; the similarity conditions can then easily be deduced. The processes treated are: raindrop action, slope evolution, and river erosion. © 1963 Istituto Geofisico Italiano.

  20. Bayesian inference for Hawkes processes

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  1. Present status of processing method

    Kosako, Kazuaki [Sumitomo Atomic Energy Industries Ltd., Tokyo (Japan)


    The present status of processing methods for high-energy nuclear data files was examined. The NJOY94 code is the only one available for this processing. In Japan, present processing with NJOY94 is oriented toward the production of traditional cross section libraries, because a high-energy transport code that would use a high-energy cross section library remains indistinct. (author)

  2. IT Support for Healthcare Processes

    Lenz, R.; Reichert, M.U.


    Patient treatment processes require the cooperation of different organizational units and medical disciplines. In such an environment, optimal process support becomes crucial. Though healthcare processes frequently change, and therefore the separation of the flow logic from the application code se...

  3. Process control in biogas plants

    Holm-Nielsen, Jens Bo; Oleskowicz-Popiel, Piotr


    Efficient monitoring and control of anaerobic digestion (AD) processes are necessary in order to enhance biogas plant performance. The aim of monitoring and controlling the biological processes is to stabilise and optimise the production of biogas. The principles of process analytical technology...

  4. Counting Processes in Simple Addition.

    Svenson, Ola; Hedenborg, Maj-Lene


    The cognitive processes of seven children solving arithmetic problems were accurately classified as reconstructive or reproductive according to the child's verbal report of his thought processes. Classifications of thought processes by means of verbal reports can also be used to improve the analysis of latencies. (SB)

  5. Learning processes across knowledge domains

    Hall-Andersen, Lene Bjerg; Broberg, Ole


    informed by selected perspectives on learning processes and boundary processes was applied on three illustrative vignettes to illuminate learning potentials and shortcomings in boundary processes. Findings - In the engineering consultancy, it was found that while learning did occur in the consultancy...

  6. Perfect simulation of Hawkes processes

    Møller, Jesper; Rasmussen, Jakob Gulddahl


    Our objective is to construct a perfect simulation algorithm for unmarked and marked Hawkes processes. The usual straightforward simulation algorithm suffers from edge effects, whereas our perfect simulation algorithm does not. By viewing Hawkes processes as Poisson cluster processes and using...
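The cluster viewpoint mentioned above can be sketched in a few lines. The following toy simulator (parameter names and values are ours, and it simply truncates clusters at T, so it still suffers the edge effects that the paper's perfect-simulation algorithm avoids) generates immigrants as a homogeneous Poisson process and lets each event spawn offspring recursively:

```python
import math
import random

def poisson(lam, rng):
    """Sample Poisson(lam) via Knuth's method; adequate for small lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_hawkes(mu, alpha, beta, T, rng):
    """Cluster-based Hawkes simulation on [0, T] (truncated, with edge
    effects): immigrants arrive as a rate-mu Poisson process; each event
    spawns Poisson(alpha/beta) offspring with Exp(beta) delays."""
    queue, t = [], 0.0
    while True:                      # immigrants via exponential gaps
        t += rng.expovariate(mu)
        if t > T:
            break
        queue.append(t)
    events = []
    while queue:
        parent = queue.pop()
        events.append(parent)
        for _ in range(poisson(alpha / beta, rng)):
            child = parent + rng.expovariate(beta)
            if child <= T:           # truncation: source of edge effects
                queue.append(child)
    return sorted(events)

rng = random.Random(42)
times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, T=50.0, rng=rng)
print(len(times))
```

Note that alpha/beta < 1 (a subcritical branching ratio) is needed for the clusters to be finite.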

  7. Semantic Processing of Mathematical Gestures

    Lim, Vanessa K.; Wilson, Anna J.; Hamm, Jeff P.; Phillips, Nicola; Iwabuchi, Sarina J.; Corballis, Michael C.; Arzarello, Ferdinando; Thomas, Michael O. J.


    Objective: To examine whether or not university mathematics students semantically process gestures depicting mathematical functions (mathematical gestures) similarly to the way they process action gestures and sentences. Semantic processing was indexed by the N400 effect. Results: The N400 effect elicited by words primed with mathematical gestures…

  8. Process algebra for synchronous communication

    Bergstra, J.A.; Klop, J.W.


    Within the context of an algebraic theory of processes, an equational specification of process cooperation is provided. Four cases are considered: free merge or interleaving, merging with communication, merging with mutual exclusion of tight regions, and synchronous process cooperation. The rewrite
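For orientation, the central merge axiom of ACP, as it is commonly presented (the notation below is the standard textbook one, which may differ from the paper's), splits the merge of two processes into three summands via the auxiliary left-merge and communication-merge operators:

```latex
% Standard ACP merge axiom (notation assumed): the merge of x and y is
% "x acts first" + "y acts first" + "x and y communicate", where
% \lfloor\!\lfloor denotes the left-merge operator and \mid the
% communication merge.
x \parallel y \;=\; (x \mathbin{\lfloor\!\lfloor} y) \;+\; (y \mathbin{\lfloor\!\lfloor} x) \;+\; (x \mid y)
```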

  9. Methods in Astronomical Image Processing

    Jörsäter, S.

    A Brief Introductory Note History of Astronomical Imaging Astronomical Image Data Images in Various Formats Digitized Image Data Digital Image Data Philosophy of Astronomical Image Processing Properties of Digital Astronomical Images Human Image Processing Astronomical vs. Computer Science Image Processing Basic Tools of Astronomical Image Processing Display Applications Calibration of Intensity Scales Calibration of Length Scales Image Re-shaping Feature Enhancement Noise Suppression Noise and Error Analysis Image Processing Packages: Design of AIPS and MIDAS AIPS MIDAS Reduction of CCD Data Bias Subtraction Clipping Preflash Subtraction Dark Subtraction Flat Fielding Sky Subtraction Extinction Correction Deconvolution Methods Rebinning/Combining Summary and Prospects for the Future
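A minimal sketch of two of the CCD reduction steps listed above, bias subtraction and flat fielding, using plain nested lists and invented frame values rather than any AIPS/MIDAS API:

```python
def reduce_ccd(raw, bias, flat):
    """Basic CCD reduction: subtract the bias frame pixel-by-pixel,
    then divide by the flat field normalized to unit mean.
    Frames are nested lists here; real pipelines use arrays."""
    vals = [v for row in flat for v in row]
    mean_flat = sum(vals) / len(vals)          # normalize flat to mean 1
    return [
        [(r - b) / (f / mean_flat) for r, b, f in zip(rr, br, fr)]
        for rr, br, fr in zip(raw, bias, flat)
    ]

# Hypothetical 2x2 frames: constant bias of 10, perfectly flat response.
raw  = [[110.0, 210.0], [160.0, 310.0]]
bias = [[10.0, 10.0], [10.0, 10.0]]
flat = [[1.0, 1.0], [1.0, 1.0]]
print(reduce_ccd(raw, bias, flat))  # [[100.0, 200.0], [150.0, 300.0]]
```

Dark subtraction, sky subtraction, and the other steps in the record extend this same per-pixel pattern.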

  10. Continuous-state branching processes

    Li, Zenghu


    These notes were used in a short graduate course on branching processes the author gave in Beijing Normal University. The following main topics are covered: scaling limits of Galton--Watson processes, continuous-state branching processes, extinction probabilities, conditional limit theorems, decompositions of sample paths, martingale problems, stochastic equations, Lamperti's transformations, independent and dependent immigration processes. Some of the results are simplified versions of those in the author's book "Measure-valued branching Markov processes" (Springer, 2011). We hope these simplified results will set out the main ideas in an easy way and lead the reader to a quick access of the subject.
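For readers new to the subject, the discrete object behind these scaling limits can be sketched in a few lines. The following toy uses an assumed critical offspring law (each individual has 0 or 2 children with equal probability, mean 1) and tracks generation sizes of a Galton-Watson process:

```python
import random

def galton_watson_sizes(max_generations, rng):
    """Generation sizes of a toy Galton-Watson branching process in which
    each individual independently has 0 or 2 children with probability
    1/2 each (a critical offspring law). Illustrative only; the notes
    treat general offspring distributions and their scaling limits."""
    sizes = [1]                                  # start from one ancestor
    for _ in range(max_generations):
        children = sum(2 if rng.random() < 0.5 else 0
                       for _ in range(sizes[-1]))
        sizes.append(children)
        if children == 0:    # extinction: the process stays at 0 forever
            break
    return sizes

rng = random.Random(3)
print(galton_watson_sizes(8, rng))
```

A critical process like this one dies out with probability 1, which is the regime where the continuous-state scaling limits become interesting.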

  11. Optical and digital image processing

    Cristobal, Gabriel; Thienpont, Hugo


    In recent years, Moore's law has fostered the steady growth of the field of digital image processing, though the computational complexity remains a problem for most of the digital image processing applications. In parallel, the research domain of optical image processing has matured, potentially bypassing the problems digital approaches were suffering and bringing new applications. The advancement of technology calls for applications and knowledge at the intersection of both areas but there is a clear knowledge gap between the digital signal processing and the optical processing communities. T

  12. Unifying the Software Process Spectrum


    The Software Process Workshop (SPW 2005) was held in Beijing on May 25-27, 2005. This paper introduces the motivation for organizing the workshop, its theme, and its paper gathering and review process; it then summarizes the main content and insights of the 11 keynote speeches, the 30 regular papers in the five sessions "Process Content", "Process Tools and Metrics", "Process Management", "Process Representation and Analysis", and "Experience Reports", the 8 software development support tool demonstrations, and the closing panel "Where Are We Now? Where Should We Go Next?".

  13. Bayesian inference for Hawkes processes

    Rasmussen, Jakob Gulddahl


    The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
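For reference, a common parametric form of the conditional intensity mentioned above is the exponential-kernel Hawkes intensity (this particular kernel and the symbols mu, alpha, beta are a standard textbook choice, not necessarily the paper's):

```latex
% Exponential-kernel Hawkes conditional intensity (assumed notation):
% baseline rate mu, plus a jump of size alpha at each past event t_i,
% decaying at rate beta.
\lambda^{*}(t) \;=\; \mu + \sum_{t_{i} < t} \alpha \, e^{-\beta (t - t_{i})}
```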

  15. Wet flue gas desulfurization processes

    Hayrunnisa Çavuşoğlu


    The wet flue gas desulfurization process is widely used for the treatment of exhaust gases in power stations. Due to its high effectiveness compared with the other available processes, it has also been the method most preferred by industry. Its high SO2 removal efficiency, the wide applicability of absorption chemicals, and the ease of handling the chemical process, which does not require comprehensive knowledge, are among the main advantages of this process. In this article, various wet flue gas desulfurization processes, such as lime/limestone scrubbing, are discussed.
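For context, the overall limestone scrubbing reaction commonly cited for wet forced-oxidation FGD (a simplified textbook stoichiometry producing gypsum, not taken from the article) is:

```latex
% Overall limestone forced-oxidation scrubbing reaction (simplified):
% SO2 is absorbed, reacts with limestone, and is oxidized to gypsum.
\mathrm{CaCO_{3} + SO_{2} + \tfrac{1}{2}\,O_{2} + 2\,H_{2}O
\;\longrightarrow\; CaSO_{4}\!\cdot\!2\,H_{2}O + CO_{2}}
```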

  16. Guideline Implementation: Processing Flexible Endoscopes.

    Bashaw, Marie A


    The updated AORN "Guideline for processing flexible endoscopes" provides guidance to perioperative, endoscopy, and sterile processing personnel for processing all types of reusable flexible endoscopes and accessories in all procedural settings. This article focuses on key points of the guideline to help perioperative personnel safely and effectively process flexible endoscopes to prevent infection transmission. The key points address verification of manual cleaning, mechanical cleaning and processing, storage in a drying cabinet, determination of maximum storage time before reprocessing is needed, and considerations for implementing a microbiologic surveillance program. Perioperative RNs should review the complete guideline for additional information and for guidance when writing and updating policies and procedures.

  17. IWTU Process Sample Analysis Report

    Nick Soelberg


    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June-August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process; none of these samples were radioactive. The samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  18. Business process modeling in healthcare.

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd


    The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling using the Business Process Modelling Notation (BPMN) standard is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  19. Sustainable Process Synthesis-Intensification

    Babi, Deenesh Kavi; Holtbruegge, Johannes; Lutze, Philip


    Sustainable process design can be achieved by performing process synthesis and process intensification together. This approach first defines a design target through a sustainability analysis and then finds design alternatives that match the target through process intensification. A systematic......, multi-stage framework for process synthesis- intensification that identifies more sustainable process designs has been developed. At stages 1-2, the working scale is at the level of unit operations, where a base case design is identified and analyzed with respect to sustainability metrics. At stages 3...... concepts and the framework are presented together with the results from a case study highlighting the application of the framework to the sustainable design of a production process for dimethyl carbonate....

  20. Machine intelligence and signal processing

    Vatsa, Mayank; Majumdar, Angshul; Kumar, Ajay


    This book comprises chapters on key problems in the machine learning and signal processing arenas. The contents of the book result from the 2014 Workshop on Machine Intelligence and Signal Processing held at the Indraprastha Institute of Information Technology. Traditionally, signal processing and machine learning were considered separate areas of research, but in recent times the two communities have been getting closer. In a very abstract fashion, signal processing is the study of operator design: its contribution has been to devise operators for restoration, compression, etc., while applied mathematicians were more interested in operator analysis. Nowadays signal processing research is gravitating towards operator learning: instead of designing operators based on heuristics (for example, wavelets), the trend is to learn these operators (for example, dictionary learning). Thus, the gap between signal processing and machine learning is fast closing. The 2014 Workshop on Machine Intel...