WorldWideScience

Sample records for design-based stereological analysis

  1. [Simulation and data analysis of stereological modeling based on virtual slices].

    Science.gov (United States)

    Wang, Hao; Shen, Hong; Bai, Xiao-yan

    2008-05-01

To establish a computer-assisted stereological model for simulating the process of slice sectioning and to evaluate the relationship between the section surface and the estimated three-dimensional structure. The model was designed mathematically and implemented as a Win32 application based on MFC, using Microsoft Visual Studio as the IDE, to simulate the infinite process of sectioning and to analyse the data derived from the model. The linearity of the model's fit was evaluated by comparison with the traditional formula. The Win32 software based on this algorithm allowed random sectioning of particles distributed randomly in an ideal virtual cube. The stereological parameters showed very high pass rates (>94.5% and 92%) in homogeneity and independence tests. The density, shape and size data of the sections were tested and conformed to a normal distribution. The output of the model and that of the image analysis system showed statistical correlation and consistency. The algorithm described can be used to evaluate the stereological parameters of the structure of tissue slices.
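
The sectioning simulation described in this record can be sketched in a few lines. A minimal illustration, assuming spherical particles with uniformly random centres in a unit cube and a single random horizontal section plane; the function name and parameter values are hypothetical, not those of the reported software:

```python
import math
import random

def section_profiles(n_spheres=1000, radius=0.02, seed=1):
    """Place spheres with uniformly random centres in a unit cube, cut one
    horizontal plane at a random height, and return the radii of the
    circular profiles produced on the section plane."""
    rng = random.Random(seed)
    z_plane = rng.random()              # height of the section plane
    profiles = []
    for _ in range(n_spheres):
        z_centre = rng.random()         # only the z-coordinate matters here
        d = abs(z_centre - z_plane)     # centre-to-plane distance
        if d < radius:                  # the plane cuts this sphere
            profiles.append(math.sqrt(radius ** 2 - d ** 2))
    return profiles

profiles = section_profiles()
```

On average the plane hits a fraction of about 2×radius of the spheres, and the resulting distribution of profile radii is exactly what stereological formulas invert to recover the 3D size distribution.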

  2. Stereological analysis of spatial structures

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård

The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star-shaped three-dimensional objects using the radial function. It appears that the model is highly flexible in the sense that it can be used to describe objects with arbitrarily irregular surfaces. Results on the distribution of well-known local stereological volume estimators are provided....

  3. Newsletter '77 in stereology

    International Nuclear Information System (INIS)

    Ondracek, G.

    1977-12-01

There are three groups of contributions forming the present Newsletter in Stereology: contributions of a theoretical type, stereological activities in the bio-sciences, and quantitative image analysis in materials science. The report is introduced by two papers treating theoretical problems: the definition of particle size based on total curvature and the definition of pattern recognition categories. There then follows a summarising description and comparison of alternative techniques used to measure and derive stereological parameters in the bio-sciences. The discussion includes sample preparation, semi- and fully automatic measuring procedures, as well as the computation of primary data. The biological part ends by considering the use of these quantitative microscopical methods to investigate and classify foreign compounds inside the human liver stereologically. The materials science part reports on tests made on steel specimens to evaluate the accuracy of automatic microstructural analyses and on the use of image 'erosion' and 'dilatation' to measure microstructural parameters automatically. The latter subject is part of a series on morphology in quantitative metallography started in the previous Newsletter '76. The last paper on materials science considers the use of stereology and microstructural analysis with respect to quality control, choosing WC-Co hardmetals as an example, where stereologically defined microstructural parameters not only serve to describe microstructures quantitatively but also provide a useful tool for determining properties indirectly. (orig.) [de

  4. A review of state-of-the-art stereology for better quantitative 3D morphology in cardiac research.

    Science.gov (United States)

    Mühlfeld, Christian; Nyengaard, Jens Randel; Mayhew, Terry M

    2010-01-01

The aim of stereological methods in biomedical research is to obtain quantitative information about three-dimensional (3D) features of tissues, cells, or organelles from two-dimensional physical or optical sections. With immunogold labeling, stereology can even be used for the quantitative analysis of the distribution of molecules within tissues and cells. Nowadays, a large number of design-based stereological methods offer an efficient quantitative approach to intriguing questions in cardiac research, such as "Is there a significant loss of cardiomyocytes during progression from ventricular hypertrophy to heart failure?" or "Does a specific treatment reduce the degree of fibrosis in the heart?" Nevertheless, the use of stereological methods in cardiac research is rare. The present review article demonstrates how some of the potential pitfalls in quantitative microscopy may be avoided. To this end, we outline the concepts of design-based stereology and illustrate their practical applications to a wide range of biological questions in cardiac research. We hope that the present article will stimulate researchers in cardiac research to incorporate design-based stereology into their study designs, thus promoting unbiased quantitative 3D microscopy.

  5. Newsletter '76 in stereology

    International Nuclear Information System (INIS)

    Ondracek, G.

    1976-08-01

The present newsletter on stereology offers a brief outlook on stereological problems to be solved in the future, compares definitions in pattern recognition and stereology, and presents the main notions of mathematical morphology used in quantitative metallography. This includes the description of the main stereological equations relating the parameters describing the dimensional features to the parameters measured in plane sections, as well as a special type of equation for practical use by which the average fibre length in composite materials can be determined. In this context the methods of particle shape description have been summarised and reviewed, and an example is given of how particle size and shape distributions can be measured statistically by automatic feature analysis of morphometric sections. The introduction of stereological microstructural parameters into microstructure-property equations opens the way to calculating material properties from a stereological microstructure analysis and extends the possibilities of common microstructural quality control. This is demonstrated for WC-Co hard metals. (orig./GSC) [de

  6. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    Science.gov (United States)

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
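
The sampling scheme this record automates can be sketched as follows: a grid of sites with fixed spacing whose origin is uniformly random within one grid period, so every location in the structure has an equal chance of being sampled. A minimal illustration with hypothetical extents and step sizes, not the reported software:

```python
import random

def systematic_sites(extent_x, extent_y, step_x, step_y, seed=None):
    """Generate (x, y) sampling sites for systematic random sampling:
    a grid with fixed spacing whose origin is uniformly random within
    one grid period."""
    rng = random.Random(seed)
    x0 = rng.uniform(0.0, step_x)   # random start within the first period
    y0 = rng.uniform(0.0, step_y)
    sites = []
    y = y0
    while y < extent_y:
        x = x0
        while x < extent_x:
            sites.append((x, y))
            x += step_x
        y += step_y
    return sites

# hypothetical 1000 x 800 um region sampled every 250 um in each direction
sites = systematic_sites(1000.0, 800.0, 250.0, 250.0, seed=7)
```

Stepping through such a precomputed list of coordinates by hand reproduces what a motorized stage does automatically, which is the essence of the low-cost alternative described above.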

  7. Design-based stereological analysis of the lung parenchymal architecture and alveolar type II cells in surfactant protein A and D double deficient mice

    DEFF Research Database (Denmark)

    Jung, A; Allen, L; Nyengaard, Jens Randel

    2005-01-01

    Alveolar epithelial type II cells synthesize and secrete surfactant. The surfactant-associated proteins A and D (SP-A and SP-D), members of the collectin protein family, participate in pulmonary immune defense, modulation of inflammation, and surfactant metabolism. Both proteins are known to have......, but the mean volume of a single lamellar body remains constant. These results demonstrate that chronic deficiency of SP-A and SP-D in mice leads to parenchymal remodeling, type II cell hyperplasia and hypertrophy, and disturbed intracellular surfactant metabolism. The design-based stereological approach...

  8. Practical Stereology Applications for the Pathologist.

    Science.gov (United States)

    Brown, Danielle L

    2017-05-01

    Qualitative histopathology is the gold standard for routine examination of morphological tissue changes in the regulatory or academic environment. The human eye is exceptional for pattern recognition but often cannot detect small changes in quantity. In cases where detection of subtle quantitative changes is critical, more sensitive methods are required. Two-dimensional histomorphometry can provide additional quantitative information and is quite useful in many cases. However, the provided data may not be referent to the entire tissue and, as such, it makes several assumptions, which are sources of bias. In contrast, stereology is design based rather than assumption based and uses stringent sampling methods to obtain accurate and precise 3-dimensional information using geometrical and statistical principles. Recent advances in technology have made stereology more approachable and practical for the pathologist in both regulatory and academic environments. This review introduces pathologists to the basic principles of stereology and walks the reader through some real-world examples for the application of these principles in the workplace.

  9. Digital stereology in neuropathology

    DEFF Research Database (Denmark)

    Kristiansen, Sarah Line Brøgger; Nyengaard, Jens Randel

    2012-01-01

...-dimensional structural knowledge. Accordingly, stereology is a science based on statistical sampling principles and geometric measures. The application of stereology to neuropathological studies allows the researcher to efficiently obtain a precise estimate of various structural quantities. This neuropathological review will therefore present the relevant stereological estimators for obtaining reliable quantitative structural data from brains and peripheral nerves when using digital light microscopy. It is discussed how to obtain brain and nerve fibre samples to fulfil the requirements for the estimators. A presentation...

  10. Using biased image analysis for improving unbiased stereological number estimation - a pilot simulation study of the smooth fractionator

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2006-01-01

The smooth fractionator was introduced in 2002. The combination of a smoothing protocol with a computer-aided stereology tool provides better precision and a lighter workload. This study uses simulation to compare fractionator sampling based on the smooth design, the commonly used systematic uniformly random sampling design and the ordinary simple random sampling design. The smooth protocol is performed using biased information from crude (but fully automatic) image analysis of the fields of view. The different design paradigms are compared using simulation in three different cell distributions...
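
The smooth arrangement underlying the smooth fractionator can be sketched directly. This is an illustrative reading of the design, not the authors' implementation: sampling units are ordered by an auxiliary size (here, a crude automatic count, which may be biased without harm) so that the sequence rises to a single peak and falls again, which lowers the variance of subsequent systematic sampling:

```python
def smooth_arrangement(values):
    """Return an index order in which the associated values rise to a single
    peak and then fall: the 'smooth' ordering used by the smooth fractionator."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    left, right = [], []
    for k, idx in enumerate(order):
        (left if k % 2 == 0 else right).append(idx)
    return left + right[::-1]       # ascending half, then descending half

# hypothetical crude counts from automatic image analysis of 7 fields
sizes = [5, 1, 9, 3, 7, 2, 8]
arranged = [sizes[i] for i in smooth_arrangement(sizes)]
# arranged == [1, 3, 7, 9, 8, 5, 2]: a unimodal sequence
```

Because the auxiliary counts are used only for ordering, any bias in them affects efficiency, not the unbiasedness of the final fractionator estimate.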

  11. Quantitative analysis of the renal aging in rats. Stereological study

    OpenAIRE

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins Filho, Eduardo Lopes; Fraga, Rogério de

    2016-01-01

ABSTRACT PURPOSE: To evaluate renal function and renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. METHODS: Seventy-two Wistar rats were divided into six groups, each sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, together with stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glo...

  12. Design-based stereological estimation of the total number of cardiac myocytes in histological sections

    DEFF Research Database (Denmark)

    Brüel, Annemarie; Nyengaard, Jens Randel

    2005-01-01

BACKGROUND: Counting the total number of cardiac myocytes has not previously been possible in ordinary histological sections using light microscopy (LM) due to difficulties in defining the myocyte borders properly. AIM: To describe a method by which the total number of cardiac myocytes is estimated in LM sections using design-based stereology. MATERIALS AND METHODS: From formalin-fixed left rat ventricles (LV), isotropic uniformly random sections were cut. The total number of myocyte nuclei per LV was estimated using the optical disector. Two-microm-thick serial paraffin sections were stained with antibodies against cadherin and type IV collagen to visualise the intercalated discs and the myocyte membranes, respectively. Using the physical disector in "local vertical windows" of the serial sections, the average number of nuclei per myocyte was estimated. RESULTS: The total number of myocyte nuclei...

  13. Computer-assisted stereology and automated image analysis for quantification of tumor infiltrating lymphocytes in colon cancer.

    Science.gov (United States)

    Eriksen, Ann C; Andersen, Johnnie B; Kristensson, Martin; dePont Christensen, René; Hansen, Torben F; Kjær-Frifeldt, Sanne; Sørensen, Flemming B

    2017-08-29

Precise prognostic and predictive variables allowing improved post-operative treatment stratification are missing in patients treated for stage II colon cancer (CC). Investigation of tumor infiltrating lymphocytes (TILs) may be rewarding, but the lack of a standardized analytic technique is a major concern. Manual stereological counting is considered the gold standard, but digital pathology with image analysis is preferred due to time efficiency. The purpose of this study was to compare manual stereological estimates of TILs with automatic counts obtained by image analysis, and at the same time investigate the heterogeneity of TILs. From 43 patients treated for stage II CC in 2002, three paraffin-embedded, tumor-containing tissue blocks were selected, one of them representing the deepest invasive tumor front. Serial sections from each of the 129 blocks were immunohistochemically stained for CD3 and CD8, and the slides were scanned. Stereological estimates of the numerical density and area fraction of TILs were obtained using the computer-assisted newCAST stereology system. For the image analysis approach, an app-based algorithm was developed using Visiopharm Integrator System software. For both methods the tumor areas of interest (invasive front and central area) were manually delineated by the observer. Based on all sections, the Spearman's correlation coefficients for density estimates varied from 0.9457 to 0.9638. Regarding heterogeneity, intra-class correlation coefficients (ICC) for CD3+ TILs varied from 0.615 to 0.746 in the central area, and from 0.686 to 0.746 in the invasive area. ICC for CD8+ TILs varied from 0.724 to 0.775 in the central area, and from 0.746 to 0.765 in the invasive area. Exact, objective and time-efficient estimates of numerical densities and area fractions of CD3+ and CD8+ TILs in stage II colon cancer can be obtained by image analysis and are highly correlated to the corresponding estimates obtained by the gold standard based on stereology.

  14. Quantitative analysis of the renal aging in rats. Stereological study.

    Science.gov (United States)

    Melchioretto, Eduardo Felippe; Zeni, Marcelo; Veronez, Djanira Aparecida da Luz; Martins, Eduardo Lopes; Fraga, Rogério de

    2016-05-01

To evaluate renal function and renal histological alterations through stereology and morphometrics in rats submitted to the natural process of aging. Seventy-two Wistar rats were divided into six groups, each sacrificed at a different age: 3, 6, 9, 12, 18 and 24 months. Right nephrectomy was performed, together with stereological and morphometric analysis of the renal tissue (renal volume and weight, volume density (Vv[glom]) and numerical density (Nv[glom]) of the renal glomeruli, and average glomerular volume (Vol[glom])); renal function was evaluated by measuring serum creatinine and urea. There was a significant decrease of renal function in the oldest rats. The renal volume presented a gradual increase during development, with the largest values registered in the group of animals at 12 months of age and a significant progressive decrease in older animals. Vv[glom] presented a statistically significant gradual reduction between the groups, and Nv[glom] also decreased significantly. Renal function proved to be inferior in senile rats compared to young rats. The morphometric and stereological analysis evidenced renal atrophy and a gradual reduction of the volume density and numerical density of the renal glomeruli associated with the aging process.

  15. Assessing the Effects of Fibrosis on Lung Function by Light Microscopy-Coupled Stereology

    DEFF Research Database (Denmark)

    Pilecki, Bartosz; Sørensen, Grith Lykke

    2017-01-01

    Pulmonary diseases such as fibrosis are characterized by structural abnormalities that lead to impairment of proper lung function. Stereological analysis of serial tissue sections allows detection and quantitation of subtle changes in lung architecture. Here, we describe a stereology-based method...

  16. Stereological estimation of surface area from digital images

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Kiderlen, Markus

    2010-01-01

A sampling design of local stereology is combined with a method from digital stereology to yield a novel estimator of surface area based on counts of configurations observed in a digitization of an isotropic 2-dimensional slice with thickness s. As a tool, a result of the second author and J. Rataj on infinitesimal increase of volumes of morphological transforms is refined and used. The proposed surface area estimator is asymptotically unbiased in the case of sets contained in the ball centred at the origin with radius s and in the case of balls centred at the origin with unknown radius...

  17. STEREOLOGICAL ESTIMATION OF SURFACE AREA FROM DIGITAL IMAGES

    Directory of Open Access Journals (Sweden)

    Johanna Ziegel

    2011-05-01

Full Text Available A sampling design of local stereology is combined with a method from digital stereology to yield a novel estimator of surface area based on counts of configurations observed in a digitization of an isotropic 2-dimensional slice with thickness s. As a tool, a result of the second author and J. Rataj on infinitesimal increase of volumes of morphological transforms is refined and used. The proposed surface area estimator is asymptotically unbiased in the case of sets contained in the ball centred at the origin with radius s and in the case of balls centred at the origin with unknown radius. For general shapes bounds for the asymptotic expected relative worst case error are given. A simulation example is discussed for surface area estimation based on 2×2×2-configurations.

  18. Assessment of left ventricular function and mass by MR imaging: a stereological study based on the systematic slice sampling procedure.

    Science.gov (United States)

    Mazonakis, Michalis; Sahin, Bunyamin; Pagonidis, Konstantin; Damilakis, John

    2011-06-01

The aim of this study was to combine the stereological technique with magnetic resonance (MR) imaging data for the volumetric and functional analysis of the left ventricle (LV). Cardiac MR examinations were performed in 13 consecutive subjects with known or suspected coronary artery disease. The end-diastolic volume (EDV), end-systolic volume, ejection fraction (EF), and mass were estimated by stereology using the entire slice set depicting the LV and systematic sampling intensities of 1/2 and 1/3 that provided samples with every second and third slice, respectively. The repeatability of stereology was evaluated. Stereological assessments were compared with the reference values derived by manually tracing the endocardial and epicardial contours on MR images. Stereological EDV and EF estimations obtained by the 1/3 systematic sampling scheme were significantly different from those obtained by manual delineation (P < .05), whereas no significant differences were found for the sampling intensity of 1/2 (P > .05). For these stereological approaches, a high correlation (r(2) = 0.80-0.93) and clinically acceptable limits of agreement were found with the reference method. Stereological estimations obtained by both sample sizes presented comparable coefficient of variation values of 2.9-5.8%. The mean time for stereological measurements on the entire slice set was 3.4 ± 0.6 minutes and it was reduced to 2.5 ± 0.5 minutes with the 1/2 systematic sampling scheme. Stereological analysis on systematic samples of MR slices generated by the 1/2 sampling intensity provided efficient and quick assessment of LV volumes, function, and mass. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
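
The slice-based volume estimation in this study follows the Cavalieri principle: volume is the slice spacing times the sum of the measured slice areas, and subsampling every k-th slice simply multiplies the effective spacing by k. A minimal sketch with hypothetical area values, not the study's data:

```python
def cavalieri_volume(areas_mm2, thickness_mm, sampling_fraction=1.0):
    """Cavalieri estimator: volume = effective slice spacing x sum of the
    sampled slice areas.  Sampling every k-th slice (fraction 1/k) makes
    the effective spacing k times the slice thickness."""
    spacing = thickness_mm / sampling_fraction
    return spacing * sum(areas_mm2)

# hypothetical LV cavity areas (mm^2) on five contiguous 10-mm slices
areas = [820.0, 1140.0, 1360.0, 1180.0, 910.0]
v_full = cavalieri_volume(areas, 10.0)              # all slices: 54100.0 mm^3
v_half = cavalieri_volume(areas[::2], 10.0, 0.5)    # every 2nd slice: 61800.0 mm^3
```

The halved sample trades a little precision for roughly half the tracing time, which is the efficiency argument the abstract reports.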

  19. STEREOLOGICAL ANALYSIS OF SHAPE

    Directory of Open Access Journals (Sweden)

    Asger Hobolth

    2011-05-01

    Full Text Available This paper concerns the problem of making stereological inference about the shape variability in a population of spatial particles. Under rotational invariance the shape variability can be estimated from central planar sections through the particles. A simple, but flexible, parametric model for rotation invariant spatial particles is suggested. It is shown how the parameters of the model can be estimated from observations on central sections. The corresponding model for planar particles is also discussed in some detail.

  20. Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry.

    Science.gov (United States)

    Manios, G E; Mazonakis, M; Voulgaris, C; Karantanas, A; Damilakis, J

    2016-03-01

    To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For the optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimum stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with the sampling intensity 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95% limits of agreement were also acceptable (VAF: -16.5%, 16.1%; SAF: -10.8%, 10.7%) and the repeatability of stereology was good (VAF: CV = 4.5%, SAF: CV = 3.2%). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. Abdominal obesity is associated with increased risk of disease and mortality. Stereology may quantify visceral and subcutaneous abdominal fat accurately and consistently. The application of stereology to estimating abdominal volume fat reduces processing time. Stereology is an efficient alternative method for estimating abdominal fat volume.
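
Point counting as used in this study estimates volume as V = a(p) · t · ΣP, where a(p) is the area represented by one grid point, t the spacing between sampled slices, and ΣP the total number of points hitting the compartment. A minimal sketch with hypothetical counts and grid geometry, not the study's data:

```python
def point_count_volume(point_hits, area_per_point_mm2, slice_spacing_mm):
    """Point-counting volume estimate: V = a(p) * t * sum(P), where a(p) is
    the grid area represented by one point, t the spacing between sampled
    slices, and sum(P) the total points hitting the compartment."""
    return area_per_point_mm2 * slice_spacing_mm * sum(point_hits)

# hypothetical grid-point hits on visceral fat over five sampled CT slices,
# with a 20 x 20 mm grid (400 mm^2 per point) and 40 mm between sampled slices
hits = [14, 22, 27, 19, 11]
volume_mm3 = point_count_volume(hits, 400.0, 40.0)   # 1488000.0 mm^3
```

Counting grid hits is much faster than tracing contours, which is why the optimized scheme completes both fat compartments in under four minutes.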

  1. UTILIZATION OF STEREOLOGY FOR QUANTITATIVE ANALYSIS OF PLASTIC DEFORMATION OF FORMING PIECES

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2012-01-01

Full Text Available Mechanical working determines the final properties of forming pieces, which are affected by the conditions of the production technology. Utilization of stereology allows detailed analysis of the three-dimensional plastically deformed material structure produced by different forming technologies, e.g. forging, extruding, upsetting, metal spinning, drawing etc. The microstructure of cold-drawn wires was analyzed. Grain boundary orientation was measured on parallel sections of wire with different degrees of deformation, and direct-axis plastic deformation was evaluated in the bulk-formed part. The strain in the sectioned probes was obtained stereologically by measuring the degree of grain boundary orientation, which was then converted to deformation using a conversion model.

  2. STEREOLOGY FROM ONE OF ALL THE POSSIBLE ANGLES

    Directory of Open Access Journals (Sweden)

    Leszek Wojnar

    2011-05-01

Full Text Available The relation between image analysis and stereology is discussed in terms of different fields of application, especially materials science, biology and medicine. Some long-term tendencies observed, as well as possible future trends, are discussed. The need for wider use of image analysis techniques, including mathematical morphology, in any field of science is demonstrated. Simultaneously, the significance of a stereological background in the automatic quantification of the investigated structures is confirmed.

  3. Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry

    Energy Technology Data Exchange (ETDEWEB)

    Manios, G.E.; Mazonakis, M.; Damilakis, J. [University of Crete, Department of Medical Physics, Faculty of Medicine, Heraklion, Crete (Greece); Voulgaris, C.; Karantanas, A. [University of Crete, Department of Radiology, Faculty of Medicine, Heraklion, Crete (Greece)

    2016-03-15

    To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For the optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimum stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with the sampling intensity 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95 % limits of agreement were also acceptable (VAF: -16.5 %, 16.1 %; SAF: -10.8 %, 10.7 %) and the repeatability of stereology was good (VAF: CV = 4.5 %, SAF: CV = 3.2 %). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. (orig.)

  4. Stereological study of postnatal development in the mouse utricular macula

    DEFF Research Database (Denmark)

    Kirkegaard, Mette; Nyengaard, Jens Randel

    2005-01-01

This study describes the morphometric changes taking place in the utricular macula of mice with ages in geometric progression from 1 to 512 days after birth. By using design-based stereological methods, the total volume and surface area of the sensory epithelium as well as the total number of the ha...

  5. Efficient stereological approaches for the volumetry of a normal or enlarged spleen from MDCT images

    Energy Technology Data Exchange (ETDEWEB)

    Mazonakis, Michalis; Stratakis, John; Damilakis, John [University of Crete, Department of Medical Physics, Faculty of Medicine, P.O. Box 2208, Iraklion, Crete (Greece)

    2015-06-01

    To introduce efficient stereological approaches for estimating the volume of a normal or enlarged spleen from MDCT. All study participants underwent an abdominal MDCT. The first group included 20 consecutive patients with splenomegaly and the second group consisted of 20 subjects with a normal spleen. Splenic volume estimations were performed using the stereological point counting method. Stereological assessments were optimized using the systematic slice sampling procedure. Planimetric measurements based on manual tracing of splenic boundaries on each slice were taken as reference values. Stereological analysis using five to eight systematically sampled slices provided enlarged splenic volume estimations with a mean precision of 4.9 ± 1.0 % in a mean time of 2.3 ± 0.4 min. A similar measurement duration and error was observed for normal splenic volume assessment using four to seven systematically selected slices. These stereological approaches slightly but insignificantly overestimated the volume of a normal and enlarged spleen compared to planimetry (P > 0.05) with a mean difference of -1.3 ± 4.3 % and -2.7 ± 5.2 %, respectively. The two methods were highly correlated (r ≥ 0.96). The variability of repeated stereological estimations was below 3.8 %. The proposed stereological approaches enable the rapid, reproducible, and accurate splenic volume estimation from MDCT data in patients with or without splenomegaly. (orig.)

  6. Digital immunohistochemistry wizard: image analysis-assisted stereology tool to produce reference data set for calibration and quality control.

    Science.gov (United States)

    Plancoulaine, Benoît; Laurinaviciene, Aida; Meskauskas, Raimundas; Baltrusaityte, Indra; Besusparis, Justinas; Herlin, Paulette; Laurinavicius, Arvydas

    2014-01-01

    Digital image analysis (DIA) enables better reproducibility of immunohistochemistry (IHC) studies. Nevertheless, accuracy of the DIA methods needs to be ensured, demanding production of reference data sets. We have reported on methodology to calibrate DIA for Ki67 IHC in breast cancer tissue based on reference data obtained by stereology grid count. To produce the reference data more efficiently, we propose digital IHC wizard generating initial cell marks to be verified by experts. Digital images of proliferation marker Ki67 IHC from 158 patients (one tissue microarray spot per patient) with an invasive ductal carcinoma of the breast were used. Manual data (mD) were obtained by marking Ki67-positive and negative tumour cells, using a stereological method for 2D object enumeration. DIA was used as an initial step in stereology grid count to generate the digital data (dD) marks by Aperio Genie and Nuclear algorithms. The dD were collected into XML files from the DIA markup images and overlaid on the original spots along with the stereology grid. The expert correction of the dD marks resulted in corrected data (cD). The percentages of Ki67 positive tumour cells per spot in the mD, dD, and cD sets were compared by single linear regression analysis. Efficiency of cD production was estimated based on manual editing effort. The percentage of Ki67-positive tumor cells was in very good agreement in the mD, dD, and cD sets: regression of cD from dD (R2=0.92) reflects the impact of the expert editing the dD as well as accuracy of the DIA used; regression of the cD from the mD (R2=0.94) represents the consistency of the DIA-assisted ground truth (cD) with the manual procedure. Nevertheless, the accuracy of detection of individual tumour cells was much lower: in average, 18 and 219 marks per spot were edited due to the Genie and Nuclear algorithm errors, respectively. The DIA-assisted cD production in our experiment saved approximately 2/3 of manual marking. 
Digital IHC wizard
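    The per-spot agreement analysis described above reduces to an ordinary least-squares comparison of paired percentages. The sketch below computes the coefficient of determination (R2) in plain Python; all paired values are invented for illustration and do not come from the study.

```python
# Least-squares R^2 between two sets of per-spot Ki67 percentages.
# The paired values are hypothetical, for illustration only.

def r_squared(x, y):
    """Coefficient of determination of the least-squares line y ~ x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# Hypothetical Ki67-positive percentages per spot: manual marks (mD)
# versus expert-corrected digital marks (cD).
manual    = [12.0, 25.5, 40.2, 8.7, 55.1, 33.0]
corrected = [11.5, 26.0, 39.0, 9.9, 54.0, 34.2]

print(round(r_squared(manual, corrected), 3))
```

    An R2 near 1, as reported for the cD-versus-mD regression (0.94), indicates that the DIA-assisted ground truth tracks the fully manual procedure closely.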

  7. Sampling for stereology in lungs

    Directory of Open Access Journals (Sweden)

    J. R. Nyengaard

    2006-12-01

    Full Text Available The present article reviews the relevant stereological estimators for obtaining reliable quantitative structural data from the lungs. Stereological sampling achieves reliable, quantitative information either about the whole lung or complete lobes, whilst minimising the workload. Studies have used systematic random sampling, which has fixed and constant sampling probabilities on all blocks, sections and fields of view. For an estimation of total lung or lobe volume, the Cavalieri principle can be used, but it is not useful in estimating individual cell volume due to various effects from over- or underprojection. If the number of certain structures is required, two methods can be used: the disector and the fractionator. The disector method is a three-dimensional stereological probe for sampling objects according to their number. However, it may be affected by tissue deformation and, therefore, the fractionator method is often the preferred sampling principle. In this method, a known and predetermined fraction of an object is sampled in one or more steps, with the final step estimating the number. Both methods can be performed in a physical and optical manner, therefore enabling cells and larger lung structure numbers (e.g. the number of alveoli) to be estimated. Some estimators also require randomisation of orientation, so that all directions have an equal chance of being chosen. Using such isotropic sections, surface area, length, and diameter can be estimated on a Cavalieri set of sections. Stereology can also illustrate the potential for transport between two compartments by analysing the barrier width. Estimating the individual volume of cells can be achieved by local stereology using a two-step procedure that first samples lung cells using the disector and then introduces individual volume estimation of the sampled cells. 
The coefficient of error of most unbiased stereological estimators is a combination of variance from blocks, sections, fields
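    The two estimators named above can be sketched in a few lines: the Cavalieri estimate multiplies the section spacing, the grid area per point, and the total point count; the fractionator divides the counted objects by the known sampling fraction. All numeric values below are hypothetical.

```python
def cavalieri_volume(point_counts, section_spacing, area_per_point):
    """Cavalieri principle: V = T * (a/p) * sum(P_i), where T is the
    distance between sections, a/p the grid area represented by one
    point, and P_i the points hitting the structure on section i."""
    return section_spacing * area_per_point * sum(point_counts)

def fractionator_number(counted, sampled_fraction):
    """Fractionator: if a known fraction f of the organ is sampled and
    Q objects are counted within it, the total is Q / f."""
    return counted / sampled_fraction

# Hypothetical lobe: points counted on 8 systematic sections, 1 mm
# apart, with 4 mm^2 of grid area associated with each point.
counts = [12, 18, 25, 30, 28, 20, 11, 5]
print(cavalieri_volume(counts, section_spacing=1.0, area_per_point=4.0))

# Hypothetical count: 340 alveoli counted in a 1/400 organ fraction.
print(round(fractionator_number(340, 1 / 400)))
```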

  8. Application of stereology to dermatological research

    DEFF Research Database (Denmark)

    Kamp, Søren; Jemec, Gregor Borut Ernst; Kemp, Kåre

    2009-01-01

    Stereology is a set of mathematical and statistical tools to estimate three-dimensional (3-D) characteristics of objects from regular two-dimensional (2-D) sections. In medicine and biology, it can be used to estimate features such as cell volume, cell membrane surface area, total length of blood...... vessels per volume tissue and total number of cells. The unbiased quantification of these 3-D features allows for a better understanding of morphology in vivo compared with 2-D methods. This review provides an introduction to the field of stereology with specific emphasis on the application of stereology...

  9. Stereological, functional and molecular studies of development and disease : a collection of published works 1981 to 2013

    OpenAIRE

    Bertram, John F.

    2016-01-01

    The unifying theme throughout this Doctorate of Science thesis is the development, refinement and utilisation of stereological techniques to study tissue structures in development, and in adult health and disease. Stereology is the discipline based on geometric probability theory that enables us to quantify structures in three-dimensional space. Stereological techniques are often applied in material science as well as biomedical science. When applied to histology and pathology, stereology can...

  10. Application of stereological methods to estimate post-mortem brain surface area using 3T MRI

    DEFF Research Database (Denmark)

    Furlong, Carolyn; García-Fiñana, Marta; Puddephat, Michael

    2013-01-01

    The Cavalieri and Vertical Sections methods of design based stereology were applied in combination with 3 tesla (i.e. 3T) Magnetic Resonance Imaging (MRI) to estimate cortical and subcortical volume, area of the pial surface, area of the grey-white matter boundary, and thickness of the cerebral...

  11. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    Science.gov (United States)

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
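    The two-step point counting described above can be sketched as follows: step 1 yields the defect volume (4.4 mm3 in the example), and step 2 converts per-category point counts into tissue fractions and absolute volumes. The category counts below are invented.

```python
# Step 2 of the two-step point counting: tissue composition of the
# repair defect as point fractions. All counts are hypothetical.

def point_fractions(category_counts):
    """Fraction of grid points falling in each tissue category."""
    total = sum(category_counts.values())
    return {name: count / total for name, count in category_counts.items()}

defect_volume_mm3 = 4.4  # step 1 result, as in the example above

counts = {
    "hyaline cartilage": 120,
    "fibrocartilage": 80,
    "fibrous tissue": 60,
    "bone": 30,
    "scaffold": 10,
    "others": 0,
}

fractions = point_fractions(counts)
volumes = {name: f * defect_volume_mm3 for name, f in fractions.items()}
print(round(fractions["hyaline cartilage"], 2), round(volumes["bone"], 2))
```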

  12. Tensor-based morphometry and stereology reveal brain pathology in the complexin1 knockout mouse.

    Science.gov (United States)

    Kielar, Catherine; Sawiak, Stephen J; Navarro Negredo, Paloma; Tse, Desmond H Y; Morton, A Jennifer

    2012-01-01

    Complexins (Cplxs) are small, soluble, regulatory proteins that bind reversibly to the SNARE complex and modulate synaptic vesicle release. Cplx1 knockout mice (Cplx1(-/-)) have the earliest known onset of ataxia seen in a mouse model, although hitherto no histopathology has been described in these mice. Nevertheless, the profound neurological phenotype displayed by Cplx1(-/-) mutants suggests that significant functional abnormalities must be present in these animals. In this study, MRI was used to automatically detect regions where structural differences were not obvious when using a traditional histological approach. Tensor-based morphometry of Cplx1(-/-) mouse brains showed selective volume loss from the thalamus and cerebellum. Stereological analysis of Cplx1(-/-) and Cplx1(+/+) mice brain slices confirmed the volume loss in the thalamus as well as loss in some lobules of the cerebellum. Finally, stereology was used to show that there was loss of cerebellar granule cells in Cplx1(-/-) mice when compared to Cplx1(+/+) animals. Our study is the first to describe pathological changes in Cplx1(-/-) mouse brain. We suggest that the ataxia in Cplx1(-/-) mice is likely to be due to pathological changes in both cerebellum and thalamus. Reduced levels of Cplx proteins have been reported in brains of patients with neurodegenerative diseases. Therefore, understanding the effects of Cplx depletion in brains from Cplx1(-/-) mice may also shed light on the mechanisms underlying pathophysiology in disorders in which loss of Cplx1 occurs.

  13. Tensor-based morphometry and stereology reveal brain pathology in the complexin1 knockout mouse.

    Directory of Open Access Journals (Sweden)

    Catherine Kielar

    Full Text Available Complexins (Cplxs) are small, soluble, regulatory proteins that bind reversibly to the SNARE complex and modulate synaptic vesicle release. Cplx1 knockout mice (Cplx1(-/-)) have the earliest known onset of ataxia seen in a mouse model, although hitherto no histopathology has been described in these mice. Nevertheless, the profound neurological phenotype displayed by Cplx1(-/-) mutants suggests that significant functional abnormalities must be present in these animals. In this study, MRI was used to automatically detect regions where structural differences were not obvious when using a traditional histological approach. Tensor-based morphometry of Cplx1(-/-) mouse brains showed selective volume loss from the thalamus and cerebellum. Stereological analysis of Cplx1(-/-) and Cplx1(+/+) mice brain slices confirmed the volume loss in the thalamus as well as loss in some lobules of the cerebellum. Finally, stereology was used to show that there was loss of cerebellar granule cells in Cplx1(-/-) mice when compared to Cplx1(+/+) animals. Our study is the first to describe pathological changes in Cplx1(-/-) mouse brain. We suggest that the ataxia in Cplx1(-/-) mice is likely to be due to pathological changes in both cerebellum and thalamus. Reduced levels of Cplx proteins have been reported in brains of patients with neurodegenerative diseases. Therefore, understanding the effects of Cplx depletion in brains from Cplx1(-/-) mice may also shed light on the mechanisms underlying pathophysiology in disorders in which loss of Cplx1 occurs.

  14. Blood Capillary Length Estimation from Three-Dimensional Microscopic Data by Image Analysis and Stereology

    Czech Academy of Sciences Publication Activity Database

    Kubínová, Lucie; Mao, X. W.; Janáček, Jiří

    2013-01-01

    Roč. 19, č. 4 (2013), s. 898-906 ISSN 1431-9276 R&D Projects: GA MŠk(CZ) ME09010; GA MŠk(CZ) LH13028; GA ČR(CZ) GAP108/11/0794 Institutional research plan: CEZ:AV0Z5011922 Institutional support: RVO:67985823 Keywords : capillaries * confocal microscopy * image analysis * length * rat brain * stereology Subject RIV: EA - Cell Biology Impact factor: 1.757, year: 2013

  15. Unbiased stereologic techniques for practical use in diagnostic histopathology

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1995-01-01

    Grading of malignancy by the examination of morphologic and cytologic details in histologic sections from malignant neoplasms is based exclusively on qualitative features, associated with significant subjectivity, and thus rather poor reproducibility. The traditional way of malignancy grading may...... by introducing quantitative techniques in the histopathologic discipline of malignancy grading. Unbiased stereologic methods, especially based on measurements of nuclear three-dimensional mean size, have during the last decade proved their value in this regard. In this survey, the methods are reviewed regarding...... the basic technique involved, sampling, efficiency, and reproducibility. Various types of cancers, where stereologic grading of malignancy has been used, are reviewed and discussed with regard to the development of a new objective and reproducible basis for carrying out prognosis-related malignancy grading...

  16. Unbiased stereologic techniques for practical use in diagnostic histopathology

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1995-01-01

    by introducing quantitative techniques in the histopathologic discipline of malignancy grading. Unbiased stereologic methods, especially based on measurements of nuclear three-dimensional mean size, have during the last decade proved their value in this regard. In this survey, the methods are reviewed regarding......Grading of malignancy by the examination of morphologic and cytologic details in histologic sections from malignant neoplasms is based exclusively on qualitative features, associated with significant subjectivity, and thus rather poor reproducibility. The traditional way of malignancy grading may...... of solid tumors. This new, unbiased attitude to malignancy grading is associated with excellent virtues, which ultimately may help the clinician in the choice of optimal treatment of the individual patient suffering from cancer. Stereologic methods are not solely applicable to the field of malignancy...

  17. Introduction into integral geometry and stereology

    DEFF Research Database (Denmark)

    Kiderlen, Markus

    Statistics and Random Fields and is a self-contained introduction into integral geometry and its applications in stereology. The most important integral geometric tools for stereological applications are kinematic formulas and results of Blaschke-Petkantschin type. Therefore, Crofton's formula......This text is the extended version of two talks held at the Summer Academy Stochastic Geometry, Spatial Statistics and Random Fields in the Soellerhaus, Germany, in September 2009. It forms (with slight modifications) a chapter of the Springer lecture notes Lectures on Stochastic Geometry, Spatial...

  18. Practical application of stereological methods in experimental kidney animal models.

    Science.gov (United States)

    Fernández García, María Teresa; Núñez Martínez, Paula; García de la Fuente, Vanessa; Sánchez Pitiot, Marta; Muñiz Salgueiro, María Del Carmen; Perillán Méndez, Carmen; Argüelles Luis, Juan; Astudillo González, Aurora

    The kidneys are vital organs responsible for excretion, fluid and electrolyte balance and hormone production. The nephrons are the kidney's functional and structural units. The number, size and distribution of the nephron components contain relevant information on renal function. Stereology is a branch of morphometry that applies mathematical principles to obtain three-dimensional information from serial, parallel and equidistant two-dimensional microscopic sections. Because of the complexity of stereological studies and the lack of scientific literature on the subject, the aim of this paper is to clearly explain, through animal models, the basic concepts of stereology and how to calculate the main kidney stereological parameters that can be applied in future experimental studies. Copyright © 2016 Sociedad Española de Nefrología. Published by Elsevier España, S.L.U. All rights reserved.

  19. THE APPLICATION OF STEREOLOGY METHOD FOR ESTIMATING THE NUMBER OF 3D BaTiO3 – CERAMIC GRAINS CONTACT SURFACES

    Directory of Open Access Journals (Sweden)

    Vojislav V Mitić

    2011-05-01

    Full Text Available Methods of stereological study are of great importance for structural research of electronic ceramic materials, including BaTiO3-ceramic materials. The broad application of ceramics based on barium titanate in advanced electronics demands constant research of its structure, which, through the structure-properties correlation (fundamental to the technology-structure-properties triad of materials properties prognosis), leads to further prognosis and design of the properties of these ceramics. Microstructure properties of BaTiO3-ceramic material, expressed in the grains' boundary contacts, are of basic importance for the electric properties of this material, particularly the capacitance. In this paper, a significant step towards establishing control over the capacitive properties of BaTiO3-ceramics is taken by estimating the number of grain contact surfaces. To define an efficient stereological method for estimating the number of BaTiO3-ceramic grain contact surfaces, we started from a mathematical model of the mutual grain distribution in a prescribed volume of a BaTiO3-ceramic sample. Since the real microstructure morphology of BaTiO3-ceramics is somewhat disordered, the spherically shaped grains are approximated, using computer-modelling methods, by polyhedra with a great number of small convex polygons. By dividing the volume of the BaTiO3-ceramic sample with a definite number of parallel planes at a given spacing, a certain number of grain contact surfaces are identified in each intersection plane. From the quantitative estimation of 2D stereological parameters, the modelled 3D internal microstructure is obtained. Experiments were performed using the scanning electron microscopy (SEM) method with ceramic samples prepared under pressing pressures up to 150 MPa and sintering temperatures up to 1370°C, and the obtained microphotographs were used as a basis for confirming the validity of the presented stereological method. 
This paper, by applying
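    A toy version of the sectioning idea, not the authors' polyhedral model: grains are kept as spheres in a unit cube and cut by equally spaced parallel planes; a grain contributes one profile to a plane whenever the plane passes within one radius of its centre. Grain count, radii and plane spacing are all invented.

```python
import random

def profiles_on_plane(grains, plane_z):
    """Count grains (spheres) intersected by the plane z = plane_z."""
    return sum(1 for (x, y, z, r) in grains if abs(z - plane_z) < r)

random.seed(0)
# 200 hypothetical spherical grains in a unit cube: (x, y, z, radius).
grains = [(random.random(), random.random(), random.random(),
           random.uniform(0.01, 0.05)) for _ in range(200)]

# Parallel section planes at a fixed spacing ("pace") of 0.1.
planes = [i * 0.1 for i in range(1, 10)]
profile_counts = [profiles_on_plane(grains, z) for z in planes]
print(profile_counts)
```

    Replacing the spheres with convex polyhedra, as in the paper, changes only the intersection test; the systematic-plane bookkeeping stays the same.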

  20. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

    A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any...... of the control tumors. The mean nuclear volume of the individual recurrent tumors appeared to change with time, showing a tendency to diminish. A relationship between large nuclear volume at presentation and number of or time interval between recurrences was not found. We conclude that measurement of mean...... nuclear volume in meningiomas might help identify a group at risk of recurrence....

  1. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    examined, which in turn leads to any of the known stereological estimates, including size distributions and spatial distributions. The unbiasedness is not a function of the assumed relation between the weight and the structure, which is in practice always a biased relation from a stereological (integral......, the desired number of fields are sampled automatically with probability proportional to the weight and presented to the expert observer. Using any known stereological probe and estimator, the correct count in these fields leads to a simple, unbiased estimate of the total amount of structure in the sections...... geometric) point of view. The efficiency of the proportionator depends, however, directly on this relation to be positive. The sampling and estimation procedure is simulated in sections with characteristics and various kinds of noises in possibly realistic ranges. In all cases examined, the proportionator...

  2. PROBES, POPULATIONS, SAMPLES, MEASUREMENTS AND RELATIONS IN STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Robert T Dehoff

    2011-05-01

    Full Text Available This summary paper provides an overview of the content of stereology. The typical problem at hand centers around some three dimensional object that has an internal structure that determines its function, performance, or response. To understand and quantify the geometry of that structure it is necessary to probe it with geometric entities: points, lines, planes, volumes, etc. Meaningful results are obtained only if the set of probes chosen for use in the assessment is drawn uniformly from the population of such probes for the structure as a whole. This requires an understanding of the population of each kind of probe. Interaction of the probes with the structure produces geometric events which are the focus of stereological measurements. In almost all applications the measurement that is made is a simple count of the number of these events. Rigorous application of these requirements for sample design produces unbiased estimates of geometric properties of features in the structure no matter how complex the features are or how they are arranged in space. It is this assumption-free characteristic of the methodology that makes it a powerful tool for characterizing the internal structure of three dimensional objects.

  3. The efficiency of systematic sampling in stereology-reconsidered

    DEFF Research Database (Denmark)

    Gundersen, Hans Jørgen Gottlieb; Jensen, Eva B. Vedel; Kieu, K

    1999-01-01

    In the present paper, we summarize and further develop recent research in the estimation of the variance of stereological estimators based on systematic sampling. In particular, it is emphasized that the relevant estimation procedure depends on the sampling density. The validity of the variance...... estimation is examined in a collection of data sets, obtained by systematic sampling. Practical recommendations are also provided in a separate section....
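    A minimal sketch of the covariogram-based variance approximation for systematic samples, using the terms A = sum a_i^2, B = sum a_i a_{i+1} and C = sum a_i a_{i+2}. The divisor encodes the assumed smoothness of the measurement function (12 here; 240 is often used for smooth functions); since the paper's point is precisely that the right estimation procedure depends on the sampling density, treat this as an illustration rather than the full procedure.

```python
import math

def systematic_ce(areas, divisor=12):
    """Approximate coefficient of error of a total estimated from a
    systematic sample of section areas: CE = sqrt((3A - 4B + C)/divisor)
    divided by the total. divisor=12 assumes a non-smooth measurement
    function; 240 is commonly used for smooth ones."""
    A = sum(a * a for a in areas)
    B = sum(a * b for a, b in zip(areas, areas[1:]))
    C = sum(a * b for a, b in zip(areas, areas[2:]))
    variance = (3 * A - 4 * B + C) / divisor
    return math.sqrt(max(variance, 0.0)) / sum(areas)

# Ten equal section areas: CE = 1 / (10 * sqrt(6)), about 4%.
print(round(systematic_ce([5.0] * 10), 3))
```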

  4. Stereological analysis of neuron, glial and endothelial cell numbers in the human amygdaloid complex.

    Directory of Open Access Journals (Sweden)

    María García-Amado

    Full Text Available Cell number alterations in the amygdaloid complex (AC) might coincide with neurological and psychiatric pathologies with anxiety imbalances as well as with changes in brain functionality during aging. This stereological study focused on estimating, in samples from 7 control individuals aged 20 to 75 years old, the number and density of neurons, glia and endothelial cells in the entire AC and in its 5 nuclear groups (including the basolateral (BL), corticomedial and central groups), 5 nuclei and 13 nuclear subdivisions. The volume and total cell number in these territories were determined on Nissl-stained sections with the Cavalieri principle and the optical fractionator. The AC mean volume was 956 mm(3) and mean cell numbers (x10(6)) were: 15.3 neurons, 60 glial cells and 16.8 endothelial cells. The numbers of endothelial cells and neurons were similar in each AC region and were one fourth the number of glial cells. Analysis of the influence of the individuals' age at death on volume, cell number and density in each of these 24 AC regions suggested that aging does not affect regional size or the amount of glial cells, but that neuron and endothelial cell numbers respectively tended to decrease and increase in territories such as AC or BL. These accurate stereological measures of volume and total cell numbers and densities in the AC of control individuals could serve as appropriate reference values to evaluate subtle alterations in this structure in pathological conditions.

  5. Stereological analysis of neuron, glial and endothelial cell numbers in the human amygdaloid complex.

    Science.gov (United States)

    García-Amado, María; Prensa, Lucía

    2012-01-01

    Cell number alterations in the amygdaloid complex (AC) might coincide with neurological and psychiatric pathologies with anxiety imbalances as well as with changes in brain functionality during aging. This stereological study focused on estimating, in samples from 7 control individuals aged 20 to 75 years old, the number and density of neurons, glia and endothelial cells in the entire AC and in its 5 nuclear groups (including the basolateral (BL), corticomedial and central groups), 5 nuclei and 13 nuclear subdivisions. The volume and total cell number in these territories were determined on Nissl-stained sections with the Cavalieri principle and the optical fractionator. The AC mean volume was 956 mm(3) and mean cell numbers (x10(6)) were: 15.3 neurons, 60 glial cells and 16.8 endothelial cells. The numbers of endothelial cells and neurons were similar in each AC region and were one fourth the number of glial cells. Analysis of the influence of the individuals' age at death on volume, cell number and density in each of these 24 AC regions suggested that aging does not affect regional size or the amount of glial cells, but that neuron and endothelial cell numbers respectively tended to decrease and increase in territories such as AC or BL. These accurate stereological measures of volume and total cell numbers and densities in the AC of control individuals could serve as appropriate reference values to evaluate subtle alterations in this structure in pathological conditions.
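    The optical fractionator used above turns a raw disector count into a total cell number by dividing by the product of the sampling fractions. A minimal sketch; the fractions and the count below are invented, not taken from the study.

```python
def optical_fractionator(q_minus, ssf, asf, hsf):
    """Total number N = Q- / (ssf * asf * hsf), where ssf, asf and hsf
    are the section, area and height sampling fractions."""
    return q_minus / (ssf * asf * hsf)

# Hypothetical sampling: 450 neurons counted in disectors covering
# 1/10 of the sections, 1/50 of the section area, 1/2 of the height.
total = optical_fractionator(q_minus=450, ssf=1 / 10, asf=1 / 50, hsf=1 / 2)
print(round(total))
```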

  6. Stereological estimation of nuclear volume in benign and atypical meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1993-01-01

    A stereological estimation of nuclear volume in benign and atypical meningiomas was made. The aim was to investigate whether this method could discriminate between these two meningeal neoplasms. The difference was significant and it was moreover seen that there was no overlap between the two groups....... The results demonstrate that atypical meningiomas can be distinguished from benign meningiomas by an objective stereological estimation of nuclear volume....

  7. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    Full Text Available The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts development along two parallel ways: on one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial games of chance. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of that branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.

  8. Stereology as a tool to assess reproduction strategy and fecundity of teleost fishes

    DEFF Research Database (Denmark)

    Bucholtz, Rikke Hagstrøm

    methods to assess fecundity and reproductive strategies. The strength of the stereological method being that, in combination with conventional histological analysis, quantification of all oocyte categories is possible, as well as registration of qualitative characteristics relating to spawning history...... of oocyte dynamics in fish and were successfully implemented in herring ovaries for quantification of both oocyte numbers and sizes as well as total volume fraction of atretic oocytes, introducing a negligible error to the total variance of estimates. The histological nature of the stereological methods...... facilitated a ready validation of maturity data, distinguishing first time spawners from repeat spawners, as well as a ready recognition of ongoing oocyte recruitment in early maturity stages, early stage atresia, POFs and residual eggs. Analyzing a sample of females all collected during a short time frame...

  9. Confocal stereology and image analysis: methods for estimating geometrical characteristics of cells and tissues from three-dimensional confocal images

    Czech Academy of Sciences Publication Activity Database

    Kubínová, Lucie; Janáček, Jiří; Karen, Petr; Radochová, Barbora; Difato, Francesco; Krekule, Ivan

    2004-01-01

    Roč. 53, Suppl.1 (2004), s. S47-S55 ISSN 0862-8408 R&D Projects: GA ČR GA304/01/0257; GA ČR GA310/02/1470; GA AV ČR KJB6011309; GA AV ČR KJB5039302 Grant - others:SI - CZ(CZ) KONTAKT 001/2001 Institutional research plan: CEZ:AV0Z5011922 Keywords : confocal microscopy * image analysis * stereology Subject RIV: EA - Cell Biology Impact factor: 1.140, year: 2004

  10. STEREOLOGICAL ANALYSIS OF THE COCHLEAR NUCLEI OF MONKEY (MACACA FASCICULARIS AFTER DEAFFERENTATION

    Directory of Open Access Journals (Sweden)

    Ana M Insausti

    2011-05-01

    Full Text Available The cochlear nuclei (CN) in the brainstem receive the input signals from the inner ear through the cochlear nerve, and transmit these signals to higher auditory centres. A variety of lesions of the cochlear nerve cause deafness. As reported in the literature, artificial removal of auditory input, or 'deafferentation', induces structural alterations in the CN. The purpose of this study was to estimate a number of relevant stereological parameters of the CN in control and deafferented Macaca fascicularis monkeys.

  11. Microstructure characterization via stereological relations — A shortcut for beginners

    Energy Technology Data Exchange (ETDEWEB)

    Pabst, Willi, E-mail: pabstw@vscht.cz; Gregorová, Eva; Uhlířová, Tereza

    2015-07-15

    Stereological relations that can be routinely applied for the quantitative characterization of microstructures of heterogeneous single- and two-phase materials via global microstructural descriptors are reviewed. It is shown that in the case of dense, single-phase polycrystalline materials (e.g., transparent yttrium aluminum garnet ceramics) two quantities have to be determined, the interface density (or, equivalently, the mean chord length of the grains) and the mean curvature integral density (or, equivalently, the Jeffries grain size), while for two-phase materials (e.g., highly porous, cellular alumina ceramics), one additional quantity, the volume fraction (porosity), is required. The Delesse–Rosiwal law is recalled and size measures are discussed. It is shown that the Jeffries grain size is based on the triple junction line length density, while the mean chord length of grains is based on the interface density (grain boundary area density). In contrast to widespread belief, however, these two size measures are not alternative, but independent (and thus complementary), measures of grain size. Concomitant with this fact, a clear distinction between linear and planar grain size numbers is proposed. Finally, based on our concept of phase-specific quantities, it is shown that under certain conditions it is possible to define a Jeffries size also for two-phase materials and that the ratio of the mean chord length and the Jeffries size has to be considered as an invariant number for a certain type of microstructure, i.e., a characteristic value that is independent of the absolute size of the microstructural features (e.g., grains, inclusions or pores).
    Highlights:
    • Stereology-based image analysis is reviewed, including error considerations.
    • Recipes are provided for measuring global metric microstructural descriptors.
    • Size measures are based on interface density and mean curvature integral density.
    • Phase-specific quantities and a generalized
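    The basic relations reviewed above can be sketched directly: the Delesse–Rosiwal law in its point-counting form (V_V = P_P), the surface density from intersections with test lines (S_V = 2 I_L), and the mean chord length of a phase (l = 4 V_V / S_V). The counts below are invented.

```python
def volume_fraction(points_in_phase, total_points):
    """Delesse-Rosiwal law, point-counting form: V_V = P_P."""
    return points_in_phase / total_points

def surface_density(intersections, test_line_length):
    """S_V = 2 * I_L: interface area per unit volume, from interface
    intersections per unit length of test line."""
    return 2 * intersections / test_line_length

def mean_chord_length(v_v, s_v):
    """Mean chord length of a phase: l = 4 * V_V / S_V."""
    return 4 * v_v / s_v

# Hypothetical two-phase measurement: 150 of 500 grid points fall in
# the pores; 40 interface intersections along 100 mm of test line.
vv = volume_fraction(150, 500)    # porosity, dimensionless
sv = surface_density(40, 100.0)   # interface density, 1/mm
print(round(mean_chord_length(vv, sv), 2))  # mean pore chord, mm
```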

  12. Current automated 3D cell detection methods are not a suitable replacement for manual stereologic cell counting

    Directory of Open Access Journals (Sweden)

    Christoph Schmitz

    2014-05-01

    Full Text Available Stereologic cell counting has had a major impact on the field of neuroscience. A major bottleneck in stereologic cell counting is that the user must manually decide whether or not each cell is counted according to three-dimensional (3D) stereologic counting rules by visual inspection within hundreds of microscopic fields-of-view per investigated brain or brain region. Reliance on visual inspection forces stereologic cell counting to be very labor-intensive and time-consuming, and is the main reason why biased, non-stereologic two-dimensional (2D) cell counting approaches have remained in widespread use. We present an evaluation of the performance of modern automated cell detection and segmentation algorithms as a potential alternative to the manual approach in stereologic cell counting. The image data used in this study were 3D microscopic images of thick brain tissue sections prepared with a variety of commonly used nuclear and cytoplasmic stains. The evaluation compared the numbers and locations of cells identified unambiguously and counted exhaustively by an expert observer with those found by three automated 3D cell detection algorithms: nuclei segmentation from the FARSIGHT toolkit, nuclei segmentation by 3D multiple level set methods, and the 3D object counter plug-in for ImageJ. Of these methods, FARSIGHT performed best, with true-positive detection rates between 38% and 99% and false-positive rates from 3.6% to 82%. The results demonstrate that the current automated methods suffer from lower detection rates and higher false-positive rates than are acceptable for obtaining valid estimates of cell numbers. Thus, at present, stereologic cell counting with manual decision for object inclusion according to unbiased stereologic counting rules remains the only adequate method for unbiased cell quantification in histologic tissue sections.
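    The detection-rate comparison described above can be expressed with a couple of ratios once automated detections are matched to the expert ground truth. The counts below are invented, and the definition of the false-positive rate used here (false positives over all detections) is one common choice that may differ from the paper's.

```python
def detection_rates(true_pos, false_pos, ground_truth_total):
    """True-positive rate: matched detections over all true cells.
    False-positive rate (one common definition, assumed here):
    unmatched detections over all detections."""
    tpr = true_pos / ground_truth_total
    fpr = false_pos / (true_pos + false_pos)
    return tpr, fpr

# Hypothetical field: expert marked 250 cells; the automated method
# produced 240 detections, of which 200 matched a true cell.
tpr, fpr = detection_rates(true_pos=200, false_pos=40, ground_truth_total=250)
print(round(tpr, 2), round(fpr, 2))
```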

  13. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction as well as in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human...... synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of......, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles...

  14. The total number of Leydig and Sertoli cells in the testes of men across various age groups - a stereological study

    DEFF Research Database (Denmark)

    Petersen, Peter M; Seierøe, Karina; Pakkenberg, Bente

    2015-01-01

    is particularly sensitive to methodological problems. Therefore, using the optical fractionator technique and a sampling design specifically optimized for human testes, we estimated the total number of Sertoli and Leydig cells in the testes from 26 post mortem male subjects ranging in age from 16 to 80 years...... of Sertoli cells with age; no such decline was found for Leydig cells. Quantitative stereological analysis of post mortem tissue may help understand the influence of age or disease on the number of human testicular cells....

  15. Stereology of human myometrium in pregnancy: influence of maternal body mass index and age.

    LENUS (Irish Health Repository)

    Sweeney, Eva M

    2013-04-01

    Knowledge of the stereology of human myometrium in pregnancy is limited. Uterine contractile performance may be altered in association with maternal obesity and advanced maternal age. The aim of this study was to investigate the stereology of human myometrium in pregnancy, and to evaluate a potential influence of maternal body mass index (BMI) and age.

  16. Stereology application in the investigation of physical and mechanical properties of porous materials

    International Nuclear Information System (INIS)

    Cytermann, Richard.

    1979-04-01

The sintering of carbonyl nickel powders has been studied by stereology (quantitative microscopy) combined with various physical and mechanical measurements. The study demonstrated that a set of stereological parameters, such as porosity, grain size and mean pore volume, was necessary to characterize porous parts of the same porosity obtained by different routes. On the one hand, stereology made it possible to elucidate the influence of powder shape and of the rate of pressure increase on the compacting process. On the other hand, relating the physical and mechanical properties to the microstructure led to a distinction between: properties such as the elasticity modulus, independent of compacting pressure, sintering temperature and powder shape, whose evolution has been characterized through the contiguity coefficient; and properties such as tensile strength, dependent on the sintering parameters, whose characterization required the simultaneous measurement of porosity, mean pore volume, shape factor and grain size [fr

  17. Stereological estimation of nuclear mean volume in invasive meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1996-01-01

    A stereological estimation of nuclear mean volume in bone and brain invasive meningiomas was made. For comparison the nuclear mean volume of benign meningiomas was estimated. The aim was to investigate whether this method could discriminate between these groups. We found that the nuclear mean...... volume in the bone and brain invasive meningiomas was larger than in the benign tumors. The difference was significant and moreover it was seen that there was no overlap between the two groups. In the bone invasive meningiomas the nuclear mean volume appeared to be larger inside than outside the bone....... No significant difference in nuclear mean volume was found between brain and bone invasive meningiomas. The results demonstrate that invasive meningiomas differ from benign meningiomas by an objective stereological estimation of nuclear mean volume (p

  18. Light scattering in porous materials: Geometrical optics and stereological approach

    International Nuclear Information System (INIS)

    Malinka, Aleksey V.

    2014-01-01

Porous material has been considered from the point of view of stereology (geometrical statistics), as a two-phase random mixture of solid material and air. Considered are materials whose refractive index has a real part that differs notably from unity and an imaginary part much less than unity. Light scattering in such materials has been described using geometrical optics. These two – the laws of geometrical optics and the stereological approach – allow one to obtain the inherent optical properties of such a porous material, which are basic in radiative transfer theory: the photon survival probability, the scattering phase function, and the polarization properties (Mueller matrix). In this work these characteristics are expressed through the refractive index of the material and the random chord length distribution. The obtained results are compared with the traditional approach, which models the porous material as a pack of particles of different shapes. - Highlights: • Porous material has been considered from the point of view of stereology. • Properties of a two-phase random mixture of solid material and air are considered. • Light scattering in such materials has been described using geometrical optics. • The inherent optical properties of such a porous material have been obtained

19. A comparison of porosity analysis using 2D stereology estimates and 3D serial sectioning for additively manufactured Ti-6Al-2Sn-4Zr-2Mo alloy

    International Nuclear Information System (INIS)

    Ganti, Satya R.; Velez, Michael A.; Geier, Brian A.; Hayes, Brian J.; Turner, Bryan J.; Jenkins, Elizabeth J.

    2017-01-01

Porosity is a typical defect in additively manufactured (AM) parts. Such defects limit the properties and performance of AM parts, and therefore need to be characterized accurately. Current methods for characterization of defects and microstructure rely on classical stereological methods that extrapolate information from two-dimensional images. The automation of serial sectioning provides an opportunity to precisely and accurately quantify porosity in materials in three dimensions. In this work, we analyzed the porosity of an additively manufactured Ti-6Al-2Sn-4Zr-2Mo sample using Robo-Met.3D®, an automated serial sectioning system. Image processing for three-dimensional reconstruction of the serial-sectioned two-dimensional images was performed using open-source image analysis software (Fiji/ImageJ, Dream.3D, Paraview). The results from this 3D serial sectioning analysis were then compared to classical 2D stereological methods (Saltykov stereological theory). We found that for this dataset, the classical 2D methods underestimated the porosity size and the distributions of the larger pores, a critical attribute for the fatigue behavior of the AM part. The results suggest that acquiring experimental data with equipment such as Robo-Met.3D®, which measures the number and size of particles such as pores in a volume irrespective of their shape, is the better choice.

  20. Comparison of automated brain volumetry methods with stereology in children aged 2 to 3 years

    Energy Technology Data Exchange (ETDEWEB)

Mayer, Kristina N. [University Children's Hospital of Zurich, Center for MR Research, Zurich (Switzerland); University Children's Hospital, Pediatric Cardiology, Zurich (Switzerland); Latal, Beatrice [University Children's Hospital, Child Development Center, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland); Knirsch, Walter [University Children's Hospital, Pediatric Cardiology, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland); Scheer, Ianina [University Children's Hospital, Department for Diagnostic Neuroradiology, Zurich (Switzerland); Rhein, Michael von [University Children's Hospital, Child Development Center, Zurich (Switzerland); Reich, Bettina; Bauer, Juergen; Gummel, Kerstin [Justus-Liebig University, Pediatric Heart Center, University Hospital Giessen, Giessen (Germany); Roberts, Neil [University of Edinburgh, Clinical Research and Imaging Centre (CRIC), The Queens Medical Research Institute (QMRI), Edinburgh (United Kingdom); O'Gorman Tuura, Ruth [University Children's Hospital of Zurich, Center for MR Research, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland)

    2016-09-15

    The accurate and precise measurement of brain volumes in young children is important for early identification of children with reduced brain volumes and an increased risk for neurodevelopmental impairment. Brain volumes can be measured from cerebral MRI (cMRI), but most neuroimaging tools used for cerebral segmentation and volumetry were developed for use in adults and have not been validated in infants or young children. Here, we investigate the feasibility and accuracy of three automated software methods (i.e., SPM, FSL, and FreeSurfer) for brain volumetry in young children and compare the measures with corresponding volumes obtained using the Cavalieri method of modern design stereology. Cerebral MRI data were collected from 21 children with a complex congenital heart disease (CHD) before Fontan procedure, at a median age of 27 months (range 20.9-42.4 months). Data were segmented with SPM, FSL, and FreeSurfer, and total intracranial volume (ICV) and total brain volume (TBV) were compared with corresponding measures obtained using the Cavalieri method. Agreement between the estimated brain volumes (ICV and TBV) relative to the gold standard stereological volumes was strongest for FreeSurfer (p < 0.001) and moderate for SPM segment (ICV p = 0.05; TBV p = 0.006). No significant association was evident between ICV and TBV obtained using SPM NewSegment and FSL FAST and the corresponding stereological volumes. FreeSurfer provides an accurate method for measuring brain volumes in young children, even in the presence of structural brain abnormalities. (orig.)
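The Cavalieri method used above as the stereological gold standard estimates volume from systematically spaced sections with a random start: count test-grid points hitting the structure on each section, then multiply the total by the section spacing and the area each point represents. A minimal sketch; the spacing, grid constant, and point counts are invented for illustration:

```python
def cavalieri_volume(point_counts, spacing_mm, area_per_point_mm2):
    """Cavalieri estimator: V ≈ section spacing × area-per-point × total points
    counted over systematic sections with a uniformly random start."""
    total_points = sum(point_counts)
    return spacing_mm * area_per_point_mm2 * total_points

counts = [12, 30, 41, 38, 22, 9]  # grid points hitting the brain per section
print(cavalieri_volume(counts, spacing_mm=4.0, area_per_point_mm2=100.0))
# 4.0 * 100.0 * 152 = 60800.0 mm^3
```

The random start of the section series is what makes the estimator unbiased; denser grids and thinner spacing trade operator time for precision.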

  1. Comparison of automated brain volumetry methods with stereology in children aged 2 to 3 years

    International Nuclear Information System (INIS)

    Mayer, Kristina N.; Latal, Beatrice; Knirsch, Walter; Scheer, Ianina; Rhein, Michael von; Reich, Bettina; Bauer, Juergen; Gummel, Kerstin; Roberts, Neil; O'Gorman Tuura, Ruth

    2016-01-01

    The accurate and precise measurement of brain volumes in young children is important for early identification of children with reduced brain volumes and an increased risk for neurodevelopmental impairment. Brain volumes can be measured from cerebral MRI (cMRI), but most neuroimaging tools used for cerebral segmentation and volumetry were developed for use in adults and have not been validated in infants or young children. Here, we investigate the feasibility and accuracy of three automated software methods (i.e., SPM, FSL, and FreeSurfer) for brain volumetry in young children and compare the measures with corresponding volumes obtained using the Cavalieri method of modern design stereology. Cerebral MRI data were collected from 21 children with a complex congenital heart disease (CHD) before Fontan procedure, at a median age of 27 months (range 20.9-42.4 months). Data were segmented with SPM, FSL, and FreeSurfer, and total intracranial volume (ICV) and total brain volume (TBV) were compared with corresponding measures obtained using the Cavalieri method. Agreement between the estimated brain volumes (ICV and TBV) relative to the gold standard stereological volumes was strongest for FreeSurfer (p < 0.001) and moderate for SPM segment (ICV p = 0.05; TBV p = 0.006). No significant association was evident between ICV and TBV obtained using SPM NewSegment and FSL FAST and the corresponding stereological volumes. FreeSurfer provides an accurate method for measuring brain volumes in young children, even in the presence of structural brain abnormalities. (orig.)

  2. The changes of stage distribution of seminiferous epithelium cycle and its correlations with Leydig cell stereological parameters in aging men.

    Science.gov (United States)

    Huang, Rui; Zhu, Wei-Jie; Li, Jing; Gu, Yi-Qun

    2014-12-01

To evaluate the changes of stage distribution of the seminiferous epithelium cycle and its correlations with Leydig cell stereological parameters in aging men. The point counting method was used to analyze the stereological parameters of Leydig cells. The stage number of the seminiferous epithelium cycle was calculated in the same testicular tissue samples used for the Leydig cell stereological analysis. The aging group showed more severe pathological changes as well as higher pathology scores than the young group. Compared with the control group, the volume density (Vv) and surface density (NA) of Leydig cells in the aging group were increased significantly. The stage number of the seminiferous epithelium cycle in the aging group was concomitantly decreased compared to the young group. Leydig cell Vv in the young group had a positive relationship with stages I, II, III, V and VI of the seminiferous epithelium cycle, and Leydig cell NA and numerical density (Nv) were positively related to stage IV. However, only the correlation between Nv and stage II was found in the aging group. The stage number of the seminiferous epithelium cycle was decreased in aging testes. Changes in the stage distribution in aging testes were related to the Leydig cell stereological parameters, presenting as a sign of morphological change. Copyright © 2014 Elsevier GmbH. All rights reserved.

  3. Stereology, an unbiased methodological approach to study plant anatomy and cytology: Past, present and future

    Czech Academy of Sciences Publication Activity Database

    Kubínová, Lucie; Radochová, Barbora; Lhotáková, Z.; Kubínová, Z.; Albrechtová, J.

    2017-01-01

Roč. 36, č. 3 (2017), s. 187-205 ISSN 1580-3139 R&D Projects: GA MŠk(CZ) LM2015062 Institutional support: RVO:67985823 Keywords: chloroplast * confocal microscopy * leaf anatomy * mesophyll * stereological methods * systematic uniform random sampling Subject RIV: FS - Medical Facilities; Equipment OBOR OECD: Medical laboratory technology (including laboratory samples analysis) Impact factor: 1.135, year: 2016

  4. PLASTICITY OF SKELETAL MUSCLE STUDIED BY STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Ida Eržen

    2011-05-01

The present contribution provides an overview of stereological methods applied in skeletal muscle research at the Institute of Anatomy of the Medical Faculty in Ljubljana. Interested in skeletal muscle plasticity, we studied three different topics: (i) expression of myosin heavy chain isoforms in slow and fast muscles under experimental conditions, (ii) frequency of satellite cells in young and old human and rat muscles and (iii) capillary supply of rat fast and slow muscles. We analysed the expression of myosin heavy chain isoforms within the slow rat soleus and fast extensor digitorum longus muscles after (i) homotopic and heterotopic transplantation of both muscles, (ii) low-frequency electrical stimulation of the fast muscle and (iii) transposition of the fast nerve to the slow muscle. The models applied were able to turn the fast muscle into a completely slow muscle, but not vice versa. One of the indicators of the regenerative potential of skeletal muscle is its satellite cell pool. The estimated parameters, the number of satellite cells per unit fibre length corrected to the reference sarcomere length (Nsc/Lfib) and the number of satellite cells per number of nuclei (myonuclei and satellite cell nuclei) (Nsc/Nnucl), indicated that the frequency of M-cadherin-stained satellite cells declines in healthy old human and rat muscles compared to young muscles. To assess differences in capillary densities among slow and fast muscles and slow and fast muscle fibres, we introduced the Slicer and Fakir methods and tested them on predominantly slow and fast rat muscles. Discussing three different topics that require different approaches, the present paper reflects three decades of the development of stereological methods: 2D analysis by simple point counting in the 70s, the disector in the 80s and virtual spatial probes in the 90s. In all methods the interactive computer-assisted approach was utilised.

  5. Improving efficiency in stereology

    DEFF Research Database (Denmark)

    Keller, Kresten Krarup; Andersen, Ina Trolle; Andersen, Johnnie Bremholm

    2013-01-01

    of the study was to investigate the time efficiency of the proportionator and the autodisector on virtual slides compared with traditional methods in a practical application, namely the estimation of osteoclast numbers in paws from mice with experimental arthritis and control mice. Tissue slides were scanned......, a proportionator sampling and a systematic, uniform random sampling were simulated. We found that the proportionator was 50% to 90% more time efficient than systematic, uniform random sampling. The time efficiency of the autodisector on virtual slides was 60% to 100% better than the disector on tissue slides. We...... conclude that both the proportionator and the autodisector on virtual slides may improve efficiency of cell counting in stereology....
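The proportionator compared above is, in essence, sampling fields of view with probability proportional to an automatically measured weight (for example, the amount of staining signal) and reweighting each sampled count by its inverse sampling probability (a Horvitz-Thompson estimate). A minimal sketch of that idea, not the authors' implementation; the weights, counts, and sample size are invented:

```python
import random

def proportionator_estimate(weights, counts, n, rng=None):
    """Systematic PPS sampling of n fields along the cumulative weight axis;
    each sampled field contributes count / (sampling probability)."""
    rng = rng or random.Random(1)
    W = sum(weights)
    step = W / n
    picks = [rng.uniform(0, step) + i * step for i in range(1)] + []
    picks = [picks[0] + i * step for i in range(n)]  # random start, then every `step`
    estimate, cum, j = 0.0, 0.0, 0
    for w, c in zip(weights, counts):
        cum += w
        while j < n and picks[j] < cum:   # field hit (possibly more than once)
            estimate += c * W / (n * w)   # inverse-probability weighting
            j += 1
    return estimate

print(proportionator_estimate([1] * 10, [3] * 10, 5))  # 30.0, the true total
```

When the weights are exactly proportional to the counts, every sample returns the true total with zero variance; the closer the automatic weights track the real cell counts, the more efficient the proportionator is, which is the source of the time savings reported above.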

  6. An unbiased stereological method for efficiently quantifying the innervation of the heart and other organs based on total length estimations

    DEFF Research Database (Denmark)

    Mühlfeld, Christian; Papadakis, Tamara; Krasteva, Gabriela

    2010-01-01

    Quantitative information about the innervation is essential to analyze the structure-function relationships of organs. So far, there has been no unbiased stereological tool for this purpose. This study presents a new unbiased and efficient method to quantify the total length of axons in a given...... reference volume, illustrated on the left ventricle of the mouse heart. The method is based on the following steps: 1) estimation of the reference volume; 2) randomization of location and orientation using appropriate sampling techniques; 3) counting of nerve fiber profiles hit by a defined test area within...
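The counting step sketched in the abstract rests on the standard stereological relation for length on isotropic uniform random sections: length density is L_V = 2·Q_A, where Q_A is the number of profile transects per unit test area, and total length follows by multiplying with the reference volume. A minimal illustration with invented counts and dimensions:

```python
def total_length(profile_counts, test_area_per_field_um2, ref_volume_um3):
    """Total length estimate from IUR sections: L = 2 * Q_A * V(ref),
    with Q_A = profiles counted per unit test area."""
    q_a = sum(profile_counts) / (len(profile_counts) * test_area_per_field_um2)
    return 2.0 * q_a * ref_volume_um3

counts = [4, 7, 5, 6, 3]            # axon profiles per counting frame (invented)
print(total_length(counts, 1.0e4, 5.0e9))  # 5,000,000 µm of axon
```

The factor 2 comes from the mean projection of an isotropically oriented line segment onto the section normal; it only holds when section orientation is randomized, which is why step 2 of the method above matters.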

  7. A handheld support system to facilitate stereological measurements and mapping of branching structures

    DEFF Research Database (Denmark)

    Gardi, J.E.; Wulfsohn, Dvora-Laiô; Nyengaard, J.R.

    2007-01-01

'BranchSampler' is a system for computer-assisted manual stereology written for handheld devices running Windows CE. The system has been designed specifically to streamline data collection and optimize sampling of tree-like branching structures, with particular aims of reducing user errors, saving...... specifications, software and Graphical User Interface (GUI) development, functionality and application of the handheld system using four examples: (1) sampling monkey lung bronchioles for estimation of diameter and wall thickness; (2) sampling rat kidney for estimating number of arteries and arterioles......

  8. Three-dimensional stereology as a tool for evaluating bladder outlet obstruction

    DEFF Research Database (Denmark)

    Wijk, J. Van der; Wijk, J. Van der; Horn, T.

    2008-01-01

...... tract symptoms (LUTS) suggestive of BOO and five controls (mean age 48.6 years; range 43-53 years) without LUTS were studied. All participants underwent a full examination, including determination of the International Prostate Symptom Score, laboratory analysis and a urodynamic evaluation. A cold-cup biopsy, taken during cystoscopy, was stereologically evaluated to determine the smooth muscle cell volume and the fractions of collagen and smooth muscle using light and electron microscopy. Results: The collagen fraction was higher in patients than in controls (probably because the patients were older......

  9. Systematic versus random sampling in stereological studies.

    Science.gov (United States)

    West, Mark J

    2012-12-01

    The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
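The card-sampling analogy above translates directly into code. A minimal sketch (section counts and periods are invented) contrasting independent random sampling with systematic uniform random sampling of sections:

```python
import random

def independent_sample(n_sections, n_pick, rng):
    """Pick any n_pick sections, each without reference to the others."""
    return sorted(rng.sample(range(n_sections), n_pick))

def systematic_sample(n_sections, period, rng):
    """Pick one section at random within the first `period`, then every
    period-th section thereafter (systematic uniform random sampling)."""
    start = rng.randrange(period)
    return list(range(start, n_sections, period))

rng = random.Random(42)
print(independent_sample(100, 10, rng))   # scattered, may cluster
print(systematic_sample(100, 10, rng))    # evenly spaced: [s, s+10, ..., s+90]
```

Both schemes are unbiased because the start (or each draw) is random; the systematic scheme is usually more precise for biological structures because its evenly spaced sections cannot cluster in one region.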

  10. Stereological Cell Morphometry In Right Atrium Myocardium Of Primates

    Science.gov (United States)

Mandarim-De-Lacerda, Carlos A.; Hureau, Jacques

    1986-07-01

The mechanism by which the cardiac impulse is propagated in normal hearts from its origin in the sinus node to the atrioventricular node has not been fully agreed on. We studied the "internodal posterior tract" through the crista terminalis by light microscopy and stereological morphometry. The hearts of 12 Papio cynocephalus were perfused, after sacrifice, with phosphate-buffered formol saline. The regions of the crista terminalis (CT), interatrial septum (IAS), atrioventricular bundle (AVB) and interventricular septum (IVS) were cut out, embedded in Paraplast and sectioned (10 µm). The multipurpose test system M42 was superimposed over the photomicrographs (1,890 test points, ESR = 2%) for the stereological computations. The quantitative results show that the cells from the CT were more closely related to the IAS cells than to the other cells (IVS and AVB cells). These results are not morphological evidence establishing the specificity of the "internodal posterior tract". The cellular arrangement and anatomical variation in the CT myocardium are very important.

  11. Quantifying Golgi structure using EM: combining volume-SEM and stereology for higher throughput.

    Science.gov (United States)

    Ferguson, Sophie; Steyer, Anna M; Mayhew, Terry M; Schwab, Yannick; Lucocq, John Milton

    2017-06-01

    Investigating organelles such as the Golgi complex depends increasingly on high-throughput quantitative morphological analyses from multiple experimental or genetic conditions. Light microscopy (LM) has been an effective tool for screening but fails to reveal fine details of Golgi structures such as vesicles, tubules and cisternae. Electron microscopy (EM) has sufficient resolution but traditional transmission EM (TEM) methods are slow and inefficient. Newer volume scanning EM (volume-SEM) methods now have the potential to speed up 3D analysis by automated sectioning and imaging. However, they produce large arrays of sections and/or images, which require labour-intensive 3D reconstruction for quantitation on limited cell numbers. Here, we show that the information storage, digital waste and workload involved in using volume-SEM can be reduced substantially using sampling-based stereology. Using the Golgi as an example, we describe how Golgi populations can be sensed quantitatively using single random slices and how accurate quantitative structural data on Golgi organelles of individual cells can be obtained using only 5-10 sections/images taken from a volume-SEM series (thereby sensing population parameters and cell-cell variability). The approach will be useful in techniques such as correlative LM and EM (CLEM) where small samples of cells are treated and where there may be variable responses. For Golgi study, we outline a series of stereological estimators that are suited to these analyses and suggest workflows, which have the potential to enhance the speed and relevance of data acquisition in volume-SEM.

  12. DNA level and stereologic estimates of nuclear volume in squamous cell carcinomas of the uterine cervix. A comparative study with analysis of prognostic impact

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bichel, P; Jakobsen, A

    1992-01-01

    Grading of malignancy in squamous cell carcinomas of the uterine cervix is based on qualitative, morphologic examination and suffers from poor reproducibility. Using modern stereology, unbiased estimates of the three-dimensional, volume-weighted mean nuclear volume (nuclear vv), were obtained...... in pretreatment biopsies from 51 patients treated for cervical cancer in clinical Stages I through III (mean age of 56 years, follow-up period greater than 5 years). In addition, conventional, two-dimensional morphometric estimates of nuclear and mitotic features were obtained. DNA indices (DI) were estimated...... of nuclear vv were only of marginal prognostic significance (2P = 0.07). However, Cox multivariate regression analysis showed independent prognostic value of patient age and nuclear vv along with clinical stage and DI. All other investigated variables were rejected from the model. A prognostic index...

  13. Nondestructive, stereological estimation of canopy surface area

    DEFF Research Database (Denmark)

    Wulfsohn, Dvora-Laio; Sciortino, Marco; Aaslyng, Jesper M.

    2010-01-01

    We describe a stereological procedure to estimate the total leaf surface area of a plant canopy in vivo, and address the problem of how to predict the variance of the corresponding estimator. The procedure involves three nested systematic uniform random sampling stages: (i) selection of plants from...... a canopy using the smooth fractionator, (ii) sampling of leaves from the selected plants using the fractionator, and (iii) area estimation of the sampled leaves using point counting. We apply this procedure to estimate the total area of a chrysanthemum (Chrysanthemum morifolium L.) canopy and evaluate both...... the time required and the precision of the estimator. Furthermore, we compare the precision of point counting for three different grid intensities with that of several standard leaf area measurement techniques. Results showed that the precision of the plant leaf area estimator based on point counting...
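The nested design described above can be sketched in a few lines: sample every k-th plant with a random start, every m-th leaf of each sampled plant, estimate the sampled leaf areas by point counting, then scale up by the inverse sampling fractions. This is an illustrative sketch, not the authors' implementation; the smooth fractionator's size-ordering step is omitted, and all data are invented:

```python
import random

def fractionator(items, period, rng):
    """Keep every `period`-th item starting from a uniformly random start."""
    start = rng.randrange(period)
    return items[start::period]

def canopy_area(plants, plant_period, leaf_period, area_per_point, rng):
    """plants: list of plants, each a list of per-leaf grid-point counts.
    Returns the total-canopy leaf area estimate."""
    total_points = 0
    for plant in fractionator(plants, plant_period, rng):        # stage (i)
        for leaf_points in fractionator(plant, leaf_period, rng):  # stage (ii)
            total_points += leaf_points                           # stage (iii)
    # each counted point stands for plant_period * leaf_period * area_per_point
    return total_points * plant_period * leaf_period * area_per_point
```

With sampling periods of 1 (census) the estimate reduces exactly to point counting over every leaf; larger periods trade precision for a large reduction in measurement work.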

  14. Stereological estimation of nuclear volume in benign and atypical meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1993-01-01

    A stereological estimation of nuclear volume in benign and atypical meningiomas was made. The aim was to investigate whether this method could discriminate between these two meningeal neoplasms. The difference was significant and it was moreover seen that there was no overlap between the two groups...

  15. Confocal stereology: an efficient tool for measurement of microscopic structures

    Czech Academy of Sciences Publication Activity Database

    Kubínová, Lucie; Janáček, Jiří

    2015-01-01

    Roč. 360, č. 1 (2015), s. 13-28 ISSN 0302-766X R&D Projects: GA MŠk(CZ) LH13028 Institutional support: RVO:67985823 Keywords : 3-D images * confocal microscopy * geometrical characteristics * spatial probes * stereology Subject RIV: EA - Cell Biology Impact factor: 2.948, year: 2015

  16. Blood capillary length estimation from three-dimensional microscopic data by image analysis and stereology.

    Science.gov (United States)

    Kubínová, Lucie; Mao, Xiao Wen; Janáček, Jiří

    2013-08-01

    Studies of the capillary bed characterized by its length or length density are relevant in many biomedical studies. A reliable assessment of capillary length from two-dimensional (2D), thin histological sections is a rather difficult task as it requires physical cutting of such sections in randomized directions. This is often technically demanding, inefficient, or outright impossible. However, if 3D image data of the microscopic structure under investigation are available, methods of length estimation that do not require randomized physical cutting of sections may be applied. Two different rat brain regions were optically sliced by confocal microscopy and resulting 3D images processed by three types of capillary length estimation methods: (1) stereological methods based on a computer generation of isotropic uniform random virtual test probes in 3D, either in the form of spatial grids of virtual "slicer" planes or spherical probes; (2) automatic method employing a digital version of the Crofton relations using the Euler characteristic of planar sections of the binary image; and (3) interactive "tracer" method for length measurement based on a manual delineation in 3D of the axes of capillary segments. The presented methods were compared in terms of their practical applicability, efficiency, and precision.

  17. Autogenous bone graft and ePTFE membrane in the treatment of peri-implantitis. II. Stereologic and histologic observations in cynomolgus monkeys

    DEFF Research Database (Denmark)

    Schou, Søren; Holmstrup, Palle; Skovgaard, Lene Theil

    2003-01-01

autogenous bone graft; guided bone regeneration; histology; membrane; non-human primates; oral implants; osseointegration; pathology; peri-implantitis; stereology; treatment

  18. Stereological estimates of nuclear volume in the prognostic evaluation of primary flat carcinoma in situ of the urinary bladder

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Jacobsen, F

    1991-01-01

    Primary, flat carcinoma in situ of the urinary bladder is rare and its behaviour is unpredictable. The aim of this retrospective study was to obtain base-line data and investigate the prognostic value of unbiased, stereological estimates of the volume-weighted mean nuclear volume, nuclear vv, in ...

  19. Ultrastructural and stereological analysis of trypanosomatidis of the genus Endotrypanum

    Directory of Open Access Journals (Sweden)

    Maurílio J. Soares

    1991-06-01

Culture forms of four strains of Endotrypanum (E. schaudinni and E. monterogeii) were processed for transmission electron microscopy and analyzed at the ultrastructural level. Quantitative data on some cytoplasmic organelles were obtained by stereology. All culture forms were promastigotes. In their cytoplasm four different organelles could be found: lipid inclusions (0.2-0.4 µm in diameter), membrane-bounded vacuoles (0.10-0.28 µm in diameter), glycosomes (0.2-0.3 µm in diameter), and the mitochondrion. The kinetoplast appears as a thin band, except in strain IM201, which possesses a broader structure and possibly is not a member of this genus. Clusters of virus-like particles were seen in the cytoplasm of strain LV88. The data obtained show that all strains have the typical morphological features of the trypanosomatids. Only strain IM201 could be differentiated from the others, due to its larger kinetoplast-DNA network and its larger mitochondrial and glycosomal relative volumes. The morphometric data did not allow differentiation between E. schaudinni (strains IM217 and M6226) and E. monterogeii (strain LV88).

  20. Stereological estimates of nuclear volume in squamous cell carcinoma of the uterine cervix and its precursors

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bichel, P; Jakobsen, A

    1991-01-01

    Using modern stereology, this study was carried out to obtain base-line data concerning three-dimensional, mean nuclear size in precancerous and invasive lesions of the uterine cervix. Unbiased estimates of the volume-weighted mean nuclear volume (nuclear vv) were obtained by point-sampling of nuclear intercepts in 51 pre-treatment biopsies from patients with invasive squamous cell carcinomas (SCC). Vertical sections from 27 specimens with cervical intraepithelial neoplasia (CIN) grades I through III were also investigated, along with 10 CIN III associated with microinvasion (CIN III + M). On average, nuclear vv was larger in SCC than in CIN III and CIN III + M together (2P = 8.9 × 10(-5)). A conspicuous overlap of nuclear vv existed between all investigated lesional groups. The reproducibility of estimates of nuclear vv in biopsies with SCC was acceptable (r = 0.85 and r = 0.84 in intra...

  1. DNA level and stereologic estimates of nuclear volume in squamous cell carcinomas of the uterine cervix. A comparative study with analysis of prognostic impact

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bichel, P; Jakobsen, A

    1992-01-01

    Grading of malignancy in squamous cell carcinomas of the uterine cervix is based on qualitative, morphologic examination and suffers from poor reproducibility. Using modern stereology, unbiased estimates of the three-dimensional, volume-weighted mean nuclear volume (nuclear vv) were obtained in pretreatment biopsies from 51 patients treated for cervical cancer in clinical Stages I through III (mean age of 56 years, follow-up period greater than 5 years). In addition, conventional, two-dimensional morphometric estimates of nuclear and mitotic features were obtained. DNA indices (DI) were estimated...


  3. Cerebral atrophy in AIDS: a stereological study

    DEFF Research Database (Denmark)

    Oster, S; Christoffersen, P; Gundersen, H J

    1993-01-01

    Stereological estimates of mean volumes, surface areas, and cortical thicknesses were obtained on formalin-fixed brains from 19 men with AIDS and 19 controls. Volumes of neocortex, white matter, central brain nuclei, ventricles and archicortex were estimated using point counting and Cavalieri's unbiased principle for volume estimation. In AIDS, the mean volume of neocortex was reduced by 11%, and that of the central brain nuclei by 18%. Mean ventricular volume was increased by 55%. Mean neocortical thickness was reduced by 12%. The mean volume of white matter was reduced by 13%. The findings in 6...

  4. Unbiased stereological methods used for the quantitative evaluation of guided bone regeneration

    DEFF Research Database (Denmark)

    Aaboe, Else Merete; Pinholt, E M; Schou, S

    1998-01-01

    The present study describes the use of unbiased stereological methods for the quantitative evaluation of the amount of regenerated bone. Using the principle of guided bone regeneration the amount of regenerated bone after placement of degradable or non-degradable membranes covering defects...

  5. FIRST USE OF STEREOLOGY TO QUANTIFY THE SURVIVAL OF FAT AUTOGRAFTS

    Directory of Open Access Journals (Sweden)

    Eduardo Serna Cuéllar

    2011-05-01

    It is not usual to perform quantitative analyses on surgical materials. Rather, they are evaluated clinically, through qualitative methods, and if quantitation is done, it is on a 2-dimensional basis. In this study, the long-term survival of fat autografts (FAG) in 40 subjects with facial soft tissue defects is quantified. An adipose tissue preparation from the abdomen, obtained through liposuction and centrifugation, is injected subcutaneously. Approximately 14 months later, the treated area is biopsied. Extensive computer-based histological analyses were performed using the stereological method in order to directly obtain three parameters: the volume fraction of adipocytes in the fat tissue (VV), the density (number per volume) of adipocytes in the fat tissue (NV), and the mean cell volume of adipocytes (VA) in each tissue sample. A set of equations based on these three quantitative parameters is produced for evaluation of the volumetric survival fraction (VSF) of FAG. The presented data evidenced a 66% survival fraction at the 14-month follow-up. In routine practice, it would be sufficient to perform this volumetric analysis on the injected and biopsied fat samples to know what fraction of the FAG has survived. This is an objective method for quantifying FAG survival and will allow a standardized comparison between different research series and authors.

  6. Estimation of absolute microglial cell numbers in mouse fascia dentata using unbiased and efficient stereological cell counting principles

    DEFF Research Database (Denmark)

    Wirenfeldt, Martin; Dalmau, Ishar; Finsen, Bente

    2003-01-01

    Stereology offers a set of unbiased principles to obtain precise estimates of total cell numbers in a defined region. In terms of microglia, which in the traumatized and diseased CNS is an extremely dynamic cell population, the strength of stereology is that the resultant estimate is unaffected...... of microglia, although with this thickness, the intensity of the staining is too high to distinguish single cells. Lectin histochemistry does not visualize microglia throughout the section and, accordingly, is not suited for the optical fractionator. The mean total number of Mac-1+ microglial cells...... in the unilateral dentate gyrus of the normal young adult male C57BL/6 mouse was estimated to be 12,300 (coefficient of variation (CV)=0.13) with a mean coefficient of error (CE) of 0.06. The perspective of estimating microglial cell numbers using stereology is to establish a solid basis for studying the dynamics...

  7. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    Science.gov (United States)

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
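    The design-based paradigm contrasted in this abstract can be illustrated with a minimal, hypothetical sketch: in systematic uniform random sampling (SURS), every item receives a known, equal inclusion probability from the sampling design itself, so the scaled sample total is unbiased without any model of the material. Function names and numbers below are illustrative, not from the article:

    ```python
    import random

    def surs_positions(n_items, period):
        """Systematic uniform random sampling: take every `period`-th item
        after a uniformly random start, so each item has the same known
        inclusion probability 1/period (design-based, no model assumptions)."""
        start = random.randrange(period)
        return list(range(start, n_items, period))

    def estimate_total(values, period):
        """Design-based (unbiased) estimate of the population total:
        the sampled sum scaled by the inverse sampling fraction."""
        sampled = surs_positions(len(values), period)
        return period * sum(values[i] for i in sampled)
    ```

    Because the inclusion probabilities come from the design rather than from assumptions about how the quantity is distributed in the tissue, the estimator remains unbiased even for highly inhomogeneous material.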

  8. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  9. A note on stereological estimation of the volume-weighted second moment of particle volume

    DEFF Research Database (Denmark)

    Jensen, E B; Sørensen, Flemming Brandt

    1991-01-01

    It is shown that for a variety of biological particle shapes, the volume-weighted second moment of particle volume can be estimated stereologically using only the areas of particle transects, which can be estimated manually by point-counting....

  10. Three-dimensional stereology as a tool for evaluating bladder outlet obstruction

    DEFF Research Database (Denmark)

    Van Der Wijk, Jasper; Van Der Wijk, Jan; Horn, Thomas

    2008-01-01

    Objective. In a pilot study we evaluated whether implementation of a novel 3D stereologic technique can prove that bladder outlet obstruction (BOO) is associated with morphologic changes in the bladder wall. Material and methods. Ten males (mean age 69.7 years; range 58-84 years) with lower urinary tract symptoms (LUTS) suggestive of BOO and five controls (mean age 48.6 years; range 43-53 years) without LUTS were studied. All participants underwent a full examination, including determination of the International Prostate Symptom Score, laboratory analysis and a urodynamic evaluation. A cold... Conclusions. This pilot study shows that, even with the implementation of subtle morphometric techniques, there seems to be no relationship between the severity of BOO and bladder wall morphology. It is possible that interstitial collagen in the bladder wall increases with age. It seems that bladder wall...

  11. DNA-index and stereological estimation of nuclear volume in primary and metastatic malignant melanomas

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Kristensen, I B; Grymer, F

    1990-01-01

    The aim of this study was to investigate the relationship between physical nuclear volume and ploidy level in malignant melanomas, and to analyse the heterogeneity of these two parameters among primary and corresponding secondary tumours. Unbiased stereological estimates of nuclear volume can...

  12. Evaluation of FITC-induced atopic dermatitis-like disease in NC/Nga mice and BALB/c mice using computer-assisted stereological toolbox, a computer-aided morphometric system

    DEFF Research Database (Denmark)

    Hvid, Malene; Jensen, Helene Kofoed; Deleuran, Bent

    2009-01-01

    Stereological Toolbox as a stereological method, the mice were sensitized to FITC and the histological efficiency of disease induction with regard to inflammation and CD4+ and CD8+ lymphocytes, in addition to mast cells, was evaluated. The method was validated by comparison to a conventional semiquantitative...

  13. Unbiased Stereologic Estimation of the Spatial Distribution of Paget’s Disease in the Human Temporal Bone

    DEFF Research Database (Denmark)

    Bloch, Sune Land; Sørensen, Mads Sølvsten

    2014-01-01

    remodeling around the inner ear space and to compare it with that of otosclerosis in a contemporary context of temporal bone dynamics. MATERIALS AND METHODS: From the temporal bone collection of Massachusetts Eye and Ear Infirmary, 15 of 29 temporal bones with Paget's disease were selected to obtain...... an independent sample. All volume distributions were obtained along the normal axis of capsular bone remodeling activity by the use of vector-based stereology. RESULTS: Pagetic bone remodeling was distributed centrifugally around the inner ear space at the individual and the general level. This pattern...

  14. Unbiased stereological estimation of d-dimensional volume in Rn from an isotropic random slice through a fixed point

    DEFF Research Database (Denmark)

    Jensen, Eva B. Vedel; Kiêu, K

    1994-01-01

    Unbiased stereological estimators of d-dimensional volume in R(n) are derived, based on information from an isotropic random r-slice through a specified point. The content of the slice can be subsampled by means of a spatial grid. The estimators depend only on spatial distances. As a fundamental lemma, an explicit formula for the probability that an isotropic random r-slice in R(n) through 0 hits a fixed point in R(n) is given.

  15. Estimation of fetal volume by magnetic resonance imaging and stereology.

    Science.gov (United States)

    Roberts, N; Garden, A S; Cruz-Orive, L M; Whitehouse, G H; Edwards, R H

    1994-11-01

    The current methods to monitor fetal growth in utero are based on ultrasound image measurements which, lacking a proper sampling methodology, may be biased to unknown degrees. The Cavalieri method of stereology guarantees the accurate estimation of the volume of an arbitrary object from a few systematic sections. Non-invasive scanning methods, and magnetic resonance imaging (MRI) in particular, are valuable tools to provide the necessary sections, and therefore offer interesting possibilities for unbiased quantification. This paper describes how to estimate fetal volume in utero with a coefficient of error of less than 5% in less than 5 min, from three or four properly sampled MRI scans. MRI was chosen because it does not use ionizing radiation on the one hand, and it offers good image quality on the other. The impact of potential sources of bias such as fetal motion, chemical shift and partial voluming artefacts is discussed. The methods are illustrated on four subjects monitored between weeks 28 and 40 of gestation.
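    The Cavalieri method described above reduces to one line of arithmetic: volume = section spacing × the summed section areas, where each area is itself estimated by point counting. A minimal sketch (function and parameter names are illustrative, not from the paper):

    ```python
    def cavalieri_volume(point_counts, area_per_point, spacing):
        """Cavalieri estimator: V = T * sum(a_i), where each systematic
        section area a_i is estimated by point counting as
        (points hitting the object) * (area associated with one grid point)."""
        areas = [p * area_per_point for p in point_counts]
        return spacing * sum(areas)
    ```

    For example, three sections 5 mm apart with 10, 20 and 10 grid points hit, at 2 mm² per point, give an estimated volume of 400 mm³.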

  16. FIBRIN-TYPE FIBRINOID IN HUMAN PLACENTA: A STEREOLOGICAL ANALYSIS OF ITS ASSOCIATION WITH INTERVILLOUS VOLUME AND VILLOUS SURFACE AREA

    Directory of Open Access Journals (Sweden)

    Terry M Mayhew

    2011-05-01

    Stereological methods were used to examine fibrin-type fibrinoid deposition in the intervillous spaces of human placentas collected during gestation (12-41 weeks) and from term pregnancies at low (400 m) and high (3.6 km) altitude. The main aim was to test predictions about the relationships between fibrinoid deposits and either the volume of intervillous space or the surface area of (intermediate + terminal) villi. Fields of view on Masson trichrome-stained paraffin sections were selected as part of a systematic sampling design which randomised section location and orientation. Relative and absolute volumes were estimated by test point counting and surfaces by intersection counting. Apparent differences were tested by analyses of variance and relationships by correlation and regression analysis. Fibrinoid volume increased during gestation and correlated positively with intervillous volume and villous surface area. However, relative to intervillous volume, the main increase in fibrinoid occurred towards term (36-41 weeks). At high altitude, placentas contained more intervillous space but less fibrinoid. At both altitudes, there were significant correlations between fibrinoid volume and villous surface area. In all cases, changes in fibrinoid volume were commensurate with changes in villous surface area. Whilst findings lend support to the notion that fibrinoid deposition during normal gestation is influenced by the quality of vascular perfusion, they also emphasise that the extent of the villous surface is a more generally important factor. The villous surface may influence the steady state between coagulation and fibrinolysis since some pro-coagulatory events operate at the trophoblastic epithelium. They occur notably at sites of trophoblast de-epithelialisation and these arise following trauma or during the extrusion phase of normal epithelial turnover.
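    The two counting rules used in this abstract (test points for volumes, test-line intersections for surfaces) correspond to standard stereological identities: the volume fraction equals the point fraction (V_V = P_P), and for isotropic uniform random test lines the surface density is twice the intersection count per unit line length (S_V = 2·I_L). A minimal sketch of those identities, not the authors' code:

    ```python
    def volume_fraction(points_in_phase, total_points):
        # Point-counting principle: V_V = P_P
        return points_in_phase / total_points

    def surface_density(intersections, total_line_length):
        # For isotropic uniform random test lines: S_V = 2 * I / L
        return 2.0 * intersections / total_line_length
    ```

    Multiplying a fraction such as V_V by a reference volume (itself obtained, e.g., by the Cavalieri method) converts relative estimates into absolute ones, as done in the study.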

  17. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    Science.gov (United States)

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1-3 μm thick plastic sections (glycolmethacrylate/ methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability
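    The angle correction described above follows from basic trigonometry: a cut tilted by θ from the orthogonal plane inflates every apparent thickness by 1/cos θ, and the calibration foil of known thickness supplies cos θ directly. A sketch of that geometry (function and parameter names are illustrative, not from the paper):

    ```python
    import math

    def corrected_section_thickness(measured_section, measured_foil, true_foil):
        """Oblique-sectioning correction: cos(theta) = true_foil / measured_foil,
        so the factual section thickness is measured_section * cos(theta)."""
        cos_theta = true_foil / measured_foil
        if not 0.0 < cos_theta <= 1.0:
            raise ValueError("measured foil thickness must be >= its true thickness")
        return measured_section * cos_theta

    def section_plane_angle_deg(measured_foil, true_foil):
        """Factual section-plane angle, recovered from the same ratio."""
        return math.degrees(math.acos(true_foil / measured_foil))
    ```

    When the re-embedded sample happens to be cut exactly orthogonally, the foil measures its true thickness, cos θ = 1, and no correction is applied.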

  18. Quantification of rat retinal growth and vascular population changes after single and split doses of proton irradiation: translational study using stereology methods

    Science.gov (United States)

    Mao, Xiao W.; Archambeau, John O.; Kubinova, Lucie; Boyle, Soames; Petersen, Georgia; Grove, Roger; Nelson, G. A. (Principal Investigator)

    2003-01-01

    This study quantified architectural and population changes in the rat retinal vasculature after proton irradiation using stereology. A 100 MeV conformal proton beam delivered 8, 14, 20 and 28 Gy as single and split doses to the whole eye. The vascular networks were prepared from retinal digests. Stereological methods were used to obtain the area of the retina and unbiased estimates of microvessel/artery/vein endothelial, pericyte and smooth muscle populations, and vessel length. The retinal area increased progressively in the unirradiated, age-matched controls and in the retinas irradiated with 8 and 14 Gy, indicating uniform progressive retinal growth. No growth occurred after 20 and 28 Gy. Regression analysis of total endothelial cell number in all vessels (arteries, veins and capillaries) after irradiation documented a progressive time- and dose-dependent cell loss occurring over 15 to 24 months. The difference from controls was significant (P ...) ... populations after split doses. At 10 Gy, the rate of endothelial cell loss, a dose parameter used to characterize the time- and dose-dependent loss of the endothelial population, was doubled.

  19. Some new, simple and efficient stereological methods and their use in pathological research and diagnosis

    DEFF Research Database (Denmark)

    Gundersen, H J; Bendtsen, T F; Korbo, L

    1988-01-01

    Stereology is a set of simple and efficient methods for quantitation of three-dimensional microscopic structures which is specifically tuned to provide reliable data from sections. Within the last few years, a number of new methods have been developed which are of special interest to pathologists...... are invariably simple and easy....

  20. STEREOLOGICAL ESTIMATION OF ITO CELLS FROM RAT LIVER USING THE OPTICAL FRACTIONATOR - A PRELIMINARY REPORT

    Directory of Open Access Journals (Sweden)

    Ricardo Marcos

    2011-05-01

    In the last two decades, much light has been shed on hepatic fibrosis, and the activation/proliferation of Ito cells (IC) has emerged to play a central role. Therefore, it is essential to have solid quantitative data in non-pathological states; yet, such data are scarce and confined to "number per area" or semiquantitative information. Moreover, the supposed heterogeneous distribution of IC in the hepatic lobule was never analysed with design-based (unbiased) stereology. In the present study, the total number (N) of IC in rat liver was estimated for the first time, by combining immunocytochemistry with the optical fractionator. Quantification was extended to the hepatocytes, to disclose the IC index, an often-used ratio in hepatology. Systematic uniform random liver sections were obtained from male Wistar rats (n = 3) and immunostained against glial fibrillary acidic protein (GFAP), a known specific marker for hepatic IC. For the first time, these were marked against GFAP in thick (30 μm) paraffin sections. The estimated N of IC was 224 × 10⁶, with a coefficient of error of 0.04 or 0.06, depending on the particular equation used (based on the so-called "quadratic approximation"). The IC index was 91 IC/1000 hepatocytes. Concerning the lobular heterogeneity, it was shown that the liver harbours a larger total number of periportal IC and hepatocytes.
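    The optical fractionator estimate above follows the standard fractionator identity: the total number is the raw disector count scaled by the inverses of the section, area and thickness sampling fractions. A minimal sketch with made-up sampling fractions (not the study's values):

    ```python
    def optical_fractionator_total(q_minus, ssf, asf, tsf):
        """Optical fractionator: N = Q- * (1/ssf) * (1/asf) * (1/tsf).
        Unbiased regardless of cell size, shape or orientation, because
        every cell has a known probability of being sampled."""
        return q_minus / (ssf * asf * tsf)
    ```

    For example, counting 200 cells with ssf = 1/10, asf = 1/25 and tsf = 1/2 yields an estimate of 100,000 cells.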

  1. Introducing Stereology as a Tool to Assess the Severity of Psoriasis

    DEFF Research Database (Denmark)

    Kamp, Søren; Stenderup, Karin; Rosada, Cecilia

    2008-01-01

    The purpose of this study was to introduce stereology as a novel tool in assessing the severity of psoriasis. Psoriasis is a well described chronic inflammatory skin disease affecting approximately 2% of the Caucasian population. The severity of psoriasis has been assessed by a multitude... to histological specimens in order to obtain three-dimensional properties from two-dimensional tissue samples. The psoriasis xenograft model used in this trial is accepted as a leading animal model for psoriasis. Psoriatic skin from psoriatic patients was grafted onto severe combined immunodeficient (SCID) mice...

  2. Analysis and Design of Reinforced Concrete Structures With Spring Base Isolation

    International Nuclear Information System (INIS)

    Tun Myint Aung; Tin Tin Win, Nyan Myint Kyaw

    2008-06-01

    This study presents the analysis and design of a four-storey reinforced concrete building, located in seismic zone 4, and its base isolation, comparing the analysis results for the fixed-base and isolated conditions under multi-directional (horizontal and vertical) earthquake motions. First, static analysis of the fixed-base condition under unfactored gravity loads is used to design the helical springs. Second, response-spectrum analysis is applied for horizontal earthquake motion only, and time-history analysis for both horizontal and vertical earthquake motions. Finally, the analysis results (forces, displacements, drifts, accelerations and shears at various levels of the building) are compared. The static period of the fixed-base building is 0.4 s. Following the base-isolation concept, the isolated period is lengthened to 0.8 s, 1.0 s and 1.2 s for the design earthquake level. The comparison between the base-isolated (1.2 s) and fixed-base building shows that the displacements of the base-isolated building are larger than those of the fixed-base building, but other seismic responses, such as acceleration, are significantly reduced, and the base-isolated building also has the capacity to reduce member forces relative to the fixed-base building.
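    The period lengthening described above is governed by the single-degree-of-freedom relation T = 2π√(m/k): softening the support (smaller total spring stiffness k) lengthens the period. A hedged sketch of how the helical-spring stiffness might be sized for a target isolated period; this is an SDOF idealization with illustrative numbers, not the paper's design calculation:

    ```python
    import math

    def natural_period(mass, stiffness):
        """SDOF natural period: T = 2*pi*sqrt(m/k)."""
        return 2.0 * math.pi * math.sqrt(mass / stiffness)

    def required_stiffness(mass, target_period):
        """Invert T = 2*pi*sqrt(m/k) to size the total spring stiffness."""
        return mass * (2.0 * math.pi / target_period) ** 2
    ```

    Lengthening the period from 0.4 s (fixed base) to 1.2 s requires a support nine times softer, since k scales with 1/T².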

  3. No correlation between ultrasound placental grading at 31-34 weeks of gestation and a surrogate estimate of organ function at term obtained by stereological analysis.

    Science.gov (United States)

    Yin, T T; Loughna, P; Ong, S S; Padfield, J; Mayhew, T M

    2009-08-01

    We test the experimental hypothesis that early changes in the ultrasound appearance of the placenta reflect poor or reduced placental function. The sonographic (Grannum) grade of placental maturity was compared to placental function as expressed by the morphometric oxygen diffusive conductance of the villous membrane. Ultrasonography was used to assess the Grannum grade of 32 placentas at 31-34 weeks of gestation. Indications for the scans included a history of previous fetal abnormalities, previous fetal growth problems or suspicion of IUGR. Placentas were classified from grade 0 (most immature) to grade III (most mature). We did not exclude smokers or complicated pregnancies as we aimed to correlate the early appearance of mature placentas with placental function. After delivery, microscopical fields on formalin-fixed, trichrome-stained histological sections of each placenta were obtained by multistage systematic uniform random sampling. Using design-based stereological methods, the exchange surface areas of peripheral (terminal and intermediate) villi and their fetal capillaries and the arithmetic and harmonic mean thicknesses of the villous membrane (maternal surface of villous trophoblast to adluminal surface of vascular endothelium) were estimated. An index of the variability in thickness of this membrane, and an estimate of its oxygen diffusive conductance, were derived secondarily as were estimates of the mean diameters and total lengths of villi and fetal capillaries. Group comparisons were drawn using analysis of variance. We found no significant differences in placental volume or composition or in the dimensions or diffusive conductances of the villous membrane. Subsequent exclusion of smokers did not alter these main findings. Grannum grades at 31-34 weeks of gestation appear not to provide reliable predictors of the functional capacity of the term placenta as expressed by the surrogate measure, morphometric diffusive conductance.
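    The surrogate conductance used above rests on two standard morphometric ingredients: the harmonic mean thickness of the villous membrane (which weights thin, diffusion-dominant regions most heavily) and the exchange surface area. A hedged sketch of how such a conductance could be assembled; the proportionality D = K·S/τ_h and the Krogh coefficient K are assumptions for illustration, not values from the paper:

    ```python
    def harmonic_mean_thickness(intercepts):
        """Harmonic mean of measured membrane intercept lengths:
        n / sum(1/t_i); thin regions dominate the mean."""
        return len(intercepts) / sum(1.0 / t for t in intercepts)

    def diffusive_conductance(surface_area, tau_h, krogh_coefficient):
        """Morphometric conductance of a membrane: D = K * S / tau_h
        (K: a Krogh diffusion coefficient, assumed known for the tissue)."""
        return krogh_coefficient * surface_area / tau_h
    ```

    The harmonic mean is always at most the arithmetic mean, which is why the abstract reports both: their ratio is one index of the variability in membrane thickness.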

  4. Changes in total cell numbers of the basal ganglia in patients with multiple system atrophy - A stereological study

    DEFF Research Database (Denmark)

    Salvesen, Lisette; Ullerup, Birgitte H; Sunay, Fatma B

    2014-01-01

    Total numbers of neurons, oligodendrocytes, astrocytes, and microglia in the basal ganglia and red nucleus were estimated in brains from 11 patients with multiple system atrophy (MSA) and 11 age- and gender-matched control subjects with unbiased stereological methods. Compared to the control...

  5. A comparison of porosity analysis using 2D stereology estimates and 3D serial sectioning for additively manufactured Ti 6Al 2Sn 4Zr 2Mo alloy

    Energy Technology Data Exchange (ETDEWEB)

    Ganti, Satya R.; Velez, Michael A.; Geier, Brian A.; Hayes, Brian J.; Turner, Bryan J.; Jenkins, Elizabeth J. [UES Inc., Dayton, OH (United States)

    2017-02-15

    Porosity is a typical defect in additively manufactured (AM) parts. Such defects limit the properties and performance of AM parts, and therefore need to be characterized accurately. Current methods for characterization of defects and microstructure rely on classical stereological methods that extrapolate information from two-dimensional images. The automation of serial sectioning provides an opportunity to precisely and accurately quantify porosity in three dimensions in materials. In this work, we analyzed the porosity of an additively manufactured Ti 6Al 2Sn 4Zr 2Mo sample using Robo-Met.3D®, an automated serial sectioning system. Image processing for three-dimensional reconstruction of the serial-sectioned two-dimensional images was performed using open-source image analysis software (Fiji/ImageJ, Dream.3D, Paraview). The results from this 3D serial sectioning analysis were then compared to classical 2D stereological methods (Saltykov stereological theory). We found that for this dataset, the classical 2D methods underestimated the porosity size and distributions of the larger pores; a critical attribute for fatigue behavior of the AM part. The results suggest that acquiring experimental data with equipment such as Robo-Met.3D® to measure the number and size of particles such as pores in a volume, irrespective of knowing their shape, is a better choice.

  6. First and second order stereology of hyaline cartilage: Application on mice femoral cartilage.

    Science.gov (United States)

    Noorafshan, Ali; Niazi, Behnam; Mohamadpour, Masoomeh; Hoseini, Leila; Hoseini, Najmeh; Owji, Ali Akbar; Rafati, Ali; Sadeghi, Yasaman; Karbalay-Doust, Saied

    2016-11-01

    Stereological techniques could be considered in research on cartilage to obtain quantitative data. The present study aimed to explain the application of first- and second-order stereological methods to the articular cartilage of mice, and the methods were applied to mice exposed to cadmium (Cd). The distal femoral articular cartilage of BALB/c mice (control and Cd-treated) was removed. Then, the volume and surface area of the cartilage and the number of chondrocytes were estimated using the Cavalieri and optical disector techniques on isotropic uniform random sections. The pair-correlation function [g(r)] and cross-correlation function were calculated to express the spatial arrangement of chondrocytes-chondrocytes and chondrocytes-matrix (chondrocyte clustering/dispersing), respectively. The mean ± standard deviation of the cartilage volume, surface area, and thickness were 1.4 ± 0.1 mm³, 26.2 ± 5.4 mm², and 52.8 ± 6.7 μm, respectively. Besides, the mean number of chondrocytes was 680 ± 200 (×10³). The cartilage volume, cartilage surface area, and number of chondrocytes were respectively reduced by 25%, 27%, and 27% in the Cd-treated mice in comparison to the control animals (p ...) ... cartilage components carried potential advantages for investigating the cartilage in different joint conditions. Chondrocyte clustering/dispersing and cellularity can be evaluated in cartilage assessment in normal or abnormal situations. Copyright © 2016 Elsevier GmbH. All rights reserved.

  7. Intralesional and metastatic heterogeneity in malignant melanomas demonstrated by stereologic estimates of nuclear volume

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Erlandsen, M

    1990-01-01

    Regional variability of nuclear 3-dimensional size can be estimated objectively using point-sampled intercepts obtained from different, defined zones within individual neoplasms. In the present study, stereologic estimates of the volume-weighted mean nuclear volume, nuclear vv, within peripheral...... melanomas showed large interindividual variation. This finding emphasizes that unbiased estimates of nuclear vv are robust to regional heterogeneity of nuclear volume and thus suitable for purposes of objective, quantitative malignancy grading of melanomas....

  8. Histological versus stereological methods applied at spermatogonia during normal human development

    DEFF Research Database (Denmark)

    Cortes, D

    1990-01-01

    The number of spermatogonia per tubular transverse section (S/T) and the percentage of seminiferous tubules containing spermatogonia (the fertility index, FI) were measured in 40 pairs of normal autopsy testes aged from 28 weeks of gestation to 40 years. S/T and FI showed similar changes during the whole...... period, and were minimal between 1 and 4 years. The number of spermatogonia per testis (S/testis) and the number of spermatogonia per cm³ of testis tissue (S/cm³) were estimated by stereological methods in the same testes. S/T and FI, respectively, were significantly correlated both to S/testis and S/cm³. So...

  9. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    Science.gov (United States)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. These processes of analysis and calculation take a lot of time, and the results can be unreliable. Therefore, VS2005 and the ADK were used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software can realize the mechanical analysis and parameterized design of various types of beams and output a design report in HTML format. The efficiency and reliability of beam design are thereby improved.

  10. A novel approach to non-biased systematic random sampling: a stereologic estimate of Purkinje cells in the human cerebellum.

    Science.gov (United States)

    Agashiwala, Rajiv M; Louis, Elan D; Hof, Patrick R; Perl, Daniel P

    2008-10-21

    Non-biased systematic sampling using the principles of stereology provides accurate quantitative estimates of objects within neuroanatomic structures. However, the basic principles of stereology are not optimally suited for counting objects that selectively exist within a limited but complex and convoluted portion of the sample, such as occurs when counting cerebellar Purkinje cells. In an effort to quantify Purkinje cells in association with certain neurodegenerative disorders, we developed a new method for stereologic sampling of the cerebellar cortex, involving calculating the volume of the cerebellar tissues, identifying and isolating the Purkinje cell layer and using this information to extrapolate non-biased systematic sampling data to estimate the total number of Purkinje cells in the tissues. Using this approach, we counted Purkinje cells in the right cerebella of four human male control specimens, aged 41, 67, 70 and 84 years, and estimated the total Purkinje cell number for the four entire cerebella to be 27.03, 19.74, 20.44 and 22.03 million cells, respectively. The precision of the method is seen when comparing the density of the cells within the tissue: 266,274, 173,166, 167,603 and 183,575 cells/cm³, respectively. Prior literature documents Purkinje cell counts ranging from 14.8 to 30.5 million cells. These data demonstrate the accuracy of our approach. Our novel approach, which offers an improvement over previous methodologies, is of value for quantitative work of this nature. This approach could be applied to morphometric studies of other similarly complex tissues as well.
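    The extrapolation step described above (sampled counts → density → total) can be sketched in a few lines. All numbers here are invented for illustration; the function name and probe geometry are assumptions, not the authors' code.

```python
def total_from_density(sampled_counts, probe_volume_cm3, layer_volume_cm3):
    """Scale counts from equal-volume probes to a whole region.

    density = total cells counted / total probe volume;
    total   = density * volume of the (isolated) cell layer.
    """
    density = sum(sampled_counts) / (len(sampled_counts) * probe_volume_cm3)
    return density, density * layer_volume_cm3

counts = [11, 9, 13, 10, 12]  # hypothetical cells counted in five probes
density, total = total_from_density(counts, probe_volume_cm3=1e-4,
                                    layer_volume_cm3=120.0)
print(f"{density:.0f} cells/cm3 -> {total:.2e} cells")
```

The invented figures land in the same order of magnitude as the paper's densities (10⁵ cells/cm³) and totals (10⁷ cells), which is the point of the sanity check.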

  11. Effect of praziquantel administration on hepatic stereology of mice infected with Schistosoma mansoni and fed a low-protein diet

    Directory of Open Access Journals (Sweden)

    L.A. Barros

    2009-09-01

    Full Text Available A study was undertaken to investigate the effect of administering praziquantel (PZQ), focusing on the liver stereological findings of malnourished mice infected with Schistosoma mansoni. Thirty female Swiss Webster mice (age: 21 days; weight: 8-14 g) were fed either a low-protein diet (8% protein) or standard chow (22% protein) for 15 days. Five mice in each group were infected with 50 cercariae each of the BH strain (Brazil). PZQ therapy (80 mg/kg body weight per day) was started on the 50th day of infection and consisted of daily administration for 5 days. Volume density (hepatocytes, sinusoids and hepatic fibrosis) was determined by stereology using a light microscope. Body weight gain and total serum albumin levels were always lower in undernourished mice. Our stereological study demonstrated that treatment increased the volume density of hepatocytes in mice fed standard chow (47.56% in the treated group vs. 12.06% in controls) and low-protein chow (30.98% vs. 21.44%), as well as that of hepatic sinusoids (standard chow: 12.52% vs. 9.06%; low-protein chow: 14.42% vs. 8.46%), while hepatic fibrosis was reduced (standard chow: 39.92% vs. 78.88%; low-protein chow: 54.60% vs. 70.10%). On the other hand, mice fed low-protein chow showed decreased volume density of hepatocytes and hepatic fibrosis. In conclusion, our findings indicate that treatment with PZQ ameliorates hepatic schistosomiasis pathology even in mice fed a low-protein diet.

  12. Prediction of prognosis in patients with epidural hematoma by a new stereological method

    International Nuclear Information System (INIS)

    Kalkan, E.; Cander, B.; Gul, M.; Girisgin, S.; Karabagli, H.; Sahin, B.

    2007-01-01

    Epidural hematoma (EH) is a serious clinical event observed in 2% of head trauma patients. Studies regarding the effects of epidural hematoma volume (EHV) on prognosis are not sufficient. In this study, we applied the volume fraction approach of the stereological method to estimate the hematoma to brain volume fraction (HBVF), and investigated the relation between the HBVF and prognosis. Fifty-nine EH patients (46 male and 13 female subjects, with average age of 21 years) admitted to the emergency clinic were included. The HBVF was estimated on the printed films of cranial computed tomography scans. For this purpose, common point counting grids were superimposed over the scan frames. According to the clinical results, patients were divided into three groups as complete recovery (43), disability (8) and exitus (8). The HBVF was compared with the clinical results. HBVF was determined as 4.6% in the patients with recovery, 8.1% in disability, and 7.6% in exitus patients. The HBVF values were lowest in recovery patients, and the difference between the recovery and the other two groups was statistically significant (p=0.007). However, there was no statistically significant difference in HBVF between disability and exitus patients (p>0.05). In conclusion, the HBVF can be an important tool to determine prognosis, and it can be measured using the volume fraction approach of stereological methods as developed in the present study. (author)
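    The volume-fraction approach described above reduces to a ratio of point counts: a grid is overlaid on each CT frame, and HBVF is the number of grid points hitting hematoma divided by the number hitting brain, summed over frames. A minimal sketch with hypothetical counts (chosen to reproduce the 4.6% recovery-group figure):

```python
def volume_fraction(points_on_phase, points_on_reference):
    """Point-counting estimator of a volume fraction: Vv = P_phase / P_ref."""
    if points_on_reference == 0:
        raise ValueError("no reference points counted")
    return points_on_phase / points_on_reference

# Hypothetical grid-point counts per CT frame.
hematoma_hits = [5, 9, 6, 3]
brain_hits = [120, 131, 128, 121]

hbvf = volume_fraction(sum(hematoma_hits), sum(brain_hits))
print(f"HBVF = {hbvf:.1%}")
```

Because both counts come from the same grid, section spacing and magnification cancel out of the ratio, which is what makes the method practical on printed scan films.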

  13. Stereologic, histopathologic, flow cytometric, and clinical parameters in the prognostic evaluation of 74 patients with intraoral squamous cell carcinomas

    DEFF Research Database (Denmark)

    Bundgaard, T; Sørensen, Flemming Brandt; Gaihede, M

    1992-01-01

    BACKGROUND AND METHODS: A consecutive series of all 78 incident cases of intraoral squamous cell carcinoma occurring during a 2-year period in a population of 1.4 million inhabitants were evaluated by histologic score (the modified classification of Jacobsson et al.), flow cytometry, stereology, ...

  14. Proliferation assessment in breast carcinomas using digital image analysis based on virtual Ki67/cytokeratin double staining.

    Science.gov (United States)

    Røge, Rasmus; Riber-Hansen, Rikke; Nielsen, Søren; Vyberg, Mogens

    2016-07-01

    Manual estimation of Ki67 Proliferation Index (PI) in breast carcinoma classification is labor intensive and prone to intra- and interobserver variation. Standard Digital Image Analysis (DIA) has limitations due to issues with tumor cell identification. Recently, a computer algorithm, DIA based on Virtual Double Staining (VDS), segmenting Ki67-positive and -negative tumor cells using digitally fused parallel cytokeratin (CK) and Ki67-stained slides has been introduced. In this study, we compare VDS with manual stereological counting of Ki67-positive and -negative cells and examine the impact of the physical distance of the parallel slides on the alignment of slides. TMAs, containing 140 cores of consecutively obtained breast carcinomas, were stained for CK and Ki67 using optimized staining protocols. By means of stereological principles, Ki67-positive and -negative cell profiles were counted in sampled areas and used for the estimation of PIs of the whole tissue core. The VDS principle was applied to both the same sampled areas and the whole tissue core. Additionally, five neighboring slides were stained for CK in order to examine the alignment algorithm. Correlation between manual counting and VDS in both sampled areas and whole core was almost perfect (correlation coefficients above 0.97). Bland-Altman plots did not reveal any skewness in any data ranges. There was a good agreement in alignment (>85 %) in neighboring slides, whereas agreement decreased in non-neighboring slides. VDS gave similar results compared with manual counting using stereological principles. Introduction of this method in clinical and research practice may improve accuracy and reproducibility of Ki67 PI.
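    Whether obtained by manual stereological counting or by VDS, the endpoint above is the same quantity: the Ki67 Proliferation Index, the percentage of tumour cells that are Ki67-positive. A minimal sketch with hypothetical counts:

```python
def proliferation_index(ki67_positive, ki67_negative):
    """Ki67 PI as a percentage of all counted tumour cells."""
    total = ki67_positive + ki67_negative
    if total == 0:
        raise ValueError("no tumour cells counted")
    return 100.0 * ki67_positive / total

# Hypothetical cell-profile counts from one sampled area.
pi_sampled = proliferation_index(ki67_positive=42, ki67_negative=158)
print(f"Ki67 PI = {pi_sampled:.1f}%")  # 21.0%
```

The CK channel's role in VDS is exactly to fix the denominator: by segmenting all tumour cells first, both positive and negative cells are counted over the same population, which is what lets the two methods agree so closely.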

  15. Digital image analysis

    DEFF Research Database (Denmark)

    Riber-Hansen, Rikke; Vainer, Ben; Steiniche, Torben

    2012-01-01

    Digital image analysis (DIA) is increasingly implemented in histopathological research to facilitate truly quantitative measurements, decrease inter-observer variation and reduce hands-on time. Originally, efforts were made to enable DIA to reproduce manually obtained results on histological slides...... reproducibility, application of stereology-based quantitative measurements, time consumption, optimization of histological slides, regions of interest selection and recent developments in staining and imaging techniques....

  16. Model-based Computer Aided Framework for Design of Process Monitoring and Analysis Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    In the manufacturing industry, for example, the pharmaceutical industry, a thorough understanding of the process is necessary in addition to a properly designed monitoring and analysis system (PAT system) to consistently obtain the desired end-product properties. A model-based computer-aided framework, including the methods and tools through which the design of monitoring and analysis systems for product quality control can be generated, analyzed and/or validated, has been developed. Two important supporting tools developed as part of the framework are a knowledge base and a model library. The knowledge base provides the necessary information/data during the design of the PAT system, while the model library generates additional or missing data needed for design and analysis. Optimization of the PAT system design is achieved in terms of product data analysis time and/or cost of monitoring equipment...

  17. Researching on knowledge architecture of design by analysis based on ASME code

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2003-01-01

    The quality of a knowledge-based system's knowledge architecture is one of the decisive factors in the system's validity and rationality. For designing the ASME code knowledge-based system, this paper presents a knowledge acquisition method that extracts knowledge through document analysis supplemented by consultation of domain experts. The paper then describes the knowledge architecture of design by analysis based on the related rules in the ASME code. The knowledge of the architecture is divided into two categories: empirical knowledge and ASME code knowledge. As the basis of the knowledge architecture, a general procedural process of design by analysis that meets the engineering design requirements and designers' conventional mode of work is generalized and explained in detail in the paper. For the sake of improving the inference efficiency and concurrent computation of the KBS, a kind of knowledge Petri net (KPN) model is proposed and adopted to express the knowledge architecture. Furthermore, for the validation and verification of the empirical rules, five knowledge validation and verification theorems are given in the paper. Moreover, the research results are applicable to designing the knowledge architecture of ASME codes or other engineering standards. (author)

  18. Analysis of images of acute human and animal leukaemia

    International Nuclear Information System (INIS)

    Feinermann, Emmanuel

    1981-01-01

    This research thesis first proposes a review of the development of stereology: historical background, basic principles. It discusses the choices regarding instrumentation: Coulter counter (principle and theory), quantitative analysis of particles, image analyser (optical microscope, epidiascope, scanners, detection, electronic pencil, computers, programming and data processing system), and stereological parameters. The author then reports the stereological study of acute human leukaemia: definition, classification, determination of spherical particle size distribution, lymphoblast size distributions. He reports the comparative study of rat L 5222 leukaemia and Brown Norway rat acute myelocytic leukaemia, and discusses their relationship with acute human leukaemia

  19. Effect of Sodium Cyclamate on the Rat Fetal Exocrine Pancreas: a Karyometric and Stereological Study

    OpenAIRE

    MARTINS, Alex Tadeu; SANTOS, Fabiano de Sant'Ana dos; SCANNAVINO, Fabio Luiz Ferreira; PIRES, Juliana Rico; ZUZA, Elizangela Partata; PADOVANI JUNIOR, Joao Armando; AZOUBEL, Reinaldo; MATEO, Miguel Angel Sala Di; LOPES, Ruberval Armando

    2010-01-01

    Cyclamate, a sweetener derived from N-cyclohexyl-sulfamic acid, is widely used as a non-caloric artificial sweetener in foods and beverages as well as in the pharmaceutical industry. The objective of this study was to evaluate karyometric and stereological alterations in the rat fetal pancreas resulting from the intraperitoneal administration of sodium cyclamate. The exocrine pancreas of ten rat fetuses was evaluated, five treated and five controls chosen at random, i...

  20. Morphometric changes in the spinal cord during prenatal life: a stereological study in sheep.

    Science.gov (United States)

    Sadeghinezhad, Javad; Zadsar, Narges; Hasanzadeh, Beal

    2018-03-01

    This study describes the volumetric changes in the spinal cord during prenatal life in sheep using quantitative stereological methods. Twenty healthy sheep fetuses were included in the present study, divided into four groups representing 9-11, 12-14, 15-17, and 18-20 weeks of gestation. In each group, the spinal cord was dissected out and sampled according to the unbiased systematic random sampling method then used for stereological estimations. The total volume of spinal cord, volume of gray matter (GM), volume of white matter (WM), ratio of GM volume to WM volume, and volume of central canal (CC) were estimated in the whole spinal cord and its various regions using Cavalieri's principle. The total volume of the spinal cord increased 8 times from week 9 to week 20. The cervical region showed the greatest (9.7 times) and the sacral region the least (6.3 times) volumetric change. The CC volume of the whole spinal cord increased 5.8 times from week 9 to week 20. The cervical region developed faster (8.2 times) and the thoracic region slower (4.4 times) than the total spinal cord. During development, the volume ratio of GM to WM decreased from lower toward upper regions. The greatest volume changes occurred mostly in weeks 9-11 and 12-14. The cervical region showed the greatest volume changes in comparison with other regions of the spinal cord.

  1. Stereological estimation of nuclear volume in benign and malignant melanocytic lesions of the skin. Inter- and intraobserver variability of malignancy grading

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Ottosen, P D

    1991-01-01

    The volume-weighted, mean nuclear volume (nuclear vv) may be estimated without any assumptions regarding nuclear shape using modern stereological techniques. As a part of an investigation concerning the prospects of nuclear vv for classification and malignancy grading of cutaneous melanocytic tum...

  2. Stereological observations of platelet-reinforced mullite- and zirconia-matrix composites

    International Nuclear Information System (INIS)

    Cherian, I.K.; Kriven, W.M.; Lehigh, M.D.; Nettleship, I.

    1996-01-01

    Recently, the effect of solid inclusions on the sintering of ceramic powders has been explained in terms of a back-stress that opposes densification. Several analyses have been proposed to describe this problem. However, little quantitative information exists concerning the effect of reinforcement on microstructural evolution. This study compares the microstructural development of zirconia and mullite matrices in the presence of alumina platelets. The effect of platelet loading on density is similar for both composites. Quantitative stereological examinations reveal that the average grain size and pore size are finer for the zirconia-matrix composite. The platelet loading does not have any noticeable effect on the average grain size of the matrix in either composite. However, the average pore size increases as the volume fraction of platelets increases for both materials. Contiguity measurements have detected some aggregation of platelets in the zirconia-matrix composite

  3. Stereological brain volume changes in post-weaned socially isolated rats

    DEFF Research Database (Denmark)

    Fabricius, Katrine; Helboe, Lone; Steiniger-Brach, Björn

    2010-01-01

    ...... have evaluated the neuroanatomical changes in this animal model in comparison to changes seen in schizophrenia. In this study, we applied stereological volume estimates to evaluate the total brain, the ventricular system, and the pyramidal and granular cell layers of the hippocampus in male and female Lister Hooded rats isolated from postnatal day 25 for 15 weeks. We observed the expected gender differences in total brain volume, with males having larger brains than females. Further, we found that isolated males had significantly smaller brains than group-housed controls and larger lateral ventricles than controls. However, this was not seen in female rats. Isolated males had a significantly smaller hippocampus, dentate gyrus and CA2/3, whereas isolated females had a significantly smaller CA1 compared to controls. Thus, our results indicate that long-term isolation of male rats leads to neuroanatomical......

  4. Urban Planning and Management Information Systems Analysis and Design Based on GIS

    Science.gov (United States)

    Xin, Wang

    Based on an analysis of the inadequacies of existing systems, and after detailed investigation and research, an urban planning and management information system was designed as a three-tier system running over a LAN with a C/S (client/server) architecture. The system's functions were designed in accordance with the requirements of the architecture, together with the functional relationships between the modules. The relevant interfaces were analyzed and designed, and data storage solutions were proposed. The design provides a viable construction program for planning information systems in small and medium-sized cities.

  5. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    Science.gov (United States)

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  6. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis and graphic display relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low-cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  7. Non-Binary Protograph-Based LDPC Codes: Analysis,Enumerators and Designs

    OpenAIRE

    Sun, Yizeng

    2013-01-01

    Non-binary LDPC codes can outperform binary LDPC codes under the sum-product algorithm, at the cost of higher computational complexity. Non-binary LDPC codes based on protographs have the advantage of a simple hardware architecture. In the first part of this thesis, we use EXIT chart analysis to compute the thresholds of different protographs over GF(q). Based on the threshold computation, some non-binary protograph-based LDPC codes are designed and their frame error rates are compared with binary LDPC codes. ...

  8. Stereological estimates of nuclear volume in normal germ cells and carcinoma in situ of the human testis

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Müller, J

    1990-01-01

    Carcinoma in situ of the testis may appear many years prior to the development of an invasive tumour. Using point-sampled intercepts, base-line data concerning unbiased stereological estimates of the volume-weighted mean nuclear volume (nuclear vV) were obtained in 50 retrospective serial...... testicular biopsies from 10 patients with carcinoma in situ. All but two patients eventually developed an invasive growth. Testicular biopsies from 10 normal adult individuals and five prepubertal boys were included as controls. Nuclear vV in testicular carcinoma in situ was significantly larger than...... that of morphologically normal spermatogonia (2P = 1.0 x 10(-19)), with only minor overlap. Normal spermatogonia from controls had, on average, smaller nuclear vV than morphologically normal spermatogonia in biopsies with ipsi- or contra-lateral carcinoma in situ (2P = 5.2 x 10(-3)). No difference in nuclear vV was found...

  9. Demonstration of risk-based decision analysis in remedial alternative selection and design

    International Nuclear Information System (INIS)

    Evans, E.K.; Duffield, G.M.; Massmann, J.W.; Freeze, R.A.; Stephenson, D.E.

    1993-01-01

    This study demonstrates the use of risk-based decision analysis (Massmann and Freeze 1987a, 1987b) in the selection and design of an engineering alternative for groundwater remediation at a waste site at the Savannah River Site, a US Department of Energy facility in South Carolina. The investigation focuses on the remediation and closure of the H-Area Seepage Basins, an inactive disposal site that formerly received effluent water from a nearby production facility. A previous study by Duffield et al. (1992), which used risk-based decision analysis to screen a number of ground-water remediation alternatives under consideration for this site, indicated that the most attractive remedial option is ground-water extraction by wells coupled with surface water discharge of treated effluent. The aim of the present study is to demonstrate the iterative use of risk-based decision analysis throughout the design of a particular remedial alternative. In this study, we consider the interaction between two episodes of aquifer testing over a 6-year period and the refinement of a remedial extraction well system design. Using a three-dimensional ground-water flow model, this study employs (1) geostatistics and Monte Carlo techniques to simulate hydraulic conductivity as a stochastic process and (2) Bayesian updating and conditional simulation to investigate multiple phases of aquifer testing. In our evaluation of a remedial alternative, we compute probabilistic costs associated with the failure of an alternative to completely capture a simulated contaminant plume. The results of this study demonstrate the utility of risk-based decision analysis as a tool for improving the design of a remedial alternative through the course of phased data collection at a remedial site
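    The objective behind this style of risk-based decision analysis can be sketched compactly: each remedial design is scored by its deterministic cost plus the probability-weighted cost of failure (here, incomplete plume capture), with the failure probability estimated over Monte Carlo realizations of hydraulic conductivity. The function and all dollar figures below are invented for illustration; they are not taken from the study.

```python
def expected_cost(capital_cost, failure_cost, captured_flags):
    """Risk-based objective: deterministic cost + P(failure) * failure cost.

    captured_flags: one boolean per Monte Carlo realization, True when the
    simulated extraction system fully captured the contaminant plume.
    """
    p_fail = 1.0 - sum(captured_flags) / len(captured_flags)
    return capital_cost + p_fail * failure_cost

# Design A: fewer wells, cheaper, but fails to capture the plume in 3/10 runs.
# Design B: more wells, dearer, but fails in only 1/20 runs.
runs_a = [True] * 7 + [False] * 3
runs_b = [True] * 19 + [False] * 1

cost_a = expected_cost(2.0e6, 10.0e6, runs_a)
cost_b = expected_cost(3.0e6, 10.0e6, runs_b)
print(f"A: ${cost_a:.2e}  B: ${cost_b:.2e}")
```

The iterative use described in the study corresponds to re-running this comparison as aquifer testing (via Bayesian updating and conditional simulation) narrows the conductivity uncertainty and shifts the estimated failure probabilities.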

  10. The daylighting dashboard - A simulation-based design analysis for daylit spaces

    Energy Technology Data Exchange (ETDEWEB)

    Reinhart, Christoph F. [Harvard University, Graduate School of Design, 48 Quincy Street, Cambridge, MA 02138 (United States); Wienold, Jan [Fraunhofer Institute for Solar Energy Systems, Heidenhofstrasse 2, 79110 Freiburg (Germany)

    2011-02-15

    This paper presents a vision of how state-of-the-art computer-based analysis techniques can be effectively used during the design of daylit spaces. Following a review of recent advances in dynamic daylight computation capabilities, climate-based daylighting metrics, occupant behavior and glare analysis, a fully integrated design analysis method is introduced that simultaneously considers annual daylight availability, visual comfort and energy use: Annual daylight glare probability profiles are combined with an occupant behavior model in order to determine annual shading profiles and visual comfort conditions throughout a space. The shading profiles are then used to calculate daylight autonomy plots, energy loads, operational energy costs and greenhouse gas emissions. The paper then shows how simulation results for a sidelit space can be visually presented to simulation non-experts using the concept of a daylighting dashboard. The paper ends with a discussion of how the daylighting dashboard could be practically implemented using technologies that are available today. (author)

  11. Stereological measures of trabecular bone structure: comparison of 3D micro computed tomography with 2D histological sections in human proximal tibial bone biopsies

    DEFF Research Database (Denmark)

    Thomsen, Jesper Skovhus; Laib, A.; Koller, B.

    2005-01-01

    Stereology applied on histological sections is the 'gold standard' for obtaining quantitative information on cancellous bone structure. Recent advances in micro computed tomography (microCT) have made it possible to acquire three-dimensional (3D) data non-destructively. However, before the 3D...... methods can be used as a substitute for the current 'gold standard' they have to be verified against the existing standard. The aim of this study was to compare bone structural measures obtained from 3D microCT data sets with those obtained by stereology performed on conventional histological sections...... tibial metaphysis. The biopsies were embedded in methylmetacrylate before microCT scanning in a Scanco microCT 40 scanner at a resolution of 20 x 20 x 20 microm3, and the 3D data sets were analysed with a computer program. After microCT scanning, 16 sections were cut from the central 2 mm of each biopsy...

  12. Total numbers of neurons and glial cells in cortex and basal ganglia of aged brains with Down syndrome--a stereological study.

    Science.gov (United States)

    Karlsen, Anna Schou; Pakkenberg, Bente

    2011-11-01

    The total numbers of neurons and glial cells in the neocortex and basal ganglia in adults with Down syndrome (DS) were estimated with design-based stereological methods, providing quantitative data on brains affected by delayed development and accelerated aging. Cell numbers, volume of regions, and densities of neurons and glial cell subtypes were estimated in brains from 4 female DS subjects (mean age 66 years) and 6 female controls (mean age 70 years). The DS subjects were estimated to have about 40% fewer neocortical neurons in total (11.1 × 10(9) vs. 17.8 × 10(9), 2p ≤ 0.001) and almost 30% fewer neocortical glial cells with no overlap to controls (12.8 × 10(9) vs. 18.2 × 10(9), 2p = 0.004). In contrast, the total number of neurons in the basal ganglia was the same in the 2 groups, whereas the number of oligodendrocytes in the basal ganglia was reduced by almost 50% in DS (405 × 10(6) vs. 816 × 10(6), 2p = 0.01). We conclude that trisomy 21 affects cortical structures more than central gray matter emphasizing the differential impairment of brain development. Despite concomitant Alzheimer-like pathology, the neurodegenerative outcome in a DS brain deviates from common Alzheimer disease.

  13. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture, however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $ 120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi
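    The linear (D-criterion) screening favored above is cheap precisely because it reduces to a determinant: for each candidate tracer mixture, take the Jacobian J of the measured labeling with respect to the free fluxes and rank mixtures by det(JᵀJ), since a larger determinant means a smaller parameter confidence region. A minimal sketch; the two Jacobians are invented 3-measurement, 2-flux examples, not data from the study.

```python
import numpy as np

def d_criterion(jacobian):
    """D-criterion: determinant of the Fisher information matrix J^T J."""
    j = np.asarray(jacobian, dtype=float)
    return np.linalg.det(j.T @ j)

# Hypothetical sensitivities of labeling measurements w.r.t. two free fluxes.
candidates = {
    "1,2-13C2 glucose + U-13C glucose": [[1.0, 0.0], [0.0, 0.1], [0.2, 0.3]],
    "U-13C glucose only":               [[0.5, 0.5], [0.5, 0.4], [0.1, 0.1]],
}

best = max(candidates, key=lambda name: d_criterion(candidates[name]))
print(best)
```

Because each evaluation is just a small determinant, thousands of candidate mixtures can be screened this way before any non-linear (S-criterion) refinement or cost trade-off is attempted.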

  14. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today, and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly its risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dots and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  15. Stereological quantification of lymphocytes in skin biopsies from atopic dermatitis patients

    DEFF Research Database (Denmark)

    Ellingsen, A R; Sørensen, F B; Larsen, Jytte Overgaard

    2001-01-01

    Atopic dermatitis (AD) is histologically characterized by lymphocytic infiltration of the skin and quantitative assessment is required. This study introduces stereological techniques to quantify the number of lymphocytes in skin biopsies. Four-millimetre punch biopsies were taken from skin with active eczema in 8 adults with AD and from clinically normal skin from 4 of the patients. Five persons without allergy or skin disease served as controls. The mean number of lymphocytes in 4-mm skin biopsies was 469,000 and 124,000 in active eczema and in clinically normal skin, respectively. Compared with controls, the number of lymphocytes in biopsies increased by a factor of 6.8 in active eczema and a factor of 1.8 in clinically normal skin. If 20% of skin is affected by eczema the total number of lymphocytes located in the affected skin can be estimated to 1.27 x 10(10). A patient with clinically...
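As a rough check on the record's extrapolation, the 1.27 x 10(10) figure can be reproduced by assuming an adult skin surface of about 1.7 m² and a 4-mm punch biopsy area of π·(2 mm)²; both assumptions are ours, not stated in the record.

```python
import math

biopsy_area_cm2 = math.pi * 0.2**2          # 4-mm punch: radius 2 mm = 0.2 cm
lymphocytes_per_biopsy = 469_000            # active eczema, from the study
density_per_cm2 = lymphocytes_per_biopsy / biopsy_area_cm2

total_skin_cm2 = 1.7 * 10_000               # assumed adult skin area ~1.7 m^2
affected_cm2 = 0.20 * total_skin_cm2        # 20% of skin affected by eczema

total_in_affected = density_per_cm2 * affected_cm2
print(f"{total_in_affected:.3g}")           # ~1.27e10, matching the abstract
```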

  16. Stereological estimation of nuclear volume and other quantitative histopathological parameters in the prognostic evaluation of supraglottic laryngeal squamous cell carcinoma

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bennedbaek, O; Pilgaard, J

    1989-01-01

    The aim of this study was to investigate various approaches to the grading of malignancy in pre-treatment biopsies from patients with supraglottic laryngeal squamous cell carcinoma. The prospects of objective malignancy grading based on stereological estimation of the volume-weighted mean nuclear volume, nuclear Vv, and nuclear volume fraction, Vv(nuc/tis), along with morphometrical 2-dimensional estimation of nuclear density index, NI, and mitotic activity index, MI, were investigated and compared with the current morphological, multifactorial grading system. The reproducibility among two observers of the latter was poor in the material which consisted of 35 biopsy specimens. Unbiased estimates of nuclear Vv were on the average 385 microns3 (CV = 0.44), with more than 90% of the associated variance attributable to differences in nuclear Vv among individual lesions. Nuclear Vv was positively...

  17. Stereological estimation of ovarian volume and number of follicles in low dose of Vitex agnus castus treated mice

    OpenAIRE

    HAMIDIAN, Gholamreza; YAHYAVI, Fariba

    2014-01-01

    Vitex agnus castus (VAC) has been proven to have a wide range of biological activities. It is commonly used in the treatment of menstrual disorders resulting from corpus luteum deficiency, including premenstrual symptoms and spasmodic dysmenorrheal, for certain menopausal conditions, and for insufficient lactation. The aim of this study was to investigate the effects of low dose of VAC essential oil on ovarian volume and oocyte number in mice by stereological technique. In this study 10 young...

  18. Design analysis and microprocessor based control of a nuclear reactor

    International Nuclear Information System (INIS)

    Sabbakh, N.J.

    1988-01-01

    The object of this thesis is to design and test a microprocessor-based controller for a simulated nuclear reactor system. A mathematical model describing the dynamics of a typical nuclear reactor, using a one-group delayed-neutron approximation with temperature feedback, was chosen. A digital computer program has been developed for the design and analysis of the simulated model based on the concept of state-variable feedback, in order to meet a desired system response with a maximum overshoot of 3.4% and a settling time of 4 sec. The state-variable feedback coefficients are designed for the continuous system; an approximation is then used to obtain the state-variable feedback vector for the discrete system. System control was implemented using Direct Digital Control (DDC) of the simulated nuclear reactor model through a control algorithm executed on a microprocessor-based system. The controller performance was satisfactorily tested by exciting the reactor system with a transient reactivity disturbance and with a step change in power demand. Direct digital control, when implemented on a microprocessor, adds versatility and flexibility to the system design, with the added advantage of possible use of optimal control algorithms. 6 tabs.; 30 figs.; 46 refs.; 6 apps
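A minimal sketch of the discrete state-variable feedback loop described above, using an invented second-order plant rather than the thesis' reactor model (all matrices and gains are hypothetical):

```python
import numpy as np

# Illustrative discrete-time state-feedback loop (not the thesis' reactor
# model): x[k+1] = Ad x[k] + Bd u[k], with DDC law u[k] = -K x[k] + Nr r.
Ad = np.array([[1.0, 0.1],
               [0.0, 0.9]])
Bd = np.array([[0.0],
               [0.1]])
K = np.array([[4.0, 6.0]])   # hypothetical state-feedback gains

Acl = Ad - Bd @ K            # closed-loop dynamics (eigenvalues inside unit circle)
# Reference gain Nr chosen for unity DC gain of the closed loop (hypothetical)
Nr = 1.0 / (np.array([[1.0, 0.0]]) @ np.linalg.inv(np.eye(2) - Acl) @ Bd)[0, 0]

x = np.zeros((2, 1))
r = 1.0                      # step change in power demand
for _ in range(200):
    u = -K @ x + Nr * r
    x = Ad @ x + Bd @ u

print(round(float(x[0, 0]), 3))  # output settles at the demanded level
```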

  19. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...

  20. A characteristic ventricular shape in myelomeningocele-associated hydrocephalus? A CT stereology study

    International Nuclear Information System (INIS)

    Roost, D. van; Solymosi, L.; Funke, K.

    1995-01-01

    We measured the volume of the supratentorial ventricles in 39 consecutive children with myelomeningocele (MMC) and associated hydrocephalus, using a stereological method based on the Cavalieri theorem of systematic sampling. We distinguished the following groups: newborns before and after cerebrospinal fluid shunting (14), a somewhat larger group of newborns with an untreated MMC-associated hydrocephalus (25) and a group of shunted children at a mean age of 1.5 years (28). We paid special attention to the shape of the lateral ventricles, looking separately at the anterior and posterior halves. The measurements were compared with a healthy control group (10) and with children with hydrocephalus unrelated to MMC (15). The average volume ratio of the posterior to the anterior half of the lateral ventricles was 1.05 ± 0.39 in non-hydrocephalic children, 1.11 ± 0.55 in untreated hydrocephalic children without MMC, and 2.15 ± 0.65 in MMC-associated hydrocephalus prior to shunting. These ratios did not change significantly after shunting. This confirms our impression that MMC-associated hydrocephalus shows a characteristic shape, with a disproportionate enlargement of the posterior part of the lateral ventricles, in clear contrast to the normal-width frontal horns. This shape is reminiscent of the fetal ventricular shape. It reveals disturbance of brain development in children with MMC, which goes beyond the classic description of the Chiari malformation. (orig.)
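The Cavalieri estimator used in the study has a simple form: with slices a constant distance T apart and a point grid of known area per point overlaid on each slice, the volume estimate is V = T · a_p · ΣP. A sketch with invented counts:

```python
# Cavalieri volume estimation: on systematically spaced CT slices a point
# grid is overlaid and the hits on the ventricles counted; the volume
# estimate is V = T * a_p * sum(P), with T the slice spacing and a_p the
# area associated with each grid point. Counts below are illustrative.
def cavalieri_volume(point_counts, slice_spacing_cm, area_per_point_cm2):
    return slice_spacing_cm * area_per_point_cm2 * sum(point_counts)

counts = [0, 3, 8, 12, 11, 7, 2, 0]      # grid hits per slice (hypothetical)
v = cavalieri_volume(counts, slice_spacing_cm=0.5, area_per_point_cm2=0.25)
print(v)  # 5.375 cm^3
```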

  1. Stereological estimation of the mean and variance of nuclear volume from vertical sections

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1991-01-01

    The application of assumption-free, unbiased stereological techniques for estimation of the volume-weighted mean nuclear volume, nuclear vv, from vertical sections of benign and malignant nuclear aggregates in melanocytic skin tumours is described. Combining sampling of nuclei with uniform probability in a physical disector and Cavalieri's direct estimator of volume, the unbiased, number-weighted mean nuclear volume, nuclear vN, of the same benign and malignant nuclear populations is also estimated. Having obtained estimates of nuclear volume in both the volume- and number distribution of volume, a detailed investigation of nuclear size variability is possible. Benign and malignant nuclear populations show approximately the same relative variability with regard to nuclear volume, and the presented data are compatible with a simple size transformation from the smaller benign nuclei...

  2. Analysis of pre-service physics teacher skills designing simple physics experiments based technology

    Science.gov (United States)

    Susilawati; Huda, C.; Kurniawan, W.; Masturi; Khoiri, N.

    2018-03-01

    Pre-service physics teachers' skill in designing simple experiment sets is very important for deepening students' conceptual understanding and for practicing scientific skills in the laboratory. This study describes the skills of physics students in designing technology-based simple experiments. The experimental design stages include simple apparatus design and sensor modification. The research method is descriptive, with a sample of 25 students and 5 variations of simple physics experimental designs. Based on the results of interviews and observations, the pre-service physics teachers' skill in designing technology-based simple physics experiments is good. Based on the observation results, their skill in designing simple experiments is good, while sensor modification and application are still lacking. This suggests that pre-service physics teachers still need considerable practice in designing physics experiments that use sensor modifications. Based on the interview results, students are highly motivated to participate actively in laboratory activities and show strong curiosity about building simple practicum tools for physics experiments.

  3. Intralesional and metastatic heterogeneity in malignant melanomas demonstrated by stereologic estimates of nuclear volume

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Erlandsen, M

    1990-01-01

    Regional variability of nuclear 3-dimensional size can be estimated objectively using point-sampled intercepts obtained from different, defined zones within individual neoplasms. In the present study, stereologic estimates of the volume-weighted mean nuclear volume, nuclear vv, within peripheral...... on average larger in the peripheral zones of primary melanomas, than nuclear vv in central zones (2p = 6.7 x 10(-4), whereas no zonal differences were demonstrated in metastatic lesions (2p = 0.21). A marked intraindividual variation was demonstrated between primary and corresponding secondary melanomas (2p...... melanomas showed large interindividual variation. This finding emphasizes that unbiased estimates of nuclear vv are robust to regional heterogeneity of nuclear volume and thus suitable for purposes of objective, quantitative malignancy grading of melanomas....
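The point-sampled-intercepts estimator behind these vv estimates has the closed form vv = (π/3)·E[ℓ₀³], where ℓ₀ is the intercept length through a nucleus measured from a sampling point that hit it. A sketch with invented intercept lengths:

```python
import math

# Volume-weighted mean nuclear volume from point-sampled intercepts:
# vV = (pi/3) * mean(l0^3), where l0 is the intercept length through a
# nucleus from a point that hit it. Lengths below are illustrative.
def vv_from_intercepts(lengths_um):
    return (math.pi / 3.0) * sum(l**3 for l in lengths_um) / len(lengths_um)

intercepts = [5.1, 6.3, 4.8, 7.0, 5.5]   # hypothetical lengths in micrometres
print(round(vv_from_intercepts(intercepts), 1))  # mean nuclear volume, um^3
```

Cubing the intercept lengths is what makes the estimate volume-weighted: larger nuclei are both more likely to be hit and contribute more per hit.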

  4. Analysis, Design, and Construction of a Base-Isolated Multiple Building Structure

    Directory of Open Access Journals (Sweden)

    Stefano Sorace

    2014-01-01

    The analysis and design of a multiple residential building, seismically protected by a base isolation system incorporating double friction pendulum sliders as protective devices, are presented in this paper. The building, situated in the suburban area of Florence, is composed of four independent reinforced concrete framed structures, mutually separated by three thermal expansion joints. The plan is L-shaped, with dimensions of about 75 m in the longitudinal direction and about 30 m along the longest side of the transversal direction. These characteristics identify the structure as the largest example of a base-isolated “artificial ground” ever built in Italy. The base isolation solution guarantees lower costs, much greater performance, and a finer architectural look compared to a conventional fixed-base antiseismic design. The characteristics of the building and the isolators, the mechanical properties, experimental characterization campaign, and preliminary sizing of the latter, and the nonlinear time-history design and performance assessment analyses developed for the base-isolated building are reported in this paper, along with details of the installation of the isolators and the plants, and highlights of the construction works.

  5. Assessing the 'system' in safe systems-based road designs: using cognitive work analysis to evaluate intersection designs.

    Science.gov (United States)

    Cornelissen, M; Salmon, P M; Stanton, N A; McClure, R

    2015-01-01

    While a safe systems approach has long been acknowledged as the underlying philosophy of contemporary road safety strategies, systemic applications are sparse. This article argues that systems-based methods from the discipline of Ergonomics have a key role to play in road transport design and evaluation. To demonstrate, the Cognitive Work Analysis framework was used to evaluate two road designs - a traditional Melbourne intersection and a cut-through design for future intersections based on road safety safe systems principles. The results demonstrate that, although the cut-through intersection appears different in layout from the traditional intersection, system constraints are not markedly different. Furthermore, the analyses demonstrated that redistribution of constraints in the cut-through intersection resulted in emergent behaviour, which was not anticipated and could prove problematic. Further, based on the lack of understanding of emergent behaviour, similar design induced problems are apparent across both intersections. Specifically, incompatibilities between infrastructure, vehicles and different road users were not dealt with by the proposed design changes. The importance of applying systems methods in the design and evaluation of road transport systems is discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Stereological estimation of nuclear volume and other quantitative histopathological parameters in the prognostic evaluation of supraglottic laryngeal squamous cell carcinoma

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Bennedbaek, O; Pilgaard, J

    1989-01-01

    The aim of this study was to investigate various approaches to the grading of malignancy in pre-treatment biopsies from patients with supraglottic laryngeal squamous cell carcinoma. The prospects of objective malignancy grading based on stereological estimation of the volume-weighted mean nuclear...... observers of the latter was poor in the material which consisted of 35 biopsy specimens. Unbiased estimates of nuclear Vv were on the average 385 microns3 (CV = 0.44), with more than 90% of the associated variance attributable to differences in nuclear Vv among individual lesions. Nuclear Vv was positively....... None of the investigated categorical and quantitative parameters (cutoff points = means) reached the level of significance with respect to prognostic value. However, nuclear Vv showed the best information concerning survival (2p = 0.08), and this estimator offers optimal features for objective...

  7. Design-based estimation of neuronal number and individual neuronal volume in the rat hippocampus

    DEFF Research Database (Denmark)

    Hosseini-Sharifabad, Mohammad; Nyengaard, Jens Randel

    2007-01-01

    Tools recently developed in stereology were employed for unbiased estimation of the neuronal number and volume in three major subdivisions of rat hippocampus (dentate granular, CA1 and CA3 pyramidal layers). The optical fractionator is used extensively in quantitative studies of the hippocampus; ...
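The optical fractionator mentioned here combines counts from a known fraction of the tissue: N = ΣQ⁻ · (1/ssf) · (1/asf) · (1/hsf), where ssf, asf and hsf are the section, area and height sampling fractions. The fractions and counts below are invented for illustration:

```python
# Optical fractionator estimate of total neuron number:
#   N = sum(Q) * (1/ssf) * (1/asf) * (1/hsf)
# ssf = section sampling fraction, asf = counting-frame area / x-y step
# area, hsf = disector height / section thickness. Numbers are illustrative.
def optical_fractionator(q_counts, ssf, asf, hsf):
    return sum(q_counts) * (1 / ssf) * (1 / asf) * (1 / hsf)

q = [14, 18, 11, 16, 13, 17]     # neurons counted per sampled section
n_est = optical_fractionator(q, ssf=1/10, asf=0.04, hsf=0.5)
print(f"{n_est:.0f}")            # 44500
```

Because every fraction is known by design, the estimate is unbiased without any assumption about neuron size or shape.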

  8. Effects of bisphenol A treatment during pregnancy on kidney development in mice: a stereological and histopathological study.

    Science.gov (United States)

    Nuñez, P; Fernandez, T; García-Arévalo, M; Alonso-Magdalena, P; Nadal, A; Perillan, C; Arguelles, J

    2018-04-01

    Bisphenol A (BPA) is a chemical found in plastics that resembles oestrogen in organisms. Developmental exposure to endocrine-disrupting chemicals, such as BPA, increases the susceptibility to type 2 diabetes (T2DM) and cardiovascular diseases. Animal studies have reported a nephron deficit in offspring exposed to maternal diabetes. The aim of this study was to investigate the prenatal BPA exposure effects on nephrogenesis in a mouse model that was predisposed to T2DM. This study quantitatively evaluated the renal structural changes using stereology and histomorphometry methods. The OF1 pregnant mice were treated with a vehicle or BPA (10 or 100 μg/kg/day) during days 9-16 of gestation (early nephrogenesis). The 30-day-old offspring were sacrificed, and tissue samples were collected and prepared for histopathological and stereology studies. Glomerular abnormalities and reduced glomerular formation were observed in the BPA offspring. The kidneys of the BPA10 and BPA100 female offspring had a significantly lower glomerular number and density than those of the CONTROL female offspring. The glomerular histomorphometry revealed a significant difference between the female and male CONTROL offspring for the analysed glomerular parameters that disappeared in the BPA10 and BPA100 offspring. In addition, the kidney histopathological examination showed typical male cuboidal epithelial cells of the Bowman capsule in the female BPA offspring. Exposure to environmentally relevant doses of BPA during embryonic development altered nephrogenesis. These structural changes could be associated with an increased risk of developing cardiometabolic diseases later in life.

  9. Design of microcontroller-based EMG and the analysis of EMG signals.

    Science.gov (United States)

    Güler, Nihal Fatma; Hardalaç, Firat

    2002-04-01

    In this work, a microcontroller-based EMG system was designed and tested on 40 patients. With the patients at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right-leg peroneal region, and histograms were constructed from the results of the FFT analysis. The results show that in 30 patients the amplitude of the fibrillation potentials of the muscle fibers measured from the peroneal region was low and their duration short, indicating motor nerve degeneration; the remaining 10 patients were found to be healthy.
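The FFT step can be sketched on a synthetic signal (simulated data, not the patients' recordings; the 50 Hz component and noise level are arbitrary):

```python
import numpy as np

# A synthetic 1-s EMG trace sampled at 1 kHz with a dominant 50 Hz
# component; the amplitude spectrum would feed the histograms described
# in the abstract. The signal is simulated, not patient data.
fs = 1000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
emg = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(fs)

spectrum = np.abs(np.fft.rfft(emg)) / fs        # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(fs, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(dominant)  # 50.0
```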

  10. Evidence-Based Design and Research-Informed Design: What's the Difference? Conceptual Definitions and Comparative Analysis.

    Science.gov (United States)

    Peavey, Erin; Vander Wyst, Kiley B

    2017-10-01

    This article provides critical examination and comparison of the conceptual meaning and underlying assumptions of the concepts evidence-based design (EBD) and research-informed design (RID) in order to facilitate practical use and theoretical development. In recent years, EBD has experienced broad adoption, yet it has been simultaneously critiqued for rigidity and misapplication. Many practitioners are gravitating to the term RID to describe their method of integrating knowledge into the design process. However, the term RID lacks a clear definition and the blurring of terms has the potential to weaken advances made integrating research into practice. Concept analysis methods from Walker and Avant were used to define the concepts for comparison. Conceptual definitions, process descriptions, examples (i.e., model cases), and methods of evaluation are offered for EBD and RID. Although EBD and RID share similarities in meaning, the two terms are distinct. When comparing evidence based (EB) and research informed, EB is a broad base of information types (evidence) that are narrowly applied (based), while the latter references a narrow slice of information (research) that is broadly applied (informed) to create an end product of design. Much of the confusion between the use of the concepts EBD and RID arises out of differing perspectives between the way practitioners and academics understand the underlying terms. The authors hope this article serves to generate thoughtful dialogue, which is essential to the development of a discipline, and look forward to the contribution of the readership.

  11. Stereological estimation of the mean and variance of nuclear volume from vertical sections

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt

    1991-01-01

    The application of assumption-free, unbiased stereological techniques for estimation of the volume-weighted mean nuclear volume, nuclear vv, from vertical sections of benign and malignant nuclear aggregates in melanocytic skin tumours is described. Combining sampling of nuclei with uniform...... probability in a physical disector and Cavalieri's direct estimator of volume, the unbiased, number-weighted mean nuclear volume, nuclear vN, of the same benign and malignant nuclear populations is also estimated. Having obtained estimates of nuclear volume in both the volume- and number distribution...... to the larger malignant nuclei. Finally, the variance in the volume distribution of nuclear volume is estimated by shape-independent estimates of the volume-weighted second moment of the nuclear volume, vv2, using both a manual and a computer-assisted approach. The working procedure for the description of 3-D...

  12. Application of the Physical Disector Principle for Quantification of Dopaminergic Neuronal Loss in a Rat 6-Hydroxydopamine Nigral Lesion Model of Parkinson's Disease

    Directory of Open Access Journals (Sweden)

    Katrine Fabricius

    2017-12-01

    Stereological analysis is the optimal tool for quantitative assessment of brain morphological and cellular changes induced by neurotoxic lesions or treatment interventions. Stereological methods based on random sampling techniques yield unbiased estimates of particle counts within a defined volume, thereby providing a true quantitative estimate of the target cell population. Neurodegenerative diseases involve loss of specific neuron types, such as the midbrain tyrosine hydroxylase-positive dopamine neurons in Parkinson's disease and in animal models of nigrostriatal degeneration. Therefore, we applied an established automated physical disector principle in a fractionator design for efficient stereological quantitative analysis of tyrosine hydroxylase (TH)-positive dopamine neurons in the substantia nigra pars compacta of hemiparkinsonian rats with unilateral 6-hydroxydopamine (6-OHDA) lesions. We obtained reliable estimates of dopamine neuron numbers, and established the relationship between behavioral asymmetry and dopamine neuron loss on the lesioned side. In conclusion, the automated physical disector principle provided a useful and efficient tool for unbiased estimation of TH-positive neurons in the rat midbrain, and should prove valuable for investigating neuroprotective strategies in the 6-OHDA model of parkinsonism, while generalizing to other immunohistochemically defined cell populations.
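The physical disector rule itself is simple: a particle is counted only when its profile appears in the reference section but not in the adjacent look-up section, so every cell is counted exactly once regardless of its size or shape. A sketch with invented profile IDs:

```python
# Physical disector principle: a particle (here a neuron profile ID) is
# counted only if it appears in the reference section but not in the
# adjacent look-up section, so each cell is counted exactly once
# regardless of its size or shape. IDs below are invented.
def disector_count(reference_ids, lookup_ids):
    return len(set(reference_ids) - set(lookup_ids))

reference = {"n1", "n2", "n3", "n5", "n8"}   # profiles in reference section
lookup    = {"n2", "n3", "n7"}               # profiles in look-up section
print(disector_count(reference, lookup))     # 3 counted: n1, n5, n8
```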

  13. STEREOLOGICAL STUDIES ON FETAL VASCULAR DEVELOPMENT IN HUMAN PLACENTAL VILLI

    Directory of Open Access Journals (Sweden)

    Terry M Mayhew

    2011-05-01

    In human pregnancy, fetal well-being depends on the development of placental villi and the creation and maintenance of fetal microvessels within them. The aim of this study was to define stereological measures of the growth, capillarization and maturation of villi and of fetoplacental angiogenesis and capillary remodelling. Placentas were collected at 12-41 weeks of gestation and assigned to six age groups spanning equal age ranges. Tissue samples were randomised for position and orientation. Overall growth of peripheral (intermediate and terminal) villi and their capillaries was evaluated using total volumes, surface areas and lengths. Measures of villous capillarization comprised capillary volume, surface and length densities and capillary:villus surface and length ratios. Size and shape remodelling of villi and capillaries was assessed using mean cross-sectional areas, perimeters and shape coefficients (perimeter²/area). Group comparisons were drawn by analysis of variance. Villous and capillary volumes, surfaces and lengths increased significantly throughout gestation. Villous maturation involved phasic (capillary:villus surface and length ratios) or progressive (volume, surface and length densities) increases in indices of villous capillarization. It also involved isomorphic thinning (cross-sectional areas and perimeters declined but shape coefficients did not alter). In contrast, growth of capillaries did not involve changes in luminal areas or perimeters. The results show that villous growth and fetal angiogenesis involve increases in overall length rather than calibre and that villous differentiation involves increased capillarization. Although they do not distinguish between increases in the lengths versus numbers of capillary segments, other studies have shown that capillaries switch from branching to non-branching angiogenesis during gestation. Combined with maintenance of capillary calibres, these processes will contribute to the reduced...

  14. Decision-Based Design Integrating Consumer Preferences into Engineering Design

    CERN Document Server

    Chen, Wei; Wassenaar, Henk Jan

    2013-01-01

    Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design.  Once the related fundamentals of decision theory, economic analysis, and econometrics modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: •A rigorous framework of integrating the interests from both producer and consumers in engineering design, •Analytical techniques of consumer choice model...

  15. Immunocytochemical and stereological analysis of GABA(B) receptor subunit expression in the rat vestibular nucleus following unilateral vestibular deafferentation.

    Science.gov (United States)

    Zhang, Rong; Ashton, John; Horii, Arata; Darlington, Cynthia L; Smith, Paul F

    2005-03-10

    The process of behavioral recovery that occurs following damage to one vestibular labyrinth, vestibular compensation, has been attributed in part to a down-regulation of GABA(B) receptors in the vestibular nucleus complex (VNC) ipsilateral to the lesion, which could potentially reduce commissural inhibition from the contralateral VNC. In this study, we tested the possibility that this occurs through a decrease in the expression of either the GABA(B1) or GABA(B2) subunits of the GABA(B) receptor. We used Western blotting to quantify the expression of these subunits in the VNC at 10 h and 50 h following unilateral vestibular deafferentation (UVD) or sham surgery in rats. We then used immunocytochemistry and stereological counting methods to estimate the number of neurons expressing these subunits in the MVN at 10 h and 2 weeks following UVD or sham surgery. Compared to sham controls, we found no significant changes in either the expression of the two GABA(B) receptor subunits in the VNC or in the number of MVN neurons expressing these GABA(B) receptor subunits post-UVD. These results suggest that GABA(B) receptor expression does not change substantially in the VNC during the process of vestibular compensation.

  16. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation must be performed during the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of the software can be analyzed on the basis of fault tree synthesis. (authors)
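Once a fault tree has been synthesized, its quantitative evaluation reduces to combining basic-event probabilities through the gates (assuming independent events). A minimal sketch with an invented two-gate tree, not the paper's PLC template:

```python
# Minimal fault tree evaluation: basic-event probabilities combined
# through OR/AND gates, assuming independent events. The tree below is
# invented for illustration.
def or_gate(*p):   # P(at least one input event occurs)
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):  # P(all input events occur)
    q = 1.0
    for pi in p:
        q *= pi
    return q

sensor_fault = or_gate(1e-3, 2e-3)          # either input channel fails
top_event = and_gate(sensor_fault, 1e-2)    # ... and the voter also fails
print(f"{top_event:.3e}")
```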

  17. Stereological quantification of immune-competent cells in baseline biopsy specimens from achilles tendons

    DEFF Research Database (Denmark)

    Kragsnaes, Maja Skov; Fredberg, Ulrich; Stribolt, Katrine

    2014-01-01

    BACKGROUND: Limited data exist on the presence and function of immune-competent cells in chronic tendinopathic tendons and their potential role in inflammation and tissue healing as well as in predicting long-term outcome. PURPOSE: To quantify subtypes of immune-competent cells in biopsy specimens...... immunohistochemically by quantifying the presence of macrophages (CD68-PGM1(+), CD68-KP1(+)), hemosiderophages (Perls blue), T lymphocytes (CD2(+), CD3(+), CD4(+), CD7(+), CD8(+)), B lymphocytes (CD20(+)), natural killer cells (CD56(+)), mast cells (NaSDCl(+)), Schwann cells (S100(+)), and endothelial cells (CD34(+)) using a stereological technique. A follow-up examination was conducted more than 4 years (range, 4-9 years) after the biopsy procedure to evaluate the long-term presence of Achilles tendon symptoms. RESULTS: Macrophages, T lymphocytes, mast cells, and natural killer cells were observed in the majority...

  18. Concurrent multidisciplinary mechanical design based on design task analysis and knowledge sharing; Sekkei task bunseki to joho kyoyu ni yoru mechatronics kyocho sekkei

    Energy Technology Data Exchange (ETDEWEB)

    Kondo, K.; Ozawa, M.; Mori, T. [Toshiba Corp., Tokyo (Japan)

    1999-09-01

    We have developed a systematic design task planning method based on a design structure matrix (DSM) and a lumped-model-based framework for knowledge sharing in a concurrent design environment as key techniques for developing higher-quality products in a shorter design time. The DSM facilitates systematic analysis of dependencies among design tasks and optimization of the design process. The framework, based on a lumped-model description of mechanical systems, enables concurrent and cooperative work among multidisciplinary designers at an early stage of the design process. In this paper, we also discuss the relationships between these techniques and the product development flow from product definition to detailed design. (author)
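One thing a DSM supports is re-sequencing: if the task dependency structure is acyclic, tasks can be ordered so that each runs after its inputs, while cycles flag coupled tasks that need iteration. A sketch with invented tasks, using Python's standard `graphlib`:

```python
from graphlib import TopologicalSorter

# Design-task sequencing from a DSM, with invented tasks: tasks[t] lists
# the tasks whose output t depends on. An acyclic DSM can be re-sequenced
# so every task runs after its inputs; a cyclic (coupled) block would
# raise CycleError and mark those tasks for iterative design.
tasks = {
    "layout":   [],
    "actuator": ["layout"],
    "sensor":   ["layout"],
    "control":  ["actuator", "sensor"],
}
order = list(TopologicalSorter(tasks).static_order())
print(order[0], order[-1])  # layout first, control last
```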

  19. Stereologic, histopathologic, flow cytometric, and clinical parameters in the prognostic evaluation of 74 patients with intraoral squamous cell carcinomas

    DEFF Research Database (Denmark)

    Bundgaard, T; Sørensen, Flemming Brandt; Gaihede, M

    1992-01-01

    BACKGROUND AND METHODS: A consecutive series of all 78 incident cases of intraoral squamous cell carcinoma occurring during a 2-year period in a population of 1.4 million inhabitants was evaluated by histologic score (the modified classification of Jacobsson et al.), flow cytometry, stereology, tumor size, and the TNM classification. RESULTS: The investigation showed a significant difference between the volume-weighted mean nuclear volume (nuclear vv) of oral leukoplakia (n = 29) and oral squamous cell carcinomas (P = 0.001). The value of the parameters as prognostic indicators of survival...

  20. Early behavioral changes and quantitative analysis of neuropathological features in murine prion disease

    Science.gov (United States)

    Borner, Roseane; Bento-Torres, João; Souza, Diego RV; Sadala, Danyelle B; Trevia, Nonata; Farias, José Augusto; Lins, Nara; Passos, Aline; Quintairos, Amanda; Diniz, José Antônio; Perry, Victor Hugh; Vasconcelos, Pedro Fernando; Cunningham, Colm

    2011-01-01

    Behavioral and neuropathological changes have been widely investigated in murine prion disease, but stereology-based unbiased estimates of key neuropathological features have not been carried out. After injections of ME7-infected (ME7) or normal brain homogenates (NBH) into the dorsal CA1 of albino Swiss and C57BL/6 mice, we assessed behavioral changes on hippocampal-dependent tasks. We also estimated, by optical fractionator at 15 and 18 weeks post-injection (w.p.i.), the total number of neurons, reactive astrocytes, activated microglia and perineuronal nets (PN) in the polymorphic layer of the dentate gyrus (PolDG), CA1 and septum in albino Swiss mice. On average, early behavioral changes in albino Swiss mice start four weeks later than in C57BL/6. Cluster and discriminant analysis of behavioral data in albino Swiss mice revealed that four of nine subjects start to change their behavior at 12 w.p.i. and reach the terminal stage at 22 w.p.i., and the remaining subjects start at 22 w.p.i. and reach the terminal stage at 26 w.p.i. Biotinylated dextran amine (BDA) tracer experiments in the mossy fiber pathway confirmed axonal degeneration, and stereological data showed that early astrocytosis, microgliosis and reduction in the perineuronal nets are independent of a change in the number of neuronal cell bodies. Statistical analysis revealed that the septal region had greater levels of neuroinflammation and extracellular matrix damage than CA1. This stereological and multivariate analysis at early stages of disease in an outbred model of prion disease provides new insights connecting behavioral changes and neuroinflammation and seems important for understanding the mechanisms of prion disease progression. PMID:21862877
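
The optical fractionator estimate used above has a simple closed form: the raw count is scaled by the inverse of each sampling fraction. A minimal sketch, with counts and fractions invented for illustration rather than taken from the study:

```python
# Optical fractionator: N = sum(Q) * (1/ssf) * (1/asf) * (1/tsf),
# where ssf, asf and tsf are the section, area and thickness sampling
# fractions. All numbers below are illustrative.
def optical_fractionator(counted, ssf, asf, tsf):
    """Estimate total particle number from counts in a known fraction."""
    return counted * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)

# e.g. 150 cells counted on every 10th section, in 1% of the section
# area, through 40% of the section thickness
estimate = optical_fractionator(counted=150, ssf=1/10, asf=0.01, tsf=0.4)
print(f"estimated total: {estimate:.0f}")  # estimated total: 375000
```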

  1. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    Science.gov (United States)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
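
A weighted-sum score is one common MCDM scheme for this kind of architecture trade; it is not necessarily the scheme used in RAMSoS-RESIL, and the criteria weights and scores below are invented for illustration.

```python
# Weighted-sum MCDM sketch for a distributed vs. monolithic UAV trade.
# Weights and scores are illustrative assumptions, not the paper's data.
# Higher score = better on that criterion (risk scored as "lower risk").
weights = {"mission_performance": 0.5, "risk": 0.3, "cost": 0.2}

alternatives = {
    "monolithic":  {"mission_performance": 0.6, "risk": 0.4, "cost": 0.8},
    "distributed": {"mission_performance": 0.8, "risk": 0.7, "cost": 0.5},
}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(alternatives,
                key=lambda a: weighted_score(alternatives[a], weights),
                reverse=True)
print(ranked[0])  # the preferred architecture under these weights
```

Changing the weights (for example, making cost dominant) can flip the ranking, which is why such trade studies usually include a weight sensitivity sweep.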

  2. A finite element based substructuring procedure for design analysis of large smart structural systems

    International Nuclear Information System (INIS)

    Ashwin, U; Raja, S; Dwarakanathan, D

    2009-01-01

    A substructuring based design analysis procedure is presented for large smart structural systems using the Craig–Bampton method. The smart structural system is distinctively characterized as an active substructure, modelled as a design problem, and a passive substructure, idealized as an analysis problem. Furthermore, a novel approach has been applied by introducing the electro–elastic coupling into the reduction scheme to solve the global structural control problem in a local domain. As an illustration, a smart composite box beam with surface-bonded actuators/sensors is considered, and results of the local-to-global control analysis are presented to show the potential use of the developed procedure. The present numerical scheme is useful for optimally designing the active substructures to study their locations and coupled structure–actuator interaction, and provides a solution to the global design of large smart structural systems.
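
A Craig–Bampton reduction combines fixed-interface modes with static constraint modes; the static part alone (Guyan condensation) already shows how interior degrees of freedom are eliminated onto boundary DOFs. A pure-Python sketch on a 3-DOF spring chain, with an illustrative stiffness value not taken from the paper:

```python
# Static condensation (Guyan): K_red = K_bb - K_bi * inv(K_ii) * K_ib.
# This is the static constraint-mode part of a Craig-Bampton reduction
# (the full method adds fixed-interface modes of the interior partition).
# Example: 3-DOF spring chain, boundary DOFs 0 and 2, interior DOF 1.
k = 1000.0  # spring stiffness, illustrative
K = [[  k,  -k, 0.0],
     [ -k, 2*k,  -k],
     [0.0,  -k,   k]]

boundary = [0, 2]
Kii = K[1][1]  # single interior DOF, so inv(K_ii) is just 1/Kii

K_red = [[K[a][b] - K[a][1] * K[1][b] / Kii for b in boundary]
         for a in boundary]

# Two springs of stiffness k in series leave an effective stiffness k/2
# between the two boundary DOFs.
print(K_red)
```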

  3. Assessment of in vivo MR imaging compared to physical sections in vitro-A quantitative study of brain volumes using stereology

    DEFF Research Database (Denmark)

    Jelsing, Jacob; Rostrup, Egill; Markenroth, Karin

    2005-01-01

    The object of the present study was to compare stereological estimates of brain volumes obtained in vivo by magnetic resonance imaging (MRI) to corresponding volumes from physical sections in vitro. Brains of ten domestic pigs were imaged using a 3-T scanner. The volumes of different brain....... However, although the intraobserver difference of MRI estimates was acceptable, the interobserver difference was not. A statistically highly significant difference of 11-41% was observed between observers for volume estimates of all compartments considered. The study demonstrates that quantitative MRI...

  4. Analysis of a mammography teaching program based on an affordance design model.

    Science.gov (United States)

    Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei

    2006-12-01

    The wide use of computer technology in education, particularly in mammogram reading, calls for evaluation of e-learning. The existing media-comparative studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection. Some statistics were also calculated. The examination identified that this educational software (PBE) provides a set of purpose-designed tools. The learner can use these tools in the process of optimizing displays, scanning images, comparing different projections, marking regions of interest, constructing a descriptive report, assessing one's learning outcomes, and comparing one's decisions with the experts' decisions. Further, PBE provides some resources for the learner to construct one's knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. In addition, users found it easy to navigate and carry out tasks. The users also reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with some cognitive tools, supporting their perceptual problem-solving processes and extending their capabilities. Learners can internalize the mental models in mammogram reading through multiple perceptual triangulations, sensitization of related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the...

  5. STEREOLOGICAL QUANTITATION OF LEYDIG AND SERTOLI CELLS IN THE TESTIS FROM YOUNG AND OLD MEN

    Directory of Open Access Journals (Sweden)

    Peter M Petersen

    2011-05-01

    One of the newer stereological methods, the optical fractionator, was applied to the study of the effects of ageing on the human testis. The estimated total numbers of Sertoli and Leydig cells per testis in men younger than 30 years were 430×10(6) (CV = SD/mean = 0.35) and 117×10(6) (CV = 0.53), respectively, while in men older than 50 years the estimated total Sertoli cell number was 266×10(6) (CV = 0.46) and the mean Leydig cell number 83×10(6) (CV = 0.53). The difference between the number of Sertoli cells in men younger than 30 years compared with men older than 50 years was close to statistical significance (p = 0.052), while no difference was found in total Leydig cell number (p = 0.22).

  6. Novel efficient methods for measuring mesophyll anatomical characteristics from fresh thick sections using stereology and confocal microscopy: application on acid rain-treated Norway spruce needles

    Czech Academy of Sciences Publication Activity Database

    Albrechtová, Jana; Janáček, Jiří; Lhotáková, Zuzana; Radochová, Barbora; Kubínová, Lucie

    2007-01-01

    Vol. 58, No. 6 (2007), p. 1451-1461. ISSN 0022-0957. R&D Projects: GA AV ČR IAA5011810; GA AV ČR(CZ) IAA600110507; GA MŠk(CZ) LC06063. Institutional research plan: CEZ:AV0Z50110509; CEZ:AV0Z60050516. Keywords: mesophyll * stereology * confocal microscopy. Subject RIV: EA - Cell Biology. Impact factor: 3.917, year: 2007

  7. Inner Current Loop Analysis and Design Based on Resonant Regulators for Isolated Microgrids

    DEFF Research Database (Denmark)

    Federico, de Bosio; de Sousa Ribeiro, Luiz Antonio; Soares Lima, Marcel

    2015-01-01

    Inner current and voltage loops are fundamental in achieving good performance of microgrids based on power electronics voltage source inverters. The analysis and design of these loops are essential for the adequate operation of these systems. This paper investigates the effect of state feedback...
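
For context, a textbook ideal proportional-resonant (PR) regulator of the kind used in such inner current loops has the transfer function (a standard form, not necessarily the exact controller studied in the paper):

```latex
G_{PR}(s) = K_p + K_r \, \frac{s}{s^{2} + \omega_0^{2}}
```

The resonant term provides theoretically infinite gain at the resonant frequency \(\omega_0\) (typically the grid or inverter fundamental), which drives the steady-state error of a sinusoidal reference at that frequency to zero, the rotating-frame analogue of integral action.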

  8. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  9. An empirical analysis of the precision of estimating the numbers of neurons and glia in human neocortex using a fractionator-design with sub-sampling

    DEFF Research Database (Denmark)

    Lyck, L.; Santamaria, I.D.; Pakkenberg, B.

    2009-01-01

    Improving histomorphometric analysis of the human neocortex by combining stereological cell counting with immunohistochemical visualisation of specific neuronal and glial cell populations is a methodological challenge. To enable standardized immunohistochemical staining, the amount of brain tissue...... at each level of sampling was determined empirically. The methodology was tested in three brains analysing the contribution of the multi-step sampling procedure to the precision of the estimated total numbers of immunohistochemically defined NeuN expressing (NeuN(+)) neurons and CD45(+) microglia...
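
One component of the precision such a study quantifies has a well-known approximation: for roughly independent (Poisson-like) counts, the coefficient of error contributed by counting noise alone is about 1/sqrt of the total count. A sketch with invented counts:

```python
# Counting-noise contribution to the coefficient of error (CE) of a
# fractionator estimate: CE_noise ~ 1 / sqrt(total count), under the
# assumption of roughly independent counts. Counts are illustrative;
# the full CE also includes between-section sampling variance.
import math

def ce_noise(total_count):
    return 1.0 / math.sqrt(total_count)

for q in (25, 100, 400):
    print(f"Q = {q:4d}  ->  CE_noise ~ {ce_noise(q):.2f}")
# Counting on the order of 100-200 cells already brings the noise CE
# near 0.10, which is why fractionator designs rarely count many more.
```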

  10. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of the computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest amounts in the environmental categories while retaining a high maximum stress value.

  11. CAPRI (Computational Analysis PRogramming Interface): A Solid Modeling Based Infra-Structure for Engineering Analysis and Design Simulations

    Science.gov (United States)

    Haimes, Robert; Follen, Gregory J.

    1998-01-01

    CAPRI is a CAD-vendor neutral application programming interface designed for the construction of analysis and design systems. By allowing access to the geometry from within all modules (grid generators, solvers and post-processors) such tasks as meshing on the actual surfaces, node enrichment by solvers and defining which mesh faces are boundaries (for the solver and visualization system) become simpler. The overall reliance on file 'standards' is minimized. This 'Geometry Centric' approach makes multi-physics (multi-disciplinary) analysis codes much easier to build. By using the shared (coupled) surface as the foundation, CAPRI provides a single call to interpolate grid-node based data from the surface discretization in one volume to another. Finally, design systems are possible where the results can be brought back into the CAD system (and therefore manufactured) because all geometry construction and modification are performed using the CAD system's geometry kernel.

  12. Design, Development and Analysis of Centrifugal Blower

    Science.gov (United States)

    Baloni, Beena Devendra; Channiwala, Salim Abbasbhai; Harsha, Sugnanam Naga Ramannath

    2018-06-01

    Centrifugal blowers are widely used turbomachinery in all kinds of modern industrial and domestic life. Manufacturing of blowers seldom follows an optimum design solution for the individual blower. Although centrifugal blowers have developed into highly efficient machines, design is still based on various empirical and semi-empirical rules proposed by fan designers. Different methodologies are used to design the impeller and other components of blowers. The objective of the present study is to examine explicit design methodologies and trace a unified design that gives better design-point performance. This unified design methodology is based more on fundamental concepts and minimum assumptions. A parametric study is also carried out on the effect of design parameters on pressure ratio and their interdependency in the design. A code is developed based on the unified design using C programming. Numerical analysis is carried out to check the flow parameters inside the blower. Two blowers, one based on the present design and the other on an industrial design, were developed with a standard OEM blower manufacturing unit. A comparison of both designs is made based on experimental performance analysis as per the IS standard. The results suggest better efficiency and a higher flow rate for the same pressure head for the present design compared with the industrial one.
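
Fundamental sizing of this kind usually starts from the Euler turbomachinery equation for the ideal pressure rise of a backward-swept impeller. The sketch below uses that textbook relation with invented numbers; it is not the paper's design code.

```python
# Euler work for a centrifugal impeller with backward-swept blades:
# dp_ideal = rho * u2 * cu2, with cu2 = u2 - cm2 / tan(beta2).
# All values are illustrative, not taken from the paper.
import math

rho = 1.2                 # air density, kg/m^3
D2 = 0.4                  # impeller outer diameter, m
N = 2900                  # rotational speed, rpm
cm2 = 12.0                # meridional velocity at impeller exit, m/s
beta2 = math.radians(60)  # blade exit angle measured from tangential

u2 = math.pi * D2 * N / 60.0      # blade tip speed, m/s
cu2 = u2 - cm2 / math.tan(beta2)  # tangential component of absolute velocity
dp_ideal = rho * u2 * cu2         # ideal (Euler) pressure rise, Pa
print(f"u2 = {u2:.1f} m/s, ideal dp = {dp_ideal:.0f} Pa")
```

Real pressure rise is lower due to slip, and hydraulic and leakage losses, which is where the empirical corrections mentioned in the abstract enter.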

  13. Proliferation of Sertoli cells during development of the human testis assessed by stereological methods

    DEFF Research Database (Denmark)

    Cortes, D; Müller, J; Skakkebaek, N E

    1987-01-01

    Sertoli cells were studied using stereological methods in testes obtained from five children who were stillborn, and 31 individuals between 3 months and 40 years of age, who had suffered from sudden, unexpected death. The mean nuclear volume of the Sertoli cells, the numerical density of Sertoli cells, and the total number of Sertoli cells per individual were determined by point- and profile-counting of 0.5 micron sections. The nuclear volume of Sertoli cells increased from a median of 120 microns3 (range 53-130) during the period of 3 months to 10 years to 210 microns3 (170-260) in adults (greater than 25 years). The numerical density of Sertoli cells decreased from a median of 1200 X 10(6)/cm3 (870-1400) during childhood (3 months to 10 years) to 140 X 10(6)/cm3 (110-260) in adults (greater than 25 years). The total number of Sertoli cells per individual increased significantly from...
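
The quantities reported above combine as a simple model-based estimator: total number per individual = numerical density × reference volume. A sketch in the units of the abstract, with the reference volume invented for illustration:

```python
# Model-based total number: N = Nv * V_ref.
# Nv taken in the abstract's units (per cm^3); the reference volume
# below is an illustrative assumption, not a value from the study.
def total_number(nv_per_cm3, v_ref_cm3):
    return nv_per_cm3 * v_ref_cm3

# e.g. Nv = 140 x 10(6)/cm3 in a hypothetical 20 cm^3 adult testis
n_total = total_number(140e6, 20.0)
print(f"{n_total:.2e}")  # 2.80e+09
```

Unlike the fractionator, this ratio-based route needs an unbiased reference volume (for example, from the Cavalieri principle) and is sensitive to tissue shrinkage.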

  14. PRESIDENT’S ADDRESS

    Directory of Open Access Journals (Sweden)

    Lucie Kubínová

    2016-03-01

    Dear colleagues, let me wish you and our Image Analysis & Stereology Journal all the best in 2016. It is a great honour for me to be in charge of the International Society for Stereology (ISS). I would like to thank Eric Pirard for all the tremendous work and effort he has put into ISS activities in past years, together with all members of the ISS Board. I am glad Eric is willing to help me with ISS as „Immediate Past President“. I also rely on close collaboration with Ida Eržen and Marko Kreft, the Editors-in-Chief of Image Analysis & Stereology (IAS), the IAS Editorial Board and the future ISS Board. I appreciate the long-term cooperation between the Image Analysis & Stereology Journal and the International Society for Stereology and I am looking forward to broadening our relationships in the future. As President of ISS I plan to initiate new activities which would help to make ISS vivid and useful to the scientific community. We are planning new courses on stereology, image analysis and related topics run in cooperation with ISS with a reduced fee for ISS members, a competition for the best PhD thesis using stereology and/or image analysis, ISS history mapping, etc. I welcome your further suggestions and comments. I would like to cordially invite you to become members of the International Society for Stereology (http://stereologysociety.org/membership.html) and to take part in new activities organized by ISS, such as: a Special Session „3D Image Analysis and Stereology in Fluorescence Microscopy“ at ISBI 2016 (in Prague, see http://biomedicalimaging.org/2016/?page_id=768), and a round table discussion on „Stereology and 3D image analysis in microscopy“ to be held at the 16th European Microscopy Congress in Lyon (EMC 2016, see http://www.emc2016.fr/en/scientific-programme/special/stereology). It is my intention to involve young people and fresh ideas in the activities of ISS and in further improvement of IAS. Please contact me at lucie

  15. A Technique of Software Safety Analysis in the Design Phase for PLC Based Safety-Critical Systems

    International Nuclear Information System (INIS)

    Koo, Seo-Ryong; Kim, Chang-Hwoi

    2017-01-01

    The purpose of safety analysis, which is a method of identifying portions of a system that have the potential for unacceptable hazards, is firstly to encourage design changes that will reduce or eliminate hazards and, secondly, to conduct special analyses and tests that can provide increased confidence in especially vulnerable portions of the system. For the design and implementation phase of PLC-based systems, we proposed a technique for software design specification and analysis, and this technique enables us to generate software design specifications (SDSs) in nuclear fields. For the safety analysis in the design phase, we used the architecture design blocks of NuFDS to represent the architecture of the software. On the basis of the architecture design specification, we can directly generate the fault tree and then use the fault tree for qualitative analysis. Therefore, we proposed a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Through the fault tree synthesis proposed in this work, users can use the architecture specification of the NuFDS approach to intuitively compose fault trees that help analyze the safety design features of software.
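
Once synthesized, a fault tree can be evaluated mechanically: under an independence assumption, an AND gate multiplies basic-event probabilities and an OR gate combines them as 1 - Π(1 - p). The gate structure and probabilities below are invented for illustration, not the NuFDS templates.

```python
# Minimal fault tree evaluation under the independence assumption.
# Gate structure and basic-event probabilities are illustrative.
def evaluate(node, p_basic):
    kind = node[0]
    if kind == "basic":
        return p_basic[node[1]]
    probs = [evaluate(child, p_basic) for child in node[1]]
    if kind == "and":          # all inputs must fail
        out = 1.0
        for p in probs:
            out *= p
        return out
    if kind == "or":           # any input failing is enough
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(f"unknown gate: {kind}")

# Top event: the output module fails if its logic fails OR both
# redundant input channels fail (hypothetical architecture).
tree = ("or", [("basic", "logic_fault"),
               ("and", [("basic", "ch_A_fault"), ("basic", "ch_B_fault")])])
p = {"logic_fault": 1e-4, "ch_A_fault": 1e-2, "ch_B_fault": 1e-2}
print(evaluate(tree, p))
```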

  16. Value-centric design architecture based on analysis of space system characteristics

    Science.gov (United States)

    Xu, Q.; Hollingsworth, P.; Smith, K.

    2018-03-01

    Emerging design concepts such as miniaturisation, modularity, and standardisation, have contributed to the rapid development of small and inexpensive platforms, particularly cubesats. This has been stimulating an upcoming revolution in space design and development, leading satellites into the era of "smaller, faster, and cheaper". However, the current requirement-centric design philosophy, focused on bespoke monolithic systems, along with the associated development and production process does not inherently fit with the innovative modular, standardised, and mass-produced technologies. This paper presents a new categorisation, characterisation, and value-centric design architecture to address this need for both traditional and novel system designs. Based on the categorisation of system configurations, a characterisation of space systems, comprised of duplication, fractionation, and derivation, is proposed to capture the overall system configuration characteristics and promote potential hybrid designs. Complying with the definitions of the system characterisation, mathematical mapping relations between the system characterisation and the system properties are described to establish the mathematical foundation of the proposed value-centric design methodology. To illustrate the methodology, subsystem reliability relationships are therefore analysed to explore potential system configurations in the design space. The results of the applications of system characteristic analysis clearly show that the effects of different configuration characteristics on the system properties can be effectively analysed and evaluated, enabling the optimization of system configurations.
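
One mapping from configuration characteristics to system properties can be illustrated with reliability: duplication places identical modules in parallel, while fractionation distributes essential functions that must all work (a series arrangement). The subsystem reliabilities below are invented for illustration.

```python
# Reliability of alternative configurations, assuming independent
# failures. Subsystem reliability values are illustrative.
def series(rs):
    """All modules must work (e.g. fractionated essential functions)."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    """Any one module suffices (e.g. a duplicated subsystem)."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

r_sub = 0.90
monolithic = r_sub
duplicated = parallel([r_sub, r_sub])   # 1 - (1 - 0.9)^2 = 0.99
fractionated = series([0.95, 0.95])     # two essential modules in series
print(monolithic, duplicated, fractionated)
```

This is the basic trade the abstract's "subsystem reliability relationships" capture: duplication buys reliability at the cost of extra units, while fractionation can lower it unless the fragments are themselves more reliable.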

  17. x-y-recording in transmission electron microscopy. A versatile and inexpensive interface to personal computers with application to stereology.

    Science.gov (United States)

    Rickmann, M; Siklós, L; Joó, F; Wolff, J R

    1990-09-01

    An interface for IBM XT/AT-compatible computers is described which has been designed to read the actual specimen stage position of electron microscopes. The complete system consists of (i) optical incremental encoders attached to the x- and y-stage drivers of the microscope, (ii) two keypads for operator input, (iii) an interface card fitted to the bus of the personal computer, (iv) a standard configuration IBM XT (or compatible) personal computer optionally equipped with a (v) HP Graphic Language controllable colour plotter. The small size of the encoders and their connection to the stage drivers by simple ribbed belts allows an easy adaptation of the system to most electron microscopes. Operation of the interface card itself is supported by any high-level language available for personal computers. By the modular concept of these languages, the system can be customized to various applications, and no computer expertise is needed for actual operation. The present configuration offers an inexpensive attachment, which covers a wide range of applications from a simple notebook to high-resolution (200-nm) mapping of tissue. Since section coordinates can be processed in real-time, stereological estimations can be derived directly "on microscope". This is exemplified by an application in which particle numbers were determined by the disector method.
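
The disector estimate such a system can compute "on microscope" is a ratio of counted particle tops to sampled volume: Nv = ΣQ⁻ / (n · a · h). A minimal sketch with invented counts and frame dimensions:

```python
# Physical disector: numerical density Nv = sum(Q-) / (n * a * h),
# where Q- counts particles present in the reference section but
# absent from the look-up section, a is the counting frame area and
# h the disector height. All numbers are illustrative.
def disector_nv(q_minus_total, n_disectors, frame_area_um2, height_um):
    sampled_volume_um3 = n_disectors * frame_area_um2 * height_um
    return q_minus_total / sampled_volume_um3

nv = disector_nv(q_minus_total=120, n_disectors=50,
                 frame_area_um2=1000.0, height_um=10.0)
print(f"Nv = {nv:.2e} per um^3")
```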

  18. Initial Multidisciplinary Design and Analysis Framework

    Science.gov (United States)

    Ozoroski, L. P.; Geiselhart, K. A.; Padula, S. L.; Li, W.; Olson, E. D.; Campbell, R. L.; Shields, E. W.; Berton, J. J.; Gray, J. S.; Jones, S. M.; hide

    2010-01-01

    Within the Supersonics (SUP) Project of the Fundamental Aeronautics Program (FAP), an initial multidisciplinary design & analysis framework has been developed. A set of low- and intermediate-fidelity discipline design and analysis codes were integrated within a multidisciplinary design and analysis framework and demonstrated on two challenging test cases. The first test case demonstrates an initial capability to design for low boom and performance. The second test case demonstrates rapid assessment of a well-characterized design. The current system has been shown to greatly increase the design and analysis speed and capability, and many future areas for development were identified. This work has established a state-of-the-art capability for immediate use by supersonic concept designers and systems analysts at NASA, while also providing a strong base to build upon for future releases as more multifidelity capabilities are developed and integrated.

  19. Design and construction of an Offner spectrometer based on geometrical analysis of ring fields.

    Science.gov (United States)

    Kim, Seo Hyun; Kong, Hong Jin; Lee, Jong Ung; Lee, Jun Ho; Lee, Jai Hoon

    2014-08-01

    A method to obtain an aberration-corrected Offner spectrometer without ray obstruction is proposed. A new, more efficient spectrometer optics design is suggested in order to increase its spectral resolution. The derivation of a new ring equation to eliminate ray obstruction is based on geometrical analysis of the ring fields for various numerical apertures. The analytical design applying this equation was demonstrated using the optical design software Code V in order to manufacture a spectrometer working in wavelengths of 900-1700 nm. The simulation results show that the new concept offers an analytical initial design taking the least time of calculation. The simulated spectrometer exhibited a modulation transfer function over 80% at Nyquist frequency, root-mean-square spot diameters under 8.6 μm, and a spectral resolution of 3.2 nm. The final design and its realization of a high resolution Offner spectrometer was demonstrated based on the simulation result. The equation and analytical design procedure shown here can be applied to most Offner systems regardless of the wavelength range.

  20. Design Criteria for Suspended Pipelines Based on Structural Analysis

    Directory of Open Access Journals (Sweden)

    Mariana Simão

    2016-06-01

    Mathematical models have become the target of numerous attempts to obtain results that can be extrapolated to the study of hydraulic pressure infrastructures associated with different engineering demands. Simulation analyses based on finite element method (FEM) models are used to determine the vulnerability of hydraulic systems under different types of actions (e.g., natural events and pressure variation). As part of the numerical simulation of a suspended pipeline, the adequacy of the existing supports to sustain the pressure loads is verified. At a certain value of applied load, the pipeline is forced to sway sideways, possibly lifting up off its deadweight supports. Thus, identifying the frequency, consequences and predictability of accidental events is of extreme importance. This study focuses on the stability of vertical supports associated with extreme transient loads and on how a pipeline design can be improved using FEM simulations, in the design stage, to avoid accidents. The distribution of bending moments, axial forces, displacements and deformations along the pipeline and supports is studied for a set of important parametric variations. A good representation of the pipeline displacements is obtained using FEM.
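
Before a full FEM model, the span between supports is often checked with classical beam formulas; for a simply supported span under uniform load, the midspan moment is qL²/8 and the midspan deflection 5qL⁴/(384EI). The values below are invented for illustration, not the paper's pipeline.

```python
# First-pass check of a pipe span between two supports, idealized as a
# simply supported beam under uniform load (self-weight + contents).
# All values are illustrative assumptions.
q = 800.0     # distributed load, N/m
L = 12.0      # span between supports, m
E = 200e9     # Young's modulus of steel, Pa
I = 6.0e-5    # second moment of area of the pipe cross-section, m^4

M_max = q * L**2 / 8.0                    # midspan bending moment, N*m
delta_max = 5 * q * L**4 / (384 * E * I)  # midspan deflection, m
print(f"M_max = {M_max:.0f} N*m, deflection = {delta_max * 1000:.1f} mm")
```

Transient events (surge, support lift-off) change the boundary conditions, which is precisely where the static formulas stop being valid and FEM takes over.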

  1. Analysis and Design of Web-Based Database Application for Culinary Community

    Directory of Open Access Journals (Sweden)

    Choirul Huda

    2017-03-01

    Full Text Available This research is based on the rapid development of the culinary and information technology. The difficulties in communicating with the culinary expert and on recipe documentation make a proper support for media very important. Therefore, a web-based database application for the public is important to help the culinary community in communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the culinary community. This research used literature review, user interviews, and questionnaires. Moreover, the database system development life cycle was used as a guide for designing a database especially for conceptual database design, logical database design, and physical design database. Web-based application design used eight golden rules for user interface design. The result of this research is the availability of a web-based database application that can fulfill the needs of users in the culinary field related to communication and recipe management.

  2. Problem based Learning versus Design Thinking in Team based Project work

    DEFF Research Database (Denmark)

    Denise J. Stokholm, Marianne

    2014-01-01

    project based learning issues, which has caused a need to describe and compare the two models; in specific the understandings, approaches and organization of learning in project work. The PBL model viewing the process as 3 separate project stages including; problem analysis, problem solving and project......All educations at Aalborg University has since 1974 been rooted in Problem Based Learning (PBL). In 1999 a new education in Industrial design was set up, introducing Design Based Learning (DBL). Even though the two approaches have a lot in common they also hold different understandings of core...... report, with focus on problem solving through analysis. Design Based Learning viewing the process as series of integrated design spaces including; alignment, research, mission, vision, concept, product and process report, with focus on innovative ideation though integration. There is a need of renewing...

  3. Sensitivity analysis of dynamic characteristic of the fixture based on design variables

    International Nuclear Information System (INIS)

    Wang Dongsheng; Nong Shaoning; Zhang Sijian; Ren Wanfa

    2002-01-01

    The sensitivity of structural natural frequencies to structural design parameters is investigated. A typical fixture for vibration testing is designed. Using the I-DEAS finite element programs, the sensitivity of its natural frequencies to design parameters is analyzed by the matrix perturbation method. The results show that sensitivity analysis is a fast and effective dynamic re-analysis method for the dynamic design and parameter modification of complex structures such as fixtures

  4. VR-based training and assessment in ultrasound-guided regional anesthesia: from error analysis to system design.

    LENUS (Irish Health Repository)

    2011-01-01

    If VR-based medical training and assessment is to improve patient care and safety (i.e. a genuine health gain), it has to be based on clinically relevant measurement of performance. Metrics on errors are particularly useful for capturing and correcting undesired behaviors before they occur in the operating room. However, translating clinically relevant metrics and errors into meaningful system design is a challenging process. This paper discusses how an existing task and error analysis was translated into the system design of a VR-based training and assessment environment for Ultrasound Guided Regional Anesthesia (UGRA).

  5. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis.

    Science.gov (United States)

    Cook, David A; Hamstra, Stanley J; Brydges, Ryan; Zendejas, Benjamin; Szostek, Jason H; Wang, Amy T; Erwin, Patricia J; Hatala, Rose

    2013-01-01

    Although technology-enhanced simulation is increasingly used in health professions education, features of effective simulation-based instructional design remain uncertain. Objective: evaluate the effectiveness of instructional design features through a systematic review of studies comparing different simulation-based interventions. We systematically searched MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous review bibliographies through May 2011. We included original research studies that compared one simulation intervention with another and involved health professions learners. Working in duplicate, we evaluated study quality and abstracted information on learners, outcomes, and instructional design features. We pooled results using random-effects meta-analysis. From a pool of 10,903 articles we identified 289 eligible studies enrolling 18,971 trainees, including 208 randomized trials. Inconsistency was usually large (I² > 50%). For skills outcomes, pooled effect sizes (positive numbers favoring the instructional design feature) were 0.68 for range of difficulty (20 studies; p ...) ... simulation-based education.
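The pooling step named above (random-effects meta-analysis, with I² reported for inconsistency) can be sketched as follows. The abstract does not say which random-effects estimator was used, so the DerSimonian-Laird estimator here is an illustrative assumption, not the authors' method:

```python
import math

def pool_random_effects(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes.
    Returns the pooled effect, its standard error, and I² (in percent)."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    sw = sum(w)
    theta_fe = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - theta_fe) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    theta = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # inconsistency, %
    return theta, se, i2
```

With hypothetical per-study effects and variances, `pool_random_effects([0.5, 0.7, 0.9], [0.04, 0.05, 0.06])` pools three studies into one effect estimate; when Q does not exceed its degrees of freedom, the between-study variance collapses to zero and the result coincides with fixed-effect pooling.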

  6. Knowledge Representation and Inference for Analysis and Design of Database and Tabular Rule-Based Systems

    Directory of Open Access Journals (Sweden)

    Antoni Ligeza

    2001-01-01

    Rule-based systems constitute a powerful tool for the specification of knowledge in the design and implementation of knowledge-based systems. They also provide a universal programming paradigm for domains such as intelligent control, decision support, situation classification and operational knowledge encoding. In order to assure safe and reliable performance, such systems should satisfy certain formal requirements, including completeness and consistency. This paper addresses the issue of analysis and verification of selected properties of a class of such systems in a systematic way. A uniform, tabular scheme of single-level rule-based systems is considered. Such systems can be applied as a generalized form of databases for the specification of data patterns (unconditional knowledge), or can be used for defining attributive decision tables (conditional knowledge in the form of rules). They can also serve as lower-level components of hierarchical multi-level control and decision-support knowledge-based systems. An algebraic knowledge representation paradigm using an extended tabular representation, similar to relational database tables, is presented, and the algebraic bases for system analysis, verification and design support are outlined.

  7. Rationale for reduced tornado design bases

    International Nuclear Information System (INIS)

    Rutherford, P.D.; Ho, H.W.; Hartung, J.A.; Kastenberg, W.E.

    1985-01-01

    This paper provides a rationale for relaxing the present NRC tornado design requirements, which are based on a design basis tornado (DBT) whose frequency of exceedance is 10⁻⁷ per year. It is proposed that a reduced DBT frequency of 10⁻⁵ to 10⁻⁶ per year is acceptable. This change in the tornado design bases for LMFBRs (and possibly all types of nuclear plants) is justified based on (1) existing NRC regulations and guidelines, (2) probabilistic arguments, (3) consistency with NRC trial safety goals, and (4) cost-benefit analysis.
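The probabilistic argument can be illustrated with a short sketch: under a Poisson-process assumption for exceedance events (an illustrative model, not one stated in the paper), the chance of at least one exceedance of the design basis tornado over a plant lifetime follows directly from the annual exceedance frequency:

```python
import math

def prob_exceedance(freq_per_year: float, years: float) -> float:
    """Probability of at least one exceedance in `years`, assuming
    exceedances follow a Poisson process with rate `freq_per_year`."""
    return 1.0 - math.exp(-freq_per_year * years)

# Illustrative 40-year plant life: the 1e-7/yr DBT vs. the proposed
# 1e-5 to 1e-6/yr range discussed in the paper.
for f in (1e-7, 1e-6, 1e-5):
    print(f"{f:.0e}/yr -> {prob_exceedance(f, 40):.2e} over 40 years")
```

For small rates the lifetime probability is essentially frequency times lifetime, which is why the argument can be framed directly in terms of annual exceedance frequencies.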

  8. Parametric design and off-design analysis of organic Rankine cycle (ORC) system

    International Nuclear Information System (INIS)

    Song, Jian; Gu, Chun-wei; Ren, Xiaodong

    2016-01-01

    Highlights: • A one-dimensional analysis method for ORC systems is proposed. • The system performance under both design and off-design conditions is analyzed. • The working fluid selection is based on both design and off-design performance. • The system parameter determination is based on both design and off-design performance. - Abstract: A one-dimensional analysis method is proposed for the organic Rankine cycle (ORC) system in this paper. The method contains two main parts: a one-dimensional aerodynamic analysis model of the radial-inflow turbine and a performance prediction model of the heat exchanger. Based on the present method, an ORC system for industrial waste heat recovery is designed and analyzed. The net power output of the ORC system is 534 kW, and the thermal efficiency reaches 13.5%. System performance under off-design conditions is also simulated. The results show that the inlet temperatures of the heat source and the cooling water have a significant influence on the system. With increasing heat source inlet temperature, the mass flow rate of the working fluid, the net power output and the heat utilization ratio of the ORC system increase, while the system thermal efficiency decreases with increasing cooling water inlet temperature. In order to maintain the condensation pressure at a moderate value, the heat source inlet temperature considered in this analysis should be kept within the range of 443.15–468.15 K, while the optimal temperature range of the cooling water is between 283.15 K and 303.15 K.
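As a quick consistency check on the reported figures (a back-of-the-envelope sketch, not the paper's one-dimensional model):

```python
def thermal_efficiency(w_net_kw: float, q_in_kw: float) -> float:
    """First-law thermal efficiency: net work out over heat in."""
    return w_net_kw / q_in_kw

# Reported design point: 534 kW net output at 13.5% efficiency implies
# roughly 534 / 0.135 ≈ 3956 kW of heat drawn from the waste-heat source.
q_in_kw = 534.0 / 0.135
print(f"implied heat input: {q_in_kw:.0f} kW")
```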

  9. Design Concepts of Polycarbonate-Based Intervertebral Lumbar Cages: Finite Element Analysis and Compression Testing

    Directory of Open Access Journals (Sweden)

    J. Obedt Figueroa-Cavazos

    2016-01-01

    This work explores the viability of 3D printed intervertebral lumbar cages based on biocompatible polycarbonate (PC-ISO®) material. Several design concepts are proposed for the generation of patient-specific intervertebral lumbar cages. The 3D printed material achieved a compressive yield strength of 55 MPa under a specific combination of manufacturing parameters. The literature recommends a reference load of 4,000 N for the design of intervertebral lumbar cages. Under compression testing conditions, the proposed design concepts withstand between 7,500 and 10,000 N of load before yielding. Although some stress concentration regions were found during analysis, the overall viability of the proposed design concepts was validated.
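The margin over the recommended reference load can be expressed as a simple ratio (an illustrative sketch; the paper does not define a formal safety factor):

```python
def safety_factor(yield_load_n: float, reference_load_n: float = 4000.0) -> float:
    """Ratio of measured yield load to the 4,000 N reference design load."""
    return yield_load_n / reference_load_n

# The reported 7,500-10,000 N yield range implies margins of roughly 1.9x-2.5x
# over the literature's 4,000 N reference load.
low, high = safety_factor(7500.0), safety_factor(10000.0)
```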

  10. Design for rock grouting based on analysis of grout penetration. Verification using Aespoe HRL data and parameter analysis

    International Nuclear Information System (INIS)

    Kobayashi, Shinji; Stille, Haakan

    2007-01-01

    Grouting as a method to reduce the inflow of water into underground facilities will be important in both the construction and operation of the deep repository. SKB has been studying grouting design based on characterization of fractured rock and prediction of grout spread. However, as in other Scandinavian tunnels, stop criteria have been set empirically, so that grouting is completed when the grout flow is less than a certain value at maximum pressure or the grout take is above a certain value. Since empirically based stop criteria are determined without a theoretical basis and are not related to grout penetration, the grouting result may be inadequate or uneconomical. In order to permit the choice of adequate and cost-effective grouting methods, stop criteria can be designed based on a theoretical analysis of grout penetration. The relationship between grout penetration and grouting time has been studied at the Royal Institute of Technology and Chalmers University of Technology; based on these studies, the theory has been further developed in order to apply it to real grouting work. Another aspect is using the developed method for parameter analysis. The purpose of parameter analysis is to evaluate the influence of different grouting parameters on the result. Since a grouting strategy is composed of many different components, the selection of a grouting method is complex. Even if the theoretically most suitable grouting method is selected, it is difficult to carry out grouting exactly as planned, because grouting parameters such as grout properties can easily vary during the grouting operation. In addition, knowing the parameters precisely beforehand is impossible because of the uncertainties inherent in the rock mass. Therefore, it is important to assess the effects of variations in grouting parameters. Parameter analysis can serve as a guide in choosing an effective grouting method.
    The objectives of this report are to: further develop the theory concerning ...
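The penetration theory referred to (developed at KTH and Chalmers) is commonly expressed, for a Bingham-fluid grout in a planar fracture, through the Gustafson-Stille maximum penetration length; a minimal sketch with illustrative parameter values, not the report's own calculation:

```python
def max_penetration(dp_pa: float, aperture_m: float, yield_stress_pa: float) -> float:
    """Maximum penetration length of a Bingham-fluid grout in a planar
    fracture: I_max = dp * b / (2 * tau0), where dp is the grouting
    overpressure, b the fracture aperture, tau0 the grout yield stress."""
    return dp_pa * aperture_m / (2.0 * yield_stress_pa)

# Illustrative values: 2 MPa overpressure, 100 micron aperture, 5 Pa yield
# stress give a maximum penetration of about 20 m.
i_max = max_penetration(2e6, 100e-6, 5.0)
```

Because penetration approaches this maximum only asymptotically in time, stop criteria can be phrased as grouting until a chosen fraction of I_max is reached, rather than until an empirical flow or take threshold.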

  11. Model-Based Requirements Management in Gear Systems Design Based On Graph-Based Design Languages

    Directory of Open Access Journals (Sweden)

    Kevin Holder

    2017-10-01

    For several decades, a widespread consensus concerning the enormous importance of an in-depth clarification of the specifications of a product has been observed. A weak clarification of specifications is repeatedly listed as a main cause for the failure of product development projects. Requirements, which can be defined as the purpose, goals, constraints, and criteria associated with a product development project, play a central role in the clarification of specifications. The collection of activities which ensure that requirements are identified, documented, maintained, communicated, and traced throughout the life cycle of a system, product, or service can be referred to as "requirements engineering". These activities can be supported by a collection and combination of strategies, methods, and tools which are appropriate for the clarification of specifications. Numerous publications describe the strategy and the components of requirements management, and recent research investigates its industrial application. Simultaneously, promising developments of graph-based design languages for a holistic digital representation of the product life cycle are presented. Current developments realize graph-based languages by the diagrams of the Unified Modelling Language (UML), and allow the automatic generation and evaluation of multiple product variants. The research presented in this paper seeks to combine the advantages of a conscious requirements management process and graph-based design languages. Consequently, the main objective of this paper is the investigation of a model-based integration of requirements in a product development process by means of graph-based design languages. The research method is based on an in-depth analysis of an exemplary industrial product development, a gear system for so-called "Electrical Multiple Units" (EMU). Important requirements were abstracted from a gear system ...

  12. Designing discovery learning environments: process analysis and implications for designing an information system

    NARCIS (Netherlands)

    Pieters, Julius Marie; Limbach, R.; de Jong, Anthonius J.M.

    2004-01-01

    A systematic analysis of the design process of authors of (simulation-based) discovery learning environments was carried out. The analysis aimed at identifying the design activities of authors and categorising the knowledge gaps that they experience. First, five existing studies were systematically ...

  13. Design Analysis Method for Multidisciplinary Complex Product using SysML

    Directory of Open Access Journals (Sweden)

    Liu Jihong

    2017-01-01

    In the design of multidisciplinary complex products, model-based systems engineering methods are widely used. However, these methodologies contain only a modeling order and simple analysis steps, and lack integrated design analysis methods supporting the whole process. In order to solve this problem, a conceptual design analysis method integrating modern design methods is proposed. First, based on requirement analysis with a quantization matrix, the user's needs are quantitatively evaluated and translated into system requirements. Then, by function decomposition against a function knowledge base, the total function is semi-automatically decomposed into predefined atomic functions. Each function is matched to a predefined structure through the behaviour layer, using function-structure mapping based on interface matching. Finally, based on the design structure matrix (DSM), the structure reorganization is completed. The analysis process is implemented with SysML and illustrated through an aircraft air conditioning system for validation.

  14. Design by analysis of composite pressure equipment

    International Nuclear Information System (INIS)

    Durand, S.; Mallard, H.

    2004-01-01

    Design by analysis has been particularly emphasized by the European pressure equipment directive. Advanced mechanical analyses such as the finite element method are used instead of classical design by formulas or graphs, so that structural behaviour can be understood by the designer. Design by analysis of metallic pressure equipment is widely used; material behaviour and limit analysis are based on sophisticated approaches (elasto-plastic analysis, etc.). Design by analysis of composite pressure equipment is not systematically used for industrial products. The difficulty comes from the amount of information to handle. The laws of mechanics are the same for composite materials as for steel. The authors want to show that, in design by analysis, the composite material approach is only more complete than the metallic approach: mechanics is more general but not more complicated. A multi-material approach is a natural evolution of design by analysis of composite equipment. The presentation is illustrated by several industrial cases: composite vessels (analogy with metallic calculations); composite pipes and fittings; welding and bonding of thermoplastic equipment. (authors)

  15. Software design specification and analysis (NuFDS) approach for safety-critical software based on a programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). During software development, the design phase performs the important role of connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications: database, software architecture, system behavior, and PLC hardware configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for formal design analysis in the NuFDS approach. For tool support, we are developing the NuSDS tool, based on the NuFDS approach, especially for software design specification in the nuclear field

  16. Frequency domain analysis and design of nonlinear systems based on Volterra series expansion a parametric characteristic approach

    CERN Document Server

    Jing, Xingjian

    2015-01-01

    This book is a systematic summary of some new advances in the area of nonlinear analysis and design in the frequency domain, focusing on the application-oriented theory and methods based on the GFRF concept, developed mainly by the author over the past 8 years. The main results are formulated uniformly with a parametric characteristic approach, which provides a convenient and novel insight into nonlinear influence on system output response in terms of characteristic parameters and thus facilitates nonlinear analysis and design in the frequency domain. The book starts with a brief introduction to the background of nonlinear analysis in the frequency domain, followed by recursive algorithms for computation of GFRFs for different parametric models, and nonlinear output frequency properties. Thereafter the parametric characteristic analysis method is introduced, which leads to a new understanding and formulation of the GFRFs, the nonlinear characteristic output spectrum (nCOS) and the nCOS-based analysis ...

  17. Making Design Decisions Visible: Applying the Case-Based Method in Designing Online Instruction

    Directory of Open Access Journals (Sweden)

    Heng Luo

    2011-01-01

    The instructional intervention in this design case is a self-directed online tutorial that applies the case-based method to teach educators how to design and conduct entrepreneurship programs for elementary school students. In this article, the authors describe the major decisions made in each phase of the design and development process, explicate the rationales behind them, and demonstrate their effect on the production of the tutorial. Based on this analysis, guidelines for designing case-based online instruction are summarized for the design case.

  18. 1972 preliminary safety analysis report based on a conceptual design of a proposed repository in Kansas

    International Nuclear Information System (INIS)

    Blomeke, J.O.

    1977-08-01

    This preliminary safety analysis report is based on a proposed Federal Repository at Lyons, Kansas, for receiving, handling, and depositing radioactive solid wastes in bedded salt during the remainder of this century. The safety analysis applies to a hypothetical site in central Kansas identical to the Lyons site, except that it is free of nearby salt solution-mining operations and bore holes that cannot be plugged to Repository specifications. This PSAR contains much information that also appears in the conceptual design report. Much of the geological-hydrological information was gathered in the Lyons area. This report is organized in 16 sections: considerations leading to the proposed Repository, design requirements and criteria, a description of the Lyons site and its environs, land improvements, support facilities, utilities, different impacts of Repository operations, safety analysis, design confirmation program, operational management, requirements for eventually decommissioning the facility, design criteria for protection from severe natural events, and the proposed program of experimental investigations

  19. 1972 preliminary safety analysis report based on a conceptual design of a proposed repository in Kansas

    Energy Technology Data Exchange (ETDEWEB)

    Blomeke, J.O.

    1977-08-01

    This preliminary safety analysis report is based on a proposed Federal Repository at Lyons, Kansas, for receiving, handling, and depositing radioactive solid wastes in bedded salt during the remainder of this century. The safety analysis applies to a hypothetical site in central Kansas identical to the Lyons site, except that it is free of nearby salt solution-mining operations and bore holes that cannot be plugged to Repository specifications. This PSAR contains much information that also appears in the conceptual design report. Much of the geological-hydrological information was gathered in the Lyons area. This report is organized in 16 sections: considerations leading to the proposed Repository, design requirements and criteria, a description of the Lyons site and its environs, land improvements, support facilities, utilities, different impacts of Repository operations, safety analysis, design confirmation program, operational management, requirements for eventually decommissioning the facility, design criteria for protection from severe natural events, and the proposed program of experimental investigations. (DLC)

  20. Mast cells and atopic dermatitis. Stereological quantification of mast cells in atopic dermatitis and normal human skin

    DEFF Research Database (Denmark)

    Damsgaard, T E; Olesen, A B; Sørensen, Flemming Brandt

    1997-01-01

    Stereological quantification of mast cell numbers was applied to sections of punch biopsies from lesional and nonlesional skin of atopic dermatitis patients and skin of healthy volunteers. We also investigated whether the method of staining and/or the fixative influenced the results of the determination of the mast cell profile numbers. The punch biopsies were taken from the same four locations in both atopic dermatitis patients and normal individuals. The locations were the scalp, neck and flexure of the elbow (lesional skin), and nates (nonlesional skin). Clinical scoring was carried out at the site of each biopsy. The study yielded the following results: (1) in atopic dermatitis lesional skin an increased number of mast cell profiles was found as compared with nonlesional skin; (2) comparing atopic dermatitis skin with normal skin, a significantly increased number of mast cell profiles per millimetre squared was found ...

  1. Mast cells and atopic dermatitis. Stereological quantification of mast cells in atopic dermatitis and normal human skin

    DEFF Research Database (Denmark)

    Damsgaard, T E; Olesen, A B; Sørensen, Flemming Brandt

    1997-01-01

    Stereological quantification of mast cell numbers was applied to sections of punch biopsies from lesional and nonlesional skin of atopic dermatitis patients and skin of healthy volunteers. We also investigated whether the method of staining and/or the fixative influenced the results of the determination of the mast cell profile numbers. The punch biopsies were taken from the same four locations in both atopic dermatitis patients and normal individuals. The locations were the scalp, neck and flexure of the elbow (lesional skin), and nates (nonlesional skin). Clinical scoring was carried out at the site of each biopsy. After fixation and plastic embedding, the biopsies were cut into 2 µm serial sections. Ten sections, 30 µm apart, from each biopsy were examined and stained alternately with either toluidine blue or Giemsa stain, and mast cell profile numbers were determined. The study ...
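The quantity compared across skin sites, mast cell profiles per square millimetre, is a simple ratio of counted profiles to sampled section area; a minimal sketch with illustrative numbers, not the study's data:

```python
def profile_density(counts, areas_mm2):
    """Mast cell profiles per mm²: total counted profiles over total
    sampled section area, pooled across the sections of one biopsy."""
    return sum(counts) / sum(areas_mm2)

# Illustrative example: ten sections from one biopsy, each with a
# sampled field of 0.5 mm² (hypothetical counts and areas).
counts = [12, 9, 15, 11, 10, 14, 13, 8, 12, 11]
areas = [0.5] * 10
density = profile_density(counts, areas)  # profiles per mm²
```

Pooling counts and areas before dividing (a ratio estimator) weights each section by its sampled area, rather than averaging ten per-section densities.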

  2. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    Energy Technology Data Exchange (ETDEWEB)

    Man, Jun [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Li, Weixuan [Pacific Northwest National Laboratory, Richland, Washington, USA]; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou, China]; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside, California, USA]

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
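One of the information metrics named above, the Shannon entropy difference (SD), can be sketched for Gaussian-approximated ensembles: the entropy reduction from prior to posterior is half the log-ratio of the covariance determinants. This is an illustrative implementation of the metric, not the authors' code:

```python
import numpy as np

def entropy_difference(prior_ens, post_ens):
    """Shannon entropy difference between Gaussian-approximated prior and
    posterior parameter ensembles: SD = 0.5*(ln det C_prior - ln det C_post).
    Rows are ensemble members, columns are parameters."""
    c_prior = np.atleast_2d(np.cov(prior_ens, rowvar=False))
    c_post = np.atleast_2d(np.cov(post_ens, rowvar=False))
    _, logdet_prior = np.linalg.slogdet(c_prior)
    _, logdet_post = np.linalg.slogdet(c_post)
    return 0.5 * (logdet_prior - logdet_post)

# A sampling design that halves the parameter spread after the EnKF update
# gains 0.5*ln(4) ≈ 0.69 nats of information (one-parameter toy ensembles).
prior = np.array([[-1.0], [0.0], [1.0]])
post = np.array([[-0.5], [0.0], [0.5]])
sd = entropy_difference(prior, post)
```

In a sequential optimal design loop, candidate measurement locations would be ranked by such a metric and the most informative one added at each step.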

  3. Study on the Theoretical Foundation of Business English Curriculum Design Based on ESP and Needs Analysis

    Science.gov (United States)

    Zhu, Wenzhong; Liu, Dan

    2014-01-01

    Based on a review of the literature on ESP and needs analysis, this paper is intended to offer some theoretical support and inspiration for BE instructors to develop BE curricula for business contexts. It discusses how the theory of needs analysis can be used in Business English curriculum design, and proposes some principles of BE curriculum…

  4. DECIPHERING THE FINEST IMPRINT OF GLACIAL EROSION: OBJECTIVE ANALYSIS OF STRIAE PATTERNS ON BEDROCK

    Directory of Open Access Journals (Sweden)

    Piet Stroeven

    2011-05-01

    The aim of this study is to compare the efficiency of different mathematical and statistical geometrical methods applied to characterise the orientation distribution of striae on bedrock, for deciphering the finest imprint of glacial erosion. The methods include automatic image analysis techniques based on the Fast Fourier Transform (FFT), and experimental investigation by means of Saltikov's directed secants analysis (rose of intersection densities), applied to digital and analogue images of the striae pattern, respectively. In addition, the experimental data were compared with modelling results based on Underwood's concept of linear systems in a plane. The experimental and modelling approaches, in the framework of stereology, yield consistent results. These results reveal that stereological methods allow a reliable and efficient delineation of different families of glacial striae from a complex record imprinted in bedrock.
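A rose of orientation densities can be approximated by binning measured striae directions as axial data with period 180° (a simplified sketch of the idea, not Saltikov's full directed-secants procedure, which counts intersections of test lines with the striae):

```python
from collections import Counter

def rose_of_orientations(angles_deg, bin_width=10):
    """Bin striae directions (axial data, period 180°) into orientation
    classes: a discrete rose of orientation densities."""
    bins = Counter()
    for a in angles_deg:
        bins[int((a % 180) // bin_width) * bin_width] += 1
    return dict(sorted(bins.items()))

# Two hypothetical striae families, around 0-10° and 90-100°:
rose = rose_of_orientations([3, 7, 185, 92, 95, 99, 178])  # -> {0: 3, 90: 3, 170: 1}
```

Distinct peaks in the rose then correspond to distinct families of striae, i.e. separate ice-flow directions recorded in the bedrock.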

  5. Design and Polarization Characteristics Analysis of Dihedral Based on Salisbury Screen

    Directory of Open Access Journals (Sweden)

    Zhang Ran

    2016-12-01

    Salisbury screens have a number of unique electromagnetic scattering characteristics. When appropriately designed, a Salisbury screen can transform a target's radar signature. Based on the electromagnetic scattering characteristics of the Salisbury screen, we designed a novel dihedral corner, and theoretically analyzed and simulated its electromagnetic scattering characteristics in this study. The results reveal the monostatic radar cross-section curves of the 90° and 60° Salisbury-screen dihedrals and metal dihedrals, respectively. Taking an orthogonal dihedral corner as an example, we obtained the polarization scattering matrices for different incidence angles. In addition, we investigated the influence of illumination frequency, target attitude, and other key factors on the polarization characteristics of the Salisbury-screen dihedral corner. The theoretical and simulation analysis results show that, compared with the conventional metal dihedral corner, the Salisbury-screen dihedral corner significantly changes the scattering characteristics and will have potential application in electronic warfare.

  6. Norm based design of fault detectors

    DEFF Research Database (Denmark)

    Rank, Mike Lind; Niemann, Hans Henrik

    1999-01-01

    The design of fault detectors for fault detection and isolation (FDI) in dynamic systems is considered in this paper from a norm-based point of view. An analysis of norm-based threshold selection is given, based on different formulations of FDI problems. Both the nominal FDI problem as well ...

  7. Role of Theories in the Design of Web-Based Person-Centered Support: A Critical Analysis

    Directory of Open Access Journals (Sweden)

    Agneta Ranerup

    2014-01-01

    Objective. The aim of this study was to provide a critical understanding of the role of theories, and their compatibility with a person-centered approach, in the design and evaluation of web-based support for the management of chronic illness. Methods. Exploration of web-based support research projects focusing on four cases: (1) preschool children aged 4–6 with bladder dysfunction and urogenital malformation; (2) young adults aged 16–25 living with mental illness; (3) women with type 1 diabetes who are pregnant or in early motherhood; and (4) women who have undergone surgery for breast cancer. Data comprised interviews with research leaders and documented plans. Analysis was performed by means of a cross-case methodology. Results. The theories used concerned design, learning, health and well-being, or transition. All web support products had been developed using participatory design (PD). Fundamental to the technology design and evaluation of outcomes were theories focusing on learning and on health and well-being. All theories were compatible with a person-centered approach; a notable exception, however, was the relatively collective character of PD and Communities of Practice. Conclusion. Our results illustrate multifaceted ways for theories to be used in the design and evaluation of web-based support.

  8. Stereology of the thyroid gland in Indo-Pacific bottlenose dolphin (Tursiops aduncus) in comparison with human (Homo sapiens): quantitative and functional implications.

    Directory of Open Access Journals (Sweden)

    Brian Chin Wing Kot

    The mammalian thyroid gland maintains basal metabolism in tissues for optimal function. Determining thyroid volume is important in assessing growth and involution, and volume estimation is also important in stereological studies. Direct measurements of colloid volume and the nuclear-to-cytoplasmic (N/C) ratio of the follicular cells may provide important information about thyroid gland function, such as hormone storage and secretion, which helps in understanding changes at the morphological and functional levels. The present study determined the colloid volume, using a simple stereological principle, and the N/C ratio in 4 Indo-Pacific bottlenose dolphin and 2 human thyroid glands. In both dolphin and human thyroid glands, the size of the follicles tended to be quite variable. The distribution of large and small follicles within the thyroid gland was also found to be random in both the dolphin and the human thyroid gland; however, follicle size appeared to decrease as a function of increasing age in the dolphin thyroid gland. The mean colloid volume of the dolphin and human thyroid glands was 1.22×10⁵ µm³ and 7.02×10⁵ µm³, respectively, a significant difference. The mean N/C ratio of the dolphin and human thyroid follicular epithelia was 0.50 and 0.64, respectively, also a significant difference. This information contributes to understanding dolphin thyroid physiology and its structural adaptations to the physical demands of the aquatic environment, and aids ultrasonography and corrective therapy in live subjects.
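The nuclear-to-cytoplasmic ratio reported above can be computed per cell profile as the ratio of nuclear to cytoplasmic area (a minimal sketch with hypothetical areas; the study's exact measurement protocol is not reproduced here):

```python
def nc_ratio(nuclear_area_um2: float, cell_area_um2: float) -> float:
    """Nuclear-to-cytoplasmic ratio of a follicular cell profile:
    nuclear area divided by cytoplasmic (cell minus nuclear) area."""
    return nuclear_area_um2 / (cell_area_um2 - nuclear_area_um2)

# A profile whose nucleus occupies one third of the cell area gives
# N/C = 0.5, the order of the mean dolphin value reported above.
ratio = nc_ratio(20.0, 60.0)  # -> 0.5
```

A higher N/C ratio (as in the human glands here) is conventionally read as relatively more nuclear material per unit cytoplasm in the follicular epithelium.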

  9. Towards for Analyzing Alternatives of Interaction Design Based on Verbal Decision Analysis of User Experience

    Directory of Open Access Journals (Sweden)

    Marília Soares Mendes

    2010-04-01

    Full Text Available In domains (such as digital TV, smart home, and tangible interfaces) that represent a new paradigm of interactivity, deciding on the most appropriate interaction design solution is a challenge. HCI researchers have promoted in their work the validation of alternative design solutions with users before producing the final solution. User experience with technology is a subject that has also gained ground in these works as a means of identifying the appropriate solution(s). Following this concept, a study was carried out with the objective of finding a better interaction solution for a mobile TV application. Three executable mobile TV prototypes were built. A Verbal Decision Analysis model was applied to investigate the favorite characteristics of each prototype, based on the users' experience and their intentions of use. This model guided a qualitative analysis that informed the design of a new prototype.

  10. Advanced overlay analysis through design based metrology

    Science.gov (United States)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor in semiconductor manufacturing. However, the overlay error determined by conventional measurement of an overlay mark, based on IBO or DBO, often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern; pattern distortion caused by etching or CMP can also be a source. In 2014, we demonstrated that measuring overlay in the cell area with a DBM (Design Based Metrology) tool yields more accurate overlay values than the conventional method using an overlay mark. We verified reproducibility by measuring repeatable patterns in the cell area, and demonstrated reliability by comparison with CD-SEM data. Until now we have focused on the mismatch between the overlay mark and the cell area; here we also consider cell areas with different pattern density and etch loading, since cells in diverse patterning environments show different overlay values. In this paper, the overlay error was investigated from cell edge to center. For this experiment, we examined several critical layers in DRAM using an improved (better resolution and speed) DBM tool, the NGR3520.
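Overlay results of this kind are typically summarized per axis with the |mean| + 3σ figure of merit over the measured offsets (measured position minus design position). A minimal sketch, with invented offsets rather than DBM data from this paper:

```python
import statistics

# Hedged sketch: summarizing in-cell overlay errors with the |mean| + 3*sigma
# figure of merit commonly used for overlay. The offsets are invented for
# illustration, not measurements from the NGR3520.

def overlay_summary(offsets_nm):
    mean = statistics.fmean(offsets_nm)
    sigma = statistics.stdev(offsets_nm)
    return {"mean": mean, "3sigma": 3 * sigma, "m3s": abs(mean) + 3 * sigma}

dx = [1.2, -0.8, 0.5, 1.9, -0.3, 0.7]   # x-overlay errors in nm, cell area
summary = overlay_summary(dx)
```

Comparing such summaries between the mark location and several cell-area sampling sites (edge versus center) is one way to quantify the mismatch the abstract describes.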

  11. A Framework for IT-based Design Tools

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    The thesis presents a new approach to developing design tools that can be integrated, by presenting a framework consisting of a set of guidelines for design tools, an integration and communication scheme, and a set of design tool schemes. This framework has been based on analysis of requirements to integrated design environments, and on analysis of engineering design and design problem solving methods. The developed framework has been tested by applying it to the development of prototype design tools for realistic design scenarios.

  12. The spatial distribution of otosclerosis: a quantitative study using design-based stereology

    DEFF Research Database (Denmark)

    Bloch, Sune Land; Sørensen, Mads Sølvsten

    2010-01-01

    This study documents that otosclerotic bone remodeling is distributed centripetally around the inner ear space whereas normal bone remodeling is distributed centrifugally. We suggest that this inverse relation reflects the unique osteo-dynamic setting of the otic capsule: since perilabyrinthine...


  14. Design and analysis of sustainable paper bicycle

    Science.gov (United States)

    Roni Sahroni, Taufik; Nasution, Januar

    2017-12-01

    This paper presents the design of a sustainable paper bicycle and describes each stage of its production. The objective of this project is to design a sustainable paper bicycle for children under five years old. The design analysis emphasizes a screening method to ensure the design fulfils safety requirements, and an evaluation concept is presented for selecting the highest-rated design. A project methodology is proposed for developing the sustainable paper bicycle. Design analysis of the pedal, front and rear wheels, seat, and handle was carried out using AutoCAD software. Design optimization was performed to meet the safety factors by modifying the material size and dimensions. Based on the design analysis results, the optimized design met the required safety factor. As a result, a sustainable paper bicycle was proposed for children under five years old.

  15. Liaison based assembly design

    Energy Technology Data Exchange (ETDEWEB)

    Ames, A.; Kholwadwala, D.; Wilson, R.H.

    1996-12-01

    Liaison Based Assembly Design extends the current information infrastructure to support design in terms of kinematic relationships between parts, or liaisons. These liaisons capture information regarding contact, degrees-of-freedom constraints and containment relationships between parts in an assembly. The project involved defining a useful collection of liaison representations, investigating their properties, and providing for maximum use of the data in downstream applications. We tested our ideas by implementing a prototype system involving extensions to Pro/Engineer and the Archimedes assembly planner. With an expanded product model, the design system is more able to capture design intent. When a product update is attempted, increased knowledge availability improves our ability to understand the effect of design changes. Manufacturing and analysis disciplines benefit from having liaison information available, so less time is wasted arguing over incomplete design specifications and our enterprise can be more completely integrated.

  16. INFORMED DESIGN DECISION-MAKING: FROM DIGITAL ANALYSIS TO URBAN DESIGN

    Directory of Open Access Journals (Sweden)

    Camilla Pezzica

    2017-11-01

    Full Text Available This study describes a new approach to explore the design of public open spaces based on a multidimensional analysis useful to inform decision-making and foster the development of evidence-based architectural solutions. It presents an overview of the most relevant design variables and their constraints, providing, in this way, valuable information for the elaboration of a more sustainable urban design, considerate of the local socio-cultural values. This research aims at providing holistic guidance for the development of better design proposals in contemporary urban environments. More specifically, it seeks to synchronously characterize urban spaces at a multi-scale and multidimensional level, both quantitatively and qualitatively, by collecting contributions from Space Syntax Theory, Public Life Studies, Building Science and Environmental/Comfort Analysis in public open spaces. Many advanced digital tools are used for data management purposes and to generate and test iteratively different design proposals. The proposed methodology is based on a range of tests and analyses performed in the process of developing a new experimental project for Largo da Graça, an urban square located in Lisbon’s historic centre, which allowed the testing of different design solutions. This experiment generated a digital workflow for the design of the urban square, in which are registered all the steps undertaken to solve the many design problems identified by considering the efficiency targets (centrality, connectivity, enclosure, thermal comfort, security, social equity and interaction. The process comprises a sequence of comparative design reviews and records the choices made when dealing with latent information underlying changing conditions in the use of public space and the programmatic malleability of the Portuguese plaza. The description of the adopted design strategy and the examples extracted from the workflow are used to illustrate the practical

  17. Stereological quantification of tumor volume, mean nuclear volume and total number of melanoma cells correlated with morbidity and mortality

    DEFF Research Database (Denmark)

    Bønnelykke-Behrndtz, Marie Louise; Sørensen, Flemming Brandt; Damsgaard, Tine Engberg

    2008-01-01

    Stereological quantification of tumor volume, total number of tumor cells and mean nuclear volume provides unbiased data, regardless of the three-dimensional shape of the melanocytic lesion. The aim of the present study was to investigate whether these variables are reproducible and may represent potential indicators of prognosis. Sixty patients who underwent surgery at the Department of Plastic Surgery, Aarhus University Hospital, from 1991 to 1994 were included in the study. Total tumor volume was estimated by the Cavalieri technique, total number of tumor cells by the optical disector principle... showed a significant impact on both disease-free survival (p=0.001) and mortality (p=0.009). In conclusion, tumor volume and total number of cancer cells were highly reproducible but did not add additional, independent prognostic information regarding the study population.
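Of the three estimators named in this record, the volume-weighted mean nuclear volume is commonly obtained from point-sampled intercepts, v̄ = (π/3)·E[l³], where l is the intercept length through a point-sampled nucleus. A minimal sketch with invented intercept lengths, not the study's data:

```python
import math

# Hedged sketch: volume-weighted mean nuclear volume from point-sampled
# intercepts, v = (pi/3) * mean(l^3). Intercept lengths are illustrative.

def mean_nuclear_volume(intercepts_um):
    return (math.pi / 3.0) * sum(l ** 3 for l in intercepts_um) / len(intercepts_um)

lengths = [6.0, 7.5, 5.2, 8.1, 6.8]   # sampled intercept lengths in µm
v_bar = mean_nuclear_volume(lengths)   # volume-weighted mean nuclear volume, µm³
```

Like the Cavalieri and disector estimators, this one makes no assumption about nuclear shape, which is why it pairs naturally with them in melanoma morphometry.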

  18. Automated reasoning applications to design analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Given the necessary relationships and definitions of design functions and components, validation of system incarnation (the physical product of design) and sneak function analysis can be achieved via automated reasoners. The relationships and definitions must define the design specification and incarnation functionally. For the design specification, the hierarchical functional representation is based on physics and engineering principles and bounded by design objectives and constraints. The relationships and definitions of the design incarnation are manifested as element functional definitions, state relationship to functions, functional relationship to direction, element connectivity, and functional hierarchical configuration

  19. 11 YEARS OF BIOSTEREOLOGY IN CHINA

    Directory of Open Access Journals (Sweden)

    Hong Shen

    2011-05-01

    Full Text Available Biostereology in China is very active. Here is a brief summary: Organization: The organization of biostereology in China was founded in Nov. 1988. Its name is Chinese Society of Biomedical Stereology (CSBS), and it is affiliated to the Chinese Society for Stereology (CSS). The first joint president of CSS/BMC was Prof. Peixuan Tang; the second, and now the third, is Prof. Dewen Wang. There are 556 registered members. Academic Congresses: Sessions of the National Biostereological Congress were convened in 1990, 1992, 1996 and 2000. Publications: Four works were written and published in China. One is "Quantitative Histology" (Luji Shi, 1964), another is "Stereological Morphometry For Cell Morphology" (Fusheng Zheng, 1990), the third one is "Practical Biostereological Techniques" (Hong Shen and Yingzhong Shen, 1991) and the fourth one is "Quantitative Cytology and Cytochemistry Techniques" (Genxing Xu, 1994). A Chinese Journal of Stereology and Image Analysis has been published since 1996. Courses: More than ten national training courses on biostereology were held. In some medical universities or colleges, a biostereology course has been set up. Theoretical studies: Some new concepts, parameters and methods for stereology and morphometry were put forward, such as: regular form factor, volume concavity, surface concavity, area concavity, boundary concavity, curve profile area density, positive university for immunohistochemistry stain etc. Application: Stereological methods have been widely applied in biomedical studies. The applied field covered most of the morphological domain of biology. The main applications of biostereology are quantitative pathological diagnosis and prognosis of tumor cells and histostructures. Most studies utilize classical stereological methods. New stereological methods should be popularized and applied in the future. Image Analysis System: Image analysis systems are widely used in biostereological studies. 
About ten kinds of image

  20. Conceptual design and structural analysis of the CFETR cryostat

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhen, E-mail: wangzhen@ipp.ac.cn; Yang, Qingxi; Xu, Hao

    2015-04-15

    Highlights: • The CFETR cryostat is a large vacuum container surrounding the tokamak basic machine. • Two conceptual design schemes of the CFETR cryostat were proposed. • A series of structural analyses were performed for the cryostat cylinder. • The design of the base section is feasible for the cryostat. - Abstract: CFETR (China Fusion Engineering Test Reactor) is a new tokamak device, an important component of which is the cryostat, now being designed by the China national integration design group. The CFETR cryostat is a large single-wall vacuum-tight container surrounding the tokamak basic machine, which consists of a top dome-shaped lid, two cylindrical sections with circumferential stiffening ribs, and a bottom flat head. It shall provide a vacuum environment (10⁻⁴ Pa) for the operation of the superconducting coils, and all the loads that derive from the cryostat itself and the inner components should be transferred to the floor of the tokamak pit. In this paper, two cryostat schemes were proposed, and structural analyses including seismic response analysis, elastic stress analysis and buckling analysis were performed to validate the conceptual design of the CFETR cryostat. Based on the analysis results, it can be inferred that cryostat II has higher stiffness and stability; the structure of cryostat I needs to be improved against buckling, while cryostat II is more difficult to manufacture due to its complex curved surface. Finally, the structural analysis of the base section was performed, and the design of the main support was proved feasible. The design of the CFETR cryostat has not been finalized, and structural optimization still needs to proceed based on the analysis results.


  2. Systemization of Design and Analysis Technology for Advanced Reactor

    International Nuclear Information System (INIS)

    Kim, Keung Koo; Lee, J.; Zee, S. K.

    2009-01-01

    The present study is performed to establish the basis for license application of the original technology, by systemizing and enhancing the technology that is indispensable for the design and analysis of advanced reactors, including integral reactors. Technical reports and topical reports are prepared for this purpose on several important design/analysis areas: design and analysis computer programs, structural integrity evaluation of main components and structures, digital I and C systems, and man-machine interface design. The PPS design concept is complemented to reflect typical safety analysis results, and test plans and requirements are developed for verification of the advanced reactor technology. Moreover, studies are performed to draw up plans for applying the original or base technologies, such as patents, computer programs, test results, and design concepts of the systems and components of the advanced reactors, to current or advanced power reactors. Finally, pending issues of the advanced reactors are studied to improve their economics and technology realization.

  3. Practical stress analysis in engineering design

    CERN Document Server

    Huston, Ronald

    2008-01-01

    Presents the application of engineering design and analysis based on the approach of understanding the physical characteristics of a given problem and then modeling the important aspects of the physical system. This book covers such topics as contact stress analysis, singularity functions, gear stresses, fasteners, shafts, and shaft stresses.

  4. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development and Performance Analysis

    Science.gov (United States)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan; Kirchner, Robert; Engel, Carl D.

    2014-01-01

    The Space Launch System (SLS) base heating test is broken down into two test programs: (1) Pathfinder and (2) Main Test. The Pathfinder Test Program focuses on the design, development, hot-fire test and performance analyses of the 2% sub-scale SLS core-stage and booster element propulsion systems. The core-stage propulsion system is composed of four gaseous oxygen/hydrogen RS-25D model engines and the booster element is composed of two aluminum-based model solid rocket motors (SRMs). The first section of the paper discusses the motivation and test facility specifications for the test program. The second section briefly investigates the internal flow path of the design. The third section briefly shows the performance of the model RS-25D engines and SRMs for the conducted short duration hot-fire tests. Good agreement is observed based on design prediction analysis and test data. This program is a challenging research and development effort that has not been attempted in 40+ years for a NASA vehicle.

  5. Design and simulation analysis of a novel pressure sensor based on graphene film

    Science.gov (United States)

    Nie, M.; Xia, Y. H.; Guo, A. Q.

    2018-02-01

    A novel pressure sensor structure based on a graphene film as the sensitive membrane was proposed in this paper, addressing the problem of measuring low and minor pressures with high sensitivity. Moreover, the designed fabrication process is compatible with CMOS IC fabrication technology. Finite element analysis was used to simulate the displacement distribution of the thin movable graphene film of the designed pressure sensor under different pressures and with different dimensions. From the simulation results, an optimized structure was obtained that can be applied in the low measurement range from 10 hPa to 60 hPa. The length and thickness of the graphene film could be designed as 100 µm and 0.2 µm, respectively. The maximum mechanical stress on the edge of the sensitive membrane was 1.84 kPa, far below the breaking strength of the silicon nitride and graphene film.
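A first-order hand check on such a membrane, before finite element analysis, is classical small-deflection plate theory for a clamped square plate: w_max = α·p·a⁴/D with D = E·t³/(12(1−ν²)) and α ≈ 0.00126. This is only a sketch; thin graphene films at large deflection require nonlinear membrane theory, and the material values below are illustrative assumptions, not the paper's:

```python
# Hedged sketch: center deflection of a clamped square plate under uniform
# pressure, w_max = alpha * p * a^4 / D, D = E t^3 / (12 (1 - nu^2)).
# Valid only for deflections small compared to thickness; all material
# numbers here are assumed for illustration.

def center_deflection(p_pa, a_m, t_m, E_pa, nu, alpha=0.00126):
    D = E_pa * t_m ** 3 / (12.0 * (1.0 - nu ** 2))   # flexural rigidity, N*m
    return alpha * p_pa * a_m ** 4 / D

w = center_deflection(p_pa=1000.0,      # 10 hPa, bottom of the quoted range
                      a_m=100e-6,       # 100 um side length, as in the abstract
                      t_m=0.2e-6,       # 0.2 um thickness, as in the abstract
                      E_pa=0.5e12,      # assumed effective modulus
                      nu=0.16)          # assumed Poisson ratio
```

The linear dependence on p is what makes such sensors easy to calibrate; when w exceeds the film thickness, as it quickly does here, the FEA route taken in the paper is the appropriate one.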

  6. Multidisciplinary Product Decomposition and Analysis Based on Design Structure Matrix Modeling

    DEFF Research Database (Denmark)

    Habib, Tufail

    2014-01-01

    Design structure matrix (DSM) modeling in complex system design supports defining the physical and logical configuration of subsystems, components, and their relationships. This modeling includes product decomposition, identification of interfaces, and structure analysis to increase the architectural understanding of the system. Since product architecture has broad implications in relation to product life cycle issues, in this paper a mechatronic product is decomposed into subsystems and components, and a DSM model is developed to examine the extent of modularity in the system and to manage multiple interactions across subsystems and components. For this purpose, the Cambridge advanced modeler (CAM) software tool is used to develop the system matrix. The analysis of the product (printer) architecture includes clustering and partitioning as well as structure analysis of the system. The DSM analysis is helpful...
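The partitioning step mentioned above groups mutually dependent elements (feedback loops) into blocks. One standard way to find such blocks is via the transitive closure of the dependency matrix: elements that can reach each other belong to one cluster. A minimal sketch on an invented 4-element DSM, not the printer data from the paper:

```python
# Hedged sketch: DSM partitioning by grouping cyclically coupled elements,
# using a reachability matrix built with Warshall's algorithm.

def partition_dsm(dsm):
    """dsm[i][j] = 1 if element i depends on element j. Returns clusters of
    mutually reachable (cyclically coupled) elements."""
    n = len(dsm)
    reach = [[bool(dsm[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):                       # Warshall: transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    clusters, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cluster = [j for j in range(n) if reach[i][j] and reach[j][i]]
        seen.update(cluster)
        clusters.append(cluster)
    return clusters

# Elements 0 and 2 depend on each other (a feedback loop); 1 and 3 are acyclic.
dsm = [[0, 1, 1, 0],
       [0, 0, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 0, 0]]
clusters = partition_dsm(dsm)   # [[0, 2], [1], [3]]
```

Tools like CAM automate this on full product matrices; the coupled blocks are the places where modularity breaks down and interface management effort concentrates.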

  7. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    Science.gov (United States)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
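The cost argument in this abstract can be made concrete on a toy linear "simulation": with state equation A·u = b(p) and output J = c·u, a single adjoint solve Aᵀλ = c gives dJ/dpᵢ = λ·∂b/∂pᵢ for every parameter at once, instead of one extra solve per parameter. The 2×2 system and parameter dependence below are invented for illustration, not NASA's CFD formulation:

```python
# Hedged sketch of adjoint sensitivity analysis on a toy linear system.

def solve2(A, rhs):
    """Solve a 2x2 linear system by Cramer's rule."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [(rhs[0] * d - b * rhs[1]) / det, (a * rhs[1] - rhs[0] * c) / det]

def transpose(A):
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

A = [[4.0, 1.0], [2.0, 3.0]]
c = [1.0, 0.0]                    # output J = u[0]
# The right-hand side depends linearly on parameters p: b = [p0, p0 + 2*p1],
# so db/dp0 = [1, 1] and db/dp1 = [0, 2].
lam = solve2(transpose(A), c)     # the single adjoint solve
dJ_dp0 = lam[0] * 1.0 + lam[1] * 1.0   # = 0.2, matches direct differentiation
dJ_dp1 = lam[0] * 0.0 + lam[1] * 2.0   # = -0.2
```

With millions of design parameters and a CFD solve in place of `solve2`, this one-extra-solve property is exactly the efficiency the abstract describes.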

  8. Stereological estimate of the total number of neurons in spinal segment D9 of the red-eared turtle

    DEFF Research Database (Denmark)

    Walløe, Solveig; Nissen, Ulla Vig; Berg, Rune W

    2011-01-01

    The red-eared turtle is an important animal model for investigating the neural activity in the spinal circuit that generates motor behavior. However, basic anatomical features, including the number of neurons in the spinal segments involved, are unknown. In the present study, we estimate the total number of neurons in segment D9 of the spinal cord in the red-eared turtle (Trachemys scripta elegans) using stereological cell counting methods. In transverse spinal cord sections stained with modified Giemsa, motoneurons (MNs), interneurons (INs), and non-neuronal cells were distinguished according to location and morphology. Each cell type was then counted separately using an optical disector with the cell nucleus as counting item. The number of cells in segment D9 was as follows (mean ± SE): MNs, 2049 ± 74; INs, 16,135 ± 316; non-neuronal cells, 47,504 ± 478 (n = 6). These results provide the first...
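The optical disector estimate used here follows N = N_V · V_ref, with the numerical density N_V obtained as the total count divided by the total volume probed by the disectors (frame area × disector height × number of disectors). A minimal sketch with invented counts and dimensions, not the study's data:

```python
# Hedged sketch: total number estimate from optical disector counts,
# N = N_V * V_ref, N_V = sum(Q) / (n * a_frame * h).
# Counts, frame size, disector height and reference volume are illustrative.

def disector_total_number(counts, frame_area_um2, height_um, ref_volume_um3):
    nv = sum(counts) / (len(counts) * frame_area_um2 * height_um)  # cells / um^3
    return nv * ref_volume_um3

counts = [3, 5, 2, 4, 6, 4]          # neurons counted in 6 systematic disectors
n_total = disector_total_number(counts, frame_area_um2=900.0, height_um=10.0,
                                ref_volume_um3=2.0e9)
```

Because the disector samples with a known 3-D probe, the estimate is unbiased by cell size and shape, which is why it is preferred over profile counting for mixed populations like MNs and INs.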

  9. Using a Design Science Perspective to Understand a Complex Design-Based Research Process

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2012-01-01

    The purpose of the paper is to demonstrate how a design science perspective can be used to describe and understand a set of related design-based research processes. We describe and analyze a case study in a manner that is inspired by design science. The case study involves the design of modeling tools and the redesign of an information service in a library. We use a set of guidelines from a design science perspective to organize the description and analysis of the case study. By doing this we demonstrate the usefulness of design science as an analytical tool for understanding related design-based research processes. And we argue that a design science perspective may be useful for both researchers and practitioners.

  10. Local porosity analysis of pore structure in cement paste

    International Nuclear Information System (INIS)

    Hu Jing; Stroeven, Piet

    2005-01-01

    Three-dimensional (3-D) local porosity theory (LPT) was originally proposed by Hilfer and recently used for the analysis of pore space geometry in model sandstone. LPT seeks to define the probability density functions of porosity and porosity connectivity; in doing so, heterogeneity differences among various sandstone samples were assessed. However, fundamental issues concerning the stochastic concept of geometric heterogeneity are ignored in Hilfer's LPT. This paper focuses on proper sampling procedures that should be based on stochastic approaches to multistage sampling and geometric heterogeneity. Standard LPT analysis provides a 3-D microscopic modeling approach to materials, whereas traditional experimental techniques yield two-dimensional (2-D) section images. Therefore, this paper replaces the method for assessing material data in standard LPT with a more practical one, based on stereological, 3-D interpretation of quantitative image analysis data. The developed methodology is used to characterize the pore structure in hardened cement paste with various water/cement ratios (w/c) at different hydration stages.
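The central quantity of LPT is the local porosity distribution: the histogram of porosity φ measured in measurement cells of side L. On a 2-D section image this reduces to counting pore pixels in each cell. A minimal sketch on a tiny invented binary image (1 = pore, 0 = solid), not cement paste data:

```python
# Hedged sketch: local porosities phi(L) over non-overlapping L x L cells of a
# binary 2-D image, the empirical basis of a local porosity distribution.

def local_porosities(image, L):
    rows, cols = len(image), len(image[0])
    phis = []
    for r in range(0, rows - L + 1, L):        # non-overlapping L x L cells
        for c in range(0, cols - L + 1, L):
            pore = sum(image[r + i][c + j] for i in range(L) for j in range(L))
            phis.append(pore / (L * L))
    return phis

image = [[1, 0, 0, 1],
         [1, 1, 0, 0],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
phis = local_porosities(image, L=2)     # [0.75, 0.25, 0.0, 1.0]
```

The spread of these local values, as a function of L, is what distinguishes a heterogeneous pore structure from a homogeneous one with the same mean porosity.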

  11. Effect of hindlimb unloading on stereological parameters of the motor cortex and hippocampus in male rats.

    Science.gov (United States)

    Salehi, Mohammad Saied; Mirzaii-Dizgah, Iraj; Vasaghi-Gharamaleki, Behnoosh; Zamiri, Mohammad Javad

    2016-11-09

    Hindlimb unloading (HU) can cause motion and cognition dysfunction, although its cellular and molecular mechanisms are not well understood. The aim of the present study was to determine the stereological parameters of the brain areas involved in motion (motor cortex) and spatial learning - memory (hippocampus) under an HU condition. Sixteen adult male rats, kept under a 12 : 12 h light-dark cycle, were divided into two groups of freely moving (n=8) and HU (n=8) rats. The volume of motor cortex and hippocampus, the numerical cell density of neurons in layers I, II-III, V, and VI of the motor cortex, the entire motor cortex as well as the primary motor cortex, and the numerical density of the CA1, CA3, and dentate gyrus subregions of the hippocampus were estimated. No significant differences were observed in the evaluated parameters. Our results thus indicated that motor cortical and hippocampal atrophy and cell loss may not necessarily be involved in the motion and spatial learning memory impairment in the rat.

  12. Structural design systems using knowledge-based techniques

    International Nuclear Information System (INIS)

    Orsborn, K.

    1993-01-01

    Engineering information management and the corresponding information systems are of strategic importance for industrial enterprises. This thesis treats the interdisciplinary field of designing computing systems for structural design and analysis using knowledge-based techniques. Specific conceptual models have been designed for representing the structure and the process of objects and activities in a structural design and analysis domain. In this thesis, it is shown how domain knowledge can be structured along several classification principles in order to reduce complexity and increase flexibility. By raising the conceptual level of the problem description and representing the domain knowledge in a declarative form, it is possible to enhance the development, maintenance and use of software for mechanical engineering. This will result in a corresponding increase in the efficiency of the mechanical engineering design process. These ideas, together with rule-based control, point out the leverage of declarative knowledge representation within this domain. Used appropriately, a declarative knowledge representation preserves information better, and is more problem-oriented and change-tolerant than procedural representations. 74 refs

  13. Process-based organization design and hospital efficiency.

    Science.gov (United States)

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.
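The DEA efficiency scores used in this study are, in general, obtained by solving a linear program per hospital; in the special case of a single input and a single output, the CCR efficiency reduces to each unit's output/input ratio relative to the best ratio in the sample. A minimal sketch of that special case, with invented hospital figures:

```python
# Hedged sketch: single-input, single-output DEA (CCR) efficiency, which
# reduces to normalized output/input ratios. Full DEA with multiple inputs
# and outputs requires a linear program per unit. Figures are illustrative.

def dea_single_ratio(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

beds = [100.0, 250.0, 180.0]          # input: staffed beds per hospital
cases = [4000.0, 8000.0, 7200.0]      # output: treated cases per hospital
eff = dea_single_ratio(beds, cases)   # [1.0, 0.8, 1.0]
```

A score of 1.0 marks a hospital on the efficiency frontier; a score of 0.8 means the unit produces 80% of what the best performer achieves per unit of input.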

  14. Crataegus Monogyna Aqueous Extract Ameliorates Cyclophosphamide-Induced Toxicity in Rat Testis: Stereological Evidences

    Directory of Open Access Journals (Sweden)

    Hassan Malekinejad

    2012-01-01

    Full Text Available Cyclophosphamide (CP) is extensively used as an antineoplastic agent for the treatment of various cancers, as well as an immunosuppressive agent. However, despite its wide spectrum of clinical uses, CP is known to cause several adverse effects, including reproductive toxicity. Crataegus monogyna is one of the oldest pharmaceutical plants and has been shown to be cytoprotective by scavenging free radicals. The present study was conducted to assess whether a Crataegus monogyna fruit aqueous extract with anti-oxidant properties could serve as a protective agent against reproductive toxicity during CP treatment in a rat model. Male Wistar rats were categorized into four groups. Two groups of rats were administered CP at a dose of 5 mg in 5 ml saline/kg/day for 28 days by oral gavage. One of these groups received Crataegus monogyna aqueous extract at a dose of 20 mg/kg/day orally four hours after cyclophosphamide administration. A vehicle-treated control group and a Crataegus monogyna control group were also included. The CP-treated group showed significant decreases in body, testes and epididymides weights, as well as many histological alterations. Stereological parameters and spermatogenic activities (Sertoli cell, repopulation and meiotic indices) were also significantly decreased by CP treatment. Notably, Crataegus coadministration caused a partial recovery in the above-mentioned parameters. These findings indicate that Crataegus monogyna may be partially protective against CP-induced testicular toxicity.

  15. Hardware Design of Tuber Electrical Resistance Tomography System Based on the Soil Impedance Test and Analysis

    Directory of Open Access Journals (Sweden)

    Liu Shuyi

    2016-01-01

    Full Text Available The hardware design of the tuber electrical resistance tomography (TERT) system is one of the key research problems of the TERT data acquisition system. The TERT system can be applied to monitoring the tuber growth process in agriculture, i.e., the TERT data acquisition system can realize real imaging of tuber plants in soil. In the TERT system, the imaged tuber-and-soil multiphase medium is quite complex, so the impedance test and analysis of the soil multiphase medium is very important to the design of the sensitive array sensor subsystem and the signal processing circuits. In this paper, the soil impedance test experiment is described and the results are analysed. The data acquisition hardware system is designed based on the results of the soil medium impedance test and analysis. In the hardware design, the switch control chip ADG508, the instrumentation amplifier AD620 and the programmable amplifier AD526 are employed. In addition, the phase-locked loop technique for signal demodulation is introduced. Initial data collection with and without a plant tuber present is given and discussed. Conclusions on the hardware design of the TERT system are presented.

  16. Application of computer aided tolerance analysis in product design

    International Nuclear Information System (INIS)

    Du Hua

    2009-01-01

    This paper introduces the shortcomings of the traditional tolerance design method and the strengths of the computer aided tolerancing (CAT) method, compares the strengths and shortcomings of the three tolerance analysis methods (Worst Case Analysis, Statistical Analysis and Monte-Carlo Simulation Analysis), and offers the basic procedure and relevant details for CAT. As the study objects, the reactor pressure vessel, the core barrel, the hold-down barrel and the support plate are used to build the tolerance simulation model, based on their 3D design models. The tolerance simulation analysis is then conducted and the scheme of the tolerance distribution is optimized based on the analysis results. (authors)
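
    The contrast between Worst Case Analysis and Monte-Carlo Simulation Analysis can be sketched for a one-dimensional stack-up; the dimensions and tolerances below are illustrative, not the reactor components discussed above:

```python
import random
import statistics

# Monte-Carlo tolerance stack-up vs. Worst Case for a 1-D assembly gap:
# three components stacked inside a housing. Values are illustrative.

random.seed(1)
nominals = [50.0, 30.0, 19.8]   # component lengths, mm
tols     = [0.10, 0.08, 0.05]   # symmetric tolerances, mm
housing  = 100.0                # housing length, mm

# Worst Case: every dimension simultaneously at its unfavourable limit.
worst_case_gap = housing - sum(nominals) - sum(tols)

# Monte-Carlo: treat each tolerance as the 3-sigma bound of a normal law.
gaps = []
for _ in range(100_000):
    total = sum(random.gauss(nom, t / 3.0) for nom, t in zip(nominals, tols))
    gaps.append(housing - total)

mean_gap = statistics.mean(gaps)
sigma_gap = statistics.stdev(gaps)
# The statistical 3-sigma band is far less pessimistic than the worst case,
# which is why statistical methods allow wider (cheaper) component tolerances.
```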

  17. Design process dynamics in an experience-based context : a design methodological analysis of the Brabantia corkscrew development

    NARCIS (Netherlands)

    Vries, de M.J.

    1994-01-01

    In design methodology, the influence of various factors on design processes is studied. In this article the design of the Brabantia corkscrew is presented as a case study in which these factors are analysed. The aim of the analysis is to gain insight into the way Brabantia took these factors into

  18. Effects of melatonin on diclofenac sodium treated rat kidney: a stereological and histopathological study.

    Science.gov (United States)

    Khoshvakhti, Habib; Yurt, K Kübra; Altunkaynak, B Zuhal; Türkmen, Aysın P; Elibol, Ebru; Aydın, Işınsu; Kıvrak, Elfide G; Önger, M Emin; Kaplan, Süleyman

    2015-01-01

    In this study, we aimed to investigate the effects of diclofenac sodium (DS) and melatonin (MEL) on the kidneys of prenatally exposed rats. Pregnant rats were divided into control, physiological saline, DS, and DS + MEL groups. All injections were given from the 5th day after mating to the 15th day of pregnancy. The physical disector and the Cavalieri principle were used to estimate the numerical density and total number of glomeruli and the volumetric parameters of the kidney, respectively. Our stereological results indicated that DS application during pregnancy led to a decrease in the mean volume, numerical density, and total number of the glomeruli (p < 0.05). Light microscopic investigation showed congestion in blood vessels and shrinkage of Bowman's space in the DS group. Moreover, there was degeneration in nephrons, including glomerulosclerosis and tubular defects, and an increase in the connective tissue in the kidneys of the DS-treated group. However, use of MEL with DS prevented these pathological alterations in the kidney. We suggest that DS may lead to adverse effects in the kidneys of rats prenatally exposed to this drug; fortunately, these adverse effects can be prevented by melatonin supplementation.
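
    The Cavalieri principle mentioned above estimates an organ's volume as the section spacing times the summed profile areas, V = T * sum(A_i); the section areas below are invented for illustration:

```python
# Cavalieri principle sketch: total volume from systematic, equidistant
# sections, V = T * sum(A_i). The section areas are made-up values.

def cavalieri_volume(section_areas, spacing):
    """section_areas in mm^2, spacing T in mm; returns volume in mm^3."""
    return spacing * sum(section_areas)

areas = [0.0, 2.1, 4.8, 6.0, 5.2, 3.1, 0.9]   # measured profile areas, mm^2
volume = cavalieri_volume(areas, spacing=0.5)  # section spacing T = 0.5 mm
```

    The estimate is unbiased as long as the first section is placed at a uniformly random offset within the spacing interval.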

  19. A supportive architecture for CFD-based design optimisation

    Science.gov (United States)

    Li, Ni; Su, Zeya; Bi, Zhuming; Tian, Chao; Ren, Zhiming; Gong, Guanghong

    2014-03-01

    Multi-disciplinary design optimisation (MDO) is one of the critical methodologies for the implementation of enterprise systems (ES). MDO requiring the analysis of fluid dynamics raises a special challenge due to its extremely intensive computation. The rapid development of computational fluid dynamics (CFD) techniques has led to a rise in their application in various fields. Especially for the exterior design of vehicles, CFD has become one of the three main design tools, comparable to analytical approaches and wind tunnel experiments. CFD-based design optimisation is an effective way to achieve the desired performance under the given constraints. However, due to the complexity of CFD, integrating CFD analysis into an intelligent optimisation algorithm is not straightforward. It is a challenge to solve a CFD-based design problem, which usually has high dimensionality and multiple objectives and constraints. It is desirable to have an integrated architecture for CFD-based design optimisation. However, our review of existing works has found that very few researchers have studied assistive tools to facilitate CFD-based design optimisation. In this paper, a multi-layer architecture and a general procedure are proposed to integrate different CFD toolsets with intelligent optimisation algorithms, parallel computing and other techniques for efficient computation. In the proposed architecture, the integration is performed either at the code level or the data level to fully utilise the capabilities of different assistive tools. Two intelligent algorithms are developed and embedded with parallel computing. These algorithms, together with the supportive architecture, lay a solid foundation for various applications of CFD-based design optimisation.
To illustrate the effectiveness of the proposed architecture and algorithms, the case studies on aerodynamic shape design of a hypersonic cruising vehicle are provided, and the result has shown that the proposed architecture

  20. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    -tions. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  1. An integrated factor analysis model for product eco-design based on full life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z.; Xiao, T.; Li, D.

    2016-07-01

    Among the methods of comprehensive analysis for a product or an enterprise, there exist defects and deficiencies in traditional standard cost analyses and life cycle assessment methods. For example, some methods only emphasize one dimension (such as economic or environmental factors) while neglecting other relevant dimensions. This paper builds a factor analysis model of resource value flow, based on full life cycle assessment and eco-design theory, in order to expose the relevant internal logic between these two factors. The model considers the efficient multiplication of resources, economic efficiency, and environmental efficiency as its core objectives. The model studies the status of resource value flow during the entire life cycle of a product, and gives an in-depth analysis on the mutual logical relationship of product performance, value, resource consumption, and environmental load to reveal the symptoms and potentials in different dimensions. This provides comprehensive, accurate and timely decision-making information for enterprise managers regarding product eco-design, as well as production and management activities. To conclude, it verifies the availability of this evaluation and analysis model using a Chinese SUV manufacturer as an example. (Author)

  2. Research-based design & design-based research: Affordances, limitations and synergies

    NARCIS (Netherlands)

    McKenney, Susan

    2015-01-01

    Research-based design is an orientation to educational development that is explicitly informed by existing research as well as formative evaluation. Design-based research is a genre of inquiry in which the design of innovative solutions to problems in educational practice provides the context for

  3. Long live the liver: immunohistochemical and stereological study of hepatocytes, liver sinusoidal endothelial cells, Kupffer cells and hepatic stellate cells of male and female rats throughout ageing.

    Science.gov (United States)

    Marcos, Ricardo; Correia-Gomes, Carla

    2016-12-01

    Male/female differences in enzyme activity and gene expression in the liver are known to be attenuated with ageing. Nevertheless, the effect of ageing on liver structure and quantitative cell morphology remains unknown. Male and female Wistar rats aged 2, 6, 12 and 18 months were examined by means of stereological techniques and immunohistochemical tagging of hepatocytes (HEP), liver sinusoidal endothelial cells (LSEC), Kupffer cells (KC) and hepatic stellate cells (HSC) in order to assess the total number and number per gram of these cells throughout life. The mean cell volume of HEP and HSC, the lobular position and the collagen content of the liver were also evaluated with stereological techniques. The number per gram of HSC was similar for both genders and was maintained throughout ageing. The mean volume of HSC was also conserved but differences in the cell body and lobular location were observed. Statistically significant gender differences in HEP were noted in young rats (females had smaller and more binucleated HEP) but were attenuated with ageing. The same occurred for KC and LSEC, since the higher number per gram in young females disappeared in older animals. Liver collagen increased with ageing but only in males. Thus, the numbers of these four cell types are related throughout ageing, with well-defined cell ratios. The shape and lobular position of HSC change with ageing in both males and females. Gender dimorphism in HEP, KC and LSEC of young rat liver disappears with ageing.

  4. Advanced methods for the analysis, design, and optimization of SMA-based aerostructures

    International Nuclear Information System (INIS)

    Hartl, D J; Lagoudas, D C; Calkins, F T

    2011-01-01

    Engineers continue to apply shape memory alloys to aerospace actuation applications due to their high energy density, robust solid-state actuation, and silent and shock-free operation. Past design and development of such actuators relied on experimental trial and error and empirically derived graphical methods. Over the last two decades, however, it has been repeatedly demonstrated that existing SMA constitutive models can capture stabilized SMA transformation behaviors with sufficient accuracy. This work builds upon past successes and suggests a general framework by which predictive tools can be used to assess the responses of many possible design configurations in an automated fashion. By applying methods of design optimization, it is shown that the integrated implementation of appropriate analysis tools can guide engineers and designers to the best design configurations. A general design optimization framework is proposed for the consideration of any SMA component or assembly of such components that applies when the set of design variables includes many members. This is accomplished by relying on commercially available software and utilizing tools already well established in the design optimization community. Such tools are combined with finite element analysis (FEA) packages that consider a multitude of structural effects. The foundation of this work is a three-dimensional thermomechanical constitutive model for SMAs applicable for arbitrarily shaped bodies. A reduced-order implementation also allows computationally efficient analysis of structural components such as wires, rods, beams and shells. The use of multiple optimization schemes, the consideration of assembled components, and the accuracy of the implemented constitutive model in full and reduced-order forms are all demonstrated

  5. Stereological Study on the Positive Effect of Running Exercise on the Capillaries in the Hippocampus in a Depression Model

    Directory of Open Access Journals (Sweden)

    Linmu Chen

    2017-11-01

    Full Text Available Running exercise is an effective method to improve depressive symptoms when combined with drugs. However, the underlying mechanisms are not fully clear. Cerebral blood flow perfusion in depressed patients is significantly lower in the hippocampus. Physical activity can achieve cerebrovascular benefits. The purpose of this study was to evaluate the impact of running exercise on capillaries in the hippocampal CA1 and dentate gyrus (DG) regions. The chronic unpredictable stress (CUS) depression model was used in this study. CUS rats were given 4 weeks of running exercise from the fifth week to the eighth week (20 min every day, Monday to Friday). The sucrose consumption test was used to measure anhedonia. Furthermore, stereological methods were used to investigate the capillary changes among the control group, CUS/Standard group and CUS/Running group. Sucrose consumption significantly increased in the CUS/Running group. Running exercise had positive effects on capillary parameters in the hippocampal CA1 and DG regions, such as the total volume, total length and total surface area. These results demonstrate that the protection of capillaries by running exercise in the hippocampal CA1 and DG regions might be one of the structural bases for the exercise-induced treatment of depression-like behavior. They also suggest that drugs and behavior influence capillaries, which may be considered a new avenue for depression treatment in the future.

  6. Reliability-Based Robust Design Optimization of Structures Considering Uncertainty in Design Variables

    Directory of Open Access Journals (Sweden)

    Shujuan Wang

    2015-01-01

    Full Text Available This paper investigates the structural design optimization to cover both the reliability and robustness under uncertainty in design variables. The main objective is to improve the efficiency of the optimization process. To address this problem, a hybrid reliability-based robust design optimization (RRDO method is proposed. Prior to the design optimization, the Sobol sensitivity analysis is used for selecting key design variables and providing response variance as well, resulting in significantly reduced computational complexity. The single-loop algorithm is employed to guarantee the structural reliability, allowing fast optimization process. In the case of robust design, the weighting factor balances the response performance and variance with respect to the uncertainty in design variables. The main contribution of this paper is that the proposed method applies the RRDO strategy with the usage of global approximation and the Sobol sensitivity analysis, leading to the reduced computational cost. A structural example is given to illustrate the performance of the proposed method.
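
    The Sobol sensitivity step described above can be sketched with a basic pick-freeze estimator of first-order indices; the two-variable response below is an illustrative stand-in for a structural model, with known analytic indices S1 = 16/17 and S2 = 1/17:

```python
import random

# Pick-freeze estimation of first-order Sobol indices for a toy response
# y = 4*x1 + x2 with independent uniform inputs (illustrative stand-in).
# Cov(Y, Y_i) over samples sharing only x_i estimates the partial variance.

random.seed(0)

def model(x1, x2):
    return 4.0 * x1 + 1.0 * x2

n = 100_000
ya, y_fix1, y_fix2 = [], [], []
for _ in range(n):
    x1, x2 = random.random(), random.random()
    r1, r2 = random.random(), random.random()
    ya.append(model(x1, x2))
    y_fix1.append(model(x1, r2))  # keep x1, resample x2
    y_fix2.append(model(r1, x2))  # keep x2, resample x1

mean = sum(ya) / n
var = sum(y * y for y in ya) / n - mean * mean
s1 = (sum(a * b for a, b in zip(ya, y_fix1)) / n - mean * mean) / var
s2 = (sum(a * b for a, b in zip(ya, y_fix2)) / n - mean * mean) / var
# Inputs with small first-order indices (here x2) can be dropped from the
# optimization loop, which is the cost reduction exploited in the paper.
```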

  7. A Novel Double Cluster and Principal Component Analysis-Based Optimization Method for the Orbit Design of Earth Observation Satellites

    Directory of Open Access Journals (Sweden)

    Yunfeng Dong

    2017-01-01

    Full Text Available The weighted sum and genetic algorithm-based hybrid method (WSGA-based HM), which has been applied to multiobjective orbit optimizations, is negatively influenced by human factors through the artificial choice of the weight coefficients in the weighted sum method and by the slow convergence of GA. To address these two problems, a cluster and principal component analysis-based optimization method (CPC-based OM) is proposed, in which many candidate orbits are gradually randomly generated until the optimal orbit is obtained using a data mining method, that is, cluster analysis based on principal components. Then, a second cluster analysis of the orbital elements is introduced into CPC-based OM to improve convergence, developing a novel double cluster and principal component analysis-based optimization method (DCPC-based OM). In DCPC-based OM, the cluster analysis based on principal components has the advantage of reducing human influences, and the cluster analysis based on the six orbital elements reduces the search space to effectively accelerate convergence. The test results from a multiobjective numerical benchmark function and the orbit design of an Earth observation satellite show that DCPC-based OM converges more efficiently than WSGA-based HM, and that it reduces, to some degree, the influence of the human factors present in WSGA-based HM.

  8. Designers' Cognitive Thinking Based on Evolutionary Algorithms

    OpenAIRE

    Zhang Shutao; Jianning Su; Chibing Hu; Peng Wang

    2013-01-01

    The research on cognitive thinking is important for constructing efficient intelligent design systems, but it is difficult to describe a model of cognitive thinking with a reasonable mathematical theory. Based on the analysis of design strategy and innovative thinking, we investigated a design cognitive thinking model that included the external guide thinking of "width priority - depth priority" and the internal dominated thinking of "divergent thinking - convergent thinking", built a reaso...

  9. A Contradiction-Based Approach for Innovative Product Design

    Directory of Open Access Journals (Sweden)

    Ko Yao-Tsung

    2016-01-01

    Full Text Available Without creativity in design there is no potential for innovation. This paper investigates the role of contradictions in enhancing creativity in product design. Based on the inventive principles of TRIZ, this paper presents a novel design method that integrates technical and physical contradiction analysis into the conceptual design activities of new product development (NPD). Despite the recognized importance of innovative design, there is a lack of a systematic and effective design-thinking process that covers all conceptual design activities. To address this gap, a sharper and more fundamental problem-solving model is created for innovative product design based on the contradiction-oriented concept. Finally, a case study is employed to illustrate the method, and the result validates that it can help designers produce more creative outcomes in product design.

  10. Reliability Based Ship Structural Design

    DEFF Research Database (Denmark)

    Dogliani, M.; Østergaard, C.; Parmentier, G.

    1996-01-01

    This paper deals with the development of different methods that allow the reliability-based design of ship structures to be transferred from the area of research to systematic application in current design. It summarises the achievements of a three-year collaborative research project dealing...... with developments of models of load effects and of structural collapse adopted in reliability formulations which aim at calibrating partial safety factors for ship structural design. New probabilistic models of still-water load effects are developed both for tankers and for containerships. New results are presented...... structure of several tankers and containerships. The results of the reliability analysis were the basis for the definition of a target safety level, which was used to assess the partial safety factors suitable for a new design rules format to be adopted in modern ship structural design. Finally...

  11. Embracing model-based designs for dose-finding trials.

    Science.gov (United States)

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-07-25

    Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators' preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia.
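
    The continual reassessment method advocated above can be sketched as a one-parameter Bayesian update over a discrete grid; the skeleton, target rate, and power-model form below are common CRM choices but are assumptions for illustration, not details from the paper:

```python
import math

# One-parameter CRM sketch: dose i has toxicity probability
# p_i = skeleton_i ** exp(theta), with a flat discrete prior on theta.
# Skeleton, target rate, and grid bounds are illustrative assumptions.

skeleton = [0.05, 0.12, 0.25, 0.40, 0.55]  # prior guesses of toxicity rates
target = 0.25                              # targeted toxicity probability
grid = [-1.5 + 3.0 * k / 100 for k in range(101)]
prior = [1.0 / len(grid)] * len(grid)

def posterior(trials):
    """trials: [(dose_index, had_toxicity), ...] -> posterior weights."""
    weights = []
    for theta, pr in zip(grid, prior):
        like = 1.0
        for dose, tox in trials:
            p = skeleton[dose] ** math.exp(theta)
            like *= p if tox else (1.0 - p)
        weights.append(pr * like)
    z = sum(weights)
    return [w / z for w in weights]

def next_dose(trials):
    post = posterior(trials)
    est = [sum(w * (s ** math.exp(t)) for w, t in zip(post, grid))
           for s in skeleton]  # posterior-mean toxicity rate at each dose
    return min(range(len(skeleton)), key=lambda d: abs(est[d] - target))
```

    After each cohort the posterior is updated and the next cohort receives the dose whose estimated toxicity rate is closest to the target, which is how the CRM adapts faster than the rule-based 3+3 design.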

  12. An integrated reliability-based design optimization of offshore towers

    International Nuclear Information System (INIS)

    Karadeniz, Halil; Togan, Vedat; Vrouwenvelder, Ton

    2009-01-01

    After recognizing the uncertainty in the parameters such as material, loading, geometry and so on in contrast with the conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful to perform an economical design implementation, which includes a reliability analysis and an optimization algorithm. RBDO procedures include structural analysis, reliability analysis and sensitivity analysis both for optimization and for reliability. The efficiency of the RBDO system depends on the mentioned numerical algorithms. In this work, an integrated algorithms system is proposed to implement the RBDO of the offshore towers, which are subjected to the extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of towers are as follows: (a) a structural analysis program, SAPOS, (b) an optimization program, SQP and (c) a reliability analysis program based on FORM. A demonstration of an example tripod tower under the reliability constraints based on limit states of the critical stress, buckling and the natural frequency is presented.
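
    The FORM step in such an RBDO loop can be illustrated for the simplest case, a linear limit state g = R - S with independent normal variables, where the reliability index has a closed form; the numbers are illustrative, not from the tripod tower example:

```python
import math

# First-order reliability for the simplest limit state g = R - S with
# independent normal R (resistance) and S (load effect). For this linear
# case the HL-RF iteration of FORM collapses to a closed-form index.

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)  # reliability index
    return beta, norm_cdf(-beta)                     # failure probability

beta, pf = form_linear(mu_r=500.0, sig_r=40.0, mu_s=350.0, sig_s=30.0)
# beta = 3.0, pf = Phi(-3) ~ 1.35e-3
```

    In an RBDO loop, constraints of the form beta >= beta_target are evaluated this way (or by iterative FORM for nonlinear limit states) inside each optimizer step.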

  13. An integrated reliability-based design optimization of offshore towers

    Energy Technology Data Exchange (ETDEWEB)

    Karadeniz, Halil [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)], E-mail: h.karadeniz@tudelft.nl; Togan, Vedat [Department of Civil Engineering, Karadeniz Technical University, Trabzon (Turkey); Vrouwenvelder, Ton [Faculty of Civil Engineering and Geosciences, Delft University of Technology, Delft (Netherlands)

    2009-10-15

    After recognizing the uncertainty in the parameters such as material, loading, geometry and so on in contrast with the conventional optimization, the reliability-based design optimization (RBDO) concept has become more meaningful to perform an economical design implementation, which includes a reliability analysis and an optimization algorithm. RBDO procedures include structural analysis, reliability analysis and sensitivity analysis both for optimization and for reliability. The efficiency of the RBDO system depends on the mentioned numerical algorithms. In this work, an integrated algorithms system is proposed to implement the RBDO of the offshore towers, which are subjected to the extreme wave loading. The numerical strategies interacting with each other to fulfill the RBDO of towers are as follows: (a) a structural analysis program, SAPOS, (b) an optimization program, SQP and (c) a reliability analysis program based on FORM. A demonstration of an example tripod tower under the reliability constraints based on limit states of the critical stress, buckling and the natural frequency is presented.

  14. Design of an RF window for L-band CW klystron based on thermal-stress analysis

    International Nuclear Information System (INIS)

    Yamaguchi, Seiya; Sato, Isamu; Konashi, Kenji; Ohshika, Junji.

    1993-01-01

    The design of a klystron RF window has been performed based on thermal-stress analysis for an L-band CW electron linac for nuclear waste transmutation. It was shown that the hoop stress for a modified disk is 46% of that of the normal disk. A thermal load test was carried out, which indicated that the modified disk withstands twice as much power as the normal disk. (author)

  15. Performance-based analysis of current South African semi-trailer designs

    CSIR Research Space (South Africa)

    Thorogood, R

    2009-07-01

    Full Text Available Keywords: performance based standards, dynamic stability, tractor semi-trailers, directional response, static rollover threshold. Introduction: South African heavy vehicles are currently designed according to prescriptive standards designed and enforced... productivity. These include Central Tyre Inflation (CTI), on-board weighing, new materials such as Domex and vehicle satellite tracking, all leading towards increased payloads and reduced costs. There have also been improvements in technology...

  16. Stereological estimates of nuclear volume and other quantitative variables in supratentorial brain tumors. Practical technique and use in prognostic evaluation

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Braendgaard, H; Chistiansen, A O

    1991-01-01

    The use of morphometry and modern stereology in malignancy grading of brain tumors is only poorly investigated. The aim of this study was to present these quantitative methods. A retrospective feasibility study of 46 patients with supratentorial brain tumors was carried out to demonstrate...... the practical technique. The continuous variables were correlated with the subjective, qualitative WHO classification of brain tumors, and the prognostic value of the parameters was assessed. Well differentiated astrocytomas (n = 14) had smaller estimates of the volume-weighted mean nuclear volume and mean...... nuclear profile area, than those of anaplastic astrocytomas (n = 13) (2p = 3.1 x 10^-3 and 2p = 4.8 x 10^-3, respectively). No differences were seen between the latter type of tumor and glioblastomas (n = 19). The nuclear index was of the same magnitude in all three tumor types, whereas the mitotic index...
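
    The volume-weighted mean nuclear volume referred to above is classically estimated from point-sampled intercepts as v_V = (pi/3) * E[l0^3]; the intercept lengths below are invented for illustration:

```python
import math
import statistics

# Volume-weighted mean nuclear volume from point-sampled intercepts:
# v_V = (pi/3) * mean(l0^3). Intercept lengths are illustrative values.

intercepts_um = [6.1, 7.4, 5.8, 8.0, 6.6, 7.1]  # intercept lengths, micrometres
vv = (math.pi / 3.0) * statistics.mean(l ** 3 for l in intercepts_um)  # um^3
```

    Because each intercept length is cubed, the estimator weights large nuclei heavily, which is what makes the variable sensitive to the nuclear pleomorphism used in malignancy grading.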

  17. Linear regression and sensitivity analysis in nuclear reactor design

    International Nuclear Information System (INIS)

    Kumar, Akansha; Tsvetkov, Pavel V.; McClarren, Ryan G.

    2015-01-01

    Highlights: • Presented a benchmark for the applicability of linear regression to complex systems. • Applied linear regression to a nuclear reactor power system. • Performed neutronics, thermal–hydraulics, and energy conversion using Brayton's cycle for the design of a GCFBR. • Performed detailed sensitivity analysis of a set of parameters in a nuclear reactor power system. • Modeled and developed the reactor design using MCNP, regression using R, and thermal–hydraulics in Java. - Abstract: The paper presents a general strategy applicable for sensitivity analysis (SA) and uncertainty quantification analysis (UA) of parameters related to a nuclear reactor design. This work also validates the use of linear regression (LR) for predictive analysis in nuclear reactor design. The analysis helps to determine the parameters on which an LR model can be fit for predictive analysis. For those parameters, a regression surface is created based on trial data and predictions are made using this surface. A general strategy of SA to determine and identify the influential parameters that affect the operation of the reactor is presented. Identification of design parameters and validation of the linearity assumption for the application of LR to reactor design, based on a set of tests, is performed. The testing methods used to determine the behavior of the parameters can be used as a general strategy for UA and SA of nuclear reactor models and thermal–hydraulics calculations. The design of a gas cooled fast breeder reactor (GCFBR), with thermal–hydraulics and energy transfer, has been used for the demonstration of this method. MCNP6 is used to simulate the GCFBR design and perform the necessary criticality calculations. Java is used to build and run input samples and to extract data from the output files of MCNP6, and R is used to perform regression analysis and other multivariate variance and collinearity analyses of the data.
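
    One simple way to realize the regression-based sensitivity strategy described above is to rank inputs by their standardized effect on the response; for independent inputs the input-output correlation coefficient plays this role. The response function and parameter names below are illustrative, not the GCFBR model of the paper:

```python
import math
import random
import statistics

# Sensitivity ranking via input-output correlation (equal to standardized
# regression coefficients when inputs are independent). The response
# function and parameter names are illustrative assumptions.

random.seed(42)

def response(power, flow, enrichment):
    # linear-plus-noise surrogate standing in for a reactor code run
    return 3.0 * power - 1.0 * flow + 0.2 * enrichment + random.gauss(0.0, 0.1)

n = 5_000
X = [[random.random() for _ in range(3)] for _ in range(n)]
y = [response(*row) for row in X]

def pearson(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    return cov / math.sqrt(sum((u - ma) ** 2 for u in a) *
                           sum((v - mb) ** 2 for v in b))

names = ["power", "flow", "enrichment"]
sens = {nm: pearson([row[i] for row in X], y) for i, nm in enumerate(names)}
# |sens| ranks the inputs: "power" dominates, "enrichment" is nearly inert,
# so an LR surrogate would be fit on the influential parameters only.
```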

  18. Long lasting structural changes in primary motor cortex after motor skill learning: a behavioural and stereological study

    Directory of Open Access Journals (Sweden)

    PAOLA MORALES

    2008-12-01

    Full Text Available Many motor skills, once acquired, are stored over a long time period, probably sustained by permanent neuronal changes. Thus, in this paper we have investigated with quantitative stereology the generation and persistence of neuronal density changes in primary motor cortex (MI) following motor skill learning (skilled reaching task). Rats were trained in a lateralised reaching task during an "early" (22-31 days old) or "late" (362-371 days old) postnatal period. The trained and corresponding control rats were sacrificed at day 372, immediately after the behavioural testing. The "early" trained group preserved the learned skilled reaching task when tested at day 372, without requiring any additional training. The "late" trained group showed a similar capacity to that of the "early" trained group for learning the skilled reaching task. All trained animals ("early" and "late" trained groups) showed a significant interhemispheric decrease of neuronal density in the corresponding motor forelimb representation area of MI (cortical layers II-III)

  19. ITER containment design-assist analysis

    International Nuclear Information System (INIS)

    Nguyen, T.H.

    1992-03-01

    In this report, the analysis methods, models and assumptions used to predict the pressure and temperature transients in the ITER containment following a loss of coolant accident are presented. The ITER reactor building is divided into 10 different volumes (zones) based on their functional design. The base model presented in this report will be modified in volume 2 in order to determine the peak pressure, the required size of openings between various functional zones and the differential pressures on walls separating these zones

  20. Manufacturing system design based on axiomatic design: Case of assembly line

    Energy Technology Data Exchange (ETDEWEB)

    Hager, T.; Wafik, H.; Faouzi, M.

    2017-07-01

    In this paper, a combined Production Line Design (PLD) process which includes many design aspects is presented, developed and validated. Design/methodology/approach: The PLD process is based on the SADT (Structured Analysis and Design Technique) diagram and the Axiomatic Design (AD) method. Practical implications: For the purpose of validation, this proposed process has been applied in a manufacturing company and validated by simulation. Findings: The results of the validation indicated that the production line designed by this process outperformed the initial line of the company. Originality/value: Recently, the problems of production line design (PLD) have attracted the attention of many researchers. However, only a few studies have treated PLD in a way that includes all design aspects. In this work, a combined PLD process is presented. It should be noted that the proposed process is simple and effective.

  1. Manufacturing system design based on axiomatic design: Case of assembly line

    International Nuclear Information System (INIS)

    Hager, T.; Wafik, H.; Faouzi, M.

    2017-01-01

    In this paper, a combined Production Line Design (PLD) process which includes many design aspects is presented, developed and validated. Design/methodology/approach: The PLD process is based on the SADT (Structured Analysis and Design Technique) diagram and the Axiomatic Design (AD) method. Practical implications: For the purpose of validation, the proposed process has been applied in a manufacturing company and validated by simulation. Findings: The results of the validation indicated that the production line designed by this process outperforms the initial line of the company. Originality/value: Recently, the problems of production line design (PLD) have attracted the attention of many researchers. However, only a few studies have treated PLD in a way that includes all design aspects. In this work, a combined PLD process is presented. It should be noted that the proposed process is simple and effective.

  2. Generic Model-Based Tailor-Made Design and Analysis of Biphasic Reaction Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata

    Biphasic reaction systems are composed of immiscible aqueous and organic liquid phases where reactants, products, and catalysts are partitioned. These biphasic conditions point to novel synthesis paths, higher yields, and faster reactions, as well as facilitate product separation. The biphasic systems have a broad range of application, such as the manufacture of petroleum-based chemicals, pharmaceuticals, and agro-bio products. Major considerations in the design and analysis of biphasic reaction systems are physical and chemical equilibria, kinetic mechanisms, and reaction rates. The primary contribution of this thesis is the development of a systematic modelling framework for the biphasic reaction system. The developed framework consists of three modules describing phase equilibria, reactions and mass transfer, and material balances of such processes. Correlative and predictive thermodynamic......

  3. Analysis/design of tensile property database system

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, D. H.; Jeon, I.; Lyu, W. S.

    2001-01-01

    Constructing a database from tensile test data can increase the application of the test results. The database also provides easy access to baseline data when preparing a new experiment, and comparison with previous data can yield higher-quality results. To construct such a database, the analysis and design phases must be carried out in detail, so that the system can then satisfy the various requirements of its users with high quality. In this thesis, the analysis and design were performed to develop a database for tensile extension properties

  4. Furnace and Heat Recovery Area Design and Analysis for Conceptual Design of Supercritical O2-Based PC Boiler

    International Nuclear Information System (INIS)

    Andrew Seltzer

    2006-01-01

    The objective of the furnace and heat recovery area design and analysis task of the Conceptual Design of Supercritical Oxygen-Based PC Boiler study is to optimize the location and design of the furnace, burners, over-fire gas ports, and internal radiant surfaces. The furnace and heat recovery area were designed and analyzed using the FW-FIRE, Siemens, and HEATEX computer programs. The furnace is designed with opposed wall-firing burners and over-fire air ports. Water is circulated in the furnace by forced circulation to the waterwalls at the periphery and divisional wall panels within the furnace. Compared to the air-fired furnace, the oxygen-fired furnace requires only 65% of the surface area and 45% of the volume. Two oxygen-fired designs were simulated: (1) with cryogenic air separation unit (ASU) and (2) with oxygen ion transport membrane (OITM). The maximum wall heat flux in the oxygen-fired furnace is more than double that of the air-fired furnace due to the higher flame temperature and higher H2O and CO2 concentrations. The coal burnout for the oxygen-fired case is 100% due to a 500°F higher furnace temperature and higher concentration of O2. Because of the higher furnace wall temperature of the oxygen-fired case compared to the air-fired case, furnace water wall material was upgraded from T2 to T92. Compared to the air-fired heat recovery area (HRA), the oxygen-fired HRA total heat transfer surface is 35% less for the cryogenic design and 13% less for the OITM design due to more heat being absorbed in the oxygen-fired furnace and the greater molecular weight of the oxygen-fired flue gas. The HRA tube materials and wall thickness are nearly the same for the air-fired and oxygen-fired design since the flue gas and water/steam temperature profiles encountered by the heat transfer banks are similar

  5. A Distributed Feature-based Environment for Collaborative Design

    Directory of Open Access Journals (Sweden)

    Wei-Dong Li

    2003-02-01

    Full Text Available This paper presents a client/server design environment based on 3D feature-based modelling and Java technologies to enable design information to be shared efficiently among members within a design team. In this environment, design tasks and clients are organised through working sessions generated and maintained by a collaborative server. The information from an individual design client during a design process is updated and broadcast to other clients in the same session through an event-driven and call-back mechanism. The downstream manufacturing analysis modules can be wrapped as agents and plugged into the open environment to support the design activities. At the server side, a feature-feature relationship is established and maintained to filter the varied information of a working part, so as to facilitate efficient information update during the design process.

  6. No postnatal doubling of number of neurons in human Broca's areas (Brodmann areas 44 and 45)? A stereological study.

    Science.gov (United States)

    Uylings, H B M; Malofeeva, L I; Bogolepova, I N; Jacobsen, A M; Amunts, K; Zilles, K

    2005-01-01

    In this study we explored whether a postnatal doubling of the total number of neurons occurs in the human Brodmann areas 44 and 45 (Broca's area). We describe the most recent error prediction formulae and their application for the modern stereological estimators for volume and number of neurons. We estimated the number of neurons in 3D optical disector probes systematically random sampled throughout the entire Brodmann areas (BA) 44 and 45 in developing and young adult cases. In the relatively small number of male and female cases studied no substantial postnatal increase in total number of neurons occurred in areas 44 and 45; the volume of these areas reached adult values around 7 years. In addition, we did find indications that a shift from a right-over-left to a left-over-right asymmetry may occur in the volume of BA 45 during postnatal development. No major asymmetry in total number of neurons in BA 44 and 45 was detected.
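
    The optical-disector sampling described above feeds a standard design-based estimator, the optical fractionator, which scales the raw disector counts by the inverse of each sampling fraction. A minimal sketch of that estimator (the formula is the standard one; the counts and fractions below are illustrative, not values from this study):

    ```python
    def optical_fractionator(q_minus_counts, ssf, asf, tsf):
        """Estimate total particle (neuron) number from disector counts.

        q_minus_counts : neurons counted (Q-) in each sampled disector probe
        ssf : section sampling fraction (sections sampled / total sections)
        asf : area sampling fraction (counting-frame area / x-y step area)
        tsf : thickness sampling fraction (disector height / section thickness)
        """
        total_q = sum(q_minus_counts)
        # N = sum(Q-) * (1/ssf) * (1/asf) * (1/tsf)
        return total_q * (1.0 / ssf) * (1.0 / asf) * (1.0 / tsf)

    # Illustrative: 150 neurons counted, every 10th section sampled,
    # asf = 0.05, disector covers half the section thickness.
    n_est = optical_fractionator([150], ssf=0.1, asf=0.05, tsf=0.5)
    print(n_est)  # 60000.0
    ```

    Because each fraction is known by design, the estimate needs no assumptions about particle shape or size, which is what makes the method "design-based".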

  7. Stereological comparison of oocyte recruitment and batch fecundity estimates from paraffin and resin sections using spawning albacore (Thunnus alalunga) ovaries as a case study

    Science.gov (United States)

    Saber, Sámar; Macías, David; Ortiz de Urbina, Josetxu; Kjesbu, Olav Sigurd

    2015-01-01

    Traditional histological protocols in marine fish reproductive laboratories using paraffin as the embedding medium are now increasingly being replaced with protocols using resin instead. These procedures entail different degrees of tissue shrinkage complicating direct comparisons of measurement results across laboratories or articles. In this work we selected ovaries of spawning Mediterranean albacore (Thunnus alalunga) as the subject of our study to address the issue of structural changes, by contrasting values on oocyte recruitment and final batch fecundity given from the same tissue samples in both paraffin and resin. A modern stereological method, the oocyte packing density (OPD) theory, was used supported by initial studies on ovarian tissue sampling and measurement design. Examples of differences in the volume fraction of oocyte stages, free space and connective tissue were found between the embedding media. Mean oocyte diameters were smaller in paraffin than in resin with differences ranging between 0.5% in primary growth and 24.3% in hydration (HYD) stage oocytes. Fresh oocyte measurements showed that oocytes shrank as a consequence of the embedding process, reaching the maximal degree of shrinkage for oocytes in the HYD stage (45.8% in paraffin and 26.5% in resin). In order to assess the effect of oocyte shrinkage on the OPD result, and thereby on relative batch fecundity (Fr), oocyte diameters corrected and uncorrected for shrinkage were used for estimations. Statistically significant differences were found (P based on either oocytes in the germinal vesicle migration stage or HYD stage. As a valuable adjunct, the present use of the OPD theory made it possible to document that the oocyte recruitment of spawning ovaries of Mediterranean albacore followed the typical pattern of an asynchronous oocyte development and indeterminate fecundity.
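
    The two quantities at the centre of this comparison can be sketched in a few lines. The sketch below assumes isotropic linear shrinkage and spherical oocytes, which are simplifying assumptions for illustration, not the paper's exact correction procedure; the input values are likewise illustrative:

    ```python
    import math

    def corrected_diameter(measured_um, shrinkage_fraction):
        """Back-calculate a fresh oocyte diameter from an embedded-tissue
        measurement, assuming isotropic linear shrinkage (an assumption)."""
        return measured_um / (1.0 - shrinkage_fraction)

    def oocyte_packing_density(volume_fraction, mean_diameter_um):
        """Oocytes of a given stage per mm^3 of ovary: the stage's volume
        fraction divided by the mean single-oocyte volume (sphere assumed)."""
        d_mm = mean_diameter_um / 1000.0
        v_single = math.pi / 6.0 * d_mm ** 3  # volume of one oocyte, mm^3
        return volume_fraction / v_single

    # Illustrative: a hydrated oocyte measured at 400 um in paraffin,
    # using the 45.8% maximal shrinkage reported above.
    fresh_d = corrected_diameter(400.0, 0.458)   # ~738 um
    opd = oocyte_packing_density(0.3, fresh_d)   # oocytes per mm^3
    ```

    Because OPD divides by the cube of the diameter, even modest shrinkage differences between paraffin and resin propagate strongly into the fecundity estimate, which is why the correction matters.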

  8. Reliability-based design code calibration for concrete containment structures

    International Nuclear Information System (INIS)

    Han, B.K.; Cho, H.N.; Chang, S.P.

    1991-01-01

    In this study, load combination criteria for design and a probability-based reliability analysis were proposed on the basis of an FEM-based random vibration analysis. The limit state model defined for the study is a serviceability limit state of the crack failure that causes the emission of radioactive materials, and the results are compared with the case of the strength limit state. More accurate reliability analyses under various dynamic loads such as earthquake loads were made possible by incorporating the FEM and random vibration theory, which differs from the conventional reliability analysis method. The uncertainties in loads and resistance available in Korea and the references were adapted to the situation of Korea, and especially in the case of earthquakes, the design earthquake was assessed based on the available data for the probabilistic description of earthquake ground acceleration in the Korean peninsula. SAP V-2 is used for a three-dimensional finite element analysis of the concrete containment structure, and the reliability analysis is carried out by modifying the HRAS reliability analysis program for this study. (orig./GL)

  9. Nuclear component design ontology building based on ASME codes

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan

    2005-01-01

    The adoption of ontology analysis in the study of concept knowledge acquisition and representation for the nuclear component design process based on computer-supported cooperative work (CSCW) makes it possible to share and reuse the extensive concept knowledge of multi-disciplinary domains. A practical ontology building method is accordingly proposed based on the Protege knowledge model, in combination with both top-down and bottom-up approaches together with Formal Concept Analysis (FCA). FCA exhibits its advantages in the way it helps establish and improve the taxonomic hierarchy of concepts and resolve concept conflicts that occur in modeling multi-disciplinary domains. With Protege-3.0 as the ontology building tool, a nuclear component design ontology based on ASME codes is developed by utilizing the ontology building method. The ontology serves as the basis for realizing concept knowledge sharing and reuse in nuclear component design. (authors)

  10. Multi performance option in direct displacement based design

    Directory of Open Access Journals (Sweden)

    Muljati Ima

    2017-01-01

    Full Text Available Compared to the traditional method, direct displacement based design (DDBD) offers a more rational design choice due to its compatibility with performance based design, which is controlled by the targeted displacement in design. The objectives of this study are: (1) to explore the performance of DDBD for design Level-1, -2 and -3; (2) to determine the most appropriate design level based on material efficiency and damage risk; and (3) to verify the chosen design in order to check its performance under small, moderate and severe earthquakes. As case studies, regular concrete frame structures of four and eight stories with a typical plan are used, located in low- and high-risk seismicity areas. The study shows that design Level-2 (repairable damage) is the most appropriate choice. Nonlinear time history analysis is run for each case study in order to verify their performance based on the parameters: story drift, damage indices, and plastic mechanism. It can be concluded that DDBD performed very well in predicting the seismic demand of the observed structures. Design Level-2 can be chosen as the most appropriate design level. Structures are in a safe plastic mechanism under all levels of seismicity, although some plastic hinges formed at some unexpected locations.
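
    The displacement-controlled step at the heart of DDBD is the substitute-structure calculation: an effective stiffness is derived from the effective period read off the displacement spectrum at the target displacement, and the design base shear follows directly. The formula is standard, but the mass, period, and displacement below are illustrative, and the damping reduction of the spectrum is omitted from this sketch:

    ```python
    import math

    def ddbd_base_shear(m_eff_t, t_eff_s, delta_d_m):
        """Substitute-structure step of DDBD.

        m_eff_t   : effective mass in tonnes
        t_eff_s   : effective period (s) at the target displacement
        delta_d_m : target design displacement (m)
        Returns the design base shear in N.
        """
        # K_eff = 4 * pi^2 * m_eff / T_eff^2
        k_eff = 4.0 * math.pi ** 2 * (m_eff_t * 1000.0) / t_eff_s ** 2
        # V_base = K_eff * delta_d
        return k_eff * delta_d_m

    # Illustrative: 400 t effective mass, 2.0 s effective period,
    # 0.25 m target displacement -> base shear of ~987 kN.
    v_base = ddbd_base_shear(400.0, 2.0, 0.25)
    ```

    Because the target displacement enters the calculation first, each design level (Level-1, -2, -3) maps to a different displacement and hence a different base shear, which is what makes the method naturally performance-based.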

  11. Computer-aided system of evaluation for population-based all-in-one service screening (CASE-PASS): from study design to outcome analysis with bias adjustment.

    Science.gov (United States)

    Chen, Li-Sheng; Yen, Amy Ming-Fang; Duffy, Stephen W; Tabar, Laszlo; Lin, Wen-Chou; Chen, Hsiu-Hsi

    2010-10-01

    Population-based routine service screening has gained popularity following an era of randomized controlled trials. The evaluation of these service screening programs is subject to study design, data availability, and the precise data analysis for adjusting bias. We developed a computer-aided system that allows the evaluation of population-based service screening to unify these aspects and facilitate and guide the program assessor to efficiently perform an evaluation. This system underpins two experimental designs: the posttest-only non-equivalent design and the one-group pretest-posttest design and demonstrates the type of data required at both the population and individual levels. Three major analyses were developed that included a cumulative mortality analysis, survival analysis with lead-time adjustment, and self-selection bias adjustment. We used SAS AF software to develop a graphic interface system with a pull-down menu style. We demonstrate the application of this system with data obtained from a Swedish population-based service screen and a population-based randomized controlled trial for the screening of breast, colorectal, and prostate cancer, and one service screening program for cervical cancer with Pap smears. The system provided automated descriptive results based on the various sources of available data and cumulative mortality curves corresponding to the study designs. The comparison of cumulative survival between clinically and screen-detected cases without a lead-time adjustment are also demonstrated. The intention-to-treat and noncompliance analysis with self-selection bias adjustments are also shown to assess the effectiveness of the population-based service screening program. Model validation was composed of a comparison between our adjusted self-selection bias estimates and the empirical results on effectiveness reported in the literature. We demonstrate a computer-aided system allowing the evaluation of population-based service screening

  12. On the analysis of genome-wide association studies in family-based designs: a universal, robust analysis approach and an application to four genome-wide association studies.

    Directory of Open Access Journals (Sweden)

    Sungho Won

    2009-11-01

    Full Text Available For genome-wide association studies in family-based designs, we propose a new, universally applicable approach. The new test statistic exploits all available information about the association, while, by virtue of its design, it maintains the same robustness against population admixture as traditional family-based approaches that are based exclusively on the within-family information. The approach is suitable for the analysis of almost any trait type, e.g. binary, continuous, time-to-onset, multivariate, etc., and combinations of those. We use simulation studies to verify all theoretically derived properties of the approach, estimate its power, and compare it with other standard approaches. We illustrate the practical implications of the new analysis method by an application to a lung-function phenotype, forced expiratory volume in one second (FEV1), in 4 genome-wide association studies.

  13. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    Science.gov (United States)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  14. Probability based load combinations for design of category I structures

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1985-01-01

    This paper discusses a reliability analysis method and a procedure for developing the load combination design criteria for category I structures. For safety evaluation of category I concrete structures under various static and dynamic loads, a probability-based reliability analysis method has been developed. This reliability analysis method is also used as a tool for determining the load factors for design of category I structures. In this paper, the load combinations for design of concrete containments, corresponding to a target limit state probability of 1.0 x 10^-6 in 4 years, are described. A comparison of containments designed using the ASME code and the proposed design criteria is also presented
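
    Target limit state probabilities like the 1.0 x 10^-6 in 4 years quoted above are often restated on an annual basis for comparison across codes. Under the common (and here assumed) simplification of independent years, the conversion is:

    ```python
    def annual_limit_state_probability(p_t, years):
        """Convert a limit-state probability over a multi-year window to an
        equivalent annual probability, assuming independent years:
        p_annual = 1 - (1 - p_t)^(1/years)."""
        return 1.0 - (1.0 - p_t) ** (1.0 / years)

    # The 4-year target above, restated per year:
    p_annual = annual_limit_state_probability(1.0e-6, 4)
    print(p_annual)  # ~2.5e-7 per year
    ```

    For probabilities this small the result is essentially p_t / years, since the higher-order terms of the expansion are negligible.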

  15. A model-based meta-analysis of monoclonal antibody pharmacokinetics to guide optimal first-in-human study design

    Science.gov (United States)

    Davda, Jasmine P; Dodds, Michael G; Gibbs, Megan A; Wisdom, Wendy; Gibbs, John P

    2014-01-01

    The objectives of this retrospective analysis were (1) to characterize the population pharmacokinetics (popPK) of four different monoclonal antibodies (mAbs) in a combined analysis of individual data collected during first-in-human (FIH) studies and (2) to provide a scientific rationale for prospective design of FIH studies with mAbs. The data set was composed of 171 subjects contributing a total of 2716 mAb serum concentrations, following intravenous (IV) and subcutaneous (SC) doses. mAb PK was described by an open 2-compartment model with first-order elimination from the central compartment and a depot compartment with first-order absorption. Parameter values obtained from the popPK model were further used to generate optimal sampling times for a single dose study. A robust fit to the combined data from four mAbs was obtained using the 2-compartment model. Population parameter estimates for systemic clearance and central volume of distribution were 0.20 L/day and 3.6 L with intersubject variability of 31% and 34%, respectively. The random residual error was 14%. Differences (> 2-fold) in PK parameters were not apparent across mAbs. Rich designs (22 samples/subject), minimal designs for popPK (5 samples/subject), and optimal designs for non-compartmental analysis (NCA) and popPK (10 samples/subject) were examined by stochastic simulation and estimation. Single-dose PK studies for linear mAbs executed using the optimal designs are expected to yield high-quality model estimates, and accurate capture of NCA estimations. This model-based meta-analysis has determined typical popPK values for four mAbs with linear elimination and enabled prospective optimization of FIH study designs, potentially improving the efficiency of FIH studies for this class of therapeutics. PMID:24837591
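
    The structural model described above can be sketched directly. The sketch uses the reported population clearance (0.20 L/day) and central volume (3.6 L); the peripheral-compartment parameters and absorption rate constant are illustrative assumptions, and a plain Euler integrator stands in for a popPK estimation engine:

    ```python
    def simulate_2cmt_sc(dose_mg, cl=0.20, vc=3.6, q=0.5, vp=2.7, ka=0.25,
                         days=60.0, dt=0.01):
        """Euler simulation of a 2-compartment mAb model with first-order
        SC absorption and first-order elimination from the central
        compartment. cl (L/day) and vc (L) are the population values
        reported above; q, vp and ka are illustrative assumptions.
        Returns a list of (time_days, central_concentration_mg_per_L)."""
        a_depot, a_c, a_p = dose_mg, 0.0, 0.0  # amounts in mg
        t, conc = 0.0, []
        for _ in range(int(days / dt)):
            dd = -ka * a_depot                                  # depot loss
            dc = (ka * a_depot                                  # absorption
                  - (cl / vc) * a_c                             # elimination
                  - (q / vc) * a_c + (q / vp) * a_p)            # distribution
            dp = (q / vc) * a_c - (q / vp) * a_p
            a_depot += dd * dt
            a_c += dc * dt
            a_p += dp * dt
            t += dt
            conc.append((t, a_c / vc))
        return conc

    profile = simulate_2cmt_sc(300.0)
    cmax = max(c for _, c in profile)
    ```

    Simulating such profiles at candidate sampling times is exactly the kind of stochastic simulation-and-estimation exercise the authors used to compare rich, minimal, and optimal FIH designs.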

  16. Degeneration and regeneration of motor and sensory nerves: a stereological study of crush lesions in rat facial and mental nerves

    DEFF Research Database (Denmark)

    Barghash, Ziad; Larsen, Jytte Overgaard; Al-Bishri, Awad

    2013-01-01

    The aim of this study was to evaluate the degeneration and regeneration of a sensory nerve and a motor nerve at the histological level after a crush injury. Twenty-five female Wistar rats had their mental nerve and the buccal branch of their facial nerve compressed unilaterally against a glass rod...... for 30 s. Specimens of the compressed nerves and the corresponding control nerves were dissected at 3, 7, and 19 days after surgery. Nerve cross-sections were stained with osmium tetroxide and toluidine blue and analysed using two-dimensional stereology. We found differences between the two nerves both...... in the normal anatomy and in the regenerative pattern. The mental nerve had a larger cross-sectional area including all tissue components. The mental nerve had a larger volume fraction of myelinated axons and a correspondingly smaller volume fraction of endoneurium. No differences were observed...

  17. Toward design-based engineering of industrial microbes.

    Science.gov (United States)

    Tyo, Keith E J; Kocharin, Kanokarn; Nielsen, Jens

    2010-06-01

    Engineering industrial microbes has been hampered by incomplete knowledge of cell biology. Thus an iterative engineering cycle of modeling, implementation, and analysis has been used to increase knowledge of the underlying biology while achieving engineering goals. Recent advances in Systems Biology technologies have drastically improved the amount of information that can be collected in each iteration. As well, Synthetic Biology tools are melding modeling and molecular implementation. These advances promise to move microbial engineering from the iterative approach to a design-oriented paradigm, similar to electrical circuits and architectural design. Genome-scale metabolic models, new tools for controlling expression, and integrated -omics analysis are described as key contributors in moving the field toward Design-based Engineering. Copyright 2010 Elsevier Ltd. All rights reserved.

  18. Development of design and safety analysis supporting system for casks

    International Nuclear Information System (INIS)

    Ohsono, Katsunari; Higashino, Akira; Endoh, Shuji

    1993-01-01

    Mitsubishi Heavy Industries has developed a design and safety analysis supporting system, 'CADDIE' (Cask Computer Aided Design, Drawing and Integrated Evaluation System), with the following objectives: (1) enhancement of the efficiency of design and safety analysis; (2) further advancement of design quality; (3) response to the diversification of design requirements. The features of this system are as follows: (1) The analysis model data common to all analyses is established, and it is prepared automatically from the model made by CAD. (2) The input data for the analysis codes can be generated from the analysis model data through simple interactive operations. (3) The analysis results are drawn out in diagrams by an output generator, so as to facilitate easy observation. (4) The data on material properties, fuel assembly data, etc. required for the analyses are made available as a database. (J.P.N.)

  19. Muscular dystrophy-related quantitative and chemical changes in adenohypophysis GH-cells in golden retrievers

    DEFF Research Database (Denmark)

    de Lima, A R; Nyengaard, Jens Randel; Jorge, A A L

    2007-01-01

    investigated the morphological aspects of the adenohypophysis as well as the total number and size of GH-granulated cells using design-based stereological methods in a limited number of dystrophic and healthy golden retrievers. GH-cells were larger (32.4%) in dystrophic dogs than in healthy animals (p=0...

  20. Design-for-analysis or the unintended role of analysis in the design of piping systems

    International Nuclear Information System (INIS)

    Antaki, G.A.

    1991-01-01

    The paper discusses the evolution of piping design in the nuclear industry with its increasing reliance on dynamic analysis. While it is well recognized that the practice has evolved from ''design-by-rule'' to ''design-by-analysis,'' examples are provided of cases where the choice of analysis technique has determined the hardware configuration, which could be called ''design-for-analysis.'' The paper presents practical solutions to some of these cases and summarizes the important recent industry and regulatory developments which, if successful, will reverse the trend towards ''design-for-analysis.'' 14 refs

  1. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk

    2014-01-01

    This book provides senior undergraduate students, master students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and seismic design of reinforced concrete structures. Helpful worked examples on seismic analysis of linear, nonlinear and base isolated buildings, earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  2. Quantitative Analysis of Ductile Iron Microstructure – A Comparison of Selected Methods for Assessment

    Directory of Open Access Journals (Sweden)

    Mrzygłód B.

    2013-09-01

    Full Text Available Stereological description of a dispersed microstructure is not an easy task and remains the subject of continuous research. In its practical aspect, a correct stereological description of this type of structure is essential for the analysis of coagulation and spheroidisation processes, and for studies of relationships between structure and properties. One of the most frequently used methods for estimating the density Nv and size distribution of particles is the Scheil - Schwartz - Saltykov method. In this article, the authors present selected methods for quantitative assessment of ductile iron microstructure: the Scheil - Schwartz - Saltykov method, which allows a quantitative description of three-dimensional sets of solids using measurements and counts performed on two-dimensional cross-sections of these sets (microsections), and quantitative description of three-dimensional sets of solids by X-ray computed microtomography, which is an interesting alternative to traditional methods of microstructure imaging for structural studies, since the analysis provides three-dimensional imaging of the examined microstructures.
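
    The Scheil - Schwartz - Saltykov method unfolds a histogram of 2-D section diameters into a 3-D size distribution using tabulated coefficients. The full multi-class unfolding is beyond a short sketch, but the underlying stereological relation it builds on is simple: for monosized spheres, the number of particles per unit volume equals the number of profiles per unit section area divided by the sphere diameter. A minimal illustration of that relation (values are illustrative, not from the article):

    ```python
    def nv_from_na(n_a_per_mm2, sphere_diameter_mm):
        """For monosized spheres, particle density per unit volume:
        N_V = N_A / D, where N_A is profiles per unit section area and D is
        the sphere diameter (the mean caliper diameter of a sphere)."""
        return n_a_per_mm2 / sphere_diameter_mm

    # Illustrative: 50 graphite-nodule profiles per mm^2 on a microsection,
    # nodule diameter 0.025 mm -> 2000 nodules per mm^3.
    print(nv_from_na(50.0, 0.025))  # 2000.0
    ```

    The Saltykov method generalizes this by splitting the profile histogram into size classes and subtracting the contributions that large spheres make to small-profile classes, which is why it needs the coefficient table.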

  3. SLS Model Based Design: A Navigation Perspective

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin

    2018-01-01

    The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.

  4. An empirical analysis of the precision of estimating the numbers of neurons and glia in human neocortex using a fractionator-design with sub-sampling

    DEFF Research Database (Denmark)

    Lyck, L.; Santamaria, I.D.; Pakkenberg, B.

    2009-01-01

    Improving histomorphometric analysis of the human neocortex by combining stereological cell counting with immunohistochemical visualisation of specific neuronal and glial cell populations is a methodological challenge. To enable standardized immunohistochemical staining, the amount of brain tissue...... to be stained and analysed by cell counting was efficiently reduced using a fractionator protocol involving several steps of sub-sampling. Since no mathematical or statistical tools exist to predict the variance originating from repeated sampling in complex structures like the human neocortex, the variance....... The results showed that it was possible, but not straightforward, to combine immunohistochemistry and the optical fractionator for estimation of specific subpopulations of brain cells in human neocortex.

  5. Design-Based Research

    DEFF Research Database (Denmark)

    Gynther, Karsten; Christensen, Ove; Petersen, Trine Brun

    2012-01-01

    This article introduces Design-Based Research for the first time in Danish in a scientific journal. The article presents the basic assumptions underlying the Design-Based Research tradition and discusses the principles guiding the execution of a DBR research project. Taking the research and development project ELYK (E-learning, Peripheral Regions and Cluster Formation) as its point of departure, the article presents the innovation model that the project has developed on the basis of the Design-Based Research tradition. ELYK's DBR innovation model has proven effective with respect to...

  6. Uncertainty analysis and design optimization of hybrid rocket motor powered vehicle for suborbital flight

    Directory of Open Access Journals (Sweden)

    Zhu Hao

    2015-06-01

    Full Text Available In this paper, we propose an uncertainty analysis and design optimization method and its application to a hybrid rocket motor (HRM) powered vehicle. The multidisciplinary design model of the rocket system is established and the design uncertainties are quantified. The sensitivity analysis of the uncertainties shows that the uncertainty generated from the error of the fuel regression rate model has the most significant effect on the system performances. Then the differences between deterministic design optimization (DDO) and uncertainty-based design optimization (UDO) are discussed. Two newly formulated uncertainty analysis methods, the Kriging-based Monte Carlo simulation (KMCS) and the Kriging-based Taylor series approximation (KTSA), are carried out using a global approximation Kriging modeling method. Based on the system design model and the results of the design uncertainty analysis, the design optimization of an HRM powered vehicle for suborbital flight is implemented using three design optimization methods: DDO, KMCS and KTSA. The comparisons indicate that the two UDO methods can enhance design reliability and robustness. The research and methods proposed in this paper can provide a better way for the general design of HRM powered vehicles.
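
    The core of the KMCS idea, stripped of the Kriging surrogate, is Monte Carlo propagation of an input uncertainty (here, the fuel regression rate error flagged as dominant) through a performance model. The sketch below uses a toy power-law model as a stand-in for both the vehicle model and the Kriging surrogate; the model form and numbers are assumptions for illustration only:

    ```python
    import random

    def monte_carlo_uncertainty(model, nominal, sigma_rel, n=10000, seed=1):
        """Plain Monte Carlo propagation of a relative Gaussian input
        uncertainty through a performance model (a stand-in for the Kriging
        surrogate used in KMCS). Returns (mean, standard deviation)."""
        rng = random.Random(seed)
        samples = [model(nominal * (1.0 + rng.gauss(0.0, sigma_rel)))
                   for _ in range(n)]
        mean = sum(samples) / n
        var = sum((s - mean) ** 2 for s in samples) / (n - 1)
        return mean, var ** 0.5

    # Toy performance model: output roughly proportional to regression
    # rate^0.8 (an assumed relationship, not the paper's model).
    mean_f, sd_f = monte_carlo_uncertainty(lambda r: 1000.0 * r ** 0.8,
                                           nominal=1.0, sigma_rel=0.05)
    ```

    Replacing the toy lambda with a Kriging surrogate fitted to a handful of expensive simulations is what turns this brute-force loop into the KMCS approach, since the surrogate makes the 10,000 evaluations cheap.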

  7. A Morphogenetic Design Approach with Embedded Structural Analysis

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Kirkegaard, Poul Henning; Holst, Malene Kirstine

    2010-01-01

    The present paper explores a morphogenetic design approach with embedded structural analysis for architectural design. A material system based on a combined space truss and membrane system has been derived as a growth system with inspiration from the natural growth of plants. The structural system...... is capable of adding new elements based on a structural analysis of the existing components and their internal stress levels. A GA decision-making procedure that controls the generation of the growth cycles is introduced. This evaluation and generation loop is capable of successfully making decisions based...... on several, and often conflicting, inputs formulated from architectural requirements. An experiment with a tri-pyramid component has been considered, but many other space truss systems could be explored in the same manner and result in highly performative outcomes, not only with respect to the structural...
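
The GA decision loop described above can be caricatured with a tiny genetic algorithm: a binary genome marks which candidate elements to "grow", and fitness trades a stiffness gain against added mass, i.e. conflicting inputs. Both objective models and all numbers below are made-up stand-ins, not the paper's structural analysis.

```python
import random

random.seed(2)

N = 10  # candidate growth slots in the (hypothetical) truss
stiffness = [random.uniform(0.0, 1.0) for _ in range(N)]  # gain per slot
mass = [random.uniform(0.1, 0.5) for _ in range(N)]       # cost per slot

def fitness(genome, w=0.6):
    """Weighted trade-off between two conflicting objectives."""
    s = sum(g * v for g, v in zip(genome, stiffness))
    m = sum(g * v for g, v in zip(genome, mass))
    return w * s - (1 - w) * m

def evolve(pop_size=30, gens=40, p_mut=0.1):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N)      # one-point crossover
            child = a[:cut] + b[cut:]
            children.append([1 - g if random.random() < p_mut else g
                             for g in child])
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, round(fitness(best), 3))
```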

  8. Structured Performance Analysis for Component Based Systems

    OpenAIRE

    Salmi , N.; Moreaux , Patrice; Ioualalen , M.

    2012-01-01

    The Component Based System (CBS) paradigm is now widely used to design software systems. In addition, performance and behavioural analysis remains a required step for the design and construction of efficient systems. This is especially the case for CBS, which involve interconnected components running concurrent processes. This paper proposes a compositional method for modeling and structured performance analysis of CBS. Modeling is based on Stochastic Well-formed...
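
Stochastic Well-formed nets are ultimately analyzed through an underlying continuous-time Markov chain (CTMC). As a minimal sketch of that final step, the steady-state distribution pi solves pi·Q = 0 with sum(pi) = 1; the three-state birth-death generator below is a hypothetical toy, not a real component model.

```python
# Generator matrix Q for a 3-state birth-death CTMC (toy component queue):
lam, mu = 1.0, 2.0
Q = [[-lam, lam, 0.0],
     [mu, -(lam + mu), lam],
     [0.0, mu, -mu]]

# Solve pi Q = 0 with sum(pi) = 1: transpose Q, replace one row by ones.
n = 3
A = [[Q[j][i] for j in range(n)] for i in range(n)]
A[n - 1] = [1.0] * n
b = [0.0, 0.0, 1.0]
# tiny Gaussian elimination with partial pivoting
for c in range(n):
    p = max(range(c, n), key=lambda r: abs(A[r][c]))
    A[c], A[p] = A[p], A[c]
    b[c], b[p] = b[p], b[c]
    for r in range(c + 1, n):
        f = A[r][c] / A[c][c]
        for k in range(c, n):
            A[r][k] -= f * A[c][k]
        b[r] -= f * b[c]
pi = [0.0] * n
for r in range(n - 1, -1, -1):
    pi[r] = (b[r] - sum(A[r][k] * pi[k] for k in range(r + 1, n))) / A[r][r]
print([round(p, 4) for p in pi])  # throughput etc. derive from pi
```

Performance indices such as throughput and utilization are then weighted sums over pi; structured methods exploit the net's symmetries to shrink this chain before solving.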

  9. Design and performance analysis of delay insensitive multi-ring structures

    DEFF Research Database (Denmark)

    Sparsø, Jens; Staunstrup, Jørgen

    1993-01-01

    A set of simple design and performance analysis techniques that have been successfully used to design a number of nontrivial delay insensitive circuits is described. Examples are building blocks for digital filters and a vector multiplier using a serial-parallel multiply-and-accumulate algorithm....... The vector multiplier circuit has been laid out, submitted for fabrication and successfully tested. Throughout the analysis, elements from this design are used to illustrate the design and performance analysis techniques. The design technique is based on a data flow approach using pipelines and rings...... that are composed into larger multiring structures by joining and forking of signals. By limiting designs to this class of structures, it is possible, even for complex designs, to analyze the performance and establish an understanding of the bottlenecks....

  10. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    Science.gov (United States)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high altitude long endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for the design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  11. Seismic design and performance of nuclear safety related RC structures based on new seismic design principle

    International Nuclear Information System (INIS)

    Murugan, R.; Sivathanu Pillai, C.; Chattopadhyaya, S.; Sundaramurthy, C.

    2011-01-01

    Full text: Seismic design of safety-related Reinforced Concrete (RC) structures of Nuclear Power Plants (NPPs) in India, as per the present AERB codal procedures, aims to ensure predominantly elastic behaviour under OBE, so that the features of the NPP necessary for continued safe operation remain functional, and to prevent collapse of the NPP under SSE, under which certain Structures, Systems and Components (SSCs) necessary to shut down the reactor safely are designed to remain functional. In contrast, the seismic design principles for non-safety-related structures as per the Indian code (IS 1893-2002) ensure elastic behaviour under DBE and inelastic behaviour under MCE by effectively utilizing the ductility and energy dissipation capacity of the structure. The design principle of the AERB code thus ensures elastic behaviour under OBE but gives little insight into the overall structural behaviour under SSE (only ensuring the capability of certain SSCs required for safe shutdown of the reactor). Various buildings and structures of Indian nuclear power plants are classified on the basis of their associated safety functions, in descending order in accordance with their roles in the prevention and mitigation of an accident or in support functions for prevention. This paper covers a comprehensive seismic analysis and design methodology based on the AERB codal provisions for a safety-related RC structure, taking the Diesel Generator Building of PFBR as a case study, and investigates its performance under OBE and SSE by carrying out non-linear static pushover analysis. Based on the analysis and the observed variations, recommendations are given for achieving the desired performance level so as to implement performance-based design in future NPP design

  12. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  13. Creative design-by-analysis solutions applied to high-temperature components

    International Nuclear Information System (INIS)

    Dhalla, A.K.

    1993-01-01

    Elevated temperature design has evolved over the last two decades from design-by-formula philosophy of the ASME Boiler and Pressure Vessel Code, Sections I and VIII (Division 1), to the design-by-analysis philosophy of Section III, Code Case N-47. The benefits of design-by-analysis procedures, which were developed under a US-DOE-sponsored high-temperature structural design (HTSD) program, are illustrated in the paper through five design examples taken from two U.S. liquid metal reactor (LMR) plants. Emphasis in the paper is placed upon the use of a detailed, nonlinear finite element analysis method to understand the structural response and to suggest design optimization so as to comply with Code Case N-47 criteria. A detailed analysis is cost-effective, if selectively used, to qualify an LMR component for service when long-lead-time structural forgings, procured based upon simplified preliminary analysis, do not meet the design criteria, or the operational loads are increased after the components have been fabricated. In the future, the overall costs of a detailed analysis will be reduced even further with the availability of finite element software used on workstations or PCs

  14. Cost and performance analysis of conceptual designs of physical protection systems

    International Nuclear Information System (INIS)

    Hicks, M.J.; Snell, M.S.; Sandoval, J.S.; Potter, C.S.

    1998-01-01

    CPA -- Cost and Performance Analysis -- is a methodology that joins Activity Based Cost (ABC) estimation with performance-based analysis of physical protection systems. CPA offers system managers an approach that supports both tactical decision making and strategic planning. Current exploratory applications of the CPA methodology are addressing the analysis of alternative conceptual designs. To support these activities, the original architecture for CPA is being expanded to incorporate results from a suite of performance and consequence analysis tools such as JTS (Joint Tactical Simulation), ERAD (Explosive Release Atmospheric Dispersion) and blast effect models. The process flow for applying CPA to the development and analysis of conceptual designs is illustrated graphically

  15. Web-based Core Design System Development

    International Nuclear Information System (INIS)

    Moon, So Young; Kim, Hyung Jin; Yang, Sung Tae; Hong, Sun Kwan

    2011-01-01

    The selection of a loading pattern is one of the core design processes in the operation of a nuclear power plant. A potential new loading pattern is identified by selecting fuels that do not exceed the major limiting factors of the design and that satisfy the core design conditions, employing fuel data from the existing loading pattern of the current operating cycle. The selection of a loading pattern is also related to the cycle plan of an operating nuclear power plant and must meet safety and economic requirements. In selecting an appropriate loading pattern, all steps, such as input creation, code runs and result processing, are handled manually in text form by a designer, all of which may be subject to human error, such as syntax or running errors. Time-consuming results analysis and decision-making processes are the most significant inefficiencies to avoid. A web-based nuclear plant core design system was developed here to remedy the shortcomings of an existing core design system. The proposed system adopts the general methodology of OPR1000 (Korea Standard Nuclear Power Plants) and Westinghouse-type plants. Additionally, it offers a GUI (Graphic User Interface)-based core design environment with a user-friendly interface for operators. It reduces human errors related to design model creation, computation, final reload core model selection, final output confirmation, and result data validation and verification. Most significantly, it reduces the core design time by more than 75% compared to its predecessor

  16. Electrical Steering of Vehicles - Fault-tolerant Analysis and Design

    DEFF Research Database (Denmark)

    Blanke, Mogens; Thomsen, Jesper Sandberg

    2006-01-01

    The topic of this paper is systems that need to be designed such that no single fault can cause failure at the overall level. A methodology is presented for analysis and design of fault-tolerant architectures, where diagnosis and autonomous reconfiguration can replace high-cost triple-redundancy...... solutions and still meet strict requirements to functional safety. The paper applies graph-based analysis of functional system structure to find a novel fault-tolerant architecture for an electrical steering where a dedicated AC-motor design and cheap voltage measurements ensure the ability to detect all......

  17. Taming Human Genetic Variability: Transcriptomic Meta-Analysis Guides the Experimental Design and Interpretation of iPSC-Based Disease Modeling

    Directory of Open Access Journals (Sweden)

    Pierre-Luc Germain

    2017-06-01

    Full Text Available Both the promises and pitfalls of the cell reprogramming research platform rest on human genetic variation, making the measurement of its impact one of the most urgent issues in the field. Harnessing large transcriptomics datasets of induced pluripotent stem cells (iPSC, we investigate the implications of this variability for iPSC-based disease modeling. In particular, we show that the widespread use of more than one clone per individual in combination with current analytical practices is detrimental to the robustness of the findings. We then proceed to identify methods to address this challenge and leverage multiple clones per individual. Finally, we evaluate the specificity and sensitivity of different sample sizes and experimental designs, presenting computational tools for power analysis. These findings and tools reframe the nature of replicates used in disease modeling and provide important resources for the design, analysis, and interpretation of iPSC-based studies.
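
The point about clones as replicates can be illustrated with a simulation under the null hypothesis: treating clones from the same donor as independent replicates inflates the false-positive rate, while averaging clones per donor keeps it near nominal. The variance components and sample sizes below are hypothetical, and the fixed critical value 2.0 is a rough stand-in for a proper t threshold.

```python
import random
import statistics as st

random.seed(1)

def simulate(n_ind=6, n_clones=3, sd_ind=1.0, sd_clone=0.5):
    """One null dataset: two groups of donors, no true group effect."""
    groups = []
    for _ in range(2):
        inds = []
        for _ in range(n_ind):
            mu = random.gauss(0, sd_ind)  # donor-specific effect
            inds.append([random.gauss(mu, sd_clone) for _ in range(n_clones)])
        groups.append(inds)
    return groups

def welch_t(a, b):
    va, vb = st.variance(a) / len(a), st.variance(b) / len(b)
    return (st.mean(a) - st.mean(b)) / (va + vb) ** 0.5

def false_positive_rate(level, n_sim=2000, crit=2.0):
    hits = 0
    for _ in range(n_sim):
        g1, g2 = simulate()
        if level == "clone":   # clones as independent replicates (wrong)
            a = [x for ind in g1 for x in ind]
            b = [x for ind in g2 for x in ind]
        else:                  # average clones per donor first (right)
            a = [st.mean(ind) for ind in g1]
            b = [st.mean(ind) for ind in g2]
        if abs(welch_t(a, b)) > crit:
            hits += 1
    return hits / n_sim

fpr_clone = false_positive_rate("clone")
fpr_donor = false_positive_rate("donor")
print(fpr_clone, fpr_donor)
```

The same simulation loop, run with a true group effect added, gives the power estimates that drive sample-size choices.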

  18. Histopathological, immunohistochemical, and stereological analysis of the effect of Ginkgo biloba (Egb761) on the hippocampus of rats exposed to long-term cellphone radiation.

    Science.gov (United States)

    Gevrek, Fikret

    2018-05-01

    Cellular phones are major sources of electromagnetic radiation (EMR) that can penetrate the human body and pose serious health hazards. The increasingly widespread use of mobile communication systems has raised concerns about the effects of cellphone radiofrequency (RF) on the hippocampus because of its close proximity to radiation during cellphone use. This study aimed to investigate the effects of cellphone EMR exposure on the hippocampus of rats and the possible counteractive effects of Ginkgo biloba (Egb761). Rats were divided into three groups: Control, EMR, and EMR+Egb761. The EMR and EMR+Egb761 groups were exposed to cellphone EMR for one month. Egb761 was also administered to the EMR+Egb761 group. Specifically, we evaluated the effect of RF exposure on rat hippocampi at a harmful EMR level (0.96 W/kg specific absorption rate [SAR]) for one month and also investigated the possible impact of Ginkgo biloba (Egb761) using stereological, TUNEL-staining, and immunohistochemical methods. An increase in apoptotic proteins (Bax, Acas-3) and a decrease in anti-apoptotic protein (Bcl-2) immunoreactivity, along with a decrease in the total granule and pyramidal cell count, were noted in the EMR group. A decrease in Bax and Acas-3 and an increase in Bcl-2 immunoreactivity were observed in rats treated with Egb761, in addition to a decrease in TUNEL-stained apoptotic cells and a higher total viable cell number. In conclusion, chronic cellphone EMR exposure may affect hippocampal cell viability, and Egb761 may be used to mitigate some of the deleterious effects.

  19. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  20. Stereological estimation of surface area and barrier thickness of fish gills in vertical sections.

    Science.gov (United States)

    Da Costa, Oscar T F; Pedretti, Ana Carolina E; Schmitz, Anke; Perry, Steven F; Fernandes, Marisa N

    2007-01-01

    Previous morphometric methods for estimation of the volume of components, surface area and thickness of the diffusion barrier in fish gills have taken advantage of the highly ordered structure of these organs for sampling and surface area estimations, whereas the thickness of the diffusion barrier has been measured orthogonally on perpendicularly sectioned material at subjectively selected sites. Although intuitively logical, these procedures do not have a demonstrated mathematical basis, do not involve random sampling and measurement techniques, and are not applicable to the gills of all fish. The present stereological methods apply the principles of surface area estimation in vertical uniform random sections to the gills of the Brazilian teleost Arapaima gigas. The tissue was taken from the entire gill apparatus of the right-hand or left-hand side (selected at random) of the fish by systematic random sampling and embedded in glycol methacrylate for light microscopy. Arches from the other side were embedded in Epoxy resin. Reference volume was estimated by the Cavalieri method in the same vertical sections that were used for surface density and volume density measurements. The harmonic mean barrier thickness of the water-blood diffusion barrier was calculated from measurements taken along randomly selected orientation lines that were sine-weighted relative to the vertical axis. The values thus obtained for the anatomical diffusion factor (surface area divided by barrier thickness) compare favourably with those obtained for other sluggish fish using existing methods.
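
The two quantities at the end of the abstract reduce to short formulas: the harmonic mean of the measured intercept lengths, and the anatomical diffusion factor (ADF) as surface area divided by barrier thickness. The intercept lengths and surface area below are hypothetical, and the 2/3 factor converting harmonic mean intercept length to harmonic mean thickness is a common morphometric convention; check it against the method actually used before relying on it.

```python
# Harmonic mean barrier thickness and anatomical diffusion factor (ADF)
# from intercept lengths l_i measured along sine-weighted test lines.
def harmonic_mean(lengths):
    return len(lengths) / sum(1.0 / l for l in lengths)

# hypothetical intercept lengths (um) across the water-blood barrier
intercepts = [0.8, 1.1, 0.9, 2.5, 1.4, 1.0, 3.2, 0.7]
# common convention: harmonic mean thickness = (2/3) * harmonic mean
# intercept length (assumption; verify against the study's protocol)
tau_h = (2.0 / 3.0) * harmonic_mean(intercepts)
surface_area_cm2 = 1200.0          # hypothetical total gill surface area
adf = surface_area_cm2 / tau_h     # surface area / barrier thickness
print(round(tau_h, 3), round(adf, 1))
```

Note that the harmonic mean is always below the arithmetic mean, so thin spots of the barrier dominate, which is exactly the behaviour wanted for a diffusion-limited quantity.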

  1. Design-Based Research in Science Education: One Step Towards Methodology

    Directory of Open Access Journals (Sweden)

    Kalle Juuti

    2012-10-01

    Full Text Available Recently, there have been critiques of science education research, as the potential of this research has not been actualised in science teaching and learning praxis. The paper describes an analysis of the design-based research approach (DBR) that has been suggested as a solution for the discontinuity between science education research and praxis. We propose that a pragmatic frame helps to clarify the design-based research endeavour. We abstracted three aspects from the analysis that constitute design-based research: (a) a design process is essentially iterative, starting from the recognition of the change of the environment of praxis; (b) it generates a widely usable artefact; and (c) it provides educational knowledge for more intelligible praxis. In the knowledge acquisition process, the pragmatic viewpoint emphasises the role of a teacher's reflected actions as well as the researchers' involvement in the authentic teaching and learning settings.

  2. Hippocampal MR volumetry

    Science.gov (United States)

    Haller, John W.; Botteron, K.; Brunsden, Barry S.; Sheline, Yvette I.; Walkup, Ronald K.; Black, Kevin J.; Gado, Mokhtar; Vannier, Michael W.

    1994-09-01

    Goal: To estimate hippocampal volumes from in vivo 3D magnetic resonance (MR) brain images and determine inter-rater and intra-rater repeatability. Objective: The precision and repeatability of hippocampal volume estimates using stereologic measurement methods is sought. Design: Five normal control and five schizophrenic subjects were MR scanned using a MPRAGE protocol. Fixed grid stereologic methods were used to estimate hippocampal volumes on a graphics workstation. The images were preprocessed using histogram analysis to standardize 3D MR image scaling from 16 to 8 bits and image volumes were interpolated to 0.5 mm3 isotropic voxels. The following variables were constant for the repeated stereologic measures: grid size, inter-slice distance (1.5 mm), voxel dimensions (0.5 mm3), number of hippocampi measured (10), total number of measurements per rater (40), and number of raters (5). Two grid sizes were tested to determine the coefficient of error associated with the number of sampled 'hits' (approximately 140 and 280) on the hippocampus. Starting slice and grid position were randomly varied to assure unbiased volume estimates. Raters were blind to subject identity, diagnosis, and side of the brain from which the image volumes were extracted, and the order of subject presentation was randomized for each of the raters. Inter- and intra-rater intraclass correlation coefficients (ICC) were determined. Results: The data indicate excellent repeatability of fixed grid stereologic hippocampal volume measures when using an inter-slice distance of 1.5 mm and a 6.25 mm2 grid (inter-rater ICCs = 0.86 - 0.97, intra-rater ICCs = 0.85 - 0.97). One major advantage of the current study was the use of 3D MR data, which significantly improved visualization of hippocampal boundaries by providing the ability to access simultaneous orthogonal views while counting stereological marks within the hippocampus. Conclusion: Stereological estimates of 3D volumes from 2D MR
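
The fixed-grid method here is the Cavalieri estimator: volume = inter-slice distance x area per grid point x total number of grid points hitting the structure. The grid constants below match those stated in the abstract (1.5 mm inter-slice distance, 6.25 mm2 grid, roughly 140 hits), but the per-slice counts are invented for illustration.

```python
# Cavalieri estimator: V = d * a_p * sum(P_i), where d is the inter-slice
# distance, a_p the area per grid point, and P_i the hits on slice i.
def cavalieri_volume(hits_per_slice, d_mm=1.5, a_p_mm2=6.25):
    return d_mm * a_p_mm2 * sum(hits_per_slice)

# hypothetical point counts over one hippocampus on successive MR slices
hits = [4, 10, 15, 19, 22, 21, 18, 14, 10, 7]
v = cavalieri_volume(hits)  # mm^3
print(v)
```

Random grid position and random starting slice make this estimator unbiased, which is why the study randomizes both before counting.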

  3. Teaching Sustainable Design Using BIM and Project-Based Energy Simulations

    Directory of Open Access Journals (Sweden)

    Zhigang Shen

    2012-08-01

    Full Text Available The cross-disciplinary nature of energy-efficient building design has created many challenges for architecture, engineering and construction instructors. One of the technical challenges in teaching sustainable building design is enabling students to quantitatively understand how different building designs affect a building's energy performance. Concept-based instructional methods fall short in evaluating the impact of different design choices on a building's energy consumption. Building Information Modeling (BIM) with energy performance software provides a feasible tool to evaluate building design parameters. One notable advantage of this tool is its ability to couple 3D visualization of the structure with energy performance analysis without requiring detailed mathematical and thermodynamic calculations. Project-based Learning (PBL) utilizing BIM tools coupled with energy analysis software was incorporated into a senior-level undergraduate class. Student perceptions and feedback were analyzed to gauge the effectiveness of these techniques as instructional tools. The findings indicated that BIM-PBL can be used to effectively teach energy-efficient building design and construction.

  4. Rotors stress analysis and design

    CERN Document Server

    Vullo, Vincenzo

    2013-01-01

    Stress and strain analysis of rotors subjected to surface and body loads, as well as to thermal loads deriving from temperature variation along the radius, constitutes a classic subject of machine design. Nevertheless, attention is usually limited to rotor profiles for which the governing equations are solvable in closed form. Furthermore, very few actual engineering issues relate to structures for which stress and strain analysis in the linear elastic field and, even more, under non-linear conditions (i.e. plastic or viscoelastic conditions) produces equations to be solved in closed form. Moreover, when a product is still in its design stage, an analytical formulation with a closed-form solution is of course simpler and more versatile than numerical methods, and it allows one to quickly define a general configuration, which may then be fine-tuned using such numerical methods. In this view, all subjects are based on an analytical-methodological approach, and some new solutions in closed form are presented. The analytical formul...

  5. Finite Element Analysis and Design of Experiments in Engineering Design

    OpenAIRE

    Eriksson, Martin

    1999-01-01

    Projects with the objective of introducing Finite Element Analysis (FEA) into the early phases of the design process have previously been carried out at the Department of Machine Design, see e.g. the Doctoral thesis by Burman [13]. These works clearly highlight the usefulness of introducing design analysis early in the design process. According to Bjärnemo and Burman [10] the most significant advantage of applying design analysis early in the design process was the shift from verification to ...

  6. Design study of ship based nuclear power reactor

    International Nuclear Information System (INIS)

    Su'ud, Zaki; Fitriyani, Dian

    2002-01-01

    A preliminary design study of ship-based nuclear power reactors has been performed. In this study the results of the thermohydraulic analysis are presented, especially in relation to the behaviour of ship motion at sea. The reactors are basically lead-bismuth cooled fast power reactors using nitride fuels to enhance neutronics and safety performance. Some design modifications are made for feasibility of operation under sea wave movement. The system uses a loop type with a relatively large coolant pipe above the reactor core. The reactors do not use an IHX, so that the heat from the primary coolant system is directly transferred to the water-steam loop through the steam generator. The reactors are capable of being operated at different power levels during night and noon. The reactors can, however, also be used totally or partially to produce clean water through desalination of sea water. Due to the influence of sea wave movement, the analysis has to be performed in three dimensions. The computation time for this analysis is sped up using a Parallel Virtual Machine (PVM) based multiprocessor system

  7. Computational Chemical Synthesis Analysis and Pathway Design

    Directory of Open Access Journals (Sweden)

    Fan Feng

    2018-06-01

    Full Text Available With the idea of retrosynthetic analysis, which was raised in the 1960s, chemical synthesis analysis and pathway design have been transformed from a complex problem to a regular process of structural simplification. This review aims to summarize the developments of computer-assisted synthetic analysis and design in recent years, and how machine-learning algorithms have contributed to them. The LHASA system started the pioneering work of designing semi-empirical reaction modes in computers, with the rule-based and network-searching work that followed not only expanding the databases, but also building new approaches to indicating reaction rules. Programs like ARChem Route Designer replaced hand-coded reaction modes with automatically extracted rules, and programs like Chematica changed traditional designing into network searching. Afterward, with the help of machine learning, two-step models which combine reaction rules and statistical methods became the mainstream. Recently, fully data-driven learning methods using deep neural networks, which do not even require any prior knowledge, were applied in this field. Up to now, however, these methods still cannot replace experienced human organic chemists due to their relatively low accuracies. Future new algorithms, with the aid of powerful computational hardware, will make this topic promising and with good prospects.
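
The rule-based, network-searching style of the early programs can be sketched as an AND-OR search over retro-rules: a product is solved when some rule (OR) decomposes it into precursors that are all (AND) purchasable or recursively solvable. The rules and stock list below are symbolic toys, not real chemistry.

```python
# Hypothetical one-step retro-rules: product -> alternative precursor sets
RETRO_RULES = {
    "amide": [["acid_chloride", "amine"], ["ester", "amine"]],
    "ester": [["acid_chloride", "alcohol"]],
    "acid_chloride": [["carboxylic_acid"]],
}
STOCK = {"amine", "alcohol", "carboxylic_acid"}  # purchasable blocks

def solve(mol, depth=5):
    """Return a route (list of (product, precursors) steps, leaf-first)
    that makes mol from STOCK, or None if no route is found."""
    if mol in STOCK:
        return []
    if depth == 0 or mol not in RETRO_RULES:
        return None
    for precursors in RETRO_RULES[mol]:      # OR: try each rule in turn
        route, ok = [], True
        for p in precursors:                 # AND: every precursor must work
            sub = solve(p, depth - 1)
            if sub is None:
                ok = False
                break
            route += sub
        if ok:
            return route + [(mol, precursors)]
    return None

print(solve("amide"))
```

Systems like Chematica operate on the same skeleton but with tens of thousands of rules and cost-guided search instead of naive depth-limited recursion.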

  8. Seismic analysis for conceptual design of HCCR TBM-set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Seong Dae; Jin, Hyung Gon; Lee, Eo Hwak; Kim, Suk-Kwon; Yoon, Jae Sung [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Shin, Kyu In [Gentec Co., Daejeon (Korea, Republic of); Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    Highlights: • The seismic analysis of the KO HCCR TBM-set is performed. • The seismic events SL-1, SL-2, and SMHV are selected and evaluated with an FEM code (ANSYS). • The resulting stresses and deformations are confirmed to meet the design criteria. - Abstract: Using the conceptual design of the Korean helium cooled ceramic reflector (HCCR) test blanket module (TBM), including the TBM-shield for testing in ITER, a seismic analysis is performed. According to the ITER TBM port plug (TBM PP) system load specifications, the seismic events are selected as SL-1 (seismic level-1), SL-2 (seismic level-2), and SMHV (séismes maximaux historiquement vraisemblables, maximum historically probable earthquakes). In a modal analysis a total of 50 modes are obtained. Then a spectral response analysis for each seismic event is carried out using ANSYS based on the modal analysis results. For each event, the obtained Tresca stress is evaluated to confirm the design integrity by comparing the resulting stress to the design criteria. The Tresca strain and displacement are also estimated for the HCCR TBM-set. From the analysis, it was concluded that the maximum stresses caused by the seismic events meet the design criteria, and the displacements are lower than the designed gap from the TBM PP frame. The results are provided to a load combination analysis.
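
A spectral (response-spectrum) analysis yields one peak response per mode, and the modal peaks must then be combined; the square root of the sum of squares (SRSS) is the simplest common combination rule. The modal stresses and the limit below are illustrative numbers, not values from the HCCR TBM analysis.

```python
# SRSS combination of peak modal responses from a response-spectrum
# analysis (illustrative numbers, not the HCCR TBM model).
def srss(modal_peaks):
    return sum(r * r for r in modal_peaks) ** 0.5

# hypothetical peak Tresca stresses (MPa) from the three dominant modes
modes = [42.0, 17.5, 6.3]
combined = srss(modes)
allowable = 3 * 49.3  # hypothetical 3*Sm-type stress limit
print(round(combined, 1), combined <= allowable)
```

SRSS assumes well-separated modal frequencies; closely spaced modes would call for a correlation-aware rule such as CQC instead.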

  9. RELIABILITY BASED DESIGN OF FIXED FOUNDATION WIND TURBINES

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, R.

    2013-10-14

    Recent analysis of offshore wind turbine foundations using both applicable API and IEC standards shows that the total load demand from wind and waves is greatest in wave-driven storms. Further, analysis of overturning moment (OTM) loads reveals that impact forces exerted by breaking waves are the largest contributor to OTM in big storms at wind speeds above the operating range of 25 m/s. Currently, no codes or standards for offshore wind power generators have been adopted by the Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) for use on the Outer Continental Shelf (OCS). Current design methods based on allowable stress design (ASD) incorporate the uncertainty in the variation of loads transferred to the foundation, and in the geotechnical capacity of the soil and rock to support those loads, into a factor of safety. Sources of uncertainty include spatial and temporal variation of engineering properties, reliability of property measurements, applicability and sufficiency of sampling and testing methods, modeling errors, and variability of estimated load predictions. In ASD these sources of variability are generally given qualitative rather than quantitative consideration. The IEC 61400-3 design standard for offshore wind turbines is based on ASD methods. Load and resistance factor design (LRFD) methods are being increasingly used in the design of structures. Uncertainties such as those listed above can be included quantitatively in the LRFD process, in which load factors and resistance factors are statistically based. This type of analysis recognizes that there is always some probability of failure and enables that probability to be quantified. This paper presents an integrated approach consisting of field observations and numerical simulation to establish the distribution of loads from breaking waves to support the LRFD of fixed offshore foundations.
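
The ASD-versus-LRFD contrast can be made concrete: ASD divides the resistance by one global factor of safety, while LRFD applies separate statistically based factors to each load and to the resistance. The factors and moments below are illustrative only, not values from IEC 61400-3 or any other code.

```python
# ASD applies one global factor of safety; LRFD factors each load and
# the resistance separately (all factors below are illustrative only).
def asd_ok(dead, wind_wave, resistance, fs=2.0):
    return dead + wind_wave <= resistance / fs

def lrfd_ok(dead, wind_wave, resistance,
            gamma_d=1.2, gamma_w=1.6, phi=0.75):
    return gamma_d * dead + gamma_w * wind_wave <= phi * resistance

# overturning moments (MN*m) at a monopile mudline, hypothetical
dead, wave = 10.0, 45.0
for R in (111.0, 130.0):
    print(R, asd_ok(dead, wave, R), lrfd_ok(dead, wave, R))
```

Note that the two formats can disagree near the margin: with these factors, R = 111 passes the ASD check but fails the LRFD check, because LRFD weights the highly uncertain wave load more heavily than the well-known dead load.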

  10. The use of quantimet 720 for quantitative analysis of acute leukemia images in animals and humans

    International Nuclear Information System (INIS)

    Feinermann, E.; Langlet, G.A.

    1979-01-01

    Considerable progress has been achieved in the past ten years in the analysis of particle size and form. Automatic and quantitative image analyzers and stereology enabled a comparative study of acute human and animal leukemias. It is obvious that the agreement of results between these two natural and induced categories provides encouragement to continue this investigation by these methods

  11. A procedure for the determination of scenario earthquakes for seismic design based on probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Hirose, Jiro; Muramatsu, Ken

    2002-03-01

    This report presents a study on procedures for the determination of scenario earthquakes for the seismic design of nuclear power plants (NPPs) based on probabilistic seismic hazard analysis (PSHA). In recent years, the use of PSHA, which is a part of seismic probabilistic safety assessment (PSA), to determine the design basis earthquake motions for NPPs has been proposed. The identified earthquakes are called probability-based scenario earthquakes (PBSEs). The concept of PBSEs originates both from the study of the US NRC and from Ishikawa and Kameda. The assessment of PBSEs is composed of seismic hazard analysis and identification of dominant earthquakes. The objectives of this study are to formulate the concept of PBSEs and to examine the procedures for determining PBSEs for a domestic NPP site. This report consists of three parts, namely, procedures to compile analytical conditions for PBSEs, an assessment to identify PBSEs for a model site using Ishikawa's concept, and the examination of uncertainties involved in analytical conditions. The results obtained from the examination of PBSEs using Ishikawa's concept are as follows. (a) Since PBSEs are expressed by hazard-consistent magnitude and distance in terms of a prescribed reference probability, it is easy to obtain a concrete image of the earthquakes that determine the ground response spectrum to be considered in the design of NPPs. (b) Source contribution factors provide information on the importance of the earthquake source regions and/or active faults, and allow the selection of a couple of PBSEs based on their importance to the site. (c) Since analytical conditions involve uncertainty, sensitivity analyses on uncertainties that would affect seismic hazard curves and the identification of PBSEs were performed on various aspects and provided useful insights for the assessment of PBSEs. A result from this sensitivity analysis was that, although the difference in selection of attenuation equations led to a
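
The notion of hazard-consistent magnitude and distance in point (a) can be sketched numerically: given a deaggregation of the hazard at the reference probability into source bins with contribution factors, the hazard-consistent values are the contribution-weighted means (the bins below are hypothetical, not from the report):

```python
# Hypothetical deaggregation bins at the reference hazard level:
# (magnitude, distance in km, source contribution factor)
bins = [(6.5, 20.0, 0.5), (7.0, 40.0, 0.3), (7.8, 90.0, 0.2)]

total = sum(c for _, _, c in bins)
m_hc = sum(m * c for m, _, c in bins) / total  # hazard-consistent magnitude
r_hc = sum(r * c for _, r, c in bins) / total  # hazard-consistent distance
```

The contribution factors themselves play the role of the source contribution factors in point (b): sorting the bins by contribution identifies the couple of dominant sources from which PBSEs are selected.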

  12. Fatigue based design and analysis of wheel hub for Student formula car by Simulation Approach

    Science.gov (United States)

    Gowtham, V.; Ranganathan, A. S.; Satish, S.; Alexis, S. John; Siva kumar, S.

    2016-09-01

    In the existing wheel hub design used for Student Formula cars, the brake disc cannot be removed easily, since the disc is mounted between the knuckle and the hub. In case of a bend or any other damage to the disc, replacement becomes difficult. Further, using the OEM hub and knuckle intended for commercial vehicles increases the unsprung mass, which should be avoided in Student Formula cars to improve performance. In this design the above-mentioned difficulties have been overcome by redesigning the hub so that the brake disc can be removed simply by removing the wheel and the caliper; the redesigned hub also weighs less than the existing OEM hub. A CAD model was developed based on the required fatigue life cycles. The forces acting on the hub were calculated and linear static structural analysis was performed on the wheel hub for three different materials using the ANSYS Finite Element code V 16.2. The theoretical fatigue strength was compared with the stress obtained from the structural analysis for each material.
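
The final comparison step can be sketched with Basquin's relation for fatigue strength; the material constants below are hypothetical, not values from the paper:

```python
def basquin_strength(n_cycles, sigma_f=900.0, b=-0.09):
    """Fatigue strength (MPa) at n_cycles from Basquin's relation,
    S = sigma_f * (2N)**b, with hypothetical material constants."""
    return sigma_f * (2 * n_cycles) ** b

def design_ok(fe_stress_mpa, design_cycles=1e6):
    """Pass if the FE peak stress stays below the fatigue strength."""
    return fe_stress_mpa <= basquin_strength(design_cycles)
```

Each candidate hub material would carry its own sigma_f and b, and the peak stress from the linear static analysis is checked against the resulting strength at the required life.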

  13. Nuclear design and analysis report for KALIMER breakeven core conceptual design

    International Nuclear Information System (INIS)

    Kim, Sang Ji; Song, Hoon; Lee, Ki Bog; Chang, Jin Wook; Hong, Ser Gi; Kim, Young Gyun; Kim, Yeong Il

    2002-04-01

    During phase 2 of the LMR design technology development project, the breakeven core configuration was developed with the aim of making KALIMER self-sustaining with regard to fissile material. The excess fissile material production is limited only to the extent of its own requirement for sustaining its planned power operation. The average breeding ratio is estimated to be 1.05 for the equilibrium core, and the fissile plutonium gain per cycle is 13.9 kg. The nuclear performance characteristics as well as the reactivity coefficients have been analyzed so that the design evaluation in other activity areas can be made. In order to find out a realistic heavy metal flow evolution and investigate cycle-dependent nuclear performance parameter behaviors, startup and transition cycle loading strategies were developed, followed by the startup core physics analysis. Driver fuel and blankets are assumed to be shuffled at the time of each reload. The startup core physics analysis has shown that the burnup reactivity swing, effective delayed neutron fraction, conversion ratio and peak linear heat generation rate at the startup core lead to an extreme of bounding physics data for safety analysis. As an outcome of this study, a whole spectrum of reactor life is analyzed in detail for the first time for the KALIMER core. Experience from this study shows that the startup core analysis deserves more attention than it receives in current design practice, before the core configuration is finalized based on the equilibrium cycle analysis alone.

  14. RIPOSTE: a framework for improving the design and analysis of laboratory-based research.

    Science.gov (United States)

    Masca, Nicholas Gd; Hensor, Elizabeth Ma; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam Ka; Teare, M Dawn

    2015-05-07

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results.

  15. RIPOSTE: a framework for improving the design and analysis of laboratory-based research

    Science.gov (United States)

    Masca, Nicholas GD; Hensor, Elizabeth MA; Cornelius, Victoria R; Buffa, Francesca M; Marriott, Helen M; Eales, James M; Messenger, Michael P; Anderson, Amy E; Boot, Chris; Bunce, Catey; Goldin, Robert D; Harris, Jessica; Hinchliffe, Rod F; Junaid, Hiba; Kingston, Shaun; Martin-Ruiz, Carmen; Nelson, Christopher P; Peacock, Janet; Seed, Paul T; Shinkins, Bethany; Staples, Karl J; Toombs, Jamie; Wright, Adam KA; Teare, M Dawn

    2015-01-01

    Lack of reproducibility is an ongoing problem in some areas of the biomedical sciences. Poor experimental design and a failure to engage with experienced statisticians at key stages in the design and analysis of experiments are two factors that contribute to this problem. The RIPOSTE (Reducing IrreProducibility in labOratory STudiEs) framework has been developed to support early and regular discussions between scientists and statisticians in order to improve the design, conduct and analysis of laboratory studies and, therefore, to reduce irreproducibility. This framework is intended for use during the early stages of a research project, when specific questions or hypotheses are proposed. The essential points within the framework are explained and illustrated using three examples (a medical equipment test, a macrophage study and a gene expression study). Sound study design minimises the possibility of bias being introduced into experiments and leads to higher quality research with more reproducible results. DOI: http://dx.doi.org/10.7554/eLife.05519.001 PMID:25951517

  16. [Design of plant leaf bionic camouflage materials based on spectral analysis].

    Science.gov (United States)

    Yang, Yu-Jie; Liu, Zhi-Ming; Hu, Bi-Ru; Wu, Wen-Jian

    2011-06-01

    The influence of the structure parameters and contents of plant leaves on their reflectance spectra was analyzed using the PROSPECT model. The results showed that bionic camouflage materials should have a coarse surface and a spongy inner structure, the refractive index of the main content must be close to that of plant leaves, the materials should contain chlorophyll and water, and the content of C-H bonds must be strictly controlled. Based on this analysis, a novel camouflage material, constituted by a coarse transparent waterproof surface, chlorophyll, water and a spongy material, was designed. A verification experiment showed that the reflectance spectrum of the camouflage material exhibited the same characteristics as those of plant leaves. The similarity coefficient between the reflectance spectrum of the camouflage material and that of camphor leaves was 0.988 1, and the characteristics of the camouflage material did not change after sunlight treatment for three months. The bionic camouflage material, which exhibits a high spectral similarity with plant leaves and good weather resistance, promises to be a viable countermeasure against hyperspectral imaging reconnaissance.

  17. Analysis and Design of Web-Based Database Application for Culinary Community

    OpenAIRE

    Huda, Choirul; Awang, Osel Dharmawan; Raymond, Raymond; Raynaldi, Raynaldi

    2017-01-01

    This research is motivated by the rapid development of the culinary industry and of information technology. Difficulties in communicating with culinary experts and in documenting recipes make proper media support very important. Therefore, a web-based database application for the public is important to help the culinary community in communication, searching and recipe management. The aim of the research was to design a web-based database application that could be used as social media for the cu...

  18. Reliability Analysis and Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.

    1994-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes (sliding failure, failure of the foundation, and overturning failure) are described. Relevant design variables are identified...

  19. Design Transformation based on Nature and Identity Formation in the Design of Landscape Elements

    Directory of Open Access Journals (Sweden)

    Zulkifli Muslim

    2016-01-01

    Full Text Available There is a lack of initiative from designers to integrate environmental resources into the material and design production of local urban landscape elements that reflect human culture and lifestyle. Based on the criteria and principles of symbol design and the transformation process, this paper describes the symbiotic relationship between local plants (flowers) and the design of landscape elements. Using visual analysis, the researcher manipulated the shapes and forms of local plant images to produce possible shapes and forms for the design of a landscape element (a lamp post). The results indicate that design transformation is a systematic process that allows for variations in design without losing the core characteristics and identity of the basic elements of nature.

  20. Finite element analysis-based design of a fluid-flow control nano-valve

    International Nuclear Information System (INIS)

    Grujicic, M.; Cao, G.; Pandurangan, B.; Roy, W.N.

    2005-01-01

    A finite element method-based procedure is developed for the design of molecularly functionalized nano-size devices. The procedure is aimed at the single-walled carbon nano-tubes (SWCNTs) used in the construction of such nano-devices and utilizes spatially varying nodal forces to represent electrostatic interactions between the charged groups of the functionalizing molecules. The procedure is next applied to the design of a fluid-flow control nano-valve. The results obtained suggest that the finite element-based procedure yields results that are very similar to their molecular modeling counterparts for small-size nano-valves, for which both types of analyses are feasible. The procedure is finally applied to optimize the design of a larger-size nano-valve, for which the molecular modeling approach is not practical.

  1. A Geometry Based Infra-Structure for Computational Analysis and Design

    Science.gov (United States)

    Haimes, Robert

    1998-01-01

    The computational steps traditionally taken for most engineering analysis suites (computational fluid dynamics (CFD), structural analysis, heat transfer, etc.) are: (1) Surface Generation -- usually by employing a Computer Assisted Design (CAD) system; (2) Grid Generation -- preparing the volume for the simulation; (3) Flow Solver -- producing the results at the specified operational point; (4) Post-processing Visualization -- interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. These vendors couple directly to a number of CAD systems and are executed from within the CAD Graphical User Interface (GUI). It should be noted that the structural analysis problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length-scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go to Initial Graphics Exchange Specification (IGES) or Standard Exchange Program (STEP) files. The output from Grid Generators and Solvers do not really have standards, though there are a couple of file formats that can be used for a subset of the gridding (i.e. PLOT3D data formats). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Specifically, the problems with this procedure are: (1) File based -- Information flows from one step to the next via data files with formats specified for that procedure. File standards, when they exist, are wholly inadequate.
For example, geometry from CAD systems (transmitted via IGES files) is defined as disjoint surfaces and curves (as well as masses of other information of no interest for the Grid Generator

  2. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
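
The moment-based reliability idea can be illustrated with a first-order second-moment (FOSM) sketch; this is a simplification of the perturbation/Edgeworth machinery in the paper, with hypothetical stress and strength moments:

```python
import math

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    """FOSM reliability index for independent, normal resistance R and load S:
    beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2)."""
    return (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)

def failure_probability(beta):
    """P(failure) = Phi(-beta) via the standard normal CDF."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Hypothetical CMC component: strength 500 +/- 50 MPa, stress 300 +/- 40 MPa.
beta = reliability_index(500.0, 50.0, 300.0, 40.0)
pf = failure_probability(beta)
```

Sensitivities would then be obtained by differentiating beta with respect to each distribution parameter, which is the quantity an RSBDO loop feeds back into the optimization.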

  3. Stress Analysis for Mobile Hot Cell Design

    International Nuclear Information System (INIS)

    Muhammad Hannan Bahrin; Anwar Abdul Rahman; Mohd Arif Hamzah

    2015-01-01

    The Prototype and Plant Development Centre (PDC) is developing a Mobile Hot Cell (MHC) to handle and manage Spent High Activity Radioactive Sources (SHARS), such as teletherapy heads and dry irradiators. At present, there are two units of MHC in the world, one in South Africa and the other in China. The Malaysian MHC is being developed by the Malaysian Nuclear Agency with the assistance of an IAEA expert, based on the designs of South Africa and China but with improved features. Stress analysis has been performed on the design to fulfill the safety requirements for MHC operation. This paper discusses the analysis of the loading effect of the radiation shielding materials on the MHC wall structure, roof supporting column and window structure. (author)

  4. Design Analysis of Power Extracting Unit of an Onshore OWC Based Wave Energy Power Plant using Numerical Simulation

    Directory of Open Access Journals (Sweden)

    Zahid Suleman

    2011-07-01

    Full Text Available This research paper describes the design and analysis of the power extracting unit of an onshore OWC (Oscillating Water Column) based wave energy power plant with a capacity of about 100 kilowatts. The OWC is modeled as the solid piston of a reciprocating pump. The power extracting unit is designed analytically by using the theory of reciprocating pumps and the principles of fluid mechanics. The Pro-E and ANSYS Workbench software packages are used to verify the analytical design. The analytical results for the flow velocity in the turbine duct are compared with the simulation results and are found to be in good agreement. The results achieved by this research will ultimately assist in the overall design of the power plant, which is the ultimate goal of this research work.

  5. Participatory design based research

    DEFF Research Database (Denmark)

    Dau, Susanne; Bach Jensen, Louise; Falk, Lars

    This poster reveals how participatory design-based research, by the use of a CoED-inspired creative process, can be used for designing solutions to problems regarding students' study activities outside campus.

  6. A Conceptual Design and Analysis Method for Conventional and Unconventional Airplanes

    NARCIS (Netherlands)

    Elmendorp, R.J.M.; Vos, R.; La Rocca, G.

    2014-01-01

    A design method is presented that has been implemented in a software program to investigate the merits of conventional and unconventional transport airplanes. Design and analysis methods are implemented in a design tool capable of creating a conceptual design based on a set of top-level requirements.

  7. Fast glomerular quantification of whole ex vivo mouse kidneys using Magnetic Resonance Imaging at 9.4 Tesla

    Energy Technology Data Exchange (ETDEWEB)

    Chacon-Caldera, Jorge; Kraemer, Philipp; Schad, Lothar R. [Heidelberg Univ., Mannheim (Germany). Computer Assisted Clinical Medicine; Geraci, Stefania; Gretz, Norbert [Heidelberg Univ., Mannheim (Germany). Medical Research Centre; Cullen-McEwen, Luise; Bertram, John F. [Monash Univ., Melbourne, VIC (Australia). Development and Stem Cells Program and Dept. of Anatomy and Developmental Biology

    2016-05-01

    A method to measure total glomerular number (N_glom) in whole mouse kidneys using MRI is presented. The method relies on efficient acquisition times. A 9.4 T preclinical MRI system with a surface cryogenic coil and a 3D gradient echo sequence were used to image nine whole ex vivo BALB/c mouse kidneys labelled with cationized ferritin (CF). A novel method to segment the glomeruli was developed. The quantification of glomeruli was achieved by identifying and fitting the probability distribution of glomeruli, thus reducing variations due to noise. For validation, N_glom of the same kidneys was also obtained using the gold standard: design-based stereology. Excellent agreement was found between the MRI and stereological measurements of N_glom, with values differing by less than 4%: (mean ± SD) MRI = 15 606 ± 1 178; stereology = 16 273 ± 1 523. Using a robust segmentation method and a reliable quantification method, it was possible to acquire N_glom with a scanning time of 33 minutes and 20 seconds. This was more than 8 times faster than previously presented MRI-based methods. Thus, an efficient approach to measure N_glom ex vivo in health and disease is provided.
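
The counting step can be illustrated with a toy sketch (this is not the authors' algorithm; the voxel volumes are invented): segmented blobs whose volume distribution clusters around a single-glomerulus mode are counted, noise fragments are rejected, and fused blobs contribute multiple counts:

```python
# Hypothetical segmented blob volumes (voxels) from a CF-labelled kidney image;
# tiny blobs are noise, the ~120-voxel cluster is single glomeruli,
# and the 240-voxel blob is two fused glomeruli.
blob_volumes = [3, 4, 118, 120, 122, 125, 119, 240, 121, 117, 5]

# Mode of the single-glomerulus distribution, estimated here as the median
# of mid-sized blobs (a stand-in for fitting the probability distribution).
plausible = sorted(v for v in blob_volumes if 50 <= v <= 200)
unit = plausible[len(plausible) // 2]

# Each blob above the noise floor contributes round(volume / unit) glomeruli.
n_glom = sum(round(v / unit) for v in blob_volumes if v >= 50)
```

On real data a distribution fit replaces the median heuristic, which is what makes the count robust to noise.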

  8. An approach to review design bases and safety analysis of earlier generation atomic power plants; a case study of TAPS

    International Nuclear Information System (INIS)

    Malhotra, P.K.; Bajaj, S.S.

    2002-01-01

    The twin-unit boiling water reactor (BWR) station at TAPS has completed 30 years of power operation, and to further extend the plant operating life, a fresh, extensive exercise involving review of plant operating performance, aging management, and review of design bases and safety analysis has been carried out. The review exercise resulted in an assessment of the acceptability of identified non-conformances and recommendations for compensatory measures in the form of design modifications or plant operating procedures. The second part of the exercise relates to the safety analysis, which was carried out in view of the plant modifications made and the advances in analytical methodologies and techniques. Chiefly, it involves LOCA analysis for various break sizes at different locations and plant transient studies. It also includes the fatigue analysis of the reactor pressure vessel. The related review approach adopted is presented here.

  9. Intuitive web-based experimental design for high-throughput biomedical data.

    Science.gov (United States)

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data is accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or for further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and meta-information for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.
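
The factor-based approach described here boils down to generating the full factorial of conditions and attaching identifiers; a minimal sketch (the factor names are invented, not taken from the system):

```python
from itertools import product

# Hypothetical experimental factors and their levels.
factors = {
    "genotype": ["wild-type", "knockout"],
    "treatment": ["control", "drug"],
    "timepoint_h": [0, 24, 48],
}

# Full factorial design: one sample per combination of factor levels.
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Sample sheet rows with identifiers for the data generation facility.
rows = [{"sample_id": f"S{i:03d}", **combo}
        for i, combo in enumerate(design, start=1)]
```

The spreadsheet-based exchange format would then be a serialization of `rows`, and uploaded data files are joined back to the design model on `sample_id`.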

  10. An expert system for integrated structural analysis and design optimization for aerospace structures

    Science.gov (United States)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first, in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This will allow engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient and reliable structural designs rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems and their application to the structural design process was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were then developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach would improve the expressiveness of knowledge representation (especially for structural analysis and design applications), provide the ability to build very large and practical expert systems, and provide an efficient way of storing knowledge. Functional specifications for the expert systems were then developed.
The ORL/AI shell was then used to develop a variety of expert system modules for modeling, finite element analysis, and

  11. Marginal adaptation of a low-shrinkage silorane-based composite: A SEM-analysis

    DEFF Research Database (Denmark)

    Schmidt, Malene; Bindslev, Preben Hørsted; Poulsen, Sven

    2012-01-01

    shrinkage, has been marketed. Objective. To investigate whether reduced polymerization shrinkage improves the marginal adaptation of composite restorations. Material and methods. A total of 156 scanning electron microscopy (SEM) pictures (78 baseline, 78 follow-up) of the occlusal part of Class II...-casts of the restorations were used for SEM pictures at ×16 magnification. Pictures from baseline and follow-up (398 days, SD 29 days) were randomized and the examiner was blinded to the material and the age of the restoration. Stereologic measurements were used to calculate the length and the width of the marginal...

  12. Precise design-based defect characterization and root cause analysis

    Science.gov (United States)

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2017-03-01

    As semiconductor manufacturing continues its march towards more advanced technology nodes, it becomes increasingly important to identify and characterize design weak points, which is typically done using a combination of inline inspection data and the physical layout (or design). However, the employed methodologies have been somewhat imprecise, relying greatly on statistical techniques to signal excursions. For example, defect location error that is inherent to inspection tools prevents them from reporting the true locations of defects. Therefore, common operations such as background-based binning that are designed to identify frequently failing patterns cannot reliably identify specific weak patterns. They can only identify an approximate set of possible weak patterns, but within these sets there are many perfectly good patterns. Additionally, characterizing the failure rate of a known weak pattern based on inline inspection data also has a lot of fuzziness due to coordinate uncertainty. SEM (Scanning Electron Microscope) Review attempts to come to the rescue by capturing high resolution images of the regions surrounding the reported defect locations, but SEM images are reviewed by human operators and the weak patterns revealed in those images must be manually identified and classified. Compounding the problem is the fact that a single Review SEM image may contain multiple defective patterns and several of those patterns might not appear defective to the human eye. In this paper we describe a significantly improved methodology that brings advanced computer image processing and design-overlay techniques to better address the challenges posed by today's leading technology nodes. Specifically, new software techniques allow the computer to analyze Review SEM images in detail, to overlay those images with reference design to detect every defect that might be present in all regions of interest within the overlaid reference design (including several classes of defects

  13. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  15. Design and Analysis for a Floating Oscillating Surge Wave Energy Converter: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Y. H.; Li, Y.; Hallett, K.; Hotimsky, C.

    2014-03-01

    This paper presents a recent study on the design and analysis of an oscillating surge wave energy converter (OSWEC). A successful wave energy conversion design requires a balance between design performance and cost. The cost of energy is often used as the metric to judge the design of a wave energy conversion system. It is often determined based on the device's power performance; the costs of manufacturing, deployment, operation and maintenance; and the effort to ensure environmental compliance. The objective of this study is to demonstrate the importance of a cost-driven design strategy and how it can affect a WEC design. Three OSWEC designs were used as examples. The power generation performance of each design was modeled using a time-domain numerical simulation tool, and the mass properties of each design were determined based on a simple structural analysis. The results of the power performance simulations, the structural analysis and a simple economic assessment were then used to determine the cost-efficiency of the selected OSWEC designs. Finally, a discussion of the environmental barriers, the integrated design strategy and the key areas that need further investigation is also presented.
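
The cost-of-energy metric used to compare such designs can be sketched as a simple levelized cost calculation (the fixed charge rate and all device numbers below are hypothetical, not from the study):

```python
def lcoe(capex, opex_per_year, annual_energy_kwh, fcr=0.108):
    """Levelized cost of energy ($/kWh): (CAPEX * FCR + annual OPEX) / AEP."""
    return (capex * fcr + opex_per_year) / annual_energy_kwh

# Hypothetical OSWEC variants: structure-driven CAPEX vs. power performance.
candidates = {
    "OSWEC-A": lcoe(4.0e6, 2.0e5, 9.0e5),
    "OSWEC-B": lcoe(5.5e6, 2.5e5, 1.4e6),
}
best = min(candidates, key=candidates.get)
```

The point of the cost-driven strategy is visible even in this toy example: a heavier, more expensive device can still win if its annual energy production grows faster than its cost.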

  16. MONJU experimental data analysis and its feasibility evaluation to build up the standard data base for large FBR nuclear core design

    International Nuclear Information System (INIS)

    Sugino, K.; Iwai, T.

    2006-01-01

    MONJU experimental data analysis was performed by using the detailed calculation scheme for fast reactor cores developed in Japan. Subsequently, the suitability of the MONJU integral data for use in FBR nuclear core design was evaluated by the cross-section adjustment technique. It is concluded that the MONJU integral data are quite valuable for building up the standard data base for large FBR nuclear core design. In addition, it is found that the application of the updated data base could considerably improve the prediction accuracy of neutronic parameters for MONJU. (authors)

  17. Implementation of knowledge-based engineering methodology in hydraulic generator design

    Directory of Open Access Journals (Sweden)

    Wei Guo

    2015-05-01

    Full Text Available Hydraulic generator design companies must reduce the lead time and cost of their products to remain competitive. Knowledge-based engineering (KBE) is a rapidly developing technology that offers a competitive advantage in design applications by reducing time and cost in product development. This article addresses in detail the structure of a hydraulic generator design system based on knowledge-based engineering technology. The system operates by creating a unified knowledge base to store the knowledge scattered across the whole life of the design process, previously held only in experts' heads and in the technical literature. It helps designers make appropriate decisions by supplying the necessary information at the right time, through a query and inference engine that represents the knowledge within the KBE application framework. It also integrates the analysis tools into one platform to help achieve globally optimal solutions. Finally, an example of turbine-type selection is given to illustrate the operation process and demonstrate the system's validity.

  18. Microseismic Monitoring Design Optimization Based on Multiple Criteria Decision Analysis

    Science.gov (United States)

    Kovaleva, Y.; Tamimi, N.; Ostadhassan, M.

    2017-12-01

    Borehole microseismic monitoring of hydraulic fracture treatments of unconventional reservoirs is a widely used method in the oil and gas industry. Sometimes, the quality of the acquired microseismic data is poor, and one of the reasons is poor survey design. We attempt to provide a comprehensive and thorough workflow, using multiple criteria decision analysis (MCDA), to optimize the planning of microseismic monitoring. So far, microseismic monitoring has been used extensively as a powerful tool for determining fracture parameters that affect the influx of formation fluids into the wellbore. The factors that affect the quality of microseismic data and their final results include the average distance between microseismic events and receivers, the complexity of the recorded wavefield, signal-to-noise ratio, data aperture, etc. These criteria often conflict with each other. In a typical microseismic monitoring project, these factors should be considered to choose the best monitoring well(s), the optimum number of required geophones, and their depth. We use MCDA to address these design challenges and develop a method that offers an optimized design out of all possible combinations to produce the best data acquisition results. We believe that this will be the first research to include the above-mentioned factors in a 3D model. Such a tool would assist companies and practicing engineers in choosing the best design parameters for future microseismic projects.
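    The abstract does not specify which MCDA formulation is used; one common variant is a weighted-sum model over min-max-normalized criteria, which is enough to show how conflicting criteria get combined into a single ranking. The candidate designs, criterion values and weights below are illustrative assumptions, not data from the study.

```python
import numpy as np

# Hypothetical candidate monitoring designs scored on four criteria
# (all values illustrative, not from the study).
# Columns: event-receiver distance (m), S/N ratio, aperture (m),
# wavefield complexity (0-1).
raw = np.array([
    [300.0, 12.0, 800.0, 0.4],   # design A
    [450.0, 18.0, 600.0, 0.7],   # design B
    [250.0,  9.0, 900.0, 0.3],   # design C
])
# Direction of preference: S/N and aperture are maximized,
# distance and complexity are minimized.
maximize = np.array([False, True, True, False])

# Min-max normalize each criterion to [0, 1], flipping minimized ones
# so that 1 is always "best".
lo, hi = raw.min(axis=0), raw.max(axis=0)
norm = (raw - lo) / (hi - lo)
norm[:, ~maximize] = 1.0 - norm[:, ~maximize]

weights = np.array([0.3, 0.3, 0.2, 0.2])  # assumed relative importance
scores = norm @ weights
best = int(np.argmax(scores))
print(f"scores = {scores.round(3)}, best design index = {best}")
```

Changing the weights reorders the ranking, which is exactly the trade-off between conflicting criteria that the abstract describes.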

  19. Experience with simplified inelastic analysis of piping designed for elevated temperature service

    International Nuclear Information System (INIS)

    Severud, L.K.

    1980-03-01

    Screening rules and the preliminary design of FFTF piping were developed in 1974 based on expected behavior and engineering judgment, approximate calculations, and a few detailed inelastic analyses of pipelines. This paper presents findings from six additional detailed inelastic analyses, with correlations to the simplified-analysis screening rules. In addition, simplified analysis methods for treating weldment local stresses and strains, as well as fabrication-induced flaws, are described. Based on the FFTF experience, recommendations for future Code and technology work to reduce design analysis costs are identified.

  20. Property Model-based Tailor-made Design of Chemical-based Products

    DEFF Research Database (Denmark)

    Kalakul, Sawitree

    Computer-aided model-based methods and tools are increasingly playing important roles in chemical product design. They have the potential to very quickly search for and identify reliable product candidates that can then be verified through experiments. In this way, the time and resources spent...... on experiment are reduced, making products faster and cheaper to bring to market. The tools also help to manage the solution of product design problems, which usually require efficient handling of model-data-knowledge from different sources and at different time and size scales. The main contribution...... the needed template for a desired product is not available. VPPD-Lab employs a suite of algorithms (such as database search, molecular and mixture blend design) and toolboxes (such as property calculations and property model consistency tests) for specific product property prediction, design, and/or analysis...

  1. Preliminary conceptual design and analysis on KALIMER reactor structures

    International Nuclear Information System (INIS)

    Kim, Jong Bum

    1996-10-01

    The objectives of this study are to perform preliminary conceptual design and structural analyses for KALIMER (Korea Advanced Liquid Metal Reactor) reactor structures, to assess design feasibility, and to identify detailed analysis requirements. Since KALIMER thermal-hydraulic system analysis results and neutronic analysis results are not yet available, only limited preliminary structural analyses have been performed, with assumptions made on the thermal loads. The responses of the reactor vessel and reactor internal structures were based on the temperature difference between core inlet and outlet and on engineering judgment. Thermal stresses from the assumed temperatures were calculated using the ANSYS code through parametric finite element heat transfer and elastic stress analyses. Based on the results of the preliminary conceptual design and structural analyses, the ASME Code limits for the reactor structures were satisfied for the pressure boundary, but the need for inelastic analyses was indicated to evaluate the design adequacy of the support barrel and the thermal liner. To reduce thermal striping effects in the bottom area of the UIS due to sodium flowing up from the reactor core, installation of an Inconel-718 liner on the bottom area was proposed, and to mitigate thermal shock loads, an additional stainless steel liner was also suggested. The feasibility of these designs was validated through simplified preliminary analyses. In the conceptual design phase, these results will be implemented in the design of the reactor structures and the reactor internal structures, in conjunction with the thermal-hydraulic, neutronic, and seismic analysis results. 4 tabs., 24 figs., 4 refs. (Author)

  2. Introduction to Chemical Engineering Reactor Analysis: A Web-Based Reactor Design Game

    Science.gov (United States)

    Orbey, Nese; Clay, Molly; Russell, T.W. Fraser

    2014-01-01

    An approach to explain chemical engineering through a Web-based interactive game design was developed and used with college freshman and junior/senior high school students. The goal of this approach was to demonstrate how to model a lab-scale experiment, and use the results to design and operate a chemical reactor. The game incorporates both…

  3. Carbon nanotube based VLSI interconnects analysis and design

    CERN Document Server

    Kaushik, Brajesh Kumar

    2015-01-01

    The brief primarily focuses on the performance analysis of CNT-based interconnects in the current research scenario. Different CNT structures are modeled on the basis of transmission line theory. Performance comparison for different CNT structures illustrates that CNTs are more promising than Cu or other materials used in global VLSI interconnects. The brief is organized into five chapters which mainly discuss: (1) an overview of the current research scenario and the basics of interconnects; (2) the unique crystal structures and basic physical properties of CNTs, and the production, purification and applications of CNTs; (3) a brief technical review, the geometry and equivalent RLC parameters for different single and bundled CNT structures; (4) a comparative analysis of crosstalk and delay for different single and bundled CNT structures; and (5) various unique mixed CNT bundle structures and their equivalent electrical models.

  4. Design, Analysis and Test of Logic Circuits Under Uncertainty

    CERN Document Server

    Krishnaswamy, Smita; Hayes, John P

    2013-01-01

    Integrated circuits (ICs) increasingly exhibit uncertain characteristics due to soft errors, inherently probabilistic devices, and manufacturing variability. As device technologies scale, these effects can be detrimental to the reliability of logic circuits.  To improve future semiconductor designs, this book describes methods for analyzing, designing, and testing circuits subject to probabilistic effects. The authors first develop techniques to model inherently probabilistic methods in logic circuits and to test circuits for determining their reliability after they are manufactured. Then, they study error-masking mechanisms intrinsic to digital circuits and show how to leverage them to design more reliable circuits.  The book describes techniques for:   • Modeling and reasoning about probabilistic behavior in logic circuits, including a matrix-based reliability-analysis framework;   • Accurate analysis of soft-error rate (SER) based on functional-simulation, sufficiently scalable for use in gate-l...

  5. Design and implementation of FPGA-based phase modulation ...

    Indian Academy of Sciences (India)

    Section 3 deals with the implementation process of the digital con- .... To aid the analysis of the SRC, a simulation model based on the state-space technique is ... compensator designed using 'SISO' tool in Matlab, is a PI compensator which ...

  6. The Design-to-Analysis Process at Sandia National Laboratories Observations and Recommendations; TOPICAL

    International Nuclear Information System (INIS)

    BURNS, SHAWN P.; HARRISON, RANDY J.; DOBRANICH, DEAN

    2001-01-01

    The efficiency of the design-to-analysis process, which translates solid-model-based design data into computational analysis model data, plays a central role in the application of computational analysis to engineering design and certification. A review of the literature from within Sandia as well as from industry shows that the design-to-analysis process involves a number of complex organizational and technological issues. This study examines the design-to-analysis process from a business process standpoint and is intended to generate discussion regarding this important issue. Observations from interviews with Sandia staff members and management suggest that the current Sandia design-to-analysis process is not mature and that this cross-organizational issue requires committed high-level ownership. A key recommendation of the study is that additional resources should be provided to the computer aided design organizations to support design-to-analysis. A robust community of practice is also needed to continuously improve the design-to-analysis process and to provide a corporate perspective.

  7. Impedance Based Analysis and Design of Harmonic Resonant Controller for a Wide Range of Grid Impedance

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Blaabjerg, Frede

    2014-01-01

    This paper investigates the effect of grid impedance variation on harmonic resonant current controllers for grid-connected voltage source converters by means of impedance-based analysis. It reveals that negative harmonic resistances tend to be derived from harmonic resonant controllers...... in the closed-loop output admittance of the converter. Such negative resistances may interact with the grid impedance, resulting in steady-state error or unstable harmonic compensation. To deal with this problem, a design guideline for harmonic resonant controllers under a wide range of grid impedance is proposed...

  8. Designing Task-Based Syllabus For Writing Class

    Directory of Open Access Journals (Sweden)

    Sundari Hanna

    2018-01-01

    Full Text Available Writing is viewed as the most complex skill to learn and to teach. Besides learner factors, the teacher, materials and syllabus may also affect the process of learning a foreign language. A syllabus, in general, can be defined as a set of what is taught (content) and the way it is taught (procedure). This research aims to design a task-based syllabus for a university-level writing class. The study was conducted with a qualitative descriptive design, with 92 students and 4 lecturers as respondents. As part of a research and development project at a private university in Jakarta, the task-based syllabus was developed from a needs analysis and the principles of task-based language teaching. Students' proficiency levels are fair, with sentence patterns and grammar as the most difficult aspects. Academic writing is the preferred orientation, with a small portion of creative writing. The developed task-based syllabus proposed for the writing class covers the goal (learning outcome), course description and objectives, a set of writing tasks, content-focus and language-focus features, and course evaluation. The syllabus can then guide lecturers in designing lesson plans and selecting materials for the writing class.

  9. Game Theory and Risk-Based Levee System Design

    Science.gov (United States)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for economically efficient levee design. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, landowners on each river bank tend to optimize their levees independently using risk-based analysis, resulting in a levee system design that is Pareto-inefficient from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the landowners on each river bank develop their design strategies using risk-based economic optimization. For each landowner, the annual expected total cost includes the expected annual damage cost and the annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which yields the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Cooperative game theory is therefore employed to develop an economically optimal design that can be implemented in practice. By examining the game in reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering externalities and the evolution path of dynamic water resource problems for optimal decision making.
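    The Nash-versus-planner comparison can be sketched numerically with a toy cost model: each bank's annual cost is annualized construction cost plus expected damage, where raising the opposite levee transfers some flood risk back across the river. The exponential flood-probability form, all parameter values, and the grid search below are illustrative assumptions, not the paper's model.

```python
import itertools
import numpy as np

heights = np.arange(0.0, 8.01, 0.1)   # candidate levee heights (m)
c = 1.0        # annualized construction cost per metre (assumed)
D = 20.0       # expected damage given flooding (assumed)
a, b = 1.0, 0.3  # own-height protection vs cross-bank risk transfer (assumed)

def annual_cost(h_own, h_other):
    # Construction cost + expected annual damage; raising the opposite
    # levee pushes flood risk back onto this bank (the "levee effect").
    p_flood = np.exp(-a * h_own + b * h_other)
    return c * h_own + D * p_flood

def best_response(h_other):
    return heights[np.argmin([annual_cost(h, h_other) for h in heights])]

# Non-cooperative Nash equilibrium by best-response iteration.
h1 = h2 = 0.0
for _ in range(100):
    h1, h2 = best_response(h2), best_response(h1)

# Social planner: minimize the system-wide total cost over both heights.
h1s, h2s = min(itertools.product(heights, repeat=2),
               key=lambda p: annual_cost(p[0], p[1]) + annual_cost(p[1], p[0]))

nash_total = annual_cost(h1, h2) + annual_cost(h2, h1)
social_total = annual_cost(h1s, h2s) + annual_cost(h2s, h1s)
print(f"Nash: ({h1:.1f}, {h2:.1f}) total={nash_total:.2f}; "
      f"planner: ({h1s:.1f}, {h2s:.1f}) total={social_total:.2f}")
```

With these assumed parameters the grid search lands on taller Nash levees (about 4.3 m each) than the planner would build (about 3.8 m each), at a higher total cost: each bank ignores the risk its levee transfers across the river, which is the Pareto inefficiency the abstract describes.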

  10. Reflections on Design-Based Research

    DEFF Research Database (Denmark)

    Ørngreen, Rikke

    2015-01-01

    Design-Based Research is an intervention method that researches educational design (products or processes) in real-life settings, with the dual purpose of generating theories about the domain and developing the design iteratively. This paper is an integrative review with a personal ethnographic...... narrative that draws on Design-Based Research literature, and identifies and discusses elements from Interaction Design and Action Research that the Design-Based Research approach could apply, situating the research in online educational projects, where participants are distributed in time and space...

  11. Ergonomic study and static analysis for new design of electric scooter

    Science.gov (United States)

    Fadzly, M. K.; Munirah, Anis; Shayfull, Z.; Saad, Mohd Sazli

    2017-09-01

    The purpose of this project is to design a battery-powered scooter frame and diversify its functions so that it is more practical in terms of ergonomics and optimal design. The new design is based on ideas drawn from an existing scooter frame, a United States patent design and a European international patent design. The final concept design for the scooter frame was chosen from the best characteristics of three main alternative ideas using the matrix evaluation method. The analysis applied to the frame design, arm, rim and drivetrain components is based on the Cosmos Express program. In conclusion, the design produced is able to carry the maximum load and has more practical ergonomic features.

  12. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential to significantly reduce the number of g-function evaluations, since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel-injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application.
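    To see why g-function calls dominate the cost, note that a single reliability analysis already evaluates the limit-state function g once per sample (or per iteration in FORM); a nested RBDO loop repeats that entire analysis inside every optimizer step. The sketch below shows one Monte Carlo reliability analysis per candidate design; the limit-state form g = R − S and all distributions are assumed for illustration, not taken from the cam roller study.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(capacity_mean, n=200_000):
    # Limit-state (g) function samples: g = R - S, failure when g < 0.
    # R: capacity, scaled by the design variable; S: applied load.
    # (Distributions are illustrative, not from the paper.)
    R = rng.normal(capacity_mean, 0.1 * capacity_mean, n)
    S = rng.normal(1.0, 0.2, n)
    return R - S

# One Monte Carlo pass over the g-function = one full reliability
# analysis; the conventional nested approach would repeat this inside
# every optimizer iteration, which is what the paper's method avoids.
pfs = {}
for d in (1.2, 1.5, 1.8):
    pfs[d] = float(np.mean(g(d) < 0.0))
    print(f"design d={d}: P_f \u2248 {pfs[d]:.4f}")
```

For d = 1.5 the analytic failure probability is Φ(−2) ≈ 0.023, so the Monte Carlo estimate should land close to that; the failure probability falls monotonically as the design variable grows.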

  13. Classification of polycystic ovary based on ultrasound images using competitive neural network

    Science.gov (United States)

    Dewi, R. M.; Adiwijaya; Wisesty, U. N.; Jondri

    2018-03-01

    Infertility in the female reproductive system can be caused by inhibition of the follicle maturation process, which leaves many immature follicles in the ovaries, a condition called polycystic ovaries (PCO). PCO detection is still performed manually by a gynecologist, who counts the number and measures the size of the follicles in the ovaries, which takes a long time and requires high accuracy. In general, PCO can be detected by stereological calculation or by feature extraction and classification. In this paper, we designed a system to classify PCO using feature extraction (the Gabor wavelet method) and a Competitive Neural Network (CNN). The CNN was selected because it combines the Hamming Net and the MaxNet, so that classification can be performed based on the specific characteristics of the ultrasound data. In system testing, the Competitive Neural Network obtained a highest accuracy of 80.84% with a processing time of 60.64 seconds (using 32 feature vectors and weight and bias values of 0.03 and 0.002, respectively).
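    The Hamming Net + MaxNet combination mentioned above can be sketched in a few lines: a Hamming layer scores an input against stored prototype vectors, and a MaxNet layer applies lateral inhibition until a single winner remains. The tiny bipolar prototypes below stand in for the paper's Gabor-wavelet feature vectors; all values are hypothetical.

```python
import numpy as np

# Bipolar prototype vectors for two hypothetical classes
# ("PCO" vs "normal"); real inputs would be Gabor-wavelet features.
prototypes = np.array([
    [ 1, -1,  1,  1, -1,  1],   # class 0
    [-1,  1, -1,  1,  1, -1],   # class 1
])

def hamming_scores(x):
    # Hamming-net layer: similarity = number of matching components,
    # computed as (w . x + n) / 2 for bipolar vectors of length n.
    n = prototypes.shape[1]
    return (prototypes @ x + n) / 2.0

def maxnet(scores, eps=0.1, max_iter=100):
    # MaxNet layer: each unit inhibits the others until only one
    # activation stays positive; its index is the winning class.
    a = scores.astype(float)
    for _ in range(max_iter):
        a = np.maximum(0.0, a - eps * (a.sum() - a))
        if np.count_nonzero(a) <= 1:
            break
    return int(np.argmax(a))

x = np.array([1, -1, 1, -1, -1, 1])   # noisy version of the class-0 prototype
print("winner:", maxnet(hamming_scores(x)))
```

The winner-take-all dynamics make the classifier robust to a few flipped components in the input, which is the property that motivates using a competitive layer on noisy ultrasound features.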

  14. Mefenamic acid conjugates based on a hydrophilic biopolymer hydroxypropylcellulose: novel prodrug design, characterization and thermal analysis

    International Nuclear Information System (INIS)

    Hussain, M.A.; Kausar, R.; Amin, M.

    2015-01-01

    Macromolecular prodrugs (MPDs) of mefenamic acid were designed onto a cellulose ether derivative, hydroxypropylcellulose (HPC), as ester conjugates. Fabrication of HPC-mefenamic acid conjugates was achieved using p-toluenesulfonyl chloride as an activator of the carboxylic acid (a functional group of the drug) at 80 degree C for 24 h under a nitrogen atmosphere. The reaction proceeded under homogeneous conditions, as the HPC was dissolved in DMAc solvent before use. Imidazole was used as a base. Easy workup reactions resulted in good yields (55-65%) and degrees of substitution (DS) of the drug (0.37-0.99) onto HPC. The DS was calculated by acid-base titration after saponification and by UV/Vis spectrophotometry after hydrolysis, and the values from the two methods were in good agreement. The aqueous- and organic-soluble novel prodrugs of mefenamic acid were purified and characterized by different spectroscopic and thermal analysis techniques. The initial, maximum and final degradation temperatures of HPC, mefenamic acid and the HPC-mefenamic acid conjugates were drawn from thermogravimetric (TG) and derivative TG curves and compared to assess relative thermal stability. The TG analysis indicated that the samples obtained were thermally more stable, with a particular increase in the stability of mefenamic acid within the HPC-mefenamic acid conjugates. These novel MPDs of mefenamic acid (i.e., HPC-mefenamic acid conjugates) may have potential applications in pharmaceutically viable drug design due to their wide range of solubility and the additional thermal stability imparted by MPD formation. (author)

  15. Semianalytic Design Sensitivity Analysis of Nonlinear Structures With a Commercial Finite Element Package

    International Nuclear Information System (INIS)

    Lee, Tae Hee; Yoo, Jung Hun; Choi, Hyeong Cheol

    2002-01-01

    A finite element package is often used as a daily design tool by engineering designers to analyze and improve a design. Although finite element analysis can quite well provide the structural responses for given design variables, it cannot provide enough information to improve the design, such as design sensitivity coefficients. Design sensitivity analysis is an essential step in predicting the change in responses due to a change in design variables and in optimizing a system with the aid of gradient-based optimization techniques. To develop a numerical method of design sensitivity analysis, analytical derivatives based on analytical differentiation of the continuous or discrete finite element equations are effective, but they are difficult to obtain because of the lack of access to internal information of the commercial finite element package, such as shape functions. Therefore, design sensitivity analysis outside the finite element package is necessary for practical application in an industrial setting. In this paper, the semi-analytic method for design sensitivity analysis is used to develop a design sensitivity module outside the commercial finite element package ANSYS. The direct differentiation method is employed to compute the design derivatives of the response, and the pseudo-load for design sensitivity analysis is evaluated efficiently from the design variation of the related internal nodal forces. In particular, an effective method for stress and nonlinear design sensitivity analyses that is independent of the commercial finite element package is also suggested. Numerical examples illustrate the accuracy and efficiency of the developed method and provide insights for implementing the suggested method in other commercial finite element packages.
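    The semi-analytic idea can be shown on a toy model: for K(b) u = F, the direct differentiation method gives du/db = K⁻¹(dF/db − (dK/db) u), and the semi-analytic variant approximates dK/db by finite differences while the pseudo-load solve reuses the same K. The two-spring system and all numbers below are assumptions for illustration, not from the paper.

```python
import numpy as np

def K(b):
    # Global stiffness of a two-spring chain; both spring stiffnesses
    # scale with the design variable b (illustrative model).
    k1, k2 = 100.0 * b, 50.0 * b
    return np.array([[k1 + k2, -k2],
                     [-k2,      k2]])

F = np.array([0.0, 10.0])   # external load, independent of b
b = 2.0

u = np.linalg.solve(K(b), F)          # one finite element solution

# Semi-analytic step: finite-difference the stiffness matrix only,
# then solve a second (pseudo-load) system with the same K.
db = 1e-6 * b
dK_db = (K(b + db) - K(b)) / db
pseudo_load = -dK_db @ u              # dF/db = 0 here
du_db = np.linalg.solve(K(b), pseudo_load)

print("u =", u, " du/db =", du_db)
# Since K is linear in b in this model, the exact sensitivity is -u/b,
# which the semi-analytic result should reproduce closely.
```

The point of the approach is that only K(b + db) needs to be re-evaluated; no internal package data (shape functions, element routines) is required, matching the "outside the package" strategy of the paper.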

  16. [Design and implementation of a competency-based curriculum for medical education].

    Science.gov (United States)

    Risco de Domínguez, Graciela

    2014-01-01

    Competency-based education is a form of designing, developing, delivering and documenting instruction based on a set of objectives and results, and it has been recommended for medical education. This article describes the steps in designing and implementing a competency-based curriculum at a new medical school in a Peruvian university. We present the process followed, including context analysis, mission design, the professional profile, the content and organization of the curriculum, and the evaluation of and resources for the training. Finally, the issues and challenges faced, as well as the lessons learned, are summarized.

  17. Projecting the Future for Design Science Research: An Action‐Case Based Analysis

    DEFF Research Database (Denmark)

    Baskerville, Richard; Pries-Heje, Jan

    2015-01-01

    and theories appears to be a key challenge. In this paper we commence with a DESRIST paper from 2012 that instantiated design principles in an artifact for a bank. That paper included plans and techniques for future use of its principles (propagation), including prescriptions for a five-phase adoption process...... or theories have stimulated many actual projections. We demonstrate these concepts in a case study of propagation: a chemical manufacturer and service provider that adopted the design principles arising from that 2012 DESRIST banking-based design science research. We conclude that generalizability is too well...

  18. Needs Analysis and Course Design; A Framework for Designing Exam Courses

    Directory of Open Access Journals (Sweden)

    Reza Eshtehardi

    2017-09-01

    Full Text Available This paper introduces a framework for designing exam courses and highlights the key role of needs analysis in their design. The main objectives are to offer a framework for designing exam courses, to show the language needs of different students preparing for the IELTS (International English Language Testing System) exam, to offer an analysis of those needs, and to explain how they are taken into account in the design of the course. First, I concentrate on the distinguishing features of exam classes that set them apart from general English classes. Secondly, I introduce a framework for needs analysis and diagnostic testing and highlight the importance of needs analysis for the design of syllabus and language courses. Thirdly, I describe significant features of syllabus design, course assessment, and evaluation procedures.

  19. DESIGN ANALYSIS OF ELECTRICAL MACHINES THROUGH INTEGRATED NUMERICAL APPROACH

    Directory of Open Access Journals (Sweden)

    ARAVIND C.V.

    2016-02-01

    Full Text Available An integrated design platform for newer types of machines is presented in this work. The machine parameters are evaluated using the developed modelling tool. With these parameters, the machine is modelled in a computer-aided design tool. The designed machine is then brought into a simulation tool for electromagnetic and electromechanical analysis. In the simulation, condition settings are performed to set up the materials, meshes, rotational speed and the excitation circuit. Electromagnetic analysis is carried out to predict the behaviour of the machine based on the movement of flux in the machine. In addition, electromechanical analysis is carried out to analyse the speed-torque, current-torque and phase angle-torque characteristics. After the results are analysed, the designed machine is used to generate an S-function block compatible with the MATLAB/SIMULINK tool for dynamic operational characteristics. This allows the integration of an existing drive system with the new machines designed in the modelling tool. An example machine design is presented to validate the usage of such a tool.

  20. Affordances and use plans : an analysis of two alternatives to function-based design

    NARCIS (Netherlands)

    Pols, A.J.K.

    2015-01-01

    Function-based design approaches have been criticized for being too narrow to properly guide design. Specifically, they are said to be unable to cope with nonfunctional considerations, such as cost or maintenance issues without invoking other concepts, such as constraints. This paper investigates

  1. Optics Program Simplifies Analysis and Design

    Science.gov (United States)

    2007-01-01

    Engineers at Goddard Space Flight Center partnered with software experts at Mide Technology Corporation, of Medford, Massachusetts, through a Small Business Innovation Research (SBIR) contract to design the Disturbance-Optics-Controls-Structures (DOCS) Toolbox, a software suite for performing integrated modeling for multidisciplinary analysis and design. The DOCS Toolbox integrates various discipline models into a coupled process math model that can then predict system performance as a function of subsystem design parameters. The system can be optimized for performance; design parameters can be traded; parameter uncertainties can be propagated through the math model to develop error bounds on system predictions; and the model can be updated based on component-, subsystem-, or system-level data. The Toolbox also allows the definition of process parameters as explicit functions of the coupled model and includes a number of functions that analyze the coupled system model and provide for redesign. The product is sold commercially by Nightsky Systems Inc., of Raleigh, North Carolina, a spinoff company formed by Mide specifically to market the DOCS Toolbox. Commercial applications include use by contractors developing large space-based optical systems, including Lockheed Martin Corporation, The Boeing Company, and Northrop Grumman Corporation, as well as companies providing technical audit services, like General Dynamics Corporation.
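    One of the listed capabilities, propagating parameter uncertainties into error bounds on system predictions, is commonly implemented as first-order (linear) covariance propagation through the coupled model: Σ_y = J Σ_p Jᵀ, with J the Jacobian at the design point. The toy model and numbers below are illustrative assumptions, not the actual DOCS Toolbox interface.

```python
import numpy as np

# Linear (first-order) uncertainty propagation through a coupled
# performance model y = f(p).
def f(p):
    # Toy coupled model: a single performance metric as a function of
    # two subsystem design parameters (hypothetical).
    return np.array([0.5 * p[0] ** 2 + 0.2 * p[1]])

p0 = np.array([1.0, 2.0])            # design point
Sigma_p = np.diag([0.01, 0.04])      # parameter covariances (assumed)

# Numerical Jacobian of f at the design point.
eps = 1e-6
J = np.column_stack([(f(p0 + eps * e) - f(p0)) / eps
                     for e in np.eye(len(p0))])

Sigma_y = J @ Sigma_p @ J.T          # propagated output covariance
print("prediction:", f(p0), "+/-", np.sqrt(np.diag(Sigma_y)))
```

The square roots of the diagonal of Σ_y give the one-sigma error bounds on the performance predictions at that design point.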

  2. An integrated 3D design, modeling and analysis resource for SSC detector systems

    International Nuclear Information System (INIS)

    DiGiacomo, N.J.; Adams, T.; Anderson, M.K.; Davis, M.; Easom, B.; Gliozzi, J.; Hale, W.M.; Hupp, J.; Killian, K.; Krohn, M.; Leitch, R.; Lajczok, M.; Mason, L.; Mitchell, J.; Pohlen, J.; Wright, T.

    1989-01-01

    Integrated computer aided engineering and design (CAE/CAD) is having a significant impact on the way design, modeling and analysis are performed, from system concept exploration and definition through final design and integration. Experience with integrated CAE/CAD in high technology projects of scale and scope similar to SSC detectors leads the authors to propose an integrated computer-based design, modeling and analysis resource aimed specifically at SSC detector system development. The resource architecture emphasizes value-added contact with data and efficient design, modeling and analysis of components, sub-systems or systems with fidelity appropriate to the task. They begin with a general examination of the design, modeling and analysis cycle in high technology projects, emphasizing the transition from the classical islands of automation to the integrated CAE/CAD-based approach. They follow this with a discussion of lessons learned from various attempts to design and implement integrated CAE/CAD systems in scientific and engineering organizations. They then consider the requirements for design, modeling and analysis during SSC detector development, and describe an appropriate resource architecture. They close with a report on the status of the resource and present some results that are indicative of its performance. 10 refs., 7 figs

  3. Research design and statistical analysis

    CERN Document Server

    Myers, Jerome L; Lorch Jr, Robert F

    2013-01-01

    Research Design and Statistical Analysis provides comprehensive coverage of the design principles and statistical concepts necessary to make sense of real data.  The book's goal is to provide a strong conceptual foundation to enable readers to generalize concepts to new research situations.  Emphasis is placed on the underlying logic and assumptions of the analysis and what it tells the researcher, the limitations of the analysis, and the consequences of violating assumptions.  Sampling, design efficiency, and statistical models are emphasized throughout. As per APA recommendations

  4. An Analysis Of Pole/zero Cancellation In LTR-based Feedback Design

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Jannerup, Ole Erik

    1990-01-01

    The pole/zero cancellation in LTR-based feedback design is analyzed for both full-order and minimal-order observers. The asymptotic behaviour of the sensitivity function from the LTR procedure is given in explicit expressions for the case when a zero is not cancelled by an equivalent...... pole. It is shown that the non-minimum phase case is included as a special case. The results are not based on any specific LTR-method....

  5. Integrated design and performance analysis of the KO HCCR TBM for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dong Won, E-mail: dwlee@kaeri.re.kr [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Jin, Hyung Gon; Lee, Eo Hwak; Yoon, Jae Sung; Kim, Suk Kwon; Lee, Cheol Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    Highlights: • Integrated analysis is performed with the conventional CFD code (ANSYS-CFX). • Overall pressure drop and coolant flow scheme are investigated. • Manifold design is being performed considering flow distribution. - Abstract: To develop tritium breeding technology for a fusion reactor, Korea has participated in the Test Blanket Module (TBM) program in ITER. The He Cooled Ceramic Reflector (HCCR) TBM consists of functional components such as the First Wall (FW), Breeding Zone (BZ), Side Wall (SW), and Back Manifold (BM), and it was designed based on separate analyses for each component in 2012. Based on the individual component analysis models, an integrated model is prepared and a thermal-hydraulic analysis of the HCCR TBM is performed in the present study. The coolant flow distribution from the BM and SW to the FW and BZ, and the resulting structure temperatures, are obtained with the integrated model. It is found that non-uniform flow rates occur at the FW and BZ, causing the design limit (550 °C) to be exceeded in some regions. Based on this integrated model, we will perform design optimization to obtain a uniform flow distribution that satisfies the design requirements.

  6. Stability analysis criteria in landfill design based on the Spanish code

    International Nuclear Information System (INIS)

    Estaire Gepp, J.; Pardo de Santayana, F.

    2014-01-01

    The design of a landfill requires performing stability analyses. To perform such analyses it is necessary to define different design situations and their corresponding safety factors. Geosynthetics are normally used to construct the lining system of landfills, causing critical slip surfaces to pass along one of the various geosynthetic interfaces. Determination of the shear strength of such critical interfaces is, therefore, an extremely important issue. In this paper, these aspects are analysed based on the provisions of the Spanish codes and the technical literature. As a result of the study, tables are presented which relate the different design situations (normal, accidental or extraordinary) to the shear strength of the lining system to be used (peak or residual) and define the minimum factor of safety to be achieved. (Author)
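
    The relation above between design situation and interface strength can be illustrated with the simplest stability model used for lining systems, the infinite-slope (veneer) analysis along a geosynthetic interface. The slope geometry and friction angles below are hypothetical illustration values, not figures from the paper:

```python
import math

def veneer_safety_factor(slope_deg: float, interface_friction_deg: float) -> float:
    """Factor of safety of a dry cover veneer sliding along a cohesionless
    geosynthetic interface (infinite-slope model):
    FS = tan(phi_interface) / tan(beta)."""
    return math.tan(math.radians(interface_friction_deg)) / math.tan(math.radians(slope_deg))

# Illustrative values: 3H:1V slope (~18.4 deg), peak interface friction
# 24 deg for the normal situation, residual 14 deg for the accidental check
beta = math.degrees(math.atan(1 / 3))
fs_peak = veneer_safety_factor(beta, 24.0)      # normal design situation
fs_residual = veneer_safety_factor(beta, 14.0)  # accidental/extraordinary check
```

    With these invented values the peak-strength case gives an FS of about 1.3 while the residual-strength case drops below 1.0, which is why codes tie each design situation to a specific strength (peak or residual) and a minimum factor of safety.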

  7. Interactive Building Design Space Exploration Using Regionalized Sensitivity Analysis

    DEFF Research Database (Denmark)

    Østergård, Torben; Jensen, Rasmus Lund; Maagaard, Steffen

    2017-01-01

    Monte Carlo simulations combined with regionalized sensitivity analysis provide the means to explore a vast, multivariate design space in building design. Typically, sensitivity analysis shows how the variability of model output relates to the uncertainties in model inputs. This reveals which simulation inputs are most important and which have negligible influence on the model output. Popular sensitivity methods include the Morris method, variance-based methods (e.g. Sobol’s), and regression methods (e.g. SRC). However, all these methods only address one output at a time, which makes it difficult...... in combination with the interactive parallel coordinate plot (PCP). The latter is an effective tool to explore stochastic simulations and to find high-performing building designs. The proposed methods help decision makers to focus their attention on the most important design parameters when exploring......
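
    As a rough illustration of the regression branch of these methods, the standardized regression coefficient (SRC) of each input can be estimated directly from Monte Carlo samples. The building inputs and the linear "energy model" below are invented for the sketch, and the simple covariance formula assumes the inputs are sampled independently:

```python
import random
import statistics

def src(xs, ys):
    """Standardized regression coefficient of one input, assuming the Monte
    Carlo inputs were sampled independently of each other:
    b = cov(x, y) / var(x), SRC = b * sd(x) / sd(y)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    b = cov / statistics.variance(xs)
    return b * statistics.stdev(xs) / statistics.stdev(ys)

random.seed(1)
n = 2000
# Hypothetical building inputs: envelope U-value and air-change rate
u_value = [random.uniform(0.1, 0.3) for _ in range(n)]
ach = [random.uniform(0.5, 1.5) for _ in range(n)]
# Invented linear energy model: strongly driven by U-value, weakly by ACH
energy = [100.0 - 150.0 * u + 2.0 * a + random.gauss(0.0, 1.0)
          for u, a in zip(u_value, ach)]
s_u = abs(src(u_value, energy))
s_ach = abs(src(ach, energy))
print(s_u, s_ach)  # the U-value dominates the output variance
```

    Ranking inputs by |SRC| is exactly the "which inputs matter" screening step that the abstract describes, here for a single output.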

  8. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
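
    The wrapper pattern described above (update a text input deck, run the code, scrape response values from its output listing) can be sketched in a few lines. The `GrossWeight`/`CruisePower` names and the `name = value` format are placeholders for illustration, not the actual NDARC or CAMRAD II file syntax:

```python
import re

def update_input(deck: str, values: dict) -> str:
    """Overwrite 'name = value' assignments in a text input deck with new
    design-variable values (hypothetical format, not real NDARC syntax)."""
    for name, val in values.items():
        deck = re.sub(rf"(?m)^{name}\s*=.*$", f"{name} = {val}", deck)
    return deck

def extract_response(listing: str, name: str) -> float:
    """Scrape a named response variable from a file-based output listing."""
    match = re.search(rf"{name}\s*=\s*([-+0-9.Ee]+)", listing)
    if match is None:
        raise KeyError(name)
    return float(match.group(1))

deck = "GrossWeight = 6000.0\nRotorRadius = 7.5\n"
deck = update_input(deck, {"GrossWeight": 6500.0})
# The optimizer would now execute the analysis code on the updated deck;
# here we fake its output listing:
listing = "CruisePower = 701.5\n"
power = extract_response(listing, "CruisePower")
```

    An OpenMDAO component wrapping a file-based code does essentially this on every evaluation: write design variables in, run, pull response variables out.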

  9. Design and Transmission Analysis of an Asymmetrical Spherical Parallel Manipulator

    DEFF Research Database (Denmark)

    Wu, Guanglei; Caro, Stéphane; Wang, Jiawei

    2015-01-01

    This paper presents an asymmetrical spherical parallel manipulator and its transmissibility analysis. This manipulator contains a center shaft both to generate a decoupled unlimited-torsion motion and to support the mobile platform for high positioning accuracy. This work addresses the transmission analysis and optimal design of the proposed manipulator based on its kinematic analysis. The input and output transmission indices of the manipulator are defined for its optimum design based on the virtual coefficient between the transmission wrenches and twist screws. The sets of optimal parameters are identified and the distribution of the transmission index is visualized. Moreover, a comparative study regarding the performances of the symmetrical spherical parallel manipulators is conducted, and the comparison shows the advantages of the proposed manipulator with respect to its spherical parallel...

  10. Issues in risk analysis of passive LWR designs

    International Nuclear Information System (INIS)

    Youngblood, R.W.; Pratt, W.T.; Amico, P.J.; Gallagher, D.

    1992-01-01

    This paper discusses issues which bear on the question of how safety is to be demonstrated for ''simplified passive'' light water reactor (LWR) designs. First, a very simplified comparison is made between certain systems in today's plants, comparable systems in evolutionary designs, and comparable systems in the simplified passive designs, in order to introduce the issues. This discussion is not intended to describe the designs comprehensively, but is offered only to show why certain issues seem to be important in these particular designs. Next, an important class of accident sequences is described; finally, based on this discussion, some priorities in risk analysis are presented and discussed

  11. Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit.

    Science.gov (United States)

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Jenkins, Daniel P

    2015-01-01

    Cognitive work analysis has been applied in the design of numerous sociotechnical systems. The process used to translate analysis outputs into design concepts, however, is not always clear. Moreover, structured processes for translating the outputs of ergonomics methods into concrete designs are lacking. This paper introduces the Cognitive Work Analysis Design Toolkit (CWA-DT), a design approach which has been developed specifically to provide a structured means of incorporating cognitive work analysis outputs in design using design principles and values derived from sociotechnical systems theory. This paper outlines the CWA-DT and describes its application in a public transport ticketing design case study. Qualitative and quantitative evaluations of the process provide promising early evidence that the toolkit fulfils the evaluation criteria identified for its success, with opportunities for improvement also highlighted. The Cognitive Work Analysis Design Toolkit has been developed to provide ergonomics practitioners with a structured approach for translating the outputs of cognitive work analysis into design solutions. This paper demonstrates an application of the toolkit and provides evaluation findings.

  12. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  13. Web-Based Intervention for Women With Type 1 Diabetes in Pregnancy and Early Motherhood: Critical Analysis of Adherence to Technological Elements and Study Design.

    Science.gov (United States)

    Berg, Marie; Linden, Karolina; Adolfsson, Annsofie; Sparud Lundin, Carina; Ranerup, Agneta

    2018-05-02

    Numerous Web-based interventions have been implemented to promote health and health-related behaviors in persons with chronic conditions. Using randomized controlled trials to evaluate such interventions creates a range of challenges, which in turn can influence the study outcome. Applying a critical perspective when evaluating Web-based health interventions is important. The objective of this study was to critically analyze and discuss the challenges of conducting a Web-based health intervention as a randomized controlled trial. The MODIAB-Web study was critically examined using an exploratory case study methodology and the framework for analysis offered through the Persuasive Systems Design model. Focus was on technology, study design, and Web-based support usage, with particular attention to the forum for peer support. Descriptive statistics and qualitative content analysis were used. The persuasive content and technological elements in the design of the randomized controlled trial included all four categories of the Persuasive Systems Design model, but not all design principles were implemented. The study duration was extended to a period of four and a half years. Of 81 active participants in the intervention group, a maximum of 36 women were simultaneously active. User adherence varied greatly, with a median of 91 individual log-ins. The forum for peer support was used by 63 participants. Although only about one-third of the participants interacted in the forum, there was a fairly rich exchange of experiences and advice between them. Thus, adherence in terms of social interactions was negatively affected by limited active participation due to a prolonged recruitment process and randomization effects. Lessons learned from this critical analysis are that technology and study design matter and might mutually influence each other. In Web-based interventions, the use of design theories enables utilization of the full potential of technology and promotes adherence.

  14. Model-Based Integrated Process Design and Controller Design of Chemical Processes

    DEFF Research Database (Denmark)

    Abd Hamid, Mohd Kamaruddin Bin

    This thesis describes the development and application of a new systematic model-based methodology for performing integrated process design and controller design (IPDC) of chemical processes. The new methodology is simple to apply, easy to visualize and efficient to solve. Here, the IPDC problem, which is typically formulated as a mathematical programming (optimization with constraints) problem, is solved by the so-called reverse approach by decomposing it into four sequential hierarchical sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection. ...... are ordered according to the defined performance criteria (objective function). The final selected design is then verified through rigorous simulation. In the pre-analysis sub-problem, the concepts of attainable region and driving force are used to locate the optimal process-controller design solution...

  15. Virus-inspired design principles of nanoparticle-based bioagents.

    Directory of Open Access Journals (Sweden)

    Hongyan Yuan

    Full Text Available The high effectiveness and robustness of receptor-mediated viral invasion of living cells sheds light on the biomimetic design of nanoparticle (NP)-based therapeutics. Through thermodynamic analysis, we elucidate that the mechanisms governing both the endocytic time of a single NP and the cellular uptake can be unified into a general energy-balance framework of NP-membrane adhesion and membrane deformation. Yet the NP-membrane adhesion strength is a globally variable quantity that effectively regulates the NP uptake rate. Our analysis shows that the uptake rate depends interrelatedly on the particle size and ligand density, in contrast to the widely reported size effect. Our model predicts that the optimal radius of NPs for maximal uptake rate falls in the range of 25-30 nm, and that optimally several tens of ligands should be coated onto each NP. These findings are supported by both recent experiments and typical viral structures, and serve as fundamental principles for the rational design of NP-based nanomedicine.
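
    The flavor of such an energy-balance argument can be shown with the textbook membrane-wrapping estimate (a simplification, not the paper's full model): fully wrapping a spherical NP costs a radius-independent bending energy of 8πκ and gains adhesion energy w·4πR², so wrapping becomes favorable only for R ≥ √(2κ/w). With commonly assumed membrane parameters this lands near the tens-of-nanometres optimum quoted above:

```python
import math

KBT = 4.1e-21  # thermal energy at room temperature, J

def min_wrapping_radius(kappa, w):
    """Smallest NP radius for which full membrane wrapping pays off:
    bending cost of wrapping a sphere (8*pi*kappa, radius independent)
    balanced against the adhesion gain (w * 4*pi*R**2) gives
    R = sqrt(2*kappa/w)."""
    return math.sqrt(2.0 * kappa / w)

# Assumed illustrative values: bending rigidity ~20 kT,
# adhesion energy density ~0.2 mJ/m^2
r = min_wrapping_radius(20 * KBT, 2e-4)
print(r * 1e9, "nm")  # a few tens of nanometres
```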

  16. The Dimensions of the Orbital Cavity Based on High-Resolution Computed Tomography of Human Cadavers

    DEFF Research Database (Denmark)

    Felding, Ulrik Ascanius; Bloch, Sune Land; Buchwald, Christian von

    2016-01-01

    ... for surface area. To the authors' knowledge, this study is the first to have measured the entire surface area of the orbital cavity. The volume and surface area of the orbital cavity were estimated in computed tomography scans of 11 human cadavers using unbiased stereological sampling techniques. The mean (± SD) total volume and total surface area of the orbital cavities were 24.27 ± 3.88 cm³ and 32.47 ± 2.96 cm², respectively. There was no significant difference in volume (P = 0.315) or surface area (P = 0.566) between the 2 orbital cavities. The stereological technique proved to be a robust and unbiased method that may be used as a gold standard for comparison with automated computer software. Future imaging studies in blow-out fracture patients may be based on individual and relative calculation involving both herniated volume and fractured surface area in relation to the total volume and surface area...
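
    A standard design-based estimator behind such stereological volume measurements is the Cavalieri principle: count grid points hitting the structure on systematically spaced sections and scale by the section spacing and the area associated with each point. The point counts below are invented for illustration:

```python
def cavalieri_volume(point_counts, slice_spacing_cm, area_per_point_cm2):
    """Design-based (unbiased) Cavalieri volume estimate from systematically
    spaced sections: V = t * a(p) * sum(P), with t the section spacing,
    a(p) the grid area associated with each point, and P the point counts."""
    return slice_spacing_cm * area_per_point_cm2 * sum(point_counts)

# Invented point counts from a grid overlaid on successive CT sections
counts = [0, 14, 31, 42, 45, 38, 22, 9, 0]
volume = cavalieri_volume(counts, slice_spacing_cm=0.3, area_per_point_cm2=0.04)
print(volume, "cm^3")  # about 2.4 cm^3 for these invented counts
```

    The estimator is unbiased as long as the first section position is random within one spacing, which is what makes it usable as a gold standard against automated software.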

  17. Identification of Design Work Patterns by Retrospective Analysis of Work Sheets

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp

    1999-01-01

    ... project is carried out where we seek to identify design work patterns by retrospective analysis of documentation created during design projects. An element to satisfy the wish for an efficient design process could be to identify work patterns applied by engineering designers, evaluate these patterns with respect to their efficiency, and reuse the most efficient in future projects. Thus, the objective of this research is to analyse design projects in order to identify the work patterns applied. Based on an evaluation of identified work patterns we expect that a recommendation of work patterns supporting an efficient design process can be established. In this paper we describe the analysis method, and present observations from analyses of three projects.

  18. System analysis and design

    International Nuclear Information System (INIS)

    Son, Seung Hui

    2004-02-01

    This book deals with information technology and business processes, information system architecture, methods of system development, system development planning (including problem analysis and feasibility analysis), cases of system development, understanding users' demands, analysis of users' demands using traditional methods and using integrated information system architecture, system design using integrated information system architecture, system implementation, and system maintenance.

  19. A comparison of an energy/economic-based against an exergoeconomic-based multi-objective optimisation for low carbon building energy design

    International Nuclear Information System (INIS)

    García Kerdan, Iván; Raslan, Rokia; Ruyssevelt, Paul; Morillón Gálvez, David

    2017-01-01

    This study presents a comparison of the optimisation of building energy retrofit strategies from two different perspectives: an energy/economic-based analysis and an exergy/exergoeconomic-based analysis. A recently retrofitted community centre is used as a case study. ExRET-Opt, a novel building energy/exergy simulation tool with multi-objective optimisation capabilities based on NSGA-II, is used to run both analyses. The first analysis, based on the 1st Law only, simultaneously optimises building energy use and the design's Net Present Value (NPV). The second analysis, based on the 1st and the 2nd Laws, simultaneously optimises exergy destructions and the exergoeconomic cost-benefit index. Occupant thermal comfort is considered as a common objective function for both approaches. The aim is to assess the difference between the methods and to compare performance on the main indicators, considering the same decision variables and constraints. Outputs show that the inclusion of exergy/exergoeconomics as objective functions in the optimisation procedure results in similar 1st Law and thermal comfort outputs, while providing solutions with less environmental impact under similar capital investments. These outputs demonstrate that the 1st Law is only a necessary condition, while the combined use of the 1st and 2nd Laws becomes a sufficient condition for the analysis and design of low carbon buildings. - Highlights: • The study compares an energy-based and an exergy-based building design optimisation. • Occupant thermal comfort is considered as a common objective function. • A comparison of thermodynamic outputs is made against the actual retrofit design. • Under similar constraints, second law optimisation presents better overall results. • Exergoeconomic optimisation solutions improve building exergy efficiency twofold.
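
    At the core of an NSGA-II style multi-objective optimisation is non-dominated sorting of candidate designs. A minimal sketch, with invented objective pairs and both objectives treated as "lower is better":

```python
def non_dominated(points):
    """Pareto-optimal subset when every objective is minimized: a point is
    dominated if some other point is <= in all objectives and differs."""
    front = []
    for p in points:
        if not any(all(qi <= pi for qi, pi in zip(q, p)) and q != p
                   for q in points):
            front.append(p)
    return front

# Invented retrofit candidates: (exergy destructions, exergoeconomic index)
designs = [(120.0, 0.80), (100.0, 0.90), (100.0, 0.70),
           (90.0, 0.95), (150.0, 0.60)]
print(non_dominated(designs))  # three candidates survive
```

    NSGA-II repeats this sorting over successive fronts and adds crowding-distance selection; the surviving front is the trade-off set from which a decision maker picks a design.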

  20. Scenario-based design: a method for connecting information system design with public health operations and emergency management.

    Science.gov (United States)

    Reeder, Blaine; Turner, Anne M

    2011-12-01

    Responding to public health emergencies requires rapid and accurate assessment of workforce availability under adverse and changing circumstances. However, public health information systems to support resource management during both routine and emergency operations are currently lacking. We applied scenario-based design as an approach to engage public health practitioners in the creation and validation of an information design to support routine and emergency public health activities. Using semi-structured interviews we identified the information needs and activities of senior public health managers of a large municipal health department during routine and emergency operations. Interview analysis identified 25 information needs for public health operations management. The identified information needs were used in conjunction with scenario-based design to create 25 scenarios of use and a public health manager persona. Scenarios of use and persona were validated and modified based on follow-up surveys with study participants. Scenarios were used to test and gain feedback on a pilot information system. The method of scenario-based design was applied to represent the resource management needs of senior-level public health managers under routine and disaster settings. Scenario-based design can be a useful tool for engaging public health practitioners in the design process and to validate an information system design. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. Design spectrums based on earthquakes recorded at tarbela

    International Nuclear Information System (INIS)

    Rizwan, M.; Ilyas, M.; Masood, A.

    2008-01-01

    The first seismological network in Pakistan was set up in early 1969 at Tarbela, the location of the largest water reservoir in the country. The network consisted of analog accelerographs and seismographs. Since the installation, many seismic events of different magnitudes have occurred and were recorded by the installed instruments. The analog recorded time histories have been digitized, and data from twelve earthquakes, irrespective of the type of soil, have been used to derive elastic design spectrums for Tarbela, Pakistan. The PGA scaling factors, based on the risk analysis studies carried out for the region, are also given for each component. The design spectrums suggested will be very useful for new construction in the region and its surroundings. The digitized time histories will be useful for seismic response analysis of structures and seismic risk analysis of the region. (author)

  2. Design, simulation and comparative analysis of CNT based cascode operational transconductance amplifiers

    Science.gov (United States)

    Nizamuddin, M.; Loan, Sajad A.; Alamoud, Abdul R.; Abbassi, Shuja A.

    2015-10-01

    In this work, design and calibrated simulation of carbon nanotube field effect transistor (CNTFET)-based cascode operational transconductance amplifiers (COTA) have been performed. Three structures of CNTFET-based COTAs have been designed using HSPICE and have been compared with the conventional CMOS-based COTAs. The proposed COTAs include one using pure CNTFETs and two others that employ CNTFETs as well as conventional MOSFETs. The simulation study has revealed that the CNTFET-based COTAs significantly outperform the conventional MOSFET-based COTAs. Significant increases in dc gain, output resistance and slew rate of 81.4%, 25% and 13.2%, respectively, have been achieved in the proposed pure CNT-based COTA in comparison to the conventional CMOS-based COTA. The power consumption of the pure CNT-COTA is 324 times lower than that of the conventional CMOS-COTA. Further, the phase margin (PM), gain margin (GM), common mode and power supply rejection ratios have been significantly increased in the proposed CNT-based COTAs in comparison to the conventional CMOS-based COTAs. Furthermore, to see the advantage of cascoding, the proposed CNT-based cascode OTAs have been compared with the CNT-based OTAs. It has been observed that by incorporating the concept of cascode in the CNTFET-based OTAs, significant increases in gain (12.5%) and output resistance (13.07%) have been achieved. The performance of the proposed COTAs has been further observed by changing the number of CNTs (N), CNT pitch (S) and CNT diameter (DCNT) in the CNTFETs used. It has been observed that the performance of the proposed COTAs can be significantly improved by using optimum values of N, S and DCNT.

  3. Moving Average Filter-Based Phase-Locked Loops: Performance Analysis and Design Guidelines

    DEFF Research Database (Denmark)

    Golestan, Saeed; Ramezani, Malek; Guerrero, Josep M.

    2014-01-01

    ... this challenge, incorporating moving average filter(s) (MAF) into the PLL structure has been proposed in some recent literature. A MAF is a linear-phase finite impulse response filter which can act as an ideal low-pass filter if certain conditions hold. The main aim of this paper is to present the control design guidelines for a typical MAF-based PLL. The paper starts with a general description of MAFs. The main challenge associated with using MAFs is then explained, and its possible solutions are discussed. The paper then proceeds with a brief overview of the different MAF-based PLLs. In each case, the PLL block diagram description is shown, the advantages and limitations are briefly discussed, and the tuning approach (if available) is evaluated. The paper then presents two systematic methods to design the control parameters of a typical MAF-based PLL: one for the case of using a proportional...
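
    The reason a MAF can behave as an ideal low-pass filter is that a boxcar average whose window spans an integer number of periods of a disturbance removes it completely while passing the dc term the loop filter needs. A minimal sketch (sample rate, window length and ripple frequency are chosen for illustration):

```python
import math

def moving_average(signal, window):
    """Boxcar (moving average) filter output once the window is full. A MAF
    nulls every frequency whose period divides the window length."""
    return [sum(signal[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(signal))]

fs = 10_000   # sample rate, Hz (illustrative)
window = 100  # 10 ms window -> notches at 100 Hz, 200 Hz, ...
t = [i / fs for i in range(1000)]
# dc term (the quantity the PLL loop needs) plus a 100 Hz ripple disturbance
sig = [1.0 + 0.5 * math.sin(2 * math.pi * 100 * ti) for ti in t]
out = moving_average(sig, window)
err = max(abs(v - 1.0) for v in out)  # residual ripple after the MAF
```

    Because the 10 ms window spans exactly one period of the 100 Hz ripple, the residual `err` is at floating-point level; the price, as the paper discusses, is the delay the window introduces into the loop.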

  4. A bottom collider vertex detector design, Monte-Carlo simulation and analysis package

    International Nuclear Information System (INIS)

    Lebrun, P.

    1990-01-01

    A detailed simulation of the BCD vertex detector is underway. Specifications and global design issues are briefly reviewed. The BCD design based on double-sided strip detectors is described in more detail. The GEANT3-based Monte-Carlo program and the analysis package used to estimate detector performance are discussed in detail. The current status of the expected resolution and signal-to-noise ratio for the ''golden'' CP-violating mode B_d → π⁺π⁻ is presented. These calculations have been done at FNAL energy (√s = 2.0 TeV). Emphasis is placed on design issues, analysis techniques and related software rather than physics potential. 20 refs., 46 figs

  5. Using Social Media Sentiment Analysis for Interaction Design Choices

    DEFF Research Database (Denmark)

    McGuire, Mark; Kampf, Constance Elizabeth

    2015-01-01

    Social media analytics is an emerging skill for organizations. Currently, developers are exploring ways to create tools that simplify social media analysis. These tools tend to focus on gathering data and using systems to make it meaningful. However, we contend that making social media data meaningful is by nature a human-computer interaction problem. We examine this problem in the emerging field of sentiment analysis, exploring criteria for designing sentiment analysis systems grounded in human-computer interaction (HCI). We contend that effective sentiment analysis affects audience analysis...
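
    As a toy example of the kind of system under discussion, a minimal lexicon-based sentiment scorer might look as follows; the lexicon and the single negation rule are invented for the sketch and are far simpler than production systems:

```python
# Toy lexicon; real systems use curated resources and handle emoji,
# intensifiers, sarcasm, domain drift, etc.
LEXICON = {"love": 2, "great": 2, "good": 1, "slow": -1, "bad": -2, "hate": -2}
NEGATORS = {"not", "never", "no"}

def sentiment(text: str) -> int:
    """Sum lexicon scores over tokens, flipping the sign of a scored word
    that immediately follows a negator."""
    tokens = text.lower().split()
    score = 0
    for i, tok in enumerate(tokens):
        s = LEXICON.get(tok.strip(".,!?"), 0)
        if i > 0 and tokens[i - 1] in NEGATORS:
            s = -s
        score += s
    return score

print(sentiment("I love this product"))    # 2
print(sentiment("not good, really slow"))  # -2
```

    The HCI question the abstract raises starts exactly here: a raw score like this only becomes meaningful once an interface lets an analyst inspect, question and correct it.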

  6. Intermediary object for participative design processes based on the ergonomic work analysis

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  7. Improving the Aircraft Design Process Using Web-based Modeling and Simulation

    Science.gov (United States)

    Reed, John A.; Follen, Gregory J.; Afjeh, Abdollah A.

    2003-01-01

    Designing and developing new aircraft systems is time-consuming and expensive. Computational simulation is a promising means for reducing design cycle times, but requires a flexible software environment capable of integrating advanced multidisciplinary and multifidelity analysis methods, dynamically managing data across heterogeneous computing platforms, and distributing computationally complex tasks. Web-based simulation, with its emphasis on collaborative composition of simulation models, distributed heterogeneous execution, and dynamic multimedia documentation, has the potential to meet these requirements. This paper outlines the current aircraft design process, highlighting its problems and complexities, and presents our vision of an aircraft design process using Web-based modeling and simulation.

  8. Planning for a data base system to support satellite conceptual design

    Science.gov (United States)

    Claydon, C. R.

    1976-01-01

    The conceptual design of an automated satellite design data base system is presented. The satellite catalog in the system includes data for all earth orbital satellites funded to the hardware stage for launch between 1970 and 1980, and provides a concise compilation of satellite capabilities and design parameters. The costs of satellite subsystems and components will be added to the data base. Data elements are listed and discussed. Sensor and science and applications opportunities catalogs will be included in the data system. Capabilities of the BASIS storage, retrieval, and analysis system are used in the system design.

  9. A novel graphical technique for Pinch Analysis applications: Energy Targets and grassroots design

    International Nuclear Information System (INIS)

    Gadalla, Mamdouh A.

    2015-01-01

    Graphical abstract: A new HEN graphical design. - Highlights: • A new graphical technique for heat exchanger network design. • Pinch Analysis principles and design rules are better interpreted. • Graphical guidelines for optimum heat integration. • New temperature-based graphs provide user-interactive features. - Abstract: Pinch Analysis has been for decades a leading tool for energy integration in retrofit and design. This paper presents a new graphical technique, based on Pinch Analysis, for the grassroots design of heat exchanger networks. In the new graph, the temperatures of hot streams are plotted versus those of the cold streams. The temperature–temperature based graph is constructed to include temperatures of hot and cold streams as straight lines, horizontal lines for hot streams, and vertical lines for cold streams. The graph is applied to determine the pinch temperatures and Energy Targets. It is then used to synthesise graphically a complete exchanger network, achieving the Energy Targets. Within the new graph, exchangers are represented by inclined straight lines, whose slopes are proportional to the ratio of heat capacities and flows. Pinch Analysis principles for design are easily interpreted using this new graphical technique to design a complete exchanger network. Network designs achieved by the new technique can guarantee maximum heat recovery. The new technique can also be employed to simulate basic designs of heat exchanger networks. The strengths of the new tool are that it is easily applied using computers, requires no commercial software, and can be used for academic purposes/engineering education.
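
    The Energy Targets that such a graph reproduces are classically computed with the problem-table (cascade) algorithm of Pinch Analysis: shift stream temperatures by ±ΔTmin/2, balance each shifted temperature interval, and cascade the surpluses from the top. The sketch below applies it to a well-known four-stream textbook example, not data from this paper:

```python
def energy_targets(hot_streams, cold_streams, dt_min):
    """Problem-table (cascade) algorithm behind Pinch Analysis Energy
    Targets. Streams are (supply_T, target_T, CP). Hot temperatures are
    shifted down by dt_min/2 and cold ones up, then the heat balance of
    each shifted interval is cascaded from the hottest interval down."""
    shifted = []
    for ts, tt, cp in hot_streams:
        shifted.append((ts - dt_min / 2, tt - dt_min / 2, cp, "hot"))
    for ts, tt, cp in cold_streams:
        shifted.append((ts + dt_min / 2, tt + dt_min / 2, cp, "cold"))
    bounds = sorted({t for ts, tt, cp, kind in shifted for t in (ts, tt)},
                    reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):
        net = 0.0  # interval surplus (+) or deficit (-)
        for ts, tt, cp, kind in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            overlap = max(0.0, min(hi, top) - max(lo, bot))
            net += cp * overlap if kind == "hot" else -cp * overlap
        heat += net
        cascade.append(heat)
    q_hot = max(0.0, -min(cascade))  # minimum hot utility
    q_cold = cascade[-1] + q_hot     # minimum cold utility
    return q_hot, q_cold

# Classic four-stream example (temperatures in degC, CP in kW/degC)
hot = [(250, 40, 0.15), (200, 80, 0.25)]
cold = [(20, 180, 0.20), (140, 230, 0.30)]
q_hot, q_cold = energy_targets(hot, cold, dt_min=10)
print(q_hot, q_cold)  # 7.5 kW hot and 10.0 kW cold utility
```

    Any network synthesised on the new temperature-temperature graph should match these minimum utilities if it achieves maximum heat recovery.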

  10. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    Science.gov (United States)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    Along with the development of the agricultural economy, the types of farm machinery products increase gradually, and ergonomics issues are becoming more and more prominent. The widespread application of computer-aided machinery design makes it possible for farm machinery design to be intuitive, flexible and convenient. At present, because existing computer-aided ergonomics software lacks a human body database suited to farm machinery design in China, farm machinery designs show deviations in ergonomics analysis. This article proposes using the open database interface in CATIA to establish a human body database aimed at farm machinery design; reading the human body data into the ergonomics module of CATIA can produce a virtual body for practical application, and the human posture analysis and human activity analysis modules can then be used to analyse the ergonomics of farm machinery. In this way, a computer-aided farm machinery design method based on ergonomics can be realized.

  11. Irradiation Pattern Analysis for Designing Light Sources-Based on Light Emitting Diodes

    International Nuclear Information System (INIS)

    Rojas, E.; Stolik, S.; La Rosa, J. de; Valor, A.

    2016-01-01

    Nowadays it is possible to design light sources with a specific irradiation pattern for many applications. Light emitting diodes present features such as high luminous efficiency, durability, reliability and flexibility, among others, as the result of their rapid development. In this paper the analysis of the irradiation pattern of light emitting diodes is presented. The approximation of these irradiation patterns by both Lambertian and Gaussian functions for the design of light sources is proposed. Finally, the obtained results and the usefulness of fitting the irradiation pattern of light emitting diodes with these functions are discussed. (Author)
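
    A common way to carry out the Lambertian approximation mentioned above is the generalized Lambertian model I(θ) = I₀ cosᵐ θ, where the exponent m is fixed by the LED's half-intensity angle. A small illustrative sketch (not the authors' code):

```python
import math

def lambertian_order(theta_half_deg):
    """Lambertian exponent m from the half-intensity (50%) viewing angle."""
    return math.log(0.5) / math.log(math.cos(math.radians(theta_half_deg)))

def relative_intensity(theta_deg, m):
    """Generalized Lambertian pattern: I(theta)/I0 = cos^m(theta)."""
    return math.cos(math.radians(theta_deg)) ** m

m = lambertian_order(60.0)  # m ≈ 1 for an ideal Lambertian emitter (60 deg half-angle)
i_rel = relative_intensity(30.0, m)  # intensity relative to on-axis at 30 deg
```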

  12. Research on SDG-Based Qualitative Reasoning in Conceptual Design

    Directory of Open Access Journals (Sweden)

    Kai Li

    2013-01-01

    Full Text Available Conceptual design is the initial stage of the product life cycle, whose main purposes include function creation, function decomposition, and function and subfunction design. At this stage, the information about product function and structure has the characteristics of imprecision, incompleteness, and being qualitative, among others, which will affect the validity of conceptual design. In this paper, the signed directed graph is used to reveal the inherent causal relationships and interactions among the variables and to find qualitative interactions between design variables and the design purpose with the help of causal sequence analysis and constraint propagation. In the case of incomplete information, qualitative reasoning, which can predict qualitative behavior, can improve the level of computer-aided conceptual design. To some extent, qualitative reasoning plays a supplementary role in evaluating schemes and predicting function. Finally, using the problem of planar four-bar mechanism design, a qualitative reasoning flowchart based on the Signed Directed Graph is introduced, and an analysis is made of how to adjust design parameters to make the trajectory of a moving point reach the predetermined position, so as to meet the design requirements and achieve the effect that designers expect in conceptual design.
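
    The qualitative influence of a design variable on the design purpose can be read off a signed directed graph by multiplying edge signs along a causal path, as in this hypothetical sketch (the graph and variable names are invented for illustration, not taken from the paper):

```python
def path_influence(graph, path):
    """Product of edge signs along a causal path in a signed directed graph.

    graph: dict mapping (source, target) edges to +1 or -1.
    Returns +1 (reinforcing) or -1 (opposing) overall influence.
    """
    sign = 1
    for u, v in zip(path, path[1:]):
        sign *= graph[(u, v)]
    return sign

# toy SDG for a linkage: crank_length -> coupler_angle (+), coupler_angle -> tip_y (-)
sdg = {("crank_length", "coupler_angle"): +1,
       ("coupler_angle", "tip_y"): -1}
influence = path_influence(sdg, ["crank_length", "coupler_angle", "tip_y"])
# influence = -1: increasing crank_length lowers tip_y along this path
```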

  13. A knowledge-based design framework for airplane conceptual and preliminary design

    Science.gov (United States)

    Anemaat, Wilhelmus A. J.

    The goal of the work described herein is to develop the second generation of Advanced Aircraft Analysis (AAA) into an object-oriented structure which can be used in different environments. One such environment is the third generation of AAA with its own user interface; the other environment, with the same AAA methods (i.e. the knowledge), is the AAA-AML program. AAA-AML automates the initial airplane design process using current AAA methods in combination with AMRaven methodologies for dependency tracking and knowledge management, using the TechnoSoft Adaptive Modeling Language (AML). This will lead to the following benefits: (1) Reduced design time: computer-aided design methods can reduce design and development time and replace tedious hand calculations. (2) Better product through improved design: more alternative designs can be evaluated in the same time span, which can lead to improved quality. (3) Reduced design cost: fewer training requirements and fewer calculation errors can yield substantial savings in design time and related cost. (4) Improved efficiency: the design engineer can avoid technically correct but irrelevant calculations on incomplete or out-of-sync information, particularly if the process enables robust geometry earlier. Although numerous advancements in knowledge-based design have been developed for detailed design, currently no such integrated knowledge-based conceptual and preliminary airplane design system exists. The third-generation AAA methods have been tested over a ten-year period on many different airplane designs. Using AAA methods demonstrates significant time savings. The AAA-AML system will be exercised and tested using 27 existing airplanes ranging from single-engine propeller airplanes, business jets, and airliners to UAVs and fighters. Data for the varied sizing methods will be compared with AAA results to validate these methods. One new design, a Light Sport Aircraft (LSA), will be developed as an exercise in using the tool to design a new airplane.

  14. Case-based Agile Fixture Design

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In order to realize the agility of fixture design, such as reconfigurability, rescalability and reusability, the fixture structure is decomposed based on function units from an entirely new point of view, which makes it easy for an agile fixture to be reconfigured and modified. Thereby, the information base of a case-based agile fixture design system is established. A complete case-based agile fixture design model is presented, in which three modules are added relative to other models: case matching in the fixture planning module, a conflict arbitration module, and an agile fixture case modification module. These three modules solve the previous problem that experience and results are difficult to reuse in the design process. Two key techniques in the agile fixture design process, the evaluation of case similarity and restriction-based conflict arbitration, are discussed, and methods are presented to evaluate similarity and resolve conflicts.
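
    The evaluation of case similarity mentioned as a key technique is typically a weighted nearest-neighbour match over attribute-value pairs. The following sketch assumes numeric attributes normalised to [0, 1]; the attribute names and weights are illustrative, not from the paper:

```python
def case_similarity(query, case, weights):
    """Weighted nearest-neighbour similarity over attribute-value pairs.

    query, case: dicts of attribute -> value, values normalised to [0, 1].
    weights: dict of attribute -> importance weight.
    Returns a similarity score in [0, 1]; 1.0 means identical on all attributes.
    """
    total = sum(weights.values())
    score = sum(w * (1.0 - abs(query[a] - case[a])) for a, w in weights.items())
    return score / total

query = {"clamp_force": 0.8, "part_size": 0.3}
stored = {"clamp_force": 0.6, "part_size": 0.3}
sim = case_similarity(query, stored, {"clamp_force": 2.0, "part_size": 1.0})
# sim ≈ 0.867: a close match, candidate for reuse with modification
```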

  15. Stepped-frequency radar sensors theory, analysis and design

    CERN Document Server

    Nguyen, Cam

    2016-01-01

    This book presents the theory, analysis and design of microwave stepped-frequency radar sensors. Stepped-frequency radar sensors are attractive for various sensing applications that require fine resolution. The book consists of five chapters. The first chapter describes the fundamentals of radar sensors, including applications, followed by a review of ultra-wideband pulsed, frequency-modulated continuous-wave (FMCW), and stepped-frequency radar sensors. The second chapter discusses a general analysis of radar sensors, including wave propagation in media and scattering on targets, as well as the radar equation. The third chapter addresses the analysis of stepped-frequency radar sensors, including their principles and design parameters. Chapter 4 presents the development of two stepped-frequency radar sensors at microwave and millimeter-wave frequencies based on microwave integrated circuits (MICs), microwave monolithic integrated circuits (MMICs) and printed-circuit antennas, and discusses their signal processing....

  16. Designing Dietary Recommendations Using System Level Interactomics Analysis and Network-Based Inference

    Directory of Open Access Journals (Sweden)

    Tingting Zheng

    2017-09-01

    diet in disease development. Due to the complexity of analyzing the food composition and eating patterns of individuals, our in silico analysis, using large-scale gene expression datasets and network-based topological features, may serve as a proof of concept in nutritional systems biology for identifying diet-disease relationships and subsequently designing dietary recommendations.

  17. Development of guidelines for inelastic analysis in design of fast reactor components

    International Nuclear Information System (INIS)

    Nakamura, Kyotada; Kasahara, Naoto; Morishita, Masaki; Shibamoto, Hiroshi; Inoue, Kazuhiko; Nakayama, Yasunari

    2008-01-01

    Interim guidelines for the application of inelastic analysis to the design of fast reactor components were developed. These guidelines are referenced from the 'Elevated Temperature Structural Design Guide for Commercialized Fast Reactor (FDS)'. The basic policies of the guidelines are more rational predictions compared with the elastic analysis approach and a guarantee of conservative results for design conditions. The guidelines recommend two kinds of constitutive equations to estimate strains conservatively. They also provide methods for modeling load histories and for estimating fatigue and creep damage based on the results of inelastic analysis. The guidelines were applied to typical design examples, and the results were summarized as exemplars to support users.

  18. A Participatory Design Approach for a Mobile App-Based Personal Response System

    Science.gov (United States)

    Song, Donggil; Oh, Eun Young

    2016-01-01

    This study reports on a participatory design approach including the design, development, implementation, and evaluation of a mobile app-based personal response system (PRS). The first cycle formulated initial design principles through context and needs analysis; the second utilized the collaboration with instructors and experts embodying specific…

  19. Analysis and design optimization of flexible pavement

    Energy Technology Data Exchange (ETDEWEB)

    Mamlouk, M.S.; Zaniewski, J.P.; He, W.

    2000-04-01

    A project-level optimization approach was developed to minimize total pavement cost within an analysis period. Using this approach, the designer is able to select the optimum initial pavement thickness, overlay thickness, and overlay timing. The model in this approach is capable of predicting both pavement performance and condition in terms of roughness, fatigue cracking, and rutting. The developed model combines the American Association of State Highway and Transportation Officials (AASHTO) design procedure and the mechanistic multilayer elastic solution. The Optimization for Pavement Analysis (OPA) computer program was developed using the prescribed approach. The OPA program incorporates the AASHTO equations, the multilayer elastic system ELSYM5 model, and the nonlinear dynamic programming optimization technique. The program is PC-based and can run in either a Windows 3.1 or a Windows 95 environment. Using the OPA program, a typical pavement section was analyzed under different traffic volumes and material properties. The optimum design strategy that produces the minimum total pavement cost in each case was determined. The initial construction cost, overlay cost, highway user cost, and total pavement cost were also calculated. The methodology developed during this research should lead to more cost-effective pavements for agencies adopting the recommended analysis methods.

  20. Design, simulation and comparative analysis of CNT based cascode operational transconductance amplifiers

    International Nuclear Information System (INIS)

    Nizamuddin, M; Loan, Sajad A; Alamoud, Abdul R; Abbassi, Shuja A

    2015-01-01

    In this work, design and calibrated simulation of carbon nanotube field effect transistor (CNTFET)-based cascode operational transconductance amplifiers (COTA) have been performed. Three structures of CNTFET-based COTAs have been designed using HSPICE and have been compared with the conventional CMOS-based COTAs. The proposed COTAs include one using pure CNTFETs and two others that employ CNTFETs as well as conventional MOSFETs. The simulation study has revealed that the CNTFET-based COTAs significantly outperform the conventional MOSFET-based COTAs. Significant increases in dc gain, output resistance and slew rate of 81.4%, 25% and 13.2%, respectively, have been achieved in the proposed pure CNT-based COTA in comparison to the conventional CMOS-based COTA. The power consumption of the pure CNT-COTA is 324 times less than that of the conventional CMOS-COTA. Further, the phase margin (PM), gain margin (GM), common mode and power supply rejection ratios have been significantly increased in the proposed CNT-based COTAs in comparison to the conventional CMOS-based COTAs. Furthermore, to see the advantage of cascoding, the proposed CNT-based cascode OTAs have been compared with CNT-based OTAs. It has been observed that by incorporating the concept of the cascode in the CNTFET-based OTAs, significant increases in gain (12.5%) and output resistance (13.07%) have been achieved. The performance of the proposed COTAs has been further studied by changing the number of CNTs (N), CNT pitch (S) and CNT diameter (D_CNT) in the CNTFETs used. It has been observed that the performance of the proposed COTAs can be significantly improved by using optimum values of N, S and D_CNT. (paper)

  1. Watershed-based survey designs

    Science.gov (United States)

    Detenbeck, N.E.; Cincotta, D.; Denver, J.M.; Greenlee, S.K.; Olsen, A.R.; Pitchford, A.M.

    2005-01-01

    Watershed-based sampling design and assessment tools help serve the multiple goals for water quality monitoring required under the Clean Water Act, including assessment of regional conditions to meet Section 305(b), identification of impaired water bodies or watersheds to meet Section 303(d), and development of empirical relationships between causes or sources of impairment and biological responses. Creation of GIS databases for hydrography, hydrologically corrected digital elevation models, and hydrologic derivatives such as watershed boundaries and upstream–downstream topology of subcatchments would provide a consistent seamless nationwide framework for these designs. The elements of a watershed-based sample framework can be represented either as a continuous infinite set defined by points along a linear stream network, or as a discrete set of watershed polygons. Watershed-based designs can be developed with existing probabilistic survey methods, including the use of unequal probability weighting, stratification, and two-stage frames for sampling. Case studies for monitoring of Atlantic Coastal Plain streams, West Virginia wadeable streams, and coastal Oregon streams illustrate three different approaches for selecting sites for watershed-based survey designs.

  2. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    The advances in computer science and computational algorithms for process modelling, process simulation, numerical methods and design/synthesis algorithms make it advantageous and helpful to employ computer-aided modelling systems and tools for integrated process analysis. This is illustrated in this work, where, through the computer-aided modeling system ICAS-MoT, two first-principles models have been investigated with respect to design and operational issues for solution copolymerization reactors in general, and for the methyl methacrylate/vinyl acetate system in particular. This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and support the development of operational and optimizing control strategies. Model 1 is taken from the literature and is commonly used for the low-conversion region, while Model 2 has...

  3. Development of computational methods of design by analysis for pressure vessel components

    International Nuclear Information System (INIS)

    Bao Shiyi; Zhou Yu; He Shuyan; Wu Honglin

    2005-01-01

    Stress classification is not only one of the key steps when a pressure vessel component is designed by analysis, but also a difficulty which has long puzzled engineers and designers. At present, for calculating and categorizing the stress field of pressure vessel components, several computational methods of design by analysis have been developed and applied, such as Stress Equivalent Linearization, the Two-Step Approach, the Primary Structure method, the Elastic Compensation method, and the GLOSS R-Node method. Moreover, the ASME code also gives an inelastic method of design by analysis for limiting gross plastic deformation only. When pressure vessel components are designed by analysis, there are sometimes large differences between the results calculated using the different methods mentioned above. As a consequence, this is the main reason that limits wide application of the design-by-analysis approach. Recently, a new approach, presented in the new proposal of a European Standard, CEN's unfired pressure vessel standard EN 13445-3, tries to avoid the problems of stress classification by analyzing the various failure mechanisms of the pressure vessel structure directly, based on elastic-plastic theory. In this paper, some of the stress classification methods mentioned above are described briefly, and the computational methods cited in the European pressure vessel standard, such as the Deviatoric Map and nonlinear analysis methods (plastic analysis and limit analysis), are depicted compendiously. Furthermore, the characteristics of the computational methods of design by analysis are summarized to aid in selecting the proper computational method when designing a pressure vessel component by analysis. (authors)

  4. Needs Analysis of the English Writing Skill as the Base to Design the Learning Materials

    Directory of Open Access Journals (Sweden)

    Tenri Ampa Andi

    2018-01-01

    Full Text Available This research used a descriptive method. It was aimed at identifying students' learning needs for the English writing skill as the basis for designing the learning materials. The writing skill covered the analysis of the types of paragraph, types of text, the components of writing and paragraph development. The subjects of the research were the fourth-semester students, 330 in total. Samples were taken randomly at 15%, so the number of samples was 50 students. The research used a questionnaire as the instrument to get responses from the students about their learning needs. The results showed that the learning needs for the writing skill covered the types of paragraph development, the types of text, and the components of writing skill. The types of paragraph development included definition (79.7%), classification (67.0%), listing (59.3%), cause-effect (47.7%), example (47.3%), and comparison (45.7%). The types of text consisted of description (66.0%), news items (59.7%), narration (58.7%), discussion (56.7%), recount (57.0%), and exposition (50.7%). The components of writing skill contained structure (79.6%), vocabulary (79.4%), content (62.0%), organisation (53.6%) and mechanics (34.0%). The implication of the findings will serve as the basis for the teaching and learning process, especially in designing the learning materials for the English writing skill.

  5. Effect of Urtica dioica on morphometric indices of kidney in streptozotocin diabetic rats--a stereological study.

    Science.gov (United States)

    Golalipour, Mohammad Jafar; Gharravi, Anneh Mohammad; Ghafari, Sorya; Afshar, Mohammad

    2007-11-01

    The aim of the present study was to investigate the effect of Urtica dioica on morphometric indices of the kidney in diabetic rats. Thirty male adult albino Wistar rats of 125-175 g were divided into control, diabetic and Urtica dioica treatment groups. In the treatment group, diabetic rats received 100 mg kg(-1) daily of hydroalcoholic extract of U. dioica intraperitoneally for 4 weeks. After the animals had been sacrificed, the kidneys were removed, fixed in formaldehyde, cut horizontally into 1 mm slices, processed and stained with H and E. The stereological study was performed using a light microscope with the image projected on a table using Olysa software. The Cavalieri principle was used to estimate the volume of the cortex, medulla and whole kidney. All grouped data were statistically evaluated using Student's t-test and expressed as the mean +/- SE. The ratio of kidney weight/body weight in the diabetes (0.51) and diabetes-extract (0.67) groups was higher than in the control group (0.42). The ratio of kidney volume/body weight in the diabetes (350) and diabetes-extract (348) groups was higher than in the control group (323). The volume ratio of cortex/medulla in the diabetes-extract group (1.65) was higher than in the control (1.34) and diabetes (1.33) groups. Glomerular area and diameter and proximal tubule diameter in the diabetes-extract group were higher than in the control and diabetes groups. This study revealed that Urtica dioica has no effect on renal morphometric indices in induced diabetic rats.
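
    The Cavalieri principle used in this study estimates volume as the slice thickness times the summed section areas of systematic sections. A minimal sketch (the area values are invented for illustration):

```python
def cavalieri_volume(section_areas, slice_thickness):
    """Cavalieri estimator: V = t * sum of section areas on systematic slices.

    section_areas: areas of every slice face (e.g. mm^2, from point counting).
    slice_thickness: distance t between successive sections (same length unit).
    """
    return slice_thickness * sum(section_areas)

# e.g. a kidney cut into 1 mm slices, section areas measured in mm^2
areas = [12.0, 18.5, 22.0, 19.5, 11.0]
volume = cavalieri_volume(areas, 1.0)  # 83.0 mm^3
```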

  6. Double seal door design and analysis for ITER transfer cask

    International Nuclear Information System (INIS)

    Liu, C.L.; Yao, D.M.; Cheng, T.

    2007-01-01

    The DSD (double seal door) design concept is introduced. 3-D modelling work was performed for the DSD in three typical regions: the upper port, the equatorial port and the divertor port. Numerical analysis of some typical components was carried out based on the finite element (FE) method using the ANSYS code, especially for the optimization activities. The rescue procedures of the DSD are discussed, which could benefit future engineering implementation. The design and analysis work can support, and serve as an important reference for, future procurement. (authors)

  7. Design of magnetic analysis system for magnetic proton recoil spectrometer

    International Nuclear Information System (INIS)

    Qi Jianmin; Jiang Shilun; Zhou Lin; Peng Taiping

    2010-01-01

    Magnetic proton recoil (MPR) spectrometer is a novel diagnostic instrument with high performance for measurements of the neutron spectra from inertial confinement fusion (ICF) experiments and high power fusion devices. The design of the magnetic analysis system, which is a key part of the compact MPR-type spectrometer, has been completed through two-dimensional beam transport simulations and three-dimensional particle transport simulation. The analysis of the system's parameters and performances was performed, as well as system designs based on preferential principles of energy resolution, detection efficiency, and count rate, respectively. The results indicate that the magnetic analysis system can achieve a detection efficiency at the 10^-5 to 10^-4 level over the resolution range of 1.5% to 3.0% and fulfill the design goals of the compact MPR spectrometer. (authors)

  8. Design of Composite Structures Using Knowledge-Based and Case Based Reasoning

    Science.gov (United States)

    Lambright, Jonathan Paul

    1996-01-01

    A method of using knowledge based and case based reasoning to assist designers during conceptual design tasks of composite structures was proposed. The cooperative use of heuristics, procedural knowledge, and previous similar design cases suggests a potential reduction in design cycle time and ultimately product lead time. The hypothesis of this work is that the design process of composite structures can be improved by using Case-Based Reasoning (CBR) and Knowledge-Based (KB) reasoning in the early design stages. The technique of using knowledge-based and case-based reasoning facilitates the gathering of disparate information into one location that is easily and readily available. The method suggests that the inclusion of downstream life-cycle issues into the conceptual design phase reduces the potential for defective and sub-optimal composite structures. Three industry experts were interviewed extensively. The experts provided design rules, previous design cases, and test problems. A Knowledge Based Reasoning system was developed using the CLIPS (C Language Interpretive Procedural System) environment, and a Case Based Reasoning system was developed using the Design Memory Utility For Sharing Experiences (MUSE) environment. A Design Characteristic State (DCS) was used to document the design specifications, constraints, and problem areas using attribute-value pair relationships. The DCS provided consistent design information between the knowledge base and case base. Results indicated that the use of knowledge based and case based reasoning provided a robust design environment for composite structures. The knowledge base provided design guidance from well defined rules and procedural knowledge. The case base provided suggestions on design and manufacturing techniques based on previous similar designs and warnings of potential problems and pitfalls. The case base complemented the knowledge base and extended the problem solving capability beyond the existence of

  9. Surface tensor estimation from linear sections

    DEFF Research Database (Denmark)

    Kousholt, Astrid; Kiderlen, Markus; Hug, Daniel

    From Crofton's formula for Minkowski tensors we derive stereological estimators of translation invariant surface tensors of convex bodies in the n-dimensional Euclidean space. The estimators are based on one-dimensional linear sections. In a design based setting we suggest three types of estimators. These are based on isotropic uniform random lines, vertical sections, and non-isotropic random lines, respectively. Further, we derive estimators of the specific surface tensors associated with a stationary process of convex particles in the model based setting.

  10. Surface tensor estimation from linear sections

    DEFF Research Database (Denmark)

    Kousholt, Astrid; Kiderlen, Markus; Hug, Daniel

    2015-01-01

    From Crofton’s formula for Minkowski tensors we derive stereological estimators of translation invariant surface tensors of convex bodies in the n-dimensional Euclidean space. The estimators are based on one-dimensional linear sections. In a design based setting we suggest three types of estimators. These are based on isotropic uniform random lines, vertical sections, and non-isotropic random lines, respectively. Further, we derive estimators of the specific surface tensors associated with a stationary process of convex particles in the model based setting.

  11. Design and performance analysis of gas and liquid radial turbines

    Science.gov (United States)

    Tan, Xu

    In the first part of the research, pumps running in reverse as turbines are studied. This work uses experimental data from a wide range of pumps representing centrifugal pump configurations in terms of specific speed. Based on specific speed and specific diameter, an accurate correlation is developed to predict the performance at the best efficiency point of a centrifugal pump in turbine-mode operation. The proposed prediction method yields very good results compared with previous attempts: it is compared to nine previous methods found in the literature, and the comparison shows that the method proposed in this paper is the most accurate. The method can be further complemented and supplemented by future tests to increase its accuracy, and it is meaningful because it is based on both specific speed and specific diameter. The second part of the research is focused on the design and analysis of a radial gas turbine. The specification of the turbine is obtained from a solar biogas hybrid system, which is theoretically analyzed and constructed based on the purchased compressor. Theoretical analysis results in a specification of 100 lb/min, 900 °C inlet total temperature and 1.575 atm inlet total pressure. 1-D and 3-D geometry of the rotor is generated based on Aungier's method. 1-D loss-model analysis and 3-D CFD simulations are performed to examine the performance of the rotor. The total-to-total efficiency of the rotor is more than 90%. With the help of CFD analysis, modifications to the preliminary design yielded optimized aerodynamic performance. Finally, theoretical performance analysis of the hybrid system is performed with the designed turbine.
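
    The correlation inputs named in the abstract, specific speed and specific diameter, are standard turbomachinery groupings. One common dimensional convention is sketched below; the paper's exact units convention is not stated here, so treat these particular forms as an assumption:

```python
import math

def specific_speed(n_rpm, q_m3s, head_m):
    """Specific speed in the common form Ns = N * sqrt(Q) / H^0.75
    (N in rpm, Q in m^3/s, H in m; an assumed, not universal, convention)."""
    return n_rpm * math.sqrt(q_m3s) / head_m ** 0.75

def specific_diameter(d_m, q_m3s, head_m):
    """Specific diameter Ds = D * H^0.25 / sqrt(Q) (same units convention)."""
    return d_m * head_m ** 0.25 / math.sqrt(q_m3s)

# illustrative pump operating point: 1500 rpm, 0.04 m^3/s, 20 m head, 0.25 m impeller
ns = specific_speed(1500.0, 0.04, 20.0)
ds = specific_diameter(0.25, 0.04, 20.0)
```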

  12. Problem Based Game Design

    DEFF Research Database (Denmark)

    Reng, Lars; Schoenau-Fog, Henrik

    2011-01-01

    At Aalborg University’s department of Medialogy, we are utilizing the Problem Based Learning method to encourage students to solve game design problems by pushing the boundaries and designing innovative games. This paper is concerned with describing this method, how students employ it in various projects and how they learn to analyse, design, and develop for innovation by using it. We will present various cases to exemplify the approach and focus on how the method engages students and aspires for innovation in digital entertainment and games.

  13. Nonlinear structural analysis methods and their application to elevated temperature design: A US perspective

    International Nuclear Information System (INIS)

    Dhalla, A.K.

    1989-01-01

    Technological advances over the last two decades have been assimilated into the routine design of Liquid Metal Reactor (LMR) structural components operating at elevated temperatures. The mature elevated temperature design technology is based upon: (a) an extensive material data base, (b) recent advances in nonlinear computational methods, and (c) conservative design criteria based upon past successful and reliable operating experiences with petrochemical and nonnuclear power plants. This survey paper provides a US perspective on the role of nonlinear analysis methods used in the design of LMR plants. The simplified and detailed nonlinear analysis methods and the level of computational effort required to qualify structural components for safe and reliable long-term operation are discussed. The paper also illustrates how a detailed nonlinear analysis can be used to resolve technical licensing issues, to understand complex nonlinear structural behavior, to identify predominant failure modes, and to guide future experimental programs

  14. Design of LTCC Based Fractal Antenna

    KAUST Repository

    AdbulGhaffar, Farhan

    2010-09-01

    The thesis presents a Sierpinski Carpet fractal antenna array designed at 24 GHz for automotive radar applications. Miniaturized, high-performance and low-cost antennas are required for this application. To meet these specifications, a fractal array has been designed for the first time on a Low Temperature Co-fired Ceramic (LTCC) based substrate. LTCC provides a suitable platform for the development of these antennas due to its properties of vertical stack-up and embedded passives. The complete antenna concept involves integration of this fractal antenna array with a Fresnel lens antenna, providing a total gain of 15 dB, which is appropriate for medium-range radar applications. The thesis also presents a comparison between the designed fractal antenna and a conventional patch antenna, outlining the advantages of the fractal antenna over the latter. The fractal antenna has a bandwidth of 1.8 GHz, which is 7.5% of the centre frequency (24 GHz), as compared to 1.9% for the conventional patch antenna. Furthermore, the fractal design exhibits a size reduction of 53% as compared to the patch antenna. In the end, a sensitivity analysis is carried out for the fractal antenna design, depicting the robustness of the proposed design against typical LTCC fabrication tolerances.

  15. Demand-Based Optimal Design of Storage Tank with Inerter System

    Directory of Open Access Journals (Sweden)

    Shiming Zhang

    2017-01-01

    Full Text Available A parameter optimal design method for a tank with an inerter system is proposed in this study based on the requirements of tank vibration control to improve the effectiveness and efficiency of vibration control. Moreover, a response indicator and a cost control indicator are selected based on the control targets for liquid storage tanks for simultaneously minimizing the dynamic response and controlling costs. These indicators are reformulated through a random vibration analysis under virtual excitation. The problem is then transformed from a multiobjective optimization problem to a single-objective nonlinear problem using the ε-constraint method, which is consistent with the demand-based method. White noise excitation can be used to design the tank with the inerter system under seismic excitation to simplify the calculation. Subsequently, a MATLAB-based calculation program is compiled, and several optimization cases are examined under different excitation conditions. The effectiveness of the demand-based method is proven through a time history analysis. The results show that specific vibration control requirements can be met at the lowest cost with a simultaneous reduction in base shears and overturning base moments.
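
    The ε-constraint reduction described above keeps one objective (cost) and turns the other (the response indicator) into a bound. A toy sketch over a discrete candidate set; the objective functions and the inerter-size interpretation are invented for illustration, not taken from the paper:

```python
def epsilon_constraint(candidates, cost, response, eps):
    """Minimise cost(x) subject to response(x) <= eps over a candidate set.

    This is the epsilon-constraint reduction of the bi-objective problem
    min (cost, response): the response objective becomes a demand bound.
    Returns the cheapest feasible candidate, or None if none is feasible.
    """
    feasible = [x for x in candidates if response(x) <= eps]
    return min(feasible, key=cost) if feasible else None

# toy: inerter size x; cost grows with x, response (e.g. base shear) falls with x
sizes = [0.1 * k for k in range(1, 11)]
best = epsilon_constraint(sizes, cost=lambda x: x, response=lambda x: 1.0 / x, eps=5.0)
# best ≈ 0.2: the smallest (cheapest) size whose response meets the demand bound
```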

  16. Design and Evaluation of Chitosan-Based Novel pH-Sensitive Drug ...

    African Journals Online (AJOL)

    Design and Evaluation of Chitosan-Based Novel pH-Sensitive Drug Carrier for Sustained ... Scanning electron microscopy (SEM) and Raman spectroscopy for particle size analysis; swelling ratio; effect of drug loading on encapsulation efficiency

  17. The impact of design-based modeling instruction on seventh graders' spatial abilities and model-based argumentation

    Science.gov (United States)

    McConnell, William J.

    Due to the call of current science education reform for the integration of engineering practices within science classrooms, design-based instruction is receiving much attention in science education literature. Although some aspect of modeling is often included in well-known design-based instructional methods, it is not always a primary focus. The purpose of this study was to better understand how design-based instruction with an emphasis on scientific modeling might impact students' spatial abilities and their model-based argumentation abilities. In the following mixed-method multiple case study, seven seventh grade students attending a secular private school in the Mid-Atlantic region of the United States underwent an instructional intervention involving design-based instruction, modeling and argumentation. Through the course of a lesson involving students in exploring the interrelatedness of the environment and an animal's form and function, students created and used multiple forms of expressed models to assist them in model-based scientific argument. Pre/post data were collected through the use of The Purdue Spatial Visualization Test: Rotation, the Mental Rotation Test and interviews. Other data included a spatial activities survey, student artifacts in the form of models, notes, exit tickets, and video recordings of students throughout the intervention. Spatial abilities tests were analyzed using descriptive statistics while students' arguments were analyzed using the Instrument for the Analysis of Scientific Curricular Arguments and a behavior protocol. Models were analyzed using content analysis and interviews and all other data were coded and analyzed for emergent themes. Findings in the area of spatial abilities included increases in spatial reasoning for six out of seven participants, and an immense difference in the spatial challenges encountered by students when using CAD software instead of paper drawings to create models. Students perceived 3D printed

  18. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes such as RELAP5, TRAC and CATHARE have been developed in the Fortran language over the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintainability. However, the programming paradigm in software technology has shifted to object-oriented programming (OOP), which is based on several techniques including encapsulation, modularity, polymorphism, and inheritance. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing a modernized C language. The analysis, design, implementation and verification steps for OOP system code development are described with some implementation examples. The system code SYSTF, based on a three-fluid thermal hydraulic solver, has been developed using OOP design. The feasibility is verified with simple fundamental problems and plant models. (author)

  19. Secondary Analysis under Cohort Sampling Designs Using Conditional Likelihood

    Directory of Open Access Journals (Sweden)

    Olli Saarela

    2012-01-01

    Full Text Available Under cohort sampling designs, additional covariate data are collected on cases of a specific type and a randomly selected subset of noncases, primarily for the purpose of studying associations with a time-to-event response of interest. With such data available, an interest may arise to reuse them for studying associations between the additional covariate data and a secondary non-time-to-event response variable, usually collected for the whole study cohort at the outset of the study. Following earlier literature, we refer to such a situation as secondary analysis. We outline a general conditional likelihood approach for secondary analysis under cohort sampling designs and discuss the specific situations of case-cohort and nested case-control designs. We also review alternative methods based on full likelihood and inverse probability weighting. We compare the alternative methods for secondary analysis in two simulated settings and apply them in a real-data example.

  20. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    This paper discusses the following: 1. The relationship of analysis to design. 2. New methods of analysis. 3. Improved finite elements. 4. The effect of the minicomputer on structural analysis methods. 5. The use of systems of microprocessors for nonlinear structural analysis. 6. The role of interactive graphics systems in future analysis and design. The discussion focuses on the impact of new inexpensive computer hardware on design and analysis methods. (Auth.)

  1. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  2. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities.

  3. Robustness-based evaluation of hydropower infrastructure design under climate change

    Directory of Open Access Journals (Sweden)

    Mehmet Ümit Taner

    2017-01-01

    Full Text Available The conventional tools of decision-making in water resources infrastructure planning have been developed for problems with well-characterized uncertainties and are ill-suited to problems involving climate nonstationarity. In the past 20 years, a predict-then-act approach to the incorporation of climate nonstationarity has been widely adopted, in which the outputs of bias-corrected climate model projections are used to evaluate planning options. However, the ambiguous nature of the results has often proved unsatisfying to decision makers. This paper presents the use of a bottom-up, decision scaling framework for the evaluation of water resources infrastructure design alternatives regarding their robustness to climate change and expected value of performance. The analysis begins with an assessment of the vulnerability of the alternative designs under a wide domain of systematically generated plausible future climates and utilizes downscaled climate projections ex post to inform likelihoods within a risk-based evaluation. The outcomes under different project designs are compared by way of a set of decision criteria, including the performance under the most likely future, the expected value of performance across all evaluated futures, and robustness. The method is demonstrated for the design of a hydropower system in sub-Saharan Africa and is compared to the results that would be found using a GCM-based, scenario-led analysis. The results indicate that recommendations from the decision scaling analysis can be substantially different from those of the scenario-led approach, alleviate common shortcomings related to the use of climate projections in water resources planning, and produce recommendations that are more robust to future climate uncertainty.

  4. Analysis and design of hybrid control systems

    Energy Technology Data Exchange (ETDEWEB)

    Malmborg, J.

    1998-05-01

    Different aspects of hybrid control systems are treated: analysis, simulation, design and implementation. A systematic methodology using extended Lyapunov theory for the design of hybrid systems is developed. The methodology is based on conventional control designs in separate regions together with a switching strategy. The dynamics are not well defined if the control design methods lead to fast mode switching; they depend on the salient features of the implementation of the mode switches. A theorem for the stability of second order switching, together with the resulting dynamics, is derived. The dynamics on an intersection of two sliding sets are defined for two relays working on different time scales. Current simulation packages have problems modeling and simulating hybrid systems. It is shown how fast mode switches can be found before or during simulation; the necessary analysis work is a very small overhead for a modern simulation tool. To gain experience with practical problems in hybrid control, the switching strategy is implemented in two different software environments. In one of them, a time-optimal controller is added to an existing PID controller on a commercial control system. Successful experiments with this hybrid controller show the practical use of the method. 78 refs, 51 figs, 2 tabs

  5. ICAS-PAT: A Software for Design, Analysis and Validation of PAT Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    end product qualities. In an earlier article, Singh et al. [Singh, R., Gernaey, K. V., Gani, R. (2009). Model-based computer-aided framework for design of process monitoring and analysis systems. Computers & Chemical Engineering, 33, 22–42] proposed the use of a systematic model and data based...... methodology to design appropriate PAT systems. This methodology has now been implemented into a systematic computer-aided framework to develop a software (ICAS-PAT) for design, validation and analysis of PAT systems. Two supporting tools needed by ICAS-PAT have also been developed: a knowledge base...... (consisting of process knowledge as well as knowledge on measurement methods and tools) and a generic model library (consisting of process operational models). Through a tablet manufacturing process example, the application of ICAS-PAT is illustrated, highlighting as well, the main features of the software....

  6. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and the uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate the response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with the point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling-based reliability sensitivity analysis method is employed to further reduce the computational effort when the design variables are distributional parameters of input random variables. The proposed methodology is applied to two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first-order reliability method-based RBDO and Monte Carlo simulation-based RBDO, and enables the use of RBDO as a practical design tool.
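The supporting-point generation from Gauss-Hermite quadrature can be sketched as follows. This is a generic rescaling of the quadrature rule to a standard normal variable, not the authors' full HORSM implementation:

```python
import numpy as np

def gauss_hermite_normal(n):
    """Supporting points and weights for E[g(X)] with X ~ N(0, 1).

    numpy's hermgauss targets the weight exp(-x^2); rescaling the nodes by
    sqrt(2) and the weights by 1/sqrt(pi) adapts them to the standard normal.
    """
    x, w = np.polynomial.hermite.hermgauss(n)
    return x * np.sqrt(2.0), w / np.sqrt(np.pi)

nodes, weights = gauss_hermite_normal(5)

# The rule is exact for polynomials up to degree 2n - 1;
# e.g. E[X^4] = 3 for a standard normal.
m4 = float(np.sum(weights * nodes ** 4))
```

A response surface without cross terms can then be fitted through such points one random variable at a time, which is what makes the supporting-point approach cheap compared with a full tensor grid.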

  7. RandomSpot: A web-based tool for systematic random sampling of virtual slides.

    Science.gov (United States)

    Wright, Alexander I; Grabsch, Heike I; Treanor, Darren E

    2015-01-01

    This paper describes work presented at the Nordic Symposium on Digital Pathology 2014, Linköping, Sweden. Systematic random sampling (SRS) is a stereological tool, which provides a framework to quickly build an accurate estimation of the distribution of objects or classes within an image, whilst minimizing the number of observations required. RandomSpot is a web-based tool for SRS in stereology, which systematically places equidistant points within a given region of interest on a virtual slide. Each point can then be visually inspected by a pathologist in order to generate an unbiased sample of the distribution of classes within the tissue. Further measurements can then be derived from the distribution, such as the ratio of tumor to stroma. RandomSpot replicates the fundamental principle of traditional light microscope grid-shaped graticules, with the added benefits associated with virtual slides, such as facilitated collaboration and automated navigation between points. Once the sample points have been added to the region(s) of interest, users can download the annotations and view them locally using their virtual slide viewing software. Since its introduction, RandomSpot has been used extensively for international collaborative projects, clinical trials and independent research projects. So far, the system has been used to generate over 21,000 sample sets, and has been used to generate data for use in multiple publications, identifying significant new prognostic markers in colorectal, upper gastro-intestinal and breast cancer. Data generated using RandomSpot also has significant value for training image analysis algorithms using sample point coordinates and pathologist classifications.
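The underlying SRS placement (an equidistant grid shifted by a single uniform random offset) can be sketched as follows; this is a minimal illustration of the principle, not RandomSpot's actual implementation:

```python
import random

def srs_points(width, height, spacing, seed=0):
    """Equidistant grid with one uniform random offset (systematic random sampling)."""
    rng = random.Random(seed)
    ox, oy = rng.uniform(0, spacing), rng.uniform(0, spacing)
    pts = []
    y = oy
    while y < height:
        x = ox
        while x < width:
            pts.append((x, y))
            x += spacing
        y += spacing
    return pts

# 100-unit spacing over a 1000 x 800 region of interest -> 10 x 8 = 80 points.
pts = srs_points(1000, 800, 100)
```

Classifying each point (e.g. tumor vs. stroma) then yields an unbiased estimate of the corresponding area fractions, which is the ratio estimation the abstract describes.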

  8. Object-oriented analysis and design

    CERN Document Server

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design, demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  9. Design and analysis of biomedical studies

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær

    been allocated this field. It is utterly important to utilize these ressources responsibly and efficiently by constantly striving to ensure high-quality biomedical studies. This involves the use of a sound statistical methodology regarding both the design and analysis of biomedical studies. The focus...... have conducted a literature study strongly indicating that this structure commonly is neglected in the statistical analysis. Based on this closed-form expressions for the approximate type I error rate are formulated. The type I error rates are assessed for a number of factor combinations as they appear...... in practice and in all cases the type I error rates are demonstrated to be severely inflated. Prior to conducting a study it is important to perform power and sample size determinations to ensure that reliable conclusions can be drawn from the statistical analysis. We have formulated closed-form expressions...

  10. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
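The gain from conditioning on a baseline count can be illustrated with a small simulation of a mixed (gamma-frailty) Poisson model. The parameter values are arbitrary and this is not the authors' methodology, only the conditional-mean identity such conditional negative binomial models rest on:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for the small rates used here)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

# Subject-specific rate u ~ Gamma(shape a, rate b); counts are Poisson(u * exposure).
# Conditional on a baseline count y0 over exposure t0, the rate posterior is
# Gamma(a + y0, b + t0), so the follow-up mean is E[Y1 | Y0 = y0] = t1*(a + y0)/(b + t0).
a, b, t0, t1 = 2.0, 1.0, 1.0, 1.0
rng = random.Random(42)

matched = []
for _ in range(200_000):
    u = rng.gammavariate(a, 1.0 / b)   # frailty for one subject
    y0 = poisson(rng, u * t0)          # baseline count before randomization
    y1 = poisson(rng, u * t1)          # on-study count (no treatment effect here)
    if y0 == 3:
        matched.append(y1)

empirical = sum(matched) / len(matched)
theoretical = t1 * (a + 3) / (b + t0)  # = 2.5
```

Because the baseline count sharpens the frailty distribution, the conditional variance of the response is smaller than the marginal one, which is the source of the efficiency gain the abstract reports.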

  11. Analysis of the optimal design strategy of a magnetorheological smart structure

    International Nuclear Information System (INIS)

    Yang Likang; Duan Fubin; Eriksson, Anders

    2008-01-01

    The exploration of magnetorheological (MR) fluid applications involves many fields. During the phase of theoretical analysis and experimental investigation, most research has focused on developing primary products, and the design method is becoming important in MR device design. To establish general design guidelines, this paper presents an MR smart structure design method driven by the overall requirements on the smart structure characteristics, rather than the usual approach that simply works from the given yield stress of the smart material. In other words, the design method does not merely optimize around pre-specified MR fluid features; it can select or customize the MR fluid properties to satisfy the whole-system requirements. Besides the usual magnetic circuit design analysis, the physical composition of the MR fluid, such as the volume fraction of particles, was incorporated into the design parameters of the products. At the same time, by utilizing the structural parameters, the response time of MR devices was considered by analyzing the time constant of the electromagnetic coils inside the devices. Additionally, the power consumption relative to the transient useful power was analyzed for the structural design. Finally, based on finite element computation of the magnetic field (COMSOL Multiphysics), all these factors were illustrated for an MR fluid valve using the results of a magnetic circuit design.

  12. Heat loss analysis-based design of a 12 MW wind power generator module having an HTS flux pump exciter

    International Nuclear Information System (INIS)

    Sung, Hae-Jin; Go, Byeong-Soo; Jiang, Zhenan; Park, Minwon; Yu, In-Keun

    2016-01-01

    Highlights: • A large-scale HTS generator module has been suggested to avoid issues such as a huge vacuum vessel and to achieve higher reliability. • The challenging heat loss analysis of a large-scale HTS generator has been performed successfully, enabling the design of an optimal support structure with a total heat loss of 43 W/400 kW. • The results prove the potential of a large-scale superconducting wind-power generator to operate efficiently, and support further development of the concept. - Abstract: The development of an effective high-temperature superconducting (HTS) generator is currently a research focus; however, reducing the heat loss of a large-scale HTS generator is a challenge. This study deals with a heat loss analysis-based design of a 12 MW wind power generator module having an HTS flux pump exciter. The generator module consists of the HTS rotor of the generator and an HTS flux pump exciter. The specifications of the module are described, and its detailed configuration is illustrated. For the heat loss analysis of the module, the excitation loss of the flux pump exciter, the eddy current loss of all structures in the module, the radiation loss, and the conduction loss of the HTS coil supporter were assessed using a 3D finite element method program. For the conduction loss, different types of supporters were compared to identify the one with the lowest conduction loss in the module. The heat loss analysis results were reflected in the design of the generator module and are discussed in detail. The results will be applied to the design of large-scale superconducting generators for wind turbines, including the cooling system.

  13. Heat loss analysis-based design of a 12 MW wind power generator module having an HTS flux pump exciter

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Hae-Jin, E-mail: haejin0216@gmail.com [Changwon National University, 20 Changwondaehak-ro, Changwon, 641-773 (Korea, Republic of); Go, Byeong-Soo [Changwon National University, 20 Changwondaehak-ro, Changwon, 641-773 (Korea, Republic of); Jiang, Zhenan [Robinson Research Institute, Victoria University of Wellington, PO Box 33436 (New Zealand); Park, Minwon [Changwon National University, 20 Changwondaehak-ro, Changwon, 641-773 (Korea, Republic of); Yu, In-Keun, E-mail: yuik@changwon.ac.kr [Changwon National University, 20 Changwondaehak-ro, Changwon, 641-773 (Korea, Republic of)

    2016-11-15

    Highlights: • A large-scale HTS generator module has been suggested to avoid issues such as a huge vacuum vessel and to achieve higher reliability. • The challenging heat loss analysis of a large-scale HTS generator has been performed successfully, enabling the design of an optimal support structure with a total heat loss of 43 W/400 kW. • The results prove the potential of a large-scale superconducting wind-power generator to operate efficiently, and support further development of the concept. - Abstract: The development of an effective high-temperature superconducting (HTS) generator is currently a research focus; however, reducing the heat loss of a large-scale HTS generator is a challenge. This study deals with a heat loss analysis-based design of a 12 MW wind power generator module having an HTS flux pump exciter. The generator module consists of the HTS rotor of the generator and an HTS flux pump exciter. The specifications of the module are described, and its detailed configuration is illustrated. For the heat loss analysis of the module, the excitation loss of the flux pump exciter, the eddy current loss of all structures in the module, the radiation loss, and the conduction loss of the HTS coil supporter were assessed using a 3D finite element method program. For the conduction loss, different types of supporters were compared to identify the one with the lowest conduction loss in the module. The heat loss analysis results were reflected in the design of the generator module and are discussed in detail. The results will be applied to the design of large-scale superconducting generators for wind turbines, including the cooling system.

  14. Toxic release consequence analysis tool (TORCAT) for inherently safer design plant

    International Nuclear Information System (INIS)

    Shariff, Azmi Mohd; Zaini, Dzulkarnain

    2010-01-01

    Many major accidents due to toxic releases in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequence of a toxic release can be evaluated and necessary design improvements can be implemented to eliminate or minimize accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, currently no commercial tool with such capability is available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, utilizing a process design simulator integrated with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting inherent safety principles early in the preliminary design stage.

  15. Design and Development of a Web Based User Interface

    OpenAIRE

    László, Magda

    2014-01-01

    The first objective of the thesis is to study the technological background of application design and more specifically the Unified Modeling Language (hereinafter UML). Due to this, the research provides deeper understanding of technical aspects of the practical part of the thesis work. The second and third objectives of this thesis are to design and develop a web application and more specifically a Web Based User Interface for Multimodal Observation and Analysis System for Social Interactions...

  16. Analysis and evaluation system for elevated temperature design of pressure vessels

    International Nuclear Information System (INIS)

    Hayakawa, Teiji; Sayawaki, Masaaki; Nishitani, Masahiro; Mii, Tatsuo; Murasawa, Kanji

    1977-01-01

    In pressure vessel technology, intensive efforts have recently been made to develop elevated temperature design methods. Much of the impetus for these efforts has been provided mainly by the results of the Liquid Metal Fast Breeder Reactor (LMFBR) and, more recently, the High Temperature Gas-cooled Reactor (HTGR) programs. The pressure vessels and associated components in these new types of nuclear power plants must operate for long periods at elevated temperatures where creep effects are significant, and must therefore be designed by rigorous analysis for high reliability and safety. To carry out such elevated temperature design, a number of highly developed analysis and evaluation techniques, too complicated to perform manually, are indispensable. Under these circumstances, the authors have made the following approaches in this study: (1) study of the basic concepts and associated techniques in elevated temperature design; (2) systematization (Analysis System) of the procedure for load and stress analyses; (3) development of a post-processor, ''POST-1592'', for strength evaluation based on ASME Code Case 1592-7. By linking POST-1592 with the Analysis System, an analysis and evaluation system is developed for elevated temperature design of pressure vessels. Consequently, the design of elevated temperature vessels by detailed analysis and evaluation has become easy and effective with this software system. (auth.)

  17. Design and Analysis of Subscale and Full-Scale Buckling-Critical Cylinders for Launch Vehicle Technology Development

    Science.gov (United States)

    Hilburger, Mark W.; Lovejoy, Andrew E.; Thornburgh, Robert P.; Rankin, Charles

    2012-01-01

    NASA's Shell Buckling Knockdown Factor (SBKF) project has the goal of developing new analysis-based shell buckling design factors (knockdown factors) and design and analysis technologies for launch vehicle structures. Preliminary design studies indicate that implementation of these new knockdown factors can enable significant reductions in mass and mass-growth in these vehicles. However, in order to validate any new analysis-based design data or methods, a series of carefully designed and executed structural tests is required at both the subscale and full-scale levels. This paper describes the design and analysis of three different orthogrid-stiffened metallic cylindrical-shell test articles. Two of the test articles are 8-ft-diameter, 6-ft-long test articles, and one test article is a 27.5-ft-diameter, 20-ft-long Space Shuttle External Tank-derived test article.

  18. Analytical Model-Based Design Optimization of a Transverse Flux Machine

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, Iftekhar; Husain, Tausif; Sozer, Yilmaz; Husain, Iqbal; Muljadi, Eduard

    2017-02-16

    This paper proposes an analytical machine design tool using magnetic equivalent circuit (MEC)-based particle swarm optimization (PSO) for a double-sided, flux-concentrating transverse flux machine (TFM). The magnetic equivalent circuit method is applied to analytically establish the relationship between the design objective and the input variables of prospective TFM designs. This is computationally less intensive and more time efficient than finite element solvers. A PSO algorithm is then used to design a machine with the highest torque density within the specified power range along with some geometric design constraints. The stator pole length, magnet length, and rotor thickness are the variables that define the optimization search space. Finite element analysis (FEA) was carried out to verify the performance of the MEC-PSO optimized machine. The proposed analytical design tool helps save computation time by at least 50% when compared to commercial FEA-based optimization programs, with results found to be in agreement with less than 5% error.
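A generic PSO loop of the kind driving such a design search might look like the sketch below. The surrogate objective is a made-up smooth function standing in for the MEC torque-density evaluation, with the three design variables named in the abstract; none of the numbers come from the paper:

```python
import random

def pso(f, bounds, n=30, iters=300, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Bare-bones particle swarm optimizer (maximization) over box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pval = [f(p) for p in pos]
    g = max(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            v = f(pos[i])
            if v > pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v > gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

# Hypothetical smooth surrogate for torque density in terms of
# (stator pole length, magnet length, rotor thickness); peak at (10, 8, 5).
def surrogate(x):
    target = (10.0, 8.0, 5.0)
    return -sum((xi - ti) ** 2 for xi, ti in zip(x, target))

best, val = pso(surrogate, [(5, 15), (4, 12), (2, 8)])
```

The attraction of the MEC-PSO pairing is that each `f(pos[i])` call is an algebraic circuit evaluation rather than a finite element solve, which is where the reported time savings over FEA-based optimization come from.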

  19. Experimental design and quantitative analysis of microbial community multiomics.

    Science.gov (United States)

    Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis

    2017-11-30

    Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.

  20. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.

  1. Thermal energy systems design and analysis

    CERN Document Server

    Penoncello, Steven G

    2015-01-01

    Contents: Introduction; Thermal Energy Systems Design and Analysis; Software; Thermal Energy System Topics; Units and Unit Systems; Thermophysical Properties; Engineering Design; Engineering Economics (Introduction; Common Engineering Economics Nomenclature; Economic Analysis Tool: The Cash Flow Diagram; Time Value of Money; Time Value of Money Examples; Using Software to Calculate Interest Factors; Economic Decision Making; Depreciation and Taxes; Problems); Analysis of Thermal Energy Systems (Introduction; Nomenclature; Thermophysical Properties of Substances; Suggested Thermal Energy Systems Analysis Procedure; Conserved and Balanced Quantities; Conservation of Mass; Conservation of Energy (The First Law of Thermodynamics); Entropy Balance (The Second Law of Thermodynamics); Exergy Balance: The Combined Law; Energy and Exergy Analysis of Thermal Energy Cycles; Detailed Analysis of Thermal Energy Cycles; Problems); Fluid Transport in Thermal Energy Systems (Introduction; Piping and Tubing Standards; Fluid Flow Fundamentals; Valves and Fittings; Design and Analysis of Pipe Networks; Economi...)

  2. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled, ''Probability-Based Load Combinations for Design of Category I Structures.'' Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs
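
    The limit state probabilities such methods estimate can be illustrated with a crude Monte Carlo sketch of P(g = R − S < 0), where R is a random resistance and S a random load effect. The distributions and parameters below are invented for illustration, not values from the report:

```python
import math
import random

random.seed(1)
N = 200_000
failures = 0
for _ in range(N):
    # Assumed lognormal resistance (median 300, ~15% COV) and normal load
    R = random.lognormvariate(math.log(300.0), 0.15)
    S = random.gauss(150.0, 40.0)
    if R - S < 0:                 # limit state g = R - S violated
        failures += 1

pf = failures / N
print(f"estimated limit-state probability: {pf:.2e}")
```

Practical reliability methods (FORM/SORM) replace this brute-force sampling with analytical approximations, which is what makes calibrating LRFD load and resistance factors to target limit state probabilities tractable.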

  3. Design analysis of rotary turret of poucher machine

    Directory of Open Access Journals (Sweden)

    Jigar G. Patel

    2016-09-01

    Full Text Available This paper presents a design analysis of a rotary turret plate of 5 kg capacity for a food product packaging machine. The turret plate has been designed considering two different criteria: the first is an inertia force approach with only the self-weight of the turret plate, and the second includes the mass of the pouches. A 3-dimensional CAD model of the rotary turret assembly has been prepared using the solid modelling package CRE-O. The finite element analysis (FEA) of the turret plate has been carried out using the analysis software ANSYS 15.0. Consideration of inertia force is one of the criteria used to analyze the performance and behaviour of the component in working condition. The rotational velocity is applied at the central axis of the turret and frictionless support is applied on the inner surface where the shaft is attached. Also, pressure is applied on the same surface to incorporate the shrink-fit condition of the assembly of the turret plate with the shaft. Fixed-support boundary conditions have been considered at the sixteen faces where bolts are attached. The obtained simulation results for induced stress, deformation and strain show that the modified design of the rotary turret plate is well within the allowable stress limits of the considered material. Further optimization can be performed for a topological and strength-based more efficient design of the turret plate.

  4. Conceptual Design of GRIG (GUI Based RETRAN Input Generator)

    International Nuclear Information System (INIS)

    Lee, Gyung Jin; Hwang, Su Hyun; Hong, Soon Joon; Lee, Byung Chul; Jang, Chan Su; Um, Kil Sup

    2007-01-01

    For the development of a high performance methodology using an advanced transient analysis code, it is essential to generate the basic input of the transient analysis code under rigorous QA procedures. There are various types of operating NPPs (Nuclear Power Plants) in Korea, such as Westinghouse plants, the KSNP (Korea Standard Nuclear Power Plant), the APR1400 (Advanced Power Reactor), etc., so there are difficulties in systematically generating and managing transient analysis code input that reflects the inherent characteristics of each type of NPP. To minimize user faults and invested manpower, and to generate the basic inputs of the transient analysis code effectively and accurately for all domestic NPPs, a program is needed that can automatically generate, from the NPP design material, basic input that can be directly applied to transient analysis. ViRRE (Visual RETRAN Running Environment), developed by KEPCO (Korea Electric Power Corporation) and KAERI (Korea Atomic Energy Research Institute), provides a convenient working environment for Kori Unit 1/2. ViRRE shows the calculated results through an on-line display, but its capability is limited to the convenient execution of RETRAN, so it cannot be used as an input generator. ViSA (Visual System Analyzer), developed by KAERI, is an NPA (Nuclear Plant Analyzer) using the RETRAN and MARS codes as its thermal-hydraulic engines. ViSA contains both pre-processing and post-processing functions. In pre-processing, only the trip data cards and boundary conditions can be changed through the GUI, based on a pre-prepared text input, so its input generation capability is very limited. SNAP (Symbolic Nuclear Analysis Package), developed by Applied Programming Technology, Inc. and the NRC (Nuclear Regulatory Commission), provides an efficient working environment for the use of nuclear safety analysis codes such as the RELAP5 and TRAC-M codes. SNAP covers wide aspects of thermal-hydraulic analysis, from model creation through data analysis.

  5. Small-Signal Modeling, Stability Analysis and Design Optimization of Single-Phase Delay-Based PLLs

    DEFF Research Database (Denmark)

    Golestan, Saeed; Guerrero, Josep M.; Vidal, Ana

    2016-01-01

    Generally speaking, designing single-phase phase-locked loops (PLLs) is more complicated than three-phase ones, as their implementation often involves the generation of a fictitious orthogonal signal for the frame transformation. In recent years, many approaches to generate the orthogonal signal...... these issues and explore new methods to enhance their performance. The stability analysis, control design guidelines and performance comparison with the state-of-the-art PLLs are presented as well....

  6. Innovations in systems engineering and analysis for the simulation of beyond design-basis accidents

    International Nuclear Information System (INIS)

    Frisch, W.; Beraha, D.

    1990-01-01

    An important target in improving reactor safety is the most realistic possible computer simulation of beyond design-basis accidents. This paper presents new developments in ATHLET, further developments (description of the thermo-fluid-dynamic conditions in the core and cooling circuits during severe accidents in the computer programme ATHLET-SA) and extensions (link-up to RALOC). RALOC is a computer programme for describing the thermodynamic conditions inside the containment during design-basis accidents and accidents involving core meltdown. Further research is dedicated to code acceleration. (DG) [de

  7. Application of Reliability Analysis for Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Sørensen, John Dalsgaard; Christiani, E.

    1995-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of some of the most important failure modes are described. The failures are sliding and slip surface failure of a rubble mound and a clay foundation. Relevant design...

  8. Application of the EGM Method to a LED-Based Spotlight: A Constrained Pseudo-Optimization Design Process Based on the Analysis of the Local Entropy Generation Maps

    Directory of Open Access Journals (Sweden)

    Enrico Sciubba

    2011-06-01

    Full Text Available In this paper, the entropy generation minimization (EGM) method is applied to an industrial heat transfer problem: the forced convective cooling of a LED-based spotlight. The design specification calls for eighteen diodes arranged on a circular copper plate of 35 mm diameter. Every diode dissipates 3 W and the maximum allowed temperature of the plate is 80 °C. The cooling relies on the forced convection driven by a jet of air impinging on the plate. An initial complex geometry of plate fins is presented and analyzed with a commercial CFD code that computes the entropy generation rate. A pseudo-optimization process is carried out via a successive series of design modifications based on a careful analysis of the entropy generation maps. One of the advantages of the EGM method is that the rationale behind each step of the design process can be justified on a physical basis. It is found that the best performance is attained when the fins are periodically spaced in the radial direction.
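
    The quantity behind an entropy generation map can be sketched directly. For a pure heat-conduction field, the local volumetric entropy generation rate is S_gen = k·|∇T|²/T²; a CFD code would add the viscous dissipation term μΦ/T. The temperature field and conductivity below are invented for illustration, not taken from the paper:

```python
import numpy as np

k = 0.026                        # W/(m K), air-like conductivity (assumed)
x = np.linspace(0.0, 0.035, 50)  # 35 mm plate, as in the abstract
y = np.linspace(0.0, 0.035, 50)
X, Y = np.meshgrid(x, y)
# Hypothetical hot spot at the plate centre, 53 K above ambient
T = 300.0 + 53.0 * np.exp(-((X - 0.0175)**2 + (Y - 0.0175)**2) / 1e-4)

# Conduction-only local entropy generation: k * |grad T|^2 / T^2
dTdy, dTdx = np.gradient(T, y, x)
S_gen = k * (dTdx**2 + dTdy**2) / T**2   # W/(m^3 K)

# Hot spots of S_gen indicate where a fin-geometry change pays off most.
print(f"peak local entropy generation: {S_gen.max():.3f} W/(m^3 K)")
```

A design iteration in the EGM spirit then modifies the geometry and checks whether the integral of this field over the domain decreases.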

  9. DESIGN AND IMPLEMENTATION OF A VHDL PROCESSOR FOR DCT BASED IMAGE COMPRESSION

    Directory of Open Access Journals (Sweden)

    Md. Shabiul Islam

    2017-11-01

    Full Text Available This paper describes the design and implementation of a VHDL processor meant for performing the 2D Discrete Cosine Transform (DCT) for use in image compression applications. The design flow starts from the system specification and proceeds to implementation on silicon, and the entire process is carried out using an advanced workstation-based design environment for digital signal processing. The software allows bit-true analysis to ensure that the designed VLSI processor satisfies the required specifications. The bit-true analysis is performed on all levels of abstraction (behavior, VHDL, etc.). The motivation behind the work is a smaller chip area, faster processing and a reduced chip cost.
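
    A software reference model of the transform such a processor computes (not the VHDL itself) is a few lines: the 2D DCT-II of an 8×8 block, done separably as a matrix product with the orthonormal DCT basis:

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis: C[k, i] = a(k) * cos(pi*(2i+1)*k / (2n))
    C = np.zeros((n, n))
    for k in range(n):
        a = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
        for i in range(n):
            C[k, i] = a * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    return C

def dct2(block):
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T        # separable: transform rows, then columns

block = np.full((8, 8), 128.0)    # a flat gray 8x8 block
coeffs = dct2(block)
# For a flat block all energy lands in the DC coefficient; the 63 AC
# terms are ~0, which is what makes it cheap to encode after quantization.
print(round(coeffs[0, 0]), bool(np.abs(coeffs[1:, :]).max() < 1e-9))
```

A hardware implementation exploits exactly this separability: two passes of a 1D DCT datapath with a transpose buffer in between.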

  10. Preliminary CFD Analysis for HVAC System Design of a Containment Building

    Energy Technology Data Exchange (ETDEWEB)

    Son, Sung Man; Choi, Choengryul [ELSOLTEC, Yongin (Korea, Republic of); Choo, Jae Ho; Hong, Moonpyo; Kim, Hyungseok [KEPCO Engineering and Construction, Gimcheon (Korea, Republic of)

    2016-10-15

    HVAC (Heating, Ventilation, Air Conditioning) systems have mainly been designed based on overall heat balance and averaging concepts, which is simple and useful for designing the overall system. However, such a method cannot predict the local flow and temperature distributions in a containment building. In this study, a preliminary CFD (Computational Fluid Dynamics) analysis is carried out to obtain detailed flow and temperature distributions in a containment building and to confirm that such information can be obtained via CFD analysis. This approach can also be useful for hydrogen analysis in an accident in which hydrogen is released into a containment building. We confirmed that CFD analysis can offer sufficiently detailed information about flow patterns and the temperature field, and that the CFD technique is a useful tool for the HVAC design of nuclear power plants.

  11. Effects of hypertension and ovariectomy on rat hepatocytes. Are amlodipine and lacidipine protective? (A stereological and histological study).

    Science.gov (United States)

    Dursun, Hakan; Albayrak, Fatih; Uyanik, Abdullah; Keleş, Nuri Osman; Beyzagül, Polat; Bayram, Ednan; Halici, Zekai; Altunkaynak, Zuhal Berrin; Süleyman, Halis; Okçu, Nihat; Ünal, Bünyamin

    2010-12-01

    Calcium channel blockers are increasingly used for the treatment of hypertension. Menopause and hypertension are both important risk factors for liver damage and several other circulatory abnormalities. The aim of this study was to determine the effects of amlodipine and lacidipine in an ovariectomy-induced postmenopausal period model and a deoxycorticosterone acetate-salt-induced hypertensive model in rats. In this study, animals were divided into six groups as follows: control (Group 1), hypertension (Group 2), ovariectomy (Group 3), ovariectomy and hypertension (Group 4), ovariectomy, hypertension and amlodipine-treated (Group 5), and ovariectomy, hypertension and lacidipine-treated (Group 6). At the end of the experiment, the livers were removed and tissue samples were histologically and stereologically examined. The numerical densities of the hepatocytes according to group were 0.000422, 0.00329, 0.000272, 0.00259, 0.00374 and 0.000346 μm⁻³, respectively. Significant differences were found between values of all groups (phypertension in Groups 5 and 6. Our experimental results show that both hypertension and the postmenopausal period have negative effects on the number of hepatocytes and the histological structure of the liver. Both amlodipine and lacidipine appear to ameliorate the hypertension and/or postmenopausal period-related decrease in hepatocyte number. We thus suggest that lacidipine and particularly amlodipine have important protective and restorative effects on the liver.
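
    Numerical densities of this kind (cells per unit volume) are typically estimated stereologically with the disector: particles seen in the reference section but not in the look-up section (Q⁻) are counted, giving N_V = ΣQ⁻ / (ΣA·h). A minimal sketch of the estimator, with invented counts rather than this study's data:

```python
def numerical_density(new_particles_per_pair, frame_area_um2, disector_height_um):
    """Disector estimator: N_V = sum(Q-) / (total sampled volume)."""
    q_minus = sum(new_particles_per_pair)          # counted particle "tops"
    total_volume = frame_area_um2 * disector_height_um * len(new_particles_per_pair)
    return q_minus / total_volume                  # particles per um^3

counts = [3, 2, 4, 3, 1, 2]        # Q- in six disector pairs (hypothetical)
nv = numerical_density(counts, frame_area_um2=900.0, disector_height_um=10.0)
print(f"N_V = {nv:.6f} per um^3")
```

The estimate is design-based: it needs no assumptions about hepatocyte shape or size, only systematic uniform random sampling of the disector pairs.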

  12. Point Analysis in Java applied to histological images of the perforant pathway: A user’s account

    OpenAIRE

    Scorcioni, Ruggero; Wright, Susan N.; Card, J. Patrick; Ascoli, Giorgio A.; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool PAJ, created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (2× objective) comprised the entire perforant pathway, while the high magnification set (100× objective) allowed the identification of individual fibers. A preliminary stereologi...

  13. Application of discriminant analysis-based model for prediction of risk of low back disorders due to workplace design in industrial jobs.

    Science.gov (United States)

    Ganga, G M D; Esposto, K F; Braatz, D

    2012-01-01

    The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
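
    A two-class linear discriminant of the kind the paper compares against the neural network can be written down directly. The features and data below are synthetic stand-ins for workplace risk factors (e.g. lift rate, trunk moment), not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "jobs": two risk-factor features per job
low  = rng.normal([2.0, 1.0], 0.5, size=(40, 2))   # low-risk jobs
high = rng.normal([4.0, 3.0], 0.5, size=(40, 2))   # high-risk jobs

# Fisher discriminant direction: w = Sw^-1 (mu_high - mu_low),
# where Sw is the pooled within-class scatter matrix.
Sw = np.cov(low.T) * (len(low) - 1) + np.cov(high.T) * (len(high) - 1)
w = np.linalg.solve(Sw, high.mean(0) - low.mean(0))
threshold = w @ (high.mean(0) + low.mean(0)) / 2.0  # midpoint of projected means

def predict(x):
    return "high" if w @ x > threshold else "low"

X = np.vstack([low, high])
y = ["low"] * 40 + ["high"] * 40
accuracy = np.mean([predict(x) == label for x, label in zip(X, y)])
print(f"training accuracy: {accuracy:.2f}")
```

The appeal noted in the abstract is visible here: the discriminant model is a single linear projection, cheap to fit and to apply to new job data, compared with training a neural network.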

  14. Early stage design and analysis of biorefinery networks

    DEFF Research Database (Denmark)

    Sin, Gürkan

    2013-01-01

    Recent work regarding biorefineries resulted in many competing concepts and technologies for conversion of renewable bio-based feedstock into many promising products including fuels, chemicals, materials, etc. The design of a biorefinery process requires, at its earlier stages, the selection...... of the process configuration which exhibits the best performances, for a given set of economical, technical and environmental criteria. To this end, we formulate a computer-aided framework as an enabling technology for early stage design and analysis of biorefineries. The tool represents different raw materials......, different products and different available technologies and proposes a conceptual (early stage) biorefinery network. This network can then be the basis for further detailed and rigorous model-based studies. In this talk, we demonstrate the application of the tool for generating an early stage optimal...

  15. Review of research in feature based design

    NARCIS (Netherlands)

    Salomons, O.W.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1993-01-01

    Research in feature-based design is reviewed. Feature-based design is regarded as a key factor towards CAD/CAPP integration from a process planning point of view. From a design point of view, feature-based design offers possibilities for supporting the design process better than current CAD systems

  16. Space Launch System Base Heating Test: Sub-Scale Rocket Engine/Motor Design, Development & Performance Analysis

    Science.gov (United States)

    Mehta, Manish; Seaford, Mark; Kovarik, Brian; Dufrene, Aaron; Solly, Nathan

    2014-01-01

    The ATA-002 Technical Team has successfully designed, developed, tested and assessed the SLS Pathfinder propulsion systems for the Main Base Heating Test Program. Major outcomes of the Pathfinder Test Program: reached 90% of full-scale chamber pressure; achieved all engine/motor design parameter requirements; reached steady plume flow behavior in less than 35 msec; held steady chamber pressure for 60 to 100 msec during engine/motor operation; obtained model engine/motor performance similar to the full-scale SLS system; mitigated nozzle throat and combustor thermal erosion; test data shows good agreement with numerical prediction codes. Next phase of the ATA-002 Test Program: design and development of the SLS OML for the Main Base Heating Test; tweak the BSRM design to optimize performance; tweak the CS-REM design to increase robustness. MSFC Aerosciences and CUBRC have the capability to develop sub-scale propulsion systems to meet desired performance requirements for short-duration testing.

  17. Design and analysis of biorefineries based on raw glycerol: addressing the glycerol problem.

    Science.gov (United States)

    Posada, John A; Rincón, Luis E; Cardona, Carlos A

    2012-05-01

    Glycerol as a low-cost by-product of the biodiesel industry can be considered a renewable building block for biorefineries. In this work, the conversion of raw glycerol to nine added-value products obtained by chemical (syn-gas, acrolein, and 1,2-propanediol) or bio-chemical (ethanol, 1,3-propanediol, d-lactic acid, succinic acid, propionic acid, and poly-3-hydroxybutyrate) routes was considered. The technological schemes for these synthesis routes were designed, simulated, and economically assessed using Aspen Plus and Aspen Icarus Process Evaluator, respectively. The techno-economic potential of a glycerol-based biorefinery system for the production of fuels, chemicals, and plastics was analyzed using the Commercial Sale Price/Production Cost ratio criterion, under different production scenarios. More income can be earned from 1,3-propanediol and 1,2-propanediol production, while less income would be obtained from hydrogen and succinic acid. This analysis may be useful mainly for biodiesel producers, since several profitable alternatives are presented and discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
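
    The screening criterion in the abstract is a simple ratio: products with a higher Commercial Sale Price to Production Cost ratio are more attractive. A sketch of that ranking step, with placeholder prices and costs rather than the paper's data:

```python
# $/kg sale price and $/kg production cost -- illustrative numbers only
products = {
    "1,3-propanediol": (1.76, 0.85),
    "1,2-propanediol": (1.60, 0.90),
    "succinic acid":   (2.00, 1.90),
    "hydrogen":        (1.50, 1.45),
}

# Rank products by CSP/PC ratio, highest (most profitable) first
ranked = sorted(products, key=lambda p: products[p][0] / products[p][1],
                reverse=True)
for name in ranked:
    price, cost = products[name]
    print(f"{name:16s} CSP/PC = {price / cost:.2f}")
```

With these placeholder figures the ranking reproduces the abstract's qualitative conclusion: the propanediols lead, hydrogen and succinic acid trail.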

  18. Cooperative Experimental System Development - cooperative techniques beyond initial design and analysis

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1995-01-01

    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification oriented methods usually assume that system design can....../design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work-situations and use-scenarios; transforming results from early cooperative analysis...... be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis...

  19. Above-knee prosthesis design based on fatigue life using finite element method and design of experiment.

    Science.gov (United States)

    Phanphet, Suwattanarwong; Dechjarern, Surangsee; Jomjanyong, Sermkiat

    2017-05-01

    The main objective of this work is to improve the standard of the existing design of knee prosthesis developed by Thailand's Prostheses Foundation of Her Royal Highness The Princess Mother. The experimental structural tests, based on the ISO 10328, of the existing design showed that a few components failed due to fatigue under normal cyclic loading below the required number of cycles. The finite element (FE) simulations of structural tests on the knee prosthesis were carried out. Fatigue life predictions of knee component materials were modeled based on the Morrow approach. The fatigue life prediction based on the FE model result was validated with the corresponding structural test and the results agreed well. The new designs of the failed components were studied using a design-of-experiments approach and finite element analysis of the ISO 10328 structural test of knee prostheses under two separate loading cases. Under ultimate loading, the knee prosthesis peak von Mises stress must be less than the yield strength of the knee component's material and the total knee deflection must be lower than 2.5 mm. The fatigue life prediction of all knee components must be higher than 3,000,000 cycles under normal cyclic loading. The design parameters are the thickness of the joint bars, the diameter of the lower connector and the thickness of the absorber-stopper. The optimized knee prosthesis design meeting all the requirements was recommended. Experimental ISO 10328 structural test of the fabricated knee prosthesis based on the optimized design confirmed the finite element prediction. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
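
    The Morrow approach mentioned above corrects the strain-life equation for mean stress: ε_a = (σ'f − σm)/E·(2N)^b + ε'f·(2N)^c, which is solved numerically for the life N. A sketch with typical steel-like constants (placeholders, not the paper's material data):

```python
import math

E  = 200e3    # Young's modulus, MPa
sf = 900.0    # fatigue strength coefficient sigma'_f, MPa
b  = -0.09    # fatigue strength exponent
ef = 0.26     # fatigue ductility coefficient epsilon'_f
c  = -0.47    # fatigue ductility exponent
sm = 50.0     # mean stress (Morrow correction term), MPa

def strain_amplitude(N):
    # Morrow mean-stress-corrected strain-life relation (2N = reversals)
    return (sf - sm) / E * (2 * N)**b + ef * (2 * N)**c

def life_for(eps_a, lo=1.0, hi=1e9):
    # strain_amplitude is monotone decreasing in N, so bisect in log space
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if strain_amplitude(mid) > eps_a:
            lo = mid
        else:
            hi = mid
    return mid

N = life_for(0.002)   # cycles to failure at 0.002 strain amplitude
print(f"predicted life: {N:.3e} cycles")
```

In the paper's workflow, the strain amplitude would come from the FE solution at the critical location, and the predicted N is then checked against the 3,000,000-cycle requirement.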

  20. Performance-based seismic design of steel frames utilizing colliding bodies algorithm.

    Science.gov (United States)

    Veladi, H

    2014-01-01

    A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find the optimum seismic design of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared to conventional design methods to demonstrate the strengths and weaknesses of the algorithm.

  1. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    Full Text Available In the Performance Based Plastic Design method, the failure mechanism is predetermined, which has made the method popular throughout the world. However, due to a lack of proper guidelines and a simple stepwise methodology, it is not widely used in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a Concentrically Braced Frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Total collapse of the frame is also prevented in the PBPD frame.

  2. Stochastic analysis for Poisson point processes Malliavin calculus, Wiener-Itô chaos expansions and stochastic geometry

    CERN Document Server

    Peccati, Giovanni

    2016-01-01

    Stochastic geometry is the branch of mathematics that studies geometric structures associated with random configurations, such as random graphs, tilings and mosaics. Due to its close ties with stereology and spatial statistics, the results in this area are relevant for a large number of important applications, e.g. to the mathematical modeling and statistical analysis of telecommunication networks, geostatistics and image analysis. In recent years – due mainly to the impetus of the authors and their collaborators – a powerful connection has been established between stochastic geometry and the Malliavin calculus of variations, which is a collection of probabilistic techniques based on the properties of infinite-dimensional differential operators. This has led in particular to the discovery of a large number of new quantitative limit theorems for high-dimensional geometric objects. This unique book presents an organic collection of authoritative surveys written by the principal actors in this rapidly evolvi...
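
    The basic random configuration the book studies can be simulated in a few lines: a homogeneous Poisson point process on a window, where the point count is Poisson with mean intensity × area and the points are then placed uniformly. The intensity and window here are arbitrary choices for illustration:

```python
import math
import random

def poisson_sample(mean, rng):
    # Knuth's multiplication method (adequate for moderate means)
    L, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(7)
intensity, w, h = 50.0, 1.0, 1.0          # 50 points expected per unit area

# Step 1: draw the number of points; Step 2: scatter them uniformly
n = poisson_sample(intensity * w * h, rng)
points = [(rng.random() * w, rng.random() * h) for _ in range(n)]
print(f"{len(points)} points; expected {intensity * w * h:.0f}")
```

Functionals of such configurations (point counts in regions, nearest-neighbour distances, volumes of induced tilings) are exactly the high-dimensional geometric objects whose limit theorems the Malliavin-calculus connection addresses.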

  3. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors; no methods or theories exist for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can be developed from PWR analysis theories and methods. Based on the TMSR design, the theories and methods for its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis. The analysis of TMSR initiating events is preliminarily studied and described. The research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  4. Analysis and design of Fuel Cycle Plant for natural phenomena hazards

    International Nuclear Information System (INIS)

    Horsager, B.K.

    1985-01-01

    A description of the Design Basis and the analysis and design methods used for natural phenomena at the Fuel Cycle Plant at Hanford, Washington is presented. A physical description of the main process facility and the auxiliary emergency and support facilities is given. The mission of the facility is presented and a brief description of the processes which will take place within the facility is given. The Design Criteria and design bases for natural phenomena including tornados, earthquakes and volcanic eruptions are described

  5. Reusing Design Knowledge Based on Design Cases and Knowledge Map

    Science.gov (United States)

    Yang, Cheng; Liu, Zheng; Wang, Haobai; Shen, Jiaoqi

    2013-01-01

    Design knowledge was reused in innovative design work to support designers with product design knowledge and to help designers who lack rich experience improve their design capability and efficiency. First, based on the ontological model of product design knowledge constructed by taxonomy, implicit and explicit knowledge was extracted from some…

  6. Design and analysis for piping systems

    International Nuclear Information System (INIS)

    Sterkel, H.-P.; Cutrim, J.H.C.

    1981-01-01

    The procedure and typical techniques used at NUCLEN for the design and calculation of the piping of nuclear plants are presented. The classification system is generically described, and the analysis techniques used for the design and verification of the piping systems are shown, i.e. pressure design for the dimensioning of the wall thicknesses, and temperature and dead weight analysis together with the determination of support points. The techniques of dynamic design and analysis are described for earthquake and pressure impulse loadings. (Author) [pt

  7. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  8. The design of portable X-ray fluorescence analyzer based on PDA

    International Nuclear Information System (INIS)

    Zhou Jianbin; Ma Yingjie; Wang Lei; Tong Yunfu

    2010-01-01

    A portable X-ray fluorescence analyzer based on a PDA is designed. The high-performance single-chip microcomputer C8051F060 serves as the core controller of the measure-control board. Communication between the PDA and the measure-control board is based on Bluetooth technology. Benefiting from the rich on-chip resources of the C8051F060, it is easy to design the MCA (Multi-Channel Analyzer), detection-control circuit and peak-detection circuit compactly. WinCE OS runs on the PDA, and the analysis software is designed in Visual Studio 2005 in C++. The system is powered by a lithium battery. (authors)

  9. Seismic design and analysis considerations for high level nuclear waste repositories

    International Nuclear Information System (INIS)

    Hossain, Q.A.

    1993-01-01

    A high level nuclear waste repository, like the one at Nevada's Yucca Mountain that is being investigated for site suitability, will have some unique seismic design and analysis considerations. These are discussed, and a design philosophy that can rationally account for the unique performance objectives of such facilities is presented. A case is made for the use of DOE's performance goal-based seismic design and evaluation methodology that is based on a hybrid ''deterministic'' and ''probabilistic'' concept. How and to what extent this methodology should be modified to adopt it for a potential site like Yucca Mountain is also outlined. Finally, the issue of designing for seismic fault rupture is discussed briefly, and the desirability of using the proposed seismic design philosophy in fault rupture evaluation is described


  11. The role of function analysis in the ACR control centre design

    International Nuclear Information System (INIS)

    Leger, R.P.; Davey, E.C.

    2006-01-01

    An essential aspect of control centre design is the need to characterize: plant functions and their inter-relationships to support the achievement of operational goals, and roles for humans and automation in sharing and exchanging the execution of functions across all operational phases. Function analysis is a design activity that has been internationally accepted as an approach to satisfy this need. It is recognized as a fundamental and necessary component in the systematic approach to control centre design and is carried out early in the design process. A function analysis can provide a clear basis for: the control centre design, for the purposes of design team communication and customer or regulatory review; the control centre display and control systems; the staffing and layout requirements of the control centre; assessing the completeness of control centre displays and controls prior and supplementary to mock-up walkthroughs or simulator evaluations; and the design of operating procedures and training programs. This paper will explore the role for function analysis in supporting the design of the control centre. The development of the ACR control room will be used as an illustrative context for the discussion. The paper will also discuss the merits of using function analysis in a goal- or function-based approach, resulting in a more robust, operationally compatible, and cost-effective design over the life of the plant. Two former papers outlined the evolution in AECL's application approach and lessons learned in applying function analysis in support of control room design. This paper provides the most recent update to this progression in application refinement. (author)

  12. Application of Monte Carlo filtering method in regional sensitivity analysis of AASHTOWare Pavement ME design

    Directory of Open Access Journals (Sweden)

    Zhong Wu

    2017-04-01

    Since AASHTO released the Mechanistic-Empirical Pavement Design Guide (MEPDG for public review in 2004, many highway research agencies have performed sensitivity analyses using the prototype MEPDG design software. The information provided by the sensitivity analysis is essential for design engineers to better understand the MEPDG design models and to identify important input parameters for pavement design. In literature, different studies have been carried out based on either local or global sensitivity analysis methods, and sensitivity indices have been proposed for ranking the importance of the input parameters. In this paper, a regional sensitivity analysis method, Monte Carlo filtering (MCF, is presented. The MCF method maintains many advantages of the global sensitivity analysis, while focusing on the regional sensitivity of the MEPDG model near the design criteria rather than the entire problem domain. It is shown that the information obtained from the MCF method is more helpful and accurate in guiding design engineers in pavement design practices. To demonstrate the proposed regional sensitivity method, a typical three-layer flexible pavement structure was analyzed at input level 3. A detailed procedure to generate Monte Carlo runs using the AASHTOWare Pavement ME Design software was provided. The results in the example show that the sensitivity ranking of the input parameters in this study reasonably matches with that in a previous study under a global sensitivity analysis. Based on the analysis results, the strengths, practical issues, and applications of the MCF method were further discussed.
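
    The Monte Carlo filtering idea described above can be sketched in a few lines: sample the inputs, split the runs into behavioral and non-behavioral against the design criterion, and compare the two input distributions. The pavement response model, design limit, and input ranges below are hypothetical stand-ins, not the AASHTOWare ME models:

    ```python
    import random
    import statistics

    # Toy stand-in for a pavement response model: rut depth (mm) as a
    # function of layer thickness and modulus.  The real MEPDG models are
    # far more complex; this only illustrates the filtering step.
    def rut_depth(thickness, modulus):
        return 28.0 - 1.2 * thickness - 0.02 * modulus

    random.seed(42)
    DESIGN_LIMIT = 12.0  # hypothetical design criterion, mm

    behavioral, non_behavioral = [], []
    for _ in range(5000):
        t = random.uniform(5.0, 15.0)     # thickness (hypothetical range)
        m = random.uniform(100.0, 500.0)  # modulus (hypothetical range)
        (behavioral if rut_depth(t, m) <= DESIGN_LIMIT else non_behavioral).append((t, m))

    # Regional sensitivity: inputs whose behavioral vs. non-behavioral
    # distributions differ most matter most near the design criterion.
    # A mean shift scaled by the pooled spread is used here for brevity;
    # the standard MCF test is a two-sample Kolmogorov-Smirnov statistic.
    def mean_shift(idx):
        b = [s[idx] for s in behavioral]
        n = [s[idx] for s in non_behavioral]
        pooled = statistics.pstdev(b + n)
        return abs(statistics.mean(b) - statistics.mean(n)) / pooled

    print(f"thickness shift: {mean_shift(0):.3f}")
    print(f"modulus shift:   {mean_shift(1):.3f}")
    ```

    With these made-up coefficients, thickness dominates the split, so its distribution shift comes out larger than the modulus shift.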

  13. Enhancing product robustness in reliability-based design optimization

    International Nuclear Information System (INIS)

    Zhuang, Xiaotian; Pan, Rong; Du, Xiaoping

    2015-01-01

    Different types of uncertainties need to be addressed in a product design optimization process. In this paper, the uncertainties in both product design variables and environmental noise variables are considered. The reliability-based design optimization (RBDO) is integrated with robust product design (RPD) to concurrently reduce the production cost and the long-term operation cost, including quality loss, in the process of product design. This problem leads to a multi-objective optimization with probabilistic constraints. In addition, the model uncertainties associated with a surrogate model that is derived from numerical computation methods, such as finite element analysis, are addressed. A hierarchical experimental design approach, augmented by a sequential sampling strategy, is proposed to construct the response surface of product performance function for finding optimal design solutions. The proposed method is demonstrated through an engineering example. - Highlights: • A unifying framework for integrating RBDO and RPD is proposed. • Implicit product performance function is considered. • The design problem is solved by sequential optimization and reliability assessment. • A sequential sampling technique is developed for improving design optimization. • The comparison with traditional RBDO is provided
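
    The core RBDO ingredient above, a probabilistic constraint, can be illustrated with a deliberately simple sketch: choose the cheapest design whose Monte Carlo failure probability under a noisy environmental variable stays below a target. The limit-state function, cost model, and all numbers are hypothetical and much simpler than the paper's surrogate-based formulation:

    ```python
    import random

    random.seed(0)

    # Hypothetical limit-state function: g(d, load) < 0 denotes failure
    # of a member of thickness d under a random load.
    def g(d, load):
        return 10.0 * d - load

    def failure_prob(d, n=20000):
        # Crude Monte Carlo estimate of P(g < 0) under Gaussian load noise.
        fails = 0
        for _ in range(n):
            load = random.gauss(40.0, 8.0)  # environmental noise variable
            if g(d, load) < 0:
                fails += 1
        return fails / n

    def cost(d):
        return 3.0 * d  # hypothetical production cost

    P_TARGET = 0.01
    candidates = [d / 10 for d in range(30, 81)]  # d in [3.0, 8.0]
    feasible = [d for d in candidates if failure_prob(d) <= P_TARGET]
    best = min(feasible, key=cost)
    print(f"optimal d = {best:.1f}, cost = {cost(best):.1f}")
    ```

    A normal-tail check (10d ≥ 40 + 2.33·8 ≈ 58.6) puts the true optimum near d ≈ 5.9; the Monte Carlo estimate lands close to that. Real RBDO methods replace the brute-force loop with reliability indices or sequential optimization and reliability assessment, as the paper's highlights note.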

  14. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, which may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility

  15. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can

  16. An analysis of teaching competence in science teachers involved in the design of context-based curriculum materials

    NARCIS (Netherlands)

    Putter - Smits, de L.G.A.; Taconis, R.; Driel, van J.H.; Jochems, W.M.G.

    2012-01-01

    The committees for the current Dutch context-based innovation in secondary science education employed teachers to design context-based curriculum materials. A study on the learning of science teachers in design teams for context-based curriculum materials is presented in this paper. In a correlation

  17. Design principles for simulation games for learning clinical reasoning: A design-based research approach.

    Science.gov (United States)

    Koivisto, J-M; Haavisto, E; Niemi, H; Haho, P; Nylund, S; Multisilta, J

    2018-01-01

    Nurses sometimes lack the competence needed for recognising deterioration in patient conditions and this is often due to poor clinical reasoning. There is a need to develop new possibilities for learning this crucial competence area. In addition, educators need to be future oriented; they need to be able to design and adopt new pedagogical innovations. The purpose of the study is to describe the development process and to generate principles for the design of nursing simulation games. A design-based research methodology is applied in this study. Iterative cycles of analysis, design, development, testing and refinement were conducted via collaboration among researchers, educators, students, and game designers. The study facilitated the generation of reusable design principles for simulation games to guide future designers when designing and developing simulation games for learning clinical reasoning. This study makes a major contribution to research on simulation game development in the field of nursing education. The results of this study provide important insights into the significance of involving nurse educators in the design and development process of educational simulation games for the purpose of nursing education. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Development of a standard data base for FBR core nuclear design (XIII). Analysis of small sample reactivity experiments at ZPPR-9

    International Nuclear Information System (INIS)

    Sato, Wakaei; Fukushima, Manabu; Ishikawa, Makoto

    2000-09-01

    A comprehensive study to evaluate and accumulate the abundant results of fast reactor physics is now in progress at O-arai Engineering Center to improve analytical methods and prediction accuracy of nuclear design for large fast breeder cores such as future commercial FBRs. The present report summarizes the analytical results of sample reactivity experiments at ZPPR-9 core, which has not been evaluated by the latest analytical method yet. The intention of the work is to extend and further generalize the standard data base for FBR core nuclear design. The analytical results of the sample reactivity experiments (samples: PU-30, U-6, DU-6, SS-1 and B-1) at ZPPR-9 core in JUPITER series, with the latest nuclear data library JENDL-3.2 and the analytical method which was established by the JUPITER analysis, can be concluded as follows: The region-averaged final C/E values generally agreed with unity within 5% differences at the inner core region. However, the C/E values of every sample showed the radial space-dependency increasing from center to core edge, especially the discrepancy of B-1 was the largest by 10%. Next, the influence of the present analytical results for the ZPPR-9 sample reactivity to the cross-section adjustment was evaluated. The reference case was a unified cross-section set ADJ98 based on the recent JUPITER analysis. As a conclusion, the present analytical results have sufficient physical consistency with other JUPITER data, and possess qualification as a part of the standard data base for FBR nuclear design. (author)

  19. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses the g...

  20. The Design and Analysis of Learning Effects for a Game-based Learning System

    OpenAIRE

    Wernhuar Tarng; Weichian Tsai

    2010-01-01

    The major purpose of this study is to use network and multimedia technologies to build a game-based learning system for junior high school students to apply in learning "World Geography" through the "role-playing" game approaches. This study first investigated the motivation and habits of junior high school students to use the Internet and online games, and then designed a game-based learning system according to situated and game-based learning theories. A teaching experiment was conducted to...

  1. Multivariant design and multiple criteria analysis of building refurbishments

    Energy Technology Data Exchange (ETDEWEB)

    Kaklauskas, A.; Zavadskas, E. K.; Raslanas, S. [Faculty of Civil Engineering, Vilnius Gediminas Technical University, Vilnius (Lithuania)

    2005-07-01

    In order to design and realize an efficient building refurbishment, it is necessary to carry out an exhaustive investigation of all solutions that form it. The efficiency level of the considered building's refurbishment depends on a great many factors, including: cost of refurbishment, annual fuel economy after refurbishment, tentative pay-back time, harmfulness to health of the materials used, aesthetics, maintenance properties, functionality, comfort, sound insulation and longevity, etc. Solutions of an alternative character allow for a more rational and realistic assessment of economic, ecological, legislative, climatic, social and political conditions and traditions, and for better satisfaction of customer requirements. They also enable one to cut down on refurbishment costs. In carrying out the multivariant design and multiple criteria analysis of a building refurbishment, much data was processed and evaluated. Feasible alternatives could number as many as 100,000. How to perform a multivariant design and multiple criteria analysis of alternative variants based on this enormous amount of information became the problem. Methods of multivariant design and multiple criteria analysis of a building refurbishment were developed by the authors to solve the above problems. In order to demonstrate the developed method, a practical example is presented in this paper. (author)
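
    The kind of multiple criteria ranking described above can be sketched with a minimal weighted-sum scoring of alternatives. This is not the authors' own method, just a generic illustration; the alternatives, criterion values, and weights below are invented:

    ```python
    # Hypothetical refurbishment alternatives scored against three of the
    # criteria the abstract lists: cost (minimize), annual fuel saving
    # (maximize), and pay-back time (minimize).
    alternatives = {
        "A1": {"cost": 120_000, "fuel_saving": 8_000, "payback": 15},
        "A2": {"cost": 150_000, "fuel_saving": 12_000, "payback": 12},
        "A3": {"cost": 100_000, "fuel_saving": 5_000, "payback": 20},
    }
    weights = {"cost": 0.4, "fuel_saving": 0.4, "payback": 0.2}
    minimize = {"cost", "payback"}  # direction of each criterion

    def score(alt):
        total = 0.0
        for crit, w in weights.items():
            vals = [a[crit] for a in alternatives.values()]
            lo, hi = min(vals), max(vals)
            norm = (alt[crit] - lo) / (hi - lo)  # scale to 0..1 within the set
            if crit in minimize:
                norm = 1.0 - norm                # invert cost-type criteria
            total += w * norm
        return total

    ranking = sorted(alternatives, key=lambda k: score(alternatives[k]), reverse=True)
    print("ranking:", ranking)  # ranking: ['A2', 'A1', 'A3']
    ```

    With 100,000 feasible variants, as in the paper, the same scoring loop applies unchanged; the work lies in generating the variants and choosing defensible weights.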

  2. Affordable Design: A Methodology to Implement Process-Based Manufacturing Cost into the Traditional Performance-Focused Multidisciplinary Design Optimization

    Science.gov (United States)

    Bao, Han P.; Samareh, J. A.

    2000-01-01

    The primary objective of this paper is to demonstrate the use of process-based manufacturing and assembly cost models in a traditional performance-focused multidisciplinary design and optimization process. The use of automated cost-performance analysis is an enabling technology that could bring realistic process-based manufacturing and assembly cost into multidisciplinary design and optimization. In this paper, we present a new methodology for incorporating process costing into a standard multidisciplinary design optimization process. Material, manufacturing process, and assembly process costs could then be used as the objective function for the optimization method. A case study involving forty-six different configurations of a simple wing is presented, indicating that a design based on performance criteria alone may not necessarily be the most affordable as far as manufacturing and assembly cost is concerned.
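
    The paper's central point, that the performance-optimal configuration need not be the most affordable one, can be shown with a toy comparison. The wing configurations, drag counts, and costs below are invented for illustration and are not the paper's forty-six cases:

    ```python
    # Hypothetical configurations: (name, drag_count, manufacturing_cost_k$).
    configs = [
        ("wing-A", 210, 930),
        ("wing-B", 215, 780),
        ("wing-C", 225, 640),
        ("wing-D", 240, 610),
    ]
    DRAG_LIMIT = 230  # hypothetical performance requirement

    # Performance-only selection: minimize drag.
    best_performance = min(configs, key=lambda c: c[1])

    # Affordability selection: minimize cost among configs meeting the requirement.
    cheapest_feasible = min((c for c in configs if c[1] <= DRAG_LIMIT),
                            key=lambda c: c[2])

    print("performance-optimal:", best_performance[0])   # wing-A
    print("affordable design:  ", cheapest_feasible[0])  # wing-C
    ```

    Using cost as the objective with performance as a constraint, rather than the reverse, is exactly the reframing the methodology enables.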

  3. GIS-based landscape design research: Stourhead landscape garden as a case study

    Directory of Open Access Journals (Sweden)

    Steffen Nijhuis

    2017-11-01

    Landscape design research is important for cultivating spatial intelligence in landscape architecture. This study explores GIS (geographic information systems) as a tool for landscape design research - investigating landscape designs to understand them as architectonic compositions (architectonic plan analysis). The concept ‘composition' refers to a conceivable arrangement, an architectural expression of a mental construct that is legible and open to interpretation. Landscape architectonic compositions and their representations embody a great wealth of design knowledge as objects of our material culture and reflect the possible treatment of the ground, space, image and program as a characteristic coherence. By exploring landscape architectonic compositions with GIS, design researchers can acquire design knowledge that can be used in the creation and refinement of a design. The research aims to identify and illustrate the potential role of GIS as a tool in landscape design research, so as to provide insight into the possibilities and limitations of using GIS in this capacity. The critical, information-oriented case of Stourhead landscape garden (Wiltshire, UK), an example of a designed landscape that covers the scope and remit of landscape architecture design, forms the heart of the study. The exploration of Stourhead by means of GIS can be understood as a plausibility probe. Here the case study is considered a form of ‘quasi-experiment', testing the hypothesis and generating a learning process that constitutes a prerequisite for advanced understanding, while using an adjusted version of the framework for landscape design analysis by Steenbergen and Reh (2003). This is a theoretically informed analytical method based on the formal interpretation of the landscape architectonic composition addressing four landscape architectonic categories: the basic, the spatial, the symbolic and the programmatic form. This study includes new aspects to be

  4. A Priori Implementation Effort Estimation for HW Design Based on Independent-Path Analysis

    DEFF Research Database (Denmark)

    Abildgren, Rasmus; Diguet, Jean-Philippe; Bomel, Pierre

    2008-01-01

    This paper presents a metric-based approach for estimating the hardware implementation effort (in terms of time) for an application in relation to the number of linear-independent paths of its algorithms. We exploit the relation between the number of edges and linear-independent paths in an algorithm and the corresponding implementation effort. We propose an adaptation of the concept of cyclomatic complexity, complemented with a correction function to take designers' learning curve and experience into account. Our experimental results, composed of a training and a validation phase, show that with the proposed approach it is possible to estimate the hardware implementation effort. This approach, part of our light design space exploration concept, is implemented in our framework ''Design-Trotter'' and offers a new type of tool that can help designers and managers to reduce the time-to-market factor.
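
    The underlying metric is cyclomatic complexity, V(G) = E - N + 2P for a control-flow graph with E edges, N nodes, and P connected components. The effort scaling and experience correction below are hypothetical placeholders for the paper's calibrated correction function:

    ```python
    # Number of linear-independent paths of a control-flow graph.
    def cyclomatic_complexity(edges, nodes, components=1):
        return edges - nodes + 2 * components

    # Hypothetical effort model: hours per independent path, corrected by
    # an experience factor (< 1.0 models a designer further along the
    # learning curve).  The paper calibrates its own correction function.
    def effort_hours(edges, nodes, hours_per_path=6.0, experience=1.0):
        return cyclomatic_complexity(edges, nodes) * hours_per_path * experience

    # Example CFG: 11 edges, 9 nodes, one connected component -> V(G) = 4.
    print(cyclomatic_complexity(11, 9))         # 4
    print(effort_hours(11, 9, experience=0.8))  # 19.2
    ```

    In practice the graph counts would come from the algorithm's dataflow representation inside Design-Trotter rather than being supplied by hand.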

  5. Analysis and Design of High-Order Parallel Resonant Converters

    Science.gov (United States)

    Batarseh, Issa Eid

    1990-01-01

    In this thesis, a special state variable transformation technique has been derived for the analysis of high order dc-to-dc resonant converters. Converters comprised of high order resonant tanks have the advantage of utilizing the parasitic elements by making them part of the resonant tank. A new set of state variables is defined in order to make use of two-dimensional state-plane diagrams in the analysis of high order converters. Such a method has been successfully used for the analysis of the conventional Parallel Resonant Converters (PRC). Consequently, two-dimensional state-plane diagrams are used to analyze the steady state response for third and fourth order PRCs when these converters are operated in the continuous conduction mode. Based on this analysis, a set of control characteristic curves for the LCC-, LLC- and LLCC-type PRC are presented from which various converter design parameters are obtained. Various design curves for component value selections and device ratings are given. This analysis of high order resonant converters shows that the addition of the reactive components to the resonant tank results in converters with better performance characteristics when compared with the conventional second order PRC. A complete design procedure along with design examples for 2nd, 3rd and 4th order converters is presented. Practical power supply units, normally used for computer applications, were built and tested by using the LCC-, LLC- and LLCC-type commutation schemes.
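
    The state-plane idea can be seen on the simplest possible tank, an undamped series LC driven by a dc source E, which is a much-simplified stand-in for the thesis's high-order tanks. With normalized capacitor voltage x = vC/E and normalized inductor current y = iL·Z0/E (Z0 = sqrt(L/C)), the trajectory is a circle centred at (1, 0); the component values below are arbitrary:

    ```python
    import math

    L, C, E = 100e-6, 1e-6, 1.0      # arbitrary tank values and source
    Z0 = math.sqrt(L / C)            # characteristic impedance, 10 ohms here
    w0 = 1.0 / math.sqrt(L * C)      # resonant angular frequency

    v, i = 0.0, 0.0                  # start at the origin of the state plane
    dt = 1.0 / (w0 * 2000)           # small step relative to the resonant period
    radii = []
    for _ in range(4000):
        # Semi-implicit (symplectic) Euler keeps the trajectory on the circle.
        i += (E - v) / L * dt
        v += i / C * dt
        x, y = v / E, i * Z0 / E
        radii.append(math.hypot(x - 1.0, y))

    print(f"state-plane radius min/max: {min(radii):.3f} / {max(radii):.3f}")
    ```

    The radius stays essentially constant at 1.0, which is why circular arcs in the state plane let steady-state intervals of resonant converters be pieced together geometrically; the thesis's transformation extends this two-dimensional picture to third and fourth order tanks.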

  6. Operation and maintenance requirements of system design bases

    International Nuclear Information System (INIS)

    Banerjee, A.K.; Hanley, N.E.

    1989-01-01

    All system designs make assumptions about system operation, testing, inspection, and maintenance. Existing industry codes and standards explicitly address design requirements of new systems, while issues related to system and plant reliability, life, design margins, effects of service conditions, operation, maintenance, etc., usually are implicit. However, system/component design documents of existing power plants often address the code requirements without considering the operation, maintenance, inspection, and testing (OMIT) requirements. The nuclear industry is expending major efforts at most nuclear power plants to reassemble and/or reconstitute system design bases. Stone & Webster Engineering Corporation (SWEC) recently addressed the OMIT requirements of system/component design as an integral part of a utility's preventive maintenance program. For each component, SWEC reviewed vendor recommendations, NPRDS data/industry experience, the existing maintenance program, component service conditions, and actual plant experience. A maintenance program that considers component service conditions and plant experience ensures a connection between maintenance and design basis. Root cause analysis of failure and engineering evaluation of service condition are part of the program. System/component OMIT requirements also are compared against system design, service condition, degradation mechanism, etc., through system/component life-cycle evaluation

  7. Service network design of bike sharing systems analysis and optimization

    CERN Document Server

    Vogel, Patrick

    2016-01-01

    This monograph presents a tactical planning approach for service network design in metropolitan areas. Designing the service network requires the suitable aggregation of demand data as well as the anticipation of operational relocation decisions. To this end, an integrated approach of data analysis and mathematical optimization is introduced. The book also includes a case study based on real-world data to demonstrate the benefit of the proposed service network design approach. The target audience comprises primarily research experts in the field of traffic engineering, but the book may also be beneficial for graduate students.

  8. Molecular-based recursive partitioning analysis model for glioblastoma in the temozolomide era: a correlative analysis based on NRG Oncology RTOG 0525

    NARCIS (Netherlands)

    Bell, Erica Hlavin; Pugh, Stephanie L.; McElroy, Joseph P.; Gilbert, Mark R.; Mehta, Minesh; Klimowicz, Alexander C.; Magliocco, Anthony; Bredel, Markus; Robe, Pierre; Grosu, Anca L.; Stupp, Roger; Curran, Walter; Becker, Aline P.; Salavaggione, Andrea L.; Barnholtz-Sloan, Jill S.; Aldape, Kenneth; Blumenthal, Deborah T.; Brown, Paul D.; Glass, Jon; Souhami, Luis; Lee, R. Jeffrey; Brachman, David; Flickinger, John; Won, Minhee; Chakravarti, Arnab

    2017-01-01

    IMPORTANCE: There is a need for a more refined, molecularly based classification model for glioblastoma (GBM) in the temozolomide era. OBJECTIVE: To refine the existing clinically based recursive partitioning analysis (RPA) model by incorporating molecular variables. DESIGN, SETTING, AND

  9. Coal conversion processes and analysis methodologies for synthetic fuels production. [technology assessment and economic analysis of reactor design for coal gasification

    Science.gov (United States)

    1979-01-01

    Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.

  10. Development and Validation of a Hypersonic Vehicle Design Tool Based On Waverider Design Technique

    Science.gov (United States)

    Dasque, Nastassja

    Methodologies for a tool capable of assisting design initiatives for practical waverider based hypersonic vehicles were developed and validated. The design space for vehicle surfaces was formed using an algorithm that coupled directional derivatives with the conservation laws to determine a flow field defined by a set of post-shock streamlines. The design space is used to construct an ideal waverider with a sharp leading edge. A blunting method was developed to modify the ideal shapes to a more practical geometry for real-world application. Empirical and analytical relations were then systematically applied to the resulting geometries to determine local pressure, skin-friction and heat flux. For the ideal portion of the geometry, flat plate relations for compressible flow were applied. For the blunted portion of the geometry, modified Newtonian theory, Fay-Riddell theory and Modified Reynolds analogy were applied. The design and analysis methods were validated using analytical solutions as well as empirical and numerical data. The streamline solution for the flow field generation technique was compared with a Taylor-Maccoll solution and showed very good agreement. The relationship between the local Stanton number and skin friction coefficient with local Reynolds number along the ideal portion of the body showed good agreement with experimental data. In addition, an automated grid generation routine was formulated to construct a structured mesh around resulting geometries in preparation for Computational Fluid Dynamics analysis. The overall analysis of the waverider body using the tool was then compared to CFD studies. The CFD flow field showed very good agreement with the design space. However, the distribution of the surface properties was close to the CFD results, though the agreement was not as strong.
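
    One of the relations named above, modified Newtonian theory, is compact enough to sketch: Cp = Cp_max·sin²θ, where θ is the local surface inclination to the freestream and Cp_max is the stagnation-point pressure coefficient from the Rayleigh pitot formula. This is a standard textbook form, not code from the paper's tool, and the Mach number used is arbitrary:

    ```python
    import math

    # Stagnation-point pressure coefficient behind a normal shock
    # (Rayleigh pitot formula), used as Cp_max in modified Newtonian theory.
    def cp_max(mach, gamma=1.4):
        g = gamma
        p02_pinf = ((((g + 1) * mach**2) / 2) ** (g / (g - 1)) *
                    ((g + 1) / (2 * g * mach**2 - (g - 1))) ** (1 / (g - 1)))
        return 2 / (g * mach**2) * (p02_pinf - 1)

    # Modified Newtonian surface pressure: Cp = Cp_max * sin^2(theta).
    def cp_modified_newtonian(mach, theta_deg):
        return cp_max(mach) * math.sin(math.radians(theta_deg)) ** 2

    for theta in (90, 60, 30):
        cp = cp_modified_newtonian(10, theta)
        print(f"M=10, theta={theta:2d} deg: Cp = {cp:.3f}")
    ```

    At M = 10 and γ = 1.4, Cp_max comes out near 1.83 (approaching the 1.839 limit as M → ∞), and the sin²θ factor then distributes that pressure over the blunted nose.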

  11. Thermodynamic analysis of combined cycle under design/off-design conditions for its efficient design and operation

    International Nuclear Information System (INIS)

    Zhang, Guoqiang; Zheng, Jiongzhi; Xie, Angjun; Yang, Yongping; Liu, Wenyi

    2016-01-01

    Highlights: • Based on the PG9351FA gas turbine, two gas-steam combined cycles are redesigned. • Analysis of detailed off-design characteristics of the combined cycle main parts. • Suggestions for improving design and operation performance of the combined cycle. • Higher design efficiency has higher off-design efficiency in general PR range. • High pressure ratio combined cycles possess good off-design performance. - Abstract: To achieve a highly efficient design and operation of combined cycles, this study analyzed in detail the off-design characteristics of the main components of three combined cycles with different compressor pressure ratios (PRs) based on real units. The off-design model of combined cycle was built consisting of a compressor, a combustor, a gas turbine, and a heat recovery steam generator (HRSG). The PG9351FA unit is selected as the benchmark unit, on the basis of which the compressor is redesigned with two different PRs. Then, the design/off-design characteristics of the three units with different design PRs and the interactive relations between topping and bottoming cycles are analyzed with the same turbine inlet temperature (TIT). The results show that the off-design characteristics of the topping cycle affect dramatically the combined cycle performance. The variation range of the exergy efficiency of the topping cycle for the three units is between 11.9% and 12.4% under the design/off-design conditions. This range is larger than that of the bottoming cycle (between 9.2% and 9.5%). The HRSG can effectively recycle the heat/heat exergy of the gas turbine exhaust. Comparison among the three units shows that for a traditional gas-steam combined cycle, a high design efficiency results in a high off-design efficiency in the usual PR range. The combined cycle design efficiency of higher pressure ratio is almost equal to that of the PG9351FA, but its off-design efficiency is higher (maximum 0.42%) and the specific power decreases. As for

  12. The Efficiency of Split Panel Designs in an Analysis of Variance Model

    Science.gov (United States)

    Wang, Wei-Guo; Liu, Hai-Jun

    2016-01-01

    We consider split panel design efficiency in analysis of variance models, that is, the determination of the optimal proportion of cross-section series in all samples, to minimize the variances of best linear unbiased estimators of linear combinations of parameters. An orthogonal matrix is constructed to obtain a manageable expression of the variances. On this basis, we derive a theorem for analyzing split panel design efficiency irrespective of interest and budget parameters. Additionally, the relative efficiency of an estimator based on the split panel to an estimator based on a pure panel or a pure cross-section is presented. The analysis shows that the gains from a split panel can be quite substantial. We further consider the efficiency of split panel design, given a budget, and transform it to a constrained nonlinear integer programming problem. Specifically, an efficient algorithm is designed to solve the constrained nonlinear integer programming. Moreover, we combine one at a time designs and factorial designs to illustrate the algorithm's efficiency with an empirical example concerning monthly consumer expenditure on food in 1985, in the Netherlands, and the efficient ranges of the algorithm parameters are given to ensure a good solution. PMID:27163447
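
    The trade-off the paper formalises can be seen in a back-of-envelope model: with n units per wave, a fraction p of them retained as a panel with between-wave correlation ρ, and equal variances σ², the variance of the estimated change between waves is Var(ȳ₂ − ȳ₁) = (2σ²/n)(1 − pρ). This simple formula ignores the paper's budget constraints and ANOVA structure; the numbers below are illustrative:

    ```python
    # Variance of the estimated change between two waves under a split
    # panel: fraction p of the n units per wave is a retained panel with
    # correlation rho; the rest are fresh independent cross-sections.
    def change_variance(n, sigma, p, rho):
        return (2 * sigma**2 / n) * (1 - p * rho)

    n, sigma, rho = 200, 1.0, 0.7  # illustrative sample size and correlation
    for p in (0.0, 0.5, 1.0):
        print(f"panel share p={p:.1f}: Var = {change_variance(n, sigma, p, rho):.4f}")
    ```

    More panel overlap always helps this change estimator (variance falls from 0.0100 to 0.0030 as p goes from 0 to 1 here), but it can hurt estimators of wave-specific levels, which is why the optimal split proportion is a genuine optimization problem.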

  13. Visualization analysis and design

    CERN Document Server

    Munzner, Tamara

    2015-01-01

    Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...

  14. Neutronics analysis of the conceptual design of a component test facility based on the spherical tokamak

    International Nuclear Information System (INIS)

    Zheng, S.; Voss, G.M.; Pampin, R.

    2010-01-01

    One of the crucial aspects of fusion research is the optimisation and qualification of suitable materials and components. To enable the design and construction of DEMO in the future, ITER will demonstrate scientific and technological feasibility and IFMIF will provide rigorous testing of small material samples. Meanwhile, a dedicated, small-scale component test facility (CTF) is proposed to complement and extend the functions of ITER and IFMIF and to operate in association with DEMO, so as to reduce the risk of delays during this phase of fusion power development. The design of a spherical tokamak (ST)-based CTF is being developed, which offers many advantages over conventional machines, including lower tritium consumption, easier maintenance, and a compact assembly. The neutronics analysis of this system is presented here. Based on a three-dimensional neutronics model generated by the interface programme MCAM from CAD models, a series of nuclear and radiation protection analyses were carried out using the MCNP code and the FENDL2.1 nuclear data library to assess the current design and guide its development where needed. The nuclear analyses address key neutronics issues such as the neutron wall loading (NWL) profile, nuclear heat loads, and radiation damage to the coil insulation and to structural components, particularly the stainless steel vessel wall close to the NBI ports where shielding is limited. The shielding of the divertor coil and of the internal poloidal field (PF) coil, which is introduced in the expanded divertor design, is optimised to reduce their radiation damage. The preliminary results show that the peak radiation damage to the martensitic/ferritic steel structure is about 29 dpa at the mid-plane, assuming a life of 12 years at a duty factor of 33%, which is much lower than its ∼150 dpa limit.
    In addition, with TBMs installed in 8 mid-plane ports and 6 lower ports, and 60% ⁶Li enrichment in the Li₄SiO₄ breeder, the total tritium generation is...
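The damage figures quoted above can be cross-checked with a line of arithmetic: 29 dpa accumulated over a 12-year life at a 33% duty factor implies a full-power damage rate, and scaling to the ~150 dpa limit gives a rough calendar margin. This is simple proportional scaling for illustration, not part of the paper's analysis.

```python
# Back-of-envelope check of the dpa numbers in the abstract.

def dpa_rate_per_fpy(total_dpa, years, duty_factor):
    """Damage rate per full-power year (fpy): total dose divided by the
    equivalent full-power years accumulated over the life."""
    return total_dpa / (years * duty_factor)

rate = dpa_rate_per_fpy(29.0, 12.0, 0.33)       # roughly 7.3 dpa per fpy
calendar_years_to_limit = 150.0 / 29.0 * 12.0   # roughly 62 calendar years
print(round(rate, 1), round(calendar_years_to_limit))
```

The wide margin to the limit is consistent with the abstract's conclusion that the peak structural damage is not life-limiting for this design.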

  15. COA based robust output feedback UPFC controller design

    Energy Technology Data Exchange (ETDEWEB)

    Shayeghi, H., E-mail: hshayeghi@gmail.co [Technical Engineering Department, University of Mohaghegh Ardabili, Ardabil (Iran, Islamic Republic of); Shayanfar, H.A. [Center of Excellence for Power System Automation and Operation, Electrical Engineering Department, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of); Jalilzadeh, S.; Safari, A. [Technical Engineering Department, Zanjan University, Zanjan (Iran, Islamic Republic of)

    2010-12-15

    In this paper, a novel method for the design of an output feedback controller for the unified power flow controller (UPFC) using a chaotic optimization algorithm (COA) is developed. Chaotic optimization algorithms, which feature easy implementation, short execution time, and robust mechanisms for escaping local optima, are a promising tool for engineering applications. The selection of the output feedback gains for the UPFC controllers is converted into an optimization problem with a time-domain-based objective function, which is solved by a COA based on the Lozi map. Since chaotic mappings enjoy determinism, ergodicity, and stochastic-like properties, the proposed approach uses Lozi-map chaotic sequences, which increases the convergence rate and the resulting precision. To ensure the robustness of the proposed stabilizers, the design process takes into account a wide range of operating conditions and system configurations. The effectiveness of the proposed controller for damping low-frequency oscillations is tested and demonstrated through non-linear time-domain simulations and studies of several performance indices. Analysis of the results reveals that the designed COA-based output feedback UPFC damping controller has an excellent capability to damp power system low-frequency oscillations and greatly enhances the dynamic stability of power systems.
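Generating the Lozi-map chaotic sequences that drive such a search can be sketched in a few lines. The map itself and the classic chaotic parameters a = 1.7, b = 0.5 are standard; how the sequence is scaled onto the feedback-gain search space is our assumption, not taken from the paper.

```python
# Lozi map: x_{n+1} = 1 - a*|x_n| + b*y_n,  y_{n+1} = x_n.
# With a = 1.7, b = 0.5 the orbit is chaotic, giving a deterministic but
# ergodic sequence that a COA uses in place of random numbers.

def lozi_sequence(n, a=1.7, b=0.5, x0=0.1, y0=0.1):
    """Iterate the Lozi map n times and return the x-coordinates."""
    x, y = x0, y0
    seq = []
    for _ in range(n):
        x, y = 1.0 - a * abs(x) + b * y, x
        seq.append(x)
    return seq

def to_interval(x, lo, hi, x_min=-1.0, x_max=1.5):
    """Affinely map a chaotic value onto a candidate-gain interval [lo, hi].
    The attractor bounds x_min/x_max here are rough illustrative values."""
    frac = (x - x_min) / (x_max - x_min)
    return lo + frac * (hi - lo)

# Hypothetical candidate feedback gains in [0, 100]:
gains = [to_interval(x, 0.0, 100.0) for x in lozi_sequence(5)]
```

Because the sequence is deterministic, a COA run is exactly reproducible from its seed (x0, y0), while the ergodicity of the orbit lets the search cover the gain space without getting trapped the way a purely greedy method can.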

  16. Plant aging and design bases documentation

    International Nuclear Information System (INIS)

    Kelly, J.

    1985-01-01

    As interest in plant aging and lifetime extension continues to grow, the need to identify and capture the original design bases for the plant becomes more urgent. Decisions on lifetime extension and availability must be based on a rational understanding of design inputs, assumptions, and objectives. As operating plant time accumulates, the history of the early design begins to fade; the longer the utility waits, the harder it will be to re-establish the original design bases. Therefore, the time to develop this foundation is now. This paper demonstrates the impact that collecting and maintaining the original design bases of the plant can have on a utility's lifetime extension program. This impact becomes apparent when considering the technical, regulatory, and financial aspects of lifetime extension. It is not good enough to know that the design information is buried somewhere in the corporate archives and that, given enough time, it could be retrieved. To be useful to the lifetime extension program, plant design information must be concise, readily available (i.e., retrievable), and easy to use. These objectives can only be met through a systematic program for collecting and presenting plant design documentation. To get the maximum benefit from a lifetime extension program, usable design bases documentation should be available as early in the plant life as possible. It will help identify areas that require monitoring today so that data are available to make rational decisions in the future.

  17. Comparative analysis of design codes for timber bridges in Canada, the United States, and Europe

    Science.gov (United States)

    James Wacker; James (Scott) Groenier

    2010-01-01

    The United States recently completed its transition from the allowable stress design code to the load and resistance factor design (LRFD) reliability-based code for the design of most highway bridges. To provide an international perspective on LRFD-based bridge codes, this paper presents a comparative analysis of the national timber bridge design codes of the United States, Canada, and...

  18. Design and Analysis Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Provides engineering design of aircraft components, subsystems and installations using Pro/E, Anvil 1000, CADKEY 97, AutoCAD 13. Engineering analysis tools include...

  19. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    The present application of optimum design appears to be restricted to components of the structure rather than to the total structural system. Since design normally involves many analyses of the system, any improvement in the efficiency of the basic methods of analysis will allow more complicated systems to be designed by optimum methods. The evaluation of the risk and reliability of a structural system can be extremely important. Reliability studies have been made of many non-structural systems for which the individual components have been extensively tested and the service environment is known; for such systems the reliability studies are valid. For most structural systems, however, the properties of the components can only be estimated, and statistical data associated with the potential loads are often minimal. Also, a potentially critical loading condition may be completely neglected in the study. For these reasons, and because of the problems previously noted with the reliability of both linear and nonlinear analysis computer programs, it appears premature to place significant value on such studies for complex structures. With these comments as background, the purpose of this paper is to discuss the following: the relationship of analysis to design; new methods of analysis; new or improved finite elements; the effect of the minicomputer on structural analysis methods; the use of systems of microprocessors for nonlinear structural analysis; and the role of interactive graphics systems in future analysis and design. This discussion focuses on the impact of new, inexpensive computer hardware on design and analysis methods.

  20. Comparative analysis of nuclear reactor control system designs

    International Nuclear Information System (INIS)

    Russcher, G.E.

    1975-01-01

    Control systems are vital to the safe operation of nuclear reactors. Their seismic design requirements are among the most important criteria governing reactor system design evaluation. Consequently, seismic analysis for nuclear reactors is directed to include not only the mechanical and structural seismic capabilities of a reactor but the control system functional requirements as well. In the study described, an alternate conceptual design of a safety rod system was compared with a prototypic system design to assess their relative functional reliabilities under design seismic conditions. The comparative methods utilized standard success tree and decision tree techniques to determine the relative figures of merit. The study showed: (1) the methodology utilized can provide both qualitative and quantitative bases for design decisions regarding the seismic functional capabilities of the two systems under comparison; (2) the process emphasizes the visibility of particular design features that are subject to common-mode failure under seismic loading; and (3) only minimal improvement was shown to be available in the overall seismic performance of the independent conceptual design, which, moreover, would be subject to a new set of operational uncertainties that would have to be resolved by extensive development programs.