WorldWideScience

Sample records for standard approach techniques

  1. EUS-guided biliary drainage by using a standardized approach for malignant biliary obstruction: rendezvous versus direct transluminal techniques (with videos).

    Science.gov (United States)

    Khashab, Mouen A; Valeshabad, Ali Kord; Modayil, Rani; Widmer, Jessica; Saxena, Payal; Idrees, Mehak; Iqbal, Shahzad; Kalloo, Anthony N; Stavropoulos, Stavros N

    2013-11-01

    EUS-guided biliary drainage (EGBD) can be performed via direct transluminal or rendezvous techniques. It is unknown how both techniques compare in terms of efficacy and adverse events. To describe outcomes of EGBD performed by using a standardized approach and compare outcomes of rendezvous and transluminal techniques. Retrospective analysis of prospectively collected data. Two tertiary-care centers. Consecutive jaundiced patients with distal malignant biliary obstruction who underwent EGBD after failed ERCP between July 2006 and December 2012 were included. EGBD by using a standardized algorithm. Technical success, clinical success, and adverse events. During the study period, 35 patients underwent EGBD (rendezvous n = 13, transluminal n = 20). Technical success was achieved in 33 patients (94%), and clinical success was attained in 32 of 33 patients (97.0%). The mean postprocedure bilirubin level was 1.38 mg/dL in the rendezvous group and 1.33 mg/dL in the transluminal group (P = .88). Similarly, length of hospital stay was not different between groups (P = .23). There was no significant difference in adverse event rate between rendezvous and transluminal groups (15.4% vs 10%; P = .64). Long-term outcomes were comparable between groups, with 1 stent migration in the rendezvous group at 62 days and 1 stent occlusion in the transluminal group at 42 days after EGBD. Retrospective analysis, small number of patients, and selection bias. EGBD is safe and effective when the described standardized approach is used. Stent occlusion is not common during long-term follow-up. Both rendezvous and direct transluminal techniques seem to be equally effective and safe. The latter approach is a reasonable alternative to rendezvous EGBD. Copyright © 2013. Published by Mosby, Inc.

  2. Fault estimation - A standard problem approach

    DEFF Research Database (Denmark)

    Stoustrup, J.; Niemann, Hans Henrik

    2002-01-01

    This paper presents a range of optimization-based approaches to fault diagnosis. A variety of fault diagnosis problems are reformulated in the so-called standard problem set-up introduced in the literature on robust control. Once the standard problem formulations are given, the fault diagnosis problems can be solved by standard optimization techniques. The proposed methods include (1) fault diagnosis (fault estimation, FE) for systems with model uncertainties; (2) FE for systems with parametric faults; and (3) FE for a class of nonlinear systems.

  3. Standard Establishment Through Scenarios (SETS): A new technique for occupational fitness standards.

    Science.gov (United States)

    Blacklock, R E; Reilly, T J; Spivock, M; Newton, P S; Olinek, S M

    2015-01-01

    An objective and scientific task analysis provides the basis for establishing legally defensible Physical Employment Standards (PES), based on common and essential occupational tasks. Infrequent performance of these tasks creates challenges when developing PES based on criterion or content validity. The aim was to develop a systematic approach using Subject Matter Experts (SME) to provide each task with 1) an occupationally relevant scenario considered common to all personnel; and 2) a minimum performance standard defined by time, distance, load or work. Examples provided here relate to the development of a new PES for the Canadian Armed Forces (CAF). SMEs of varying experience are selected based on eligibility criteria. SMEs are required to define a reasonable scenario for each task from personal experience, provide occupational performance requirements of the scenario in sub-groups, and discuss and agree by consensus vote on the final standard based on the definition of essential. A common and essential task for the CAF is detailed as a case example of process application. Techniques to avoid common SME rating errors are discussed and the advantages of the method are described. The SETS method was developed as a systematic approach to setting occupational performance standards and qualifying information from SME.

  4. Standardized technique for single port laparoscopic ileostomy and colostomy.

    Science.gov (United States)

    Shah, A; Moftah, M; Hadi Nahar Al-Furaji, H; Cahill, R A

    2014-07-01

    Single site laparoscopic techniques and technology exploit maximum usefulness from confined incisions. The formation of an ileostomy or colostomy seems very applicable for this modality as the stoma occupies the solitary incision obviating any additional wounds. Here we detail the principles of our approach to defunctioning loop stoma formation using single port laparoscopic access in a stepwise and standardized fashion along with the salient specifics of five illustrative patients. No specialized instrumentation is required and the single access platform is established table-side using the 'glove port' technique. The approach has the intra-operative advantage of excellent visualization of the correct intestinal segment for exteriorization along with direct visual control of its extraction to avoid twisting. Postoperatively, abdominal wall trauma has been minimal allowing convalescence and stoma care education with only one parietal incision. Single incision stoma siting proves a ready, robust and reliable technique for diversion ileostomy and colostomy with a minimum of operative trauma for the patient. Colorectal Disease © 2014 The Association of Coloproctology of Great Britain and Ireland.

  5. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
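
    The ranking step of the NGS technique can be illustrated with a short sketch. Assuming each imaging method relates its measurements to the unknown true values through a linear model with slope a, bias b and noise standard deviation σ — parameters the NGS procedure estimates by maximum likelihood without ground truth (that estimation step is not reproduced here) — the methods are ordered by NSR = σ/a. The method names and parameter values below are hypothetical.

```python
# Minimal sketch: ranking quantitative imaging methods by the noise-to-slope
# ratio (NSR), assuming the NGS procedure has already estimated each method's
# linear-model parameters (slope a, bias b, noise standard deviation sigma).
# The method names and numbers below are hypothetical, not taken from the paper.

estimated_params = {
    # method: (slope a, bias b, noise std sigma)
    "OSEM_no_corrections": (0.78, 0.10, 0.22),
    "OSEM_AC_SC":          (0.95, 0.04, 0.12),
    "OSEM_AC_SC_PVC":      (1.02, 0.02, 0.15),
}

def noise_to_slope_ratio(a: float, sigma: float) -> float:
    """NSR = sigma / a; lower values indicate better precision."""
    return sigma / a

ranking = sorted(
    ((name, noise_to_slope_ratio(a, sigma))
     for name, (a, b, sigma) in estimated_params.items()),
    key=lambda item: item[1],
)

for rank, (name, nsr) in enumerate(ranking, start=1):
    print(f"{rank}. {name}: NSR = {nsr:.3f}")
```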

  6. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  7. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS like system layout, diagnostics, testing and repair. In standards like the German DIN no quantitative analysis is demanded (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990). The analysis according to these standards is based on expert opinion and qualitative analysis techniques. New standards like the IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and the ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform the analysis. Earlier publications of the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that the application of the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage
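
    The observation that different quantitative techniques can give different results is easy to reproduce with a toy calculation. The sketch below (not taken from the paper's case study) compares two common ways of computing the average probability of failure on demand of a simple 1oo1 safety function: the widely used simplified formula λ_DU·TI/2 and the exact time average of the unreliability over the proof-test interval. The failure rate and test interval are hypothetical.

```python
import math

# Hypothetical 1oo1 safety function: dangerous undetected failure rate and
# proof-test interval.
lambda_du = 2.0e-6        # dangerous undetected failures per hour
test_interval = 8760.0    # proof-test interval in hours (1 year)

# Technique 1: simplified equation commonly used in SIL verification.
pfd_simplified = lambda_du * test_interval / 2.0

# Technique 2: exact time average of the unreliability over the test interval,
# PFDavg = 1 - (1 - exp(-lambda*TI)) / (lambda*TI).
x = lambda_du * test_interval
pfd_exact = 1.0 - (1.0 - math.exp(-x)) / x

print(f"simplified:          {pfd_simplified:.3e}")
print(f"exact time average:  {pfd_exact:.3e}")
print(f"relative difference: {abs(pfd_simplified - pfd_exact) / pfd_exact:.2%}")
```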

  8. Force coordination in static manipulation tasks performed using standard and non-standard grasping techniques.

    Science.gov (United States)

    de Freitas, Paulo B; Jaric, Slobodan

    2009-04-01

    We evaluated coordination of the hand grip force (GF; normal component of the force acting at the hand-object contact area) and load force (LF; the tangential component) in a variety of grasping techniques and two LF directions. Thirteen participants exerted a continuous sinusoidal LF pattern against externally fixed handles applying both standard (i.e., using either the tips of the digits or the palms; the precision and palm grasps, respectively) and non-standard grasping techniques (using wrists and the dorsal finger areas; the wrist and fist grasp). We hypothesized (1) that the non-standard grasping techniques would provide deteriorated indices of force coordination when compared with the standard ones, and (2) that the nervous system would be able to adjust GF to the differences in friction coefficients of various skin areas used for grasping. However, most of the indices of force coordination remained similar across the tested grasping techniques, while the GF adjustments for the differences in friction coefficients (highest in the palm and the lowest in the fist and wrist grasp) provided inconclusive results. As hypothesized, GF relative to the skin friction was lowest in the precision grasp, but highest in the palm grasp. Therefore, we conclude that (1) the elaborate coordination of GF and LF consistently seen across the standard grasping techniques could be generalized to the non-standard ones, while (2) the ability to adjust GF using the same grasping technique to the differences in friction of various objects cannot be fully generalized to the GF adjustment when different grasps (i.e., hand segments) are used to manipulate the same object. Due to the importance of the studied phenomena for understanding both the functional and neural control aspects of manipulation, future studies should extend the current research to the transient and dynamic tasks, as well as to the general role of friction in our mechanical interactions with the environment.
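
    As an illustration of the kind of force-coordination indices reported in such studies (the exact definitions used by the authors may differ), the sketch below computes two commonly used measures from a simulated grip-force and load-force trial: the mean GF/LF ratio and the zero-lag GF-LF correlation.

```python
import numpy as np

# Simulated 10 s trial at 100 Hz: sinusoidal load force (LF) and a grip force
# (GF) that tracks LF with a small delay, an offset and noise. The values and
# the index definitions are illustrative, not those of the cited study.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
lf = 10.0 + 5.0 * np.sin(2.0 * np.pi * 0.5 * t)                  # load force (N)
gf = 4.0 + 1.3 * (10.0 + 5.0 * np.sin(2.0 * np.pi * 0.5 * (t - 0.05)))
gf += np.random.default_rng(0).normal(0.0, 0.3, t.size)          # grip force (N)

# Index 1: mean GF/LF ratio (how much grip force is used per unit load force).
gf_lf_ratio = np.mean(gf / lf)

# Index 2: zero-lag cross-correlation between GF and LF (coupling strength).
correlation = np.corrcoef(gf, lf)[0, 1]

print(f"mean GF/LF ratio: {gf_lf_ratio:.2f}")
print(f"GF-LF correlation: {correlation:.3f}")
```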

  9. Assessment of Snared-Loop Technique When Standard Retrieval of Inferior Vena Cava Filters Fails

    International Nuclear Information System (INIS)

    Doody, Orla; Noe, Geertje; Given, Mark F.; Foley, Peter T.; Lyon, Stuart M.

    2009-01-01

    Purpose: To identify the success and complications related to a variant technique used to retrieve inferior vena cava filters when the simple snare approach has failed. Methods: A retrospective review of all Cook Guenther Tulip filters and Cook Celect filters retrieved between July 2006 and February 2008 was performed. During this period, 130 filter retrievals were attempted. In 33 cases, the standard retrieval technique failed. Retrieval was subsequently attempted with our modified retrieval technique. Results: The retrieval was successful in 23 cases (mean dwell time, 171.84 days; range, 5-505 days) and unsuccessful in 10 cases (mean dwell time, 162.2 days; range, 94-360 days). Our filter retrievability rates increased from 74.6% with the standard retrieval method to 92.3% when the snared-loop technique was used. Unsuccessful retrieval was due to significant endothelialization (n = 9) and caval penetration by the filter (n = 1). A single complication occurred in the group, in a patient developing pulmonary emboli after attempted retrieval. Conclusion: The technique we describe increased the retrievability of the two filters studied. Hook endothelialization is the main factor resulting in failed retrieval and continues to be a limitation with these filters.

  10. AN EVALUATION OF SELECTED MOROCCAN ELT TEXTBOOKS: A STANDARDS-BASED APPROACH PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Hassan Ait Bouzid

    2017-05-01

    The Standards-Based Approach to textbook evaluation has been blooming in recent decades. Nevertheless, this practice has received very little attention in Morocco. The present study aims to bridge a gap in the literature of the Moroccan context by investigating the extent to which three locally designed ELT textbooks conform to the theoretical principles of the Standards-Based Approach, which defines the teaching of English as a foreign language in the country (Ministry of National Education, 2007). Its objective is to examine whether and how these textbooks present content that enables learners to meet the content standards included in the goal areas of Communications, Cultures, Connections and Comparisons. The study is informed by the theoretical framework of the Standards-Based Approach. It adopts a mixed-methods design that uses content analysis as a mixed data analysis method combining both quantitative and qualitative techniques. The findings reveal a number of shortcomings relevant to the representation of the content standards, as several standards are not sufficiently addressed in the activities included in these textbooks. Finally, some suggestions are addressed to policy makers, textbook designers and teachers to overcome the identified problems in current and future textbooks. The study is expected to sensitize ELT practitioners to the viability of using textbook evaluation to boost both the quality of ELT textbooks and the quality of teaching and learning outcomes.

  11. [Techniques in mitral valve repair via a minimally invasive approach].

    Science.gov (United States)

    Ito, Toshiaki

    2016-03-01

    In mitral valve repair via a minimally invasive approach, resection of the leaflet is technically demanding compared with that in the standard approach. For resection and suture repair of the posterior leaflet, premarking of incision lines is recommended for precise resection. As an alternative to resection and suture, the leaflet-folding technique is also recommended. For correction of prolapse of the anterior leaflet, neochordae placement with the loop technique is easy to perform. Premeasurement with transesophageal echocardiography or intraoperative measurement using a replica of artificial chordae is useful to determine the appropriate length of the loops. Fine-tuning of the length of neochordae is possible by adding a secondary fixation point on the leaflet if the loop is too long. If the loop is too short, a CV5 Gore-Tex suture can be passed through the loop and loosely tied several times to stack the knots, with subsequent fixation to the edge of the leaflet. Finally, skill in the mitral valve replacement technique is necessary as a back-up for surgeons who perform minimally invasive mitral valve repair.

  12. Improvement of AC motor reliability from technique standardization

    International Nuclear Information System (INIS)

    Muniz, P.R.; Faria, M.D.R.; Mendes, M.P.; Silva, J.N.; Dos Santos, J.D.

    2005-01-01

    The purpose of this paper is to explain the increase in reliability of motors serviced in the Electrical Maintenance Shop of Companhia Siderurgica de Tubarao, achieved by standardizing the servicing technique based on Brazilian and international standards, the manufacturer's recommendations and the experience of the maintenance staff. (author)

  13. Short Shrift to Long Lists: An Alternative Approach to the Development of Performance Standards for School Principals.

    Science.gov (United States)

    Louden, William; Wildy, Helen

    1999-01-01

    Describes examples of standards frameworks for principals' work in use in three countries and describes an alternative approach based on interviews with 40 Australian principals. By combining qualitative case studies with probabilistic measurement techniques, the alternative approach provides contextually rich descriptions of growth in performance…

  14. Paediatric sutureless circumcision-an alternative to the standard technique.

    LENUS (Irish Health Repository)

    2012-01-31

    INTRODUCTION: Circumcision is one of the most commonly performed surgical procedures in male children. A range of surgical techniques exist for this commonly performed procedure. The aim of this study is to assess the safety, functional outcome and cosmetic appearance of a sutureless circumcision technique. METHODS: Over a 9-year period, 502 consecutive primary sutureless circumcisions were performed by a single surgeon. All 502 cases were entered prospectively into a database including all relevant clinical details and a review was performed. The technique used to perform the sutureless circumcision is a modification of the standard sleeve technique with the use of a bipolar diathermy and the application of 2-octyl cyanoacrylate (2-OCA) to approximate the tissue edges. RESULTS: All boys in this study were pre-pubescent and the ages ranged from 6 months to 12 years (mean age 3.5 years). All patients had this procedure performed as a day case and under general anaesthetic. Complications included: haemorrhage (2.2%), haematoma (1.4%), wound infection (4%), allergic reaction (0.2%) and wound dehiscence (0.8%). Only 9 (1.8%) parents or patients were dissatisfied with the cosmetic appearance. CONCLUSION: The use of 2-OCA as a tissue adhesive for sutureless circumcisions is an alternative to the standard suture technique. The use of this tissue adhesive, 2-OCA, results in comparable complication rates to the standard circumcision technique and results in excellent post-operative cosmetic satisfaction.

  15. MIMO wireless networks channels, techniques and standards for multi-antenna, multi-user and multi-cell systems

    CERN Document Server

    Clerckx, Bruno

    2013-01-01

    This book is unique in presenting channels, techniques and standards for the next generation of MIMO wireless networks. Through a unified framework, it emphasizes how propagation mechanisms impact the system performance under realistic power constraints. Combining a solid mathematical analysis with a physical and intuitive approach to space-time signal processing, the book progressively derives innovative designs for space-time coding and precoding as well as multi-user and multi-cell techniques, taking into consideration that MIMO channels are often far from ideal. Reflecting developments

  16. International Standardization of Library and Documentation Techniques.

    Science.gov (United States)

    International Federation for Documentation, The Hague (Netherlands).

    This comparative study of the national and international standards, rules and regulations on library and documentation techniques adopted in various countries was conducted as a preliminary step in determining the minimal bases for facilitating national and international cooperation between documentalists and librarians. The study compares and…

  17. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    International Nuclear Information System (INIS)

    Martens, Hans-Juergen von

    2010-01-01

    The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41) is at the stage of a Draft International Standard (DIS) and may be issued by the end of 2010. The standard methods with refined techniques proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s²). The relative deviations between the amplitude measurement results of the different interferometer methods applied simultaneously were less than 1% in all cases.
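
    For sinusoidal motion, the quantities quoted above are linked by a = (2πf)²·s, where s is the displacement amplitude measured interferometrically. The short, purely illustrative sketch below back-calculates the displacement corresponding to an acceleration amplitude of 350 km/s² at 347 kHz and expresses it in He-Ne interference fringes (λ/2 of displacement per fringe in a homodyne set-up).

```python
import math

# Illustrative back-calculation: what displacement amplitude corresponds to an
# acceleration amplitude of 350 km/s^2 at 347 kHz (sinusoidal motion)?
frequency = 347e3                # Hz
acceleration_amplitude = 350e3   # m/s^2

omega = 2.0 * math.pi * frequency
displacement_amplitude = acceleration_amplitude / omega**2   # s = a / (2*pi*f)^2

# Express the displacement in interference fringes of a He-Ne laser
# (one fringe corresponds to lambda/2 of displacement in a homodyne set-up).
wavelength_hene = 632.8e-9       # m
fringes = displacement_amplitude / (wavelength_hene / 2.0)

print(f"displacement amplitude: {displacement_amplitude * 1e9:.1f} nm")
print(f"equivalent He-Ne fringes: {fringes:.2f}")
```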

  18. Nanorobotics current approaches and techniques

    CERN Document Server

    Ferreira, Antoine

    2013-01-01

    Nanorobot devices now perform a wide variety of tasks at the nanoscale in fields including, but not limited to, manufacturing, medicine, supply chains, biology, and outer space. Nanorobotics: Current Approaches and Techniques is a comprehensive overview of this interdisciplinary field, with a wide-ranging discussion that includes nano-manipulation and industrial nanorobotics, nanorobotics in biology and medicine, nanorobotic sensing, navigation and swarm behavior, and protein- and DNA-based nanorobotics. Also included is the latest on topics such as bio-nano-actuators and the propulsion and navigation of nanorobotic systems using magnetic fields. Nanorobotics: Current Approaches and Techniques is an ideal book for scientists, researchers, and engineers actively involved in applied and robotic research and development.

  19. Standard lymphadenectomy technique in the gastric adenocarcinoma

    International Nuclear Information System (INIS)

    Aguirre Fernandez, Roberto Eduardo; Fernandez Vazquez, Pedro Ivan; LLera Dominguez, Gerardo de la

    2012-01-01

    This paper describes the surgical technique used since 1990 at the 'Celia Sanchez Manduley' Clinical Surgical Teaching Provincial Hospital in Manzanillo, Granma province, to carry out gastrectomy together with standard lymphadenectomy in patients with gastric adenocarcinoma. The technique allows application of the current oncologic and surgical concepts of the Japanese Society for Research of Gastric Cancer, which is essential to obtain a better prognosis in these patients.

  20. [Study on standardization of cupping technique: elucidation on the establishment of the National Standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping].

    Science.gov (United States)

    Gao, Shu-zhong; Liu, Bing

    2010-02-01

    From the aspects of basis, technique descriptions, core contents, problems and solutions, and standard thinking in the standard-setting process, this paper relates the experience gained in establishing the national standard Standardized Manipulation of Acupuncture and Moxibustion, Part V, Cupping, focusing on the methodologies used in the cupping standard-setting process, the method selection and operating instructions of cupping standardization, and the characteristics of standard TCM. In addition, this paper states the scope of application and the precautions for this cupping standardization. This paper also explains tentative ideas on the research of standardized manipulation of acupuncture and moxibustion.

  1. Technique for fabrication of gradual standards of radiographic image blackening density

    International Nuclear Information System (INIS)

    Borovin, I.V.; Kondina, M.A.

    1987-01-01

    The technique of fabrication of gradual standards of blackening density for industrial radiography by contact printing from a negative is described. The technique is designed for the capabilities of industrial radiation-defectoscopy laboratories that possess no special-purpose sensitometric equipment.

  2. Nuclear security standard: Argentina approach

    International Nuclear Information System (INIS)

    Bonet Duran, Stella M.; Rodriguez, Carlos E.; Menossi, Sergio A.; Serdeiro, Nelida H.

    2007-01-01

    Argentina has a comprehensive regulatory system designed to assure the security and safety of radioactive sources, which has been in place for more than fifty years. In 1989 the Radiation Protection and Nuclear Safety branch of the National Atomic Energy Commission created the 'Council of Physical Protection of Nuclear Materials and Installations' (CAPFMIN). In 1992 this Council published a Physical Protection Standard based on a deep and careful analysis of INFCIRC 225/Rev.2, including topics like 'sabotage scenario'. Since then, the world scenario has changed, and concepts like 'design basis threat', 'detection, delay and response', and 'performance approach versus prescriptive approach' have been applied to the design of physical protection systems in facilities other than nuclear installations. In Argentina, radioactive sources are widely used in medical and industrial applications, with more than 1,600 facilities controlled by the Nuclear Regulatory Authority (ARN, from its Spanish name). During 2005, measures like 'access control', 'timely detection of intruder', 'background checks', and 'security plan' were required by ARN for implementation in facilities with radioactive sources. To 'close the cycle', the next step is to produce a regulatory standard based on the operational experience acquired during 2005. ARN has developed a set of criteria to include in a new standard on the security of radioactive materials. Besides, a specific Regulatory Guide is being prepared to help facility licensees design a security system and complete the 'Design of Security System Questionnaire'. The present paper describes the proposed Standard on Security of Radioactive Sources and the draft Nuclear Security Regulatory Guidance, based on our regulatory experience and the latest international recommendations. (author)

  3. Algorithmic approach to diagram techniques

    International Nuclear Information System (INIS)

    Ponticopoulos, L.

    1980-10-01

    An algorithmic approach to diagram techniques of elementary particles is proposed. The definition and axiomatics of the theory of algorithms are presented, followed by the list of instructions of an algorithm formalizing the construction of graphs and the assignment of mathematical objects to them. (T.A.)

  4. Verification of FPGA-based NPP I and C systems. General approach and techniques

    International Nuclear Information System (INIS)

    Andrashov, Anton; Kharchenko, Vyacheslav; Sklyar, Volodymir; Reva, Lubov; Siora, Alexander

    2011-01-01

    This paper presents a general approach and techniques for the design and verification of Field Programmable Gate Array (FPGA)-based Instrumentation and Control (I and C) systems for Nuclear Power Plants (NPP). Appropriate regulatory documents used for I and C systems design, development, verification and validation (V and V) are discussed, considering the latest international standards and guidelines. Typical development and V and V processes of FPGA electronic designs for FPGA-based NPP I and C systems are presented. Some safety-related features of the implementation process are discussed. Corresponding development artifacts related to design and implementation activities are outlined. An approach to test-based verification of FPGA electronic design algorithms, used in FPGA-based reactor trip systems, is proposed. The results of applying test-based techniques to the assessment of FPGA electronic design algorithms for the reactor trip system (RTS) produced by Research and Production Corporation (RPC) 'Radiy' are presented. Some principles of invariant-oriented verification for FPGA-based safety-critical systems are outlined. (author)

  5. Current Methods Applied to Biomaterials - Characterization Approaches, Safety Assessment and Biological International Standards.

    Science.gov (United States)

    Oliveira, Justine P R; Ortiz, H Ivan Melendez; Bucio, Emilio; Alves, Patricia Terra; Lima, Mayara Ingrid Sousa; Goulart, Luiz Ricardo; Mathor, Monica B; Varca, Gustavo H C; Lugao, Ademar B

    2018-04-10

    Safety and biocompatibility assessment of biomaterials are themes of constant concern as advanced materials enter the market and products manufactured by new techniques emerge. Within this context, this review provides an up-to-date approach to current methods for the characterization and safety assessment of biomaterials and biomedical devices from a physical-chemical to a biological perspective, including a description of the alternative methods in accordance with current and established international standards. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  6. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images.

    Science.gov (United States)

    Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin

    2017-12-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
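
    The core of SLICE is converting frame-to-frame changes in segment length into a Lagrangian strain, analogous to the circumferential strain obtained from tagging. A minimal sketch of that conversion is shown below; the segment lengths are hypothetical values measured between anatomical landmarks on successive short-axis cine frames, with the end-diastolic frame as reference.

```python
# Minimal sketch of the SLICE idea: convert segment lengths measured between
# anatomical landmarks on successive cine frames into Lagrangian strain,
# using the end-diastolic (first) frame as the reference length.
# Lengths below (in mm) are hypothetical.

septal_segment_length = [62.0, 60.5, 58.1, 55.9, 54.7, 55.6, 58.3, 61.2]

def segment_strain(lengths, reference_index=0):
    """Percent Lagrangian strain per frame: 100 * (L - L_ref) / L_ref."""
    l_ref = lengths[reference_index]
    return [100.0 * (l - l_ref) / l_ref for l in lengths]

strain_curve = segment_strain(septal_segment_length)
peak_systolic_strain = min(strain_curve)   # most negative value = peak shortening

print("strain per frame (%):", [round(s, 1) for s in strain_curve])
print(f"peak systolic strain: {peak_systolic_strain:.1f}%")
```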

  7. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional, descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas in need of focus for quality improvement and selecting strategies to improve service quality. PMID:25250364

  8. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique.

    Science.gov (United States)

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. This was a cross-sectional, descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritizing. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Among the JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospitals' performance by identifying areas in need of focus for quality improvement and selecting strategies to improve service quality.
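
    The criterion weights quoted above are the kind of output AHP derives from pairwise comparison matrices filled in by experts. A minimal sketch of that derivation (geometric-mean prioritisation plus a consistency check) is given below for three hypothetical criteria; the matrix entries are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria on Saaty's 1-9
# scale (entry [i, j] = how much more important criterion i is than j).
A = np.array([
    [1.0, 2.0, 3.0],
    [1 / 2.0, 1.0, 2.0],
    [1 / 3.0, 1 / 2.0, 1.0],
])

# Priority weights via the geometric-mean (logarithmic least squares) method.
geometric_means = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = geometric_means / geometric_means.sum()

# Consistency ratio: lambda_max estimated from A @ w; the random index RI for
# a 3 x 3 matrix is 0.58.
lambda_max = float(np.mean((A @ weights) / weights))
consistency_index = (lambda_max - A.shape[0]) / (A.shape[0] - 1)
consistency_ratio = consistency_index / 0.58

print("criterion weights:", np.round(weights, 3))
print(f"consistency ratio: {consistency_ratio:.3f} (acceptable if < 0.10)")
```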

  9. Replacement Value - Representation of Fair Value in Accounting. Techniques and Modeling Suitable for the Income Based Approach

    OpenAIRE

    Manea Marinela – Daniela

    2011-01-01

    The term fair value is widespread within the sphere of international standards, without reference to any detailed guidance on how to apply it. However, for specialized tangible assets, which are rarely sold, IAS 16 "Property, Plant and Equipment" makes it possible to estimate fair value using an income approach or a depreciated replacement cost approach. The following material is intended to identify potential modeling of fair value as an income-based approach, appealing to techniques used by professional evalu...

  10. Standardization of surgical techniques used in facial bone contouring.

    Science.gov (United States)

    Lee, Tae Sung

    2015-12-01

    Since the introduction of facial bone contouring surgery for cosmetic purposes, various surgical methods have been used to improve the aesthetics of facial contours. In general, by standardizing the surgical techniques, it is possible to decrease complication rates and achieve more predictable surgical outcomes, thereby increasing patient satisfaction. The technical strategies used by the author to standardize facial bone contouring procedures are introduced here. The author uses various pre-manufactured surgical tools and hardware for facial bone contouring. During a reduction malarplasty or genioplasty procedure, double-bladed reciprocating saws and pre-bent titanium plates customized for the zygomatic body, arch and chin are used. Various guarded oscillating saws are used for mandibular angleplasty. The use of double-bladed saws and pre-bent plates to perform reduction malarplasty reduces the chances of post-operative asymmetry or under- or overcorrection of the zygoma contours due to technical faults. Inferior alveolar nerve injury and post-operative jawline asymmetry or irregularity can be reduced by using a guarded saw during mandibular angleplasty. For genioplasty, final placement of the chin in accordance with preoperative quantitative analysis can be easily performed with pre-bent plates, and a double-bladed saw allows more procedural accuracy during osteotomies. Efforts by the surgeon to avoid unintentional faults are key to achieving satisfactory results and reducing the incidence of complications. The surgical techniques described in this study in conjunction with various in-house surgical tools and modified hardware can be used to standardize techniques to achieve aesthetically gratifying outcomes. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  11. The standardized surgical approach improves outcome of gallbladder cancer

    Directory of Open Access Journals (Sweden)

    Igna Dorian

    2007-05-01

    Background: The objective of this study was to examine the extent of surgical procedures, pathological findings, complications and outcome of patients treated in the last 12 years for gallbladder cancer. Methods: The impact of a standardized, more aggressive approach was examined and compared with historical controls from our center treated with an individual approach. Of 53 patients, 21 underwent resection for cure and 32 for palliation. Results: Overall hospital mortality was 9% and procedure-related mortality was 4%. The standardized approach in UICC stage IIa, IIb and III led to a significantly improved outcome compared to patients with an individual approach (median survival: 14 vs. 7 months; mean +/- SEM: 26 +/- 7 vs. 17 +/- 5 months; p = 0.014). The main differences between the standardized and the individual approach were anatomical vs. atypical liver resection, performance of systematic lymph node dissection of the hepatoduodenal ligament and resection of the common bile duct. Conclusion: Anatomical liver resection, proof of bile duct infiltration and, in case of tumor invasion, radical resection and lymph node dissection of the hepatoduodenal ligament are essential to improve the outcome of locally advanced gallbladder cancer.

  12. ELISA technique standardization for strongyloidiasis diagnosis

    International Nuclear Information System (INIS)

    Huapaya, P.; Espinoza, I.; Huiza, A.; Universidad Nacional Mayor de San Marcos, Lima; Sevilla, C.

    2002-01-01

    To standardize an ELISA technique for the diagnosis of human Strongyloides stercoralis infection, a crude antigen was prepared using filariform larvae obtained from positive stool samples cultured with charcoal. Harvested larvae were crushed by sonication and washed by centrifugation in order to obtain protein extracts to be used as antigen. The final protein concentration was 600 μg/mL. Several kinds of ELISA plates were tested, and the antigen concentration, serum dilution, conjugate dilution and cut-off were determined to identify infection. Sera from patients with either hyper-infection syndrome or intestinal infection demonstrated by parasitological examination were positive controls, and sera from people living in non-endemic areas with no infection demonstrated by parasitological examination were negative controls. The best values were 5 μg/mL for antigen, 1/64 for sera and 1/1000 for conjugate; optical density values for positive samples were 1.2746 (1.1065-1.4206, SD = 0.3284) and for negative samples 0.4457 (0.3324-0.5538, SD = 0.2230). Twenty serum samples from positive subjects and one hundred from negative subjects were examined, yielding 90% sensitivity and 88% specificity. The results show this technique could be useful as a strongyloidiasis screening test in population studies.
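
    In such ELISA standardizations the cut-off is often set from the negative-control distribution (for example mean + 3 SD), and sensitivity and specificity are then obtained by classifying known positive and negative sera against it. The sketch below simulates optical densities around the means and SDs reported above; the mean + 3 SD rule is an assumption for illustration, not necessarily the rule used by the authors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated optical densities (OD) around the means/SDs reported in the
# abstract: positives ~ N(1.2746, 0.3284), negatives ~ N(0.4457, 0.2230).
od_positive = rng.normal(1.2746, 0.3284, 20)    # 20 parasitologically positive sera
od_negative = rng.normal(0.4457, 0.2230, 100)   # 100 negative sera

# Assumed cut-off rule: mean of negatives + 3 SD (a common convention; the
# abstract does not state which rule was actually used).
cutoff = od_negative.mean() + 3.0 * od_negative.std(ddof=1)

sensitivity = np.mean(od_positive >= cutoff)
specificity = np.mean(od_negative < cutoff)

print(f"cut-off OD: {cutoff:.3f}")
print(f"sensitivity: {sensitivity:.0%}, specificity: {specificity:.0%}")
```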

  13. Data compression techniques and the ACR-NEMA digital interface communications standard

    International Nuclear Information System (INIS)

    Zielonka, J.S.; Blume, H.; Hill, D.; Horil, S.C.; Lodwick, G.S.; Moore, J.; Murphy, L.L.; Wake, R.; Wallace, G.

    1987-01-01

    Data compression offers the possibility of achieving high effective information transfer rates between devices and of efficient utilization of digital storage devices in meeting department-wide archiving needs. Accordingly, the ACR-NEMA Digital Imaging and Communications Standards Committee established a Working Group to develop a means to incorporate the optimal use of a wide variety of current compression techniques while remaining compatible with the standard. This proposed method allows the use of public domain techniques, predetermined methods between devices already aware of the selected algorithm, and the ability for the originating device to specify algorithms and parameters prior to transmitting compressed data. Because of the latter capability, the technique has the potential for supporting many compression algorithms not yet developed or in common use. Both lossless and lossy methods can be implemented. In addition to a description of the overall structure of this proposal, several examples using current compression algorithms are given
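
    As an example of the kind of simple, public-domain lossless scheme such a negotiation mechanism could identify, the sketch below implements byte-wise run-length encoding of an image row. It is purely illustrative and is not an algorithm prescribed by the ACR-NEMA proposal.

```python
# Illustrative public-domain lossless compression: byte-wise run-length
# encoding of an image row. This is only an example of the kind of algorithm
# a sending device could announce; it is not part of the ACR-NEMA standard.

def rle_encode(data: bytes) -> list[tuple[int, int]]:
    """Encode a byte string as (run_length, value) pairs."""
    encoded = []
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        encoded.append((run, data[i]))
        i += run
    return encoded

def rle_decode(pairs: list[tuple[int, int]]) -> bytes:
    """Reverse of rle_encode."""
    return b"".join(bytes([value]) * run for run, value in pairs)

row = bytes([0] * 40 + [127] * 10 + [255] * 14)
packed = rle_encode(row)
assert rle_decode(packed) == row
print(f"{len(row)} bytes -> {len(packed) * 2} bytes as (run, value) pairs")
```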

  14. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data of these certified SRMs. These samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge at three different times. A 2.57 MeV proton beam obtained using a 5UDH-II Pelletron accelerator was used for excitation of x-rays from the sample. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  15. Standardization of P-33 by the TDCR efficiency calculation technique

    CSIR Research Space (South Africa)

    Simpson, BRS

    2004-02-01

    The activity of the pure beta-emitter phosphorus-33 (P-33) has been directly determined by the triple-to-double coincidence ratio (TDCR) efficiency calculation technique, thus extending the number of radionuclides that have been standardized...
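
    A heavily simplified sketch of the TDCR idea follows. Assuming three photomultipliers that each detect a decay with the same probability p (ignoring the beta spectrum and scintillation statistics that the full efficiency calculation integrates over), the triple- and double-coincidence efficiencies are εT = p³ and εD = 3p² − 2p³, so the measured ratio T/D fixes p and hence εD, and the activity follows as the double-coincidence rate divided by εD. All count rates below are hypothetical.

```python
# Heavily simplified TDCR sketch (single detection probability p per PMT):
#   triple-coincidence efficiency  eps_T = p**3
#   double-coincidence efficiency  eps_D = 3*p**2 - 2*p**3   (at least two PMTs)
# so the measured ratio R = T/D = p / (3 - 2*p) gives p = 3R / (1 + 2R).
# Count rates below are hypothetical; the real P-33 standardization integrates
# over the beta spectrum and scintillation statistics.

double_rate = 5200.0   # counts/s in the logical sum of double coincidences
triple_rate = 4680.0   # counts/s in the triple-coincidence channel

tdcr = triple_rate / double_rate
p = 3.0 * tdcr / (1.0 + 2.0 * tdcr)
eps_double = 3.0 * p**2 - 2.0 * p**3

activity = double_rate / eps_double   # decays per second (Bq)

print(f"TDCR = {tdcr:.3f}, single-PMT detection probability p = {p:.3f}")
print(f"double-coincidence efficiency = {eps_double:.3f}")
print(f"activity estimate = {activity:.0f} Bq")
```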

  16. Risk approaches in setting radiation standards

    International Nuclear Information System (INIS)

    Whipple, C.

    1984-01-01

    This paper discusses efforts to increase the similarity of risk regulation approaches for radiation and chemical carcinogens. The risk assessment process in both cases involves the same controversy over the extrapolation from high to low doses and dose rates, and in both cases the boundaries between science and policy in risk assessment are indistinct. Three basic considerations are presented for approaching policy questions: the economic efficiency of the regulatory approach, the degree of residual risk, and the technical opportunities for risk control. It is the author's opinion that the more an agency can show that its standard-setting policies are consistent with those which have achieved political and judicial acceptance in other contexts, the greater the predictability of the regulatory process and the stability of this process

  17. Intraoral approach: A newer technique for filler injection

    Directory of Open Access Journals (Sweden)

    Chytra V Anand

    2010-01-01

    Filler injections are the most common aesthetic procedures used for volume correction. Various techniques have been described in the use of fillers. This article reviews the available literature on a new technique using the intraoral approach for injection of fillers.

  18. Integrated management system - management standards evolution and the IAEA new approach

    International Nuclear Information System (INIS)

    Oliveira, Dirceu Paulo de; Zouain, Desiree Moraes

    2007-01-01

    The application of management standards began in the military and nuclear areas towards the end of the Second World War, when some western countries developed quality standards to improve their means of assessing suppliers' ability to assure product conformance, which was increasingly complex and required a higher degree of reliability. Afterwards, the application of quality standards was extended to the consumer market, focused on satisfying consumers' requirements. Along with society's growing concern about quality of life, other management standards were developed, such as those dealing with environment and sustainable development, occupational health and safety, social accountability and so on. As a consequence, the management process became complex. The integrated form of the management system approach makes the distinct and complementary interests of the several functions and disciplines involved compatible, and compensates for the absence of a holistic approach in organizations. In line with this integrated management approach, the International Atomic Energy Agency (IAEA) decided to review the structure of the 50-C-Q standard - 'Quality Assurance for Safety in Nuclear Power Plants and Other Nuclear Installations', from 1996, publishing in 2006 the new GS-R-3 standard - 'The Management System for Facilities and Activities - Safety Requirements'. This work presents a brief evolution of management standards and of the integrated management approach, showing the Agency's new vision concerning this issue with the publication of the GS-R-3 standard. (author)

  19. The development of an efficient mass balance approach for the purity assignment of organic calibration standards.

    Science.gov (United States)

    Davies, Stephen R; Alamgir, Mahiuddin; Chan, Benjamin K H; Dang, Thao; Jones, Kai; Krishnaswami, Maya; Luo, Yawen; Mitchell, Peter S R; Moawad, Michael; Swan, Hilton; Tarrant, Greg J

    2015-10-01

    The purity determination of organic calibration standards using the traditional mass balance approach is described. Demonstrated examples highlight the potential for bias in each measurement and the need to implement an approach that provides a cross-check for each result, affording fit for purpose purity values in a timely and cost-effective manner. Chromatographic techniques such as gas chromatography with flame ionisation detection (GC-FID) and high-performance liquid chromatography with UV detection (HPLC-UV), combined with mass and NMR spectroscopy, provide a detailed impurity profile allowing an efficient conversion of chromatographic peak areas into relative mass fractions, generally avoiding the need to calibrate each impurity present. For samples analysed by GC-FID, a conservative measurement uncertainty budget is described, including a component to cover potential variations in the response of each unidentified impurity. An alternative approach is also detailed in which extensive purification eliminates the detector response factor issue, facilitating the certification of a super-pure calibration standard which can be used to quantify the main component in less-pure candidate materials. This latter approach is particularly useful when applying HPLC analysis with UV detection. Key to the success of this approach is the application of both qualitative and quantitative (1)H NMR spectroscopy.
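
    As a worked illustration of the mass balance principle described above (the component values are hypothetical, not the paper's data), the purity of the main component is obtained by subtracting the orthogonally determined impurity classes and scaling the chromatographic main-peak area fraction by the organic fraction actually present:

```python
# Worked mass-balance example with hypothetical values.
water_content = 0.45          # % m/m, e.g. from Karl Fischer titration
residual_solvents = 0.20      # % m/m, e.g. from headspace GC or qNMR
non_volatiles = 0.05          # % m/m, e.g. from thermogravimetric analysis

# Chromatographic main-peak area fraction of the organic fraction (GC-FID or
# HPLC-UV), assuming comparable detector response for the related impurities.
main_peak_area_fraction = 99.62   # % of total peak area

organic_fraction = 100.0 - (water_content + residual_solvents + non_volatiles)
purity = organic_fraction * main_peak_area_fraction / 100.0

print(f"organic fraction: {organic_fraction:.2f} % m/m")
print(f"assigned purity (mass balance): {purity:.2f} % m/m")
```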

  20. A statistical approach to traditional Vietnamese medical diagnoses standardization

    International Nuclear Information System (INIS)

    Nguyen Hoang Phuong; Nguyen Quang Hoa; Le Dinh Long

    1990-12-01

    In this paper the first results of the statistical approach to Cold-Heat diagnosis standardization, as a first step in the standardization of the 'eight rules diagnoses' of Traditional Vietnamese Medicine, are briefly described. Some conclusions and suggestions for further work are given. 3 refs, 2 tabs

  1. Finite element methods for engineering sciences. Theoretical approach and problem solving techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chaskalovic, J. [Ariel University Center of Samaria (Israel); Pierre and Marie Curie (Paris VI) Univ., 75 (France). Inst. Jean le Rond d' Alembert

    2008-07-01

    This self-tutorial offers a concise yet thorough grounding in the mathematics necessary for successfully applying FEMs to practical problems in science and engineering. The unique approach first summarizes and outlines the finite-element mathematics in general and then, in the second and major part, formulates problem examples that clearly demonstrate the techniques of functional analysis via numerous and diverse exercises. The solutions of the problems are given directly afterwards. Using this approach, the author motivates and encourages the reader to actively acquire the knowledge of finite-element methods instead of passively absorbing the material, as in most standard textbooks. The enlarged English-language edition, based on the original French, also contains a chapter on the approximation steps derived from the description of nature with differential equations and then applied to the specific model to be used. Furthermore, an introduction to tensor calculus using distribution theory offers further insight for readers with different mathematical backgrounds. (orig.)

  2. The STEP standard as an approach for design and prototyping

    OpenAIRE

    Plantec, Alain; Ribaud, Vincent

    1998-01-01

    STEP is an ISO standard (ISO-10303) for the computer-interpretable representation and exchange of product data. Parts of STEP standardize conceptual structures and the usage of information in generic or specific domains. The standardization process for these constructs is an evolutionary approach, which uses generated prototypes at different phases of the process. This paper presents a method for building prototype generators, inspired by this standardization proces...

  3. Standardization of MIP technique in three-dimensional CT portography: usefulness in evaluation of portosystemic collaterals in cirrhotic patients

    International Nuclear Information System (INIS)

    Kim, Jong Gi; Kim, Yong; Kim, Chang Won; Lee, Jun Woo; Lee, Suk Hong

    2003-01-01

    To assess the usefulness of three-dimensional CT portography using a standardized maximum intensity projection (MIP) technique for the evaluation of portosystemic collaterals in cirrhotic patients. In 25 cirrhotic patients with portosystemic collaterals, three-phase CT using a multidetector-row helical CT scanner was performed to evaluate liver disease. Late arterial-phase images were transferred to an Advantage Windows 3.1 workstation (General Electric). Axial images were reconstructed by means of three-dimensional CT portography, using both a standardized and a non-standardized MIP technique, and the respective reconstruction times were determined. Three-dimensional CT portography with the standardized technique involved eight planes, namely the spleno-portal confluence axis (coronal, lordotic coronal, lordotic coronal RAO 30°, and lordotic coronal LAO 30°), the left renal vein axis (lordotic coronal), and axial MIP images (lower esophagus level, gastric fundus level and splenic hilum). The eight MIP images obtained in each case were interpreted by two radiologists, who reached a consensus in their evaluation. The portosystemic collaterals evaluated were as follows: left gastric vein dilatation; esophageal, paraesophageal, gastric, and splenic varix; paraumbilical vein dilatation; gastro-renal, spleno-renal, and gastro-spleno-renal shunt; mesenteric, retroperitoneal, and omental collaterals. The average reconstruction time using the non-standardized MIP technique was 11 minutes 23 seconds, and with the standardized technique, the time was 6 minutes 5 seconds. Three-dimensional CT portography with the standardized technique demonstrated left gastric vein dilatation (n=25), esophageal varix (n=18), paraesophageal varix (n=13), gastric varix (n=4), splenic varix (n=4), paraumbilical vein dilatation (n=4), gastro-renal shunt (n=3), spleno-renal shunt (n=3), and gastro-spleno-renal shunt (n=1). Using three-dimensional CT portography and the non-standardized

  4. Peyton's four-step approach for teaching complex spinal manipulation techniques - a prospective randomized trial.

    Science.gov (United States)

    Gradl-Dietsch, Gertraud; Lübke, Cavan; Horst, Klemens; Simon, Melanie; Modabber, Ali; Sönmez, Tolga T; Münker, Ralf; Nebelung, Sven; Knobe, Matthias

    2016-11-03

    The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  5. Anterior Colporrhaphy Technique and Approach Choices: Turkey Evaluation

    Directory of Open Access Journals (Sweden)

    Serdar Aydın

    2016-09-01

    Aim: To evaluate the diversity in techniques and approaches for anterior colporrhaphy among operators in Turkey. Methods: A survey evaluating the preoperative examination, technique of anterior colporrhaphy, choice of operation and postoperative care was presented to surgeons, who were contacted directly, by mail or by telephone, using a 28-item questionnaire. Results: The majority (87.9%) of respondents were young gynecologists; urologists made up 9.5% of the study population. The rate of paravaginal defect evaluation was 75.9%, mostly by inspecting for the presence of vaginal rugae. Use of transperineal 3D pelvic floor ultrasonography was low (5.7%). Levator ani muscle defects were evaluated by 46.6% of operators, and transperineal 3D ultrasonography was used for this purpose by 19%. There was diversity in the use of hydrodissection, fascial plication, excision of vaginal mucosa and suture choice. Use of mesh for anterior colporrhaphy was limited (17.8%) and mostly confined to recurrent cases (12.2%). The paravaginal defect repair rate was 31.9%. The urinary catheter was generally removed one or two days after the operation, and the vaginal pack was usually removed after 24 hours. Conclusion: Several techniques and approaches for anterior vaginal wall repair are in use among operators in Turkey; this variety suggests that there is no consensus on the best surgical technique.

  6. The Development of the Standard Lithuanian Language: Ecolinguistic Approach

    Directory of Open Access Journals (Sweden)

    Vaida Buivydienė

    2014-06-01

    The theory of standard languages is closely linked with standardization policy and prevailing ideology. Language ideology comprises the values, experience and convictions related to language usage and its discourse, influenced at institutional, local and global levels. In recent decades, foreign linguists have linked theories of the development of standard languages and their ideologies with an ecolinguistic approach to language standardization phenomena. The article draws on Einar Haugen's theory of the development of standard languages and on ecolinguistic statements, and presents the stages of developing a standard language as well as the factors influencing them. In conclusion, a strong political and social impact has been made on the development of the standard Lithuanian language. The stages of its development followed one another rapidly, some occurring very close together and some still ongoing.

  7. Standardization of the Descemet membrane endothelial keratoplasty technique: Outcomes of the first 450 consecutive cases.

    Science.gov (United States)

    Satué, M; Rodríguez-Calvo-de-Mora, M; Naveiras, M; Cabrerizo, J; Dapena, I; Melles, G R J

    2015-08-01

    To evaluate the clinical outcome of the first 450 consecutive cases after Descemet membrane endothelial keratoplasty (DMEK), as well as the effect of standardization of the technique. Comparison between 3 groups: Group I: (cases 1-125), as the extended learning curve; Group II: (cases 126-250), transition to technique standardization; Group III: (cases 251-450), surgery with standardized technique. Best corrected visual acuity, endothelial cell density, pachymetry and intra- and postoperative complications were evaluated before, and 1, 3 and 6 months after DMEK. At 6 months after surgery, 79% of eyes reached a best corrected visual acuity of≥0.8 and 43%≥1.0. Mean preoperative endothelial cell density was 2,530±220 cells/mm2 and 1,613±495 at 6 months after surgery. Mean pachymetry measured 668±92 μm and 526±46 μm pre- and (6 months) postoperatively, respectively. There were no significant differences in best corrected visual acuity, endothelial cell density and pachymetry between the 3 groups (P > .05). Graft detachment presented in 17.3% of the eyes. The detachment rate declined from 24% to 12%, and the rate of secondary surgeries from 9.6% to 3.5%, from group I to III respectively. Visual outcomes and endothelial cell density after DMEK are independent of the technique standardization. However, technique standardization may have contributed to a lower graft detachment rate and a relatively low number of secondary interventions required. As such, DMEK may become the first choice of treatment in corneal endothelial disease. Copyright © 2014 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.

  8. Next Generation Space Interconnect Standard (NGSIS): a modular open standards approach for high performance interconnects for space

    Science.gov (United States)

    Collier, Charles Patrick

    2017-04-01

    The Next Generation Space Interconnect Standard (NGSIS) effort is a Government-Industry collaboration effort to define a set of standards for interconnects between space system components with the goal of cost effectively removing bandwidth as a constraint for future space systems. The NGSIS team has selected the ANSI/VITA 65 OpenVPX™ standard family for the physical baseline. The RapidIO protocol has been selected as the basis for the digital data transport. The NGSIS standards are developed to provide sufficient flexibility to enable users to implement a variety of system configurations, while meeting goals for interoperability and robustness for space. The NGSIS approach and effort represents a radical departure from past approaches to achieve a Modular Open System Architecture (MOSA) for space systems and serves as an exemplar for the civil, commercial, and military Space communities as well as a broader high reliability terrestrial market.

  9. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    Energy Technology Data Exchange (ETDEWEB)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin [VU University Medical Center, Department of Cardiology, and Institute for Cardiovascular Research (ICaR-VU), Amsterdam (Netherlands); Kuijer, Joost P.A. [VU University Medical Center, Department of Physics and Medical Technology, Amsterdam (Netherlands); Ven, Peter M. van de [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); Meine, Mathias [University Medical Center, Department of Cardiology, Utrecht (Netherlands); Croisille, Pierre; Clarysse, Patrick [Univ Lyon, UJM-Saint-Etienne, INSA, CNRS UMR 5520, INSERM U1206, CREATIS, Saint-Etienne (France)

    2017-12-15

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)
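    The arithmetic behind such a segment-length strain measure can be sketched briefly (an illustration only, not the authors' validated SLICE implementation): strain in each cine frame is the fractional change of the segment length relative to its end-diastolic reference length.

        import numpy as np

        def segment_strain(lengths, reference_index=0):
            """Lagrangian strain of one myocardial segment per cine frame.

            lengths: segment lengths (e.g. in mm) measured between two anatomical
            landmarks on successive short-axis cine frames.
            reference_index: frame taken as the zero-strain reference (end-diastole).
            """
            lengths = np.asarray(lengths, dtype=float)
            l0 = lengths[reference_index]
            return (lengths - l0) / l0   # negative values indicate shortening

        # Hypothetical segment shortening through systole and relaxing again
        lengths_mm = [52.0, 50.1, 47.3, 45.8, 46.9, 49.5, 51.8]
        print(segment_strain(lengths_mm).min())   # peak systolic strain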

  10. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    International Nuclear Information System (INIS)

    Zweerink, Alwin; Allaart, Cornelis P.; Wu, LiNa; Beek, Aernout M.; Rossum, Albert C. van; Nijveldt, Robin; Kuijer, Joost P.A.; Ven, Peter M. van de; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick

    2017-01-01

    Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. (orig.)

  11. Development of Canadian seismic design approach and overview of seismic standards

    Energy Technology Data Exchange (ETDEWEB)

    Usmani, A. [Amec Foster Wheeler, Toronto, ON (Canada); Aziz, T. [TSAziz Consulting Inc., Mississauga, ON (Canada)

    2015-07-01

    Historically the Canadian seismic design approaches have evolved for CANDU® nuclear power plants to ensure that they are designed to withstand a design basis earthquake (DBE) and have margins to meet the safety requirements of beyond DBE (BDBE). While the Canadian approach differs from others, it is comparable and in some cases more conservative. The seismic requirements are captured in five CSA nuclear standards which are kept up to date and incorporate lessons learnt from recent seismic events. This paper describes the evolution of Canadian approach, comparison with others and provides an overview and salient features of CSA seismic standards. (author)

  12. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  13. Natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity such as electricity generation would be acceptable if the exposure was small compared to the natural background. In order to make this concept quantitative and objective, we propose that 'small compared with the natural background' be interpreted as the standard deviation (weighted by the exposed population) of the natural background. This use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the United States, it suggests that a dose of 20 mrem/year would be an acceptable standard. This is comparable to the 25 mrem/year suggested as the maximum allowable exposure to an individual from the complete uranium fuel cycle.
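    The statistic proposed here is straightforward to compute. The sketch below (with made-up regional doses and populations, not the US data used in the paper) shows a population-weighted mean and standard deviation of natural background dose; the weighted standard deviation is the quantity suggested as the standard.

        import numpy as np

        def weighted_mean_and_sd(doses, populations):
            """Population-weighted mean and standard deviation of background dose."""
            doses = np.asarray(doses, dtype=float)        # mrem/year per region
            weights = np.asarray(populations, dtype=float)
            mean = np.average(doses, weights=weights)
            variance = np.average((doses - mean) ** 2, weights=weights)
            return mean, np.sqrt(variance)

        # Hypothetical regional background doses and populations
        doses = [85.0, 100.0, 120.0, 140.0, 95.0]       # mrem/year
        populations = [40e6, 60e6, 30e6, 10e6, 80e6]
        mean, sd = weighted_mean_and_sd(doses, populations)
        print(f"weighted mean = {mean:.1f} mrem/y, weighted SD = {sd:.1f} mrem/y")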

  14. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Non-standard forms of employment are currently becoming more widespread, yet there is no clear approach to defining and characterizing non-standard employment. The article analyses diverse interpretations of the concept and, on that basis, the author concludes that precarious employment is a complex and contradictory economic category. Different approaches to classifying forms of precarious employment are examined. The main forms considered are: flexible working year, flexible working week, flexible working hours, remote work, employees on call, shift forwarding, agency employment, self-employment, negotiator, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on contracts of a civil-legal nature, one-time employment, casual employment, temporary employment, secondary employment and part-time work. The author proposes an approach to classifying non-standard forms of employment based on identifying the impact of atypical employment on the development of human potential. For classifying non-standard employment forms from the standpoint of their impact on human development, the following criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development, and employment stability. Depending on the value each of these criteria takes, a given form of non-standard employment can be classed as progressive or regressive. Classification of non-standard forms of employment should form the basis of state employment policy.

  15. A System Approach to Advanced Practice Clinician Standardization and High Reliability.

    Science.gov (United States)

    Okuno-Jones, Susan; Siehoff, Alice; Law, Jennifer; Juarez, Patricia

    Advanced practice clinicians (APCs) are an integral part of the health care team. Opportunities exist within Advocate Health Care to standardize and optimize APC practice across the system. To enhance the role and talents of APCs, an approach to role definition and optimization of practice and a structured approach to orientation and evaluation are shared. Although in the early stages of development, definition and standardization of accountabilities in a framework to support system changes are transforming the practice of APCs.

  16. Modification of the cranial closing wedge ostectomy technique for the treatment of canine cruciate disease. Description and comparison with standard technique.

    Science.gov (United States)

    Wallace, A M; Addison, E S; Smith, B A; Radke, H; Hobbs, S J

    2011-01-01

    To describe a modification of the cranial closing wedge ostectomy (CCWO) technique and to compare its efficacy to the standard technique on cadaveric specimens. The standard and modified CCWO technique were applied to eight pairs of cadaveric tibiae. The following parameters were compared following the ostectomy: degrees of plateau levelling achieved (degrees), tibial long axis shift (degrees), reduction in tibial length (mm), area of bone wedge removed (cm²), and the area of proximal fragment (cm²). The size of the removed wedge of bone and the reduction in tibial length were significantly less with the modified CCWO technique. The modified CCWO has two main advantages. Firstly a smaller wedge is removed, allowing a greater preservation of bone stock in the proximal tibia, which is advantageous for implant placement. Secondly, the tibia is shortened to a lesser degree, which might reduce the risk of recurvatum, fibular fracture and patella desmitis. These factors are particularly propitious for the application of this technique to Terrier breeds with excessive tibial plateau angle, where large angular corrections are required. The modified CCWO is equally effective for plateau levelling and results in an equivalent tibial long-axis shift. A disadvantage with the modified technique is that not all of the cross sectional area of the distal fragment contributes to load sharing at the osteotomy.

  17. A direct sensitivity approach to predict hourly ozone resulting from compliance with the National Ambient Air Quality Standard.

    Science.gov (United States)

    Simon, Heather; Baker, Kirk R; Akhtar, Farhan; Napelenok, Sergey L; Possiel, Norm; Wells, Benjamin; Timin, Brian

    2013-03-05

    In setting primary ambient air quality standards, the EPA's responsibility under the law is to establish standards that protect public health. As part of the current review of the ozone National Ambient Air Quality Standard (NAAQS), the US EPA evaluated the health exposure and risks associated with ambient ozone pollution using a statistical approach to adjust recent air quality to simulate just meeting the current standard level, without specifying emission control strategies. One drawback of this purely statistical concentration rollback approach is that it does not take into account spatial and temporal heterogeneity of ozone response to emissions changes. The application of the higher-order decoupled direct method (HDDM) in the community multiscale air quality (CMAQ) model is discussed here to provide an example of a methodology that could incorporate this variability into the risk assessment analyses. Because this approach includes a full representation of the chemical production and physical transport of ozone in the atmosphere, it does not require assumed background concentrations, which have been applied to constrain estimates from past statistical techniques. The CMAQ-HDDM adjustment approach is extended to measured ozone concentrations by determining typical sensitivities at each monitor location and hour of the day based on a linear relationship between first-order sensitivities and hourly ozone values. This approach is demonstrated by modeling ozone responses for monitor locations in Detroit and Charlotte to domain-wide reductions in anthropogenic NOx and VOCs emissions. As seen in previous studies, ozone response calculated using HDDM compared well to brute-force emissions changes up to approximately a 50% reduction in emissions. A new stepwise approach is developed here to apply this method to emissions reductions beyond 50% allowing for the simulation of more stringent reductions in ozone concentrations. Compared to previous rollback methods, this
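    The adjustment logic can be illustrated with a minimal sketch (assumed numbers and a deliberately simplified treatment, not the EPA's CMAQ-HDDM code): a first-order sensitivity coefficient relates a fractional emission change to an hourly ozone change at a given monitor and hour, and large reductions are applied stepwise.

        def adjust_ozone(o3_obs_ppb, s1_ppb, emis_reduction, step=0.1):
            """First-order sensitivity adjustment of one hourly ozone value.

            o3_obs_ppb:     observed hourly ozone at the monitor (ppb)
            s1_ppb:         assumed first-order sensitivity of ozone to a 100%
                            change in the emission category (ppb)
            emis_reduction: total fractional emission reduction to simulate (0-1)
            step:           fraction applied per step; in the full method the
                            sensitivity would be re-estimated after each step,
                            here it is held constant for simplicity
            """
            o3 = o3_obs_ppb
            remaining = emis_reduction
            while remaining > 1e-9:
                d = min(step, remaining)
                o3 -= d * s1_ppb          # linear (first-order) response
                remaining -= d
            return o3

        # Hypothetical example: 65 ppb hour, sensitivity of 20 ppb per 100% NOx cut
        print(adjust_ozone(65.0, 20.0, 0.6))   # ozone after a simulated 60% cut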

  18. Development of self-calibration techniques for on-wafer and fixtured measurements: a novel approach

    OpenAIRE

    Pradell i Cara, Lluís; Purroy Martín, Francesc; Cáceres, M.

    1992-01-01

    Network Analyzer self-calibration techniques - TRL, LMR, TAR - are developed, implemented and compared in several transmission media. A novel LMR (Line-Match-Reflect) technique, based on known LINE and REFLECT standards, is proposed and compared to conventional LMR (based on known LINE and MATCH standards) and other techniques (TRL, TAR). They are applied to on-wafer S-parameter measurement as well as to coaxial, waveguide and microstrip media. Experimental results up to 40 GHz are presented. ...

  19. Subgaleal Retention Sutures: Internal Pressure Dressing Technique for Dolenc Approach.

    Science.gov (United States)

    Burrows, Anthony M; Rayan, Tarek; Van Gompel, Jamie J

    2017-08-01

    The extradural approach to the cavernous sinus, the "Dolenc" approach, named in recognition of its developer, Dr. Vinko Dolenc, is a critically important skull base approach. However, resection of the lateral wall of the cavernous sinus, most commonly for cavernous sinus meningiomas, commonly results in a defect that often cannot be reconstructed in a water-tight fashion. This may result in a troublesome pseudomeningocele postoperatively. To describe a technique designed to mitigate the development of pseudomeningocele. We found the Dolenc approach critical for resection of cavernous sinus lesions; however, in the early cohort a number of pseudomeningoceles had to be managed with prolonged external pressure wrapping. Therefore, we incorporated subgaleal-to-muscular sutures designed to close this potential space and retrospectively analyzed our results. Twenty-one patients treated with a Dolenc approach and resection of the lateral wall of the cavernous sinus over a 2-year period were included. Prior to incorporation of this technique, 12 patients were treated and 3 (25%) experienced postoperative pseudomeningoceles requiring multiple clinic visits and frequent dressing. After incorporation of subgaleal retention sutures, no patient (0%) experienced this complication. Although basic, subgaleal-to-temporalis muscle retention sutures likely aid in eliminating this potential dead space, thereby preventing patient distress postoperatively. This technique is simple and further emphasizes the importance of dead space elimination in complex closures. Copyright © 2017 by the Congress of Neurological Surgeons

  20. The colloquial approach: An active learning technique

    Science.gov (United States)

    Arce, Pedro

    1994-09-01

    This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.

  1. A Gold Standards Approach to Training Instructors to Evaluate Crew Performance

    Science.gov (United States)

    Baker, David P.; Dismukes, R. Key

    2003-01-01

    The Advanced Qualification Program requires that airlines evaluate crew performance in Line Oriented Simulation. For this evaluation to be meaningful, instructors must observe relevant crew behaviors and evaluate those behaviors consistently and accurately against standards established by the airline. The airline industry has largely settled on an approach in which instructors evaluate crew performance on a series of event sets, using standardized grade sheets on which behaviors specific to each event set are listed. Typically, new instructors are given a class in which they learn to use the grade sheets and practice evaluating crew performance observed on videotapes. These classes emphasize reliability, providing detailed instruction and practice in scoring so that all instructors within a given class will give similar scores to similar performance. This approach has value but also has important limitations: (1) ratings within one class of new instructors may differ from those of other classes; (2) ratings may not be driven primarily by the specific behaviors on which the company wanted the crews to be scored; and (3) ratings may not be calibrated to company standards for the level of performance skill required. In this paper we provide a method to extend the existing method of training instructors to address these three limitations. We call this method the "gold standards" approach because it uses ratings from the company's most experienced instructors as the basis for training rater accuracy. This approach ties the training to the specific behaviors on which the experienced instructors based their ratings.

  2. The implementation of a standardized approach to laparoscopic rectal surgery

    DEFF Research Database (Denmark)

    Aslak, Katrine Kanstrup; Bulut, Orhan

    2012-01-01

    BACKGROUND AND OBJECTIVES: The purpose of this study was to audit our results after implementation of a standardized operative approach to laparoscopic surgery for rectal cancer within a fast-track recovery program. METHODS: From January 2009 to February 2011, 100 consecutive patients underwent...... laparoscopic surgery on an intention-to-treat basis for rectal cancer. The results were retrospectively reviewed from a prospectively collected database. Operative steps and instrumentation for the procedure were standardized. A standard perioperative care plan was used. RESULTS: The following procedures were...

  3. Comparison of anthropometry with photogrammetry based on a standardized clinical photographic technique using a cephalostat and chair.

    Science.gov (United States)

    Han, Kihwan; Kwon, Hyuk Joon; Choi, Tae Hyun; Kim, Jun Hyung; Son, Daegu

    2010-03-01

    The aim of this study was to standardize clinical photogrammetric techniques, and to compare anthropometry with photogrammetry. To standardize clinical photography, we have developed a photographic cephalostat and chair. We investigated the repeatability of the standardized clinical photogrammetric technique. Then, with 40 landmarks, a total of 96 anthropometric measurement items was obtained from 100 Koreans. Ninety six photogrammetric measurements from the same subjects were also obtained from standardized clinical photographs using Adobe Photoshop version 7.0 (Adobe Systems Corporation, San Jose, CA, USA). The photogrammetric and anthropometric measurement data (mm, degree) were then compared. A coefficient was obtained by dividing the anthropometric measurements by the photogrammetric measurements. The repeatability of the standardized photography was statistically significantly high (p=0.463). Among the 96 measurement items, 44 items were reliable; for these items the photogrammetric measurements were not different to the anthropometric measurements. The remaining 52 items must be classified as unreliable. By developing a photographic cephalostat and chair, we have standardized clinical photogrammetric techniques. The reliable set of measurement items can be used as anthropometric measurements. For unreliable measurement items, applying a suitable coefficient to the photogrammetric measurement allows the anthropometric measurement to be obtained indirectly.
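    The coefficient correction described for the unreliable items is simple arithmetic; the numbers below are hypothetical and only illustrate how an anthropometric value is recovered indirectly from a photogrammetric one.

        # Hypothetical measurement item (not taken from the study's data)
        anthropometric_mm = 35.0     # direct caliper measurement
        photogrammetric_mm = 33.0    # measurement on the standardized photograph

        coefficient = anthropometric_mm / photogrammetric_mm   # ~1.06

        # A new photogrammetric measurement is scaled by the coefficient to
        # estimate the corresponding anthropometric value indirectly.
        new_photogrammetric_mm = 34.0
        estimated_anthropometric_mm = coefficient * new_photogrammetric_mm
        print(round(estimated_anthropometric_mm, 1))           # ~36.1 mm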

  4. Radiation safety standards : an environmentalist's approach

    International Nuclear Information System (INIS)

    Murthy, M.S.S.S.

    1977-01-01

    An integrated approach to the problem of environmental mutagenic hazards leads to the recommendation of a single dose-limit for the exposure of human beings to all man-made mutagenic agents, including chemicals and radiation. However, because of the lack of: (1) adequate information on chemical mutagens, (2) sufficient data on their risk estimates, and (3) universally accepted dose-limits, control of chemical mutagens in the environment has not reached the advanced stage that control of radiation has. In this situation, the radiation safety standards currently in use should be retained at their present levels. (M.G.B.)

  5. Do Energy Efficiency Standards Improve Quality? Evidence from a Revealed Preference Approach

    Energy Technology Data Exchange (ETDEWEB)

    Houde, Sebastien [Univ. of Maryland, College Park, MD (United States); Spurlock, C. Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-06-01

    Minimum energy efficiency standards have occupied a central role in U.S. energy policy for more than three decades, but little is known about their welfare effects. In this paper, we employ a revealed preference approach to quantify the impact of past revisions in energy efficiency standards on product quality. The micro-foundation of our approach is a discrete choice model that allows us to compute a price-adjusted index of vertical quality. Focusing on the appliance market, we show that several standard revisions during the period 2001-2011 have led to an increase in quality. We also show that these standards have had a modest effect on prices, and in some cases they even led to decreases in prices. For revision events where overall quality increases and prices decrease, the consumer welfare effect of tightening the standards is unambiguously positive. Finally, we show that after controlling for the effect of improvement in energy efficiency, standards have induced an expansion of quality in the non-energy dimension. We discuss how imperfect competition can rationalize these results.
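    One common way to formalize such a price-adjusted index of vertical quality (stated here as an assumption about the general approach, not as the authors' exact specification) is to start from a logit demand model in which product j has mean utility

        \delta_j = q_j - \alpha p_j ,

    so that, once the mean utilities \delta_j and the price coefficient \alpha are estimated from observed choices, the quality index is recovered as

        q_j = \delta_j + \alpha p_j ,

    where p_j is the price; movements in q_j across standard revisions can then be read as changes in price-adjusted quality.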

  6. Comparison of QuadrapolarTM radiofrequency lesions produced by standard versus modified technique: an experimental model

    Directory of Open Access Journals (Sweden)

    Safakish R

    2017-06-01

    Lower back pain (LBP) is a global public health issue and is associated with substantial financial costs and loss of quality of life. Over the years, different literature has provided different statistics regarding the causes of back pain; the following figure is the closest estimate for our patient population: sacroiliac (SI) joint pain is responsible for LBP in 18%–30% of individuals with LBP. Quadrapolar™ radiofrequency ablation, which involves ablation of the nerves of the SI joint using heat, is a commonly used treatment for SI joint pain. However, the standard Quadrapolar radiofrequency procedure is not always effective at ablating all the sensory nerves that cause the pain in the SI joint. One of the major limitations of the standard Quadrapolar radiofrequency procedure is that it produces small lesions of ~4 mm in diameter; smaller lesions increase the likelihood of failure to ablate all nociceptive input. In this study, we compare the standard Quadrapolar radiofrequency ablation technique to a modified Quadrapolar ablation technique that has produced improved patient outcomes in our clinic. The methodologies of the two techniques are compared. In addition, we compare results from an experimental model comparing the lesion sizes produced by the two techniques. Taken together, the findings from this study suggest that the modified Quadrapolar technique provides longer-lasting relief for back pain caused by SI joint dysfunction. A randomized controlled clinical trial is the next step required to quantify the difference in symptom relief and quality of life produced by the two techniques. Keywords: lower back pain, radiofrequency ablation, sacroiliac joint, Quadrapolar radiofrequency ablation

  7. Standard Test Method for Determining Thermal Neutron Reaction Rates and Thermal Neutron Fluence Rates by Radioactivation Techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 The purpose of this test method is to define a general procedure for determining an unknown thermal-neutron fluence rate by neutron activation techniques. It is not practicable to describe completely a technique applicable to the large number of experimental situations that require the measurement of a thermal-neutron fluence rate. Therefore, this method is presented so that the user may adapt to his particular situation the fundamental procedures of the following techniques. 1.1.1 Radiometric counting technique using pure cobalt, pure gold, pure indium, cobalt-aluminum alloy, gold-aluminum alloy, or indium-aluminum alloy. 1.1.2 Standard comparison technique using pure gold, or gold-aluminum alloy, and 1.1.3 Secondary standard comparison techniques using pure indium, indium-aluminum alloy, pure dysprosium, or dysprosium-aluminum alloy. 1.2 The techniques presented are limited to measurements at room temperatures. However, special problems when making thermal-neutron fluence rate measurements in high-...
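    The relation underlying all of these activation techniques (given here in its textbook form, not quoted from the ASTM text) connects the activity measured in the irradiated monitor to the thermal-neutron fluence rate:

        A = N \sigma \varphi \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d}
        \quad\Longrightarrow\quad
        \varphi = \frac{A}{N \sigma \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d}} ,

    where A is the measured activity, N the number of target atoms in the monitor, \sigma the effective thermal-neutron activation cross section, \varphi the thermal-neutron fluence rate, \lambda the decay constant of the product nuclide, t_i the irradiation time, and t_d the decay time between the end of irradiation and counting.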

  8. Catheter and Laryngeal Mask Endotracheal Surfactant Therapy: the CALMEST approach as a novel MIST technique.

    Science.gov (United States)

    Vannozzi, Ilaria; Ciantelli, Massimiliano; Moscuzza, Francesca; Scaramuzzo, Rosa T; Panizza, Davide; Sigali, Emilio; Boldrini, Antonio; Cuttano, Armando

    2017-10-01

    Neonatal respiratory distress syndrome (RDS) is a major cause of mortality and morbidity among preterm infants. Although the INSURE (INtubation, SURfactant administration, Extubation) technique for surfactant replacement therapy is so far the gold standard method, over the last years new approaches have been studied, i.e. less invasive surfactant administration (LISA) or minimally invasive surfactant therapy (MIST). Here we propose an original modification of MIST, called CALMEST (Catheter And Laryngeal Mask Endotracheal Surfactant Therapy), using a particular laryngeal mask as a guide for a thin catheter to deliver surfactant directly into the trachea. We performed a preliminary study on a mannequin and a subsequent in vivo pilot trial. This novel procedure is quick, effective and well tolerated, and might represent an improvement in reducing neonatal stress. Ultimately, CALMEST offers an alternative approach that could be extremely useful for medical staff with low expertise in laryngoscopy and intubation.

  9. Reliability and Validity of 10 Different Standard Setting Procedures.

    Science.gov (United States)

    Halpin, Glennelle; Halpin, Gerald

    Research indicating that different cut-off points result from the use of different standard-setting techniques leaves decision makers with a disturbing dilemma: Which standard-setting method is best? This investigation of the reliability and validity of 10 different standard-setting approaches was designed to provide information that might help…

  10. Rendezvous technique for recanalization of long-segmental chronic total occlusion above the knee following unsuccessful standard angioplasty.

    Science.gov (United States)

    Cao, Jun; Lu, Hai-Tao; Wei, Li-Ming; Zhao, Jun-Gong; Zhu, Yue-Qi

    2016-04-01

    To assess the technical feasibility and efficacy of the rendezvous technique, a type of subintimal retrograde wiring, for the treatment of long-segmental chronic total occlusions above the knee following unsuccessful standard angioplasty. The rendezvous technique was attempted in eight limbs of eight patients with chronic total occlusions above the knee after standard angioplasty had failed. The clinical symptoms and ankle-brachial index were compared before and after the procedure. At follow-up, pain relief, wound healing, limb salvage, and the presence of restenosis of the target vessels were evaluated. The rendezvous technique was performed successfully in seven patients (87.5%) and failed in one patient (12.5%). Foot pain improved in all seven patients who underwent successful treatment, with ankle-brachial indexes improving from 0.23 ± 0.13 before to 0.71 ± 0.09 after the procedure. The rendezvous technique is a feasible and effective treatment for chronic total occlusions above the knee when standard angioplasty fails. © The Author(s) 2015.

  11. Risk Management Standards: Towards a contemporary, organisation-wide management approach

    OpenAIRE

    Koutsoukis, Nikitas-Spiros

    2010-01-01

    Risk management has been progressively evolving into a systemic approach for organisational decision making in today’s dynamic economic environment of the global era. In this context, risk management is reaching beyond its traditional finance and insurance application context and is entering the sphere of generic, organisation-wide management approaches. In support of this argument we consider four generic risk management standards issued at the institutional, national or international level...

  12. Elemental analyses of groundwater: demonstrated advantage of low-flow sampling and trace-metal clean techniques over standard techniques

    Science.gov (United States)

    Creasey, C. L.; Flegal, A. R.

    The combined use of both (1) low-flow purging and sampling and (2) trace-metal clean techniques provides more representative measurements of trace-element concentrations in groundwater than results derived with standard techniques. The use of low-flow purging and sampling provides relatively undisturbed groundwater samples that are more representative of in situ conditions, and the use of trace-element clean techniques limits the inadvertent introduction of contaminants during sampling, storage, and analysis. When these techniques are applied, resultant trace-element concentrations are likely to be markedly lower than results based on standard sampling techniques. In a comparison of data derived from contaminated and control groundwater wells at a site in California, USA, trace-element concentrations from this study were 2-1000 times lower than those determined by the conventional techniques used in sampling of the same wells prior to (5 months) and subsequent to (1 month) the collections for this study. Specifically, the cadmium and chromium concentrations derived using standard sampling techniques exceed the California Maximum Contaminant Levels (MCL), whereas in this investigation concentrations of both of those elements are substantially below their MCLs. Consequently, the combined use of low-flow and trace-metal clean techniques may preclude erroneous reports of trace-element contamination in groundwater.

  13. Post-voiding residual urine and capacity increase in orthotopic urinary diversion: Standard vs modified technique

    Directory of Open Access Journals (Sweden)

    Bančević Vladimir

    2010-01-01

    Background/Aim. Ever since the first orthotopic urinary diversion (pouch) was performed there has been constant improvement and modification of surgical techniques. The aim has been to create a urinary reservoir similar to the normal bladder, to decrease the incidence of postoperative complications and to provide improved quality of life. The aim of this study was to compare post-voiding residual urine (PVR) and capacity of pouches constructed by the standard and the modified technique. Methods. In this prospective and partially retrospective clinical study we included 79 patients. In a group of 41 patients (group ST), the pouch was constructed using 50-70 cm of the ileum (standard technique). In a group of 38 patients (group MT), the pouch was constructed using 25-35 cm of the ileum (modified technique). Postoperatively, PVR and pouch capacity were measured by ultrasound at 3, 6 and 12 months. Results. An increase in PVR and pouch capacity was noticed postoperatively in both groups. Twelve months postoperatively, PVR was significantly smaller in group MT than in group ST [23 (0-90) mL vs 109 (0-570) mL, p < 0.001]. In the same period the pouch capacity was significantly smaller in the MT group than in the ST group [460 (290-710) mL vs 892 (480-2,050) mL, p < 0.001]. Conclusion. An increase in PVR and pouch capacity was noticed over the 12-month postoperative period. A year after the operation the pouch created from the shorter ileal segment had reached the capacity of a 'normal' bladder with a small PVR, whereas the pouch created by the standard technique developed an unnecessarily large PVR and capacity.

  14. Adopting HLA standard for interdependency study

    International Nuclear Information System (INIS)

    Nan, Cen; Eusgeld, Irene

    2011-01-01

    In recent decades, modern Critical Infrastructure (CI) has become increasingly automated and interlinked as more and more resources and information are required to maintain its day-to-day operation. A system failure, or even just a service debilitation, of any CI may have significant adverse effects on other infrastructures it is connected/interconnected with. It is vital to study the interdependencies within and between CIs and provide advanced modeling and simulation techniques in order to prevent or at least minimize these adverse effects. The key limitation of traditional mathematical models such as complex network theory is their lacking the capabilities of providing sufficient insights into interrelationships between CIs due to the complexities of these systems. A comprehensive method, a hybrid approach combining various modeling/simulation techniques in a distributed simulation environment, is presented in this paper. High Level Architecture (HLA) is an open standard (IEEE standard 1516) supporting simulations composed of different simulation components, which can be regarded as the framework for implementing such a hybrid approach. The concept of adopting HLA standard for the interdependency study is still under discussion by many researchers. Whether or not this HLA standard, or even the distributed simulation environment, is able to meet desired model/simulation requirements needs to be carefully examined. This paper presents the results from our experimental test-bed, which recreates the architecture of a typical Electricity Power Supply System (EPSS) with its own Supervisory Control and Data Acquisition (SCADA) system, for the purpose of investigating the capabilities of the HLA technique as a standard to perform interdependency studies.

  15. A Novel Instructional Approach to the Design of Standard Controllers: Using Inversion Formulae

    Science.gov (United States)

    Ntogramatzidis, Lorenzo; Zanasi, Roberto; Cuoghi, Stefania

    2014-01-01

    This paper describes a range of design techniques for standard compensators (Lead-Lag networks and PID controllers) that have been applied to the teaching of many undergraduate control courses throughout Italy over the last twenty years, but that have received little attention elsewhere. These techniques hinge upon a set of simple formulas--herein…

  16. Minilaparoscopic technique for inguinal hernia repair combining transabdominal pre-peritoneal and totally extraperitoneal approaches.

    Science.gov (United States)

    Carvalho, Gustavo L; Loureiro, Marcelo P; Bonin, Eduardo A; Claus, Christiano P; Silva, Frederico W; Cury, Antonio M; Fernandes, Flavio A M

    2012-01-01

    Endoscopic surgical repair of inguinal hernia is currently conducted using 2 techniques: the totally extraperitoneal (TEP) and the transabdominal (TAPP) hernia repair. The TEP procedure is technically advantageous, because of the use of no mesh fixation and the elimination of the peritoneal flap, leading to less postoperative pain and faster recovery. The drawback is that TEP is not performed as frequently, because of its complexity and longer learning curve. In this study, we propose a hybrid technique that could potentially become the gold standard of minimally invasive inguinal hernia surgery. This will be achieved by combining established advantages of TEP and TAPP associated with the precision and cosmetics of minilaparoscopy (MINI). Between January and July 2011, 22 patients were admitted for endoscopic inguinal hernia repair. The combined technique was initiated with TAPP inspection and direct visualization of a minilaparoscopic trocar dissection of the preperitoneum space. A 10-mm trocar was then placed inside the previously dissected preperitoneal space, using the same umbilical TAPP skin incision. Minilaparoscopic retroperitoneal dissection was completed by TEP, and the surgical procedure was finalized with intraperitoneal review and correction of the preperitoneal work. The minilaparoscopic TEP-TAPP combined approach for inguinal hernia is feasible, safe, and allows a simple endoscopic repair. This is achieved by combining features and advantages of both TAPP and TEP techniques using precise and sophisticated MINI instruments. Minilaparoscopic preperitoneal dissection allows a faster and easier creation of the preperitoneal space for the TEP component of the procedure.

  17. Placement of empty catheters for an HDR-emulating LDR prostate brachytherapy technique: comparison to standard intraoperative planning.

    Science.gov (United States)

    Niedermayr, Thomas R; Nguyen, Paul L; Murciano-Goroff, Yonina R; Kovtun, Konstantin A; Neubauer Sugar, Emily; Cail, Daniel W; O'Farrell, Desmond A; Hansen, Jorgen L; Cormack, Robert A; Buzurovic, Ivan; Wolfsberger, Luciant T; O'Leary, Michael P; Steele, Graeme S; Devlin, Philip M; Orio, Peter F

    2014-01-01

    We sought to determine whether placing empty catheters within the prostate and then inverse planning iodine-125 seed locations within those catheters (High Dose Rate-Emulating Low Dose Rate Prostate Brachytherapy [HELP] technique) would improve concordance between planned and achieved dosimetry compared with a standard intraoperative technique. We examined 30 consecutive low dose rate prostate cases performed by standard intraoperative technique of planning followed by needle placement/seed deposition and compared them to 30 consecutive low dose rate prostate cases performed by the HELP technique. The primary endpoint was concordance between planned percentage of the clinical target volume that receives at least 100% of the prescribed dose/dose that covers 90% of the volume of the clinical target volume (V100/D90) and the actual V100/D90 achieved at Postoperative Day 1. The HELP technique had superior concordance between the planned target dosimetry and what was actually achieved at Day 1 and Day 30. Specifically, target D90 at Day 1 was on average 33.7 Gy less than planned for the standard intraoperative technique but was only 10.5 Gy less than planned for the HELP technique (p < 0.05). Placing empty needles first and optimizing the plan to the known positions of the needles resulted in improved concordance between the planned and the achieved dosimetry to the target, possibly because of elimination of errors in needle placement. Copyright © 2014 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  18. Differences in Approach between Nuclear and Conventional Seismic Standards with regard to Hazard Definition - CSNI Integrity And Ageing Working Group

    International Nuclear Information System (INIS)

    Djaoudi, Ali; Labbe, Pierre; Murphy, Andrew; Kitada, Yoshio

    2008-01-01

    - Introduction of the displacement-based method, considered an advanced technique by the conventional industry (it requires accurate modelling and introduction of the actual nonlinearities, giving a more realistic assessment of design performance), is considered neither efficient nor sufficiently conservative for use with nuclear structures and components. Use of displacement-based methods by the nuclear industry requires the definition of new acceptance criteria. - Considering the differences in approach to safety objectives, hazard definition and methodology, one cannot consider that nuclear seismic hazard determination and design standards lag behind developments in similar standards for conventional facilities

  19. Peyton’s four-step approach for teaching complex spinal manipulation techniques – a prospective randomized trial

    Directory of Open Access Journals (Sweden)

    Gertraud Gradl-Dietsch

    2016-11-01

    Background: The objectives of this prospective randomized trial were to assess the impact of Peyton's four-step approach on the acquisition of complex psychomotor skills and to examine the influence of gender on learning outcomes. Methods: We randomly assigned 95 third to fifth year medical students to an intervention group which received instructions according to Peyton (PG) or a control group, which received conventional teaching (CG). Both groups attended four sessions on the principles of manual therapy and specific manipulative and diagnostic techniques for the spine. We assessed differences in theoretical knowledge (multiple choice (MC) exam) and practical skills (Objective Structured Practical Examination (OSPE)) with respect to type of intervention and gender. Participants took a second OSPE 6 months after completion of the course. Results: There were no differences between groups with respect to the MC exam. Students in the PG group scored significantly higher in the OSPE. Gender had no additional impact. Results of the second OSPE showed a significant decline in competency regardless of gender and type of intervention. Conclusions: Peyton's approach is superior to standard instruction for teaching complex spinal manipulation skills regardless of gender. Skills retention was equally low for both techniques.

  20. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    The automation of environmental laboratory technology and practice has not been as rapid or complete as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analytical work is not great. With the impending remediation of hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, with the future only guaranteeing greater demands. The Contaminant Analysis Automation Program has been initiated by the US government to address these current and future characterization needs through the application of a new robotic paradigm for analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to the nation's environmental problems on-site.

  1. Approaches for quantifying reactive and low-volatility biogenic organic compound emissions by vegetation enclosure techniques - part A.

    Science.gov (United States)

    Ortega, John; Helmig, Detlev

    2008-06-01

    The high reactivity and low vapor pressure of many biogenic volatile organic compounds (BVOC) make it difficult to measure whole-canopy fluxes of BVOC species using common analytical techniques. The most appropriate approach for estimating these BVOC fluxes is to determine emission rates from dynamic vegetation enclosure measurements. After scaling leaf- and branch-level emission rates to the canopy level, these fluxes can then be used in models to determine BVOC influences on atmospheric chemistry and aerosol processes. Previously published reports from enclosure measurements show considerable variation among procedures with limited guidelines or standard protocols to follow. This article reviews this literature and describes the variety of enclosure types, materials, and analysis techniques that have been used to determine BVOC emission rates. The current review article is followed by a companion paper which details a comprehensive enclosure technique that incorporates both recommendations from the literature as well as insight gained from theoretical calculations and practical experiences. These methods have yielded new BVOC emission data for highly reactive monoterpenes (MT) and sesquiterpenes (SQT) from a variety of vegetation species.

  2. To Teach Standard English or World Englishes? A Balanced Approach to Instruction

    Science.gov (United States)

    Farrell, Thomas S. C.; Martin, Sonia

    2009-01-01

    This article suggests that English language teachers should consider all varieties of English, not just British Standard English or American Standard English. In order to better prepare students for the global world, and to show them that their own English is valued, teachers can implement a balanced approach that incorporates the teaching and…

  3. A Creative Approach to the Common Core Standards: The Da Vinci Curriculum

    Science.gov (United States)

    Chaucer, Harry

    2012-01-01

    "A Creative Approach to the Common Core Standards: The Da Vinci Curriculum" challenges educators to design programs that boldly embrace the Common Core State Standards by imaginatively drawing from the genius of great men and women such as Leonardo da Vinci. A central figure in the High Renaissance, Leonardo made extraordinary contributions as a…

  4. Integrated approach to model decomposed flow hydrograph using artificial neural network and conceptual techniques

    Science.gov (United States)

    Jain, Ashu; Srinivasulu, Sanaga

    2006-02-01

    This paper presents the findings of a study aimed at decomposing a flow hydrograph into different segments based on physical concepts in a catchment, and modelling the different segments using different techniques, viz. conceptual techniques and artificial neural networks (ANNs). An integrated modelling framework is proposed, capable of modelling infiltration, base flow, evapotranspiration, soil moisture accounting, and certain segments of the decomposed flow hydrograph using conceptual techniques, and the complex, non-linear, and dynamic rainfall-runoff process using the ANN technique. Specifically, five different multi-layer perceptron (MLP) and two self-organizing map (SOM) models have been developed. The rainfall and streamflow data derived from the Kentucky River catchment were employed to test the proposed methodology and develop all the models. The performance of all the models was evaluated using seven different standard statistical measures. The results obtained in this study indicate that (a) the rainfall-runoff relationship in a large catchment consists of at least three or four different mappings corresponding to different dynamics of the underlying physical processes, (b) an integrated approach that models the different segments of the decomposed flow hydrograph using different techniques is better than a single ANN in modelling the complex, dynamic, non-linear, and fragmented rainfall-runoff process, (c) a simple model based on the concept of flow recession is better than an ANN at modelling the falling limb of a flow hydrograph, and (d) decomposing a flow hydrograph into different segments corresponding to different dynamics based on physical concepts is better than the soft decomposition performed using a SOM.
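
    The modelling framework itself is not reproduced here; purely as an illustration of the ANN component described above, the sketch below fits a multi-layer perceptron to one segment of a synthetic rainfall-runoff series. The data, lag structure, and network size are assumptions for the example, not the study's Kentucky River setup.

    ```python
    # Illustrative sketch (assumed, synthetic data): an MLP mapping lagged rainfall
    # and antecedent flow to current flow, standing in for one ANN segment of the
    # decomposed hydrograph. The study's actual inputs and architecture differ.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    rain = rng.gamma(2.0, 2.0, size=500)                      # synthetic daily rainfall
    flow = np.convolve(rain, [0.3, 0.4, 0.2, 0.1])[:500]      # crude linear runoff response

    t = np.arange(3, len(flow))
    X = np.column_stack([rain[t], rain[t - 1], rain[t - 2], flow[t - 1]])
    y = flow[t]

    scaler = StandardScaler().fit(X[:400])
    mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    mlp.fit(scaler.transform(X[:400]), y[:400])               # calibration period
    print("validation R^2:", mlp.score(scaler.transform(X[400:]), y[400:]))
    ```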

  5. Radiographic analysis of the temporomandibular joint by the standardized projection technique

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1983-01-01

    The purpose of this study was to investigate the radiographic images of the condylar head in clinically normal subjects and in TMJ patients using a standardized projection technique. 45 subjects who had no clinical evidence of TMJ problems and 96 patients who had clinical evidence of TMJ problems were evaluated; patients who had fracture, trauma or tumor in the TMJ area were excluded from this study. For the evaluation of radiographic images, the author observed the condylar head positions in the closed mouth and 2.54 cm open mouth positions taken by the standardized transcranial oblique lateral projection technique. The results were as follows: 1. In the closed mouth position, the crest of the condylar head took a relatively posterior position to the deepest point of the glenoid fossa in 8.9% of the normals and in 26.6% of TMJ patients. 2. In the 2.54 cm open mouth position, the condylar head took a relatively posterior position to the articular eminence in 2.2% of TMJ patients and 39.6% of the normals. 3. In the open mouth position, the horizontal distance from the deepest point of the glenoid fossa to the condylar head was 13.96 mm in the normals and 10.68 mm in TMJ patients. 4. The distance of true movement of the condylar head was 13.49 mm in the normals and 10.27 mm in TMJ patients. 5. The deviation of the mandible in TMJ patients was slightly greater than that of the normals.

  6. Compressed air injection technique to standardize block injection pressures.

    Science.gov (United States)

    Tsui, Ban C H; Li, Lisa X Y; Pillay, Jennifer J

    2006-11-01

    Presently, no standardized technique exists to monitor injection pressures during peripheral nerve blocks. Our objective was to determine if a compressed air injection technique, using an in vitro model based on Boyle's law and typical regional anesthesia equipment, could consistently maintain injection pressures below a 1293 mmHg level associated with clinically significant nerve injury. Injection pressures for 20 and 30 mL syringes with various needle sizes (18G, 20G, 21G, 22G, and 24G) were measured in a closed system. A set volume of air was aspirated into a saline-filled syringe and then compressed and maintained at various percentages while pressure was measured. The needle was inserted into the injection port of a pressure sensor, which had attached extension tubing with an injection plug clamped "off". Using linear regression with all data points, the pressure value and 99% confidence interval (CI) at 50% air compression was estimated. The linearity of Boyle's law was demonstrated with a high correlation, r = 0.99, and a slope of 0.984 (99% CI: 0.967-1.001). The net pressure generated at 50% compression was estimated as 744.8 mmHg, with the 99% CI between 729.6 and 760.0 mmHg. The various syringe/needle combinations had similar results. By creating and maintaining syringe air compression at 50% or less, injection pressures will be substantially below the 1293 mmHg threshold considered to be an associated risk factor for clinically significant nerve injury. This technique may allow simple, real-time and objective monitoring during local anesthetic injections while inherently reducing injection speed.
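
    The pressure levels quoted above follow directly from Boyle's law for isothermal compression of the trapped air; a hedged back-of-the-envelope check, assuming the air is aspirated at roughly atmospheric pressure, is:

    ```latex
    % Isothermal compression of trapped air (Boyle's law), assuming the aspirated
    % air starts at roughly atmospheric pressure P_atm ~ 760 mmHg.
    % Compressing to a fraction (1 - f) of the initial volume V_0:
    %   P_abs (1 - f) V_0 = P_atm V_0   =>   P_gauge = P_abs - P_atm = P_atm f / (1 - f)
    \[
      P_{\mathrm{gauge}} = P_{\mathrm{atm}}\,\frac{f}{1-f},
      \qquad
      P_{\mathrm{gauge}}\big|_{f=0.5} = 760\ \mathrm{mmHg}\times\frac{0.5}{0.5} = 760\ \mathrm{mmHg},
    \]
    % which is consistent with the 744.8 mmHg (99% CI 729.6-760.0) estimated from
    % the regression, and well below the 1293 mmHg injury-associated threshold.
    ```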

  7. The natural background approach to setting radiation standards

    International Nuclear Information System (INIS)

    Adler, H.I.; Federow, H.; Weinberg, A.M.

    1979-01-01

    The suggestion has often been made that an additional radiation exposure imposed on humanity as a result of some important activity such as electricity generation would be acceptable if the exposure was 'small' compared to the natural background. In order to make this concept quantitative and objective, we propose that 'small compared with the natural background' be interpreted as the standard deviation (weighted with the exposed population) of the natural background. We believe that this use of the variation in natural background radiation is less arbitrary and requires fewer unfounded assumptions than some current approaches to standard-setting. The standard deviation is an easily calculated statistic that is small compared with the mean value for natural exposures of populations. It is an objectively determined quantity and its significance is generally understood. Its determination does not omit any of the pertinent data. When this method is applied to the population of the USA, it implies that a dose of 20 mrem/year would be an acceptable standard. This is closely comparable to the 25 mrem/year suggested by the Environmental Protection Agency as the maximum allowable exposure to an individual in the general population as a result of the operation of the complete uranium fuel cycle. Other agents for which a natural background exists can be treated in the same way as radiation. In addition, a second method for determining permissible exposure levels for agents other than radiation is presented. This method makes use of the natural background radiation data as a primary standard. Some observations on benzo(a)pyrene, using this latter method, are presented. (author)
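
    The proposed criterion reduces to a population-weighted standard deviation of background dose. A minimal sketch of that calculation, using made-up regional doses and populations rather than the US data analysed in the paper:

    ```python
    # Illustrative sketch: population-weighted mean and standard deviation of
    # natural background dose (mrem/year). The numbers are made up, not US data.
    import numpy as np

    dose = np.array([80.0, 100.0, 125.0, 160.0, 200.0])    # regional background doses
    pop = np.array([40e6, 90e6, 70e6, 20e6, 5e6])          # populations exposed

    w = pop / pop.sum()
    mean = np.sum(w * dose)
    std = np.sqrt(np.sum(w * (dose - mean) ** 2))          # population-weighted SD

    print(f"weighted mean = {mean:.1f} mrem/y, weighted SD = {std:.1f} mrem/y")
    # Under the proposed criterion, an added exposure up to about one weighted SD
    # of the natural background would count as 'small'.
    ```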

  8. Teaching emergent bilingual students flexible approaches in an era of new standards

    CERN Document Server

    Proctor, C Patrick; Hiebert, Elfrieda H

    2016-01-01

    Recent educational reform initiatives such as the Common Core State Standards (CCSS) largely fail to address the needs--or tap into the unique resources--of students who are developing literacy skills in both English and a home language. This book discusses ways to meet the challenges that current standards pose for teaching emergent bilingual students in grades K-8. Leading experts describe effective, standards-aligned instructional approaches and programs expressly developed to promote bilingual learners' academic vocabulary, comprehension, speaking, writing, and content learning. Innovative

  9. Video-assisted thoracoscopic surgery (VATS) lobectomy using a standardized anterior approach

    DEFF Research Database (Denmark)

    Hansen, Henrik Jessen; Petersen, René Horsleben; Christensen, Merete

    2011-01-01

    Lobectomy using video-assisted thoracoscopic surgery (VATS) still is a controversial operation despite its many observed benefits. The controversy may be due to difficulties performing the procedure. This study addresses a standardized anterior approach facilitating the operation....

  10. A systems approach to accepted standards of care: Shifting the blame

    Directory of Open Access Journals (Sweden)

    David G. Glance

    2011-09-01

    In healthcare, from a legal perspective, the standard of acceptable practice has generally been set by the courts and defined as healthcare professionals acting in a manner that is widely accepted by their peers as meeting an acceptable standard of care. This view, however, reflects the state of how practice "is" rather than what it "ought to be". What it ought to be depends on whether you take a "person" or "system" oriented approach to practice. The increasing pressures of lack of money and resources, and an ever-increasing need for care, are bringing pressure on the health services to move to a system approach, and this is gaining acceptance both with clinicians and thus eventually the courts. A systems-type approach to healthcare will, by necessity, embrace clinical protocols and guidelines supported by clinical information systems. It will also see blame for errors shifting from clinicians to the organisations that employ them. This paper argues that a continued use of a person-based approach to healthcare, developed through an historical record of practice by individual clinicians, is no longer an adequate defence in a case of supposed negligence. When the healthcare system has codified clinical guidelines and digital data gathered across thousands of clinicians and their patients, it is possible to compute adequate levels of care and expect clinicians and the healthcare system in general to meet these minimum standards. Future negligence decisions will rely on a systems-based best-practice standard of care determined through evidence rather than opinion.

  11. Developing standardized connection analysis techniques for slim hole core rod designs

    International Nuclear Information System (INIS)

    Fehr, G.; Bailey, E.I.

    1994-01-01

    Slim hole core rod design remains essentially in the proprietary domain. API standardization provides the ability to perform engineering analyses and dimensional inspections through the use of documents, i.e., Specifications, Bulletins, and Recommended Practices. In order to provide similar engineering capability for non-API slim hole connections, this paper develops the initial phase of what may evolve into an engineering tool to provide at least an indication of relative serviceability between two connection styles for a given application. The starting point for this process will look at bending strength ratios and connection strength calculations. Since empirical data are still needed to verify the approaches proposed in this paper, it is recognized that the alternatives presented here are only a first step toward developing useful rules of thumb which may lead to later standardization.

  12. Competency mapping and visualisation techniques in change management

    OpenAIRE

    Schöpfel , Joachim; Creusot , Jacques

    2008-01-01

    Purpose: The article describes techniques that may facilitate change management in the library. Approach: The paper is based on practical experience and evidence from the INIST library department in France. Findings: Based on standard inventories of LIS professions and competencies, we present techniques for the mapping and visualisation of individual or team-centred job functions and skills. These techniques can help facilitate communication, information and participation and are useful ...

  13. Application of the Sampling Selection Technique in Approaching Financial Audit

    Directory of Open Access Journals (Sweden)

    Victor Munteanu

    2018-03-01

    In his professional approach, the financial auditor has a wide range of working techniques, including selection techniques. They are applied depending on the nature of the information available to the financial auditor, the manner in which it is presented (paper or electronic format), and, last but not least, the time available. Several techniques are applied, successively or in parallel, to increase the reliability of the opinion expressed and to provide the audit report with a solid basis of information. Sampling is used in the phase of controlling or clarifying an identified error. The main purpose is to corroborate or measure the degree of risk detected following a pertinent analysis. Since the auditor does not have the time or means to thoroughly rebuild the information, the sampling technique can provide an effective response to the need for valorization.

  14. Longitudinal analysis of standardized test scores of students in the Science Writing Heuristic approach

    Science.gov (United States)

    Chanlen, Niphon

    The purpose of this study was to examine the longitudinal impacts of the Science Writing Heuristic (SWH) approach on student science achievement measured by the Iowa Test of Basic Skills (ITBS). A number of studies have reported positive impacts of inquiry-based instruction on student achievement, critical thinking skills, reasoning skills, attitude toward science, etc. So far, studies have focused on exploring how an intervention affects student achievement using teacher/researcher-generated measurements. Only a few studies have attempted to explore the long-term impacts of an intervention on student science achievement measured by standardized tests. The students' science and reading ITBS data were collected from 2000 to 2011 from a school district which had adopted the SWH approach as the main approach in science classrooms since 2002. The data consisted of 12,350 data points from 3,039 students. A multilevel model for change with a discontinuity in elevation and slope was used to analyze changes in student science achievement growth trajectories before and after adopting the SWH approach. The results showed that the SWH approach positively impacted students by initially raising science achievement scores. The initial impact was maintained and gradually increased when students were continuously exposed to the SWH approach. Disadvantaged students who were at risk of low science achievement benefited more from experience with the SWH approach. As a result, existing problematic achievement gaps were narrowed. Moreover, students who began experiencing the SWH approach as early as elementary school seemed to have better science achievement growth than students who first experienced the SWH approach in high school. The results found in this study not only confirmed the positive impacts of the SWH approach on student achievement, but also demonstrated additive impacts found when students had longitudinal experiences
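
    The "multilevel model for change with discontinuity in elevation and slope" is, in essence, a mixed-effects growth model with an indicator for SWH adoption and its interaction with time. A minimal sketch of such a model, with a hypothetical file and column names rather than the district's actual ITBS data, using statsmodels:

    ```python
    # Hypothetical sketch of a multilevel growth model with a discontinuity in
    # elevation and slope at SWH adoption. Assumed columns: student (id),
    # score (ITBS science), year (centered at adoption), post (0/1 after adoption).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("itbs_scores.csv")       # placeholder file name

    model = smf.mixedlm(
        "score ~ year + post + post:year",    # jump in level (post), change in slope (post:year)
        data=df,
        groups=df["student"],                 # random effects grouped by student
        re_formula="~year",                   # random intercept and slope
    )
    print(model.fit().summary())
    ```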

  15. Household energy consumption versus income and relative standard of living: A panel approach

    International Nuclear Information System (INIS)

    Joyeux, Roselyne; Ripple, Ronald D.

    2007-01-01

    Our fundamental premise is that energy consumption at the household level is a key indicator of standard of living. We employ state-of-the-art panel cointegration techniques to evaluate the nature of the relationship between income measures and energy consumption measures for seven East Indian Ocean countries. The general finding is that income and household electricity consumption are not cointegrated. Given this finding, we conclude that standard of living measures that rely on income measures and do not include household-level energy consumption information will necessarily miss important indications of both levels and changes of standard of living
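
    The panel cointegration statistics used in the study are not part of the standard Python statsmodels API; purely to illustrate the underlying idea, the sketch below runs country-by-country Engle-Granger cointegration tests between log income and log household electricity use on a hypothetical panel file. This is a simplification, not the paper's method.

    ```python
    # Illustrative only: per-country Engle-Granger cointegration tests between
    # log income and log household electricity consumption. The study itself used
    # panel cointegration techniques, which are not implemented here.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import coint

    panel = pd.read_csv("household_energy_panel.csv")   # hypothetical file with columns
                                                         # country, year, income, elec

    for country, g in panel.groupby("country"):
        g = g.sort_values("year")
        tstat, pvalue, _ = coint(np.log(g["elec"]), np.log(g["income"]))
        print(f"{country}: t = {tstat:.2f}, p = {pvalue:.3f}")
    # A high p-value (failure to reject 'no cointegration') is consistent with the
    # paper's finding that income and household electricity use are not cointegrated.
    ```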

  16. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    Science.gov (United States)

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health, as a critical component of prosthesis rehabilitation for individuals with lower limb amputation, is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  17. MATLAB Toolboxes for Reference Electrode Standardization Technique (REST) of Scalp EEG.

    Science.gov (United States)

    Dong, Li; Li, Fali; Liu, Qiang; Wen, Xin; Lai, Yongxiu; Xu, Peng; Yao, Dezhong

    2017-01-01

    The reference electrode standardization technique (REST) has been increasingly acknowledged and applied as a re-referencing technique to transform actual multi-channel recordings to approximately zero-reference ones in the electroencephalography/event-related potentials (EEG/ERP) community in recent years. However, an easy-to-use toolbox for re-referencing scalp EEG data to the zero reference has been lacking. We have therefore developed two open-source MATLAB toolboxes for REST of scalp EEG. One version of REST is closely integrated into EEGLAB, a popular MATLAB toolbox for processing EEG data, and the other is a batch version that is more convenient and efficient for experienced users. Both are designed to provide ease of use for novice researchers and flexibility for experienced researchers. All versions of the REST toolboxes can be freely downloaded at http://www.neuro.uestc.edu.cn/rest/Down.html, and detailed information including publications, comments and documents on REST can also be found on this website. An example of usage is given with comparative results of REST and the average reference. We hope these user-friendly REST toolboxes will make the relatively novel technique of REST easier to study, especially for applications in various EEG studies.
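
    The toolboxes themselves are MATLAB code; the following is only a conceptual numpy illustration of the re-referencing step they automate, contrasting the simple average reference with a REST-style transform through a head-model lead field. The lead-field matrix here is a random placeholder, not a real head model.

    ```python
    # Conceptual numpy illustration (not the REST toolboxes): average reference vs.
    # a REST-style transform of scalp EEG. G stands for a lead field computed with
    # an infinity reference and G_avg for the same lead field expressed in the
    # recording (average) reference; both are random placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    n_chan, n_sources, n_samples = 32, 300, 1000
    V = rng.standard_normal((n_chan, n_samples))          # recorded EEG, arbitrary reference

    # Average reference: subtract the instantaneous mean across channels.
    V_avg = V - V.mean(axis=0, keepdims=True)

    # REST-style re-referencing: map the average-referenced data back through the
    # head model to approximate a reference at infinity.
    G = rng.standard_normal((n_chan, n_sources))          # placeholder lead field (infinity ref)
    G_avg = G - G.mean(axis=0, keepdims=True)             # lead field in average reference
    V_rest = G @ np.linalg.pinv(G_avg) @ V_avg

    print(V_avg.shape, V_rest.shape)                      # (32, 1000) (32, 1000)
    ```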

  18. Deep round window insertion versus standard approach in cochlear implant surgery.

    Science.gov (United States)

    Nordfalk, Karl Fredrik; Rasmussen, Kjell; Bunne, Marie; Jablonski, Greg Eigner

    2016-01-01

    The aim of this study was to compare the outcomes of vestibular tests and the residual hearing of patients who have undergone full-insertion cochlear implant surgery using the round window approach with a hearing preservation protocol (RW-HP) or the standard cochleostomy approach (SCA) without hearing preservation. A prospective study of 34 adults who underwent unilateral cochlear implantation was carried out. One group was operated on using the RW-HP approach (n = 17) with the Med-El +Flex(SOFT) electrode array with full insertion, while the control group underwent more conventional SCA surgery (n = 17) with shorter perimodiolar electrodes. Assessments of residual hearing, cervical vestibular-evoked myogenic potentials (cVEMP), videonystagmography, and subjective visual vertical/horizontal (SVH/SVV) were performed before and after surgery. There was a significantly (p < 0.05) greater number of subjects who exhibited complete or partial hearing preservation in the deep-insertion RW-HP group (9/17) compared to the SCA group (2/15). A higher degree of vestibular loss but a lower degree of vertigo symptoms could be seen in the RW-HP group, but the differences were not statistically significant. It is possible to preserve residual hearing to a certain extent even with deep insertion. Full insertion with hearing preservation was less harmful to residual hearing, particularly at 125 Hz (p < 0.05), than the standard cochleostomy approach.

  19. IAEA safety standards and approach to safety of advanced reactors

    International Nuclear Information System (INIS)

    Gasparini, M.

    2004-01-01

    The paper presents an overview of the IAEA safety standards, including their overall structure and purpose. A detailed presentation is devoted to the general approach to safety that is embodied in the current safety requirements for the design of nuclear power plants. A safety approach is proposed for the future. This approach can be used as a reference for safe design, for safety assessment and for the preparation of safety requirements. The method proposes an integration of deterministic and risk-informed concepts within the general frame of a generalized concept of safety goals and defence in depth. This methodology may provide a useful tool for the preparation of safety requirements for the design and operation of any kind of reactor, including small and medium-sized reactors with innovative safety features. (author)

  20. Minimally Invasive Surgical Approach to Distal Fibula Fractures: A Technique Tip

    Directory of Open Access Journals (Sweden)

    Tyler A. Gonzalez

    2017-01-01

    Wound complications following ankle fracture surgery are a major concern. Through the use of minimally invasive surgical techniques some of these complications can be mitigated. Recent investigations have reported on percutaneous fixation of distal fibula fractures demonstrating similar radiographic and functional outcomes to traditional open approaches. The purpose of this manuscript is to describe in detail the minimally invasive surgical approach for distal fibula fractures.

  1. A Principles-Based Approach to Teaching International Financial Reporting Standards (IFRS)

    Science.gov (United States)

    Persons, Obeua

    2014-01-01

    This article discusses the principles-based approach that emphasizes a "why" question by using the International Accounting Standards Board (IASB) "Conceptual Framework for Financial Reporting" to question and understand the basis for specific differences between IFRS and U.S. generally accepted accounting principles (U.S.…

  2. VATS Lobectomy: Surgical Evolution from Conventional VATS to Uniportal Approach

    Directory of Open Access Journals (Sweden)

    Diego Gonzalez-Rivas

    2012-01-01

    There is no standardized technique for VATS lobectomy, though most centres use two ports and add a utility incision. However, the procedure can be performed by eliminating the two small ports and using only the utility incision, with similar outcomes. Since 2010, when the uniportal approach was introduced for major pulmonary resection, the technique has been spreading worldwide. The single-port technique provides a direct view of the target tissue. Conventional triple-port triangulation creates a new optical plane with a dihedral or torsional angle that is unfavorable with standard two-dimensional monitors. The parallel instrumentation achieved during the single-port approach mimics the maneuvers performed during open surgery. Furthermore, it represents the least invasive approach possible, and by avoiding the use of a trocar, compression of the intercostal nerve is minimized. Further development of new technologies such as sealing devices for all vessels and fissures, robotic arms that open inside the thorax, and wireless cameras will facilitate the uniportal approach becoming the standard surgical procedure for pulmonary resection in most thoracic departments.

  3. A Risk and Standards Based Approach to Quality Assurance in Australia's Diverse Higher Education Sector

    Science.gov (United States)

    Australian Government Tertiary Education Quality and Standards Agency, 2015

    2015-01-01

    The Australian Government Tertiary Education Quality and Standards Agency's (TEQSA's) role is to assure that quality standards are being met by all registered higher education providers. This paper explains how TEQSA's risk-based approach to assuring higher education standards is applied in broad terms to a diverse sector. This explanation is…

  4. Standard practice for monitoring atmospheric SO2 using the sulfation plate technique

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1997-01-01

    1.1 This practice covers a weighted average effective SO2 level for a 30-day interval through the use of the sulfation plate method, a technique for estimating the effective SO2 content of the atmosphere, and especially with regard to the atmospheric corrosion of stationary structures or panels. This practice is aimed at determining SO2 levels rather than sulfuric acid aerosol or acid precipitation. 1.2 The results of this practice correlate approximately with volumetric SO2 concentrations, although the presence of dew or condensed moisture tends to enhance the capture of SO2 into the plate. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  5. New approaches in intelligent control techniques, methodologies and applications

    CERN Document Server

    Kountchev, Roumen

    2016-01-01

    This volume introduces new approaches in the intelligent control area from the viewpoints of both theory and application. It consists of eleven contributions by prominent authors from all over the world and an introductory chapter. This volume is strongly connected to another volume entitled "New Approaches in Intelligent Image Analysis" (Eds. Roumen Kountchev and Kazumi Nakamatsu). The chapters of this volume are self-contained and include a summary, conclusions and future work. Some of the chapters introduce specific case studies of various intelligent control systems, while others focus on intelligent theory-based control techniques with applications. A remarkable specificity of this volume is that three chapters deal with intelligent control based on paraconsistent logics.

  6. The Suprameatal Approach: A Safe Alternative Surgical Technique for Cochlear Implantation

    NARCIS (Netherlands)

    Postelmans, Job T. F.; Tange, Rinze A.; Stokroos, Robert J.; Grolman, Wilko

    2010-01-01

    Objective: To report on surgical complications arising post-operatively in 104 patients undergoing cochlear implantation surgery using the suprameatal approach (SMA). Second, to examine the advantages and disadvantages of the SMA technique compared with the classic mastoidectomy using the posterior

  7. Female stress urinary incontinence: standard techniques revisited and critical evaluation of innovative techniques

    Science.gov (United States)

    de Riese, Cornelia; de Riese, Werner T. W.

    2003-06-01

    Objectives: The treatment of female urinary incontinence (UI) is a growing health care concern in our aging society. Publications of recent innovations and modifications are creating expectations. This brief review provides some insight and structure regarding indications and expected outcomes for the different approaches. Materials: Data extraction is part of a Medline database search, which was performed for "female stress incontinence" from 1960 until 2000. An additional literature search was performed to cover 2001 and 2002. Outcome data were extracted. Results: (1) INJECTION OF BULKING AGENTS (collagen, synthetic agents): The indication for mucosal coaptation was more clearly defined and in the majority of articles limited to ISD. (2) OPEN COLPOSUSPENSION (Burch, MMK): Best long-term results of all operative procedures, to date considered the gold standard. (3) LAPAROSCOPIC COLPOSUSPENSION (different modifications): Long-term success rates appear dependent on operator skills. There are few long-term data. (4) NEEDLE SUSPENSION (Stamey, Pereyra and modifications): Initial results were equal to Burch with less morbidity, but long-term success rates are worse. (5) SLING PROCEDURES (autologous, synthetic, allogenic graft materials, different modes of support and anchoring, free tapes): The suburethral sling has traditionally been considered a procedure for those in whom suspension had failed and for those with severe ISD. The most current trend shows its use as a primary procedure for SUI. Long-term data beyond 5 years are insufficient. (6) EXTERNAL OCCLUSIVE DEVICES (vaginal sponges and pessaries, urethral inserts): Both vaginal and urethral insert devices can be effective in selected patients. (7) IMPLANTABLE ARTIFICIAL URETHRAL SPHINCTERS: Modifications and improvements of the devices resulted in improved clinical results regarding durability and efficacy. CONCLUSION: (1) The Burch colposuspension is still considered the gold standard in the treatment of female

  8. Narrative Interest Standard: A Novel Approach to Surrogate Decision-Making for People With Dementia.

    Science.gov (United States)

    Wilkins, James M

    2017-06-17

    Dementia is a common neurodegenerative process that can significantly impair decision-making capacity as the disease progresses. When a person is found to lack capacity to make a decision, a surrogate decision-maker is generally sought to aid in decision-making. Typical bases for surrogate decision-making include the substituted judgment standard and the best interest standard. Given the heterogeneous and progressive course of dementia, however, these standards for surrogate decision-making are often insufficient in providing guidance for the decision-making for a person with dementia, escalating the likelihood of conflict in these decisions. In this article, the narrative interest standard is presented as a novel and more appropriate approach to surrogate decision-making for people with dementia. Through case presentation and ethical analysis, the standard mechanisms for surrogate decision-making for people with dementia are reviewed and critiqued. The narrative interest standard is then introduced and discussed as a dementia-specific model for surrogate decision-making. Through incorporation of elements of a best interest standard in focusing on the current benefit-burden ratio and elements of narrative to provide context, history, and flexibility for values and preferences that may change over time, the narrative interest standard allows for elaboration of an enriched context for surrogate decision-making for people with dementia. More importantly, however, a narrative approach encourages the direct contribution from people with dementia in authoring the story of what matters to them in their lives.

  9. Using Narratives to Develop Standards for Leaders: Applying an Innovative Approach in Western Australia

    Science.gov (United States)

    Wildy, Helen; Pepper, Coral

    2005-01-01

    Dissatisfaction with long lists of duties as substitutes for standards led to the innovative application of narratives as an alternative approach to the generation and use of standards for school leaders. This paper describes research conducted over nearly a decade in collaboration with the state education authority in Western Australia,…

  10. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    Science.gov (United States)

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  11. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for classification of human sperm heads are based on relatively small image databases that are not open to the public, and thus no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended to be used as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, which is achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' Kappa Coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited for tackling the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm
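
    The dataset provides images and majority-vote labels only; as a sketch of the kind of baseline experiment described above (not the authors' code), Hu-moment descriptors from OpenCV can be fed to a scikit-learn SVM. The synthetic elliptical "heads" and two-class labels below are placeholders so the example runs end to end.

    ```python
    # Hypothetical baseline sketch: Hu-moment shape descriptors plus an SVM, in the
    # spirit of the descriptor/classifier comparison described above. The synthetic
    # elliptical 'heads' and two-class labels are placeholders, not SCIAN data.
    import cv2
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    masks, labels = [], []
    for _ in range(100):                                   # synthetic head masks
        img = np.zeros((64, 64), np.uint8)
        axes = (10 + int(rng.integers(0, 6)), 6 + int(rng.integers(0, 4)))
        cv2.ellipse(img, (32, 32), axes, float(rng.uniform(0, 180)), 0, 360, 1, -1)
        masks.append(img)
        labels.append(int(axes[0] > 12))                   # crude stand-in for expert labels

    def hu_features(mask):
        hu = cv2.HuMoments(cv2.moments(mask, binaryImage=True)).ravel()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)  # log-scale the moments

    X = np.array([hu_features(m) for m in masks])
    y = np.array(labels)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```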

  12. Novel information theory techniques for phonon spectroscopy

    International Nuclear Information System (INIS)

    Hague, J P

    2007-01-01

    The maximum entropy method (MEM) and spectral reverse Monte Carlo (SRMC) techniques are applied to the determination of the phonon density of states (PDOS) from heat-capacity data. The approach presented here takes advantage of the standard integral transform relating the PDOS with the specific heat at constant volume. MEM and SRMC are highly successful numerical approaches for inverting integral transforms. The formalism and algorithms necessary to carry out the inversion of specific heat curves are introduced, and where possible, I have concentrated on algorithms and experimental details for practical usage. Simulated data are used to demonstrate the accuracy of the approach. The main strength of the techniques presented here is that the resulting spectra are always physical: Computed PDOS is always positive and properly applied information theory techniques only show statistically significant detail. The treatment set out here provides a simple, cost-effective and reliable method to determine phonon properties of new materials. In particular, the new technique is expected to be very useful for establishing where interesting phonon modes and properties can be found, before spending time at large scale facilities
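
    The "standard integral transform" referred to above is, presumably, the usual harmonic-lattice relation between the constant-volume specific heat and the phonon density of states g(ω), written here up to the normalisation of g:

    ```latex
    % Harmonic-lattice relation between specific heat and the phonon density of
    % states g(omega); normalisation of g is absorbed into the prefactor.
    \[
      C_V(T) \;=\; k_B \int_0^{\infty} g(\omega)\,
          \left(\frac{\hbar\omega}{k_B T}\right)^{\!2}
          \frac{e^{\hbar\omega/k_B T}}{\bigl(e^{\hbar\omega/k_B T}-1\bigr)^{2}}\; d\omega ,
    \]
    % so recovering g(omega) from measured C_V(T) is an ill-posed inverse problem,
    % which is why MEM or SRMC regularisation is needed.
    ```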

  13. A STUDY ON ENGLISH TEACHERS’ TEACHING APPROACHES, METHODS, AND TECHNIQUES AT A STATE SENIOR HIGH SCHOOL IN ENREKANG, INDONESIA

    Directory of Open Access Journals (Sweden)

    Ita Sarmita Samad

    2016-11-01

    This research was aimed at identifying the approaches, methods and techniques used by the English teachers at a state senior high school in Enrekang, Indonesia in teaching English as a foreign language. Furthermore, the consistency of the approaches, methods, and techniques was also identified. This research applied an explorative qualitative research design. The subjects were all of the English teachers in that school. They were chosen through a purposive sampling technique. They were interviewed and observed to obtain data regarding their teaching approaches, methods, and techniques. Their lesson plans were copied to gain supporting data. Based on the findings and discussion, the approaches used by teacher 1 were the communicative and behaviorism approaches. Teacher 2 applied systemic functional linguistics and constructivism/cognitivism. Most of the techniques used by teacher 1 reflected the behaviorism approach or principles of the grammar translation method, while the techniques used by teacher 2 reflected both behaviorism and constructivism. With regard to consistency, the English teachers still showed considerable inconsistency, although teacher 2 was more consistent than teacher 1. It is concluded that the two English teachers still need further upgrading regarding approaches, methods, and techniques of teaching English as a foreign language.

  14. European standards and approaches to EMC in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Bardsley, D.J.; Dillingham, S.R.; McMinn, K. [AEA Technology, Dorset (United Kingdom)

    1995-04-01

    Electromagnetic Interference (EMI) arising from a wide range of sources can threaten nuclear power plant operation. The need for measures to mitigate its effects has long been recognised, although there are differences in approaches worldwide. The US industry approaches the problem with comprehensive site surveys defining an envelope of emissions for the environment, whilst the UK nuclear industry defined many years ago generic levels which cover power station environments. Moves to standardisation within the European Community have led to slight changes in the UK approach, in particular in how large systems can be tested. The tests undertaken on UK nuclear plant include tests for immunity to conducted as well as radiated interference. Similar tests are also performed elsewhere in Europe but are not, to the authors' knowledge, commonly undertaken in the USA. Currently work is proceeding on draft international standards under the auspices of the IEC.

  15. Relationship between alveolar bone measured by 125I absorptiometry with analysis of standardized radiographs: 2. Bjorn technique

    International Nuclear Information System (INIS)

    Ortman, L.F.; McHenry, K.; Hausmann, E.

    1982-01-01

    The Bjorn technique is widely used in periodontal studies as a standardized measure of alveolar bone. Recent studies have demonstrated the feasibility of using 125 I absorptiometry to measure bone mass. The purpose of this study was to compare 125 I absorptiometry with the Bjorn technique in detecting small sequential losses of alveolar bone. Four periodontal-like defects of incrementally increasing size were produced in alveolar bone in the posterior segment of the maxilla of a human skull. An attempt was made to sequentially reduce the amount of bone in 10% increments until no bone remained, a through-and-through defect. The bone remaining at each step was measured using 125 I absorptiometry. At each site the 125 I absorptiometry measurements were made at the same location by fixing the photon source to a prefabricated precision-made occlusal splint. This site was just beneath the crest and midway between the borders of two adjacent teeth. Bone loss was also determined by the Bjorn technique. Standardized intraoral films were taken using a custom-fitted acrylic clutch, and bone measurements were made from the root apex to the coronal height of the lamina dura. A comparison of the data indicates that: (1) in early bone loss, less than 30%, the Bjorn technique underestimates the amount of loss, and (2) in advanced bone loss, more than 60%, the Bjorn technique overestimates it

  16. Next generation initiation techniques

    Science.gov (United States)

    Warner, Tom; Derber, John; Zupanski, Milija; Cohn, Steve; Verlinde, Hans

    1993-01-01

    Four-dimensional data assimilation strategies can generally be classified as either current or next generation, depending upon whether they are used operationally or not. Current-generation data-assimilation techniques are those that are presently used routinely in operational-forecasting or research applications. They can be classified into the following categories: intermittent assimilation, Newtonian relaxation, and physical initialization. It should be noted that these techniques are the subject of continued research, and their improvement will parallel the development of next generation techniques described by the other speakers. Next generation assimilation techniques are those that are under development but are not yet used operationally. Most of these procedures are derived from control theory or variational methods and primarily represent continuous assimilation approaches, in which the data and model dynamics are 'fitted' to each other in an optimal way. Another 'next generation' category is the initialization of convective-scale models. Intermittent assimilation systems use an objective analysis to combine all observations within a time window that is centered on the analysis time. Continuous first-generation assimilation systems are usually based on the Newtonian-relaxation or 'nudging' techniques. Physical initialization procedures generally involve the use of standard or nonstandard data to force some physical process in the model during an assimilation period. Under the topic of next-generation assimilation techniques, variational approaches are currently being actively developed. Variational approaches seek to minimize a cost or penalty function which measures a model's fit to observations, background fields and other imposed constraints. Alternatively, the Kalman filter technique, which is also under investigation as a data assimilation procedure for numerical weather prediction, can yield acceptable initial conditions for mesoscale models. The
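
    For reference, the cost (penalty) function minimised by the variational approaches mentioned above typically has the familiar 3D-Var form; the notation below is generic, not taken from this summary:

    ```latex
    % Generic 3D-Var cost function: misfit to the background state x_b and to the
    % observations y, weighted by the background- and observation-error
    % covariance matrices B and R; H is the (possibly nonlinear) observation operator.
    \[
      J(\mathbf{x}) =
        \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
      + \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
      + J_c(\mathbf{x}),
    \]
    % where J_c collects any additional dynamical or balance constraints.
    ```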

  17. Novel secret key generation techniques using memristor devices

    Science.gov (United States)

    Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi

    2016-02-01

    This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost effective and power efficient, since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based Matlab model. It is shown that the generated keys can have a dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms the Triple Data Encryption Standard (3DES) and the Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and study the impacts of process variations. The work proposed in this paper is a milestone towards System On Chip (SOC) memristor-based security.
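
    The paper's algorithm is device-specific and not reproduced here; as a generic illustration of the second step it describes (deriving session keys from a master key plus session parameters), an HMAC-based expand step in Python might look like the following. The bytes standing in for a memristor profile are placeholders.

    ```python
    # Generic illustration (not the paper's algorithm): deriving variable-length
    # session keys from a master key plus per-session parameters using HMAC-SHA256
    # in an HKDF-expand style. The 'memristor profile' bytes are placeholders.
    import hashlib
    import hmac
    import os

    master_key = hashlib.sha256(b"memristor-initial-state-profile").digest()  # placeholder

    def derive_session_key(master: bytes, session_info: bytes, length: int) -> bytes:
        """Expand the master key into `length` bytes bound to session_info."""
        out, block, counter = b"", b"", 1
        while len(out) < length:
            block = hmac.new(master, block + session_info + bytes([counter]),
                             hashlib.sha256).digest()
            out += block
            counter += 1
        return out[:length]

    nonce = os.urandom(16)
    session_key = derive_session_key(master_key, b"session-42|" + nonce, 32)
    print(session_key.hex())
    ```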

  18. Interviewer-Respondent Interactions in Conversational and Standardized Interviewing

    Science.gov (United States)

    Mittereder, Felicitas; Durow, Jen; West, Brady T.; Kreuter, Frauke; Conrad, Frederick G.

    2018-01-01

    Standardized interviewing (SI) and conversational interviewing are two approaches to collect survey data that differ in how interviewers address respondent confusion. This article examines interviewer-respondent interactions that occur during these two techniques, focusing on requests for and provisions of clarification. The data derive from an…

  19. Specific binding assay technique; standardization of reagent

    International Nuclear Information System (INIS)

    Huggins, K.G.; Roitt, I.M.

    1979-01-01

    The standardization of a labelled constituent, such as anti-IgE, for use in a specific binding assay method is disclosed. A labelled ligand, such as IgE, is standardized against a ligand reference substance, such as WHO standard IgE, to determine the weight of IgE protein represented by the labelled ligand. Anti-light chain antibodies are contacted with varying concentrations of the labelled ligand. The ligand is then contacted with the labelled constituent which is then quantitated in relation to the amount of ligand protein present. The preparation of 131 I-labelled IgE is described. Also disclosed is an improved specific binding assay test method for determining the potency of an allergen extract in serum from an allergic individual. The improvement involved using a parallel model system of a second complex which consisted of anti-light chain antibodies, labelled ligand and the standardized labelled constituent (anti-IgE). The amount of standardized labelled constituent bound to the ligand in the first complex was determined, as described above, and the weight of ligand inhibited by addition of soluble allergen was then used as a measure of the potency of the allergen extract. (author)

  20. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
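
    A minimal sketch of the two-step approach, with synthetic body-weight data and a single hypothetical demographic split (the paper derives its archetypal distributions from pooled U.S. population data):

    ```python
    # Illustrative sketch of the two-step approach: (1) fit 'archetypal'
    # distributions to subpopulations defined by a demographic variable, then
    # (2) sample from them according to scenario-specific conditions.
    # Data are synthetic; the paper uses U.S. population body-weight data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Step 1: pooled data partitioned by a demographic variable (here, adult/child).
    bw_adult = rng.lognormal(mean=np.log(75), sigma=0.2, size=5000)   # kg, synthetic
    bw_child = rng.lognormal(mean=np.log(25), sigma=0.3, size=5000)
    archetypes = {
        "adult": stats.lognorm.fit(bw_adult, floc=0),
        "child": stats.lognorm.fit(bw_child, floc=0),
    }

    # Step 2: scenario-specific mixture, e.g. a site where 30% of the exposed
    # population are children.
    def sample_body_weight(n, frac_child=0.3):
        groups = rng.choice(["child", "adult"], size=n, p=[frac_child, 1 - frac_child])
        return np.array([stats.lognorm.rvs(*archetypes[g]) for g in groups])

    bw_scenario = sample_body_weight(1000)
    print(f"scenario BW: mean={bw_scenario.mean():.1f} kg, "
          f"p95={np.percentile(bw_scenario, 95):.1f} kg")
    ```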

  1. Stakeholder approach, Stakeholders mental model: A visualization test with cognitive mapping technique

    Directory of Open Access Journals (Sweden)

    Garoui Nassreddine

    2012-04-01

    The idea of this paper is to determine the mental models of actors in the firm with respect to the stakeholder approach to corporate governance. Cognitive maps are used to visualize these mental models and to show the ways of thinking about, and conceptualizing, the stakeholder approach. The paper takes a corporate governance perspective and discusses the stakeholder model; it also employs a cognitive mapping technique.

  2. Clinical veterinary proteomics: Techniques and approaches to decipher the animal plasma proteome.

    Science.gov (United States)

    Ghodasara, P; Sadowski, P; Satake, N; Kopp, S; Mills, P C

    2017-12-01

    Over the last two decades, technological advancements in the field of proteomics have advanced our understanding of the complex biological systems of living organisms. Techniques based on mass spectrometry (MS) have emerged as powerful tools to contextualise existing genomic information and to create quantitative protein profiles from plasma, tissues or cell lines of various species. Proteomic approaches have been used increasingly in veterinary science to investigate biological processes responsible for growth, reproduction and pathological events. However, the adoption of proteomic approaches by veterinary investigators lags behind that of researchers in the human medical field. Furthermore, in contrast to human proteomics studies, interpretation of veterinary proteomic data is difficult due to the limited protein databases available for many animal species. This review article examines the current use of advanced proteomics techniques for evaluation of animal health and welfare and covers the current status of clinical veterinary proteomics research, including successful protein identification and data interpretation studies. It includes a description of an emerging tool, sequential window acquisition of all theoretical fragment ion mass spectra (SWATH-MS), available on selected mass spectrometry instruments. This newly developed data acquisition technique combines advantages of discovery and targeted proteomics approaches, and thus has the potential to advance the veterinary proteomics field by enhancing identification and reproducibility of proteomics data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Development of Standardized Mobile Tracer Correlation Approach for Large Area Emission Measurements (DRAFT UNDER EPA REVIEW)

    Science.gov (United States)

    Foster-wittig, T. A.; Thoma, E.; Green, R.; Hater, G.; Swan, N.; Chanton, J.

    2013-12-01

    Improved understanding of air emissions from large area sources such as landfills, waste water ponds, open-source processing, and agricultural operations is a topic of increasing environmental importance. In many cases, the size of the area source, coupled with spatial heterogeneity, makes direct (on-site) emission assessment difficult; methane emissions, from landfills for example, can be particularly complex [Thoma et al, 2009]. Recently, whole-facility (remote) measurement approaches based on tracer correlation have been utilized [Scheutz et al, 2011]. The approach uses a mobile platform to simultaneously measure a metered release of a conservative gas (the tracer) along with the target compound (methane in the case of landfills). The known-rate tracer release provides a measure of atmospheric dispersion at the downwind observing location, allowing the area source emission to be determined by a ratio calculation [Green et al, 2010]. Although powerful in concept, the approach has been somewhat limited to research applications due to the complexities and cost of the high-sensitivity measurement equipment required to quantify the part-per-billion levels of tracer and target gas at kilometer-scale distances. The advent of compact, robust, and easy-to-use near-infrared optical measurement systems (such as cavity ring down spectroscopy) allows the tracer correlation approach to be investigated for wider use. Over the last several years, Waste Management Inc., the U.S. EPA, and collaborators have conducted method evaluation activities to determine the viability of a standardized approach through execution of a large number of field measurement trials at U.S. landfills. As opposed to previous studies [Scheutz et al, 2011] conducted at night (optimal plume transport conditions), the current work evaluated realistic use scenarios; these scenarios include execution by non-scientist personnel, daylight operation, and the full range of atmospheric conditions (all plume transport
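
    The "ratio calculation" at the heart of the method is simple; in the form commonly used for mobile tracer correlation (generic notation, not quoted from this abstract), the methane emission rate follows from the metered tracer release rate and the ratio of the integrated above-background plume enhancements:

    ```latex
    % Generic tracer-correlation relation (assumed notation): Q is a mass emission
    % rate, Delta C an above-background mixing ratio integrated across the plume
    % transect, and M a molar mass.
    \[
      Q_{\mathrm{CH_4}}
      \;=\; Q_{\mathrm{tracer}}\;
            \frac{\displaystyle\int \Delta C_{\mathrm{CH_4}}\,\mathrm{d}x}
                 {\displaystyle\int \Delta C_{\mathrm{tracer}}\,\mathrm{d}x}\;
            \frac{M_{\mathrm{CH_4}}}{M_{\mathrm{tracer}}} .
    \]
    ```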

  4. Imaging of stellar surfaces with the Occamian approach and the least-squares deconvolution technique

    Science.gov (United States)

    Järvinen, S. P.; Berdyugina, S. V.

    2010-10-01

    Context. We present in this paper a new technique for the indirect imaging of stellar surfaces (Doppler imaging, DI), in which low signal-to-noise spectral data are improved by the least-squares deconvolution (LSD) method and inverted into temperature maps with the Occamian approach. We apply this technique to both simulated and real data and investigate its applicability for different stellar rotation rates and noise levels in the data. Aims: Our goal is to boost the signal of spots in spectral lines and to reduce the effect of photon noise without losing the temperature information in the lines. Methods: We simulated data from a test star, to which we added different amounts of noise, and employed the inversion technique based on the Occamian approach with and without LSD. In order to be able to infer a temperature map from LSD profiles, we applied the LSD technique for the first time to both the simulated observations and theoretical local line profiles, which remain dependent on temperature and limb angles. We also investigated how the excitation energy of individual lines affects the obtained solution by using three submasks that have lines with low, medium, and high excitation energy levels. Results: We show that our novel approach enables us to overcome the limitations of the two-temperature approximation, which was previously employed for LSD profiles, and to obtain true temperature maps with stellar atmosphere models. The resulting maps agree well with those obtained using the inversion code without LSD, provided the data are noiseless. However, using LSD is only advisable for poor signal-to-noise data. Further, we show that the Occamian technique, both with and without LSD, approaches the surface temperature distribution reasonably well for an adequate spatial resolution. Thus, the stellar rotation rate has a great influence on the result. For instance, in a slowly rotating star, closely situated spots are usually recovered blurred and unresolved, which

  5. Endoscope-assisted approach to excision of branchial cleft cysts.

    Science.gov (United States)

    Teng, Stephanie E; Paul, Benjamin C; Brumm, John D; Fritz, Mark; Fang, Yixin; Myssiorek, David

    2016-06-01

    The purpose of this study is to describe an endoscope-assisted surgical technique for the excision of branchial cleft cysts and compare it to the standard approach. Retrospective case series review. Twenty-seven cases described as branchial cleft excisions performed by a single surgeon at one academic medical center were identified between 2007 and 2014. Twenty-five cases (8 endoscopic, 17 standard approach) were included in the study. Cases were excluded if final pathology was malignant. Patient charts were reviewed, and two techniques were compared through analysis of incision size, operative time, and surgical outcomes. This study showed that the length of incision required for the endoscopic approach (mean = 2.13 ± 0.23) was significantly less than that of the standard approach (mean = 4.10 ± 1.46, P = 0.008) despite the fact that there was no significant difference in cyst size between the two groups (P = 0.09). The other variables examined, including operative time and surgical outcomes, were not significantly different between the two groups. This transcervical endoscope-assisted approach to branchial cleft cyst excision is a viable option for uncomplicated cases. It provides better cosmetic results than the standard approach and does not negatively affect outcomes, increase operative time, or result in recurrence. 4. Laryngoscope, 126:1339-1342, 2016. © 2015 The American Laryngological, Rhinological and Otological Society, Inc.

  6. 77 FR 63763 - Regulatory Capital Rules: Standardized Approach for Risk-Weighted Assets; Market Discipline and...

    Science.gov (United States)

    2012-10-17

    ... BCBS in 2006 and 2009, as well as other proposals set forth in consultative papers of the BCBS. Section... provided in this Federal Register document, which describes the economic impact of the Standardized... in recent consultative papers of the BCBS.\\2\\ In the Standardized Approach NPR, the agencies also...

  7. A standardized approach for determining radiological sabotage targets

    International Nuclear Information System (INIS)

    Gardner, B.H.; Snell, M.K.

    1993-01-01

    The US Department of Energy has required radiological sabotage vulnerability assessments to be conducted for years. However, the exact methodology to be used in this type of analysis still remains somewhat elusive. Therefore, there is tremendous variation in the methodologies and assumptions used to determine release levels and doses potentially affecting the health and safety of the public. In some cases, there are three orders of magnitude difference in results for dispersal of similar materials under similar meteorological conditions. To address this issue, the authors have developed an approach to standardizing radiological sabotage target analysis that starts by addressing basic assumptions and then directs the user to some recommended computerized analytical tools. Results from different dispersal codes are also compared in this analysis

  8. Standardization of Berberis aristata extract through conventional and modern HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh K. Patel

    2012-05-01

    Full Text Available Objective: Berberis aristata (Berberidaceae) is an important medicinal plant found in different regions of the world. It has significant medicinal value in the traditional Indian and Chinese systems of medicine. The aim of the present investigation was the qualitative and quantitative analysis of Berberis aristata extract. Methods: The present study includes phytochemical analysis, solubility testing, heavy metal analysis, an antimicrobial study and quantitative analysis by an HPTLC method. Results: Preliminary phytochemical analysis showed the presence of carbohydrates, glycosides, alkaloids, proteins, amino acids, saponins, tannins and flavonoids. Solubility was found to be 81.90% in water and 84.52% in 50% alcohol. Loss on drying was found to be 5.32%. Total phenol and flavonoid contents were found to be 0.11% and 2.8%, respectively. Levels of lead, arsenic, mercury and cadmium complied with the standard limits. E. coli and Salmonella were found to be absent, whereas the total bacterial count and yeast and mould contents were within the limits. The content of berberine was found to be 13.47% by HPTLC. Conclusions: The results obtained from the present study could be used as a source of valuable information which can play an important role for food scientists, researchers and even consumers with regard to standardization.

  9. Multidimensional Poverty Indices and First Order Dominance Techniques: An Empirical Comparison of Different Approaches

    DEFF Research Database (Denmark)

    Hussain, M. Azhar; Permanyer, Iñaki

    2018-01-01

    techniques (FOD). Our empirical findings suggest that the FOD approach might be a reasonable cost-effective alternative to the United Nations Development Program (UNDP)’s flagship poverty indicator: the Multidimensional Poverty Index (MPI). To the extent that the FOD approach is able to uncover the socio...

  10. New approach for dry formulation techniques for rhizobacteria

    Science.gov (United States)

    Elchin, A. A.; Mashinistova, A. V.; Gorbunova, N. V.; Muratov, V. S.; Kydralieva, K. A.; Jorobekova, Sh. J.

    2009-04-01

    Two beneficial Pseudomonas isolates selected from the rhizosphere of an abundant weed, couch-grass Elytrigia repens (L.) Nevski, have been found to have biocontrol activity. An adequate biocontrol effect requires a high yield and long stability of the bacterial preparation [1], which can be achieved by an effective and stable formulation. This study was aimed at testing various approaches to dry formulation techniques for Pseudomonas-based preparations. To reach this goal, two drying formulation techniques were tested: spray drying, and low-temperature contact-convective drying in a fluidized bed. The optimal temperature parameters for each technique were estimated. The main merits of the selected drying approach are high yield, moderate specific energy expenditure per 1 kg of evaporated moisture, and minimal contact time between the drying product and the drying agent. The technological process for dry formulation included the following stages: obtaining the cell liquids, low-temperature concentrating, and subsequent drying of the concentrate. The preliminary technological stages consist of cultivation of the rhizobacterial cultures and concentration of the cell liquids. The following requirements for the cultivation regime under laboratory conditions were proposed: optimal temperatures of 26-28°C for 3 days, with a concentration of viable cells in the cell liquid of 10¹⁰-10¹¹ cells/g of absolutely dry substance (ADS). For concentrating the cell liquids, vacuum evaporation, which preserves both the rhizobacterial cells and the secondary metabolites of the cell liquid, was used. The concentration process was conducted at the minimum possible temperature, i.e. not above 30-33°C. In this case the concentration of viable cells decreased to 10⁹-10¹⁰ cells/g of ADS. For spray drying, the updated laboratory drier BUCHI 190, intended for drying thermolabile products, was used. The temperatures of the incoming and outgoing air did not exceed

  11. Novel secret key generation techniques using memristor devices

    Directory of Open Access Journals (Sweden)

    Heba Abunahla

    2016-02-01

    Full Text Available This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost-effective and power-efficient since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based MATLAB model. It is shown that the generated keys can have dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms the Triple Data Encryption Standard (3DES) and the Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and to study the impacts of process variations. The work proposed in this paper is a milestone towards System-on-Chip (SOC) memristor-based security.

  12. HEURISTIC APPROACHES FOR PORTFOLIO OPTIMIZATION

    OpenAIRE

    Manfred Gilli, Evis Kellezi

    2000-01-01

    The paper first compares the use of optimization heuristics to the classical optimization techniques for the selection of optimal portfolios. Second, the heuristic approach is applied to problems other than those in the standard mean-variance framework where the classical optimization fails.
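
    For illustration, a minimal threshold-accepting-style sketch of heuristic portfolio selection under a return constraint follows. The asset statistics, penalty weight, and cooling schedule are assumptions made up for the example, and the paper's exact heuristics and constraint set are not reproduced.

    ```python
    import numpy as np

    # Simple threshold-accepting search for portfolio weights that minimize variance
    # subject to a minimum expected return (handled via a penalty term).
    rng = np.random.default_rng(1)
    mu = np.array([0.08, 0.10, 0.12, 0.07])          # assumed expected returns
    cov = np.diag([0.04, 0.09, 0.16, 0.02])          # assumed covariance matrix
    r_min = 0.09                                      # required portfolio return

    def cost(w):
        penalty = 1e3 * max(0.0, r_min - mu @ w) ** 2  # penalize return shortfall
        return w @ cov @ w + penalty

    w = np.full(4, 0.25)                              # start from equal weights
    for tau in np.linspace(0.01, 0.0, 10):            # decreasing acceptance thresholds
        for _ in range(500):
            i, j = rng.choice(4, size=2, replace=False)
            step = min(w[i], rng.uniform(0, 0.1))
            w_new = w.copy()
            w_new[i] -= step                           # shift weight between two assets,
            w_new[j] += step                           # keeping the budget constraint
            if cost(w_new) - cost(w) < tau:            # threshold-accepting rule
                w = w_new

    print("weights:", np.round(w, 3), "return:", round(mu @ w, 4))
    ```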

  13. Comparisons of non-destructive examination standards in the framework of fracture mechanics approach

    International Nuclear Information System (INIS)

    Reale, S.; Corvi, A.

    1993-01-01

    One of the aims of the various Engineering Standards related to Non-destructive Examination (NDE) is to identify and limit some characteristics of defects in a structure, since the degree of damage of a structure can be associated with these defect characteristics. One way that the damage level can be evaluated is by means of Fracture Mechanics. The objective of the present paper is to compare and identify the differences in the flaw acceptance criteria of national NDE Standards so as to suggest some guidelines for a future common European Standard. This paper examines the Standards adopted in France (RCC-MR), Germany (DIN), Italy (ASME) and the UK (BSI). It concentrates on both ultrasonic and radiographic inspection methods. The flaw acceptance criteria in these standards relating to non-destructive tests performed on a component during manufacturing are compared and evaluated by the Fracture Mechanics CEGB R6 procedure. General guidelines and results supporting the significance of the Fracture Mechanics approach are given. (Author)

  14. Strategic approaches and assessment techniques-Potential for knowledge brokerage towards sustainability

    International Nuclear Information System (INIS)

    Sheate, William R.; Partidario, Maria Rosario

    2010-01-01

    The role of science in policy and decision-making has been an issue of intensive debate over the past decade. The concept of knowledge brokerage has been developing in this context contemplating issues of communication, interaction, sharing of knowledge, contribution to common understandings, as well as to effective and efficient action. For environmental and sustainability policy and decision-making the discussion has addressed more the essence of the issue rather than the techniques that can be used to enable knowledge brokerage. This paper aims to contribute to covering this apparent gap in current discussion by selecting and examining empirical cases from Portugal and the United Kingdom that can help to explore how certain environmental and sustainability assessment approaches can contribute, if well applied, to strengthen the science-policy link. The cases show that strategic assessment approaches and techniques have the potential to promote knowledge brokerage, but a conscious effort will be required to design in genuine opportunities to facilitate knowledge exchange and transfer as part of assessment processes.

  15. An alternative to the standard spatial econometric approaches in hedonic house price models

    DEFF Research Database (Denmark)

    von Graevenitz, Kathrine; Panduro, Toke Emil

    2015-01-01

    Omitted, misspecified, or mismeasured spatially varying characteristics are a cause for concern in hedonic house price models. Spatial econometrics or spatial fixed effects have become popular ways of addressing these concerns. We discuss the limitations of standard spatial approaches to hedonic...

  16. Use of variance techniques to measure dry air-surface exchange rates

    Science.gov (United States)

    Wesely, M. L.

    1988-07-01

    The variances of fluctuations of scalar quantities can be measured and interpreted to yield indirect estimates of their vertical fluxes in the atmospheric surface layer. Strong correlations among scalar fluctuations indicate a similarity of transfer mechanisms, which is utilized in some of the variance techniques. The ratios of the standard deviations of two scalar quantities, for example, can be used to estimate the flux of one if the flux of the other is measured, without knowledge of atmospheric stability. This is akin to a modified Bowen ratio approach. Other methods such as the normalized standard-deviation technique and the correlation-coefficient technique can be utilized effectively if atmospheric stability is evaluated and certain semi-empirical functions are known. In these cases, iterative calculations involving measured variances of fluctuations of temperature and vertical wind velocity can be used in place of direct flux measurements. For a chemical sensor whose output is contaminated by non-atmospheric noise, covariances with fluctuations of scalar quantities measured with a very good signal-to-noise ratio can be used to extract the needed standard deviation. Field measurements have shown that many of these approaches are successful for gases such as ozone and sulfur dioxide, as well as for temperature and water vapor, and could be extended to other trace substances. In humid areas, it appears that water vapor fluctuations often have a higher degree of correlation to fluctuations of other trace gases than do temperature fluctuations; this makes water vapor a more reliable companion or “reference” scalar. These techniques provide some reliable research approaches but, for routine or operational measurement, they are limited by the need for fast-response sensors. Also, all variance approaches require some independent means to estimate the direction of the flux.
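
    A minimal sketch of the standard-deviation-ratio estimate described above follows. The synthetic time series and the measured flux value are assumptions made up for the example, not field data.

    ```python
    import numpy as np

    # If the flux of scalar A is measured directly, the flux of scalar B is
    # approximated from the ratio of their standard deviations, with the sign of
    # the estimate taken from the correlation between the two series.
    rng = np.random.default_rng(0)
    n = 36000                                   # e.g. 1 h of 10 Hz samples
    common = rng.standard_normal(n)             # shared turbulent-transport signal
    a = 2.0 * common + 0.3 * rng.standard_normal(n)    # scalar A (e.g. water vapor)
    b = -0.5 * common + 0.2 * rng.standard_normal(n)   # scalar B (e.g. ozone)

    flux_a = 0.15                               # measured flux of A (arbitrary units)
    sign = np.sign(np.corrcoef(a, b)[0, 1])     # flux direction of B relative to A
    flux_b = sign * flux_a * b.std() / a.std()  # sigma-ratio estimate of B's flux

    print(f"estimated flux of B: {flux_b:.4f}")
    ```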

  17. Open Partial Nephrectomy in Renal Cancer: A Feasible Gold Standard Technique in All Hospitals

    Directory of Open Access Journals (Sweden)

    J. M. Cozar

    2008-01-01

    Full Text Available Introduction. Partial nephrectomy (PN) is playing an increasingly important role in localized renal cell carcinoma (RCC) as a true alternative to radical nephrectomy. With the greater experience and expertise of surgical teams, it has become an alternative to radical nephrectomy in young patients when the tumor diameter is 4 cm or less in almost all hospitals, since cancer-specific survival outcomes are similar to those obtained with radical nephrectomy. Materials and Methods. The authors comment on their own experience and review the literature, reporting current indications and outcomes, including complications. The surgical technique of open partial nephrectomy is outlined. Conclusions. Nowadays, open PN is the gold standard technique to treat small renal masses, and all nonablative techniques must pass the test of time to be compared to PN. It is not ethical for patients to undergo radical surgery just because the urologists involved do not have adequate experience with PN. Patients should be involved in the final treatment decision and, when appropriate, referred to specialized centers with experience in open or laparoscopic partial nephrectomies.

  18. [Abdominothoracic esophageal resection according to Ivor Lewis with intrathoracic anastomosis : standardized totally minimally invasive technique].

    Science.gov (United States)

    Runkel, N; Walz, M; Ketelhut, M

    2015-05-01

    The clinical and scientific interest in minimally invasive techniques for esophagectomy (MIE) is increasing; however, the intrathoracic esophagogastric anastomosis remains a surgical challenge and lacks standardization. Surgeons either transpose the anastomosis to the cervical region or perform a hybrid thoracotomy for stapler access. This article reports technical details and early experiences with a completely laparoscopic-thoracoscopic approach for Ivor Lewis esophagectomy without additional thoracotomy. The extent of radical dissection follows clinical guidelines. Laparoscopy is performed with the patient in a beach chair position and thoracoscopy in a left lateral decubitus position using single-lung ventilation. The anvil of the circular stapler is placed transorally into the esophageal stump. The specimen and gastric conduit are exteriorized through a subcostal rectus muscle split incision. The stapler body is placed into the gastric conduit and both are advanced through the abdominal mini-incision transhiatally into the right thoracic cavity, where the anastomosis is constructed. Data were collected prospectively and analyzed retrospectively. A total of 23 non-selected consecutive patients (mean age 69 years, range 46-80 years) with adenocarcinoma (n = 19) or squamous cell carcinoma (n = 4) were surgically treated between June 2010 and July 2013. Neoadjuvant therapy was performed in 15 patients, resulting in 10 partial and 4 complete remissions. There were no technical complications and no conversions. Mean operative time was 305 min (range 220-441 min). The median lymph node count was 16 (range 4-42). An R0 resection was achieved in 91% of patients, and 3 anastomotic leaks occurred, which were successfully managed endoscopically. There were no postoperative deaths. The intrathoracic esophagogastric anastomosis during minimally invasive Ivor Lewis esophagectomy can be constructed in a standardized fashion without an additional thoracotomy

  19. Comparative regulatory approaches for groups of new plant breeding techniques.

    Science.gov (United States)

    Lusser, Maria; Davies, Howard V

    2013-06-25

    This manuscript provides insights into ongoing debates on the regulatory issues surrounding groups of biotechnology-driven 'New Plant Breeding Techniques' (NPBTs). It presents the outcomes of preliminary discussions and in some cases the initial decisions taken by regulators in the following countries: Argentina, Australia, Canada, EU, Japan, South Africa and USA. In the light of these discussions we suggest in this manuscript a structured approach to make the evaluation more consistent and efficient. The issue appears to be complex as these groups of new technologies vary widely in both the technologies deployed and their impact on heritable changes in the plant genome. An added complication is that the legislation, definitions and regulatory approaches for biotechnology-derived crops differ significantly between these countries. There are therefore concerns that this situation will lead to non-harmonised regulatory approaches and asynchronous development and marketing of such crops resulting in trade disruptions. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. A new approach to the combination of IBA techniques and wind back trajectory data to determine source contributions to long range transport of fine particle air pollution

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, David D., E-mail: dcz@ansto.gov.au [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia); Crawford, Jagoda; Stelcer, Eduard; Atanacio, Armand [Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC, NSW 2232 (Australia)

    2012-02-15

    A new approach to link HYSPLIT back trajectories to the source of fine particle pollution as characterised by standard IBA techniques is discussed. The example of the long range transport of desert dust from inland Australia across the eastern coast is used to show that over a 10-year period extreme soil events originated from major agricultural regions some 30% of the time and that dust from known deserts are not always the problem.

  1. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Kevin M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brennan T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Witt, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); DeNeale, Scott T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bevelhimer, Mark S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pries, Jason L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Burress, Timothy A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kao, Shih-Chieh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mobley, Miles H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lee, Kyutae [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Curd, Shelaine L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tsakiris, Achilleas [Univ. of Tennessee, Knoxville, TN (United States); Mooneyham, Christian [Univ. of Tennessee, Knoxville, TN (United States); Papanicolaou, Thanos [Univ. of Tennessee, Knoxville, TN (United States); Ekici, Kivanc [Univ. of Tennessee, Knoxville, TN (United States); Whisenant, Matthew J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Welch, Tim [US Department of Energy, Washington, DC (United States); Rabon, Daniel [US Department of Energy, Washington, DC (United States)

    2017-08-01

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  2. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    Science.gov (United States)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still of a mostly empirical nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.

  3. [A comprehensive approach to designing of magnetotherapy techniques based on the Atos device].

    Science.gov (United States)

    Raĭgorodskiĭ, Iu M; Semiachkin, G P; Tatarenko, D A

    1995-01-01

    The paper shows how to apply a comprehensive approach to designing magnetotherapy techniques based on concomitant exposure to two or more physical factors. It demonstrates the advantages of a running (travelling) magnetic field combined with photostimuli for optimizing physiotherapeutic exposures. The Atos apparatus with the Amblio-1 attachment is used as an example of how to apply this comprehensive approach in ophthalmology.

  4. Developing detection efficiency standards for atom probe tomography

    Science.gov (United States)

    Prosa, Ty J.; Geiser, Brian P.; Lawrence, Dan; Olson, David; Larson, David J.

    2014-08-01

    Atom Probe Tomography (APT) is a near-atomic-scale analytical technique which, due to recent advances in instrumentation and sample preparation techniques, is being used on a variety of 3D applications. Total system detection efficiency is a key parameter for obtaining accurate spatial reconstruction of atomic coordinates from detected ions, but experimental determination of efficiency can be difficult. This work explores new ways to measure total system detection efficiency as well as the specimen characteristics necessary for such measurements. Composite specimens composed of a nickel/chromium multilayer core, National Institute of Standards and Technology Standard Reference Material 2135c, encapsulated with silver, silicon, or nickel were used to demonstrate the suitability of this approach for providing a direct measurement of APT efficiency. Efficiency measurements based on this multilayer encapsulated in nickel are reported.

  5. Dynamic translabial ultrasound versus echodefecography combined with the endovaginal approach to assess pelvic floor dysfunctions: How effective are these techniques?

    Science.gov (United States)

    Murad-Regadas, S M; Karbage, S A; Bezerra, L S; Regadas, F S P; da Silva Vilarinho, A; Borges, L B; Regadas Filho, F S P; Veras, L B

    2017-07-01

    The aim of this study was to evaluate the role of dynamic translabial ultrasound (TLUS) in the assessment of pelvic floor dysfunction and compare the results with echodefecography (EDF) combined with the endovaginal approach. Consecutive female patients with pelvic floor dysfunction were eligible. Each patient was assessed with EDF combined with the endovaginal approach and with TLUS. The diagnostic accuracy of TLUS was evaluated using the results of EDF as the standard for comparison. A total of 42 women were included. Four sphincter defects were identified with both techniques, and EDF clearly showed whether the defect was partial or total and additionally identified the pubovisceral muscle defect. There was substantial concordance regarding normal relaxation and anismus. Perfect concordance was found for rectocele and cystocele. The rectocele depth was measured with TLUS and quantified according to the EDF classification. Fair concordance was found for intussusception. There was no correlation between the displacement of the puborectal muscle at maximum straining on EDF and the displacement of the anorectal junction (ARJ), compared at rest and at maximal straining on TLUS to determine perineal descent (PD). The mean ARJ displacement was similar in patients with normal and those with excessive PD on TLUS. Both modalities can be used to assess pelvic floor dysfunction. EDF using the 3D anorectal and endovaginal approaches showed advantages in the identification of anal sphincter and pubovisceral muscle defects (partial or total). There was good correlation between the two techniques, and a TLUS rectocele classification based on size, corresponding to the established EDF classification, was established.

  6. Rule-based Approach on Extraction of Malay Compound Nouns in Standard Malay Document

    Science.gov (United States)

    Abu Bakar, Zamri; Kamal Ismail, Normaly; Rawi, Mohd Izani Mohamed

    2017-08-01

    A Malay compound noun is a form of words in which two or more words are combined into a single syntactic unit with a specific meaning. A compound noun acts as one unit and is spelled as separate words unless it is an established compound written as a single closed form. A basic characteristic of compound nouns in Malay sentences is the frequency of the word combination in the text itself. Extraction of compound nouns is therefore significant for downstream research such as text summarization, grammar checking, sentiment analysis, machine translation and word categorization. Many research efforts have been proposed for extracting Malay compound nouns using linguistic approaches; most existing methods address the extraction of bi-gram noun+noun compounds, but the results still leave room for improvement. This paper explores a linguistic method for extracting compound nouns from a standard Malay corpus. A standard dataset is used to provide a common platform for evaluating research on the recognition of compound nouns in Malay sentences. This study therefore proposes a modification of the linguistic approach in order to enhance compound noun extraction. Several pre-processing steps are involved, including normalization, tokenization and tagging; the first linguistic step in this study is Part-of-Speech (POS) tagging. Finally, we describe several rules and modify them to capture the most relevant relation between the first word and the second word in order to assist in solving the problems. The effectiveness of the relations used in our study is measured using recall, precision and F1-score, and comparison against the baseline values is essential because it shows whether there has been an improvement
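
    To illustrate the kind of rule referred to above, a minimal sketch of bi-gram noun+noun extraction follows. The tiny POS lexicon and the example sentence are assumptions made up for the illustration; a real system would use a proper Malay POS tagger and a fuller rule set.

    ```python
    import re

    # Toy POS lexicon (assumed): maps a handful of Malay words to coarse tags.
    pos_lexicon = {
        "air": "NOUN", "mata": "NOUN", "lapangan": "NOUN",
        "itu": "DET", "di": "PREP",
    }

    def tokenize(text):
        # Normalization (lower-casing) + tokenization in one step.
        return re.findall(r"\w+", text.lower())

    def extract_noun_noun_bigrams(tokens, lexicon):
        """Keep adjacent word pairs whose POS tags are both NOUN."""
        candidates = []
        for w1, w2 in zip(tokens, tokens[1:]):
            if lexicon.get(w1) == "NOUN" and lexicon.get(w2) == "NOUN":
                candidates.append((w1, w2))
        return candidates

    sentence = "air mata itu di lapangan"          # made-up example sentence
    print(extract_noun_noun_bigrams(tokenize(sentence), pos_lexicon))
    # -> [('air', 'mata')]   ("air mata" = tears, a genuine Malay compound)
    ```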

  7. Legal technique: approaches to section on types

    Directory of Open Access Journals (Sweden)

    І. Д. Шутак

    2015-11-01

    Full Text Available Legal technique is a branch of knowledge about the rules of doing legal work and creating, in the process, a variety of legal documents; it had previously been part of the theory of law. In modern conditions, legal technique is being singled out as a separate branch of legal science focused on solving practical problems. The purpose of this article is to analyze the types of legal technique and, on the basis of theoretical propositions about legal technique, to identify its essential characteristics and types. O. Malko and M. Matuzov consider legal technique to be a set of rules, techniques and methods for the preparation, creation, registration and classification of legal documents, and for their accounting, aimed at their excellence and efficient use. A similar meaning is given to this concept by S. Alekseev, who determines that legal technique is a set of tools and techniques used, in accordance with accepted rules, in the formulation and systematization of legal acts to ensure their perfection. Thus, legal technique is a theoretical and applied legal science which studies the regularities of rational legal practice in the creation, interpretation and implementation of law. With regard to the types of legal technique, different classifications have been proposed in the literature. For example, G. Muromtsev divides the technique used only in the field of law into the technique of law-making (legislative technique), the technique of law enforcement, the technique of interpretation, and the techniques of judicial speech, interrogation and notarial activities. V. Kartashov divides legal technique into law-making and law-applying (law-realization), interpretive and law-systematizing, judicial or investigative, prosecutorial, and the like. Some authors clearly indicate the criterion by which types of legal technique are to be distinguished. Thus, S. Alekseev notes that legal technique is classified, from the point of view of the legal nature of the act involved, into: a) the technique of legal acts; b) the

  8. A review of numerical techniques approaching microstructures of crystalline rocks

    Science.gov (United States)

    Zhang, Yahui; Wong, Louis Ngai Yuen

    2018-06-01

    The macro-mechanical behavior of crystalline rocks including strength, deformability and failure pattern are dominantly influenced by their grain-scale structures. Numerical technique is commonly used to assist understanding the complicated mechanisms from a microscopic perspective. Each numerical method has its respective strengths and limitations. This review paper elucidates how numerical techniques take geometrical aspects of the grain into consideration. Four categories of numerical methods are examined: particle-based methods, block-based methods, grain-based methods, and node-based methods. Focusing on the grain-scale characters, specific relevant issues including increasing complexity of micro-structure, deformation and breakage of model elements, fracturing and fragmentation process are described in more detail. Therefore, the intrinsic capabilities and limitations of different numerical approaches in terms of accounting for the micro-mechanics of crystalline rocks and their phenomenal mechanical behavior are explicitly presented.

  9. Clinical Comparison of Extensile Lateral Approach and Sinus Tarsi Approach Combined with Medial Distraction Technique for Intra-Articular Calcaneal Fractures.

    Science.gov (United States)

    Zhou, Hai-Chao; Yu, Tao; Ren, Hao-Yang; Li, Bing; Chen, Kai; Zhao, You-Guang; Yang, Yun-Feng

    2017-02-01

    To study and compare the clinical outcomes of open reduction and internal fixation via an extensile L-shaped incision and of limited open reduction via the sinus tarsi approach using the medial distraction technique for intra-articular calcaneal fractures. We performed a retrospective review of 65 intra-articular calcaneal fractures treated operatively between March 2012 and February 2015. Patients were divided into two groups: 28 were in the sinus tarsi approach group and 37 were in the extensile lateral approach group. All patients were asked to return for a research visit that included radiography and clinical evaluation. Postoperative function was evaluated using the American Orthopaedic Foot and Ankle Society (AOFAS) ankle and hindfoot score and the visual analogue scale (VAS). No significant difference was found in demographics between the two groups. The corrected value of the calcaneal varus angle differed significantly between the two groups (P < 0.05). Limited open reduction via the sinus tarsi approach for intra-articular calcaneal fractures could reduce the incidence of wound complications effectively, and the medial distraction technique is helpful for correcting the calcaneus varus deformity. © 2017 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.

  10. Applied anatomy of a new approach of endoscopic technique in thyroid gland surgery.

    Science.gov (United States)

    Liu, Hong; Xie, Yong-jun; Xu, Yi-quan; Li, Chao; Liu, Xing-guo

    2012-10-01

    To explore the feasibility and safety of a transtracheal-assisted sublingual approach to totally endoscopic thyroidectomy by studying the anatomical approach and adjacent structures. A total of 5 embalmed adult cadavers from Chengdu Medical College were dissected layer by layer in the cervical region, pharyngeal region, and mandible region, according to the transtracheal-assisted sublingual approach, which was verified from the anatomical approach and planes. A total of 15 embalmed adult cadavers were dissected by arterial vascular casting technique, imaging scanning technique, and thin-layer cryotomy. The vessels and anatomical structures of the thyroid surgical region were then analyzed qualitatively and quantitatively. Three-dimensional visualization of the laryngeal arteries was reconstructed with Autodesk 3ds Max 2010(32). The transtracheal-assisted sublingual approach for totally endoscopic thyroidectomy was simulated on 5 embalmed adult cadavers. The sublingual observation access was located in the middle of the sublingual region. The geniohyoid muscle, mylohyoid seam, and submental triangle were divided in turn in the middle to reach the plane under the platysma muscle. The superficial cervical fascia, anterior body of the hyoid bone, and infrahyoid muscles were passed in sequence to reach the thyroid gland surgical region. The transtracheal operational access passed from the cavitas oris propria, isthmus faucium, subepiglottic region, laryngeal pharynx, and intermediate laryngeal cavity, from the top down, to reach the pars cervicalis tracheae, where a sagittal incision was made in the anterior wall of the cartilagines tracheales to reach the predetermined surgical region. The transtracheal-assisted sublingual approach to totally endoscopic thyroidectomy is anatomically feasible and safe and can be useful in thyroid gland surgery.

  11. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    Science.gov (United States)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI) require increasingly dense high power electronics. To enable these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  12. Surgical management of inverted papilloma: approaching a new standard for surgery.

    Science.gov (United States)

    Carta, Filippo; Blancal, Jean-Philippe; Verillaud, Benjamin; Tran, Hugo; Sauvaget, Elisabeth; Kania, Romain; Herman, Philippe

    2013-10-01

    Inverted papilloma surgery is currently performed primarily with an endoscopic approach, a technique that has a recurrence rate of 12%. However, a recent study reported a recurrence rate of 5% with a strategy based on subperiosteal dissection of the tumor, with limited indications for using an external approach. The aim of this work was to evaluate whether different teams using the same surgical concepts could reproduce the excellent results that were recently reported. This study is a retrospective chart review of 71 consecutive patients with inverted papilloma who were treated during the last 10 years. In all, 80% of the patients were treated using a purely endoscopic approach. The mean follow-up period was 31.6 months. The recurrence rate was 3.3% for cases with at least a 12-month follow-up. This work confirms the results described in recent literature and further supports transnasal endoscopic surgery to manage inverted papilloma. Copyright © 2013 Wiley Periodicals, Inc.

  13. a Holistic Approach for Inspection of Civil Infrastructures Based on Computer Vision Techniques

    Science.gov (United States)

    Stentoumis, C.; Protopapadakis, E.; Doulamis, A.; Doulamis, N.

    2016-06-01

    This work examines the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues. At present, the structural integrity inspection of large-scale infrastructures is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention for autonomous inspection of civil infrastructures. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques of pattern recognition and stereo matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization and in the use of a modified census transformation for stereo matching, along with a binary fusion of two state-of-the-art optimization schemes. The described approach manages to deal with images of harsh radiometry and with severe radiometric differences in the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. Promisingly, the computer vision workflow described in this work can be transferred, with adaptations, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous condition assessment during their operational life cycle.
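
    To make the stereo-matching ingredient concrete, a minimal sketch of a plain census transform with a Hamming matching cost follows. The paper's modified census variant, CNN detector, and fusion of optimization schemes are not reproduced; the toy image pair is an assumption.

    ```python
    import numpy as np

    # Each pixel is described by a bit string recording whether each neighbour in
    # a window is darker than the centre pixel; matching costs between left and
    # right descriptors are Hamming distances.
    def census_transform(img, window=5):
        img = np.asarray(img, dtype=np.float32)
        r = window // 2
        h, w = img.shape
        padded = np.pad(img, r, mode="edge")
        desc = np.zeros((h, w), dtype=np.uint64)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if dy == 0 and dx == 0:
                    continue
                neighbour = padded[r + dy:r + dy + h, r + dx:r + dx + w]
                desc = (desc << np.uint64(1)) | (neighbour < img).astype(np.uint64)
        return desc

    def hamming_cost(desc_left, desc_right):
        """Per-pixel Hamming distance between two census descriptors."""
        x = desc_left ^ desc_right
        return np.array([bin(v).count("1") for v in x.ravel()]).reshape(x.shape)

    left = np.random.default_rng(0).integers(0, 255, (32, 32))
    right = np.roll(left, 2, axis=1)                  # toy 2-pixel disparity
    print(hamming_cost(census_transform(left), census_transform(right)).mean())
    ```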

  14. Portable optical frequency standard based on sealed gas-filled hollow-core fiber using a novel encapsulation technique

    DEFF Research Database (Denmark)

    Triches, Marco; Brusch, Anders; Hald, Jan

    2015-01-01

    A portable stand-alone optical frequency standard based on a gas-filled hollow-core photonic crystal fiber is developed to stabilize a fiber laser to the 13C2H2 P(16) (ν1 + ν3) transition at 1542 nm using saturated absorption. A novel encapsulation technique is developed to permanently seal...

  15. Parallelism measurement for base plate of standard artifact with multiple tactile approaches

    Science.gov (United States)

    Ye, Xiuling; Zhao, Yan; Wang, Yiwen; Wang, Zhong; Fu, Luhua; Liu, Changjie

    2018-01-01

    Nowadays, as workpieces become more precise and more specialized, resulting in more sophisticated structures and higher accuracy requirements for artifacts, higher demands are placed on measuring accuracy and measuring methods. As an important method for obtaining workpiece dimensions, the coordinate measuring machine (CMM) has been widely used in many industries. In the course of calibrating a self-developed CMM with a self-made high-precision standard artifact, the parallelism of the base plate used for fixing the standard artifact was found to be an important factor affecting measurement accuracy. To measure the parallelism of the base plate, three tactile measurement methods were employed, using an existing high-precision CMM, gauge blocks, a dial gauge and a marble platform, and the measurement results were compared. The experimental results show that the final accuracy of all three methods reaches the micron level and meets the measurement requirements. Moreover, the three approaches are suitable for different measurement conditions, providing a basis for rapid, high-precision measurement under different equipment conditions.

  16. Improving the road wear performance of heavy vehicles in South Africa using a performance-based standards approach

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2010-05-01

    Full Text Available of the world to achieve regional harmonisation and effective road use have had limited success. Another approach is to consider performance-based standards (PBS); in this case standards specify the performance required from the operation of a vehicle on a...

  17. A Generalizability Theory Approach to Standard Error Estimates for Bookmark Standard Settings

    Science.gov (United States)

    Lee, Guemin; Lewis, Daniel M.

    2008-01-01

    The bookmark standard-setting procedure is an item response theory-based method that is widely implemented in state testing programs. This study estimates standard errors for cut scores resulting from bookmark standard settings under a generalizability theory model and investigates the effects of different universes of generalization and error…

  18. Approaches for protection standards for ionizing radiation and combustion pollutants

    International Nuclear Information System (INIS)

    Butler, G.C.

    1978-01-01

    The question "Can the approach used for radiation protection standards, i.e., to extrapolate dose-response relationships to low doses, be applied to combustion pollutants?" provided a basis for discussion. The linear, nonthreshold model postulated by ICRP and UNSCEAR for late effects of ionizing radiation is described and discussed. The utility and problems of applying this model to the effects of air pollutants constitute the focus of this paper. The conclusion is that, in the absence of evidence to the contrary, one should assume the same type of dose-effect relation for chemical air pollutants as for ionizing radiation

  19. A comparison of two standard-setting approaches in high-stakes clinical performance assessment using generalizability theory.

    Science.gov (United States)

    Richter Lagha, Regina A; Boscardin, Christy K; May, Win; Fung, Cha-Chi

    2012-08-01

    Scoring clinical assessments in a reliable and valid manner using criterion-referenced standards remains an important issue and directly affects decisions made regarding examinee proficiency. This generalizability study of students' clinical performance examination (CPX) scores examines the reliability of those scores and of their interpretation, particularly according to a newly introduced, "critical actions" criterion-referenced standard and scoring approach. The authors applied a generalizability framework to the performance scores of 477 third-year students attending three different medical schools in 2008. The norm-referenced standard included all station checklist items. The criterion-referenced standard included only those items deemed critical to patient care by a faculty panel. The authors calculated and compared variance components and generalizability coefficients for each standard across six common stations. Norm-referenced scores had moderate generalizability (ρ = 0.51), whereas criterion-referenced scores showed low dependability (φ = 0.20). The estimated 63% of measurement error associated with the person-by-station interaction suggests case specificity. Increasing the number of stations on the CPX from 6 to 24, an impractical solution both for cost and time, would still yield only moderate dependability (φ = 0.50). Though the performance assessment of complex skills, like clinical competence, seems intrinsically valid, careful consideration of the scoring standard and approach is needed to avoid misinterpretation of proficiency. Further study is needed to determine how best to improve the reliability of criterion-referenced scores, by implementing changes to the examination structure, the process of standard-setting, or both.
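
    For readers unfamiliar with the D-study arithmetic behind such projections, a minimal sketch follows. The variance components are placeholders chosen only so that the projected dependability mirrors the pattern reported above (about 0.20 at 6 stations and 0.50 at 24); they are not the study's actual estimates.

    ```python
    # D-study projection of the absolute (phi) coefficient in a person x station
    # design: person variance over person variance plus averaged absolute error.
    def phi_coefficient(var_person, var_station, var_ps_error, n_stations):
        """Dependability (phi) when averaging over n_stations stations."""
        absolute_error = (var_station + var_ps_error) / n_stations
        return var_person / (var_person + absolute_error)

    # Placeholder variance components: person, station, person-by-station + error
    vp, vs, vps = 0.04, 0.33, 0.63

    for n in (6, 12, 24):
        print(n, "stations -> phi =", round(phi_coefficient(vp, vs, vps, n), 2))
    ```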

  20. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
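
    As context for the discussion above, a hedged sketch of two standardizations that appear in this literature follows: scaling by the predictor's standard deviation, and a latent-variable scaling. Neither is claimed to be the single approach the article recommends, and the data are simulated for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    x1 = rng.normal(0, 2.0, n)
    x2 = rng.normal(0, 0.5, n)
    logit = -0.5 + 0.4 * x1 + 1.2 * x2
    y = rng.random(n) < 1 / (1 + np.exp(-logit))      # simulated binary outcome

    # Fit the logistic model by plain Newton iterations (no external packages).
    X = np.column_stack([np.ones(n), x1, x2])
    b = np.zeros(3)
    for _ in range(25):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)
        b += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))

    sd_x = np.array([x1.std(), x2.std()])
    x_standardized = b[1:] * sd_x                      # scale by predictor SDs only
    latent_sd = np.sqrt(np.var(X @ b) + np.pi**2 / 3)  # latent-variable y* scaling
    fully_standardized = b[1:] * sd_x / latent_sd

    print("raw:", b[1:])
    print("x-standardized:", x_standardized)
    print("fully standardized:", fully_standardized)
    ```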

  1. A New Paradigm for Tissue Diagnostics: Tools and Techniques to Standardize Tissue Collection, Transport, and Fixation.

    Science.gov (United States)

    Bauer, Daniel R; Otter, Michael; Chafin, David R

    2018-01-01

    Studying and developing preanalytical tools and technologies for the purpose of obtaining high-quality samples for histological assays is a growing field. Currently, there does not exist a standard practice for collecting, fixing, and monitoring these precious samples. There has been some advancement in standardizing collection for the highest profile tumor types, such as breast, where HER2 testing drives therapeutic decisions. This review examines the area of tissue collection, transport, and monitoring of formalin diffusion and details a prototype system that could be used to help standardize tissue collection efforts. We have surveyed recent primary literature sources and conducted several site visits to understand the most error-prone processes in histology laboratories. This effort identified errors that resulted from sample collection techniques and subsequent transport delays from the operating room (OR) to the histology laboratories. We have therefore devised a prototype sample collection and transport concept. The system consists of a custom data logger and cold transport box and takes advantage of a novel cold + warm (named 2 + 2) fixation method. This review highlights the beneficial aspects of standardizing tissue collection, fixation, and monitoring. In addition, a prototype system is introduced that could help standardize these processes and is compatible with use directly in the OR and from remote sites.

  2. [Standardization of Blastocystis hominis diagnosis using different staining techniques].

    Science.gov (United States)

    Eymael, Dayane; Schuh, Graziela Maria; Tavares, Rejane Giacomelli

    2010-01-01

    The present study was carried out from March to May 2008, with the aim of evaluating the effectiveness of different techniques for diagnosing Blastocystis hominis in a sample of the population attended at the Biomedicine Laboratory of Feevale University, Novo Hamburgo, Rio Grande do Sul. One hundred feces samples from children and adults were evaluated. After collection, the samples were subjected to the techniques of spontaneous sedimentation (HPJ), sedimentation in formalin-ether (Ritchie) and staining by means of Gram and May-Grünwald-Giemsa (MGG). The presence of Blastocystis hominis was observed in 40 samples when staining techniques were used (MGG and Gram), while sedimentation techniques were less efficient (32 positive samples using the Ritchie technique and 20 positive samples using the HPJ technique). Our results demonstrate that HPJ was less efficient than the other methods, thus indicating the need to include laboratory techniques that enable parasite identification on a routine basis.

  3. Interband coding extension of the new lossless JPEG standard

    Science.gov (United States)

    Memon, Nasir D.; Wu, Xiaolin; Sippy, V.; Miller, G.

    1997-01-01

    Due to the perceived inadequacy of current standards for lossless image compression, the JPEG committee of the International Standards Organization (ISO) has been developing a new standard. A baseline algorithm, called JPEG-LS, has already been completed and is awaiting approval by national bodies. The JPEG-LS baseline algorithm, despite being simple, is surprisingly efficient, and provides compression performance that is within a few percent of the best and more sophisticated techniques reported in the literature. Extensive experimentation performed by the authors seems to indicate that an overall improvement of more than 10 percent in compression performance will be difficult to obtain even at the cost of great complexity; at least not with traditional approaches to lossless image compression. However, if we allow inter-band decorrelation and modeling in the baseline algorithm, nearly 30 percent improvement in compression gains for specific images in the test set becomes possible with a modest computational cost. In this paper we propose and investigate a few techniques for exploiting inter-band correlations in multi-band images. These techniques have been designed within the framework of the baseline algorithm, and require minimal changes to the basic architecture of the baseline, retaining its essential simplicity.
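
    To make the baseline concrete, a sketch of the JPEG-LS median edge detector (MED) predictor follows, together with a naive inter-band variant. The inter-band rule here (predicting from the co-located pixel of a reference band corrected by the local gradient) is only an illustrative assumption, not the extension proposed by the authors.

    ```python
    # JPEG-LS (LOCO-I) baseline prediction and a toy inter-band alternative.
    def med_predict(a, b, c):
        """a = left, b = above, c = upper-left neighbour of the current pixel."""
        if c >= max(a, b):
            return min(a, b)
        if c <= min(a, b):
            return max(a, b)
        return a + b - c

    def interband_predict(ref_cur, ref_left, cur_left):
        """Predict the current pixel from the reference band's co-located pixel,
        corrected by the difference observed at the left neighbour."""
        return cur_left + (ref_cur - ref_left)

    # Tiny worked examples
    print(med_predict(100, 104, 98))        # -> 104
    print(interband_predict(120, 118, 95))  # -> 97
    ```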

  4. Approaches, techniques, and information technology systems in the restaurants and foodservice industry: a qualitative study in sales forecasting.

    OpenAIRE

    Green, Yvette N. J.; Weaver, Pamela A.

    2008-01-01

    This is a study of the approaches, techniques, and information technology systems utilized for restaurant sales forecasting in the full-service restaurant segment. Companies were examined using a qualitative research methods design and long interviews to gather information on approaches, techniques, and technology systems utilized in the sales forecasting process. The results of the interviews were presented along with ensuing discussion.

  5. Lightning protection, techniques, applied codes and standards. Vol. 4

    International Nuclear Information System (INIS)

    Mahmoud, M.; Shaaban, H.; Lamey, S.

    1996-01-01

    Lightning is the only natural disaster against which protection is highly effective. Therefore, for the safety of critical installations, specifically nuclear ones, an effective lightning protection system (LPS) is required. The design and installation of LPSs have been addressed by many international codes and standards. In this paper, the various LPSs are discussed and compared, including radioactive air terminals, ionizing air terminals, and terminals equipped with electrical triggering devices. Also, the so-called dissipation array systems are discussed and compared to other systems technically and economically. Moreover, the available international codes and standards related to lightning protection are discussed. Such standards include those published by the National Fire Protection Association (NFPA), the Lightning Protection Institute (LPI), Underwriters Laboratories (UL), and British Standards. Finally, the possibility of developing an Egyptian national standard is discussed

  6. A decoupled power flow algorithm using particle swarm optimization technique

    International Nuclear Information System (INIS)

    Acharjee, P.; Goswami, S.K.

    2009-01-01

    A robust, nondivergent power flow method has been developed using the particle swarm optimization (PSO) technique. The decoupling properties between the power system quantities have been exploited in developing the power flow algorithm. The speed of the power flow algorithm has been improved using a simple perturbation technique. The basic power flow algorithm and the improvement scheme have been designed to retain the simplicity of the evolutionary approach. The power flow is rugged, can determine the critical loading conditions and also can handle the flexible alternating current transmission system (FACTS) devices efficiently. Test results on standard test systems show that the proposed method can find the solution when the standard power flows fail.
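
    As a hedged illustration of the underlying idea (driving the power mismatch to zero with a particle swarm), a toy two-bus example follows. The network data, search bounds, and PSO settings are assumptions for illustration; the paper's decoupling and perturbation schemes are not reproduced.

    ```python
    import numpy as np

    # Toy 2-bus system: find the angle theta (rad) and magnitude V2 (p.u.) at the
    # load bus so that the computed injections match the specified load.
    G, B = 1.0, -10.0            # assumed series admittance of the line (p.u.)
    P_load, Q_load = 0.8, 0.3    # assumed load at bus 2 (p.u.)
    V1 = 1.0                     # slack bus voltage

    def mismatch(x):
        theta, v2 = x
        p_inj = G * v2**2 - v2 * V1 * (G * np.cos(theta) + B * np.sin(theta))
        q_inj = -B * v2**2 + v2 * V1 * (B * np.cos(theta) - G * np.sin(theta))
        return (p_inj + P_load)**2 + (q_inj + Q_load)**2   # squared power mismatch

    # Standard global-best PSO loop
    rng = np.random.default_rng(0)
    n, w, c1, c2 = 30, 0.7, 1.5, 1.5
    pos = rng.uniform([-0.5, 0.8], [0.5, 1.2], size=(n, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([mismatch(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(200):
        r1, r2 = rng.random((n, 1)), rng.random((n, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        f = np.array([mismatch(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()

    print("theta, V2 =", gbest, " mismatch =", mismatch(gbest))
    ```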

  7. Closing the gap: accelerating the translational process in nanomedicine by proposing standardized characterization techniques.

    Science.gov (United States)

    Khorasani, Ali A; Weaver, James L; Salvador-Morales, Carolina

    2014-01-01

    On the cusp of widespread permeation of nanomedicine, academia, industry, and government have invested substantial financial resources in developing new ways to better treat diseases. Materials have unique physical and chemical properties at the nanoscale compared with their bulk or small-molecule analogs. These unique properties have been greatly advantageous in providing innovative solutions for medical treatments at the bench level. However, nanomedicine research has not yet fully permeated the clinical setting because of several limitations. Among these limitations are the lack of universal standards for characterizing nanomaterials and the limited knowledge that we possess regarding the interactions between nanomaterials and biological entities such as proteins. In this review, we report on recent developments in the characterization of nanomaterials as well as the newest information about the interactions between nanomaterials and proteins in the human body. We propose a standard set of techniques for universal characterization of nanomaterials. We also address relevant regulatory issues involved in the translational process for the development of drug molecules and drug delivery systems. Adherence and refinement of a universal standard in nanomaterial characterization as well as the acquisition of a deeper understanding of nanomaterials and proteins will likely accelerate the use of nanomedicine in common practice to a great extent.

  8. The trans-frontal-sinus subcranial approach for removal of large olfactory groove meningiomas: surgical technique and comparison to other approaches.

    Science.gov (United States)

    Boari, Nicola; Gagliardi, Filippo; Roberti, Fabio; Barzaghi, Lina Raffaella; Caputy, Anthony J; Mortini, Pietro

    2013-05-01

    Several surgical approaches have been previously reported for the treatment of olfactory groove meningiomas (OGM).The trans-frontal-sinus subcranial approach (TFSSA) for the removal of large OGMs is described, comparing it with other reported approaches in terms of advantages and drawbacks. The TFSSA was performed on cadaveric specimens to illustrate the surgical technique. The surgical steps of the TFSSA and the related anatomical pictures are reported. The approach was adopted in a clinical setting; a case illustration is reported to demonstrate the feasibility of the described approach and to provide intraoperative pictures. The TFSSA represents a possible route to treat large OGMs. The subcranial approach provides early devascularization of the tumor, direct tumor access from the base without traction on the frontal lobes, good overview of dissection of the optic nerves and anterior cerebral arteries, and dural reconstruction with pedicled pericranial flap. Georg Thieme Verlag KG Stuttgart · New York.

  9. Solving Inverse Kinematics – A New Approach to the Extended Jacobian Technique

    Directory of Open Access Journals (Sweden)

    M. Šoch

    2005-01-01

    Full Text Available This paper presents a brief summary of current numerical algorithms for solving the Inverse Kinematics problem. Then a new approach based on the Extended Jacobian technique is compared with the current Jacobian Inversion method. The presented method is intended for use in the field of computer graphics for animation of articulated structures. 
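
    For context on the Jacobian-based solvers the paper compares, the sketch below implements the baseline damped pseudoinverse (Jacobian-inversion) iteration for a planar two-link arm; the link lengths, damping, step size and target are assumptions, and the extended-Jacobian variant itself is not reproduced here.

```python
# Baseline Jacobian-inversion IK (damped least squares) for a planar 2-link arm.
# The Extended Jacobian method in the record adds task-augmenting rows to
# resolve redundancy; that extension is not shown here.
import numpy as np

L1, L2 = 1.0, 0.8                   # assumed link lengths

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik(target, q0, iters=200, damping=1e-2, step=0.5):
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - fk(q)
        if np.linalg.norm(err) < 1e-6:
            break
        J = jacobian(q)
        # Damped least-squares update: dq = J^T (J J^T + lambda I)^-1 err
        dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
        q += step * dq
    return q

q = ik(target=np.array([1.2, 0.6]), q0=[0.3, 0.3])
print("joint angles:", q, " reached:", fk(q))
```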

  10. Standard molar enthalpy of formation of methoxyacetophenone isomers

    International Nuclear Information System (INIS)

    Amaral, Luísa M.P.F.; Morais, Victor M.F.; Ribeiro da Silva, Manuel A.V.

    2014-01-01

    Highlights: • Experimental and computational energetic study of methoxyacetophenone isomers. • Enthalpies of formation and phase transition determined by calorimetric techniques. • Quantum chemical calculations allowed estimation of enthalpies of formation. • Structure and energy correlations were established. - Abstract: Values of the standard (p° = 0.1 MPa) molar enthalpy of formation of 2′-, 3′- and 4′-methoxyacetophenones were derived from their standard molar energy of combustion, in oxygen, at T = 298.15 K, measured by static bomb combustion calorimetry. The Calvet high temperature vacuum sublimation technique was used to measure the enthalpies of sublimation/vaporization of the compounds studied. The standard molar enthalpies of formation of the three compounds, in the gaseous phase, at T = 298.15 K, have been derived from the corresponding standard molar enthalpies of formation in the condensed phase and the standard molar enthalpies for the phase transition. The results obtained are −(232.0 ± 2.5), −(237.7 ± 2.7) and −(241.1 ± 2.1) kJ·mol⁻¹ for 2′-, 3′- and 4′-methoxyacetophenone, respectively. Standard molar enthalpies of formation were also estimated from different methodologies: the Cox scheme as well as two different computational approaches using density functional theory-based B3LYP and the multilevel G3 methodologies.
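
    The derivation described above amounts to a single thermochemical relation, written out here in the notation conventionally used for such studies (assumed, not quoted from the paper):

```latex
% Gas-phase standard molar enthalpy of formation obtained from the
% condensed-phase value plus the sublimation/vaporization enthalpy,
% all referred to T = 298.15 K and p° = 0.1 MPa.
\Delta_{\mathrm{f}} H^{\circ}_{\mathrm{m}}(\mathrm{g}) =
\Delta_{\mathrm{f}} H^{\circ}_{\mathrm{m}}(\mathrm{cr\ or\ l}) +
\Delta^{\mathrm{g}}_{\mathrm{cr,l}} H^{\circ}_{\mathrm{m}}
```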

  11. A HOLISTIC APPROACH FOR INSPECTION OF CIVIL INFRASTRUCTURES BASED ON COMPUTER VISION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    C. Stentoumis

    2016-06-01

    Full Text Available This work examines the 2D recognition and 3D modelling of concrete tunnel cracks through visual cues. At present, the structural integrity inspection of large-scale infrastructure is mainly performed through visual observations by human inspectors, who identify structural defects, rate them and then categorize their severity. The described approach targets minimum human intervention, for autonomous inspection of civil infrastructure. The shortfalls of existing approaches to crack assessment are addressed by proposing a novel detection scheme. Although efforts have been made in the field, synergies among the proposed techniques are still missing. The holistic approach of this paper exploits state-of-the-art techniques for pattern recognition and stereo matching in order to build accurate 3D crack models. The innovation lies in the hybrid approach to CNN detector initialization and the use of the modified census transformation for stereo matching, along with a binary fusion of two state-of-the-art optimization schemes. The described approach copes with images of harsh radiometry, as well as severe radiometric differences within the stereo pair. The effectiveness of this workflow is evaluated on a real dataset gathered in highway and railway tunnels. Promisingly, the computer vision workflow described in this work can be transferred, with adaptations, to other infrastructure such as pipelines, bridges and large industrial facilities that are in need of continuous state assessment during their operational life cycle.
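
    As a small illustration of one building block named above, the sketch below implements a plain census transform and the Hamming-distance matching cost used in census-based stereo matching; the window size and toy image pair are assumptions, and the paper's modified census transform and fused optimization schemes are not reproduced.

```python
# Census transform + Hamming-distance matching cost, the basic building block
# of census-based stereo matching (the record's *modified* census variant and
# the optimization-scheme fusion are not reproduced here).
import numpy as np

def census_transform(img, window=5):
    """Per-pixel bit string: 1 where a neighbour is darker than the centre pixel."""
    r = window // 2
    h, w = img.shape
    padded = np.pad(img, r, mode="edge")
    desc = np.zeros((h, w), dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            neighbour = padded[r + dy:r + dy + h, r + dx:r + dx + w]
            desc = (desc << np.uint64(1)) | (neighbour < img).astype(np.uint64)
    return desc

def hamming_cost(desc_left, desc_right, d):
    """Matching cost at disparity d: XOR the descriptors and count differing bits."""
    diff = desc_left ^ np.roll(desc_right, d, axis=1)
    bits = np.unpackbits(diff.view(np.uint8).reshape(*diff.shape, 8), axis=-1)
    return bits.sum(axis=-1)

left = np.random.default_rng(0).integers(0, 255, (120, 160)).astype(np.uint8)
right = np.roll(left, -4, axis=1)         # toy right view: scene shifted 4 px left
cl, cr = census_transform(left), census_transform(right)
costs = np.stack([hamming_cost(cl, cr, d) for d in range(16)])
disparity = costs.argmin(axis=0)
print("median disparity:", np.median(disparity))   # ≈ 4 for this toy pair
```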

  12. Novel technique and simple approach for supra-alar region and supra-alar crease correction by supra-alar cinching.

    Science.gov (United States)

    Selvaraj, Loganathan

    2016-01-01

    This technical report describes a simple and innovative surgical technique for supra-alar sidewall region constriction and supra-alar crease attenuation by cinching technique through intraoral approach.

  13. Novel technique and simple approach for supra-alar region and supra-alar crease correction by supra-alar cinching

    OpenAIRE

    Selvaraj, Loganathan

    2016-01-01

    This technical report describes a simple and innovative surgical technique for supra-alar sidewall region constriction and supra-alar crease attenuation by cinching technique through intraoral approach.

  14. Hand-assisted retroperitoneoscopic versus standard laparoscopic donor nephrectomy: HARP-trial

    Directory of Open Access Journals (Sweden)

    Alwayn Ian PJ

    2010-03-01

    Full Text Available Abstract Background Transplantation is the only treatment offering long-term benefit to patients with chronic kidney failure. Live donor nephrectomy is performed on healthy individuals who do not receive direct therapeutic benefit of the procedure themselves. In order to guarantee the donor's safety, it is important to optimise the surgical approach. Recently we demonstrated the benefit of laparoscopic nephrectomy experienced by the donor. However, this method is characterised by higher in hospital costs, longer operating times and it requires a well-trained surgeon. The hand-assisted retroperitoneoscopic technique may be an alternative to a complete laparoscopic, transperitoneal approach. The peritoneum remains intact and the risk of visceral injuries is reduced. Hand-assistance results in a faster procedure and a significantly reduced operating time. The feasibility of this method has been demonstrated recently, but as to date there are no data available advocating the use of one technique above the other. Methods/design The HARP-trial is a multi-centre randomised controlled, single-blind trial. The study compares the hand-assisted retroperitoneoscopic approach with standard laparoscopic donor nephrectomy. The objective is to determine the best approach for live donor nephrectomy to optimise donor's safety and comfort while reducing donation related costs. Discussion This study will contribute to the evidence on any benefits of hand-assisted retroperitoneoscopic versus standard laparoscopic donor nephrectomy. Trial Registration Dutch Trial Register NTR1433

  15. Color standardization and optimization in Whole Slide Imaging

    Directory of Open Access Journals (Sweden)

    Yagi Yukako

    2011-03-01

    Full Text Available Abstract Introduction Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is variance in the protocols and practices of the histology lab, the color displayed can also be affected by variation in capture parameters (for example, illumination and filters), image processing and display factors in the digital systems themselves. Method We have been developing techniques for color validation and optimization along two paths. The first was based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (resembling a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach was based on our previous multispectral imaging research. Discussion As a first step, the two-slide method (above) was used to identify inaccurate display of color and its cause, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality and establish Image Quality Standardization. This paper discusses one of the most important aspects of image quality – color.
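
    One way to quantify the comparison of scanned calibration colors against a reference, as described above, is a per-patch CIEDE2000 color-difference check; the sketch below uses scikit-image for the conversion and metric, with the patch values and tolerance being invented placeholders rather than the paper's calibration targets.

```python
# Quantifying color deviation of scanned calibration patches from reference
# values with CIEDE2000 (scikit-image); patch RGB values are placeholders.
import numpy as np
from skimage.color import rgb2lab, deltaE_ciede2000

# Reference sRGB values of nine hypothetical calibration patches (0-1 range)
reference_rgb = np.random.default_rng(1).uniform(0.1, 0.9, size=(9, 1, 3))
# "Scanned" values: reference plus a small systematic shift (placeholder)
scanned_rgb = np.clip(reference_rgb * 0.97 + 0.02, 0, 1)

reference_lab = rgb2lab(reference_rgb)
scanned_lab = rgb2lab(scanned_rgb)

delta_e = deltaE_ciede2000(reference_lab, scanned_lab).ravel()
for i, de in enumerate(delta_e, start=1):
    flag = "OK" if de < 2.0 else "re-calibrate"   # assumed tolerance
    print(f"patch {i}: dE2000 = {de:.2f}  [{flag}]")
```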

  16. Standard plants, standard outages: the EdF approach

    International Nuclear Information System (INIS)

    Miron, J.L.

    1991-01-01

    At the end of 1990 Electricite de France had carried out a total of 350 PWR refuelling outages. Although the French units are standardized, the routines of the outages are not all the same. The major influences on outages were: setting up new organizations to apply quality assurance regulations; improving systematic experience feedback; incorporating modifications in the outage schedules; and assimilation of computerized maintenance management by the sites. (author)

  17. Four-arm single docking full robotic surgery for low rectal cancer: technique standardization

    Directory of Open Access Journals (Sweden)

    José Reinan Ramos

    Full Text Available The authors present the four-arm single docking full robotic surgery to treat low rectal cancer. The eight main operative steps are: 1- patient positioning; 2- trocars set-up and robot docking; 3- sigmoid colon, left colon and splenic flexure mobilization (lateral-to-medial approach; 4-Inferior mesenteric artery and vein ligation (medial-to-lateral approach; 5- total mesorectum excision and preservation of hypogastric and pelvic autonomic nerves (sacral dissection, lateral dissection, pelvic dissection; 6- division of the rectum using an endo roticulator stapler for the laparoscopic performance of a double-stapled coloanal anastomosis (type I tumor; 7- intersphincteric resection, extraction of the specimen through the anus and lateral-to-end hand sewn coloanal anastomosis (type II tumor; 8- cylindric abdominoperineal resection, with transabdominal section of the levator muscles (type IV tumor. The techniques employed were safe and have presented low rates of complication and no mortality.

  18. NET European Network on Neutron Techniques Standardization for Structural Integrity

    International Nuclear Information System (INIS)

    Youtsos, A.

    2004-01-01

    Improved performance and safety of European energy production systems is essential for providing safe, clean and inexpensive electricity to the citizens of the enlarged EU. The state of the art in assessing internal stresses, micro-structure and defects in welded nuclear components -as well as their evolution due to complex thermo-mechanical loads and irradiation exposure -needs to be improved before relevant structural integrity assessment code requirements can safely become less conservative. This is valid for both experimental characterization techniques and predictive numerical algorithms. In the course of the last two decades neutron methods have proven to be excellent means for providing valuable information required in structural integrity assessment of advanced engineering applications. However, the European industry is hampered from broadly using neutron research due to lack of harmonised and standardized testing methods. 35 European major industrial and research/academic organizations have joined forces, under JRC coordination, to launch the NET European Network on Neutron Techniques Standardization for Structural Integrity in May 2002. The NET collaborative research initiative aims at further development and harmonisation of neutron scattering methods, in support of structural integrity assessment. This is pursued through a number of testing round robin campaigns on neutron diffraction and small angle neutron scattering - SANS and supported by data provided by other more conventional destructive and non-destructive methods, such as X-ray diffraction and deep and surface hole drilling. NET also strives to develop more reliable and harmonized simulation procedures for the prediction of residual stress and damage in steel welded power plant components. This is pursued through a number of computational round robin campaigns based on advanced FEM techniques, and on reliable data obtained by such novel and harmonized experimental methods. The final goal of

  19. AI Techniques for Space: The APSI Approach

    Science.gov (United States)

    Steel, R.; Niézette, M.; Cesta, A.; Verfaille, G.; Lavagna, M.; Donati, A.

    2009-05-01

    This paper will outline the framework and tools developed under the Advanced Planning and Scheduling Initiative (APSI) study performed by VEGA for the European Space Agency in collaboration with three academic institutions: ISTC-CNR, ONERA, and Politecnico di Milano. We start by illustrating the background to APSI and why it was needed, giving a brief summary of the partners in the project and the roles they played. We then take a closer look at what APSI actually consists of, showing the techniques that were used and detailing the framework developed within the scope of the project. We follow this with an elaboration of the three demonstration test scenarios developed as part of the project, illustrating the re-use and synergies between the three cases along the way. We finally conclude with a summary of the pros and cons of the approach devised during the project and outline future directions to be investigated and expanded upon in the context of this work.

  20. The transverse technique; a complementary approach to the measurement of first-trimester uterine artery Doppler.

    Science.gov (United States)

    Drouin, Olivier; Johnson, Jo-Ann; Chaemsaithong, Piya; Metcalfe, Amy; Huber, Janie; Schwarzenberger, Jill; Winters, Erin; Stavness, Lesley; Tse, Ada W T; Lu, Jing; Lim, Wan Teng; Leung, Tak Yeung; Bujold, Emmanuel; Sahota, Daljit; Poon, Liona C

    2017-10-04

    The objectives of this study were to 1) define the protocol for the first-trimester assessment of the uterine artery pulsatility index (UtA-PI) using the new transverse technique, 2) evaluate UtA-PI measured by the transverse approach versus that obtained by the conventional sagittal approach, and 3) determine if accelerated onsite training (both methods) of inexperienced sonographers can achieve reproducible UtA-PI measurements compared to that measured by an experienced sonographer. The study consists of 2 parts conducted in 2 centers (Part 1, Calgary, Canada and Part 2, Hong Kong). Part 1 Prospective observational study of women with singleton pregnancies between 11-13+6 weeks' gestation. UtA-PI measurements were performed using the 2 techniques (4 sonographers trained in both methods, 10 cases each) and measurement indices (PI), time required and subjective difficulty to obtain satisfactory measurements were compared. One sample t-test and Wilcoxon rank sign test was used when appropriate. Bland-Altman difference plots were used to assess measurement agreement, and intra-class correlation (ICC) was used to evaluate measurement reliability. A target plot was used to assess measures of central tendency and dispersion. Part 2 One experienced and three inexperienced sonographers prospectively measured the UtA-PI at 11-13+6 weeks' gestation in two groups of women (42 and 35, respectively), with singleton pregnancies using both approaches. Inexperienced sonographers underwent accelerated on-site training by the experienced sonographer. Measurement approach and sonographer order were on a random basis. ICC, Bland-Altman and Passing-Bablok analyses were performed to assess measurement agreement, reliability and effect of accelerated training. Part 1 We observed no difference in the mean time to acquire the measurements (Sagittal: 118 seconds vs Transverse: 106 seconds, p=0.38). The 4 sonographers reported the transverse technique was subjectively easier to perform (p=0
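
    The agreement statistics named in the abstract are standard; the sketch below shows a minimal Bland-Altman computation (bias and 95% limits of agreement) for paired UtA-PI measurements. The example arrays are invented placeholders, not study data.

```python
# Minimal Bland-Altman agreement analysis for paired measurements
# (e.g., sagittal vs transverse UtA-PI); the numbers are placeholders.
import numpy as np

def bland_altman(a, b):
    """Return bias (mean difference) and 95% limits of agreement."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

sagittal   = [1.62, 1.48, 1.75, 1.55, 1.90, 1.40, 1.68, 1.52]   # placeholder PIs
transverse = [1.58, 1.50, 1.70, 1.60, 1.85, 1.42, 1.66, 1.49]   # placeholder PIs

bias, (lower, upper) = bland_altman(sagittal, transverse)
print(f"bias = {bias:+.3f}, 95% limits of agreement = [{lower:+.3f}, {upper:+.3f}]")
```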

  1. T-Fix endoscopic meniscal repair: technique and approach to different types of tears.

    Science.gov (United States)

    Barrett, G R; Richardson, K; Koenig, V

    1995-04-01

    Endoscopic meniscus repair using the T-Fix suture device (Acufex Microsurgical, Inc, Mansfield, MA) allows ease of suture placement for meniscus stability without the problems associated with ancillary incisions such as neurovascular compromise. It is ideal for the central posterior horn tears that are difficult using conventional techniques. Vertical tears, bucket handle tears, flap tears, and horizontal tears can be approached using a temporary "anchor stitch" to stabilize the meniscus before T-Fix repair. The basic method of repair and our approach to these different types of tears is presented.

  2. Comparison of ankle-brachial index measured by an automated oscillometric apparatus with that by standard Doppler technique in vascular patients

    DEFF Research Database (Denmark)

    Korno, M.; Eldrup, N.; Sillesen, H.

    2009-01-01

    was calculated twice using both methods on both legs. MATERIALS AND METHODS: We tested the automated oscillometric blood pressure device, CASMED 740, for measuring ankle and arm blood pressure and compared it with the current gold standard, the hand-held Doppler technique, by Bland-Altman analysis. ... RESULTS: Using the Doppler-derived ABI as the gold standard, the sensitivity and specificity of the oscillometric method for determining an ABI ...

  3. A transferability study of the EPR-tooth-dosimetry technique

    International Nuclear Information System (INIS)

    Sholom, S.; Chumak, V.; Desrosiers, M.; Bouville, A.

    2006-01-01

    The transferability of a measurement protocol from one laboratory to another is an important feature of any mature, standardised protocol. The electron paramagnetic resonance (EPR)-tooth dosimetry technique that was developed in Scientific Center for Radiation Medicine, AMS (Ukraine) (SCRM) for routine dosimetry of Chernobyl liquidators has demonstrated consistent results in several inter-laboratory measurement comparisons. Transferability to the EPR dosimetry laboratory at the National Inst. of Standards and Technology (NIST) was examined. Several approaches were used to test the technique, including dose reconstruction of SCRM-NIST inter-comparison samples. The study has demonstrated full transferability of the technique and the possibility to reproduce results in a different laboratory environment. (authors)

  4. Wavelength standards in the infrared

    CERN Document Server

    Rao, KN

    2012-01-01

    Wavelength Standards in the Infrared is a compilation of wavelength standards suitable for use with high-resolution infrared spectrographs, including both emission and absorption standards. The book presents atomic line emission standards of argon, krypton, neon, and xenon. These atomic line emission standards are from the deliberations of Commission 14 of the International Astronomical Union, which is the recognized authority for such standards. The text also explains the techniques employed in determining spectral positions in the infrared. One of the techniques used includes the grating con

  5. The Trojan Female Technique: A Novel, Effective and Humane Approach for Pest Population Control

    Energy Technology Data Exchange (ETDEWEB)

    Gemmell, Neil J. [Centre for Reproduction and Genomics and Allan Wilson Centre for Molecular Ecology and Evolution, Department of Anatomy, University of Otago, Dunedin (New Zealand); Jalilzadeh, Aidin [Department of Mathematics and Statistics, University of Otago, Dunedin (New Zealand); Didham, Raphael K. [School of Animal Biology, University of Western Australia (Australia); CSIRO Ecosystem Sciences, Perth, Western Australia (Australia); Soboleva, Tanya [Science and Risk Assessment Directorate, Ministry for Primary Industries, PO Box 2526, Wellington (New Zealand); Tompkins, Daniel M. [Landcare Research, Private Bag 1930, Dunedin (New Zealand); New Zealand Institute for Plant and Food Research Ltd., Christchurch (New Zealand)

    2014-01-15

    Full-text: Humankind's ongoing battle with pest species spans millennia. Pests cause or carry disease, damage or consume food crops and other resources, and drive global environmental change. Conventional approaches to pest management usually involve lethal control, but such approaches are costly, of varying efficiency and often have ethical issues. Thus, pest management via control of reproductive output is increasingly considered an optimal solution. One of the most successful such 'fertility control' strategies developed to date is the sterile male technique (SMT), in which large numbers of sterile males are released into a population each generation. However, this approach is time-consuming, labour- intensive and costly. We use mathematical models to test a new twist on the SMT, using maternally inherited mitochondrial (mtDNA) mutations that affect male, but not female reproductive fitness. 'Trojan females' carrying such mutations, and their female descendants, produce 'sterile-male'-equivalents under natural conditions over multiple generations. We find that the Trojan Female Technique (TFT) has the potential to be a novel humane approach for pest control. Single large releases and relatively few small repeat releases of Trojan females both provided effective and persistent control within relatively few generations. Although greatest efficacy was predicted for high-turnover species, the additive nature of multiple releases made the TFT applicable to the full range of life histories modelled. The extensive conservation of mtDNA among eukaryotes suggests this approach could have broad utility for pest control. (author)

  6. The Trojan female technique: a novel, effective and humane approach for pest population control.

    Science.gov (United States)

    Gemmell, Neil J; Jalilzadeh, Aidin; Didham, Raphael K; Soboleva, Tanya; Tompkins, Daniel M

    2013-12-22

    Humankind's ongoing battle with pest species spans millennia. Pests cause or carry disease, damage or consume food crops and other resources, and drive global environmental change. Conventional approaches to pest management usually involve lethal control, but such approaches are costly, of varying efficiency and often have ethical issues. Thus, pest management via control of reproductive output is increasingly considered an optimal solution. One of the most successful such 'fertility control' strategies developed to date is the sterile male technique (SMT), in which large numbers of sterile males are released into a population each generation. However, this approach is time-consuming, labour-intensive and costly. We use mathematical models to test a new twist on the SMT, using maternally inherited mitochondrial (mtDNA) mutations that affect male, but not female reproductive fitness. 'Trojan females' carrying such mutations, and their female descendants, produce 'sterile-male'-equivalents under natural conditions over multiple generations. We find that the Trojan female technique (TFT) has the potential to be a novel humane approach for pest control. Single large releases and relatively few small repeat releases of Trojan females both provided effective and persistent control within relatively few generations. Although greatest efficacy was predicted for high-turnover species, the additive nature of multiple releases made the TFT applicable to the full range of life histories modelled. The extensive conservation of mtDNA among eukaryotes suggests this approach could have broad utility for pest control.
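
    The mathematical models behind the TFT are not given in the abstract; the sketch below is a toy discrete-generation model in which a maternally inherited mtDNA type reduces the fertility of carrier males but not carrier females, so a single release of Trojan females depresses population growth over subsequent generations. The model structure and every parameter value are illustrative assumptions.

```python
# Toy discrete-generation model of the Trojan Female Technique: a maternally
# inherited mtDNA type reduces the fertility of carrier males but not of
# carrier females. Structure and parameters are illustrative assumptions only.
GENERATIONS = 15
OFFSPRING_PER_FEMALE = 6        # assumed fecundity with a fully fertile mate
TROJAN_MALE_FERTILITY = 0.2     # assumed relative fertility of carrier males
SURVIVAL = 0.4                  # assumed juvenile survival
SEX_RATIO = 0.5                 # proportion of offspring that are female
# Without Trojan males the per-generation growth factor would be
# OFFSPRING_PER_FEMALE * SURVIVAL * SEX_RATIO = 1.2 (a growing pest population).

# Females and males of the wild-type (wt) and Trojan (tr) mtDNA lineages;
# 500 Trojan females are released into a 500 + 500 wild-type population.
females = {"wt": 500.0, "tr": 500.0}
males = {"wt": 500.0, "tr": 0.0}

for gen in range(1, GENERATIONS + 1):
    # Average siring success in the mating pool (random mating assumed)
    mean_male_fertility = (males["wt"] + TROJAN_MALE_FERTILITY * males["tr"]) / (
        males["wt"] + males["tr"])
    # Offspring inherit the mother's mtDNA lineage
    offspring = {k: females[k] * OFFSPRING_PER_FEMALE * mean_male_fertility * SURVIVAL
                 for k in ("wt", "tr")}
    females = {k: SEX_RATIO * v for k, v in offspring.items()}
    males = {k: (1 - SEX_RATIO) * v for k, v in offspring.items()}
    adults = sum(females.values()) + sum(males.values())
    trojan_share = (females["tr"] + males["tr"]) / adults
    print(f"gen {gen:2d}: adults = {adults:9.1f}   Trojan share = {trojan_share:.2f}")
```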

  7. On Alternative Approaches to 3D Image Perception: Monoscopic 3D Techniques

    Science.gov (United States)

    Blundell, Barry G.

    2015-06-01

    In the eighteenth century, techniques that enabled a strong sense of 3D perception to be experienced without recourse to binocular disparities (arising from the spatial separation of the eyes) underpinned the first significant commercial sales of 3D viewing devices and associated content. However following the advent of stereoscopic techniques in the nineteenth century, 3D image depiction has become inextricably linked to binocular parallax and outside the vision science and arts communities relatively little attention has been directed towards earlier approaches. Here we introduce relevant concepts and terminology and consider a number of techniques and optical devices that enable 3D perception to be experienced on the basis of planar images rendered from a single vantage point. Subsequently we allude to possible mechanisms for non-binocular parallax based 3D perception. Particular attention is given to reviewing areas likely to be thought-provoking to those involved in 3D display development, spatial visualization, HCI, and other related areas of interdisciplinary research.

  8. Exploring Chinese cultural standards through the lens of German managers: A case study approach

    Directory of Open Access Journals (Sweden)

    Roger Moser

    2011-06-01

    Full Text Available The ability to understand one’s own culture and to deal with specificities of foreign cultures is one of the core requirements in today’s international business. Management skills are partially culture specific and a management approach that is appropriate in one cultural context may not be appropriate in another. Several business activities of companies nowadays take place abroad, which requires managers to interact with different cultures. This paper aims to analyse cultural characteristics, especially in a Sino-German business context. Based on literature analysis and case study research, relevant cultural standards in China were identified from the German perspective. The result differentiates three superordinate cultural areas and five specific cultural standards and analyses different influence factors on the dimensions of the identified Chinese cultural standards.

  9. Design of Quiet Rotorcraft Approach Trajectories

    Science.gov (United States)

    Padula, Sharon L.; Burley, Casey L.; Boyd, D. Douglas, Jr.; Marcolini, Michael A.

    2009-01-01

    An optimization procedure for identifying quiet rotorcraft approach trajectories is proposed and demonstrated. The procedure employs a multi-objective genetic algorithm in order to reduce noise and create approach paths that will be acceptable to pilots and passengers. The concept is demonstrated by application to two different helicopters. The optimized paths are compared with one another and to a standard 6-deg approach path. The two demonstration cases validate the optimization procedure but highlight the need for improved noise prediction techniques and for additional rotorcraft acoustic data sets.
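
    The record does not detail the genetic algorithm, so the sketch below is a minimal multi-objective GA that evolves approach-path parameters against two invented surrogate objectives (a noise proxy and deviation from the 6-deg glideslope) using non-dominated selection; the encoding, objectives and parameters are all assumptions for illustration.

```python
# Minimal multi-objective GA: evolve (glideslope, airspeed) approach parameters
# against two invented surrogate objectives using non-dominated (Pareto)
# selection. Objectives, encoding and parameters are placeholders only.
import numpy as np

rng = np.random.default_rng(0)

def objectives(x):
    """x = [glideslope_deg, airspeed_kt] -> (noise proxy, deviation from 6-deg path)."""
    slope, speed = x
    noise = 100.0 - 2.0 * slope + 0.1 * speed   # placeholder noise proxy, lower = quieter
    deviation = abs(slope - 6.0)                # distance from the standard 6-deg path
    return np.array([noise, deviation])

def dominates(f1, f2):
    return np.all(f1 <= f2) and np.any(f1 < f2)

def pareto_indices(fitness):
    return [i for i, fi in enumerate(fitness)
            if not any(dominates(fj, fi) for j, fj in enumerate(fitness) if j != i)]

lower, upper = np.array([3.0, 40.0]), np.array([12.0, 120.0])
pop = rng.uniform(lower, upper, size=(40, 2))

for _ in range(100):
    fitness = [objectives(ind) for ind in pop]
    elite = pop[pareto_indices(fitness)]
    # Offspring: blend random pairs of non-dominated parents, then Gaussian mutation
    parents = elite[rng.integers(0, len(elite), size=(len(pop), 2))]
    pop = np.clip(parents.mean(axis=1) + rng.normal(0.0, 0.3, size=(len(pop), 2)),
                  lower, upper)

front = pop[pareto_indices([objectives(ind) for ind in pop])]
print("non-dominated approach candidates (glideslope deg, airspeed kt):")
print(np.round(front, 2))
```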

  10. A systematic approach to the training in the nuclear power industry: The need for standard

    International Nuclear Information System (INIS)

    Wilkinson, J.D.

    1995-01-01

    The five elements of a "Systematic Approach to Training" (SAT) are analysis, design, development, implementation and evaluation. These elements are also present in the effective application of basic process control. The fundamental negative feedback process control loop is therefore an excellent model for a successful, systematic approach to training in the nuclear power industry. Just as standards are required in today's manufacturing and service industries, e.g., ISO 9000, so too are control standards needed in the training industry and in particular in the training of nuclear power plant staff. The International Atomic Energy Agency (IAEA) produced its TECDOC 525 on "Training to Establish and Maintain the Qualification and Competence of Nuclear Power Plant Operations Personnel" in 1989 and the American Nuclear Society published its "Selection, Qualification, and Training of Personnel for Nuclear Power Plants, an American National Standard" in 1993. It is important that community colleges, training vendors and organizations such as the Instrument Society of America (ISA), who may be supplying basic or prerequisite training to the nuclear power industry, become aware of these and other standards relating to training in the nuclear power industry.

  11. Effect of International Standards Certification on Firm-Level Exports: An Application of the Control Function Approach

    OpenAIRE

    Tsunehiro Otsuki

    2011-01-01

    Growing number of firms in developing countries have earned certifications such as International Standards Organization (ISO) as it enhances reputation of their company or brand and attract buyers particularly in export market. This study evaluates the effect of international standards certification on firm's export performance in Europe and Central Asia by applying the control function approach with endogenous treatment effect to firm-level data. Certification is found to increase export sha...

  12. Reverse breech extraction versus the standard approach of pushing the impacted fetal head up through the vagina in caesarean section for obstructed labour: A randomised controlled trial.

    Science.gov (United States)

    Nooh, Ahmed Mohamed; Abdeldayem, Hussein Mohammed; Ben-Affan, Othman

    2017-05-01

    The objective of this study was to assess effectiveness and safety of the reverse breech extraction approach in Caesarean section for obstructed labour, and compare it with the standard approach of pushing the fetal head up through the vagina. This randomised controlled trial included 192 women. In 96, the baby was delivered by the 'reverse breech extraction approach', and in the remaining 96, by the 'standard approach'. Extension of uterine incision occurred in 18 participants (18.8%) in the reverse breech extraction approach group, and 46 (47.9%) in the standard approach group (p = .0003). Two women (2.1%) in the reverse breech extraction approach group needed blood transfusion and 11 (11.5%) in the standard approach group (p = .012). Pyrexia developed in 3 participants (3.1%) in the reverse breech extraction approach group, and 19 (19.8%) in the standard approach group (p = .0006). Wound infection occurred in 2 women (2.1%) in the reverse breech extraction approach group, and 12 (12.5%) in the standard approach group (p = .007). Apgar score pushing the fetal head up through the vagina.

  13. Evaluation of a performance-based standards approach to heavy vehicle design to reduce pavement wear

    CSIR Research Space (South Africa)

    Nordengen, Paul A

    2013-11-01

    Full Text Available As a result of successful initiatives in Australia, New Zealand and Canada, the introduction of a performance-based standards (PBS) approach in the heavy vehicle sector in South Africa was identified by the Council for Scientific and Industrial...

  14. Investigating the application of diving endoscopic technique in determining the extent of pituitary adenoma resection via the trans-nasal-sphenoidal approach.

    Science.gov (United States)

    Gao, Hai-Bin; Wang, Li-Qing; Zhou, Jian-Yun; Sun, Wei

    2018-04-01

    The aim of the present study was to investigate the advantages and disadvantages of the diving endoscopic technique in pituitary adenoma surgery, and the application value in determining the extent of tumor resection. A total of 37 patients with pituitary adenoma initially underwent tumor resection under an endoscope-assisted microscope via standard trans-nasal-sphenoidal approach, and tumor cavity structure was observed by applying the diving endoscopic technique. Surgery was subsequently performed again under a microscope or endoscope. The diving endoscopic technique allowed surgeons to directly observe the structure inside a tumor cavity in high-definition. In the present study, 24 patients had pituitary macroadenomas or microadenomas that did not invade the cavernous sinus, and were considered to have undergone successful total resection. Among these patients, no tumor residues were observed through the diving endoscopic technique. Some white lichenoid or fibrous cord-like tissues in the tumor cavity were considered to be remnants of tumors. However, pathology confirmed that these were not tumor tissues. For tumors that invaded the cavernous sinus in 13 patients, observation could only be conducted under the angulation endoscope of the diving endoscope; i.e., the operation could not be conducted under an endoscope. The present study suggests that the diving endoscopic technique may be used to directly observe the resection extent of tumors within the tumor cavity, especially the structure of the tumor cavity inside the sella turcica. The present study also directly validates the reliability of pituitary adenoma resection under endoscope-assisted microscope. In addition, the diving endoscopic technique also allows the surgeon to observe the underwater environment within the sella turcica.

  15. A strategic approach for managing conflict in hospitals: responding to the Joint Commission leadership standard, Part 1.

    Science.gov (United States)

    Scott, Charity; Gerardi, Debra

    2011-02-01

    The Joint Commission's leadership standard for conflict management in hospitals, LD.02.04.01, states, "The hospital manages conflict between leadership groups to protect the quality and safety of care." This standard is one of numerous standards and alerts issued by The Joint Commission that address conflict and communication. They underscore the significant impact of relational dynamics on patient safety and quality of care and the critical need for a strategic approach to conflict in health care organizations. Whether leadership conflicts openly threaten a major disruption of hospital operations or whether unresolved conflicts lurk beneath the surface of daily interactions, unaddressed conflict can undermine a hospital's efforts to ensure safe, high-quality patient care. How leaders manage organizational conflict has a significant impact on achieving strategic objectives. Aligning conflict management approaches with quality and safety goals is the first step in adopting a strategic approach to conflict management. A strategic approach goes beyond reducing costs of litigation or improving grievance processes--it integrates a collaborative mind-set and individual conflict competency with nonadversarial processes. Conflict assessment should determine how conflicts are handled among the leaders at the hospital, the degree of conflict competence already present among the leaders, where the most significant conflicts occur, and how leaders think a conflict management system might work for them. Strategically aligning a conflict management approach that addresses conflict among leadership groups as a means of protecting the quality and safety of patient care is at the heart of LD.02.04.01.

  16. Clinical utility of an endoscopic ultrasound-guided rendezvous technique via various approach routes.

    Science.gov (United States)

    Kawakubo, Kazumichi; Isayama, Hiroyuki; Sasahira, Naoki; Nakai, Yousuke; Kogure, Hirofumi; Hamada, Tsuyoshi; Miyabayashi, Koji; Mizuno, Suguru; Sasaki, Takashi; Ito, Yukiko; Yamamoto, Natsuyo; Hirano, Kenji; Tada, Minoru; Koike, Kazuhiko

    2013-09-01

    The endoscopic ultrasound-guided rendezvous techniques (EUS-rendezvous) provide reliable biliary access after failed endoscopic retrograde cholangiopancreatography (ERCP) cannulation. We evaluated the clinical utility of an EUS-rendezvous technique using various approach routes. Patients undergoing EUS-rendezvous for biliary access after failed bile duct cannulation in ERCP were included. EUS-rendezvous was performed via three approach routes depending on the patient's condition: transgastric, transduodenal in a short endoscopic position, or transduodenal in a long endoscopic position. The main outcomes were the technical success rates. Secondary outcomes were procedure time and complications. Fourteen patients (median age, 77 years) underwent EUS-rendezvous for biliary access resulting from failed biliary cannulation. The reasons for biliary drainage were malignant biliary obstruction in five patients and choledocholithiasis in nine. Transgastric, transduodenal in a short position, and transduodenal in a long position EUS-rendezvous was performed in five, five, and four patients, respectively. Bile duct puncture occurred in the left intrahepatic duct in four patients, right hepatic duct in one, middle common bile duct in four, and lower common bile duct in five. The technical success rate was 100 %. In four patients, the approach route was modified from transduodenal in a short position to transduodenal in a long position or transgastric route. The median procedure time was 81 min. One case each of biliary peritonitis and pancreatitis occurred and were managed conservatively. EUS-rendezvous provided safe and reliable transpapillary bile duct access after failed ERCP cannulation. The selection of the appropriate approach routes, depending on patient condition, is critical.

  17. A contingency table approach to nonparametric testing

    CERN Document Server

    Rayner, JCW

    2000-01-01

    Most texts on nonparametric techniques concentrate on location and linear-linear (correlation) tests, with less emphasis on dispersion effects and linear-quadratic tests. Tests for higher moment effects are virtually ignored. Using a fresh approach, A Contingency Table Approach to Nonparametric Testing unifies and extends the popular, standard tests by linking them to tests based on models for data that can be presented in contingency tables.This approach unifies popular nonparametric statistical inference and makes the traditional, most commonly performed nonparametric analyses much more comp

  18. On the applicability of the standard approaches for evaluating a neoclassical radial electric field in a tokamak edge region

    Energy Technology Data Exchange (ETDEWEB)

    Dorf, M. A.; Cohen, R. H.; Joseph, I. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Simakov, A. N. [Los Alamos National Laboratory, Los Alamos, New Mexico 87544 (United States)

    2013-08-15

    The use of the standard approaches for evaluating a neoclassical radial electric field E_r, i.e., the Ampere (or gyro-Poisson) equation, requires accurate calculation of the difference between the gyroaveraged electron and ion particle fluxes (or densities). In the core of a tokamak, the nontrivial difference appears only in high-order corrections to a local Maxwellian distribution due to the intrinsic ambipolarity of particle transport. The evaluation of such high-order corrections may be inconsistent with the accuracy of the standard long wavelength gyrokinetic equation (GKE), thus imposing limitations on the applicability of the standard approaches. However, in the edge of a tokamak, charge-exchange collisions with neutrals and prompt ion orbit losses can drive non-intrinsically ambipolar particle fluxes for which a nontrivial (E_r-dependent) difference between the electron and ion fluxes appears already in a low order and can be accurately predicted by the long wavelength GKE. The parameter regimes, where the radial electric field dynamics in the tokamak edge region is dominated by the non-intrinsically ambipolar processes, thus allowing for the use of the standard approaches, are discussed.

  19. Generation of gaseous methanol reference standards

    International Nuclear Information System (INIS)

    Geib, R.C.

    1991-01-01

    Methanol has been proposed as an automotive fuel component. Reliable, accurate methanol standards are essential to support widespread monitoring programs. The monitoring programs may include quantification of methanol from tailpipe emissions, evaporative emissions, plus ambient air methanol measurements. This paper will present approaches and results in the author's investigation to develop high accuracy methanol standards. The variables upon which the authors will report results are as follows: (1) stability of methanol gas standards, the studies will focus on preparation requirements and stability results from 10 to 1,000 ppmv; (2) cylinder to instrument delivery system components and purge technique, these studies have dealt with materials in contact with the sample stream plus static versus flow injection; (3) optimization of gas chromatographic analytical system will be discussed; (4) gas chromatography and process analyzer results and utility for methanol analysis will be presented; (5) the accuracy of the methanol standards will be qualified using data from multiple studies including: (a) gravimetric preparation; (b) linearity studies; (c) independent standards sources such as low pressure containers and diffusion tubes. The accuracy will be provided as a propagation of error from multiple sources. The methanol target concentrations will be 10 to 500 ppmv
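
    Item (5) above concerns the combined uncertainty of the standards; a conventional way to state this for uncorrelated input quantities (assumed here, since the abstract gives no formula) is the root-sum-of-squares propagation:

```latex
% Combined standard uncertainty of the standard concentration C = f(x_1,...,x_n)
% (gravimetric masses, source purity, analytical response, ...), propagated in
% quadrature for uncorrelated input quantities:
u_{\mathrm{c}}(C) = \sqrt{\sum_{i=1}^{n}
    \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)}
```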

  20. Selected Bibliography of the Nephrourology standard techniques

    International Nuclear Information System (INIS)

    1999-01-01

    Within the framework of the first meeting of project coordinators of ARCAL XXXVI, a selected bibliography on the standardization of nuclear nephrourology techniques is presented. The selection covers: radiopharmaceuticals used, quality control, dosimetry, obstruction, clearance and renal function, paediatric aspects, pyelonephritis, renovascular hypertension and renal transplantation.

  1. Approaches to answering critical CER questions.

    Science.gov (United States)

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
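
    Of the CER techniques listed above, propensity score adjustment lends itself to a compact illustration; the sketch below estimates propensity scores with logistic regression and performs 1:1 nearest-neighbour matching on a synthetic dataset. The data, covariates and effect size are invented and are not tied to any study behind the record.

```python
# Propensity-score estimation + 1:1 nearest-neighbour matching on synthetic
# data (illustrative only; not drawn from any study behind the record).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)
n = 1000
age = rng.normal(60, 10, n)
severity = rng.normal(0, 1, n)
# Treatment assignment depends on covariates (confounding by indication)
p_treat = 1 / (1 + np.exp(-(-0.05 * (age - 60) + 0.8 * severity)))
treated = rng.random(n) < p_treat
# Outcome with a true treatment effect of -2.0 (e.g. days in hospital)
outcome = 10 + 0.1 * age + 1.5 * severity - 2.0 * treated + rng.normal(0, 1, n)

X = np.column_stack([age, severity])
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Match each treated subject to the control with the closest propensity score
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = outcome[~treated][idx.ravel()]

naive_diff = outcome[treated].mean() - outcome[~treated].mean()
matched_diff = (outcome[treated] - matched_controls).mean()
print(f"naive difference:  {naive_diff:+.2f}")
print(f"matched estimate:  {matched_diff:+.2f}   (true effect: -2.00)")
```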

  2. A Strategy Modelling Technique for Financial Services

    OpenAIRE

    Heinrich, Bernd; Winter, Robert

    2004-01-01

    Strategy planning processes often suffer from a lack of conceptual models that can be used to represent business strategies in a structured and standardized form. If natural language is replaced by an at least semi-formal model, the completeness, consistency, and clarity of strategy descriptions can be drastically improved. A strategy modelling technique is proposed that is based on an analysis of modelling requirements, a discussion of related work and a critical analysis of generic approach...

  3. MATE standardization

    Science.gov (United States)

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  4. In-Plane Ultrasound-Guided Knee Injection Through a Lateral Suprapatellar Approach: A Safe Technique.

    Science.gov (United States)

    Chagas-Neto, Francisco A; Taneja, Atul K; Gregio-Junior, Everaldo; Nogueira-Barbosa, Marcello H

    2017-06-01

    This study aims to describe a technique for in-plane ultrasound-guided knee arthrography through a lateral suprapatellar approach, reporting its accuracy and related complications. A retrospective search was performed for computed tomography and magnetic resonance reports from June 2013 through June 2015. Imaging studies, puncture descriptions, and guided-procedure images were reviewed along with clinical and surgical history. A fellowship-trained musculoskeletal radiologist performed all procedures under sterile technique and ultrasound guidance with the probe in oblique position on the lateral suprapatellar recess after local anesthesia with the patient on dorsal decubitus, hip in neutral rotation, and 30 to 45 degrees of knee flexion. A total of 86 consecutive subjects were evaluated (mean, 55 years). All subjects underwent intra-articular injection of contrast, which was successfully reached in the first attempt in 94.2% of the procedures (81/86), and in the second attempt in 5.8% (5/86) after needle repositioning without a second puncture. There were no postprocedural reports of regional complications at the puncture site, such as significant pain, bleeding, or vascular lesions. Our study demonstrates that in-plane ultrasound-guided injection of the knee in semiflexion approaching the lateral suprapatellar recess is a safe and useful technique to administer intra-articular contrast solution, as an alternative method without radiation exposure.

  5. Process-outcome interrelationship and standard setting in medical education: the need for a comprehensive approach.

    Science.gov (United States)

    Christensen, Leif; Karle, Hans; Nystrup, Jørgen

    2007-09-01

    An outcome-based approach to medical education, as compared to a process/content orientation, is currently being discussed intensively. In this article, the interrelationship of process and outcome in medical education is discussed, with specific emphasis on its relation to the definition of standards in basic medical education. Perceptions of outcome have always been an integrated element of curricular planning. The present debate underlines the need for a stronger focus on learning objectives and outcome assessment in many medical schools around the world. The need to maintain an integrated approach to process/content and outcome is underlined in this paper. A worry is expressed about the taxonomy of learning in purely outcome-based medical education, in which student assessment can be a major determinant of the learning process, leaving control of the medical curriculum to medical examiners. Moreover, curricula which favour reductionism by stating everything in terms of instrumental outcomes or competences face a risk of lowering quality and of becoming prey to political interference. Standards based on outcome alone raise unresolved problems in relation to the licensure requirements of medical doctors. It is argued that the alleged dichotomy between process/content and outcome is artificial, and that the formulation of standards in medical education must follow a comprehensive line in curricular planning.

  6. A new approach for modeling the peak utility impacts from a proposed CUAC standard

    Energy Technology Data Exchange (ETDEWEB)

    LaCommare, Kristina Hamachi; Gumerman, Etan; Marnay, Chris; Chan, Peter; Coughlin, Katie

    2004-08-01

    This report describes a new Berkeley Lab approach for modeling the likely peak electricity load reductions from proposed energy efficiency programs in the National Energy Modeling System (NEMS). This method is presented in the context of the commercial unitary air conditioning (CUAC) energy efficiency standards. A previous report investigating the residential central air conditioning (RCAC) load shapes in NEMS revealed that the peak reduction results were lower than expected. This effect was believed to be due in part to the presence of the squelch, a program algorithm designed to ensure changes in the system load over time are consistent with the input historic trend. The squelch applies a system load-scaling factor that scales any differences between the end-use bottom-up and system loads to maintain consistency with historic trends. To obtain more accurate peak reduction estimates, a new approach for modeling the impact of peaky end uses in NEMS-BT has been developed. The new approach decrements the system load directly, reducing the impact of the squelch on the final results. This report also discusses a number of additional factors, in particular non-coincidence between end-use loads and system loads as represented within NEMS, and their impacts on the peak reductions calculated by NEMS. Using Berkeley Lab's new double-decrement approach reduces the conservation load factor (CLF) on an input load decrement from 25% down to 19% for a SEER 13 CUAC trial standard level, as seen in NEMS-BT output. About 4 GW more in peak capacity reduction results from this new approach as compared to Berkeley Lab's traditional end-use decrement approach, which relied solely on lowering end use energy consumption. The new method has been fully implemented and tested in the Annual Energy Outlook 2003 (AEO2003) version of NEMS and will routinely be applied to future versions. This capability is now available for use in future end-use efficiency or other policy analysis
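
    The CLF values quoted above translate into peak reductions once annual energy savings are fixed; the sketch below applies the commonly used definition of the conservation load factor (average demand reduction divided by peak demand reduction), which is assumed here rather than taken from the report, together with a placeholder annual-savings figure.

```python
# Relating annual energy savings, conservation load factor (CLF) and peak
# demand reduction, using the common definition
#   CLF = (annual energy savings / 8760 h) / peak demand reduction
# The definition and the annual-savings figure are assumptions; only the
# 25% and 19% CLF values come from the abstract.
HOURS_PER_YEAR = 8760.0
annual_energy_savings_twh = 30.0                 # placeholder, TWh per year

def peak_reduction_gw(energy_twh, clf):
    average_reduction_gw = energy_twh * 1000.0 / HOURS_PER_YEAR   # TWh/yr -> average GW
    return average_reduction_gw / clf

for label, clf in [("end-use decrement (CLF = 25%)", 0.25),
                   ("double-decrement  (CLF = 19%)", 0.19)]:
    gw = peak_reduction_gw(annual_energy_savings_twh, clf)
    print(f"{label}: peak reduction ≈ {gw:.1f} GW")
```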

  7. Laparoscopic anterior versus endoscopic posterior approach for adrenalectomy : a shift to a new golden standard?

    NARCIS (Netherlands)

    Vrielink, O M; Wevers, K P; Kist, J W; Borel Rinkes, I H M; Hemmer, P. H. J.; Vriens, M. R.; de Vries, J; Kruijff, S.

    PURPOSE: There has been an increased utilization of the posterior retroperitoneal approach (PRA) for adrenalectomy alongside the "classic" laparoscopic transabdominal technique (LTA). The aim of this study was to compare both procedures based on outcome variables at various ranges of tumor size.

  8. A Segmental Approach with SWT Technique for Denoising the EOG Signal

    Directory of Open Access Journals (Sweden)

    Naga Rajesh

    2015-01-01

    Full Text Available The Electrooculogram (EOG) signal is often contaminated with artifacts and power-line interference while recording. It is essential to denoise the EOG signal for quality diagnosis. The present study deals with denoising of noisy EOG signals using the Stationary Wavelet Transformation (SWT) technique by two different approaches, namely, increasing segments of the EOG signal and different equal segments of the EOG signal. For performing the segmental denoising analysis, an EOG signal is simulated and added with controlled noise powers of 5 dB, 10 dB, 15 dB, 20 dB, and 25 dB so as to obtain five different noisy EOG signals. The results obtained after denoising them are extremely encouraging. Root Mean Square Error (RMSE) values between the reference EOG signal and the EOG signals with noise powers of 5 dB, 10 dB, and 15 dB are much lower than those for the 20 dB and 25 dB noise powers. The findings suggest that the SWT technique can be used to denoise the noisy EOG signal optimally for noise powers ranging from 5 dB to 15 dB. This technique might be useful in the quality diagnosis of various neurological or eye disorders.
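
    The SWT denoising pipeline itself (decompose, soft-threshold the detail coefficients, reconstruct) can be sketched with PyWavelets as below; the simulated signal, wavelet choice and universal-threshold rule are assumptions, and the record's segmentation scheme is not reproduced.

```python
# SWT denoising sketch with PyWavelets: decompose, soft-threshold the detail
# coefficients, reconstruct. Signal, wavelet and threshold rule are assumed
# for illustration; the record's segmental scheme is not reproduced.
import numpy as np
import pywt

rng = np.random.default_rng(0)
fs, n = 256, 1024                 # assumed sampling rate; length divisible by 2**level
t = np.arange(n) / fs
clean = np.cumsum(rng.normal(0, 0.02, n)) + 0.5 * np.sin(2 * np.pi * 0.3 * t)
noisy = clean + rng.normal(0, 0.1, n)             # additive noise

level = 4
coeffs = pywt.swt(noisy, "db4", level=level)      # [(cA_n, cD_n), ..., (cA_1, cD_1)]

# Universal threshold estimated from the finest-level detail coefficients
sigma = np.median(np.abs(coeffs[-1][1])) / 0.6745
threshold = sigma * np.sqrt(2 * np.log(n))

denoised_coeffs = [(cA, pywt.threshold(cD, threshold, mode="soft")) for cA, cD in coeffs]
denoised = pywt.iswt(denoised_coeffs, "db4")

rmse = np.sqrt(np.mean((denoised - clean) ** 2))
print(f"RMSE after SWT denoising: {rmse:.4f}")
```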

  9. Semiextended approach for intramedullary nailing via a patellar eversion technique for tibial-shaft fractures: Evaluation of the patellofemoral joint.

    Science.gov (United States)

    Yasuda, Tomohiro; Obara, Shu; Hayashi, Junji; Arai, Masayuki; Sato, Kaoru

    2017-06-01

    Intramedullary nail fixation is a common treatment for tibial-shaft fractures, and it offers a better functional prognosis than conservative treatments. Currently, the primary approach employed for intramedullary nail insertion in the semiextended position is the suprapatellar approach, which involves a vertical incision of the quadriceps tendon. Damage to the patellofemoral joint cartilage has been highlighted as a drawback associated with this approach. To avoid this issue, we perform surgery using the patellar eversion technique and a soft sleeve. This method allows the articular surface to be monitored during intramedullary nail insertion. We arthroscopically assessed the effect of this technique on the patellofemoral joint cartilage. The patellar eversion technique allows a direct view and protection of the patellofemoral joint without affecting the patella. Thus, damage to the patellofemoral joint cartilage can be avoided. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  11. Using a business model approach and marketing techniques for recruitment to clinical trials

    Directory of Open Access Journals (Sweden)

    Knight Rosemary

    2011-03-01

    Full Text Available Abstract Randomised controlled trials (RCTs are generally regarded as the gold standard for evaluating health care interventions. The level of uncertainty around a trial's estimate of effect is, however, frequently linked to how successful the trial has been in recruiting and retaining participants. As recruitment is often slower or more difficult than expected, with many trials failing to reach their target sample size within the timescale and funding originally envisaged, the results are often less reliable than they could have been. The high number of trials that require an extension to the recruitment period in order to reach the required sample size potentially delays the introduction of more effective therapies into routine clinical practice. Moreover, it may result in less research being undertaken as resources are redirected to extending existing trials rather than funding additional studies. Poor recruitment to publicly-funded RCTs has been much debated but there remains remarkably little clear evidence as to why many trials fail to recruit well, which recruitment methods work, in which populations and settings and for what type of intervention. One proposed solution to improving recruitment and retention is to adopt methodology from the business world to inform and structure trial management techniques. We review what is known about interventions to improve recruitment to trials. We describe a proposed business approach to trials and discuss the implementation of using a business model, using insights gained from three case studies.

  12. Using a business model approach and marketing techniques for recruitment to clinical trials

    Science.gov (United States)

    2011-01-01

    Randomised controlled trials (RCTs) are generally regarded as the gold standard for evaluating health care interventions. The level of uncertainty around a trial's estimate of effect is, however, frequently linked to how successful the trial has been in recruiting and retaining participants. As recruitment is often slower or more difficult than expected, with many trials failing to reach their target sample size within the timescale and funding originally envisaged, the results are often less reliable than they could have been. The high number of trials that require an extension to the recruitment period in order to reach the required sample size potentially delays the introduction of more effective therapies into routine clinical practice. Moreover, it may result in less research being undertaken as resources are redirected to extending existing trials rather than funding additional studies. Poor recruitment to publicly-funded RCTs has been much debated but there remains remarkably little clear evidence as to why many trials fail to recruit well, which recruitment methods work, in which populations and settings and for what type of intervention. One proposed solution to improving recruitment and retention is to adopt methodology from the business world to inform and structure trial management techniques. We review what is known about interventions to improve recruitment to trials. We describe a proposed business approach to trials and discuss the implementation of using a business model, using insights gained from three case studies. PMID:21396088

  13. Using a business model approach and marketing techniques for recruitment to clinical trials.

    Science.gov (United States)

    McDonald, Alison M; Treweek, Shaun; Shakur, Haleema; Free, Caroline; Knight, Rosemary; Speed, Chris; Campbell, Marion K

    2011-03-11

    Randomised controlled trials (RCTs) are generally regarded as the gold standard for evaluating health care interventions. The level of uncertainty around a trial's estimate of effect is, however, frequently linked to how successful the trial has been in recruiting and retaining participants. As recruitment is often slower or more difficult than expected, with many trials failing to reach their target sample size within the timescale and funding originally envisaged, the results are often less reliable than they could have been. The high number of trials that require an extension to the recruitment period in order to reach the required sample size potentially delays the introduction of more effective therapies into routine clinical practice. Moreover, it may result in less research being undertaken as resources are redirected to extending existing trials rather than funding additional studies. Poor recruitment to publicly-funded RCTs has been much debated but there remains remarkably little clear evidence as to why many trials fail to recruit well, which recruitment methods work, in which populations and settings and for what type of intervention. One proposed solution to improving recruitment and retention is to adopt methodology from the business world to inform and structure trial management techniques. We review what is known about interventions to improve recruitment to trials. We describe a proposed business approach to trials and discuss the implementation of using a business model, using insights gained from three case studies.

  14. Assessment of multi-version NPP I and C systems safety. Metric-based approach, technique and tool

    International Nuclear Information System (INIS)

    Kharchenko, Vyacheslav; Volkovoy, Andrey; Bakhmach, Eugenii; Siora, Alexander; Duzhyi, Vyacheslav

    2011-01-01

    The challenges related to the problem of assessing the actual diversity level and evaluating the safety of diversity-oriented NPP I and C systems are analyzed. There are risks of inaccurate assessment and problems of insufficiently reducing the probability of common cause failures (CCFs). The CCF probability of safety-critical systems may be essentially decreased by applying several different types of diversity (multi-diversity). Different diversity types of FPGA-based NPP I and C systems, the general approach, and the stages of diversity and safety assessment as a whole are described. The objectives of the report are: (a) analysis of the challenges caused by use of the diversity approach in NPP I and C systems in the context of FPGA and other modern technologies; (b) development of a multi-version NPP I and C systems assessment technique and tool based on a check-list and metric-oriented approach; (c) a case study of the technique: assessment of a multi-version FPGA-based NPP I and C system developed using the Radiy TM Platform. (author)

  15. Robotic Spent Fuel Monitoring – It is time to improve old approaches and old techniques!

    Energy Technology Data Exchange (ETDEWEB)

    Tobin, Stephen Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dasari, Venkateswara Rao [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Trellue, Holly Renee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-13

    This report describes various approaches and techniques associated with robotic spent fuel monitoring. The purpose of this description is to improve the quality of measured signatures, reduce the inspection burden on the IAEA, and to provide frequent verification.

  16. Development of a technique for three-dimensional image reconstruction from emission computed tomograms (ECT)

    International Nuclear Information System (INIS)

    Gerischer, R.

    1987-01-01

    The described technique for three-dimensional image reconstruction from ECT sections is based on a simple procedure, which can be carried out with the aid of any standard-type computer used in nuclear medicine and requires no sophisticated arithmetic approach. (TRV) [de

  17. Population-genetic approach to standardization of radiation and non-radiation factors

    International Nuclear Information System (INIS)

    Telnov, I.

    2006-01-01

    population level. Of 65 analyses of associations between diseases or unfavorable effects and various genetic polymorphic systems, 27 had negative results. The other 38 had significant, i.e. positive, results. The respective G.S.R.R. values varied in the range from 1.2 to 2.5. Averaged G.S.R.R. for some genetic systems ranged from 1.4 to 1.9. More stable and closer values of averaged G.S.R.R. were calculated for various categories of effects: pathologies due to radiation and non-radiation factors - 1.51; non-tumor (1.47) and tumor (1.54) diseases; average life expectancy - 1.34. The population-averaged, or integral, value of G.S.R.R. was about 1.5. This value can be used as a genetic predisposition coefficient (C.G.P.) for correction when averaging environmental factors at the population level. Such a correction can be made by decreasing the permissible standard value by the value of the C.G.P. to calculate a population-genetic standard. It should be noted that population-genetic standards reduce the risk of unfavorable consequences of environmental factors in individuals with a genetic predisposition down to the general population level. An important advantage of this approach is that there is no need to account for all existing variations of genetic predisposition to the multiform unfavorable environmental factors.

  18. A new approach to preparation of standard LEDs for luminous intensity and flux measurement of LEDs

    Science.gov (United States)

    Park, Seung-Nam; Park, Seongchong; Lee, Dong-Hoon

    2006-09-01

    This work presents an alternative approach for preparing photometric standard LEDs, which is based on a novel functional seasoning method. The main idea of our seasoning method is to simultaneously monitor the light output and the junction voltage to obtain quantitative information on the temperature dependence and the aging effect of the LED emission. We suggested a general model describing the seasoning process by taking junction temperature variation and the aging effect into account and implemented a fully automated seasoning facility, which is capable of seasoning 12 LEDs at the same time. By independent measurements of the temperature dependence, we confirmed the discrepancy of the theoretical model to be less than 0.5 % and evaluated the uncertainty contribution of the functional seasoning to be less than 0.5 % for all the seasoned samples. To demonstrate assigning the reference value to a standard LED, the CIE averaged LED intensity (ALI) of the seasoned LEDs was measured with a spectroradiometer-based instrument and the measurement uncertainty was analyzed. The expanded uncertainty of the standard LED prepared by the new approach amounts to 4 % to 5 % (k = 2) depending on color, without correction of spectral stray light in the spectroradiometer.

  19. Three-dimensional conformal pancreas treatment: comparison of four- to six-field techniques

    International Nuclear Information System (INIS)

    Higgins, Patrick D.; Sohn, Jason W.; Fine, Robert M.; Schell, Michael C.

    1995-01-01

    Purpose: We compare practical conformal treatment approaches to pancreatic cancer using 6 and 18 MV photons and contrast those approaches against standard techniques. Methods and Materials: A four-field conformal technique for treating pancreas cancer has been developed using nonopposed 18 MV photons. This approach has been extended to 6 MV photon application by the addition of one to two fields. These techniques have been optimized to increase sparing of normal liver and bowel, compared with opposed-field methods, to improve patient tolerance of high doses. In this study we compare these techniques in a simulated tumor model in a cylindrical phantom. Dose-volume analysis is used to quantify differences between the conformal, nonopposed techniques and the conformal, opposed-field methods. This model is also used to evaluate the effect of 1-2 cm setup errors on dose-volume coverage. Results: Dose-volume analysis demonstrates that five-to-six field conformal treatments using 6 MV photons provide dose coverage and normal tissue sparing characteristics similar to or better than those of an optimized 18 MV, four-field approach when 1-2 cm margins are included for setup uncertainty. All approaches using nonopposed beam geometry provide a significant reduction in the volume of tissue encompassed by the 30-50% isodose surfaces, as compared with four-field box techniques. Conclusions: Three-dimensional (3D) conformal treatments can be designed that significantly improve dose-volume characteristics over conventional treatment designs without costing unacceptable amounts of machine time. Further, deep intraabdominal sites can be adequately accessed and treated on intermediate energy machines with a relatively moderate increase in machine time.
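
    The dose-volume analysis used above to compare techniques can be illustrated with a short sketch. This is not the authors' planning software; it is a minimal cumulative dose-volume histogram (DVH) computed over a synthetic dose grid and structure mask, with all arrays, masks, and dose values being hypothetical placeholders.

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative DVH: fraction of the masked volume receiving at least each dose level."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_fraction = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_fraction

# Toy phantom (arbitrary dose units): a high-dose target inside a lower-dose shell.
dose = np.zeros((50, 50, 50))
dose[15:35, 15:35, 15:35] = 10.0             # shell of irradiated normal tissue
dose[20:30, 20:30, 20:30] = 50.0             # target volume
normal_tissue = np.ones_like(dose, dtype=bool)
normal_tissue[20:30, 20:30, 20:30] = False   # exclude the target from the normal-tissue DVH

levels, vol = cumulative_dvh(dose, normal_tissue)
# Fraction of normal tissue receiving at least 5 dose units (cf. the 30-50% isodose volumes above).
print(f"V5 = {np.interp(5.0, levels, vol):.3f}")
```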

  20. A Bayesian approach to particle identification in ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Jeremy [Physikalisches Institut, Ruprecht-Karls-Universitaet Heidelberg (Germany); Collaboration: ALICE-Collaboration

    2016-07-01

    Particle identification (PID) is one of the major strengths of the ALICE detector at the LHC, and provides essential insight into quark-gluon plasma formation in heavy-ion collisions. PID is most effective when complementary identification techniques (such as specific energy loss in the Time Projection Chamber, or flight times measured by the Time Of Flight detector) are combined; with standard PID techniques, however, it can be difficult to combine these signals, especially when detectors with non-Gaussian responses are used. Here, an alternative probabilistic PID approach based on Bayes' theorem will be presented. This method facilitates the combination of different detector technologies based on the combined probability of a particle type to produce the signals measured in various detectors. The Bayesian PID approach will be briefly outlined, and benchmark analyses will be presented for high-purity samples of pions, kaons, and protons, as well as for the two-pronged decay D⁰ → K⁻π⁺, comparing the performance of the standard PID approach with that of the Bayesian approach. Finally, prospects for measuring the Λc baryon in the three-pronged decay channel Λc⁺ → pK⁻π⁺ are presented.
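
    As an illustration of the Bayesian combination described above, the following minimal sketch applies Bayes' theorem to two independent Gaussian detector responses. It is not the ALICE PID framework: the species priors, expected signals, and resolutions are invented placeholders, and real detector responses may well be non-Gaussian.

```python
import numpy as np

# Candidate species and illustrative priors (abundances are assumptions, not ALICE values).
species = ["pion", "kaon", "proton"]
priors = np.array([0.80, 0.12, 0.08])

def gaussian_likelihood(measured, expected, sigma):
    """Simple Gaussian detector response; real detectors may deviate from this."""
    return np.exp(-0.5 * ((measured - expected) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical expected signals per species for two detectors (stand-ins for TPC dE/dx
# and TOF velocity), in arbitrary units, plus their resolutions.
expected = {
    "tpc": np.array([52.0, 60.0, 75.0]),
    "tof": np.array([0.99, 0.95, 0.87]),
}
sigma = {"tpc": 4.0, "tof": 0.02}

def bayesian_pid(measured):
    """Posterior P(species | signals), combining detectors under an independence assumption."""
    likelihood = np.ones(len(species))
    for det, value in measured.items():
        likelihood *= gaussian_likelihood(value, expected[det], sigma[det])
    posterior = priors * likelihood
    return posterior / posterior.sum()

track = {"tpc": 59.0, "tof": 0.955}
for name, p in zip(species, bayesian_pid(track)):
    print(f"P({name} | signals) = {p:.3f}")
```

    For this toy track the kaon hypothesis dominates, which is the qualitative behaviour the probabilistic combination is meant to deliver when the individual detector signals are each only mildly discriminating.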

  1. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Directory of Open Access Journals (Sweden)

    Gaetano Luglio

    2015-06-01

    Conclusion: Proper laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short term outcomes, even in a learning curve setting. Key factors for better outcomes and shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon.

  2. A hybrid electron and photon IMRT planning technique that lowers normal tissue integral patient dose using standard hardware.

    Science.gov (United States)

    Rosca, Florin

    2012-06-01

    To present a mixed electron and photon IMRT planning technique using electron beams with an energy range of 6-22 MeV and standard hardware that minimizes integral dose to patients for targets as deep as 7.5 cm. Ten brain cases, two lung, a thyroid, an abdominal, and a parotid case were planned using two planning techniques: a photon-only IMRT (IMRT) versus a mixed modality treatment (E+IMRT) that includes an en face electron beam and a photon IMRT portion that ensures uniform target coverage. The electron beam is delivered using a regular cutout placed in an electron cone. The electron energy was chosen to provide a good trade-off between minimizing integral dose and generating a uniform, deliverable plan. The authors chose electron energies that cover the deepest part of the PTV with the 65%-70% isodose line. The normal tissue integral dose, the dose for ring structures around the PTV, and the volumes of the 75%, 50%, and 25% isosurfaces were used to compare the dose distributions generated by the two planning techniques. The normal tissue integral dose was lowered by about 20% by the E+IMRT plans compared to the photon-only IMRT ones for most studied cases. With the exception of lungs, the dose reduction associated with the E+IMRT plans was more pronounced further away from the target. The average dose ratios delivered to the 0-2 cm and the 2-4 cm ring structures for brain patients for the two planning techniques were 89.6% and 70.8%, respectively. The enhanced dose sparing away from the target for the brain patients can also be observed in the ratio of the 75%, 50%, and 25% isodose line volumes for the two techniques, which decreases from 85.5% to 72.6% and further to 65.1%, respectively. For lungs, the lateral electron beams used in the E+IMRT plans were perpendicular to the mostly anterior/posterior photon beams, generating much more conformal plans. The authors proved that even using the existing electron delivery hardware, a mixed electron/photon planning

  3. Foetal radiography for suspected skeletal dysplasia: technique, normal appearances, diagnostic approach

    Energy Technology Data Exchange (ETDEWEB)

    Calder, Alistair D. [Great Ormond Street Hospital for Children NHS Foundation Trust, Radiology Department, London (United Kingdom); Offiah, Amaka C. [Sheffield Children's NHS Foundation Trust, Academic Unit of Child Health, Sheffield (United Kingdom)

    2015-04-01

    Despite advances in antenatal imaging and genetic techniques, post-delivery post-mortem foetal radiography remains the key investigation in accurate diagnosis of skeletal dysplasia manifesting in the foetus. Foetal radiography is best performed using pathology-specimen radiography equipment and is often carried out in the pathology department without involvement of the radiology unit. However, paediatric radiologists may be asked to interpret post-mortem foetal radiographs when an abnormality is suspected. Many foetal radiographs are carried out before 20 weeks' gestation, and the interpreting radiologist needs to be familiar with the range of normal post-mortem foetal appearances at different gestational ages, as well as the appearances of some of the more commonly presenting skeletal dysplasias, and will benefit from a systematic approach when assessing more challenging cases. In this pictorial essay, we illustrate various normal post-mortem foetal radiographic appearances, give examples of commonly occurring skeletal dysplasias, and describe an approach to establishing more difficult diagnoses. (orig.)

  4. Foetal radiography for suspected skeletal dysplasia: technique, normal appearances, diagnostic approach

    International Nuclear Information System (INIS)

    Calder, Alistair D.; Offiah, Amaka C.

    2015-01-01

    Despite advances in antenatal imaging and genetic techniques, post-delivery post-mortem foetal radiography remains the key investigation in accurate diagnosis of skeletal dysplasia manifesting in the foetus. Foetal radiography is best performed using pathology-specimen radiography equipment and is often carried out in the pathology department without involvement of the radiology unit. However, paediatric radiologists may be asked to interpret post-mortem foetal radiographs when an abnormality is suspected. Many foetal radiographs are carried out before 20 weeks' gestation, and the interpreting radiologist needs to be familiar with the range of normal post-mortem foetal appearances at different gestational ages, as well as the appearances of some of the more commonly presenting skeletal dysplasias, and will benefit from a systematic approach when assessing more challenging cases. In this pictorial essay, we illustrate various normal post-mortem foetal radiographic appearances, give examples of commonly occurring skeletal dysplasias, and describe an approach to establishing more difficult diagnoses. (orig.)

  5. Advancing Lie Detection by Inducing Cognitive Load on Liars: A Review of Relevant Theories and Techniques Guided by Lessons from Polygraph-Based Approaches

    Science.gov (United States)

    Walczyk, Jeffrey J.; Igou, Frank P.; Dixon, Alexa P.; Tcholakian, Talar

    2013-01-01

    This article critically reviews techniques and theories relevant to the emerging field of “lie detection by inducing cognitive load selectively on liars.” To help these techniques benefit from past mistakes, we start with a summary of the polygraph-based Controlled Question Technique (CQT) and the major criticisms of it made by the National Research Council (2003), including that it is not based on a validated theory and that its administration procedures have not been standardized. Lessons from the more successful Guilty Knowledge Test are also considered. The critical review that follows starts with the presentation of models and theories offering insights for cognitive lie detection that can theoretically undergird load-inducing approaches. This is followed by evaluation of specific research-based, load-inducing proposals, especially for their susceptibility to rehearsal and other countermeasures. To help organize these proposals and suggest new directions for innovation and refinement, a theoretical taxonomy is presented based on the type of cognitive load induced in examinees (intrinsic or extraneous) and how open-ended the responses to test items are. Finally, four recommendations are proffered that can help researchers and practitioners to avert the corresponding mistakes with the CQT and yield new, valid cognitive lie detection technologies. PMID:23378840

  6. Conceptual Explanation for the Algebra in the Noncommutative Approach to the Standard Model

    International Nuclear Information System (INIS)

    Chamseddine, Ali H.; Connes, Alain

    2007-01-01

    The purpose of this Letter is to remove the arbitrariness of the ad hoc choice of the algebra and its representation in the noncommutative approach to the standard model, which was begging for a conceptual explanation. We assume as before that space-time is the product of a four-dimensional manifold by a finite noncommutative space F. The spectral action is the pure gravitational action for the product space. To remove the above arbitrariness, we classify the irreducible geometries F consistent with imposing reality and chiral conditions on spinors, to avoid the fermion doubling problem, which amounts to having a total dimension of 10 (in the K-theoretic sense). It gives, almost uniquely, the standard model with all its details, predicting the number of fermions per generation to be 16, their representations and the Higgs breaking mechanism, with very little input.

  7. Marketing moxie for librarians fresh ideas, proven techniques, and innovative approaches

    CERN Document Server

    Watson-Lakamp, Paula

    2015-01-01

    Robust, resilient, and flexible marketing is an absolute necessity for today's libraries. Fortunately, marketing can be fun. Through this savvy guide, you'll discover a wealth of fresh, actionable ideas and approaches that can be combined with tried-and-true marketing techniques to serve any library. Focusing on building platforms rather than chasing trends, the book offers low- and no-budget ideas for those in small libraries as well as information that can be used by libraries that have a staff of professionals. The guide opens with an overview of the basics of marketing and continues through

  8. New overlay measurement technique with an i-line stepper using embedded standard field image alignment marks for wafer bonding applications

    Science.gov (United States)

    Kulse, P.; Sasai, K.; Schulz, K.; Wietstruck, M.

    2017-06-01

    marks. In this work, the non-contact infrared alignment system of the Nikon i-line stepper NSR-SF150 is used for both the alignment and the overlay determination of bonded wafer stacks with embedded alignment marks, to achieve accurate alignment between the different wafer sides. The embedded field image alignment (FIA) marks of the interface and of the device wafer top layer are measured in a single measurement job. By taking the offsets between all of the different FIAs into account, after correcting the wafer-rotation-induced FIA position errors, an overlay for the stacked wafers can be determined. The developed approach has been validated on a standard back-to-front-side application. The overlay was measured and determined using both the EVG NT40 automated measurement system with special overlay marks and the measurement of the FIA marks of the front- and back-side layers. A comparison of both results shows mismatches in x and y translations smaller than 200 nm, which is relatively small compared to the overlay tolerances of +/-500 nm for the back-to-front-side process. After the successful validation of the developed technique, special wafer stacks with FIA alignment marks in the bonding interface were fabricated. Owing to the high IR transparency of both double-side-polished wafers, the embedded FIA marks generate a stable and clear signal for accurate x and y wafer coordinate positioning. The FIA marks of the device wafer top layer were measured under standard conditions in a developed photoresist mask without IR illumination. The subsequent overlay calculation shows an overlay of less than 200 nm, which enables very accurate process conditions for highly scaled TSV integration and advanced substrate integration into IHP's 0.25/0.13 μm SiGe:C BiCMOS technology. The presented method can be applied both to standard back-to-front-side process technologies and to new temporary and permanent wafer bonding applications.

  9. Crew awareness as key to optimizing habitability standards onboard naval platforms: A 'back-to-basics' approach.

    Science.gov (United States)

    Neelakantan, Anand; Ilankumaran, Mookkiah; Ray, Sougat

    2017-10-01

    A healthy habitable environment onboard warships is vital to operational fleet efficiency and a fit sea-warrior force. Unique man-machine-armament interface issues and consequent constraints on habitability necessitate a multi-disciplinary approach toward optimizing habitability standards. Study of the basic 'human factor', including crew awareness of what determines shipboard habitability, and its association with habitation specifications is an essential step in such an approach. The aim of this study was to assess crew awareness of shipboard habitability and the association between awareness and maintenance of optimal habitability as per specifications. A cross-sectional descriptive study was carried out among 552 naval personnel onboard warships in Mumbai. Data on crew awareness of habitability were collected using a standardized questionnaire and correlated with basic habitability requirement specifications. Data were analyzed using Microsoft Excel, Epi-info, and SPSS version 17. Awareness of basic habitability aspects was very good in 65.3% of the crew. Area-specific awareness was highest with respect to the living area (95.3%). Knowledge levels on waste management were among the lowest (65.2%) in the category of aspect-wise awareness. A statistically significant association was found between awareness levels and habitability standards (OR = 7.27). The new benchmarks set, in the form of high crew awareness of basic shipboard habitability specifications and its significant association with standards, need to be sustained. This entails reiterating healthy habitation essentials in training and holds the key to a fit fighting force.

  10. Dramatics and education: Different directions and approaches to the application of drama/theatre techniques

    Directory of Open Access Journals (Sweden)

    Stamenković Ivana

    2015-01-01

    Full Text Available This paper describes, from the perspective of drama pedagogy, two basic directions in the application of drama and/or theatre techniques in the educational context, which differ primarily according to who applies them in practice with children and young people - teachers or professional actors. The most prominent approaches of individual authors are then singled out for each of these agents. Although there are certain similarities among them, we also wanted to show their differences, which are evident in their theoretical, pedagogical and practical orientations. In this context, opinions about the importance of dramatics for education and about the effects of using different approaches, levels and ways of applying drama/theatre techniques are singled out, showing at the same time the different roles, levels and ways in which students participate. This theme is important for promoting drama pedagogy, which is not at a high level of development in our country, despite the fact that many authors are engaged in theoretical and/or practical work in this area.

  11. One-Tube-Only Standardized Site-Directed Mutagenesis: An Alternative Approach to Generate Amino Acid Substitution Collections

    NARCIS (Netherlands)

    Mingo, J.; Erramuzpe, A.; Luna, S.; Aurtenetxe, O.; Amo, L.; Diez, I.; Schepens, J.T.G.; Hendriks, W.J.A.J.; Cortes, J.M.; Pulido, R.

    2016-01-01

    Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates

  12. Study of elevated temperature design standard against thermal loads

    International Nuclear Information System (INIS)

    Kasahara, Naoto; Asayama, Tai; Morishita, Masaki

    2001-01-01

    Elevated temperature components must be designed against both pressure and thermal loads. In the case of the sodium circuits of fast breeder reactors, the restriction imposed by the pressure load is small because of the high boiling point of sodium. Design approaches for thermal loads (displacement-controlled) are compared with those for pressure loads (load-controlled). Considering the differences between these two approaches, a concept for an elevated temperature design standard that takes the nature of thermal loads fully into account is proposed. This concept is the basis of load evaluation techniques and an inelastic analysis guide, which are being developed. Finally, problems and plans to realize the above concept are discussed. (author)

  13. In search of standards to support circularity in product policies: A systematic approach.

    Science.gov (United States)

    Tecchio, Paolo; McAlister, Catriona; Mathieux, Fabrice; Ardente, Fulvio

    2017-12-01

    The aspiration of a circular economy is to shift material flows toward a zero waste and pollution production system. The process of shifting to a circular economy has been initiated by the European Commission in their action plan for the circular economy. The EU Ecodesign Directive is a key policy in this transition. However, to date the focus of access to market requirements on products has primarily been upon energy efficiency. The absence of adequate metrics and standards has been a key barrier to the inclusion of resource efficiency requirements. This paper proposes a framework to boost sustainable engineering and resource use by systematically identifying standardization needs and features. Standards can then support the setting of appropriate material efficiency requirements in EU product policy. Three high-level policy goals concerning material efficiency of products were identified: embodied impact reduction, lifetime extension and residual waste reduction. Through a lifecycle perspective, a matrix of interactions among material efficiency topics (recycled content, re-used content, relevant material content, durability, upgradability, reparability, re-manufacturability, reusability, recyclability, recoverability, relevant material separability) and policy goals was created. The framework was tested on case studies for electronic displays and washing machines. For potential material efficiency requirements, specific standardization needs were identified, such as adequate metrics for performance measurements, reliable and repeatable tests, and calculation procedures. The proposed novel framework aims to provide a method by which to identify key material efficiency considerations within the policy context, and to map out the generic and product-specific standardisation needs to support ecodesign. Via such an approach, many different stakeholders (industry, academics, policy makers, non-governmental organizations etc.) can be involved in material efficiency

  14. Reconstruction of reflectance data using an interpolation technique.

    Science.gov (United States)

    Abed, Farhad Moghareh; Amirshahi, Seyed Hossein; Abed, Mohammad Reza Moghareh

    2009-03-01

    A linear interpolation method is applied for reconstruction of reflectance spectra of Munsell as well as ColorChecker SG color chips from the corresponding colorimetric values under a given set of viewing conditions. Hence, different types of lookup tables (LUTs) have been created to connect the colorimetric and spectrophotometeric data as the source and destination spaces in this approach. To optimize the algorithm, different color spaces and light sources have been used to build different types of LUTs. The effects of applied color datasets as well as employed color spaces are investigated. Results of recovery are evaluated by the mean and the maximum color difference values under other sets of standard light sources. The mean and the maximum values of root mean square (RMS) error between the reconstructed and the actual spectra are also calculated. Since the speed of reflectance reconstruction is a key point in the LUT algorithm, the processing time spent for interpolation of spectral data has also been measured for each model. Finally, the performance of the suggested interpolation technique is compared with that of the common principal component analysis method. According to the results, using the CIEXYZ tristimulus values as a source space shows priority over the CIELAB color space. Besides, the colorimetric position of a desired sample is a key point that indicates the success of the approach. In fact, because of the nature of the interpolation technique, the colorimetric position of the desired samples should be located inside the color gamut of available samples in the dataset. The resultant spectra that have been reconstructed by this technique show considerable improvement in terms of RMS error between the actual and the reconstructed reflectance spectra as well as CIELAB color differences under the other light source in comparison with those obtained from the standard PCA technique.
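
    The lookup-table idea described above can be sketched as follows, assuming piecewise-linear interpolation from CIEXYZ coordinates to reflectance spectra over a training set. The training data here are random placeholders rather than Munsell or ColorChecker SG measurements, and the scipy interpolator is a stand-in for whatever LUT structure the authors actually used.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Placeholder training set: 200 colour samples with known CIEXYZ coordinates and
# 31-band reflectance spectra (400-700 nm, 10 nm steps). Values are random stand-ins.
rng = np.random.default_rng(0)
train_xyz = rng.uniform(0.0, 1.0, size=(200, 3))
train_reflectance = rng.uniform(0.0, 1.0, size=(200, 31))

# Piecewise-linear interpolation over the 3-D colorimetric space acts as the lookup table.
lut = LinearNDInterpolator(train_xyz, train_reflectance)

def reconstruct(xyz):
    """Estimate a reflectance spectrum for a tristimulus triple.

    Returns NaNs if the query lies outside the convex hull of the training gamut,
    which mirrors the caveat above that samples must lie inside the dataset gamut.
    """
    return lut(np.atleast_2d(xyz))[0]

estimate = reconstruct([0.4, 0.5, 0.3])
print("first 5 reconstructed bands:", np.round(estimate[:5], 3))
```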

  15. Robotic and endoscopic transaxillary thyroidectomies may be cost prohibitive when compared to standard cervical thyroidectomy: a cost analysis.

    Science.gov (United States)

    Cabot, Jennifer C; Lee, Cho Rok; Brunaud, Laurent; Kleiman, David A; Chung, Woong Youn; Fahey, Thomas J; Zarnegar, Rasa

    2012-12-01

    This study presents a cost analysis of the standard cervical, gasless transaxillary endoscopic, and gasless transaxillary robotic thyroidectomy approaches based on medical costs in the United States. A retrospective review of 140 patients who underwent standard cervical, transaxillary endoscopic, or transaxillary robotic thyroidectomy at 2 tertiary centers was conducted. The cost model included operating room charges, anesthesia fee, consumables cost, equipment depreciation, and maintenance cost. Sensitivity analyses assessed individual cost variables. The mean operative times for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were 121 ± 18.9, 185 ± 26.0, and 166 ± 29.4 minutes, respectively. The total costs for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were $9,028 ± $891, $12,505 ± $1,222, and $13,670 ± $1,384, respectively. Transaxillary approaches were significantly more expensive than the standard cervical technique (standard cervical vs. transaxillary endoscopic and vs. transaxillary robotic). The transaxillary approaches matched the standard cervical cost only when transaxillary endoscopic operative time decreased to 111 minutes and transaxillary robotic operative time decreased to 68 minutes. Increasing the case load did not resolve the cost difference. Transaxillary endoscopic and transaxillary robotic thyroidectomies are significantly more expensive than the standard cervical approach. Decreasing operative times reduces this cost difference. The greater expense may be prohibitive in countries with a flat reimbursement schedule. Copyright © 2012 Mosby, Inc. All rights reserved.

  16. STANDARDIZING QUALITY ASSESSMENT OF FUSED REMOTELY SENSED IMAGES

    Directory of Open Access Journals (Sweden)

    C. Pohl

    2017-09-01

    Full Text Available The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria used. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  17. Standardizing Quality Assessment of Fused Remotely Sensed Images

    Science.gov (United States)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the criteria used. Depending on the criteria and indices, the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.

  18. Assessment of chromium biostabilization in contaminated soils using standard leaching and sequential extraction techniques

    International Nuclear Information System (INIS)

    Papassiopi, Nymphodora; Kontoyianni, Athina; Vaxevanidou, Katerina; Xenidis, Anthimos

    2009-01-01

    The iron-reducing microorganism Desulfuromonas palmitatis was evaluated as a potential biostabilization agent for the remediation of chromate-contaminated soils. D. palmitatis was used for the treatment of soil samples artificially contaminated with Cr(VI) at two levels, i.e. 200 and 500 mg kg⁻¹. The efficiency of the treatment was evaluated by applying several standard extraction techniques to the soil samples before and after treatment, such as the EN12457 standard leaching test, the US EPA 3060A alkaline digestion method and the BCR sequential extraction procedure. The water-soluble chromium, as evaluated with the EN leaching test, was found to decrease after the biostabilization treatment from 13 to less than 0.5 mg kg⁻¹ and from 120 to 5.6 mg kg⁻¹ for the soil samples contaminated with 200 and 500 mg Cr(VI) per kg of soil, respectively. The BCR sequential extraction scheme, although not providing accurate estimates of the initial chromium speciation in contaminated soils, proved to be a useful tool for monitoring the relative changes in element partitioning as a consequence of the stabilization treatment. After bioreduction, the percentage of chromium retained in the two least soluble BCR fractions, i.e. the 'oxidizable' and 'residual' fractions, increased from 54 and 73% to more than 96% in both soils.

  19. Quantum functional analysis non-coordinate approach

    CERN Document Server

    Helemskii, A Ya

    2010-01-01

    This book contains a systematic presentation of quantum functional analysis, a mathematical subject also known as operator space theory. Created in the 1980s, it nowadays is one of the most prominent areas of functional analysis, both as a field of active research and as a source of numerous important applications. The approach taken in this book differs significantly from the standard approach used in studying operator space theory. Instead of viewing "quantized coefficients" as matrices in a fixed basis, in this book they are interpreted as finite rank operators in a fixed Hilbert space. This allows the author to replace matrix computations with algebraic techniques of module theory and tensor products, thus achieving a more invariant approach to the subject. The book can be used by graduate students and research mathematicians interested in functional analysis and related areas of mathematics and mathematical physics. Prerequisites include standard courses in abstract algebra and functional analysis.

  20. Comparison of Image Processing Techniques for Nonviable Tissue Quantification in Late Gadolinium Enhancement Cardiac Magnetic Resonance Images.

    Science.gov (United States)

    Carminati, M Chiara; Boniotti, Cinzia; Fusini, Laura; Andreini, Daniele; Pontone, Gianluca; Pepi, Mauro; Caiani, Enrico G

    2016-05-01

    The aim of this study was to compare the performance of quantitative methods, either semiautomated or automated, for left ventricular (LV) nonviable tissue analysis from cardiac magnetic resonance late gadolinium enhancement (CMR-LGE) images. The investigated segmentation techniques were: (i) n-standard deviations thresholding; (ii) full width at half maximum thresholding; (iii) Gaussian mixture model classification; and (iv) fuzzy c-means clustering. These algorithms were applied either in each short axis slice (single-slice approach) or globally considering the entire short-axis stack covering the LV (global approach). CMR-LGE images from 20 patients with ischemic cardiomyopathy were retrospectively selected, and results from each technique were assessed against manual tracing. All methods provided comparable performance in terms of accuracy in scar detection, computation of local transmurality, and high correlation in scar mass compared with the manual technique. In general, no significant difference between single-slice and global approach was noted. The reproducibility of manual and investigated techniques was confirmed in all cases with slightly lower results for the nSD approach. Automated techniques resulted in accurate and reproducible evaluation of LV scars from CMR-LGE in ischemic patients with performance similar to the manual technique. Their application could minimize user interaction and computational time, even when compared with semiautomated approaches.
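
    Two of the four compared segmentation methods can be written down in a few lines. The sketch below implements n-standard-deviations and full-width-at-half-maximum thresholding on a synthetic LGE slice; the masks, signal levels, and the choice n = 5 are illustrative assumptions, not the study's data or parameters.

```python
import numpy as np

def n_sd_threshold(intensities, remote_mask, n=5.0):
    """n-standard-deviations method: voxels brighter than mean + n*SD of remote
    (healthy) myocardium are labelled as enhanced (scar)."""
    remote = intensities[remote_mask]
    threshold = remote.mean() + n * remote.std()
    return intensities >= threshold

def fwhm_threshold(intensities, myocardium_mask):
    """Full-width-at-half-maximum method: voxels above half of the maximum
    signal within the myocardium are labelled as enhanced."""
    threshold = 0.5 * intensities[myocardium_mask].max()
    return (intensities >= threshold) & myocardium_mask

# Toy LGE slice: dark healthy myocardium with a bright enhanced region.
rng = np.random.default_rng(1)
image = rng.normal(100.0, 10.0, size=(64, 64))            # healthy signal
image[20:30, 20:30] = rng.normal(400.0, 20.0, (10, 10))   # hyperenhanced "scar"

myocardium = np.ones_like(image, dtype=bool)   # pretend the whole slice is myocardium
remote = np.zeros_like(image, dtype=bool)
remote[40:60, 40:60] = True                    # operator-chosen remote region

scar_nsd = n_sd_threshold(image, remote, n=5.0) & myocardium
scar_fwhm = fwhm_threshold(image, myocardium)
print("scar area (n-SD):", scar_nsd.sum(), "voxels; (FWHM):", scar_fwhm.sum(), "voxels")
```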

  1. Fibrinolysis standards: a review of the current status.

    Science.gov (United States)

    Thelwell, C

    2010-07-01

    Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products, or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte, and the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology, and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must however be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.

  2. Technical Approaches For Implementation Of CSMA/CD On A Fiber Optic Medium

    Science.gov (United States)

    Cronin, William J.

    1990-01-01

    The IEEE 802.3 10BASE-F Task Force is currently working on two approaches for 10 Megabit per second CSMA/CD networks using fiber optic media. The two approaches have distinct application spaces. One approach is based on Passive Fiber Optic Star technology, the other on Active Star technology. One standard, covering both application spaces, is expected sometime in 1991. This paper will summarize the proposals presented to the committee, and its predecessors, over the last two years. Three types of proposals were considered for the Active Star application space. These included CSMA/CD on a ring topology, an Active Star based on the Fiber Optic Inter Repeater Link, and an Active Star based on a synchronous signaling scheme. Three proposals were considered for the Passive Star application space. These included techniques for detecting collisions via special pulses, code rule violations, and amplitude sensing techniques. The paper concludes with a summary of the proposals being standardized by the 10BASE-F Task Force.

  3. Minimally Invasive Calcaneal Displacement Osteotomy Site Using a Reference Kirschner Wire: A Technique Tip.

    Science.gov (United States)

    Lee, Moses; Guyton, Gregory P; Zahoor, Talal; Schon, Lew C

    2016-01-01

    As a standard open approach, the lateral oblique incision has been widely used for calcaneal displacement osteotomy. However, just as with other orthopedic procedures that use an open approach, complications, including wound healing problems and neurovascular injury in the heel, have been reported. To help avoid these limitations, a percutaneous technique using a Shannon burr for calcaneal displacement osteotomy was introduced. However, relying on a free-hand technique without direct visualization at the osteotomy site has been a major obstacle for this technique. To address this problem, we developed a technical tip using a reference Kirschner wire. A reference Kirschner wire technique provides a reliable and accurate guide for minimally invasive calcaneal displacement osteotomy. Also, the technique should be easy to learn for surgeons new to the procedure. Copyright © 2016 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  4. Proteomic and metabolomic approaches to biomarker discovery

    CERN Document Server

    Issaq, Haleem J

    2013-01-01

    Proteomic and Metabolomic Approaches to Biomarker Discovery demonstrates how to leverage biomarkers to improve accuracy and reduce errors in research. Disease biomarker discovery is one of the most vibrant and important areas of research today, as the identification of reliable biomarkers has an enormous impact on disease diagnosis, selection of treatment regimens, and therapeutic monitoring. Various techniques are used in the biomarker discovery process, including techniques used in proteomics, the study of the proteins that make up an organism, and metabolomics, the study of chemical fingerprints created from cellular processes. Proteomic and Metabolomic Approaches to Biomarker Discovery is the only publication that covers techniques from both proteomics and metabolomics and includes all steps involved in biomarker discovery, from study design to study execution.  The book describes methods, and presents a standard operating procedure for sample selection, preparation, and storage, as well as data analysis...

  5. Standards for holdup measurement

    International Nuclear Information System (INIS)

    Zucker, M.S.

    1982-01-01

    Holdup measurements, needed for material balance, depend intensively on standards and on the interpretation of the calibration procedure. More than in other measurements, the calibration procedure using the standard becomes part of the standard. Standards practical for field use, along with calibration techniques, have been developed. While accuracy in holdup measurements is comparatively poor, avoidance of bias is a necessary goal.

  6. A Generalized Form of Context-Dependent Psychophysiological Interactions (gPPI): A Comparison to Standard Approaches

    Science.gov (United States)

    McLaren, Donald G.; Ries, Michele L.; Xu, Guofan; Johnson, Sterling C.

    2012-01-01

    Functional MRI (fMRI) allows one to study task-related regional responses and task-dependent connectivity analysis using psychophysiological interaction (PPI) methods. The latter affords the additional opportunity to understand how brain regions interact in a task-dependent manner. The current implementation of PPI in Statistical Parametric Mapping (SPM8) is configured primarily to assess connectivity differences between two task conditions, when in practice fMRI tasks frequently employ more than two conditions. Here we evaluate how a generalized form of context-dependent PPI (gPPI; http://www.nitrc.org/projects/gppi), which is configured to automatically accommodate more than two task conditions in the same PPI model by spanning the entire experimental space, compares to the standard implementation in SPM8. These comparisons are made using both simulations and an empirical dataset. In the simulated dataset, we compare the interaction beta estimates to their expected values and model fit using the Akaike Information Criterion (AIC). We found that interaction beta estimates in gPPI were robust to different simulated data models, were not different from the expected beta value, and had better model fits than when using standard PPI (sPPI) methods. In the empirical dataset, we compare the model fit of the gPPI approach to sPPI. We found that the gPPI approach improved model fit compared to sPPI. There were several regions that became non-significant with gPPI. These regions all showed significantly better model fits with gPPI. Also, there were several regions where task-dependent connectivity was only detected using gPPI methods, also with improved model fit. Regions that were detected with all methods had more similar model fits. These results suggest that gPPI may have greater sensitivity and specificity than standard implementation in SPM. This notion is tempered slightly as there is no gold standard; however, data simulations with a known outcome support our
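
    A simplified sketch of the gPPI idea, one interaction regressor per task condition so that the model spans the full experimental space, is given below. It deliberately omits the HRF deconvolution and convolution steps that the SPM toolbox performs, and all time series, onsets, and function names are synthetic placeholders rather than the toolbox's actual interface.

```python
import numpy as np

def gppi_design(seed_ts, condition_onsets, n_scans):
    """Build a simplified gPPI-style design matrix.

    Columns: one boxcar per condition, one seed-by-condition interaction per condition,
    plus the seed time course and an intercept. Real gPPI forms the interactions at the
    neural level (after HRF deconvolution); that step is omitted here for brevity.
    """
    columns, names = [], []
    for name, onsets in condition_onsets.items():
        boxcar = np.zeros(n_scans)
        for start, dur in onsets:
            boxcar[start:start + dur] = 1.0
        columns += [boxcar, boxcar * seed_ts]   # task regressor and its PPI term
        names += [name, f"PPI_{name}"]
    columns += [seed_ts, np.ones(n_scans)]
    names += ["seed", "constant"]
    return np.column_stack(columns), names

# Synthetic example with three conditions (more than the two that standard PPI handles).
n_scans = 200
rng = np.random.default_rng(2)
seed = rng.standard_normal(n_scans)
onsets = {"condA": [(10, 20), (100, 20)],
          "condB": [(40, 20), (130, 20)],
          "condC": [(70, 20), (160, 20)]}
X, names = gppi_design(seed, onsets, n_scans)

# Fit a target region's time series by ordinary least squares and inspect the PPI betas.
target = 0.8 * X[:, names.index("PPI_condB")] + rng.standard_normal(n_scans)
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
print(dict(zip(names, np.round(beta, 2))))
```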

  7. A standards-based approach to quality improvement for HIV services at Zambia Defence Force facilities: results and lessons learned.

    Science.gov (United States)

    Kols, Adrienne; Kim, Young-Mi; Bazant, Eva; Necochea, Edgar; Banda, Joseph; Stender, Stacie

    2015-07-01

    The Zambia Defence Force adopted the Standards-Based Management and Recognition approach to improve the quality of the HIV-related services at its health facilities. This quality improvement intervention relies on comprehensive, detailed assessment tools to communicate and verify adherence to national standards of care, and to test and implement changes to improve performance. A quasi-experimental evaluation of the intervention was conducted at eight Zambia Defence Force primary health facilities (four facilities implemented the intervention and four did not). Data from three previous analyses are combined to assess the effect of Standards-Based Management and Recognition on three domains: facility readiness to provide services; observed provider performance during antiretroviral therapy (ART) and antenatal care consultations; and provider perceptions of the work environment. Facility readiness scores for ART improved on four of the eight standards at intervention sites, and one standard at comparison sites. Facility readiness scores for prevention of mother-to-child transmission (PMTCT) of HIV increased by 15 percentage points at intervention sites and 7 percentage points at comparison sites. Provider performance improved significantly at intervention sites for both ART and antenatal care services (for ART, from 58% to 84%). Provider perceptions of the work environment improved at intervention sites and declined at comparison sites; differences in trends between study groups were significant for eight items. A standards-based approach to quality improvement proved effective in supporting healthcare managers and providers to deliver ART and PMTCT services in accordance with evidence-based standards in a health system suffering from staff shortages.

  8. Olfactory groove meningiomas from neurosurgical and ear, nose, and throat perspectives: approaches, techniques, and outcomes.

    Science.gov (United States)

    Spektor, Sergey; Valarezo, Javier; Fliss, Dan M; Gil, Ziv; Cohen, Jose; Goldman, Jose; Umansky, Felix

    2005-10-01

    To review the surgical approaches, techniques, outcomes, and recurrence rates in a series of 80 olfactory groove meningioma (OGM) patients operated on between 1990 and 2003. Eighty patients underwent 81 OGM surgeries. Tumor diameter varied from 2 to 9 cm (average, 4.6 cm). In 35 surgeries (43.2%), the tumor was removed through bifrontal craniotomy; nine operations (11.1%) were performed through a unilateral subfrontal approach; 18 surgeries (22.2%) were performed through a pterional approach; seven surgeries (8.6%) were carried out using a fronto-orbital craniotomy; and 12 procedures (14.8%) were accomplished via a subcranial approach. Nine patients (11.3%) had undergone surgery previously and had recurrent tumor. Total removal was obtained in 72 patients (90.0%); subtotal removal was achieved in 8 patients (10.0%). Two patients, one with total and one with subtotal removal, had atypical (World Health Organization Grade II) meningiomas, whereas 78 patients had World Health Organization Grade I tumors. There was no operative mortality and no new permanent focal neurological deficit besides anosmia. Twenty-five patients (31.3%) experienced surgery-related complications. There were no recurrences in 75 patients (93.8%) 6 to 164 months (mean, 70.8 mo) after surgery. Three patients (3.8%) were lost to follow-up. In two patients (2.5%) with subtotal removal, the residual evidenced growth on computed tomography and/or magnetic resonance imaging 1 year after surgery. One of them had an atypical meningioma. The second, a multiple meningiomata patient, was operated on twice in this series. A variety of surgical approaches are used for OGM resection. An approach tailored to the tumor's size, location, and extension, combined with modern microsurgical cranial base techniques, allows full OGM removal with minimal permanent morbidity, excellent neurological outcome, and very low recurrence rates.

  9. Whole-genome-based Mycobacterium tuberculosis surveillance: a standardized, portable, and expandable approach.

    Science.gov (United States)

    Kohl, Thomas A; Diel, Roland; Harmsen, Dag; Rothgänger, Jörg; Walter, Karen Meywald; Merker, Matthias; Weniger, Thomas; Niemann, Stefan

    2014-07-01

    Whole-genome sequencing (WGS) allows for effective tracing of Mycobacterium tuberculosis complex (MTBC) (tuberculosis pathogens) transmission. However, it is difficult to standardize and, therefore, is not yet employed for interlaboratory prospective surveillance. To allow its widespread application, solutions for data standardization and storage in an easily expandable database are urgently needed. To address this question, we developed a core genome multilocus sequence typing (cgMLST) scheme for clinical MTBC isolates using the Ridom SeqSphere(+) software, which transfers the genome-wide single nucleotide polymorphism (SNP) diversity into an allele numbering system that is standardized, portable, and not computationally intensive. To test its performance, we performed WGS analysis of 26 isolates with identical IS6110 DNA fingerprints and spoligotyping patterns from a longitudinal outbreak in the federal state of Hamburg, Germany (notified between 2001 and 2010). The cgMLST approach (3,041 genes) discriminated the 26 strains with a resolution comparable to that of SNP-based WGS typing (one major cluster of 22 identical or closely related and four outlier isolates with at least 97 distinct SNPs or 63 allelic variants). Resulting tree topologies are highly congruent and grouped the isolates in both cases analogously. Our data show that SNP- and cgMLST-based WGS analyses facilitate high-resolution discrimination of longitudinal MTBC outbreaks. cgMLST allows for a meaningful epidemiological interpretation of the WGS genotyping data. It enables standardized WGS genotyping for epidemiological investigations, e.g., on the regional public health office level, and the creation of web-accessible databases for global TB surveillance with an integrated early warning system. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
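
    The allele-numbering comparison that underlies cgMLST can be illustrated with a toy sketch: isolates are compared by counting core genes whose allele numbers differ. The profiles below are invented and far smaller than the 3,041-gene scheme described above; they are not drawn from the Hamburg outbreak data.

```python
from itertools import combinations

# Hypothetical cgMLST profiles: isolate -> allele number per core gene (0 = locus missing).
profiles = {
    "isolate_A": {"gene1": 1, "gene2": 4, "gene3": 2, "gene4": 7},
    "isolate_B": {"gene1": 1, "gene2": 4, "gene3": 2, "gene4": 7},
    "isolate_C": {"gene1": 1, "gene2": 5, "gene3": 2, "gene4": 9},
}

def allelic_distance(a, b):
    """Number of core genes with different allele numbers, ignoring missing loci."""
    shared = [g for g in a if g in b and a[g] != 0 and b[g] != 0]
    return sum(a[g] != b[g] for g in shared)

for x, y in combinations(profiles, 2):
    print(x, "vs", y, "->", allelic_distance(profiles[x], profiles[y]), "allele differences")
```

    A distance matrix of this kind is what clustering and outbreak-tree construction would then operate on, analogous to the SNP distances quoted in the abstract.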

  10. Russian Language Development Assessment as a Standardized Technique for Assessing Communicative Function in Children Aged 3–9 Years

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.,

    2016-10-01

    Full Text Available The article describes the Russian Language Development Assessment, a standardized individual diagnostic tool for children aged from 3 to 9 that helps to assess the following components of a child's communicative function: passive vocabulary, expressive vocabulary, knowledge of semantic constructs with logical, temporal and spatial relations, passive perception and active use of syntactic and morphological features of words in a sentence, active and passive phonological awareness, and active and passive knowledge of syntactic structures and categories. The article provides descriptions of the content and diagnostic procedures for all 7 subtests included in the assessment (Passive Vocabulary, Active Vocabulary, Linguistic Operators, Sentence Structure, Word Structure, Phonology, Sentence Repetition). Based on the data collected in a study that involved 86 first-graders of a Moscow school, the article analyzes the internal consistency and construct validity of each subtest of the technique. It concludes that the Russian Language Development Assessment can be of much use both for diagnostic purposes and for supporting children with ASD, given the lack of standardized tools for assessing language and speech development in Russian and the importance of such a measure in general.
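
    The internal-consistency analysis mentioned above is typically based on a statistic such as Cronbach's alpha; a minimal sketch for a single subtest is given below. The score matrix is simulated, and the choice of Cronbach's alpha is an assumption, since the abstract does not name the exact coefficient used.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (examinees x items) score matrix."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy data: 10 children x 6 items of one subtest (scores 0-2), purely illustrative.
rng = np.random.default_rng(3)
ability = rng.normal(size=(10, 1))
scores = np.clip(np.round(1 + ability + 0.5 * rng.normal(size=(10, 6))), 0, 2)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```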

  11. A comparative study of standard vs. high definition colonoscopy for adenoma and hyperplastic polyp detection with optimized withdrawal technique.

    Science.gov (United States)

    East, J E; Stavrindis, M; Thomas-Gibson, S; Guenther, T; Tekkis, P P; Saunders, B P

    2008-09-15

    Colonoscopy has a known miss rate for polyps and adenomas. High definition (HD) colonoscopes may allow detection of subtle mucosal change, potentially aiding detection of adenomas and hyperplastic polyps. To compare detection rates between HD and standard definition (SD) colonoscopy. Prospective, cohort study with optimized withdrawal technique (withdrawal time >6 min, antispasmodic, position changes, re-examining flexures and folds). One hundred and thirty patients attending for routine colonoscopy were examined with either SD (n = 72) or HD (n = 58) colonoscopes. Groups were well matched. Sixty per cent of patients had at least one adenoma detected with SD vs. 71% with HD, P = 0.20, relative risk (benefit) 1.32 (95% CI 0.85-2.04). Eighty-eight adenomas (mean +/- standard deviation 1.2 +/- 1.4) were detected using SD vs. 93 (1.6 +/- 1.5) with HD, P = 0.12; however more nonflat, diminutive (9 mm) hyperplastic polyps was 7% (0.09 +/- 0.36). High definition did not lead to a significant increase in adenoma or hyperplastic polyp detection, but may help where comprehensive lesion detection is paramount. High detection rates appear possible with either SD or HD, when using an optimized withdrawal technique.

  12. Selection of suitable e-learning approach using TOPSIS technique with best ranked criteria weights

    Science.gov (United States)

    Mohammed, Husam Jasim; Kasim, Maznah Mat; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    This paper compares the performance of four rank-based weighting assessment techniques, Rank Sum (RS), Rank Reciprocal (RR), Rank Exponent (RE), and Rank Order Centroid (ROC), on five identified e-learning criteria in order to select the best weighting method. A total of 35 experts in a public university in Malaysia were asked to rank the criteria and to evaluate five e-learning approaches: blended learning, flipped classroom, ICT-supported face-to-face learning, synchronous learning, and asynchronous learning. The best ranked criteria weights, defined as the weights with the smallest total absolute difference from the geometric mean of all weights, were then used to select the most suitable e-learning approach by means of the TOPSIS method. The results show that the RR weights are the best, while the flipped classroom is the most suitable e-learning approach. The paper thus develops a decision framework to aid decision makers (DMs) in choosing the most suitable weighting method for solving MCDM problems.
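
    A minimal sketch of the two building blocks named above, Rank Reciprocal weighting and TOPSIS, is given below in Python; the decision matrix is randomly generated for illustration and does not reproduce the study's expert data.

        import numpy as np

        def rank_reciprocal_weights(ranks):
            """Rank Reciprocal (RR) weights: w_i = (1/r_i) / sum_j(1/r_j)."""
            inv = 1.0 / np.asarray(ranks, dtype=float)
            return inv / inv.sum()

        def topsis(decision_matrix, weights, benefit):
            """Closeness coefficients of alternatives (rows) over criteria (columns)."""
            X = np.asarray(decision_matrix, dtype=float)
            R = X / np.sqrt((X ** 2).sum(axis=0))        # vector normalization
            V = R * weights                              # weighted normalized matrix
            ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
            anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
            d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
            d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
            return d_minus / (d_plus + d_minus)

        # Invented scores of 5 e-learning approaches on 5 benefit criteria ranked 1..5
        ranks = [1, 2, 3, 4, 5]
        scores = np.random.default_rng(0).uniform(1, 9, size=(5, 5))
        closeness = topsis(scores, rank_reciprocal_weights(ranks), benefit=np.full(5, True))
        print(np.argsort(-closeness) + 1)   # alternatives ordered from best to worst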

  13. Image evaluation of HIV encephalopathy: a multimodal approach using quantitative MR techniques

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Paulo T.C.; Escorsi-Rosset, Sara [University of Sao Paulo, Radiology Section, Internal Medicine Department, Ribeirao Preto School of Medicine, Sao Paulo (Brazil); Cervi, Maria C. [University of Sao Paulo, Department of Pediatrics, Ribeirao Preto School of Medicine, Sao Paulo (Brazil); Santos, Antonio Carlos [University of Sao Paulo, Radiology Section, Internal Medicine Department, Ribeirao Preto School of Medicine, Sao Paulo (Brazil); Hospital das Clinicas da FMRP-USP, Ribeirao Preto, SP (Brazil)

    2011-11-15

    A multimodal approach to human immunodeficiency virus (HIV) encephalopathy using quantitative magnetic resonance (MR) techniques can demonstrate brain changes that are not detectable with conventional magnetic resonance imaging (MRI) alone. The aim of this study was to compare conventional MRI with quantitative MR techniques, such as magnetic resonance spectroscopy (MRS) and relaxometry, and to determine whether quantitative techniques are more sensitive than conventional imaging to brain changes caused by HIV infection. We prospectively studied nine HIV-positive children (mean age 6 years, range 5 to 8 years) and nine controls (mean age 7.3 years, range 3 to 10 years) using MRS and relaxometry. Examinations were carried out on 1.5-T equipment. HIV-positive patients presented with only minor findings, and all control patients had normal conventional MR findings. MRS showed an increase in choline to creatine (CHO/CRE) ratios bilaterally in both frontal gray and white matter, in the left parietal white matter, and in the total CHO/CRE ratio. In contrast, N-acetylaspartate to creatine (NAA/CRE) ratios did not differ significantly between the two groups. Relaxometry showed significant bilateral abnormalities, with lengthening of relaxation times in many regions in HIV-positive patients. Conventional MRI is not sensitive to early brain changes caused by HIV infection. Quantitative techniques such as MRS and relaxometry appear to be valuable tools in the diagnosis of these early changes. Therefore, a multimodal quantitative study can be useful in demonstrating and understanding the physiopathology of the disease. (orig.)

  14. A Secure Test Technique for Pipelined Advanced Encryption Standard

    Science.gov (United States)

    Shi, Youhua; Togawa, Nozomu; Yanagisawa, Masao; Ohtsuki, Tatsuo

    In this paper, we present a Design-for-Secure-Test (DFST) technique for pipelined AES that guarantees both security and test quality during testing. Unlike previous works, the proposed method keeps all secrets inside the circuit while providing high test quality and fault diagnosis ability. Furthermore, the proposed DFST technique can significantly reduce test application time, test data volume, and test generation effort as additional benefits.

  15. Safety of achilles detachment and reattachment using a standard midline approach to insertional enthesophytes.

    Science.gov (United States)

    McAlister, Jeffrey E; Hyer, Christopher F

    2015-01-01

    Detachment with reattachment of the Achilles tendon is a common surgery for debridement of retrocalcaneal exostosis, bursitis, and other insertional pathologic entities. The technique involves a midline skin incision over the posterior Achilles down to the tendon. The distal Achilles attachment is removed in a U-shaped manner, leaving the medial and lateral flares but exposing the posterior spur. This midline approach provides excellent exposure and allows for rapid and efficient surgical debridement. The tendon is reapproximated and repaired with a suture anchor to facilitate fixation to the posterior calcaneus. Some surgeons have expressed concern that the rupture risk could be increased in the postoperative period using this technique. The present study was a retrospective medical record review of 98 patients (100 feet) who had undergone a midline approach with Achilles reattachment after insertional Achilles debridement during a 3-year period. The demographic and comorbidity data were collected and analyzed. The outcome measures were postoperative rupture and the need for revision surgery. The mean age was 51.9 years, and the patients included 59 females (60.2%) and 39 males (39.8%). The complications included 4 rupture or avulsion revisions (4.0%) and 2 recurrent pain and tendinitis revisions (2.0%). The most common repeat repair procedure included hardware removal and a flexor hallucis longus transfer or augmentation. Nine patients (9.0%) had wound complications, 7 (77.8%) of which necessitated incision and drainage. The midline approach with Achilles detachment and reattachment is a safe and effective method of surgical treatment of insertional Achilles pathologic entities. The low reoperation rate of 4.0% will allow foot and ankle surgeons to safely rely on this approach. Copyright © 2015 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.

  16. A multi-technique approach for detecting and evaluating material inconsistencies in historical banknotes.

    Science.gov (United States)

    Del Hoyo-Meléndez, Julio M; Gondko, Klaudia; Mendys, Agata; Król, Małgorzata; Klisińska-Kopacz, Anna; Sobczyk, Joanna; Jaworucka-Drath, Anda

    2016-09-01

    The identification of forged and genuine historical banknotes is an important problem for private collectors and researchers responsible for the care of numismatic collections. This paper presents a research approach for detecting material differences in historical banknotes through the use of microfading spectrometry along with other techniques such as hyperspectral image analysis, Fourier-transform infrared spectroscopy, and X-ray fluorescence spectrometry. Microfading spectrometry results showed higher sensitivity to light irradiation for an overprint ink used on a suspicious banknote relative to its counterparts. In addition, the spectrocolorimetric changes experienced by the paper substrates during microfade testing also provided a way for discriminating between two groups of banknotes. These variations have been confirmed after analyzing the spectral and physico-chemical data obtained using the abovementioned complementary techniques. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. A multi-technique approach for the characterization of Roman mural paintings

    Energy Technology Data Exchange (ETDEWEB)

    Toschi, Francesco [CNR-IMIP (Istituto di Metodologie Inorganiche e dei Plasmi), Area della Ricerca Roma 1, Via Salaria Km. 29,300, 00016 Monterotondo, Roma, (Italy); Paladini, Alessandra, E-mail: alessandra.paladini@cnr.it [CNR-IMIP (Istituto di Metodologie Inorganiche e dei Plasmi), Area della Ricerca Roma 1, Via Salaria Km. 29,300, 00016 Monterotondo, Roma, (Italy); Colosi, Francesca [CNR-ITABC (Istituto per le Tecnologie Applicate ai Beni Culturali), Area della Ricerca Roma 1, Via Salaria Km. 29,300, 00016 Monterotondo, Roma (Italy); Cafarelli, Patrizia; Valentini, Veronica [CNR-IMIP (Istituto di Metodologie Inorganiche e dei Plasmi), Area della Ricerca Roma 1, Via Salaria Km. 29,300, 00016 Monterotondo, Roma, (Italy); Falconieri, Mauro; Gagliardi, Serena [ENEA, C.R. Casaccia, via Anguillarese 301, 00060 Roma (Italy); Santoro, Paola [CNR-ISMA (Istituto di Studi sul Mediterraneo Antico), Area della Ricerca Roma 1, Via Salaria Km. 29,300, 00016 Monterotondo, Roma (Italy)

    2013-11-01

    In the framework of an ongoing archaeological study of the Sabina area, a countryside close to Rome, white and red samples of Roman wall paintings have been investigated by combining X-ray diffraction with different spectroscopic methodologies, namely laser-induced breakdown spectroscopy, μ-Raman and Fourier transform infrared attenuated total reflectance spectroscopy. The multi-technique approach used has allowed the unambiguous identification of the red pigment as red ochre and has provided insight into the provenance of both the pigment and the material used for the realization of the wall paintings. The experimental results have confirmed some assumptions on the use of local materials in Roman rural architecture.

  18. A new approach of watermarking technique by means multichannel wavelet functions

    Science.gov (United States)

    Agreste, Santa; Puccio, Luigia

    2012-12-01

    Digital piracy involving images, music, movies, books, and so on is a legal problem that has not yet found a solution. It therefore becomes crucial to create and develop methods and numerical algorithms to solve the copyright problems. In this paper we focus on a new watermarking approach applied to digital color images. Our aim is to describe the watermarking algorithm we developed, based on multichannel wavelet functions with multiplicity r = 3, called MCWM 1.0. We report extensive experiments and some important numerical results showing the robustness of the proposed algorithm against geometrical attacks.
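
    As a simple illustration of wavelet-domain watermarking in general (not the MCWM 1.0 multichannel scheme with multiplicity r = 3 described above), the Python sketch below embeds and detects a pseudo-random mark in one detail band of a single-level DWT; it assumes the PyWavelets package and uses an invented grayscale test image.

        import numpy as np
        import pywt  # PyWavelets, assumed to be installed

        def embed_watermark(image, alpha=0.05, seed=1):
            """Embed a pseudo-random +/-1 mark in the horizontal detail band of a
            single-level 2-D Haar DWT (generic sketch, not the MCWM 1.0 scheme)."""
            cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), 'haar')
            rng = np.random.default_rng(seed)
            mark = rng.choice([-1.0, 1.0], size=cH.shape)
            cH_marked = cH + alpha * np.abs(cH) * mark      # multiplicative embedding
            return pywt.idwt2((cA, (cH_marked, cV, cD)), 'haar'), mark

        def detect_watermark(marked, original, mark):
            """Correlate the coefficient difference with the candidate mark."""
            _, (cH_m, _, _) = pywt.dwt2(marked.astype(float), 'haar')
            _, (cH_o, _, _) = pywt.dwt2(original.astype(float), 'haar')
            diff = (cH_m - cH_o).ravel()
            return float(np.corrcoef(diff, mark.ravel())[0, 1])

        image = np.random.default_rng(0).integers(0, 256, size=(128, 128)).astype(float)
        marked, mark = embed_watermark(image)
        print(detect_watermark(marked, image, mark))  # clearly positive when the mark is present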

  19. Quantitative and statistical approaches to geography a practical manual

    CERN Document Server

    Matthews, John A

    2013-01-01

    Quantitative and Statistical Approaches to Geography: A Practical Manual is a practical introduction to some quantitative and statistical techniques of use to geographers and related scientists. The book is composed of 15 chapters, each of which begins with an outline of the purpose and necessary mechanics of a technique or group of techniques and concludes with exercises on the particular approach adopted. These exercises aim to enhance students' ability to use the techniques as part of the process by which sound judgments are made according to scientific standards while tackling complex problems. After a brief introduction to the principles of quantitative and statistical geography, the book goes on to deal with measures of central tendency; probability statements and maps; the problem of time-dependence, time-series analysis, non-normality, and data transformations; and the elements of sampling methodology. Other chapters cover confidence intervals and estimation from samples, statistical hy...

  20. Development, enhancement, and evaluation of aircraft measurement techniques for national ambient air quality standard criteria pollutants

    Science.gov (United States)

    Brent, Lacey Cluff

    The atmospheric contaminants most harmful to human health are designated Criteria Pollutants. To help Maryland attain the national ambient air quality standards (NAAQS) for Criteria Pollutants, and to improve our fundamental understanding of atmospheric chemistry, I conducted aircraft measurements in the Regional Atmospheric Measurement Modeling Prediction Program (RAMMPP). These data are used to evaluate model simulations and satellite observations. I developed techniques for improving airborne observation of two NAAQS pollutants, particulate matter (PM) and nitrogen dioxide (NO2). While structure and composition of organic aerosol are important for understanding PM formation, the molecular speciation of organic ambient aerosol remains largely unknown. The spatial distribution of reactive nitrogen is likewise poorly constrained. To examine water-soluble organic aerosol (WSOA) during an air pollution episode, I designed and implemented a shrouded aerosol inlet system to collect PM onto quartz fiber filters from a Cessna 402 research aircraft. Inlet evaluation conducted during a side-by-side flight with the NASA P3 demonstrated agreement to within 30%. An ion chromatographic mass spectrometric method developed using the NIST Standard Reference Material (SRM) 1649b Urban Dust, as a surrogate material resulted in acidic class separation and resolution of at least 34 organic acids; detection limits approach pg/g concentrations. Analysis of aircraft filter samples resulted in detection of 8 inorganic species and 16 organic acids of which 12 were quantified. Aged, re-circulated metropolitan air showed a greater number of dicarboxylic acids compared to air recently transported from the west. While the NAAQS for NO2 is rarely exceeded, it is a precursor molecule for ozone, America's most recalcitrant pollutant. Using cavity ringdown spectroscopy employing a light emitting diode (LED), I measured vertical profiles of NO2 (surface to 2.5 km) west (upwind) of the Baltimore

  1. Standardization of the Fricke gel dosimetry method and tridimensional dose evaluation using the magnetic resonance imaging technique

    International Nuclear Information System (INIS)

    Cavinato, Christianne Cobello

    2009-01-01

    This study standardized the method for obtaining the Fricke gel solution developed at IPEN. The results for different gel qualities used in the preparation of the solutions and the influence of the gelatin concentration on the response of the dosimetric solutions were compared. Type tests such as dose response dependence, minimum and maximum detection limits, and response reproducibility, among others, were carried out using different radiation types and the Optical Absorption (OA) spectrophotometry and Magnetic Resonance (MR) techniques. The useful dose ranges for 60Co gamma radiation and 6 MeV photons are 0.4 to 30.0 Gy and 0.5 to 100.0 Gy, using the OA and MR techniques, respectively. A study of ferric ion diffusion in the solution was performed to determine the optimum time interval between irradiation and sample evaluation; sharp MR images are obtained up to 2.5 hours after irradiation. A spherical phantom consisting of Fricke gel solution prepared with 5% by weight 270 Bloom gelatine (national quality) was developed for three-dimensional dose assessment using the Magnetic Resonance Imaging (MRI) technique. The Fricke gel solution prepared with 270 Bloom gelatine, which, in addition to its low cost, can be easily acquired on the national market, presents satisfactory results in terms of ease of handling, sensitivity, response reproducibility and consistency. The results confirm its applicability to three-dimensional dosimetry using the MRI technique. (author)
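
    For illustration, a linear dose-response calibration of the kind implied above can be fitted and inverted as in the Python sketch below; the absorbance values are invented and stand in for real OA readings.

        import numpy as np

        # Hypothetical calibration: optical absorbance read at several known doses
        # within the 0.4-30 Gy range reported for the OA technique.
        dose_gy    = np.array([0.4, 2.0, 5.0, 10.0, 20.0, 30.0])
        absorbance = np.array([0.021, 0.082, 0.198, 0.391, 0.770, 1.148])

        slope, intercept = np.polyfit(dose_gy, absorbance, 1)   # linear dose response

        def dose_from_absorbance(a):
            """Invert the linear calibration to estimate an unknown dose."""
            return (a - intercept) / slope

        print(round(dose_from_absorbance(0.50), 1), "Gy")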

  2. Maxillary sinus floor elevation via crestal approach: the evolution of the hydraulic pressure technique.

    Science.gov (United States)

    Lopez, Michele Antonio; Andreasi Bassi, Mirko; Confalone, Luca; Carinci, Francesco

    2014-01-01

    The current study describes an innovative protocol for the surgical maxillary sinus augmentation via a crestal approach that uses hydraulic pressure to lift the Schneiderian membrane and simultaneously fill the subantral space with a biomaterial for bone regeneration (nanocrystalline hydroxyapatite in aqueous solution). The technique in question combines the advantages of large amounts of grafted biomaterial with reduced trauma, high precision, and predictability.

  3. The emergence of international food safety standards and guidelines: understanding the current landscape through a historical approach.

    Science.gov (United States)

    Ramsingh, Brigit

    2014-07-01

    Following the Second World War, the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) teamed up to construct an International Codex Alimentarius (or 'food code') which emerged in 1963. The Codex Committee on Food Hygiene (CCFH) was charged with the task of developing microbial hygiene standards, although it found itself embroiled in debate with the WHO over the nature these standards should take. The WHO was increasingly relying upon the input of biometricians and especially the International Commission on Microbial Specifications for Foods (ICMSF) which had developed statistical sampling plans for determining the microbial counts in the final end products. The CCFH, however, was initially more focused on a qualitative approach which looked at the entire food production system and developed codes of practice as well as more descriptive end-product specifications which the WHO argued were 'not scientifically correct'. Drawing upon historical archival material (correspondence and reports) from the WHO and FAO, this article examines this debate over microbial hygiene standards and suggests that there are many lessons from history which could shed light upon current debates and efforts in international food safety management systems and approaches.

  4. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used at Industrial Research to fabricate thin titanium oxynitride films (TiOxNy; typical film thickness 100 nm). At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV 4He+ ions. Non-resonant nuclear reaction analysis (NRA) is performed to investigate the amounts of O and N in the deposited films using the reactions 16O(d,p)17O at 920 keV and 14N(d,α)12C at 1.4 MeV. Using a combination of these nuclear techniques, the stoichiometry as well as the thickness of the layers is revealed. However, when oxygen and nitrogen depth profiles are required for investigating stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper comparative IBA measurements of TiOxNy films with different compositions are presented and discussed
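
    As a rough illustration of how the combined measurements yield stoichiometry and thickness, the Python sketch below converts hypothetical areal densities (the kind of quantity RBS and NRA provide) into x, y and an approximate film thickness; all numbers, including the assumed atomic density, are invented.

        # Hypothetical areal densities (atoms/cm^2) as they might come out of the
        # RBS (Ti) and NRA (O, N) analyses described above.
        N_Ti = 4.8e17
        N_O  = 6.7e17
        N_N  = 2.9e17

        x = N_O / N_Ti   # O:Ti ratio -> x in TiOxNy
        y = N_N / N_Ti   # N:Ti ratio -> y in TiOxNy

        # A film thickness estimate follows if an atomic density is assumed
        # (here ~9.0e22 atoms/cm^3, a rough value for a dense Ti-O-N layer).
        thickness_nm = (N_Ti + N_O + N_N) / 9.0e22 * 1e7
        print(f"TiO{x:.2f}N{y:.2f}, ~{thickness_nm:.0f} nm")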

  5. Robotic-assisted partial nephrectomy: surgical technique using a 3-arm approach and sliding-clip renorrhaphy

    Directory of Open Access Journals (Sweden)

    Jose M. Cabello

    2009-04-01

    Full Text Available INTRODUCTION: For the treatment of renal tumors, minimally invasive nephron-sparing surgery has become increasingly performed due to proven efficiency and excellent functional and oncological outcomes. The introduction of robotics into urologic laparoscopic surgery has allowed surgeons to perform challenging procedures in a reliable and reproducible manner. We present our surgical technique for robotic assisted partial nephrectomy (RPN using a 3-arm approach, including a sliding-clip renorrhaphy. MATERIAL AND METHODS: Our RPN technique is presented which describes the trocar positioning, hilar dissection, tumor identification using intraoperative ultrasound for margin determination, selective vascular clamping, tumor resection, and reconstruction using a sliding-clip technique. CONCLUSION: RPN using a sliding-clip renorrhaphy is a valid and reproducible surgical technique that reduces the challenge of the procedure by taking advantage of the enhanced visualization and control afforded by the robot. The renorrhaphy described is performed under complete control of the console surgeon, and has demonstrated a reduction in the warm ischemia times in our series.

  6. Fetoscopic Open Neural Tube Defect Repair: Development and Refinement of a Two-Port, Carbon Dioxide Insufflation Technique.

    Science.gov (United States)

    Belfort, Michael A; Whitehead, William E; Shamshirsaz, Alireza A; Bateni, Zhoobin H; Olutoye, Oluyinka O; Olutoye, Olutoyin A; Mann, David G; Espinoza, Jimmy; Williams, Erin; Lee, Timothy C; Keswani, Sundeep G; Ayres, Nancy; Cassady, Christopher I; Mehollin-Ray, Amy R; Sanz Cortes, Magdalena; Carreras, Elena; Peiro, Jose L; Ruano, Rodrigo; Cass, Darrell L

    2017-04-01

    To describe development of a two-port fetoscopic technique for spina bifida repair in the exteriorized, carbon dioxide-filled uterus and report early results of two cohorts of patients: the first 15 treated with an iterative technique and the latter 13 with a standardized technique. This was a retrospective cohort study (2014-2016). All patients met Management of Myelomeningocele Study selection criteria. The intraoperative approach was iterative in the first 15 patients and was then standardized. Obstetric, maternal, fetal, and early neonatal outcomes were compared. Standard parametric and nonparametric tests were used as appropriate. Data for 28 patients (22 endoscopic only, four hybrid, two abandoned) are reported, but only those with a complete fetoscopic repair were analyzed (iterative technique [n=10] compared with standardized technique [n=12]). Maternal demographics and gestational age (median [range]) at fetal surgery (25.4 [22.9-25.9] compared with 24.8 [24-25.6] weeks) were similar, but delivery occurred at 35.9 (26-39) weeks of gestation with the iterative technique compared with 39 (35.9-40) weeks of gestation with the standardized technique (Pmet in 9 of 12 (75%) and 3 of 10 (30%), respectively, and 7 of 12 (58%) compared with 2 of 10 (20%) have been treated for hydrocephalus to date. These latter differences were not statistically significant. Fetoscopic open neural tube defect repair does not appear to increase maternal-fetal complications as compared with repair by hysterotomy, allows for vaginal delivery, and may reduce long-term maternal risks. ClinicalTrials.gov, https://clinicaltrials.gov, NCT02230072.

  7. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Science.gov (United States)

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  8. Laparoscopic radical nephrectomy: incorporating advantages of hand assisted and standard laparoscopy.

    Science.gov (United States)

    Ponsky, Lee E; Cherullo, Edward E; Banks, Kevin L W; Greenstein, Marc; Streem, Stevan B; Klein, Eric A; Zippe, Craig D

    2003-06-01

    We present an approach to laparoscopic radical nephrectomy and intact specimen extraction, which incorporates hand assisted and standard laparoscopic techniques. A refined approach to laparoscopic radical nephrectomy is described and our experience is reviewed. A low, muscle splitting Gibson incision is made just lateral to the rectus muscle and the hand port is inserted. A trocar is placed through the hand port and pneumoperitoneum is established. With the laparoscope in the hand port trocar 2 additional trocars are placed under direct vision. The laparoscope is then repositioned through the middle trocar and standard laparoscopic instruments are used through the other 2 trocars including the one in the hand port. If at any time during the procedure the surgeon believes the hand would be useful or needed, the trocar is removed from the hand port and the hand is inserted. This approach has been applied to 7 patients. Mean estimated blood loss was 200 cc (range 50 to 300) and mean operative time was 276.7 minutes (range 247 to 360). Mean specimen weight was 767 gm. (range 538 to 1,170). Pathologically 6 specimens were renal cell carcinoma (grades 2 to 4) and 1 was oncocytoma. Mean length of hospital stay was 3.71 days (range 2 to 7). There were no major complications. We believe that this approach enables the surgeon to incorporate the advantages of the hand assisted and standard laparoscopic approaches.

  9. Astrophysical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kitchin, C R

    1984-01-01

    The subject is covered in chapters, entitled: detectors (optical and infrared detection; radio and microwave detection; X-ray and gamma-ray detection; cosmic ray detectors; neutrino detectors; gravitational radiation); imaging (photography; electronic imaging; scanning; interferometry; speckle interferometry; occultations; radar); photometry and photometers; spectroscopy and spectroscopes; other techniques (astrometry; polarimetry; solar studies; magnetometry). Appendices: magnitudes and spectral types of bright stars; north polar sequence; standard stars for the UBV photometric system; standard stars for the UVBY photometric system; standard stars for MK spectral types; standard stars for polarimetry; Julian date; catalogues; answers to the exercises.

  10. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, work sampling methods used are diverse making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002-2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002-2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
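
    One concrete point where standardization matters is the number of observations collected; the usual work-sampling sizing formula is sketched below in Python (a textbook formula, not one taken from the reviewed studies).

        import math

        def observations_needed(p, margin, z=1.96):
            """Number of random observations needed so that an activity occupying a
            proportion p of time is estimated within +/- margin (absolute) at the
            confidence level implied by z."""
            return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

        # e.g. an activity believed to take ~30% of nursing time, estimated to +/-3%
        print(observations_needed(0.30, 0.03))   # -> 897 observations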

  11. Study of the standard direct costs of various techniques of advanced endoscopy. Comparison with surgical alternatives.

    Science.gov (United States)

    Loras, Carme; Mayor, Vicenç; Fernández-Bañares, Fernando; Esteve, Maria

    2018-03-12

    The complexity of endoscopy has brought with it an increase in cost that has a direct effect on healthcare systems. However, few studies have analyzed the cost of advanced endoscopic procedures (AEP). To carry out a calculation of the standard direct costs of AEP and to make a financial comparison with their surgical alternatives. Calculation of the standard direct cost of carrying out each procedure. An endoscopist detailed the time, personnel, materials, consumables, recovery room time, stents, pathology and medication used. The cost of surgical procedures was the average cost recorded in the hospital. Thirty-eight AEP were analyzed. The technique with the lowest cost was gastroscopy + APC (€116.57), while that with the greatest cost was ERCP with cholangioscopy + stent placement (€5083.65). Some 34.2% of the procedures registered average costs of €1000-2000. In 57% of cases, the endoscopic alternative was 2-5 times more cost-efficient than surgery; in 31% of cases it was indistinguishable from or up to 1.4 times more costly than surgery. The standard direct cost of the majority of AEP is reported using a methodology that enables easy application in other centers. For the most part, endoscopic procedures are more cost-efficient than the corresponding surgical procedure. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.

  12. Bowel sparing in pediatric cranio-spinal radiotherapy: a comparison of combined electron and photon and helical TomoTherapy techniques to a standard photon method

    International Nuclear Information System (INIS)

    Harron, Elizabeth; Lewis, Joanne

    2012-01-01

    The aim of this study was to compare the dose to organs at risk (OARs) from different craniospinal radiotherapy treatment approaches available at the Northern Centre for Cancer Care (NCCC), with a particular emphasis on sparing the bowel. Method: Treatment plans were produced for a pediatric medulloblastoma patient with inflammatory bowel disease using 3D conformal 6-MV photons (3DCP), combined 3D 6-MV photons and 18-MeV electrons (3DPE), and helical photon TomoTherapy (HT). The 3DPE plan was a modification of the standard 3DCP technique, using electrons to treat the spine inferior to the level of the diaphragm. The plans were compared in terms of the dose-volume data to OARs and the nontumor integral dose. Results: The 3DPE plan was found to give the lowest dose to the bowel and the lowest nontumor integral dose of the 3 techniques. However, the coverage of the spine planning target volume (PTV) was least homogeneous using this technique, with only 74.6% of the PTV covered by 95% of the prescribed dose. HT was able to achieve the best coverage of the PTVs (99.0% of the whole-brain PTV and 93.1% of the spine PTV received 95% of the prescribed dose), but delivered a significantly higher integral dose. HT was able to spare the heart, thyroid, and eyes better than the linac-based techniques, but other OARs received a higher dose. Conclusions: Use of electrons was the best method for reducing the dose to the bowel and the integral dose, at the expense of compromised spine PTV coverage. For some patients, HT may be a viable method of improving dose homogeneity and reducing selected OAR doses.

  13. Long-term Outcomes of Elective Surgery for Diverticular Disease: A Call for Standardization.

    Science.gov (United States)

    Biondi, Alberto; Santullo, Francesco; Fico, Valeria; Persiani, Roberto

    2016-10-01

    To date, the appropriate management of diverticular disease remains controversial. The American Society of Colon and Rectal Surgeons has declared that the decision between a conservative and a surgical approach should be taken on a case-by-case basis. There is still a lack of evidence in the literature about long-term outcomes after elective sigmoid resection for diverticular disease. Considering the potentially key role of the surgical technique in long-term outcomes, surgeons need to define strict rules to standardize the surgical technique. Currently there are 5 areas of debate in elective surgery for diverticular disease: the laparoscopic versus open approach, the sites of the proximal and distal colonic division, the vascular approach, and the mobilization of the splenic flexure. The purpose of this paper is to review existing knowledge about the technical aspects through which the surgeon is able to affect the long-term results.

  14. New quantitative safety standards : Different techniques, different results?

    NARCIS (Netherlands)

    Rouvroye, J.L.; Brombacher, A.C.; Lydersen, S.; Hansen, G.K.; Sandtor, H.

    1998-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many parameters can influence the safety of a SIS like system layout, diagnostics, testing and repair. In standards like the German DIN [DIN19250, DIN0801] no quantitative analysis was demanded. The

  15. Pylorus-preserving Whipple pancreaticoduodenectomy: Postoperative evaluation of a new surgical technique

    International Nuclear Information System (INIS)

    Trerotola, S.O.; Jones, B.; Crist, D.J.; Cameron, J.L.

    1988-01-01

    The pylorus-preserving Whipple pancreaticoduodenectomy is becoming an increasingly popular alternative to the standard Whipple operation in the surgical treatment of diseases of the periampullary region. Contrast radiography plays an important role in the postoperative evaluation of patients undergoing this operation. Although most radiologists are familiar with the postoperative anatomy and complications associated with the standard Whipple operation, the newer technique involves different postoperative anatomy and different complications and requires a different approach to examination. The procedure presents several new diagnostic pitfalls. These variables are presented from a described series of 50 patients undergoing this procedure for periampullary neoplasm or chronic pancreatitis

  16. Sportswear textiles emissivity measurement: comparison of IR thermography and emissometry techniques

    Science.gov (United States)

    Bison, P.; Grinzato, E.; Libbra, A.; Muscio, A.

    2012-06-01

    Three sportswear textiles of different colors are compared: one normal and two 'special' fabrics with Ag+ ions and carbon powder added. The emissivity of the textiles has been measured to determine whether it is increased in the 'special' textiles with respect to the normal one. The test required some non-standard procedures due to the semitransparent nature of the textiles, in contrast to the normal procedure commonly used on opaque surfaces. The test was also carried out with a standard emissometry technique, based on a comparative approach with reference samples of known thermal emissivity. The results are compared and discussed.
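
    A comparative emissometry reading is typically converted to emissivity by interpolating between reference samples of known emissivity; the Python sketch below shows a two-point version of that idea with invented readings, and is not the specific procedure used in the study.

        # Two reference samples of known emissivity measured under the same conditions.
        eps_lo, v_lo = 0.06, 0.031   # low-emissivity reference and its reading
        eps_hi, v_hi = 0.88, 0.412   # high-emissivity reference and its reading

        def emissivity(v_sample):
            """Two-point comparative calibration (assumes a linear detector response)."""
            return eps_lo + (v_sample - v_lo) * (eps_hi - eps_lo) / (v_hi - v_lo)

        print(round(emissivity(0.305), 2))   # hypothetical reading from one textile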

  17. Standard practice for leaks using bubble emission techniques

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This practice describes accepted procedures for and factors that influence laboratory immersion corrosion tests, particularly mass loss tests. These factors include specimen preparation, apparatus, test conditions, methods of cleaning specimens, evaluation of results, and calculation and reporting of corrosion rates. This practice also emphasizes the importance of recording all pertinent data and provides a checklist for reporting test data. Other ASTM procedures for laboratory corrosion tests are tabulated in the Appendix. (Warning-In many cases the corrosion product on the reactive metals titanium and zirconium is a hard and tightly bonded oxide that defies removal by chemical or ordinary mechanical means. In many such cases, corrosion rates are established by mass gain rather than mass loss.) 1.2 The values stated in SI units are to be regarded as the standard. The values given in parentheses are for information only. This standard does not purport to address all of the safety concerns, if any, assoc...
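
    For reference, the mass-loss corrosion rate mentioned above is commonly computed as rate = K x W / (A x T x D); the Python sketch below uses the constant for a rate in mm/year together with hypothetical specimen values.

        # Mass-loss corrosion rate in the form commonly used for immersion tests.
        K = 8.76e4   # constant giving a rate in mm/year
        W = 0.0215   # mass loss, g (hypothetical)
        A = 11.4     # exposed area, cm^2 (hypothetical)
        T = 168.0    # exposure time, h (one week)
        D = 7.94     # alloy density, g/cm^3 (hypothetical stainless steel)

        rate_mm_per_year = K * W / (A * T * D)
        print(round(rate_mm_per_year, 3), "mm/y")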

  18. An integrated approach for solving a MCDM problem, Combination of Entropy Fuzzy and F-PROMETHEE techniques

    Directory of Open Access Journals (Sweden)

    Amin Shahmardan

    2013-09-01

    Full Text Available Purpose: The intention of this paper is to present a new integrated approach for solving a multi-attribute decision making problem by the use of the Entropy Fuzzy and F-PROMETHEE (fuzzy preference ranking method for enrichment evaluation) techniques. Design/methodology/approach: In these sorts of multi-attribute decision making problems, a number of criteria and alternatives are put forward as input data. Ranking of these alternatives according to the mentioned criteria is regarded as the outcome of solving such problems. Initially, the weights of the criteria are determined by implementation of the Entropy Fuzzy method. According to the determined weights, the F-PROMETHEE method is applied to rank the alternatives in terms of desirability to the DM (decision maker). Findings: Being in an uncertain environment, and facing the vagueness of the DM's judgments, led us to implement an algorithm which can deal with these constraints properly. Entropy Fuzzy as a weighting method and F-PROMETHEE are performed to fulfill this approach more precisely with respect to tangible and intangible aspects. The main finding of the applied approach is the final ranking of alternatives, which helps the DM to make a more reliable decision. Originality/Value: The main contribution of this approach is that it gives real significance to the DM's attitudes about the mentioned criteria in the determined alternatives, which is not elucidated in former approaches such as the Analytical Hierarchy Process (AHP). Furthermore, previous methods like Shannon Entropy do not pay sufficient attention to the satisfaction degree of each criterion in the proposed alternatives with regard to the DM's statements. Comprehensive explanations of these procedures are given in the various sections of this article.
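
    For orientation, the crisp Shannon-entropy weighting that the fuzzy method generalizes can be sketched in a few lines of Python, as below; the ratings are invented and the fuzzy and F-PROMETHEE steps of the paper are not reproduced.

        import numpy as np

        def entropy_weights(decision_matrix):
            """Shannon-entropy criteria weights (crisp form only)."""
            X = np.asarray(decision_matrix, dtype=float)
            P = X / X.sum(axis=0)                        # column-wise proportions
            m = X.shape[0]
            with np.errstate(divide='ignore', invalid='ignore'):
                plogp = np.where(P > 0, P * np.log(P), 0.0)   # treat 0*ln(0) as 0
            e = -plogp.sum(axis=0) / np.log(m)           # entropy of each criterion
            d = 1.0 - e                                  # degree of divergence
            return d / d.sum()

        # Hypothetical ratings of 4 alternatives on 3 criteria
        X = [[7, 3, 5],
             [6, 8, 4],
             [8, 5, 6],
             [5, 7, 7]]
        print(entropy_weights(X).round(3))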

  19. Fiscal 1999 technical research report. Research and development project on prompt-effect international standards creation (Standardization of gene amplification and analysis methods for genetic screening); 1999 nendo shinki sangyo ikusei sokkogata kokusai hyojunka kaihatsu jigyo seika hokokusho. Idenshi kensayo idenshi zofuku kaiseki hoho no hyojunka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    For the promotion of Japan's proposition for international standardization, PCR (polymerase chain reaction) Comprehensive Research Committee, Technical Promotion Committee, and Japan Bioindustry Association organized a 5-member overseas research team and dispatched the team to Europe and America where they held research interviews at government organizations and business corporations engaged in standardization promotion. The aims were to disclose trends of PCR method standardization and standardization in general in biotechnology. The team visited British Standards Institution, Association Francaise de Normalisation, German Institute for Standardization, Food and Drug Administration of America, American Society for Testing and Materials, PE Biosystem, Bio-Rad Laboratories, and Roche Molecular Systems. The objects of standardization at issue included techniques, tools, devices, and reagents to be used. It is found that in Europe and America there are standardization plans under deliberation for PCR-aided techniques, not for PCR itself, and that some are now approaching completion as national or local standards. It is also learned that in every standardizing organization there is a person responsible for each TC (technical committee) of CEN (Committee European pour Normalisation). (NEDO)

  1. New pricing approaches for bundled payments: Leveraging clinical standards and regional variations to target avoidable utilization.

    Science.gov (United States)

    Hellsten, Erik; Chu, Scally; Crump, R Trafford; Yu, Kevin; Sutherland, Jason M

    2016-03-01

    Develop pricing models for bundled payments that draw inputs from clinician-defined best practice standards and benchmarks set from regional variations in utilization. Health care utilization and claims data for a cohort of incident Ontario ischemic and hemorrhagic stroke episodes. Episodes of care are created by linking incident stroke hospitalizations with subsequent health service utilization across multiple datasets. Costs are estimated for episodes of care and constituent service components using setting-specific case mix methodologies and provincial fee schedules. Costs are estimated for five areas of potentially avoidable utilization, derived from best practice standards set by an expert panel of stroke clinicians. Alternative approaches for setting normative prices for stroke episodes are developed using measures of potentially avoidable utilization and benchmarks established by the best performing regions. There are wide regional variations in the utilization of different health services within episodes of stroke care. Reconciling the best practice standards with regional utilization identifies significant amounts of potentially avoidable utilization. Normative pricing models for stroke episodes result in increasingly aggressive redistributions of funding. Bundled payment pilots to date have been based on the costs of historical service patterns, which effectively 'bake in' unwarranted and inefficient variations in utilization. This study demonstrates the feasibility of novel clinically informed episode pricing approaches that leverage these variations to target reductions in potentially avoidable utilization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
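
    One possible way to operationalize such a normative price, sketched below in Python with entirely hypothetical figures, is to start from the observed mean episode cost and strip out the share of potentially avoidable utilization that exceeds the benchmark set by the best-performing region; this is an illustration of the idea, not the authors' pricing model.

        observed_mean_cost = 24_500.0          # mean cost per stroke episode (hypothetical)
        avoidable_cost = {                     # mean cost per episode, by category (hypothetical)
            "readmissions":          1_800.0,
            "avoidable_ED_visits":     350.0,
            "excess_ALC_days":       2_100.0,
        }
        benchmark_share = 0.40                 # best region incurs 40% of these costs (assumed)

        normative_price = observed_mean_cost - sum(
            cost * (1 - benchmark_share) for cost in avoidable_cost.values()
        )
        print(round(normative_price, 2))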

  2. An approach to meeting the spent fuel standard

    Energy Technology Data Exchange (ETDEWEB)

    Makhijani, A. [Institute for Energy and Environmental Research, Takoma Park, MD (United States)]

    1996-05-01

    The idea of the spent fuel standard is that there should be a high surface gamma radiation to prevent theft. For purposes of preventing theft, containers should be massive, and the plutonium should be difficult to extract. This report discusses issues associated with the spent fuel standard.

  3. An approach to meeting the spent fuel standard

    International Nuclear Information System (INIS)

    Makhijani, A.

    1996-01-01

    The idea of the spent fuel standard is that there should be a high surface gamma radiation to prevent theft. For purposes of preventing theft, containers should be massive, and the plutonium should be difficult to extract. This report discusses issues associated with the spent fuel standard

  4. Lateral Load Capacity of Piles: A Comparative Study Between Indian Standards and Theoretical Approach

    Science.gov (United States)

    Jayasree, P. K.; Arun, K. V.; Oormila, R.; Sreelakshmi, H.

    2018-05-01

    As per Indian Standards, laterally loaded piles are usually analysed using the method adopted by IS 2911-2010 (Part 1/Section 2). However, practising engineers are of the opinion that the IS method is very conservative in design. This work aims at determining the extent to which the conventional IS design approach is conservative. This is done through a comparative study between the IS approach and a theoretical model based on Vesic's equation. Bore log details for six different bridges were collected from the Kerala Public Works Department. Cast-in-situ fixed-head piles embedded in three soil conditions, both end-bearing and friction piles, were considered and analyzed separately. Piles were also modelled in STAAD.Pro software based on the IS approach and the results were validated using the equation of Matlock and Reese (Proceedings of the Fifth International Conference on Soil Mechanics and Foundation Engineering, 1961). The results are presented as the percentage variation in the values of bending moment and deflection obtained by the different methods. The results obtained from the mathematical model based on Vesic's equation and those obtained with the IS approach were compared, and the IS method was found to be uneconomical and conservative.
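
    For reference, a commonly cited form of Vesic's expression for the modulus of horizontal subgrade reaction, often used when idealizing a laterally loaded pile as a beam on an elastic foundation, is given below in LaTeX; the exact form and notation adopted by the authors may differ. Here E_s and ν_s are the soil modulus and Poisson's ratio, E_p I_p the pile flexural rigidity, and B the pile width.

        k_h \;=\; \frac{0.65}{B}\left(\frac{E_s B^{4}}{E_p I_p}\right)^{1/12}\frac{E_s}{1-\nu_s^{2}}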

  5. Techniques for incorporating operator expertise into intelligent decision aids and training

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1988-01-01

    An experiment is presented that was designed to investigate the use of protocol analysis, during task performance, as a technique for knowledge engineering that provides a direct tie between knowledge and performance. The technique is described and problem solving strategies are presented that were found to correlate with optimal performance. The results indicate that protocol analysis adds a dimension to the more standard knowledge engineering approaches by providing a more complete picture of the expert's knowledge and a performance yardstick to determine the most optimal problem solving strategies. Implications for the developers of expert systems and training programs are discussed. (author)

  6. A multi-technique phytoremediation approach to purify metals contaminated soil from e-waste recycling site.

    Science.gov (United States)

    Luo, Jie; Cai, Limei; Qi, Shihua; Wu, Jian; Sophie Gu, Xiaowen

    2017-12-15

    Multiple techniques for soil decontamination were combined to enhance the phytoremediation efficiency of Eucalyptus globulese and alleviate the corresponding environmental risks. The approach, consisting of chelating agent application, electrokinetic remediation, plant hormone foliar application and phytoremediation, was designed to remediate multi-metal contaminated soils from a notorious e-waste recycling town. The decontamination ability of E. globulese increased from 1.35, 58.47 and 119.18 mg per plant for Cd, Pb and Cu in planting controls to 7.57, 198.68 and 174.34 mg per plant in individual EDTA treatments, respectively, but at the same time 0.9-11.5 times more metals leached from the chelator treatments relative to controls. Low (2 V) and moderate (4 V) voltage electric fields promoted the growth of the species while high voltage (10 V) had the opposite effect, and the metal concentrations of the plants increased with increasing voltage. Volumes of the leachate decreased from 1224 to 134 mL as the voltage increased from 0 to 10 V due to electroosmosis and electrolysis. Compared with individual phytoremediation, foliar cytokinin treatments produced 56% more biomass and intercepted 2.5 times more leachate, attributable to the enhanced transpiration rate. The synergistic combination of the individual treatments resulted in the greatest biomass production and metal accumulation of the species under the stress condition relative to the other methods. The time required for the multi-technique approach to decontaminate Cd, Pb and Cu from the soil was 2.1-10.4 times less than for individual chelator addition, electric field application or plant hormone utilization. It is especially important that nearly no leachate (60 mL in total) was collected from the multi-technique system. This approach is a suitable method for remediating metal-polluted sites considering its decontamination efficiency and the negligible associated environmental risk. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. A novel technique combining laparoscopic and endovascular approaches using image fusion guidance for anterior embolization of type II endoleak

    Directory of Open Access Journals (Sweden)

    M. Mujeeb Zubair, MD

    2017-03-01

    Full Text Available Type II endoleak (T2E leading to aneurysm sac enlargement is one of the challenging complications associated with endovascular aneurysm repair. Recent guidelines recommend embolization of T2E associated with aneurysmal sac enlargement. Various percutaneous and endovascular techniques have been reported for embolization of T2E. We report a novel technique for T2E embolization combining laparoscopic and endovascular approaches using preoperative image fusion. We believe our technique provides a more direct access to the lumbar feeding vessels that is typically challenging with transarterial or translumbar embolization techniques.

  8. Variation in assessment and standard setting practices across UK undergraduate medicine and the need for a benchmark.

    Science.gov (United States)

    MacDougall, Margaret

    2015-10-31

    The principal aim of this study is to provide an account of variation in UK undergraduate medical assessment styles and corresponding standard setting approaches with a view to highlighting the importance of a UK national licensing exam in recognizing a common standard. Using a secure online survey system, response data were collected during the period 13 - 30 January 2014 from selected specialists in medical education assessment, who served as representatives for their respective medical schools. Assessment styles and corresponding choices of standard setting methods vary markedly across UK medical schools. While there is considerable consensus on the application of compensatory approaches, individual schools display their own nuances through use of hybrid assessment and standard setting styles, uptake of less popular standard setting techniques and divided views on norm referencing. The extent of variation in assessment and standard setting practices across UK medical schools validates the concern that there is a lack of evidence that UK medical students achieve a common standard on graduation. A national licensing exam is therefore a viable option for benchmarking the performance of all UK undergraduate medical students.

  9. Isolating DNA from sexual assault cases: a comparison of standard methods with a nuclease-based approach

    Science.gov (United States)

    2012-01-01

    Background Profiling sperm DNA present on vaginal swabs taken from rape victims often contributes to identifying and incarcerating rapists. Large amounts of the victim’s epithelial cells contaminate the sperm present on swabs, however, and complicate this process. The standard method for obtaining relatively pure sperm DNA from a vaginal swab is to digest the epithelial cells with Proteinase K in order to solubilize the victim’s DNA, and to then physically separate the soluble DNA from the intact sperm by pelleting the sperm, removing the victim’s fraction, and repeatedly washing the sperm pellet. An alternative approach that does not require washing steps is to digest with Proteinase K, pellet the sperm, remove the victim’s fraction, and then digest the residual victim’s DNA with a nuclease. Methods The nuclease approach has been commercialized in a product, the Erase Sperm Isolation Kit (PTC Labs, Columbia, MO, USA), and five crime laboratories have tested it on semen-spiked female buccal swabs in a direct comparison with their standard methods. Comparisons have also been performed on timed post-coital vaginal swabs and evidence collected from sexual assault cases. Results For the semen-spiked buccal swabs, Erase outperformed the standard methods in all five laboratories and in most cases was able to provide a clean male profile from buccal swabs spiked with only 1,500 sperm. The vaginal swabs taken after consensual sex and the evidence collected from rape victims showed a similar pattern of Erase providing superior profiles. Conclusions In all samples tested, STR profiles of the male DNA fractions obtained with Erase were as good as or better than those obtained using the standard methods. PMID:23211019

  10. Muscle gap approach under a minimally invasive channel technique for treating long segmental lumbar spinal stenosis: A retrospective study.

    Science.gov (United States)

    Bin, Yang; De Cheng, Wang; Wei, Wang Zong; Hui, Li

    2017-08-01

    This study aimed to compare the efficacy of muscle gap approach under a minimally invasive channel surgical technique with the traditional median approach.In the Orthopedics Department of Traditional Chinese and Western Medicine Hospital, Tongzhou District, Beijing, 68 cases of lumbar spinal canal stenosis underwent surgery using the muscle gap approach under a minimally invasive channel technique and a median approach between September 2013 and February 2016. Both approaches adopted lumbar spinal canal decompression, intervertebral disk removal, cage implantation, and pedicle screw fixation. The operation time, bleeding volume, postoperative drainage volume, and preoperative and postoperative visual analog scale (VAS) score and Japanese Orthopedics Association score (JOA) were compared between the 2 groups.All patients were followed up for more than 1 year. No significant difference between the 2 groups was found with respect to age, gender, surgical segments. No diversity was noted in the operation time, intraoperative bleeding volume, preoperative and 1 month after the operation VAS score, preoperative and 1 month after the operation JOA score, and 6 months after the operation JOA score between 2 groups (P > .05). The amount of postoperative wound drainage (260.90 ± 160 mL vs 447.80 ± 183.60 mL, P gap approach group than in the median approach group (P gap approach under a minimally invasive channel group, the average drainage volume was reduced by 187 mL, and the average VAS score 6 months after the operation was reduced by an average of 0.48.The muscle gap approach under a minimally invasive channel technique is a feasible method to treat long segmental lumbar spinal canal stenosis. It retains the integrity of the posterior spine complex to the greatest extent, so as to reduce the adjacent spinal segmental degeneration and soft tissue trauma. Satisfactory short-term and long-term clinical results were obtained.

  11. Data structure techniques for the graphical special unitary group approach to arbitrary spin representations

    International Nuclear Information System (INIS)

    Kent, R.D.; Schlesinger, M.

    1987-01-01

    For the purpose of computing matrix elements of quantum mechanical operators in complex N-particle systems, it is necessary that as much of each irreducible representation as possible be stored in high-speed memory in order to achieve the highest possible rate of computation. A graph theoretic approach to the representation of N-particle systems involving arbitrary single-particle spin is presented. The method involves a generalization of a technique employed by Shavitt in developing the graphical unitary group approach (GUGA) to electronic spin-orbitals. The methods implemented in GENDRT and DRTDIM overcome many deficiencies inherent in other approaches, particularly with respect to utilization of memory resources, computational efficiency in the recognition and evaluation of non-zero matrix elements of certain group theoretic operators, and complete labelling of all the basis states of the permutation symmetry (S_N) adapted irreducible representations of SU(n) groups. (orig.)

  12. The k0-based neutron activation analysis: a mono standard to standardless approach of NAA

    International Nuclear Information System (INIS)

    Acharya, R.; Nair, A.G.C.; Sudarshan, K.; Goswami, A.; Reddy, A.V.R.

    2006-01-01

    The k0-based neutron activation analysis (k0-NAA) uses neutron flux parameters, detection efficiency and nuclear constants, namely k0 and Q0, for the determination of elemental concentrations. Gold (197Au) or any other element having suitable nuclear properties is used as an external or internal single comparator. This article describes the principle of k0-NAA, the standardization of the method by characterization of reactor irradiation sites and calibration of detector efficiency, and its applications. The method was validated using CRMs obtained from USGS, IAEA and NIST. Applications of the method include samples such as gemstones (ruby, beryl and emerald), sediments, manganese nodules and encrustations, cereals, and medicinal and edible leaves. Recently, a k0-based internal mono standard INAA (IM-NAA) method using in-situ relative efficiency has been standardized by us for the analysis of small and large samples of different shapes and sizes. The method was applied to a new meteorite sample and large-size wheat samples. Non-standard size and shape samples of nuclear cladding materials, namely zircaloy 2 and 4, stainless steels (SS 316M and D9) and 1S aluminium, were analysed. Standardless analysis of these cladding materials was possible by a mass balance approach since all the major and minor elements were amenable to NAA. (author)
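
    For orientation, a schematic form of the k0-standardization concentration equation (Høgdahl convention) is sketched below. The notation follows the general k0 literature rather than this particular record, so the symbol names and grouping should be read as assumptions.

    ```latex
    % Mass fraction of analyte a relative to a co-irradiated comparator m
    % (schematic k0 equation; notation assumed from the general k0 literature):
    \rho_a \;=\;
      \frac{\left(\dfrac{N_p/t_m}{S\,D\,C\,w}\right)_{\!a}}
           {\left(\dfrac{N_p/t_m}{S\,D\,C\,w}\right)_{\!m}}
      \cdot \frac{1}{k_{0,m}(a)}
      \cdot \frac{f + Q_{0,m}(\alpha)}{f + Q_{0,a}(\alpha)}
      \cdot \frac{\varepsilon_{p,m}}{\varepsilon_{p,a}}
    % N_p: net peak area, t_m: counting time, S, D, C: saturation, decay and
    % counting factors, w: mass, f: thermal-to-epithermal flux ratio,
    % Q_0(\alpha): resonance-integral-to-cross-section ratio corrected for the
    % epithermal shape parameter \alpha, \varepsilon_p: full-energy peak efficiency.
    ```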

  13. Alternative approaches to improve site investigations

    International Nuclear Information System (INIS)

    Beach, R.B.; Silka, L.R.

    1992-01-01

    Common complaints about standard investigations at hazardous waste sites include high costs and long time frames. Investigations at military bases as part of the installation restoration program or base closures suffer additionally from nonuniformity of approach and results and redundancy of work effort conducted by multiple environmental contractors. The problems of high costs and long time frames can be minimized by the consistent use of alternative sampling methods (such as soil gas surveys) and the utilization of analytical screening procedures at both on-site and off-site laboratories. Acceptable data quality is maintained by several procedures. Incorporation of quality control measures (10% frequency), such as matrix spikes and duplicates, into the alternative analytical techniques allows assessment of the data quality relative to predetermined data quality objectives (DQOs). Confirmation of the screening results (10% frequency) using standard US EPA methods, such as the contract laboratory program (CLP) statement of work (SOW), allows an additional evaluation of the data accuracy. Depending on the investigative objectives, knowledge-based computer systems (expert systems) could be used to improve uniformity of site evaluations. Several case histories will be presented demonstrating how soil gas surveys, screening analyses and standard analyses can be utilized to give increased site information in a reduced time frame and at a cost savings of 30 to 40%. One case history illustrates a screening technique developed by the author for polynuclear aromatics (semi-volatile organic compounds) that can be conducted at a cost savings of 90% relative to a standard US EPA method. A comparison of the phased investigative approach to one using an integrated field team is presented for fuel spill or UST areas.

  14. The earrings of Pancas treasure: Analytical study by X-ray based techniques – A first approach

    International Nuclear Information System (INIS)

    Tissot, I.; Tissot, M.; Manso, M.; Alves, L.C.; Barreiros, M.A.; Marcelo, T.; Carvalho, M.L.; Corregidor, V.; Guerra, M.F.

    2013-01-01

    The development of new metallurgical technologies in the Iberian Peninsula during the Iron Age is well represented by the 10 gold earrings from the treasure of Pancas. This work presents a first approach to the analytical study of these earrings and contributes to the construction of a typological evolution of the Iberian earrings. The manufacturing techniques and the alloy compositions were studied with three complementary X-ray spectroscopy techniques: portable EDXRF, μ-PIXE and SEM–EDS. The results were compared with earrings from the same and previous periods.

  15. Tissue-Based MRI Intensity Standardization: Application to Multicentric Datasets

    Directory of Open Access Journals (Sweden)

    Nicolas Robitaille

    2012-01-01

    Full Text Available Intensity standardization in MRI aims at correcting scanner-dependent intensity variations. Existing simple and robust techniques aim at matching the input image histogram onto a standard, while we think that standardization should aim at matching spatially corresponding tissue intensities. In this study, we present a novel automatic technique, called STI for STandardization of Intensities, which not only shares the simplicity and robustness of histogram-matching techniques, but also incorporates tissue spatial intensity information. STI uses joint intensity histograms to determine intensity correspondence in each tissue between the input and standard images. We compared STI to an existing histogram-matching technique on two multicentric datasets, Pilot E-ADNI and ADNI, by measuring the intensity error with respect to the standard image after performing nonlinear registration. The Pilot E-ADNI dataset consisted of 3 subjects, each scanned at 7 different sites. The ADNI dataset consisted of 795 subjects scanned at more than 50 different sites. STI was superior to the histogram-matching technique, showing significantly better intensity matching for the brain white matter with respect to the standard image.
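
    To make the histogram-matching baseline concrete (the simple technique STI is compared against, not STI itself), a minimal quantile-based matching sketch is given below; the use of NumPy and the array names are illustrative assumptions.

    ```python
    import numpy as np

    def histogram_match(source, reference, n_quantiles=256):
        """Map source image intensities onto the reference intensity distribution.

        This is plain whole-image histogram matching; STI goes further by using
        tissue-wise joint intensity histograms to align corresponding tissues.
        """
        q = np.linspace(0.0, 1.0, n_quantiles)
        src_vals = np.quantile(source.ravel(), q)
        ref_vals = np.quantile(reference.ravel(), q)
        # Piecewise-linear transfer function from source quantiles to reference quantiles
        return np.interp(source, src_vals, ref_vals)

    # Synthetic example "images"
    rng = np.random.default_rng(0)
    src = rng.normal(100.0, 20.0, size=(64, 64))
    ref = rng.normal(150.0, 10.0, size=(64, 64))
    matched = histogram_match(src, ref)
    print(matched.mean(), matched.std())  # close to the reference statistics
    ```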

  16. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  17. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in some particular situations like screening, in situ measurement, process monitoring, hostile environments, etc. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high-level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element. Another approach is to use the elemental emission intensities relative to the intensity of an internal standard element whose concentration is already known in the specimen. But these methods are not applicable to unknown samples. In the present work, we introduce a new approach to LIBS quantitative analysis by using the Hα (656.28 nm) emission line as an external standard.

  18. A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.

    Science.gov (United States)

    Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information management-architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate

  19. Reliability evaluation of high-performance, low-power FinFET standard cells based on mixed RBB/FBB technique

    Science.gov (United States)

    Wang, Tian; Cui, Xiaoxin; Ni, Yewen; Liao, Kai; Liao, Nan; Yu, Dunshan; Cui, Xiaole

    2017-04-01

    With shrinking transistor feature size, the fin-type field-effect transistor (FinFET) has become the most promising option in low-power circuit design due to its superior capability to suppress leakage. To support a VLSI digital system flow based on logic synthesis, we have designed an optimized high-performance, low-power FinFET standard cell library by employing the mixed FBB/RBB technique in the existing stacked structure of each cell. This paper presents the reliability evaluation of the optimized cells under process and operating environment variations based on Monte Carlo analysis. The variations are modelled with a Gaussian distribution of the device parameters, and 10000 sweeps are conducted in the simulation to obtain the statistical properties of the worst-case delay and input-dependent leakage for each cell. For comparison, a set of non-optimal cells that adopt the same topology without employing the mixed biasing technique is also generated. Experimental results show that the optimized cells achieve standard deviation reductions of up to 39.1% and 30.7% in worst-case delay and input-dependent leakage respectively, while the normalized deviation shrinking in worst-case delay and input-dependent leakage can be up to 98.37% and 24.13%, respectively, which demonstrates that our optimized cells are less sensitive to variability and exhibit more reliability. Project supported by the National Natural Science Foundation of China (No. 61306040), the State Key Development Program for Basic Research of China (No. 2015CB057201), the Beijing Natural Science Foundation (No. 4152020), and the Natural Science Foundation of Guangdong Province, China (No. 2015A030313147).
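
    The Monte Carlo variation analysis described above can be sketched generically as drawing Gaussian device parameters and collecting statistics of a performance metric over many sweeps. The delay model below is a made-up placeholder (a real flow would invoke a circuit simulator per sweep); parameter names and values are illustrative assumptions only.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N_SWEEPS = 10_000  # number of Monte Carlo sweeps, as in the reported analysis

    def worst_case_delay(vth, leff):
        """Placeholder delay model: delay grows with threshold voltage and gate length.

        Stands in for a per-sweep circuit simulation of the standard cell.
        """
        return (1.0 + 5.0 * vth) * leff

    # Gaussian process/environment variation around nominal device parameters
    vth = rng.normal(loc=0.30, scale=0.03, size=N_SWEEPS)    # threshold voltage [V]
    leff = rng.normal(loc=20e-9, scale=1e-9, size=N_SWEEPS)  # effective gate length [m]

    delays = worst_case_delay(vth, leff)
    print(f"mean delay          : {delays.mean():.3e}")
    print(f"standard deviation  : {delays.std():.3e}")
    print(f"normalized deviation: {delays.std() / delays.mean():.3%}")
    ```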

  20. A New profiling and pipelining approach for HEVC Decoder on ZedBoard Platform

    Directory of Open Access Journals (Sweden)

    Habib Smei

    2017-10-01

    Full Text Available New multimedia applications such as mobile video, high-quality Internet video or digital television require high-performance encoding of video signals to meet technical constraints such as runtime, bandwidth or latency. The video coding standard H.265/HEVC (High Efficiency Video Coding) was developed by JCT-VC to replace the MPEG-2, MPEG-4 and H.264 codecs and to respond to these new functional constraints. Currently, there are several implementations of this standard. Some implementations are based on software acceleration techniques; others, on purely hardware acceleration techniques; and some combine the two. In software implementations, several techniques are used in order to decrease the video coding and decoding time, including data parallelism, task parallelism and combined solutions. On the other hand, in order to fulfill the computational demands of the new standard, HEVC includes several coding tools that allow dividing each picture into several partitions that can be processed in parallel, without degrading either the quality or the bitrate. In this paper, we adapt one of these approaches, the Tile coding tool, to propose a pipelined execution approach for the HEVC/H.265 decoder application in its HM Test Model version. This approach is based on fine-grained profiling using code injection techniques supported by standard profiling tools such as Gprof and Valgrind. Profiling allowed us to divide functions into four groups according to three criteria: the first criterion is the minimization of communication between the different function groups, in order to have minimal intergroup communication and maximum intragroup communication; the second criterion is load balancing between processors; the third criterion is parallelism between functions. Experiments carried out in this paper are based on the ZedBoard platform, which integrates a Xilinx Zynq chip with a dual-core ARM A9. We start with a purely

  1. Differences of standard values of Supersonic shear imaging and ARFI technique - in vivo study of testicular tissue.

    Science.gov (United States)

    Trottmann, M; Rübenthaler, J; Marcon, J; Stief, C G; Reiser, M F; Clevert, D A

    2016-01-01

    To investigate the difference in standard values between Supersonic shear imaging (SSI) and the Acoustic Radiation Force Impulse (ARFI) technique in the evaluation of testicular tissue stiffness in vivo, 58 healthy male testes were examined using B-mode sonography, ARFI and SSI. B-mode sonography was performed in order to scan the testis for pathologies, followed by real-time elastography in three predefined areas (upper pole, central portion and lower pole) using the SuperSonic® Aixplorer ultrasound device (SuperSonic Imagine, Aix-en-Provence, France). Afterwards, a second elastographic assessment of the same testicular regions was performed using the ARFI technique of the Siemens Acuson 2000™ ultrasound device (Siemens Health Care, Germany). Shear wave velocities were reported in m/s. Parameters of the elastography techniques were compared using the paired sample t-test. The SSI values were significantly higher in all measured areas compared with ARFI (p < 0.001 to p = 0.015). Quantitatively, the mean shear wave velocity measured by SSI was 1.1 m/s compared with 0.8 m/s measured by ARFI. SSI values are significantly higher than ARFI values when measuring the stiffness of testicular tissue and should only be compared with caution.
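
    The comparison described above rests on a paired-sample t-test over per-region velocity values. A minimal sketch with SciPy is shown below; the velocity arrays are invented placeholders, not data from this study.

    ```python
    import numpy as np
    from scipy import stats

    # Invented shear-wave velocities [m/s] for the same testicular regions
    # measured with both systems (placeholders for illustration only).
    ssi = np.array([1.15, 1.08, 1.12, 1.05, 1.20, 1.10])
    arfi = np.array([0.82, 0.79, 0.85, 0.78, 0.88, 0.80])

    t_stat, p_value = stats.ttest_rel(ssi, arfi)  # paired-sample t-test
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    print(f"mean difference = {(ssi - arfi).mean():.2f} m/s")
    ```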

  2. Three-Hand Endoscopic Endonasal Transsphenoidal Surgery: Experience With an Anatomy-Preserving Mononostril Approach Technique.

    Science.gov (United States)

    Eseonu, Chikezie I; ReFaey, Karim; Pamias-Portalatin, Eva; Asensio, Javier; Garcia, Oscar; Boahene, Kofi D; Quiñones-Hinojosa, Alfredo

    2018-02-01

    Variations on the endoscopic transsphenoidal approach present unique surgical techniques that have unique effects on surgical outcomes, extent of resection (EOR), and anatomical complications. To analyze the learning curve and perioperative outcomes of the 3-hand endoscopic endonasal mononostril transsphenoidal technique. Prospective case series and retrospective data analysis of patients who were treated with the 3-hand transsphenoidal technique between January 2007 and May 2015 by a single neurosurgeon. Patient characteristics, preoperative presentation, tumor characteristics, operative times, learning curve, and postoperative outcomes were analyzed. Volumetric EOR was evaluated, and a logistic regression analysis was used to assess predictors of EOR. Two hundred seventy-five patients underwent an endoscopic transsphenoidal surgery using the 3-hand technique. One hundred eighteen patients in the early group had surgery between 2007 and 2010, while 157 patients in the late group had surgery between 2011 and 2015. Operative time was significantly shorter in the late group (161.6 min) compared to the early group (211.3 min, P = .001). Both cohorts had similar EOR (early group 84.6% vs late group 85.5%, P = .846) and postoperative outcomes. The learning curve showed that it took 54 cases to achieve operative proficiency with the 3-handed technique. Multivariate modeling suggested that prior resections and preoperative tumor size are important predictors for EOR. We describe a 3-hand, mononostril endoscopic transsphenoidal technique performed by a single neurosurgeon that has minimal anatomic distortion and postoperative complications. During the learning curve of this technique, operative time can significantly decrease, while EOR, postoperative outcomes, and complications are not jeopardized. Copyright © 2017 by the Congress of Neurological Surgeons

  3. Greener Approach To Leather Techniques

    OpenAIRE

    Sah, Narayan

    2013-01-01

    The main purpose of this study was to find greener and more ecological methods of leather tanning. In this thesis, old traditional methods and new developing methods are compared. New alternatives to the chrome tanning agent and their benefits are reported. Additionally, an efficient way of chrome tanning in the presence of masking agents or other catalysts is reported, together with cleaning techniques using membrane processes such as micro-filtration, ultra-filtration (UF), nano-filtration (NF) and reverse ...

  4. Modified Redundancy based Technique—a New Approach to Combat Error Propagation Effect of AES

    Science.gov (United States)

    Sarkar, B.; Bhunia, C. T.; Maulik, U.

    2012-06-01

    The Advanced Encryption Standard (AES) is a great research challenge. It was developed to replace the Data Encryption Standard (DES). AES suffers from a major limitation: the error propagation effect. To tackle this limitation, two methods are available. One is a redundancy-based technique and the other is a bit-based parity technique. The first has a significant advantage over the second in that it can correct any error in a definite term, but at the cost of a higher level of overhead and hence a lower processing speed. In this paper, a new approach based on the redundancy technique is proposed that speeds up the process of reliable encryption and hence secured communication.

  5. Precision of a photogrammetric method to perform 3D wound measurements compared to standard 2D photographic techniques in the horse.

    Science.gov (United States)

    Labens, R; Blikslager, A

    2013-01-01

    Methods of 3D wound imaging in man play an important role in monitoring of healing and determination of the prognosis. Standard photographic assessments in equine wound management consist of 2D analyses, which provide little quantitative information on the wound bed. 3D imaging of equine wounds is feasible using principles of stereophotogrammetry. 3D measurements differ significantly and are more precise than results with standard 2D assessments. Repeated specialised photographic imaging of 4 clinical wounds left to heal by second intention was performed. The intraoperator variability in measurements due to imaging and 3D processing was compared to that of a standard 2D technique using descriptive statistics and multivariate repeated measures ANOVA. Using a custom made imaging system, 3D analyses were successfully performed. Area and circumference measurements were significantly different between imaging modalities. The intraoperator variability of 3D measurements was up to 2.8 times less than that of 2D results. On average, the maximum discrepancy between repeated measurements was 5.8% of the mean for 3D and 17.3% of the mean for 2D assessments. The intraoperator repeatability of 3D wound measurements based on principles of stereophotogrammetry is significantly increased compared to that of a standard 2D photographic technique indicating it may be a useful diagnostic and monitoring tool. The equine granulation bed plays an important role in equine wound healing. When compared to 2D analyses 3D monitoring of the equine wound bed allows superior quantitative characterisation, contributing to clinical and experimental investigations by offering potential new parameters. © 2012 EVJ Ltd.

  6. Stabilization of flail chest injuries: minimized approach techniques to treat the core of instability.

    Science.gov (United States)

    Schulz-Drost, S; Grupp, S; Pachowsky, M; Oppel, P; Krinner, S; Mauerer, A; Hennig, F F; Langenbach, A

    2017-04-01

    Stabilizing techniques for flail chest injuries usually require wide approaches to the chest wall. Three main regions need to be considered when stabilizing the rib cage: median-anterior, with dissection of the pectoral muscle; lateral-axillary, with dissection of the serratus and externus abdominis muscles; and posterior interspinoscapular, with division of the rhomboid, trapezius and latissimus dorsi muscles. The severe morbidity associated with these invasive approaches needs to be considered. This study discusses possibilities for minimized approaches to these regions. Fifteen patients were stabilized by locked plate osteosynthesis (MatrixRib®) between May 2012 and April 2014 and prospectively followed up. Flail chest injuries were managed through limited incisions to the anterior, lateral, and posterior parts of the chest wall or their combinations. Each approach was 4 to 10 cm long, using an Alexis® retractor. One minimized approach offered sufficient access to at least four ribs posteriorly and laterally, and four pairs of ribs anteriorly, in all cases. There was no need to divide the latissimus dorsi muscle. The trapezius and rhomboid muscles were only partially divided, whereas a subcutaneous dissection of the serratus and abdominis muscles was necessary. Follow-up showed sufficient consolidation; complications were pneumothorax (2) and seroma (2). Minimized approaches allow sufficient stabilization of severely dislocated rib fractures without extensive dissection or division of the important muscles. Keeping the arm, and thus the scapula, mobile is very important for providing the largest reachable surface of the rib cage through each approach.

  7. MacCormack's technique-based pressure reconstruction approach for PIV data in compressible flows with shocks

    Science.gov (United States)

    Liu, Shun; Xu, Jinglei; Yu, Kaikai

    2017-06-01

    This paper proposes an improved approach for extraction of pressure fields from velocity data, such as obtained by particle image velocimetry (PIV), especially for steady compressible flows with strong shocks. The principle of this approach is derived from Navier-Stokes equations, assuming adiabatic condition and neglecting viscosity of flow field boundaries measured by PIV. The computing method is based on MacCormack's technique in computational fluid dynamics. Thus, this approach is called the MacCormack method. Moreover, the MacCormack method is compared with several approaches proposed in previous literature, including the isentropic method, the spatial integration and the Poisson method. The effects of velocity error level and PIV spatial resolution on these approaches are also quantified by using artificial velocity data containing shock waves. The results demonstrate that the MacCormack method has higher reconstruction accuracy than other approaches, and its advantages become more remarkable with shock strengthening. Furthermore, the performance of the MacCormack method is also validated by using synthetic PIV images with an oblique shock wave, confirming the feasibility and advantage of this approach in real PIV experiments. This work is highly significant for the studies on aerospace engineering, especially the outer flow fields of supersonic aircraft and the internal flow fields of ramjets.
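
    As background on the numerical core, a generic 1D MacCormack predictor-corrector scheme for linear advection is sketched below. It only illustrates the forward/backward differencing pattern the paper builds on; the actual method integrates the momentum equations from PIV velocity fields to reconstruct pressure, and all names and values here are illustrative assumptions.

    ```python
    import numpy as np

    def maccormack_advection(u0, c, dx, dt, n_steps):
        """Advance u_t + c u_x = 0 with the MacCormack predictor-corrector scheme.

        The predictor uses forward differences, the corrector uses backward
        differences on the predicted field; periodic boundaries via np.roll.
        """
        u = u0.copy()
        for _ in range(n_steps):
            up = u - c * dt / dx * (np.roll(u, -1) - u)               # predictor
            u = 0.5 * (u + up - c * dt / dx * (up - np.roll(up, 1)))  # corrector
        return u

    x = np.linspace(0.0, 1.0, 200, endpoint=False)
    u0 = np.exp(-200.0 * (x - 0.3) ** 2)  # Gaussian pulse
    dx = x[1] - x[0]
    u = maccormack_advection(u0, c=1.0, dx=dx, dt=0.4 * dx, n_steps=500)
    # After exactly one pass around the periodic domain the peak should remain near 1
    print(u.max())
    ```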

  8. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Full Text Available Abstract: Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and, not least, in the financial statement audit. What is sampling, and how did it appear? Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standards 39 are the two main international and American normative references on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature. The two standards are studied using Jaccard indicators in terms of their degree of similarity and dissimilarity on different issues. The Jaccard coefficient measures the degree of convergence of the international auditing standard (ISA 530) and the U.S. auditing standard (SAS 39). Both sets of standards address the sampling problem and present common points with regard to accepted sampling techniques, factors influencing the audit sample, treatment of identified misstatements and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
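
    To make the similarity measure concrete, a minimal Jaccard index computation over two sets of standard provisions is sketched below; the example items are invented for illustration and are not taken from ISA 530 or SAS 39.

    ```python
    def jaccard_index(set_a, set_b):
        """Jaccard coefficient: |A ∩ B| / |A ∪ B|, a similarity value in [0, 1]."""
        a, b = set(set_a), set(set_b)
        return len(a & b) / len(a | b) if (a | b) else 1.0

    # Invented example: provisions addressed by two sampling standards
    isa_530 = {"statistical sampling", "non-statistical sampling",
               "misstatement treatment", "sample size factors"}
    sas_39 = {"statistical sampling", "non-statistical sampling",
              "misstatement treatment", "tolerable error"}

    similarity = jaccard_index(isa_530, sas_39)
    print(f"Jaccard similarity: {similarity:.2f}")   # 3 shared / 5 total = 0.60
    print(f"Jaccard distance  : {1 - similarity:.2f}")
    ```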

  9. A standardized approach to study human variability in isometric thermogenesis during low-intensity physical activity

    Directory of Open Access Journals (Sweden)

    Delphine eSarafian

    2013-07-01

    Full Text Available Limitations of current methods: The assessment of human variability in various compartments of daily energy expenditure (EE) under standardized conditions is well defined at rest (as basal metabolic rate and thermic effect of feeding), and currently under validation for assessing the energy cost of low-intensity dynamic work. However, because physical activities of daily life consist of a combination of both dynamic and isometric work, there is also a need to develop standardized tests for assessing human variability in the energy cost of low-intensity isometric work. Experimental objectives: Development of an approach to study human variability in isometric thermogenesis by incorporating a protocol of intermittent leg press exercise of varying low-intensity isometric loads with measurements of EE by indirect calorimetry. Results: EE was measured in the seated position with the subject at rest or while intermittently pressing both legs against a press-platform at 5 low-intensity isometric loads (+5, +10, +15, +20 and +25 kg force), each consisting of a succession of 8 cycles of press (30 s) and rest (30 s). EE, integrated over each 8-min period of the intermittent leg press exercise, was found to increase linearly across the 5 isometric loads with a correlation coefficient (r > 0.9) for each individual. The slope of this EE-Load relationship, which provides the energy cost of this standardized isometric exercise expressed per kg force applied intermittently (30 s in every min), was found to show good repeatability when assessed in subjects who repeated the same experimental protocol on 3 separate days: its low intra-individual coefficient of variation (CV) of ~10% contrasted with its much higher inter-individual CV of 35%; the latter being mass-independent but partly explained by height. Conclusion: This standardized approach to study isometric thermogenesis opens up a new avenue for research in EE phenotyping and metabolic predisposition to obesity.
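
    The EE-Load slope described above is simply the coefficient of a per-subject linear fit of energy expenditure against applied load. A minimal NumPy sketch follows; the EE values are invented placeholders, not data from the study.

    ```python
    import numpy as np

    # Isometric loads applied intermittently (kg force) and invented example
    # energy expenditure values integrated over each 8-min period (kcal/min).
    loads = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
    ee = np.array([1.32, 1.45, 1.60, 1.71, 1.85])

    slope, intercept = np.polyfit(loads, ee, deg=1)  # linear EE-Load fit
    r = np.corrcoef(loads, ee)[0, 1]

    print(f"energy cost per kg force: {slope:.4f} kcal/min per kg")
    print(f"intercept (no load)     : {intercept:.3f} kcal/min")
    print(f"correlation coefficient : r = {r:.3f}")  # the record reports r > 0.9 per subject
    ```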

  10. Capacity Management as a Service for Enterprise Standard Software

    Directory of Open Access Journals (Sweden)

    Hendrik Müller

    2017-12-01

    Full Text Available Capacity management approaches optimize component utilization from a strongly technical perspective. In fact, the quality of the involved services is considered only implicitly, by linking it to resource capacity values. This practice hinders the evaluation of design alternatives with respect to given service levels that are expressed in user-centric metrics such as the mean response time for a business transaction. We argue that utilized historical workload traces often contain a variety of performance-related information that allows for the integration of performance prediction techniques through machine learning. Since enterprise applications make extensive use of standard software that is shipped by large software vendors to a wide range of customers, standardized prediction models can be trained and provisioned as part of a capacity management service, which we propose in this article. Therefore, we integrate knowledge discovery activities into well-known capacity planning steps, which we adapt to the special characteristics of enterprise applications. Using a real-world example, we demonstrate how prediction models that were trained on a large volume of monitoring data enable cost-efficient measurement-based prediction techniques to be used in early design and redesign phases of planned or running applications. Finally, based on the trained model, we demonstrate how to simulate and analyze future workload scenarios. Using a Pareto approach, we were able to identify cost-effective design alternatives for an enterprise application whose capacity is being managed.

  11. Numerical approaches to time evolution of complex quantum systems

    International Nuclear Information System (INIS)

    Fehske, Holger; Schleede, Jens; Schubert, Gerald; Wellein, Gerhard; Filinov, Vladimir S.; Bishop, Alan R.

    2009-01-01

    We examine several numerical techniques for the calculation of the dynamics of quantum systems. In particular, we single out an iterative method which is based on expanding the time evolution operator into a finite series of Chebyshev polynomials. The Chebyshev approach benefits from two advantages over the standard time-integration Crank-Nicholson scheme: speedup and efficiency. Potential competitors are semiclassical methods such as the Wigner-Moyal or quantum tomographic approaches. We outline the basic concepts of these techniques and benchmark their performance against the Chebyshev approach by monitoring the time evolution of a Gaussian wave packet in restricted one-dimensional (1D) geometries. Thereby the focus is on tunnelling processes and the motion in anharmonic potentials. Finally we apply the prominent Chebyshev technique to two highly non-trivial problems of current interest: (i) the injection of a particle in a disordered 2D graphene nanoribbon and (ii) the spatiotemporal evolution of polaron states in finite quantum systems. Here, depending on the disorder/electron-phonon coupling strength and the device dimensions, we observe transmission or localisation of the matter wave.
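
    For reference, a sketch of the Chebyshev expansion of the propagator that such iterative schemes rely on (Tal-Ezer/Kosloff form, in units with ħ = 1) is given below; rescaling conventions and the placement of the phase factor vary between implementations, so the exact form here is an assumption.

    ```latex
    % Rescale the Hamiltonian so that its spectrum lies in [-1, 1]:
    %   \tilde{H} = (H - b)/a, \quad a = (E_{\max}-E_{\min})/2, \quad b = (E_{\max}+E_{\min})/2
    % Truncated Chebyshev expansion of the time-evolution operator:
    e^{-\mathrm{i}H\Delta t} \;\approx\; e^{-\mathrm{i}b\Delta t}
      \left[ J_0(a\Delta t)\,\mathbb{1}
           + 2\sum_{k=1}^{N} (-\mathrm{i})^{k} J_k(a\Delta t)\, T_k(\tilde{H}) \right]
    % J_k: Bessel functions of the first kind; the vectors T_k(\tilde{H})|\psi\rangle are
    % built with the recurrence T_{k+1} = 2\tilde{H}\,T_k - T_{k-1}, T_0 = 1, T_1 = \tilde{H}.
    ```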

  12. Monte Carlo variance reduction approaches for non-Boltzmann tallies

    International Nuclear Information System (INIS)

    Booth, T.E.

    1992-12-01

    Quantities that depend on the collective effects of groups of particles cannot be obtained from the standard Boltzmann transport equation. Monte Carlo estimates of these quantities are called non-Boltzmann tallies and have become increasingly important recently. Standard Monte Carlo variance reduction techniques were designed for tallies based on individual particles rather than groups of particles. Experience with non-Boltzmann tallies and analog Monte Carlo has demonstrated the severe limitations of analog Monte Carlo for many non-Boltzmann tallies. In fact, many calculations absolutely require variance reduction methods to achieve practical computation times. Three different approaches to variance reduction for non-Boltzmann tallies are described and shown to be unbiased. The advantages and disadvantages of each of the approaches are discussed

  13. Laparoscopic Heller Myotomy and Dor Fundoplication for Esophageal Achalasia: Technique and Perioperative Management.

    Science.gov (United States)

    Andolfi, Ciro; Fisichella, P Marco

    2016-11-01

    Surgical correction of achalasia using laparoscopic Heller myotomy with Dor fundoplication is argued to be the gold standard treatment for patients with achalasia. The goal of this technical report is to illustrate our preferred approach to patients with achalasia and to provide the reader with a detailed description of our operative technique, its rationale, and our pre- and postoperative management.

  14. Quality control with R an ISO standards approach

    CERN Document Server

    Cano, Emilio L; Prieto Corcoba, Mariano

    2015-01-01

    Presenting a practitioner's guide to capabilities and best practices of quality control systems using the R programming language, this volume emphasizes accessibility and ease-of-use through detailed explanations of R code as well as standard statistical methodologies. In the interest of reaching the widest possible audience of quality-control professionals and statisticians, examples throughout are structured to simplify complex equations and data structures, and to demonstrate their applications to quality control processes, such as ISO standards. The volume balances its treatment of key aspects of quality control, statistics, and programming in R, making the text accessible to beginners and expert quality control professionals alike. Several appendices serve as useful references for ISO standards and common tasks performed while applying quality control with R.

  15. [Poverty and Health: The Living Standard Approach as a Supplementary Concept to Measure Relative Poverty. Results from the German Socio-Economic Panel (GSOEP 2011)].

    Science.gov (United States)

    Pförtner, T-K

    2016-06-01

    A common indicator for the measurement of relative poverty is the disposable income of a household. Current research introduces the living standard approach as an alternative concept for describing and measuring relative poverty. This study compares both approaches with regard to the subjective health status of the German population, and provides theoretical implications for the use of the income and living standard approaches in health research. Analyses are based on the German Socio-Economic Panel (GSOEP) from the year 2011, which includes 12,290 private households and 21,106 survey members. Self-rated health was based on a subjective assessment of general health status. Income poverty is based on the equalised disposable income and is applied to a threshold of 60% of the median-based average income. A person is denoted as deprived (inadequate living standard) if 3 or more out of 11 living standard items are lacking for financial reasons. To calculate the discriminative power of both poverty indicators, descriptive analyses and stepwise logistic regression models were applied separately for men and women, adjusted for age, residence, nationality, educational level, occupational status and marital status. The results of the stepwise regression revealed a stronger poverty-health relationship for the living standard indicator. After adjusting for all control variables and the respective poverty indicator, income poverty was not statistically significantly associated with a poor subjective health status among men (OR Men: 1.33; 95% CI: 1.00-1.77) or women (OR Women: 0.98; 95% CI: 0.78-1.22). In contrast, the association between deprivation and subjective health status was statistically significant for men (OR Men: 2.00; 95% CI: 1.57-2.52) and women (OR Women: 2.11; 95% CI: 1.76-2.64). The results of the present study indicate that the income and standard of living approaches measure different dimensions of poverty. In comparison to the income approach, the living

  16. Rat pancreatic islet size standardization by the "hanging drop" technique.

    Science.gov (United States)

    Cavallari, G; Zuellig, R A; Lehmann, R; Weber, M; Moritz, W

    2007-01-01

    Rejection and hypoxia are the main factors that limit islet engraftment in the recipient liver in the immediate posttransplant period. Recently, authors have reported a negative relationship between graft function and islet size, concluding that small islets are superior to large islets. Islets can be dissociated into single cells and reaggregated into so-called "pseudoislets," which are functionally equivalent to intact islets but exhibit reduced immunogenicity. The aim of our study was to develop a technique to obtain pseudoislets of defined, preferably small, dimensions. Islets were harvested from Lewis rats by the collagenase digestion procedure. After purification, the isolated islets were dissociated into single cells by trypsin digestion. Fractions with different cell numbers were seeded as single drops onto cell culture dishes, which were inverted and incubated for 5 to 8 days under cell culture conditions. Newly formed pseudoislets were analyzed for dimension, morphology, and cellular composition. The volume of reaggregated pseudoislets strongly correlated with the cell number (r² = .995). The average diameter of a 250-cell aggregate was 95 ± 8 μm (mean ± SD) compared with 122 ± 46 μm for freshly isolated islets. Islet cell loss may be minimized by performing reaggregation in the presence of medium glucose (11 mmol/L) and the GLP-1 analogue Exendin-4. Morphology, cellular composition, and architecture of reaggregated islets were comparable to those of intact islets. The "hanging drop" culture method allowed us to obtain pseudoislets of standardized size and regular shape, which did not differ from intact islets in terms of cellular composition or architecture. Further investigations are required to minimize cell loss and test the in vivo function of transplanted pseudoislets.

  17. Palliative Spleen Irradiation: Can we Standardize its Technique?

    International Nuclear Information System (INIS)

    NAZMY, M.S.; RADWAN, A.; MOKHTAR, M.

    2008-01-01

    To explore the pattern of practice of palliative splenic irradiation (PSI) at the National Cancer Institute (NCI), Cairo University. Patients and Methods: The medical records of patients referred for PSI during the period from 1990 to 2005 were retrospectively reviewed. We compared the three most common planning techniques (two parallel opposing fields, a single direct field, and anterior plus lateral fields). Results: Eighteen patients who received PSI were identified. Thirteen patients were diagnosed with CML and 5 with CLL. The mean age of the patients was 44 (±16) years and the majority were men (60%). Spleen enlargement was documented in all cases. The single direct anterior field was the most commonly used technique. The dose per fraction ranged from 25 cGy to 100 cGy. The total dose ranged from 125 cGy to 1200 cGy and the median was 200 cGy (mean 327 cGy). There was no significant difference between CML and CLL patients regarding the dose level. Three out of 5 CLL patients and only one out of 13 CML patients received re-irradiation. All patients showed subjective improvement regarding pain and swelling. There was a significant increase in the hemoglobin level and a significant decrease in the WBC count. The single direct field shows variations in the dose from 56 to 102%; however, it is the simplest and the best regarding the dose to the surrounding normal tissues, especially the kidney and the liver. Conclusion: PSI has a significant palliative benefit. Although the most widely accepted technique is the 2 parallel opposing anterior-posterior fields, a single anterior field is also considered a suitable option. Higher doses are needed for CLL patients compared with CML patients.

  18. A new approach to voltage sag detection based on wavelet transform

    Energy Technology Data Exchange (ETDEWEB)

    Gencer, Oezguer; Oeztuerk, Semra; Erfidan, Tarik [Kocaeli University, Faculty of Engineering, Department of Electrical Engineering, Veziroglu Kampuesue, Eski Goelcuek Yolu, Kocaeli (Turkey)

    2010-02-15

    In this work, a new voltage sag detection method based on the wavelet transform is developed. Voltage sag detection algorithms developed so far have proved their efficiency and computational ability, but the use of several windowing techniques leads to long computational times for disturbance detection. Researchers have also been working on separating voltage sags from other voltage disturbances for the last decade. Due to increasingly demanding power quality standards, new high-performance disturbance detection algorithms are necessary. For this purpose, the wavelet technique is used for detecting voltage sag duration and magnitude. The developed voltage sag detection algorithm is implemented on a high-speed microcontroller. Test results show that the new approach provides very accurate and satisfactory voltage sag detection. (author)
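
    A minimal sketch of wavelet-based sag detection in the spirit of this abstract is shown below, using PyWavelets: the level-1 detail coefficients spike at the abrupt sag boundaries, and the sag magnitude follows from the RMS of the affected segment. The sampling rate, wavelet choice and threshold are illustrative assumptions.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    fs = 6400                       # sampling rate [Hz]
    t = np.arange(0, 0.2, 1 / fs)   # 200 ms window of a 50 Hz waveform (10 cycles)
    v = np.sin(2 * np.pi * 50 * t)
    sag = (t >= 0.0625) & (t < 0.1375)
    v[sag] *= 0.6                   # 40 % voltage sag

    # Level-1 detail coefficients highlight the abrupt changes at sag start/end
    _, d1 = pywt.dwt(v, "db4", mode="periodization")
    idx = np.where(np.abs(d1) > 5 * np.median(np.abs(d1)))[0]  # illustrative threshold
    if idx.size:
        print(f"sag onset  ~ {idx[0] * 2 / fs * 1e3:.1f} ms")   # expected near 62.5 ms
        print(f"sag ending ~ {idx[-1] * 2 / fs * 1e3:.1f} ms")  # expected near 137.5 ms

    # Sag magnitude estimated from the RMS of the affected segment vs nominal RMS
    print(f"sag magnitude ~ {np.sqrt(np.mean(v[sag] ** 2)) * np.sqrt(2):.2f} p.u.")
    ```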

  19. Techniques for Preservation of the Frontotemporal Branch of Facial Nerve during Orbitozygomatic Approaches

    DEFF Research Database (Denmark)

    Spiriev, Toma; Poulsgaard, Lars; Fugleholm, Kaare

    2015-01-01

    Background During orbitozygomatic (OZ) approaches, the frontotemporal branch (FTB) of the facial nerve is exposed to injury if proper measures are not taken. This article describes in detail the nuances of the two most common techniques (interfascial and subfascial dissection). Design The FTB...... of the facial nerve was dissected and followed in its tissue planes on fresh-frozen cadaver heads. The interfascial and subfascial dissections were performed, and every step was photographed and examined. Results The interfascial dissection is safe to be started from the most anterior part of the superior...

  20. Principle of the electrically induced Transient Current Technique

    Science.gov (United States)

    Bronuzzi, J.; Moll, M.; Bouvet, D.; Mapelli, A.; Sallese, J. M.

    2018-05-01

    In the field of detector development for High Energy Physics, the so-called Transient Current Technique (TCT) is used to characterize the electric field profile and the charge trapping inside silicon radiation detectors, where particles or photons create electron-hole pairs in the bulk of a semiconductor device such as a PiN diode. In the standard approach, the TCT signal originates from the free carriers generated close to the surface of a silicon detector by short pulses of light or by alpha particles. This work proposes a new principle of charge injection by means of lateral PN junctions implemented in one of the detector electrodes, called the electrical TCT (el-TCT). This technique is fully compatible with CMOS technology and therefore opens new perspectives for the assessment of radiation detector performance.

  1. Comparison of a new noncoplanar intensity-modulated radiation therapy technique for craniospinal irradiation with 3 coplanar techniques

    DEFF Research Database (Denmark)

    Hansen, Anders T; Lukacova, Slavka; Lassen-Ramshad, Yasmin A.

    2015-01-01

    When standard conformal x-ray technique for craniospinal irradiation is used, it is a challenge to achieve satisfactory dose coverage of the target including the area of the cribriform plate, while sparing organs at risk. We present a new intensity-modulated radiation therapy (IMRT), noncoplanar...... patient using the noncoplanar IMRT-based technique, a coplanar IMRT-based technique, and a coplanar volumetric-modulated arch therapy (VMAT) technique. Dosimetry data for all patients were compared with the corresponding data from the conventional treatment plans. The new noncoplanar IMRT technique...... substantially reduced the mean dose to organs at risk compared with the standard radiation technique. The 2 other coplanar techniques also reduced the mean dose to some of the critical organs. However, this reduction was not as substantial as the reduction obtained by the noncoplanar technique. Furthermore...

  2. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of  rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequential and collaboration diagrams are used to explain the dynamic and static aspects of the software system.    This second edition includes: a new chapter on Object Constraint Language (OCL), a new section dedicated to the Model-VIEW-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling.  It may be highly useful to undergraduate and graduate students as t...

  3. New autocorrelation technique for the IR FEL optical pulse width measurements

    Energy Technology Data Exchange (ETDEWEB)

    Amirmadhi, F.; Brau, K.A.; Becker, C. [Vanderbilt Univ., Nashville, TN (United States)] [and others]

    1995-12-31

    We have developed a new technique for the autocorrelation measurement of optical pulse width at the Vanderbilt University FEL center. This method is based on the nonlinear absorption and transmission characteristics of semiconductors such as Ge, Te and InAs, suitable for the wavelength range from 2 to over 6 microns. This approach, aside from being simple and low cost, removes the phase matching condition that is generally required for the standard frequency doubling technique and covers a greater wavelength range per nonlinear material. In this paper we describe the apparatus, explain the principal mechanism involved and compare data acquired with both frequency doubling and two-photon absorption.

  4. One-Tube-Only Standardized Site-Directed Mutagenesis: An Alternative Approach to Generate Amino Acid Substitution Collections.

    Directory of Open Access Journals (Sweden)

    Janire Mingo

    Full Text Available Site-directed mutagenesis (SDM) is a powerful tool to create defined collections of protein variants for experimental and clinical purposes, but its effectiveness is compromised when a large number of mutations is required. We present here a one-tube-only standardized SDM approach that generates comprehensive collections of amino acid substitution variants, including scanning and single-site multiple mutations. The approach combines unified mutagenic primer design with the mixing of multiple distinct primer pairs and/or plasmid templates to increase the yield of a single inverse-PCR mutagenesis reaction. Also, a user-friendly program for automatic design of standardized primers for Ala-scanning mutagenesis is made available. Experimental results were compared with a modeling approach together with stochastic simulation data. For single-site multiple mutagenesis purposes and for simultaneous mutagenesis in different plasmid backgrounds, combination of primer sets and/or plasmid templates in a single reaction tube yielded the distinct mutations in a stochastic fashion. For scanning mutagenesis, we found that a combination of overlapping primer sets in a single PCR reaction allowed the yield of different individual mutations, although this yield did not necessarily follow a stochastic trend. Double mutants were generated when the overlap of primer pairs was below 60%. Our results illustrate that one-tube-only SDM effectively reduces the number of reactions required in large-scale mutagenesis strategies, facilitating the generation of comprehensive collections of protein variants suitable for functional analysis.

  5. A systematic comparison of motion artifact correction techniques for functional near-infrared spectroscopy

    DEFF Research Database (Denmark)

    Cooper, Robert J; Selb, Juliette; Gagnon, Louis

    2012-01-01

    Principal component analysis, spline interpolation, wavelet analysis, and Kalman filtering approaches are compared to one another and to standard approaches using the accuracy of the recovered, simulated hemodynamic response function (HRF). Each of the four motion correction techniques we tested yields a significant reduction in the mean-squared error (MSE) and a significant increase in the contrast-to-noise ratio (CNR) of the recovered HRF when compared to no correction and compared to a process of rejecting motion-contaminated trials. Spline interpolation produces the largest average reduction in MSE (55%) while wavelet analysis produces the highest average increase in CNR (39%). On the basis of this analysis, we recommend the routine application of motion correction techniques (particularly spline interpolation or wavelet analysis) to minimize the impact of motion artifacts on functional NIRS data.
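
    The two figures of merit used above can be computed directly from the recovered and simulated HRFs. The sketch below assumes a particular CNR definition (true response amplitude over the standard deviation of the residual), which is an illustrative choice since CNR definitions vary between fNIRS studies.

    ```python
    import numpy as np

    def mse(recovered, true_hrf):
        """Mean-squared error between the recovered and simulated (true) HRF."""
        return np.mean((recovered - true_hrf) ** 2)

    def cnr(recovered, true_hrf):
        """Contrast-to-noise ratio: HRF amplitude over residual noise std (assumed definition)."""
        residual = recovered - true_hrf
        return (true_hrf.max() - true_hrf.min()) / residual.std()

    # Toy example: gamma-like HRF plus noise standing in for a motion-corrected recovery
    t = np.linspace(0, 20, 200)
    true_hrf = t ** 2 * np.exp(-t)
    recovered = true_hrf + np.random.default_rng(1).normal(0, 0.02, t.size)
    print(f"MSE = {mse(recovered, true_hrf):.2e}, CNR = {cnr(recovered, true_hrf):.1f}")
    ```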

  6. Iterative categorization (IC): a systematic technique for analysing qualitative data

    Science.gov (United States)

    2016-01-01

    Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  7. Factors and factorizations of graphs proof techniques in factor theory

    CERN Document Server

    Akiyama, Jin

    2011-01-01

    This book chronicles the development of graph factors and factorizations. It pursues a comprehensive approach, addressing most of the important results from hundreds of findings over the last century. One of the main themes is the observation that many theorems can be proved using only a few standard proof techniques. This stands in marked contrast to the seemingly countless, complex proof techniques offered by the extant body of papers and books. In addition to covering the history and development of this area, the book offers conjectures and discusses open problems. It also includes numerous explanatory figures that enable readers to progressively and intuitively understand the most important notions and proofs in the area of factors and factorization.

  8. How Much Can Non-industry Standard Measurement Methodologies Benefit Methane Reduction Programs?

    Science.gov (United States)

    Risk, D. A.; O'Connell, L.; Atherton, E.

    2017-12-01

    In recent years, energy sector methane emissions have been recorded in large part by applying modern non-industry-standard techniques. Industry may lack the regulatory flexibility to use such techniques, or in some cases may not understand the possible associated economic advantage. As progressive jurisdictions move from estimation and towards routine measurement, the research community should provide guidance to help regulators and companies measure more effectively, and economically if possible. In this study, we outline a modelling experiment in which we explore the integration of non-industry-standard measurement techniques as part of a generalized compliance measurement program. The study was not intended to be exhaustive, or to recommend particular combinations, but instead to explore the inter-relationships between methodologies, development type, compliance practice. We first defined the role, applicable scale, detection limits, working distances, and approximate deployment cost of several measurement methodologies. We then considered a variety of development types differing mainly in footprint, density, and emissions "profile". Using a Monte Carlo approach, we evaluated the effect of these various factors on the cost and confidence of the compliance measurement program. We found that when added individually, some of the research techniques were indeed able to deliver an improvement in cost and/or confidence when used alongside industry-standard Optical Gas Imaging. When applied in combination, the ideal fraction of each measurement technique depended on development type, emission profile, and whether confidence or cost was more important. Results suggest that measurement cost and confidence could be improved if energy companies exploited a wider range of measurement techniques, and in a manner tailored to each development. In the short-term, combining clear scientific guidance with economic information could benefit immediate mitigation efforts over

  9. Signal Amplification Technique (SAT): an approach for improving resolution and reducing image noise in computed tomography

    International Nuclear Information System (INIS)

    Phelps, M.E.; Huang, S.C.; Hoffman, E.J.; Plummer, D.; Carson, R.

    1981-01-01

    Spatial resolution improvements in computed tomography (CT) have been limited by the large and unique error propagation properties of this technique. The desire to provide maximum image resolution has resulted in the use of reconstruction filter functions designed to produce tomographic images with resolution as close as possible to the intrinsic detector resolution. Thus, many CT systems produce images with excessive noise with the system resolution determined by the detector resolution rather than the reconstruction algorithm. CT is a rigorous mathematical technique which applies an increasing amplification to increasing spatial frequencies in the measured data. This mathematical approach to spatial frequency amplification cannot distinguish between signal and noise and therefore both are amplified equally. We report here a method in which tomographic resolution is improved by using very small detectors to selectively amplify the signal and not noise. Thus, this approach is referred to as the signal amplification technique (SAT). SAT can provide dramatic improvements in image resolution without increases in statistical noise or dose because increases in the cutoff frequency of the reconstruction algorithm are not required to improve image resolution. Alternatively, in cases where image counts are low, such as in rapid dynamic or receptor studies, statistical noise can be reduced by lowering the cutoff frequency while still maintaining the best possible image resolution. A possible system design for a positron CT system with SAT is described

  10. A primer on standards setting as it applies to surgical education and credentialing.

    Science.gov (United States)

    Cendan, Juan; Wier, Daryl; Behrns, Kevin

    2013-07-01

    Surgical technological advances in the past three decades have led to dramatic reductions in the morbidity associated with abdominal procedures and permanently altered the surgical practice landscape. Significant changes continue apace including surgical robotics, natural orifice-based surgery, and single-incision approaches. These disruptive technologies have on occasion been injurious to patients, and high-stakes assessment before adoption of new technologies would be reasonable. We reviewed the drivers for well-established psychometric techniques available for the standards-setting process. We present a series of examples that are relevant in the surgical domain including standards setting for knowledge and skills assessments. Defensible standards for knowledge and procedural skills will likely become part of surgical clinical practice. Understanding the methodology for determining standards should position the surgical community to assist in the process and lead within their clinical settings as standards are considered that may affect patient safety and physician credentialing.

  11. Feasibility of CBCT-based dose calculation: Comparative analysis of HU adjustment techniques

    International Nuclear Information System (INIS)

    Fotina, Irina; Hopfgartner, Johannes; Stock, Markus; Steininger, Thomas; Lütgendorf-Caucig, Carola; Georg, Dietmar

    2012-01-01

    Background and purpose: The aim of this work was to compare the accuracy of different HU adjustments for CBCT-based dose calculation. Methods and materials: Dose calculation was performed on CBCT images of 30 patients. In the first two approaches, phantom-based (Pha-CC) and population-based (Pop-CC) conversion curves were used. The third method (WAB) represents override of the structures with standard densities for water, air and bone. In the ROI mapping approach, all structures were overridden with average HUs from the planning CT. All techniques were benchmarked to the Pop-CC and CT-based plans by DVH comparison and γ-index analysis. Results: For prostate plans, WAB and ROI mapping compared to Pop-CC showed differences in PTV D_median below 2%. The WAB and Pha-CC methods underestimated the bladder dose in IMRT plans. In lung cases, PTV coverage was underestimated by the Pha-CC method by 2.3% and slightly overestimated by the WAB and ROI techniques. The use of the Pha-CC method for head–neck IMRT plans resulted in differences in PTV coverage of up to 5%. Dose calculation with WAB and ROI techniques showed better agreement with pCT than conversion curve-based approaches. Conclusions: Density override techniques provide an accurate alternative to the conversion curve-based methods for dose calculation on CBCT images.
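
    The contrast between the conversion-curve and density-override strategies can be sketched as below. The piecewise-linear curve and the override thresholds/densities are illustrative placeholders, not the phantom- or population-based calibrations used in the study.

```python
import numpy as np

# Illustrative piecewise-linear HU-to-density conversion curve
# (placeholder values, not the Pha-CC or Pop-CC calibrations).
HU_POINTS      = np.array([-1000.0, 0.0, 1500.0])
DENSITY_POINTS = np.array([0.001,   1.0, 1.85])   # g/cm^3

def density_from_curve(hu):
    """Conversion-curve approach: interpolate density from HU."""
    return np.interp(hu, HU_POINTS, DENSITY_POINTS)

def density_from_override(hu, air_thresh=-300.0, bone_thresh=200.0):
    """WAB-style override: every voxel becomes water, air or bone."""
    density = np.full_like(hu, 1.0, dtype=float)      # water
    density[hu < air_thresh] = 0.001                  # air
    density[hu > bone_thresh] = 1.85                  # bone
    return density

cbct_hu = np.array([-950.0, -400.0, -20.0, 60.0, 900.0])
print("curve   :", density_from_curve(cbct_hu))
print("override:", density_from_override(cbct_hu))
```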

  12. Posterior malleolar fracture: technique and clinical experience of the posterolateral approach

    Directory of Open Access Journals (Sweden)

    HUANG Ruo-kun

    2012-04-01

    Full Text Available 【Abstract】Objective: To introduce the posterolateral surgical approach to the posterior malleolar fracture and report its clinical outcomes in 32 cases. Methods: This study consisted of 32 cases, 22 males and 10 females with the mean age of 48 years (range, 21-63 years), suffering from posterior malleolar fracture. All cases were treated with the posterolateral surgical approach to the ankle. The average follow-up period was 28 months (range, 24-35 months). The clinical outcomes of these cases were evaluated on the basis of the Olerud-Molander Ankle (OMA) score and plain radiographs. Results: All cases showed radiological evidence of bony union at follow-up. The average OMA score was 82 points; 21 cases had excellent scores (90-100 points), 9 good (61-90 points), and 2 fair (31-60 points). The excellent-to-good rate was 93.8%. Although most cases did not show any wound dehiscence or necrosis, one patient had a superficial infection which healed after using antibiotic dressing and one had sural cutaneous nerve injury that underwent spontaneous remission without any treatment after three months. In addition, one presented with mild symptoms of peroneal tendonitis that disappeared after plate removal. Conclusion: The posterolateral approach offers an effective technique for fracture reduction and fixation of large posterior malleolar fragments. Key words: Ankle injuries; Dislocations; Fracture fixation, internal

  13. Severe Gynecomastia: New Technique Using Superior Pedicle NAC Flap Through a Circumareolar Approach.

    Science.gov (United States)

    Ibrahiem, Saad Mohamed Saad

    2016-06-01

    : Gynecomastia is defined as benign proliferation of glandular breast tissue in men. Gynecomastia causes considerable emotional discomfort because of limitation of everyday activity especially in young men. Surgical treatment of gynecomastia significantly contributes to an increase in social activity and an improvement of social acceptance and emotional comfort, and thus significantly improves satisfaction from personal life in men who underwent this intervention. Various surgical techniques were suggested to treat gynecomastia, but most of them end with visible scars especially in severe degree gynecomastia. The aim of many plastic surgeons is to advocate new techniques treating severe gynecomastia (grade II B and III according to Simon et al) with less visible scars. The author proposed a new technique combining both surgery and liposuction for treating grade II B and III gynecomastia using only circumareolar approach. This study evaluates aesthetic results after surgery and assessment of the incidence of early and late postoperative complications. The patient was marked preoperatively while standing. Under general anesthesia, ultrasound-assisted liposuction of the periglandular area and de-epithelialization of excess skin were performed. A superiorly based nipple areola complex flap was created based on the subdermal plexus. The excess glandular tissue was resected through the lower half of the circle of the de-epithelialized area. Closure of the wound was done after insertion of 14-French redivac. This treatment protocol was applied to 27 patients, 18 to 53 years of age, from February 2008 till now. Among these patients, 4 were classified as type IIB and 23 as type III. Follow-up ranged from 3 months to 4 years. Complications were the following: 1 hematoma, 1 wound dehiscence, 1 loss of nipple areola complex, 2 cases of hypertrophied scars, and 3 minor aesthetic problems near areolae. A new periareolar approach for correction of severe-grade gynecomastia

  14. Surgical techniques for lumbo-sacral fusion.

    Science.gov (United States)

    Tropiano, P; Giorgi, H; Faure, A; Blondel, B

    2017-02-01

    Lumbo-sacral (L5-S1) fusion is a widely performed procedure that has become the reference standard treatment for refractory low back pain. L5-S1 is a complex transition zone between the mobile lordotic distal lumbar spine and the fixed sacral region. The goal is to immobilise the lumbo-sacral junction in order to relieve pain originating from this site. Apart from achieving inter-vertebral fusion, the main challenge lies in the preoperative determination of the fixed L5-S1 position that will be optimal for the patient. Many lumbo-sacral fusion techniques are available. Stabilisation can be achieved using various methods. An anterior, posterior, or combined approach may be used. Recently developed minimally invasive techniques are gaining in popularity based on their good clinical outcomes and high fusion rates. The objective of this conference is to resolve the main issues faced by spinal surgeons in their everyday practice. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  15. Reduced Rate of Dehiscence After Implementation of a Standardized Fascial Closure Technique in Patients Undergoing Emergency Laparotomy

    DEFF Research Database (Denmark)

    Tolstrup, Mai-Britt; Watt, Sara Kehlet; Gögenur, Ismail

    2017-01-01

    ... and multivariate Cox regression analysis were performed. RESULTS: We included 494 patients from 2014 to 2015 and 1079 patients from our historical cohort for comparison. All patients had a midline laparotomy in an emergency setting. The rate of dehiscence was reduced from 6.6% to 3.8%, P = 0.03, comparing years 2009 to 2013 with 2014 to 2015. Factors associated with dehiscence were male gender [hazard ratio (HR) 2.8, 95% confidence interval (95% CI) (1.8-4.4), P ...], ... (1.6-4.9), P ..., ... 4%, P = 0.008. CONCLUSION: The standardized procedure of closing the midline laparotomy by using a "small steps" technique of continuous suturing ...

  16. Strain analysis in CRT candidates using the novel segment length in cine (SLICE) post-processing technique on standard CMR cine images

    NARCIS (Netherlands)

    Zweerink, A.; Allaart, C.P.; Kuijer, J.P.A.; Wu, L.; Beek, A.M.; Ven, P.M. van de; Meine, M.; Croisille, P.; Clarysse, P.; Rossum, A.C. van; Nijveldt, R.

    2017-01-01

    OBJECTIVES: Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive

  17. Working with text tools, techniques and approaches for text mining

    CERN Document Server

    Tourte, Gregory J L

    2016-01-01

    Text mining tools and technologies have long been a part of the repository world, where they have been applied to a variety of purposes, from pragmatic aims to support tools. Research areas as diverse as biology, chemistry, sociology and criminology have seen effective use made of text mining technologies. Working With Text collects a subset of the best contributions from the 'Working with text: Tools, techniques and approaches for text mining' workshop, alongside contributions from experts in the area. Text mining tools and technologies in support of academic research include supporting research on the basis of a large body of documents, facilitating access to and reuse of extant work, and bridging between the formal academic world and areas such as traditional and social media. Jisc have funded a number of projects, including NaCTem (the National Centre for Text Mining) and the ResDis programme. Contents are developed from workshop submissions and invited contributions, including: Legal considerations in te...

  18. Dosimetric comparison of intensity modulated radiotherapy techniques and standard wedged tangents for whole breast radiotherapy

    International Nuclear Information System (INIS)

    Fong, Andrew; Bromley, Regina; Beat, Mardi; Vien, Din; Dineley, Jude; Morgan, Graeme

    2009-01-01

    Full text: Prior to introducing intensity modulated radiotherapy (IMRT) for whole breast radiotherapy (WBRT) into our department we undertook a comparison of the dose parameters of several IMRT techniques and standard wedged tangents (SWT). Our aim was to improve the dose distribution to the breast and to decrease the dose to organs at risk (OAR): heart, lung and contralateral breast (Contra Br). Treatment plans for 20 women (10 right-sided and 10 left-sided) previously treated with SWT for WBRT were used to compare (a) SWT; (b) electronic compensators IMRT (E-IMRT); (c) tangential beam IMRT (T-IMRT); (d) coplanar multi-field IMRT (CP-IMRT); and (e) non-coplanar multi-field IMRT (NCP-IMRT). Plans for the breast were compared for (i) dose homogeneity (DH); (ii) conformity index (CI); (iii) mean dose; (iv) maximum dose; (v) minimum dose; and dose to OAR were calculated for (vi) heart; (vii) lung; and (viii) Contra Br. Compared with SWT, all plans except CP-IMRT gave improvement in at least two of the seven parameters evaluated. T-IMRT and NCP-IMRT resulted in significant improvement in all parameters except DH and both gave significant reduction in doses to OAR. As initial evaluation indicated that NCP-IMRT is likely to be too time-consuming to introduce on a large scale, T-IMRT is the preferred technique for WBRT in our department.

  19. Standard approach to plant modifications

    International Nuclear Information System (INIS)

    Mecredy, R.C.

    1988-01-01

    Organizational and management approaches to the design, installation, and turnover of nuclear plant modifications have changed dramatically in the last 10 to 15 yr. In response to these changes, organizational and individual responsibilities have been defined and management systems have been established at Rochester Gas and Electric (RG and E) Corporation to ensure that high-quality plant modifications are installed in a timely manner that satisfies user needs at minimal cost

  20. A new approach to the determination of air kerma using primary-standard cavity ionization chambers

    International Nuclear Information System (INIS)

    Burns, D T

    2006-01-01

    A consistent formalism is presented using Monte Carlo calculations to determine the reference air kerma from the measured energy deposition in a primary-standard cavity ionization chamber. A global approach avoiding the use of cavity ionization theory is discussed and its limitations shown in relation to the use of the recommended value for W. The role of charged-particle equilibrium is outlined and the consequent requirements placed on the calculations are detailed. Values for correction factors are presented for the BIPM air-kerma standard for 60Co, making use of the Monte Carlo code PENELOPE, a detailed geometrical model of the BIPM 60Co source and event-by-event electron transport. While the wall correction factor k_wall = 1.0012(2) is somewhat lower than the existing value, the axial non-uniformity correction k_an = 1.0027(3) is significantly higher. The use of a point source in the evaluation of k_an is discussed. A comparison is made of the calculated dose ratio with the Bragg-Gray and Spencer-Attix stopping-power ratios, the results indicating a preference for the Bragg-Gray approach in this particular case. A change to the recommended value for W of up to 2 parts in 10^3 is discussed. The uncertainties arising from the geometrical models, the use of phase-space files, the radiation transport algorithms and the underlying radiation interaction coefficients are estimated

  1. A Rough Set Approach for Customer Segmentation

    Directory of Open Access Journals (Sweden)

    Prabha Dhandayudam

    2014-04-01

    Full Text Available Customer segmentation is a process that divides a business's total customers into groups according to their diversity of purchasing behavior and characteristics. The data mining clustering technique can be used to accomplish this customer segmentation. This technique clusters the customers in such a way that the customers in one group behave similarly when compared to the customers in other groups. The customer related data are categorical in nature. However, the clustering algorithms for categorical data are few and are unable to handle uncertainty. Rough set theory (RST) is a mathematical approach that handles uncertainty and is capable of discovering knowledge from a database. This paper proposes a new clustering technique called MADO (Minimum Average Dissimilarity between Objects) for categorical data based on elements of RST. The proposed algorithm is compared with other RST based clustering algorithms, such as MMR (Min-Min Roughness), MMeR (Min Mean Roughness), SDR (Standard Deviation Roughness), SSDR (Standard deviation of Standard Deviation Roughness), and MADE (Maximal Attributes DEpendency). The results show that for the real customer data considered, the MADO algorithm achieves clusters with higher cohesion, lower coupling, and less computational complexity when compared to the above mentioned algorithms. The proposed algorithm has also been tested on a synthetic data set to prove that it is also suitable for high dimensional data.
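
    A minimal sketch of the underlying idea, measuring dissimilarity between categorical customer records and assigning each record to the least-dissimilar seed, is shown below. It uses a simple-matching distance for illustration and is not the published MADO (or MMR/MMeR/SDR/SSDR/MADE) algorithm; the toy records are invented.

```python
import numpy as np

def simple_matching_dissimilarity(x, y):
    """Fraction of categorical attributes on which two objects differ."""
    x, y = np.asarray(x), np.asarray(y)
    return float(np.mean(x != y))

def average_dissimilarity(obj, others):
    """Average dissimilarity of one object to a group of objects."""
    return float(np.mean([simple_matching_dissimilarity(obj, o) for o in others]))

# Toy customer records: (gender, region, preferred_category)
customers = [
    ("F", "north", "electronics"),
    ("F", "north", "books"),
    ("M", "south", "electronics"),
    ("M", "south", "garden"),
]

# Objects with low average dissimilarity are 'central' and make natural
# cluster seeds; each record is then assigned to its closest seed.
for c in customers:
    print(c, "avg dissimilarity:", round(average_dissimilarity(c, customers), 2))

seeds = [customers[0], customers[2]]
for c in customers:
    scores = [simple_matching_dissimilarity(c, s) for s in seeds]
    print(c, "-> cluster", int(np.argmin(scores)))
```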

  2. Student’s Perceptions on Simulation as Part of Experiential Learning in Approaches, Methods, and Techniques (AMT) Course

    Directory of Open Access Journals (Sweden)

    Marselina Karina Purnomo

    2017-03-01

    Full Text Available Simulation is a part of Experiential Learning which represents certain real-life events. In this study, simulation is used as a learning activity in the Approaches, Methods, and Techniques (AMT) course, which is one of the courses in the English Language Education Study Program (ELESP) of Sanata Dharma University. Since simulation represents real-life events, it encourages students to apply the approaches, methods, and techniques being studied to a real-life classroom. Several experts state that students are able to involve their personal experiences through simulation, which is additionally believed to create meaningful learning in the class. This study aimed to discover ELESP students’ perceptions of simulation as a part of Experiential Learning in the AMT course. From the findings, it could be inferred that the students agreed that in-class simulation was important for their learning because it created meaningful learning in class.  DOI: https://doi.org/10.24071/llt.2017.200104

  3. Wireless installation standard

    International Nuclear Information System (INIS)

    Lim, Hwang Bin

    2007-12-01

    This document is divided into six parts, covering: the radio regulation law on the securing of radio resources, the use of radio resources, and the protection of radio resources; the radio regulation enforcement ordinance on the securing, distribution and assignment under the radio regulation; the radio regulation enforcement regulation on the use of radio resources and the technical qualification examination; the wireless installation regulation of technical standards and safety facility standards; and radio regulations such as the certification regulation of information and communication machines and the regulation of radio stations on compliance with signal security, radio equipment in radio stations, standard frequency stations and emergency communication.

  4. Recanalization strategy for chronic total occlusions with tapered and stiff-tip guidewire. The results of CTO new techniQUE for STandard procedure (CONQUEST) trial.

    Science.gov (United States)

    Mitsudo, Kazuaki; Yamashita, Takehiro; Asakura, Yasushi; Muramatsu, Toshiya; Doi, Osamu; Shibata, Yoshisato; Morino, Yoshihiro

    2008-11-01

    The success rate of percutaneous coronary intervention (PCI) for chronic total coronary occlusion (CTO) lesions varies depending on the guidewire manipulation skills of the operator. The standardization of guidewire technique is very important. A new technique with a new tapered wire (Conquest, Confianza Pro) was tested to verify its effectiveness for higher initial success rates and standardization of PCI for CTO. A prospective, multicenter registry was conducted at 6 investigational sites. In the CONQUEST trial, the CTO lesions were treated by first using an intermediate guidewire to cross the lesion. If it did not cross, the guidewire was changed to the Conquest guidewire. If that did not cross, "seesaw-wiring" or the "parallel-wire technique" was performed. The primary endpoint was the initial procedural success rate. A total of 110 patients representing 116 CTO lesions were treated from July 2003 through March 2004. The procedural success rate was 86.2% on the first try and 88.8% on the second try. The guidewire success rate on the second try was 90.5%. During the hospital stay, no deaths or acute myocardial infarctions were confirmed. Two patients developed tamponade, and surgical or percutaneous drainage was performed in each patient without any sequelae. A guidewire technique in PCI for CTOs that starts with the intermediate guidewire and moves to the Confianza Pro tapered guidewire, either alone or by performing a see-saw or parallel-wire technique, can achieve a high initial success rate with an acceptably low major complication rate.

  5. Disciplining standard-setting : Which approach to choose (if any)?

    NARCIS (Netherlands)

    Kanevskaia, Olia; Jacobs, Kai; Blind, Knut

    In the world of continuous globalization, standards play a crucial role in transnational economic development. Being the drivers of harmonization and innovation, standards do not only facilitate production and exchange in goods and services, but also carry significant policy implications and create

  6. Disciplining standard-setting : Which approach to choose (if any)

    NARCIS (Netherlands)

    Kanevskaia, Olia

    2017-01-01

    In the world of continuous globalization, standards play a crucial role in transnational economic development. Being the drivers of harmonization and innovation, standards do not only facilitate production and exchange in goods and services, but also carry significant policy implications and create

  7. Prevention of air pollution: guidebook of the French techniques of dedusting and purification of gases and smokes in the industry; Prevention de la pollution de l'air: guide des techniques francaises de depoussierage et d'epuration des gaz et fumees dans l'industrie

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    This new edition favours the multi-pollutants approach and integrates the improvements made in the European standardization of emission measurements. It contributes to the diffusion of the French techniques in the domain of gases dedusting and purification. (J.S.)

  8. A novel approach to water pollution monitoring by combining ion exchange resin and XRF-scanning technique

    Science.gov (United States)

    Huang, J. J.; Lin, S. C.; Löwemark, L.; Liou, Y. H.; Chang, Q. M.; Chang, T. K.; Wei, K. Y.; Croudace, I. W. C.

    2017-12-01

    Due to rapid industrial expansion, environments are subject to irregular fluctuations and spatial distributions in pollutant concentrations. This study proposes to use ion exchange resin in combination with the XRF-scanning technique to monitor environmental pollution. As a passive sampling sorbent, the use of ion exchange resin provides a rapid, low cost and simple method to detect episodic pollution signals with a high spatial sampling density. In order to process large quantities of samples, the fast and non-destructive Itrax-XRF core scanner has been introduced to assess elemental concentrations in the resin samples. Although XRF scanning results are often considered a semi-quantitative measurement due to possible absorption or scattering caused by the physical variabilities of scanned materials, the use of resin can minimize such influences owing to the standardization of the sample matrix. In this study, 17 lab-prepared standard resin samples were scanned with the Itrax-XRF core scanner (at 100 s exposure time with the Mo-tube) and compared with the absolute elemental concentrations. Six elements generally used in pollution studies (Cr, Mn, Ni, Cu, Zn, and Pb) were selected, and their regression lines and correlation coefficients were determined. In addition, 5 standard resin samples were scanned at different exposure time settings (1 s, 5 s, 15 s, 30 s, 100 s) to address the influence of exposure time on the accuracy of the measurements. The results show that within the test range (from a few ppm to thousands of ppm), the correlation coefficients are higher than 0.97, even at the shortest exposure time (1 s). Furthermore, a pilot field survey with 30 resin samples has been conducted in a potentially polluted farm area in central Taiwan to demonstrate the feasibility of this novel approach. The polluted hot zones could be identified and the properties and sources of wastewater pollution can therefore be traced over large areas for the purposes of
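
    A calibration of this kind reduces to fitting a regression line of scanner response against known concentration and inverting it for unknown samples, roughly as sketched below; the element, counts and concentrations are made-up illustrative values, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: known Cu concentration in standard
# resin samples (ppm) vs. XRF peak area (counts); values are
# illustrative only.
concentration_ppm = np.array([5, 20, 50, 100, 250, 500, 1000, 2000], float)
xrf_counts        = np.array([40, 150, 390, 810, 1950, 3900, 8100, 15800], float)

# Least-squares calibration line and correlation coefficient,
# as used to check linearity of the resin-based standards.
result = stats.linregress(concentration_ppm, xrf_counts)
print(f"slope={result.slope:.2f} counts/ppm, r={result.rvalue:.4f}")

# Invert the calibration to quantify an unknown field resin sample.
unknown_counts = 2600.0
estimated_ppm = (unknown_counts - result.intercept) / result.slope
print(f"estimated concentration: {estimated_ppm:.0f} ppm")
```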

  9. Frequency standards

    CERN Document Server

    Riehle, Fritz

    2006-01-01

    Of all measurement units, frequency is the one that may be determined with the highest degree of accuracy. It equally allows precise measurements of other physical and technical quantities, whenever they can be measured in terms of frequency.This volume covers the central methods and techniques relevant for frequency standards developed in physics, electronics, quantum electronics, and statistics. After a review of the basic principles, the book looks at the realisation of commonly used components. It then continues with the description and characterisation of important frequency standards

  10. Four-chamber view and 'swing technique' (FAST) echo: a novel and simple algorithm to visualize standard fetal echocardiographic planes.

    Science.gov (United States)

    Yeo, L; Romero, R; Jodicke, C; Oggè, G; Lee, W; Kusanovic, J P; Vaisbuch, E; Hassan, S

    2011-04-01

    To describe a novel and simple algorithm (four-chamber view and 'swing technique' (FAST) echo) for visualization of standard diagnostic planes of fetal echocardiography from dataset volumes obtained with spatiotemporal image correlation (STIC) and applying a new display technology (OmniView). We developed an algorithm to image standard fetal echocardiographic planes by drawing four dissecting lines through the longitudinal view of the ductal arch contained in a STIC volume dataset. Three of the lines are locked to provide simultaneous visualization of targeted planes, and the fourth line (unlocked) 'swings' through the ductal arch image (swing technique), providing an infinite number of cardiac planes in sequence. Each line generates the following plane(s): (a) Line 1: three-vessels and trachea view; (b) Line 2: five-chamber view and long-axis view of the aorta (obtained by rotation of the five-chamber view on the y-axis); (c) Line 3: four-chamber view; and (d) 'swing line': three-vessels and trachea view, five-chamber view and/or long-axis view of the aorta, four-chamber view and stomach. The algorithm was then tested in 50 normal hearts in fetuses at 15.3-40 weeks' gestation and visualization rates for cardiac diagnostic planes were calculated. To determine whether the algorithm could identify planes that departed from the normal images, we tested the algorithm in five cases with proven congenital heart defects. In normal cases, the FAST echo algorithm (three locked lines and rotation of the five-chamber view on the y-axis) was able to generate the intended planes (longitudinal view of the ductal arch, pulmonary artery, three-vessels and trachea view, five-chamber view, long-axis view of the aorta, four-chamber view) individually in 100% of cases (except for the three-vessels and trachea view, which was seen in 98% (49/50)) and simultaneously in 98% (49/50). The swing technique was able to generate the three-vessels and trachea view, five-chamber view and/or long

  11. Standardization of thorax, skull and pelvis radiographic images

    International Nuclear Information System (INIS)

    Pina, D.R.; Ghilardi Netto, T.; Trad, C.S.; Brochi, M.A. Corte; Duarte, S.B.; Pina, S.R.

    2001-01-01

    The radiographic techniques for producing chest, skull and pelvis examinations were determined for a standard patient. These techniques produced a quality image with a smaller dose for the standard patient on any conventional X-ray equipment. The radiographic contrast produced by these techniques was measured utilizing the realistic-analytic phantom and classified as an ideal radiographic contrast. This work aims to maintain the standard of image quality for any patient thickness usually found in the clinical routine of the radiodiagnosis service, satisfying the risk-benefit relation for the patient and the cost-benefit relation for the institution. (author)

  12. Sound Power Estimation by Laser Doppler Vibration Measurement Techniques

    Directory of Open Access Journals (Sweden)

    G.M. Revel

    1998-01-01

    Full Text Available The aim of this paper is to propose simple and quick methods for the determination of the sound power emitted by a vibrating surface, by using non-contact vibration measurement techniques. In order to calculate the acoustic power by vibration data processing, two different approaches are presented. The first is based on the method proposed in the Standard ISO/TR 7849, while the second is based on the superposition theorem. A laser-Doppler scanning vibrometer has been employed for vibration measurements. Laser techniques open up new possibilities in this field because of their high spatial resolution and their non-intrusivity. The technique has been applied here to estimate the acoustic power emitted by a loudspeaker diaphragm. Results have been compared with those from a commercial Boundary Element Method (BEM) software and experimentally validated by acoustic intensity measurements. Predicted and experimental results seem to be in agreement (differences lower than 1 dB), thus showing that the proposed techniques can be employed as rapid solutions for many practical and industrial applications. Uncertainty sources are addressed and their effect is discussed.
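
    A rough sketch of the first, ISO/TR 7849-style approach is given below: the radiated power is approximated from the spatially averaged mean-square surface velocity. The radiation efficiency, air properties and scan values are assumptions for illustration only, not the paper's measurements.

```python
import numpy as np

def sound_power_from_velocity(v_rms, areas, rho=1.21, c=343.0, sigma=1.0):
    """Estimate radiated sound power from surface normal velocities,
    in the spirit of the ISO/TR 7849 vibration-velocity method:

        W = rho * c * sigma * sum_i(S_i * <v_i^2>)

    v_rms : RMS normal velocity at each scan point (m/s), e.g. from a
            scanning laser Doppler vibrometer
    areas : surface area associated with each point (m^2)
    sigma : radiation efficiency (assumed 1 here; in practice it must
            be estimated or bounded)
    """
    v_rms = np.asarray(v_rms, float)
    areas = np.asarray(areas, float)
    W = rho * c * sigma * np.sum(areas * v_rms**2)
    Lw = 10.0 * np.log10(W / 1e-12)        # sound power level re 1 pW
    return W, Lw

# Hypothetical 4-point scan over a 0.04 m^2 loudspeaker diaphragm.
W, Lw = sound_power_from_velocity(
    v_rms=[2e-3, 3e-3, 2.5e-3, 1.5e-3],
    areas=[0.01] * 4,
)
print(f"W = {W:.3e} W, Lw = {Lw:.1f} dB re 1 pW")
```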

  13. A COGNITIVE APPROACH TO CORPORATE GOVERNANCE: A VISUALIZATION TEST OF MENTAL MODELS WITH THE COGNITIVE MAPPING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Garoui NASSREDDINE

    2012-01-01

    Full Text Available The idea of this paper is to determine the mental models of actors in the firm with respect to the cognitive approach of corporate governance. The paper takes a corporate governance perspective, discusses mental models and uses the cognitive map to view the diagrams showing the ways of thinking and the conceptualization of the cognitive approach. In addition, it employs a cognitive mapping technique. Returning to the systematic exploration of grids for each actor, it concludes that there is a balance of concepts expressing their cognitive orientation.

  14. Salivary Fluoride level in preschool children after toothbrushing with standard and low fluoride content dentifrice, using the transversal dentifrice application technique: pilot study

    Directory of Open Access Journals (Sweden)

    Fabiana Jandre Melo

    2008-01-01

    Full Text Available Objective: To investigate the salivary fluoride concentration in pre-school children after toothbrushing with dentifrice containing standard (1100 ppmF/NaF) and low (500 ppmF/NaF) fluoride concentration, using the transversal technique of placing the product on the toothbrush. Methods: Eight children of both sexes, ranging from 4 to 9 years, and 5 years and 6 months of age, participated in the study. The experiment was divided into two phases with a weekly interval. In the first stage, the children used the standard concentration dentifrice for one week, and in the second, the low concentration product. Samples were collected at the end of each experimental stage, at the following times: before brushing, immediately afterwards, and after 15, 30 and 45 minutes. The fluoride contents were analyzed by the microdiffusion technique. Statistical analysis was done by analysis of variance (ANOVA) and Student's t-test (p<0.05). Results: The salivary fluoride concentration was significantly higher at all times when the standard concentration product was used. The comparison between the fluoride concentration found before brushing and immediately afterwards showed that there was a 6.8 times increase with the standard dentifrice (0.19 x 1.29 μgF/ml) and, with the low concentration product, an increase of 20.5 times (0.02 x 0.41 μgF/ml). Conclusion: Toothbrushing with both products promoted relevant increases in the salivary fluoride concentration; however, longitudinal studies are necessary to verify the clinical result of this measurement.

  15. Technique for detecting a small magnitude loss of special nuclear material

    International Nuclear Information System (INIS)

    Pike, D.H.; Chernick, M.R.; Downing, D.J.

    The detection of losses of special nuclear materials has been the subject of much research in recent years. The standard industry practice using ID/LEID will detect large magnitude losses. Time series techniques such as the Kalman Filter or CUSUM methods will detect small magnitude losses if they occur regularly over a sustained period of time. To date no technique has been proposed which adequately addresses the problem of detecting a small magnitude loss occurring in a single period. This paper proposes a method for detecting a small magnitude loss. The approach makes use of the influence function of Hampel. The influence function measures the effect of a single inventory difference on a group of statistics. An inventory difference for a period in which a loss occurs can be expected to produce an abnormality in the calculated statistics. This abnormality is measurable by the influence function. It is shown that a one period loss smaller in magnitude than the LEID can be detected using this approach
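
    One simple way to picture the influence-function idea is the leave-one-out ("empirical influence") sketch below, which scores how strongly each period's inventory difference pulls on a summary statistic. It is a simplified stand-in for the approach described in the abstract, not the paper's statistic, and the simulated inventory differences are invented.

```python
import numpy as np

def empirical_influence(inventory_differences):
    """Leave-one-out ('empirical') influence of each inventory
    difference on the sample mean; a simplified stand-in for the
    influence-function idea described in the abstract."""
    d = np.asarray(inventory_differences, float)
    n = len(d)
    full_mean = d.mean()
    influence = np.empty(n)
    for i in range(n):
        loo_mean = np.delete(d, i).mean()
        influence[i] = (n - 1) * (full_mean - loo_mean)   # equals d[i] - mean
    return influence

# Simulated inventory differences (kg): measurement noise only,
# except for a single small loss injected in period 7.
rng = np.random.default_rng(1)
ids = rng.normal(0.0, 0.5, size=12)
ids[7] += 1.2   # one-period loss, smaller than a typical alarm limit

infl = empirical_influence(ids)
z = (infl - infl.mean()) / infl.std(ddof=1)
print("standardized influence:", np.round(z, 2))
print("most influential period:", int(np.argmax(np.abs(z))))
```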

  16. Background estimation techniques in searches for heavy resonances at CMS

    CERN Document Server

    Benato, Lisa

    2017-01-01

    Many Beyond Standard Model theories foresee the existence of heavy resonances (over 1 TeV) decaying into final states that include a highly energetic, boosted jet and charged leptons or neutrinos. In these very peculiar conditions, Monte Carlo predictions are not reliable enough to reproduce accurately the expected Standard Model background. A data-Monte Carlo hybrid approach (alpha method) has been successfully adopted since Run 1 in searches for heavy Higgs bosons performed by the CMS Collaboration. By taking advantage of data in signal-free control regions, determined by exploiting the boosted jet substructure, predictions are extracted in the signal region. The alpha method and jet substructure techniques are described in detail, along with some recent results obtained with 2016 Run 2 data collected by the CMS detector.

  17. Performance analysis of air-standard Diesel cycle using an alternative irreversible heat transfer approach

    International Nuclear Information System (INIS)

    Al-Hinti, I.; Akash, B.; Abu-Nada, E.; Al-Sarkhi, A.

    2008-01-01

    This study presents the investigation of the air-standard Diesel cycle under irreversible heat transfer conditions. The effects of various engine parameters are presented. An alternative approach is used to evaluate net power output and cycle thermal efficiency from more realistic parameters such as air-fuel ratio, fuel mass flow rate, intake temperature, engine design parameters, etc. It is shown that for a given fuel flow rate, thermal efficiency and maximum power output increase with decreasing air-fuel ratio. Also, for a given air-fuel ratio, the maximum power output increases with increasing fuel rate. However, the effect on the thermal efficiency is limited
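
    For orientation, the sketch below evaluates an ideal (reversible) air-standard Diesel cycle from the same kind of inputs (fuel flow, air-fuel ratio, intake temperature). It omits the paper's irreversible heat-transfer treatment, so its trends need not reproduce the reported ones; all parameter values and the ideal-cycle assumptions are ours, not the authors'.

```python
def diesel_cycle_performance(fuel_flow_kg_s, air_fuel_ratio,
                             compression_ratio=18.0, intake_T=300.0,
                             LHV=42.5e6, cp=1005.0, gamma=1.4):
    """Ideal air-standard Diesel cycle driven by 'realistic' inputs
    (fuel flow, air-fuel ratio, intake temperature). Illustrative only."""
    q_in = LHV / air_fuel_ratio                          # heat added per kg of air (J/kg)
    T2 = intake_T * compression_ratio**(gamma - 1.0)     # end of isentropic compression
    cutoff_ratio = 1.0 + q_in / (cp * T2)                # constant-pressure heat addition
    efficiency = 1.0 - (cutoff_ratio**gamma - 1.0) / (
        gamma * (cutoff_ratio - 1.0) * compression_ratio**(gamma - 1.0))
    net_power = efficiency * fuel_flow_kg_s * LHV        # W
    return efficiency, net_power

for afr in (18.0, 25.0, 40.0):
    eta, P = diesel_cycle_performance(fuel_flow_kg_s=0.002, air_fuel_ratio=afr)
    print(f"AFR={afr:4.0f}  efficiency={eta:.3f}  power={P/1e3:.1f} kW")
```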

  18. Feedback Linearization approach for Standard and Fault Tolerant control: Application to a Quadrotor UAV Testbed

    International Nuclear Information System (INIS)

    Ghandour, J; Aberkane, S; Ponsart, J-C

    2014-01-01

    In this paper the control problem of a quadrotor vehicle experiencing a rotor failure is investigated. We develop a feedback linearization approach to design a controller whose task is to make the vehicle perform trajectory following. Then we use the same approach to design a controller whose task is to make the vehicle enter a stable spin around its vertical axis, while retaining zero angular velocities around the other axes when a rotor failure is present. These conditions can be exploited to design a second control loop, which is used to perform trajectory following. The proposed double control loop architecture allows the vehicle to perform both trajectory and roll/pitch control. Finally, to test the robustness of the feedback linearization technique, we applied wind to the quadrotor in mid-flight

  19. Dynamic acousto-elastic testing of concrete with a coda-wave probe: comparison with standard linear and nonlinear ultrasonic techniques.

    Science.gov (United States)

    Shokouhi, Parisa; Rivière, Jacques; Lake, Colton R; Le Bas, Pierre-Yves; Ulrich, T J

    2017-11-01

    The use of nonlinear acoustic techniques in solids consists in measuring wave distortion arising from compliant features such as cracks, soft intergrain bonds and dislocations. As such, they provide very powerful nondestructive tools to monitor the onset of damage within materials. In particular, a recent technique called dynamic acousto-elasticity testing (DAET) gives unprecedented details on the nonlinear elastic response of materials (classical and non-classical nonlinear features including hysteresis, transient elastic softening and slow relaxation). Here, we provide a comprehensive set of linear and nonlinear acoustic responses on two prismatic concrete specimens; one intact and one pre-compressed to about 70% of its ultimate strength. The two linear techniques used are Ultrasonic Pulse Velocity (UPV) and Resonance Ultrasound Spectroscopy (RUS), while the nonlinear ones include DAET (fast and slow dynamics) as well as Nonlinear Resonance Ultrasound Spectroscopy (NRUS). In addition, the DAET results correspond to a configuration where the (incoherent) coda portion of the ultrasonic record is used to probe the samples, as opposed to a (coherent) first arrival wave in standard DAET tests. We find that the two visually identical specimens are indistinguishable based on parameters measured by linear techniques (UPV and RUS). On the contrary, the extracted nonlinear parameters from NRUS and DAET are consistent and orders of magnitude greater for the damaged specimen than those for the intact one. This compiled set of linear and nonlinear ultrasonic testing data including the most advanced technique (DAET) provides a benchmark comparison for their use in the field of material characterization. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Non_standard Wood

    DEFF Research Database (Denmark)

    Tamke, Martin

    Using parametric design tools and computer-controlled production facilities, Copenhagen's Centre for IT and Architecture undertook practice-based research into performance-based non-standard element design and mass customization techniques. In close cooperation with wood construction software......, but the integration of traditional wood craft techniques. The extensive use of self-adjusting, load-bearing wood-wood joints contributed to ease in production and assembly of a performance-based architecture....

  1. Arthroscopic Latarjet Techniques: Graft and Fixation Positioning Assessed With 2-Dimensional Computed Tomography Is Not Equivalent With Standard Open Technique.

    Science.gov (United States)

    Neyton, Lionel; Barth, Johannes; Nourissat, Geoffroy; Métais, Pierre; Boileau, Pascal; Walch, Gilles; Lafosse, Laurent

    2018-05-19

    To analyze graft and fixation (screw and EndoButton) positioning after the arthroscopic Latarjet technique with 2-dimensional computed tomography (CT) and to compare it with the open technique. We performed a retrospective multicenter study (March 2013 to June 2014). The inclusion criteria included patients with recurrent anterior instability treated with the Latarjet procedure. The exclusion criterion was the absence of a postoperative CT scan. The positions of the hardware, the positions of the grafts in the axial and sagittal planes, and the dispersion of values (variability) were compared. The study included 208 patients (79 treated with open technique, 87 treated with arthroscopic Latarjet technique with screw fixation [arthro-screw], and 42 treated with arthroscopic Latarjet technique with EndoButton fixation [arthro-EndoButton]). The angulation of the screws was different in the open group versus the arthro-screw group (superior, 10.3° ± 0.7° vs 16.9° ± 1.0° [P open inferior screws (P = .003). In the axial plane (level of equator), the arthroscopic techniques resulted in lateral positions (arthro-screw, 1.5 ± 0.3 mm lateral [P open technique (0.9 ± 0.2 mm medial). At the level of 25% of the glenoid height, the arthroscopic techniques resulted in lateral positions (arthro-screw, 0.3 ± 0.3 mm lateral [P open technique (1.0 ± 0.2 mm medial). Higher variability was observed in the arthro-screw group. In the sagittal plane, the arthro-screw technique resulted in higher positions (55% ± 3% of graft below equator) and the arthro-EndoButton technique resulted in lower positions (82% ± 3%, P open technique (71% ± 2%). Variability was not different. This study shows that the position of the fixation devices and position of the bone graft with the arthroscopic techniques are statistically significantly different from those with the open technique with 2-dimensional CT assessment. In the sagittal plane, the arthro-screw technique provides the highest

  2. Machine-learning techniques for family demography: an application of random forests to the analysis of divorce determinants in Germany

    OpenAIRE

    Arpino, Bruno; Le Moglie, Marco; Mencarini, Letizia

    2018-01-01

    Demographers often analyze the determinants of life-course events with parametric regression-type approaches. Here, we present a class of nonparametric approaches, broadly defined as machine learning (ML) techniques, and discuss advantages and disadvantages of a popular type known as random forest. We argue that random forests can be useful either as a substitute, or a complement, to more standard parametric regression modeling. Our discussion of random forests is intuitive and...
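
    A minimal comparison of the two model classes on synthetic data might look like the sketch below (scikit-learn assumed); the generated features are only a stand-in for the German panel data analyzed in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for survey data on a binary life-course event
# (e.g. divorce yes/no).
X, y = make_classification(n_samples=2000, n_features=15, n_informative=6,
                           n_redundant=3, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=300, random_state=0),
}

for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name:20s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")

# Random forests also provide variable-importance scores, one way they
# can complement a parametric specification.
rf = models["random forest"].fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("top-5 most important features:", top)
```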

  3. Evidentiary standards for forensic anthropology.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M

    2009-11-01

    As issues of professional standards and error rates continue to be addressed in the courts, forensic anthropologists should be proactive by developing and adhering to professional standards of best practice. There has been recent increased awareness and interest in critically assessing some of the techniques used by forensic anthropologists, but issues such as validation, error rates, and professional standards have seldom been addressed. Here we explore the legal impetus for this trend and identify areas where we can improve regarding these issues. We also discuss the recent formation of a Scientific Working Group for Forensic Anthropology (SWGANTH), which was created with the purposes of encouraging discourse among anthropologists and developing and disseminating consensus guidelines for the practice of forensic anthropology. We believe it is possible and advisable for anthropologists to seek and espouse research and methodological techniques that meet higher standards to ensure quality and consistency in our field.

  4. Experiments beyond the standard model

    International Nuclear Information System (INIS)

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology, I call these Experimental Needs. 92 references

  5. Guide to nondestructive assay standards: Preparation criteria, availability, and practical considerations

    International Nuclear Information System (INIS)

    Hsue, S.T.; Stewart, J.E.; Sampson, T.E.; Butler, G.W.; Rudy, C.R.; Rinard, P.M.

    1997-10-01

    For certification and measurement control, nondestructive assay (NDA) instruments and methods used for verification measurements of special nuclear materials (SNMs) require calibrations based on certified reference materials (CRMs), or working reference materials (WRMs), traceable to the national system of measurements, and adequately characteristic of the unknowns. The Department of Energy Office of Safeguards and Security is sponsoring production of a comprehensive guide to preparation of NDA standards. The scope of the report includes preparation criteria, current availability of CRMs and WRMs, practical considerations for preparation and characterization, and an extensive bibliography. In preparing the report, based primarily on experience at Los Alamos, they have found that standards preparation is highly dependent on the particular NDA method being applied. They therefore include sections that contain information specific to commonly used neutron and gamma-ray NDA techniques. They also present approaches that are alternatives to, or minimize requirements for physical standards

  6. Radiation hardening techniques for rare-earth based optical fibers and amplifiers

    International Nuclear Information System (INIS)

    Girard, Sylvain; Marcandella, Claude; Vivona, Marilena; Prudenzano, Luciano Mescia F.; Laurent, Arnaud; Robin, Thierry; Cadier, Benoit; Pinsard, Emmanuel; Ouerdane, Youcef; Boukenter, Aziz; Cannas, Marco; Boscaino, Roberto

    2012-01-01

    Er/Yb doped fibers and amplifiers have been shown to be very radiation sensitive, limiting their integration in space. We present an approach including successive hardening techniques to enhance their radiation tolerance. The efficiency of our approach is demonstrated by comparing the radiation responses of optical amplifiers made with the same lengths of different rare-earth doped fibers and exposed to gamma-rays. Previous studies indicated that such amplifiers suffered significant degradation for doses exceeding 10 krad. Applying our techniques significantly enhances the amplifier radiation resistance, resulting in a very limited degradation up to 50 krad. Our optimization techniques concern the fiber composition, possible pre-treatments, and the use of simulation tools to harden the amplifiers by design. We showed that adding cerium inside the fiber phospho-silicate-based core strongly decreases the fiber radiation sensitivity compared to the standard fiber. For both fibers, a pre-treatment with hydrogen further enhances the fiber resistance. Furthermore, simulation tools can also be used to improve the tolerance of the fiber amplifier by helping to identify the best amplifier configuration for operation in the radiative environment. (authors)

  7. Multimodal nonlinear microscopy: A powerful label-free method for supporting standard diagnostics on biological tissues

    Directory of Open Access Journals (Sweden)

    Riccardo Cicchi

    2014-09-01

    Full Text Available The large use of nonlinear laser scanning microscopy in the past decade paved the way for potential clinical application of this imaging technique. Modern nonlinear microscopy techniques offer promising label-free solutions to improve diagnostic performances on tissues. In particular, the combination of multiple nonlinear imaging techniques in the same microscope allows integrating morphological with functional information in a morpho-functional scheme. Such approach provides a high-resolution label-free alternative to both histological and immunohistochemical examination of tissues and is becoming increasingly popular among the clinical community. Nevertheless, several technical improvements, including automatic scanning and image analysis, are required before the technique represents a standard diagnostic method. In this review paper, we highlight the capabilities of multimodal nonlinear microscopy for tissue imaging, by providing various examples on colon, arterial and skin tissues. The comparison between images acquired using multimodal nonlinear microscopy and histology shows a good agreement between the two methods. The results demonstrate that multimodal nonlinear microscopy is a powerful label-free alternative to standard histopathological methods and has the potential to find a stable place in the clinical setting in the near future.

  8. Simultaneously Exploiting Two Formulations: an Exact Benders Decomposition Approach

    DEFF Research Database (Denmark)

    Lusby, Richard Martin; Gamst, Mette; Spoorendonk, Simon

    When modelling a given problem using linear programming techniques several possibilities often exist, and each results in a different mathematical formulation of the problem. Usually, advantages and disadvantages can be identified in any single formulation. In this paper we consider mixed integer...... to the standard branch-and-price approach from the literature, the method shows promising performance and appears to be an attractive alternative....

  9. Round robin analyses of hydrogen isotope thin films standards

    Energy Technology Data Exchange (ETDEWEB)

    Banks, J.C. E-mail: jcbanks@sandia.gov; Browning, J.F.; Wampler, W.R.; Doyle, B.L.; LaDuca, C.A.; Tesmer, J.R.; Wetteland, C.J.; Wang, Y.Q

    2004-06-01

    Hydrogen isotope thin film standards have been manufactured at Sandia National Laboratories for use by the materials characterization community. Several considerations were taken into account during the manufacture of the ErHD standards, with accuracy and stability being the most important. The standards were fabricated by e-beam deposition of Er onto a Mo substrate and the film stoichiometrically loaded with hydrogen and deuterium. To determine the loading accuracy of the standards two random samples were measured by thermal desorption mass spectrometry and atomic absorption spectrometry techniques with a stated combined accuracy of ~1.6% (1σ). All the standards were then measured by high energy RBS/ERD and RBS/NRA with the accuracy of the techniques ~5% (1σ). The standards were then distributed to the IBA materials characterization community for analysis. This paper will discuss the suitability of the standards for use by the IBA community and compare measurement results to highlight the accuracy of the techniques used.

  10. DEVELOPMENT OF RAPID TECHNIQUE FOR DETERMINATION OF THE TOTAL MINERALIZATION OF NATURAL WATERS

    Directory of Open Access Journals (Sweden)

    T. A. Kuchmenko

    2015-01-01

    Full Text Available A new approach has been proposed for rapid and easy evaluation of an indicator of the quality and properties of natural water: the soluble salt content (mineralization). The quartz crystal microbalance method is employed by loading the mass-sensitive resonator electrode (BAW-type) with the investigated water. The degree of correlation between various indicators related to the content of salts and insoluble compounds and the level of mineralization obtained by the standard method (gravimetry) has been studied. A procedure for salt weighing by a single sensor under unilateral loading with a small sample of natural water has been developed. The optimal measurement conditions were established using a 2^3 design of experiment. The possibilities of quartz crystal microbalance for the determination of non-volatile compounds in water are described. The piezosensor was calibrated with a standard NaCl solution (c = 1.000 g/dm3) under the optimal experimental conditions. The adequacy and accuracy of the proposed technique were assessed through the correlation between the results of quartz crystal microbalance and conductometry. A correlation between the mineralization indicators established by quartz crystal microbalance and by gravimetry was found. An equation has been obtained that can be used to calculate the standard mineralization indicator from the results of quartz crystal microbalance using a single sensor. Approaches to enhance the analytical capabilities of the developed technique for waters with low and high mineralization are proposed. The metrological characteristics of quartz crystal microbalance of insoluble compounds in natural water are estimated. A new technique for determination of the mass concentration of the dry residue in water with a conductivity of 0.2 mS or above has been developed, which can be used for rapid analysis of water under non-laboratory conditions and in the laboratory for rapidly obtaining information about a sample.

  11. Weighted hybrid technique for recommender system

    Science.gov (United States)

    Suriati, S.; Dwiastuti, Meisyarah; Tulus, T.

    2017-12-01

    Recommender systems have become very popular and play an important role in information systems and webpages nowadays. A recommender system tries to predict which item a user may like based on his activity on the system. There are some familiar techniques to build a recommender system, such as content-based filtering and collaborative filtering. Content-based filtering does not involve human opinions to make the prediction, while collaborative filtering does, so collaborative filtering can predict more accurately. However, collaborative filtering cannot give predictions for items which have never been rated by any user. In order to cover the drawbacks of each approach with the advantages of the other, both approaches can be combined into what is known as a hybrid technique. The hybrid technique used in this work is the weighted technique, in which the prediction score is a linear combination of the scores gained by the combined techniques. The purpose of this work is to show how a weighted hybrid technique combining content-based filtering and item-based collaborative filtering can work in a movie recommender system, and to compare the performance when both approaches are combined and when each approach works alone. Three experiments were done in this work, combining both techniques with different parameters. The results show that the weighted hybrid technique does not substantially boost performance, but it helps to give prediction scores for unrated movies that cannot be recommended by collaborative filtering alone.
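
    The weighted scheme reduces to a linear blend of the two predictors, with a content-based fallback for cold-start items, roughly as sketched below; the weight and example scores are illustrative, not the paper's tuned values.

```python
def weighted_hybrid_score(content_score, collab_score, w=0.5):
    """Weighted hybrid: a linear combination of the two predictors.
    If collaborative filtering cannot score the item (no ratings yet),
    fall back entirely on the content-based score."""
    if collab_score is None:           # cold-start item
        return content_score
    return w * content_score + (1.0 - w) * collab_score

# Hypothetical predicted ratings (1-5 scale) for one user.
predictions = {
    "movie_A": (4.2, 3.6),    # (content-based, item-based CF)
    "movie_B": (3.1, 4.4),
    "movie_C": (4.8, None),   # never rated by anyone -> CF unavailable
}

for movie, (cb, cf) in predictions.items():
    print(movie, round(weighted_hybrid_score(cb, cf, w=0.4), 2))
```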

  12. The financial techniques for the determination of fair value, their application in the financial statements and the possible repercussions for the companies

    Directory of Open Access Journals (Sweden)

    Blanca Iris Vega Castro

    2014-10-01

    Full Text Available This paper examines recent international accounting standards, issued by the IASB and FASB, which refer to the measurement and disclosure of fair value. We identify measurement techniques and approaches applied by companies in different countries.We also analyze the application of measurement criteria and approaches recently suggested for determining fair value in financial statements and the possible implications for businesses.

  13. The financial techniques for the determination of fair value, their application in the financial statements and the possible repercussions for the companies

    OpenAIRE

    Blanca Iris Vega Castro; Pedro González Cerrud

    2014-01-01

    This paper examines recent international accounting standards, issued by the IASB and FASB, which refer to the measurement and disclosure of fair value. We identify measurement techniques and approaches applied by companies in different countries. We also analyze the application of measurement criteria and approaches recently suggested for determining fair value in financial statements and the possible implications for businesses.

  14. Estimation of direction of arrival of a moving target using subspace based approaches

    Science.gov (United States)

    Ghosh, Ripul; Das, Utpal; Akula, Aparna; Kumar, Satish; Sardana, H. K.

    2016-05-01

    In this work, array processing techniques based on subspace decomposition of the signal have been evaluated for estimation of the direction of arrival of moving targets using acoustic signatures. Three subspace-based approaches - Incoherent Wideband Multiple Signal Classification (IWM), Least Square-Estimation of Signal Parameters via Rotation Invariance Techniques (LS-ESPRIT) and Total Least Square-ESPRIT (TLS-ESPRIT) - are considered. Their performance is compared with conventional time delay estimation (TDE) approaches such as Generalized Cross Correlation (GCC) and Average Square Difference Function (ASDF). Performance evaluation has been conducted on experimentally generated data consisting of acoustic signatures of four different types of civilian vehicles moving in defined geometrical trajectories. Mean absolute error and standard deviation of the DOA estimates w.r.t. ground truth are used as performance evaluation metrics. Lower statistical values of mean error confirm the superiority of subspace based approaches over TDE based techniques. Amongst the compared methods, LS-ESPRIT indicated better performance.
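
    To illustrate the subspace idea, the sketch below implements basic narrowband MUSIC for a uniform linear array on simulated data; the wideband (IWM) and ESPRIT variants used in the study build on the same signal/noise-subspace decomposition. The array geometry, source angle and noise level are assumptions for illustration only.

```python
import numpy as np

def music_spectrum(X, n_sources, d=0.5, angles=np.linspace(-90, 90, 361)):
    """Narrowband MUSIC pseudo-spectrum for a uniform linear array.
    X: (n_sensors, n_snapshots) complex baseband data.
    d: element spacing in wavelengths."""
    R = X @ X.conj().T / X.shape[1]                    # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)               # ascending eigenvalues
    En = eigvecs[:, : X.shape[0] - n_sources]          # noise subspace
    n = np.arange(X.shape[0])[:, None]
    spectrum = []
    for theta in np.deg2rad(angles):
        a = np.exp(2j * np.pi * d * n * np.sin(theta)) # steering vector
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return angles, np.array(spectrum)

# Simulate one source at 20 degrees received by an 8-element array.
rng = np.random.default_rng(0)
n_sensors, n_snap, theta0 = 8, 200, np.deg2rad(20.0)
a0 = np.exp(2j * np.pi * 0.5 * np.arange(n_sensors) * np.sin(theta0))
s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
noise = 0.1 * (rng.standard_normal((n_sensors, n_snap))
               + 1j * rng.standard_normal((n_sensors, n_snap)))
X = np.outer(a0, s) + noise

angles, P = music_spectrum(X, n_sources=1)
print("estimated DOA:", angles[np.argmax(P)], "degrees")
```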

  15. Sonographically guided posteromedial approach for intra-articular knee injections: a safe, accurate, and efficient method.

    Science.gov (United States)

    Tresley, Jonathan; Jose, Jean

    2015-04-01

    Osteoarthritis of the knee can be a debilitating and extremely painful condition. In patients who desire to postpone knee arthroplasty or in those who are not surgical candidates, percutaneous knee injection therapies have the potential to reduce pain and swelling, maintain joint mobility, and minimize disability. Published studies cite poor accuracy of intra-articular knee joint injections without imaging guidance. We present a sonographically guided posteromedial approach to intra-articular knee joint injections with 100% accuracy and no complications in a consecutive series of 67 patients undergoing subsequent computed tomographic or magnetic resonance arthrography. Although many other standard approaches are available, a posteromedial intra-articular technique is particularly useful in patients with a large body habitus and theoretically allows for simultaneous aspiration of Baker cysts with a single sterile preparation and without changing the patient's position. The posteromedial technique described in this paper is not compared or deemed superior to other standard approaches but, rather, is presented as a potentially safe and efficient alternative. © 2015 by the American Institute of Ultrasound in Medicine.

  16. Catheter Closure Through a Venous Approach of Patent Ductus Arteriosus in Small Pediatric Patients Using Combined Angiographic and Echocardiographic Guidance.

    Science.gov (United States)

    Thanopoulos, Basil Vasilios D; Ninios, Vlassis; Dardas, Petros; Giannopoulos, Andreas; Deleanou, Dan; Iancovici, Silvia

    2016-11-15

    The standard technique of catheter closure of patent ductus arteriosus (PDA) may be associated with arterial complications particularly in small pediatric patients. The aim of this study was to evaluate whether catheter closure of PDA in small children using an exclusive venous approach is a safe and effective alternative to closure with the standard technique. One hundred-twelve patients, aged 2 to 24 months, were randomly assigned in a 1:1 ratio to catheter closure of PDA using the standard technique (group 1) and an exclusive venous approach (group 2), respectively. In group 2, the procedure was guided using hand injections of contrast media through the delivery sheath and 2-dimensional and color Doppler echocardiography. Group 1: the PDA diameter ranged from 2 to 5.5 mm and the device diameter ranged from 4 to 8 mm. The PDA occluders were permanently implanted in all patients. Five losses of the arterial pulses that were restored with intravenous infusion of heparin and recombinant tissue plasminogen activator (rtPA), and 4 groin hematomas were the main complications of the procedure. Group 2: the mean PDA diameter ranged from 2.5 to 6 mm and the device diameter ranged from 3 to 8 mm. The PDA occluders were permanently implanted in all but 2 patients. There were no complications. Complete echocardiographic closure of PDA at 1-month follow-up was observed in all 110 patients. Exclusive transvenous PDA occlusion is an effective and safe technique that prevents the arterial complications of the standard approach in small children. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Full Text Available Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique consisted of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
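
    The record's SL method couples a kernel mapping with a perceptron; as a loose, hypothetical stand-in (not the authors' code), the sketch below contrasts a plain logistic regression with an RBF-kernel support vector machine on a simulated case-control set whose risk depends on an exposure interaction, and reads odds ratios off the LR coefficients. All simulation parameters are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 2))                                        # two simulated exposures
risk = 0.4 * X[:, 0] - 0.3 * X[:, 1] + 1.2 * X[:, 0] * X[:, 1]     # nonlinear (interaction) effect
y = rng.binomial(1, 1 / (1 + np.exp(-risk)))                       # case/control labels

lr = LogisticRegression()                                          # parametric, main effects only
svm = SVC(kernel="rbf", gamma="scale")                             # kernel method captures the interaction

print("LR  accuracy:", cross_val_score(lr, X, y, cv=5).mean())
print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())

lr.fit(X, y)
print("LR odds ratios (per 1-unit increase):", np.exp(lr.coef_).round(2))
```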

  18. The need of standardization and the potential role of voluntary approaches: Issues and trends in Italian GCHP market

    OpenAIRE

    Francesco Rizzi

    2009-01-01

    Despite the lack of specific incentives, Ground Coupled Heat Pumps (GCHP) installations are booming in Italy both in private and public sectors of the market. Such rapid growth entails an increasing concern for environmental and technical performances since no comprehensive regulation and reliable standards exist yet. By means of an investigation of sectoral opinion leaders and SWOT-based technique for building scenarios, this paper discusses potential schemes for balancing mandatory and volu...

  19. Novel Techniques for Dialectal Arabic Speech Recognition

    CERN Document Server

    Elmahdy, Mohamed; Minker, Wolfgang

    2012-01-01

    Novel Techniques for Dialectal Arabic Speech Recognition describes approaches to improve automatic speech recognition for dialectal Arabic. Since speech resources for dialectal Arabic speech recognition are very sparse, the authors describe how existing Modern Standard Arabic (MSA) speech data can be applied to dialectal Arabic speech recognition, while assuming that MSA is always a second language for all Arabic speakers. In this book, Egyptian Colloquial Arabic (ECA) has been chosen as a typical Arabic dialect. ECA is the first ranked Arabic dialect in terms of number of speakers, and a high quality ECA speech corpus with accurate phonetic transcription has been collected. MSA acoustic models were trained using news broadcast speech. In order to cross-lingually use MSA in dialectal Arabic speech recognition, the authors have normalized the phoneme sets for MSA and ECA. After this normalization, they have applied state-of-the-art acoustic model adaptation techniques like Maximum Likelihood Linear Regression (MLLR) and M...

  20. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  1. Evidence based herbal drug standardization approach in coping with challenges of holistic management of diabetes: a dreadful lifestyle disorder of 21st century.

    Science.gov (United States)

    Chawla, Raman; Thakur, Pallavi; Chowdhry, Ayush; Jaiswal, Sarita; Sharma, Anamika; Goel, Rajeev; Sharma, Jyoti; Priyadarshi, Smruti Sagar; Kumar, Vinod; Sharma, Rakesh Kumar; Arora, Rajesh

    2013-07-04

    Plants, by virtue of containing multiple constituents developed during growth under various environmental stresses, provide a plethora of chemical families with medicinal utility. Researchers are exploring this wealth and trying to decode its utility for enhancing the health standards of human beings. Diabetes is a dreadful lifestyle disorder of the 21st century caused by a lack of insulin production or physiological unresponsiveness to insulin. The chronic impact of untreated diabetes significantly affects vital organs. Allopathic medicine offers five classes of drugs, or otherwise insulin in Type I diabetes, targeting insulin secretion, decreasing the effect of glucagon, sensitizing receptors for enhanced glucose uptake, etc. In addition, diet management, increased dietary fiber intake, resistant starch intake and routine exercise aid in managing this dangerous metabolic disorder. One of the key factors that limits the commercial utility of herbal drugs is standardization. Standardization poses numerous challenges related to marker identification, active principle(s), lack of defined regulations, non-availability of universally acceptable technical standards for testing, and implementation of quality control/safety standards (toxicological testing). The present study proposes an integrated herbal drug development and standardization model which is an amalgamation of the Classical Approach of Ayurvedic Therapeutics; the Reverse Pharmacological Approach based on Observational Therapeutics; Technical Standards for the complete product cycle; Chemi-informatics, Herbal Qualitative Structure Activity Relationship and Pharmacophore modeling; and Post-Launch Market Analysis. Further studies are warranted to ensure that an effective herbal drug standardization methodology is developed, backed by a regulatory standard, to guide future research endeavors in a more focused manner.

  2. The comparison between limited open carpal tunnel release using direct vision and tunneling technique and standard open carpal tunnel release: a randomized controlled trial study.

    Science.gov (United States)

    Suppaphol, Sorasak; Worathanarat, Patarawan; Kawinwongkovit, Viroj; Pittayawutwinit, Preecha

    2012-04-01

    To compare the operative outcomes of carpal tunnel release between limited open carpal tunnel release using direct vision and a tunneling technique (group A) and standard open carpal tunnel release (group B). Twenty-eight patients were enrolled in the present study. A single-blind randomized controlled trial was conducted to compare the postoperative results between groups A and B. The study parameters were Levine's symptom severity and functional scores, grip and pinch strength, and average two-point discrimination. The postoperative results of the two groups were comparable, with no statistically significant differences. Only grip strength at three months' follow-up was significantly greater in group A than in group B. The limited open carpal tunnel release in the present study is comparably effective to the standard open carpal tunnel release. The other advantages of this technique are better cosmesis and improved grip strength in the three-month postoperative period.

  3. Constraint Embedding Technique for Multibody System Dynamics

    Science.gov (United States)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    Multibody dynamics play a critical role in simulation testbeds for space missions. There has been considerable interest in the development of efficient computational algorithms for solving the dynamics of multibody systems. Mass matrix factorization and inversion techniques and the O(N) class of forward dynamics algorithms developed using a spatial operator algebra stand out as important breakthroughs on this front. Techniques such as these provide the efficient algorithms and methods for the application and implementation of such multibody dynamics models. However, these methods are limited only to tree-topology multibody systems. Closed-chain topology systems require different techniques that are not as efficient or as broad as those for tree-topology systems. The closed-chain forward dynamics approach consists of treating the closed-chain topology as a tree-topology system subject to additional closure constraints. The resulting forward dynamics solution consists of: (a) ignoring the closure constraints and using the O(N) algorithm to solve for the free unconstrained accelerations for the system; (b) using the tree-topology solution to compute a correction force to enforce the closure constraints; and (c) correcting the unconstrained accelerations with correction accelerations resulting from the correction forces. This constraint-embedding technique shows how to use direct embedding to eliminate local closure-loops in the system and effectively convert the system back to a tree-topology system. At this point, standard tree-topology techniques can be brought to bear on the problem. The approach uses spatial operator algebra to formulate the equations of motion. The operators are block-partitioned around the local body subgroups to convert them into aggregate bodies. Mass matrix operator factorization and inversion techniques are applied to the reformulated tree-topology system. Thus in essence, the new technique allows conversion of a system with
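
    The three-step correction described above can be written compactly with dense linear algebra; the sketch below is a generic Lagrange-multiplier version, assuming closure constraints of the form A·q̈ = b, and is only a stand-in for the O(N) spatial-operator recursions the abstract refers to. The demo matrices are arbitrary.

```python
import numpy as np

def closed_chain_accelerations(M, tau, A, b):
    """Closure-constraint correction in three steps (dense stand-in for the
    O(N) tree-topology solver described in the abstract).
    M: generalized mass matrix, tau: applied/bias generalized forces,
    A, b: closure constraints expressed as A @ qdd = b."""
    # (a) free accelerations, closure constraints ignored
    qdd_free = np.linalg.solve(M, tau)
    # (b) correction forces (Lagrange multipliers) that enforce A @ qdd = b
    Minv_At = np.linalg.solve(M, A.T)
    lam = np.linalg.solve(A @ Minv_At, b - A @ qdd_free)
    # (c) corrected accelerations
    return qdd_free + Minv_At @ lam

# toy 3-DOF example: force the first two coordinates to accelerate together
M = np.diag([2.0, 1.0, 1.0])
tau = np.array([0.0, -9.81, 1.0])
A = np.array([[1.0, -1.0, 0.0]])
b = np.array([0.0])
print(closed_chain_accelerations(M, tau, A, b))
```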

  4. "Heidelberg standard examination" and "Heidelberg standard procedures" - Development of faculty-wide standards for physical examination techniques and clinical procedures in undergraduate medical education.

    Science.gov (United States)

    Nikendei, C; Ganschow, P; Groener, J B; Huwendiek, S; Köchel, A; Köhl-Hackert, N; Pjontek, R; Rodrian, J; Scheibe, F; Stadler, A-K; Steiner, T; Stiepak, J; Tabatabai, J; Utz, A; Kadmon, M

    2016-01-01

    The competent physical examination of patients and the safe and professional implementation of clinical procedures constitute essential components of medical practice in nearly all areas of medicine. The central objective of the projects "Heidelberg standard examination" and "Heidelberg standard procedures", which were initiated by students, was to establish uniform interdisciplinary standards for physical examination and clinical procedures, and to distribute them in coordination with all clinical disciplines at the Heidelberg University Hospital. The presented project report illuminates the background of the initiative and its methodological implementation. Moreover, it describes the multimedia documentation in the form of pocketbooks and a multimedia internet-based platform, as well as the integration into the curriculum. The project presentation aims to provide orientation and action guidelines to facilitate similar processes in other faculties.

  5. An Approach to Establishing International Quality Standards for Medical Travel

    Directory of Open Access Journals (Sweden)

    Ondřej eKácha

    2016-03-01

    Full Text Available Traveling abroad to receive non-elective treatment is growing each year. The rising popularity of medical travel and the absence of clear minimum quality requirements in this area urgently call for setting international standards to ensure good practice and patient safety. The aim of this study is to identify the key domains in medical travel where such quality standards should be established. Drawing from the evidence-based OECD framework and an extensive literature review, this study proposes three critical areas for international quality standards in medical travel: minimum standards for health care facilities and third-party agencies, financial responsibility, and patient-centeredness. Several cultural challenges are subsequently introduced that may pose a barrier to the development of the guidelines and should additionally be taken into consideration. Establishing international quality standards in medical travel enhances the benefits to patients and providers, which is urgently needed given the rapid growth of this industry.

  6. A Maximum Entropy Approach to Loss Distribution Analysis

    Directory of Open Access Journals (Sweden)

    Marco Bee

    2013-03-01

    Full Text Available In this paper we propose an approach to the estimation and simulation of loss distributions based on Maximum Entropy (ME), a non-parametric technique that maximizes the Shannon entropy of the data under moment constraints. Special cases of the ME density correspond to standard distributions; therefore, this methodology is very general as it nests most classical parametric approaches. Sampling the ME distribution is essential in many contexts, such as loss models constructed via compound distributions. Given the difficulties in carrying out exact simulation, we propose an innovative algorithm, obtained by means of an extension of Adaptive Importance Sampling (AIS), for the approximate simulation of the ME distribution. Several numerical experiments confirm that the AIS-based simulation technique works well, and an application to insurance data gives further insights in the usefulness of the method for modelling, estimating and simulating loss distributions.
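
    As a hedged illustration of the moment-constrained formulation, the sketch below fits an exponential-family ME density on a bounded support by minimizing the convex dual, log Z(λ) − Σ λ_j m_j. The support, the single moment constraint, and the choice of optimizer are arbitrary example settings, and the AIS sampler proposed in the paper is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import quad

def fit_maxent(moments, support=(0.0, 10.0)):
    """Maximum-entropy density p(x) proportional to exp(sum_j lam_j * x**(j+1))
    on a bounded support, subject to the given raw-moment constraints."""
    k = len(moments)

    def log_z(lam):
        z, _ = quad(lambda x: np.exp(sum(l * x**(j + 1) for j, l in enumerate(lam))), *support)
        return np.log(z)

    def dual(lam):
        # minimizing the dual matches the model moments to the target moments
        return log_z(lam) - sum(l * m for l, m in zip(lam, moments))

    lam = minimize(dual, x0=np.zeros(k), method="Nelder-Mead").x
    z = np.exp(log_z(lam))
    return lambda x: np.exp(sum(l * x**(j + 1) for j, l in enumerate(lam))) / z

# example: match a unit mean on [0, 10] -> close to a truncated exponential
p = fit_maxent([1.0])
print(quad(lambda x: x * p(x), 0, 10)[0])   # approximately 1.0
```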

  7. Towards representing human behavior and decision making in Earth system models - an overview of techniques and approaches

    Science.gov (United States)

    Müller-Hansen, Finn; Schlüter, Maja; Mäs, Michael; Donges, Jonathan F.; Kolb, Jakob J.; Thonicke, Kirsten; Heitzig, Jobst

    2017-11-01

    Today, humans have a critical impact on the Earth system and vice versa, which can generate complex feedback processes between social and ecological dynamics. Integrating human behavior into formal Earth system models (ESMs), however, requires crucial modeling assumptions about actors and their goals, behavioral options, and decision rules, as well as modeling decisions regarding human social interactions and the aggregation of individuals' behavior. Here, we review existing modeling approaches and techniques from various disciplines and schools of thought dealing with human behavior at different levels of decision making. We demonstrate modelers' often vast degrees of freedom but also seek to make modelers aware of the often crucial consequences of seemingly innocent modeling assumptions. After discussing which socioeconomic units are potentially important for ESMs, we compare models of individual decision making that correspond to alternative behavioral theories and that make diverse modeling assumptions about individuals' preferences, beliefs, decision rules, and foresight. We review approaches to model social interaction, covering game theoretic frameworks, models of social influence, and network models. Finally, we discuss approaches to studying how the behavior of individuals, groups, and organizations can aggregate to complex collective phenomena, discussing agent-based, statistical, and representative-agent modeling and economic macro-dynamics. We illustrate the main ingredients of modeling techniques with examples from land-use dynamics as one of the main drivers of environmental change bridging local to global scales.
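
    Among the social-influence models the review surveys, one of the simplest baselines is repeated neighbourhood averaging (a DeGroot-type model). The sketch below is purely illustrative; the random network, the weights, and the number of steps are invented, and it is not drawn from the article itself.

```python
import numpy as np

def degroot_update(opinions, W, steps=50):
    """DeGroot social-influence model: each agent repeatedly replaces its opinion
    with a weighted average of its network neighbours' opinions (W is row-stochastic)."""
    x = opinions.copy()
    for _ in range(steps):
        x = W @ x
    return x

rng = np.random.default_rng(1)
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)     # random directed influence network
np.fill_diagonal(A, 1.0)                         # agents also weigh their own opinion
W = A / A.sum(axis=1, keepdims=True)             # normalize rows to sum to one
print(degroot_update(rng.random(n), W).round(2))
```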

  8. Laparoscopic colorectal surgery in learning curve: Role of implementation of a standardized technique and recovery protocol. A cohort study

    Science.gov (United States)

    Luglio, Gaetano; De Palma, Giovanni Domenico; Tarquini, Rachele; Giglio, Mariano Cesare; Sollazzo, Viviana; Esposito, Emanuela; Spadarella, Emanuela; Peltrini, Roberto; Liccardo, Filomena; Bucci, Luigi

    2015-01-01

    Background Despite the proven benefits, laparoscopic colorectal surgery is still underutilized among surgeons. A steep learning curve is one of the causes of its limited adoption. The aim of the study is to determine the feasibility and morbidity rate after laparoscopic colorectal surgery in a single-institution, “learning curve” experience, implementing a well-standardized operative technique and recovery protocol. Methods The first 50 patients treated laparoscopically were included. All the procedures were performed by a trainee surgeon, supervised by a consultant surgeon, according to the principle of complete mesocolic excision with central vascular ligation or TME. Patients underwent a fast-track recovery programme. Recovery parameters, short-term outcomes, morbidity and mortality have been assessed. Results Type of resections: 20 left side resections, 8 right side resections, 14 low anterior resection/TME, 5 total colectomy and IRA, 3 total panproctocolectomy and pouch. Mean operative time: 227 min; mean number of lymph nodes: 18.7. Conversion rate: 8%. Mean time to flatus: 1.3 days; mean time to solid stool: 2.3 days. Mean length of hospital stay: 7.2 days. Overall morbidity: 24%; major morbidity (Dindo–Clavien III): 4%. No anastomotic leak, no mortality, no 30-day readmissions. Conclusion Properly performed laparoscopic colorectal surgery is safe and leads to excellent results in terms of recovery and short-term outcomes, even in a learning curve setting. Key factors for better outcomes and for shortening the learning curve seem to be the adoption of a standardized technique and training model along with the strict supervision of an expert colorectal surgeon. PMID:25859386

  9. Novel Technique for Rebubbling DMEK Grafts at the Slit Lamp Using Intravenous Extension Tubing.

    Science.gov (United States)

    Sáles, Christopher S; Straiko, Michael D; Terry, Mark A

    2016-04-01

    To describe a novel technique for rebubbling DMEK grafts at the slit lamp using a cannula coupled to a syringe with intravenous (IV) extension tubing. We present a retrospective case series of eyes that underwent rebubbling using a novel technique at the slit lamp. The rebubbling apparatus is assembled using a standard 43-inch IV extension tube, a 5-cc luer lock syringe, and a 27-gauge cannula. The cannula is screwed onto one end of the extension tubing, and a 5-cc syringe that has been filled with air is screwed onto the opposite end. With the patient seated at the slit lamp, the cannula is positioned in the anterior chamber by the surgeon with one hand while the other hand operates the syringe and the joystick. We performed 5 rebubbling procedures at the slit lamp using a standard syringe and cannula. Despite suboptimal ergonomics with this approach, all of these cases achieved sufficient air fills without any complications. Four rebubbling procedures were subsequently performed at the slit lamp using our novel rebubbling technique. All of these cases also attained sufficient air fills without complications, but they were noted to be much easier to perform by the surgeon. Using IV extension tubing to couple a syringe to a cannula for rebubbling DMEK grafts at the slit lamp is ergonomically superior to the conventional alternative of using a standard cannula on a syringe. The technique is also simple and inexpensive to adopt.

  10. Actual survey of dose evaluation method for standardization of radiation therapy techniques. With special reference to display method of radiation doses

    International Nuclear Information System (INIS)

    Kumagai, Kozo; Yoshiura, Takao; Izumi, Takashi; Araki, Fujio; Takada, Takuo; Jingu, Kenichi.

    1994-01-01

    This report presents the results of a questionnaire survey on the actual conditions of radiation therapy, which was conducted with the aim of establishing the standardization of radiation therapy techniques. Questionnaires were sent to 100 facilities in Japan, and 86 of these answered, consisting of 62 university hospitals, 2 national hospitals, 14 cancer centers, 4 prefectural or municipal hospitals, and 4 other hospitals. In addition to electron beam therapy, the following typical diseases for radiation therapy were selected as standard irradiation models: cancers of the larynx, esophagus, breast, and uterine cervix, and malignant lymphomas. According to these models, questionnaire results are analyzed in terms of the following four items: (1) irradiation procedures, (2) energy used for radiotherapy, (3) the depth for calculating target absorption doses, and (4) points for displaying target absorption doses. (N.K.)

  11. Pulse radiolysis - new approaches to the classical technique

    Energy Technology Data Exchange (ETDEWEB)

    Zagorski, Z P [Institute of Nuclear Research, Warsaw (Poland)

    1973-01-01

    The present status of classical pulse radiolysis is described, as well as trends in the further development of this technique (the investigation of radiolysis with nanosecond and picosecond time resolution, and new optical and electrochemical methods of detecting intermediate species). The attention is concentrated on the experimental difficulties of particular versions, and the achievements are reviewed critically. This paper is the background for experiments being performed in the Institute of Nuclear Research on new techniques of pulse radiolysis.

  12. Force scanning: a rapid, high-resolution approach for spatial mechanical property mapping

    International Nuclear Information System (INIS)

    Darling, E M

    2011-01-01

    Atomic force microscopy (AFM) can be used to co-localize mechanical properties and topographical features through property mapping techniques. The most common approach for testing biological materials at the microscale and nanoscale is force mapping, which involves taking individual force curves at discrete sites across a region of interest. The limitations of force mapping include long testing times and low resolution. While newer AFM methodologies, like modulated scanning and torsional oscillation, circumvent this problem, their adoption for biological materials has been limited. This could be due to their need for specialized software algorithms and/or hardware. The objective of this study is to develop a novel force scanning technique using AFM to rapidly capture high-resolution topographical images of soft biological materials while simultaneously quantifying their mechanical properties. Force scanning is a straightforward methodology applicable to a wide range of materials and testing environments, requiring no special modification to standard AFMs. Essentially, if a contact-mode image can be acquired, then force scanning can be used to produce a spatial modulus map. The current study first validates this technique using agarose gels, comparing results to ones achieved by the standard force mapping approach. Biologically relevant demonstrations are then presented for high-resolution modulus mapping of individual cells, cell-cell interfaces, and articular cartilage tissue.
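
    Whatever the acquisition mode, the modulus in such maps is typically obtained by fitting a contact-mechanics model to each force-indentation curve. The sketch below fits a Hertzian spherical-indenter model to synthetic data; the tip radius, Poisson ratio, noise level, and the choice of the Hertz model itself are assumptions for the example rather than details taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hertz_sphere(delta, E, R=2.5e-6, nu=0.5):
    """Hertzian contact force for a spherical tip:
    F = 4/3 * E / (1 - nu**2) * sqrt(R) * delta**1.5."""
    return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

# synthetic force-indentation curve (assumed 2.5 um tip, true modulus 20 kPa)
delta = np.linspace(0, 1e-6, 200)
force = hertz_sphere(delta, 20e3) + np.random.default_rng(2).normal(0, 5e-11, delta.size)

E_fit, _ = curve_fit(lambda d, E: hertz_sphere(d, E), delta, force, p0=[1e4])
print(f"fitted modulus: {E_fit[0] / 1e3:.1f} kPa")
```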

  13. Basic prediction techniques in modern video coding standards

    CERN Document Server

    Kim, Byung-Gyu

    2016-01-01

    This book discusses in detail the basic algorithms of video compression that are widely used in modern video codecs. The authors dissect complicated specifications and present the material in a way that gets readers quickly up to speed, describing video compression algorithms succinctly without going into the mathematical details and technical specifications. For accelerated learning, the hybrid codec structure and the inter- and intra-prediction techniques in MPEG-4, H.264/AVC, and HEVC are discussed together. In addition, the latest research in fast encoder design for HEVC and H.264/AVC is also included.

  14. Individualised perioperative open-lung approach versus standard protective ventilation in abdominal surgery (iPROVE): a randomised controlled trial.

    Science.gov (United States)

    Ferrando, Carlos; Soro, Marina; Unzueta, Carmen; Suarez-Sipmann, Fernando; Canet, Jaume; Librero, Julián; Pozo, Natividad; Peiró, Salvador; Llombart, Alicia; León, Irene; India, Inmaculada; Aldecoa, Cesar; Díaz-Cambronero, Oscar; Pestaña, David; Redondo, Francisco J; Garutti, Ignacio; Balust, Jaume; García, Jose I; Ibáñez, Maite; Granell, Manuel; Rodríguez, Aurelio; Gallego, Lucía; de la Matta, Manuel; Gonzalez, Rafael; Brunelli, Andrea; García, Javier; Rovira, Lucas; Barrios, Francisco; Torres, Vicente; Hernández, Samuel; Gracia, Estefanía; Giné, Marta; García, María; García, Nuria; Miguel, Lisset; Sánchez, Sergio; Piñeiro, Patricia; Pujol, Roger; García-Del-Valle, Santiago; Valdivia, José; Hernández, María J; Padrón, Oto; Colás, Ana; Puig, Jaume; Azparren, Gonzalo; Tusman, Gerardo; Villar, Jesús; Belda, Javier

    2018-03-01

    The effects of individualised perioperative lung-protective ventilation (based on the open-lung approach [OLA]) on postoperative complications are unknown. We aimed to investigate the effects of intraoperative and postoperative ventilatory management in patients scheduled for abdominal surgery, compared with standard protective ventilation. We did this prospective, multicentre, randomised controlled trial in 21 teaching hospitals in Spain. We enrolled patients who were aged 18 years or older, were scheduled to have abdominal surgery with an expected time of longer than 2 h, had an intermediate-to-high risk of developing postoperative pulmonary complications, and who had a body-mass index of less than 35 kg/m². Patients were randomly assigned (1:1:1:1) online to receive one of four lung-protective ventilation strategies using low tidal volume plus positive end-expiratory pressure (PEEP): open-lung approach (OLA)-iCPAP (individualised intraoperative ventilation [individualised PEEP after a lung recruitment manoeuvre] plus individualised postoperative continuous positive airway pressure [CPAP]), OLA-CPAP (intraoperative individualised ventilation plus postoperative CPAP), STD-CPAP (standard intraoperative ventilation plus postoperative CPAP), or STD-O₂ (standard intraoperative ventilation plus standard postoperative oxygen therapy). Patients were masked to treatment allocation. Investigators were not masked in the operating and postoperative rooms; after 24 h, data were given to a second investigator who was masked to allocations. The primary outcome was a composite of pulmonary and systemic complications during the first 7 postoperative days. We did the primary analysis using the modified intention-to-treat population. This trial is registered with ClinicalTrials.gov, number NCT02158923. Between Jan 2, 2015, and May 18, 2016, we enrolled 1012 eligible patients. Data were available for 967 patients, whom we included in the final analysis. Risk of pulmonary and systemic

  15. A predictive modeling approach to increasing the economic effectiveness of disease management programs.

    Science.gov (United States)

    Bayerstadler, Andreas; Benstetter, Franz; Heumann, Christian; Winter, Fabian

    2014-09-01

    Predictive Modeling (PM) techniques are gaining importance in the worldwide health insurance business. Modern PM methods are used for customer relationship management, risk evaluation or medical management. This article illustrates a PM approach that enables the economic potential of (cost-) effective disease management programs (DMPs) to be fully exploited by optimized candidate selection as an example of successful data-driven business management. The approach is based on a Generalized Linear Model (GLM) that is easy to apply for health insurance companies. By means of a small portfolio from an emerging country, we show that our GLM approach is stable compared to more sophisticated regression techniques in spite of the difficult data environment. Additionally, we demonstrate for this example of a setting that our model can compete with the expensive solutions offered by professional PM vendors and outperforms non-predictive standard approaches for DMP selection commonly used in the market.
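
    A minimal sketch of a GLM-based candidate ranking of the kind described, assuming a synthetic member portfolio and an invented binary "benefits from the DMP" outcome; the real model's covariates and response are not given in the record, so everything below is hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 5000
age = rng.normal(55, 10, n)
prior_cost = rng.gamma(2.0, 1500.0, n)              # last year's claims (synthetic)
chronic = rng.binomial(1, 0.3, n)                   # chronic-condition flag (synthetic)

# assumed outcome model, used only to generate illustrative labels
logit = -6 + 0.05 * age + 0.0003 * prior_cost + 1.0 * chronic
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, prior_cost, chronic]))
model = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(model.summary())

# rank members by predicted probability and select the top decile for the programme
scores = model.predict(X)
candidates = np.argsort(scores)[::-1][: n // 10]
print("members selected:", len(candidates))
```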

  16. High Classification Rates for Continuous Cow Activity Recognition using Low-cost GPS Positioning Sensors and Standard Machine Learning Techniques

    DEFF Research Database (Denmark)

    Godsk, Torben; Kjærgaard, Mikkel Baun

    2011-01-01

    activities. By preprocessing the raw cow position data, we obtain high classification rates using standard machine learning techniques to recognize cow activities. Our objectives were to (i) determine to what degree it is possible to robustly recognize cow activities from GPS positioning data, using low...... and their activities manually logged to serve as ground truth. For our dataset we managed to obtain an average classification success rate of 86.2% of the four activities: eating/seeking (90.0%), walking (100%), lying (76.5%), and standing (75.8%) by optimizing both the preprocessing of the raw GPS data...
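
    A rough sketch of the preprocess-then-classify pipeline described above, using invented random-walk trajectories in place of real GPS fixes and a two-class toy problem instead of the four activities studied; the features, window length, and classifier are arbitrary example choices.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def speed_features(positions, fs=1.0, win=60):
    """Turn raw position fixes (metres, one per second) into per-window speed statistics."""
    speed = np.hypot(*np.diff(positions, axis=0).T) * fs
    n = len(speed) // win
    s = speed[: n * win].reshape(n, win)
    return np.column_stack([s.mean(1), s.std(1), s.max(1)])

# synthetic stand-ins: slow random walk ("lying/standing") vs faster walk ("walking/eating")
rng = np.random.default_rng(7)
slow = np.cumsum(rng.normal(0, 0.02, (3600, 2)), axis=0)
fast = np.cumsum(rng.normal(0, 0.30, (3600, 2)), axis=0)
X = np.vstack([speed_features(slow), speed_features(fast)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) // 2))
print(cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean())
```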

  17. Statistical approach to Higgs boson couplings in the standard model effective field theory

    Science.gov (United States)

    Murphy, Christopher W.

    2018-01-01

    We perform a parameter fit in the standard model effective field theory (SMEFT) with an emphasis on using regularized linear regression to tackle the issue of the large number of parameters in the SMEFT. In regularized linear regression, a positive definite function of the parameters of interest is added to the usual cost function. A cross-validation is performed to try to determine the optimal value of the regularization parameter to use, but it selects the standard model (SM) as the best model to explain the measurements. Nevertheless, as a proof of principle of this technique, we apply it to fitting Higgs boson signal strengths in the SMEFT, including the latest Run-2 results. Results are presented in terms of the eigensystem of the covariance matrix of the least squares estimators, as it has a degree of model independence to it. We find several results in this initial work: the SMEFT predicts the total width of the Higgs boson to be consistent with the SM prediction; the ATLAS and CMS experiments at the LHC are currently sensitive to non-resonant double Higgs boson production. Constraints are derived on the viable parameter space for electroweak baryogenesis in the SMEFT, reinforcing the notion that a first-order phase transition requires fairly low-scale beyond-the-SM physics. Finally, we study which future experimental measurements would give the most improvement on the global constraints on the Higgs sector of the SMEFT.
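
    As a toy version of regularized linear regression with a cross-validated regularization strength (not the actual SMEFT fit), the sketch below assumes a made-up linearized map from Wilson coefficients to Higgs signal strengths and fits it with ridge regression; all dimensions and noise levels are invented.

```python
import numpy as np
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(4)
n_meas, n_ops = 30, 20                     # more coefficients than measurements is common
H = rng.normal(size=(n_meas, n_ops))       # assumed linearized dependence of signal strengths
c_true = np.zeros(n_ops)                   # the SM corresponds to all coefficients = 0
mu = 1.0 + H @ c_true + rng.normal(0, 0.1, n_meas)   # "measured" signal strengths

# cross-validation over a grid of regularization strengths
fit = RidgeCV(alphas=np.logspace(-3, 3, 25)).fit(H, mu - 1.0)
print("selected regularization:", fit.alpha_)
print("largest fitted coefficient:", np.abs(fit.coef_).max())   # stays near zero, i.e. SM-like
```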

  18. Cesarean sections, perfecting the technique and standardizing the practice: an analysis of the book Obstetrícia, by Jorge de Rezende.

    Science.gov (United States)

    Nakano, Andreza Rodrigues; Bonan, Claudia; Teixeira, Luiz Antônio

    2016-01-01

    This article discusses the development of techniques for cesarean sections by doctors in Brazil during the 20th century, by analyzing the chapter "Operação Cesárea" (Cesarean Section) in three editions of the textbook Obstetrícia, by Jorge de Rezende. His prominence as an author in obstetrics and his particular style of working created the groundwork for the normalization of the practice of cesarean sections. The networks of meaning practiced within this scientific community included a "provision for feeling and for action" (Fleck) which established the C-section as a "normal" delivery: showing standards that exclude the unpredictability, chaos, and dangers associated with the physiology of childbirth, and meeting the demand for control, discipline and safety, qualities associated with the practices, techniques and technologies of biomedicine.

  19. Standard molar enthalpies of formation of 2-, 3- and 4-cyanobenzoic acids

    International Nuclear Information System (INIS)

    Ribeiro da Silva, Manuel A.V.; Amaral, Luisa M.P.F.; Boaventura, Cristina R.P.; Gomes, Jose R.B.

    2008-01-01

    The standard (p° = 0.1 MPa) molar enthalpies of formation of 2-, 3- and 4-cyanobenzoic acids were derived from their standard molar energies of combustion, in oxygen, at T = 298.15 K, measured by static bomb combustion calorimetry. The Calvet high temperature vacuum sublimation technique was used to measure the enthalpies of sublimation of 2- and 3-cyanobenzoic acids. The standard molar enthalpies of formation of the three compounds, in the gaseous phase, at T = 298.15 K, have been derived from the corresponding standard molar enthalpies of formation in the condensed phase and standard molar enthalpies for phase transition. The results obtained are -(150.7 ± 2.0) kJ·mol⁻¹, -(153.6 ± 1.7) kJ·mol⁻¹ and -(157.1 ± 1.4) kJ·mol⁻¹ for 2-cyano, 3-cyano and 4-cyanobenzoic acids, respectively. Standard molar enthalpies of formation were also estimated by employing two different methodologies: one based on the Cox scheme and the other one based on several different computational approaches. The calculated values show a good agreement with the experimental values obtained in this work.
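
    The derivation of the gas-phase values is simple arithmetic: Δ_f H°(g) = Δ_f H°(cr) + Δ_sub H°, with the uncertainties combined in quadrature. A sketch with purely illustrative numbers (not the paper's measured values):

```python
from math import hypot

def gas_phase_formation_enthalpy(dfH_cr, u_cr, dsubH, u_sub):
    """Delta_f H(g) = Delta_f H(cr) + Delta_sub H; uncertainties combined in quadrature."""
    return dfH_cr + dsubH, hypot(u_cr, u_sub)

# hypothetical inputs in kJ/mol: condensed-phase formation enthalpy and sublimation enthalpy
value, unc = gas_phase_formation_enthalpy(-250.0, 1.5, 100.0, 1.2)
print(f"{value:.1f} ± {unc:.1f} kJ/mol")
```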

  20. A Multilingual Approach to Analysing Standardized Test Results: Immigrant Primary School Children and the Role of Languages Spoken in a Bi-/Multilingual Community

    Science.gov (United States)

    De Angelis, Gessica

    2014-01-01

    The present study adopts a multilingual approach to analysing the standardized test results of primary school immigrant children living in the bi-/multilingual context of South Tyrol, Italy. The standardized test results are from the Invalsi test administered across Italy in 2009/2010. In South Tyrol, several languages are spoken on a daily basis…

  1. Minimally Invasive Scoliosis Surgery: A Novel Technique in Patients with Neuromuscular Scoliosis

    Directory of Open Access Journals (Sweden)

    Vishal Sarwahi

    2015-01-01

    Full Text Available Minimally invasive surgery (MIS has been described in the treatment of adolescent idiopathic scoliosis (AIS and adult scoliosis. The advantages of this approach include less blood loss, shorter hospital stay, earlier mobilization, less tissue disruption, and relatively less pain. However, despite these significant benefits, MIS approach has not been reported in neuromuscular scoliosis patients. This is possibly due to concerns with longer surgery time, which is further increased due to more levels fused and instrumented, challenges of pelvic fixation, size and number of incisions, and prolonged anesthesia. We modified the MIS approach utilized in our AIS patients to be implemented in our neuromuscular patients. Our technique allows easy passage of contoured rods, placement of pedicle screws without image guidance, partial/complete facet resection, and all standard reduction maneuvers. Operative time needed to complete this surgery is comparable to the standard procedure and the majority of our patients have been extubated at the end of procedure, spending 1 day in the PICU and 5-6 days in the hospital. We feel that MIS is not only a feasible but also a superior option in patients with neuromuscular scoliosis. Long-term results are unavailable; however, short-term results have shown multiple benefits of this approach and fewer limitations.

  2. Structured and Sparse Canonical Correlation Analysis as a Brain-Wide Multi-Modal Data Fusion Approach.

    Science.gov (United States)

    Mohammadi-Nejad, Ali-Reza; Hossein-Zadeh, Gholam-Ali; Soltanian-Zadeh, Hamid

    2017-07-01

    Multi-modal data fusion has recently emerged as a comprehensive neuroimaging analysis approach, which usually uses canonical correlation analysis (CCA). However, the current CCA-based fusion approaches face problems like high-dimensionality, multi-collinearity, unimodal feature selection, asymmetry, and loss of spatial information in reshaping the imaging data into vectors. This paper proposes a structured and sparse CCA (ssCCA) technique as a novel CCA method to overcome the above problems. To investigate the performance of the proposed algorithm, we have compared three data fusion techniques: standard CCA, regularized CCA, and ssCCA, and evaluated their ability to detect multi-modal data associations. We have used simulations to compare the performance of these approaches and probe the effects of non-negativity constraint, the dimensionality of features, sample size, and noise power. The results demonstrate that ssCCA outperforms the existing standard and regularized CCA-based fusion approaches. We have also applied the methods to real functional magnetic resonance imaging (fMRI) and structural MRI data of Alzheimer's disease (AD) patients (n = 34) and healthy control (HC) subjects (n = 42) from the ADNI database. The results illustrate that the proposed unsupervised technique differentiates the transition pattern between the subject-course of AD patients and HC subjects with a p-value of less than 1×10⁻⁶. Furthermore, we have depicted the brain mapping of functional areas that are most correlated with the anatomical changes in AD patients relative to HC subjects.
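
    For orientation, the standard-CCA baseline the authors compare against can be run in a few lines; the simulated modalities, sample size, and dimensions below are invented, and the structured/sparse penalties that define ssCCA are not implemented here.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(5)
n = 76                                      # e.g. 34 patients + 42 controls, as in the abstract
latent = rng.normal(size=(n, 1))            # shared source linking the two modalities
X_fmri = latent @ rng.normal(size=(1, 50)) + 0.5 * rng.normal(size=(n, 50))
X_smri = latent @ rng.normal(size=(1, 40)) + 0.5 * rng.normal(size=(n, 40))

# standard CCA: find projections of each modality with maximal correlation
cca = CCA(n_components=1).fit(X_fmri, X_smri)
U, V = cca.transform(X_fmri, X_smri)
print("canonical correlation:", np.corrcoef(U[:, 0], V[:, 0])[0, 1])
```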

  3. Parametric studies on the harvested energy of piezoelectric switching techniques

    International Nuclear Information System (INIS)

    Neubauer, M; Krack, M; Wallaschek, J

    2010-01-01

    Piezoelectric energy harvesting techniques have experienced increasing research effort during the last few years. Possible applications, including wireless, fully autonomous electronic devices such as sensors, have attracted great interest. The key aspect of harvesting techniques is the amount of converted and stored energy, because the energy source and the conversion rate are limited. In particular, switching techniques offer many parameters that can be optimized. It is therefore crucial to examine the influence of these parameters in a precise manner. This paper addresses an accurate analytical modeling approach, facilitating the calculation of standard-DC and parallel SSHI-DC energy harvesting circuits. In particular, the influence of the frequency ratio between the excitation and the electrical resonance of the switching LR-branch, and of the voltage gaps across the rectifier diodes, is studied in detail. Additionally, a comparison with the SSDI damping network is performed. The relationship between energy harvesting and damping is indicated in this paper.

  4. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    Science.gov (United States)

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  5. Port-Access cardiac surgery: from a learning process to the standard.

    Science.gov (United States)

    Greco, Ernesto; Barriuso, Clemente; Castro, Miguel Angel; Fita, Guillermina; Pomar, José L

    2002-01-01

    Port-Access surgery has been one of the most innovative and controversial methods in the spectrum of minimally invasive techniques for cardiac operations and has been widely used for the treatment of several cardiac diseases. The technique was introduced in our center to evaluate its efficacy in reproducing standardized results without an additional risk. Endovascular cardiopulmonary bypass (CPB) through femoral access and endoluminal aortic occlusion were used in 129 patients for a variety of surgical procedures, all of which were video-assisted. A minimal (4-6 cm) anterior thoracotomy through the fourth intercostal space was used in all cases as the surgical approach. More than 96% of the planned cases concluded as true Port-Access procedures. Mean CPB and crossclamp times were 87.2 min. +/- 51.2 (range of 10-457) and 54.9 min. +/- 30.6 (range of 10-190), respectively. Hospital mortality for the overall group was 1.5%, and mitral valve surgery had a 2.2% hospital death rate. The incidence of early neurological events was 0.7%. Mean extubation time, ICU stay, and total length of hospital stay were 5 hours +/- 6 hrs. (range of 2-32), 12 hours +/- 11.8 hrs. (range of 5-78), and 7 days +/- 7.03 days (range of 1-72), respectively. Our experience indicates that the Port-Access technique is safe and permits reproduction of standardized results with the use of a very limited surgical approach. We are convinced that this is a superior procedure for certain types of surgery, including isolated primary or redo mitral surgery, repair of a variety of atrial septal defects (ASDs), and atrial tumors. It is especially useful in high-risk patients, such as elderly patients or those requiring reoperation. Simplification of the procedure is nevertheless desirable in order to further reduce the time of operation and to address other drawbacks.

  6. A standardized surgical technique for rat superior cervical ganglionectomy

    DEFF Research Database (Denmark)

    Savastano, Luis Emilio; Castro, Analía Elizabeth; Fitt, Marcos René

    2010-01-01

    Superior cervical ganglionectomy (SCGx) is a valuable microsurgical model to study the role of the sympathetic nervous system in a vast array of physiological and pathological processes, including homeostatic regulation, circadian biology and the dynamics of neuronal dysfunction and recovery afte...... expect that the following standardized and optimized protocol will allow researchers to organize knowledge into a cohesive framework in those areas where the SCGx is applied....

  7. An enhanced approach for biomedical image restoration using image fusion techniques

    Science.gov (United States)

    Karam, Ghada Sabah; Abbas, Fatma Ismail; Abood, Ziad M.; Kadhim, Kadhim K.; Karam, Nada S.

    2018-05-01

    Biomedical images are generally noisy and slightly blurred due to the physical mechanisms of the acquisition process, so common degradations in biomedical images are noise and poor contrast. The idea of biomedical image enhancement is to improve the quality of the image for early diagnosis. In this paper we use the wavelet transform to remove Gaussian noise from biomedical images: a Positron Emission Tomography (PET) image and a radiography (Radio) image, in different color spaces (RGB, HSV, YCbCr), and we perform the fusion of the denoised images resulting from the above denoising techniques using the add image method. Then some quantitative performance metrics, such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR), and mean square error (MSE), are computed, since these statistical measurements help in the assessment of fidelity and image quality. The results show that our approach can be applied to different color spaces for biomedical images.
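
    A hedged sketch of the pipeline (wavelet denoising, an averaging form of "add image" fusion, and PSNR scoring), assuming the PyWavelets package and synthetic stand-ins for the PET and radiography images; the wavelet, threshold, and noise level are arbitrary example choices, not the paper's settings.

```python
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db4", level=2, thresh=0.1):
    """Soft-threshold the detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, thresh, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

def fuse_add(img_a, img_b):
    """'Add image' fusion taken here as a pixel-wise average of the two denoised images."""
    return 0.5 * (img_a + img_b)

def psnr(ref, test):
    mse = np.mean((ref - test) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(ref.max() ** 2 / mse)

# synthetic stand-ins for the PET and radiography images
rng = np.random.default_rng(6)
clean = np.tile(np.linspace(0, 1, 128), (128, 1))
pet, radio = (clean + rng.normal(0, 0.1, clean.shape) for _ in range(2))
fused = fuse_add(wavelet_denoise(pet), wavelet_denoise(radio))
print(f"PSNR noisy: {psnr(clean, pet):.1f} dB, fused: {psnr(clean, fused[:128, :128]):.1f} dB")
```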

  8. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    Science.gov (United States)

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments of which the Chemical Analysis Working Group (CAWG) is a member of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  9. Investigating High Field Gravity using Astrophysical Techniques

    International Nuclear Information System (INIS)

    Bloom, Elliott D.

    2008-01-01

    The purpose of these lectures is to introduce particle physicists to astrophysical techniques. These techniques can help us understand certain phenomena important to particle physics that are currently impossible to address using standard particle physics experimental techniques. As the subject matter is vast, compromises are necessary in order to convey the central ideas to the reader. Many general references are included for those who want to learn more. The paragraphs below elaborate on the structure of these lectures. I hope this discussion will clarify my motivation and make the lectures easier to follow. The lectures begin with a brief review of more theoretical ideas. First, elements of general relativity are reviewed, concentrating on those aspects that are needed to understand compact stellar objects (white dwarf stars, neutron stars, and black holes). I then review the equations of state of these objects, concentrating on the simplest standard models from astrophysics. After these mathematical preliminaries, Sec. 2(c) discusses 'The End State of Stars'. Most of this section also uses the simplest standard models. However, as these lectures are for particle physicists, I also discuss some of the more recent approaches to the equation of state of very dense compact objects. These particle-physics-motivated equations of state can dramatically change how we view the formation of black holes. Section 3 focuses on the properties of the objects that we want to characterize and measure. X-ray binary systems and Active Galactic Nuclei (AGN) are stressed because the lectures center on understanding very dense stellar objects, black hole candidates (BHCs), and their accompanying high gravitational fields. The use of x-ray timing and gamma-ray experiments is also introduced in this section. Sections 4 and 5 review information from x-ray and gamma-ray experiments. These sections also discuss the current state of the art in x-ray and gamma-ray satellite experiments and

  10. Investigating High Field Gravity using Astrophysical Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bloom, Elliott D.; /SLAC

    2008-02-01

    The purpose of these lectures is to introduce particle physicists to astrophysical techniques. These techniques can help us understand certain phenomena important to particle physics that are currently impossible to address using standard particle physics experimental techniques. As the subject matter is vast, compromises are necessary in order to convey the central ideas to the reader. Many general references are included for those who want to learn more. The paragraphs below elaborate on the structure of these lectures. I hope this discussion will clarify my motivation and make the lectures easier to follow. The lectures begin with a brief review of more theoretical ideas. First, elements of general relativity are reviewed, concentrating on those aspects that are needed to understand compact stellar objects (white dwarf stars, neutron stars, and black holes). I then review the equations of state of these objects, concentrating on the simplest standard models from astrophysics. After these mathematical preliminaries, Sec. 2(c) discusses 'The End State of Stars'. Most of this section also uses the simplest standard models. However, as these lectures are for particle physicists, I also discuss some of the more recent approaches to the equation of state of very dense compact objects. These particle-physics-motivated equations of state can dramatically change how we view the formation of black holes. Section 3 focuses on the properties of the objects that we want to characterize and measure. X-ray binary systems and Active Galactic Nuclei (AGN) are stressed because the lectures center on understanding very dense stellar objects, black hole candidates (BHCs), and their accompanying high gravitational fields. The use of x-ray timing and gamma-ray experiments is also introduced in this section. Sections 4 and 5 review information from x-ray and gamma-ray experiments. These sections also discuss the current state of the art in x-ray and gamma-ray satellite

  11. Assessment of Credit Risk Approaches in Relation with Competitiveness Increase of the Banking Sector

    Directory of Open Access Journals (Sweden)

    Cipovová Eva

    2012-06-01

    Full Text Available The article is focused on the presentation and analysis of selected methods of credit risk management in relation to increasing the competitiveness of the banking sector. The article then outlines the credit risk approaches under Basel III. The aim of this contribution is to examine various methods of credit risk management and the effects of their usage on the amount of regulatory capital for corporate exposures. An optimal amount of equity in relation to the risk portfolio is an essential prerequisite for the performance and competitiveness growth of commercial banks. Capital requirements were quantified step by step using the Standardized Approach and the Internal Based Approach, both with and without credit risk mitigation techniques. We presume that a more sophisticated approach means significant savings in a bank’s equity, which also increases the competitiveness of the banking sector. Within the article, capital savings under the Standardized Approach (with and without assigned external ratings) and the Foundation Internal Based Approach are quantified for the selected credit portfolio.
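
    As a simplified, hypothetical illustration of why the choice of approach and of credit risk mitigation matters, the sketch below computes a standardized-approach capital charge (8% of risk-weighted exposure) with and without collateral; the haircut treatment is a crude stand-in for the Basel CRM rules, not the paper's calculation.

```python
def standardized_capital(exposure, risk_weight, collateral=0.0, collateral_haircut=0.0,
                         capital_ratio=0.08):
    """Capital charge under a simplified standardized approach:
    RWA = (exposure - haircut-adjusted collateral) * risk weight, capital = 8% of RWA."""
    net_exposure = max(exposure - collateral * (1.0 - collateral_haircut), 0.0)
    return net_exposure * risk_weight * capital_ratio

# unrated corporate exposure (100% risk weight), with and without eligible collateral
print(standardized_capital(1_000_000, 1.00))                                          # no CRM
print(standardized_capital(1_000_000, 1.00, collateral=400_000, collateral_haircut=0.2))
```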

  12. Transperitoneal rectus sheath block and transversus abdominis plane block for laparoscopic inguinal hernia repair: A novel approach.

    Science.gov (United States)

    Nagata, Jun; Watanabe, Jun; Nagata, Masato; Sawatsubashi, Yusuke; Akiyama, Masaki; Tajima, Takehide; Arase, Koichi; Minagawa, Noritaka; Torigoe, Takayuki; Nakayama, Yoshifumi; Horishita, Reiko; Kida, Kentaro; Hamada, Kotaro; Hirata, Keiji

    2017-08-01

    A laparoscopic approach for inguinal hernia repair is now considered the gold standard. Laparoscopic surgery is associated with a significant reduction in postoperative pain. Epidural analgesia cannot be used in patients with perioperative anticoagulant therapy because of complications such as epidural hematoma. As such, regional anesthetic techniques, such as ultrasound-guided rectus sheath block and transversus abdominis plane block, have become increasingly popular. However, even these anesthetic techniques have potential complications, such as rectus sheath hematoma, if vessels are damaged. We report the use of a transperitoneal laparoscopic approach for rectus sheath block and transversus abdominis plane block as a novel anesthetic procedure. An 81-year-old woman with direct inguinal hernia underwent laparoscopic transabdominal preperitoneal inguinal repair. Epidural anesthesia was not performed because anticoagulant therapy was administered. A Peti-needle™ was delivered through the port, and levobupivacaine was injected though the peritoneum. Surgery was performed successfully, and the anesthetic technique did not affect completion of the operative procedure. The patient was discharged without any complications. This technique was feasible, and the procedure was performed safely. Our novel analgesia technique has potential use as a standard postoperative regimen in various laparoscopic surgeries. Additional prospective studies to compare it with other techniques are required. © 2017 Japan Society for Endoscopic Surgery, Asia Endosurgery Task Force and John Wiley & Sons Australia, Ltd.

  13. An improved technique for fission track dating

    International Nuclear Information System (INIS)

    Zhao Yunlong; Wu Zhaohui; Xia Yuliang

    1996-01-01

    The necessity of improving the fission track dating (FTD) technique both at home and abroad is illustrated, and ways of making such improvements are proposed. It is suggested to calibrate the constant b of the uranium standard glass by using the fission-product activity method. The three kinds of uranium standard glass that have been calibrated are NBS SRM962a, UB1 and UB2. An established new method, σ·Φ = ρ_d/b, to measure neutron fluence avoids the influence of the varying neutron spectrum on the fluence measurement. The improved etching technique for fission tracks in zircon adopts a two-step method comprising molten alkali etching with NaOH + KOH and mixed acid etching with HNO3 + HF; this results in adequate track etching, increased track clarity and less interference, so the track density is authentically reflected. Dividing the crystal surface of minerals into angular zones in accordance with the angular distribution of spontaneous fission tracks when counting, and using the improved etching technique to remove the non-uniform angular distribution of spontaneous fission tracks in zircon, ensure the accuracy of the track counts. The improved FTD techniques were used to complete the Laboratory Standardized Calibration. Tests using international FTD age standard samples have proved that the above-mentioned techniques are reliable and practical for obtaining accurate FTD data. (8 tabs.; 3 figs.)

  14. Supra-auricular versus Sinusectomy Approaches for Preauricular Sinuses.

    Science.gov (United States)

    El-Anwar, Mohammad Waheed; ElAassar, Ahmed Shaker

    2016-10-01

    Introduction  Several surgical techniques and modifications have been described to reduce the high recurrence rate after excision of preauricular sinus. Objectives  The aim of this study is to review the literature regarding surgical approaches for preauricular sinus. Data Synthesis  We performed searches in the LILACS, MEDLINE, SciELO, PubMed databases and Cochrane Library in September, 2015, and the key words used in the search were "preauricular sinus," "sinusectomy," "supra-auricular approach," "methylene blue," and/or "recurrence." We reviewed the results of 17 studies, including 1270 preauricular sinuses that were surgically excised by sinusectomy in 937 ears and by supra-auricular approach in 333 ears. Recurrence occurred in 4 ears (1.3%) with the supra-auricular approach and in 76 ears (8.1%) with sinusectomy, a statistically significant difference. The supra-auricular approach had a significantly lower recurrence rate than tract sinusectomy approaches. Thus, it could be regularly chosen as the standard procedure for preauricular sinus excision. As such, it would be helpful for surgeons to be familiar with this approach.

  15. Mini-open lateral retroperitoneal lumbar spine approach using psoas muscle retraction technique. Technical report and initial results on six patients.

    Science.gov (United States)

    Aghayev, Kamran; Vrionis, Frank D

    2013-09-01

    The main aim of this paper was to report a reproducible method of lumbar spine access via a lateral retroperitoneal route. The authors conducted a retrospective analysis of the technical aspects and clinical outcomes of six patients who underwent lateral multilevel retroperitoneal interbody fusion with the psoas muscle retraction technique. The main goal was to develop a simple and reproducible technique to avoid injury to the lumbar plexus. Six patients were operated on at 15 levels using the psoas muscle retraction technique. All patients reported improvement in back pain and radiculopathy after the surgery. The only procedure-related transient complication was weakness and pain on hip flexion that resolved by the first follow-up visit. The psoas retraction technique is a reliable technique for lateral access to the lumbar spine and may avoid some of the complications related to the traditional minimally invasive transpsoas approach.

  16. Advanced Atmospheric Ensemble Modeling Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Chiswell, S. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Kurzeja, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Maze, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Viner, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Werth, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-29

    Ensemble modeling (EM), the creation of multiple atmospheric simulations for a given time period, has become an essential tool for characterizing uncertainties in model predictions. We explore two novel ensemble modeling techniques: (1) perturbation of model parameters (Adaptive Programming, AP), and (2) data assimilation (Ensemble Kalman Filter, EnKF). The current research is an extension of work from last year and examines transport on a small spatial scale (<100 km) in complex terrain, for more rigorous testing of the ensemble technique. Two different release cases were studied: a coastal release (SF6) and an inland release (Freon), the latter consisting of two release times. Observations of tracer concentration and meteorology are used to judge the ensemble results. In addition, adaptive grid techniques have been developed to reduce the computing resources required for transport calculations. Using a 20-member ensemble, the standard approach generated downwind transport that was quantitatively good for both releases; however, the EnKF method produced additional improvement for the coastal release, where the spatial and temporal differences due to interior valley heating led to the inland movement of the plume. The AP technique showed improvements for both release cases, with more improvement for the inland release. This research demonstrated that transport accuracy can be improved when models are adapted to a particular location and time or when important local data are assimilated into the simulation, and it enhances SRNL's capability in atmospheric transport modeling in support of its current customer base and local site missions, as well as our ability to attract new customers within the intelligence community.
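    The Ensemble Kalman Filter named above combines a forecast ensemble with observations, but the record gives no implementation details. The following is a minimal, generic numpy sketch of a stochastic EnKF analysis step; the variable names, dimensions, and the linear observation operator H are illustrative assumptions, not SRNL's code.

```python
import numpy as np

def enkf_analysis(X_f, y, H, R, rng):
    """One stochastic EnKF analysis step.

    X_f : (n_state, n_members) forecast ensemble
    y   : (n_obs,) observation vector
    H   : (n_obs, n_state) linear observation operator
    R   : (n_obs, n_obs) observation-error covariance
    """
    n_state, n_members = X_f.shape
    # Ensemble mean and anomalies
    x_mean = X_f.mean(axis=1, keepdims=True)
    A = X_f - x_mean
    # Sample covariances mapped into observation space
    HA = H @ A
    P_HT = A @ HA.T / (n_members - 1)          # P_f H^T
    HPHT = HA @ HA.T / (n_members - 1)         # H P_f H^T
    K = P_HT @ np.linalg.inv(HPHT + R)         # Kalman gain
    # Perturbed observations (stochastic EnKF)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_members).T
    # Update every ensemble member
    return X_f + K @ (Y - H @ X_f)

# Tiny usage example with synthetic numbers
rng = np.random.default_rng(0)
X_f = rng.normal(size=(4, 20))                 # 4 state variables, 20 members
H = np.eye(2, 4)                               # observe the first two state variables
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.2])
X_a = enkf_analysis(X_f, y, H, R, rng)
print(X_a.mean(axis=1))                        # analysis ensemble mean
```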

  17. Acceleration techniques for the discrete ordinate method

    International Nuclear Information System (INIS)

    Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego; Trautmann, Thomas

    2013-01-01

    In this paper we analyze several acceleration techniques for the discrete ordinate method with matrix exponential and the small-angle modification of the radiative transfer equation. These techniques include the left eigenvector matrix approach for computing the inverse of the right eigenvector matrix, the telescoping technique, and the method of false discrete ordinate. The numerical simulations have shown that, on average, the relative speedups of the left eigenvector matrix approach and the telescoping technique are about 15% and 30%, respectively. -- Highlights: ► We presented the left eigenvector matrix approach. ► We analyzed the method of false discrete ordinate. ► The telescoping technique is applied to the matrix operator method. ► The considered techniques accelerate the computations by about 20% on average.
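    As a small illustration of the left-eigenvector idea mentioned in the abstract (not the authors' implementation), the inverse of the right-eigenvector matrix of a diagonalizable matrix can be assembled from the left eigenvectors, since their product is diagonal when the eigenvalues are distinct. A hedged numpy/scipy sketch on an arbitrary test matrix:

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 6))              # stand-in for the layer matrix

# scipy returns left eigenvectors vl with vl[:, i]^H A = w[i] vl[:, i]^H
w, vl, vr = eig(A, left=True, right=True)

# For distinct eigenvalues, vl^H vr is diagonal, so
# inv(vr) = diag(1 / d_i) @ vl^H, where d_i = (vl^H vr)_ii
d = np.einsum('ij,ij->j', vl.conj(), vr)      # diagonal of vl^H @ vr
R_inv = vl.conj().T / d[:, None]

print(np.allclose(R_inv @ vr, np.eye(6)))     # True: a valid inverse
print(np.allclose(R_inv, np.linalg.inv(vr)))  # matches direct inversion
```

    Whether this beats a direct LU-based inversion in practice depends on problem size and on whether the left eigenvectors are already available, as they are in the matrix-exponential formulation discussed in the paper.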

  18. A novel approach to measure elemental concentrations in cation exchange resins using XRF-scanning technique, and its potential in water pollution studies

    Science.gov (United States)

    Huang, Jyh-Jaan; Lin, Sheng-Chi; Löwemark, Ludvig; Liou, Ya-Hsuan; Chang, Queenie; Chang, Tsun-Kuo; Wei, Kuo-Yen; Croudace, Ian W.

    2016-04-01

    X-ray fluorescence (XRF) core-scanning is a fast and convenient technique for assessing elemental variations for a wide variety of research topics. However, XRF scanning counts are often considered a semi-quantitative measurement due to possible absorption or scattering caused by down-core variability in physical properties. To overcome this problem and extend the applications of XRF scanning to water pollution studies, we propose to use cation exchange resin (IR-120) as an "elemental carrier" and to analyze the resins using the Itrax-XRF core scanner. The use of resin minimizes matrix effects during the measurements, and it can be employed in the field in great numbers due to its low price. Therefore, the fast and non-destructive XRF-scanning technique can provide a quick and economical method to analyze environmental pollution via absorption in the resin. Five standard resin samples were scanned by the Itrax-XRF core scanner at different exposure times (1 s, 5 s, 15 s, 30 s, 100 s) to allow comparison of scanning counts with the absolute concentrations. The regression lines and correlation coefficients of elements that are generally used in pollution studies (Ca, Ti, Cr, Ni, Cu, Zn, and Pb) were examined for the different exposure times. The results show that within the test range (from a few ppm to thousands of ppm), the correlation coefficients are all higher than 0.97, even at the shortest exposure time (1 s). Therefore, we propose to use this method in the field to monitor, for example, sewage disposal events. The low price of the resin and the fast, multi-element and precise XRF-scanning technique provide a viable, cost- and time-effective approach that allows large sample numbers to be processed. In this way, the properties and sources of wastewater pollution can be traced for the purposes of environmental monitoring and environmental forensics.
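    The calibration described here (scanner counts versus known concentration at each exposure time) is a set of simple linear regressions. The sketch below shows how such slopes and correlation coefficients could be computed with numpy; the counts, concentrations and element choice are invented for illustration and are not the authors' data.

```python
import numpy as np

# Hypothetical standards: known Zn concentrations (ppm) and Itrax peak-area
# counts at two exposure times (illustrative numbers only).
conc = np.array([5.0, 50.0, 200.0, 800.0, 2000.0])
counts = {
    "1 s":  np.array([12.0, 110.0, 430.0, 1750.0, 4300.0]),
    "30 s": np.array([350.0, 3300.0, 13000.0, 52000.0, 129000.0]),
}

for exposure, y in counts.items():
    slope, intercept = np.polyfit(conc, y, 1)      # counts per ppm
    r = np.corrcoef(conc, y)[0, 1]
    print(f"{exposure}: slope = {slope:.2f} counts/ppm, r = {r:.4f}")

# Inverting the regression turns new scan counts into concentration estimates:
# conc_est = (new_counts - intercept) / slope
```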

  19. A unified approach to assess performance of different techniques for recovering exhaust heat from gas turbines

    International Nuclear Information System (INIS)

    Carapellucci, Roberto

    2009-01-01

    Exhaust heat from gas turbines can be recovered externally or internally to the cycle itself. Of the technology options for external recovery, the combined gas-steam power plant is by far the most effective and commonly used worldwide. For internal recovery conventional solutions are based on thermodynamic regeneration and steam injection, while innovative solutions rely on humid air regeneration and steam reforming of fuel. In this paper a unified approach for analysing different exhaust heat recovery techniques is proposed. It has been possible to define a characteristic internal heat recovery plane, based on a few meaningful parameters and to identify an innovative scheme for repowering existing combined cycles. The characteristic plane indicates directly the performance obtainable with the different recovery techniques, showing that performances close to combined cycle plants (external recovery) can only be achieved with combined recovery techniques (humid air regeneration, steam reforming of fuel). The innovative repowering scheme, which requires the addition of a gas turbine and one-pressure level HRSG to an existing combined gas-steam power plant, significantly increases power output with fairly high marginal efficiency.

  20. Mathematical Knowledge for Teaching, Standards-Based Mathematics Teaching Practices, and Student Achievement in the Context of the "Responsive Classroom Approach"

    Science.gov (United States)

    Ottmar, Erin R.; Rimm-Kaufman, Sara E.; Larsen, Ross A.; Berry, Robert Q.

    2015-01-01

    This study investigates the effectiveness of the Responsive Classroom (RC) approach, a social and emotional learning intervention, on changing the relations between mathematics teacher and classroom inputs (mathematical knowledge for teaching [MKT] and standards-based mathematics teaching practices) and student mathematics achievement. Work was…

  1. Determination of trace impurities in uranium-transition metal alloy fuels by ICP-MS using extended common analyte internal standardization (ECAIS) technique

    International Nuclear Information System (INIS)

    Saha, Abhijit; Deb, S.B.; Nagar, B.K.; Saxena, M.K.

    2015-01-01

    An analytical methodology was developed for the determination of eight trace impurities, viz. Al, B, Cd, Co, Cu, Mg, Mn and Ni, in three different uranium-transition metal alloy fuels (U-Me; Me = Ti, Zr and Mo) employing inductively coupled plasma mass spectrometry (ICP-MS). The well-known common analyte internal standardization (CAIS) chemometric technique was modified and then employed to minimize and account for the matrix effect on analyte intensity. Standard addition of analytes to pure synthetic U-Me sample solutions and their subsequent ≥ 94% recovery in the ICP-MS measurements validate the proposed methodology. One real sample of each of these alloys was analyzed by the developed analytical methodology, and the %RSD observed was in the range of 5-8%. The method detection limits were found to be within 4-10 μg/L. (author)
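    The record does not spell out the ECAIS algebra, so the sketch below only illustrates ordinary internal standardization in ICP-MS: the analyte signal is ratioed to an internal-standard signal to cancel matrix-induced suppression, and the concentration is read off a calibration of that ratio. All isotopes, intensities and concentrations are invented for illustration.

```python
import numpy as np

# Calibration standards: analyte (e.g. Ni) concentration in ug/L and measured
# intensities for the analyte and the internal standard (hypothetical values).
c_std = np.array([1.0, 5.0, 10.0, 20.0])            # ug/L
i_analyte_std = np.array([1520.0, 7610.0, 15150.0, 30400.0])
i_is_std = np.array([50200.0, 49800.0, 50500.0, 49900.0])

ratio_std = i_analyte_std / i_is_std
slope, intercept = np.polyfit(c_std, ratio_std, 1)

# Unknown U-Me sample: the matrix suppresses both signals by a similar factor,
# which the ratio largely cancels.
i_analyte_sample = 9050.0
i_is_sample = 37500.0
ratio_sample = i_analyte_sample / i_is_sample
c_sample = (ratio_sample - intercept) / slope
print(f"estimated concentration: {c_sample:.2f} ug/L")
```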

  2. Photonic band structure calculations using nonlinear eigenvalue techniques

    International Nuclear Information System (INIS)

    Spence, Alastair; Poulton, Chris

    2005-01-01

    This paper considers the numerical computation of the photonic band structure of periodic materials such as photonic crystals. This calculation involves the solution of a Hermitian nonlinear eigenvalue problem. Numerical methods for nonlinear eigenvalue problems are usually based on Newton's method or are extensions of techniques for the standard eigenvalue problem. We present a new variation on existing methods which has its derivation in methods for bifurcation problems, where bordered matrices are used to compute critical points in singular systems. This new approach has several advantages over the current methods. First, in our numerical calculations the new variation is more robust than existing techniques, having a larger domain of convergence. Second, the linear systems remain Hermitian and are nonsingular as the method converges. Third, the approach provides an elegant and efficient way of both thinking about the problem and organising the computer solution so that only one linear system needs to be factorised at each stage in the solution process. Finally, first- and higher-order derivatives are calculated as a natural extension of the basic method, and this has advantages in the electromagnetic problem discussed here, where the band structure is plotted as a set of paths in the (ω,k) plane
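    To make the bordered-matrix idea concrete, here is a small, self-contained sketch (not the authors' algorithm) of Newton's method for a quadratic eigenvalue problem T(λ)x = (λ²M + λC + K)x = 0 with normalization cᵀx = 1. Each iteration solves one bordered linear system with blocks [[T(λ), T'(λ)x], [cᵀ, 0]]; the test matrices and the companion-linearization starting guess are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
M = np.eye(n)
C = 0.1 * rng.normal(size=(n, n))
K = rng.normal(size=(n, n)); K = K + K.T          # arbitrary test matrices

T  = lambda lam: lam**2 * M + lam * C + K
dT = lambda lam: 2 * lam * M + C                  # dT/d(lambda)

# Starting guess from the companion linearization of the quadratic problem
Z = np.zeros((n, n)); I = np.eye(n)
A_lin = np.block([[Z, I], [-K, -C]])
B_lin = np.block([[I, Z], [Z, M]])
evals, evecs = np.linalg.eig(np.linalg.solve(B_lin, A_lin))
lam = evals[0] * (1 + 1e-2)                       # deliberately perturbed eigenvalue
x = evecs[:n, 0]
c = x.conj()                                      # fixed normalization vector
x = x / (c @ x)                                   # enforce c^T x = 1

for it in range(20):
    F = np.concatenate([T(lam) @ x, [c @ x - 1.0]])
    J = np.block([[T(lam), (dT(lam) @ x)[:, None]],   # bordered Jacobian
                  [c[None, :], np.zeros((1, 1))]])
    delta = np.linalg.solve(J, -F)
    x, lam = x + delta[:n], lam + delta[n]
    if np.linalg.norm(F) < 1e-12:
        break

print("eigenvalue:", lam, " residual:", np.linalg.norm(T(lam) @ x))
```

    Only one bordered system is factorized per step, which mirrors the organisational advantage the abstract describes, although the convergence and robustness claims of the paper refer to their specific variant, not to this toy iteration.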

  3. ‘Mother-in-child’ thrombectomy technique: a novel and effective approach to decrease intracoronary thrombus burden in acute myocardial infarction

    Energy Technology Data Exchange (ETDEWEB)

    Dauvergne, Christian; Araya, Mario [Department of Cardiology, Clinica Alemana, Santiago (Chile); Uriarte, Polentzi [Department of Cardiology, Instituto Nacional del Torax, Santiago (Chile); Novoa, Oscar; Novoa, Lilian [Department of Cardiology, Clinica Alemana, Santiago (Chile); Maluenda, Gabriel, E-mail: gabrielmaluenda@gmail.com [Department of Cardiology, Clinica Alemana, Santiago (Chile)

    2013-01-15

    Background: The presence of a large thrombus burden in patients presenting with acute myocardial infarction (AMI) is common and associated with poor prognosis. This study aimed to describe the feasibility and safety of the novel ‘mother-in-child’ thrombectomy (MCT) technique in patients presenting with AMI and large thrombus burden undergoing percutaneous coronary intervention (PCI). Methods: We studied 13 patients presenting with AMI who underwent PCI with persistent large intracoronary thrombus after standard thrombectomy. The procedure was performed using a 5 F ‘Heartrail II-ST01’ catheter (Terumo Medical) inserted into a 6 F guiding system. Angiographic assessment of thrombus burden and coronary flow was obtained at baseline, immediately after thrombectomy and at the end of the procedure. Results: The mean age was 55.9 ± 13.0 years, and most patients were male (76.9%). All patients underwent PCI via the radial approach. Following MCT, Thrombolysis In Myocardial Infarction (TIMI) flow improved by 2 or more degrees in 11 patients (84.5%), while visible angiographic thrombus was reduced in 11 patients (84.5%). In the final angiogram, normal TIMI flow was restored in 11 patients (84.5%), with normal myocardial ‘blush’ in 7 patients (53.8%) and total clearance of visible thrombus in 7 patients (53.8%). Overall, 6 patients received thrombectomy as a ‘stand-alone’ procedure. All patients were discharged alive after a mean of 5.6 ± 2 days. Conclusion: This initial report suggests that significant reduction in thrombus burden and improvement of coronary flow can be safely achieved in patients presenting with AMI and large thrombus burden by using the novel MCT technique.

  4. Radiologic examination of orthopaedics. Methods and techniques

    International Nuclear Information System (INIS)

    Hafner, E.; Meuli, H.C.

    1976-01-01

    This volume describes in detail the radiological examinations of the skeleton used in modern orthopaedic surgery. Special emphasis is given to functional examination techniques based upon the authors' extensive work on standardized radiological examinations best suited to the needs of orthopaedic surgeons. These techniques were developed at the Radiodiagnostic Department of the Central Radiological Clinic, Bern University, in cooperation with the University Clinic of Orthopaedics and Surgery of the Locomotor System. Exposure techniques are explained concisely, yet with extraordinary precision and attention to detail. They have proved highly successful in teaching programs for X-ray technicians and as standard examination techniques for many hospitals, X-ray departments, orthopaedic units, and private clinics. Recommended for orthopaedic surgeons, radiologists, general surgeons, and X-ray technicians, this definitive treatise, with its superb X-ray reproductions and complementary line drawings, explains how to achieve improved diagnoses and standardized control with the least possible radiation exposure to the patient

  5. 48 CFR 9904.413-50 - Techniques for application.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Techniques for application... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.413-50 Techniques for application. (a) Assignment of... subject to this phase-in. (v) If a segment is closed due to a sale or other transfer of ownership to a...

  6. A Stepwise ISO-Based TQM Implementation Approach Using ISO 9001:2015

    Directory of Open Access Journals (Sweden)

    Chen Chi-kuang

    2016-12-01

    Full Text Available The lack of an implementation roadmap always deters enterprises from choosing Total Quality Management (TQM) as their major management approach. This paper proposes a stepwise ISO-based TQM implementation approach which is based on the notion of the new three-dimensional overall business excellence framework developed by Dahlgaard et al. [1]. The proposed approach consists of nine steps comprising three categories: “TQM faith building”, “TQM tools and techniques learning”, and “system development”. The steps in each of the three categories are arranged to span across the proposed nine-step approach. The ISO 9001:2015 standard is used as a case study to demonstrate the proposed approach. The ideas and benefits of the proposed approach are further discussed in relation to this illustration.

  7. Territories typification technique with use of statistical models

    Science.gov (United States)

    Galkin, V. I.; Rastegaev, A. V.; Seredin, V. V.; Andrianov, A. V.

    2018-05-01

    Territory typification is required for the solution of many problems. The results of geological zoning obtained by various methods do not always agree. That is why the main goal of this research is to develop a technique for obtaining a multidimensional standard classified indicator for geological zoning. In the course of the research, a probabilistic approach was used. In order to increase the reliability of geological information classification, the authors suggest using the complex multidimensional probabilistic indicator P_K as a criterion of the classification. The second criterion chosen is the multidimensional standard classified indicator Z. These can serve as characteristics of classification in geological-engineering zoning. The above-mentioned indicators P_K and Z are in good correlation; correlation coefficient values for the entire territory, regardless of structural solidity, equal r = 0.95, so each indicator can be used in geological-engineering zoning. The method suggested has been tested and a schematic zoning map has been drawn.

  8. Segmented arch or continuous arch technique? A rational approach

    Directory of Open Access Journals (Sweden)

    Sergei Godeiro Fernandes Rabelo Caldas

    2014-04-01

    Full Text Available This study aims at reviewing the biomechanical principles of the segmented archwire technique as well as describing the clinical conditions in which the rational use of scientific biomechanics is essential to optimize orthodontic treatment and reduce the side effects produced by the straight wire technique.

  9. Report from the research committee of digital imaging standardization in nuclear medicine

    International Nuclear Information System (INIS)

    Nakamura, Yutaka; Ise, Toshihide; Isetani, Osamu; Ichihara, Takashi; Ohya, Nobuyoshi; Kanaya, Shinichi; Fukuda, Toshio; Horii, Hitoshi.

    1994-01-01

    Since digital scintillation camera systems were developed in 1982, digital imaging has rapidly been replacing analog imaging. During the first year, the research committee on digital imaging standardization collected and analyzed basic data concerning digital examination equipment systems, display equipment, films, and hardware and software techniques to determine the items required for the standardization of digital imaging. During the second year, it performed basic phantom studies to assess digital images and analyzed the results from both physical and visual viewpoints. On the basis of the outcome of the research committee's activities and a nationwide survey, a draft of digital imaging standardization in nuclear medicine has been presented. In this paper, the analytical data of the two-year survey made by the research committee on digital imaging standardization are presented. The descriptions are given under the following four items: (1) standardization of digital examination techniques, (2) standardization of display techniques, (3) the count and pixel of digital images, and (4) standardization of digital imaging techniques. (N.K.)

  10. Oncoplastic round block technique has comparable operative parameters as standard wide local excision: a matched case-control study.

    Science.gov (United States)

    Lim, Geok-Hoon; Allen, John Carson; Ng, Ruey Pyng

    2017-08-01

    Although oncoplastic breast surgery is used to resect larger tumors with lower re-excision rates compared to standard wide local excision (sWLE), criticisms of oncoplastic surgery include a longer, albeit well concealed, scar, longer operating time and hospital stay, and increased risk of complications. The round block technique has been reported to be very suitable for patients with relatively smaller breasts and minimal ptosis. We aimed to determine whether the round block technique results in operative parameters comparable with sWLE. Breast cancer patients who underwent a round block procedure from 1st May 2014 to 31st January 2016 were included in the study. These patients were then matched for the type of axillary procedure, on a one-to-one basis, with breast cancer patients who had undergone sWLE from 1st August 2011 to 31st January 2016. The operative parameters of the 2 groups were compared. 22 patients were included in the study. Patient demographics and histologic parameters were similar in the 2 groups. No complications were reported in either group. The mean operating time was 122 and 114 minutes in the round block and sWLE groups, respectively (P=0.64). Length of stay was similar in the 2 groups (P=0.11). Round block patients had better cosmesis and lower re-excision rates. A higher rate of recurrence was observed in the sWLE group. The round block technique has operative parameters comparable to sWLE with no evidence of increased complications. A lower re-excision rate and better cosmesis were observed in the round block patients, suggesting that the round block technique is not only comparable in general, but may have advantages over sWLE in selected cases.

  11. Standard and biological treatment in large vessel vasculitis: guidelines and current approaches.

    Science.gov (United States)

    Muratore, Francesco; Pipitone, Nicolò; Salvarani, Carlo

    2017-04-01

    Giant cell arteritis and Takayasu arteritis are the two major forms of idiopathic large vessel vasculitis. High doses of glucocorticoids are effective in inducing remission in both conditions, but relapses and recurrences are common, requiring prolonged glucocorticoid treatment with the risk of the related adverse events. Areas covered: In this article, we will review the standard and biological treatment strategies in large vessel vasculitis, and we will focus on the current approaches to these diseases. Expert commentary: The results of treatment trials with conventional immunosuppressive agents such as methotrexate, azathioprine, mycophenolate mofetil, and cyclophosphamide have overall been disappointing. TNF-α blockers are ineffective in giant cell arteritis, while observational evidence and a phase 2 randomized trial support the use of tocilizumab in relapsing giant cell arteritis. Observational evidence strongly supports the use of anti-TNF-α agents and tocilizumab in Takayasu patients with relapsing disease. However biological agents are not curative, and relapses remain common.

  12. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hudelot, J.P

    1998-06-19

    In order to improve and validate neutronic calculation schemes, integral measurements of neutronic parameters must be perfected. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the usual techniques is given; these are then applied through the example of doubling time measurements at the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high-flux neutron generator, based on discrimination of the neutron energy with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1%. Finally, the necessity of these calibrations is shown through spectral index measurements in cores MISTRAL 1 (UO2) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, are validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of 238U (defined as the ratio of the 238U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1%. The extension of this technique to future modified conversion ratio measurements for 242Pu (on MOX rods) and

  13. [Sinus tarsi approach combined with medial distraction technique for treatment of intra-articular calcaneus fractures].

    Science.gov (United States)

    Zhou, Haichao; Ren, Haoyang; Li, Bing; Yu, Tao; Yang, Yunfeng

    2016-07-08

    To discuss the effectiveness of limited open reduction via the sinus tarsi approach using a medial distraction technique in the treatment of intra-articular calcaneus fractures, by comparing it with open reduction and internal fixation via an extensile L-shaped incision. A retrospective analysis was made of the clinical data of 21 patients with intra-articular calcaneus fractures treated by the sinus tarsi approach combined with the medial distraction technique between April 2013 and November 2014 (minimally invasive group), and 32 patients treated by the extensile L-shaped incision approach between June 2012 and September 2014 (extensile incision group). No significant difference was found in gender, age, injury pattern, fracture classification, time from injury to operation, preoperative Böhler angle, Gissane angle, calcaneal varus angle, the ankle and hind-foot score of the American Orthopaedic Foot and Ankle Society (AOFAS), or visual analogue scale (VAS) score between the 2 groups (P>0.05), so the groups were comparable. The operation time, wound complications, and bone healing time were recorded. Postoperative function was also evaluated by AOFAS score and VAS score. The pre- and post-operative Böhler angle, Gissane angle, and calcaneal varus angle were measured on X-ray films, and the corrective angle was calculated. Sixteen patients were followed up for 6-18 months (mean, 11.5 months) in the minimally invasive group, and 23 patients for 6-24 months (mean, 13.5 months) in the extensile incision group. The difference in operation time between the 2 groups was not significant (t=0.929, P=0.796). No complication occurred in the minimally invasive group; partial skin flap necrosis occurred in 3 cases of the extensile incision group and was cured after dressing changes. There was no loosening of implants or loss of reduction in either group at the last follow-up. Subtalar joint stiffness occurred in 1 case of the minimally invasive group and 4 cases of the extensile incision group, and 1 patient had discomfort for the

  14. Use of Monte Carlo modeling approach for evaluating risk and environmental compliance

    International Nuclear Information System (INIS)

    Higley, K.A.; Strenge, D.L.

    1988-09-01

    Evaluating compliance with environmental regulations, specifically those regulations that pertain to human exposure, can be a difficult task. Historically, maximum individual or worst-case exposures have been calculated as a basis for evaluating risk or compliance with such regulations. However, these calculations may significantly overestimate exposure and may not provide a clear understanding of the uncertainty in the analysis. The use of Monte Carlo modeling techniques can provide a better understanding of the potential range of exposures and the likelihood of high (worst-case) exposures. This paper compares the results of standard exposure estimation techniques with the Monte Carlo modeling approach. The authors discuss the potential application of this approach for demonstrating regulatory compliance, along with the strengths and weaknesses of the approach. Suggestions on implementing this method as a routine tool in exposure and risk analyses are also presented. 16 refs., 5 tabs
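    As a generic illustration of the contrast the report draws (not the authors' code or dose model), the sketch below propagates uncertain inputs through a toy multiplicative exposure equation with Monte Carlo sampling and compares the resulting percentiles with a deterministic worst-case point estimate. The dose model and all distribution parameters are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Toy exposure model: dose = concentration * intake rate * exposure duration * dose factor
conc     = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)    # Bq/L
intake   = rng.normal(loc=2.0, scale=0.3, size=n).clip(min=0.5)  # L/day
duration = rng.uniform(200, 365, size=n)                         # days/year
factor   = 1.0e-5                                                # mSv/Bq (fixed)

dose = conc * intake * duration * factor                         # mSv/year

# Worst-case point estimate: pile conservative values onto every input
worst_case = 2.0 * np.exp(2 * 0.5) * (2.0 + 2 * 0.3) * 365 * factor

print(f"mean dose        : {dose.mean():.4f} mSv/y")
print(f"95th percentile  : {np.percentile(dose, 95):.4f} mSv/y")
print(f"99.9th percentile: {np.percentile(dose, 99.9):.4f} mSv/y")
print(f"worst-case point : {worst_case:.4f} mSv/y")
```

    Comparing the upper percentiles against the worst-case value shows how stacked conservatism can exceed even the extreme tail of the simulated exposure distribution, which is the point the paper makes about overestimation.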

  15. METHODOLOGY COMPARATIVE EVALUATION OF PROFESSIONAL STANDARDS AND EDUCATION STANDARDS WITH THE USE OF NON-NUMERIC DATA PROCESSING METHODS

    Directory of Open Access Journals (Sweden)

    Gennady V. Abramov

    2016-01-01

    Full Text Available The article discusses the development of a technique that allows for a comparative assessment of the requirements of professional standards and federal state educational standards. The results can be used by universities to adjust the learning process and to analyze their curricula for better compliance with professional standards.

  16. Training Standardization

    International Nuclear Information System (INIS)

    Agnihotri, Newal

    2003-01-01

    The article describes the benefits of and required process and recommendations for implementing the standardization of training in the nuclear power industry in the United States and abroad. Current Information and Communication Technologies (ICT) enable training standardization in the nuclear power industry. The delivery of training through the Internet, Intranet and video over IP will facilitate this standardization and bring multiple benefits to the nuclear power industry worldwide. As the amount of available qualified and experienced professionals decreases because of retirements and fewer nuclear engineering institutions, standardized training will help increase the number of available professionals in the industry. Technology will make it possible to use the experience of retired professionals who may be interested in working part-time from a remote location. Well-planned standardized training will prevent a fragmented approach among utilities, and it will save the industry considerable resources in the long run. It will also ensure cost-effective and safe nuclear power plant operation

  17. 48 CFR 9904.410-50 - Techniques for application.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Techniques for application... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.410-50 Techniques for application. (a) G&A expenses of a... practice was to use a cost of sales or sales base, that business unit may use the transition method set out...

  18. Measurements of uranium enrichment by four techniques of gamma-ray spectrometry

    International Nuclear Information System (INIS)

    Tojo, Takao

    1983-12-01

    Measurements of uranium enrichment using the LMRI (France) UO2 standards have been made by four techniques of gamma-ray spectrometry, in order to examine the measurement characteristics of each technique. The following results were obtained by the three techniques based on direct determination of the peak area of the 186-keV gamma rays from 235U, when the standard sample of 6.297 a/o was used for measuring enrichments ranging from 1.4 a/o to 9.6 a/o: (i) in LEPS HPGe gamma-ray spectrometry, the standard deviation of the measured enrichments from the certified ones was 1.4%; (ii) in Ge(Li) gamma-ray spectrometry, the standard deviation was 2.0%; (iii) in NaI(Tl) gamma-ray spectrometry, the standard deviation was 1.2%. In the fourth technique, the method of multiple single-channel analyzers, enrichments of 1.4 - 9.6 a/o were measured with a standard deviation of 0.51% when the most suitable pairs of standard samples were used for each sample. Some of the sources of systematic error introduced by each technique were revealed in the course of the measurements. It was also recognized that the LMRI values of enrichment were certified precisely, and that the UO2 standards were very useful for enrichment measurements in the four techniques of gamma-ray spectrometry used here. (author)
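    The direct 186-keV techniques in this record rest on the enrichment-meter principle: for thick samples measured in identical geometry and matrix, the net 186-keV count rate is proportional to the 235U enrichment, so a single standard fixes the calibration. The hedged sketch below shows only that one-line calculation; all count rates are invented.

```python
# Enrichment-meter principle: for "infinitely thick" samples in the same geometry,
# net 186-keV rate R is proportional to enrichment E, so E_x = E_std * R_x / R_std.
def enrichment(rate_sample, rate_std, enrich_std):
    return enrich_std * rate_sample / rate_std

# Hypothetical net count rates (counts/s), calibrating against a 6.297 a/o standard
r_std, e_std = 148.2, 6.297
for r_x in (33.1, 75.9, 226.4):
    print(f"net rate {r_x:6.1f} c/s -> enrichment {enrichment(r_x, r_std, e_std):.3f} a/o")
```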

  19. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    Science.gov (United States)

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. The preparation of large standards for NDA measurements

    International Nuclear Information System (INIS)

    Guardini, S.

    1991-01-01

    The accuracy of a nuclear material balance determination is dependent on the reference materials used to calibrate. The calibration needs of nondestructive assay techniques differ from the needs of destructive techniques: nondestructive techniques use the reference materials more than once and can require larger masses of special nuclear material. Therefore, the expertise inherited from destructive methods does not carry over directly to nondestructive measurements. The procurement process for reference materials is expensive and complex. Careful specification of the desired attributes defines the required quality measures. A detailed procurement plan, agreed upon and documented before acquisition starts, is crucial to obtaining a set of high quality reference materials. The acquisition of some recent Los Alamos standards and the Ispra PERLA (Performance Laboratory) standards are following such plans. To date, plutonium oxide standards of three burnups ranging to 2.5 kg and uranium oxide standards of four (high) enrichments ranging to 1.5 kg are in routine use for calibration, performance evaluation and training. In this paper, the authors discuss an alternative

  1. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

    This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issue

  2. Charge separation technique for metal-oxide-silicon capacitors in the presence of hydrogen deactivated dopants

    International Nuclear Information System (INIS)

    Witczak, Steven C.; Winokur, Peter S.; Lacoe, Ronald C.; Mayer, Donald C.

    2000-01-01

    An improved charge separation technique for metal-oxide-silicon (MOS) capacitors is presented which accounts for the deactivation of substrate dopants by hydrogen at elevated irradiation temperatures or small irradiation biases. Using high-frequency capacitance-voltage (C-V) measurements, radiation-induced inversion voltage shifts are separated into components due to oxide trapped charge, interface traps and deactivated dopants, where the latter is computed from a reduction in Si capacitance. In the limit of no radiation-induced dopant deactivation, this approach reduces to the standard midgap charge separation technique used widely for the analysis of room-temperature irradiations. The technique is demonstrated on a p-type MOS capacitor irradiated with 60Co γ-rays at 100 °C and zero bias, where the dopant deactivation is significant

  3. Determining partial differential cross sections for low-energy electron photodetachment involving conical intersections using the solution of a Lippmann-Schwinger equation constructed with standard electronic structure techniques.

    Science.gov (United States)

    Han, Seungsuk; Yarkony, David R

    2011-05-07

    A method for obtaining partial differential cross sections for low energy electron photodetachment in which the electronic states of the residual molecule are strongly coupled by conical intersections is reported. The method is based on the iterative solution to a Lippmann-Schwinger equation, using a zeroth order Hamiltonian consisting of the bound nonadiabatically coupled residual molecule and a free electron. The solution to the Lippmann-Schwinger equation involves only standard electronic structure techniques and a standard three-dimensional free particle Green's function quadrature for which fast techniques exist. The transition dipole moment for electron photodetachment is a sum of matrix elements, each involving one nonorthogonal orbital obtained from the solution to the Lippmann-Schwinger equation. An expression for the electron photodetachment transition dipole matrix element in terms of Dyson orbitals, which does not make the usual orthogonality assumptions, is derived.

  4. Dynamics of the standard model

    CERN Document Server

    Donoghue, John F; Holstein, Barry R

    2014-01-01

    Describing the fundamental theory of particle physics and its applications, this book provides a detailed account of the Standard Model, focusing on techniques that can produce information about real observed phenomena. The book begins with a pedagogic account of the Standard Model, introducing essential techniques such as effective field theory and path integral methods. It then focuses on the use of the Standard Model in the calculation of physical properties of particles. Rigorous methods are emphasized, but other useful models are also described. This second edition has been updated to include recent theoretical and experimental advances, such as the discovery of the Higgs boson. A new chapter is devoted to the theoretical and experimental understanding of neutrinos, and major advances in CP violation and electroweak physics have been given a modern treatment. This book is valuable to graduate students and researchers in particle physics, nuclear physics and related fields.

  5. On the generalization of the hazard rate twisting-based simulation approach

    KAUST Repository

    Rached, Nadhir B.

    2016-11-17

    Estimating the probability that a sum of random variables (RVs) exceeds a given threshold is a well-known challenging problem. Naive Monte Carlo simulation is the standard technique for the estimation of this type of probability. However, this approach is computationally expensive, especially when dealing with rare events. An alternative approach is the use of variance reduction techniques, known for their efficiency in requiring fewer computations to achieve the same accuracy requirement. Most of these methods have thus far been proposed to deal with specific settings under which the RVs belong to particular classes of distributions. In this paper, we propose a generalization of the well-known hazard rate twisting Importance Sampling-based approach that has the advantage of being logarithmically efficient for arbitrary sums of RVs. The wide scope of applicability of the proposed method is mainly due to our particular way of selecting the twisting parameter. It is worth observing that this interesting feature is rarely satisfied by variance reduction algorithms whose performances were only proven under some restrictive assumptions. It comes along with good efficiency, illustrated by selected simulation results comparing the performance of the proposed method with some existing techniques.
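    To illustrate the flavour of the approach (this is not the paper's general estimator), the sketch below estimates P(X1 + ... + Xn > γ) for i.i.d. exponential RVs, a case in which hazard-rate twisting reduces to sampling from an exponential with reduced rate and reweighting by the likelihood ratio. The twisting parameter here is chosen by a simple mean-matching heuristic, not by the selection rule proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n, lam, gamma = 5, 1.0, 40.0       # sum of 5 Exp(1) RVs, rare threshold
N = 200_000

# Naive Monte Carlo (almost always returns 0 for such a rare event)
naive = (rng.exponential(1 / lam, size=(N, n)).sum(axis=1) > gamma).mean()

# Hazard-rate twisting for exponentials: the twisted density is Exp((1-theta)*lam);
# pick theta so the twisted mean of the sum equals the threshold (heuristic choice).
theta = 1.0 - n / (lam * gamma)
x = rng.exponential(1 / ((1 - theta) * lam), size=(N, n))
s = x.sum(axis=1)
# Likelihood ratio f(x)/f_theta(x) for the n-dimensional sample
weights = (1 - theta) ** (-n) * np.exp(-theta * lam * s)
is_est = np.mean((s > gamma) * weights)
is_err = np.std((s > gamma) * weights) / np.sqrt(N)

print(f"naive MC estimate : {naive:.3e}")
print(f"IS estimate       : {is_est:.3e} +/- {is_err:.1e}")
```

    The importance-sampling estimate resolves a probability of order 1e-13 with a modest sample size, which is the kind of gain over naive Monte Carlo that motivates the twisting approach.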

  6. "TuNa-saving" endoscopic medial maxillectomy: a surgical technique for maxillary inverted papilloma.

    Science.gov (United States)

    Pagella, Fabio; Pusateri, Alessandro; Matti, Elina; Avato, Irene; Zaccari, Dario; Emanuelli, Enzo; Volo, Tiziana; Cazzador, Diego; Citraro, Leonardo; Ricci, Giampiero; Tomacelli, Giovanni Leo

    2017-07-01

    The maxillary sinus is the most common site of sinonasal inverted papilloma. Endoscopic sinus surgery, in particular endoscopic medial maxillectomy, is currently the gold standard for treatment of maxillary sinus papilloma. Although a common technique, complications such as stenosis of the lacrimal pathway and consequent development of epiphora are still possible. To avoid these problems, we propose a modification of this surgical technique that preserves the head of the inferior turbinate and the nasolacrimal duct. A retrospective analysis was performed on patients treated for maxillary inverted papilloma in three tertiary medical centres between 2006 and 2014. Pedicle-oriented endoscopic surgery principles were applied and, in select cases where the tumour pedicle was located on the anterior wall, a modified endoscopic medial maxillectomy was carried out as described in this paper. From 2006 to 2014 a total of 84 patients were treated. A standard endoscopic medial maxillectomy was performed in 55 patients (65.4%), while the remaining 29 (34.6%) had a modified technique performed. Three recurrences (3/84; 3.6%) were observed after a minimum follow-up of 24 months. A new surgical approach for select cases of maxillary sinus inverted papilloma is proposed in this paper. In this technique, the endoscopic medial maxillectomy was performed while preserving the head of the inferior turbinate and the nasolacrimal duct ("TuNa-saving"). This technique allowed for good visualization of the maxillary sinus, good oncological control and a reduction in the rate of complications.

  7. 48 CFR 9904.401-50 - Techniques for application.

    Science.gov (United States)

    2010-10-01

    .... 9904.401-50 Section 9904.401-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401-50 Techniques for application. (a) The standard...

  8. Searching for beyond the Standard Model physics using direct and indirect methods at LHCb

    CERN Document Server

    Hall, Samuel C P; Golutvin, Andrey

    It is known that the Standard Model of particle physics is incomplete in its description of nature at a fundamental level. For example, the Standard Model can neither incorporate dark matter nor explain the matter dominated nature of the Universe. This thesis presents three analyses undertaken using data collected by the LHCb detector. Each analysis searches for indications of physics beyond the Standard Model in different decays of B mesons, using different techniques. Notably, two analyses look for indications of new physics using indirect methods, and one uses a direct approach. The first analysis shows evidence for the rare decay $B^{+} \rightarrow D^{+}_{s}\phi$ with greater than 3 $\sigma$ significance; this also constitutes the first evidence for a fully hadronic annihilation-type decay of a $B^{+}$ meson. A measurement of the branching fraction of the decay $B^{+} \rightarrow D^{+}_{s}\phi$ is seen to be higher than, but still compatible with, Standard Model predictions. The CP-asymmetry of the decay is also ...

  9. A Survey on Formal Verification Techniques for Safety-Critical Systems-on-Chip

    Directory of Open Access Journals (Sweden)

    Tomás Grimm

    2018-05-01

    Full Text Available The high degree of miniaturization in the electronics industry has been, for several years, a driver pushing embedded systems into different fields and applications. One example is safety-critical systems, where compactness in the form factor helps to reduce costs and allows for the implementation of new techniques. The automotive industry is a great example of a safety-critical area with a great rise in the adoption of microelectronics. With it came the creation of the ISO 26262 standard with the goal of guaranteeing a high level of dependability in the designs. Other areas in the safety-critical applications domain have similar standards. However, these standards are mostly guidelines to make sure that designs reach the desired dependability level, without explicit instructions. In the end, the success of the design in fulfilling the standard is the result of a thorough verification process. Naturally, the goal of any verification team dealing with such important designs is complete coverage as well as standards conformity, but as these are complex hardware designs, complete functional verification is a difficult task. Of the several techniques that exist to verify hardware, each with its pros and cons, we studied six that are well established in academia and in industry. We can divide them into two categories: simulation, which needs extremely large amounts of time, and formal verification, which needs unrealistic amounts of resources. Therefore, we conclude that a hybrid approach offers the best balance between simulation (time) and formal verification (resources).

  10. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments, however the limited literature reveals most Lean techniques within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates: system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.

  11. Standardization for climate change. Approaches and perspectives. Final report

    International Nuclear Information System (INIS)

    Weterings, R.

    1995-01-01

    This report presents the results of a project aimed at supporting the environmental quality target for climate policy at the national level (the Dutch government's Follow-up Memorandum on Climatic Change) and at the international level (elaboration of the UN Framework Convention on Climate Change). In order to generate ideas for climate policy standards, a workshop on Environmental Quality and Climate was held. During the workshop, standards for climatic change were highlighted from different points of view. Those views and perspectives are analyzed and the results are presented in this report. 4 figs., 31 refs.

  12. An holistic approach to the problem of reactor ageing

    International Nuclear Information System (INIS)

    Phythian, W.; McElroy, R.; Druce, S.; Kovan, D.

    1992-01-01

    Understanding the process of ageing in reactors is essential to extending their lives beyond the original design. To present a sound case - particularly regarding the level of embrittlement in reactor vessels due to radiation damage - an integrated approach using advanced assessment tools is needed. The techniques developed for this purpose involve, on the microscopic level, advanced neutron dosimetry and high-resolution measurement techniques (e.g. advanced electron beam techniques and small-angle neutron scattering), with which the radiation damage and the microstructural state of the steel can be analysed. Using test procedures (tensile, fracture toughness and Charpy impact) on standard and sub-sized specimens, the extent of radiation degradation can be characterised. Finally, it is possible to predict how the degradation will evolve using physically-based models of embrittlement. (Author)

  13. A systematic comparison of motion artifact correction techniques for functional near-infrared spectroscopy.

    Science.gov (United States)

    Cooper, Robert J; Selb, Juliette; Gagnon, Louis; Phillip, Dorte; Schytz, Henrik W; Iversen, Helle K; Ashina, Messoud; Boas, David A

    2012-01-01

    Near-infrared spectroscopy (NIRS) is susceptible to signal artifacts caused by relative motion between NIRS optical fibers and the scalp. These artifacts can be very damaging to the utility of functional NIRS, particularly in challenging subject groups where motion can be unavoidable. A number of approaches to the removal of motion artifacts from NIRS data have been suggested. In this paper we systematically compare the utility of a variety of published NIRS motion correction techniques using a simulated functional activation signal added to 20 real NIRS datasets which contain motion artifacts. Principal component analysis, spline interpolation, wavelet analysis, and Kalman filtering approaches are compared to one another and to standard approaches using the accuracy of the recovered, simulated hemodynamic response function (HRF). Each of the four motion correction techniques we tested yields a significant reduction in the mean-squared error (MSE) and a significant increase in the contrast-to-noise ratio (CNR) of the recovered HRF when compared to no correction and compared to a process of rejecting motion-contaminated trials. Spline interpolation produces the largest average reduction in MSE (55%) while wavelet analysis produces the highest average increase in CNR (39%). On the basis of this analysis, we recommend the routine application of motion correction techniques (particularly spline interpolation or wavelet analysis) to minimize the impact of motion artifacts on functional NIRS data.
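    The comparison metrics named in the abstract are simple to state. The sketch below shows one plausible way to compute the MSE and CNR of a recovered HRF against the known simulated HRF; the exact CNR convention used by the authors may differ, and the signals here are synthetic.

```python
import numpy as np

def recovery_metrics(hrf_true, hrf_recovered):
    """Mean-squared error and a simple contrast-to-noise ratio of a recovered HRF."""
    err = hrf_recovered - hrf_true
    mse = np.mean(err ** 2)
    cnr = np.max(np.abs(hrf_true)) / np.std(err)   # one common CNR convention
    return mse, cnr

# Synthetic example: gamma-shaped HRF plus residual noise left after correction
t = np.arange(0, 20, 0.1)
hrf_true = t ** 5 * np.exp(-t) / 120.0
rng = np.random.default_rng(3)
hrf_recovered = hrf_true + rng.normal(scale=0.05, size=t.size)

mse, cnr = recovery_metrics(hrf_true, hrf_recovered)
print(f"MSE = {mse:.4f}, CNR = {cnr:.2f}")
```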

  14. Does the Responsive Classroom Approach Affect the Use of Standards-Based Mathematics Teaching Practices?: Results from a Randomized Controlled Trial

    Science.gov (United States)

    Ottmar, Erin R.; Rimm-Kaufman, Sara E.; Berry, Robert Q.; Larsen, Ross A.

    2013-01-01

    This study highlights the connections between two facets of teachers' skills--those supporting teachers' mathematical instructional interactions and those underlying social interactions within the classroom. The impact of the Responsive Classroom (RC) approach and use of RC practices on the use of standards-based mathematics teaching practices was…

  15. Sub-Frequency Interval Approach in Electromechanical Impedance Technique for Concrete Structure Health Monitoring

    Directory of Open Access Journals (Sweden)

    Bahador Sabet Divsholi

    2010-12-01

    Full Text Available The electromechanical (EM) impedance technique using piezoelectric lead zirconate titanate (PZT) transducers for structural health monitoring (SHM) has attracted considerable attention in various engineering fields. In the conventional EM impedance technique, the EM admittance of a PZT transducer is used as a damage indicator. Statistical analysis methods such as root mean square deviation (RMSD) have been employed to associate the damage level with the changes in the EM admittance signatures, but it is difficult to determine the location of damage using such methods. This paper proposes a new approach by dividing the large frequency range (30–400 kHz) into sub-frequency intervals and calculating their respective RMSD values. The RMSD of the sub-frequency intervals (RMSD-S) is used to study the severity and location of damage. An experiment is carried out on a real-size concrete structure subjected to artificial damage. It is observed that damage close to the PZT changes the RMSD-S in the high frequency range significantly, while damage far away from the PZT changes the RMSD-S in the low frequency range significantly. The relationship between the frequency range and the PZT sensing region is also presented. Finally, a damage identification scheme is proposed to estimate the location and severity of damage in concrete structures.
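    The sub-frequency-interval RMSD (RMSD-S) described above can be written as the usual EMI RMSD statistic evaluated band by band. The numpy sketch below divides a conductance signature into equal intervals and reports the RMSD of each; the signatures are synthetic and the band layout is an assumption rather than the authors' exact choice.

```python
import numpy as np

def rmsd_sub_intervals(freq, g_baseline, g_damaged, n_intervals):
    """Percentage RMSD of the damaged vs. baseline signature in each sub-band."""
    edges = np.linspace(freq.min(), freq.max(), n_intervals + 1)
    out = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (freq >= lo) & (freq <= hi)
        num = np.sum((g_damaged[m] - g_baseline[m]) ** 2)
        den = np.sum(g_baseline[m] ** 2)
        out.append(100.0 * np.sqrt(num / den))
    return np.array(out)

# Synthetic admittance signatures over 30-400 kHz with a local change near 350 kHz
freq = np.linspace(30e3, 400e3, 2000)
rng = np.random.default_rng(4)
g_base = 1e-3 * (1 + 0.1 * np.sin(freq / 2e4)) + 1e-5 * rng.normal(size=freq.size)
g_dmg = g_base + 2e-4 * np.exp(-((freq - 3.5e5) / 1.5e4) ** 2)   # damage-induced shift

print(rmsd_sub_intervals(freq, g_base, g_dmg, n_intervals=8))
```

    A band whose RMSD-S stands out indicates in which part of the spectrum, and hence roughly at what distance from the PZT, the damage-induced change is concentrated, which is the localisation idea the paper develops.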

  16. Approaches to Increasing Ethical Compliance in China with Drug Trial Standards of Practice

    DEFF Research Database (Denmark)

    Rosenberg, Jacob

    2016-01-01

    Zeng et al.'s Ethics Review highlights some of the challenges associated with clinical research in China. They found that only a minority of published clinical trials of anti-dementia drugs reported that they fulfilled the basic ethical principles as outlined in the Declaration of Helsinki. With recent reports of scientific misconduct from China, there is an urgent need to find approaches to compel researchers to adhere to ethical research practices. This problem does not call for a simple solution, but if forces are joined with governmental regulations, education in ethics issues for medical researchers, and strong reinforcement by Chinese journal editors not to publish studies with these flaws, then research ethics and publication standards will probably improve. Other solutions to foster ethical practice of drug trials are discussed, including Chinese initiatives directed at managing conflict…

  17. Fluorescent standards for photodynamic therapy

    Science.gov (United States)

    Belko, N.; Kavalenka, S.; Samtsov, M.

    2016-08-01

    Photodynamic therapy is an evolving technique for the treatment of various oncological diseases. This method employs photosensitizers, i.e. species that cause the death of tumor cells after photoactivation. For further development and novel applications of photodynamic therapy, new photosensitizers are required. After synthesis of a new photosensitizer it is important to know its concentration in different biological tissues after its administration and distribution. The concentration is frequently measured by the extraction method, which has some disadvantages, e.g. it requires many biological test subjects that are euthanized during the measurement. We propose to measure the photosensitizer concentration in tissue by its fluorescence. For this purpose fluorescent standards were developed. The standards are robust and simple to produce; their fluorescence signal does not change with time. The fluorescence intensity of the fluorescent standards appears to depend linearly on the dye concentration. A set of standards thus allows the calibration of a spectrometer. Finally, the photosensitizer concentration can be determined from the fluorescence intensity by comparing the corresponding spectrum with the spectra of the set of fluorescent standards. A biological test subject is not euthanized during this kind of experiment. We hope this more humane technique can be used in the future instead of the extraction method.
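
    The calibration step described above amounts to a simple linear fit that is then inverted; the sketch below (illustrative only, with invented numbers) shows the idea of estimating an unknown photosensitizer concentration from its fluorescence intensity via a set of standards.

import numpy as np

# Hedged sketch of a linear fluorescence calibration; values are invented.
conc_standards = np.array([0.5, 1.0, 2.0, 4.0, 8.0])                  # known concentrations
intensity_standards = np.array([120.0, 235.0, 480.0, 950.0, 1900.0])  # measured signals

# Fit intensity = slope * concentration + offset by least squares.
slope, offset = np.polyfit(conc_standards, intensity_standards, deg=1)

def estimate_concentration(measured_intensity):
    """Invert the calibration line to estimate concentration from intensity."""
    return (measured_intensity - offset) / slope

print(estimate_concentration(700.0))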

  18. Concept of Draft International Standard for a Unified Approach to Space Program Quality Assurance

    Science.gov (United States)

    Stryzhak, Y.; Vasilina, V.; Kurbatov, V.

    2002-01-01

    In the absence of a unified approach to guaranteeing space project and product quality assurance, implementation of many international space programs has become a challenge. Globalization of the aerospace industry and the participation of various international ventures with diverse quality assurance requirements in large international space programs call for the urgent development of unified international standards in this field. To ensure successful fulfillment of space missions, aerospace companies should design and process reliable and safe products with properties complying with or exceeding the user's (or customer's) requirements. The quality of products designed or processed by subcontractors (or other suppliers) should also comply with the main user's (customer's) requirements. Implementation of this involved set of unified requirements will be made possible by creating and approving a system (series) of international standards under the generic title Space Product Quality Assurance, based on a system consensus principle. Conceptual features of the baseline standard in this system (series) should comprise: - Procedures for ISO 9000, CEN and ECSS requirements adaptation and introduction into space product creation, design, manufacture, testing and operation; - Procedures for quality assurance at initial (design) phases of space programs, with a decision on the end product made based on the principle of independence; - Procedures to arrange incoming inspection of products delivered by subcontractors (including testing, audit of supplier's procedures, review of supplier's documentation), and space product certification; - Procedures to identify materials and primary products applied; - Procedures for quality system audit at the component part, primary product and materials supplier facilities; - Unified procedures to form a list of basic performances to be under configuration management; - Unified procedures to form a list of critical space product components, and unified

  19. Approach to the problem of combined radiation and environmental effect standardization

    International Nuclear Information System (INIS)

    Burykina, L.N.; Ajzina, N.L.; Vasil'eva, L.A.; Veselovskaya, K.A.; Likhachev, Yu.P.; Ponomareva, V.L.; Satarina, S.M.; Shmeleva, E.V.

    1978-01-01

    Rats were used to study combined forms of damage caused by radioactive substances with various types of distribution ( 131 I and 147 Pm) and by external radiation sources (γ, X). Damage caused by radiation and dust factors was also studied. Synergism of the combined effect of the tolerance dose of 147 Pm introduced and preceding external general γ-irradiation was determined. The combined action of 131 I and external γ- and X-ray radiation exhibited an additive effect on rat thyroid glands. The combined action of dust and radiation factors showed that the biological effect depended on the dose absorbed in a critical organ (lungs). The results of the investigations point to the important role of critical organs (systems) and the degree of their radiosensitivity in the response of the body to combined internal and external irradiations. The facts presented show that the approach to standardizing radiation factors from the position of partial summation should be changed. This may be accomplished by using a combination factor which is determined experimentally and reflects the relative biological efficiency of the combined effects as compared to separate ones

  20. Combined approach branchial sinusectomy: a new technique for excision of second branchial cleft sinus.

    Science.gov (United States)

    Olusesi, A D

    2009-10-01

    Branchial cleft anomalies are well described, with the second arch anomaly being the commonest. Following surgical excision, recurrence occurs in 2 to 22 per cent of cases, and is believed to be due largely to incomplete resection. This report aims to describe a simple surgical technique for treatment of second branchial cleft sinus in the older paediatric age group and adults. An 11-year-old girl underwent surgical excision of a second branchial sinus. Prior to surgery, she was assessed by means of an imaging sonogram, and by direct methylene blue dye injection into the sinus on the operating table, followed by insertion of a metallic probe. Dissection was of the 'step ladder' incision type, but the incision was completed via an oropharyngeal approach. Histological examination of the lesion after excision established the diagnosis. No recurrence had been observed at the time of writing. Although they are congenital lesions, second branchial cleft abnormalities usually present in the older paediatric age group or even in adulthood. In the case reported, a simple combined approach ensured completeness of resection.

  1. Laparoscopic anterior versus endoscopic posterior approach for adrenalectomy: a shift to a new golden standard?

    Science.gov (United States)

    Vrielink, O M; Wevers, K P; Kist, J W; Borel Rinkes, I H M; Hemmer, P H J; Vriens, M R; de Vries, J; Kruijff, S

    2017-08-01

    There has been an increased utilization of the posterior retroperitoneal approach (PRA) for adrenalectomy alongside the "classic" laparoscopic transabdominal technique (LTA). The aim of this study was to compare both procedures based on outcome variables at various ranges of tumor size. A retrospective analysis was performed on 204 laparoscopic transabdominal (UMC Groningen) and 57 retroperitoneal (UMC Utrecht) adrenalectomies between 1998 and 2013. We applied a univariate and multivariate regression analysis. Mann-Whitney and chi-squared tests were used to compare outcome variables between both approaches. Both mean operation time and median blood loss were significantly lower in the PRA group with 102.1 (SD 33.5) vs. 173.3 (SD 59.1) minutes (p < 0.001) and 0 (0-200) vs. 50 (0-1000) milliliters (p < 0.001), respectively. The shorter operation time in PRA was independent of tumor size. Complication rates were higher in the LTA (19.1%) compared to PRA (8.8%). There was no significant difference in recovery time between both approaches. Application of the PRA decreases operation time, blood loss, and complication rates compared to LTA. This might encourage institutions that use the LTA to start using PRA in patients with adrenal tumors, independent of tumor size.

  2. Development, improvement and calibration of neutronic reaction rates measurements: elaboration of a standard techniques basis

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-06-01

    In order to improve and validate neutronics calculation schemes, perfecting integral measurements of neutronics parameters is necessary. This thesis focuses on the conception, improvement and development of neutronics reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. These techniques are then applied to the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on the discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1 %. Finally, the necessity of those calibrations will be shown through spectral indices measurements in core MISTRAL 1 (UO 2 ) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, will be validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of 238 U (defined as the ratio of the 238 U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1 %. The extension of this technique to future modified conversion ratio measurements for 242 Pu (on MOX rods) and 232 Th (on
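
    Stated as a formula (notation ours, not taken from the thesis), the modified conversion ratio measured in the second part is simply

    C^{*} \;=\; \frac{R_{c}\!\left(^{238}\mathrm{U}\right)}{\sum_{i} R_{f,i}}
    % R_c(238U): capture rate in 238U; R_{f,i}: fission rate of fissile isotope i (assumed notation).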

  3. Assessing the cleanliness of surfaces: Innovative molecular approaches vs. standard spore assays

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, M.; Duc, M.T. La; Probst, A.; Vaishampayan, P.; Stam, C.; Benardini, J.N.; Piceno, Y.M.; Andersen, G.L.; Venkateswaran, K.

    2011-04-01

    A bacterial spore assay and a molecular DNA microarray method were compared for their ability to assess relative cleanliness in the context of bacterial abundance and diversity on spacecraft surfaces. Colony counts derived from the NASA standard spore assay were extremely low for spacecraft surfaces. However, the PhyloChip generation 3 (G3) DNA microarray resolved the genetic signatures of a highly diverse suite of microorganisms in the very same sample set. Samples completely devoid of cultivable spores were shown to harbor the DNA of more than 100 distinct microbial phylotypes. Furthermore, samples with higher numbers of cultivable spores did not necessarily give rise to a greater microbial diversity upon analysis with the DNA microarray. The findings of this study clearly demonstrated that there is not a statistically significant correlation between the cultivable spore counts obtained from a sample and the degree of bacterial diversity present. Based on these results, it can be stated that validated state-of-the-art molecular techniques, such as DNA microarrays, can be utilized in parallel with classical culture-based methods to further describe the cleanliness of spacecraft surfaces.

  4. Late effects of craniospinal irradiation for standard risk medulloblastoma in paediatric patients: A comparison of treatment techniques

    International Nuclear Information System (INIS)

    Leman, J.

    2016-01-01

    Background: Survival rates for standard risk medulloblastoma are favourable, but the craniospinal irradiation (CSI) necessary to eradicate microscopic spread causes life-limiting late effects. Aims: The aim of this paper is to compare CSI techniques in terms of toxicity and quality of life for survivors. Methods and materials: A literature search was conducted using synonyms of ‘medulloblastoma’, ‘craniospinal’, ‘radiotherapy’ and ‘side effects’ to highlight 29 papers that would facilitate this discussion. Results and discussion: Intensity modulated radiotherapy (IMRT), tomotherapy and protons all provide CSI which can reduce dose to normal tissue; however, photon methods cannot eliminate exit dose as well as protons can. Research for each technique requires longer term follow up in order to prove that survival rates remain high whilst reducing late effects. Findings/conclusion: Proton therapy is the superior method of CSI in terms of late effects, but more research is needed to evidence this. Until proton therapy is available in the UK, IMRT should be utilised. - Highlights: • Craniospinal irradiation is vital in the treatment of medulloblastoma. • Survivors often suffer long term side effects which reduce quality of life. • Tomotherapy, IMRT and proton therapy reduce late effects by sparing normal tissue. • Proton therapy offers superior dose distribution but further research is necessary. • IMRT should be employed for photon radiotherapy.

  5. Development of isotope dilution-liquid chromatography/mass spectrometry combined with standard addition techniques for the accurate determination of tocopherols in infant formula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joonhee; Jang, Eun-Sil; Kim, Byungjoo, E-mail: byungjoo@kriss.re.kr

    2013-07-17

    Highlights: • ID-LC/MS method showed biased results for tocopherols analysis in infant formula. • H/D exchange of deuterated tocopherols in sample preparation was the source of bias. • Standard addition (SA)-ID-LC/MS was developed as an alternative to ID-LC/MS. • Details of calculation and uncertainty evaluation of the SA-IDMS were described. • SA-ID-LC/MS showed a higher-order metrological quality as a reference method. -- Abstract: During the development of isotope dilution-liquid chromatography/mass spectrometry (ID-LC/MS) for tocopherol analysis in infant formula, biased measurement results were observed when deuterium-labeled tocopherols were used as internal standards. It turned out that the biases came from intermolecular H/D exchange and intramolecular H/D scrambling of the internal standards during sample preparation. The degree of H/D exchange and scrambling showed considerable dependence on the sample matrix. Standard addition-isotope dilution mass spectrometry (SA-IDMS) based on LC/MS was developed in this study to overcome the shortcomings of using deuterium-labeled internal standards while the inherent advantage of isotope dilution techniques is utilized for accurate recovery correction in the sample preparation processes. Details of the experimental scheme, calculation equations, and uncertainty evaluation scheme are described in this article. The proposed SA-IDMS method was applied to several infant formula samples to test its validity. The method was proven to have a higher-order metrological quality, providing very accurate and precise measurement results.
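
    The standard-addition part of the SA-IDMS scheme can be illustrated with a few lines of code; the sketch below (our illustration, omitting the isotope-dilution correction, with invented numbers) fits the spiked-sample responses linearly and extrapolates to zero signal to recover the native analyte concentration.

import numpy as np

# Hedged sketch of standard-addition quantification; values are invented.
added = np.array([0.0, 2.0, 4.0, 6.0, 8.0])        # analyte added (mg/100 g)
signal = np.array([1.10, 1.95, 2.82, 3.71, 4.55])  # instrument response

slope, intercept = np.polyfit(added, signal, deg=1)

# The magnitude of the x-intercept of the fitted line equals the analyte
# concentration originally present in the (unspiked) sample.
native_concentration = intercept / slope
print(f"Estimated native concentration: {native_concentration:.2f} mg/100 g")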

  6. Signal Morphing techniques and possible application to Higgs properties measurements

    CERN Document Server

    Ecker, Katharina Maria; The ATLAS collaboration

    2016-01-01

    One way of describing deviations from the Standard Model is via Effective Field Theories or pseudo-observables, where higher order operators modify the couplings and the kinematics of the interactions of Standard Model particles. Generating Monte Carlo events for every testable set of parameters of such a theory would require computing resources beyond the ones currently available in ATLAS. Up to now, matrix-element based reweighting techniques have often been used to model Beyond Standard Model processes starting from Standard Model simulated events. In this talk, we review the advantages and the limitations of morphing techniques to construct a continuous probability model for signal parameters, interpolating between a finite number of distributions obtained from the simulation chain. The technique will be exemplified by searching for deviations from the Standard Model predictions in Higgs properties measurements.
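
    A minimal way to picture the morphing idea is linear interpolation between two simulated templates of an observable as a function of one signal parameter; the sketch below is a toy of our own (real ATLAS morphing is more elaborate), with invented bin contents.

import numpy as np

# Hedged toy of template morphing: predict the distribution at an untested
# coupling value by interpolating between templates simulated at 0.0 and 1.0.
template_at_0 = np.array([120.0, 300.0, 220.0, 90.0, 30.0])
template_at_1 = np.array([100.0, 260.0, 250.0, 130.0, 60.0])

def morphed_template(coupling):
    """Linear (vertical) morphing between the two simulated templates."""
    return (1.0 - coupling) * template_at_0 + coupling * template_at_1

print(morphed_template(0.35))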

  7. Radical prostatectomy: evolution of surgical technique from the laparoscopic point of view

    Directory of Open Access Journals (Sweden)

    Xavier Cathelineau

    2010-04-01

    PURPOSE: To review the literature and present a current picture of the evolution of radical prostatectomy from the laparoscopic point of view. MATERIALS AND METHODS: We conducted an extensive Medline literature search. Articles obtained regarding laparoscopic radical prostatectomy (LRP) and our experience at Institut Montsouris were used for reassessing anatomical and technical issues in radical prostatectomy. RESULTS: LRP nuances were reassessed by surgical teams in order to verify possible weaknesses in their performance. Our basic approach was to carefully study the anatomy and the pioneering open surgery descriptions in order to standardize and master a technique. The learning curve is presented in terms of an objective evaluation of outcomes for cancer control and functional results. In terms of technique-outcomes, there are several key elements in radical prostatectomy, such as dorsal vein control-apex exposure and nerve sparing, with particular implications for oncological and functional results. Major variations among the surgical teams' performance and follow-up prevented objective comparisons in radical prostatectomy. The remarkable evolution of LRP needs to be supported by comprehensive results. CONCLUSIONS: Radical prostatectomy is a complex surgical operation with difficult objectives. Surgical technique should be standardized in order to allow an adequate and reliable performance in all settings, keeping in mind that cancer control remains the primary objective. Reassessing anatomy and a return to basics in surgical technique is the means to improve outcomes and overcome the difficult task of the learning curve, especially in minimal access urological surgery.

  8. A Network Coding Approach to Loss Tomography

    DEFF Research Database (Denmark)

    Sattari, Pegah; Markopoulou, Athina; Fragouli, Christina

    2013-01-01

    …network coding capabilities. We design a framework for estimating link loss rates, which leverages network coding capabilities, and we show that it improves several aspects of tomography, including the identifiability of links, the tradeoff between estimation accuracy and bandwidth efficiency, and the complexity of probe path selection. We discuss the cases of inferring the loss rates of links in a tree topology or in a general topology. In the latter case, the benefits of our approach are even more pronounced compared to standard techniques, but we also face novel challenges, such as dealing with cycles…

  9. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study

    KAUST Repository

    MacLean, Adam L.

    2015-12-16

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
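
    As a pocket-sized example of the ODE-plus-inference workflow surveyed in the chapter, the sketch below (our toy, not a Wnt model) simulates a one-variable turnover model and recovers its two parameters from noisy synthetic data by least squares.

import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

# Hedged toy model: dy/dt = production - degradation * y.
def model(y, t, production, degradation):
    return production - degradation * y

def simulate(t, production, degradation, y0=0.0):
    return odeint(model, y0, t, args=(production, degradation)).ravel()

# Synthetic noisy data generated from "true" parameters.
t = np.linspace(0.0, 10.0, 50)
true_params = (2.0, 0.5)
data = simulate(t, *true_params) + 0.1 * np.random.randn(t.size)

# Least-squares parameter estimation (a Bayesian treatment would replace this step).
estimates, _ = curve_fit(simulate, t, data, p0=(1.0, 1.0))
print(estimates)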

  10. Analytical and laser scanning techniques to determine shape properties of aggregates used in pavements

    CSIR Research Space (South Africa)

    Komba, Julius J

    2013-06-01

    …and volume of an aggregate particle, the sphericity computed by using orthogonal dimensions of an aggregate particle, and the flat and elongated ratio computed by using the longest and smallest dimensions of an aggregate particle. The second approach employed… Further validation of the laser-based technique was achieved by correlating the laser-based aggregate form indices with the results from two current standard tests: the flakiness index and the flat and elongated particles ratio tests. The laser…

  11. Magnetic Resonance Fingerprinting - a promising new approach to obtain standardized imaging biomarkers from MRI.

    Science.gov (United States)

    2015-04-01

    Current routine MRI examinations rely on the acquisition of qualitative images that have a contrast "weighted" for a mixture of (magnetic) tissue properties. Recently, a novel approach was introduced, namely MR Fingerprinting (MRF), with a completely different approach to data acquisition, post-processing and visualization. Instead of using a repeated, serial acquisition of data for the characterization of individual parameters of interest, MRF uses a pseudo-randomized acquisition that causes the signals from different tissues to have a unique signal evolution or 'fingerprint' that is simultaneously a function of the multiple material properties under investigation. The processing after acquisition involves a pattern recognition algorithm to match the fingerprints to a predefined dictionary of predicted signal evolutions. These can then be translated into quantitative maps of the magnetic parameters of interest. MRF is a technique that could theoretically be applied to most traditional qualitative MRI methods, replacing them with the acquisition of truly quantitative tissue measures. MRF is thereby expected to be much more accurate and reproducible than traditional MRI and should improve multi-center studies and significantly reduce reader bias when diagnostic imaging is performed. Key Points • MR fingerprinting (MRF) is a new approach to data acquisition, post-processing and visualization. • MRF provides highly accurate quantitative maps of T1, T2, proton density and diffusion. • MRF may offer multiparametric imaging with high reproducibility and high potential for multicenter/multivendor studies.
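
    The dictionary-matching step at the heart of MRF reduces to finding the simulated signal evolution with the largest normalized inner product against the measured one; the sketch below is our illustration with a random placeholder dictionary, not a clinical implementation.

import numpy as np

# Hedged sketch of MRF dictionary matching.
def match_fingerprint(measured, dictionary, parameters):
    """Return the parameter entry (e.g. T1, T2) of the best-matching dictionary signal."""
    dict_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    meas_norm = measured / np.linalg.norm(measured)
    scores = dict_norm @ meas_norm           # normalized inner products
    return parameters[int(np.argmax(scores))]

# Toy dictionary: 1000 simulated evolutions of length 500, tagged with (T1, T2) in ms.
rng = np.random.default_rng(0)
dictionary = rng.standard_normal((1000, 500))
parameters = rng.uniform([300.0, 20.0], [2000.0, 300.0], size=(1000, 2))
measured = dictionary[42] + 0.1 * rng.standard_normal(500)
print(match_fingerprint(measured, dictionary, parameters))  # close to entry 42's (T1, T2)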

  12. Comparison of Two Different Techniques of Cooperative Learning Approach: Undergraduates' Conceptual Understanding in the Context of Hormone Biochemistry

    Science.gov (United States)

    Mutlu, Ayfer

    2018-01-01

    The purpose of the research was to compare the effects of two different techniques of the cooperative learning approach, namely Team-Game Tournament and Jigsaw, on undergraduates' conceptual understanding in a Hormone Biochemistry course. Undergraduates were randomly assigned to Group 1 (N = 23) and Group 2 (N = 29). Instructions were accomplished…

  13. Development, improvement and calibration of neutronic reaction rate measurements: elaboration of a base of standard techniques

    International Nuclear Information System (INIS)

    Hudelot, J.P.

    1998-01-01

    In order to improve and validate neutronic calculation schemes, perfecting integral measurements of neutronic parameters is necessary. This thesis focuses on the conception, improvement and development of neutronic reaction rate measurements, and aims at building a base of standard techniques. Two subjects are discussed. The first one deals with direct measurements by fission chambers. A short presentation of the different usual techniques is given. These techniques are then applied to the example of doubling time measurements on the EOLE facility during the MISTRAL 1 experimental programme. Two calibration devices for fission chambers are developed: a thermal column located in the central part of the MINERVE facility, and a calibration cell using a pulsed high flux neutron generator and based on the discrimination of the energy of the neutrons with a time-of-flight method. This second device will soon allow the mass of fission chambers to be measured with a precision of about 1 %. Finally, the necessity of those calibrations will be shown through spectral indices measurements in core MISTRAL 1 (UO 2 ) and MISTRAL 2 (MOX) of the EOLE facility. In each case, the associated calculation schemes, performed using the Monte Carlo MCNP code with the ENDF-BV library, will be validated. Concerning the second subject, the goal is to develop a method for measuring the modified conversion ratio of 238 U (defined as the ratio of the 238 U capture rate to the total fission rate) by gamma-ray spectrometry of fuel rods. Within the framework of the MISTRAL 1 and MISTRAL 2 programmes, the measurement device, the experimental results and the spectrometer calibration are described. Furthermore, the MCNP calculations of neutron self-shielding and gamma self-absorption are validated. It is finally shown that measurement uncertainties are better than 1 %. The extension of this technique to future modified conversion ratio measurements for 242 Pu (on MOX rods) and 232 Th (on Thorium

  14. SDSS-IV/MaNGA: SPECTROPHOTOMETRIC CALIBRATION TECHNIQUE

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Renbin; Sánchez-Gallego, José R. [Department of Physics and Astronomy, University of Kentucky, 505 Rose St., Lexington, KY 40506-0057 (United States); Tremonti, Christy; Bershady, Matthew A.; Eigenbrot, Arthur; Wake, David A. [Department of Astronomy, University of Winsconsin-Madison, 475 N. Charter Street, Madison, WI 53706-1582 (United States); Law, David R. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Schlegel, David J. [Physics Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720-8160 (United States); Bundy, Kevin [Kavli IPMU (WPI), UTIAS, The University of Tokyo, Kashiwa, Chiba 277-8583 (Japan); Drory, Niv [McDonald Observatory, Department of Astronomy, University of Texas at Austin, 1 University Station, Austin, TX 78712-0259 (United States); MacDonald, Nicholas [Department of Astronomy, Box 351580, University of Washington, Seattle, WA 98195 (United States); Bizyaev, Dmitry [Apache Point Observatory, P.O. Box 59, sunspot, NM 88349 (United States); Blanc, Guillermo A. [Departamento de Astronomía, Universidad de Chile, Camino el Observatorio 1515, Las Condes, Santiago (Chile); Blanton, Michael R.; Hogg, David W. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Cherinka, Brian [Dunlap Institute for Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, Ontario M5S 3H4 (Canada); Gunn, James E. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Harding, Paul [Department of Astronomy, Case Western Reserve University, Cleveland, OH 44106 (United States); Sánchez, Sebastian F., E-mail: yanrenbin@uky.edu [Instituto de Astronomia, Universidad Nacional Autonoma de Mexico, A.P. 70-264, 04510 Mexico D.F. (Mexico); and others

    2016-01-15

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA), one of three core programs in the Sloan Digital Sky Survey-IV, is an integral-field spectroscopic survey of roughly 10,000 nearby galaxies. It employs dithered observations using 17 hexagonal bundles of 2″ fibers to obtain resolved spectroscopy over a wide wavelength range of 3600–10300 Å. To map the internal variations within each galaxy, we need to perform accurate spectral surface photometry, which is to calibrate the specific intensity at every spatial location sampled by each individual aperture element of the integral field unit. The calibration must correct only for the flux loss due to atmospheric throughput and the instrument response, but not for losses due to the finite geometry of the fiber aperture. This requires the use of standard star measurements to strictly separate these two flux loss factors (throughput versus geometry), a difficult challenge with standard single-fiber spectroscopy techniques due to various practical limitations. Therefore, we developed a technique for spectral surface photometry using multiple small fiber-bundles targeting standard stars simultaneously with galaxy observations. We discuss the principles of our approach and how they compare to previous efforts, and we demonstrate the precision and accuracy achieved. MaNGA's relative calibration between the wavelengths of Hα and Hβ has an rms of 1.7%, while that between [N ii] λ6583 and [O ii] λ3727 has an rms of 4.7%. Using extinction-corrected star formation rates and gas-phase metallicities as an illustration, this level of precision guarantees that flux calibration errors will be sub-dominant when estimating these quantities. The absolute calibration is better than 5% for more than 89% of MaNGA's wavelength range.

  15. SDSS-IV/MaNGA: SPECTROPHOTOMETRIC CALIBRATION TECHNIQUE

    International Nuclear Information System (INIS)

    Yan, Renbin; Sánchez-Gallego, José R.; Tremonti, Christy; Bershady, Matthew A.; Eigenbrot, Arthur; Wake, David A.; Law, David R.; Schlegel, David J.; Bundy, Kevin; Drory, Niv; MacDonald, Nicholas; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael R.; Hogg, David W.; Cherinka, Brian; Gunn, James E.; Harding, Paul; Sánchez, Sebastian F.

    2016-01-01

    Mapping Nearby Galaxies at Apache Point Observatory (MaNGA), one of three core programs in the Sloan Digital Sky Survey-IV, is an integral-field spectroscopic survey of roughly 10,000 nearby galaxies. It employs dithered observations using 17 hexagonal bundles of 2″ fibers to obtain resolved spectroscopy over a wide wavelength range of 3600–10300 Å. To map the internal variations within each galaxy, we need to perform accurate spectral surface photometry, which is to calibrate the specific intensity at every spatial location sampled by each individual aperture element of the integral field unit. The calibration must correct only for the flux loss due to atmospheric throughput and the instrument response, but not for losses due to the finite geometry of the fiber aperture. This requires the use of standard star measurements to strictly separate these two flux loss factors (throughput versus geometry), a difficult challenge with standard single-fiber spectroscopy techniques due to various practical limitations. Therefore, we developed a technique for spectral surface photometry using multiple small fiber-bundles targeting standard stars simultaneously with galaxy observations. We discuss the principles of our approach and how they compare to previous efforts, and we demonstrate the precision and accuracy achieved. MaNGA's relative calibration between the wavelengths of Hα and Hβ has an rms of 1.7%, while that between [N ii] λ6583 and [O ii] λ3727 has an rms of 4.7%. Using extinction-corrected star formation rates and gas-phase metallicities as an illustration, this level of precision guarantees that flux calibration errors will be sub-dominant when estimating these quantities. The absolute calibration is better than 5% for more than 89% of MaNGA's wavelength range

  16. Standard guide for computed radiography

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This guide provides general tutorial information regarding the fundamental and physical principles of computed radiography (CR), definitions and terminology required to understand the basic CR process. An introduction to some of the limitations that are typically encountered during the establishment of techniques and basic image processing methods are also provided. This guide does not provide specific techniques or acceptance criteria for specific end-user inspection applications. Information presented within this guide may be useful in conjunction with those standards of 1.2. 1.2 CR techniques for general inspection applications may be found in Practice E2033. Technical qualification attributes for CR systems may be found in Practice E2445. Criteria for classification of CR system technical performance levels may be found in Practice E2446. Reference Images Standards E2422, E2660, and E2669 contain digital reference acceptance illustrations. 1.3 The values stated in SI units are to be regarded as the st...

  17. Analysis of two different surgical approaches for fractures of the mandibular condyle.

    Science.gov (United States)

    Kumaran, S; Thambiah, L J

    2012-01-01

    Fractures of the condyle account for one third of all mandibular fractures. The different surgical approaches to the condyle described hitherto testify to the advantages and disadvantages of the various surgical techniques used for approaching the condyle in such fractures. We have described and compared two such surgical techniques in this study. The aim of this study is to compare the outcomes of treating condylar fractures by two different surgical techniques: the mini retromandibular approach and the preauricular approach. A prospective study of 31 patients who had suffered mandibular condylar fractures was carried out. Of these, 26 patients had unilateral condylar fractures and 5 patients had bilateral fractures. Further, 19 of these patients were treated by the mini retromandibular approach and 12 by the preauricular approach. The treated patients were followed up and evaluated for a minimum period of 1 year and assessed for parameters such as maximum mouth opening, lateral movement on the fractured side, mandibular movements such as protrusion, dental occlusion, scar formation, facial nerve weakness, salivary fistula formation and the time taken for completion of the surgical procedure. The t-test was used for statistical analysis of the data obtained in the study. Dental occlusion was restored in all cases, and good anatomical reduction was achieved. The mean operating time was longer in the preauricular approach, at 63.53 ± 18.12 minutes (mean ± standard deviation, SD), than in the mini retromandibular approach, at 45.22 ± 18.86 minutes. Scar formation was satisfactory in almost all cases.

  18. Anesthetic technique for inferior alveolar nerve block: a new approach

    Science.gov (United States)

    PALTI, Dafna Geller; de ALMEIDA, Cristiane Machado; RODRIGUES, Antonio de Castro; ANDREO, Jesus Carlos; LIMA, José Eduardo Oliveira

    2011-01-01

    Background Effective pain control in Dentistry may be achieved by local anesthetic techniques. The success of the anesthetic technique in mandibular structures depends on the proximity of the needle tip to the mandibular foramen at the moment of anesthetic injection into the pterygomandibular region. Two techniques are available to reach the inferior alveolar nerve where it enters the mandibular canal, namely indirect and direct; these techniques differ in the number of movements required. Data demonstrate that the indirect technique is considered ineffective in 15% of cases and the direct technique in 13-29% of cases. Objective The aim of this study was to describe an alternative technique for inferior alveolar nerve block using several anatomical points for reference, simplifying the procedure and enabling greater success and a more rapid learning curve. Materials and Methods A total of 193 mandibles (146 with permanent dentition and 47 with primary dentition) from dry skulls were used to establish a relationship between the teeth and the mandibular foramen. By using two wires, the first passing through the mesiobuccal groove and middle point of the mesial slope of the distolingual cusp of the primary second molar or permanent first molar (right side), and the second following the occlusal plane (left side), a line can be achieved whose projection coincides with the left mandibular foramen. Results The obtained data showed correlation in 82.88% of cases using the permanent first molar, and in 93.62% of cases using the primary second molar. Conclusion This method is potentially effective for inferior alveolar nerve block, especially in Pediatric Dentistry. PMID:21437463

  19. Multimodality imaging techniques.

    Science.gov (United States)

    Martí-Bonmatí, Luis; Sopena, Ramón; Bartumeus, Paula; Sopena, Pablo

    2010-01-01

    In multimodality imaging, the need to combine morphofunctional information can be approached either by acquiring images at different times (asynchronous) and fusing them through digital image manipulation techniques, or by acquiring images simultaneously (synchronous) and merging them automatically. The asynchronous post-processing solution presents various constraints, mainly conditioned by the different positioning of the patient in the two scans acquired at different times on separate machines. The best consistency in time and space is obtained by synchronous image acquisition. There are many multimodal technologies in molecular imaging. In this review we will focus on those multimodality imaging techniques more commonly used in the field of diagnostic imaging (SPECT-CT, PET-CT) and on new developments (such as PET-MR). Technological innovations and the development of new tracers and smart probes are the main key points that will condition the future of multimodality imaging and of diagnostic imaging professionals. Although SPECT-CT and PET-CT are standard in most clinical scenarios, MR imaging has some advantages, providing excellent soft-tissue contrast and multidimensional functional, structural and morphological information. The next frontier is to develop efficient detectors and electronics systems capable of detecting two modality signals at the same time. Not only PET-MR but also MR-US or optic-PET will be introduced in clinical scenarios. Moreover, MR diffusion-weighted imaging, pharmacokinetic imaging, spectroscopy and functional BOLD imaging will merge with PET tracers to further advance molecular imaging as a relevant medical discipline. Multimodality imaging techniques will play a leading role in relevant clinical applications. The development of new diagnostic imaging research areas, mainly in the fields of oncology, cardiology and neuropsychiatry, will impact the way medicine is performed today. Both clinical and experimental multimodality studies, in

  20. X-ray fluorescence in Member States (Italy): Portable EDXRF in a multi-technique approach for the analyses of large paintings

    International Nuclear Information System (INIS)

    Ridolfi, Stefano

    2014-01-01

    Energy-dispersive X-ray fluorescence (EDXRF) in its portable form, generally characterized by a small X-ray tube and a Si-PIN or Si-drift detector, is particularly useful for analyzing works of art. The main aspect that characterizes the EDXRF technique is its non-invasive character. This characteristic, which makes the technique so powerful and appealing, is on the other hand the main source of uncertainty in XRF measurements on Cultural Heritage. The problem is even more evident when we analyze paintings because of their intrinsically stratigraphic nature. As a matter of fact, a painting is made of several layers: the support, which can be mainly of wood, canvas or paper; the preparation layer, mainly gypsum, white lead or ochre; the pigment layers; and finally the protective varnish layer. The penetrating power of X-rays means that most of the time information from all the layers reaches the detector. Much of the information in the spectrum comes from deep layers about which we have no clue. In order to better understand this concept, let us use the equation of A. Markowicz, in which the various uncertainties that influence analyses with portable EDXRF are reported. Let us adjust this equation for non-invasive portable EDXRF analysis. The second, the third and the fourth terms do not exist, for obvious reasons. Only the first and the last terms influence the total uncertainty of an EDXRF analysis. The ways to reduce the influence of the fifth term are known to any scientist: good stability of the system, long measuring time, correct standard samples, good energy resolution, etc. But what about the first term when we are executing a non-invasive analysis? An example that shows the influence of sample representativeness in increasing the uncertainty of an XRF analysis is the case in which we are asked to determine the original pigments used in a painting. If we have no clue where restoration areas are located on the painting, the probability of

  1. Transretroperitoneal CT-guided Embolization of Growing Internal Iliac Artery Aneurysm after Repair of Abdominal Aortic Aneurysm: A Transretroperitoneal Approach with Intramuscular Lidocaine Injection Technique

    Energy Technology Data Exchange (ETDEWEB)

    Park, Joon Young, E-mail: pjy1331@hanmail.net; Kim, Shin Jung, E-mail: witdd2@hanmail.net; Kim, Hyoung Ook, E-mail: chaos821209@hanmail.net [Chonnam National University Hospital, Department of Radiology (Korea, Republic of); Kim, Yong Tae, E-mail: mono-111@hanmail.net [Chonnam National University Hwasun Hospital, Department of Radiology (Korea, Republic of); Lim, Nam Yeol, E-mail: apleseed@hanmail.net; Kim, Jae Kyu, E-mail: kjkrad@jnu.ac.kr [Chonnam National University Hospital, Department of Radiology (Korea, Republic of); Chung, Sang Young, E-mail: sycpvts@jnu.ac.kr; Choi, Soo Jin Na, E-mail: choisjn@jnu.ac.kr; Lee, Ho Kyun, E-mail: mhaha@hanmail.net [Chonnam National University Hospital, Department of Surgery (Korea, Republic of)

    2015-02-15

    This study was designed to evaluate the efficacy and safety of CT-guided embolization of internal iliac artery aneurysm (IIAA) after repair of abdominal aortic aneurysm via a transretroperitoneal approach, using lidocaine injection into the iliacus muscle to create a window for a safe needle path. The procedure was performed in three patients. Transretroperitoneal access to the IIAA was successful in all three patients. In all three patients, the IIAA was first embolized using microcoils. The aneurysmal sac was then embolized with glue and coils without complication. With a mean follow-up of 7 months, the volume of the IIAAs remained stable without residual endoleaks. Transretroperitoneal CT-guided embolization of IIAA using the intramuscular lidocaine injection technique is effective, safe, and results in good outcomes.

  2. Good Practice Standards – a Regulation Tool

    DEFF Research Database (Denmark)

    Sørensen, Marie Jull

    2013-01-01

    The purpose of this article is to identify the considerations weighed in regulation with good practice standards. In this article, potential due process problems with regulation via legal standards are identified and compared to other considerations, which this regulation technique meets....

  3. An intercomparison of approaches for improving operational seasonal streamflow forecasts

    Science.gov (United States)

    Mendoza, Pablo A.; Wood, Andrew W.; Clark, Elizabeth; Rothwell, Eric; Clark, Martyn P.; Nijssen, Bart; Brekke, Levi D.; Arnold, Jeffrey R.

    2017-07-01

    For much of the last century, forecasting centers around the world have offered seasonal streamflow predictions to support water management. Recent work suggests that the two major avenues to advance seasonal predictability are improvements in the estimation of initial hydrologic conditions (IHCs) and the incorporation of climate information. This study investigates the marginal benefits of a variety of methods using IHCs and/or climate information, focusing on seasonal water supply forecasts (WSFs) in five case study watersheds located in the US Pacific Northwest region. We specify two benchmark methods that mimic standard operational approaches - statistical regression against IHCs and model-based ensemble streamflow prediction (ESP) - and then systematically intercompare WSFs across a range of lead times. Additional methods include (i) statistical techniques using climate information either from standard indices or from climate reanalysis variables and (ii) several hybrid/hierarchical approaches harnessing both land surface and climate predictability. In basins where atmospheric teleconnection signals are strong, and when watershed predictability is low, climate information alone provides considerable improvements. For those basins showing weak teleconnections, custom predictors from reanalysis fields were more effective in forecast skill than standard climate indices. ESP predictions tended to have high correlation skill but greater bias compared to other methods, and climate predictors failed to substantially improve these deficiencies within a trace weighting framework. Lower complexity techniques were competitive with more complex methods, and the hierarchical expert regression approach introduced here (hierarchical ensemble streamflow prediction - HESP) provided a robust alternative for skillful and reliable water supply forecasts at all initialization times. Three key findings from this effort are (1) objective approaches supporting methodologically

  4. Independent Auditor's Approach to the Concept of Fraud in Accounting Standards

    Directory of Open Access Journals (Sweden)

    Handan Bulca

    2015-07-01

    With increased control in the fields of standards, quality and safety, a reduction in fraud and error is envisaged. In parallel with these changes, under the conditions published in the Official Gazette dated 14 March 2014, an audit has become mandatory for the companies concerned. With the implementation of auditing standards, corporate companies will become more transparent, and information users, made more secure by the information reaching them under these standards, will be able to make sound decisions. The study also redefines the concept of the independent auditor and indicates the audit process an independent auditor should follow. The focus of the study is the risk of fraud and cheating, viewed in terms of Independent Auditing Standard Number 240.

  5. An unusual salvage technique for posterior tracheal membranous laceration associated with transhiatal esophagectomy: A transcervical–transsternal approach

    Directory of Open Access Journals (Sweden)

    Seyed Ziaeddin Rasihashemi

    2017-09-01

    Various surgical approaches may be employed for esophageal resection. Major airway injuries due to transhiatal esophagectomy include vertical tears in the membranous trachea. Tracheal injury is an uncommon but potentially fatal complication. This article describes a technique to repair a posterior membranous tracheal tear extending to just above the carina through a transcervical–transsternal approach, thereby avoiding a second thoracotomy. Six patients with posterior membranous tracheal injury underwent this procedure. The lacerations ranged from 3 cm to 5 cm in length. Four patients had received neoadjuvant chemoradiation. The management of the tracheal laceration added approximately 60 minutes to the total operation time. There was no mortality related to tracheal injury. Patients were followed up for 6 months after surgery, and both the posterior tracheal wall and the transverse tracheotomy remained intact without stenosis. The transcervical–transsternal approach decreases the need for thoracotomy and its complications in patients with tracheal laceration at any stage, even in cases of an extended tear down to the carina.

  6. A novel approach for dimension reduction of microarray.

    Science.gov (United States)

    Aziz, Rabia; Verma, C K; Srivastava, Namita

    2017-12-01

    This paper proposes a new hybrid search technique for feature (gene) selection (FS) using Independent Component Analysis (ICA) and the Artificial Bee Colony (ABC) algorithm, called ICA+ABC, to select informative genes based on a Naïve Bayes (NB) classifier. An important trait of this technique is the optimization of the ICA feature vector using ABC. ICA+ABC is a hybrid search algorithm that combines the benefits of an extraction approach, to reduce the size of the data, and a wrapper approach, to optimize the reduced feature vectors. The hybrid search technique is evaluated on six standard gene expression classification datasets. Extensive experiments were conducted to compare the performance of ICA+ABC with the results obtained from the recently published Minimum Redundancy Maximum Relevance (mRMR)+ABC algorithm for the NB classifier. To further assess how ICA+ABC performs as a feature selection method with the NB classifier, it was also compared with combinations of ICA with popular filter techniques and with other similar bio-inspired algorithms such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). The results show that ICA+ABC has a significant ability to generate small subsets of genes from the ICA feature vector that significantly improve the classification accuracy of the NB classifier compared to other previously suggested methods.
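
    To make the extraction-plus-wrapper structure of ICA+ABC concrete, the sketch below reduces a toy expression matrix with FastICA and then runs a simple greedy forward selection scored by cross-validated Naive Bayes accuracy; the greedy search stands in for the paper's Artificial Bee Colony optimizer, and all data are random placeholders.

import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

# Hedged sketch: ICA extraction followed by a wrapper selection with NB.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 2000))   # 100 samples x 2000 genes (toy data)
y = rng.integers(0, 2, 100)            # binary class labels

# Extraction step: reduce 2000 genes to 15 independent components.
components = FastICA(n_components=15, random_state=0).fit_transform(X)

# Wrapper step: greedy forward selection (stand-in for ABC) scored by NB accuracy.
selected, best_score = [], 0.0
for _ in range(5):
    best_candidate = None
    for j in range(components.shape[1]):
        if j in selected:
            continue
        score = cross_val_score(GaussianNB(), components[:, selected + [j]], y, cv=5).mean()
        if score > best_score:
            best_score, best_candidate = score, j
    if best_candidate is None:
        break
    selected.append(best_candidate)

print(selected, best_score)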

  7. Recent advances in fuel fabrication techniques and prospects for the nineties

    International Nuclear Information System (INIS)

    Frain, R.G.; Caudill, H.L.; Faulhaber, R.

    1987-01-01

    Advanced Nuclear Fuels Corporation's approach to and experience with the application of a flexible, just-in-time manufacturing philosophy to the production of customized nuclear fuel are described. Automation approaches to improve productivity are described. The transfer of technology across product lines is discussed, as well as the challenges presented by a multiple-product fabrication facility which produces a wide variety of BWR and PWR designs. This paper also describes the method of managing vendor quality control programs in support of standardization and clarity of documentation. Process simplification and the ensuing experience are discussed. Prospects for fabrication process advancements in the nineties are given, with emphasis on the benefits of dry conversion of UF 6 to UO 2 powder and increased use of automated and computerized inspection techniques. (author)

  8. Experiences with IR Top N Optimization in a Main Memory DBMS: Applying 'The Database Approach' in New Domains

    NARCIS (Netherlands)

    Read, B.; Blok, H.E.; de Vries, A.P.; Blanken, Henk; Apers, Peter M.G.

    Data abstraction and query processing techniques are usually studied in the domain of administrative applications. We present a case study in the non-standard domain of (multimedia) information retrieval, mainly intended as a feasibility study in favor of the 'database approach' to data management.

  9. Standardization of uveitis nomenclature for reporting clinical data. Results of the First International Workshop.

    Science.gov (United States)

    Jabs, Douglas A; Nussenblatt, Robert B; Rosenbaum, James T

    2005-09-01

    To begin a process of standardizing the methods for reporting clinical data in the field of uveitis. Consensus workshop. Members of an international working group were surveyed about diagnostic terminology, inflammation grading schema, and outcome measures, and the results used to develop a series of proposals to better standardize the use of these entities. Small groups employed nominal group techniques to achieve consensus on several of these issues. The group affirmed that an anatomic classification of uveitis should be used as a framework for subsequent work on diagnostic criteria for specific uveitic syndromes, and that the classification of uveitis entities should be on the basis of the location of the inflammation and not on the presence of structural complications. Issues regarding the use of the terms "intermediate uveitis," "pars planitis," "panuveitis," and descriptors of the onset and course of the uveitis were addressed. The following were adopted: standardized grading schema for anterior chamber cells, anterior chamber flare, and for vitreous haze; standardized methods of recording structural complications of uveitis; standardized definitions of outcomes, including "inactive" inflammation, "improvement" and "worsening" of the inflammation, and "corticosteroid sparing"; and standardized guidelines for reporting visual acuity outcomes. A process of standardizing the approach to reporting clinical data in uveitis research has begun, and several terms have been standardized.

  10. Pacifist approaches to conflict resolution: an overview of the principled pacifism

    Directory of Open Access Journals (Sweden)

    Oliveira, Gilberto Carvalho de

    2017-05-01

    This article explores pacifist approaches to conflict resolution based on principles, justifying the pacifist standard as grounded in actors' belief systems (spiritual and ethical principles). The article gives a brief overview of the history of the main traditions that shape the debate on pacifism and non-violence, highlighting the central references of principled pacifism (Mahatma Gandhi and Martin Luther King) and its main techniques and methods of conflict resolution.

  11. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife eRoebuck

    2015-08-01

    Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those affected are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events. This leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but with high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
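
    The MSE coefficients referred to above are sample entropies of progressively coarse-grained versions of the signal; the sketch below is a compact, illustrative implementation (the parameter choices m=2, r=0.2 and five scales are common defaults assumed here, and the input signal is synthetic).

import numpy as np

# Hedged sketch of multiscale entropy (MSE) features.
def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D signal with tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return np.sum(dists <= tol) - len(templates)   # exclude self-matches
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    """Coarse-grain x at each scale and return the sample entropy per scale."""
    x = np.asarray(x, dtype=float)
    out = []
    for s in scales:
        n = (len(x) // s) * s
        coarse = x[:n].reshape(-1, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return np.array(out)

# Toy usage on a synthetic audio-envelope-like signal.
signal = np.sin(np.linspace(0, 40, 600)) + 0.3 * np.random.randn(600)
print(multiscale_entropy(signal))   # feature vector a random forest could consume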

  12. Anesthetic technique for inferior alveolar nerve block: a new approach

    Directory of Open Access Journals (Sweden)

    Dafna Geller Palti

    2011-02-01

    Full Text Available BACKGROUND: Effective pain control in Dentistry may be achieved by local anesthetic techniques. The success of the anesthetic technique in mandibular structures depends on the proximity of the needle tip to the mandibular foramen at the moment of anesthetic injection into the pterygomandibular region. Two techniques are available to reach the inferior alveolar nerve where it enters the mandibular canal, namely indirect and direct; these techniques differ in the number of movements required. Data demonstrate that the indirect technique is considered ineffective in 15% of cases and the direct technique in 13-29% of cases. OBJECTIVE: The aim of this study was to describe an alternative technique for inferior alveolar nerve block using several anatomical points for reference, simplifying the procedure and enabling greater success and a more rapid learning curve. MATERIAL AND METHODS: A total of 193 mandibles (146 with permanent dentition and 47 with primary dentition) from dry skulls were used to establish a relationship between the teeth and the mandibular foramen. By using two wires, the first passing through the mesiobuccal groove and the middle point of the mesial slope of the distolingual cusp of the primary second molar or permanent first molar (right side), and the second following the occlusal plane (left side), a line can be obtained whose projection coincides with the left mandibular foramen. RESULTS: The obtained data showed correspondence in 82.88% of cases using the permanent first molar, and in 93.62% of cases using the primary second molar. CONCLUSION: This method is potentially effective for inferior alveolar nerve block, especially in Pediatric Dentistry.

  13. Comparison with Other Techniques

    Science.gov (United States)

    Sacco, Giovanni Maria; Ferré, Sébastien; Tzitzikas, Yannis

    This chapter compares dynamic taxonomies with the other main approaches to information access and discusses analogies and differences. The approaches analyzed range from traditional retrieval paradigms, such as queries on structured data, to the most recent approaches, including the current effort on the Semantic Web: queries on structured data, and OLAP data analysis techniques;

  14. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique: 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper also discusses the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
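
    As a sketch of those three figures of merit, the fragment below scores a hypothetical calibrated model against simulation-generated surrogate utility bills. NMBE and CV(RMSE) are the goodness-of-fit statistics commonly associated with BPI-2400 and ASHRAE Guideline 14; the bill values, parameter names, and savings numbers are all invented for illustration and are not taken from the paper.

        import numpy as np

        def goodness_of_fit(measured, predicted, p=1):
            # NMBE and CV(RMSE) in percent (sign conventions for NMBE vary).
            measured = np.asarray(measured, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            n, mean = len(measured), measured.mean()
            nmbe = 100.0 * (measured - predicted).sum() / ((n - p) * mean)
            cvrmse = 100.0 * np.sqrt(((measured - predicted) ** 2).sum() / (n - p)) / mean
            return nmbe, cvrmse

        # Figure of merit 3: fit to surrogate monthly utility data generated by the simulation.
        surrogate_bills = np.array([1200, 1100, 950, 800, 650, 600,
                                    620, 640, 700, 850, 1000, 1150], dtype=float)  # kWh, hypothetical
        calibrated_model = surrogate_bills * (1 + np.random.default_rng(1).normal(0, 0.03, 12))
        nmbe, cvrmse = goodness_of_fit(surrogate_bills, calibrated_model)

        # Figure of merit 2: closure on the 'true' input parameter values (hypothetical names).
        true_params = {"infiltration_ach": 0.50, "attic_r_value": 19.0}
        calibrated_params = {"infiltration_ach": 0.58, "attic_r_value": 22.0}
        closure = {k: abs(calibrated_params[k] - v) / v for k, v in true_params.items()}

        # Figure of merit 1: accuracy of the post-retrofit savings prediction.
        true_savings, predicted_savings = 2300.0, 2050.0  # kWh/yr, from the surrogate data set
        savings_error = abs(predicted_savings - true_savings) / true_savings

        print(f"NMBE = {nmbe:.1f}%, CV(RMSE) = {cvrmse:.1f}%")
        print("parameter closure:", closure, "savings error:", round(savings_error, 3))

    The same three quantities could then be compared across competing calibration techniques run against an identical surrogate data set, which is the comparison the method is designed to enable.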

  15. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze the design and recommend improvements. The results, if incorporated, would yield a 38% system-wide savings and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas, both in the government and in the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? And what would that cost? Using logic and a disciplined approach, the Value Analysis identifies a design that performs the necessary functions at high quality and the lowest overall cost

  16. When Is Hub Gene Selection Better than Standard Meta-Analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S.; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to
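
    The contrast the study draws can be made concrete with a toy comparison of the two ranking strategies. The sketch below is not the authors' WGCNA pipeline (which detects consensus modules in R); it simply ranks genes once by a Stouffer-combined meta-analysis Z score and once by whole-network connectivity minimized across data sets, as a simplified stand-in for intramodular hub status in consensus modules. All matrices, sample sizes, and gene counts are synthetic placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_sets, n_samples, n_genes = 3, 50, 200

        # Hypothetical expression matrices (samples x genes) and one trait per data set.
        data_sets = [rng.normal(size=(n_samples, n_genes)) for _ in range(n_sets)]
        traits = [rng.normal(size=n_samples) for _ in range(n_sets)]

        # (a) Standard meta-analysis: combine per-set gene-trait p-values (Stouffer's method).
        z_scores = np.zeros((n_sets, n_genes))
        for s, (X, y) in enumerate(zip(data_sets, traits)):
            for g in range(n_genes):
                r, p = stats.pearsonr(X[:, g], y)
                z_scores[s, g] = np.sign(r) * stats.norm.isf(p / 2)
        meta_z = z_scores.sum(axis=0) / np.sqrt(n_sets)
        meta_rank = np.argsort(-np.abs(meta_z))    # genes ranked by combined evidence

        # (b) Hub-style ranking: whole-network connectivity per set (a crude stand-in for
        # intramodular connectivity in consensus modules), taking the minimum across sets.
        connectivity = np.zeros((n_sets, n_genes))
        for s, X in enumerate(data_sets):
            corr = np.corrcoef(X, rowvar=False)
            connectivity[s] = np.abs(corr).sum(axis=0) - 1  # drop the self-correlation
        consensus_k = connectivity.min(axis=0)
        hub_rank = np.argsort(-consensus_k)        # genes ranked by consensus connectivity

        print("top meta-analysis genes:", meta_rank[:5])
        print("top consensus hub genes:", hub_rank[:5])

    In the actual studies each ranking would then be scored against the two criteria above: biological meaning of the top genes and reproducibility of the ranking in held-out data sets.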

  17. When is hub gene selection better than standard meta-analysis?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    Full Text Available Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques

  18. When is hub gene selection better than standard meta-analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to

  19. New Approaches to Airway Management in Tracheal Resections-A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Schieren, Mark; Böhmer, Andreas; Dusse, Fabian; Koryllos, Aris; Wappler, Frank; Defosse, Jerome

    2017-08-01

    Although endotracheal intubation, surgical crossfield intubation, and jet ventilation are standard techniques for airway management in tracheal resections, there are also reports of new approaches, ranging from regional anesthesia to extracorporeal support. The objective was to outline the entire spectrum of new airway techniques. The literature databases PubMed/Medline and the Cochrane Library were searched systematically for prospective and retrospective trials as well as case reports on tracheal resections. No restrictions applied to hospital types or settings. Adult patients undergoing surgical resections of noncongenital tracheal stenoses with end-to-end anastomoses. Airway management techniques were divided into conventional and new approaches and analyzed regarding their potential risks and benefits. A total of 59 publications (n = 797 patients) were included. The majority of publications (71.2%) describe conventional airway techniques. Endotracheal tube placement after induction of general anesthesia and surgical crossfield intubation after incision of the trachea were used most frequently without major complications. A total of 7 new approaches were identified, including 4 different regional anesthetic techniques (25 cases), supraglottic airways (4 cases), and new forms of extracorporeal support (25 cases). Overall failure rates of new techniques were low (1.8%). Details on patient selection and procedural specifics are provided. New approaches have several theoretical benefits, yet further research is required to establish criteria for patient selection and evaluate procedural safety. Given the low level of evidence, it currently is impossible to compare methods of airway management regarding outcome-related risks and benefits. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Comparison of two different techniques of cooperative learning approach: Undergraduates' conceptual understanding in the context of hormone biochemistry.

    Science.gov (United States)

    Mutlu, Ayfer

    2018-03-01

    The purpose of the research was to compare the effects of two different techniques of the cooperative learning approach, namely Team-Game Tournament and Jigsaw, on undergraduates' conceptual understanding in a Hormone Biochemistry course. Undergraduates were randomly assigned to Group 1 (N = 23) and Group 2 (N = 29). Instruction was delivered using Team-Game Tournament in Group 1 and Jigsaw in Group 2. Before instruction, both groups were informed about cooperative learning and its techniques, their responsibilities in the learning process, and how to access resources. Instruction was conducted under the researcher's guidance for nine weeks, and the Hormone Concept Test developed by the researcher was administered before and after instruction for data collection. According to the results, while both techniques improved students' understanding, Jigsaw was more effective than Team-Game Tournament. © 2017 The International Union of Biochemistry and Molecular Biology, 46(2):114-120, 2018.
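
    For readers who want to see how such a two-group, pre/post comparison is typically quantified, the sketch below runs paired and independent t-tests on made-up Hormone Concept Test scores using the reported group sizes (23 and 29). The abstract does not state which statistical tests the author actually used, so this is a generic illustration rather than a reconstruction of the study's analysis.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Hypothetical concept-test scores (0-100) before and after instruction.
        tgt_pre, tgt_post = rng.normal(45, 10, 23), rng.normal(62, 10, 23)  # Team-Game Tournament
        jig_pre, jig_post = rng.normal(44, 10, 29), rng.normal(70, 10, 29)  # Jigsaw

        # Within-group improvement: paired t-test on pre vs post scores.
        for name, pre, post in [("TGT", tgt_pre, tgt_post), ("Jigsaw", jig_pre, jig_post)]:
            t, p = stats.ttest_rel(pre, post)
            print(f"{name}: mean gain = {np.mean(post - pre):.1f}, paired p = {p:.3g}")

        # Between-group comparison: independent-samples t-test on the gain scores.
        t, p = stats.ttest_ind(jig_post - jig_pre, tgt_post - tgt_pre)
        print(f"Jigsaw vs TGT gain difference: t = {t:.2f}, p = {p:.3g}")

    An ANCOVA on post-test scores with the pre-test as a covariate would be a common alternative to comparing raw gains.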