API Std 1163: In-Line Inspection Systems Qualification Standard


API Standard 1163, In-Line Inspection Systems Qualification, Second Edition. This standard covers in-line inspection systems for liquid pipelines, including tethered and self-propelled systems, for detecting metal loss, cracks, and pipeline geometry anomalies. The standard applies across inspection technologies and defines requirements for system performance specifications, including in-line inspection system selection: requirements for selecting an in-line inspection system for a specific pipeline application.

The performance specification shall state the type of anomaly or characteristic covered. Goals and objectives shall include, but are not limited to, the characteristics of anomalies and features to be detected, identified, and sized, and the required accuracies. Common methods for assessing stated certainties consider the lower range of certainty, the position of the certainty stated in the performance specification within the confidence interval, and the amount of data used to develop the performance specification.

POD may be stated as a function of one or more characteristics of the anomaly (for example, metal-loss depth), or through the use of a reference anomaly or anomalies. The detection threshold(s) and POD(s) must be statistically valid for the distribution of anomaly dimensions or characteristics reasonably expected for the inspection to be conducted. A POI refers to the probability of correct identification of anomalies, components, or characteristics that are detected by an in-line inspection system.
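
As an illustration of how a POD can be estimated as a function of anomaly depth from verification data, the following sketch bins paired depth/detection outcomes and reports the observed detection fraction per depth bin. The bin width, function name, and sample data are all hypothetical assumptions, not content from the standard.

```python
# Illustrative sketch only: estimating POD as a function of anomaly depth
# from paired (depth, detected) outcomes, e.g. pull-test or dig results.
# All names and sample values are hypothetical, not from API 1163.
from collections import defaultdict

def pod_by_depth_bin(results, bin_width=0.1):
    """results: iterable of (depth_fraction_of_wt, was_detected) pairs.
    Returns {bin_lower_edge: (detected, total, observed_pod)}."""
    bins = defaultdict(lambda: [0, 0])  # bin -> [detected, total]
    for depth, detected in results:
        b = int(depth / bin_width) * bin_width
        bins[b][1] += 1
        if detected:
            bins[b][0] += 1
    return {b: (d, n, d / n) for b, (d, n) in sorted(bins.items())}

# Hypothetical verification data: depth as a fraction of wall thickness.
data = [(0.08, False), (0.12, True), (0.15, False), (0.22, True),
        (0.25, True), (0.31, True), (0.33, True), (0.42, True)]
for edge, (d, n, pod) in pod_by_depth_bin(data).items():
    print(f"depth {edge:.1f}-{edge + 0.1:.1f} wt: POD = {d}/{n} = {pod:.2f}")
```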

For example, refer to Table 4 in Appendix A. A sizing accuracy refers to how closely the reported dimensions agree with the true dimensions.

Sizing or characterization accuracies shall include a tolerance (a plus/minus range on the reported dimension) and a certainty. The sizing or characterization accuracies, as a function of anomaly type, should include: cracks in the pipe body (for crack colonies, the overall colony axial length and circumferential width, along with the depth and axial length of the largest crack or cracks in the colony); cracks in welds and other weld anomalies; metallurgical anomalies; and manufacturing anomalies such as slugs, scabs, and slivers. For an example, refer to Table 5 in Appendix A. The sizing accuracies must be statistically valid for the distribution of anomaly dimensions reasonably expected for the inspection to be conducted. When the sizing accuracies vary significantly with anomaly dimensions or characteristics, individual sizing accuracies shall be given for the range of anomaly dimensions for which they are valid.

Where appropriate, sizing capabilities shall include a tolerance and a certainty. The performance specification shall state a location accuracy from a fixed location and an orientation accuracy. These can be stated as accuracy specifications.

Examples of physical and operational factors that can limit detection thresholds, PODs, POIs, and sizing accuracies include: anomaly orientation angle and proximity to other anomalies or pipeline components; anomaly shape and area affected; maximum and minimum pipe wall thickness; in-line inspection system speed outside of the specified range; pipeline cleanliness; pipe metallurgy; pipe curvature (field bends or elbows); pipe wall coverage; and acceptable sensor loss or data degradation from sensor loss. Limitations may be reported in a table of acceptable conditions; alternatively, no detection threshold, POD, POI, or sizing accuracy should be implied outside the range of acceptable conditions.


Results for an inspection or portion of an inspection that are outside the range of acceptable conditions should be considered advisory. Such geometric limitations shall be measured or calculated for straight pipe runs, bends, and other fittings through which the system may pass during the inspection. Calculations shall consider the minimum clear diameter required by the inspection system for passage without damage and the most limiting dimensional tolerances allowed by industry standards in the manufacture of pipe, bends, and fittings.

The performance specification shall also contain a statement, when applicable, that industry-standard manufacturing tolerances were utilized in specifying these limitations. If other tolerancing mechanisms are used, these shall be specified in the performance specification. A gauging pig run should be conducted before an in-line inspection tool run is conducted in a segment for the first time.


Additional constraints or limitations that shall be stated include: run length; data storage capacity; and required check valve positions or tool limitations with respect to valves. The methodology used to qualify a performance specification shall be based on sound engineering practices, be statistically valid, and include a definition of essential variables (see Section 7). The methodology used to qualify the performance specification shall be based on at least one of the following methods: (a) verified historical data, or (b) full-scale test data.

Essential variables are characteristics or analysis steps that are essential for achieving desired results. Essential variables may include, but are not limited to, constraints on operational characteristics (such as inspection tool velocity) and inspection tool design and physical characteristics. Changes to the essential variables of a system shall require a new performance specification and qualification. Data and analyses that are not within the range of essential variables defined for a performance specification shall not be used to qualify the specification.

Data and analyses used to qualify a performance specification shall be selected to generate a representative distribution of anomaly dimensions, components, and characteristics reasonably expected for the inspection to be conducted.

The analyses used to define the statistical quantities, such as PODs, POIs, and sizing accuracies, shall be in accordance with standard statistical analysis methods, and the confidence levels given shall be consistent with the amount of data used in the analyses. Data and analyses used to qualify a performance specification shall be documented and maintained.

For anomalies, the data shall include the values of the essential variables during the inspection and the inspection conditions. When an in-line inspection system is used for multiple inspections (as is the normal case), a database shall be established for the data and analyses used to qualify performance specifications. The database shall be used to improve accuracies, certainties, and confidence levels when such values are included in future performance specifications.

Changes in design or analysis procedures must be accounted for and documented in all databases. The qualification of a performance specification shall be considered valid for the range of essential variables defined for the specification. If data indicates the in-line inspection system does not meet the performance specification for any values or combinations of essential variables, the essential variables must be redefined, or the performance specification must be restated. Verification measurements are dimensions and characteristics that have been physically measured after anomalies have been exposed.

An example of a full-scale test used for qualification is a pull test. The methods by which the data are correlated or calibrated shall be documented. If a statistically significant amount of historical or full-scale test data is not available, the detection thresholds, PODs, and POIs shall be estimated using prior experience with other inspection systems, provided the estimates are clearly identified as such in the performance specification.

When using historical or full-scale data, detection thresholds shall represent the anomaly dimension s that must be exceeded to achieve the POD. When the in-line inspection system is operated within its essential variables and under the conditions planned for the inspection, it must be able to detect anomalies that exceed the detection thresholds with the stated POD. Sizing accuracies may be determined by comparing reported characteristics with verification measurements.

Sizing accuracies should be determined using a linear or nonlinear regression analysis comparing reported dimensions with verification measurements. Tolerances may be stated as an absolute value or relative to another quantity (e.g., as a percentage of wall thickness). Certainties may include the frequency with which out-of-tolerance errors are over-predicted or under-predicted.
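
To make the regression-based approach concrete, here is a minimal sketch assuming hypothetical paired measurements: it fits reported depth against field-measured depth by ordinary least squares, then reports the observed certainty for a candidate tolerance, including how often out-of-tolerance errors are over- or under-predictions. The data and the ±0.10 wall-thickness tolerance are illustrative assumptions.

```python
# Illustrative sketch: least-squares comparison of reported vs. measured
# depths, plus the observed certainty for a candidate tolerance band.
# Sample values and the ±0.10 wt tolerance are hypothetical.
from statistics import mean

measured = [0.12, 0.20, 0.25, 0.33, 0.41, 0.48, 0.55]  # field-verified depth (fraction of wt)
reported = [0.15, 0.32, 0.29, 0.30, 0.44, 0.45, 0.60]  # ILI-reported depth

# Ordinary least squares: reported = a + b * measured
mx, my = mean(measured), mean(reported)
b = (sum((x - mx) * (y - my) for x, y in zip(measured, reported))
     / sum((x - mx) ** 2 for x in measured))
a = my - b * mx
print(f"fit: reported = {a:.3f} + {b:.3f} * measured")

tolerance = 0.10  # candidate tolerance, ± fraction of wall thickness
errors = [y - x for x, y in zip(measured, reported)]
certainty = sum(abs(e) <= tolerance for e in errors) / len(errors)
over = sum(e > tolerance for e in errors)     # out-of-tolerance over-predictions
under = sum(e < -tolerance for e in errors)   # out-of-tolerance under-predictions
print(f"certainty at ±{tolerance} wt: {certainty:.0%} "
      f"(over-predicted: {over}, under-predicted: {under})")
```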

Sources of differences between reported and measured characteristics should be identified, documented, and accounted for in the statistical analyses used to determine the tolerances, certainties, and confidence levels where practical.

Sources of errors include those due to the in-line inspection system, as well as those due to hands-on measurements made of a given characteristic. The tolerances and certainties required in this Standard refer to errors due to the in-line inspection system only.

These errors include, but are not limited to: systematic errors (errors that result from known but unaccounted-for causes, such as sensor liftoff); random errors (lack of repeatability and other errors with no identified cause); and anomaly-specific errors (errors in sizing particular to geometries or assemblies of anomalies).

If the methodology is found to be no longer valid, any performance specifications that were validated by the methodology must be revalidated by an acceptable methodology.

All reported significant errors in detection, identification, and sizing shall be investigated. Significant errors are those that are outside the performance specification. Four sets of requirements are given: project requirements, pre-inspection requirements, inspection requirements, and post-inspection requirements. All in-line inspection project, pre-inspection, inspection, and post-inspection requirements and procedures shall be documented. Prior to the actual inspection, the pipeline geometry and planned pipeline operating conditions shall be reviewed to ensure they are consistent with the information previously provided.

The operator shall disclose to the service provider any and all changes in geometry or planned operating conditions before the in-line inspection system is launched into the pipeline. The service provider shall work closely with the operator to minimize the likelihood of damage to the pipeline or the inspection system. The service provider shall confirm that the in-line inspection system to be used for the inspection is consistent with that used to define the required performance specifications.

The steps shall include a function test to ensure the tool is operating properly. Pre-inspection function tests may include, but are not limited to: Confirmation that an adequate power supply is available and operational. Confirmation that all sensors, data storage, odometers, and other mechanical systems are operating properly. Confirmation that adequate data storage is available.

Confirmation that all components of the inspection tool are properly initialized. Records of the pre-inspection function tests should be made available to the operator, if requested. The electronics shall be checked to make sure that they are properly sealed and functional. The service provider shall set the appropriate tool detection threshold on the above-ground markers to ensure proper detection.

The requirements include activities that occur from the time the in-line inspection tool is placed into the launching device until it has been removed from the receiving device. The in-line inspection tool shall be placed into the launching device and shall be launched in accordance with defined requirements and proper procedures. All system handling, placement, and launching activities shall be carefully monitored. Variations from the required operating conditions shall be identified and documented.

The actual location of each above-ground marker shall be measured and documented. If the above-ground markers are not placed at the planned reference points, the actual locations shall be identified and documented. The in-line inspection tool shall be removed from the receiving device in accordance with predefined requirements and proper procedures. All handling and removal activities should be carefully monitored.

These activities are intended to validate that the inline inspection tool has operated correctly during the inspection run. These steps shall include but are not limited to: Confirmation that a continuous stream of data was collected during the inspection.

Confirmation that the data meets basic quality requirements. Data checks are typically based on direct measurement data, data completeness, and data quality. Deviations shall be noted and their effects communicated to the operator and included in the report. Direct measurement data is typically used to make general judgments about the basic operation of an inspection tool during a run.

Such data shall be utilized as one of the post-inspection data checks. The amount of data collected allows an initial assessment of data completeness. The amount of data collected is typically accessible after processing the recorded data. Completeness of data shall be checked after the initial processing of the data. This will be considered one of the data checks. These steps shall include a function test to ensure the tool has operated properly during the inspection.

Post-inspection function tests may include, but are not limited to: Tool cleanliness (visual inspection). Confirmation that adequate power was available and operational.

Confirmation that all sensors, data storage, odometers, and other mechanical systems operated properly. Confirmation that adequate data storage was available. Examination of tool for damage and significant wear.

Deviations from these function checks shall be noted, and their effects shall be included in the inspection report. Continuously monitored in-line inspection tools should not require post-inspection function tests. Data quality can be demonstrated using a variety of data integrity checks, such as verification that the data taken was within the operating ranges of the sensors used.

Such data checks shall be included in the data checking process. Post-inspection data quality checks do not cover the interpretation of the obtained data. Requirements for establishing a performance specification are given in Section 7.

Such assignments are not within the scope of this Standard. Verification shall include: (a) a process validation, and (b) a comparison with historical data or large-scale test data from the inspection system being used. Based on these steps, verification measurements may be required; not all inspections require verification measurements, as discussed later in this section. The process validation shall include: (1) a confirmation of the data analysis processes, (2) a comparison of recorded data to previous data or to the data used to establish the performance specification, and (3) a comparison of reported locations and types of pipeline components with the actual locations and types of components.

The process validation may include, but is not limited to: A review of the pipeline route, geometry, and operating conditions during the inspection relative to those planned for the inspection and the essential variables of the inspection system.

A review of the set-up and operation of the inspection tool relative to that planned for the inspection and the essential variables of the inspection system. A review of the processes used for: bulk data handling, conditioning, and filtering; automated analyses (grading), if used; manual or other adjustments of data or grading; and identification, evaluation, and integration of supplemental data, relative to the processes required for compliance with the performance specification.

A review of any additional requirements for the inspection, including any standards or codes applicable to the inspection. A review of the reported anomaly types and characteristics relative to the data used to establish the performance specification.

A comparison of reported locations and types of pipeline components and equipment, such as above-ground markers, anchors, bends, casings, flanges, girth welds, magnets, pig passage indicators, metal repair sleeves, taps, tees, and valves, relative to actual locations of components and appurtenances. Appendix B gives an example of a quality assurance program used for process validation. Inconsistencies uncovered during the process validation shall be evaluated and resolved.

If the inconsistencies cannot be resolved, the inspection results are not verified. If the inspection results are not verified, the performance specification may be restated or all or parts of the inspection data may be rejected.

Types of prior historical data that can be used for comparisons may include, but are not limited to: Prior in-line inspection results. Results from prior excavations and measurements of anomalies similar to those covered by the inspection.

Other data and analyses, when supported by sound engineering practices. If prior in-line inspection data is available for the specific pipeline, the reported results can be considered verified if: (a) differences in the reported locations and characteristics of the anomalies are within the tolerances, certainties, and confidence levels stated in the performance specification; or (b) differences in the reported locations and characteristics are outside the tolerances stated in the performance specification but the differences can be explained using sound engineering practices.

The reported results can also be verified by comparisons with results from prior excavations and measurements, provided (1) the data from such excavations and measurements represents the range of reported anomaly types and characteristics, and (2) any differences are within the tolerances, certainties, and confidence levels stated in the performance specification or can be explained using sound engineering practices.

If the reported results are not verified using comparisons with prior historic data, additional comparisons with other inspection data as defined below or verification measurements are recommended.

Alternatively, the performance specification can be restated or all or parts of the inspection data can be rejected.

The reported results can be considered verified by comparisons with the results from prior validated inspections on other lines, provided (1) the prior data represents the range of reported anomaly types and characteristics, and (2) the prior essential variables match those used in the current inspection.

Alternatively, the performance specification can be restated or all or parts of the inspection data are not validated. Appendices C and D provide examples of verification measurement procedures. When verification digs are performed, information from the measurements shall be given to the service provider to confirm and continuously refine the data analysis processes. The information to be collected from the verification measurements and given to the service provider shall be agreed upon by both the operator and the service provider and shall include the measurement techniques used and their accuracies.

Information to be provided by the service provider to the operator should include the measurement threshold, reporting threshold, and interaction criteria, if any. Appendix D lists types of information that should be provided to the service provider. Any discrepancies between the reported inspection results and verification measurements that are outside of performance specifications shall be documented.

The source of the discrepancies should be identified through discussions between the service provider and the operator and through analyses of essential variables, the dig verification process, and data analysis process. Based on the source and extent of the identified and analyzed discrepancies, one of the following courses of action may be taken: The inspection data may be reanalyzed taking into account the detailed correlations between anomaly characteristics and the inspection data.

All or part of the inspection results may be invalidated. The performance specification may be revised for all or part of the inspection results. Listed below are examples of statistical analysis methods that may be used for verifications. Comparison of Verification Measurements with the Performance Specification. This is the simplest method of assessing inspection results.

Reported results are considered verified if the verification measurements meet the performance specification. If the reported results do not meet the performance specifications, further analysis shall be performed.

The accuracy of the verification measurements must be considered in the comparison. See Appendix D for an example.
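
A minimal sketch of this simplest comparison, assuming a stated depth tolerance and a known field-measurement accuracy: a discrepancy is flagged only when it exceeds the combined allowance. The root-sum-square combination rule and all numbers are assumptions, not values from the standard.

```python
# Illustrative sketch: flag verification results that fall outside the
# performance specification once field-measurement accuracy is allowed for.
# The root-sum-square combination and all numbers are assumptions.
import math

SPEC_TOL = 0.10   # stated ILI tolerance, fraction of wall thickness (assumed)
FIELD_TOL = 0.02  # accuracy of the field measurement tool (assumed)

allowance = math.hypot(SPEC_TOL, FIELD_TOL)  # combined tolerance

pairs = [(0.25, 0.31), (0.40, 0.38), (0.18, 0.33)]  # (measured, reported)
for measured, reported in pairs:
    diff = reported - measured
    status = "within spec" if abs(diff) <= allowance else "OUT OF SPEC"
    print(f"measured {measured:.2f}, reported {reported:.2f}: "
          f"diff {diff:+.2f} -> {status}")
```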

Comparison of a Population of Verification Measurements with Distributions. This method assesses whether the verification measurements are statistically consistent with the performance specification by determining the probability of meeting the performance specification through the use of distribution functions, such as binomial or normal distribution functions.

It becomes more accurate as the number of verification measurements increases. This method is attractive when there is a high confidence level on the tolerance and certainty given in the performance specification. If the test population can be considered representative and if an appropriate number of measurements are consistent with the performance specification, the results are considered verified.
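
As a sketch of the distribution-based method, suppose a performance specification states a certainty of 80%: with n verification measurements, the number found in tolerance should be consistent with a binomial(n, 0.80) distribution. The code below computes the probability of observing as few in-tolerance results as were seen; the sample counts and the 0.05 cutoff mentioned in the comment are assumptions.

```python
# Illustrative sketch: is the observed number of in-tolerance verification
# measurements consistent with binomial(n, certainty)? All values assumed.
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

certainty = 0.80    # certainty stated in the performance specification
n, in_tol = 20, 13  # hypothetical digs: 13 of 20 within tolerance

p_low = binom_cdf(in_tol, n, certainty)
print(f"P(<= {in_tol} of {n} in tolerance | certainty {certainty}) = {p_low:.3f}")
# A very small probability (e.g., < 0.05) would suggest the results are
# not statistically consistent with the stated performance specification.
```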

Confidence Intervals. This method compares the range of certainties indicated by the verification measurements to the certainty level in the performance specification.

Confidence intervals provide an estimate of the precision with which the true certainty is known. This method is attractive when there is a low confidence level on the tolerance and certainty levels given in the performance specification. If the confidence interval reasonably bounds the stated certainty, the results are considered verified.
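
One common way to construct such an interval is the Wilson score interval for a binomial proportion, sketched below under assumed dig counts; neither the specific interval method nor the numbers are prescribed by the standard.

```python
# Illustrative sketch: Wilson score interval on the true certainty implied
# by verification measurements. The interval method and data are assumptions.
import math

def wilson_interval(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (z=1.96 ~ 95%)."""
    phat = successes / n
    denom = 1 + z**2 / n
    center = (phat + z**2 / (2 * n)) / denom
    half = z * math.sqrt(phat * (1 - phat) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

s, n = 13, 20            # hypothetical: 13 of 20 digs within tolerance
stated = 0.80            # certainty from the performance specification
lo, hi = wilson_interval(s, n)
print(f"observed certainty {s/n:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
print("verified" if lo <= stated <= hi else "stated certainty outside CI")
```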

Other Methods of Assessing Verification Results. Other methods include combinations and modifications of the methods listed above. Separate verification comparisons based on different types of metal-loss geometries are permissible. Verification measurements are typically limited in number; this is the case in all verifications in all industries.

As a consequence, heavy emphasis must be placed on historic data, especially the data used to establish the performance specification. See Section 7 for details on establishing performance specifications. Consequently, verification activities tend to concentrate on identifying situations where there are clear problems.


For inspections under unusual conditions or conditions not seen before, it may be beneficial to use a larger number of comparisons. Reports shall include anomaly or feature identification and dimensions for which the performance specification has been qualified (Section 7) and the results verified (Section 9).

The following reporting requirements are provided to clearly tie the ILI system's qualifications to the inspection results: type of anomaly or characteristic (which, for MFL, may include deformation with metal loss, manufacturing indication, crack-like indication, and metal loss at weld seam); detection thresholds and probabilities of detection; probabilities of proper identification; sizing accuracies; anomaly measurement accuracies; location and orientation accuracies; inspection parameters; sizing system components; and analysis algorithms. The description shall identify the source of the data or analyses used for qualification, and should also summarize the statistical techniques used to determine the performance specification.

These may include: Wall thickness range. Temperature range inside pipeline. Maximum and minimum pressure. Minimum bend radius. Minimum internal diameter. Tool length, weight.

Maximum length of pipeline that can be inspected in one run (may be coupled with run times and pipeline conditions). Axial sampling frequency or distance. Circumferential sensor spacing in nominal pipe. Date of survey. Pipeline parameters: pipe manufacturing method, outside diameter, nominal wall thickness, pipe grade, and line length. In-line inspection data quality: any quality issues with the in-line inspection data, such as sensor malfunction, should be stated within the summary and described in the report.

Data analysis parameters: clear communication of data analysis parameters should be included; at a minimum, the measurement threshold, reporting threshold, and interaction criteria should be included (see Appendix D). Odometer distance or absolute distance. Identification of upstream girth weld. Distance from feature to upstream girth weld. Feature classification. Circumferential position.


Identification of upstream and downstream markers. Distance from anomaly to upstream and downstream markers.

Tool speed. Feature characterization: metal-loss features (e.g., width profile or shape); deformation features (e.g., length, width); crack features (e.g., width for colonies, proximity to welds); metallurgical features (e.g., dimensions, position through the wall, hardness). One possible record layout for reported features is sketched below. Inspection survey parameters: changes in the essential variables may affect the quality and accuracy of the data recorded by an in-line inspection system (see Section 7). If any of these differ during the inspection from the values given in the performance specification, they shall be listed within the summary.
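
The reporting fields above could be captured in a simple record structure. The sketch below shows one possible layout; all field names and units are illustrative choices, since the standard prescribes report content rather than a data format.

```python
# Illustrative sketch: one possible record layout for a reported feature.
# Field names and units are illustrative; the standard prescribes content,
# not format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReportedFeature:
    odometer_distance_m: float          # odometer or absolute distance
    upstream_girth_weld_id: str         # identification of upstream girth weld
    distance_to_weld_m: float           # distance from feature to that weld
    classification: str                 # e.g. "metal loss", "deformation"
    circumferential_position: str       # e.g. clock position "6:30"
    length_mm: Optional[float] = None   # feature characterization fields
    width_mm: Optional[float] = None
    depth_pct_wt: Optional[float] = None
    tool_speed_mps: Optional[float] = None

f = ReportedFeature(10234.7, "GW-1042", 3.2, "metal loss", "6:30",
                    length_mm=45.0, width_mm=30.0, depth_pct_wt=28.0)
print(f)
```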

These options are recommended to aid in the integration of inspection results with pipeline integrity assessment programs. The following paragraphs provide some examples for metal loss ILI system results reporting.

Modifications can be made for other ILI technologies. Number of internal metal-loss features. Number of external metal-loss features.


Number of metal-loss features in defined sections. Histograms of range of data scatter for each type of anomaly, based on the statistical data obtained from the inspection.

Circumferential position plot of all metal-loss features over the full pipeline length. Circumferential position plot of all internal metal-loss features over the full pipeline length.

Circumferential position plot of all external metal-loss features over the full pipeline length. Circumferential position plot of all metal-loss features as function of relative distance to the closest girth weld. Circumferential position of all deformation features over the full pipeline length.

If this option is applied, the following information should be included in the report of ILI system results: Assessment methodology.

Severity ratio and its definition, if a severity ratio is used. Pipeline parameters used in calculations.

An effective quality management system includes processes that ensure consistent products and services are delivered, that those processes are properly controlled to prevent delivery of unsatisfactory services, and that adequate measures are in place to ensure that the products and services provided continue to meet the needs of a pipeline operator.

For those organizations without a quality management system, this section provides a basis for establishing a quality system to meet specific in-line inspection system needs. As a minimum, this review shall, where applicable, include: identification of the parties responsible for performing the specific tasks required for successful completion of the in-line inspection project.

A review of procedures to determine if they were followed during the entire inspection process. A review of the pipeline data provided by the pipeline operator to ensure the free passage of the in-line inspection tool. A determination that inspection capabilities of the specified in-line inspection tool meet the specific objectives of the pipeline operator. Evaluation of the analysis requirements of the pipeline operator, including any specific codes or standards used to ensure that the pipeline operator receives correct and accurate results from the in-line inspection.

Organizations that have an existing quality management system that meets or exceeds the requirements of this section may apply that system. The organization shall have a documented quality system for the scope of activities encompassed in this standard.

The quality system documentation shall be made available to the pipeline operator upon request. Records of qualification processes and procedures and personnel qualifications records in accordance with ASNT ILI-PQ shall be made available to the operator upon request. Provisions shall be included for maintaining the quality of developed and utilized software applications.

Software maintenance, configuration management and auditing should be performed in accordance with accepted industry practices. These procedures shall document the steps required to ensure that the individuals assigned to perform the task can perform the work in a consistent manner. The detail deemed necessary will depend on the task as well as the training and qualification requirements established by the supplying organization.

Training and personnel qualifications requirements shall be included in the procedures. Any procedure or work instruction that is required shall be available to the individual performing the work. Those procedures should also be available for review by the pipeline operator upon request. Procedures shall be reviewed and modified on a periodic basis.

Minimum record-keeping requirements shall be documented. These records shall include not only the inspection data related to the pipeline, but also records pertaining to the setup of the equipment, the personnel involved in the performance of the inspection and the analysis of data, and a record of the inspection equipment used for the inspection. Records shall be maintained to a level that allows recreation of the system setup for inspection system verification and validation purposes.

Additional information may also be maintained as part of the inspection record as determined between service providers and the pipeline operator.

Inspection records shall be retained for a time period no less than that required for legal or regulatory purposes. Adequate measures shall be taken to protect the records from loss or damage. When developing storage and regeneration procedures for inspection data, changes in data collection technology should be considered. A revision control system shall include procedures for withdrawal of outdated information, including documents, files, forms, and software. This includes documents and software internal to the organization as well as documents, files, and software released to the end-user.

These records shall sufficiently document the changes to allow an evaluation of the effects on the essential variables of the previous design. The same procedures apply to the design of services provided to a pipeline operator.

Service process changes shall also be documented to review the effectiveness of the change. Feedback from the pipeline operator should be a component of any design change procedure to be used when evaluating the effectiveness of changes to the design of either an in-line inspection system or service. This shall include the checks required to ensure the proper equipment has been selected, qualified, properly calibrated, and successfully operated in the field.

This shall also include the checks required to ensure that the data has been properly analyzed and the data successfully delivered to the pipeline operator.

Quality control procedures shall also include those procedures necessary to demonstrate that all personnel are qualified in accordance with the requirements of this standard. Procedures shall contain provisions for personnel to have the ability to interrupt the process when a quality control nonconformance is discovered and initiate immediate corrective action procedures to prevent further or more severe nonconformance.

Records shall be maintained of these quality checks and retained in the record keeping system selected by the organization. Qualification processes and procedures shall also be maintained as part of the Quality Management System. These procedures shall include requirements for the identification of all equipment used, requirements of the individuals performing the task, and provisions for the calibration of applicable test equipment that is traceable to a national standard.

The equipment used for the inspection shall be uniquely identified to permit traceability.


The use of serial numbers or other tracking references provides a history of equipment used and a way to monitor that equipment for changes in operation and functionality that may affect proper operation. If the historical information process is used for verifying inspection results, the data collected for this purpose shall be matched to the traceability of the ILI System utilized under this section.

Equipment traceability requirements shall extend to support equipment that directly affects the successful completion of a project when used in conjunction with the in-line inspection tool. Such devices typically include above-ground marker systems, locating systems, playback and data processing equipment, data reduction and analysis software, and associated test equipment. Basic performance measures include the accuracy of the inspection results compared with verification dig inspections.

An analysis of the number and types of erroneous calls over a period of time, for each type of inspection system, based upon the stated performance specification or service requirement. Other performance measures should be developed to further analyze the effectiveness of the processes being measured. These procedures should include steps to prevent the nonconformance from recurring.

This requires provision for adequate supervision commensurate with personnel experience, and peer-review crosschecks as necessary to assure the accuracy of data. Processes to prevent nonconformance from initially occurring shall also be part of the quality system. These processes are often included in the research and development program.

These reviews are performed to ensure the overall effectiveness of the Quality Management System is maintained and continues to meet the goals of the organization. Effective improvement requires feedback from employees and the pipeline operator, a review of new technology developments, and a continuous observation and measurement of the results of the output of the organization. The quality management system shall include provisions to allow management to periodically evaluate the effectiveness of the procedures and processes within the quality system.

These internal audits shall be performed at defined intervals, and the records of the audits shall be maintained. Records of any corrective actions taken shall also be maintained. The relevant organization will provide indicators of the success of their processes.

Key measures of those indicators shall be established. The process measures selected shall include measures relevant to the products and services provided; basic measures are described below. When selecting an auditor, consideration may be given to parties that have no financial, competitive, or other incentive that may conflict with the financial, proprietary, or intellectual interests of the organization being audited.

Prior to performing the audit, the scope and procedure of the audit shall be clearly defined, discussed, and approved by the service provider. One basic measure is the run success percentage, which measures the number of acceptable runs made versus the total number of runs made over a selected period of time.

Another is the turn-around time of inspection data, measured from completion of the fieldwork to the time of delivery of the in-line inspection report. The statistical terms used in performance specifications are defined algebraically in Appendix A: Table 4 lists features that may be detected along with their POIs, and Table 5 lists PODs and sizing accuracies for metal-loss anomalies.
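
A minimal sketch of how these two basic measures (run success percentage and turn-around time) might be computed from run records; the record layout and sample dates are hypothetical.

```python
# Illustrative sketch: run-success percentage and data turn-around time.
# Field layout and sample records are hypothetical.
from datetime import date
from statistics import mean

runs = [  # (run accepted?, fieldwork completed, report delivered)
    (True,  date(2019, 1, 10), date(2019, 2, 4)),
    (False, date(2019, 1, 22), date(2019, 2, 20)),
    (True,  date(2019, 2, 1),  date(2019, 2, 27)),
]

success_pct = 100 * sum(ok for ok, *_ in runs) / len(runs)
turnaround = mean((done - field).days for _, field, done in runs)
print(f"run success: {success_pct:.0f}%  mean turn-around: {turnaround:.0f} days")
```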

Appendix B provides a sample process-validation checklist, with items such as: Were survey-acceptance criteria met? Were data quality checks completed, when applicable? Review orientation of taps, tees, etc. Check for abnormal joint lengths. Review historical information: check for previous assessments and review previous dig information. Further review is required if significant differences in anomaly characteristics or location accuracy are identified.

This appendix provides a sample set of procedures that have been used successfully in prior field verifications. Other mutually agreed upon procedures may also be used. Field verifications involve two different distance measures. Aboveground measurements are typically made from the known positions of pipeline components, welds, or other physical items whose locations relative to the pipeline and its chainage are known. In-line inspection distances are determined from odometer wheel counts and represent approximate chainage values.

Significant errors in aboveground measurements can arise from several sources. Errors in distances measured by in-line inspection tools can result from problems with the odometer wheels due to debris, slippage, or sticking. In-line inspection distances can often be recalibrated using as-built pipeline data or other information. Basic Procedure for Feature Location: In typical inspection reports, the location of a feature is referenced to fixed aboveground pipeline components.

Below-ground components are not typically used as reference points because they cannot be easily located from aboveground. Step 1: From the inspection report, identify and determine the distances to the nearest known upstream and downstream reference points (see the example in Figure 5). Step 2: Mark off and stake the aboveground distance from both reference points.

A gap or overlap between the two staked positions is common. The length of the gap or overlap is affected by the accuracy of the surface measurements and the odometer counts. For the example shown in Figure 6, the gap is 9. If a very large gap is seen, check that the correct reference points have been used in marking off the aboveground distances.

Discussions between the service provider and the operator should be held if there are gaps or overlaps greater than the location accuracy in the performance specification. Step 3: Using both upstream and downstream reference points and interpolating across gaps or overlaps increases the accuracy with which a target feature is located. Remarks: Targeting an upstream or downstream girth weld for an anomaly located within a pipe joint provides a ready reference from which to measure a short relative distance to locate the anomaly.
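
The Step 3 interpolation can be made concrete as follows: the target is staked proportionally between the two reference points so that the gap or overlap is distributed according to the reported distances. The variable names and sample figures below (a 9-unit gap, echoing the Figure 6 example) are assumptions.

```python
# Illustrative sketch: interpolating a target feature location between
# upstream and downstream staked reference points (Step 3). Values assumed.

# Distances from the inspection report (chainage units, e.g. feet):
d_up = 152.0    # reported distance from upstream reference to target
d_down = 98.0   # reported distance from target to downstream reference

# Aboveground measurement between the two staked reference points:
span = 259.0    # actual measured distance; reported total is 250.0 (gap of 9)

# Distribute the gap/overlap in proportion to the reported distances:
target_from_up = span * d_up / (d_up + d_down)
print(f"stake the target {target_from_up:.1f} units from the upstream point")
```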

When the location of a target feature is in doubt, individual pipe joints can sometimes be identified by comparing the physical distance between upstream and downstream girth welds with the distance noted on the inspection report. The reported and actual position of the longitudinal weld can also help verify locations.

Basic Procedure—Verification Measurements: 1. Clean the pipe thoroughly, preferably by abrasive blasting. 2. Inspect for cracking. 3. Measure the depth of the anomaly. 4. Measure the length (longitudinal) and width (circumferential) of the anomaly. 5. Provide a rubbing of the anomaly geometry, including the surrounding area, and take a photo if possible. 6. Measure the actual wall thickness in multiple areas close to the anomaly. 7. Mark on the Feature Location Sheet the actual measured distance to the girth weld, circumferential position, feature type, feature dimensions, actual wall thickness, etc.

Measure and document exposed anomalies. Field data useful in comparing verification results with reported data: 1. Field distance measurement system used.

2. All modifications applied to the original Feature Location Sheet. 3. Distances measured in the field (typically, at least one upstream and one downstream measurement is necessary). 4. Observed difference between the aboveground location and the found position. 5. Length of the joint. 6. Position of the longitudinal weld (if applicable). 7. Lengths of neighboring pipe joints and their longitudinal weld positions (if possible). 8. Position and extent of the pipe area investigated.

9. Method used to measure the actual defect geometry. 10. Specifications of that method, including its accuracy. 11. Photos of the location, with scales and remarks on dimensions. A data record for each verification anomaly should be prepared.

The data record may include, but is not limited to: Distance to upstream and downstream aboveground reference points. Metal loss profile (including the spacing increments and depth measurements); alternatively, an etching of the anomaly and/or a diagram with the maximum depth indicated.

Metal loss interaction (Figure 7): whether or not multiple measured metal-loss anomalies interact to form a larger single anomaly; the criteria governing the relationships between the distances X1 and X2 and between Y1 and Y2 are specified for each inspection (a sketch of one possible interaction rule follows the photograph list below). Information on the accuracy of field measurements. Label each photograph with the following information, as a minimum:

Pipeline system identifier. Right-of-way number. Pipeline stationing. Job number. Anomaly item number from the inspection report. Excavation date. Nominal pipeline O.D. Minimum cover for buried pipeline.
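
As referenced above, one common form of interaction rule treats two metal-loss anomalies as a single larger anomaly when both their axial and circumferential separations fall below inspection-specific limits (the X and Y distances of Figure 7). The sketch below implements that form; the limit values, record layout, and the decision to ignore circumferential wraparound are assumptions, since the actual criteria are specified for each inspection.

```python
# Illustrative sketch of a metal-loss interaction rule (cf. Figure 7):
# two anomalies interact if their axial and circumferential separations
# are both below inspection-specific limits. Limits and data are assumed;
# circumferential wraparound is ignored for brevity.

X_LIMIT = 25.0  # max axial separation for interaction (mm), assumed
Y_LIMIT = 25.0  # max circumferential separation for interaction (mm), assumed

def interact(a, b):
    """a, b: dicts with axial (ax_*) and circumferential (c_*) extents in mm."""
    axial_gap = max(b["ax_start"] - a["ax_end"], a["ax_start"] - b["ax_end"], 0)
    circ_gap = max(b["c_start"] - a["c_end"], a["c_start"] - b["c_end"], 0)
    return axial_gap <= X_LIMIT and circ_gap <= Y_LIMIT

a = {"ax_start": 0.0, "ax_end": 40.0, "c_start": 0.0, "c_end": 30.0}
b = {"ax_start": 55.0, "ax_end": 80.0, "c_start": 20.0, "c_end": 45.0}
print("interact" if interact(a, b) else "separate")  # 15 mm axial gap -> interact
```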
