Appendix B - Appendix B to Part 75—Quality Assurance and Quality Control Procedures
Develop and implement a quality assurance/quality control (QA/QC) program for the continuous emission monitoring systems, excepted monitoring systems approved under appendix D or E to this part, and alternative monitoring systems under subpart E of this part, and their components. At a minimum, include in each QA/QC program a written plan that describes in detail (or that refers to separate documents containing) complete, step-by-step procedures and operations for each of the following activities. Upon request from regulatory authorities, the source shall make all procedures, maintenance records, and ancillary supporting documentation from the manufacturer (e.g., software coefficients and troubleshooting diagrams) available for review during an audit. Electronic storage of the information in the QA/QC plan is permissible, provided that the information can be made available in hardcopy upon request during an audit.
1.1 Requirements for All Monitoring Systems
1.1.1 Preventive Maintenance
Keep a written record of procedures needed to maintain the monitoring system in proper operating condition and a schedule for those procedures. This shall, at a minimum, include procedures specified by the manufacturers of the equipment and, if applicable, additional or alternate procedures developed for the equipment.
1.1.2 Recordkeeping and Reporting
Keep a written record describing procedures that will be used to implement the recordkeeping and reporting requirements in subparts E, F, and G and appendices D and E to this part, as applicable.
1.1.3 Maintenance Records
Keep a record of all testing, maintenance, or repair activities performed on any monitoring system or component in a location and format suitable for inspection. A maintenance log may be used for this purpose. The following records should be maintained: date, time, and description of any testing, adjustment, repair, replacement, or preventive maintenance action performed on any monitoring system and records of any corrective actions associated with a monitor's outage period. Additionally, any adjustment that recharacterizes a system's ability to record and report emissions data must be recorded (e.g., changing of flow monitor or moisture monitoring system polynomial coefficients, K factors or mathematical algorithms, changing of temperature and pressure coefficients and dilution ratio settings), and a written explanation of the procedures used to make the adjustment(s) shall be kept.
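The fields listed in section 1.1.3 lend themselves to a structured log. The following is a minimal sketch of one way a DAHS might store such entries, assuming a Python-based system; the class and field names are illustrative and are not prescribed by this appendix.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MaintenanceLogEntry:
    """One record in the maintenance log described in section 1.1.3 (illustrative)."""
    timestamp: datetime                        # date and time of the activity
    system_id: str                             # monitoring system or component identifier
    activity: str                              # "test", "adjustment", "repair", "replacement", or "preventive maintenance"
    description: str                           # what was done
    corrective_action: Optional[str] = None    # corrective action tied to a monitor outage, if any
    recharacterizing_adjustment: bool = False  # True if the adjustment changes how emissions data are recorded
    adjustment_details: Optional[str] = None   # e.g., new polynomial coefficients, K factors, dilution ratio settings
    procedure_reference: Optional[str] = None  # written explanation of the procedure used for the adjustment

# Example entry: a flow monitor K-factor change, which must be flagged and explained.
entry = MaintenanceLogEntry(
    timestamp=datetime(2024, 3, 14, 10, 30),
    system_id="FLOW-1A",
    activity="adjustment",
    description="Changed flow monitor K factor after primary element inspection.",
    recharacterizing_adjustment=True,
    adjustment_details="K factor changed from 0.842 to 0.851.",
    procedure_reference="Plant QA/QC plan, flow monitor calibration procedure rev. 3",
)
```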
1.1.4 The provisions in section 6.1.2 of appendix A to this part shall apply to the annual RATAs described in § 75.74(c)(2)(ii) and to the semiannual and annual RATAs described in section 2.3 of this appendix.
1.2 Specific Requirements for Continuous Emissions Monitoring Systems
1.2.1 Calibration Error Test and Linearity Check Procedures
Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected, adjustments of flow rates and pressure, introduction of reference values, length of time for injection of calibration gases, steps for obtaining calibration error or error in linearity, determination of interferences, and when calibration adjustments should be made). Identify any calibration error test and linearity check procedures specific to the continuous emission monitoring system that vary from the procedures in appendix A to this part.
1.2.2 Calibration and Linearity Adjustments
Explain how each component of the continuous emission monitoring system will be adjusted to provide correct responses to calibration gases, reference values, and/or indications of interference both initially and after repairs or corrective action. Identify equations, conversion factors and other factors affecting calibration of each continuous emission monitoring system.
1.2.3 Relative Accuracy Test Audit Procedures
Keep a written record of procedures and details peculiar to the installed continuous emission monitoring systems that are to be used for relative accuracy test audits, such as sampling and analysis methods.
1.2.4 Parametric Monitoring for Units With Add-on Emission Controls
The owner or operator shall keep a written (or electronic) record including a list of operating parameters for the add-on SO
1.3.1 Fuel Flowmeter Accuracy Test Procedures
Keep a written record of the specific fuel flowmeter accuracy test procedures. These may include: standard methods or specifications listed in and of appendix D to this part and incorporated by reference under § 75.6; the procedures of sections 2.1.5.2 or 2.1.7 of appendix D to this part; or other methods approved by the Administrator through the petition process of § 75.66(c).
1.3.2 Transducer or Transmitter Accuracy Test Procedures
Keep a written record of the procedures for testing the accuracy of transducers or transmitters of an orifice-, nozzle-, or venturi-type fuel flowmeter under section 2.1.6 of appendix D to this part. These procedures should include a description of equipment used, steps in testing, and frequency of testing.
1.3.3 Fuel Flowmeter, Transducer, or Transmitter Calibration and Maintenance Records
Keep a record of adjustments, maintenance, or repairs performed on the fuel flowmeter monitoring system. Keep records of the data and results for fuel flowmeter accuracy tests and transducer accuracy tests, consistent with appendix D to this part.
1.3.4 Primary Element Inspection Procedures
Keep a written record of the standard operating procedures for inspection of the primary element (i.e., orifice, venturi, or nozzle) of an orifice-, venturi-, or nozzle-type fuel flowmeter. Examples of the types of information to be included are: what to examine on the primary element; how to identify if there is corrosion sufficient to affect the accuracy of the primary element; and what inspection tools (e.g., borescope), if any, are used.
1.3.5 Fuel Sampling Method and Sample Retention
Keep a written record of the standard procedures used to perform fuel sampling, either by utility personnel or by fuel supply company personnel. These procedures should specify the portion of the ASTM method used, as incorporated by reference under § 75.6, or other methods approved by the Administrator through the petition process of § 75.66(c). These procedures should describe safeguards for ensuring the availability of an oil sample (e.g., procedure and location for splitting samples, procedure for maintaining sample splits on site, and procedure for transmitting samples to an analytical laboratory). These procedures should identify the ASTM analytical methods used to analyze sulfur content, gross calorific value, and density, as incorporated by reference under § 75.6, or other methods approved by the Administrator through the petition process of § 75.66(c).
1.3.6 Appendix E Monitoring System Quality Assurance Information
Identify the recommended range of quality assurance- and quality control-related operating parameters. Keep records of these operating parameters for each hour of unit operation (i.e., fuel combustion). Keep a written record of the procedures used to perform NO
Explain how the daily assessment procedures specific to the alternative monitoring system are to be performed.
1.4.2 Daily Quality Assurance Test Adjustments
Explain how each component of the alternative monitoring system will be adjusted in response to the results of the daily assessments.
1.4.3 Relative Accuracy Test Audit Procedures
Keep a written record of procedures and details peculiar to the installed alternative monitoring system that are to be used for relative accuracy test audits, such as sampling and analysis methods.
2. Frequency of Testing
A summary chart showing each quality assurance test and the frequency at which each test is required is located at the end of this appendix in Figure 1.
2.1 Daily Assessments
Perform the following daily assessments to quality-assure the hourly data recorded by the monitoring systems during each period of unit operation, or, for a bypass stack or duct, each period in which emissions pass through the bypass stack or duct. These requirements are effective as of the date when the monitor or continuous emission monitoring system completes certification testing.
2.1.1 Calibration Error Test
Except as provided in section 2.1.1.2 of this appendix, perform the daily calibration error test of each gas monitoring system (including moisture monitoring systems consisting of wet- and dry-basis O
2.1.1.1 On-line Daily Calibration Error Tests. Except as provided in section 2.1.1.2 of this appendix, all daily calibration error tests must be performed while the unit is in operation at normal, stable conditions (i.e. “on-line”).
2.1.1.2 Off-line Daily Calibration Error Tests. Daily calibrations may be performed while the unit is not operating (i.e., “off-line”) and may be used to validate data for a monitoring system that meets the following conditions:
(1) An initial demonstration test of the monitoring system is successfully completed and the results are reported in the quarterly report required under § 75.64 of this part. The initial demonstration test, hereafter called the “off-line calibration demonstration”, consists of an off-line calibration error test followed by an on-line calibration error test. Both the off-line and on-line portions of the off-line calibration demonstration must meet the calibration error performance specification in section 3.1 of appendix A of this part. Upon completion of the off-line portion of the demonstration, the zero and upscale monitor responses may be adjusted, but only toward the true values of the calibration gases or reference signals used to perform the test and only in accordance with the routine calibration adjustment procedures specified in the quality control program required under section 1 of appendix B to this part. Once these adjustments are made, no further adjustments may be made to the monitoring system until after completion of the on-line portion of the off-line calibration demonstration. Within 26 clock hours of the completion hour of the off-line portion of the demonstration, the monitoring system must successfully complete the first attempted calibration error test, i.e., the on-line portion of the demonstration.
(2) For each monitoring system that has passed the off-line calibration demonstration, off-line calibration error tests may be used on a limited basis to validate data, in accordance with paragraph (2) in section 2.1.5.1 of this appendix.
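The timing requirement in paragraph (1) of section 2.1.1.2 reduces to simple clock-hour arithmetic. The following is a minimal sketch, assuming test completion hours are represented as Python datetime values truncated to the hour; the function name is illustrative.

```python
from datetime import datetime, timedelta

def online_portion_on_time(offline_completion_hour: datetime,
                           online_completion_hour: datetime) -> bool:
    """Return True if the on-line portion of the off-line calibration demonstration
    was completed within 26 clock hours of the completion hour of the off-line
    portion, as described in section 2.1.1.2(1)."""
    elapsed = online_completion_hour - offline_completion_hour
    return timedelta(0) <= elapsed <= timedelta(hours=26)

# Example: off-line portion completed in the hour ending 08:00 on June 1;
# on-line portion completed in the hour ending 09:00 on June 2 (25 clock hours later).
print(online_portion_on_time(datetime(2024, 6, 1, 8), datetime(2024, 6, 2, 9)))  # True
```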
2.1.2 Daily Flow Interference Check
Perform the daily flow monitor interference checks specified in section 2.2.2.2 of appendix A of this part while the unit is in operation at normal, stable conditions.
2.1.3 Additional Calibration Error Tests and Calibration Adjustments
(a) In addition to the daily calibration error tests required under section 2.1.1 of this appendix, a calibration error test of a monitor shall be performed in accordance with section 2.1.1 of this appendix, as follows: whenever a daily calibration error test is failed; whenever a monitoring system is returned to service following repair or corrective maintenance that could affect the monitor's ability to accurately measure and record emissions data; or after making certain calibration adjustments, as described in this section. Except in the case of the routine calibration adjustments described in this section, data from the monitor are considered invalid until the required additional calibration error test has been successfully completed.
(b) Routine calibration adjustments of a monitor are permitted after any successful calibration error test. These routine adjustments shall be made so as to bring the monitor readings as close as practicable to the known tag values of the calibration gases or to the actual value of the flow monitor reference signals. An additional calibration error test is required following routine calibration adjustments where the monitor's calibration has been physically adjusted (e.g., by turning a potentiometer) to verify that the adjustments have been made properly. An additional calibration error test is not required, however, if the routine calibration adjustments are made by means of a mathematical algorithm programmed into the data acquisition and handling system. The EPA recommends that routine calibration adjustments be made, at a minimum, whenever the daily calibration error exceeds the limits of the applicable performance specification in appendix A to this part for the pollutant concentration monitor, CO
(c) Additional (non-routine) calibration adjustments of a monitor are permitted prior to (but not during) linearity checks and RATAs and at other times, provided that an appropriate technical justification is included in the quality control program required under section 1 of this appendix. The allowable non-routine adjustments are as follows. The owner or operator may physically adjust the calibration of a monitor (e.g., by means of a potentiometer), provided that the post-adjustment zero and upscale responses of the monitor are within the performance specifications of the instrument given in section 3.1 of appendix A to this part. An additional calibration error test is required following such adjustments to verify that the monitor is operating within the performance specifications at both the zero and upscale calibration levels.
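Taken together, paragraphs (a) through (c) of section 2.1.3 define when an additional calibration error test must follow a calibration event. The following is a minimal decision sketch under that reading; the event names, flag, and function name are illustrative rather than taken from the rule.

```python
def additional_cal_error_test_required(event: str,
                                       physical_adjustment: bool = False) -> bool:
    """Return True if an additional calibration error test is required under section 2.1.3.

    event: "failed_daily_cal", "return_to_service_after_repair",
           "routine_adjustment", or "non_routine_adjustment".
    physical_adjustment: True if the adjustment was made physically (e.g., by turning a
                         potentiometer) rather than through a mathematical algorithm
                         programmed into the DAHS.
    """
    if event in ("failed_daily_cal", "return_to_service_after_repair"):
        return True
    if event == "routine_adjustment":
        # Required only when the calibration was physically adjusted; a DAHS
        # mathematical correction does not trigger another test (paragraph (b)).
        return physical_adjustment
    if event == "non_routine_adjustment":
        # Non-routine adjustments always require a follow-up test to confirm the
        # zero and upscale responses are within specification (paragraph (c)).
        return True
    raise ValueError(f"unknown event type: {event}")

print(additional_cal_error_test_required("routine_adjustment", physical_adjustment=False))  # False
```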
2.1.4 Data Validation
(a) An out-of-control period occurs when the calibration error of an SO
(b) An out-of-control period also occurs whenever interference of a flow monitor is identified. The out-of-control period begins with the hour of completion of the failed interference check and ends with the hour of completion of an interference check that is passed.
(c) The results of any certification, recertification, diagnostic, or quality assurance test required under this part may not be used to validate the emissions data required under this part, if the test is performed using EPA Protocol gas from a production site that is not participating in the PGVP, except as provided in § 75.21(g)(7) or if the cylinder(s) are analyzed by an independent laboratory and shown to meet the requirements of section 5.1.4(b) of appendix A to this part.
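The out-of-control convention in paragraph (b) of section 2.1.4 (the period begins with the hour of completion of the failed check and ends with the hour of completion of a passing check) recurs throughout this appendix. The following is a minimal sketch of enumerating the affected clock hours, assuming hourly timestamps; the function name is illustrative.

```python
from datetime import datetime, timedelta

def out_of_control_hours(failed_check_hour: datetime,
                         passed_check_hour: datetime) -> list[datetime]:
    """List the clock hours in an out-of-control period that begins with the hour of
    completion of a failed check and ends with the hour of completion of a passing
    check (e.g., a flow interference check under section 2.1.4(b))."""
    hours = []
    h = failed_check_hour
    while h <= passed_check_hour:
        hours.append(h)
        h += timedelta(hours=1)
    return hours

# Example: interference check failed at 06:00, passed again at 09:00 -> 4 affected hours.
print(len(out_of_control_hours(datetime(2024, 1, 5, 6), datetime(2024, 1, 5, 9))))  # 4
```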
2.1.5 Quality Assurance of Data With Respect to Daily Assessments
When a monitoring system passes a daily assessment (i.e., daily calibration error test or daily flow interference check), data from that monitoring system are prospectively validated for 26 clock hours (i.e., 24 hours plus a 2-hour grace period) beginning with the hour in which the test is passed, unless another assessment (i.e. a daily calibration error test, an interference check of a flow monitor, a quarterly linearity check, a quarterly leak check, or a relative accuracy test audit) is failed within the 26-hour period.
2.1.5.1 Data Invalidation with Respect to Daily Assessments. The following specific rules apply to the invalidation of data with respect to daily assessments:
(1) Data from a monitoring system are invalid, beginning with the first hour following the expiration of a 26-hour data validation period or beginning with the first hour following the expiration of an 8-hour start-up grace period (as provided under section 2.1.5.2 of this appendix), if the required subsequent daily assessment has not been conducted.
(2) For a monitor that has passed the off-line calibration demonstration, a combination of on-line and off-line calibration error tests may be used to validate data from the monitor, as follows. For a particular unit (or stack) operating hour, data from a monitor may be validated using a successful off-line calibration error test if: (a) An on-line calibration error test has been passed within the previous 26 unit (or stack) operating hours; and (b) the 26 clock hour data validation window for the off-line calibration error test has not expired. If either of these conditions is not met, then the data from the monitor are invalid with respect to the daily calibration error test requirement. Data from the monitor shall remain invalid until the appropriate on-line or off-line calibration error test is successfully completed so that both conditions (a) and (b) are met.
(3) For units with two measurement ranges (low and high) for a particular parameter, when separate analyzers are used for the low and high ranges, a failed or expired calibration on one of the ranges does not affect the quality-assured data status on the other range. For a dual-range analyzer (i.e., a single analyzer with two measurement scales), a failed calibration error test on either the low or high scale results in an out-of-control period for the monitor. Data from the monitor remain invalid until corrective actions are taken and “hands-off” calibration error tests have been passed on both ranges. However, if the most recent calibration error test on the high scale was passed but has expired, while the low scale is up-to-date on its calibration error test requirements (or vice-versa), the expired calibration error test does not affect the quality-assured status of the data recorded on the other scale.
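The following is a minimal sketch of the validation logic in sections 2.1.5 and 2.1.5.1(2), assuming the DAHS tracks, for each monitor, the completion hour of the most recent passed on-line and off-line calibration error tests and a count of unit (or stack) operating hours since the last on-line pass; the function and parameter names are illustrative.

```python
from datetime import datetime, timedelta
from typing import Optional

def hour_validated_by_daily_cal(current_hour: datetime,
                                last_online_pass: Optional[datetime],
                                last_offline_pass: Optional[datetime],
                                operating_hours_since_online_pass: Optional[int],
                                passed_offline_demo: bool) -> bool:
    """Return True if the hour is quality-assured with respect to the daily
    calibration error test (sections 2.1.5 and 2.1.5.1)."""
    window = timedelta(hours=26)  # 26 clock hours, beginning with the hour the test is passed

    # An on-line test validates data for 26 clock hours beginning with the hour it is passed.
    if last_online_pass is not None and timedelta(0) <= current_hour - last_online_pass < window:
        return True

    # An off-line test may be used only after the off-line calibration demonstration,
    # and only while BOTH conditions of section 2.1.5.1(2) hold:
    #   (a) an on-line test was passed within the previous 26 unit (or stack) operating hours, and
    #   (b) the 26 clock-hour window for the off-line test has not expired.
    if passed_offline_demo and last_offline_pass is not None:
        cond_a = (operating_hours_since_online_pass is not None
                  and operating_hours_since_online_pass <= 26)
        cond_b = timedelta(0) <= current_hour - last_offline_pass < window
        return cond_a and cond_b

    return False
```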
2.1.5.2 Daily Assessment Start-Up Grace Period. For the purpose of quality assuring data with respect to a daily assessment (i.e. a daily calibration error test or a flow interference check), a start-up grace period may apply when a unit begins to operate after a period of non-operation. The start-up grace period for a daily calibration error test is independent of the start-up grace period for a daily flow interference check. To qualify for a start-up grace period for a daily assessment, there are two requirements:
(1) The unit must have resumed operation after being in outage for 1 or more hours (i.e., the unit must be in a start-up condition) as evidenced by a change in unit operating time from zero in one clock hour to an operating time greater than zero in the next clock hour.
(2) For the monitoring system to be used to validate data during the grace period, the previous daily assessment of the same kind must have been passed on-line within 26 clock hours prior to the last hour in which the unit operated before the outage. In addition, the monitoring system must be in-control with respect to quarterly and semi-annual or annual assessments.
If both of the above conditions are met, then a start-up grace period of up to 8 clock hours applies, beginning with the first hour of unit operation following the outage. During the start-up grace period, data generated by the monitoring system are considered quality-assured. For each monitoring system, a start-up grace period for a calibration error test or flow interference check ends when either: (1) a daily assessment of the same kind (i.e., calibration error test or flow interference check) is performed; or (2) 8 clock hours have elapsed (starting with the first hour of unit operation following the outage), whichever occurs first.
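The following is a minimal sketch of the start-up grace period test described in section 2.1.5.2, assuming clock-hour timestamps and an hourly record of unit operating time are available; the function and parameter names are illustrative.

```python
from datetime import datetime, timedelta

def startup_grace_period_applies(first_hour_after_outage: datetime,
                                 current_hour: datetime,
                                 last_operating_hour_before_outage: datetime,
                                 last_assessment_pass_online: datetime,
                                 in_control_for_other_assessments: bool) -> bool:
    """Return True if the current hour falls within the 8 clock-hour start-up grace
    period for a daily assessment (section 2.1.5.2)."""
    # Condition (2): the previous daily assessment of the same kind must have been
    # passed on-line within 26 clock hours prior to the last hour in which the unit
    # operated before the outage, and the monitoring system must be in-control with
    # respect to its quarterly and semiannual or annual assessments.
    recently_passed = (timedelta(0)
                       <= last_operating_hour_before_outage - last_assessment_pass_online
                       <= timedelta(hours=26))
    if not (recently_passed and in_control_for_other_assessments):
        return False

    # The grace period runs for up to 8 clock hours, beginning with the first hour of
    # unit operation following the outage; it ends earlier if a daily assessment of the
    # same kind is performed (not modeled here).
    return timedelta(0) <= current_hour - first_hour_after_outage < timedelta(hours=8)
```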
2.1.6 Data Recording
Record and tabulate all calibration error test data according to month, day, clock-hour, and magnitude in either ppm, percent volume, or scfh. Program monitors that automatically adjust data to the corrected calibration values (e.g., microprocessor control) to record either: (1) The unadjusted concentration or flow rate measured in the calibration error test prior to resetting the calibration, or (2) the magnitude of any adjustment. Record the following applicable flow monitor interference check data: (1) Sample line/sensing port pluggage, and (2) malfunction of each RTD, transceiver, or equivalent.
2.2 Quarterly Assessments
For each primary and redundant backup monitor or monitoring system, perform the following quarterly assessments. This requirement applies as of the calendar quarter following the calendar quarter in which the monitor or continuous emission monitoring system is provisionally certified.
2.2.1 Linearity Check
Unless a particular monitor (or monitoring range) is exempted under this paragraph or under section 6.2 of appendix A to this part, perform a linearity check, in accordance with the procedures in section 6.2 of appendix A to this part, for each primary and redundant backup SO
2.2.2 Leak Check
For differential pressure flow monitors, perform a leak check of all sample lines (a manual check is acceptable) at least once during each QA operating quarter. For this test, the unit does not have to be in operation. Conduct the leak checks no less than 30 days apart, to the extent practicable. If a leak check is failed, follow the applicable data validation procedures in section 2.2.3(g) of this appendix.
2.2.3 Data Validation
(a) A linearity check shall not be commenced if the monitoring system is operating out-of-control with respect to any of the daily or semiannual quality assurance assessments required by sections 2.1 and 2.3 of this appendix or with respect to the additional calibration error test requirements in section 2.1.3 of this appendix.
(b) Each required linearity check shall be done according to paragraph (b)(1), (b)(2) or (b)(3) of this section:
(1) The linearity check may be done “cold,” i.e., with no corrective maintenance, repair, calibration adjustments, re-linearization or reprogramming of the monitor prior to the test.
(2) The linearity check may be done after performing only the routine or non-routine calibration adjustments described in section 2.1.3 of this appendix at the various calibration gas levels (zero, low, mid or high), but no other corrective maintenance, repair, re-linearization or reprogramming of the monitor. Trial gas injection runs may be performed after the calibration adjustments and additional adjustments within the allowable limits in section 2.1.3 of this appendix may be made prior to the linearity check, as necessary, to optimize the performance of the monitor. The trial gas injections need not be reported, provided that they meet the specification for trial gas injections in § 75.20(b)(3)(vii)(E)(1). However, if, for any trial injection, the specification in § 75.20(b)(3)(vii)(E)(1) is not met, the trial injection shall be counted as an aborted linearity check.
(3) The linearity check may be done after repair, corrective maintenance or reprogramming of the monitor. In this case, the monitor shall be considered out-of-control from the hour in which the repair, corrective maintenance or reprogramming is commenced until the linearity check has been passed. Alternatively, the data validation procedures and associated timelines in §§ 75.20(b)(3)(ii) through (ix) may be followed upon completion of the necessary repair, corrective maintenance, or reprogramming. If the procedures in § 75.20(b)(3) are used, the words “quality assurance” apply instead of the word “recertification”.
(c) Once a linearity check has been commenced, the test shall be done hands-off. That is, no adjustments of the monitor are permitted during the linearity test period, other than the routine calibration adjustments following daily calibration error tests, as described in section 2.1.3 of this appendix. If a routine daily calibration error test is performed and passed just prior to a linearity test (or during a linearity test period) and a mathematical correction factor is automatically applied by the DAHS, the correction factor shall be applied to all subsequent data recorded by the monitor, including the linearity test data.
(d) If a daily calibration error test is failed during a linearity test period, prior to completing the test, the linearity test must be repeated. Data from the monitor are invalidated prospectively from the hour of the failed calibration error test until the hour of completion of a subsequent successful calibration error test. The linearity test shall not be commenced until the monitor has successfully completed a calibration error test.
(e) An out-of-control period occurs when a linearity test is failed (i.e., when the error in linearity at any of the three concentrations in the quarterly linearity check (or any of the six concentrations, when both ranges of a single analyzer with a dual range are tested) exceeds the applicable specification in section 3.2 of appendix A to this part) or when a linearity test is aborted due to a problem with the monitor or monitoring system. For a NO
(f) No more than four successive calendar quarters shall elapse after the quarter in which a linearity check of a monitor or monitoring system (or range of a monitor or monitoring system) was last performed without a subsequent linearity test having been conducted. If a linearity test has not been completed by the end of the fourth calendar quarter since the last linearity test, then the linearity test must be completed within a 168 unit operating hour or stack operating hour “grace period” (as provided in section 2.2.4 of this appendix) following the end of the fourth successive elapsed calendar quarter, or data from the CEMS (or range) will become invalid.
(g) An out-of-control period also occurs when a flow monitor sample line leak is detected. The out-of-control period begins with the hour of the failed leak check and ends with the hour of a satisfactory leak check following corrective action.
(h) For each monitoring system, report the results of all completed and partial linearity tests that affect data validation (i.e., all completed, passed linearity checks; all completed, failed linearity checks; and all linearity checks aborted due to a problem with the monitor, including trial gas injections counted as failed test attempts under paragraph (b)(2) of this section or under § 75.20(b)(3)(vii)(F)), in the quarterly report required under § 75.64. Note that linearity attempts which are aborted or invalidated due to problems with the reference calibration gases or due to operational problems with the affected unit(s) need not be reported. Such partial tests do not affect the validation status of emission data recorded by the monitor. A record of all linearity tests, trial gas injections and test attempts (whether reported or not) must be kept on-site as part of the official test log for each monitoring system.
(i) The results of any certification, recertification, diagnostic, or quality assurance test required under this part may not be used to validate the emissions data required under this part, if the test is performed using EPA Protocol gas that was not from an EPA Protocol gas production site participating in the PGVP on the date the gas was procured either by the tester or by a reseller that sold to the tester the unaltered EPA Protocol gas, except as provided in § 75.21(g)(7) or if the cylinder(s) are analyzed by an independent laboratory and shown to meet the requirements of section 5.1.4(b) of appendix A to this part.
2.2.4 Linearity and Leak Check Grace Period
(a) When a required linearity test or flow monitor leak check has not been completed by the end of the QA operating quarter in which it is due or if, due to infrequent operation of a unit or infrequent use of a required high range of a monitor or monitoring system, four successive calendar quarters have elapsed after the quarter in which a linearity check of a monitor or monitoring system (or range) was last performed without a subsequent linearity test having been done, the owner or operator has a grace period of 168 consecutive unit operating hours, as defined in § 72.2 of this chapter (or, for monitors installed on common stacks or bypass stacks, 168 consecutive stack operating hours, as defined in § 72.2 of this chapter) in which to perform a linearity test or leak check of that monitor or monitoring system (or range). The grace period begins with the first unit or stack operating hour following the calendar quarter in which the linearity test was due. Data validation during a linearity or leak check grace period shall be done in accordance with the applicable provisions in section 2.2.3 of this appendix.
(b) If, at the end of the 168 unit (or stack) operating hour grace period, the required linearity test or leak check has not been completed, data from the monitoring system (or range) shall be invalid, beginning with the first unit operating hour following the expiration of the grace period. Data from the monitoring system (or range) remain invalid until the hour of completion of a subsequent successful hands-off linearity test or leak check of the monitor or monitoring system (or range). Note that when a linearity test or a leak check is conducted within a grace period for the purpose of satisfying the linearity test or leak check requirement from a previous QA operating quarter, the results of that linearity test or leak check may only be used to meet the linearity check or leak check requirement of the previous quarter, not the quarter in which the missed linearity test or leak check is completed.
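The following is a minimal sketch of tracking a 168 unit (or stack) operating hour grace period under section 2.2.4, assuming an hourly record of operating status is available; the function and parameter names are illustrative.

```python
def grace_window_used_up(operating_flags_since_quarter_end: list[bool],
                         grace_hours: int = 168) -> bool:
    """Return True once 168 unit (or stack) operating hours have elapsed since the end
    of the calendar quarter in which the linearity test or leak check was due
    (section 2.2.4). If the test still has not been completed, data become invalid
    beginning with the first unit operating hour following expiration of the grace period.

    operating_flags_since_quarter_end: one boolean per clock hour since the end of that
    quarter; True means the unit (or stack) operated during the hour.
    """
    return sum(operating_flags_since_quarter_end) >= grace_hours

# Example: 200 clock hours have elapsed, of which 150 were operating hours -> still in grace.
flags = [True] * 150 + [False] * 50
print(grace_window_used_up(flags))  # False
```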
2.2.5 Flow-to-Load Ratio or Gross Heat Rate Evaluation
(a) Applicability and methodology. Unless exempted from the flow-to-load ratio test under section 7.8 of appendix A to this part, the owner or operator shall, for each flow rate monitoring system installed on each unit, common stack or multiple stack, evaluate the flow-to-load ratio quarterly, i.e., for each QA operating quarter (as defined in § 72.2 of this chapter). At the end of each QA operating quarter, the owner or operator shall use Equation B-1 to calculate the flow-to-load ratio for every hour during the quarter in which: the unit (or combination of units, for a common stack) operated within ±10.0 percent of L
(1) In Equation B-1, the owner or operator may use either bias-adjusted flow rates or unadjusted flow rates, provided that all of the ratios are calculated the same way. For a common stack, L
(2) Alternatively, the owner or operator may calculate the hourly gross heat rates (GHR) in lieu of the hourly flow-to-load ratios. The hourly GHR shall be determined only for those hours in which quality-assured flow rate data and diluent gas (CO
(3) In Equation B-1a, the owner or operator may either use bias-adjusted flow rates or unadjusted flow rates in the calculation of (Heat Input)
(4) The owner or operator shall evaluate the calculated hourly flow-to-load ratios (or gross heat rates) as follows. A separate data analysis shall be performed for each primary and each redundant backup flow rate monitor used to record and report data during the quarter. Each analysis shall be based on a minimum of 168 acceptable recorded hourly average flow rates (i.e., at loads within ±10 percent of L
(5) For each flow monitor, use Equation B-2 in this appendix to calculate E
(6) Equation B-2 shall be used in a consistent manner. That is, use R
(b) Acceptable results. The results of a quarterly flow-to-load (or gross heat rate) evaluation are acceptable, and no further action is required, if the calculated value of E
(c) Recalculation of E
(1) Any hour in which the type of fuel combusted was different from the fuel burned during the most recent normal-load RATA. For purposes of this determination, the type of fuel is different if the fuel is in a different state of matter (i.e., solid, liquid, or gas) than is the fuel burned during the RATA or if the fuel is a different classification of coal (e.g., bituminous versus sub-bituminous). Also, for units that co-fire different types of fuels, if the reference RATA was done while co-firing, then hours in which a single fuel was combusted may be excluded from the data analysis as different fuel hours (and vice-versa for co-fired hours, if the reference RATA was done while combusting only one type of fuel);
(2) For a unit that is equipped with an SO
(3) Any hour in which “ramping” occurred, i.e., the hourly load differed by more than ±15.0 percent from the load during the preceding hour or the subsequent hour;
(4) For a unit with a multiple stack discharge configuration consisting of a main stack and a bypass stack, any hour in which the flue gases were discharged through both stacks;
(5) If a normal-load flow RATA was performed and passed during the quarter being analyzed, any hour prior to completion of that RATA; and
(6) If a problem with the accuracy of the flow monitor was discovered during the quarter and was corrected (as evidenced by passing the abbreviated flow-to-load test in section 2.2.5.3 of this appendix), any hour prior to completion of the abbreviated flow-to-load test.
(7) After identifying and excluding all non-representative hourly data in accordance with paragraphs (c)(1) through (6) of this section, the owner or operator may analyze the remaining data a second time. At least 168 representative hourly ratios or GHR values must be available to perform the analysis; otherwise, the flow-to-load (or GHR) analysis is not required for that monitor for that calendar quarter.
(8) If, after re-analyzing the data, E
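Paragraphs (c)(1) through (c)(7) above amount to a data-screening pass over the quarter's hourly records before the ratios (or GHR values) are re-analyzed. The following is a minimal sketch, assuming one record per hour carrying the flags shown; the record fields are illustrative (the flag for paragraph (c)(2), which is truncated above, is only a placeholder), and the recalculation of the statistic itself is not shown.

```python
from dataclasses import dataclass

@dataclass
class HourlyRecord:
    """Illustrative per-hour flags needed for the screening in section 2.2.5(c)."""
    different_fuel_than_rata: bool        # (c)(1) fuel type or state differs from the reference RATA
    addon_controls_not_representative: bool  # (c)(2) placeholder for the add-on control criterion
    ramping: bool                         # (c)(3) load changed by more than +/-15.0 percent vs. an adjacent hour
    dual_stack_discharge: bool            # (c)(4) gases discharged through both main and bypass stacks
    before_passed_normal_load_rata: bool  # (c)(5) hour precedes a passed normal-load RATA this quarter
    before_abbreviated_f2l_test: bool     # (c)(6) hour precedes a passed abbreviated flow-to-load test
    flow_to_load_ratio: float             # ratio (or GHR value) computed for the hour

def representative_hours(hours: list[HourlyRecord]) -> list[HourlyRecord]:
    """Drop non-representative hours per (c)(1)-(6); the remaining data may be
    re-analyzed only if at least 168 representative values are available ((c)(7))."""
    kept = [h for h in hours if not (h.different_fuel_than_rata
                                     or h.addon_controls_not_representative
                                     or h.ramping
                                     or h.dual_stack_discharge
                                     or h.before_passed_normal_load_rata
                                     or h.before_abbreviated_f2l_test)]
    if len(kept) < 168:
        return []  # too few hours: the re-analysis is not required for that monitor for the quarter
    return kept
```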
Within 14 unit operating days of the end of the calendar quarter for which the E
(a) If the investigation fails to uncover a problem with the flow monitor, a RATA shall be performed in accordance with Option 2 in section 2.2.5.2 of this appendix.
(b) If a problem with the flow monitor is identified through the investigation (including the need to re-linearize the monitor by changing the polynomial coefficients or K factor(s)), data from the monitor are considered invalid back to the first unit operating hour after the end of the calendar quarter for which E
Perform a single-load RATA (at a load designated as normal under section 6.5.2.1 of appendix A to this part) of each flow monitor for which E
2.2.5.3 Abbreviated Flow-to-Load Test
(a) The following abbreviated flow-to-load test may be performed after any documented repair, component replacement, or other corrective maintenance to a flow monitor (except for changes affecting the linearity of the flow monitor, such as adjusting the flow monitor coefficients or K factor(s)) to demonstrate that the repair, replacement, or other maintenance has not significantly affected the monitor's ability to accurately measure the stack gas volumetric flow rate. Data from the monitoring system are considered invalid from the hour of commencement of the repair, replacement, or maintenance until either the hour in which the abbreviated flow-to-load test is passed, or the hour in which a probationary calibration error test is passed following completion of the repair, replacement, or maintenance and any associated adjustments to the monitor. If the latter option is selected, the abbreviated flow-to-load test shall be completed within 168 unit operating hours of the probationary calibration error test (or, for peaking units, within 30 unit operating days, if that is less restrictive). Data from the monitor are considered to be conditionally valid (as defined in § 72.2 of this chapter), beginning with the hour of the probationary calibration error test.
(b) Operate the unit(s) in such a way as to reproduce, as closely as practicable, the exact conditions at the time of the most recent normal-load flow RATA. To achieve this, it is recommended that the load be held constant to within ±10.0 percent of the average load during the RATA and that the diluent gas (CO
(c) The results of the abbreviated flow-to-load test shall be considered acceptable, and no further action is required if the value of E
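The conditional-data window in paragraph (a) of the abbreviated flow-to-load test above reduces to a simple deadline check. The following is a minimal sketch, assuming the DAHS tracks unit operating hours and unit operating days elapsed since the probationary calibration error test; the function and parameter names are illustrative.

```python
def abbreviated_f2l_deadline_met(operating_hours_since_probationary_cal: int,
                                 operating_days_since_probationary_cal: int,
                                 is_peaking_unit: bool) -> bool:
    """Return True if the abbreviated flow-to-load test deadline of section 2.2.5.3(a)
    has not yet passed: 168 unit operating hours after the probationary calibration
    error test, or, for peaking units, 30 unit operating days if that is less restrictive."""
    within_hours = operating_hours_since_probationary_cal <= 168
    if is_peaking_unit:
        return within_hours or operating_days_since_probationary_cal <= 30
    return within_hours
```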
2.3 Semiannual and Annual Assessments
For each primary and redundant backup monitoring system, perform relative accuracy assessments either semiannually or annually, as specified in section 2.3.1.1 or 2.3.1.2 of this appendix, for the type of test and the performance achieved. This requirement applies as of the calendar quarter following the calendar quarter in which the monitoring system is provisionally certified. A summary chart showing the frequency with which a relative accuracy test audit must be performed, depending on the accuracy achieved, is located at the end of this appendix in Figure 2.
2.3.1 Relative Accuracy Test Audit (RATA)
2.3.1.1 Standard RATA Frequencies
(a) Except as otherwise specified in § 75.21(a)(6) or (a)(7) or in section 2.3.1.2 of this appendix, perform relative accuracy test audits semiannually, i.e., once every two successive QA operating quarters (as defined in § 72.2 of this chapter) for each primary and redundant backup SO
(b) The relative accuracy test audit frequency of a CEMS may be reduced, as specified in section 2.3.1.2 of this appendix, for primary or redundant backup monitoring systems which qualify for less frequent testing. Perform all required RATAs in accordance with the applicable procedures and provisions in sections 6.5 through 6.5.2.2 of appendix A to this part and sections 2.3.1.3 and 2.3.1.4 of this appendix.
2.3.1.2 Reduced RATA Frequencies
Relative accuracy test audits of primary and redundant backup SO
(a) The relative accuracy during the audit of an SO
(b) [Reserved]
(c) The relative accuracy during the audit of a flow monitor is ≤7.5 percent at each operating level tested;
(d) For low flow (≤10.0 fps, as measured by the reference method during the RATA) stacks/ducts, when the flow monitor fails to achieve a relative accuracy ≤7.5 percent during the audit, but the monitor mean value, calculated using Equation A-7 in appendix A to this part and converted back to an equivalent velocity in standard feet per second (fps), is within ±1.5 fps of the reference method mean value, converted to an equivalent velocity in fps;
(e) For low SO
(f) For units with low NO
(g) [Reserved]
(h) For a CO
(i) When the relative accuracy of a continuous moisture monitoring system is ≤7.5 percent or when the mean difference between the reference method values from the RATA and the corresponding monitoring system values is within ±1.0 percent H
(a) For SO
(b) For flow monitors installed on peaking units and bypass stacks, and for flow monitors that qualify to perform only single-level RATAs under section 6.5.2(e) of appendix A to this part, all required semiannual or annual relative accuracy test audits shall be single-load (or single-level) audits at the normal load (or operating level), as defined in section 6.5.2.1(d) of appendix A to this part.
(c) For all other flow monitors, the RATAs shall be performed as follows:
(1) An annual 2-load (or 2-level) flow RATA shall be done at the two most frequently used load levels (or operating levels), as determined under section 6.5.2.1(d) of appendix A to this part, or (if applicable) at the operating levels determined under section 6.5.2(e) of appendix A to this part. Alternatively, a 3-load (or 3-level) flow RATA at the low, mid, and high load levels (or operating levels), as defined under section 6.5.2.1(b) of appendix A to this part, may be performed in lieu of the 2-load (or 2-level) annual RATA.
(2) If the flow monitor is on a semiannual RATA frequency, 2-load (or 2-level) flow RATAs and single-load (or single-level) flow RATAs at the normal load level (or normal operating level) may be performed alternately.
(3) A single-load (or single-level) annual flow RATA may be performed in lieu of the 2-load (or 2-level) RATA if the results of an historical load data analysis show that in the time period extending from the ending date of the last annual flow RATA to a date that is no more than 21 days prior to the date of the current annual flow RATA, the unit (or combination of units, for a common stack) has operated at a single load level (or operating level) (low, mid, or high) for ≥85.0 percent of the time. Alternatively, a flow monitor may qualify for a single-load (or single-level) RATA if the 85.0 percent criterion is met in the time period extending from the beginning of the quarter in which the last annual flow RATA was performed through the end of the calendar quarter preceding the quarter of the current annual flow RATA.
(4) A 3-load (or 3-level) RATA, at the low-, mid-, and high-load levels (or operating levels), as determined under section 6.5.2.1 of appendix A to this part, shall be performed at least once every twenty consecutive calendar quarters, except for flow monitors that are exempted from 3-load (or 3-level) RATA testing under section 6.5.2(b) or 6.5.2(e) of appendix A to this part.
(5) A 3-load (or 3-level) RATA is required whenever a flow monitor is re-linearized, i.e., when its polynomial coefficients or K factor(s) are changed, except for flow monitors that are exempted from 3-load (or 3-level) RATA testing under section 6.5.2(b) or 6.5.2(e) of appendix A to this part. For monitors so exempted under section 6.5.2(b), a single-load flow RATA is required. For monitors so exempted under section 6.5.2(e), either a single-level RATA or a 2-level RATA is required, depending on the number of operating levels documented in the monitoring plan for the unit.
(6) For all multi-level flow audits, the audit points at adjacent load levels or at adjacent operating levels (e.g., mid and high) shall be separated by no less than 25.0 percent of the “range of operation,” as defined in section 6.5.2.1 of appendix A to this part.
(d) A RATA of a moisture monitoring system shall be performed whenever the coefficient, K factor or mathematical algorithm determined under section 6.5.7 of appendix A to this part is changed.
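The 85.0 percent single-level criterion in paragraph (c)(3) above is a straightforward tally over the look-back period. The following is a minimal sketch, assuming an hourly history of which load level (low, mid, or high) the unit operated at; the function and parameter names are illustrative.

```python
from collections import Counter

def qualifies_for_single_load_rata(hourly_load_levels: list[str]) -> bool:
    """Return True if the unit operated at a single load level ("low", "mid", or "high")
    for at least 85.0 percent of the operating hours in the look-back period described
    in section 2.3.1.3(c)(3), so that a single-load (or single-level) annual flow RATA
    may be performed in lieu of the 2-load (or 2-level) RATA."""
    if not hourly_load_levels:
        return False
    counts = Counter(hourly_load_levels)
    most_common_fraction = counts.most_common(1)[0][1] / len(hourly_load_levels)
    return most_common_fraction >= 0.85

# Example: 900 of 1,000 operating hours at high load -> single-load RATA allowed.
history = ["high"] * 900 + ["mid"] * 100
print(qualifies_for_single_load_rata(history))  # True
```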
2.3.1.4 Number of RATA Attempts
The owner or operator may perform as many RATA attempts as are necessary to achieve the desired relative accuracy test audit frequencies and/or bias adjustment factors. However, the data validation procedures in section 2.3.2 of this appendix must be followed.
2.3.2 Data Validation
(a) A RATA shall not commence if the monitoring system is operating out-of-control with respect to any of the daily and quarterly quality assurance assessments required by sections 2.1 and 2.2 of this appendix or with respect to the additional calibration error test requirements in section 2.1.3 of this appendix.
(b) Each required RATA shall be done according to paragraphs (b)(1), (b)(2) or (b)(3) of this section:
(1) The RATA may be done “cold,” i.e., with no corrective maintenance, repair, calibration adjustments, re-linearization or reprogramming of the monitoring system prior to the test.
(2) The RATA may be done after performing only the routine or non-routine calibration adjustments described in section 2.1.3 of this appendix at the zero and/or upscale calibration gas levels, but no other corrective maintenance, repair, re-linearization or reprogramming of the monitoring system. Trial RATA runs may be performed after the calibration adjustments and additional adjustments within the allowable limits in section 2.1.3 of this appendix may be made prior to the RATA, as necessary, to optimize the performance of the CEMS. The trial RATA runs need not be reported, provided that they meet the specification for trial RATA runs in § 75.20(b)(3)(vii)(E)(2). However, if, for any trial run, the specification in § 75.20(b)(3)(vii)(E)(2) is not met, the trial run shall be counted as an aborted RATA attempt.
(3) The RATA may be done after repair, corrective maintenance, re-linearization or reprogramming of the monitoring system. In this case, the monitoring system shall be considered out-of-control from the hour in which the repair, corrective maintenance, re-linearization or reprogramming is commenced until the RATA has been passed. Alternatively, the data validation procedures and associated timelines in §§ 75.20(b)(3)(ii) through (ix) may be followed upon completion of the necessary repair, corrective maintenance, re-linearization or reprogramming. If the procedures in § 75.20(b)(3) are used, the words “quality assurance” apply instead of the word “recertification.”
(c) Once a RATA is commenced, the test must be done hands-off. No adjustment of the monitor's calibration is permitted during the RATA test period, other than the routine calibration adjustments following daily calibration error tests, as described in section 2.1.3 of this appendix. If a routine daily calibration error test is performed and passed just prior to a RATA (or during a RATA test period) and a mathematical correction factor is automatically applied by the DAHS, the correction factor shall be applied to all subsequent data recorded by the monitor, including the RATA test data. For 2-level and 3-level flow monitor audits, no linearization or reprogramming of the monitor is permitted in between load levels.
(d) For single-load (or single-level) RATAs, if a daily calibration error test is failed during a RATA test period, prior to completing the test, the RATA must be repeated. Data from the monitor are invalidated prospectively from the hour of the failed calibration error test until the hour of completion of a subsequent successful calibration error test. The subsequent RATA shall not be commenced until the monitor has successfully passed a calibration error test in accordance with section 2.1.3 of this appendix. For multiple-load (or multiple-level) flow RATAs, each load level (or operating level) is treated as a separate RATA (i.e., when a calibration error test is failed prior to completing the RATA at a particular load level (or operating level), only the RATA at that load level (or operating level) must be repeated; the results of any previously-passed RATA(s) at the other load level(s) (or operating level(s)) are unaffected, unless the monitor's polynomial coefficients or K-factor(s) must be changed to correct the problem that caused the calibration failure, in which case a subsequent 3-load (or 3-level) RATA is required), except as otherwise provided in section 2.3.1.3 (c)(5) of this appendix.
(e) For a RATA performed using the option in paragraph (b)(1) or (b)(2) of this section, if the RATA is failed (that is, if the relative accuracy exceeds the applicable specification in section 3.3 of appendix A to this part) or if the RATA is aborted prior to completion due to a problem with the CEMS, then the CEMS is out-of-control and all emission data from the CEMS are invalidated prospectively from the hour in which the RATA is failed or aborted. Data from the CEMS remain invalid until the hour of completion of a subsequent RATA that meets the applicable specification in section 3.3 of appendix A to this part. If the option in paragraph (b)(3) of this section to use the data validation procedures and associated timelines in §§ 75.20(b)(3)(ii) through (b)(3)(ix) has been selected, the beginning and end of the out-of-control period shall be determined in accordance with § 75.20(b)(3)(vii)(A) and (B). Note that when a RATA is aborted for a reason other than monitoring system malfunction (see paragraph (h) of this section), this does not trigger an out-of-control period for the monitoring system.
(f) For a 2-level or 3-level flow RATA, if, at any load level (or operating level), a RATA is failed or aborted due to a problem with the flow monitor, the RATA at that load level (or operating level) must be repeated. The flow monitor is considered out-of-control and data from the monitor are invalidated from the hour in which the test is failed or aborted and remain invalid until the passing of a RATA at the failed load level (or operating level), unless the option in paragraph (b)(3) of this section to use the data validation procedures and associated timelines in § 75.20(b)(3)(ii) through (b)(3)(ix) has been selected, in which case the beginning and end of the out-of-control period shall be determined in accordance with § 75.20(b)(3)(vii)(A) and (B). Flow RATA(s) that were previously passed at the other load level(s) (or operating level(s)) do not have to be repeated unless the flow monitor must be re-linearized following the failed or aborted test. If the flow monitor is re-linearized, a subsequent 3-load (or 3-level) RATA is required, except as otherwise provided in section 2.3.1.3(c)(5) of this appendix.
(g) Data validation for failed RATAs for a CO
(1) For a CO
(2) This paragraph (g)(2) applies only to a NO
(h) For each monitoring system, report the results of all completed and partial RATAs that affect data validation (i.e., all completed, passed RATAs; all completed, failed RATAs; and all RATAs aborted due to a problem with the CEMS, including trial RATA runs counted as failed test attempts under paragraph (b)(2) of this section or under § 75.20(b)(3)(vii)(F)) in the quarterly report required under § 75.64. Note that RATA attempts that are aborted or invalidated due to problems with the reference method or due to operational problems with the affected unit(s) need not be reported. Such runs do not affect the validation status of emission data recorded by the CEMS. However, a record of all RATAs, trial RATA runs and RATA attempts (whether reported or not) must be kept on-site as part of the official test log for each monitoring system.
(i) Each time that a hands-off RATA of an SO
(j) Failure of the bias test does not result in the monitoring system being out-of-control.
(k) The results of any certification, recertification, diagnostic, or quality assurance test required under this part may not be used to validate the emissions data required under this part, if the test is performed using EPA Protocol gas from a production site that is not participating in the PGVP, except as provided in § 75.21(g)(7) or if the cylinder(s) are analyzed by an independent laboratory and shown to meet the requirements of section 5.1.4(b) of appendix A to this part.
2.3.3 RATA Grace Period
(a) The owner or operator has a grace period of 720 consecutive unit operating hours, as defined in § 72.2 of this chapter (or, for CEMS installed on common stacks or bypass stacks, 720 consecutive stack operating hours, as defined in § 72.2 of this chapter), in which to complete the required RATA for a particular CEMS whenever:
(1) A required RATA has not been performed by the end of the QA operating quarter in which it is due; or
(2) A required 3-load flow RATA has not been performed by the end of the calendar quarter in which it is due; or
(3) For a unit which is conditionally exempted under § 75.21(a)(7) from the SO
(4) Eight successive calendar quarters have elapsed, following the quarter in which a RATA was last performed, without a subsequent RATA having been done, due either to infrequent operation of the unit(s) or frequent combustion of very low sulfur fuel, as defined in § 72.2 of this chapter (SO
(b) Except for SO
(c) If, at the end of the 720 unit (or stack) operating hour grace period, the RATA has not been completed, data from the monitoring system shall be invalid, beginning with the first unit operating hour following the expiration of the grace period. Data from the CEMS remain invalid until the hour of completion of a subsequent hands-off RATA. The deadline for the next test shall be either two QA operating quarters (if a semiannual RATA frequency is obtained) or four QA operating quarters (if an annual RATA frequency is obtained) after the quarter in which the RATA is completed, not to exceed eight calendar quarters.
(d) When a RATA is done during a grace period in order to satisfy a RATA requirement from a previous quarter, the deadline for the next RATA shall be determined as follows:
(1) If the grace period RATA qualifies for a reduced (i.e., annual) RATA frequency, the deadline for the next RATA shall be set at three QA operating quarters after the quarter in which the grace period test is completed.
(2) If the grace period RATA qualifies for the standard (i.e., semiannual) RATA frequency, the deadline for the next RATA shall be set at two QA operating quarters after the quarter in which the grace period test is completed.
(3) Notwithstanding these requirements, no more than eight successive calendar quarters shall elapse after the quarter in which the grace period test is completed, without a subsequent RATA having been conducted.
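The deadline rules in paragraphs (d)(1) through (d)(3) above can be summarized as follows. This is a minimal sketch, assuming QA operating quarters and calendar quarters are counted by the DAHS; the function name and return structure are illustrative.

```python
def next_rata_deadline_after_grace(qualifies_for_annual_frequency: bool) -> dict:
    """Return the deadline, in quarters after the quarter in which the grace-period RATA
    is completed, per section 2.3.3(d): three QA operating quarters if the test qualifies
    for the reduced (annual) frequency, two QA operating quarters if it qualifies only for
    the standard (semiannual) frequency, and in no case more than eight successive
    calendar quarters."""
    qa_quarters = 3 if qualifies_for_annual_frequency else 2
    return {"qa_operating_quarters": qa_quarters, "max_calendar_quarters": 8}

print(next_rata_deadline_after_grace(True))   # {'qa_operating_quarters': 3, 'max_calendar_quarters': 8}
```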
2.3.4 Bias Adjustment Factor
Except as otherwise specified in section 7.6.5 of appendix A to this part, if an SO
(a) When a significant change is made to a monitoring system such that recertification of the monitoring system is required in accordance with § 75.20(b), a recertification test (or tests) must be performed to ensure that the CEMS continues to generate valid data. In all recertifications, a RATA will be one of the required tests; for some recertifications, other tests will also be required. A recertification test may be used to satisfy the quality assurance test requirement of this appendix. For example, if, for a particular change made to a CEMS, one of the required recertification tests is a linearity check and the linearity check is successful, then, unless another such recertification event occurs in that same QA operating quarter, it would not be necessary to perform an additional linearity test of the CEMS in that quarter to meet the quality assurance requirement of section 2.2.1 of this appendix. For this reason, EPA recommends that owners or operators coordinate component replacements, system upgrades, and other events that may require recertification, to the extent practicable, with the periodic quality assurance testing required by this appendix. When a quality assurance test is done for the dual purpose of recertification and routine quality assurance, the applicable data validation procedures in § 75.20(b)(3) shall be followed.
(b) Except as provided in section 2.3.3 of this appendix, whenever a passing RATA of a gas monitor is performed, or a passing 2-load (or 2-level) RATA or a passing 3-load (or 3-level) RATA of a flow monitor is performed (irrespective of whether the RATA is done to satisfy a recertification requirement or to meet the quality assurance requirements of this appendix, or both), the RATA frequency (semi-annual or annual) shall be established based upon the date and time of completion of the RATA and the relative accuracy percentage obtained. For 2-load (or 2-level) and 3-load (or 3-level) flow RATAs, use the highest percentage relative accuracy at any of the loads (or levels) to determine the RATA frequency. The results of a single-load (or single-level) flow RATA may be used to establish the RATA frequency when the single-load (or single-level) flow RATA is specifically required under section 2.3.1.3(b) of this appendix or when the single-load (or single-level) RATA is allowed under section 2.3.1.3(c) of this appendix for a unit that has operated at one load level (or operating level) for ≥85.0 percent of the time since the last annual flow RATA. No other single-load (or single-level) flow RATA may be used to establish an annual RATA frequency; however, a 2-load or 3-load (or a 2-level or 3-level) flow RATA may be performed at any time or in place of any required single-load (or single-level) RATA, in order to establish an annual RATA frequency.
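The following is a minimal sketch of setting the RATA frequency from the relative accuracy results, as described in paragraph (b) above. The 7.5 and 10.0 percent thresholds are the flow-monitor percentage criteria shown in Figure 2 of this appendix; the alternative low-flow and low-emitter criteria of section 2.3.1.2 are not modeled, and the function name is illustrative.

```python
def rata_frequency(relative_accuracy_by_level: list[float]) -> str:
    """Determine the RATA frequency for a flow monitor from the percent relative accuracy
    obtained at each tested load (or operating) level. Per paragraph (b) above, the highest
    relative accuracy at any level governs; per Figure 2, RA <= 7.5 percent earns an annual
    frequency and 7.5 < RA <= 10.0 percent a semiannual frequency."""
    worst = max(relative_accuracy_by_level)
    if worst <= 7.5:
        return "annual"
    if worst <= 10.0:
        return "semiannual"
    return "failed"  # RA above 10.0 percent does not meet the percentage specification

print(rata_frequency([6.2, 7.1]))   # annual
print(rata_frequency([6.2, 9.4]))   # semiannual
```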
2.5 Other Audits
Affected units may be subject to relative accuracy test audits at any time. If a monitor or continuous emission monitoring system fails the relative accuracy test during the audit, the monitor or continuous emission monitoring system shall be considered to be out-of-control beginning with the date and time of completion of the audit, and continuing until a successful audit test is completed following corrective action. If a monitor or monitoring system fails the bias test during an audit, use the bias adjustment factor given by equations A-11 and A-12 in appendix A to this part to adjust the monitored data. Apply this adjustment factor from the date and time of completion of the audit until the date and time of completion of a relative accuracy test audit that does not show bias.
Figure 1 to Appendix B of Part 75—Quality Assurance Test Requirements
Basic QA test frequency requirements:

Test | Daily * | Quarterly * | Semiannual or annual *
---|---|---|---
Calibration Error Test (2 pt.) | X | |
Interference Check (flow) | X | |
Flow-to-Load Ratio | | X |
Leak Check (DP flow monitors) | | X |
Linearity Check * (3 pt.) | | X |
RATA (SO2, NOX, CO2, O2, moisture) 1 | | | X
RATA (flow) 1 2 | | | X
* “Daily” means operating days, only. “Quarterly” means once every QA operating quarter. “Semiannual” means once every two QA operating quarters. “Annual” means once every four QA operating quarters.
1 Conduct RATA annually (i.e., once every four QA operating quarters) rather than semiannually, if monitor meets accuracy requirements to qualify for less frequent testing.
2 For flow monitors installed on peaking units, bypass stacks, or units that qualify for single-level RATA testing under section 6.5.2(e) of appendix A to this part, conduct all RATAs at a single, normal load (or operating level). For other flow monitors, conduct annual RATAs at two load levels (or operating levels). Alternating single-load and 2-load (or single-level and 2-level) RATAs may be done if a monitor is on a semiannual frequency. A single-load (or single-level) RATA may be done in lieu of a 2-load (or 2-level) RATA if, since the last annual flow RATA, the unit has operated at one load level (or operating level) for ≥85.0 percent of the time. A 3-level RATA is required at least once every five years (20 calendar quarters) and whenever a flow monitor is re-characterized, except for flow monitors exempted from 3-level RATA testing under section 6.5.2(b) or 6.5.2(e) of appendix A to this part.
Figure 2 to Appendix B of Part 75—Relative Accuracy Test Frequency Incentive System
RATA | Semiannual W | Annual W
---|---|---
SO2/NOX Y | 7.5% < RA ≤ 10.0% or ±15.0 ppm X | RA ≤ 7.5% or ±12.0 ppm X
NOX-diluent | 7.5% < RA ≤ 10.0% or ±0.020 lb/mmBtu X | RA ≤ 7.5% or ±0.015 lb/mmBtu X
Flow | 7.5% < RA ≤ 10.0% or ±2.0 fps X | RA ≤ 7.5% or ±1.5 fps X
CO2/O2 | 7.5% < RA ≤ 10.0% or ±1.0% CO2/O2 X | RA ≤ 7.5% or ±0.7% CO2/O2 X
Moisture | 7.5% < RA ≤ 10.0% or ±1.5% H2O X | RA ≤ 7.5% or ±1.0% H2O X
W The deadline for the next RATA is the end of the second (if semiannual) or fourth (if annual) successive QA operating quarter following the quarter in which the CEMS was last tested. Exclude calendar quarters with fewer than 168 unit operating hours (or, for common stacks and bypass stacks, exclude quarters with fewer than 168 stack operating hours) in determining the RATA deadline. For SO
X The difference between monitor and reference method mean values applies to moisture monitors, CO
Y A NO