Purpose
To describe the procedure for the operation, calibration and maintenance of the Gas Chromatograph with Head Space sampler (GC-HS).
Scope
This SOP applies to the Gas Chromatograph with Head Space sampler, GC-HS (Make: Agilent; Model: 6850/7694E).
Responsibility
It is the responsibility of the QC Executive to follow this procedure.
Accountability
Manager – Quality Control
Procedure
Operation
- Ensure that the instrument is clean, free from dust and vibration.
- Ensure that the environmental conditions are suitable (temperature: 15-35°C; relative humidity: 40-60%).
- Ensure that all parts of the instrument and system are in the correct configuration.
- Turn on the nitrogen (grey cylinder with a black top), air (grey) and hydrogen (red) cylinders kept at the gas station on the ground floor. Always check the pressure readings on the regulators.
- Turn on the GC instrument along with the HS.
- Open the toggle valves on the gas panel for all the three gas lines of hydrogen, air and nitrogen.
- Click the Instrument Online icon on the desktop to launch the GC ChemStation software, which controls the instrument remotely.
- Go to Method & Run Control and load the desired method. This downloads all the method conditions to the GC, namely the oven temperature, front inlet temperature, split ratio, front detector temperature, front detector hydrogen and air flows, and column carrier gas (nitrogen) flow.
- Manually set the HS method parameters: initialization time, vial temperature, run time, injection time, shaking time and injection purge.
- Place the vial containing the sample in the HS sampler at the desired position and enter all the required information in the sample sequence. There are 18 sample positions.
- When the ChemStation online window and the HS keypad both show Ready (all indicators green), press 'START' on the HS keypad. This puts the instrument in run mode.
- In the offline window, the run can be monitored by clicking the Snapshot option in the drop-down menu. Further data analysis can also be performed in the same window.
Shut Down after Cooling
- This is a critical operation and must be performed after every analysis.
- Load the method cool.m in the ChemStation online window. This turns off the detector gases (both hydrogen and air), and the fan inside the oven cools the capillary column until the temperature falls to 30°C. Once this set temperature is reached, the system is safe to turn off.
(Never turn off the GC system while the column is at a temperature higher than 30°C. This can damage the column, as the packing material starts leaching out and performance deteriorates.)
Calibration
- The IPV (Instrument Performance Verification) is carried out by the service engineer as part of the AMC visits on a yearly basis, along with a documentation report.
- The calibration frequency for instrument performance is once a year. Test solutions and standards used or recommended by the vendor are acceptable if their Certificates of Analysis (COAs) are available. Use test solutions or standards traceable to NIST or an international reference where available.
- Service Qualification: An analyst or a qualified service representative may perform the calibration. Training certificates must be provided before work commences.
- Column: A capillary column is used for calibration of the split/splitless injector with appropriate standards.
Temperature accuracy and stability (Detector):
- Compare the selected detector set-point temperature against the actual system reading (refer to Agilent's user manual for recommended set temperatures). Temperature accuracy is measured at two set points: 230°C and 100°C. The actual temperature reading should be within ±1% of the set point, and the temperature stability should be ≤ 0.5°C.
Temperature accuracy and Stability (HS Sampler):
- Compare the selected set-point temperature against the actual system reading (refer to Agilent's user manual for recommended set temperatures). Temperature accuracy is measured at one set point: 70°C. The actual temperature reading should be within ±1% of the set point, and the temperature stability should be ≤ 0.5°C.
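As an illustration, the ±1% accuracy and ≤ 0.5°C stability criteria above can be sketched in Python (the function names and example readings are illustrative only, not part of the instrument software):

```python
def temp_accuracy_ok(set_point_c, actual_c, tol_pct=1.0):
    """Actual reading must be within ±tol_pct % of the set point."""
    return abs(actual_c - set_point_c) <= set_point_c * tol_pct / 100.0

def temp_stability_ok(readings_c, max_spread_c=0.5):
    """Spread of repeated readings must not exceed max_spread_c."""
    return (max(readings_c) - min(readings_c)) <= max_spread_c

# Detector check at the 230°C set point, with assumed example readings:
print(temp_accuracy_ok(230.0, 231.5))            # within ±2.3°C, so True
print(temp_stability_ok([231.4, 231.5, 231.6]))  # 0.2°C spread, so True
```

The same two checks apply to the HS sampler at its single 70°C set point.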
Inlet pressure decay:
- This test demonstrates the pressure integrity of the GC inlet (with a valve-controlled injection system, if applicable) and of all flows controlled by the GC inlet pneumatics. The acceptance limit for the pressure change is ≥ -2.0 and < 0.5 psi per 5 minutes.
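The decay criterion above can be expressed as a small sketch (a hypothetical helper, assuming start and end pressures are read off the inlet gauge over the 5-minute hold):

```python
def pressure_decay_ok(start_psi, end_psi):
    """Pressure change over the 5-minute test must satisfy -2.0 <= delta < 0.5 psi."""
    delta = end_psi - start_psi
    return -2.0 <= delta < 0.5

print(pressure_decay_ok(25.0, 24.5))  # -0.5 psi drop: acceptable, True
print(pressure_decay_ok(25.0, 22.0))  # -3.0 psi drop: leak suspected, False
```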
Inlet pressure accuracy:
- This test demonstrates the ability of the system to provide accurate pressure to the head of the column. The acceptance limit should be ≤ 1.2.
Detector flow accuracy:
- It is determined by measuring the flows with a calibrated mass flow meter and comparing them to the test set points and to the values displayed by the GC (if applicable). The acceptance limits are as follows:
Signal Noise and Drift:
- The baseline signal is recorded at the beginning of the test. Signal noise is calculated as ASTM noise, i.e. the average peak-to-peak noise over a number of signal segments, and signal drift is calculated as the slope of a linear regression of the signal. The acceptance limits are as follows:
- Noise (ASTM): ≤ 0.1 pA
- Drift: ≤ 2.5
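The two calculations above can be sketched as follows (a minimal Python illustration, assuming the detector trace is available as a list of samples; these helpers are not part of ChemStation):

```python
def astm_noise(signal, n_segments=10):
    """Average peak-to-peak value over equal segments of the baseline signal."""
    seg_len = len(signal) // n_segments
    p2p = [max(signal[i * seg_len:(i + 1) * seg_len]) -
           min(signal[i * seg_len:(i + 1) * seg_len])
           for i in range(n_segments)]
    return sum(p2p) / n_segments

def drift_slope(times, signal):
    """Slope of the least-squares line fitted through (time, signal) points."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(signal) / n
    num = sum((t - mt) * (s - ms) for t, s in zip(times, signal))
    den = sum((t - mt) ** 2 for t in times)
    return num / den
```

A perfectly flat baseline gives zero noise and zero drift; the measured values are compared against the limits above.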
Signal to Noise:
- This test uses a traceable standard to determine the signal-to-noise ratio. The acceptance limit should be ≥ 300,000.
Head Space
Injector Precision:
- This test uses a traceable standard to determine injector precision. The acceptance limits are as follows:
- Injector precision (liquid): retention time % RSD ≤ 1.0
- Area % RSD: ≤ 3.0
- Injector precision (head space): retention time % RSD ≤ 1.0
- Area % RSD: ≤ 3.0
- If the test fails, arrange a service visit and place an "Out of Service" sticker on the instrument.
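The % RSD used in the precision limits above is the relative standard deviation of the replicate injections; it can be computed as in this short sketch (the replicate values shown are assumed, for illustration only):

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation (%) of replicate results."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

rts = [2.01, 2.02, 2.00, 2.01, 2.02, 2.01]    # retention times, min (assumed)
areas = [1050, 1062, 1041, 1055, 1048, 1059]  # peak areas (assumed)
print(percent_rsd(rts) <= 1.0)    # True: retention time limit met
print(percent_rsd(areas) <= 3.0)  # True: area limit met
```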
Annexure
Nil
Revision History
Nil