Can anyone tell me where I can find how the size of the calibration orifice in the compression tester was determined? Theoretically would the orifice size change for each type of engine?
.040" for your standard Aircraft engine compression differential tester. A .060" special orifice can be used for big bore engines although most don't bother. Most mechanics will compare compression reading against the calibration port on many of the testers. The reality is, the actual compression numbers people seem to hang their hats on (because that's all that gets recorded in the logs), aren't terribly meaningful as they vary significantly with the displacement and wear of the engine. If a cylinder is leaking, what is more important is where it is leaking, and how much, which is where the number comes from. A low compression reading typically warrants a boroscope inspection. If it's a leaky exhaust valve, the color and pattern will tell right away if the valve is burning or just has a piece of crud caught under the face. Leaky rings may just be a stuck ring, but a boroscope inspection will tell if there is significant scoring or sometimes the plating coming off the upper cylinder face.
Cub Builder
While I agree with your comments, I'm still in the dark about the sizing of the orifice. Who decided that .040" was appropriate? What was the basis for that decision? Did someone just happen to have a handy #60 drill? Using the orifice results in a greater "allowable" leakage than the old 60 psi limit. Why? Questions, questions, questions.
Didn't I answer this in a previous thread? The answer is that it's arbitrary, from ages ago (round-engine days). It's not an essential concern in doing the test.
Exactly right. The number isn't so important as whether the cylinder is leaking and what is leaking on it. Exhaust valve, intake valve, rings, cracked head, scored cylinder? Large-bore Continentals are expected to be down into the 50s or even 40s for compression, as the ring gaps are huge and they tend to leak a lot cold. But a cylinder may be failing with a higher compression if it's a leaky valve. There is nothing magical about 60 psi other than being an arbitrary number that gets recorded in the logs so people can read it in hopes they are buying a healthy engine.

Generally speaking, good compressions are indicative of a healthy engine. Low compressions may mean the engine is in for some major work, but not necessarily. They are only an indicator that the mechanic needs to look closer. I have seen a number of "failing" cylinders magically healed with a little TLC in the shop. No two gauge sets seem to read exactly the same, so there is apparently a lot of variance in the .040" orifices.
You'll also find differential compression testers used in racing, except that they often use 100 psi rather than 80. The differential compression or leak-down test is more about showing potential problems so they can be repaired before they become real problems than about finding a magic number to record in your logbook.
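The same arithmetic scales to the 100 psi setup; reusing the hypothetical percent_leakage() sketch from earlier in the thread:

```python
# 100 psi supply, hypothetical 88 psi cylinder reading -> 12% leakage
print(percent_leakage(88.0, supply_psi=100.0))  # 12.0
```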
I obviously am not making myself clear. I know how to do a compression test. I know examining the cylinder for the source of leakage is more important. However, for years we did compression tests with a tester that did not have a calibration orifice. Why was that changed? How was the decision to change made? If the size of the orifice is arbitrary, whose arbitrary choice was it? I repeat: I am in favor of the present system of compression testing, and I agree with the comments made concerning test limitations. I am just trying to find out the history of the procedure changes.
I'm not making myself clear. For years we had compression testers without calibration orifices. With this tester, any cylinder that had leakage to below 60 psi "failed". Then someone (I think Continental) decided that the number was not that important; cylinder condition was. At about this time a new compression tester appeared. This tester included a calibration orifice and a slightly different testing protocol. So my questions are: who decided to put the orifice in the tester? Who decided the orifice should be .040"? Who decided on the testing protocol?
As I said in the other thread, they were always set to be about nominal size. Nobody much stated an official number until Continental got tired of people holding the numbers hostage, so they computed what they decided would be an acceptable leakdown (or at least a way to stop people complaining about the quality of their product). Of course, I have my own beef with Continental over other issues, but this one is much ado about nothing.
It's the same as why we use 80 pounds. It was a good number for round engines because it gave a reasonable check of the seal without making it too hard to hold the piston at TDC.
There was never an automatic failure at <60 psi compression differential. Many mechanics would call it a failure, and oftentimes cylinders were failing at that pressure, but not necessarily so. I have had engines that were less than 60 psi on a differential compression check but were still acceptable and flown for a number of hours before overhaul. The "master orifice" added to the gauge sets gives a comparison leakdown for large-bore Continentals and was a recommendation from Continental. I never knew or cared what size the reference port was.
Continental specifies the .060" master orifice for big-bore engines like my IO-550. That typically results in ~45 psi and controls where expanded inspection kicks in. The orifice size is in the manual, so I don't question the number any more than I do other limits. I just bought a test set with that size. Most people start to squirm when the leakage gets into the 40s, so I suspect the size was chosen using TLAR (that looks about right).
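As I understand the procedure, you flow the regulated 80 psi through the master orifice first to set the floor for that gauge set, then compare each cylinder against it. A minimal sketch of that comparison, with made-up readings:

```python
# Sketch of the master-orifice comparison described above; all
# readings are hypothetical, not from any manual. The master-orifice
# reading sets the floor for this gauge set on this day.

master_psi = 45.0  # e.g., ~45 psi through a .060" orifice at 80 psi supply
cylinders = {1: 62.0, 2: 44.0, 3: 58.0}  # hypothetical cylinder readings

for cyl, psi in cylinders.items():
    status = "OK" if psi >= master_psi else "expanded inspection"
    print(f"Cyl {cyl}: {psi:.0f}/80 -> {status}")
```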
Regards,
Greg Young
1950 Navion N5221K
RV-6 N6GY - first flight 5/16/2021
1940 Rearwin Cloudster in work
4 L-2 projects on deck