
Author Topic: British Super-/Turbocharger Development  (Read 345 times)

Offline Schneiderman

  • Senior Member
  • CLEARANCE: Top Secret
  • **
  • Posts: 947
British Super-/Turbocharger Development
« on: September 26, 2017, 02:34:05 am »
There is no question that the UK had the necessary metallurgy to construct turbines; Whittle, Hooker and their peers would confirm that.
Rolls-Royce and Bristol, along with BTH and research at the RAE etc., had been developing superchargers progressively since the mid 1920s and had achieved a high degree of efficiency in both intake and impeller design. In the Rolls-Royce 'R' racing engine of 1929 the power consumption of the supercharger was matched by the additional boost achieved in the intake duct, effectively taking the net power drain to zero. Furthermore, the introduction of ejector exhausts to provide a modicum of thrust meant that not all the energy in the exhaust was lost. All in all I doubt that the advantage of turbochargers in the war years was that significant.
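
For a rough sense of scale (not from the post, and not Rolls-Royce data), here is a minimal Python sketch of the ideal stagnation-pressure recovery a well-designed intake duct could achieve at various flight speeds, assuming lossless isentropic recovery in ISA sea-level air; all figures are illustrative assumptions.

```python
# Rough sketch (figures assumed, not Rolls-Royce data): ideal stagnation-pressure
# recovery available to a well-designed intake duct, assuming isentropic
# recovery in ISA sea-level air.
GAMMA = 1.4      # ratio of specific heats for air
R_AIR = 287.05   # J/(kg*K)
T0 = 288.15      # K, ISA sea-level temperature

def ram_pressure_ratio(v_mps):
    """Ideal total/static pressure ratio at flight speed v (m/s)."""
    a = (GAMMA * R_AIR * T0) ** 0.5                  # speed of sound
    mach = v_mps / a
    return (1 + (GAMMA - 1) / 2 * mach ** 2) ** (GAMMA / (GAMMA - 1))

for mph in (200, 300, 400):
    v = mph * 0.44704
    gain = (ram_pressure_ratio(v) - 1) * 100
    print(f"{mph} mph: ideal intake recovery ~{gain:.0f}% above ambient pressure")
```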

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #1 on: September 30, 2017, 12:05:14 pm »
England had the technology to develop superchargers and could get the required materials and fuel from the US. But the two-stage, intercooled, mechanical supercharger met the RAF's requirements, which made turbocharger development a luxury that could be left to the Americans.

That said, the work of Hooker and Whittle on jet engine turbines is not, I believe, directly comparable with work on turbochargers. Jet-engine turbines work at significantly lower temperatures, closer to those produced by diesels. Indeed, most jet engines burn diesel or a similar light fuel oil.

My main point was that the US had the success that it did with turbochargers for unique and largely fortuitous reasons--a large domestic gasoline industry and the accidental discovery of Vitallium. Turbochargers were being developed in WW1, if not before, and in many countries. But no one had much success until the US in the 1940s. Bristol turbocharged the Hercules, but did not persist with it once the two-stage Merlin proved successful.

Even then, success was marginal. Most American turbocharged engines had issues with overheating and turbine failures throughout the war--hence the many proposals for replacing turbocharged Allisons with Packard-Merlins in airplanes like the P-38.

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #2 on: September 30, 2017, 12:20:54 pm »
One more point, re the Rolls-Royce R engine. Racing engines are not a good parallel with service engines generally, and ram-air induction is a case in point. Ram induction only works within the speed and altitude range for which the inlet duct was tuned. Under other conditions, the inlet geometry is likely to cause turbulence that hurts performance. So ram-air induction can work on a low-level racing engine that runs for a relatively short time, but it would be much harder to use in a service engine.

Ejector-type exhaust pipes are obviously more applicable to service conditions. They recover some of the exhaust energy by producing thrust. But they do not reduce the power consumed by the supercharger or attain the efficiency that is at least theoretically possible by compressing the intake charge using the turbine.
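
As a hedged illustration of why the value of exhaust thrust grows with flight speed, here is a short Python sketch converting an assumed ejector thrust into the equivalent shaft power a propeller would need to produce the same thrust; the thrust figure and propeller efficiency are assumptions for scale only.

```python
# Illustrative sketch (thrust figure and propeller efficiency are assumptions):
# equivalent shaft power recovered from ejector-exhaust thrust. Thrust is only
# "worth" power in proportion to flight speed, so the benefit grows with speed.
def exhaust_thrust_power(thrust_n, flight_speed_mps, prop_efficiency=0.8):
    """Shaft power (W) a propeller of the given efficiency would need to
    produce the same thrust at the same flight speed."""
    return thrust_n * flight_speed_mps / prop_efficiency

THRUST_N = 600.0   # assumed total ejector thrust for a Merlin-class engine
for mph in (250, 350, 450):
    v = mph * 0.44704
    kw = exhaust_thrust_power(THRUST_N, v) / 1000
    print(f"{mph} mph: ~{kw:.0f} kW ({kw * 1.341:.0f} hp) equivalent shaft power")
```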

So I still maintain that, in wartime, the choice between turbocharging and the combination of exhaust thrust and mechanical supercharging was more a matter of requirements and history than of any theoretical, absolute advantage of one approach over the other.

Offline Avimimus

  • CLEARANCE: Top Secret
  • ***
  • Posts: 1708
Re: British Super-/Turbocharger Development
« Reply #3 on: September 30, 2017, 12:35:22 pm »
Thanks for the analysis. Engine design isn't something that is discussed enough.

Offline Richard N

  • Senior Member
  • CLEARANCE: Secret
  • **
  • Posts: 217
Re: British Super-/Turbocharger Development
« Reply #4 on: September 30, 2017, 01:09:22 pm »
“Even then, success was marginal. Most American turbocharged engines had issues with overheating and turbine failures throughout the war--hence the many proposals for replacing turbocharged Allisons with Packard-Merlins in airplanes like the P-38.”

That sounds more like an Allison issue than a turbocharging problem. Turbocharging seems to have worked well enough for the B-24, B-17, B-29, and P-47 on their air-cooled radials.

Offline Schneiderman

  • Senior Member
  • CLEARANCE: Top Secret
  • **
  • Posts: 947
Re: British Super-/Turbocharger Development
« Reply #5 on: October 01, 2017, 08:40:03 am »
Fair points, but I still doubt that metallurgy would have slowed development of a turbo had it been required. My reference to the 'R' was not regarding any ram effect but the rise in air pressure within the expanding intake duct as a result of the airflow slowing; this would be equally valid for any well-designed intake on a military aircraft.

Offline red admiral

  • Senior Member
  • CLEARANCE: Secret
  • **
  • Posts: 494
Re: British Super-/Turbocharger Development
« Reply #6 on: October 04, 2017, 10:10:50 am »
For the US, long range and thus fuel economy were greater concerns than they were for the European powers. The turbocharger had theoretical advantages in this respect. A mechanically supercharged engine burns a lot of fuel just to drive the supercharger and blows a lot of usable energy out the exhaust stacks. A turbocharged engine drives the supercharger using that otherwise wasted energy.

The exhaust energy from the exhaust stacks isn't actually wasted, though; it provides a significant amount of thrust compared to the propeller, especially at high altitude and high speed. I seem to remember a figure of ~30 mph boost.

I came across a nice comparison between turbocharged and supercharged engines many years ago. Turbocharged was better at low speed and altitude; supercharged was superior above ~400 mph and above 20,000 ft, I seem to remember. But not a massive difference.

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #7 on: October 07, 2017, 12:30:54 pm »
Quote
The exhaust energy from the exhaust stacks isn't actually wasted, though; it provides a significant amount of thrust compared to the propeller, especially at high altitude and high speed. I seem to remember a figure of ~30 mph boost.

I came across a nice comparison between turbocharged and supercharged engines many years ago. Turbocharged was better at low speed and altitude; supercharged was superior above ~400 mph and above 20,000 ft, I seem to remember. But not a massive difference.

I believe that your first point is correct in some cases but not in others. Like most technical issues, advantages--and performance--are relative to conditions and requirements. Thrust-producing exhaust stacks demand a lot of experimental tuning work to get them right. When they aren't right, they might even hurt performance, due to back pressure or poor scavenging. In the case of the Merlin, it took a while before the right solution was found. Ejector stacks also raise operational concerns--the visible exhaust flames are problematic for military aircraft operating at night.

In your second point, I think you have mixed up the conditions. Supercharged (gear- or turbine-driven) will be superior at altitude. Unsupercharged or mildly supercharged engines will be superior at low level (hence the cropped supercharger impellers used when adapting Merlin engines for low level use).

With few exceptions, military and commercial aero engines never make more power than they do at sea level. Maximum power is usually needed when getting a loaded aircraft off the ground. The engine will not need more as it climbs and cruises, because lift will increase with speed, fuel burn will reduce weight, and drag will be less in less dense air. So, if maximum power output were the issue (as it would be in race cars and Schneider Trophy racers), there would be little point in having a supercharger. A gear- or turbine-driven supercharger would just add weight and complexity. You can always get the power with a larger, lighter, and/or faster-turning engine.

But maximum power output is, of course, not the only issue in a working airplane. As the airplane climbs, the air density decreases and power/thrust decreases with it. There is less oxygen per intake stroke, less fuel burned per expansion stroke, and less high-pressure, high-velocity gas in the exhaust. So an engine's critical altitude--the altitude at which sea-level power begins to fall off due to decreasing air pressure--is what matters in practical (non-racing) applications. Superchargers (mechanical or turbo) are used to delay the point at which sea-level power falls off by artificially increasing the density of the intake air. As altitude increases, both have to do more work. But the power required for that work comes from different sources that have very different characteristics, costs, and benefits--and it is in this respect that mechanical and turbo superchargers have their respective advantages and disadvantages.
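
A minimal sketch, assuming the standard ISA troposphere model, of how quickly the required boost ratio rises with altitude just to hold sea-level manifold pressure:

```python
# Minimal sketch, assuming the ISA troposphere model: the pressure ratio the
# supercharger (mechanical or turbo) must deliver just to restore sea-level
# manifold pressure as the aircraft climbs.
P0, T0, LAPSE, G, R_AIR = 101325.0, 288.15, 0.0065, 9.80665, 287.05

def ambient_pressure(alt_m):
    """ISA static pressure (Pa), valid below 11 km."""
    return P0 * (1 - LAPSE * alt_m / T0) ** (G / (R_AIR * LAPSE))

for ft in (0, 10000, 20000, 30000):
    ratio = P0 / ambient_pressure(ft * 0.3048)
    print(f"{ft:>6} ft: boost ratio needed ~{ratio:.2f}")
```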

During the war, in England, the mechanical supercharger had the advantage of being very highly developed (largely due to Bristol's early lead in air-cooled engines and Rolls-Royce's pre-war investments and racing experience--Napier did not benefit). The mechanical approach was expensive and demanded a lot of development. The supercharger was driven by precision-machined gears and clutches, and the whole installation was integral with and dedicated to a given engine design and model. Superchargers weren't interchangeable, mass-production units and couldn't be added to any desired engine (though Allison worked on some designs with this in mind). Mechanical superchargers were also efficient only within a fairly narrow altitude band. In general, to achieve a higher critical altitude, engineers had to redesign and/or add hardware. Rolls-Royce were the masters at addressing this. But the solutions--multi-speed drives, multi-stage impellers, and intercoolers--added weight and complexity. The parts had to be finely tuned and governed so that abrupt speed changes did not catastrophically damage components. The extra parts increased the power consumption of the supercharger and reduced the net gains. Complexity and weight yield diminishing returns, which is probably why two-stage engines were successful but, with the exception of German diesels, three-stage engines were not. As several of us have rightly pointed out, tuned exhaust ejectors add thrust that can offset the power consumption of the supercharger, but only in the supercharger's designed operating range. At other altitudes, the supercharger will consume power without producing optimal power or exhaust thrust.
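
To put a rough number on that power consumption, here is a back-of-envelope Python sketch using ideal compression work divided by an assumed adiabatic efficiency; the mass flow, inlet temperature, pressure ratio, and efficiency are illustrative assumptions, not figures for any particular engine.

```python
# Back-of-envelope sketch (all figures assumed, not for any particular engine):
# shaft power absorbed by a gear-driven supercharger, from ideal compression
# work divided by an assumed adiabatic efficiency.
CP_AIR = 1005.0   # J/(kg*K)
GAMMA = 1.4

def supercharger_power(mass_flow_kg_s, inlet_temp_k, pressure_ratio, adiabatic_eff=0.7):
    """Shaft power (W) to compress the given air mass flow through the given ratio."""
    ideal_temp_rise = inlet_temp_k * (pressure_ratio ** ((GAMMA - 1) / GAMMA) - 1)
    return mass_flow_kg_s * CP_AIR * ideal_temp_rise / adiabatic_eff

# e.g. ~1.3 kg/s of air (roughly a large V-12 at full power) through a 2:1 ratio
power_w = supercharger_power(1.3, 250.0, 2.0)
print(f"~{power_w / 746:.0f} hp absorbed by the blower")
```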

Turbochargers had the disadvantage of a relative lack of development prior to the war, largely due to fuel and metallurgical issues (see https://history.nasa.gov/SP-4306/ch3.htm, S. D. Heron's autobiography, and his Development of Aviation Fuels). During the war, this arguably limited their success vs. the mechanical supercharger. But post-war, turbochargers replaced their mechanical counterparts in almost all applications, from light airplanes to airliners. The turbocharger is relatively simple compared to a mechanical unit, because it lacks most of the multi-speed clutching and gearing. The units were mass-produced in various sizes that could be more or less bolted on to a variety of production engines. Performance-wise, the turbocharger had the huge advantage of automatically producing higher supercharger speed--and thus higher pressure--as altitude increased, up to the critical altitude. The compressor/impeller had to turn faster to produce sea-level power, as in the case of the gear-driven unit. But the turbine had to overcome less back pressure and thus automatically spun the compressor faster. No gear trains. No clutches. No complex mechanical governors. A simple (if sometimes troublesome) blow-off valve prevented compressor over-speed/over-pressure problems by venting excess exhaust gas to the atmosphere. This was less efficient than a properly tuned ejector stack below critical altitude, but the inefficiency would be largely offset by the lack of mechanical losses under these conditions. On the other hand, turbochargers did compress the intake air in close proximity to the exhaust, and components could get very hot. All wartime turbocharger installations suffered from fires, detonation, and overheating, a problem that was kept in bounds mainly by the abundant use of anti-detonant injection (ADI, water injection with alcohol as antifreeze) during take-off and landing.
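
For the mechanism described above, a simplified power-balance sketch in Python (all figures assumed): the turbine need only supply the compressor's demand, and the waste gate bypasses the rest of the exhaust.

```python
# Simplified power-balance sketch (all figures assumed): the turbine must
# supply only the compressor's power demand; the waste gate bypasses whatever
# exhaust energy is left over. Boost costs no shaft power, at the price of
# heat and back pressure around the turbine.
CP_AIR, CP_EXH = 1005.0, 1150.0   # J/(kg*K) for intake air and exhaust gas
GAMMA_AIR, GAMMA_EXH = 1.4, 1.33

def compressor_power(m_dot, t_in, pressure_ratio, eff=0.70):
    return m_dot * CP_AIR * t_in * (pressure_ratio ** ((GAMMA_AIR - 1) / GAMMA_AIR) - 1) / eff

def turbine_power(m_dot, t_in, expansion_ratio, eff=0.75):
    return m_dot * CP_EXH * t_in * eff * (1 - expansion_ratio ** -((GAMMA_EXH - 1) / GAMMA_EXH))

comp = compressor_power(1.3, 230.0, 3.0)   # high-altitude case, cold thin intake air
turb = turbine_power(1.35, 1100.0, 2.5)    # all exhaust routed through the turbine
print(f"compressor needs ~{comp / 1000:.0f} kW, turbine could supply ~{turb / 1000:.0f} kW")
print(f"waste gate bypasses roughly {max(0.0, 1 - comp / turb) * 100:.0f}% of the exhaust")
```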

Overall, I suspect that the lower costs that result from suitability for mass production and interchangeability of parts explain why the USAAF preferred the turbo and why it replaced the mechanical supercharger in the post-war civil markets. When the performance differences aren't that great in theory, and when cost and availability are critical, an off-the-shelf unit almost always wins over a multi-year bespoke engineering effort. In 1939, England had already completed that effort, so following through on the Merlin made sense. But it was a dead end (aside, of course, from all the expertise RR gained in developing the compressor itself, which put them in the forefront of jet development when coupled with a gas-turbine drive).

I said that there were few exceptions to the rule that military and commercial aero engines never make more power than they do at sea level. The exceptions are the Austrian and German "super-compressed" engines of the First World War. These had compression ratios optimized for the intended operating altitude of the aircraft. As I understand them, a decompression lever on the camshaft let a ground crewman lower the compression enough to swing the prop and start the motor. The lever was then returned to the full-compression position. From sea level to the optimal altitude, the pilot had to operate the engine at part throttle--and low power--to avoid overstressing it. This was effective and met the immediate need, but was inflexible, inefficient, and unlikely to be good for durability.



Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #8 on: October 07, 2017, 12:33:46 pm »
Fair points, but I still doubt that metallurgy would have slowed development of a turbo had it been required. My reference to the 'R' was not regarding any ram effect but the rise in air pressure within the expanding intake duct as a result of the airflow slowing; this would be equally valid for any well-designed intake on a military aircraft.

Please explain. What you describe is what I think of AS ram effect. Changes in section in the inlet trade flow velocity for higher pressure, an effect that is proportional to the speed with which the intake moves through the air. Thanks.
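
To illustrate the trade I mean, here is a tiny incompressible-flow sketch (numbers assumed for illustration): continuity plus Bernoulli across an expanding duct, showing a static-pressure gain that scales with the square of the flight speed.

```python
# Tiny incompressible-flow sketch (assumed numbers): continuity sets the exit
# velocity from the duct area ratio, Bernoulli gives the static pressure rise.
# The gain scales with the square of the entry (flight) speed -- hence "ram".
RHO = 1.225  # kg/m^3, sea-level air density

def duct_pressure_rise(v_in_mps, exit_to_entry_area_ratio):
    """Static pressure gain (Pa) across an ideal, lossless expanding duct."""
    v_out = v_in_mps / exit_to_entry_area_ratio   # continuity: A1*v1 = A2*v2
    return 0.5 * RHO * (v_in_mps ** 2 - v_out ** 2)

for mph in (150, 300):
    v = mph * 0.44704
    print(f"{mph} mph, 3:1 duct: ~{duct_pressure_rise(v, 3.0) / 1000:.1f} kPa static pressure rise")
```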

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #9 on: October 07, 2017, 01:03:10 pm »
“Even then, success was marginal. Most American turbocharged engines had issues with overheating and turbine failures throughout the war--hence the many proposals for replacing turbocharged Allisons with Packard-Merlins in airplanes like the P-38.”

That sounds more like an Allison issue than a turbocharging problem. Turbocharging seems to have worked well enough for the B-24, B-17, B-29, and P-47 on their air-cooled radials.

"Worked well enough" depends on the definition of "well enough". In war time, the USAAF accepted the trade-off between performance and problems/losses. But in peace time military or commercial service, things might have looked different.

As far as I know, all USAAF turbocharged engines suffered from overheating, fires, and waste-gate (blow-off valve) issues to some extent. Where the engines were at greatest risk--during high-load, high-weight take-offs--ADI (anti-detonant injection) was used to limit problems. Water and alcohol would be injected into the engines to cool the fuel-air mixture in the cylinders, preventing detonation that would otherwise destroy the engine. A large aircraft could carry quite a load of ADI, so I suspect that we have heard less about problems in the big radials than in the Allisons. Nonetheless, I've read that B-17s and B-24s frequently aborted due to turbo failures shortly after takeoff.
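
As a back-of-envelope illustration (figures assumed, with full evaporation taken as an upper bound) of why ADI buys so much detonation margin:

```python
# Back-of-envelope sketch (figures assumed, full evaporation taken as an upper
# bound): evaporating a 50/50 water-methanol mix in the charge absorbs latent
# heat and drops the mixture temperature, restoring detonation margin.
CP_CHARGE = 1050.0        # J/(kg*K), rough value for a fuel-air charge
LATENT_WATER = 2.26e6     # J/kg
LATENT_METHANOL = 1.10e6  # J/kg

def charge_temp_drop(adi_mass_fraction, water_share=0.5):
    """Approximate charge temperature drop (K) for a given ADI mass fraction
    (kg of water-methanol mix per kg of charge), assuming full evaporation."""
    latent = water_share * LATENT_WATER + (1 - water_share) * LATENT_METHANOL
    return adi_mass_fraction * latent / CP_CHARGE

print(f"5% ADI by mass: up to ~{charge_temp_drop(0.05):.0f} K of charge cooling")
```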

If my conjecture is true, the B-29 is the exception. It was notorious for engine fires, although how much of this was down to the turbo alone is debatable. The Curtiss-Wright R-3350 Duplex-Cyclone suffered from design issues, poor fuel/air distribution (which promotes detonation), and an overall lack of development (all arguably down to a combination of lack of due diligence and profiteering on the part of the manufacturer). The R-3350 overheated severely, both at takeoff and during cruise. Ambient temperatures were usually high at its bases. Long-range flight at high altitude demanded lean mixtures, which were known to promote detonation even in the absence of turbocharging and poor fuel distribution. The problems were so bad that B-29 production was very nearly switched to a version re-engined with turbocharged, liquid-cooled, 24-cylinder Allison V-3420s. These had the same power, fewer overheating problems, and offered higher performance due to reduced drag. But the disruption to production was considered too great.

 

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #10 on: October 07, 2017, 01:07:39 pm »
Fair points but I still doubt that metalurgy would have slowed development of a turbo should it have been required. My reference to the 'R' was not regarding any ram effect but the rise in air pressure within the expanding intake duct as a result of the airflow slowing, this would be equally valid for any well designed intake on a military aircraft.

Alternate futures are always debatable. If you are interested, my source on this is mainly S. D. Heron, in the books cited in my other reply on this topic. I found both amusing and worthwhile--comprehensible even for a non-engineer like me.

Offline red admiral

  • Senior Member
  • CLEARANCE: Secret
  • **
  • Posts: 494
Re: British Super-/Turbocharger Development
« Reply #11 on: October 08, 2017, 12:29:43 am »
Quote
But post-war, turbochargers replaced their mechanical counterparts in almost all applications, from light airplanes to airliners.   

Agree, but this overlooks that in the main performance-driving application, i.e. high-speed, high-altitude fighters, piston engines were replaced by jets. The applications that remained were generally low and slow, where the turbocharger has a performance advantage. Anything remotely high/fast soon switched to turbojets or turboprops.

Offline CJGibson

  • Top Contributor
  • CLEARANCE: Top Secret
  • ***
  • Posts: 975
  • I didn't get where I am today by...
Re: British Super-/Turbocharger Development
« Reply #12 on: October 08, 2017, 12:57:59 am »
If my conjecture is true, the B-29 is the exception. It was notorious for engine fires, although how much of this was down to the turbo alone is debatable.

Wasn't that fixed by fitting cuffs to the roots of the propeller blades to increase airflow through the engine?

Chris

Offline iverson

  • CLEARANCE: Secret
  • **
  • Posts: 191
Re: British Super-/Turbocharger Development
« Reply #13 on: October 09, 2017, 11:33:36 am »
If my conjecture is true, the B-29 is the exception. It was notorious for engine fires, although how much of this was down to the turbo alone is debatable.

Wasn't that fixed by fitting cuffs to the roots of the propeller blades to increase airflow through the engine?

Chris
That may have helped, as did better baffling for control of airflow, improved valve metallurgy to reduce detonation, improved oil distribution, better manifold designs, etc. But I believe that, ultimately, in the post-war period, direct fuel injection had to be adopted before fuel distribution improved enough to control detonation.