This is something that has confused me for a long time. I know that we (typically) size an impeller to absorb the max HP the engine makes, at whatever rpm that HP peak occurs.
I guess you would go slower if you sized the impeller to absorb the max torque the engine makes at whatever rpm that torque peak occurs.
Pick your engine, but say it's a powerful motor in the 1000 HP range with a reasonably flat torque curve:
Its peak HP is at 6900 RPM.
Peak torque is at 5400 RPM.
What would be the performance difference if you had two impellers, one sized to hold the engine at peak torque and the other to hold the engine at peak HP, all else being identical?
I know this is an age-old question but it still bugs me. It would seem to me that the lower-RPM setup would be more efficient in terms of MPH per RPM and fuel usage, but probably would not be the faster of the two. I figure somewhere between the two peaks would be best.
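For what it's worth, the trade-off can be sketched with the usual cube-law load approximation for an impeller, P = k × rpm³. The 1000 HP at 6900 RPM and the 5400 RPM torque peak are from above; the ~800 HP at the torque peak is my assumption (a flat torque curve puts power at 5400 somewhere near 1000 × 5400/6900 ≈ 780 HP), and the cube law itself is only a first approximation:

```python
# Cube-law loading sketch. Assumptions:
#  - impeller power absorption follows P = k * rpm**3
#  - engine makes 1000 hp at the 6900 rpm power peak (from the post)
#  - engine makes ~800 hp at the 5400 rpm torque peak (assumed)

def cube_law_k(power_hp, rpm):
    """Load coefficient so the impeller absorbs `power_hp` at `rpm`."""
    return power_hp / rpm**3

k_hp_peak = cube_law_k(1000, 6900)  # impeller sized to hold engine at peak HP
k_tq_peak = cube_law_k(800, 5400)   # impeller sized to hold engine at peak torque

# Power the torque-peak impeller would demand at 6900 rpm --
# far more than the engine makes, so it can never turn it that fast:
demand_at_6900 = k_tq_peak * 6900**3

# Power the HP-peak impeller demands at 5400 rpm -- well under what the
# engine makes there, so the engine pulls through to its power peak:
demand_at_5400 = k_hp_peak * 5400**3

print(f"torque-peak impeller at 6900 rpm: {demand_at_6900:.0f} hp demanded")
print(f"HP-peak impeller at 5400 rpm: {demand_at_5400:.0f} hp demanded")
```

Under these assumptions the torque-peak impeller would need roughly 1700 HP to reach 6900 RPM, which is why each sizing pins the engine near its own design RPM.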
Just wondering out loud here. And there is actually a reason for my madness.