In general, you see inflation in code size as the representation becomes lower level and more explicit. Detecting dead code in a larger volume of code is generally more expensive than detecting it in a smaller one. That might not hold if the deadness determination itself is really expensive, but it's almost always worth running some kind of lightweight dead code elimination before each stage of lowering so the subsequent stage has less code to process.
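
As a rough illustration, here is a minimal sketch of the kind of lightweight pass this describes, written over a made-up straight-line IR (the `Instr` type, its fields, and the opcodes are invented for the example, not any particular compiler's API): a single backward walk that keeps only instructions whose results are used later or that have side effects.

```python
from dataclasses import dataclass, field

# Hypothetical, deliberately tiny IR: each instruction defines at most one
# value and reads zero or more operand values by name.
@dataclass
class Instr:
    dest: str                                  # value this instruction defines ("" if none)
    op: str                                    # opcode, e.g. "add", "mul", "ret"
    args: list = field(default_factory=list)   # operand value names
    has_side_effects: bool = False             # calls, stores, returns, etc.

def lightweight_dce(instrs):
    """Drop instructions whose results are never used and that have no side
    effects. One backward pass over straight-line code, so it stays cheap
    enough to run before every lowering stage."""
    live = set()   # value names known to be needed by kept instructions
    kept = []
    for instr in reversed(instrs):
        if instr.has_side_effects or instr.dest in live:
            kept.append(instr)
            live.update(instr.args)   # its operands are now needed too
        # otherwise the instruction is dead and is simply skipped
    kept.reverse()
    return kept

# Example: t2 is computed but never used, so it disappears before lowering.
program = [
    Instr("t0", "const"),
    Instr("t1", "const"),
    Instr("t2", "mul", ["t0", "t1"]),           # dead
    Instr("t3", "add", ["t0", "t1"]),
    Instr("",   "ret", ["t3"], has_side_effects=True),
]
program = lightweight_dce(program)
assert [i.dest for i in program] == ["t0", "t1", "t3", ""]
```

Because the pass is a single linear scan, its cost is small relative to the lowering stage it precedes, which is what makes running it repeatedly worthwhile; handling control flow or expensive deadness criteria would change that trade-off, as noted above.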