[pytorch/pytorch] Generator expression with conditional filter produces incorrect results
GH-pytorch/pytorch#176693 • Mar 07, 2026
### ROOT CAUSE
The generator expression's state is not reinitialized on every call when the function runs under `torch.compile`. In eager mode, each call to the function builds a fresh generator; in the compiled version, stale generator state can be carried across calls, so a later call resumes a partially consumed generator and produces incorrect results.
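The leakage described above can be illustrated in plain Python, independent of `torch.compile`. This is a hypothetical sketch (the names `shared_gen` and `consume_three` are illustrative, not from the issue): a generator created once keeps its position across calls, so a second call resumes where the first left off instead of starting fresh.

```python
# A module-level generator is created exactly once; its state persists
# across every function call that consumes it.
shared_gen = (v * 2 for v in range(6) if v % 2 == 1)

def consume_three(gen):
    # next(gen, None) returns None once the generator is exhausted
    return [next(gen, None) for _ in range(3)]

first = consume_three(shared_gen)   # [2, 6, 10]
second = consume_three(shared_gen)  # [None, None, None] -- already exhausted
print(first, second)
```

The compiled-mode bug behaves analogously: if generator state survives between calls, later calls consume from the wrong position.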
### CODE FIX
The fix wraps generator creation in a helper function, so that every call to `fn` starts from a freshly created generator rather than reusing stale state.
```python
import torch

def test_genexpr_conditional_state():
    def create_generator():
        # Odd values of range(6) are 1, 3, 5; doubled: 2, 6, 10
        return (v * 2 for v in range(6) if v % 2 == 1)

    def fn(x):
        gen = create_generator()  # fresh generator on every call
        a = next(gen)  # 2
        b = next(gen)  # 6
        c = next(gen)  # 10
        return x + a + b + c

    x = torch.tensor([0.0])
    eager = fn(x)
    compiled = torch.compile(fn, backend="eager")(x)
    assert torch.equal(eager, compiled), f"eager={eager}, compiled={compiled}"
```
This fix ensures that each call to `fn` creates a new generator, preventing state leakage between calls.
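If generator semantics are not essential, an alternative (a sketch not part of the upstream fix; `fn_list` is a hypothetical name) is to materialize the filtered values eagerly with a list comprehension, which avoids generator state entirely:

```python
def fn_list(x):
    # Same values as the generator expression, but fully materialized,
    # so there is no iterator state to leak between calls.
    vals = [v * 2 for v in range(6) if v % 2 == 1]  # [2, 6, 10]
    return x + sum(vals)

print(fn_list(0.0))  # 18.0
```

Lists trade a small amount of memory for the guarantee that every call sees the complete, identical sequence.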