
Add dispatches for Real Op #2016

Open

jessegrabowski wants to merge 2 commits into pymc-devs:v3 from jessegrabowski:real-op-dispatch

Conversation

@jessegrabowski
Member

We have no coverage for this Op because it doesn't have an nfunc_name. There was a concern that because numpy returns a view, we could end up with graph mutations that pytensor is not aware of. An example is:

import pytensor.tensor as pt
x = pt.dvector('x')
out = x.real[0].set(99.0)

I added some checks for this specific case; it seems like everything works OK?
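For background (this sketch is not from the PR itself), the NumPy behavior motivating the concern can be checked directly: `np.real` of a complex array returns a view, so writes through it mutate the original buffer.

```python
import numpy as np

x = np.array([1.0 + 2.0j, 3.0 + 4.0j])
r = np.real(x)                  # for complex input this is a view, not a copy
assert np.shares_memory(r, x)   # both arrays alias the same buffer

r[0] = 99.0                     # writing through the view...
assert x[0] == 99.0 + 2.0j      # ...mutates the original array
```

This is the mutation PyTensor would be unaware of if an Op returned such a view without declaring it.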

@ricardoV94
Member

ricardoV94 commented Mar 30, 2026

Input mutation is not a concern in JAX, which never does inplace by default. Your test isn't strong enough, because the compare-py-and-numba (or other backend) helpers run minimal rewrites.

You need to pass the fully-fledged mode as an argument, or, even simpler, test with an already-inplace set subtensor (pass inplace=True to the test) and then call the function again.
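The double-call check being suggested can be sketched at the NumPy level; `compute` here is a hypothetical stand-in for the compiled function, not PyTensor code:

```python
import numpy as np

def compute(x):
    # Stand-in for the compiled graph: set the real part of x[0] to 99.0.
    out = x.real.copy()    # a correct backend copies here; aliasing x would be the bug
    out[0] = 99.0
    return out

xv = np.array([1.0 + 2.0j, 3.0 + 4.0j])
r1 = compute(xv)
r2 = compute(xv)                # call the function again, per the suggestion
assert xv[0] == 1.0 + 2.0j      # the input buffer was not destroyed between calls
assert np.array_equal(r1, r2)   # and both calls agree
```

If the backend aliased the input instead of copying, the second call would observe (or further corrupt) the mutated buffer and the assertions would fail.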

@jessegrabowski
Member Author

After looking into this more I'm not sure your worry is founded:

  • Elemwise.perform has a specific guard for the real case here
  • The nfunc_spec is commented out here, so the ufunc view path isn't used
  • Numba elemwise dispatch allocates fresh output buffers, so nothing runs inplace
  • The JAX/MLX/PyTorch backends exclude inplace rewrites

I had the robot go at it every which way; it only mutated if the two guards noted above are disabled. I guess I could disable them and then try to make that work? Surely this isn't the only inplace elemwise operation?


import pytensor.tensor as pt
from pytensor import In, function

x = pt.zvector("x")
out = pt.real(x)[0].set(99.0)
f = function([In(x, mutable=True)], out, mode=pytorch_mode)

PyTensor would be allowed to destroy this x. The issue would be if it weren't mutable and PyTensor still destroyed it.

Member

@ricardoV94 Apr 20, 2026


When I said inplace=True, I meant the set subtensor op in your test could be inplace from the get-go. That's how you know Elemwise is not returning a view and the input is not destroyed accidentally.

@ricardoV94
Member

ricardoV94 commented Apr 20, 2026

> After looking into this more I'm not sure your worry is founded:

What worry is unfounded? You can't have an op return a view without stating so with view_map. If it does, you'll get wrong results eventually, not just corrupted inputs.

I didn't say you couldn't implement the behavior; you just have to check it doesn't return views?

> I had the robot go at it every which way; it only mutated if the two guards noted above are disabled

That's how guards are supposed to work? I never said we HAD a bug; I just explained why the np.real nfunc was commented out (so Elemwise.perform doesn't try to use it directly).
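The "wrong results eventually" failure mode can be illustrated outside PyTensor: if an op returns an undeclared view, a later, unrelated write to the input silently changes an already-computed result. A hedged sketch (`bad_real` is a hypothetical op implementation, not PyTensor code):

```python
import numpy as np

def bad_real(x):
    return x.real            # returns a view without declaring it

xv = np.array([1.0 + 2.0j])
out = bad_real(xv)
first = out[0]               # 1.0 at this point
xv[0] = 5.0 + 0.0j           # later, unrelated mutation of the input
assert out[0] == 5.0         # the already-computed "result" silently changed
assert first == 1.0
```

Declaring the aliasing in view_map is what lets the graph machinery know the output and input share storage, so it can order and copy operations correctly.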

@ricardoV94
Member

ricardoV94 commented Apr 20, 2026

> Surely this isn't the only inplace elemwise operation?

I think these are super rare. If there are more, we need to forbid the Elemwise Python mode from using it or force a copy after.
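The "force a copy after" option can be sketched as a perform-style guard (a hypothetical helper, not the actual PyTensor implementation):

```python
import numpy as np

def perform_real(inp):
    out = np.real(inp)
    # np.real returns a view for complex input; copy so no view escapes the Op
    if np.shares_memory(out, inp):
        out = out.copy()
    return out

x = np.array([1.0 + 2.0j])
y = perform_real(x)
assert not np.shares_memory(x, y)   # no aliasing escapes
y[0] = 0.0
assert x[0] == 1.0 + 2.0j           # the original input is untouched
```

The copy costs an allocation, but it keeps the Op's contract honest without requiring a view_map declaration.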
