Conversation

@adam-smnk
Member

Adds a jit decorator to compile and execute torch models using MLIR.

@adam-smnk changed the title from [runtime] PyTorch module MLIR JIT decorator to [runtime] PyTorch model MLIR JIT decorator on Jan 15, 2026
@adam-smnk
Member Author

Module becomes pretty ambiguous between torch and MLIR.
The naming now leans on model when referring to PyTorch, while module is reserved for the MLIR context.

@rolfmorel self-requested a review on January 15, 2026 at 14:48
@adam-smnk
Member Author

@rolfmorel You were right. Having the decorator hide the original class makes it less useful.
I dug around torch.nn.Module and managed to hook into its existing callback mechanism to redirect the model's execution through the custom JIT logic; see the sketch below.
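For reference, a minimal sketch of this kind of hook-based redirection. The decorator name `jit` matches the PR, but the `mlir_compile` helper, the lazy pre-hook wiring, and the `_compiled_forward` attribute are hypothetical placeholders rather than the actual implementation:

```python
import torch


def jit(model_cls):
    """Hypothetical sketch: keep the original nn.Module class usable as-is and
    lazily reroute execution through a compile step on first use.
    `mlir_compile` stands in for the real MLIR lowering pipeline."""

    def mlir_compile(module, example_inputs):
        # Placeholder: a real implementation would lower the model to MLIR and
        # return a callable backed by a JIT execution engine.
        original_forward = module.forward
        return lambda *args, **kwargs: original_forward(*args, **kwargs)

    original_init = model_cls.__init__

    def patched_init(self, *args, **kwargs):
        original_init(self, *args, **kwargs)

        def pre_hook(module, inputs):
            # Compile on first use, then install the compiled callable so that
            # later calls go through it instead of the original forward.
            if not hasattr(module, "_compiled_forward"):
                module._compiled_forward = mlir_compile(module, inputs)
                module.forward = module._compiled_forward

        # torch.nn.Module's built-in callback mechanism.
        self.register_forward_pre_hook(pre_hook)

    model_cls.__init__ = patched_init
    return model_cls


@jit
class TwoLayer(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(8, 4)

    def forward(self, x):
        return torch.relu(self.fc(x))


model = TwoLayer()
print(model(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
```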

For completeness, I should mention that there's an option to use torch.compile and provide it with a custom backend.
This requires that the returned callable has the same signature as the model's forward function, which is not feasible if MLIR transformations modify the compiled function's signature.

I think the above could be made to work with an argument normalization step between the model and MLIR, but I'm not sure it's worth going that far just yet.
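For illustration, this is roughly what the torch.compile route would look like. The `mlir_backend` function below is a hypothetical stand-in that just falls back to eager execution, but it shows the constraint: whatever callable the backend returns must accept exactly the arguments of the captured forward graph:

```python
import torch


def mlir_backend(gm: torch.fx.GraphModule, example_inputs):
    # A real backend would lower `gm` to MLIR here. torch.compile expects the
    # returned callable to take the same arguments as the captured graph, which
    # breaks down if MLIR transformations change the compiled function's signature.
    return gm.forward  # trivial fallback: run the captured graph eagerly


model = torch.nn.Linear(8, 4)
compiled = torch.compile(model, backend=mlir_backend)
print(compiled(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
```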
