torch-mlir/test/Conversion/TorchToLinalg

Latest commit: 0a607a410d by Longsheng Mou (2024-11-15 17:13:14 +08:00)
[TorchToLinalg] Use `linalg.transpose` instead of `generic` in `permuteTensor` (#3872)
This PR changes the lowering to use `linalg.transpose` instead of `linalg.generic` in `torch_to_linalg::permuteTensor`.
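To illustrate what this kind of lowering change looks like, here is a minimal sketch of the same permutation expressed both ways. The function names, tensor shapes, and the specific permutation `[0, 2, 1]` are chosen for illustration and are not taken from the PR itself:

```mlir
// Reading a rank-3 input at permuted indices via linalg.generic:
// the iteration space matches the output, and the input map swaps d1/d2.
#map_in  = affine_map<(d0, d1, d2) -> (d0, d2, d1)>
#map_out = affine_map<(d0, d1, d2) -> (d0, d1, d2)>
func.func @permute_generic(%in: tensor<2x3x4xf32>,
                           %init: tensor<2x4x3xf32>) -> tensor<2x4x3xf32> {
  %0 = linalg.generic {indexing_maps = [#map_in, #map_out],
                       iterator_types = ["parallel", "parallel", "parallel"]}
      ins(%in : tensor<2x3x4xf32>) outs(%init : tensor<2x4x3xf32>) {
  ^bb0(%a: f32, %b: f32):
    linalg.yield %a : f32
  } -> tensor<2x4x3xf32>
  return %0 : tensor<2x4x3xf32>
}

// The same permutation as the named linalg.transpose op:
// output dim i takes input dim permutation[i].
func.func @permute_transpose(%in: tensor<2x3x4xf32>,
                             %init: tensor<2x4x3xf32>) -> tensor<2x4x3xf32> {
  %0 = linalg.transpose ins(%in : tensor<2x3x4xf32>)
                        outs(%init : tensor<2x4x3xf32>)
                        permutation = [0, 2, 1]
  return %0 : tensor<2x4x3xf32>
}
```

The named op carries the permutation explicitly, which keeps the IR more compact and easier for later passes to recognize than a structurally equivalent `linalg.generic`.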
| File | Last commit | Date |
| --- | --- | --- |
| basic.mlir | [TorchToLinalg] Use `linalg.transpose` instead of `generic` when lowering `aten.T` (#3660) | 2024-09-07 08:09:10 +02:00 |
| broadcast.mlir | [TorchToLinalg] Improve broadcast lowerings in strict symbolic modes (#2505) | 2023-10-05 15:15:26 -04:00 |
| convolution.mlir | [TorchToLinalg] Use Op with native channel order for quantized conv2d (#3807) | 2024-10-22 20:26:16 +02:00 |
| datamovement.mlir | [TorchToLinalg] Use `linalg.transpose` instead of `generic` in `permuteTensor` (#3872) | 2024-11-15 17:13:14 +08:00 |
| elementwise.mlir | Revert "[TorchToLinalg] perform rank0 elementwise computations outside linalg generic ops (#3762)" (#3767) | 2024-10-04 14:48:02 -07:00 |
| embeddingBag.mlir | Remove checking for training specific parameters in EmbeddingBag lowering (#3782) | 2024-10-15 09:37:26 -04:00 |
| flatten.mlir | Integrate llvm-project at dabdec1001dc368373dd581cf72f37a440873ce3 (#3300) | 2024-05-08 14:43:06 -04:00 |
| gridsampler.mlir | [TorchToLinalg] remove `extract_slice` grid_sample lowering (#3483) | 2024-08-20 14:23:43 -07:00 |
| pooling.mlir | TorchToLinalg: Try folding shape computations to keep static shapes when possible (#3475) | 2024-06-27 08:43:10 +02:00 |
| resize.mlir | OnnxToTorch bicubic interpolation (#3802) | 2024-11-12 12:54:29 -06:00 |
| sparse.mlir | [torch-mlir] bump stablehlo/llvm version (#3471) | 2024-06-18 16:59:53 -07:00 |
| squeeze.mlir | torch.aten.squeeze.dim lowering with dynamic dims (#3749) | 2024-10-08 10:37:31 -07:00 |
| unsqueeze.mlir | Integrate llvm-project at dabdec1001dc368373dd581cf72f37a440873ce3 (#3300) | 2024-05-08 14:43:06 -04:00 |
| view.mlir | Add support for multiple dynamic reassociation dims for unflatten.int (#3504) | 2024-06-28 09:59:51 -07:00 |
| view_strict.mlir | TorchToLinalg: Try folding shape computations to keep static shapes when possible (#3475) | 2024-06-27 08:43:10 +02:00 |