torch-mlir/test/Conversion/TorchToLinalg
Latest commit af236dab66 by zjgarvey:
Add support for multiple dynamic reassociation dims for unflatten.int (#3504)
Addresses an issue with the onnx.Gather lowering to linalg:
<https://github.com/nod-ai/SHARK-Turbine/issues/242>

The builder for tensor.expand_shape, without an explicitly provided output
shape, fails to infer an output shape when there is more than one dynamic
reassociation dim. I tried providing the output shape explicitly for
tensor.expand_shape, but that ran into compilation issues further down the
pipeline (see <https://github.com/iree-org/iree/issues/17760>).

This PR adds support by lowering unflatten.int to tensor.reshape when
multiple dynamic reassociation dims are present (see the sketch below).
2024-06-28 09:59:51 -07:00
basic.mlir [torch-mlir] bump stablehlo/llvm version (#3471) 2024-06-18 16:59:53 -07:00
broadcast.mlir [TorchToLinalg] Improve broadcast lowerings in strict symbolic modes (#2505) 2023-10-05 15:15:26 -04:00
convolution.mlir [TorchToLinalg] Fix Quantized Convolution Accumulator Type (#3459) 2024-06-20 13:54:20 -07:00
elementwise.mlir TorchToLinalg: Try folding shape computations to keep static shapes when possible (#3475) 2024-06-27 08:43:10 +02:00
flatten.mlir Integrate llvm-project at dabdec1001dc368373dd581cf72f37a440873ce3 (#3300) 2024-05-08 14:43:06 -04:00
gridsampler.mlir [onnx] Gridsampler addition of nearest mode (#3320) 2024-05-10 11:42:10 -07:00
pooling.mlir TorchToLinalg: Try folding shape computations to keep static shapes when possible (#3475) 2024-06-27 08:43:10 +02:00
resize.mlir [ONNX] Fix resize ceil numerics and add half_pixel_symmetric support (#3443) 2024-06-11 22:35:50 -05:00
sparse.mlir [torch-mlir] bump stablehlo/llvm version (#3471) 2024-06-18 16:59:53 -07:00
unsqueeze.mlir Integrate llvm-project at dabdec1001dc368373dd581cf72f37a440873ce3 (#3300) 2024-05-08 14:43:06 -04:00
view.mlir Add support for multiple dynamic reassociation dims for unflatten.int (#3504) 2024-06-28 09:59:51 -07:00
view_strict.mlir TorchToLinalg: Try folding shape computations to keep static shapes when possible (#3475) 2024-06-27 08:43:10 +02:00