Matthias Gehre
c23a61f4b6
DecomposeComplexOps: Use static shape if available (#2289)
2023-07-12 10:07:30 +02:00
Sean Silva
bbd3094c2f
update PyTorch version to 2.1.0.dev20230711 (#2299)
- torch version: 2.1.0.dev20230711
- torch commit hash: 927dc662386af052018212c7d01309a506fc94cd
- torchvision version: 0.16.0.dev20230711
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-11 12:34:22 -07:00
Sean Silva
17669391b3
update PyTorch version to 2.1.0.dev20230710 (#2296)
- torch version: 2.1.0.dev20230710
- torch commit hash: 69565763c841e4e8d07fd338c9bf6515005b3880
- torchvision version: 0.16.0.dev20230710
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-10 06:54:40 -07:00
Zhekun Zhang
6a072d4f4a
[Stablehlo] AtenEmptyMemoryFormat remove device cpu check (#2288)
* remove cpu check
* update dtype
---------
Co-authored-by: zhekun.zhang <zhekun.zhang@bytedance.com>
2023-07-10 15:36:21 +08:00
Sean Silva
05920f9159
update PyTorch version to 2.1.0.dev20230709 (#2293)
- torch version: 2.1.0.dev20230709
- torch commit hash: 9b5a84f5443c8e3b9db5511a4f58d727b4fade40
- torchvision version: 0.16.0.dev20230709
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-09 07:56:04 -07:00
Sean Silva
2fdfa0410d
update PyTorch version to 2.1.0.dev20230708 (#2292)
- torch version: 2.1.0.dev20230708
- torch commit hash: 3a919e00b8237a76ad6faa6040c00b425a96f1f3
- torchvision version: 0.16.0.dev20230708
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-08 08:14:37 -07:00
Sean Silva
6ac85ee662
update PyTorch version to 2.1.0.dev20230707 (#2290)
- torch version: 2.1.0.dev20230707
- torch commit hash: 760dafbb05853f5f57f1a6869179df2efbc2cf6b
- torchvision version: 0.16.0.dev20230707
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-07 10:44:02 -07:00
Abhishek Varma
6c9ba4ce95
[Torch-to-Linalg] Add dynamic dimension support for BroadcastTo op (#2174)
-- This commit adds support for dynamic dimensions in the BroadcastTo op.
Signed-off-by: Abhishek Varma <abhishek@nod-labs.com>
2023-07-07 10:01:51 -07:00
Sean Silva
7f4084b570
update PyTorch version to 2.1.0.dev20230705 (#2284)
- torch version: 2.1.0.dev20230705
- torch commit hash: 758c84d41f55f90f210e6d7d02e05cda4a13c728
- torchvision version: 0.16.0.dev20230705
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-05 09:33:10 -07:00
Sean Silva
8c87057f50
update PyTorch version to 2.1.0.dev20230704 (#2282)
- torch version: 2.1.0.dev20230704
- torch commit hash: e5472fd3c324c5ecb343884e5399e0227cc30a6c
- torchvision version: 0.16.0.dev20230704
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-04 08:23:00 -07:00
Sean Silva
157e5e529a
update PyTorch version to 2.1.0.dev20230701 (#2278)
- torch version: 2.1.0.dev20230701
- torch commit hash: bb3df0bb7c6bce70941199401f6b3550e10cba50
- torchvision version: 0.16.0.dev20230702
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-02 07:39:38 -07:00
Sean Silva
112a2ceebf
update PyTorch version to 2.1.0.dev20230701 (#2276)
- torch version: 2.1.0.dev20230701
- torch commit hash: bb3df0bb7c6bce70941199401f6b3550e10cba50
- torchvision version: 0.16.0.dev20230701
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-07-01 07:56:45 -07:00
Sean Silva
db1a42ddc8
update PyTorch version to 2.1.0.dev20230630 (#2274)
- torch version: 2.1.0.dev20230630
- torch commit hash: dc72046b235ac803e3875c23a1784e93b3d4812c
- torchvision version: 0.16.0.dev20230630
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-30 08:10:30 -07:00
Jiawei Wu
c7fa42b7d3
[Torch Dialect] Add canonicalizer for aten.to.other op (#2273)
Canonicalize aten.to.other to prim.device + prim.dtype + aten.to.device
Co-authored-by: wujiawei.aml <wujiawei.aml@bytedance.com>
2023-06-30 09:43:08 +08:00
Sambhav Jain
facce24ae3
[Bazel] Fix broken Bazel build (#2252)
Bazel GHA run: https://github.com/sjain-stanford/torch-mlir/actions/runs/5408580473
2023-06-29 08:45:35 -07:00
Yuanqiang Liu
449cfb8375
[Torch Dialect] add more scalar op folders (#2265)
2023-06-29 10:37:13 +08:00
Sean Silva
82819350e1
update PyTorch version to 2.1.0.dev20230628 (#2272)
- torch version: 2.1.0.dev20230628
- torch commit hash: 94ca800459ebe8cd2bc3a9927a8412d958661634
- torchvision version: 0.16.0.dev20230628
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-28 09:05:41 -07:00
Chi_Liu
ddd0c06970
[TORCH] Fix recompose off by -1 error (#2271)
2023-06-27 13:34:14 -07:00
Sean Silva
1eb63f33af
update PyTorch version to 2.1.0.dev20230627 (#2269)
- torch version: 2.1.0.dev20230627
- torch commit hash: 43ec335ff295c55bd5d44a9fd03cfc884839a283
- torchvision version: 0.16.0.dev20230627
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-27 09:07:45 -07:00
Yuanqiang Liu
859885c1d3
[Torch Dialect] Support aten.native_dropout (#2259)
* [Torch Dialect] Support aten.native_dropout
* update
2023-06-27 14:19:33 +08:00
Yuanqiang Liu
1ea2b57ab7
[Torch Dialect] add folder for aten.add (#2264)
* [Torch Dialect] add folder for aten.add
* update
* update
* update
2023-06-27 10:55:28 +08:00
Sean Silva
a52a2b5053
Update LLVM (#2267)
Green LLVM commit: ec89cb9a81529fd41fb37b8e62203a2e9f23bd54
Green MHLO commit: cd47c8c4db420181551e79c42fac22aecc4c06af
2023-06-26 16:23:53 -07:00
Sean Silva
38fb99df65
update PyTorch version to 2.1.0.dev20230626 (#2266)
- torch version: 2.1.0.dev20230626
- torch commit hash: 176a02ed90b218ffbf6a7b290ac28d37f06708ff
- torchvision version: 0.16.0.dev20230626
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-26 08:25:16 -07:00
Yuanqiang Liu
0548e2ef3b
[Stablehlo] fix promoteType() when input doesn't have DefiningOp (#2262)
2023-06-26 00:04:17 +08:00
Sean Silva
f4e7344276
update PyTorch version to 2.1.0.dev20230625 (#2263)
- torch version: 2.1.0.dev20230625
- torch commit hash: 3bebfdfbabb134d20c3431d68219b54ad61ce172
- torchvision version: 0.16.0.dev20230625
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-25 08:58:32 -07:00
Sean Silva
ec98ce23c9
update PyTorch version to 2.1.0.dev20230624 (#2261)
- torch version: 2.1.0.dev20230624
- torch commit hash: 27b3861096b2e84d2e10cc823ba413967f82aafc
- torchvision version: 0.16.0.dev20230624
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-24 08:19:40 -07:00
Sean Silva
fbb5ed52cf
update PyTorch version to 2.1.0.dev20230623 (#2260)
- torch version: 2.1.0.dev20230623
- torch commit hash: ad724c83fb0d94cb3bb2cec94e15d88023c64e0d
- torchvision version: 0.16.0.dev20230623
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-23 09:03:50 -07:00
Yuanqiang Liu
64afc08dab
[Torch Dialect] add missing one_hot dtype function (#2143)
* [Torch Dialect] add missing one_hot dtype function
* update
* update
* update
2023-06-23 16:11:33 +08:00
Yuanqiang Liu
39201a4be5
[Torch Dialect] avoid assertion failure when PrimNumToTensorScalarOp's input is torch.number (#2256)
* [Torch Dialect] avoid assertion failure when PrimNumToTensorScalarOp's input is torch.number
* update
2023-06-23 16:02:45 +08:00
Ramiro Leal-Cavazos
6f2bf31291
Fix single-element tuple construction in abstract interp library (#2258)
Single element tuples in Python need a comma after the
element. However, the `registry.py` file, which generates the expected
abstract interpretation function signatures, was not inserting the
comma. This commit changes the expected signature generator to add a
comma after the last element in any non-empty default tuple argument.
2023-06-22 11:27:40 -07:00
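The trailing-comma rule this commit relies on can be shown in a few lines of plain Python (an illustrative snippet, not code from the repository):

```python
# Parentheses alone do not create a tuple in Python; the trailing comma does.
not_a_tuple = (5)    # this is just the integer 5
one_tuple = (5,)     # this is a one-element tuple
empty = ()           # the empty tuple is the only comma-free tuple literal

print(type(not_a_tuple).__name__)  # int
print(type(one_tuple).__name__)    # tuple
```

This is why a generated signature like `(5)` would silently denote a scalar rather than the intended one-element tuple, which is the mismatch the commit fixes.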
Yuanqiang Liu
96b14e952e
[Torch Dialect] Support aten.device.with_index (#2254)
2023-06-23 01:07:14 +08:00
Yuanqiang Liu
4fd4477e15
[Torch Dialect] require hasSizes when decomposing aten.amax (#2248)
2023-06-22 11:26:51 +08:00
Sean Silva
c91c67e53d
update PyTorch version to 2.1.0.dev20230621 (#2247)
- torch version: 2.1.0.dev20230621
- torch commit hash: e4cf441a4ba770dc869433d876e73051ed9800b2
- torchvision version: 0.16.0.dev20230621
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-21 08:12:45 -07:00
Abhishek Varma
a0d2789840
[MLIR][TORCH] Add e2e support for aten.alias
-- This commit adds e2e support for aten.alias op.
Signed-off-by: Abhishek Varma <abhishek@nod-labs.com>
2023-06-21 12:15:31 +05:30
Maksim Levental
0244f540a7
Add typeids to CAPI. (#2253)
2023-06-20 22:06:43 -05:00
Abhishek Varma
ebda611100
[build] Update llvm tag to 3f8d8c1a
This patch updates the submodules to:
- llvm: 3f8d8c1aac3086f603ad73f18fe2bd4fb91fa10a
- mhlo: 4384a47b03dc377d651523037867899a340b0e96
The only change made is calling `registerAllExtensions` during dialect
registration. See: https://reviews.llvm.org/D120368
2023-06-20 15:45:52 -07:00
Sean Silva
860a2d4bbf
update PyTorch version to 2.1.0.dev20230619 (#2245)
- torch version: 2.1.0.dev20230619
- torch commit hash: 5beeb400ca3487d55629cbf8b87f9b637a7b657f
- torchvision version: 0.16.0.dev20230619
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-19 07:52:21 -07:00
Sean Silva
9b4e369671
update PyTorch version to 2.1.0.dev20230618 (#2244)
- torch version: 2.1.0.dev20230618
- torch commit hash: 59c654a6ad8d256b89123dda536052e98cd5e399
- torchvision version: 0.16.0.dev20230618
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-18 09:14:33 -07:00
Sean Silva
145055bdb6
update PyTorch version to 2.1.0.dev20230617 (#2241)
- torch version: 2.1.0.dev20230617
- torch commit hash: a522f9aedd9c9aaebba5997f201cc23119696578
- torchvision version: 0.16.0.dev20230617
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-17 08:46:28 -07:00
Vivek Khandelwal
f6a6cfea4e
[MLIR][TORCH] Add support for negative index values for index.Tensor op (#2233)
This commit adds support for the index.Tensor op when the index values
are negative. It wraps the index values around by checking their values
at run time.
Signed-Off By: Vivek Khandelwal <vivek@nod-labs.com>
2023-06-16 14:21:04 -05:00
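The wrap-around behavior this commit describes can be sketched in plain Python (`wrap_index` is a hypothetical helper name for illustration; the actual lowering emits the equivalent check as runtime IR):

```python
def wrap_index(idx: int, dim_size: int) -> int:
    # Negative indices count from the end of the dimension,
    # matching Python/PyTorch indexing semantics.
    return idx + dim_size if idx < 0 else idx

print(wrap_index(-1, 4))  # 3 (last element of a size-4 dimension)
print(wrap_index(2, 4))   # 2 (non-negative indices pass through)
```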
Matthias Gehre
6f420019cb
TorchToTosa: Cast float constants to correct type to support bfloat16 (#2239)
2023-06-16 09:51:24 +02:00
Sean Silva
45e2188615
update PyTorch version to 2.1.0.dev20230615 (#2238)
- torch version: 2.1.0.dev20230615
- torch commit hash: 0d4f9aee900596cd8ed55725f75a5792b6df6de1
- torchvision version: 0.16.0.dev20230615
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-15 09:59:20 -07:00
Vivek Khandelwal
ab8b23e767
build: manually update PyTorch version
Set PyTorch and TorchVision version to nightly release 2023-05-16.
This commit removes the test `BaddbmmDifferentDtypesModule_basic`
since PyTorch expects all operands to have the same dtype.
Ref: 2abad0c184
Signed-Off By: Vivek Khandelwal <vivek@nod-labs.com>
2023-06-15 17:53:16 +05:30
Yuanqiang Liu
bba0f5891b
[Stablehlo] add conversion for AtenFlipOp (#2163)
2023-06-15 10:27:34 +08:00
Yuanqiang Liu
7c6961bcbf
[Torch Dialect] Support aten.cuda and add canonicalizer for aten.cuda (#2231)
2023-06-14 09:56:39 +08:00
Maksim Levental
0caaf8d32a
Bump LLVM (#2176)
* Bump LLVM
---------
Co-authored-by: Matthias Gehre <matthias.gehre@xilinx.com>
2023-06-13 16:17:23 +02:00
Yuanqiang Liu
ddea56a832
[Torch Dialect] fix torch.uint8's dtype infer (#2227)
2023-06-13 10:38:20 +08:00
Sean Silva
dd5992514d
update PyTorch version to 2.1.0.dev20230612 (#2229)
- torch version: 2.1.0.dev20230612
- torch commit hash: 8aee9489c907eeae8af1b6df6962f3a4414c984a
- torchvision version: 0.16.0.dev20230612
Co-authored-by: Roll PyTorch Action <torch-mlir@users.noreply.github.com>
2023-06-12 07:40:35 -07:00
Christopher McGirr
b461daa06e
fix(TorchToTosa.cpp): adjust torch->tosa div conversion (#2200)
Check the return type of the division to determine whether to use the
floating-point or the integer implementation. The issue arose because
the inputs were all integers while the result was cast to floating
point; the conversion then chose the integer implementation of
division, which is not legal in TOSA when all the inputs are cast to
floating point.
fix(TorchToLinalg): AtenDivScalarOp
Upcast the self operand as well if applicable; it must also be cast to
float since it can be an integer.
2023-06-12 11:18:38 +02:00
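Python's own division operators illustrate the type distinction at the heart of this fix (an analogy only, not the TOSA lowering itself): true division of integers produces a floating-point result, while only floor division stays integral.

```python
# True division always produces a float, even for integer operands,
# mirroring how a div of integer inputs can carry a float result type.
q = 7 / 2
print(q, type(q).__name__)   # 3.5 float

# Floor division is the operator that keeps integer semantics.
f = 7 // 2
print(f, type(f).__name__)   # 3 int
```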
Tiago Trevisan Jost
cc75557119
feat: support unchanged dimensions in torch.aten.broadcast_to operation. (#2204)
2023-06-12 11:17:25 +02:00