diff --git a/build_tools/update_abstract_interp_lib.sh b/build_tools/update_abstract_interp_lib.sh index d33c69536..cb44a4e8b 100755 --- a/build_tools/update_abstract_interp_lib.sh +++ b/build_tools/update_abstract_interp_lib.sh @@ -42,6 +42,6 @@ if [ ! -z ${TORCH_MLIR_EXT_MODULES} ]; then fi PYTHONPATH="${pypath}" python \ - -m torch_mlir.dialects.torch.importer.jit_ir.build_tools.abstract_interp_lib_gen \ + -m torch_mlir.jit_ir_importer.build_tools.abstract_interp_lib_gen \ --pytorch_op_extensions=${ext_module:-""} \ --torch_transforms_cpp_dir="${torch_transforms_cpp_dir}" diff --git a/build_tools/update_torch_ods.sh b/build_tools/update_torch_ods.sh index e0564a62d..cb0599f16 100755 --- a/build_tools/update_torch_ods.sh +++ b/build_tools/update_torch_ods.sh @@ -43,7 +43,7 @@ fi set +u PYTHONPATH="${PYTHONPATH}:${pypath}" python \ - -m torch_mlir.dialects.torch.importer.jit_ir.build_tools.torch_ods_gen \ + -m torch_mlir.jit_ir_importer.build_tools.torch_ods_gen \ --torch_ir_include_dir="${torch_ir_include_dir}" \ --pytorch_op_extensions="${ext_module}" \ --debug_registry_dump="${torch_ir_include_dir}/JITOperatorRegistryDump.txt" diff --git a/docs/Torch-ops-E2E-implementation.md b/docs/Torch-ops-E2E-implementation.md index 153246f37..53031c9ce 100644 --- a/docs/Torch-ops-E2E-implementation.md +++ b/docs/Torch-ops-E2E-implementation.md @@ -17,7 +17,7 @@ The end-to-end test is important to check the correctness of the other steps. ### Step 2. Update ods -Update [torch_ods_gen.py](https://github.com/llvm/torch-mlir/blob/main/projects/pt1/python/torch_mlir/dialects/torch/importer/jit_ir/build_tools/torch_ods_gen.py) with the new op and run [update_torch_ods.sh](https://github.com/llvm/torch-mlir/blob/main/build_tools/update_torch_ods.sh) to generate the ods. Running `update_torch_ods.sh` would dump all the operators with schema into `JITOperatorRegistryDump.txt`. It’s convenient to look for ops signatures and operands names in this file. +Update [torch_ods_gen.py](https://github.com/llvm/torch-mlir/blob/main/projects/pt1/python/torch_mlir/jit_ir_importer/build_tools/torch_ods_gen.py) with the new op and run [update_torch_ods.sh](https://github.com/llvm/torch-mlir/blob/main/build_tools/update_torch_ods.sh) to generate the ods. Running `update_torch_ods.sh` would dump all the operators with schema into `JITOperatorRegistryDump.txt`. It’s convenient to look for ops signatures and operands names in this file. ### Step 3. Propagate types It’s essential to make sure the new op implements shape and dtype inference. See [abstract_interp_lib](https://github.com/llvm/torch-mlir/blob/main/docs/abstract_interp_lib.md) for information on adding shape and dtype inference. diff --git a/docs/abstract_interp_lib.md b/docs/abstract_interp_lib.md index 14ffc2181..eb862e6bb 100644 --- a/docs/abstract_interp_lib.md +++ b/docs/abstract_interp_lib.md @@ -26,7 +26,7 @@ The two main use cases are: ## Architecture Functions are defined as TorchScript-able Python functions in -`python/torch_mlir/dialects/torch/importer/jit_ir/build_tools/abstract_interp_lib_gen.py`. +`python/torch_mlir/jit_ir_importer/build_tools/abstract_interp_lib_gen.py`. The signatures of the functions are systematically derived from Torch JIT operator registry. Most shape functions are expected to reuse the upstream helper functions diff --git a/docs/adding_an_e2e_test.md b/docs/adding_an_e2e_test.md index 1c961c5c1..61664c7dc 100644 --- a/docs/adding_an_e2e_test.md +++ b/docs/adding_an_e2e_test.md @@ -87,7 +87,7 @@ following order: 1. 
Shape of input tensor. Use `-1` for dynamic dimensions 2. Dtype of the input tensor -3. Boolean representing whether the input tensor [has value semantics](https://github.com/llvm/torch-mlir/blob/ba17a4d6c09b4bbb4ef21b1d8d4a93cb056be109/python/torch_mlir/dialects/torch/importer/jit_ir/csrc/class_annotator.h#L54-L67). This +3. Boolean representing whether the input tensor [has value semantics](https://github.com/llvm/torch-mlir/blob/ba17a4d6c09b4bbb4ef21b1d8d4a93cb056be109/python/torch_mlir/jit_ir_importer/csrc/class_annotator.h#L54-L67). This will always be true for E2E tests, since the [Torch-MLIR backend contract](architecture.md#the-backend-contract) requires all tensors in the IR to eventually have value semantics. diff --git a/docs/architecture.md b/docs/architecture.md index e503ba40d..8ee6bfda8 100644 --- a/docs/architecture.md +++ b/docs/architecture.md @@ -55,14 +55,14 @@ factored such that we can handle this with one core import path, which is through the PyTorch "[JIT IR](https://github.com/pytorch/pytorch/blob/78c8a0d75220bdd4955415b5f81509e005af4232/torch/csrc/jit/OVERVIEW.md)", and lives in -[torch-mlir/python/torch_mlir/dialects/torch/importer/jit_ir](https://github.com/llvm/torch-mlir/tree/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/dialects/torch/importer/jit_ir). +[torch-mlir/python/torch_mlir/jit_ir_importer](https://github.com/llvm/torch-mlir/tree/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/dialects/torch/importer/jit_ir). The JIT IR is a highly principled IR that faithfully models a Python subset (+ tensors, the PyTorch op registry, and a few other things). All the other PyTorch program representations can eventually bottom-out on the JIT IR via some path provided by PyTorch. The `torch` dialect is almost entirely in 1:1 correspondence with the JIT IR -- this allows the importer to be extremely small (the core is -[under 500 lines of code](https://github.com/llvm/torch-mlir/blob/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/dialects/torch/importer/jit_ir/csrc/node_importer.cpp#L1)). +[under 500 lines of code](https://github.com/llvm/torch-mlir/blob/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/jit_ir_importer/csrc/node_importer.cpp#L1)). ### Ops @@ -70,7 +70,7 @@ See [TorchOps.td](https://github.com/llvm/torch-mlir/blob/114f48e96c578ee76a6f83 The ops in the `torch` dialect are almost entirely generated based on the PyTorch JIT IR operator registry via the script -[torch_ods_gen.py](https://github.com/llvm/torch-mlir/blob/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/dialects/torch/importer/jit_ir/build_tools/torch_ods_gen.py#L1) (invoked via [update_torch_ods.sh](https://github.com/llvm/torch-mlir/blob/main/build_tools/update_torch_ods.sh)). +[torch_ods_gen.py](https://github.com/llvm/torch-mlir/blob/e322f6a8784009b37aa354abfa9a40a80f30877d/python/torch_mlir/jit_ir_importer/build_tools/torch_ods_gen.py#L1) (invoked via [update_torch_ods.sh](https://github.com/llvm/torch-mlir/blob/main/build_tools/update_torch_ods.sh)). This script queries the registry and generates MLIR [ODS](https://mlir.llvm.org/docs/OpDefinitions/) in [GeneratedTorchOps.td](https://github.com/llvm/torch-mlir/blob/e322f6a8784009b37aa354abfa9a40a80f30877d/include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td#L1). We have a guide for [adding a new op end-to-end](https://github.com/llvm/torch-mlir/wiki/Torch-ops-E2E-implementation). @@ -195,7 +195,7 @@ values. 
When one `torch.jit.script`'s a `torch.nn.Module`, the result is actually an `IValue` that represents the module, with a hierarchy of children `IValue`'s. Strictly speaking, JIT IR `torch::jit::Graph`'s are only used to represent the bodies of methods on the modules. So in addition to importing the -JIT IR, we also need to import the `IValue`'s. This happens inside [ivalue_importer.cpp](https://github.com/llvm/torch-mlir/blob/fde390c7669e29362b18388448ef2b188713383f/python/torch_mlir/dialects/torch/importer/jit_ir/csrc/ivalue_importer.cpp#L1). +JIT IR, we also need to import the `IValue`'s. This happens inside [ivalue_importer.cpp](https://github.com/llvm/torch-mlir/blob/fde390c7669e29362b18388448ef2b188713383f/python/torch_mlir/jit_ir_importer/csrc/ivalue_importer.cpp#L1). Most of the IValue modeling can reuse `torch` dialect ops that already exist otherwise, such as `torch.constant.int` to represent an int in the object graph. diff --git a/include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td b/include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td index 3cacd78a2..0c3efd6ce 100644 --- a/include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td +++ b/include/torch-mlir/Dialect/Torch/IR/GeneratedTorchOps.td @@ -13,7 +13,7 @@ // This file is automatically generated. Please do not edit. // Generated via: // ``` -// python -m torch_mlir.dialects.torch.importer.jit_ir.build_tools.torch_ods_gen +// python -m torch_mlir.jit_ir_importer.build_tools.torch_ods_gen // ``` // //===----------------------------------------------------------------------===// diff --git a/lib/Dialect/Torch/Transforms/AbstractInterpLibrary.cpp b/lib/Dialect/Torch/Transforms/AbstractInterpLibrary.cpp index 92f8c8006..eed24195c 100644 --- a/lib/Dialect/Torch/Transforms/AbstractInterpLibrary.cpp +++ b/lib/Dialect/Torch/Transforms/AbstractInterpLibrary.cpp @@ -6227,7 +6227,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %3 = torch.prim.ListConstruct %1, %2 : (!torch.int, !torch.int) -> !torch.list\n" " return %3 : !torch.list\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.testing_framework._convert_dtype_to_int(%arg0: !torch.int) -> !torch.int {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.testing_framework._convert_dtype_to_int(%arg0: !torch.int) -> !torch.int {\n" " return %arg0 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_shape_fn.aten.triu\"(%arg0: !torch.list, %arg1: !torch.int) -> !torch.list {\n" @@ -7924,7 +7924,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " torch.prim.If %1 -> () {\n" " torch.prim.If.yield\n" " } else {\n" @@ -7939,12 +7939,12 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " return %3 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%arg0: !torch.int) -> !torch.bool {\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_complex_dtypes() : () -> !torch.list\n" +" func.func 
@__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%arg0: !torch.int) -> !torch.bool {\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_complex_dtypes() : () -> !torch.list\n" " %1 = torch.aten.__contains__.int_list %0, %arg0 : !torch.list, !torch.int -> !torch.bool\n" " return %1 : !torch.bool\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_complex_dtypes() -> !torch.list {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_complex_dtypes() -> !torch.list {\n" " %int10 = torch.constant.int 10\n" " %int9 = torch.constant.int 9\n" " %0 = torch.prim.ListConstruct %int9, %int10 : (!torch.int, !torch.int) -> !torch.list\n" @@ -8424,7 +8424,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %true = torch.constant.bool true\n" " %false = torch.constant.bool false\n" " %int6 = torch.constant.int 6\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%arg0) : (!torch.int) -> !torch.bool\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%arg0) : (!torch.int) -> !torch.bool\n" " %1 = torch.prim.If %0 -> (!torch.bool) {\n" " %4 = torch.aten.ne.int %arg0, %int6 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If.yield %4 : !torch.bool\n" @@ -8434,7 +8434,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %2 = torch.prim.If %1 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %4 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%arg0) : (!torch.int) -> !torch.bool\n" +" %4 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%arg0) : (!torch.int) -> !torch.bool\n" " torch.prim.If.yield %4 : !torch.bool\n" " }\n" " %3 = torch.prim.If %2 -> (!torch.int) {\n" @@ -8444,12 +8444,12 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " return %3 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%arg0: !torch.int) -> !torch.bool {\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_float_dtypes() : () -> !torch.list\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%arg0: !torch.int) -> !torch.bool {\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_float_dtypes() : () -> !torch.list\n" " %1 = torch.aten.__contains__.int_list %0, %arg0 : !torch.list, !torch.int -> !torch.bool\n" " return %1 : !torch.bool\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_float_dtypes() -> !torch.list {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_float_dtypes() -> !torch.list {\n" " %int7 = torch.constant.int 7\n" " %int6 = torch.constant.int 6\n" " %int15 = torch.constant.int 15\n" @@ -8524,7 +8524,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.softplus\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.number) -> !torch.int {\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.int) {\n" " torch.prim.If.yield %0#1 : !torch.int\n" " } else {\n" @@ -8533,12 +8533,12 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " return %2 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%arg0: !torch.int) -> !torch.bool {\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_integer_dtypes() : () -> !torch.list\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%arg0: !torch.int) -> !torch.bool {\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_integer_dtypes() : () -> !torch.list\n" " %1 = torch.aten.__contains__.int_list %0, %arg0 : !torch.list, !torch.int -> !torch.bool\n" " return %1 : !torch.bool\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.all_integer_dtypes() -> !torch.list {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.all_integer_dtypes() -> !torch.list {\n" " %int4 = torch.constant.int 4\n" " %int3 = torch.constant.int 3\n" " %int2 = torch.constant.int 2\n" @@ -8559,7 +8559,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %true = torch.constant.bool true\n" " %0 = torch.prim.Uninitialized : !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %3 = torch.aten.__not__ %2 : !torch.bool -> !torch.bool\n" " torch.prim.If %3 -> () {\n" " torch.prim.If.yield\n" @@ -8589,7 +8589,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " func.func @\"__torch_mlir_dtype_fn.prims.sqrt\"(%arg0: !torch.tuple) -> !torch.int {\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.int) {\n" " torch.prim.If.yield %0#1 : !torch.int\n" " } else {\n" @@ -8767,7 +8767,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" " %2:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " torch.prim.If.yield %int4 : !torch.int\n" " } else {\n" @@ -8836,10 +8836,10 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() 
{ " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%arg0: !torch.list>, %arg1: !torch.list) -> !torch.int {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%arg0: !torch.list>, %arg1: !torch.list) -> !torch.int {\n" " %0 = torch.promote_dtypes %arg0, %arg1 : (!torch.list>, !torch.list) -> !torch.int\n" " return %0 : !torch.int\n" " }\n" @@ -8858,7 +8858,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " func.func @\"__torch_mlir_dtype_fn.aten.hardtanh_backward\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.number, %arg3: !torch.number) -> !torch.int {\n" " %int6 = torch.constant.int 6\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -8915,7 +8915,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -8930,7 +8930,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.lift_fresh_copy\"(%arg0: !torch.tuple) -> !torch.int {\n" @@ -9011,7 +9011,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, 
%1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " %5 = torch.aten.eq.int %4, %int11 : !torch.int, !torch.int -> !torch.bool\n" " %6 = torch.prim.If %5 -> (!torch.int) {\n" " torch.prim.If.yield %int4 : !torch.int\n" @@ -9152,7 +9152,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.threshold\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.number) -> !torch.int {\n" @@ -9224,10 +9224,10 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " return %0#1 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.prim.abs.Scalar\"(%arg0: !torch.number) -> !torch.int {\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" " return %0 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0: !torch.number) -> !torch.int {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0: !torch.number) -> !torch.int {\n" " %0 = torch.prim.NumToTensor.Scalar %arg0 : !torch.number -> !torch.tensor\n" " %1 = torch.prim.dtype %0 : !torch.tensor -> !torch.int\n" " return %1 : !torch.int\n" @@ -9239,7 +9239,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " %5 = torch.aten.eq.int %4, %int11 : !torch.int, !torch.int -> !torch.bool\n" " %6 = torch.prim.If %5 -> (!torch.int) {\n" " torch.prim.If.yield %int4 : !torch.int\n" @@ -9369,10 +9369,10 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " func.func @\"__torch_mlir_dtype_fn.aten.add\"(%arg0: !torch.number, %arg1: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %0 = torch.prim.ListConstruct %none, %none : (!torch.none, !torch.none) -> !torch.list>\n" -" %1 = call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%0, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%0, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.fft_fft\"(%arg0: !torch.tuple, %arg1: !torch.optional, %arg2: !torch.int, %arg3: !torch.optional) -> !torch.int {\n" @@ -9386,7 +9386,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %int5 = torch.constant.int 5\n" " %0 = torch.prim.Uninitialized : !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %3 = torch.prim.If %2 -> (!torch.int) {\n" " torch.prim.If.yield %1#1 : !torch.int\n" " } else {\n" @@ -9402,7 +9402,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %9 = torch.prim.If %8 -> (!torch.int) {\n" " torch.prim.If.yield %int10 : !torch.int\n" " } else {\n" -" %10 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %10 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %11 = torch.prim.If %10 -> (!torch.int) {\n" " torch.prim.If.yield %int9 : !torch.int\n" " } else {\n" @@ -9423,9 +9423,9 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.__and__.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int 
{\n" @@ -9433,7 +9433,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.__or__.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9441,7 +9441,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.add.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.number) -> !torch.int {\n" @@ -9449,7 +9449,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bitwise_and.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9457,16 +9457,16 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bitwise_and.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bitwise_or.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9474,7 +9474,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bitwise_xor.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9482,7 +9482,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bitwise_right_shift.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9490,14 +9490,14 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.bmm\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_priority_of_dtype(%0#1) : (!torch.int) -> !torch.int\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_priority_of_dtype(%1#1) : (!torch.int) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_priority_of_dtype(%0#1) : (!torch.int) -> !torch.int\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_priority_of_dtype(%1#1) : (!torch.int) -> !torch.int\n" " %4 = torch.aten.lt.int %2, %3 : !torch.int, !torch.int -> !torch.bool\n" " %5 = torch.prim.If %4 -> (!torch.int) {\n" " torch.prim.If.yield %0#1 : !torch.int\n" @@ -9506,7 +9506,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " return %5 : !torch.int\n" " }\n" -" func.func @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_priority_of_dtype(%arg0: !torch.int) -> !torch.int {\n" +" func.func @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_priority_of_dtype(%arg0: !torch.int) -> !torch.int {\n" " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: Cannot determine priority of dtype\"\n" " %int15 = torch.constant.int 15\n" @@ -9606,7 +9606,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %str_1 = torch.constant.str \"AssertionError: `self` cannot be complex\"\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %3 = torch.aten.__not__ %2 : !torch.bool -> !torch.bool\n" " torch.prim.If %3 -> () {\n" " torch.prim.If.yield\n" @@ -9614,7 +9614,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str_1, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %5 = torch.aten.__not__ %4 : !torch.bool -> !torch.bool\n" " torch.prim.If %5 -> () {\n" " torch.prim.If.yield\n" @@ -9624,7 +9624,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %6 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %7 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %8 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" +" %8 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" " %9 = torch.aten.ne.int %8, %int11 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If %9 -> () {\n" " torch.prim.If.yield\n" @@ -9642,12 +9642,12 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, 
%0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%4) : (!torch.int) -> !torch.bool\n" " %6 = torch.prim.If %5 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%4) : (!torch.int) -> !torch.bool\n" " %9 = torch.prim.If %8 -> (!torch.bool) {\n" " %10 = torch.aten.ne.int %4, %int6 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If.yield %10 : !torch.bool\n" @@ -9686,12 +9686,12 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %4:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %5 = torch.prim.ListConstruct %4#0, %3#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %6 = torch.prim.ListConstruct %4#1, %3#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%5, %6) : (!torch.list>, !torch.list) -> !torch.int\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%7) : (!torch.int) -> !torch.bool\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%5, %6) : (!torch.list>, !torch.list) -> !torch.int\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%7) : (!torch.int) -> !torch.bool\n" " %9 = torch.prim.If %8 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %12 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" +" %12 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" " %13 = torch.prim.If %12 -> (!torch.bool) {\n" " %14 = torch.aten.ne.int %7, %int6 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If.yield %14 : !torch.bool\n" @@ -9725,8 +9725,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " func.func @\"__torch_mlir_dtype_fn.aten.matmul\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_priority_of_dtype(%0#1) : (!torch.int) -> !torch.int\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_priority_of_dtype(%1#1) : (!torch.int) -> !torch.int\n" +" %2 = call 
@__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_priority_of_dtype(%0#1) : (!torch.int) -> !torch.int\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_priority_of_dtype(%1#1) : (!torch.int) -> !torch.int\n" " %4 = torch.aten.lt.int %2, %3 : !torch.int, !torch.int -> !torch.bool\n" " %5 = torch.prim.If %4 -> (!torch.int) {\n" " torch.prim.If.yield %0#1 : !torch.int\n" @@ -9740,7 +9740,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.minimum\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9748,7 +9748,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.mm\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9776,7 +9776,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " } else {\n" " %7 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %8 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %9 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%7, %8) : (!torch.list>, !torch.list) -> !torch.int\n" +" %9 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%7, %8) : (!torch.list>, !torch.list) -> !torch.int\n" " torch.prim.If.yield %9 : !torch.int\n" " }\n" " return %6 : !torch.int\n" @@ -9788,8 +9788,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call 
@__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" " %6 = torch.aten.__not__ %5 : !torch.bool -> !torch.bool\n" " torch.prim.If %6 -> () {\n" " torch.prim.If.yield\n" @@ -9804,7 +9804,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.mv\"(%arg0: !torch.tuple, %arg1: !torch.tuple) -> !torch.int {\n" @@ -9812,7 +9812,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.sub.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.number) -> !torch.int {\n" @@ -9820,7 +9820,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.threshold_backward\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.number) -> !torch.int {\n" @@ -9831,7 +9831,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %str_1 = torch.constant.str \"AssertionError: `grad_output` cannot be complex\"\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %3 = torch.aten.__not__ %2 : !torch.bool -> !torch.bool\n" " torch.prim.If %3 -> () {\n" " torch.prim.If.yield\n" @@ -9839,7 +9839,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException 
%str_1, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %5 = torch.aten.__not__ %4 : !torch.bool -> !torch.bool\n" " torch.prim.If %5 -> () {\n" " torch.prim.If.yield\n" @@ -9849,7 +9849,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %6 = torch.prim.ListConstruct %1#0, %0#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %7 = torch.prim.ListConstruct %1#1, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %8 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" +" %8 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" " %9 = torch.prim.ListConstruct %int11 : (!torch.int) -> !torch.list\n" " %10 = torch.aten.__contains__.int_list %9, %8 : !torch.list, !torch.int -> !torch.bool\n" " %11 = torch.aten.__not__ %10 : !torch.bool -> !torch.bool\n" @@ -9875,7 +9875,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " %5 = torch.prim.If %4 -> (!torch.bool) {\n" " %12 = torch.aten.__isnot__ %0#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" @@ -9889,7 +9889,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %6 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %6 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %7 = torch.aten.__not__ %6 : !torch.bool -> !torch.bool\n" " %8 = torch.prim.If %7 -> (!torch.bool) {\n" " %12 = torch.aten.__isnot__ %1#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" @@ -9905,7 +9905,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %9 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %10 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %11 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%9, %10) : (!torch.list>, !torch.list) -> !torch.int\n" +" %11 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%9, %10) : (!torch.list>, !torch.list) -> !torch.int\n" " return %11 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten._convolution.deprecated\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.optional>, %arg3: !torch.list, %arg4: !torch.list, %arg5: !torch.list, %arg6: !torch.bool, %arg7: !torch.list, %arg8: !torch.int, %arg9: !torch.bool, %arg10: !torch.bool, %arg11: !torch.bool) 
-> !torch.int {\n" @@ -9922,7 +9922,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " %5 = torch.prim.If %4 -> (!torch.bool) {\n" " %12 = torch.aten.__isnot__ %0#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" @@ -9936,7 +9936,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %6 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %6 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %7 = torch.aten.__not__ %6 : !torch.bool -> !torch.bool\n" " %8 = torch.prim.If %7 -> (!torch.bool) {\n" " %12 = torch.aten.__isnot__ %1#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" @@ -9952,7 +9952,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %9 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %10 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %11 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%9, %10) : (!torch.list>, !torch.list) -> !torch.int\n" +" %11 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%9, %10) : (!torch.list>, !torch.list) -> !torch.int\n" " return %11 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.conv2d\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.optional>, %arg3: !torch.list, %arg4: !torch.list, %arg5: !torch.list, %arg6: !torch.int) -> !torch.int {\n" @@ -9980,7 +9980,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %false = torch.constant.bool false\n" " %int11 = torch.constant.int 11\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.bool) {\n" " %5 = torch.aten.ne.int %0#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If.yield %5 : !torch.bool\n" @@ -10015,7 +10015,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %2:2 = torch.prim.TupleUnpack %arg2 : !torch.tuple -> !torch.int, !torch.int\n" " %3 = torch.prim.ListConstruct %0#0, %1#0, %2#0 : (!torch.int, !torch.int, !torch.int) -> !torch.list>\n" " %4 = torch.prim.ListConstruct %0#1, %1#1, %2#1 : (!torch.int, !torch.int, !torch.int) -> !torch.list\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, 
!torch.list) -> !torch.int\n" " return %5 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.lerp.Tensor\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.tuple) -> !torch.int {\n" @@ -10024,7 +10024,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %2:2 = torch.prim.TupleUnpack %arg2 : !torch.tuple -> !torch.int, !torch.int\n" " %3 = torch.prim.ListConstruct %0#0, %1#0, %2#0 : (!torch.int, !torch.int, !torch.int) -> !torch.list>\n" " %4 = torch.prim.ListConstruct %0#1, %1#1, %2#1 : (!torch.int, !torch.int, !torch.int) -> !torch.list\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, !torch.list) -> !torch.int\n" " return %5 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.addcmul\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.tuple, %arg3: !torch.number) -> !torch.int {\n" @@ -10057,7 +10057,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %6 = torch.prim.ListConstruct %0#0, %1#0, %2#0 : (!torch.int, !torch.int, !torch.int) -> !torch.list>\n" " %7 = torch.prim.ListConstruct %0#1, %1#1, %2#1 : (!torch.int, !torch.int, !torch.int) -> !torch.list\n" -" %8 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" +" %8 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%6, %7) : (!torch.list>, !torch.list) -> !torch.int\n" " return %8 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.addcdiv\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.tuple, %arg3: !torch.number) -> !torch.int {\n" @@ -10067,8 +10067,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %2:2 = torch.prim.TupleUnpack %arg2 : !torch.tuple -> !torch.int, !torch.int\n" " %3 = torch.prim.ListConstruct %0#0, %1#0, %2#0 : (!torch.int, !torch.int, !torch.int) -> !torch.list>\n" " %4 = torch.prim.ListConstruct %0#1, %1#1, %2#1 : (!torch.int, !torch.int, !torch.int) -> !torch.list\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, !torch.list) -> !torch.int\n" -" %6 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%5) : (!torch.int) -> !torch.bool\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%3, %4) : (!torch.list>, !torch.list) -> !torch.int\n" +" %6 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%5) : (!torch.int) -> !torch.bool\n" " %7 = torch.prim.If %6 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -10080,27 +10080,27 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : 
(!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.sub.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.mul.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.div.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" @@ -10108,10 +10108,10 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, 
!torch.list) -> !torch.int\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" " %6 = torch.prim.If %5 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -10123,16 +10123,16 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.floor_divide.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -10141,27 +10141,27 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield\n" " }\n" " %3 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %5 = torch.prim.ListConstruct %0#1, %4 : (!torch.int, !torch.int) -> !torch.list\n" -" %6 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%3, %5) : (!torch.list>, !torch.list) -> !torch.int\n" +" %6 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%3, %5) : (!torch.list>, !torch.list) -> !torch.int\n" " return %6 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.pow.Scalar\"(%arg0: !torch.number, %arg1: !torch.tuple) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %none, %0#0 : (!torch.none, !torch.int) -> 
!torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %2, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.pow.Tensor_Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.leaky_relu\"(%arg0: !torch.tuple, %arg1: !torch.number) -> !torch.int {\n" @@ -10177,10 +10177,10 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield\n" " }\n" " %2 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%3) : (!torch.int) -> !torch.bool\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%3) : (!torch.int) -> !torch.bool\n" " torch.prim.If %4 -> () {\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %8 = torch.aten.__not__ %7 : !torch.bool -> !torch.bool\n" " torch.prim.If %8 -> () {\n" " torch.prim.If.yield\n" @@ -10193,7 +10193,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield\n" " }\n" " %5 = torch.prim.ListConstruct %0#1, %3 : (!torch.int, !torch.int) -> !torch.list\n" -" %6 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %5) : (!torch.list>, !torch.list) -> !torch.int\n" +" %6 = 
call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %5) : (!torch.list>, !torch.list) -> !torch.int\n" " return %6 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.elu\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.number, %arg3: !torch.number) -> !torch.int {\n" @@ -10215,7 +10215,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.Loop %int3, %true, init() {\n" " ^bb0(%arg4: !torch.int):\n" " %7 = torch.aten.__getitem__.t %3, %arg4 : !torch.list, !torch.int -> !torch.number\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%7) : (!torch.number) -> !torch.int\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%7) : (!torch.number) -> !torch.int\n" " %9 = torch.aten.append.t %2, %8 : !torch.list, !torch.int -> !torch.list\n" " torch.prim.Loop.condition %true, iter()\n" " } : (!torch.int, !torch.bool) -> ()\n" @@ -10224,13 +10224,13 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.Loop %5, %true, init() {\n" " ^bb0(%arg4: !torch.int):\n" " %7 = torch.aten.__getitem__.t %2, %arg4 : !torch.list, !torch.int -> !torch.int\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" " %9 = torch.aten.append.t %4, %8 : !torch.list, !torch.bool -> !torch.list\n" " torch.prim.Loop.condition %true, iter()\n" " } : (!torch.int, !torch.bool) -> ()\n" " %6 = torch.aten.any.bool %4 : !torch.list -> !torch.bool\n" " torch.prim.If %6 -> () {\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %8 = torch.aten.__not__ %7 : !torch.bool -> !torch.bool\n" " torch.prim.If %8 -> () {\n" " torch.prim.If.yield\n" @@ -10248,9 +10248,9 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.baddbmm\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.tuple, %arg3: !torch.number, %arg4: !torch.number) -> !torch.int {\n" @@ -10282,7 +10282,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " %5 = 
torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %6 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %7 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%5, %6) : (!torch.list>, !torch.list) -> !torch.int\n" +" %7 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%5, %6) : (!torch.list>, !torch.list) -> !torch.int\n" " return %7 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.where.self\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.tuple) -> !torch.int {\n" @@ -10290,18 +10290,18 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg2 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.where.Scalar\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.number) -> !torch.int {\n" " %int6 = torch.constant.int 6\n" " %int4 = torch.constant.int 4\n" " %false = torch.constant.bool false\n" -" %0 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0) : (!torch.int) -> !torch.bool\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.bool) {\n" -" %4 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" -" %5 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %4 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" +" %5 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" " torch.prim.If.yield %5 : !torch.bool\n" " } else {\n" " torch.prim.If.yield %false : !torch.bool\n" @@ -10317,18 +10317,18 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %0#0, %none : (!torch.int, !torch.none) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" " %3 = 
torch.prim.ListConstruct %0#1, %2 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.where.ScalarSelf\"(%arg0: !torch.tuple, %arg1: !torch.number, %arg2: !torch.tuple) -> !torch.int {\n" " %none = torch.constant.none\n" " %0:2 = torch.prim.TupleUnpack %arg2 : !torch.tuple -> !torch.int, !torch.int\n" " %1 = torch.prim.ListConstruct %none, %0#0 : (!torch.none, !torch.int) -> !torch.list>\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" " %3 = torch.prim.ListConstruct %2, %0#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%1, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.nll_loss_forward\"(%arg0: !torch.tuple, %arg1: !torch.tuple, %arg2: !torch.optional>, %arg3: !torch.int, %arg4: !torch.int) -> !torch.tuple {\n" @@ -10355,7 +10355,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -10395,7 +10395,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " func.func @\"__torch_mlir_dtype_fn.aten.native_batch_norm\"(%arg0: !torch.tuple, %arg1: !torch.optional>, %arg2: !torch.optional>, %arg3: !torch.optional>, %arg4: !torch.optional>, %arg5: !torch.bool, %arg6: !torch.float, %arg7: !torch.float) -> !torch.tuple {\n" " %int6 = torch.constant.int 6\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -10412,7 +10412,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %0 = torch.aten.__isnot__ %arg1, %none : !torch.optional, !torch.none -> !torch.bool\n" " %1 = torch.prim.If %0 -> (!torch.int) {\n" " %2 = torch.prim.unchecked_cast %arg1 : !torch.optional -> !torch.int\n" -" %3 = func.call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -10422,8 +10422,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" -" %2 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %2 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -10442,7 +10442,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %0 = torch.aten.__isnot__ %arg2, %none : !torch.optional, !torch.none -> !torch.bool\n" " %1 = torch.prim.If %0 -> (!torch.int) {\n" " %2 = torch.prim.unchecked_cast %arg2 : !torch.optional -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -10452,13 +10452,13 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" -" %2 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %2 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %6 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%6) : (!torch.int) -> !torch.bool\n" +" %6 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%6) : (!torch.int) -> !torch.bool\n" " torch.prim.If.yield %7 : !torch.bool\n" " }\n" " %5 = torch.prim.If %4 -> (!torch.int) {\n" @@ -10479,7 +10479,7 @@ StringRef 
mlir::torch::Torch::getAbstractInterpLibrary() { " %0 = torch.aten.__isnot__ %arg3, %none : !torch.optional, !torch.none -> !torch.bool\n" " %1 = torch.prim.If %0 -> (!torch.int) {\n" " %2 = torch.prim.unchecked_cast %arg3 : !torch.optional -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -10489,20 +10489,20 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " }\n" " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" -" %2 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %2 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" " torch.prim.If.yield %8 : !torch.bool\n" " }\n" " %5 = torch.prim.If %4 -> (!torch.bool) {\n" " torch.prim.If.yield %true : !torch.bool\n" " } else {\n" -" %7 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" +" %7 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg2) : (!torch.number) -> !torch.int\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%7) : (!torch.int) -> !torch.bool\n" " torch.prim.If.yield %8 : !torch.bool\n" " }\n" " %6 = torch.prim.If %5 -> (!torch.int) {\n" @@ -10523,7 +10523,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" " %2:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " 
torch.prim.If.yield %int4 : !torch.int\n" " } else {\n" @@ -10546,7 +10546,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" " %2:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2#1) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " torch.prim.If.yield %int4 : !torch.int\n" " } else {\n" @@ -10560,7 +10560,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0 = call @\"__torch_mlir_dtype_fn.aten.sum\"(%arg0, %arg3) : (!torch.tuple, !torch.optional) -> !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -10674,7 +10674,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -10685,7 +10685,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %3 = torch.aten.__isnot__ %arg4, %none : !torch.optional, !torch.none -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " %5 = torch.prim.unchecked_cast %arg4 : !torch.optional -> !torch.int\n" -" %6 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%5) : (!torch.int) -> !torch.bool\n" +" %6 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%5) : (!torch.int) -> !torch.bool\n" " %7 = torch.aten.__not__ %6 : !torch.bool -> !torch.bool\n" " torch.prim.If %7 -> () {\n" " torch.prim.If.yield\n" @@ -10693,9 +10693,9 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %8 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %8 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %9 = torch.prim.If %8 -> (!torch.int) {\n" -" %10 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%5) : (!torch.int) -> !torch.bool\n" +" %10 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%5) : (!torch.int) -> !torch.bool\n" " 
torch.prim.If %10 -> () {\n" " torch.prim.If.yield\n" " } else {\n" @@ -10706,7 +10706,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %12 = func.call @\"__torch_mlir_dtype_fn.aten.std\"(%11, %true) : (!torch.tuple, !torch.bool) -> !torch.int\n" " torch.prim.If.yield %12 : !torch.int\n" " } else {\n" -" %10 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%5) : (!torch.int) -> !torch.bool\n" +" %10 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%5) : (!torch.int) -> !torch.bool\n" " %11 = torch.aten.__not__ %10 : !torch.bool -> !torch.bool\n" " torch.prim.If %11 -> () {\n" " torch.prim.If.yield\n" @@ -10827,8 +10827,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %2 = torch.prim.unchecked_cast %arg2 : !torch.optional -> !torch.int\n" " torch.prim.If.yield %2 : !torch.int\n" " } else {\n" -" %2 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %2 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg1) : (!torch.number) -> !torch.int\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.prim.If %3 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -10981,7 +10981,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %5 = torch.prim.unchecked_cast %arg1 : !torch.optional -> !torch.int\n" " torch.prim.If.yield %5 : !torch.int\n" " }\n" -" %3 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -11041,7 +11041,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %int4 : !torch.int\n" " } else {\n" " %2 = torch.prim.unchecked_cast %arg3 : !torch.optional -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -11062,7 +11062,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" " %2 = torch.prim.unchecked_cast %arg1 : !torch.optional -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -11083,7 +11083,7 @@ StringRef 
mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" " %2 = torch.prim.unchecked_cast %arg2 : !torch.optional -> !torch.int\n" -" %3 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" +" %3 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%2) : (!torch.int) -> !torch.bool\n" " %4 = torch.aten.__not__ %3 : !torch.bool -> !torch.bool\n" " torch.prim.If %4 -> () {\n" " torch.prim.If.yield\n" @@ -11103,7 +11103,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -11136,7 +11136,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %none = torch.constant.none\n" " %str = torch.constant.str \"AssertionError: \"\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.aten.__not__ %1 : !torch.bool -> !torch.bool\n" " torch.prim.If %2 -> () {\n" " torch.prim.If.yield\n" @@ -11167,8 +11167,8 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%4) : (!torch.int) -> !torch.bool\n" " %6 = torch.prim.If %5 -> (!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -11179,7 +11179,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " func.func @\"__torch_mlir_dtype_fn.aten.atan\"(%arg0: !torch.tuple) -> !torch.int {\n" " %int6 = torch.constant.int 6\n" " %0:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %1 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" +" %1 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%0#1) : (!torch.int) -> !torch.bool\n" " %2 = torch.prim.If %1 -> 
(!torch.int) {\n" " torch.prim.If.yield %int6 : !torch.int\n" " } else {\n" @@ -11192,7 +11192,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %1:2 = torch.prim.TupleUnpack %arg1 : !torch.tuple -> !torch.int, !torch.int\n" " %2 = torch.prim.ListConstruct %0#0, %1#0 : (!torch.int, !torch.int) -> !torch.list>\n" " %3 = torch.prim.ListConstruct %0#1, %1#1 : (!torch.int, !torch.int) -> !torch.list\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%2, %3) : (!torch.list>, !torch.list) -> !torch.int\n" " return %4 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.cat\"(%arg0: !torch.list>, %arg1: !torch.int) -> !torch.int {\n" @@ -11219,7 +11219,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %9 = torch.aten.append.t %1, %7#1 : !torch.list, !torch.int -> !torch.list\n" " torch.prim.Loop.condition %true, iter()\n" " } : (!torch.int, !torch.bool) -> ()\n" -" %5 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(%0, %1) : (!torch.list>, !torch.list) -> !torch.int\n" +" %5 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(%0, %1) : (!torch.list>, !torch.list) -> !torch.int\n" " return %5 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten._shape_as_tensor\"(%arg0: !torch.tuple) -> !torch.int {\n" @@ -11236,7 +11236,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " %str_0 = torch.constant.str \"AssertionError: \"\n" " %0 = torch.prim.Uninitialized : !torch.int\n" " %1:2 = torch.prim.TupleUnpack %arg0 : !torch.tuple -> !torch.int, !torch.int\n" -" %2 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %2 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_complex_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %3 = torch.aten.__not__ %2 : !torch.bool -> !torch.bool\n" " torch.prim.If %3 -> () {\n" " torch.prim.If.yield\n" @@ -11244,11 +11244,11 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " torch.prim.RaiseException %str_0, %none : !torch.str, !torch.none\n" " torch.prim.If.yield\n" " }\n" -" %4 = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_float_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %4 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_float_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %5 = torch.prim.If %4 -> (!torch.int) {\n" " torch.prim.If.yield %int7 : !torch.int\n" " } else {\n" -" %6 = func.call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" +" %6 = func.call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.is_integer_dtype(%1#1) : (!torch.int) -> !torch.bool\n" " %7 = torch.prim.If %6 -> (!torch.bool) {\n" " %9 = torch.aten.ne.int %1#1, %int11 : !torch.int, !torch.int -> !torch.bool\n" " torch.prim.If.yield %9 : !torch.bool\n" @@ -11272,7 +11272,7 @@ StringRef mlir::torch::Torch::getAbstractInterpLibrary() { " return %5 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.prim.NumToTensor.Scalar\"(%arg0: !torch.number) -> !torch.int {\n" -" %0 = call 
@__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" +" %0 = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.get_dtype_of_scalar(%arg0) : (!torch.number) -> !torch.int\n" " return %0 : !torch.int\n" " }\n" " func.func @\"__torch_mlir_dtype_fn.aten.softmax.int\"(%arg0: !torch.tuple, %arg1: !torch.int, %arg2: !torch.optional) -> !torch.int {\n" diff --git a/projects/ltc/csrc/base_lazy_backend/mlir_lowering_context.cpp b/projects/ltc/csrc/base_lazy_backend/mlir_lowering_context.cpp index 4823b4929..fd93d4d2b 100644 --- a/projects/ltc/csrc/base_lazy_backend/mlir_lowering_context.cpp +++ b/projects/ltc/csrc/base_lazy_backend/mlir_lowering_context.cpp @@ -21,7 +21,7 @@ #include "mlir-c/IR.h" #include "mlir-c/Pass.h" -#include "../../dialects/torch/importer/jit_ir/csrc/function_importer.h" +#include "../../jit_ir_importer/csrc/function_importer.h" #include "backend_impl.h" #include "mlir_lowering_context.h" #include "mlir_node.h" diff --git a/projects/pt1/examples/torchscript_resnet_inference.ipynb b/projects/pt1/examples/torchscript_resnet_inference.ipynb index 82258fd39..3ab7cc64d 100644 --- a/projects/pt1/examples/torchscript_resnet_inference.ipynb +++ b/projects/pt1/examples/torchscript_resnet_inference.ipynb @@ -92,8 +92,8 @@ "import torchvision\n", "\n", "import torch_mlir\n", - "from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder\n", - "from torch_mlir.dialects.torch.importer.jit_ir.torchscript_annotations import extract_annotations\n", + "from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder\n", + "from torch_mlir.jit_ir_importer.torchscript_annotations import extract_annotations\n", "\n", "from torch_mlir.passmanager import PassManager\n", "from torch_mlir_e2e_test.linalg_on_tensors_backends.refbackend import RefBackendLinalgOnTensorsBackend" diff --git a/projects/pt1/python/test/annotations-sugar.py b/projects/pt1/python/test/annotations-sugar.py index 98cbec74d..e540e84b9 100644 --- a/projects/pt1/python/test/annotations-sugar.py +++ b/projects/pt1/python/test/annotations-sugar.py @@ -8,8 +8,8 @@ import torch from torch_mlir_e2e_test.annotations import annotate_args, export -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator -from torch_mlir.dialects.torch.importer.jit_ir.torchscript_annotations import extract_annotations +from torch_mlir.jit_ir_importer import ClassAnnotator +from torch_mlir.jit_ir_importer.torchscript_annotations import extract_annotations class MmModule(torch.nn.Module): def __init__(self): diff --git a/projects/pt1/python/torch_mlir/__init__.py b/projects/pt1/python/torch_mlir/__init__.py index 8de6cc1a1..8bbcce994 100644 --- a/projects/pt1/python/torch_mlir/__init__.py +++ b/projects/pt1/python/torch_mlir/__init__.py @@ -17,8 +17,8 @@ from torch_mlir.dynamo import _get_decomposition_table from torch.fx.experimental.proxy_tensor import make_fx from .compiler_utils import run_pipeline_with_repro_report -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ImportOptions, ModuleBuilder -from torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator import generate_library +from torch_mlir.jit_ir_importer import ClassAnnotator, ImportOptions, ModuleBuilder +from torch_mlir.jit_ir_importer.build_tools.library_generator import generate_library class OutputType(Enum): diff --git a/projects/pt1/python/torch_mlir/jit_ir_importer/__init__.py 
b/projects/pt1/python/torch_mlir/jit_ir_importer/__init__.py index ead98dd5c..9177515aa 100644 --- a/projects/pt1/python/torch_mlir/jit_ir_importer/__init__.py +++ b/projects/pt1/python/torch_mlir/jit_ir_importer/__init__.py @@ -11,7 +11,7 @@ import torch # Our native extension is not self-contained. It references libraries which # must come in via the above first. -from ....._mlir_libs._jit_ir_importer import * +from .._mlir_libs._jit_ir_importer import * __all__ = [ diff --git a/projects/pt1/python/torch_mlir/jit_ir_importer/build_tools/library_generator.py b/projects/pt1/python/torch_mlir/jit_ir_importer/build_tools/library_generator.py index 74eb520e2..6cd19643a 100644 --- a/projects/pt1/python/torch_mlir/jit_ir_importer/build_tools/library_generator.py +++ b/projects/pt1/python/torch_mlir/jit_ir_importer/build_tools/library_generator.py @@ -10,7 +10,7 @@ import codecs import torch -from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder +from torch_mlir.jit_ir_importer import ModuleBuilder from torch_mlir.passmanager import PassManager from .registry import Registry diff --git a/projects/pt1/python/torch_mlir/jit_ir_importer/torchscript_annotations.py b/projects/pt1/python/torch_mlir/jit_ir_importer/torchscript_annotations.py index d495dda48..a6541b650 100644 --- a/projects/pt1/python/torch_mlir/jit_ir_importer/torchscript_annotations.py +++ b/projects/pt1/python/torch_mlir/jit_ir_importer/torchscript_annotations.py @@ -8,7 +8,7 @@ from typing import List, Optional, Tuple import torch import torch_mlir -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator +from torch_mlir.jit_ir_importer import ClassAnnotator # Decorators diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-error.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-error.py index 7c448f6e3..26eaa5bd0 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-error.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-error.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-tensor-type-bound.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-tensor-type-bound.py index e8bcd4864..6cc2d57b1 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-tensor-type-bound.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/arg-tensor-type-bound.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/class-annotator-repr.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/class-annotator-repr.py index ce235a6bf..3a2ed4319 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/class-annotator-repr.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/class-annotator-repr.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, 
ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-error.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-error.py index cc4b5656b..2a0806f6f 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-error.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-error.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-recursive.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-recursive.py index cc2963d46..79b4dccd2 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-recursive.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export-recursive.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export.py index 37b5d48ad..433f8249b 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/annotations/export.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ModuleBuilder +from torch_mlir.jit_ir_importer import ClassAnnotator, ModuleBuilder # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s mb = ModuleBuilder() diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/debug-module-name.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/debug-module-name.py index f4ad4dd3a..399b45f73 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/debug-module-name.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/debug-module-name.py @@ -5,7 +5,7 @@ import typing import torch -from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder +from torch_mlir.jit_ir_importer import ModuleBuilder # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/dict.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/dict.py index 0a9e7f926..117b0cff9 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/dict.py +++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/dict.py @@ -5,7 +5,7 @@ from typing import Dict, Optional import torch -from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder +from torch_mlir.jit_ir_importer import ModuleBuilder # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions-that-call-methods.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions-that-call-methods.py index ade43aca0..318e09975 100644 --- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions-that-call-methods.py +++ 
b/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions-that-call-methods.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions.py
index 484260617..ee22a495e 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/functions.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/list.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/list.py
index 2e8765be4..0c1b8f2ff 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/list.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/list.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-derefine.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-derefine.py
index 6a941330d..fee1b2922 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-derefine.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-derefine.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-locations.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-locations.py
index 7eb98beb9..5d38d6e3a 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-locations.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods-locations.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods.py
index fc246c458..0143012bf 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/methods.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error-submodule.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error-submodule.py
index 9bd66c97c..eae86ec1c 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error-submodule.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error-submodule.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: not %PYTHON %s 2>&1 | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error.py
index a3ce3440c..968509acc 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-error.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: not %PYTHON %s 2>&1 | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-torch-bug.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-torch-bug.py
index 25d651014..4c323ec01 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-torch-bug.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity-torch-bug.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity.py
index 253bdfcec..0f6516a27 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/object-identity.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/prim.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/prim.py
index 55fed3299..e48c327ed 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/prim.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/prim.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/primitives.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/primitives.py
index 3bcfb0717..3cb8cf992 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/primitives.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/primitives.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/quantization.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/quantization.py
index f05cf434f..d77b98323 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/quantization.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/quantization.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # UNSUPPORTED: system-darwin
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/strings.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/strings.py
index d7d94bd90..b65d6f5ca 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/strings.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/strings.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules-select.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules-select.py
index b0834691e..5b2cf04b5 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules-select.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules-select.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules.py
index 92333d20e..d9983628d 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/submodules.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors-value-semantics.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors-value-semantics.py
index e57c20fe5..36dfa32f0 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors-value-semantics.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors-value-semantics.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ImportOptions, ModuleBuilder
+from torch_mlir.jit_ir_importer import ClassAnnotator, ImportOptions, ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors.py
index 831c619ad..31a89e3e1 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tensors.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tuple.py b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tuple.py
index 3b0bf2d4e..7bed706ac 100644
--- a/projects/pt1/test/python/importer/jit_ir/ivalue_import/tuple.py
+++ b/projects/pt1/test/python/importer/jit_ir/ivalue_import/tuple.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/classes.py b/projects/pt1/test/python/importer/jit_ir/node_import/classes.py
index 511aac690..09e2b1b0b 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/classes.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/classes.py
@@ -6,7 +6,7 @@ import typing
 
 import torch
 from torch._C import CompilationUnit
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 import typing
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/debug-info.py b/projects/pt1/test/python/importer/jit_ir/node_import/debug-info.py
index f7b441a12..bb6ab4ce4 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/debug-info.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/debug-info.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/dict.py b/projects/pt1/test/python/importer/jit_ir/node_import/dict.py
index ed4371bb0..0060357b4 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/dict.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/dict.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 import collections
 from typing import Tuple, Optional, List, NamedTuple, Dict
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/elif.py b/projects/pt1/test/python/importer/jit_ir/node_import/elif.py
index 3a9d3a321..71853b0c0 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/elif.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/elif.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/errors.py b/projects/pt1/test/python/importer/jit_ir/node_import/errors.py
index be0479dcd..2ac801bdd 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/errors.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/errors.py
@@ -5,7 +5,7 @@
 import enum
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 
 class Color(enum.Enum):
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/function-block-arg-adjustment.py b/projects/pt1/test/python/importer/jit_ir/node_import/function-block-arg-adjustment.py
index e245ec870..a724f1185 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/function-block-arg-adjustment.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/function-block-arg-adjustment.py
@@ -2,7 +2,7 @@
 # This file is licensed under a pytorch-style license
 # See LICENSE.pytorch for license information.
 
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 from utils import create_script_function
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/function-derefine.py b/projects/pt1/test/python/importer/jit_ir/node_import/function-derefine.py
index 94eed3cef..89f5604bf 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/function-derefine.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/function-derefine.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 import typing
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/if.py b/projects/pt1/test/python/importer/jit_ir/node_import/if.py
index fd8a7267e..8289e0503 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/if.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/if.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/list.py b/projects/pt1/test/python/importer/jit_ir/node_import/list.py
index 9a09914e3..2b30d545b 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/list.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/list.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/loop.py b/projects/pt1/test/python/importer/jit_ir/node_import/loop.py
index e21f4c8c0..d6bb141f2 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/loop.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/loop.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 import typing
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/prim.py b/projects/pt1/test/python/importer/jit_ir/node_import/prim.py
index 2565c6c41..07a56616e 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/prim.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/prim.py
@@ -5,7 +5,7 @@
 import typing
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ClassAnnotator, ImportOptions, ModuleBuilder
+from torch_mlir.jit_ir_importer import ClassAnnotator, ImportOptions, ModuleBuilder
 
 from utils import create_script_function
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/tuple.py b/projects/pt1/test/python/importer/jit_ir/node_import/tuple.py
index 8e14b677f..2dff435cd 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/tuple.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/tuple.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 from typing import Tuple, Optional, NamedTuple
 from utils import create_script_function
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/types-bool.py b/projects/pt1/test/python/importer/jit_ir/node_import/types-bool.py
index f08fba24c..8da5e0e2c 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/types-bool.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/types-bool.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/types-none.py b/projects/pt1/test/python/importer/jit_ir/node_import/types-none.py
index eae6b4578..a0e86a66a 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/types-none.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/types-none.py
@@ -3,7 +3,7 @@
 # See LICENSE.pytorch for license information.
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/projects/pt1/test/python/importer/jit_ir/node_import/union.py b/projects/pt1/test/python/importer/jit_ir/node_import/union.py
index 691a8e413..14eb41a21 100644
--- a/projects/pt1/test/python/importer/jit_ir/node_import/union.py
+++ b/projects/pt1/test/python/importer/jit_ir/node_import/union.py
@@ -5,7 +5,7 @@
 from typing import Union
 
 import torch
-from torch_mlir.dialects.torch.importer.jit_ir import ModuleBuilder
+from torch_mlir.jit_ir_importer import ModuleBuilder
 
 # RUN: %PYTHON %s | torch-mlir-opt | FileCheck %s
diff --git a/test/Dialect/Torch/reify-dtype-calculations.mlir b/test/Dialect/Torch/reify-dtype-calculations.mlir
index 9aec26662..3fe94d041 100644
--- a/test/Dialect/Torch/reify-dtype-calculations.mlir
+++ b/test/Dialect/Torch/reify-dtype-calculations.mlir
@@ -24,11 +24,11 @@ func.func @basic(%arg0: !torch.vtensor) -> !torch.vtensor {
 
 // -----
 
-// CHECK-LABEL: func.func private @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes(
+// CHECK-LABEL: func.func private @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes(
 // CHECK: {{.*}} = torch.promote_dtypes {{.*}} : (!torch.list<optional<int>>, !torch.list<int>) -> !torch.int
 
 // CHECK-LABEL: func.func private @__torch_mlir_dtype_fn.aten.floor_divide(
-// CHECK: {{.*}} = call @__torch__.torch_mlir.dialects.torch.importer.jit_ir.build_tools.library_generator.promote_dtypes({{.*}}
+// CHECK: {{.*}} = call @__torch__.torch_mlir.jit_ir_importer.build_tools.library_generator.promote_dtypes({{.*}}
 
 // CHECK-LABEL: func.func @op_with_dtype_promotion(
 // CHECK: {{.*}} = func.call @__torch_mlir_dtype_fn.aten.floor_divide({{.*}}
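
Note: only the import path changes in this patch; the importer API itself is untouched. For reference, a minimal sketch of how the renamed package is used, in the spirit of the updated tests (the `Add` module and the exact print call below are illustrative, not part of this patch):

import torch
from torch_mlir.jit_ir_importer import ModuleBuilder  # new import path introduced by this change


class Add(torch.nn.Module):  # hypothetical example module, not from the patch
    def forward(self, a, b):
        return a + b


mb = ModuleBuilder()
# Import the TorchScript-compiled module into MLIR's `torch` dialect.
mb.import_module(torch.jit.script(Add())._c)
mb.module.operation.print()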