Commit Graph

17 Commits (c9357950864d84ab9a2adf10b2aaa15fec6196e9)

Author SHA1 Message Date
Ashay Rane 874fdb7e42
build: improve robustness of cmake and shell scripts (#1018)
On my local machine, `unzip` didn't exist (producing a "command not
found" error), but CMake ignored the error.  Although the build did
succeed (because it found a previously-built version of libtorch), it
seems better to abort builds on such failures, so this patch checks the
return code of all external process invocations.

Along similar lines, this patch also updates the shell scripts in
`build_tools` to extensively use double-quoting to prevent unintentional
word splitting or globbing.  Since some of the scripts execute `rm` on
paths built from shell variables, this patch also adds the preamble
`set -u`, which aborts execution when an undefined variable is
referenced, reducing the chances of executing `rm -rf /` because a path
variable happened to be undefined.
2022-07-06 14:39:30 -07:00
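
The same fail-fast idea as this patch, sketched here in Python rather than the CMake and shell code the patch actually touches (the command and archive name are only illustrative): a failing external tool raises immediately instead of being silently ignored.

```python
import subprocess

def run_or_abort(cmd):
    # check=True raises CalledProcessError on a non-zero exit status, and a
    # missing executable raises FileNotFoundError, so either way the build
    # aborts instead of continuing with a stale, previously-built libtorch.
    subprocess.run(cmd, check=True)

run_or_abort(["unzip", "-o", "libtorch.zip", "-d", "libtorch"])  # hypothetical archive name
```
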
Prashant Kumar 10c8e3c593 Add simple neural_net and bert_training scripts.
1. With the help of `make_fx` we are able to get the full training graph
   with weight updates.
2. NeuralNet_training passes. Bert_training passes after cherry-picking
   https://github.com/llvm/torch-mlir/pull/844.
3. TODO: Remove the functorch dependency after make_fx moves to
   PyTorch core.
2022-05-19 06:18:42 +05:30
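
A minimal, hypothetical sketch of the `make_fx`-based extraction described in the commit above (toy linear model; not the scripts added here): writing the training step functionally lets the traced graph contain the forward pass, the backward pass, and the weight updates.

```python
import torch
from functorch import grad, make_fx  # these APIs have since moved toward PyTorch core

def loss_fn(w, b, x, y):
    pred = x @ w + b
    return ((pred - y) ** 2).mean()

def train_step(w, b, x, y):
    # grad() captures the backward pass; the SGD-style update stays
    # functional, so it shows up in the traced graph as well.
    gw, gb = grad(loss_fn, argnums=(0, 1))(w, b, x, y)
    return w - 0.1 * gw, b - 0.1 * gb

w, b = torch.randn(4, 1), torch.randn(1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

graph = make_fx(train_step)(w, b, x, y)
print(graph.code)  # one FX graph: forward, backward, and weight updates
```
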
Prashant Kumar 5192a4e9f3 Remove heavy_deps models that don't get serialized.
BART, BigBird, and GPT2 are not being serialized and hence are removed.
Also, changed the script to obtain the resnest model.
2022-04-29 17:21:25 +05:30
Vivek Khandelwal 4635d36efb [MLIR][TORCH] Add heavydep tests for torch benchmarks
This commit adds e2e heavydep tests for the torch benchmarks.

Signed-off-by: Vivek Khandelwal <vivek@nod-labs.com>
2022-04-26 13:22:08 +05:30
Prashant Kumar e9c785b04b Generate backward graph via functorch-aot module
An example demonstrating the extraction of both the forward and backward
graphs via functorch's AOT module is added.
2022-04-22 20:58:35 +05:30
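
A hypothetical sketch of the pattern that example demonstrates (the toy function and printer are illustrative, not the committed example): `aot_function` hands the forward and backward FX graphs to user-supplied compiler callbacks.

```python
import torch
from functorch.compile import aot_function  # functorch's AOT module, as packaged at the time

def make_printer(name):
    def compiler(fx_module, example_inputs):
        print(f"==== {name} graph ====")
        print(fx_module.code)
        return fx_module  # returning the GraphModule executes it unchanged
    return compiler

def fn(x, w):
    return torch.relu(x @ w).sum()

compiled = aot_function(fn, fw_compiler=make_printer("forward"),
                        bw_compiler=make_printer("backward"))

x = torch.randn(4, 4, requires_grad=True)
w = torch.randn(4, 4, requires_grad=True)
compiled(x, w).backward()  # each callback prints its graph when it is compiled
```
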
Sean Silva 0378c75b35 Centralize all test serialization logic. 2022-03-28 10:17:13 -07:00
Prashant Kumar 730cdcd071 Add Hugging Face `albert-base-v2` in torchscript_e2e_heavydep_tests
`albert-base-v2` for sequence classification is added in e2e_heavy_test.
2022-03-24 17:43:24 +05:30
Sean Silva 3734f69119 Remove basic_mt from the heavydep tests
This was an aspirational goal at an earlier stage in the project where
the focus was heavily on programs with state, control flow, and
lists/dicts. We will likely circle back to such programs in 2022H2, but
for now, having this test doesn't add much, since basically nothing works
or is being worked on.
2022-03-15 15:25:53 -07:00
Sean Silva dcab39146f Remove the last mentions of npcomp from torch-mlir
These snuck through.
2021-10-05 20:17:23 +00:00
Yi Zhang fadd76e9b8 E2e for MiniLM-L6-H384-uncased-sst2
Replace the original BertSequenceClassification with this new one.
The set of ops that need to be supported is identical.
2021-10-05 12:45:19 -04:00
Sean Silva 5b6902e31c Dual license the torch-mlir project.
This commit (with approval from all contributors) dual licenses
the torch-mlir project under both the standard LLVM license and the
standard PyTorch license. This will facilitate moving code between
torch-mlir and the two upstream projects.

The standard file comment is now:

```
// This file is licensed under the Apache License v2.0 with LLVM Exceptions.
// See https://llvm.org/LICENSE.txt for license information.
// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
// Also available under a BSD-style license. See LICENSE.
```

See `LICENSE` in the project root for the terms of both licenses.
2021-10-01 10:46:08 -07:00
Yi Zhang 89225b0cd8 Add BertSequenceClassification model to e2e
Use torch tracing to get the module because the original model is not
TorchScriptable out of the box.
2021-09-30 13:30:29 -04:00
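
A hypothetical sketch of that tracing approach (the checkpoint name and inputs are illustrative): Hugging Face's `torchscript=True` flag makes the model return plain tuples, so `torch.jit.trace` can capture it even though the model is not scriptable as-is.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", torchscript=True).eval()

inputs = tokenizer("torch-mlir end-to-end test", return_tensors="pt")
# Trace with concrete example inputs; the result is a TorchScript module
# that downstream tooling (e.g. an e2e test harness) can serialize and load.
traced = torch.jit.trace(model, (inputs["input_ids"], inputs["attention_mask"]))
```
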
powderluv b55baf508a
Updates to Readme.md (#334) 2021-09-28 13:50:25 -07:00
Sean Silva 4fad753073 Move external/torch-mlir to the root of the repo. 2021-09-27 17:11:08 -07:00
Sean Silva 404bd74ddf Port the bulk of the remaining code to torch-mlir
This leaves no real code outside torch-mlir.

This also renames the "npcomp backend contract" to the "linalg on tensors
backend contract", the name of the abstraction layer that RefBackend (and
IREE) accepts.
2021-09-27 12:48:33 -07:00
Sean Silva 0eb767ea45 Remove frontends/pytorch directory.
It just contained the e2e testing framework. We now fold it into the
main project to reduce complexity.

- `frontends/pytorch/python/` -> `python/torch_support`
- `frontends/pytorch/e2e_testing` -> `e2e_testing`
- `frontends/pytorch/examples` -> `examples`
- `frontends/pytorch/test` -> `python/test`
- `torch_mlir_torchscript` python module -> `npcomp_torchscript`
- `torch_mlir_torchscript_e2e_test_configs` python module ->
  `npcomp_torchscript_e2e_test_configs`

This also changes the license of a handful of files from the
"pytorch-style" license to the regular LLVM/npcomp license. The only
people who committed to those files were myself and Yi.
2021-09-17 09:27:49 -07:00
Sean Silva 453e29ea05 Add E2E support for tests with heavy dependencies (heavydep tests).
The tests use the same (pure-Python) test framework as the
normal torchscript_e2e_test.sh, but the tests are added in
`build_tools/torchscript_e2e_heavydep_tests` instead of
`frontends/pytorch/e2e_testing/torchscript`. Any needed dependencies can
easily be configured in generate_serialized_tests.sh.

We add an initial machine translation model with a complex set of
dependencies to seed the curriculum there. I verified that this model
gets to the point of MLIR import (it fails there with a segfault due to
not being able to import the "Any" type).

This required moving a few files from the `torch_mlir` Python module
into multiple modules to isolate the code that depends on our C++
extensions (which now live in `torch_mlir` and
`torch_mlir_torchscript_e2e_test_configs`) from the pure Python code
(which now lives in `torch_mlir_torchscript`). This is an entirely
mechanical change, and lots of imports needed to be updated.

The dependency graph is:
```
       torch_mlir_torchscript_e2e_test_configs
                  /              |
                 /               |
                /                |
               V                 V
torch_mlir_torchscript       torch_mlir
```

The `torch_mlir_torchscript_e2e_test_configs` are then dependency-injected
into the `torch_mlir_torchscript` modules to successfully assemble a
working test harness (the code was already structured this way, but this
new file organization allows the isolation from C++ code to actually
happen).  This isolation is critical to allowing the serialized programs
to be transported across PyTorch versions and for the test harness to be
used seamlessly to generate the heavydep tests.
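
A hypothetical sketch of that dependency-injection seam (names simplified; not the actual framework code): the pure-Python harness only sees an abstract config object, so configs backed by C++ extensions can be supplied from outside without the harness importing them.

```python
import abc

class TestConfig(abc.ABC):
    """Abstract seam between the pure-Python harness and any backend."""
    @abc.abstractmethod
    def compile(self, program): ...
    @abc.abstractmethod
    def run(self, artifact, trace): ...

def run_tests(tests, config):
    # The config (possibly backed by C++ extensions) is injected here; the
    # harness itself never imports any native code.
    results = []
    for test in tests:
        artifact = config.compile(test.program_factory())
        results.append(config.run(artifact, test.golden_trace))
    return results
```
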

Also:
- Extend the `_Tracer` class to support nested property (submodule) accesses (see the sketch below).

Recommended review order:
- "user-level" docs in README.md
- code in `build_tools/torchscript_e2e_heavydep_tests`.
- changes in `torch_mlir_torchscript/e2e_test/framework.py`
- misc mechanical changes.
2021-08-03 14:09:56 -07:00
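
A minimal, hypothetical sketch of the nested-submodule tracing mentioned under "Also:" above (names and structure illustrative, not the actual `_Tracer`): attribute access returns another tracer rooted at the submodule, so a call like `tracer.encoder.layer(x)` is recorded with its full dotted path.

```python
class _Tracer:
    """Records (dotted_name, args, kwargs) for every call made through it."""

    def __init__(self, obj, prefix, calls):
        self._obj = obj
        self._prefix = prefix
        self._calls = calls

    def __getattr__(self, name):
        # Nested property (submodule) access: descend into the wrapped object
        # and keep extending the dotted path.
        child = getattr(self._obj, name)
        prefix = f"{self._prefix}.{name}" if self._prefix else name
        return _Tracer(child, prefix, self._calls)

    def __call__(self, *args, **kwargs):
        self._calls.append((self._prefix, args, kwargs))
        return self._obj(*args, **kwargs)

# Usage: calls = []; _Tracer(model, "", calls).encoder.layer(x) records
# ("encoder.layer", (x,), {}) while still invoking the real submodule.
```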