
Fixes hash_crossed with cudf 21.12 #1376

Merged: 6 commits, merged on Jan 31, 2022

Conversation

albert17 (Contributor)

No description provided.
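
For context, the fix targets the HashedCross op (nvtabular/ops/hashed_cross.py), which hashes combinations of categorical columns into a fixed number of buckets. A minimal usage sketch, with hypothetical column names and a small toy frame (with cudf 21.12 installed, a cudf.DataFrame can be passed the same way):

```python
import nvtabular as nvt
import pandas as pd

# Toy input; the column names are illustrative only.
df = pd.DataFrame({"user_id": [1, 2, 3, 1], "item_id": [10, 20, 10, 30]})

# Cross the two columns and hash the crossed values into 16 buckets.
cross = ["user_id", "item_id"] >> nvt.ops.HashedCross(num_buckets=16)

workflow = nvt.Workflow(cross)
out = workflow.fit_transform(nvt.Dataset(df)).to_ddf().compute()
print(out.head())
```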

@albert17 requested review from benfred, jperez999, rjzamora and karlhigley and removed the request for benfred on January 31, 2022 17:52
@nvidia-merlin-bot (Contributor)

CI Results:
GitHub pull request #1376 of commit 2985583d7a38fee7931db53e842b76583981079c, no merge conflicts.
Running as SYSTEM
Setting status of 2985583d7a38fee7931db53e842b76583981079c to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/4085/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1376/*:refs/remotes/origin/pr/1376/* # timeout=10
 > git rev-parse 2985583d7a38fee7931db53e842b76583981079c^{commit} # timeout=10
Checking out Revision 2985583d7a38fee7931db53e842b76583981079c (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2985583d7a38fee7931db53e842b76583981079c # timeout=10
Commit message: "Fixes hash_crossed with cudf 21.12"
 > git rev-list --no-walk 82d5ebdb321667b103ef698cce1f26fdd018e92d # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins2136643168614848678.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (22.0.2)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
  Downloading setuptools-60.6.0-py3-none-any.whl (953 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 953.8/953.8 KB 31.7 MB/s eta 0:00:00
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.9.0+4.gc5cb887
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+8.g2985583 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+8.g2985583 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+8.g2985583 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+8.g2985583 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.9.0+8.g2985583 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Running black --check
All done! ✨ 🍰 ✨
176 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)
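
The I1101 note above is informational and does not affect the 10.00/10 score. If it ever needed silencing, pylint's own suggestion (extension-pkg-allow-list) or an inline disable would do; a hypothetical sketch of the inline form at the flagged call site:

```python
import numpy as np

# Hypothetical illustration of the kind of access flagged at nvtabular/dispatch.py:607;
# the trailing comment suppresses pylint's c-extension-no-member (I1101) note.
rng = np.random.mtrand.RandomState(42)  # pylint: disable=c-extension-no-member
```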

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.8) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, anyio-3.5.0, cov-3.0.0
collected 1648 items / 3 skipped / 1645 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 6%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 18%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 19%]
tests/unit/test_tf4rec.py . [ 19%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 22%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 24%]
................................................... [ 27%]
tests/unit/framework_utils/test_torch_layers.py . [ 27%]
tests/unit/graph/test_base_operator.py .... [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py ..................... [ 34%]
tests/unit/graph/test_tags.py ...... [ 34%]
tests/unit/graph/ops/test_selection.py ... [ 34%]
tests/unit/inference/test_graph.py . [ 34%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 35%]
tests/unit/loader/test_dataloader_backend.py ...... [ 35%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 37%]
........................................s.. [ 40%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 42%]
........................................................ [ 45%]
tests/unit/ops/test_categorify.py ...................................... [ 47%]
........................................................................ [ 52%]
............................... [ 54%]
tests/unit/ops/test_column_similarity.py ........................ [ 55%]
tests/unit/ops/test_fill.py ............................................ [ 58%]
........ [ 58%]
tests/unit/ops/test_groupyby.py ....... [ 59%]
tests/unit/ops/test_hash_bucket.py ......................... [ 60%]
tests/unit/ops/test_join.py ............................................ [ 63%]
........................................................................ [ 67%]
.................................. [ 69%]
tests/unit/ops/test_lambda.py .......... [ 70%]
tests/unit/ops/test_normalize.py ....................................... [ 72%]
.. [ 72%]
tests/unit/ops/test_ops.py ............................................. [ 75%]
.................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 79%]
........................................................................ [ 83%]
........................................................................ [ 87%]
....................................... [ 90%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
......................................................... [ 97%]
tests/unit/workflow/test_workflow_chaining.py ... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py .......................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(

tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(

tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(

tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(

tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(

tests/unit/test_io.py::test_validate_and_regenerate_dataset
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/parquet.py:519: DeprecationWarning: 'ParquetDataset.pieces' attribute is deprecated as of pyarrow 5.0.0 and will be removed in a future version. Specify 'use_legacy_dataset=False' while constructing the ParquetDataset, and then use the '.fragments' attribute instead.
paths = [p.path for p in pa_dataset.pieces]
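
For reference, the replacement that this DeprecationWarning describes looks roughly like the following sketch (dataset path hypothetical):

```python
import pyarrow.parquet as pq

# Construct the dataset without the legacy implementation and read file paths
# from '.fragments' instead of the deprecated '.pieces'.
pa_dataset = pq.ParquetDataset("/path/to/parquet_dir", use_legacy_dataset=False)
paths = [frag.path for frag in pa_dataset.fragments]
```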

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/usr/local/lib/python3.8/dist-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
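
The caveat the warning links to boils down to avoiding chained indexing when assigning; a toy sketch of the two patterns:

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4.0, 5.0, 6.0]})

df[df["a"] > 1]["b"] = 0.0      # chained indexing: assigns to a copy and triggers the warning
df.loc[df["a"] > 1, "b"] = 0.0  # single .loc assignment: modifies df in place
```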

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6871: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 80 166 28 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 395-397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 111 2 38 4 95% 151->157, 170->176, 181->185, 250, 254
nvtabular/graph/graph.py 57 9 32 1 82% 39, 100-102, 106-111
nvtabular/graph/node.py 285 53 154 20 77% 49, 73-81, 140, 208->211, 228->231, 239-240, 287, 305, 320-325, 330, 332, 338, 352, 362-373, 378->381, 392-400, 409, 410->405, 424-425, 433, 434->417, 440-443, 447, 474, 481-486, 509
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 18 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 20 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 15 1 2 0 94% 60
nvtabular/graph/ops/subtraction.py 21 2 4 0 92% 54-55
nvtabular/graph/schema.py 128 8 63 6 93% 39, 63, 169, 174->exit, 178, 191, 198, 223, 226
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 126 11 58 10 88% 57-62, 67->74, 70->72, 81, 98->103, 101->103, 124->139, 130-133, 175->191, 183, 187
nvtabular/graph/selector.py 88 1 48 1 99% 158
nvtabular/graph/tags.py 59 0 22 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 57 42 26 0 20% 39-103, 107-118
nvtabular/inference/graph/graph.py 27 4 14 2 80% 42, 50-57
nvtabular/inference/graph/node.py 15 9 4 0 42% 22-23, 26-27, 31-36
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 32 6 12 1 80% 13-14, 19, 36, 40, 49
nvtabular/inference/graph/ops/tensorflow.py 50 18 16 2 64% 34-47, 79-83, 92-95
nvtabular/inference/graph/ops/workflow.py 30 1 4 1 94% 51
nvtabular/inference/triton/__init__.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 281 148 96 7 49% 157-193, 237-285, 302-306, 378-386, 415-431, 483-493, 542-582, 588-604, 608-675, 701-702, 734, 756, 762-781, 787-811, 818
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 181 8 72 11 92% 110, 113, 149, 389, 399, 416->419, 427, 431->433, 433->429, 438, 440
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 44 168 28 85% 48-49, 234, 271, 273, 286, 311-325, 449->522, 454-457, 462->472, 479->477, 480->484, 497->501, 512, 573-574, 575->579, 627, 755, 757, 759, 765, 769-771, 773, 833-834, 868, 875-876, 882, 888, 984-985, 1102-1107, 1113, 1125-1126
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 591 35 218 34 91% 35-36, 59, 81->161, 93, 106, 120->128, 131, 145->exit, 181, 210-211, 228->253, 239->253, 290-298, 318, 324, 342->344, 358, 376->386, 379, 384-385, 428->440, 432, 554-559, 597-602, 718->725, 786->791, 792-793, 913, 917, 921, 927, 959, 976, 980, 987->989, 1097->exit, 1101->1098, 1108->1113, 1118->1128, 1133, 1155, 1182
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 17 154 12 94% 27-28, 143, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 178 30 60 12 82% 38-39, 74, 82-86, 98, 112, 121, 326, 348, 352-355, 360, 365, 380-382, 392->396, 411-413, 423-431, 434-437
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 15 0 2 0 100%
nvtabular/ops/bucketize.py 40 9 20 3 73% 52-54, 58->exit, 61-64, 83-86
nvtabular/ops/categorify.py 658 70 350 48 86% 252, 254, 272, 276, 284, 292, 294, 321, 342-343, 390->394, 398-405, 451, 459, 482-483, 560-565, 636, 732, 749, 794, 872-873, 888-892, 893->857, 911, 919, 926->exit, 950, 953->956, 1005->1003, 1065, 1070, 1091->1095, 1097->1052, 1103-1106, 1118, 1122, 1126, 1133, 1138-1141, 1219, 1221, 1291->1314, 1297->1314, 1315-1320, 1365, 1378->1381, 1385->1390, 1389, 1395, 1398, 1406-1416
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 122 26 38 5 74% 19-20, 29-30, 82->exit, 112, 207-208, 217-219, 227-243, 260->263, 264, 274
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 42 0 14 1 98% 73->75
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 76 5 30 1 92% 63-67, 109
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 127 4 84 6 95% 72, 83, 93->95, 104->109, 131, 216
nvtabular/ops/hash_bucket.py 43 1 22 2 95% 73, 112->118
nvtabular/ops/hashed_cross.py 40 4 19 4 86% 52, 63, 68, 96
nvtabular/ops/join_external.py 96 8 34 7 88% 20-21, 114, 116, 118, 150->152, 205-206, 216->227, 221
nvtabular/ops/join_groupby.py 127 5 57 6 94% 113, 120, 129, 136->135, 178->175, 181->175, 259-260
nvtabular/ops/lambdaop.py 61 6 22 6 86% 59, 63, 81, 93, 98, 107
nvtabular/ops/list_slice.py 89 29 42 0 64% 21-22, 146-160, 168-190
nvtabular/ops/logop.py 21 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 93 4 22 1 94% 89, 139-140, 167
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 29 3 14 3 86% 45, 70-72
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 182 9 76 5 93% 169->173, 177->186, 274, 283-284, 297-303, 396->399
nvtabular/ops/value_counts.py 34 0 6 1 98% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 52 8 24 2 79% 33-40, 51
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 119 45 52 8 57% 34-35, 39-40, 53, 64-65, 67-69, 72, 75, 81, 87, 93-129, 148, 152->156, 166-174
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 215 17 94 12 91% 28-29, 51, 84, 163, 169->183, 196-198, 322, 337-338, 374, 451, 481, 489-491, 504

TOTAL 8704 1719 3594 394 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.88%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [1] tests/unit/inference/test_ensemble.py:32: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [1] tests/unit/inference/test_export.py:8: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [8] tests/unit/test_io.py:604: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1639 passed, 12 skipped, 264 warnings in 1495.76s (0:24:55) ==========
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins5573599536555672880.sh

@nvidia-merlin-bot (Contributor)

CI Results:
GitHub pull request #1376 of commit c3f37eb6a84831416138d703907cdecb79a7ba2b, no merge conflicts.
Running as SYSTEM
Setting status of c3f37eb6a84831416138d703907cdecb79a7ba2b to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/4088/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1376/*:refs/remotes/origin/pr/1376/* # timeout=10
 > git rev-parse c3f37eb6a84831416138d703907cdecb79a7ba2b^{commit} # timeout=10
Checking out Revision c3f37eb6a84831416138d703907cdecb79a7ba2b (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f c3f37eb6a84831416138d703907cdecb79a7ba2b # timeout=10
Commit message: "Update requirements.txt"
 > git rev-list --no-walk 478820e72c8b2b96c6487ccdf21693a3252b1168 # timeout=10
First time build. Skipping changelog.
[nvtabular_tests] $ /bin/bash /tmp/jenkins5805045923181946478.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (22.0.2)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
  Downloading setuptools-60.6.0-py3-none-any.whl (953 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 953.8/953.8 KB 27.6 MB/s eta 0:00:00
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.9.0+4.gc5cb887
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+11.gc3f37eb -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+11.gc3f37eb -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+11.gc3f37eb -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+11.gc3f37eb -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.9.0+11.gc3f37eb is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Running black --check
All done! ✨ 🍰 ✨
176 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.8) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, anyio-3.5.0, cov-3.0.0
collected 1648 items / 3 skipped / 1645 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 6%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 18%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 19%]
tests/unit/test_tf4rec.py . [ 19%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 22%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 24%]
................................................... [ 27%]
tests/unit/framework_utils/test_torch_layers.py . [ 27%]
tests/unit/graph/test_base_operator.py .... [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py ..................... [ 34%]
tests/unit/graph/test_tags.py ...... [ 34%]
tests/unit/graph/ops/test_selection.py ... [ 34%]
tests/unit/inference/test_graph.py . [ 34%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 35%]
tests/unit/loader/test_dataloader_backend.py ...... [ 35%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 37%]
........................................s.. [ 40%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 42%]
..............................Terminated
Build was aborted
Aborted by admin
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins2606802801671161214.sh

@nvidia-merlin-bot (Contributor)

CI Results:
GitHub pull request #1376 of commit bb2ed59e0542918d3ba05b5a7976d55abcbb60da, no merge conflicts.
Running as SYSTEM
Setting status of bb2ed59e0542918d3ba05b5a7976d55abcbb60da to PENDING with url http://10.20.13.93:8080/job/nvtabular_tests/4089/ and message: 'Pending'
Using context: Jenkins Unit Test Run
Building on master in workspace /var/jenkins_home/workspace/nvtabular_tests
using credential nvidia-merlin-bot
Cloning the remote Git repository
Cloning repository https://github.com/NVIDIA-Merlin/NVTabular.git
 > git init /var/jenkins_home/workspace/nvtabular_tests/nvtabular # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/NVIDIA-Merlin/NVTabular.git # timeout=10
Fetching upstream changes from https://github.com/NVIDIA-Merlin/NVTabular.git
using GIT_ASKPASS to set credentials This is the bot credentials for our CI/CD
 > git fetch --tags --force --progress -- https://github.com/NVIDIA-Merlin/NVTabular.git +refs/pull/1376/*:refs/remotes/origin/pr/1376/* # timeout=10
 > git rev-parse bb2ed59e0542918d3ba05b5a7976d55abcbb60da^{commit} # timeout=10
Checking out Revision bb2ed59e0542918d3ba05b5a7976d55abcbb60da (detached)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bb2ed59e0542918d3ba05b5a7976d55abcbb60da # timeout=10
Commit message: "Update hashed_cross.py"
 > git rev-list --no-walk c3f37eb6a84831416138d703907cdecb79a7ba2b # timeout=10
[nvtabular_tests] $ /bin/bash /tmp/jenkins6265022517600980363.sh
Installing NVTabular
Looking in indexes: https://pypi.org/simple, https://pypi.ngc.nvidia.com
Requirement already satisfied: pip in /var/jenkins_home/.local/lib/python3.8/site-packages (22.0.2)
Requirement already satisfied: setuptools in /var/jenkins_home/.local/lib/python3.8/site-packages (59.4.0)
Collecting setuptools
  Downloading setuptools-60.6.0-py3-none-any.whl (953 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 953.8/953.8 KB 18.1 MB/s eta 0:00:00
Requirement already satisfied: wheel in /var/jenkins_home/.local/lib/python3.8/site-packages (0.37.1)
Requirement already satisfied: pybind11 in /var/jenkins_home/.local/lib/python3.8/site-packages (2.9.0)
Requirement already satisfied: numpy==1.20.3 in /var/jenkins_home/.local/lib/python3.8/site-packages (1.20.3)
Found existing installation: nvtabular 0.9.0+4.gc5cb887
Can't uninstall 'nvtabular'. No files were found to uninstall.
running develop
running egg_info
creating nvtabular.egg-info
writing nvtabular.egg-info/PKG-INFO
writing dependency_links to nvtabular.egg-info/dependency_links.txt
writing requirements to nvtabular.egg-info/requires.txt
writing top-level names to nvtabular.egg-info/top_level.txt
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/easy_install.py:156: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
/var/jenkins_home/.local/lib/python3.8/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
warning: no files found matching '*.h' under directory 'cpp'
warning: no files found matching '*.cu' under directory 'cpp'
warning: no files found matching '*.cuh' under directory 'cpp'
adding license file 'LICENSE'
writing manifest file 'nvtabular.egg-info/SOURCES.txt'
running build_ext
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/usr/include/python3.8 -c flagcheck.cpp -o flagcheck.o -std=c++17
building 'nvtabular_cpp' extension
creating build
creating build/temp.linux-x86_64-3.8
creating build/temp.linux-x86_64-3.8/cpp
creating build/temp.linux-x86_64-3.8/cpp/nvtabular
creating build/temp.linux-x86_64-3.8/cpp/nvtabular/inference
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+13.gbb2ed59 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+13.gbb2ed59 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/__init__.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+13.gbb2ed59 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/categorify.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o -std=c++17 -fvisibility=hidden -g0
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -DVERSION_INFO=0.9.0+13.gbb2ed59 -I./cpp/ -I/var/jenkins_home/.local/lib/python3.8/site-packages/pybind11/include -I/usr/include/python3.8 -c cpp/nvtabular/inference/fill.cc -o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -std=c++17 -fvisibility=hidden -g0
creating build/lib.linux-x86_64-3.8
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.8/cpp/nvtabular/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/__init__.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/categorify.o build/temp.linux-x86_64-3.8/cpp/nvtabular/inference/fill.o -o build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.8/nvtabular_cpp.cpython-38-x86_64-linux-gnu.so -> 
Generating nvtabular/inference/triton/model_config_pb2.py from nvtabular/inference/triton/model_config.proto
Creating /var/jenkins_home/.local/lib/python3.8/site-packages/nvtabular.egg-link (link to .)
nvtabular 0.9.0+13.gbb2ed59 is already the active version in easy-install.pth

Installed /var/jenkins_home/workspace/nvtabular_tests/nvtabular
Running black --check
All done! ✨ 🍰 ✨
176 files would be left unchanged.
Running flake8
Running isort
Skipped 2 files
Running bandit
Running pylint
************* Module nvtabular.dispatch
nvtabular/dispatch.py:607:11: I1101: Module 'numpy.random.mtrand' has no 'RandomState' member, but source is unavailable. Consider adding this module to extension-pkg-allow-list if you want to perform analysis based on run-time introspection of living objects. (c-extension-no-member)


Your code has been rated at 10.00/10 (previous run: 10.00/10, +0.00)

Running flake8-nb
Building docs
make: Entering directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
/usr/lib/python3/dist-packages/requests/__init__.py:89: RequestsDependencyWarning: urllib3 (1.26.8) or chardet (3.0.4) doesn't match a supported version!
warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
/usr/local/lib/python3.8/dist-packages/recommonmark/parser.py:75: UserWarning: Container node skipped: type=document
warn("Container node skipped: type={0}".format(mdnode.t))
make: Leaving directory '/var/jenkins_home/workspace/nvtabular_tests/nvtabular/docs'
============================= test session starts ==============================
platform linux -- Python 3.8.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /var/jenkins_home/workspace/nvtabular_tests/nvtabular, configfile: pyproject.toml
plugins: xdist-2.5.0, forked-1.4.0, anyio-3.5.0, cov-3.0.0
collected 1648 items / 3 skipped / 1645 selected

tests/unit/test_dask_nvt.py ............................................ [ 2%]
....................................................................... [ 6%]
tests/unit/test_io.py .................................................. [ 10%]
........................................................................ [ 14%]
..................ssssssss.............................................. [ 18%]
......... [ 19%]
tests/unit/test_notebooks.py ...... [ 19%]
tests/unit/test_tf4rec.py . [ 19%]
tests/unit/test_tools.py ...................... [ 21%]
tests/unit/test_triton_inference.py ................................ [ 22%]
tests/unit/framework_utils/test_tf_feature_columns.py . [ 23%]
tests/unit/framework_utils/test_tf_layers.py ........................... [ 24%]
................................................... [ 27%]
tests/unit/framework_utils/test_torch_layers.py . [ 27%]
tests/unit/graph/test_base_operator.py .... [ 28%]
tests/unit/graph/test_column_schemas.py ................................ [ 30%]
.................................................. [ 33%]
tests/unit/graph/test_column_selector.py ..................... [ 34%]
tests/unit/graph/test_tags.py ...... [ 34%]
tests/unit/graph/ops/test_selection.py ... [ 34%]
tests/unit/inference/test_graph.py . [ 34%]
tests/unit/inference/test_inference_ops.py .. [ 35%]
tests/unit/inference/test_op_runner.py ... [ 35%]
tests/unit/inference/test_tensorflow_inf_op.py ... [ 35%]
tests/unit/loader/test_dataloader_backend.py ...... [ 35%]
tests/unit/loader/test_tf_dataloader.py ................................ [ 37%]
........................................s.. [ 40%]
tests/unit/loader/test_torch_dataloader.py ............................. [ 42%]
........................................................ [ 45%]
tests/unit/ops/test_categorify.py ...................................... [ 47%]
........................................................................ [ 52%]
............................... [ 54%]
tests/unit/ops/test_column_similarity.py ........................ [ 55%]
tests/unit/ops/test_fill.py ............................................ [ 58%]
........ [ 58%]
tests/unit/ops/test_groupyby.py ....... [ 59%]
tests/unit/ops/test_hash_bucket.py ......................... [ 60%]
tests/unit/ops/test_join.py ............................................ [ 63%]
........................................................................ [ 67%]
.................................. [ 69%]
tests/unit/ops/test_lambda.py .......... [ 70%]
tests/unit/ops/test_normalize.py ....................................... [ 72%]
.. [ 72%]
tests/unit/ops/test_ops.py ............................................. [ 75%]
.................... [ 76%]
tests/unit/ops/test_ops_schema.py ...................................... [ 79%]
........................................................................ [ 83%]
........................................................................ [ 87%]
....................................... [ 90%]
tests/unit/ops/test_target_encode.py ..................... [ 91%]
tests/unit/workflow/test_cpu_workflow.py ...... [ 91%]
tests/unit/workflow/test_workflow.py ................................... [ 93%]
......................................................... [ 97%]
tests/unit/workflow/test_workflow_chaining.py ... [ 97%]
tests/unit/workflow/test_workflow_node.py ........... [ 98%]
tests/unit/workflow/test_workflow_ops.py ... [ 98%]
tests/unit/workflow/test_workflow_schemas.py .......................... [100%]

=============================== warnings summary ===============================
tests/unit/test_dask_nvt.py: 12 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 8 files.
warnings.warn(
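
This warning (repeated below for other file counts) means the writer was asked for more output files than the dataset has partitions, so it caps the count and warns. A minimal sketch of how the situation arises, assuming the `output_files` argument exercised by `test_to_parquet_output_files` below and placeholder paths/data:

```python
import dask.dataframe as dd
import pandas as pd
import nvtabular as nvt

# Toy data with only 2 partitions (placeholder values).
ddf = dd.from_pandas(pd.DataFrame({"a": list(range(10))}), npartitions=2)
dataset = nvt.Dataset(ddf)

# Asking for more output files than partitions triggers the
# "Only created 2 files ..." UserWarning; the writer caps the count at 2.
# `output_files` mirrors the argument name seen in test_to_parquet_output_files (assumption).
dataset.to_parquet("/tmp/nvt_out", output_files=8)
```

Repartitioning the dataset first (or requesting fewer files) keeps the requested and actual file counts in line.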

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 6 files did not have enough
partitions to create 7 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 9 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 10 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 8 files did not have enough
partitions to create 11 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 13 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 14 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 15 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 16 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 17 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 18 files.
warnings.warn(

tests/unit/test_io.py::test_io_partitions_push
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 12 files did not have enough
partitions to create 19 files.
warnings.warn(

tests/unit/test_io.py: 96 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/__init__.py:38: DeprecationWarning: ColumnGroup is deprecated, use ColumnSelector instead
warnings.warn("ColumnGroup is deprecated, use ColumnSelector instead", DeprecationWarning)

tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_workflow.py: 48 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 20 files.
warnings.warn(

tests/unit/test_io.py: 12 warnings
tests/unit/workflow/test_cpu_workflow.py: 6 warnings
tests/unit/workflow/test_workflow.py: 12 warnings
tests/unit/workflow/test_workflow_schemas.py: 1 warning
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 10 files.
warnings.warn(

tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-Shuffle.PER_WORKER-5-2-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-0-csv]
tests/unit/test_io.py::test_multifile_parquet[False-None-5-2-csv]
tests/unit/loader/test_torch_dataloader.py::test_horovod_multigpu
tests/unit/loader/test_torch_dataloader.py::test_distributed_multigpu
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 5 files.
warnings.warn(

tests/unit/test_io.py::test_to_parquet_output_files[Shuffle.PER_WORKER-4-6]
tests/unit/test_io.py::test_to_parquet_output_files[False-4-6]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 6 files.
warnings.warn(

tests/unit/test_io.py: 6 warnings
tests/unit/loader/test_tf_dataloader.py: 2 warnings
tests/unit/loader/test_torch_dataloader.py: 12 warnings
tests/unit/workflow/test_workflow.py: 9 warnings
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 1 files did not have enough
partitions to create 2 files.
warnings.warn(

tests/unit/test_io.py::test_validate_and_regenerate_dataset
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/parquet.py:519: DeprecationWarning: 'ParquetDataset.pieces' attribute is deprecated as of pyarrow 5.0.0 and will be removed in a future version. Specify 'use_legacy_dataset=False' while constructing the ParquetDataset, and then use the '.fragments' attribute instead.
paths = [p.path for p in pa_dataset.pieces]
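
The pyarrow message spells out the migration: construct the dataset with `use_legacy_dataset=False` and iterate `.fragments` instead of `.pieces`. A minimal sketch with a placeholder path:

```python
import pyarrow.parquet as pq

# New-style dataset API: .fragments replaces the deprecated .pieces,
# and each fragment still exposes a .path ("/tmp/nvt_out" is a placeholder).
pa_dataset = pq.ParquetDataset("/tmp/nvt_out", use_legacy_dataset=False)
paths = [frag.path for frag in pa_dataset.fragments]
```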

tests/unit/ops/test_fill.py::test_fill_missing[True-True-parquet]
tests/unit/ops/test_fill.py::test_fill_missing[True-False-parquet]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
/usr/local/lib/python3.8/dist-packages/pandas/core/indexing.py:1732: SettingWithCopyWarning:
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
self._setitem_single_block(indexer, value, name)
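
This is the usual pandas chained-indexing pattern; assigning through a single `.loc` indexer avoids the copy-of-a-slice ambiguity. A small illustration with made-up data:

```python
import pandas as pd

df = pd.DataFrame({"x": [1, 2, 3], "label": [0, 1, 0]})

# Chained indexing such as df[df.label == 0]["x"] = -1 assigns into a
# temporary copy and raises SettingWithCopyWarning; one .loc call does not.
df.loc[df.label == 0, "x"] = -1
```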

tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-True]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
tests/unit/ops/test_ops.py::test_filter[parquet-0.1-False]
/usr/local/lib/python3.8/dist-packages/dask/dataframe/core.py:6871: UserWarning: Insufficient elements for head. 1 elements requested, only 0 elements available. Try passing larger npartitions to head.
warnings.warn(msg.format(n, len(r)))
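
This dask warning fires when `.head()` finds fewer rows than requested in the first partition, for example after a very selective filter; passing `npartitions=-1` lets dask keep scanning. A sketch with toy data:

```python
import dask.dataframe as dd
import pandas as pd

ddf = dd.from_pandas(pd.DataFrame({"x": range(100)}), npartitions=4)
filtered = ddf[ddf.x > 95]  # the earlier partitions end up empty

# head() only looks at the first partition by default and warns when it is
# short; npartitions=-1 tells dask to keep scanning until enough rows are found.
print(filtered.head(1, npartitions=-1))
```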

tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_parquet_output[True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[True-True-None]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_WORKER]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-Shuffle.PER_PARTITION]
tests/unit/workflow/test_workflow.py::test_workflow_apply[False-True-None]
/var/jenkins_home/workspace/nvtabular_tests/nvtabular/nvtabular/io/dataset.py:861: UserWarning: Only created 2 files did not have enough
partitions to create 4 files.
warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/warnings.html

---------- coverage: platform linux, python 3.8.10-final-0 -----------
Name Stmts Miss Branch BrPart Cover Missing

examples/multi-gpu-movielens/torch_trainer.py 65 0 6 1 99% 32->36
examples/multi-gpu-movielens/torch_trainer_dist.py 63 0 2 0 100%
nvtabular/__init__.py 18 0 0 0 100%
nvtabular/dispatch.py 341 80 166 28 75% 37-39, 42-46, 51-53, 59-69, 76-77, 118-120, 128-130, 135-138, 142-147, 154, 173, 184, 190, 195->197, 208, 231-234, 265->267, 278-280, 286, 311, 318, 349->354, 352, 355, 358->362, 395-397, 408-411, 416, 438, 445-448, 478, 482, 523, 547, 549, 556, 571-585, 600, 607
nvtabular/framework_utils/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/tensorflow/__init__.py 1 0 0 0 100%
nvtabular/framework_utils/tensorflow/feature_column_utils.py 134 78 90 15 39% 30, 99, 103, 114-130, 140, 143-158, 162, 166-167, 173-198, 207-217, 220-227, 229->233, 234, 239-279, 282
nvtabular/framework_utils/tensorflow/layers/__init__.py 4 0 0 0 100%
nvtabular/framework_utils/tensorflow/layers/embedding.py 153 12 89 6 91% 60, 68->49, 122, 179, 231-239, 335->343, 357->360, 363-364, 367
nvtabular/framework_utils/tensorflow/layers/interaction.py 47 25 22 1 45% 49, 74-103, 106-110, 113
nvtabular/framework_utils/tensorflow/layers/outer_product.py 30 24 12 0 19% 37-38, 41-60, 71-84, 87
nvtabular/framework_utils/tensorflow/tfrecords_to_parquet.py 58 58 30 0 0% 16-111
nvtabular/framework_utils/torch/__init__.py 0 0 0 0 100%
nvtabular/framework_utils/torch/layers/__init__.py 2 0 0 0 100%
nvtabular/framework_utils/torch/layers/embeddings.py 32 2 18 2 92% 50, 91
nvtabular/framework_utils/torch/models.py 45 1 30 4 93% 57->61, 87->89, 93->96, 103
nvtabular/framework_utils/torch/utils.py 75 5 34 5 91% 51->53, 64, 71->76, 75, 118-120
nvtabular/graph/__init__.py 4 0 0 0 100%
nvtabular/graph/base_operator.py 111 2 38 4 95% 151->157, 170->176, 181->185, 250, 254
nvtabular/graph/graph.py 57 9 32 1 82% 39, 100-102, 106-111
nvtabular/graph/node.py 285 53 154 20 77% 49, 73-81, 140, 208->211, 228->231, 239-240, 287, 305, 320-325, 330, 332, 338, 352, 362-373, 378->381, 392-400, 409, 410->405, 424-425, 433, 434->417, 440-443, 447, 474, 481-486, 509
nvtabular/graph/ops/__init__.py 5 0 0 0 100%
nvtabular/graph/ops/concat_columns.py 18 0 2 0 100%
nvtabular/graph/ops/identity.py 6 1 2 0 88% 41
nvtabular/graph/ops/selection.py 20 0 2 0 100%
nvtabular/graph/ops/subset_columns.py 15 1 2 0 94% 60
nvtabular/graph/ops/subtraction.py 21 2 4 0 92% 54-55
nvtabular/graph/schema.py 128 8 63 6 93% 39, 63, 169, 174->exit, 178, 191, 198, 223, 226
nvtabular/graph/schema_io/__init__.py 0 0 0 0 100%
nvtabular/graph/schema_io/schema_writer_base.py 8 0 2 0 100%
nvtabular/graph/schema_io/schema_writer_pbtxt.py 126 11 58 10 88% 57-62, 67->74, 70->72, 81, 98->103, 101->103, 124->139, 130-133, 175->191, 183, 187
nvtabular/graph/selector.py 88 1 48 1 99% 158
nvtabular/graph/tags.py 59 0 22 0 100%
nvtabular/inference/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/__init__.py 3 0 0 0 100%
nvtabular/inference/graph/ensemble.py 57 42 26 0 20% 39-103, 107-118
nvtabular/inference/graph/graph.py 27 4 14 2 80% 42, 50-57
nvtabular/inference/graph/node.py 15 9 4 0 42% 22-23, 26-27, 31-36
nvtabular/inference/graph/op_runner.py 21 0 8 0 100%
nvtabular/inference/graph/ops/__init__.py 0 0 0 0 100%
nvtabular/inference/graph/ops/operator.py 32 6 12 1 80% 13-14, 19, 36, 40, 49
nvtabular/inference/graph/ops/tensorflow.py 50 18 16 2 64% 34-47, 79-83, 92-95
nvtabular/inference/graph/ops/workflow.py 30 1 4 1 94% 51
nvtabular/inference/triton/init.py 36 12 14 1 58% 42-49, 68, 72, 76-82
nvtabular/inference/triton/benchmarking_tools.py 52 52 10 0 0% 2-103
nvtabular/inference/triton/data_conversions.py 87 3 58 4 95% 32-33, 84
nvtabular/inference/triton/ensemble.py 281 148 96 7 49% 157-193, 237-285, 302-306, 378-386, 415-431, 483-493, 542-582, 588-604, 608-675, 701-702, 734, 756, 762-781, 787-811, 818
nvtabular/inference/triton/model/__init__.py 0 0 0 0 100%
nvtabular/inference/triton/model/model_pt.py 101 101 42 0 0% 27-220
nvtabular/inference/triton/model_config_pb2.py 299 0 2 0 100%
nvtabular/inference/triton/workflow_model.py 52 52 22 0 0% 27-124
nvtabular/inference/workflow/__init__.py 0 0 0 0 100%
nvtabular/inference/workflow/base.py 114 114 62 0 0% 27-210
nvtabular/inference/workflow/hugectr.py 37 37 16 0 0% 27-87
nvtabular/inference/workflow/pytorch.py 10 10 6 0 0% 27-46
nvtabular/inference/workflow/tensorflow.py 32 32 10 0 0% 26-68
nvtabular/io/__init__.py 5 0 0 0 100%
nvtabular/io/avro.py 88 88 32 0 0% 16-189
nvtabular/io/csv.py 57 6 22 5 86% 22-23, 99, 103->107, 108, 110, 124
nvtabular/io/dask.py 181 8 72 11 92% 110, 113, 149, 389, 399, 416->419, 427, 431->433, 433->429, 438, 440
nvtabular/io/dataframe_engine.py 61 5 30 6 88% 19-20, 50, 69, 88->92, 92->97, 94->97, 97->116, 125
nvtabular/io/dataframe_iter.py 21 1 14 1 94% 42
nvtabular/io/dataset.py 346 44 168 28 85% 48-49, 234, 271, 273, 286, 311-325, 449->522, 454-457, 462->472, 479->477, 480->484, 497->501, 512, 573-574, 575->579, 627, 755, 757, 759, 765, 769-771, 773, 833-834, 868, 875-876, 882, 888, 984-985, 1102-1107, 1113, 1125-1126
nvtabular/io/dataset_engine.py 31 1 6 0 97% 48
nvtabular/io/fsspec_utils.py 115 101 64 0 8% 26-27, 42-98, 103-114, 151-198, 220-270, 275-291, 295-297, 311-322
nvtabular/io/hugectr.py 45 2 26 2 92% 34, 74->97, 101
nvtabular/io/parquet.py 591 35 218 34 91% 35-36, 59, 81->161, 93, 106, 120->128, 131, 145->exit, 181, 210-211, 228->253, 239->253, 290-298, 318, 324, 342->344, 358, 376->386, 379, 384-385, 428->440, 432, 554-559, 597-602, 718->725, 786->791, 792-793, 913, 917, 921, 927, 959, 976, 980, 987->989, 1097->exit, 1101->1098, 1108->1113, 1118->1128, 1133, 1155, 1182
nvtabular/io/shuffle.py 31 7 18 4 73% 42, 44-45, 49, 62-64
nvtabular/io/writer.py 184 13 78 5 92% 24-25, 51, 79, 125, 128, 212, 221, 224, 267, 299-301
nvtabular/io/writer_factory.py 18 2 8 2 85% 35, 60
nvtabular/loader/__init__.py 0 0 0 0 100%
nvtabular/loader/backend.py 372 17 154 12 94% 27-28, 143, 159-160, 300->302, 312-316, 363-364, 403->407, 404->403, 479, 483-484, 513, 589-590, 625, 633
nvtabular/loader/tensorflow.py 178 30 60 12 82% 38-39, 74, 82-86, 98, 112, 121, 326, 348, 352-355, 360, 365, 380-382, 392->396, 411-413, 423-431, 434-437
nvtabular/loader/tf_utils.py 57 10 22 6 80% 32->35, 35->37, 42->44, 46, 47->68, 53-54, 62-64, 70-74
nvtabular/loader/torch.py 87 14 26 3 80% 28-30, 33-39, 114, 158-159, 164
nvtabular/ops/__init__.py 23 0 0 0 100%
nvtabular/ops/add_metadata.py 15 0 2 0 100%
nvtabular/ops/bucketize.py 40 9 20 3 73% 52-54, 58->exit, 61-64, 83-86
nvtabular/ops/categorify.py 658 70 350 48 86% 252, 254, 272, 276, 284, 292, 294, 321, 342-343, 390->394, 398-405, 451, 459, 482-483, 560-565, 636, 732, 749, 794, 872-873, 888-892, 893->857, 911, 919, 926->exit, 950, 953->956, 1005->1003, 1065, 1070, 1091->1095, 1097->1052, 1103-1106, 1118, 1122, 1126, 1133, 1138-1141, 1219, 1221, 1291->1314, 1297->1314, 1315-1320, 1365, 1378->1381, 1385->1390, 1389, 1395, 1398, 1406-1416
nvtabular/ops/clip.py 18 2 8 3 81% 44, 52->54, 55
nvtabular/ops/column_similarity.py 122 26 38 5 74% 19-20, 29-30, 82->exit, 112, 207-208, 217-219, 227-243, 260->263, 264, 274
nvtabular/ops/data_stats.py 56 1 24 3 95% 91->93, 95, 97->87
nvtabular/ops/difference_lag.py 42 0 14 1 98% 73->75
nvtabular/ops/dropna.py 8 0 2 0 100%
nvtabular/ops/fill.py 76 5 30 1 92% 63-67, 109
nvtabular/ops/filter.py 20 1 8 1 93% 49
nvtabular/ops/groupby.py 127 4 84 6 95% 72, 83, 93->95, 104->109, 131, 216
nvtabular/ops/hash_bucket.py 43 1 22 2 95% 73, 112->118
nvtabular/ops/hashed_cross.py 38 3 17 3 89% 52, 64, 92
nvtabular/ops/join_external.py 96 8 34 7 88% 20-21, 114, 116, 118, 150->152, 205-206, 216->227, 221
nvtabular/ops/join_groupby.py 127 5 57 6 94% 113, 120, 129, 136->135, 178->175, 181->175, 259-260
nvtabular/ops/lambdaop.py 61 6 22 6 86% 59, 63, 81, 93, 98, 107
nvtabular/ops/list_slice.py 89 29 42 0 64% 21-22, 146-160, 168-190
nvtabular/ops/logop.py 21 0 6 0 100%
nvtabular/ops/moments.py 69 0 24 0 100%
nvtabular/ops/normalize.py 93 4 22 1 94% 89, 139-140, 167
nvtabular/ops/operator.py 12 1 2 0 93% 53
nvtabular/ops/rename.py 29 3 14 3 86% 45, 70-72
nvtabular/ops/stat_operator.py 8 0 2 0 100%
nvtabular/ops/target_encoding.py 182 9 76 5 93% 169->173, 177->186, 274, 283-284, 297-303, 396->399
nvtabular/ops/value_counts.py 34 0 6 1 98% 40->38
nvtabular/tools/__init__.py 0 0 0 0 100%
nvtabular/tools/data_gen.py 251 12 86 6 95% 25-26, 124-127, 137-139, 161-162, 313, 323, 349
nvtabular/tools/dataset_inspector.py 52 8 24 2 79% 33-40, 51
nvtabular/tools/inspector_script.py 46 46 0 0 0% 17-168
nvtabular/utils.py 119 45 52 8 57% 34-35, 39-40, 53, 64-65, 67-69, 72, 75, 81, 87, 93-129, 148, 152->156, 166-174
nvtabular/worker.py 80 5 38 7 90% 24-25, 81->97, 89, 90->97, 97->100, 106, 108, 109->111
nvtabular/workflow/__init__.py 2 0 0 0 100%
nvtabular/workflow/node.py 7 0 4 0 100%
nvtabular/workflow/workflow.py 215 17 94 12 91% 28-29, 51, 84, 163, 169->183, 196-198, 322, 337-338, 374, 451, 481, 489-491, 504

TOTAL 8702 1718 3592 393 78%
Coverage XML written to file coverage.xml

Required test coverage of 70% reached. Total coverage: 77.89%
=========================== short test summary info ============================
SKIPPED [1] ../../../../../usr/local/lib/python3.8/dist-packages/dask_cudf/io/tests/test_s3.py:16: could not import 's3fs': No module named 's3fs'
SKIPPED [1] tests/unit/inference/test_ensemble.py:32: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [1] tests/unit/inference/test_export.py:8: could not import 'nvtabular.loader.tf_utils.configure_tensorflow': No module named 'nvtabular.loader.tf_utils.configure_tensorflow'; 'nvtabular.loader.tf_utils' is not a package
SKIPPED [8] tests/unit/test_io.py:604: could not import 'uavro': No module named 'uavro'
SKIPPED [1] tests/unit/loader/test_tf_dataloader.py:531: not working correctly in ci environment
========= 1639 passed, 12 skipped, 264 warnings in 1568.46s (0:26:08) ==========
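
For context on `nvtabular/ops/hashed_cross.py` (listed in the coverage report above), here is a minimal sketch of crossing two categorical columns into hashed buckets. It assumes the list-of-lists column grouping used in the unit tests; the column names and bucket count are placeholders, not taken from this change:

```python
import nvtabular as nvt
from nvtabular import ops

# Cross two (placeholder) categorical columns into 100 hashed buckets.
# The list-of-lists selector format follows the crossing tests (assumption).
cross = [["user_id", "item_id"]] >> ops.HashedCross(num_buckets=100)
workflow = nvt.Workflow(cross)
```
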
Performing Post build task...
Match found for : : True
Logical operation result is TRUE
Running script : #!/bin/bash
cd /var/jenkins_home/
CUDA_VISIBLE_DEVICES=1 python test_res_push.py "https://api.GitHub.com/repos/NVIDIA-Merlin/NVTabular/issues/$ghprbPullId/comments" "/var/jenkins_home/jobs/$JOB_NAME/builds/$BUILD_NUMBER/log"
[nvtabular_tests] $ /bin/bash /tmp/jenkins1890935282556961020.sh

@albert17 albert17 merged commit 632a1ff into NVIDIA-Merlin:main Jan 31, 2022
mikemckiernan pushed a commit that referenced this pull request Nov 24, 2022