The following issues were found:

test/typing/reveal/module_list.py
8 issues
Unable to import 'torch'
Error

Line: 1 Column: 1

              import torch

# ModuleList with elements of type Module
class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass


            

Reported by Pylint.

Undefined variable 'reveal_type'
Error

Line: 12 Column: 1

              
ml: torch.nn.ModuleList = torch.nn.ModuleList([FooModule(), BarModule()])
ml[0].children() == []  # noqa: B015
reveal_type(ml)  # E: {ModuleList}

            

Reported by Pylint.
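`reveal_type` is interpreted by mypy during type checking and has no runtime definition, which is why Pylint reports it as undefined. Since Python 3.11 it can be imported from `typing`; a hedged sketch of a version-safe guard (the fallback shim below is our own, not part of the test file):

```python
import sys

if sys.version_info >= (3, 11):
    from typing import reveal_type  # real runtime helper since 3.11
else:
    def reveal_type(obj):
        # Minimal stand-in mirroring typing.reveal_type's runtime behavior:
        # print the type and return the object unchanged.
        print(f"Runtime type is {type(obj).__name__!r}")
        return obj
```

In mypy's own test suites the bare call is usually left as-is, since those files are only type-checked, never executed.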

Expression "ml[0].children() == []" is assigned to nothing
Error

Line: 11 Column: 1

                  pass

ml: torch.nn.ModuleList = torch.nn.ModuleList([FooModule(), BarModule()])
ml[0].children() == []  # noqa: B015
reveal_type(ml)  # E: {ModuleList}

            

Reported by Pylint.
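Pylint flags the bare comparison because its result is discarded (the `# noqa: B015` suggests this is deliberate in the original test). Where the check is meant to have an effect, the usual fix is to bind or assert the result; a sketch using a stand-in `children()`:

```python
def children():
    # Stand-in for Module.children(); the real method yields child modules.
    return iter([])

# Before (flagged): children() == []   # result silently dropped
# After: make the intent observable.
has_no_children = list(children()) == []
assert has_no_children
```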

Missing module docstring
Error

Line: 1 Column: 1

              import torch

# ModuleList with elements of type Module
class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass


            

Reported by Pylint.

Too few public methods (0/2)
Error

Line: 4 Column: 1

              import torch

# ModuleList with elements of type Module
class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass


            

Reported by Pylint.

Missing class docstring
Error

Line: 4 Column: 1

              import torch

# ModuleList with elements of type Module
class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass


            

Reported by Pylint.

Missing class docstring
Error

Line: 7 Column: 1

              class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass

ml: torch.nn.ModuleList = torch.nn.ModuleList([FooModule(), BarModule()])
ml[0].children() == []  # noqa: B015
reveal_type(ml)  # E: {ModuleList}

            

Reported by Pylint.

Too few public methods (0/2)
Error

Line: 7 Column: 1

              class FooModule(torch.nn.Module):
    pass

class BarModule(torch.nn.Module):
    pass

ml: torch.nn.ModuleList = torch.nn.ModuleList([FooModule(), BarModule()])
ml[0].children() == []  # noqa: B015
reveal_type(ml)  # E: {ModuleList}

            

Reported by Pylint.
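The remaining findings in this file (missing docstrings, too-few-public-methods) are typical for typing fixtures. A hedged sketch of the conventional fix, using a plain stand-in class rather than the real `torch.nn.Module`:

```python
"""Typing fixture: checks the inferred element type of a ModuleList."""


class FooModule:  # pylint: disable=too-few-public-methods
    """Empty module used only to exercise type inference."""
```

The targeted `disable` comment is the common choice here, since a fixture class has no public methods by design.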

torch/csrc/jit/codegen/cuda/executor.cpp
8 issues
getenv - Environment variables are untrusted input if they can be set by an attacker. They can have any content and length, and the same variable can be set more than once.
Security

Line: 39 Column: 32 CWE codes: 807 20
Suggestion: Check environment variables carefully before using them

                code += std::string("namespace ") + FusionExecutor::kernelNamespace() +
      " {\n" + executor_utils::kernelPreamble() + kernel + "}\n";

  const char* debug_env = std::getenv("PYTORCH_CUDA_FUSER_DEBUG");
  if (debug_env && atoi(debug_env)) {
    std::cout << "\n==== codegen output for kernel: " << kernelName()
              << " ====" << std::endl
              << code << std::endl
              << "======================================\n"

            

Reported by FlawFinder.

getenv - Environment variables are untrusted input if they can be set by an attacker. They can have any content and length, and the same variable can be set more than once.
Security

Line: 61 Column: 32 CWE codes: 807 20
Suggestion: Check environment variables carefully before using them

                FusionGuard fg(&fusion_);
  options_ = options;

  const char* debug_env = std::getenv("PYTORCH_CUDA_FUSER_DEBUG");
  if (debug_env && atoi(debug_env)) {
    std::cout << "\n==== codegen output for kernel: " << kernelName()
              << " ====" << std::endl
              << code << std::endl
              << "======================================\n"

            

Reported by FlawFinder.

getenv - Environment variables are untrusted input if they can be set by an attacker. They can have any content and length, and the same variable can be set more than once.
Security

Line: 76 Column: 35 CWE codes: 807 20
Suggestion: Check environment variables carefully before using them

                lowered_ = GpuLower(&fusion_);
  const auto kernel = lowered_.kernel();

  const char* dump_kir_env = std::getenv("PYTORCH_CUDA_FUSER_DUMP_KIR");
  if (dump_kir_env && atoi(dump_kir_env)) {
    kernel->print();
  }

  const auto& kernel_summary = kernel->summary();

            

Reported by FlawFinder.

getenv - Environment variables are untrusted input if they can be set by an attacker. They can have any content and length, and the same variable can be set more than once.
Security

Line: 129 Column: 35 CWE codes: 807 20
Suggestion: Check environment variables carefully before using them

                lowered_ = GpuLower(&fusion_);
  const auto kernel = lowered_.kernel();

  const char* dump_kir_env = std::getenv("PYTORCH_CUDA_FUSER_DUMP_KIR");
  if (dump_kir_env && atoi(dump_kir_env)) {
    kernel->print();
  }

  const auto kernel_code = codegen::generateCudaKernel(kernel, kernelName());

            

Reported by FlawFinder.

atoi - Unless checked, the resulting number can exceed the expected range
Security

Line: 40 Column: 20 CWE codes: 190
Suggestion: If the source is untrusted, check both minimum and maximum, even if the input had no minus sign (large numbers can roll over into a negative number; consider saving to an unsigned value if that is intended)

                    " {\n" + executor_utils::kernelPreamble() + kernel + "}\n";

  const char* debug_env = std::getenv("PYTORCH_CUDA_FUSER_DEBUG");
  if (debug_env && atoi(debug_env)) {
    std::cout << "\n==== codegen output for kernel: " << kernelName()
              << " ====" << std::endl
              << code << std::endl
              << "======================================\n"
              << std::endl;

            

Reported by FlawFinder.

atoi - Unless checked, the resulting number can exceed the expected range
Security

Line: 62 Column: 20 CWE codes: 190
Suggestion: If the source is untrusted, check both minimum and maximum, even if the input had no minus sign (large numbers can roll over into a negative number; consider saving to an unsigned value if that is intended)

                options_ = options;

  const char* debug_env = std::getenv("PYTORCH_CUDA_FUSER_DEBUG");
  if (debug_env && atoi(debug_env)) {
    std::cout << "\n==== codegen output for kernel: " << kernelName()
              << " ====" << std::endl
              << code << std::endl
              << "======================================\n"
              << std::endl;

            

Reported by FlawFinder.

atoi - Unless checked, the resulting number can exceed the expected range
Security

Line: 77 Column: 23 CWE codes: 190
Suggestion: If the source is untrusted, check both minimum and maximum, even if the input had no minus sign (large numbers can roll over into a negative number; consider saving to an unsigned value if that is intended)

                const auto kernel = lowered_.kernel();

  const char* dump_kir_env = std::getenv("PYTORCH_CUDA_FUSER_DUMP_KIR");
  if (dump_kir_env && atoi(dump_kir_env)) {
    kernel->print();
  }

  const auto& kernel_summary = kernel->summary();
  has_block_reductions = kernel_summary.has_block_reductions;

            

Reported by FlawFinder.

atoi - Unless checked, the resulting number can exceed the expected range
Security

Line: 130 Column: 23 CWE codes: 190
Suggestion: If the source is untrusted, check both minimum and maximum, even if the input had no minus sign (large numbers can roll over into a negative number; consider saving to an unsigned value if that is intended)

                const auto kernel = lowered_.kernel();

  const char* dump_kir_env = std::getenv("PYTORCH_CUDA_FUSER_DUMP_KIR");
  if (dump_kir_env && atoi(dump_kir_env)) {
    kernel->print();
  }

  const auto kernel_code = codegen::generateCudaKernel(kernel, kernelName());
  const auto structured_code = getStructuredCode(kernel_code);

            

Reported by FlawFinder.
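All eight FlawFinder findings in this file reduce to the same advice: treat the variable as untrusted and bound the parsed value. The flagged code is C++, but the shape of the fix is language-independent; a Python sketch of the pattern (the helper name and bounds are our own):

```python
import os


def env_flag(name: str, default: int = 0, lo: int = 0, hi: int = 1) -> int:
    """Read an integer flag from the environment, clamped to [lo, hi].

    Sketch of the validation FlawFinder asks for: treat the variable as
    untrusted, parse defensively, and bound the result.
    """
    raw = os.environ.get(name)
    if raw is None:
        return default
    try:
        value = int(raw)
    except ValueError:
        # Non-numeric content is untrusted input, not an error worth raising.
        return default
    return max(lo, min(hi, value))
```

For a debug toggle like `PYTORCH_CUDA_FUSER_DEBUG`, clamping to `[0, 1]` also addresses the `atoi` overflow findings, since any out-of-range value collapses to a valid flag.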

torch/distributed/rpc/_testing/__init__.py
8 issues
Unable to import 'torch._C._distributed_rpc_testing'
Error

Line: 15 Column: 5

              if is_available():
    # Registers FAULTY_TENSORPIPE RPC backend.
    from . import faulty_agent_backend_registry
    from torch._C._distributed_rpc_testing import (
        FaultyTensorPipeRpcBackendOptions,
        FaultyTensorPipeAgent,
    )

            

Reported by Pylint.

Access to a protected member _C of a client class
Error

Line: 6 Column: 20

              

def is_available():
    return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():
    raise RuntimeError("Failed to initialize torch.distributed.rpc._testing")


            

Reported by Pylint.

Access to a protected member _faulty_agent_init of a client class
Error

Line: 9 Column: 27

                  return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():
    raise RuntimeError("Failed to initialize torch.distributed.rpc._testing")

if is_available():
    # Registers FAULTY_TENSORPIPE RPC backend.
    from . import faulty_agent_backend_registry

            

Reported by Pylint.

Access to a protected member _C of a client class
Error

Line: 9 Column: 27

                  return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():
    raise RuntimeError("Failed to initialize torch.distributed.rpc._testing")

if is_available():
    # Registers FAULTY_TENSORPIPE RPC backend.
    from . import faulty_agent_backend_registry

            

Reported by Pylint.

Module import itself
Error

Line: 14 Column: 5

              
if is_available():
    # Registers FAULTY_TENSORPIPE RPC backend.
    from . import faulty_agent_backend_registry
    from torch._C._distributed_rpc_testing import (
        FaultyTensorPipeRpcBackendOptions,
        FaultyTensorPipeAgent,
    )

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

              
import torch


def is_available():
    return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 5 Column: 1

              import torch


def is_available():
    return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():
    raise RuntimeError("Failed to initialize torch.distributed.rpc._testing")

            

Reported by Pylint.

Module 'torch._C' has no '_faulty_agent_init' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 9 Column: 27

                  return hasattr(torch._C, "_faulty_agent_init")


if is_available() and not torch._C._faulty_agent_init():
    raise RuntimeError("Failed to initialize torch.distributed.rpc._testing")

if is_available():
    # Registers FAULTY_TENSORPIPE RPC backend.
    from . import faulty_agent_backend_registry

            

Reported by Pylint.
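The no-member message above suggests its own fix; a minimal pylintrc fragment (option spelling varies by Pylint version, and newer releases also accept `extension-pkg-allow-list`):

```ini
[MASTER]
# Allow Pylint to introspect the compiled torch._C extension module,
# clearing the false-positive no-member reports on _faulty_agent_init.
extension-pkg-whitelist=torch._C
```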

torch/distributed/pipeline/sync/stream.py
8 issues
Module 'torch' has no 'device' member
Error

Line: 29 Column: 24

              AbstractStream = Union[torch.cuda.Stream, CPUStreamType]


def new_stream(device: torch.device) -> AbstractStream:
    """Creates a new stream for either CPU or CUDA device."""
    if device.type != "cuda":
        return CPUStream
    return torch.cuda.Stream(device)


            

Reported by Pylint.

Module 'torch' has no 'device' member
Error

Line: 36 Column: 28

                  return torch.cuda.Stream(device)


def current_stream(device: torch.device) -> AbstractStream:
    """:func:`torch.cuda.current_stream` for either CPU or CUDA device."""
    if device.type != "cuda":
        return CPUStream
    return torch.cuda.current_stream(device)


            

Reported by Pylint.

Module 'torch' has no 'device' member
Error

Line: 43 Column: 28

                  return torch.cuda.current_stream(device)


def default_stream(device: torch.device) -> AbstractStream:
    """:func:`torch.cuda.default_stream` for either CPU or CUDA device."""
    if device.type != "cuda":
        return CPUStream
    return torch.cuda.default_stream(device)


            

Reported by Pylint.

Module 'torch' has no 'device' member
Error

Line: 51 Column: 24

              

@contextmanager
def use_device(device: torch.device) -> Generator[None, None, None]:
    """:func:`torch.cuda.device` for either CPU or CUDA device."""
    if device.type != "cuda":
        yield
        return


            

Reported by Pylint.

Module 'torch' has no 'device' member
Error

Line: 72 Column: 43

                      yield


def get_device(stream: AbstractStream) -> torch.device:
    """Gets the device from CPU or CUDA stream."""
    if is_cuda(stream):
        return as_cuda(stream).device
    return torch.device("cpu")


            

Reported by Pylint.

Module 'torch' has no 'device' member
Error

Line: 76 Column: 12

                  """Gets the device from CPU or CUDA stream."""
    if is_cuda(stream):
        return as_cuda(stream).device
    return torch.device("cpu")


def wait_stream(source: AbstractStream, target: AbstractStream) -> None:
    """:meth:`torch.cuda.Stream.wait_stream` for either CPU or CUDA stream. It
    makes the source stream wait until the target stream completes work queued.

            

Reported by Pylint.

Too few public methods (0/2)
Error

Line: 18 Column: 1

              __all__: List[str] = []


class CPUStreamType:
    pass


# The placeholder on place of streams for the CPU device instead of CUDA.
CPUStream = CPUStreamType()

            

Reported by Pylint.

Missing class docstring
Error

Line: 18 Column: 1

              __all__: List[str] = []


class CPUStreamType:
    pass


# The placeholder on place of streams for the CPU device instead of CUDA.
CPUStream = CPUStreamType()

            

Reported by Pylint.

tools/autograd/gen_variable_factories.py
8 issues
TODO: maybe update the cpp argument API to take optional namespace argument?
Error

Line: 20 Column: 3

              TYPE_PATTERN = re.compile(r"(?:const\s+)?([A-Z]\w+)")

# Add 'at::' to types defined in ATen namespace, e.g. Tensor, TensorList, IntArrayRef and etc.
# TODO: maybe update the cpp argument API to take optional namespace argument?
def fully_qualified_type(argument_type: str) -> str:
    def maybe_optional_type(type: str, is_opt: bool) -> str:
        return f'c10::optional<{type}>' if is_opt else type

    opt_match = OPTIONAL_TYPE_PATTERN.match(argument_type)

            

Reported by Pylint.

Redefining built-in 'type'
Error

Line: 22 Column: 29

              # Add 'at::' to types defined in ATen namespace, e.g. Tensor, TensorList, IntArrayRef and etc.
# TODO: maybe update the cpp argument API to take optional namespace argument?
def fully_qualified_type(argument_type: str) -> str:
    def maybe_optional_type(type: str, is_opt: bool) -> str:
        return f'c10::optional<{type}>' if is_opt else type

    opt_match = OPTIONAL_TYPE_PATTERN.match(argument_type)
    is_opt = opt_match is not None
    if opt_match:

            

Reported by Pylint.
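The redefined-builtin finding is cleared by renaming the inner parameter so it no longer shadows `type`; a sketch (the replacement name `type_str` is our choice):

```python
def maybe_optional_type(type_str: str, is_opt: bool) -> str:
    # Same behavior as the original helper, without shadowing the
    # `type` builtin inside the function body.
    return f"c10::optional<{type_str}>" if is_opt else type_str
```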

Missing module docstring
Error

Line: 1 Column: 1

              # Generates C++ functions that wrap ATen tensor factory methods to turn them into Variables.
#
# This writes one file: variable_factories.h

import re
from typing import Optional, List

from tools.codegen.api.types import CppSignatureGroup
from tools.codegen.api import cpp

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 21 Column: 1

              
# Add 'at::' to types defined in ATen namespace, e.g. Tensor, TensorList, IntArrayRef and etc.
# TODO: maybe update the cpp argument API to take optional namespace argument?
def fully_qualified_type(argument_type: str) -> str:
    def maybe_optional_type(type: str, is_opt: bool) -> str:
        return f'c10::optional<{type}>' if is_opt else type

    opt_match = OPTIONAL_TYPE_PATTERN.match(argument_type)
    is_opt = opt_match is not None

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 36 Column: 1

                  qualified_type = f'{argument_type[:index]}at::{argument_type[index:]}'
    return maybe_optional_type(qualified_type, is_opt)

def gen_variable_factories(out: str, native_yaml_path: str, template_path: str) -> None:
    native_functions = parse_native_yaml(native_yaml_path).native_functions
    fm = FileManager(install_dir=out, template_dir=template_path, dry_run=False)
    fm.write_with_template('variable_factories.h', 'variable_factories.h', lambda: {
        'generated_comment': '@' + f'generated from {fm.template_dir}/variable_factories.h',
        'function_definitions': list(mapMaybe(process_function, native_functions)),

            

Reported by Pylint.

Variable name "fm" doesn't conform to snake_case naming style
Error

Line: 38 Column: 5

              
def gen_variable_factories(out: str, native_yaml_path: str, template_path: str) -> None:
    native_functions = parse_native_yaml(native_yaml_path).native_functions
    fm = FileManager(install_dir=out, template_dir=template_path, dry_run=False)
    fm.write_with_template('variable_factories.h', 'variable_factories.h', lambda: {
        'generated_comment': '@' + f'generated from {fm.template_dir}/variable_factories.h',
        'function_definitions': list(mapMaybe(process_function, native_functions)),
    })


            

Reported by Pylint.

Argument name "f" doesn't conform to snake_case naming style
Error

Line: 45 Column: 1

                  })

@with_native_function
def process_function(f: NativeFunction) -> Optional[str]:
    name = cpp.name(f.func)
    has_tensor_options = python.has_tensor_options(f)
    is_factory = has_tensor_options or name.endswith("_like")

    if Variant.function not in f.variants or not is_factory:

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 45 Column: 1

                  })

@with_native_function
def process_function(f: NativeFunction) -> Optional[str]:
    name = cpp.name(f.func)
    has_tensor_options = python.has_tensor_options(f)
    is_factory = has_tensor_options or name.endswith("_like")

    if Variant.function not in f.variants or not is_factory:

            

Reported by Pylint.

torch/_VF.py
8 issues
Module name "_VF" doesn't conform to snake_case naming style
Error

Line: 1 Column: 1

              """
This makes the functions in torch._C._VariableFunctions available as
    torch._VF.<funcname>
without mypy being able to find them.

A subset of those functions are mapped to ATen functions in
torch/jit/_builtins.py

See https://github.com/pytorch/pytorch/issues/21478 for the reason for

            

Reported by Pylint.

standard import "import sys" should be placed before "import torch"
Error

Line: 14 Column: 1

              
"""
import torch
import sys
import types


class VFModule(types.ModuleType):
    vf: types.ModuleType

            

Reported by Pylint.

standard import "import types" should be placed before "import torch"
Error

Line: 15 Column: 1

              """
import torch
import sys
import types


class VFModule(types.ModuleType):
    vf: types.ModuleType


            

Reported by Pylint.

Too few public methods (1/2)
Error

Line: 18 Column: 1

              import types


class VFModule(types.ModuleType):
    vf: types.ModuleType

    def __init__(self, name):
        super(VFModule, self).__init__(name)
        self.vf = torch._C._VariableFunctions

            

Reported by Pylint.

Missing class docstring
Error

Line: 18 Column: 1

              import types


class VFModule(types.ModuleType):
    vf: types.ModuleType

    def __init__(self, name):
        super(VFModule, self).__init__(name)
        self.vf = torch._C._VariableFunctions

            

Reported by Pylint.

Consider using Python 3 style super() without arguments
Error

Line: 22 Column: 9

                  vf: types.ModuleType

    def __init__(self, name):
        super(VFModule, self).__init__(name)
        self.vf = torch._C._VariableFunctions

    def __getattr__(self, attr):
        return getattr(self.vf, attr)


            

Reported by Pylint.

Attribute name "vf" doesn't conform to snake_case naming style
Error

Line: 23 Column: 9

              
    def __init__(self, name):
        super(VFModule, self).__init__(name)
        self.vf = torch._C._VariableFunctions

    def __getattr__(self, attr):
        return getattr(self.vf, attr)



            

Reported by Pylint.

Module 'torch._C' has no '_VariableFunctions' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 23 Column: 19

              
    def __init__(self, name):
        super(VFModule, self).__init__(name)
        self.vf = torch._C._VariableFunctions

    def __getattr__(self, attr):
        return getattr(self.vf, attr)



            

Reported by Pylint.
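The super-with-arguments and naming findings in `_VF.py` can be addressed together; a sketch of the proxy with Python 3 style `super()` (the `target` parameter stands in for `torch._C._VariableFunctions`, which we don't import here):

```python
import types


class VFModule(types.ModuleType):
    """Module proxy that forwards attribute lookups to a target namespace."""

    def __init__(self, name: str, target):
        super().__init__(name)  # Python 3 style: no (VFModule, self)
        self.target = target

    def __getattr__(self, attr):
        # Only invoked for attributes not found on the module itself,
        # so ordinary attributes like __name__ resolve normally.
        return getattr(self.target, attr)
```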

tools/test/test_mypy_wrapper.py
8 issues
Unable to import 'tools.linter'
Error

Line: 3 Column: 1

              import unittest

from tools.linter import mypy_wrapper


class TestMypyWrapper(unittest.TestCase):
    configs = {
        'foo.ini': {
            'file1.abc',

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

              import unittest

from tools.linter import mypy_wrapper


class TestMypyWrapper(unittest.TestCase):
    configs = {
        'foo.ini': {
            'file1.abc',

            

Reported by Pylint.

Missing class docstring
Error

Line: 6 Column: 1

              from tools.linter import mypy_wrapper


class TestMypyWrapper(unittest.TestCase):
    configs = {
        'foo.ini': {
            'file1.abc',
            'dir2',
            'dir3/file4.xyz',

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 32 Column: 5

                      },
    }

    def test_config_files(self) -> None:
        self.assertEqual(mypy_wrapper.config_files().keys(), {
            'mypy.ini',
            'mypy-strict.ini',
        })


            

Reported by Pylint.

Missing function or method docstring
Error

Line: 38 Column: 5

                          'mypy-strict.ini',
        })

    def test_split_path(self) -> None:
        self.assertEqual(mypy_wrapper.split_path('file1.abc'), ['file1.abc'])
        self.assertEqual(
            mypy_wrapper.split_path('dir3/file4.xyz'),
            ['dir3', 'file4.xyz'],
        )

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 49 Column: 5

                          ['dir2', 'dir5', 'file6.def'],
        )

    def test_make_trie(self) -> None:
        self.assertEqual(mypy_wrapper.make_trie(self.configs), self.trie)

    def test_lookup(self) -> None:
        self.assertEqual(
            mypy_wrapper.lookup(self.trie, 'file1.abc'),

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 52 Column: 5

                  def test_make_trie(self) -> None:
        self.assertEqual(mypy_wrapper.make_trie(self.configs), self.trie)

    def test_lookup(self) -> None:
        self.assertEqual(
            mypy_wrapper.lookup(self.trie, 'file1.abc'),
            {'foo.ini', 'bar/baz.ini'},
        )
        self.assertEqual(

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 105 Column: 5

                          set(),
        )

    def test_make_plan(self) -> None:
        self.assertEqual(
            mypy_wrapper.make_plan(configs=self.configs, files=[
                'file8.xyz',
                'dir3/file11.abc',
            ]),

            

Reported by Pylint.

tools/setup_helpers/gen_version_header.py
8 issues
Redefining name 'args' from outer scope (line 86)
Error

Line: 49 Column: 10

                  return text


def main(args: argparse.Namespace) -> None:
    with open(args.version_path) as f:
        version = f.read().strip()
    (major, minor, patch) = parse_version(version)

    replacements = {

            

Reported by Pylint.

Redefining built-in 'input'
Error

Line: 63 Column: 38

                  # Create the output dir if it doesn't exist.
    os.makedirs(os.path.dirname(args.output_path), exist_ok=True)

    with open(args.template_path) as input:
        with open(args.output_path, "w") as output:
            for line in input.readlines():
                output.write(apply_replacements(replacements, line))



            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

              # Ideally, there would be a way in Bazel to parse version.txt
# and use the version numbers from there as substitutions for
# an expand_template action. Since there isn't, this silly script exists.

import argparse
import os
from typing import Dict, Tuple, cast

Version = Tuple[int, int, int]

            

Reported by Pylint.

Consider using enumerate instead of iterating with range and len
Error

Line: 24 Column: 5

                  """
    # Extract version number part (i.e. toss any revision / hash parts).
    version_number_str = version
    for i in range(len(version)):
        c = version[i]
        if not (c.isdigit() or c == "."):
            version_number_str = version[:i]
            break


            

Reported by Pylint.

Variable name "c" doesn't conform to snake_case naming style
Error

Line: 25 Column: 9

                  # Extract version number part (i.e. toss any revision / hash parts).
    version_number_str = version
    for i in range(len(version)):
        c = version[i]
        if not (c.isdigit() or c == "."):
            version_number_str = version[:i]
            break

    return cast(Version, tuple([int(n) for n in version_number_str.split(".")]))

            

Reported by Pylint.

Consider using a generator instead 'tuple(int(n) for n in version_number_str.split('.'))'
Error

Line: 30 Column: 26

                          version_number_str = version[:i]
            break

    return cast(Version, tuple([int(n) for n in version_number_str.split(".")]))


def apply_replacements(replacements: Dict[str, str], text: str) -> str:
    """
    Applies the given replacements within the text.

            

Reported by Pylint.
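The enumerate and generator suggestions both land in `parse_version`; a hedged sketch of that function with both fixes applied:

```python
from typing import Tuple, cast

Version = Tuple[int, int, int]


def parse_version(version: str) -> Version:
    """Parse 'major.minor.patch' from a version string, ignoring any suffix."""
    version_number_str = version
    # enumerate() replaces the flagged `for i in range(len(version))` loop.
    for i, char in enumerate(version):
        if not (char.isdigit() or char == "."):
            version_number_str = version[:i]
            break
    # A generator expression replaces the flagged tuple([...]) call.
    return cast(Version, tuple(int(n) for n in version_number_str.split(".")))
```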

Missing function or method docstring
Error

Line: 49 Column: 1

                  return text


def main(args: argparse.Namespace) -> None:
    with open(args.version_path) as f:
        version = f.read().strip()
    (major, minor, patch) = parse_version(version)

    replacements = {

            

Reported by Pylint.

Variable name "f" doesn't conform to snake_case naming style
Error

Line: 50 Column: 37

              

def main(args: argparse.Namespace) -> None:
    with open(args.version_path) as f:
        version = f.read().strip()
    (major, minor, patch) = parse_version(version)

    replacements = {
        "@TORCH_VERSION_MAJOR@": str(major),

            

Reported by Pylint.

torch/fx/experimental/fx2trt/converters/transformation.py
8 issues
Unable to import 'tensorrt'
Error

Line: 2 Column: 1

              import torch
import tensorrt as trt
from torch.fx.experimental.fx2trt.fx2trt import tensorrt_converter

from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs

            

Reported by Pylint.

Attempted relative import beyond top-level package
Error

Line: 5 Column: 1

              import tensorrt as trt
from torch.fx.experimental.fx2trt.fx2trt import tensorrt_converter

from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs
    assert len(args) == 0

            

Reported by Pylint.

Module 'torch' has no 'flatten' member
Error

Line: 7 Column: 21

              
from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs
    assert len(args) == 0
    input_val = kwargs["input"]


            

Reported by Pylint.

Unused argument 'target'
Error

Line: 8 Column: 28

              from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs
    assert len(args) == 0
    input_val = kwargs["input"]

    if not isinstance(input_val, trt.tensorrt.ITensor):

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

              import torch
import tensorrt as trt
from torch.fx.experimental.fx2trt.fx2trt import tensorrt_converter

from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 8 Column: 1

              from .helper_functions import mark_as_int8_layer

@tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs
    assert len(args) == 0
    input_val = kwargs["input"]

    if not isinstance(input_val, trt.tensorrt.ITensor):

            

Reported by Pylint.

Use of assert detected. The enclosed code will be removed when compiling to optimised byte code.
Security

Line: 10
Suggestion: https://bandit.readthedocs.io/en/latest/plugins/b101_assert_used.html

              @tensorrt_converter(torch.flatten)
def torch_flatten(network, target, args, kwargs, name):
    # args/kwargs should have already been normalized to kwargs
    assert len(args) == 0
    input_val = kwargs["input"]

    if not isinstance(input_val, trt.tensorrt.ITensor):
        raise RuntimeError(f"Flatten received input {input_val} that is not part "
                           "of the TensorRT region!")

            

Reported by Bandit.

Use of assert detected. The enclosed code will be removed when compiling to optimised byte code.
Security

Line: 21
Suggestion: https://bandit.readthedocs.io/en/latest/plugins/b101_assert_used.html

                  start_dim = kwargs["start_dim"] - 1
    end_dim = len(input_val.shape) if kwargs["end_dim"] == -1 else kwargs["end_dim"] - 1

    assert start_dim >= 0, "Expect non negtive start_dim, this probably due to flatten batch dim."

    new_shape = []
    flatten_dim = 1
    for i, dim in enumerate(input_val.shape):
        if i < start_dim:

            

Reported by Bandit.
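Bandit's point in both findings is that `assert` statements vanish under `python -O`, so runtime validation in library code should raise instead. A sketch of the pattern (the helper name and message wording are ours):

```python
def check_start_dim(start_dim: int) -> None:
    # Unlike `assert`, an explicit raise survives optimized bytecode.
    if start_dim < 0:
        raise ValueError(
            "Expected non-negative start_dim; flattening the batch "
            "dimension is the likely cause."
        )
```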

test/onnx/model_defs/super_resolution.py
8 issues
Unable to import 'torch.nn'
Error

Line: 1 Column: 1

              import torch.nn as nn
import torch.nn.init as init


class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()

            

Reported by Pylint.

Unable to import 'torch.nn.init'
Error

Line: 2 Column: 1

              import torch.nn as nn
import torch.nn.init as init


class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

              import torch.nn as nn
import torch.nn.init as init


class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()

            

Reported by Pylint.

Too few public methods (1/2)
Error

Line: 5 Column: 1

              import torch.nn.init as init


class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()
        self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))

            

Reported by Pylint.

Missing class docstring
Error

Line: 5 Column: 1

              import torch.nn.init as init


class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()
        self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))

            

Reported by Pylint.

Consider using Python 3 style super() without arguments
Error

Line: 7 Column: 9

              
class SuperResolutionNet(nn.Module):
    def __init__(self, upscale_factor):
        super(SuperResolutionNet, self).__init__()

        self.relu = nn.ReLU()
        self.conv1 = nn.Conv2d(1, 64, (5, 5), (1, 1), (2, 2))
        self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1))
        self.conv3 = nn.Conv2d(64, 32, (3, 3), (1, 1), (1, 1))

            

Reported by Pylint.

Argument name "x" doesn't conform to snake_case naming style
Error

Line: 18 Column: 5

              
        self._initialize_weights()

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.relu(self.conv2(x))
        x = self.relu(self.conv3(x))
        x = self.pixel_shuffle(self.conv4(x))
        return x

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 18 Column: 5

              
        self._initialize_weights()

    def forward(self, x):
        x = self.relu(self.conv1(x))
        x = self.relu(self.conv2(x))
        x = self.relu(self.conv3(x))
        x = self.pixel_shuffle(self.conv4(x))
        return x

            

Reported by Pylint.