The following issues were found:

torch/autograd/graph.py
10 issues
Module 'torch' has no 'empty' member
Error

Line: 120 Column: 22

            if not pin_memory:
                return (tensor.device, tensor.cpu())

            packed = torch.empty(
                tensor.size(),
                dtype=tensor.dtype,
                layout=tensor.layout,
                pin_memory=(torch.cuda.is_available() and not tensor.is_sparse))
            packed.copy_(tensor)

            

Reported by Pylint.
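The `no-member` reports on `torch.empty` (and the similar `torch.full`, `torch.tensor`, and `torch._C._autograd` messages below) are known Pylint false positives on C-extension modules, where members only exist at runtime. A common workaround, sketched here from Pylint's documented options, is to whitelist the extension in the Pylint configuration rather than change the code:

```ini
# .pylintrc — let Pylint introspect the compiled torch extension at load time
[MASTER]
extension-pkg-whitelist=torch

[TYPECHECK]
# Alternatively, suppress member checks on torch without loading the extension:
generated-members=torch.*
```

Either option removes these reports; `extension-pkg-whitelist` imports the C extension for real introspection, while `generated-members` simply tells the checker to trust the listed attribute patterns.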

Missing module docstring
Error

Line: 1 Column: 1

import torch
from typing import Callable, Any

class saved_tensors_hooks():
    """Context-manager that sets a pair of pack / unpack hooks for saved tensors.

    Use this context-manager to define how intermediary results of an operation
    should be packed before saving, and unpacked on retrieval.


            

Reported by Pylint.

standard import "from typing import Callable, Any" should be placed before "import torch"
Error

Line: 2 Column: 1

import torch
from typing import Callable, Any

class saved_tensors_hooks():
    """Context-manager that sets a pair of pack / unpack hooks for saved tensors.

    Use this context-manager to define how intermediary results of an operation
    should be packed before saving, and unpacked on retrieval.


            

Reported by Pylint.

Class name "saved_tensors_hooks" doesn't conform to PascalCase naming style
Error

Line: 4 Column: 1

import torch
from typing import Callable, Any

class saved_tensors_hooks():
    """Context-manager that sets a pair of pack / unpack hooks for saved tensors.

    Use this context-manager to define how intermediary results of an operation
    should be packed before saving, and unpacked on retrieval.


            

Reported by Pylint.
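If the lowercase class names (`saved_tensors_hooks`, `save_on_cpu`) must stay for API compatibility, one way to satisfy `invalid-name` is to define a PascalCase class and keep the old name as an alias. A minimal sketch (`SavedTensorsHooks` is a hypothetical rename for illustration, not the actual torch API):

```python
class SavedTensorsHooks:
    """Context-manager that sets a pair of pack/unpack hooks for saved tensors."""

    def __init__(self, pack_hook, unpack_hook):
        self.pack_hook = pack_hook
        self.unpack_hook = unpack_hook


# Backward-compatible alias preserving the original public name.
saved_tensors_hooks = SavedTensorsHooks  # pylint: disable=invalid-name
```

The inline `disable` pragma confines the exception to the single compatibility line instead of turning the check off file-wide.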

Line too long (109/100)
Error

Line: 64 Column: 1

        Only one pair of hooks is allowed at a time. Recursively nesting this
        context-manager is not yet supported.
    """
    def __init__(self, pack_hook: Callable[[torch.Tensor], Any], unpack_hook: Callable[[Any], torch.Tensor]):
        self.pack_hook = pack_hook
        self.unpack_hook = unpack_hook

    def __enter__(self):
        torch._C._autograd._register_saved_tensors_default_hooks(self.pack_hook, self.unpack_hook)

            

Reported by Pylint.

Class name "save_on_cpu" doesn't conform to PascalCase naming style
Error

Line: 75 Column: 1

        torch._C._autograd._reset_saved_tensors_default_hooks()


class save_on_cpu():
    """Context-manager under which tensors saved by the forward pass will be
    stored on cpu, then retrieved for backward.

    When performing operations within this context manager, intermediary
    results saved in the graph during the forward pass will be moved to CPU,

            

Reported by Pylint.

Module 'torch._C' has no '_autograd' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 69 Column: 9

        self.unpack_hook = unpack_hook

    def __enter__(self):
        torch._C._autograd._register_saved_tensors_default_hooks(self.pack_hook, self.unpack_hook)

    def __exit__(self, *args: Any):
        torch._C._autograd._reset_saved_tensors_default_hooks()



            

Reported by Pylint.

Module 'torch._C' has no '_autograd' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 72 Column: 9

        torch._C._autograd._register_saved_tensors_default_hooks(self.pack_hook, self.unpack_hook)

    def __exit__(self, *args: Any):
        torch._C._autograd._reset_saved_tensors_default_hooks()


class save_on_cpu():
    """Context-manager under which tensors saved by the forward pass will be
    stored on cpu, then retrieved for backward.

            

Reported by Pylint.

Module 'torch._C' has no '_autograd' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 136 Column: 9

        self.unpack_hook = unpack_from_cpu

    def __enter__(self):
        torch._C._autograd._register_saved_tensors_default_hooks(self.pack_hook, self.unpack_hook)

    def __exit__(self, *args: Any):
        torch._C._autograd._reset_saved_tensors_default_hooks()

            

Reported by Pylint.

Module 'torch._C' has no '_autograd' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 139 Column: 9

        torch._C._autograd._register_saved_tensors_default_hooks(self.pack_hook, self.unpack_hook)

    def __exit__(self, *args: Any):
        torch._C._autograd._reset_saved_tensors_default_hooks()

            

Reported by Pylint.

test/package/package_c/test_module.py
10 issues
Unable to import 'torch'
Error

Line: 1 Column: 1

import torch

try:
    from torchvision.models import resnet18

    class TorchVisionTest(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.tvmod = resnet18()

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

import torch

try:
    from torchvision.models import resnet18

    class TorchVisionTest(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.tvmod = resnet18()

            

Reported by Pylint.

Missing class docstring
Error

Line: 6 Column: 5

try:
    from torchvision.models import resnet18

    class TorchVisionTest(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.tvmod = resnet18()

        def forward(self, x):

            

Reported by Pylint.

Too few public methods (1/2)
Error

Line: 6 Column: 5

try:
    from torchvision.models import resnet18

    class TorchVisionTest(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.tvmod = resnet18()

        def forward(self, x):

            

Reported by Pylint.

Method could be a function
Error

Line: 11 Column: 9

            super().__init__()
            self.tvmod = resnet18()

        def forward(self, x):
            x = a_non_torch_leaf(x, x)
            return torch.relu(x + 3.0)


except ImportError:

            

Reported by Pylint.

Argument name "x" doesn't conform to snake_case naming style
Error

Line: 11 Column: 9

            super().__init__()
            self.tvmod = resnet18()

        def forward(self, x):
            x = a_non_torch_leaf(x, x)
            return torch.relu(x + 3.0)


except ImportError:

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 11 Column: 9

            super().__init__()
            self.tvmod = resnet18()

        def forward(self, x):
            x = a_non_torch_leaf(x, x)
            return torch.relu(x + 3.0)


except ImportError:

            

Reported by Pylint.

Argument name "a" doesn't conform to snake_case naming style
Error

Line: 20 Column: 1

    pass


def a_non_torch_leaf(a, b):
    return a + b

            

Reported by Pylint.

Argument name "b" doesn't conform to snake_case naming style
Error

Line: 20 Column: 1

    pass


def a_non_torch_leaf(a, b):
    return a + b

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 20 Column: 1

    pass


def a_non_torch_leaf(a, b):
    return a + b

            

Reported by Pylint.
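The three reports on `a_non_torch_leaf` (short argument names plus a missing docstring) can be fixed together. A sketch of one possible cleanup; the names `lhs`/`rhs` are illustrative choices, not from the source:

```python
def a_non_torch_leaf(lhs, rhs):
    """Add two values without calling into torch."""
    return lhs + rhs
```

Pylint's default `argument-rgx` requires at least three characters, which is why `a` and `b` are flagged; adding short names such as `a` and `b` to `good-names` in the Pylint config is an equally valid fix for a test file.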

tools/code_coverage/package/util/utils_init.py
10 issues
Attempted relative import beyond top-level package
Error

Line: 5 Column: 1

import os
from typing import Any

from .setting import (
    JSON_FOLDER_BASE_DIR,
    LOG_DIR,
    MERGED_FOLDER_BASE_DIR,
    PROFILE_DIR,
    SUMMARY_FOLDER_DIR,

            

Reported by Pylint.

Attempted relative import beyond top-level package
Error

Line: 13 Column: 1

    SUMMARY_FOLDER_DIR,
    Option,
)
from .utils import create_folder, get_raw_profiles_folder, remove_file


def remove_files() -> None:
    # remove log
    remove_file(os.path.join(LOG_DIR, "log.txt"))

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

import argparse
import os
from typing import Any

from .setting import (
    JSON_FOLDER_BASE_DIR,
    LOG_DIR,
    MERGED_FOLDER_BASE_DIR,
    PROFILE_DIR,

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 16 Column: 1

from .utils import create_folder, get_raw_profiles_folder, remove_file


def remove_files() -> None:
    # remove log
    remove_file(os.path.join(LOG_DIR, "log.txt"))


def create_folders() -> None:

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 21 Column: 1

    remove_file(os.path.join(LOG_DIR, "log.txt"))


def create_folders() -> None:
    create_folder(
        PROFILE_DIR,
        MERGED_FOLDER_BASE_DIR,
        JSON_FOLDER_BASE_DIR,
        get_raw_profiles_folder(),

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 32 Column: 1

    )


def add_arguments_utils(parser: argparse.ArgumentParser) -> argparse.ArgumentParser:
    parser.add_argument("--run", help="run the cpp test binaries", action="store_true")
    parser.add_argument(
        "--merge",
        help="merge raw profiles (only apply to clang coverage)",
        action="store_true",

            

Reported by Pylint.

Line too long (106/100)
Error

Line: 49 Column: 1

    )
    parser.add_argument(
        "--interest-only",
        help="Final report will be only about these folders and its sub-folders; for example: caff2/c10;",
        nargs="+",
        default=None,
    )
    parser.add_argument(
        "--clean",

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 63 Column: 1

    return parser


def have_option(have_stage: bool, option: int) -> int:
    if have_stage:
        return option
    else:
        return 0


            

Reported by Pylint.

Unnecessary "else" after "return"
Error

Line: 64 Column: 5

              

def have_option(have_stage: bool, option: int) -> int:
    if have_stage:
        return option
    else:
        return 0



            

Reported by Pylint.
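The `no-else-return` report means the `else` branch is redundant: once the `if` branch returns, fall-through already covers the other case. A sketch of the flattened function, with a docstring added for the missing-docstring report on the same line:

```python
def have_option(have_stage: bool, option: int) -> int:
    """Return the option code when the stage is enabled, otherwise 0."""
    if have_stage:
        return option
    return 0  # no `else` needed after a `return`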

Missing function or method docstring
Error

Line: 70 Column: 1

        return 0


def get_options(args: Any) -> Option:
    option: Option = Option()
    if args.__contains__("build"):
        if args.build:
            option.need_build = True


            

Reported by Pylint.

tools/test/test_extract_scripts.py
10 issues
Unable to import 'tools'
Error

Line: 3 Column: 1

import unittest

from tools import extract_scripts

requirements_sh = '''
#!/usr/bin/env bash
set -eo pipefail
pip install -r requirements.txt
'''.strip()

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

import unittest

from tools import extract_scripts

requirements_sh = '''
#!/usr/bin/env bash
set -eo pipefail
pip install -r requirements.txt
'''.strip()

            

Reported by Pylint.

Constant name "requirements_sh" doesn't conform to UPPER_CASE naming style
Error

Line: 5 Column: 1

              
from tools import extract_scripts

requirements_sh = '''
#!/usr/bin/env bash
set -eo pipefail
pip install -r requirements.txt
'''.strip()


            

Reported by Pylint.

Constant name "hello_sh" doesn't conform to UPPER_CASE naming style
Error

Line: 11 Column: 1

pip install -r requirements.txt
'''.strip()

hello_sh = '''
#!/usr/bin/env sh
set -e
echo hello world
'''.strip()


            

Reported by Pylint.
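Module-level assignments that are never rebound are treated as constants, so Pylint expects UPPER_CASE names. A sketch of the rename (the uppercase names are illustrative; call sites in the tests would need the same rename):

```python
REQUIREMENTS_SH = '''
#!/usr/bin/env bash
set -eo pipefail
pip install -r requirements.txt
'''.strip()

HELLO_SH = '''
#!/usr/bin/env sh
set -e
echo hello world
'''.strip()
```

Alternatively, annotating the originals with `# pylint: disable=invalid-name` keeps the lowercase names if the test fixtures are meant to read like shell-script variables.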

Missing class docstring
Error

Line: 18 Column: 1

'''.strip()


class TestExtractScripts(unittest.TestCase):
    def test_extract_none(self) -> None:
        self.assertEqual(
            extract_scripts.extract({
                'name': 'Checkout PyTorch',
                'uses': 'actions/checkout@v2',

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 19 Column: 5

              

class TestExtractScripts(unittest.TestCase):
    def test_extract_none(self) -> None:
        self.assertEqual(
            extract_scripts.extract({
                'name': 'Checkout PyTorch',
                'uses': 'actions/checkout@v2',
            }),

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 28 Column: 5

            None,
        )

    def test_extract_run_default_bash(self) -> None:
        self.assertEqual(
            extract_scripts.extract({
                'name': 'Install requirements',
                'run': 'pip install -r requirements.txt',
            }),

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 40 Column: 5

            },
        )

    def test_extract_run_sh(self) -> None:
        self.assertEqual(
            extract_scripts.extract({
                'name': 'Hello world',
                'run': 'echo hello world',
                'shell': 'sh',

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 53 Column: 5

            },
        )

    def test_extract_run_py(self) -> None:
        self.assertEqual(
            extract_scripts.extract({
                'name': 'Hello world',
                'run': 'print("Hello!")',
                'shell': 'python',

            

Reported by Pylint.

Missing function or method docstring
Error

Line: 66 Column: 5

            },
        )

    def test_extract_github_script(self) -> None:
        self.assertEqual(
            # https://github.com/actions/github-script/tree/v3.1.1#reading-step-results
            extract_scripts.extract({
                'uses': 'actions/github-script@v3',
                'id': 'set-result',

            

Reported by Pylint.

test/typing/reveal/namedtuple.py
10 issues
Unable to import 'torch'
Error

Line: 1 Column: 1

import torch


t = torch.tensor([[3.0, 1.5], [2.0, 1.5]])

t_sort = t.sort()
t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015

            

Reported by Pylint.

Undefined variable 'reveal_type'
Error

Line: 10 Column: 1

t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015
reveal_type(t_sort)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C.namedtuple_values_indices]

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015
reveal_type(t_qr)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C._VariableFunctions.namedtuple_Q_R]

            

Reported by Pylint.

Undefined variable 'reveal_type'
Error

Line: 15 Column: 1

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015
reveal_type(t_qr)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C._VariableFunctions.namedtuple_Q_R]

            

Reported by Pylint.

Statement seems to have no effect
Error

Line: 7 Column: 1

t = torch.tensor([[3.0, 1.5], [2.0, 1.5]])

t_sort = t.sort()
t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015
reveal_type(t_sort)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C.namedtuple_values_indices]

t_qr = torch.linalg.qr(t)

            

Reported by Pylint.

Statement seems to have no effect
Error

Line: 8 Column: 1

              
t_sort = t.sort()
t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015
reveal_type(t_sort)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C.namedtuple_values_indices]

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015

            

Reported by Pylint.

Statement seems to have no effect
Error

Line: 9 Column: 1

t_sort = t.sort()
t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015
reveal_type(t_sort)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C.namedtuple_values_indices]

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015

            

Reported by Pylint.

Statement seems to have no effect
Error

Line: 13 Column: 1

reveal_type(t_sort)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C.namedtuple_values_indices]

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015
reveal_type(t_qr)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C._VariableFunctions.namedtuple_Q_R]

            

Reported by Pylint.

Statement seems to have no effect
Error

Line: 14 Column: 1

              
t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015
reveal_type(t_qr)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C._VariableFunctions.namedtuple_Q_R]

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

import torch


t = torch.tensor([[3.0, 1.5], [2.0, 1.5]])

t_sort = t.sort()
t_sort[0][0, 0] == 1.5      # noqa: B015
t_sort.indices[0, 0] == 1   # noqa: B015
t_sort.values[0, 0] == 1.5  # noqa: B015

            

Reported by Pylint.

Line too long (102/100)
Error

Line: 15 Column: 1

t_qr = torch.linalg.qr(t)
t_qr[0].shape == [2, 2]     # noqa: B015
t_qr.Q.shape == [2, 2]      # noqa: B015
reveal_type(t_qr)  # E: Tuple[{Tensor}, {Tensor}, fallback=torch._C._VariableFunctions.namedtuple_Q_R]

            

Reported by Pylint.
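Most reports in this file are expected: it is a mypy type-checking fixture, so `reveal_type` only exists when mypy runs (hence `undefined-variable`), and the bare comparisons exist to exercise the named-tuple types (hence `pointless-statement`). In ordinary code the fix for a no-effect comparison is to assert it; a torch-free sketch of the pattern:

```python
# A bare comparison computes a bool and discards it; wrap it in `assert`
# so the check actually does something.
values = sorted([3.0, 1.5])

values[0] == 1.5         # pointless-statement: result is thrown away
assert values[0] == 1.5  # has an effect: fails loudly when untrue
```

For fixture files like this one, the cleaner fix is configuration: disable `pointless-statement` and `undefined-variable` for the `test/typing/reveal` path rather than rewriting the fixtures.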

torch/distributions/half_cauchy.py
9 issues
Module 'torch' has no 'full' member
Error

Line: 46 Column: 16

    @property
    def mean(self):
        return torch.full(self._extended_shape(), math.inf, dtype=self.scale.dtype, device=self.scale.device)

    @property
    def variance(self):
        return self.base_dist.variance


            

Reported by Pylint.

Module 'torch' has no 'as_tensor' member; maybe 'is_tensor'?
Error

Line: 55 Column: 17

    def log_prob(self, value):
        if self._validate_args:
            self._validate_sample(value)
        value = torch.as_tensor(value, dtype=self.base_dist.scale.dtype,
                                device=self.base_dist.scale.device)
        log_prob = self.base_dist.log_prob(value) + math.log(2)
        log_prob[value.expand(log_prob.shape) < 0] = -inf
        return log_prob


            

Reported by Pylint.

Method 'enumerate_support' is abstract in class 'Distribution' but is not overridden
Error

Line: 11 Column: 1

from torch.distributions.transformed_distribution import TransformedDistribution


class HalfCauchy(TransformedDistribution):
    r"""
    Creates a half-Cauchy distribution parameterized by `scale` where::

        X ~ Cauchy(0, scale)
        Y = |X| ~ HalfCauchy(scale)

            

Reported by Pylint.

Parameters differ from overridden 'icdf' method
Error

Line: 66 Column: 5

            self._validate_sample(value)
        return 2 * self.base_dist.cdf(value) - 1

    def icdf(self, prob):
        return self.base_dist.icdf((prob + 1) / 2)

    def entropy(self):
        return self.base_dist.entropy() - math.log(2)

            

Reported by Pylint.
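For the `enumerate_support` report: `HalfCauchy` is a continuous distribution, so enumeration is genuinely undefined; the usual resolution is to override the abstract method with an explicit `NotImplementedError` so intent is visible to both readers and the checker. A standalone stub sketching the idea (not the torch class hierarchy):

```python
class ContinuousDistribution:
    """Stub showing an explicit opt-out of support enumeration."""

    has_enumerate_support = False

    def enumerate_support(self, expand=True):
        raise NotImplementedError(
            "enumerate_support is undefined for continuous distributions"
        )
```

The explicit override both silences `abstract-method` and gives callers a clearer error than the inherited abstract stub would.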

Missing module docstring
Error

Line: 1 Column: 1

import math

import torch
from torch._six import inf
from torch.distributions import constraints
from torch.distributions.transforms import AbsTransform
from torch.distributions.cauchy import Cauchy
from torch.distributions.transformed_distribution import TransformedDistribution


            

Reported by Pylint.

Consider using Python 3 style super() without arguments
Error

Line: 33 Column: 9

              
    def __init__(self, scale, validate_args=None):
        base_dist = Cauchy(0, scale, validate_args=False)
        super(HalfCauchy, self).__init__(base_dist, AbsTransform(),
                                         validate_args=validate_args)

    def expand(self, batch_shape, _instance=None):
        new = self._get_checked_instance(HalfCauchy, _instance)
        return super(HalfCauchy, self).expand(batch_shape, _instance=new)

            

Reported by Pylint.

Consider using Python 3 style super() without arguments
Error

Line: 38 Column: 16

              
    def expand(self, batch_shape, _instance=None):
        new = self._get_checked_instance(HalfCauchy, _instance)
        return super(HalfCauchy, self).expand(batch_shape, _instance=new)

    @property
    def scale(self):
        return self.base_dist.scale


            

Reported by Pylint.

Missing function or method docstring
Error

Line: 41 Column: 5

        return super(HalfCauchy, self).expand(batch_shape, _instance=new)

    @property
    def scale(self):
        return self.base_dist.scale

    @property
    def mean(self):
        return torch.full(self._extended_shape(), math.inf, dtype=self.scale.dtype, device=self.scale.device)

            

Reported by Pylint.

Line too long (109/100)
Error

Line: 46 Column: 1

              
    @property
    def mean(self):
        return torch.full(self._extended_shape(), math.inf, dtype=self.scale.dtype, device=self.scale.device)

    @property
    def variance(self):
        return self.base_dist.variance


            

Reported by Pylint.

torch/distributions/constraint_registry.py
9 issues
Unused argument 'constraint'
Error

Line: 157 Column: 24

              
@biject_to.register(constraints.real)
@transform_to.register(constraints.real)
def _transform_to_real(constraint):
    return transforms.identity_transform


@biject_to.register(constraints.independent)
def _biject_to_independent(constraint):

            

Reported by Pylint.

Unused argument 'constraint'
Error

Line: 177 Column: 28

              
@biject_to.register(constraints.positive)
@transform_to.register(constraints.positive)
def _transform_to_positive(constraint):
    return transforms.ExpTransform()


@biject_to.register(constraints.greater_than)
@biject_to.register(constraints.greater_than_eq)

            

Reported by Pylint.

Unused argument 'constraint'
Error

Line: 215 Column: 24

              

@biject_to.register(constraints.simplex)
def _biject_to_simplex(constraint):
    return transforms.StickBreakingTransform()


@transform_to.register(constraints.simplex)
def _transform_to_simplex(constraint):

            

Reported by Pylint.

Unused argument 'constraint'
Error

Line: 220 Column: 27

              

@transform_to.register(constraints.simplex)
def _transform_to_simplex(constraint):
    return transforms.SoftmaxTransform()


# TODO define a bijection for LowerCholeskyTransform
@transform_to.register(constraints.lower_cholesky)

            

Reported by Pylint.

TODO define a bijection for LowerCholeskyTransform
Error

Line: 224 Column: 3

    return transforms.SoftmaxTransform()


# TODO define a bijection for LowerCholeskyTransform
@transform_to.register(constraints.lower_cholesky)
def _transform_to_lower_cholesky(constraint):
    return transforms.LowerCholeskyTransform()



            

Reported by Pylint.

Unused argument 'constraint'
Error

Line: 226 Column: 34

              
# TODO define a bijection for LowerCholeskyTransform
@transform_to.register(constraints.lower_cholesky)
def _transform_to_lower_cholesky(constraint):
    return transforms.LowerCholeskyTransform()


@biject_to.register(constraints.corr_cholesky)
@transform_to.register(constraints.corr_cholesky)

            

Reported by Pylint.

Unused argument 'constraint'
Error

Line: 232 Column: 33

              
@biject_to.register(constraints.corr_cholesky)
@transform_to.register(constraints.corr_cholesky)
def _transform_to_corr_cholesky(constraint):
    return transforms.CorrCholeskyTransform()


@biject_to.register(constraints.cat)
def _biject_to_cat(constraint):

            

Reported by Pylint.

Class 'ConstraintRegistry' inherits from object, can be safely removed from bases in python3
Error

Line: 79 Column: 1

]


class ConstraintRegistry(object):
    """
    Registry to link constraints to transforms.
    """
    def __init__(self):
        self._registry = {}

            

Reported by Pylint.

Consider using Python 3 style super() without arguments
Error

Line: 85 Column: 9

                  """
    def __init__(self):
        self._registry = {}
        super(ConstraintRegistry, self).__init__()

    def register(self, constraint, factory=None):
        """
        Registers a :class:`~torch.distributions.constraints.Constraint`
        subclass in this registry. Usage::

            

Reported by Pylint.
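Both reports on `ConstraintRegistry` (the redundant `object` base and the Python 2 style `super(ConstraintRegistry, self)`) have the same Python 3 cleanup. A sketch of the modernized class header:

```python
class ConstraintRegistry:
    """Registry to link constraints to transforms."""

    def __init__(self):
        self._registry = {}
        super().__init__()  # Python 3 style: no class/instance arguments
```

Zero-argument `super()` resolves the class and instance from the enclosing scope, so it also survives class renames that the explicit form would silently break.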

torch/autograd/variable.py
9 issues
Access to a protected member _LegacyVariableBase of a client class
Error

Line: 11 Column: 45

              

# mypy doesn't understand torch._six.with_metaclass
class Variable(with_metaclass(VariableMeta, torch._C._LegacyVariableBase)):  # type: ignore[misc]
    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Access to a protected member _C of a client class
Error

Line: 11 Column: 45

              

# mypy doesn't understand torch._six.with_metaclass
class Variable(with_metaclass(VariableMeta, torch._C._LegacyVariableBase)):  # type: ignore[misc]
    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Access to a protected member _execution_engine of a client class
Error

Line: 16 Column: 1

              

from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

import torch
from torch._six import with_metaclass


class VariableMeta(type):
    def __instancecheck__(cls, other):
        return isinstance(other, torch.Tensor)



            

Reported by Pylint.

Missing class docstring
Error

Line: 5 Column: 1

from torch._six import with_metaclass


class VariableMeta(type):
    def __instancecheck__(cls, other):
        return isinstance(other, torch.Tensor)


# mypy doesn't understand torch._six.with_metaclass

            

Reported by Pylint.

Too few public methods (0/2)
Error

Line: 11 Column: 1

              

# mypy doesn't understand torch._six.with_metaclass
class Variable(with_metaclass(VariableMeta, torch._C._LegacyVariableBase)):  # type: ignore[misc]
    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Missing class docstring
Error

Line: 11 Column: 1

              

# mypy doesn't understand torch._six.with_metaclass
class Variable(with_metaclass(VariableMeta, torch._C._LegacyVariableBase)):  # type: ignore[misc]
    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Import "from torch._C import _ImperativeEngine as ImperativeEngine" should be placed at the top of the module
Error

Line: 15 Column: 1

    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

Module 'torch._C' has no '_LegacyVariableBase' member, but source is unavailable. Consider adding this module to extension-pkg-whitelist if you want to perform analysis based on run-time introspection of living objects.
Error

Line: 11 Column: 45

              

# mypy doesn't understand torch._six.with_metaclass
class Variable(with_metaclass(VariableMeta, torch._C._LegacyVariableBase)):  # type: ignore[misc]
    pass


from torch._C import _ImperativeEngine as ImperativeEngine
Variable._execution_engine = ImperativeEngine()

            

Reported by Pylint.

torch/distributed/optim/functional_adagrad.py
9 issues
Module 'torch' has no 'full_like' member
Error

Line: 54 Column: 24

        # This is also needed by if we want to share_memory on the step across processes
        for p in self.param_group["params"]:
            self.state[p] = {
                "sum": torch.full_like(p.data, initial_accumulator_value),
                "step": torch.tensor(0.0),
            }

    def step(self, gradients: List[Optional[Tensor]]):
        params = self.param_group['params']

            

Reported by Pylint.

Module 'torch' has no 'tensor' member; maybe 'Tensor'?
Error

Line: 55 Column: 25

        for p in self.param_group["params"]:
            self.state[p] = {
                "sum": torch.full_like(p.data, initial_accumulator_value),
                "step": torch.tensor(0.0),
            }

    def step(self, gradients: List[Optional[Tensor]]):
        params = self.param_group['params']
        params_with_grad = []

            

Reported by Pylint.

TODO: no union or any types in TorchScript, make step a scalar tensor instead
Error

Line: 50 Column: 3

        # param group as it's not a common use case.
        self.param_group = {"params": params}

        # TODO: no union or any types in TorchScript, make step a scalar tensor instead
        # This is also needed by if we want to share_memory on the step across processes
        for p in self.param_group["params"]:
            self.state[p] = {
                "sum": torch.full_like(p.data, initial_accumulator_value),
                "step": torch.tensor(0.0),

            

Reported by Pylint.

Missing module docstring
Error

Line: 1 Column: 1

from typing import List, Dict, Optional
import torch
import torch.optim._functional as F

from torch import Tensor

# Define a TorchScript compatible Functional Adagrad Optimizer
# where we use these optimizer in a functional way.
# Instead of using the `param.grad` when updating parameters,

            

Reported by Pylint.

Class '_FunctionalAdagrad' inherits from object, can be safely removed from bases in python3
Error

Line: 17 Column: 1

# NOTE: This should be only used by distributed optimizer internals
# and not meant to expose to the user.
@torch.jit.script
class _FunctionalAdagrad(object):
    def __init__(
        self,
        params: List[Tensor],
        lr: float = 1e-2,
        lr_decay: float = 0.0,

            

Reported by Pylint.

Too few public methods (1/2)
Error

Line: 17 Column: 1

              # NOTE: This should be only used by distributed optimizer internals
# and not meant to expose to the user.
@torch.jit.script
class _FunctionalAdagrad(object):
    def __init__(
        self,
        params: List[Tensor],
        lr: float = 1e-2,
        lr_decay: float = 0.0,

            

Reported by Pylint.

Too many arguments (11/5)
Error

Line: 18 Column: 5

# and not meant to expose to the user.
@torch.jit.script
class _FunctionalAdagrad(object):
    def __init__(
        self,
        params: List[Tensor],
        lr: float = 1e-2,
        lr_decay: float = 0.0,
        weight_decay: float = 0.0,

            

Reported by Pylint.
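The `too-many-arguments` report on `__init__` (11 parameters against Pylint's default limit of 5) is commonly addressed by grouping the scalar hyperparameters into a single config object. A hedged sketch; `AdagradHyperParams` and its fields are illustrative, mirroring typical Adagrad knobs rather than the exact signature:

```python
from dataclasses import dataclass


@dataclass
class AdagradHyperParams:
    """Hypothetical grouping of the optimizer's scalar hyperparameters."""

    lr: float = 1e-2
    lr_decay: float = 0.0
    weight_decay: float = 0.0
    initial_accumulator_value: float = 0.0
    eps: float = 1e-10
```

The constructor then takes `(params, hyper: AdagradHyperParams)`, which keeps the argument count low and lets callers name only the knobs they change. Note that TorchScript compatibility would need to be verified before applying this to `_FunctionalAdagrad` itself; raising `max-args` in the Pylint config is the non-invasive alternative.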

Variable name "p" doesn't conform to snake_case naming style
Error

Line: 52 Column: 13

              
        # TODO: no union or any types in TorchScript, make step a scalar tensor instead
        # This is also needed by if we want to share_memory on the step across processes
        for p in self.param_group["params"]:
            self.state[p] = {
                "sum": torch.full_like(p.data, initial_accumulator_value),
                "step": torch.tensor(0.0),
            }



Reported by Pylint.
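
Single-letter names like `p` (and `lr` below) are conventional in optimizer loops, so renaming them would hurt readability. A project-level alternative is whitelisting them; a `.pylintrc` sketch (the name list is an assumption, not PyTorch's actual config):

```ini
# .pylintrc (sketch) -- allow the short names conventional in optimizer code
[BASIC]
good-names=p,lr,i,j,k
```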

Missing function or method docstring
Error

Line: 58 Column: 5

                              "step": torch.tensor(0.0),
            }

    def step(self, gradients: List[Optional[Tensor]]):
        params = self.param_group['params']
        params_with_grad = []
        grads = []
        state_sums = []
        state_steps: List[int] = []


Reported by Pylint.
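
For `missing-function-docstring`, a short docstring on `step` suffices. A sketch whose wording is assumed from the excerpt (standalone here, without `self`):

```python
from typing import List, Optional


def step(gradients: List[Optional[float]]) -> None:
    """Perform a single optimization step.

    Args:
        gradients: one entry per parameter in ``param_group['params']``;
            ``None`` entries mean the parameter received no gradient.
    """
```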

torch/distributed/optim/functional_rprop.py
9 issues
Module 'torch' has no 'tensor' member; maybe 'Tensor'?
Error

Line: 66 Column: 37

                              if param not in self.state:
                    self.state[param] = {}
                    state = self.state[param]
                    state['step'] = torch.tensor(0.0)
                    state['prev'] = torch.zeros_like(param, memory_format=torch.preserve_format)
                    state['step_size'] = torch.full_like(gradient, lr)

                state = self.state[param]
                prevs.append(state['prev'])


Reported by Pylint.

Module 'torch' has no 'preserve_format' member
Error

Line: 67 Column: 75

                                  self.state[param] = {}
                    state = self.state[param]
                    state['step'] = torch.tensor(0.0)
                    state['prev'] = torch.zeros_like(param, memory_format=torch.preserve_format)
                    state['step_size'] = torch.full_like(gradient, lr)

                state = self.state[param]
                prevs.append(state['prev'])
                step_sizes.append(state['step_size'])


Reported by Pylint.

Module 'torch' has no 'zeros_like' member
Error

Line: 67 Column: 37

                                  self.state[param] = {}
                    state = self.state[param]
                    state['step'] = torch.tensor(0.0)
                    state['prev'] = torch.zeros_like(param, memory_format=torch.preserve_format)
                    state['step_size'] = torch.full_like(gradient, lr)

                state = self.state[param]
                prevs.append(state['prev'])
                step_sizes.append(state['step_size'])


Reported by Pylint.

Module 'torch' has no 'full_like' member
Error

Line: 68 Column: 42

                                  state = self.state[param]
                    state['step'] = torch.tensor(0.0)
                    state['prev'] = torch.zeros_like(param, memory_format=torch.preserve_format)
                    state['step_size'] = torch.full_like(gradient, lr)

                state = self.state[param]
                prevs.append(state['prev'])
                step_sizes.append(state['step_size'])



Reported by Pylint.
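
The four `no-member` errors above (`torch.tensor`, `torch.preserve_format`, `torch.zeros_like`, `torch.full_like`) are false positives: these attributes exist at runtime but are provided by torch's C extension, which Pylint does not introspect by default. A `.pylintrc` sketch that silences them (either option alone is usually enough):

```ini
# .pylintrc (sketch) -- stop flagging members of torch's C extension
[MASTER]
extension-pkg-whitelist=torch

[TYPECHECK]
generated-members=torch.*
```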

Missing module docstring
Error

Line: 1 Column: 1

              from typing import List, Dict, Optional, Tuple
import torch
import torch.optim._functional as F

from torch import Tensor

# Define a TorchScript compatible Functional Rprop Optimizer
# where we use these optimizer in a functional way.
# Instead of using the `param.grad` when updating parameters,


Reported by Pylint.

Too few public methods (1/2)
Error

Line: 17 Column: 1

              # NOTE: This should be only used by distributed optimizer internals
# and not meant to expose to the user.
@torch.jit.script
class _FunctionalRprop(object):
    def __init__(
        self,
        params: List[Tensor],
        lr: float = 1e-2,
        etas: Tuple[float, float] = (0.5, 1.2),


Reported by Pylint.

Class '_FunctionalRprop' inherits from object, can be safely removed from bases in python3
Error

Line: 17 Column: 1

              # NOTE: This should be only used by distributed optimizer internals
# and not meant to expose to the user.
@torch.jit.script
class _FunctionalRprop(object):
    def __init__(
        self,
        params: List[Tensor],
        lr: float = 1e-2,
        etas: Tuple[float, float] = (0.5, 1.2),


Reported by Pylint.

Missing function or method docstring
Error

Line: 41 Column: 5

              
        self.state = torch.jit.annotate(Dict[torch.Tensor, Dict[str, torch.Tensor]], {})

    def step(self, gradients: List[Optional[Tensor]]):
        params = self.param_group['params']
        params_with_grad = []
        grads = []
        prevs = []
        step_sizes = []


Reported by Pylint.

Variable name "lr" doesn't conform to snake_case naming style
Error

Line: 47 Column: 9

                      grads = []
        prevs = []
        step_sizes = []
        lr = self.defaults['lr']
        etaminus, etaplus = self.etas
        step_size_min, step_size_max = self.step_sizes

        if len(params) != len(gradients):
            raise ValueError(


Reported by Pylint.