1. Remove the explicit mapping of generics to default types (drop the awkward methods added earlier, such as `PyTypeChecker.trySubstituteByDefaultsOnly` and `PyTypeChecker.getSubstitutionsWithDefaults`) and their usages. All the related logic is now handled in `PyTypeParameterMapping`, as intended.
2. Adjust `PyTypeChecker` to correctly parameterize a class via a constructor call, and take defaults into account in `PyTypeChecker.getSubstitutionsWithUnresolvedReturnGenerics` for methods (see the sketch after this list)
3. Remove the explicit calls to `PyTypingTypeProvider.tryParameterizeClassWithDefaults` in `PyCallExpressionHelper` and `PyReferenceExpressionImpl`, rename this method to `parameterizeClassDefaultAware`, and call it directly in `PyTypingTypeProvider.getReferenceType`
4. Add a new flag to `PyTypeParameterMapping` to be able to correctly match type parameters (see `PyTypeChecker.matchTypeParameters`)
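A minimal sketch of what step 2 enables, assuming `typing_extensions.TypeVar` defaults (PEP 696): type parameters are solved from the constructor call, and the unsolved ones fall back to their defaults.
```
from typing import Generic
from typing_extensions import TypeVar  # TypeVar defaults need typing_extensions or Python 3.13

T = TypeVar("T", default=str)
U = TypeVar("U", default=int)

class Box(Generic[T, U]):
    def __init__(self, item: T) -> None:
        self.item = item

b = Box(42)  # parameterized via the constructor call: T=int, U falls back to its default -> Box[int, int]
x: Box       # no explicit type arguments: defaults give Box[str, int]
```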
GitOrigin-RevId: 5dd90ee3bdf8319b36f1945ce22a33a8edf6bc93
typing.Generic is a magical class that can be specified in any position
in the list of base classes without affecting MRO consistency. This is achieved by
the custom __mro_entries__ implementation in typing._BaseGenericAlias (Python < 3.12),
which skips the Generic entry if other generic classes follow
it in the list of superclasses. For example, it's possible to write the following:
```
from typing import Generic, TypeVar

T = TypeVar("T")

class Base(Generic[T]):
    pass

class MyClass(Generic[T], Base[T]):
    pass
```
which would cause a TypeError for regular classes. Since this broke our implementation
of the C3 algorithm in PyClassImpl.getMROAncestorTypes, we now special-case it by
always moving typing.Generic to the very end of the base class list while constructing
the MRO.
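For reference, a self-contained runtime check of the example above, showing why the linearization stays consistent:
```
from typing import Generic, TypeVar

T = TypeVar("T")

class Base(Generic[T]):
    pass

class MyClass(Generic[T], Base[T]):
    pass

# Generic.__mro_entries__ drops the leading Generic entry because Base[T]
# already brings Generic into the hierarchy, so C3 linearization succeeds:
print(MyClass.__mro__)
# (<class 'MyClass'>, <class 'Base'>, <class 'typing.Generic'>, <class 'object'>)
```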
See https://github.com/python/cpython/blob/3.11/Lib/typing.py#L1298 for a pure-Python
version of typing._BaseGenericAlias.__mro_entries__ and a relevant discussion in
https://github.com/python/cpython/issues/106102.
GitOrigin-RevId: e7d765193d532ab8457133e8fb5ad06840d89225
Also fixes PY-59014, PY-39761.
PyResolveImportUtil returns both .pyi stubs and the corresponding .py files for stub packages
to support partial stub packages. See the line:
```
groupedResults.topResultIs(Priority.STUB_PACKAGE) -> firstResultWithFallback(groupedResults, Priority.STUB_PACKAGE)
```
in PyResolveImportUtil.filterTopPriorityResults.
This means that, for instance, resolving the QuerySet name in type hints yielded QuerySet
definitions from both places. PyTypingTypeProvider.getType() for the reference expression
"QuerySet" then returned a union type containing PyClassTypes for both of them, which we
couldn't parameterize in PyTypingTypeProvider.getParameterizedType, so we returned Any.
It's wrong to interpret multiple declarations as a union type while evaluating type hints.
Unions should only be expressed explicitly with typing.Union or the "|" operator.
This behavior was originally added in PY-18427 as an ad-hoc way to support version checks
for type hints, but now it seems detrimental because it's unclear how to parameterize
such implicit unions of generic types.
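A minimal sketch of the kind of version-guarded definition PY-18427 originally targeted; with the old behavior, a flow-insensitive resolve of "IntList" produced both declarations and merged them into an implicit union:
```
import sys
from typing import List

if sys.version_info >= (3, 9):
    IntList = list[int]
else:
    IntList = List[int]

def total(values: IntList) -> int:
    return sum(values)
```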
Other type checkers also don't treat conditional definitions like that. For instance, for
conditional type aliases, Mypy complains about the name being defined twice and then uses
only the first definition, and Pyright doesn't consider names under conditions other than
version checks as valid type aliases at all. Both type checkers also support partial stub
packages properly.
GitOrigin-RevId: 1ecc7ab5d09625d10850ddc0e1f7761332ccddd5
Add new features: info about underscores in the path, info about the library location, info about the context of the original file (extension type, PSI parents, project size),
and info about already existing imports from the same library (in this file, in other opened files, in other files in the same directory)
GitOrigin-RevId: ca8206d4d7db6bc79e8f1a78502bf33696a653e9
sys.version_info guards are processed at the level of ScopeImpl.collectDeclarations and
PyResolveUtil.scopeCrawlUp in PyReferenceImpl.resolveInner, as implemented in
3318ff79cdcc5ba0ce5e4feb65abad5ad0f4acfa.
However, once we collected all name definition candidates flow-insensitively this way, in
PyReferenceImpl.getResultsFromProcessor, if the reference and these candidates were located
in the same scope, we completely ignored these variants and gathered reachable definitions all
over again from the CFG using PyDefUseUtil.getLatestDefs. The latter didn't consider version
guards at all, so I've added version guard checks directly in PyDefUseUtil.getLatestDefs.
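A minimal sketch of the same-scope case this affects (the guard and the reference live in one function scope):
```
import sys

def make_value():
    if sys.version_info >= (3, 12):
        value = 42
    else:
        value = "42"
    # This reference is in the same scope as both definitions, so it used to be
    # resolved via PyDefUseUtil.getLatestDefs, which ignored the version guard
    # and could pick the branch unreachable for the current interpreter version.
    return value
```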
GitOrigin-RevId: 9f92eecd1eb1812bfbd2bf54f8192f45f0cf0a1d
The refactoring also fixes the problem of wrong types being inferred for contextlib.contextmanager.
GitOrigin-RevId: 1601047786a0c43a463a82225b18635d407bcb40
`dataclass_transform` support posed a number of challenges to the current
AST/PSI stubs separation in our architecture. For the standard "dataclasses"
module and the "attrs" package API, we could rely on well-known names
and defaults to recognize and preserve the information about decorator
arguments and field specifier arguments in PSI stubs. With `dataclass_transform`
however, effectively any decorator call or any base class can indicate
the use of a "magical" API that generates dataclass-like entities.
When building PSI stubs, we can't tell, because we can't leave
the boundaries of the current file to resolve these names and determine whether these
decorators and base classes are themselves decorated with `dataclass_transform`.
To support that, we instead rely on well-known keyword argument names documented
in the spec, e.g. "kw_only" or "frozen". In other words, whenever we encounter
any decorator call or a superclass list with such keyword arguments, we generate
a `PyDataclassStub` for the corresponding class.
The same happens with class attribute initializers: whenever
we see a function call with an argument such as "default" or "kw_only" on the
RHS, we generate a `PyDataclassFieldStub` for the corresponding target expression.
Both of these stub interfaces can now contain null values for the corresponding
properties if they were not specified directly in the class definition.
Finally, for the `dataclass_transform` decorator itself, a new custom decorator stub
was introduced, `PyDataclassTransformDecoratorStub`, which preserves its keyword
arguments, such as "keyword_only_default" or "frozen_default", that control
the default properties of generated dataclasses.
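A rough sketch of the shapes of code these heuristics react to; `model` and `model_field` are made-up names, and the `dataclass_transform` parameters follow PEP 681:
```
from typing import TypeVar, dataclass_transform  # typing_extensions before Python 3.11

T = TypeVar("T")

def model_field(*, default=None, kw_only: bool = False): ...

@dataclass_transform(kw_only_default=True, field_specifiers=(model_field,))
def model(*, frozen: bool = False):
    def wrap(cls: type[T]) -> type[T]:
        return cls
    return wrap

# The decorator call below has the well-known keyword argument "frozen",
# so a PyDataclassStub is generated for Point without resolving "model".
@model(frozen=True)
class Point:
    # These initializers are calls with "default"/"kw_only" keyword arguments,
    # so each target expression gets a PyDataclassFieldStub.
    x: int = model_field(default=0)
    y: int = model_field(kw_only=True)
```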
Later, when we need the complete information about specific dataclass properties,
e.g. in `PyDataclassTypeProvider` to generate a constructor signature, or in
`PyDataclassInspection`, we try to "resolve" this incomplete information from stubs
into finalized `PyDataclassParameters` and `PyDataclassFieldParameters` that
contain non-null versions of the same fields. The main entry points for that
are `resolveDataclassParameters` and `resolveDataclassFieldParameters`.
These methods additionally handle the situations where decorators, superclass
lists and field specifiers lack any keyword arguments, and thus no custom stubs
were created for them automatically.
All the existing usages of `PyDataclassStub` and `PyDataclassFieldStub`
were updated to operate on `PyDataclassParameters` and `PyDataclassFieldParameters`
instead.
Counterparts of the tests on various inspection checks for the standard dataclasses
definitions were added for dataclasses created with `dataclass_transform`, even
though the spec is unclear on some aspects of the expected type checker semantics, e.g.
whether combining "eq=False" and "order=True" or specifying both "default" and
"default_factory" for a field should be reported.
I tried to follow common sense when enabling existing checks for such arbitrary
user-defined dataclass APIs.
GitOrigin-RevId: 4180a1e32b5e4025fc4e3ed49bb8d67af0d60e66
The changes include:
- Inferring implicitly parameterized types for classes if they are generic classes with type parameters that have defaults.
- Inferring partially parameterized types for classes where only some of the Type Parameters with defaults are overridden by explicit parameterization.
- Supporting both old-style and new-style generic Type Aliases, now considering defaults of Type Parameters.
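A sketch of the cases covered, assuming `typing_extensions.TypeVar` defaults; the last line uses the Python 3.13 syntax for a default in a new-style alias:
```
from typing import Generic
from typing_extensions import TypeVar

T = TypeVar("T", default=int)
U = TypeVar("U", default=str)

class Pair(Generic[T, U]): ...

implicit: Pair        # implicitly parameterized: Pair[int, str]
partial: Pair[bool]   # partially parameterized: Pair[bool, str]

OldAlias = Pair[bool]               # old-style alias; U still falls back to str
type NewAlias[V = bytes] = Pair[V]  # new-style alias with a defaulted parameter (Python 3.13)
```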
GitOrigin-RevId: c3a0d7eb4a85585df9638081291ff83b850eb7f6
- Infer default types for TypeVars, ParamSpecs and TypeVarTuples
- Allow explicitly parameterizing ParamSpecs with `[type1, type2, ...]` constructions
- Correctly set declaration elements of Type Parameters inferred from references
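A sketch of the ParamSpec handling described above, with defaults via `typing_extensions`:
```
from typing import Callable, Generic
from typing_extensions import ParamSpec, TypeVar

P = ParamSpec("P", default=[int, str])
R = TypeVar("R", default=bool)

class Runner(Generic[P, R]):
    def __init__(self, fn: Callable[P, R]) -> None:
        self.fn = fn

explicit: Runner[[int, str], bool]  # ParamSpec parameterized with a [type1, type2, ...] list
defaulted: Runner                   # falls back to Runner[[int, str], bool] via the defaults
```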
GitOrigin-RevId: 024392a03f744e08c6dee054bbc2ca42c1f4b19e
In particular:
- Add getters and setters for default types of Type Parameters
- Change the type of declaration elements for Type Parameters from hardcoded PyTargetExpression to PyQualifiedNameOwner to make it possible to set new-style Type Parameters as declaration elements
- Correctly calculate the qualified names of the new-style type alias statements as now they will be used in declaration elements of Type Parameters
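For reference, the new-style syntax these changes concern (Python 3.12):
```
# "T" is a new-style Type Parameter introduced by the type alias statement,
# so the statement's qualified name is used when describing T's declaration.
type StrMap[T] = dict[str, T]
```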
GitOrigin-RevId: 5185d85c1a75052dfcb3f97c0eee17b52540d24b
- PEP-696 adds a new syntax for declaring the default types of Type Parameters in new-style generic classes, functions and type alias statements. Support these grammar changes.
- Store info about default types in stubs for Type Parameters
- Increment the stub version counter in PyFileElementType
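A sketch of the PEP 696 default syntax these grammar changes cover (Python 3.13):
```
# Defaults on new-style type parameters of a class, a function,
# and a type alias statement.
class Box[T = int]:
    value: T

def first[T = object](items: list[T]) -> T:
    return items[0]

type Pairs[K = str, V = int] = dict[K, V]
```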
GitOrigin-RevId: b6b22e3eaa86ce06132885781e5775a89bf4b840