Split the substitution logic into separate branches for ParamSpec,
Concatenate, and ordinary callable parameters instead of the previous "do-all"
loop. This additionally fixed a bug with using TypeVarTuple inside Concatenate.
GitOrigin-RevId: 19e880e4a129ac5bbd0520e26899334f0aa6bb51
The original problem with @contextlib.asynccontextmanager was due to a bug
in PyTypeChecker.substitute introduced in the TypeVarTuple support. Namely,
we started to substitute unmatched ParamSpec types with null, effectively
replacing them in a callable signature with a single parameter of type Any.
Then the special logic in PyCallExpressionHelper.mapArguments that treated
unmatched ParamSpecs as "catch-all" varargs stopped working, and we started
to highlight all extra arguments in the substituted callable invocations.
In other words, binding type parameters from decorator targets, e.g.
ParamSpecs or function return types, never worked because we can't resolve
functions passed as decorator arguments in "de-sugared" expression fragments
in the codeAnalysis context, i.e. when we replace
```
@deco
def f(x: int, y: str): ...
```
with `deco(f)` and then try to infer its type in PyDecoratedFunctionTypeProvider.
However, we didn't report this failure thanks to that special-casing of unmatched
ParamSpecs (other type parameters replaced with Any don't trigger such warnings).
Ideally, we should start resolving references in arguments of function calls
in such virtual expression fragments in some stub-safe manner instead of relying
on this fallback logic. In the general case, however, complete stub-safe inference
for decorators is a hard problem, because arbitrary expressions can affect the
resulting type of the decorated function, e.g.
```
def deco(result: T) -> Callable[[Callable[P, Any]], Callable[P, T]]: ...
@deco(arbitrary_call().foo + 42) # how to handle this without unstubbing?
def f(x: int, y: str): ...
```
GitOrigin-RevId: adeb625611a3ebb7d5db523df00388d619323545
- use more modern API where applicable
- give more descriptive names to variables and methods
- remove unnecessary abstraction of Lists as Collections
- remove a usage of PyClassTypeImpl
- remove a redundant counter variable
- remove casts
- remove now-discouraged final modifiers
- add nullability annotations
GitOrigin-RevId: bfcd08760d1901417a6f662f8ea0205351fd28e5
Traversing through all children of a PyFunction looking for nested functions,
possibly declaring a nonlocal or global variable, and doing that for every
target expression or a reference to one, is both inefficient and seemingly
hardly necessary. For a name, being local means that it cannot be accessed
outside the scope of the corresponding function. Updating such a variable from
some inner helper function doesn't violate this property. In Python, one
has to explicitly mark a name from an enclosing scope as nonlocal or global to
be able to assign to it within a function. It seems enough (and less surprising)
to rely on this information to distinguish between local variables and everything
else.
All in all, if some local variable is accessed as a nonlocal name in an inner
function, it is now still highlighted as local in the function that defines
it (previously it wasn't), but not in the function that declares it as
nonlocal (same as before).
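A small illustration of the rule above (the function names are hypothetical): `count` stays local to the enclosing function, while the inner function must declare it `nonlocal` to rebind it.
```python
def make_counter():
    count = 0  # local in make_counter, even though an inner function rebinds it

    def bump():
        # An explicit nonlocal declaration is required to assign to a name
        # from the enclosing scope; count is not local inside bump.
        nonlocal count
        count += 1
        return count

    bump()
    bump()
    return count
```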
Additionally, we now uniformly highlight as local variables "for" and "with"
statement targets, targets of assignment expressions, names bound in patterns,
and variables in assignments with unpacking (previously it was done only for
trivial assignments).
GitOrigin-RevId: 04c07ae6814a6b531911b3d87a3a26191c934962