PY-72951 Integrate PyCharm with the conformance test suite

GitOrigin-RevId: 1b45a7a94248b06e6610f42af01bb84be9a40bb3
Petr
2024-07-09 21:59:13 +02:00
committed by intellij-monorepo-bot
parent 8926646dfc
commit 4221e99ae6
140 changed files with 12672 additions and 0 deletions


@@ -0,0 +1,254 @@
A. HISTORY OF THE SOFTWARE
==========================
Python was created in the early 1990s by Guido van Rossum at Stichting
Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands
as a successor of a language called ABC. Guido remains Python's
principal author, although it includes many contributions from others.
In 1995, Guido continued his work on Python at the Corporation for
National Research Initiatives (CNRI, see http://www.cnri.reston.va.us)
in Reston, Virginia where he released several versions of the
software.
In May 2000, Guido and the Python core development team moved to
BeOpen.com to form the BeOpen PythonLabs team. In October of the same
year, the PythonLabs team moved to Digital Creations, which became
Zope Corporation. In 2001, the Python Software Foundation (PSF, see
https://www.python.org/psf/) was formed, a non-profit organization
created specifically to own Python-related Intellectual Property.
Zope Corporation was a sponsoring member of the PSF.
All Python releases are Open Source (see http://www.opensource.org for
the Open Source Definition). Historically, most, but not all, Python
releases have also been GPL-compatible; the table below summarizes
the various releases.
    Release         Derived     Year        Owner       GPL-
                    from                                compatible? (1)

    0.9.0 thru 1.2              1991-1995   CWI         yes
    1.3 thru 1.5.2  1.2         1995-1999   CNRI        yes
    1.6             1.5.2       2000        CNRI        no
    2.0             1.6         2000        BeOpen.com  no
    1.6.1           1.6         2001        CNRI        yes (2)
    2.1             2.0+1.6.1   2001        PSF         no
    2.0.1           2.0+1.6.1   2001        PSF         yes
    2.1.1           2.1+2.0.1   2001        PSF         yes
    2.1.2           2.1.1       2002        PSF         yes
    2.1.3           2.1.2       2002        PSF         yes
    2.2 and above   2.1.1       2001-now    PSF         yes
Footnotes:
(1) GPL-compatible doesn't mean that we're distributing Python under
the GPL. All Python licenses, unlike the GPL, let you distribute
a modified version without making your changes open source. The
GPL-compatible licenses make it possible to combine Python with
other software that is released under the GPL; the others don't.
(2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
because its license has a choice of law clause. According to
CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
is "not incompatible" with the GPL.
Thanks to the many outside volunteers who have worked under Guido's
direction to make these releases possible.
B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
===============================================================
PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
--------------------------------------------
1. This LICENSE AGREEMENT is between the Python Software Foundation
("PSF"), and the Individual or Organization ("Licensee") accessing and
otherwise using this software ("Python") in source or binary form and
its associated documentation.
2. Subject to the terms and conditions of this License Agreement, PSF hereby
grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
analyze, test, perform and/or display publicly, prepare derivative works,
distribute, and otherwise use Python alone or in any derivative version,
provided, however, that PSF's License Agreement and PSF's notice of copyright,
i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022 Python Software Foundation;
All Rights Reserved" are retained in Python alone or in any derivative version
prepared by Licensee.
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python.
4. PSF is making Python available to Licensee on an "AS IS"
basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. Nothing in this License Agreement shall be deemed to create any
relationship of agency, partnership, or joint venture between PSF and
Licensee. This License Agreement does not grant permission to use PSF
trademarks or trade name in a trademark sense to endorse or promote
products or services of Licensee, or any third party.
8. By copying, installing or otherwise using Python, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
-------------------------------------------
BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
Individual or Organization ("Licensee") accessing and otherwise using
this software in source or binary form and its associated
documentation ("the Software").
2. Subject to the terms and conditions of this BeOpen Python License
Agreement, BeOpen hereby grants Licensee a non-exclusive,
royalty-free, world-wide license to reproduce, analyze, test, perform
and/or display publicly, prepare derivative works, distribute, and
otherwise use the Software alone or in any derivative version,
provided, however, that the BeOpen Python License is retained in the
Software, alone or in any derivative version prepared by Licensee.
3. BeOpen is making the Software available to Licensee on an "AS IS"
basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
5. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
6. This License Agreement shall be governed by and interpreted in all
respects by the law of the State of California, excluding conflict of
law provisions. Nothing in this License Agreement shall be deemed to
create any relationship of agency, partnership, or joint venture
between BeOpen and Licensee. This License Agreement does not grant
permission to use BeOpen trademarks or trade names in a trademark
sense to endorse or promote products or services of Licensee, or any
third party. As an exception, the "BeOpen Python" logos available at
http://www.pythonlabs.com/logos.html may be used according to the
permissions granted on that web page.
7. By copying, installing or otherwise using the software, Licensee
agrees to be bound by the terms and conditions of this License
Agreement.
CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
---------------------------------------
1. This LICENSE AGREEMENT is between the Corporation for National
Research Initiatives, having an office at 1895 Preston White Drive,
Reston, VA 20191 ("CNRI"), and the Individual or Organization
("Licensee") accessing and otherwise using Python 1.6.1 software in
source or binary form and its associated documentation.
2. Subject to the terms and conditions of this License Agreement, CNRI
hereby grants Licensee a nonexclusive, royalty-free, world-wide
license to reproduce, analyze, test, perform and/or display publicly,
prepare derivative works, distribute, and otherwise use Python 1.6.1
alone or in any derivative version, provided, however, that CNRI's
License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
1995-2001 Corporation for National Research Initiatives; All Rights
Reserved" are retained in Python 1.6.1 alone or in any derivative
version prepared by Licensee. Alternately, in lieu of CNRI's License
Agreement, Licensee may substitute the following text (omitting the
quotes): "Python 1.6.1 is made available subject to the terms and
conditions in CNRI's License Agreement. This Agreement together with
Python 1.6.1 may be located on the internet using the following
unique, persistent identifier (known as a handle): 1895.22/1013. This
Agreement may also be obtained from a proxy server on the internet
using the following URL: http://hdl.handle.net/1895.22/1013".
3. In the event Licensee prepares a derivative work that is based on
or incorporates Python 1.6.1 or any part thereof, and wants to make
the derivative work available to others as provided herein, then
Licensee hereby agrees to include in any such work a brief summary of
the changes made to Python 1.6.1.
4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
INFRINGE ANY THIRD PARTY RIGHTS.
5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
6. This License Agreement will automatically terminate upon a material
breach of its terms and conditions.
7. This License Agreement shall be governed by the federal
intellectual property law of the United States, including without
limitation the federal copyright law, and, to the extent such
U.S. federal law does not apply, by the law of the Commonwealth of
Virginia, excluding Virginia's conflict of law provisions.
Notwithstanding the foregoing, with regard to derivative works based
on Python 1.6.1 that incorporate non-separable material that was
previously distributed under the GNU General Public License (GPL), the
law of the Commonwealth of Virginia shall govern this License
Agreement only as to issues arising under or with respect to
Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
License Agreement shall be deemed to create any relationship of
agency, partnership, or joint venture between CNRI and Licensee. This
License Agreement does not grant permission to use CNRI trademarks or
trade name in a trademark sense to endorse or promote products or
services of Licensee, or any third party.
8. By clicking on the "ACCEPT" button where indicated, or by copying,
installing or otherwise using Python 1.6.1, Licensee agrees to be
bound by the terms and conditions of this License Agreement.
ACCEPT
CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
--------------------------------------------------
Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
The Netherlands. All rights reserved.
Permission to use, copy, modify, and distribute this software and its
documentation for any purpose and without fee is hereby granted,
provided that the above copyright notice appear in all copies and that
both that copyright notice and this permission notice appear in
supporting documentation, and that the name of Stichting Mathematisch
Centrum or CWI not be used in advertising or publicity pertaining to
distribution of the software without specific, written prior
permission.
STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.


@@ -0,0 +1,3 @@
"""
Dummy implementation for the _enums_member_values stub.
"""


@@ -0,0 +1,15 @@
"""
Support stub file for enums_member_values test.
"""
from enum import Enum
# > If the literal values for enum members are not supplied, as they sometimes
# > are not within a type stub file, a type checker can use the type of the
# > _value_ attribute.
class ColumnType(Enum):
_value_: int
DORIC = ...
IONIC = ...
CORINTHIAN = ...


@@ -0,0 +1,3 @@
"""
Dummy implementation for the enums_members test stub file.
"""


@@ -0,0 +1,15 @@
"""
Support stub file for enums_members test.
"""
from enum import Enum
# > Within a type stub, members can be defined using the actual runtime values,
# > or a placeholder of ... can be used
class Pet2(Enum):
genus: str # Non-member attribute
species: str # Non-member attribute
CAT = ... # Member attribute
DOG = ... # Member attribute


@@ -0,0 +1,7 @@
"""
Support file for protocols_modules.py test.
"""
timeout = 100
one_flag = True
other_flag = False


@@ -0,0 +1,11 @@
"""
Support file for protocols_modules.py test.
"""
def on_error(x: int) -> None:
...
def on_success() -> None:
...


@@ -0,0 +1,29 @@
"""
Support stub file for @final tests.
"""
from typing import final, overload
class Base3:
# > For overloaded methods, @final should be placed on the implementation
# > (or on the first overload, for stubs):
@final
@overload
def method(self, x: int) -> int:
...
@overload
def method(self, x: str) -> str:
...
class Base4:
# (Swap the order of overload and final decorators.)
@overload
@final
def method(self, x: int) -> int:
...
@overload
def method(self, x: str) -> str:
...
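# Hypothetical usage of the stubs above (Derived3 is not part of the original
# file): because `method` is marked @final, a conforming type checker should
# reject an attempt to override it in a subclass.
class Derived3(Base3):
    def method(self, x: int) -> int:  # E?: cannot override a method marked @final
        ...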


@@ -0,0 +1,102 @@
"""
Tests explicit type aliases defined with `TypeAlias`.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/aliases.html#typealias
from typing import Any, Callable, Concatenate, Literal, ParamSpec, TypeVar, Union, assert_type
from typing import TypeAlias as TA
S = TypeVar("S")
T = TypeVar("T")
P = ParamSpec("P")
R = TypeVar("R")
GoodTypeAlias1: TA = Union[int, str]
GoodTypeAlias2: TA = int | None
GoodTypeAlias3: TA = list[GoodTypeAlias2]
GoodTypeAlias4: TA = list[T]
GoodTypeAlias5: TA = tuple[T, ...] | list[T]
GoodTypeAlias6: TA = tuple[int, int, S, T]
GoodTypeAlias7: TA = Callable[..., int]
GoodTypeAlias8: TA = Callable[[int, T], T]
GoodTypeAlias9: TA = Callable[Concatenate[int, P], R]
GoodTypeAlias10: TA = Any
GoodTypeAlias11: TA = GoodTypeAlias1 | GoodTypeAlias2 | list[GoodTypeAlias4[int]]
GoodTypeAlias12: TA = Callable[P, None]
GoodTypeAlias13: TA = "int | str"
GoodTypeAlias14: TA = list["int | str"]
GoodTypeAlias15: TA = Literal[3, 4, 5, None]
def good_type_aliases(
p1: GoodTypeAlias1,
p2: GoodTypeAlias2,
p3: GoodTypeAlias3,
p4: GoodTypeAlias4[int],
p5: GoodTypeAlias5[str],
p6: GoodTypeAlias6[int, str],
p7: GoodTypeAlias7,
p8: GoodTypeAlias8[str],
p9: GoodTypeAlias9[[str, str], None],
p10: GoodTypeAlias10,
p11: GoodTypeAlias11,
p12: GoodTypeAlias12,
p13: GoodTypeAlias13,
p14: GoodTypeAlias14,
p15: GoodTypeAlias15,
):
assert_type(p1, int | str)
assert_type(p2, int | None)
assert_type(p3, list[int | None])
assert_type(p4, list[int])
assert_type(p5, tuple[str, ...] | list[str])
assert_type(p6, tuple[int, int, int, str])
assert_type(p7, Callable[..., int])
assert_type(p8, Callable[[int, str], str])
assert_type(p9, Callable[[int, str, str], None])
assert_type(p10, Any)
assert_type(p11, int | str | None | list[list[int]])
assert_type(p12, Callable[..., None])
assert_type(p13, int | str)
assert_type(p14, list[int | str])
assert_type(p15, Literal[3, 4, 5, None])
def good_type_aliases_used_badly(
p1: GoodTypeAlias2[int], # E: type alias is not generic
p2: GoodTypeAlias3[int], # E: type alias is already specialized
p3: GoodTypeAlias4[int, int], # E: too many type arguments
p4: GoodTypeAlias8[int, int], # E: too many type arguments
p5: GoodTypeAlias9[int, int], # E: bad type argument for ParamSpec
):
pass
var1 = 3
# The following should not be allowed as type aliases.
BadTypeAlias1: TA = eval("".join(map(chr, [105, 110, 116]))) # E
BadTypeAlias2: TA = [int, str] # E
BadTypeAlias3: TA = ((int, str),) # E
BadTypeAlias4: TA = [int for i in range(1)] # E
BadTypeAlias5: TA = {"a": "b"} # E
BadTypeAlias6: TA = (lambda: int)() # E
BadTypeAlias7: TA = [int][0] # E
BadTypeAlias8: TA = int if 1 < 3 else str # E
BadTypeAlias9: TA = var1 # E
BadTypeAlias10: TA = True # E
BadTypeAlias11: TA = 1 # E
BadTypeAlias12: TA = list or set # E
BadTypeAlias13: TA = f"{'int'}" # E
ListAlias: TA = list
ListOrSetAlias: TA = list | set
x1: list[str] = ListAlias() # OK
assert_type(x1, list[str])
x2: ListAlias[int] # E: already specialized
x3 = ListOrSetAlias() # E: cannot instantiate union
x4: ListOrSetAlias[int] # E: already specialized


@@ -0,0 +1,135 @@
"""
Tests traditional implicit type aliases.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/aliases.html
from collections.abc import Iterable
from typing import Any, Callable, Concatenate, ParamSpec, TypeVar, Union, assert_type
TFloat = TypeVar("TFloat", bound=float)
Vector = Iterable[tuple[TFloat, TFloat]]
def in_product(v: Vector[TFloat]) -> Iterable[TFloat]:
return [x for x, _ in v]
def dilate(v: Vector[float], scale: float) -> Vector[float]:
return ((x * scale, y * scale) for x, y in v)
# > Type aliases may be as complex as type hints in annotations -- anything
# > that is acceptable as a type hint is acceptable in a type alias.
S = TypeVar("S")
T = TypeVar("T")
P = ParamSpec("P")
R = TypeVar("R")
GoodTypeAlias1 = Union[int, str]
GoodTypeAlias2 = int | None
GoodTypeAlias3 = list[GoodTypeAlias2]
GoodTypeAlias4 = list[T]
GoodTypeAlias5 = tuple[T, ...] | list[T]
GoodTypeAlias6 = tuple[int, int, S, T]
GoodTypeAlias7 = Callable[..., int]
GoodTypeAlias8 = Callable[[int, T], T]
GoodTypeAlias9 = Callable[Concatenate[int, P], R]
GoodTypeAlias10 = Any
GoodTypeAlias11 = GoodTypeAlias1 | GoodTypeAlias2 | list[GoodTypeAlias4[int]]
GoodTypeAlias12 = list[TFloat]
GoodTypeAlias13 = Callable[P, None]
def good_type_aliases(
p1: GoodTypeAlias1,
p2: GoodTypeAlias2,
p3: GoodTypeAlias3,
p4: GoodTypeAlias4[int],
p5: GoodTypeAlias5[str],
p6: GoodTypeAlias6[int, str],
p7: GoodTypeAlias7,
p8: GoodTypeAlias8[str],
p9: GoodTypeAlias9[[str, str], None],
p10: GoodTypeAlias10,
p11: GoodTypeAlias11,
p12: GoodTypeAlias12[bool],
p13: GoodTypeAlias13
):
assert_type(p1, int | str)
assert_type(p2, int | None)
assert_type(p3, list[int | None])
assert_type(p4, list[int])
assert_type(p5, tuple[str, ...] | list[str])
assert_type(p6, tuple[int, int, int, str])
assert_type(p7, Callable[..., int])
assert_type(p8, Callable[[int, str], str])
assert_type(p9, Callable[[int, str, str], None])
assert_type(p10, Any)
assert_type(p11, int | str | None | list[list[int]])
assert_type(p12, list[bool])
assert_type(p13, Callable[..., None])
def good_type_aliases_used_badly(
p1: GoodTypeAlias2[int], # E: type alias is not generic
p2: GoodTypeAlias3[int], # E: type alias is already specialized
p3: GoodTypeAlias4[int, int], # E: too many type arguments
p4: GoodTypeAlias8[int, int], # E: too many type arguments
p5: GoodTypeAlias9[int, int], # E: bad type argument for ParamSpec
p6: GoodTypeAlias12[str], # E: type argument doesn't match bound
):
pass
var1 = 3
# The following should not be considered type aliases.
BadTypeAlias1 = eval("".join(map(chr, [105, 110, 116])))
BadTypeAlias2 = [int, str]
BadTypeAlias3 = ((int, str),)
BadTypeAlias4 = [int for i in range(1)]
BadTypeAlias5 = {"a": "b"}
BadTypeAlias6 = (lambda: int)()
BadTypeAlias7 = [int][0]
BadTypeAlias8 = int if 1 < 3 else str
BadTypeAlias9 = var1
BadTypeAlias10 = True
BadTypeAlias11 = 1
BadTypeAlias12 = list or set
BadTypeAlias13 = f"int"
BadTypeAlias14 = "int | str"
def bad_type_aliases(
p1: BadTypeAlias1, # E: Invalid type annotation
p2: BadTypeAlias2, # E: Invalid type annotation
p3: BadTypeAlias3, # E: Invalid type annotation
p4: BadTypeAlias4, # E: Invalid type annotation
p5: BadTypeAlias5, # E: Invalid type annotation
p6: BadTypeAlias6, # E: Invalid type annotation
p7: BadTypeAlias7, # E: Invalid type annotation
p8: BadTypeAlias8, # E: Invalid type annotation
p9: BadTypeAlias9, # E: Invalid type annotation
p10: BadTypeAlias10, # E: Invalid type annotation
p11: BadTypeAlias11, # E: Invalid type annotation
p12: BadTypeAlias12, # E: Invalid type annotation
p13: BadTypeAlias13, # E: Invalid type annotation
p14: BadTypeAlias14, # E: Invalid type annotation
):
pass
ListAlias = list
ListOrSetAlias = list | set
x1: list[str] = ListAlias() # OK
assert_type(x1, list[str])
x2 = ListAlias[int]() # OK
assert_type(x2, list[int])
x3 = ListOrSetAlias() # E: cannot instantiate union
x4: ListOrSetAlias[int] # E: already specialized


@@ -0,0 +1,62 @@
"""
Tests the `typing.NewType` function.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/aliases.html#newtype
from typing import Any, Hashable, Literal, NewType, TypeVar, TypedDict, assert_type
UserId = NewType("UserId", int)
UserId("user") # E: incorrect type
u1: UserId = 42 # E: incorrect type
u2: UserId = UserId(42) # OK
assert_type(UserId(5) + 1, int)
# > Both isinstance and issubclass, as well as subclassing will fail for
# > NewType('Derived', Base) since function objects don't support these
# > operations.
isinstance(u2, UserId) # E: not allowed in isinstance call
class UserIdDerived(UserId): # E: subclassing not allowed
pass
# > NewType accepts exactly two arguments: a name for the new unique type,
# > and a base class. The latter should be a proper class (i.e., not a type
# > construct like Union, etc.), or another unique type created by
# > calling NewType.
GoodName = NewType("BadName", int) # E: assigned name does not match
GoodNewType1 = NewType("GoodNewType1", list) # OK
GoodNewType2 = NewType("GoodNewType2", GoodNewType1) # OK
nt1: GoodNewType1[int] # E: NewType cannot be generic
TypeAlias1 = dict[str, str]
GoodNewType3 = NewType("GoodNewType3", TypeAlias1)
BadNewType1 = NewType("BadNewType1", int | str) # E: cannot be generic
T = TypeVar("T")
BadNewType2 = NewType("BadNewType2", list[T]) # E: cannot be generic
BadNewType3 = NewType("BadNewType3", Hashable) # E: cannot be protocol
BadNewType4 = NewType("BadNewType4", Literal[7]) # E: literal not allowed
class TD1(TypedDict):
a: int
BadNewType5 = NewType("BadNewType5", TD1) # E: cannot be TypedDict
BadNewType6 = NewType("BadNewType6", int, int) # E: too many arguments
BadNewType7 = NewType("BadNewType7", Any) # E: cannot be Any
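# Hypothetical additional usage (get_user_name is not part of the original
# test): a NewType is treated as a distinct subtype of its base by the checker,
# so a plain int is not accepted where a UserId is expected.
def get_user_name(user_id: UserId) -> str:
    return str(user_id)

get_user_name(UserId(42))  # OK
get_user_name(42)  # E?: int is not assignable to UserId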


@@ -0,0 +1,75 @@
"""
Tests recursive (self-referential) type aliases.
"""
# The typing specification doesn't mandate support for recursive
# (self-referential) type aliases prior to PEP 695, but it also
# doesn't indicate that they shouldn't work.
# Most type checkers now support them, and many libraries and code
# bases have started to rely on them.
# PEP 695 formally mandates that recursive type aliases work.
from typing import Mapping, TypeAlias, TypeVar, Union
Json = Union[None, int, str, float, list["Json"], dict[str, "Json"]]
j1: Json = [1, {"a": 1}] # OK
j2: Json = 3.4 # OK
j3: Json = [1.2, None, [1.2, [""]]] # OK
j4: Json = {"a": 1, "b": 3j} # E: incompatible type
j5: Json = [2, 3j] # E: incompatible type
# This type alias should be equivalent to Json.
Json2 = Union[None, int, str, float, list["Json2"], dict[str, "Json2"]]
def func1(j1: Json) -> Json2:
return j1
RecursiveTuple = str | int | tuple["RecursiveTuple", ...]
t1: RecursiveTuple = (1, 1) # OK
t2: RecursiveTuple = (1, "1") # OK
t3: RecursiveTuple = (1, "1", 1, "2") # OK
t4: RecursiveTuple = (1, ("1", 1), "2") # OK
t5: RecursiveTuple = (1, ("1", 1), (1, (1, 2))) # OK
t6: RecursiveTuple = (1, ("1", 1), (1, (1, [2]))) # E
t6: RecursiveTuple = (1, [1]) # E
RecursiveMapping = str | int | Mapping[str, "RecursiveMapping"]
m1: RecursiveMapping = 1 # OK
m2: RecursiveMapping = "1" # OK
m3: RecursiveMapping = {"1": "1"} # OK
m4: RecursiveMapping = {"1": "1", "2": 1} # OK
m5: RecursiveMapping = {"1": "1", "2": 1, "3": {}} # OK
m6: RecursiveMapping = {"1": "1", "2": 1, "3": {"0": "0", "1": "2", "2": {}}} # OK
m7: RecursiveMapping = {"1": [1]} # E
m8: RecursiveMapping = {"1": "1", "2": 1, "3": [1, 2]} # E
m9: RecursiveMapping = {"1": "1", "2": 1, "3": {"0": "0", "1": 1, "2": [1, 2, 3]}} # E
T1 = TypeVar("T1", str, int)
T2 = TypeVar("T2")
GenericTypeAlias1 = list["GenericTypeAlias1[T1]" | T1]
SpecializedTypeAlias1 = GenericTypeAlias1[str]
g1: SpecializedTypeAlias1 = ["hi", ["hi", "hi"]] # OK
g2: GenericTypeAlias1[str] = ["hi", "bye", [""], [["hi"]]] # OK
g3: GenericTypeAlias1[str] = ["hi", [2.4]] # E
GenericTypeAlias2 = list["GenericTypeAlias2[T1, T2]" | T1 | T2]
g4: GenericTypeAlias2[str, int] = [[3, ["hi"]], "hi"] # OK
g5: GenericTypeAlias2[str, float] = [[3, ["hi", 3.4, [3.4]]], "hi"] # OK
g6: GenericTypeAlias2[str, int] = [[3, ["hi", 3, [3.4]]], "hi"] # E
RecursiveUnion: TypeAlias = Union["RecursiveUnion", int] # E: cyclical reference
# On one line because different type checkers report the error on different lines
MutualReference1: TypeAlias = Union["MutualReference2", int]; MutualReference2: TypeAlias = Union["MutualReference1", str] # E: cyclical reference


@@ -0,0 +1,91 @@
"""
Tests the "type" statement introduced in Python 3.12.
"""
from typing import Callable, TypeVar
type GoodAlias1 = int
type GoodAlias2[S1, *S2, **S3] = Callable[S3, S1] | tuple[*S2]
type GoodAlias3 = GoodAlias2[int, tuple[int, str], ...]
class ClassA:
type GoodAlias4 = int | None
GoodAlias1.bit_count # E: cannot access attribute
GoodAlias1() # E: cannot call alias
print(GoodAlias1.__value__) # OK
print(GoodAlias1.__type_params__) # OK
print(GoodAlias1.other_attrib) # E: unknown attribute
class DerivedInt(GoodAlias1): # E: cannot use alias as base class
pass
def func2(x: object):
if isinstance(x, GoodAlias1): # E: cannot use alias in isinstance
pass
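# Hypothetical usage sketch (func_usage is not part of the original test): even
# though an alias created with the "type" statement cannot be called, subclassed,
# or used with isinstance, it remains a valid annotation.
def func_usage(p: GoodAlias1, q: GoodAlias3) -> None:  # OK
    ...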
var1 = 1
# The following should not be allowed as type aliases.
type BadTypeAlias1 = eval("".join(map(chr, [105, 110, 116]))) # E
type BadTypeAlias2 = [int, str] # E
type BadTypeAlias3 = ((int, str),) # E
type BadTypeAlias4 = [int for i in range(1)] # E
type BadTypeAlias5 = {"a": "b"} # E
type BadTypeAlias6 = (lambda: int)() # E
type BadTypeAlias7 = [int][0] # E
type BadTypeAlias8 = int if 1 < 3 else str # E
type BadTypeAlias9 = var1 # E
type BadTypeAlias10 = True # E
type BadTypeAlias11 = 1 # E
type BadTypeAlias12 = list or set # E
type BadTypeAlias13 = f"{'int'}" # E
if 1 < 2:
type BadTypeAlias14 = int # E: redeclared
else:
type BadTypeAlias14 = int
def func3():
type BadTypeAlias15 = int # E: alias not allowed in function
V = TypeVar("V")
type TA1[K] = dict[K, V] # E: combines old and new TypeVars
T1 = TypeVar("T1")
type TA2 = list[T1] # E: uses old TypeVar
type RecursiveTypeAlias1[T] = T | list[RecursiveTypeAlias1[T]]
r1_1: RecursiveTypeAlias1[int] = 1
r1_2: RecursiveTypeAlias1[int] = [1, [1, 2, 3]]
type RecursiveTypeAlias2[S: int, T: str, **P] = Callable[P, T] | list[S] | list[RecursiveTypeAlias2[S, T, P]]
r2_1: RecursiveTypeAlias2[str, str, ...] # E: not compatible with S bound
r2_2: RecursiveTypeAlias2[int, str, ...]
r2_3: RecursiveTypeAlias2[int, int, ...] # E: not compatible with T bound
r2_4: RecursiveTypeAlias2[int, str, [int, str]]
type RecursiveTypeAlias3 = RecursiveTypeAlias3 # E: circular definition
type RecursiveTypeAlias4[T] = T | RecursiveTypeAlias4[str] # E: circular definition
type RecursiveTypeAlias5[T] = T | list[RecursiveTypeAlias5[T]]
type RecursiveTypeAlias6 = RecursiveTypeAlias7 # E: circular definition
type RecursiveTypeAlias7 = RecursiveTypeAlias6


@@ -0,0 +1,64 @@
"""
Tests the TypeAliasType call introduced in Python 3.12.
"""
from typing import Callable, Generic, ParamSpec, TypeAliasType, TypeVar, TypeVarTuple
S = TypeVar("S")
T = TypeVar("T")
TStr = TypeVar("TStr", bound=str)
P = ParamSpec("P")
Ts = TypeVarTuple("Ts")
my_tuple = (S, T)
var1 = 3
GoodAlias1 = TypeAliasType("GoodAlias1", int)
GoodAlias2 = TypeAliasType("GoodAlias2", list[T], type_params=(T,))
GoodAlias3 = TypeAliasType("GoodAlias3", list[T] | list[S], type_params=(S, T))
GoodAlias4 = TypeAliasType("GoodAlias4", T | list[GoodAlias4[T]], type_params=(T,))
GoodAlias5 = TypeAliasType(
"GoodAlias5",
Callable[P, TStr] | list[S] | list[GoodAlias5[S, TStr, P]] | tuple[*Ts],
type_params=(S, TStr, P, Ts),
)
class ClassA(Generic[T]):
GoodAlias6 = TypeAliasType("GoodAlias6", list[T])
print(GoodAlias1.__value__) # OK
print(GoodAlias1.__type_params__) # OK
print(GoodAlias1.other_attrib) # E: unknown attribute
x1: GoodAlias4[int] = 1 # OK
x2: GoodAlias4[int] = [1] # OK
x3: GoodAlias5[str, str, ..., int, str] # OK
x4: GoodAlias5[int, str, ..., int, str] # OK
x5: GoodAlias5[int, str, [int, str], *tuple[int, str, int]] # OK
x6: GoodAlias5[int, int, ...] # E: incorrect type arguments
BadAlias1 = TypeAliasType("BadAlias1", list[S], type_params=(T,)) # E: S not in scope
BadAlias2 = TypeAliasType("BadAlias2", list[S]) # E: S not in scope
BadAlias3 = TypeAliasType("BadAlias3", int, type_params=my_tuple) # E: not literal tuple
BadAlias4 = TypeAliasType("BadAlias4", BadAlias4) # E: circular dependency
BadAlias5 = TypeAliasType("BadAlias5", T | BadAlias5[str], type_params=(T,)) # E: circular dependency
BadAlias6 = TypeAliasType("BadAlias6", BadAlias7) # E: circular dependency
BadAlias7 = TypeAliasType("BadAlias7", BadAlias6) # E?: circular dependency
# The following are invalid type expressions for a type alias.
BadAlias8 = TypeAliasType("BadAlias8", eval("".join(map(chr, [105, 110, 116])))) # E
BadAlias9 = TypeAliasType("BadAlias9", [int, str]) # E
BadAlias10 = TypeAliasType("BadAlias10", ((int, str),)) # E
BadAlias11 = TypeAliasType("BadAlias11", [int for i in range(1)]) # E
BadAlias12 = TypeAliasType("BadAlias12", {"a": "b"}) # E
BadAlias13 = TypeAliasType("BadAlias13", (lambda: int)()) # E
BadAlias14 = TypeAliasType("BadAlias14", [int][0]) # E
BadAlias15 = TypeAliasType("BadAlias15", int if 1 < 3 else str) # E
BadAlias16 = TypeAliasType("BadAlias16", var1) # E
BadAlias17 = TypeAliasType("BadAlias17", True) # E
BadAlias18 = TypeAliasType("BadAlias18", 1) # E
BadAlias19 = TypeAliasType("BadAlias19", list or set) # E
BadAlias20 = TypeAliasType("BadAlias20", f"{'int'}") # E
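# Hypothetical usage sketch (y1/y2 are not part of the original test): an alias
# created with TypeAliasType can be specialized and used in annotations just like
# one created with the "type" statement.
y1: GoodAlias2[int] = [1, 2, 3]  # OK
y2: GoodAlias2[int] = ["a"]  # E?: list[str] is not assignable to list[int]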


@@ -0,0 +1,45 @@
"""
Tests generic type aliases used in class declarations for
variance incompatibility.
"""
from typing import Generic, TypeVar, TypeAlias
T = TypeVar("T")
T_co = TypeVar("T_co", covariant=True)
T_contra = TypeVar("T_contra", contravariant=True)
class ClassA(Generic[T]):
pass
A_Alias_1: TypeAlias = ClassA[T_co]
A_Alias_2: TypeAlias = A_Alias_1[T_co]
# Specialized type aliases used within a class declaration should
# result in the same variance incompatibility errors as their
# non-aliased counterparts.
class ClassA_1(ClassA[T_co]): # E: incompatible variance
...
class ClassA_2(A_Alias_1[T_co]): # E: incompatible variance
...
class ClassA_3(A_Alias_2[T_co]): # E: incompatible variance
...
class ClassB(Generic[T, T_co]):
pass
B_Alias_1 = ClassB[T_co, T_contra]
class ClassB_1(B_Alias_1[T_contra, T_co]): # E: incompatible variance
...
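# Hypothetical counterpart (ClassA_4 is not part of the original test): when the
# alias is specialized with an invariant TypeVar, matching the variance of the
# underlying class, no error is expected.
class ClassA_4(A_Alias_1[T]):  # OK
    ...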


@@ -0,0 +1,24 @@
"""
Tests for annotating coroutines.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/annotations.html#annotating-generator-functions-and-coroutines
# > Coroutines introduced in PEP 492 are annotated with the same syntax as
# > ordinary functions. However, the return type annotation corresponds to
# > the type of await expression, not to the coroutine type.
from typing import Any, Callable, Coroutine, assert_type
async def func1(ignored: int, /) -> str:
return "spam"
assert_type(func1, Callable[[int], Coroutine[Any, Any, str]])
async def func2() -> None:
x = await func1(42)
assert_type(x, str)
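# Hypothetical additional check (func3 is not part of the original test): calling
# an async function without awaiting it yields the coroutine object itself, whose
# third type parameter reflects the declared return type.
async def func3() -> None:
    coro = func1(42)
    assert_type(coro, Coroutine[Any, Any, str])
    assert_type(await coro, str)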


@@ -0,0 +1,104 @@
"""
Tests the handling of forward references in type annotations.
"""
# > When a type hint contains names that have not been defined yet, that
# > definition may be expressed as a string literal, to be resolved later.
import types
from typing import assert_type
def func1(
p1: "ClassA", p2: "list[ClassA]", p3: list["ClassA"], p4: list["int | ClassA"]
) -> None:
assert_type(p1, ClassA)
assert_type(p2, list[ClassA])
assert_type(p3, list[ClassA])
assert_type(p4, list[ClassA | int])
bad1: ClassA # E: Runtime error: requires quotes
bad2: list[ClassA] # E: Runtime error: requires quotes
bad3: "ClassA" | int # E: Runtime error
bad4: int | "ClassA" # E: Runtime error
class ClassA:
...
# > The string literal should contain a valid Python expression (i.e.,
# > compile(lit, '', 'eval') should be a valid code object).
var1 = 1
# The following should all generate errors because they are not legal type
# expressions, despite being enclosed in quotes.
def invalid_annotations(
p1: "eval(''.join(map(chr, [105, 110, 116])))", # E
p2: "[int, str]", # E
p3: "(int, str)", # E
p4: "[int for i in range(1)]", # E
p5: "{}", # E
p6: "(lambda : int)()", # E
p7: "[int][0]", # E
p8: "int if 1 < 3 else str", # E
p9: "var1", # E
p10: "True", # E
p11: "1", # E
p12: "-1", # E
p13: "int or str", # E
p14: 'f"int"', # E
p15: "types", # E
):
pass
# > It should evaluate without errors once the module has been fully loaded.
# > The local and global namespace in which it is evaluated should be the same
# > namespaces in which default arguments to the same function would be evaluated.
class ClassB:
def method1(self) -> ClassB: # E: Runtime error
return ClassB()
def method2(self) -> "ClassB": # OK
return ClassB()
class ClassC:
...
class ClassD:
ClassC: "ClassC" # OK
ClassF: "ClassF" # E: circular reference
str: "str" = "" # OK
def int(self) -> None: # OK
...
x: "int" = 0 # OK
y: int = 0 # E: Refers to local int, which isn't a legal type expression
assert_type(ClassD.str, str)
assert_type(ClassD.x, int)
# > If a triple quote is used, the string should be parsed as though it is implicitly
# > surrounded by parentheses. This allows newline characters to be
# > used within the string literal.
value: """
int |
str |
list[int]
"""


@@ -0,0 +1,190 @@
"""
Tests for annotating generators.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/annotations.html#annotating-generator-functions-and-coroutines
# The return type of generator functions can be annotated by the generic type
# Generator[yield_type, send_type, return_type] provided by typing.py module.
import asyncio
from typing import (
Any,
AsyncGenerator,
AsyncIterable,
AsyncIterator,
Awaitable,
Callable,
Coroutine,
Generator,
Iterable,
Iterator,
Protocol,
TypeVar,
assert_type,
)
T = TypeVar("T")
class A:
pass
class B:
def should_continue(self) -> bool:
return True
class C:
pass
def generator1() -> Generator[A, B, C]:
cont = B()
while cont.should_continue():
yield A()
return C()
def generator2() -> Generator[A, B, C]: # E: missing return
cont = B()
if cont.should_continue():
return False # E: incompatible return type
while cont.should_continue():
yield 3 # E: incompatible yield type
def generator3() -> Generator[A, int, Any]:
cont = B()
if cont.should_continue():
return 3
while cont.should_continue():
yield 3 # E: Incompatible yield type
def generator4() -> Iterable[A]:
yield A()
return True # E?: No return value expected
def generator5() -> Iterator[A]:
yield B() # E: incompatible yield type
def generator6() -> Generator[None, None, None]:
yield
def generator7() -> Iterator[dict[str, int]]:
yield {"": 0} # OK
def generator8() -> int: # E: incompatible return type
yield None # E
return 0
async def generator9() -> int: # E: incompatible return type
yield None # E
class IntIterator(Protocol):
def __next__(self, /) -> int:
...
def generator15() -> IntIterator: # OK
yield 0
class AsyncIntIterator(Protocol):
def __anext__(self, /) -> Awaitable[int]:
...
async def generator16() -> AsyncIntIterator: # OK
yield 0
def generator17() -> Iterator[A]: # OK
yield from generator17()
def generator18() -> Iterator[B]:
yield from generator17() # E: incompatible generator type
yield from [1] # E: incompatible generator type
def generator19() -> Generator[None, float, None]: # OK
x: float = yield
def generator20() -> Generator[None, int, None]: # OK
yield from generator19()
def generator21() -> Generator[None, int, None]:
x: float = yield
def generator22() -> Generator[None, str, None]:
yield from generator21() # E: incompatible send type
def generator23() -> Iterable[str]: # OK
return
yield "" # Unreachable
async def generator24() -> AsyncIterable[str]: # OK
return
yield "" # Unreachable
def generator25(ints1: list[int], ints2: list[int]) -> Generator[int, None, None]: # OK
yield from ints1
yield from ints2
async def get_data() -> list[int]:
await asyncio.sleep(1)
return [1, 2, 3]
async def generator26(nums: list[int]) -> AsyncGenerator[str, None]:
for n in nums:
await asyncio.sleep(1)
yield f"The number is {n}"
async def generator27() -> AsyncGenerator[str, None]:
data = await get_data()
v1 = generator26(data)
assert_type(v1, AsyncGenerator[str, None])
return v1
async def generator28() -> AsyncIterator[str]:
data = await get_data()
v1 = generator26(data)
assert_type(v1, AsyncGenerator[str, None])
return v1
async def generator29() -> AsyncIterator[int]:
raise NotImplementedError
assert_type(generator29, Callable[[], Coroutine[Any, Any, AsyncIterator[int]]])
async def generator30() -> AsyncIterator[int]:
raise NotImplementedError
yield
assert_type(generator30, Callable[[], AsyncIterator[int]])


@@ -0,0 +1,51 @@
"""
Tests for annotating instance and class methods.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/annotations.html#annotating-instance-and-class-methods
from typing import TypeVar, assert_type
T = TypeVar("T", bound="A")
class A:
def copy(self: T) -> T:
return self
@classmethod
def factory(cls: type[T]) -> T:
return cls()
@staticmethod
def static_method(val: type[T]) -> T:
return val()
class B(A):
...
assert_type(A().copy(), A)
assert_type(A.factory(), A)
assert_type(A.copy(A()), A)
assert_type(B.copy(B()), B)
assert_type(B().copy(), B)
assert_type(B.factory(), B)
# This case is ambiguous in the spec, which does not indicate when
# type binding should be performed. Currently, pyright evaluates
# A here, but mypy evaluates B. Since the spec is not clear, both
# of these are currently acceptable answers.
assert_type(A.copy(B()), A) # E?
# Similarly, this case is ambiguous in the spec. Pyright currently
# generates a type error here, but mypy accepts this.
B.copy(A()) # E?
assert_type(A.static_method(A), A)
assert_type(A.static_method(B), B)
assert_type(B.static_method(B), B)
assert_type(B.static_method(A), A)
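# Hypothetical additional subclass (class C is not part of the original test):
# the self-type pattern keeps propagating through further levels of inheritance.
class C(B):
    ...

assert_type(C().copy(), C)
assert_type(C.factory(), C)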


@@ -0,0 +1,114 @@
"""
Test for type expressions used in annotations.
"""
import abc
import types
from typing import Any, Callable, Tuple, Union, assert_type
# https://typing.readthedocs.io/en/latest/spec/annotations.html#valid-type-expression-forms
def greeting(name: str) -> str:
return "Hello " + name
assert_type(greeting("Monty"), str)
# > Expressions whose type is a subtype of a specific argument type are also accepted for that argument.
class StrSub(str):
...
assert_type(greeting(StrSub("Monty")), str)
# > Type hints may be built-in classes (including those defined in standard library or third-party
# > extension modules), abstract base classes, types available in the types module, and user-defined
# > classes (including those defined in the standard library or third-party modules).
class UserDefinedClass:
...
class AbstractBaseClass(abc.ABC):
@abc.abstractmethod
def abstract_method(self):
...
# The following parameter annotations should all be considered
# valid and not generate errors.
def valid_annotations(
p1: int,
p2: str,
p3: bytes,
p4: bytearray,
p5: memoryview,
p6: complex,
p7: float,
p8: bool,
p9: object,
p10: type,
p11: types.ModuleType,
p12: types.FunctionType,
p13: types.BuiltinFunctionType,
p14: UserDefinedClass,
p15: AbstractBaseClass,
p16: int,
p17: Union[int, str],
p18: None,
p19: list,
p20: list[int],
p21: tuple,
p22: Tuple[int, ...],
p23: Tuple[int, int, str],
p24: Callable[..., int],
p25: Callable[[int, str], None],
p26: Any,
):
assert_type(p17, int | str)
assert_type(p19, list[Any])
assert_type(p20, list[int])
assert_type(p21, tuple[Any, ...])
# > Annotations should be kept simple or static analysis tools may not be able to interpret the values.
var1 = 3
# The following parameter annotations should all be considered
# invalid and generate errors.
def invalid_annotations(
p1: eval("".join(map(chr, [105, 110, 116]))), # E
p2: [int, str], # E
p3: (int, str), # E
p4: [int for i in range(1)], # E
p5: {}, # E
p6: (lambda: int)(), # E
p7: [int][0], # E
p8: int if 1 < 3 else str, # E
p9: var1, # E
p10: True, # E
p11: 1, # E
p12: -1, # E
p13: int or str, # E
p14: f"int", # E
p15: types, # E
):
pass
# > When used in a type hint, the expression None is considered equivalent to type(None).
def takes_None(x: None) -> None:
...
assert_type(takes_None(None), None)


@@ -0,0 +1,189 @@
"""
Tests Callable annotation and parameter annotations for "def" statements.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/callables.html#callable
from typing import (
Any,
Callable,
Concatenate,
ParamSpec,
Protocol,
TypeAlias,
TypeVar,
assert_type,
)
T_contra = TypeVar("T_contra", contravariant=True)
P = ParamSpec("P")
def func1(cb: Callable[[int, str], list[str]]) -> None:
assert_type(cb(1, ""), list[str])
cb(1) # E
cb(1, 2) # E
cb(1, "", 1) # E
# Mypy reports two errors, one for each kwarg.
cb(a=1, b="") # E: bad kwarg 'a'
def func2(cb: Callable[[], dict[str, str]]) -> None:
assert_type(cb(), dict[str, str])
cb(1) # E
# https://typing.readthedocs.io/en/latest/spec/callables.html#meaning-of-in-callable
# > The Callable special form supports the use of ... in place of the list of
# > parameter types. This indicates that the type is consistent with any input
# > signature.
def func3(cb: Callable[..., list[str]]):
assert_type(cb(), list[str])
assert_type(cb(""), list[str])
assert_type(cb(1, ""), list[str])
def func4(*args: int, **kwargs: int) -> None:
assert_type(args, tuple[int, ...])
assert_type(kwargs, dict[str, int])
v1: Callable[int] # E
v2: Callable[int, int] # E
v3: Callable[[], [int]] # E
v4: Callable[int, int, int] # E
v5: Callable[[...], int] # E
def test_cb1(x: int) -> str:
return ""
def test_cb2() -> str:
return ""
cb1: Callable[..., str]
cb1 = test_cb1 # OK
cb1 = test_cb2 # OK
cb2: Callable[[], str] = cb1 # OK
# > A ... can also be used with Concatenate. In this case, the parameters prior
# > to the ... are required to be present in the input signature and be
# > compatible in kind and type, but any additional parameters are permitted.
def test_cb3(a: int, b: int, c: int) -> str:
return ""
def test_cb4(*, a: int) -> str:
return ""
cb3: Callable[Concatenate[int, ...], str]
cb3 = test_cb1 # OK
cb3 = test_cb2 # E
cb3 = test_cb3 # OK
cb3 = test_cb4 # E
# > If the input signature in a function definition includes both a *args and
# > **kwargs parameter and both are typed as Any (explicitly or implicitly
# > because it has no annotation), a type checker should treat this as the
# > equivalent of `...`. Any other parameters in the signature are unaffected
# > and are retained as part of the signature.
class Proto1(Protocol):
def __call__(self, *args: Any, **kwargs: Any) -> None: ...
class Proto2(Protocol):
def __call__(self, a: int, /, *args, **kwargs) -> None: ...
class Proto3(Protocol):
def __call__(self, a: int, *args: Any, **kwargs: Any) -> None: ...
class Proto4(Protocol[P]):
def __call__(self, a: int, *args: P.args, **kwargs: P.kwargs) -> None: ...
class Proto5(Protocol[T_contra]):
def __call__(self, *args: T_contra, **kwargs: T_contra) -> None: ...
class Proto6(Protocol):
def __call__(self, a: int, /, *args: Any, k: str, **kwargs: Any) -> None:
pass
class Proto7(Protocol):
def __call__(self, a: float, /, b: int, *, k: str, m: str) -> None:
pass
class Proto8(Protocol):
def __call__(self) -> None: ...
def func5(
p1: Proto1,
p2: Proto2,
p3: Proto3,
p4: Proto4[...],
p5: Proto5[Any],
p7: Proto7,
p8: Proto8,
c1: Callable[..., None],
c2: Callable[Concatenate[int, ...], None],
):
ok1: Callable[..., None] = p1 # OK
ok2: Proto1 = c1 # OK
ok3: Callable[..., None] = p5 # OK
ok4: Proto5[Any] = c1 # OK
ok5: Callable[Concatenate[int, ...], None] = p2 # OK
ok6: Proto2 = c2 # OK
ok7: Callable[..., None] = p3 # OK
ok8: Proto3 = c1 # OK
ok9: Proto4[...] = p3 # OK
ok10: Proto3 = p4 # OK
ok11: Proto6 = p7 # OK
err1: Proto5[Any] = p8 # E
# > The ... syntax can also be used to provide a specialized value for a
# > ParamSpec in a generic class or type alias.
Callback1: TypeAlias = Callable[P, str]
Callback2: TypeAlias = Callable[Concatenate[int, P], str]
def func6(cb1: Callable[[], str], cb2: Callable[[int], str]) -> None:
f1: Callback1[...] = cb1 # OK
f2: Callback2[...] = cb1 # E
f3: Callback1[...] = cb2 # OK
f4: Callback2[...] = cb2 # OK
# > If ... is used with signature concatenation, the ... portion continues
# > to mean “any conceivable set of parameters that could be compatible”.
CallbackWithInt: TypeAlias = Callable[Concatenate[int, P], str]
CallbackWithStr: TypeAlias = Callable[Concatenate[str, P], str]
def func7(cb: Callable[[int, str], str]) -> None:
f1: Callable[Concatenate[int, ...], str] = cb # OK
f2: Callable[Concatenate[str, ...], str] = cb # E
f3: CallbackWithInt[...] = cb # OK
f4: CallbackWithStr[...] = cb # E


@@ -0,0 +1,123 @@
"""
Tests the use of an unpacked TypedDict for annotating **kwargs.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/callables.html#unpack-for-keyword-arguments
# This sample tests the handling of Unpack[TypedDict] when used with
# a **kwargs parameter in a function signature.
from typing import Protocol, TypeVar, TypedDict, NotRequired, Required, Unpack, assert_type
class TD1(TypedDict):
v1: Required[int]
v2: NotRequired[str]
class TD2(TD1):
v3: Required[str]
def func1(**kwargs: Unpack[TD2]) -> None:
v1 = kwargs["v1"]
assert_type(v1, int)
# > Type checkers may allow reading an item using ``d['x']`` even if
# > the key ``'x'`` is not required
kwargs["v2"] # E?: v2 may not be present
if "v2" in kwargs:
v2 = kwargs["v2"]
assert_type(v2, str)
v3 = kwargs["v3"]
assert_type(v3, str)
def func2(v3: str, **kwargs: Unpack[TD1]) -> None:
# > When Unpack is used, type checkers treat kwargs inside the function
# > body as a TypedDict.
assert_type(kwargs, TD1)
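# Hypothetical companion to func2 (func2b is not part of the original test):
# because kwargs is treated as a TD1, TypedDict-style access such as .get()
# with a default is available inside the body.
def func2b(**kwargs: Unpack[TD1]) -> None:
    assert_type(kwargs["v1"], int)
    assert_type(kwargs.get("v2", ""), str)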
def func3() -> None:
# Mypy reports multiple errors here.
func1() # E: missing required keyword args
func1(v1=1, v2="", v3="5") # OK
td2 = TD2(v1=2, v3="4")
func1(**td2) # OK
func1(v1=1, v2="", v3="5", v4=5) # E: v4 is not in TD2
func1(1, "", "5") # E: args not passed by position
# > Passing a dictionary of type dict[str, object] as a **kwargs argument
# > to a function that has **kwargs annotated with Unpack must generate a
# > type checker error.
my_dict: dict[str, str] = {}
func1(**my_dict) # E: untyped dict
d1 = {"v1": 2, "v3": "4", "v4": 4}
func1(**d1) # E?: OK or Type error (spec allows either)
func2(**td2) # OK
func1(v1=2, **td2) # E: v1 is already specified
func2(1, **td2) # E: v1 is already specified
func2(v1=1, **td2) # E: v1 is already specified
class TDProtocol1(Protocol):
def __call__(self, *, v1: int, v3: str) -> None:
...
class TDProtocol2(Protocol):
def __call__(self, *, v1: int, v3: str, v2: str = "") -> None:
...
class TDProtocol3(Protocol):
def __call__(self, *, v1: int, v2: int, v3: str) -> None:
...
class TDProtocol4(Protocol):
def __call__(self, *, v1: int) -> None:
...
class TDProtocol5(Protocol):
def __call__(self, v1: int, v3: str) -> None:
...
class TDProtocol6(Protocol):
def __call__(self, **kwargs: Unpack[TD2]) -> None:
...
# Specification: https://typing.readthedocs.io/en/latest/spec/callables.html#assignment
v1: TDProtocol1 = func1 # OK
v2: TDProtocol2 = func1 # OK
v3: TDProtocol3 = func1 # E: v2 is wrong type
v4: TDProtocol4 = func1 # E: v3 is missing
v5: TDProtocol5 = func1 # E: params are positional
v6: TDProtocol6 = func1 # OK
def func4(v1: int, /, **kwargs: Unpack[TD2]) -> None:
...
def func5(v1: int, **kwargs: Unpack[TD2]) -> None: # E: parameter v1 overlaps with the TypedDict.
...
T = TypeVar("T", bound=TD2)
# > TypedDict is the only permitted heterogeneous type for typing **kwargs.
# > Therefore, in the context of typing **kwargs, using Unpack with types other
# > than TypedDict should not be allowed and type checkers should generate
# > errors in such cases.
def func6(**kwargs: Unpack[T]) -> None: # E: unpacked value must be a TypedDict, not a TypeVar bound to TypedDict.
...
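# Hypothetical additional case (func7 is not part of the original test): unpacking
# anything other than a TypedDict, such as a tuple type, in a **kwargs annotation
# should likewise be rejected.
def func7(**kwargs: Unpack[tuple[int, str]]) -> None:  # E?: unpacked value must be a TypedDict
    ...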


@@ -0,0 +1,313 @@
"""
Tests handling of callback protocols.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/callables.html#callback-protocols
from typing import Any, Callable, ParamSpec, Protocol, TypeVar, cast, overload
InputT = TypeVar("InputT", contravariant=True)
OutputT = TypeVar("OutputT", covariant=True)
class Proto1(Protocol):
def __call__(self, *vals: bytes, max_len: int | None = None) -> list[bytes]:
...
def cb1_good1(*vals: bytes, max_len: int | None = None) -> list[bytes]:
return []
def cb1_bad1(*vals: bytes, max_items: int | None) -> list[bytes]:
return []
def cb1_bad2(*vals: bytes) -> list[bytes]:
return []
def cb1_bad3(*vals: bytes, max_len: str | None) -> list[bytes]:
return []
cb1: Proto1 = cb1_good1 # OK
cb1 = cb1_bad1 # E: different names
cb1 = cb1_bad2 # E: parameter types
cb1 = cb1_bad3 # E: default argument
class Proto2(Protocol):
def __call__(self, *vals: bytes, **kwargs: str) -> None:
pass
def cb2_good1(*a: bytes, **b: str):
pass
def cb2_bad1(*a: bytes):
pass
def cb2_bad2(*a: str, **b: str):
pass
def cb2_bad3(*a: bytes, **b: bytes):
pass
def cb2_bad4(**b: str):
pass
cb2: Proto2 = cb2_good1 # OK
cb2 = cb2_bad1 # E: missing **kwargs
cb2 = cb2_bad2 # E: parameter type
cb2 = cb2_bad3 # E: parameter type
cb2 = cb2_bad4 # E: missing parameter
class Proto3(Protocol):
def __call__(self) -> None:
pass
cb3: Proto3 = cb2_good1 # OK
cb3 = cb2_bad1 # OK
cb3 = cb2_bad2 # OK
cb3 = cb2_bad3 # OK
cb3 = cb2_bad4 # OK
# A callback protocol with other attributes.
class Proto4(Protocol):
other_attribute: int
def __call__(self, x: int) -> None:
pass
def cb4_bad1(x: int) -> None:
pass
var4: Proto4 = cb4_bad1 # E: missing attribute
class Proto5(Protocol):
def __call__(self, *, a: int, b: str) -> int:
...
def cb5_good1(a: int, b: str) -> int:
return 0
cb5: Proto5 = cb5_good1 # OK
class NotProto6:
def __call__(self, *vals: bytes, maxlen: int | None = None) -> list[bytes]:
return []
def cb6_bad1(*vals: bytes, max_len: int | None = None) -> list[bytes]:
return []
cb6: NotProto6 = cb6_bad1 # E: NotProto6 isn't a protocol class
class Proto7(Protocol[InputT, OutputT]):
def __call__(self, inputs: InputT) -> OutputT:
...
class Class7_1:
# Test for unannotated parameter.
def __call__(self, inputs) -> int:
return 5
cb7_1: Proto7[int, int] = Class7_1() # OK
class Class7_2:
# Test for parameter with type Any.
def __call__(self, inputs: Any) -> int:
return 5
cb7_2: Proto7[int, int] = Class7_2() # OK
class Proto8(Protocol):
@overload
def __call__(self, x: int) -> int:
...
@overload
def __call__(self, x: str) -> str:
...
def __call__(self, x: Any) -> Any:
...
def cb8_good1(x: Any) -> Any:
return x
def cb8_bad1(x: int) -> Any:
return x
cb8: Proto8 = cb8_good1 # OK
cb8 = cb8_bad1 # E: parameter type
P = ParamSpec("P")
R = TypeVar("R", covariant=True)
class Proto9(Protocol[P, R]):
other_attribute: int
def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
...
def decorator1(f: Callable[P, R]) -> Proto9[P, R]:
converted = cast(Proto9[P, R], f)
converted.other_attribute = 1
converted.other_attribute = "str" # E: incompatible type
converted.xxx = 3 # E: unknown attribute
return converted
@decorator1
def cb9_good(x: int) -> str:
return ""
print(cb9_good.other_attribute) # OK
print(cb9_good.other_attribute2) # E: unknown attribute
cb9_good(x=3)
class Proto10(Protocol):
__name__: str
__module__: str
__qualname__: str
__annotations__: dict[str, Any]
def __call__(self) -> None:
...
def cb10_good() -> None:
pass
cb10: Proto10 = cb10_good # OK
class Proto11(Protocol):
def __call__(self, x: int, /, y: str) -> Any:
...
def cb11_good1(x: int, /, y: str, z: None = None) -> Any:
pass
def cb11_good2(x: int, y: str, z: None = None) -> Any:
pass
def cb11_bad1(x: int, y: str, /) -> Any:
pass
cb11: Proto11 = cb11_good1 # OK
cb11 = cb11_good2 # OK
cb11 = cb11_bad1 # E: y is position-only
class Proto12(Protocol):
def __call__(self, *args: Any, kwarg0: Any, kwarg1: Any) -> None:
...
def cb12_good1(*args: Any, kwarg0: Any, kwarg1: Any) -> None:
pass
def cb12_good2(*args: Any, **kwargs: Any) -> None:
pass
def cb12_bad1(*args: Any, kwarg0: Any) -> None:
pass
cb12: Proto12 = cb12_good1 # OK
cb12 = cb12_good2 # OK
cb12 = cb12_bad1 # E: missing kwarg1
class Proto13_Default(Protocol):
# Callback with positional parameter with default arg value
def __call__(self, path: str = ...) -> str:
...
class Proto13_NoDefault(Protocol):
# Callback with positional parameter but no default arg value
def __call__(self, path: str) -> str:
...
def cb13_default(path: str = "") -> str:
return ""
def cb13_no_default(path: str) -> str:
return ""
cb13_1: Proto13_Default = cb13_default # OK
cb13_2: Proto13_Default = cb13_no_default # E: no default
cb13_3: Proto13_NoDefault = cb13_default # OK
cb13_4: Proto13_NoDefault = cb13_no_default # OK
class Proto14_Default(Protocol):
# Callback with keyword parameter with default arg value
def __call__(self, *, path: str = ...) -> str:
...
class Proto14_NoDefault(Protocol):
# Callback with keyword parameter with no default arg value
def __call__(self, *, path: str) -> str:
...
def cb14_default(*, path: str = "") -> str:
return ""
def cb14_no_default(*, path: str) -> str:
return ""
cb14_1: Proto14_Default = cb14_default # OK
cb14_2: Proto14_Default = cb14_no_default # E: no default
cb14_3: Proto14_NoDefault = cb14_default # OK
cb14_4: Proto14_NoDefault = cb14_no_default # OK


@@ -0,0 +1,297 @@
"""
Tests subtyping rules for callables.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/callables.html#subtyping-rules-for-callables
from typing import Callable, ParamSpec, Protocol, TypeAlias, overload
P = ParamSpec("P")
# > Callable types are covariant with respect to their return types but
# > contravariant with respect to their parameter types.
def func1(
cb1: Callable[[float], int],
cb2: Callable[[float], float],
cb3: Callable[[int], int],
) -> None:
f1: Callable[[int], float] = cb1 # OK
f2: Callable[[int], float] = cb2 # OK
f3: Callable[[int], float] = cb3 # OK
f4: Callable[[float], float] = cb1 # OK
f5: Callable[[float], float] = cb2 # OK
f6: Callable[[float], float] = cb3 # E
f7: Callable[[int], int] = cb1 # OK
f8: Callable[[int], int] = cb2 # E
f9: Callable[[int], int] = cb3 # OK
# > Callable A is a subtype of callable B if all keyword-only parameters in B
# > are present in A as either keyword-only parameters or standard (positional
# > or keyword) parameters.
class PosOnly2(Protocol):
def __call__(self, b: int, a: int, /) -> None: ...
class KwOnly2(Protocol):
def __call__(self, *, b: int, a: int) -> None: ...
class Standard2(Protocol):
def __call__(self, a: int, b: int) -> None: ...
def func2(standard: Standard2, pos_only: PosOnly2, kw_only: KwOnly2):
f1: Standard2 = pos_only # E
f2: Standard2 = kw_only # E
f3: PosOnly2 = standard # OK
f4: PosOnly2 = kw_only # E
f5: KwOnly2 = standard # OK
f6: KwOnly2 = pos_only # E
# > If a callable B has a signature with a *args parameter, callable A
# > must also have a *args parameter to be a subtype of B, and the type of
# > B's *args parameter must be a subtype of A's *args parameter.
class NoArgs3(Protocol):
def __call__(self) -> None: ...
class IntArgs3(Protocol):
def __call__(self, *args: int) -> None: ...
class FloatArgs3(Protocol):
def __call__(self, *args: float) -> None: ...
def func3(no_args: NoArgs3, int_args: IntArgs3, float_args: FloatArgs3):
f1: NoArgs3 = int_args # OK
f2: NoArgs3 = float_args # OK
f3: IntArgs3 = no_args # E: missing *args parameter
f4: IntArgs3 = float_args # OK
f5: FloatArgs3 = no_args # E: missing *args parameter
f6: FloatArgs3 = int_args # E: float is not subtype of int
# > If a callable B has a signature with one or more positional-only parameters,
# > a callable A is a subtype of B if A has an *args parameter whose type is a
# > supertype of the types of any otherwise-unmatched positional-only parameters
# > in B.
class PosOnly4(Protocol):
def __call__(self, a: int, b: str, /) -> None: ...
class IntArgs4(Protocol):
def __call__(self, *args: int) -> None: ...
class IntStrArgs4(Protocol):
def __call__(self, *args: int | str) -> None: ...
class StrArgs4(Protocol):
def __call__(self, a: int, /, *args: str) -> None: ...
class Standard4(Protocol):
def __call__(self, a: int, b: str) -> None: ...
def func4(int_args: IntArgs4, int_str_args: IntStrArgs4, str_args: StrArgs4):
f1: PosOnly4 = int_args # E: str is not subtype of int
f2: PosOnly4 = int_str_args # OK
f3: PosOnly4 = str_args # OK
f4: IntStrArgs4 = str_args # E: int | str is not subtype of str
f5: IntStrArgs4 = int_args # E: int | str is not subtype of int
f6: StrArgs4 = int_str_args # OK
f7: StrArgs4 = int_args # E: str is not subtype of int
f8: IntArgs4 = int_str_args # OK
f9: IntArgs4 = str_args # E: int is not subtype of str
f10: Standard4 = int_str_args # E: keyword parameters a and b missing
f11: Standard4 = str_args # E: keyword parameter b missing
# > If a callable B has a signature with a **kwargs parameter (without an
# > unpacked TypedDict type annotation), callable A must also have a **kwargs
# > parameter to be a subtype of B, and the type of B's **kwargs parameter
# > must be a subtype of A's **kwargs parameter.
class NoKwargs5(Protocol):
def __call__(self) -> None: ...
class IntKwargs5(Protocol):
def __call__(self, **kwargs: int) -> None: ...
class FloatKwargs5(Protocol):
def __call__(self, **kwargs: float) -> None: ...
def func5(no_kwargs: NoKwargs5, int_kwargs: IntKwargs5, float_kwargs: FloatKwargs5):
f1: NoKwargs5 = int_kwargs # OK
f2: NoKwargs5 = float_kwargs # OK
f3: IntKwargs5 = no_kwargs # E: missing **kwargs parameter
f4: IntKwargs5 = float_kwargs # OK
f5: FloatKwargs5 = no_kwargs # E: missing **kwargs parameter
f6: FloatKwargs5 = int_kwargs # E: float is not subtype of int
# > If a callable B has a signature with one or more keyword-only parameters,
# > a callable A is a subtype of B if A has a **kwargs parameter whose type
# > is a supertype of the types of any otherwise-unmatched keyword-only
# > parameters in B.
class KwOnly6(Protocol):
def __call__(self, *, a: int, b: str) -> None: ...
class IntKwargs6(Protocol):
def __call__(self, **kwargs: int) -> None: ...
class IntStrKwargs6(Protocol):
def __call__(self, **kwargs: int | str) -> None: ...
class StrKwargs6(Protocol):
def __call__(self, *, a: int, **kwargs: str) -> None: ...
class Standard6(Protocol):
def __call__(self, a: int, b: str) -> None: ...
def func6(
int_kwargs: IntKwargs6, int_str_kwargs: IntStrKwargs6, str_kwargs: StrKwargs6
):
f1: KwOnly6 = int_kwargs # E: str is not subtype of int
f2: KwOnly6 = int_str_kwargs # OK
f3: KwOnly6 = str_kwargs # OK
f4: IntStrKwargs6 = str_kwargs # E: int | str is not subtype of str
f5: IntStrKwargs6 = int_kwargs # E: int | str is not subtype of int
f6: StrKwargs6 = int_str_kwargs # OK
f7: StrKwargs6 = int_kwargs # E: str is not subtype of int
f8: IntKwargs6 = int_str_kwargs # OK
f9: IntKwargs6 = str_kwargs # E: int is not subtype of str
f10: Standard6 = int_str_kwargs # E: Does not accept positional arguments
f11: Standard6 = str_kwargs # E: Does not accept positional arguments
# > A signature that includes *args: P.args, **kwargs: P.kwargs is equivalent
# > to a Callable parameterized by P.
class ProtocolWithP(Protocol[P]):
def __call__(self, *args: P.args, **kwargs: P.kwargs) -> None: ...
TypeAliasWithP: TypeAlias = Callable[P, None]
def func7(proto: ProtocolWithP[P], ta: TypeAliasWithP[P]):
# These two types are equivalent
f1: TypeAliasWithP[P] = proto # OK
f2: ProtocolWithP[P] = ta # OK
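# Illustrative sketch (not part of the original test file): a concrete
# specialization of the two equivalent forms above. The names takes_int_str,
# f_alias_example, and f_proto_example are assumed here for illustration only.
def takes_int_str(x: int, y: str) -> None:
    pass
f_alias_example: TypeAliasWithP[[int, str]] = takes_int_str  # OK
f_proto_example: ProtocolWithP[[int, str]] = takes_int_str  # OK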
# > If a callable A has a parameter x with a default argument value and B is
# > the same as A except that x has no default argument, then A is a subtype
# > of B. A is also a subtype of C if C is the same as A with parameter x
# > removed.
class DefaultArg8(Protocol):
def __call__(self, x: int = 0) -> None: ...
class NoDefaultArg8(Protocol):
def __call__(self, x: int) -> None: ...
class NoX8(Protocol):
def __call__(self) -> None: ...
def func8(default_arg: DefaultArg8, no_default_arg: NoDefaultArg8, no_x: NoX8):
f1: DefaultArg8 = no_default_arg # E
f2: DefaultArg8 = no_x # E
f3: NoDefaultArg8 = default_arg # OK
f4: NoDefaultArg8 = no_x # E
f5: NoX8 = default_arg # OK
f6: NoX8 = no_default_arg # E
# > If a callable A is overloaded with two or more signatures, it is a subtype
# > of callable B if at least one of the overloaded signatures in A is a
# > subtype of B.
class Overloaded9(Protocol):
@overload
def __call__(self, x: int) -> int: ...
@overload
def __call__(self, x: str) -> str: ...
class IntArg9(Protocol):
def __call__(self, x: int) -> int: ...
class StrArg9(Protocol):
def __call__(self, x: str) -> str: ...
class FloatArg9(Protocol):
def __call__(self, x: float) -> float: ...
def func9(overloaded: Overloaded9):
f1: IntArg9 = overloaded # OK
f2: StrArg9 = overloaded # OK
f3: FloatArg9 = overloaded # E
# > If a callable B is overloaded with two or more signatures, callable A is
# > a subtype of B if A is a subtype of all of the signatures in B.
class Overloaded10(Protocol):
@overload
def __call__(self, x: int, y: str) -> float: ...
@overload
def __call__(self, x: str) -> complex: ...
class StrArg10(Protocol):
def __call__(self, x: str) -> complex: ...
class IntStrArg10(Protocol):
def __call__(self, x: int | str, y: str = "") -> int: ...
def func10(int_str_arg: IntStrArg10, str_arg: StrArg10):
f1: Overloaded10 = int_str_arg # OK
f2: Overloaded10 = str_arg # E

View File

@@ -0,0 +1,158 @@
"""
Tests the typing.ClassVar special form.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/class-compat.html#classvar
from typing import (
Annotated,
Any,
Callable,
ClassVar,
Final,
Generic,
ParamSpec,
Protocol,
TypeAlias,
TypeVar,
TypeVarTuple,
assert_type,
cast,
)
from typing import ClassVar as CV
var1 = 3
P = ParamSpec("P")
T = TypeVar("T")
Ts = TypeVarTuple("Ts")
# This is currently commented out because it causes mypy to crash.
# class ClassA(Generic[T, P, Unpack[Ts]]):
class ClassA(Generic[T, P]):
# > ClassVar accepts only a single argument that should be a valid type
bad1: ClassVar[int, str] = cast(Any, 0) # E: too many arguments
bad2: CV[3] = cast(Any, 0) # E: invalid type
bad3: CV[var1] = cast(Any, 0) # E: invalid type
# > Note that a ClassVar parameter cannot include any type variables,
# > regardless of the level of nesting.
bad4: ClassVar[T] = cast(Any, 0) # E: cannot use TypeVar
bad5: ClassVar[list[T]] = cast(Any, 0) # E: cannot use TypeVar
bad6: ClassVar[Callable[P, Any]] = cast(Any, 0) # E: cannot use ParamSpec
# This is currently commented out because it causes mypy to crash.
# bad7: ClassVar[tuple[*Ts]] # E: cannot use TypeVarTuple
bad8: ClassVar[list[str]] = {} # E: type violation in initialization
bad9: Final[ClassVar[int]] = 3 # E: ClassVar cannot be nested with Final
bad10: list[ClassVar[int]] = [] # E: ClassVar cannot be nested
good1: CV[int] = 1
good2: ClassVar[list[str]] = []
good3: ClassVar[Any] = 1
good4: ClassVar = 3.1
good5: Annotated[ClassVar[list[int]], ""] = []
def method1(self, a: ClassVar[int]): # E: ClassVar not allowed here
x: ClassVar[str] = "" # E: ClassVar not allowed here
self.xx: ClassVar[str] = "" # E: ClassVar not allowed here
def method2(self) -> ClassVar[int]: # E: ClassVar not allowed here
return 3
bad11: ClassVar[int] = 3 # E: ClassVar not allowed here
bad12: TypeAlias = ClassVar[str] # E: ClassVar not allowed here
assert_type(ClassA.good1, int)
assert_type(ClassA.good2, list[str])
assert_type(ClassA.good3, Any)
assert_type(ClassA.good4, float)
class BasicStarship:
captain: str = "Picard" # Instance variable with default
damage: int # Instance variable without default
stats: ClassVar[dict[str, int]] = {} # Class variable
class Starship:
captain: str = "Picard"
damage: int
stats: ClassVar[dict[str, int]] = {}
def __init__(self, damage: int, captain: str | None = None):
self.damage = damage
if captain:
self.captain = captain
def hit(self):
Starship.stats["hits"] = Starship.stats.get("hits", 0) + 1
enterprise_d = Starship(3000)
enterprise_d.stats = {} # E: cannot assign via instance
Starship.stats = {} # OK
# > As a matter of convenience (and convention), instance variables can be
# > annotated in __init__ or other methods, rather than in the class.
class Box(Generic[T]):
def __init__(self, content) -> None:
self.content: T = content
# Tests for ClassVar in Protocol
class ProtoA(Protocol):
x: ClassVar[str]
y: ClassVar[str]
z: CV = [""]
class ProtoAImpl:
x = ""
def __init__(self) -> None:
self.y = ""
a: ProtoA = ProtoAImpl() # E: y is not a ClassVar
class ProtoB(Protocol):
x: int = 3
y: int
z: ClassVar[int]
@classmethod
def method1(cls) -> None:
return None
@staticmethod
def method2() -> None:
return None
class ProtoBImpl(ProtoB):
y = 0
z = 0
b_x: int = ProtoBImpl.x
b_y: int = ProtoBImpl.y
b_z: int = ProtoBImpl.z
b_m1 = ProtoBImpl.method1
b_m2 = ProtoBImpl.method2

View File

@@ -0,0 +1,102 @@
"""
Tests the typing.override decorator.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/class-compat.html#override
from typing import Any, Callable, overload, override
def wrapper(func: Callable[..., Any], /) -> Any:
def wrapped(*args: Any, **kwargs: Any) -> Any:
raise NotImplementedError
return wrapped
class ParentA:
def method1(self) -> int:
return 1
@overload
def method2(self, x: int) -> int:
...
@overload
def method2(self, x: str) -> str:
...
def method2(self, x: int | str) -> int | str:
return 0
def method5(self):
pass
class ChildA(ParentA):
@override
def method1(self) -> int: # OK
return 2
@overload
def method2(self, x: int) -> int:
...
@overload
def method2(self, x: str) -> str:
...
def method2(self, x: int | str) -> int | str: # OK
return 0
@override
def method3(self) -> int: # E: no matching signature in ancestor
return 1
@overload # E[method4]
def method4(self, x: int) -> int:
...
@overload
def method4(self, x: str) -> str:
...
@override
def method4(self, x: int | str) -> int | str: # E[method4]: no matching signature in ancestor
return 0
@override
@wrapper
def method5(self): # OK
pass
# > The @override decorator should be permitted anywhere a type checker
# > considers a method to be a valid override, which typically includes not
# > only normal methods but also @property, @staticmethod, and @classmethod.
@staticmethod
@override
def static_method1() -> int: # E: no matching signature in ancestor
return 1
@classmethod
@override
def class_method1(cls) -> int: # E: no matching signature in ancestor
return 1
@property
@override
def property1(self) -> int: # E: no matching signature in ancestor
return 1
# Test the case where the parent derives from Any
class ParentB(Any):
pass
class ChildB(ParentB):
@override
def method1(self) -> None: # OK
pass

View File

@@ -0,0 +1,130 @@
"""
Tests the evaluation of calls to constructors when there is a __init__
method defined.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#init-method
from typing import Any, Generic, Self, TypeVar, assert_type, overload
T = TypeVar("T")
class Class1(Generic[T]):
def __init__(self, x: T) -> None:
pass
# Constructor calls for specialized classes
assert_type(Class1[int](1), Class1[int])
assert_type(Class1[float](1), Class1[float])
Class1[int](1.0) # E
# Constructor calls for non-specialized classes
assert_type(Class1(1), Class1[int])
assert_type(Class1(1.0), Class1[float])
# > If the self parameter within the __init__ method is not annotated, type
# > checkers should infer a type of Self.
class Class2(Generic[T]):
def __init__(self, x: Self | None) -> None:
pass
class Class3(Class2[int]):
pass
Class3(Class3(None)) # OK
Class3(Class2(None)) # E
# > Regardless of whether the self parameter type is explicit or inferred, a
# > type checker should bind the class being constructed to this parameter and
# > report any type errors that arise during binding.
class Class4(Generic[T]):
def __init__(self: "Class4[int]") -> None: ...
Class4() # OK
Class4[int]() # OK
Class4[str]() # E
class Class5(Generic[T]):
@overload
def __init__(self: "Class5[list[int]]", value: int) -> None: ...
@overload
def __init__(self: "Class5[set[str]]", value: str) -> None: ...
@overload
def __init__(self, value: T) -> None:
pass
def __init__(self, value: Any) -> None:
pass
assert_type(Class5(0), Class5[list[int]])
assert_type(Class5[int](3), Class5[int])
assert_type(Class5(""), Class5[set[str]])
assert_type(Class5(3.0), Class5[float])
# > Function-scoped type variables can also be used in the self annotation
# > of an __init__ method to influence the return type of the constructor call.
T1 = TypeVar("T1")
T2 = TypeVar("T2")
V1 = TypeVar("V1")
V2 = TypeVar("V2")
class Class6(Generic[T1, T2]):
def __init__(self: "Class6[V1, V2]", value1: V1, value2: V2) -> None: ...
assert_type(Class6(0, ""), Class6[int, str])
assert_type(Class6[int, str](0, ""), Class6[int, str])
class Class7(Generic[T1, T2]):
def __init__(self: "Class7[V2, V1]", value1: V1, value2: V2) -> None: ...
assert_type(Class7(0, ""), Class7[str, int])
assert_type(Class7[str, int](0, ""), Class7[str, int])
# > Class-scoped type variables should not be used in the self annotation.
class Class8(Generic[T1, T2]):
def __init__(self: "Class8[T2, T1]") -> None: # E
pass
# > If a class does not define a __new__ method or __init__ method and does
# > not inherit either of these methods from a base class other than object,
# > a type checker should evaluate the argument list using the __new__ and
# > __init__ methods from the object class.
class Class9:
pass
class Class10:
pass
class Class11(Class9, Class10):
pass
assert_type(Class11(), Class11)
Class11(1) # E

View File

@@ -0,0 +1,66 @@
"""
Tests the evaluation of calls to constructors when there is a custom
metaclass with a __call__ method.
"""
from typing import NoReturn, Self, TypeVar, assert_type
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#constructor-calls
# Metaclass __call__ method: https://typing.readthedocs.io/en/latest/spec/constructors.html#metaclass-call-method
class Meta1(type):
def __call__(cls, *args, **kwargs) -> NoReturn:
raise TypeError("Cannot instantiate class")
class Class1(metaclass=Meta1):
def __new__(cls, x: int) -> Self:
return super().__new__(cls)
assert_type(Class1(), NoReturn)
class Meta2(type):
def __call__(cls, *args, **kwargs) -> "int | Meta2":
return 1
class Class2(metaclass=Meta2):
def __new__(cls, x: int) -> Self:
return super().__new__(cls)
assert_type(Class2(), int | Meta2)
T = TypeVar("T")
class Meta3(type):
def __call__(cls: type[T], *args, **kwargs) -> T:
return super().__call__(*args, **kwargs)
class Class3(metaclass=Meta3):
def __new__(cls, x: int) -> Self:
return super().__new__(cls)
Class3() # E: Missing argument for 'x' parameter in __new__
assert_type(Class3(1), Class3)
class Meta4(type):
def __call__(cls, *args, **kwargs):
return super().__call__(*args, **kwargs)
class Class4(metaclass=Meta4):
def __new__(cls, x: int) -> Self:
return super().__new__(cls)
Class4() # E: Missing argument for 'x' parameter in __new__
assert_type(Class4(1), Class4)

View File

@@ -0,0 +1,145 @@
"""
Tests the evaluation of calls to constructors when there is a __new__
method defined.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#new-method
from typing import Any, Generic, NoReturn, Self, TypeVar, assert_type
T = TypeVar("T")
class Class1(Generic[T]):
def __new__(cls, x: T) -> Self:
return super().__new__(cls)
assert_type(Class1[int](1), Class1[int])
assert_type(Class1[float](1), Class1[float])
Class1[int](1.0) # E
assert_type(Class1(1), Class1[int])
assert_type(Class1(1.0), Class1[float])
class Class2(Generic[T]):
def __new__(cls, *args, **kwargs) -> Self:
return super().__new__(cls)
def __init__(self, x: T) -> None:
pass
assert_type(Class2(1), Class2[int])
assert_type(Class2(""), Class2[str])
class Class3:
def __new__(cls) -> int:
return 0
# In this case, the __init__ method should not be considered
# by the type checker when evaluating a constructor call.
def __init__(self, x: int):
pass
assert_type(Class3(), int)
# > For purposes of this test, an explicit return type of Any (or a union
# > containing Any) should be treated as a type that is not an instance of
# > the class being constructed.
class Class4:
def __new__(cls) -> "Class4 | Any":
return 0
def __init__(self, x: int):
pass
assert_type(Class4(), Class4 | Any)
class Class5:
def __new__(cls) -> NoReturn:
raise NotImplementedError
def __init__(self, x: int):
pass
try:
assert_type(Class5(), NoReturn)
except:
pass
class Class6:
def __new__(cls) -> "int | Class6":
return 0
def __init__(self, x: int):
pass
assert_type(Class6(), int | Class6)
# > If the return type of __new__ is not annotated, a type checker may assume
# > that the return type is Self and proceed with the assumption that the
# > __init__ method will be called.
class Class7:
def __new__(cls, *args, **kwargs):
return super().__new__(cls, *args, **kwargs)
def __init__(self, x: int):
pass
assert_type(Class7(1), Class7)
# > If the class is generic, it is possible for a __new__ method to override
# > the specialized class type and return a class instance that is specialized
# > with different type arguments.
class Class8(Generic[T]):
def __new__(cls, *args, **kwargs) -> "Class8[list[T]]": ...
assert_type(Class8[int](), Class8[list[int]])
assert_type(Class8[str](), Class8[list[str]])
# > If the cls parameter within the __new__ method is not annotated,
# > type checkers should infer a type of type[Self].
class Class9(Generic[T]):
def __new__(cls, *args, **kwargs) -> Self: ...
class Class10(Class9[int]):
pass
c10: Class9[int] = Class10()
# > Regardless of whether the type of the cls parameter is explicit or
# > inferred, the type checker should bind the class being constructed to
# > the cls parameter and report any type errors that arise during binding.
class Class11(Generic[T]):
def __new__(cls: "type[Class11[int]]") -> "Class11[int]": ...
Class11() # OK
Class11[int]() # OK
Class11[str]() # E

View File

@@ -0,0 +1,82 @@
"""
Tests the evaluation of calls to constructors when the type is type[T].
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#constructor-calls-for-type-t
# > When a value of type type[T] (where T is a concrete class or a type
# > variable) is called, a type checker should evaluate the constructor
# > call as if it is being made on the class T.
from typing import Self, TypeVar
T = TypeVar("T")
class Meta1(type):
# Ignore possible errors related to incompatible override
def __call__(cls: type[T], x: int, y: str) -> T: # type: ignore
return type.__call__(cls)
class Class1(metaclass=Meta1):
def __new__(cls, *args, **kwargs) -> Self:
return super().__new__(cls)
def func1(cls: type[Class1]):
cls(x=1, y="") # OK
cls() # E
class Class2:
def __new__(cls, x: int, y: str) -> Self:
return super().__new__(cls)
def func2(cls: type[Class2]):
cls(x=1, y="") # OK
cls() # E
class Class3:
def __init__(self, x: int, y: str) -> None:
pass
def func3(cls: type[Class3]):
cls(x=1, y="") # OK
cls() # E
class Class4:
pass
def func4(cls: type[Class4]):
cls() # OK
cls(1) # E
def func5(cls: type[T]):
cls() # OK
cls(1) # E
T1 = TypeVar("T1", bound=Class1)
def func6(cls: type[T1]):
cls(x=1, y="") # OK
cls() # E
T2 = TypeVar("T2", bound=Class2)
def func7(cls: type[T2]):
cls(1, "") # OK
cls(x=1, y="") # OK
cls(1) # E
cls(1, 2) # E

View File

@@ -0,0 +1,195 @@
"""
Tests the conversion of constructors into Callable types.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#converting-a-constructor-to-callable
from typing import (
Any,
Callable,
Generic,
NoReturn,
ParamSpec,
Self,
TypeVar,
assert_type,
overload,
reveal_type,
)
P = ParamSpec("P")
R = TypeVar("R")
T = TypeVar("T")
def accepts_callable(cb: Callable[P, R]) -> Callable[P, R]:
return cb
class Class1:
def __init__(self, x: int) -> None:
pass
r1 = accepts_callable(Class1)
reveal_type(r1) # `def (x: int) -> Class1`
assert_type(r1(1), Class1)
r1() # E
r1(y=1) # E
class Class2:
"""No __new__ or __init__"""
pass
r2 = accepts_callable(Class2)
reveal_type(r2) # `def () -> Class2`
assert_type(r2(), Class2)
r2(1) # E
class Class3:
"""__new__ and __init__"""
def __new__(cls, *args, **kwargs) -> Self: ...
def __init__(self, x: int) -> None: ...
r3 = accepts_callable(Class3)
reveal_type(r3) # `def (x: int) -> Class3`
assert_type(r3(3), Class3)
r3() # E
r3(y=1) # E
r3(1, 2) # E
class Class4:
"""__new__ but no __init__"""
def __new__(cls, x: int) -> int: ...
r4 = accepts_callable(Class4)
reveal_type(r4) # `def (x: int) -> int`
assert_type(r4(1), int)
r4() # E
r4(y=1) # E
class Meta1(type):
def __call__(cls, *args: Any, **kwargs: Any) -> NoReturn:
raise NotImplementedError("Class not constructable")
class Class5(metaclass=Meta1):
"""Custom metaclass that overrides type.__call__"""
def __new__(cls, *args: Any, **kwargs: Any) -> Self:
"""This __new__ is ignored for purposes of conversion"""
return super().__new__(cls)
r5 = accepts_callable(Class5)
reveal_type(r5) # `def (*args: Any, **kwargs: Any) -> NoReturn`
try:
assert_type(r5(), NoReturn)
except:
pass
try:
assert_type(r5(1, x=1), NoReturn)
except:
pass
class Class6Proxy: ...
class Class6:
"""__new__ that causes __init__ to be ignored"""
def __new__(cls) -> Class6Proxy:
return Class6Proxy.__new__(cls)
def __init__(self, x: int) -> None:
"""This __init__ is ignored for purposes of conversion"""
pass
r6 = accepts_callable(Class6)
reveal_type(r6) # `def () -> Class6Proxy`
assert_type(r6(), Class6Proxy)
r6(1) # E
class Class6Any:
"""__new__ that causes __init__ to be ignored via Any"""
def __new__(cls) -> Any:
return super().__new__(cls)
def __init__(self, x: int) -> None:
"""This __init__ is ignored for purposes of conversion"""
pass
r6_any = accepts_callable(Class6Any)
reveal_type(r6_any) # `def () -> Any`
assert_type(r6_any(), Any)
r6_any(1) # E
# > If the __init__ or __new__ method is overloaded, the callable type should
# > be synthesized from the overloads. The resulting callable type itself will
# > be overloaded.
class Class7(Generic[T]):
@overload
def __init__(self: "Class7[int]", x: int) -> None: ...
@overload
def __init__(self: "Class7[str]", x: str) -> None: ...
def __init__(self, x: int | str) -> None:
pass
r7 = accepts_callable(Class7)
reveal_type(
r7
) # overload of `def (x: int) -> Class7[int]` and `def (x: str) -> Class7[str]`
assert_type(r7(0), Class7[int])
assert_type(r7(""), Class7[str])
# > If the class is generic, the synthesized callable should include any
# > class-scoped type parameters that appear within the signature, but these
# > type parameters should be converted to function-scoped type parameters
# > for the callable. Any function-scoped type parameters in the __init__
# > or __new__ method should also be included as function-scoped type parameters
# > in the synthesized callable.
class Class8(Generic[T]):
def __new__(cls, x: list[T], y: list[T]) -> Self:
return super().__new__(cls)
r8 = accepts_callable(Class8)
reveal_type(r8) # `def [T] (x: list[T], y: list[T]) -> Class8[T]`
assert_type(r8([""], [""]), Class8[str])
r8([1], [""]) # E
class Class9:
def __init__(self, x: list[T], y: list[T]) -> None:
pass
r9 = accepts_callable(Class9)
reveal_type(r9) # `def [T] (x: list[T], y: list[T]) -> Class9`
assert_type(r9([""], [""]), Class9)
r9([1], [""]) # E

View File

@@ -0,0 +1,26 @@
"""
Tests consistency checks between __new__ and __init__ methods.
"""
# pyright: reportInconsistentConstructor=true
# Specification: https://typing.readthedocs.io/en/latest/spec/constructors.html#consistency-of-new-and-init
# Note: This functionality is optional in the typing spec, and conformant
# type checkers are not required to implement it.
# > Type checkers may optionally validate that the __new__ and __init__
# > methods for a class have consistent signatures.
from typing import Self
class Class1:
def __new__(cls) -> Self:
return super().__new__(cls)
# Type error: __new__ and __init__ have inconsistent signatures
def __init__(self, x: str) -> None: # E?
pass
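# Illustrative sketch (not part of the original test file): a class whose
# __new__ and __init__ accept the same parameters should pass this optional
# consistency check. The name Class2 is assumed here for illustration.
class Class2:
    def __new__(cls, x: str) -> Self:
        return super().__new__(cls)
    def __init__(self, x: str) -> None:
        pass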

View File

@@ -0,0 +1,68 @@
"""
Tests the handling of descriptors within a dataclass.
"""
# This portion of the dataclass spec is under-specified in the documentation,
# but its behavior can be determined from the runtime implementation.
from dataclasses import dataclass
from typing import Any, Generic, TypeVar, assert_type, overload
T = TypeVar("T")
class Desc1:
@overload
def __get__(self, __obj: None, __owner: Any) -> "Desc1":
...
@overload
def __get__(self, __obj: object, __owner: Any) -> int:
...
def __get__(self, __obj: object | None, __owner: Any) -> "int | Desc1":
...
def __set__(self, __obj: object, __value: int) -> None:
...
@dataclass
class DC1:
y: Desc1 = Desc1()
dc1 = DC1(3)
assert_type(dc1.y, int)
assert_type(DC1.y, Desc1)
class Desc2(Generic[T]):
@overload
def __get__(self, instance: None, owner: Any) -> list[T]:
...
@overload
def __get__(self, instance: object, owner: Any) -> T:
...
def __get__(self, instance: object | None, owner: Any) -> list[T] | T:
...
@dataclass
class DC2:
x: Desc2[int]
y: Desc2[str]
z: Desc2[str] = Desc2()
assert_type(DC2.x, list[int])
assert_type(DC2.y, list[str])
assert_type(DC2.z, list[str])
dc2 = DC2(Desc2(), Desc2(), Desc2())
assert_type(dc2.x, int)
assert_type(dc2.y, str)
assert_type(dc2.z, str)

View File

@@ -0,0 +1,38 @@
"""
Tests the handling of ClassVar and Final in dataclasses.
"""
from dataclasses import dataclass
from typing import assert_type, ClassVar, Final
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html#dataclass-semantics
# > A final class variable on a dataclass must be explicitly annotated as
# > e.g. x: ClassVar[Final[int]] = 3.
@dataclass
class D:
final_no_default: Final[int]
final_with_default: Final[str] = "foo"
final_classvar: ClassVar[Final[int]] = 4
# we don't require support for Final[ClassVar[...]] because the dataclasses
# runtime implementation won't recognize it as a ClassVar either
# An explicitly marked ClassVar can be accessed on the class:
assert_type(D.final_classvar, int)
# ...but not assigned to, because it's Final:
D.final_classvar = 10 # E: can't assign to final attribute
# A non-ClassVar attribute (with or without default) is a dataclass field:
d = D(final_no_default=1, final_with_default="bar")
assert_type(d.final_no_default, int)
assert_type(d.final_with_default, str)
# ... but can't be assigned to (on the class or on an instance):
d.final_no_default = 10 # E: can't assign to final attribute
d.final_with_default = "baz" # E: can't assign to final attribute
D.final_no_default = 10 # E: can't assign to final attribute
D.final_with_default = "baz" # E: can't assign to final attribute

View File

@@ -0,0 +1,41 @@
"""
Tests validation of frozen dataclass instances.
"""
# Specification: https://peps.python.org/pep-0557/#frozen-instances
from dataclasses import dataclass
@dataclass(frozen=True)
class DC1:
a: float
b: str
dc1 = DC1(1, "")
dc1.a = 1 # E: dataclass is frozen
dc1.b = "" # E: dataclass is frozen
# This should generate an error because a non-frozen dataclass
# cannot inherit from a frozen dataclass.
@dataclass # E[DC2]
class DC2(DC1): # E[DC2]
pass
@dataclass
class DC3:
a: int
# This should generate an error because a frozen dataclass
# cannot inherit from a non-frozen dataclass.
@dataclass(frozen=True) # E[DC4]
class DC4(DC3): # E[DC4]
pass
@dataclass(frozen=True)
class DC1Child(DC1):
# This should be allowed because attributes within a frozen
# dataclass are covariant rather than invariant.
a: int

View File

@@ -0,0 +1,70 @@
"""
Tests the synthesis of the __hash__ method in a dataclass.
"""
from dataclasses import dataclass
from typing import Hashable
@dataclass
class DC1:
a: int
# This should generate an error because DC1 isn't hashable.
v1: Hashable = DC1(0) # E
@dataclass(eq=True, frozen=True)
class DC2:
a: int
v2: Hashable = DC2(0)
@dataclass(eq=True)
class DC3:
a: int
# This should generate an error because DC3 isn't hashable.
v3: Hashable = DC3(0) # E
@dataclass(frozen=True)
class DC4:
a: int
v4: Hashable = DC4(0)
@dataclass(eq=True, unsafe_hash=True)
class DC5:
a: int
v5: Hashable = DC5(0)
@dataclass(eq=True)
class DC6:
a: int
def __hash__(self) -> int:
return 0
v6: Hashable = DC6(0)
@dataclass(frozen=True)
class DC7:
a: int
def __eq__(self, other) -> bool:
return self.a == other.a
v7: Hashable = DC7(0)
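# Illustrative sketch (not part of the original test file): with eq=False,
# __hash__ is left untouched, so the dataclass inherits object.__hash__ and
# remains hashable. The name DC8 is assumed here for illustration.
@dataclass(eq=False)
class DC8:
    a: int
v8: Hashable = DC8(0)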

View File

@@ -0,0 +1,64 @@
"""
Tests inheritance rules for dataclasses.
"""
# Specification: https://peps.python.org/pep-0557/#inheritance
from dataclasses import dataclass
from typing import Any, ClassVar
@dataclass
class DC1:
a: int
b: str = ""
@dataclass
class DC2(DC1):
b: str = ""
a: int = 1
dc2_1 = DC2(1, "")
dc2_2 = DC2()
@dataclass
class DC3:
x: float = 15.0
y: str = ""
@dataclass
class DC4(DC3):
z: tuple[int] = (10,)
x: float = 15
dc4_1 = DC4(0.0, "", (1,))
@dataclass
class DC5:
# This should generate an error because a default value of
# type list, dict, or set generates a runtime error.
x: list[int] = []
@dataclass
class DC6:
x: int
y: ClassVar[int] = 1
@dataclass
class DC7(DC6):
# This should generate an error because a ClassVar cannot override
# an instance variable of the same name.
x: ClassVar[int] # E
# This should generate an error because an instance variable cannot
# override a class variable of the same name.
y: int # E

View File

@@ -0,0 +1,62 @@
"""
Tests the keyword-only feature of dataclass added in Python 3.10.
"""
# Specification: https://docs.python.org/3/library/dataclasses.html#module-contents
from dataclasses import dataclass, KW_ONLY, field
@dataclass
class DC1:
a: str
_: KW_ONLY
b: int = 0
DC1("hi")
DC1(a="hi")
DC1(a="hi", b=1)
DC1("hi", b=1)
# This should generate an error because "b" is keyword-only.
DC1("hi", 1) # E
@dataclass
class DC2:
b: int = field(kw_only=True, default=3)
a: str
DC2("hi")
DC2(a="hi")
DC2(a="hi", b=1)
DC2("hi", b=1)
# This should generate an error because "b" is keyword-only.
DC2("hi", 1) # E
@dataclass(kw_only=True)
class DC3:
a: str = field(kw_only=False)
b: int = 0
DC3("hi")
DC3(a="hi")
DC3(a="hi", b=1)
DC3("hi", b=1)
# This should generate an error because "b" is keyword-only.
DC3("hi", 1) # E
@dataclass
class DC4(DC3):
c: float
DC4("", 0.2, b=3)
DC4(a="", b=3, c=0.2)

View File

@@ -0,0 +1,54 @@
"""
Tests the synthesized comparison methods for dataclasses.
"""
from dataclasses import dataclass
@dataclass(order=True)
class DC1:
a: str
b: int
@dataclass(order=True)
class DC2:
a: str
b: int
dc1_1 = DC1("", 0)
dc1_2 = DC1("", 0)
if dc1_1 < dc1_2:
pass
if dc1_1 <= dc1_2:
pass
if dc1_1 > dc1_2:
pass
if dc1_1 >= dc1_2:
pass
if dc1_1 == dc1_2:
pass
if dc1_1 != dc1_2:
pass
if dc1_1 == None:
pass
if dc1_1 != None:
pass
dc2_1 = DC2("hi", 2)
# This should generate an error because the types are
# incompatible.
if dc1_1 < dc2_1: # E:
pass
if dc1_1 != dc2_1:
pass
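# Illustrative sketch (not part of the original test file): with the default
# order=False, no ordering methods are synthesized, so an ordered comparison
# should be rejected. The name DC3 is assumed here for illustration.
@dataclass
class DC3:
    a: str
if DC3("") < DC3(""):  # E?: no __lt__ is synthesized when order=False
    pass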

View File

@@ -0,0 +1,55 @@
"""
Tests type checking of the __post_init__ method in a dataclass.
"""
# Specification: https://peps.python.org/pep-0557/#post-init-processing
from dataclasses import InitVar, dataclass, field, replace
from typing import assert_type
@dataclass
class DC1:
a: int
b: int
x: InitVar[int]
c: int
y: InitVar[str]
def __post_init__(self, x: int, y: int) -> None: # E: wrong type for y
pass
dc1 = DC1(1, 2, 3, 4, "")
assert_type(dc1.a, int)
assert_type(dc1.b, int)
assert_type(dc1.c, int)
print(dc1.x) # E: cannot access InitVar
print(dc1.y) # E: cannot access InitVar
@dataclass
class DC2:
x: InitVar[int]
y: InitVar[str]
def __post_init__(self, x: int) -> None: # E: missing y
pass
@dataclass
class DC3:
_name: InitVar[str] = field()
name: str = field(init=False)
def __post_init__(self, _name: str):
...
@dataclass
class DC4(DC3):
_age: InitVar[int] = field()
age: int = field(init=False)
def __post_init__(self, _name: str, _age: int):
...

View File

@@ -0,0 +1,69 @@
"""
Tests the slots functionality of dataclass added in Python 3.10.
"""
# Specification: https://docs.python.org/3/library/dataclasses.html#module-contents
from dataclasses import dataclass
# This should generate an error because __slots__ is already defined.
@dataclass(slots=True) # E[DC1]
class DC1: # E[DC1]
x: int
__slots__ = ()
@dataclass(slots=True)
class DC2:
x: int
def __init__(self):
self.x = 3
# This should generate an error because "y" is not in slots.
self.y = 3 # E
@dataclass(slots=False)
class DC3:
x: int
__slots__ = ("x",)
def __init__(self):
self.x = 3
# This should generate an error because "y" is not in slots.
self.y = 3 # E
@dataclass
class DC4:
__slots__ = ("y", "x")
x: int
y: str
DC4(1, "bar")
@dataclass(slots=True)
class DC5:
a: int
DC5.__slots__
DC5(1).__slots__
@dataclass
class DC6:
a: int
# This should generate an error because __slots__ is not defined.
DC6.__slots__ # E
# This should generate an error because __slots__ is not defined.
DC6(1).__slots__ # E

View File

@@ -0,0 +1,119 @@
"""
Tests the dataclass_transform mechanism when it is applied to a base class.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html#the-dataclass-transform-decorator
from typing import Any, Generic, TypeVar, dataclass_transform
T = TypeVar("T")
class ModelField:
def __init__(self, *, init: bool = True, default: Any | None = None) -> None:
...
def model_field(
*, init: bool = True, default: Any | None = None, alias: str | None = None
) -> Any:
...
@dataclass_transform(
kw_only_default=True,
field_specifiers=(ModelField, model_field),
)
class ModelBase:
not_a_field: str
def __init_subclass__(
cls,
*,
frozen: bool = False,
kw_only: bool = True,
order: bool = True,
) -> None:
...
class Customer1(ModelBase, frozen=True):
id: int = model_field()
name: str = model_field()
name2: str = model_field(alias="other_name", default="None")
# This should generate an error because a non-frozen dataclass cannot
# derive from a frozen one.
class Customer1Subclass(Customer1): # E
salary: float = model_field()
class Customer2(ModelBase, order=True):
id: int
name: str = model_field(default="None")
c1_1 = Customer1(id=3, name="Sue", other_name="Susan")
# This should generate an error because the class is frozen.
c1_1.id = 4 # E
# This should generate an error because the class is kw_only.
c1_2 = Customer1(3, "Sue") # E
c1_3 = Customer1(id=3, name="John")
# This should generate an error because comparison methods are
# not synthesized.
v1 = c1_1 < c1_2 # E
c2_1 = Customer2(id=0, name="John")
c2_2 = Customer2(id=1)
v2 = c2_1 < c2_2
# This should generate an error because Customer2 supports
# keyword-only parameters for its constructor.
c2_3 = Customer2(0, "John") # E
@dataclass_transform(
kw_only_default=True,
field_specifiers=(ModelField, model_field),
)
class GenericModelBase(Generic[T]):
not_a_field: T
def __init_subclass__(
cls,
*,
frozen: bool = False,
kw_only: bool = True,
order: bool = True,
) -> None:
...
class GenericCustomer(GenericModelBase[int]):
id: int = model_field()
gc_1 = GenericCustomer(id=3)
@dataclass_transform(frozen_default=True)
class ModelBaseFrozen:
not_a_field: str
class Customer3(ModelBaseFrozen):
id: int
name: str
c3_1 = Customer3(id=2, name="hi")
# This should generate an error because Customer3 is frozen.
c3_1.id = 4 # E

View File

@@ -0,0 +1,77 @@
"""
Tests that the dataclass_transform mechanism honors implicit default values
in field parameters.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html#field-specifier-parameters
from typing import Any, Callable, Literal, TypeVar, dataclass_transform, overload
T = TypeVar("T")
# > Field specifier functions can use overloads that implicitly specify the
# > value of init using a literal bool value type (Literal[False] or Literal[True]).
@overload
def field1(
*,
default: str | None = None,
resolver: Callable[[], Any],
init: Literal[False] = False,
) -> Any:
...
@overload
def field1(
*,
default: str | None = None,
resolver: None = None,
init: Literal[True] = True,
) -> Any:
...
def field1(
*,
default: str | None = None,
resolver: Callable[[], Any] | None = None,
init: bool = True,
) -> Any:
...
def field2(*, init: bool = False, kw_only: bool = True) -> Any:
...
@dataclass_transform(kw_only_default=True, field_specifiers=(field1, field2))
def create_model(*, init: bool = True) -> Callable[[type[T]], type[T]]:
...
@create_model()
class CustomerModel1:
id: int = field1(resolver=lambda: 0)
name: str = field1(default="Voldemort")
CustomerModel1()
CustomerModel1(name="hi")
# This should generate an error because "id" is not
# supposed to be part of the init function.
CustomerModel1(id=1, name="hi") # E
@create_model()
class CustomerModel2:
id: int = field2()
name: str = field2(init=True)
# This should generate an error because kw_only is True
# by default for field2.
CustomerModel2(1) # E
CustomerModel2(name="Fred")

View File

@@ -0,0 +1,97 @@
"""
Tests the dataclass_transform mechanism when it is applied to a decorator function.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html#the-dataclass-transform-decorator
from typing import Any, Callable, TypeVar, dataclass_transform, overload
T = TypeVar("T")
@overload
@dataclass_transform(kw_only_default=True, order_default=True)
def create_model(cls: T) -> T:
...
@overload
@dataclass_transform(kw_only_default=True, order_default=True)
def create_model(
*,
frozen: bool = False,
kw_only: bool = True,
order: bool = True,
) -> Callable[[T], T]:
...
def create_model(*args: Any, **kwargs: Any) -> Any:
...
@create_model(kw_only=False, order=False)
class Customer1:
id: int
name: str
@create_model(frozen=True)
class Customer2:
id: int
name: str
@create_model(frozen=True)
class Customer2Subclass(Customer2):
salary: float
c1_1 = Customer1(id=3, name="Sue")
c1_1.id = 4
c1_2 = Customer1(3, "Sue")
c1_2.name = "Susan"
# This should generate an error because of a type mismatch.
c1_2.name = 3 # E
# This should generate an error because comparison methods are
# not synthesized.
v1 = c1_1 < c1_2 # E
# This should generate an error because salary is not
# a defined field.
c1_3 = Customer1(id=3, name="Sue", salary=40000) # E
c2_1 = Customer2(id=0, name="John")
# This should generate an error because Customer2 supports
# keyword-only parameters for its constructor.
c2_2 = Customer2(0, "John") # E
v2 = c2_1 < c2_2
@dataclass_transform(kw_only_default=True, order_default=True, frozen_default=True)
def create_model_frozen(cls: T) -> T:
...
@create_model_frozen
class Customer3:
id: int
name: str
# This should generate an error because a non-frozen class
# cannot inherit from a frozen class.
@create_model # E[Customer3Subclass]
class Customer3Subclass(Customer3): # E[Customer3Subclass]
age: int
c3_1 = Customer3(id=2, name="hi")
# This should generate an error because Customer3 is frozen.
c3_1.id = 4 # E

View File

@@ -0,0 +1,100 @@
"""
Tests the dataclass_transform mechanism when it is applied to a metaclass.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html#the-dataclass-transform-decorator
from typing import Any, dataclass_transform
class ModelField:
def __init__(self, *, init: bool = True, default: Any | None = None) -> None:
...
def model_field(
*, init: bool = True, default: Any | None = None, alias: str | None = None
) -> Any:
...
@dataclass_transform(
kw_only_default=True,
field_specifiers=(ModelField, model_field),
)
class ModelMeta(type):
not_a_field: str
class ModelBase(metaclass=ModelMeta):
def __init_subclass__(
cls,
*,
frozen: bool = False,
kw_only: bool = True,
order: bool = True,
) -> None:
...
class Customer1(ModelBase, frozen=True):
id: int = model_field()
name: str = model_field()
name2: str = model_field(alias="other_name", default="None")
# This should generate an error because a non-frozen class cannot
# derive from a frozen one.
class Customer1Subclass(Customer1, frozen=False): # E
salary: float = model_field()
class Customer2(ModelBase, order=True):
id: int
name: str = model_field(default="None")
c1_1 = Customer1(id=3, name="Sue", other_name="Susan")
# This should generate an error because the class is frozen.
c1_1.id = 4 # E
# This should generate an error because the class is kw_only.
c1_2 = Customer1(3, "Sue") # E
# OK (other_name is optional).
c1_3 = Customer1(id=3, name="John")
# This should generate an error because comparison methods are
# not synthesized.
v1 = c1_1 < c1_2 # E
c2_1 = Customer2(id=0, name="John")
c2_2 = Customer2(id=1)
v2 = c2_1 < c2_2
# This should generate an error because Customer2 supports
# keyword-only parameters for its constructor.
c2_3 = Customer2(0, "John") # E
@dataclass_transform(frozen_default=True)
class ModelMetaFrozen(type):
pass
class ModelBaseFrozen(metaclass=ModelMetaFrozen):
...
class Customer3(ModelBaseFrozen):
id: int
name: str
c3_1 = Customer3(id=2, name="hi")
# This should generate an error because Customer3 is frozen.
c3_1.id = 4 # E

View File

@@ -0,0 +1,227 @@
"""
Tests basic handling of the dataclass factory.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/dataclasses.html
# Also, see https://peps.python.org/pep-0557/
from dataclasses import InitVar, dataclass, field
from typing import Any, Callable, ClassVar, Generic, Protocol, TypeVar, assert_type
T = TypeVar("T")
@dataclass(order=True)
class InventoryItem:
x = 0
name: str
unit_price: float
quantity_on_hand: int = 0
def total_cost(self) -> float:
return self.unit_price * self.quantity_on_hand
v1 = InventoryItem("soap", 2.3)
class InventoryItemInitProto(Protocol):
def __call__(
self, name: str, unit_price: float, quantity_on_hand: int = ...
) -> None: ...
# Validate the type of the synthesized __init__ method.
x1: InventoryItemInitProto = v1.__init__
# Make sure the following additional methods were synthesized.
print(v1.__repr__)
print(v1.__eq__)
print(v1.__ne__)
print(v1.__lt__)
print(v1.__le__)
print(v1.__gt__)
print(v1.__ge__)
assert_type(v1.name, str)
assert_type(v1.unit_price, float)
assert_type(v1.quantity_on_hand, int)
v2 = InventoryItem("name") # E: missing unit_price
v3 = InventoryItem("name", "price") # E: incorrect type for unit_price
v4 = InventoryItem("name", 3.1, 3, 4) # E: too many arguments
# > TypeError will be raised if a field without a default value follows a
# > field with a default value. This is true either when this occurs in a
# > single class, or as a result of class inheritance.
@dataclass # E[DC1]
class DC1:
a: int = 0
b: int # E[DC1]: field with no default cannot follow field with default.
@dataclass # E[DC2]
class DC2:
a: int = field(default=1)
b: int # E[DC2]: field with no default cannot follow field with default.
@dataclass # E[DC3]
class DC3:
a: InitVar[int] = 0
b: int # E[DC3]: field with no default cannot follow field with default.
@dataclass
class DC4:
a: int = field(init=False)
b: int
v5 = DC4(0)
v6 = DC4(0, 1) # E: too many parameters
@dataclass
class DC5:
a: int = field(default_factory=str) # E: type mismatch
def f(s: str) -> int:
return int(s)
@dataclass
class DC6:
a: ClassVar[int] = 0
b: str
c: Callable[[str], int] = f
dc6 = DC6("")
assert_type(dc6.a, int)
assert_type(DC6.a, int)
assert_type(dc6.b, str)
assert_type(dc6.c, Callable[[str], int])
@dataclass
class DC7:
x: int
@dataclass(init=False)
class DC8(DC7):
y: int
def __init__(self, a: DC7, y: int):
self.__dict__ = a.__dict__
a = DC7(3)
b = DC8(a, 5)
# This should generate an error because there is an extra parameter
DC7(3, 4) # E
# This should generate an error because there is one too few parameters
DC8(a) # E
@dataclass
class DC9:
a: str = field(init=False, default="s")
b: bool = field()
@dataclass
class DC10(DC9):
a: str = field()
b: bool = field()
@dataclass(init=False)
class DC11:
x: int
x_squared: int
def __init__(self, x: int):
self.x = x
self.x_squared = x * x
DC11(3)
@dataclass(init=True)
class DC12:
x: int
x_squared: int
def __init__(self, x: int):
self.x = x
self.x_squared = x * x
DC12(3)
@dataclass(init=False)
class DC13:
x: int
x_squared: int
# This should generate an error because there is no
# override __init__ method and no synthesized __init__.
DC13(3) # E
@dataclass
class DC14:
prop_1: str = field(init=False)
prop_2: str = field(default="hello")
prop_3: str = field(default_factory=lambda: "hello")
@dataclass
class DC15(DC14):
prop_2: str = ""
dc15 = DC15(prop_2="test")
assert_type(dc15.prop_1, str)
assert_type(dc15.prop_2, str)
assert_type(dc15.prop_3, str)
class DataclassProto(Protocol):
__dataclass_fields__: ClassVar[dict[str, Any]]
v7: DataclassProto = dc15
@dataclass
class DC16(Generic[T]):
value: T
assert_type(DC16(1), DC16[int])
class DC17(DC16[str]):
pass
assert_type(DC17(""), DC17)
@dataclass
class DC18:
x: int = field()
# This may generate a type checker error because an unannotated field
# will result in a runtime exception.
y = field() # E?

View File

@@ -0,0 +1,37 @@
"""
Tests the typing.assert_type function.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#assert-type
from typing import Annotated, Any, Literal, assert_type
# > When a type checker encounters a call to assert_type(), it should
# > emit an error if the value is not of the specified type.
def func1(
a: int | str,
b: list[int],
c: Any,
d: "ForwardReference",
e: Annotated[Literal[4], ""],
):
assert_type(a, int | str) # OK
assert_type(b, list[int]) # OK
assert_type(c, Any) # OK
assert_type(d, "ForwardReference") # OK
assert_type(e, Literal[4]) # OK
assert_type(a, int) # E: Type mismatch
assert_type(c, int) # E: Type mismatch
assert_type(e, int) # E: Type mismatch
assert_type() # E: not enough arguments
assert_type("", int) # E: wrong argument type
assert_type(a, int | str, a) # E: too many arguments
class ForwardReference:
pass

View File

@@ -0,0 +1,17 @@
"""
Tests the typing.cast call.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#cast
from typing import cast
def find_first_str(a: list[object]) -> str:
index = next(i for i, x in enumerate(a) if isinstance(x, str))
return cast(str, a[index])
x: int = cast(int, "not an int") # No type error
bad1 = cast() # E: Too few arguments
bad2 = cast(1, "") # E: Bad first argument type
bad3 = cast(int, "", "") # E: Too many arguments

View File

@@ -0,0 +1,32 @@
"""
Tests the typing.no_type_check decorator.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#no-type-check
# "E?" is used below because support is optional.
from typing import no_type_check
# > The behavior for the ``no_type_check`` decorator when applied to a class is
# > left undefined by the typing spec at this time.
@no_type_check
class ClassA:
x: int = "" # E?: No error should be reported
# > If a type checker supports the ``no_type_check`` decorator for functions, it
# > should suppress all type errors for the ``def`` statement and its body including
# > any nested functions or classes. It should also ignore all parameter
# > and return type annotations and treat the function as if it were unannotated.
@no_type_check
def func1(a: int, b: str) -> None:
c = a + b # E?: No error should be reported
return 1 # E?: No error should be reported
func1(b"invalid", b"arguments") # E?: No error should be reported
# This should still be an error because type checkers should ignore
# annotations, but still check the argument count.
func1() # E: incorrect arguments for parameters

View File

@@ -0,0 +1,24 @@
"""
Tests the typing.reveal_type function.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#reveal-type
from typing import Any, reveal_type
# > When a static type checker encounters a call to this function, it should
# > emit a diagnostic with the type of the argument.
def func1(a: int | str, b: list[int], c: Any, d: "ForwardReference"):
reveal_type(a) # Revealed type is "int | str"
reveal_type(b) # Revealed type is "list[int]"
reveal_type(c) # Revealed type is "Any"
reveal_type(d) # Revealed type is "ForwardReference"
reveal_type() # E: not enough arguments
reveal_type(a, a) # E: Too many arguments
class ForwardReference:
pass

View File

@@ -0,0 +1,18 @@
"""
Tests the typing.TYPE_CHECKING constant.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#type-checking
from typing import TYPE_CHECKING, assert_type
if not TYPE_CHECKING:
a: int = "" # This should not generate an error
if TYPE_CHECKING:
b: list[int] = [1, 2, 3]
else:
b: list[str] = ["a", "b", "c"]
assert_type(b, list[int])
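# Illustrative sketch (not part of the original test file): TYPE_CHECKING is
# commonly used to guard imports needed only for annotations; the annotation
# below is a string so it is never evaluated at runtime. The helper to_str is
# assumed here for illustration only.
if TYPE_CHECKING:
    from decimal import Decimal  # imported only while type checking
def to_str(value: "Decimal") -> str:
    return str(value)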

View File

@@ -0,0 +1,20 @@
"""
Tests "# type: ignore" comments.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#type-ignore-comments
# The following type violation should be suppressed.
x: int = "" # type: ignore
# The following type violation should be suppressed.
y: int = "" # type: ignore - additional stuff
# The following type violation should be suppressed.
z: int = "" # type: ignore[additional_stuff]
# > In some cases, linting tools or other comments may be needed on the same
# > line as a type comment. In these cases, the type comment should be before
# > other comments and linting markers.
a: int = "" # type: ignore # other comment

View File

@@ -0,0 +1,16 @@
#!/usr/bin/env python
# type: ignore
"""
Tests a file-level type ignore comment.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#type-ignore-comments
# > A # type: ignore comment on a line by itself at the top of a file, before any
# > docstrings, imports, or other executable code, silences all errors in the file.
# > Blank lines and other comments, such as shebang lines and coding cookies, may
# > precede the # type: ignore comment.
x: int = "" # No error should be reported

View File

@@ -0,0 +1,14 @@
"""
Tests a file-level type ignore comment.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#type-ignore-comments
# type: ignore
# > A # type: ignore comment on a line by itself at the top of a file, before any
# > docstrings, imports, or other executable code, silences all errors in the file.
# > Blank lines and other comments, such as shebang lines and coding cookies, may
# > precede the # type: ignore comment.
x: int = "" # E: should still error because comment is not at top of file.

View File

@@ -0,0 +1,45 @@
"""
Tests checking the version or platform.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/directives.html#version-and-platform-checking
import os
import sys
if sys.version_info >= (3, 8):
pass
else:
val1: int = "" # Should not generate an error
if sys.version_info >= (3, 8, 0):
pass
else:
val2: int = "" # E?: May not generate an error (support for three-element sys.version_info is optional)
if sys.version_info < (3, 8):
val3: int = "" # Should not generate an error
if sys.version_info < (3, 100, 0):
pass
else:
val4: int = "" # E?: May not generate an error (support for three-element sys.version_info is optional)
if sys.platform == "bogus_platform":
val5: int = "" # Should not generate an error
if sys.platform != "bogus_platform":
pass
else:
val6: int = "" # Should not generate an error
if os.name == "bogus_os":
val7: int = "" # E?: May not generate an error (support for os.name is optional)
if os.name != "bogus_platform":
pass
else:
val8: int = "" # E?: May not generate an error (support for os.name is optional)

View File

@@ -0,0 +1,40 @@
"""
Tests basic behaviors of Enum classes.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#enum-definition
from enum import Enum
from typing import assert_type
# > Enum classes are iterable and indexable, and they can be called with a
# > value to look up the enum member with that value. Type checkers should
# > support these behaviors
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
for color in Color:
assert_type(color, Color)
# > Unlike most Python classes, calling an enum class does not invoke its
# > constructor. Instead, the call performs a value-based lookup of an
# > enum member.
assert_type(Color["RED"], Color) # 'Literal[Color.RED]' is also acceptable
assert_type(Color(3), Color) # 'Literal[Color.BLUE]' is also acceptable
# > An Enum class with one or more defined members cannot be subclassed.
class EnumWithNoMembers(Enum):
pass
class Shape(EnumWithNoMembers): # OK (because no members are defined)
SQUARE = 1
CIRCLE = 2
class ExtendedShape(Shape): # E: Shape is implicitly final
TRIANGLE = 3

View File

@@ -0,0 +1,75 @@
"""
Tests handling of Enum class definitions using the class syntax.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#enum-definition
from enum import Enum, EnumType
from typing import Literal, assert_type
# > Type checkers should support the class syntax
class Color1(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert_type(Color1.RED, Literal[Color1.RED])
# > The function syntax (in its various forms) is optional
Color2 = Enum("Color2", "RED", "GREEN", "BLUE") # E?
Color3 = Enum("Color3", ["RED", "GREEN", "BLUE"]) # E?
Color4 = Enum("Color4", ("RED", "GREEN", "BLUE")) # E?
Color5 = Enum("Color5", "RED, GREEN, BLUE") # E?
Color6 = Enum("Color6", "RED GREEN BLUE") # E?
Color7 = Enum("Color7", [("RED", 1), ("GREEN", 2), ("BLUE", 3)]) # E?
Color8 = Enum("Color8", (("RED", 1), ("GREEN", 2), ("BLUE", 3))) # E?
Color9 = Enum("Color9", {"RED": 1, "GREEN": 2, "BLUE": 3}) # E?
assert_type(Color2.RED, Literal[Color2.RED]) # E?
assert_type(Color3.RED, Literal[Color3.RED]) # E?
assert_type(Color4.RED, Literal[Color4.RED]) # E?
assert_type(Color5.RED, Literal[Color5.RED]) # E?
assert_type(Color6.RED, Literal[Color6.RED]) # E?
assert_type(Color7.RED, Literal[Color7.RED]) # E?
assert_type(Color8.RED, Literal[Color8.RED]) # E?
assert_type(Color9.RED, Literal[Color9.RED]) # E?
# > Enum classes can also be defined using a subclass of enum.Enum or any class
# > that uses enum.EnumType (or a subclass thereof) as a metaclass.
# > Type checkers should treat such classes as enums
class CustomEnum1(Enum):
pass
class Color10(CustomEnum1):
RED = 1
GREEN = 2
BLUE = 3
assert_type(Color10.RED, Literal[Color10.RED])
class CustomEnumType(EnumType):
pass
class CustomEnum2(metaclass=CustomEnumType):
pass
class Color11(CustomEnum2):
RED = 1
GREEN = 2
BLUE = 3
assert_type(Color11.RED, Literal[Color11.RED])

View File

@@ -0,0 +1,78 @@
"""
Tests that the type checker handles literal expansion of enum classes.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#enum-literal-expansion
from enum import Enum, Flag
from typing import Literal, Never, assert_type
# > From the perspective of the type system, most enum classes are equivalent
# > to the union of the literal members within that enum. Type checkers may
# > therefore expand an enum type
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
def print_color1(c: Color):
if c is Color.RED or c is Color.BLUE:
print("red or blue")
else:
assert_type(c, Literal[Color.GREEN]) # E?
def print_color2(c: Color):
match c:
case Color.RED | Color.BLUE:
print("red or blue")
case Color.GREEN:
print("green")
case _:
assert_type(c, Never) # E?
# > This rule does not apply to classes that derive from enum.Flag because
# > these enums allow flags to be combined in arbitrary ways.
class CustomFlags(Flag):
FLAG1 = 1
FLAG2 = 2
FLAG3 = 4
def test1(f: CustomFlags):
if f is CustomFlags.FLAG1 or f is CustomFlags.FLAG2:
print("flag1 and flag2")
else:
assert_type(f, CustomFlags)
assert_type(f, Literal[CustomFlags.FLAG3]) # E
def test2(f: CustomFlags):
match f:
case CustomFlags.FLAG1 | CustomFlags.FLAG2:
pass
case CustomFlags.FLAG3:
pass
case _:
assert_type(f, CustomFlags)
# > A type checker should treat a complete union of all literal members as
# > compatible with the enum type.
class Answer(Enum):
Yes = 1
No = 2
def test3(val: object) -> list[Answer]:
assert val is Answer.Yes or val is Answer.No
x = [val]
return x

View File

@@ -0,0 +1,30 @@
"""
Tests that the type checker handles the `_name_` and `name` attributes correctly.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#member-names
from enum import Enum
from typing import Literal, assert_type
# > All enum member objects have an attribute _name_ that contains the member's
# > name. They also have a property name that returns the same name. Type
# > checkers may infer a literal type for the name of a member
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert_type(Color.RED._name_, Literal["RED"]) # E?
assert_type(Color.RED.name, Literal["RED"]) # E?
def func1(red_or_blue: Literal[Color.RED, Color.BLUE]):
assert_type(red_or_blue.name, Literal["RED", "BLUE"]) # E?
def func2(any_color: Color):
assert_type(any_color.name, Literal["RED", "BLUE", "GREEN"]) # E?

View File

@@ -0,0 +1,96 @@
"""
Tests that the type checker handles the `_value_` and `value` attributes correctly.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#member-values
# > All enum member objects have an attribute _value_ that contains the
# > member's value. They also have a property value that returns the same value.
# > Type checkers may infer the type of a member's value.
from enum import Enum, auto
from typing import Literal, assert_type
class Color(Enum):
RED = 1
GREEN = 2
BLUE = 3
assert_type(Color.RED._value_, Literal[1]) # E?
assert_type(Color.RED.value, Literal[1]) # E?
def func1(red_or_blue: Literal[Color.RED, Color.BLUE]):
assert_type(red_or_blue.value, Literal[1, 3]) # E?
def func2(any_color: Color):
assert_type(any_color.value, Literal[1, 2, 3]) # E?
# > The value of _value_ can be assigned in a constructor method. This
# > technique is sometimes used to initialize both the member value and
# > non-member attributes. If the value assigned in the class body is a tuple,
# > the unpacked tuple value is passed to the constructor. Type checkers may
# > validate consistency between assigned tuple values and the constructor
# > signature.
class Planet(Enum):
def __init__(self, value: int, mass: float, radius: float):
self._value_ = value
self.mass = mass
self.radius = radius
MERCURY = (1, 3.303e23, 2.4397e6)
VENUS = (2, 4.869e24, 6.0518e6)
EARTH = (3, 5.976e24, 6.37814e6)
MARS = (6.421e23, 3.3972e6) # E?: Type checker error (optional)
JUPITER = 5 # E?: Type checker error (optional)
assert_type(Planet.MERCURY.value, Literal[1]) # E?
# > The class enum.auto and method _generate_next_value_ can be used within
# > an enum class to automatically generate values for enum members.
# > Type checkers may support these to infer literal types for member values.
class Color2(Enum):
RED = auto()
GREEN = auto()
BLUE = auto()
assert_type(Color2.RED.value, Literal[1]) # E?
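# An extra illustrative sketch (not part of the conformance file; the class name
# Color2b is hypothetical): _generate_next_value_ customizes the values produced
# by auto(). The spec leaves it optional whether type checkers model this, so no
# assert_type is made here; at runtime Color2b.RED.value == "red".
class Color2b(Enum):
    def _generate_next_value_(name, start, count, last_values):
        return name.lower()

    RED = auto()
    GREEN = auto()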
# > If an enum class provides an explicit type annotation for _value_, type
# > checkers should enforce this declared type when values are assigned to
# > _value_.
class Color3(Enum):
_value_: int
RED = 1 # OK
GREEN = "green" # E
class Planet2(Enum):
_value_: str
def __init__(self, value: int, mass: float, radius: float):
self._value_ = value # E
MERCURY = (1, 3.303e23, 2.4397e6)
from _enums_member_values import ColumnType
# > If the literal values for enum members are not supplied, as they sometimes
# > are not within a type stub file, a type checker can use the type of the
# > _value_ attribute.
assert_type(ColumnType.DORIC.value, int) # E?
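# The module _enums_member_values is a stub shipped alongside the conformance
# suite and is not included in this diff. A plausible sketch of what it declares
# (an assumption, not verified against the actual stub): members whose literal
# values are elided, with _value_ annotated as int.
#
#     class ColumnType(Enum):
#         _value_: int
#         DORIC = ...
#         IONIC = ...
#         CORINTHIAN = ...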

View File

@@ -0,0 +1,143 @@
"""
Tests that the type checker can distinguish enum members from non-members.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/enums.html#defining-members
from enum import Enum, member, nonmember
from typing import Literal, assert_type
# > If an attribute is defined in the class body with a type annotation but
# > with no assigned value, a type checker should assume this is a non-member
# > attribute
class Pet(Enum): # E?: Uninitialized attributes (pyre)
genus: str # Non-member attribute
species: str # Non-member attribute
CAT = 1 # Member attribute
DOG = 2 # Member attribute
assert_type(Pet.genus, str)
assert_type(Pet.species, str)
assert_type(Pet.CAT, Literal[Pet.CAT])
assert_type(Pet.DOG, Literal[Pet.DOG])
from _enums_members import Pet2
assert_type(Pet2.genus, str)
assert_type(Pet2.species, str)
assert_type(Pet2.CAT, Literal[Pet2.CAT])
assert_type(Pet2.DOG, Literal[Pet2.DOG])
# > Members defined within an enum class should not include explicit type
# > annotations. Type checkers should infer a literal type for all members.
# > A type checker should report an error if a type annotation is used for
# > an enum member because this type will be incorrect and misleading to
# > readers of the code
class Pet3(Enum):
CAT = 1
DOG: int = 2 # E
# > Methods, callables, descriptors (including properties), and nested classes
# > that are defined in the class are not treated as enum members by the
# > EnumType metaclass and should likewise not be treated as enum members by a
# > type checker
def identity(x: int) -> int:
return x
class Pet4(Enum):
CAT = 1 # Member attribute
DOG = 2 # Member attribute
converter = lambda x: str(x) # Non-member attribute
transform = staticmethod(identity) # Non-member attribute
@property
def species(self) -> str: # Non-member property
return "mammal"
def speak(self) -> None: # Non-member method
print("meow" if self is Pet.CAT else "woof")
class Nested: ... # Non-member nested class
assert_type(Pet4.CAT, Literal[Pet4.CAT])
assert_type(Pet4.DOG, Literal[Pet4.DOG])
assert_type(Pet4.converter, Literal[Pet4.converter]) # E
assert_type(Pet4.transform, Literal[Pet4.transform]) # E
assert_type(Pet4.species, Literal[Pet4.species]) # E
assert_type(Pet4.speak, Literal[Pet4.speak]) # E
# > An attribute that is assigned the value of another member of the same
# > enum is not a member itself. Instead, it is an alias for the first member
class TrafficLight(Enum):
RED = 1
GREEN = 2
YELLOW = 3
AMBER = YELLOW # Alias for YELLOW
assert_type(TrafficLight.AMBER, Literal[TrafficLight.YELLOW])
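# An extra runtime note (not part of the conformance file): because AMBER is an
# alias, the enum still has exactly three distinct members.
assert len(TrafficLight) == 3
assert TrafficLight.AMBER is TrafficLight.YELLOW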
# > If using Python 3.11 or newer, the enum.member and enum.nonmember classes
# > can be used to unambiguously distinguish members from non-members.
class Example(Enum):
a = member(1) # Member attribute
b = nonmember(2) # Non-member attribute
@member
def c(self) -> None: # Member method
pass
assert_type(Example.a, Literal[Example.a])
assert_type(Example.b, Literal[Example.b]) # E
assert_type(Example.c, Literal[Example.c])
# > An attribute with a private name (beginning with, but not ending in,
# > a double underscore) is treated as a non-member.
class Example2(Enum):
__B = 2 # Non-member attribute
def method(self):
reveal_type(Example2.__B)
assert_type(Example2.__B, Literal[Example2.__B]) # E
# > An enum class can define a class symbol named _ignore_. This can be
# > a list of names or a string containing a space-delimited list of names
# > that are deleted from the enum class at runtime. Type checkers may
# > support this mechanism
class Pet5(Enum):
_ignore_ = "DOG FISH"
CAT = 1 # Member attribute
DOG = 2 # temporary variable, will be removed from the final enum class
FISH = 3 # temporary variable, will be removed from the final enum class
assert_type(Pet5.CAT, Literal[Pet5.CAT])
assert_type(Pet5.DOG, int) # E?: Literal[2] is also acceptable
assert_type(Pet5.FISH, int) # E?: Literal[3] is also acceptable

View File

@@ -0,0 +1,85 @@
"""
Tests the handling of __exit__ return types for context managers.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/exceptions.html
from typing import Any, Literal, assert_type
class CMBase:
def __enter__(self) -> None:
pass
class Suppress1(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> bool:
return True
class Suppress2(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> Literal[True]:
return True
class NoSuppress1(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> None:
return None
class NoSuppress2(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> Literal[False]:
return False
class NoSuppress3(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> Any:
return False
class NoSuppress4(CMBase):
def __exit__(self, exc_type, exc_value, traceback) -> None | bool:
return None
def suppress1(x: int | str) -> None:
if isinstance(x, int):
with Suppress1():
raise ValueError
assert_type(x, int | str)
def suppress2(x: int | str) -> None:
if isinstance(x, int):
with Suppress2():
raise ValueError
assert_type(x, int | str)
def no_suppress1(x: int | str) -> None:
if isinstance(x, int):
with NoSuppress1():
raise ValueError
assert_type(x, str)
def no_suppress2(x: int | str) -> None:
if isinstance(x, int):
with NoSuppress2():
raise ValueError
assert_type(x, str)
def no_suppress3(x: int | str) -> None:
if isinstance(x, int):
with NoSuppress3():
raise ValueError
assert_type(x, str)
def no_suppress4(x: int | str) -> None:
if isinstance(x, int):
with NoSuppress4():
raise ValueError
assert_type(x, str)
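# For context (not part of the conformance file above; the function name
# suppress_stdlib is hypothetical): the standard library's contextlib.suppress is
# a real-world context manager whose __exit__ is annotated as returning bool in
# typeshed, so a checker should treat it like Suppress1 above and keep the code
# after the with-block reachable.
import contextlib

def suppress_stdlib(x: int | str) -> None:
    if isinstance(x, int):
        with contextlib.suppress(ValueError):
            raise ValueError
    assert_type(x, int | str)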

View File

@@ -0,0 +1,73 @@
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#arbitrary-generic-types-as-base-classes
from typing import Generic, TypeVar, Iterable, assert_type
T = TypeVar("T")
# > Generic[T] is only valid as a base class; it's not a proper type. However,
# > user-defined generic types [...] and built-in generic types and ABCs such as
# > list[T] and Iterable[T] are valid both as types and as base classes.
class Node: ...
class SymbolTable(dict[str, list[Node]]): ...
def takes_dict(x: dict): ...
def takes_dict_typed(x: dict[str, list[Node]]): ...
def takes_dict_incorrect(x: dict[str, list[object]]): ...
def test_symbol_table(s: SymbolTable):
takes_dict(s) # OK
takes_dict_typed(s) # OK
takes_dict_incorrect(s) # E
def func1(y: Generic[T]): # E
x: Generic # E
# > If a generic base class has a type variable as a type argument, this makes
# > the defined class generic.
# Note that there is overlap in the spec and tests in generics_basic.py
from collections.abc import Iterable, Container, Iterator
class LinkedList(Iterable[T], Container[T]): ...
def test_linked_list(l: LinkedList[int]):
assert_type(iter(l), Iterator[int])
assert_type(l.__contains__(1), bool)
linked_list_invalid: LinkedList[int, int] # E
from collections.abc import Mapping
class MyDict(Mapping[str, T]): ...
def test_my_dict(d: MyDict[int]):
assert_type(d["a"], int)
my_dict_invalid: MyDict[int, int] # E
# > Note that we can use T multiple times in the base class list,
# > as long as we don't use the same type variable T multiple times
# > within Generic[...].
class BadClass1(Generic[T, T]): # E
pass
class GoodClass1(dict[T, T]): # OK
pass

View File

@@ -0,0 +1,192 @@
"""
Tests for basic usage of generics.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#introduction
from __future__ import annotations
from collections.abc import Sequence
from typing import Any, Generic, TypeVar, assert_type
T = TypeVar("T")
# > Generics can be parameterized by using a factory available in
# > ``typing`` called ``TypeVar``.
def first(l: Sequence[T]) -> T:
return l[0]
def test_first(seq_int: Sequence[int], seq_str: Sequence[str]) -> None:
assert_type(first(seq_int), int)
assert_type(first(seq_str), str)
# > ``TypeVar`` supports constraining parametric types to a fixed set of
# > possible types
AnyStr = TypeVar("AnyStr", str, bytes)
def concat(x: AnyStr, y: AnyStr) -> AnyStr:
return x + y
def test_concat(s: str, b: bytes, a: Any) -> None:
concat(s, s) # OK
concat(b, b) # OK
concat(s, b) # E
concat(b, s) # E
concat(s, a) # OK
concat(a, b) # OK
# > Specifying a single constraint is disallowed.
BadConstraint1 = TypeVar("BadConstraint1", str) # E
# > Note: those types cannot be parameterized by type variables
class Test(Generic[T]):
BadConstraint2 = TypeVar("BadConstraint2", str, list[T]) # E
# > Subtypes of types constrained by a type variable should be treated
# > as their respective explicitly listed base types in the context of the
# > type variable.
class MyStr(str): ...
def test_concat_subtype(s: str, b: bytes, a: Any, m: MyStr) -> None:
assert_type(concat(m, m), str)
assert_type(concat(m, s), str)
concat(m, b) # E
# TODO: should these be str or Any?
# reveal_type(concat(m, a))
# reveal_type(concat(a, m))
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#user-defined-generic-classes
# > You can include a ``Generic`` base class to define a user-defined class
# > as generic.
from logging import Logger
from collections.abc import Iterable
class LoggedVar(Generic[T]):
def __init__(self, value: T, name: str, logger: Logger) -> None:
self.name = name
self.logger = logger
self.value = value
def set(self, new: T) -> None:
self.log("Set " + repr(self.value))
self.value = new
def get(self) -> T:
self.log("Get " + repr(self.value))
return self.value
def log(self, message: str) -> None:
self.logger.info("{}: {}".format(self.name, message))
def zero_all_vars(vars: Iterable[LoggedVar[int]]) -> None:
for var in vars:
var.set(0)
assert_type(var.get(), int)
# > A generic type can have any number of type variables, and type variables
# > may be constrained.
S = TypeVar("S")
class Pair1(Generic[T, S]): ...
# > Each type variable argument to ``Generic`` must be distinct.
class Pair2(Generic[T, T]): # E
...
# > The ``Generic[T]`` base class is redundant in simple cases where you
# > subclass some other generic class and specify type variables for its
# > parameters.
from collections.abc import Iterator, Mapping
class MyIter1(Iterator[T]): ...
class MyIter2(Iterator[T], Generic[T]): ...
def test_my_iter(m1: MyIter1[int], m2: MyIter2[int]):
assert_type(next(m1), int)
assert_type(next(m2), int)
K = TypeVar("K")
V = TypeVar("V")
class MyMap1(Mapping[K, V], Generic[K, V]): ...
class MyMap2(Mapping[K, V], Generic[V, K]): ...
def test_my_map(m1: MyMap1[str, int], m2: MyMap2[int, str]):
assert_type(m1["key"], int)
assert_type(m2["key"], int)
m1[0] # E
m2[0] # E
# > You can use multiple inheritance with ``Generic``
from collections.abc import Sized, Container
class LinkedList(Sized, Generic[T]): ...
class MyMapping(Iterable[tuple[K, V]], Container[tuple[K, V]], Generic[K, V]): ...
# > Subclassing a generic class without specifying type parameters assumes
# > ``Any`` for each position. In the following example, ``MyIterable``
# > is not generic but implicitly inherits from ``Iterable[Any]``::
class MyIterableAny(Iterable): # Same as Iterable[Any]
...
def test_my_iterable_any(m: MyIterableAny):
assert_type(iter(m), Iterator[Any])
# > Generic metaclasses are not supported
class GenericMeta(type, Generic[T]): ...
class GenericMetaInstance(metaclass=GenericMeta[T]): # E
...

View File

@@ -0,0 +1,167 @@
"""
Tests for basic usage of default values for TypeVar-likes.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#defaults-for-type-parameters
from typing import Any, Callable, Generic, Self, Unpack, assert_type
from typing_extensions import TypeVar, ParamSpec, TypeVarTuple
DefaultStrT = TypeVar("DefaultStrT", default=str)
DefaultIntT = TypeVar("DefaultIntT", default=int)
DefaultBoolT = TypeVar("DefaultBoolT", default=bool)
T = TypeVar("T")
T1 = TypeVar("T1")
T2 = TypeVar("T2")
# > The order for defaults should follow the standard function parameter
# > rules, so a type parameter with no ``default`` cannot follow one with
# > a ``default`` value. Doing so may raise a ``TypeError`` at runtime,
# > and a type checker should flag this as an error.
class NonDefaultFollowsDefault(Generic[DefaultStrT, T]): ... # E: non-default TypeVars cannot follow ones with defaults
class NoNonDefaults(Generic[DefaultStrT, DefaultIntT]): ...
assert_type(NoNonDefaults, type[NoNonDefaults[str, int]])
assert_type(NoNonDefaults[str], type[NoNonDefaults[str, int]])
assert_type(NoNonDefaults[str, int], type[NoNonDefaults[str, int]])
class OneDefault(Generic[T, DefaultBoolT]): ...
assert_type(OneDefault[float], type[OneDefault[float, bool]])
assert_type(OneDefault[float](), OneDefault[float, bool])
class AllTheDefaults(Generic[T1, T2, DefaultStrT, DefaultIntT, DefaultBoolT]): ...
assert_type(AllTheDefaults, type[AllTheDefaults[Any, Any, str, int, bool]])
assert_type(
AllTheDefaults[int, complex], type[AllTheDefaults[int, complex, str, int, bool]]
)
AllTheDefaults[int] # E: expected 2 arguments to AllTheDefaults
assert_type(
AllTheDefaults[int, complex], type[AllTheDefaults[int, complex, str, int, bool]]
)
assert_type(
AllTheDefaults[int, complex, str],
type[AllTheDefaults[int, complex, str, int, bool]],
)
assert_type(
AllTheDefaults[int, complex, str, int],
type[AllTheDefaults[int, complex, str, int, bool]],
)
assert_type(
AllTheDefaults[int, complex, str, int, bool],
type[AllTheDefaults[int, complex, str, int, bool]],
)
# > ``ParamSpec`` defaults are defined using the same syntax as
# > ``TypeVar`` \ s but use a ``list`` of types or an ellipsis
# > literal "``...``" or another in-scope ``ParamSpec``.
DefaultP = ParamSpec("DefaultP", default=[str, int])
class Class_ParamSpec(Generic[DefaultP]): ...
assert_type(Class_ParamSpec, type[Class_ParamSpec[str, int]])
assert_type(Class_ParamSpec(), Class_ParamSpec[str, int])
assert_type(Class_ParamSpec[[bool, bool]](), Class_ParamSpec[bool, bool])
# > ``TypeVarTuple`` defaults are defined using the same syntax as
# > ``TypeVar`` \ s but use an unpacked tuple of types instead of a single type
# > or another in-scope ``TypeVarTuple`` (see `Scoping Rules`_).
DefaultTs = TypeVarTuple("DefaultTs", default=Unpack[tuple[str, int]])
class Class_TypeVarTuple(Generic[*DefaultTs]): ...
assert_type(Class_TypeVarTuple, type[Class_TypeVarTuple[*tuple[str, int]]])
assert_type(Class_TypeVarTuple(), Class_TypeVarTuple[str, int])
assert_type(Class_TypeVarTuple[int, bool](), Class_TypeVarTuple[int, bool])
# > If both ``bound`` and ``default`` are passed, ``default`` must be a
# > subtype of ``bound``. If not, the type checker should generate an
# > error.
TypeVar("Ok", bound=float, default=int) # OK
TypeVar("Invalid", bound=str, default=int) # E: the bound and default are incompatible
# > For constrained ``TypeVar``\ s, the default needs to be one of the
# > constraints. A type checker should generate an error even if it is a
# > subtype of one of the constraints.
TypeVar("Ok", float, str, default=float) # OK
TypeVar("Invalid", float, str, default=int) # E: expected one of float or str got int
# > In generic functions, type checkers may use a type parameter's default when the
# > type parameter cannot be solved to anything. We leave the semantics of this
# > usage unspecified, as ensuring the ``default`` is returned in every code path
# > where the type parameter can go unsolved may be too hard to implement. Type
# > checkers are free to either disallow this case or experiment with implementing
# > support.
T4 = TypeVar("T4", default=int)
def func1(x: int | set[T4]) -> T4: ...
assert_type(func1(0), int)
# > A ``TypeVar`` that immediately follows a ``TypeVarTuple`` is not allowed
# > to have a default, because it would be ambiguous whether a type argument
# > should be bound to the ``TypeVarTuple`` or the defaulted ``TypeVar``.
Ts = TypeVarTuple("Ts")
T5 = TypeVar("T5", default=bool)
class Foo5(Generic[*Ts, T5]): ... # E
# > It is allowed to have a ``ParamSpec`` with a default following a
# > ``TypeVarTuple`` with a default, as there can be no ambiguity between a
# > type argument for the ``ParamSpec`` and one for the ``TypeVarTuple``.
P = ParamSpec("P", default=[float, bool])
class Foo6(Generic[*Ts, P]): ... # OK
assert_type(Foo6[int, str], type[Foo6[int, str, [float, bool]]])
assert_type(Foo6[int, str, [bytes]], type[Foo6[int, str, [bytes]]])
# > Type parameter defaults should be bound by attribute access
# > (including call and subscript).
class Foo7(Generic[DefaultIntT]):
def meth(self, /) -> Self:
return self
attr: DefaultIntT
assert_type(Foo7.meth, Callable[[Foo7[int]], Foo7[int]])
assert_type(Foo7.attr, int)

View File

@@ -0,0 +1,98 @@
"""
Tests for type parameter default values that reference other type parameters.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#defaults-for-type-parameters
# > To use another type parameter as a default the ``default`` and the
# > type parameter must be the same type (a ``TypeVar``'s default must be
# > a ``TypeVar``, etc.).
from typing import Any, Generic, assert_type
from typing_extensions import TypeVar
DefaultStrT = TypeVar("DefaultStrT", default=str)
StartT = TypeVar("StartT", default=int)
StopT = TypeVar("StopT", default=StartT)
StepT = TypeVar("StepT", default=int | None)
class slice(Generic[StartT, StopT, StepT]): ...
assert_type(slice, type[slice[int, int, int | None]])
assert_type(slice(), slice[int, int, int | None])
assert_type(slice[str](), slice[str, str, int | None])
assert_type(slice[str, bool, complex](), slice[str, bool, complex])
T2 = TypeVar("T2", default=DefaultStrT)
class Foo(Generic[DefaultStrT, T2]):
def __init__(self, a: DefaultStrT, b: T2) -> None: ...
assert_type(Foo(1, ""), Foo[int, str])
Foo[int](1, "") # E: Foo[int, str] cannot be assigned to self: Foo[int, int] in Foo.__init__
Foo[int]("", 1) # E: Foo[str, int] cannot be assigned to self: Foo[int, int] in Foo.__init__
# > ``T1`` must be used before ``T2`` in the parameter list of the generic.
S1 = TypeVar("S1")
S2 = TypeVar("S2", default=S1)
class Foo2(Generic[S1, S2]): ... # OK
Start2T = TypeVar("Start2T", default="StopT")
Stop2T = TypeVar("Stop2T", default=int)
class slice2(Generic[Start2T, Stop2T, StepT]): ... # E: bad ordering
# > Using a type parameter from an outer scope as a default is not supported.
class Foo3(Generic[S1]):
class Bar2(Generic[S2]): ... # E
# > ``T1``'s bound must be a subtype of ``T2``'s bound.
X1 = TypeVar("X1", bound=int)
TypeVar("Ok1", default=X1, bound=float) # OK
TypeVar("AlsoOk1", default=X1, bound=int) # OK
TypeVar("Invalid1", default=X1, bound=str) # E: int is not a subtype of str
# > The constraints of ``T2`` must be a superset of the constraints of ``T1``.
Y1 = TypeVar("Y1", bound=int)
TypeVar("Invalid2", float, str, default=Y1) # E: upper bound int is incompatible with constraints float or str
Y2 = TypeVar("Y2", int, str)
TypeVar("AlsoOk2", int, str, bool, default=Y2) # OK
TypeVar("AlsoInvalid2", bool, complex, default=Y2) # E: {bool, complex} is not a superset of {int, str}
# > Type parameters are valid as parameters to generics inside of a
# > ``default`` when the first parameter is in scope as determined by the
# > previous section (scoping rules).
Z1 = TypeVar("Z1")
ListDefaultT = TypeVar("ListDefaultT", default=list[Z1]) # OK
class Bar(Generic[Z1, ListDefaultT]): # OK
def __init__(self, x: Z1, y: ListDefaultT): ...
assert_type(Bar, type[Bar[Any, list[Any]]])
assert_type(Bar[int], type[Bar[int, list[int]]])
assert_type(Bar[int](0, []), Bar[int, list[int]])
assert_type(Bar[int, list[str]](0, []), Bar[int, list[str]])
assert_type(Bar[int, str](0, ""), Bar[int, str])

View File

@@ -0,0 +1,65 @@
"""
Tests for specialization rules associated with type parameters with
default values.
"""
# > A generic type alias can be further subscripted following normal subscription
# > rules. If a type parameter has a default that hasn't been overridden, it should
# > be treated like it was substituted into the type alias.
from typing import Generic, TypeAlias, assert_type
from typing_extensions import TypeVar
T1 = TypeVar("T1")
T2 = TypeVar("T2")
DefaultIntT = TypeVar("DefaultIntT", default=int)
DefaultStrT = TypeVar("DefaultStrT", default=str)
class SomethingWithNoDefaults(Generic[T1, T2]): ...
MyAlias: TypeAlias = SomethingWithNoDefaults[int, DefaultStrT] # OK
def func1(p1: MyAlias, p2: MyAlias[bool]):
assert_type(p1, SomethingWithNoDefaults[int, str])
assert_type(p2, SomethingWithNoDefaults[int, bool])
MyAlias[bool, int] # E: too many arguments passed to MyAlias
# > Generic classes with type parameters that have defaults behave similarly to
# > generic type aliases. That is, subclasses can be further subscripted following
# > normal subscription rules, and non-overridden defaults should be substituted.
class SubclassMe(Generic[T1, DefaultStrT]):
x: DefaultStrT
class Bar(SubclassMe[int, DefaultStrT]): ...
assert_type(Bar, type[Bar[str]])
assert_type(Bar(), Bar[str])
assert_type(Bar[bool](), Bar[bool])
class Foo(SubclassMe[float]): ...
assert_type(Foo().x, str)
Foo[str] # E: Foo cannot be further subscripted
class Baz(Generic[DefaultIntT, DefaultStrT]): ...
class Spam(Baz): ...
# Spam is <subclass of Baz[int, str]>
v1: Baz[int, str] = Spam()

View File

@@ -0,0 +1,40 @@
"""
Tests basic usage of ParamSpec.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#paramspec-variables
from typing import Any, Callable, Concatenate, ParamSpec, TypeAlias
P = ParamSpec("P") # OK
WrongName = ParamSpec("NotIt") # E: name inconsistency
# > Valid use locations
TA1: TypeAlias = P # E
TA2: TypeAlias = Callable[P, None] # OK
TA3: TypeAlias = Callable[Concatenate[int, P], None] # OK
TA4: TypeAlias = Callable[..., None] # OK
TA5: TypeAlias = Callable[..., None] # OK
def func1(x: P) -> P: # E
...
def func2(x: Concatenate[int, P]) -> int: # E
...
def func3(x: list[P]) -> None: # E
...
def func4(x: Callable[[int, str], P]) -> None: # E
...
def func5(*args: P, **kwargs: P) -> None: # E
...

View File

@@ -0,0 +1,98 @@
"""
Tests the usage of a ParamSpec and its components (P.args, P.kwargs).
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#the-components-of-a-paramspec
from typing import Any, Callable, Concatenate, ParamSpec, assert_type
P = ParamSpec("P")
def puts_p_into_scope1(f: Callable[P, int]) -> None:
def inner(*args: P.args, **kwargs: P.kwargs) -> None: # OK
pass
def mixed_up(*args: P.kwargs, **kwargs: P.args) -> None: # E
pass
def misplaced1(x: P.args) -> None: # E
pass
def bad_kwargs1(*args: P.args, **kwargs: P.args) -> None: # E
pass
def bad_kwargs2(*args: P.args, **kwargs: Any) -> None: # E
pass
def out_of_scope(*args: P.args, **kwargs: P.kwargs) -> None: # E
pass
def puts_p_into_scope2(f: Callable[P, int]) -> None:
stored_args: P.args # E
stored_kwargs: P.kwargs # E
def just_args(*args: P.args) -> None: # E
pass
def just_kwargs(**kwargs: P.kwargs) -> None: # E
pass
def decorator(f: Callable[P, int]) -> Callable[P, None]:
def foo(*args: P.args, **kwargs: P.kwargs) -> None:
assert_type(f(*args, **kwargs), int) # OK
f(*kwargs, **args) # E
f(1, *args, **kwargs) # E
return foo # OK
def add(f: Callable[P, int]) -> Callable[Concatenate[str, P], None]:
def foo(s: str, *args: P.args, **kwargs: P.kwargs) -> None: # OK
pass
def bar(*args: P.args, s: str, **kwargs: P.kwargs) -> None: # E
pass
return foo # OK
def remove(f: Callable[Concatenate[int, P], int]) -> Callable[P, None]:
def foo(*args: P.args, **kwargs: P.kwargs) -> None:
f(1, *args, **kwargs) # OK
f(*args, 1, **kwargs) # E
f(*args, **kwargs) # E
return foo # OK
def outer(f: Callable[P, None]) -> Callable[P, None]:
def foo(x: int, *args: P.args, **kwargs: P.kwargs) -> None:
f(*args, **kwargs)
def bar(*args: P.args, **kwargs: P.kwargs) -> None:
foo(1, *args, **kwargs) # OK
foo(x=1, *args, **kwargs) # E
return bar
def twice(f: Callable[P, int], *args: P.args, **kwargs: P.kwargs) -> int:
return f(*args, **kwargs) + f(*args, **kwargs)
def a_int_b_str(a: int, b: str) -> int:
return 0
twice(a_int_b_str, 1, "A") # OK
twice(a_int_b_str, b="A", a=1) # OK
twice(a_int_b_str, "A", 1) # E

View File

@@ -0,0 +1,144 @@
"""
Tests semantics of ParamSpec.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#semantics
from typing import Callable, Concatenate, Generic, ParamSpec, TypeVar, assert_type
P = ParamSpec("P")
def changes_return_type_to_str(x: Callable[P, int]) -> Callable[P, str]:
...
def returns_int(a: str, b: bool, /) -> int:
...
f1 = changes_return_type_to_str(returns_int)
assert_type(f1, Callable[[str, bool], str])
v1 = f1("A", True) # OK
assert_type(v1, str)
f1(a="A", b=True) # E: positional-only
f1("A", "A") # E: wrong type
def func1(x: Callable[P, int], y: Callable[P, int]) -> Callable[P, bool]:
...
def x_y(x: int, y: str) -> int:
...
def y_x(y: int, x: str) -> int:
...
f2 = func1(x_y, x_y)
assert_type(f2(1, ""), bool)
assert_type(f2(y="", x=1), bool)
f3 = func1(x_y, y_x) # E?: Could return (a: int, b: str, /) -> bool
# (a callable with two positional-only parameters)
# This works because both callables have types that are
# behavioral subtypes of Callable[[int, str], int]
# Also OK for type checkers to reject this.
def keyword_only_x(*, x: int) -> int:
...
def keyword_only_y(*, y: int) -> int:
...
func1(keyword_only_x, keyword_only_y) # E
U = TypeVar("U")
class Y(Generic[U, P]):
f: Callable[P, str]
prop: U
def __init__(self, f: Callable[P, str], prop: U) -> None:
self.f = f
self.prop = prop
def callback_a(q: int, /) -> str:
...
def func(x: int) -> None:
y1 = Y(callback_a, x)
assert_type(y1, Y[int, [int]])
y2 = y1.f
assert_type(y2, Callable[[int], str])
def bar(x: int, *args: bool) -> int:
...
def add(x: Callable[P, int]) -> Callable[Concatenate[str, P], bool]:
...
a1 = add(bar) # Should return (a: str, /, x: int, *args: bool) -> bool
assert_type(a1("", 1, False, True), bool)
assert_type(a1("", x=1), bool)
a1(1, x=1) # E
def remove(x: Callable[Concatenate[int, P], int]) -> Callable[P, bool]:
...
r1 = remove(bar) # Should return (*args: bool) -> bool
assert_type(r1(False, True, True), bool)
assert_type(r1(), bool)
r1(1) # E
def transform(
x: Callable[Concatenate[int, P], int]
) -> Callable[Concatenate[str, P], bool]:
...
t1 = transform(bar) # Should return (a: str, /, *args: bool) -> bool
assert_type(t1("", True, False, True), bool)
assert_type(t1(""), bool)
t1(1) # E
def expects_int_first(x: Callable[Concatenate[int, P], int]) -> None:
...
@expects_int_first # E
def one(x: str) -> int:
...
@expects_int_first # E
def two(*, x: int) -> int:
...
@expects_int_first # E
def three(**kwargs: int) -> int:
...
@expects_int_first # OK
def four(*args: int) -> int:
...

View File

@@ -0,0 +1,61 @@
"""
Tests the handling of a ParamSpec specialization.
"""
from typing import Any, Callable, Concatenate, Generic, ParamSpec, TypeVar, cast
T = TypeVar("T")
P1 = ParamSpec("P1")
P2 = ParamSpec("P2")
class ClassA(Generic[T, P1]):
f: Callable[P1, int] = cast(Any, "")
x: T = cast(Any, "")
class ClassB(ClassA[T, P1], Generic[T, P1, P2]):
f1: Callable[P1, int] = cast(Any, "")
f2: Callable[P2, int] = cast(Any, "")
x: T = cast(Any, "")
def func20(x: ClassA[int, P2]) -> str: # OK
return ""
def func21(x: ClassA[int, Concatenate[int, P2]]) -> str: # OK
return ""
def func22(x: ClassB[int, [int, bool], ...]) -> str: # OK
return ""
def func23(x: ClassA[int, ...]) -> str: # OK
return ""
def func24(x: ClassB[int, [], []]) -> str: # OK
return ""
def func25(x: ClassA[int, int]) -> str: # E
return ""
class ClassC(Generic[P1]):
f: Callable[P1, int] = cast(Any, "")
def func30(x: ClassC[[int, str, bool]]) -> None: # OK
x.f(0, "", True) # OK
x.f("", "", True) # E
x.f(0, "", "") # E
def func31(x: ClassC[int, str, bool]) -> None: # OK
x.f(0, "", True) # OK
x.f("", "", True) # E
x.f(0, "", "") # E

View File

@@ -0,0 +1,90 @@
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#scoping-rules-for-type-variables
from typing import TypeVar, Generic, Iterable, TypeAlias, assert_type
# > A type variable used in a generic function could be inferred to represent
# > different types in the same code block.
T = TypeVar('T')
def fun_1(x: T) -> T: # T here
return x
def fun_2(x: T) -> T: # and here could be different
return x
assert_type(fun_1(1), int)
assert_type(fun_2('a'), str)
# > A type variable used in a method of a generic class that coincides
# > with one of the variables that parameterize this class is always bound
# > to that variable.
class MyClass(Generic[T]):
def meth_1(self, x: T) -> T: # T here
return x
def meth_2(self, x: T) -> T: # and here are always the same
return x
a: MyClass[int] = MyClass()
a.meth_1(1) # OK
a.meth_2('a') # E
# > A type variable used in a method that does not match any of the variables
# > that parameterize the class makes this method a generic function in that
# > variable.
S = TypeVar("S")
class Foo(Generic[T]):
def method(self, x: T, y: S) -> S:
return y
x: Foo[int] = Foo()
assert_type(x.method(0, "abc"), str)
assert_type(x.method(0, b"abc"), bytes)
# > Unbound type variables should not appear in the bodies of generic functions,
# > or in the class bodies apart from method definitions.
def fun_3(x: T) -> list[T]:
y: list[T] = [] # OK
z: list[S] = [] # E
return y
class Bar(Generic[T]):
an_attr: list[S] = [] # E
def do_something(self, x: S) -> S: # OK
return x
# A generic class definition that appears inside a generic function
# should not use type variables that parameterize the generic function.
def fun_4(x: T) -> list[T]:
a_list: list[T] = [] # OK
class MyGeneric(Generic[T]): # E
...
return a_list
# > A generic class nested in another generic class cannot use the same type
# > variables. The scope of the type variables of the outer class
# > doesn't cover the inner one
class Outer(Generic[T]):
class Bad(Iterable[T]): # E
...
class AlsoBad:
x: list[T] # E
class Inner(Iterable[S]): # OK
...
attr: Inner[T] # OK
alias: TypeAlias = list[T] # E
# Test unbound type variables at global scope
global_var1: T # E
global_var2: list[T] = [] # E
list[T]() # E

View File

@@ -0,0 +1,45 @@
"""
Tests for advanced or special-case usage of the typing.Self type.
"""
from typing import assert_type, Self
class ParentA:
# Test for property that returns Self.
@property
def prop1(self) -> Self:
...
class ChildA(ParentA):
...
assert_type(ParentA().prop1, ParentA)
assert_type(ChildA().prop1, ChildA)
# Test for a child that accesses an attribute within a parent
# whose type is annotated using Self.
class ParentB:
a: list[Self]
@classmethod
def method1(cls) -> Self:
...
class ChildB(ParentB):
b: int = 0
def method2(self) -> None:
assert_type(self, Self)
assert_type(self.a, list[Self])
assert_type(self.a[0], Self)
assert_type(self.method1(), Self)
@classmethod
def method3(cls) -> None:
assert_type(cls, type[Self])
assert_type(cls.a, list[Self])
assert_type(cls.a[0], Self)
assert_type(cls.method1(), Self)

View File

@@ -0,0 +1,32 @@
"""
Tests for usage of the typing.Self type with attributes.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#use-in-attribute-annotations
from typing import TypeVar, Generic, Self
from dataclasses import dataclass
T = TypeVar("T")
@dataclass
class LinkedList(Generic[T]):
value: T
next: Self | None = None
@dataclass
class OrdinalLinkedList(LinkedList[int]):
def ordinal_value(self) -> str:
return str(self.value)
# This should result in a type error.
xs = OrdinalLinkedList(value=1, next=LinkedList[int](value=2)) # E
if xs.next is not None:
xs.next = OrdinalLinkedList(value=3, next=None) # OK
# This should result in a type error.
xs.next = LinkedList[int](value=3, next=None) # E

View File

@@ -0,0 +1,81 @@
"""
Tests for basic usage of the typing.Self type.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#self.
from typing import Callable, Generic, Self, TypeVar, assert_type
T = TypeVar("T")
class Shape:
def set_scale(self, scale: float) -> Self:
assert_type(self, Self)
self.scale = scale
return self
def method2(self) -> Self:
# This should result in a type error.
return Shape() # E
def method3(self) -> "Shape":
return self
@classmethod
def from_config(cls, config: dict[str, float]) -> Self:
assert_type(cls, type[Self])
return cls()
@classmethod
def cls_method2(cls) -> Self:
# This should result in a type error.
return Shape() # E
@classmethod
def cls_method3(cls) -> "Shape":
return cls()
def difference(self, other: Self) -> float:
assert_type(other, Self)
return 0.0
def apply(self, f: Callable[[Self], None]) -> None:
return f(self)
class Circle(Shape):
pass
assert_type(Shape().set_scale(1.0), Shape)
assert_type(Circle().set_scale(1.0), Circle)
assert_type(Shape.from_config({}), Shape)
assert_type(Circle.from_config({}), Circle)
class Container(Generic[T]):
value: T
def set_value(self, value: T) -> Self: ...
# This should generate an error because Self isn't subscriptable.
def foo(self, other: Self[int]) -> None: # E
pass
def object_with_concrete_type(
int_container: Container[int], str_container: Container[str]
) -> None:
assert_type(int_container.set_value(42), Container[int])
assert_type(str_container.set_value("hello"), Container[str])
def object_with_generic_type(
container: Container[T],
value: T,
) -> Container[T]:
val = container.set_value(value)
assert_type(val, Container[T])
return val

View File

@@ -0,0 +1,64 @@
"""
Tests for usage of the typing.Self type with protocols.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#use-in-protocols
from typing import Protocol, Self, assert_type
class ShapeProtocol(Protocol):
def set_scale(self, scale: float) -> Self:
...
class ReturnSelf:
scale: float = 1.0
def set_scale(self, scale: float) -> Self:
self.scale = scale
return self
class ReturnConcreteShape:
scale: float = 1.0
def set_scale(self, scale: float) -> "ReturnConcreteShape":
self.scale = scale
return self
class BadReturnType:
scale: float = 1.0
def set_scale(self, scale: float) -> int:
self.scale = scale
return 42
class ReturnDifferentClass:
scale: float = 1.0
def set_scale(self, scale: float) -> ReturnConcreteShape:
return ReturnConcreteShape()
def accepts_shape(shape: ShapeProtocol) -> None:
y = shape.set_scale(0.5)
assert_type(y, ShapeProtocol)
def main(
return_self_shape: ReturnSelf,
return_concrete_shape: ReturnConcreteShape,
bad_return_type: BadReturnType,
return_different_class: ReturnDifferentClass,
) -> None:
accepts_shape(return_self_shape) # OK
accepts_shape(return_concrete_shape) # OK
# This should generate a type error.
accepts_shape(bad_return_type) # E
# Not OK because it returns a non-subclass.
accepts_shape(return_different_class) # E

View File

@@ -0,0 +1,126 @@
"""
Tests for valid and invalid usage of the typing.Self type.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#valid-locations-for-self
from typing import Any, Callable, Generic, Self, TypeAlias, TypeVar
class ReturnsSelf:
def foo(self) -> Self: # Accepted
return self
@classmethod
def bar(cls) -> Self: # Accepted
return cls(1)
def __new__(cls, value: int) -> Self: # Accepted
return cls(1)
def explicitly_use_self(self: Self) -> Self: # Accepted
return self
# Accepted (Self can be nested within other types)
def returns_list(self) -> list[Self]:
return []
# Accepted (Self can be nested within other types)
@classmethod
def return_cls(cls) -> type[Self]:
return cls
class Child(ReturnsSelf):
# Accepted (we can override a method that uses Self annotations)
def foo(self) -> Self:
return self
class TakesSelf:
def foo(self, other: Self) -> bool: # Accepted
return True
class Recursive:
# Accepted (treated as an @property returning ``Self | None``)
next: Self | None
class CallableAttribute:
def foo(self) -> int:
return 0
# Accepted (treated as an @property returning the Callable type)
bar: Callable[[Self], int] = foo
class HasNestedFunction:
x: int = 42
def foo(self) -> None:
# Accepted (Self is bound to HasNestedFunction).
def nested(z: int, inner_self: Self) -> Self:
print(z)
print(inner_self.x)
return inner_self
nested(42, self) # OK
class Outer:
class Inner:
def foo(self) -> Self: # Accepted (Self is bound to Inner)
return self
# This should generate an error.
def foo(bar: Self) -> Self: ... # E: not within a class
# This should generate an error.
bar: Self # E: not within a class
TFoo2 = TypeVar("TFoo2", bound="Foo2")
class Foo2:
# Rejected (Self is treated as unknown).
def has_existing_self_annotation(self: TFoo2) -> Self: ... # E
class Foo3:
def return_concrete_type(self) -> Self:
return Foo3() # E (see Foo3Child below for rationale)
class Foo3Child(Foo3):
child_value: int = 42
def child_method(self) -> None:
y = self.return_concrete_type()
y.child_value
T = TypeVar("T")
class Bar(Generic[T]):
def bar(self) -> T: ...
# This should generate an error.
class Baz(Bar[Self]): ... # E
class Baz2(Self): ... # E
# This should generate an error.
TupleSelf: TypeAlias = tuple[Self] # E
class Base:
@staticmethod
# This should generate an error.
def make() -> Self: # E
...
@staticmethod
# This should generate an error.
def return_parameter(foo: Self) -> Self: # E
...
class MyMetaclass(type):
# This should generate an error.
def __new__(cls, *args: Any) -> Self: # E
...
# This should generate an error.
def __mul__(cls, count: int) -> list[Self]: # E
...

View File

@@ -0,0 +1,27 @@
"""
Tests the compatibility rules between type parameter syntax (introduced in PEP 695)
and traditional TypeVars.
"""
# Specification: https://peps.python.org/pep-0695/#compatibility-with-traditional-typevars
from typing import TypeVar
K = TypeVar("K")
class ClassA[V](dict[K, V]): # E: traditional TypeVar not allowed here
...
class ClassB[K, V](dict[K, V]): # OK
...
class ClassC[V]:
def method1(self, a: V, b: K) -> V | K: # OK
...
def method2[M](self, a: M, b: K) -> M | K: # E
...

View File

@@ -0,0 +1,85 @@
"""
Validates the type parameter syntax introduced in PEP 695.
"""
# Specification: https://peps.python.org/pep-0695/#type-parameter-declarations
# This generic class is parameterized by a TypeVar T, a
# TypeVarTuple Ts, and a ParamSpec P.
from typing import Generic, ParamSpec, Protocol, TypeVar, TypeVarTuple, assert_type
class ChildClass[T, *Ts, **P]:
assert_type(T, TypeVar)
assert_type(Ts, TypeVarTuple)
assert_type(P, ParamSpec)
class ClassA[T](Generic[T]): # E: Runtime error
...
class ClassB[S, T](Protocol): # OK
...
class ClassC[S, T](Protocol[S, T]): # E
...
class ClassD[T: str]:
def method1(self, x: T):
x.capitalize() # OK
x.is_integer() # E
class ClassE[T: dict[str, int]]: # OK
pass
class ClassF[S: ForwardReference[int], T: "ForwardReference[str]"]: # OK
...
class ClassG[V]:
class ClassD[T: dict[str, V]]: # E: generic type not allowed
...
class ClassH[T: [str, int]]: # E: illegal expression form
...
class ClassI[AnyStr: (str, bytes)]: # OK
...
class ClassJ[T: (ForwardReference[int], "ForwardReference[str]", bytes)]: # OK
...
class ClassK[T: ()]: # E: two or more types required
...
class ClassL[T: (str,)]: # E: two or more types required
...
t1 = (bytes, str)
class ClassM[T: t1]: # E: literal tuple expression required
...
class ClassN[T: (3, bytes)]: # E: invalid expression form
...
class ClassO[T: (list[S], str)]: # E: generic type
...
class ForwardReference[T]: ...

View File

@@ -0,0 +1,149 @@
"""
Tests the handling of "infer_variance" parameter for TypeVar.
"""
# Specification: https://peps.python.org/pep-0695/#auto-variance-for-typevar
from typing import Final, Generic, Iterator, Sequence, TypeVar
from dataclasses import dataclass
T = TypeVar("T", infer_variance=True)
K = TypeVar("K", infer_variance=True)
V = TypeVar("V", infer_variance=True)
S1 = TypeVar("S1", covariant=True, infer_variance=True) # E: cannot use covariant with infer_variance
S2 = TypeVar("S2", contravariant=True, infer_variance=True) # E: cannot use contravariant with infer_variance
class ShouldBeCovariant1(Generic[T]):
def __getitem__(self, index: int) -> T:
...
def __iter__(self) -> Iterator[T]:
...
vco1_1: ShouldBeCovariant1[float] = ShouldBeCovariant1[int]() # OK
vco1_2: ShouldBeCovariant1[int] = ShouldBeCovariant1[float]() # E
class ShouldBeCovariant2(Sequence[T]):
pass
vco2_1: ShouldBeCovariant2[float] = ShouldBeCovariant2[int]() # OK
vco2_2: ShouldBeCovariant2[int] = ShouldBeCovariant2[float]() # E
class ShouldBeCovariant3(Generic[T]):
def method1(self) -> "ShouldBeCovariant2[T]":
...
vco3_1: ShouldBeCovariant3[float] = ShouldBeCovariant3[int]() # OK
vco3_2: ShouldBeCovariant3[int] = ShouldBeCovariant3[float]() # E
@dataclass(frozen=True)
class ShouldBeCovariant4(Generic[T]):
x: T
vo4_1: ShouldBeCovariant4[float] = ShouldBeCovariant4[int](1) # OK
vo4_4: ShouldBeCovariant4[int] = ShouldBeCovariant4[float](1.0) # E
class ShouldBeCovariant5(Generic[T]):
def __init__(self, x: T) -> None:
self._x = x
@property
def x(self) -> T:
return self._x
vo5_1: ShouldBeCovariant5[float] = ShouldBeCovariant5[int](1) # OK
vo5_2: ShouldBeCovariant5[int] = ShouldBeCovariant5[float](1.0) # E
class ShouldBeCovariant6(Generic[T]):
x: Final[T]
def __init__(self, value: T):
self.x = value
vo6_1: ShouldBeCovariant6[float] = ShouldBeCovariant6[int](1) # OK
vo6_2: ShouldBeCovariant6[int] = ShouldBeCovariant6[float](1.0) # E
class ShouldBeInvariant1(Generic[T]):
def __init__(self, value: T) -> None:
self._value = value
@property
def value(self):
return self._value
@value.setter
def value(self, value: T):
self._value = value
vinv1_1: ShouldBeInvariant1[float] = ShouldBeInvariant1[int](1) # E
vinv1_2: ShouldBeInvariant1[int] = ShouldBeInvariant1[float](1.1) # E
class ShouldBeInvariant2(Generic[T]):
def __init__(self, value: T) -> None:
self._value = value
def get_value(self) -> T:
return self._value
def set_value(self, value: T):
self._value = value
vinv2_1: ShouldBeInvariant2[float] = ShouldBeInvariant2[int](1) # E
vinv2_2: ShouldBeInvariant2[int] = ShouldBeInvariant2[float](1.1) # E
class ShouldBeInvariant3(dict[K, V]):
pass
vinv3_1: ShouldBeInvariant3[float, str] = ShouldBeInvariant3[int, str]() # E
vinv3_2: ShouldBeInvariant3[int, str] = ShouldBeInvariant3[float, str]() # E
vinv3_3: ShouldBeInvariant3[str, float] = ShouldBeInvariant3[str, int]() # E
vinv3_4: ShouldBeInvariant3[str, int] = ShouldBeInvariant3[str, float]() # E
@dataclass
class ShouldBeInvariant4[T]:
x: T
vinv4_1: ShouldBeInvariant4[float] = ShouldBeInvariant4[int](1) # E
class ShouldBeInvariant5[T]:
def __init__(self, x: T) -> None:
self.x = x
vinv5_1: ShouldBeInvariant5[float] = ShouldBeInvariant5[int](1) # E
class ShouldBeContravariant1(Generic[T]):
def __init__(self, value: T) -> None:
pass
def set_value(self, value: T):
pass
vcontra1_1: ShouldBeContravariant1[float] = ShouldBeContravariant1[int](1) # E
vcontra1_2: ShouldBeContravariant1[int] = ShouldBeContravariant1[float](1.2) # OK

View File

@@ -0,0 +1,124 @@
"""
Tests the scoping rules for type parameter syntax introduced in PEP 695.
"""
# Specification: https://peps.python.org/pep-0695/#type-parameter-scopes
from typing import Callable, Mapping, Sequence, TypeVar, assert_type
# > A compiler error or runtime exception is generated if the definition
# > of an earlier type parameter references a later type parameter even
# > if the name is defined in an outer scope.
class ClassA[S, T: Sequence[S]]: # E
...
class ClassB[S: Sequence[T], T]: # E
...
class Foo[T]:
...
class BaseClassC[T]:
def __init_subclass__(cls, param: type[Foo[T]]) -> None:
...
class ClassC[T](BaseClassC[T], param=Foo[T]): # OK
...
print(T) # E: Runtime error: 'T' is not defined
def decorator1[
T, **P, R
](x: type[Foo[T]]) -> Callable[[Callable[P, R]], Callable[P, R]]:
...
@decorator1(Foo[T]) # E: Runtime error: 'T' is not defined
class ClassD[T]:
...
type Alias1[K, V] = Mapping[K, V] | Sequence[K]
S: int = int(0)
def outer1[S](x: str):
S: str = x
T: int = 1
def outer2[T]():
def inner1():
nonlocal S # OK
assert_type(S, str)
# nonlocal T # Syntax error
def inner2():
global S # OK
assert_type(S, int)
class Outer1:
class Private:
pass
class Inner[T](Private, Sequence[T]): # OK
pass
def method1[T](self, a: Inner[T]) -> Inner[T]: # OK
return a
def decorator2[**P, R](x: int) -> Callable[[Callable[P, R]], Callable[P, R]]:
...
T = int(0)
@decorator2(T) # OK
class ClassE[T](Sequence[T]):
T = int(1)
def method1[T](self): # E
...
def method2[T](self, x=T): # E
...
def method3[T](self, x: T): # E
...
T = int(0)
class Outer2[T]:
T = int(1)
assert_type(T, int)
class Inner1:
T = str("")
assert_type(T, str)
def inner_method(self):
assert_type(T, TypeVar)
def outer_method(self):
T = 3j
assert_type(T, complex)
def inner_func():
assert_type(T, complex)

View File

@@ -0,0 +1,54 @@
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#instantiating-generic-classes-and-type-erasure
from typing import Any, TypeVar, Generic, assert_type
T = TypeVar("T")
# > If the constructor (__init__ or __new__) uses T in its signature, and a
# > corresponding argument value is passed, the type of the corresponding
# > argument(s) is substituted. Otherwise, Any is assumed.
class Node(Generic[T]):
label: T
def __init__(self, label: T | None = None) -> None: ...
assert_type(Node(''), Node[str])
assert_type(Node(0), Node[int])
assert_type(Node(), Node[Any])
assert_type(Node(0).label, int)
assert_type(Node().label, Any)
# > In case the inferred type uses [Any] but the intended type is more specific,
# > you can use an annotation to force the type of the variable, e.g.:
n1: Node[int] = Node()
assert_type(n1, Node[int])
n2: Node[str] = Node()
assert_type(n2, Node[str])
n3 = Node[int]()
assert_type(n3, Node[int])
n4 = Node[str]()
assert_type(n4, Node[str])
n5 = Node[int](0) # OK
n6 = Node[int]("") # E
n7 = Node[str]("") # OK
n8 = Node[str](0) # E
Node[int].label = 1 # E
Node[int].label # E
Node.label = 1 # E
Node.label # E
type(n1).label # E
assert_type(n1.label, int)
assert_type(Node[int]().label, int)
n1.label = 1 # OK
# > [...] generic versions of concrete collections can be instantiated:
from typing import DefaultDict
data = DefaultDict[int, bytes]()
assert_type(data[0], bytes)

View File

@@ -0,0 +1,81 @@
"""
Tests for the use of TypeVarTuple with "*args" parameter.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#args-as-a-type-variable-tuple
# > If *args is annotated as a type variable tuple, the types of the individual
# > arguments become the types in the type variable tuple.
from typing import TypeVarTuple, assert_type
Ts = TypeVarTuple("Ts")
def args_to_tuple(*args: *Ts) -> tuple[*Ts]:
...
assert_type(args_to_tuple(1, "a"), tuple[int, str])
class Env:
...
def exec_le(path: str, *args: *tuple[*Ts, Env], env: Env | None = None) -> tuple[*Ts]:
...
assert_type(exec_le("", Env()), tuple[()]) # OK
assert_type(exec_le("", 0, "", Env()), tuple[int, str]) # OK
exec_le("", 0, "") # E
exec_le("", 0, "", env=Env()) # E
# > Using an unpacked unbounded tuple is equivalent to the PEP 484 behavior of
# > *args: int (see "Arbitrary argument lists and default argument values"),
# > which accepts zero or more values of type int.
def func1(*args: *tuple[int, ...]) -> None:
...
func1() # OK
func1(1, 2, 3, 4, 5) # OK
func1(1, "2", 3) # E
def func2(*args: *tuple[int, *tuple[str, ...], str]) -> None:
...
func2(1, "") # OK
func2(1, "", "", "", "") # OK
func2(1, 1, "") # E
func2(1) # E
func2("") # E
def func3(*args: *tuple[int, str]) -> None:
...
func3(1, "hello") # OK
func3(1) # E
def func4(*args: tuple[*Ts]):
...
func4((0,), (1,)) # OK
func4((0,), (1, 2)) # E
func4((0,), ("1",)) # E
# This is a syntax error, so leave it commented out.
# def func5(**kwargs: *Ts): # E
# ...

View File

@@ -0,0 +1,107 @@
"""
Tests basic usage of TypeVarTuple.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#typevartuple
from typing import Generic, NewType, TypeVarTuple, assert_type
Ts = TypeVarTuple("Ts")
class Array1(Generic[*Ts]):
...
def func1(*args: *Ts) -> tuple[*Ts]:
...
Shape = TypeVarTuple("Shape")
class Array(Generic[*Shape]):
def __init__(self, shape: tuple[*Shape]):
self._shape: tuple[*Shape] = shape
def get_shape(self) -> tuple[*Shape]:
return self._shape
Height = NewType("Height", int)
Width = NewType("Width", int)
Time = NewType("Time", int)
Batch = NewType("Batch", int)
v1: Array[Height, Width] = Array((Height(1), Width(2))) # OK
v2: Array[Batch, Height, Width] = Array((Batch(1), Height(1), Width(1))) # OK
v3: Array[Time, Batch, Height, Width] = Array(
(Time(1), Batch(1), Height(1), Width(1))
) # OK
v4: Array[Height, Width] = Array(Height(1)) # E
v5: Array[Batch, Height, Width] = Array((Batch(1), Width(1))) # E
v6: Array[Time, Batch, Height, Width] = Array( # E
(Time(1), Batch(1), Width(1), Height(1))
)
# > Type Variable Tuples Must Always be Unpacked
class ClassA(Generic[Shape]): # E: not unpacked
def __init__(self, shape: tuple[Shape]): # E: not unpacked
self._shape: tuple[*Shape] = shape
def get_shape(self) -> tuple[Shape]: # E: not unpacked
return self._shape
def method1(*args: Shape) -> None: # E: not unpacked
...
# > TypeVarTuple does not yet support specification of variance, bounds, constraints.
Ts1 = TypeVarTuple("Ts1", covariant=True) # E
Ts2 = TypeVarTuple("Ts2", int, float) # E
Ts3 = TypeVarTuple("Ts3", bound=int) # E
# > If the same TypeVarTuple instance is used in multiple places in a signature
# > or class, a valid type inference might be to bind the TypeVarTuple to a
# > tuple of a union of types.
def func2(arg1: tuple[*Ts], arg2: tuple[*Ts]) -> tuple[*Ts]:
...
# > We do not allow this; type unions may not appear within the tuple.
# > If a type variable tuple appears in multiple places in a signature,
# > the types must match exactly (the list of type parameters must be the
# > same length, and the type parameters themselves must be identical)
assert_type(func2((0,), (1,)), tuple[int]) # OK
func2((0,), (0.0,)) # OK
func2((0.0,), (0,)) # OK
func2((0,), (1,)) # OK
func2((0,), ("0",)) # E
func2((0, 0), (0,)) # E
def multiply(x: Array[*Shape], y: Array[*Shape]) -> Array[*Shape]:
...
def func3(x: Array[Height], y: Array[Width], z: Array[Height, Width]):
multiply(x, x) # OK
multiply(x, y) # E
multiply(x, z) # E
# > Only a single type variable tuple may appear in a type parameter list.
class Array3(Generic[*Ts1, *Ts2]): # E
...

View File

@@ -0,0 +1,49 @@
"""
Tests the use of TypeVarTuple within a Callable.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#type-variable-tuples-with-callable
# > Type variable tuples can also be used in the arguments section of a Callable.
from typing import Callable, TypeVar, TypeVarTuple, assert_type
Ts = TypeVarTuple("Ts")
T = TypeVar("T")
class Process:
def __init__(self, target: Callable[[*Ts], None], args: tuple[*Ts]) -> None:
...
def func1(arg1: int, arg2: str) -> None:
...
Process(target=func1, args=(0, "")) # OK
Process(target=func1, args=("", 0)) # E
def func2(f: Callable[[int, *Ts, T], tuple[T, *Ts]]) -> tuple[*Ts, T]:
...
def callback1(a: int, b: str, c: int, d: complex) -> tuple[complex, str, int]:
...
def callback2(a: int, d: str) -> tuple[str]:
...
assert_type(func2(callback1), tuple[str, int, complex])
assert_type(func2(callback2), tuple[str])
def func3(*args: *tuple[int, *Ts, T]) -> tuple[T, *Ts]:
...
assert_type(func3(1, "", 3j, 3.4), tuple[float, str, complex])

View File

@@ -0,0 +1,56 @@
"""
Tests type concatenation using TypeVarTuples.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#type-concatenation
# > Type variable tuples don't have to be alone; normal types can be prefixed and/or suffixed.
from typing import Generic, NewType, TypeVar, TypeVarTuple, assert_type
Height = NewType("Height", int)
Width = NewType("Width", int)
Batch = NewType("Batch", int)
Channels = NewType("Channels", int)
Shape = TypeVarTuple("Shape")
Ts = TypeVarTuple("Ts")
T = TypeVar("T")
class Array(Generic[*Ts]):
...
def add_batch_axis(x: Array[*Shape]) -> Array[Batch, *Shape]:
...
def del_batch_axis(x: Array[Batch, *Shape]) -> Array[*Shape]:
...
def add_batch_channels(x: Array[*Shape]) -> Array[Batch, *Shape, Channels]:
...
def func1(a: Array[Height, Width]):
b = add_batch_axis(a) # OK
assert_type(b, Array[Batch, Height, Width])
c = del_batch_axis(b) # OK
assert_type(c, Array[Height, Width])
d = add_batch_channels(a) # OK
assert_type(d, Array[Batch, Height, Width, Channels])
def prefix_tuple(x: T, y: tuple[*Ts]) -> tuple[T, *Ts]:
...
z = prefix_tuple(x=0, y=(True, "a"))
assert_type(z, tuple[int, bool, str])
def move_first_element_to_last(tup: tuple[T, *Ts]) -> tuple[*Ts, T]:
return (*tup[1:], tup[0])
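# A small usage sketch (not part of the conformance file) for the helper above;
# the function name check_move is hypothetical.
def check_move() -> None:
    moved = move_first_element_to_last((1, "a", 3.0))
    assert_type(moved, tuple[str, float, int])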

View File

@@ -0,0 +1,31 @@
"""
Tests the use of TypeVarTuple in function overloads.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#overloads-for-accessing-individual-types
from typing import Any, Generic, TypeVar, TypeVarTuple, assert_type, overload
Shape = TypeVarTuple("Shape")
Axis1 = TypeVar("Axis1")
Axis2 = TypeVar("Axis2")
Axis3 = TypeVar("Axis3")
class Array(Generic[*Shape]):
@overload
def transpose(self: "Array[Axis1, Axis2]") -> "Array[Axis2, Axis1]":
...
@overload
def transpose(self: "Array[Axis1, Axis2, Axis3]") -> "Array[Axis3, Axis2, Axis1]":
...
def transpose(self) -> Any:
pass
def func1(a: Array[Axis1, Axis2], b: Array[Axis1, Axis2, Axis3]):
assert_type(a.transpose(), Array[Axis2, Axis1])
assert_type(b.transpose(), Array[Axis3, Axis2, Axis1])

View File

@@ -0,0 +1,163 @@
"""
Tests the handling of a TypeVarTuple specialization.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#behaviour-when-type-parameters-are-not-specified
from typing import Any, Generic, NewType, TypeVar, TypeVarTuple, assert_type
Ts = TypeVarTuple("Ts")
Height = NewType("Height", int)
Width = NewType("Width", int)
Time = NewType("Time", int)
class Array(Generic[*Ts]):
...
def takes_any_array1(arr: Array):
...
def takes_any_array2(arr: Array[*tuple[Any, ...]]):
...
def func1(x: Array[Height, Width]):
takes_any_array1(x) # OK
takes_any_array2(x) # OK
def func2(y: Array[Time, Height, Width]):
takes_any_array1(y) # OK
takes_any_array2(y) # OK
# > Generic aliases can be created using a type variable tuple in a similar
# > way to regular type variables.
IntTuple = tuple[int, *Ts]
NamedArray = tuple[str, Array[*Ts]]
def func3(a: IntTuple[float, bool], b: NamedArray[Height]):
assert_type(a, tuple[int, float, bool])
assert_type(b, tuple[str, Array[Height]])
def func4(a: IntTuple[()], b: NamedArray[()]):
assert_type(a, tuple[int])
assert_type(b, tuple[str, Array[()]])
Shape = TypeVarTuple("Shape")
DType = TypeVar("DType")
class Array2(Generic[DType, *Shape]):
...
FloatArray = Array2[float, *Shape]
Array1D = Array2[DType, Any]
def func5_0(a: Array1D, b: Array1D[int]):
assert_type(a, Array2[Any, Any])
assert_type(b, Array2[int, Any])
def takes_float_array_of_any_shape(x: FloatArray):
...
def func5_1(x: FloatArray[Height, Width]):
takes_float_array_of_any_shape(x) # OK
def takes_float_array_with_specific_shape(y: FloatArray[Height, Width]):
...
def func5_2(x: FloatArray):
takes_float_array_with_specific_shape(x) # OK
T = TypeVar("T")
VariadicTuple = tuple[T, *Ts]
def func6(a: VariadicTuple[str, int], b: VariadicTuple[float], c: VariadicTuple):
assert_type(a, tuple[str, int])
assert_type(b, tuple[float])
assert_type(c, tuple[Any, *tuple[Any, ...]])
Ts1 = TypeVarTuple("Ts1")
Ts2 = TypeVarTuple("Ts2")
IntTupleVar = tuple[int, *Ts1] # OK
IntFloatTupleVar = IntTupleVar[float, *Ts2] # OK
IntFloatsTupleVar = IntTupleVar[*tuple[float, ...]] # OK
IntTupleGeneric = tuple[int, T]
IntTupleGeneric[str] # OK
IntTupleGeneric[*Ts] # E
IntTupleGeneric[*tuple[float, ...]] # E
T1 = TypeVar("T1")
T2 = TypeVar("T2")
T3 = TypeVar("T3")
TA1 = tuple[*Ts, T1, T2] # OK
TA2 = tuple[T1, T2, *Ts] # OK
TA3 = tuple[T1, *Ts, T2, T3] # OK
TA4 = tuple[T1, T2, *tuple[int, ...]] # OK
TA5 = tuple[T1, *Ts, T2, *Ts] # E
TA6 = tuple[T1, *Ts, T2, *tuple[int, ...]] # E
TA7 = tuple[*Ts, T1, T2]
v1: TA7[int] # E: requires at least two type arguments
def func7(a: TA7[*Ts, T1, T2]) -> tuple[tuple[*Ts], T1, T2]:
...
def func8(a: TA7[str, bool], b: TA7[str, bool, float], c: TA7[str, bool, float, int]):
assert_type(func7(a), tuple[tuple[()], str, bool])
assert_type(func7(b), tuple[tuple[str], bool, float])
assert_type(func7(c), tuple[tuple[str, bool], float, int])
TA8 = tuple[T1, *Ts, T2, T3]
def func9(a: TA8[T1, *Ts, T2, T3]) -> tuple[tuple[*Ts], T1, T2, T3]:
...
def func10(a: TA8[str, bool, float], b: TA8[str, bool, float, int]):
assert_type(func9(a), tuple[tuple[()], str, bool, float])
assert_type(func9(b), tuple[tuple[bool], str, float, int])
TA9 = tuple[*Ts, T1]
TA10 = TA9[*tuple[int, ...]] # OK
def func11(a: TA10, b: TA9[*tuple[int, ...], str], c: TA9[*tuple[int, ...], str]):
assert_type(a, tuple[*tuple[int, ...], int])
assert_type(b, tuple[*tuple[int, ...], str])
assert_type(c, tuple[*tuple[int, ...], str])
TA11 = tuple[T, *Ts1]
TA12 = TA11[*Ts2] # E
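# Supplementary sketch (func12 is an illustrative addition): a bare variadic
# class behaves as if parameterized with *tuple[Any, ...], so assignment is
# accepted in both directions.
def func12(x: Array, y: Array[Height, Width]):
    v1: Array[Height, Width] = x  # OK
    v2: Array = y  # OK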

View File

@@ -0,0 +1,46 @@
"""
Tests unpack operations for TypeVarTuple.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#unpacking-tuple-types
from typing import Any, Generic, NewType, TypeVarTuple
Height = NewType("Height", int)
Width = NewType("Width", int)
Batch = NewType("Batch", int)
Channels = NewType("Channels", int)
Ts = TypeVarTuple("Ts")
class Array(Generic[*Ts]):
...
def process_batch_channels(x: Array[Batch, *tuple[Any, ...], Channels]) -> None:
...
def func3(
x: Array[Batch, Height, Width, Channels], y: Array[Batch, Channels], z: Array[Batch]
):
process_batch_channels(x) # OK
process_batch_channels(y) # OK
process_batch_channels(z) # E
Shape = TypeVarTuple("Shape")
def expect_variadic_array(x: Array[Batch, *Shape]) -> None:
...
def expect_precise_array(x: Array[Batch, Height, Width, Channels]) -> None:
...
def func4(y: Array[*tuple[Any, ...]]):
expect_variadic_array(y) # OK
expect_precise_array(y) # OK
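# Supplementary sketch (accept_ints is an illustrative addition): an unpacked
# unbounded tuple can also annotate *args, making every positional argument
# an int.
def accept_ints(*args: *tuple[int, ...]) -> None:
    ...

accept_ints(1, 2, 3)  # OK
accept_ints()  # OK
accept_ints("")  # E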

View File

@@ -0,0 +1,56 @@
"""
Tests TypeVars with upper bounds.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#type-variables-with-an-upper-bound
from typing import Collection, Generic, Sized, TypeVar, assert_type
# > A type variable may specify an upper bound using bound=<type>
T_Good1 = TypeVar("T_Good1", bound=int) # OK
T_Good2 = TypeVar("T_Good2", bound="ForwardRef | str") # OK
class ForwardRef: ...
# > <type> itself cannot be parameterized by type variables.
T = TypeVar("T")
class Test(Generic[T]):
T_Bad1 = TypeVar("T_Bad1", bound=list[T]) # E
ST = TypeVar("ST", bound=Sized)
def longer(x: ST, y: ST) -> ST:
if len(x) > len(y):
return x
else:
return y
assert_type(longer([1], [1, 2]), list[int])
assert_type(longer({1}, {1, 2}), set[int])
# Type checkers that use a join rather than a union (like mypy)
# will produce Collection[int] here instead of list[int] | set[int].
# Both answers are conformant with the spec.
assert_type(longer([1], {1, 2}), list[int] | set[int]) # E?
def requires_collection(c: Collection[int]) -> None: ...
requires_collection(longer([1], [1, 2])) # OK
longer(3, 3) # E
# > An upper bound cannot be combined with type constraints
T_Bad2 = TypeVar("T_Bad2", str, int, bound="int") # E
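# Supplementary sketch (MySized is an illustrative addition): the solution for
# a bounded TypeVar is the argument's own type, not the declared bound.
class MySized:
    def __len__(self) -> int:
        return 0

assert_type(longer(MySized(), MySized()), MySized)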

View File

@@ -0,0 +1,198 @@
"""
Tests the handling and enforcement of TypeVar variance.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/generics.html#variance
from typing import Sequence, TypeVar, Generic
from collections.abc import Iterable, Iterator
# > To facilitate the declaration of container types where covariant or
# > contravariant type checking is acceptable, type variables accept
# > keyword arguments covariant=True or contravariant=True. At most one of
# > these may be passed.
X1 = TypeVar("X1", covariant=True, contravariant=True) # E
T = TypeVar("T")
T_co = TypeVar("T_co", covariant=True)
T_contra = TypeVar("T_contra", contravariant=True)
class ImmutableList(Generic[T_co]):
def __init__(self, items: Iterable[T_co]) -> None:
...
def __iter__(self) -> Iterator[T_co]:
...
class Employee:
...
class Manager(Employee):
...
managers: ImmutableList[Manager] = ImmutableList([Manager()])
employees: ImmutableList[Employee] = managers # OK
E = TypeVar("E", bound=Employee)
def dump_employee(e: E) -> E:
return e
dump_employee(Manager()) # OK
B_co = TypeVar("B_co", covariant=True)
# > Variance has no meaning, and should therefore be ignored by type checkers,
# > if a type variable is bound to a generic function or type alias.
def func(x: list[B_co]) -> B_co: # OK
...
class Co(Generic[T_co]):
...
class Contra(Generic[T_contra]):
...
class Inv(Generic[T]):
...
class CoContra(Generic[T_co, T_contra]):
...
class Class1(Inv[T_co]): # E: Inv requires invariant TypeVar
pass
class Class2(Inv[T_contra]): # E: Inv requires invariant TypeVar
pass
class Co_Child1(Co[T_co]): # OK
...
class Co_Child2(Co[T]): # OK
...
class Co_Child3(Co[T_contra]): # E: Co requires covariant
...
class Contra_Child1(Contra[T_contra]): # OK
...
class Contra_Child2(Contra[T]): # OK
...
class Contra_Child3(Contra[T_co]): # E: Contra requires contravariant
...
class Contra_Child4(Contra[Co[T_contra]]): # OK
...
class Contra_Child5(Contra[Co[T_co]]): # E: Contra requires contravariant
...
class Contra_Child6(Contra[Co[T]]): # OK
...
class CoContra_Child1(CoContra[T_co, T_contra]): # OK
...
class CoContra_Child2(
CoContra[T_co, T_co] # E: Second type arg must be contravariant
):
...
class CoContra_Child3(
CoContra[T_contra, T_contra] # E: First type arg must be covariant
):
...
class CoContra_Child4(CoContra[T, T]): # OK
...
class CoContra_Child5(
CoContra[Co[T_co], Co[T_co]] # E: Second type arg must be contravariant
):
...
class CoToContra(Contra[Co[T_contra]]): # OK
...
class ContraToContra(Contra[Contra[T_co]]): # OK
...
class CoToCo(Co[Co[T_co]]): # OK
...
class ContraToCo(Co[Contra[T_contra]]): # OK
...
class CoToContraToContra(Contra[Co[Contra[T_contra]]]): # E
...
class ContraToContraToContra(Contra[Contra[Contra[T_co]]]): # E
...
Co_TA = Co[T_co]
Contra_TA = Contra[T_contra]
class CoToContra_WithTA(Contra_TA[Co_TA[T_contra]]): # OK
...
class ContraToContra_WithTA(Contra_TA[Contra_TA[T_co]]): # OK
...
class CoToCo_WithTA(Co_TA[Co_TA[T_co]]): # OK
...
class ContraToCo_WithTA(Co_TA[Contra_TA[T_contra]]): # OK
...
class CoToContraToContra_WithTA(Contra_TA[Co_TA[Contra_TA[T_contra]]]): # E
...
class ContraToContraToContra_WithTA(
Contra_TA[Contra_TA[Contra_TA[T_co]]] # E
):
...
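# Supplementary sketch (print_names is an illustrative addition): covariance
# of ImmutableList is what makes passing the more specific list safe here.
def print_names(items: ImmutableList[Employee]) -> None:
    for _ in items:
        ...

print_names(managers)  # OK: ImmutableList is covariant in T_co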

View File

@@ -0,0 +1,194 @@
"""
Tests variance inference for type parameters.
"""
# Specification: https://peps.python.org/pep-0695/#variance-inference
from dataclasses import dataclass
# T1 should be invariant
# T2 should be contravariant
# T3 should be covariant
from typing import Generic, Iterator, Sequence, TypeVar
class ClassA[T1, T2, T3](list[T1]):
def method1(self, a: T2) -> None:
...
def method2(self) -> T3:
...
def func_a(p1: ClassA[float, int, int], p2: ClassA[int, float, float]):
v1: ClassA[int, int, int] = p1 # E
v2: ClassA[float, float, int] = p1 # E
v3: ClassA[float, int, float] = p1 # OK
v4: ClassA[int, int, int] = p2 # E
v5: ClassA[int, int, float] = p2 # OK
class ShouldBeCovariant1[T]:
def __getitem__(self, index: int) -> T:
...
def __iter__(self) -> Iterator[T]:
...
vco1_1: ShouldBeCovariant1[float] = ShouldBeCovariant1[int]() # OK
vco1_2: ShouldBeCovariant1[int] = ShouldBeCovariant1[float]() # E
class ShouldBeCovariant2[T](Sequence[T]):
pass
vco2_1: ShouldBeCovariant2[float] = ShouldBeCovariant2[int]() # OK
vco2_2: ShouldBeCovariant2[int] = ShouldBeCovariant2[float]() # E
class ShouldBeCovariant3[T]:
def method1(self) -> "ShouldBeCovariant2[T]":
...
vco3_1: ShouldBeCovariant3[float] = ShouldBeCovariant3[int]() # OK
vco3_2: ShouldBeCovariant3[int] = ShouldBeCovariant3[float]() # E
@dataclass(frozen=True)
class ShouldBeCovariant4[T]:
x: T
vo4_1: ShouldBeCovariant4[float] = ShouldBeCovariant4[int](1) # OK
vo4_2: ShouldBeCovariant4[int] = ShouldBeCovariant4[float](1) # E
class ShouldBeCovariant5[T]:
def __init__(self, x: T) -> None:
self._x = x
@property
def x(self) -> T:
return self._x
vo5_1: ShouldBeCovariant5[float] = ShouldBeCovariant5[int](1) # OK
vo5_2: ShouldBeCovariant5[int] = ShouldBeCovariant5[float](1) # E
class ShouldBeInvariant1[T]:
def __init__(self, value: T) -> None:
self._value = value
@property
def value(self):
return self._value
@value.setter
def value(self, value: T):
self._value = value
vinv1_1: ShouldBeInvariant1[float] = ShouldBeInvariant1[int](1) # E
vinv1_2: ShouldBeInvariant1[int] = ShouldBeInvariant1[float](1.1) # E
class ShouldBeInvariant2[T]:
def __init__(self, value: T) -> None:
self._value = value
def get_value(self) -> T:
return self._value
def set_value(self, value: T):
self._value = value
vinv2_1: ShouldBeInvariant2[float] = ShouldBeInvariant2[int](1) # E
vinv2_2: ShouldBeInvariant2[int] = ShouldBeInvariant2[float](1.1) # E
class ShouldBeInvariant3[K, V](dict[K, V]):
pass
vinv3_1: ShouldBeInvariant3[float, str] = ShouldBeInvariant3[int, str]() # E
vinv3_2: ShouldBeInvariant3[int, str] = ShouldBeInvariant3[float, str]() # E
vinv3_3: ShouldBeInvariant3[str, float] = ShouldBeInvariant3[str, int]() # E
vinv3_4: ShouldBeInvariant3[str, int] = ShouldBeInvariant3[str, float]() # E
@dataclass
class ShouldBeInvariant4[T]:
x: T
vinv4_1: ShouldBeInvariant4[float] = ShouldBeInvariant4[int](1) # E
class ShouldBeInvariant5[T]:
def __init__(self, x: T) -> None:
self.x = x
vinv5_1: ShouldBeInvariant5[float] = ShouldBeInvariant5[int](1) # E
class ShouldBeContravariant1[T]:
def __init__(self, value: T) -> None:
pass
def set_value(self, value: T) -> None:
pass
vcontra1_1: ShouldBeContravariant1[float] = ShouldBeContravariant1[int](1) # E
vcontra1_2: ShouldBeContravariant1[int] = ShouldBeContravariant1[float](1.2) # OK
# Test the case where a class with inferred variance derives from
# a traditional class that doesn't use inferred variance.
T = TypeVar("T")
T_co = TypeVar("T_co", covariant=True)
T_contra = TypeVar("T_contra", contravariant=True)
class Parent_Invariant(Generic[T]):
pass
class ShouldBeInvariant6[T](Parent_Invariant[T]):
pass
a1: ShouldBeInvariant6[int] = ShouldBeInvariant6[float]() # E
a2: ShouldBeInvariant6[float] = ShouldBeInvariant6[int]() # E
class Parent_Covariant(Generic[T_co]):
pass
class ShouldBeCovariant6[T](Parent_Covariant[T]):
pass
b1: ShouldBeCovariant6[int] = ShouldBeCovariant6[float]() # E
b2: ShouldBeCovariant6[float] = ShouldBeCovariant6[int]() # OK
class Parent_Contravariant(Generic[T_contra]):
pass
class ShouldBeContravariant2[T](Parent_Contravariant[T]):
pass
c1: ShouldBeContravariant2[int] = ShouldBeContravariant2[float]() # OK
c2: ShouldBeContravariant2[float] = ShouldBeContravariant2[int]() # E
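# Supplementary sketch (ClassAExplicit is an illustrative addition): the
# variances inferred for ClassA above match these explicit declarations.
class ClassAExplicit(Generic[T, T_contra, T_co], list[T]):
    def method1(self, a: T_contra) -> None:
        ...
    def method2(self) -> T_co:
        ...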

View File

@@ -0,0 +1,53 @@
"""
Tests the historical (pre-3.8) mechanism for specifying positional-only
parameters in function signatures.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/historical.html#positional-only-parameters
# > Type checkers should support the following special case: all parameters with
# > names that begin but don't end with __ are assumed to be positional-only:
def f1(__x: int, __y__: int = 0) -> None: ...
f1(3, __y__=1) # OK
f1(__x=3) # E
# > Consistent with PEP 570 syntax, positional-only parameters cannot appear
# > after parameters that accept keyword arguments. Type checkers should
# > enforce this requirement:
def f2(x: int, __y: int) -> None: ... # E
def f3(x: int, *args: int, __y: int) -> None: ... # OK
f3(3, __y=3) # OK
class A:
def m1(self, __x: int, __y__: int = 0) -> None: ...
def m2(self, x: int, __y: int) -> None: ... # E
a = A()
a.m1(3, __y__=1) # OK
a.m1(__x=3) # E
# The historical mechanism should not apply when new-style (PEP 570)
# syntax is used.
def f4(x: int, /, __y: int) -> None: ... # OK
f4(3, __y=4) # OK
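# Supplementary sketch (f5 is an illustrative addition): the PEP 570 spelling
# of f1 above, for comparison.
def f5(x: int, /, __y__: int = 0) -> None: ...
f5(3, __y__=1)  # OK
f5(x=3)  # E: x is positional-only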

View File

@@ -0,0 +1,116 @@
"""
Tests interactions between Literal types and other typing features.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/literal.html#interactions-with-other-types-and-features
from enum import Enum
from typing import IO, Any, Final, Generic, Literal, TypeVar, assert_type, overload
def func1(v: tuple[int, str, list[bool]], a: Literal[0], b: Literal[5], c: Literal[-5]):
assert_type(v[a], int)
assert_type(v[2], list[bool])
v[b] # E: index out of range
v[c] # E: index out of range
v[4] # E: index out of range
v[-4] # E: index out of range
_PathType = str | bytes | int
@overload
def open(
path: _PathType,
mode: Literal["r", "w", "a", "x", "r+", "w+", "a+", "x+"],
) -> IO[str]:
...
@overload
def open(
path: _PathType,
mode: Literal["rb", "wb", "ab", "xb", "r+b", "w+b", "a+b", "x+b"],
) -> IO[bytes]:
...
@overload
def open(path: _PathType, mode: str) -> IO[Any]:
...
def open(path: _PathType, mode: Any) -> Any:
pass
assert_type(open("path", "r"), IO[str])
assert_type(open("path", "wb"), IO[bytes])
assert_type(open("path", "other"), IO[Any])
A = TypeVar("A", bound=int)
B = TypeVar("B", bound=int)
C = TypeVar("C", bound=int)
class Matrix(Generic[A, B]):
def __add__(self, other: "Matrix[A, B]") -> "Matrix[A, B]":
...
def __matmul__(self, other: "Matrix[B, C]") -> "Matrix[A, C]":
...
def transpose(self) -> "Matrix[B, A]":
...
def func2(a: Matrix[Literal[2], Literal[3]], b: Matrix[Literal[3], Literal[7]]):
c = a @ b
assert_type(c, Matrix[Literal[2], Literal[7]])
T = TypeVar("T", Literal["a"], Literal["b"], Literal["c"])
S = TypeVar("S", bound=Literal["foo"])
class Status(Enum):
SUCCESS = 0
INVALID_DATA = 1
FATAL_ERROR = 2
def parse_status1(s: str | Status) -> None:
if s is Status.SUCCESS:
assert_type(s, Literal[Status.SUCCESS])
elif s is Status.INVALID_DATA:
assert_type(s, Literal[Status.INVALID_DATA])
elif s is Status.FATAL_ERROR:
assert_type(s, Literal[Status.FATAL_ERROR])
else:
assert_type(s, str)
def expects_bad_status(status: Literal["MALFORMED", "ABORTED"]):
...
def expects_pending_status(status: Literal["PENDING"]):
...
def parse_status2(status: str) -> None:
if status in ("MALFORMED", "ABORTED"):
return expects_bad_status(status)
if status == "PENDING":
expects_pending_status(status)
final_val1: Final = 3
assert_type(final_val1, Literal[3])
final_val2: Final = True
assert_type(final_val2, Literal[True])
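# Supplementary sketch (func3 is an illustrative addition): mismatched inner
# dimensions are rejected because the literal type arguments do not line up.
def func3(a: Matrix[Literal[2], Literal[3]], b: Matrix[Literal[4], Literal[7]]):
    a @ b  # E: Literal[4] is not assignable to Literal[3]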

View File

@@ -0,0 +1,175 @@
"""
Tests handling of the LiteralString special form.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/literal.html#literalstring
from typing import (
Any,
Generic,
Literal,
LiteralString,
Sequence,
TypeVar,
assert_type,
overload,
)
variable_annotation: LiteralString
def my_function(literal_string: LiteralString) -> LiteralString:
...
class Foo:
my_attribute: LiteralString = ""
type_argument: list[LiteralString]
T = TypeVar("T", bound=LiteralString)
bad_union: Literal["hello", LiteralString] # E
bad_nesting: Literal[LiteralString] # E
def func1(a: Literal["one"], b: Literal["two"]):
x1: LiteralString = a
x2: Literal[""] = b # E
def func2(a: LiteralString, b: LiteralString):
# > Addition: x + y is of type LiteralString if both x and y are compatible with LiteralString.
assert_type(a + b, LiteralString)
# > Joining: sep.join(xs) is of type LiteralString if sep's type is
# > compatible with LiteralString and xs's type is compatible with Iterable[LiteralString].
assert_type(",".join((a, b)), LiteralString)
assert_type(",".join((a, str(b))), str)
# > In-place addition: If s has type LiteralString and x has type compatible with
# > LiteralString, then s += x preserves s's type as LiteralString.
a += "hello"
b += a
# > String formatting: An f-string has type LiteralString if and only if its constituent
# > expressions are literal strings. s.format(...) has type LiteralString if and only if
# > s and the arguments have types compatible with LiteralString.
assert_type(f"{a} {b}", LiteralString)
variable = 3
x1: LiteralString = f"{a} {str(variable)}" # E
assert_type(a + str(1), str)
# > LiteralString is compatible with the type str
x2: str = a
# > Other literal types, such as literal integers, are not compatible with LiteralString.
x3: LiteralString = 3 # E
x4: LiteralString = b"test" # E
# > Conditional statements and expressions work as expected.
def condition1() -> bool:
...
def return_literal_string() -> LiteralString:
return "foo" if condition1() else "bar" # OK
def return_literal_str2(literal_string: LiteralString) -> LiteralString:
return "foo" if condition1() else literal_string # OK
def return_literal_str3() -> LiteralString:
result: LiteralString
if condition1():
result = "foo"
else:
result = "bar"
return result # OK
# > TypeVars can be bound to LiteralString.
TLiteral = TypeVar("TLiteral", bound=LiteralString)
def literal_identity(s: TLiteral) -> TLiteral:
return s
def func3(s: Literal["hello"]):
y = literal_identity(s)
assert_type(y, Literal["hello"])
def func4(s: LiteralString):
y2 = literal_identity(s)
assert_type(y2, LiteralString)
def func5(s: str):
literal_identity(s) # E
# > LiteralString can be used as a type argument for generic classes.
class Container(Generic[T]):
def __init__(self, value: T) -> None:
self.value = value
def func6(s: LiteralString):
x: Container[LiteralString] = Container(s) # OK
def func7(s: str):
x_error: Container[LiteralString] = Container(s) # E
# > Standard containers like List work as expected.
xs: list[LiteralString] = ["foo", "bar", "baz"]
class A: pass
class B(A): pass
class C(B): pass
@overload
def func8(x: Literal["foo"]) -> C:
...
@overload
def func8(x: LiteralString) -> B:
...
@overload
def func8(x: str) -> A:
...
def func8(x: Any) -> Any:
...
assert_type(func8("foo"), C) # First overload
assert_type(func8("bar"), B) # Second overload
assert_type(func8(str(1)), A) # Third overload
def func9(val: list[LiteralString]):
x1: list[str] = val # E
def func10(val: Sequence[LiteralString]):
x1: Sequence[str] = val # OK
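# Supplementary sketch (run_query and func11 are illustrative additions): the
# motivating use case, an API that accepts only strings built from literals.
def run_query(sql: LiteralString) -> None:
    ...

def func11(user_input: str, table: Literal["users", "orders"]):
    run_query("SELECT * FROM users")  # OK
    run_query("SELECT * FROM " + table)  # OK
    run_query("SELECT * FROM " + user_input)  # E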

View File

@@ -0,0 +1,67 @@
"""
Tests legal and illegal parameterizations of Literal.
"""
# > Literal must be parameterized with at least one type.
from typing import Any, Literal, TypeVar
from enum import Enum
class Color(Enum):
RED = 0
GREEN = 1
BLUE = 2
good1: Literal[26]
good2: Literal[0x1A]
good3: Literal[-4]
good4: Literal[+5]
good5: Literal["hello world"]
good6: Literal[b"hello world"]
good7: Literal["hello world"]
good8: Literal[True]
good9: Literal[Color.RED]
good10: Literal[None]
ReadOnlyMode = Literal["r", "r+"]
WriteAndTruncateMode = Literal["w", "w+", "wt", "w+t"]
WriteNoTruncateMode = Literal["r+", "r+t"]
AppendMode = Literal["a", "a+", "at", "a+t"]
AllModes = Literal[ReadOnlyMode, WriteAndTruncateMode, WriteNoTruncateMode, AppendMode]
good11: Literal[Literal[Literal[1, 2, 3], "foo"], 5, None]
variable = 3
T = TypeVar("T")
# > Arbitrary expressions [are illegal]
bad1: Literal[3 + 4] # E
bad2: Literal["foo".replace("o", "b")] # E
bad3: Literal[4 + 3j] # E
bad4: Literal[~5] # E
bad5: Literal[not False] # E
bad6: Literal[(1, "foo", "bar")] # E
bad7: Literal[{"a": "b", "c": "d"}] # E
bad8: Literal[int] # E
bad9: Literal[variable] # E
bad10: Literal[T] # E
bad11: Literal[3.14] # E
bad12: Literal[Any] # E
bad13: Literal[...] # E
def my_function(x: Literal[1 + 2]) -> int: # E
return x * 3
x: Literal # E
y: Literal[my_function] = my_function # E
def func2(a: Literal[Color.RED]):
x1: Literal["Color.RED"] = a # E
x2: "Literal[Color.RED]" = a # OK

View File

@@ -0,0 +1,40 @@
"""
Tests the semantics of the Literal special form.
"""
from typing import Literal
from typing import Literal as L
v1: Literal[3] = 3
v2: Literal[3] = 4 # E
v3: L[-3] = -3
# > Literal[20] and Literal[0x14] are equivalent
def func1(a: Literal[20], b: Literal[0x14], c: Literal[0b10100]):
x1: Literal[0x14] = a
x2: Literal[0x14] = b
x3: Literal[0x14] = c
# > Literal[0] and Literal[False] are not equivalent
def func2(a: Literal[0], b: Literal[False]):
x1: Literal[False] = a # E
x2: Literal[0] = b # E
# > Given some value v that is a member of type T, the type Literal[v] shall
# > be treated as a subtype of T. For example, Literal[3] is a subtype of int.
def func3(a: L[3, 4, 5]):
b = a.__add__(3)
c = a + 3
a += 3 # E
# > When a Literal is parameterized with more than one value, it's treated
# > as exactly equivalent to the union of those types.
def func4(a: L[None, 3] | L[3, "foo", b"bar", True]):
x1: Literal[3, b"bar", True, "foo", None] = a
a = x1
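# Supplementary sketch (takes_int and func5 are illustrative additions):
# Literal[3] is a subtype of int, so it is accepted wherever int is expected.
def takes_int(x: int) -> None: ...
def func5(a: Literal[3]):
    takes_int(a)  # OK
    x: int = a  # OK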

View File

@@ -0,0 +1,106 @@
"""
Tests NamedTuple definitions using the class syntax.
"""
# Specification: https://typing.readthedocs.io/en/latest/spec/namedtuples.html#defining-named-tuples
# > Type checkers should support the class syntax
from typing import Generic, NamedTuple, TypeVar, assert_type
class Point(NamedTuple):
x: int
y: int
units: str = "meters"
# > Type checkers should synthesize a ``__new__`` method based on
# > the named tuple fields.
p1 = Point(1, 2)
assert_type(p1, Point)
assert_type(p1[0], int)
assert_type(p1[1], int)
assert_type(p1[2], str)
assert_type(p1[-1], str)
assert_type(p1[-2], int)
assert_type(p1[-3], int)
assert_type(p1[0:2], tuple[int, int])
assert_type(p1[0:], tuple[int, int, str])
print(p1[3]) # E
print(p1[-4]) # E
p2 = Point(1, 2, "")
assert_type(p2, Point)
p3 = Point(x=1, y=2)
assert_type(p3, Point)
p4 = Point(x=1, y=2, units="")
assert_type(p4, Point)
p5 = Point(1) # E
p6 = Point(x=1) # E
p7 = Point(1, "") # E
p8 = Point(1, 2, units=3) # E
p9 = Point(1, 2, "", "") # E
p10 = Point(1, 2, "", other="") # E
# > The runtime implementation of ``NamedTuple`` enforces that fields with default
# > values must come after fields without default values. Type checkers should
# > likewise enforce this restriction::
class Location(NamedTuple):
altitude: float = 0.0
latitude: float # E: previous field has a default value
# > A named tuple class can be subclassed, but any fields added by the subclass
# > are not considered part of the named tuple type. Type checkers should enforce
# > that these newly-added fields do not conflict with the named tuple fields
# > in the base class.
class PointWithName(Point):
name: str = "" # OK
pn1 = PointWithName(1, 2, "")
pnt1: tuple[int, int, str] = pn1
assert_type(pn1.name, str)
class BadPointWithName(Point):
name: str = "" # OK
x: int = 0 # E
# > In Python 3.11 and newer, the class syntax supports generic named tuple classes.
# > Type checkers should support this.
T = TypeVar("T")
class Property(NamedTuple, Generic[T]):
name: str
value: T
pr1 = Property("", 3.4)
assert_type(pr1, Property[float])
assert_type(pr1[1], float)
assert_type(pr1.value, float)
Property[str]("", 3.1) # E
# > ``NamedTuple`` does not support multiple inheritance. Type checkers should
# > enforce this limitation.
class Unit(NamedTuple, object): # E
name: str
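# Supplementary sketch: named tuples unpack like ordinary tuples, with element
# types taken from the field annotations.
px, py, punits = p1
assert_type(px, int)
assert_type(py, int)
assert_type(punits, str)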

View File

@@ -0,0 +1,66 @@
"""
Tests NamedTuple definitions using the functional syntax.
"""
from collections import namedtuple
from typing import NamedTuple
# Specification: https://typing.readthedocs.io/en/latest/spec/namedtuples.html#defining-named-tuples
# > A type checker may support the function syntax in its various forms::
Point1 = namedtuple("Point1", ["x", "y"])
p1_1 = Point1(x=1, y=1)
p1_2 = Point1(2.3, "")
p1_3 = Point1(2.3) # E
Point2 = namedtuple("Point2", ("x", "y"))
p2_1 = Point2(x=1, y=1)
p2_2 = Point2(2.3, "")
p2_3 = Point2() # E
Point3 = namedtuple("Point3", "x y")
p3_1 = Point3(x=1, y=1)
p3_2 = Point3(2.3, "")
p3_3 = Point3(1, 2, 3) # E
Point4 = namedtuple("Point4", "x, y")
p4_1 = Point4(x=1, y=1)
p4_2 = Point4(2.3, "")
p4_3 = Point4(1, z=3) # E
Point5 = NamedTuple("Point5", [("x", int), ("y", int)])
p5_1 = Point5(x=1, y=1)
p5_2 = Point5(2, 1)
p5_3 = Point5(2, "1") # E
p5_4 = Point5(1, 2, 3) # E
Point6 = NamedTuple("Point6", (("x", int), ("y", int)))
p6_1 = Point6(x=1, y=1)
p6_2 = Point6(2, 1)
p6_3 = Point6(2, "1") # E
p6_4 = Point6(x=1.1, y=2) # E
# > At runtime, the ``namedtuple`` function disallows field names that are
# > illegal Python identifiers and either raises an exception or replaces these
# > fields with a parameter name of the form ``_N``. The behavior depends on
# > the value of the ``rename`` argument. Type checkers may replicate this
# > behavior statically.
NT1 = namedtuple("NT1", ["a", "a"]) # E?: duplicate field name
NT2 = namedtuple("NT2", ["abc", "def"]) # E?: illegal field name
NT3 = namedtuple("NT3", ["abc", "def"], rename=False) # E?: illegal field name
NT4 = namedtuple("NT4", ["abc", "def"], rename=True) # OK
NT4(abc="", _1="") # OK
# > The ``namedtuple`` function also supports a ``defaults`` keyword argument that
# > specifies default values for the fields. Type checkers may support this.
NT5 = namedtuple("NT5", "a b c", defaults=(1, 2))
NT5(1) # OK
NT5(1, 2, 3) # OK
NT5() # E: too few arguments
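# Supplementary sketch: fields declared through the typed functional form are
# visible as attributes with the declared types.
from typing import assert_type
assert_type(Point5(1, 2).x, int)
assert_type(Point5(1, 2).y, int)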

View File

@@ -0,0 +1,27 @@
"""
Tests NamedTuple type compatibility rules.
"""
from typing import Any, NamedTuple
# Specification: https://typing.readthedocs.io/en/latest/spec/namedtuples.html#type-compatibility-rules
class Point(NamedTuple):
x: int
y: int
units: str = "meters"
# > A named tuple is a subtype of a ``tuple`` with a known length and parameterized
# > by types corresponding to the named tuple's individual field types.
p = Point(x=1, y=2, units="inches")
v1: tuple[int, int, str] = p # OK
v2: tuple[Any, ...] = p # OK
v3: tuple[int, int] = p # E: too few elements
v4: tuple[int, str, str] = p # E: incompatible element type
# > As with normal tuples, named tuples are covariant in their type parameters.
v5: tuple[float, float, str] = p # OK
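# Supplementary sketch: the subtyping is one-directional; an ordinary tuple is
# not assignable to the named tuple type.
t1: tuple[int, int, str] = (1, 2, "meters")
p2: Point = t1  # E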

Some files were not shown because too many files have changed in this diff