mirror of
https://gitflic.ru/project/openide/openide.git
synced 2026-03-22 23:31:05 +07:00
Previously, we mistakenly resolved the qualified names listed in the `field_specifiers` argument of `@dataclass_transform` in the scope where the dataclass itself is defined, instead of the scope containing the actual decorator application. As a result, if the file defining a dataclass imported a field specifier differently than the file applying `@dataclass_transform`, we failed to recognize a field specifier call on the RHS of an assignment and treated it as an ordinary field default value.

In particular, this is what happened with pydantic dataclasses. `pydantic.fields.Field` is usually imported as just `pydantic.Field` in the files where user dataclasses are defined, but it is imported under an alias and listed in the `field_specifiers` argument as `PydanticModelField` in `pydantic._internal._model_construction`, where `ModelMetaclass` is defined.

This was accidentally broken in f15a07836e7aeac7c46b489b4742e8248a0e6ef4, which added support for decorating class methods with `dataclass_transform` (see testData/inspections/PyDataclassInspection/DataclassTransformFieldsOrder/decorator.py).

Until `PyResolveUtil.resolveQualifiedNameInScope` automatically traverses containing scopes looking for a name, the file containing the decorator application seems like a safe trade-off for the resolution scope, because field specifiers are normally defined or imported somewhere at the top level.

(cherry picked from commit de9afeb0831a52f058453fe678de229d41c26a4d)

IJ-CR-151380

GitOrigin-RevId: b6576ec7b72ea1e19e93b6190372a5168003c396