diff --git a/.gitignore b/.gitignore
index 9c325f3e29f8a..7693576a1c1b2 100644
--- a/.gitignore
+++ b/.gitignore
@@ -60,3 +60,5 @@ test_capi
test_capi
/mypyc/lib-rt/build/
/mypyc/lib-rt/*.so
+
+.envrc
diff --git a/AGENTS.md b/AGENTS.md
new file mode 100644
index 0000000000000..5c0f88f6070be
--- /dev/null
+++ b/AGENTS.md
@@ -0,0 +1,109 @@
+This file provides guidance to coding agents. Much of it should be
+useful to humans, too.
+
+## Current Work: Typemap PEP Implementation
+
+We are implementing a PEP draft for type-level computation. The specification
+is in `pep.rst`. Refer to it when implementing new type operators
+or features.
+
+## Default virtualenv
+
+The default virtualenv is ``venv``, so most of the commands below
+should be run using the executables in ``venv/bin`` when available.
+
+
+## Pre-commit
+
+Always run ``venv/bin/python runtests.py lint self`` before committing
+and make sure that it passes.
+
+
+## Common Commands
+
+### Running Tests
+```bash
+# Run a single test by name (uses pytest -k matching)
+pytest -n0 -k testNewSyntaxBasics
+
+# Run all tests in a specific test file
+pytest mypy/test/testcheck.py::TypeCheckSuite::check-dataclasses.test
+
+# Run tests matching a pattern
+pytest -q -k "MethodCall"
+
+# Run the full test suite (slow)
+python runtests.py
+
+# Run with debugging (disables parallelization)
+pytest -n0 --pdb -k testName
+```
+
+### Linting and Type Checking
+```bash
+# Run formatters and linters
+python runtests.py lint
+
+# Type check mypy's own code
+python -m mypy --config-file mypy_self_check.ini -p mypy
+```
+
+### Manual Testing
+```bash
+# Run mypy directly on a file
+python -m mypy PROGRAM.py
+
+# Run mypy on a module
+python -m mypy -m MODULE
+```
+
+## Architecture Overview
+
+Mypy is a static type checker that processes Python code through multiple passes:
+
+### Core Pipeline
+1. **Parsing** (`fastparse.py`) - Converts source to AST using Python's `ast` module
+2. **Semantic Analysis** (`semanal.py`, `semanal_main.py`) - Resolves names, builds symbol tables, analyzes imports
+3. **Type Checking** (`checker.py`, `checkexpr.py`) - Verifies type correctness
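+
+As a rough illustration of the first pass: `fastparse.py` starts from the
+stdlib `ast` module and translates its nodes into mypy's own AST classes.
+A sketch using only the stdlib (not mypy internals):
+
+```python
+import ast
+
+# Parse an annotated assignment, as fastparse does initially
+tree = ast.parse("x: int = 1")
+stmt = tree.body[0]
+assert isinstance(stmt, ast.AnnAssign)
+# fastparse would translate this into a mypy AssignmentStmt, with the
+# annotation stored (roughly) as an UnboundType("int") until semantic
+# analysis resolves it
+```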
+
+### Key Data Structures
+
+**AST Nodes** (`nodes.py`):
+- `MypyFile` - Root of a parsed module
+- `FuncDef`, `ClassDef` - Function/class definitions
+- `TypeInfo` - Metadata about classes (bases, MRO, members)
+- `SymbolTable`, `SymbolTableNode` - Name resolution
+
+**Types** (`types.py`):
+- `Type` - Base class for all types
+- `ProperType` - Concrete types (Instance, CallableType, TupleType, UnionType, etc.)
+- `TypeAliasType` - Type aliases that expand to proper types
+- `get_proper_type()` - Expands type aliases to proper types
+
+### Type Operations
+- `subtypes.py` - Subtype checking (`is_subtype()`)
+- `meet.py`, `join.py` - Type meets (intersection) and joins (union)
+- `expandtype.py` - Type variable substitution
+- `typeops.py` - Type utilities and transformations
+
+### Build System
+- `build.py` - Orchestrates the entire type checking process
+- `State` - Represents a module being processed
+- Handles incremental checking and caching
+
+## Test Data Format
+
+Tests in `test-data/unit/check-*.test` use a declarative format:
+```
+[case testName]
+# flags: --some-flag
+x: int = "wrong" # E: Incompatible types...
+
+[builtins fixtures/tuple.pyi]
+[typing fixtures/typing-full.pyi]
+```
+
+- `# E:` marks expected errors, `# N:` for notes, `# W:` for warnings
+- `[builtins fixtures/...]` specifies stub files for builtins
+- `[typing fixtures/typing-full.pyi]` uses extended typing stubs
+- Tests use minimal stubs by default; define needed classes in test or use fixtures
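+
+For instance, a minimal (hypothetical) type-level test case in this format;
+the exact import path, fixture, and message lines would need to match the
+real suite:
+
+```
+[case testIsAssignableSmoke]
+from _typeshed.typemap import IsAssignable
+x: IsAssignable[int, object]  # should evaluate to Literal[True]
+[builtins fixtures/tuple.pyi]
+```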
diff --git a/CLAUDE.md b/CLAUDE.md
new file mode 100644
index 0000000000000..e195f119f88cf
--- /dev/null
+++ b/CLAUDE.md
@@ -0,0 +1,7 @@
+# CLAUDE.md
+
+This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
+
+Claude ought to support AGENTS.md but doesn't seem to yet.
+
+@AGENTS.md
diff --git a/README.md b/README.md
index 8040566b18eff..cb0404a60f082 100644
--- a/README.md
+++ b/README.md
@@ -1,190 +1,60 @@
-
+# PEP 827: Type-Level Computation — Prototype Implementation
-Mypy: Static Typing for Python
-=======================================
+This is a prototype implementation of [PEP
+827](https://peps.python.org/pep-0827/) (Type Manipulation) in mypy.
-[](https://pypi.org/project/mypy/)
-[](https://pypistats.org/packages/mypy)
-[](https://github.com/python/mypy/actions)
-[](https://mypy.readthedocs.io/en/latest/?badge=latest)
-[](https://gitter.im/python/typing?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
-[](https://mypy-lang.org/)
-[](https://github.com/psf/black)
-[](https://github.com/astral-sh/ruff)
+The PEP introduces type-level computation operators for introspecting and constructing types.
-Got a question?
----------------
+Most of the main features are prototyped, and this should be suitable
+for experimentation, but it is not yet production quality or ready to
+be a PR. (This prototype has been AI assisted, and at least some
+slop has made it in that will need to be fixed; it might also benefit
+from a full history rewrite.)
-We are always happy to answer questions! Here are some good places to ask them:
+For the original mypy README, see [REAL_README.md](REAL_README.md).
-- for general questions about Python typing, try [typing discussions](https://github.com/python/typing/discussions)
-- for anything you're curious about, try [gitter chat](https://gitter.im/python/typing)
+## What's implemented
-If you're just getting started,
-[the documentation](https://mypy.readthedocs.io/en/stable/index.html)
-and [type hints cheat sheet](https://mypy.readthedocs.io/en/stable/cheat_sheet_py3.html)
-can also help answer questions.
+- **Type operators**: `IsAssignable`, `IsEquivalent`, `Bool`, `GetArg`, `GetArgs`, `GetMember`, `GetMemberType`, `Members`, `Attrs`, `FromUnion`, `Length`, `Slice`, `Concat`, `Uppercase`, `Lowercase`, `Capitalize`, `Uncapitalize`, `RaiseError`
+- **Conditional types**: `true_type if BoolType else false_type`
+- **Type-level comprehensions**: `*[T for x in Iter[...]]` with filtering
+- **Dot notation**: `member.name`, `member.type`, `param.type`, etc.
+- **Boolean operators**: `and`, `or`, `not` on type booleans
+- **Data types**: `Member[name, type, quals, init, definer]`, `Param[name, type, quals]`
+- **Extended callables**: `Callable[Params[Param[...], ...], RetType]`
+- **Object construction**: `NewProtocol[*Members]`, `NewTypedDict[*Members]`
+- **Class modification**: `UpdateClass[*Members]` as return type of decorators / `__init_subclass__`
+- **InitField**: Keyword argument capture with literal type inference
+- **Callable introspection**: `GetArg[SomeCallable, Callable, Literal[0]]` returns `Param` types
+- Incremental mode
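+
+For flavor, a hypothetical snippet exercising a few of these features.
+Operator names are as in the list above, but the import path and exact
+spelling follow the prototype's stubs and may differ from the final PEP:
+
+```python
+from _typeshed.typemap import IsAssignable, Members  # hypothetical import path
+
+type IsIntish[T] = IsAssignable[T, int]              # Literal[True] / Literal[False]
+type Picked[T] = str if IsIntish[T] else bytes       # conditional type
+type MemberNames[T] = tuple[*[m.name for m in Members[T]]]  # comprehension + dot notation
+```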
-If you think you've found a bug:
+## What's not yet implemented
-- check our [common issues page](https://mypy.readthedocs.io/en/stable/common_issues.html)
-- search our [issue tracker](https://github.com/python/mypy/issues) to see if
- it's already been reported
+- `GetSpecialAttr[T, Attr]` — extract `__name__`, `__module__`, `__qualname__`
+- `GenericCallable[Vs, lambda : Ty]` — generic callable types with lambda binding
+- `Overloaded[*Callables]` — overloaded function type construction
+- `any(comprehension)` / `all(comprehension)` — quantification over type booleans
+- `classmethod`/`staticmethod` representation in type-level computation
-To report a bug or request an enhancement:
+- Any attempt to make it perform well
+- Fine-grained mode is currently broken
-- report at [our issue tracker](https://github.com/python/mypy/issues)
-- if the issue is with a specific library or function, consider reporting it at
- [typeshed tracker](https://github.com/python/typeshed/issues) or the issue
- tracker for that library
+## Known bugs
-To discuss a new type system feature:
+- **`IsAssignable`/`IsEquivalent` over-accept on parameterized tuples.**
+  Both operators ignore type arguments on `tuple` (and presumably other
+  generics), so e.g. `IsAssignable[tuple[Literal[2], Literal[1]], tuple[Literal[2], ...]]`
+  and `IsEquivalent[tuple[Literal[2], Literal[1]], tuple[Literal[2], Literal[2]]]`
+  both return `Literal[True]`. On bare `Literal` values the operators work
+  correctly. Note that even a correct implementation would always return
+  `False` when comparing a fixed-length tuple against a homogeneous tuple,
+  so neither the current nor the intended behavior is useful for "all
+  elements equal" style checks; collapse with `Union[*xs]` and compare
+  against a representative element instead.
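+
+A sketch of that workaround (hypothetical alias name; the operators are the
+ones listed under "What's implemented"):
+
+```python
+# Collapse the element types into a union and compare against a
+# representative element, instead of comparing tuple arguments directly.
+type AllEqual[R, *Xs] = IsEquivalent[Union[*Xs], R]
+```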
-- discuss at [discuss.python.org](https://discuss.python.org/c/typing/32)
-- there is also some historical discussion at the [typing-sig mailing list](https://mail.python.org/archives/list/typing-sig@python.org/) and the [python/typing repo](https://github.com/python/typing/issues)
+## Key files
-What is mypy?
--------------
+- `mypy/typelevel.py` — All type operator evaluation logic
+- `mypy/typeanal.py` — Desugaring of conditional types, comprehensions, dot notation, extended callables
+- `mypy/typeshed/stdlib/_typeshed/typemap.pyi` — Stub declarations for all operators and data types
+- `test-data/unit/check-typelevel-*.test` — Test suite
-Mypy is a static type checker for Python.
+## Some implementation notes
-Type checkers help ensure that you're using variables and functions in your code
-correctly. With mypy, add type hints ([PEP 484](https://www.python.org/dev/peps/pep-0484/))
-to your Python programs, and mypy will warn you when you use those types
-incorrectly.
-
-Python is a dynamic language, so usually you'll only see errors in your code
-when you attempt to run it. Mypy is a *static* checker, so it finds bugs
-in your programs without even running them!
-
-Here is a small example to whet your appetite:
-
-```python
-number = input("What is your favourite number?")
-print("It is", number + 1) # error: Unsupported operand types for + ("str" and "int")
-```
-
-Adding type hints for mypy does not interfere with the way your program would
-otherwise run. Think of type hints as similar to comments! You can always use
-the Python interpreter to run your code, even if mypy reports errors.
-
-Mypy is designed with gradual typing in mind. This means you can add type
-hints to your code base slowly and that you can always fall back to dynamic
-typing when static typing is not convenient.
-
-Mypy has a powerful and easy-to-use type system, supporting features such as
-type inference, generics, callable types, tuple types, union types,
-structural subtyping and more. Using mypy will make your programs easier to
-understand, debug, and maintain.
-
-See [the documentation](https://mypy.readthedocs.io/en/stable/index.html) for
-more examples and information.
-
-In particular, see:
-
-- [type hints cheat sheet](https://mypy.readthedocs.io/en/stable/cheat_sheet_py3.html)
-- [getting started](https://mypy.readthedocs.io/en/stable/getting_started.html)
-- [list of error codes](https://mypy.readthedocs.io/en/stable/error_code_list.html)
-
-Quick start
------------
-
-Mypy can be installed using pip:
-
-```bash
-python3 -m pip install -U mypy
-```
-
-If you want to run the latest version of the code, you can install from the
-repo directly:
-
-```bash
-python3 -m pip install -U git+https://github.com/python/mypy.git
-```
-
-Now you can type-check the [statically typed parts] of a program like this:
-
-```bash
-mypy PROGRAM
-```
-
-You can always use the Python interpreter to run your statically typed
-programs, even if mypy reports type errors:
-
-```bash
-python3 PROGRAM
-```
-
-If you are working with large code bases, you can run mypy in
-[daemon mode], that will give much faster (often sub-second) incremental updates:
-
-```bash
-dmypy run -- PROGRAM
-```
-
-You can also try mypy in an [online playground](https://mypy-play.net/) (developed by
-Yusuke Miyazaki).
-
-[statically typed parts]: https://mypy.readthedocs.io/en/latest/getting_started.html#function-signatures-and-dynamic-vs-static-typing
-[daemon mode]: https://mypy.readthedocs.io/en/stable/mypy_daemon.html
-
-Integrations
-------------
-
-Mypy can be integrated into popular IDEs:
-
-- VS Code: provides [basic integration](https://code.visualstudio.com/docs/python/linting#_mypy) with mypy.
-- Vim:
- - Using [Syntastic](https://github.com/vim-syntastic/syntastic): in `~/.vimrc` add
- `let g:syntastic_python_checkers=['mypy']`
- - Using [ALE](https://github.com/dense-analysis/ale): should be enabled by default when `mypy` is installed,
- or can be explicitly enabled by adding `let b:ale_linters = ['mypy']` in `~/vim/ftplugin/python.vim`
-- Emacs: using [Flycheck](https://github.com/flycheck/)
-- Sublime Text: [SublimeLinter-contrib-mypy](https://github.com/fredcallaway/SublimeLinter-contrib-mypy)
-- PyCharm: [mypy plugin](https://github.com/dropbox/mypy-PyCharm-plugin)
-- IDLE: [idlemypyextension](https://github.com/CoolCat467/idlemypyextension)
-- pre-commit: use [pre-commit mirrors-mypy](https://github.com/pre-commit/mirrors-mypy), although
- note by default this will limit mypy's ability to analyse your third party dependencies.
-
-Web site and documentation
---------------------------
-
-Additional information is available at the web site:
-
-
-
-Jump straight to the documentation:
-
-
-
-Follow along our changelog at:
-
-
-
-Contributing
-------------
-
-Help in testing, development, documentation and other tasks is
-highly appreciated and useful to the project. There are tasks for
-contributors of all experience levels.
-
-To get started with developing mypy, see [CONTRIBUTING.md](CONTRIBUTING.md).
-
-Mypyc and compiled version of mypy
-----------------------------------
-
-[Mypyc](https://github.com/mypyc/mypyc) uses Python type hints to compile Python
-modules to faster C extensions. Mypy is itself compiled using mypyc: this makes
-mypy approximately 4 times faster than if interpreted!
-
-To install an interpreted mypy instead, use:
-
-```bash
-python3 -m pip install --no-binary mypy -U mypy
-```
-
-To use a compiled version of a development
-version of mypy, directly install a binary from
-.
-
-To contribute to the mypyc project, check out the issue tracker at
+- Evaluating `NewProtocol` creates a new anonymous `TypeInfo` that
+  doesn't go into any symbol table. That `TypeInfo` retains the
+  `NewProtocol` invocation that created it, and when we serialize
+  `Instance`s that refer to it, we serialize that invocation instead,
+  so the anonymous `TypeInfo`s themselves never need to be serialized.
diff --git a/REAL_README.md b/REAL_README.md
new file mode 100644
index 0000000000000..8040566b18eff
--- /dev/null
+++ b/REAL_README.md
@@ -0,0 +1,190 @@
+
+
+Mypy: Static Typing for Python
+=======================================
+
+[](https://pypi.org/project/mypy/)
+[](https://pypistats.org/packages/mypy)
+[](https://github.com/python/mypy/actions)
+[](https://mypy.readthedocs.io/en/latest/?badge=latest)
+[](https://gitter.im/python/typing?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
+[](https://mypy-lang.org/)
+[](https://github.com/psf/black)
+[](https://github.com/astral-sh/ruff)
+
+Got a question?
+---------------
+
+We are always happy to answer questions! Here are some good places to ask them:
+
+- for general questions about Python typing, try [typing discussions](https://github.com/python/typing/discussions)
+- for anything you're curious about, try [gitter chat](https://gitter.im/python/typing)
+
+If you're just getting started,
+[the documentation](https://mypy.readthedocs.io/en/stable/index.html)
+and [type hints cheat sheet](https://mypy.readthedocs.io/en/stable/cheat_sheet_py3.html)
+can also help answer questions.
+
+If you think you've found a bug:
+
+- check our [common issues page](https://mypy.readthedocs.io/en/stable/common_issues.html)
+- search our [issue tracker](https://github.com/python/mypy/issues) to see if
+ it's already been reported
+
+To report a bug or request an enhancement:
+
+- report at [our issue tracker](https://github.com/python/mypy/issues)
+- if the issue is with a specific library or function, consider reporting it at
+ [typeshed tracker](https://github.com/python/typeshed/issues) or the issue
+ tracker for that library
+
+To discuss a new type system feature:
+
+- discuss at [discuss.python.org](https://discuss.python.org/c/typing/32)
+- there is also some historical discussion at the [typing-sig mailing list](https://mail.python.org/archives/list/typing-sig@python.org/) and the [python/typing repo](https://github.com/python/typing/issues)
+
+What is mypy?
+-------------
+
+Mypy is a static type checker for Python.
+
+Type checkers help ensure that you're using variables and functions in your code
+correctly. With mypy, add type hints ([PEP 484](https://www.python.org/dev/peps/pep-0484/))
+to your Python programs, and mypy will warn you when you use those types
+incorrectly.
+
+Python is a dynamic language, so usually you'll only see errors in your code
+when you attempt to run it. Mypy is a *static* checker, so it finds bugs
+in your programs without even running them!
+
+Here is a small example to whet your appetite:
+
+```python
+number = input("What is your favourite number?")
+print("It is", number + 1) # error: Unsupported operand types for + ("str" and "int")
+```
+
+Adding type hints for mypy does not interfere with the way your program would
+otherwise run. Think of type hints as similar to comments! You can always use
+the Python interpreter to run your code, even if mypy reports errors.
+
+Mypy is designed with gradual typing in mind. This means you can add type
+hints to your code base slowly and that you can always fall back to dynamic
+typing when static typing is not convenient.
+
+Mypy has a powerful and easy-to-use type system, supporting features such as
+type inference, generics, callable types, tuple types, union types,
+structural subtyping and more. Using mypy will make your programs easier to
+understand, debug, and maintain.
+
+See [the documentation](https://mypy.readthedocs.io/en/stable/index.html) for
+more examples and information.
+
+In particular, see:
+
+- [type hints cheat sheet](https://mypy.readthedocs.io/en/stable/cheat_sheet_py3.html)
+- [getting started](https://mypy.readthedocs.io/en/stable/getting_started.html)
+- [list of error codes](https://mypy.readthedocs.io/en/stable/error_code_list.html)
+
+Quick start
+-----------
+
+Mypy can be installed using pip:
+
+```bash
+python3 -m pip install -U mypy
+```
+
+If you want to run the latest version of the code, you can install from the
+repo directly:
+
+```bash
+python3 -m pip install -U git+https://github.com/python/mypy.git
+```
+
+Now you can type-check the [statically typed parts] of a program like this:
+
+```bash
+mypy PROGRAM
+```
+
+You can always use the Python interpreter to run your statically typed
+programs, even if mypy reports type errors:
+
+```bash
+python3 PROGRAM
+```
+
+If you are working with large code bases, you can run mypy in
+[daemon mode], that will give much faster (often sub-second) incremental updates:
+
+```bash
+dmypy run -- PROGRAM
+```
+
+You can also try mypy in an [online playground](https://mypy-play.net/) (developed by
+Yusuke Miyazaki).
+
+[statically typed parts]: https://mypy.readthedocs.io/en/latest/getting_started.html#function-signatures-and-dynamic-vs-static-typing
+[daemon mode]: https://mypy.readthedocs.io/en/stable/mypy_daemon.html
+
+Integrations
+------------
+
+Mypy can be integrated into popular IDEs:
+
+- VS Code: provides [basic integration](https://code.visualstudio.com/docs/python/linting#_mypy) with mypy.
+- Vim:
+ - Using [Syntastic](https://github.com/vim-syntastic/syntastic): in `~/.vimrc` add
+ `let g:syntastic_python_checkers=['mypy']`
+ - Using [ALE](https://github.com/dense-analysis/ale): should be enabled by default when `mypy` is installed,
+ or can be explicitly enabled by adding `let b:ale_linters = ['mypy']` in `~/vim/ftplugin/python.vim`
+- Emacs: using [Flycheck](https://github.com/flycheck/)
+- Sublime Text: [SublimeLinter-contrib-mypy](https://github.com/fredcallaway/SublimeLinter-contrib-mypy)
+- PyCharm: [mypy plugin](https://github.com/dropbox/mypy-PyCharm-plugin)
+- IDLE: [idlemypyextension](https://github.com/CoolCat467/idlemypyextension)
+- pre-commit: use [pre-commit mirrors-mypy](https://github.com/pre-commit/mirrors-mypy), although
+ note by default this will limit mypy's ability to analyse your third party dependencies.
+
+Web site and documentation
+--------------------------
+
+Additional information is available at the web site:
+
+
+
+Jump straight to the documentation:
+
+
+
+Follow along our changelog at:
+
+
+
+Contributing
+------------
+
+Help in testing, development, documentation and other tasks is
+highly appreciated and useful to the project. There are tasks for
+contributors of all experience levels.
+
+To get started with developing mypy, see [CONTRIBUTING.md](CONTRIBUTING.md).
+
+Mypyc and compiled version of mypy
+----------------------------------
+
+[Mypyc](https://github.com/mypyc/mypyc) uses Python type hints to compile Python
+modules to faster C extensions. Mypy is itself compiled using mypyc: this makes
+mypy approximately 4 times faster than if interpreted!
+
+To install an interpreted mypy instead, use:
+
+```bash
+python3 -m pip install --no-binary mypy -U mypy
+```
+
+To use a compiled version of a development
+version of mypy, directly install a binary from
+.
+
+To contribute to the mypyc project, check out the issue tracker at
diff --git a/mypy/build.py b/mypy/build.py
index 4fe6f52f58287..d8e5602766293 100644
--- a/mypy/build.py
+++ b/mypy/build.py
@@ -120,6 +120,7 @@
from mypy.partially_defined import PossiblyUndefinedVariableVisitor
from mypy.semanal import SemanticAnalyzer
from mypy.semanal_pass1 import SemanticAnalyzerPreAnalysis
+from mypy.typelevel import typelevel_ctx
from mypy.util import (
DecodeError,
decode_python_encoding,
@@ -483,7 +484,8 @@ def build_inner(
reset_global_state()
try:
- graph = dispatch(sources, manager, stdout)
+ with typelevel_ctx.set_api(manager.semantic_analyzer):
+ graph = dispatch(sources, manager, stdout)
if not options.fine_grained_incremental:
type_state.reset_all_subtype_caches()
if options.timing_stats is not None:
diff --git a/mypy/checker.py b/mypy/checker.py
index fa531daba798f..73914ec5b5628 100644
--- a/mypy/checker.py
+++ b/mypy/checker.py
@@ -222,6 +222,7 @@
TypedDictType,
TypeGuardedType,
TypeOfAny,
+ TypeOperatorType,
TypeTranslator,
TypeType,
TypeVarId,
@@ -240,7 +241,13 @@
is_literal_type,
is_named_instance,
)
-from mypy.types_utils import is_overlapping_none, remove_optional, store_argument_type, strip_type
+from mypy.types_utils import (
+ is_overlapping_none,
+ remove_optional,
+ store_argument_type,
+ strip_type,
+ try_getting_literal,
+)
from mypy.typetraverser import TypeTraverserVisitor
from mypy.typevars import fill_typevars, fill_typevars_with_any, has_no_typevars
from mypy.util import is_dunder, is_sunder
@@ -1567,6 +1574,12 @@ def check_funcdef_item(
and fdef.name in ("__init__", "__init_subclass__")
and not isinstance(get_proper_type(typ.ret_type), (NoneType, UninhabitedType))
and not self.dynamic_funcs[-1]
+ # Allow UpdateClass return type for __init_subclass__
+ and not (
+ fdef.name == "__init_subclass__"
+ and isinstance(typ.ret_type, TypeOperatorType)
+ and typ.ret_type.type.name == "UpdateClass"
+ )
):
self.fail(message_registry.MUST_HAVE_NONE_RETURN_TYPE.format(fdef.name), item)
@@ -3409,6 +3422,16 @@ def check_assignment(
rvalue_type, lvalue_type = self.check_simple_assignment(
lvalue_type, rvalue, context=rvalue, inferred=inferred, lvalue=lvalue
)
+ # Store init_type for annotated class members with explicit values.
+ # This preserves the literal type information for the typemap Init field.
+ if (
+ isinstance(lvalue, NameExpr)
+ and isinstance(lvalue.node, Var)
+ and lvalue.node.is_initialized_in_class
+ and lvalue.node.has_explicit_value
+ and lvalue.node.init_type is None
+ ):
+ lvalue.node.init_type = try_getting_literal(rvalue_type)
# The above call may update inferred variable type. Prevent further
# inference.
inferred = None
diff --git a/mypy/checkexpr.py b/mypy/checkexpr.py
index 49fc1159856f7..d136360cc3347 100644
--- a/mypy/checkexpr.py
+++ b/mypy/checkexpr.py
@@ -211,6 +211,7 @@
is_overlapping_none,
is_self_type_like,
remove_optional,
+ try_getting_literal,
)
from mypy.typestate import type_state
from mypy.typevars import fill_typevars
@@ -1760,9 +1761,21 @@ def check_callable_call(
need_refresh = any(
isinstance(v, (ParamSpecType, TypeVarTupleType)) for v in callee.variables
)
+ # Check if we have TypeVar-based kwargs that need expansion after inference
+ has_typevar_kwargs = (
+ callee.unpack_kwargs
+ and callee.arg_types
+ and isinstance(callee.arg_types[-1], UnpackType)
+ and isinstance(get_proper_type(callee.arg_types[-1].type), TypeVarType)
+ )
callee = self.infer_function_type_arguments(
callee, args, arg_kinds, arg_names, formal_to_actual, need_refresh, context
)
+ if has_typevar_kwargs:
+ # After inference, the TypeVar in **kwargs should be replaced with
+ # an inferred TypedDict. Re-expand the kwargs now.
+ callee = callee.with_unpacked_kwargs().with_normalized_var_args()
+ need_refresh = True
if need_refresh:
# Argument kinds etc. may have changed due to
# ParamSpec or TypeVarTuple variables being replaced with an arbitrary
@@ -1806,7 +1819,14 @@ def check_callable_call(
)
self.check_argument_types(
- arg_types, arg_kinds, args, callee, formal_to_actual, context, object_type=object_type
+ arg_types,
+ arg_kinds,
+ args,
+ callee,
+ formal_to_actual,
+ context,
+ object_type=object_type,
+ arg_names=arg_names,
)
if (
@@ -2515,6 +2535,7 @@ def check_argument_types(
context: Context,
check_arg: ArgChecker | None = None,
object_type: Type | None = None,
+ arg_names: Sequence[str | None] | None = None,
) -> None:
"""Check argument types against a callable type.
@@ -2594,6 +2615,22 @@ def check_argument_types(
elif isinstance(unpacked_type, TypeVarTupleType):
callee_arg_types = [orig_callee_arg_type]
callee_arg_kinds = [ARG_STAR]
+ elif isinstance(unpacked_type, TypedDictType):
+ # Unpack[TypedDict] for **kwargs — each kwarg gets its
+ # corresponding item type from the TypedDict.
+ #
+                # TODO: This may duplicate existing Unpack[TypedDict]
+                # handling elsewhere; consolidate if so.
+                # TODO: special-case handling of **kwargs
+ callee_arg_types = list[Type]()
+ callee_arg_kinds = list[ArgKind]()
+ for a in actuals:
+ name = arg_names[a] if arg_names else None
+ if name is not None and name in unpacked_type.items:
+ callee_arg_types.append(unpacked_type.items[name])
+ else:
+ callee_arg_types.append(orig_callee_arg_type)
+ callee_arg_kinds.append(ARG_NAMED)
else:
assert isinstance(unpacked_type, Instance)
assert unpacked_type.type.fullname == "builtins.tuple"
@@ -3278,6 +3315,7 @@ def check_arg(
formal_to_actual,
context=context,
check_arg=check_arg,
+ arg_names=arg_names,
)
return True
except Finished:
@@ -6815,14 +6854,6 @@ def merge_typevars_in_callables_by_name(
return output, variables
-def try_getting_literal(typ: Type) -> ProperType:
- """If possible, get a more precise literal type for a given type."""
- typ = get_proper_type(typ)
- if isinstance(typ, Instance) and typ.last_known_value is not None:
- return typ.last_known_value
- return typ
-
-
def is_expr_literal_type(node: Expression) -> bool:
"""Returns 'true' if the given node is a Literal"""
if isinstance(node, IndexExpr):
diff --git a/mypy/constraints.py b/mypy/constraints.py
index df79fdae5456c..b0b7fbcacf8bd 100644
--- a/mypy/constraints.py
+++ b/mypy/constraints.py
@@ -39,7 +39,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarId,
TypeVarLikeType,
@@ -58,7 +60,7 @@
is_named_instance,
split_with_prefix_and_suffix,
)
-from mypy.types_utils import is_union_with_any
+from mypy.types_utils import is_union_with_any, try_getting_literal
from mypy.typestate import type_state
if TYPE_CHECKING:
@@ -135,7 +137,7 @@ def infer_constraints_for_callable(
break
for i, actuals in enumerate(formal_to_actual):
- if isinstance(callee.arg_types[i], UnpackType):
+ if isinstance(callee.arg_types[i], UnpackType) and callee.arg_kinds[i] == ARG_STAR:
unpack_type = callee.arg_types[i]
assert isinstance(unpack_type, UnpackType)
@@ -218,6 +220,88 @@ def infer_constraints_for_callable(
constraints.extend(infer_constraints(tt, at, SUPERTYPE_OF))
else:
assert False, "mypy bug: unhandled constraint inference case"
+
+ elif isinstance(callee.arg_types[i], UnpackType) and callee.arg_kinds[i] == ARG_STAR2:
+ # Handle **kwargs: Unpack[K] where K is TypeVar bound to TypedDict.
+ # Collect actual kwargs and build a TypedDict constraint.
+
+ unpack_type = callee.arg_types[i]
+ assert isinstance(unpack_type, UnpackType)
+
+ unpacked_type = get_proper_type(unpack_type.type)
+ assert isinstance(unpacked_type, TypeVarType)
+
+ other_named = {
+ name
+ for name, kind in zip(callee.arg_names, callee.arg_kinds)
+ if name is not None and not kind.is_star()
+ }
+
+ # Collect all the arguments that will go to **kwargs
+ kwargs_items: dict[str, Type] = {}
+ for actual in actuals:
+ actual_arg_type = arg_types[actual]
+ if actual_arg_type is None:
+ continue
+ actual_name = arg_names[actual] if arg_names is not None else None
+ if actual_name is not None:
+ # Named argument going to **kwargs
+ kwargs_items[actual_name] = actual_arg_type
+ elif arg_kinds[actual] == ARG_STAR2:
+ # **kwargs being passed through - try to extract TypedDict items
+ p_actual = get_proper_type(actual_arg_type)
+ if isinstance(p_actual, TypedDictType):
+ for sname, styp in p_actual.items.items():
+ # But we need to filter out names that
+ # will go to other parameters
+ if sname not in other_named:
+ kwargs_items[sname] = styp
+
+ # Build a TypedDict from the collected kwargs.
+ bound = get_proper_type(unpacked_type.upper_bound)
+ if isinstance(bound, Instance) and bound.type.typeddict_type is not None:
+ bound = bound.type.typeddict_type
+
+ # This should be an error from an earlier level, but don't compound it
+ if not isinstance(bound, TypedDictType):
+ continue
+
+ # Start with the actual kwargs passed, with literal types
+ # inferred for read-only and unbound items
+ items = {
+ key: (
+ try_getting_literal(typ)
+ if key not in bound.items or key in bound.readonly_keys
+ else typ
+ )
+ for key, typ in kwargs_items.items()
+ }
+ # Add any NotRequired keys from the bound that weren't passed
+ # (they need to be present for TypedDict subtyping to work)
+ for key, value_type in bound.items.items():
+ if key not in items and key not in bound.required_keys:
+ # If the key is missing and it is ReadOnly,
+ # then we can replace the type with Never to
+ # indicate that it is definitely not
+ # present. We can't do that if it is mutable,
+ # though (because that violates the subtyping
+ # rules.)
+ items[key] = (
+ value_type if key not in bound.readonly_keys else UninhabitedType()
+ )
+ # Keys are required if they're required in the bound, or if they're
+ # extra keys not in the bound (explicitly passed, so required).
+ required_keys = {
+ key for key in items if key in bound.required_keys or key not in bound.items
+ }
+ inferred_td = TypedDictType(
+ items=items,
+ required_keys=required_keys,
+ readonly_keys=bound.readonly_keys,
+ fallback=bound.fallback,
+ )
+ constraints.append(Constraint(unpacked_type, SUPERTYPE_OF, inferred_td))
+
else:
for actual in actuals:
actual_arg_type = arg_types[actual]
@@ -1346,6 +1430,22 @@ def visit_union_type(self, template: UnionType) -> list[Constraint]:
def visit_type_alias_type(self, template: TypeAliasType) -> list[Constraint]:
assert False, f"This should be never called, got {template}"
+ def visit_type_operator_type(self, template: TypeOperatorType) -> list[Constraint]:
+ # TODO: Is this right?
+ #
+ # We don't really know how to resolve constraints here, so
+ # resolve none, and hope that when variables are substituted,
+ # we figure out if things are ok.
+ return []
+
+ def visit_type_for_comprehension(self, template: TypeForComprehension) -> list[Constraint]:
+ # TODO: Is this right?
+ #
+ # We don't really know how to resolve constraints here, so
+ # resolve none, and hope that when variables are substituted,
+ # we figure out if things are ok.
+ return []
+
def infer_against_any(self, types: Iterable[Type], any_type: AnyType) -> list[Constraint]:
res: list[Constraint] = []
# Some items may be things like `*Tuple[*Ts, T]` for example from callable types with
diff --git a/mypy/copytype.py b/mypy/copytype.py
index 9a390a01bdbab..c74518de86b01 100644
--- a/mypy/copytype.py
+++ b/mypy/copytype.py
@@ -18,6 +18,8 @@
TupleType,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -126,6 +128,12 @@ def visit_type_type(self, t: TypeType) -> ProperType:
def visit_type_alias_type(self, t: TypeAliasType) -> ProperType:
assert False, "only ProperTypes supported"
+ def visit_type_operator_type(self, t: TypeOperatorType) -> ProperType:
+ assert False, "only ProperTypes supported"
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> ProperType:
+ assert False, "only ProperTypes supported"
+
def copy_common(self, t: ProperType, t2: ProperType) -> ProperType:
t2.line = t.line
t2.column = t.column
diff --git a/mypy/erasetype.py b/mypy/erasetype.py
index cb8d66f292dd3..e75625a613658 100644
--- a/mypy/erasetype.py
+++ b/mypy/erasetype.py
@@ -21,7 +21,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeTranslator,
TypeType,
TypeVarId,
@@ -141,6 +143,12 @@ def visit_type_type(self, t: TypeType) -> ProperType:
def visit_type_alias_type(self, t: TypeAliasType) -> ProperType:
raise RuntimeError("Type aliases should be expanded before accepting this visitor")
+ def visit_type_operator_type(self, t: TypeOperatorType) -> ProperType:
+ raise RuntimeError("Computed types should be expanded before accepting this visitor")
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> ProperType:
+ raise RuntimeError("Computed types should be expanded before accepting this visitor")
+
def erase_typevars(t: Type, ids_to_erase: Container[TypeVarId] | None = None) -> Type:
"""Replace all type variables in a type with any,
diff --git a/mypy/expandtype.py b/mypy/expandtype.py
index 5790b717172ac..98b5eef474fc7 100644
--- a/mypy/expandtype.py
+++ b/mypy/expandtype.py
@@ -38,7 +38,7 @@
UnionType,
UnpackType,
flatten_nested_unions,
- get_proper_type,
+ get_proper_type_simple,
split_with_prefix_and_suffix,
)
from mypy.typevartuples import split_with_instance
@@ -50,6 +50,9 @@
# is_subtype(), meet_types(), join_types() etc.
# TODO: add a static dependency test for this.
+# WARNING: This *also* means that get_proper_type() can't be used here,
+# since type evaluation may depend on all of that machinery.
+
@overload
def expand_type(typ: CallableType, env: Mapping[TypeVarId, Type]) -> CallableType: ...
@@ -227,9 +230,9 @@ def visit_instance(self, t: Instance) -> Type:
# Normalize Tuple[*Tuple[X, ...], ...] -> Tuple[X, ...]
arg = args[0]
if isinstance(arg, UnpackType):
- unpacked = get_proper_type(arg.type)
+ unpacked = get_proper_type_simple(arg.type)
if isinstance(unpacked, Instance):
- # TODO: this and similar asserts below may be unsafe because get_proper_type()
+ # TODO: this and similar asserts below may be unsafe because get_proper_type_simple()
# may be called during semantic analysis before all invalid types are removed.
assert unpacked.type.fullname == "builtins.tuple"
args = list(unpacked.args)
@@ -387,10 +390,16 @@ def visit_unpack_type(self, t: UnpackType) -> Type:
return UnpackType(t.type.accept(self))
def expand_unpack(self, t: UnpackType) -> list[Type]:
- assert isinstance(t.type, TypeVarTupleType)
- repl = get_proper_type(self.variables.get(t.type.id, t.type))
+ if isinstance(t.type, TypeVarTupleType):
+ t2 = self.variables.get(t.type.id, t.type)
+ fallback = t.type.tuple_fallback
+ else:
+ assert isinstance(t.type, ProperType) and isinstance(t.type, TupleType)
+ t2 = t.type
+ fallback = t2.partial_fallback
+ repl = get_proper_type_simple(t2)
if isinstance(repl, UnpackType):
- repl = get_proper_type(repl.type)
+ repl = get_proper_type_simple(repl.type)
if isinstance(repl, TupleType):
return repl.items
elif (
@@ -402,7 +411,7 @@ def expand_unpack(self, t: UnpackType) -> list[Type]:
elif isinstance(repl, (AnyType, UninhabitedType)):
# Replace *Ts = Any with *Ts = *tuple[Any, ...] and same for Never.
# These types may appear here as a result of user error or failed inference.
- return [UnpackType(t.type.tuple_fallback.copy_modified(args=[repl]))]
+ return [UnpackType(fallback.copy_modified(args=[repl]))]
else:
raise RuntimeError(f"Invalid type replacement to expand: {repl}")
@@ -414,7 +423,7 @@ def interpolate_args_for_unpack(self, t: CallableType, var_arg: UnpackType) -> l
prefix = self.expand_types(t.arg_types[:star_index])
suffix = self.expand_types(t.arg_types[star_index + 1 :])
- var_arg_type = get_proper_type(var_arg.type)
+ var_arg_type = get_proper_type_simple(var_arg.type)
new_unpack: Type
if isinstance(var_arg_type, TupleType):
# We have something like Unpack[Tuple[Unpack[Ts], X1, X2]]
@@ -428,7 +437,7 @@ def interpolate_args_for_unpack(self, t: CallableType, var_arg: UnpackType) -> l
fallback = var_arg_type.tuple_fallback
expanded_items = self.expand_unpack(var_arg)
new_unpack = UnpackType(TupleType(expanded_items, fallback))
- # Since get_proper_type() may be called in semanal.py before callable
+ # Since get_proper_type_simple() may be called in semanal.py before callable
# normalization happens, we need to also handle non-normal cases here.
elif isinstance(var_arg_type, Instance):
# we have something like Unpack[Tuple[Any, ...]]
@@ -516,7 +525,10 @@ def expand_type_list_with_unpack(self, typs: list[Type]) -> list[Type]:
"""Expands a list of types that has an unpack."""
items: list[Type] = []
for item in typs:
- if isinstance(item, UnpackType) and isinstance(item.type, TypeVarTupleType):
+ if isinstance(item, UnpackType) and (
+ isinstance(item.type, TypeVarTupleType)
+ or (isinstance(item.type, ProperType) and isinstance(item.type, TupleType))
+ ):
items.extend(self.expand_unpack(item))
else:
items.append(item.accept(self))
@@ -527,7 +539,10 @@ def expand_type_tuple_with_unpack(self, typs: tuple[Type, ...]) -> list[Type]:
# Micro-optimization: Specialized variant of expand_type_list_with_unpack
items: list[Type] = []
for item in typs:
- if isinstance(item, UnpackType) and isinstance(item.type, TypeVarTupleType):
+ if isinstance(item, UnpackType) and (
+ isinstance(item.type, TypeVarTupleType)
+ or (isinstance(item.type, ProperType) and isinstance(item.type, TupleType))
+ ):
items.extend(self.expand_unpack(item))
else:
items.append(item.accept(self))
@@ -539,7 +554,7 @@ def visit_tuple_type(self, t: TupleType) -> Type:
# Normalize Tuple[*Tuple[X, ...]] -> Tuple[X, ...]
item = items[0]
if isinstance(item, UnpackType):
- unpacked = get_proper_type(item.type)
+ unpacked = get_proper_type_simple(item.type)
if isinstance(unpacked, Instance):
# expand_type() may be called during semantic analysis, before invalid unpacks are fixed.
if unpacked.type.fullname != "builtins.tuple":
@@ -580,12 +595,12 @@ def visit_union_type(self, t: UnionType) -> Type:
simplified = UnionType.make_union(
remove_trivial(flatten_nested_unions(expanded)), t.line, t.column
)
- # This call to get_proper_type() is unfortunate but is required to preserve
+ # This call to get_proper_type_simple() is unfortunate but is required to preserve
# the invariant that ProperType will stay ProperType after applying expand_type(),
# otherwise a single item union of a type alias will break it. Note this should not
# cause infinite recursion since pathological aliases like A = Union[A, B] are
# banned at the semantic analysis level.
- result = get_proper_type(simplified)
+ result = get_proper_type_simple(simplified)
if use_cache:
self.set_cached(t, result)
@@ -644,7 +659,7 @@ def remove_trivial(types: Iterable[Type]) -> list[Type]:
new_types = []
all_types = set()
for t in types:
- p_t = get_proper_type(t)
+ p_t = get_proper_type_simple(t)
if isinstance(p_t, UninhabitedType):
continue
if isinstance(p_t, NoneType) and not state.strict_optional:
diff --git a/mypy/exprtotype.py b/mypy/exprtotype.py
index ae36fc8adde09..a6f3f3f528a7c 100644
--- a/mypy/exprtotype.py
+++ b/mypy/exprtotype.py
@@ -10,13 +10,17 @@
BytesExpr,
CallExpr,
ComplexExpr,
+ ConditionalExpr,
Context,
DictExpr,
+ DictionaryComprehension,
EllipsisExpr,
Expression,
FloatExpr,
+ GeneratorExpr,
IndexExpr,
IntExpr,
+ ListComprehension,
ListExpr,
MemberExpr,
NameExpr,
@@ -40,6 +44,7 @@
RawExpressionType,
Type,
TypedDictType,
+ TypeForComprehension,
TypeList,
TypeOfAny,
UnboundType,
@@ -52,6 +57,64 @@ class TypeTranslationError(Exception):
"""Exception raised when an expression is not valid as a type."""
+def _is_map_name(name: str) -> bool:
+ """Return True if name syntactically refers to the Map type operator."""
+ return name == "Map" or name.endswith(".Map")
+
+
+def _is_map_call(expr: CallExpr) -> bool:
+ """Return True if expr is Map(genexp) — call syntax."""
+ if len(expr.args) != 1 or expr.arg_names != [None]:
+ return False
+ if not isinstance(expr.args[0], (GeneratorExpr, ListComprehension)):
+ return False
+ callee = expr.callee
+ if isinstance(callee, NameExpr):
+ return _is_map_name(callee.name)
+ if isinstance(callee, MemberExpr):
+ return callee.name == "Map"
+ return False
+
+
+def _generator_to_type_for_comprehension(
+ gen: GeneratorExpr,
+ options: Options,
+ allow_new_syntax: bool,
+ lookup_qualified: Callable[[str, Context], SymbolTableNode | None] | None,
+ line: int,
+ column: int,
+) -> TypeForComprehension:
+ """Build a TypeForComprehension from a GeneratorExpr (single for-clause).
+
+ Raises TypeTranslationError if the generator expression isn't a supported
+ form (multiple generators or non-name target).
+ """
+ if len(gen.sequences) != 1:
+ raise TypeTranslationError()
+ index = gen.indices[0]
+ if not isinstance(index, NameExpr):
+ raise TypeTranslationError()
+ iter_name = index.name
+ element_expr = expr_to_unanalyzed_type(
+ gen.left_expr, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ iter_type = expr_to_unanalyzed_type(
+ gen.sequences[0], options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ conditions: list[Type] = [
+ expr_to_unanalyzed_type(cond, options, allow_new_syntax, lookup_qualified=lookup_qualified)
+ for cond in gen.condlists[0]
+ ]
+ return TypeForComprehension(
+ element_expr=element_expr,
+ iter_name=iter_name,
+ iter_type=iter_type,
+ conditions=conditions,
+ line=line,
+ column=column,
+ )
+
+
def _extract_argument_name(expr: Expression) -> str | None:
if isinstance(expr, NameExpr) and expr.name == "None":
return None
@@ -68,7 +131,7 @@ def expr_to_unanalyzed_type(
_parent: Expression | None = None,
allow_unpack: bool = False,
lookup_qualified: Callable[[str, Context], SymbolTableNode | None] | None = None,
-) -> ProperType:
+) -> Type:
"""Translate an expression to the corresponding type.
The result is not semantically analyzed. It can be UnboundType or TypeList.
@@ -98,7 +161,18 @@ def expr_to_unanalyzed_type(
if fullname:
return UnboundType(fullname, line=expr.line, column=expr.column)
else:
- raise TypeTranslationError()
+ # Attribute access on a complex type expression (subscripted, conditional, etc.)
+ # Desugar X.attr to _TypeGetAttr[X, Literal["attr"]]
+ before_dot = expr_to_unanalyzed_type(
+ expr.expr, options, allow_new_syntax, expr, lookup_qualified=lookup_qualified
+ )
+ attr_literal = RawExpressionType(expr.name, "builtins.str", line=expr.line)
+ return UnboundType(
+ "__builtins__._TypeGetAttr",
+ [before_dot, attr_literal],
+ line=expr.line,
+ column=expr.column,
+ )
elif isinstance(expr, IndexExpr):
base = expr_to_unanalyzed_type(
expr.base, options, allow_new_syntax, expr, lookup_qualified=lookup_qualified
@@ -161,6 +235,37 @@ def expr_to_unanalyzed_type(
],
uses_pep604_syntax=True,
)
+ elif isinstance(expr, OpExpr) and expr.op in ("and", "or"):
+ # Convert `A and B` to `_And[A, B]` and `A or B` to `_Or[A, B]`
+ op_name = "_And" if expr.op == "and" else "_Or"
+ return UnboundType(
+ f"__builtins__.{op_name}",
+ [
+ expr_to_unanalyzed_type(
+ expr.left, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ ),
+ expr_to_unanalyzed_type(
+ expr.right, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ ),
+ ],
+ line=expr.line,
+ column=expr.column,
+ )
+ elif isinstance(expr, CallExpr) and _is_map_call(expr):
+ # Map(genexp) — variadic comprehension operator (call syntax).
+ base = expr_to_unanalyzed_type(
+ expr.callee, options, allow_new_syntax, expr, lookup_qualified=lookup_qualified
+ )
+ assert isinstance(base, UnboundType) and not base.args
+ arg = expr.args[0]
+ assert isinstance(arg, (GeneratorExpr, ListComprehension))
+ gen = arg if isinstance(arg, GeneratorExpr) else arg.generator
+ tfc = _generator_to_type_for_comprehension(
+ gen, options, allow_new_syntax, lookup_qualified, expr.line, expr.column
+ )
+ tfc.is_map = True
+ base.args = (tfc,)
+ return base
elif isinstance(expr, CallExpr) and isinstance(_parent, ListExpr):
c = expr.callee
names = []
@@ -229,6 +334,18 @@ def expr_to_unanalyzed_type(
elif isinstance(expr, BytesExpr):
return parse_type_string(expr.value, "builtins.bytes", expr.line, expr.column)
elif isinstance(expr, UnaryExpr):
+ # Handle `not` for type booleans
+ if expr.op == "not":
+ return UnboundType(
+ "__builtins__._Not",
+ [
+ expr_to_unanalyzed_type(
+ expr.expr, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ ],
+ line=expr.line,
+ column=expr.column,
+ )
typ = expr_to_unanalyzed_type(
expr.expr, options, allow_new_syntax, lookup_qualified=lookup_qualified
)
@@ -252,6 +369,38 @@ def expr_to_unanalyzed_type(
elif isinstance(expr, EllipsisExpr):
return EllipsisType(expr.line)
elif allow_unpack and isinstance(expr, StarExpr):
+ # Check if this is a type comprehension: *[Expr for var in Iter if Cond]
+ if isinstance(expr.expr, ListComprehension):
+ return _generator_to_type_for_comprehension(
+ expr.expr.generator,
+ options,
+ allow_new_syntax,
+ lookup_qualified,
+ expr.line,
+ expr.column,
+ )
+ # *Map(genexp) — keep the Map wrapper around the TFC (not an
+ # UnpackType). typeanal will verify the name resolves to Map and
+ # desugar to the analyzed TFC; the TFC then participates in variadic
+ # flattening just like the *[...] form.
+ if isinstance(expr.expr, CallExpr) and _is_map_call(expr.expr):
+ inner_base = expr_to_unanalyzed_type(
+ expr.expr.callee,
+ options,
+ allow_new_syntax,
+ expr.expr,
+ lookup_qualified=lookup_qualified,
+ )
+ assert isinstance(inner_base, UnboundType) and not inner_base.args
+ arg = expr.expr.args[0]
+ assert isinstance(arg, (GeneratorExpr, ListComprehension))
+ gen = arg if isinstance(arg, GeneratorExpr) else arg.generator
+ tfc = _generator_to_type_for_comprehension(
+ gen, options, allow_new_syntax, lookup_qualified, expr.expr.line, expr.expr.column
+ )
+ tfc.is_map = True
+ inner_base.args = (tfc,)
+ return inner_base
return UnpackType(
expr_to_unanalyzed_type(
expr.expr, options, allow_new_syntax, lookup_qualified=lookup_qualified
@@ -262,19 +411,16 @@ def expr_to_unanalyzed_type(
if not expr.items:
raise TypeTranslationError()
items: dict[str, Type] = {}
- extra_items_from = []
+ extra_items_from: list[ProperType] = []
for item_name, value in expr.items:
if not isinstance(item_name, StrExpr):
if item_name is None:
- extra_items_from.append(
- expr_to_unanalyzed_type(
- value,
- options,
- allow_new_syntax,
- expr,
- lookup_qualified=lookup_qualified,
- )
+ typ = expr_to_unanalyzed_type(
+ value, options, allow_new_syntax, expr, lookup_qualified=lookup_qualified
)
+ # TypedDict spread values should be ProperTypes
+ assert isinstance(typ, ProperType)
+ extra_items_from.append(typ)
continue
raise TypeTranslationError()
items[item_name.value] = expr_to_unanalyzed_type(
@@ -285,5 +431,55 @@ def expr_to_unanalyzed_type(
)
result.extra_items_from = extra_items_from
return result
+ elif isinstance(expr, DictionaryComprehension):
+ # Dict comprehension in type context: {k: v for x in foo}
+ # desugars to *[_DictEntry[k, v] for x in foo]
+ if len(expr.sequences) != 1:
+ raise TypeTranslationError()
+ index = expr.indices[0]
+ if not isinstance(index, NameExpr):
+ raise TypeTranslationError()
+ iter_name = index.name
+ key_type = expr_to_unanalyzed_type(
+ expr.key, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ value_type = expr_to_unanalyzed_type(
+ expr.value, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ iter_type = expr_to_unanalyzed_type(
+ expr.sequences[0], options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ cond_types: list[Type] = [
+ expr_to_unanalyzed_type(
+ cond, options, allow_new_syntax, lookup_qualified=lookup_qualified
+ )
+ for cond in expr.condlists[0]
+ ]
+ element_expr = UnboundType(
+ "__builtins__._DictEntry", [key_type, value_type], line=expr.line, column=expr.column
+ )
+ return TypeForComprehension(
+ element_expr=element_expr,
+ iter_name=iter_name,
+ iter_type=iter_type,
+ conditions=cond_types,
+ line=expr.line,
+ column=expr.column,
+ )
+ elif isinstance(expr, ConditionalExpr):
+ # Use __builtins__ so it can be resolved without explicit import
+ return UnboundType(
+ "__builtins__._Cond",
+ [
+ expr_to_unanalyzed_type(
+ arg, options, allow_new_syntax, expr, lookup_qualified=lookup_qualified
+ )
+ for arg in [expr.cond, expr.if_expr, expr.else_expr]
+ ],
+ line=expr.line,
+ column=expr.column,
+ )
+
else:
raise TypeTranslationError()
diff --git a/mypy/fastparse.py b/mypy/fastparse.py
index e85b8fffaf9e9..fad7e90a87759 100644
--- a/mypy/fastparse.py
+++ b/mypy/fastparse.py
@@ -117,11 +117,13 @@
TupleType,
Type,
TypedDictType,
+ TypeForComprehension,
TypeList,
TypeOfAny,
UnboundType,
UnionType,
UnpackType,
+ get_proper_type,
)
from mypy.util import bytes_to_human_readable_repr, unnamed_function
@@ -130,7 +132,7 @@
PY_MINOR_VERSION: Final = sys.version_info[1]
import ast as ast3
-from ast import AST, Attribute, Call, FunctionType, Name, Starred, UAdd, UnaryOp, USub
+from ast import AST, And, Attribute, Call, FunctionType, Name, Not, Starred, UAdd, UnaryOp, USub
def ast3_parse(
@@ -292,7 +294,7 @@ def parse_type_ignore_tag(tag: str | None) -> list[str] | None:
def parse_type_comment(
type_comment: str, line: int, column: int, errors: Errors | None
-) -> tuple[list[str] | None, ProperType | None]:
+) -> tuple[list[str] | None, Type | None]:
"""Parse type portion of a type comment (+ optional type ignore).
Return (ignore info, parsed type).
@@ -338,12 +340,14 @@ def parse_type_string(
"""
try:
_, node = parse_type_comment(f"({expr_string})", line=line, column=column, errors=None)
- if isinstance(node, (UnboundType, UnionType)) and node.original_str_expr is None:
- node.original_str_expr = expr_string
- node.original_str_fallback = expr_fallback_name
- return node
- else:
- return RawExpressionType(expr_string, expr_fallback_name, line, column)
+ # node is Type | None but we need to check for specific ProperTypes
+ if node is not None:
+ proper = get_proper_type(node)
+ if isinstance(proper, (UnboundType, UnionType)) and proper.original_str_expr is None:
+ proper.original_str_expr = expr_string
+ proper.original_str_fallback = expr_fallback_name
+ return proper
+ return RawExpressionType(expr_string, expr_fallback_name, line, column)
except (SyntaxError, ValueError):
# Note: the parser will raise a `ValueError` instead of a SyntaxError if
# the string happens to contain things like \x00.
@@ -528,7 +532,7 @@ def translate_stmt_list(
def translate_type_comment(
self, n: ast3.stmt | ast3.arg, type_comment: str | None
- ) -> ProperType | None:
+ ) -> Type | None:
if type_comment is None:
return None
else:
@@ -1911,12 +1915,12 @@ def invalid_type(self, node: AST, note: str | None = None) -> RawExpressionType:
)
@overload
- def visit(self, node: ast3.expr) -> ProperType: ...
+ def visit(self, node: ast3.expr) -> Type: ...
@overload
- def visit(self, node: AST | None) -> ProperType | None: ...
+ def visit(self, node: AST | None) -> Type | None: ...
- def visit(self, node: AST | None) -> ProperType | None:
+ def visit(self, node: AST | None) -> Type | None:
"""Modified visit -- keep track of the stack of nodes"""
if node is None:
return None
@@ -1926,7 +1930,7 @@ def visit(self, node: AST | None) -> ProperType | None:
visitor = getattr(self, method, None)
if visitor is not None:
typ = visitor(node)
- assert isinstance(typ, ProperType)
+ assert isinstance(typ, Type)
return typ
else:
return self.invalid_type(node)
@@ -1951,6 +1955,16 @@ def translate_expr_list(self, l: Sequence[ast3.expr]) -> list[Type]:
return [self.visit(e) for e in l]
def visit_Call(self, e: Call) -> Type:
+ # Map(genexp) — variadic comprehension operator (call syntax allows
+ # a bare generator expression without extra parentheses).
+ if (
+ len(e.args) == 1
+ and not e.keywords
+ and isinstance(e.args[0], (ast3.GeneratorExp, ast3.ListComp))
+ and self._is_map_name_ast(e.func)
+ ):
+ return self._build_map_from_call(e)
+
# Parse the arg constructor
f = e.func
constructor = stringify_name(f)
@@ -2058,6 +2072,10 @@ def visit_Constant(self, n: ast3.Constant) -> Type:
# UnaryOp(op, operand)
def visit_UnaryOp(self, n: UnaryOp) -> Type:
+ # Handle `not` for type booleans
+ if isinstance(n.op, Not):
+ return self.visit_UnaryOp_not(n)
+
# We support specifically Literal[-4], Literal[+4], and nothing else.
# For example, Literal[~6] or Literal[not False] is not supported.
typ = self.visit(n.operand)
@@ -2132,11 +2150,14 @@ def visit_Dict(self, n: ast3.Dict) -> Type:
if not n.keys:
return self.invalid_type(n)
items: dict[str, Type] = {}
- extra_items_from = []
+ extra_items_from: list[ProperType] = []
for item_name, value in zip(n.keys, n.values):
if not isinstance(item_name, ast3.Constant) or not isinstance(item_name.value, str):
if item_name is None:
- extra_items_from.append(self.visit(value))
+ visited = self.visit(value)
+ # TypedDict spread values should be ProperTypes
+ assert isinstance(visited, ProperType)
+ extra_items_from.append(visited)
continue
return self.invalid_type(n)
items[item_name.value] = self.visit(value)
@@ -2144,25 +2165,203 @@ def visit_Dict(self, n: ast3.Dict) -> Type:
result.extra_items_from = extra_items_from
return result
+ def visit_DictComp(self, n: ast3.DictComp) -> Type:
+ """Convert {k: v for x in Iter[T]} to *[_DictEntry[k, v] for x in Iter[T]].
+
+ Dict comprehensions in type context desugar to unpacked type comprehensions
+ where each element is a _DictEntry[key, value].
+ """
+ if len(n.generators) != 1:
+ return self.invalid_type(
+ n, note="Type comprehensions only support a single 'for' clause"
+ )
+
+ gen = n.generators[0]
+
+ if not isinstance(gen.target, ast3.Name):
+ return self.invalid_type(n, note="Type comprehension variable must be a simple name")
+
+ iter_name = gen.target.id
+ key_type = self.visit(n.key)
+ value_type = self.visit(n.value)
+ iter_type = self.visit(gen.iter)
+ conditions = [self.visit(cond) for cond in gen.ifs]
+
+ # Create _DictEntry[k, v] as the element expression
+ element_expr = UnboundType(
+ "__builtins__._DictEntry",
+ [key_type, value_type],
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+
+ return TypeForComprehension(
+ element_expr=element_expr,
+ iter_name=iter_name,
+ iter_type=iter_type,
+ conditions=conditions,
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+
# Attribute(expr value, identifier attr, expr_context ctx)
def visit_Attribute(self, n: Attribute) -> Type:
before_dot = self.visit(n.value)
if isinstance(before_dot, UnboundType) and not before_dot.args:
return UnboundType(f"{before_dot.name}.{n.attr}", line=self.line, column=n.col_offset)
+ elif isinstance(before_dot, UnboundType):
+ # Subscripted type with attribute access: GetMember[T, K].type
+ # Desugar to _TypeGetAttr[GetMember[T, K], Literal["attr"]]
+ attr_literal = RawExpressionType(n.attr, "builtins.str", line=self.line)
+ return UnboundType(
+ "__builtins__._TypeGetAttr",
+ [before_dot, attr_literal],
+ line=self.line,
+ column=n.col_offset,
+ )
else:
return self.invalid_type(n)
# Used for Callable[[X *Ys, Z], R] etc.
+ # Also handles type comprehensions: *[Expr for var in Iter if Cond]
def visit_Starred(self, n: ast3.Starred) -> Type:
+ # Check if this is a list comprehension (type comprehension syntax)
+ if isinstance(n.value, ast3.ListComp):
+ return self.visit_ListComp_as_type(n.value)
+ # *Map(genexp) — pure synonym for *[...]. Produce the TFC directly
+ # (matching the *[...] path) rather than wrapping in UnpackType.
+ if (
+ isinstance(n.value, ast3.Call)
+ and len(n.value.args) == 1
+ and not n.value.keywords
+ and isinstance(n.value.args[0], (ast3.GeneratorExp, ast3.ListComp))
+ and self._is_map_name_ast(n.value.func)
+ ):
+ return self._build_map_from_call(n.value)
return UnpackType(self.visit(n.value), from_star_syntax=True)
+ def visit_ListComp_as_type(self, n: ast3.ListComp) -> Type:
+ """Convert *[Expr for var in Iter if Cond] to TypeForComprehension."""
+ return self._comprehension_to_type(n)
+
+ def _comprehension_to_type(self, n: ast3.ListComp | ast3.GeneratorExp) -> Type:
+ """Build a TypeForComprehension from a list or generator comprehension AST."""
+ # Currently only support single generator
+ if len(n.generators) != 1:
+ return self.invalid_type(
+ n, note="Type comprehensions only support a single 'for' clause"
+ )
+
+ gen = n.generators[0]
+
+ # The target should be a simple name
+ if not isinstance(gen.target, ast3.Name):
+ return self.invalid_type(n, note="Type comprehension variable must be a simple name")
+
+ iter_name = gen.target.id
+ element_expr = self.visit(n.elt)
+ iter_type = self.visit(gen.iter)
+ conditions = [self.visit(cond) for cond in gen.ifs]
+
+ return TypeForComprehension(
+ element_expr=element_expr,
+ iter_name=iter_name,
+ iter_type=iter_type,
+ conditions=conditions,
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+
+ def _is_map_name_ast(self, node: ast3.AST) -> bool:
+ """Return True if node syntactically names the Map type operator."""
+ if isinstance(node, ast3.Name):
+ return node.id == "Map"
+ if isinstance(node, ast3.Attribute):
+ return node.attr == "Map"
+ return False
+
+ def _build_map_from_call(self, n: ast3.Call) -> Type:
+ """Convert Map(genexp) to UnboundType(name, [TypeForComprehension]).
+
+ The Map wrapper is retained so typeanal can verify that the name
+ actually resolves to the Map type operator before desugaring to the
+ TFC. A user class coincidentally named Map will fail resolution there.
+ """
+ assert len(n.args) == 1 and isinstance(n.args[0], (ast3.GeneratorExp, ast3.ListComp))
+ tfc = self._comprehension_to_type(n.args[0])
+ if not isinstance(tfc, TypeForComprehension):
+ return tfc
+ tfc.is_map = True
+ value = self.visit(n.func)
+ if not isinstance(value, UnboundType) or value.args:
+ return self.invalid_type(n)
+ result = UnboundType(
+ value.name, [tfc], line=self.line, column=self.convert_column(n.col_offset)
+ )
+ result.end_column = n.end_col_offset
+ result.end_line = n.end_lineno
+ return result
+
# List(expr* elts, expr_context ctx)
def visit_List(self, n: ast3.List) -> Type:
assert isinstance(n.ctx, ast3.Load)
result = self.translate_argument_list(n.elts)
return result
+ # BoolOp(boolop op, expr* values)
+ def visit_BoolOp(self, n: ast3.BoolOp) -> Type:
+ """Handle boolean operations in type contexts.
+
+ Convert `A and B` to `_And[A, B]` and `A or B` to `_Or[A, B]`.
+ Chains like `A and B and C` become `_And[_And[A, B], C]`.
+ """
+ # Process left-to-right, building up nested operators
+ result = self.visit(n.values[0])
+ op_name = "_And" if isinstance(n.op, And) else "_Or"
+
+ for value in n.values[1:]:
+ right = self.visit(value)
+ result = UnboundType(
+ f"__builtins__.{op_name}",
+ [result, right],
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+ return result
+
+ def visit_UnaryOp_not(self, n: UnaryOp) -> Type:
+ """Handle `not` in type contexts.
+
+ Convert `not X` to `_Not[X]`.
+ """
+ operand = self.visit(n.operand)
+ return UnboundType(
+ "__builtins__._Not",
+ [operand],
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+
+ # IfExp(expr test, expr body, expr orelse)
+ def visit_IfExp(self, n: ast3.IfExp) -> Type:
+ """Handle ternary expressions in type contexts.
+
+ Convert `X if Cond else Y` to `_Cond[Cond, X, Y]`.
+ The _Cond type operator is resolved during type analysis.
+ """
+ condition = self.visit(n.test)
+ true_type = self.visit(n.body)
+ false_type = self.visit(n.orelse)
+
+ # Use __builtins__ so it can be resolved without explicit import
+ return UnboundType(
+ "__builtins__._Cond",
+ [condition, true_type, false_type],
+ line=self.line,
+ column=self.convert_column(n.col_offset),
+ )
+
def stringify_name(n: AST) -> str | None:
if isinstance(n, Name):
diff --git a/mypy/fixup.py b/mypy/fixup.py
index d0205f64b7207..d59eb87e501bb 100644
--- a/mypy/fixup.py
+++ b/mypy/fixup.py
@@ -33,7 +33,9 @@
TupleType,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -376,6 +378,23 @@ def visit_union_type(self, ut: UnionType) -> None:
def visit_type_type(self, t: TypeType) -> None:
t.item.accept(self)
+ def visit_type_operator_type(self, op: TypeOperatorType) -> None:
+ type_ref = op.type_ref
+ if type_ref is None:
+ return # We've already been here.
+ op.type_ref = None
+ op.type = lookup_fully_qualified_typeinfo(
+ self.modules, type_ref, allow_missing=self.allow_missing
+ )
+ for a in op.args:
+ a.accept(self)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> None:
+ t.element_expr.accept(self)
+ t.iter_type.accept(self)
+ for c in t.conditions:
+ c.accept(self)
+
def lookup_fully_qualified_typeinfo(
modules: dict[str, MypyFile], name: str, *, allow_missing: bool
diff --git a/mypy/indirection.py b/mypy/indirection.py
index c5f3fa89b8c4a..861b781c467fa 100644
--- a/mypy/indirection.py
+++ b/mypy/indirection.py
@@ -96,7 +96,7 @@ def visit_type_var_tuple(self, t: types.TypeVarTupleType) -> None:
self._visit(t.default)
def visit_unpack_type(self, t: types.UnpackType) -> None:
- t.type.accept(self)
+ self._visit(t.type)
def visit_parameters(self, t: types.Parameters) -> None:
self._visit_type_list(t.arg_types)
@@ -168,3 +168,13 @@ def visit_type_alias_type(self, t: types.TypeAliasType) -> None:
self.modules.add(t.alias.module)
self._visit(t.alias.target)
self._visit_type_list(t.args)
+
+ def visit_type_operator_type(self, t: types.TypeOperatorType) -> None:
+ if t.type:
+ self.modules.add(t.type.module_name)
+ self._visit_type_list(t.args)
+
+ def visit_type_for_comprehension(self, t: types.TypeForComprehension) -> None:
+ self._visit(t.element_expr)
+ self._visit(t.iter_type)
+ self._visit_type_list(t.conditions)
diff --git a/mypy/join.py b/mypy/join.py
index a8c9910e60bb7..53052036d8765 100644
--- a/mypy/join.py
+++ b/mypy/join.py
@@ -36,7 +36,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarId,
TypeVarLikeType,
@@ -665,6 +667,19 @@ def visit_type_type(self, t: TypeType) -> ProperType:
def visit_type_alias_type(self, t: TypeAliasType) -> ProperType:
assert False, f"This should be never called, got {t}"
+ def visit_type_operator_type(self, t: TypeOperatorType) -> ProperType:
+ # TODO: This seems very unsatisfactory. Can we ever do better?
+ # (Do we also need to do some self check??)
+ # We could do union, maybe?
+ if isinstance(self.s, TypeOperatorType):
+ return join_types(self.s.fallback, t.fallback)
+ else:
+ return join_types(self.s, t.fallback)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> ProperType:
+ # TODO: XXX: Falling back to *Any here is pretty dodgy
+ return UnpackType(AnyType(TypeOfAny.special_form))
+
def default(self, typ: Type) -> ProperType:
typ = get_proper_type(typ)
if isinstance(typ, Instance):
diff --git a/mypy/meet.py b/mypy/meet.py
index ee32f239df8c3..c77fd095aec3d 100644
--- a/mypy/meet.py
+++ b/mypy/meet.py
@@ -36,8 +36,10 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeGuardedType,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarLikeType,
TypeVarTupleType,
@@ -1135,6 +1137,18 @@ def visit_type_type(self, t: TypeType) -> ProperType:
def visit_type_alias_type(self, t: TypeAliasType) -> ProperType:
assert False, f"This should be never called, got {t}"
+ def visit_type_operator_type(self, t: TypeOperatorType) -> ProperType:
+ # TODO: Defaulting here loses precision. If we had intersection
+ # types, we could use those. (Do we also need a self check here?)
+ return self.default(t)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> ProperType:
+ # TODO: Defaulting here loses precision; can we do better?
+ # (Do we also need a self check here?)
+ return self.default(t)
+
def meet(self, s: Type, t: Type) -> ProperType:
return meet_types(s, t)
diff --git a/mypy/messages.py b/mypy/messages.py
index 51bb0b7ee9be6..e2ab3ba638423 100644
--- a/mypy/messages.py
+++ b/mypy/messages.py
@@ -71,6 +71,7 @@
is_same_type,
is_subtype,
)
+from mypy.typelevel import typelevel_ctx
from mypy.typeops import separate_union_literals
from mypy.types import (
AnyType,
@@ -89,7 +90,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeStrVisitor,
TypeType,
TypeVarLikeType,
@@ -100,6 +103,7 @@
UnionType,
UnpackType,
flatten_nested_unions,
+ format_new_protocol,
get_proper_type,
get_proper_types,
)
@@ -1747,7 +1751,7 @@ def reveal_type(self, typ: Type, context: Context) -> None:
return
# Nothing special here; just create the note:
- visitor = TypeStrVisitor(options=self.options)
+ visitor = TypeStrVisitor(expand=True, expand_recursive=True, options=self.options)
self.note(f'Revealed type is "{typ.accept(visitor)}"', context)
def reveal_locals(self, type_map: dict[str, Type | None], context: Context) -> None:
@@ -2611,8 +2615,20 @@ def format_literal_value(typ: LiteralType) -> str:
# TODO: always mention type alias names in errors.
typ = get_proper_type(typ)
+ if isinstance(typ, TypeOperatorType):
+ # Format the operator name followed by its type arguments.
+ base_str = typ.type.fullname if typ.type else ""
+ return f"{base_str}[{format_list(typ.args)}]"
+
+ if isinstance(typ, TypeForComprehension):
+ conditions_str = "".join(f" if {format(c)}" for c in typ.conditions)
+ return f"*[{format(typ.element_expr)} for {typ.iter_name} in {format(typ.iter_type)}{conditions_str}]"
+
if isinstance(typ, Instance):
itype = typ
+ # Format NewProtocol types by showing their members
+ if itype.type.is_new_protocol:
+ return format_new_protocol(itype, format)
# Get the short name of the type.
if itype.type.fullname == "types.ModuleType":
# Make some common error messages simpler and tidier.
@@ -2884,7 +2900,8 @@ def format_type_bare(
instead. (The caller may want to use quote_type_string after
processing has happened, to maintain consistent quoting in messages.)
"""
- return format_type_inner(typ, verbosity, options, find_type_overlaps(typ), module_names)
+ with typelevel_ctx.suppress_errors():
+ return format_type_inner(typ, verbosity, options, find_type_overlaps(typ), module_names)
def format_type_distinctly(*types: Type, options: Options, bare: bool = False) -> tuple[str, ...]:
@@ -2899,6 +2916,11 @@ def format_type_distinctly(*types: Type, options: Options, bare: bool = False) -
be quoted; callers who need to do post-processing of the strings before
quoting them (such as prepending * or **) should use this.
"""
+ with typelevel_ctx.suppress_errors():
+ return _format_type_distinctly(*types, options=options, bare=bare)
+
+
+def _format_type_distinctly(*types: Type, options: Options, bare: bool = False) -> tuple[str, ...]:
overlapping = find_type_overlaps(*types)
def format_single(arg: Type) -> str:
diff --git a/mypy/nodes.py b/mypy/nodes.py
index 589da3d240fb9..880c46e625e72 100644
--- a/mypy/nodes.py
+++ b/mypy/nodes.py
@@ -1358,6 +1358,7 @@ class Var(SymbolNode):
"type",
"setter_type",
"final_value",
+ "init_type",
"is_self",
"is_cls",
"is_ready",
@@ -1419,6 +1420,9 @@ def __init__(self, name: str, type: mypy.types.Type | None = None) -> None:
# store the literal value (unboxed) for the benefit of
# tools like mypyc.
self.final_value: int | float | complex | bool | str | None = None
+ # The type of the initializer expression, if this is a class member with
+ # an initializer. Used for the Init field in typemap Member types.
+ self.init_type: mypy.types.Type | None = None
# Where the value was set (only for class attributes)
self.final_unset_in_class = False
self.final_set_in_init = False
@@ -1471,6 +1475,8 @@ def serialize(self) -> JsonDict:
}
if self.final_value is not None:
data["final_value"] = self.final_value
+ if self.init_type is not None:
+ data["init_type"] = self.init_type.serialize()
return data
@classmethod
@@ -1494,6 +1500,8 @@ def deserialize(cls, data: JsonDict) -> Var:
v._fullname = data["fullname"]
set_flags(v, data["flags"])
v.final_value = data.get("final_value")
+ if data.get("init_type") is not None:
+ v.init_type = mypy.types.deserialize_type(data["init_type"])
return v
def write(self, data: WriteBuffer) -> None:
@@ -1527,6 +1535,7 @@ def write(self, data: WriteBuffer) -> None:
],
)
write_literal(data, self.final_value)
+ mypy.types.write_type_opt(data, self.init_type)
write_tag(data, END_TAG)
@classmethod
@@ -1567,6 +1576,7 @@ def read(cls, data: ReadBuffer) -> Var:
v.final_value = complex(read_float_bare(data), read_float_bare(data))
elif tag != LITERAL_NONE:
v.final_value = read_literal(data, tag)
+ v.init_type = mypy.types.read_type_opt(data)
assert read_tag(data) == END_TAG
return v
@@ -3581,6 +3591,8 @@ class is generic then it will be a type constructor of higher kind.
"self_type",
"dataclass_transform_spec",
"is_type_check_only",
+ "is_type_operator",
+ "new_protocol_constructor",
"deprecated",
"type_object_type",
)
@@ -3737,6 +3749,19 @@ class is generic then it will be a type constructor of higher kind.
# Is set to `True` when class is decorated with `@typing.type_check_only`
is_type_check_only: bool
+ # Is set to `True` when class is decorated with `@typing._type_operator`
+ # Type operators are used for type-level computation (e.g., GetArg, Members, etc.)
+ is_type_operator: bool
+
+ # For synthetic protocol types created by NewProtocol[...], stores the
+ # unevaluated TypeOperatorType so it can be re-evaluated on cache load
+ # instead of trying to serialize the synthetic TypeInfo.
+ new_protocol_constructor: mypy.types.TypeOperatorType | None
+
+ @property
+ def is_new_protocol(self) -> bool:
+ return self.new_protocol_constructor is not None
+
# The type's deprecation message (in case it is deprecated)
deprecated: str | None
@@ -3756,6 +3781,7 @@ class is generic then it will be a type constructor of higher kind.
"is_final",
"is_disjoint_base",
"is_intersection",
+ "is_type_operator",
]
def __init__(self, names: SymbolTable, defn: ClassDef, module_name: str) -> None:
@@ -3803,6 +3829,8 @@ def __init__(self, names: SymbolTable, defn: ClassDef, module_name: str) -> None
self.self_type = None
self.dataclass_transform_spec = None
self.is_type_check_only = False
+ self.is_type_operator = False
+ self.new_protocol_constructor = None
self.deprecated = None
self.type_object_type = None
@@ -4068,7 +4096,7 @@ def __str__(self) -> str:
options = Options()
return self.dump(
str_conv=mypy.strconv.StrConv(options=options),
- type_str_conv=mypy.types.TypeStrVisitor(options=options),
+ type_str_conv=mypy.types.TypeStrVisitor(options=options, expand=True),
)
def dump(
@@ -4238,6 +4266,7 @@ def write(self, data: WriteBuffer) -> None:
self.is_final,
self.is_disjoint_base,
self.is_intersection,
+ self.is_type_operator,
],
)
write_json(data, self.metadata)
@@ -4311,7 +4340,8 @@ def read(cls, data: ReadBuffer) -> TypeInfo:
ti.is_final,
ti.is_disjoint_base,
ti.is_intersection,
- ) = read_flags(data, num_flags=11)
+ ti.is_type_operator,
+ ) = read_flags(data, num_flags=12)
ti.metadata = read_json(data)
tag = read_tag(data)
if tag != LITERAL_NONE:
@@ -4829,7 +4859,7 @@ def __str__(self) -> str:
s += f" ({self.node.fullname})"
# Include declared type of variables and functions.
if self.type is not None:
- s += f" : {self.type}"
+ s += f" : {self.type.str_with_options(expand=True)}"
if self.cross_ref:
s += f" cross_ref:{self.cross_ref}"
return s
diff --git a/mypy/plugins/proper_plugin.py b/mypy/plugins/proper_plugin.py
index 4221e5f7c0756..d7e50fab48806 100644
--- a/mypy/plugins/proper_plugin.py
+++ b/mypy/plugins/proper_plugin.py
@@ -86,6 +86,9 @@ def is_special_target(right: ProperType) -> bool:
"mypy.types.Type",
"mypy.types.ProperType",
"mypy.types.TypeAliasType",
+ "mypy.types.ComputedType",
+ "mypy.types.TypeOperatorType",
+ "mypy.types.TypeForComprehension",
):
# Special case: things like assert isinstance(typ, ProperType) are always OK.
return True
diff --git a/mypy/semanal.py b/mypy/semanal.py
index bf21e057345fa..07e7e4e11fcba 100644
--- a/mypy/semanal.py
+++ b/mypy/semanal.py
@@ -272,6 +272,7 @@
TYPE_ALIAS_NAMES,
TYPE_CHECK_ONLY_NAMES,
TYPE_NAMES,
+ TYPE_OPERATOR_NAMES,
TYPE_VAR_LIKE_NAMES,
TYPED_NAMEDTUPLE_NAMES,
UNPACK_TYPE_NAMES,
@@ -303,6 +304,7 @@
UnpackType,
flatten_nested_tuples,
get_proper_type,
+ get_proper_type_simple,
get_proper_types,
has_type_vars,
is_named_instance,
@@ -519,6 +521,12 @@ def __init__(
# new uses of this, as this may cause leaking `UnboundType`s to type checking.
self.allow_unbound_tvars = False
+ # Set when we are analyzing a type as an expression.
+ # We need to do that analysis mostly for mypyc.
+ # When doing this, we disable variable binding in `for`, which
+ # can now appear in expressions.
+ self.analyzing_type_expr = False
+
# Used to pass information about current overload index to visit_func_def().
self.current_overload_item: int | None = None
@@ -574,6 +582,15 @@ def allow_unbound_tvars_set(self) -> Iterator[None]:
finally:
self.allow_unbound_tvars = old
+ @contextmanager
+ def analyzing_type_expr_set(self) -> Iterator[None]:
+ old = self.analyzing_type_expr
+ self.analyzing_type_expr = True
+ try:
+ yield
+ finally:
+ self.analyzing_type_expr = old
+
@contextmanager
def inside_except_star_block_set(
self, value: bool, entering_loop: bool = False
@@ -1116,8 +1133,26 @@ def remove_unpack_kwargs(self, defn: FuncDef, typ: CallableType) -> CallableType
if not isinstance(last_type, UnpackType):
return typ
p_last_type = get_proper_type(last_type.type)
+
+ # Handle TypeVar bound to TypedDict - allows inferring TypedDict from kwargs
+ if isinstance(p_last_type, TypeVarType):
+ bound = get_proper_type(p_last_type.upper_bound)
+ if not self.is_typeddict_like(bound):
+ self.fail(
+ "Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound",
+ last_type,
+ )
+ new_arg_types = typ.arg_types[:-1] + [AnyType(TypeOfAny.from_error)]
+ return typ.copy_modified(arg_types=new_arg_types)
+ # For TypeVar, we can't check overlap statically since the actual TypedDict
+ # will be inferred at call sites. Keep the TypeVar for constraint inference.
+ return typ.copy_modified(unpack_kwargs=True)
+
if not isinstance(p_last_type, TypedDictType):
- self.fail("Unpack item in ** parameter must be a TypedDict", last_type)
+ self.fail(
+ "Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound",
+ last_type,
+ )
new_arg_types = typ.arg_types[:-1] + [AnyType(TypeOfAny.from_error)]
return typ.copy_modified(arg_types=new_arg_types)
overlap = set(typ.arg_names) & set(p_last_type.items)
@@ -1134,6 +1169,23 @@ def remove_unpack_kwargs(self, defn: FuncDef, typ: CallableType) -> CallableType
new_arg_types = typ.arg_types[:-1] + [p_last_type]
return typ.copy_modified(arg_types=new_arg_types, unpack_kwargs=True)
+ def is_typeddict_like(self, typ: ProperType) -> bool:
+ """Check if type is TypedDict or inherits from BaseTypedDict."""
+ if isinstance(typ, TypedDictType):
+ return True
+ if isinstance(typ, Instance):
+ # Check if it's a TypedDict class or inherits from BaseTypedDict
+ if typ.type.typeddict_type is not None:
+ return True
+ for base in typ.type.mro:
+ if base.fullname in (
+ "typing.TypedDict",
+ "typing.BaseTypedDict",
+ "_typeshed.typemap.BaseTypedDict",
+ ):
+ return True
+ return False
+
def prepare_method_signature(self, func: FuncDef, info: TypeInfo, has_self_type: bool) -> None:
"""Check basic signature validity and tweak annotation of self/cls argument."""
# Only non-static methods are special, as well as __new__.
@@ -1853,8 +1905,6 @@ def push_type_args(
tvs: list[tuple[str, TypeVarLikeExpr]] = []
for p in type_args:
tv = self.analyze_type_param(p, context)
- if tv is None:
- return None
tvs.append((p.name, tv))
if self.is_defined_type_param(p.name):
@@ -1876,9 +1926,7 @@ def is_defined_type_param(self, name: str) -> bool:
return True
return False
- def analyze_type_param(
- self, type_param: TypeParam, context: Context
- ) -> TypeVarLikeExpr | None:
+ def analyze_type_param(self, type_param: TypeParam, context: Context) -> TypeVarLikeExpr:
fullname = self.qualified_name(type_param.name)
if type_param.upper_bound:
upper_bound = self.anal_type(type_param.upper_bound, allow_placeholder=True)
@@ -2262,6 +2310,8 @@ def analyze_class_decorator_common(
info.is_disjoint_base = True
elif refers_to_fullname(decorator, TYPE_CHECK_ONLY_NAMES):
info.is_type_check_only = True
+ elif refers_to_fullname(decorator, TYPE_OPERATOR_NAMES):
+ info.is_type_operator = True
elif (deprecated := self.get_deprecated(decorator)) is not None:
info.deprecated = f"class {defn.fullname} is deprecated: {deprecated}"
@@ -4008,8 +4058,7 @@ def analyze_alias(
tvar_defs = self.tvar_defs_from_tvars(alias_type_vars, typ)
if python_3_12_type_alias:
- with self.allow_unbound_tvars_set():
- rvalue.accept(self)
+ self.analyze_type_expr(rvalue)
analyzed, depends_on = analyze_type_alias(
typ,
@@ -4361,8 +4410,13 @@ def disable_invalid_recursive_aliases(
f'Cannot resolve name "{current_node.name}" (possible cyclic definition)'
)
elif is_invalid_recursive_alias({current_node}, current_node.target):
+ # Use get_proper_type_simple here: we don't want any
+ # computation-based expansion, or expansion of the arguments.
target = (
- "tuple" if isinstance(get_proper_type(current_node.target), TupleType) else "union"
+ "tuple"
+ if isinstance(get_proper_type_simple(current_node.target), TupleType)
+ else "union"
)
messages.append(f"Invalid recursive alias: a {target} item of itself")
if detect_diverging_alias(
@@ -4478,9 +4532,15 @@ def analyze_name_lvalue(
if (not existing or isinstance(existing.node, PlaceholderNode)) and not outer:
# Define new variable.
- var = self.make_name_lvalue_var(
- lvalue, kind, not explicit_type, has_explicit_value, is_index_var
- )
+ var: SymbolNode
+ if self.analyzing_type_expr:
+ # When analyzing type expressions, the lvalues are type variables.
+ param = TypeParam(lvalue.name, TYPE_VAR_KIND, None, [], None)
+ var = self.analyze_type_param(param, lvalue)
+ else:
+ var = self.make_name_lvalue_var(
+ lvalue, kind, not explicit_type, has_explicit_value, is_index_var
+ )
added = self.add_symbol(name, var, lvalue, escape_comprehensions=escape_comprehensions)
# Only bind expression if we successfully added name to symbol table.
if added:
@@ -5140,6 +5200,13 @@ def process_typevartuple_declaration(self, s: AssignmentStmt) -> bool:
# PEP 646 does not specify the behavior of variance, constraints, or bounds.
if not call.analyzed:
+ # XXX: Since a TypeVarTuple was added to typing.pyi, these can
+ # sometimes be processed before builtins.tuple is available.
+ # Work around this by deferring.
+ if self.lookup_fully_qualified_or_none("builtins.tuple") is None:
+ self.defer()
+ return True
+
tuple_fallback = self.named_type("builtins.tuple", [self.object_type()])
typevartuple_var = TypeVarTupleExpr(
name,
@@ -6016,6 +6083,8 @@ def visit_call_expr(self, expr: CallExpr) -> None:
except TypeTranslationError:
self.fail("Argument 1 to _promote is not a type", expr)
return
+ # _promote should only be used with proper types
+ assert isinstance(target, ProperType)
expr.analyzed = PromoteExpr(target)
expr.analyzed.line = expr.line
expr.analyzed.accept(self)
@@ -6204,6 +6273,11 @@ def visit_index_expr(self, expr: IndexExpr) -> None:
and not base.node.is_generic()
):
expr.index.accept(self)
+ elif isinstance(expr.index, (GeneratorExpr, ListComprehension)):
+ # An index that is a comprehension (Foo[... for ...]) is a
+ # type-level comprehension; leave conversion to
+ # expr_to_unanalyzed_type. Just do normal semantic analysis
+ # for name resolution here.
+ expr.index.accept(self)
elif (
isinstance(base, RefExpr) and isinstance(base.node, TypeAlias)
) or refers_to_class_or_function(base):
@@ -6759,6 +6833,9 @@ def lookup_qualified(
# See https://github.com/python/mypy/pull/13468
if isinstance(node, ParamSpecExpr) and part in ("args", "kwargs"):
return None
+ # Allow attribute access on type variables in type expression context
+ if isinstance(node, TypeVarExpr) and self.allow_unbound_tvars:
+ return None
# Lookup through invalid node, such as variable or function
nextsym = None
if not nextsym or nextsym.module_hidden:
@@ -7524,6 +7601,7 @@ def name_not_defined(self, name: str, ctx: Context, namespace: str | None = None
# later on. Defer current target.
self.record_incomplete_ref()
return
+
message = f'Name "{name}" is not defined'
self.fail(message, ctx, code=codes.NAME_DEFINED)
@@ -7711,7 +7789,11 @@ def analyze_type_expr(self, expr: Expression) -> None:
# them semantically analyzed, however, if they need to treat it as an expression
# and not a type. (Which is to say, mypyc needs to do this.) Do the analysis
# in a fresh tvar scope in order to suppress any errors about using type variables.
- with self.tvar_scope_frame(TypeVarLikeScope()), self.allow_unbound_tvars_set():
+ with (
+ self.tvar_scope_frame(TypeVarLikeScope()),
+ self.allow_unbound_tvars_set(),
+ self.analyzing_type_expr_set(),
+ ):
expr.accept(self)
def type_analyzer(
@@ -7755,7 +7837,7 @@ def type_analyzer(
tpan.global_scope = not self.type and not self.function_stack
return tpan
- def expr_to_unanalyzed_type(self, node: Expression, allow_unpack: bool = False) -> ProperType:
+ def expr_to_unanalyzed_type(self, node: Expression, allow_unpack: bool = False) -> Type:
return expr_to_unanalyzed_type(
node, self.options, self.is_stub_file, allow_unpack=allow_unpack
)
diff --git a/mypy/semanal_main.py b/mypy/semanal_main.py
index 6c2f51b39eb12..782dc40544c77 100644
--- a/mypy/semanal_main.py
+++ b/mypy/semanal_main.py
@@ -34,7 +34,16 @@
import mypy.state
from mypy.checker import FineGrainedDeferredNode
from mypy.errors import Errors
-from mypy.nodes import Decorator, FuncDef, MypyFile, OverloadedFuncDef, TypeInfo, Var
+from mypy.nodes import (
+ MDEF,
+ Decorator,
+ FuncDef,
+ MypyFile,
+ OverloadedFuncDef,
+ SymbolTableNode,
+ TypeInfo,
+ Var,
+)
from mypy.options import Options
from mypy.plugin import ClassDefContext
from mypy.plugins import dataclasses as dataclasses_plugin
@@ -499,6 +508,7 @@ def apply_class_plugin_hooks(graph: Graph, scc: list[str], errors: Errors) -> No
state.options,
tree,
errors,
+ state,
):
incomplete = True
@@ -510,6 +520,7 @@ def apply_hooks_to_class(
options: Options,
file_node: MypyFile,
errors: Errors,
+ state: State | None = None,
) -> bool:
# TODO: Move more class-related hooks here?
defn = info.defn
@@ -539,9 +550,154 @@ def apply_hooks_to_class(
# an Expression for reason
ok = ok and dataclasses_plugin.DataclassTransformer(defn, defn, spec, self).transform()
+ # Apply UpdateClass effects from decorators and __init_subclass__
+ with self.file_context(file_node, options, info):
+ _apply_update_class_effects(self, info, state)
+
return ok
+def _populate_init_types(info: TypeInfo, state: State | None) -> None:
+ """Populate init_type on class Vars that have explicit values.
+
+ Normally init_type is set during type checking (checker.py), but
+ UpdateClass needs it during the post-semanal pass. We use the
+ TypeChecker's expression checker to infer the rvalue types, which
+ gives accurate results for all expressions (literals, None, Field()
+ calls, etc.).
+ """
+ from mypy.nodes import AssignmentStmt, NameExpr
+ from mypy.types_utils import try_getting_literal
+
+ if state is None:
+ return
+
+ checker = state.type_checker()
+
+ for stmt in info.defn.defs.body:
+ if not isinstance(stmt, AssignmentStmt):
+ continue
+ for lvalue in stmt.lvalues:
+ if not isinstance(lvalue, NameExpr) or not isinstance(lvalue.node, Var):
+ continue
+ var = lvalue.node
+ if not var.has_explicit_value or var.init_type is not None:
+ continue
+
+ try:
+ rvalue_type = checker.expr_checker.accept(stmt.rvalue)
+ except Exception:
+ continue
+
+ var.init_type = try_getting_literal(rvalue_type)
+
+
+def _apply_update_class_effects(
+ self: SemanticAnalyzer, info: TypeInfo, state: State | None = None
+) -> None:
+ """Apply UpdateClass effects from decorators and __init_subclass__ to a class."""
+ from mypy.expandtype import expand_type
+ from mypy.nodes import RefExpr
+ from mypy.typelevel import MemberDef, evaluate_update_class
+ from mypy.types import (
+ CallableType,
+ Instance,
+ TypeOperatorType,
+ TypeType,
+ TypeVarType,
+ UninhabitedType,
+ get_proper_type,
+ )
+ from mypy.typevars import fill_typevars
+
+ defn = info.defn
+ init_types_populated = False
+
+ def _get_class_instance() -> Instance:
+ inst = fill_typevars(info)
+ if isinstance(inst, Instance):
+ return inst
+ # TupleType fallback
+ return inst.partial_fallback
+
+ def _resolve_and_apply(func_type: CallableType) -> None:
+ """Check if func_type returns UpdateClass, resolve it, and apply to info."""
+ nonlocal init_types_populated
+
+ ret_type = func_type.ret_type
+ # Don't use get_proper_type here — it would eagerly expand the TypeOperatorType
+ if not isinstance(ret_type, TypeOperatorType):
+ return
+ if ret_type.type.name != "UpdateClass":
+ return
+
+ # Populate init_type on class vars before evaluating, so that
+ # Attrs[T]/Members[T] can see defaults. Only do this once, and
+ # only when we actually have an UpdateClass to apply.
+ if not init_types_populated:
+ _populate_init_types(info, state)
+ init_types_populated = True
+
+ # Build substitution map: find type[T] in the first arg and bind T to class instance
+ env = {}
+ if func_type.arg_types:
+ first_arg = get_proper_type(func_type.arg_types[0])
+ if isinstance(first_arg, TypeType) and isinstance(first_arg.item, TypeVarType):
+ env[first_arg.item.id] = _get_class_instance()
+
+ # Substitute type vars in return type
+ if env:
+ ret_type = expand_type(ret_type, env)
+
+ assert isinstance(ret_type, TypeOperatorType)
+ members = evaluate_update_class(ret_type, self, defn)
+ if members is not None:
+ _apply_members(members)
+
+ def _apply_members(members: list[MemberDef]) -> None:
+ for m in members:
+ if isinstance(get_proper_type(m.type), UninhabitedType):
+ # Never-typed members mean "remove this member"
+ info.names.pop(m.name, None)
+ else:
+ var = Var(m.name, m.type)
+ var.info = info
+ var._fullname = f"{info.fullname}.{m.name}"
+ var.is_classvar = m.is_classvar
+ var.is_final = m.is_final
+ var.is_initialized_in_class = True
+ var.init_type = m.init_type
+ var.is_inferred = False
+ info.names[m.name] = SymbolTableNode(MDEF, var)
+
+ def _get_func_type(
+ node: FuncDef | Decorator | OverloadedFuncDef | None,
+ ) -> CallableType | None:
+ if isinstance(node, Decorator):
+ node = node.func
+ if isinstance(node, FuncDef) and isinstance(node.type, CallableType):
+ return node.type
+ return None
+
+ # Apply UpdateClass from decorators
+ for decorator in defn.decorators:
+ if isinstance(decorator, RefExpr) and decorator.node is not None:
+ node = decorator.node
+ if isinstance(node, (FuncDef, Decorator, OverloadedFuncDef)):
+ func_type = _get_func_type(node)
+ if func_type is not None:
+ _resolve_and_apply(func_type)
+
+ # Apply UpdateClass from __init_subclass__ (reverse MRO order, per PEP spec)
+ for base in reversed(info.mro[1:]):
+ if "__init_subclass__" not in base.names:
+ continue
+ sym = base.names["__init_subclass__"]
+ func_type = _get_func_type(sym.node) # type: ignore[arg-type]
+ if func_type is not None:
+ _resolve_and_apply(func_type)
+
+
def calculate_class_properties(graph: Graph, scc: list[str], errors: Errors) -> None:
builtins = graph["builtins"].tree
assert builtins
diff --git a/mypy/semanal_shared.py b/mypy/semanal_shared.py
index a85d4ed00b5e6..041b234f6be18 100644
--- a/mypy/semanal_shared.py
+++ b/mypy/semanal_shared.py
@@ -17,6 +17,7 @@
Decorator,
Expression,
FuncDef,
+ MypyFile,
NameExpr,
Node,
OverloadedFuncDef,
@@ -155,6 +156,7 @@ class SemanticAnalyzerInterface(SemanticAnalyzerCoreInterface):
"""
tvar_scope: TypeVarLikeScope
+ modules: dict[str, MypyFile]
@abstractmethod
def lookup(
diff --git a/mypy/semanal_typeargs.py b/mypy/semanal_typeargs.py
index 0f62a4aa8b1a2..8aa1e631e8363 100644
--- a/mypy/semanal_typeargs.py
+++ b/mypy/semanal_typeargs.py
@@ -15,19 +15,21 @@
from mypy.message_registry import INVALID_PARAM_SPEC_LOCATION, INVALID_PARAM_SPEC_LOCATION_NOTE
from mypy.messages import format_type
from mypy.mixedtraverser import MixedTraverserVisitor
-from mypy.nodes import Block, ClassDef, Context, FakeInfo, FuncItem, MypyFile
+from mypy.nodes import Block, ClassDef, Context, FakeInfo, FuncItem, MypyFile, TypeAlias
from mypy.options import Options
from mypy.scope import Scope
from mypy.subtypes import is_same_type, is_subtype
from mypy.types import (
AnyType,
CallableType,
+ ComputedType,
Instance,
Parameters,
ParamSpecType,
TupleType,
Type,
TypeAliasType,
+ TypedDictType,
TypeOfAny,
TypeVarLikeType,
TypeVarTupleType,
@@ -36,6 +38,7 @@
UnpackType,
flatten_nested_tuples,
get_proper_type,
+ get_proper_type_simple,
get_proper_types,
split_with_prefix_and_suffix,
)
@@ -60,7 +63,7 @@ def __init__(
self.recurse_into_functions = True
# Keep track of the type aliases already visited. This is needed to avoid
# infinite recursion on types like A = Union[int, List[A]].
- self.seen_aliases: set[TypeAliasType] = set()
+ self.seen_aliases: set[TypeAlias] = set()
def visit_mypy_file(self, o: MypyFile) -> None:
self.errors.set_file(o.path, o.fullname, scope=self.scope, options=self.options)
@@ -83,12 +86,12 @@ def visit_block(self, o: Block) -> None:
def visit_type_alias_type(self, t: TypeAliasType) -> None:
super().visit_type_alias_type(t)
+ assert t.alias is not None, f"Unfixed type alias {t.type_ref}"
if t.is_recursive:
- if t in self.seen_aliases:
+ if t.alias in self.seen_aliases:
# Avoid infinite recursion on recursive type aliases.
return
- self.seen_aliases.add(t)
- assert t.alias is not None, f"Unfixed type alias {t.type_ref}"
+ self.seen_aliases.add(t.alias)
is_error, is_invalid = self.validate_args(
t.alias.name, tuple(t.args), t.alias.alias_tvars, t
)
@@ -100,12 +103,15 @@ def visit_type_alias_type(self, t: TypeAliasType) -> None:
if not is_error:
# If there was already an error for the alias itself, there is no point in checking
# the expansion, most likely it will result in the same kind of error.
+
if t.args:
+ # XXX: Tried the stricter condition below at one point; it may not be needed:
+ # if t.args and not any(isinstance(st, ComputedType) for st in t.args):
# Since we always allow unbounded type variables in alias definitions, we need
# to verify the arguments satisfy the upper bounds of the expansion as well.
- get_proper_type(t).accept(self)
+ get_proper_type_simple(t).accept(self)
if t.is_recursive:
- self.seen_aliases.discard(t)
+ self.seen_aliases.discard(t.alias)
def visit_tuple_type(self, t: TupleType) -> None:
t.items = flatten_nested_tuples(t.items)
@@ -236,8 +242,12 @@ def validate_args(
)
elif isinstance(tvar, TypeVarTupleType):
p_arg = get_proper_type(arg)
- assert isinstance(p_arg, TupleType)
- for it in p_arg.items:
+ # Since get_proper_type now performs type evaluation, a
+ # variadic argument may expand to an Instance of a
+ # builtins.tuple subtype.
+ assert isinstance(p_arg, (TupleType, Instance))
+ items = p_arg.items if isinstance(p_arg, TupleType) else p_arg.args
+ for it in items:
if self.check_non_paramspec(it, "TypeVarTuple", context):
is_invalid = True
if is_invalid:
@@ -247,6 +257,9 @@ def validate_args(
def visit_unpack_type(self, typ: UnpackType) -> None:
super().visit_unpack_type(typ)
proper_type = get_proper_type(typ.type)
+ # A ComputedType here is probably stuck (could not be evaluated); assume it is OK.
+ if isinstance(proper_type, ComputedType):
+ return
if isinstance(proper_type, TupleType):
return
if isinstance(proper_type, TypeVarTupleType):
@@ -255,6 +268,16 @@ def visit_unpack_type(self, typ: UnpackType) -> None:
# tricky however, since this needs map_instance_to_supertype() available in many places.
if isinstance(proper_type, Instance) and proper_type.type.fullname == "builtins.tuple":
return
+ # TypeVar with TypedDict bound is allowed for **kwargs unpacking with inference.
+ # Note: for concrete TypedDict, semanal.py's remove_unpack_kwargs() unwraps the Unpack,
+ # so this check won't be reached. For TypeVar, we keep the Unpack for constraint inference.
+ if isinstance(proper_type, TypeVarType):
+ bound = get_proper_type(proper_type.upper_bound)
+ if isinstance(bound, TypedDictType):
+ return
+ # Also allow Instance bounds that are TypedDict-like
+ if isinstance(bound, Instance) and bound.type.typeddict_type is not None:
+ return
if not isinstance(proper_type, (UnboundType, AnyType)):
# Avoid extra errors if there were some errors already. Also interpret plain Any
# as tuple[Any, ...] (this is better for the code in type checker).
diff --git a/mypy/semanal_typeddict.py b/mypy/semanal_typeddict.py
index 3655e4c89dd4b..0d1993017990d 100644
--- a/mypy/semanal_typeddict.py
+++ b/mypy/semanal_typeddict.py
@@ -118,6 +118,8 @@ def analyze_typeddict_classdef(self, defn: ClassDef) -> tuple[bool, TypeInfo | N
info = self.build_typeddict_typeinfo(
defn.name, field_types, required_keys, readonly_keys, defn.line, existing_info
)
+ if not info:
+ return True, None
defn.analyzed = TypedDictExpr(info)
defn.analyzed.line = defn.line
defn.analyzed.column = defn.column
@@ -173,6 +175,8 @@ def analyze_typeddict_classdef(self, defn: ClassDef) -> tuple[bool, TypeInfo | N
info = self.build_typeddict_typeinfo(
defn.name, field_types, required_keys, readonly_keys, defn.line, existing_info
)
+ if not info:
+ return True, None
defn.analyzed = TypedDictExpr(info)
defn.analyzed.line = defn.line
defn.analyzed.column = defn.column
@@ -489,7 +493,10 @@ def check_typeddict(
call.line,
existing_info,
)
- info.line = node.line
+
+ if not info:
+ return True, None, []
+ info.line = node.line
# Store generated TypeInfo under both names, see semanal_namedtuple for more details.
if name != var_name or is_func_scope:
self.api.add_symbol_skip_local(name, info)
@@ -597,14 +604,16 @@ def build_typeddict_typeinfo(
readonly_keys: set[str],
line: int,
existing_info: TypeInfo | None,
- ) -> TypeInfo:
+ ) -> TypeInfo | None:
# Prefer typing then typing_extensions if available.
fallback = (
self.api.named_type_or_none("typing._TypedDict", [])
or self.api.named_type_or_none("typing_extensions._TypedDict", [])
or self.api.named_type_or_none("mypy_extensions._TypedDict", [])
)
- assert fallback is not None
+ # Adding a TypedDict subtype into typing sometimes makes the fallback unavailable here (deferral)
+ if fallback is None:
+ return None
info = existing_info or self.api.basic_new_typeinfo(name, fallback, line)
typeddict_type = TypedDictType(item_types, required_keys, readonly_keys, fallback)
if has_placeholder(typeddict_type):
diff --git a/mypy/server/astdiff.py b/mypy/server/astdiff.py
index 9bbc3077ec512..7fc650ead3060 100644
--- a/mypy/server/astdiff.py
+++ b/mypy/server/astdiff.py
@@ -91,6 +91,8 @@ class level -- these are handled at attribute level (say, 'mod.Cls.method'
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
+ TypeOperatorType,
TypeType,
TypeVarId,
TypeVarLikeType,
@@ -524,6 +526,19 @@ def visit_type_alias_type(self, typ: TypeAliasType) -> SnapshotItem:
assert typ.alias is not None
return ("TypeAliasType", typ.alias.fullname, snapshot_types(typ.args))
+ def visit_type_operator_type(self, typ: TypeOperatorType) -> SnapshotItem:
+ name = typ.type.fullname if typ.type else ""
+ return ("TypeOperatorType", name, snapshot_types(typ.args))
+
+ def visit_type_for_comprehension(self, typ: TypeForComprehension) -> SnapshotItem:
+ return (
+ "TypeForComprehension",
+ snapshot_type(typ.element_expr),
+ typ.iter_name,
+ snapshot_type(typ.iter_type),
+ snapshot_types(typ.conditions),
+ )
+
def snapshot_untyped_signature(func: OverloadedFuncDef | FuncItem) -> SymbolSnapshot:
"""Create a snapshot of the signature of a function that has no explicit signature.
diff --git a/mypy/server/astmerge.py b/mypy/server/astmerge.py
index 56f2f935481c5..280d31a34e01d 100644
--- a/mypy/server/astmerge.py
+++ b/mypy/server/astmerge.py
@@ -101,7 +101,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeList,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -539,6 +541,16 @@ def visit_union_type(self, typ: UnionType) -> None:
for item in typ.items:
item.accept(self)
+ def visit_type_operator_type(self, typ: TypeOperatorType) -> None:
+ for arg in typ.args:
+ arg.accept(self)
+
+ def visit_type_for_comprehension(self, typ: TypeForComprehension) -> None:
+ typ.element_expr.accept(self)
+ typ.iter_type.accept(self)
+ for c in typ.conditions:
+ c.accept(self)
+
def visit_placeholder_type(self, t: PlaceholderType) -> None:
for item in t.args:
item.accept(self)
diff --git a/mypy/server/deps.py b/mypy/server/deps.py
index ba622329665ea..d55471d8da133 100644
--- a/mypy/server/deps.py
+++ b/mypy/server/deps.py
@@ -162,7 +162,9 @@ class 'mod.Cls'. This can also refer to an attribute inherited from a
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -1097,6 +1099,22 @@ def visit_union_type(self, typ: UnionType) -> list[str]:
triggers.extend(self.get_type_triggers(item))
return triggers
+ def visit_type_operator_type(self, typ: TypeOperatorType) -> list[str]:
+ triggers = []
+ if typ.type:
+ triggers.append(make_trigger(typ.type.fullname))
+ for arg in typ.args:
+ triggers.extend(self.get_type_triggers(arg))
+ return triggers
+
+ def visit_type_for_comprehension(self, typ: TypeForComprehension) -> list[str]:
+ triggers = []
+ triggers.extend(self.get_type_triggers(typ.element_expr))
+ triggers.extend(self.get_type_triggers(typ.iter_type))
+ for cond in typ.conditions:
+ triggers.extend(self.get_type_triggers(cond))
+ return triggers
+
def merge_dependencies(new_deps: dict[str, set[str]], deps: dict[str, set[str]]) -> None:
for trigger, targets in new_deps.items():
diff --git a/mypy/strconv.py b/mypy/strconv.py
index b26f1d8d71a8e..2b943d5122a74 100644
--- a/mypy/strconv.py
+++ b/mypy/strconv.py
@@ -41,7 +41,9 @@ def __init__(self, *, show_ids: bool = False, options: Options) -> None:
def stringify_type(self, t: mypy.types.Type) -> str:
import mypy.types
- return t.accept(mypy.types.TypeStrVisitor(id_mapper=self.id_mapper, options=self.options))
+ return t.accept(
+ mypy.types.TypeStrVisitor(id_mapper=self.id_mapper, options=self.options, expand=True)
+ )
def get_id(self, o: object) -> int | None:
if self.id_mapper:
@@ -691,7 +693,12 @@ def dump_tagged(nodes: Sequence[object], tag: str | None, str_conv: StrConv) ->
a.append(indent(n.accept(str_conv), 2))
elif isinstance(n, Type):
a.append(
- indent(n.accept(TypeStrVisitor(str_conv.id_mapper, options=str_conv.options)), 2)
+ indent(
+ n.accept(
+ TypeStrVisitor(str_conv.id_mapper, options=str_conv.options, expand=True)
+ ),
+ 2,
+ )
)
elif n is not None:
a.append(indent(str(n), 2))
diff --git a/mypy/subtypes.py b/mypy/subtypes.py
index 66d7a95eb4252..142223503f7b2 100644
--- a/mypy/subtypes.py
+++ b/mypy/subtypes.py
@@ -58,7 +58,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeOfAny,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -1160,6 +1162,35 @@ def visit_type_type(self, left: TypeType) -> bool:
def visit_type_alias_type(self, left: TypeAliasType) -> bool:
assert False, f"This should be never called, got {left}"
+ def visit_type_operator_type(self, left: TypeOperatorType) -> bool:
+ # PERF: Using is_same_type here can lead to exponential-time checking...
+ if isinstance(self.right, TypeOperatorType):
+ if left.type == self.right.type and len(left.args) == len(self.right.args):
+ if all(is_same_type(la, ra) for la, ra in zip(left.args, self.right.args)):
+ return True
+ return self._is_subtype(left.fallback, self.right)
+
+ def visit_type_for_comprehension(self, left: TypeForComprehension) -> bool:
+ # PERF: Using is_same_type here can lead to exponential-time checking...
+ # TODO: should we match some Unpacks?
+ if isinstance(self.right, TypeForComprehension):
+ if len(left.conditions) != len(self.right.conditions):
+ return False
+ if not is_same_type(left.iter_type, self.right.iter_type):
+ return False
+
+ # Substitute left.iter_var for right.iter_var in right's expressions
+ assert self.right.iter_var is not None and left.iter_var is not None
+ env = {self.right.iter_var.id: left.iter_var}
+ right_element_expr = expand_type(self.right.element_expr, env)
+ right_conditions = [expand_type(c, env) for c in self.right.conditions]
+
+ if is_same_type(left.element_expr, right_element_expr) and all(
+ is_same_type(lc, rc) for lc, rc in zip(left.conditions, right_conditions)
+ ):
+ return True
+ return False
+
T = TypeVar("T", bound=Type)
@@ -2277,10 +2308,16 @@ def infer_class_variances(info: TypeInfo) -> bool:
return True
tvs = info.defn.type_vars
success = True
- for i, tv in enumerate(tvs):
- if isinstance(tv, TypeVarType) and tv.variance == VARIANCE_NOT_READY:
- if not infer_variance(info, i):
- success = False
+ # Variance inference substitutes type variables with synthetic bounds and
+ # runs subtype checks, which can trigger type-level operator evaluation on
+ # types the user never wrote. Suppress any errors from those evaluations.
+ from mypy.typelevel import typelevel_ctx
+
+ with typelevel_ctx.suppress_errors():
+ for i, tv in enumerate(tvs):
+ if isinstance(tv, TypeVarType) and tv.variance == VARIANCE_NOT_READY:
+ if not infer_variance(info, i):
+ success = False
return success
diff --git a/mypy/test/testcheck.py b/mypy/test/testcheck.py
index cbb9c0235feda..de16a2f33d842 100644
--- a/mypy/test/testcheck.py
+++ b/mypy/test/testcheck.py
@@ -48,6 +48,8 @@
typecheck_files.remove("check-python311.test")
if sys.version_info < (3, 12):
typecheck_files.remove("check-python312.test")
+ typecheck_files = [f for f in typecheck_files if "typelevel" not in f]
+ typecheck_files.remove("check-kwargs-unpack-typevar.test")
if sys.version_info < (3, 13):
typecheck_files.remove("check-python313.test")
if sys.version_info < (3, 14):
diff --git a/mypy/test/testmerge.py b/mypy/test/testmerge.py
index 2e88b519f7f85..c070b12455f3c 100644
--- a/mypy/test/testmerge.py
+++ b/mypy/test/testmerge.py
@@ -44,7 +44,7 @@ def setup(self) -> None:
self.str_conv = StrConv(show_ids=True, options=Options())
assert self.str_conv.id_mapper is not None
self.id_mapper: IdMapper = self.str_conv.id_mapper
- self.type_str_conv = TypeStrVisitor(self.id_mapper, options=Options())
+ self.type_str_conv = TypeStrVisitor(self.id_mapper, options=Options(), expand=True)
def run_case(self, testcase: DataDrivenTestCase) -> None:
name = testcase.name
diff --git a/mypy/test/testtypes.py b/mypy/test/testtypes.py
index 6562f541d73bc..259bcd8c6079c 100644
--- a/mypy/test/testtypes.py
+++ b/mypy/test/testtypes.py
@@ -1598,9 +1598,11 @@ def make_call(*items: tuple[str, str | None]) -> CallExpr:
class TestExpandTypeLimitGetProperType(TestCase):
+ # WARNING: This should probably stay 0 forever.
+ ALLOWED_GET_PROPER_TYPES = 0
# WARNING: do not increase this number unless absolutely necessary,
# and you understand what you are doing.
- ALLOWED_GET_PROPER_TYPES = 7
+ ALLOWED_GET_PROPER_TYPE_SIMPLE = 7
@skipUnless(mypy.expandtype.__file__.endswith(".py"), "Skip for compiled mypy")
def test_count_get_proper_type(self) -> None:
@@ -1609,3 +1611,7 @@ def test_count_get_proper_type(self) -> None:
get_proper_type_count = len(re.findall(r"get_proper_type\(", code))
get_proper_type_count -= len(re.findall(r"get_proper_type\(\)", code))
assert get_proper_type_count == self.ALLOWED_GET_PROPER_TYPES
+
+ get_proper_type_simple_count = len(re.findall(r"get_proper_type_simple\(", code))
+ get_proper_type_simple_count -= len(re.findall(r"get_proper_type_simple\(\)", code))
+ assert get_proper_type_simple_count == self.ALLOWED_GET_PROPER_TYPE_SIMPLE
diff --git a/mypy/treetransform.py b/mypy/treetransform.py
index 25092de66a149..77a401c980a07 100644
--- a/mypy/treetransform.py
+++ b/mypy/treetransform.py
@@ -307,6 +307,7 @@ def visit_var(self, node: Var) -> Var:
new.is_property = node.is_property
new.is_final = node.is_final
new.final_value = node.final_value
+ new.init_type = node.init_type
new.final_unset_in_class = node.final_unset_in_class
new.final_set_in_init = node.final_set_in_init
new.set_line(node)
diff --git a/mypy/type_visitor.py b/mypy/type_visitor.py
index 1b38481ba0004..093418bcee8fb 100644
--- a/mypy/type_visitor.py
+++ b/mypy/type_visitor.py
@@ -15,7 +15,7 @@
from abc import abstractmethod
from collections.abc import Iterable, Sequence
-from typing import Any, Final, Generic, TypeVar, cast
+from typing import TYPE_CHECKING, Any, Final, Generic, TypeVar, cast
from mypy_extensions import mypyc_attr, trait
@@ -39,7 +39,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeList,
+ TypeOperatorType,
TypeType,
TypeVarLikeType,
TypeVarTupleType,
@@ -51,6 +53,10 @@
get_proper_type,
)
+if TYPE_CHECKING:
+ from mypy.nodes import TypeAlias
+
+
T = TypeVar("T", covariant=True)
@@ -146,6 +152,14 @@ def visit_type_alias_type(self, t: TypeAliasType, /) -> T:
def visit_unpack_type(self, t: UnpackType, /) -> T:
pass
+ @abstractmethod
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> T:
+ pass
+
+ @abstractmethod
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> T:
+ pass
+
@trait
@mypyc_attr(allow_interpreted_subclasses=True)
@@ -340,6 +354,16 @@ def visit_type_alias_type(self, t: TypeAliasType, /) -> Type:
# must implement this depending on its semantics.
pass
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> Type:
+ return t.copy_modified(args=self.translate_type_list(t.args))
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> Type:
+ return t.copy_modified(
+ element_expr=t.element_expr.accept(self),
+ iter_type=t.iter_type.accept(self),
+ conditions=[c.accept(self) for c in t.conditions],
+ )
+
@mypyc_attr(allow_interpreted_subclasses=True)
class TypeQuery(SyntheticTypeVisitor[T]):
@@ -358,7 +382,7 @@ class TypeQuery(SyntheticTypeVisitor[T]):
def __init__(self) -> None:
# Keep track of the type aliases already visited. This is needed to avoid
# infinite recursion on types like A = Union[int, List[A]].
- self.seen_aliases: set[TypeAliasType] | None = None
+ self.seen_aliases: set[TypeAlias] | None = None
# By default, we eagerly expand type aliases, and query also types in the
# alias target. In most cases this is a desired behavior, but we may want
# to skip targets in some cases (e.g. when collecting type variables).
@@ -451,11 +475,18 @@ def visit_type_alias_type(self, t: TypeAliasType, /) -> T:
# (also use this as a simple-minded cache).
if self.seen_aliases is None:
self.seen_aliases = set()
- elif t in self.seen_aliases:
+ elif t.alias in self.seen_aliases:
return self.strategy([])
- self.seen_aliases.add(t)
+ if t.alias:
+ self.seen_aliases.add(t.alias)
return get_proper_type(t).accept(self)
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> T:
+ return self.query_types(t.args)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> T:
+ return self.query_types([t.element_expr, t.iter_type] + t.conditions)
+
def query_types(self, types: Iterable[Type]) -> T:
"""Perform a query for a list of types using the strategy to combine the results."""
return self.strategy([t.accept(self) for t in types])
@@ -490,7 +521,7 @@ def __init__(self, strategy: int) -> None:
# Keep track of the type aliases already visited. This is needed to avoid
# infinite recursion on types like A = Union[int, List[A]]. An empty set is
# represented as None as a micro-optimization.
- self.seen_aliases: set[TypeAliasType] | None = None
+ self.seen_aliases: set[TypeAlias] | None = None
# By default, we eagerly expand type aliases, and query also types in the
# alias target. In most cases this is a desired behavior, but we may want
# to skip targets in some cases (e.g. when collecting type variables).
@@ -592,11 +623,18 @@ def visit_type_alias_type(self, t: TypeAliasType, /) -> bool:
# (also use this as a simple-minded cache).
if self.seen_aliases is None:
self.seen_aliases = set()
- elif t in self.seen_aliases:
+ elif t.alias in self.seen_aliases:
return self.default
- self.seen_aliases.add(t)
+ if t.alias:
+ self.seen_aliases.add(t.alias)
return get_proper_type(t).accept(self)
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> bool:
+ return self.query_types(t.args)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> bool:
+ return self.query_types([t.element_expr, t.iter_type] + t.conditions)
+
def query_types(self, types: list[Type] | tuple[Type, ...]) -> bool:
"""Perform a query for a sequence of types using the strategy to combine the results."""
# Special-case for lists and tuples to allow mypyc to produce better code.
diff --git a/mypy/typeanal.py b/mypy/typeanal.py
index b22e1f80be592..2651ccd8e77ed 100644
--- a/mypy/typeanal.py
+++ b/mypy/typeanal.py
@@ -97,8 +97,10 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeList,
TypeOfAny,
+ TypeOperatorType,
TypeQuery,
TypeType,
TypeVarId,
@@ -113,6 +115,7 @@
find_unpack_in_list,
flatten_nested_tuples,
get_proper_type,
+ get_proper_type_simple,
has_type_vars,
)
from mypy.types_utils import get_bad_type_type_item
@@ -297,7 +300,7 @@ def not_declared_in_type_params(self, tvar_name: str) -> bool:
)
def visit_unbound_type_nonoptional(self, t: UnboundType, defining_literal: bool) -> Type:
- sym = self.lookup_qualified(t.name, t)
+ sym = self.lookup_qualified(t.name, t, suppress_errors="." in t.name)
param_spec_name = None
if t.name.endswith((".args", ".kwargs")):
param_spec_name = t.name.rsplit(".", 1)[0]
@@ -510,6 +513,9 @@ def visit_unbound_type_nonoptional(self, t: UnboundType, defining_literal: bool)
res = get_proper_type(res)
return res
elif isinstance(node, TypeInfo):
+ # Check if this is a type operator (decorated with @_type_operator)
+ if node.is_type_operator:
+ return self.analyze_type_operator(t, node)
return self.analyze_type_with_type_info(node, t.args, t, t.empty_tuple_index)
elif node.fullname in TYPE_ALIAS_NAMES:
return AnyType(TypeOfAny.special_form)
@@ -520,6 +526,33 @@ def visit_unbound_type_nonoptional(self, t: UnboundType, defining_literal: bool)
else:
return self.analyze_unbound_type_without_type_info(t, sym, defining_literal)
else: # sym is None
+ # TODO: Reconsider whether this is the right place for this handling.
+ # Try dot notation for type-level attribute access: T.attr -> _TypeGetAttr[T, Literal["attr"]]
+ # Only applies when the prefix is a type variable (not a module, class, etc.)
+ if "." in t.name:
+ prefix, attr = t.name.rsplit(".", 1)
+ prefix_sym = self.lookup_qualified(prefix, t, suppress_errors=True)
+ if prefix_sym is not None and isinstance(prefix_sym.node, TypeVarExpr):
+ operator_sym = self.api.lookup_fully_qualified_or_none("builtins._TypeGetAttr")
+ if operator_sym and isinstance(operator_sym.node, TypeInfo):
+ prefix_type = self.anal_type(
+ UnboundType(prefix, t.args, line=t.line, column=t.column)
+ )
+ attr_literal = LiteralType(
+ attr, self.named_type("builtins.str"), line=t.line, column=t.column
+ )
+ fallback = self.named_type("builtins.object")
+ return TypeOperatorType(
+ operator_sym.node,
+ [prefix_type, attr_literal],
+ fallback,
+ t.line,
+ t.column,
+ )
+ else:
+ # Prefix is not a type variable — re-issue the original lookup
+ # without suppressed errors so the proper error is generated
+ self.lookup_qualified(t.name, t)
return AnyType(TypeOfAny.special_form)
def pack_paramspec_args(self, an_args: Sequence[Type], empty_tuple_index: bool) -> list[Type]:
@@ -646,7 +679,20 @@ def try_analyze_special_unbound_type(self, t: UnboundType, fullname: str) -> Typ
self.anal_array(t.args, allow_unpack=True), line=t.line, column=t.column
)
elif fullname == "typing.Union":
- items = self.anal_array(t.args)
+ items = self.anal_array(t.args, allow_unpack=True)
+ # If there are any unpacks or for comprehensions, turn the
+ # Union into a _NewUnion type operator. This approach has
+ # the strong advantage that we never need to deal with
+ # malformed union types.
+ if any(
+ isinstance(get_proper_type_simple(st), (UnpackType, TypeForComprehension))
+ for st in items
+ ):
+ operator = self.lookup_fully_qualified("typing._NewUnion")
+ assert operator and isinstance(operator.node, TypeInfo)
+ fallback = self.named_type("builtins.object")
+ return TypeOperatorType(operator.node, items, fallback, t.line, t.column)
+
return UnionType.make_union(items, line=t.line, column=t.column)
elif fullname == "typing.Optional":
if len(t.args) != 1:
@@ -1112,6 +1158,116 @@ def visit_type_alias_type(self, t: TypeAliasType) -> Type:
# TODO: should we do something here?
return t
+ def analyze_type_operator(self, t: UnboundType, type_info: TypeInfo) -> Type:
+ """Analyze a type operator application like GetArg[T, Base, 0].
+
+ Returns a TypeOperatorType that will be expanded later.
+ """
+ # Convert any RawExpressionType args (e.g. from dot notation desugaring)
+ # to LiteralType before analysis, since bare RawExpressionType would be
+ # rejected by visit_raw_expression_type.
+ converted_args: list[Type] = []
+ for arg in t.args:
+ if isinstance(arg, RawExpressionType) and arg.literal_value is not None:
+ fallback = self.named_type(arg.base_type_name)
+ converted_args.append(
+ LiteralType(arg.literal_value, fallback, line=arg.line, column=arg.column)
+ )
+ else:
+ converted_args.append(arg)
+
+ # Analyze all type arguments
+ an_args = self.anal_array(
+ converted_args,
+ allow_param_spec=True,
+ allow_param_spec_literals=type_info.has_param_spec_type,
+ allow_unpack=True,
+ )
+
+ # Map(comprehension) is a pure synonym for *[comprehension]: once
+ # we've confirmed the name really resolves to the Map type operator,
+ # unwrap it and return the TypeForComprehension directly. Using the
+ # TFC (not a TypeOperatorType wrapper) matches the *[...] path and
+ # avoids eager evaluation that would pre-substitute TypeVarTuples.
+ if type_info.fullname in ("_typeshed.typemap.Map", "typing.Map"):
+ if len(an_args) == 1 and isinstance(an_args[0], TypeForComprehension):
+ return an_args[0]
+ self.fail(
+ "Map(...) requires a single comprehension argument, "
+ "e.g. Map(T for T in Iter[...])",
+ t,
+ code=codes.VALID_TYPE,
+ )
+ return AnyType(TypeOfAny.from_error)
+
+ # For _TypeGetAttr, eagerly check that the first arg is a type that
+ # supports dot notation. If it's already a concrete type like
+ # tuple[int, str] or a Callable, we can report the error now rather
+ # than deferring to lazy evaluation (which may never happen for
+ # unused function parameters).
+ # TODO: This eager check is a workaround for the fact that type
+ # operator evaluation is lazy — if a function parameter's type is
+ # never used, the operator is never evaluated and the error is
+ # never reported. We may need a more principled approach to
+ # ensuring type-level errors are always surfaced.
+ if type_info.name == "_TypeGetAttr" and len(an_args) >= 2:
+ target = get_proper_type(an_args[0])
+ if isinstance(target, (Instance, TupleType, CallableType)):
+ is_valid = False
+ if isinstance(target, Instance) and (
+ target.type.is_type_operator or target.type.name in ("Member", "Param")
+ ):
+ is_valid = True
+ if not is_valid:
+ name = None
+ name_type = get_proper_type(an_args[1])
+ if isinstance(name_type, LiteralType) and isinstance(name_type.value, str):
+ name = name_type.value
+ self.fail(
+ f"Dot notation .{name} requires a Member or Param type, got {target}", t
+ )
+ return UninhabitedType()
+
+ # TODO: different fallbacks for different types
+ fallback = self.named_type("builtins.object")
+ return TypeOperatorType(type_info, an_args, fallback, t.line, t.column)
+
+ def visit_type_operator_type(self, t: TypeOperatorType) -> Type:
+ return t.copy_modified(args=self.anal_array(t.args, allow_unpack=True))
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> Type:
+ from mypy.semanal import SemanticAnalyzer
+
+ sem = self.api
+ assert isinstance(sem, SemanticAnalyzer)
+
+ iter_type = self.anal_type(t.iter_type)
+
+ with self.tvar_scope_frame(""):
+ targs = [t.type_param()]
+ var_exprs = sem.push_type_args(targs, t)
+ assert var_exprs
+
+ iter_var = self.tvar_scope.bind_new(t.iter_name, var_exprs[0][1], self.fail_func, t)
+ assert isinstance(iter_var, TypeVarType), type(iter_var)
+
+ # Add the comprehension variable to allowed_alias_tvars so it doesn't
+ # trigger "type parameter not declared" errors when defining_alias=True
+ old_allowed = self.allowed_alias_tvars
+ self.allowed_alias_tvars = old_allowed + [iter_var]
+
+ analt = t.copy_modified(
+ element_expr=self.anal_type(t.element_expr),
+ iter_type=iter_type,
+ conditions=self.anal_array(t.conditions),
+ iter_var=iter_var,
+ )
+
+ self.allowed_alias_tvars = old_allowed
+ sem.pop_type_args(targs)
+
+ return analt
+
def visit_type_var(self, t: TypeVarType) -> Type:
return t
@@ -1589,6 +1745,16 @@ def analyze_callable_type(self, t: UnboundType) -> Type:
ret = callable_with_ellipsis(
AnyType(TypeOfAny.explicit), ret_type=ret_type, fallback=fallback
)
+ elif isinstance(callable_args, UnboundType) and self.refers_to_full_names(
+ callable_args, ["typing.Params", "_typeshed.typemap.Params"]
+ ):
+ # Callable[Params[...], RET] - extended callable syntax.
+ # Rewrite to _NewCallable[...params..., ret_type] type operator.
+ items = self.anal_array(list(callable_args.args) + [ret_type], allow_unpack=True)
+ operator = self.lookup_fully_qualified("typing._NewCallable")
+ assert operator and isinstance(operator.node, TypeInfo)
+ obj_fallback = self.named_type("builtins.object")
+ return TypeOperatorType(operator.node, items, obj_fallback, t.line, t.column)
else:
# Callable[P, RET] (where P is ParamSpec)
with self.tvar_scope_frame(namespace=""):
@@ -1738,6 +1904,13 @@ def analyze_literal_param(self, idx: int, arg: Type, ctx: Context) -> list[Type]
and arg.original_str_expr is not None
):
assert arg.original_str_fallback is not None
+ # XXX: Since adding some Literals to typing.pyi, sometimes
+ # they get processed before builtins.str is available.
+ # Work around this by deferring until it is.
+ if self.api.lookup_fully_qualified_or_none(arg.original_str_fallback) is None:
+ self.api.defer()
+ return None
+
return [
LiteralType(
value=arg.original_str_expr,
@@ -1793,6 +1966,14 @@ def analyze_literal_param(self, idx: int, arg: Type, ctx: Context) -> list[Type]
return None
# Remap bytes and unicode into the appropriate type for the correct Python version
+
+ # XXX: Since adding some Literals to typing.pyi, sometimes
+ # they get processed before builtins.str is available.
+ # Work around this by deferring until it is.
+ if self.api.lookup_fully_qualified_or_none(arg.base_type_name) is None:
+ self.api.defer()
+ return None
+
fallback = self.named_type(arg.base_type_name)
assert isinstance(fallback, Instance)
return [LiteralType(arg.literal_value, fallback, line=arg.line, column=arg.column)]
@@ -2378,11 +2559,15 @@ def visit_type_alias_type(self, t: TypeAliasType) -> Type:
return t
new_nodes = self.seen_nodes | {t.alias}
visitor = DivergingAliasDetector(new_nodes, self.lookup, self.scope)
- _ = get_proper_type(t).accept(visitor)
+ _ = get_proper_type_simple(t).accept(visitor)
if visitor.diverging:
self.diverging = True
return t
+ def visit_type_operator_type(self, t: TypeOperatorType) -> Type:
+ # XXX: Unclear what the correct handling is here; pass through unchanged for now.
+ return t
+
def detect_diverging_alias(
node: TypeAlias,
@@ -2602,6 +2787,7 @@ def __init__(self, api: SemanticAnalyzerCoreInterface, scope: TypeVarLikeScope)
self.type_var_likes: list[tuple[str, TypeVarLikeExpr]] = []
self.has_self_type = False
self.include_callables = True
+ self.internal_vars: set[str] = set()
def _seems_like_callable(self, type: UnboundType) -> bool:
if not type.args:
@@ -2610,6 +2796,22 @@ def _seems_like_callable(self, type: UnboundType) -> bool:
def visit_unbound_type(self, t: UnboundType) -> None:
name = t.name
+ # We don't want to collect the iterator variables, and we
+ # really don't want to bother to put them in the symbol table.
+ if name in self.internal_vars:
+ return
+
+ # Skip type-level attribute access (T.attr, m.name) where the prefix
+ # is a comprehension variable or a type variable — these are handled
+ # as _TypeGetAttr during type analysis, not as qualified name lookups.
+ if "." in name:
+ prefix = name.rsplit(".", 1)[0]
+ if prefix in self.internal_vars:
+ return
+ prefix_node = self.api.lookup_qualified(prefix, t, suppress_errors=True)
+ if prefix_node and isinstance(prefix_node.node, TypeVarExpr):
+ name = prefix
+
node = self.api.lookup_qualified(name, t)
if node and node.fullname in SELF_TYPE_NAMES:
self.has_self_type = True
@@ -2712,6 +2914,21 @@ def visit_placeholder_type(self, t: PlaceholderType) -> None:
def visit_type_alias_type(self, t: TypeAliasType) -> None:
self.process_types(t.args)
+ def visit_type_operator_type(self, t: TypeOperatorType) -> None:
+ self.process_types(t.args)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension) -> None:
+ t.iter_type.accept(self)
+
+ shadowed = t.iter_name in self.internal_vars
+ self.internal_vars.add(t.iter_name)
+
+ t.element_expr.accept(self)
+ self.process_types(t.conditions)
+
+ if not shadowed:
+ self.internal_vars.discard(t.iter_name)
+
def process_types(self, types: list[Type] | tuple[Type, ...]) -> None:
# Redundant type check helps mypyc.
if isinstance(types, list):
diff --git a/mypy/typelevel.py b/mypy/typelevel.py
new file mode 100644
index 0000000000000..e7660fb928813
--- /dev/null
+++ b/mypy/typelevel.py
@@ -0,0 +1,1536 @@
+"""Type-level computation evaluation.
+
+This module provides the evaluation functions for type-level computations
+(TypeOperatorType, TypeForComprehension).
+
+Note: Conditional types are now represented as _Cond[...] TypeOperatorType.
+
+"""
+
+from __future__ import annotations
+
+import functools
+import inspect
+import itertools
+from collections.abc import Callable, Iterator, Sequence
+from contextlib import contextmanager
+from dataclasses import dataclass
+from typing import TYPE_CHECKING, Final, NamedTuple
+
+from mypy.expandtype import expand_type, expand_type_by_instance
+from mypy.maptype import map_instance_to_supertype
+from mypy.mro import calculate_mro
+from mypy.nodes import (
+ ARG_NAMED,
+ ARG_NAMED_OPT,
+ ARG_OPT,
+ ARG_POS,
+ ARG_STAR,
+ ARG_STAR2,
+ MDEF,
+ ArgKind,
+ Block,
+ ClassDef,
+ Context,
+ FuncDef,
+ SymbolTable,
+ SymbolTableNode,
+ TypeInfo,
+ Var,
+)
+from mypy.subtypes import is_subtype
+from mypy.typeops import make_simplified_union, tuple_fallback
+from mypy.types import (
+ AnyType,
+ CallableType,
+ ComputedType,
+ Instance,
+ LiteralType,
+ NoneType,
+ ProperType,
+ TupleType,
+ Type,
+ TypeAliasType,
+ TypedDictType,
+ TypeForComprehension,
+ TypeOfAny,
+ TypeOperatorType,
+ TypeType,
+ TypeVarLikeType,
+ UnboundType,
+ UninhabitedType,
+ UnionType,
+ UnpackType,
+ get_proper_type,
+ has_type_vars,
+ is_stuck_expansion,
+)
+
+if TYPE_CHECKING:
+ from mypy.semanal_shared import SemanticAnalyzerInterface
+
+
+MAX_DEPTH = 100
+
+
+class TypeLevelContext:
+ """Holds the context for type-level computation evaluation.
+
+ This is a global mutable state that provides access to the semantic analyzer
+ API during type operator expansion. The context is set via a context manager
+ before type analysis and cleared afterward.
+ """
+
+ def __init__(self) -> None:
+ self._api: SemanticAnalyzerInterface | None = None
+ # Make an evaluator part of this state also, so that we can
+ # maintain a depth tracker and an outer error message context.
+ #
+ # XXX: but maybe we should always thread the evaluator back
+ # ourselves or something instead?
+ self._evaluator: TypeLevelEvaluator | None = None
+ self._suppress_errors: bool = False
+
+ @property
+ def api(self) -> SemanticAnalyzerInterface | None:
+ """Get the current semantic analyzer API, or None if not in context."""
+ return self._api
+
+ @contextmanager
+ def set_api(self, api: SemanticAnalyzerInterface) -> Iterator[None]:
+ """Context manager to set the API for type-level evaluation.
+
+ Usage:
+ with typelevel_ctx.set_api(self.api):
+ # Type operators can now access the API via typelevel_ctx.api
+ result = get_proper_type(some_type)
+ """
+ saved = self._api
+ self._api = api
+ try:
+ yield
+ finally:
+ self._api = saved
+
+ @contextmanager
+ def suppress_errors(self) -> Iterator[None]:
+ """Suppress side-effectful errors (e.g. from RaiseError) during evaluation.
+
+ Used during type formatting to prevent spurious errors when
+ get_proper_type evaluates TypeOperatorTypes for display purposes.
+ """
+ saved = self._suppress_errors
+ self._suppress_errors = True
+ try:
+ yield
+ finally:
+ self._suppress_errors = saved
+
+
+# Global context instance for type-level computation
+typelevel_ctx: Final = TypeLevelContext()
+
+
+# Registry mapping operator names (not fully-qualified) to their evaluation functions.
+OperatorFunc = Callable[..., Type]
+
+
+@dataclass(frozen=True)
+class OperatorInfo:
+ """Metadata about a registered operator evaluator."""
+
+ func: OperatorFunc
+ expected_argc: int | None # None for variadic
+
+
+_OPERATOR_EVALUATORS: dict[str, OperatorInfo] = {}
+
+
+EXPANSION_ANY = AnyType(TypeOfAny.expansion_stuck)
+
+EXPANSION_OVERFLOW = AnyType(TypeOfAny.from_error)
+
+
+def find_map_any_in_args(args: list[Type]) -> AnyType | None:
+ """Return the inner AnyType if any arg is an UnpackType(AnyType) sentinel.
+
+ This sentinel is produced by evaluating a *Map[...] comprehension whose
+ Iter[...] source is Any; enclosing variadic operators / containers use
+ its presence to collapse themselves to Any.
+ """
+ for arg in args:
+ if isinstance(arg, UnpackType):
+ inner = get_proper_type(arg.type)
+ if isinstance(inner, AnyType):
+ return inner
+ return None
+
+
+def register_operator(name: str) -> Callable[[OperatorFunc], OperatorFunc]:
+ """Decorator to register an operator evaluator.
+
+ The function signature is inspected to determine the expected number of
+ positional arguments (excluding the keyword-only ``evaluator`` parameter).
+ If a ``*args`` parameter is found, the operator is treated as variadic.
+ """
+
+ def decorator(func: OperatorFunc) -> OperatorFunc:
+ # inspect.signature follows __wrapped__ set by functools.wraps,
+ # so this sees through the lift_over_unions wrapper automatically.
+ sig = inspect.signature(func)
+ argc = (
+ None
+ if any(p.kind == inspect.Parameter.VAR_POSITIONAL for p in sig.parameters.values())
+ else len(sig.parameters) - 1
+ )
+
+ _OPERATOR_EVALUATORS[name] = OperatorInfo(func=func, expected_argc=argc)
+ return func
+
+ return decorator
+
+
+def lift_over_unions(func: OperatorFunc) -> OperatorFunc:
+ """Decorator that lifts an operator to work over union types.
+
+ If any argument is a union type, the operator is applied to each
+ combination of union elements and the results are combined into a union.
+
+ For example, Concat[Literal['a'] | Literal['b'], Literal['c']]
+ becomes Literal['ac'] | Literal['bc'].
+
+ Works at the unpacked-args level: the wrapped function receives individual
+ ``Type`` positional arguments plus a keyword-only ``evaluator``.
+ """
+
+ @functools.wraps(func)
+ def wrapper(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ expanded: list[list[Type]] = []
+ for arg in args:
+ proper = get_proper_type(arg)
+ if isinstance(proper, UnionType):
+ expanded.append(list(proper.items))
+ elif isinstance(proper, UninhabitedType):
+ # Never decomposes to an empty union: the operator never runs.
+ expanded.append([])
+ else:
+ expanded.append([arg])
+
+ combinations = list(itertools.product(*expanded))
+
+ # Fast path: no unions to fan out over, just call the operator directly.
+ if len(combinations) == 1:
+ return func(*args, evaluator=evaluator)
+
+ results: list[Type] = []
+ for combo in combinations:
+ result = func(*combo, evaluator=evaluator)
+ # UninhabitedType is a ProperType, so one isinstance check suffices:
+ # drop Never results so they vanish from the resulting union.
+ if not isinstance(result, UninhabitedType):
+ results.append(result)
+
+ return UnionType.make_union(results)
+
+ return wrapper
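The fan-out performed by `lift_over_unions` is a cartesian product over the union alternatives of each argument. A value-level analogue, using frozensets to stand in for unions (purely illustrative, not mypy API):

```python
import itertools
from typing import Any, Callable


def lift(op: Callable[..., Any]) -> Callable[..., Any]:
    """Apply op to every combination of alternatives; collect results into a 'union'."""

    def wrapper(*args: Any) -> Any:
        # A frozenset fans out into its elements; anything else is a singleton.
        expanded = [sorted(a) if isinstance(a, frozenset) else [a] for a in args]
        results = {op(*combo) for combo in itertools.product(*expanded)}
        return next(iter(results)) if len(results) == 1 else frozenset(results)

    return wrapper


@lift
def concat(a: str, b: str) -> str:
    return a + b


print(concat("a", "c"))  # ac
print(concat(frozenset({"a", "b"}), "c") == frozenset({"ac", "bc"}))  # True
```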
+
+
+class EvaluationStuck(Exception):
+ """Raised when evaluation cannot proceed, e.g. due to an unresolved type variable."""
+
+
+class EvaluationOverflow(Exception):
+ """Raised when evaluation exceeds the maximum expansion depth."""
+
+
+class TypeLevelError(Exception):
+ """Raised by an operator evaluator when its arguments are invalid.
+
+ Caught in eval_operator, which emits the message via api.fail and
+ returns UninhabitedType.
+ """
+
+ def __init__(self, message: str) -> None:
+ super().__init__(message)
+ self.message = message
+
+
+class TypeLevelEvaluator:
+ """Evaluates type-level computations to concrete types."""
+
+ def __init__(self, api: SemanticAnalyzerInterface, ctx: Context | None) -> None:
+ self.api = api
+ self.ctx = ctx
+ self.depth = 0
+ self._current_op: TypeOperatorType | None = None
+
+ self.cache: dict[Type, Type] = {}
+
+ @property
+ def error_ctx(self) -> Context:
+ """Get the best available error context for reporting."""
+ ctx = self.ctx or self._current_op
+ assert ctx is not None, "No error context available"
+ return ctx
+
+ def evaluate(self, typ: Type) -> Type:
+ """Main entry point: evaluate a type to its simplified form."""
+
+ if typ in self.cache:
+ return self.cache[typ]
+
+ if self.depth >= MAX_DEPTH:
+ ctx = self.ctx or typ
+ # Use serious=True to bypass in_checked_function() check which requires
+ # self.options to be set on the SemanticAnalyzer
+ self.api.fail("Type expansion is too deep; producing Any", ctx, serious=True)
+ raise EvaluationOverflow()
+
+ try:
+ self.depth += 1
+ if isinstance(typ, TypeOperatorType):
+ rtyp = self.eval_operator(typ)
+ elif isinstance(typ, TypeForComprehension):
+ rtyp = evaluate_comprehension(self, typ)
+ else:
+ rtyp = typ # Already a concrete type or can't be evaluated
+
+ self.cache[typ] = rtyp
+
+ return rtyp
+ finally:
+ self.depth -= 1
+
+ def eval_proper(self, typ: Type) -> ProperType:
+ """Evaluate a type to a ProperType, raising EvaluationStuck if it cannot be resolved."""
+ typ = get_proper_type(self.evaluate(typ))
+ # A call to another expansion via an alias got stuck; surface that here.
+ if is_stuck_expansion(typ):
+ raise EvaluationStuck
+ if isinstance(typ, (TypeVarLikeType, UnboundType, ComputedType)):
+ raise EvaluationStuck
+ if isinstance(typ, UnpackType) and isinstance(typ.type, TypeVarLikeType):
+ raise EvaluationStuck
+
+ return typ
+
+ def eval_operator(self, typ: TypeOperatorType) -> Type:
+ """Evaluate a type operator by dispatching to registered handler."""
+ name = typ.type.name
+
+ info = _OPERATOR_EVALUATORS.get(name)
+ if info is None:
+ return EXPANSION_ANY
+
+ old_op = self._current_op
+ self._current_op = typ
+ try:
+ args = typ.args
+ if info.expected_argc is None:
+ args = self.flatten_args(args)
+ # If a *Map[...]-over-Any sentinel landed in the flattened args,
+ # collapse the whole variadic operator call to Any.
+ if (map_any := find_map_any_in_args(args)) is not None:
+ return map_any
+ elif len(args) != info.expected_argc:
+ return UninhabitedType()
+ try:
+ return info.func(*args, evaluator=self)
+ except TypeLevelError as err:
+ if not typelevel_ctx._suppress_errors:
+ self.api.fail(err.message, self.error_ctx, serious=True)
+ return AnyType(TypeOfAny.from_error)
+ finally:
+ self._current_op = old_op
+
+ # --- Type construction helpers ---
+
+ def literal_bool(self, value: bool) -> LiteralType:
+ """Create a Literal[True] or Literal[False] type."""
+ return LiteralType(value, self.api.named_type("builtins.bool"))
+
+ def literal_int(self, value: int) -> LiteralType:
+ """Create a Literal[int] type."""
+ return LiteralType(value, self.api.named_type("builtins.int"))
+
+ def flatten_args(self, args: list[Type]) -> list[Type]:
+ """Flatten type arguments, evaluating and unpacking as needed.
+
+ Handles UnpackType from comprehensions by expanding the inner TupleType.
+ An UnpackType(AnyType) sentinel (produced by *Map[...] over Iter[Any])
+ is passed through for the caller to detect via find_map_any_in_args.
+ """
+ flat_args: list[Type] = []
+ for arg in args:
+ evaluated = self.eval_proper(arg)
+ if isinstance(evaluated, UnpackType):
+ inner = get_proper_type(evaluated.type)
+ if isinstance(inner, TupleType):
+ flat_args.extend(inner.items)
+ else:
+ flat_args.append(evaluated)
+ else:
+ flat_args.append(evaluated)
+ return flat_args
+
+ def literal_str(self, value: str) -> LiteralType:
+ """Create a Literal[str] type."""
+ return LiteralType(value, self.api.named_type("builtins.str"))
+
+ def tuple_type(self, items: list[Type]) -> TupleType:
+ """Create a tuple type with the given items."""
+ return TupleType(items, self.api.named_type("builtins.tuple"))
+
+ def get_typemap_type(self, name: str) -> Instance:
+ # These live in _typeshed.typemap in normal runs (and sometimes also in
+ # typing, depending on the version), but _typeshed.typemap doesn't exist
+ # in the test fixtures, so try typing first and fall back.
+ if typ := self.api.named_type_or_none(f"typing.{name}"):
+ return typ
+ else:
+ return self.api.named_type(f"_typeshed.typemap.{name}")
+
+
+def _call_by_value(evaluator: TypeLevelEvaluator, typ: Type) -> Type:
+ """Make sure alias arguments are evaluated before expansion.
+
+ Currently this is used in conditional bodies (which should cover any
+ recursive uses) so that arguments to potentially recursive aliases are
+ evaluated before being substituted in; otherwise expansions could grow
+ without bound.
+
+ This shouldn't be necessary for correctness, but can be important
+ for performance.
+
+ We should *maybe* do it in more places! Possibly everywhere? Or
+ maybe we should do it *never* and just do a better job of caching.
+ """
+ if isinstance(typ, TypeAliasType):
+ typ = typ.copy_modified(
+ args=[get_proper_type(_call_by_value(evaluator, st)) for st in typ.args]
+ )
+
+ # Evaluate recursively instead of letting it get handled in the
+ # get_proper_type loop to help maintain better error contexts.
+ return evaluator.eval_proper(typ)
+
+
+@register_operator("_Cond")
+def _eval_cond(
+ condition: Type, true_type: Type, false_type: Type, *, evaluator: TypeLevelEvaluator
+) -> Type:
+ """Evaluate _Cond[condition, TrueType, FalseType]."""
+ result = extract_literal_bool(evaluator.eval_proper(condition))
+
+ if result is True:
+ return _call_by_value(evaluator, true_type)
+ elif result is False:
+ return _call_by_value(evaluator, false_type)
+ else:
+ # Undecidable - return Any for now
+ # In the future, we might want to keep the conditional and defer evaluation
+ return EXPANSION_ANY
+
+
+@register_operator("_And")
+def _eval_and(left_arg: Type, right_arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _And[cond1, cond2] - logical AND of type booleans."""
+ left = extract_literal_bool(evaluator.eval_proper(left_arg))
+ if left is False:
+ # Short-circuit: False and X = False
+ return evaluator.literal_bool(False)
+ if left is None:
+ return UninhabitedType()
+
+ right = extract_literal_bool(evaluator.eval_proper(right_arg))
+ if right is None:
+ return UninhabitedType()
+
+ return evaluator.literal_bool(right)
+
+
+@register_operator("_Or")
+def _eval_or(left_arg: Type, right_arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _Or[cond1, cond2] - logical OR of type booleans."""
+ left = extract_literal_bool(evaluator.eval_proper(left_arg))
+ if left is True:
+ # Short-circuit: True or X = True
+ return evaluator.literal_bool(True)
+ if left is None:
+ return UninhabitedType()
+
+ right = extract_literal_bool(evaluator.eval_proper(right_arg))
+ if right is None:
+ return UninhabitedType()
+
+ return evaluator.literal_bool(right)
+
+
+@register_operator("_Not")
+def _eval_not(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _Not[cond] - logical NOT of a type boolean."""
+ result = extract_literal_bool(evaluator.eval_proper(arg))
+ if result is None:
+ return UninhabitedType()
+
+ return evaluator.literal_bool(not result)
+
+
+@register_operator("_DictEntry")
+def _eval_dict_entry(name_type: Type, value_type: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _DictEntry[name, typ] -> Member[name, typ, Never, Never, Never].
+
+ This is the internal type operator for dict comprehension syntax in type context.
+ {k: v for x in foo} desugars to *[_DictEntry[k, v] for x in foo].
+ """
+ member_info = evaluator.get_typemap_type("Member")
+ never = UninhabitedType()
+ return Instance(member_info.type, [name_type, value_type, never, never, never])
+
+
+@register_operator("Iter")
+def _eval_iter(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate a type-level iterator (Iter[T])."""
+ target = evaluator.eval_proper(arg)
+ if isinstance(target, TupleType):
+ return target
+ elif isinstance(target, AnyType):
+ # Propagate Any so enclosing operators (notably Map) can detect it
+ # and short-circuit at the Any boundary.
+ return target
+ else:
+ return UninhabitedType()
+
+
+@register_operator("IsAssignable")
+def _eval_isass(lhs: Type, rhs: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate a type-level condition (IsAssignable[T, Base])."""
+ left_proper = evaluator.eval_proper(lhs)
+ right_proper = evaluator.eval_proper(rhs)
+
+ # Handle type variables - may be undecidable
+ if has_type_vars(left_proper) or has_type_vars(right_proper):
+ return EXPANSION_ANY
+
+ result = is_subtype(left_proper, right_proper)
+
+ return evaluator.literal_bool(result)
+
+
+@register_operator("IsEquivalent")
+def _eval_isequiv(lhs: Type, rhs: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate IsEquivalent[T, S] - check if T and S are equivalent types.
+
+ Returns Literal[True] if T is a subtype of S AND S is a subtype of T.
+ Equivalent to: IsAssignable[T, S] and IsAssignable[S, T]
+ """
+ left_proper = evaluator.eval_proper(lhs)
+ right_proper = evaluator.eval_proper(rhs)
+
+ # Handle type variables - may be undecidable
+ if has_type_vars(left_proper) or has_type_vars(right_proper):
+ return EXPANSION_ANY
+
+ # Both directions must hold for type equivalence
+ result = is_subtype(left_proper, right_proper) and is_subtype(right_proper, left_proper)
+
+ return evaluator.literal_bool(result)
+
+
+@register_operator("Bool")
+def _eval_bool(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Bool[T] - check if T contains Literal[True].
+
+ Returns Literal[True] if T is Literal[True] or a union containing Literal[True].
+ Equivalent to: IsAssignable[Literal[True], T] and not IsAssignable[T, Never]
+ """
+ arg_proper = evaluator.eval_proper(arg)
+
+ # Check if Literal[True] is a subtype of arg (i.e., arg contains True)
+ # and arg is not Never
+ literal_true = evaluator.literal_bool(True)
+ contains_true = is_subtype(literal_true, arg_proper)
+ is_never = isinstance(arg_proper, UninhabitedType)
+
+ return evaluator.literal_bool(contains_true and not is_never)
+
+
+def extract_literal_bool(typ: Type) -> bool | None:
+ """Extract bool value from LiteralType."""
+ typ = get_proper_type(typ)
+ if isinstance(typ, LiteralType) and isinstance(typ.value, bool):
+ return typ.value
+ return None
+
+
+def extract_literal_int(typ: Type) -> int | None:
+ """Extract int value from LiteralType."""
+ typ = get_proper_type(typ)
+ if (
+ isinstance(typ, LiteralType)
+ and isinstance(typ.value, int)
+ and not isinstance(typ.value, bool)
+ ):
+ return typ.value
+ return None
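The `not isinstance(typ.value, bool)` guard in `extract_literal_int` matters because `bool` is a subclass of `int` in Python: without it, `Literal[True]` would be misread as `Literal[1]`. A quick demonstration of the pitfall:

```python
def is_int_literal_value(value: object) -> bool:
    # bool subclasses int, so a bare isinstance(value, int) also accepts True/False.
    return isinstance(value, int) and not isinstance(value, bool)


print(isinstance(True, int))       # True -- the pitfall
print(is_int_literal_value(True))  # False
print(is_int_literal_value(3))     # True
```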
+
+
+def extract_literal_string(typ: Type) -> str | None:
+ """Extract string value from LiteralType."""
+ typ = get_proper_type(typ)
+ if isinstance(typ, LiteralType) and isinstance(typ.value, str):
+ return typ.value
+ return None
+
+
+def extract_qualifier_strings(typ: Type) -> list[str]:
+ """Extract qualifier strings from a type that may be a Literal or Union of Literals.
+
+ Used to extract qualifiers from Member[..., quals, ...] where quals can be:
+ - A single Literal[str] like Literal["ClassVar"]
+ - A Union of Literals like Literal["ClassVar"] | Literal["Final"]
+ - Never (no qualifiers) - returns empty list
+ """
+ typ = get_proper_type(typ)
+ qual_strings: list[str] = []
+
+ if isinstance(typ, LiteralType) and isinstance(typ.value, str):
+ qual_strings.append(typ.value)
+ elif isinstance(typ, UnionType):
+ for item in typ.items:
+ item_proper = get_proper_type(item)
+ if isinstance(item_proper, LiteralType) and isinstance(item_proper.value, str):
+ qual_strings.append(item_proper.value)
+
+ return qual_strings
+
+
+def _callable_to_params(evaluator: TypeLevelEvaluator, target: CallableType) -> list[Type]:
+ """Convert a CallableType's parameters into a list of Param[name, type, quals] instances."""
+ param_info = evaluator.get_typemap_type("Param")
+ never = UninhabitedType()
+ params: list[Type] = []
+
+ for arg_type, arg_kind, arg_name in zip(target.arg_types, target.arg_kinds, target.arg_names):
+ if arg_name is not None:
+ name_type: Type = evaluator.literal_str(arg_name)
+ else:
+ name_type = NoneType()
+
+ quals: list[str] = []
+ if arg_kind == ARG_POS:
+ pass # no quals
+ elif arg_kind == ARG_OPT:
+ quals.append("default")
+ elif arg_kind == ARG_STAR:
+ quals.append("*")
+ elif arg_kind == ARG_NAMED:
+ quals.append("keyword")
+ elif arg_kind == ARG_NAMED_OPT:
+ quals.extend(["keyword", "default"])
+ elif arg_kind == ARG_STAR2:
+ quals.append("**")
+
+ if quals:
+ quals_type: Type = make_simplified_union([evaluator.literal_str(q) for q in quals])
+ else:
+ quals_type = never
+
+ params.append(Instance(param_info.type, [name_type, arg_type, quals_type]))
+
+ return params
+
+
+def _get_args(evaluator: TypeLevelEvaluator, target: Type, base: Type) -> Sequence[Type] | None:
+ target = evaluator.eval_proper(target)
+ base = evaluator.eval_proper(base)
+
+ # TODO: Other cases:
+ # * Overloaded
+ # * classmethod/staticmethod (decorated callables)
+
+ if isinstance(target, CallableType) and isinstance(base, CallableType):
+ if not is_subtype(target, base):
+ return None
+ params = _callable_to_params(evaluator, target)
+ return [evaluator.tuple_type(params), target.ret_type]
+
+ if isinstance(target, Instance) and isinstance(base, Instance):
+ # TODO: base.is_protocol!!
+ # Probably implement it by filling in base with TypeVars and
+ # calling infer_constraints and solve.
+
+ return get_type_args_for_base(target, base.type)
+
+ if isinstance(target, NoneType):
+ return _get_args(evaluator, evaluator.api.named_type("builtins.object"), base)
+
+ if isinstance(target, TupleType):
+ # TODO: tuple v tuple?
+ # TODO: Do a real check against more classes?
+ if isinstance(base, Instance) and target.partial_fallback == base:
+ return target.items
+
+ return _get_args(evaluator, tuple_fallback(target), base)
+
+ if isinstance(target, TypedDictType):
+ return _get_args(evaluator, target.fallback, base)
+
+ if isinstance(target, TypeType):
+ if isinstance(base, Instance) and base.type.fullname == "builtins.type":
+ return [target.item]
+ # TODO: metaclasses, protocols
+ return None
+
+ return None
+
+
+@register_operator("GetArg")
+@lift_over_unions
+def _eval_get_arg(
+ target: Type, base: Type, index_arg: Type, *, evaluator: TypeLevelEvaluator
+) -> Type:
+ """Evaluate GetArg[T, Base, Idx] - get type argument at index from T as Base."""
+ eval_target = evaluator.eval_proper(target)
+ eval_base = evaluator.eval_proper(base)
+ if isinstance(eval_target, AnyType):
+ return eval_target
+ args = _get_args(evaluator, eval_target, eval_base)
+
+ if args is None:
+ raise TypeLevelError(f"GetArg: {eval_target} is not a subclass of {eval_base}")
+
+ # Extract index as int
+ index = extract_literal_int(evaluator.eval_proper(index_arg))
+ if index is None:
+ return UninhabitedType() # Can't evaluate without literal index
+
+ if index < 0:
+ index += len(args)
+ if 0 <= index < len(args):
+ return args[index]
+
+ raise TypeLevelError(f"GetArg: index out of range for {eval_target} as {eval_base}")
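GetArg follows Python's negative-index convention: negative indices are normalized by adding the argument count, and anything still out of range is an error. A minimal sketch of that normalization (helper name is illustrative):

```python
def normalize_index(index: int, length: int) -> int | None:
    """Return a non-negative in-range index, or None if out of range."""
    if index < 0:
        index += length
    return index if 0 <= index < length else None


print(normalize_index(-1, 3))  # 2
print(normalize_index(2, 3))   # 2
print(normalize_index(3, 3))   # None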
+
+
+@register_operator("GetArgs")
+@lift_over_unions
+def _eval_get_args(target: Type, base: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate GetArgs[T, Base] -> tuple of all type args from T as Base."""
+ eval_target = evaluator.eval_proper(target)
+ eval_base = evaluator.eval_proper(base)
+ if isinstance(eval_target, AnyType):
+ return eval_target
+ args = _get_args(evaluator, eval_target, eval_base)
+
+ if args is None:
+ raise TypeLevelError(f"GetArgs: {eval_target} is not a subclass of {eval_base}")
+ return evaluator.tuple_type(list(args))
+
+
+@register_operator("FromUnion")
+def _eval_from_union(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate FromUnion[T] -> tuple of union elements."""
+ target = evaluator.eval_proper(arg)
+
+ if isinstance(target, UnionType):
+ return evaluator.tuple_type(list(target.items))
+ else:
+ # Non-union becomes 1-tuple
+ return evaluator.tuple_type([target])
+
+
+@register_operator("_NewUnion")
+def _eval_new_union(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _NewUnion[*Ts] -> union of all type arguments."""
+ return make_simplified_union(list(args))
+
+
+@register_operator("_NewCallable")
+def _eval_new_callable(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate _NewCallable[Param1, Param2, ..., ReturnType] -> CallableType.
+
+ The last argument is the return type. All preceding arguments should be
+ Param[name, type, quals] instances describing the callable's parameters.
+ """
+ if len(args) == 0:
+ return AnyType(TypeOfAny.from_error)
+
+ ret_type = args[-1]
+ param_args = args[:-1]
+
+ arg_types: list[Type] = []
+ arg_kinds: list[ArgKind] = []
+ arg_names: list[str | None] = []
+
+ for param in param_args:
+ param = evaluator.eval_proper(param)
+ if not isinstance(param, Instance):
+ # Not a Param instance, skip
+ continue
+ if param.type.fullname not in ("typing.Param", "_typeshed.typemap.Param"):
+ continue
+
+ # Param[name, type, quals]
+ p_args = param.args
+ if len(p_args) < 2:
+ continue
+
+ name = extract_literal_string(p_args[0])  # extract_literal_string resolves proper types itself
+ param_type = p_args[1]
+ quals = extract_qualifier_strings(p_args[2]) if len(p_args) > 2 else []
+
+ qual_set = set(quals)
+ if "*" in qual_set:
+ arg_kinds.append(ARG_STAR)
+ arg_names.append(None)
+ elif "**" in qual_set:
+ arg_kinds.append(ARG_STAR2)
+ arg_names.append(None)
+ elif "keyword" in qual_set:
+ if "default" in qual_set:
+ arg_kinds.append(ARG_NAMED_OPT)
+ else:
+ arg_kinds.append(ARG_NAMED)
+ arg_names.append(name)
+ elif "default" in qual_set:
+ arg_kinds.append(ARG_OPT)
+ arg_names.append(name)
+ else:
+ arg_kinds.append(ARG_POS)
+ arg_names.append(name)
+
+ arg_types.append(param_type)
+
+ fallback = evaluator.api.named_type("builtins.function")
+ return CallableType(
+ arg_types=arg_types,
+ arg_kinds=arg_kinds,
+ arg_names=arg_names,
+ ret_type=ret_type,
+ fallback=fallback,
+ )
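The qualifier-to-kind branches above can be summarized as a small pure function. This sketch mirrors the branch order, with string names standing in for mypy's ArgKind constants (illustrative only):

```python
def kind_from_quals(quals: set[str]) -> str:
    """Map a Param qualifier set to an argument-kind name, mirroring the branch order."""
    if "*" in quals:
        return "ARG_STAR"
    if "**" in quals:
        return "ARG_STAR2"
    if "keyword" in quals:
        return "ARG_NAMED_OPT" if "default" in quals else "ARG_NAMED"
    if "default" in quals:
        return "ARG_OPT"
    return "ARG_POS"


print(kind_from_quals(set()))                   # ARG_POS
print(kind_from_quals({"default"}))             # ARG_OPT
print(kind_from_quals({"keyword", "default"}))  # ARG_NAMED_OPT
```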
+
+
+@register_operator("GetMember")
+@lift_over_unions
+def _eval_get_member(target_arg: Type, name_arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate GetMember[T, Name] - get Member type for named member from T."""
+ name = extract_literal_string(evaluator.eval_proper(name_arg))
+ if name is None:
+ return UninhabitedType()
+
+ eval_target = evaluator.eval_proper(target_arg)
+ if isinstance(eval_target, AnyType):
+ return eval_target
+ members = _get_members_dict(eval_target, evaluator=evaluator, attrs_only=False)
+ member = members.get(name)
+ if member is None:
+ raise TypeLevelError(f"GetMember: {name!r} not found in {eval_target}")
+ return member
+
+
+@register_operator("GetMemberType")
+@lift_over_unions
+def _eval_get_member_type(
+ target_arg: Type, name_arg: Type, *, evaluator: TypeLevelEvaluator
+) -> Type:
+ """Evaluate GetMemberType[T, Name] - get attribute type from T."""
+ name = extract_literal_string(evaluator.eval_proper(name_arg))
+ if name is None:
+ return UninhabitedType()
+
+ eval_target = evaluator.eval_proper(target_arg)
+ if isinstance(eval_target, AnyType):
+ return eval_target
+ members = _get_members_dict(eval_target, evaluator=evaluator, attrs_only=False)
+ member = members.get(name)
+ if member is not None:
+ # Extract the type argument (index 1) from Member[name, typ, quals, init, definer]
+ member = get_proper_type(member)
+ if isinstance(member, Instance) and len(member.args) > 1:
+ return member.args[1]
+ return UninhabitedType()
+
+
+@register_operator("_TypeGetAttr")
+def _eval_type_get_attr(
+ target_arg: Type, name_arg: Type, *, evaluator: TypeLevelEvaluator
+) -> Type:
+ """Evaluate _TypeGetAttr[T, Name] - get attribute from a Member.
+
+ Internal operator for dot notation: m.attr desugars to _TypeGetAttr[m, Literal["attr"]].
+ Unlike GetMemberType, this only works on Member and Param instances, not arbitrary types.
+ """
+ target = evaluator.eval_proper(target_arg)
+
+ member_info = evaluator.get_typemap_type("Member")
+ param_info = evaluator.get_typemap_type("Param")
+ if not isinstance(target, Instance) or target.type not in (member_info.type, param_info.type):
+ name_type = evaluator.eval_proper(name_arg)
+ name = extract_literal_string(name_type)
+ evaluator.api.fail(
+ f"Dot notation .{name} requires a Member or Param type, got {target}",
+ evaluator.error_ctx,
+ serious=True,
+ )
+ return UninhabitedType()
+
+ # target is a single Member/Param instance here, so union lifting would be a no-op anyway
+ return _eval_get_member_type(target_arg, name_arg, evaluator=evaluator)
+
+
+@register_operator("Slice")
+@lift_over_unions
+def _eval_slice(
+ target_arg: Type, start_arg: Type, end_arg: Type, *, evaluator: TypeLevelEvaluator
+) -> Type:
+ """Evaluate Slice[S, Start, End] - slice a literal string or tuple type."""
+ target = evaluator.eval_proper(target_arg)
+ if isinstance(target, AnyType):
+ return target
+
+ # Handle start - can be int or None
+ start_type = evaluator.eval_proper(start_arg)
+ if isinstance(start_type, NoneType):
+ start: int | None = None
+ else:
+ start = extract_literal_int(start_type)
+ if start is None:
+ return UninhabitedType()
+
+ # Handle end - can be int or None
+ end_type = evaluator.eval_proper(end_arg)
+ if isinstance(end_type, NoneType):
+ end: int | None = None
+ else:
+ end = extract_literal_int(end_type)
+ if end is None:
+ return UninhabitedType()
+
+ # Handle literal string slicing
+ s = extract_literal_string(target)
+ if s is not None:
+ result = s[start:end]
+ return evaluator.literal_str(result)
+
+ # Handle tuple type slicing
+ if isinstance(target, TupleType):
+ sliced_items = target.items[start:end]
+ return evaluator.tuple_type(sliced_items)
+
+ raise TypeLevelError(f"Slice: {target} is not a tuple or string Literal")
+
+
+@register_operator("Concat")
+@lift_over_unions
+def _eval_concat(left_arg: Type, right_arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Concat[S1, S2] - concatenate two literal strings."""
+ left = extract_literal_string(evaluator.eval_proper(left_arg))
+ right = extract_literal_string(evaluator.eval_proper(right_arg))
+
+ if left is not None and right is not None:
+ return evaluator.literal_str(left + right)
+
+ return UninhabitedType()
+
+
+@register_operator("Uppercase")
+@lift_over_unions
+def _eval_uppercase(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Uppercase[S] - convert literal string to uppercase."""
+ s = extract_literal_string(evaluator.eval_proper(arg))
+ if s is not None:
+ return evaluator.literal_str(s.upper())
+
+ return UninhabitedType()
+
+
+@register_operator("Lowercase")
+@lift_over_unions
+def _eval_lowercase(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Lowercase[S] - convert literal string to lowercase."""
+ s = extract_literal_string(evaluator.eval_proper(arg))
+ if s is not None:
+ return evaluator.literal_str(s.lower())
+
+ return UninhabitedType()
+
+
+@register_operator("Capitalize")
+@lift_over_unions
+def _eval_capitalize(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Capitalize[S] - capitalize first character of literal string."""
+ s = extract_literal_string(evaluator.eval_proper(arg))
+ if s is not None:
+ return evaluator.literal_str(s.capitalize())
+
+ return UninhabitedType()
+
+
+@register_operator("Uncapitalize")
+@lift_over_unions
+def _eval_uncapitalize(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Uncapitalize[S] - lowercase first character of literal string."""
+ s = extract_literal_string(evaluator.eval_proper(arg))
+ if s is not None:
+ result = s[0].lower() + s[1:] if s else s
+ return evaluator.literal_str(result)
+
+ return UninhabitedType()
+
+
+@register_operator("Members")
+@lift_over_unions
+def _eval_members(target: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Members[T] -> tuple of Member types for all members of T.
+
+ Includes methods, class variables, and instance attributes.
+ """
+ return _eval_members_impl(target, evaluator=evaluator, attrs_only=False)
+
+
+@register_operator("Attrs")
+@lift_over_unions
+def _eval_attrs(target: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Attrs[T] -> tuple of Member types for annotated attributes only.
+
+ Excludes methods but includes ClassVar members.
+ """
+ return _eval_members_impl(target, evaluator=evaluator, attrs_only=True)
+
+
+def _eval_members_impl(
+ target_arg: Type, *, evaluator: TypeLevelEvaluator, attrs_only: bool
+) -> Type:
+ """Common implementation for Members and Attrs operators."""
+ members = _get_members_dict(target_arg, evaluator=evaluator, attrs_only=attrs_only)
+ return evaluator.tuple_type(list(members.values()))
+
+
+def _should_skip_type_info(type_info: TypeInfo, api: SemanticAnalyzerInterface) -> bool:
+ """Determine whether to skip a type_info when collecting members.
+
+ HACK: The rules here need to be more clearly defined. For now, we skip
+ anything from a stub file except the Member and Param support types
+ (which need to be introspectable for GetMemberType and dot notation).
+ """
+ # TODO: figure out the real rules for this
+ if type_info.fullname in (
+ "typing.Member",
+ "_typeshed.typemap.Member",
+ "typing.Param",
+ "_typeshed.typemap.Param",
+ ):
+ return False
+ module = api.modules.get(type_info.module_name)
+ if module is not None and module.is_stub:
+ return True
+ return False
+
+
+def _get_members_dict(
+ target_arg: Type, *, evaluator: TypeLevelEvaluator, attrs_only: bool
+) -> dict[str, Type]:
+ """Build a dict of member name -> Member type for all members of target.
+
+ Args:
+ attrs_only: If True, filter to attributes only (excludes methods).
+ If False, include all members.
+
+ Returns a dict mapping member names to Member[name, typ, quals, init, definer]
+ instance types.
+ """
+ target = evaluator.eval_proper(target_arg)
+
+ member_info = evaluator.get_typemap_type("Member")
+
+ if isinstance(target, TypedDictType):
+ return _get_typeddict_members_dict(target, member_info.type, evaluator=evaluator)
+
+ if not isinstance(target, Instance):
+ return {}
+
+ members: dict[str, Type] = {}
+
+ # Iterate through MRO in reverse (base classes first) to include inherited members
+ for type_info in reversed(target.type.mro):
+ if _should_skip_type_info(type_info, evaluator.api):
+ continue
+
+ for name, sym in type_info.names.items():
+ if sym.type is None:
+ continue
+
+ # Skip inferred attributes (those without explicit type annotations),
+ # but include them for enums since enum members are always inferred.
+ if isinstance(sym.node, Var) and sym.node.is_inferred and not target.type.is_enum:
+ continue
+
+ if attrs_only:
+ # Attrs filters to attributes only (excludes methods).
+ # Methods are FuncDef nodes; Callable-typed attributes are Var nodes.
+ if isinstance(sym.node, FuncDef):
+ continue
+
+ # Map type_info to get correct type args as seen from target
+ if type_info == target.type:
+ definer = target
+ else:
+ definer = map_instance_to_supertype(target, type_info)
+
+ # Expand the member type to substitute type variables with actual args
+ member_typ = expand_type_by_instance(sym.type, definer)
+
+ member_type = create_member_type(
+ evaluator,
+ member_info.type,
+ name=name,
+ typ=member_typ,
+ node=sym.node,
+ definer=definer,
+ )
+ members[name] = member_type
+
+ return members
+
+
+def _get_typeddict_members_dict(
+ target: TypedDictType, member_type_info: TypeInfo, *, evaluator: TypeLevelEvaluator
+) -> dict[str, Type]:
+ """Build a dict of member name -> Member type for a TypedDict."""
+ members: dict[str, Type] = {}
+
+ for name, item_type in target.items.items():
+ # Build qualifiers for TypedDict keys
+ # Required is the default, so only add NotRequired when not required
+ quals: list[str] = []
+ if name not in target.required_keys:
+ quals.append("NotRequired")
+ if name in target.readonly_keys:
+ quals.append("ReadOnly")
+
+ quals_type = UnionType.make_union([evaluator.literal_str(q) for q in quals])
+
+ members[name] = Instance(
+ member_type_info,
+ [
+ evaluator.literal_str(name), # name
+ item_type, # typ
+ quals_type, # quals
+ UninhabitedType(), # init (not tracked for TypedDict)
+ UninhabitedType(), # definer (not tracked for TypedDict)
+ ],
+ )
+
+ return members
+
+
+def create_member_type(
+ evaluator: TypeLevelEvaluator,
+ member_type_info: TypeInfo,
+ name: str,
+ typ: Type,
+ node: object,
+ definer: Instance,
+) -> Instance:
+ """Create a Member[name, typ, quals, init, definer] instance type."""
+ # Determine qualifiers
+ quals: Type
+ if isinstance(node, Var):
+ if node.is_classvar:
+ quals = evaluator.literal_str("ClassVar")
+ elif node.is_final:
+ quals = evaluator.literal_str("Final")
+ else:
+ quals = UninhabitedType() # Never = no qualifiers
+ elif isinstance(node, FuncDef):
+ # Methods are class-level, so they have ClassVar qualifier
+ quals = evaluator.literal_str("ClassVar")
+ else:
+ quals = UninhabitedType()
+
+ # For init, use init_type when available (set during type checking for class members).
+ # The literal type extraction is done in checker.py when init_type is set.
+ init: Type = UninhabitedType()
+ if isinstance(node, Var) and node.init_type is not None:
+ init = node.init_type
+
+ return Instance(
+ member_type_info,
+ [
+ evaluator.literal_str(name), # name
+ typ, # typ
+ quals, # quals
+ init, # init
+ definer, # definer
+ ],
+ )
+
+
+@register_operator("NewTypedDict")
+def _eval_new_typeddict(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate NewTypedDict[*Members] -> create a new TypedDict from Member types.
+
+ This is the inverse of Members[TypedDict].
+ """
+ # Get the Member TypeInfo to verify arguments
+ member_info = evaluator.get_typemap_type("Member")
+
+ items: dict[str, Type] = {}
+ required_keys: set[str] = set()
+ readonly_keys: set[str] = set()
+
+ for arg in args:
+ arg = get_proper_type(arg)
+
+ # Each argument should be a Member[name, typ, quals, init, definer]
+ if not isinstance(arg, Instance) or arg.type != member_info.type:
+ # Not a Member type - can't construct TypedDict
+ return UninhabitedType()
+
+ if len(arg.args) < 3:
+ return UninhabitedType()
+
+ # Extract name, type, and qualifiers from Member args
+ name_type, item_type, quals, *_ = arg.args
+ name = extract_literal_string(name_type)
+ if name is None:
+ return UninhabitedType()
+ is_required = True # Default is Required
+ is_readonly = False
+
+ for qual in extract_qualifier_strings(quals):
+ if qual == "NotRequired":
+ is_required = False
+ elif qual == "Required":
+ is_required = True
+ elif qual == "ReadOnly":
+ is_readonly = True
+
+ items[name] = item_type
+ if is_required:
+ required_keys.add(name)
+ if is_readonly:
+ readonly_keys.add(name)
+
+ # Get the TypedDict fallback
+ fallback = evaluator.api.named_type_or_none("typing._TypedDict")
+ if fallback is None:
+ # Fall back to builtins.dict if typing._TypedDict is not available
+ fallback = evaluator.api.named_type("builtins.dict")
+
+ return TypedDictType(
+ items=items, required_keys=required_keys, readonly_keys=readonly_keys, fallback=fallback
+ )
+
+
+class MemberDef(NamedTuple):
+ """Extracted member definition from a Member[name, typ, quals, init, definer] type."""
+
+ name: str
+ type: Type
+ init_type: Type
+ is_classvar: bool
+ is_final: bool
+
+
+def _extract_members(
+ args: tuple[Type, ...], evaluator: TypeLevelEvaluator, *, eval_types: bool = False
+) -> list[MemberDef] | None:
+ """Extract member definitions from Member type arguments.
+
+ Returns a list of MemberDef, or None if any argument is not a valid Member.
+
+ If eval_types is True, member types are eagerly evaluated via the
+ evaluator (needed for UpdateClass where the types are stored on a
+ real TypeInfo and must be fully resolved).
+ """
+ member_info = evaluator.get_typemap_type("Member")
+ members: list[MemberDef] = []
+
+ for arg in args:
+ arg = get_proper_type(arg)
+
+ if not isinstance(arg, Instance) or arg.type != member_info.type:
+ return None
+ if len(arg.args) < 2:
+ return None
+
+ name = extract_literal_string(arg.args[0])
+ if name is None:
+ return None
+
+ item_type = arg.args[1]
+ if eval_types:
+ item_type = evaluator.eval_proper(item_type)
+
+ is_classvar = False
+ is_final = False
+ if len(arg.args) >= 3:
+ for qual in extract_qualifier_strings(arg.args[2]):
+ if qual == "ClassVar":
+ is_classvar = True
+ elif qual == "Final":
+ is_final = True
+
+ init_type: Type = UninhabitedType()
+ if len(arg.args) >= 4:
+ init_type = arg.args[3]
+
+ members.append(MemberDef(name, item_type, init_type, is_classvar, is_final))
+
+ return members
+
+
+_synthetic_type_counter = 0
+
+
+def _build_synthetic_typeinfo(
+ class_name: str, members: list[MemberDef], evaluator: TypeLevelEvaluator
+) -> TypeInfo:
+ """Create a synthetic TypeInfo populated with members.
+
+ Used by NewProtocol to build a TypeInfo carrying member definitions
+ extracted from Member type arguments.
+ """
+ global _synthetic_type_counter
+ _synthetic_type_counter += 1
+
+ # HACK: We create a ClassDef with an empty Block because TypeInfo requires one.
+ # Each synthetic type needs a unique fullname so that nominal subtype checks
+ # (has_base) don't incorrectly treat distinct synthetic types as related.
+ class_def = ClassDef(class_name, Block([]))
+ class_def.fullname = f"__typelevel__.{class_name}.{_synthetic_type_counter}"
+
+ info = TypeInfo(SymbolTable(), class_def, "__typelevel__")
+ class_def.info = info
+
+ object_type = evaluator.api.named_type("builtins.object")
+ info.bases = [object_type]
+ try:
+ calculate_mro(info)
+ except Exception:
+ # HACK: Minimal MRO setup when calculate_mro fails
+ info.mro = [info, object_type.type]
+
+ for m in members:
+ var = Var(m.name, m.type)
+ var.info = info
+ var._fullname = f"{info.fullname}.{m.name}"
+ var.is_classvar = m.is_classvar
+ var.is_final = m.is_final
+ var.is_initialized_in_class = True
+ var.init_type = m.init_type
+ var.is_inferred = False
+ info.names[m.name] = SymbolTableNode(MDEF, var)
+
+ return info
+
+
+@register_operator("NewProtocol")
+def _eval_new_protocol(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate NewProtocol[*Members] -> create a new structural protocol type.
+
+ This creates a synthetic protocol class with members defined by the Member arguments.
+ The protocol type uses structural subtyping.
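+
+ Example (illustrative; see pep.rst for the exact syntax):
+
+ NewProtocol[Member["x", int], Member["y", str, "ClassVar"]]
+
+ creates a protocol with an instance attribute x: int and a class
+ variable y: str, compared structurally like any other protocol.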
+ """
+ # TODO: methods are probably in bad shape
+
+ members = _extract_members(args, evaluator)
+ if members is None:
+ return UninhabitedType()
+
+ info = _build_synthetic_typeinfo("NewProtocol", members, evaluator)
+
+ info.is_protocol = True
+ assert evaluator._current_op is not None
+ info.new_protocol_constructor = evaluator._current_op
+ info.runtime_protocol = False
+
+ return Instance(info, [])
+
+
+@register_operator("UpdateClass")
+def _eval_update_class(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """UpdateClass should not be evaluated as a normal type operator.
+
+ It is only valid as the return type of a class decorator or
+ __init_subclass__, and is handled specially by semanal_main.py
+ via evaluate_update_class(). If we get here (e.g. via get_proper_type
+ expanding the return type annotation), return NoneType since the
+ decorated function/method semantically returns None at runtime.
+ Returning Never here would cause the checker to treat subsequent
+ code as unreachable.
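+
+ For example (illustrative), a class decorator whose return type is
+ annotated as UpdateClass[Member["x", int]] causes x: int to be added
+ to the decorated class.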
+ """
+ return NoneType()
+
+
+def evaluate_update_class(
+ typ: TypeOperatorType, api: SemanticAnalyzerInterface, ctx: Context | None = None
+) -> list[MemberDef] | None:
+ """Evaluate an UpdateClass TypeOperatorType and return its member definitions.
+
+ Called from semanal_main.py during the post-semanal pass. Eagerly evaluates
+ member types so they are fully resolved before being stored on the target
+ class's TypeInfo.
+
+ Returns None if evaluation fails.
+ """
+ evaluator = TypeLevelEvaluator(api, ctx)
+ try:
+ args = evaluator.flatten_args(typ.args)
+ except (EvaluationStuck, EvaluationOverflow):
+ return None
+ return _extract_members(tuple(args), evaluator, eval_types=True)
+
+
+@register_operator("Length")
+@lift_over_unions
+def _eval_length(arg: Type, *, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate Length[T] -> Literal[int] for tuple length."""
+ target = evaluator.eval_proper(arg)
+
+ if isinstance(target, TupleType):
+ # Need to evaluate the elements before we inspect them
+ items = [evaluator.eval_proper(st) for st in target.items]
+
+ # If there is an Unpack, it must be of an unbounded tuple, or
+ # it would have been substituted out.
+ if any(isinstance(st, UnpackType) for st in items):
+ return NoneType()
+ return evaluator.literal_int(len(target.items))
+ if isinstance(target, Instance) and target.type.has_base("builtins.tuple"):
+ return NoneType()
+
+ return UninhabitedType()
+
+
+@register_operator("RaiseError")
+def _eval_raise_error(*args: Type, evaluator: TypeLevelEvaluator) -> Type:
+ """Evaluate RaiseError[S] -> emit a type error with message S.
+
+ RaiseError is used to emit custom type errors during type-level computation.
+ The argument must be a Literal[str] containing the error message.
+ Returns Never after emitting the error.
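+
+ Example (illustrative):
+
+ RaiseError[Literal["unsupported type"], T]
+
+ reports 'unsupported type: <the type bound to T>' and evaluates
+ to Never.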
+ """
+
+ if not args:
+ msg = "RaiseError called without arguments!"
+ else:
+ msg = extract_literal_string(args[0]) or str(args[0])
+
+ if args[1:]:
+ msg += ": " + ", ".join(str(t) for t in args[1:])
+
+ # TODO: We could also print a stack trace?
+ # Use serious=True to bypass in_checked_function() check which requires
+ # self.options to be set on the SemanticAnalyzer
+ if not typelevel_ctx._suppress_errors:
+ evaluator.api.fail(msg, evaluator.error_ctx, serious=True)
+
+ return UninhabitedType()
+
+
+def evaluate_comprehension(evaluator: TypeLevelEvaluator, typ: TypeForComprehension) -> Type:
+ """Evaluate a TypeForComprehension.
+
+ Evaluates *[Expr for var in Iter if Cond] to UnpackType(TupleType([...])).
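+
+ Example (illustrative; see pep.rst for the exact syntax):
+
+ *[X for X in Iter[tuple[int, str]] if IsAssignable[X, int]]
+
+ substitutes each tuple element for X, keeps the elements whose
+ conditions evaluate to Literal[True], and yields
+ UnpackType(TupleType([int])).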
+ """
+
+ # Get the iterable type and expand it to a TupleType
+ iter_proper = evaluator.eval_proper(typ.iter_type)
+
+ if isinstance(iter_proper, AnyType) and typ.is_map:
+ # Map-over-Any: propagate Any as an UnpackType(AnyType) sentinel
+ # that the enclosing variadic container collapses to Any.
+ return UnpackType(AnyType(TypeOfAny.from_another_any, source_any=iter_proper))
+
+ if isinstance(iter_proper, AnyType) and not typ.is_map:
+ if iter_proper.type_of_any == TypeOfAny.explicit and not typelevel_ctx._suppress_errors:
+ evaluator.api.fail(
+ "Type comprehension requires Iter over a tuple type, got Any;"
+ " use Map(...) to propagate Any",
+ evaluator.error_ctx,
+ serious=True,
+ )
+ return AnyType(TypeOfAny.from_error)
+
+ if not isinstance(iter_proper, TupleType):
+ return UninhabitedType()
+
+ # Process each item in the tuple
+ result_items: list[Type] = []
+ assert typ.iter_var
+ for item in iter_proper.items:
+ # Substitute iter_var with item in element_expr and conditions
+ env = {typ.iter_var.id: item}
+ substituted_expr = expand_type(typ.element_expr, env)
+ substituted_conditions = [expand_type(cond, env) for cond in typ.conditions]
+
+ # Evaluate all conditions
+ all_pass = True
+ for cond in substituted_conditions:
+ cond_result = extract_literal_bool(evaluator.evaluate(cond))
+ if cond_result is False:
+ all_pass = False
+ break
+ elif cond_result is None:
+ # Undecidable condition - raise Stuck
+ raise EvaluationStuck
+
+ if all_pass:
+ # Include this element in the result
+ result_items.append(substituted_expr)
+
+ return UnpackType(evaluator.tuple_type(result_items))
+
+
+# --- Helper Functions ---
+
+
+def get_type_args_for_base(instance: Instance, base_type: TypeInfo) -> tuple[Type, ...] | None:
+ """Get type args when viewing instance as base class.
+
+ Returns None if instance is not a subtype of base_type.
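+
+ For example (illustrative): viewing an Instance of dict[str, int] as
+ its Mapping base yields (str, int); viewing it as an unrelated base
+ yields None.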
+ """
+ # Check if base_type is in the MRO. (map_instance_to_supertype
+ # doesn't have a way to signal when it isn't; it just fills the
+ # type with Anys)
+ if base_type not in instance.type.mro:
+ return None
+
+ return map_instance_to_supertype(instance, base_type).args
+
+
+# --- Public API ---
+
+
+def evaluate_computed_type(typ: ComputedType, ctx: Context | None = None) -> Type:
+ """Evaluate a ComputedType. Called from ComputedType.expand().
+
+ Uses typelevel_ctx.api to access the semantic analyzer.
+
+ The ctx argument indicates where an error message from RaiseError
+ ought to be placed. TODO: Make it a stack of contexts maybe?
+
+ """
+ if typelevel_ctx.api is None:
+ raise AssertionError("No access to semantic analyzer!")
+
+ old_evaluator = typelevel_ctx._evaluator
+ if not typelevel_ctx._evaluator:
+ typelevel_ctx._evaluator = TypeLevelEvaluator(typelevel_ctx.api, ctx)
+ try:
+ res = typelevel_ctx._evaluator.evaluate(typ)
+ except EvaluationOverflow:
+ # If this is not the top level of type evaluation, re-raise.
+ if old_evaluator is not None:
+ raise
+ res = EXPANSION_OVERFLOW
+ except EvaluationStuck:
+ # TODO: Should we do the same top level thing as above?
+ res = EXPANSION_ANY
+ finally:
+ typelevel_ctx._evaluator = old_evaluator
+
+ # print("EVALED!!", typ, "====>", res)
+ return res
diff --git a/mypy/typeops.py b/mypy/typeops.py
index 839c6454ca28f..5429ce8a704e8 100644
--- a/mypy/typeops.py
+++ b/mypy/typeops.py
@@ -173,8 +173,8 @@ def type_object_type(info: TypeInfo, named_type: Callable[[str], Instance]) -> P
return AnyType(TypeOfAny.from_error)
# The two is_valid_constructor() checks ensure this.
- assert isinstance(new_method.node, (SYMBOL_FUNCBASE_TYPES, Decorator))
- assert isinstance(init_method.node, (SYMBOL_FUNCBASE_TYPES, Decorator))
+ assert isinstance(new_method.node, (SYMBOL_FUNCBASE_TYPES, Decorator, Var))
+ assert isinstance(init_method.node, (SYMBOL_FUNCBASE_TYPES, Decorator, Var))
init_index = info.mro.index(init_method.node.info)
new_index = info.mro.index(new_method.node.info)
@@ -189,7 +189,7 @@ def type_object_type(info: TypeInfo, named_type: Callable[[str], Instance]) -> P
fallback = named_type("builtins.type")
if init_index < new_index:
- method: FuncBase | Decorator = init_method.node
+ method: FuncBase | Decorator | Var = init_method.node
is_new = False
elif init_index > new_index:
method = new_method.node
@@ -227,6 +227,12 @@ def type_object_type(info: TypeInfo, named_type: Callable[[str], Instance]) -> P
# achieved in early return above because is_valid_constructor() is False.
allow_cache = False
t = function_type(method, fallback)
+ elif isinstance(method, Var):
+ # Var with callable type, e.g. from UpdateClass adding __init__
+ assert method.type is not None
+ proper = get_proper_type(method.type)
+ assert isinstance(proper, FunctionLike)
+ t = proper
else:
assert isinstance(method.type, ProperType)
assert isinstance(method.type, FunctionLike) # is_valid_constructor() ensures this
@@ -242,13 +248,15 @@ def type_object_type(info: TypeInfo, named_type: Callable[[str], Instance]) -> P
def is_valid_constructor(n: SymbolNode | None) -> bool:
"""Does this node represents a valid constructor method?
- This includes normal functions, overloaded functions, and decorators
- that return a callable type.
+ This includes normal functions, overloaded functions, decorators
+ that return a callable type, and Vars with callable types (e.g. from UpdateClass).
"""
if isinstance(n, SYMBOL_FUNCBASE_TYPES):
return True
if isinstance(n, Decorator):
return isinstance(get_proper_type(n.type), FunctionLike)
+ if isinstance(n, Var) and n.type is not None:
+ return isinstance(get_proper_type(n.type), FunctionLike)
return False
diff --git a/mypy/types.py b/mypy/types.py
index d4ed728f4c9b8..c4227c803c7bc 100644
--- a/mypy/types.py
+++ b/mypy/types.py
@@ -4,7 +4,7 @@
import sys
from abc import abstractmethod
-from collections.abc import Iterable, Sequence
+from collections.abc import Callable, Iterable, Sequence
from typing import (
Any,
ClassVar,
@@ -158,6 +158,9 @@
# Supported @disjoint_base decorator names
DISJOINT_BASE_DECORATOR_NAMES: Final = ("typing.disjoint_base", "typing_extensions.disjoint_base")
+# Supported @_type_operator decorator names (for type-level computation)
+TYPE_OPERATOR_NAMES: Final = ("typing._type_operator", "_typeshed.typemap._type_operator")
+
# We use this constant in various places when checking `tuple` subtyping:
TUPLE_LIKE_INSTANCE_NAMES: Final = (
"builtins.tuple",
@@ -237,6 +240,9 @@ class TypeOfAny:
# used to ignore Anys inserted by the suggestion engine when
# generating constraints.
suggestion_engine: Final = 9
+ # Whether this Any comes from type-level computation getting stuck on an
+ # unsubstituted type variable
+ expansion_stuck: Final = 10
def deserialize_type(data: JsonDict | str) -> Type:
@@ -301,8 +307,8 @@ def accept(self, visitor: TypeVisitor[T]) -> T:
def __repr__(self) -> str:
return self.accept(TypeStrVisitor(options=Options()))
- def str_with_options(self, options: Options) -> str:
- return self.accept(TypeStrVisitor(options=options))
+ def str_with_options(self, options: Options | None = None, expand: bool = False) -> str:
+ return self.accept(TypeStrVisitor(options=options or Options(), expand=expand))
def serialize(self) -> JsonDict | str:
raise NotImplementedError(f"Cannot serialize {self.__class__.__name__} instance")
@@ -462,6 +468,269 @@ def read(cls, data: ReadBuffer) -> TypeAliasType:
return alias
+class ProperType(Type):
+ """Not a type alias or computed type.
+
+ Every type except TypeAliasType and ComputedType (and its subclasses)
+ must inherit from this type.
+ """
+
+ __slots__ = ()
+
+
+class ComputedType(ProperType):
+ """Base class for types that represent unevaluated type-level computations.
+
+ This is a ProperType, but it must still be expanded/evaluated before
+ use in most type operations. It is analogous to TypeAliasType in that
+ it wraps a computation that produces a concrete type.
+
+ Subclasses:
+ - TypeOperatorType: e.g., GetArg[T, Base, 0], Members[T], _Cond[IsAssignable[T, Base], X, Y]
+ - TypeForComprehension: e.g., *[Expr for x in Iter[T] if Cond]
+ """
+
+ __slots__ = ()
+
+ def expand(self, ctx: mypy.nodes.Context | None = None) -> Type:
+ """Evaluate this computed type to produce a concrete type.
+
+ Returns self if evaluation is not yet possible (e.g., contains unresolved type vars).
+ XXX: That is not true (we return EXPANSION_ANY) but probably
+ should be.
+
+ The base implementation delegates to mypy.typelevel.evaluate_computed_type().
+
+ """
+ from mypy.typelevel import evaluate_computed_type
+
+ return evaluate_computed_type(self, ctx)
+
+
+class TypeOperatorType(ComputedType):
+ """Represents an unevaluated type operator application, e.g., GetArg[T, Base, 0].
+
+ Stores a reference to the operator's TypeInfo and the type arguments.
+ Type operators are generic classes in typeshed marked with @_type_operator.
+ """
+
+ __slots__ = ("type", "args", "type_ref", "fallback")
+
+ def __init__(
+ self,
+ type: mypy.nodes.TypeInfo, # The TypeInfo for the operator (e.g., typing.GetArg)
+ args: list[Type], # The type arguments
+ fallback: Instance,
+ line: int = -1,
+ column: int = -1,
+ ) -> None:
+ super().__init__(line, column)
+ self.type = type
+ self.args = args
+ self.fallback = fallback
+ self.type_ref: str | None = None # XXX?
+
+ def accept(self, visitor: TypeVisitor[T]) -> T:
+ return visitor.visit_type_operator_type(self)
+
+ def __hash__(self) -> int:
+ return hash((self.type, tuple(self.args)))
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, TypeOperatorType):
+ return NotImplemented
+ return self.type == other.type and self.args == other.args
+
+ def __repr__(self) -> str:
+ return f"TypeOperatorType({self.type.fullname}, {self.args})"
+
+ def serialize(self) -> JsonDict:
+ data: JsonDict = {
+ ".class": "TypeOperatorType",
+ "type_ref": self.type.fullname,
+ "args": [arg.serialize() for arg in self.args],
+ "fallback": self.fallback.serialize(),
+ }
+ return data
+
+ @classmethod
+ def deserialize(cls, data: JsonDict) -> TypeOperatorType:
+ assert data[".class"] == "TypeOperatorType"
+ args: list[Type] = []
+ if "args" in data:
+ args_list = data["args"]
+ assert isinstance(args_list, list)
+ args = [deserialize_type(arg) for arg in args_list]
+ fallback = Instance.deserialize(data["fallback"])
+ typ = TypeOperatorType(NOT_READY, args, fallback)
+ typ.type_ref = data["type_ref"] # Will be fixed up by fixup.py later.
+ return typ
+
+ def copy_modified(self, *, args: list[Type] | None = None) -> TypeOperatorType:
+ return TypeOperatorType(
+ self.type,
+ args if args is not None else self.args.copy(),
+ self.fallback,
+ self.line,
+ self.column,
+ )
+
+ def write(self, data: WriteBuffer) -> None:
+ write_tag(data, TYPE_OPERATOR_TYPE)
+ write_type_list(data, self.args)
+ write_str(data, self.type.fullname)
+ self.fallback.write(data)
+ write_tag(data, END_TAG)
+
+ @classmethod
+ def read(cls, data: ReadBuffer) -> TypeOperatorType:
+ args = read_type_list(data)
+ type_ref = read_str(data)
+ assert read_tag(data) == INSTANCE
+ fallback = Instance.read(data)
+ typ = TypeOperatorType(NOT_READY, args, fallback)
+ typ.type_ref = type_ref
+ assert read_tag(data) == END_TAG
+ return typ
+
+
+class TypeForComprehension(ComputedType):
+ """Represents *[Expr for var in Iter[T] if Cond].
+
+ Expands to a tuple of types.
+ """
+
+ __slots__ = ("element_expr", "iter_name", "iter_type", "conditions", "iter_var", "is_map")
+
+ def __init__(
+ self,
+ element_expr: Type,
+ iter_name: str,
+ iter_type: Type, # The type being iterated (should be a tuple type)
+ conditions: list[Type], # Each should be IsAssignable[...] or boolean combo
+ iter_var: TypeVarType | None = None, # Typically populated by typeanal
+ line: int = -1,
+ column: int = -1,
+ *,
+ is_map: bool = False,
+ ) -> None:
+ super().__init__(line, column)
+ self.element_expr = element_expr
+ self.iter_name = iter_name
+ self.iter_type = iter_type
+ self.conditions = conditions
+ self.iter_var: TypeVarType | None = iter_var
+ # True when this comprehension was desugared from `Map[...]` syntax.
+ # Changes behavior at the Map boundary: Iter[Any] propagates as Any
+ # through the enclosing variadic context instead of erroring.
+ self.is_map = is_map
+
+ def type_param(self) -> mypy.nodes.TypeParam:
+ return mypy.nodes.TypeParam(self.iter_name, mypy.nodes.TYPE_VAR_KIND, None, [], None)
+
+ def accept(self, visitor: TypeVisitor[T]) -> T:
+ return visitor.visit_type_for_comprehension(self)
+
+ def __hash__(self) -> int:
+ return hash(
+ (
+ self.element_expr,
+ self.iter_name,
+ self.iter_type,
+ tuple(self.conditions),
+ self.is_map,
+ )
+ )
+
+ def __eq__(self, other: object) -> bool:
+ if not isinstance(other, TypeForComprehension):
+ return NotImplemented
+ return (
+ self.element_expr == other.element_expr
+ and self.iter_name == other.iter_name
+ and self.iter_type == other.iter_type
+ and self.conditions == other.conditions
+ and self.is_map == other.is_map
+ )
+
+ def __repr__(self) -> str:
+ conds = "".join(f" if {c}" for c in self.conditions)
+ return f"TypeForComprehension([{self.element_expr} for {self.iter_name} in {self.iter_type}{conds}])"
+
+ def serialize(self) -> JsonDict:
+ return {
+ ".class": "TypeForComprehension",
+ "element_expr": self.element_expr.serialize(),
+ "iter_name": self.iter_name,
+ "iter_type": self.iter_type.serialize(),
+ "conditions": [c.serialize() for c in self.conditions],
+ "iter_var": self.iter_var.serialize() if self.iter_var else None,
+ "is_map": self.is_map,
+ }
+
+ @classmethod
+ def deserialize(cls, data: JsonDict) -> TypeForComprehension:
+ assert data[".class"] == "TypeForComprehension"
+ var = data["iter_var"]
+ return TypeForComprehension(
+ deserialize_type(data["element_expr"]),
+ data["iter_name"],
+ deserialize_type(data["iter_type"]),
+ [deserialize_type(c) for c in data["conditions"]],
+ iter_var=cast(TypeVarType, deserialize_type(var)) if var else None,
+ is_map=data.get("is_map", False),
+ )
+
+ def copy_modified(
+ self,
+ *,
+ element_expr: Type | None = None,
+ iter_name: str | None = None,
+ iter_type: Type | None = None,
+ conditions: list[Type] | None = None,
+ iter_var: TypeVarType | None = None, # Typically populated by typeanal
+ is_map: bool | None = None,
+ ) -> TypeForComprehension:
+ return TypeForComprehension(
+ element_expr if element_expr is not None else self.element_expr,
+ iter_name if iter_name is not None else self.iter_name,
+ iter_type if iter_type is not None else self.iter_type,
+ conditions if conditions is not None else self.conditions.copy(),
+ iter_var if iter_var is not None else self.iter_var,
+ self.line,
+ self.column,
+ is_map=is_map if is_map is not None else self.is_map,
+ )
+
+ def write(self, data: WriteBuffer) -> None:
+ write_tag(data, TYPE_FOR_COMPREHENSION)
+ self.element_expr.write(data)
+ write_str(data, self.iter_name)
+ self.iter_type.write(data)
+ write_int(data, len(self.conditions))
+ for cond in self.conditions:
+ cond.write(data)
+ write_type_opt(data, self.iter_var)
+ write_bool(data, self.is_map)
+ write_tag(data, END_TAG)
+
+ @classmethod
+ def read(cls, data: ReadBuffer) -> TypeForComprehension:
+ element_expr = read_type(data)
+ iter_name = read_str(data)
+ iter_type = read_type(data)
+ num_conditions = read_int(data)
+ conditions = [read_type(data) for _ in range(num_conditions)]
+ iter_var = cast(TypeVarType | None, read_type_opt(data))
+ is_map = read_bool(data)
+
+ assert read_tag(data) == END_TAG
+ return TypeForComprehension(
+ element_expr, iter_name, iter_type, conditions, iter_var, is_map=is_map
+ )
+
+
class TypeGuardedType(Type):
"""Only used by find_isinstance_check() etc."""
@@ -512,15 +781,6 @@ def accept(self, visitor: TypeVisitor[T]) -> T:
return self.item.accept(visitor)
-class ProperType(Type):
- """Not a type alias.
-
- Every type except TypeAliasType must inherit from this type.
- """
-
- __slots__ = ()
-
-
class TypeVarId:
# A type variable is uniquely identified by its raw id and meta level.
@@ -1256,6 +1516,11 @@ def deserialize(cls, data: JsonDict) -> UnpackType:
typ = data["type"]
return UnpackType(deserialize_type(typ))
+ def copy_modified(self, *, type: Type | None = None) -> UnpackType:
+ return UnpackType(
+ type if type is not None else self.type, self.line, self.column, self.from_star_syntax
+ )
+
def __hash__(self) -> int:
return hash(self.type)
@@ -1689,6 +1954,11 @@ def __eq__(self, other: object) -> bool:
def serialize(self) -> JsonDict | str:
assert self.type is not None
+ # Synthetic NewProtocol types can't be looked up by fullname on
+ # deserialization, so serialize the unevaluated constructor instead.
+ # It will be re-evaluated on load.
+ if self.type.new_protocol_constructor is not None:
+ return self.type.new_protocol_constructor.serialize()
type_ref = self.type.fullname
if not self.args and not self.last_known_value and not self.extra_attrs:
return type_ref
@@ -1723,6 +1993,11 @@ def deserialize(cls, data: JsonDict | str) -> Instance:
return inst
def write(self, data: WriteBuffer) -> None:
+ # Synthetic NewProtocol types can't be looked up by fullname on
+ # deserialization, so serialize the unevaluated constructor instead.
+ if self.type.new_protocol_constructor is not None:
+ self.type.new_protocol_constructor.write(data)
+ return
write_tag(data, INSTANCE)
if not self.args and not self.last_known_value and not self.extra_attrs:
type_ref = self.type.fullname
@@ -2454,6 +2729,19 @@ def with_unpacked_kwargs(self) -> NormalizedCallableType:
if not self.unpack_kwargs:
return cast(NormalizedCallableType, self)
last_type = get_proper_type(self.arg_types[-1])
+ # Handle Unpack[K] where K is TypeVar bound to TypedDict
+ if isinstance(last_type, UnpackType):
+ unpacked = get_proper_type(last_type.type)
+ if isinstance(unpacked, TypeVarType):
+ # TypeVar with TypedDict bound - can't expand until after inference.
+ # Return unchanged for now; expansion happens after type var substitution.
+ return cast(NormalizedCallableType, self)
+ # For TypedDict inside UnpackType, unwrap it
+ if isinstance(unpacked, TypedDictType):
+ last_type = unpacked
+ if isinstance(last_type, TypeVarType):
+ # Direct TypeVar (shouldn't happen normally but handle it)
+ return cast(NormalizedCallableType, self)
assert isinstance(last_type, TypedDictType)
extra_kinds = [
ArgKind.ARG_NAMED if name in last_type.required_keys else ArgKind.ARG_NAMED_OPT
@@ -3624,14 +3912,14 @@ def serialize(self) -> str:
@overload
-def get_proper_type(typ: None) -> None: ...
+def get_proper_type_simple(typ: None) -> None: ...
@overload
-def get_proper_type(typ: Type) -> ProperType: ...
+def get_proper_type_simple(typ: Type) -> ProperType: ...
-def get_proper_type(typ: Type | None) -> ProperType | None:
+def get_proper_type_simple(typ: Type | None) -> ProperType | None:
"""Get the expansion of a type alias type.
If the type is already a proper type, this is a no-op. Use this function
@@ -3651,6 +3939,130 @@ def get_proper_type(typ: Type | None) -> ProperType | None:
return cast(ProperType, typ)
+def _could_be_computed_unpack(t: Type) -> bool:
+ return isinstance(t, TypeForComprehension) or (
+ # An unpack of a type alias or a computed type could expand to
+ # something we need to eval
+ isinstance(t, UnpackType)
+ and (
+ isinstance(t.type, (TypeAliasType, ComputedType))
+ # XXX: Some TupleTypes have snuck in in some cases and I
+ # need to debug this more
+ or (isinstance(t.type, ProperType) and isinstance(t.type, TupleType))
+ )
+ )
+
+
+def _find_map_any(items: Iterable[Type]) -> AnyType | None:
+ """Return the AnyType payload from the first *Map[...]-over-Any sentinel in items.
+
+ The sentinel is an UnpackType wrapping an AnyType, produced by
+ evaluate_comprehension on a Map-flavored TypeForComprehension whose
+ Iter[...] source is Any. The enclosing variadic container should
+ collapse to AnyType when it sees one.
+ """
+ for item in items:
+ if not isinstance(item, UnpackType):
+ continue
+ inner = get_proper_type(item.type)
+ if isinstance(inner, AnyType):
+ return inner
+ return None
+
+
+def _expand_type_fors_in_args(typ: ProperType) -> ProperType:
+ """
+ Expand any TypeForComprehensions in type arguments.
+ """
+ # TODO: lots of perf optimizations available
+ # XXX: Callable
+
+ # also I'm really not sure about this at all!
+ # this is a lot of work to be doing in get_proper_type
+ from mypy.expandtype import expand_type
+
+ typ2: ProperType
+
+ if isinstance(typ, TupleType) and any(_could_be_computed_unpack(st) for st in typ.items):
+ typ2 = typ.copy_modified(items=[get_proper_type(st) for st in typ.items])
+ # expanding the types might produce Unpacks, which we use
+ # expand_type to substitute in.
+ typ = expand_type(typ2, {})
+ if isinstance(typ, TupleType):
+ if (map_any := _find_map_any(typ.items)) is not None:
+ # A *Map[...] over Iter[Any] landed here; propagate Any through the tuple.
+ return map_any
+ elif (
+ isinstance(typ, Instance)
+ and typ.type # Make sure it's not a FakeInfo
+ and typ.type.has_type_var_tuple_type
+ and any(_could_be_computed_unpack(st) for st in typ.args)
+ ):
+ typ2 = typ.copy_modified(args=[get_proper_type(st) for st in typ.args])
+ typ = expand_type(typ2, {})
+ if isinstance(typ, Instance):
+ if (map_any := _find_map_any(typ.args)) is not None:
+ return map_any
+ elif isinstance(typ, UnpackType) and _could_be_computed_unpack(typ):
+ # No need to expand here
+ expanded = get_proper_type(typ.type)
+ # If expansion is stuck (still a ComputedType), keep the original.
+ # This prevents infinite expansion of recursive type aliases like
+ # Zip[DropLast[T], ...] where each expansion produces another
+ # stuck recursive reference.
+ if not isinstance(expanded, ComputedType):
+ typ = typ.copy_modified(type=expanded)
+
+ return typ
+
+
+@overload
+def get_proper_type(typ: None) -> None: ...
+
+
+@overload
+def get_proper_type(typ: Type) -> ProperType: ...
+
+
+def get_proper_type(typ: Type | None) -> ProperType | None:
+ """Get the expansion of a type alias type or computed type.
+
+ If the type is already a proper type, this is a no-op. Use this function
+ wherever a decision is made on a call like e.g. 'if isinstance(typ, UnionType): ...',
+ because 'typ' in this case may be an alias to union. Note: if after making the decision
+ on the isinstance() call you pass on the original type (and not one of its components)
+ it is recommended to *always* pass on the unexpanded alias.
+
+ This also *attempts* to expand computed types, though it might fail.
+ """
+ ctx = typ
+ if typ is None:
+ return None
+ # TODO: this is an ugly hack, remove.
+ if isinstance(typ, TypeGuardedType):
+ typ = typ.type_guard
+
+ while True:
+ if isinstance(typ, TypeAliasType):
+ typ = typ._expand_once()
+ elif isinstance(typ, ComputedType):
+ # Handles TypeOperatorType, TypeForComprehension
+ if not is_stuck_expansion(ntyp := typ.expand(ctx)):
+ typ = ntyp
+ else:
+ break
+ else:
+ break
+
+ typ = cast(ProperType, typ)
+
+ typ = _expand_type_fors_in_args(typ)
+
+ # TODO: store the name of original type alias on this type, so we can show it in errors.
+ return typ
+
+
@overload
def get_proper_types(types: list[Type] | tuple[Type, ...]) -> list[ProperType]: ...
@@ -3667,13 +4079,32 @@ def get_proper_types(
if isinstance(types, list):
typelist = types
# Optimize for the common case so that we don't need to allocate anything
- if not any(isinstance(t, (TypeAliasType, TypeGuardedType)) for t in typelist):
+ if not any(
+ isinstance(t, (TypeAliasType, TypeGuardedType, ComputedType)) for t in typelist
+ ):
return cast("list[ProperType]", typelist)
return [get_proper_type(t) for t in typelist]
else:
return [get_proper_type(t) for t in types]
+def is_stuck_expansion(typ: Type) -> bool:
+ return (
+ isinstance(typ, ProperType)
+ and isinstance(typ, AnyType)
+ and typ.type_of_any == TypeOfAny.expansion_stuck
+ )
+
+
+def try_expand_or_none(type: Type) -> ProperType | None:
+ """Try to expand a type, but return None if it gets stuck"""
+ type2 = get_proper_type(type)
+ if type is type2 and isinstance(type, ComputedType):
+ return None
+ else:
+ return type2
+
+
# We split off the type visitor base classes to another module
# to make it easier to gradually get modules working with mypyc.
# Import them here, after the types are defined.
@@ -3702,10 +4133,19 @@ class TypeStrVisitor(SyntheticTypeVisitor[str]):
- Represent Union[x, y] as x | y
"""
- def __init__(self, id_mapper: IdMapper | None = None, *, options: Options) -> None:
+ def __init__(
+ self,
+ id_mapper: IdMapper | None = None,
+ expand: bool = False,
+ expand_recursive: bool = False,
+ *,
+ options: Options,
+ ) -> None:
self.id_mapper = id_mapper
self.options = options
self.dotted_aliases: set[TypeAliasType] | None = None
+ self.expand = expand
+ self.expand_recursive = expand_recursive
def visit_unbound_type(self, t: UnboundType, /) -> str:
s = t.name + "?"
@@ -3742,6 +4182,13 @@ def visit_deleted_type(self, t: DeletedType, /) -> str:
return f"<Deleted '{t.source}'>"
def visit_instance(self, t: Instance, /) -> str:
+ if self.expand:
+ if (nt := try_expand_or_none(t)) and nt != t:
+ return nt.accept(self)
+
+ if t.type.is_new_protocol:
+ return self._format_new_protocol(t)
+
fullname = t.type.fullname
if not self.options.reveal_verbose_types and fullname.startswith("builtins."):
fullname = t.type.name
@@ -3765,6 +4212,9 @@ def visit_instance(self, t: Instance, /) -> str:
s += f"<{self.id_mapper.id(t.type)}>"
return s
+ def _format_new_protocol(self, t: Instance) -> str:
+ return format_new_protocol(t, lambda typ: typ.accept(self))
+
def visit_type_var(self, t: TypeVarType, /) -> str:
if not self.options.reveal_verbose_types:
s = t.name
@@ -3901,7 +4351,7 @@ def visit_callable_type(self, t: CallableType, /) -> str:
)
else:
vs.append(
- f"{var.name}{f' = {var.default.accept(self)}' if var.has_default() else ''}"
+ f"{var.name}{f' = {var.default.accept(self)}' if var.has_default() else ''}"
)
else:
# For other TypeVarLikeTypes, use the name and default
@@ -3919,6 +4369,11 @@ def visit_overloaded(self, t: Overloaded, /) -> str:
return f"Overload({', '.join(a)})"
def visit_tuple_type(self, t: TupleType, /) -> str:
+ # Expand computed comprehensions
+ if self.expand:
+ if (nt := try_expand_or_none(t)) and nt != t:
+ return nt.accept(self)
+
s = self.list_str(t.items) or "()"
if t.partial_fallback and t.partial_fallback.type:
fallback_name = t.partial_fallback.type.fullname
@@ -3985,22 +4440,45 @@ def visit_placeholder_type(self, t: PlaceholderType, /) -> str:
def visit_type_alias_type(self, t: TypeAliasType, /) -> str:
if t.alias is None:
return ""
- if not t.is_recursive:
- return get_proper_type(t).accept(self)
- if self.dotted_aliases is None:
- self.dotted_aliases = set()
- elif t in self.dotted_aliases:
- return "..."
- self.dotted_aliases.add(t)
- type_str = get_proper_type(t).accept(self)
- self.dotted_aliases.discard(t)
- return type_str
+
+ if (self.expand and not t.is_recursive) or self.expand_recursive:
+ if not t.is_recursive:
+ return get_proper_type(t).accept(self)
+ if self.dotted_aliases is None:
+ self.dotted_aliases = set()
+ elif t in self.dotted_aliases:
+ return "..."
+ self.dotted_aliases.add(t)
+ type_str = get_proper_type(t).accept(self)
+ self.dotted_aliases.discard(t)
+ return type_str
+ else:
+ s = t.alias.fullname
+ if t.args:
+ s += f"[{self.list_str(t.args)}]"
+ return s
def visit_unpack_type(self, t: UnpackType, /) -> str:
if not self.options.reveal_verbose_types:
return f"*{t.type.accept(self)}"
return f"Unpack[{t.type.accept(self)}]"
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> str:
+ if self.expand and (t2 := try_expand_or_none(t)):
+ return t2.accept(self)
+
+ name = t.type.fullname if t.type else ""
+ return f"{name}[{self.list_str(t.args)}]"
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> str:
+ conditions = ""
+ if t.conditions:
+ conditions = " if " + " if ".join(c.accept(self) for c in t.conditions)
+ v = t.iter_var.accept(self) if t.iter_var else f"~{t.iter_name}"
+ return (
+ f"*[{t.element_expr.accept(self)} for {v} in {t.iter_type.accept(self)}{conditions}]"
+ )
+
def list_str(self, a: Iterable[Type], *, use_or_syntax: bool = False) -> str:
"""Convert items of an array to strings (pretty-print types)
and join the results with commas.
@@ -4008,7 +4486,7 @@ def list_str(self, a: Iterable[Type], *, use_or_syntax: bool = False) -> str:
res = []
for t in a:
s = t.accept(self)
- if use_or_syntax and isinstance(get_proper_type(t), CallableType):
+ if use_or_syntax and isinstance(get_proper_type_simple(t), CallableType):
res.append(f"({s})")
else:
res.append(s)
@@ -4106,6 +4584,43 @@ def has_recursive_types(typ: Type) -> bool:
return typ.accept(_has_recursive_type)
+def format_new_protocol(t: Instance, format: Callable[[Type], str]) -> str:
+ """Format a NewProtocol instance by showing its members.
+
+ Used by both TypeStrVisitor and format_type_inner in messages.py.
+ """
+ from mypy.nodes import Var
+
+ parts: list[str] = []
+ for name, node in t.type.names.items():
+ if not isinstance(node.node, Var):
+ continue
+ var = node.node
+ if var.type is not None:
+ type_str = format(var.type)
+ else:
+ type_str = "<??>"
+
+ if var.is_classvar:
+ type_str = f"ClassVar[{type_str}]"
+ if var.is_final:
+ type_str = f"Final[{type_str}]"
+
+ # Append initializer info
+ if var.init_type is not None:
+ init = get_proper_type(var.init_type)
+ if isinstance(init, LiteralType):
+ type_str = f"{type_str} = {init.value}"
+ elif isinstance(init, NoneType):
+ type_str = f"{type_str} = None"
+ elif not isinstance(init, UninhabitedType):
+ type_str = f"{type_str} = ..."
+
+ parts.append(f"{name}: {type_str}")
+
+ return f"NewProtocol[{', '.join(parts)}]"
+
+
def split_with_prefix_and_suffix(
types: tuple[Type, ...], prefix: int, suffix: int
) -> tuple[tuple[Type, ...], tuple[Type, ...], tuple[Type, ...]]:
@@ -4164,7 +4679,8 @@ def flatten_nested_unions(
if not handle_recursive and t.is_recursive:
tp: Type = t
else:
- tp = get_proper_type(t)
+ # N.B: Not get_proper_type(), because this is called from expand_type
+ tp = get_proper_type_simple(t)
else:
tp = t
if isinstance(tp, ProperType) and isinstance(tp, UnionType):
@@ -4320,6 +4836,8 @@ def type_vars_as_args(type_vars: Sequence[TypeVarLikeType]) -> tuple[Type, ...]:
ELLIPSIS_TYPE: Final[Tag] = 119 # Only valid in serialized ASTs
RAW_EXPRESSION_TYPE: Final[Tag] = 120 # Only valid in serialized ASTs
CALL_TYPE: Final[Tag] = 121 # Only valid in serialized ASTs
+TYPE_OPERATOR_TYPE: Final[Tag] = 122
+TYPE_FOR_COMPREHENSION: Final[Tag] = 123
def read_type(data: ReadBuffer, tag: Tag | None = None) -> Type:
@@ -4364,6 +4882,10 @@ def read_type(data: ReadBuffer, tag: Tag | None = None) -> Type:
return UnboundType.read(data)
if tag == DELETED_TYPE:
return DeletedType.read(data)
+ if tag == TYPE_OPERATOR_TYPE:
+ return TypeOperatorType.read(data)
+ if tag == TYPE_FOR_COMPREHENSION:
+ return TypeForComprehension.read(data)
assert False, f"Unknown type tag {tag}"
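The expansion loop in `get_proper_type` above keeps unwrapping until it reaches either a proper type or a computed type whose expansion is stuck. A minimal standalone sketch of that control flow, with toy classes standing in for mypy's `TypeAliasType`, `ComputedType`, and the `expansion_stuck` `AnyType` sentinel (all names here are illustrative, not mypy's):

```python
# Toy model of the expand-until-stuck loop in get_proper_type().
# Alias/Computed/STUCK stand in for mypy's TypeAliasType, ComputedType,
# and AnyType(TypeOfAny.expansion_stuck).

STUCK = object()  # sentinel: expansion cannot make progress


class Alias:
    def __init__(self, target):
        self.target = target

    def expand_once(self):
        return self.target


class Computed:
    def __init__(self, result=STUCK):
        self.result = result

    def expand(self):
        # Returns STUCK when operands are still unresolved type variables.
        return self.result


def get_proper(typ):
    while True:
        if isinstance(typ, Alias):
            typ = typ.expand_once()
        elif isinstance(typ, Computed):
            expanded = typ.expand()
            if expanded is STUCK:
                break  # leave the computed type unexpanded
            typ = expanded
        else:
            break
    return typ
```

Nested aliases are unwrapped fully, while a stuck computation is returned as-is so callers can detect it (as `is_stuck_expansion` does above).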
diff --git a/mypy/types_utils.py b/mypy/types_utils.py
index 160f6c0365d63..773bf8b7f07d8 100644
--- a/mypy/types_utils.py
+++ b/mypy/types_utils.py
@@ -15,6 +15,7 @@
from mypy.types import (
AnyType,
CallableType,
+ ComputedType,
Instance,
LiteralType,
NoneType,
@@ -31,6 +32,7 @@
UnpackType,
flatten_nested_unions,
get_proper_type,
+ get_proper_type_simple,
get_proper_types,
)
@@ -65,7 +67,15 @@ def is_invalid_recursive_alias(seen_nodes: set[TypeAlias], target: Type) -> bool
if target.alias in seen_nodes:
return True
assert target.alias, f"Unfixed type alias {target.type_ref}"
- return is_invalid_recursive_alias(seen_nodes | {target.alias}, get_proper_type(target))
+ # Need to do get_proper_type_simple since we don't want
+ # any computation-based expansion done, or expansions in
+ # the arguments.
+ return is_invalid_recursive_alias(
+ seen_nodes | {target.alias}, get_proper_type_simple(target)
+ )
+ if isinstance(target, ComputedType):
+ # XXX: We need to do *something* useful here!!
+ return False
assert isinstance(target, ProperType)
if not isinstance(target, (UnionType, TupleType)):
return False
@@ -180,4 +190,18 @@ def store_argument_type(
elif typ.arg_kinds[i] == ARG_STAR2:
if not isinstance(arg_type, ParamSpecType) and not typ.unpack_kwargs:
arg_type = named_type("builtins.dict", [named_type("builtins.str", []), arg_type])
+ # Strip the Unpack from Unpack[K], since it isn't part of the
+ # type inside the function
+ elif isinstance(arg_type, UnpackType):
+ unpacked_type = get_proper_type(arg_type.type)
+ assert isinstance(unpacked_type, TypeVarType)
+ arg_type = unpacked_type
defn.arguments[i].variable.type = arg_type
+
+
+def try_getting_literal(typ: Type) -> ProperType:
+ """If possible, get a more precise literal type for a given type."""
+ typ = get_proper_type(typ)
+ if isinstance(typ, Instance) and typ.last_known_value is not None:
+ return typ.last_known_value
+ return typ
diff --git a/mypy/typeshed/stdlib/_typeshed/typemap.pyi b/mypy/typeshed/stdlib/_typeshed/typemap.pyi
new file mode 100644
index 0000000000000..112a22b765fd2
--- /dev/null
+++ b/mypy/typeshed/stdlib/_typeshed/typemap.pyi
@@ -0,0 +1,371 @@
+"""Declarations from the typemap PEP proposal.
+
+These are here so that we can also easily export them from typing_extensions
+and typemap.typing.
+"""
+
+import typing_extensions
+from typing import Any, Generic, Literal, TypeVar, TypedDict
+from typing_extensions import TypeVarTuple, Unpack, Never
+
+class BaseTypedDict(TypedDict):
+ pass
+
+
+_S = TypeVar("_S")
+_T = TypeVar("_T")
+
+_KwargDict = TypeVar('_KwargDict', bound=BaseTypedDict)
+
+# Inherit from Any to allow the assignments.
+# TODO: Should we do this in a more principled way?
+class InitField(Generic[_KwargDict], Any):
+ def __init__(self, **kwargs: Unpack[_KwargDict]) -> None:
+ ...
+
+ def _get_kwargs(self) -> _KwargDict:
+ ...
+
+# Marker decorator for type operators. Classes decorated with this are treated
+# specially by the type checker as type-level computation operators.
+def _type_operator(cls: type[_T]) -> type[_T]: ...
+
+# MemberQuals: qualifiers that can apply to a Member
+MemberQuals: typing_extensions.TypeAlias = Literal["ClassVar", "Final", "Required", "NotRequired", "ReadOnly"]
+
+# ParamQuals: qualifiers that can apply to a Param
+ParamQuals: typing_extensions.TypeAlias = Literal["positional", "keyword", "default", "*", "**"]
+
+# --- Data Types (used in type computations) ---
+
+_Name = TypeVar("_Name")
+_Type = TypeVar("_Type")
+_Quals = TypeVar("_Quals", default=Never)
+_Init = TypeVar("_Init", default=Never)
+_Definer = TypeVar("_Definer", default=Never)
+
+class Member(Generic[_Name, _Type, _Quals, _Init, _Definer]):
+ """
+ Represents a class member with name, type, qualifiers, initializer, and definer.
+ - _Name: Literal[str] - the member name
+ - _Type: the member's type
+ - _Quals: Literal['ClassVar'] | Literal['Final'] | Never - qualifiers
+ - _Init: the literal type of the initializer expression
+ - _Definer: the class that defined this member
+ """
+
+ name: _Name
+ type: _Type
+ quals: _Quals
+ init: _Init
+ definer: _Definer
+
+class Param(Generic[_Name, _Type, _Quals]):
+ """
+ Represents a function parameter for extended callable syntax.
+ - _Name: Literal[str] | None - the parameter name
+ - _Type: the parameter's type
+ - _Quals: Literal['positional', 'keyword', 'default', '*', '**'] - qualifiers
+ """
+
+ name: _Name
+ type: _Type
+ quals: _Quals
+
+
+_N = TypeVar("_N", bound=str)
+
+# Convenience aliases for Param
+
+# XXX: For mysterious reasons, if I mark this as `:
+# typing_extensions.TypeAlias`, mypy thinks _N and _T are unbound...
+PosParam = Param[_N, _T, Literal["positional"]]
+PosDefaultParam = Param[_N, _T, Literal["positional", "default"]]
+DefaultParam = Param[_N, _T, Literal["default"]]
+NamedParam = Param[_N, _T, Literal["keyword"]]
+NamedDefaultParam = Param[_N, _T, Literal["keyword", "default"]]
+ArgsParam = Param[None, _T, Literal["*"]]
+KwargsParam = Param[None, _T, Literal["**"]]
+
+class Params(Generic[Unpack[_Ts]]):
+ """
+ Wraps a sequence of Param types as the first argument to Callable
+ to distinguish extended callable format from the standard format.
+ """
+
+ ...
+
+# --- Type Introspection Operators ---
+
+_Base = TypeVar("_Base")
+_Idx = TypeVar("_Idx")
+_S1 = TypeVar("_S1")
+_S2 = TypeVar("_S2")
+_Start = TypeVar("_Start")
+_End = TypeVar("_End")
+
+@_type_operator
+class GetArg(Generic[_T, _Base, _Idx]):
+ """
+ Get type argument at index _Idx from _T when viewed as _Base.
+ Returns Never if _T does not inherit from _Base or index is out of bounds.
+ """
+
+ ...
+
+@_type_operator
+class GetArgs(Generic[_T, _Base]):
+ """
+ Get all type arguments from _T when viewed as _Base, as a tuple.
+ Returns Never if _T does not inherit from _Base.
+ """
+
+ ...
+
+@_type_operator
+class GetMember(Generic[_T, _Name]):
+ """
+ Get the Member type for attribute _Name from type _T.
+ _Name must be a Literal[str].
+ Returns Never if the member does not exist.
+ """
+
+ ...
+
+@_type_operator
+class GetMemberType(Generic[_T, _Name]):
+ """
+ Get the type of attribute _Name from type _T.
+ _Name must be a Literal[str].
+ """
+
+ ...
+
+@_type_operator
+class Members(Generic[_T]):
+ """
+ Get all members of type _T as a tuple of Member types.
+ Includes methods, class variables, and instance attributes.
+ """
+
+ ...
+
+@_type_operator
+class Attrs(Generic[_T]):
+ """
+ Get annotated instance attributes of _T as a tuple of Member types.
+ Excludes methods and ClassVar members.
+ """
+
+ ...
+
+@_type_operator
+class FromUnion(Generic[_T]):
+ """
+ Convert a union type to a tuple of its constituent types.
+ If _T is not a union, returns a 1-tuple containing _T.
+ """
+
+ ...
+
+# --- Member/Param Accessors (defined as type aliases using GetMemberType) ---
+
+# _MP = TypeVar("_MP", bound=Member[Any, Any, Any, Any, Any] | Param[Any, Any, Any])
+# _M = TypeVar("_M", bound=Member[Any, Any, Any, Any, Any])
+
+_MP = TypeVar("_MP")
+_M = TypeVar("_M")
+
+
+GetName = GetMemberType[_MP, Literal["name"]]
+GetType = GetMemberType[_MP, Literal["type"]]
+GetQuals = GetMemberType[_MP, Literal["quals"]]
+GetInit = GetMemberType[_M, Literal["init"]]
+GetDefiner = GetMemberType[_M, Literal["definer"]]
+
+# --- Type Construction Operators ---
+
+_Ts = TypeVarTuple("_Ts")
+
+@_type_operator
+class NewProtocol(Generic[Unpack[_Ts]]):
+ """
+ Construct a new structural (protocol) type from Member types.
+ NewProtocol[Member[...], Member[...], ...] creates an anonymous protocol.
+ """
+
+ ...
+
+@_type_operator
+class NewTypedDict(Generic[Unpack[_Ts]]):
+ """
+ Construct a new TypedDict from Member types.
+ NewTypedDict[Member[...], Member[...], ...] creates an anonymous TypedDict.
+ """
+
+ ...
+
+@_type_operator
+class UpdateClass(Generic[Unpack[_Ts]]):
+ """
+ Update an existing class with new members.
+ Can only be used as the return type of a class decorator or __init_subclass__.
+ Members with type Never are removed from the class.
+ """
+
+ ...
+
+@_type_operator
+class _NewUnion(Generic[Unpack[_Ts]]):
+ """
+ Construct a union type from the given type arguments.
+ _NewUnion[int, str, bool] evaluates to int | str | bool.
+ """
+
+ ...
+
+@_type_operator
+class _NewCallable(Generic[Unpack[_Ts]]):
+ """
+ Construct a callable type from Param types and a return type.
+ _NewCallable[Param[...], ..., ReturnType] evaluates to a Callable.
+ """
+
+ ...
+
+# --- Boolean/Conditional Operators ---
+
+@_type_operator
+class IsAssignable(Generic[_T, _Base]):
+ """
+ Type-level assignability check. Evaluates to a type-level boolean.
+ Used in conditional type expressions: `Foo if IsAssignable[T, Base] else Bar`
+ """
+
+ ...
+
+@_type_operator
+class IsEquivalent(Generic[_T, _S]):
+ """
+ Type equivalence check. Returns Literal[True] if T is a subtype of S
+ AND S is a subtype of T.
+ Equivalent to: IsAssignable[T, S] and IsAssignable[S, T]
+ """
+
+ ...
+
+@_type_operator
+class Bool(Generic[_T]):
+ """
+ Check if T contains Literal[True].
+ Returns Literal[True] if T is Literal[True] or a union containing it.
+ Equivalent to: IsAssignable[Literal[True], T] and not IsAssignable[T, Never]
+ """
+
+ ...
+
+@_type_operator
+class Iter(Generic[_T]):
+ """
+ Marks a type for iteration in type comprehensions.
+ `for x in Iter[T]` iterates over elements of tuple type T.
+ """
+
+ ...
+
+@_type_operator
+class Map(Generic[_T]):
+ """
+ Variadic type comprehension.
+ `*Map[(Expr for v in Iter[T] if Cond)]` expands in a variadic context
+ to the tuple of element types produced by the comprehension, equivalent
+ to `*[Expr for v in Iter[T] if Cond]`.
+ """
+
+ ...
+
+# --- String Operations ---
+
+@_type_operator
+class Slice(Generic[_S, _Start, _End]):
+ """
+ Slice a literal string type.
+ Slice[Literal["hello"], Literal[1], Literal[3]] = Literal["el"]
+ """
+
+ ...
+
+@_type_operator
+class Concat(Generic[_S1, _S2]):
+ """
+ Concatenate two literal string types.
+ Concat[Literal["hello"], Literal["world"]] = Literal["helloworld"]
+ """
+
+ ...
+
+@_type_operator
+class Uppercase(Generic[_S]):
+ """Convert literal string to uppercase."""
+
+ ...
+
+@_type_operator
+class Lowercase(Generic[_S]):
+ """Convert literal string to lowercase."""
+
+ ...
+
+@_type_operator
+class Capitalize(Generic[_S]):
+ """Capitalize first character of literal string."""
+
+ ...
+
+@_type_operator
+class Uncapitalize(Generic[_S]):
+ """Lowercase first character of literal string."""
+
+ ...
+
+# --- Annotated Operations ---
+
+@_type_operator
+class GetAnnotations(Generic[_T]):
+ """
+ Extract Annotated metadata from a type.
+ GetAnnotations[Annotated[int, 'foo', 'bar']] = Literal['foo', 'bar']
+ GetAnnotations[int] = Never
+ """
+
+ ...
+
+@_type_operator
+class DropAnnotations(Generic[_T]):
+ """
+ Strip Annotated wrapper from a type.
+ DropAnnotations[Annotated[int, 'foo']] = int
+ DropAnnotations[int] = int
+ """
+
+ ...
+
+# --- Utility Operators ---
+
+@_type_operator
+class Length(Generic[_T]):
+ """
+ Get the length of a tuple type as a Literal[int].
+ Returns Literal[None] for unbounded tuples.
+ """
+
+ ...
+
+@_type_operator
+class RaiseError(Generic[_S, Unpack[_Ts]]):
+ """
+ Emit a type error with the given message.
+ RaiseError[Literal["error message"]] emits the error and returns Never.
+ """
+
+ ...
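The literal-string operators declared above lift ordinary `str` operations to the type level. Their runtime analogues, restating the docstring examples as plain string code (helper names are illustrative; per the `Capitalize`/`Uncapitalize` docstrings, only the first character changes case):

```python
# Runtime analogues of the literal-string type operators:
#   Slice[Literal[s], Literal[i], Literal[j]]  ~  s[i:j]
#   Concat[Literal[a], Literal[b]]             ~  a + b
#   Capitalize / Uncapitalize                  ~  first-char case change only

def slice_(s: str, start: int, end: int) -> str:
    return s[start:end]

def concat(a: str, b: str) -> str:
    return a + b

def capitalize_first(s: str) -> str:
    return s[:1].upper() + s[1:]

def uncapitalize_first(s: str) -> str:
    return s[:1].lower() + s[1:]
```

Note that unlike `str.capitalize()`, the `Capitalize` operator leaves the rest of the string untouched, matching its docstring.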
diff --git a/mypy/typeshed/stdlib/builtins.pyi b/mypy/typeshed/stdlib/builtins.pyi
index 03c3bd2e17c74..28eeae575874e 100644
--- a/mypy/typeshed/stdlib/builtins.pyi
+++ b/mypy/typeshed/stdlib/builtins.pyi
@@ -30,6 +30,7 @@ from _typeshed import (
SupportsRichComparisonT,
SupportsWrite,
)
+from _typeshed.typemap import _type_operator
from collections.abc import Awaitable, Callable, Iterable, Iterator, MutableSet, Reversible, Set as AbstractSet, Sized
from io import BufferedRandom, BufferedReader, BufferedWriter, FileIO, TextIOWrapper
from os import PathLike
@@ -2230,3 +2231,64 @@ if sys.version_info >= (3, 11):
if sys.version_info >= (3, 13):
class PythonFinalizationError(RuntimeError): ...
+
+
+@_type_operator
+class _Cond(Generic[_T, _T1, _T2]):
+ """
+ Type-level conditional expression.
+ _Cond[IsAssignable[T, Base], TrueType, FalseType] evaluates to TrueType if T is a subtype of Base,
+ otherwise FalseType.
+ """
+
+ ...
+
+@_type_operator
+class _And(Generic[_T1, _T2]):
+ """
+ Type-level logical AND.
+ _And[A, B] evaluates to Literal[True] if both A and B are Literal[True],
+ otherwise Literal[False].
+ """
+
+ ...
+
+@_type_operator
+class _Or(Generic[_T1, _T2]):
+ """
+ Type-level logical OR.
+ _Or[A, B] evaluates to Literal[True] if either A or B is Literal[True],
+ otherwise Literal[False].
+ """
+
+ ...
+
+@_type_operator
+class _Not(Generic[_T]):
+ """
+ Type-level logical NOT.
+ _Not[A] evaluates to Literal[True] if A is Literal[False],
+ and Literal[False] if A is Literal[True].
+ """
+
+ ...
+
+@_type_operator
+class _DictEntry(Generic[_T1, _T2]):
+ """
+ Internal type operator for dict comprehension syntax in type context.
+ {k: v for x in foo} desugars to *[_DictEntry[k, v] for x in foo].
+ _DictEntry[name, typ] evaluates to Member[name, typ, Never, Never, Never].
+ """
+
+ ...
+
+@_type_operator
+class _TypeGetAttr(Generic[_T1, _T2]):
+ """
+ Internal type operator for dot notation on types.
+ X[A].attr in type context desugars to _TypeGetAttr[X[A], Literal["attr"]].
+ Semantically equivalent to GetMemberType.
+ """
+
+ ...
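The internal operators above model `Literal[True]`/`Literal[False]` logic at the type level. A small runtime sketch of the semantics the docstrings describe, with plain `bool` standing in for the literal types (function names are illustrative):

```python
# Runtime semantics of the type-level boolean operators; plain bools
# stand in for Literal[True]/Literal[False].

def cond(test: bool, if_true, if_false):
    # _Cond[Test, T, F]: picks T when Test evaluates to Literal[True].
    return if_true if test else if_false

def and_(a: bool, b: bool) -> bool:
    return a and b   # _And[A, B]

def or_(a: bool, b: bool) -> bool:
    return a or b    # _Or[A, B]

def not_(a: bool) -> bool:
    return not a     # _Not[A]
```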
diff --git a/mypy/typeshed/stdlib/typing.pyi b/mypy/typeshed/stdlib/typing.pyi
index 0bced03866439..5a51775962acc 100644
--- a/mypy/typeshed/stdlib/typing.pyi
+++ b/mypy/typeshed/stdlib/typing.pyi
@@ -1186,3 +1186,111 @@ if sys.version_info >= (3, 13):
NoDefault: _NoDefaultType
TypeIs: _SpecialForm
ReadOnly: _SpecialForm
+
+# --- Type-level computation support ---
+
+# HACK: Always import because it's used in mypy internals.
+# FIXME: Don't put this weird internals stuff here.
+from _typeshed.typemap import (
+ _NewCallable as _NewCallable,
+ _NewUnion as _NewUnion,
+)
+
+if sys.version_info >= (3, 15):
+ __all__ += [
+ # Type operators
+ "GetArg",
+ "GetArgs",
+ "GetMember",
+ "GetMemberType",
+ "Members",
+ "Attrs",
+ "FromUnion",
+ "NewProtocol",
+ "NewTypedDict",
+ "UpdateClass",
+ "IsAssignable",
+ "IsEquivalent",
+ "Bool",
+ "Iter",
+ "Map",
+ "Slice",
+ "Concat",
+ "Uppercase",
+ "Lowercase",
+ "Capitalize",
+ "Uncapitalize",
+ "GetAnnotations",
+ "DropAnnotations",
+ "Length",
+ "RaiseError",
+ # Data types
+ "Member",
+ "Param",
+ "PosParam",
+ "PosDefaultParam",
+ "DefaultParam",
+ "NamedParam",
+ "NamedDefaultParam",
+ "ArgsParam",
+ "KwargsParam",
+ "Params",
+ # Accessors
+ "GetName",
+ "GetType",
+ "GetQuals",
+ "GetInit",
+ "GetDefiner",
+ # Type aliases
+ "MemberQuals",
+ "ParamQuals",
+ # Misc
+ "BaseTypedDict",
+ "InitField",
+ ]
+ from _typeshed.typemap import (
+ ArgsParam as ArgsParam,
+ Attrs as Attrs,
+ BaseTypedDict as BaseTypedDict,
+ Bool as Bool,
+ Capitalize as Capitalize,
+ Concat as Concat,
+ DefaultParam as DefaultParam,
+ DropAnnotations as DropAnnotations,
+ FromUnion as FromUnion,
+ GetAnnotations as GetAnnotations,
+ GetArg as GetArg,
+ GetArgs as GetArgs,
+ GetDefiner as GetDefiner,
+ GetInit as GetInit,
+ GetMember as GetMember,
+ GetMemberType as GetMemberType,
+ GetName as GetName,
+ GetQuals as GetQuals,
+ GetType as GetType,
+ InitField as InitField,
+ IsAssignable as IsAssignable,
+ IsEquivalent as IsEquivalent,
+ Iter as Iter,
+ KwargsParam as KwargsParam,
+ Length as Length,
+ Lowercase as Lowercase,
+ Map as Map,
+ Member as Member,
+ MemberQuals as MemberQuals,
+ Members as Members,
+ NamedDefaultParam as NamedDefaultParam,
+ NamedParam as NamedParam,
+ NewProtocol as NewProtocol,
+ NewTypedDict as NewTypedDict,
+ Param as Param,
+ ParamQuals as ParamQuals,
+ Params as Params,
+ PosDefaultParam as PosDefaultParam,
+ PosParam as PosParam,
+ RaiseError as RaiseError,
+ Slice as Slice,
+ Uncapitalize as Uncapitalize,
+ UpdateClass as UpdateClass,
+ Uppercase as Uppercase,
+ )
diff --git a/mypy/typestate.py b/mypy/typestate.py
index d45837dad645a..be7fa7a8ba650 100644
--- a/mypy/typestate.py
+++ b/mypy/typestate.py
@@ -9,7 +9,14 @@
from mypy.nodes import VARIANCE_NOT_READY, TypeInfo
from mypy.server.trigger import make_trigger
-from mypy.types import Instance, Type, TypeVarId, TypeVarType, get_proper_type
+from mypy.types import (
+ Instance,
+ Type,
+ TypeVarId,
+ TypeVarType,
+ get_proper_type,
+ get_proper_type_simple,
+)
MAX_NEGATIVE_CACHE_TYPES: Final = 1000
MAX_NEGATIVE_CACHE_ENTRIES: Final = 10000
@@ -116,7 +123,10 @@ def __init__(self) -> None:
def is_assumed_subtype(self, left: Type, right: Type) -> bool:
for l, r in reversed(self._assuming):
- if get_proper_type(l) == get_proper_type(left) and get_proper_type(
+ # XXX: get_proper_type_simple on assumptions because doing
+ # get_proper_type triggered some infinite
+ # recursions. Think about whether this is right.
+ if get_proper_type_simple(l) == get_proper_type(left) and get_proper_type_simple(
r
) == get_proper_type(right):
return True
@@ -124,9 +134,12 @@ def is_assumed_subtype(self, left: Type, right: Type) -> bool:
def is_assumed_proper_subtype(self, left: Type, right: Type) -> bool:
for l, r in reversed(self._assuming_proper):
- if get_proper_type(l) == get_proper_type(left) and get_proper_type(
+ # XXX: get_proper_type_simple on assumptions because doing
+ # get_proper_type triggered some infinite
+ # recursions. Think about whether this is right.
+ if get_proper_type_simple(l) == get_proper_type(left) and get_proper_type_simple(
r
) == get_proper_type(right):
return True
return False
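`is_assumed_subtype` and `is_assumed_proper_subtype` implement the usual coinductive trick for recursive types: while `A <: B` is being checked, assume it holds so the recursion terminates. A standalone sketch of that assumption-stack pattern, using nested dicts as toy structural types (mypy's version additionally normalizes both sides with the `get_proper_type*` helpers):

```python
# Coinductive subtype checking with an assumption stack, the pattern
# behind TypeState._assuming: when we re-encounter a pair already being
# checked, assume the relation holds so recursive types terminate.

def is_subtype(left, right, assuming=None):
    if assuming is None:
        assuming = []
    # Compare by identity, mirroring how assumptions record type objects.
    if any(l is left and r is right for l, r in assuming):
        return True  # active assumption: currently being checked
    if isinstance(left, dict) and isinstance(right, dict):
        assuming.append((left, right))
        try:
            # Structural width subtyping: left must cover right's keys.
            return all(
                k in left and is_subtype(left[k], right[k], assuming)
                for k in right
            )
        finally:
            assuming.pop()
    return left == right
```

Without the assumption stack, comparing two self-referential structures would recurse forever; with it, the self-referential edge is resolved by the pending assumption.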
diff --git a/mypy/typetraverser.py b/mypy/typetraverser.py
index abd0f6bf3bdfe..f1dc4dce60770 100644
--- a/mypy/typetraverser.py
+++ b/mypy/typetraverser.py
@@ -25,7 +25,9 @@
Type,
TypeAliasType,
TypedDictType,
+ TypeForComprehension,
TypeList,
+ TypeOperatorType,
TypeType,
TypeVarTupleType,
TypeVarType,
@@ -142,6 +144,14 @@ def visit_type_alias_type(self, t: TypeAliasType, /) -> None:
def visit_unpack_type(self, t: UnpackType, /) -> None:
t.type.accept(self)
+ def visit_type_operator_type(self, t: TypeOperatorType, /) -> None:
+ self.traverse_type_list(t.args)
+
+ def visit_type_for_comprehension(self, t: TypeForComprehension, /) -> None:
+ t.element_expr.accept(self)
+ t.iter_type.accept(self)
+ self.traverse_type_list(t.conditions)
+
# Helpers
def traverse_types(self, types: Iterable[Type], /) -> None:
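The new traverser methods follow the visitor convention used throughout `typetraverser.py`: a composite node forwards `accept()` to every child so that generic walks see all nested types. A toy version of that pattern, mirroring `visit_type_for_comprehension` (class names are illustrative):

```python
# Toy traverser showing the visitor pattern used by TypeTraverserVisitor:
# composite nodes forward accept() to each of their children.

class Leaf:
    def __init__(self, name):
        self.name = name

    def accept(self, visitor):
        visitor.visit_leaf(self)


class Comprehension:
    def __init__(self, element, iter_type, conditions):
        self.element = element
        self.iter_type = iter_type
        self.conditions = conditions

    def accept(self, visitor):
        visitor.visit_comprehension(self)


class CollectNames:
    def __init__(self):
        self.names = []

    def visit_leaf(self, t):
        self.names.append(t.name)

    def visit_comprehension(self, t):
        # Mirrors visit_type_for_comprehension: traverse every child.
        t.element.accept(self)
        t.iter_type.accept(self)
        for c in t.conditions:
            c.accept(self)
```

A walk that forgot any child (e.g. the conditions) would silently miss type variables nested there, which is why the traverser visits all three.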
diff --git a/test-data/unit/check-fastparse.test b/test-data/unit/check-fastparse.test
index 7ee5a9c432169..a357aa8dddadd 100644
--- a/test-data/unit/check-fastparse.test
+++ b/test-data/unit/check-fastparse.test
@@ -34,26 +34,38 @@ def f(x): # E: Invalid type comment or annotation
# All of these should not crash
from typing import Callable, Tuple, Iterable
-x: Tuple[int, str].x # E: Invalid type comment or annotation
-a: Iterable[x].x # E: Invalid type comment or annotation
+x: Tuple[int, str].x # E: Dot notation .x requires a Member or Param type, got tuple[int, str]
+a: Iterable[x].x # E: Dot notation .x requires a Member or Param type, got typing.Iterable[x?] \
+ # E: Variable "__main__.x" is not valid as a type \
+ # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
b: Tuple[x][x] # E: Invalid type comment or annotation
c: Iterable[x][x] # E: Invalid type comment or annotation
d: Callable[..., int][x] # E: Invalid type comment or annotation
-e: Callable[..., int].x # E: Invalid type comment or annotation
-
-f = None # type: Tuple[int, str].x # E: Invalid type comment or annotation
-g = None # type: Iterable[x].x # E: Invalid type comment or annotation
+e: Callable[..., int].x # E: Dot notation .x requires a Member or Param type, got def (*Any, **Any) -> int
+
+f = None # type: Tuple[int, str].x # E: Dot notation .x requires a Member or Param type, got tuple[int, str] \
+ # E: Incompatible types in assignment (expression has type "None", variable has type "Never")
+g = None # type: Iterable[x].x # E: Variable "__main__.x" is not valid as a type \
+ # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases \
+ # E: Dot notation .x requires a Member or Param type, got typing.Iterable[x?] \
+ # E: Incompatible types in assignment (expression has type "None", variable has type "Never")
h = None # type: Tuple[x][x] # E: Invalid type comment or annotation
i = None # type: Iterable[x][x] # E: Invalid type comment or annotation
j = None # type: Callable[..., int][x] # E: Invalid type comment or annotation
-k = None # type: Callable[..., int].x # E: Invalid type comment or annotation
+k = None # type: Callable[..., int].x # E: Dot notation .x requires a Member or Param type, got def (*Any, **Any) -> int \
+ # E: Incompatible types in assignment (expression has type "None", variable has type "Never")
-def f1(x: Tuple[int, str].x) -> None: pass # E: Invalid type comment or annotation
-def f2(x: Iterable[x].x) -> None: pass # E: Invalid type comment or annotation
+def f1(x: Tuple[int, str].x) -> None: pass # E: Dot notation .x requires a Member or Param type, got tuple[int, str]
+def f2(x: Iterable[x].x) -> None: pass # E: Dot notation .x requires a Member or Param type, got typing.Iterable[x?] \
+ # E: Variable "__main__.x" is not valid as a type \
+ # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#variables-vs-type-aliases
def f3(x: Tuple[x][x]) -> None: pass # E: Invalid type comment or annotation
def f4(x: Iterable[x][x]) -> None: pass # E: Invalid type comment or annotation
def f5(x: Callable[..., int][x]) -> None: pass # E: Invalid type comment or annotation
-def f6(x: Callable[..., int].x) -> None: pass # E: Invalid type comment or annotation
+def f6(x: Callable[..., int].x) -> None: pass # E: Dot notation .x requires a Member or Param type, got def (*Any, **Any) -> int
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
[case testFastParseTypeWithIgnore]
def f(x, # type: x # type: ignore
diff --git a/test-data/unit/check-incremental.test b/test-data/unit/check-incremental.test
index 5b256b1731e01..3cfa6b90a74d6 100644
--- a/test-data/unit/check-incremental.test
+++ b/test-data/unit/check-incremental.test
@@ -239,6 +239,207 @@ def baz() -> int:
[rechecked mod2]
[stale]
+[case testIncrementalTypeOperator]
+# flags: --python-version 3.14
+import a
+[file a.py]
+from typing import (
+ TypeVar,
+ NewTypedDict,
+ Iter,
+ Attrs,
+ GetName,
+ GetType,
+ Member,
+)
+
+T = TypeVar("T")
+PropsOnly = list[
+ NewTypedDict[
+ *[
+ Member[GetName[p], GetType[p]]
+ for p in Iter[Attrs[T]]
+ ]
+ ]
+]
+[file a.py.2]
+from typing import (
+ TypeVar,
+ NewTypedDict,
+ Iter,
+ Attrs,
+ GetName,
+ GetType,
+ Member,
+)
+
+T = TypeVar("T")
+PropsOnly = list[
+ NewTypedDict[
+ *[
+ Member[GetName[p], GetType[p]]
+ for p in Iter[Attrs[T]]
+ ]
+ ]
+]
+# dummy change
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testIncrementalNewProtocol]
+# flags: --python-version 3.14
+import a
+import b
+
+[file a.py]
+
+from typing import TypeVar, NewProtocol, Member, Literal, Iter
+
+# Basic NewProtocol creation
+MyProto = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+x: MyProto
+
+T = TypeVar("T")
+LinkedList = NewProtocol[
+ Member[Literal["data"], T],
+ Member[Literal["next"], LinkedList[T]],
+]
+
+z: LinkedList[str]
+
+lol: tuple[*[t for t in Iter[tuple[MyProto]]]]
+
+asdf: NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+
+[file b.py]
+from a import MyProto, LinkedList
+from typing import NewProtocol, Member, Literal
+
+x: MyProto
+
+class Good:
+ x: int
+ y: str
+
+def takes_proto(p: MyProto) -> None:
+ pass
+
+takes_proto(Good())
+
+z: LinkedList[str]
+
+# A different NewProtocol that should be incompatible with MyProto
+OtherProto = NewProtocol[
+ Member[Literal["bar"], int],
+]
+other: OtherProto
+x = other
+
+[file b.py.2]
+from a import MyProto, LinkedList
+from typing import NewProtocol, Member, Literal
+
+x: MyProto
+
+class Good:
+ x: int
+ y: str
+
+def takes_proto(p: MyProto) -> None:
+ pass
+
+takes_proto(Good())
+
+z: LinkedList[str]
+
+# A different NewProtocol that should be incompatible with MyProto
+OtherProto = NewProtocol[
+ Member[Literal["bar"], int],
+]
+other: OtherProto
+x = other
+
+# dummy change
+
+[out]
+tmp/b.py:22: error: Incompatible types in assignment (expression has type "NewProtocol[bar: int]", variable has type "NewProtocol[x: int, y: str]")
+[out2]
+tmp/b.py:22: error: Incompatible types in assignment (expression has type "NewProtocol[bar: int]", variable has type "NewProtocol[x: int, y: str]")
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testIncrementalUpdateClassWithNewProtocol]
+# flags: --python-version 3.14
+import a
+import b
+
+[file a.py]
+from typing import TypeVar, Literal, Member, NewProtocol, UpdateClass
+
+MyProto = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+
+T = TypeVar("T")
+def add_proto(cls: type[T]) -> UpdateClass[
+ Member[Literal["proto"], MyProto],
+]:
+ ...
+
+@add_proto
+class Foo:
+ z: float
+
+[file b.py]
+from a import Foo, MyProto
+
+reveal_type(Foo().proto)
+reveal_type(Foo().z)
+
+class Good:
+ x: int
+ y: str
+
+g: Good
+p: MyProto
+p = g
+
+[file b.py.2]
+from a import Foo, MyProto
+
+reveal_type(Foo().proto)
+reveal_type(Foo().z)
+
+class Good:
+ x: int
+ y: str
+
+g: Good
+p: MyProto
+p = g
+
+# dummy change
+
+[out]
+tmp/b.py:3: note: Revealed type is "NewProtocol[x: builtins.int, y: builtins.str]"
+tmp/b.py:4: note: Revealed type is "builtins.float"
+[out2]
+tmp/b.py:3: note: Revealed type is "NewProtocol[x: builtins.int, y: builtins.str]"
+tmp/b.py:4: note: Revealed type is "builtins.float"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
[case testIncrementalMethodInterfaceChange]
import mod1
@@ -2339,6 +2540,7 @@ tmp/c.py:1: error: Module "d" has no attribute "x"
[out2]
mypy: error: cannot read file 'tmp/nonexistent.py': No such file or directory
+-- balance the quotes to fix syntax highlighting '
[case testSerializeAbstractPropertyIncremental]
from abc import abstractmethod
import typing
@@ -2817,8 +3019,9 @@ x = b.c.A()
import c
[file c.py]
+FOO = 1
class A:
- x = 1
+ x = FOO
[file d.py]
import a
@@ -2829,8 +3032,9 @@ import b
x: b.c.A
[file c.py.3]
+FOO = 2
class A:
- x = 2
+ x = FOO
[file d.py.4]
import a
diff --git a/test-data/unit/check-kwargs-unpack-typevar.test b/test-data/unit/check-kwargs-unpack-typevar.test
new file mode 100644
index 0000000000000..f26803795bde0
--- /dev/null
+++ b/test-data/unit/check-kwargs-unpack-typevar.test
@@ -0,0 +1,266 @@
+[case testUnpackTypeVarKwargsBasicAccepted]
+# flags: --python-version 3.12
+# Test that TypeVar with TypedDict bound is accepted in **kwargs
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def f[K: BaseTypedDict](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsInvalidBoundInt]
+from typing import TypeVar, Unpack
+
+T = TypeVar('T', bound=int)
+
+def f(**kwargs: Unpack[T]) -> None: # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
+ pass
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsNoBound]
+from typing import TypeVar, Unpack
+
+T = TypeVar('T')
+
+def f(**kwargs: Unpack[T]) -> None: # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
+ pass
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsConcreteTypedDictStillWorks]
+# Test that concrete TypedDict still works as before
+from typing import TypedDict, Unpack
+
+class TD(TypedDict):
+ x: int
+ y: str
+
+def f(**kwargs: Unpack[TD]) -> None:
+ pass
+
+f(x=1, y="hello")
+f(x=1) # E: Missing named argument "y" for "f"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsInferBasic]
+# flags: --python-version 3.12
+# Test that kwargs TypeVar inference works
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def f[K: BaseTypedDict](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+result = f(x=1, y="hello")
+reveal_type(result) # N: Revealed type is "TypedDict('__main__.BaseTypedDict', {'x': Literal[1], 'y': Literal['hello']})"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsInferEmpty]
+# flags: --python-version 3.12
+# Test empty kwargs infers empty TypedDict
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def f[K: BaseTypedDict](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+result = f()
+reveal_type(result) # N: Revealed type is "TypedDict('__main__.BaseTypedDict', {})"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsWithPositionalParam]
+# flags: --python-version 3.12
+# Test with positional parameter
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def g[K: BaseTypedDict](a: int, **kwargs: Unpack[K]) -> tuple[int, K]:
+ return (a, kwargs)
+
+result = g(1, name="test", count=42)
+reveal_type(result) # N: Revealed type is "tuple[builtins.int, TypedDict('__main__.BaseTypedDict', {'name': Literal['test'], 'count': Literal[42]})]"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsNotGoingToKwargs]
+# flags: --python-version 3.12
+# Test that explicit keyword params don't go to kwargs TypeVar
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def h[K: BaseTypedDict](*, required: str, **kwargs: Unpack[K]) -> K:
+ return kwargs
+
+# 'required' goes to explicit param, only 'extra' goes to kwargs
+result = h(required="yes", extra=42)
+reveal_type(result) # N: Revealed type is "TypedDict('__main__.BaseTypedDict', {'extra': Literal[42]})"
+
+# Only explicit params, no extra kwargs
+result2 = h(required="yes")
+reveal_type(result2) # N: Revealed type is "TypedDict('__main__.BaseTypedDict', {})"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsBoundWithRequiredFields]
+# flags: --python-version 3.12
+# Test that bound TypedDict fields are required
+from typing import TypedDict, Unpack
+
+class BaseTD(TypedDict):
+ x: int
+
+def f[K: BaseTD](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+# Missing required field 'x' from the bound - inferred TypedDict doesn't satisfy bound
+f() # E: Value of type variable "K" of "f" cannot be "BaseTD"
+f(y="hello") # E: Value of type variable "K" of "f" cannot be "BaseTD"
+
+# Providing 'x' satisfies the bound
+result1 = f(x=1)
+reveal_type(result1) # N: Revealed type is "TypedDict('__main__.BaseTD', {'x': builtins.int})"
+
+# Extra fields are allowed and inferred
+result2 = f(x=1, y="hello", z=True)
+reveal_type(result2) # N: Revealed type is "TypedDict('__main__.BaseTD', {'x': builtins.int, 'y': Literal['hello'], 'z': Literal[True]})"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsBoundWithNotRequired1]
+# flags: --python-version 3.12
+# Test that NotRequired fields from bound can be omitted
+from typing import TypedDict, NotRequired, Unpack
+
+class BaseTDWithOptional(TypedDict):
+ x: int
+ y: NotRequired[str]
+
+def g[K: BaseTDWithOptional](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+# Can omit NotRequired field 'y'
+result1 = g(x=1)
+reveal_type(result1) # N: Revealed type is "TypedDict('__main__.BaseTDWithOptional', {'x': builtins.int, 'y'?: builtins.str})"
+
+# Can provide NotRequired field 'y'
+result2 = g(x=1, y="hello")
+reveal_type(result2) # N: Revealed type is "TypedDict('__main__.BaseTDWithOptional', {'x': builtins.int, 'y'?: builtins.str})"
+
+# Still need required field 'x'
+g(y="hello") # E: Value of type variable "K" of "g" cannot be "BaseTDWithOptional"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testUnpackTypeVarKwargsBoundWithNotRequired2]
+# flags: --python-version 3.12
+# Test that NotRequired and ReadOnly fields from the bound can be omitted
+from typing import TypedDict, NotRequired, ReadOnly, Unpack
+
+class BaseTDWithOptional(TypedDict):
+ x: ReadOnly[int]
+ y: ReadOnly[NotRequired[str]]
+
+def g[K: BaseTDWithOptional](**kwargs: Unpack[K]) -> K:
+ return kwargs
+
+# Can omit NotRequired field 'y'
+result1 = g(x=1)
+reveal_type(result1) # N: Revealed type is "TypedDict('__main__.BaseTDWithOptional', {'x'=: Literal[1], 'y'?=: Never})"
+
+# Can provide NotRequired field 'y'
+result2 = g(x=1, y="hello")
+reveal_type(result2) # N: Revealed type is "TypedDict('__main__.BaseTDWithOptional', {'x'=: Literal[1], 'y'?=: Literal['hello']})"
+
+# Still need required field 'x'
+g(y="hello") # E: Value of type variable "K" of "g" cannot be "BaseTDWithOptional"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testUnpackTypeVarKwargsBasicMixed]
+# flags: --python-version 3.12
+# Test TypeVar-bound **kwargs mixed with unpacking a concrete TypedDict at the call site
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+class Args(TypedDict):
+ x: int
+ y: str
+
+
+def f[K: BaseTypedDict](x: int, **kwargs: Unpack[K]) -> K:
+ return kwargs
+
+
+kwargs: Args
+reveal_type(f(**kwargs)) # N: Revealed type is "TypedDict('__main__.BaseTypedDict', {'y': builtins.str})"
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testUnpackTypeVarKwargsInitField]
+# flags: --python-version 3.12
+# Test TypeVar-bound **kwargs in __init__ and inference through subclasses
+from typing import TypedDict, Unpack
+
+class BaseTypedDict(TypedDict):
+ pass
+
+class Args(TypedDict):
+ x: int
+
+
+class InitField[KwargDict: BaseTypedDict]:
+ def __init__(self, **kwargs: Unpack[KwargDict]) -> None:
+ ...
+
+
+class Field[KwargDict: Args](InitField[KwargDict]):
+ pass
+
+# XXX: mypy displays last_known_value with "?"s on instances whose result
+# is not assigned to a variable.
+# TODO: Decide whether that is intentional.
+x = InitField(x=10, y='lol')
+reveal_type(x) # N: Revealed type is "__main__.InitField[TypedDict('__main__.BaseTypedDict', {'x': Literal[10], 'y': Literal['lol']})]"
+
+a = Field(x=10, y='lol')
+reveal_type(a) # N: Revealed type is "__main__.Field[TypedDict('__main__.Args', {'x': builtins.int, 'y': Literal['lol']})]"
+
+# TODO: These error messages are terrible and also wrong
+Field(y='lol') # E: Value of type variable "KwargDict" of "Field" cannot be "Args"
+Field(x='asdf') # E: Value of type variable "KwargDict" of "Field" cannot be "Args"
+
+
+[builtins fixtures/dict.pyi]
+[typing fixtures/typing-full.pyi]
diff --git a/test-data/unit/check-type-aliases.test b/test-data/unit/check-type-aliases.test
index 69474e2850055..396fa55ba44eb 100644
--- a/test-data/unit/check-type-aliases.test
+++ b/test-data/unit/check-type-aliases.test
@@ -1449,3 +1449,14 @@ Alias1: TypeAlias = Union[C, int]
class SomeClass:
pass
[builtins fixtures/tuple.pyi]
+
+[case testLooksLikeAlias]
+
+# Test that we correctly catch type errors that depend on an iterator
+# inside something that *seems* like it could be an alias.
+
+# This was written because an attempted workaround for a different
+# problem broke this case at one point.
+
+x = [0]
+y = x[[i for i in [None]][0]] # E: Invalid index type "None" for "list[int]"; expected type "int"
diff --git a/test-data/unit/check-typelevel-basic.test b/test-data/unit/check-typelevel-basic.test
new file mode 100644
index 0000000000000..7300f069cd29e
--- /dev/null
+++ b/test-data/unit/check-typelevel-basic.test
@@ -0,0 +1,1416 @@
+[case testTypeOperatorIsAssignable]
+from typing import IsAssignable
+
+# IsAssignable checks whether the first type is assignable to the second
+x: IsAssignable[int, object]
+reveal_type(x) # N: Revealed type is "Literal[True]"
+
+y: IsAssignable[int, str]
+reveal_type(y) # N: Revealed type is "Literal[False]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorIsEquivalent]
+from typing import IsEquivalent, Literal
+
+# IsEquivalent checks type equivalence (both directions of subtype)
+x1: IsEquivalent[int, int]
+reveal_type(x1) # N: Revealed type is "Literal[True]"
+
+x2: IsEquivalent[int, object]
+reveal_type(x2) # N: Revealed type is "Literal[False]"
+
+x3: IsEquivalent[Literal[1], Literal[1]]
+reveal_type(x3) # N: Revealed type is "Literal[True]"
+
+x4: IsEquivalent[Literal[1], int]
+reveal_type(x4) # N: Revealed type is "Literal[False]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorBool]
+# flags: --python-version 3.10
+from typing import Bool, Literal, IsAssignable, Union
+
+# Bool checks if type contains Literal[True]
+x1: Bool[Literal[True]]
+reveal_type(x1) # N: Revealed type is "Literal[True]"
+
+x2: Bool[Literal[False]]
+reveal_type(x2) # N: Revealed type is "Literal[False]"
+
+x3: Bool[Union[Literal[True], Literal[False]]]
+reveal_type(x3) # N: Revealed type is "Literal[True]"
+
+x4: Bool[bool]
+reveal_type(x4) # N: Revealed type is "Literal[True]"
+
+# Using Bool to check result of IsAssignable
+x5: Bool[IsAssignable[int, object]]
+reveal_type(x5) # N: Revealed type is "Literal[True]"
+
+x6: Bool[IsAssignable[int, str]]
+reveal_type(x6) # N: Revealed type is "Literal[False]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorAnd]
+from typing import IsAssignable, Literal
+
+# _And is the internal representation for `and` in type booleans
+# We test it directly via IsAssignable compositions
+
+# True and True = True
+x1: Literal[True] if IsAssignable[int, object] and IsAssignable[str, object] else Literal[False]
+reveal_type(x1) # N: Revealed type is "Literal[True]"
+
+# True and False = False
+x2: Literal[True] if IsAssignable[int, object] and IsAssignable[int, str] else Literal[False]
+reveal_type(x2) # N: Revealed type is "Literal[False]"
+
+# False and True = False (short-circuit)
+x3: Literal[True] if IsAssignable[int, str] and IsAssignable[str, object] else Literal[False]
+reveal_type(x3) # N: Revealed type is "Literal[False]"
+
+# False and False = False
+x4: Literal[True] if IsAssignable[int, str] and IsAssignable[str, int] else Literal[False]
+reveal_type(x4) # N: Revealed type is "Literal[False]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorOr]
+from typing import IsAssignable, Literal
+
+# _Or is the internal representation for `or` in type booleans
+
+# True or True = True
+x1: Literal[True] if IsAssignable[int, object] or IsAssignable[str, object] else Literal[False]
+reveal_type(x1) # N: Revealed type is "Literal[True]"
+
+# True or False = True (short-circuit)
+x2: Literal[True] if IsAssignable[int, object] or IsAssignable[int, str] else Literal[False]
+reveal_type(x2) # N: Revealed type is "Literal[True]"
+
+# False or True = True
+x3: Literal[True] if IsAssignable[int, str] or IsAssignable[str, object] else Literal[False]
+reveal_type(x3) # N: Revealed type is "Literal[True]"
+
+# False or False = False
+x4: Literal[True] if IsAssignable[int, str] or IsAssignable[str, int] else Literal[False]
+reveal_type(x4) # N: Revealed type is "Literal[False]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNot]
+from typing import IsAssignable, Literal
+
+# _Not is the internal representation for `not` in type booleans
+
+# not True = False
+x1: Literal[True] if not IsAssignable[int, object] else Literal[False]
+reveal_type(x1) # N: Revealed type is "Literal[False]"
+
+# not False = True
+x2: Literal[True] if not IsAssignable[int, str] else Literal[False]
+reveal_type(x2) # N: Revealed type is "Literal[True]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorBoolCombinations]
+from typing import IsAssignable, Literal
+
+# Test combinations of and, or, not
+
+# not (True and False) = True
+x1: Literal[True] if not (IsAssignable[int, object] and IsAssignable[int, str]) else Literal[False]
+reveal_type(x1) # N: Revealed type is "Literal[True]"
+
+# (True or False) and True = True
+x2: Literal[True] if (IsAssignable[int, object] or IsAssignable[int, str]) and IsAssignable[str, object] else Literal[False]
+reveal_type(x2) # N: Revealed type is "Literal[True]"
+
+# not False or False = True
+x3: Literal[True] if not IsAssignable[int, str] or IsAssignable[str, int] else Literal[False]
+reveal_type(x3) # N: Revealed type is "Literal[True]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorTernarySyntax]
+from typing import IsAssignable
+
+# Ternary syntax should be converted to _Cond[condition, true_type, false_type]
+x: str if IsAssignable[int, object] else int
+
+x = 0 # E: Incompatible types in assignment (expression has type "int", variable has type "str")
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorInGenericClass]
+# Test type operators in generic class context
+from typing import Generic, TypeVar, IsAssignable
+
+T = TypeVar('T')
+
+class MyClass(Generic[T]):
+ attr: str if IsAssignable[T, int] else float
+
+ def method(self, x: list[T] if IsAssignable[T, str] else T) -> None:
+ pass
+
+x: MyClass[int]
+reveal_type(x.attr) # N: Revealed type is "builtins.str"
+x.method(0)
+x.method([0]) # E: Argument 1 to "method" of "MyClass" has incompatible type "list[int]"; expected "int"
+
+y: MyClass[object]
+reveal_type(y.attr) # N: Revealed type is "builtins.float"
+y.method(0)
+y.method([0])
+
+z: MyClass[str]
+reveal_type(z.attr) # N: Revealed type is "builtins.float"
+z.method('0') # E: Argument 1 to "method" of "MyClass" has incompatible type "str"; expected "list[str]"
+z.method(['0'])
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNestedTernary]
+# Test nested ternary expressions
+from typing import IsAssignable
+
+# Nested ternary should work
+z: int if IsAssignable[int, str] else (float if IsAssignable[float, object] else str)
+z = 'xxx' # E: Incompatible types in assignment (expression has type "str", variable has type "float")
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMultipleTypeArgs]
+# Test type operators with various argument types
+from typing import Generic, TypeVar, IsAssignable
+
+T = TypeVar('T')
+U = TypeVar('U')
+
+class Container(Generic[T, U]):
+ # Complex conditional with multiple type variables
+ value: list[U] if IsAssignable[T, int] else tuple[T, U]
+
+x: Container[int, bool]
+reveal_type(x.value) # N: Revealed type is "builtins.list[builtins.bool]"
+
+y: Container[str, bool]
+reveal_type(y.value) # N: Revealed type is "tuple[builtins.str, builtins.bool]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorCall0]
+# Test type operators in function signatures
+from typing import TypeVar, IsAssignable
+
+T = TypeVar('T')
+
+# XXX: resolving this seems basically impossible!!
+def process(
+ x: bytes if IsAssignable[T, str] else T,
+ y: str if IsAssignable[T, int] else float
+) -> int if IsAssignable[T, list[object]] else str:
+ ...
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorCall1]
+# flags: --python-version 3.14
+# Test type operators in function signatures
+from typing import IsAssignable
+
+def process[T](x: T) -> IsAssignable[T, str]:
+ ...
+
+reveal_type(process(0)) # N: Revealed type is "Literal[False]"
+reveal_type(process('test')) # N: Revealed type is "Literal[True]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorCall2]
+# flags: --python-version 3.14
+
+# Test a type operator in an *argument*, but one that should be resolvable
+# No sweat!!
+from typing import Callable, IsAssignable, Literal
+
+def process[T](x: T, y: IsAssignable[T, str]) -> None:
+ ...
+
+process(0, False)
+process(0, True) # E: Argument 2 to "process" has incompatible type "Literal[True]"; expected "Literal[False]"
+process('test', False) # E: Argument 2 to "process" has incompatible type "Literal[False]"; expected "Literal[True]"
+process('test', True)
+
+x0: Callable[[int, Literal[False]], None] = process
+x1: Callable[[int, Literal[True]], None] = process # E: Incompatible types in assignment (expression has type "Callable[[T, typing.IsAssignable[T, str]], None]", variable has type "Callable[[int, Literal[True]], None]")
+x2: Callable[[str, Literal[False]], None] = process # E: Incompatible types in assignment (expression has type "Callable[[T, typing.IsAssignable[T, str]], None]", variable has type "Callable[[str, Literal[False]], None]")
+x3: Callable[[str, Literal[True]], None] = process
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorCall3]
+# flags: --python-version 3.14
+
+# Test a type operator in an *argument*, but one that should be resolvable
+# No sweat!!
+from typing import Callable, IsAssignable, Literal
+
+def process[T](x: IsAssignable[T, str], y: T) -> None:
+ ...
+
+process(False, 0)
+process(True, 0) # E: Argument 1 to "process" has incompatible type "Literal[True]"; expected "Literal[False]"
+process(False, 'test') # E: Argument 1 to "process" has incompatible type "Literal[False]"; expected "Literal[True]"
+process(True, 'test')
+
+x0: Callable[[Literal[False], int], None] = process
+x1: Callable[[Literal[True], int], None] = process # E: Incompatible types in assignment (expression has type "Callable[[typing.IsAssignable[T, str], T], None]", variable has type "Callable[[Literal[True], int], None]")
+x2: Callable[[Literal[False], str], None] = process # E: Incompatible types in assignment (expression has type "Callable[[typing.IsAssignable[T, str], T], None]", variable has type "Callable[[Literal[False], str], None]")
+x3: Callable[[Literal[True], str], None] = process
+
+reveal_type(process) # N: Revealed type is "def [T] (x: typing.IsAssignable[T`-1, builtins.str], y: T`-1)"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorInTypeAlias]
+# flags: --python-version 3.14
+# Test type operators in type aliases (using concrete types to avoid unbound typevar issues)
+from typing import IsAssignable
+
+# Type alias using _Cond with concrete types
+type ConditionalType = str if IsAssignable[int, object] else bytes
+z: ConditionalType = b'lol' # E: Incompatible types in assignment (expression has type "bytes", variable has type "str")
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGenericTypeAlias]
+# flags: --python-version 3.14
+# Test generic type alias with conditional type
+from typing import IsAssignable
+
+type IntOptional[T] = T | None if IsAssignable[T, int] else T
+
+x0: IntOptional[int]
+reveal_type(x0) # N: Revealed type is "builtins.int | None"
+
+x1: IntOptional[str]
+reveal_type(x1) # N: Revealed type is "builtins.str"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorFromUnion]
+# flags: --python-version 3.10
+# Test FromUnion operator
+from typing import FromUnion
+
+x: FromUnion[int | str | float]
+reveal_type(x) # N: Revealed type is "tuple[builtins.int, builtins.str, builtins.float]"
+
+# Non-union becomes 1-tuple
+y: FromUnion[int]
+reveal_type(y) # N: Revealed type is "tuple[builtins.int]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNewUnion]
+# flags: --python-version 3.10
+# Test _NewUnion operator
+from typing import _NewUnion
+
+# Basic union construction
+x: _NewUnion[int, str, float]
+reveal_type(x) # N: Revealed type is "builtins.int | builtins.str | builtins.float"
+
+# Single type stays as-is
+y: _NewUnion[int]
+reveal_type(y) # N: Revealed type is "builtins.int"
+
+# Nested unions get flattened
+z: _NewUnion[int, str | bytes]
+reveal_type(z) # N: Revealed type is "builtins.int | builtins.str | builtins.bytes"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorToUnion]
+# flags: --python-version 3.10
+# Test constructing a Union from unpacked tuples and comprehensions
+from typing import Union, Iter, Bool, Literal
+
+a: Union[*[x for x in Iter[tuple[int, str]]]]
+reveal_type(a) # N: Revealed type is "builtins.int | builtins.str"
+
+b: Union[*[x for x in Iter[tuple[int, str]] if Bool[Literal[False]]]]
+reveal_type(b) # N: Revealed type is "Never"
+
+c: Union[*tuple[int, str]]
+reveal_type(c) # N: Revealed type is "builtins.int | builtins.str"
+
+d: Union[*tuple[()]]
+reveal_type(d) # N: Revealed type is "Never"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNewUnionWithComprehension]
+# flags: --python-version 3.10
+# Test _NewUnion with comprehension
+from typing import _NewUnion, Iter, Attrs, GetType
+
+class Foo:
+ a: int
+ b: str
+ c: bytes
+
+# Build union of all attribute types
+AttrTypes = _NewUnion[*[GetType[m] for m in Iter[Attrs[Foo]]]]
+x: AttrTypes
+reveal_type(x) # N: Revealed type is "builtins.int | builtins.str | builtins.bytes"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorStringConcat]
+# flags: --python-version 3.10
+# Test string concatenation operator
+from typing import Concat, Literal
+
+x: Concat[Literal["hello"], Literal["world"]]
+reveal_type(x) # N: Revealed type is "Literal['helloworld']"
+
+y: Concat[Literal['a'] | Literal['b'], Literal['c'] | Literal['d']]
+reveal_type(y) # N: Revealed type is "Literal['ac'] | Literal['ad'] | Literal['bc'] | Literal['bd']"
+
+z: Concat[Literal['a', 'b'], Literal['c', 'd']]
+reveal_type(z) # N: Revealed type is "Literal['ac'] | Literal['ad'] | Literal['bc'] | Literal['bd']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorStringUppercase]
+# Test string case operators
+from typing import Uppercase, Lowercase, Capitalize, Uncapitalize, Literal
+
+x1: Uppercase[Literal["hello"]]
+reveal_type(x1) # N: Revealed type is "Literal['HELLO']"
+
+x2: Lowercase[Literal["HELLO"]]
+reveal_type(x2) # N: Revealed type is "Literal['hello']"
+
+x3: Capitalize[Literal["hello"]]
+reveal_type(x3) # N: Revealed type is "Literal['Hello']"
+
+x4: Uncapitalize[Literal["HELLO"]]
+reveal_type(x4) # N: Revealed type is "Literal['hELLO']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorStringSlice]
+# Test string slice operator
+from typing import Slice, Literal
+
+x1: Slice[Literal["hello"], Literal[1], Literal[4]]
+reveal_type(x1) # N: Revealed type is "Literal['ell']"
+
+x2: Slice[Literal["hello"], Literal[0], Literal[2]]
+reveal_type(x2) # N: Revealed type is "Literal['he']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorTupleSlice]
+# Test tuple slice operator
+from typing import Slice, Literal
+
+x1: Slice[tuple[int, str, float, bool], Literal[1], Literal[3]]
+reveal_type(x1) # N: Revealed type is "tuple[builtins.str, builtins.float]"
+
+x2: Slice[tuple[int, str, float], Literal[0], Literal[2]]
+reveal_type(x2) # N: Revealed type is "tuple[builtins.int, builtins.str]"
+
+# Slice from start
+x3: Slice[tuple[int, str, float], None, Literal[2]]
+reveal_type(x3) # N: Revealed type is "tuple[builtins.int, builtins.str]"
+
+# Slice to end
+x4: Slice[tuple[int, str, float], Literal[1], None]
+reveal_type(x4) # N: Revealed type is "tuple[builtins.str, builtins.float]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorLength]
+# Test Length operator
+from typing import Length, Literal, Iter, Tuple
+
+x: Length[tuple[int, str, float]]
+reveal_type(x) # N: Revealed type is "Literal[3]"
+
+y: Length[tuple[int]]
+reveal_type(y) # N: Revealed type is "Literal[1]"
+
+z: Length[tuple[()]]
+reveal_type(z) # N: Revealed type is "Literal[0]"
+
+w: Length[tuple[int, ...]]
+reveal_type(w) # N: Revealed type is "None"
+
+a: Length[tuple[*[x for x in Iter[tuple[int, str, bool]]]]]
+reveal_type(a) # N: Revealed type is "Literal[3]"
+
+b: Length[tuple[str, *tuple[int, ...]]]
+reveal_type(b) # N: Revealed type is "None"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArg1]
+# Test GetArg operator
+from typing import Generic, TypeVar, GetArg, Literal
+
+T = TypeVar('T')
+U = TypeVar('U')
+
+class MyGeneric(Generic[T, U]):
+ pass
+
+# Get first type arg
+x: GetArg[MyGeneric[int, str], MyGeneric, Literal[0]]
+reveal_type(x) # N: Revealed type is "builtins.int"
+
+# Get second type arg
+y: GetArg[MyGeneric[int, str], MyGeneric, Literal[1]]
+reveal_type(y) # N: Revealed type is "builtins.str"
+
+# Works with list
+z: GetArg[list[float], list, Literal[0]]
+reveal_type(z) # N: Revealed type is "builtins.float"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArg2]
+# Test GetArg operator
+from typing import Generic, TypeVar, GetArg, Literal
+
+T = TypeVar('T')
+U = TypeVar('U')
+
+class MyGeneric(Generic[T, U]):
+ pass
+
+class Concrete(MyGeneric[int, str]):
+ pass
+
+# Get first type arg
+x: GetArg[Concrete, MyGeneric, Literal[0]]
+reveal_type(x) # N: Revealed type is "builtins.int"
+
+# Get second type arg
+y: GetArg[Concrete, MyGeneric, Literal[1]]
+reveal_type(y) # N: Revealed type is "builtins.str"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArg3]
+# Test GetArg operator
+from typing import Generic, TypeVar, GetArg, GetArgs, Literal
+
+T = TypeVar('T')
+U = TypeVar('U')
+
+class MyGeneric(Generic[T, U]):
+ pass
+
+class MyGeneric2(MyGeneric[T, str]):
+ pass
+
+class Concrete(MyGeneric2[int]):
+ pass
+
+# Get first type arg
+x: GetArg[Concrete, MyGeneric, Literal[0]]
+reveal_type(x) # N: Revealed type is "builtins.int"
+
+# Get second type arg
+y: GetArg[Concrete, MyGeneric, Literal[1]]
+reveal_type(y) # N: Revealed type is "builtins.str"
+
+args: GetArgs[Concrete, MyGeneric]
+reveal_type(args) # N: Revealed type is "tuple[builtins.int, builtins.str]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArg4]
+# flags: --python-version 3.14
+
+# Test GetArg on tuple types
+from typing import GetArg, Iterable, Literal, Sequence, IsAssignable, TypedDict, Protocol
+
+x0: GetArg[tuple[int, str, float], Iterable, Literal[0]]
+reveal_type(x0) # N: Revealed type is "builtins.int | builtins.str | builtins.float"
+
+x1: GetArg[list[int], Iterable, Literal[0]]
+reveal_type(x1) # N: Revealed type is "builtins.int"
+
+x2: GetArg[list[int], Sequence, Literal[0]]
+reveal_type(x2) # N: Revealed type is "builtins.int"
+
+x3: GetArg[dict[str, bool], Iterable, Literal[0]]
+reveal_type(x3) # N: Revealed type is "builtins.str"
+
+class D(TypedDict):
+ x: int
+ y: str
+
+x4: GetArg[D, Iterable, Literal[0]]
+reveal_type(x4) # N: Revealed type is "builtins.str"
+
+x5: GetArg[type[bool], type, Literal[0]]
+reveal_type(x5) # N: Revealed type is "builtins.bool"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArgNegativeIndex]
+# Test GetArg with negative indexes
+from typing import Generic, TypeVar, GetArg, Literal
+
+T = TypeVar('T')
+U = TypeVar('U')
+V = TypeVar('V')
+
+class MyGeneric(Generic[T, U, V]):
+ pass
+
+# Negative indexes count from the end
+x: GetArg[MyGeneric[int, str, float], MyGeneric, Literal[-1]]
+reveal_type(x) # N: Revealed type is "builtins.float"
+
+y: GetArg[MyGeneric[int, str, float], MyGeneric, Literal[-2]]
+reveal_type(y) # N: Revealed type is "builtins.str"
+
+z: GetArg[MyGeneric[int, str, float], MyGeneric, Literal[-3]]
+reveal_type(z) # N: Revealed type is "builtins.int"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArgTuple]
+# Test GetArg on tuple types
+from typing import GetArg, Literal
+
+# GetArg on tuple with tuple base
+x0: GetArg[tuple[int, str, float], tuple, Literal[0]]
+reveal_type(x0) # N: Revealed type is "builtins.int"
+
+x1: GetArg[tuple[int, str, float], tuple, Literal[1]]
+reveal_type(x1) # N: Revealed type is "builtins.str"
+
+x2: GetArg[tuple[int, str, float], tuple, Literal[2]]
+reveal_type(x2) # N: Revealed type is "builtins.float"
+
+# Negative indexes on tuples
+y: GetArg[tuple[int, str, float], tuple, Literal[-1]]
+reveal_type(y) # N: Revealed type is "builtins.float"
+
+z: GetArg[tuple[int, str, float], tuple, Literal[-2]]
+reveal_type(z) # N: Revealed type is "builtins.str"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorGetArgOutOfBounds]
+# Test GetArg with out-of-bounds indexes produces an error
+from typing import Generic, TypeVar, GetArg, Literal
+
+T = TypeVar('T')
+U = TypeVar('U')
+
+class MyGeneric(Generic[T, U]):
+ pass
+
+# Index too large
+x: GetArg[MyGeneric[int, str], MyGeneric, Literal[2]] # E: GetArg: index out of range for __main__.MyGeneric[int, str] as __main__.MyGeneric[Any, Any]
+reveal_type(x) # N: Revealed type is "Any"
+
+x2: GetArg[MyGeneric[int, str], MyGeneric, Literal[100]] # E: GetArg: index out of range for __main__.MyGeneric[int, str] as __main__.MyGeneric[Any, Any]
+reveal_type(x2) # N: Revealed type is "Any"
+
+# Negative index too small
+y: GetArg[MyGeneric[int, str], MyGeneric, Literal[-3]] # E: GetArg: index out of range for __main__.MyGeneric[int, str] as __main__.MyGeneric[Any, Any]
+reveal_type(y) # N: Revealed type is "Any"
+
+y2: GetArg[MyGeneric[int, str], MyGeneric, Literal[-100]] # E: GetArg: index out of range for __main__.MyGeneric[int, str] as __main__.MyGeneric[Any, Any]
+reveal_type(y2) # N: Revealed type is "Any"
+
+# Out of bounds on tuples
+z: GetArg[tuple[int, str], tuple, Literal[2]] # E: GetArg: index out of range for tuple[int, str] as tuple[Any, ...]
+reveal_type(z) # N: Revealed type is "Any"
+
+z2: GetArg[tuple[int, str], tuple, Literal[-3]] # E: GetArg: index out of range for tuple[int, str] as tuple[Any, ...]
+reveal_type(z2) # N: Revealed type is "Any"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionBasicTy]
+# flags: --python-version 3.14
+# Test basic type comprehension
+from typing import IsAssignable, Iter
+
+class A[*T]:
+ pass
+
+# Basic comprehension - maps over tuple elements
+def f(x: A[*[T for T in Iter[tuple[int, str, float]]]]) -> None:
+ pass
+
+reveal_type(f) # N: Revealed type is "def (x: __main__.A[builtins.int, builtins.str, builtins.float])"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionBasic]
+# Test basic type comprehension
+from typing import IsAssignable, Iter
+
+# Basic comprehension - maps over tuple elements
+def f(x: tuple[*[T for T in Iter[tuple[int, str, float]]]]) -> None:
+ pass
+
+reveal_type(f) # N: Revealed type is "def (x: tuple[builtins.int, builtins.str, builtins.float])"
+
+z: tuple[*[T for T in Iter[tuple[()]]]]
+reveal_type(z) # N: Revealed type is "tuple[()]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionWithCondition]
+# Test type comprehension with filtering condition
+from typing import IsAssignable, Iter
+
+# Filter elements - only keep subtypes of int (int, bool)
+def f(x: tuple[*[T for T in Iter[tuple[int, str, bool, float]] if IsAssignable[T, int]]]) -> None:
+ pass
+
+reveal_type(f) # N: Revealed type is "def (x: tuple[builtins.int, builtins.bool])"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionTransform]
+# Test type comprehension with element transformation
+from typing import Uppercase, Literal, Iter
+
+# Transform elements
+x: tuple[*[Uppercase[T] for T in Iter[tuple[Literal["a"], Literal["b"], Literal["c"]]]]]
+reveal_type(x) # N: Revealed type is "tuple[Literal['A'], Literal['B'], Literal['C']]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeComprehensionAlias1]
+# flags: --python-version 3.14
+# Test type comprehension inside a PEP 695 type alias
+from typing import Uppercase, Literal, Iter
+
+# Transform elements
+type A = tuple[*[Uppercase[T] for T in Iter[tuple[Literal["a"], Literal["b"], Literal["c"]]]]]
+x: A
+reveal_type(x) # N: Revealed type is "tuple[Literal['A'], Literal['B'], Literal['C']]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionAlias2]
+# flags: --python-version 3.14
+# Test type comprehension in a generic type alias
+from typing import Uppercase, Literal, Iter
+
+# Transform elements
+type Trans[U] = tuple[*[Uppercase[T] for T in Iter[U]]]
+x: Trans[tuple[Literal["a"], Literal["b"], Literal["c"]]]
+reveal_type(x) # N: Revealed type is "tuple[Literal['A'], Literal['B'], Literal['C']]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionAlias3]
+# flags: --python-version 3.14
+# Test type comprehension whose variable shadows the alias type parameter
+from typing import Uppercase, Literal, Iter
+
+# XXX: DECISION: Should shadowing be allowed?
+
+# Transform elements
+type Trans[T] = tuple[*[Uppercase[T] for T in Iter[T]]] # E: "T" already defined as a type parameter
+x: Trans[tuple[Literal["a"], Literal["b"], Literal["c"]]]
+reveal_type(x) # N: Revealed type is "tuple[Literal['A'], Literal['B'], Literal['C']]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorSubType0]
+# flags: --python-version 3.14
+
+from typing import IsAssignable
+
+def process[T](x: IsAssignable[T, str], y: IsAssignable[T, int]) -> None:
+ x = y # E: Incompatible types in assignment (expression has type "typing.IsAssignable[T, int]", variable has type "typing.IsAssignable[T, str]")
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorSubType1]
+# flags: --python-version 3.14
+
+from typing import Callable, IsAssignable, Literal
+
+class A:
+ def process[T](self, x: IsAssignable[T, str], y: T) -> None:
+ ...
+
+
+class B(A):
+ pass
+
+
+class C(A):
+ def process[T](self, x: IsAssignable[T, str], y: T) -> None:
+ ...
+
+class D(A):
+ def process[T](self, x: IsAssignable[T, int], y: T) -> None: # E: Argument 1 of "process" is incompatible with supertype "A"; supertype defines the argument type as "typing.IsAssignable[T, str]" \
+ # N: This violates the Liskov substitution principle \
+ # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#incompatible-overrides
+ ...
+
+
+class E(A):
+ def process[T](self, x: str, y: T) -> None: # E: Argument 1 of "process" is incompatible with supertype "A"; supertype defines the argument type as "typing.IsAssignable[T, str]" \
+ # N: This violates the Liskov substitution principle \
+ # N: See https://mypy.readthedocs.io/en/stable/common_issues.html#incompatible-overrides
+ ...
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorSubType2]
+# flags: --python-version 3.14
+
+from typing import Callable, IsAssignable, Literal
+
+class A:
+ def process[T](self, y: T) -> IsAssignable[T, str]:
+ ...
+
+
+class B(A):
+ pass
+
+
+class C(A):
+ def process[T](self, y: T) -> IsAssignable[T, str]:
+ ...
+
+
+class D(A):
+ def process[T](self, y: T) -> IsAssignable[T, int]: # E: Return type "typing.IsAssignable[T, int]" of "process" incompatible with return type "typing.IsAssignable[T, str]" in supertype "A"
+ ...
+
+
+class E(A):
+ def process[T](self, y: T) -> str: # E: Return type "str" of "process" incompatible with return type "typing.IsAssignable[T, str]" in supertype "A"
+ ...
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorSubType3]
+# flags: --python-version 3.14
+
+from typing import Callable, IsAssignable, Literal, Iter
+
+class A:
+ def process[Ts](self, x: Ts) -> tuple[*[t for t in Iter[Ts]]]:
+ ...
+
+
+class B(A):
+ pass
+
+
+# alpha-renamed type variables: the signatures are equivalent
+class C(A):
+ def process[Us](self, x: Us) -> tuple[*[s for s in Iter[Us]]]:
+ ...
+
+
+class D(A):
+ def process[Ts](self, x: Ts) -> tuple[*[int for s in Iter[Ts]]]: # E: Signature of "process" incompatible with supertype "A" \
+ # N: Superclass: \
+ # N: def [Ts] process(self, x: Ts) -> tuple[*[t for t in typing.Iter[Ts]]] \
+ # N: Subclass: \
+ # N: def [Ts] process(self, x: Ts) -> tuple[*[int for s in typing.Iter[Ts]]]
+ ...
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorJoin1]
+# flags: --python-version 3.14
+
+from typing import Callable, IsAssignable
+
+
+def f0[T](x: IsAssignable[T, str], y: IsAssignable[T, str]) -> None:
+ z = [x, y]
+ reveal_type(z) # N: Revealed type is "builtins.list[typing.IsAssignable[T`-1, builtins.str]]"
+
+def f1[T](x: IsAssignable[T, str], y: IsAssignable[T, str] | None) -> None:
+ z = [x, y]
+ reveal_type(z) # N: Revealed type is "builtins.list[typing.IsAssignable[T`-1, builtins.str] | None]"
+
+def g[T](x: IsAssignable[T, str], y: IsAssignable[T, int]) -> None:
+ z = [x, y]
+ reveal_type(z) # N: Revealed type is "builtins.list[builtins.object]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorMeet1]
+# flags: --python-version 3.14
+
+from typing import Callable, IsAssignable
+
+def g[T](
+ x: Callable[[IsAssignable[T, str]], None],
+ x0: Callable[[IsAssignable[T, str]], None],
+ y: Callable[[IsAssignable[T, int]], None],
+) -> None:
+ z = [x, y]
+ reveal_type(z) # N: Revealed type is "builtins.list[builtins.function]"
+
+ w = [x, x0]
+ reveal_type(w) # N: Revealed type is "builtins.list[def (typing.IsAssignable[T`-1, builtins.str])]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorTypedDictParam]
+# flags: --python-version 3.14
+
+from typing import BaseTypedDict, Callable, Attrs, GetName, TypedDict, Iter
+
+def foo[T: BaseTypedDict](x: T) -> tuple[
+ *[GetName[m] for m in Iter[Attrs[T]]]
+]:
+ return tuple(x.keys()) # type: ignore
+
+
+class Sigh(TypedDict):
+ x: int
+ y: str
+
+
+td: Sigh = {'x': 10, 'y': 'hmmm'}
+x = foo(td)
+reveal_type(x) # N: Revealed type is "tuple[Literal['x'], Literal['y']]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorRaiseError]
+# flags: --python-version 3.14
+
+from typing import RaiseError, Literal
+
+x: RaiseError[Literal['custom error message']] # E: custom error message
+reveal_type(x) # N: Revealed type is "Never"
+
+y: RaiseError[Literal['custom error message'], str] # E: custom error message: str
+
+type Alias[T] = RaiseError[Literal['wrong'], T]
+
+z: Alias[bool] # E: wrong: bool
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorRaiseErrorInConditional]
+from typing import RaiseError, IsAssignable, Literal
+
+# RaiseError in a conditional - error only if condition triggers it
+T = int
+x: str if IsAssignable[T, str] else RaiseError[Literal['T must be a str']] # E: T must be a str
+reveal_type(x) # N: Revealed type is "Never"
+
+y: str if IsAssignable[T, int] else RaiseError[Literal['T must be an int']]
+reveal_type(y) # N: Revealed type is "builtins.str"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorConditionalArg1]
+# flags: --python-version 3.14
+
+from typing import IsAssignable
+
+def foo[T](x: T, y: list[T] if IsAssignable[T, int] else T) -> T:
+ return x
+
+
+foo(0, 0) # E: Argument 2 to "foo" has incompatible type "int"; expected "list[int]"
+foo(0, [0])
+foo('a', 'a')
+foo('a', ['a']) # E: Argument 2 to "foo" has incompatible type "list[str]"; expected "str"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorConditionalArg2]
+# flags: --python-version 3.14
+
+from typing import IsAssignable
+
+def foo[T](y: list[T] if IsAssignable[T, int] else T, x: T) -> T:
+ return x
+
+
+foo(0, 0) # E: Argument 1 to "foo" has incompatible type "int"; expected "list[int]"
+foo([0], 0)
+foo('a', 'a')
+foo(['a'], 'a') # E: Argument 1 to "foo" has incompatible type "list[str]"; expected "str"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNewProtocolBasic]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+# Basic NewProtocol creation
+type MyProto = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+
+# Check the protocol works with structural subtyping
+class Good:
+ x: int
+ y: str
+
+class Bad:
+ x: int
+ # Missing y
+
+def takes_proto(p: MyProto) -> None:
+ pass
+
+g: Good
+takes_proto(g)
+
+b: Bad
+takes_proto(b) # E: Argument 1 to "takes_proto" has incompatible type "Bad"; expected "NewProtocol[x: int, y: str]" \
+ # N: "Bad" is missing following "NewProtocol" protocol member: \
+ # N: y
+
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewProtocolIncompatible]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+a0: NewProtocol[
+ Member[Literal["foo"], str],
+]
+b0: NewProtocol[
+ Member[Literal["bar"], int],
+]
+a0 = b0 # E: Incompatible types in assignment (expression has type "NewProtocol[bar: int]", variable has type "NewProtocol[foo: str]")
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewProtocolRecursive1]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+# Recursive NewProtocol creation
+type LinkedList = NewProtocol[
+ Member[Literal["data"], int],
+ Member[Literal["next"], LinkedList],
+]
+
+z: LinkedList
+reveal_type(z) # N: Revealed type is "NewProtocol[data: builtins.int, next: ...]"
+
+reveal_type(z.data) # N: Revealed type is "builtins.int"
+reveal_type(z.next.next.data) # N: Revealed type is "builtins.int"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewProtocolRecursive2]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+# Recursive generic NewProtocol creation
+type LinkedList[T] = NewProtocol[
+ Member[Literal["data"], T],
+ Member[Literal["next"], LinkedList[T]],
+]
+
+z: LinkedList[str]
+reveal_type(z) # N: Revealed type is "NewProtocol[data: builtins.str, next: ...]"
+
+reveal_type(z.data) # N: Revealed type is "builtins.str"
+reveal_type(z.next.next.data) # N: Revealed type is "builtins.str"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewTypedDictRecursive]
+# flags: --python-version 3.14
+
+from typing import NewTypedDict, Member, Literal
+
+# Recursive NewTypedDict creation
+type LinkedList = NewTypedDict[
+ Member[Literal["data"], int],
+ Member[Literal["next"], LinkedList],
+]
+
+z: LinkedList
+reveal_type(z) # N: Revealed type is "TypedDict({'data': builtins.int, 'next': ...})"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewProtocolReturn]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+
+def foo[T, U](x: T, y: U) -> NewProtocol[
+ Member[Literal["x"], T],
+ Member[Literal["y"], U],
+]:
+ raise BaseException
+
+
+res1 = foo('lol', 10)
+reveal_type(res1.x) # N: Revealed type is "builtins.str"
+reveal_type(res1.y) # N: Revealed type is "builtins.int"
+
+res2 = foo(['foo', 'bar'], None)
+reveal_type(res2.x) # N: Revealed type is "builtins.list[builtins.str]"
+reveal_type(res2.y) # N: Revealed type is "None"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorNewProtocolSubtyping]
+# flags: --python-version 3.14
+
+from typing import NewProtocol, Member, Literal
+
+# Create a protocol with two members
+type Point = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], int],
+]
+
+# Any class with x: int and y: int should satisfy Point
+class Point2D:
+ x: int
+ y: int
+
+class Point3D:
+ x: int
+ y: int
+ z: int
+
+class NotAPoint:
+ a: int
+ b: int
+
+def draw(p: Point) -> None:
+ pass
+
+p2d: Point2D
+p3d: Point3D
+nap: NotAPoint
+
+draw(p2d) # OK - has x and y
+draw(p3d) # OK - has x and y (plus z is fine)
+draw(nap) # E: Argument 1 to "draw" has incompatible type "NotAPoint"; expected "NewProtocol[x: int, y: int]"
+
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableBasic]
+from typing import Callable, Param, Params, Literal
+
+# Basic callable with positional args
+f1: Callable[Params[Param[Literal["a"], int], Param[Literal["b"], str]], bool]
+reveal_type(f1) # N: Revealed type is "def (a: builtins.int, b: builtins.str) -> builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableWithKwargs]
+from typing import Callable, Param, Params, Literal
+
+# Callable with keyword-only args
+f: Callable[Params[
+ Param[Literal["a"], int],
+ Param[Literal["b"], str, Literal["keyword"]],
+], bool]
+reveal_type(f) # N: Revealed type is "def (a: builtins.int, *, b: builtins.str) -> builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableWithDefaults]
+from typing import Callable, Param, Params, Literal
+
+# Callable with default args
+f: Callable[Params[
+ Param[Literal["a"], int],
+ Param[Literal["b"], str, Literal["default"]],
+], bool]
+reveal_type(f) # N: Revealed type is "def (a: builtins.int, b: builtins.str =) -> builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableWithStarArgs]
+from typing import Callable, Param, Params, Literal
+
+# Callable with *args and **kwargs
+f: Callable[Params[
+ Param[Literal["a"], int],
+ Param[None, int, Literal["*"]],
+ Param[None, str, Literal["**"]],
+], bool]
+reveal_type(f) # N: Revealed type is "def (a: builtins.int, *builtins.int, **builtins.str) -> builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableComplex]
+from typing import Callable, Param, Params, Literal
+
+# Complex callable matching PEP example
+f: Callable[Params[
+ Param[Literal["a"], int, Literal["positional"]],
+ Param[Literal["b"], int],
+ Param[Literal["c"], int, Literal["default"]],
+ Param[None, int, Literal["*"]],
+ Param[Literal["d"], int, Literal["keyword"]],
+ Param[Literal["e"], int, Literal["default", "keyword"]],
+ Param[None, int, Literal["**"]],
+], int]
+reveal_type(f) # N: Revealed type is "def (a: builtins.int, b: builtins.int, c: builtins.int =, *builtins.int, d: builtins.int, e: builtins.int =, **builtins.int) -> builtins.int"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewCallableAssignment]
+from typing import Callable, Param, Params, Literal
+
+def real_func(a: int, b: str) -> bool:
+ return True
+
+f: Callable[Params[Param[Literal["a"], int], Param[Literal["b"], str]], bool]
+f = real_func # OK
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetArgCallableBasic]
+from typing import Callable, Param, Params, GetArg, Literal
+
+# GetArg[Callable, Callable, 0] returns tuple of Params
+a0: GetArg[Callable[[int, str], bool], Callable, Literal[0]]
+reveal_type(a0) # N: Revealed type is "tuple[typing.Param[None, builtins.int, Never], typing.Param[None, builtins.str, Never]]"
+
+# GetArg[Callable, Callable, 1] returns return type
+a1: GetArg[Callable[[int, str], bool], Callable, Literal[1]]
+reveal_type(a1) # N: Revealed type is "builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetArgCallableWithNames]
+from typing import Callable, Param, Params, GetArg, Literal
+
+# Extended callable with named params roundtrips through GetArg
+F = Callable[Params[
+ Param[Literal["x"], int],
+ Param[Literal["y"], str, Literal["keyword"]],
+], bool]
+a0: GetArg[F, Callable, Literal[0]]
+reveal_type(a0) # N: Revealed type is "tuple[typing.Param[Literal['x'], builtins.int, Never], typing.Param[Literal['y'], builtins.str, Literal['keyword']]]"
+
+a1: GetArg[F, Callable, Literal[1]]
+reveal_type(a1) # N: Revealed type is "builtins.bool"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testParamDotNotation]
+from typing import Param, Literal
+
+# Dot notation on subscripted Param
+n: Param[Literal["x"], int, Literal["keyword"]].name
+reveal_type(n) # N: Revealed type is "Literal['x']"
+
+t: Param[Literal["x"], int, Literal["keyword"]].type
+reveal_type(t) # N: Revealed type is "builtins.int"
+
+q: Param[Literal["x"], int, Literal["keyword"]].quals
+reveal_type(q) # N: Revealed type is "Literal['keyword']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testParamGetMemberType]
+from typing import Param, GetMemberType, Literal
+
+a: GetMemberType[Param[Literal["x"], int, Literal["keyword"]], Literal["name"]]
+reveal_type(a) # N: Revealed type is "Literal['x']"
+
+b: GetMemberType[Param[Literal["x"], int, Literal["keyword"]], Literal["type"]]
+reveal_type(b) # N: Revealed type is "builtins.int"
+
+c: GetMemberType[Param[Literal["x"], int, Literal["keyword"]], Literal["quals"]]
+reveal_type(c) # N: Revealed type is "Literal['keyword']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testRecursiveConditionalAliasWithTypeVars]
+# flags: --python-version 3.14
+# Regression test: recursive conditional type alias with TypeVars
+# in a function return type must not hang when formatting error messages.
+# The bug was that _expand_type_fors_in_args would infinitely expand
+# recursive aliases inside UnpackType when the expansion was stuck
+# on TypeVars.
+from typing import Literal, Bool, IsAssignable, Slice, GetArg, Length, RaiseError, assert_type
+
+type DropLast[T] = Slice[T, Literal[0], Literal[-1]]
+type Last[T] = GetArg[T, tuple, Literal[-1]]
+type Empty[T] = IsAssignable[Length[T], Literal[0]]
+
+type Zip[T, S] = (
+ tuple[()]
+ if Bool[Empty[T]] and Bool[Empty[S]]
+ else RaiseError[Literal["Zip length mismatch"], T, S]
+ if Bool[Empty[T]] or Bool[Empty[S]]
+ else tuple[*Zip[DropLast[T], DropLast[S]], tuple[Last[T], Last[S]]]
+)
+
+def pair_zip[T, U](a: T, b: U) -> Zip[T, U]:
+ return tuple(zip(a, b)) # type: ignore
+
+# Concrete usage should work
+result = pair_zip((1, "hello"), (3.14, True))
+assert_type(result, tuple[tuple[int, float], tuple[str, bool]])
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testRecursiveConditionalAliasWithTypeVarTuples]
+# flags: --python-version 3.14
+# Regression test: same as above but with TypeVarTuples (*Ts, *Us).
+# RaiseError inside the Zip body must not fire during error formatting
+# when get_proper_type evaluates it standalone outside its _Cond context.
+from typing import Literal, Bool, IsAssignable, Slice, GetArg, Length, RaiseError, assert_type
+
+type DropLast[T] = Slice[T, Literal[0], Literal[-1]]
+type Last[T] = GetArg[T, tuple, Literal[-1]]
+type Empty[T] = IsAssignable[Length[T], Literal[0]]
+
+type Zip[T, S] = (
+ tuple[()]
+ if Bool[Empty[T]] and Bool[Empty[S]]
+ else RaiseError[Literal["Zip length mismatch"], T, S]
+ if Bool[Empty[T]] or Bool[Empty[S]]
+ else tuple[*Zip[DropLast[T], DropLast[S]], tuple[Last[T], Last[S]]]
+)
+
+def pair_zip[*Ts, *Us](
+ a: tuple[*Ts], b: tuple[*Us]
+) -> Zip[tuple[*Ts], tuple[*Us]]:
+ return tuple(zip(a, b)) # type: ignore
+
+result = pair_zip((1, "hello"), (3.14, True))
+assert_type(result, tuple[tuple[int, float], tuple[str, bool]])
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
diff --git a/test-data/unit/check-typelevel-comprehension.test b/test-data/unit/check-typelevel-comprehension.test
new file mode 100644
index 0000000000000..9d06db6b50135
--- /dev/null
+++ b/test-data/unit/check-typelevel-comprehension.test
@@ -0,0 +1,297 @@
+[case testTypeComprehensionBasic]
+# Test basic type comprehension over a tuple
+from typing import Iter
+
+x: tuple[*[list[T] for T in Iter[tuple[int, str, bool]]]]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionWithCondition]
+# Test type comprehension with IsAssignable condition
+from typing import Iter, IsAssignable
+
+# Filter to only types that are subtypes of int (just int in this case)
+x: tuple[*[list[T] for T in Iter[tuple[int, str, float]] if IsAssignable[T, int]]]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionFilterSubtypeObject]
+# Test type comprehension filtering subtypes of object (everything passes)
+from typing import Iter, IsAssignable
+
+x: tuple[*[T for T in Iter[tuple[int, str, bool]] if IsAssignable[T, object]]]
+reveal_type(x) # N: Revealed type is "tuple[builtins.int, builtins.str, builtins.bool]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionFilterNone]
+# Test type comprehension where no items pass the condition
+from typing import Iter, IsAssignable
+
+# No type in the tuple is assignable to str, so the result is empty
+x: tuple[*[T for T in Iter[tuple[int, float, bool]] if IsAssignable[T, str]]]
+reveal_type(x) # N: Revealed type is "tuple[()]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionSingleElement]
+# Test type comprehension over single-element tuple
+from typing import Iter
+
+x: tuple[*[list[T] for T in Iter[tuple[int]]]]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionTransform]
+# Test type comprehension that transforms types
+from typing import Iter
+
+# Wrap each type in a tuple
+x: tuple[*[tuple[T, T] for T in Iter[tuple[int, str]]]]
+reveal_type(x) # N: Revealed type is "tuple[tuple[builtins.int, builtins.int], tuple[builtins.str, builtins.str]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeComprehensionInTypeAlias]
+# flags: --python-version 3.12
+# Test type comprehension in a type alias
+from typing import Iter
+
+type Listify[*Ts] = tuple[*[list[T] for T in Iter[tuple[*Ts]]]]
+
+x: Listify[int, str, bool]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDictComprehensionNewTypedDict]
+# Test dict comprehension in type context with NewTypedDict
+from typing import NewTypedDict, Members, TypedDict, Iter, GetName, GetType
+
+class Person(TypedDict):
+ name: str
+ age: int
+
+# Dict comprehension produces Members, fed into NewTypedDict
+TD = NewTypedDict[{GetName[m]: GetType[m] for m in Iter[Members[Person]]}]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str, 'age': builtins.int})"
+x = {'name': 'Alice', 'age': 30}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDictComprehensionNewTypedDictWithCondition]
+# Test dict comprehension with filtering condition
+from typing import NewTypedDict, Members, TypedDict, Iter, IsAssignable, GetName, GetType
+
+class Person(TypedDict):
+ name: str
+ age: int
+ active: bool
+
+# Filter to only string fields using dict comprehension
+TD = NewTypedDict[{GetName[m]: GetType[m] for m in Iter[Members[Person]] if IsAssignable[GetType[m], str]}]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str})"
+x = {'name': 'Alice'}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDictComprehensionFromClass]
+# Test dict comprehension with class attributes
+from typing import NewTypedDict, Attrs, Iter, GetName, GetType
+
+class Point:
+ x: int
+ y: int
+
+# Create TypedDict from class attributes using dict comprehension
+TD = NewTypedDict[{GetName[m]: GetType[m] for m in Iter[Attrs[Point]]}]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'x': builtins.int, 'y': builtins.int})"
+x = {'x': 1, 'y': 2}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapBasic]
+# *Map(...) is a pure synonym for *[...] — basic tuple expansion
+from typing import Iter, Map
+
+x: tuple[*Map(list[T] for T in Iter[tuple[int, str, bool]])]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapWithCondition]
+# *Map(...) with an IsAssignable filter
+from typing import Iter, IsAssignable, Map
+
+x: tuple[*Map(list[T] for T in Iter[tuple[int, str, float]] if IsAssignable[T, int])]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapTransform]
+# *Map(...) transforming each element into a pair
+from typing import Iter, Map
+
+x: tuple[*Map(tuple[T, T] for T in Iter[tuple[int, str]])]
+reveal_type(x) # N: Revealed type is "tuple[tuple[builtins.int, builtins.int], tuple[builtins.str, builtins.str]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapInUnion]
+# *Map(...) inside Union[...]
+from typing import Iter, Map, Union
+
+x: Union[*Map(list[T] for T in Iter[tuple[int, str]])]
+reveal_type(x) # N: Revealed type is "builtins.list[builtins.int] | builtins.list[builtins.str]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapListComp]
+# Map accepts a list comprehension as well as a generator expression
+from typing import Iter, Map
+
+x: tuple[*Map([list[T] for T in Iter[tuple[int, str]]])]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapInTypeAlias]
+# flags: --python-version 3.12
+# *Map(...) inside a PEP 695 type alias exercises the exprtotype.py code path
+from typing import Iter, Map
+
+type Listify[*Ts] = tuple[*Map(list[T] for T in Iter[tuple[*Ts]])]
+
+x: Listify[int, str, bool]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapEquivalentToStarList]
+# Side-by-side: *Map(...) and *[...] produce identical types
+from typing import Iter, Map
+
+x: tuple[*Map(list[T] for T in Iter[tuple[int, str, bool]])]
+y: tuple[*[list[T] for T in Iter[tuple[int, str, bool]]]]
+reveal_type(x) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+reveal_type(y) # N: Revealed type is "tuple[builtins.list[builtins.int], builtins.list[builtins.str], builtins.list[builtins.bool]]"
+x = y
+y = x
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapShadowedByUserClass]
+# A local class named Map does NOT get the type-comprehension desugaring:
+# typeanal only unwraps Map() when the name resolves to the real
+# _typeshed.typemap.Map (re-exported as typing.Map).
+from typing import Iter
+
+class Map: ...
+
+x: tuple[*Map(list[T] for T in Iter[tuple[int, str]])] # E: "Map" expects no type arguments, but 1 given
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapInNewProtocol]
+# *Map(...) inside NewProtocol[...] — verifies the variadic flow is identical
+# to the *[...] form for non-tuple variadic containers.
+from typing import Iter, Map, NewProtocol, Member, Literal
+
+P = NewProtocol[*Map(Member[Literal['x'], T] for T in Iter[tuple[int]])]
+
+class Impl:
+ x: int
+
+def f(p: P) -> None: ...
+
+f(Impl())
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapOverIterAnyInTuple]
+# *Map(...) over Iter[Any] collapses the enclosing tuple to Any.
+# The *[...] form errors because Iter[Any] is not a tuple type.
+from typing import Iter, Map, Any
+
+x: tuple[*Map(list[T] for T in Iter[Any])]
+reveal_type(x) # N: Revealed type is "Any"
+
+y: tuple[*[list[T] for T in Iter[Any]]] # E: Type comprehension requires Iter over a tuple type, got Any; use Map(...) to propagate Any
+reveal_type(y) # N: Revealed type is "tuple[Any]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapOverIterAnyInUnion]
+# *Map(...) over Iter[Any] collapses the enclosing Union to Any.
+from typing import Iter, Map, Union, Any
+
+x: Union[*Map(list[T] for T in Iter[Any])]
+reveal_type(x) # N: Revealed type is "Any"
+
+y: Union[*[list[T] for T in Iter[Any]]] # E: Type comprehension requires Iter over a tuple type, got Any; use Map(...) to propagate Any
+reveal_type(y) # N: Revealed type is "Any"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapOverIterAnyInNewProtocol]
+# *Map(...) over Iter[Any] collapses the enclosing NewProtocol to Any:
+# any value satisfies the resulting protocol without error.
+from typing import Iter, Map, NewProtocol, Member, Literal, Any
+
+P = NewProtocol[*Map(Member[Literal['x'], T] for T in Iter[Any])]
+
+def f(p: P) -> None: ...
+
+class Impl: ...
+
+f(Impl()) # P collapsed to Any, so this is fine
+f(42)
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testMapOverIterAnyInTypeAlias]
+# flags: --python-version 3.12
+# The Map-over-Any collapse flows through a PEP 695 alias.
+from typing import Iter, Map, Any
+
+type MapList[X] = tuple[*Map(list[T] for T in Iter[X])]
+
+# When X = Any the comprehension source is Iter[Any]; Map collapses to Any.
+x: MapList[Any]
+reveal_type(x) # N: Revealed type is "Any"
+
+# Non-Map form keeps prior behavior: Iter[Any] does not collapse.
+type StarList[X] = tuple[*[list[T] for T in Iter[X]]] # E: Type comprehension requires Iter over a tuple type, got Any; use Map(...) to propagate Any
+y: StarList[Any]
+reveal_type(y) # N: Revealed type is "tuple[Any]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
diff --git a/test-data/unit/check-typelevel-examples.test b/test-data/unit/check-typelevel-examples.test
new file mode 100644
index 0000000000000..3d3064427b2fb
--- /dev/null
+++ b/test-data/unit/check-typelevel-examples.test
@@ -0,0 +1,805 @@
+[case testTypeLevel_qblike]
+# flags: --python-version 3.14
+
+from typing import Literal, Unpack, TypedDict
+
+from typing import (
+ NewProtocol,
+ BaseTypedDict,
+ Iter,
+ Attrs,
+ IsAssignable,
+ GetType,
+ Member,
+ GetName,
+ GetMemberType,
+ GetArg,
+)
+
+
+# Begin PEP section: Prisma-style ORMs
+
+"""First, to support the annotations we saw above, we have a collection
+of dummy classes with generic types.
+"""
+
+
+class Pointer[T]:
+ pass
+
+
+class Property[T](Pointer[T]):
+ pass
+
+
+class Link[T](Pointer[T]):
+ pass
+
+
+class SingleLink[T](Link[T]):
+ pass
+
+
+class MultiLink[T](Link[T]):
+ pass
+
+
+"""
+The ``select`` function is where we start seeing new things.
+
+The ``**kwargs: Unpack[K]`` is part of this proposal, and allows
+*inferring* a TypedDict from keyword args.
+
+``Attrs[K]`` extracts ``Member`` types corresponding to every
+type-annotated attribute of ``K``, while subscripting ``NewProtocol``
+with ``Member`` arguments constructs a new structural type.
+
+``GetName`` is a getter operator that fetches the name of a ``Member``
+as a literal type; all of these mechanisms lean heavily on literal types.
+``GetMemberType`` gets the type of a named attribute from a class.
+
+"""
+
+
+def select[ModelT, K: BaseTypedDict](
+ typ: type[ModelT],
+ /,
+ **kwargs: Unpack[K],
+) -> list[
+ NewProtocol[
+ *[
+ Member[
+ GetName[c],
+ ConvertField[GetMemberType[ModelT, GetName[c]]],
+ ]
+ for c in Iter[Attrs[K]]
+ ]
+ ]
+]:
+ return []
+
+
+"""``ConvertField`` is our first type helper: a conditional type
+alias, which chooses between two types based on a (limited)
+subtype-style check.
+
+In ``ConvertField``, we want to drop the ``Property`` or ``Link``
+wrapper and produce the underlying type; for links, we additionally
+produce a new target type containing only properties, and wrap
+``MultiLink`` targets in a list.
+"""
+
+type ConvertField[T] = (
+ AdjustLink[PropsOnly[PointerArg[T]], T] if IsAssignable[T, Link] else PointerArg[T]
+)
+
+"""``PointerArg`` gets the type argument to ``Pointer`` or a subclass.
+
+``GetArg[T, Base, I]`` is one of the core primitives; it fetches the
+type argument at index ``I`` to ``Base`` from a type ``T``, provided
+``T`` inherits from ``Base``.
+
+(The subtleties of this will be discussed later; in this case, it just
+grabs the argument to a ``Pointer``).
+
+"""
+# XXX: We kind of want to be able to do T: Pointer, but...
+type PointerArg[T] = GetArg[T, Pointer, Literal[0]]
+
+"""
+``AdjustLink`` wraps the target of a ``MultiLink`` in a ``list``,
+using features we've discussed already.
+
+"""
+type AdjustLink[Tgt, LinkTy] = list[Tgt] if IsAssignable[LinkTy, MultiLink] else Tgt
+
+"""And the final helper, ``PropsOnly[T]``, generates a new type that
+contains all the ``Property`` attributes of ``T``.
+
+"""
+type PropsOnly[T] = NewProtocol[
+ *[
+ Member[GetName[p], PointerArg[GetType[p]]]
+ for p in Iter[Attrs[T]]
+ if IsAssignable[GetType[p], Property]
+ ]
+]
+
+"""
+The full test is `in our test suite <#qb-test_>`_.
+"""
+
+
+# End PEP section
+
+
+# Basic filtering
+class Comment:
+ id: Property[int]
+ name: Property[str]
+ poster: Link[User]
+
+
+class Post:
+ id: Property[int]
+
+ title: Property[str]
+ content: Property[str]
+
+ comments: MultiLink[Comment]
+ author: Link[Comment]
+
+
+class User:
+ id: Property[int]
+
+ name: Property[str]
+ email: Property[str]
+ posts: MultiLink[Post]
+
+
+####
+
+reveal_type(select(User, id=True, name=True)) # N: Revealed type is "builtins.list[NewProtocol[id: builtins.int, name: builtins.str]]"
+
+reveal_type(select(User, name=True, email=True, posts=True)) # N: Revealed type is "builtins.list[NewProtocol[name: builtins.str, email: builtins.str, posts: builtins.list[NewProtocol[id: builtins.int, title: builtins.str, content: builtins.str]]]]"
+
+
+reveal_type(select(Post, title=True, comments=True, author=True)) # N: Revealed type is "builtins.list[NewProtocol[title: builtins.str, comments: builtins.list[NewProtocol[id: builtins.int, name: builtins.str]], author: NewProtocol[id: builtins.int, name: builtins.str]]]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeLevel_qblike_nested]
+# flags: --python-version 3.14
+
+from typing import Literal, Unpack, TypedDict
+
+from typing import (
+ NewProtocol,
+ BaseTypedDict,
+ Iter,
+ Attrs,
+ IsAssignable,
+ GetType,
+ Member,
+ GetName,
+ GetMemberType,
+ GetArg,
+ InitField,
+)
+
+
+# Models
+
+class Pointer[T]:
+ pass
+
+class Property[T](Pointer[T]):
+ pass
+
+class Link[T](Pointer[T]):
+ pass
+
+class SingleLink[T](Link[T]):
+ pass
+
+class MultiLink[T](Link[T]):
+ pass
+
+
+# Select: captures which fields to fetch from a linked model
+
+class Select[T: BaseTypedDict](InitField[T]):
+ pass
+
+type SelectKwargs[T] = GetArg[T, InitField, Literal[0]]
+
+
+# Type aliases
+
+type PointerArg[T] = GetArg[T, Pointer, Literal[0]]
+
+type AdjustLink[Tgt, LinkTy] = list[Tgt] if IsAssignable[LinkTy, MultiLink] else Tgt
+
+type PropsOnly[T] = NewProtocol[
+ *[
+ Member[GetName[p], PointerArg[GetType[p]]]
+ for p in Iter[Attrs[T]]
+ if IsAssignable[GetType[p], Property]
+ ]
+]
+
+type LinkTarget[T, Sel] = (
+ ProjectFields[T, SelectKwargs[Sel]]
+ if IsAssignable[Sel, Select]
+ else PointerArg[GetMemberType[T, Literal["id"]]]
+ if IsAssignable[Sel, Literal["IDS"]]
+ else PropsOnly[T]
+)
+
+type ConvertField[T, Sel] = (
+ AdjustLink[LinkTarget[PointerArg[T], Sel], T]
+ if IsAssignable[T, Link]
+ else PointerArg[T]
+)
+
+type ProjectFields[ModelT, K] = NewProtocol[*[
+ Member[GetName[c], ConvertField[GetMemberType[ModelT, GetName[c]], GetType[c]]]
+ for c in Iter[Attrs[K]]
+]]
+
+
+# select function
+
+def select[ModelT, K: BaseTypedDict](
+ typ: type[ModelT],
+ /,
+ **kwargs: Unpack[K],
+) -> list[
+ ProjectFields[ModelT, K]
+]:
+ return []
+
+
+# Data models
+
+class Comment:
+ id: Property[int]
+ name: Property[str]
+ poster: Link[User]
+
+class Post:
+ id: Property[int]
+ title: Property[str]
+ content: Property[str]
+ comments: MultiLink[Comment]
+ author: Link[Comment]
+
+class User:
+ id: Property[int]
+ name: Property[str]
+ email: Property[str]
+ posts: MultiLink[Post]
+
+
+# Tests
+
+
+# Simple select: same behavior as original
+reveal_type(select(User, id=True, name=True)) # N: Revealed type is "builtins.list[NewProtocol[id: builtins.int, name: builtins.str]]"
+
+nested_sel = Select(title=True, comments=True)
+reveal_type(select(User, name=True, posts=nested_sel)) # N: Revealed type is "builtins.list[NewProtocol[name: builtins.str, posts: builtins.list[NewProtocol[title: builtins.str, comments: builtins.list[NewProtocol[id: builtins.int, name: builtins.str]]]]]]"
+
+reveal_type(select(User, name=True, posts=Select(title=True, comments=True))) # N: Revealed type is "builtins.list[NewProtocol[name: builtins.str, posts: builtins.list[NewProtocol[title: builtins.str, comments: builtins.list[NewProtocol[id: builtins.int, name: builtins.str]]]]]]"
+
+# "IDS" fetches just the id from linked models
+reveal_type(select(User, name=True, posts="IDS")) # N: Revealed type is "builtins.list[NewProtocol[name: builtins.str, posts: builtins.list[builtins.int]]]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeLevel_nplike]
+# flags: --python-version 3.14
+
+# Simulate array broadcasting
+
+from typing import Literal as L
+
+from typing import (
+ Literal,
+ Never,
+ GetArg,
+ Bool,
+ IsEquivalent,
+ Iter,
+ Slice,
+ Length,
+ IsAssignable,
+ RaiseError,
+)
+
+
+class Array[DType, *Shape]:
+ def __add__[*Shape2](
+ self,
+ other: Array[DType, *Shape2]
+ ) -> Array[DType, *Merge[tuple[*Shape], tuple[*Shape2]]]:
+ # ) -> Array[DType, *Broadcast[Shape, Shape2]]:
+ raise BaseException
+
+def add[DType, *Shape1, *Shape2](
+ lhs: Array[DType, *Shape1],
+ rhs: Array[DType, *Shape2],
+) -> Array[DType, *Merge[tuple[*Shape1], tuple[*Shape2]]]:
+# ) -> Array[DType, *Broadcast[Shape1, Shape2]]:
+ raise BaseException
+
+
+class Array2[DType, Shape]:
+
+ def __add__[Shape2](
+ self,
+ other: Array2[DType, Shape2]
+ ) -> Array2[DType, Merge[Shape, Shape2]]:
+ raise BaseException
+
+
+def add2[DType, Shape1, Shape2](
+ lhs: Array2[DType, Shape1],
+ rhs: Array2[DType, Shape2],
+) -> Array2[DType, Merge[Shape1, Shape2]]:
+ raise BaseException
+
+
+type MergeOne[T, S] = (
+ T
+ if IsEquivalent[T, S] or IsEquivalent[S, Literal[1]]
+ else S if IsEquivalent[T, Literal[1]]
+ else RaiseError[Literal["Broadcast mismatch"], T, S]
+)
+
+type Head[T] = GetArg[T, tuple, Literal[0]]
+
+type DropLast[T] = Slice[T, Literal[0], Literal[-1]]
+type Last[T] = GetArg[T, tuple, Literal[-1]]
+
+# Matching on Never here is intentional; it prevents spurious
+# infinite recursion.
+type Empty[T] = IsAssignable[Length[T], Literal[0]]
+
+type Tup[*Ts] = tuple[*Ts]
+
+type Merge[T, S] = (
+ S if Bool[Empty[T]] else T if Bool[Empty[S]]
+ else
+ Tup[
+ # XXX: This error message position is super wrong!
+ *Merge[DropLast[T], DropLast[S]], # E: Broadcast mismatch: Literal[4], Literal[10]
+ MergeOne[Last[T], Last[S]]
+ ]
+)
+
+
+a0: MergeOne[L[1], L[5]]
+reveal_type(a0) # N: Revealed type is "Literal[5]"
+
+a1: MergeOne[int, L[5]] # E: Broadcast mismatch: int, Literal[5]
+reveal_type(a1) # N: Revealed type is "Never"
+
+a2: MergeOne[L[5], L[5]]
+reveal_type(a2) # N: Revealed type is "Literal[5]"
+
+a3: MergeOne[int, int]
+reveal_type(a3) # N: Revealed type is "builtins.int"
+
+m0: Merge[tuple[L[1], L[4], L[5]], tuple[L[9], L[1], L[5]]]
+reveal_type(m0) # N: Revealed type is "tuple[Literal[9], Literal[4], Literal[5]]"
+
+m1: Merge[tuple[L[1], L[4], L[5]], tuple[L[9], L[10], L[5]]] # E: Broadcast mismatch: Literal[4], Literal[10]
+reveal_type(m1) # N: Revealed type is "tuple[Literal[9], Never, Literal[5]]"
+
+m2: Merge[tuple[L[2], L[5]], tuple[L[9], L[2], L[1]]]
+reveal_type(m2) # N: Revealed type is "tuple[Literal[9], Literal[2], Literal[5]]"
+
+type T41 = tuple[L[4], L[1]]
+type T3 = tuple[L[3]]
+
+m3: Merge[T41, T3]
+reveal_type(m3) # N: Revealed type is "tuple[Literal[4], Literal[3]]"
+
+#
+
+z1: Array2[float, tuple[L[4], L[1]]]
+z2: Array2[float, tuple[L[3]]]
+
+reveal_type(add2(z1, z2)) # N: Revealed type is "__main__.Array2[builtins.float, tuple[Literal[4], Literal[3]]]"
+reveal_type(z1 + z2) # N: Revealed type is "__main__.Array2[builtins.float, tuple[Literal[4], Literal[3]]]"
+
+#
+
+b1: Array[float, int, int]
+b2: Array[float, int]
+reveal_type(b1 + b2) # N: Revealed type is "__main__.Array[builtins.float, builtins.int, builtins.int]"
+
+
+#
+
+c1: Array[float, L[4], L[1]]
+c2: Array[float, L[3]]
+res1 = c1 + c2
+reveal_type(res1) # N: Revealed type is "__main__.Array[builtins.float, Literal[4], Literal[3]]"
+
+res2 = add(c1, c2)
+reveal_type(res2) # N: Revealed type is "__main__.Array[builtins.float, Literal[4], Literal[3]]"
+
+
+checkr: Array[float, L[4], L[3]] = res1
+checkr = res2
+
+#
+
+err1: Array[float, L[4], L[2]]
+err2: Array[float, L[3]]
+# err1 + err2 # XXX: We want to do this one but we get the wrong error location!
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeLevel_nplike2]
+# flags: --python-version 3.14
+
+# A simpler version of the above, suitable for excerpting and demonstration
+
+# Simulate array broadcasting
+
+from typing import Literal as L
+
+from typing import (
+ Literal,
+ Never,
+ GetArg,
+ Bool,
+ IsEquivalent,
+ Iter,
+ Slice,
+ Length,
+ IsAssignable,
+ RaiseError,
+)
+
+
+class Array[DType, *Shape]:
+ def __add__[*Shape2](
+ self,
+ other: Array[DType, *Shape2]
+ ) -> Array[DType, *Merge[tuple[*Shape], tuple[*Shape2]]]:
+ raise BaseException
+
+
+type MergeOne[T, S] = (
+ T
+ if IsEquivalent[T, S] or IsEquivalent[S, Literal[1]]
+ else S if IsEquivalent[T, Literal[1]]
+ else RaiseError[Literal["Broadcast mismatch"], T, S]
+)
+
+type DropLast[T] = Slice[T, Literal[0], Literal[-1]]
+type Last[T] = GetArg[T, tuple, Literal[-1]]
+
+# Matching on Never here is intentional; it prevents spurious
+# infinite recursion.
+type Empty[T] = IsAssignable[Length[T], Literal[0]]
+
+type Merge[T, S] = (
+ S if Bool[Empty[T]] else T if Bool[Empty[S]]
+ else
+ tuple[
+ *Merge[DropLast[T], DropLast[S]],
+ MergeOne[Last[T], Last[S]]
+ ]
+)
+
+m0: Merge[tuple[L[4], L[5]], tuple[L[9], L[1], L[5]]]
+reveal_type(m0) # N: Revealed type is "tuple[Literal[9], Literal[4], Literal[5]]"
+
+
+a1: Array[float, L[4], L[1]]
+a2: Array[float, L[3]]
+ar = a1 + a2
+reveal_type(ar) # N: Revealed type is "__main__.Array[builtins.float, Literal[4], Literal[3]]"
+checkr: Array[float, L[4], L[3]] = ar
+
+
+b1: Array[float, int, int]
+b2: Array[float, int]
+reveal_type(b1 + b2) # N: Revealed type is "__main__.Array[builtins.float, builtins.int, builtins.int]"
+
+
+c1: Array[float, L[4], L[1], L[5]]
+c2: Array[float, L[4], L[3], L[1]]
+reveal_type(c1 + c2) # N: Revealed type is "__main__.Array[builtins.float, Literal[4], Literal[3], Literal[5]]"
+
+#
+
+err1: Array[float, L[4], L[2]]
+err2: Array[float, L[3]]
+# err1 + err2 # XXX: We want to do this one but we get the wrong error location!
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeLevel_fastapilike]
+# flags: --python-version 3.14
+
+from typing import (
+ Callable,
+ Literal,
+ Union,
+ ReadOnly,
+ TypedDict,
+ Never,
+ Self,
+)
+
+import typing
+
+
+class FieldArgs(TypedDict, total=False):
+ hidden: ReadOnly[bool]
+ primary_key: ReadOnly[bool]
+ index: ReadOnly[bool]
+ default: ReadOnly[object]
+
+
+class Field[T: FieldArgs](typing.InitField[T]):
+ pass
+
+
+####
+
+# TODO: Should this go into the stdlib?
+type GetFieldItem[T, K] = typing.GetMemberType[
+ typing.GetArg[T, typing.InitField, Literal[0]], K
+]
+
+
+##
+
+# Strip `| None` from a type by iterating over its union components
+# and filtering
+type NotOptional[T] = Union[
+ *[x for x in typing.Iter[typing.FromUnion[T]] if not typing.IsAssignable[x, None]]
+]
+
+# Adjust an attribute type for use in Public below by dropping | None for
+# primary keys and stripping all annotations.
+type FixPublicType[T, Init] = (
+ NotOptional[T]
+ if typing.IsAssignable[Literal[True], GetFieldItem[Init, Literal["primary_key"]]]
+ else T
+)
+
+# Strip out everything that is hidden, and make the primary key required.
+# Drop all the annotations: this is for data returned to users from the
+# DB, so we don't need default values.
+type Public[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ FixPublicType[typing.GetType[p], typing.GetInit[p]],
+ typing.GetQuals[p],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True], GetFieldItem[typing.GetInit[p], Literal["hidden"]]
+ ]
+ ]
+]
+
+# Begin PEP section: Automatically deriving FastAPI CRUD models
+"""
+We have a more `fully-worked example <#fastapi-test_>`_ in our test
+suite, but here is a possible implementation of just ``Public``.
+"""
+
+# Extract the default type from an Init field.
+# If it is a Field, then we try pulling out the "default" field,
+# otherwise we return the type itself.
+type GetDefault[Init] = (
+ GetFieldItem[Init, Literal["default"]]
+ if typing.IsAssignable[Init, Field]
+ else Init
+)
+
+# Create takes everything but the primary key and preserves defaults
+type Create[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ typing.GetType[p],
+ typing.GetQuals[p],
+ GetDefault[typing.GetInit[p]],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True],
+ GetFieldItem[typing.GetInit[p], Literal["primary_key"]],
+ ]
+ ]
+]
+
+# Update takes everything but the primary key, but makes them all have
+# None defaults
+type Update[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ typing.GetType[p] | None,
+ typing.GetQuals[p],
+ Literal[None],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True],
+ GetFieldItem[typing.GetInit[p], Literal["primary_key"]],
+ ]
+ ]
+]
+
+class Hero:
+ id: int | None = Field(default=None, primary_key=True)
+
+ name: str = Field(index=True)
+ age: int | None = Field(default=None, index=True)
+
+ secret_name: str = Field(hidden=True)
+
+
+type HeroPublic = Public[Hero]
+type HeroCreate = Create[Hero]
+type HeroUpdate = Update[Hero]
+
+pub: HeroPublic
+reveal_type(pub) # N: Revealed type is "NewProtocol[id: builtins.int, name: builtins.str, age: builtins.int | None]"
+
+creat: HeroCreate
+reveal_type(creat) # N: Revealed type is "NewProtocol[name: builtins.str, age: builtins.int | None = None, secret_name: builtins.str]"
+
+upd: HeroUpdate
+reveal_type(upd) # N: Revealed type is "NewProtocol[name: builtins.str | None = None, age: builtins.int | None | None = None, secret_name: builtins.str | None = None]"
+
+creat_members: tuple[*[typing.GetInit[p] for p in typing.Iter[typing.Members[HeroCreate]]]]
+reveal_type(creat_members) # N: Revealed type is "tuple[Never, None, Never]"
+
+upd_types: tuple[*[typing.GetType[p] for p in typing.Iter[typing.Members[HeroUpdate]]]]
+reveal_type(upd_types) # N: Revealed type is "tuple[builtins.str | None, builtins.int | None | None, builtins.str | None]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testUpdateClassDataclassLike]
+# flags: --python-version 3.14
+
+from typing import (
+ Callable,
+ Literal,
+ ReadOnly,
+ TypedDict,
+ Never,
+ UpdateClass,
+)
+
+import typing
+
+
+class FieldArgs(TypedDict, total=False):
+ default: ReadOnly[object]
+
+
+class Field[T: FieldArgs](typing.InitField[T]):
+ pass
+
+
+# Extract the default value from a Field's init arg.
+# If the init is a Field, pull out "default"; otherwise use the init itself.
+type GetDefault[Init] = (
+ typing.GetMemberType[
+ typing.GetArg[Init, typing.InitField, Literal[0]],
+ Literal["default"],
+ ]
+ if typing.IsAssignable[Init, Field]
+ else Init
+)
+
+
+# Generate a Member for __init__ from a class's attributes.
+# All params are keyword-only; params with defaults get Literal["keyword", "default"].
+type InitFnType[T] = typing.Member[
+ Literal["__init__"],
+ Callable[
+ typing.Params[
+ typing.Param[Literal["self"], T],
+ *[
+ typing.Param[
+ p.name,
+ p.typ,
+ Literal["keyword"]
+ if typing.IsAssignable[
+ GetDefault[p.init],
+ Never,
+ ]
+ else Literal["keyword", "default"],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ ],
+ ],
+ None,
+ ],
+ Literal["ClassVar"],
+]
+
+
+def dataclass_ish[T](cls: type[T]) -> UpdateClass[InitFnType[T]]:
+ ...
+
+
+@dataclass_ish
+class Hero:
+ id: int | None = None
+ name: str
+ age: int | None = Field(default=None)
+ score: float = Field()
+ secret_name: str
+
+
+class Model:
+ def __init_subclass__[T](
+ cls: type[T],
+ ) -> typing.UpdateClass[
+ # Add the computed __init__ function
+ InitFnType[T],
+ ]:
+ pass
+
+
+class Hero2(Model):
+ id: int | None = None
+ name: str
+ age: int | None = Field(default=None)
+ score: float = Field()
+ secret_name: str
+
+
+
+reveal_type(Hero.__init__) # N: Revealed type is "def (self: __main__.Hero, *, id: builtins.int | None =, name: builtins.str, age: builtins.int | None =, score: builtins.float, secret_name: builtins.str)"
+Hero(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0)
+Hero(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0, id=3)
+Hero(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0, age=16)
+Hero(name="Spider-Boy", score=1.0) # E: Missing named argument "secret_name" for "Hero"
+h = Hero(id=1, name="Spider-Boy", age=16, score=1.0, secret_name="Pedro Parqueador")
+reveal_type(h) # N: Revealed type is "__main__.Hero"
+
+
+
+reveal_type(Hero2.__init__) # N: Revealed type is "def (self: __main__.Hero2, *, id: builtins.int | None =, name: builtins.str, age: builtins.int | None =, score: builtins.float, secret_name: builtins.str)"
+Hero2(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0)
+Hero2(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0, id=3)
+Hero2(name="Spider-Boy", secret_name="Pedro Parqueador", score=1.0, age=16)
+Hero2(name="Spider-Boy", score=1.0) # E: Missing named argument "secret_name" for "Hero2"
+h2 = Hero2(id=1, name="Spider-Boy", age=16, score=1.0, secret_name="Pedro Parqueador")
+reveal_type(h2) # N: Revealed type is "__main__.Hero2"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
diff --git a/test-data/unit/check-typelevel-members.test b/test-data/unit/check-typelevel-members.test
new file mode 100644
index 0000000000000..77796b8744ec3
--- /dev/null
+++ b/test-data/unit/check-typelevel-members.test
@@ -0,0 +1,587 @@
+[case testTypeOperatorMembersBasic]
+# Test Members operator - basic class
+from typing import Members
+
+class Point:
+ x: int
+ y: str
+
+m: Members[Point]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.int, Never, Never, __main__.Point], typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Point]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersWithClassVar]
+# Test Members operator with ClassVar
+from typing import Members, ClassVar
+
+class Config:
+ name: str
+ count: ClassVar[int]
+
+m: Members[Config]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, __main__.Config], typing.Member[Literal['count'], builtins.int, Literal['ClassVar'], Never, __main__.Config]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersWithFinal]
+# Test Members operator with Final
+from typing import Members, Final
+
+class Constants:
+ PI: Final[float] = 3.14
+ NAME: str
+
+m: Members[Constants]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['PI'], builtins.float, Literal['Final'], builtins.float, __main__.Constants], typing.Member[Literal['NAME'], builtins.str, Never, Never, __main__.Constants]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorAttrsBasic]
+# Test Attrs operator - filters to attributes only (excludes methods)
+from typing import Attrs, ClassVar
+
+class MyClass:
+ name: str
+ count: ClassVar[int]
+ def method(self) -> None: pass
+
+a: Attrs[MyClass]
+# Should include 'name' and 'count', but exclude methods
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, __main__.MyClass], typing.Member[Literal['count'], builtins.int, Literal['ClassVar'], Never, __main__.MyClass]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorAttrsCallable]
+# Test Attrs operator - includes Callable-typed attributes
+from typing import Attrs, Callable
+
+class MyClass:
+ f: Callable[[int], int]
+
+a: Attrs[MyClass]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['f'], def (builtins.int) -> builtins.int, Never, Never, __main__.MyClass]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersEmpty]
+# Test Members operator on a class with only private members (still included)
+from typing import Members
+
+class Empty:
+ _private: int
+
+m: Members[Empty]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['_private'], builtins.int, Never, Never, __main__.Empty]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersGenericClass]
+# Test Members operator with a generic class
+from typing import Members, Generic, TypeVar
+
+T = TypeVar('T')
+
+class Container(Generic[T]):
+ value: T
+ label: str
+
+m: Members[Container[int]]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['value'], builtins.int, Never, Never, __main__.Container[builtins.int]], typing.Member[Literal['label'], builtins.str, Never, Never, __main__.Container[builtins.int]]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersGenericClassSub]
+# Test Members operator with a subclass of a generic class
+from typing import Members, Generic, TypeVar
+
+T = TypeVar('T')
+
+class Container(Generic[T]):
+ value: T
+ label: str
+
+
+class Child(Container[int]):
+ pass
+
+
+m: Members[Child]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['value'], builtins.int, Never, Never, __main__.Container[builtins.int]], typing.Member[Literal['label'], builtins.str, Never, Never, __main__.Container[builtins.int]]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersMethod]
+# Test that the Members operator includes methods, with a ClassVar qualifier
+from typing import Members
+
+class MyClass:
+ def method(self) -> None: pass
+
+a: Members[MyClass]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['method'], def (self: __main__.MyClass), Literal['ClassVar'], Never, __main__.MyClass]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorMembersNoInferred]
+from typing import Attrs, Members
+
+class MyClass:
+ x: int
+ def __init__(self) -> None:
+ self.x = 0
+ self.y = 'test'
+
+a: Members[MyClass]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.int, Never, Never, __main__.MyClass], typing.Member[Literal['__init__'], def (self: __main__.MyClass), Literal['ClassVar'], Never, __main__.MyClass]]"
+
+b: Attrs[MyClass]
+reveal_type(b) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.int, Never, Never, __main__.MyClass]]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testTypeOperatorMembersDefiner]
+from typing import Attrs, Members
+
+class Base:
+ x: int
+
+
+class Child(Base):
+ y: str
+
+
+class Child2(Base):
+ x: bool
+ y: str
+
+
+a: Members[Child]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.int, Never, Never, __main__.Base], typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Child]]"
+
+b: Attrs[Child]
+reveal_type(b) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.int, Never, Never, __main__.Base], typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Child]]"
+
+c: Attrs[Child2]
+reveal_type(c) # N: Revealed type is "tuple[typing.Member[Literal['x'], builtins.bool, Never, Never, __main__.Child2], typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Child2]]"
+
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersTypedDict]
+from typing import Members, TypedDict, NotRequired
+
+class Person(TypedDict):
+ name: str
+ age: int
+ nickname: NotRequired[str]
+
+m: Members[Person]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, Never], typing.Member[Literal['age'], builtins.int, Never, Never, Never], typing.Member[Literal['nickname'], builtins.str, Literal['NotRequired'], Never, Never]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersTypedDictReadOnly]
+from typing import Members, TypedDict, ReadOnly
+
+class Config(TypedDict):
+ name: str
+ version: ReadOnly[int]
+
+m: Members[Config]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, Never], typing.Member[Literal['version'], builtins.int, Literal['ReadOnly'], Never, Never]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersTypedDictInheritance]
+from typing import Members, TypedDict, NotRequired
+
+class Base(TypedDict):
+ id: int
+ name: str
+
+class Extended(Base):
+ email: NotRequired[str]
+
+m: Members[Extended]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['id'], builtins.int, Never, Never, Never], typing.Member[Literal['name'], builtins.str, Never, Never, Never], typing.Member[Literal['email'], builtins.str, Literal['NotRequired'], Never, Never]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersTypedDictFunctional]
+from typing import Members, TypedDict
+
+Person = TypedDict('Person', {'name': str, 'age': int})
+
+m: Members[Person]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, Never], typing.Member[Literal['age'], builtins.int, Never, Never, Never]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNewTypedDict]
+from typing import NewTypedDict, Member, Literal, Never
+
+# Create a TypedDict from Member types
+TD = NewTypedDict[
+ Member[Literal['name'], str, Never, int, int],
+ Member[Literal['age'], int, Never, int, int],
+]
+
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str, 'age': builtins.int})"
+x = {'name': 'Alice', 'age': 30}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorNewTypedDictNotRequired]
+from typing import NewTypedDict, Member, Literal, Never
+
+# Create a TypedDict with NotRequired field
+TD = NewTypedDict[
+ Member[Literal['name'], str, Never, int, int],
+ Member[Literal['nickname'], str, Literal['NotRequired'], int, int],
+]
+
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str, 'nickname'?: builtins.str})"
+x = {'name': 'Alice'} # nickname is optional
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewTypedDictWithComprehensionRoundTrip]
+from typing import NewTypedDict, Members, TypedDict, Iter
+
+class Person(TypedDict):
+ name: str
+ age: int
+
+# Round-trip: Members extracts, NewTypedDict reconstructs
+TD = NewTypedDict[*[m for m in Iter[Members[Person]]]]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str, 'age': builtins.int})"
+x = {'name': 'Alice', 'age': 30}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewTypedDictWithComprehensionFilter]
+from typing import NewTypedDict, Members, TypedDict, Iter, IsAssignable, GetMemberType, Literal
+
+class Person(TypedDict):
+ name: str
+ age: int
+ active: bool
+
+# Filter to only string fields
+TD = NewTypedDict[*[m for m in Iter[Members[Person]] if IsAssignable[GetMemberType[m, Literal['typ']], str]]]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str})"
+x = {'name': 'Alice'}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testNewTypedDictWithComprehensionFromClass]
+from typing import NewTypedDict, Attrs, Iter
+
+class Point:
+ x: int
+ y: int
+
+# Create a TypedDict from class attributes
+TD = NewTypedDict[*[m for m in Iter[Attrs[Point]]]]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'x': builtins.int, 'y': builtins.int})"
+x = {'x': 1, 'y': 2}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersInit]
+# flags: --python-version 3.12
+# Test that the Member Init field captures literal values from Final variables
+from typing import Members, Final, GetInit, Iter
+
+class Config:
+ NAME: Final[str] = "app"
+ VERSION: Final[int] = 42
+ ENABLED: Final[bool] = True
+ RATIO: Final[float] = 3.14
+ plain: str # No initializer
+
+m: Members[Config]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['NAME'], builtins.str, Literal['Final'], Literal['app'], __main__.Config], typing.Member[Literal['VERSION'], builtins.int, Literal['Final'], Literal[42], __main__.Config], typing.Member[Literal['ENABLED'], builtins.bool, Literal['Final'], Literal[True], __main__.Config], typing.Member[Literal['RATIO'], builtins.float, Literal['Final'], builtins.float, __main__.Config], typing.Member[Literal['plain'], builtins.str, Never, Never, __main__.Config]]"
+
+# Test GetInit accessor
+type InitValues = tuple[*[GetInit[m] for m in Iter[Members[Config]]]]
+x: InitValues
+reveal_type(x) # N: Revealed type is "tuple[Literal['app'], Literal[42], Literal[True], builtins.float, Never]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testTypeOperatorMembersInitNonFinal]
+# flags: --python-version 3.12
+# Test that the Member Init field is populated for non-Final variables with initializers
+from typing import Members, Attrs, GetInit, Iter
+
+class Data:
+ name: str = "default"
+ email: str = "lol@lol.com"
+ count: int = 42
+ active: bool = True
+ no_init: str # No initializer
+
+m: Members[Data]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Literal['default'], __main__.Data], typing.Member[Literal['email'], builtins.str, Never, Literal['lol@lol.com'], __main__.Data], typing.Member[Literal['count'], builtins.int, Never, Literal[42], __main__.Data], typing.Member[Literal['active'], builtins.bool, Never, Literal[True], __main__.Data], typing.Member[Literal['no_init'], builtins.str, Never, Never, __main__.Data]]"
+
+# Test that Attrs also captures the initializer
+a: Attrs[Data]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Literal['default'], __main__.Data], typing.Member[Literal['email'], builtins.str, Never, Literal['lol@lol.com'], __main__.Data], typing.Member[Literal['count'], builtins.int, Never, Literal[42], __main__.Data], typing.Member[Literal['active'], builtins.bool, Never, Literal[True], __main__.Data], typing.Member[Literal['no_init'], builtins.str, Never, Never, __main__.Data]]"
+
+# Test GetInit accessor for non-Final variables
+type InitValues = tuple[*[GetInit[m] for m in Iter[Attrs[Data]]]]
+x: InitValues
+reveal_type(x) # N: Revealed type is "tuple[Literal['default'], Literal['lol@lol.com'], Literal[42], Literal[True], Never]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationMemberAccess]
+# flags: --python-version 3.12
+# Test dot notation for Member component access
+from typing import Members, TypedDict, Iter, Member, Literal, Never
+
+class Person(TypedDict):
+ name: str
+ age: int
+
+m: Members[Person]
+reveal_type(m) # N: Revealed type is "tuple[typing.Member[Literal['name'], builtins.str, Never, Never, Never], typing.Member[Literal['age'], builtins.int, Never, Never, Never]]"
+
+# Use dot notation to access Member fields
+type Names = tuple[*[T.name for T in Iter[Members[Person]]]]
+x: Names
+reveal_type(x) # N: Revealed type is "tuple[Literal['name'], Literal['age']]"
+
+type Types = tuple[*[T.typ for T in Iter[Members[Person]]]]
+y: Types
+reveal_type(y) # N: Revealed type is "tuple[builtins.str, builtins.int]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationNewTypedDict]
+# flags: --python-version 3.12
+# Test dot notation in dict comprehension with NewTypedDict
+from typing import NewTypedDict, Members, TypedDict, Iter
+
+class Person(TypedDict):
+ name: str
+ age: int
+
+# Use dot notation in dict comprehension
+type TD = NewTypedDict[{m.name: m.typ for m in Iter[Members[Person]]}]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str, 'age': builtins.int})"
+x = {'name': 'Alice', 'age': 30}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationWithCondition]
+# flags: --python-version 3.12
+# Test dot notation with comprehension condition
+from typing import NewTypedDict, Members, TypedDict, Iter, IsAssignable
+
+class Person(TypedDict):
+ name: str
+ age: int
+ active: bool
+
+# Filter to only string fields using dot notation
+type TD = NewTypedDict[{m.name: m.typ for m in Iter[Members[Person]] if IsAssignable[m.typ, str]}]
+x: TD
+reveal_type(x) # N: Revealed type is "TypedDict({'name': builtins.str})"
+x = {'name': 'Alice'}
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationOnSubscriptedType]
+# flags: --python-version 3.12
+# Test dot notation on a subscripted type expression
+from typing import Member, Literal, Never
+
+class Point:
+ x: int
+ y: str
+
+# Dot notation on a subscripted type: Member[...].typ
+x: Member[Literal['x'], int, Never, Never, Point].typ
+reveal_type(x) # N: Revealed type is "builtins.int"
+
+y: Member[Literal['x'], int, Never, Never, Point].name
+reveal_type(y) # N: Revealed type is "Literal['x']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationOnConditionalType]
+# flags: --python-version 3.12
+# Test dot notation where the LHS is a conditional type expression
+from typing import Member, Literal, Never, IsAssignable
+
+class Point:
+ x: int
+ y: str
+
+# Conditional evaluates to a Member, then .typ extracts the type
+x: (Member[Literal['x'], int, Never, Never, Point] if IsAssignable[int, int] else Member[Literal['y'], str, Never, Never, Point]).typ
+reveal_type(x) # N: Revealed type is "builtins.int"
+
+# Same inside a type alias
+type T = (Member[Literal['x'], int, Never, Never, Point] if IsAssignable[int, str] else Member[Literal['y'], str, Never, Never, Point]).name
+a: T
+reveal_type(a) # N: Revealed type is "Literal['y']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testAttrsOnEnum]
+# flags: --python-version 3.12
+from typing import Attrs, Iter, Union, IsAssignable, Never
+from enum import Enum
+
+class Color(Enum):
+ RED = 1
+ GREEN = 2
+ BLUE = 3
+
+a: Attrs[Color]
+reveal_type(a) # N: Revealed type is "tuple[typing.Member[Literal['RED'], Literal[1]?, Literal['Final'], Literal[1], __main__.Color], typing.Member[Literal['GREEN'], Literal[2]?, Literal['Final'], Literal[2], __main__.Color], typing.Member[Literal['BLUE'], Literal[3]?, Literal['Final'], Literal[3], __main__.Color]]"
+
+type EnumNames[T] = Union[*[
+ m.name for m in Iter[Attrs[T]]
+ if not IsAssignable[m.init, Never]
+]]
+type EnumTypes[T] = Union[*[
+ m.init for m in Iter[Attrs[T]]
+ if not IsAssignable[m.init, Never]
+]]
+
+n: EnumNames[Color]
+reveal_type(n) # N: Revealed type is "Literal['RED'] | Literal['GREEN'] | Literal['BLUE']"
+
+t: EnumTypes[Color]
+reveal_type(t) # N: Revealed type is "Literal[1] | Literal[2] | Literal[3]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testDotNotationOnSubscriptedTypeAlias]
+# flags: --python-version 3.12
+# Test dot notation on subscripted type inside a type alias
+from typing import Member, Literal, Never
+
+class Point:
+ x: int
+ y: str
+
+type T = Member[Literal['x'], int, Never, Never, Point].typ
+a: T
+reveal_type(a) # N: Revealed type is "builtins.int"
+
+type N = Member[Literal['x'], int, Never, Never, Point].name
+b: N
+reveal_type(b) # N: Revealed type is "Literal['x']"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetMemberBasic]
+from typing import GetMember, Member, Literal, Never
+
+class Foo:
+ x: int
+ y: str
+
+a: GetMember[Foo, Literal['x']]
+reveal_type(a) # N: Revealed type is "typing.Member[Literal['x'], builtins.int, Never, Never, __main__.Foo]"
+b: GetMember[Foo, Literal['y']]
+reveal_type(b) # N: Revealed type is "typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Foo]"
+
+# Non-existent member produces an error
+c: GetMember[Foo, Literal['z']] # E: GetMember: 'z' not found in __main__.Foo
+reveal_type(c) # N: Revealed type is "Any"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetMemberTypedDict]
+from typing import GetMember, Member, Literal, Never, TypedDict
+
+class TD(TypedDict):
+ name: str
+ age: int
+
+a: GetMember[TD, Literal['name']]
+reveal_type(a) # N: Revealed type is "typing.Member[Literal['name'], builtins.str, Never, Never, Never]"
+b: GetMember[TD, Literal['age']]
+reveal_type(b) # N: Revealed type is "typing.Member[Literal['age'], builtins.int, Never, Never, Never]"
+c: GetMember[TD, Literal['missing']] # E: GetMember: 'missing' not found in TypedDict(__main__.TD, {'name': str, 'age': int})
+reveal_type(c) # N: Revealed type is "Any"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetMemberInherited]
+from typing import GetMember, Member, Literal, Never
+
+class Base:
+ x: int
+
+class Child(Base):
+ y: str
+
+a: GetMember[Child, Literal['x']]
+reveal_type(a) # N: Revealed type is "typing.Member[Literal['x'], builtins.int, Never, Never, __main__.Base]"
+b: GetMember[Child, Literal['y']]
+reveal_type(b) # N: Revealed type is "typing.Member[Literal['y'], builtins.str, Never, Never, __main__.Child]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+[case testGetMemberGeneric]
+from typing import GetMember, Member, Generic, TypeVar, Literal, Never
+
+T = TypeVar('T')
+
+class Box(Generic[T]):
+ value: T
+
+a: GetMember[Box[int], Literal['value']]
+reveal_type(a) # N: Revealed type is "typing.Member[Literal['value'], builtins.int, Never, Never, __main__.Box[builtins.int]]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
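+
+-- A sketch of an additional case: when a child class overrides an
+-- inherited member, GetMember should presumably report the child as the
+-- definer. Marked xfail: the expected revealed type below is an
+-- assumption, not verified behavior.
+[case testGetMemberOverridden-xfail]
+from typing import GetMember, Member, Literal, Never
+
+class Base:
+ x: int
+
+class Child(Base):
+ x: bool
+
+a: GetMember[Child, Literal['x']]
+reveal_type(a) # N: Revealed type is "typing.Member[Literal['x'], builtins.bool, Never, Never, __main__.Child]"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]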
diff --git a/test-data/unit/check-typelevel-update-class.test b/test-data/unit/check-typelevel-update-class.test
new file mode 100644
index 0000000000000..3627ba80e5cc7
--- /dev/null
+++ b/test-data/unit/check-typelevel-update-class.test
@@ -0,0 +1,66 @@
+-- TODO: Test inheritance more carefully:
+-- - Multiple bases with __init_subclass__ returning UpdateClass (reverse MRO order)
+-- - UpdateClass on a class that is itself subclassed (do children see the added members?)
+-- - UpdateClass adding a member that conflicts with an inherited member
+-- - Diamond inheritance with UpdateClass on multiple paths
+-- - UpdateClass + explicit __init__ on the same class
+
+[case testUpdateClassBasicDecorator]
+# flags: --python-version 3.14
+from typing import Literal, Never, Member, UpdateClass
+
+def add_x[T](cls: type[T]) -> UpdateClass[
+ Member[Literal["x"], int],
+]:
+ ...
+
+@add_x
+class Foo:
+ y: str
+
+reveal_type(Foo.x) # N: Revealed type is "builtins.int"
+reveal_type(Foo().y) # N: Revealed type is "builtins.str"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testUpdateClassRemoveMember]
+# flags: --python-version 3.14
+from typing import Literal, Never, Member, UpdateClass
+
+def remove_y[T](cls: type[T]) -> UpdateClass[
+ Member[Literal["y"], Never],
+]:
+ ...
+
+@remove_y
+class Foo:
+ x: int
+ y: str
+
+reveal_type(Foo().x) # N: Revealed type is "builtins.int"
+Foo().y # E: "Foo" has no attribute "y"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+
+
+[case testUpdateClassInitSubclass]
+# flags: --python-version 3.14
+from typing import Literal, Never, Member, UpdateClass
+
+class Base:
+ def __init_subclass__[T](cls: type[T]) -> UpdateClass[
+ Member[Literal["tag"], str],
+ ]:
+ ...
+
+class Child(Base):
+ x: int
+
+reveal_type(Child.tag) # N: Revealed type is "builtins.str"
+reveal_type(Child().x) # N: Revealed type is "builtins.int"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
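+
+
+-- A sketch for one of the TODOs above: UpdateClass on a class that is
+-- itself subclassed (do children see the added members?). Marked xfail:
+-- the expected revealed types below are assumptions, not verified
+-- behavior.
+[case testUpdateClassSubclassSeesMembers-xfail]
+# flags: --python-version 3.14
+from typing import Literal, Member, UpdateClass
+
+def add_x[T](cls: type[T]) -> UpdateClass[
+ Member[Literal["x"], int],
+]:
+ ...
+
+@add_x
+class Base:
+ y: str
+
+class Child(Base):
+ z: float
+
+reveal_type(Child().x) # N: Revealed type is "builtins.int"
+reveal_type(Child().y) # N: Revealed type is "builtins.str"
+reveal_type(Child().z) # N: Revealed type is "builtins.float"
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]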
diff --git a/test-data/unit/check-typevar-tuple.test b/test-data/unit/check-typevar-tuple.test
index 703653227e200..455e9a98ed825 100644
--- a/test-data/unit/check-typevar-tuple.test
+++ b/test-data/unit/check-typevar-tuple.test
@@ -2191,7 +2191,7 @@ g(1, 2, 3) # E: Missing named argument "a" for "g" \
def bad(
*args: Unpack[Keywords], # E: "Keywords" cannot be unpacked (must be tuple or TypeVarTuple)
- **kwargs: Unpack[Ints], # E: Unpack item in ** parameter must be a TypedDict
+ **kwargs: Unpack[Ints], # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
) -> None: ...
reveal_type(bad) # N: Revealed type is "def (*args: Any, **kwargs: Any)"
@@ -2199,7 +2199,7 @@ def bad2(
one: int,
*args: Unpack[Keywords], # E: "Keywords" cannot be unpacked (must be tuple or TypeVarTuple)
other: str = "no",
- **kwargs: Unpack[Ints], # E: Unpack item in ** parameter must be a TypedDict
+ **kwargs: Unpack[Ints], # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
) -> None: ...
reveal_type(bad2) # N: Revealed type is "def (one: builtins.int, *args: Any, other: builtins.str =, **kwargs: Any)"
[builtins fixtures/dict.pyi]
diff --git a/test-data/unit/check-varargs.test b/test-data/unit/check-varargs.test
index 172e57cf1a4b3..89b88e8167b53 100644
--- a/test-data/unit/check-varargs.test
+++ b/test-data/unit/check-varargs.test
@@ -792,7 +792,7 @@ def baz(**kwargs: Unpack[Person]) -> None: # OK
[case testUnpackWithoutTypedDict]
from typing_extensions import Unpack
-def foo(**kwargs: Unpack[dict]) -> None: # E: Unpack item in ** parameter must be a TypedDict
+def foo(**kwargs: Unpack[dict]) -> None: # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
...
[builtins fixtures/dict.pyi]
diff --git a/test-data/unit/fine-grained-python314.test b/test-data/unit/fine-grained-python314.test
index 40af43567bd15..601ed34cda8e2 100644
--- a/test-data/unit/fine-grained-python314.test
+++ b/test-data/unit/fine-grained-python314.test
@@ -15,3 +15,151 @@ LiteralString = str
[out]
==
main:2: error: Incompatible types in assignment (expression has type "Template", variable has type "str")
+
+[case testFineGrainedNewProtocol-xfail]
+# flags: --python-version 3.14
+import a
+import b
+
+[file a.py]
+
+from typing import NewProtocol, Member, Literal, Iter
+
+# Basic NewProtocol creation
+type MyProto = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+x: MyProto
+
+type LinkedList[T] = NewProtocol[
+ Member[Literal["data"], T],
+ Member[Literal["next"], LinkedList[T]],
+]
+
+z: LinkedList[str]
+
+items: tuple[*[t for t in Iter[tuple[MyProto]]]]
+
+inline: NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+
+[file b.py]
+from a import MyProto, LinkedList
+from typing import NewProtocol, Member, Literal
+
+x: MyProto
+
+class Good:
+ x: int
+ y: str
+
+def takes_proto(p: MyProto) -> None:
+ pass
+
+takes_proto(Good())
+
+z: LinkedList[str]
+
+# A different NewProtocol that should be incompatible with MyProto
+type OtherProto = NewProtocol[
+ Member[Literal["bar"], int],
+]
+other: OtherProto
+x = other
+
+[file b.py.2]
+from a import MyProto, LinkedList
+from typing import NewProtocol, Member, Literal
+
+x: MyProto
+
+class Good:
+ x: int
+ y: str
+
+def takes_proto(p: MyProto) -> None:
+ pass
+
+takes_proto(Good())
+
+z: LinkedList[str]
+
+# A different NewProtocol that should be incompatible with MyProto
+type OtherProto = NewProtocol[
+ Member[Literal["bar"], int],
+]
+other: OtherProto
+x = other
+
+# dummy change
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+[out]
+b.py:22: error: Incompatible types in assignment (expression has type "NewProtocol[bar: int]", variable has type "NewProtocol[x: int, y: str]")
+==
+b.py:22: error: Incompatible types in assignment (expression has type "NewProtocol[bar: int]", variable has type "NewProtocol[x: int, y: str]")
+
+[case testFineGrainedUpdateClassWithNewProtocol-xfail]
+# flags: --python-version 3.14
+import a
+import b
+
+[file a.py]
+from typing import Literal, Member, NewProtocol, UpdateClass
+
+type MyProto = NewProtocol[
+ Member[Literal["x"], int],
+ Member[Literal["y"], str],
+]
+
+def add_proto[T](cls: type[T]) -> UpdateClass[
+ Member[Literal["proto"], MyProto],
+]:
+ ...
+
+@add_proto
+class Foo:
+ z: float
+
+[file b.py]
+from a import Foo, MyProto
+
+reveal_type(Foo().proto)
+reveal_type(Foo().z)
+
+class Good:
+ x: int
+ y: str
+
+g: Good
+p: MyProto
+p = g
+
+[file b.py.2]
+from a import Foo, MyProto
+
+reveal_type(Foo().proto)
+reveal_type(Foo().z)
+
+class Good:
+ x: int
+ y: str
+
+g: Good
+p: MyProto
+p = g
+
+# dummy change
+
+[builtins fixtures/typelevel.pyi]
+[typing fixtures/typing-full.pyi]
+[out]
+b.py:3: note: Revealed type is "NewProtocol[x: builtins.int, y: builtins.str]"
+b.py:4: note: Revealed type is "builtins.float"
+==
+b.py:3: note: Revealed type is "NewProtocol[x: builtins.int, y: builtins.str]"
+b.py:4: note: Revealed type is "builtins.float"
diff --git a/test-data/unit/fixtures/typelevel.pyi b/test-data/unit/fixtures/typelevel.pyi
new file mode 100644
index 0000000000000..977da35cf3db9
--- /dev/null
+++ b/test-data/unit/fixtures/typelevel.pyi
@@ -0,0 +1,107 @@
+# Builtins stub used in typelevel-related test cases.
+
+import _typeshed
+from typing import Iterable, Iterator, TypeVar, Generic, Sequence, Mapping, Optional, overload, Tuple, Type, Union, Self, type_check_only, _type_operator
+
+_T = TypeVar("_T")
+_Tco = TypeVar('_Tco', covariant=True)
+
+class object:
+ def __init__(self) -> None: pass
+ def __new__(cls) -> Self: ...
+ def __str__(self) -> str: pass
+
+class type:
+ def __init__(self, *a: object) -> None: pass
+ def __call__(self, *a: object) -> object: pass
+class tuple(Sequence[_Tco], Generic[_Tco]):
+ def __hash__(self) -> int: ...
+ def __new__(cls: Type[_T], iterable: Iterable[_Tco] = ...) -> _T: ...
+ def __iter__(self) -> Iterator[_Tco]: pass
+ def __contains__(self, item: object) -> bool: pass
+ @overload
+ def __getitem__(self, x: int) -> _Tco: pass
+ @overload
+ def __getitem__(self, x: slice) -> Tuple[_Tco, ...]: ...
+ def __mul__(self, n: int) -> Tuple[_Tco, ...]: pass
+ def __rmul__(self, n: int) -> Tuple[_Tco, ...]: pass
+ def __add__(self, x: Tuple[_Tco, ...]) -> Tuple[_Tco, ...]: pass
+ def count(self, obj: object) -> int: pass
+class function:
+ __name__: str
+class ellipsis: pass
+class classmethod: pass
+
+# We need int and slice for indexing tuples.
+class int:
+ def __neg__(self) -> 'int': pass
+ def __pos__(self) -> 'int': pass
+class float: pass
+class slice: pass
+class bool(int): pass
+class str: pass # For convenience
+class bytes: pass
+class bytearray: pass
+
+class list(Sequence[_T], Generic[_T]):
+ @overload
+ def __getitem__(self, i: int) -> _T: ...
+ @overload
+ def __getitem__(self, s: slice) -> list[_T]: ...
+ def __contains__(self, item: object) -> bool: ...
+ def __iter__(self) -> Iterator[_T]: ...
+
+def isinstance(x: object, t: type) -> bool: pass
+
+class BaseException: pass
+
+KT = TypeVar('KT')
+VT = TypeVar('VT')
+T = TypeVar('T')
+
+class dict(Mapping[KT, VT]):
+ @overload
+ def __init__(self, **kwargs: VT) -> None: pass
+ @overload
+ def __init__(self, arg: Iterable[Tuple[KT, VT]], **kwargs: VT) -> None: pass
+ def __getitem__(self, key: KT) -> VT: pass
+ def __setitem__(self, k: KT, v: VT) -> None: pass
+ def __iter__(self) -> Iterator[KT]: pass
+ def __contains__(self, item: object) -> int: pass
+ @overload
+ def get(self, k: KT) -> Optional[VT]: pass
+ @overload
+ def get(self, k: KT, default: Union[VT, T]) -> Union[VT, T]: pass
+ def __len__(self) -> int: ...
+
+
+# Type-level computation stuff
+
+_TrueType = TypeVar('_TrueType')
+_FalseType = TypeVar('_FalseType')
+
+@type_check_only
+@_type_operator
+class _Cond(Generic[_T, _TrueType, _FalseType]): ...
+
+_T2 = TypeVar('_T2')
+
+@type_check_only
+@_type_operator
+class _And(Generic[_T, _T2]): ...
+
+@type_check_only
+@_type_operator
+class _Or(Generic[_T, _T2]): ...
+
+@type_check_only
+@_type_operator
+class _Not(Generic[_T]): ...
+
+@type_check_only
+@_type_operator
+class _DictEntry(Generic[_T, _T2]): ...
+
+@type_check_only
+@_type_operator
+class _TypeGetAttr(Generic[_T, _T2]): ...
diff --git a/test-data/unit/fixtures/typing-full.pyi b/test-data/unit/fixtures/typing-full.pyi
index 59e2f9de9929d..0a0eaf0a2e612 100644
--- a/test-data/unit/fixtures/typing-full.pyi
+++ b/test-data/unit/fixtures/typing-full.pyi
@@ -38,8 +38,12 @@ TypedDict = 0
TypeGuard = 0
NoReturn = 0
NewType = 0
+Required = 0
+NotRequired = 0
+ReadOnly = 0
Self = 0
Unpack = 0
+Never = 0
Callable: _SpecialForm
Union: _SpecialForm
Literal: _SpecialForm
@@ -229,3 +233,135 @@ class TypeAliasType:
def __or__(self, other: Any) -> Any: ...
def __ror__(self, other: Any) -> Any: ...
+
+# Type-level computation support
+
+class BaseTypedDict(TypedDict):
+ pass
+
+def _type_operator(cls: type[T]) -> type[T]: ...
+
+_Ts = TypeVarTuple("_Ts")
+
+@_type_operator
+class Iter(Generic[T]): ...
+
+@_type_operator
+class Map(Generic[T]): ...
+
+@_type_operator
+class IsAssignable(Generic[T, U]): ...
+
+@_type_operator
+class IsEquivalent(Generic[T, U]): ...
+
+@_type_operator
+class Bool(Generic[T]): ...
+
+@_type_operator
+class GetArg(Generic[T, U, V]): ...
+
+@_type_operator
+class GetArgs(Generic[T, U]): ...
+
+@_type_operator
+class FromUnion(Generic[T]): ...
+
+@_type_operator
+class GetMember(Generic[T, U]): ...
+
+@_type_operator
+class GetMemberType(Generic[T, U]): ...
+
+@_type_operator
+class Slice(Generic[T, U, V]): ...
+
+@_type_operator
+class Concat(Generic[T, U]): ...
+
+@_type_operator
+class Uppercase(Generic[T]): ...
+
+@_type_operator
+class Lowercase(Generic[T]): ...
+
+@_type_operator
+class Capitalize(Generic[T]): ...
+
+@_type_operator
+class Uncapitalize(Generic[T]): ...
+
+@_type_operator
+class Length(Generic[T]): ...
+
+@_type_operator
+class RaiseError(Generic[T, Unpack[_Ts]]): ...
+
+@_type_operator
+class Members(Generic[T]): ...
+
+@_type_operator
+class Attrs(Generic[T]): ...
+
+@_type_operator
+class NewTypedDict(Generic[Unpack[_Ts]]): ...
+
+@_type_operator
+class NewProtocol(Generic[Unpack[_Ts]]): ...
+
+@_type_operator
+class UpdateClass(Generic[Unpack[_Ts]]): ...
+
+@_type_operator
+class _NewUnion(Generic[Unpack[_Ts]]): ...
+
+@_type_operator
+class _NewCallable(Generic[Unpack[_Ts]]): ...
+
+# Member data type for type-level computation
+_Name = TypeVar('_Name')
+_Type = TypeVar('_Type')
+_Quals = TypeVar("_Quals", default=Never)
+_Init = TypeVar("_Init", default=Never)
+_Definer = TypeVar("_Definer", default=Never)
+
+_PQuals = TypeVar("_PQuals", default=Never)
+
+class Param(Generic[_Name, _Type, _PQuals]):
+ """Represents a function parameter for extended callable syntax."""
+ name: _Name
+ type: _Type
+ quals: _PQuals
+
+class Params(Generic[Unpack[_Ts]]): ...
+
+class Member(Generic[_Name, _Type, _Quals, _Init, _Definer]):
+ """
+ Represents a class member with name, type, qualifiers, initializer, and definer.
+ """
+ name: _Name
+ typ: _Type
+ quals: _Quals
+ init: _Init
+ definer: _Definer
+
+
+# _MP = TypeVar("_MP", bound=Member[Any, Any, Any, Any, Any] | Param[Any, Any, Any])
+# _M = TypeVar("_M", bound=Member[Any, Any, Any, Any, Any])
+_MP = TypeVar("_MP")
+_M = TypeVar("_M")
+
+GetName = GetMemberType[_MP, Literal["name"]]
+GetType = GetMemberType[_MP, Literal["typ"]]
+GetQuals = GetMemberType[_MP, Literal["quals"]]
+GetInit = GetMemberType[_M, Literal["init"]]
+GetDefiner = GetMemberType[_M, Literal["definer"]]
+
+_KwargDict = TypeVar('_KwargDict', bound=BaseTypedDict)
+
+class InitField(Generic[_KwargDict], Any):
+ def __init__(self, **kwargs: Unpack[_KwargDict]) -> None:
+ ...
+
+ def _get_kwargs(self) -> _KwargDict:
+ ...
diff --git a/test-data/unit/fixtures/typing-typeddict.pyi b/test-data/unit/fixtures/typing-typeddict.pyi
index 0bc5637b32708..0421934f7c528 100644
--- a/test-data/unit/fixtures/typing-typeddict.pyi
+++ b/test-data/unit/fixtures/typing-typeddict.pyi
@@ -81,3 +81,8 @@ class _TypedDict(Mapping[str, object]):
def __delitem__(self, k: NoReturn) -> None: ...
class _SpecialForm: pass
+
+# A dummy TypedDict declaration inside typing.pyi, to test that
+# semanal can handle it.
+class _TestBaseTypedDict(TypedDict):
+ pass
diff --git a/test-data/unit/pythoneval.test b/test-data/unit/pythoneval.test
index 2d3f867d8dfdf..db88a1784d215 100644
--- a/test-data/unit/pythoneval.test
+++ b/test-data/unit/pythoneval.test
@@ -2229,3 +2229,173 @@ def f(x: int, y: list[str]):
x in y
[out]
_testStrictEqualityWithList.py:3: error: Non-overlapping container check (element type: "int", container item type: "str")
+
+[case testTypeLevel]
+# flags: --python-version=3.15
+from __future__ import annotations
+
+from typing import (
+ Callable,
+ Literal,
+ Union,
+ ReadOnly,
+ TypedDict,
+ Never,
+ Self,
+)
+
+import typing
+
+
+class FieldArgs(TypedDict, total=False):
+ hidden: ReadOnly[bool]
+ primary_key: ReadOnly[bool]
+ index: ReadOnly[bool]
+ default: ReadOnly[object]
+
+
+class Field[T: FieldArgs](typing.InitField[T]):
+ pass
+
+
+####
+
+# TODO: Should this go into the stdlib?
+type GetFieldItem[T, K] = typing.GetMemberType[
+ typing.GetArg[T, typing.InitField, Literal[0]], K
+]
+
+
+##
+
+# Strip `| None` from a type by iterating over its union components
+# and filtering
+type NotOptional[T] = Union[
+ *[x for x in typing.Iter[typing.FromUnion[T]] if not typing.IsAssignable[x, None]]
+]
+
+# Adjust an attribute type for use in Public below by dropping `| None`
+# for primary keys.
+type FixPublicType[T, Init] = (
+ NotOptional[T]
+ if typing.IsAssignable[Literal[True], GetFieldItem[Init, Literal["primary_key"]]]
+ else T
+)
+
+# Strip out every hidden field and make the primary key required.
+# Drop the initializer annotations, since this is for data returned to
+# users from the DB, so we don't need default values.
+type Public[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ FixPublicType[typing.GetType[p], typing.GetInit[p]],
+ typing.GetQuals[p],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True], GetFieldItem[typing.GetInit[p], Literal["hidden"]]
+ ]
+ ]
+]
+
+# Begin PEP section: Automatically deriving FastAPI CRUD models
+"""
+We have a more `fully-worked example <#fastapi-test_>`_ in our test
+suite, but here is a possible implementation of just ``Public``
+"""
+
+# Extract the default type from an Init field.
+# If it is a Field, then we try pulling out the "default" field,
+# otherwise we return the type itself.
+type GetDefault[Init] = (
+ GetFieldItem[Init, Literal["default"]]
+ if typing.IsAssignable[Init, Field]
+ else Init
+)
+
+# Create takes everything but the primary key and preserves defaults
+type Create[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ typing.GetType[p],
+ typing.GetQuals[p],
+ GetDefault[typing.GetInit[p]],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True],
+ GetFieldItem[typing.GetInit[p], Literal["primary_key"]],
+ ]
+ ]
+]
+
+# Update takes everything but the primary key, making every field
+# optional with a None default
+type Update[T] = typing.NewProtocol[
+ *[
+ typing.Member[
+ typing.GetName[p],
+ typing.GetType[p] | None,
+ typing.GetQuals[p],
+ Literal[None],
+ ]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True],
+ GetFieldItem[typing.GetInit[p], Literal["primary_key"]],
+ ]
+ ]
+]
+
+class Hero:
+ id: int | None = Field(default=None, primary_key=True)
+
+ name: str = Field(index=True)
+ age: int | None = Field(default=None, index=True)
+
+ secret_name: str = Field(hidden=True)
+
+
+type HeroPublic = Public[Hero]
+type HeroCreate = Create[Hero]
+type HeroUpdate = Update[Hero]
+
+pub: HeroPublic
+reveal_type(pub)
+
+creat: HeroCreate
+reveal_type(creat)
+
+upd: HeroUpdate
+reveal_type(upd)
+
+creat_members: tuple[*[typing.GetInit[p] for p in typing.Iter[typing.Members[HeroCreate]]]]
+reveal_type(creat_members)
+
+upd_types: tuple[*[typing.GetType[p] for p in typing.Iter[typing.Members[HeroUpdate]]]]
+reveal_type(upd_types)
+
+upd_types_dot: tuple[*[p.typ for p in typing.Iter[typing.Members[HeroUpdate]]]]
+reveal_type(upd_types_dot)
+
+# Test dict comprehension syntax for NewTypedDict
+type HeroDict[T] = typing.NewTypedDict[{
+ typing.GetName[p]: typing.GetType[p]
+ for p in typing.Iter[typing.Attrs[T]]
+ if not typing.IsAssignable[
+ Literal[True], GetFieldItem[typing.GetInit[p], Literal["hidden"]]
+ ]
+}]
+
+hd: HeroDict[Hero]
+reveal_type(hd)
+[out]
+_program.py:133: note: Revealed type is "NewProtocol[id: int, name: str, age: int | None]"
+_program.py:136: note: Revealed type is "NewProtocol[name: str, age: int | None = None, secret_name: str]"
+_program.py:139: note: Revealed type is "NewProtocol[name: str | None = None, age: int | None | None = None, secret_name: str | None = None]"
+_program.py:142: note: Revealed type is "tuple[Never, None, Never]"
+_program.py:145: note: Revealed type is "tuple[str | None, int | None | None, str | None]"
+_program.py:148: note: Revealed type is "tuple[str | None, int | None | None, str | None]"
+_program.py:160: note: Revealed type is "TypedDict({'id': int | None, 'name': str, 'age': int | None})"
diff --git a/test-data/unit/semanal-errors.test b/test-data/unit/semanal-errors.test
index 40db0537c413e..3459774395fa6 100644
--- a/test-data/unit/semanal-errors.test
+++ b/test-data/unit/semanal-errors.test
@@ -1471,7 +1471,7 @@ class Variadic(Generic[Unpack[TVariadic], Unpack[TVariadic2]]): # E: Can only u
def bad_args(*args: TVariadic): # E: TypeVarTuple "TVariadic" is only valid with an unpack
pass
-def bad_kwargs(**kwargs: Unpack[TVariadic]): # E: Unpack item in ** parameter must be a TypedDict
+def bad_kwargs(**kwargs: Unpack[TVariadic]): # E: Unpack item in ** parameter must be a TypedDict or a TypeVar with TypedDict bound
pass
[builtins fixtures/dict.pyi]