
Host country-agnostic microcalibrate adapter under microplex.calibration #6

Merged

MaxGhenis merged 1 commit into main from relocate-microcalibrate-adapter on Apr 18, 2026

Conversation

@MaxGhenis
Contributor

Summary

Moves MicrocalibrateAdapter / MicrocalibrateAdapterConfig up from microplex-us so every country package (microplex-us, a future microplex-uk, etc.) shares one identity-preserving gradient-descent chi-squared calibrator instead of duplicating the glue. The adapter is genuinely country-agnostic — it wraps microcalibrate.Calibration with the legacy Calibrator.fit_transform surface and uses only microplex.calibration.LinearConstraint, no US tax logic.
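For orientation, a minimal usage sketch. The class names, the import path, and the `batch_size` field come from this PR; the constructor shapes and `fit_transform` argument order are assumptions standing in for the legacy `Calibrator` surface:

```python
import pandas as pd

from microplex.calibration import (
    LinearConstraint,
    MicrocalibrateAdapter,
    MicrocalibrateAdapterConfig,
)

# batch_size is the documented knob; 100_000 is the default this PR sets.
# Passing the config positionally is an assumption.
adapter = MicrocalibrateAdapter(MicrocalibrateAdapterConfig(batch_size=100_000))

# LinearConstraint's fields are not shown in this PR, so construction is
# elided; picture one population target per age band, as in the
# 3-age-band convergence test.
constraints: list[LinearConstraint] = []

data = pd.DataFrame({"age": [5, 40, 70]})

# fit_transform mirrors the legacy Calibrator surface and returns the
# reweighted microdata; the exact argument names are assumptions.
reweighted = adapter.fit_transform(data, constraints)
```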

Structure

  • src/microplex/calibration.py → package directory (src/microplex/calibration/__init__.py). Existing public surface (LinearConstraint, Calibrator, SparseCalibrator, HardConcreteCalibrator) imports identically.
  • New sibling module src/microplex/calibration/microcalibrate_adapter.py carries the adapter.
  • calibration/__init__.py appends a try/except ImportError re-export: callers without the calibrate extra get None for the adapter names, while callers with the extra get the real classes (see the sketch after this list).
  • pyproject.toml gains calibrate = ["microcalibrate>=0.22; python_version >= '3.13'"] — opt-in because microcalibrate pulls ~1.5 GB of torch/optuna/l0 and pins Python 3.13+ while microplex supports 3.11+.
  • Default batch_size = 100_000 keeps autograd activation under ~200 MB at ~500 constraints (see PolicyEngine/microcalibrate#99, "Add batch_size gradient accumulation; release pandas matrix after init").
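For concreteness, the tail of `src/microplex/calibration/__init__.py` has roughly this shape (a minimal sketch of the described guard, not the file verbatim):

```python
# Guarded re-export: the adapter needs microcalibrate, which ships only
# with the optional `calibrate` extra.
try:
    from microplex.calibration.microcalibrate_adapter import (
        MicrocalibrateAdapter,
        MicrocalibrateAdapterConfig,
    )
except ImportError:
    # Extra not installed: expose None so feature-gate code can detect
    # the missing dependency without raising at import time.
    MicrocalibrateAdapter = None
    MicrocalibrateAdapterConfig = None
```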

Test plan

  • pytest tests/test_calibration.py tests/test_microcalibrate_adapter.py — 36 passed.
  • New tests/test_microcalibrate_adapter.py pins: importability from the canonical path; the default batch_size being set; and 3-age-band convergence within 0.1 max relative error. It skips cleanly via pytest.importorskip when the calibrate extra is not installed.
  • Existing 33 calibration tests still green (no public-surface regressions).
  • Downstream: microplex-us companion PR re-exports from here. Verified locally (pytest tests/calibration/ → 13 pass).

Rationale

Max asked: "but won't all countries use microcalibrate?" Yes. Relocating now keeps the architecture modular at every step and avoids duplicating the adapter again when a second country package lands.

🤖 Generated with Claude Code

Moves `MicrocalibrateAdapter` / `MicrocalibrateAdapterConfig` up from
`microplex-us` so every country package (microplex-us, a future
microplex-uk, etc.) shares one identity-preserving gradient-descent
chi-squared calibrator instead of duplicating the glue. The adapter is
genuinely country-agnostic — it wraps `microcalibrate.Calibration` with
the legacy `Calibrator.fit_transform` surface and uses only
`microplex.calibration.LinearConstraint` (no US tax logic).

Structure:

- `src/microplex/calibration.py` becomes a package directory
  (`src/microplex/calibration/__init__.py`) so the existing public
  namespace (`LinearConstraint`, `Calibrator`, `SparseCalibrator`,
  `HardConcreteCalibrator`) continues to import identically. The
  adapter slots in as a sibling module
  `src/microplex/calibration/microcalibrate_adapter.py`.

- `microplex.calibration.__init__` appends a `try/except ImportError`
  re-export of `MicrocalibrateAdapter` and `MicrocalibrateAdapterConfig`.
  Callers without the `calibrate` extra get `None` for those names, which
  is what typical feature-gate code (and the `pytest.importorskip`-guarded
  tests) expects. Callers with the extra get the real classes.

- `pyproject.toml` exposes `calibrate = ["microcalibrate>=0.22; python_version >= '3.13'"]`
  as an optional extra so microcalibrate's ~1.5 GB torch/optuna/l0
  footprint is opt-in, and microplex keeps working on Python 3.11/3.12
  (which microcalibrate does not support yet). The `all` extra now
  includes `calibrate`.
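In `pyproject.toml` terms that looks roughly like the sketch below. The `calibrate` line is quoted from this change; expressing `all` through a self-referencing extra is an assumption about how the file spells it:

```toml
[project.optional-dependencies]
calibrate = ["microcalibrate>=0.22; python_version >= '3.13'"]
# `all` now includes `calibrate`; a self-referencing extra is one common
# way to write that (the real file may list dependencies directly).
all = ["microplex[calibrate]"]
```

On Python 3.11/3.12 the environment marker makes the extra a no-op: `pip install 'microplex[calibrate]'` still succeeds, microcalibrate is simply not installed, and the adapter names resolve to `None` via the guarded re-export.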

Default `batch_size = 100_000` on `MicrocalibrateAdapterConfig` keeps
peak autograd activation under ~200 MB at ~500 constraints (via the
microcalibrate#99 gradient-accumulation path), so the adapter's defaults
are safe at scale.

TDD: `tests/test_microcalibrate_adapter.py` pins
- import path (`from microplex.calibration import MicrocalibrateAdapter`);
- default `batch_size` is set;
- convergence on a 3-age-band problem within 0.1 max relative error.

Skips cleanly via `pytest.importorskip("microcalibrate")` when the extra
isn't installed.
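The guard has the usual shape (illustrative; the real test module may arrange it differently):

```python
# tests/test_microcalibrate_adapter.py (sketch of the skip guard).
import pytest

# Skips every test in this module when the `calibrate` extra is absent.
pytest.importorskip("microcalibrate")


def test_adapter_importable_from_canonical_path():
    from microplex.calibration import MicrocalibrateAdapter

    assert MicrocalibrateAdapter is not None
```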

Downstream microplex-us re-exports from here so existing
`from microplex_us.calibration import MicrocalibrateAdapter` keeps
working (see companion commit on microplex-us).

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
MaxGhenis merged commit 254114d into main on Apr 18, 2026 (3 of 6 checks passed).
MaxGhenis deleted the relocate-microcalibrate-adapter branch on April 18, 2026 at 16:02.
MaxGhenis added a commit to CosilicoAI/microplex-us that referenced this pull request on Apr 22, 2026:
The adapter moved to upstream microplex (see CosilicoAI/microplex#6)
so every country package shares one identity-preserving calibrator
instead of duplicating the glue. This commit:

- Swaps pyproject dependency `microcalibrate>=0.22` for `microplex[calibrate]`,
  picking up the torch/optuna/l0 stack transitively via the extra.
- Deletes `src/microplex_us/calibration/microcalibrate_adapter.py`;
  the source of truth is now `microplex.calibration.microcalibrate_adapter`.
- Rewrites `src/microplex_us/calibration/__init__.py` to re-export the
  adapter classes from upstream so existing
  `from microplex_us.calibration import MicrocalibrateAdapter` imports
  keep working — bit-for-bit backward-compatible for downstream pipelines.
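The rewritten module is essentially a thin re-export, along these lines (a sketch; any other names the package keeps exporting are omitted here):

```python
# src/microplex_us/calibration/__init__.py (sketch).
# The implementation now lives upstream in microplex; this module only
# re-exports it so the historical import path keeps resolving.
from microplex.calibration import (
    MicrocalibrateAdapter,
    MicrocalibrateAdapterConfig,
)
```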

All 13 microplex-us calibration tests pass against the re-exported
adapter (identical behavior, upstream-hosted implementation).

Next: once microplex#6 merges, this PR can merge too; pipelines using
MicrocalibrateAdapter get the batched calibration transparently.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>