kindred-solver
Assembly constraint prediction via GNN. Produces a trained model embedded in a FreeCAD workbench (the Kindred Create library), to be integrated into vanilla Create later.
Overview
kindred-solver predicts whether assembly constraints (joints) are independent or redundant using graph neural networks. Given an assembly graph where bodies are nodes and joints are edges, the model classifies each constraint and reports degrees of freedom per body.
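To make the graph formulation concrete, here is a minimal, dependency-free sketch of one message-passing step over an assembly graph (bodies as nodes, joints as edges) followed by a toy per-joint score. This is an illustration only: the function names, feature shapes, and the dot-product "classifier" are hypothetical stand-ins; the actual models live in solver/models and use PyTorch Geometric.

```python
def gin_layer(adj, h, eps=0.0):
    """One GIN-style aggregation step on plain Python lists.
    adj: dict node -> list of neighbor nodes; h: dict node -> feature vector.
    Computes (1 + eps) * h_v + sum of neighbor features for each node v."""
    out = {}
    for v, hv in h.items():
        agg = [x * (1 + eps) for x in hv]
        for u in adj.get(v, []):
            agg = [a + b for a, b in zip(agg, h[u])]
        out[v] = agg  # a real GIN layer would pass this through an MLP
    return out

def edge_scores(h, edges):
    """Toy per-joint score: dot product of endpoint embeddings.
    A stand-in for an edge classification head over (independent, redundant)."""
    return {(u, v): sum(a * b for a, b in zip(h[u], h[v])) for u, v in edges}
```

A trained model would stack several such layers and threshold the per-edge scores to label each joint independent or redundant.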
Repository Structure
kindred-solver/
├── solver/ # Core library
│ ├── datagen/ # Synthetic data generation (pebble game)
│ ├── datasets/ # PyG dataset adapters
│ ├── models/ # GNN architectures (GIN, GAT, NNConv)
│ ├── training/ # Training loops and configs
│ ├── evaluation/ # Metrics and visualization
│ └── inference/ # Runtime prediction API
├── freecad/ # FreeCAD integration
│ ├── workbench/ # FreeCAD workbench addon
│ ├── bridge/ # FreeCAD <-> solver interface
│ └── tests/ # Integration tests
├── export/ # Model packaging for Create
├── configs/ # Hydra configs (dataset, model, training, export)
├── scripts/ # CLI utilities
├── data/ # Datasets (not committed)
├── tests/ # Unit and integration tests
└── docs/ # Documentation
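The datagen module labels synthetic constraints with a pebble game. As a hedged illustration of the idea (not the repo's actual implementation, which must handle 3D joints and per-body degrees of freedom), here is the classic (k, l) = (2, 3) pebble game of Lee and Streinu for 2D bar-joint rigidity: an edge that cannot gather l + 1 pebbles on its endpoints is redundant.

```python
from collections import defaultdict

def pebble_game(n, edges, k=2, l=3):
    """(k, l)-pebble game: label each edge 'independent' or 'redundant'."""
    pebbles = [k] * n
    out = defaultdict(set)  # accepted edges, stored as directed arcs

    def find_pebble(root, avoid):
        """DFS along arcs from root for a free pebble; reverse the path to move it."""
        seen, parent, stack = {root}, {}, [root]
        while stack:
            u = stack.pop()
            for v in out[u]:
                if v in seen or v in avoid:
                    continue
                seen.add(v)
                parent[v] = u
                if pebbles[v] > 0:
                    pebbles[v] -= 1
                    w = v
                    while w != root:  # reverse arcs along the search path
                        p = parent[w]
                        out[p].remove(w)
                        out[w].add(p)
                        w = p
                    pebbles[root] += 1
                    return True
                stack.append(v)
        return False

    labels = []
    for u, v in edges:
        # Try to gather l + 1 pebbles on the endpoints of (u, v).
        while pebbles[u] + pebbles[v] < l + 1:
            if not (find_pebble(u, {v}) or find_pebble(v, {u})):
                break
        if pebbles[u] + pebbles[v] >= l + 1:
            # Accept: spend one pebble and orient the edge away from it.
            if pebbles[u] > 0:
                pebbles[u] -= 1
                out[u].add(v)
            else:
                pebbles[v] -= 1
                out[v].add(u)
            labels.append("independent")
        else:
            labels.append("redundant")
    return labels
```

For example, the three edges of a triangle are all independent (the triangle is minimally rigid in 2D), while a duplicate of any of them comes back redundant; the generator can use such labels as per-edge ground truth.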
Setup
Install (development)
pip install -e ".[train,dev]"
pre-commit install
pre-commit install --hook-type commit-msg
Using Make
make help # show all targets
make dev # install all deps + pre-commit hooks
make test # run tests
make lint # run ruff linter
make type-check # run mypy
make check # lint + type-check + test
make train # run training
make data-gen # generate synthetic data
make export # export model
Using Docker
# GPU training
docker compose up train
# Run tests (CPU)
docker compose up test
# Generate data
docker compose up data-gen
License
Apache 2.0