Coverage for anfis_toolbox/optim/__init__.py: 100%
8 statements
coverage.py v7.13.3, created at 2026-02-05 18:47 -0300
"""Optimization algorithms for ANFIS.

This module contains pluggable training algorithms (optimizers/trainers)
that can be used with the ANFIS model.

Design goals:
- Decouple training algorithms from the model class
- Keep a simple API similar to scikit-learn's ``fit(X, y)``
- Allow power users to instantiate and pass custom trainers

Example:
    from anfis_toolbox.optim import SGDTrainer, RMSPropTrainer, HybridTrainer
    trainer = SGDTrainer(learning_rate=0.01, epochs=200)
    losses = trainer.fit(model, X, y)

Task compatibility and guidance:
--------------------------------
- HybridTrainer implements the original Jang (1993) hybrid learning rule and is
  intended for regression with the single-output regression ANFIS. It is not
  compatible with the classification head.
- SGDTrainer, RMSPropTrainer and AdamTrainer perform generic backprop updates and
  accept pluggable loss functions (see ``anfis_toolbox.losses``). They default to
  mean squared error for regression but can minimize other differentiable
  objectives, such as categorical cross-entropy, when used with
  ``ANFISClassifier``. Targets are adapted via the selected loss's
  ``prepare_targets`` helper, so integer labels and one-hot matrices are both
  supported.
"""
from .adam import AdamTrainer
from .base import BaseTrainer
from .hybrid import HybridTrainer
from .hybrid_adam import HybridAdamTrainer
from .pso import PSOTrainer
from .rmsprop import RMSPropTrainer
from .sgd import SGDTrainer
__all__ = [
    "BaseTrainer",
    "SGDTrainer",
    "HybridTrainer",
    "AdamTrainer",
    "HybridAdamTrainer",
    "RMSPropTrainer",
    "PSOTrainer",
]
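
The trainer contract described in the docstring (``fit(model, X, y)`` returning a
list of per-epoch losses, minimizing MSE by default) can be sketched with a toy
stand-in. ``ToyModel`` and ``ToySGDTrainer`` below are hypothetical illustrations
of that API shape only; they are not part of ``anfis_toolbox`` and do not
reproduce the real ANFIS forward pass.

```python
class ToyModel:
    """A one-parameter linear model y = w * x, standing in for an ANFIS model."""

    def __init__(self):
        self.w = 0.0

    def predict(self, X):
        return [self.w * x for x in X]


class ToySGDTrainer:
    """Minimal trainer following the fit(model, X, y) -> losses contract."""

    def __init__(self, learning_rate=0.01, epochs=200):
        self.learning_rate = learning_rate
        self.epochs = epochs

    def fit(self, model, X, y):
        losses = []
        n = len(X)
        for _ in range(self.epochs):
            preds = model.predict(X)
            # Mean squared error, the default regression loss in the docstring.
            loss = sum((p - t) ** 2 for p, t in zip(preds, y)) / n
            # Gradient of MSE w.r.t. w: (2/N) * sum((w*x - y) * x)
            grad = 2.0 * sum((p - t) * x for p, t, x in zip(preds, y, X)) / n
            model.w -= self.learning_rate * grad
            losses.append(loss)
        return losses


model = ToyModel()
trainer = ToySGDTrainer(learning_rate=0.05, epochs=100)
losses = trainer.fit(model, [1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

After training, ``losses`` holds one MSE value per epoch and ``model.w`` has
moved toward the best-fit slope; a custom trainer for the real package would
follow the same shape, presumably by subclassing ``BaseTrainer``.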