# ANFISRegressor

ANFISRegressor is the high-level entry point for training Adaptive
Neuro-Fuzzy Inference Systems on regression tasks. It hides the low-level
membership construction, rule synthesis, and trainer wiring behind a familiar
scikit-learn style API (`fit`, `predict`, `evaluate`, `save`, `load`).
## At a Glance

- Works with NumPy arrays, array-like objects, or pandas DataFrames.
- Automatically generates membership functions per input (grid, FCM, or random).
- Supports custom membership definitions and rule subsets.
- Provides multiple optimizers: `"hybrid"`, `"adam"`, `"sgd"`, `"rmsprop"`, `"pso"`, `"hybrid_adam"`.
- Ships with built-in evaluation (`evaluate`) and persistence (`save`, `load`).
## Quick Start

```python
import numpy as np
from anfis_toolbox import ANFISRegressor

# Synthetic regression data
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]

reg = ANFISRegressor(optimizer="adam", epochs=40, learning_rate=0.01)
reg.fit(X, y)

pred = reg.predict([[0.4, -0.1]])
report = reg.evaluate(X, y)
```
## Core Workflow

- **Configure** – Set global defaults (`n_mfs`, `mf_type`, `init`, `optimizer`).
- **Fit** – Call `fit(X, y)` with optional validation data (see the sketch after this list).
- **Predict** – Use `predict` for batch or single-sample inference.
- **Evaluate** – Call `evaluate` to obtain MSE, RMSE, MAE, and R² metrics.
- **Persist** – Store or restore trained estimators via `save`/`load`.
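The snippet below walks through that lifecycle on synthetic data. The keyword used to pass validation data is an assumption made for illustration; consult the `fit` signature in the API reference for the exact name.

```python
import numpy as np
from anfis_toolbox import ANFISRegressor

rng = np.random.default_rng(1)
X_all = rng.uniform(-2, 2, size=(250, 2))
y_all = np.sin(X_all[:, 0]) + 0.5 * X_all[:, 1]

# Hold out the last 50 samples for validation.
X_train, y_train = X_all[:200], y_all[:200]
X_val, y_val = X_all[200:], y_all[200:]

# Configure and fit. NOTE: validation_data is a hypothetical keyword
# used for illustration; check the API reference for the real one.
reg = ANFISRegressor(n_mfs=3, optimizer="hybrid", epochs=50)
reg.fit(X_train, y_train, validation_data=(X_val, y_val))

# Predict, evaluate, and persist.
pred = reg.predict(X_val[:5])
report = reg.evaluate(X_val, y_val)
reg.save("artifacts/anfis-workflow.pkl")
```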
## Model Equations

Each fuzzy rule generated by ANFISRegressor follows a Takagi–Sugeno–Kang
consequent of the form

\[
f_i(\mathbf{x}) = p_{i,1} x_1 + p_{i,2} x_2 + \dots + p_{i,n} x_n + r_i.
\]

The firing strength of rule \(i\) is the product of the membership degrees for each input:

\[
w_i = \prod_{j=1}^{n} \mu_{A_{i,j}}(x_j).
\]

After normalising the rule strengths, the overall prediction is

\[
\hat{y} = \sum_i \bar{w}_i \, f_i(\mathbf{x}),
\qquad
\bar{w}_i = \frac{w_i}{\sum_k w_k}.
\]
During fitting, the estimator couples gradient-based updates of the membership parameters with least-squares estimation of the consequent coefficients, matching the hybrid learning strategy popularised in the original ANFIS paper.
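For intuition, here is a minimal NumPy sketch of the least-squares half of that hybrid step, assuming Gaussian memberships and one membership function per input per rule. It is illustrative only, not the toolbox's internal implementation; in the full algorithm the premise parameters (`means`, `sigmas`) would then be refined by gradient descent on the residual.

```python
import numpy as np

def gaussian(x, mean, sigma):
    # Gaussian membership degree.
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def hybrid_consequent_step(X, y, means, sigmas):
    # X: (N, D) inputs; means/sigmas: (R, D) premise parameters.
    mu = gaussian(X[:, None, :], means[None], sigmas[None])  # (N, R, D)
    w = mu.prod(axis=2)                                      # firing strengths (N, R)
    w_bar = w / w.sum(axis=1, keepdims=True)                 # normalized strengths
    # Design matrix for the stacked TSK coefficients [p_{i,1..D} | r_i].
    Xb = np.concatenate([X, np.ones((X.shape[0], 1))], axis=1)
    A = (w_bar[:, :, None] * Xb[:, None, :]).reshape(X.shape[0], -1)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coeffs, coeffs  # predictions, consequent coefficients
```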
## Key Parameters

| Parameter | Description |
|---|---|
| `n_mfs` | Default number of membership functions per input (int). |
| `mf_type` | Membership family (`"gaussian"`, `"triangular"`, `"bell"`, etc.). |
| `init` | Membership initialization (`"grid"`, `"fcm"`, `"random"`, or `None`). |
| `inputs_config` | Per-input overrides (dict, list of membership functions, or `None`). |
| `optimizer` | Trainer identifier, subclass, or instance (defaults to `"hybrid"`). |
| `optimizer_params` | Extra keyword arguments passed to the trainer. |
| `learning_rate`, `epochs`, `batch_size`, `shuffle`, `verbose` | Convenience overrides fed into compatible trainers. |
| `loss` | Optional custom loss (string key or callable). |
| `rules` | Optional list of rule tuples limiting the rule set. |
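A constructor call combining several of these parameters might look like the following (the values are illustrative, not recommended defaults):

```python
from anfis_toolbox import ANFISRegressor

# Illustrative configuration using the parameters described above.
reg = ANFISRegressor(
    n_mfs=3,             # three membership functions per input
    mf_type="gaussian",  # Gaussian membership family
    init="grid",         # grid-based membership initialization
    optimizer="hybrid",  # hybrid least-squares + gradient trainer
    epochs=50,
    verbose=True,
)
```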
## Customizing Membership Functions

Use `inputs_config` to tailor membership families, counts, or ranges on a
per-input basis. Keys may be column names (for pandas DataFrames), integer
indices, or `"x{i}"` aliases.
```python
import numpy as np
from anfis_toolbox import ANFISRegressor
from anfis_toolbox.membership import GaussianMF

rng = np.random.default_rng(21)
X_custom = rng.uniform(-3, 3, size=(320, 2))
y_custom = np.cos(X_custom[:, 0]) + 0.3 * X_custom[:, 1]

inputs_config = {
    0: {
        "mf_type": "triangular",
        "n_mfs": 4,
        "overlap": 0.6,
    },
    1: {
        "membership_functions": [
            GaussianMF(mean=-1.2, sigma=0.45),
            GaussianMF(mean=-0.2, sigma=0.35),
            GaussianMF(mean=0.8, sigma=0.3),
            GaussianMF(mean=1.7, sigma=0.4),
        ]
    },
}

reg = ANFISRegressor(inputs_config=inputs_config, epochs=60, learning_rate=0.01)
reg.fit(X_custom, y_custom)
```
> **Note**
> Keep the number of membership functions consistent across inputs when mixing
> dictionary overrides and explicit membership lists. The example above
> configures four functions for each feature.

The `X_custom` and `y_custom` arrays from the example are reused in the
sections below.
## Choosing an Optimizer

Pass a string alias or a trainer class/instance:

```python
reg = ANFISRegressor(optimizer="adam", epochs=80, learning_rate=0.005)
reg.fit(X, y)
```

```python
from anfis_toolbox.optim import RMSPropTrainer

reg = ANFISRegressor(optimizer=RMSPropTrainer(learning_rate=0.001, epochs=120))
reg.fit(X, y)
```

- `"hybrid"` / `"hybrid_adam"`: Combine least-squares consequents with gradient steps.
- `"adam"`, `"rmsprop"`, `"sgd"`: Familiar gradient optimizers.
- `"pso"`: Particle Swarm Optimization for derivative-free training.
## Restricting the Rule Base

Supply `rules` to freeze the rule combinations explored during training.

```python
selected_rules = [(0, 0), (1, 1), (2, 2)]

reg = ANFISRegressor(rules=selected_rules, epochs=40, learning_rate=0.01)
reg.fit(X_custom, y_custom)

assert tuple(reg.get_rules()) == tuple(selected_rules)
```

If `rules` is omitted, the full Cartesian product of membership indices is used,
as illustrated below.
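For instance, two inputs with three membership functions each yield nine default rules. Plain Python reproduces the enumeration (this is an illustration, not a toolbox call):

```python
# Default rule base for two inputs with three membership functions each.
from itertools import product

default_rules = list(product(range(3), range(3)))
print(len(default_rules))   # 9
print(default_rules[:3])    # [(0, 0), (0, 1), (0, 2)]
```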
## Evaluating Performance

`evaluate` reports regression metrics and can optionally skip printing.
Metrics are returned as a dictionary; keys include `mse`, `rmse`, `mae`, and
`r2`.
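Continuing from the Quick Start example, reading the returned dictionary looks like this (the keyword for suppressing printed output is not shown here; see the API reference):

```python
# reg, X, and y come from the Quick Start example above.
report = reg.evaluate(X, y)
print(f"MSE:  {report['mse']:.4f}")
print(f"RMSE: {report['rmse']:.4f}")
print(f"MAE:  {report['mae']:.4f}")
print(f"R2:   {report['r2']:.4f}")
```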
## Saving and Loading Models

```python
reg.fit(X, y)
reg.save("artifacts/anfis-regressor.pkl")
```

```python
from anfis_toolbox import ANFISRegressor

loaded = ANFISRegressor.load("artifacts/anfis-regressor.pkl")
pred = loaded.predict(X[:3])
```
The pickled artifact stores fitted membership functions, rule definitions, and training history, enabling reproducible deployments.
## Tips & Troubleshooting

- **Input scale** – Normalize or standardize features for smoother membership learning (see the sketch after this list).
- **Underfitting** – Increase `n_mfs`, provide a richer `inputs_config`, or allow more epochs.
- **Overfitting** – Reduce the rule count, add validation data, or lower `epochs`.
- **Stalled training** – Try a different optimizer or adjust `learning_rate`.
- **Verbose logging** – Set `verbose=True` during fitting to mirror trainer progress.
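One way to apply the input-scale tip with plain NumPy, reusing `X` and `y` from the Quick Start (the toolbox itself does not require this step):

```python
# Standardize each feature to zero mean and unit variance before fitting.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

reg = ANFISRegressor(optimizer="adam", epochs=40, learning_rate=0.01)
reg.fit(X_std, y)

# Remember to apply the same scaling to any inputs passed to predict().
```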
## Further Reading

- API Reference – Regressor
- Membership Functions catalog
- Optimizer reference
- Jang, J.-S. R. (1993). ANFIS: Adaptive-network-based fuzzy inference system. *IEEE Transactions on Systems, Man, and Cybernetics*, 23(3), 665–685.