Note

Go to the end to download the full example code.

Particle Swarm Optimization Search

Installation

# To install the required package, use the following command:
# !pip install modeva

Authentication

# To authenticate, run the following commands (for full access, replace the auth code below with your own token):
# from modeva.utils.authenticate import authenticate
# authenticate(auth_code='eaaa4301-b140-484c-8e93-f9f633c8bacb')

Import required modules

from modeva import DataSet
from modeva import TestSuite
from modeva.models import MoLGBMClassifier
from modeva.models import ModelTunePSO

Load Dataset

# Load the built-in TaiwanCredit dataset and create a random train/test split
ds = DataSet()
ds.load(name="TaiwanCredit")
ds.set_random_split()

Run PSO search
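PSO explores the search space with a swarm of candidate points ("particles"). At each iteration, every particle's velocity is nudged toward the best position that particle itself has seen and toward the best position found by the whole swarm, with random weights on each pull, and the particle then moves along its velocity. The sketch below shows this update rule in plain NumPy for intuition. It is a generic illustration under common default coefficients (the inertia w and pull strengths c1, c2 are our choices), not Modeva's internal implementation.

import numpy as np

def pso_minimize(f, bounds, n_particles=10, n_iter=20,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    # Minimize f over box bounds with a basic particle swarm.
    # Illustrative sketch only; not Modeva's implementation.
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))  # positions
    v = np.zeros_like(x)                                  # velocities
    pbest = x.copy()                                      # per-particle best positions
    pbest_f = np.array([f(p) for p in x])                 # per-particle best scores
    gbest = pbest[pbest_f.argmin()]                       # swarm-wide best position
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Inertia, pull toward personal best, pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                        # keep particles in bounds
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()]
    return gbest, pbest_f.min()

# Example: minimize a simple quadratic over [-5, 5] x [-5, 5]
best_x, best_f = pso_minimize(lambda p: float((p ** 2).sum()),
                              np.array([[-5.0, 5.0], [-5.0, 5.0]]))

Modeva's ModelTunePSO wraps this kind of search around cross-validated model scores, with the search ranges and parameter types declared below.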

# Search ranges: each hyperparameter gets a [lower, upper] bound
param_bounds = {"max_depth": [1, 4],
                "learning_rate": [0.01, 1.0]}
# Parameters listed here are treated as integers; the rest vary continuously
param_types = {"max_depth": "int"}

model = MoLGBMClassifier(verbose=-1)
hpo = ModelTunePSO(dataset=ds, model=model)

# Two PSO iterations with a swarm of 10 particles; every candidate is
# scored by 5-fold cross-validation on both AUC and LogLoss
result = hpo.run(param_bounds=param_bounds,
                 param_types=param_types,
                 n_iter=2,
                 n_particles=10,
                 metric=("AUC", "LogLoss"),
                 cv=5)
result.table
    max_depth  learning_rate      AUC  LogLoss  AUC_rank  LogLoss_rank  mean_fit_time
38          1         0.8343   0.7751   0.4367         1             1         0.4229
18          1         0.8343   0.7751   0.4367         1             1         0.6503
 8          1         0.8343   0.7751   0.4367         1             1         0.2331
27          1         0.6666   0.7748   0.4368         4             4         0.1835
32          2         0.6494   0.7706   0.4407         5             8         0.2291
 2          2         0.6494   0.7706   0.4407         5             8         1.3934
12          2         0.6494   0.7706   0.4407         5             8         0.2840
17          1         0.0963   0.7696   0.4401         8             5         0.2097
37          1         0.0963   0.7696   0.4401         8             5         0.1872
 7          1         0.0963   0.7696   0.4401         8             5         0.1762
15          3         0.5336   0.7655   0.4479        11            11         0.6640
35          3         0.5336   0.7655   0.4479        11            11         0.2572
 5          3         0.5336   0.7655   0.4479        11            11         3.0980
11          3         0.5494   0.7641   0.4494        14            14         0.3544
 1          3         0.5494   0.7641   0.4494        14            14         2.3557
31          3         0.5494   0.7641   0.4494        14            14         0.2706
14          4         0.3896   0.7641   0.4504        17            20         2.2246
34          4         0.3896   0.7641   0.4504        17            20         0.4668
 4          4         0.3896   0.7641   0.4504        17            20         3.0105
33          2         0.8929   0.7633   0.4497        20            17         0.1539
 3          2         0.8929   0.7633   0.4497        20            17         1.4171
13          2         0.8929   0.7633   0.4497        20            17         0.3990
26          2            1.0   0.7632   0.4510        23            23         0.1985
22          2            1.0   0.7632   0.4510        23            23         2.3360
23          2            1.0   0.7632   0.4510        23            23         0.4037
25          2            1.0   0.7632   0.4510        23            23         0.1830
28          2            1.0   0.7632   0.4510        23            23         0.5747
24          3         0.7842   0.7559   0.4676        28            31         0.2910
 0          3          0.718   0.7557   0.4634        29            28         2.1687
10          3          0.718   0.7557   0.4634        29            28         1.0073
30          3          0.718   0.7557   0.4634        29            28         0.9453
19          3         0.8713   0.7512   0.4758        32            32         1.7386
 9          3         0.8713   0.7512   0.4758        32            32         0.6181
39          3         0.8713   0.7512   0.4758        32            32         0.3238
21          3            1.0   0.7464   0.4915        35            38         8.1627
20          3            1.0   0.7464   0.4915        35            38         2.2119
16          3         0.9263   0.7433   0.4889        37            35         0.8553
36          3         0.9263   0.7433   0.4889        37            35         0.9300
 6          3         0.9263   0.7433   0.4889        37            35         0.6379
29          4            1.0   0.7256   0.5610        40            40         0.7462
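
Each row is one evaluated configuration: the unlabeled first column is the trial index, AUC and LogLoss appear to be averaged over the 5 cross-validation folds (higher is better for AUC, lower for LogLoss), and the rank columns order the trials under each metric. If result.table is a pandas DataFrame, as the printed output suggests, you can pull out the top-ranked trial directly (a sketch using plain pandas, not a documented Modeva call):

best_row = result.table.sort_values("AUC_rank").iloc[0]
best_row[["max_depth", "learning_rate", "AUC", "LogLoss"]]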


# "parallel" gives an overview across all evaluated configurations
result.plot("parallel", figsize=(8, 6))

# Plot each tuned hyperparameter against the AUC it achieved
result.plot(("max_depth", "AUC"))

result.plot(("learning_rate", "AUC"))


Retrain the model with the best hyperparameters

# Rebuild the model with a tuned parameter set from the search results
model_tuned = MoLGBMClassifier(**result.value["params"][0],
                               name="LGBM-Tuned",
                               verbose=-1)
model_tuned.fit(ds.train_x, ds.train_y)
model_tuned
MoLGBMClassifier(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,
                 importance_type='split', learning_rate=0.7180374727086953,
                 max_depth=3, min_child_samples=20, min_child_weight=0.001,
                 min_split_gain=0.0, n_estimators=100, n_jobs=None,
                 num_leaves=31, objective=None, random_state=None,
                 reg_alpha=0.0, reg_lambda=0.0, subsample=1.0,
                 subsample_for_bin=200000, subsample_freq=0, verbose=-1)


Diagnose the tuned model

# Compare train and test performance of the tuned model
ts = TestSuite(ds, model_tuned)
result = ts.diagnose_accuracy_table()
result.table
           AUC     ACC      F1  LogLoss   Brier
train   0.8555  0.8412  0.5539   0.3697  0.1157
test    0.7646  0.8158  0.4633   0.4441  0.1379
GAP    -0.0909 -0.0254 -0.0906   0.0744  0.0222
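
Here GAP is the test value minus the train value for each metric, so the negative AUC, ACC, and F1 gaps quantify how much performance drops out of sample. For context, you can run the same diagnostic on an untuned baseline and compare; this sketch reuses only the API already shown on this page (the model name "LGBM-Default" is our choice):

# Fit an untuned LightGBM baseline and diagnose it the same way
model_base = MoLGBMClassifier(name="LGBM-Default", verbose=-1)
model_base.fit(ds.train_x, ds.train_y)
ts_base = TestSuite(ds, model_base)
ts_base.diagnose_accuracy_table().table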


Total running time of the script: (3 minutes 42.383 seconds)

Download Jupyter notebook: plot_2_pso.ipynb

Download Python source code: plot_2_pso.py

Download zipped: plot_2_pso.zip

Gallery generated by Sphinx-Gallery
