Linear Tree Regression#

Installation

# To install the required package, use the following command:
# !pip install modeva

Authentication

# To authenticate, use the following commands (for full access, replace the token below with your own):
# from modeva.utils.authenticate import authenticate
# authenticate(auth_code='eaaa4301-b140-484c-8e93-f9f633c8bacb')

Import required modules

from modeva import DataSet
from modeva import TestSuite
from modeva.models import MoLGBMRegressor, MoGLMTreeBoostRegressor, MoNeuralTreeRegressor

Load and prepare dataset

ds = DataSet()
ds.load(name="BikeSharing")
ds.set_random_split()
ds.set_target("cnt")

ds.scale_numerical(method="minmax")
ds.scale_numerical(features=("cnt",), method="log1p")
ds.preprocess()
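The target `cnt` is first min-max scaled with the other numerical columns and then log1p-transformed to compress its right tail. A minimal numpy sketch of that two-step transform, assuming min-max is applied first (the values below are illustrative, not taken from the BikeSharing data):

```python
import numpy as np

cnt = np.array([1.0, 16.0, 40.0, 100.0])  # illustrative target values

# Min-max scaling maps the column into [0, 1]
scaled = (cnt - cnt.min()) / (cnt.max() - cnt.min())

# log1p compresses large values while keeping 0 fixed and preserving order
transformed = np.log1p(scaled)
print(transformed)
```

Because log1p is strictly increasing, the transform changes the target's scale but not the ordering of observations.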

LGBM Linear Tree model#

model = MoLGBMRegressor(linear_trees=True, max_depth=2, verbose=-1, random_state=0)
model.fit(ds.train_x, ds.train_y.ravel())
MoLGBMRegressor(boosting_type='gbdt', class_weight=None, colsample_bytree=1.0,
                importance_type='split', learning_rate=0.1, linear_trees=True,
                max_depth=2, min_child_samples=20, min_child_weight=0.001,
                min_split_gain=0.0, n_estimators=100, n_jobs=None,
                num_leaves=31, objective=None, random_state=0, reg_alpha=0.0,
                reg_lambda=0.0, subsample=1.0, subsample_for_bin=200000,
                subsample_freq=0, verbose=-1)
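With `linear_trees=True`, LightGBM fits a linear model in each leaf instead of a constant, which lets shallow trees capture piecewise-linear signals. A rough numpy illustration of the idea on a single depth-1 split (a toy example of the concept, not the Modeva or LightGBM API):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = np.where(x < 0, -2 * x, 3 * x) + rng.normal(0, 0.01, 200)  # piecewise-linear signal

split = 0.0  # a single split at x = 0, chosen by hand for this toy signal
left, right = x < split, x >= split

# A constant-leaf tree predicts the mean of each leaf ...
const_pred = np.where(left, y[left].mean(), y[right].mean())

# ... while a linear-leaf tree fits slope + intercept per leaf
def leaf_fit(xs, ys):
    slope, intercept = np.polyfit(xs, ys, 1)
    return slope, intercept

sl, il = leaf_fit(x[left], y[left])
sr, ir = leaf_fit(x[right], y[right])
linear_pred = np.where(left, sl * x + il, sr * x + ir)

print(np.mean((y - const_pred) ** 2) > np.mean((y - linear_pred) ** 2))  # → True
```

On this signal the linear leaves fit far better than constant leaves at the same depth, which is why `max_depth=2` suffices above.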


Basic accuracy analysis

ts = TestSuite(ds, model)
results = ts.diagnose_accuracy_table()
results.table
       MSE     MAE     R2
train  0.0038  0.0444  0.8185
test   0.0040  0.0451  0.8136
GAP    0.0002  0.0007  -0.0050
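The GAP row is simply test minus train for each metric, a quick overfitting check: near-zero gaps like those above indicate the model generalizes well. A small sketch of that computation (note the table's GAP is computed from unrounded metrics, so a last digit can differ from what the rounded values give, as with R2 here):

```python
train = {"MSE": 0.0038, "MAE": 0.0444, "R2": 0.8185}
test = {"MSE": 0.0040, "MAE": 0.0451, "R2": 0.8136}

gap = {metric: round(test[metric] - train[metric], 4) for metric in train}
print(gap)  # → {'MSE': 0.0002, 'MAE': 0.0007, 'R2': -0.0049}
```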


Feature importance analysis

results = ts.interpret_fi()
results.plot()


Local feature importance analysis

results = ts.interpret_local_fi(sample_index=1, centered=True)
results.plot()


Main effect plot

results = ts.interpret_effects(features="hr")
results.plot()
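A main effect plot of this kind shows how the average prediction varies as one feature sweeps a grid while the other features keep their observed values, i.e. a partial-dependence-style average. A generic numpy sketch of that computation (an illustration of the idea, not the internals of Modeva's `interpret_effects`):

```python
import numpy as np

def main_effect(predict, x, feature_idx, grid):
    """Average prediction over the data with one feature pinned to each grid value."""
    effect = []
    for value in grid:
        x_mod = x.copy()
        x_mod[:, feature_idx] = value  # pin the feature of interest
        effect.append(predict(x_mod).mean())
    return np.array(effect)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=(200, 3))
predict = lambda x: np.sin(2 * np.pi * x[:, 0]) + x[:, 1]  # toy model
grid = np.linspace(0, 1, 5)
print(main_effect(predict, x, feature_idx=0, grid=grid).round(3))
```

The averaging over the other features means the curve reflects the model's typical response to the swept feature, which for `hr` reveals the daily demand pattern.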


Boosted GLMTree model#

model = MoGLMTreeBoostRegressor(max_depth=1, n_estimators=100, reg_lambda=0.001,
                                verbose=True, random_state=0)
model.fit(ds.train_x, ds.train_y.ravel())
#### MoGLMTreeBoost Training ####
Iteration 1 with validation loss 0.01217
Iteration 2 with validation loss 0.00835
Iteration 3 with validation loss 0.00568
...
Iteration 50 with validation loss 0.00259
Iteration 51 with validation loss 0.00259
Iteration 52 with validation loss 0.00259
Iteration 53 with validation loss 0.00259
Training is terminated as validation loss stops decreasing.
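The trailing iterations show the validation loss flattening at 0.00259 before training stops. That stopping rule can be sketched as a simple patience counter (an illustration of the general early-stopping pattern, with a hypothetical `patience` value, not the actual MoGLMTreeBoost code):

```python
def train_with_early_stopping(losses, patience=5, tol=0.0):
    """Stop once the validation loss has not improved for `patience` rounds."""
    best, stale = float("inf"), 0
    for i, loss in enumerate(losses, start=1):
        if loss < best - tol:
            best, stale = loss, 0  # improvement: reset the counter
        else:
            stale += 1
        if stale >= patience:
            return i  # iteration at which training terminates
    return len(losses)

# A flat tail like the one in the log above triggers the stop
tail = [0.00260, 0.00259] + [0.00259] * 10
print(train_with_early_stopping(tail))  # → 7
```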
MoGLMTreeBoostRegressor(name='MoGLMTreeBoostRegressor', reg_lambda=0.001,
                        verbose=True)


Basic accuracy analysis

ts = TestSuite(ds, model)
results = ts.diagnose_accuracy_table()
results.table
       MSE     MAE     R2
train  0.0025  0.0358  0.8825
test   0.0026  0.0368  0.8757
GAP    0.0002  0.0010  -0.0067


Main effect plot

results = ts.interpret_effects(features="hr")
results.plot()


Neural Tree model with Monotonicity Constraints#

modelnn = MoNeuralTreeRegressor(estimator=model,
                                nn_temperature=0.0001,
                                nn_max_epochs=20,
                                feature_names=ds.feature_names,
                                mono_increasing_list=("atemp",),
                                mono_decreasing_list=("hum",),
                                mono_sample_size=1000,
                                reg_mono=10,
                                verbose=True,
                                random_state=0)
modelnn.fit(ds.train_x, ds.train_y.ravel())
#### MoNeuralTree Training Stage 1: Use Fitted MoGLMTreeBoost ####
#### MoNeuralTree Training Stage 2: Fine-tuning via Gradient Descent ####
Initial training and validation loss: 0.0024 and 0.0026
Epoch 0: Train loss 0.0036, Validation loss 0.0043, Monotonicity loss 0.2513
Epoch 1: Train loss 0.0048, Validation loss 0.0051, Monotonicity loss 0.1257
Epoch 2: Train loss 0.0051, Validation loss 0.0050, Monotonicity loss 0.0334
Epoch 3: Train loss 0.0050, Validation loss 0.0051, Monotonicity loss 0.0173
Epoch 4: Train loss 0.0052, Validation loss 0.0050, Monotonicity loss 0.0052
Epoch 5: Train loss 0.0051, Validation loss 0.0051, Monotonicity loss 0.0003
Epoch 6: Train loss 0.0050, Validation loss 0.0047, Monotonicity loss 0.0002
Epoch 7: Train loss 0.0049, Validation loss 0.0052, Monotonicity loss 0.0001
Epoch 8: Train loss 0.0051, Validation loss 0.0050, Monotonicity loss 0.0001
Epoch 9: Train loss 0.0050, Validation loss 0.0049, Monotonicity loss 0.0001
Epoch 10: Train loss 0.0049, Validation loss 0.0051, Monotonicity loss 0.0000
Epoch 11: Train loss 0.0048, Validation loss 0.0050, Monotonicity loss 0.0000
Training is terminated as validation loss stops decreasing.
Epoch 12: Train loss 0.0050, Validation loss 0.0052, Monotonicity loss 0.0000
Training is terminated as validation loss stops decreasing.
Epoch 13: Train loss 0.0047, Validation loss 0.0045, Monotonicity loss 0.0000
Training is terminated as validation loss stops decreasing.
Epoch 14: Train loss 0.0044, Validation loss 0.0043, Monotonicity loss 0.0000
Epoch 15: Train loss 0.0044, Validation loss 0.0042, Monotonicity loss 0.0000
Epoch 16: Train loss 0.0040, Validation loss 0.0039, Monotonicity loss 0.0000
Epoch 17: Train loss 0.0049, Validation loss 0.0050, Monotonicity loss 0.0000
Epoch 18: Train loss 0.0050, Validation loss 0.0053, Monotonicity loss 0.0000
Epoch 19: Train loss 0.0055, Validation loss 0.0052, Monotonicity loss 0.0000
Training is terminated as max_epoch is reached.
MoNeuralTreeRegressor(clip_predict=False, device='cpu',
                      estimator=MoGLMTreeBoostRegressor(name='MoGLMTreeBoostRegressor',
                                                        reg_lambda=0.001,
                                                        verbose=True),
                      learning_rate=1.0, max_depth=1, min_impurity_decrease=0,
                      min_samples_leaf=50, n_epoch_no_change=5,
                      n_estimators=100, n_feature_search=5, n_screen_grid=1,
                      n_split_grid=20, name='MoGLMTreeBoostRegressor',
                      nn_max_epochs=20, reg_lambda=0.001, simplified=True,
                      split_custom=None, verbose=True)
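The monotonicity loss reported during fine-tuning penalizes decreases of the prediction along `atemp` (and, symmetrically, increases along `hum`), and it shrinks to zero as the constraints are satisfied. A rough sketch of such a penalty for one increasing feature, using finite differences on sampled points (an illustration under assumed conventions, not the MoNeuralTree implementation):

```python
import numpy as np

def mono_increasing_penalty(predict, x, feature_idx, delta=0.01):
    """Mean squared negative part of the finite-difference change along one feature."""
    x_shifted = x.copy()
    x_shifted[:, feature_idx] += delta
    diff = predict(x_shifted) - predict(x)
    violation = np.clip(-diff, 0.0, None)  # only penalize decreases
    return float(np.mean(violation ** 2))

# Toy model that is monotone increasing in feature 0, so the penalty is zero
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=(1000, 2))
predict = lambda x: 2.0 * x[:, 0] + 0.5
print(mono_increasing_penalty(predict, x, feature_idx=0))  # → 0.0
```

Adding `reg_mono` times such a penalty to the training loss is what drives the monotonicity loss in the log above toward zero while the fit loss is traded off against it.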


Basic accuracy analysis

ts = TestSuite(ds, modelnn)
results = ts.diagnose_accuracy_table()
results.table
       MSE          MAE     R2
train  5.2405e-03   0.0546  0.7505
test   5.1980e-03   0.0547  0.7560
GAP    -4.2453e-05  0.0001  0.0055


Feature importance analysis

results = ts.interpret_fi()
results.plot()


Main effect plot

results = ts.interpret_effects(features="atemp")
results.plot()


Total running time of the script: (3 minutes 13.387 seconds)

Gallery generated by Sphinx-Gallery