CRAN Package Check Results for Package lightgbm

Last updated on 2021-10-24 20:51:07 CEST.

Flavor                              Version  Tinstall  Tcheck   Ttotal  Status  Flags
r-devel-linux-x86_64-debian-clang   3.3.0      346.48  157.63   504.11  OK
r-devel-linux-x86_64-debian-gcc     3.3.0      279.84  168.00   447.84  OK
r-devel-linux-x86_64-fedora-clang   3.3.0                       929.75  NOTE
r-devel-linux-x86_64-fedora-gcc     3.3.0                       688.34  OK
r-devel-windows-x86_64              3.3.0      385.00  161.00   546.00  OK
r-devel-windows-x86_64-gcc10-UCRT   3.3.0                               ERROR
r-patched-linux-x86_64              3.3.0      324.19  147.16   471.35  OK
r-patched-solaris-x86               3.3.0                      1193.80  NOTE
r-release-linux-x86_64              3.3.0      324.66  145.87   470.53  OK
r-release-macos-arm64               3.2.1                               NOTE
r-release-macos-x86_64              3.3.0                               NOTE
r-release-windows-ix86+x86_64       3.3.0      562.00  239.00   801.00  NOTE
r-oldrel-macos-x86_64               3.3.0                               NOTE
r-oldrel-windows-ix86+x86_64        3.3.0      569.00  171.00   740.00  NOTE

Additional issues

clang-ASAN gcc-ASAN

Check Details

Version: 3.3.0
Check: installed package size
Result: NOTE
     installed size is 44.4Mb
     sub-directories of 1Mb or more:
     libs 43.8Mb
Flavors: r-devel-linux-x86_64-fedora-clang, r-patched-solaris-x86, r-release-macos-x86_64, r-release-windows-ix86+x86_64, r-oldrel-macos-x86_64, r-oldrel-windows-ix86+x86_64

Version: 3.3.0
Check: tests
Result: ERROR
     Running 'testthat.R'
    Running the tests in 'tests/testthat.R' failed.
    Complete output:
     > library(testthat)
     > library(lightgbm)
     Loading required package: R6
     >
     > test_check(
     + package = "lightgbm"
     + , stop_on_failure = TRUE
     + , stop_on_warning = FALSE
     + , reporter = testthat::SummaryReporter$new()
     + )
     Predictor:
     Predictor: W....W.W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.026049 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.314167 test's binary_logloss:0.317777"
     [1] "[2]: train's binary_logloss:0.187654 test's binary_logloss:0.187981"
     [1] "[3]: train's binary_logloss:0.109209 test's binary_logloss:0.109949"
     [1] "[4]: train's binary_logloss:0.0755423 test's binary_logloss:0.0772008"
     [1] "[5]: train's binary_logloss:0.0528045 test's binary_logloss:0.0533291"
     [1] "[6]: train's binary_logloss:0.0395797 test's binary_logloss:0.0380824"
     [1] "[7]: train's binary_logloss:0.0287269 test's binary_logloss:0.0255364"
     [1] "[8]: train's binary_logloss:0.0224443 test's binary_logloss:0.0195616"
     [1] "[9]: train's binary_logloss:0.016621 test's binary_logloss:0.017834"
     [1] "[10]: train's binary_logloss:0.0112055 test's binary_logloss:0.0125538"
     [1] "[11]: train's binary_logloss:0.00759638 test's binary_logloss:0.00842372"
     [1] "[12]: train's binary_logloss:0.0054887 test's binary_logloss:0.00631812"
     [1] "[13]: train's binary_logloss:0.00399548 test's binary_logloss:0.00454944"
     [1] "[14]: train's binary_logloss:0.00283135 test's binary_logloss:0.00323724"
     [1] "[15]: train's binary_logloss:0.00215378 test's binary_logloss:0.00256697"
     [1] "[16]: train's binary_logloss:0.00156723 test's binary_logloss:0.00181753"
     [1] "[17]: train's binary_logloss:0.00120077 test's binary_logloss:0.00144437"
     [1] "[18]: train's binary_logloss:0.000934889 test's binary_logloss:0.00111807"
     [1] "[19]: train's binary_logloss:0.000719878 test's binary_logloss:0.000878304"
     [1] "[20]: train's binary_logloss:0.000558692 test's binary_logloss:0.000712272"
     [1] "[21]: train's binary_logloss:0.000400916 test's binary_logloss:0.000492223"
     [1] "[22]: train's binary_logloss:0.000315938 test's binary_logloss:0.000402804"
     [1] "[23]: train's binary_logloss:0.000238113 test's binary_logloss:0.000288682"
     [1] "[24]: train's binary_logloss:0.000190248 test's binary_logloss:0.000237835"
     [1] "[25]: train's binary_logloss:0.000148322 test's binary_logloss:0.000174674"
     [1] "[26]: train's binary_logloss:0.000120581 test's binary_logloss:0.000139513"
     [1] "[27]: train's binary_logloss:0.000102756 test's binary_logloss:0.000118804"
     [1] "[28]: train's binary_logloss:7.83011e-05 test's binary_logloss:8.40978e-05"
     [1] "[29]: train's binary_logloss:6.29191e-05 test's binary_logloss:6.8803e-05"
     [1] "[30]: train's binary_logloss:5.28039e-05 test's binary_logloss:5.89864e-05"
     [1] "[31]: train's binary_logloss:4.51561e-05 test's binary_logloss:4.91874e-05"
     [1] "[32]: train's binary_logloss:3.89402e-05 test's binary_logloss:4.13015e-05"
     [1] "[33]: train's binary_logloss:3.24434e-05 test's binary_logloss:3.52605e-05"
     [1] "[34]: train's binary_logloss:2.65255e-05 test's binary_logloss:2.86338e-05"
     [1] "[35]: train's binary_logloss:2.19277e-05 test's binary_logloss:2.3937e-05"
     [1] "[36]: train's binary_logloss:1.86469e-05 test's binary_logloss:2.05375e-05"
     [1] "[37]: train's binary_logloss:1.49881e-05 test's binary_logloss:1.53852e-05"
     [1] "[38]: train's binary_logloss:1.2103e-05 test's binary_logloss:1.20722e-05"
     [1] "[39]: train's binary_logloss:1.02027e-05 test's binary_logloss:1.0578e-05"
     [1] "[40]: train's binary_logloss:8.91561e-06 test's binary_logloss:8.8323e-06"
     [1] "[41]: train's binary_logloss:7.4855e-06 test's binary_logloss:7.58441e-06"
     [1] "[42]: train's binary_logloss:6.21179e-06 test's binary_logloss:6.14299e-06"
     [1] "[43]: train's binary_logloss:5.06413e-06 test's binary_logloss:5.13576e-06"
     [1] "[44]: train's binary_logloss:4.2029e-06 test's binary_logloss:4.53605e-06"
     [1] "[45]: train's binary_logloss:3.47042e-06 test's binary_logloss:3.73234e-06"
     [1] "[46]: train's binary_logloss:2.78181e-06 test's binary_logloss:3.02556e-06"
     [1] "[47]: train's binary_logloss:2.19819e-06 test's binary_logloss:2.3666e-06"
     [1] "[48]: train's binary_logloss:1.80519e-06 test's binary_logloss:1.92932e-06"
     [1] "[49]: train's binary_logloss:1.50192e-06 test's binary_logloss:1.64658e-06"
     [1] "[50]: train's binary_logloss:1.20212e-06 test's binary_logloss:1.33316e-06"
     ....
     basic:
     lightgbm(): W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.017890 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_error:0.0222632"
     [1] "[2]: train's binary_error:0.0222632"
     [1] "[3]: train's binary_error:0.0222632"
     [1] "[4]: train's binary_error:0.0109013"
     [1] "[5]: train's binary_error:0.0141256"
     [1] "[6]: train's binary_error:0.0141256"
     [1] "[7]: train's binary_error:0.0141256"
     [1] "[8]: train's binary_error:0.0141256"
     [1] "[9]: train's binary_error:0.00598802"
     [1] "[10]: train's binary_error:0.00598802"
     .....W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.003456 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 98
     [LightGBM] [Info] Number of data points in the train set: 150, number of used features: 4
     [LightGBM] [Info] Start training from score -1.098612
     [LightGBM] [Info] Start training from score -1.098612
     [LightGBM] [Info] Start training from score -1.098612
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's multi_error:0.0466667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[11]: train's multi_error:0.0333333"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[12]: train's multi_error:0.0266667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[13]: train's multi_error:0.0266667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[14]: train's multi_error:0.0266667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[15]: train's multi_error:0.0266667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[16]: train's multi_error:0.0333333"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[17]: train's multi_error:0.0266667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[18]: train's multi_error:0.0333333"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[19]: train's multi_error:0.0333333"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[20]: train's multi_error:0.0333333"
     ...W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.021731 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_error:0.0304007 train's auc:0.972508 train's binary_logloss:0.198597"
     [1] "[2]: train's binary_error:0.0222632 train's auc:0.995075 train's binary_logloss:0.111535"
     [1] "[3]: train's binary_error:0.00598802 train's auc:0.997845 train's binary_logloss:0.0480659"
     [1] "[4]: train's binary_error:0.00122831 train's auc:0.998433 train's binary_logloss:0.0279151"
     [1] "[5]: train's binary_error:0.00122831 train's auc:0.999354 train's binary_logloss:0.0190479"
     [1] "[6]: train's binary_error:0.00537387 train's auc:0.98965 train's binary_logloss:0.16706"
     [1] "[7]: train's binary_error:0 train's auc:1 train's binary_logloss:0.0128449"
     [1] "[8]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00774702"
     [1] "[9]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00472108"
     [1] "[10]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00208929"
     ..W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.011923 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_error:0.0222632"
     [1] "[2]: train's binary_error:0.0222632"
     [1] "[3]: train's binary_error:0.0222632"
     [1] "[4]: train's binary_error:0.0109013"
     [1] "[5]: train's binary_error:0.0141256"
     [1] "[6]: train's binary_error:0.0141256"
     [1] "[7]: train's binary_error:0.0141256"
     [1] "[8]: train's binary_error:0.0141256"
     [1] "[9]: train's binary_error:0.00598802"
     [1] "[10]: train's binary_error:0.00598802"
     ..W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023049 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [1] "[1]: train's l2:0.206337"
     [1] "[2]: train's l2:0.171229"
     [1] "[3]: train's l2:0.140871"
     [1] "[4]: train's l2:0.116282"
     [1] "[5]: train's l2:0.096364"
     [1] "[6]: train's l2:0.0802308"
     [1] "[7]: train's l2:0.0675595"
     [1] "[8]: train's l2:0.0567154"
     [1] "[9]: train's l2:0.0482086"
     [1] "[10]: train's l2:0.0402694"
     ....W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.020025 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_error:0.0222632 train's auc:0.981784 valid1's binary_error:0.0222632 valid1's auc:0.981784 valid2's binary_error:0.0222632 valid2's auc:0.981784"
     [1] "[2]: train's binary_error:0.0222632 train's auc:0.981784 valid1's binary_error:0.0222632 valid1's auc:0.981784 valid2's binary_error:0.0222632 valid2's auc:0.981784"
     [1] "[3]: train's binary_error:0.0222632 train's auc:0.992951 valid1's binary_error:0.0222632 valid1's auc:0.992951 valid2's binary_error:0.0222632 valid2's auc:0.992951"
     [1] "[4]: train's binary_error:0.0109013 train's auc:0.992951 valid1's binary_error:0.0109013 valid1's auc:0.992951 valid2's binary_error:0.0109013 valid2's auc:0.992951"
     [1] "[5]: train's binary_error:0.0141256 train's auc:0.994714 valid1's binary_error:0.0141256 valid1's auc:0.994714 valid2's binary_error:0.0141256 valid2's auc:0.994714"
     [1] "[6]: train's binary_error:0.0141256 train's auc:0.994714 valid1's binary_error:0.0141256 valid1's auc:0.994714 valid2's binary_error:0.0141256 valid2's auc:0.994714"
     [1] "[7]: train's binary_error:0.0141256 train's auc:0.994714 valid1's binary_error:0.0141256 valid1's auc:0.994714 valid2's binary_error:0.0141256 valid2's auc:0.994714"
     [1] "[8]: train's binary_error:0.0141256 train's auc:0.994714 valid1's binary_error:0.0141256 valid1's auc:0.994714 valid2's binary_error:0.0141256 valid2's auc:0.994714"
     [1] "[9]: train's binary_error:0.00598802 train's auc:0.993175 valid1's binary_error:0.00598802 valid1's auc:0.993175 valid2's binary_error:0.00598802 valid2's auc:0.993175"
     [1] "[10]: train's binary_error:0.00598802 train's auc:0.998242 valid1's binary_error:0.00598802 valid1's auc:0.998242 valid2's binary_error:0.00598802 valid2's auc:0.998242"
     .......
     training continuation: [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.014851 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.179606"
     [1] "[2]: train's binary_logloss:0.0975448"
     [1] "[3]: train's binary_logloss:0.0384292"
     [1] "[4]: train's binary_logloss:0.0582241"
     [1] "[5]: train's binary_logloss:0.0595215"
     [1] "[6]: train's binary_logloss:0.0609174"
     [1] "[7]: train's binary_logloss:0.317567"
     [1] "[8]: train's binary_logloss:0.0104223"
     [1] "[9]: train's binary_logloss:0.00497498"
     [1] "[10]: train's binary_logloss:0.00283557"
     [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.032728 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.179606"
     [1] "[2]: train's binary_logloss:0.0975448"
     [1] "[3]: train's binary_logloss:0.0384292"
     [1] "[4]: train's binary_logloss:0.0582241"
     [1] "[5]: train's binary_logloss:0.0595215"
     [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.090214 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [1] "[6]: train's binary_logloss:0.0609174"
     [1] "[7]: train's binary_logloss:0.317567"
     [1] "[8]: train's binary_logloss:0.0104223"
     [1] "[9]: train's binary_logloss:0.00497498"
     [1] "[10]: train's binary_logloss:0.00283557"
     .
     lgb.cv(): W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023583 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 5211, number of used features: 116
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.041563 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 5211, number of used features: 116
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.026749 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 5210, number of used features: 116
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.020600 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 5210, number of used features: 116
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.042664 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 5210, number of used features: 116
     [LightGBM] [Info] Start training from score 0.483976
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.480906
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.481574
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.482342
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.481766
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306994+0.00061397"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[4]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[5]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[6]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[7]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[8]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[9]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[10]: valid's l2:0.000306984+0.000613968 valid's l1:0.000306984+0.000613968"
     .........W[LightGBM] [Info] Number of positive: 198, number of negative: 202
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.013645 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 167
     [LightGBM] [Info] Number of data points in the train set: 400, number of used features: 1
     [LightGBM] [Info] Number of positive: 196, number of negative: 204
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006653 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 167
     [LightGBM] [Info] Number of data points in the train set: 400, number of used features: 1
     [LightGBM] [Info] Number of positive: 207, number of negative: 193
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.011724 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 167
     [LightGBM] [Info] Number of data points in the train set: 400, number of used features: 1
     [LightGBM] [Info] Number of positive: 207, number of negative: 193
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.009405 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 167
     [LightGBM] [Info] Number of data points in the train set: 400, number of used features: 1
     [LightGBM] [Info] Number of positive: 192, number of negative: 208
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006726 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 167
     [LightGBM] [Info] Number of data points in the train set: 400, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.495000 -> initscore=-0.020001
     [LightGBM] [Info] Start training from score -0.020001
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.490000 -> initscore=-0.040005
     [LightGBM] [Info] Start training from score -0.040005
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.517500 -> initscore=0.070029
     [LightGBM] [Info] Start training from score 0.070029
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.517500 -> initscore=0.070029
     [LightGBM] [Info] Start training from score 0.070029
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.480000 -> initscore=-0.080043
     [LightGBM] [Info] Start training from score -0.080043
     [1] "[1]: valid's auc:0.476662+0.0622898 valid's binary_error:0.5+0.0593296"
     [1] "[2]: valid's auc:0.477476+0.0393392 valid's binary_error:0.554+0.0372022"
     [1] "[3]: valid's auc:0.456927+0.042898 valid's binary_error:0.526+0.0361109"
     [1] "[4]: valid's auc:0.419531+0.0344972 valid's binary_error:0.54+0.0289828"
     [1] "[5]: valid's auc:0.459109+0.0862237 valid's binary_error:0.52+0.0489898"
     [1] "[6]: valid's auc:0.460522+0.0911246 valid's binary_error:0.528+0.0231517"
     [1] "[7]: valid's auc:0.456328+0.0540445 valid's binary_error:0.532+0.0386782"
     [1] "[8]: valid's auc:0.463653+0.0660907 valid's binary_error:0.514+0.0488262"
     [1] "[9]: valid's auc:0.443017+0.0549965 valid's binary_error:0.55+0.0303315"
     [1] "[10]: valid's auc:0.477483+0.0763283 valid's binary_error:0.488+0.0549181"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003925 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.005257 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.013134 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003498 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005604 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Info] Start training from score 0.052402
     [LightGBM] [Info] Start training from score 0.017850
     [LightGBM] [Info] Start training from score -0.018739
     [LightGBM] [Info] Start training from score -0.029298
     [LightGBM] [Info] Start training from score 0.027539
     [1] "[1]: valid's l2:3.40838+0.231034"
     [1] "[2]: valid's l2:3.02136+0.216706"
     [1] "[3]: valid's l2:2.68787+0.209507"
     [1] "[4]: valid's l2:2.40599+0.192969"
     [1] "[5]: valid's l2:2.15761+0.187944"
     [1] "[6]: valid's l2:1.94667+0.172578"
     [1] "[7]: valid's l2:1.75663+0.167009"
     [1] "[8]: valid's l2:1.59496+0.150831"
     [1] "[9]: valid's l2:1.44947+0.144699"
     [1] "[10]: valid's l2:1.31792+0.139097"
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006578 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.017837 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.017608 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.014304 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023538 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 800, number of used features: 1
     [LightGBM] [Info] Start training from score 0.115703
     [LightGBM] [Info] Start training from score 0.123541
     [LightGBM] [Info] Start training from score 0.076232
     [LightGBM] [Info] Start training from score 0.094848
     [LightGBM] [Info] Start training from score 0.150201
     [1] "[1]: valid's l2:3.58556+0.201544"
     [1] "[2]: valid's l2:2.9047+0.163529"
     [1] "[3]: valid's l2:2.35329+0.13272"
     [1] "[4]: valid's l2:1.90673+0.107749"
     [1] "[5]: valid's l2:1.54508+0.0875109"
     [1] "[6]: valid's l2:1.25218+0.0711496"
     [1] "[7]: valid's l2:1.01499+0.057881"
     [1] "[8]: valid's l2:0.82292+0.0471195"
     [1] "[9]: valid's l2:0.667388+0.0383999"
     [1] "[10]: valid's l2:0.541453+0.0313252"
     ..W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.015847 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.014744 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.024025 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Info] Start training from score 0.485260
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.478812
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.482266
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's l2:0.202301+0.000155342"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's l2:0.163898+0.000140233"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's l2:0.132794+0.000144812"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's l2:0.107602+0.000161317"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid's l2:0.0871985+0.000182805"
     W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.016952 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.024848 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.018534 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 4342, number of used features: 116
     [LightGBM] [Info] Start training from score 0.485260
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.478812
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.482266
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's l2:0.202301"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's l2:0.163898"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's l2:0.132794"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's l2:0.107602"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid's l2:0.0871985"
     ....
     lgb.train(): W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.028049 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's binary_error:0.00307078 train's auc:0.99996 train's binary_logloss:0.132074"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's binary_error:0.00153539 train's auc:1 train's binary_logloss:0.0444372"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's binary_error:0 train's auc:1 train's binary_logloss:0.0159408"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00590065"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00230167"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's binary_error:0 train's auc:1 train's binary_logloss:0.00084253"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's binary_error:0 train's auc:1 train's binary_logloss:0.000309409"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's binary_error:0 train's auc:1 train's binary_logloss:0.000113754"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's binary_error:0 train's auc:1 train's binary_logloss:4.1838e-05"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's binary_error:0 train's auc:1 train's binary_logloss:1.539e-05"
     .............[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ........[LightGBM] [Info] Number of positive: 35110, number of negative: 34890
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006363 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 12
     [LightGBM] [Info] Number of data points in the train set: 70000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.501571 -> initscore=0.006286
     [LightGBM] [Info] Start training from score 0.006286
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .....[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002622 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003545 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002666 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004037 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004851 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006493 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0"
     ...[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.014405 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's auc:0.987036"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's auc:0.987036"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's auc:0.998699"
     [1] "[4]: valid1's auc:0.998699"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's auc:0.998699"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's auc:0.999667"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's auc:0.999806"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's auc:0.999978"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's auc:0.999997"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's auc:0.999997"
     [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.011302 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0.016139"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0.016139"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0.016139"
     [1] "[4]: valid1's binary_error:0.016139"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0.016139"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0.016139"
     ..........[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001429 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's rmse:55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's rmse:59.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's rmse:63.55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's rmse:67.195"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's rmse:70.4755"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's rmse:73.428"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's rmse:76.0852"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's rmse:78.4766"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's rmse:80.629"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's rmse:82.5661"
     ...[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003041 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's rmse:55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's rmse:59.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's rmse:63.55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's rmse:67.195"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's rmse:70.4755"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's rmse:73.428"
     ...[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003665 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's constant_metric:0.2 valid1's increasing_metric:0.1"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's constant_metric:0.2 valid1's increasing_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's constant_metric:0.2 valid1's increasing_metric:0.3"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's constant_metric:0.2 valid1's increasing_metric:0.4"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's constant_metric:0.2 valid1's increasing_metric:0.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's constant_metric:0.2 valid1's increasing_metric:0.6"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's constant_metric:0.2 valid1's increasing_metric:0.7"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's constant_metric:0.2 valid1's increasing_metric:0.8"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's constant_metric:0.2 valid1's increasing_metric:0.9"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's constant_metric:0.2 valid1's increasing_metric:1"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002295 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's increasing_metric:1.1 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's increasing_metric:1.2 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's increasing_metric:1.3 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's increasing_metric:1.4 valid1's constant_metric:0.2"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002363 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's increasing_metric:1.5 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's increasing_metric:1.6 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's increasing_metric:1.7 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's increasing_metric:1.8 valid1's constant_metric:0.2"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002324 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's increasing_metric:1.9 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's increasing_metric:2 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's increasing_metric:2.1 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's increasing_metric:2.2 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's increasing_metric:2.3 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's increasing_metric:2.4 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's increasing_metric:2.5 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's increasing_metric:2.6 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's increasing_metric:2.7 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's increasing_metric:2.8 valid1's constant_metric:0.2"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002245 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's rmse:1.10501 valid1's l2:1.22105 valid1's increasing_metric:2.9 valid1's rmse:1.10501 valid1's l2:1.22105 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's rmse:1.10335 valid1's l2:1.21738 valid1's increasing_metric:3 valid1's rmse:1.10335 valid1's l2:1.21738 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's rmse:1.10199 valid1's l2:1.21438 valid1's increasing_metric:3.1 valid1's rmse:1.10199 valid1's l2:1.21438 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's rmse:1.10198 valid1's l2:1.21436 valid1's increasing_metric:3.2 valid1's rmse:1.10198 valid1's l2:1.21436 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's rmse:1.10128 valid1's l2:1.21282 valid1's increasing_metric:3.3 valid1's rmse:1.10128 valid1's l2:1.21282 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's rmse:1.10101 valid1's l2:1.21222 valid1's increasing_metric:3.4 valid1's rmse:1.10101 valid1's l2:1.21222 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's rmse:1.10065 valid1's l2:1.21143 valid1's increasing_metric:3.5 valid1's rmse:1.10065 valid1's l2:1.21143 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's rmse:1.10011 valid1's l2:1.21025 valid1's increasing_metric:3.6 valid1's rmse:1.10011 valid1's l2:1.21025 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's rmse:1.09999 valid1's l2:1.20997 valid1's increasing_metric:3.7 valid1's rmse:1.09999 valid1's l2:1.20997 valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's rmse:1.09954 valid1's l2:1.20898 valid1's increasing_metric:3.8 valid1's rmse:1.09954 valid1's l2:1.20898 valid1's constant_metric:0.2"
     .....[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002650 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0.486486 valid1's binary_logloss:0.693255"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0.486486 valid1's binary_logloss:0.691495"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0.486486 valid1's binary_logloss:0.69009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688534"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689883"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689641"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689532"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_error:0.432432 valid1's binary_logloss:0.691066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_error:0.432432 valid1's binary_logloss:0.690653"
     ...[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002931 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_logloss:0.693255"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_logloss:0.691495"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_logloss:0.69009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_logloss:0.688968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_logloss:0.688534"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_logloss:0.689883"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_logloss:0.689641"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_logloss:0.689532"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_logloss:0.691066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_logloss:0.690653"
     ..[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003005 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0.486486 valid1's binary_logloss:0.693255"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0.486486 valid1's binary_logloss:0.691495"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0.486486 valid1's binary_logloss:0.69009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688534"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689883"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689641"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689532"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_error:0.432432 valid1's binary_logloss:0.691066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_error:0.432432 valid1's binary_logloss:0.690653"
     ...[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003519 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_logloss:0.693255"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_logloss:0.691495"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_logloss:0.69009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_logloss:0.688968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_logloss:0.688534"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_logloss:0.689883"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_logloss:0.689641"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_logloss:0.689532"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_logloss:0.691066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_logloss:0.690653"
     ..[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002203 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's binary_error:0.486486 valid1's binary_logloss:0.693255"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's binary_error:0.486486 valid1's binary_logloss:0.691495"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's binary_error:0.486486 valid1's binary_logloss:0.69009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688968"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's binary_error:0.432432 valid1's binary_logloss:0.688534"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689883"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689641"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's binary_error:0.432432 valid1's binary_logloss:0.689532"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's binary_error:0.432432 valid1's binary_logloss:0.691066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's binary_error:0.432432 valid1's binary_logloss:0.690653"
     ...[LightGBM] [Info] Number of positive: 66, number of negative: 54
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002570 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 120, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.550000 -> initscore=0.200671
     [LightGBM] [Info] Start training from score 0.200671
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's constant_metric:0.2"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's constant_metric:0.2"
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005641 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's mape:1.1 valid1's rmse:55 valid1's l1:55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's mape:1.19 valid1's rmse:59.5 valid1's l1:59.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's mape:1.271 valid1's rmse:63.55 valid1's l1:63.55"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's mape:1.3439 valid1's rmse:67.195 valid1's l1:67.195"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's mape:1.40951 valid1's rmse:70.4755 valid1's l1:70.4755"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's mape:1.46856 valid1's rmse:73.428 valid1's l1:73.428"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003355 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 140
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 4
     [LightGBM] [Info] Start training from score 0.045019
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ..[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002901 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 74
     [LightGBM] [Info] Number of data points in the train set: 32, number of used features: 10
     [LightGBM] [Info] Start training from score 20.090625
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:34.4887"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:33.8024"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's l2:33.1297"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's l2:32.4704"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's l2:31.8243"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's l2:31.191"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's l2:30.5703"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's l2:29.9619"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's l2:29.3657"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's l2:28.7813"
     ...[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005214 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 74
     [LightGBM] [Info] Number of data points in the train set: 32, number of used features: 10
     [LightGBM] [Info] Start training from score 20.090625
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:34.4887"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:33.8024"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's l2:33.1297"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's l2:32.4704"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's l2:31.8243"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's l2:31.191"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's l2:30.5703"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's l2:29.9619"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's l2:29.3657"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's l2:28.7813"
     ...[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.028072 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 57
     [LightGBM] [Info] Number of data points in the train set: 32, number of used features: 10
     [LightGBM] [Info] Start training from score 20.090625
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:34.4954"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:33.8156"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's l2:33.1493"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's l2:32.4963"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's l2:31.8563"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's l2:31.2291"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's l2:30.6143"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's l2:30.0117"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's l2:29.4211"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's l2:28.8423"
     ...W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003538 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid1's rmse:125 valid2's rmse:98.1071"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid1's rmse:87.5 valid2's rmse:62.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid1's rmse:106.25 valid2's rmse:80.0878"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid1's rmse:96.875 valid2's rmse:71.2198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid1's rmse:101.562 valid2's rmse:75.6386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid1's rmse:99.2188 valid2's rmse:73.425"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid1's rmse:100.391 valid2's rmse:74.5308"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid1's rmse:99.8047 valid2's rmse:73.9777"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid1's rmse:100.098 valid2's rmse:74.2542"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid1's rmse:99.9512 valid2's rmse:74.1159"
     ....W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002921 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's rmse:25 valid1's rmse:125 valid2's rmse:98.1071"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's rmse:12.5 valid1's rmse:87.5 valid2's rmse:62.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's rmse:6.25 valid1's rmse:106.25 valid2's rmse:80.0878"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's rmse:3.125 valid1's rmse:96.875 valid2's rmse:71.2198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's rmse:1.5625 valid1's rmse:101.562 valid2's rmse:75.6386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's rmse:0.78125 valid1's rmse:99.2188 valid2's rmse:73.425"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's rmse:0.390625 valid1's rmse:100.391 valid2's rmse:74.5308"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's rmse:0.195312 valid1's rmse:99.8047 valid2's rmse:73.9777"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's rmse:0.0976562 valid1's rmse:100.098 valid2's rmse:74.2542"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's rmse:0.0488281 valid1's rmse:99.9512 valid2's rmse:74.1159"
     ....W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.001989 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's rmse:25 valid1's rmse:125 valid2's rmse:98.1071"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's rmse:12.5 valid1's rmse:87.5 valid2's rmse:62.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's rmse:6.25 valid1's rmse:106.25 valid2's rmse:80.0878"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's rmse:3.125 valid1's rmse:96.875 valid2's rmse:71.2198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's rmse:1.5625 valid1's rmse:101.562 valid2's rmse:75.6386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's rmse:0.78125 valid1's rmse:99.2188 valid2's rmse:73.425"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's rmse:0.390625 valid1's rmse:100.391 valid2's rmse:74.5308"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's rmse:0.195312 valid1's rmse:99.8047 valid2's rmse:73.9777"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's rmse:0.0976562 valid1's rmse:100.098 valid2's rmse:74.2542"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's rmse:0.0488281 valid1's rmse:99.9512 valid2's rmse:74.1159"
     ....W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003159 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's rmse:25 valid1's rmse:125 valid2's rmse:98.1071"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's rmse:12.5 valid1's rmse:87.5 valid2's rmse:62.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's rmse:6.25 valid1's rmse:106.25 valid2's rmse:80.0878"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's rmse:3.125 valid1's rmse:96.875 valid2's rmse:71.2198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's rmse:1.5625 valid1's rmse:101.562 valid2's rmse:75.6386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's rmse:0.78125 valid1's rmse:99.2188 valid2's rmse:73.425"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's rmse:0.390625 valid1's rmse:100.391 valid2's rmse:74.5308"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's rmse:0.195312 valid1's rmse:99.8047 valid2's rmse:73.9777"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's rmse:0.0976562 valid1's rmse:100.098 valid2's rmse:74.2542"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's rmse:0.0488281 valid1's rmse:99.9512 valid2's rmse:74.1159"
     ....W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002705 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: something-random-we-would-not-hardcode's rmse:25 valid1's rmse:125 valid2's rmse:98.1071"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: something-random-we-would-not-hardcode's rmse:12.5 valid1's rmse:87.5 valid2's rmse:62.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: something-random-we-would-not-hardcode's rmse:6.25 valid1's rmse:106.25 valid2's rmse:80.0878"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: something-random-we-would-not-hardcode's rmse:3.125 valid1's rmse:96.875 valid2's rmse:71.2198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: something-random-we-would-not-hardcode's rmse:1.5625 valid1's rmse:101.562 valid2's rmse:75.6386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: something-random-we-would-not-hardcode's rmse:0.78125 valid1's rmse:99.2188 valid2's rmse:73.425"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: something-random-we-would-not-hardcode's rmse:0.390625 valid1's rmse:100.391 valid2's rmse:74.5308"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: something-random-we-would-not-hardcode's rmse:0.195312 valid1's rmse:99.8047 valid2's rmse:73.9777"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: something-random-we-would-not-hardcode's rmse:0.0976562 valid1's rmse:100.098 valid2's rmse:74.2542"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: something-random-we-would-not-hardcode's rmse:0.0488281 valid1's rmse:99.9512 valid2's rmse:74.1159"
     ....W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003071 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 3
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's rmse:25"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's rmse:12.5"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: train's rmse:6.25"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: train's rmse:3.125"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: train's rmse:1.5625"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: train's rmse:0.78125"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: train's rmse:0.390625"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: train's rmse:0.195312"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: train's rmse:0.0976562"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: train's rmse:0.0488281"
     ..W[LightGBM] [Info] Number of positive: 500, number of negative: 500
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.007976 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 255
     [LightGBM] [Info] Number of data points in the train set: 1000, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [1] "[1]: something-random-we-would-not-hardcode's auc:0.58136 valid1's auc:0.429487"
     [1] "[2]: something-random-we-would-not-hardcode's auc:0.599008 valid1's auc:0.266026"
     [1] "[3]: something-random-we-would-not-hardcode's auc:0.6328 valid1's auc:0.349359"
     [1] "[4]: something-random-we-would-not-hardcode's auc:0.655136 valid1's auc:0.394231"
     [1] "[5]: something-random-we-would-not-hardcode's auc:0.655408 valid1's auc:0.419872"
     [1] "[6]: something-random-we-would-not-hardcode's auc:0.678784 valid1's auc:0.336538"
     [1] "[7]: something-random-we-would-not-hardcode's auc:0.682176 valid1's auc:0.416667"
     [1] "[8]: something-random-we-would-not-hardcode's auc:0.698032 valid1's auc:0.394231"
     [1] "[9]: something-random-we-would-not-hardcode's auc:0.712672 valid1's auc:0.445513"
     [1] "[10]: something-random-we-would-not-hardcode's auc:0.723024 valid1's auc:0.471154"
     ....W....[LightGBM] [Info] Number of positive: 50, number of negative: 39
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002514 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 89, number of used features: 1
     [LightGBM] [Info] Number of positive: 49, number of negative: 41
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004074 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 90, number of used features: 1
     [LightGBM] [Info] Number of positive: 53, number of negative: 38
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002825 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 91, number of used features: 1
     [LightGBM] [Info] Number of positive: 46, number of negative: 44
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002512 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 90, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.561798 -> initscore=0.248461
     [LightGBM] [Info] Start training from score 0.248461
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.544444 -> initscore=0.178248
     [LightGBM] [Info] Start training from score 0.178248
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.582418 -> initscore=0.332706
     [LightGBM] [Info] Start training from score 0.332706
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.511111 -> initscore=0.044452
     [LightGBM] [Info] Start training from score 0.044452
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.701123+0.0155541"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.70447+0.0152787"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.706572+0.0162531"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.709214+0.0165672"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.710652+0.0172198"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid's binary_error:0.500565+0.0460701 valid's binary_logloss:0.713091+0.0176604"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid's binary_error:0.508899+0.0347887 valid's binary_logloss:0.714842+0.0184267"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid's binary_error:0.508899+0.0347887 valid's binary_logloss:0.714719+0.0178927"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid's binary_error:0.508899+0.0347887 valid's binary_logloss:0.717162+0.0181993"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid's binary_error:0.508899+0.0347887 valid's binary_logloss:0.716395+0.018088"
     ....[LightGBM] [Info] Number of positive: 45, number of negative: 35
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.008186 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Info] Number of positive: 40, number of negative: 40
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004370 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Info] Number of positive: 47, number of negative: 33
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003190 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 42
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.562500 -> initscore=0.251314
     [LightGBM] [Info] Start training from score 0.251314
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.500000 -> initscore=0.000000
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.587500 -> initscore=0.353640
     [LightGBM] [Info] Start training from score 0.353640
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid's constant_metric:0.2+0"
     ..[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005312 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002613 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.008611 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.007723 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003478 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Info] Start training from score 0.024388
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.005573
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.039723
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.029700
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.125712
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's increasing_metric:4.1+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's increasing_metric:4.6+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's increasing_metric:5.1+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's increasing_metric:5.6+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[5]: valid's increasing_metric:6.1+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[6]: valid's increasing_metric:6.6+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[7]: valid's increasing_metric:7.1+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[8]: valid's increasing_metric:7.6+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[9]: valid's increasing_metric:8.1+0.141421 valid's constant_metric:0.2+0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[10]: valid's increasing_metric:8.6+0.141421 valid's constant_metric:0.2+0"
     .....[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006732 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003074 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.007363 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.012071 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004052 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 80, number of used features: 1
     [LightGBM] [Info] Start training from score 0.024388
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.005573
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.039723
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.029700
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.125712
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's constant_metric:0.2+0 valid's increasing_metric:9.1+0.141421"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: valid's constant_metric:0.2+0 valid's increasing_metric:9.6+0.141421"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: valid's constant_metric:0.2+0 valid's increasing_metric:10.1+0.141421"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[4]: valid's constant_metric:0.2+0 valid's increasing_metric:10.6+0.141421"
     .....
     linear learner: [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.007562 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.137767
     [1] "[1]: train's l2:2.84778"
     [1] "[2]: train's l2:2.5242"
     [1] "[3]: train's l2:2.24798"
     [1] "[4]: train's l2:2.00427"
     [1] "[5]: train's l2:1.79784"
     [1] "[6]: train's l2:1.61418"
     [1] "[7]: train's l2:1.45586"
     [1] "[8]: train's l2:1.32013"
     [1] "[9]: train's l2:1.19755"
     [1] "[10]: train's l2:1.09283"
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.009134 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.338905
     [1] "[1]: train's l2:3.40843"
     [1] "[2]: train's l2:2.76158"
     [1] "[3]: train's l2:2.23763"
     [1] "[4]: train's l2:1.81323"
     [1] "[5]: train's l2:1.46947"
     [1] "[6]: train's l2:1.19102"
     [1] "[7]: train's l2:0.96548"
     [1] "[8]: train's l2:0.78279"
     [1] "[9]: train's l2:0.634812"
     [1] "[10]: train's l2:0.514949"
     ..[LightGBM] [Fatal] Cannot change linear_tree after constructed Dataset handle.
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003386 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 32
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.172413
     [1] "[1]: train's l2:2.40089"
     [1] "[2]: train's l2:2.14212"
     [1] "[3]: train's l2:1.92264"
     [1] "[4]: train's l2:1.72221"
     [1] "[5]: train's l2:1.55745"
     [1] "[6]: train's l2:1.4075"
     [1] "[7]: train's l2:1.27998"
     [1] "[8]: train's l2:1.16551"
     [1] "[9]: train's l2:1.06822"
     [1] "[10]: train's l2:0.97992"
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005423 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 33
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score -0.048498
     [1] "[1]: train's l2:3.56595"
     [1] "[2]: train's l2:2.88907"
     [1] "[3]: train's l2:2.34276"
     [1] "[4]: train's l2:1.89763"
     [1] "[5]: train's l2:1.53941"
     [1] "[6]: train's l2:1.24722"
     [1] "[7]: train's l2:1.0124"
     [1] "[8]: train's l2:0.820467"
     [1] "[9]: train's l2:0.66657"
     [1] "[10]: train's l2:0.54045"
     ..[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002819 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 32
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.172413
     [1] "[1]: train's l2:2.41018"
     [1] "[2]: train's l2:2.15266"
     [1] "[3]: train's l2:1.92549"
     [1] "[4]: train's l2:1.71392"
     [1] "[5]: train's l2:1.54274"
     [1] "[6]: train's l2:1.39293"
     [1] "[7]: train's l2:1.271"
     [1] "[8]: train's l2:1.16194"
     [1] "[9]: train's l2:1.06917"
     [1] "[10]: train's l2:0.992576"
     .[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.003372 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 33
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score -0.048498
     [1] "[1]: train's l2:3.57471"
     [1] "[2]: train's l2:2.8932"
     [1] "[3]: train's l2:2.34421"
     [1] "[4]: train's l2:1.90154"
     [1] "[5]: train's l2:1.54348"
     [1] "[6]: train's l2:1.24892"
     [1] "[7]: train's l2:1.01391"
     [1] "[8]: train's l2:0.821055"
     [1] "[9]: train's l2:0.66692"
     [1] "[10]: train's l2:0.540703"
     ..W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003455 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 38
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 2
     [LightGBM] [Info] Start training from score 0.137507
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003387 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 2
     [LightGBM] [Info] Start training from score 0.140709
     [1] "[1]: train's l2:2.81786"
     [1] "[2]: train's l2:2.49518"
     [1] "[3]: train's l2:2.22344"
     [1] "[4]: train's l2:1.98477"
     [1] "[5]: train's l2:1.77604"
     [1] "[6]: train's l2:1.59305"
     [1] "[7]: train's l2:1.43703"
     [1] "[8]: train's l2:1.30009"
     [1] "[9]: train's l2:1.18123"
     [1] "[10]: train's l2:1.07721"
     .[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.004149 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 2
     [LightGBM] [Info] Start training from score 0.274022
     [1] "[1]: train's l2:3.37302"
     [1] "[2]: train's l2:2.7329"
     [1] "[3]: train's l2:2.21441"
     [1] "[4]: train's l2:1.79444"
     [1] "[5]: train's l2:1.45425"
     [1] "[6]: train's l2:1.17871"
     [1] "[7]: train's l2:0.955515"
     [1] "[8]: train's l2:0.774729"
     [1] "[9]: train's l2:0.628292"
     [1] "[10]: train's l2:0.509678"
     ..
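     The "Auto-choosing row-wise/col-wise multi-threading" warnings that recur throughout this log are informational: LightGBM benchmarked both memory layouts before training. They can be silenced by fixing the layout in the parameters. A minimal sketch, assuming the 3.x R API (`lgb.Dataset()` / `lgb.train()`); the parameter values are illustrative, not taken from the test suite:

```r
# Fixing force_row_wise (or force_col_wise) skips the auto-detection
# step that prints the "Auto-choosing ... overhead" warning.
params <- list(
    objective = "regression"
  , force_row_wise = TRUE
  , verbose = -1L
)

# Training would then proceed without the layout warning, e.g.:
# dtrain <- lightgbm::lgb.Dataset(data = X, label = y)
# bst <- lightgbm::lgb.train(params = params, data = dtrain, nrounds = 10L)
```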
     interaction constraints: ...[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.027386 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:0.24804"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:0.246711"
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.057978 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:0.24804"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:0.246711"
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.039531 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:0.24804"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:0.246711"
     ..[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.058290 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:0.24804"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:0.246711"
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.023744 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: train's l2:0.24804"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: train's l2:0.246711"
     .
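     The `interaction constraints` tests above exercise the `interaction_constraints` training parameter, which restricts which features may appear together in a single tree. A hedged sketch (the feature grouping is made up, and the indices are assumed to be 1-based column positions in the R interface):

```r
# Features within one group may interact; features from different
# groups are never used in the same tree.
params <- list(
    objective = "regression"
  , interaction_constraints = list(c(1L, 2L), c(3L, 4L))
  , verbose = -1L
)

# bst <- lightgbm::lgb.train(params = params, data = dtrain, nrounds = 5L)
```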
     monotone constraints: [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003462 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 610
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -358.923775
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 9 times)
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.010473 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 610
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -358.923775
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 11 times)
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004561 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 610
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -358.923775
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 6 times)
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.010888 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 765
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -5.149260
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002710 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 765
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -5.149260
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.003734 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 765
     [LightGBM] [Info] Number of data points in the train set: 3000, number of used features: 3
     [LightGBM] [Info] Start training from score -5.149260
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .
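     The `monotone constraints` tests above use the `monotone_constraints` parameter, which takes one entry per feature. A minimal sketch (the constraint vector is illustrative):

```r
# 1 = prediction must be non-decreasing in the feature,
# -1 = non-increasing, 0 = unconstrained.
params <- list(
    objective = "regression"
  , monotone_constraints = c(1L, -1L, 0L)
  , verbose = -1L
)

# bst <- lightgbm::lgb.train(params = params, data = dtrain, nrounds = 5L)
```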
     custom_objective:
     Test models with custom objective: [LightGBM] [Warning] Using self-defined objective function
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.017718 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Warning] Using self-defined objective function
     [1] "[1]: train's auc:0.994987 train's error:0.00598802 eval's auc:0.995243 eval's error:0.00558659"
     [1] "[2]: train's auc:0.99512 train's error:0.00307078 eval's auc:0.995237 eval's error:0.00248293"
     [1] "[3]: train's auc:0.99009 train's error:0.00598802 eval's auc:0.98843 eval's error:0.00558659"
     [1] "[4]: train's auc:0.999889 train's error:0.00168893 eval's auc:1 eval's error:0.000620732"
     [1] "[5]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     [1] "[6]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     [1] "[7]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     [1] "[8]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     [1] "[9]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     [1] "[10]: train's auc:1 train's error:0 eval's auc:1 eval's error:0"
     .[LightGBM] [Warning] Using self-defined objective function
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.024765 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Warning] Using self-defined objective function
     [1] "[1]: train's error:0.00598802 eval's error:0.00558659"
     [1] "[2]: train's error:0.00307078 eval's error:0.00248293"
     [1] "[3]: train's error:0.00598802 eval's error:0.00558659"
     [1] "[4]: train's error:0.00168893 eval's error:0.000620732"
     .......
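     The "Using self-defined objective function" warnings above come from passing a user-written objective to `lgb.train()`. Such an objective returns the gradient and hessian of the loss with respect to the raw score; for binary log-loss both are plain R arithmetic. A sketch, not the test suite's own code (it assumes `getinfo()` from the 3.x R API for reading labels):

```r
# Logistic sigmoid of a raw (pre-link) score.
sigmoid <- function(x) 1.0 / (1.0 + exp(-x))

# Custom objective: preds are raw scores, dtrain is an lgb.Dataset
# holding 0/1 labels. Returns the first and second derivatives of
# binary log-loss with respect to the raw score.
logregobj <- function(preds, dtrain) {
    labels <- lightgbm::getinfo(dtrain, "label")
    probs <- sigmoid(preds)
    list(
        grad = probs - labels
      , hess = probs * (1.0 - probs)
    )
}

# Passed via the obj argument, e.g.:
# bst <- lightgbm::lgb.train(params = params, data = dtrain, nrounds = 10L, obj = logregobj)
```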
     dataset:
     testing lgb.Dataset functionality: WW..[LightGBM] [Info] Saving data to binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec20cc40a3
     [LightGBM] [Info] Load from binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec20cc40a3
     WW..WWW.W.W.W........W..WW.W........................[LightGBM] [Fatal] Initial score size doesn't match data size
     .W.WW......................[LightGBM] [Info] Saving data to binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec651f788e
     [LightGBM] [Info] Load from binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec651f788e
     [LightGBM] [Info] Number of positive: 13, number of negative: 87
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002919 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 46
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 23
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.130000 -> initscore=-1.900959
     [LightGBM] [Info] Start training from score -1.900959
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 99 times)
     .[LightGBM] [Info] Saving data to binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec7aad4f60
     [LightGBM] [Info] Load from binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec7aad4f60
     [LightGBM] [Info] Number of positive: 9, number of negative: 58
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.007961 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 46
     [LightGBM] [Info] Number of data points in the train set: 67, number of used features: 23
     [LightGBM] [Info] Number of positive: 8, number of negative: 59
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003526 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 46
     [LightGBM] [Info] Number of data points in the train set: 67, number of used features: 23
     [LightGBM] [Info] Number of positive: 9, number of negative: 57
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005299 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 46
     [LightGBM] [Info] Number of data points in the train set: 66, number of used features: 23
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.134328 -> initscore=-1.863218
     [LightGBM] [Info] Start training from score -1.863218
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.119403 -> initscore=-1.998096
     [LightGBM] [Info] Start training from score -1.998096
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.136364 -> initscore=-1.845827
     [LightGBM] [Info] Start training from score -1.845827
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: valid's binary_logloss:0.279414+0.0167009"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 3 times before each of the following iterations)
     [1] "[2]: valid's binary_logloss:0.145125+0.0306369"
     [1] "[3]: valid's binary_logloss:0.113776+0.0170876"
     [1] "[4]: valid's binary_logloss:0.111976+0.0374821"
     [1] "[5]: valid's binary_logloss:0.0966582+0.0285143"
     [1] "[6]: valid's binary_logloss:0.0865012+0.0219174"
     [1] "[7]: valid's binary_logloss:0.103477+0.0535833"
     [1] "[8]: valid's binary_logloss:0.10095+0.0584932"
     [1] "[9]: valid's binary_logloss:0.108361+0.0735154"
     [1] "[10]: valid's binary_logloss:0.102928+0.0643592"
     [1] "[11]: valid's binary_logloss:0.0999163+0.0796126"
     [1] "[12]: valid's binary_logloss:0.0952751+0.0882803"
     [1] "[13]: valid's binary_logloss:0.103852+0.0954681"
     [1] "[14]: valid's binary_logloss:0.101612+0.107453"
     [1] "[15]: valid's binary_logloss:0.104854+0.107106"
     [1] "[16]: valid's binary_logloss:0.104574+0.116598"
     [1] "[17]: valid's binary_logloss:0.0956509+0.101251"
     [1] "[18]: valid's binary_logloss:0.0996179+0.114974"
     [1] "[19]: valid's binary_logloss:0.0913103+0.105155"
     [1] "[20]: valid's binary_logloss:0.0946521+0.11431"
     [1] "[21]: valid's binary_logloss:0.0979668+0.124126"
     [1] "[22]: valid's binary_logloss:0.0969992+0.115547"
     [1] "[23]: valid's binary_logloss:0.104527+0.123895"
     [1] "[24]: valid's binary_logloss:0.107342+0.131438"
     [1] "[25]: valid's binary_logloss:0.106014+0.125752"
     [1] "[26]: valid's binary_logloss:0.116506+0.133072"
     [1] "[27]: valid's binary_logloss:0.119467+0.138957"
     [1] "[28]: valid's binary_logloss:0.106518+0.128288"
     [1] "[29]: valid's binary_logloss:0.118423+0.141227"
     [1] "[30]: valid's binary_logloss:0.12338+0.147676"
     [1] "[31]: valid's binary_logloss:0.123616+0.138676"
     [1] "[32]: valid's binary_logloss:0.126272+0.150046"
     [1] "[33]: valid's binary_logloss:0.138234+0.15078"
     [1] "[34]: valid's binary_logloss:0.132632+0.13679"
     [1] "[35]: valid's binary_logloss:0.14058+0.151539"
     [1] "[36]: valid's binary_logloss:0.150265+0.156074"
     [1] "[37]: valid's binary_logloss:0.140275+0.153453"
     [1] "[38]: valid's binary_logloss:0.15395+0.157308"
     [1] "[39]: valid's binary_logloss:0.152835+0.167185"
     [1] "[40]: valid's binary_logloss:0.147393+0.141337"
     [1] "[41]: valid's binary_logloss:0.1445+0.12219"
     [1] "[42]: valid's binary_logloss:0.151162+0.131083"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf (message repeated 2 times)
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[43]: valid's binary_logloss:0.148884+0.131826"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[44]: valid's binary_logloss:0.154864+0.137458"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[45]: valid's binary_logloss:0.133984+0.111943"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[46]: valid's binary_logloss:0.123453+0.101176"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[47]: valid's binary_logloss:0.133002+0.111579"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[48]: valid's binary_logloss:0.13919+0.118066"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[49]: valid's binary_logloss:0.135524+0.114905"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[50]: valid's binary_logloss:0.143256+0.124112"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[51]: valid's binary_logloss:0.14774+0.129636"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[52]: valid's binary_logloss:0.14825+0.130326"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[53]: valid's binary_logloss:0.136665+0.114743"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[54]: valid's binary_logloss:0.137499+0.115857"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[55]: valid's binary_logloss:0.144269+0.124948"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[56]: valid's binary_logloss:0.149565+0.132107"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[57]: valid's binary_logloss:0.142714+0.122854"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[58]: valid's binary_logloss:0.146239+0.127606"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[59]: valid's binary_logloss:0.137767+0.116215"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[60]: valid's binary_logloss:0.147556+0.129386"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[61]: valid's binary_logloss:0.152326+0.135853"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[62]: valid's binary_logloss:0.147743+0.12964"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[63]: valid's binary_logloss:0.153347+0.13724"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[64]: valid's binary_logloss:0.15782+0.143333"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[65]: valid's binary_logloss:0.157954+0.143515"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[66]: valid's binary_logloss:0.154795+0.13921"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[67]: valid's binary_logloss:0.155441+0.140089"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[68]: valid's binary_logloss:0.162146+0.149243"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[69]: valid's binary_logloss:0.162745+0.150063"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[70]: valid's binary_logloss:0.155413+0.140052"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[71]: valid's binary_logloss:0.156071+0.140948"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[72]: valid's binary_logloss:0.160202+0.146585"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[73]: valid's binary_logloss:0.163868+0.151601"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[74]: valid's binary_logloss:0.162803+0.150142"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[75]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[76]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[77]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[78]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[79]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[80]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[81]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[82]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[83]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[84]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[85]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[86]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[87]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[88]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[89]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[90]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[91]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[92]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[93]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[94]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[95]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[96]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[97]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[98]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[99]: valid's binary_logloss:0.153939+0.138046"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[100]: valid's binary_logloss:0.153939+0.138046"
     ...[LightGBM] [Info] Construct bin mappers from text data time 0.00 seconds
     ...[LightGBM] [Info] Construct bin mappers from text data time 0.00 seconds
     ..........
     learning_to_rank:
     Learning to rank: [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002559 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 32
     [LightGBM] [Info] Number of data points in the train set: 6000, number of used features: 16
     .................W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.014918 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 4500, number of used features: 20
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004184 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 4500, number of used features: 20
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.013598 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 4500, number of used features: 20
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.005132 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 40
     [LightGBM] [Info] Number of data points in the train set: 4500, number of used features: 20
     [1] "[1]: valid's ndcg@1:0.675+0.0829156 valid's ndcg@2:0.655657+0.0625302 valid's ndcg@3:0.648464+0.0613335"
     [1] "[2]: valid's ndcg@1:0.725+0.108972 valid's ndcg@2:0.666972+0.131409 valid's ndcg@3:0.657124+0.130448"
     [1] "[3]: valid's ndcg@1:0.65+0.111803 valid's ndcg@2:0.630657+0.125965 valid's ndcg@3:0.646928+0.15518"
     [1] "[4]: valid's ndcg@1:0.725+0.0829156 valid's ndcg@2:0.647629+0.120353 valid's ndcg@3:0.654052+0.129471"
     [1] "[5]: valid's ndcg@1:0.75+0.165831 valid's ndcg@2:0.662958+0.142544 valid's ndcg@3:0.648186+0.130213"
     [1] "[6]: valid's ndcg@1:0.725+0.129904 valid's ndcg@2:0.647629+0.108136 valid's ndcg@3:0.648186+0.106655"
     [1] "[7]: valid's ndcg@1:0.75+0.165831 valid's ndcg@2:0.662958+0.128753 valid's ndcg@3:0.648186+0.11714"
     [1] "[8]: valid's ndcg@1:0.725+0.129904 valid's ndcg@2:0.637958+0.123045 valid's ndcg@3:0.64665+0.119557"
     [1] "[9]: valid's ndcg@1:0.75+0.15 valid's ndcg@2:0.711315+0.101634 valid's ndcg@3:0.702794+0.100252"
     [1] "[10]: valid's ndcg@1:0.75+0.165831 valid's ndcg@2:0.682301+0.117876 valid's ndcg@3:0.66299+0.121243"
     ..............................
     lgb.Booster:
     Booster: W....
     lgb.get.eval.result: ......W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.011779 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: test's l2:6.44165e-17"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: test's l2:1.97215e-31"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: test's l2:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[4]: test's l2:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[5]: test's l2:0"
     .W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.016646 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[1]: test's l2:6.44165e-17"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[2]: test's l2:1.97215e-31"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [1] "[3]: test's l2:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[4]: test's l2:0"
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Stopped training because there are no more leaves that meet the split requirements
     [1] "[5]: test's l2:0"
     .
     lgb.load(): W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.010805 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     .....W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.022321 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     ...[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003686 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.137767
     ...W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.012889 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     ...W.....W..W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.025031 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     ...
     Booster: [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.018342 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     ....W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.010983 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023795 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 182
     [LightGBM] [Info] Number of data points in the train set: 1611, number of used features: 91
     ......[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.016461 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Info] Saving data to binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec373f12b8
     [LightGBM] [Info] Load from binary file C:\r_packages\pkgcheck\CRAN\lightgbm\tmp\RtmpqmFpH9\lgb.Dataset_227ec373f12b8
     .1W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023272 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     [1] "[3]: train's binary_logloss:0.0480659"
     [1] "[4]: train's binary_logloss:0.0279151"
     [1] "[5]: train's binary_logloss:0.0190479"
     ......W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.012196 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     ..[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     ..W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.039376 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     [1] "[3]: train's binary_logloss:0.0480659"
     ....W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.009478 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     .[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.014604 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     ...[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.017546 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     .[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     ..
     save_model: W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.036966 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     ...W[LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.019696 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [1] "[1]: train's binary_logloss:0.198597"
     [1] "[2]: train's binary_logloss:0.111535"
     .[LightGBM] [Fatal] Unknown importance type: only support split=0 and gain=1
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.007492 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ..........[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002857 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ......................Error in self$construct() :
     Attempting to create a Dataset without any raw data. This can happen if you have called Dataset$finalize() or if this Dataset was saved with saveRDS(). To avoid this error in the future, use lgb.Dataset.save() or Dataset$save_binary() to save lightgbm Datasets.
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.032736 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006089 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.004159 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Start training from score 0.016891
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score 0.014176
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Info] Start training from score -0.114604
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .....................
     saveRDS.lgb.Booster() and readRDS.lgb.Booster(): [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.025219 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 214
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 107
     .[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.012854 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 35
     [LightGBM] [Info] Number of data points in the train set: 100, number of used features: 1
     [LightGBM] [Info] Start training from score 0.167151
     ...
     lgb.convert_with_rules:
     lgb.convert_with_rules(): .........................................................................................................................................
     lgb.importance:
     lgb.importance: ...........
     lgb.interprete:
     lgb.interprete: [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.050967 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ....W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.002369 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 77
     [LightGBM] [Info] Number of data points in the train set: 90, number of used features: 4
     [LightGBM] [Info] Start training from score -1.504077
     [LightGBM] [Info] Start training from score -1.098612
     [LightGBM] [Info] Start training from score -0.810930
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ....
     lgb.plot.importance:
     lgb.plot.importance(): [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.022767 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] [binary:BoostFromScore]: pavg=0.482113 -> initscore=-0.071580
     [LightGBM] [Info] Start training from score -0.071580
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .........
     lgb.plot.interpretation:
     lgb.plot.interpretation: [LightGBM] [Info] Number of positive: 3140, number of negative: 3373
     [LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.019660 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ..W[LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003010 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Info] Total Bins 77
     [LightGBM] [Info] Number of data points in the train set: 90, number of used features: 4
     [LightGBM] [Info] Start training from score -1.504077
     [LightGBM] [Info] Start training from score -1.098612
     [LightGBM] [Info] Start training from score -0.810930
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     .
     lgb.unloader:
     lgb.unloader: W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.023915 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ...W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.009598 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     W[LightGBM] [Warning] Auto-choosing row-wise multi-threading, the overhead of testing was 0.014010 seconds.
     You can set `force_row_wise=true` to remove the overhead.
     And if memory is not enough, you can set `force_col_wise=true`.
     [LightGBM] [Info] Total Bins 232
     [LightGBM] [Info] Number of data points in the train set: 6513, number of used features: 116
     [LightGBM] [Info] Start training from score 0.482113
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ....
     metrics:
     .METRICS_HIGHER_BETTER(): ...
     parameters:
     feature penalties: WWWWWWWWWWW.......
     parameter aliases: ............
     utils:
     lgb.params2str: ....
     lgb.check.eval: ..........
     lgb.check.wrapper_param: .......
     weighted_loss:
     Case weights are respected: [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003011 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003268 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.006920 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] Auto-choosing col-wise multi-threading, the overhead of testing was 0.003034 seconds.
     You can set `force_col_wise=true` to remove the overhead.
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     [LightGBM] [Warning] No further splits with positive gain, best gain: -inf
     ...
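Editor's note: most warnings in the section below come from passing LightGBM tuning parameters through `...` rather than via the `params` list, which lightgbm deprecates. A minimal sketch of the recommended calling pattern, using the `agaricus.train` example data shipped with the package (parameter values here are illustrative, not taken from the tests above):

```r
library(lightgbm)

data(agaricus.train, package = "lightgbm")
dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)

# Put objective, metric, and tuning parameters inside 'params'
# instead of passing them through '...', which triggers the
# deprecation warnings shown below.
params <- list(
  objective = "binary"
  , metric = "binary_logloss"
  , num_leaves = 31L
  , learning_rate = 0.1
)
bst <- lgb.train(params = params, data = dtrain, nrounds = 2L)
```

Calls of the form `lgb.train(data = dtrain, nrounds = 2L, objective = "binary")` still run, but emit the "Found the following passed through '...'" warning that lightgbm states will become an error in a future release.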
    
     ══ Warnings ════════════════════════════════════════════════════════════════════
     1. Predictor$finalize() should not fail (test_Predictor.R:7:5) - lgb.train: Found the following passed through '...': objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     2. predictions do not fail for integer input (test_Predictor.R:33:5) - lgb.train: Found the following passed through '...': objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     3. start_iteration works correctly (test_Predictor.R:62:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     4. train and predict binary classification (test_basic.R:72:3) - lgb.train: Found the following passed through '...': num_leaves, objective, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     5. train and predict softmax (test_basic.R:100:3) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, min_data, min_hessian, objective, metric, num_class. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     6. use of multiple eval metrics works (test_basic.R:125:3) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     7. lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as expected for binary classification (test_basic.R:147:3) - lgb.train: Found the following passed through '...': num_leaves, objective, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     8. lgb.Booster.upper_bound() and lgb.Booster.lower_bound() work as expected for regression (test_basic.R:163:3) - lgb.train: Found the following passed through '...': num_leaves, objective, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     9. lightgbm() performs evaluation on validation sets if they are provided (test_basic.R:202:3) - lgb.train: Found the following passed through '...': num_leaves, objective, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     10. cv works (test_basic.R:272:3) - lgb.cv: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.cv for documentation on how to call this function.
    
     11. lightgbm.cv() gives the correct best_score and best_iter for a metric where higher values are better (test_basic.R:329:3) - lgb.cv: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.cv for documentation on how to call this function.
    
     12. lgb.cv() respects showsd argument (test_basic.R:397:3) - lgb.cv: Found the following passed through '...': min_data. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.cv for documentation on how to call this function.
    
     13. lgb.cv() respects showsd argument (test_basic.R:407:3) - lgb.cv: Found the following passed through '...': min_data. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.cv for documentation on how to call this function.
    
     14. lgb.train() works as expected with multiple eval metrics (test_basic.R:429:3) - lgb.train: Found the following passed through '...': learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     15. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1345:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     16. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1367:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     17. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1390:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     18. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1414:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     19. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1438:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     20. when early stopping is not activated, best_iter and best_score come from valids and not training data (test_basic.R:1463:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     21. lightgbm.train() gives the correct best_score and best_iter for a metric where higher values are better (test_basic.R:1495:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     22. using lightgbm() without early stopping, best_iter and best_score come from valids and not training data (test_basic.R:1548:3) - lgb.train: Found the following passed through '...': num_leaves. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     23. lgb.train() works with linear learners and data where a feature has only 1 non-NA value (test_basic.R:1923:3) - lgb.Dataset: Found the following passed through '...': feature_pre_filter. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.Dataset for documentation on how to call this function.
    
     24. lgb.Dataset: basic construction, saving, loading (test_dataset.R:16:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     25. lgb.Dataset: basic construction, saving, loading (test_dataset.R:16:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     26. lgb.Dataset: basic construction, saving, loading (test_dataset.R:26:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     27. lgb.Dataset: basic construction, saving, loading (test_dataset.R:26:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     28. lgb.Dataset: getinfo & setinfo (test_dataset.R:34:3) - Calling setinfo() on a lgb.Dataset is deprecated. Use set_field() instead.
    
     29. lgb.Dataset: getinfo & setinfo (test_dataset.R:35:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     30. lgb.Dataset: getinfo & setinfo (test_dataset.R:36:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     31. lgb.Dataset: getinfo & setinfo (test_dataset.R:38:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     32. lgb.Dataset: getinfo & setinfo (test_dataset.R:39:3) - Calling getinfo() on a lgb.Dataset is deprecated. Use get_field() instead.
    
     33. lgb.Dataset: getinfo & setinfo (test_dataset.R:42:3) - Calling setinfo() on a lgb.Dataset is deprecated. Use set_field() instead.
    
     34. Dataset$slice() supports passing additional parameters through '...' (test_dataset.R:73:3) - Dataset$slice(): Found the following passed through '...': feature_pre_filter. These are ignored and should be removed. To change the parameters of a Dataset produced by Dataset$slice(), use Dataset$set_params(). To modify attributes like 'init_score', use Dataset$set_field(). In future releases of lightgbm, this warning will become an error.
    
     35. Dataset$slice() supports passing Dataset attributes through '...' (test_dataset.R:88:3) - Dataset$slice(): Found the following passed through '...': init_score. These are ignored and should be removed. To change the parameters of a Dataset produced by Dataset$slice(), use Dataset$set_params(). To modify attributes like 'init_score', use Dataset$set_field(). In future releases of lightgbm, this warning will become an error.
    
     36. Dataset$slice() supports passing Dataset attributes through '...' (test_dataset.R:94:3) - Dataset$getinfo() is deprecated and will be removed in a future release. Use Dataset$get_field() instead.
    
     37. Dataset$slice() supports passing Dataset attributes through '...' (test_dataset.R:95:3) - Dataset$getinfo() is deprecated and will be removed in a future release. Use Dataset$get_field() instead.
    
     38. lgb.Dataset$setinfo() should convert 'group' to integer (test_dataset.R:268:3) - Dataset$getinfo() is deprecated and will be removed in a future release. Use Dataset$get_field() instead.
    
     39. lgb.Dataset$setinfo() should convert 'group' to integer (test_dataset.R:271:3) - Dataset$setinfo() is deprecated and will be removed in a future release. Use Dataset$set_field() instead.
    
     40. lgb.Dataset$setinfo() should convert 'group' to integer (test_dataset.R:272:3) - Dataset$getinfo() is deprecated and will be removed in a future release. Use Dataset$get_field() instead.
    
     41. learning-to-rank with lgb.cv() works as expected (test_learning_to_rank.R:84:5) - lgb.cv: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.cv for documentation on how to call this function.
    
     42. Booster$finalize() should not fail (test_lgb.Booster.R:9:5) - lgb.train: Found the following passed through '...': objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     43. lgb.get.eval.result() should throw an informative error for incorrect data_name (test_lgb.Booster.R:59:5) - lgb.train: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     44. lgb.get.eval.result() should throw an informative error for incorrect eval_name (test_lgb.Booster.R:92:5) - lgb.train: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     45. lgb.load() gives the expected error messages given different incorrect inputs (test_lgb.Booster.R:126:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     46. Loading a Booster from a text file works (test_lgb.Booster.R:170:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     47. Loading a Booster from a string works (test_lgb.Booster.R:243:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     48. Saving a large model to string should work (test_lgb.Booster.R:273:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     49. Saving a large model to JSON should work (test_lgb.Booster.R:315:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     50. If a string and a file are both passed to lgb.load() the file is used model_str is totally ignored (test_lgb.Booster.R:343:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     51. Creating a Booster from a Dataset with an existing predictor should work (test_lgb.Booster.R:397:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     52. Booster$rollback_one_iter() should work as expected (test_lgb.Booster.R:479:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     53. Booster$update() passing a train_set works as expected (test_lgb.Booster.R:511:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     54. Booster$update() passing a train_set works as expected (test_lgb.Booster.R:532:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     55. Booster$update() throws an informative error if you provide a non-Dataset to update() (test_lgb.Booster.R:555:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     56. Saving a model with different feature importance types works (test_lgb.Booster.R:646:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     57. Saving a model with unknown importance type fails (test_lgb.Booster.R:699:5) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     58. lgb.intereprete works as expected for multiclass classification (test_lgb.interprete.R:82:5) - lgb.train: Found the following passed through '...': min_data. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     59. lgb.plot.interepretation works as expected for multiclass classification (test_lgb.plot.interpretation.R:80:5) - lgb.train: Found the following passed through '...': min_data. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     60. lgb.unloader works as expected (test_lgb.unloader.R:7:5) - lgb.train: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     61. lgb.unloader finds all boosters and removes them (test_lgb.unloader.R:27:5) - lgb.train: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     62. lgb.unloader finds all boosters and removes them (test_lgb.unloader.R:37:5) - lgb.train: Found the following passed through '...': min_data, learning_rate. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function.
    
     63.-73. Feature penalties work properly (test_parameters.R:14:3) - lgb.train: Found the following passed through '...': num_leaves, learning_rate, objective, feature_penalty, metric. These will be used, but in future releases of lightgbm, this warning will become an error. Add these to 'params' instead. See ?lgb.train for documentation on how to call this function. (identical warning repeated 11 times, items 63-73)
    
     ══ Failed ══════════════════════════════════════════════════════════════════════
     ── 1. Failure (test_lgb.Booster.R:469:5): Booster$eval() should work on a Datase
     `eval_in_mem` not identical to `eval_from_file`.
     Objects equal but not identical
    
     ══ DONE ════════════════════════════════════════════════════════════════════════
     Error: Test failures
     Execution halted
Flavor: r-devel-windows-x86_64-gcc10-UCRT
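
The two warning families in the log above (the `Dataset$setinfo()`/`getinfo()` deprecations and the parameters "passed through '...'") both prescribe their own fix. A minimal sketch of the recommended calling patterns, assuming the lightgbm 3.x R API and the `agaricus.train` demo data bundled with the package:

```r
library(lightgbm)

# Small built-in example dataset shipped with the package
data(agaricus.train, package = "lightgbm")
dtrain <- lgb.Dataset(agaricus.train$data, label = agaricus.train$label)

# set_field()/get_field() replace the deprecated setinfo()/getinfo()
dtrain$construct()
dtrain$set_field("label", agaricus.train$label)
labels <- dtrain$get_field("label")

# Pass tuning parameters through `params` rather than `...`,
# which avoids the "passed through '...'" warning shown above
model <- lgb.train(
    params = list(
        objective = "binary"
      , num_leaves = 31L
      , learning_rate = 0.05
    )
  , data = dtrain
  , nrounds = 10L
)
```

With parameters in `params`, the same code should keep working once the warning is promoted to an error in later releases.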

Version: 3.2.1
Check: installed package size
Result: NOTE
     installed size is 54.8Mb
     sub-directories of 1Mb or more:
     libs 54.2Mb
Flavor: r-release-macos-arm64