I'm using the new xgboost v2.0 for multi-output regression and the function plot_tree crashes.
This is an example inspired by the documentation: https://xgboost.readthedocs.io/en/stable/python/examples/multioutput_regression.html
import numpy as np
import xgboost as xgb

rng = np.random.RandomState(1994)
X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)
y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
y[::5, :] += 0.5 - rng.rand(20, 2)
y = y - y.min()
y = y / y.max()

reg = xgb.XGBRegressor(
    tree_method="hist",
    n_estimators=128,
    n_jobs=16,
    max_depth=8,
    multi_strategy="multi_output_tree",
    subsample=0.6,
)
reg.fit(X, y, eval_set=[(X, y)])
xgb.plot_tree(reg)
It's a highly experimental feature as noted in the demo. Anything not shown in that demo can be assumed to be not yet supported.
Closing in favor of #9043.
Just to clarify, xgboost has worked with multiclass classification since day 1. The issue here is using regression with multi_output_tree.
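For reference, a minimal sketch of the case that does work: plotting an individual tree from an ordinary multiclass classifier. The dataset and hyperparameters below are illustrative only, and plotting requires the optional graphviz dependency.

import xgboost as xgb
from sklearn.datasets import make_classification

# Small synthetic 3-class classification problem (illustrative only).
X, y = make_classification(
    n_samples=200, n_features=5, n_informative=3, n_classes=3, random_state=0
)

clf = xgb.XGBClassifier(tree_method="hist", n_estimators=10, max_depth=3)
clf.fit(X, y)

# Plotting a single tree works for classification models;
# num_trees selects which tree in the ensemble to draw.
xgb.plot_tree(clf, num_trees=0)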
Ok, thanks! I will wait until it is implemented!