
enable convert_lightgbm to output tensor type #451

Open
huzq2016 opened this issue Mar 11, 2021 · 12 comments

huzq2016 commented Mar 11, 2021

Description
When I used convert_lightgbm, I got a graph (screenshot: lightgbm) whose probabilities output is a map type (for example, [{0: 0.9, 1: 0.1}, {0: 0.2, 1: 0.8}]), which is not supported by onnxruntime server so far (see microsoft/onnxruntime#2385).

For comparison, when I used convert_xgboost, I got a graph (screenshot: xgboost) whose probabilities output is a tensor (i.e., [0.1, 0.8]), which is supported by onnxruntime server.

The map type in convert_lightgbm is most likely caused by the ZipMap node added at https://github.com/onnx/onnxmltools/blob/master/onnxmltools/convert/lightgbm/operator_converters/LightGbm.py#L454.
The corresponding code in convert_xgboost is https://github.com/onnx/onnxmltools/blob/master/onnxmltools/convert/xgboost/operator_converters/XGBoost.py#L291, which apparently does not add a ZipMap node.
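
For anyone who wants to check this on their own converted model, a minimal sketch of how to look for the ZipMap node (the model file name is an illustrative assumption):

import onnx

model = onnx.load('lightgbm_model.onnx')
# A non-empty list means the probabilities go through a ZipMap node,
# i.e. they are emitted as a sequence of maps rather than a tensor.
print([node.op_type for node in model.graph.node if node.op_type == 'ZipMap'])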

Describe the solution you'd like
Conversion of LightGBM models into ONNX should be able to output probabilities as a tensor type.

wenbingl (Member) commented:

@xadupre, can you help with that?


xadupre commented Mar 11, 2021

It is possible to remove the ZipMap operator by using options. The example "Convert a pipeline with a LightGBM model" shows how to use sklearn-onnx with a LightGBM model. A second example, "One model, many possible conversions with options", shows how to remove the ZipMap node.
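
A minimal sketch of that approach, assuming an LGBMClassifier trained through the scikit-learn API (the toy data, estimator settings, and input name are illustrative):

import numpy
from lightgbm import LGBMClassifier
from skl2onnx import convert_sklearn, update_registered_converter
from skl2onnx.common.data_types import FloatTensorType
from skl2onnx.common.shape_calculator import calculate_linear_classifier_output_shapes
from onnxmltools.convert.lightgbm.operator_converters.LightGbm import convert_lightgbm

X = numpy.random.rand(100, 4).astype(numpy.float32)
y = (X.sum(axis=1) > 2).astype(numpy.int64)
clf = LGBMClassifier(n_estimators=5).fit(X, y)

# Tell skl2onnx how to convert the LightGBM model and which options it supports.
update_registered_converter(
    LGBMClassifier, "LightGbmLGBMClassifier",
    calculate_linear_classifier_output_shapes, convert_lightgbm,
    options={"zipmap": [True, False]})

# 'zipmap': False drops the ZipMap node, so probabilities come out
# as a plain float tensor instead of a sequence of maps.
onx = convert_sklearn(
    clf,
    initial_types=[("input", FloatTensorType([None, X.shape[1]]))],
    options={id(clf): {"zipmap": False}})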

huzq2016 (Author) commented:

Thanks @xadupre.
The example is only for sklearn-onnx. I wonder if a similar solution can be implemented for convert_lightgbm in onnxmltools?


xadupre commented Mar 11, 2021

It could be, but I would probably choose a simpler way: just adding a parameter, convert_lightgbm(..., zipmap=True).

huzq2016 (Author) commented:

@xadupre, actually we have tried sklearn-onnx, but it always failed due to some issue. So it would be better if this could be implemented here.


xadupre commented Mar 12, 2021

OK, I have two bugs to fix then. I'll start by adding a zipmap parameter to convert_lightgbm.


xadupre commented Mar 12, 2021

I started working on this in PR #452.

huzq2016 (Author) commented:

Your kind help is very much appreciated, @xadupre.


huzq2016 commented Mar 15, 2021

@xadupre, sorry, I forgot to mention that my model type is lightgbm.basic.Booster.

The code snippet is as follows:

import lightgbm

# first used mmlspark.LightGBMClassifier to train a model,
# then saved the model,
# then used lightgbm.Booster to load the model
bst = lightgbm.Booster(model_file='mmpspark_lightgbmclassifier_model.txt')

print('The type of model: ', type(bst))
# The type of model:  <class 'lightgbm.basic.Booster'>

I found that the converter code here does not handle the Booster type.

xadupre added a commit that referenced this issue Mar 15, 2021
* Enable option zipmap for LGBM converter
* add one more unittest
* support booster
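
For reference, a minimal sketch of what converting a Booster could look like once that commit lands, assuming the zipmap flag and Booster support it describes (the feature count and output path are illustrative):

import lightgbm
from onnxmltools import convert_lightgbm
from onnxmltools.convert.common.data_types import FloatTensorType

# Load the Booster saved from mmlspark, as in the snippet above.
bst = lightgbm.Booster(model_file='mmpspark_lightgbmclassifier_model.txt')

# zipmap=False asks the converter to skip the ZipMap node so that
# probabilities are emitted as a plain tensor.
initial_type = [('input', FloatTensorType([None, 4]))]  # 4 = number of features
onnx_model = convert_lightgbm(bst, initial_types=initial_type, zipmap=False)

with open('lightgbm_booster.onnx', 'wb') as f:
    f.write(onnx_model.SerializeToString())
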
Bhuvanamitra commented:

Hi, I am having the same issue. What is the solution for this?
I trained a LightGBM binary classifier using LightGBM.exe, and the probabilities output type is 'SEQUENCE TYPE'.
I am getting an 'Allocation Failure' during inference because of this.
Please let me know the solution.
Thanks.


xadupre commented Aug 23, 2021

A new version was released two days ago. Did you try with this one?

abgoswam commented:

Verified the following versions, and the fix works well:

  • lightgbm: 3.3.2
  • onnxmltools: 1.11.1

    onnx_model = convert_lightgbm(
        clf,
        initial_types=initial_type,
        zipmap=False)

I suppose we can close this issue. Thanks @xadupre for the fix.
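
For completeness, a self-contained sketch of that verified usage; the toy data, classifier settings, and the final output-type check are illustrative assumptions:

import numpy
from lightgbm import LGBMClassifier
from onnxmltools import convert_lightgbm
from onnxmltools.convert.common.data_types import FloatTensorType

X = numpy.random.rand(100, 4).astype(numpy.float32)
y = (X.sum(axis=1) > 2).astype(numpy.int64)
clf = LGBMClassifier(n_estimators=5).fit(X, y)

initial_type = [('input', FloatTensorType([None, X.shape[1]]))]
onnx_model = convert_lightgbm(clf, initial_types=initial_type, zipmap=False)

# With zipmap=False the probabilities output should be a plain tensor
# ('tensor_type') rather than a sequence of maps ('sequence_type').
print([(o.name, o.type.WhichOneof('value')) for o in onnx_model.graph.output])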
