
register custom_op for fpEBC #2067

Closed
wants to merge 1 commit into from

Conversation

TroyGarden
Contributor

Summary:

context

  • convert `FeatureProcessedEmbeddingBagCollection` to a custom op in IR export
  • add serialization and deserialization functions for FPEBC
  • add an API to the `FeatureProcessorInterface` that exports the parameters necessary to create an instance
  • use this API (`get_init_kwargs`) in the serialize and deserialize functions to flatten and unflatten the feature processor

details

  1. Added an `FPEBCMetadata` schema for FP_EBC, which uses an `fp_json` string to store the necessary parameters
  2. Added `FPEBCJsonSerializer`, which converts the init kwargs to a JSON string and stores it in the `fp_json` field of the metadata
  3. Added an fqn check against `serialized_fqns`, so that when a higher-level module is serialized, the lower-level modules it contains can be skipped (they are already included in the higher-level module)
  4. Added a `get_init_kwargs` API to `FeatureProcessorsCollection` and `FeatureProcessor`, plus a `FeatureProcessorNameMap` that maps each class name to its feature processor class
  5. Added a `_non_strict_exporting_forward` function to FPEBC so that non-strict IR export goes through the custom-op logic
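The round trip described in items 1, 2, and 4 can be sketched roughly as follows. All names below (`DummyFeatureProcessor`, `FEATURE_PROCESSOR_NAME_MAP`, the helper functions) are illustrative assumptions, not torchrec's actual API:

```python
import json

# Hypothetical registry mapping a class name to its feature-processor class,
# in the spirit of the FeatureProcessorNameMap described above.
FEATURE_PROCESSOR_NAME_MAP = {}

def register_feature_processor(cls):
    FEATURE_PROCESSOR_NAME_MAP[cls.__name__] = cls
    return cls

@register_feature_processor
class DummyFeatureProcessor:
    def __init__(self, max_length: int, scale: float = 1.0):
        self.max_length = max_length
        self.scale = scale

    # The get_init_kwargs idea: export everything needed to rebuild the instance.
    def get_init_kwargs(self):
        return {"max_length": self.max_length, "scale": self.scale}

def serialize_fp(fp) -> str:
    # Flatten to the fp_json string that would live in the FPEBCMetadata schema.
    return json.dumps(
        {"classname": type(fp).__name__, "kwargs": fp.get_init_kwargs()}
    )

def deserialize_fp(fp_json: str):
    # Look up the class by name, then rebuild it from its exported init kwargs.
    payload = json.loads(fp_json)
    cls = FEATURE_PROCESSOR_NAME_MAP[payload["classname"]]
    return cls(**payload["kwargs"])

fp = DummyFeatureProcessor(max_length=10, scale=0.5)
restored = deserialize_fp(serialize_fp(fp))
assert restored.max_length == 10 and restored.scale == 0.5
```

The key design point is that only init kwargs cross the JSON boundary; the class itself is recovered through the name map, so the serialized form stays a plain string.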

Differential Revision: D57829276
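The fqn check in item 3 amounts to a prefix test: a module whose fully qualified name is nested under an already-serialized fqn is covered by its parent and can be skipped. A minimal sketch (the helper name is our own, not torchrec's):

```python
def should_skip(fqn: str, serialized_fqns: list) -> bool:
    # Skip a module whose fqn equals, or is nested under, an
    # already-serialized parent: its state is captured by the parent's
    # serialized form. The "." guard avoids false prefix matches
    # (e.g. "sparse.fp_ebc2" is not a child of "sparse.fp_ebc").
    return any(
        fqn == parent or fqn.startswith(parent + ".")
        for parent in serialized_fqns
    )

serialized = ["sparse.fp_ebc"]
assert should_skip("sparse.fp_ebc._embedding_bag_collection", serialized)
assert not should_skip("sparse.ebc", serialized)
```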

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Jun 4, 2024
@facebook-github-bot
Copy link
Contributor

This pull request was exported from Phabricator. Differential Revision: D57829276

TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 4, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 4, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 5, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 6, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 7, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 17, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 18, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 22, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 25, 2024
TroyGarden pushed a commit to TroyGarden/torchrec that referenced this pull request Jun 25, 2024
TroyGarden added a commit to TroyGarden/torchrec that referenced this pull request Jun 25, 2024
Summary:
Pull Request resolved: pytorch#2067

Reviewed By: PaulZhang12

Differential Revision: D57829276
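The `_non_strict_exporting_forward` mechanism in item 5 boils down to routing to an alternate forward while non-strict IR export is active, so the module traces as a single custom op. A rough dependency-free sketch, with hypothetical names (torchrec's actual dispatch condition differs):

```python
class FPEBCLike:
    """Toy stand-in for FeatureProcessedEmbeddingBagCollection."""

    def __init__(self):
        # In the real module this flag would be derived from the
        # export context rather than set by hand.
        self._non_strict_exporting = False

    def _non_strict_exporting_forward(self, features):
        # During non-strict IR export, take the custom-op path so the
        # whole module appears as one opaque op in the exported graph.
        return ("custom_op", features)

    def forward(self, features):
        if self._non_strict_exporting:
            return self._non_strict_exporting_forward(features)
        # Normal eager path.
        return ("eager", features)

m = FPEBCLike()
assert m.forward("kjt") == ("eager", "kjt")
m._non_strict_exporting = True
assert m.forward("kjt") == ("custom_op", "kjt")
```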
TroyGarden pushed a commit to TroyGarden/torchrec that referenced this pull request Jul 1, 2024
@TroyGarden TroyGarden deleted the export-D57829276 branch August 8, 2024 22:14
Labels: CLA Signed, fb-exported
2 participants