Replace Embedding to use nn.Embedding from pytorch #428

Open
@ravinkohli

Description

Due to our current implementation of the Embedding module, we are forced to one-hot encode all categorical columns, which leads to an explosion in memory usage. We can avoid this by using nn.Embedding from PyTorch, which provides the same functionality but takes integer category indices instead of one-hot encoded columns. This will also allow us to one-hot encode only the columns whose number of categories is less than min_categories_for_embedding.
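A minimal sketch of the idea, assuming one nn.Embedding per categorical column fed with ordinally encoded category indices (the column size and embedding dimension here are illustrative, not taken from the actual implementation):

```python
import torch
import torch.nn as nn

# One categorical column with 10 distinct categories (assumed for illustration).
num_categories = 10
embed_dim = 4  # size of the learned embedding vector per category (assumed)

# nn.Embedding maps an integer index in [0, num_categories) to a dense vector,
# so no one-hot expansion of the input is needed.
embedding = nn.Embedding(num_embeddings=num_categories, embedding_dim=embed_dim)

# A batch of 3 rows; each value is the ordinal-encoded category of that row.
x = torch.tensor([1, 7, 3])
out = embedding(x)
print(out.shape)  # a (3, embed_dim) float tensor, one embedding per row
```

The memory saving comes from the input side: the batch stays a single integer column rather than a num_categories-wide one-hot matrix, while the embedding table itself is the same set of learned weights either way.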

Metadata

Labels

enhancement (New feature or request)
