
Update readme
rabea-al committed Dec 9, 2024
1 parent db08a4a commit 61b3c72
Showing 1 changed file, README.md, with 7 additions and 7 deletions.
@@ -78,37 +78,37 @@
Defines a custom PyTorch model with optional flattening and a configurable optimizer.

<img src="https://github.com/user-attachments/assets/f5122457-b6aa-45b4-b350-361c2bc23ade" alt="TorchModel" width="200" height="150" />
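The description above suggests a sequential model with optional flattening and a configurable optimizer. A minimal PyTorch sketch of that idea (the layer sizes, optimizer choice, and learning rate are illustrative assumptions, not the component's actual defaults):

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a sequential model with an optional flattening
# step, plus a configurable optimizer. All sizes are illustrative.
model = nn.Sequential(
    nn.Flatten(),            # optional flattening of (C, H, W) inputs
    nn.Linear(28 * 28, 10),  # maps flattened input to 10 outputs
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

out = model(torch.randn(4, 1, 28, 28))  # (4, 1, 28, 28) -> (4, 784) -> (4, 10)
```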

-### TorchAddLinearLayer:
+### TorchAddLinearLayer Component:
Adds a Linear Layer to a sequential model with specified input and output features.

<img src="https://github.com/user-attachments/assets/bc301ff9-7e1f-4e4d-bbf4-a65504ffd4e3" alt="TorchAddLinearLayer" width="200" height="125" />
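This presumably wraps `torch.nn.Linear`; a minimal sketch of the underlying call (the feature sizes are illustrative assumptions):

```python
import torch
import torch.nn as nn

# Add a Linear layer with specified input and output features to a
# sequential model. The sizes below are illustrative.
model = nn.Sequential()
model.add_module("linear", nn.Linear(in_features=64, out_features=32))

out = model(torch.randn(8, 64))  # batch of 8 vectors with 64 features each
```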

-### TorchAddConv1DLayer:
+### TorchAddConv1DLayer Component:
Adds a 1D convolutional layer with configurable kernel size, stride, padding, and other parameters.

<img src="https://github.com/user-attachments/assets/b362d044-5110-40a3-af68-e7f154cf01cc" alt="TorchAddConv1DLayer" width="200" height="225" />
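An illustrative use of `torch.nn.Conv1d` with the parameters the component exposes (the specific values below are assumptions):

```python
import torch
import torch.nn as nn

# 1D convolution with configurable kernel size, stride, and padding.
conv = nn.Conv1d(in_channels=16, out_channels=32,
                 kernel_size=3, stride=1, padding=1)

# Conv1d expects (batch, channels, length); padding=1 with kernel_size=3
# and stride=1 preserves the sequence length.
out = conv(torch.randn(4, 16, 100))
```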

-### TorchAddTransformerEncoderLayer:
+### TorchAddTransformerEncoderLayer Component:
Inserts a Transformer Encoder layer with multi-head attention and configurable feedforward dimensions and activation.

<img src="https://github.com/user-attachments/assets/741f14d0-cc2b-4212-89b2-23ef500e8f32" alt="TorchAddTransformerEncoderLayer" width="225" height="250" />
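A sketch of the underlying `torch.nn.TransformerEncoderLayer` call; the `d_model`, head count, feedforward size, and activation values here are assumptions:

```python
import torch
import torch.nn as nn

# Transformer encoder layer: multi-head self-attention plus a
# feedforward block with a configurable dimension and activation.
layer = nn.TransformerEncoderLayer(
    d_model=128, nhead=8, dim_feedforward=256,
    activation="gelu", batch_first=True,
)

# Input is (batch, sequence, d_model); the output keeps the same shape.
out = layer(torch.randn(2, 10, 128))
```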

-### TorchAddTransformerDecoderLayer:
+### TorchAddTransformerDecoderLayer Component:
Adds a Transformer Decoder layer with attention mechanisms and flexible feedforward configurations.

<img src="https://github.com/user-attachments/assets/a04de634-3a32-47a1-942e-8b5475452582" alt="TorchAddTransformerDecoderLayer" width="225" height="250" />
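A sketch of the underlying `torch.nn.TransformerDecoderLayer` call (parameter values are assumptions). The decoder attends to both the target sequence and the encoder memory:

```python
import torch
import torch.nn as nn

# Transformer decoder layer: self-attention over the target plus
# cross-attention over the encoder output, with a configurable
# feedforward block.
layer = nn.TransformerDecoderLayer(
    d_model=128, nhead=8, dim_feedforward=256, batch_first=True,
)

tgt = torch.randn(2, 10, 128)     # target sequence
memory = torch.randn(2, 20, 128)  # encoder output
out = layer(tgt, memory)          # output shape follows the target
```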

-### TorchLSTM:
+### TorchLSTM Component:
Creates an LSTM layer for sequence modeling, supporting options for bidirectionality, dropout, and projection.

<img src="https://github.com/user-attachments/assets/9adc63bd-c1a5-40dd-be93-53a12ac48ea4" alt="TorchLSTM" width="200" height="225" />
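An illustrative `torch.nn.LSTM` call showing the bidirectionality, dropout, and projection options the description mentions (the sizes are assumptions):

```python
import torch
import torch.nn as nn

# LSTM for sequence modeling with bidirectionality, inter-layer
# dropout, and a projection of the hidden state.
lstm = nn.LSTM(
    input_size=32, hidden_size=64, num_layers=2,
    bidirectional=True, dropout=0.1, proj_size=16, batch_first=True,
)

# With proj_size=16 and two directions, each step outputs 2 * 16 features.
out, (h, c) = lstm(torch.randn(4, 25, 32))
```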

-### TorchAddReluLayer:
+### TorchAddReluLayer Component:
Adds a ReLU activation layer to a model, applying non-linearity to enhance model learning.

<img src="https://github.com/user-attachments/assets/ebae0ad9-35b2-4b53-b6e1-796ab44a063a" alt="TorchAddReluLayer" width="200" height="75" />
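The underlying activation is `torch.nn.ReLU`, which zeroes negative values and passes positive values through unchanged:

```python
import torch
import torch.nn as nn

# ReLU: max(0, x) applied elementwise.
relu = nn.ReLU()
out = relu(torch.tensor([-1.0, 0.0, 2.0]))  # -> [0.0, 0.0, 2.0]
```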

-### TorchAddDropoutLayer:
+### TorchAddDropoutLayer Component:
Incorporates a Dropout layer to reduce overfitting by randomly zeroing out input elements during training.

<img src="https://github.com/user-attachments/assets/61386afe-5ed5-41b9-8840-cc8f73b718a7" alt="TorchAddDropoutLayer" width="200" height="85" />
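A sketch of the underlying `torch.nn.Dropout` behavior (the probability here is an illustrative assumption). Elements are zeroed only in training mode; surviving values are scaled by 1/(1-p) so the expected activation is unchanged:

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)

drop.train()
noisy = drop(torch.ones(1000))  # each element is 0 or 2 (scaled by 1/(1-p))

drop.eval()
clean = drop(torch.ones(5))     # evaluation mode: identity pass-through
```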
