Feature: Automated Creation Based on Example for PyTorch Linear Modules with ReLU Activations #126
Conversation
Please feel free to give feedback and contribute to this feature, especially regarding the subsequent Keras support.
MyGodItsFull0fStars
left a comment
Great job on this pull request! Clear structure and use of examples.
    arch.append(pnn.to_begin())
    for idx, layer in enumerate(summary_list[2:], start=1):
        ...
        if layer.class_name == "Linear":
If more module types are parsed in the future, helper functions or a builder class keyed on layer.class_name would avoid cluttering the parse() function with further if-conditions.
I agree, and this becomes even more important with an increasing number of supported PyTorch modules (layers, activations). Though, I have oriented myself on the coding style in blocks.py and tikzeng.py.
Further, I propose to abstract the parsing and construction of the tikz code into a collection of layers that map to a similar tikz representation, plus a general activation representation that maps to all PyTorch-implemented activation functions.
Though, all of this could - and imo should - be an improvement and extension on top of this basic functionality.
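A minimal sketch of the dispatch idea discussed in this thread, replacing chained if-conditions with a lookup table. The builder functions, the LAYER_BUILDERS table, and the FakeLayer stand-in are all hypothetical illustrations, not the project's actual API; only the class_name attribute mirrors the torchinfo layer records used in the PR:

```python
# Hypothetical sketch: dispatch on layer.class_name via a table instead of
# growing if-conditions inside parse(). Builder names are illustrative only.

class FakeLayer:
    """Minimal stand-in for a torchinfo layer record (class_name only)."""
    def __init__(self, class_name):
        self.class_name = class_name

def build_linear(idx, layer):
    # Placeholder for the tikz code a Linear layer would emit.
    return f"linear_{idx}"

def build_relu(idx, layer):
    # Placeholder for the tikz code a ReLU activation would emit.
    return f"relu_{idx}"

LAYER_BUILDERS = {
    "Linear": build_linear,
    "ReLU": build_relu,
}

def parse_layer(idx, layer):
    """Look up a builder by class name; unsupported layers are skipped."""
    builder = LAYER_BUILDERS.get(layer.class_name)
    return builder(idx, layer) if builder is not None else None
```

Adding support for a new module type then means registering one builder in the table rather than editing the parse loop.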
git-thor
left a comment
Addressed comments from @MyGodItsFull0fStars
Ready as initial functionality for PyTorch automated generation support. Please review and merge if deemed OK.
Hey guys, any plan to support conv layers?
Hey @space192, I am currently occupied and unable to push the PR further, but we happily accept your extension to CNNs. You can create a PR against that branch of my fork, so it shows up here.
Work in progress
The PR addresses #124
Automated generation from a PyTorch module class (a child of `torch.nn.Module`), leveraging the `torchinfo` architecture summary interface, comparable with the TensorFlow/Keras `summary` method.

`Conv` is used for fully connected layers until a specialized function is provided.

TODOs for subsequent PRs
Addressed #124 with respect to PyTorch
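As a rough illustration of the flow this PR describes (not its actual implementation): torchinfo's summary yields a list of layer records, and the parser walks them, emitting a tikz block per supported layer. The LayerRecord class and the to_* functions below are simplified stand-ins for torchinfo's layer records and the pycore helpers; only the loop shape (`summary_list[2:]`, the `class_name == "Linear"` check) mirrors the PR's snippet:

```python
# Simplified sketch of the parse flow. LayerRecord and the to_* stubs are
# stand-ins; the real PR uses torchinfo records and pycore tikz helpers.

class LayerRecord:
    """Stand-in for a torchinfo layer record (class_name only)."""
    def __init__(self, class_name):
        self.class_name = class_name

def to_begin():
    return r"\begin{tikzpicture}"  # stand-in for the real begin block

def to_linear(idx):
    return f"% Linear block {idx}"  # stand-in for a real Linear tikz block

def to_end():
    return r"\end{tikzpicture}"  # stand-in for the real end block

def parse(summary_list):
    """Walk the summary, emitting one tikz block per supported layer."""
    arch = [to_begin()]
    # The first two records (root module and input) are skipped,
    # matching the summary_list[2:] slice in the PR's snippet.
    for idx, layer in enumerate(summary_list[2:], start=1):
        if layer.class_name == "Linear":
            arch.append(to_linear(idx))
    arch.append(to_end())
    return arch
```

With a summary of root, input, Linear, ReLU, Linear, this yields a begin block, two Linear blocks, and an end block; the ReLU entry is skipped until activations are supported.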