Not an issue, just a simple suggestion: would it be possible to introduce residual/skip connections in the form of extra directed arrows in the network graph? Perhaps the skip connection could feed into a node where the aggregation operator can be specified in LaTeX (addition, or any other function).
I understand that skip connections might be complex on the UX side, since they break the "linearity" of the list of layers; however, having them would allow modeling almost all CNN-based architectures, so I'd really appreciate the addition.
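To be concrete, here is a rough PyTorch-style sketch of what I mean by a skip connection ending in an aggregation node (the `SkipBlock` name and the `aggregate` argument are purely illustrative, not part of the tool):

```python
import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    """A stack of layers whose output is merged with the block input
    at an aggregation node (addition by default, but any function works)."""

    def __init__(self, channels, aggregate=torch.add):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        self.aggregate = aggregate  # the node where the skip arrow lands

    def forward(self, x):
        # The skip connection: x bypasses self.body and is merged at the end.
        return self.aggregate(x, self.body(x))
```

In the diagram, the aggregation would just be an extra node labeled with the operator (e.g. $+$), with one arrow coming from the block input and one from the last layer.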
In general, I've been trying to avoid adding support for network layouts that are best specified by code, because tensorboard model graphs already exist. The only diagrams of resnets I've seen so far look like the kind tensorboard would produce:
Doing some googling now, I saw this diagram, which was neat:
The second one sure does look very neat! When I originally wrote the post I had something like this in mind:
It's a very simplistic diagram of an EDSR-inspired network that I spun up on draw.io in a dozen minutes or so (spoiler: the architecture doesn't work).
So nothing as dense as RDN is needed; just a single skip connection would suffice.
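For illustration, a minimal sketch of that kind of layout (assuming PyTorch; the class and layer names are invented for this example, not taken from the original post):

```python
import torch.nn as nn

class TinyEDSRLike(nn.Module):
    """Illustrative only: a conv body wrapped in one global skip connection."""

    def __init__(self, channels=64, num_blocks=4):
        super().__init__()
        self.head = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.body = nn.Sequential(*[
            nn.Sequential(nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU())
            for _ in range(num_blocks)
        ])
        self.tail = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, x):
        feats = self.head(x)
        out = self.body(feats) + feats  # the single skip connection
        return self.tail(out)
```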