Comments (12)
According to this and this issue, it can be implemented as follows:
def set_regularization(model,
                       kernel_regularizer=None,
                       bias_regularizer=None,
                       activity_regularizer=None):
    for layer in model.layers:
        # set kernel_regularizer
        if kernel_regularizer is not None and hasattr(layer, 'kernel_regularizer'):
            layer.kernel_regularizer = kernel_regularizer
        # set bias_regularizer
        if bias_regularizer is not None and hasattr(layer, 'bias_regularizer'):
            layer.bias_regularizer = bias_regularizer
        # set activity_regularizer
        if activity_regularizer is not None and hasattr(layer, 'activity_regularizer'):
            layer.activity_regularizer = activity_regularizer

# example
set_regularization(model, kernel_regularizer=keras.regularizers.l2(0.0001))
model.compile(...)  # you have to recompile the model if regularization is changed
I did not test this code; if it works, it can be added as a utils function.
from segmentation_models.
Hi, @Tyler-D
Did you mean the possibility of adding regularisation to all convolution layers of the model?
Well, I think it would be better if there were a function that adds a specific regularizer to all layers.
Cool, that's exactly the function I want. I could help add it. What kind of tests do you need?
Actually, I'm wondering whether it's possible to build a full segmentation task pipeline on top of your repo, including: training, evaluation, data loaders for public datasets (e.g. Pascal VOC, COCO), and even an export tool that converts the Keras model to an inference framework (e.g. TensorRT). Then I'm sure this repository would be extremely appealing.
Just test that it works as expected:
- Regularization appears in conv/dense layers and is applied during training.
- A saved/loaded model keeps its regularization.
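A minimal sketch of such a test, assuming tf.keras is available (the tiny Dense model here is purely illustrative, and only the kernel regularizer is handled):

```python
import tensorflow as tf

def set_regularization(model, kernel_regularizer=None):
    # same idea as above: mutate the attribute, then rebuild from config
    for layer in model.layers:
        if kernel_regularizer is not None and hasattr(layer, 'kernel_regularizer'):
            layer.kernel_regularizer = kernel_regularizer
    out = tf.keras.models.model_from_json(model.to_json())
    out.set_weights(model.get_weights())
    return out

model = tf.keras.Sequential([tf.keras.Input(shape=(8,)), tf.keras.layers.Dense(4)])
assert len(model.losses) == 0  # no regularization yet

new_model = set_regularization(model, tf.keras.regularizers.l2(1e-4))
assert len(new_model.losses) > 0  # the l2 penalty now shows up as a model loss
```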
A segmentation pipeline is a cool idea; however, I think it should be built in another repo or written as an example here.
If you can recommend any good repos with such a pipeline, that would be extremely helpful!
I've tried the code you offered in my training scripts, and the thing is that only the model config is changed. After investigating, I found this. A workaround can be found here:
from keras.models import model_from_json
from keras.regularizers import l2

def create_model():
    model = your_model()
    model.save_weights("tmp.h5")
    # optionally do some other modifications (freezing layers, adding convolutions etc.)
    ....
    regularizer = l2(WEIGHT_DECAY / 2)
    for layer in model.layers:
        for attr in ['kernel_regularizer', 'bias_regularizer']:
            if hasattr(layer, attr) and layer.trainable:
                setattr(layer, attr, regularizer)
    # rebuild the model from its updated config, then restore the weights
    out = model_from_json(model.to_json())
    out.load_weights("tmp.h5", by_name=True)
    return out
This doesn't seem like an elegant way to do things. I'm thinking about how to refactor it.
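The behaviour the investigation turned up can be illustrated without Keras at all: if an object captures its configuration-derived state at construction time, mutating the attribute afterwards only changes what gets serialized, and you have to rebuild from the config to pick the change up. A toy sketch (the `ToyLayer` class is purely illustrative, not part of Keras):

```python
import json

class ToyLayer:
    """Stand-in for a Keras layer: loss terms are fixed at construction."""
    def __init__(self, regularizer=None):
        self.regularizer = regularizer
        # "built" state, captured once (like Keras building loss tensors)
        self.losses = [regularizer] if regularizer else []

    def get_config(self):
        return {'regularizer': self.regularizer}

layer = ToyLayer()
layer.regularizer = 'l2'   # mutate the attribute after construction
assert layer.losses == []  # ...but the built state is unchanged

# rebuilding from the serialized config is what picks the change up
config = json.loads(json.dumps(layer.get_config()))
rebuilt = ToyLayer(**config)
assert rebuilt.losses == ['l2']
```

This is exactly the `model_from_json(model.to_json())` round trip in the workaround above, with `load_weights` restoring the parameters the rebuild would otherwise reinitialize.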
Yes, I agree, this is not an elegant way.
Here is another not-so-elegant way, but at least it does not require saving the model:
def set_regularization(model,
                       kernel_regularizer=None,
                       bias_regularizer=None,
                       activity_regularizer=None):
    for layer in model.layers:
        # set kernel_regularizer
        if kernel_regularizer is not None and hasattr(layer, 'kernel_regularizer'):
            layer.kernel_regularizer = kernel_regularizer
        # set bias_regularizer
        if bias_regularizer is not None and hasattr(layer, 'bias_regularizer'):
            layer.bias_regularizer = bias_regularizer
        # set activity_regularizer
        if activity_regularizer is not None and hasattr(layer, 'activity_regularizer'):
            layer.activity_regularizer = activity_regularizer
    # rebuild from config so the new regularizers actually take effect
    out = model_from_json(model.to_json())
    out.set_weights(model.get_weights())
    return out

new_model = set_regularization(model, kernel_regularizer=keras.regularizers.l2(0.0001))
new_model.compile(...)
Hi @qubvel. I've tested the new implementation, and it works well! You can add it in #54.
Hi @Tyler-D, ok, no problem
Try this:
# a utility function to add weight decay after the model is defined
def add_weight_decay(model, weight_decay):
    if (weight_decay is None) or (weight_decay == 0.0):
        return

    # recurse into nested models
    def add_decay_loss(m, factor):
        if isinstance(m, tf.keras.Model):
            for layer in m.layers:
                add_decay_loss(layer, factor)
        else:
            for param in m.trainable_weights:
                with tf.keras.backend.name_scope('weight_regularizer'):
                    # bind param as a default argument; a plain closure would
                    # make every lambda evaluate the last parameter
                    regularizer = lambda p=param: tf.keras.regularizers.l2(factor)(p)
                    m.add_loss(regularizer)

    # weight decay and l2 regularization differ by a factor of 2
    add_decay_loss(model, weight_decay / 2.0)
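One subtlety in the inner loop: Python closures bind loop variables late, so the lambda needs to capture `param` eagerly (e.g. via a default argument, `lambda p=param: ...`), otherwise every stored lambda would end up evaluating the last parameter. A minimal pure-Python demonstration:

```python
params = [1.0, 2.0, 3.0]

# late binding: all three lambdas close over the same variable `param`
late = [lambda: param ** 2 for param in params]
assert [f() for f in late] == [9.0, 9.0, 9.0]  # every call sees the last value

# eager binding via a default argument captures each value at definition time
eager = [lambda p=param: p ** 2 for param in params]
assert [f() for f in eager] == [1.0, 4.0, 9.0]
```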