coding structure of gaussian noise layer

11 views (last 30 days)
jianY xu on 18 Sep 2018
Answered: Jack Xiao on 22 Feb 2021
I want to create a custom layer that adds noise to the data.
But my MATLAB version is R2017b, so I don't have the example file "gaussianNoiseLayer.m".
That file should be located at fullfile(matlabroot, 'examples', 'nnet', 'main', 'gaussianNoiseLayer.m') in R2018a or R2018b.
I really want to know the coding structure of a noise-adding layer.
If anyone has installed a newer version of MATLAB, could you send me a copy of this file?
email: xjy1236@sina.com Thank you very much!
  1 Comment
MAHSA YOUSEFI on 4 Jan 2021
Hi Jian.
Did you solve your problem with adding noise?
I want to add Gaussian noise to the input and to each hidden layer in my customized training loop.


Answers (1)

Jack Xiao on 22 Feb 2021
here is the code:
classdef gaussianNoiseLayer < nnet.layer.Layer
    % gaussianNoiseLayer   Gaussian noise layer
    %   A Gaussian noise layer adds random Gaussian noise to the input.
    %
    %   To create a Gaussian noise layer, use
    %   layer = gaussianNoiseLayer(sigma, name)

    properties
        % Standard deviation of the noise.
        Sigma
    end

    methods
        function layer = gaussianNoiseLayer(sigma, name)
            % layer = gaussianNoiseLayer(sigma, name) creates a Gaussian
            % noise layer and specifies the standard deviation and layer
            % name.
            layer.Name = name;
            layer.Description = ...
                "Gaussian noise with standard deviation " + sigma;
            layer.Type = "Gaussian Noise";
            layer.Sigma = sigma;
        end

        function Z = predict(layer, X)
            % Z = predict(layer, X) forwards the input data X through the
            % layer for prediction and outputs the result Z.
            % At prediction time, the output is equal to the input.
            Z = X;
        end

        function [Z, memory] = forward(layer, X)
            % Z = forward(layer, X) forwards the input data X through the
            % layer and outputs the result Z.
            % At training time, the layer adds Gaussian noise to the input.
            sigma = layer.Sigma;
            noise = randn(size(X)) * sigma;
            Z = X + noise;
            memory = [];
        end

        function dLdX = backward(layer, X, Z, dLdZ, memory)
            % dLdX = backward(layer, X, Z, dLdZ, memory) backward
            % propagates the derivative of the loss function through
            % the layer.
            % Since the added noise does not depend on the input, the
            % derivative dLdX is equal to dLdZ.
            dLdX = dLdZ;
        end
    end
end
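Once the class file gaussianNoiseLayer.m is saved on the MATLAB path, the layer can be placed in a layer array like any built-in layer. A minimal sketch, assuming Deep Learning Toolbox is installed; the network architecture, sizes, and layer names here are illustrative, not part of the original example:

```matlab
% Illustrative layer array using the custom noise layer
% (sigma = 0.1; sizes and names are placeholders):
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    gaussianNoiseLayer(0.1, 'noise')   % adds noise during training only
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
```

Because forward adds the noise and predict passes the input through unchanged, the noise acts as a regularizer during training and has no effect at inference time.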

Version

R2018b
