
pepper yuan


Active since 2016

Followers: 0   Following: 0

Statistics

MATLAB Answers
RANK: 221,902 of 300,840
REPUTATION: 0
CONTRIBUTIONS: 5 questions, 0 answers
ANSWER ACCEPTANCE: 60.0%
VOTES RECEIVED: 0

File Exchange
RANK: of 21,092
REPUTATION: N/A
AVERAGE RATING: 0.00
CONTRIBUTIONS: 0 files
DOWNLOADS: 0
ALL-TIME DOWNLOADS: 0

Cody
RANK: of 171,238
CONTRIBUTIONS: 0 problems, 0 solutions
SCORE: 0
NUMBER OF BADGES: 0

Discussions
CONTRIBUTIONS: 0 posts, 0 public channels, 0 discussions
AVERAGE RATING:
AVERAGE NUMBER OF LIKES:
Badges

  • Thankful Level 1

Feeds


Question


Conover's Two-Sample Squared Ranks Test for Equality of Variance
How to run Conover's Two-Sample Squared Ranks Test for Equality of Variance if I have two sets of samples (30 x 1 double)?

more than 8 years ago | 0 answers | 0

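MATLAB has no built-in function for Conover's squared ranks test, so below is a minimal sketch of the ties-corrected large-sample version; x and y are placeholders for the two 30x1 samples, and tiedrank/normcdf come from the Statistics and Machine Learning Toolbox.

% Conover's two-sample squared ranks test for equality of variance,
% large-sample normal approximation with tie correction (a sketch).
x = randn(30,1);                          % placeholder sample 1
y = 2*randn(30,1);                        % placeholder sample 2
n = numel(x);  m = numel(y);  N = n + m;

u = abs(x - mean(x));                     % deviations from each sample's mean
v = abs(y - mean(y));

r = tiedrank([u; v]);                     % joint ranks (midranks for ties)
T = sum(r(1:n).^2);                       % sum of squared ranks, sample 1

r2bar = mean(r.^2);                       % mean squared rank
denom = sqrt(n*m/(N*(N-1)) * (sum(r.^4) - N*r2bar^2));
z = (T - n*r2bar) / denom;                % standardized statistic
p = 2*(1 - normcdf(abs(z)))               % two-sided p-value vs. N(0,1)

With 30 observations per sample the normal approximation is generally adequate; equal variances are rejected when p falls below the chosen significance level.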

Question


Neural network with multiple inputs and single output - How to improve the performance of neural network?
Hello everyone! I would like to create a neural network with 5 input nodes. In the following I have created a simple code with t...

almost 10 years ago | 1 answer | 0

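For context, here is a minimal sketch of such a network, assuming the five inputs are stacked as rows of a 5xN matrix X with a 1xN target row T (both placeholders); the usual performance levers are the hidden-layer size, the training function, and repeated training runs.

% A sketch with placeholder data: X is 5 x N (one column per sample),
% T is 1 x N. fitnet defaults to one hidden layer and trainlm.
X = rand(5, 200);                      % placeholder inputs, 5 features
T = sum(X) + 0.1*randn(1, 200);        % placeholder targets

net = fitnet(10);                      % 10 hidden neurons
net.divideParam.trainRatio = 0.70;     % train/validation/test split
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

[net, tr] = train(net, X, T);
Y    = net(X);                         % network output
perf = perform(net, T, Y)              % mean squared error by default

Varying the hidden size, switching the training function to 'trainbr' (Bayesian regularization) to curb overfitting, and retraining several times (initial weights are random) are standard first steps when performance is poor.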

Question


How to change the transfer function in the hidden layer?
Hello everyone, as we know the default transfer function in the hidden layer is tansig; if I want to change the transfer function to ...

almost 10 years ago | 1 answer | 0

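The transfer function is a per-layer property of the network object; below is a minimal sketch replacing the default tansig with logsig on the hidden layer (data and sizes are placeholders).

% Layer 1 is the hidden layer; layer 2 is the output layer, which keeps
% purelin for a regression-style network.
net = feedforwardnet(10);
net.layers{1}.transferFcn = 'logsig';  % was 'tansig' by default

X = rand(5, 100);  T = sum(X);         % placeholder data
net = train(net, X, T);
view(net)                              % diagram confirms the new function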

Question


Neural network with multiple inputs and single output - How to change processing functions?
Hello everyone! I would like to create a neural network with 5 input nodes. In the following I have created a simple code with t...

almost 10 years ago | 1 answer | 0

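The input and output processing functions are likewise properties of the network object; here is a sketch swapping the default mapminmax scaling for mapstd (zero mean, unit variance), again with placeholder data.

% inputs{1} is the single input port; outputs{2} is the output attached
% to layer 2 of a two-layer feedforward network.
net = feedforwardnet(10);
net.inputs{1}.processFcns  = {'removeconstantrows', 'mapstd'};
net.outputs{2}.processFcns = {'removeconstantrows', 'mapstd'};

X = rand(5, 100);  T = sum(X);         % placeholder data
net = train(net, X, T);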

Question


How to save the trained network in nftool, and how to reuse the saved network to do prediction with new inputs (same number of elements but larger sample size)?
Actually I'm using nftool to train with 10959x5 as inputs and 10958x1 as target. After I trained the network, below is the script ...

almost 10 years ago | 0 answers | 0

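One way this is commonly handled, sketched here with placeholder names: nftool can export the trained object (typically named net) to the workspace. Since it is a network object, new data is evaluated by calling it directly or with sim rather than with the table-based predict command, and samples must be columns, so a 10959x5 matrix is transposed first.

% 'net' is assumed to be the network exported from nftool after training.
save('trainedNet.mat', 'net');     % persist the trained network

% --- later, in a fresh session ---
S   = load('trainedNet.mat');
net = S.net;

Xnew  = rand(50000, 5);            % placeholder: new data, rows = samples
Ypred = net(Xnew');                % network expects 5 x N (features as rows)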