

Hi everyone, I have a question related to training a Neural Network using the `sciml_train` function of DiffEqFlux.jl. I want to train a Neural Stochastic Differential Equation where the loss explicitly depends on the Neural Network inside the SDE. In one particular part of the loss, a particular weight of the Neural Network is not used at all, and I believe this is what produces the error when I run my code. Below I present a MWE that gives a similar error.
```julia
using Flux, DiffEqFlux, LinearAlgebra

# Defining some constants for creating the Neural Network
# ... (input_size, output_size, n)

# Creating a simple neural network consisting of two linear layers
nn_initial = Chain(Dense(input_size, output_size))
# ... (nn and the parameter vector Θ are built from nn_initial)

function loss_intermediate(NN, data_x, data_y)
    # ...
end
function loss(NN, Θ, data_x, data_y)
    # ...
    return loss_intermediate(NN, data_x, data_y)
end

# Create a function that generates data from a linear function and adds noise
data_y = data_x + 0.01*randn(input_size, n)
testLossVal = loss(nn, Θ, data_x, data_y)
println("The value of the loss function at the start of training is: $testLossVal \n")
result = DiffEqFlux.sciml_train((p) -> loss(nn, p, data_x, data_y), Θ, ADAM())
```

Running this MWE produces the following error:

```
ERROR: LoadError: DimensionMismatch("array could not be broadcast to match destination")
Stacktrace:
 apply!(::ADAM, ::Array) at C:\Users\Sven\.julia\packages\DiffEqFlux\GQl0U\src\train.jl:6
 ...
 top-level scope at C:\Users\Sven\Documents\Master\Master_thesis\Julia\Code\TestRepository\TestFileDiscourse_question3.jl:43
 include_string(::Function, ::Module, ::String, ::String) at ...
in expression starting at C:\Users\Sven\Documents\Master\Master_thesis\Julia\Code\TestRepository\TestFileDiscourse_question3.jl:43
```
The code presented above does not work as-is. However, with several small adjustments it does. For instance, if one adds `+ 0.5*dot(NN.layers[1].W, NN.layers[1].W)` to the loss function (and does not change anything else), the code runs. Moreover, when changing `loss_1` from `0` to `0.5*dot(diff, diff)`, where `diff = NN(data_x) - data_y`, the code works as well. This makes me believe that the error is related to the fact that not ALL the parameters of the NN are present in the loss function; a sketch of the pattern I mean is at the end of this post.

How can the issue presented above be resolved?
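To make the pattern concrete, here is a self-contained sketch of the kind of loss I mean. The sizes, the data, and the `Flux.destructure` plumbing are only illustrative stand-ins for my actual setup, and `loss_2` touching only the bias is just one way of leaving the weight `W` out of the loss:

```julia
using Flux, DiffEqFlux, LinearAlgebra

input_size, output_size, n = 1, 1, 100        # illustrative sizes
nn_initial = Chain(Dense(input_size, output_size))
Θ, re = Flux.destructure(nn_initial)          # flat parameter vector + rebuilder

# Data from a linear function (identity) plus a little noise, as in the MWE
data_x = randn(input_size, n)
data_y = data_x + 0.01*randn(input_size, n)

function loss_intermediate(NN, data_x, data_y)
    diff = NN(data_x) - data_y
    loss_1 = 0                                # uses no parameters of NN at all
    # loss_1 = 0.5*dot(diff, diff)            # with this instead, the code runs
    loss_2 = 0.5*dot(NN.layers[1].b, NN.layers[1].b)  # touches only the bias, not W
    return loss_1 + loss_2
end

# Rebuild the network from the parameter vector p, then evaluate the loss
loss(p, data_x, data_y) = loss_intermediate(re(p), data_x, data_y)

result = DiffEqFlux.sciml_train(p -> loss(p, data_x, data_y), Θ, ADAM())
```

In my actual code, either of the two modifications mentioned above makes the error go away.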
