machine learning - Backpropagation through time, simple explanation for a beginner


I'm totally new to machine learning. I understand the concepts of backpropagation and recurrent neural networks, but I can't seem to grasp backpropagation through time. In the Wikipedia pseudocode:

    Back_Propagation_Through_Time(a, y)   // a[t] is the input at time t. y[t] is the output
        Unfold the network to contain k instances of f
        Until the stopping criteria is met:
            x = the zero-magnitude vector // x is the current context
            for t from 0 to n - 1         // t is time. n is the length of the training sequence
                Set the network inputs to x, a[t], a[t+1], ..., a[t+k-1]
                p = forward-propagate the inputs over the whole unfolded network
                e = y[t+k] - p;           // error = target - prediction
                Back-propagate the error, e, back across the whole unfolded network
                Update all the weights in the network
                Average the weights in each instance of f together, so that each f is identical
                x = f(x);                 // compute the context for the next time-step

So as I understand it, we have the desired output at the current step, do a forward pass over the steps before it, and calculate the error between the previous steps' outputs and the current output.

How are the weights updated?

    Average the weights in each instance of f together, so that each f is identical

What is the meaning of this?

Can anyone describe BPTT in simple terms, or give a simple reference for a beginner?

You unfold the RNN f over n time-steps into a plain DNN, where n is the length of the training feature-label sequence, so the DNN contains n instances of f. You then use the n-step feature-label sequence to train this DNN with standard BP. In the DNN, each instance of f contains its own copy of the weights w, and each copy gets updated to a different new value, w_1 through w_n. The average of w_1 through w_n becomes the new weights of the original RNN f after it has been trained on the n-step sequence. This whole procedure is what training the RNN f with BPTT means.
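The unfold-update-average procedure above can be sketched numerically. This is a minimal illustration, not the pseudocode's exact network: a single-weight cell f(x, a) = tanh(w*x + a) unrolled for n steps, with each unrolled instance holding its own copy of w. The cell, the learning rate, and all variable names here are illustrative assumptions, not from the post.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5                        # length of the training sequence
a = rng.normal(size=n)       # inputs a[0..n-1]
y_target = 0.3               # desired output after the last step
w = 0.5                      # the RNN's single shared weight
lr = 0.1                     # learning rate (illustrative choice)

# --- forward pass through the unfolded network ---
w_copies = np.full(n, w)     # one copy of w per unfolded instance of f
xs = [0.0]                   # x starts as the zero-magnitude context
for t in range(n):
    xs.append(np.tanh(w_copies[t] * xs[-1] + a[t]))

p = xs[-1]                   # prediction = final context
e = y_target - p             # error = target - prediction

# --- backward pass: gradient of 0.5*e^2 w.r.t. each weight copy,
#     exactly as standard BP would compute it in the unfolded DNN ---
grads = np.zeros(n)
dx = -e                      # d(loss)/d(final context)
for t in reversed(range(n)):
    dpre = dx * (1.0 - xs[t + 1] ** 2)   # through tanh (xs[t+1] = tanh(pre))
    grads[t] = dpre * xs[t]              # gradient for this instance's copy of w
    dx = dpre * w_copies[t]              # propagate into the previous context

# --- update each copy independently (w_1 ... w_n differ now),
#     then average them so that every f is identical again ---
w_copies -= lr * grads
w_new = w_copies.mean()
print("per-instance gradients:", grads)
print("shared weight after averaging:", w_new)
```

Note that averaging the updated copies is algebraically the same as updating the single shared weight with the average of the per-instance gradients, which is why the averaging step keeps every instance of f identical.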
