Sir, good morning. This is an excellent contribution for everyone in this field. It's wonderful, thanks a lot. Just a small doubt, sir: in this video (at 17:25) you mention adding the bias term in the formula. My doubt is, where in the network is the bias added: at the end, or at the O2 level? Please clear my doubt. Thank you once again.
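For what it's worth, in the standard RNN formulation there are two separate biases, and the hidden-state bias is added inside the activation at the hidden node (the O2 level), not at the end. A minimal NumPy sketch, assuming common notation (all sizes here are made up for illustration):

```python
import numpy as np

# Hypothetical sizes, just for illustration
n_x, n_a, n_y = 3, 4, 2
rng = np.random.default_rng(0)
W_ax = rng.standard_normal((n_a, n_x))  # input -> hidden
W_aa = rng.standard_normal((n_a, n_a))  # hidden -> hidden (recurrent)
W_ya = rng.standard_normal((n_y, n_a))  # hidden -> output
b_a = np.zeros(n_a)   # hidden-state bias: added INSIDE the tanh
b_y = np.zeros(n_y)   # output bias: added at the output layer

x_t = rng.standard_normal(n_x)
a_prev = np.zeros(n_a)

# b_a is added before the nonlinearity, at the hidden node
a_t = np.tanh(W_aa @ a_prev + W_ax @ x_t + b_a)
# a second bias, b_y, appears at the output layer
y_t = W_ya @ a_t + b_y
```

So "at the end" and "at the O2 level" are both right in a sense: each layer that has weights gets its own bias.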
Sir, please help with the part at 14:45, where the output of a recurrent node is passed to the next time step of the same node but is ALSO passed to the other nodes in the hidden layer. I didn't understand that part; please explain to me intuitively/mathematically how it works ❤
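One way to see this (a sketch under the usual vector formulation, not necessarily the video's exact diagram): the "passing to the other node" is just the off-diagonal entries of the recurrent weight matrix. Because the new hidden state is a matrix-vector product with the full previous state, every unit reads every other unit's previous value:

```python
import numpy as np

# Two hidden units; W_aa is 2x2, so unit 0's new state depends on
# BOTH units' previous states. The off-diagonal entries (0.9 and 0.1)
# are exactly the "passing to the other node" connections.
W_aa = np.array([[0.5, 0.9],   # row 0: unit 0 listens to units 0 and 1
                 [0.1, 0.3]])  # row 1: unit 1 listens to units 0 and 1
a_prev = np.array([1.0, 2.0])

# Input and bias omitted here for clarity
a_t = np.tanh(W_aa @ a_prev)
# unit 0 = tanh(0.5*1.0 + 0.9*2.0) = tanh(2.3)
```

Intuitively: the same vector a_prev is sent forward in time, and the matrix multiply is what mixes it across the units of the layer.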
Hi Aman, I need one suggestion. I need to convert xaml files to atmx files. Is it possible? How would I develop a model, which model should I use, and how should I build the dataset? Kindly guide me on this.
Hi Aman, thanks for the video. There will be three weights: in common notation, Waa for the previous hidden state (activation), Wax for the input word, and Wya for the output. Correct me if I am wrong, thanks. Can we expect the derivations in the next video too? 😊
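For reference, assuming the video follows the common Andrew Ng-style notation, that reading is almost right: W_{aa} multiplies the previous hidden state (activation), not the previous word; W_{ax} multiplies the current input; and W_{ya} produces the output:

```latex
a^{\langle t\rangle} = \tanh\!\big(W_{aa}\,a^{\langle t-1\rangle} + W_{ax}\,x^{\langle t\rangle} + b_a\big),
\qquad
\hat{y}^{\langle t\rangle} = g\big(W_{ya}\,a^{\langle t\rangle} + b_y\big)
```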
Here you didn't explain: how each node of the hidden layer processes its input (that part we know), passes it on (to the other hidden nodes of the same layer and to other layers), and stores its output hidden state; how it processes the next timestamp; and how the final Dense layer receives the multiple hidden states and uses them to produce the output. Also, RNNs have multiple architecture types (e.g. many-to-one): when does the output layer act? Please explain these doubts with the logic, sample code, and a sample calculation (we only want the process, not exact numbers). Even if it takes a long video, please upload it as a single video. Every source on the internet gives only the outline of RNNs, not the depth. Can you please?
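Until such a video exists, here is a minimal sketch of the many-to-one case under the standard formulation (the function name and sizes are made up for illustration). It shows the two things asked about: the hidden state is "stored" simply by carrying the vector `a` into the next loop iteration, and in many-to-one the output (Dense) layer acts only once, after the final timestep:

```python
import numpy as np

def rnn_many_to_one(xs, W_ax, W_aa, W_ya, b_a, b_y):
    """Forward pass of a many-to-one RNN.

    The hidden state a is updated at every timestep and carried over;
    the output layer is applied only once, after the last timestep.
    """
    a = np.zeros(W_aa.shape[0])                    # initial state a^<0>
    for x_t in xs:                                 # one step per input vector
        a = np.tanh(W_aa @ a + W_ax @ x_t + b_a)   # update and "store" a
    return W_ya @ a + b_y                          # single output at the end

# Hypothetical sizes: 4 timesteps, 3-dim inputs, 5 hidden units, 2 outputs
rng = np.random.default_rng(1)
n_x, n_a, n_y, T = 3, 5, 2, 4
xs = rng.standard_normal((T, n_x))
y = rnn_many_to_one(xs,
                    rng.standard_normal((n_a, n_x)),
                    rng.standard_normal((n_a, n_a)),
                    rng.standard_normal((n_y, n_a)),
                    np.zeros(n_a), np.zeros(n_y))
```

For a many-to-many architecture, the only change would be moving the `W_ya @ a + b_y` line inside the loop so an output is emitted at every timestep.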