Excellent tutorial on a very useful but sometimes confusing feature in NumPy. I would only add that `...` is syntactic sugar for omitting a bunch of indices.
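To illustrate the point above, here is a minimal sketch of how `...` stands in for any number of leading axes (the array shapes are made up for the example):

```python
import numpy as np

x = np.random.rand(2, 3, 4, 5)

# '...' matches however many leading axes are present, so one spec
# works for arrays of any rank: here, sum over the last axis only.
a = np.einsum('...i->...', x)

assert a.shape == (2, 3, 4)
assert np.allclose(a, x.sum(axis=-1))
```

The same string works unchanged on a 2-D or a 6-D array, which is exactly what "omitting a bunch of indices" buys you.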
Awesome, your channel is so underrated. I was struggling to find a good channel to learn PyTorch from; thankfully I found yours :D Can you cover pix2pix, CycleGAN, and R-CNNs? Would be grateful if you did.
I don't know all of this stuff. I research everything to try to make every video as good as I possibly can, so the process is usually that I learn something in depth and then decide to share it with you guys.
This is great. I just want to know, though, whether I can do an FFT of a Green's function using einsum. Note: I've been trying for a week to implement the code and never got the correct result.
Matrix-vector multiplication gives me an error: `torch.einsum("ij, kj->ik", x, v)` raises `einsum(): operands do not broadcast with remapped shapes [original->remapped]: [2, 5]->[2, 1, 5] [1, 3]->[1, 1, 3]`. Same in TF: `Expected dimension 5 at axis 1 of the input shaped [1,3] but got dimension 3 [Op:Einsum]`.
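The error above is about the shared index `j`: it is size 5 in `x` but size 3 in `v`, and einsum requires a repeated index to have the same size in every operand. A minimal sketch of the usual matrix-vector spec, shown here in NumPy (torch and TF follow the same indexing rules; the shapes are reconstructed from the error message):

```python
import numpy as np

x = np.random.rand(2, 5)   # matrix
v = np.random.rand(5)      # vector: its length must equal x's column count

# Reuse the column index 'j' in both operands and drop it from the
# output, which sums over it -- standard matrix-vector multiplication.
out = np.einsum('ij,j->i', x, v)

assert out.shape == (2,)
assert np.allclose(out, x @ v)
```

If `v` really is a 1xN row matrix rather than a vector, `'ij,kj->ik'` does work, but only when that `N` matches the number of columns of `x`.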
One thing that wasn't mentioned in the video, which I realized halfway through, is that einsum is sometimes used on one operand and sometimes on two. I tried `torch.einsum('ii->i', t, t)` and got `RuntimeError: einsum(): more operands were provided than specified in the equation`. This tells me that the number of operands must correspond to the number of comma-separated index groups on the left-hand side of `->`.
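That matching rule can be sketched as follows; NumPy enforces the same constraint as torch, just with a `ValueError` instead of a `RuntimeError`:

```python
import numpy as np

t = np.eye(3)

# One comma-separated index group on the left -> exactly one operand:
d = np.einsum('ii->i', t)          # diagonal of t
# Two groups -> exactly two operands:
p = np.einsum('ij,jk->ik', t, t)   # matrix product

# A mismatch is rejected, mirroring the torch error quoted above:
try:
    np.einsum('ii->i', t, t)
except ValueError:
    pass  # more operands than index groups in the equation
```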
I think it's because if you wrote it as a nested loop, you would loop over all rows with a variable `i` and reuse the same variable for the columns (every entry at coordinates (i, i) is on the diagonal). As for the result: if you left the `i` out of the output, it would sum the diagonal elements up; if you keep it in there, it creates a list instead.
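The loop analogy above can be written out directly; this sketch checks both readings of `'ii'` against the explicit Python loop:

```python
import numpy as np

t = np.arange(9).reshape(3, 3)

# Reuse the same variable i for rows and columns, as described above.
diag = np.array([t[i, i] for i in range(t.shape[0])])
assert np.array_equal(diag, np.einsum('ii->i', t))   # keep i -> list

trace = sum(t[i, i] for i in range(t.shape[0]))
assert trace == np.einsum('ii->', t)                 # drop i -> sum (the trace)
```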
`np.einsum('ik,kj->ij', x, y)` is actually much, much slower than `np.dot(x, y)` when the matrix size gets larger. Also, `tf.einsum` is slightly slower than `tf.matmul`, but `torch.einsum` is slightly faster than `torch.matmul`... though that's only from the perspective of my laptop's configuration.
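One thing worth checking before drawing conclusions: `np.einsum` defaults to `optimize=False`; passing `optimize=True` lets it pick a contraction path that can dispatch to BLAS, which usually closes much of the gap with `np.dot` on large matrices. A rough timing sketch (sizes and repeat counts are arbitrary, and absolute numbers will of course vary by machine):

```python
import timeit
import numpy as np

x = np.random.rand(300, 300)
y = np.random.rand(300, 300)

t_dot = timeit.timeit(lambda: np.dot(x, y), number=20)
t_ein = timeit.timeit(lambda: np.einsum('ik,kj->ij', x, y), number=20)
t_opt = timeit.timeit(lambda: np.einsum('ik,kj->ij', x, y, optimize=True),
                      number=20)
print(f"dot: {t_dot:.3f}s  einsum: {t_ein:.3f}s  optimize=True: {t_opt:.3f}s")

# All three compute the same product:
assert np.allclose(np.dot(x, y), np.einsum('ik,kj->ij', x, y, optimize=True))
```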