Dr. Socher, you are amazing when it comes to explaining things. Even I, a person who knows nothing about programming, am starting to understand how things work.
@@user-rl8to5nc2q gemini (1.5 specifically) "remembers" 1M tokens, so this 1h talk is a finger flex for the model. you can drill down to specific phrases or let it elaborate on points you need more detail on. the only flaw is the sometimes inaccurate automated YT transcript. i use it way more often than gpt4 when it comes to context input / understanding. i can throw multiple books at it. not sure why YT isn't all over this. for comparison, gpt4 has a measly 4,096 tokens. that's sometimes only enough for a few follow-up questions. energy demand has to come down drastically so we can enjoy the full might of these models.
@@user-rl8to5nc2q i typed the answer already, but google..... the summary is more than just accurate. gemini (1.5) can grasp the entire speech, so it memorizes it down to the word. its 1M token window is gigantic compared to gpt4's 4,096 tokens. you can get a summary in bullet points and then let it elaborate and quote on portions you'd like to know more about. unfortunately it's not available everywhere yet, but this is a very interesting glimpse of what we (the general public) will be capable of very soon.