Could you suggest a PC build for running those LLMs locally? I am thinking about the RTX 4060 Ti 16GB because it is a little bit cheaper compared to the others. My friend said the RTX 3090 24GB would be the better investment because upcoming models may require more VRAM. That makes sense, but it is almost double the price of the RTX 4060 Ti. Could you help me with this? Thanks for your efforts!
Difficult to say. In general, when you want to run models locally, the amount of VRAM is the most important factor. But models are already available in smaller sizes, and they keep getting better.
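To make the VRAM question concrete, here is a rough back-of-envelope sketch for estimating how much memory a model needs at a given quantization. It assumes the weights dominate and adds a ~20% allowance for KV cache and runtime overhead; both the overhead factor and the helper name are assumptions for illustration, not figures from any specific runtime.

```python
def estimate_vram_gb(params_billions, bits_per_weight, overhead_factor=1.2):
    """Rough VRAM estimate: weight bytes plus a ~20% overhead allowance.

    overhead_factor is an assumed fudge factor for KV cache and
    activations; real usage depends on context length and runtime.
    """
    weight_gb = params_billions * bits_per_weight / 8
    return weight_gb * overhead_factor

# Compare common model sizes against 16 GB and 24 GB cards.
for name, params in [("7B", 7), ("13B", 13), ("34B", 34), ("70B", 70)]:
    for bits in (4, 8):
        gb = estimate_vram_gb(params, bits)
        print(f"{name} @ {bits}-bit ~ {gb:5.1f} GB"
              f"  fits 16GB: {gb <= 16}  fits 24GB: {gb <= 24}")
```

By this rough math, a 13B model at 4-bit fits comfortably on a 16 GB card, a 34B model at 4-bit needs the 24 GB card, and a 70B model at 4-bit fits on neither without offloading. That is the practical trade-off between the two cards you are weighing.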