Nvidia is not upfront with its pricing, but after some clicking around, it seems Nvidia charges an annual Enterprise License of $4,500, while other services charge per token (usage) or per hour.
I am confused about the GPT-4 deployment on NIMs. It is not an open-source model, so how is it being hosted in the Nvidia environment? The use of open-source models makes sense: we download one to our system and pay the cost to run it. But GPT-4? Did Nvidia make a deal with OpenAI to run their model locally? It would be great if that's the case. But if not, then this is just an unnecessary extra step to use GPT-4 through NIMs.
It's wild what is possible now. Audio, video, text,… is one thing, but in industry AI is already working and doing a really good job! Why are all the interfaces focused on Python? Keras, JAX,… For some situations C, C++, or Rust would be better.