You could, but you would need to deploy that LLM somewhere you can reach from Python. Azure OpenAI Service hosts the model compute for you (and you pay per token), and deploying models in Azure ML gives you another pay-per-token option. You can also host the LLM(s) in other services on other cloud platforms and send them requests from a notebook in Fabric.
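For instance, here is a minimal sketch of calling an Azure OpenAI deployment from a Fabric notebook using the `openai` Python package (v1+); the endpoint, API key, and deployment name are placeholders you would replace with your own:

```python
from openai import AzureOpenAI

# Placeholder values -- swap in your own resource endpoint,
# key, and deployment name from the Azure portal.
client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-api-key>",
    api_version="2024-02-01",
)

# "model" is the name of YOUR deployment, not the base model name.
response = client.chat.completions.create(
    model="<your-deployment-name>",
    messages=[{"role": "user", "content": "Summarize this sales data trend."}],
)

print(response.choices[0].message.content)
```

The same pattern works for an LLM hosted anywhere else: as long as the endpoint is reachable over HTTPS from the Fabric notebook, you can send it requests with whatever client library (or plain `requests`) the service expects.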