Do you know how to do this with a custom, local LLM like Vicuna or Alpaca (or another model installed on my computer), so that it all works without internet access? I'm using .txt files, so unlike .html there's no need for the internet.
I haven't tried it yet (gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#) but I'll give it a go as soon as I get a local LLM running on my machine.
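For what it's worth, the pattern that doc describes boils down to wrapping your local model behind a single completion method and handing that object to the index library. Here's a library-free sketch of that idea; every name in it (the class, the method, the stub model) is hypothetical, and the real base class and hooks come from the gpt-index docs linked above:

```python
# Library-free sketch of the custom-LLM pattern: wrap whatever local
# model you have (Vicuna, Alpaca, a llama.cpp binding) behind one
# completion method. No network access is involved at any point.
from typing import Callable


class LocalLLM:
    """Minimal wrapper around a local prompt -> text function."""

    def __init__(self, generate: Callable[[str], str]):
        # `generate` is whatever invokes your on-disk model; for this
        # sketch it can be any callable taking a prompt string.
        self._generate = generate

    def complete(self, prompt: str) -> str:
        # The index library would call something like this for each
        # LLM step (answer synthesis, summarization, etc.).
        return self._generate(prompt)


# Stub standing in for a real local model binding:
def fake_vicuna(prompt: str) -> str:
    return "[local completion for: " + prompt + "]"


llm = LocalLLM(fake_vicuna)
print(llm.complete("Summarize my notes"))
```

In real use you'd replace `fake_vicuna` with a call into your local weights (e.g. a llama.cpp binding) and register the wrapper via the library's custom-LLM hook from the doc above.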