Hello! Basically, I need to process a very large file (4000 lines), and free AI chatbots like ChatGPT aren’t able to handle it. I would like to split it into smaller parts and process each part separately. However, I’m having a very hard time finding a chatbot with a free API. The only one I found is HuggingChat, but even when I wait 1 second before sending the next request, it starts giving rate limit errors after a few requests.

Any suggestions? Thanks in advance!

EDIT: I also tried to run gpt4all on my laptop (with integrated graphics) and it took like 2-5 minutes to answer a simple “hello” prompt, so it’s not really feasible :(
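
EDIT 2: To show what I mean by “split and process each part”, here’s a rough sketch of what I’m trying to do in Python. The endpoint, headers, payload shape, and chunk size are placeholders for whatever API I end up using; the only real logic is chunking the file and backing off when I get a 429.

```python
import time
import requests  # assumes the `requests` package is installed

API_URL = "https://example.com/v1/chat/completions"  # placeholder endpoint
API_KEY = "YOUR_KEY"                                 # placeholder key
CHUNK_LINES = 200                                    # lines per request

def chunks(path, size=CHUNK_LINES):
    """Yield the file in blocks of `size` lines."""
    with open(path) as f:
        lines = f.readlines()
    for i in range(0, len(lines), size):
        yield "".join(lines[i:i + size])

def ask(prompt, retries=5, delay=5):
    """Send one prompt, backing off and retrying on HTTP 429 (rate limit)."""
    for attempt in range(retries):
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"messages": [{"role": "user", "content": prompt}]},
            timeout=120,
        )
        if resp.status_code == 429:            # rate limited: wait and retry
            time.sleep(delay * (attempt + 1))  # simple linear backoff
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError("still rate limited after retries")

for part in chunks("big_file.txt"):
    print(ask("Process this part:\n\n" + part))
    time.sleep(2)  # pause between requests to stay under the limit
```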

  • BakedCatboy@lemmy.ml · 2 days ago

    What model size did you run on your laptop? I have an Intel NUC with an i7 and run various models on CPU (it doesn’t have a dedicated GPU). I can’t run anything larger than ~14b or so, but models up to around ~7b aren’t too slow. If I try to run a 32b, I get a similar experience to yours. I tend not to go below 4b because that’s when models start being dumb and not following instructions well, so it just depends on how complex your task is.
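
    If you want to give gpt4all another shot with a smaller model, the Python bindings make it quick to test. A minimal sketch (the model filename here is just an example; point it at whatever quantized ~7b .gguf you have downloaded):

    ```python
    from gpt4all import GPT4All  # pip install gpt4all

    # Example filename only; substitute whatever quantized ~7b .gguf you have.
    model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

    with model.chat_session():
        # Keep generations short; CPU-only inference slows down fast with long outputs.
        print(model.generate("hello", max_tokens=64))
    ```

    A ~7b quant on CPU should respond noticeably faster than what you saw; if it’s still crawling, drop to a ~4b.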