Another sweet ride through the Midnight AI Matrix.❤🎉 So good.
thnx mate :)
Current research suggests you're correct: longer context often outperforms fine-tuning.
It would be great to see you make tools like these into standalone apps. I love all the cool apps you make, but having them reliant on the browser adds extra steps to launching them, and it's rough for people like me who already have way too many tabs open.
Thnx for the feedback. Might look into that :)
Unsure if this has been answered previously, but why do you use a colon ':' as opposed to a full stop '.' at the end of each prompt? Do you find any difference in output?
tbh it's a habit, don't think it makes a big diff :)
@@AllAboutAI thanks
This is absolutely fantastic 👏👏👏
Thnx mate :)
Super interesting! Nice work!
indeed
thanx a lot Simon :)
Amazing job 👍
thnx a lot :)
4o mini is the same price *per image* as 4o, fyi. The mini price only applies to text.
Yeah I know, but I still think it's not too bad. Hopefully it gets cheaper going forward :)
Super amazing!
tnx man =)
NO WAY. I'll wait for Super-Omni-Prompting before I waste my time. The current rate of new buzzwords and terms per day is maybe the best thing about AI.
It's not cheap, is it? It's 20 just to even get tokens, so I stick to a local LLM with Ollama. That's it! Never paying for anything!
Don't be sassy about it.
Finally, someone who actually knows the future of AI software. No matter how "cheap" it's gonna get, Llama already kicks its ass, and will do more kicking in the future.
@@ai.universityy Yeah, and when the 405B version releases, things are gonna change.
I hate paying man.
But the open-source LLMs output dog poop!
So I use OpenRouter!!!
Someone correct me if I am wrong, but Ollama requires at least a mid-level PC to be viable. Not everybody has access to that, and a mid-level PC is still a lot more expensive than some OpenAI credits.
Thx, this stuff is so insightful, it's super great.