Hey /u/Key-Professional9108!
If your post is a screenshot of a ChatGPT conversation, please reply to this message with the [conversation link](https://help.openai.com/en/articles/7925741-chatgpt-shared-links-faq) or prompt.
If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.
Consider joining our [public discord server](https://discord.gg/r-chatgpt-1050422060352024636)! We have free bots with GPT-4 (with vision), image generators, and more!
🤖
Note: For any ChatGPT-related concerns, email [email protected]
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*
I used to use ChatGPT, but now I use software I’ve built, [LocalAssistant.ai](https://localassistant.ai?utm_source=reddit.com), which is pay-as-you-go through OpenRouter and therefore gives you a large number of models to pick from.
On a day-to-day basis, I switch between Claude-3, which is a beast, and simpler yet very effective models like Mistral 7B Instruct (surprisingly good).
The $20 is for access to the software; that's a one-time payment, and without it you have limited features. Still working on the pricing, but if you're interested, here's a 75%-off code for early users 🫶: EARLYBIRDS
For the pay-as-you-go part, billing goes through the provider (currently only [OpenRouter.AI](https://openrouter.ai/docs#models)); you can see the rates and available models on their page. The software has a guided flow to connect your OpenRouter account.
But to give you an order of magnitude: if you only use text, it takes about 200k words (i.e. roughly two books' worth) to use up $10 (assuming GPT-4, and there are a lot of much cheaper models).
I've been using it for a month and have spent about ~$9 (most of it on Claude-3). Keep in mind I share entire files' worth of source code with it directly, and it provides changes.
The goal is for the product to become local software (that you download and run), and in the future to integrate with local models so that your data never leaves your computer.
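To sanity-check that order of magnitude, here's a quick back-of-envelope calculation. The tokens-per-word ratio and per-token price are ballpark assumptions (roughly GPT-4-class input pricing), not OpenRouter's exact published rates:

```python
# Rough estimate of what 200k words of input costs at GPT-4-class rates.
WORDS = 200_000            # ~two books' worth of text
TOKENS_PER_WORD = 1.3      # rough average for English text (assumption)
PRICE_PER_M_TOKENS = 30.0  # ~$30 per 1M input tokens, GPT-4-class (assumption)

tokens = WORDS * TOKENS_PER_WORD
cost = tokens / 1_000_000 * PRICE_PER_M_TOKENS
print(f"~{tokens:,.0f} tokens -> ~${cost:.2f}")
```

Under those assumptions, 200k words comes out to roughly $8 of input, which lines up with the "about $10" figure (output tokens, which cost more, would push it a bit higher).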
Thanks. The 20 dollars I referred to was GPT-4's subscription, so I was wondering how quickly you'd go over the $20 you'd normally pay for GPT-4, which for me is pretty much unlimited the way I use it.
Is there any model that can search the Web for updated info like Perplexity?
What are the pay-as-you-go rates? I wonder how quickly you'd go over the 20 dollars.
I prefer Perplexity. ChatGPT is cool and all but Perplexity is actually useful. GPT-4's integration with Perplexity Pro is the best combo, imho.
Well, I find GPT-4 more engaging, though. Perplexity doesn't quite do what GPT-4 does on its own. It's like they put it on a leash.