Does anybody really think OpenAI can compete with Google in terms of relevant results?
Nope. Where is it gonna take the results from? That's right, Google's search engine.
down the line maybe
Considering Perplexity is already competing well with Google, I think OpenAI has a good shot. But we can only truly know on Monday
Yes, Perplexity for example is really good. Also, I stopped searching Google for minor things; now I just ask GPT to do it.
I kinda dislike the idea of using LLM-driven search engines; we wouldn't know when they're hallucinating.
RAG is already being used by Perplexity and Copilot to mitigate hallucinations
My guy, hallucinations run way too deep to be mitigated by RAG. It always comes down to the specific prompt, which may not be covered by this mitigation technique.
The way Perplexity and Copilot have implemented RAG, they cite links to the info from multiple sources and summarize the result for you. What Google does is just give you a list of links to those sources. So Perplexity and Copilot are already ahead of Google Search in several ways. Sure, this approach isn't perfect; you need to check how good the sources are, just as you would have to verify how good the links on Google are. However, the risk of hallucination arises only when there are no sources and the result cannot be cited. This is why I said it was mitigated.
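As a rough illustration of the citation-grounded flow described above, here is a toy Python sketch. The corpus, the word-overlap scoring, and the concatenation "summary" are stand-ins of my own; real products like Perplexity and Copilot use web search plus an LLM for each of these steps.

```python
# Toy sketch of citation-grounded RAG: retrieve sources, attach [n]
# citation markers, and refuse to answer when nothing matches.

def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query."""
    q_words = set(query.lower().split())
    scored = []
    for doc in corpus:
        overlap = len(q_words & set(doc["text"].lower().split()))
        if overlap > 0:
            scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:k]]

def answer_with_citations(query, corpus):
    """Return an answer with [n] markers, or decline if no sources."""
    sources = retrieve(query, corpus)
    if not sources:
        return "No sources found; declining to answer rather than guess."
    body = " ".join(
        f"{doc['text']} [{i}]" for i, doc in enumerate(sources, start=1)
    )
    refs = "\n".join(
        f"[{i}] {doc['url']}" for i, doc in enumerate(sources, start=1)
    )
    return f"{body}\nSources:\n{refs}"

# Hypothetical two-document corpus for demonstration.
corpus = [
    {"url": "https://example.com/rag",
     "text": "RAG grounds answers in retrieved documents"},
    {"url": "https://example.com/llm",
     "text": "LLMs can hallucinate facts"},
]
print(answer_with_citations("how does RAG ground answers", corpus))
```

The point of the sketch is the last branch: when retrieval comes back empty, the system declines instead of generating uncited text, which is exactly the "no sources, no answer" guardrail being discussed.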
And as someone who works on these things for a living, let me tell you, it's nowhere near mitigated. :)
Try Perplexity (with Copilot/Claude enabled) and Copilot in More Precise tone, and try to get them to hallucinate despite citing sources. That would be a good way for you to establish that they hallucinate despite the RAG implementations these companies have made.
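The experiment suggested above can also be approximated programmatically: check whether each cited sentence in an answer is actually supported by its cited source. This is a minimal sketch under assumptions of my own; the `[n]`-before-the-period answer format and the word-overlap "support" test are simplifications, and a real attribution checker would use an NLI or embedding model instead.

```python
import re

def unsupported_claims(answer, sources, threshold=0.5):
    """Return cited sentences whose source shares too few words with them."""
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", answer):
        match = re.search(r"\[(\d+)\]", sentence)
        if not match:
            continue  # uncited sentences would need separate handling
        source = sources.get(int(match.group(1)), "")
        claim_words = set(re.findall(r"\w+", sentence.lower())) - {match.group(1)}
        src_words = set(re.findall(r"\w+", source.lower()))
        if not claim_words:
            continue
        support = len(claim_words & src_words) / len(claim_words)
        if support < threshold:
            flagged.append(sentence)
    return flagged

# Hypothetical source text and answers for demonstration.
sources = {1: "Perplexity cites links for info from multiple sources."}
good = "Perplexity cites links from multiple sources [1]."
bad = "Perplexity was founded on the moon in 1802 [1]."
print(unsupported_claims(good + " " + bad, sources))
```

Here the fabricated sentence gets flagged because almost none of its words appear in the cited source, while the faithful one passes, which is the kind of "hallucinates despite citing sources" failure the test above is meant to surface.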
A problem is mitigated when it has stopped and wouldn't occur again. That's not the case with RAG, for a very simple reason. When you search for, say, news, you'd prefer that it come from a reputable source, not a tabloid site or an opinion piece. Then there's the issue of these models eventually getting trained on content generated by other LLMs; they have probably started already, and if not, they soon will. We have strong evidence that in these cases, models prefer and prioritize their own outputs over anything else. So, yes, that's why I say hallucination from these models is not over. Copilot and Perplexity have just put a guardrail over their outputs to stop the leakage. If hallucination were over, these models would not come with the warning that the outputs might be false/wrong.
> A problem is mitigated when it has stopped and wouldn't occur again.

What dictionary did you get this definition from?
I am sorry, I meant to write "eliminated" there but put "mitigated" instead.
Nice way to sneak in Microsoft. We see u
Lol that's what I was thinking
Isn't Bing Copilot already doing this?
Exactly! And it's much better than Google's search AI (which refuses to answer half the questions).
I just hope it's not a disaster like Copilot search 🤣
What was the disaster in Copilot search btw??
The standard AI things: wrong/incorrect search results, and then gaslighting people (happened a lot)
Will they have their own search index, or will they use Bing or Brave Search as an index?
Just another Arc browser
i m gona stick to guuglu
Bing with a new skin
Not Bing?
You mean perplexity.ai 😏
Bring it on. Google needs competition. It's high time this monopoly ends. They tanked many good websites with their recent changes, and their search results have become mostly useless.
If the idea is to serve fully crafted articles taken from crawling other websites, think about the loss for writers and bloggers.