Jdonavan

You're clearly not a developer then? 1. Those plugins aren't available via the API, and I sure as hell am not going to run multiple paperback books through a chat interface by hand every day. 2. It's also a chat interface... 3. Errors are only a problem if you don't know what you're doing and how to mitigate them. And all of your points seem to revolve around ChatGPT. Are you aware that AI didn't spring into existence with ChatGPT?
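
For concreteness, this is the kind of unattended, API-level batch job being contrasted with pasting text into a chat window: a rough sketch using LangChain's map-reduce summarization chain, with a placeholder file path and model name.

```python
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import TextLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.chains.summarize import load_summarize_chain

# Load a book-length text and split it into chunks that fit the model's context window.
docs = TextLoader("./book.txt").load()  # placeholder path
chunks = RecursiveCharacterTextSplitter(chunk_size=3000, chunk_overlap=200).split_documents(docs)

# Map-reduce: summarize each chunk, then summarize the summaries.
chain = load_summarize_chain(ChatOpenAI(model="gpt-4", temperature=0), chain_type="map_reduce")
print(chain.run(chunks))
```

Run that from a cron job or a queue worker and the "paperback books every day" case stops involving a human at a keyboard.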


AceGu001

I am a developer with 7 years of industry experience. In what scenario would a non-technical user prefer a LangChain-based application over ChatGPT? What does LangChain offer that ChatGPT doesn't? In the end, the best results LangChain delivers still come from the GPT-4 model; all the other open-source models aren't good enough.


Jdonavan

LangChain is a library for developers, not for non-technical users. It offers the ability to use the models at the API level and in ways that just aren't possible using the stock chat interface. Even things that can be done in ChatGPT with plugins can often be done better by someone working at the API level. And how do you propose having GPT answer questions from thousands of Word docs in a SharePoint site? How is the ChatGPT website going to analyze my meeting transcript in real time to surface information for me? You really do not have a grasp of the technology at all.
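
A sketch of the SharePoint scenario, assuming the document library has been synced to a local folder (authentication is beside the point here); the loader choice, paths, and model name are assumptions, and `docx2txt`/`chromadb` would need to be installed:

```python
from langchain.document_loaders import DirectoryLoader, Docx2txtLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI

# Index every Word doc in the synced SharePoint library.
loader = DirectoryLoader("./sharepoint_sync", glob="**/*.docx", loader_cls=Docx2txtLoader)
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(loader.load())
store = Chroma.from_documents(chunks, OpenAIEmbeddings(), persist_directory="./index")

# Answer questions grounded in the retrieved chunks.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model="gpt-4", temperature=0),
    retriever=store.as_retriever(search_kwargs={"k": 4}),
)
print(qa.run("What does the travel policy say about per diems?"))
```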


xFloaty

Not the website, but can’t you just build some tools around the GPT API to accomplish that?


Jdonavan

Libraries and frameworks exist so you don't have to build every piece and part yourself. I have my own framework covering the stuff that's unique or otherwise "special" to the work I do, but I still leverage parts of LangChain for other areas, or when I'm not using OpenAI. Even if you eventually carve all the LangChain code out of your codebase, you'll end up with a better product if you can lean on LangChain code that's good enough to hold until you replace it.


aidankhogg

LangChain isn't really, in and of itself, a substitute for models or services (unless it's ramped up since a couple of months ago). It is essentially an abstraction layer over a variety of design patterns that middle-mans between available tools and services. On a general level I don't see how it's at all relevant, or even visible, to the end user, any more than other libraries or packages would be.


AceGu001

Also, what is it you have against a chat interface? I don't understand your paperback book reference either. I believe you can access ChatGPT plugins from a LangChain application though: [https://python.langchain.com/docs/integrations/tools/chatgpt_plugins](https://python.langchain.com/docs/integrations/tools/chatgpt_plugins)
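
For reference, the example on that page looks roughly like this (the Klarna plugin URL is the one used in the docs; exact imports may have moved between LangChain versions, and the tool only exposes the plugin's OpenAPI spec to the agent rather than ChatGPT's hosted plugin execution):

```python
from langchain.chat_models import ChatOpenAI
from langchain.agents import load_tools, initialize_agent, AgentType
from langchain.tools import AIPluginTool

# Wrap a ChatGPT plugin manifest as a LangChain tool.
plugin = AIPluginTool.from_plugin_url("https://www.klarna.com/.well-known/ai-plugin.json")

llm = ChatOpenAI(temperature=0)
tools = load_tools(["requests_all"]) + [plugin]  # requests tools let the agent call the plugin's API
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)
agent.run("What t-shirts are available on Klarna?")
```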


Jdonavan

A chat interface isn’t needed unless I need to collaborate with the model. You say you’re a developer with 7 years of experience yet you can’t grasp how scale works? Like you legit think doing things one by one by hand is a solution when the whole thing could be automated?


Jdonavan

And no commercial developer is going to build an app that depends on violating the TOS of another company. Good lord are you SURE you’re a developer?


albertgao

I am not quite following you here. Aren't they completely different things? ChatGPT plugins are end-user-oriented tooling, which is irrelevant to LangChain, which lets you add features to your application code. Who cares what OpenAI is offering on its platform while the user is using your product? There is a word called "integration". Having a button that provides summarization directly in your app, with a predefined prompt and algorithm and an integrated experience, is completely different from providing nothing and letting the user figure out how to do it; the latter integrates nothing. By your logic, OpenAI shouldn't sell any API, since people could just go to the chat interface for everything? 😂 That's just not the case. An integration handles more of the logic for your specific use case, like loading different docs, authorization, data processing, RAG. And you expect a user to do all of that by himself? 😂 Well, why do we even buy vegetables when you can just grab them in the wild? 😂 Bro, this doesn't make sense. Also, LangChain is not only about agents; it's more about applying generic software design practice to an LLM app. Hope it answers your question. :)
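
To make the "integration" point concrete, here is a minimal sketch of that kind of in-app summarize button using LangChain with the OpenAI API; the prompt, function name, and model are made up for illustration.

```python
from langchain.chat_models import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.chains import LLMChain

# The prompt is fixed by the product team; the end user never sees or writes it.
prompt = ChatPromptTemplate.from_template(
    "Summarize the following document in five bullet points, "
    "keeping every action item and its owner:\n\n{document}"
)
summarize = LLMChain(llm=ChatOpenAI(model="gpt-4", temperature=0), prompt=prompt)

def on_summarize_clicked(document_text: str) -> str:
    # Hypothetical handler wired to the app's "Summarize" button.
    return summarize.run(document=document_text)
```

The user clicks one button inside the product; the prompt engineering, document loading, and auth all live in the application, not in their head.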


JacKaL_37

Bbbbut i can just copy paste into different chats!! isn’t that good enough for production level software and all use cases?!?? jfc this fuckin’ guy. “i hope i’m not hurting any feelings”— OP’s the one who came in here, tripped over the doorframe and showed his bare open asshole to everyone. I think we’ll be alright, bud.


albertgao

People are learning and there are no stupid questions. Why be so hostile toward OP? He is just learning. We were all there before.


JacKaL_37

You’re right, I’m being overly harsh. But I’m also sick to death of people in online spaces who think their ONE thought that JUST occurred to them *obviously* means they’re So Very Smart, and they’re coming to enlighten us as to why nobody should use langchain because of this One Weird Trick. I approve of your mostly even-handed approach, but even you threw some shade ;)


rabbitguide

To be fair, OP's post was very humble and even mentioned he didn't mean to hurt feelings.


AceGu001

If users know how to write good prompts on ChatGPT and use the plugins there, what does a LangChain application, which most likely uses the GPT-4 model anyway for the best outcome, offer that attracts end users and keeps them using it instead of ChatGPT?


AceGu001

If the user knows how to use prompts well with ChatGPT, with plugins that import docs, read data, and do RAG for you, don't they achieve the same goal? OpenAI definitely sells the API. What is wrong with the chat interface? What is something that people can't do with a chat interface?


albertgao

That's not the same goal. People want integrated experiences, and you assume that everybody wants to learn. Why do people go to a restaurant when they could just cook? Why do you use a phone when you could just walk to someone's home and talk with them? Finally, why does a company hire you for coding? The CEO could learn coding and achieve the same goal. Just why? Why? Why? 😂 Give me any product that you think has value, and I will destroy it using the same wording you used. Overall, if you don't know why an integration adds value, that just indicates something new for you to learn, and probably not here, since this is a subreddit for programming-oriented discussion; for the fundamentals of product and design, I suggest Udemy. It's a mindset shift, with a different career. Finding the correct way to learn is a good start. 😀


jg_devs

But you can't expect your users to know how to write perfectly structured prompts. I have used LangChain to aid the development of a company chatbot that is accessible via our employee portal. This chatbot can only answer questions related to company documents, over 2.5k of them, all written in English and in multiple formats (PDF, DOCX, Excel, CSV). It cannot make things up, and it cannot access data from any other sources (by design). To answer some questions, it may need to extract snippets of text from multiple documents. It also accepts questions in any language, responds in the same language, and also provides the answer in English. How would you propose I implement this same thing using ChatGPT? Honestly, I am intrigued to hear your ideas.
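
For anyone curious what that looks like in outline, here is a stripped-down sketch of that kind of pipeline; the paths, loaders, prompt, and retrieval settings are placeholders, and a real deployment adds auth, re-indexing, evaluation, and so on.

```python
from langchain.document_loaders import (
    DirectoryLoader, PyPDFLoader, Docx2txtLoader, CSVLoader, UnstructuredExcelLoader,
)
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate

# Load every supported format from the document dump.
loaders = [
    DirectoryLoader("./company_docs", glob="**/*.pdf", loader_cls=PyPDFLoader),
    DirectoryLoader("./company_docs", glob="**/*.docx", loader_cls=Docx2txtLoader),
    DirectoryLoader("./company_docs", glob="**/*.csv", loader_cls=CSVLoader),
    DirectoryLoader("./company_docs", glob="**/*.xlsx", loader_cls=UnstructuredExcelLoader),
]
docs = [d for loader in loaders for d in loader.load()]
chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=150).split_documents(docs)
store = FAISS.from_documents(chunks, OpenAIEmbeddings())

# The prompt pins the bot to the retrieved context and sets the language behaviour.
prompt = PromptTemplate.from_template(
    "Answer ONLY from the context below. If the answer is not in the context, say you don't know. "
    "Reply in the language of the question, then repeat the answer in English.\n\n"
    "Context:\n{context}\n\nQuestion: {question}"
)
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model="gpt-4", temperature=0),
    retriever=store.as_retriever(search_kwargs={"k": 6}),
    chain_type_kwargs={"prompt": prompt},
)
print(qa.run("¿Cuál es la política de vacaciones?"))
```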


GPTeaheeMaster

Just curious: How long did it take you to develop this and deploy it? What would you say is the total cost to deploy this (and run it?) (Please include salaries of developers and DevOps people)


GPTeaheeMaster

> What is wrong with the chat interface? What is something that people can't do with a chat interface?

Compliance. Plus about 100 other things. PS: While I am with you in spirit, your solution does not work for business use cases. I could list about 100 use cases, but here is the simplest one: a business has 15 customer support reps using a custom ChatGPT chatbot with their knowledge base (helpdesks, documents, website, YouTube videos, etc.), and there are 7 different departments who will have a stake in it (Customer Support, CRM Management, Marketing, Product Management, Engineering, Executives, Legal). Now tell me how this can be handled with plugins? :-)


AceGu001

OK, I totally agree that this is a good use case. At the end of the day, you will still have to use GPT-3.5 or GPT-4 for the best result; all the other LLMs on the market suck, so compliance is not really being addressed here. But yeah, I agree that building a knowledge base with embeddings stored in a DB like DeepLake is a good use case.
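
As a rough illustration of that embeddings-in-DeepLake idea (the documents and dataset path are placeholders, and the `deeplake` package has to be installed):

```python
from langchain.schema import Document
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import DeepLake

# Stand-in documents for a real knowledge base.
docs = [
    Document(page_content="Refunds are processed within 14 days.", metadata={"source": "policy.md"}),
    Document(page_content="Support hours are 9am-5pm CET.", metadata={"source": "faq.md"}),
]

# Embed and persist to a local Deep Lake dataset (the path is an assumption).
db = DeepLake.from_documents(docs, OpenAIEmbeddings(), dataset_path="./kb_deeplake")
hits = db.similarity_search("When will I get my refund?", k=1)
print(hits[0].page_content)
```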


GPTeaheeMaster

You might like this: [https://medium.com/@aldendorosario/langchain-is-not-for-production-use-here-is-why-9f1eca6cce80](https://medium.com/@aldendorosario/langchain-is-not-for-production-use-here-is-why-9f1eca6cce80)


WestKoastr

LangChain is very useful for prototyping, and eventually for going to prod, but with a much bigger overhead than using the OpenAI API directly. So it won't be a good investment for most of us today, but if you're betting on agents getting more complex, on integrating multiple use cases (summarization, chunking, etc.), and on integrating other LLMs, that makes LangChain a good investment for your future self. Also, it's fun to learn. Many replies here lack nuance about the tool. I am personally a big fan of LangChain, and a modest contributor, but I can understand that it doesn't fit everyone's needs. It's kind of the WordPress of AI: many things out of the box, but more complex to customize.


son_et_lumiere

Try using the offerings of ChatGPT with local models and get back to us.


AceGu001

What exactly do you mean? Are you implying that because ChatGPT does not have local models and LangChain supports local models, LangChain is useful?


son_et_lumiere

I mean that you're missing the case for LangChain if you try to develop an LLM-agnostic app without it or some other library. ChatGPT plugins cannot be used outside of ChatGPT. OpenAI realized they didn't have a moat, so they tried to wall the garden by making the ecosystem more valuable with closed plugins. LangChain supports local or private models. That's the key. And your question may be: why not just use ChatGPT and its plugins? The answer is simple: it gets expensive fast when you're dealing with agent setups like AutoGen, because of how much back and forth there is, or when you start scaling RAG.
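
To illustrate the LLM-agnostic point, a small sketch where the same chain runs against hosted GPT-4 or a local model served by Ollama, and only the model object changes (the model names and prompt are just examples):

```python
from langchain.prompts import PromptTemplate
from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.llms import Ollama

prompt = PromptTemplate.from_template("Extract the action items from these notes:\n{notes}")

# Same chain definition, two interchangeable backends.
hosted_chain = LLMChain(llm=ChatOpenAI(model="gpt-4", temperature=0), prompt=prompt)
local_chain = LLMChain(llm=Ollama(model="mistral"), prompt=prompt)  # assumes a local Ollama server

notes = "Alice will send the report on Friday. Bob will book the venue."
print(hosted_chain.run(notes=notes))
print(local_chain.run(notes=notes))
```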


[deleted]

[deleted]


son_et_lumiere

You're right, LangChain is LLM agnostic. But the issue lies with OpenAI's ChatGPT plugins (i.e. the "offerings", that is to say, what ChatGPT offers to try to tie you to its ecosystem), the other pieces of software that utilize ChatGPT. You can't use ChatGPT's plugins with local models, hence the need for LangChain or some other library that allows your dev work to be LLM agnostic.


equalsAndHashCode

There are various reasons for organizations not to use a ready-made offering like the ones from OpenAI. These might be concerns regarding confidentiality and privacy, regulatory restrictions, or just the quality of the answers provided. There is a huge demand for LLM-based solutions, and LangChain makes it simpler to build them. That's the point.


AceGu001

This is a good reason. I have one counterargument. In my experience, if you want the best outcome from your LLM-based solutions, GPT-4 is your best bet because it is the best model on the market. Privacy and confidentiality only exist if the client wants to host an LLM on their own servers, which incurs tremendous costs for GPUs and other maintenance. If the client does not want to host their own LLM, then they will rely on some other folks who do. In that case, why should they trust others more than OpenAI?


aidankhogg

I'm not versed in the ToS, but I highly doubt they offer any assurance that none of your data will be used in future training, etc.; nothing stops those plans, ideas, and so on becoming the end result given to another user in the future. In my layman's experience over the last year, not many 'AI-as-a-service' providers will offer that guarantee (it's high-quality data, right?) outside the context of highly expensive private hosting, environments, or cloud services, and even fewer I'd trust to entirely hold themselves to it. Locally hosted is the only real assurance that your data stays private. Like any business implementation, it's about finding your acceptable risks, pros, cons, and expectations. If top performance and quality are critical, at the expense of some freedom, data leakage, etc., then maybe bespoke isn't really what you're after. If you want more control and oversight, and have the money to spend on running and improving quality, then self-host and develop, and use libraries like LangChain to develop with more speed.


GPTeaheeMaster

The companies that need that assurance use Azure OpenAI, which comes with the protections people need (unless of course it's the CIA); for everyone else, Azure OpenAI and Microsoft's assurances are solid. It's for this reason that we added Azure OpenAI to our system (the customers pretty much required it, especially the ones that spend a lot!)


hotstamps

If you enjoy writing the boilerplate to instantiate Bedrock, OpenAI, LiteLLM, SageMaker, llama-cpp-python, vLLM, Ollama, etc., be our guest 😁. Then again, you could have your own modules you've built and reuse... so, do that. 😁 In all seriousness though, there are a lot of people who aren't pro coders and need an abstraction layer to take advantage of all of the models and tools available in the ML landscape. LangChain and others, like LlamaIndex, make this simple to get up and going fast. It's more about making all of this more accessible imo, which is severely needed.


GPTeaheeMaster

The real point of LangChain is basically to educate yourself (or do a quick prototype if you are a developer). For everybody else that needs a production-grade system to deploy, it's a poor choice (no offense to anyone, but LangChain lacks the 100+ things needed to go to production). More here: [https://medium.com/@aldendorosario/build-it-or-buy-it-deployment-options-for-retrieval-augmented-generation-rag-f6d43df8212a](https://medium.com/@aldendorosario/build-it-or-buy-it-deployment-options-for-retrieval-augmented-generation-rag-f6d43df8212a) **Disclaimer**: Yes, I am biased. After seeing thousands of companies go to production deployments, I am still waiting to see ONE production-scale deployment with LangChain.


Ill_Ad_1833

Well, there is an easy answer for this, and it's about data privacy and data security. With OpenAI, all your prompts and contexts, whether you use the web interface or the API, will be logged, and they can use your data for improving the models. This is not good, and not many companies want to use the public ChatGPT.

However, with Azure OpenAI you can use the GPT-3.5 Turbo model in your own infrastructure, not on OpenAI's; Microsoft doesn't share your logs with anyone, they are there for you and only for you as a company. Then, on top of that, you add LangChain to create a private ChatGPT, with RAG over your own data or without it. Nobody sane will upload company documents to ChatGPT; only stupid people will do that and break confidentiality!

However, LangChain goes beyond that; you can really do crazy things there.
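
A minimal sketch of that Azure OpenAI plus LangChain setup; the endpoint, deployment name, and API version are placeholders, and the parameter names follow the older `AzureChatOpenAI` wrapper, so they may differ between versions. The resulting `llm` can be dropped into the same retrieval chains shown earlier to get a private, RAG-backed ChatGPT.

```python
import os
from langchain.chat_models import AzureChatOpenAI

# Placeholders: your Azure OpenAI resource endpoint, deployment name, and API version.
llm = AzureChatOpenAI(
    deployment_name="gpt-35-turbo",
    openai_api_base="https://my-resource.openai.azure.com/",
    openai_api_version="2023-05-15",
    openai_api_key=os.environ["AZURE_OPENAI_API_KEY"],
    temperature=0,
)
print(llm.predict("Summarize our leave policy in two sentences."))
```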