Hugging Face x LangChain: A new partner package

By versatileai | May 17, 2025
We are excited to announce the launch of langchain_huggingface, a partner package for LangChain that is jointly maintained by Hugging Face and LangChain. This new Python package is designed to bring the power of the latest Hugging Face developments into LangChain and keep it up to date.

All of the Hugging Face-related classes in LangChain were coded by the community, and while we thrived on this, over time some of them became deprecated for lack of an insider's perspective.

By becoming a partner package, we aim to reduce the time it takes to bring new features from the Hugging Face ecosystem to LangChain's users.

langchain-huggingface integrates seamlessly with LangChain, providing an efficient and effective way to utilize Hugging Face models within the LangChain ecosystem. This partnership is not just about sharing technology; it is a joint commitment to maintain and continually improve this integration.

Get started

Getting started with langchain-huggingface is straightforward. Here's how to install and use the package:

pip install langchain-huggingface

Now that the package is installed, let’s take a tour of the contents!

LLMs

HuggingFacePipeline

Among transformers, the Pipeline is the most versatile tool in the Hugging Face toolbox. Since LangChain is designed mainly to address RAG and agent use cases, the scope of the pipeline here is reduced to the following text-centric tasks: "text-generation", "text2text-generation", "summarization", and "translation".

A model can be loaded directly with the from_model_id method:

from langchain_huggingface import HuggingFacePipeline

llm = HuggingFacePipeline.from_model_id(
    model_id="microsoft/Phi-3-mini-4k-instruct",
    task="text-generation",
    pipeline_kwargs={
        "max_new_tokens": 100,
        "top_k": 50,
        "temperature": 0.1,
    },
)
llm.invoke("Hugging Face is")

Alternatively, you can define the pipeline yourself before passing it to the class:

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain_huggingface import HuggingFacePipeline

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_4bit=True,
)
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=100,
    top_k=50,
    temperature=0.1,
)
llm = HuggingFacePipeline(pipeline=pipe)
llm.invoke("Hugging Face is")

When using this class, the model is loaded into cache and runs on your computer's hardware; you may therefore be limited by the resources available on your machine.
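As a rough, dependency-free illustration of that constraint (this sketch is not part of langchain-huggingface), you can report what the local machine offers before loading a multi-gigabyte checkpoint:

```python
import os
import platform

# Minimal sketch: describe the local hardware. HuggingFacePipeline runs
# the model entirely on this machine, so these numbers bound throughput.
def describe_host():
    return f"{platform.machine()} with {os.cpu_count()} CPU cores"

print(describe_host())
```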

HuggingFaceEndpoint

There are two ways to use this class: you can specify the model with the repo_id parameter, or point it at your own deployment with endpoint_url. The repo_id endpoints use the serverless API, which is particularly beneficial to people using Pro accounts or the Enterprise Hub. Still, regular users can already access a fair number of requests by making their HF token available in the environment where the code is running.
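For instance, exporting the token before constructing the class is enough. This is a sketch; HUGGINGFACEHUB_API_TOKEN is the variable name the LangChain integration conventionally reads, so verify it against your installed version:

```python
import os

# Sketch: make a Hugging Face token available to the serverless API
# client via the environment. "hf_xxx" is a placeholder, not a real token.
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_xxx"

token = os.environ["HUGGINGFACEHUB_API_TOKEN"]
print(token.startswith("hf_"))  # real tokens start with this prefix
```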

from langchain_huggingface import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
    max_new_tokens=100,
    do_sample=False,
)
llm.invoke("Hugging Face is")

llm = HuggingFaceEndpoint(
    endpoint_url="",
    task="text-generation",
    max_new_tokens=1024,
    do_sample=False,
)
llm.invoke("Hugging Face is")

Under the hood, this class uses the InferenceClient, which lets it serve a wide variety of use cases, from the serverless API to deployed TGI instances.

ChatHuggingFace

Every model has its own special tokens with which it works best. If those tokens are not added to the prompt, the model will perform considerably worse.

When going from a list of messages to a completion prompt, most LLM tokenizers expose an attribute called chat_template.

For more information about the chat_template of various models, visit this Space I've made!

This class is a wrapper around the other LLM classes: it takes a list of messages as input and then creates the correct completion prompt using the tokenizer.apply_chat_template method.
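To make the idea concrete, here is a minimal pure-Python sketch of what a chat template does, using a hypothetical Llama-3-style layout; the real work is done by tokenizer.apply_chat_template in transformers:

```python
# Minimal sketch of a chat template: render role-tagged messages into a
# single prompt string with the model's special tokens. The layout below
# mimics Llama 3; real templates live on the tokenizer itself.
def render_chat(messages, add_generation_prompt=True):
    prompt = "<|begin_of_text|>"
    for m in messages:
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Cue the model to answer as the assistant.
        prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

prompt = render_chat([{"role": "user", "content": "Hugging Face is"}])
print(prompt)
```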

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    endpoint_url="",
    task="text-generation",
    max_new_tokens=1024,
    do_sample=False,
)
llm_engine_hf = ChatHuggingFace(llm=llm)
llm_engine_hf.invoke("Hugging Face is")

The code above is equivalent to:

llm.invoke("<s>[INST] Hugging Face is [/INST]")

llm.invoke("""<|begin_of_text|><|start_header_id|>user<|end_header_id|>

Hugging Face is<|eot_id|><|start_header_id|>assistant<|end_header_id|>

""")

Embeddings

Hugging Face is filled with very powerful embedding models that you can leverage directly in your pipelines.

First, choose your model. One good resource for choosing an embedding model is the MTEB leaderboard.

HuggingFaceEmbeddings

This class uses sentence-transformers embeddings. It computes the embeddings locally, hence using your computer's resources.

from langchain_huggingface.embeddings import HuggingFaceEmbeddings

model_name = "mixedbread-ai/mxbai-embed-large-v1"
hf_embeddings = HuggingFaceEmbeddings(
    model_name=model_name,
)
texts = ["Hello, world!", "How are you?"]
hf_embeddings.embed_documents(texts)
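embed_documents returns one float vector per input text. A common next step, shown here as a dependency-free sketch (in practice you would use numpy or a vector store), is comparing texts by cosine similarity:

```python
import math

# Sketch: cosine similarity between two embedding vectors.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical vectors score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```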

HuggingFaceEndpointEmbeddings

HuggingFaceEndpointEmbeddings is to embeddings what HuggingFaceEndpoint is to LLMs. It can be used with models on the Hub and with TEI instances, whether they are deployed locally or online.

from langchain_huggingface.embeddings import HuggingFaceEndpointEmbeddings

hf_embeddings = HuggingFaceEndpointEmbeddings(
    model="mixedbread-ai/mxbai-embed-large-v1",
    task="feature-extraction",
    huggingfacehub_api_token="",
)
texts = ["Hello, world!", "How are you?"]
hf_embeddings.embed_documents(texts)

Conclusion

We are committed to making langchain-huggingface better by the day. We will actively monitor feedback and issues and work to address them as quickly as possible. We will also be adding new features, expanding the package to support an even wider range of community use cases. We strongly encourage you to try the package and share your feedback, as it will pave the way for the package's future.

© 2025 Versa AI Hub. All Rights Reserved.