Sentence Transformers joins Hugging Face!

By versatileai · October 22, 2025 · 5 min read



Today we are announcing that Sentence Transformers is moving from Iryna Gurevych’s Ubiquitous Knowledge Processing (UKP) lab at Darmstadt University of Technology to Hugging Face. Tom Aarsen from Hugging Face has already been maintaining the library since late 2023 and will continue to lead the project. In its new home, Sentence Transformers will benefit from Hugging Face’s robust infrastructure, including continuous integration and testing, allowing it to stay up to date with the latest advances in information retrieval and natural language processing.

Sentence Transformers (also known as SentenceBERT or SBERT) is a popular open-source library for generating high-quality embeddings that capture semantic meaning. Since its creation by Nils Reimers in 2019, Sentence Transformers has been widely adopted by researchers and practitioners for natural language processing (NLP) tasks such as semantic search, semantic textual similarity, clustering, and paraphrase mining. After years of development and training by and for the community, over 16,000 Sentence Transformers models are now live on the Hugging Face Hub, serving over 1 million unique users per month.

“Sentence Transformers is a huge success story, and the culmination of many years of research in our lab on computing semantic similarity. Nils Reimers made a very timely discovery and produced not only great research results but also a very easy-to-use tool. I would also like to thank all of our users, and especially our contributors, without whom this project would not be what it is today. And finally, I would like to thank Tom and Hugging Face for taking this project into the future.”

Professor Iryna Gurevych, Director of the Ubiquitous Knowledge Processing Laboratory, Darmstadt University of Technology

“We are thrilled to officially welcome Sentence Transformers to the Hugging Face family. It’s amazing to see this project grow to massive global adoption thanks to its great foundation and the amazing community around it. This is just the beginning. We will continue to redouble our support for its growth and innovation, staying true to the open and collaborative spirit that made this project flourish in the first place.”

Clément Delangue, co-founder and CEO of Hugging Face

Sentence Transformers remains a community-driven open source project under its existing open source license (Apache 2.0). We welcome and encourage contributions from researchers, developers, and hobbyists, and the project will continue to prioritize transparency, collaboration, and broad accessibility.

Project history

The Sentence Transformers library was introduced in 2019 by Dr. Nils Reimers from the Ubiquitous Knowledge Processing (UKP) Laboratory at the Darmstadt University of Technology under the supervision of Professor Iryna Gurevych. Motivated by the limitations of standard BERT embeddings for sentence-level semantic tasks, Sentence-BERT used a Siamese network architecture to generate semantically meaningful sentence embeddings that can be efficiently compared using cosine similarity. Thanks to its modular, open-source design and strong empirical performance in tasks such as semantic text similarity, clustering, and information retrieval, the library quickly became a staple of NLP research toolkits, spawning a variety of follow-up work and real-world applications that rely on high-quality sentence representations.
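The comparison step described above can be illustrated with a small standalone sketch. The vectors below are toy stand-ins for real sentence embeddings (actual models produce hundreds of dimensions), but the cosine-similarity computation is the same one used to compare SBERT embeddings:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" standing in for real sentence vectors.
emb_cat = np.array([0.9, 0.1, 0.0, 0.2])
emb_kitten = np.array([0.8, 0.2, 0.1, 0.3])
emb_invoice = np.array([0.0, 0.9, 0.8, 0.1])

print(cosine_similarity(emb_cat, emb_kitten))   # close to 1.0: similar meaning
print(cosine_similarity(emb_cat, emb_invoice))  # much lower: unrelated
```

Because cosine similarity is cheap to compute, embeddings for a large corpus can be precomputed once and compared against a query vector at search time, which is what makes the Siamese-network design efficient for retrieval.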

Multilingual support was added to the library in 2020, extending sentence embedding to over 400 languages. In 2021, the library was extended to support pairwise sentence scoring with Cross Encoder models alongside the Sentence Transformer models, with contributions from Nandan Thakur and Dr. Johannes Daxenberger, and it gained integration with the Hugging Face Hub (v2.0). For over four years, the UKP Lab team maintained the library as a community-driven open source project, continuously delivering research-driven innovation. During this period, development was supported by grants to Professor Gurevych from the German Research Foundation (DFG), the German Federal Ministry of Education and Research (BMBF), and the Hessian Ministry of Higher Education, Research and the Arts (HMWK).

In late 2023, Tom Aarsen from Hugging Face took over maintenance of the library and introduced updated training for the Sentence Transformer model (v3.0) and improvements to the Cross Encoder (v4.0) and Sparse Encoder (v5.0) models.

Acknowledgments

The Ubiquitous Knowledge Processing (UKP) Laboratory at Darmstadt University of Technology, led by Professor Iryna Gurevych, is internationally known for its research in natural language processing (NLP) and machine learning. The lab has a long track record of pioneering research in representation learning, large-scale language models, and information retrieval, and has published numerous papers at major conferences and in leading journals. In addition to Sentence Transformers, UKP Lab has developed a number of widely used datasets, benchmarks, and open source tools that support both academic research and real-world applications.

Hugging Face would like to thank the UKP Lab and all past and present contributors, especially Dr. Nils Reimers and Professor Iryna Gurevych, for their dedicated contributions to the project and for entrusting us with its upkeep and ongoing maintenance. We would also like to thank the community of researchers, developers, and practitioners who contributed to the library’s success through model contributions, bug reports, feature requests, documentation improvements, and real-world applications. We look forward to building on the strong foundation laid by UKP Lab and working with the community to further advance the capabilities of Sentence Transformers.

Getting started

For those who are new to Sentence Transformers or want to try out its features:
