Education has always been at the heart of Hugging Face's mission to democratize AI, and we're doubling down by giving hf.co/learn a major upgrade! Our NLP course has been a go-to resource for the open-source AI community for the past three years, and it's time for a refresh. We're updating and expanding it to keep up with everything happening in AI (not easy when there's a breakthrough every week!)
We were thrilled by the response to our experimental smol course and the larger agents course, where 100K students signed up to learn about AI agents in a new, fun, and interactive way!
Over the past few months, we expanded the NLP course with new chapters, including fine-tuning LLMs and building reasoning models like DeepSeek R1. As these new chapters no longer fit under the banner of "NLP," we searched for a more relevant and modern title and settled on the LLM Course.
What will happen to the NLP course materials?
We will maintain the existing material focusing on classic NLP tasks such as named entity recognition and text classification. These topics are still important.
Not everything requires an LLM! Students benefit from these simpler tasks, which run locally and are easy to interpret.
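For instance, a classic task like named entity recognition runs in a few lines with the `pipeline` API from `transformers`, entirely on a local machine. The checkpoint name below is just one example of a small NER model from the Hub; any token-classification checkpoint would work:

```python
from transformers import pipeline

# Load a compact NER model locally (example checkpoint; swap in any
# token-classification model from the Hub).
ner = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",
    aggregation_strategy="simple",  # merge sub-word tokens into whole entities
)

# Each result carries the entity type, the matched span, and a confidence score,
# which makes the output easy to inspect and interpret.
for entity in ner("Hugging Face is based in New York City."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 2))
```

Because the model is small, the predictions are fast and the per-entity scores make errors straightforward to diagnose.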
In fact, over the next few months, we will update and modernize these classic chapters to include approaches such as sentence transformers, zero-shot classification, ModernBERT, and more.
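To give a flavor of one of these approaches, here is a toy sketch of the idea behind zero-shot classification: embed the input text and each candidate label in a shared space, then pick the label whose embedding is closest. Real systems use a pretrained encoder (e.g. an NLI or sentence-transformers model); the hand-made vectors below only stand in for those embeddings.

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def zero_shot(text_vec, label_vecs):
    # Score every candidate label against the text; return the best match.
    scores = {label: cosine(text_vec, vec) for label, vec in label_vecs.items()}
    return max(scores, key=scores.get)

# Toy label embeddings: dimensions loosely mean (sports, politics, tech).
labels = {
    "sports": [1.0, 0.1, 0.0],
    "politics": [0.1, 1.0, 0.1],
    "tech": [0.0, 0.1, 1.0],
}
# Pretend embedding of "The striker scored twice last night".
text = [0.9, 0.2, 0.1]

print(zero_shot(text, labels))  # → sports
```

The key property is that no labeled training data is needed for the candidate classes; any label you can phrase as text can be scored at inference time.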
Are there any new chapters?
Yes. We are focusing on adding new chapters that make cutting-edge research accessible to a wider audience, and that interoperate with tools such as Transformers, Spaces, and the Hugging Face Hub. For example, we are adding chapters on fine-tuning, inference, and retrieval.
Rather than focusing solely on Hugging Face libraries, we hope the new chapters will be even more open source and closer to the community. The Transformers library has become a de facto reference for LLM modeling code, but models are consumed, fine-tuned, and deployed by a wide variety of frameworks. That's amazing! For example, in Chapter 11 on fine-tuning LLMs and Chapter 12 on reasoning models, we collaborated on libraries and tools that significantly complement the Hugging Face libraries.
Over the next few months, we plan to expand our collaboration to include more authors, maintainers and companies. We want our materials to cover the best tools you use!
Are there any interactive exercises and live sessions?
Yes, where they serve students. We will focus on reliable written materials and coding exercises that will remain useful in the coming years. For popular topics where students want to engage synchronously, however, we will build interactive exercises and hold live sessions.
What’s next?
To take part in the course, follow the course organization on the Hub, and start a discussion if you wish to contribute, suggest interactive units, or propose live sessions.