🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning (see the minimal usage sketch after this list).
A Unified Library for Parameter-Efficient and Modular Transfer Learning
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
A plug-and-play library for parameter-efficient tuning (Delta Tuning)
A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
Live Training for Open-source Big Models
A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.
Research Trends in LLM-guided Multimodal Learning.
Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning
[Pattern Recognition 2025] Cross-Modal Adapter for Vision-Language Retrieval
K-CAI NEURAL API - a Keras-based neural network API that lets you create parameter-efficient, memory-efficient, FLOPs-efficient multipath models with new layer types. Plenty of examples and documentation are included.
CodeUp: A Multilingual Code Generation Llama-X Model with Parameter-Efficient Instruction-Tuning
This repository surveys papers on prompting and adapters for speech processing.
On Transferability of Prompt Tuning for Natural Language Processing
[CVPR 2024] Code for "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
[INTERSPEECH 2023] Repurposing Whisper to recognize new tasks with adapters!
Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"
TinyEngram is an open research project built on the Qwen model for exploring the Engram architecture.
[EMNLP'25 main] Official Implementation of ModalPrompt: Towards Efficient Multimodal Continual Instruction Tuning with Dual-Modality Guided Prompt
Code for the EACL 2023 paper "Udapter: Efficient Domain Adaptation Using Adapters"
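As a quick orientation to this topic, here is a minimal sketch of the workflow these libraries enable, using the 🤗 PEFT library from the top of the list to attach a LoRA adapter to a Hugging Face model. The base model name and the hyperparameter values are illustrative assumptions, not recommendations:

```python
# Minimal sketch: wrapping a Hugging Face causal LM with a LoRA adapter via 🤗 PEFT.
# "gpt2" and the hyperparameter values below are illustrative choices only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base_model = AutoModelForCausalLM.from_pretrained("gpt2")

lora_config = LoraConfig(
    r=8,                    # rank of the low-rank update matrices
    lora_alpha=16,          # scaling factor applied to the LoRA update
    lora_dropout=0.05,      # dropout on the adapter inputs
    task_type="CAUSAL_LM",  # tells PEFT which model head to expect
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # reports how few parameters are actually trained
```

Only the small LoRA matrices receive gradients; the base model's weights stay frozen, which is the common thread across most of the repositories listed above.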