
Conversation

@kashif (Collaborator) commented Nov 27, 2025

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

if self.vllm_use_lora:
    # Cycle through 2 adapter IDs to force vLLM to reload updated weights while keeping memory usage bounded.
    # vLLM uses lora_int_id as the cache key (see WorkerLoRAManager.add_adapter).
    # By alternating between 2 IDs, we ensure the adapter is reloaded each step without accumulating N adapters.

Member

nice!
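
For readers unfamiliar with the trick, here is a minimal sketch of the two-ID cycling, assuming a standalone vLLM `LLM` instance and an adapter directory that the trainer overwrites every step (the model name, path, and function are illustrative, not the PR's actual code):

```python
from vllm import LLM, SamplingParams
from vllm.lora.request import LoRARequest

# Base model served by vLLM with LoRA support enabled (model name is a placeholder).
llm = LLM(model="Qwen/Qwen2.5-0.5B-Instruct", enable_lora=True)
sampling_params = SamplingParams(temperature=0.9, max_tokens=64)

# Directory the trainer overwrites with the current adapter weights each step (assumed path).
adapter_path = "/tmp/trl_vllm_lora"


def generate_with_current_adapter(prompts, step):
    # Alternate between lora_int_id 1 and 2: vLLM keys its adapter cache on lora_int_id,
    # so switching the ID each step makes it pick up the freshly saved weights instead of
    # a stale cached copy, while never holding more than two adapters.
    lora_int_id = (step % 2) + 1
    lora_request = LoRARequest(f"adapter_{lora_int_id}", lora_int_id, adapter_path)
    return llm.generate(prompts, sampling_params, lora_request=lora_request)
```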

if self.vllm_use_lora:
    import tempfile

    # Create a temporary directory to save the LoRA adapter (only once)

Member

any edge case in case it's training on a distributed/multi-node setup?
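
One way the distributed case could be handled, as a minimal sketch assuming a PEFT-wrapped policy and an Accelerate `accelerator` (not the PR's actual code): save the adapter from the main process only and synchronize before generation. On multi-node setups the directory would additionally have to be reachable by the vLLM workers, e.g. via shared storage.

```python
import tempfile

from accelerate import Accelerator

accelerator = Accelerator()

# Create the temporary directory once and reuse it across steps.
lora_tmp_dir = tempfile.mkdtemp(prefix="trl_vllm_lora_")


def save_adapter_for_vllm(model):
    # Only the main process writes the adapter files, so ranks don't race on the same path.
    if accelerator.is_main_process:
        # `model` is assumed to be a peft.PeftModel; save_pretrained writes
        # adapter_config.json and the adapter weights into the directory.
        accelerator.unwrap_model(model).save_pretrained(lora_tmp_dir)
    # Block until the files are fully written before any rank requests generation.
    accelerator.wait_for_everyone()
    return lora_tmp_dir
```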

@kashif kashif changed the title from "[online-dpo] add vllm lora adapter support" to "[online trainers] add vllm lora adapter support" on Dec 8, 2025
@qgallouedec (Member)

Could you explain the context of this PR? In my understanding, if you use LoRA with vLLM, generation is expected to be slower. Why not just merge the adapter before training?
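
For reference, the merge-based alternative mentioned above would look roughly like this with PEFT (a sketch only; model name and paths are placeholders):

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model and the trained LoRA adapter (paths are placeholders).
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")
model = PeftModel.from_pretrained(base, "/tmp/trl_vllm_lora")

# Fold the adapter weights into the base weights so vLLM can serve a plain model,
# without enable_lora and without the per-request LoRA overhead.
merged = model.merge_and_unload()
merged.save_pretrained("/tmp/merged_model")
```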
