[train][vllm] Add enable_log_requests and max_log_len support#1071

Merged
CharlieFRuan merged 1 commit into NovaSky-AI:main from CharlieFRuan:pr-vllm-log-req
Feb 11, 2026
Conversation


@CharlieFRuan CharlieFRuan commented Feb 11, 2026

Support +generator.engine_init_kwargs.enable_log_requests=true and +generator.engine_init_kwargs.max_log_len=256 for the vLLM async engine.
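The override above suggests the generator collects extra engine_init_kwargs and forwards them when constructing the vLLM async engine. A minimal sketch of that forwarding, with a version gate like the one the review mentions (function name build_engine_kwargs and the version threshold are illustrative assumptions, not the PR's actual code):

```python
def build_engine_kwargs(base_kwargs, engine_init_kwargs, vllm_version=(0, 8, 0)):
    """Merge user-supplied engine_init_kwargs into the base engine kwargs.

    The logging flags are only forwarded on vLLM versions assumed to accept
    them; passing unknown keyword args to the engine would raise TypeError.
    """
    logging_flags = ("enable_log_requests", "max_log_len")
    merged = dict(base_kwargs)
    for key, value in engine_init_kwargs.items():
        if key in logging_flags and vllm_version < (0, 8, 0):
            # Hypothetical gate: silently drop flags this vLLM version
            # does not support instead of crashing at engine init.
            continue
        merged[key] = value
    return merged
```

For example, build_engine_kwargs({"model": "m"}, {"enable_log_requests": True, "max_log_len": 256}) would yield engine kwargs that include both logging options, while an older assumed version tuple would drop them.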


Contributor

@devin-ai-integration devin-ai-integration bot left a comment


Devin Review found 1 potential issue.

Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces request-logging support in the vLLM async engine via the enable_log_requests and max_log_len options. The implementation is secure by default, but there is a medium-severity risk: sensitive user data could be logged if the feature is enabled in a production environment. It is recommended to add a strong warning in the code to alert operators to this risk. Additionally, a fix for a version check has been incorporated to prevent crashes on vLLM versions that do not accept these options.
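The operator warning the review recommends could look like the following sketch (the helper name warn_if_request_logging is hypothetical; the point is simply to emit a loud log line whenever the flag is enabled):

```python
import logging

logger = logging.getLogger(__name__)


def warn_if_request_logging(engine_init_kwargs):
    """Warn operators when request logging is enabled.

    With enable_log_requests set, raw request contents (prompts) may be
    written to the engine logs, which can leak sensitive user data in
    production. Returns True if a warning was emitted.
    """
    if engine_init_kwargs.get("enable_log_requests"):
        logger.warning(
            "enable_log_requests=True: request prompts will appear in vLLM "
            "engine logs; avoid enabling this where requests contain "
            "sensitive data."
        )
        return True
    return False
```

Gating the warning on the merged kwargs (rather than on a config flag alone) keeps it accurate even if the option is dropped by a version check before reaching the engine.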

@CharlieFRuan CharlieFRuan merged commit 81fc538 into NovaSky-AI:main Feb 11, 2026
3 checks passed
@CharlieFRuan CharlieFRuan deleted the pr-vllm-log-req branch February 11, 2026 17:40