
quant support #150

Closed
zhyncs opened this issue Mar 4, 2024 · 1 comment
zhyncs commented Mar 4, 2024

Hi @yzh119, thank you for your excellent work. Are there any current plans to support quantization, such as AWQ, SmoothQuant, KV Cache Int8, or KV Cache FP8? Thanks.

yzh119 self-assigned this Mar 4, 2024
yzh119 added a commit that referenced this issue Mar 5, 2024

yzh119 commented Mar 5, 2024

Hi @zhyncs, KV Cache Int8 and KV Cache FP8 are mentioned in #125.

Regarding AWQ and SmoothQuant, I suppose the most critical operators are fused dequant+gemv and fused dequant+gemm, and existing libraries already support them well. I want to avoid duplicate work, so I'd rather contribute any missing operators to those libraries instead.
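For readers unfamiliar with the operator being discussed: below is a minimal NumPy sketch of what a dequant+gemv computes, written unfused for clarity (a real kernel fuses the dequantization into the matrix-vector product to avoid materializing the full-precision weights). The group-wise scale/zero-point layout and shapes here are illustrative assumptions, not the actual AWQ or library format.

```python
import numpy as np

def dequant_gemv(q_weight, scales, zeros, x, group_size=128):
    """Dequantize group-quantized weights, then do a GEMV.

    q_weight : (out, in) integer codes (e.g. 4-bit values stored in int8)
    scales   : (out, in // group_size) per-group scales
    zeros    : (out, in // group_size) per-group zero points
    x        : (in,) activation vector

    Hypothetical layout for illustration only; fused kernels
    perform the dequant inline rather than building `w`.
    """
    out_f, in_f = q_weight.shape
    g = in_f // group_size
    # Dequantize: w = (q - zero) * scale, broadcast per group.
    w = (q_weight.astype(np.float32).reshape(out_f, g, group_size)
         - zeros[:, :, None]) * scales[:, :, None]
    # GEMV on the reconstructed full-precision weights.
    return w.reshape(out_f, in_f) @ x
```

With unit scales and zero offsets this reduces to a plain matrix-vector product, which makes the dequantization step easy to sanity-check.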

zhyncs closed this as completed Mar 7, 2024