
Commit

update index
yzh119 committed Jun 22, 2024
1 parent 0efc430 commit dc1cb4b
Showing 6 changed files with 24 additions and 0 deletions.
4 changes: 4 additions & 0 deletions cu118/torch2.1/flashinfer/index.html
@@ -15,3 +15,7 @@ <h1>FlashInfer Python Wheels for CUDA 11.8 + torch 2.1.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.1-cp311-cp311-linux_x86_64.whl#sha256=8cf8c9c0420bfd4bcef172f46b2a0f8de4cbc0c344e44778f0be1793e1b3d97f">flashinfer-0.0.5+cu118torch2.1-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.1-cp38-cp38-linux_x86_64.whl#sha256=cd00d06fa2d354b6c76d6d3f347fb841ea020fcb3c2d15e3df5218c742ee54a3">flashinfer-0.0.5+cu118torch2.1-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.1-cp39-cp39-linux_x86_64.whl#sha256=4d3bdc49c394e01f57cc3b5cd0c7679b58c9a8b65dc0fcf853f8c91a4bf2912d">flashinfer-0.0.5+cu118torch2.1-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.1-cp310-cp310-linux_x86_64.whl#sha256=09543abddb3162401727d862dd6593a2517fb28c077b491d618ff840e5bc54f5">flashinfer-0.0.6+cu118torch2.1-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.1-cp311-cp311-linux_x86_64.whl#sha256=7fed76601e31d15464e300d8b1bb9449e3cff0d27ec315311f7a9ff105912c84">flashinfer-0.0.6+cu118torch2.1-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.1-cp38-cp38-linux_x86_64.whl#sha256=04df8fe316d0fafee19a0806cfd1e20c8a74f57192778bf636b800984c660306">flashinfer-0.0.6+cu118torch2.1-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.1-cp39-cp39-linux_x86_64.whl#sha256=18192ca898bf97edcff5983fd398822516221a6ea561c0e00b8ee8378a9d2ef8">flashinfer-0.0.6+cu118torch2.1-cp39-cp39-linux_x86_64.whl</a><br>
4 changes: 4 additions & 0 deletions cu118/torch2.2/flashinfer/index.html
@@ -15,3 +15,7 @@ <h1>FlashInfer Python Wheels for CUDA 11.8 + torch 2.2.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.2-cp311-cp311-linux_x86_64.whl#sha256=38fc5d1ff34e6e7c6e12c016eae1211d23c164500af2a8380f705972cdb0b574">flashinfer-0.0.5+cu118torch2.2-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.2-cp38-cp38-linux_x86_64.whl#sha256=fa6638575fafc140b0ab8b8e0183498c81374d1076adafaea402eb39ee5daded">flashinfer-0.0.5+cu118torch2.2-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.2-cp39-cp39-linux_x86_64.whl#sha256=e7c28d606813d11414fce47fb24628aec48c96f6794d6d4e7210b7a5f407d804">flashinfer-0.0.5+cu118torch2.2-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.2-cp310-cp310-linux_x86_64.whl#sha256=da7ab432932c0def91409f690584454ec8cedd37d46b02028e3637c1013096ed">flashinfer-0.0.6+cu118torch2.2-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.2-cp311-cp311-linux_x86_64.whl#sha256=d9d33f7c419c3741eea8885300e5234ab5109e5c61ca36f78c1f508f9aba0904">flashinfer-0.0.6+cu118torch2.2-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.2-cp38-cp38-linux_x86_64.whl#sha256=3660cb36e35a36940b8e19a0e3fedd52eb2abb3e361a1fb63019623df83e6a9a">flashinfer-0.0.6+cu118torch2.2-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.2-cp39-cp39-linux_x86_64.whl#sha256=c02a7055f34e70d9a2ce63f33806e39a9d82294dca207404502eb50fe08b8697">flashinfer-0.0.6+cu118torch2.2-cp39-cp39-linux_x86_64.whl</a><br>
4 changes: 4 additions & 0 deletions cu118/torch2.3/flashinfer/index.html
@@ -8,3 +8,7 @@ <h1>FlashInfer Python Wheels for CUDA 11.8 + torch 2.3.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.3-cp311-cp311-linux_x86_64.whl#sha256=dfd53efe696bbedfc5b800e217d7593de26b4250208a25185fdce0869ddb68a9">flashinfer-0.0.5+cu118torch2.3-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.3-cp38-cp38-linux_x86_64.whl#sha256=361e3e27aa5810cd8fe51db49ac52ce4f65759fe200463321fca377d83a18186">flashinfer-0.0.5+cu118torch2.3-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu118torch2.3-cp39-cp39-linux_x86_64.whl#sha256=771ab65c964a6b82b1999fa54b6aeacef9a12e286c09084792c0910cd4b961d7">flashinfer-0.0.5+cu118torch2.3-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.3-cp310-cp310-linux_x86_64.whl#sha256=c0e987d9c7a0486fa0108331fa01a57f73c2da2e503db6380376ff20990deb40">flashinfer-0.0.6+cu118torch2.3-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.3-cp311-cp311-linux_x86_64.whl#sha256=f4efbceae0d891640fb2eaa6d9966c15d106f0bbfa8daaef421d1233c0bd7b44">flashinfer-0.0.6+cu118torch2.3-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.3-cp38-cp38-linux_x86_64.whl#sha256=a6f8a2f8be56e8738a170843e8ef0d4b3e0200c96d65917b5fb7b83391c07a9c">flashinfer-0.0.6+cu118torch2.3-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu118torch2.3-cp39-cp39-linux_x86_64.whl#sha256=568bf1c0e442cb6a47464ed42f9b5c41f522319187b91e3087b233dc22866869">flashinfer-0.0.6+cu118torch2.3-cp39-cp39-linux_x86_64.whl</a><br>
4 changes: 4 additions & 0 deletions cu121/torch2.1/flashinfer/index.html
@@ -15,3 +15,7 @@ <h1>FlashInfer Python Wheels for CUDA 12.1 + torch 2.1.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.1-cp311-cp311-linux_x86_64.whl#sha256=f530fe6a33967917283e6c887a4e62d7a152679d17271d5dcd30a73e46fc5b2a">flashinfer-0.0.5+cu121torch2.1-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.1-cp38-cp38-linux_x86_64.whl#sha256=bccc88c735a1fef080fde56597c0c623895378433f8e5a9c47c3c9a12f5078a4">flashinfer-0.0.5+cu121torch2.1-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.1-cp39-cp39-linux_x86_64.whl#sha256=655dd652424a60ca5675c3a9ae5cae2bc941dbc27a7cda22edbefe6c8d0ccf59">flashinfer-0.0.5+cu121torch2.1-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.1-cp310-cp310-linux_x86_64.whl#sha256=98577401ad8afdc2a51c6af6e94ada466626cf07454dfe7f2d61d70d629d2a10">flashinfer-0.0.6+cu121torch2.1-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.1-cp311-cp311-linux_x86_64.whl#sha256=6c540defce91bc9769fe25afe00249803b13d87e7672089931bb28d57d876d25">flashinfer-0.0.6+cu121torch2.1-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.1-cp38-cp38-linux_x86_64.whl#sha256=20821a19e13cbbf98b8548ee344e244744e21da165dfae46816f037cda5932ae">flashinfer-0.0.6+cu121torch2.1-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.1-cp39-cp39-linux_x86_64.whl#sha256=95fd4685eae081abf9cc7da1639754c340fbbb0b574641a4778440dbb662e022">flashinfer-0.0.6+cu121torch2.1-cp39-cp39-linux_x86_64.whl</a><br>
4 changes: 4 additions & 0 deletions cu121/torch2.2/flashinfer/index.html
@@ -15,3 +15,7 @@ <h1>FlashInfer Python Wheels for CUDA 12.1 + torch 2.2.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.2-cp311-cp311-linux_x86_64.whl#sha256=f9e70009a60b2dcf3884e5844a46fb79b0c91078ac96798e98a4718ed62cca57">flashinfer-0.0.5+cu121torch2.2-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.2-cp38-cp38-linux_x86_64.whl#sha256=d037992d5c97e2037e6fd558c911acf55f2f2b0f31f2b647510c6c5ea6f88f84">flashinfer-0.0.5+cu121torch2.2-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.2-cp39-cp39-linux_x86_64.whl#sha256=ca442b64418a20efad07f2890b16e7c3c72d5b5b02381b01cf5143cf9dbefb39">flashinfer-0.0.5+cu121torch2.2-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.2-cp310-cp310-linux_x86_64.whl#sha256=5239ef7ec058329589b1f0ccdd3396e46c267e97fa3c94952661c227bdf0839b">flashinfer-0.0.6+cu121torch2.2-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.2-cp311-cp311-linux_x86_64.whl#sha256=eeb409bb01d7860576af22a84dde6ea673706c215c751bc459b650608607e238">flashinfer-0.0.6+cu121torch2.2-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.2-cp38-cp38-linux_x86_64.whl#sha256=73f729f8e83a55cc8c4ac3917bc9db5ed887419a9e9d8e24df6eab98eb718d22">flashinfer-0.0.6+cu121torch2.2-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.2-cp39-cp39-linux_x86_64.whl#sha256=5bb5cda5770ca7b074105475812c5a2359db32d7dcc6afdcdac661857efb9b30">flashinfer-0.0.6+cu121torch2.2-cp39-cp39-linux_x86_64.whl</a><br>
4 changes: 4 additions & 0 deletions cu121/torch2.3/flashinfer/index.html
@@ -8,3 +8,7 @@ <h1>FlashInfer Python Wheels for CUDA 12.1 + torch 2.3.0</h1>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.3-cp311-cp311-linux_x86_64.whl#sha256=3b73c40ec7b754eb88f135c4a29ca91a83defc58390a82f40ac34bef151d503b">flashinfer-0.0.5+cu121torch2.3-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.3-cp38-cp38-linux_x86_64.whl#sha256=9c0c0d0af6c3a330b67bc686d735ef99fb9cae9cb595a72248472d46b8cb82b4">flashinfer-0.0.5+cu121torch2.3-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.5/flashinfer-0.0.5+cu121torch2.3-cp39-cp39-linux_x86_64.whl#sha256=05399a376b1aaa555a7827b6eb1bdb5e55c5eff3da4e3c335506834bdbdb7971">flashinfer-0.0.5+cu121torch2.3-cp39-cp39-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.3-cp310-cp310-linux_x86_64.whl#sha256=edf4a8e53eabd187a93b46571b1c5ee56f7a88db37ffa88d03253f64f94a6b7b">flashinfer-0.0.6+cu121torch2.3-cp310-cp310-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.3-cp311-cp311-linux_x86_64.whl#sha256=e9cb75435c7ce4d811c860b4246b2333c5c0e5a98f366dd9ba4daabb91f8f539">flashinfer-0.0.6+cu121torch2.3-cp311-cp311-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.3-cp38-cp38-linux_x86_64.whl#sha256=f0a22ff9b1abf75a805597a0de27e5036a4a6013bde6ce20d4cad9984350d0ce">flashinfer-0.0.6+cu121torch2.3-cp38-cp38-linux_x86_64.whl</a><br>
<a href="https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/flashinfer-0.0.6+cu121torch2.3-cp39-cp39-linux_x86_64.whl#sha256=c012cf6ccb238fb3cfc33df86ba029ccd9d49472465b2412560b8fdf5ac936c9">flashinfer-0.0.6+cu121torch2.3-cp39-cp39-linux_x86_64.whl</a><br>
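These index pages appear to follow the PEP 503 "simple repository" layout: one index.html per CUDA/torch combination, with per-wheel anchor links whose #sha256= URL fragments pin the expected digest, so an installer such as pip can verify each downloaded wheel against its published hash. As a minimal illustrative sketch (not part of this commit), the Python snippet below reproduces that hash check for one of the v0.0.6 wheels added here, assuming the wheel file has already been downloaded to a local path of the same name (that path is hypothetical):

import hashlib
from urllib.parse import urlparse

def expected_sha256(wheel_url: str) -> str:
    # The index links end in '#sha256=<hex digest>'; pull the digest out of the URL fragment.
    fragment = urlparse(wheel_url).fragment          # e.g. 'sha256=18192ca8...'
    algo, _, digest = fragment.partition("=")
    if algo != "sha256":
        raise ValueError(f"unexpected hash algorithm: {algo!r}")
    return digest

def actual_sha256(path: str) -> str:
    # Stream the local wheel through SHA-256 in 1 MiB chunks.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # One of the v0.0.6 links added in this commit.
    url = ("https://github.com/flashinfer-ai/flashinfer/releases/download/v0.0.6/"
           "flashinfer-0.0.6+cu118torch2.1-cp39-cp39-linux_x86_64.whl"
           "#sha256=18192ca898bf97edcff5983fd398822516221a6ea561c0e00b8ee8378a9d2ef8")
    local_wheel = "flashinfer-0.0.6+cu118torch2.1-cp39-cp39-linux_x86_64.whl"  # hypothetical local path
    match = expected_sha256(url) == actual_sha256(local_wheel)
    print("sha256 OK" if match else "sha256 MISMATCH")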
