
Commit

Update README.md
ridgerchu committed Apr 24, 2024
1 parent 1874362 commit 039e76d
Showing 1 changed file with 5 additions and 10 deletions.
README.md: 15 changes (5 additions, 10 deletions)
@@ -1,15 +1,10 @@
-<div align="center">
-
-# Flash Linear Attention
-
-[](https://huggingface.co/fla-hub)
+<div align=center>
+<img src="__assets__/magictime_logo.png" width="150px">
+</div>
+<h2 align="center">MatMul-Free LLM</h2>
+<h5 align="center"> If you like our project, please give us a star ⭐ on GitHub for the latest update. </h2>
 
-This repo aims at providing a collection of efficient Triton-based implementations for state-of-the-art linear attention models.
-
-<div align="center">
-<img width="400" alt="image" src="https://github.com/sustcsonglin/flash-linear-attention/assets/18402347/02ff2e26-1495-4088-b701-e72cd65ac6cf">
-</div>
 <h5 align="center">
 
 # Installation
 

0 comments on commit 039e76d
