- Sapienza University of Rome
- Rome, Italy
- https://caesar.one
- @caesar_one_
- in/caesar-one
Highlights
- Pro
Stars
- Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch
- A collection of LLM-related papers, theses, tools, datasets, courses, open-source models, and benchmarks
- SUPIR aims at developing Practical Algorithms for Photo-Realistic Image Restoration In the Wild. Our new online demo is also released at suppixel.ai.
- RateNinja: A Python package for efficient, rate-limited API calling with multithreading and automatic retry support.
- InstantID: Zero-shot Identity-Preserving Generation in Seconds 🔥
- Best Practices on Recommendation Systems
- A collection of datasets for language model pretraining, including scripts for downloading, preprocessing, and sampling.
- An infinite number of monkeys randomly throwing paint at a canvas
- 😎 Awesome list of tools and projects with the awesome LangChain framework
- The repository for the code of the UltraFastBERT paper
- Drop in a screenshot and convert it to clean code (HTML/Tailwind/React/Vue)
- A blazing-fast inference solution for text embedding models
- A prompting enhancement library for transformer-type text embedding systems
- All Algorithms implemented in Python
- A modular RL library to fine-tune language models to human preferences
- Argilla is a collaboration tool for AI engineers and domain experts to build high-quality datasets
- PyTorch Lightning + Hydra + HuggingFace Transformers. A very user-friendly template for rapid and reproducible ML experimentation with best practices. ⚡🔥⚡
- Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
- Model explainability that works seamlessly with 🤗 Transformers. Explain your transformers model in just 2 lines of code.
- Easily turn large sets of image URLs into an image dataset. Can download, resize, and package 100M URLs in 20h on one machine.
- A word-level Transformer layer based on PyTorch and 🤗 Transformers.
- 🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
- Reference implementation of the Transformer architecture optimized for the Apple Neural Engine (ANE)
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets.