Stars
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Get up and running with Llama 3, Mistral, Gemma 2, and other large language models.
A latent text-to-image diffusion model
High-Resolution Image Synthesis with Latent Diffusion Models
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Google Research
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
LabelImg is now part of the Label Studio community. The popular image annotation tool created by Tzutalin is no longer actively being developed, but you can check out Label Studio, the open source …
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
Tool for producing high quality forecasts for time series data that has multiple seasonality with linear or non-linear growth.
Open source platform for the machine learning lifecycle
Materials for the Learn PyTorch for Deep Learning: Zero to Mastery course.
Hydra is a framework for elegantly configuring complex applications
A series of Jupyter notebooks that walk you through the fundamentals of Machine Learning and Deep Learning in Python using Scikit-Learn, Keras and TensorFlow 2.
A state-of-the-art open visual language model | multimodal pre-trained model
A Library for Advanced Deep Time Series Models.
The GitHub repository for the paper "Informer" accepted by AAAI 2021.
All course materials for the Zero to Mastery Deep Learning with TensorFlow course.
PyTorch Lightning + Hydra. A very user-friendly template for ML experimentation. ⚡🔥⚡
Time series forecasting with PyTorch
Official repo for "Mini-Gemini: Mining the Potential of Multi-modality Vision Language Models"
4 bits quantization of LLaMA using GPTQ
Implementation of Segnet, FCN, UNet, PSPNet and other models in Keras.
PyTorch implementation of U-Net, R2U-Net, Attention U-Net, and Attention R2U-Net.