Ranger-Deep-Learning-Optimizer
Ranger - a synergistic optimizer using RAdam (Rectified Adam), Gradient Centralization, and LookAhead in one codebase

Ranger21
Ranger deep learning optimizer rewrite to use the newest components

Best-Deep-Learning-Optimizers
Collection of the latest, greatest deep learning optimizers (for PyTorch) - CNN and NLP suitable

mish
Mish deep learning activation function for PyTorch / FastAI

res2net-plus
Res2Net architecture with improved stem and Mish activation function

Ranger-Mish-ImageWoof-5
Repo to build on / reproduce the record-breaking Ranger-Mish-SelfAttention setup on the FastAI ImageWoof dataset in 5 epochs

training-detr
Unofficial Colab on how to train DETR, the intelligent object detector, with your own dataset. DETR = Detection Transformer

transformer_central
Various transformers for FSDP research

FAdam_PyTorch
An implementation of FAdam (Fisher Adam) in PyTorch

Ranger22
Testing various improvements to Ranger21 for 2022

mrnet-fastai
Deep learning CNN using FastAI for the Stanford MRNet knee MRI diagnosis challenge

triton_kernels_for_fun_and_profit
Custom kernels in the Triton language for accelerating LLMs

Thunder-Detr
Unofficial customized fork of DETR, optimized for intelligent object detection on 'real world' custom datasets

fsdp_llm
FSDP optimizations for LLM training

t5_11
Model example of fine-tuning an 11B T5 with FSDP

transformer_framework
Framework for plug-and-play of various transformers (vision and NLP) with FSDP

FTSwishPlus
FTSwish with mean shifting added to increase performance

hyper_efficient_optimizers
Development of hyper-efficient optimizers that can match/exceed AdamW while using reduced memory

fsdp_review
Eval and profiling routines for FSDP

auto-adaptive-ai
Auto-adaptive framework for intrinsic hyperparameter selection, adaptive padding, and normalized weights

TRelu
An improved activation function for deep learning - Threshold ReLU, or TRelu

sigma_reparam
Sigma Reparam for Transformers (based on Apple's paper)

EfficientNet-PyTorch
Unofficial port of Google's EfficientNet to PyTorch and FastAI

RangerQH-Testing
Repo for running RangerQH + Res2NetPlus with LIP Pooling

facial-keypoint-detection
Facial keypoint detection CNN - custom architecture using partial convolution padding

AutoOpt-for-FastAI
Integrates eBay's AutoOpt deep learning optimizer into the FastAI framework

skycraft2
Minecraft in the sky, written in Python

perception_tools
Additional utils for working with the Unity Perception package

QuantFour_AdamW_Cuda
Fused 4-bit AdamW in CUDA

PolarBearLLM
Testing new Transformer, MoE, and TransNormer features

unet-seg

FTSwish
Flattened Threshold Swish activation function - PyTorch implementation

coordinate_clipped_Optimizers
Coordinate-wise clipped optimizers in PyTorch

snowfall
Helpful image-handling utils - abstracts various file, OpenCV, and PIL features into result-oriented functions

style-transfer-vgg
Artistic style transfer using VGG19

cuda-kernel-dev
In-progress CUDA kernels

Curriculum-Learning-Dropout
Implementation of Curriculum Learning Dropout for the FastAI framework

medExam
Training an AI with FSDP to take the US medical exam

5D-Compiler
Auto-parallelization compiler using 4D parallelism + checkpointing (5D)

aot_fsdp
When AOT Autograd meets FSDP = large models train faster

alibi_positional_embeddings
ALiBi in PyTorch

optimal-lr-finder
Automated optimal learning rate finder for PyTorch deep learning with FastAI

ft_linen
Experiments with a Flax redesign to interop with PyTorch

linear-graph-slam
Linear Graph SLAM

bfloat_optimizer
Pure bfloat16 AdamW + tweaks

snake-id
FastAI deep learning classifier for snakes

Thunder
AI framework for flexible training and results review (PyTorch; vision and tabular)

t5_finetuning
T5 and ExT5 fine-tuning

pretrainer
FSDP codebase for pretraining large language models (LLMs)

Fusion
Advanced yet low-code framework for fully sharded distributed training

hsdp_demo
Tutorial repo for PyTorch FSDP running HSDP on a single node

image-captioning-cnn-lstm
Image captioning system combining a CNN + LSTM for caption generation

self-tuning-ai
Implementation of self-tuning networks in PyTorch, based on https://arxiv.org/pdf/1903.03088v1.pdf

triton_flashv2_alibi
Working repo for Triton-based Flash2 supporting ALiBi positional embeddings

Pytorch_train_test_split
Function to randomize and split training data into train/test sets, from the same directory
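The Ranger entries above combine RAdam with LookAhead. The LookAhead half is easy to sketch: an inner optimizer updates "fast" weights every step, and every k steps the "slow" weights are interpolated toward them, after which the fast weights are reset to the slow point. A minimal sketch of that outer step (the function name and the alpha default are illustrative, not Ranger's actual API):

```python
def lookahead_outer_step(slow, fast, alpha=0.5):
    # LookAhead outer update, run once every k inner optimizer steps:
    #   slow <- slow + alpha * (fast - slow)
    # then the fast weights restart from the new slow point.
    new_slow = [s + alpha * (f - s) for s, f in zip(slow, fast)]
    return new_slow, list(new_slow)
```

With alpha = 0.5 the slow weights move halfway toward wherever the inner optimizer wandered, which damps oscillation while keeping the inner optimizer's exploration.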
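The `mish` repo above implements the Mish activation, which has a simple closed form: mish(x) = x * tanh(softplus(x)). A minimal pure-Python sketch (the helper names are mine, not the repo's):

```python
import math

def softplus(x: float) -> float:
    # Numerically stable softplus: log(1 + exp(x)).
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def mish(x: float) -> float:
    # Mish activation: x * tanh(softplus(x)).
    # Smooth, non-monotonic, bounded below, unbounded above.
    return x * math.tanh(softplus(x))
```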
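The `FTSwish` repo above implements Flattened Threshold Swish. As I understand the design, negative inputs are flattened to a fixed negative threshold T, while non-negative inputs follow Swish shifted by T; the default threshold here is an assumed illustrative value, not read from the repo:

```python
import math

def ftswish(x: float, threshold: float = -0.2) -> float:
    # Flattened Threshold Swish (sketch):
    #   x <  0 -> T                       (flat negative region)
    #   x >= 0 -> x * sigmoid(x) + T      (Swish, shifted by T)
    # The -0.2 default is an assumption for illustration.
    if x < 0.0:
        return threshold
    return x * (1.0 / (1.0 + math.exp(-x))) + threshold
```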
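The `Pytorch_train_test_split` entry describes randomizing and splitting files from a single directory into train/test sets. A minimal sketch of that idea (function name, seed, and split ratio are illustrative, not the repo's actual API):

```python
import random
from pathlib import Path

def train_test_split_dir(data_dir, test_pct=0.2, seed=42):
    # Gather files deterministically, shuffle with a fixed seed,
    # then slice into disjoint train/test lists.
    files = sorted(p for p in Path(data_dir).iterdir() if p.is_file())
    rng = random.Random(seed)
    rng.shuffle(files)
    n_test = int(len(files) * test_pct)
    return files[n_test:], files[:n_test]
```

Sorting before shuffling makes the split reproducible across runs regardless of filesystem listing order.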