Time-Series-Library
A Library for Advanced Deep Time Series Models.

Transfer-Learning-Library
Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization.

Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008

iTransformer
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah

Anomaly-Transformer
Code release for "Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy" (ICLR 2022 Spotlight), https://openreview.net/forum?id=LzQQ89U1qm_

TimesNet
Code release for "TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis" (ICLR 2023), https://openreview.net/pdf?id=ju_Uqw384Oq

awesome-multi-task-learning
An up-to-date (2024) list of datasets, codebases, and papers on Multi-Task Learning (MTL), from a machine learning perspective.

Xlearn
Transfer Learning Library.

Nonstationary_Transformers
Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415

predrnn-pytorch
Official implementation for the NIPS 2017 paper "PredRNN: Recurrent Neural Networks for Predictive Learning Using Spatiotemporal LSTMs".

depyf
depyf is a tool to help you understand and adapt to torch.compile, the PyTorch compiler.

CDAN
Code release for "Conditional Adversarial Domain Adaptation" (NIPS 2018).

Flowformer
Code release for "Flowformer: Linearizing Transformers with Conservation Flows" (ICML 2022), https://arxiv.org/pdf/2202.06258.pdf

Universal-Domain-Adaptation
Code release for "Universal Domain Adaptation" (CVPR 2019).

HashNet
Code release for "HashNet: Deep Learning to Hash by Continuation" (ICCV 2017).

Large-Time-Series-Model
Official code, datasets, and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024).

LogME
Code release for "LogME: Practical Assessment of Pre-trained Models for Transfer Learning" (ICML 2021) and "Ranking and Tuning Pre-trained Models: A New Paradigm for Exploiting Model Hubs" (JMLR 2022).

Koopa
Code release for "Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors" (NeurIPS 2023), https://arxiv.org/abs/2305.18803

A-Roadmap-for-Transfer-Learning
Corrformer
Code release for "Interpretable Weather Forecasting for Worldwide Stations with a Unified Deep Model" (Nature Machine Intelligence, 2023), https://www.nature.com/articles/s42256-023-00667-9

MDD
Code release for the ICML 2019 paper "Bridging Theory and Algorithm for Domain Adaptation".

Self-Tuning
Code release for "Self-Tuning for Data-Efficient Deep Learning" (ICML 2021).

SimMTM
Code release for "SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling" (NeurIPS 2023 Spotlight), https://arxiv.org/abs/2302.00861

PADA
Code release for "Partial Adversarial Domain Adaptation" (ECCV 2018).

Batch-Spectral-Penalization
Code release for "Transferability vs. Discriminability: Batch Spectral Penalization for Adversarial Domain Adaptation" (ICML 2019).

Transferable-Adversarial-Training
Code release for "Transferable Adversarial Training: A General Approach to Adapting Deep Classifiers" (ICML 2019).

TransNorm
Code release for "Transferable Normalization: Towards Improving Transferability of Deep Neural Networks" (NeurIPS 2019).

MTlearn
Code release for "Learning Multiple Tasks with Multilinear Relationship Networks" (NIPS 2017).

HashGAN
HashGAN: Deep Learning to Hash with Pair Conditional Wasserstein GAN.

SAN
Code release for "Partial Transfer Learning with Selective Adversarial Networks" (CVPR 2018).

Domain-Adaptation-Regression
Code release for "Representation Subspace Distance for Domain Adaptation Regression" (ICML 2021).

Deep-Embedded-Validation
Code release for "Towards Accurate Model Selection in Deep Unsupervised Domain Adaptation" (ICML 2019).

Latent-Spectral-Models
Code release for "Solving High-Dimensional PDEs with Latent Spectral Models" (ICML 2023), https://arxiv.org/abs/2301.12664

CLIPood
Code release for "CLIPood: Generalizing CLIP to Out-of-Distributions" (ICML 2023), https://arxiv.org/abs/2302.00864

iVideoGPT
Official repo for "iVideoGPT: Interactive VideoGPTs are Scalable World Models", https://arxiv.org/abs/2405.15223

Transolver
Code release for "Transolver: A Fast Transformer Solver for PDEs on General Geometries" (ICML 2024 Spotlight), https://arxiv.org/abs/2402.02366

MADA
Code release for "Multi-Adversarial Domain Adaptation" (AAAI 2018).

MotionRNN
Code release for "MotionRNN: A Flexible Model for Video Prediction with Spacetime-Varying Motions" (CVPR 2021), https://arxiv.org/abs/2103.02243

ETN
Code release for the CVPR 2019 paper "Learning to Transfer Examples for Partial Domain Adaptation".

Debiased-Self-Training
Code release for "Debiased Self-Training for Semi-Supervised Learning" (NeurIPS 2022 Oral).

Versatile-Domain-Adaptation
Code release for "Minimum Class Confusion for Versatile Domain Adaptation" (ECCV 2020).

Separate_to_Adapt
Code release for "Separate to Adapt: Open Set Domain Adaptation via Progressive Separation" (CVPR 2019).

AutoTimes
Official implementation for "AutoTimes: Autoregressive Time Series Forecasters via Large Language Models".

CoTuning
Code release for the NeurIPS 2020 paper "Co-Tuning for Transfer Learning".

OpenDG-DAML
Code release for "Open Domain Generalization with Domain-Augmented Meta-Learning" (CVPR 2021).

Calibrated-Multiple-Uncertainties
Code release for "Learning to Detect Open Classes for Universal Domain Adaptation" (ECCV 2020).

TimeSiam
HarmonyDream
Code release for "HarmonyDream: Task Harmonization Inside World Models" (ICML 2024), https://arxiv.org/abs/2310.00344

Batch-Spectral-Shrinkage
Code release for "Catastrophic Forgetting Meets Negative Transfer: Batch Spectral Shrinkage for Safe Transfer Learning" (NeurIPS 2019).

StochNorm
Code release for the NeurIPS 2020 paper "Stochastic Normalization".

Transferable-Query-Selection
Code release for "Transferable Query Selection for Active Domain Adaptation" (CVPR 2021).

Decoupled-Adaptation-for-Cross-Domain-Object-Detection
Code for "Decoupled Adaptation for Cross-Domain Object Detection" (D-adapt, ICLR 2022), https://arxiv.org/abs/2110.02578

few-shot
A lightweight library that implements state-of-the-art few-shot learning algorithms.

transferable-memory

VideoDG

TCL
Code release for "Transferable Curriculum for Weakly-Supervised Domain Adaptation" (AAAI 2019).

SPOT
Code release for "Supported Policy Optimization for Offline Reinforcement Learning" (NeurIPS 2022), https://arxiv.org/abs/2202.06239

DPH
Code release for "Deep Priority Hashing" (ACM MM 2018).

MMHH

Metasets

PAN

DCN
Deep Calibration Network.

ModeRNN
ForkMerge
Code release for "ForkMerge: Mitigating Negative Transfer in Auxiliary-Task Learning" (NeurIPS 2023).

TAH
Code release for "Transfer Adversarial Hashing for Hamming Space Retrieval" (AAAI 2018).

TransCal

learn_torch.compile
torch.compile artifacts for common deep learning models; can be used as a learning resource for torch.compile.

HelmFluid
Code release for "HelmFluid: Learning Helmholtz Dynamics for Interpretable Fluid Prediction" (ICML 2024), https://arxiv.org/pdf/2310.10565

Multi-Embedding
Code release for "On the Embedding Collapse When Scaling Up Recommendation Models" (ICML 2024).

Zoo-Tuning
Code release for "Zoo-Tuning: Adaptive Transfer from a Zoo of Models" (ICML 2021).

timer
See the official code and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models".

Regressive-Domain-Adaptation-for-Unsupervised-Keypoint-Detection
Code for "Regressive Domain Adaptation for Unsupervised Keypoint Detection" (RegDA, CVPR 2021), https://arxiv.org/abs/2103.06175

MitNet
Code release for "Estimating Heterogeneous Treatment Effects: Mutual Information Bounds and Learning Algorithms" (ICML 2023).

TimeXer
Official implementation for "TimeXer: Empowering Transformers for Time Series Forecasting with Exogenous Variables" (NeurIPS 2024).

MobileAttention
Official implementation of "Mobile Attention: Mobile-Friendly Linear-Attention for Vision Transformers in PyTorch". To run the code, you can refer to https://github.com/thuml/Flowformer.