  • Stars: 152
  • Rank: 238,920 (Top 5%)
  • Language: Python
  • License: MIT License
  • Created: about 2 years ago
  • Updated: 7 months ago


Repository Details

[KDD'22] Official PyTorch implementation for "Towards Universal Sequence Representation Learning for Recommender Systems".

UniSRec

This is the official PyTorch implementation for the paper:

Yupeng Hou*, Shanlei Mu*, Wayne Xin Zhao, Yaliang Li, Bolin Ding, Ji-Rong Wen. Towards Universal Sequence Representation Learning for Recommender Systems. KDD 2022.


Updates:

  • [Nov. 22, 2022] We added scripts and implementations of baselines FDSA and S^3-Rec [link].
  • [June 28, 2022] We updated some useful intermediate ("mid product") files that can be obtained during the data preprocessing stage [link], including:
    1. Clean item text (*.text);
    2. Index mapping between raw IDs and remapped IDs (*.user2index, *.item2index);
  • [June 16, 2022] We released the code and scripts for preprocessing our datasets [link].

Overview

We propose UniSRec, which stands for Universal Sequence representation learning for Recommendation. Aiming to learn more generalizable sequence representations, UniSRec utilizes the associated description text of an item to learn transferable representations across different domains and platforms. For learning universal item representations, we design a lightweight architecture based on parametric whitening and a mixture-of-experts-enhanced adaptor. For learning universal sequence representations, we introduce two kinds of contrastive learning tasks by sampling multi-domain negatives. With the pre-trained universal sequence representation model, our approach can be effectively transferred to new cross-domain and cross-platform recommendation scenarios in a parameter-efficient way, under either inductive or transductive settings.
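
As a rough illustration of the item-representation module described above (not the exact code in this repository), the lightweight architecture can be sketched in PyTorch as a parametric-whitening projection followed by a mixture-of-experts adaptor. The module names, the dimensions (768-dimensional text embeddings, 300-dimensional item representations), and the number of experts below are illustrative assumptions:

import torch
import torch.nn as nn

class PWLayer(nn.Module):
    """Parametric whitening (sketch): learnable bias subtraction + linear projection."""
    def __init__(self, input_dim, output_dim, dropout=0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        self.bias = nn.Parameter(torch.zeros(input_dim))
        self.linear = nn.Linear(input_dim, output_dim, bias=False)

    def forward(self, x):
        return self.linear(self.dropout(x) - self.bias)

class MoEAdaptor(nn.Module):
    """Mixture-of-experts adaptor (sketch): several whitening experts combined by a softmax gate."""
    def __init__(self, input_dim=768, output_dim=300, n_experts=8):
        super().__init__()
        self.experts = nn.ModuleList([PWLayer(input_dim, output_dim) for _ in range(n_experts)])
        self.gate = nn.Linear(input_dim, n_experts)

    def forward(self, text_emb):
        # text_emb: (batch, input_dim) item text embeddings from a pre-trained text encoder
        weights = torch.softmax(self.gate(text_emb), dim=-1)                   # (batch, n_experts)
        expert_out = torch.stack([e(text_emb) for e in self.experts], dim=-2)  # (batch, n_experts, output_dim)
        return (weights.unsqueeze(-1) * expert_out).sum(dim=-2)                # (batch, output_dim)

In the full model, these adapted item representations feed a Transformer-style sequence encoder, on top of which the contrastive pre-training objectives operate.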

Requirements

recbole==1.0.1
python==3.9.7
cudatoolkit==11.3.1
pytorch==1.11.0
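
These versions can be installed, for example, with conda and pip; the exact commands depend on your CUDA setup and package channels:

conda install pytorch==1.11.0 cudatoolkit=11.3.1 -c pytorch
pip install recbole==1.0.1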

Download Datasets and Pre-trained Model

Please download the processed downstream datasets (and the pre-training datasets, if needed), as well as the pre-trained model, from Google Drive or Baidu Netdisk (百度网盘, password: 3cml).

After unzipping, move pretrain/ and downstream/ to dataset/, and move UniSRec-FHCKM-300.pth to saved/.
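
After these steps, the working directory should roughly contain the following (only the items mentioned above are shown; additional files from the archives may also be present):

dataset/
  pretrain/
  downstream/
saved/
  UniSRec-FHCKM-300.pth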

Quick Start

Train and evaluate on downstream datasets

Fine-tune the pre-trained UniSRec model in the transductive setting.

python finetune.py -d Scientific -p saved/UniSRec-FHCKM-300.pth

You can replace Scientific with Pantry, Instruments, Arts, Office, or OR to reproduce the results reported in our paper.
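
For example, to fine-tune on Pantry in the transductive setting:

python finetune.py -d Pantry -p saved/UniSRec-FHCKM-300.pth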

Fine-tune the pre-trained model in the inductive setting.

python finetune.py -d Scientific -p saved/UniSRec-FHCKM-300.pth --train_stage=inductive_ft

Train UniSRec from scratch (w/o pre-training).

python finetune.py -d Scientific

Run baseline SASRec.

python run_baseline.py -m SASRec -d Scientific --config_files=props/finetune.yaml --hidden_size=300

Please refer to [link] for more scripts for our baselines.

Pre-train from scratch

Pre-train on a single GPU.

python pretrain.py

Pre-train with distributed data parallel (DDP) on GPUs 0-3.

CUDA_VISIBLE_DEVICES=0,1,2,3 python ddp_pretrain.py

Customized Datasets

Please refer to [link] for details of data preprocessing. You can then prepare and run your own customized datasets accordingly.
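
As a minimal illustration of working with the preprocessed files listed in the updates above, the snippet below reads an index-mapping file. It assumes a simple "raw_id<TAB>remapped_index" line format and an illustrative file path; please check the preprocessing scripts for the actual layout.

# Illustrative sketch only: the exact file format is defined by the preprocessing scripts.
def load_index_map(path):
    mapping = {}
    with open(path, encoding='utf-8') as f:
        for line in f:
            raw_id, index = line.rstrip('\n').split('\t')  # assumed tab-separated format
            mapping[raw_id] = int(index)
    return mapping

item2index = load_index_map('dataset/downstream/Scientific/Scientific.item2index')  # assumed path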

Acknowledgement

The implementation is based on the open-source recommendation library RecBole.

Please cite the following papers as references if you use our code or the processed datasets.

@inproceedings{hou2022unisrec,
  author = {Yupeng Hou and Shanlei Mu and Wayne Xin Zhao and Yaliang Li and Bolin Ding and Ji-Rong Wen},
  title = {Towards Universal Sequence Representation Learning for Recommender Systems},
  booktitle = {{KDD}},
  year = {2022}
}


@inproceedings{zhao2021recbole,
  title={Recbole: Towards a unified, comprehensive and efficient framework for recommendation algorithms},
  author={Wayne Xin Zhao and Shanlei Mu and Yupeng Hou and Zihan Lin and Kaiyuan Li and Yushuo Chen and Yujie Lu and Hui Wang and Changxin Tian and Xingyu Pan and Yingqian Min and Zhichao Feng and Xinyan Fan and Xu Chen and Pengfei Wang and Wendi Ji and Yaliang Li and Xiaoling Wang and Ji-Rong Wen},
  booktitle={{CIKM}},
  year={2021}
}

Special thanks to @Juyong Jiang for the excellent DDP implementation (#961).

More Repositories

1. LLMSurvey (Python, 8,693 stars): The official GitHub page for the survey paper "A Survey of Large Language Models".
2. RecBole (Python, 3,230 stars): A unified, comprehensive and efficient recommendation library.
3. TextBox (Python, 1,055 stars): TextBox 2.0 is a text generation library with pre-trained language models.
4. Awesome-RSPapers (902 stars): Recommender System Papers.
5. RecSysDatasets (Python, 731 stars): A repository of public data sources for Recommender Systems (RS).
6. CRSLab (Python, 474 stars): An open-source toolkit for building Conversational Recommender Systems (CRS).
7. Top-conference-paper-list (362 stars): A collection of classified and organized top-conference paper lists.
8. HaluEval (Python, 298 stars): A large-scale hallucination evaluation benchmark for Large Language Models.
9. LLMRank (Python, 182 stars): [ECIR'24] Implementation of "Large Language Models are Zero-Shot Rankers for Recommender Systems".
10. Negative-Sampling-Paper (173 stars): A collection of 100 papers related to negative sampling methods.
11. DenseRetrieval (170 stars)
12. RecBole2.0 (167 stars): An up-to-date, comprehensive and flexible recommendation library.
13. RecBole-GNN (Python, 154 stars): An efficient and extensible GNN-enhanced recommender library based on RecBole.
14. LLMBox (Python, 117 stars)
15. NCL (Python, 113 stars): [WWW'22] Official PyTorch implementation for "Improving Graph Collaborative Filtering with Neighborhood-enriched Contrastive Learning".
16. RSPapers (89 stars): Must-read papers on Recommender Systems (a curated collection of 40 papers, continuously updated).
17. RecBole-CDR (Python, 78 stars): A library built upon RecBole for cross-domain recommendation algorithms.
18. MVP (67 stars): The official implementation of the paper "MVP: Multi-task Supervised Pre-training for Natural Language Generation".
19. VQ-Rec (Python, 46 stars): [WWW'23] PyTorch implementation for "Learning Vector-Quantized Item Representation for Transferable Sequential Recommenders".
20. RecBole-PJF (Python, 46 stars)
21. ChatCoT (Python, 41 stars): The official repository of "ChatCoT: Tool-Augmented Chain-of-Thought Reasoning on Chat-based Large Language Models".
22. CORE (Python, 37 stars): [SIGIR'22] Official PyTorch implementation for "CORE: Simple and Effective Session-based Recommendation within Consistent Representation Space".
23. Multi-View-Co-Teaching (Python, 29 stars): Code for the CIKM 2020 paper "Learning to Match Jobs with Resumes from Sparse Interaction Data using Multi-View Co-Teaching Network".
24. JiuZhang (Python, 25 stars): Code to be made public soon.
25. ELMER (Python, 24 stars): The official implementation of the EMNLP 2022 paper "ELMER: A Non-Autoregressive Pre-trained Language Model for Efficient and Effective Text Generation".
26. BAMBOO (Python, 23 stars)
27. Language-Specific-Neurons (Python, 17 stars)
28. RecBole-DA (Python, 17 stars)
29. CARP (Python, 16 stars)
30. SAFE (Python, 16 stars): The PyTorch implementation of the SAFE model presented in NAACL Findings 2022.
31. RecBole-TRM (Python, 13 stars)
32. Erya (12 stars)
33. MML (Python, 12 stars)
34. Context-Tuning (11 stars): The repository for the COLING 2022 paper "Context-Tuning: Learning Contextualized Prompts for Natural Language Generation".
35. UniWeb (9 stars): The official repository for the ACL 2023 Findings paper "The Web Can Be Your Oyster for Improving Language Models".
36. PPGM (Python, 6 stars): [ICDM'22] PyTorch implementation for "Privacy-Preserved Neural Graph Similarity Learning".
37. LIVE (Python, 5 stars): The official repository of the ACL 2023 paper "Learning to Imagine: Visually-Augmented Natural Language Generation".
38. Social-Datasets (5 stars): A collection of social datasets for RecBole-GNN.
39. M3SRec (4 stars)
40. FIGA (Python, 3 stars)
41. Contrastive-Curriculum-Learning (Python, 3 stars)
42. Data-CUBE (3 stars)
43. Div-Ref (Python, 2 stars): The official repository of "Not All Metrics Are Guilty: Improving NLG Evaluation by Diversifying References".
44. GenRec (Python, 1 star)
45. ETRec (Python, 1 star)