  • Stars: 152
  • Rank: 244,685 (Top 5%)
  • Language: Python
  • Created: over 7 years ago
  • Updated: over 5 years ago


Repository Details

Time-LSTM on RecSys

Introduction

This project is the implementation of the paper "What to Do Next: Modeling User Behaviors by Time-LSTM".

Abstract of the Paper

Recently, Recurrent Neural Network (RNN) solutions for recommender systems (RS) are becoming increasingly popular. The insight is that there exist some intrinsic patterns in the sequence of users' behaviors, and RNN has been proved to perform excellently on tasks such as language modeling. However, RNN solutions usually only consider the sequential order of objects in a sequence, without the notion of interval. In RS, time intervals between users' behaviors are of significant importance in capturing the relations of users' behaviors, and traditional RNN architectures are not good at modeling them. In this paper, we propose a new LSTM variant, i.e. Time-LSTM, to model users' sequential behaviors. Time-LSTM equips LSTM with time gates to model time intervals. These time gates are specifically designed so that, compared to traditional RNN solutions, Time-LSTM better captures both users' short-term and long-term interests, so as to improve the recommendation performance. Experimental results on two real-world datasets show the superiority of the recommendation method using Time-LSTM over traditional methods.

Model

The Time-LSTM proposed in the paper has three different architectures:

  • TLSTM1: LSTM with time gate 1.
  • TLSTM2: LSTM with time gates 1 & 2.
  • TLSTM3: LSTM with time gates 1 & 2, and with the forget gate removed.
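
For intuition, the sketch below shows one way a time gate can enter a standard LSTM cell update, roughly in the spirit of TLSTM1: the elapsed interval between two behaviors feeds an extra gate that scales the candidate input. This is a simplified NumPy illustration, not the repo's Theano/Lasagne implementation; it omits peephole connections and the exact gate parameterization of the paper, and all names here are placeholders.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def time_lstm_step(x, dt, h_prev, c_prev, W, U, b, W_t, b_t):
    """One simplified time-gated LSTM step (illustrative only).

    x: input vector; dt: time interval since the previous behavior (scalar);
    h_prev, c_prev: previous hidden and cell states;
    W, U, b: stacked parameters of the input/forget/output/candidate gates;
    W_t, b_t: parameters of the simplified time gate.
    """
    z = x @ W + h_prev @ U + b                      # stacked pre-activations
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    # Time gate: depends on the current input and the elapsed interval dt.
    t = sigmoid(x @ W_t + np.tanh(dt) + b_t)
    # The time gate modulates how much of the new candidate enters the cell.
    c = f * c_prev + i * t * g
    h = o * np.tanh(c)
    return h, c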


Requirements

The code is tested in the following environment.
theano=0.9.0
lasagne=0.2.dev1
pandas=0.18.1
cudnn=5.1
cuda=8.0

pip install -v theano==0.9.0
pip install --upgrade https://github.com/Lasagne/Lasagne/archive/master.zip
pip install -v pandas==0.18.1
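
After installing, an optional sanity check can confirm that the versions match the list above (this assumes the three Python packages import cleanly; the script is not part of the repo):

# check_env.py -- optional version check for the packages listed above
import theano
import lasagne
import pandas

print("theano :", theano.__version__)
print("lasagne:", lasagne.__version__)
print("pandas :", pandas.__version__)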

Data Preprocessing

1. Download and put the original data file into data/music or data/citeulike.
2. For the citeulike dataset, you need to run awk -F "|" '{print $1"|"$2"|"$3}' citeulike_dataset | uniq > citeulike-origin-filtered to get citeulike-origin-filtered, and to filter out users and items with few interactions.
3. Run the Python file in the preprocess folder (e.g. lastfm.py) to get three files: user-item.lst, user-item-delta-time.lst and user-item-accumulate-time.lst.
4. Use the head/tail commands (e.g. head -800 user-item.lst > tr_user-item.lst and tail -192 user-item.lst > te_user-item.lst) to generate the following six files for each dataset (in data/{data_source}/); a small Python sketch of this split follows the list below.

  • tr(te)_user-item.lst
  • tr(te)_user-item-delta-time.lst
  • tr(te)_user-item-accumulate-time.lst
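
The head/tail split above can also be scripted. The following is a minimal Python sketch assuming the same 800/192 line split as in the example; the counts and working directory are illustrative placeholders, not values fixed by the repo:

# split_lst.py -- illustrative train/test split for the three .lst files
# (equivalent to the head/tail example above; counts are placeholders)
N_TRAIN, N_TEST = 800, 192

for name in ("user-item", "user-item-delta-time", "user-item-accumulate-time"):
    with open(name + ".lst") as f:
        lines = f.readlines()
    with open("tr_" + name + ".lst", "w") as f:
        f.writelines(lines[:N_TRAIN])
    with open("te_" + name + ".lst", "w") as f:
        f.writelines(lines[-N_TEST:])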

tr_* is training data and te_* is testing data. The content looks like:

user1, item1 item2 item3 ...
user2, item1 item2 item3 ...
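
Each line therefore pairs a user with a space-separated sequence of tokens (item IDs in user-item.lst; presumably time values in the *-time.lst files). The actual data loading lives in utils.py; the snippet below is only a minimal illustration of the line format, with a hypothetical helper name:

# parse_lst.py -- minimal illustration of the .lst line format (not from the repo)
def read_lst(path):
    """Return a list of (user, [tokens]) pairs from a *.lst file."""
    records = []
    with open(path) as f:
        for line in f:
            user, rest = line.strip().split(",", 1)
            records.append((user, rest.split()))
    return records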

Files

  • plstm.py: Phased LSTM.
  • tlstm1.py: TLSTM1.
  • tlstm2.py: TLSTM2.
  • tlstm3.py: TLSTM3.
  • lstm.py: standard LSTM.
  • utils.py: data loading, model saving, and other helper methods.
  • main.py: the main training process.
  • train.sh: sample training commands.


To train the model, run:

THEANO_FLAGS="${FLAGS}" python main.py --model TLSTM3   --data ${DATA} --batch_size ${BATCH} --vocab_size ${VOCAB} --max_len ${MLEN} --fixed_epochs ${FIXED_EPOCHS} --num_epochs ${NUM_EPOCHS} --num_hidden ${NHIDDEN} --test_batch ${TEST_BATCH} --learning_rate ${LEARNING_RATE} --sample_time ${SAMPLE_TIME}

or just run (the hyperparameters in train.sh are not carefully tuned):

bash train.sh

Important arguments:

  • --model:

    • LSTM: Use the plain LSTM model.
    • LSTM_T: Use the LSTM model with the time interval as an additional input feature.
    • PLSTM: Use the Phased LSTM model.
    • TLSTM1: Use the TLSTM1 model.
    • TLSTM2: Use the TLSTM2 model.
    • TLSTM3: Use the TLSTM3 model.
  • --data:

    • music: Use Last.FM as the data source.
    • citeulike: Use CiteULike as the data source.

More Repositories

  1. pixel_link (Python, 767 stars): Implementation of our paper 'PixelLink: Detecting Scene Text via Instance Segmentation' in AAAI2018.
  2. nsg (C++, 584 stars): Navigating Spreading-out Graph For Approximate Nearest Neighbor Search.
  3. MatlabFunc (MATLAB, 502 stars): Matlab codes for feature learning.
  4. ttfnet (Python, 481 stars).
  5. efanna (C++, 280 stars): Fast library for ANN search and KNN graph construction.
  6. RMI (Python, 268 stars): Code for the NeurIPS 2019 paper "Region Mutual Information Loss for Semantic Segmentation".
  7. resa (Python, 175 stars): Implementation of our paper 'RESA: Recurrent Feature-Shift Aggregator for Lane Detection' in AAAI2021.
  8. MaxSquareLoss (Python, 109 stars): Code for "Domain Adaptation for Semantic Segmentation with Maximum Squares Loss" in PyTorch.
  9. SSG (C++, 95 stars): Code for satellite system graphs.
  10. efanna_graph (C++, 79 stars): An extremely fast Approximate Nearest Neighbor graph construction algorithm framework.
  11. graph_level_drug_discovery (Python, 60 stars).
  12. CariFaceParsing (Python, 55 stars): Code for the ICIP2019 paper "Weakly-supervised Caricature Face Parsing through Domain Adaptation".
  13. AtSNE (Cuda, 54 stars): Anchor-t-SNE for large-scale and high-dimension vector visualization.
  14. depthInpainting (C++, 50 stars): Depth Image Inpainting with Low Gradient Regularization.
  15. ALDA (Python, 49 stars): Code for "Adversarial-Learned Loss for Domain Adaptation" (AAAI2020) in PyTorch.
  16. AttentionZSL (Python, 44 stars): Codes for the paper "Attribute Attention for Semantic Disambiguation in Zero-Shot Learning".
  17. ReDR (Python, 41 stars): Code for the ACL 2019 paper "Reinforced Dynamic Reasoning for Conversational Question Generation".
  18. hashingSearch (C++, 31 stars): Search with a hash index.
  19. SRDet (Python, 30 stars): A simple, fast, efficient and end-to-end 3D object detector without NMS.
  20. PTL (Python, 26 stars): Progressive Transfer Learning for Person Re-identification, published at IJCAI 2019.
  21. TreeAttention (Python, 25 stars): A Better Way to Attend: Attention with Trees for Video Question Answering.
  22. RPLSH (C++, 23 stars): K-means quantization + random-projection-based locality sensitive hashing.
  23. videoqa (Python, 21 stars): Unifying the Video and Question Attentions for Open-Ended Video Question Answering.
  24. DMP (Python, 17 stars): Code for the ACL 2018 paper "Discourse Marker Augmented Network with Reinforcement Learning for Natural Language Inference".
  25. DREN (C++, 15 stars): DREN: Deep Rotation Equivariant Network.
  26. Attention-GRU-3M (Python, 12 stars).
  27. AMI (Python, 7 stars).
  28. Sparse-Learning-with-Stochastic-Composite-Optimization (MATLAB, 7 stars): The implementation of our work "Sparse Learning with Stochastic Composite Optimization".
  29. TransAt (Python, 6 stars).
  30. diverse_image_synthesis (Python, 4 stars): PyTorch implementation of diverse conditional image synthesis.
  31. DeAda (Python, 3 stars): Decouple Co-adaptation: Classifier Randomization for Person Re-identification, published in Neurocomputing.
  32. AdaDB (Python, 2 stars).
  33. SIF (Python, 2 stars): SIF: Self-Inspirited Feature Learning for Person Re-Identification, published in IEEE TIP.
  34. SIFS (C++, 1 star).
  35. SplitNet (Jupyter Notebook, 1 star).