torecsys's Introduction

ToR[e]cSys


News

We are happy to learn about the new TensorFlow Recommenders package.


ToR[e]cSys is a PyTorch framework for implementing recommendation system algorithms, including but not limited to click-through-rate (CTR) prediction, learning-to-rank (LTR), and matrix/tensor embedding. The project's objective is to develop an ecosystem for experimenting, sharing, reproducing, and deploying models in the real world in a smooth and easy way (hope it can be done).

Installation

TBU
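
(The package is published on PyPI at https://pypi.org/project/torecsys/, so installing it with pip install torecsys should work, although the installation instructions here are still to be written.)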

Documentation

The complete documentation for ToR[e]cSys is available on the ReadTheDocs website.
Thank you, ReadTheDocs! You are the best!

Implemented Models

1. Subsampling

Model Name | Research Paper | Year
Word2Vec | Omer Levy et al, 2015. Improving Distributional Similarity with Lessons Learned from Word Embeddings | 2015
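
For reference, the subsampling rule discussed in this line of work discards frequent tokens with probability 1 - sqrt(t / f(w)), where f(w) is the token's relative frequency. Below is a minimal sketch assuming a plain token-frequency dictionary and a threshold t; it illustrates the idea rather than this repository's implementation.

import random

def keep_token(token, counts, total, t=1e-5):
    # word2vec-style subsampling: keep a token with probability sqrt(t / f(w)),
    # where f(w) is the token's relative frequency in the corpus
    f = counts[token] / total
    return random.random() < min(1.0, (t / f) ** 0.5)

corpus = ["the", "cat", "sat", "on", "the", "mat"]
counts = {w: corpus.count(w) for w in set(corpus)}
subsampled = [w for w in corpus if keep_token(w, counts, len(corpus))]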

2. Negative Sampling

Model Name | Research Paper | Year
TBU

3. Click-Through-Rate (CTR) Model

Model Name | Research Paper | Year
Logistic Regression | / | /
Factorization Machine | Steffen Rendle, 2010. Factorization Machines | 2010
Factorization Machine Support Neural Network | Weinan Zhang et al, 2016. Deep Learning over Multi-field Categorical Data: A Case Study on User Response Prediction | 2016
Field-Aware Factorization Machine | Yuchin Juan et al, 2016. Field-aware Factorization Machines for CTR Prediction | 2016
Product Neural Network | Yanru Qu et al, 2016. Product-based Neural Networks for User Response Prediction | 2016
Attentional Factorization Machine | Jun Xiao et al, 2017. Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks | 2017
Deep and Cross Network | Ruoxi Wang et al, 2017. Deep & Cross Network for Ad Click Predictions | 2017
Deep Factorization Machine | Huifeng Guo et al, 2017. DeepFM: A Factorization-Machine based Neural Network for CTR Prediction | 2017
Neural Collaborative Filtering | Xiangnan He et al, 2017. Neural Collaborative Filtering | 2017
Neural Factorization Machine | Xiangnan He et al, 2017. Neural Factorization Machines for Sparse Predictive Analytics | 2017
eXtreme Deep Factorization Machine | Jianxun Lian et al, 2018. xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems | 2018
Deep Field-Aware Factorization Machine | Junlin Zhang et al, 2019. FAT-DeepFFM: Field Attentive Deep Field-aware Factorization Machine | 2019
Deep Matching Correlation Prediction | Wentao Ouyang et al, 2019. Representation Learning-Assisted Click-Through Rate Prediction | 2019
Deep Session Interest Network | Yufei Feng et al, 2019. Deep Session Interest Network for Click-Through Rate Prediction | 2019
Elaborated Entire Space Supervised Multi Task Model | Hong Wen et al, 2019. Conversion Rate Prediction via Post-Click Behaviour Modeling | 2019
Entire Space Multi Task Model | Xiao Ma et al, 2019. Entire Space Multi-Task Model: An Effective Approach for Estimating Post-Click Conversion Rate | 2019
Field Attentive Deep Field Aware Factorization Machine | Junlin Zhang et al, 2019. FAT-DeepFFM: Field Attentive Deep Field-aware Factorization Machine | 2019
Position-bias aware learning framework | Huifeng Guo et al, 2019. PAL: a position-bias aware learning framework for CTR prediction in live recommender systems | 2019

4. Embedding Model

Model Name | Research Paper | Year
Matrix Factorization | / | /
Starspace | Ledell Wu et al, 2017. StarSpace: Embed All The Things! | 2017

5. Learning-to-Rank (LTR) Model

Model Name | Research Paper | Year
Personalized Re-ranking Model | Changhua Pei et al, 2019. Personalized Re-ranking for Recommendation | 2019

Getting Started

There are several ways to use ToR[e]cSys to develop a recommendation system. Before walking through them, we first need to introduce the components of ToR[e]cSys.

A model in ToR[e]cSys is mainly constructed from two parts: inputs and model, which are wrapped into a sequential module (torecsys.models.sequential) to be trained by the Trainer (torecsys.trainer.Trainer).

The inputs module (torecsys.inputs) handles most kinds of inputs in recommendation systems, such as categorical features and images, with several kinds of methods, including token embedding and pre-trained image models.

The models module (torecsys.models) implements well-known models in recommendation systems, such as the Factorization Machine family, and I hope to keep making the library richer. To construct a model in this module, in addition to the modules provided by PyTorch, some layers that are usually called by the models are implemented in the layers subpackage.

With those components explained, let's move on to getting started. We can use ToR[e]cSys in the following ways:

  1. Run by command-line (In development)
> torecsys build --inputs_config='{}' \
--model_config='{"method":"FM", "embed_size": 8, "num_fields": 2}' \
--regularizer_config='{"weight_decay": 0.1}' \
--criterion_config='{"method": "MSELoss"}' \
--optimizer_config='{"method": "SGD", "lr": "0.01"}' \
...
  2. Run by class method
import torecsys as trs

# build trainer by class method
pipeline = trs.trainer.TorecsysPipeline()

# start to fit the model
trainer = trs.trainer.TorecsysTrainer()
trainer.fit(pipeline)
  3. Run like a PyTorch module
import torch
import torch.nn as nn
import torch.optim
import torecsys as trs

# define the schema of the inputs and prepare a batch of data and labels
# (empty placeholders here; fill in with your own features)
schema = {}
batches = {}
labels = torch.Tensor([])

# build the inputs wrapper and the model
inputs = trs.inputs.Inputs(schema=schema)
model = trs.models.FactorizationMachineModel()

# set up the optimizer and the loss function
epochs = 1
optimizer = torch.optim.SGD(params=model.parameters(), lr=1e-4)
criterion = nn.MSELoss()

# standard PyTorch training loop
for i in range(epochs):
    optimizer.zero_grad()
    outputs = model(**inputs(batches))
    loss = criterion(outputs, labels)
    loss.backward()
    optimizer.step()
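
As a rough sketch of what the Factorization Machine family computes, the pairwise-interaction term can be written with the standard sum-of-squares trick. This is an illustration only (linear term and bias omitted), not necessarily how the model in this library is implemented.

import torch

def fm_pairwise_interaction(emb):
    # emb: (batch_size, num_fields, embed_size) field embedding vectors
    # FM pairwise term: 0.5 * ((sum_i v_i)^2 - sum_i v_i^2), summed over the embedding dim
    square_of_sum = emb.sum(dim=1) ** 2
    sum_of_square = (emb ** 2).sum(dim=1)
    return 0.5 * (square_of_sum - sum_of_square).sum(dim=1, keepdim=True)

# toy check with num_fields=2 and embed_size=8, matching the command-line example above
print(fm_pairwise_interaction(torch.randn(4, 2, 8)).shape)  # torch.Size([4, 1])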

For further details, please refer to the examples in the repository or read the documentation. Hope you enjoy~

Examples

TBU

Sample Codes

TBU

Sample of Experiments

TBU

Authors

License

ToR[e]cSys is MIT-style licensed, as found in the LICENSE file.

torecsys's People

Contributors

dependabot[bot], jasper430, p768lwy3


torecsys's Issues

Unable to run

Hello, I would like to run some of your models, but none of the approaches you describe seem to work. Could you please provide a working way to run them?

PAL model

In the paper, the sigmoid outputs of the pos model and the pctr model are multiplied to obtain bctr.
However, in your PositionBiasAwareLearningFrameworkModel implementation, the pctr output is fed into the pos model for training.
Isn't that different from what the paper describes?

Another simple PyTorch question: in output = self.pos_model((pctr_out, pos_inputs)),
why is the input to pos_model a tuple?
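
For reference, the combination described in the PAL paper can be sketched as follows; this illustrates the paper's formulation rather than the code in this repository.

import torch

# PAL (paper formulation): the probability that an item is seen (position module)
# and the CTR given that it is seen (pCTR module) are each passed through a
# sigmoid and then multiplied to give the biased CTR (bCTR) used for training.
pos_logit = torch.randn(4, 1)    # output of the position-bias module
pctr_logit = torch.randn(4, 1)   # output of the pCTR module
bctr = torch.sigmoid(pos_logit) * torch.sigmoid(pctr_logit)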
