embed

Extra recipes for predictor embeddings

Introduction

The embed package provides extra steps for the recipes package that embed predictors into one or more numeric columns. Almost all of these preprocessing methods are supervised.

These steps live in a separate package because their dependencies (rstanarm, lme4, and keras) are fairly heavy.

Some steps handle categorical predictors:

  • step_lencode_glm(), step_lencode_bayes(), and step_lencode_mixed() estimate the effect of each factor level on the outcome and use those estimates as the new encoding. The effects are estimated with a generalized linear model, either without pooling (via glm) or with partial pooling (stan_glm or lmer). These steps are currently implemented for numeric and two-class outcomes; a short usage sketch follows this list.

  • step_embed() uses keras::layer_embedding() to translate the original C factor levels into a set of D new variables (D < C). The model-fitting routine optimizes which factor levels are mapped to each of the new variables as well as the corresponding regression coefficients (i.e., neural network weights) that are used as the new encodings.

  • step_woe() creates new variables based on weight of evidence encodings.

  • step_feature_hash() can create indicator variables using feature hashing.
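
As a rough illustration of the categorical steps, here is a minimal sketch that replaces a factor column with its effect encoding via step_lencode_glm(). The data frame df and the columns city and churn are hypothetical placeholders.

library(recipes)
library(embed)
library(dplyr)  # for vars()

# Hypothetical data: `df` has a factor predictor `city` and a
# two-class factor outcome `churn`.
rec <- recipe(churn ~ ., data = df) |>
  # Replace each level of `city` with its estimated effect on `churn`,
  # fit without pooling via glm().
  step_lencode_glm(city, outcome = vars(churn)) |>
  prep()

# `city` is now a single numeric column holding the encodings.
bake(rec, new_data = NULL)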

For numeric predictors:

  • step_umap() applies a nonlinear transformation similar to t-SNE, but the fitted transformation can also be used to project new data. Both supervised and unsupervised variants are available; a short usage sketch follows this list.

  • step_discretize_xgb() and step_discretize_cart() can make binned versions of numeric predictors using supervised tree-based models.

  • step_pca_sparse() and step_pca_sparse_bayes() perform feature extraction while imposing sparsity on the component loadings.
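
Likewise, a minimal sketch for the numeric steps, assuming a data frame df with several numeric predictors and a factor outcome class (hypothetical names); step_umap() relies on the uwot package under the hood.

library(recipes)
library(embed)

# Project all numeric predictors onto two UMAP components; supplying
# `outcome = vars(class)` would make the embedding supervised instead.
rec <- recipe(class ~ ., data = df) |>
  step_umap(all_numeric_predictors(), num_comp = 2) |>
  prep()

# The fitted embedding can also project new data, e.g. a holdout set `df_new`.
bake(rec, new_data = df_new)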

Getting Started

There are two articles that walk through how to use these embedding steps: one using generalized linear models and one using neural networks built via TensorFlow.

Installation

To install the package:

install.packages("embed")

Note that to use some steps, you will also have to install other packages such as rstanarm and lme4. For all of the steps to work, you may want to use:

install.packages(c("rpart", "xgboost", "rstanarm", "lme4"))

To get a bug fix or to use a feature from the development version, you can install the development version of this package from GitHub:

# install.packages("pak")
pak::pak("tidymodels/embed")

Contributing

This project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.
