• Stars: 3
• Rank: 3,943,206 (Top 79%)
• Language: Python
• Created: about 1 year ago
• Updated: about 1 year ago


Repository Details

Implementation of an LLM prompting pipeline combined with wrappers for auto-decomposing reasoning steps and for searching the reasoning-step space (e.g., beam search, MCTS) guided by self-evaluation.
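
As a rough illustration of that core loop only, here is a minimal beam-search sketch over reasoning steps guided by self-evaluation; `generate_candidates` and `self_evaluate` are hypothetical stand-ins for the LLM prompting calls, not this repository's actual API.

```python
# Minimal sketch (not the repository's API) of self-evaluation-guided beam
# search over reasoning steps; the LLM calls are injected as plain callables.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Chain:
    steps: List[str] = field(default_factory=list)  # reasoning steps so far
    score: float = 0.0                               # cumulative self-evaluation score


def beam_search(
    question: str,
    generate_candidates: Callable[[str, List[str]], List[str]],  # hypothetical LLM call
    self_evaluate: Callable[[str, List[str], str], float],       # hypothetical LLM call
    beam_width: int = 3,
    branch_factor: int = 4,
    max_depth: int = 5,
) -> Chain:
    """Expand each partial reasoning chain, score candidate next steps by
    self-evaluation, and keep only the top `beam_width` chains at each depth."""
    beams = [Chain()]
    for _ in range(max_depth):
        expansions = []
        for chain in beams:
            # Propose several candidate next reasoning steps for this chain.
            for step in generate_candidates(question, chain.steps)[:branch_factor]:
                score = self_evaluate(question, chain.steps, step)
                expansions.append(Chain(chain.steps + [step], chain.score + score))
        if not expansions:
            break
        # Prune to the highest-scoring partial chains.
        beams = sorted(expansions, key=lambda c: c.score, reverse=True)[:beam_width]
    return max(beams, key=lambda c: c.score)
```

In this sketch, `generate_candidates` would prompt the model to propose next steps given the chain so far, and `self_evaluate` would prompt it to rate a proposed step; an MCTS variant would replace the top-k pruning with tree expansion and backed-up value estimates.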

More Repositories

1. Abstractive-Summarization (Jupyter Notebook, 160 stars): Implementation of abstractive summarization using an LSTM encoder-decoder architecture with local attention.
2. TextRank-Keyword-Extraction (Jupyter Notebook, 101 stars): Keyword extraction using the TextRank algorithm after pre-processing the text with lemmatization, filtering of unwanted parts of speech and other techniques.
3. Chatbot (Jupyter Notebook, 67 stars): Hybrid conversational bot based on both neural retrieval and neural generative mechanisms, with TTS.
4. Machine-Translation-Transformers (Jupyter Notebook, 27 stars): Machine translation using Transformers.
5. DemonRangerOptimizer (Python, 23 stars): Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated weight decay.
6. BERT-Disaster-Classification-Capsule-Routing (Python, 21 stars): Exploration of BERT-BiLSTM models with layer aggregation (attention-based and capsule-routing-based) and hidden-state aggregation (attention-based and capsule-routing-based).
7. Bi-GRU-CRF-NER (Jupyter Notebook, 16 stars): Attempted implementation of a bidirectional GRU followed by a linear-chain CRF (from scratch) for Named Entity Recognition.
8. Self-Organizing-Map (Jupyter Notebook, 15 stars): SOM clustering on the Iris dataset.
9. auto-tldr-TextRank (Jupyter Notebook, 12 stars): Extractive summarization based on statements ranked using TextRank.
10. Multilingual-BERT-Disaster (Python, 11 stars): Resources for "Cross-Lingual Disaster-related Multi-label Tweet Classification with Manifold Mixup" (ACL SRW 2020).
11. Continuous-RvNN (Python, 10 stars): Official repository for "Modeling Hierarchical Structures with Continuous Recursive Neural Networks" (ICML 2021).
12. INTER-INTRA-attentions (Jupyter Notebook, 9 stars): An experimental custom seq2seq model with both layer-wise (inter-layer) attention and intra-layer attention (attention to previous hidden states of the same RNN unit) for abstractive summarization.
13. Dynamic-Memory-Network-Plus (Jupyter Notebook, 9 stars): Implementation of Dynamic Memory Network Plus for question answering. Tested on the induction tasks of the bAbI 10K dataset.
14. 3D-CNN-Keras (Jupyter Notebook, 8 stars): A simple 3D CNN in Keras for lung nodule detection.
15. Tweet-Disaster-Keyphrase (Python, 8 stars): Official repository for "On Identifying Hashtags in Disaster Twitter Data" (AAAI 2020).
16. RAKE-Keyword-Extraction (Jupyter Notebook, 8 stars): Keyword extraction using the standard RAKE algorithm after pre-processing the text with lemmatization, filtering of unwanted parts of speech and other techniques.
17. GRU-Text-Generator (Python, 7 stars): Implementation of a character-level text generator using a multilayered GRU in Keras.
18. Wide-Residual-Network (Jupyter Notebook, 5 stars): Implementation of a Wide Residual Network in TensorFlow for image classification. Trained and tested on the CIFAR-10 dataset.
19. KPDrop (Python, 5 stars): Official implementation of Keyphrase Dropout.
20. QuestionGenerationPub (Python, 2 stars)
21. CapsuleRoutingEncoders (Python, 2 stars): In this work, we study and compare multiple capsule-routing algorithms for text classification, including dynamic routing, Heinsen routing, and capsule-routing-inspired attention-based sentence-encoding techniques such as dynamic self-attention. Further, similar to some works in computer vision, we run an ablation of the capsule network in which we remove the routing algorithm itself. We analyze the theoretical connection between attention and capsule routing and contrast the two ways of normalizing the routing weights. Finally, we present a new way to do capsule routing, or rather iterative refinement, using a richer attention function to measure agreement between output and input capsules, with highway connections between iterations (a minimal routing sketch follows this list).
22. MonotonicLocationAttention (1 star)
23. Causal-Inference (Python, 1 star): UIC CS-594 Causal Inference project, Fall 2019.
24. Neural_Net_Evolution (Jupyter Notebook, 1 star): Learning neural net parameters using evolutionary strategies.
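
For context on entry 21, the sketch below shows only the standard dynamic-routing baseline (Sabour et al., 2017) that such work typically compares against: routing weights are refined iteratively by the agreement between prediction vectors and output capsules. It is a generic NumPy illustration, not the repository's richer attention-based refinement or its highway-connected variant.

```python
# Generic dynamic routing between capsules (baseline sketch, not the repo's code).
import numpy as np


def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def squash(s, eps=1e-8):
    # Shrink short vectors toward zero and cap long vectors near unit length.
    norm_sq = (s ** 2).sum(axis=-1, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)


def dynamic_routing(u_hat, iterations=3):
    """Iteratively refine routing weights by agreement between input
    predictions and output capsules.
    u_hat: prediction vectors of shape (num_in, num_out, dim_out)."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                    # routing logits
    for _ in range(iterations):
        c = softmax(b, axis=1)                         # weights over output capsules
        s = np.einsum('io,iod->od', c, u_hat)          # weighted sum per output capsule
        v = squash(s)                                  # non-linear squash
        b = b + np.einsum('iod,od->io', u_hat, v)      # update logits by agreement
    return v


# Example: route 6 input capsules into 3 output capsules of dimension 4.
u_hat = np.random.randn(6, 3, 4)
print(dynamic_routing(u_hat).shape)  # (3, 4)
```

The variant described in entry 21 would, as stated there, replace the dot-product agreement with a richer attention function and add highway connections between refinement iterations.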