ekphrasis
Ekphrasis is a text processing tool geared towards text from social networks, such as Twitter or Facebook. It performs tokenization, word normalization, word segmentation (for splitting hashtags), and spell correction, using word statistics from two large corpora (English Wikipedia and a Twitter corpus of 330 million English tweets).
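The hashtag-splitting step works by picking the segmentation whose words are most probable under corpus word statistics. A minimal dependency-free sketch of that idea, using dynamic programming over split points — the unigram counts and out-of-vocabulary penalty below are made-up stand-ins for the Wikipedia/Twitter statistics ekphrasis actually uses, not its real implementation:

```python
import math

# Toy unigram counts standing in for real corpus statistics (assumed values).
COUNTS = {"small": 500, "and": 5000, "insignificant": 100, "in": 6000,
          "significant": 50, "a": 8000, "ant": 90}
TOTAL = sum(COUNTS.values())

def word_cost(word):
    """Negative log-probability of a word; unseen words get a crude penalty."""
    count = COUNTS.get(word)
    if count is None:
        return 10.0 * len(word)  # arbitrary out-of-vocabulary penalty
    return -math.log(count / TOTAL)

def segment(text):
    """Split a hashtag-like string into the cheapest word sequence."""
    best = [(0.0, [])]  # best[i] = (cost, words) for text[:i]
    for i in range(1, len(text) + 1):
        candidates = []
        for j in range(max(0, i - 20), i):  # cap candidate word length at 20
            cost, words = best[j]
            candidates.append((cost + word_cost(text[j:i]),
                               words + [text[j:i]]))
        best.append(min(candidates, key=lambda c: c[0]))
    return best[-1][1]

print(segment("smallandinsignificant"))  # → ['small', 'and', 'insignificant']
```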

neat-vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks.

datastories-semeval2017-task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
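The attention mechanism these LSTM models build on reduces to one core operation: score each encoder hidden state, normalize the scores with a softmax, and take the weighted sum as a context vector. A minimal sketch of that operation in plain Python — the states, query vector, and dot-product scorer here are illustrative assumptions, not the paper's exact parameterization:

```python
import math

def softmax(scores):
    """Normalize scores into attention weights that sum to 1."""
    m = max(scores)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(hidden_states, scorer):
    """Collapse a sequence of hidden-state vectors into one context
    vector, weighting each timestep by its attention score."""
    weights = softmax([scorer(h) for h in hidden_states])
    dim = len(hidden_states[0])
    context = [sum(w * h[i] for w, h in zip(weights, hidden_states))
               for i in range(dim)]
    return context, weights

# Example: three 2-d "hidden states" scored by dot product with a
# query vector (assumed values, standing in for learned parameters).
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [2.0, 1.0]
ctx, w = attend(states, lambda h: sum(q * x for q, x in zip(query, h)))
print(w)  # the third state scores highest and gets the largest weight
```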

ntua-slp-semeval2018
Deep-learning models of the NTUA-SLP team, submitted in SemEval 2018 Tasks 1, 2, and 3.

lm-prior-for-nmt
This repository contains the source code for the paper "Language Model Prior for Low-Resource Neural Machine Translation".

keras-utilities
Utilities for Keras, the deep-learning library.

twitter-stream-downloader
A service for downloading Twitter streaming data. You can save the data either in text files on disk or in a database (MongoDB).

datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".

prolog-cfg-parser
A toy SWI-Prolog context-free grammar (CFG) parser that extracts knowledge (facts) from text.
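As a rough illustration of what a CFG parser that extracts facts does, here is a minimal top-down backtracking parser over a toy grammar, rendered in Python rather than Prolog. The grammar, lexicon, and fact shape are invented for this sketch and are not the repository's actual rules:

```python
# Hypothetical toy grammar and lexicon (invented for illustration).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Det", "N"], ["Name"]],
    "VP": [["V", "NP"]],
}
LEXICON = {
    "Det":  {"the", "a"},
    "N":    {"cat", "mouse"},
    "Name": {"tom", "jerry"},
    "V":    {"chases", "likes"},
}

def parse(symbol, tokens, pos):
    """Try to expand `symbol` at `pos`; return (tree, next_pos) or None.
    Simple top-down backtracking, adequate for toy grammars."""
    if symbol in LEXICON:
        if pos < len(tokens) and tokens[pos] in LEXICON[symbol]:
            return (symbol, tokens[pos]), pos + 1
        return None
    for production in GRAMMAR[symbol]:
        children, cur = [], pos
        for part in production:
            result = parse(part, tokens, cur)
            if result is None:
                break
            child, cur = result
            children.append(child)
        else:
            return (symbol, children), cur
    return None

def extract_fact(sentence):
    """Parse a sentence and flatten the tree into a (verb, subject, object)
    fact, analogous to asserting a Prolog fact from parsed text."""
    tokens = sentence.lower().split()
    result = parse("S", tokens, 0)
    if result is None or result[1] != len(tokens):
        return None
    (_, [np1, (_, [(_, verb), np2])]) = result[0]
    def head(np):
        return np[1][-1][1]  # last lexical child: the noun or name
    return (verb, head(np1), head(np2))

print(extract_fact("tom chases jerry"))  # → ('chases', 'tom', 'jerry')
```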

hierarchical-rnn-biocreative-4
Repository containing the winning submission for BioCreative VI Task A (2017). The model is a hierarchical bidirectional attention-based RNN, implemented in Keras.

patric-triangles
MPI implementation of a parallel algorithm for finding the exact number of triangles in massive networks.
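The serial core of exact triangle counting, which an MPI implementation parallelizes by partitioning vertices across ranks, can be sketched in a few lines. This is a plain-Python sketch of the standard degree-ordering technique, not the repository's actual algorithm:

```python
def count_triangles(edges):
    """Exact triangle count via neighbor-set intersection.
    Each triangle is counted exactly once by only looking from a
    vertex toward neighbors of higher (degree, id) rank — a standard
    trick that also helps balance work when vertices are partitioned
    across parallel workers."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def rank(v):
        return (len(adj[v]), v)  # degree order, ties broken by vertex id

    triangles = 0
    for u in adj:
        higher = {v for v in adj[u] if rank(v) > rank(u)}
        for v in higher:
            # common higher-ranked neighbors of u and v close a triangle
            triangles += len(higher & {w for w in adj[v] if rank(w) > rank(v)})
    return triangles

# K4 (the complete graph on 4 vertices) contains exactly 4 triangles.
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(count_triangles(k4))  # → 4
```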

ntua-slp-semeval2018-task2
Deep-learning models submitted by the NTUA-SLP team in SemEval 2018 Task 2: Multilingual Emoji Prediction. https://arxiv.org/abs/1804.06657

ntua-slp-semeval2018-task1
Deep-learning models submitted by the NTUA-SLP team in SemEval 2018 Task 1: Affect in Tweets. https://arxiv.org/abs/1804.06658

nmt-pretraining-objectives
This repository contains the source code and data for the paper "Exploration of Unsupervised Pretraining Objectives for Machine Translation", in Findings of ACL 2021.