code2vec
TensorFlow code for the neural network presented in the paper: "code2vec: Learning Distributed Representations of Code"
code2seq
Code for the model presented in the paper: "code2seq: Generating Sequences from Structured Representations of Code"
how_attentive_are_gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022)
RASP
An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers"
Nero
Code and resources for the paper: "Neural Reverse Engineering of Stripped Binaries using Augmented Control Flow Graphs"
bottleneck
Code for the paper: "On the Bottleneck of Graph Neural Networks and Its Practical Implications"
slm-code-generation
TensorFlow code for the neural network presented in the paper: "Structural Language Models of Code" (ICML 2020)
esh
Statistical similarity of binaries (Esh)
lstar_extraction
Implementation of the ICML 2018 paper "Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples"
layer_norm_expressivity_role
Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL 2023)
c3po
Code for the paper "A Structural Model for Contextual Code Changes"
adversarial-examples
Code for the paper: "Adversarial Examples for Models of Code"
weighted_lstar
Implementation for "Learning Weighted Deterministic Automata from Queries and Counterexamples" (NeurIPS 2019)
RASP-exps
Code for running the transformers in the ICML 2021 paper "Thinking Like Transformers"
prime
safe
SAFE static analysis tools
differential
counting_dimensions
Demonstration for our ACL 2018 paper, "On the Practical Computational Power of Finite Precision RNNs for Language Recognition"
id2vec
atam
Example programs for ATAM