
Atomistic Line Graph Neural Network https://scholar.google.com/citations?user=9Q-tNnwAAAAJ&hl=en


Table of Contents

ALIGNN (Introduction)

The Atomistic Line Graph Neural Network (https://www.nature.com/articles/s41524-021-00650-1) introduces a new graph convolution layer that explicitly models both two-body and three-body interactions in atomistic systems.

This is achieved by composing two edge-gated graph convolution layers, the first applied to the atomistic line graph L(g) (representing triplet interactions) and the second applied to the atomistic bond graph g (representing pair interactions).

The atomistic graph g consists of a node for each atom i (with atom/node representations hi), and one edge for each atom pair within a cutoff radius (with bond/pair representations eij).

The atomistic line graph L(g) represents relationships between atom triplets: it has nodes corresponding to bonds (sharing representations eij with those in g) and edges corresponding to bond angles (with angle/triplet representations tijk).

The line graph convolution updates the triplet representations and the pair representations; the direct graph convolution further updates the pair representations and the atom representations.

ALIGNN layer schematic
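For a concrete feel of the two graphs described above, the short sketch below builds g and L(g) for a silicon crystal using jarvis-tools (graph construction lives in jarvis.core.graphs, see the Useful notes section). The cutoff value and keyword names are assumptions based on a recent jarvis-tools release and may differ in other versions.

# Minimal sketch (assuming the jarvis-tools API): build the atomistic graph g
# and its line graph L(g) for a diamond-structure silicon cell.
from jarvis.core.atoms import Atoms
from jarvis.core.graphs import Graph

si = Atoms(
    lattice_mat=[[5.43, 0, 0], [0, 5.43, 0], [0, 0, 5.43]],
    elements=["Si"] * 8,
    coords=[[0, 0, 0], [0.25, 0.25, 0.25], [0, 0.5, 0.5], [0.25, 0.75, 0.75],
            [0.5, 0, 0.5], [0.75, 0.25, 0.75], [0.5, 0.5, 0], [0.75, 0.75, 0.25]],
    cartesian=False,
)

# g: one node per atom, one edge per atom pair within the cutoff radius
# lg: one node per bond, one edge per bond angle (triplet)
g, lg = Graph.atom_dgl_multigraph(si, cutoff=8.0, compute_line_graph=True)
print(g)
print(lg)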

Installation

First, create a conda environment. Install Miniconda from https://conda.io/miniconda.html; based on your system requirements, you'll download an installer named something like 'Miniconda3-latest-XYZ'.

Now,

bash Miniconda3-latest-Linux-x86_64.sh (for linux)
bash Miniconda3-latest-MacOSX-x86_64.sh (for Mac)

For Windows, download the 32/64-bit Python 3.10 Miniconda installer (.exe) and install it. Now, let's make a conda environment, say "version" (choose another name as you like):

conda create --name version python=3.10
source activate version

Optional GPU dependencies

If you need CUDA support, it's best to install PyTorch and DGL before installing alignn to ensure that you get a CUDA-enabled version of DGL.

To install the stable release of PyTorch on Linux with cudatoolkit 11.8, run

conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia

Then install the matching DGL version

conda install -c dglteam/label/cu118 dgl

Some of our models may not be stable with the latest DGL release (v1.1.0) so you may wish to install v1.0.2 instead:

conda install -c dglteam/label/cu118 dgl==1.0.2.cu118
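After installing, a quick sanity check (plain PyTorch/DGL, nothing ALIGNN-specific) confirms that CUDA-enabled builds were picked up:

# Quick sanity check that CUDA-enabled torch and dgl are installed
import torch
import dgl

print("torch", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("dgl", dgl.__version__)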

Method 1 (editable in-place install)

You can install a development version of alignn by cloning the repository and installing in place with pip:

git clone https://github.com/usnistgov/alignn
cd alignn
python -m pip install -e .

Method 2 (using PyPI):

As an alternative method, ALIGNN can also be installed using pip as follows:

python -m pip install alignn

Examples

Dataset

The main script to train a model is train_folder.py. A user needs at least the following to train a model: 1) id_prop.csv, listing each structure file name and its corresponding target value, and 2) config_example.json, a config file with training and hyperparameters.

Users can keep their structure files (in POSCAR, .cif, .xyz, or .pdb format) in a directory. In the examples below we will use POSCAR format files. The same directory should also contain an id_prop.csv file.

In id_prop.csv, the filenames and corresponding target values are stored in comma-separated values (CSV) format.
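As a small illustration of this layout, the sketch below writes an id_prop.csv for two POSCAR files; the file names and target values here are made up for demonstration only.

# Illustrative sketch only: create an id_prop.csv next to the structure files.
# The entries below are made-up examples, not real data.
import csv

entries = [
    ("POSCAR-JVASP-10.vasp", 0.75),   # (structure file name, target value)
    ("POSCAR-JVASP-107.vasp", 1.92),
]

with open("id_prop.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows(entries)

# Resulting id_prop.csv (no header line):
# POSCAR-JVASP-10.vasp,0.75
# POSCAR-JVASP-107.vasp,1.92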

Here is an example of training OptB88vdw bandgaps for 50 materials from the JARVIS-DFT database. The example is created using the generate_sample_data_reg.py script. Users can modify the script for more than 50 entries, or make their own dataset in this format. For a list of available datasets, see Databases.

The dataset is split 80:10:10 into training, validation, and test sets (controlled by train_ratio, val_ratio, and test_ratio). To change the split proportions and other parameters, edit the config_example.json file. If you want to train on certain sets and validate/test on another dataset, set n_train, n_val, and n_test manually in config_example.json, and also set keep_data_order to True there so that random shuffling is disabled.
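A short sketch of adjusting these split settings programmatically (the keys follow the parameter names above; the output file name my_config.json is just an example):

# Sketch: tweak split-related fields of config_example.json before training.
import json

with open("alignn/examples/sample_data/config_example.json") as f:
    config = json.load(f)

# Ratio-based split (random shuffle enabled) ...
config["train_ratio"], config["val_ratio"], config["test_ratio"] = 0.8, 0.1, 0.1

# ... or fixed counts with the original file order preserved:
# config["n_train"], config["n_val"], config["n_test"] = 40, 5, 5
# config["keep_data_order"] = True

with open("my_config.json", "w") as f:
    json.dump(config, f, indent=2)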

A brief help guide (-h) can be obtained as follows.

train_folder.py -h

Regression example

Now, the model is trained as follows. Please increase the batch_size parameter to something like 32 or 64 in config_example.json for general training.

train_folder.py --root_dir "alignn/examples/sample_data" --config "alignn/examples/sample_data/config_example.json" --output_dir=temp

Classification example

While the above example is for regression, the following example shows a classification task for metal/non-metal based on the above bandgap values. We transform the dataset into 1 or 0 based on a threshold of 0.01 eV (controlled by the classification_threshold parameter) and train a similar classification model. Currently, the script allows binary classification tasks only.

train_folder.py --root_dir "alignn/examples/sample_data" --classification_threshold 0.01 --config "alignn/examples/sample_data/config_example.json" --output_dir=temp

Multi-output model example

While the above regression example was for single-output values, we can train multi-output regression models as well. An example is given below for training formation energy per atom, bandgap, and total energy per atom simultaneously. The script to generate the example data is provided in the script folder of sample_data_multi_prop. Another example, training electron and phonon density of states, is also provided.

train_folder.py --root_dir "alignn/examples/sample_data_multi_prop" --config "alignn/examples/sample_data/config_example.json" --output_dir=temp

Automated model training

Users can train on multiple datasets (such as JARVIS-DFT, Materials Project, QM9_JCTC, etc.) using the example scripts in alignn/scripts/train_*.py. This is done primarily to make training more automated, rather than preparing folders and CSV files by hand. These scripts automatically download datasets from the Databases in jarvis-tools and train several models. Make sure you specify your queuing-system details in the scripts.

Using pre-trained models

All the trained models are distributed on Figshare (https://figshare.com/projects/ALIGNN_models/126478).

The pretrained.py script can be used to load these models and make predictions directly.

A brief help section (-h) is shown using:

pretrained.py -h

An example of predicting the formation energy per atom using a model trained on the JARVIS-DFT dataset is shown below:

pretrained.py --model_name jv_formation_energy_peratom_alignn --file_format poscar --file_path alignn/examples/sample_data/POSCAR-JVASP-10.vasp
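If you prefer calling the pretrained models from Python instead of the command line, a rough sketch is below. It assumes alignn.pretrained exposes a get_prediction helper taking the same model name as the CLI example; verify the exact function name and signature against your installed version.

# Rough sketch; get_prediction and its arguments are assumptions to check
# against your installed alignn version.
from jarvis.core.atoms import Atoms
from alignn.pretrained import get_prediction  # assumed helper

atoms = Atoms.from_poscar("alignn/examples/sample_data/POSCAR-JVASP-10.vasp")
result = get_prediction(
    model_name="jv_formation_energy_peratom_alignn",  # same name as the CLI example
    atoms=atoms,
)
print(result)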

Quick start using a Google Colab notebook example

The following notebook provides an example of 1) installing the ALIGNN package, 2) training on the example data, and 3) using the pretrained models. For this example, you don't need to install the alignn package on your local computer/cluster; it only requires a Gmail account to log in. Learn more about Google Colab here.


Web-app

A basic web app for direct prediction is available at the JARVIS-ALIGNN app. Given an atomistic structure in POSCAR format, it predicts the formation energy, total energy per atom, and bandgap using models trained on the JARVIS-DFT dataset.


ALIGNN-FF

The ASE calculator provides an interface to various codes. An example for ALIGNN-FF is given below:

from alignn.ff.ff import AlignnAtomwiseCalculator, default_path
from ase import Atom, Atoms
import numpy as np
import matplotlib.pyplot as plt

# Load the default pretrained ALIGNN-FF model as an ASE calculator
model_path = default_path()
calc = AlignnAtomwiseCalculator(path=model_path)

# Scan the fcc Cu lattice constant and record the total energy at each point
lattice_params = np.linspace(3.5, 3.8)
fcc_energies = []
for a in lattice_params:
    atoms = Atoms([Atom('Cu', (0, 0, 0))],
                  cell=0.5 * a * np.array([[1.0, 1.0, 0.0],
                                           [0.0, 1.0, 1.0],
                                           [1.0, 0.0, 1.0]]),
                  pbc=True)
    atoms.set_tags(np.ones(len(atoms)))
    atoms.calc = calc
    fcc_energies.append(atoms.get_potential_energy())

# Plot energy vs. lattice constant (add %matplotlib inline in a Jupyter notebook)
plt.plot(lattice_params, fcc_energies)
plt.title('1x1x1')
plt.xlabel(r'Lattice constant ($\AA$)')
plt.ylabel('Total energy (eV)')
plt.show()
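As a small follow-up (plain NumPy, continuing from lattice_params and fcc_energies above), the equilibrium lattice constant can be estimated from the scan, for example with a quadratic fit:

# Continues from the scan above: estimate the equilibrium lattice constant
# from a quadratic fit to the E(a) curve.
coeffs = np.polyfit(lattice_params, fcc_energies, 2)
a_eq = -coeffs[1] / (2 * coeffs[0])  # vertex of the fitted parabola
print(f"Estimated equilibrium lattice constant: {a_eq:.3f} Angstrom")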

To train ALIGNN-FF, use the train_folder_ff.py script, which uses the atomwise_alignn model:

The atom-wise prediction example uses a similar setup as before, but instead of id_prop.csv it requires an id_prop.json file (see the example in the sample_data_ff directory):

train_folder_ff.py --root_dir "alignn/examples/sample_data_ff" --config "alignn/examples/sample_data_ff/config_example_atomwise.json" --output_dir=temp

A pretrained ALIGNN-FF model (currently under active development) can be used to predict several properties, for example:

run_alignn_ff.py --file_path alignn/examples/sample_data/POSCAR-JVASP-10.vasp --task="unrelaxed_energy"
run_alignn_ff.py --file_path alignn/examples/sample_data/POSCAR-JVASP-10.vasp --task="optimize"
run_alignn_ff.py --file_path alignn/examples/sample_data/POSCAR-JVASP-10.vasp --task="ev_curve"

To learn about other tasks, type:

run_alignn_ff.py -h

Performances

Please refer to JARVIS-Leaderboard to check the performance of ALIGNN models on several databases.

1) On JARVIS-DFT 2021 dataset (classification)

| Model | Threshold | ALIGNN |
|---|---|---|
| Metal/non-metal classifier (OPT) | 0.01 eV | 0.92 |
| Metal/non-metal classifier (MBJ) | 0.01 eV | 0.92 |
| Magnetic/non-magnetic classifier | 0.05 µB | 0.91 |
| High/low SLME | 10 % | 0.83 |
| High/low spillage | 0.1 | 0.80 |
| Stable/unstable (ehull) | 0.1 eV | 0.94 |
| High/low n-Seebeck | -100 µVK-1 | 0.88 |
| High/low p-Seebeck | 100 µVK-1 | 0.92 |
| High/low n-power factor | 1000 µW(mK2)-1 | 0.74 |
| High/low p-power factor | 1000 µW(mK2)-1 | 0.74 |

2) On JARVIS-DFT 2021 dataset (regression)

| Property | Units | MAD | CFID | CGCNN | ALIGNN | MAD:MAE |
|---|---|---|---|---|---|---|
| Formation energy | eV(atom)-1 | 0.86 | 0.14 | 0.063 | 0.033 | 26.06 |
| Bandgap (OPT) | eV | 0.99 | 0.30 | 0.20 | 0.14 | 7.07 |
| Total energy | eV(atom)-1 | 1.78 | 0.24 | 0.078 | 0.037 | 48.11 |
| Ehull | eV | 1.14 | 0.22 | 0.17 | 0.076 | 15.00 |
| Bandgap (MBJ) | eV | 1.79 | 0.53 | 0.41 | 0.31 | 5.77 |
| Kv | GPa | 52.80 | 14.12 | 14.47 | 10.40 | 5.08 |
| Gv | GPa | 27.16 | 11.98 | 11.75 | 9.48 | 2.86 |
| Mag. mom. | µB | 1.27 | 0.45 | 0.37 | 0.26 | 4.88 |
| SLME (%) | No unit | 10.93 | 6.22 | 5.66 | 4.52 | 2.42 |
| Spillage | No unit | 0.52 | 0.39 | 0.40 | 0.35 | 1.49 |
| Kpoint-length | Å | 17.88 | 9.68 | 10.60 | 9.51 | 1.88 |
| Plane-wave cutoff | eV | 260.4 | 139.4 | 151.0 | 133.8 | 1.95 |
| εx (OPT) | No unit | 57.40 | 24.83 | 27.17 | 20.40 | 2.81 |
| εy (OPT) | No unit | 57.54 | 25.03 | 26.62 | 19.99 | 2.88 |
| εz (OPT) | No unit | 56.03 | 24.77 | 25.69 | 19.57 | 2.86 |
| εx (MBJ) | No unit | 64.43 | 30.96 | 29.82 | 24.05 | 2.68 |
| εy (MBJ) | No unit | 64.55 | 29.89 | 30.11 | 23.65 | 2.73 |
| εz (MBJ) | No unit | 60.88 | 29.18 | 30.53 | 23.73 | 2.57 |
| ε (DFPT: elec+ionic) | No unit | 45.81 | 43.71 | 38.78 | 28.15 | 1.63 |
| Max. piezoelectric strain coeff. (dij) | CN-1 | 24.57 | 36.41 | 34.71 | 20.57 | 1.19 |
| Max. piezoelectric stress coeff. (eij) | Cm-2 | 0.26 | 0.23 | 0.19 | 0.147 | 1.77 |
| Exfoliation energy | meV(atom)-1 | 62.63 | 63.31 | 50.0 | 51.42 | 1.22 |
| Max. EFG | 10^21 Vm-2 | 43.90 | 24.54 | 24.7 | 19.12 | 2.30 |
| avg. me | electron mass unit | 0.22 | 0.14 | 0.12 | 0.085 | 2.59 |
| avg. mh | electron mass unit | 0.41 | 0.20 | 0.17 | 0.124 | 3.31 |
| n-Seebeck | µVK-1 | 113.0 | 56.38 | 49.32 | 40.92 | 2.76 |
| n-PF | µW(mK2)-1 | 697.80 | 521.54 | 552.6 | 442.30 | 1.58 |
| p-Seebeck | µVK-1 | 166.33 | 62.74 | 52.68 | 42.42 | 3.92 |
| p-PF | µW(mK2)-1 | 691.67 | 505.45 | 560.8 | 440.26 | 1.57 |

3) On Materials project 2018 dataset

The results from models other than ALIGNN are reported as given in corresponding papers, not necessarily reproduced by us.

| Prop | Unit | MAD | CFID | CGCNN | MEGNet | SchNet | ALIGNN | MAD:MAE |
|---|---|---|---|---|---|---|---|---|
| Ef | eV(atom)-1 | 0.93 | 0.104 | 0.039 | 0.028 | 0.035 | 0.022 | 42.27 |
| Eg | eV | 1.35 | 0.434 | 0.388 | 0.33 | - | 0.218 | 6.19 |

4) On QM9 dataset

Note the known issue related to the QM9 dataset (see the Useful notes section). The results from models other than ALIGNN are reported as given in the corresponding papers, not necessarily reproduced by us. These models were trained with the same parameters as the solid-state databases, but for 1000 epochs.

| Target | Units | SchNet | MEGNet | DimeNet++ | ALIGNN |
|---|---|---|---|---|---|
| HOMO | eV | 0.041 | 0.043 | 0.0246 | 0.0214 |
| LUMO | eV | 0.034 | 0.044 | 0.0195 | 0.0195 |
| Gap | eV | 0.063 | 0.066 | 0.0326 | 0.0381 |
| ZPVE | eV | 0.0017 | 0.00143 | 0.00121 | 0.0031 |
| µ | Debye | 0.033 | 0.05 | 0.0297 | 0.0146 |
| α | Bohr3 | 0.235 | 0.081 | 0.0435 | 0.0561 |
| R2 | Bohr2 | 0.073 | 0.302 | 0.331 | 0.5432 |
| U0 | eV | 0.014 | 0.012 | 0.00632 | 0.0153 |
| U | eV | 0.019 | 0.013 | 0.00628 | 0.0144 |
| H | eV | 0.014 | 0.012 | 0.00653 | 0.0147 |
| G | eV | 0.014 | 0.012 | 0.00756 | 0.0144 |

5) On hMOF dataset

| Property | Unit | MAD | MAE | MAD:MAE | R2 | RMSE |
|---|---|---|---|---|---|---|
| Grav. surface area | m2 g-1 | 1430.82 | 91.15 | 15.70 | 0.99 | 180.89 |
| Vol. surface area | m2 cm-3 | 561.44 | 107.81 | 5.21 | 0.91 | 229.24 |
| Void fraction | No unit | 0.16 | 0.017 | 9.41 | 0.98 | 0.03 |
| LCD | Å | 3.44 | 0.75 | 4.56 | 0.83 | 1.83 |
| PLD | Å | 3.55 | 0.92 | 3.86 | 0.78 | 2.12 |
| All adsp | mol kg-1 | 1.70 | 0.18 | 9.44 | 0.95 | 0.49 |
| Adsp at 0.01 bar | mol kg-1 | 0.12 | 0.04 | 3.00 | 0.77 | 0.11 |
| Adsp at 2.5 bar | mol kg-1 | 2.16 | 0.48 | 4.50 | 0.90 | 0.97 |

6) On qMOF dataset

MAE on electronic bandgap: 0.20 eV

7) On OMDB dataset

coming soon!

8) On HOPV dataset

coming soon!

9) On QETB dataset

coming soon!

10) On OpenCatalyst dataset

On 10k dataset:

| Data split | CGCNN | DimeNet | SchNet | DimeNet++ | ALIGNN | MAD:MAE |
|---|---|---|---|---|---|---|
| 10k | 0.988 | 1.0117 | 1.059 | 0.8837 | 0.61 | - |

Useful notes (based on some of the queries we received)

  1. If you are using GPUs, make sure you have a compatible dgl-cuda version installed, for example dgl-cu101 or dgl-cu111, e.g. pip install dgl-cu111.
  2. The undirected graph and its line graph are constructed in the jarvis-tools package using jarvis.core.graphs.
  3. While conventional '.cif' and '.pdb' files can be read using jarvis-tools, for complex files you might have to install cif2cell and pytraj respectively, i.e. pip install cif2cell==2.0.0a3 and conda install -c ambermd pytraj.
  4. Make sure you use a batch_size of 32 or 64 for large datasets, and not 2 as given in the example config file, otherwise training will take much longer and performance might drop a lot.
  5. Note that train_folder.py and pretrained.py in the alignn folder are executable Python scripts, so they should work even if you don't provide their absolute paths.
  6. Learn about the issue with QM9 results here: #54
  7. Make sure you have pandas version 1.2.3.

References

  1. Atomistic Line Graph Neural Network for improved materials property predictions
  2. Prediction of the Electron Density of States for Crystalline Compounds with Atomistic Line Graph Neural Networks (ALIGNN)
  3. Recent advances and applications of deep learning methods in materials science
  4. Designing High-Tc Superconductors with BCS-inspired Screening, Density Functional Theory and Deep-learning
  5. A Deep-learning Model for Fast Prediction of Vacancy Formation in Diverse Materials
  6. Graph neural network predictions of metal organic framework CO2 adsorption properties
  7. Rapid Prediction of Phonon Structure and Properties using an Atomistic Line Graph Neural Network (ALIGNN)
  8. Unified graph neural network force-field for the periodic table

Please see detailed publications list here.

How to contribute

For detailed instructions, please see Contribution instructions

Correspondence

Please report bugs as GitHub issues (https://github.com/usnistgov/alignn/issues) or email to [email protected].

Funding support

NIST-MGI (https://www.nist.gov/mgi).

Code of conduct

Please see Code of conduct
