Ameya D. Jagtap (@AmeyaJagtap)
  • Stars 295
  • Global Rank 90,261 (Top 4 %)
  • Followers 160
  • Registered about 6 years ago
  • Most used languages: Python (100.0 %)
  • Location 🇺🇸 United States
  • Country Total Rank 29,950
  • Country Ranking (Python): 8,554

Top repositories

1. XPINNs (146 stars)
   Extended Physics-Informed Neural Networks (XPINNs): A Generalized Space-Time Domain Decomposition Based Deep Learning Framework for Nonlinear Partial Differential Equations
2. Conservative_PINNs (Python, 55 stars)
   We propose a conservative physics-informed neural network (cPINN) on decomposed domains for nonlinear conservation laws. The conservation property of cPINN is obtained by enforcing flux continuity in the strong form along the sub-domain interfaces (see the code sketch after this list).
3. Locally-Adaptive-Activation-Functions-Neural-Networks- (Python, 38 stars)
   Python codes for the Locally Adaptive Activation Function (LAAF) used in deep neural networks (see the code sketch after this list). Please cite this work as "A D Jagtap, K Kawaguchi, G E Karniadakis, Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 20200334, 2020. (http://dx.doi.org/10.1098/rspa.2020.0334)".
4. XPINNs_TensorFlow-2 (25 stars)
   XPINN code written in TensorFlow 2
5. Rowdy_Activation_Functions (Python, 10 stars)
   We propose the Deep Kronecker Neural Network, a general framework for neural networks with adaptive activation functions. In particular, we propose Rowdy activation functions, which inject sinusoidal fluctuations, thereby allowing the optimizer to exploit more of the loss landscape and train the network faster (see the code sketch after this list). Various test cases, ranging from function approximation and inferring PDE solutions to standard deep learning benchmarks such as MNIST, CIFAR-10, CIFAR-100, and SVHN, are solved to show the efficacy of the proposed activation functions.
6. Adaptive_Activation_Functions (9 stars)
   We propose simple adaptive activation functions for deep neural networks. The proposed method is easy to implement in any neural network architecture (see the code sketch after this list).
7. Error_estimates_PINN_and_XPINN_NonlinearPDEs (6 stars)
   The first comprehensive theoretical analysis of PINNs (and XPINNs) for a prototypical nonlinear PDE, the Navier-Stokes equations, is given.
8. Physics_Informed_Deep_Learning (Python, 1 star)
   Short course on physics-informed deep learning
9. Augmented_PINNs_-APINNs- (1 star)
10. Activation-functions-in-regression-and-classification (1 star)
    How important are activation functions in regression and classification? A survey, performance comparison, and future directions
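
For the Conservative_PINNs entry (2), a minimal sketch of what the interface coupling could look like. This is an illustrative PyTorch fragment, not the repository's code (the repo itself is listed as Python and the related XPINN code uses TensorFlow 2): the function name, the use of the two-sided average for solution continuity, and the mean-squared penalties are assumptions based only on the description above.

```python
import torch

def cpinn_interface_loss(u_left, u_right, flux_left, flux_right):
    """Illustrative cPINN-style interface penalty (assumed form).

    u_left / u_right:       solution predictions of the two neighbouring
                            sub-domain networks at shared interface points
    flux_left / flux_right: the corresponding flux values at those points

    Each sub-network is pulled towards a common interface value (here the
    average of the two predictions), and the flux is penalised for jumping
    across the interface, i.e. continuity in the strong form.
    """
    u_avg = 0.5 * (u_left + u_right)
    solution_continuity = torch.mean((u_left - u_avg) ** 2) \
                        + torch.mean((u_right - u_avg) ** 2)
    flux_continuity = torch.mean((flux_left - flux_right) ** 2)
    return solution_continuity + flux_continuity
```

In a full training loop this term would be added to the usual PDE-residual and data losses of each sub-domain network.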
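
Entries 3 and 6 both describe trainable adaptive activation functions. Below is a minimal PyTorch sketch of the layer-wise variant, assuming a tanh base activation, a fixed scale factor n, and the initialisation a = 1/n; these choices are assumptions for illustration, not the repositories' actual implementations.

```python
import torch
import torch.nn as nn

class LAAFDense(nn.Module):
    """Dense layer with a layer-wise trainable slope inside the activation:
    y = tanh(n * a * (W x + b)), where `a` is learned jointly with W and b."""
    def __init__(self, in_features, out_features, n=10.0):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.n = n                                    # fixed scale factor (assumed)
        self.a = nn.Parameter(torch.tensor(1.0 / n))  # trainable slope (assumed init)

    def forward(self, x):
        return torch.tanh(self.n * self.a * self.linear(x))
```

Because `a` is a single extra scalar per layer, the idea drops into any existing feed-forward architecture, which is the point made in entry 6.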
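
For the Rowdy_Activation_Functions entry (5), a sketch of an activation that adds trainable sinusoidal fluctuations on top of a smooth base function. The number of terms K, the frequency scaling n, and the zero-initialised amplitudes are illustrative assumptions, not the repository's code.

```python
import torch
import torch.nn as nn

class RowdyTanh(nn.Module):
    """tanh base activation plus K trainable sinusoidal fluctuation terms."""
    def __init__(self, K=3, n=10.0):
        super().__init__()
        self.a = nn.Parameter(torch.zeros(K))  # fluctuation amplitudes, start at zero
        self.n = n                             # frequency scale (assumed)

    def forward(self, x):
        out = torch.tanh(x)                    # smooth base activation
        for k in range(1, self.a.shape[0] + 1):
            out = out + self.a[k - 1] * torch.sin(k * self.n * x)
        return out
```

With zero-initialised amplitudes the network starts out as a plain tanh network, and the high-frequency fluctuations are switched on during training only where they reduce the loss.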