  • Stars: 213
  • Rank: 177,436 (top 4%)
  • Language: F#
  • License: MIT License
  • Created: over 5 years ago
  • Updated: about 2 years ago


Repository Details

TensorFlow API for F# + F# for AI Models eDSL

This repo contains archival material about "F# for AI Models".

Contents:

  • FM: An F# DSL for AI Models with separated shape checking and tooling

    FM was a prototype F# eDSL for writing numeric models. It has now been subsumed by DiffSharp 1.0.

  • The TensorFlow API for F#

    This is now archived. We recommend TensorFlow.NET or DiffSharp 1.0.

  • Live Checking Tooling for AI models

    This is now being merged to DiffSharp 1.0.

  • fsx2nb, now part of the fsdocs tool.

ARCHIVAL MATERIAL: FM: An F# DSL for AI Models

Models written in FM can be passed to optimization and training algorithms utilising automatic differentiation without any change to modelling code, and can be executed on GPUs and TPUs using TensorFlow.

There is also experimental tooling for interactive tensor shape-checking, inference, tooltips and other nice things.

This is a proof of concept that F# can be configured to be suitable for authoring AI models. Models are executed as real, full-speed TensorFlow graphs, so FM coexists with and benefits from the TensorFlow ecosystem. Live trajectory-execution tooling adds interactive correctness guarantees and improves developer productivity.

FM is implemented in the FSAI.Tools package built in this repo.

The aim of FM is to support the authoring of numeric functions and AI models - including neural networks - in F# code. For example:

/// A numeric function of two parameters, returning a scalar, see
/// https://en.wikipedia.org/wiki/Gradient_descent
let f (xs: DT<double>) = 
    sin (v 0.5 * sqr xs.[0] - v 0.25 * sqr xs.[1] + v 3.0) * -cos (v 2.0 * xs.[0] + v 1.0 - exp xs.[1])

These functions and models can then be passed to optimization algorithms that utilise gradients, e.g.

// Train the function above by gradient descent, starting from the point (-0.3, 0.3)
let train numSteps = GradientDescent.train f (vec [ -0.3; 0.3 ]) numSteps

let results = train 200 |> Seq.last

FM supports the live "trajectory" checking of key correctness properties of your numeric code, including vector, matrix and tensor size checking, together with tooling to interactively report the sizes. To activate this tooling you specify a LiveCheck that is interactively executed by the experimental tooling described further below.

[<LiveCheck>] 
let check1 = train 4 |> Seq.last 

When using live checks, the underlying tensors are not actually populated with data; instead only their shapes are analyzed. Arrays and raw numeric values are computed as normal.

Typically each model is equipped with one LiveCheck that instantiates the model on training data.
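For example, here is a minimal sketch of the kind of mismatch a live check can surface, assuming only the v/vec constructors and elementwise addition over DT values shown above (the tooling itself is described later in this document):

let addVectors (xs: DT<double>) (ys: DT<double>) = xs + ys

[<LiveCheck>]
// Only shapes flow through this check: the tooling can report the size
// mismatch (length 3 vs length 2) without populating any tensor with data.
let checkAdd = addVectors (vec [ 1.0; 2.0; 3.0 ]) (vec [ 4.0; 5.0 ]) |> fm.eval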

ARCHIVAL MATERIAL: Optimization algorithms utilising gradients

The aim of FM is to allow the clean description of numeric code while still allowing that code to be executed using TensorFlow and, in the future, other tensor fabrics such as Torch (TorchSharp) and DiffSharp. These fabrics automatically compute the gradients of your models and functions with respect to model parameters and/or function inputs. Gradients are usually computed inside an optimization algorithm.

For example, a naive version of Gradient Descent is shown below:

module GradientDescent =

    // Note: the learning rate in this example is constant. Many practical optimizers
    // use a variable (often decreasing) rate.
    let rate = 0.005

    // Gradient descent
    let step f xs =   
        // Get the partial derivatives of the function
        let df xs =  fm.diff f xs  
        printfn "xs = %A" xs
        let dzx = df xs 
        // evaluate to output values 
        xs - v rate * dzx |> fm.eval

    let train f initial steps = 
        initial |> Seq.unfold (fun pos -> Some (pos, step f pos)) |> Seq.truncate steps 

Note the call to fm.diff: FM allows optimizers to derive the gradients of FM functions and models, in a way inspired by the design of DiffSharp. For example:

// Define a function which will be executed using TensorFlow
let f x = x * x + v 4.0 * x 

// Get the derivative of the function. This computes "x*2 + 4.0"
let df x = fm.diff f x  

// Run the derivative 
df (v 3.0) |> fm.RunScalar // returns 6.0 + 4.0 = 10.0

To differentiate a scalar function with multiple input variables:

// Define a function which will be executed using TensorFlow
// computes x1*x1*x3 + x2*x2*x2 + x3*x3*x1
let f (xs: DT<'T>) = sum (xs * xs * fm.Reverse xs)

// Get the partial derivatives of the scalar function
// computes [ 2*x1*x3 + x3*x3; 3*x2*x2; 2*x3*x1 + x1*x1 ]
let df xs = fm.diff f xs   

// Run the derivative 
df (vec [ 3.0; 4.0; 5.0 ]) |> fm.RunArray // returns [ 55.0; 48.0; 39.0 ]

ARCHIVAL MATERIAL: A Larger Example

Below we show how to fit a linear model to training data by differentiating a loss function w.r.t. the coefficients and optimizing using gradient descent (500 training points generated by a linear function plus noise, 10 coefficients, 200 steps of gradient descent).

module ModelExample =

    let modelSize = 10

    let checkSize = 5

    let trainSize = 500

    let validationSize = 100

    let rnd = Random()

    let noise eps = (rnd.NextDouble() - 0.5) * eps 

    /// The true function we use to generate the training data (also a linear model plus some noise)
    let trueCoeffs = [| for i in 1 .. modelSize -> double i |]

    let trueFunction (xs: double[]) = 
        Array.sum [| for i in 0 .. modelSize - 1 -> trueCoeffs.[i] * xs.[i]  |] + noise 0.5

    let makeData size = 
        [| for i in 1 .. size -> 
            let xs = [| for i in 0 .. modelSize - 1 -> rnd.NextDouble() |]
            xs, trueFunction xs |]
         
    /// Make the data used to symbolically check the model
    let checkData = makeData checkSize

    /// Make the training data
    let trainData = makeData trainSize

    /// Make the validation data
    let validationData = makeData validationSize
 
    let prepare data = 
        let xs, y = Array.unzip data
        let xs = batchOfVecs xs
        let y = batchOfScalars y
        (xs, y)

    /// evaluate the model for input and coefficients
    let model (xs: DT<double>, coeffs: DT<double>) = 
        fm.Sum (xs * coeffs, axis= [| 1 |])
           
    let meanSquareError (z: DT<double>) tgt = 
        let dz = z - tgt 
        fm.Sum (dz * dz) / v (double modelSize) / v (double z.Shape.[0].Value) 

    /// The loss function for the model w.r.t. a true output
    let loss (xs, y) coeffs = 
        let y2 = model (xs, batchExtend coeffs)
        meanSquareError y y2
          
    let validation coeffs = 
        let z = loss (prepare validationData) (vec coeffs)
        z |> fm.eval

    let train inputs steps =
        let initialCoeffs = vec [ for i in 0 .. modelSize - 1 -> rnd.NextDouble()  * double modelSize ]
        let inputs = prepare inputs
        GradientDescent.train (loss inputs) initialCoeffs steps
           
    [<LiveCheck>]
    let check1 = train checkData 1  |> Seq.last

    let learnedCoeffs = train trainData 200 |> Seq.last |> fm.toArray
         // [|1.017181246; 2.039034327; 2.968580146; 3.99544071; 4.935430581;
         //   5.988228378; 7.030374908; 8.013975714; 9.020138699; 9.98575733|]

    validation trueCoeffs

    validation learnedCoeffs

More examples/tests are in dsl-live.fsx.

The approach scales to the complete expression of deep neural networks and full computation graphs. The links below show the implementation of a common DNN sample (the samples may not yet run; this is wet paint). A purely illustrative sketch of the style follows.
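As a rough, hypothetical illustration only (not taken from the repository), a small dense network written in this style might look as follows; fm.MatMul and fm.Relu are assumed operator names, chosen by analogy with the fm.Sum operator used above, and are not confirmed parts of the API:

// Hypothetical sketch: a two-layer network in the FM style.
// fm.MatMul and fm.Relu are assumed names, not confirmed API.
let dense (w: DT<double>) (b: DT<double>) (x: DT<double>) =
    fm.MatMul (x, w) + b

let mlp (w1, b1, w2, b2) (x: DT<double>) =
    let h = fm.Relu (dense w1 b1 x)
    dense w2 b2 h

Because such a model is just an F# function over DT<double> values, it could be passed to the same GradientDescent.train loop and equipped with a LiveCheck in the same way as the linear model above.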

The design is intended to allow alternative execution with Torch or DiffSharp. DiffSharp may be used once Tensors are available in that library.

ARCHIVAL MATERIAL: Technical notes

  • DT stands for "differentiable tensor", and the single type DT<_> is used to represent differentiable scalars, vectors, matrices and tensors. If you are familiar with the design of DiffSharp there are similarities here: DiffSharp defines D (differentiable scalar), DV (differentiable vector) and DM (differentiable matrix).

  • fm.gradients is used to get the gradients of arbitrary outputs w.r.t. arbitrary inputs.

  • fm.diff is used to differentiate R^n -> R scalar-valued functions (e.g. loss functions) w.r.t. multiple input variables. If a scalar input is used, a single total derivative is returned; if a vector of inputs is used, a vector of partial derivatives is returned (see the sketch after this list).

  • In the prototype, all gradient-based functions are implemented using TensorFlow's AddGradients, i.e. the C++ implementation of gradients. This has many limitations.

  • fm.* is a DSL for expressing differentiable tensors using the TensorFlow fundamental building blocks. The naming of operators in this DSL is currently TensorFlow-specific and may change.

  • A preliminary pass of shape inference is performed before any TensorFlow operations are executed. This allows you to check the shapes of your differentiable code independently of TensorFlow's shape computations. The shape inference system, akin to F# type inference, allows many shapes to be inferred; it also means that not all of TensorFlow's automatic shape transformations are applied during shape inference.
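For example, building on the fm.diff examples earlier in this document (a small sketch; fm.RunScalar and fm.RunArray are the runners used in those examples):

// Scalar input: fm.diff returns the single total derivative.
let g x = x * x + v 4.0 * x
fm.diff g (v 3.0) |> fm.RunScalar                  // 2*3 + 4 = 10.0

// Vector input: fm.diff returns the vector of partial derivatives.
let h (xs: DT<double>) = sum (xs * xs)
fm.diff h (vec [ 1.0; 2.0; 3.0 ]) |> fm.RunArray   // [ 2.0; 4.0; 6.0 ]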

ARCHIVAL MATERIAL: The TensorFlow API for F#

See FSAI.Tools. This API is designed in a similar way to TensorFlowSharp, but is implemented directly in F# and contains some additional functionality.

ARCHIVAL MATERIAL: Live Checking Tooling for AI models

There is some tooling to do "live trajectory execution" of models and training on limited training sets, reporting tensor sizes and performing tensor size checking.

LiveCheck for a vector addition (image omitted)

LiveCheck for a DNN (image omitted)

  1. Clone the necessary repos

    git clone http://github.com/dotnet/fsharp
    git clone http://github.com/fsprojects/FSharp.Compiler.PortaCode
    git clone http://github.com/fsprojects/fsharp-ai-tools
    
  2. Build the VS tooling with the extensibility "hack" to allow 3rd party tools to add checking and tooltips

    cd fsharp
    git fetch https://github.com/dsyme/fsharp livecheck
    git checkout livecheck
    .\build.cmd
    cd ..
    
  3. Compile the extra tool

    dotnet build FSharp.Compiler.PortaCode
    
  4. Compile this repo

    dotnet build fsharp-ai-tools
    
  5. Start the tool and edit using experimental VS instance

    cd fsharp-ai-tools\examples
    devenv.exe /rootsuffix RoslynDev
    ..\..\..\FSharp.Compiler.PortaCode\FsLive.Cli\bin\Debug\net471\FsLive.Cli.exe --eval --writeinfo --watch --vshack --livechecksonly  --define:LIVECHECK dsl-live.fsx
    
    (open dsl-live.fsx)
    

ARCHIVAL MATERIAL: fsx2nb

There is a separate tool fsx2nb in the repo to convert F# scripts to F# Jupyter notebooks:

dotnet fsi tools\fsx2nb.fsx -i script\sample.fsx

These scripts use the following elements:

(**markdown 

*)               -- the enclosed text becomes a markdown cell

(**cell *)       -- delimits between two code cells

(**ydec xyz *)   -- adds 'xyz' to a code cell for use in producing visual outputs

#if INTERACTIVE   -- this is removed in a code block
...
#endif

#if COMPILED   -- this is removed in a code block
...
#endif


#if NOTEBOOK   -- this is kept and the #if/#endif lines are removed
...
#endif
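
A minimal example script using these markers might look like the following (purely illustrative, based only on the marker descriptions above):

(**markdown 
# Sample notebook
This text becomes a markdown cell.
*)

let xs = [ 1.0; 2.0; 3.0 ]

(**cell *)

let total = List.sum xs

#if NOTEBOOK
printfn "total = %f" total
#endif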

Building

dotnet build
dotnet test
dotnet pack

More Repositories

1. Paket (F#, 1,969 stars): A dependency manager for .NET with support for NuGet packages and Git repositories.
2. FAKE (F#, 1,271 stars): FAKE - F# Make.
3. awesome-fsharp (1,129 stars): A curated list of awesome F# frameworks, libraries, software and resources.
4. Avalonia.FuncUI (F#, 827 stars): Develop cross-platform GUI applications using F# and Avalonia!
5. FSharp.Data (F#, 801 stars): F# Data: Library for Data Access.
6. FSharpPlus (F#, 799 stars): Extensions for F#.
7. fantomas (F#, 723 stars): F# source code formatter.
8. FSharpx.Extras (F#, 676 stars): Functional programming and other utilities from the original "fsharpx" project.
9. Rezoom.SQL (F#, 663 stars): Statically typechecks a common SQL dialect and translates it to various RDBMS backends.
10. SQLProvider (F#, 541 stars): A general F# SQL database erasing type provider, supporting LINQ queries, schema exploration, individuals, CRUD operations and much more besides.
11. ProjectScaffold (F#, 514 stars): A prototypical .NET solution (file system layout and tooling), recommended for F# projects.
12. FSharp.Formatting (F#, 450 stars): F# tools for generating documentation (Markdown processor and F# code formatter).
13. IfSharp (Jupyter Notebook, 441 stars): F# for Jupyter Notebooks.
14. Argu (F#, 428 stars): A declarative CLI argument parser for F#.
15. FsHttp (F#, 407 stars): A lightweight F# HTTP library by @SchlenkR and @dawedawe.
16. FsUnit (F#, 407 stars): FsUnit makes unit-testing with F# more enjoyable. It adds a special syntax to your favorite .NET testing framework.
17. FSharp.Data.GraphQL (F#, 389 stars): F# implementation of the Facebook GraphQL query language.
18. fsharp-companies (370 stars): Community curated list of companies that use F#.
19. fsharp-cheatsheet (F#, 318 stars): This cheatsheet aims to succinctly cover the most important aspects of F# 6.0.
20. FSharp.TypeProviders.SDK (F#, 295 stars): The SDK for creating F# type providers.
21. pulsar-client-dotnet (F#, 288 stars): Apache Pulsar native client for .NET (C#/F#/VB).
22. FSharpLint (F#, 287 stars): Lint tool for F#.
23. FSharp.Control.Reactive (F#, 281 stars): Extensions and wrappers for using Reactive Extensions (Rx) with F#.
24. FsReveal (F#, 258 stars): FsReveal parses markdown and F# script files and generates reveal.js slides.
25. SwaggerProvider (F#, 255 stars): F# generative Type Provider for Swagger.
26. FSharpx.Collections (F#, 243 stars): FSharpx.Collections is a collection of data structures for use with F# and C#.
27. FSharp.Data.Adaptive (F#, 240 stars): On-demand adaptive/incremental data for F# https://fsprojects.github.io/FSharp.Data.Adaptive/
28. fsharp-language-server (F#, 215 stars)
29. FSharp.Json (F#, 213 stars): F# JSON reflection-based serialization library.
30. FSharp.Data.SqlClient (F#, 207 stars): A set of F# Type Providers for statically typed access to MS SQL databases.
31. FsLexYacc (F#, 195 stars): Lexer and parser generators for F#.
32. Fleece (F#, 194 stars): JSON mapper for F#.
33. Chessie (F#, 190 stars): Railway-oriented programming for .NET.
34. ExcelFinancialFunctions (F#, 184 stars): .NET Standard library providing the full set of financial functions from Excel.
35. FsXaml (F#, 171 stars): F# tools for working with XAML projects.
36. FSharp.Control.AsyncSeq (F#, 159 stars): Asynchronous sequences for F#.
37. Paket.VisualStudio (C#, 148 stars): Manage your Paket (http://fsprojects.github.io/Paket/) dependencies from Visual Studio!
38. FSharp.UMX (F#, 148 stars): F# units of measure for primitive non-numeric types.
39. ExcelProvider (F#, 137 stars): This library is for the .NET platform, implementing an Excel type provider.
40. FsBlog (CSS, 134 stars): Blog-aware, static site generation using F#.
41. TickSpec (F#, 131 stars): Lean .NET BDD framework with powerful F# integration.
42. SIMDArray (F#, 128 stars): SIMD enhanced Array operations.
43. FSharp.Configuration (F#, 114 stars): The FSharp.Configuration project contains type providers for the configuration of .NET projects.
44. FSharp.Interop.Dynamic (F#, 94 stars): DLR interop for F#; works like the dynamic keyword in C#.
45. FSharpx.Async (F#, 93 stars): Asynchronous programming utilities for F#.
46. FSharp.Management (F#, 88 stars): The FSharp.Management project contains various type providers for the management of the machine.
47. AzureStorageTypeProvider (F#, 83 stars): An F# Azure Type Provider which can be used to explore Blob, Table and Queue Azure Storage assets and easily apply CRUD operations on them.
48. FSharp.Control.TaskSeq (F#, 78 stars): A computation expression and module for seamless working with IAsyncEnumerable<'T> as if it is just another sequence.
49. Foq (F#, 76 stars): A unit testing framework for F#.
50. FSharp.ViewModule (F#, 75 stars): Library providing MVVM and INotifyPropertyChanged support for F# projects.
51. FSharp.Azure.Storage (F#, 75 stars): F# API for using the Microsoft Azure Table Storage service.
52. FSharp.Text.RegexProvider (F#, 73 stars): A type provider for regular expressions.
53. FSharp.Core.Fluent (F#, 71 stars): Fluent members for F# FSharp.Core functions.
54. Mechanic (F#, 69 stars)
55. Incremental.NET (F#, 68 stars): A library for incremental computations. Based on janestreet/incremental (https://github.com/janestreet/incremental) for OCaml.
56. FSharp.Collections.ParallelSeq (F#, 67 stars): Parallel (multi-core) sequence operations.
57. FSharp.Quotations.Evaluator (F#, 67 stars): A quotations evaluator/compiler for F# based on LINQ expression tree compilation.
58. FSharp.Linq.ComposableQuery (F#, 67 stars): Compositional query framework for F# queries, based on "A Practical Theory of Language-Integrated Query".
59. OpenAPITypeProvider (F#, 66 stars): F# type provider for the Open API specification.
60. FSharp.Data.Toolbox (F#, 57 stars): F# Data-based library for various data access APIs.
61. FSharp.AWS.DynamoDB (F#, 56 stars): F# wrapper API for AWS DynamoDB.
62. DynamoDb.SQL (F#, 54 stars): SQL-like external DSL for querying and scanning Amazon DynamoDB.
63. FsRandom (F#, 51 stars): A purely-functional random number generator framework designed for F#.
64. Z3Fs (F#, 51 stars): Simple DSL to solve SMT problems using the Z3 API in F#.
65. FSharp.Data.JsonSchema (F#, 49 stars)
66. SyntacticVersioning (F#, 46 stars): Helper tool to verify semantic version changes based on API surface area changes.
67. fantomas-for-vs (HTML, 46 stars): Visual Studio formatter for F#.
68. FSharp.Compatibility (F#, 44 stars): Compatibility libraries for F#.
69. FSharp.Compiler.PortaCode (F#, 43 stars): The PortaCode F# code format and corresponding interpreter. Used by Fabulous and others.
70. Interstellar (F#, 42 stars): Cross-platform desktop apps in F# using web tech - https://www.nuget.org/packages/Interstellar.Core/
71. FSharp.Interop.PythonProvider (F#, 41 stars): Early experimental F# type provider for Python.
72. FSharp.Data.TypeProviders (F#, 39 stars): F# Type Providers for SqlDataConnection, SqlEntityConnection, ODataService, WsdlService and EdmxFile using .NET Framework generators.
73. FSharp.CloudAgent (F#, 38 stars): Allows running F# agents in a distributed manner using Azure Service Bus.
74. Roslyn.FSharp (F#, 37 stars): Roslyn read-only API to work with F# code (via a bridge to FSharp.Compiler.Service).
75. FnuPlot (F#, 36 stars): An F# wrapper for the gnuplot charting library.
76. GraphProvider (F#, 35 stars): A state machine type provider.
77. fsharp-linting-for-vs (C#, 34 stars): Visual Studio linter for F#.
78. fantomas-tools (F#, 34 stars): Collection of tools used when developing for Fantomas.
79. LocSta (JavaScript, 33 stars): An F# library for composing state-aware functions by @SchlenkR.
80. FSharp.Span.Utils (F#, 33 stars): Makes Span/ReadOnlySpan easy to use from F#.
81. FSharp.Data.Xsd (F#, 31 stars): XML Type Provider with schema support.
82. .github (28 stars): The place to request for projects to be added or removed from the incubation space.
83. Zander (F#, 27 stars): Regular expressions for matrix information, i.e. parse structured blocks of information from CSV or Excel files (or similar 2D matrices).
84. FSharp.Compiler.CodeDom (F#, 26 stars): An F# CodeDOM implementation (based on the old F# Power Pack).
85. FSharp.Data.WsdlProvider (F#, 23 stars): An implementation of the WsdlProvider compatible with netfx and netcore.
86. FsMath3D (F#, 22 stars): F# 3D math library for realtime applications.
87. S3Provider (F#, 22 stars): Experimental type provider for Amazon S3.
88. ReasoningEngine (F#, 22 stars): Symbolic analysis of discrete dynamical systems.
89. BioProviders (F#, 21 stars): F# library for accessing and manipulating bioinformatic datasets.
90. FSharpPerf (F#, 21 stars): A set of performance test scripts for the F# compiler.
91. MarkerClustering (F#, 19 stars): A component to cluster map markers.
92. DynamicsCRMProvider (F#, 16 stars): A type provider for Microsoft Dynamics CRM 2011.
93. Amazon.SimpleWorkflow.Extensions (F#, 16 stars): Extensions to the AmazonSDK's SimpleWorkflow capabilities to make it more intuitive to use.
94. Canopy.Mobile (F#, 14 stars): Canopy testing framework for mobile apps.
95. LSON (F#, 14 stars): Lisp-inspired serialization (intended for when you don't even want to take a dependency on a JSON serializer).
96. FSharp.Codecs.Redis (F#, 13 stars): F# Redis codecs based on Fleece patterns.
97. matprovider (F#, 11 stars): Type provider for .mat files.
98. TensorflowTypeProvider (F#, 11 stars): This Type Provider aims to eliminate the need for 'magic strings' associated with accessing pre-trained TensorFlow graphs. Typed access to NPY/NPZ is also included.
99. FsGlfw3 (F#, 9 stars): F# GLFW 3 binding.
100. FSharpPlus.CSharp (F#, 9 stars): F# + C# ❤️ use some F# base types for great good.