Sqlite Index Blaster

Create huge Sqlite indexes at breakneck speeds

This library provides an API for creating huge Sqlite indexes at breakneck speeds. It can index millions of records much faster than the official SQLite library because it leaves out crash recovery.

This repo exploits a lesser-known feature of the Sqlite database file format to store records as key-value pairs, documents or regular tuples.

Python port here: https://github.com/siara-cc/sqlite_blaster_python

Statement of need

There are a number of choices available for fast insertion of records, such as RocksDB, LMDB and MongoDB, but even they are slow due to the overhead of using logs or journals to provide durability. This overhead is significant when indexing huge datasets.

This library was created for inserting and updating billions of entries to arrive at word/phrase frequencies for building dictionaries for the Unishox project, using publicly available texts and conversations.

Furthermore, the other choices do not have as many IDEs or querying abilities as the most popular Sqlite data format.

Applications

  • Lightning fast index creation for huge datasets
  • Fast database indexing for embedded systems
  • Fast data set creation and loading for Data Science and Machine Learning

Performance

The performance of this repo was compared with the official Sqlite library, LMDB and RocksDB under similar conditions of CPU, RAM and NVMe disk, and the results are shown below:

[Performance comparison chart]

RocksDB performs much better than the other choices and performs consistently for over a billion entries, but it is quite slow initially.

The chart data can be found here.

Building and running tests

Clone this repo and run make to build the executable test_sqlite_blaster for testing. To run the tests, invoke it with the -t parameter from the shell console:

make
./test_sqlite_blaster -t

Getting started

Essentially, the library provides two methods, put() and get(), for inserting and retrieving records. Shown below are examples of how this library can be used to create a key-value store, a document store or a regular table.

Note: The cache size is set to 40 KB in these examples, but in real life 32 MB or 64 MB would be ideal. The higher this number, the better the performance.

Creating a Key-Value store

In this mode, a table is created with just two columns, key and value, as shown below:

#include "sqlite_index_blaster.h"
#include <string>

int main() {

    std::string col_names = "key, value";
    sqib::sqlite_index_blaster sqib(2, 1, col_names, "kv_index", 4096, 40, "kv_idx.db");
    std::string key = "hello";
    std::string val = "world";
    sqib.put_string(key, val);
    sqib.close();
    return 0;

}
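
The constructor arguments used here (2 columns, 1 key column, table name kv_index, a 4096-byte page, a 40 KB cache and the file kv_idx.db) are described under "Constructor parameters of sqlite_index_blaster class" below.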

A file kv_idx.db is created and can be verified by opening it with the official sqlite3 console program:

sqlite3 kv_idx.db ".dump"

and the output would be:

PRAGMA foreign_keys=OFF;
BEGIN TRANSACTION;
CREATE TABLE kv_index (key, value, PRIMARY KEY (key)) WITHOUT ROWID;
INSERT INTO kv_index VALUES('hello','world');
COMMIT;
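
Note that the table is created WITHOUT ROWID; this clustered-index layout is the only table format the library writes (see the Limitations section regarding regular ROWID tables).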

To retrieve the inserted values, use the get method as shown below:

#include "sqlite_index_blaster.h"
#include <string>

int main() {
    std::string col_names = "key, value";
    sqib::sqlite_index_blaster sqib(2, 1, col_names, "kv_index", 4096, 40, "kv_idx.db");
    std::string key = "hello";
    std::string val = "world";
    sqib.put_string(key, val);
    std::cout << "Value of hello is " << sqib.get_string(key, "not_found") << std::endl;
    sqib.close();
    return 0;
}
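
The second argument to get_string() is the value returned when the key is not present ("not_found" in this example).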

Creating a Document store

In this mode, a table is created with just two columns, key and doc, as shown below:

#include "sqlite_index_blaster.h"
#include <string>

std::string json1 = "{\"name\": \"Alice\", \"age\": 25, \"email\": \"[email protected]\"}";
std::string json2 = "{\"name\": \"George\", \"age\": 32, \"email\": \"[email protected]\"}";

int main() {
    std::string col_names = "key, doc";
    sqib::sqlite_index_blaster sqib(2, 1, col_names, "doc_index", 4096, 40, "doc_store.db");
    std::string pc = "primary_contact";
    sqib.put_string(pc, json1);
    std::string sc = "secondary_contact";
    sqib.put_string(sc, json2);
    sqib.close();
    return 0;
}

The index is created as doc_store.db and the JSON values can be queried using the sqlite3 console as shown below:

SELECT json_extract(doc, '$.email') AS email
FROM doc_index
WHERE key = 'primary_contact';
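
Since the document store uses the same two-column layout as the key-value store, the stored JSON can also be read back from C++ with get_string(). Below is a minimal sketch, assuming the library re-opens the existing doc_store.db when constructed with the same parameters:

#include "sqlite_index_blaster.h"
#include <iostream>
#include <string>

int main() {
    std::string col_names = "key, doc";
    // Re-open doc_store.db with the same table definition used when it was created
    sqib::sqlite_index_blaster sqib(2, 1, col_names, "doc_index", 4096, 40, "doc_store.db");
    std::string pc = "primary_contact";
    // Prints the JSON document stored under 'primary_contact'
    std::cout << sqib.get_string(pc, "not_found") << std::endl;
    sqib.close();
    return 0;
}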

Creating a regular table

This repo can be used to create regular tables with primary key(s) as shown below:

#include <cmath>
#include <string>

#include "sqlite_index_blaster.h"

// Column types: text for the name, four 8-bit integers and a real for the average
const uint8_t col_types[] = {SQLT_TYPE_TEXT, SQLT_TYPE_INT8, SQLT_TYPE_INT8, SQLT_TYPE_INT8, SQLT_TYPE_INT8, SQLT_TYPE_REAL};

int main() {

    std::string col_names = "student_name, age, maths_marks, physics_marks, chemistry_marks, average_marks";
    sqib::sqlite_index_blaster sqib(6, 2, col_names, "student_marks", 4096, 40, "student_marks.db");

    int8_t maths, physics, chemistry, age;
    double average;
    uint8_t rec_buf[500];
    int rec_len;

    // First record: compute the average (to 2 decimal places) and collect the column values
    age = 19; maths = 80; physics = 69; chemistry = 98;
    average = round((maths + physics + chemistry) * 100 / 3) / 100;
    const void *rec_values[] = {"Robert", &age, &maths, &physics, &chemistry, &average};
    // Serialize the 6 column values into a Sqlite record and insert it
    rec_len = sqib.make_new_rec(rec_buf, 6, rec_values, NULL, col_types);
    sqib.put(rec_buf, -rec_len, NULL, 0);

    age = 20; maths = 82; physics = 99; chemistry = 83;
    average = round((maths + physics + chemistry) * 100 / 3) / 100;
    rec_values[0] = "Barry";
    rec_len = sqib.make_new_rec(rec_buf, 6, rec_values, NULL, col_types);
    sqib.put(rec_buf, -rec_len, NULL, 0);

    age = 23; maths = 84; physics = 89; chemistry = 74;
    average = round((maths + physics + chemistry) * 100 / 3) / 100;
    rec_values[0] = "Elizabeth";
    rec_len = sqib.make_new_rec(rec_buf, 6, rec_values, NULL, col_types);
    sqib.put(rec_buf, -rec_len, NULL, 0);

    // Close the index so buffered pages are written out, as in the earlier examples
    sqib.close();
    return 0;
}

The index is created as student_marks.db and the data can be queried using the sqlite3 console as shown below:

sqlite3 student_marks.db "select * from student_marks"
Barry|20|82|99|83|88.0
Elizabeth|23|84|89|74|82.33
Robert|19|80|69|98|82.33
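
Note that the rows come back in primary-key order: pk_col_count is 2 here, so the first two columns (student_name and age) form the composite key.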

Constructor parameters of sqlite_index_blaster class

  1. total_col_count - Total column count in the index
  2. pk_col_count - Number of columns to use as key. These columns have to be positioned at the beginning
  3. col_names - Column names to create the table
  4. tbl_name - Table (clustered index) name
  5. block_sz - Page size (must be one of 512, 1024, 2048, 4096, 8192, 16384, 32768 or 65536)
  6. cache_sz - Size of LRU cache in kilobytes. 32 or 64 MB would be ideal. Higher values lead to better performance
  7. fname - Name of the Sqlite database file
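
For instance, a key-value index configured with the 32 MB cache recommended above might be constructed as in this sketch (the table and file names are placeholders; the argument order follows the list above):

#include "sqlite_index_blaster.h"
#include <string>

int main() {
    std::string col_names = "key, value";
    sqib::sqlite_index_blaster sqib(
        2,             // 1. total_col_count
        1,             // 2. pk_col_count: the first column is the key
        col_names,     // 3. col_names
        "kv_index",    // 4. tbl_name
        4096,          // 5. block_sz: page size in bytes
        32768,         // 6. cache_sz in KB, i.e. 32 MB
        "kv_idx.db");  // 7. fname
    sqib.close();
    return 0;
}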

Console Utility for playing around

test_sqlite_blaster also has a rudimentary ability to create, insert into and query databases, as shown below. However, this is just for demonstration.

./test_sqlite_blaster -c movie.db 4096 movie_list 3 1 Film,Genre,Studio

To insert records, use -i as shown below:

./test_sqlite_blaster -i movie.db 4096 3 1 "Valentine's Day,Comedy,Warner Bros." "Sex and the City,Comedy,Disney" "Midnight in Paris,Romance,Sony"

This inserts 3 records. To retrieve inserted records, run:

./test_sqlite_blaster -r movie.db 4096 3 1 "Valentine's Day"

and the output would be:

Valentine's Day,Comedy,Warner Bros.

Limitations

  • No crash recovery. If the insertion process is interrupted, the database would be unusable.

  • The record length cannot change on update. Updating with a smaller or larger record length is not implemented yet.

  • Deletes are not implemented yet. This library is intended primarily for fast inserts.

  • Support for concurrent inserts is not implemented yet.

  • The regular ROWID table of Sqlite is not implemented.

  • Only the equivalent of memcmp is used to order index records. The order in which keys are sorted may not match the official Sqlite library for non-ASCII character sets.

  • Key lengths are limited depending on page size as shown in the table below. This is just because the source code does not implement support for longer keys. However, this is considered sufficient for most practical purposes.

    Page Size   Max Key Length
    512         35
    1024        99
    2048        227
    4096        484
    8192        998
    16384       2026
    32768       4082
    65536       8194
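
For example, with the 4096-byte page size used in the examples above, keys can be up to 484 bytes long.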

Stability

This code has been tested with more than 200 million records, so it is expected to be quite stable, but bear in mind that this is so fast because there is no crash recovery.

So this repo is best suited for one-time inserts of large datasets. It may also be suitable for power-backed systems, such as those hosted in the cloud, and for battery-backed systems.

License

Sqlite Index Blaster and its command line tools are dual-licensed under the MIT license and the AGPL-3.0. Users may choose one of the above.

  • The MIT License
  • The GNU Affero General Public License v3 (AGPL-3.0)

License for AI bots

The licenses mentioned above apply only to humans; this work is NOT available to AI bots.

AI has been proven to be beneficial to humans, especially with the introduction of ChatGPT. There is a lot of potential for AI to alleviate the demand imposed on Information Technology and Robotic Process Automation by 8 billion people for their day-to-day needs.

However, there are a lot of ethical issues, particularly affecting those humans who have been trying to help meet this demand from 8 billion people so far. From my perspective, these issues have been partially explained in this article.

I am part of this community, which has a lot of kind-hearted people who have dedicated their work to open source without expecting much in return. I am very concerned about the way AI simply reproduces information that people have built over several years, short-circuiting their means of getting credit for the work they publish, their means of marketing their products, and any advertising revenue they might get, seemingly without regard to any licenses indicated on the website.

I think the existing licenses have not taken indexing by AI bots into account, and until the licenses are modified, this work is unavailable for AI bots.

Support

If you face any problem, create an issue in this repository, or write to the author (Arundale Ramanathan) at [email protected].
