# @ducklings/browser

Minimal DuckDB WASM for browsers. Async worker-based API with full TypeScript support.
## Installation

```sh
npm install @ducklings/browser
```
## Quick Start

```ts
import { init, DuckDB } from '@ducklings/browser';

// Initialize the WASM module (runs in a Web Worker)
await init();

// Create a database and connection
const db = new DuckDB();
const conn = await db.connect();

// Query returns an array of JS objects
const rows = await conn.query<{ answer: number }>('SELECT 42 as answer');
console.log(rows); // [{ answer: 42 }]

// Clean up
await conn.close();
await db.close();
```
## Initialization

```ts
import { init, DuckDB, version } from '@ducklings/browser';

// Auto-locate WASM and worker files
await init();

// Custom URLs (optional)
await init({
  wasmUrl: '/path/to/duckdb.wasm',
  workerUrl: '/path/to/worker.js'
});

// Create database
const db = new DuckDB();

// Get DuckDB version
const v = await version(); // "v1.4.3"
```
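When several modules may race to call `init()`, it can help to funnel them through a single memoized promise. Below is a hypothetical helper (not part of the package) demonstrated with a stand-in async initializer; in an app you would wrap the package's `init` as `const ensureInit = once(() => init());`.

```typescript
// Hypothetical helper (not in @ducklings/browser): memoize an async
// initializer so that concurrent or repeated callers share a single
// in-flight promise.
export function once<T>(fn: () => Promise<T>): () => Promise<T> {
  let pending: Promise<T> | undefined;
  return () => (pending ??= fn());
}

// Stand-in initializer that counts invocations, to show the memoization.
let calls = 0;
export const ensureInit = once(async () => {
  calls += 1;
  return calls;
});
```

Every caller awaits the same promise, so the underlying initializer runs exactly once.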
## CDN Usage

Load directly from jsDelivr or unpkg; cross-origin workers are handled automatically:

```html
<script type="module">
  import { init, DuckDB } from 'https://cdn.jsdelivr.net/npm/@ducklings/browser/+esm';

  await init();
  const db = new DuckDB();
  const conn = await db.connect();

  const result = await conn.query('SELECT 42 as answer');
  console.log(result); // [{ answer: 42 }]

  await conn.close();
  await db.close();
</script>
```
## Bundler Integration

When using a bundler, import asset URLs explicitly instead of relying on auto-resolution.

**Vite:**

```ts
import { init, DuckDB } from '@ducklings/browser';
import wasmUrl from '@ducklings/browser/wasm/duckdb.wasm?url';
import wasmJsUrl from '@ducklings/browser/wasm/duckdb.js?url';
import workerUrl from '@ducklings/browser/worker?url';

await init({ wasmUrl, wasmJsUrl, workerUrl });
```

**Webpack:**

```ts
import { init, DuckDB } from '@ducklings/browser';

const wasmUrl = new URL('@ducklings/browser/wasm/duckdb.wasm', import.meta.url).href;
const wasmJsUrl = new URL('@ducklings/browser/wasm/duckdb.js', import.meta.url).href;
const workerUrl = new URL('@ducklings/browser/worker', import.meta.url).href;

await init({ wasmUrl, wasmJsUrl, workerUrl });
```
## Queries

```ts
const conn = await db.connect();

// Returns an array of objects
const rows = await conn.query<{ id: number; name: string }>('SELECT * FROM users');

// Returns an Arrow Table (Flechette)
const table = await conn.queryArrow('SELECT * FROM users');

// Execute without returning results (SQL string literals use single quotes)
await conn.execute("INSERT INTO users VALUES (1, 'Alice')");

// Insert Arrow IPC data into a table
const ipc = tableToIPC(arrowTable, { format: 'stream' });
await conn.insertArrowFromIPCStream('my_table', ipc);
```
## Prepared Statements

```ts
const stmt = await conn.prepare('SELECT * FROM users WHERE id = ? AND active = ?');

stmt.bindInt32(1, 42);     // bind methods are sync
stmt.bindBoolean(2, true);

const results = await stmt.run(); // execution is async
await stmt.close();
```
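For many positional parameters, the per-type bind calls can be wrapped in a small dispatcher. The sketch below is a hypothetical helper (not part of the package) that uses only the two bind methods documented above, rejecting other value types rather than guessing at undocumented `bind*` names; the `Bindable` interface is defined here purely so the helper can be tested against a mock.

```typescript
// Structural interface matching the two documented sync bind methods.
interface Bindable {
  bindInt32(index: number, value: number): void;
  bindBoolean(index: number, value: boolean): void;
}

// Hypothetical helper: bind positional parameters by runtime type.
export function bindParams(stmt: Bindable, values: unknown[]): void {
  values.forEach((v, i) => {
    const index = i + 1; // parameter indices are 1-based, as in the example above
    if (typeof v === 'number' && Number.isInteger(v)) {
      stmt.bindInt32(index, v);
    } else if (typeof v === 'boolean') {
      stmt.bindBoolean(index, v);
    } else {
      throw new TypeError(`unsupported parameter type at index ${index}`);
    }
  });
}
```

Usage would then be `bindParams(stmt, [42, true]); const results = await stmt.run();`.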
## Streaming

```ts
const stream = await conn.queryStreaming('SELECT * FROM large_table');

for await (const chunk of stream) {
  console.log(`Processing ${chunk.rowCount} rows`);
  for (const row of chunk.toArray()) {
    processRow(row);
  }
}
// Stream auto-closes after iteration
```
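A common pattern on top of the streaming loop above is draining a result into an array with a row budget, so a runaway query cannot exhaust memory. This is a hypothetical helper (not part of the package); it only assumes the chunk shape shown above, a `rowCount` plus a `toArray()` method.

```typescript
// Minimal structural type for the chunks yielded by the streaming API.
interface Chunk<T> {
  rowCount: number;
  toArray(): T[];
}

// Hypothetical helper: collect up to maxRows rows from a streaming result.
export async function collect<T>(
  stream: AsyncIterable<Chunk<T>>,
  maxRows = 100_000,
): Promise<T[]> {
  const rows: T[] = [];
  for await (const chunk of stream) {
    rows.push(...chunk.toArray());
    // Stop early once the budget is reached; breaking out of for-await
    // invokes the iterator's return(), letting the stream clean up.
    if (rows.length >= maxRows) break;
  }
  return rows.slice(0, maxRows);
}
```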
## File Registration

```ts
// Register a remote file
await db.registerFileURL('data.parquet', 'https://example.com/data.parquet');

// Register in-memory data
const csvData = new TextEncoder().encode('id,name\n1,Alice');
await db.registerFileBuffer('data.csv', csvData);

// Query registered files
const rows = await conn.query("SELECT * FROM 'data.parquet'");

// Query a remote Parquet file directly
const remote = await conn.query(`
  SELECT * FROM 'https://example.com/data.parquet'
  LIMIT 10
`);
```
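For feeding in-memory rows to `registerFileBuffer()`, a small CSV encoder is often enough. The helper below is a hypothetical sketch (not part of the package) that quotes fields per RFC 4180; it assumes flat objects sharing the keys of the first row.

```typescript
// Hypothetical helper: encode an array of flat objects as CSV bytes
// suitable for db.registerFileBuffer('data.csv', ...).
export function toCSVBytes(rows: Record<string, unknown>[]): Uint8Array {
  if (rows.length === 0) return new Uint8Array();
  const cols = Object.keys(rows[0]);

  // Quote any field containing commas, quotes, or newlines (RFC 4180).
  const esc = (v: unknown): string => {
    const s = String(v ?? '');
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };

  const lines = [
    cols.join(','),
    ...rows.map((r) => cols.map((c) => esc(r[c])).join(',')),
  ];
  return new TextEncoder().encode(lines.join('\n'));
}
```

Usage would mirror the snippet above: `await db.registerFileBuffer('data.csv', toCSVBytes(rows));`.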
## Arrow Interop

```ts
import { tableFromArrays, tableFromIPC, tableToIPC, utf8 } from '@ducklings/browser';

// Query as an Arrow Table
const table = await conn.queryArrow('SELECT * FROM users');

// Build Arrow tables
const custom = tableFromArrays({
  id: [1, 2, 3],
  name: ['Alice', 'Bob', 'Charlie']
});

// Serialize to/from Arrow IPC
const bytes = tableToIPC(table, { format: 'stream' });
const restored = tableFromIPC(bytes);

// Insert Arrow IPC data directly into a table
const data = tableFromArrays(
  { id: [1, 2], label: ['x', 'y'] },
  { types: { label: utf8() } } // use plain utf8 (see note below)
);
const ipc = tableToIPC(data, { format: 'stream' });
await conn.insertArrowFromIPCStream('my_table', ipc);
```
> **Dictionary encoding:** Flechette's `tableFromArrays()` defaults to `dictionary(utf8())` for string columns. The Arrow IPC decoder used internally does not support dictionary-encoded streams. When building tables for `insertArrowFromIPCStream()`, explicitly set string columns to `utf8()`:
>
> ```ts
> import { utf8 } from '@ducklings/browser';
>
> tableFromArrays({ col: ['a', 'b'] }, { types: { col: utf8() } });
> ```
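When column data is assembled dynamically, it can be tedious to hand-write the `types` override for every string column. The sketch below is a hypothetical helper (not part of the package) that lists the string-valued columns in a `tableFromArrays()`-style input; you could then build the override as, e.g., `Object.fromEntries(stringColumns(data).map((c) => [c, utf8()]))`.

```typescript
// Hypothetical helper: given column arrays destined for tableFromArrays(),
// return the names of columns holding strings, so they can be given
// explicit utf8() types instead of the dictionary-encoding default.
export function stringColumns(data: Record<string, unknown[]>): string[] {
  return Object.keys(data).filter((col) =>
    data[col].some((v) => typeof v === 'string'),
  );
}
```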
## Cloudflare Workers

For Cloudflare Workers, use the `@ducklings/workers` package instead, which uses Asyncify for proper httpfs support in the Workers runtime.

## Limitations

`INSTALL`/`LOAD` commands for other extensions will not work.

## License

MIT