0.10 (stable)
Importing Data

The first step to using a database system is to insert data into it. DuckDB provides several data ingestion methods that allow you to easily and efficiently load data into the database. This section provides an overview of these methods so you can choose the one that best fits your use case.

Insert Statements

Insert statements are the standard way of loading data into a database system. They are suitable for quick prototyping, but should be avoided for bulk loading because they incur significant per-row overhead.

INSERT INTO people VALUES (1, 'Mark');

For a more detailed description, see the page on the INSERT statement.
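As a sketch of how to reduce that per-statement overhead, a single INSERT can carry multiple rows. This assumes a hypothetical people table with an integer id and a name column, matching the example above:

```sql
-- hypothetical table matching the example above
CREATE TABLE people (id INTEGER, name VARCHAR);

-- a single INSERT statement can insert several rows at once
INSERT INTO people VALUES
    (1, 'Mark'),
    (2, 'Hannes');
```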

CSV Loading

Data can be efficiently loaded from CSV files using the read_csv function or the COPY statement.

SELECT * FROM read_csv('test.csv');

CSV files compressed with, e.g., gzip can also be loaded directly:

SELECT * FROM read_csv('test.csv.gz');

For more details, see the page on CSV loading.
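Since read_csv behaves like a regular table function, its result can be materialized into a table, while the COPY statement mentioned above loads into an existing table. A minimal sketch, assuming a file test.csv and a target table test whose columns match the file:

```sql
-- create a table whose schema is inferred from the CSV file
CREATE TABLE test AS SELECT * FROM read_csv('test.csv');

-- or load the file into an already existing table with COPY
COPY test FROM 'test.csv';
```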

Parquet Loading

Parquet files can be efficiently loaded and queried using the read_parquet function.

SELECT * FROM read_parquet('test.parquet');

For more details, see the page on Parquet loading.
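As with CSV files, the query result can be materialized into a table, and read_parquet also accepts a glob pattern to read several files at once. A sketch assuming hypothetical file names:

```sql
-- materialize a Parquet file into a table
CREATE TABLE test AS SELECT * FROM read_parquet('test.parquet');

-- read multiple Parquet files via a glob pattern
SELECT * FROM read_parquet('data/*.parquet');
```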

JSON Loading

JSON files can be efficiently loaded and queried using the read_json_auto function.

SELECT * FROM read_json_auto('test.json');

For more details, see the page on JSON loading.
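The same materialization pattern applies to JSON: the structure inferred by read_json_auto can be stored directly in a table. A sketch assuming a hypothetical file test.json:

```sql
-- materialize the inferred JSON structure into a table
CREATE TABLE test AS SELECT * FROM read_json_auto('test.json');
```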


Appender

In several APIs (C, C++, Go, Java, and Rust), the Appender can be used as an alternative for bulk data loading. This class efficiently adds rows to the database system without going through SQL statements.
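A minimal sketch of the Appender in the C API, assuming a people table like the one in the INSERT example above (error handling omitted for brevity):

```c
#include "duckdb.h"

int main(void) {
    duckdb_database db;
    duckdb_connection con;

    // open an in-memory database and connect to it
    duckdb_open(NULL, &db);
    duckdb_connect(db, &con);
    duckdb_query(con, "CREATE TABLE people (id INTEGER, name VARCHAR)", NULL);

    // create an appender for the people table (NULL = default schema)
    duckdb_appender appender;
    duckdb_appender_create(con, NULL, "people", &appender);

    // append one row; values must be appended in column order
    duckdb_append_int32(appender, 1);
    duckdb_append_varchar(appender, "Mark");
    duckdb_appender_end_row(appender);

    // destroying the appender flushes any pending rows
    duckdb_appender_destroy(&appender);
    duckdb_disconnect(&con);
    duckdb_close(&db);
    return 0;
}
```

The other client APIs expose the same append-per-column, end-row pattern through their own idioms.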


Last modified: 2024-05-22