Below, we collect some tweets about DuckDB.
I don't know who needs to hear this, but if you need a in-memory database to play with in #rstats, DuckDB has a more robust feature set than SQLite and is just as easy to use 🤓
— Emily Riederer (@EmilyRiederer) September 20, 2020
🦆https://t.co/i3zpPd4eUE
It is infinitely nicer to use @duckdb to quickly look at @ApacheParquet files than using any of the horrible hadoop/spark things.
— fs111 (@fs111) June 23, 2021
Not the answer you want, but the answer you need: For local DBs switch to DuckDB. https://t.co/x5FAj1ltSB
(Otherwise, I’d check your 5432 port status. On Linux I’d use netstat. My guess is they support Mac too, but don’t know offhand.)
— Grant McDermott (@grant_mcdermott) October 25, 2020
All the benefits of a database, none of the hassle: DuckDB https://t.co/KbSTCAEP52. An embeddable persistent SQL OLAP DB. Subsets do fit in Memory using filters and joins.@CWInl, @hfmuehleisen and 🦆Wilbur; #duckdb, @krlmlr; #DBI @MattDowle; #data.table. pic.twitter.com/Px8zwSBaHR
— smart-R (@smartR101) October 18, 2020
Also, I'm now recommending duckdb over SQLite -- it's embeddable like SQLite, but much more powerful and still fast
— Thomas Lumley (@tslumley) October 1, 2020
I’m a fan of this work!! Watch out for DuckDB. Before you notice it, it’s going to be running your analytics on all types of devices! https://t.co/9ZAwhyCU5l
— Juan Sequeda (@juansequeda) September 23, 2020
If you need sth fast and embedded, try duckdb. It’s columnar and amazing for single machine big data tasks. Embedded too!
— Sefa Ozalp (@SefaOzalp) August 20, 2020
Taking DuckDB for a spin, looks pretty exciting so far. "DuckDB, the SQLite for Analytics" https://t.co/rZTZXaYifM - linear insertion speed up to the 100M rows I've tested so far. Also, fits 100M integers into 400MB disk storage. /cc @fanf @hfmuehleisen @holanda_pe @mark8264 pic.twitter.com/beU3KKn5UN
— Bert Hubert 🇪🇺 (@PowerDNS_Bert) June 6, 2020
Lot of people use SQLite to locally run their analytical queries which are then quite often a bit slow. Luckily, I came around the @duckdb project recently that tries to be the SQLite-for-analytics. In a first test https://t.co/xFogQfqTha it already stands by the promise.
— Uwe L. Korn (@xhochy) October 19, 2019
Congrats to @hfmuehleisen for the first release of his new database. Think of DuckDB as #SQLite for analytics. https://t.co/sA1ae3GHxD
— Andy Pavlo (@andy_pavlo) June 30, 2019