
How to Convert a CSV File to SQL INSERT Statements

3 methods — browser tool, Python, and SQLite CLI

When you have data in a CSV and need to get it into a SQL database, the cleanest approach is to generate INSERT statements from the CSV rows. Here's how to do it three different ways.

Method 1: Using Tabular (no code required)

Generate a ready-to-run .sql file from your CSV in seconds.

  1. Go to the CSV to SQL tool on Tabular.
  2. Upload your CSV file.
  3. Enter your table name (e.g. 'users', 'products').
  4. Click Run — Tabular generates one INSERT statement per row.
  5. Download the .sql file and run it against your database.

The output uses standard SQL syntax compatible with MySQL, PostgreSQL, and SQLite. Column names are taken directly from your CSV headers.

Method 2: Using Python

More flexible — lets you handle data types, escape values, and target specific databases.

  1. Install pandas: pip install pandas
  2. Run the script below, adjusting table_name and column types as needed.

```python
import pandas as pd

df = pd.read_csv("input.csv")
table_name = "my_table"
cols = ", ".join(df.columns)  # the column list is the same for every row

def sql_literal(v):
    # Render a Python value as a SQL literal: NULL for missing values,
    # quoted with doubled apostrophes for strings, bare for numbers
    if pd.isna(v):
        return "NULL"
    if isinstance(v, str):
        return "'" + v.replace("'", "''") + "'"
    return str(v)

with open("output.sql", "w") as f:
    for _, row in df.iterrows():
        vals = ", ".join(sql_literal(v) for v in row)
        f.write(f"INSERT INTO {table_name} ({cols}) VALUES ({vals});\n")

print(f"Generated {len(df)} INSERT statements")
```

For production use, consider using parameterized queries or an ORM instead of raw INSERT strings to prevent SQL injection.
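When the target is a database Python can connect to directly, you can skip the intermediate .sql file entirely and let the driver bind values for you. A minimal sketch using the standard-library sqlite3 module — the csv_text contents, users table, and column names here are hypothetical stand-ins for your own CSV:

```python
import csv
import io
import sqlite3

# Sample CSV data standing in for input.csv (hypothetical contents)
csv_text = "name,email\nAlice,alice@example.com\nO'Brien,ob@example.com\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

reader = csv.reader(io.StringIO(csv_text))
header = next(reader)  # first row holds the column names
placeholders = ", ".join("?" for _ in header)
# The driver binds each value safely -- no manual quote escaping needed.
# Note that only values can be parameterized; table and column names
# still have to be interpolated into the SQL string.
conn.executemany(
    f"INSERT INTO users ({', '.join(header)}) VALUES ({placeholders})",
    reader,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

executemany accepts any iterable of rows, so the csv.reader can be passed to it directly without loading the whole file into memory.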

Method 3: Using SQLite CLI (import directly)

If you're using SQLite, you can import the CSV directly without generating SQL files.

  1. Open your terminal and launch the SQLite CLI: sqlite3 mydb.db
  2. Run the commands below to create the table and import the CSV.

```sql
-- Set CSV mode and import
.mode csv
.import input.csv my_table

-- Verify the import
SELECT COUNT(*) FROM my_table;
.quit
```

SQLite's .import command behaves differently depending on whether the table exists. If it doesn't exist yet, SQLite creates it automatically, using the first CSV row as column names. If it already exists, every row — including the header — is imported as data; in SQLite 3.32 and later you can pass .import --skip 1 to skip the header row.

Frequently asked questions

What does the SQL output look like?

Each CSV row becomes one INSERT statement, for example: INSERT INTO users (name, email, age) VALUES ('Alice', 'alice@example.com', '30'); — all values are quoted as strings, so if you need specific column types, cast them in your SQL or use the Python method.

Which SQL databases does the output work with?

The INSERT syntax generated by Tabular is standard SQL and works with MySQL, PostgreSQL, SQLite, and SQL Server. For PostgreSQL-specific features like RETURNING or ON CONFLICT, you'll need to modify the generated SQL manually.

How do I handle special characters or apostrophes in the data?

Apostrophes in SQL string values need to be escaped as '' (two single quotes). Tabular handles this automatically. If you're generating SQL manually, make sure to escape single quotes in your values before inserting.
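The escaping rule is small enough to capture in a helper — a sketch, where escape_sql_string is a hypothetical name, not part of any library:

```python
def escape_sql_string(value: str) -> str:
    # Double each single quote, then wrap the whole value in quotes
    return "'" + value.replace("'", "''") + "'"

literal = escape_sql_string("O'Brien")
```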

My CSV has 100,000 rows. Is one INSERT per row efficient?

For large datasets, batched INSERTs (multiple rows per statement) are faster. In MySQL/PostgreSQL you can combine rows: INSERT INTO table (col1, col2) VALUES (v1, v2), (v3, v4), ... For very large imports, consider using LOAD DATA INFILE (MySQL) or COPY (PostgreSQL) which are optimized for bulk imports.
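Batching can be sketched in Python along these lines — batched_inserts is a hypothetical helper, and batch_size would be tuned to your database's statement-size limits:

```python
import itertools

def batched_inserts(table, cols, rows, batch_size=1000):
    # Yield one multi-row INSERT statement per batch of rows
    col_list = ", ".join(cols)
    it = iter(rows)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            break
        values = ", ".join(
            "(" + ", ".join("'" + str(v).replace("'", "''") + "'" for v in row) + ")"
            for row in batch
        )
        yield f"INSERT INTO {table} ({col_list}) VALUES {values};"

rows = [
    ("Alice", "alice@example.com"),
    ("Bob", "bob@example.com"),
    ("Cara", "cara@example.com"),
]
# batch_size=2 just to show the batching; 3 rows become 2 statements
stmts = list(batched_inserts("users", ["name", "email"], rows, batch_size=2))
```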

Ready to try the fastest method?

Upload a CSV and generate SQL INSERT statements ready to paste into your database. Free, no signup required.

CSV → SQL — free