Updating, Deleting, or Inserting Data Records in Batches

This topic describes how to insert (BULK INSERT), update (BULK UPDATE), and delete (BULK DELETE) data records in batches. Batch operations significantly reduce the number of round trips between your applications and your DB instance, lowering system overhead and increasing overall throughput.

Creating an Example Table Structure

CREATE TABLE product_inventory (
    item_id INT PRIMARY KEY,
    item_name VARCHAR(100),
    stock_qty INT,
    last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

Inserting Data Records in Batches

  • Method 1: Using the VALUES clause
    Insert multiple product records in a single statement.
    INSERT INTO product_inventory (item_id, item_name, stock_qty) 
    VALUES 
        (101, 'Smartwatch', 50),
        (102, 'Wireless headset', 120),
        (103, 'Bluetooth speaker', 75),
        (104, 'Tablet', 30);
  • Method 2: Using SELECT to generate sequence data
    Generate 100 test records in a single statement.
    INSERT INTO product_inventory (item_id, item_name, stock_qty)
    SELECT 
        200 + n, 
        'Test product-' || n,
        (random() * 100)::INT
    FROM generate_series(1, 100) AS n;
  • Method 3: Using multiple INSERT statements wrapped in a transaction
    BEGIN;
    INSERT INTO product_inventory VALUES (301, 'Game controller', 40);
    INSERT INTO product_inventory VALUES (302, 'Mechanical keyboard', 60);
    INSERT INTO product_inventory VALUES (303, 'Gaming mouse', 55);
    COMMIT;
  • Method 4: Using the COPY command

    PostgreSQL provides COPY, a command backed by a streamlined wire protocol for fast data import and export. Because COPY uses a simpler format and transfer mechanism, it imports and exports data much faster than individual INSERT statements, especially for large data volumes. Its advantages are as follows.

    • Reduced overhead: COPY transfers data in text, CSV, or binary format, avoiding the overhead of parsing and planning SQL statements one by one.
    • Improved performance: Inserting tens of thousands of records with COPY can be more than 10 times faster than with INSERT.
    • Broad applicability: COPY suits scenarios such as initializing, migrating, or periodically importing data in batches, and can read data directly from CSV files, binary files, or program streams.

    The following example uses COPY in psql to import CSV-formatted data from standard input (the \. line marks the end of the input). The column list is required here because the CSV rows do not include the last_updated column:

    COPY product_inventory (item_id, item_name, stock_qty) FROM STDIN WITH (FORMAT csv);
    401,"4K monitor",25
    402,"Curved monitor",18
    403,"Portable monitor",32
    \.

    The COPY APIs vary with the language driver you use; see your driver's documentation for details.
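
    As an illustration, the following is a minimal client-side sketch using the psycopg2 driver for Python; the connection parameters are placeholders, and copy_expert streams in-memory CSV data through the COPY protocol:

    import io
    import psycopg2

    # Placeholder connection parameters; replace them with your instance's values.
    conn = psycopg2.connect(host="my-rds-host", dbname="mydb",
                            user="myuser", password="mypassword")

    # In-memory CSV payload; a file object opened with open() works the same way.
    csv_data = io.StringIO(
        '401,"4K monitor",25\n'
        '402,"Curved monitor",18\n'
        '403,"Portable monitor",32\n'
    )

    with conn, conn.cursor() as cur:
        # copy_expert streams the data through the COPY protocol in one round trip.
        cur.copy_expert(
            "COPY product_inventory (item_id, item_name, stock_qty) "
            "FROM STDIN WITH (FORMAT csv)",
            csv_data,
        )
    conn.close()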

Updating Data Records in Batches

  • Using a CTE as a temporary result set

    Build the new values in a CTE and join it to the target table for the update. (A temporary table can be used the same way for larger data sets.)

    WITH update_data(item_id, new_qty) AS (
        VALUES 
            (101, 45),
            (102, 110),
            (104, 28)
    )
    --Execute the batch update.
    UPDATE product_inventory p
    SET stock_qty = u.new_qty
    FROM update_data u
    WHERE p.item_id = u.item_id;
  • Using a CASE expression

    Update different rows to different values in a single statement.

    UPDATE product_inventory
    SET stock_qty = CASE
        WHEN item_id = 201 THEN 90
        WHEN item_id = 202 THEN 65
        WHEN item_id = 203 THEN 40
        ELSE stock_qty
    END
    WHERE item_id IN (201, 202, 203);

Deleting Data Records in Batches

  • Using the IN clause

    Delete the product records with specified IDs.

    DELETE FROM product_inventory
    WHERE item_id IN (301, 302, 303, 304);
  • Using sub-queries

    Delete the products whose stock quantity is less than 20, using a subquery to select the target IDs.

    DELETE FROM product_inventory
    WHERE item_id IN (
        SELECT item_id FROM product_inventory WHERE stock_qty < 20
    );
  • Using TRUNCATE

    Remove all rows from a table. TRUNCATE is much faster than an unqualified DELETE because it does not scan rows; in PostgreSQL it is transactional and can be rolled back if run inside a transaction, but it does not fire per-row DELETE triggers.

    TRUNCATE TABLE product_inventory;

Suggestions on Performance Optimization

  • Batch processing with a transaction: Wrap related batch operations in a single transaction to avoid per-statement commit overhead.
    BEGIN;
    --Batch operation 1
    --Batch operation 2
    COMMIT;
  • Batch size control: You are advised to process 1,000 to 5,000 records in each batch, which balances throughput against lock time and memory use; see the client-side sketch below.
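
    The following is a minimal sketch of client-side batch size control using psycopg2's execute_values helper (the data and connection parameters are illustrative); page_size caps how many rows each generated INSERT statement carries:

    import psycopg2
    from psycopg2.extras import execute_values

    # Illustrative test data: 10,000 rows to be inserted in driver-managed batches.
    rows = [(1000 + n, 'Bulk product-' + str(n), n % 100) for n in range(10000)]

    # Placeholder connection parameters; replace them with your instance's values.
    conn = psycopg2.connect(host="my-rds-host", dbname="mydb",
                            user="myuser", password="mypassword")
    with conn, conn.cursor() as cur:
        # page_size=2000 keeps each generated INSERT within the suggested range.
        execute_values(
            cur,
            "INSERT INTO product_inventory (item_id, item_name, stock_qty) VALUES %s",
            rows,
            page_size=2000,
        )
    conn.close()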