Creating JSON columns in PostgreSQL - Performance & Efficiency
When we create JSON columns in a database, we want to know how the time to insert or access data changes as the data grows.
We ask: how does the work grow as the JSON documents we store in a column get larger?
Analyze the time complexity of creating a table with a JSON column and inserting data.
```sql
-- Create a table with a binary JSON (JSONB) column
CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    info JSONB
);

-- Each insert parses the JSON text and stores it in JSONB's binary form
INSERT INTO users (info) VALUES ('{"name": "Alice", "age": 30}');
INSERT INTO users (info) VALUES ('{"name": "Bob", "age": 25}');
```
This code creates a table with a JSONB column and inserts two JSON documents. On each insert, PostgreSQL parses the JSON text and converts it to JSONB's binary representation.
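A running PostgreSQL server is not always at hand for a quick experiment. As an illustrative stand-in (an assumption, not the original setup), the same pattern can be sketched with Python's built-in `sqlite3` module, which stores JSON as plain text rather than a binary JSONB format:

```python
import json
import sqlite3

# In-memory database as a stand-in for PostgreSQL (illustrative only;
# SQLite stores JSON as text, not a binary JSONB representation).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, info TEXT)")

# Insert two JSON documents, mirroring the SQL above
for doc in ({"name": "Alice", "age": 30}, {"name": "Bob", "age": 25}):
    conn.execute("INSERT INTO users (info) VALUES (?)", (json.dumps(doc),))

# Read one document back and parse it
row = conn.execute("SELECT info FROM users WHERE id = 1").fetchone()
print(json.loads(row[0])["name"])  # Alice
```

The key point carries over: whatever the engine, each insert has to serialize (or parse) the whole document, so the per-row work depends on the document's size.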
Look for repeated actions that affect time.
- Primary operation: Inserting JSON data into the JSONB column.
- How many times: Once per row inserted; each insert must parse and store the entire JSON payload.

As the JSON document grows, the time to process each insert grows roughly in proportion to its size, because the database has to read every byte to validate the JSON and convert it to JSONB.
| Input Size (JSON size in KB) | Approx. Operations |
|---|---|
| 1 | Small, quick insert |
| 10 | About 10 times more work than 1 KB |
| 100 | About 100 times more work than 1 KB |
Pattern observation: The time grows roughly linearly with the size of the JSON data being inserted.
Time Complexity: O(n), where n is the size of the JSON document being inserted.
This means insert time grows linearly: a document twice as large takes roughly twice as long to parse and store.
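A quick way to see why the growth is linear is to check that the number of bytes an insert must parse grows in step with the document. This hypothetical sketch (the `payload_bytes` helper is not from the original) just measures serialized size, leaving actual timing to a real benchmark:

```python
import json

def payload_bytes(num_fields: int) -> int:
    """Bytes of JSON text an insert would have to parse and store."""
    doc = {f"field_{i}": "x" * 10 for i in range(num_fields)}
    return len(json.dumps(doc).encode("utf-8"))

# Doubling the number of fields roughly doubles the bytes to process: O(n)
small = payload_bytes(100)
large = payload_bytes(200)
print(large / small)  # close to 2
```

Because the bytes to process scale with document size, and each byte costs a bounded amount of work to parse, the overall insert cost is linear in the document's size.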
[X] Wrong: "Inserting JSON data always takes the same time no matter how big it is."
[OK] Correct: Larger JSON data means more characters to process and store, so it takes more time.
Understanding how JSON data size affects insert time helps you explain database performance clearly and confidently.
"What if we changed the JSONB column to a plain text column? How would the time complexity of inserts change?"