What This Template Is For
Bulk operations let users act on many records at once: importing thousands of contacts, updating hundreds of status fields, exporting a quarter's worth of data, or deleting archived projects in batch. They are among the most technically challenging features to build well. The happy path is straightforward, but the edge cases are where teams lose weeks: partial failures, timeout errors, duplicate handling, progress visibility, and rollback.
This template provides a structured approach to specifying bulk operations. It covers the operation catalog, input validation, processing architecture, progress tracking, error handling, and rollback strategies. Use it whenever a feature involves acting on more than a few dozen records at a time.
For the data import workflow specifically, the Data Import Template provides deeper coverage of mapping, validation, and deduplication. If your bulk operations touch admin-facing features, the Admin Console Template covers the broader admin UX. The Technical PM Handbook includes patterns for specifying infrastructure-heavy features like batch processing.
How to Use This Template
- Catalog every bulk operation your product needs. Group them by type: bulk create, bulk update, bulk delete, bulk export.
- For each operation, define the maximum batch size, expected processing time, and failure tolerance. These constraints drive architecture decisions.
- Decide between synchronous and asynchronous processing. Anything over 50 records or 10 seconds should be async with progress tracking.
- Specify the error handling model: fail-fast (stop on first error), fail-safe (skip errors and continue), or transactional (all-or-nothing rollback).
- Design the progress UI. Users need to know: how many records processed, how many succeeded, how many failed, estimated time remaining, and what to do about failures.
- Document the rollback strategy for each operation. Some bulk operations are reversible (status changes). Some are not (permanent deletes). Users must understand this before confirming.
- Review with engineering, QA (they will find the edge cases), and power users who currently work around the lack of bulk operations.
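The sync/async decision in the steps above can be sketched as a simple dispatch rule. This is a minimal illustration, not a framework API: the 50-record threshold comes from this template, and `run_sync` / `enqueue_async` are placeholder callables for your inline handler and job queue.

```python
# Sketch of the sync/async dispatch rule. The threshold and the callable
# interfaces are assumptions from this template, not a specific framework.

SYNC_MAX_RECORDS = 50  # anything larger goes to the background queue

def dispatch_bulk_operation(record_ids, run_sync, enqueue_async):
    """Route a bulk operation to inline or background processing."""
    if len(record_ids) <= SYNC_MAX_RECORDS:
        # Small batch: process inline and return results immediately.
        return {"mode": "sync", "result": run_sync(record_ids)}
    # Large batch: enqueue and hand back a job ID for progress polling.
    job_id = enqueue_async(record_ids)
    return {"mode": "async", "job_id": job_id}
```

In practice the threshold should also consider estimated duration, since 40 records of a slow operation can still exceed the 10-second budget.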
The Template
Bulk Operations Overview
| Field | Details |
|---|---|
| Product Name | [Product name] |
| Author | [PM or Engineer name] |
| Reviewers | [Names and roles] |
| Date | [Date] |
| Status | Draft / In Review / Approved / In Development |
Purpose. [Which user workflows require bulk operations? What is the current workaround (manual one-by-one, CSV + support ticket, API script)?]
Operation Catalog
| Operation | Type | Max Batch Size | Async | Error Model | Rollback |
|---|---|---|---|---|---|
| [Bulk import contacts] | Create | [10,000] | [Yes] | [Fail-safe: skip invalid rows] | [Delete imported records] |
| [Bulk update status] | Update | [5,000] | [Yes] | [Fail-safe: skip locked records] | [Revert to previous status] |
| [Bulk assign owner] | Update | [1,000] | [Yes] | [Fail-safe] | [Revert to previous owner] |
| [Bulk delete] | Delete | [500] | [Yes] | [Fail-fast on permission error] | [Soft delete with 30-day recovery] |
| [Bulk export] | Read | [100,000] | [Yes] | [Fail-fast on timeout] | [N/A] |
| [Bulk tag/untag] | Update | [5,000] | [No (<100) / Yes (100+)] | [Fail-safe] | [Remove/restore tags] |
Input Specification
Selection methods.
| Method | Description | Max Selection | UX |
|---|---|---|---|
| [Manual selection] | [Checkbox per row in list view] | [100] | [Select all on page, shift-click range] |
| [Select all matching filter] | [Apply filter, then "Select all X matching"] | [Operation max] | [Banner shows count, confirms before action] |
| [CSV upload] | [Upload file with identifiers] | [Operation max] | [File picker, preview first 10 rows] |
| [API batch endpoint] | [POST array of IDs or objects] | [Operation max] | [Documented in API reference] |
Input validation.
| Validation | When | Error Handling |
|---|---|---|
| [File format check] | [On upload, before processing] | [Reject with format requirements message] |
| [Required fields present] | [On upload, before processing] | [List missing fields, reject file] |
| [Data type validation] | [Per row, during processing] | [Skip row, add to error report] |
| [Duplicate detection] | [Per row, during processing] | [Configurable: skip, update, or create duplicate] |
| [Permission check] | [Per record, during processing] | [Skip record, add to error report with reason] |
| [Rate limit check] | [Before processing starts] | [Queue operation, show estimated start time] |
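A per-row validator for the table above might look like the sketch below. Field names (`email`) and the regex are illustrative assumptions; the point is the shape: return a list of errors per row so fail-safe processing can skip the row and add it to the error report.

```python
# Hypothetical per-row validator: required-field and data-type checks that
# feed the error report. "email" and the regex are illustrative only.

import re

REQUIRED_FIELDS = ("email",)
EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_row(row: dict) -> list[str]:
    """Return a list of error messages; empty means the row is valid."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not row.get(field):
            errors.append(f"missing required field: {field}")
    email = row.get("email", "")
    if email and not EMAIL_RE.match(email):
        errors.append(f"invalid email format: {email}")
    return errors
```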
Processing Architecture
Synchronous processing (small batches).
| Parameter | Value |
|---|---|
| Threshold | [Operations affecting < 50 records] |
| Timeout | [30 seconds] |
| UX | [Inline progress bar, blocking UI until complete] |
| Error display | [Inline error list below the action] |
Asynchronous processing (large batches).
| Parameter | Value |
|---|---|
| Threshold | [Operations affecting 50+ records] |
| Queue system | [SQS / Redis / Celery / BullMQ] |
| Worker concurrency | [X workers, Y records per worker batch] |
| Processing rate | [X records/second sustained] |
| Timeout | [Per-record: X seconds. Per-operation: X minutes.] |
| Priority | [User-initiated > scheduled > system-initiated] |
Job lifecycle.
| State | Description | User Can |
|---|---|---|
| Queued | [Waiting for worker capacity] | [Cancel] |
| Processing | [Actively processing records] | [Cancel (stops at current record)] |
| Completed | [All records processed (with or without errors)] | [View results, download report] |
| Failed | [Operation-level failure (infrastructure error)] | [Retry] |
| Cancelled | [User cancelled mid-processing] | [View partial results, retry remaining] |
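The lifecycle table is effectively a state machine, and encoding the allowed transitions explicitly prevents illegal moves (e.g., cancelling a completed job). The state names mirror the table; the transition map is an assumption to illustrate the pattern.

```python
# The job lifecycle as an explicit state machine. States mirror the table
# above; the allowed-transition map is this template's assumption.

from enum import Enum

class JobState(Enum):
    QUEUED = "queued"
    PROCESSING = "processing"
    COMPLETED = "completed"
    FAILED = "failed"
    CANCELLED = "cancelled"

TRANSITIONS = {
    JobState.QUEUED: {JobState.PROCESSING, JobState.CANCELLED},
    JobState.PROCESSING: {JobState.COMPLETED, JobState.FAILED, JobState.CANCELLED},
    JobState.COMPLETED: set(),
    JobState.FAILED: {JobState.QUEUED},      # retry re-queues the job
    JobState.CANCELLED: {JobState.QUEUED},   # retry remaining records
}

def transition(current: JobState, target: JobState) -> JobState:
    """Move to target state, rejecting transitions the table forbids."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current.value} -> {target.value}")
    return target
```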
Progress Tracking
Progress UI components.
| Component | Details |
|---|---|
| Progress bar | [Percentage complete, records processed / total] |
| Status text | ["Processing 1,247 of 5,000 contacts..."] |
| Time estimate | ["Approximately 3 minutes remaining"] |
| Success count | [Green counter: "4,892 succeeded"] |
| Error count | [Red counter: "108 failed" (clickable to view errors)] |
| Cancel button | [Available during Queued and Processing states] |
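The progress-bar math behind these components is simple enough to specify directly. This is a naive sketch: the remaining-time estimate uses the average rate so far, whereas a production estimate would smooth the rate over a recent window.

```python
# Progress math for the bar, status text, and time estimate above.
# Naive average-rate ETA; a real implementation would smooth the rate.

def progress(processed: int, total: int, elapsed_seconds: float) -> dict:
    """Return percent complete and estimated seconds remaining."""
    pct = 100.0 * processed / total if total else 100.0
    rate = processed / elapsed_seconds if elapsed_seconds > 0 else 0.0
    remaining = (total - processed) / rate if rate > 0 else None
    return {"percent": round(pct, 1), "eta_seconds": remaining}
```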
Notification on completion.
| Channel | Trigger | Content |
|---|---|---|
| [In-app notification] | [Always] | ["Bulk import complete: 4,892 created, 108 failed. View results."] |
| [Email] | [When operation takes > 60 seconds] | [Summary + link to results page] |
| [Webhook] | [If configured by customer] | [JSON payload with job ID, counts, status] |
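A hypothetical shape for the webhook payload is sketched below. The field names are assumptions, not a published schema; the spec should pin these down with the customers who consume the webhook.

```python
# Hypothetical completion-webhook payload matching the channels table.
# Field names are illustrative assumptions, not a published schema.

import json

def completion_payload(job_id: str, succeeded: int, failed: int, status: str) -> str:
    """Serialize the job outcome for delivery to a customer webhook."""
    return json.dumps({
        "job_id": job_id,
        "status": status,
        "counts": {"succeeded": succeeded, "failed": failed,
                   "total": succeeded + failed},
    })
```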
Results page.
| Section | Content |
|---|---|
| Summary | [Total, succeeded, failed, skipped. Duration. Operator name.] |
| Success details | [Expandable: list of created/updated/deleted record IDs with links] |
| Error details | [Table: row number, record identifier, error message, suggested fix] |
| Error export | [Download failed rows as CSV with error column for correction and re-upload] |
| Retry options | ["Retry failed rows" button, pre-populated with the error CSV] |
Error Handling Models
Fail-safe (skip and continue).
| Behavior | Details |
|---|---|
| On invalid record | [Skip record, log error, continue to next] |
| On permission denied | [Skip record, log error, continue] |
| On duplicate detected | [Configurable: skip, update existing, or create duplicate] |
| On infrastructure error | [Retry record 3 times with backoff, then skip] |
| Result | [Partial success: N succeeded, M failed. Error report available.] |
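The fail-safe loop above can be sketched as follows. `TransientError` is a stand-in for whatever your stack raises on infrastructure failures, and the injectable `sleep` exists only to keep the sketch testable; the retry count and backoff match the table.

```python
# Fail-safe processing: skip invalid records, retry infrastructure errors
# up to 3 times with exponential backoff, accumulate an error report.
# TransientError and the injectable sleep are stand-ins for this sketch.

import time

class TransientError(Exception):
    """Placeholder for an infrastructure error worth retrying."""

def process_fail_safe(records, handle, sleep=time.sleep, max_retries=3):
    succeeded, errors = [], []
    for record in records:
        for attempt in range(max_retries):
            try:
                handle(record)
                succeeded.append(record)
                break
            except TransientError:
                if attempt == max_retries - 1:
                    errors.append((record, "retries exhausted"))
                else:
                    sleep(2 ** attempt)  # exponential backoff between attempts
            except Exception as exc:  # invalid record: skip, don't retry
                errors.append((record, str(exc)))
                break
    return succeeded, errors
```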
Fail-fast (stop on error).
| Behavior | Details |
|---|---|
| On first error | [Stop processing, report error, preserve remaining queue] |
| Already processed records | [Remain in new state (no auto-rollback)] |
| User options | [Fix error and retry remaining, or rollback completed records] |
| Use when | [Destructive operations, financial transactions, compliance-sensitive data] |
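By contrast, the fail-fast loop stops at the first error and preserves the unprocessed remainder, matching the table above. A minimal sketch:

```python
# Fail-fast processing: stop on the first error, report which record
# failed, and preserve the remaining queue for a fix-and-retry flow.

def process_fail_fast(records, handle):
    done = []
    for i, record in enumerate(records):
        try:
            handle(record)
            done.append(record)
        except Exception as exc:
            # Already-processed records stay in their new state (no auto-rollback).
            return {"done": done, "error": (record, str(exc)),
                    "remaining": records[i + 1:]}
    return {"done": done, "error": None, "remaining": []}
```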
Transactional (all-or-nothing).
| Behavior | Details |
|---|---|
| Validation phase | [Validate all records before processing any] |
| Processing phase | [Process all records within a transaction] |
| On any error | [Rollback entire batch. No partial changes.] |
| Use when | [Small batches (<100), writes that fit in a single database transaction, financial calculations] |
| Limitation | [Not viable for large batches due to transaction size limits] |
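For small batches, the transactional model maps directly onto a single database transaction. The sketch below uses SQLite only because it is self-contained; the table and column are illustrative, and the relevant behavior is that any row error rolls back the entire batch.

```python
# All-or-nothing sketch: one SQLite transaction for the whole batch.
# Table and columns are illustrative; any row error rolls back everything.

import sqlite3

def bulk_insert_transactional(conn, rows):
    """Insert all rows in one transaction; roll back on any failure."""
    with conn:  # sqlite3: commits on clean exit, rolls back on exception
        conn.executemany(
            "INSERT INTO contacts (email) VALUES (?)",
            [(row["email"],) for row in rows],
        )
```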
Rate Limiting and Fairness
| Parameter | Value |
|---|---|
| Per-user concurrency | [Max X concurrent bulk operations per user] |
| Per-org concurrency | [Max X concurrent bulk operations per org] |
| Global throughput cap | [X records/second across all tenants] |
| Queuing | [FIFO within priority tier. User-initiated > scheduled.] |
| Backpressure | [When queue depth > X, new operations show estimated wait time] |
| Abuse prevention | [Max X bulk operations per user per hour. Alert on anomalous patterns.] |
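The per-user concurrency cap can be specified as a small gate, sketched below. The limit value is a placeholder for the table's per-user maximum; a real implementation would back this with shared state (e.g., Redis) rather than in-process memory, and org-level and global caps would layer the same check.

```python
# Sketch of the per-user concurrency cap. The limit is a placeholder;
# production would use shared state (e.g., Redis), not in-process memory.

from collections import Counter

class ConcurrencyGate:
    def __init__(self, limit=2):  # placeholder per-user limit
        self.limit = limit
        self.active = Counter()

    def try_start(self, user_id) -> bool:
        """Admit the operation, or refuse so the caller can queue it."""
        if self.active[user_id] >= self.limit:
            return False  # caller queues the job and shows estimated wait
        self.active[user_id] += 1
        return True

    def finish(self, user_id):
        """Release a slot when the operation completes or fails."""
        self.active[user_id] = max(0, self.active[user_id] - 1)
```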
Rollback Specification
| Operation | Rollback Method | Rollback Window | Automation |
|---|---|---|---|
| [Bulk create] | [Delete created records] | [24 hours] | [One-click from results page] |
| [Bulk update] | [Restore previous values from change log] | [72 hours] | [One-click from results page] |
| [Bulk delete] | [Soft delete: restore from trash] | [30 days] | [One-click from trash view] |
| [Bulk export] | [N/A: read-only operation] | [N/A] | [N/A] |
| [Bulk import] | [Delete all records from this import batch] | [7 days] | [One-click from import history] |
Rollback requirements.
- ☐ Every mutation operation stores before-state for all modified fields
- ☐ Rollback executes as its own bulk operation with progress tracking
- ☐ Rollback generates its own audit log entries
- ☐ Partial rollback is supported (select specific records to revert)
- ☐ Rollback is idempotent (safe to trigger multiple times)
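The before-state and idempotency requirements above can be sketched together. The in-memory store shape is an assumption; the key ideas are that every mutation returns a change log, and rollback only reverts records still holding the bulk-applied value, which makes it safe to trigger twice and leaves intervening manual edits alone.

```python
# Sketch of before-state capture and idempotent rollback. The dict-based
# store is an assumption standing in for the real datastore.

def apply_bulk_update(store: dict, ids, field, new_value):
    """Apply an update, returning a change log usable for rollback."""
    change_log = []
    for record_id in ids:
        change_log.append((record_id, field, store[record_id][field]))
        store[record_id][field] = new_value
    return change_log

def rollback(store: dict, change_log, applied_value):
    """Revert only records still holding the bulk value (idempotent)."""
    for record_id, field, before in change_log:
        if store[record_id][field] == applied_value:
            store[record_id][field] = before
```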
Filled Example: CRM Platform Bulk Operations
Operation Catalog
| Operation | Type | Max Batch | Async | Error Model | Est. Throughput |
|---|---|---|---|---|---|
| Import contacts (CSV) | Create | 50,000 | Yes | Fail-safe | 500 records/sec |
| Update contact status | Update | 10,000 | Yes (100+) | Fail-safe | 1,000 records/sec |
| Assign contact owner | Update | 5,000 | Yes (100+) | Fail-safe | 800 records/sec |
| Bulk tag contacts | Update | 10,000 | No (<100) / Yes (100+) | Fail-safe | 1,200 records/sec |
| Delete contacts | Delete | 1,000 | Yes | Fail-fast | 200 records/sec |
| Export contacts (CSV) | Read | 500,000 | Yes | Fail-fast | 2,000 records/sec |
| Merge duplicates | Update+Delete | 500 pairs | Yes | Transactional per pair | 50 pairs/sec |
Import Contacts Flow
Step 1: Upload. User uploads CSV file (max 50MB). System validates file format, encoding (UTF-8 required), and column headers against expected schema.
Step 2: Mapping. UI shows detected columns mapped to contact fields. User confirms or adjusts mapping. Required fields (email) highlighted. Unmapped columns shown as "will be ignored."
Step 3: Preview. Show first 10 rows with mapped data. Highlight validation warnings (e.g., invalid email format, missing required field). Show total row count and estimated processing time.
Step 4: Configuration.
- Duplicate handling: Skip / Update existing / Create duplicate (default: Skip)
- Owner assignment: Specific user / Round-robin / Unassigned
- Tags: Apply tags to all imported contacts (optional)
- List assignment: Add to specific list (optional)
Step 5: Processing. Async job with progress bar at 500 records/second; a full 50,000-contact import completes in approximately 100 seconds.
Step 6: Results.
- Summary: "47,823 contacts imported. 1,892 skipped (duplicates). 285 failed (invalid data)."
- Error CSV download: Original rows + error column explaining why each failed
- "Retry failed rows" button: Opens import flow pre-populated with error CSV
Delete Contacts Flow
Confirmation screen.
- Shows count: "You are about to delete 847 contacts."
- Impact summary: "This will also remove 2,341 associated activities and 156 deals."
- Warning: "Deleted contacts can be recovered from Trash for 30 days."
- Type-to-confirm: User types "DELETE 847" to proceed.
Processing. Fail-fast model. If any contact cannot be deleted (e.g., linked to an active deal in a locked pipeline), processing stops. User sees which contact caused the failure and can exclude it before retrying.
Rollback. All deleted contacts move to Trash with a batch ID. Admin can restore the entire batch or individual contacts from Trash within 30 days.
Common Mistakes to Avoid
- Making all bulk operations synchronous. Anything over 50 records or 10 seconds must be async. Users will refresh the page, close the tab, or lose patience. Async with progress tracking and email notification is the correct pattern.
- No progress visibility during long operations. "Please wait..." with a spinner for 3 minutes will generate support tickets. Show records processed, percentage complete, and estimated time remaining.
- Ignoring partial failure. "Something went wrong" when 4,800 of 5,000 records succeeded is unacceptable. Report successes and failures separately. Provide a downloadable error report. Offer retry for failed records only.
- No rollback capability. If an admin accidentally bulk-updates 10,000 records with the wrong status, they need a one-click undo. Store before-state for every bulk mutation.
- Skipping rate limiting. One customer running a 100,000-record import can starve the system for everyone else. Implement per-tenant concurrency limits and global throughput caps.
Key Takeaways
- Catalog every bulk operation with its max batch size, error model, and rollback strategy before building
- Use asynchronous processing with progress tracking for anything over 50 records
- Provide downloadable error reports with specific failure reasons per record
- Store before-state for every bulk mutation so rollback is always possible
- Implement per-tenant rate limiting to prevent one customer from starving the system
About This Template
Created by: Tim Adair
Last Updated: 3/5/2026
Version: 1.0.0
License: Free for personal and commercial use
