Task Guides
Operational walkthroughs
Follow this sequence to load and validate property account information for each assessment cycle.
Loading Property Account Information
Required prerequisite
Before using this guide, complete the Parcel Schema Profiles setup. This guide depends on an active profile (for example, `esri-default`) with correct field mappings.
Overview
Property accounts in Pie are represented by the Parcel model — one record per (tax_year, account) pair. Parcel records are the source of truth for property data and serve as the foundation for creating Appeals. All parcel data is loaded via CSV import using a configurable schema mapping system.
Prerequisites
1. Schema Profile
Before importing any CSV, a ParcelSchemaProfile must exist and be active. The profile defines how your vendor's CSV column names map to Pie's canonical fields.
Where: Django admin → Parcel Column Import Manager
Steps:
- Click Add Parcel Schema Profile.
- Set a name (e.g., `esri-default`), vendor (`esri`), and mark Active.
- Under Column Mappings, add a `ParcelFieldMap` row for each CSV column:
  - Source Field — exact CSV column header as it appears in the file
  - Canonical Field — the Pie field to map to (see canonical fields below)
  - Transform — how the raw value should be parsed (see transforms below)
  - Required — check at minimum for `account`
- Save.
Canonical Fields:
| Canonical Field | Description |
|---|---|
| `account` | Account number — required |
| `parcel_id` | Parcel ID |
| `owner_name` | Property owner name |
| `situs_address` | Property situs address |
| `mailing_address` | Full mailing address |
| `municipality` | Municipality name |
| `situs_zip` | Situs ZIP code |
| `assessor_total_market_value` | Total assessor market value |
| `assessor_total_assessed_value` | Total assessor assessed value |
| `res_assessed_value` | Residential assessed value |
| `res_improvement_value` | Residential improvement value |
| `res_land_value` | Residential land value |
| `com_assessed_value` | Commercial assessed value |
| `com_improvement_value` | Commercial improvement value |
| `com_land_value` | Commercial land value |
| `ag_assessed_value` | Agricultural assessed value |
| `ag_improvement_value` | Agricultural improvement value |
| `ag_land_value` | Agricultural land value |
| `prev_owner_1` | Previous owner name |
| `sale_price_1` | Most recent sale price |
| `sale_date_1` | Most recent sale date |
| `parcel_report_url` | URL to external parcel report |
Transforms:
| Transform | Behavior |
|---|---|
| `identity` | Raw value, no change |
| `strip` | Strip leading/trailing whitespace |
| `upper` / `lower` | Convert case |
| `decimal` | Parse as decimal number (handles commas) |
| `int` | Parse as integer |
| `date` | Parse multiple formats: YYYY-MM-DD, MM/DD/YYYY, MM/DD/YY, ESRI timezone suffix |
| `date_epoch_ms` | Convert epoch milliseconds to date |
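The transforms above can be sketched in plain Python. This is a minimal illustration, not the pipeline's actual code: the function names are invented here, and the ESRI-suffix handling assumes the suffix is separated from the date by a space.

```python
from datetime import date, datetime, timezone
from decimal import Decimal


def t_decimal(raw: str) -> Decimal:
    """Parse a decimal, tolerating thousands separators like '1,234.56'."""
    return Decimal(raw.replace(",", "").strip())


def t_date(raw: str) -> date:
    """Try each supported date layout in turn."""
    value = raw.strip().split(" ")[0]  # drop any trailing time/timezone portion
    for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%m/%d/%y"):
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unparseable date: {raw!r}")


def t_date_epoch_ms(raw: str) -> date:
    """Convert epoch milliseconds to a UTC calendar date."""
    return datetime.fromtimestamp(int(raw) / 1000, tz=timezone.utc).date()
```

Note how `t_date` raises only after every format fails; in the real pipeline such errors end up in the row's errors list rather than aborting the import.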
CSV Preparation
- Encode as UTF-8.
- Include a header row; column names must exactly match the `source_field` values in your schema profile.
- Every row must have a non-blank `account` value — rows with blank accounts are silently skipped.
- Each import is scoped to a single tax year; the `(tax_year, account)` pair is unique.
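Because a header mismatch is the most common cause of invalid rows, it can be worth checking the header before uploading. A small sketch, using only the standard library — the `expected_fields` set stands in for the active profile's `source_field` values:

```python
import csv
import io


def check_header(csv_text: str, expected_fields: set[str]) -> list[str]:
    """Return the expected source_field names missing from the CSV header."""
    reader = csv.reader(io.StringIO(csv_text))
    header = {col.strip() for col in next(reader)}
    return sorted(expected_fields - header)


sample = "ACCOUNT,OWNER,SITUS\n12345,Doe,1 Main St\n"
print(check_header(sample, {"ACCOUNT", "OWNER", "TOTAL_VALUE"}))  # ['TOTAL_VALUE']
```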
Import Methods
Method A — Azure Blob URL (Recommended for Large Files)
This method queues a background job that downloads, validates, and applies the CSV without risk of HTTP timeout.
- Upload the CSV to Azure Blob Storage (container `doc`).
- Copy the blob URL (e.g., `https://caliperdocuments.blob.core.windows.net/doc/imports/Parcels_2026.csv`).
- In the admin, go to Data Tools.
- Under Parcels.csv Import (Azure Blob URL), paste the URL, enter the tax year, and select the schema profile.
- Click Import from Blob.
- A `ParcelImportJob` with action `BLOB_IMPORT` is queued. The background worker picks it up and runs the full pipeline (ingest → validate → apply).
Method B — Admin Upload (Small to Medium Files)
This method uploads the file directly through the admin interface and queues background jobs for each pipeline phase.
- Go to admin → Validate Importations → Add.
- Enter tax year, select schema profile, optionally enter a source name, and upload the CSV file.
- Save. This creates a `ParcelImportBatch` in `UPLOADED` status with all rows stored as `ParcelRawRow` records.
- On the batch detail page, click Queue Validation to normalize and validate all rows (transitions to `VALIDATED`).
- After validation, click Queue Apply to upsert `Parcel` records (transitions to `APPLIED`).
Method C — Management Commands (CLI / Automated)
Use for scripted or automated imports (e.g., CI pipelines, scheduled tasks).
Full workflow in one command:
```shell
python manage.py import_parcels \
  --file /path/to/Parcels_2026.csv \
  --tax-year 2026 \
  --profile esri-default \
  --apply
```
Step by step:
```shell
# 1. Ingest and validate only
python manage.py import_parcels --file /path/to/Parcels_2026.csv --tax-year 2026 --profile esri-default

# 2. Apply a validated batch by ID
python manage.py apply_import --batch <BATCH_ID>

# 3. Re-validate an existing batch
python manage.py validate_import --batch <BATCH_ID>
```
Import from Azure Blob via CLI:
```shell
python manage.py import_parcels_blob \
  --url https://caliperdocuments.blob.core.windows.net/doc/imports/Parcels_2026.csv \
  --tax-year 2026 \
  --profile esri-default \
  --apply
```
Force re-import of a duplicate file:
```shell
python manage.py import_parcels \
  --file /path/to/Parcels_2026.csv \
  --tax-year 2026 \
  --profile esri-default \
  --apply \
  --force
```
The Import Pipeline
Every import — regardless of method — passes through three phases:
Ingest
- Reads the CSV (UTF-8).
- Computes a SHA256 hash of the file for deduplication.
- Stores each row as a `ParcelRawRow` record (`raw` field = original CSV row as JSON).
- Skips rows with a blank account value.
- Creates a `ParcelImportBatch` record.
- Rejects the upload if a batch with the same `(tax_year, sha256)` already exists (unless `--force`).
Validate / Normalize
- Applies the schema profile's `ParcelFieldMap` transforms to each raw row.
- Requires at least `account`; any field marked `required` must be present and parseable.
- Writes `normalized` JSON, `errors` list, and `valid` flag back to each `ParcelRawRow`.
- Updates batch counters: `row_count_valid`, `row_count_invalid`.
- Invalid rows are not applied but remain visible for inspection.
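The normalization step can be sketched like this. It is a simplified stand-in: the mapping dicts mimic `ParcelFieldMap` rows, and only a few transforms are wired in for illustration.

```python
def normalize_row(raw: dict, field_maps: list[dict]) -> tuple[dict, list[str]]:
    """Apply each mapping's transform; collect errors instead of failing fast."""
    transforms = {"identity": lambda v: v, "strip": str.strip, "int": int}
    normalized, errors = {}, []
    for m in field_maps:
        value = raw.get(m["source"], "")
        if not value.strip():
            if m.get("required"):
                errors.append(f"{m['source']}: required but blank")
            continue
        try:
            normalized[m["canonical"]] = transforms[m["transform"]](value)
        except ValueError as exc:
            errors.append(f"{m['source']}: {exc}")
    return normalized, errors
```

A row is valid only when its errors list comes back empty; invalid rows keep both their raw values and their errors for later inspection.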
Apply / Upsert
- For each valid row, upserts a `Parcel` record by `(tax_year, account)`.
- If `assessor_total_assessed_value` is absent, it is auto-computed from residential + commercial + agricultural assessed values.
- Sets `current_import_batch` and `raw_last_seen` pointers on the `Parcel`.
- Re-applying the same batch is idempotent — existing records are updated in place.
Monitoring Progress
- Data Tools (admin) → Parcel Import Jobs table shows the most recent 10 jobs with status and row counters. Refresh the page to poll for updates.
- Validate Importations → individual batch detail page shows Latest Job Progress with a progress percentage.
- Parcel Import Jobs admin list shows full job history including error messages.
Job statuses: QUEUED → RUNNING → SUCCESS / FAILED
Batch statuses: UPLOADED → VALIDATED → APPLYING → APPLIED / FAILED / DUPLICATE
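The batch status flow can be expressed as an allowed-transition table. The sets below are inferred from the flow shown above and are an assumption, not the application's actual state machine (for instance, `DUPLICATE` may be set during ingest rather than as a transition).

```python
# Inferred from the documented flow; illustrative, not the app's real state machine.
BATCH_TRANSITIONS: dict[str, set[str]] = {
    "UPLOADED": {"VALIDATED", "FAILED", "DUPLICATE"},
    "VALIDATED": {"APPLYING", "FAILED"},
    "APPLYING": {"APPLIED", "FAILED"},
}


def can_transition(current: str, target: str) -> bool:
    """Terminal statuses (APPLIED, FAILED, DUPLICATE) have no outgoing edges."""
    return target in BATCH_TRANSITIONS.get(current, set())
```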
Verifying the Import
- Go to admin → Property Account Search.
- Search by account number or owner name to confirm records were created.
- Open a parcel record to review all canonical fields, assessed values, and the import batch pointer.
Deduplication Behavior
| Scenario | Behavior |
|---|---|
| Same file re-uploaded for same tax year + profile | Rejected — `DUPLICATE` status |
| Same file with `--force` flag | Accepted — new batch created, parcels updated |
| Different file, same tax year, same accounts | Accepted — parcels updated (upsert) |
| Same account in a different tax year | Accepted — creates a separate `Parcel` record |
Troubleshooting
Invalid rows / missing data:
- On the batch detail page, review Top Errors and Invalid Rows (sample).
- Go to admin → Parcel Row Import Manager (superusers) to inspect the `raw`, `normalized`, and `errors` fields for individual rows.
- Common causes:
  - `source_field` in the profile does not exactly match the CSV column header (including whitespace or non-breaking spaces).
  - Wrong `transform` selected (e.g., `date` applied to a decimal column).
  - Required field is blank in some rows.
Job stuck in RUNNING:
- Check that the `parcel_import_worker` management command is running (it is a long-running daemon process).
- The worker polls for queued jobs every 5 seconds by default.
Duplicate rejection:
- If the import was partially applied and must be re-run, use the `--force` flag via CLI, or Dump Parcel Records (Data Tools) to wipe all parcel data and start fresh.
- Note: Dump Parcel Records is blocked if any Appeals exist — Appeals must be removed first.
Key Models Reference
| Model | Purpose |
|---|---|
| `Parcel` | Canonical property account record; unique by `(tax_year, account)` |
| `ParcelSchemaProfile` | Named mapping profile for a CSV vendor schema |
| `ParcelFieldMap` | Individual column mapping within a profile (`source_field` → `canonical_field` + transform) |
| `ParcelImportBatch` | One uploaded CSV file; tracks status and row counts |
| `ParcelImportJob` | Background job queue entry for validate / apply / blob import operations |
| `ParcelRawRow` | Raw and normalized JSON for each CSV row; retains errors for inspection |
Key Code Locations
| Path | Description |
|---|---|
| appeals/models.py | Parcel, ParcelImportBatch, ParcelImportJob, ParcelRawRow, ParcelSchemaProfile, ParcelFieldMap models |
| appeals/services/parcel_import_pipeline.py | Core three-phase pipeline: ingest_file(), validate_batch(), apply_batch() |
| appeals/services/parcel_import_jobs.py | queue_parcel_job() helper |
| appeals/views.py | import_parcels_csv(), import_parcels_blob_url(), data_tools(), dump_parcels_confirm() |
| appeals/admin.py | ParcelSchemaProfileAdmin, ParcelImportBatchAdmin, ParcelAdmin |
| appeals/management/commands/import_parcels.py | CLI: ingest + validate + optional apply |
| appeals/management/commands/import_parcels_blob.py | CLI: blob download + full pipeline |
| appeals/management/commands/parcel_import_worker.py | Background worker daemon |
| appeals/management/commands/validate_import.py | CLI: validate an existing batch by ID |
| appeals/management/commands/apply_import.py | CLI: apply a validated batch by ID |
| Documentation/parcel_import.md | Original user-facing import guide |