Caliper

Task Guides

Loading Property Account Information

Execute dependable parcel account imports with clear prerequisites, validation checks, and audit-ready outcomes.

Guide focus

  • Prerequisite validation

    Confirm the schema profile is active and mapped before imports begin.

  • Import execution

    Run CSV import operations with clear validation and apply checkpoints.

Operational walkthroughs

Follow this sequence to load and validate property account information for each assessment cycle.

Required prerequisite

Before using this guide, complete Parcel Schema Profiles. This section depends on an active profile (for example, esri-default) with correct field mappings.

Overview

Property accounts in Pie are represented by the Parcel model — one record per (tax_year, account) pair. Parcel records are the source of truth for property data and serve as the foundation for creating Appeals. All parcel data is loaded via CSV import using a configurable schema mapping system.


Prerequisites

1. Schema Profile

Before importing any CSV, a ParcelSchemaProfile must exist and be active. The profile defines how your vendor's CSV column names map to Pie's canonical fields.

Where: Django admin → Parcel Column Import Manager

Steps:

  1. Click Add Parcel Schema Profile.
  2. Set a name (e.g., esri-default), vendor (esri), and mark Active.
  3. Under Column Mappings, add a ParcelFieldMap row for each CSV column:
    • Source Field — exact CSV column header as it appears in the file
    • Canonical Field — the Pie field to map to (see canonical fields below)
    • Transform — how the raw value should be parsed (see transforms below)
    • Required — check at minimum for account
  4. Save.

Canonical Fields:

Canonical Field | Description
account | Account number — required
parcel_id | Parcel ID
owner_name | Property owner name
situs_address | Property situs address
mailing_address | Full mailing address
municipality | Municipality name
situs_zip | Situs ZIP code
assessor_total_market_value | Total assessor market value
assessor_total_assessed_value | Total assessor assessed value
res_assessed_value | Residential assessed value
res_improvement_value | Residential improvement value
res_land_value | Residential land value
com_assessed_value | Commercial assessed value
com_improvement_value | Commercial improvement value
com_land_value | Commercial land value
ag_assessed_value | Agricultural assessed value
ag_improvement_value | Agricultural improvement value
ag_land_value | Agricultural land value
prev_owner_1 | Previous owner name
sale_price_1 | Most recent sale price
sale_date_1 | Most recent sale date
parcel_report_url | URL to external parcel report

Transforms:

Transform | Behavior
identity | Raw value, no change
strip | Strip leading/trailing whitespace
upper / lower | Convert case
decimal | Parse as decimal number (handles commas)
int | Parse as integer
date | Parse multiple formats: YYYY-MM-DD, MM/DD/YYYY, MM/DD/YY, ESRI timezone suffix
date_epoch_ms | Convert epoch milliseconds to date
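As a rough sketch of the transform semantics above (the real implementations live in appeals/services/parcel_import_pipeline.py and may differ in detail), each transform is just a small parsing function applied to the raw cell value:

```python
from datetime import date, datetime, timezone
from decimal import Decimal

def t_strip(v):
    return v.strip()

def t_decimal(v):
    # Strip thousands separators before parsing: "1,234.50" -> Decimal("1234.50")
    return Decimal(v.replace(",", "").strip())

def t_int(v):
    return int(v.replace(",", "").strip())

DATE_FORMATS = ("%Y-%m-%d", "%m/%d/%Y", "%m/%d/%y")

def t_date(v):
    # Try each supported format; ESRI exports may append a time/timezone
    # suffix, so fall back to parsing just the leading date token.
    v = v.strip()
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(v, fmt).date()
        except ValueError:
            pass
    # e.g. "2026-01-15 00:00:00+00" -> keep only the date portion
    return datetime.strptime(v.split()[0], "%Y-%m-%d").date()

def t_date_epoch_ms(v):
    # Epoch milliseconds -> calendar date (interpreted as UTC)
    return datetime.fromtimestamp(int(v) / 1000, tz=timezone.utc).date()
```

If a transform raises, the validate phase records the exception message in the row's errors list rather than aborting the batch.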

CSV Preparation

  • Encode as UTF-8.
  • Include a header row; column names must exactly match the source_field values in your schema profile.
  • Every row must have a non-blank account value — rows with blank accounts are silently skipped.
  • Each import is scoped to a single tax year; the (tax_year, account) pair is unique.
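A quick pre-flight check against these rules can save a failed batch. The sketch below (not part of Pie; the "ACCOUNT" column name is a hypothetical vendor header) verifies that the CSV headers exactly match the profile's source_field values and counts rows whose account is blank, since those would be silently skipped:

```python
import csv
import io

def preflight(csv_text, expected_headers, account_header="ACCOUNT"):
    """Pre-flight a parcels CSV: headers must exactly match the schema
    profile's source_field values, and blank-account rows are flagged
    because the ingest phase would silently skip them."""
    reader = csv.DictReader(io.StringIO(csv_text))
    found = reader.fieldnames or []
    missing = [h for h in expected_headers if h not in found]
    blank = sum(1 for row in reader
                if not (row.get(account_header) or "").strip())
    return {"missing_headers": missing, "blank_account_rows": blank}
```

Run it on the file contents before uploading; a non-empty missing_headers list usually means a stray space or case mismatch in the profile's source_field.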

Import Methods

Method A — Azure Blob URL (Recommended for Large Files)

This method queues a background job that downloads, validates, and applies the CSV without risk of HTTP timeout.

  1. Upload the CSV to Azure Blob Storage (container doc).
  2. Copy the blob URL (e.g., https://caliperdocuments.blob.core.windows.net/doc/imports/Parcels_2026.csv).
  3. In the admin, go to Data Tools.
  4. Under Parcels.csv Import (Azure Blob URL), paste the URL, enter the tax year, and select the schema profile.
  5. Click Import from Blob.
  6. A ParcelImportJob with action BLOB_IMPORT is queued. The background worker picks it up and runs the full pipeline (ingest → validate → apply).

Method B — Admin Upload (Small to Medium Files)

This method uploads the file directly through the admin interface and queues background jobs for each pipeline phase.

  1. Go to admin → Validate Importations → Add.
  2. Enter tax year, select schema profile, optionally enter a source name, and upload the CSV file.
  3. Save. This creates a ParcelImportBatch in UPLOADED status with all rows stored as ParcelRawRow records.
  4. On the batch detail page, click Queue Validation to normalize and validate all rows (transitions to VALIDATED).
  5. After validation, click Queue Apply to upsert Parcel records (transitions to APPLIED).

Method C — Management Commands (CLI / Automated)

Use for scripted or automated imports (e.g., CI pipelines, scheduled tasks).

Full workflow in one command:

python manage.py import_parcels \
  --file /path/to/Parcels_2026.csv \
  --tax-year 2026 \
  --profile esri-default \
  --apply

Step by step:

# 1. Ingest and validate only
python manage.py import_parcels --file /path/to/Parcels_2026.csv --tax-year 2026 --profile esri-default

# 2. Apply a validated batch by ID
python manage.py apply_import --batch <BATCH_ID>

# 3. Re-validate an existing batch
python manage.py validate_import --batch <BATCH_ID>

Import from Azure Blob via CLI:

python manage.py import_parcels_blob \
  --url https://caliperdocuments.blob.core.windows.net/doc/imports/Parcels_2026.csv \
  --tax-year 2026 \
  --profile esri-default \
  --apply

Force re-import of a duplicate file:

python manage.py import_parcels --file /path/to/Parcels_2026.csv --tax-year 2026 --profile esri-default --apply --force

The Import Pipeline

Every import — regardless of method — passes through three phases:

  1. Ingest

    • Reads the CSV (UTF-8).
    • Computes a SHA256 hash of the file for deduplication.
    • Stores each row as a ParcelRawRow record (raw field = original CSV row as JSON).
    • Skips rows with a blank account value.
    • Creates a ParcelImportBatch record.
    • Rejects the upload if a batch with the same (tax_year, sha256) already exists (unless --force).
  2. Validate / Normalize

    • Applies the schema profile's ParcelFieldMap transforms to each raw row.
    • Requires at least account; any field marked required must be present and parseable.
    • Writes normalized JSON, errors list, and valid flag back to each ParcelRawRow.
    • Updates batch counters: row_count_valid, row_count_invalid.
    • Invalid rows are not applied but remain visible for inspection.
  3. Apply / Upsert

    • For each valid row, upserts a Parcel record by (tax_year, account).
    • If assessor_total_assessed_value is absent, it is auto-computed from residential + commercial + agricultural assessed values.
    • Sets current_import_batch and raw_last_seen pointers on the Parcel.
    • Re-applying the same batch is idempotent — existing records are updated in place.
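The validate/normalize and apply phases above can be sketched as pure functions over one raw row. Here the field maps are plain dicts standing in for ParcelFieldMap records, and the transform registry is a hypothetical dict of callables; the real pipeline code is in appeals/services/parcel_import_pipeline.py:

```python
def normalize_row(raw, field_maps, transforms):
    """Apply a profile's field maps to one raw CSV row.
    Returns (normalized, errors, valid) as stored on ParcelRawRow."""
    normalized, errors = {}, []
    for fm in field_maps:
        value = raw.get(fm["source_field"], "")
        if not value.strip():
            if fm.get("required"):
                errors.append(f"missing required field {fm['source_field']}")
            continue
        try:
            normalized[fm["canonical_field"]] = transforms[fm["transform"]](value)
        except (ValueError, ArithmeticError) as exc:
            errors.append(f"{fm['source_field']}: {exc}")
    return normalized, errors, not errors

def total_assessed(normalized):
    # Apply-phase fallback: when the total is absent, sum the
    # residential + commercial + agricultural assessed values.
    if "assessor_total_assessed_value" in normalized:
        return normalized["assessor_total_assessed_value"]
    parts = ("res_assessed_value", "com_assessed_value", "ag_assessed_value")
    return sum(normalized.get(p, 0) for p in parts)
```

Invalid rows keep their errors list attached, which is what the batch detail page surfaces under Top Errors.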

Monitoring Progress

  • Data Tools (admin) → Parcel Import Jobs table shows the most recent 10 jobs with status and row counters. Refresh the page to poll for updates.
  • Validate Importations → individual batch detail page shows Latest Job Progress with a progress percentage.
  • Parcel Import Jobs admin list shows full job history including error messages.

Job statuses: QUEUED → RUNNING → SUCCESS / FAILED

Batch statuses: UPLOADED → VALIDATED → APPLYING → APPLIED / FAILED / DUPLICATE


Verifying the Import

  1. Go to admin → Property Account Search.
  2. Search by account number or owner name to confirm records were created.
  3. Open a parcel record to review all canonical fields, assessed values, and the import batch pointer.

Deduplication Behavior

Scenario | Behavior
Same file re-uploaded for same tax year + profile | Rejected — DUPLICATE status
Same file with --force flag | Accepted — new batch created, parcels updated
Different file, same tax year, same accounts | Accepted — parcels updated (upsert)
Same account in a different tax year | Accepted — creates separate Parcel record
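The duplicate check reduces to a key lookup on (tax_year, sha256). A minimal sketch, assuming an in-memory set stands in for the existing ParcelImportBatch records:

```python
import hashlib

def dedup_decision(file_bytes, tax_year, seen, force=False):
    """Mirror the ingest-phase duplicate check: a batch keyed by
    (tax_year, sha256-of-file) is rejected unless force is set."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    key = (tax_year, digest)
    if key in seen and not force:
        return "DUPLICATE"
    seen.add(key)
    return "ACCEPTED"
```

Note that any byte-level change to the file (even whitespace) produces a new hash, which is why a "different file, same accounts" upload is accepted and upserted.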

Troubleshooting

Invalid rows / missing data:

  • On the batch detail page, review Top Errors and Invalid Rows (sample).
  • Go to admin → Parcel Row Import Manager (superusers) to inspect the raw, normalized, and errors fields for individual rows.
  • Common causes:
    • source_field in the profile does not exactly match the CSV column header (including whitespace or non-breaking spaces).
    • Wrong transform selected (e.g., date applied to a decimal column).
    • Required field is blank in some rows.

Job stuck in RUNNING:

  • Check that the parcel_import_worker management command is running (it is a long-running daemon process).
  • The worker polls for queued jobs every 5 seconds by default.
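For intuition, the worker's behavior can be sketched as a simple poll loop. The fetch_queued_job and run_job callables here are stand-ins, not Pie's actual API; max_iterations exists only to make the sketch terminate:

```python
import time

def run_worker(fetch_queued_job, run_job, poll_interval=5.0, max_iterations=None):
    """Minimal sketch of a polling worker: claim the next QUEUED job,
    mark it RUNNING, execute it, then record SUCCESS or FAILED (with
    the error message) before polling again."""
    polls = 0
    while max_iterations is None or polls < max_iterations:
        polls += 1
        job = fetch_queued_job()
        if job is None:
            time.sleep(poll_interval)  # doc states a 5-second default
            continue
        job["status"] = "RUNNING"
        try:
            run_job(job)
            job["status"] = "SUCCESS"
        except Exception as exc:
            job["status"] = "FAILED"
            job["error"] = str(exc)
```

A job stuck in RUNNING therefore usually means the daemon died mid-job; restarting the worker and checking the job's error message is the first step.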

Duplicate rejection:

  • If the import was partially applied and must be re-run, use the --force flag via CLI, or Dump Parcel Records (Data Tools) to wipe all parcel data and start fresh.
  • Note: Dump Parcel Records is blocked if any Appeals exist — Appeals must be removed first.

Key Models Reference

Model | Purpose
Parcel | Canonical property account record; unique by (tax_year, account)
ParcelSchemaProfile | Named mapping profile for a CSV vendor schema
ParcelFieldMap | Individual column mapping within a profile (source_field → canonical_field + transform)
ParcelImportBatch | One uploaded CSV file; tracks status and row counts
ParcelImportJob | Background job queue entry for validate / apply / blob import operations
ParcelRawRow | Raw and normalized JSON for each CSV row; retains errors for inspection

Key Code Locations

Path | Description
appeals/models.py | Parcel, ParcelImportBatch, ParcelImportJob, ParcelRawRow, ParcelSchemaProfile, ParcelFieldMap models
appeals/services/parcel_import_pipeline.py | Core three-phase pipeline: ingest_file(), validate_batch(), apply_batch()
appeals/services/parcel_import_jobs.py | queue_parcel_job() helper
appeals/views.py | import_parcels_csv(), import_parcels_blob_url(), data_tools(), dump_parcels_confirm()
appeals/admin.py | ParcelSchemaProfileAdmin, ParcelImportBatchAdmin, ParcelAdmin
appeals/management/commands/import_parcels.py | CLI: ingest + validate + optional apply
appeals/management/commands/import_parcels_blob.py | CLI: blob download + full pipeline
appeals/management/commands/parcel_import_worker.py | Background worker daemon
appeals/management/commands/validate_import.py | CLI: validate an existing batch by ID
appeals/management/commands/apply_import.py | CLI: apply a validated batch by ID
Documentation/parcel_import.md | Original user-facing import guide