
Overview

Import complete Apple Health data exports via XML files using one of two methods:
  1. S3 Presigned URL (Recommended) - For large files, uses direct S3 upload. The frontend handles this automatically; the scripts below are for manual or testing purposes.
  2. Direct Upload - For smaller files or testing, uploads directly to the API

Authentication

All endpoints require authentication via Bearer token (user login) or API key.
# Login to get access token
POST /api/v1/auth/login
Content-Type: application/x-www-form-urlencoded

username=user@example.com&password=yourpassword

# Response
{
  "access_token": "eyJ...",
  "token_type": "bearer"
}
Then use the token in subsequent requests:
Authorization: Bearer eyJ...
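
The login and header-building steps above can be sketched in Python. This is an illustrative sketch, not part of the API client: the base URL is a placeholder, and `login`/`auth_header` are hypothetical helper names.

```python
import requests

def login(base_url: str, username: str, password: str) -> str:
    """Exchange credentials for a bearer token via the form-encoded login endpoint."""
    resp = requests.post(
        f"{base_url}/api/v1/auth/login",
        # data= sends application/x-www-form-urlencoded, as the endpoint expects
        data={"username": username, "password": password},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def auth_header(token: str) -> dict:
    """Build the Authorization header used by all subsequent requests."""
    return {"Authorization": f"Bearer {token}"}
```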

Endpoints

S3 Presigned URL Upload

Best for large files (10MB+). Uploads the export directly to S3, then processes it asynchronously.

Step 1: Request Presigned URL

curl -X POST "https://api.example.com/api/v1/users/{user_id}/import/apple/xml/s3" \
  -H "accept: application/json" \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{
    "filename": "export.xml",
    "expiration_seconds": 300,
    "max_file_size": 52428800
  }'
Parameters

  • user_id (string, required) - The ID of the user to import data for.
  • filename (string, default "") - Custom filename (max 200 characters).
  • expiration_seconds (integer, default 300) - URL expiration time in seconds (60 - 3600).
  • max_file_size (integer, default 52428800) - Maximum file size in bytes (1KB - 500MB). The default is 50MB.

Response
{
  "upload_url": "https://s3.amazonaws.com/bucket/key",
  "form_fields": {
    "key": "apple-uploads/user-123/file-456.xml",
    "AWSAccessKeyId": "AKIA...",
    "policy": "eyJ...",
    "signature": "abc123..."
  },
  "file_key": "apple-uploads/user-123/file-456.xml",
  "expires_in": 300,
  "max_file_size": 52428800,
  "bucket": "my-bucket"
}
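
Step 1 can also be done in Python, mirroring the curl request above. A sketch only: `build_payload` and `request_presigned_url` are illustrative names, and the defaults match the documented parameter defaults.

```python
import requests

def build_payload(filename="export.xml", expiration_seconds=300,
                  max_file_size=52_428_800):
    """Request body for the presigned-URL endpoint, using the documented defaults."""
    return {
        "filename": filename,
        "expiration_seconds": expiration_seconds,
        "max_file_size": max_file_size,
    }

def request_presigned_url(base_url, user_id, token, **kwargs):
    """POST to the presigned-URL endpoint; returns the parsed JSON response."""
    resp = requests.post(
        f"{base_url}/api/v1/users/{user_id}/import/apple/xml/s3",
        json=build_payload(**kwargs),
        headers={"Authorization": f"Bearer {token}", "accept": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()  # includes upload_url, form_fields, file_key
```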

Step 2: Upload File to S3

Use the upload_url and form_fields from the previous response to upload your XML file:
# Upload using form_fields from the Step 1 response
import requests

# presigned_data is the JSON body returned by Step 1
with open('export.xml', 'rb') as f:
    files = {'file': ('export.xml', f, 'application/xml')}
    upload_response = requests.post(
        presigned_data['upload_url'],
        data=presigned_data['form_fields'],
        files=files
    )
    upload_response.raise_for_status()

print(f"✓ Uploaded successfully! File key: {presigned_data['file_key']}")
Important: When uploading to S3 with presigned POST, you must include all form_fields as form data, and the file field must be last.
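
One way to make that ordering requirement explicit is to build the multipart parts as an ordered list (which `requests` also accepts via `files=`). This helper is illustrative, not part of the API:

```python
def build_multipart_parts(form_fields, file_tuple):
    """Return multipart parts as an ordered list with the file part last,
    as S3 presigned POST requires. file_tuple is (filename, content, content_type)."""
    # (None, value) tells requests to send a plain form field, not a file part
    parts = [(name, (None, value)) for name, value in form_fields.items()]
    parts.append(("file", file_tuple))  # file must be the final part
    return parts
```

Usage would look like `requests.post(upload_url, files=build_multipart_parts(form_fields, ("export.xml", data, "application/xml")))`.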

Complete Example Workflow
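
The two steps can be combined into a single function. This is a minimal sketch under the assumptions documented above; `import_apple_export` is an illustrative name, and the base URL and token are placeholders.

```python
import os
import requests

def import_apple_export(base_url, user_id, token, xml_path):
    """Request a presigned POST, upload the XML file to S3, return the file key."""
    filename = os.path.basename(xml_path)

    # Step 1: request a presigned POST for this file
    resp = requests.post(
        f"{base_url}/api/v1/users/{user_id}/import/apple/xml/s3",
        json={"filename": filename},
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    presigned = resp.json()

    # Step 2: upload to S3 - all form_fields as form data, file field last
    with open(xml_path, "rb") as f:
        upload = requests.post(
            presigned["upload_url"],
            data=presigned["form_fields"],
            files={"file": (filename, f, "application/xml")},
        )
    upload.raise_for_status()
    return presigned["file_key"]
```

Processing begins asynchronously after the upload, so the returned file key does not imply the data is queryable yet.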

Data Imported

Workouts

  • Activity type (running, cycling, swimming, etc.)
  • Duration and timestamps
  • Distance, calories, elevation
  • Heart rate statistics (min/max/avg)

Time Series Samples

  • Heart rate
  • Steps
  • Active energy
  • Distance
  • Blood oxygen
  • And 100+ other metrics

See the Data Types Guide for the complete list.

Best Practices

Use S3 for Large Files

Files over 10MB should use the presigned URL method to avoid timeouts.

Handle Async Processing

Import processing is asynchronous; don’t expect data to be available immediately.

Monitor Task Status

Use Celery Flower or application logs to monitor processing status.

Dedupe Handled Automatically

Records with the same external_id won’t be duplicated.
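
Because processing is asynchronous, clients often poll with backoff until the import is visible. A generic sketch: the actual status check (a Flower query, a log scan, or a status endpoint, none of which this page defines) is passed in as a callable, so nothing here assumes a specific API.

```python
import time

def wait_until(check, timeout=300, initial_delay=1.0, max_delay=30.0):
    """Poll check() with exponential backoff until it returns truthy or
    timeout (seconds) elapses. Returns True on success, False on timeout."""
    deadline = time.monotonic() + timeout
    delay = initial_delay
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(delay)
        delay = min(delay * 2, max_delay)  # back off, capped at max_delay
    return False
```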