EagleSense provides S3 bucket access for uploading historical transaction data and for batch processing operations. All S3 operations require authentication as described in the Authentication section. You can access the bucket programmatically with the AWS SDKs or, for operations such as historical data uploads, upload files manually if preferred.
This page describes the process for uploading historical data.
S3 Bucket Structure
Your client-specific S3 bucket follows this naming convention: {merchant_name}-merchant-eagle-sense
{merchant_name}-merchant-eagle-sense/
├── historical-data/        # Upload historical transaction data for ML model training
│   └── historical_data_<id>.json
├── batch-request-data/     # Upload batch transaction data for evaluation
│   └── batch_<external_batch_id>.json
├── batch-result-data/      # Download batch evaluation results
│   └── batch_<external_batch_id>_<internal_subbatch_id>_result_<subbatch_index>_<total_number_of_subbatches>.json
└── update-data/            # Upload batch transaction attributes that became available later
    └── update_<external_update_batch_id>.json
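The key patterns above can be captured as small helper functions, which keeps object names consistent across uploads and downloads. This is a sketch; the function names are illustrative, but the key formats follow the tree shown above:

```python
def historical_data_key(data_id: str) -> str:
    """Key for a historical transaction data file under historical-data/."""
    return f"historical-data/historical_data_{data_id}.json"

def batch_request_key(external_batch_id: str) -> str:
    """Key for a batch request file under batch-request-data/."""
    return f"batch-request-data/batch_{external_batch_id}.json"

def batch_result_key(external_batch_id: str, internal_subbatch_id: str,
                     subbatch_index: int, total_subbatches: int) -> str:
    """Key for one sub-batch result file under batch-result-data/."""
    return (f"batch-result-data/batch_{external_batch_id}_{internal_subbatch_id}"
            f"_result_{subbatch_index}_{total_subbatches}.json")

def update_key(external_update_batch_id: str) -> str:
    """Key for a late-arriving attribute update file under update-data/."""
    return f"update-data/update_{external_update_batch_id}.json"
```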
Prerequisites
Before performing S3 operations, ensure you have:
Proper authentication as described in the Authentication section
AWS SDK or HTTP client configured with your credentials
Your client name and S3 bucket name confirmed with EagleSense support
Data Upload Operations
Historical Data for ML Tuning
Upload historical transaction data to improve the accuracy of the ML model. This data is used for model training and performance optimization.