Manually Importing Transactional Data
Typically, you would import transaction data via the Adestra API using:
- transaction.import if you have a transactional data file and you do not wish to send a campaign with it. This is useful for a daily scheduled import.
- transaction.create if you want to do ad-hoc imports of transactions by supplying the transaction data immediately. This is useful for drip-feeding data.
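As a rough sketch of an ad-hoc import, the transaction.create method could be called over the API along these lines. The endpoint URL, authentication, and payload structure below are assumptions for illustration only; check the Adestra API documentation for the actual call format. Only the method names and the required fields (transaction_ref, product_ref) come from this article.

```python
import xmlrpc.client

# Hypothetical endpoint -- substitute the URL and credentials from your
# Adestra API documentation.
API_URL = "https://app.adestra.com/api/xmlrpc"

# One transaction record. transaction_ref and product_ref are the required
# fields; the other keys are illustrative.
transaction = {
    "transaction_ref": "ORDER-1001",
    "product_ref": "SKU-42",
    "transaction_timestamp": "2024-05-01T12:30:00Z",
    "transaction_value": 19.99,
}

def build_payload(core_table_id, records):
    """Pair the records with the core table they should be imported against.

    The payload shape is a guess -- consult the API reference for the real
    argument structure.
    """
    return {"core_table_id": core_table_id, "transactions": records}

payload = build_payload(12345, [transaction])

# Uncomment to perform the network call:
# proxy = xmlrpc.client.ServerProxy(API_URL)
# proxy.transaction.create(payload)
```

For a daily scheduled import of a whole file, transaction.import would be used instead, with the file contents supplied in whatever format the API reference specifies.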
Alternatively, you can manually import data through the Adestra import interface.
If you have a remote connection configured, you can also configure a scheduled import.
Note: importing transactions into Adestra also populates your items list.
Using the transaction import interface
This feature is currently going through Beta testing. If you would like to be a part of this process or would like more information please contact your account manager.
From the relevant workspace:
- Access the import interface by opening the data menu and selecting Transactions.
- Select a CSV file with the transaction data to import.
- Use the dropdown menus to select the delimiter and character encoding used in your CSV file, e.g. comma-separated unicode.
  Tip: use the auto-detect option to let Adestra find this information from your file.
- The Remove file after import checkbox prevents the file from being stored unnecessarily in the file manager. Only uncheck this box if you need to keep the file in Adestra.
- Select the core table you want transactions to be imported against.
- Select the brand the transactions should be imported to.
- If you are importing transactions with contacts that don't already exist, check the create contacts checkbox to auto-create new contacts from your CSV file.
- Press the Next button.
- Drag the fields in your CSV file to fields in your core table to map them.
  Adestra will auto-map fields with matching names.
  Required fields are transaction_ref and product_ref.
- Optional: add additional tags and data for your import.
  Select the type of tag or data (transaction or product).
  Enter a name for the tag/data and the field it applies to.
- Press the Next button.
  Tip: use the import log to see whether your import was successful.
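The steps above expect a CSV whose header names match your core table fields so that auto-mapping works. A minimal file can be sketched as follows; transaction_ref and product_ref are the required columns per this article, while transaction_timestamp and transaction_value are optional examples.

```python
import csv
import io

# Minimal transaction import file. Only transaction_ref and product_ref
# are required; the remaining columns are illustrative extras.
rows = [
    {"transaction_ref": "ORDER-1001", "product_ref": "SKU-42",
     "transaction_timestamp": "2024-05-01T12:30:00Z", "transaction_value": "19.99"},
    {"transaction_ref": "ORDER-1002", "product_ref": "SKU-17",
     "transaction_timestamp": "2024-05-01T13:05:00Z", "transaction_value": "5.00"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
writer.writeheader()   # header names should match core table fields for auto-mapping
writer.writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```

Saving this text as a .csv file produces something the import interface can map with no manual dragging, because each header matches a core table field name.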
Troubleshooting transactional data imports
- On macOS, make sure the CSV file is saved in Windows CSV format.
- Make sure the CSV has a header row with one header per column, for example: email, first name, surname.
  It is highly recommended that the headers match the field names in the core table for ease of mapping.
- In the email column, make sure every row contains an email address in a standard format, with no blank cells.
- If a CSV file becomes corrupt, try the following steps:
  - Copy the data from the original CSV file into a new CSV and save it under a new file name. You may change the file name slightly to avoid confusion.
  - If the step above does not work, remove any rows that contain incomplete data. For example, if a table has email id, first name, last name and address columns, and a row only has an email id, a first name, or an address, the data is incomplete. Remove such rows and try importing again.
  - If this does not work, delete some columns and try uploading one column at a time to find the column with an issue.
    For example, alongside the required fields (transaction_ref and product_ref) a transaction file may contain other fields (like transaction_timestamp and transaction_value) and user data (like transaction_tags and some product_tags).
    Start by removing the tag columns (transaction and product tags).
    If the import still fails, individually import the transaction_ref, product_ref, transaction_timestamp, and transaction_value columns.
    This approach will help you narrow down the columns with issues.
    For the transaction_timestamp column, make sure the dates use the ISO 8601 format so the timestamp column is read correctly and date-processing issues are minimised.
- If any profile field cell in the CSV is over 255 bytes, it is recommended to reduce its length. If reducing the length is not an option, make sure the field you are uploading to is a large text field.
- Ensure the dedupe field 'Email' is mapped in the import mapping.
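Several of the troubleshooting checks above (required headers, blank or malformed emails, cells over 255 bytes, non-ISO-8601 timestamps) can be run locally before uploading. The sketch below is an assumption-laden pre-flight check, not an Adestra tool: the column names and limits come from this article, and the email pattern is deliberately loose.

```python
import csv
import io
import re
from datetime import datetime

# Very loose email check -- just "something@something.tld".
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
REQUIRED = {"transaction_ref", "product_ref"}

def validate(csv_text):
    """Return a list of problems that commonly break transaction imports."""
    errors = []
    reader = csv.DictReader(io.StringIO(csv_text))
    headers = set(reader.fieldnames or [])
    missing = REQUIRED - headers
    if missing:
        errors.append(f"missing required columns: {sorted(missing)}")
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        for field, value in row.items():
            value = value or ""
            if len(value.encode("utf-8")) > 255:
                errors.append(f"row {i}: '{field}' exceeds 255 bytes")
        if "email" in headers:
            email = (row.get("email") or "").strip()
            if not EMAIL_RE.match(email):
                errors.append(f"row {i}: invalid or blank email")
        ts = row.get("transaction_timestamp")
        if ts:
            try:
                # fromisoformat() in older Pythons rejects a trailing "Z".
                datetime.fromisoformat(ts.replace("Z", "+00:00"))
            except ValueError:
                errors.append(f"row {i}: timestamp not ISO 8601")
    return errors
```

Running this over a file before import narrows the column-by-column elimination process above down to the rows and columns it flags.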