I'm working with a client who has a large database in Acumatica. Their GLTran table has over 2.8 million records (yes, over 2,800,000 records).
As always, we'd like to be able to analyze this data with 3rd party tools. But we can't export to Excel because an Excel worksheet can only hold 1,048,576 rows.
OData is an option, but it's REALLY SLOW on a large dataset like this.
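For reference, pulling a Generic Inquiry over OData means issuing paged HTTP requests, which is where the slowness comes from on millions of rows. A minimal sketch in Python, assuming the usual /odata/&lt;company&gt;/&lt;GI name&gt; endpoint shape and a hypothetical instance and credentials:

```python
import base64
import urllib.parse
import urllib.request

def build_odata_request(base_url, company, gi_name, user, password, top=None):
    """Build an authenticated GET request for a Generic Inquiry exposed
    over OData (URL shape: <base_url>/odata/<company>/<GI name>)."""
    url = f"{base_url}/odata/{company}/{urllib.parse.quote(gi_name)}"
    if top is not None:
        url += f"?$top={top}"  # page the result set instead of pulling everything
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Accept", "application/json")
    return req

# Usage (hypothetical instance and credentials); the response would be read
# with urllib.request.urlopen(req):
req = build_odata_request("https://example.acumatica.com", "Company",
                          "GLTran Export", "admin", "secret", top=1000)
```

With 2.8 million rows, even with $top paging you end up making thousands of round trips, which is why a server-side export scenario is attractive.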
So, I've been wondering if we could use an Export Scenario. Thanks to @shennelly at the Acumatica User Group in Ohio (AUG Ohio) tonight for telling me that it was possible.
I gave it a try and it worked. This is what I did...
1. Created a file in Notepad with the column headings that I wanted, separated by commas:
Note: I discovered that Acumatica wanted a second line or it didn't recognize the file. So, I just copied the first line and pasted it since that was the easiest thing to do.
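For example, the file could look like this (the column names here are hypothetical; the point is just a comma-separated header line, repeated once so the file has the second line Acumatica wants):

```csv
VendorID,VendorName,VendorClass,Status
VendorID,VendorName,VendorClass,Status
```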
2. Saved the file as a CSV file called My Vendor Export.csv:
3. Went to the Data Providers (SM206015) screen, populated the Name and Provider Type fields, then pressed Save:
4. Dragged and dropped the My Vendor Export.csv file onto the screen which attached it to the FILES area.
5. Pressed the RELOAD PARAMETERS button, then Save, which populated the Value column on the FileName row with the link to my uploaded file:
6. Clicked the SCHEMA tab, checked the Active box on the left-hand side, then clicked the FILL SCHEMA FIELDS button on the right-hand side, then Save:
7. Went to the Export Scenarios (SM207025) screen, gave it a Name, selected the Vendors (AP3030PL) Generic Inquiry in the Screen Name field, and clicked Save:
8. Populated the MAPPING tab as follows:
9. Went to the Export by Scenario (SM207036) screen, selected my Export Scenario, then pressed PREPARE (which populated the grid below):
10. Then pressed EXPORT which moved the data into my CSV file:
11. Clicked ... -> Get Latest Version which allowed me to download the latest version of the My Vendor Export.csv file to my computer:
12. Opened the My Vendor Export.csv file in Notepad and noted that I now have a populated CSV file!
I didn't go this far, but Stephen was encouraging me to try syncing the file to an SFTP folder, which you could do like this:
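I haven't tested it, but the SFTP push could be scripted along these lines (the host, folder, and user are hypothetical, and it assumes a curl build with SFTP support):

```python
def sftp_upload_command(local_file, host, remote_dir, user):
    """Build a curl command that pushes a local file to an SFTP folder.
    curl -T uploads the file to the remote directory; this requires a
    curl binary built with SFTP (libssh) support."""
    return ["curl", "-T", local_file,
            f"sftp://{host}{remote_dir}/",
            "--user", user]

# Hypothetical host, folder, and user; run with subprocess.run(cmd)
# once the export scenario has refreshed the attached file:
cmd = sftp_upload_command("My Vendor Export.csv", "sftp.example.com",
                          "/exports", "acumatica")
```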
Back to the client with 2.8 million records in their GLTran table, I tried this same approach in their environment. Unfortunately, the Export Scenario ran for 35 seconds, then returned this error message:
Denial Code 513. System was unable to process the request due to data size limits. Please review your operation and consider requesting a smaller dataset. Contact your provider for further assistance.
Hello Tim
Did you consider changing the "Maximum File Upload Size in KB" in the File Upload Preferences?
Gabriel Michaud asked me via email if this works with any Generic Inquiry or just those tied to a data entry screen like the one used above.
It seems to work with any Generic Inquiry. Here are some screenshots of my GLTran export setup:
I tried the GLTran export in my local demo environment and it exported successfully:
I still need to try the suggestion from @jeffrey-patch to see if that will work in the client's environment.
This process should work with any GI. I have lots of export scenarios based on GIs that target both files and SQL Server data sources.
Could you please screenshot the errors rather than attach the doc?
@timrodman, thanks for the idea (and the screenshot conversion), but the results are the same. It seems to be something with the export function, since exporting to Excel works fine. Might be time to rebuild from scratch, but that's for another day.
Click on the link in the error message that opens the log file. You need to work out what is wrong with the data in the Excel XML file. There should be some explanation in the log.
The text in the box gives you the path to the source XML file. Change the extension of the .xlsx file to .zip and you can open it with Windows Explorer and find that XML file.
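To illustrate the .xlsx-is-a-ZIP point, here's a small Python sketch that lists the XML parts without even renaming the file (the renaming trick is only needed so Windows Explorer will open it):

```python
import zipfile

def list_workbook_parts(xlsx_file):
    """An .xlsx file is a ZIP archive of XML parts, so zipfile can open
    it directly; no rename to .zip is required."""
    with zipfile.ZipFile(xlsx_file) as zf:
        return zf.namelist()

# e.g. list_workbook_parts("export.xlsx") would typically include
# xl/workbook.xml, xl/worksheets/sheet1.xml, xl/sharedStrings.xml, ...
```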
I'd also pass that XML file through an online XML validator. My guess is there's something in the file that's not conforming.
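If the file is too big to paste into an online validator, a local well-formedness check is a few lines with Python's standard library. Note this checks well-formedness only, so no XSD schema is needed for this step:

```python
import xml.etree.ElementTree as ET

def check_well_formed(xml_file):
    """Return (True, None) if the XML parses, else (False, error detail).
    This is a well-formedness check only; validating against an XSD is
    a separate, optional step."""
    try:
        ET.parse(xml_file)
        return True, None
    except ET.ParseError as err:
        return False, str(err)
```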
Thank you, @roycelithgo.
I've got the XML file and thought I had a breakthrough:
1. Thought maybe the data provider file wasn't clean enough, so I removed all the formatting and reviewed it. The export opened clean the first time, but then started failing on open.
2. Saw some data in the export with trailing blanks after values, so I trimmed those up in the GI, but opening the export still failed the same way.
3. Thought maybe the data provider name (it had "-" and "." characters) might be an issue, but again the first export opened clean and subsequent exports failed the same way.
I downloaded Notepad++ and I think I confirmed the XML syntax, but I don't know what to use for the XSD that's asked for when validating the XML.
Any further leads would be appreciated.
I was just thinking of whatever XML validator a Google search turns up.
eg.
https://www.xmlvalidation.com/
But what about the log file mentioned in your 2nd screenshot? That might have the explanation already.
@roycelithgo, below is the log file. Not sure it helps?
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<recoveryLog xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">
<logFileName>error031680_01.xml</logFileName>
<summary>Errors were detected in file 'C:\Users\Dave.Ziskind\Downloads\UKG Ready Project Task Import Spec for Automation Clean (2).xlsx'</summary>
<additionalInfo>
<info>Excel completed file level validation and repair. Some parts of this workbook may have been repaired or discarded.</info>
</additionalInfo>
</recoveryLog>
Have you tried an XML validator yet?
When run through any validator, it works fine after getting Row 1 onto its own line. Is that something we need to tell Acumatica to get corrected, or is it somehow under our control?
XML doesn't need line breaks to be valid. Can you share the original XML if not confidential?
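A quick demonstration of that point in Python: a whole document on one physical line parses just fine, since line breaks are only optional whitespace between tokens.

```python
import xml.etree.ElementTree as ET

# An entire document on one line is still well-formed XML.
one_line = '<rows><row id="1"/><row id="2"/></rows>'
root = ET.fromstring(one_line)
print(len(root))  # number of <row> children
```

So if splitting Row 1 onto its own line changes the result, the validator (or Excel) is likely choking on something else near that boundary, not the missing line break itself.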
The whole thing is too big, but attached are the header, first, and last rows. Let me know if there is a better way to send the full 24 MB file.