ODBC Error in Power BI - Connection forcibly closed by remote host
Working in Power BI, pulling in OData from Acumatica. Does anyone ever get this error when trying to pull in or refresh their dataset?
@royce-lithgo - it's pretty static at this point. It is line-level data from orders. I could do an export and create a static table. Ideally I'd get around this, since it's only about 6 months of data, so I'd prefer to do that once a year at most. But I can start there if need be. It's strange that I have bigger data tables that refresh fine, though.
Another thing you could do to test if it's a dataset size problem is temporarily add something to the CONDITIONS tab on the Generic Inquiry to make the result set really small. Then try the OData refresh again.
So yes, Tim, that does fix the refresh issue. I limited it to 1,000 lines and it works like a charm, so it's definitely a size issue. And Royce, I'm afraid to run this thing in Acumatica itself for fear that it'll bog things down or freeze up the system.
Do you guys know anything about looping to get the results you want? Setting up a loop in Power BI to do multiple data pulls to get it all in one table? Count page numbers or lines or something?
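The looping idea above is essentially OData paging. As a minimal sketch, assuming the endpoint honors the standard `$top`/`$skip` query options (and the base URL and inquiry name below are placeholders, not your actual site), each pass of the loop would request one bounded page:

```python
def paged_query_urls(base_url, page_size, total_rows):
    """Build one OData URL per page so each request stays small.

    Uses the standard $top (page size) and $skip (offset) query
    options; each URL can be issued as its own pull and the results
    appended into one table.
    """
    urls = []
    for skip in range(0, total_rows, page_size):
        urls.append(f"{base_url}?$top={page_size}&$skip={skip}")
    return urls

# Hypothetical site and inquiry name, for illustration only.
urls = paged_query_urls(
    "https://example.acumatica.com/odata/Company/SalesOrderLines",
    page_size=1000,
    total_rows=3500,
)
# Produces four page URLs: $skip=0, 1000, 2000, 3000
```

In Power BI itself the same pattern can be expressed in Power Query M with `List.Generate` or a custom function invoked per page, but the URL arithmetic is identical.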
I'm not aware of any table row count limit, might be a data size issue. Do you have a lot of columns in each row and perhaps any columns with a lot of data?
For splitting and combining the data, you could try setting up multiple tables in Power BI with a different query per table, so that each table retrieves a chunk of the master table, and then combine them in Power BI. But as the data grows you'd need to keep adding new chunk tables. You'd also need some way to reliably split the data in each query so that no rows are left out.
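One reliable way to split the data, as a sketch: partition on a date field into calendar-month ranges, so the chunks are contiguous and no rows fall between them. The field name `OrderDate` below is an assumption for illustration; substitute whatever date field your inquiry exposes.

```python
from datetime import date, timedelta

def month_chunks(start, end):
    """Return (first_day, last_day) pairs, one per calendar month,
    covering start..end with no gaps between chunks."""
    current = date(start.year, start.month, 1)
    chunks = []
    while current <= end:
        if current.month == 12:
            next_month = date(current.year + 1, 1, 1)
        else:
            next_month = date(current.year, current.month + 1, 1)
        chunks.append((current, next_month - timedelta(days=1)))
        current = next_month
    return chunks

def chunk_filters(field, start, end):
    """One OData $filter string per month; each would back its own
    Power BI query so every table pulls a bounded slice."""
    return [
        f"{field} ge {a.isoformat()} and {field} le {b.isoformat()}"
        for a, b in month_chunks(start, end)
    ]

filters = chunk_filters("OrderDate", date(2023, 1, 15), date(2023, 3, 10))
# One filter string per month: January, February, March
```

Because the ranges are month-aligned, adding the next chunk as data grows is mechanical: append one more filter rather than re-balancing the existing splits.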
Still seems very odd to me to have to do this.
I've got 25 columns, so nothing insane. Some of them do have formulas, but nothing intense. I think overall I have 4 columns with formulas, and most are just something like =[SOLine.OrderQty] * (-[SOLine.InvtMult]) (setting it up to handle Credit Memos). Does that drastically increase the load?
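For what that formula is doing, here is a small sketch. It assumes InvtMult carries the sign of the inventory movement, e.g. -1 on a regular sales line and +1 on a credit memo line; verify those values against your own data before relying on this.

```python
def signed_qty(order_qty, invt_mult):
    """Mirror of the GI formula [SOLine.OrderQty] * (-[SOLine.InvtMult]).

    A single multiplication per row is cheap; a simple arithmetic
    formula like this is unlikely to be the cause of the refresh load.
    """
    return order_qty * (-invt_mult)

# Hypothetical sign convention (check your data): -1 = sales line,
# +1 = credit memo line.
sale_qty = signed_qty(5, -1)    # sales line counts as +5
credit_qty = signed_qty(5, 1)   # credit memo nets out as -5
```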
Right now what I have done is set these up in Power BI Desktop and reduced the row count in my queries to build the actual report, then uploaded to Power BI. From there, the refresh has never failed, for some reason. Maybe it also has to do with my local machine's capacity? That seems to be working for now, anyway, but I'm waiting for the day when that, too, is too big. If it gets to a year, I'll be happy, because at that point I'll just start exporting to CSV and merging tables. Once a year would be manageable.