I am exporting a large DataTable to an Excel sheet in .xls format. I call a stored procedure that returns a DataTable; the procedure returns about 750,000 rows and 93 columns.
On the back end, if I pass the parameters to the stored procedure directly, it takes about 8 minutes to complete, since it is a long-running process.
But when I call the procedure from the front end, a "System.OutOfMemoryException" is thrown while the records are being assigned to the DataTable.
When I googled it, I learned that the exception occurs because the size of the data exceeds the server's available RAM. Exporting a small DataTable works fine.
What is an alternative way to do this? Please suggest one. My code is below.
C#:
DataTable dt = BindDatatable();
Response.ClearContent();
Response.Buffer = true;
Response.AddHeader("content-disposition", string.Format("attachment; filename={0}", "MBQ_AUTO.xls"));
Response.ContentType = "application/ms-excel";
//Response.ContentType = "application/octet-stream";
//Response.ContentType = "text/tab-separated-values";
string str = string.Empty;
// header row: column names separated by tabs
foreach (DataColumn dtcol in dt.Columns)
{
    Response.Write(str + dtcol.ColumnName);
    str = "\t";
}
Response.Write("\n");
int rowCount = 0;
// data rows: one tab-separated line per DataRow
foreach (DataRow dr in dt.Rows)
{
    str = "";
    for (int j = 0; j < dt.Columns.Count; j++)
    {
        Response.Write(str + Convert.ToString(dr[j]));
        str = "\t";
    }
    Response.Write("\n");
    // flush the response every 1024 rows
    if (unchecked(++rowCount % 1024 == 0))
        Response.Flush();
}
Response.End();
You should not use a DataTable for this kind of problem: a DataTable stores the entire result set in memory, which is exactly what causes the OutOfMemoryException here.
You should use a DataReader instead, which fetches rows from the database one at a time, so only a single row needs to be held in memory while you write it to the response.
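As a rough sketch of that approach: the same tab-separated export can be streamed straight from a SqlDataReader, flushing periodically so the full result set never has to fit in server memory. The stored procedure name ("usp_GetExportData") and connection string name ("YourDb") below are placeholders; substitute your own.

using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.Web;

public partial class ExportPage : System.Web.UI.Page
{
    protected void ExportToExcel()
    {
        Response.ClearContent();
        Response.Buffer = false; // stream the output instead of buffering the whole file
        Response.AddHeader("content-disposition", "attachment; filename=MBQ_AUTO.xls");
        Response.ContentType = "application/ms-excel";

        string connStr = ConfigurationManager.ConnectionStrings["YourDb"].ConnectionString; // placeholder name

        using (SqlConnection con = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand("usp_GetExportData", con)) // placeholder procedure name
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 0; // the procedure runs for several minutes
            con.Open();

            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // header row: column names separated by tabs
                string sep = string.Empty;
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    Response.Write(sep + reader.GetName(i));
                    sep = "\t";
                }
                Response.Write("\n");

                // data rows: write each row as it is read, never materializing the full set
                int rowCount = 0;
                while (reader.Read())
                {
                    sep = string.Empty;
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        Response.Write(sep + Convert.ToString(reader[i]));
                        sep = "\t";
                    }
                    Response.Write("\n");

                    // flush every 1024 rows so only a small chunk is buffered at any time
                    if (++rowCount % 1024 == 0)
                        Response.Flush();
                }
            }
        }
        Response.End();
    }
}

With Response.Buffer set to false and periodic Flush calls, the browser starts downloading while the query is still being read, so memory use stays roughly constant no matter how many rows the procedure returns.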