Thursday, March 29, 2012

Configure DefaultBufferMaxSize and DefaultBufferMaxRows

I want to improve the performance of my SSIS packages. I'm left with the last 3 bottlenecks: 3 huge tables. Here are the details:

Table 1:

Rows: 40 million+

Size of each row: 1-2 KB

I had set DefaultBufferMaxSize to the maximum (100 MB) and DefaultBufferMaxRows to 100,000, considering that the average row size is 1 KB.

Table 2:

Rows: 17 million

Row size: 280 bytes

Again I set DefaultBufferMaxSize to 100 MB and DefaultBufferMaxRows to 300,000, applying the same logic.

Table 3:

Rows: 59 million

Size per row: 85 bytes

As the row size was small, I set DefaultBufferMaxRows to 1,000,000, which is still less than the 100 MB I had set as DefaultBufferMaxSize.

The first two have given a considerable speedup, but the third one (as I feared) has gone dead slow.

Any ideas? And is one huge buffer better, or many small buffers?
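For what it's worth, here is a minimal sketch of the sizing arithmetic behind those settings. It only reproduces the back-of-the-envelope math from the question; the data flow engine sizes buffers from its own estimated row width, so treat the numbers as rough estimates rather than what will actually be allocated.

```python
# Back-of-the-envelope check of the buffer settings described above.
# Assumption: the engine sizes each buffer at roughly
# min(DefaultBufferMaxRows * estimated row width, DefaultBufferMaxSize);
# the exact figure depends on the engine's own row-width estimate.

DEFAULT_BUFFER_MAX_SIZE = 100 * 1024 * 1024  # 100 MB, the cap mentioned above

tables = [
    # (name, assumed row size in bytes, DefaultBufferMaxRows)
    ("table 1", 1024, 100_000),
    ("table 2", 280, 300_000),
    ("table 3", 85, 1_000_000),
]

for name, row_bytes, max_rows in tables:
    requested = row_bytes * max_rows
    rows_that_fit = min(max_rows, DEFAULT_BUFFER_MAX_SIZE // row_bytes)
    print(f"{name}: ~{requested / 2**20:.0f} MB requested per buffer, "
          f"~{rows_that_fit:,} rows fit under the 100 MB cap")
```

All three configurations pass this check, so the sizing math alone doesn't explain the slowdown on table 3.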

Can you describe what transformations you're applying to the data from table 3? And what source and destination adapters you're using?

It's an OLE DB source and destination. The data flow tasks I mentioned (and a few other similar ones) are called from a sequence container in the control flow. The transformation is minimal, though there are 2 columns compared in the WHERE clause (just 1 in the others), and I'm not sure about the indexes on those, as I'm not in control of that DB. Yes, that may be a reason, but can you suggest anything apart from that?

I'd suggest that you try to determine the location of the bottleneck:

To check the source, try writing out the data from the source into a Raw File, and replacing the source adapter in your existing package with a Raw File Source adapter that's pointed at this file.

To check the destination, replace the existing destination adapter with an unconfigured Export Column transform.

To check the transforms, do both of the previous things.
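As a rough way of reading the results of those three test runs, here is a minimal sketch. Every timing in it is a hypothetical placeholder rather than a measurement from this package, and the attribution is only approximate, since the stages of a data flow overlap rather than run one after another.

```python
# Hypothetical timings (seconds) for the test variants described above;
# these are placeholders, not real measurements -- substitute your own runs.
baseline        = 1800.0  # original: OLE DB source -> transforms -> OLE DB destination
raw_file_source = 1750.0  # source adapter swapped for a Raw File Source
no_destination  = 300.0   # destination swapped for a do-nothing transform
transforms_only = 250.0   # both substitutions at once

source_cost      = baseline - raw_file_source  # time attributable to reading the source
destination_cost = baseline - no_destination   # time attributable to writing the destination

print(f"Source read overhead:       ~{source_cost:.0f}s")
print(f"Destination write overhead: ~{destination_cost:.0f}s")
print(f"Transform-only run time:    ~{transforms_only:.0f}s")
# With these placeholder numbers, the destination would be the main suspect.
```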
