[Lazarus] Writing >1000 TBufDataset records to file is extremely slow
Marc Santhoff
M.Santhoff at web.de
Mon Mar 27 09:07:49 CEST 2017
On So, 2017-03-26 at 23:53 +0200, Werner Pamler via Lazarus wrote:
> Trying to extend the import/export example of fpspreadsheet from a dBase
> table to a TBufDataset, I came across this issue with TBufDataset: while
> data are posted to the database as quickly as usual, writing to file
> takes extremely long once there are more than a few thousand records.
>
> Run the demo attached below. On my system, I measure these (non-linearly
> scaling) execution times for writing the TBufDataset table to file:
>
> 1000 records -- 0.9 seconds
> 2000 records -- 8.8 seconds
> 3000 records -- 31.1 seconds
> etc.
>
> Compared to that, writing the same data to a dbf file takes the blink of
> an eye. Is there anything I am doing wrong? Or should I report a bug?
>
I didn't count, but you make extensive use of the Random() function.
Could that be the cause of the slowness?
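A quick way to rule that out would be timing Random() in isolation, with
roughly as many calls as your loop makes per record (about a dozen, if I
counted the quoted code right). Just a sketch — NUM_RECORDS and the call
count are assumptions taken from your demo:

```pascal
program TestRandomSpeed;
{$mode objfpc}
uses
  SysUtils;
const
  NUM_RECORDS = 3000;        // same order of magnitude as your slowest run
  CALLS_PER_RECORD = 12;     // roughly the Random() calls per record in your loop
var
  i, j, dummy: Integer;
  t0: TDateTime;
begin
  Randomize;
  t0 := Now;
  for i := 1 to NUM_RECORDS do
    for j := 1 to CALLS_PER_RECORD do
      dummy := Random(9000); // same style of call as for the Salary field
  WriteLn(Format('Random() alone: %.3f s', [(Now - t0) * SecsPerDay]));
end.
```

If that finishes in a few milliseconds, the time must be going into the
dataset itself rather than into the random data generation.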
HTH,
Marc
[...]
> FExportDataset.Open;
>
> // Random data
> for i := 1 to NUM_RECORDS do begin
>   if (i mod 100 = 0) then
>     WriteLn(Format('Adding record %d...', [i]));
>   FExportDataset.Insert;
>   FExportDataset.FieldByName('Last name').AsString := LAST_NAMES[Random(NUM_LAST_NAMES)];
>   FExportDataset.FieldByName('First name').AsString := FIRST_NAMES[Random(NUM_FIRST_NAMES)];
>   FExportDataset.FieldByName('City').AsString := CITIES[Random(NUM_CITIES)];
>   FExportDataset.FieldByName('Birthday').AsDateTime := startDate - Random(maxAge);
>   FExportDataset.FieldByName('Salary').AsFloat := 1000 + Random(9000);
>   FExportDataset.FieldByName('Size').AsFloat := (160 + Random(50)) / 100;
>   FExportDataSet.FieldByName('Work begin').AsDateTime := 40000 + EncodeTime(6 + Random(4), Random(60), Random(60), 0);
>   FExportDataSet.FieldByName('Work end').AsDateTime := EncodeTime(15 + Random(4), Random(60), Random(60), 0);
>   FExportDataset.Post;
> end;