[Mono-dev] Performance issue with DataTable.Load on "large" data sets
Nicklas Overgaard
nicklas at isharp.dk
Thu Apr 7 08:58:53 EDT 2011
Hi mono-devers!
I'm currently working on a rather large web project that uses a
combination of Mono 2.10.1 and MySQL.
Over the past week, I have observed that loading "large" data sets (5000+
rows) from MySQL into a DataTable takes a very long time.
It's done somewhat like this:
<code>
// comm is an existing MySqlCommand; query holds the SELECT statement
comm.CommandText = query;
comm.CommandTimeout = MySQLConnection.timeout;
using (MySqlDataReader reader = (MySqlDataReader)comm.ExecuteReader())
{
    DataTable dt = new DataTable();
    dt.Load(reader); // <- this is killing mono
}
</code>
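For anyone who wants to reproduce the slowdown without a database, here is a minimal, self-contained benchmark sketch. It builds a 5000-row in-memory table and times DataTable.Load over a DataTableReader, which implements the same IDataReader interface as MySqlDataReader, so it should exercise the same Load code path. The column layout and row count are arbitrary choices, not taken from the original program:

```csharp
// Self-contained timing sketch for DataTable.Load (no MySQL required).
using System;
using System.Data;
using System.Diagnostics;

class LoadBenchmark
{
    static void Main()
    {
        // Build a synthetic source table with 5000 rows.
        var source = new DataTable("source");
        source.Columns.Add("id", typeof(int));
        source.Columns.Add("name", typeof(string));
        source.Columns.Add("value", typeof(double));

        for (int i = 0; i < 5000; i++)
            source.Rows.Add(i, "row" + i, i * 0.5);

        // DataTableReader implements IDataReader, like MySqlDataReader,
        // so DataTable.Load takes the same path as in the MySQL case.
        using (DataTableReader reader = source.CreateDataReader())
        {
            var dest = new DataTable();
            var sw = Stopwatch.StartNew();
            dest.Load(reader);
            sw.Stop();
            Console.WriteLine("Loaded {0} rows in {1} ms",
                              dest.Rows.Count, sw.ElapsedMilliseconds);
        }
    }
}
```

Running this same binary on Mono and on .NET should make the gap visible without any MySQL setup.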
I have created a small test program, compiled it on my Linux machine and
executed it.
It takes 15 seconds to perform this operation under Mono - but on Windows
it takes only 0.4 seconds (with the same executable, fetching the same
data). I have profiled the application on Windows, and it seems that
the .NET framework is using specialized methods for loading data from a
data reader.
I have been looking through the Mono implementation of DataTable.Load,
and I can see that a lot of validation and other work is going on, which
could explain the huge difference. I'm also working on a Mono log
profiler trace, to dig a little deeper.
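For reference, a log profiler trace like the one mentioned above can be collected with something along these lines (flag names per the Mono 2.10 log profiler; the file names are placeholders, adjust to your setup):

```shell
# Collect a call-level profile of the test program into dtload.mlpd
mono --profile=log:calls,output=dtload.mlpd TestProgram.exe

# Summarize the trace; the method summary should show where
# DataTable.Load spends its time
mprof-report dtload.mlpd > dtload.txt
```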
Would it be OK if I tried to patch the current Mono implementation to
reach the same speed as .NET? The reason for asking is that I know I
cannot contribute to Mono if I have seen the actual .NET code (but does
a profiler result count as "seeing the code"?)
Best regards,
Nicklas Overgaard