I have written an application that accumulates lightning strike data and saves it to an Access-type database. The problem is that over the past 2 years it has gotten rather large: 1.45 GB, to be precise. Three days of data logging can net me another 123K strikes, and that is on a non-stormy day (strikes all the way in Africa, picked up from NY).
So needless to say, when I start the application and load the initial data, which consists of the past 6 days for a little history, it takes a LOOOONNNG time, upwards of several minutes. Usually around 300K records are returned. All of that time is spent in the last line of the following:
Code:
String SQL = getEventGatheringSQL(startTime, stopTime, m_DatabaseThreshold);
myAdapter = new OleDbDataAdapter(SQL, mainConnection);
myDataset = new DataSet();
//Finally pull the data into the dataset...
myAdapter.Fill(myDataset);
The SQL string generator looks like this:
Code:
String SQLString =
"SELECT * FROM LightningEvents" +
" WHERE (([eventTime] > #" + startTime + "#)" +
" AND ([eventTime] < #" + stopTime + "#))";
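For reference, the same query can be sketched with OleDb parameters instead of concatenated date literals; concatenation makes the `#...#` literal depend on the machine's date formatting, while parameters sidestep that. Jet binds OleDb parameters by position with `?` placeholders, so the Add order must match:

```csharp
using System;
using System.Data.OleDb;

// Sketch: same WHERE clause, but with positional '?' parameters.
// The parameter names passed to AddWithValue are ignored by Jet;
// only the order matters.
static OleDbCommand BuildEventCommand(OleDbConnection conn,
                                      DateTime startTime, DateTime stopTime)
{
    OleDbCommand cmd = new OleDbCommand(
        "SELECT * FROM LightningEvents" +
        " WHERE (([eventTime] > ?) AND ([eventTime] < ?))", conn);
    cmd.Parameters.AddWithValue("@startTime", startTime);
    cmd.Parameters.AddWithValue("@stopTime", stopTime);
    return cmd;
}
```

It drops into the existing code as `myAdapter = new OleDbDataAdapter(BuildEventCommand(mainConnection, startTime, stopTime));`.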
Meanwhile one core of the CPU is pegged while loading, but the hard disk is not. Something seems to be limiting performance, and I would like to learn how to improve load times.
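One thing worth checking is whether [eventTime] is indexed; without an index, Jet has to scan every row in the table to evaluate the date-range filter on each load. A one-time DDL sketch to add an index (assuming none exists yet):

```csharp
using System.Data.OleDb;

// One-time DDL against the existing .mdb: with an index on eventTime,
// Jet can seek the requested date range instead of scanning the table.
static void IndexEventTime(OleDbConnection conn)
{
    using (OleDbCommand cmd = new OleDbCommand(
        "CREATE INDEX idxEventTime ON LightningEvents ([eventTime])", conn))
    {
        cmd.ExecuteNonQuery();
    }
}
```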
I am currently connecting to the database using a basic file connection. Is there a better, higher-performance way I should be doing this, such as creating a connection through the Control Panel and somehow using that? If that would improve performance, I would appreciate knowing how to build up a connection string, etc.
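For reference, the basic file connection boils down to an OleDb connection string pointed at the .mdb file (sketch; the path here is a placeholder for wherever the database actually lives):

```csharp
using System.Data.OleDb;

// Jet 4.0 provider for .mdb files; newer .accdb files would use the
// "Microsoft.ACE.OLEDB.12.0" provider instead. The path is a placeholder.
string connString =
    "Provider=Microsoft.Jet.OLEDB.4.0;" +
    @"Data Source=C:\LightningData\strikes.mdb;";

OleDbConnection mainConnection = new OleDbConnection(connString);
mainConnection.Open();
```

A DSN set up through the Control Panel is an ODBC mechanism, so it would not obviously be faster than this kind of direct OleDb file connection.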
But then I have the problem of making an installer for dummies that can get this set up for them.
I am now in the process of adding a threshold that a user can set to keep huge amounts of data from being saved, so only larger, more powerful strikes go into the database. This should drop the daily rate from 40K down to under 4K, or better. Cleaning up the old database may be a struggle, though.
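The cleanup could presumably be a one-off DELETE against the same threshold (sketch; [strength] is a stand-in name for whichever column holds the strike magnitude):

```csharp
using System.Data.OleDb;

// One-off cleanup: drop rows below the new threshold.
// [strength] is a placeholder for the real magnitude column name.
static int PurgeWeakStrikes(OleDbConnection conn, double threshold)
{
    using (OleDbCommand cmd = new OleDbCommand(
        "DELETE FROM LightningEvents WHERE [strength] < ?", conn))
    {
        cmd.Parameters.AddWithValue("@threshold", threshold);
        return cmd.ExecuteNonQuery();
    }
}
```

One wrinkle: a Jet file does not shrink when rows are deleted, so a Compact & Repair afterwards is needed to actually reclaim the disk space.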
-Mark