Friday, February 24, 2012

Large number of rows issue

Hi,

There is a table with the following structure:

Date-Time of Operation | Message | Details | Reason | Username | IP | MAC-Address

A user can run a query filtered on conditions over these columns. The maximum number of rows that can be returned is 1 billion.

By default the data returned to the view is sorted on Date-Time. The user can further sort the view on Message, Details, Reason, Username, IP, or MAC-Address.

Suppose the user first retrieves the 1 billion records, and then some records are inserted into the database. The user now sorts on the IP column. If I fire the query again for the new sort order, the newly inserted data will be picked up as well, so the user sees a different result set.

One way is to keep the data in my application's memory, but the number of records is huge, so this is not a good solution. What should I do to work with a disconnected, consistent set of data without actually holding everything in memory?
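The inconsistency can be sketched in T-SQL (the table name AuditLog and the filter are illustrative, not the actual application code):

```sql
-- First run: user's filtered query, default sort on the date column.
SELECT [Date-Time of Operation], Message, Details, Reason, Username, IP, [MAC-Address]
FROM AuditLog                  -- hypothetical table name
WHERE Username = @Username     -- some user-chosen condition
ORDER BY [Date-Time of Operation];

-- Later run with a different sort: rows inserted in between also
-- satisfy the WHERE clause, so the second result set differs.
SELECT [Date-Time of Operation], Message, Details, Reason, Username, IP, [MAC-Address]
FROM AuditLog
WHERE Username = @Username
ORDER BY IP;
```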

Regards,
Sunil

One way would be to have a stored procedure copy this information into another table, and then present a view over that table. That way you are in control of the "loading" of the data for reporting purposes.
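A minimal sketch of that approach, assuming a snapshot table AuditLogSnapshot with the same columns as the source table (all names here are hypothetical):

```sql
-- Stored procedure that freezes the query result into a reporting table.
CREATE PROCEDURE LoadReportSnapshot
    @Username NVARCHAR(128)
AS
BEGIN
    -- Clear the previous snapshot, then copy the current result set.
    TRUNCATE TABLE AuditLogSnapshot;

    INSERT INTO AuditLogSnapshot
    SELECT [Date-Time of Operation], Message, Details, Reason, Username, IP, [MAC-Address]
    FROM AuditLog
    WHERE Username = @Username;
END;
GO

-- The view the application reads from; re-sorting it never picks up
-- rows inserted into AuditLog after the snapshot was taken.
CREATE VIEW ReportData AS
    SELECT [Date-Time of Operation], Message, Details, Reason, Username, IP, [MAC-Address]
    FROM AuditLogSnapshot;
```

The user can then sort ReportData on any column and always see the same rows, until the procedure is run again to refresh the snapshot.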

Hi,

I had a similar case in my company.

The solution in that case was to create a second instance on a different server which keeps denormalized data partitioned on a date field, and to load the most recent partition into memory.

The query should also filter on the date field, so that the engine can use that criterion against the partitioning and the index.
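A rough sketch of such date-based partitioning (partition boundaries and names are illustrative):

```sql
-- Monthly range partitioning on the date column.
CREATE PARTITION FUNCTION pfByMonth (DATETIME)
    AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-02-01', '2012-03-01');

CREATE PARTITION SCHEME psByMonth
    AS PARTITION pfByMonth ALL TO ([PRIMARY]);

-- A query that filters on the partitioning column lets the engine
-- touch only the relevant partition(s) instead of the whole table.
SELECT Message, IP
FROM AuditLogReporting
WHERE [Date-Time of Operation] >= '2012-02-01'
ORDER BY IP;
```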

The OLTP database server was using 4 GB of RAM, whereas this second instance was using 12 GB of RAM solely for these reporting processes.

Alternatively, you could create a covering index on the table. But this will increase the size of the data on disk.
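For example, a covering index keyed on the date column that INCLUDEs the remaining columns, so the query above can be answered from the index alone (index and table names are hypothetical):

```sql
-- Covering index sketch: the INCLUDE columns are stored at the leaf
-- level, so the query never has to look up the base table rows,
-- at the cost of roughly duplicating the table's storage.
CREATE NONCLUSTERED INDEX IX_AuditLog_Date_Covering
    ON AuditLog ([Date-Time of Operation])
    INCLUDE (Message, Details, Reason, Username, IP, [MAC-Address]);
```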

Eralper
