I have a master table with the most frequently accessed data.
However, the data in that table is constantly changing, because it's supplied by users in an unpredictable manner. This means I won't be able to cache the queries, so there will be a continuously high load on the DB (right?).
So I'm thinking of creating a few additional tables that will function as logs, recording each change to the data as a separate row.
Then a cron job will run once a day at midnight, consolidating the logs and writing the results into the main table. That way the main table is updated only once a day, so all queries during the day can be served from cache.
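To make the idea concrete, here is a minimal sketch of what I have in mind, using SQLite as a stand-in and hypothetical table/column names (`main`, `changes_log`, `key`, `value`): user writes go only into the log table, and a nightly `consolidate()` step applies the latest logged value per key to the main table and clears the log.

```python
import sqlite3

# Hypothetical schema: `main` holds the current value per key,
# `changes_log` records each user update as a separate row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE main (key TEXT PRIMARY KEY, value INTEGER)")
conn.execute(
    "CREATE TABLE changes_log "
    "(id INTEGER PRIMARY KEY AUTOINCREMENT, key TEXT, value INTEGER)"
)

# During the day, user updates go into the log only;
# `main` is never touched, so queries against it stay cacheable.
for key, value in [("a", 1), ("a", 2), ("b", 10)]:
    conn.execute(
        "INSERT INTO changes_log (key, value) VALUES (?, ?)", (key, value)
    )

def consolidate(conn):
    """Nightly cron step: upsert the latest logged value per key
    into `main`, then empty the log."""
    conn.execute("""
        INSERT INTO main (key, value)
        SELECT key, value FROM changes_log
        WHERE id IN (SELECT MAX(id) FROM changes_log GROUP BY key)
        ON CONFLICT(key) DO UPDATE SET value = excluded.value
    """)
    conn.execute("DELETE FROM changes_log")
    conn.commit()

consolidate(conn)
print(dict(conn.execute("SELECT key, value FROM main ORDER BY key")))
# → {'a': 2, 'b': 10}
```

(The "latest row per key wins" rule is just one possible consolidation policy; if the logged changes are increments rather than replacements, the `SELECT` would aggregate with `SUM` instead.)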
Is this a good strategy?