We are building an AJAX application in which a user's input is submitted to a PHP script for processing. We currently write every request to a log file for tracking. I would like to move this tracking into a database table, but I do not want to run an INSERT statement after every request. What I would like to do is set up a 'queue' of transactions (inserts and updates) that need to be processed on the MySQL database. I would then set up a cron job or process to check and process the transactions in the queue. Is there something out there that we could build upon, or do we have to just write to plain ol' text log files and process them?
Solution
You want Gearman - it'll queue the requests and insert them as and when the database is ready for them, so you don't overload your DB server.
Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events. In other words, it is the nervous system for how distributed processing communicates.
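The shape of it for your case: the PHP script that handles the AJAX request fires a background job and returns immediately, and a separate worker process drains the queue into MySQL at its own pace. Here's a minimal sketch using the pecl/gearman extension; the job name log_request, the table and column names, and the gearmand address are my own placeholders, not anything Gearman prescribes. Client side first (inside your AJAX-handling script):

    <?php
    // Queue the log entry as a background job and return without
    // waiting for the database at all.
    $userId    = isset($_POST['user_id']) ? (int) $_POST['user_id'] : 0;
    $userInput = isset($_POST['input']) ? $_POST['input'] : '';

    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730);   // default gearmand port
    $client->doBackground('log_request', json_encode(array(
        'user_id' => $userId,
        'input'   => $userInput,
        'ts'      => time(),
    )));

And the worker, a separate long-running process:

    <?php
    // Pull jobs off the queue and do the actual INSERTs at whatever
    // rate MySQL can comfortably absorb.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);

    $pdo = new PDO('mysql:host=localhost;dbname=loggingDB', 'user', 'pass');
    $insert = $pdo->prepare(
        'INSERT INTO logTable (user_id, input, created_at)
         VALUES (?, ?, FROM_UNIXTIME(?))'
    );

    $worker->addFunction('log_request', function (GearmanJob $job) use ($insert) {
        $entry = json_decode($job->workload(), true);
        $insert->execute(array($entry['user_id'], $entry['input'], $entry['ts']));
    });

    while ($worker->work());   // block and handle jobs forever

Run one or more workers under supervisord or similar; if MySQL stalls, jobs simply sit in gearmand's queue instead of piling up connections on your DB server.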
There's a recent (and high-quality) post about using databases for logging here, which (summarised) says:
Use MyISAM with concurrent inserts
Rotate tables daily and use UNION to query
Use delayed inserts with MySQL or a job-processing agent like Gearman (although MySQL has a limit on the number of delayed rows it will queue before silently dropping them!); see the sketch after this list
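On the delayed-insert option specifically: it is just the DELAYED keyword on an ordinary INSERT, and it only works with MyISAM-style engines (it was later deprecated in MySQL 5.6 and removed in 5.7, so check your version). A sketch, reusing the loggingDB/logTable names from the cron example below and the same request variables as in the client sketch above:

    <?php
    // INSERT DELAYED returns as soon as MySQL has queued the row internally;
    // the actual write happens when the table is free. Rows still in that
    // in-memory queue are lost if mysqld dies, which is the trade-off.
    $pdo = new PDO('mysql:host=localhost;dbname=loggingDB', 'user', 'pass');
    $pdo->exec(sprintf(
        "INSERT DELAYED INTO logTable (user_id, input, created_at) VALUES (%d, %s, NOW())",
        (int) $userId,
        $pdo->quote($userInput)
    ));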
If you really want to avoid a queueing daemon altogether, you could have PHP append the raw SQL statements to a file and import them with a cron job like this:

mysql loggingDB < fullLog.sql && > fullLog.sql

(The mysql client takes a database name, not a table name, so the statements in the file have to name the table themselves. The && > fullLog.sql truncates the file only if the import succeeded; there is still a small race window in which lines appended between the import and the truncation are lost, which you can close by renaming the file before importing it.)
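On the PHP side that can be as simple as appending one complete statement per line, with an exclusive lock so concurrent AJAX requests don't interleave partial lines. A sketch; the path and the addslashes-based escaping are stand-ins, so use proper quoting in production:

    <?php
    // Append one self-contained INSERT per line to the file the cron job
    // imports. LOCK_EX prevents concurrent requests from interleaving writes.
    // $userId / $userInput as in the client sketch above.
    $line = sprintf(
        "INSERT INTO logTable (user_id, input, created_at) VALUES (%d, '%s', NOW());\n",
        (int) $userId,
        addslashes($userInput)   // crude escaping, for a sketch only
    );
    file_put_contents('/var/www/logs/fullLog.sql', $line, FILE_APPEND | LOCK_EX);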