A very nice caching trick: store cached results in the database and save memory! (repost)

[b][url]http://www.bigbold.com/snippets/posts/show/3286[/url][/b]
Memcached is not very easy to introduce to a large Rails installation. It also chews up a lot of memory on the box, and overall the cached-model approach did not work the way I needed it to. Basically, I have just a "few" queries that I need to cache, because pagination sucks just that bad in Rails.
So I built my own cache, similar to how I build them in PHP, except instead of a disk cache I am using MySQL itself to cache its own results.

First, we need a table to hold all this info (note the ‘blob’ field)
[code]CREATE TABLE `cacheditems` (
`id` int(11) NOT NULL auto_increment,
`cachekey` varchar(255) default NULL,
`created` datetime default NULL,
`expires` datetime default NULL,
`content` longblob,
`cachehit` int(11) NOT NULL,
PRIMARY KEY (`id`),
KEY `cacheditems_cachekey_index` (`cachekey`),
KEY `cacheditems_created_index` (`created`),
KEY `cacheditems_expires_index` (`expires`)
)[/code]

Then we create a model called “cacheditem” which has the following functions
[code]require 'digest/md5'

class Cacheditem < ActiveRecord::Base

  # Returns the cached row for this query if it exists and has not expired.
  def self.checkfor(sql)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% checking for key #{key}"
    Cacheditem.find(:first, :conditions => ["cachekey = ? AND expires > NOW()", key])
  end

  # Unmarshals and returns the cached result, bumping its hit counter.
  def self.getcached(sql)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% getting by key #{key}"
    getc = Cacheditem.find(:first, :conditions => ["cachekey = ?", key])
    Cacheditem.update(getc.id, :cachehit => getc.cachehit + 1)
    Cacheditem.delete_all "expires < NOW()" # clean out expired entries
    Marshal.load(getc.content)
  end

  # Marshals the result set into the content blob and returns it unchanged.
  def self.storeresult(sql, result)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    logger.info "%%% storing by key #{key}"
    logger.level = 4 # keeps the marshalled blob out of the logs
    ci = new
    ci.cachekey = key
    ci.created  = Time.now
    ci.expires  = 30.minutes.from_now # change as needed
    ci.content  = Marshal.dump(result)
    ci.cachehit = 0
    ci.save
    result
  end

end[/code]
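For clarity, here is a minimal, Rails-free sketch of the cache-key scheme the model relies on: the query (a SQL string or a [sql, binds] array) is serialized with Marshal and hashed with MD5, so identical queries always map to the same key. The cache_key_for helper below is my own name, purely for illustration.

```ruby
require 'digest/md5'

# Derive a fixed-length cache key from any query representation.
# Marshal.dump gives a stable byte serialization, MD5 shrinks it
# to a 32-character hex string suitable for the cachekey column.
def cache_key_for(sql)
  Digest::MD5.hexdigest(Marshal.dump(sql))
end

key1 = cache_key_for(["SELECT * FROM users WHERE id = ?", 1])
key2 = cache_key_for(["SELECT * FROM users WHERE id = ?", 1])
key3 = cache_key_for(["SELECT * FROM users WHERE id = ?", 2])

puts key1.length   # 32
puts key1 == key2  # true  (same query, same key)
puts key1 == key3  # false (different bind value, different key)
```

Because the whole statement (including bind values) goes through Marshal, two queries that differ only in a parameter get distinct cache entries, which is exactly what you want for paginated queries.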
Then, in application.rb I added the following function
[code]def find_by_sql_cache(sql)
  if Cacheditem.checkfor(sql)
    Cacheditem.getcached(sql)
  else
    result = connection.select_all(sanitize_sql(sql), "#{name} Load").collect! { |record| instantiate(record) }
    Cacheditem.storeresult(sql, result)
  end
end[/code]

Just append "_cache" to any "find_by_sql" call you need to cache, and there you are.
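The check / fetch-or-store flow above can be sketched without Rails as a plain in-memory read-through cache. TinyCache below is a hypothetical stand-in of my own: the real version stores Marshal blobs in the cacheditems table, but the control flow is the same.

```ruby
require 'digest/md5'

# Illustrative read-through cache mirroring find_by_sql_cache's logic:
# look up by key; on a miss, run the query block and store its result.
class TinyCache
  def initialize
    @store = {} # key => [expires_at, value]
  end

  def fetch(sql, ttl = 30 * 60)
    key = Digest::MD5.hexdigest(Marshal.dump(sql))
    entry = @store[key]
    if entry && entry[0] > Time.now            # hit, still fresh
      entry[1]
    else                                       # miss: run the "query"
      result = yield
      @store[key] = [Time.now + ttl, result]
      result
    end
  end
end

cache = TinyCache.new
rows = cache.fetch("SELECT 1") { [{ "one" => 1 }] }    # miss: block runs
rows = cache.fetch("SELECT 1") { raise "not reached" } # hit: block skipped
```

The second fetch never executes its block, which is the whole point: the expensive query runs at most once per TTL window.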

This works very fast, very well, and doesn't hog your memory. It cleans up after itself in the database, and perhaps it does that too often. It would be easy to add a standard garbage-collection function that runs at random instead, but I felt this gave me much better stats on the actual thirty-minute cache.
If you use Zabbix for monitoring your network, you can build fun graphs of cache statistics from this table via your zabbix_agentd.conf.
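The random garbage collection the author alludes to could be sketched like this (a hypothetical variant of my own, not from the original post): sweep expired rows on roughly one call in N instead of on every cache hit, trading a little staleness in the table for fewer DELETE statements.

```ruby
# Hypothetical probabilistic cleanup: on average one call in GC_ODDS
# triggers the expired-row sweep, instead of sweeping on every hit.
GC_ODDS = 100

def maybe_collect_garbage
  if rand(GC_ODDS).zero?
    # In the real model this branch would run:
    #   Cacheditem.delete_all "expires < NOW()"
    true   # swept this time
  else
    false  # skipped this time
  end
end
```

With GC_ODDS = 100, roughly 1% of cache lookups pay the cleanup cost; tune the constant to taste.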