Crude results from the benchmark below: roughly 2x faster writes, 3x faster reads.
Here is a simple Python benchmark you can adapt to your purposes; I was looking at how well each performs at simply setting and retrieving values:
#!/usr/bin/env python2.7
import sys, time
from pymongo import Connection
import redis

# connect to redis & mongodb
redis = redis.Redis()
mongo = Connection().test
collection = mongo['test']
collection.ensure_index('key', unique=True)

def mongo_set(data):
    for k, v in data.iteritems():
        collection.insert({'key': k, 'value': v})

def mongo_get(data):
    for k in data.iterkeys():
        val = collection.find_one({'key': k}, fields=('value',)).get('value')

def redis_set(data):
    for k, v in data.iteritems():
        redis.set(k, v)

def redis_get(data):
    for k in data.iterkeys():
        val = redis.get(k)

def do_tests(num, tests):
    # setup dict with key/values to retrieve
    data = {'key' + str(i): 'val' + str(i)*100 for i in range(num)}
    # run tests
    for test in tests:
        start = time.time()
        test(data)
        elapsed = time.time() - start
        print "Completed %s: %d ops in %.2f seconds : %.1f ops/sec" % (test.__name__, num, elapsed, num / elapsed)

if __name__ == '__main__':
    num = 1000 if len(sys.argv) == 1 else int(sys.argv[1])
    tests = [mongo_set, mongo_get, redis_set, redis_get] # order of tests is significant here!
    do_tests(num, tests)

Results with mongodb 1.8.1, redis 2.2.5, and the latest pymongo/redis-py:
$ ./cache_benchmark.py 10000
Completed mongo_set: 10000 ops in 1.40 seconds : 7167.6 ops/sec
Completed mongo_get: 10000 ops in 2.38 seconds : 4206.2 ops/sec
Completed redis_set: 10000 ops in 0.78 seconds : 12752.6 ops/sec
Completed redis_get: 10000 ops in 0.89 seconds : 11277.0 ops/sec
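Note that the script above targets Python 2.7 and pymongo's since-removed `Connection` API. As a rough sketch, the same timing harness ported to Python 3 looks like this; here a plain in-memory dict (`dict_set`/`dict_get` are placeholder names, not part of the original) stands in for the database clients so the scaffold runs without any servers. To reproduce the real benchmark, swap those two functions for redis-py / pymongo calls along the lines shown above.

```python
import time

def do_tests(num, tests):
    # set up dict with key/values to store and retrieve
    data = {'key' + str(i): 'val' + str(i) * 100 for i in range(num)}
    # run each test function against the same data and report throughput
    for test in tests:
        start = time.time()
        test(data)
        elapsed = time.time() - start
        print("Completed %s: %d ops in %.2f seconds : %.1f ops/sec"
              % (test.__name__, num, elapsed, num / elapsed))

# stand-in backend: replace with your redis-py / pymongo client calls
store = {}

def dict_set(data):
    for k, v in data.items():
        store[k] = v

def dict_get(data):
    for k in data:
        val = store[k]

do_tests(10000, [dict_set, dict_get])
```

As in the original, run the set test before the get test so the values exist when they are read back.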
Take these numbers with a grain of salt, of course! If you program in another language, use other clients or different implementations, etc., your results will vary wildly. Not to mention that your usage will be completely different! Your best bet is to benchmark them in precisely the manner you intend to use them. As a corollary, you will probably figure out the best way to make use of each. Always benchmark for yourself!