My idea is to store each message in a Redis hash and record each message's timestamp in an add_timestamp field, but I don't know how to group the hash entries by that timestamp field by hour in Redis. Is there a better approach?
I don't plan to use MySQL, because these messages are short but extremely high in volume and frequency. I imagine large sites like BAT have similar systems: a work-order system that counts tickets submitted per hour, or Taobao's group-buy statistics, which track per-second sales of extremely hot items across different time periods. My needs are roughly the same as theirs.
It can be stored in Redis, but consider the capacity problem: Redis only has limited memory to work with. You could batch-sync older data to a NoSQL store such as MongoDB or HBase. Relatively new, frequently accessed messages are read from Redis; old, rarely accessed messages are read from the NoSQL store.
As for the number of messages per hour, just use a Redis counter. The key is something like 16.8.13-1, meaning the 1 o'clock hour of August 13, 2016, and the value is the number of messages in that hour (incremented atomically with INCR).
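The hourly counter idea above can be sketched as follows. This is a minimal illustration, assuming Python: the `hourly_counter_key` helper is hypothetical (any name works), and the key format follows the `16.8.13-1` scheme described above. The actual Redis call is shown only as a comment, since it assumes a running server and the redis-py client.

```python
import datetime

def hourly_counter_key(ts: float) -> str:
    """Build the per-hour counter key from a Unix timestamp,
    e.g. '16.8.13-1' for the 1 o'clock hour of 2016-08-13,
    matching the naming scheme suggested in the answer."""
    dt = datetime.datetime.fromtimestamp(ts)
    return f"{dt.year % 100}.{dt.month}.{dt.day}-{dt.hour}"

# With a real redis-py client (assumed, not run here) each incoming
# message would bump the counter for its hour:
#
#   import time, redis
#   r = redis.Redis()
#   r.incr(hourly_counter_key(time.time()))  # INCR is atomic, so it is
#                                            # safe under high concurrency
#
# Reading back the count for a given hour is then a single GET on that key.
```

Because INCR is atomic on the server side, many producers can write concurrently without any locking, which fits the "extremely high volume and frequency" requirement.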