Implemented persistent cache, events and gametime counter.

NOTE: there is a new database table (for the persistent cache), so you need to sync or reset your database.

* Persistent cache (pcache) - this works the same as the volatile cache, except it is regularly saved to disk and recovered upon restart. How often the pcache is backed up is set in preferences. This was a heck of a tricky thing to get right due to the intricacies of pickle; for example, it turns out there is a bug in cPickle, so only normal pickle works to store the cache objects.
* Persistent events - these make use of the pcache to re-load the scheduled events after every restart. Only events with the property "persistent" set will be saved this way (if not set, events are lost upon reboot, just as before). All the main system events have been implemented as persistent events, including a new event that regularly saves the pcache to disk.
* In order to track persistent event timers across reboots, there is now also a global "game time". This is saved in cache and counts seconds only while the server is running. Event timers are adjusted with an offset when restarting (otherwise they would be confused by real time jumping forward after a downtime). There is also a small set of helper routines in src/gametime.py for converting between real time and game time (for easy creation of new events).
* Various info commands have been updated to incorporate the time stamp and the cache sync information.
* There are a few test commands commented out in commands/general.py that I used for testing; I left them in in case you want to test things quickly. It works here, but as always, more people testing is needed.
/Griatch
Griatch 2009-11-22 21:18:55 +00:00
parent 5e866c6b73
commit 1ea7e69821
16 changed files with 761 additions and 146 deletions


@@ -1,79 +0,0 @@
"""
The cache module implements a volatile storage
object mechanism for Evennia.

Data stored using this module is stored in
memory (so requires no database access). The
drawback is that it will be lost upon a
reboot. It is however @reload-safe unless
explicitly flushed with @reload/cache (the cache
is not flushed with @reload/all).

Access I/O of the cache is normally done through
the object model, using e.g.
  source_object.cache.variable = data
or
  data = source_object.cache.variable
"""

# global storage. This can be referenced directly, but most
# transparently it's accessed through the object model.
CACHE = {}

class Cache(dict):
    """
    This storage object acts as a save target for
    volatile variables through use of object properties. It
    can also be used as a dict if desired. It lists the contents
    of itself and makes sure to return None if the sought attribute
    is not set on itself (so test = cache.var will set test to None
    if cache has no attribute var instead of raising a traceback).

    Each Cache object is intended to store the volatile properties
    of one in-game database object or one user-defined application.
    """
    def __str__(self):
        """
        Printing the cache object shows all properties
        stored on it.
        """
        return ", ".join(sorted(self.__dict__.keys()))

    def __getattr__(self, name):
        """
        Make sure to return None if the attribute is not set
        (instead of the usual traceback).
        """
        return self.__dict__.get(name, None)

def get(cache_key):
    """
    Retrieve a cache object from the storage. This is primarily
    used by the objects.models.Object.cache property.

    cache_key - identifies the cache storage area (e.g. an object dbref)
    """
    if cache_key not in CACHE:
        CACHE[cache_key] = Cache()
    return CACHE[cache_key]

def flush(cache_key=None):
    """
    Clears a particular cache_key from memory. If
    no key is given, the entire cache is flushed.
    """
    global CACHE
    if cache_key is None:
        CACHE = {}
    elif cache_key in CACHE:
        del CACHE[cache_key]

def show():
    """
    Show objects stored in cache.
    """
    return CACHE.keys()

src/cache/__init__.py (new, empty file)

src/cache/cache.py (new file, 284 additions)

@@ -0,0 +1,284 @@
"""
The cache module implements a volatile and
semi-volatile storage
object mechanism for Evennia.

Volatile Cache:

Data stored using the Cache is stored in
memory (so requires no database access). The
drawback is that it will be lost upon a
reboot. It is however @reload-safe unless
explicitly flushed with @reload/cache (the cache
is not flushed with @reload/all).

Access I/O of the cache is normally done through
the object model, using e.g.
  source_object.cache.variable = data
and
  data = source_object.cache.variable

Semi-persistent Cache:

This form of cache works like the volatile cache, but the
data will survive a reboot since the state is backed up
to the database at regular intervals (it is thus a save-point
scheme). How often the backup is done can be set in preferences.

Access I/O:
  source_object.pcache = data
and
  data = source_object.pcache

Whereas you can also access the cache(s) using
set_cache/get_cache and set_pcache/get_pcache
directly, you must continue to use these methods
on a particular piece of data once you start using them
(i.e. you won't be able to use dot notation to retrieve
a piece of data saved explicitly using set_cache()).
"""
from src.cache.models import PersistentCache
from src import logger

class Cache(object):
    """
    Each Cache object is intended to store the volatile properties
    of one in-game database object or one user-defined application.

    By default, the object allows you to safely reference variables on
    itself even if they do not exist (so test = cache.var will
    set test to None if cache has no attribute var instead of raising
    a traceback). This allows for stable and transparent operation
    under most circumstances.

    Due to how the objects are stored in the database (using pickle), the
    object has a __safedot switch to deactivate the safe mode
    of variables mentioned above; this is necessary in order to have
    pickle work correctly (it does not like a redefined __getattr__)
    and should not be used for anything else.

    Observe that this object in itself is not persistent; the only
    thing determining if it is persistent is which of the global
    variables (CACHE or PCACHE) it is saved in (and that there
    exists an event to save the cache at regular intervals; use
    @ps to check that this is the case).
    """
    __safedot = True

    def __getattr__(self, key):
        """
        This implements safe dot notation (i.e. it will not
        raise an exception if a variable does not exist).
        """
        if self.__safedot:
            return self.__dict__.get(key, None)
        else:
            # normal behaviour - a missing attribute raises AttributeError
            raise AttributeError(key)

    def show(self):
        """
        Return a nice display of the data.
        """
        return ", ".join(key for key in sorted(self.__dict__.keys())
                         if key != '_Cache__safedot')

    def store(self, key, value):
        """
        Store data directly, without going through the dot notation.
        """
        if key != '__safedot':
            self.__dict__[key] = value

    def retrieve(self, key):
        """
        Retrieve data directly, without going through dot notation.
        Note that this intentionally raises a KeyError if key is not
        found. This is mainly used by get_cache to determine if a
        new cache object should be created.
        """
        return self.__dict__[key]

    def pickle_yes(self):
        """
        Since pickle cannot handle a custom __getattr__, we
        need to deactivate it before pickling.
        """
        self.__safedot = False
        for data in (data for data in self.__dict__.values()
                     if isinstance(data, Cache)):
            data.pickle_yes()

    def pickle_no(self):
        """
        Convert back from pickle mode to normal safe dot notation.
        """
        self.__safedot = True
        for data in (data for data in self.__dict__.values()
                     if isinstance(data, Cache)):
            data.pickle_no()

    def has_key(self, key):
        """
        Decide if the cache has a particular piece of data.
        """
        return key in self.__dict__

    def to_dict(self):
        """
        Return all data stored in the cache in
        the form of a dictionary.
        """
        return self.__dict__

    def del_key(self, key):
        """
        Clear cache data.
        """
        if key in self.__dict__:
            del self.__dict__[key]
# Cache access functions - these only deal with the default global
# cache and pcache.

# Volatile cache

def set_cache(cache_key, value):
    """
    Set a value in the volatile cache (most often this is done
    through properties instead).
    """
    CACHE.store(cache_key, value)

def get_cache(cache_key):
    """
    Retrieve a cache object from the storage. This is primarily
    used by the objects.models.Object.cache property.

    cache_key - identifies the cache storage area (e.g. an object dbref)
    """
    try:
        return CACHE.retrieve(cache_key)
    except KeyError:
        CACHE.store(cache_key, Cache())
        return CACHE.retrieve(cache_key)

def flush_cache(cache_key=None):
    """
    Clears a particular cache_key from memory. If
    no key is given, the entire cache is flushed.
    """
    global CACHE
    if cache_key is None:
        CACHE = Cache()
    else:
        CACHE.del_key(cache_key)

# Persistent cache

def set_pcache(cache_key, value):
    """
    Set a value in the persistent cache (most often this is done
    through properties instead).
    """
    PCACHE.store(cache_key, value)

def get_pcache(pcache_key):
    """
    Retrieve a pcache object from the storage. This is primarily
    used by the objects.models.Object.pcache property.

    pcache_key - identifies the cache storage area (e.g. an object dbref)
    """
    try:
        return PCACHE.retrieve(pcache_key)
    except KeyError:
        PCACHE.store(pcache_key, Cache())
        return PCACHE.retrieve(pcache_key)

def flush_pcache(pcache_key=None):
    """
    Clears a particular pcache_key from memory. If
    no key is given, the entire pcache is flushed.
    """
    global PCACHE
    if pcache_key is None:
        PCACHE = Cache()
    elif pcache_key in PCACHE.__dict__:
        PCACHE.del_key(pcache_key)

def show():
    """
    Show objects stored in the caches.
    """
    return CACHE.show(), PCACHE.show()

# Admin-level commands for initializing and saving/loading pcaches.

def init_pcache(cache_name=None):
    """
    Creates the global pcache object in the database
    (this is normally only called by initial_setup.py).
    """
    from src.cache.managers.cache import GLOBAL_PCACHE_NAME
    pcache = PersistentCache()
    if cache_name:
        pcache.cache_name = cache_name
    else:
        pcache.cache_name = GLOBAL_PCACHE_NAME
    # initial save of the empty pcache object to database
    pcache.save()
    # create empty storage object in cache
    pcache.save_cache(Cache())

def save_pcache(cache_name=""):
    """
    Force-save the persistent cache right away.
    """
    try:
        if cache_name:
            pcache = PersistentCache.objects.get(cache_name=cache_name)
        else:
            pcache = PersistentCache.objects.get_default_pcache()
    except Exception:
        logger.log_errmsg("Save error: %s Pcache not initialized." % cache_name)
        return
    pcache.save_cache(PCACHE)

def load_pcache(cache_name=""):
    """
    Load the pcache from database storage. This is also called during
    startup and fills the pcache with persistent cache data.
    """
    global PCACHE
    try:
        if cache_name:
            pcache = PersistentCache.objects.get(cache_name=cache_name)
        else:
            pcache = PersistentCache.objects.get_default_pcache()
    except Exception:
        logger.log_errmsg("Could not load %s: Pcache not found." % cache_name)
        return
    if pcache:
        print " Loading persistent cache from disk."
        unpacked = pcache.load_cache()
        if unpacked:
            PCACHE = unpacked

# Volatile Cache. This is a non-persistent cache. It will be lost upon
# a reboot. This can be referenced directly, but most
# transparently it's accessed through the object model.
CACHE = Cache()

# Persistent Cache. The system will make sure to save the contents of this
# cache at regular intervals, recovering it after a server
# reboot. It is accessed directly or through the object model.
PCACHE = Cache()
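The safe dot notation above can be exercised with a minimal, self-contained sketch (Python 3; `SafeCache` is an illustrative stand-in, not the class from this file). Instead of a `__safedot` toggle, this version simply refuses to intercept dunder names, which is another way around the problem that pickle's probing of special attributes clashes with a permissive `__getattr__`:

```python
import pickle

class SafeCache(object):
    """Illustrative stand-in for the Cache class above."""

    def __getattr__(self, key):
        # never intercept special names - pickle probes for dunders like
        # __setstate__, and handing it None instead of AttributeError is
        # exactly the kind of trouble the __safedot switch works around
        if key.startswith("__") and key.endswith("__"):
            raise AttributeError(key)
        # safe dot notation: a missing attribute yields None, no traceback
        return self.__dict__.get(key, None)

cache = SafeCache()
cache.hp = 42
missing = cache.missing  # None instead of an AttributeError

# round-trips through pickle without any mode switching
clone = pickle.loads(pickle.dumps(cache))
```

The dunder guard keeps `pickle` (and `copy`) working in all modes, so no explicit `pickle_yes()`/`pickle_no()` dance is needed in this sketch.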

src/cache/managers/__init__.py (new, empty file)

src/cache/managers/cache.py (new file, 20 additions)

@@ -0,0 +1,20 @@
"""
Custom manager for Cache objects.
"""
from django.db import models

# This is the (arbitrary, but consistent) name used by the
# global interval-saved (persistent) cache (this is
# used by initial_setup).
GLOBAL_PCACHE_NAME = "_global_persistent_cache"

class CacheManager(models.Manager):
    """
    Custom cache manager.
    """
    def get_default_pcache(self):
        """
        Find and return the global pcache object.
        """
        return self.get(cache_name=GLOBAL_PCACHE_NAME)

src/cache/models.py (new file, 56 additions)

@@ -0,0 +1,56 @@
"""
This implements a database storage cache for storing global
cache data persistently.

It is intended to be used with an event timer, updating
semi-regularly (otherwise, object attributes are better to use
if full persistence is needed).
"""
from django.db import models
from django.conf import settings
from src.cache.managers.cache import CacheManager

# 091120 - there is a bug in cPickle for importing the
# custom cache objects; only normal pickle works. /Griatch
import pickle
#try:
#    import cPickle as pickle
#except ImportError:
#    import pickle

class PersistentCache(models.Model):
    """
    Implements a simple pickled database object, without
    using the in-game object attribute model.
    """
    cache_name = models.CharField(max_length=255)
    cache_data = models.TextField(blank=True)
    objects = CacheManager()

    class Meta:
        permissions = settings.PERM_CACHE

    def load_cache(self):
        """
        Recovers the cache from database storage.
        """
        cache_data = str(self.cache_data)
        #print "loading cache: %s" % cache_data
        if cache_data:
            cache_data = pickle.loads(cache_data)
            cache_data.pickle_no()
            return cache_data
        else:
            return None

    def save_cache(self, cache_obj):
        """
        Stores a cache as a pickle.
        """
        #print "saving ... '%s': %s" % (cache_obj, cache_obj.show())
        cache_obj.pickle_yes()
        self.cache_data = pickle.dumps(cache_obj)
        cache_obj.pickle_no()
        self.save()
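The pickle-in-a-text-column round trip used by `save_cache`/`load_cache` can be sketched without Django. `FakeRow` here is a hypothetical stand-in for the model row; protocol-0 pickles are ASCII for ASCII data, which is what makes a plain `TextField` workable as the backing store:

```python
import pickle

class FakeRow(object):
    """Hypothetical stand-in for the PersistentCache model row."""
    def __init__(self):
        self.cache_data = ""  # plays the role of the TextField

def save_cache(row, cache_obj):
    # protocol 0 produces ASCII output (for ASCII content), so it
    # can be stored safely in a text column
    row.cache_data = pickle.dumps(cache_obj, protocol=0).decode("ascii")

def load_cache(row):
    # recover the stored object, or None if nothing was stored yet
    if not row.cache_data:
        return None
    return pickle.loads(row.cache_data.encode("ascii"))

row = FakeRow()
save_cache(row, {"testcache": "Perm_cache_val_OK"})
restored = load_cache(row)
```

For binary pickle protocols one would instead base64-encode the payload before storing it in a text column.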

src/cache/views.py (new file, 1 addition)

@@ -0,0 +1 @@
# Create your views here.


@@ -59,17 +59,6 @@ def cmd_password(command):
source_object.emit_to("Password changed.")
GLOBAL_CMD_TABLE.add_command("@password", cmd_password, help_category="System")
def cmd_pemit(command):
"""
@pemit
Emits something to a player.
(Not yet implemented)
"""
# TODO: Implement cmd_pemit
#GLOBAL_CMD_TABLE.add_command("@pemit", cmd_pemit)
def cmd_emit(command):
"""
@emit
@@ -527,7 +516,7 @@ def cmd_fsay(command):
# permission check
if not source_object.controls_other(target):
-source_object.emit_to("Cannot pose %s (you don't control it)" % obj.get_name())
+source_object.emit_to("Cannot pose %s (you don't control it)" % target.get_name())
return
# Feedback for the object doing the talking.
@@ -624,7 +613,7 @@ def cmd_fpose(command):
# permission check
if not source_object.controls_other(target):
-source_object.emit_to("Cannot pose %s (you don't control it)" % obj.get_name())
+source_object.emit_to("Cannot pose %s (you don't control it)" % target.get_name())
return
if "nospace" in command.command_switches:
@@ -776,6 +765,8 @@ def cmd_help(command):
source_object.emit_to(string)
GLOBAL_CMD_TABLE.add_command("help", cmd_help)
## def cmd_testevent(command):
## from src import events
## from src import scheduler
@@ -793,3 +784,33 @@ GLOBAL_CMD_TABLE.add_command("help", cmd_help)
## scheduler.del_event(pid)
## source_object.emit_to("event with pid %s removed (if it existed)." % pid)
## GLOBAL_CMD_TABLE.add_command("testevent", cmd_testevent)
## def cmd_testcache(command):
## from src.cache import cache
## from src import scheduler
## from src import events
## from src import gametime
## source_object = command.source_object
## switches = command.command_switches
## s1 = "Temp_cache_val_OK"
## s2 = "Perm_cache_val_OK"
## s3 = "Perm_cache_val2_OK"
## if switches and "get" in switches:
## cache.load_pcache()
## cache_vol = source_object.cache.testcache
## source_object.emit_to("< volatile cache: %s" % cache_vol)
## cache_perm = source_object.pcache.testcache_perm
## source_object.emit_to("< cache_perm1: %s" % cache_perm)
## cache_perm2 = cache.get_pcache("permtest2")
## source_object.emit_to("< cache_perm2: %s" % cache_perm2)
## else:
## source_object.cache.testcache = s1
## source_object.pcache.testcache_perm = s2
## cache.set_pcache("permtest2", s3)
## source_object.emit_to("> volatile cache: %s" % s1)
## source_object.emit_to("> cache_perm1: %s" % s2)
## source_object.emit_to("> cache_perm2: %s" % s3)
## cache.save_pcache()
## source_object.emit_to("Caches saved.")
## source_object.emit_to("Time: %i" % gametime.time())
## GLOBAL_CMD_TABLE.add_command("testcache", cmd_testcache)


@@ -14,6 +14,8 @@ from src import scheduler
from src import defines_global
from src import flags
from src.cmdtable import GLOBAL_CMD_TABLE
from src.cache import cache
from src import gametime
def cmd_version(command):
"""
@@ -40,8 +42,14 @@ def cmd_time(command):
Server local time.
"""
command.source_object.emit_to('Current server time : %s' %
(time.strftime('%a %b %d %H:%M:%S %Y (%Z)', time.localtime(),)))
gtime = gametime.time()
synctime = gametime.time_last_sync()
ltime = time.strftime('%a %b %d %H:%M:%S %Y (%Z)', time.localtime())
string = " Current game time: %i s." % gtime
string += "\n Time since cache was last saved: %i s." % synctime
string += "\n Current server time: %s" % ltime
command.source_object.emit_to(string)
GLOBAL_CMD_TABLE.add_command("@time", cmd_time, priv_tuple=("genperms.game_info",),
help_category="Admin")
@@ -175,3 +183,22 @@ def cmd_stats(command):
stats_dict["players"],
stats_dict["garbage"]))
GLOBAL_CMD_TABLE.add_command("@stats", cmd_stats, priv_tuple=("genperms.game_info",), help_category="Admin")
def cmd_showcache(command):
"""
@showcache - show stats about the cache system
Usage:
@showcache
Study the current contents and size of the cache.
"""
source_object = command.source_object
str_cache, str_pcache = cache.show()
string = ""
if str_cache:
string += "\nVolatile cache:\n " + str_cache
if str_pcache:
string += "\nPersistent cache:\n " + str_pcache
source_object.emit_to(string)
GLOBAL_CMD_TABLE.add_command("@showcache", cmd_showcache, priv_tuple=("genperms.game_info",), help_category="Admin")


@@ -70,10 +70,28 @@ DATABASE_HOST = ''
# Empty string defaults to localhost. Not used with sqlite3.
DATABASE_PORT = ''
# How often the persistent cache will save to disk (in seconds).
CACHE_BACKUP_INTERVAL = 600
# How many words a single command name may have (e.g. 'push button' instead of 'pushbutton')
# (commands with switches can only have one word in the name, e.g. @sethelp/add)
COMMAND_MAXLEN = 3
## Time units - this defines a useful base for how fast time will run in the game.
# You don't actually have to use this, but it affects the routines in src.gametime.py
# and allows for a convenient measure to determine the current in-game time.
# The time factor dictates if the game world runs faster (timefactor>1) or
# slower (timefactor<1) than the real world.
TIME_FACTOR = 2.0
# The tick is the smallest unit of time in the game. Smallest value is 1.
TIME_TICK = 1.0
# These measures might or might not make sense to the game world.
TIME_MIN_PER_HOUR = 60
TIME_HOUR_PER_DAY = 24
TIME_DAY_PER_WEEK = 7
TIME_WEEK_PER_MONTH = 4
TIME_MONTH_PER_YEAR = 12
## Command aliases
# These are convenient aliases set up when the game is started
# for the very first time. You can add/delete aliases in-game using
@@ -114,6 +132,9 @@ PERM_HELPSYS = (
("staff_help", "May see staff help topics."),
("add_help", "May add or append to help entries"),
("del_help", "May delete help entries"),)
# handling cache
PERM_CACHE = (
("admin_cache", "May administer the cache system"),)
# object manipulation/information permissions
PERM_OBJECTS = (
("teleport","May teleport an object to any location."),
@@ -383,6 +404,7 @@ INSTALLED_APPS = (
'src.irc',
'src.helpsys',
'src.genperms',
'src.cache',
'game.web.apps.news',
'game.web.apps.website',
)


@@ -6,13 +6,21 @@ Create your sub-class, call src.scheduler.add_event(YourEventClass()) to add
it to the global scheduler.
Use @ps to view the event list.
Events that have the member variable persistent set to True will be
stored in the persistent cache and will survive server downtime.
"""
import time
import copy
from twisted.internet import task
from django.conf import settings
import session_mgr
from src import scheduler
from src import defines_global
from src.objects.models import Object
from src.cache import cache
from src import logger
from src import gametime
class IntervalEvent(object):
"""
@@ -42,12 +50,23 @@ class IntervalEvent(object):
self.repeats = None
# A reference to the task.LoopingCall object.
self.looped_task = None
# If true, the event definition will survive a reboot.
self.persistent = False
def __getstate__(self):
"""
Used by pickle.
"""
edict = copy.copy(self.__dict__)
edict["looped_task"] = None
edict["pid"] = None
return edict
def __unicode__(self):
"""
String representation of the event.
"""
-return self.name
+return self.description
def __eq__(self, event2):
"""
@@ -89,7 +108,7 @@ class IntervalEvent(object):
"""
Returns a value in seconds when the event is going to fire off next.
"""
-return max(0,(self.time_last_executed + self.interval) - time.time())
+return max(0, (self.time_last_executed + self.interval) - time.time())
def set_lastfired(self):
"""
@@ -110,6 +129,7 @@ class IntervalEvent(object):
scheduler.del_event(self.pid)
# Some default server events
class IEvt_Check_Sessions(IntervalEvent):
"""
@@ -117,9 +137,10 @@ class IEvt_Check_Sessions(IntervalEvent):
"""
def __init__(self):
super(IEvt_Check_Sessions, self).__init__()
-self.name = 'IEvt_Check_Sessions'
+#self.name = 'IEvt_Check_Sessions'
self.interval = 60
self.description = "Session consistency checks."
self.persistent = True
def event_function(self):
"""
@@ -133,9 +154,10 @@ class IEvt_Destroy_Objects(IntervalEvent):
"""
def __init__(self):
super(IEvt_Destroy_Objects, self).__init__()
-self.name = 'IEvt_Destroy_Objects'
+#self.name = 'IEvt_Destroy_Objects'
self.interval = 1800
self.description = "Clean out objects marked for destruction."
self.persistent = True
def event_function(self):
"""
@@ -144,13 +166,34 @@ class IEvt_Destroy_Objects(IntervalEvent):
going_objects = Object.objects.filter(type__exact=defines_global.OTYPE_GOING)
for obj in going_objects:
obj.delete()
def add_global_events():
"""
When the server is started up, this is triggered to add all of the
events in this file to the scheduler.
"""
# Create an instance and add it to the scheduler.
scheduler.add_event(IEvt_Check_Sessions())
scheduler.add_event(IEvt_Destroy_Objects())
class IEvt_Sync_PCache(IntervalEvent):
"""
Event: Sync the persistent cache with the database.
This is an important event since it also makes sure to
update the time stamp.
"""
def __init__(self):
super(IEvt_Sync_PCache, self).__init__()
#self.name = 'IEvt_Sync_PCache'
self.interval = settings.CACHE_BACKUP_INTERVAL
self.description = "Backup pcache to disk."
self.persistent = True
def event_function(self):
"""
This is the function that is fired every self.interval seconds.
"""
infostring = "Syncing time, events and persistent cache to disk."
logger.log_infomsg(infostring)
# updating the current time
time0 = time.time()
time1 = gametime.time(time0)
cache.set_pcache("_game_time0", time0)
cache.set_pcache("_game_time", time1)
# update the event database to pcache
ecache = [event for event in scheduler.SCHEDULE
if event.persistent]
cache.set_pcache("_persistent_event_cache", ecache)
# save pcache to disk.
cache.save_pcache()
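The `__getstate__` trick used by IntervalEvent (dropping the Twisted `LoopingCall` reference and pid before pickling) can be shown in isolation. This is a standalone Python 3 sketch, with a lambda standing in for the unpicklable runtime task object:

```python
import copy
import pickle

class IntervalEvent(object):
    """Minimal stand-in for the event class above (illustrative)."""
    def __init__(self, interval, description, persistent=False):
        self.interval = interval
        self.description = description
        self.persistent = persistent
        self.pid = None
        self.looped_task = None  # runtime-only, not picklable

    def __getstate__(self):
        # strip runtime-only references so the event pickles cleanly;
        # the scheduler re-creates them when the event is re-added
        edict = copy.copy(self.__dict__)
        edict["looped_task"] = None
        edict["pid"] = None
        return edict

ev = IntervalEvent(600, "Backup pcache to disk.", persistent=True)
ev.looped_task = lambda: None  # a lambda cannot be pickled ...
restored = pickle.loads(pickle.dumps(ev))  # ... but the event still can
```

Without `__getstate__`, `pickle.dumps(ev)` would fail on the task reference; with it, only the declarative parts of the event (interval, description, persistence flag) survive the round trip, which is exactly what the persistent-event cache needs.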

src/gametime.py (new file, 109 additions)

@@ -0,0 +1,109 @@
"""
The gametime module handles the global passage of time in the mud.
It also supplies routines for converting between real time and
game time.
"""
from django.conf import settings
import time as time_module
from src.cache import cache

# Speed-up factor of the in-game time compared
# to real time.
TIMEFACTOR = settings.TIME_FACTOR

# Common real-life time measures, in seconds.
# You should normally not change these.
REAL_TICK = settings.TIME_TICK  # this is the smallest time unit (minimum 1s)
REAL_MIN = 60.0  # seconds per minute in the real world

# Game-time units, in real-life seconds. These are supplied as
# a convenient measure for determining the current in-game time,
# e.g. when defining events. The words month, week and year can
# of course be translated into any suitable measures.
TICK = REAL_TICK / TIMEFACTOR
MIN = REAL_MIN / TIMEFACTOR
HOUR = MIN * settings.TIME_MIN_PER_HOUR
DAY = HOUR * settings.TIME_HOUR_PER_DAY
WEEK = DAY * settings.TIME_DAY_PER_WEEK
MONTH = WEEK * settings.TIME_WEEK_PER_MONTH
YEAR = MONTH * settings.TIME_MONTH_PER_YEAR

def gametime_to_realtime(secs=0, mins=0, hrs=0, days=0,
                         weeks=0, months=0, yrs=0):
    """
    This method helps to figure out the real-world time it will take until
    an in-game time has passed. E.g. if an event should take place a month
    later in-game, you will be able to find the number of real-world seconds
    this corresponds to (hint: interval events deal with real-life seconds).

    Example:
      gametime_to_realtime(days=2) -> number of real-life seconds from now
      after which 2 in-game days will have passed.
    """
    stot = secs/TIMEFACTOR + mins*MIN + hrs*HOUR + \
           days*DAY + weeks*WEEK + months*MONTH + yrs*YEAR
    return stot

def realtime_to_gametime(secs=0, mins=0, hrs=0, days=0,
                         weeks=0, months=0, yrs=0):
    """
    This method calculates how large an in-game time a real-world time
    interval would correspond to. This is usually a lot less interesting
    than the other way around.

    Example:
      realtime_to_gametime(days=2) -> number of game-world seconds
      corresponding to 2 real days.
    """
    stot = TIMEFACTOR * (secs + mins*60 + hrs*3600 + days*86400 + \
                         weeks*604800 + months*2419200 + yrs*29030400)
    return stot

def time(currtime=None):
    """
    Find the current in-game time (in seconds) since the start of the mud.
    This is the main measure of in-game time and is persistently saved to
    disk, so it is the main thing to use to determine the passage of time,
    like seasons etc.

    Note that depending on how often the persistent cache is saved to disk
    (this is defined in the config file), there might be some discrepancy
    here after a server crash, notably that some time will be 'lost' (i.e.
    the time since the last backup). If this is a concern, consider saving
    the cache more often.

    currtime : an externally calculated current time to compare with.
    """
    time0 = cache.get_pcache("_game_time0")
    time1 = cache.get_pcache("_game_time")
    if currtime:
        return time1 + (currtime - time0)
    else:
        return time1 + (time_module.time() - time0)

def time_last_sync():
    """
    Calculates the time since the system was last synced to disk. This is
    e.g. used to adjust event counters for offline time. The error of this
    measure depends on how often the cache is saved to disk.
    """
    time0 = cache.get_pcache("_game_time0")
    return time_module.time() - time0

def time_save():
    """
    Force a save of the current time to persistent cache.
    Shutting down the server from within the mud will
    automatically call this routine.
    """
    time0 = time_module.time()
    time1 = time(time0)
    cache.set_pcache("_game_time0", time0)
    cache.set_pcache("_game_time", time1)
    cache.save_pcache()
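With the default settings shipped in this commit (TIME_FACTOR = 2.0, 60 minutes per hour, 24 hours per day), the two conversions work out as below. The sketch inlines the settings so it runs standalone; it mirrors the formulas in gametime.py but is not the module itself:

```python
TIME_FACTOR = 2.0  # game time runs twice as fast as real time

# game-time units measured in real-life seconds, as in gametime.py
MIN = 60.0 / TIME_FACTOR
HOUR = MIN * 60    # TIME_MIN_PER_HOUR
DAY = HOUR * 24    # TIME_HOUR_PER_DAY

def gametime_to_realtime(secs=0, mins=0, hrs=0, days=0):
    # real-world seconds until the given in-game interval has passed
    return secs / TIME_FACTOR + mins * MIN + hrs * HOUR + days * DAY

def realtime_to_gametime(secs=0, mins=0, hrs=0, days=0):
    # in-game seconds corresponding to a real-world interval
    return TIME_FACTOR * (secs + mins * 60 + hrs * 3600 + days * 86400)

# at double speed, two in-game days pass in one real day
real = gametime_to_realtime(days=2)   # 86400.0 real seconds
game = realtime_to_gametime(days=1)   # 172800.0 game seconds
```

This is the number you would hand to an IntervalEvent: schedule it `gametime_to_realtime(days=2)` real seconds from now and it fires after two in-game days.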


@@ -5,6 +5,7 @@ other things.
Everything starts at handle_setup()
"""
import time
from django.contrib.auth.models import User, Group, Permission
from django.core import management
from django.conf import settings
@@ -12,6 +13,12 @@ from src.objects.models import Object
from src.config.models import ConfigValue, CommandAlias, ConnectScreen
from src import comsys, defines_global, logger
from src.helpsys import helpsystem
from src import session_mgr
from src import scheduler
from src import events
from src.cache import cache
# Main module methods
def get_god_user():
"""
@@ -119,19 +126,7 @@ def create_aliases():
command_aliases = settings.COMMAND_ALIASES
for user_input, equiv_command in command_aliases.items():
CommandAlias(user_input=user_input, equiv_command=equiv_command).save()
## CommandAlias(user_input="@desc", equiv_command="@describe").save()
## CommandAlias(user_input="@dest", equiv_command="@destroy").save()
## CommandAlias(user_input="@nuke", equiv_command="@destroy").save()
## CommandAlias(user_input="@tel", equiv_command="@teleport").save()
## CommandAlias(user_input="i", equiv_command="inventory").save()
## CommandAlias(user_input="inv", equiv_command="inventory").save()
## CommandAlias(user_input="l", equiv_command="look").save()
## CommandAlias(user_input="ex", equiv_command="examine").save()
## CommandAlias(user_input="sa", equiv_command="say").save()
## #CommandAlias(user_input="emote", equiv_command="pose").save()
## CommandAlias(user_input="p", equiv_command="page").save()
def import_help_files():
"""
Imports the help files.
@@ -148,6 +143,44 @@ def categorize_initial_helpdb():
print " Moving initial imported help db to help category '%s'." % default_category
helpsystem.edithelp.homogenize_database(default_category)
def create_pcache():
"""
Create the global persistent cache object.
"""
from src.cache import cache
# create the main persistent cache
cache.init_pcache()
def create_system_events():
"""
Set up the default system events of the server
"""
# create instances of events and add to scheduler (which survives a reboot)
print " Defining system events ..."
scheduler.add_event(events.IEvt_Check_Sessions())
scheduler.add_event(events.IEvt_Destroy_Objects())
scheduler.add_event(events.IEvt_Sync_PCache())
# Make sure that these events are saved to pcache right away.
ecache = [event for event in scheduler.SCHEDULE if event.persistent]
cache.set_pcache("_persistent_event_cache", ecache)
cache.save_pcache()
def start_game_time():
"""
This creates a persistent time stamp (in s since an arbitrary start)
upon first server start and is saved and updated regularly in persistent cache.
_game_time0 is the current absolute time (in s since an arbitrary start).
_game_time is the relative number of seconds the server has been running
(not counting offline time); it is stored at every cache save, so it is
accurate to within the time between saves.
"""
time0 = time.time()
time1 = 0
cache.set_pcache("_game_time0", time0)
cache.set_pcache("_game_time", time1)
cache.save_pcache()
def handle_setup():
"""
Main logic for the module.
@@ -160,3 +193,6 @@ def handle_setup():
create_channels()
import_help_files()
categorize_initial_helpdb()
create_pcache()
create_system_events()
start_game_time()
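The `_game_time0`/`_game_time` bookkeeping set up by start_game_time() and maintained by the sync event is just a little arithmetic. This standalone sketch uses a dict in place of the pcache and explicit clock values instead of time.time() (names mirror the keys above, the functions are illustrative):

```python
state = {}  # stands in for the persistent cache

def start_game_time(now):
    state["_game_time0"] = now  # absolute anchor (real time)
    state["_game_time"] = 0.0   # accumulated running time

def game_time(now):
    # running time = accumulated time + time since the last anchor
    return state["_game_time"] + (now - state["_game_time0"])

def sync(now):
    # roll the elapsed time into _game_time and reset the anchor;
    # this is what the pcache-backup event does at every save
    state["_game_time"] = game_time(now)
    state["_game_time0"] = now

start_game_time(1000.0)
assert game_time(1010.0) == 10.0
sync(1010.0)

# after a reboot the anchor is reset to the new wall clock, so the
# downtime between 1010 and 5000 does not count as game time
state["_game_time0"] = 5000.0
assert game_time(5010.0) == 20.0
```

This also shows why events need an offset at restart: their `time_last_executed` stamps are in real time, which jumps forward over the downtime even though game time does not.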


@@ -21,7 +21,7 @@ from src import scripthandler
from src import defines_global
from src import session_mgr
from src import logger
-from src import cache
+from src.cache import cache
# Import as the absolute path to avoid local variable clashes.
import src.flags
@@ -222,7 +222,8 @@ class Object(models.Model):
else:
return results[0]
def search_for_object_global(self, ostring, exact_match=True, limit_types=[],
def search_for_object_global(self, ostring, exact_match=True,
limit_types=[],
emit_to_obj=None, dbref_limits=()):
"""
Search for ostring in all objects, globally. Handle multiple-matches
@@ -233,7 +234,8 @@ class Object(models.Model):
if not emit_to_obj:
emit_to_obj = self
results = Object.objects.global_object_name_search(ostring, exact_match=exact_match,
results = Object.objects.global_object_name_search(ostring,
exact_match=exact_match,
limit_types=limit_types)
if dbref_limits:
# if this is set we expect a tuple of 2, even if one is None.
@@ -294,7 +296,8 @@ class Object(models.Model):
"""
# The Command object has all of the methods for parsing and preparing
# for searching and execution. Send it to the handler once populated.
cmdhandler.handle(cmdhandler.Command(self, command_str, session=session),
cmdhandler.handle(cmdhandler.Command(self, command_str,
session=session),
ignore_state=ignore_state)
def emit_to_contents(self, message, exclude=None):
@@ -573,11 +576,12 @@ class Object(models.Model):
def clear_objects(self):
"""
Moves all objects (players/things) currently in a GOING -> GARBAGE location
to their home or default home (if it can be found).
Moves all objects (players/things) currently in a
GOING -> GARBAGE location to their home or default
home (if it can be found).
"""
# Gather up everything, other than exits and going/garbage, that is under
# the belief this is its location.
# Gather up everything, other than exits and going/garbage,
# that is under the belief this is its location.
objs = self.obj_location.filter(type__in=[1, 2, 3])
default_home_id = ConfigValue.objects.get_configvalue('default_home')
try:
@@ -600,12 +604,13 @@ class Object(models.Model):
# If for some reason it's still None...
if not home:
functions_general.log_errmsg("Missing default home, %s '%s(#%d)' now has a null location." %
string = "Missing default home, %s '%s(#%d)' now has a null location."
functions_general.log_errmsg(string %
(text, obj.name, obj.id))
if obj.is_player():
if obj.is_connected_plr():
if home:
if home:
obj.emit_to("Your current location has ceased to exist, moving you to your home %s(#%d)." %
(home.name, home.id))
else:
@@ -677,6 +682,8 @@ class Object(models.Model):
return attrib.get_value()
else:
return default
attribute = property(fget=get_attribute_value, fset=set_attribute)
def get_attribute_obj(self, attrib):
"""
@@ -747,7 +754,8 @@ class Object(models.Model):
# wild-carded search string.
match_exp = re.compile(functions_general.wildcard_to_regexp(searchstr),
re.IGNORECASE)
# If the regular expression search returns a match object, add to results.
# If the regular expression search returns a match
# object, add to results.
if exclude_noset:
return [attr for attr in attrs if match_exp.search(attr.get_name())
and not attr.is_hidden() and not attr.is_noset()]
@@ -885,7 +893,8 @@ class Object(models.Model):
try:
return self.location
except:
functions_general.log_errmsg("Object '%s(#%d)' has invalid location: #%s" % \
string = "Object '%s(#%d)' has invalid location: #%s"
functions_general.log_errmsg(string % \
(self.name,self.id,self.location_id))
return False
@@ -913,15 +922,29 @@ class Object(models.Model):
"""
Returns an object's volatile cache (in-memory storage)
"""
return cache.get(self.dbref())
return cache.get_cache(self.dbref())
def del_cache(self):
"""
Cleans the object cache for this object
"""
cache.flush(self.dbref())
cache.flush_cache(self.dbref())
cache = property(fget=get_cache, fdel=del_cache)
def get_pcache(self):
"""
Returns an object's persistent cache (in-memory storage)
"""
return cache.get_pcache(self.dbref())
def del_pcache(self):
"""
Cleans the object persistent cache for this object
"""
cache.flush_pcache(self.dbref())
pcache = property(fget=get_pcache, fdel=del_pcache)
def get_script_parent(self):
"""
@@ -944,7 +967,8 @@ class Object(models.Model):
script_parent: (string) String pythonic import path of the script parent
assuming the python path is game/gamesrc/parents.
"""
if script_parent != None and scripthandler.scriptlink(self, str(script_parent).strip()):
if script_parent != None and scripthandler.scriptlink(self,
str(script_parent).strip()):
#assigning a custom parent
self.script_parent = str(script_parent).strip()
self.save()
@@ -1165,7 +1189,8 @@ class Object(models.Model):
# for other users we request the permission as normal.
nostate = self.has_perm("genperms.admin_nostate")
# we never enter other states if we are already in the interactive batch processor.
# we never enter other states if we are already in
# the interactive batch processor.
nostate = nostate or self.get_state() == "_interactive batch processor"
if nostate:


@@ -8,6 +8,8 @@ ADDING AN EVENT:
imported, or that add_event() is called by a command or some kind of action.
* Profit.
"""
from src.cache import cache
CACHE_NAME = "_persistent_event_cache"
# dict of IntervalEvent sub-classed objects, keyed by their
# process id:s.
@@ -35,19 +37,30 @@ def add_event(event):
* event: (IntervalEvent) The event to add to the scheduler.
Returns:
* pid : (int) The process ID assigned to this event, for future reference.
"""
# Make sure not to add multiple instances of the same event.
matches = [i for i, stored_event in enumerate(SCHEDULE) if event == stored_event]
matches = [i for i, stored_event in enumerate(SCHEDULE)
if event == stored_event]
if matches:
#print "replacing existing event pid=%i: %s" % (event.pid, event.name)
# Before replacing an event, stop its old incarnation.
del_event(matches[0])
SCHEDULE[matches[0]] = event
else:
# Add a new event with a fresh pid.
event.pid = next_free_pid()
#print "adding new event with fresh pid=%i: %s" % (event.pid,event.name)
SCHEDULE.append(event)
event.start_event_loop()
if event.persistent:
# We have to sync to disk, otherwise we might end up
# in situations (such as after a crash) where an object exists,
# but the event tied to it does not.
ecache = [event for event in SCHEDULE if event.persistent]
cache.set_pcache("_persistent_event_cache", ecache)
cache.save_pcache()
return event.pid
def get_event(pid):
@@ -56,7 +69,8 @@ def get_event(pid):
otherwise return None.
"""
pid = int(pid)
imatches = [i for i, stored_event in enumerate(SCHEDULE) if stored_event.pid == pid]
imatches = [i for i, stored_event in enumerate(SCHEDULE)
if stored_event.pid == pid]
if imatches:
return SCHEDULE[imatches[0]]
@@ -66,7 +80,18 @@ def del_event(pid):
event with a certain pid, this cleans up in case there are any multiples.
"""
pid = int(pid)
imatches = [i for i, stored_event in enumerate(SCHEDULE) if stored_event.pid == pid]
imatches = [i for i, stored_event in enumerate(SCHEDULE)
if stored_event.pid == pid]
for imatch in imatches:
SCHEDULE[imatch].stop_event_loop()
del SCHEDULE[imatch]
event = SCHEDULE[imatch]
event.stop_event_loop()
del SCHEDULE[imatch]
if event.persistent:
# We have to sync to disk, otherwise we might end
# up in situations (such as after a crash) where an
# object has been removed, but the event tied to it remains.
ecache = [event for event in SCHEDULE
if event.persistent]
cache.set_pcache("_persistent_event_cache", ecache)
cache.save_pcache()
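The way `add_event` and `del_event` keep the pcache mirror of `SCHEDULE` consistent can be sketched as below. `IntervalEvent` is reduced to a stub and the pcache to a dict; both are assumptions for illustration, not the real classes:

```python
SCHEDULE = []
PCACHE = {}  # stand-in for cache.set_pcache / cache.save_pcache

class IntervalEvent(object):
    """Stub: the real class also runs a timed Twisted loop."""
    def __init__(self, name, persistent=False):
        self.name = name
        self.persistent = persistent
        self.pid = None
    def __eq__(self, other):
        # Two events are "the same" if they share a name.
        return self.name == other.name
    def start_event_loop(self):
        pass
    def stop_event_loop(self):
        pass

def sync_persistent_events():
    # Mirror every persistent event to disk-backed storage so a crash
    # cannot leave objects and their events out of step.
    PCACHE["_persistent_event_cache"] = [e for e in SCHEDULE if e.persistent]

def add_event(event):
    matches = [i for i, stored in enumerate(SCHEDULE) if event == stored]
    if matches:
        # Replace (and stop) the old incarnation, keeping its pid.
        old = SCHEDULE[matches[0]]
        old.stop_event_loop()
        event.pid = old.pid
        SCHEDULE[matches[0]] = event
    else:
        event.pid = len(SCHEDULE) + 1  # simplistic pid allocation
        SCHEDULE.append(event)
    event.start_event_loop()
    if event.persistent:
        sync_persistent_events()
    return event.pid

pid = add_event(IntervalEvent("save_pcache", persistent=True))
add_event(IntervalEvent("save_pcache", persistent=True))  # replaces, no duplicate
```

The sync after every persistent add/remove is deliberately eager: it trades a little disk traffic for the guarantee that the on-disk event cache never lags the live schedule.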


@@ -15,6 +15,9 @@ from src import alias_mgr
from src import cmdtable
from src import initial_setup
from src.util import functions_general
from src.cache import cache
from src import scheduler
from src import gametime
class EvenniaService(service.Service):
def __init__(self):
@@ -34,11 +37,13 @@ class EvenniaService(service.Service):
# Begin startup debug output.
print '-'*50
firstrun = False
try:
# If this fails, this is an empty DB that needs populating.
ConfigValue.objects.get_configvalue('game_firstrun')
except ConfigValue.DoesNotExist:
print ' Game started for the first time, setting defaults.'
firstrun = True
initial_setup.handle_setup()
self.start_time = time.time()
@@ -52,9 +57,27 @@ class EvenniaService(service.Service):
# Cache the aliases from the database for quick access.
alias_mgr.load_cmd_aliases()
# Load persistent cache from database into memory
cache.load_pcache()
if not firstrun:
# Find out how much offset the timer is (due to being
# offline).
time_sync = gametime.time_last_sync()
# Sync the in-game timer.
cache.set_pcache("_game_time0", self.start_time)
# Fire up the event scheduler.
event_cache = cache.get_pcache("_persistent_event_cache")
if event_cache and type(event_cache) == type(list()):
for event in event_cache:
# we adjust the executed time to account for offline time.
event.time_last_executed = event.time_last_executed + time_sync
scheduler.add_event(event)
print '-'*50
# Fire up the event scheduler.
events.add_global_events()
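The offline-offset correction above (shifting each restored event's `time_last_executed` forward by the downtime) can be illustrated with concrete numbers. `time_last_sync` here is a stand-in for the helper in `src/gametime.py`, whose exact signature is an assumption:

```python
def time_last_sync(now, last_sync_stamp):
    # Real seconds elapsed since the timer was last saved; while the
    # server is down, game time stands still but real time does not.
    return now - last_sync_stamp

class Event(object):
    def __init__(self, time_last_executed, interval):
        self.time_last_executed = time_last_executed
        self.interval = interval

# Server saved state at t=100 and comes back at t=160 (60s of downtime).
# An event that last fired at t=90 with a 30s interval would otherwise
# look 40s overdue and fire immediately after every restart.
offset = time_last_sync(now=160, last_sync_stamp=100)
event = Event(time_last_executed=90, interval=30)
event.time_last_executed += offset
# Now 150: at shutdown 10s of the interval had elapsed, and after the
# adjustment the same 10s have elapsed at restart, so 20s remain.
```

Without the offset, every pending timer would see the full downtime as elapsed interval and fire in a burst at startup.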
"""
BEGIN SERVER STARTUP METHODS
@@ -102,6 +125,9 @@ class EvenniaService(service.Service):
"""
Gracefully disconnect everyone and kill the reactor.
"""
gametime.time_save()
cache.save_pcache()
logger.log_infomsg("Persistent cache and time saved prior to shutdown.")
session_mgr.announce_all(message)
session_mgr.disconnect_all_sessions()
reactor.callLater(0, reactor.stop)
@@ -141,7 +167,6 @@ class EvenniaService(service.Service):
f.server = self
return f
def start_services(self, application):
"""
Starts all of the TCP services.