[p2pu-dev] Course metrics
Dirk Uys
dirk at p2pu.org
Tue Jun 19 16:07:03 UTC 2012
Hi everyone
During the last release on 18 May I enabled course metrics for all course
organizers, believing that the metrics were working perfectly and that it was
simply a permission update. You know what they say about assumptions...
The problem is that when a user goes to the metrics page for a course, the
metrics get generated from the recorded page views (
https://github.com/p2pu/lernanta/blob/master/lernanta/apps/tracker/models.py#L100).
If the user refreshes the page (because it's taking so long), the process
starts again and the metric updating procedure runs concurrently. This
doesn't play nice with the intended use of the db, and duplicate data gets
generated :(
Now, there are multiple ways to solve this problem, each with pros and cons
(rough sketches of what each option could look like follow the list).
1. Enforce some locking mechanism to ensure the operation only runs once at a time
+ process doesn't run concurrently
- user waits
- lots of db work tied to specific requests
2. Queue a celery task that runs the operation
+ user doesn't need to wait for results
- still need to implement some locking mechanism to prevent celery tasks
from running concurrently
- lots of db work tied to specific requests
3. Keep the table updated from the get-go
+ metrics are always up to date
- introduces a small overhead on every page view
- generates metrics that are never used
4. Fix the data duplication issue itself
+ doesn't matter if the process runs concurrently
- update still takes a long time
- lots of db work tied to specific requests
5. Don't trigger the update process from user actions, but rather at a
predetermined time
+ user doesn't wait
- generates metrics that are never used
6. ?
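
For option 1, here's a minimal sketch of what the lock could look like, assuming a
shared Django cache backend (e.g. memcached); update_metrics_for_course is a
hypothetical stand-in for the existing generation code in tracker/models.py:

    # Option 1 (sketch): cache-based lock so only one request generates
    # metrics for a given course at a time. Assumes a shared cache backend.
    from django.core.cache import cache

    LOCK_TIMEOUT = 60 * 10  # assume a single update never takes longer than this

    def update_metrics_locked(course):
        lock_key = 'course-metrics-lock-%s' % course.pk
        # cache.add only stores the key if it isn't there yet, so it doubles as a lock
        if not cache.add(lock_key, 'locked', LOCK_TIMEOUT):
            return False  # another request is already generating these metrics
        try:
            update_metrics_for_course(course)  # hypothetical: the existing generation code
        finally:
            cache.delete(lock_key)
        return True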
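For option 2, roughly the same work pushed onto a Celery task so the request can
return immediately; the task name and the import path of the course model are
assumptions, and the lock from the option 1 sketch is still needed to stop two
queued tasks from running at once:

    # Option 2 (sketch): do the generation in a Celery task instead of the request.
    from celery.task import task

    @task
    def generate_course_metrics(course_id):
        from projects.models import Project  # assumption: location of the course model
        course = Project.objects.get(pk=course_id)
        update_metrics_locked(course)  # reuse the lock from the option 1 sketch

    # In the metrics view, instead of generating inline:
    #     generate_course_metrics.delay(course.pk)
    #     # and render a "metrics are being generated" message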
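For option 3, a sketch of keeping an aggregate row current as page views come in;
CourseMetric here is an assumed per-course/per-user/per-day model (like the one
in the option 4 sketch below), not the real lernanta schema:

    # Option 3 (sketch): bump an aggregate row whenever a page view is recorded,
    # so the metrics table is always up to date.
    from django.db.models import F

    def record_metric_for_page_view(course, user, day):
        metric, created = CourseMetric.objects.get_or_create(
            course=course, user=user, date=day,
            defaults={'page_views': 1})
        if not created:
            # F() does the increment inside the database, so concurrent page
            # views don't overwrite each other's counts
            CourseMetric.objects.filter(pk=metric.pk).update(
                page_views=F('page_views') + 1)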
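For option 4, one way to make duplicates impossible is a uniqueness constraint plus
an upsert-style write, so a repeated or concurrent run overwrites rows instead of
inserting new ones; the model and field names are again assumptions:

    # Option 4 (sketch): constrain the table so the same (course, user, date)
    # can only exist once, and write so regeneration overwrites instead of
    # duplicating. The unique_together constraint is what guarantees no
    # duplicates even if two processes race.
    from django.db import models

    class CourseMetric(models.Model):
        course = models.ForeignKey('projects.Project')
        user = models.ForeignKey('users.UserProfile')
        date = models.DateField()
        page_views = models.PositiveIntegerField(default=0)

        class Meta:
            unique_together = ('course', 'user', 'date')

    def store_metric(course, user, day, page_views):
        metric, created = CourseMetric.objects.get_or_create(
            course=course, user=user, date=day,
            defaults={'page_views': page_views})
        if not created:
            metric.page_views = page_views
            metric.save()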
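And for option 5, the update could be kicked off by celerybeat (or a cron'd
management command) instead of a page load; this assumes a task like the option 2
sketch, and the task path is made up:

    # Option 5 (sketch): regenerate metrics on a schedule instead of on demand,
    # using the standard celerybeat settings.
    from datetime import timedelta

    CELERYBEAT_SCHEDULE = {
        'refresh-course-metrics-nightly': {
            'task': 'tracker.tasks.generate_all_course_metrics',  # hypothetical task
            'schedule': timedelta(hours=24),
        },
    }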
Does anyone have any thoughts on this?
Cheers
d