When Data.fs gets unacceptably big
Posted by on Sunday October 08, 01:39AM, 2000
from the my-hd-is-not-happy-about-this dept.
I run a Squishdot-based news site (ACP) with a considerable amount of daily traffic. We've had it running on our Debian 2.2 system for nearly a year now, and we're quite happy with it. But after all this time of heavy usage, Zope's Data.fs has grown to 2309 megabytes as I type. The /var partition is at 75%, and we don't know how to solve this problem.
People are starting to consider switching to other software (Slash, for example), but I would like to find a solution within Squishdot before it comes to that. Also, I have checked the CVS code for Squishdot, and it seems to have been stale for a while. I wonder how Technocrat and other active sites (such as Dot.KDE) are managing this, and whether there's a way to solve it before we start losing mail :> Any comments welcome. Thanks, Jordi
The Fine Print: The following comments are owned by whoever posted them.
Re: When Data.fs gets unacceptably big
by on Sunday October 08, 05:47AM, 2000
Have you packed the database at all or does it contain 1 year's worth of revision info? Doesn't seem clear from your article. ;)
Cheers,
Navin.
Re: When Data.fs gets unacceptably big
by on Sunday October 08, 04:35PM, 2000
Let's see if I understood Zope right :)
You mean that I get the db, compress it or something, and start a new one?
I guess that would break previous postings, searches and all, wouldn't it?
Or am I missing something?
Re: When Data.fs gets unacceptably big
by on Sunday October 08, 05:08PM, 2000
Nothing so complicated. Go to the top-level Zope management menu and click on "Control Panel". In "Control Panel", click on "Database Management". You should find an option to "pack" the database there, where you can get rid of revision/transaction/undo info older than a certain number of days.
It doesn't make much sense to keep a year of revision info; you can probably get rid of everything older than 10 days. I'd make a backup first, though: I have no idea how long packing will take for a database that huge, or what could happen in such a situation. I've never had a problem personally, but our db hasn't exceeded 75M --unpacked-- yet, and we have a bunch of other stuff stored there besides dot.kde.org.
Cheers,
Navin.
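For the curious, what packing does conceptually: for each object it keeps the newest revision at or before the cutoff, plus every revision newer than the cutoff, and discards the rest. Here is a toy Python model of that bookkeeping -- not the real ZODB code, and the revision tuples are invented purely for illustration:

```python
import time

def pack(revisions, days):
    """Toy model of ZODB packing.

    revisions: list of (oid, timestamp, data) tuples.
    Keeps, per object, the latest revision at or before the cutoff,
    plus every revision newer than the cutoff.
    """
    cutoff = time.time() - days * 86400
    kept = []
    latest_old = {}
    for oid, ts, data in sorted(revisions, key=lambda r: r[1]):
        if ts <= cutoff:
            # older revisions of the same object get superseded
            latest_old[oid] = (oid, ts, data)
        else:
            # recent history is preserved for undo
            kept.append((oid, ts, data))
    return sorted(list(latest_old.values()) + kept, key=lambda r: r[1])
```

So an object edited daily for a year keeps only its pre-cutoff state plus the last 10 days of edits, which is why the on-disk file shrinks so dramatically.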
Re: When Data.fs gets unacceptably big
by on Monday October 09, 07:21PM, 2000
Navin, this is great!
We hadn't noticed this "pack" feature.
Our nasty 2Gb database has automagically shrunk to 200Mb, keeping 10 days of revisions.
Thank you for your help,
Jordi
Re: When Data.fs gets unacceptably big
by on Monday October 09, 11:53PM, 2000
Wow! That's a relief to hear on this end. 200M won't be a problem for us in a year. ;)
Cheers,
Navin.
Re: When Data.fs gets unacceptably big
by on Tuesday October 10, 06:52PM, 2000
I assume dot.kde.org gets a lot more comments per post than our ACP, though. But we have lots of posts during the day, so maybe the traffic is similar after all.
Re: When Data.fs gets unacceptably big
by on Thursday September 16, 09:46AM, 2004
Is there an alternative to packing Data.fs other than the Control Panel's Database Management? One of our clients uses Zope, and we reminded them to pack every two days, but they didn't follow our instructions until Data.fs grew to the limit of 19 gig and they started getting "no space available" errors. Now when I ask them to pack, the pack has no effect, because of that same "no space available" problem. So I need a solution besides packing through Database Management.
cron job.
by on Tuesday September 21, 08:31PM, 2004
The problem is that packing needs working space: you now have to find roughly another 19GB free so the pack can succeed. I don't know of much to get around this; you'll have to try asking on the mailing list.
Once you've fixed that problem, I'd suggest setting up a cron job to do the pack so your client doesn't have to remember to do so ;-)
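A sketch of such a cron job, hitting Zope's pack form over HTTP from the ZMI. The host, port, and credentials are placeholders you'd replace with your own, and the `manage_pack?days:float=10` URL should be verified against the Control Panel of your Zope version before relying on it:

```shell
# crontab entry: pack every Sunday at 03:00, keeping the last 10 days
# of revision history. Host, port, and admin credentials are placeholders.
0 3 * * 0 wget -q -O /dev/null --http-user=admin --http-passwd=secret \
  "http://localhost:8080/Control_Panel/Database/manage_pack?days:float=10"
```

Redirecting the output to /dev/null keeps cron from mailing you the response page on every run.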