WeBWorK Main Forum

WeBWorK gradually eats up system memory (Digital Ocean Ubuntu 16.04 server)

by Homer White -
Number of replies: 4

I have an installation of WeBWorK 2.12 on a Digital Ocean server; the OS is Ubuntu 16.04. The "droplet" currently has 2GB of RAM (but will be bumped up to at least 4GB when the semester begins).

As I build HW sets for a course, I check them in various ways. One of the checks is to generate a pdf hardcopy. Once the pdf is generated I hit the back arrow on the browser and move on to something else.

I have noticed over the last few days that certain operations have become more "spotty": for example, I might have to request a pdf hardcopy two or three times before success.

Earlier today I logged on to my system (via ssh) to do routine maintenance, and

sudo apt-get update

failed. Some internet research prompted me to check system memory and I noticed that there was almost no memory available.

A reboot brought available memory back to about 800MB. I then proceeded to generate pdf hardcopies while keeping track of system memory with commands like:

watch -n 5 free -m

Each successive hardcopy reduced available memory by about 100-200MB. The output below shows where I now stand:

Every 5.0s: free -m                        Tue Jul 19 21:42:51 2016

              total    used    free  shared  buff/cache  available
Mem:           2000    1790      81       1         128         61
Swap:             0       0       0

I'm down to 128MB of buff/cache, with only 61MB available.

Evidently the pdf hardcopy operation consumes memory that is not released after the pdf is produced.

Is there a way to "clean up" after production of a pdf? (I'm not sure yet but there may be other WeBWorK operations that eat memory, too.)

I apologize for my imprecise technical vocabulary: I'm not a sysadmin, just a faculty member trying to make do on my own.

In reply to Homer White

Re: WeBWorK gradually eats up system memory (Digital Ocean Ubuntu 16.04 server)

by Andras Balogh -

The hardcopy generation is a known memory problem. See replies about what you can try at http://webwork.maa.org/moodle/mod/forum/discuss.php?d=3827

While 800MB free out of 2GB sounds reasonable, 2GB itself is extremely small nowadays. Try not to have large homework sets.

I have to admit that I am not familiar with cloud solutions in general, but looking at the pricing I fail to understand how this is better than buying a simple desktop computer for about $500 that already comes with 12GB of memory; one could even run it from home if needed.


In reply to Andras Balogh

Re: WeBWorK gradually eats up system memory (Digital Ocean Ubuntu 16.04 server)

by Homer White -

Thanks for the reply. I have since figured out that (seemingly) every operation uses up memory and does not give it back:

* Viewing a group of problems in the Library Browser typically eats about 500MB
* Adding 2-3 problems to a HW set uses 100-300MB
* Generating a hardcopy uses 80-130MB
* When a student solves a HW problem, 50-60MB are used
* etc.

None of the memory is given back after the operation is over.

When free -m tells me that I'm down to a few megabytes I print out the processes, in order of the memory they use. I typically find that there is one process owned by mysql that consumes about half of the memory and about half a dozen processes owned by www-data that consume 6-7% each.
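For reference, one way to print processes sorted by memory use (this is a sketch using the standard procps `ps` flags available on Ubuntu 16.04; the exact command I use may differ):

```shell
# List the ten processes with the largest resident memory (RSS),
# showing owner, PID, %MEM, and the command line.
# Header row plus 10 processes = 11 lines.
ps aux --sort=-%mem | head -n 11
```

On my system the top entry is typically the mysqld process, followed by the www-data (apache) processes.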

If I keep going, WeBWorK crashes, but usually gives back just enough memory to get going again. The only way to recover all the memory is to reboot.

Right now I'm the only one using the system. If a class starts using it, then it will crash before the evening is out, even if my system has 16GB of memory.

So I'm looking for a way to make these processes stop once they have done their work.

At the moment I don't know whether the problem is specific to the way Digital Ocean configures Ubuntu 16.04, or whether it's something that can be handled by adjusting WeBWorK settings.

In reply to Homer White

Re: WeBWorK gradually eats up system memory (Digital Ocean Ubuntu 16.04 server)

by Danny Glin -
The thread that Andras linked to has some more detail on this. It is a known issue with WeBWorK, but because of the way the system is designed it can't easily be fixed. It is typically mitigated with appropriate apache settings.

Note that when you load a typical WeBWorK page you are employing several apache processes (the ones owned by www-data). There will be one of these doing the "heavy lifting" of processing all of the problems, but others serve things like the static images and the CSS and JavaScript for the page. Because not all requests to the web server are resource-intensive, it is generally not a good idea to kill off apache processes after they have served one request: you would see a noticeable degradation in performance, since apache would have to create a new process for every request.

Enter the MaxConnectionsPerChild setting in the apache configuration.  This controls how many requests a process serves before being killed off.  The extreme would be to set this to 1, which I don't recommend for the above reasons.  For a typical WeBWorK server we have found that 50-100 is a good number.

You should note also that your test usage is not entirely representative of what would happen during a term.  You may be making a lot of requests which load many problems at once (hardcopies, library browser, etc.).  When many students are using the system, there are a lot more small requests of the server (a single problem, navigation pages, loading individual images, etc.) in between, which means that a typical apache process could handle many requests without growing too much in memory usage.

With the right settings, many people use servers with 4GB of RAM to serve reasonably large classes. So, long story short: set MaxConnectionsPerChild and MaxRequestWorkers as low as you need to so that your system doesn't run out of memory, but be aware that the lower you set these values, the slower your server will typically be at serving webpages.
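As a sketch, the relevant prefork MPM directives look something like the following (the file path assumes a stock Ubuntu 16.04 apache 2.4 install at /etc/apache2/mods-available/mpm_prefork.conf, and the values are illustrative starting points, not tuned numbers):

```apache
<IfModule mpm_prefork_module>
    StartServers             2
    MinSpareServers          2
    MaxSpareServers          5
    # Cap the total number of apache children so that peak usage,
    # roughly MaxRequestWorkers x (per-process size) plus MySQL,
    # stays within physical RAM.
    MaxRequestWorkers        10
    # Recycle each child after this many requests, so leaked
    # memory is returned to the system.
    MaxConnectionsPerChild   50
</IfModule>
```

After editing, reload apache (e.g. with `sudo apache2ctl graceful`) so the new limits take effect without dropping active connections.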

In reply to Danny Glin

Re: WeBWorK gradually eats up system memory (Digital Ocean Ubuntu 16.04 server)

by Homer White -

Yes, that does look like my problem, and I think I can deal with it.

Thanks to you, too, Andras, for the link!