[Sciserver-users] Compute Jobs
Jonas Haase
jhaase at mpe.mpg.de
Tue Dec 1 14:51:31 CET 2020
Hi Johannes
Yes, you should be able to do that: either run the loop over the files in a single job, perhaps spawning subprocesses, or create a job for each file (or a group of them) using the SciServer Python API. In the latter case the system will queue the tasks and execute 10 of them at a time.
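For the first route, a minimal sketch of how this could look inside a single job: spawn one subprocess per file, with at most 8 running at once to match the container's cores. The script name `process_one.py` is a placeholder for your own script, and the `cmd` parameter is just there to make the helper reusable.

```python
# Sketch: process many files inside one compute job by running the
# per-file script as a subprocess, at most N_WORKERS at a time.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

N_WORKERS = 8  # matches the 8 cores available in the job container

def run_all(directory, cmd=("python", "process_one.py")):
    """Run `cmd <file>` for every file in `directory`; return (path, returncode) pairs."""
    files = sorted(Path(directory).glob("*"))

    def run_one(path):
        # Each call spawns an independent subprocess for one file.
        result = subprocess.run([*cmd, str(path)], capture_output=True, text=True)
        return path, result.returncode

    # Threads are enough here: the work happens in the child processes,
    # the pool only limits how many run concurrently.
    with ThreadPoolExecutor(max_workers=N_WORKERS) as pool:
        return list(pool.map(run_one, files))
```

With a few minutes per file and 8 workers, 27000 files would take on the order of a week of wall-clock time in a single job, so splitting across several jobs may be preferable.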
The containers the compute jobs run in are currently set up with 8 cores (inherited from the VM they run on) and a memory limit of 64 GB.
There is also an execution time limit of 48 hours.
All these things can be changed if desired!
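If you go the job-per-group route instead, the main work on your side is splitting the file list into chunks and submitting one job per chunk. In the sketch below, `submit_job` is a hypothetical placeholder for the actual SciServer Jobs API submission call (not a real function from the package), and `process_one.py` stands in for your script.

```python
# Sketch: split the file list into groups and submit one job per group.
# `submit_job` is a placeholder for the real SciServer Jobs API call.
from pathlib import Path

def chunks(items, size):
    # Yield successive groups of at most `size` items.
    for i in range(0, len(items), size):
        yield items[i:i + size]

def submit_all(directory, group_size=1000, submit_job=print):
    files = sorted(str(p) for p in Path(directory).glob("*"))
    for group in chunks(files, group_size):
        # Each job runs the per-file script over its group of files;
        # replace `submit_job` with the real SciServer submission call.
        submit_job("python process_one.py " + " ".join(group))
```

Grouping keeps you well under any per-user job limits: 27000 files in groups of 1000 means only 27 jobs, of which 10 run at a time.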
The Compute Jobs page indeed looks weird; I have not seen it like that before. I assume the two buttons cannot be pressed?
Perhaps it is a browser issue? I just tried with a couple of users in Safari and Chrome and it seems to work.
cheers
Jonas
> On 1. Dec 2020, at 14:21, Johannes Buchner <jbuchner at mpe.mpg.de> wrote:
>
> Hi Jonas,
>
> I have a python script that I would like to let loose on each file in a
> directory (27000 files). The script takes a few minutes per file.
> Can I do this with compute jobs? That page doesn't seem to load
> correctly for me (see image attached). How many cores would be available?
>
> Cheers,
> Johannes
> --
> Dr. Johannes Buchner
> Postdoctoral Researcher
> Max Planck Institute for Extraterrestrial Physics
> Garching, Germany
> Fellow of the ORIGINS excellence cluster Data Science lab
> http://astrost.at/istics/
> <Screenshot from 2020-12-01 14-20-36.png>
> --
> Sciserver-users mailing list
> Sciserver-users at lists.mpe.mpg.de
> https://lists.mpe.mpg.de/cgi-bin/mailman/listinfo/sciserver-users