
I have a question about the Python interpreter. How does it treat the same script running 100 times, for example with different sys.argv entries? Does it create a separate memory space for each run, or something different?

The system is Linux, CentOS 6.5. Is there any operational limit that can be observed and tuned?
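For example, something along these lines (the script name src_python.py is just a placeholder, and subprocess is only one way to start the runs):

    # Hypothetical launcher: start the same script 100 times in parallel,
    # each invocation with a different sys.argv entry.
    import subprocess

    procs = [subprocess.Popen(["python", "src_python.py", str(i)])
             for i in range(100)]

    # Wait for all 100 independent runs to finish.
    for p in procs:
        p.wait()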

  • I don't think Python will be the thing that limits you, but rather the OS (I'm thinking of ulimit and such; see the sketch after these comments) Commented Dec 29, 2013 at 13:27
  • I'm not sure what question you're asking here. Are you asking about how a Python script can be called from multiple shell scripts concurrently (only the OS limits you, I believe), or what happens when there are different arguments provided to a script? Could you clarify your question? Thank you. Commented Dec 29, 2013 at 13:33
  • What happens when you call the same Python script multiple times with different sys.argv elements? For example, assume you run this script in a for loop from the shell, like below: "for i in $(cat file); do src_python.py $i & done". Just a random example for you to understand my question better. Commented Dec 30, 2013 at 5:33
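A minimal sketch of the limits the first comment points at, assuming the standard-library resource module on Linux (the exact values depend on the system and can be raised with ulimit, /etc/security/limits.conf, or setrlimit):

    # Read the OS-level limits that typically cap how many runs can exist
    # at once: max user processes and max open file descriptors. They
    # correspond to `ulimit -u` and `ulimit -n` in the shell.
    import resource

    nproc_soft, nproc_hard = resource.getrlimit(resource.RLIMIT_NPROC)
    nofile_soft, nofile_hard = resource.getrlimit(resource.RLIMIT_NOFILE)

    print("max user processes: soft=%s hard=%s" % (nproc_soft, nproc_hard))
    print("max open files:     soft=%s hard=%s" % (nofile_soft, nofile_hard))

Python itself imposes no extra cap on the number of concurrent interpreter processes; these OS limits are the usual tuning knobs.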

1 Answer


You won't have any problem with what you're trying to do. You can run the same script in parallel many times, with different input arguments (sys.argv entries). Each run is a separate operating-system process, so a new memory space is allocated for each one.
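As a rough illustration (a hypothetical script, not part of the question), the separation is easy to observe: every invocation is a fresh interpreter in its own process, with its own PID, its own sys.argv, and its own memory.

    # show_run.py -- hypothetical example: print which process this run is
    # and which arguments this particular invocation received.
    import os
    import sys

    print("pid=%d argv=%r" % (os.getpid(), sys.argv[1:]))

Started in a loop like the one quoted in the comments above, each run prints a different PID; the runs share nothing unless you arrange it explicitly (files, pipes, sockets, and so on).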
