Security of Python's eval() on untrusted strings?

Solution 1:

eval() will allow malicious data to compromise your entire system, kill your cat, eat your dog and make love to your wife.

There was recently a thread on the python-dev list about how to do this kind of thing safely, and the conclusions were:

  • It's really hard to do this properly.
  • It requires patches to the Python interpreter to block many classes of attacks.
  • Don't do it unless you really need to.

Start here to read about the challenge: http://tav.espians.com/a-challenge-to-break-python-security.html

What situation do you want to use eval() in? Do you want users to be able to execute arbitrary expressions, or do you just need to transfer data in some way? Perhaps it's possible to lock down the input format instead.
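
As one illustration of how hard "locking it down" really is (a standard demonstration, not specific to that thread): even the obvious trick of passing an empty __builtins__ mapping to eval() leaves every loaded class reachable from a bare literal.

    # Even with builtins removed, an expression can walk from a literal
    # to every class loaded in the interpreter:
    classes = eval("().__class__.__base__.__subclasses__()",
                   {"__builtins__": {}}, {})
    print(len(classes))  # hundreds of classes, despite the empty "sandbox"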

Solution 2:

You cannot secure eval with a blacklist approach like this. See "Eval really is dangerous" for examples of input that will segfault the CPython interpreter, give access to any class you like, and so on.
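
As a sketch of why blacklists fail (the filter and payload below are hypothetical, just to show the pattern): any check on the raw string can be defeated by assembling the forbidden names at evaluation time.

    BLACKLIST = ("import", "os", "sys", "open", "exec")

    def naive_safe_eval(expr):
        # Reject expressions containing "dangerous" substrings.
        if any(word in expr for word in BLACKLIST):
            raise ValueError("blocked")
        return eval(expr)

    # None of the forbidden names appear in the source string; adjacent
    # string literals and '+' reassemble them only when the inner eval runs.
    payload = """eval('__imp' 'ort__("o" + "s").getcwd()')"""
    print(naive_safe_eval(payload))  # prints the working directory anyway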

Solution 3:

You can get to the os module using built-in functions: __import__('os').
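
For example, nothing needs to be imported in the surrounding code for this to work:

    # __import__ is resolved from the builtins that eval() gets by default,
    # so the expression can reach the os module entirely on its own:
    print(eval("__import__('os').getcwd()"))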

For Python 2.6+, the ast module may help; in particular ast.literal_eval, which only evaluates literal structures, although it depends on exactly what you want to eval.
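
A quick sketch of the difference:

    import ast

    # literal_eval accepts only Python literal structures: strings, numbers,
    # tuples, lists, dicts, sets, booleans, and None...
    print(ast.literal_eval("{'a': [1, 2, 3], 'b': (True, None)}"))

    # ...and raises ValueError for anything that is actual code:
    try:
        ast.literal_eval("__import__('os').getcwd()")
    except ValueError as err:
        print("rejected:", err)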

Solution 4:

Note that even if you pass empty globals and locals dictionaries to eval(), it's still possible to segfault (C)Python with syntax tricks alone. For example, try this on your interpreter: eval("()"*8**5). The string expands to 32,768 chained call pairs, and the deeply nested expression tree they parse into can overflow the C stack before any code is executed.
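
If you want to try it without taking down your own interpreter, a small harness (hypothetical, assuming a POSIX system where a child killed by a signal reports a negative exit status) can run the expression in a separate process:

    import subprocess
    import sys

    # Run the crashing expression in a child process so the parent survives.
    # On CPython builds where compiling the nested calls overflows the C
    # stack, the child dies with a signal instead of a Python exception.
    code = 'eval("()" * 8**5, {"__builtins__": {}}, {})'
    result = subprocess.run([sys.executable, "-c", code])
    print("child exit status:", result.returncode)  # negative if killed by a signal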