Python command-line scripts with argparse

I find myself forgetting the exact syntax of how to pass arguments to my scripts when running from the command line, so here's a template file as a kickstart!

Passing parameters to your Python script is useful for a number of reasons. For example, sometimes I want to run the same script on a different set of users. And if the script is particularly important, I usually include a --production-run optional argument so that, without it, no database records are updated (ie a dry run by default).

What *not* to use

Before I dive into the template file which I'm currently using, note that there are 2 common approaches which I am *not* using.

The first is using sys.argv. This is actually used "under the hood" in my recommended template (using argparse) but I've found using it directly to be trickier than it needs to be.

Here's a short snippet. Again, don't use this.

def run(some_variable):
    ...  # some code here

if __name__ == "__main__":
    import sys
    if len(sys.argv) > 1:
        some_variable = sys.argv[1]
    else:
        some_variable = False
    run(some_variable)


Here's what's annoying about this:

  1. You have to keep checking the length of sys.argv when dealing with optional arguments. The first argument (ie sys.argv[0]) is always the name of the script which you are running.
  2. None of the arguments are explicitly named, so you have to rely on reading the code / comments to determine what `sys.argv[1]`, `sys.argv[2]`, etc refers to.
  3. There's no help syntax output on the command line, so you have to open the script and inspect it.
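
One more gotcha worth adding: everything in sys.argv arrives as a string, so numeric arguments need manual conversion (argparse's type=int, shown later, does this for you). A quick illustration with a simulated argv:

```python
import sys

# Simulate `python my_script.py 1234` (assigning sys.argv only for illustration)
sys.argv = ["my_script.py", "1234"]

user_id = sys.argv[1]
print(type(user_id))   # <class 'str'> - always a string

user_id = int(sys.argv[1])  # manual conversion needed
print(type(user_id))   # <class 'int'>
```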

Another library I used to use is optparse. Don't: it has been deprecated since Python 2.7. My old optparse scripts looked like this:

def run(some_variable):
    ...  # some code here

if __name__ == "__main__":
    from optparse import OptionParser

    parser = OptionParser()
    parser.add_option("-u", "--user", help="Update a specific user")

    (options, args) = parser.parse_args()
    run(options.user)


Again, don't use this.

What to use - argparse

Ok, enough fluffing around. Here's what I'm using today, with argparse for optional and required arguments.

class MyFancyClass:

    def __init__(self, user_id=None, production_run=False):
        self.production_run = production_run
        self.user_id = user_id

    def run(self):
        self._my_private_method1()
        self._my_private_method2()

    # --- private

    def _my_private_method1(self):
        if self.production_run:
            ...  # eg update database records here

    def _my_private_method2(self):
        if self.production_run:
            ...  # eg update database records here

if __name__ == "__main__":
    """
    Command line usage:
    `python -m path.to.my_file -h` => show help

    `python -m path.to.my_file --production-run` => runs for all users, will update DB.
    `python -m path.to.my_file` => runs for all users, will *not* update DB (ie dry run).
    `python -m path.to.my_file --production-run -u 1234` => runs only for user 1234, will update DB.
    `python -m path.to.my_file -u 1234` => runs only for user 1234, will *not* update DB (ie dry run).
    """

    import argparse

    parser = argparse.ArgumentParser(description="My fancy script which does something.")
    parser.add_argument('--production-run', '-p', action='store_true', help="Will update fields (default: false)")
    parser.add_argument('--user_id', '-u', help="Only process a specific user", type=int)

    args = parser.parse_args()
    run_once = MyFancyClass(user_id=args.user_id, production_run=args.production_run)
    run_once.run()
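
The template above only uses optional arguments. When I need an argument to be mandatory, argparse's required=True covers that. A minimal sketch (the flag names here are just illustrative):

```python
import argparse

parser = argparse.ArgumentParser(description="Sketch of a required argument.")
# required=True makes argparse exit with an error if the flag is missing
parser.add_argument('--user-id', '-u', type=int, required=True,
                    help="User to process (required)")
parser.add_argument('--production-run', '-p', action='store_true',
                    help="Will update fields (default: false)")

# parse_args() accepts an explicit list, which is handy for testing
args = parser.parse_args(['--user-id', '1234'])
print(args.user_id)         # 1234
print(args.production_run)  # False
```

Note that argparse converts the dashes in `--user-id` to an underscore, so the value lands on `args.user_id`.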

What I like about using this:

  1. I don't have to worry about whether an argument was passed in; it falls back to the default (or None).
  2. It allows me to add help text, so any user can run `python -m path.to.my_file -h` on the command line and see the available arguments.
  3. It is self-documenting! I didn't really need to add that big comment block, as once you start working with this, the code itself is clear enough.
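
If you want to see that help text without going via a shell, argparse can also render it programmatically with format_help(). A sketch using the same arguments as the template above:

```python
import argparse

parser = argparse.ArgumentParser(prog="my_file",
                                 description="My fancy script which does something.")
parser.add_argument('--production-run', '-p', action='store_true',
                    help="Will update fields (default: false)")
parser.add_argument('--user_id', '-u', type=int,
                    help="Only process a specific user")

# format_help() returns the same text that `-h` prints, without exiting
print(parser.format_help())
```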