
I'm using the AWS EC2 service with the awscli. Now I want to put all the commands I type in the console into a Python script. I see that if I write import awscli inside a Python script it imports fine, but I don't understand how to use it from within the script. For instance, how do I execute the command aws ec2 run-instances <arguments> inside the Python script after import awscli? Just to make it clear: I'm not looking for a solution like os.system('aws ec2 run-instances <arguments>'); I'm looking for something like

import awscli
awscli.ec2_run-instances(<arguments>)

6 Answers


You can do it with the brilliant sh package, which can do the wrapping for you so the shell command reads like a Python call.

import sh
s3 = sh.bash.bake("aws s3")
s3.put("file","s3n://bucket/file")

4 Comments

The sh package is definitely interesting, thanks for the pointer. It is not available for Windows, though (I will use it on my *nix platforms).
This should be the accepted answer, given the missing features and odd limits of boto3.
This only outputs things like "<Command '/usr/bin/bash aws s3 list'>" instead of expected aws output
I get STDERR: /usr/bin/bash: aws s3: No such file or directory

The CLI is better suited to the shell prompt; for a nicer Python API, check out the boto library. This example shows how to launch an instance: http://boto.readthedocs.org/en/latest/ec2_tut.html

1 Comment

Except the boto library can't do some things that the awscli can, e.g. s3 sync.

Boto3 doesn't have everything the CLI has, so you may have to use something from the CLI in a script once in a blue moon. I can't find an analog for aws deploy push in boto3, for example, so here is how I push to S3 with the CLI from a Python script. To Julio's point, though, I use boto for everything else.

import subprocess

cmd = 'aws deploy push --application-name SomeApp --s3-location s3://bucket/Deploy/db_schema.zip --ignore-hidden-files'
push = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
push.wait()  # returncode is only populated once the process has finished
print(push.returncode)
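If you also want the command's output, not just the return code, communicate() is the usual pattern. A minimal sketch, with a placeholder command standing in for the real aws invocation so it runs anywhere:

```python
import subprocess
import sys

# Placeholder command; substitute your real `aws deploy push ...` here.
cmd = [sys.executable, '-c', "print('pushed')"]

proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()  # waits for the process to exit
print(out.decode().strip())    # captured stdout
print(proc.returncode)         # populated after communicate()
```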



You can use awscli directly in Python:

from awscli.clidriver import create_clidriver
cli_driver = create_clidriver()
result = cli_driver.main(args=["s3api", "list-buckets"])

That way you trigger the command, but the result will only contain the return code. I haven't found a way to capture the tool's actual output. Additionally, the process will exit if things go wrong.

So I don't recommend using this; I just wanted to add it for informational purposes.

1 Comment

Just in case you want to go down this path (like I did when writing a daemon for the AWS CLI), you can capture outputs/errors by overriding sys.stdout and sys.stderr.
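A minimal sketch of that capture pattern, with a placeholder function standing in for cli_driver.main (awscli itself isn't required to show the idea):

```python
import contextlib
import io

def fake_cli_main(args):
    # Stand-in for awscli's cli_driver.main; prints like the real tool would.
    print("bucket-a")
    print("bucket-b")
    return 0

# Redirect stdout into a buffer while the "CLI" runs.
buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    rc = fake_cli_main(["s3api", "list-buckets"])

captured = buf.getvalue().splitlines()
print(rc, captured)
```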

You can run an aws cli command using subprocess in a Python script. For instance, to get the S3 bucket list:

import subprocess

push=subprocess.call(['aws', 's3', 'ls', '--recursive', '--human-readable', '--summarize'])

or

import subprocess

push=subprocess.run(['aws', 's3', 'ls', '--recursive', '--human-readable', '--summarize'])

Hope this helps.
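If you need the listing as a string rather than printed straight to the terminal, subprocess.run can capture it with capture_output=True (Python 3.7+). A sketch with a placeholder command in place of the aws call:

```python
import subprocess
import sys

# Placeholder standing in for ['aws', 's3', 'ls', '--recursive', ...]
result = subprocess.run(
    [sys.executable, '-c', "print('2019-06-01 bucket-name')"],
    capture_output=True,
    text=True,
)
print(result.returncode)      # 0 on success
print(result.stdout.strip())  # the command's stdout as a string
```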



A fix/example for smokeny's answer using sh. Posted as an answer since I can't comment yet and editing isn't working.

from sh import aws
aws("s3","cp","s3://folder/", ".", "--recursive", "--exclude", "*", "--include", "*.txt")

