Using Python in AWS Lambda can be tricky in the following scenarios:

  • Some dependencies are not included in the AWS Python Lambda runtime.
  • Some dependencies require compiled binaries (packages such as numpy, sklearn, etc.).

For the first case, it is still possible to develop on your laptop and to upload the lambda via S3. For dependencies with binaries, you have to build the package on a real EC2 instance.

Libraries without binaries

This is the case, for example, for the Elasticsearch dependency. To make the lambda work, the easiest approach is to zip the code together with the libraries and send the archive to Amazon via S3.
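As a concrete (hypothetical) sketch, the handler.py that gets zipped could look like the following; the endpoint, the index name, and the optional client parameter are illustrative placeholders, not part of any required Lambda signature:

```python
# handler.py - a minimal sketch of the Lambda entry point the script zips up.
# The endpoint and index name are placeholders; adapt them to your cluster.
def lambda_handler(event, context, client=None):
    if client is None:
        # Resolved from the site-packages bundled into the zip archive.
        from elasticsearch import Elasticsearch
        client = Elasticsearch(["https://your-es-endpoint:9200"])
    # Run a trivial match-all query and return the hit count.
    result = client.search(index="logs", body={"query": {"match_all": {}}})
    return {"hits": result["hits"]["total"]}
```

The optional `client` argument is only there so the function can be exercised with a stub in local tests; Lambda itself calls the handler with two arguments.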

This is easy to do with a shell script such as the one shown below. The good news is that this script can still be run on your own laptop (provided your Python installation is working).

#!/bin/bash

if [ "$#" -ne 1 ]; then
    echo "Illegal number of parameters. Specify a file name."
    exit 1
fi

file="$1.zip"
echo "---------------------"
echo "Creating lambda $file"
echo "---------------------"

rm -f "$file"
rm -rf ./venv

############################
#create virtual environment
############################

virtualenv -p python3 ./venv
source ./venv/bin/activate

##################
# Python modules
##################

pip3 install elasticsearch

###############
# Create Zip
###############

zip -9 "$file" handler.py
# Adjust python3.6 to match the Python version in your virtualenv
cd ./venv/lib/python3.6/site-packages/
zip -r9 "../../../../$file" *
cd ../../../../

#########################
#copy the zip file to s3
#########################
aws s3 cp "$file" "s3://path/toyourbucket/$file"

echo "Done."

The script takes one parameter: the name of the zip file. A few things must be adapted before running it:

  • The end of the script must be modified to store the file in one of your buckets.
  • You must run aws configure beforehand to add your access key pair to your environment, so that the aws s3 command works.
  • You must add one pip install line per dependency required by your project.
  • Don't forget to grant write permission on the bucket to your access key pair.


Dependencies with binaries

This is the case for dependencies that rely on the numpy package (pandas, sklearn, etc.). In this case it is no longer possible to build on your laptop, because the compiled binaries depend on the target platform.
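Whether a dependency falls into this category can be checked locally: compiled packages ship platform-specific shared objects (.so files on Linux/macOS, .pyd on Windows) inside site-packages. A small sketch (the helper name is mine, not a standard API):

```python
import importlib.util
import pathlib

def has_binary_modules(package_name):
    """Return True if an installed package ships compiled extension files."""
    spec = importlib.util.find_spec(package_name)
    # Built-in modules and plain modules have no package directory to scan.
    if spec is None or not spec.submodule_search_locations:
        return False
    root = pathlib.Path(spec.submodule_search_locations[0])
    # Look for platform-specific shared objects anywhere in the package.
    return any(root.rglob("*.so")) or any(root.rglob("*.pyd"))
```

With numpy installed, `has_binary_modules("numpy")` would return True, which is exactly why a zip built on macOS will not load on Lambda's Amazon Linux.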

The idea is to run a script similar to the one shown in the previous section directly on an EC2 instance. To limit the cost, we will use a nano-sized instance (e.g. t2.nano) running the Amazon Linux AMI.

  • Log into the EC2 instance.
  • Run the command aws configure to specify your access keys.
  • Run the following script:
#!/bin/bash

if [ "$#" -ne 1 ]; then
    echo "Illegal number of parameters. Specify a file name."
    exit 1
fi

file="$1.zip"

echo "******************************************"
echo "Creating lambda $file"
echo "******************************************"

rm -f "$file"
rm -rf ./venv

echo "* Creating swap file if required"

# Only create and enable the swap file on the first run; dd'ing over an
# active swap file would corrupt it.
if [ ! -f /var/swap.1 ]; then
    sudo /bin/dd if=/dev/zero of=/var/swap.1 bs=1M count=1024
    sudo /sbin/mkswap /var/swap.1
    sudo /sbin/swapon /var/swap.1
fi

echo "Done..."

############################
#create virtual environment
############################

virtualenv -p python ./venv
source ./venv/bin/activate

##################
# Python modules
##################

echo "* Installing dependencies"

pip install pandas

echo "Done..."

###############
# Create Zip
###############

echo "* Zipping function"

zip -9 "$file" handler.py

# Adjust python2.7 to match the Python version in your virtualenv
cd ./venv/lib/python2.7/site-packages/
zip -r9 "../../../../$file" *
cd ../../../../

# Compiled packages install their binaries under lib64 as well
cd ./venv/lib64/python2.7/site-packages/
zip -r9 "../../../../$file" *
cd ../../../../

echo "Done..."

#########################
#copy the zip file to s3
#########################

echo "* Copying to s3"

aws s3 cp "$file" "s3://path/toyourbucket/$file"

echo "Done..."
  • Configure your lambda in the Amazon console and you are done.
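As an alternative to the console, the function can also be created directly from the uploaded zip with the AWS CLI. A sketch, where the function name, role ARN, and bucket/key are placeholders to substitute with your own:

```shell
# All names, ARNs, and paths below are placeholders - substitute your own.
aws lambda create-function \
    --function-name my-pandas-lambda \
    --runtime python2.7 \
    --handler handler.lambda_handler \
    --role arn:aws:iam::123456789012:role/my-lambda-execution-role \
    --code S3Bucket=yourbucket,S3Key=my-pandas-lambda.zip
```

The --handler value follows the file.function convention, so handler.lambda_handler points at the lambda_handler function in handler.py.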