I need some advice.
I trained an image classifier with TensorFlow and wanted to deploy it to AWS Lambda using the Serverless Framework. The deployment directory includes the model, some Python modules (including TensorFlow and NumPy), and my Python code. The complete folder is 340 MB unzipped, which AWS Lambda rejects with the error "The unzipped state must be smaller than 262144000 bytes" (250 MB).
How should I approach this? Can I not deploy packages like these on AWS Lambda?
Note: The requirements.txt file lists two modules, numpy and tensorflow. (TensorFlow is a large package.)
Solution
I know I am answering this very late; I'm just putting it here for reference for other people.

I did the following things:
Delete the /external/*, /tensorflow/contrib/*, and /tensorflow/include/unsupported/* files from the packaged dependencies, as suggested here.
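For example, a minimal pruning sketch in Python; the `.serverless/requirements` staging directory is an assumption, so point `BUILD_DIR` at wherever your packaging step unpacks the dependencies:

```python
import shutil
from pathlib import Path

# Assumed staging directory where the packaged dependencies live.
BUILD_DIR = Path(".serverless/requirements")

# Directories the steps above suggest deleting for inference-only deployments.
PRUNE_DIRS = [
    "external",
    "tensorflow/contrib",
    "tensorflow/include/unsupported",
]

for rel in PRUNE_DIRS:
    target = BUILD_DIR / rel
    if target.exists():
        shutil.rmtree(target)
        print(f"removed {target}")
```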
Strip all .so files, especially the two in site-packages/numpy/core: _multiarray_umath.cpython-36m-x86_64-linux-gnu.so and _multiarray_tests.cpython-36m-x86_64-linux-gnu.so. Stripping considerably reduces their size.
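A sketch of the stripping step, assuming the binutils `strip` tool is available on the build machine; it removes debugging symbols from the shared objects, which is where most of the savings come from:

```python
import subprocess
from pathlib import Path

BUILD_DIR = Path(".serverless/requirements")  # assumed staging directory, as above

# Strip debug symbols from every shared object in the package.
for so_file in BUILD_DIR.rglob("*.so"):
    # check=False: some .so files may not be strippable; skip failures.
    subprocess.run(["strip", str(so_file)], check=False)
    print(f"stripped {so_file}")
```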
You can put your model in an S3 bucket and download it at runtime instead of bundling it into the zip. This keeps the model out of the deployment package and reduces the size of the zip. This is explained in detail here.
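A handler sketch along those lines; the bucket name, object key, and model format are all hypothetical placeholders:

```python
import os
import boto3

MODEL_BUCKET = "my-model-bucket"   # hypothetical bucket name
MODEL_KEY = "classifier/model.h5"  # hypothetical object key
LOCAL_PATH = "/tmp/model.h5"       # /tmp is the only writable path in Lambda

_model = None

def _get_model():
    """Download the model from S3 on the first invocation and cache it for warm starts."""
    global _model
    if _model is None:
        if not os.path.exists(LOCAL_PATH):
            boto3.client("s3").download_file(MODEL_BUCKET, MODEL_KEY, LOCAL_PATH)
        import tensorflow as tf  # deferred import keeps module load a bit lighter
        _model = tf.keras.models.load_model(LOCAL_PATH)
    return _model

def handler(event, context):
    model = _get_model()
    # ... run inference on the event payload with `model` ...
    return {"statusCode": 200}
```

Keep in mind that /tmp offers 512 MB of scratch space by default, so the model has to fit there; caching the loaded model in a module-level variable avoids re-downloading it on warm invocations.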
If this is still not enough, there are additional things you can do, such as removing .pyc files, as mentioned here.
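A sketch of that cleanup, under the same staging-directory assumption; bytecode is safe to delete because Python regenerates it at import time:

```python
import shutil
from pathlib import Path

BUILD_DIR = Path(".serverless/requirements")  # assumed staging directory, as above

# Remove compiled bytecode caches and stray .pyc files.
for cache_dir in list(BUILD_DIR.rglob("__pycache__")):
    shutil.rmtree(cache_dir)
for pyc in list(BUILD_DIR.rglob("*.pyc")):
    pyc.unlink()
```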