I am trying to add the OpenAI Python library to my AWS Lambda function. I tried to add it via AWS Lambda Layers as described in this guide. However, when executing my code, the addition of this line:

import openai

leads to this error-response:

Response
{
  "errorMessage": "Unable to import module 'lambda_function': No module named 'pydantic_core._pydantic_core'",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "c19b73e3-8c6d-4564-8be6-af7b03a79e00",
  "stackTrace": []
}

Function Logs
START RequestId: c19b73e3-8c6d-4564-8be6-af7b03a79e00 Version: $LATEST
[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': No module named 'pydantic_core._pydantic_core'
Traceback (most recent call last):
END RequestId: c19b73e3-8c6d-4564-8be6-af7b03a79e00
REPORT RequestId: c19b73e3-8c6d-4564-8be6-af7b03a79e00  Duration: 5.02 ms   Billed Duration: 6 ms   Memory Size: 128 MB Max Memory Used: 40 MB  Init Duration: 166.39 ms

When following along with the guide using the same package as in it (requests), importing that package doesn't lead to an error.

Are there possible explanations for this behavior? I don't know where to start looking for fixes. Or are there any alternative methods of importing the library?

6 Answers

I was running into the same issue over and over again, and I finally found the solution. You have a couple of options to tackle this; in my case I used the Serverless Framework, but even if you're using SAM, that's fine as well.

i) The quick approach is to dockerize your Lambda and change the architecture to arm instead of x86. You can dockerize your Lambda using this Dockerfile:

FROM public.ecr.aws/lambda/python:3.10
COPY requirements.txt .
RUN  pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"
COPY . ${LAMBDA_TASK_ROOT}
CMD ["handler.generate_code"]
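As a sketch of how you might build and deploy that image for arm64 (the repository name, function name, `<account-id>`, and `<region>` below are placeholders, not part of the original answer):

```shell
# Build the image for arm64 to match the Lambda architecture
docker build --platform linux/arm64 -t my-lambda .

# Log in to ECR and push the image (repository name is hypothetical)
aws ecr get-login-password --region <region> | \
  docker login --username AWS --password-stdin <account-id>.dkr.ecr.<region>.amazonaws.com
aws ecr create-repository --repository-name my-lambda
docker tag my-lambda:latest <account-id>.dkr.ecr.<region>.amazonaws.com/my-lambda:latest
docker push <account-id>.dkr.ecr.<region>.amazonaws.com/my-lambda:latest

# Point the (container-image-based) function at the new image
aws lambda update-function-code \
  --function-name my-function \
  --image-uri <account-id>.dkr.ecr.<region>.amazonaws.com/my-lambda:latest
```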

ii) The second option is to use the serverless-python-requirements plugin if you're using the Serverless Framework, with the following configuration:

pythonRequirements:
  dockerizePip: non-linux
  dockerRunCmdExtraArgs: ["--platform", "linux/x86_64"]
  usePipenv: false
  slim: true
  layer: true

This one is flexible because you can target either x86 or arm depending on dockerRunCmdExtraArgs. Hope this helps.
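If the plugin isn't in your project yet, it can be added with the Serverless Framework's plugin installer (this assumes a standard serverless.yml project layout):

```shell
# Installs the plugin as a dev dependency and registers it in serverless.yml;
# the pythonRequirements block above lives under the top-level "custom:" key.
serverless plugin install -n serverless-python-requirements
```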



Your uploaded Lambda environment/layer does not have pydantic installed, which is a requirement of the openai Python package.

Ensure that all the required packages are installed in your folder where you have your python Lambda code before zipping and uploading to AWS via the console.

That being said, there is a similar question and answer that refers to the same part of the documentation as yours.


The Layer which you uploaded to Lambda doesn't have all dependent packages.

Here are the steps to create a layer for Python; I hope it helps.

  • Check the Python version on your local machine and in Lambda (they should match), as some packages don't import properly when the versions differ.
  • Create the zip file as shown below:

# Create a virtual environment for Python 3
python3 -m venv venv
source venv/bin/activate

# Create a directory named "python"
mkdir python
cd python

# Install your packages in this folder
pip3 install pycryptodome -t .

# Once installed, check the packages, then create a zip of the python directory
cd ..
zip -r deployment_name.zip python

  • Now you can upload the .zip file to your AWS Lambda layer. Make sure the zip includes everything inside the python folder; don't miss any files or folders, as your main package may depend on them.
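Lambda unpacks a layer into /opt, and for Python only a top-level python/ directory ends up on sys.path, so the archive layout matters. As a quick, portable sanity check (sketched here with a placeholder file standing in for your pip-installed packages), Python's built-in zipfile module can build and list the archive:

```shell
# Demonstration with a placeholder module; in practice "python/" holds
# the packages you installed with pip.
mkdir -p python
touch python/placeholder.py

# Build the zip and list its contents: every entry should start with "python/"
python3 -m zipfile -c deployment_name.zip python/
python3 -m zipfile -l deployment_name.zip
```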

For Windows

  • Using PowerShell or Command Prompt, perform the following steps (with admin privileges):
mkdir layers
cd layers
mkdir python
cd python
pip3 install pycryptodome -t ./
  • Now open the python folder in File Explorer.

  • Zip the python folder along with its child folders.

  • Now you are good to go and can upload it as a layer in AWS.

It would be great if you have a Linux machine; alternatively, create an EC2 instance with an Ubuntu AMI, copy the content to S3, and then upload it from the S3 bucket to the AWS Lambda layer directly.
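If you don't have a Linux machine or an EC2 instance handy, another common approach (a sketch, assuming Docker is installed and the image tag matches your Lambda runtime) is to run pip inside AWS's own build image so any compiled wheels match Lambda's environment:

```shell
# Install packages into python/ using the same OS/arch as the Lambda runtime
docker run --rm \
  -v "$PWD":/work -w /work \
  --platform linux/x86_64 \
  public.ecr.aws/sam/build-python3.10 \
  pip install pycryptodome -t python/
```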


I've tried all the answers listed here, but none worked for me, and they were quite tiresome. To resolve the issue, install the older version of the OpenAI library from before they migrated (installing older versions might not be best practice, but it is the simplest fix):

Here are the step-by-step instructions:

  1. Create an 'openAI' directory:

    mkdir openAI

  2. Move into the directory

    cd openAI

  3. Install the older version of the OpenAI library:

    pip install openai==0.28 --target aws-layer/python/lib/python3.12/site-packages

  4. Zip the 'python' folder and upload it to AWS as a layer.
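The steps above, plus publishing the layer, can be sketched with the AWS CLI (the layer name here is hypothetical, and the compatible runtime must match your function's):

```shell
# Zip the layer contents and publish them as a new layer version
cd openAI/aws-layer
zip -r openai-layer.zip python
aws lambda publish-layer-version \
  --layer-name openai-v0-28 \
  --zip-file fileb://openai-layer.zip \
  --compatible-runtimes python3.12
```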

Additionally, if you must use the newest version you'll need to migrate the library before adding it as an AWS layer. You can follow the migration guide provided here.


I had the same issue; it seems the problem is related to your runtime and architecture settings. Following this issue:

https://github.com/pydantic/pydantic/issues/6557#issuecomment-1633718474

I tried using Python 3.10 and the x86_64 architecture, and it solved the problem.
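To see what runtime and architecture your function is currently configured with (the function name below is a placeholder):

```shell
# Query just the runtime and architecture fields of the function config
aws lambda get-function-configuration \
  --function-name my-function \
  --query '{runtime: Runtime, arch: Architectures}'
```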


In my case, this issue was caused by a mismatch between the Python version used to create the layer and the Python version used to run the Lambda. I downgraded the Lambda runtime's Python version to match the one I used when creating the layer zip file, and everything worked fine.
