
I'm trying to run my PySpark job code in an AWS Lambda function, but I'm getting the following error: Unable to import module 'lambda_function': No module named 'pyspark'

My PySpark job has the imports below. How can I make these libraries available to my Lambda function? How do I install them in the Lambda environment?

from pyspark.sql import SparkSession
from pyspark.sql import functions as F, Column as col
from pyspark.sql.functions import when, coalesce
from pyspark.sql.functions import lit

Thanks.

2 Answers


When you get this error

Unable to import module 'lambda_function'

you must check three things:

  1. File permissions
  2. The handler format, which must be python_filename.lambda_handler
  3. How you compress the files: use zip -r name.zip . (hidden files are only added when the zip command ends with a dot)



You need to package all of your function's dependencies together with your code into the deployment zip file.

https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html

