Context
I have a lambda with the following handler function:
```python
import json
from langchain_community.llms import ollama

def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from LLM Lambda!')
    }
```
I have a `.venv` active and ran `pip install langchain`. Running `pip list` locally, I get:
```
aiohttp             3.9.1
aiosignal           1.3.1
annotated-types     0.6.0
anyio               4.2.0
async-timeout       4.0.3
attrs               23.2.0
certifi             2023.11.17
charset-normalizer  3.3.2
dataclasses-json    0.6.3
exceptiongroup      1.2.0
frozenlist          1.4.1
greenlet            3.0.3
idna                3.6
jsonpatch           1.33
jsonpointer         2.4
langchain           0.1.3
langchain-community 0.0.15
langchain-core      0.1.15
langsmith           0.0.83
marshmallow         3.20.2
multidict           6.0.4
mypy-extensions     1.0.0
numpy               1.26.3
packaging           23.2
pip                 23.3.2
pydantic            2.5.3
pydantic_core       2.14.6
PyYAML              6.0.1
requests            2.31.0
setuptools          58.0.4
sniffio             1.3.0
SQLAlchemy          2.0.25
tenacity            8.2.3
typing_extensions   4.9.0
typing-inspect      0.9.0
urllib3             2.1.0
yarl                1.9.4
```
As you can see, the `pydantic_core` package is present.
From `.venv/lib/python3.9/site-packages` I'm copying, renaming, and zipping `site-packages` into `python.zip`, then uploading it to an S3 bucket.
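The copy/rename/zip step can be sketched like this (the `build/` staging directory and the function name are my own; the `site-packages` path is from my setup):

```python
import shutil
from pathlib import Path

def build_layer_zip(site_packages=Path(".venv/lib/python3.9/site-packages"),
                    out=Path("python.zip")):
    """Copy site-packages into a top-level 'python/' folder and zip it,
    which is the directory layout Lambda expects for a Python layer."""
    staging = Path("build/python")
    if staging.parent.exists():
        shutil.rmtree(staging.parent)            # start from a clean staging dir
    shutil.copytree(site_packages, staging)      # copy + rename to "python"
    # Archive the contents of build/, so entries are prefixed with "python/"
    shutil.make_archive(str(out.with_suffix("")), "zip", root_dir=staging.parent)
    return out
```

After this, `python.zip` is what gets uploaded to the S3 bucket.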
I then create a Lambda layer with the following configs, referencing the zip file's S3 object URL, like this: `https://bucketname.s3.us-east-2.amazonaws.com/lib/python.zip`
THE PROBLEM
The Lambda function is currently crashing with the following error message:

```json
{
  "errorMessage": "Unable to import module 'lambda_function': No module named 'pydantic_core._pydantic_core'",
  "errorType": "Runtime.ImportModuleError",
  "requestId": "7b2bba93-151f-4168-86fd-9ddad2d787fe",
  "stackTrace": []
}
```
Notes on what I have tried:

- If I comment out the `from langchain_community.llms import ollama` line in the Lambda source code, it runs without problems.
- I also tried uploading the zip file directly to the layer, with the same result.
- I did the whole same process but uploading and importing the `requests` package instead of `langchain`, and the Lambda ran successfully.
- I'm using Python 3.9 and installing packages for an x86_64 architecture.
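One extra check, since `pydantic_core._pydantic_core` is a compiled extension: the binary's filename encodes the platform it was built for, so listing it in the copied `site-packages` shows whether it matches Lambda's Linux x86_64 runtime (the helper name below is mine):

```python
from pathlib import Path

def list_compiled_extensions(site_packages):
    """Return the filenames of pydantic_core's compiled extension modules.
    A name like '_pydantic_core.cpython-39-x86_64-linux-gnu.so' matches the
    Python 3.9 x86_64 Lambda runtime; a '-darwin.so' or '.pyd' name means
    the wheel was built for another OS."""
    root = Path(site_packages) / "pydantic_core"
    return sorted(p.name for p in root.glob("*.so")) + \
           sorted(p.name for p in root.glob("*.pyd"))
```

Running this against the `site-packages` that went into `python.zip` would tell me whether the extension module that Lambda says is missing is even the right binary for its platform.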