This project provides a serverless API for generating AI responses using OpenAI's API. It leverages FastAPI for managing the endpoints, AWS Lambda and API Gateway for serverless deployment, and CircleCI for continuous integration and deployment.
The project has been developed as part of an accompanying blog post.
main.py: Main application code for the FastAPI app (a minimal sketch follows this list).
build-sam.sh: Script to build the Lambda deployment package.
template.yaml: AWS SAM template for deploying the application.
test/: Directory containing test cases.
.circleci/config.yml: CircleCI configuration for CI/CD.
Makefile: Makefile for running common tasks.
pyproject.toml: Project dependencies and configuration.
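To give an idea of how the pieces fit together, here is a minimal sketch of what main.py might look like. It is illustrative only: it assumes the official OpenAI Python client and Mangum as the ASGI-to-Lambda adapter, and the request model, model name, and handler name are assumptions; the POST /generate route matches the curl example later in this README.

```python
# Hypothetical sketch of main.py, not the actual file.
import os

from fastapi import FastAPI
from mangum import Mangum  # adapter that lets API Gateway/Lambda invoke the ASGI app
from openai import OpenAI
from pydantic import BaseModel

app = FastAPI()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])


class PromptRequest(BaseModel):
    prompt: str


@app.post("/generate")
def generate(request: PromptRequest) -> dict:
    # Forward the prompt to OpenAI and return the model's reply as JSON.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": request.prompt}],
    )
    return {"response": completion.choices[0].message.content}


# Entry point referenced by the Lambda/SAM configuration.
handler = Mangum(app)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="127.0.0.1", port=8000)
```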
Clone the repository:
git clone https://github.com/yourusername/genai-aws-circleci.git
cd genai-aws-circleci
Install dependencies:
uv sync
Set up environment variables:
Create a .env file in the root directory and add your OpenAI API key:
OPENAI_API_KEY=your_openai_api_key
To run the application locally, use the following command:
uv run main.py
The API will be available at http://127.0.0.1:8000.
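For a quick local check, you can call the /generate route in the same way as the deployed example further below:
curl -X POST http://127.0.0.1:8000/generate \
-H "Content-Type: application/json" \
-d '{"prompt": "Tell me a joke!"}'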
To run the tests, use the following command:
uv run pytest
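The test cases themselves live in test/ and are not reproduced here. As a hedged example of what such a test could look like, the snippet below exercises the /generate route with FastAPI's TestClient while mocking out the OpenAI client; it assumes main.py exposes app and client as in the sketch above.

```python
# test/test_generate.py - hypothetical example, assuming main.py exposes `app` and `client`.
import os
from unittest.mock import patch

os.environ.setdefault("OPENAI_API_KEY", "test-key")  # so importing main does not fail

from fastapi.testclient import TestClient

import main


def test_generate_returns_model_reply():
    test_client = TestClient(main.app)
    with patch.object(main, "client") as mock_openai:
        # Make the mocked OpenAI call return a canned reply instead of hitting the API.
        mock_openai.chat.completions.create.return_value.choices[0].message.content = "Hello!"
        response = test_client.post("/generate", json={"prompt": "Say hello"})

    assert response.status_code == 200
    assert response.json() == {"response": "Hello!"}
```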
Build the Lambda deployment package:
chmod +x build-sam.sh
./build-sam.sh
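The exact contents of build-sam.sh are not reproduced here; a typical version of such a script, sketched below under the assumption that dependencies are exported from uv and installed into a local build/ directory for Lambda, might look like this:

```bash
#!/usr/bin/env bash
# Hypothetical outline of build-sam.sh; the real script may differ.
set -euo pipefail

rm -rf build
mkdir -p build

# Export the locked dependencies and install them alongside the code for Lambda.
uv export --format requirements-txt --no-dev -o requirements.txt
pip install -r requirements.txt --target build/

# Copy the application code into the deployment package.
cp main.py build/
```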
Deploy using AWS SAM:
sam build
sam deploy --stack-name sam-app --resolve-s3 --capabilities CAPABILITY_IAM --region eu-central-1
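template.yaml defines how API Gateway and Lambda are wired to the FastAPI handler. The outline below is an assumption-laden sketch rather than the actual template: resource names, the runtime version, and the build/ code location are illustrative, and the explicit dev stage only mirrors the /dev prefix in the endpoint URL below.

```yaml
# Hypothetical outline of template.yaml; names and values are illustrative.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  GenerateApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: dev

  GenerateFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: build/          # output of build-sam.sh (assumed)
      Handler: main.handler    # Mangum handler exported by main.py (assumed)
      Runtime: python3.12
      Timeout: 30
      Events:
        Generate:
          Type: Api
          Properties:
            RestApiId: !Ref GenerateApi
            Path: /generate
            Method: post
```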
Test the deployed endpoint:
curl -X POST https://znhxj2t415.execute-api.eu-central-1.amazonaws.com/dev/generate \
-H "Content-Type: application/json" \
-d '{"prompt": "Tell me a joke!"}'
The project is configured to use CircleCI for continuous integration and deployment. The .circleci/config.yml
file contains the necessary steps to build, test, and deploy the application on the CircleCI platform.
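The actual pipeline lives in .circleci/config.yml and is not shown here; a hedged sketch of what such a configuration could look like, with illustrative job names, images, and branch filters, is given below.

```yaml
# Hypothetical .circleci/config.yml outline; the real pipeline may differ.
version: 2.1

jobs:
  test:
    docker:
      - image: cimg/python:3.12
    steps:
      - checkout
      - run: pip install uv
      - run: uv sync
      - run: uv run pytest

  deploy:
    docker:
      - image: cimg/python:3.12
    steps:
      - checkout
      - run: pip install aws-sam-cli
      - run: ./build-sam.sh
      - run: sam build
      - run: >
          sam deploy --stack-name sam-app --resolve-s3
          --capabilities CAPABILITY_IAM --region eu-central-1
          --no-confirm-changeset --no-fail-on-empty-changeset

workflows:
  build-test-deploy:
    jobs:
      - test
      - deploy:
          requires:
            - test
          filters:
            branches:
              only: main
```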