The Python Script I Built That Started Paying My Rent (And the Libraries That Did the Heavy Lifting)
How I went from tinkering to shipping real Python products — scraping leads, automating workflows, building SaaS endpoints, and packaging AI features — using a handful of battle-tested libraries.
1. Why I focused on libraries (not frameworks) to make money fast
When I started trying to monetize Python, the biggest mistake I made was overengineering: building monoliths before validating a single paying customer. Libraries are the Lego bricks — small, focused, composable. Pick the right ones, wire them together with thin glue code (APIs, scripts, queue workers), and you can ship an MVP in days instead of months. The sections below show the exact libraries I used, why, and concrete code you can copy-paste.
2. Quick idea-validation workflows (requests + BeautifulSoup + pandas)
Before building anything robust, validate demand. I validate by scraping job listings, searching freelance boards, and compiling contact lists. requests + BeautifulSoup + pandas gets you from zero to a spreadsheet of prospects in under an hour.
# lead_scraper.py
import requests
from bs4 import BeautifulSoup
import pandas as pd
from time import sleep

def scrape_jobs(query, pages=2):
    leads = []
    for page in range(1, pages + 1):
        url = f"https://remoteok.com/remote-{query}-jobs?page={page}"
        r = requests.get(url, headers={"User-Agent": "Mozilla/5.0"})
        soup = BeautifulSoup(r.text, "html.parser")
        for row in soup.select("tr.job"):
            title_el = row.select_one("h2")
            company_el = row.select_one(".companyLink")
            if title_el:
                leads.append({
                    "title": title_el.get_text(strip=True),
                    "company": company_el.get_text(strip=True) if company_el else None,
                })
        sleep(1)  # throttle between pages
    return pd.DataFrame(leads)

if __name__ == "__main__":
    df = scrape_jobs("python")
    df.to_csv("python_leads.csv", index=False)
    print("Wrote python_leads.csv — validate demand with a quick outreach.")
Use this to test whether people are hiring for or paying for the service you want to provide.
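With the CSV in hand, pandas can dedupe and rank the prospects before you start outreach. A minimal sketch, assuming the title/company columns produced by the scraper above (the keyword filter and the `shortlist` helper are my own illustration, not part of the original script):

```python
# rank_leads.py: filter and dedupe scraped leads before outreach
import pandas as pd

def shortlist(df: pd.DataFrame, keyword: str) -> pd.DataFrame:
    """Keep rows whose title mentions the keyword, drop duplicate companies."""
    mask = df["title"].str.contains(keyword, case=False, na=False)
    return (
        df[mask]
        .drop_duplicates(subset="company")  # one row per company
        .reset_index(drop=True)
    )
```

Run it over python_leads.csv from the scraper and paste the top rows into your outreach sheet.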
3. Productize automation tasks (Selenium / Playwright + pyinstaller)
Automations are a quick win: businesses will pay to save hours. I built small CLI automation tools for data entry, bulk uploads, and report generation. For browser automation I use playwright (modern, fast) or selenium when compatibility matters. Wrap the tool with click for CLI UX and package it with pyinstaller.
# bulk_upload.py (Playwright + Click)
import asyncio
import csv

import click
from playwright.async_api import async_playwright

async def upload_rows(csv_file):
    with open(csv_file, newline="") as f:
        rows = list(csv.DictReader(f))
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)
        page = await browser.new_page()
        await page.goto("https://example.com/login")
        await page.fill("#email", "[email protected]")
        await page.fill("#password", "secret")
        await page.click("button[type=submit]")
        await page.wait_for_selector("#dashboard")
        for r in rows:
            await page.goto("https://example.com/new")
            await page.fill("#name", r["name"])
            await page.fill("#email", r["email"])
            await page.click("#submit")
        await browser.close()

@click.command()
@click.argument("csv_file", type=click.Path(exists=True))
def main(csv_file):
    # click doesn't await coroutines, so drive the async flow from here
    asyncio.run(upload_rows(csv_file))

if __name__ == "__main__":
    main()  # run: python bulk_upload.py leads.csv
Then run pyinstaller --onefile bulk_upload.py to ship a single-file binary clients can run without installing Python.
4. Build micro-SaaS quickly with FastAPI + SQLAlchemy + Pydantic
When the proof of concept converts, turn it into a tiny web product. FastAPI is my go-to: typing-friendly, fast, and perfect for shipping an API or minimal dashboard. Pair it with SQLAlchemy for the DB layer and Alembic for migrations.
# app.py (FastAPI minimal)
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel
from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import sessionmaker, declarative_base

DATABASE_URL = "sqlite:///./db.sqlite3"
engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
SessionLocal = sessionmaker(bind=engine)
Base = declarative_base()

class Customer(Base):
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True, index=True)
    email = Column(String, unique=True, index=True)
    name = Column(String, index=True)

Base.metadata.create_all(bind=engine)

class CustomerIn(BaseModel):
    name: str
    email: str

app = FastAPI()

@app.post("/customers")
def create_customer(payload: CustomerIn):
    db = SessionLocal()
    try:
        if db.query(Customer).filter(Customer.email == payload.email).first():
            raise HTTPException(status_code=400, detail="Already exists")
        c = Customer(name=payload.name, email=payload.email)
        db.add(c)
        db.commit()
        db.refresh(c)
        return {"id": c.id, "email": c.email}
    finally:
        db.close()  # always release the session
Monetization: add Stripe checkout (next section) and tiered API keys.
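The "tiered API keys" part is mostly a lookup table plus a quota check. A framework-agnostic sketch (the keys, tier names, and limits here are invented; in FastAPI you would call this from a dependency that reads an X-API-Key header, and real keys belong hashed in the database):

```python
# api_tiers.py: toy tier/quota check behind an API key.
# NOTE: keys, tiers, and limits are invented for illustration only.
TIERS = {"free": 100, "pro": 10_000}           # requests allowed per month
API_KEYS = {"key_abc": "free", "key_xyz": "pro"}
usage: dict[str, int] = {}                     # key -> calls this month

def check_key(api_key: str) -> str:
    """Return the caller's tier, or raise if the key is bad or over quota."""
    tier = API_KEYS.get(api_key)
    if tier is None:
        raise PermissionError("unknown API key")
    used = usage.get(api_key, 0)
    if used >= TIERS[tier]:
        raise PermissionError("monthly quota exceeded; upgrade your plan")
    usage[api_key] = used + 1
    return tier
```

In production, keep the counters in Redis or the DB so they survive restarts and reset them monthly.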
5. Handle payments and subscriptions (stripe)
Integrate payments with stripe for one-time purchases and recurring subscriptions. Keep server-side checkout sessions and webhook handlers minimal.
# payments.py (Stripe)
import stripe
from fastapi import FastAPI

stripe.api_key = "sk_test_xxx"  # load from an env var in production
app = FastAPI()

@app.post("/create-checkout")
async def create_checkout():
    session = stripe.checkout.Session.create(
        payment_method_types=["card"],
        line_items=[{"price": "price_xyz", "quantity": 1}],
        mode="subscription",
        success_url="https://yourapp.com/success",
        cancel_url="https://yourapp.com/cancel",
    )
    return {"checkout_url": session.url}
Start with prebuilt Stripe price IDs — no need for a complicated billing engine.
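On the webhook side, the one step you must not skip is signature verification. In production just call stripe.Webhook.construct_event; the stdlib sketch below only shows what Stripe's documented v1 scheme computes (HMAC-SHA256 over "{timestamp}.{payload}"), so the handler stays understandable:

```python
# webhook_verify.py: what Stripe signature checking does under the hood.
# In production call stripe.Webhook.construct_event(), which also enforces
# a timestamp tolerance to block replay attacks; this sketch omits that.
import hashlib
import hmac

def verify_stripe_signature(payload: bytes, sig_header: str, secret: str) -> bool:
    """sig_header looks like 't=1692...,v1=<hex hmac>'."""
    parts = dict(p.split("=", 1) for p in sig_header.split(","))
    signed_payload = f"{parts['t']}.".encode() + payload
    expected = hmac.new(secret.encode(), signed_payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, parts["v1"])
```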
6. Add AI features that customers will actually pay for (openai + langchain)
AI features are the multiplier. I used openai for text generation and langchain to glue the pieces together (vector DBs, retrieval, prompt templating). Charge extra for “AI-powered” tiers.
# ai_feature.py (LangChain + OpenAI pseudocode)
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0.2, openai_api_key="OPENAI_KEY")  # read from env in production
template = PromptTemplate(
    input_variables=["summary"],
    template="Rewrite the following summary into a concise landing page headline:\n\n{summary}",
)
chain = LLMChain(llm=llm, prompt=template)

def headline_for(summary):
    return chain.run({"summary": summary})
Combine with a vector DB (faiss, chromadb) for retrieval-augmented generation (RAG) if you need factual grounding.
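Under the hood, the retrieval half of RAG is nearest-neighbor search over embedding vectors. A numpy-only sketch of what faiss or chromadb would do for you (the vectors below are tiny fakes; real ones come from an embedding model):

```python
# retrieve.py: the nearest-neighbor core of RAG, minus the vector DB.
# Illustration only; real embeddings come from an embedding model.
import numpy as np

def top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 2) -> list:
    """Indices of the k docs most cosine-similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = d @ q                       # cosine similarity per document
    return list(np.argsort(-sims)[:k]) # best matches first
```

Once the corpus grows past a few thousand chunks, swap this for a faiss index or a chromadb collection, then prepend the retrieved chunks to the prompt for factual grounding.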
7. Dashboards & demos to convert leads (Streamlit / Gradio / Plotly)
Nothing converts like a live demo. Use Streamlit or Gradio to spin up interactive demos in under an hour, and plotly or matplotlib for charts.
# demo.py (Streamlit)
import streamlit as st
import requests

st.title("AI Lead Scorer — Demo")
text = st.text_area("Paste lead description")
if st.button("Score"):
    resp = requests.post("https://yourapi.com/score", json={"text": text})
    st.metric("Score", resp.json().get("score"))
Host a demo and add a CTA button linking to your paid onboarding.
8. Background jobs & scale (Celery / Redis / RabbitMQ)
When processing grows (scrapes, AI calls, video transcodes), move heavy work into Celery tasks. Pair Celery with Redis for small setups, or RabbitMQ when you need stronger delivery guarantees.
# tasks.py
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def process_lead(lead_id):
    # heavy CPU / network calls, e.g., enrich using Clearbit API
    pass
Background tasks let you offer async features (bulk uploads, nightly enrichment) and charge higher tiers.
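If you want to feel out the shape before standing up Redis, the queue-plus-worker pattern Celery implements can be sketched with the stdlib. This is an illustration only, with no retries, routing, or persistence (which is exactly what Celery adds):

```python
# worker_sketch.py: the queue + worker pattern Celery implements,
# reduced to the stdlib. Illustration only, not a Celery substitute.
import queue
import threading

jobs: "queue.Queue" = queue.Queue()
results: list = []

def enrich_lead(lead_id: int) -> int:
    # stand-in for slow network/CPU work
    return lead_id * 10

def worker() -> None:
    while True:
        lead_id = jobs.get()
        if lead_id is None:   # sentinel: shut down
            break
        results.append(enrich_lead(lead_id))

t = threading.Thread(target=worker)
t.start()
for i in (1, 2, 3):           # "web request" side: enqueue and return fast
    jobs.put(i)
jobs.put(None)
t.join()
```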
9. Packaging, deployment, and cheap infra (Docker + AWS/GCP + GitHub Actions)
Deploy fast: containerize with Docker, push to a registry, and use GitHub Actions for CI/CD. For minimal cost, start on a single small EC2 instance, DigitalOcean droplet, or Render. For a serverless option, FastAPI + Uvicorn on Cloud Run or AWS Fargate works well. Sample Dockerfile:
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
Add a GitHub Actions workflow that builds and pushes the image on each tagged release.
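For reference, a tag-triggered build-and-push workflow can be this small. Everything here is a placeholder sketch: it assumes Docker Hub with credentials stored as repository secrets, and the image name is invented:

```yaml
# .github/workflows/deploy.yml (sketch; registry, image, and secret names are placeholders)
name: build-and-push
on:
  push:
    tags: ["v*"]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKERHUB_USERNAME }}
          password: ${{ secrets.DOCKERHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          push: true
          tags: yourname/yourapp:${{ github.ref_name }}
```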
10. Legal, ops, and customer trust (Sentry + pydantic validation + logging)
To charge money, your product must be reliable. Use pydantic to validate inputs, sentry-sdk for error monitoring, and structured logging for debugging. Also publish a simple Terms of Service and Privacy Policy (Stripe usually requires them).
# example pydantic validation in FastAPI (already used earlier)
from pydantic import BaseModel, EmailStr

class CustomerIn(BaseModel):
    name: str
    email: EmailStr  # rejects malformed addresses (needs the email-validator package)
11. Pricing strategies that worked for me (one-off, subscription, usage-based)
I used three pricing tiers:
- Starter — one-time script / packaged binary for small businesses (low friction).
- Pro — monthly subscription for API + dashboard + limited AI calls.
- Enterprise — custom integrations, support, and data pipelines (higher price, slower sales cycle).
Bundle setup as an add-on (one-time) and recurring analytics as the sticky revenue. Offer a 7–14 day trial for the AI tier because usage hooks customers.
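The usage-based part of the AI tier reduces to simple arithmetic: a base fee plus overage on calls beyond an included quota. A sketch with invented prices:

```python
# billing.py: base fee + per-call overage. All numbers are made up.
def monthly_charge(calls: int, base: float = 29.0,
                   included: int = 1000, per_call: float = 0.02) -> float:
    """Base subscription plus overage for calls beyond the included quota."""
    overage = max(0, calls - included)
    return round(base + overage * per_call, 2)
```

Metering each call (the check_key counter pattern from the API section works) gives you the `calls` number at invoice time.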
Quick checklist to go from idea → revenue (copy this)
- Validate with scraping + outreach (requests, BeautifulSoup, pandas).
- MVP script using playwright or selenium (deliverable binary via pyinstaller).
- API & dashboard using FastAPI + SQLAlchemy + pydantic.
- AI upsell with openai/langchain + vector DB.
- Payments via stripe.
- Demos with Streamlit/Gradio.
- Scale with Celery and Docker.
- Monitor with sentry-sdk.