About Eightgen
Eightgen is an AI services company that partners with founders, CIOs, and CXOs to transform ideas into working products. We help startups and enterprises ship AI automation at scale—from intelligent workflows and custom AI agents to enterprise-grade applications.
Our Values
Integrity & Ethics: We conduct business with honesty and transparency. We do what's right for our clients, our team, and our partners, even when it's harder. We handle data responsibly and respect the trust placed in us.
Quality & Accountability: We take ownership of our work and deliver on our commitments. We build software we're proud of, with attention to detail and a commitment to excellence. When we make mistakes, we acknowledge them and fix them.
Trust & Autonomy: We hire talented people and give them the freedom to do their best work. We communicate openly, share context generously, and trust each other to make good decisions.
Inclusion & Belonging: We are committed to building a diverse team where everyone feels welcome, respected, and heard. Different backgrounds, perspectives, and experiences make us stronger.
We are a fully remote team that values outcomes over hours and collaboration over hierarchy.
About the Project
For this engagement, you will work on an Asset Performance Management Platform for a manufacturing industry client. The platform integrates data from enterprise systems (SAP, ERP, CMMS) to deliver predictive maintenance insights, performance analytics, and strategic asset management planning—replacing manual Excel-based workflows with automated, AI-powered intelligence.
You will help build the core platform from the ground up, working across multiple components including data integration, analytics dashboards, and planning tools.
The Role
As a Backend Engineer, you will design and build the core data infrastructure and API services that power our platform. You will work across the full backend stack—from data ingestion pipelines processing millions of records to REST APIs serving real-time analytics and AI-powered insights.
We embrace AI-augmented development. You will leverage AI coding assistants (GitHub Copilot, Cursor, Claude Code, or similar) as integral tools in your workflow—not as a crutch, but as a force multiplier. We expect you to understand the fundamentals deeply while using AI tools to accelerate delivery and maintain high code quality.
You will collaborate closely with frontend engineers, product managers, and business stakeholders to deliver a scalable, multi-tenant platform serving 200+ enterprise customers.
Our AI-Augmented Development Philosophy
We believe the most effective engineers in 2025 and beyond are those who:
- Master the fundamentals: a deep understanding of Python, databases, system design, and software engineering principles remains essential
- Leverage AI as a collaborator: use AI tools to accelerate routine tasks, generate boilerplate, explore solutions, and augment productivity
- Maintain quality ownership: treat AI output as a starting point, not a final answer; review, test, and refine all code before it reaches production
- Think architecturally: AI tools are excellent at implementation but require human guidance for design decisions, trade-off analysis, and system-level thinking
- Continuously optimize workflows: experiment with prompts, tools, and techniques to maximize the value of AI assistance
Key Responsibilities
- Build and maintain data pipelines that ingest, validate, and transform large volumes of data from multiple sources, including batch uploads and enterprise system integrations
- Design and develop REST APIs that serve both operational and analytical workloads, ensuring high performance, security, and scalability in a multi-tenant architecture
- Implement data models and database logic across transactional and analytical databases, optimizing for query performance and data integrity
- Develop data processing and transformation logic for cleansing, validating, enriching, and aggregating enterprise data
- Integrate with external systems via REST/OData APIs and support multiple data formats (CSV, Excel, JSON) for seamless data exchange
- Leverage AI coding tools to accelerate development velocity while critically reviewing and validating AI-generated code for correctness, security, and maintainability
- Contribute to platform architecture through code reviews, technical design discussions, and documentation of best practices
- Collaborate cross-functionally with frontend engineers, product managers, and business stakeholders to deliver features end-to-end
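As an illustration of the ingest-validate-transform work described above, here is a minimal, hypothetical sketch of a validation step over batch-uploaded rows. It uses plain Python for brevity; on this stack the real implementation would more likely use Polars and Great Expectations, and every field name here is invented:

```python
from datetime import datetime

# Hypothetical raw rows, as parsed from a CSV batch upload.
RAW_ROWS = [
    {"asset_id": "PUMP-001", "reading": "73.4", "ts": "2025-06-01T08:00:00"},
    {"asset_id": "", "reading": "12.0", "ts": "2025-06-01T08:05:00"},      # missing asset_id
    {"asset_id": "PUMP-002", "reading": "not-a-number", "ts": "bad-ts"},   # unparseable fields
]

def validate_and_transform(rows):
    """Split rows into cleansed records and rejects, with reasons per reject."""
    clean, rejects = [], []
    for row in rows:
        errors = []
        if not row.get("asset_id"):
            errors.append("missing asset_id")
        try:
            reading = float(row["reading"])
        except ValueError:
            errors.append("reading is not numeric")
            reading = None
        try:
            ts = datetime.fromisoformat(row["ts"])
        except ValueError:
            errors.append("timestamp is not ISO 8601")
            ts = None
        if errors:
            rejects.append({"row": row, "errors": errors})
        else:
            clean.append({"asset_id": row["asset_id"], "reading": reading, "ts": ts})
    return clean, rejects

clean, rejects = validate_and_transform(RAW_ROWS)
print(len(clean), len(rejects))  # 1 valid row, 2 rejected
```

Keeping rejects alongside their error reasons (rather than silently dropping them) is the property that matters here; it is what lets a pipeline report data-quality issues back to the uploading system.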
Required Qualifications
Core Technical Skills
- 3-5 years of professional backend development experience with Python
- Strong Python proficiency, including async/await patterns, type hints, and modern Python practices
- Experience with FastAPI or a similar framework (Flask, Django REST Framework) for building production REST APIs
- Solid SQL skills with PostgreSQL or a similar relational database, including query optimization, indexing, and schema design
- Experience with workflow orchestration tools such as Apache Airflow, Prefect, or Dagster
- Familiarity with data processing using Pandas, Polars, or similar DataFrame libraries
- Understanding of cloud platforms, preferably Google Cloud Platform (Cloud Run, Cloud SQL, Cloud Storage)
- Experience with Git, CI/CD pipelines, and collaborative development workflows
AI-Assisted Development Skills
- Hands-on experience with AI coding assistants such as GitHub Copilot, Cursor, Claude Code, or Windsurf in day-to-day development workflows
- Prompt engineering proficiency: the ability to craft effective prompts that guide AI tools toward correct, efficient, and secure implementations
- Critical evaluation of AI-generated code: the ability to review, validate, refactor, and improve code produced by AI tools, ensuring production readiness and adherence to best practices
- Architectural thinking: the ability to break down complex problems, design solutions at a high level, and guide AI tools toward correct implementations rather than relying on AI for design decisions
Preferred Qualifications
- Experience with columnar/analytical databases such as ClickHouse, BigQuery, or Snowflake
- Familiarity with dbt (data build tool) for data transformations
- Background in enterprise system integrations (SAP, ERP, MES, CMMS) or industrial/manufacturing domains
- Experience building multi-tenant SaaS platforms with data isolation patterns
- Exposure to LLM/AI integrations (OpenAI, Anthropic, LangChain)
- Demonstrated workflow optimization with AI tools and measurable productivity improvements
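For context on the multi-tenant data isolation point above, here is a minimal, hypothetical sketch of tenant scoping at the repository layer. It is plain Python with invented names; in production this is typically reinforced with database-level controls such as PostgreSQL row-level security:

```python
class TenantScopedRepository:
    """Toy repository that forces every read through a tenant filter."""

    def __init__(self, rows):
        self._rows = rows  # each row carries a tenant_id column

    def list_assets(self, tenant_id):
        # The tenant predicate is applied here, in one place, so callers
        # cannot accidentally read another tenant's data.
        return [r for r in self._rows if r["tenant_id"] == tenant_id]

# Hypothetical data for two tenants sharing one table.
ROWS = [
    {"tenant_id": "acme", "asset": "PUMP-001"},
    {"tenant_id": "acme", "asset": "PUMP-002"},
    {"tenant_id": "globex", "asset": "FAN-009"},
]

repo = TenantScopedRepository(ROWS)
print([r["asset"] for r in repo.list_assets("acme")])  # ['PUMP-001', 'PUMP-002']
```

The design point is that tenant filtering lives in exactly one layer rather than being repeated (and eventually forgotten) in individual query sites.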
Technical Environment
You will work with our recommended technology stack:
| Layer | Technologies |
|---|---|
| API Framework | FastAPI, Pydantic, Python 3.11+ |
| Orchestration | Google Cloud Composer (Airflow 3.x) |
| Data Processing | Polars, dbt Core, Great Expectations |
| Databases | PostgreSQL 17, ClickHouse Cloud |
| Cloud Platform | Google Cloud Platform (Cloud Run, Cloud SQL, GCS) |
| Integrations | SAP OData APIs (PyOData), REST/JSON |
| AI Development Tools | GitHub Copilot, Cursor, Claude Code (your choice) |
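As a small taste of the API layer in this stack, here is a sketch of a FastAPI-style request model built on Pydantic's declarative validation. The field names are hypothetical; the pattern of typed fields plus automatic coercion and rejection is what the stack relies on:

```python
from pydantic import BaseModel, ValidationError

class AssetReadingIn(BaseModel):
    """Payload a client might POST to a readings endpoint (hypothetical fields)."""
    asset_id: str
    reading: float
    unit: str = "celsius"  # default applied when the field is omitted

# Valid payload: Pydantic coerces the string "73.4" to a float.
ok = AssetReadingIn(asset_id="PUMP-001", reading="73.4")
print(ok.reading, ok.unit)  # 73.4 celsius

# Invalid payload: a non-numeric reading raises ValidationError,
# which FastAPI would translate into a 422 response for the client.
try:
    AssetReadingIn(asset_id="PUMP-001", reading="hot")
except ValidationError as exc:
    print("rejected with", len(exc.errors()), "error(s)")
```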
Engagement Details
- Contract Duration: Initial 3-month engagement
- Extension: Strong opportunity to extend for an additional 6+ months based on performance and project needs
- Work Arrangement: Fully remote
- Team: You will work alongside frontend engineers, product managers, and business stakeholders
- Start Date: First week of January 2026