About the Role
Lunar Point Systems builds and maintains production-grade Microsoft Fabric data platforms for clients across a range of industries. This role owns the full pipeline lifecycle — from ingestion and transformation through deployment and ongoing maintenance — operating directly within each client's Fabric environment. The work spans different data architectures, business domains, and maturity levels, requiring adaptability and a strong engineering foundation. Engineers are dedicated to one client engagement at a time, going deep on the architecture before transitioning to steady-state support.
Why Us
We believe in creating a supportive, dynamic environment where your skills can truly shine. We offer the opportunity to work on high-impact client projects that push the boundaries of modern data platform engineering. Our team is collaborative, innovative, and passionate about delivering exceptional results. If you're looking to make a real impact and be part of a team that values your contributions, Lunar Point Systems is the place for you.
Responsibilities:
- Design and implement end-to-end ingestion and transformation pipelines using Fabric Data Factory and Dataflows Gen2
- Build and manage Lakehouses and Warehouses using medallion architecture (Bronze / Silver / Gold) with Delta Lake
- Write Spark-based transformations in Fabric Notebooks using PySpark
- Contribute to KPI definition — understanding client business objectives and ensuring the data layer supports measurement and reporting against those targets
- Own workspace deployment pipeline promotions (Dev to UAT to Prod) including environment configuration and validation gates
- Implement CI/CD workflows for pipeline code using GitHub Actions or Azure DevOps Pipelines
- Apply Delta Lake optimization techniques (e.g., file compaction with OPTIMIZE, V-Order, partitioning, and VACUUM retention) to keep tables performant at scale
- Enforce data quality validation and testing at each pipeline stage before promotion
- Maintain thorough documentation of pipeline architecture, data contracts, and schema changes for client handover
- Monitor pipeline health, configure alerting, and manage incident response for production environments
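To give candidates a feel for the data-quality validation work described above, here is a minimal sketch of a pre-promotion quality gate. The function name, column names, and null-ratio threshold are illustrative assumptions, not part of any client schema; a production version would run equivalent checks against Delta tables via Spark rather than in-memory records:

```python
# Minimal sketch of a quality gate run before promoting a Bronze batch
# to Silver. The helper name and thresholds are hypothetical examples.

def check_silver_readiness(rows, required_columns, max_null_ratio=0.05):
    """Return (passed, issues) for a batch of records (dicts).

    A batch fails if it is empty or if any required column exceeds
    the allowed null ratio.
    """
    issues = []
    if not rows:
        return False, ["batch is empty"]
    for col in required_columns:
        missing = sum(1 for r in rows if r.get(col) is None)
        ratio = missing / len(rows)
        if ratio > max_null_ratio:
            issues.append(
                f"{col}: {ratio:.0%} nulls exceeds {max_null_ratio:.0%} threshold"
            )
    return (not issues), issues


# Illustrative batch with nulls in both columns: the gate should fail it.
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": None, "amount": 5.0},
]
passed, issues = check_silver_readiness(batch, ["order_id", "amount"])
```

The point of the sketch is the pattern, not the specific checks: validation runs as an explicit gate before promotion, and a failing batch is blocked with a human-readable list of reasons rather than silently flowing downstream.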
Requirements:
- Microsoft Certified: DP-700, DP-600, or AZ-400
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field
- 3-5 years of experience in data engineering on a cloud-native data platform
- Strong proficiency in Python/PySpark and SQL
- Hands-on experience with Microsoft Fabric — Data Factory, Notebooks, Lakehouse, Warehouse, and OneLake
- Solid understanding of medallion architecture and Delta Lake table management
- Experience building and managing Dev to UAT to Prod deployment pipelines in Fabric
- Ability to contribute to KPI definition and translate business goals into data architecture decisions
- Comfortable working within a client's Fabric environment and adhering to their governance standards
- Version control experience with GitHub or Azure DevOps (branching strategies, PR workflows, YAML pipelines)
- US citizen; no visa sponsorship available
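The requirement to translate business goals into data architecture decisions can be sketched in plain Python. The KPI below ("average revenue per active customer") and its field names are invented for illustration; in practice this logic would live in a Gold-layer aggregation, but the sketch shows the shape of the translation from business definition to computation:

```python
# Sketch: turning a business KPI definition into a data-layer computation.
# KPI and field names ("customer_id", "amount") are illustrative assumptions.

def revenue_per_active_customer(orders):
    """orders: iterable of dicts with 'customer_id' and 'amount' keys.

    An "active customer" here means any customer with at least one order
    in the batch; the KPI is total revenue divided by that count.
    """
    totals = {}
    for o in orders:
        totals[o["customer_id"]] = totals.get(o["customer_id"], 0.0) + o["amount"]
    if not totals:
        return 0.0
    return sum(totals.values()) / len(totals)


orders = [
    {"customer_id": "a", "amount": 120.0},
    {"customer_id": "a", "amount": 80.0},
    {"customer_id": "b", "amount": 200.0},
]
kpi = revenue_per_active_customer(orders)  # (120 + 80 + 200) / 2 customers = 200.0
```

Note that the hard part is not the arithmetic but pinning down the definition with the client (what counts as "active", over what window) before the data layer encodes it.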
Nice to Have:
- Master's degree in a related field
- Real-time pipeline experience with Fabric Eventstream, Azure Event Hubs, or KQL databases
- Familiarity with Microsoft Purview for data lineage, cataloging, and sensitivity labeling
Equal Opportunity Employer
Lunar Point Systems is an equal opportunity employer. We do not discriminate on the basis of race, color, religion, sex, national origin, age, disability, veteran status, genetic information, or any other characteristic protected by applicable federal, state, or local law. All qualified applicants will receive consideration for employment without regard to any such characteristic. We are committed to creating an inclusive environment for all employees and contractors.