About the Role
Benefits:
• 401(k)
• 401(k) matching
• Bonus based on performance
• Competitive salary
• Dental insurance
• Donation matching
• Health insurance
• Opportunity for advancement
• Paid time off
• Profit sharing
• Training & development
• Tuition assistance
• Vision insurance
• Wellness resources
Location: Fully Remote (United States) — no onsite work required
Employment Type: Contract (Full-Time), multi-year engagement with potential for extension
About AKIVA
AKIVA is an AI-native engineering firm specializing in enterprise services, domain-specific solutions, and high-performance infrastructure from strategy through deployment. We build innovative systems that drive measurable results for commercial enterprises and government agencies. AKIVA is a certified Service-Disabled Veteran-Owned Small Business (SDVOSB).
Position Overview
AKIVA is seeking ETL Developers to design, develop, and maintain extract-transform-load (ETL) processes in support of a multi-year public-sector application modernization program. In this role you will design ETL strategies for moving structured and unstructured data from legacy source systems into modern target platforms, building robust data mappings, automated workflows, validation routines, and audit logging along the way. You will collaborate closely with client stakeholders, data architects, and application development teams in a Scrum-based delivery environment, and you may use AKIVA-provided AI tools for pipeline generation, data profiling, and anomaly detection to accelerate development and improve data quality.
Key Responsibilities
• Design, develop, test, and maintain ETL processes that move data from source systems to target data stores, data warehouses, and modern applications
• Develop and evolve ETL strategies that support the full data lifecycle — ingestion, transformation, validation, loading, and audit
• Collect, clean, and integrate structured and unstructured data from diverse source systems, including legacy relational databases, flat files, APIs, and third-party feeds
• Build and maintain data models, data mappings, and transformation rules in coordination with data architects and business analysts
• Implement automated workflows with performance tuning, error handling, and audit logging
• Implement and monitor data-quality controls — profiling, deduplication, standardization, and reconciliation
• Develop and maintain API-based integrations between source and target systems
• Troubleshoot production ETL jobs, diagnose failures, and implement remediation
• Collaborate with client stakeholders to understand source-system semantics, data ownership, and business rules
• Support knowledge transfer to client technical staff through documentation, walkthroughs, and paired working sessions
• Participate in Scrum ceremonies (sprint planning, daily stand-ups, reviews, and retrospectives)
Required Qualifications
• Bachelor's degree in Computer Science, Information Systems, or related field — or equivalent experience
• 2+ years of hands-on experience with advanced SQL, including complex joins, window functions, CTEs, stored procedures, and performance tuning
• 4+ years of experience in data migration and integration, including automated workflows, performance tuning, data cleansing, API integration, and audit logging
• Demonstrated experience designing end-to-end ETL pipelines across heterogeneous source and target systems
• Strong problem-solving, troubleshooting, and documentation skills
• Excellent written and verbal communication skills and the ability to work directly with non-technical stakeholders
• Ability to work effectively in a fully remote environment
• Must be authorized to work in the United States
Preferred Qualifications
• Experience leading or contributing to full-scale data migrations from legacy systems to cloud, ERP, or CRM platforms
• Hands-on experience with one or more ETL/data-integration platforms — Informatica PowerCenter or IICS, Talend, AWS Glue, Azure Data Factory, or SQL Server Integration Services (SSIS)
• Proficiency in at least one scripting or programming language — Python, PowerShell, or Java
• Experience working on Agile / Scrum delivery teams
• Experience supporting public-sector data modernization initiatives
• Familiarity with AI-assisted pipeline generation, schema-mapping, and anomaly-detection tools
Tools, Technologies & Frameworks
• ETL / Integration: Informatica PowerCenter, Informatica IICS, Talend, AWS Glue, Azure Data Factory, SSIS
• Databases: SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, Redshift
• Scripting: Python, PowerShell, Java
• APIs: REST, SOAP, JSON, XML
• DevOps: Git, CI/CD pipelines, Jira, Confluence
• Delivery: Agile / Scrum
• AI Assist: pipeline generation, schema mapping, anomaly detection
What AKIVA Offers
• Professional development and skill-building opportunities
• Supportive management team with veteran-owned company values
• Opportunity to contribute to impactful public-sector data modernization initiatives