Data/Software Engineer
AP Recruiters
Job Description
Data Engineer (Contract)
Location: Kansas City, MO (open to Denver, CO, or Omaha, NE for strong candidates)
Duration: 12-month contract with extension potential

United States citizenship, or lawful permanent resident status with at least three (3) years of United States residency from the date of legal entry, is required for this position.

About the Role
We are seeking a skilled Data Engineer to support data-intensive research and analytics initiatives within a highly advanced enterprise environment. This role focuses on managing and engineering large-scale datasets that drive critical insights across financial and economic domains.
You will work within a high-performance data ecosystem, supporting a large enterprise data warehouse (~300 TB across 25+ datasets) and collaborating closely with data scientists, analysts, and cross-functional teams. This is a hands-on role best suited for professionals with strong expertise in SQL, ETL processes, and large-scale data management.

Key Responsibilities
- Own and manage 3–5 critical datasets within a large-scale PostgreSQL/Greenplum data warehouse
- Design and execute ETL (Extract, Transform, Load) processes for ingesting data from multiple sources
- Perform data cleansing, validation, transformation, and loading activities
- Manage schema changes, data updates, and full data reloads
- Develop and maintain batch data pipelines using SQL and shell scripting (Bash)
- Collaborate with stakeholders to validate and promote datasets to production
- Ensure data integrity, quality, and consistency across platforms
- Support data warehouse modernization initiatives, including migration to cloud-based platforms
- Contribute to documentation, process improvements, and data governance practices

Required Skills
- Strong expertise in SQL (critical requirement)
- Experience with PostgreSQL or similar relational databases
- Hands-on ETL / data engineering experience
- Familiarity with Linux environments
- Experience with shell scripting (Bash)
- Ability to work with large, complex datasets (millions to billions of rows)
- Experience with GitLab (repositories plus CI/CD pipelines and runners)

Preferred Qualifications
- Experience with Greenplum or other MPP data warehouse platforms
- Knowledge of Python for data processing
- Exposure to modern cloud data platforms (e.g., Databricks)
- Experience with AWS environments (nice to have)
- Background in data modeling or large-scale data migrations

Work Environment
- Manage ~25 datasets across domains such as financial, mortgage, market, and climate data
- Data loads occur on daily, monthly, and quarterly cycles
- Work involves handling evolving schemas and large-scale transformations
- Collaborative environment with data engineers, infrastructure engineers, and data scientists

Growth & Opportunity
- Work on high-impact data systems supporting critical research and analytics
- Exposure to large-scale data engineering challenges
- Involvement in cloud migration initiatives
- Collaborative, innovation-driven environment

Additional Notes
- This role is heavily focused on SQL and data engineering
- Minimal focus on visualization tools or frontend work
- Ideal for candidates who enjoy working with complex data pipelines
- Contract opportunity with potential for extension
- Competitive hourly rate
- Candidates must be authorized to work in the United States without sponsorship
- Additional background screening may be required based on the nature of the work