Sigma Software is seeking a Senior Data Engineer for its Data Engineering Center of Excellence. The role requires 5+ years of experience with Python and SQL, hands-on AWS experience, and skills in building ETL pipelines. Responsibilities include designing scalable solutions, implementing data governance, and collaborating with customer teams. Preferred qualifications include experience with Hadoop, Spark, or Airflow, and Snowflake-based data warehouses; strong communication skills are essential.
Sigma Software
Are you a Senior Data Engineer passionate about building scalable, secure, and high-performance data solutions? Join our Data Engineering Center of Excellence at Sigma Software and work on diverse projects that challenge your skills and inspire innovation. At Sigma Software, we value expertise, continuous learning, and a supportive environment where your career path is shaped around your strengths. You’ll be part of a collaborative team, gain exposure to cutting-edge technologies, and work in an inclusive culture that fosters growth and innovation.

Project
Our Data Engineering Center of Excellence (CoE) is a specialized unit focused on designing, building, and optimizing data platforms, pipelines, and architectures. We work across diverse industries, leveraging modern data stacks to deliver scalable, secure, and cost-efficient solutions.

Requirements
• 5+ years of experience with Python and SQL
• Hands-on experience with AWS services (API Gateway, Kinesis, Athena, RDS, Aurora)
• Proven experience building ETL pipelines for analytics and internal operations
• Experience developing and integrating APIs
• Solid understanding of Linux OS
• Familiarity with distributed applications and DevOps tools
• Strong troubleshooting and debugging skills
• English level: Upper-Intermediate

WILL BE A PLUS:
• 2+ years of experience with Hadoop, Spark, or Airflow
• Experience with DAGs and orchestration tools
• Experience with Snowflake-based data warehouses
• Experience developing event-driven data pipelines

Personal Profile
• Strong communication skills
• Interest in dynamic, research-focused environments
• Passion for innovation and continuous improvement

Responsibilities
• Research new technologies and design complex, secure, scalable, and reliable solutions, with a focus on enhancing ETL processes
• Work with the modern data stack to deliver well-designed technical solutions
• Implement data governance practices
• Collaborate effectively with customer teams
• Take ownership of major solution components and their delivery
• Participate in requirements gathering and propose architecture approaches
• Lead data architecture implementation
• Develop core modules and scalable systems
• Conduct code reviews and write unit and integration tests
• Scale distributed systems and infrastructure
• Build and enhance data platforms leveraging AWS or Azure