
Senior Data Engineer

Data Science
Bengaluru, Karnataka, India
Posted on Thursday, January 18, 2024

About G2 - Our People

G2 was founded to create a place where people will love to work. We strive to create meaning in work and provide more than just a job: a true calling. At the heart of our community and culture are our people. Our global G2 team comes from a wide range of backgrounds and experiences, and that’s what makes our G2 community strong and vibrant. We want everyone to bring their authentic selves to work, and we do this through our company and team events, our G2 Gives charitable initiatives, and our Employee Resource Groups (ERGs).

Our employee-led, leadership-supported ERGs celebrate the diversity of our team, foster inclusivity and belonging, and create a space to connect to each other. Through connections and understanding, we build a stronger and more dynamic global team and help every person reach their personal peak.

We support our employees by offering generous benefits, such as flexible work, ample parental leave, and unlimited PTO.

About G2 - The Company

When you join G2, you join the global team behind the largest and most trusted software marketplace. Every month, 5.5 million people come to G2 to inform smarter software decisions based on honest peer reviews. Authenticity is our focus, and every day we help thousands of companies, and hundreds of employees, propel their potential. Ready for meaningful work that starts and ends with compassion and heart? You’ve come to the right place.

G2 is going through exciting growth! We recently secured $157 million in Series D funding, which will allow us to further grow and develop our product and people.

About The Role

G2 is looking for a Senior Data Engineer. In this role, you'll play a pivotal part in driving the design, development, and optimization of complex data pipelines and architectures for the G2 data platform. Leveraging advanced ETL expertise and cloud-based solutions within AWS and Snowflake environments, you'll lead critical data initiatives, mentor team members, champion best practices in data engineering, optimize data pipelines, and maintain high data quality and reliability.

In This Role, You Will:

Data infrastructure and processing:

  • Lead the design, development, and optimization of sophisticated data pipelines, ensuring seamless data extraction, transformation, and loading into G2’s data warehouse from diverse sources.
  • Lead engagements with internal stakeholders to understand their data needs and design solutions that ensure data quality and governance.
  • Design the right pipeline architecture to handle data and support various use cases, including analytical reporting and machine learning.
  • Identify, design, and implement process improvements to optimize our data delivery, and re-design infrastructure and pipelines to achieve greater scalability.
  • Help drive the creation of monitoring, alerting, and reporting on the reliability of data pipelines and data processing systems.
  • Develop and manage database schemas and models that support efficient data storage, retrieval, and analysis.
  • Stay up to date with the latest data technologies and trends, and evaluate their applicability to G2.

Data quality assurance and governance:

  • Work closely with key stakeholders and SMEs to define business rules that determine governance and data quality.
  • Implement robust measures for data quality, validation, and cleansing to ensure accuracy, completeness, and compliance with data governance standards.
  • Design efficient, scalable processes to acquire, manipulate, and store data and ensure adherence to them.

Leadership and mentoring:

  • Provide technical leadership, guidance, and mentorship to junior team members, fostering a culture of excellence and continuous learning in data engineering practices.
  • Collaborate closely with cross-functional teams to align data solutions with organizational goals and ensure successful integration into broader projects.
  • Share insights, contribute to best practice repositories, and drive innovation by evaluating and implementing emerging technologies to enhance data engineering capabilities.

Minimum Qualifications:

We realize applying for jobs can feel daunting at times. Even if you don’t check all the boxes in the job description, we encourage you to apply anyway.

  • 6+ years of experience in implementing enterprise data solutions.
  • 3+ years of experience using ETL/data pipeline tools (dbt, Airflow, Airbyte, Glue, Matillion, Stitch, etc.).
  • 2+ years of hands-on experience in data modeling, optimization, and database architecture.
  • Extensive experience in writing, debugging, and tuning SQL queries.
  • Strong programming experience in Python or Java.
  • Good understanding of AWS data services (DynamoDB, RDS, Data Pipeline, EMR, Lambda, Glue, ECS, etc.) and cloud data warehouses such as Snowflake.
  • Strong knowledge of performance tuning, optimization, and debugging of data pipelines.
  • Experience with enterprise data and business intelligence platforms serving large-scale enterprise deployments.
  • Proven track record of leading and delivering complex data projects within cloud environments.
  • Good understanding of distributed computing and frameworks like Apache Spark, Hadoop, and Apache Kafka for handling large volumes of data.
  • Good problem-solving, leadership, and communication skills.
  • Good understanding of software engineering principles and standards.

What Can Help Your Application Stand Out:

  • Experience with Docker/Kubernetes.
  • Proficiency in data modeling, schema design, and optimizing data structures for performance in Snowflake.
  • Working experience in startup environments.
  • Experience with Agile methodologies, CI/CD automation, and Test-Driven Development.
  • Knowledge of data governance, security, and compliance standards within cloud-based data solutions.
  • Familiarity with reporting tools such as Tableau, QlikView, Looker, or Power BI.
  • Database administration background.

Our Commitment to Inclusivity and Diversity

At G2, we are committed to creating an inclusive and diverse environment where people of every background can thrive and feel welcome. We consider applicants without regard to race, color, creed, religion, national origin, genetic information, gender identity or expression, sexual orientation, pregnancy, age, or marital, veteran, or physical or mental disability status.