Data Engineer

Citylitics


About Citylitics

Citylitics delivers predictive intelligence on local utility & public infrastructure markets.

What is Infrastructure? It is the roadways you rely on to get safely to Grandma’s house, the potable water that comes out of your kitchen tap to wash your family’s food, and the energy that heats our homes and powers our digital lifestyles.

Every year, trillions of dollars are spent on all areas of infrastructure to maintain our quality of life and move our economy forward. However, our infrastructure is no longer equipped to meet the needs of the future. We hear about infrastructure failures, whether bridge collapses, power blackouts, or water main breaks, every day in the news. Climate change and extreme weather events are disrupting the basic infrastructure we have taken for granted for years.

Citylitics is solving the hardest data problems in infrastructure while building the sales intelligence platform that enables a faster, more transparent, and more efficient infrastructure marketplace. We turn millions of unstructured documents into high-value intelligence feeds and datasets, delivered through an intuitive user experience. Our goal is to enable solution providers to connect with cities that have relevant infrastructure needs in a faster, more digital way than historical market channels allow. As more companies adopt our platform, cities & utilities will be able to access solutions that deliver on the promise of moving towards a more resilient, sustainable, and equitable infrastructure future.

Who Are We Looking For?

We’re looking for a skilled and enthusiastic Data Engineer to join our growing team! We’re a close-knit group building and maintaining mission-critical data pipelines, and we need someone who can hit the ground running.

This role requires a solid understanding of and experience with Apache Airflow. You’ll be responsible for designing, building, and maintaining both streaming and batch data pipelines, leveraging various technologies we already use, including Google Cloud Platform (GCP) services like Dataflow, BigQuery, and Vertex AI. Experience with these specific GCP services is a must.

We’re not looking for someone who merely knows these tools, but someone who has built with them – ideally, complex and robust data pipelines in a production environment. We’re interested in seeing examples of your work, so please be prepared to discuss past projects and challenges you’ve overcome.

Beyond the technical skills, we value collaboration, a proactive approach to problem-solving, and a willingness to learn and adapt to evolving technologies. If you’re passionate about data engineering, thrive in a collaborative environment, and are excited by the prospect of working on impactful projects, we encourage you to apply.

What Will You Accomplish?

  • Pipeline Development & Maintenance: Design, build, and deploy highly scalable and reliable data pipelines using Apache Airflow, Dataflow, and other GCP services. This includes everything from initial design and development through testing, deployment, and ongoing maintenance.
  • Data Modeling & Optimization: Collaborate with data analysts and stakeholders to define data requirements, and design efficient and effective data models within BigQuery. You’ll be optimizing queries and pipeline performance for maximum efficiency.
  • Monitoring & Troubleshooting: Implement robust monitoring and alerting for data pipelines. Proactively identify and resolve issues, ensuring data quality and pipeline uptime.
  • Collaboration & Communication: Work closely with other engineers, data scientists, and product teams to understand business requirements and translate them into technical solutions. Clearly communicate technical details and project progress.
  • Continuous Improvement: Contribute to the ongoing improvement of our data infrastructure and processes, including suggesting and implementing new technologies and best practices. We’re always looking for ways to optimize our workflows and improve efficiency.
  • Other duties as assigned.

Technologies We Use:

  • Backend: Python as the main language, Django as the web framework, Cloud SQL as the database, and Airflow/Cloud Composer for orchestration
  • Cloud Infrastructure: Google Cloud Platform
  • Other Tools: Dash & Plotly as the main framework for our dashboards, hosted on Cloud Run

Requirements

  • Proven experience (2+ years) building and maintaining data pipelines using Apache Airflow. We’re particularly interested in seeing examples of complex pipeline orchestration.
  • Strong understanding of data warehousing principles and experience working with BigQuery.
  • Experience with cloud-based data processing frameworks like Apache Beam (ideally with Google Cloud Dataflow).
  • Familiarity with Google Cloud Platform (GCP) services, specifically BigQuery, Dataflow, and Vertex AI. Experience with other GCP services is a plus.
  • Proficiency in at least one scripting language (Python preferred).
  • Experience with version control systems (Git).
  • Excellent problem-solving skills and a proactive approach to identifying and resolving issues.
  • Good communication and collaboration skills – you’ll be working closely with other engineers and stakeholders.
  • Understanding of data modeling concepts and best practices.
  • Experience with CI/CD pipelines is a plus.

Benefits

Why Citylitics?

  • Opportunity to work for one of the top 15 most innovative analytics startups in Canada, revolutionizing data intelligence
  • This is a rare opportunity to influence positive change within one of the biggest societal challenges of our generation: sustainable public infrastructure
  • You get to bring a disruptive solution with a compelling value proposition to an industry that is eager to hear from you, in a market with no direct competition.
  • We live at the intersection of infrastructure, scale-ups, and data science/AI. There is no other team like us in Toronto.
  • There is no corporate bureaucracy here. You will accomplish more here in a few months than you would in a few years at a large, entrenched technology company.
  • We believe that Data and AI will play an outsized role in our future, so we equip every team member with access to Generative AI tools and our full Data Universe to enhance their productivity and encourage innovation through experimentation.
  • We are proud to offer every CityZen an internal mentorship program, in-role professional growth, skill-based development & learning, and internal promotion opportunities.
  • We work hard, we play together, we win as a team! We are on a mission to solve infrastructure while savoring the moment and celebrating the little details along the way.

Citylitics is an equal opportunity employer. We are passionate about providing a safe workplace where everyone is accepted and has the opportunity to grow with us. We are committed to making diversity and inclusivity part of our culture!
