Python Software Developer – Data Pipeline Development
Location: Toronto, Canada
Our client, a prominent financial services firm, is looking for a Python Software Developer to join their data engineering team. This role is focused on designing, developing, and maintaining ETL pipelines to streamline the ingestion and processing of fundamental data. This mid-level position offers the opportunity to work on impactful data infrastructure projects in a collaborative, front-office environment.
Responsibilities
- Develop, implement, and optimize robust data pipelines using Python to process and integrate large volumes of fundamental data.
- Build and maintain ETL pipelines that ingest, clean, and transform data to support a range of analytical and operational applications (a brief illustrative sketch follows this list).
- Collaborate with cross-functional teams and key stakeholders, including data analysts, quantitative researchers, and other software developers, to ensure data pipelines align with business requirements and data quality standards.
- Ensure the data infrastructure is scalable and reliable enough to deliver high-quality data in real time.
- Apply best practices in software engineering, including code reviews, version control, automated testing, and continuous integration to deliver clean, maintainable, and efficient code.
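To give a concrete sense of the day-to-day work, here is a minimal, illustrative sketch of the kind of ETL step described above, using Pandas and SQLAlchemy. The file path, connection string, table, and column names are hypothetical assumptions for the example, not details of the client's actual stack.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string; a real pipeline would load this from configuration.
ENGINE = create_engine("postgresql+psycopg2://etl_user:etl_pass@localhost:5432/fundamentals")

def run_earnings_pipeline(csv_path: str) -> int:
    """Extract a hypothetical quarterly-earnings file, clean it, and load it into PostgreSQL."""
    # Extract: read the raw fundamental data drop.
    raw = pd.read_csv(csv_path, parse_dates=["report_date"])

    # Transform: basic quality checks -- drop incomplete rows, deduplicate, coerce numeric types.
    cleaned = (
        raw.dropna(subset=["ticker", "report_date"])
           .drop_duplicates(subset=["ticker", "report_date"])
           .assign(eps=lambda df: pd.to_numeric(df["eps"], errors="coerce"))
    )

    # Load: append into a relational table for downstream analytical use.
    cleaned.to_sql("quarterly_earnings", ENGINE, if_exists="append", index=False)
    return len(cleaned)
```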
Technical Skills & Qualifications
- Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field.
- Proficiency in Python for data processing and pipeline development, with a strong understanding of libraries like Pandas and SQLAlchemy.
- Experience with SQL and relational databases (e.g., PostgreSQL) for data storage, manipulation, and retrieval.
- Familiarity with ETL processes and tooling, and hands-on experience building pipelines that handle high data volumes.
- Knowledge of cloud development environments, preferably AWS, for handling data ingestion and storage at scale.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) to support scalable and flexible deployment.
- Experience with Apache Airflow for orchestrating complex data workflows (a short example DAG sketch follows this list).
- Data transformation skills, with a solid understanding of data quality, cleaning, and validation techniques.
- Strong problem-solving skills and the ability to communicate technical concepts effectively within a team.
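As a rough illustration of the Airflow orchestration mentioned above, the sketch below wires three placeholder tasks into a daily DAG, assuming Airflow 2.4 or later. The DAG id, schedule, and callables are assumptions for the example rather than a description of the client's workflows.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for the real extract/transform/load logic.
def extract(): ...
def transform(): ...
def load(): ...

with DAG(
    dag_id="fundamental_data_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # requires Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Declare ordering: extract, then transform, then load.
    extract_task >> transform_task >> load_task
```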
Preferred Experience
- Prior experience working with fundamental data in the financial sector, such as corporate earnings data, macroeconomic indicators, and other non-price-based financial data.
- Familiarity with front-office environments, ideally within finance, where high-quality data drives decision-making.
- Knowledge of data architecture best practices and experience in building scalable, maintainable data pipelines.
Apply now