Career | Principal Data Engineer | Resilinc

Principal Data Engineer


Pune, IN
  • Job Type: Full-Time
  • Function: Data Science
  • Industry: SaaS
  • Post Date: 01/25/2023
  • Website:
  • Company Address: 1525 McCarthy Blvd., Suite 1122, Milpitas, California 95035, US
  • Salary Range: $50,000 - $150,000

About Resilinc

Resilinc was founded with the purpose to strengthen global supply chains, making them resilient, ethical, transparent, and secure. We do this via our technology-driven solutions which create an ecosystem where organizations have unmatched visibility into their supply networks and can collaborate with their suppliers in a transparent environment.

Job Description

About Resilinc

Resilinc is the industry-leading provider of Supply Chain Risk Management (SCRM) solutions for the extended supply chain. Resilinc’s solutions deliver supply chain visibility, sophisticated risk analytics, and robust strategies to mitigate the risk of supply chain disruptions. Our unique and innovative SaaS technology products deliver the leading end-to-end solution that addresses the fundamental problem of improving supply chain visibility through multiple tiers and fosters collaboration between ecosystem partners. Our information and analytics platform enables customers to proactively monitor their supply chain for critical exposures to global regions and work collaboratively with suppliers on risk mitigation and crisis response.

We are a leading Data Platform that uses AI to help enterprises leverage supply chain data on a global scale. We have one of the largest repositories of supply chain mapping data worldwide. We help Fortune 200 corporations digitize their supply chains by mapping partners, analyzing risk, and assessing capacity and business continuity, using an AI-based, data-rich platform.


Education:

B.Tech / B.E. or M.S. in Computer or Information Science.

Past Experience:

10+ years of hands-on experience building data solutions
Expert-level knowledge of and experience in SQL (PostgreSQL preferred), including complex joins, CTEs, and query optimization
ETL/ELT script development experience
Excellent programming skills in Python with OOP principles and SQL
Expertise in using REST APIs for data extraction
Experience with Master Data Management
Deep experience with Python libraries like Pandas, NumPy, Psycopg2, etc.
Expertise in building data pipelines on public clouds such as MS Azure and AWS
Prior experience in providing technical mentorship to colleagues
Experience with Python async programming highly desired
CI / CD experience a plus
Experience with NoSQL databases (MongoDB, Cassandra, ArangoDB preferred) and graph databases (JanusGraph, Virtuoso preferred) highly desired
Working knowledge of ML and AI libraries (PyTorch, TensorFlow, etc.) desired
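As a small, hedged illustration of the SQL depth described above (complex joins and CTEs): the sketch below uses SQLite from Python's standard library purely for demonstration; the table and column names are hypothetical and are not Resilinc's schema or preferred database.

```python
import sqlite3

# In-memory database with hypothetical supplier/shipment tables (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE suppliers (id INTEGER PRIMARY KEY, name TEXT, tier INTEGER);
    CREATE TABLE shipments (supplier_id INTEGER, qty INTEGER);
    INSERT INTO suppliers VALUES (1, 'Acme', 1), (2, 'Globex', 2);
    INSERT INTO shipments VALUES (1, 100), (1, 50), (2, 75);
""")

# A CTE pre-aggregates shipment volume, then joins back to suppliers.
query = """
    WITH volume AS (
        SELECT supplier_id, SUM(qty) AS total_qty
        FROM shipments
        GROUP BY supplier_id
    )
    SELECT s.name, s.tier, v.total_qty
    FROM suppliers s
    JOIN volume v ON v.supplier_id = s.id
    ORDER BY v.total_qty DESC;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('Acme', 1, 150), ('Globex', 2, 75)]
```

The same CTE-plus-join pattern carries over to PostgreSQL via a driver such as Psycopg2.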


Responsibilities:

Develop solutions that automate critical tasks around data research, data enrichment, and Master Data Management
Build robust data pipelines to grow our data repositories
Design and implement data pipelines to move data between platforms
Take ownership of Master Data Management; build automations, and deploy tools to maintain the highest level of data quality
Build automated solutions to validate data quality and fix data problems
Operate independently to develop automation and data solutions
Explore and recommend technologies and optimal alternatives for the best fit solution
Operate in a cross-functional environment, collaborating with Engineering, Product Management, Data Science, and Data Research teams
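One way the data-quality validation described above might look in practice is lightweight record-level checks. The sketch below uses only the Python standard library; the field names and rules are hypothetical, not Resilinc's actual quality criteria.

```python
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    """Collects quality problems found in a single record."""
    errors: list = field(default_factory=list)

    @property
    def ok(self):
        return not self.errors

def validate_supplier_record(record):
    """Check a supplier record against basic quality rules (hypothetical)."""
    result = ValidationResult()
    if not record.get("name", "").strip():
        result.errors.append("missing name")
    tier = record.get("tier")
    if not isinstance(tier, int) or tier < 1:
        result.errors.append("tier must be a positive integer")
    return result

good = validate_supplier_record({"name": "Acme", "tier": 1})
bad = validate_supplier_record({"name": " ", "tier": 0})
print(good.ok)      # True
print(bad.errors)   # ['missing name', 'tier must be a positive integer']
```

In a pipeline, checks like these would typically run as a validation stage, with failing records routed to a fix-up queue rather than loaded.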

Soft Skills required:

Excellent communication skills, including the ability to present effectively to CXO-level audiences
Detail oriented with strong analytical & problem-solving skills
Sound project planning & organizing skills
Self-motivated, sets high performance standards for oneself and colleagues
