Engineering

Senior Software Engineer - Big Data

Bengaluru, Karnataka   |   Full Time


Our Team 


PharmEasy was founded in 2015 with the sole purpose of making healthcare easily available, accessible, and affordable to all through the extensive use of new-age, cutting-edge technology. Today, we are one of India's largest healthcare aggregators, connecting lakhs of patients to licensed pharmacies & diagnostic centres online for all their medical needs. We particularly cater to the chronic-care segment, offering a range of services including medicine delivery, tele-consultation, sample collection for diagnostic tests, and subscription-based services across all these categories.


Our highly efficient, technology-led Consumer and Supply-chain platforms ensure that medicines are delivered from a licensed pharmacy within six hours of the validation of prescriptions submitted by our customers. These customer promises keep improving with the growing scale of our business and continuous product innovation.


By extensively leveraging the latest in hardware and software technology, we are also committed to eradicating fake medicines from the Pharma ecosystem, which contribute to roughly 30% of drug volumes in India. Our product innovations have brought complete data transparency to the entire Pharma supply chain, empowering even end users to validate the authenticity and genuineness of every medicine sold, using constructs such as unique barcoding of information like expiry dates and the origination of drugs.


With our scalable technology and processes, we are now reliably delivering healthcare services and medicines to every single pin code in the country. 



Data Platform @ PharmEasy


Data Platform Engineering at PharmEasy builds the distributed components, systems, and tools that power decisions at PharmEasy. We have an incredibly rich dataset to collect, transform, and analyze in order to improve the effectiveness of our marketplace and create delight for our customers. Data scientists, machine learning engineers, and other engineers and analysts use this data to make the experience of using the PharmEasy platform better for customers. We leverage existing open-source technologies like Kafka, Hadoop, Redshift, Hive, Presto, and Spark, and also write our own. As a member of our team, you would spend time designing and growing our existing infrastructure, democratizing data access, and promoting the correct use of data and analytics across the company.
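To give a feel for the collect-transform-analyze flow described above, here is a deliberately tiny, self-contained sketch. The event names and fields are hypothetical, and an in-process Python script stands in for what would actually run on Kafka/Spark at production scale — this only illustrates the shape of the work, not the team's real pipeline.

```python
import json
from collections import Counter

# Hypothetical raw event lines, as a pipeline might collect them.
RAW_EVENTS = [
    '{"type": "order_placed", "pincode": "560001"}',
    '{"type": "rx_uploaded", "pincode": "400001"}',
    '{"type": "order_placed", "pincode": "110001"}',
    'not-json',  # malformed records are routed aside, not dropped silently
]

def transform(lines):
    """Parse raw lines, splitting good records from bad ones."""
    good, bad = [], []
    for line in lines:
        try:
            good.append(json.loads(line))
        except json.JSONDecodeError:
            bad.append(line)
    return good, bad

def analyze(events):
    """Aggregate event counts by type -- the 'analyze' step."""
    return Counter(e["type"] for e in events)

events, rejects = transform(RAW_EVENTS)
print(analyze(events))  # Counter({'order_placed': 2, 'rx_uploaded': 1})
print(len(rejects))     # 1
```

Keeping malformed records in a side channel (rather than silently dropping them) is the small-scale analogue of the dead-letter handling a real ingestion pipeline would need.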


Responsibilities:


  • As an integral part of the Data Platform team, take ownership of multiple modules from design to deployment.

  • Extensively build scalable, high-performance distributed systems that deal with large data volumes.

  • Provide resolutions and/or workarounds for data-pipeline-related queries and issues as appropriate.

  • Ensure that the ingestion pipelines feeding the Data Lake and Data Warehouses are up and running.

  • Collaborate with different teams to understand and resolve data availability and consistency issues.

  • Continuously improve problem-resolution skills and strive for excellence.


What are we looking for?


  • Overall 4-8 years of experience in the software industry, with a minimum of 3-4 years on Big Data and related tech stacks; candidates from e-commerce companies preferred.

  • Strong core Java programming skills.

  • Programming skills in Python and Scala are good to have.

  • Good design and documentation skills.

  • Experience working with data at scale.

  • Ability to read and write SQL, and understanding of at least one relational database such as MySQL, Oracle, Postgres, or SQL Server.

  • Development experience with Hadoop, Spark, Kafka, MapReduce, Hive, and NoSQL databases such as HBase.

  • Exposure to tech stacks such as Flink and Druid.

  • Prior exposure to building real-time data pipelines would be an added advantage.

  • Comfortable with Linux, with the ability to write small scripts in Bash/Python and to grapple with log files and Unix processes.

  • Prior experience working with cloud services, preferably AWS.

  • Ability to learn complex new things quickly.
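As a concrete illustration of the SQL skill listed above, the sketch below runs an aggregate query over a hypothetical orders table. An in-memory SQLite database stands in for the relational databases named in the requirements (MySQL, Oracle, Postgres, SQL Server); the schema and data are invented for illustration.

```python
import sqlite3

# In-memory SQLite as a stand-in for a production relational database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, pincode TEXT, amount REAL);
    INSERT INTO orders VALUES (1, '560001', 250.0),
                              (2, '560001', 100.0),
                              (3, '400001', 75.5);
""")

# The kind of read the role calls for: an aggregate per pincode.
rows = conn.execute("""
    SELECT pincode, COUNT(*) AS n, SUM(amount) AS total
    FROM orders
    GROUP BY pincode
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('560001', 2, 350.0), ('400001', 1, 75.5)]
```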
