Design, architect, and build solutions and tools for the big data platform
Mediate and coordinate resolution of software project deliverables using agile methodology
Develop pipelines to ingest data into the big data platform based on business demands and use cases
Develop analytical platforms that make data available to end users for exploration, advanced analytics, and visualization for day-to-day business reporting
Provide guidance and advice to technology teams on the best use of the latest technologies and designs to deliver a best-in-class platform in the most cost-effective way
Develop automated monitoring solutions to be handed over to support teams to run and operate the platform efficiently
Automate and productionize data science models on the big data engineering platform
BS or MS in computer science or equivalent practical experience
At least 2-3 years of coding experience outside an academic setting
Experience in object-oriented development
Proficient understanding of distributed computing principles
Experience in collecting, storing, processing, and analyzing large volumes of data
Experience building stream-processing systems using solutions such as Storm or Spark Streaming
Experience with various messaging systems, such as Kafka or RabbitMQ
Experience with NoSQL databases, such as HBase, Cassandra, or MongoDB
Understanding of big data platform distributions such as Cloudera, MapR, or Hortonworks