Azure Data Engineer

Bangalore, Karnataka, India   |   Other   |   Full-time
Job Reference: 4203 - Posted 18-Aug-2021

Job Title: Azure Data Engineer

Experience level: 4 to 10 years

Location: Bangalore

Job Description:

- Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks (see the sketch after this list).
- Developing scalable and reusable frameworks for ingesting geospatial data sets.
- Working hand in hand with data architects to deploy scalable API services that fit the data architecture platform.
- Advanced working SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases.
- Experience with Azure: ADLS, Databricks, Stream Analytics, SQL DW, Cosmos DB, Analysis Services, Azure Functions, serverless architecture, ARM templates.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with object-oriented/functional scripting languages such as Python and Scala, as well as SQL and Spark SQL.
- Experience building and optimizing ‘big data’ pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Building processes that support data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Nice to have: experience with big data tools such as Hadoop and Kafka.
- Nice to have: experience with data pipeline and workflow management tools such as Azkaban, Luigi and Airflow.
- Nice to have: experience with stream-processing systems such as Storm.
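
For illustration only, the sketch below shows the kind of Spark/Databricks ingestion step described in the first bullet: reading raw CSV from ADLS, applying light cleansing, and writing partitioned Parquet to a curated zone. It assumes PySpark on a cluster already authenticated against the storage account; the paths, storage account and column names are hypothetical placeholders, not part of this role's actual codebase.

```python
# Minimal batch-ingestion sketch: raw CSV in ADLS -> cleansed, partitioned Parquet.
# All paths and names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("raw-to-curated-ingestion").getOrCreate()

RAW_PATH = "abfss://raw@examplestorageaccount.dfs.core.windows.net/events/"
CURATED_PATH = "abfss://curated@examplestorageaccount.dfs.core.windows.net/events/"

# Read the raw landing-zone files.
raw_df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")  # acceptable for a sketch; pin an explicit schema in production
    .csv(RAW_PATH)
)

# Light cleansing plus a partition column for downstream pruning.
curated_df = (
    raw_df
    .dropDuplicates()
    .withColumn("ingest_date", F.current_date())
)

# Append into the curated zone, partitioned by ingestion date.
(
    curated_df.write
    .mode("append")
    .partitionBy("ingest_date")
    .parquet(CURATED_PATH)
)
```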