Responsibilities:
- Work with a team of Architects and Designers to develop end-to-end solutions for building data platforms on Azure.
- Design and build scalable systems on Azure, applying best practices and reducing Azure operational costs.
- Work on projects that ingest large volumes of diverse data streams and organize them into our database (experience with Kafka and similar tools is highly valued).
- Infrastructure and Docker container management.
- Analyze ETL or ELT mapping logic and write complex SQL queries to recreate that logic.
- Build custom Python integrations and plugins to support automation and business functions (a minimal sketch follows this list).
- Create CI/CD pipelines for Azure infrastructure, configuration, and app deployments.
- Work on infrastructure automation using Ansible.
- Demonstrate excellent customer-facing skills.
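
To illustrate the kind of custom Python integration the role involves, here is a minimal sketch that pulls records from a business API and lands them in Azure Blob/ADLS storage. The endpoint URL, container name, blob path, and environment variable are hypothetical placeholders, not part of any specific project.

    import os
    import json

    import requests
    from azure.storage.blob import BlobServiceClient

    # Hypothetical source endpoint and landing container; replace with real values.
    API_URL = "https://example.internal/api/orders"
    CONTAINER = "raw-ingest"


    def pull_and_land() -> str:
        """Fetch records from the business API and write them to blob storage."""
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        records = resp.json()

        # Connection string is read from the environment (e.g., injected by a pipeline secret).
        client = BlobServiceClient.from_connection_string(
            os.environ["AZURE_STORAGE_CONNECTION_STRING"]
        )
        container = client.get_container_client(CONTAINER)

        blob_name = "orders/latest.json"
        container.upload_blob(name=blob_name, data=json.dumps(records), overwrite=True)
        return blob_name


    if __name__ == "__main__":
        print(f"Landed payload at {pull_and_land()}")

In practice a script like this would be scheduled and parameterized through the CI/CD and automation tooling listed below (e.g., Ansible or a pipeline job) rather than run by hand.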
Experience in:
- The development, automation, or DevOps space.
- Programming/Scripting - Bash, Python, PowerShell, Ruby
- Infrastructure knowledge (network, storage, Linux & Windows, and system administration).
- Experience with Azure Cloud (Azure IaaS); AWS experience is acceptable, but Azure is preferred.
- Experience automating infrastructure, testing, and deployments in the cloud with tools such as Puppet, Chef, and Ansible, along with code repositories/version control and CI/CD tools (e.g., Git, Jenkins).
- Knowledge of network protocols and troubleshooting DNS, TCP/IP, SMTP, SSH, SFTP, and SSL.
- Experience with big data components such as Azure Databricks, ADLS, Kafka, Spark SQL, and Hive (a short Databricks example appears at the end of this section).
- Experience monitoring Azure services for errors and notifications.
- Experience with Azure IoT Edge is preferred.
- Excellent programming skills.
- Excellent communication skills.
Technology Stack: Data Factory, Data Lake Store (Gen 2), Databricks, Synapse Analytics, Cosmos DB, Azure SQL, Azure Analysis Services.
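
As an illustration of the Databricks/Spark SQL work mentioned above, the following is a minimal sketch of recreating an ETL mapping as a Spark SQL view and persisting the result as a table. All database, table, and column names are assumed for the example.

    # Hypothetical Databricks notebook cell; table and column names are illustrative only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # already provided in a Databricks notebook

    # Recreate the documented ETL mapping as a SQL view.
    spark.sql("""
        CREATE OR REPLACE TEMP VIEW curated_orders AS
        SELECT
            o.order_id,
            c.customer_key,
            CAST(o.order_ts AS DATE)    AS order_date,
            ROUND(o.amount * x.rate, 2) AS amount_usd
        FROM raw.orders o
        JOIN raw.customers c ON o.customer_id = c.customer_id
        JOIN ref.fx_rates x  ON o.currency = x.currency
                            AND CAST(o.order_ts AS DATE) = x.rate_date
        WHERE o.status <> 'CANCELLED'
    """)

    # Persist the curated result so downstream consumers (e.g., Synapse, reporting) can query it.
    spark.table("curated_orders").write.mode("overwrite").saveAsTable("curated.orders")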