Introduction

In this role, you’ll work in one of our IBM Consulting Client Innovation Centres (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

At IBM, work is more than a job - it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

Your Role and Responsibilities

- Create Solution Outlines and Macro Designs describing end-to-end product implementation in Data Platforms, including system integration, data ingestion, data processing, the serving layer, design patterns, and platform architecture principles for the data platform
- Contribute to pre-sales and sales support through RfP responses, solution architecture, planning, and estimation
- Contribute to the development of reusable components, assets, and accelerators to support capability development
- Participate in customer presentations as a Platform Architect / Subject Matter Expert on Big Data, Azure Cloud, and related technologies
- Participate in customer PoCs to deliver the outcomes
- Participate in delivery reviews / product reviews and quality assurance, and act as design authority

Required Technical and Professional Expertise

- Experience designing data products that provide descriptive, prescriptive, and predictive analytics to end users or other systems
- Experience in data engineering and architecting data platforms
- Experience architecting and implementing Data Platforms on the Azure Cloud Platform
- Experience on Azure cloud is mandatory (ADLS Gen1 / Gen2, Data Factory, Databricks, Synapse Analytics, Azure SQL, Cosmos DB, Event Hub, Snowflake), plus Azure Purview, Microsoft Fabric, Kubernetes, Terraform, and Airflow
- Experience with the Big Data stack (Hadoop ecosystem: Hive, HBase, Kafka, Spark, Scala, PySpark, Python, etc.) on Cloudera or Hortonworks

Preferred Technical and Professional Expertise

- Experience architecting complex data platforms on the Azure Cloud Platform and on-prem
- Experience with and exposure to implementations of Data Fabric and Data Mesh concepts and solutions such as Microsoft Fabric, Starburst, Denodo, IBM Data Virtualisation, Talend, or Tibco Data Fabric
- Exposure to data cataloging and governance solutions such as Collibra, Alation, Watson Knowledge Catalog, Databricks Unity Catalog, Apache Atlas, Snowflake Data Glossary, etc.