Essential Responsibilities

As a Data Architect, you will be part of a cross-disciplinary team working on development projects, typically involving large, complex data sets. You will act as POD lead, guiding and influencing teams of Data Architects, Data Engineers, RTEs, Scrum Masters, and Product Managers working in concert with partners in GE business units. Potential areas for development could include any data set within the Power Data Lake or beyond, with a focus on delivering outcomes in more than one functional sub-domain.

In this role you will:

- Drive the design and implementation of the data technology roadmap
- Design solutions that map a data product, or a small group of data product outcomes, to an enterprise reference architecture
- Perform hands-on execution of proof-of-concepts and complex requirements
- Participate in data domain technical and business discussions about future architecture direction
- Assist in the analysis, design, and development of roadmaps, design patterns, and implementations based on a current-vs.-future-state view within a cohesive architecture viewpoint
- Gather and analyze data and develop architectural requirements at the project level
- Participate in the Data Governance Council
- Support the development of data and data delivery platforms that are service-oriented, with reusable components that can be orchestrated together in different ways for different businesses
- Research and evaluate emerging data technologies and industry and market trends to assist in project development and/or operational support activities
- Coach and mentor team members
- Effectively monitor and control costs in the organization
- Develop standard data models as Sources of Truth, with a focus on enabling self-service
- Develop and enforce development standards and best practices (standard work) as well as Data Governance standards
- Ensure, and set the path toward, a zero-technical-debt environment
- Demonstrate proficiency in implementing logical/physical data models
- Lead one or more PODs in the SAFe environment
- Groom features and plan end-to-end execution as part of PI planning
- Assign and govern tasks given to developers
- Communicate well, both orally and in writing

Qualifications/Requirements

- Bachelor's Degree with 4-8 years of experience

Desired Characteristics

Technical Expertise:

- Translates analytics problems into data requirements
- Understands logical and physical data models, big data storage architecture, data modelling methodologies, cloud technologies, metadata management, master data management, data lineage, and data profiling
- Deep understanding of how to design, develop, test, and optimize with large data sets
- Understands and builds, in an optimal manner, extraction, transformation, and loading of data from a wide variety of data sources using SQL, Talend, and AWS services
- Understands the technology landscape, stays up to date on current technology trends and new technologies, and brings new ideas to the team
- Exposure to analytical tools, Databricks, Python, and Spark is desirable
- An understanding of basic Data Science and Machine Learning concepts, and of integrating ML model outputs into data pipelines, is an added advantage

Personal Attributes:

Leadership:

- Demonstrated awareness of how to function in a team setting
- Distills information down to key points
- Able to influence the right solution
- Expresses information clearly and concisely; projects knowledge of relevant data
- Demonstrated awareness of how to leverage curiosity and creativity to drive business impact
- Asks follow-up questions when presented with new data/projects; sees the broader implications of an idea
- Presents new ideas and concepts; makes connections among previously unrelated ideas
- Deep passion for learning