- Google Cloud Platform
responsibilities:
- Work with business stakeholders, subject matter experts, and technical teams to understand the business processes and requirements for various initiatives.
- Document and formalize the business and data requirements using industry-standard methodologies and tools.
- Identify data requirements and specifications that accurately reflect the business needs and support the underlying processes.
- Collaborate with software developers and data architects to ensure that the data architecture and software designs align with the business requirements and goals.
- Support and maintain existing ETL pipelines using Python and GCP, including automated testing of new and existing components in an Agile, DevOps-driven, dynamic environment.
- Define and obtain source data required to successfully deliver insights and use cases
- Determine the data mapping required to join multiple data sets together across multiple sources
- Create methods to highlight and report data inconsistencies, allowing users to review them and provide feedback
- Document the data design and solutions to a high standard for future reference.
- Validate, analyze, and model data, and write clear user stories for Data Engineers.
- Propose suitable data migration sets to the relevant stakeholders
- Assist teams with processing the data migration sets as required
- Assist with the planning, tracking, and coordination of the data migration team, as well as with the migration run-book and the scope for each cutover
- Support data-related tasks such as data mapping, data lineage, data governance, data quality, and metadata management
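The data-consistency responsibility above can be illustrated with a minimal Python sketch. The function, field names, and sources are hypothetical, not part of the role description; it compares two record sets by a shared key and reports missing keys and field-level mismatches for review:

```python
def report_inconsistencies(source_a, source_b, key="customer_id"):
    """Compare two record sets by key and report inconsistencies.

    source_a / source_b: lists of dicts, each containing `key`.
    Returns a list of issue dicts suitable for user review.
    """
    a_by_key = {row[key]: row for row in source_a}
    b_by_key = {row[key]: row for row in source_b}
    issues = []
    # Keys present in only one of the two sources
    for k in a_by_key.keys() ^ b_by_key.keys():
        issues.append({"key": k, "issue": "missing in one source"})
    # Field-level mismatches for keys present in both sources
    for k in a_by_key.keys() & b_by_key.keys():
        for field in a_by_key[k].keys() & b_by_key[k].keys():
            if field != key and a_by_key[k][field] != b_by_key[k][field]:
                issues.append({"key": k, "issue": f"mismatch in {field}"})
    return issues
```

In practice the output would feed a report or dashboard where users flag false positives and confirm genuine data-quality defects.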
requirements-expected:
- Minimum 5 years of experience in data analytics, business intelligence, or related areas
- Good to have: understanding of the PEGA customer analytical data model and experience working with it.
- Experience in a Business/Data Analyst role
- Familiarity with the data management services and technologies available on Google Cloud Platform, such as BigQuery, Cloud Storage, Dataproc, and Dataflow
- Knowledge of PySpark and SQL, and the ability to navigate databases, especially Hive.
- Strong communication skills, with the ability to present complex data insights in an understandable and actionable way
- Good knowledge of the SDLC and formal Agile processes, a bias towards TDD, and a willingness to test products as part of the delivery cycle (SIT, UAT, and production support).
- Knowledge of data modeling, database design, and data management principles.
- Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context.
- Able to handle and manipulate large time-series data sets.
- Strong analytical and problem-solving skills, with the ability to work with complex data sets.
- Attention to detail and ability to manage multiple tasks and priorities
- Understanding of data definition tagging processes, such as logical-to-physical mapping and data processing flows used to measure consistency
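The SQL and data-mapping skills listed above can be sketched with a small, self-contained example. It uses Python's built-in SQLite as a stand-in for Hive/BigQuery, and all table and column names are illustrative: a mapping table joins records across two source systems, as in the data-mapping responsibility described earlier.

```python
import sqlite3

# Two source systems with different keys, plus a mapping table that
# links them -- SQLite stands in for Hive/BigQuery here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE crm_customers (crm_id INTEGER, name TEXT);
    CREATE TABLE billing (billing_id INTEGER, amount REAL);
    CREATE TABLE id_mapping (crm_id INTEGER, billing_id INTEGER);
    INSERT INTO crm_customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO billing VALUES (10, 99.5), (11, 42.0);
    INSERT INTO id_mapping VALUES (1, 10), (2, 11);
""")

# Join the two data sets through the mapping table.
rows = conn.execute("""
    SELECT c.name, b.amount
    FROM crm_customers c
    JOIN id_mapping m ON m.crm_id = c.crm_id
    JOIN billing b ON b.billing_id = m.billing_id
    ORDER BY c.name
""").fetchall()
# rows == [('Alice', 99.5), ('Bob', 42.0)]
```

The same join pattern carries over to PySpark or BigQuery SQL; only the connection and execution APIs differ.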
benefits:
- sharing the costs of sports activities
- private medical care
- life insurance