Data Solution Architect

Job Advert

Want to be at the heart of things? As Data Solution Architect, you’ll have the freedom to use your technical, leadership and commercial skills to transform our business through the latest technologies.

Hyderabad, Andhra Pradesh

Competitive Salary & excellent benefits package

This role forms part of the Enterprise Data organisation. RB intends to develop an Enterprise Data & Analytics Platform to meet analytics and reporting requirements across the business.

Responsibilities

•    Architect scalable data processing and analytics solutions, including technical feasibility assessment and proposal development for Big Data storage, processing and consumption (e.g., development of an enterprise Data Lake strategy, heterogeneous data management, decision support/BI over the Data Lake)
•    Install and configure information systems to support business data needs
•    Analyze structural requirements for new software and applications
•    Utilize Azure services to design and build cost-effective solutions that satisfy business needs for a modern Data Warehouse in the cloud
•    Build environments for storing raw data using Azure Data Lake Storage
•    Migrate data from legacy systems to new solutions
•    Define security and backup procedures and mitigation strategies
•    Coordinate with the Data Science team to identify and refine the platform's data needs
•    Ensure compliance with GxP and regulatory requirements, and continuously improve quality-relevant processes within the area of responsibility

Skills & Experience

•    5–8 years of strong hands-on experience as a Data and Application Architect, with proven experience delivering large-scale applications based on SaaS/PaaS/API/microservice architectures
•    Experience with Azure SQL Database, Azure Data Factory v2, Azure Cosmos DB, Azure Data Lake Storage, Azure Functions, Azure Active Directory and Azure security
•    Good understanding of ETL technologies and the Azure Cloud and Azure Analytics stack, e.g., Azure Data Lake, Azure Databricks, Azure Data Factory
•    Hands-on experience in multiple end-to-end data warehousing/ETL project implementations
•    Good knowledge of service-oriented architecture (SOA) principles
•    Good understanding of databases and analytics, including relational databases (e.g., SQL Server, MySQL, Oracle), data warehousing, big data (Hadoop, Spark), NoSQL and business analytics
•    SME-level knowledge of and experience with Enterprise Integration Patterns
•    Proficiency in at least one of the following languages: R, Python, Java, C#, PowerShell
•    Experience of working in a GxP environment and understanding of industry standards (e.g., 21 CFR Part 11, Annex 11, GAMP 5)

Qualifications required

•    Bachelor's degree in Computer Science, Software Engineering, MIS or equivalent
•    Certification from one of the Big Data vendors (Microsoft, Cloudera, Hortonworks, IBM, Databricks, Teradata) preferred