SQL Azure Jobs in Mumbai

12+ SQL Azure Jobs in Mumbai | SQL Azure Job openings in Mumbai

Apply to 12+ SQL Azure Jobs in Mumbai on CutShort.io. Explore the latest SQL Azure Job opportunities across top companies like Google, Amazon & Adobe.

mazosol
Posted by kirthick murali
Mumbai
10 - 20 yrs
₹30L - ₹58L / yr
Python
R Programming
PySpark
Google Cloud Platform (GCP)
SQL Azure

Data Scientist – Program Embedded 

Job Description:   

We are seeking a highly skilled and motivated senior data scientist to support a big data program. The successful candidate will play a pivotal role across multiple projects in this program, covering traditional tasks such as revenue management, demand forecasting and improving customer experience, as well as testing and using new tools/platforms such as Copilot and Fabric for different purposes. The candidate is expected to have deep expertise in machine learning methodology and applications, and should have completed multiple large-scale data science projects (full cycle from ideation to BAU). Beyond technical expertise, problem solving in complex set-ups will be key to success in this role. This is a data science role directly embedded into the program/projects, so stakeholder management and collaboration with partners are crucial to success (on top of the deep expertise).

What we are looking for: 

  1. Highly proficient in Python/PySpark/R. 
  2. Understands MLOps concepts, with working experience in product industrialization (from a data science point of view). Experience in building products for live deployment, and in continuous development and continuous integration. 
  3. Familiar with cloud platforms such as Azure and GCP, and the data management systems on such platforms. Familiar with Databricks and product deployment on Databricks. 
  4. Experience in ML projects involving techniques such as regression, time series, clustering, classification, dimension reduction and anomaly detection, using both traditional ML and DL approaches. 
  5. Solid background in statistics: probability distributions, A/B testing validation, univariate/multivariate analysis, hypothesis testing for different purposes, data augmentation, etc. 
  6. Familiar with designing testing frameworks for different modelling practices/projects based on business needs. 
  7. Exposure to Gen AI tools, enthusiasm for experimenting, and fresh ideas on what can be done. 
  8. Having improved an internal company process using an AI tool would be a plus (e.g. process simplification, manual task automation, automated emails). 
  9. Ideally 10+ years of experience, including independent business-facing roles. 
  10. CPG or retail experience as a data scientist would be nice, but is not the top priority, especially for those who have navigated multiple industries. 
  11. Being proactive and collaborative is essential. 

 

Some project examples within the program: 

  1. Test new tools/platforms such as Copilot and Fabric for commercial reporting: testing, validation and building trust. 
  2. Building algorithms for predicting trends in category and consumption to support dashboards. 
  3. Revenue Growth Management: create/understand the algorithms behind the tools we need to maintain or choose to improve (which may be built by third parties). Able to prioritize and build a product roadmap. Able to design new solutions and articulate/quantify their limitations. 
  4. Demand forecasting: create localized forecasts to improve in-store availability, with proper model monitoring for early detection of potential issues in the forecast, focusing particularly on improving the end-user experience (see the sketch after this list). 
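To make the demand-forecasting project above concrete, here is a minimal, hypothetical sketch of a localized (per store/SKU) baseline forecast with a simple monitoring check. The column names (store_id, sku, date, units) and the 20% MAPE threshold are illustrative assumptions, not details of the actual program.

```python
# Hypothetical sketch: per-store baseline forecast with a simple monitoring hook.
# Column names and the 20% MAPE alert threshold are illustrative assumptions.
import numpy as np
import pandas as pd

def forecast_next_period(history: pd.DataFrame) -> pd.DataFrame:
    """Naive localized baseline: mean daily units over the last 28 days per (store_id, sku)."""
    cutoff = history["date"].max() - pd.Timedelta(days=28)   # "date" assumed to be datetime
    recent = history[history["date"] >= cutoff]
    return (recent.groupby(["store_id", "sku"], as_index=False)["units"]
                  .mean()
                  .rename(columns={"units": "forecast_units"}))

def needs_review(actuals: pd.Series, forecasts: pd.Series, threshold: float = 0.20) -> bool:
    """Monitoring check: flag the forecast for review when MAPE exceeds the threshold."""
    mape = ((actuals - forecasts).abs() / actuals.replace(0, np.nan)).mean()
    return bool(mape > threshold)
```

In a real project the moving-average baseline would be replaced by a proper time-series model; the monitoring hook is the piece that supports early detection of forecast issues.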


Quinnox

at Quinnox

2 recruiters
Posted by MidhunKumar T
Bengaluru (Bangalore), Mumbai
10 - 15 yrs
₹30L - ₹35L / yr
ADF
Azure Data Lake Services
SQL Azure
Azure Synapse
Spark
+4 more

Mandatory Skills: Azure Data Lake Storage, Azure SQL databases, Azure Synapse, Databricks (PySpark/Spark), Python, SQL, Azure Data Factory.


Good to have: Power BI, Azure IaaS services, Azure DevOps, Microsoft Fabric


• Very strong understanding of ETL and ELT.

• Very strong understanding of Lakehouse architecture.

• Very strong knowledge of PySpark and Spark architecture (see the sketch after this list).

• Good knowledge of Azure Data Lake architecture and access controls.

• Good knowledge of Microsoft Fabric architecture.

• Good knowledge of Azure SQL databases.

• Good knowledge of T-SQL.

• Good knowledge of the CI/CD process using Azure DevOps.

• Power BI.
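To make the PySpark/Lakehouse expectation concrete, here is a minimal, hypothetical ELT sketch on Databricks: it reads raw CSV from Azure Data Lake Storage and writes a cleaned Delta table. The storage paths, column names and table name are placeholders, not details from the job posting.

```python
# Hypothetical PySpark ELT sketch for a Lakehouse-style pipeline on Databricks.
# Paths, columns and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw_path = "abfss://raw@<storage_account>.dfs.core.windows.net/sales/"   # placeholder
cleaned = (
    spark.read.option("header", True).csv(raw_path)
         .withColumn("order_date", F.to_date("order_date"))              # assumed column
         .withColumn("amount", F.col("amount").cast("double"))           # assumed column
         .dropDuplicates(["order_id"])                                   # assumed key
)

# Write as a Delta table so downstream ELT (SQL, Power BI) can query it directly.
(cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.sales_orders"))
```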

AMALA COMMERCE

Agency job
via Prime consulting Inc by Rushikesh Dive
Nagpur, Pune, Mumbai
3 - 7 yrs
₹7L - ₹15L / yr
MS SQL Server
PL/SQL
Office Open XML
SQL Azure

 

KEY RESPONSIBILITIES

·      Develop high-quality database solutions.

·      Use T-SQL to develop and implement procedures and functions.

·      Review and interpret ongoing business report requirements.

·      Research required data.

·      Build appropriate and useful reporting deliverables.

·      Analyze existing SQL queries for performance improvements.

·      Suggest new queries.

·      Develop procedures and scripts for data migration.

·      Provide timely scheduled management reporting.

·      Investigate exceptions with regard to asset movements.

 

MUST-HAVES FOR THIS GIG

T-SQL, stored procedures, functions, triggers, XML operations, JSON support on SQL Server 2016 and above, SSIS, SSRS, CTEs, EAV data structures, integration with NoSQL (MongoDB), SQL Server indexes, bulk insert, BCP, CMD shell, memory optimization, performance tuning, query optimization, database design, table joins, SQL Server Agent jobs, backup and maintenance plans, data migration, good communication.
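As a small, hypothetical illustration of the T-SQL and JSON-on-SQL-Server-2016+ items above, the sketch below calls OPENJSON from Python via pyodbc. The connection string, column names and JSON shape are assumptions for the example, not details from this posting.

```python
# Hypothetical sketch: using SQL Server 2016+ JSON support (OPENJSON) from Python via pyodbc.
# Connection details and the JSON shape are placeholders.
import json
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>;DATABASE=<database>;UID=<user>;PWD=<password>"   # placeholders
)

payload = json.dumps([
    {"order_id": 1, "amount": 250.0},
    {"order_id": 2, "amount": 99.5},
])

# Shred the JSON array into rows using OPENJSON with an explicit WITH schema clause.
sql = """
SELECT j.order_id, j.amount
FROM OPENJSON(?) WITH (
    order_id INT    '$.order_id',
    amount   FLOAT  '$.amount'
) AS j;
"""

cursor = conn.cursor()
for row in cursor.execute(sql, payload):
    print(row.order_id, row.amount)
cursor.close()
conn.close()
```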

 

NICE-TO-HAVES FOR THIS GIG:

  • Working knowledge of mobile development activity.
  • Working knowledge of web hosting solution on IIS7.

  • Experience working with an offshore-onsite development process.

Arting Digital
Posted by Pragati Bhardwaj
Navi Mumbai
6 - 10 yrs
₹15L - ₹18L / yr
Data Science
Machine Learning (ML)
Python
SQL
AWS
+3 more

Title: Data Scientist

Experience: 6 years

Work Mode: Onsite

Primary Skills: Data Science, SQL, Python, Data Modelling, Azure, AWS, Banking Domain (BFSI/NBFC)

Qualification: Any

 

Roles & Responsibilities:-

 

1.  Acquiring, cleaning, and preprocessing raw data for analysis.

2.  Utilizing statistical methods and tools for analyzing and interpreting complex datasets.

3.  Developing and implementing machine learning models for predictive analysis.

4.  Creating visualizations to effectively communicate insights to both technical and non-technical stakeholders.

5.  Collaborating with cross-functional teams, including data engineers, business analysts, and domain experts.

6.  Evaluating and optimizing the performance of machine learning models for accuracy and efficiency.

7.  Identifying patterns and trends within data to inform business decision-making.

8.  Staying updated on the latest advancements in data science, machine learning, and relevant technologies.

 

Requirement:- 

 

1.  Experience with modelling techniques such as linear regression, clustering, and classification (a minimal sketch follows this list).

2.  Must have a passion for data, structured or unstructured. 0.6 – 5 years of hands-on experience with Python and SQL is a must.

3.  Should have sound experience in data mining, data analysis and machine learning techniques.

4.  Excellent critical thinking, verbal and written communication skills.

5.  Ability and desire to work in a proactive, highly engaging, high-pressure, client service environment.

6.  Good presentation skills.
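A minimal, hypothetical sketch of the kind of classification work implied above, using scikit-learn on a synthetic dataset; the data, model choice and split are illustrative assumptions only, not project specifics.

```python
# Hypothetical sketch: a small classification pipeline of the kind referenced above.
# The synthetic data, model choice and metrics are illustrative.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a cleaned, preprocessed dataset.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Scale features, then fit a simple, interpretable baseline classifier.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Evaluate on the held-out set; in practice this would feed model monitoring/reporting.
print(classification_report(y_test, model.predict(X_test)))
```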


Wowinfobiz
Posted by Nargis Mapari
Mumbai
4 - 6 yrs
₹10L - ₹12L / yr
PL/SQL
SQL Azure
Amazon Web Services (AWS)
Oracle
SQL
+4 more

PL/SQL Developer

Experience: 4 to 6 years

Skills: MS SQL Server and Oracle, AWS or Azure


• Experience in setting up the RDS service in cloud technologies such as AWS or Azure (a minimal sketch follows this list).

• Strong proficiency with SQL and its variations among popular databases.

• Should be well-versed in writing stored procedures, functions and packages, and in using collections.

• Skilled at optimizing large, complicated SQL statements.

• Should have worked on migration projects.

• Should have worked on creating reports.

• Should be able to distinguish between normalized and de-normalized data modelling designs and their use cases.

• Knowledge of best practices when dealing with relational databases.

• Capable of troubleshooting common database issues.

• Familiar with tools that can aid with profiling server resource usage and optimizing it.

• Proficient understanding of code versioning tools such as Git and SVN.
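As a small, hypothetical illustration of the first point (provisioning RDS), the sketch below uses boto3 to create a SQL Server RDS instance on AWS. Every identifier, size and credential here is a placeholder assumption; the Azure equivalent would provision an Azure SQL Database instead.

```python
# Hypothetical sketch: provisioning an RDS instance with boto3 (the AWS side of the requirement).
# All identifiers, sizes and credentials below are placeholders.
import boto3

rds = boto3.client("rds", region_name="ap-south-1")  # assumed region

response = rds.create_db_instance(
    DBInstanceIdentifier="reporting-sqlserver-dev",   # placeholder name
    Engine="sqlserver-se",                            # SQL Server Standard Edition
    DBInstanceClass="db.m5.large",                    # placeholder instance size
    AllocatedStorage=100,                             # GiB
    MasterUsername="admin_user",                      # placeholder
    MasterUserPassword="REPLACE_ME",                  # never hard-code credentials in real code
    LicenseModel="license-included",
    PubliclyAccessible=False,
    MultiAZ=False,
)

print(response["DBInstance"]["DBInstanceStatus"])     # typically "creating"
```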


With a leading Business Process Management (BPM) company

Agency job
via Jobdost by Saida Jabbar
Pune, Mumbai, Bengaluru (Bangalore)
2 - 7 yrs
₹5L - ₹16L / yr
DNN
DotNetNuke
ASP.NET MVC
GitHub
DevOps
+13 more

Job Summary

  • The candidate will be responsible for providing full life-cycle development (design, coding, and testing) and maintenance of web-based systems on Azure
  • The candidate should have experience in GitHub; knowledge of DevOps is a plus
  • Experienced in designing and implementing web portals; experience with DNN is a must
  • Ability to work with multiple languages including C#, ASP.NET, MVC, JavaScript and related libraries, HTML, complex SQL queries, CSS, Bootstrap and JSON
  • Experience in the Agile project management methodology
  • Developing and delivering excellent web-based solutions/portals/sites based on the customer’s requirements within the stipulated timeline
  • The candidate should be flexible to learn new technologies and platforms, and should be creative, innovative with improvement ideas, detail-oriented, diligent, and eager to learn and grow

Duties and Responsibilities

  • Understand business requirements to apply logic to integrate functionalities
  • Identify and understand any technical bugs on the server, site, log files or modules and work on resolving the bugs
  • Understand how the FTP server is set up for the site
  • Understand system/site technical requirements and suggest enhancements if applicable
  • Designing, coding, unit testing, and integration with the database
  • Handle site deployment
  • Designing, coding, debugging, technical problem solving, and writing unit test cases, etc.

Qualifications

Education / Certification

  • B.E. / B.Tech. / M.Sc. in Computer Science or IT.
  • MCAD/MCSD/MSITP/MCPD

Technical Expertise

  • ASP/ASP.NET/VB.NET/MVC/C#/SQL Server 2012+
  • HTML, JavaScript, jQuery, CSS, Bootstrap
  • GitHub/DevOps, Azure
  • Web API/Web Services, Email Services

Skills and Abilities

  • Be able to work with diverse global teams and in an individual contributor role as needed
  • Excellent English written and verbal communication skills (for local team and global stakeholders/team members)
  • Strong task management skills including time management, and ability to manage multiple projects simultaneously
  • Flexibility required to attend late evening meetings with global team members
  • Attention to detail and delivering quality projects and knowledge assets

 

Blenheim Chalcot IT Services India Pvt Ltd

Agency job
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), Azure Data Catalog and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration to ensure the solution meets customer requirements. You will be working closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:

• Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics).
• Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
• Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
• Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP/etc.); see the sketch after this list.
• Azure DevOps knowledge is essential for CI/CD of data ingestion pipelines and integrations.
• Experience with object-oriented/object-function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
• Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
• Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
• Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus.
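A minimal, hypothetical sketch of the feed-integration point above: reading one structured (JSON) feed and one flat (CSV) feed with PySpark and landing them side by side in the lake. The paths, schema and column names are assumptions, not details from the posting.

```python
# Hypothetical sketch: ingesting a structured (JSON) feed and a flat (CSV) feed into the lake.
# Paths and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feed-ingestion-sketch").getOrCreate()

# Structured feed delivered as JSON lines (e.g. from an API pull).
orders_json = (spark.read.json("abfss://landing@<account>.dfs.core.windows.net/orders/json/")
                    .withColumn("ingested_at", F.current_timestamp()))

# Flat feed delivered as CSV (e.g. via an SFTP drop).
orders_csv = (spark.read.option("header", True)
                   .option("inferSchema", True)
                   .csv("abfss://landing@<account>.dfs.core.windows.net/orders/csv/")
                   .withColumn("ingested_at", F.current_timestamp()))

# Align the two feeds on their shared columns before merging downstream.
common_cols = [c for c in orders_json.columns if c in orders_csv.columns]
unified = orders_json.select(common_cols).unionByName(orders_csv.select(common_cols))

unified.write.mode("append").parquet("abfss://raw@<account>.dfs.core.windows.net/orders/unified/")
```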
Essential Experience:

• 5 or more years of hands-on experience in a data architect role, covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack.
• Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
• Microsoft Azure and Data certifications, at least at fundamentals level, are a must.
• Experience using agile development methodologies, version control systems and repositories is a must.
• A good, applied understanding of the end-to-end data process development life cycle.
• A good working knowledge of data warehouse methodology using Azure SQL.
• A good working knowledge of the Azure platform and its components, and the ability to leverage its resources to implement solutions, is a must.
• Experience working in the public sector, or in an organisation servicing the public sector, is a must.
• Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an environment undergoing a programme of transformational change.
• The ability to contribute and adhere to standards, excellent attention to detail, and being strongly driven by quality.
Desirables:

• Experience with AWS or Google Cloud Platform will be an added advantage.
• Experience with Azure ML services will be an added advantage.

Personal Attributes:

• Articulate and clear in communications to mixed audiences: in writing, through presentations and one-to-one.
• Ability to present highly technical concepts and ideas in business-friendly language.
• Ability to effectively prioritise and execute tasks in a high-pressure environment.
• Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment.
• Extensive experience working in a team-oriented, collaborative environment as well as working independently.
• Comfortable with the multi-project, multi-tasking consulting Data Architect lifestyle.
• Excellent interpersonal skills with teams and in building trust with clients.
• Ability to support and work with cross-functional teams in a dynamic environment.
• A passion for achieving business transformation, and the ability to energise and excite those you work with.
• Initiative: the ability to work flexibly in a team and to work comfortably without direct supervision.
We will disclose the client name after the initial screening.

Agency job
via S3B Global by Rattan Saini
Bengaluru (Bangalore), NCR (Delhi | Gurgaon | Noida), Pune, Mumbai
9 - 20 yrs
₹10L - ₹40L / yr
Windows Azure
Azure Synapse
Data Structures
SQL Azure
QA DB
+1 more

Job title: Azure Architect

Locations: Noida, Pune, Bangalore and Mumbai

 

Responsibilities:

  • Develop and maintain scalable architecture, database designs and data pipelines, and build out new data source integrations to support continuing increases in data volume and complexity
  • Design and develop the data lake and data warehouse using Azure cloud services
  • Assist in designing end-to-end data and analytics solution architecture and perform POCs within Azure
  • Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
  • Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies
  • Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
  • Support internal presentations to technical and business teams
  • Provide technical guidance, mentoring, code review and design-level technical best practices

 

Experience Needed:

  • 12-15 years of industry experience, with at least 3 years in an architect role and at least 3 to 4 years’ experience designing and building analytics solutions in Azure.
  • Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real time
  • Hands-on experience in the design of reporting schemas and data marts and the development of reporting solutions
  • Develop batch processing, streaming and integration solutions and process structured and non-structured data (a minimal streaming sketch follows this list)
  • Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services and other ETL technologies.
  • Experience in performing design, development & deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python and SSIS)
  • Worked with transactional, temporal, time series, and structured and unstructured data.
  • Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premises and in the cloud): VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux.
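A minimal, hypothetical Spark Structured Streaming sketch for the batch-plus-streaming point above: it treats a landing folder of JSON events as a stream and appends them to a Delta table with checkpointing. The paths, database name and event schema are placeholder assumptions.

```python
# Hypothetical sketch: Spark Structured Streaming from a JSON landing folder into a Delta table.
# Paths, the target table and the event schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("device_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

events = (spark.readStream
               .schema(event_schema)   # streaming file sources require an explicit schema
               .json("abfss://landing@<account>.dfs.core.windows.net/events/"))

query = (events.writeStream
               .format("delta")
               .outputMode("append")
               .option("checkpointLocation",
                       "abfss://raw@<account>.dfs.core.windows.net/_checkpoints/events/")
               .toTable("raw.events"))  # target database/table assumed to exist

query.awaitTermination()
```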

 

 

Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python

IT Giant

Agency job
Remote, Chennai, Bengaluru (Bangalore), Hyderabad, Pune, Mumbai, NCR (Delhi | Gurgaon | Noida), Kolkata
10 - 18 yrs
₹15L - ₹30L / yr
ETL
Informatica
Informatica PowerCenter
Windows Azure
SQL Azure
+2 more
Key skills:
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake

Job Description:
• Minimum of 15 years of experience with Informatica ETL and database technologies.
• Experience with Azure database technologies including Azure SQL Server and Azure Data Lake.
• Exposure to change data capture technology.
• Lead and guide development of an Informatica-based ETL architecture.
• Develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
• Head complex ETL requirements and design.
• Implement an Informatica-based ETL solution fulfilling stringent performance requirements.
• Collaborate with product development teams and senior designers to develop architectural requirements.
• Assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
• Conduct impact assessments and determine size of effort based on requirements.
• Develop full SDLC project plans to implement the ETL solution and identify resource requirements.
• Take an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend and implement ETL process and architecture improvements.
• Assist with and verify the design of the solution and the production of all design-phase deliverables.
• Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
Fragma Data Systems

at Fragma Data Systems

8 recruiters
Posted by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad, Chennai, Mumbai, Pune
8 - 15 yrs
₹16L - ₹28L / yr
PySpark
SQL Azure
azure synapse
Windows Azure
Azure Data Engineer
+3 more
Technology Skills:
  • Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
  • Experience in migrating on-premises data warehouses to data platforms on the Azure cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/StreamSets, Informatica
  • Experience with pre-sales activities (responding to RFPs, executing quick POCs)
  • Capacity planning and performance tuning on the Azure stack and Spark.
PAGO Analytics India Pvt Ltd
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
+6 more
Be an integral part of large-scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution of large projects
Work across all phases of the SDLC, and use software engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python, PySpark, Java, C#, C++) and frameworks (e.g. J2EE or .NET)
Database programming using any flavour of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure across the whole SDLC process, including testing and deployment
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc.
Good knowledge of Python and Spark is required
Good understanding of how to enable analytics using cloud technology and MLOps
Experience in Azure infrastructure and Azure DevOps will be a strong plus
10FA India Private Limited. Formerly known as Prudential Glob

Agency job
via Brace Infotech pvt ltd by Krishna Brace
Mumbai
4 - 10 yrs
₹10L - ₹15L / yr
Windows Azure
Azure
Microsoft Windows Azure
SQL Azure
Platform as a Service (PaaS)
  • Proficiency in integration of various Azure resources (IaaS and PaaS: SQL DB, App Service, Application Insights, Databricks, storage accounts, etc.) to deliver end-to-end automation.
  • Thorough understanding of continuous integration and continuous delivery using Azure DevOps/VSTS.
  • Performing cost analysis of the Azure platform to identify where cost efficiencies could be had.
  • Proficiency in and thorough understanding of the Azure RBAC model.
  • Sound understanding of Azure Active Directory and conditional access policies.
  • Good grasp of Azure governance principles and hands-on experience in rolling out compliance and governance policies.
  • Proficiency in developing infrastructure automation scripts in the form of ARM templates and Azure PowerShell scripts, which can then be provided to application teams as consumables.
  • Effective communication skills, both written and verbal, for technical and non-technical audiences.
  • Good working and hands-on knowledge of Azure IaaS, VNets, subnets, firewalls and NSGs. Sound understanding of networking, DNS, and firewall security products such as Palo Alto.
  • Experience working with Confluence, JIRA, Bitbucket, Git, Jenkins and Sonar for collaboration and continuous integration.
  • Experience with agile methods, along with having found their limitations and ways to overcome them.

 

 

 
