
11+ DMS Jobs in Mumbai | DMS Job openings in Mumbai

Apply to 11+ DMS Jobs in Mumbai on CutShort.io. Explore the latest DMS Job opportunities across top companies like Google, Amazon & Adobe.

Encubate Tech Private Ltd

Agency job
via staff hire solutions by Purvaja Patidar
Mumbai
5 - 6 yrs
₹15L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
Data modeling
ETL
Agile/Scrum
+7 more

Roles and Responsibilities

Seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. Must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehouse and reporting techniques.

  • Extensive experience in providing AWS Cloud solutions to various business use cases.
  • Creating star schema data models, performing ETL and validating results with business representatives.
  • Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance and communicating functional and technical issues.
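The star-schema modelling and ETL validation described above can be sketched in a few lines of pandas; the tables, columns and values here are invented for illustration, and a real pipeline would read from S3 or Redshift rather than an inline frame:

```python
import pandas as pd

# Hypothetical raw extract; a real ETL would pull this from S3 or an RDBMS.
sales = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["Acme", "Acme", "Beta"],
    "amount": [100.0, 250.0, 75.0],
})

# Dimension table: one surrogate key per distinct customer.
dim_customer = sales[["customer"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index

# Fact table references the dimension by surrogate key only.
fact_sales = sales.merge(dim_customer, on="customer")[
    ["order_id", "customer_key", "amount"]]

# The kind of validation reviewed with business representatives:
# row counts and totals must survive the transformation unchanged.
assert fact_sales["amount"].sum() == sales["amount"].sum()
print(fact_sales)
```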

Job Description:

This position is responsible for the successful delivery of business intelligence information to the entire organization and is experienced in BI development and implementations, data architecture and data warehousing.

Requisite Qualification

Essential: AWS Certified Database Specialty or AWS Certified Data Analytics

Preferred: Any other Data Engineer certification

Requisite Experience

Essential: 4-7 yrs of experience

Preferred: 2+ yrs of experience in ETL & data pipelines

Skills Required

  • AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
  • Big data: Databricks, Spark, Glue and Athena
  • Expertise in Lake Formation, Python programming, Spark, Shell scripting
  • Minimum Bachelor's degree with 5+ years of experience in designing, building, and maintaining AWS data components
  • 3+ years of experience in data component configuration, related roles and access setup
  • Expertise in Python programming
  • Knowledge in all aspects of DevOps (source control, continuous integration, deployments, etc.)
  • Comfortable working with DevOps: Jenkins, Bitbucket, CI/CD
  • Hands-on ETL development experience, preferably using SSIS
  • SQL Server experience required
  • Strong analytical skills to solve and model complex business requirements
  • Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, data warehouse and reporting techniques

Preferred Skills

  • Experience working in the Scrum environment.
  • Experience in Administration (Windows/Unix/Network/Database/Hadoop) is a plus.
  • Experience in SQL Server, SSIS, SSAS, SSRS
  • Comfortable with creating data models and visualization using Power BI
  • Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
  • Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members

TIFIN FINTECH

Posted by Vrishali Mishra
Mumbai
2 - 5 yrs
Best in industry
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+2 more

Quant Research, TIFIN

Mumbai, India


WHO WE ARE:

TIFIN is a fintech platform backed by industry leaders including JP Morgan, Morningstar, Broadridge, Hamilton Lane, Franklin Templeton, Motive Partners and a who’s who of the financial service industry. We are creating engaging wealth experiences to better financial lives through AI and investment intelligence powered personalization. We are working to change the world of wealth in ways that personalization has changed the world of movies, music and more but with the added responsibility of delivering better wealth outcomes.

We use design and behavioral thinking to enable engaging experiences through software and application programming interfaces (APIs). We use investment science and intelligence to build algorithmic engines inside the software and APIs to enable better investor outcomes.

In a world where every individual is unique, we match them to financial advice and investments with a recognition of their distinct needs and goals across our investment marketplace and our advice and planning divisions.


OUR VALUES: Go with your GUT

●     Grow at the Edge. We are driven by personal growth. We get out of our comfort zone and keep egos aside to find our genius zones. With self-awareness and integrity we strive to be the best we can possibly be. No excuses.

●     Understanding through Listening and Speaking the Truth. We value transparency. We communicate with radical candor, authenticity and precision to create a shared understanding. We challenge, but once a decision is made, commit fully.

●     I Win for Teamwin. We believe in staying within our genius zones to succeed and we take full ownership of our work. We inspire each other with our energy and attitude. We fly in formation to win together.

 

 

WHAT YOU'LL BE DOING:

We are looking for an experienced quantitative professional to develop, implement, test, and maintain the core algorithms and R&D framework for our Investment and investment advisory platform. The ideal candidate for this role has successfully implemented and maintained quantitative and statistical modules using modular software design constructs. The candidate needs to be a responsible product owner, a problem solver and a team player looking to make a significant impact on a fast-growing company. The successful candidate will directly report to the Head of Quant Research & Development.

 

 

RESPONSIBILITIES:

  • The end-to-end research, development, and maintenance of the investment platform, data and algorithms
  • Take part in building out the R&D back testing and simulation engines
  • Thoroughly vet investment algorithmic results 
  • Contribute to the research data platform design
  • Investigate datasets for use in new or existing algorithms
  • Participate in agile development practices
  • Liaise with stakeholders to gather & understand the functional requirements
  • Take part in code reviews, ensuring quality meets the highest standards
  • Develop software using high quality standards and best practices, conduct thorough end-to-end unit testing, and provide support during testing and post go-live
  • Support research innovation through the creative and aggressive experimentation of cutting-edge hardware, software, processes, procedures, and methods
  • Collaborate with technology teams to ensure appropriate requirements, standards, and integration

 

REQUIREMENTS:

  • Experience in a quant research & development role
  • Proficient in Python, Git and Jira
  • Knowledge in SQL and database development (PostgreSQL is a plus)
  • Understanding of R and RMarkdown is a plus
  • Bachelor’s degree in computer science, computational mathematics, or financial engineering 
  • Master’s degree or advanced training is a strong plus
  • Excellent mathematical foundation and hands-on experience working in the finance industry
  • Proficient in quantitative, statistical, and ML/AI techniques and their implementation using Python modules such as Pandas, NumPy, SciPy, SciKit-Learn, etc.
  • Strong communication (written and oral) and analytical problem-solving skills
  • Strong sense of attention to detail, pride in delivering high quality work and willingness to learn
  • An understanding of or exposure to financial capital markets, various financial instruments (such as stocks, ETFs, Mutual Funds, etc.), and financial tools (such as Bloomberg, Reuters, etc.)
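The quantitative/statistical toolkit listed above (NumPy, scikit-learn, etc.) can be illustrated with a small regression on synthetic factor data; the factor count, coefficients and noise level below are made up purely for demonstration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # e.g. three hypothetical factor exposures
# Synthetic returns: a known linear signal plus small noise.
y = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(scale=0.01, size=200)

model = LinearRegression().fit(X, y)
r2 = model.score(X, y)  # in-sample fit; a real workflow would cross-validate
print(round(r2, 4))
```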


 

BENEFITS PACKAGE:

TIFIN offers a competitive benefits package that includes:

· Performance-linked variable compensation.

· Medical insurance

· Tax saving benefits

· Flexible PTO policy and Company-paid holidays

· Parental Leave: 6-month paid maternity leave, 2-week paid paternity leave

· Access to our Wellness trainers, including 1:1 personal coaching for executives and rising stars

 

A note on location: while we have team centres in Boulder, New York City, San Francisco, Charlotte, and Bangalore, this role is based out of Mumbai.

 

TIFIN is proud to be an equal opportunity workplace and values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status.

 

 

 

 

 

Fatakpay

Posted by Ajit Kumar
Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
SQL
Python
Problem solving
Data Warehouse (DWH)
Excel VBA

1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.


2. Ability to search, extract, transform and load data from various databases, cleanse and refine data until it is fit-for-purpose


3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met


4. Prioritization of issues to meet deadlines while ensuring high-quality delivery


5. Ability to pull data and to perform ad hoc reporting and analysis as needed


6. Ability to adapt quickly to new and changing technical environments as well as strong analytical and problem-solving abilities


7. Strong interpersonal and presentation skills


SKILLS:


1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel


2. Experience working with senior decision-makers


3. Strong advanced SQL/MySQL and Python skills with the ability to fetch data from the Data Warehouse as per the stakeholder's requirement


4. Good Knowledge and experience in Excel VBA and advanced excel


5. Good experience in building Tableau analytical dashboards as per the stakeholder's reporting requirements


6. Strong communication/interpersonal skills
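Skill 3 above — fetching warehouse data per a stakeholder request with SQL and Python — might look like the sketch below. An in-memory SQLite database stands in for the real Data Warehouse, and the table and columns are invented:

```python
import sqlite3
import pandas as pd

# Illustrative only: a throwaway SQLite database stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER, amount REAL, status TEXT)")
conn.executemany("INSERT INTO loans VALUES (?, ?, ?)",
                 [(1, 5000.0, "active"), (2, 12000.0, "closed"),
                  (3, 8000.0, "active")])

# Fetch only what the stakeholder asked for, aggregating in SQL
# rather than pulling raw rows into Python.
report = pd.read_sql_query(
    "SELECT status, COUNT(*) AS n, SUM(amount) AS total "
    "FROM loans GROUP BY status ORDER BY status", conn)
print(report)
```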


PERSONA:


1. Experience in working on ad hoc requirements


2. Ability to adapt to shifting priorities


3. Experience in working in the Fintech or E-commerce industry is preferable


4. Engineering background with 2+ years of experience as a Business Analyst for finance processes

LogiNext

Posted by Rakhi Daga
Mumbai
8 - 10 yrs
₹21L - ₹30L / yr
Data Science
Machine Learning (ML)
Python
PHP
Java

LogiNext is looking for a technically savvy and passionate Principal Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.

In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.

Your goal will be to help our company analyze trends to make better decisions. Data scientists need to understand how the software works; apart from experience developing in R and Python, they must know modern approaches to software development and their impact. DevOps, continuous integration and deployment, and experience in cloud computing are everyday skills needed to manage and process data.

Responsibilities :

  • Adapt and enhance machine learning techniques based on physical intuition about the domain
  • Design sampling methodology; prepare data, including data cleaning, univariate analysis and missing value imputation; identify appropriate analytic and statistical methodology; develop predictive models and document process and results
  • Lead projects both as a principal investigator and project manager, responsible for meeting project requirements on schedule and on budget
  • Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products
  • Support and mentor data scientists
  • Maintain and work with our data pipeline that transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive & Impala
  • Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing)
  • Understand the data generated by experiments, and produce actionable, trustworthy conclusions from them
  • Apply data analysis, data mining and data processing to present data clearly and develop experiments (A/B testing)
  • Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data scientist duties
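The controlled-experiment (A/B testing) work mentioned above often reduces to a significance test. Here is a minimal two-proportion z-test using only the Python standard library; the conversion counts are made up for illustration:

```python
from math import sqrt
from statistics import NormalDist

conv_a, n_a = 120, 1000   # control: conversions / users (invented numbers)
conv_b, n_b = 150, 1000   # variant

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
# Standard error under the pooled null hypothesis of equal rates.
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
print(round(z, 2), round(p_value, 4))
```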


Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Math, Physics, Engineering, Statistics or other technical field; PhD preferred
  • 8 to 10 years of experience in data mining, data modeling, and reporting
  • 5+ years of experience working with large data sets or doing large-scale quantitative analysis
  • Expert SQL scripting required
  • Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++ or C#
  • Experience working with Hadoop, Pig/Hive, Spark, MapReduce
  • Ability to drive projects
  • Basic understanding of statistics – hypothesis testing, p-values, confidence intervals, regression, classification, and optimization are core lingo
  • Analysis: should be able to perform exploratory data analysis and get actionable insights from the data, with impressive visualization
  • Modeling: should be familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required
  • Strong algorithmic problem-solving skills
  • Experience manipulating large data sets through statistical software (e.g. R, SAS) or other methods
  • Superior verbal, visual and written communication skills to educate and work with cross-functional teams on controlled experiments
  • Experimentation design or A/B testing experience is preferred
  • Experience in leading a team required

Global Media Agency - A client of Merito

Agency job via Merito by Merito Talent
Gurugram, Bengaluru (Bangalore), Mumbai
4 - 9 yrs
Best in industry
Machine Learning (ML)
Data Science
media analytics
SQL
Python
+4 more

Our client combines Adtech and Martech platform strategy with data science & data engineering expertise, helping our clients make advertising work better for people.

 
Key Role:
  • Act as primary day-to-day contact on analytics to agency-client leads
  • Develop bespoke analytics proposals for presentation to agencies & clients, for delivery within the teams
  • Ensure delivery of projects and services across the analytics team meets our stakeholder requirements (time, quality, cost)
  • Hands on platforms to perform data pre-processing that involves data transformation as well as data cleaning
  • Ensure data quality and integrity
  • Interpret and analyse data problems
  • Build analytic systems and predictive models
  • Increase the performance and accuracy of machine learning algorithms through fine-tuning and further optimization
  • Visualize data and create reports
  • Experiment with new models and techniques
  • Align data projects with organizational goals
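The fine-tuning responsibility above can be illustrated with a scikit-learn grid search over a toy dataset; the model and parameter grid are illustrative choices, not the client's actual setup:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Synthetic classification data standing in for a real campaign dataset.
X, y = make_classification(n_samples=300, random_state=0)

# Cross-validated search over a hypothetical regularisation grid.
search = GridSearchCV(LogisticRegression(max_iter=1000),
                      {"C": [0.01, 0.1, 1.0, 10.0]}, cv=3)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```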


Requirements

  • Min 6-7 years’ experience working in Data Science
  • Prior experience as a Data Scientist within a digital media company is desirable
  • Solid understanding of machine learning
  • A degree in a quantitative field (e.g. economics, computer science, mathematics, statistics, engineering, physics, etc.)
  • Experience with SQL/ Big Query/GMP tech stack / Clean rooms such as ADH
  • A knack for statistical analysis and predictive modelling
  • Good knowledge of R, Python
  • Experience with SQL, MYSQL, PostgreSQL databases
  • Knowledge of data management and visualization techniques
  • Hands-on experience on BI/Visual Analytics Tools like PowerBI or Tableau or Data Studio
  • Evidence of technical comfort and good understanding of internet functionality desirable
  • Analytical pedigree - evidence of having approached problems from a mathematical perspective and working through to a solution in a logical way
  • Proactive and results-oriented
  • A positive, can-do attitude with a thirst to continually learn new things
  • An ability to work independently and collaboratively with a wide range of teams
  • Excellent communication skills, both written and oral
Virtusa

Agency job via Response Informatics by Anupama Lavanya Uppala
Chennai, Bengaluru (Bangalore), Mumbai, Hyderabad, Pune
3 - 10 yrs
₹10L - ₹25L / yr
PySpark
Python
  • Minimum 1 year of relevant experience in PySpark (mandatory)
  • Hands-on experience in developing, testing, deploying, maintaining and improving data integration pipelines in an AWS cloud environment is an added plus
  • Ability to play a lead role and independently manage a 3-5 member PySpark development team
  • EMR, Python and PySpark are mandatory
  • Knowledge of and awareness working with AWS Cloud technologies like Apache Spark, Glue, Kafka, Kinesis, Lambda, S3, Redshift and RDS
Aideo Technologies

Posted by Akshata Alekar
Mumbai, Navi Mumbai
3 - 8 yrs
₹4L - ₹22L / yr
Tableau
Natural Language Processing (NLP)
Computer Vision
Python
RESTful APIs
+3 more

We are establishing infrastructure for internal and external reporting using Tableau and are looking for someone with experience building visualizations and dashboards in Tableau and using Tableau Server to deliver them to internal and external users. 

 

Required Experience 

  • Implementation of interactive visualizations using Tableau Desktop  
  • Integration with Tableau Server and support of production dashboards and embedded reports with it 
  • Writing and optimization of SQL queries  
  • Proficient in Python including the use of Pandas and numpy libraries to perform data exploration and analysis 
  • 3 years of experience working as a Software Engineer / Senior Software Engineer
  • Bachelor's in Engineering – Electronics and Communication, Computer, or IT
  • Well versed with basic data structures, algorithms and system design
  • Should be capable of working well in a team – and should possess very good communication skills 
  • Self-motivated and fun to work with and organized 
  • Productive and efficient working remotely 
  • Test driven mindset with a knack for finding issues and problems at earlier stages of development 
  • Interest in learning and picking up a wide range of cutting edge technologies 
  • Should be curious and interested in learning some Data science related concepts and domain knowledge 
  • Work alongside other engineers on the team to elevate technology and consistently apply best practices 

 

Highly Desirable 

  • Data Analytics 
  • Experience in AWS cloud or any cloud technologies 
  • Experience in Big Data technologies and streaming like PySpark and Kafka is a big plus
  • Shell scripting  
  • Preferred tech stack – Python, REST API, microservices, Flask/FastAPI, pandas, numpy, Linux, shell scripting, Airflow, PySpark
  • Strong backend experience, having worked with microservices and REST APIs – Flask, FastAPI, relational and non-relational databases
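A small sketch of the pandas-based data exploration mentioned above, producing the kind of aggregate typically fed to a Tableau dashboard; the dataset and column names are invented:

```python
import pandas as pd

# Synthetic extract standing in for a reporting data source.
df = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "product": ["A", "B", "A", "B"],
    "revenue": [120.0, 80.0, 150.0, 90.0],
})

# The kind of aggregation usually pushed to a visualization layer:
pivot = df.pivot_table(index="month", columns="product",
                       values="revenue", aggfunc="sum")
# Month-over-month growth per product.
growth = pivot.loc["Feb"] / pivot.loc["Jan"] - 1
print(growth.round(3))
```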
CRG Solutions Pvt Ltd

Posted by Sweety Patyal
Mumbai
2 - 4 yrs
₹4L - ₹8L / yr
ETL
SQL Server Integration Services (SSIS)
Microsoft SSIS
SQL
SQL server
  • Should be able to connect with multiple data sources like Excel, CSV, SQL Server, Oracle, etc.
  • Should be able to use the transformation components to transform the data
  • Should possess knowledge on incremental load, full load, etc.
  • Should design, build and deploy effective packages
  • Should be able to schedule these packages through task schedulers
  • Implement stored procedures and effectively query a database
  • Translate requirements from the business and analysts into technical code
  • Identify and test for bugs and bottlenecks in the ETL solution
  • Ensure the best possible performance and quality in the packages
  • Provide support and fix issues in the packages
  • Write advanced SQL including some query tuning
  • Experience in the identification of data quality issues
  • Some database design experience is helpful
  • Experience designing and building complete ETL/SSIS processes moving and transforming data for ODS, Staging, and Data Warehousing
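The incremental-load concept asked for above can be sketched outside SSIS in a few lines of Python; an in-memory SQLite table and a string watermark stand in for the real source system and the persisted load state:

```python
import sqlite3

# Illustrative incremental load: read only rows newer than the last
# watermark (table and column names are invented).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2024-01-01"), (2, "2024-02-01"), (3, "2024-03-01")])

last_loaded = "2024-01-15"   # watermark persisted by the previous run
rows = src.execute(
    "SELECT id, updated_at FROM orders WHERE updated_at > ?",
    (last_loaded,)).fetchall()
print(rows)   # only rows changed since the watermark qualify
```

A full load, by contrast, would simply omit the `WHERE` clause and truncate-and-reload the target.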
Catalyst IQ

Posted by Sidharth Maholia
Mumbai, Bengaluru (Bangalore)
1 - 5 yrs
₹15L - ₹25L / yr
Tableau
SQL
MS-Excel
Python
Data Analytics
+2 more
Responsibilities:
● Ability to do exploratory analysis: Fetch data from systems and analyze trends.
● Developing customer segmentation models to improve the efficiency of marketing and product
campaigns.
● Establishing mechanisms for cross functional teams to consume customer insights to improve
engagement along the customer life cycle.
● Gather requirements for dashboards from business, marketing and operations stakeholders.
● Preparing internal reports for executive leadership and supporting their decision making.
● Analyse data, derive insights and embed them into business actions.
● Work with cross functional teams.
Skills Required
• Data Analytics Visionary.
• Strong in SQL & Excel and good to have experience in Tableau.
• Experience in the field of Data Analysis, Data Visualization.
• Strong in analysing the Data and creating dashboards.
• Strong in communication, presentation and business intelligence.
• Multi-Dimensional, "Growth Hacker" Skill Set with strong sense of ownership for work.
• Aggressive “Take no prisoners” approach.
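The customer-segmentation responsibility above can be sketched with pandas; the metric, thresholds and segment labels below are invented for illustration:

```python
import pandas as pd

# Toy behavioural segmentation on a single invented metric.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "orders_90d": [12, 1, 5, 0],
})
# Bucket customers by 90-day order count into hypothetical segments.
customers["segment"] = pd.cut(
    customers["orders_90d"], bins=[-1, 0, 4, 100],
    labels=["dormant", "casual", "loyal"])
print(customers)
```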
Numantra Technologies

Posted by nisha mattas
Remote, Mumbai, Powai
2 - 12 yrs
₹8L - ₹18L / yr
ADF
PySpark
Jupyter Notebook
Big Data
Windows Azure
+3 more
      • Data pre-processing, data transformation, data analysis, and feature engineering
      • Performance optimization of scripts (code) and productionizing of code (SQL, Pandas, Python or PySpark, etc.)

    Required skills:
      • Bachelor's in Computer Science, Data Science, Computer Engineering, IT or equivalent
      • Fluency in Python (Pandas), PySpark, SQL, or similar
      • Azure Data Factory experience (min 12 months)
      • Able to write efficient code using traditional and OO concepts and modular programming, following the SDLC process
      • Experience in production optimization and end-to-end performance tracing (technical root cause analysis)
      • Ability to work independently, with demonstrated experience in project or program management
      • Azure experience; ability to translate data scientist code in Python and make it efficient (production-ready) for cloud deployment
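A minimal sketch of the pre-processing and feature-engineering step listed above, in Pandas; the columns and cleaning rules are invented for illustration:

```python
import pandas as pd

# Invented raw extract with missing values of both kinds.
raw = pd.DataFrame({
    "ts": ["2024-01-01", "2024-01-02", None],
    "amount": [100.0, None, 250.0],
})

# Drop rows with no timestamp, impute missing amounts with the median,
# and derive a simple calendar feature.
clean = raw.dropna(subset=["ts"]).copy()
clean["ts"] = pd.to_datetime(clean["ts"])
clean["amount"] = clean["amount"].fillna(clean["amount"].median())
clean["dow"] = clean["ts"].dt.day_name()
print(clean)
```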
 
Episource

Posted by Manas Ranjan Kar
Mumbai
4 - 8 yrs
₹12L - ₹20L / yr
Python
Machine Learning (ML)
Data Science
Amazon Web Services (AWS)
Apache Spark
+1 more

We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition and information extraction from clinical notes.
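As a toy illustration of the information-extraction problem described above: the dictionary lookup below is a stand-in for what would in practice be an ML model, and the condition-to-code pairs are examples, not Episource's actual logic.

```python
import re

note = "Patient presents with type 2 diabetes and essential hypertension."

# Hypothetical condition-to-code lookup; a production system would use
# trained NER and coding models rather than a static table.
lookup = {"type 2 diabetes": "E11.9", "hypertension": "I10"}

found = {cond: code for cond, code in lookup.items()
         if re.search(re.escape(cond), note, re.IGNORECASE)}
print(found)
```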


This is a role for highly technical machine learning & data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a large range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.


You will be responsible for setting an agenda to develop and ship machine learning models that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company, and help build a foundation of tools and practices used by quantitative staff across the company.



What you will achieve:

  • Define the research vision for data science, and oversee planning, staffing, and prioritization to make sure the team is advancing that roadmap

  • Invest in your team’s skills, tools, and processes to improve their velocity, including working with engineering counterparts to shape the roadmap for machine learning needs

  • Hire, retain, and develop talented and diverse staff through ownership of our data science hiring processes, brand, and functional leadership of data scientists

  • Evangelise machine learning and AI internally and externally, including attending conferences and being a thought leader in the space

  • Partner with the executive team and other business leaders to deliver cross-functional research work and models






Required Skills:


  • Strong background in classical machine learning and machine learning deployments is a must and preferably with 4-8 years of experience

  • Knowledge of deep learning & NLP

  • Hands-on experience in TensorFlow/PyTorch, Scikit-Learn, Python, Apache Spark & Big Data platforms to manipulate large-scale structured and unstructured datasets.

  • Experience with GPU computing is a plus.

  • Professional experience as a data science leader, setting the vision for how to most effectively use data in your organization. This could be through technical leadership with ownership over a research agenda, or developing a team as a personnel manager in a new area at a larger company.

  • Expert-level experience with a wide range of quantitative methods that can be applied to business problems.

  • Evidence you’ve successfully been able to scope, deliver and sell your own research in a way that shifts the agenda of a large organization.

  • Excellent written and verbal communication skills on quantitative topics for a variety of audiences: product managers, designers, engineers, and business leaders.

  • Fluent in data fundamentals: SQL, data manipulation using a procedural language, statistics, experimentation, and modeling


Qualifications

  • Professional experience as a data science leader, setting the vision for how to most effectively use data in your organization

  • Expert-level experience with machine learning that can be applied to business problems

  • Evidence you’ve successfully been able to scope, deliver and sell your own work in a way that shifts the agenda of a large organization

  • Fluent in data fundamentals: SQL, data manipulation using a procedural language, statistics, experimentation, and modeling

  • Degree in a field that has very applicable use of data science / statistics techniques (e.g. statistics, applied math, computer science, OR a science field with direct statistics application)

  • 5+ years of industry experience in data science and machine learning, preferably at a software product company

  • 3+ years of experience managing data science teams, incl. managing/grooming managers beneath you

  • 3+ years of experience partnering with executive staff on data topics
