
11+ ER/Studio Jobs in India

Apply to 11+ ER/Studio Jobs on CutShort.io. Find your next job, effortlessly. Browse ER/Studio Jobs and apply today!

A global provider of Business Process Management services

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
4 - 10 yrs
₹15L - ₹22L / yr
SQL Azure
ADF
Business process management
Windows Azure
SQL
+12 more

Desired Competencies:

• Expertise in Azure Data Factory V2
• Expertise in other Azure components such as Data Lake Store, SQL Database, and Databricks
• Working knowledge of Spark programming (required)
• Good exposure to data projects involving data design and source-to-target documentation, including defining transformation rules
• Strong knowledge of the CI/CD process
• Experience building Power BI reports
• Understanding of components such as pipelines, activities, datasets, and linked services
• Exposure to dynamic configuration of pipelines using datasets and linked services
• Experience designing, developing, and deploying pipelines to higher environments
• Good knowledge of file formats for flexible usage and of file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.)
• Strong knowledge of SQL queries
• Must have worked on full life-cycle development, from functional design to deployment
• Working knowledge of Git and SVN
• Good experience establishing connections with heterogeneous sources such as Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, and various databases
• Working knowledge of Azure resources such as Storage Accounts, Synapse, Azure SQL Server, Azure Databricks, and Azure Purview
• Experience with metadata management, data modelling, and related tools (Erwin, ER/Studio, or others) is preferred
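The dynamic-configuration point above means a pipeline resolves dataset paths and connection details from parameters at run time instead of hard-coding them per environment. A minimal, language-neutral sketch of that idea in Python — the template syntax and all names here are hypothetical illustrations, not the ADF API:

```python
# Sketch: resolving a parameterised dataset path at run time,
# analogous to ADF dataset / linked-service parameters.
# All names are invented for illustration.

def resolve_dataset_path(template: str, params: dict) -> str:
    """Substitute pipeline parameters into a dataset path template."""
    path = template
    for key, value in params.items():
        path = path.replace("@{" + key + "}", str(value))
    return path

template = "adls://@{container}/raw/@{source_system}/@{run_date}/data.parquet"
params = {"container": "landing", "source_system": "sap", "run_date": "2023-01-15"}

print(resolve_dataset_path(template, params))
# adls://landing/raw/sap/2023-01-15/data.parquet
```

The same parameterised pipeline can then serve every source system and environment by swapping only the parameter values.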

 

Preferred Qualifications:

• Bachelor's degree in Computer Science or Technology
• Proven success contributing to a team-oriented environment
• Proven ability to work creatively and analytically in a problem-solving environment
• Excellent communication (written and oral) and interpersonal skills

Qualifications

BE/BTECH

KEY RESPONSIBILITIES:

You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts, and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI. The role plays a vital part in delivering a single version of the truth on the client's data and in providing the MI and BI that enable both operational and strategic decision-making.

You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision. This would include technical analysis, design, development, and test support as well as managing the delivery to production.

The initial project being resourced is around the development and implementation of a Data Warehouse and associated MI/BI functions.
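The relational-versus-dimensional distinction above can be made concrete with a star schema: a fact table of measures joined to descriptive dimension tables. A hedged sketch using an in-memory SQLite database (table and column names invented for illustration):

```python
import sqlite3

# A minimal star schema: one fact table joined to a date dimension.
# Schema and data are invented, standing in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
INSERT INTO dim_date VALUES (20230101, 2023, 1), (20230201, 2023, 2);
INSERT INTO fact_sales VALUES (20230101, 100.0), (20230101, 50.0), (20230201, 75.0);
""")

# A typical BI query: monthly totals obtained via the dimension.
rows = conn.execute("""
    SELECT d.year, d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""").fetchall()
print(rows)  # [(2023, 1, 150.0), (2023, 2, 75.0)]
```

Reports and data marts built on such a schema aggregate the fact table along whichever dimensions the business question requires.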

 

Principal Activities:

1. Interpret written business requirements documents.
2. Specify (high-level design and tech spec), code, and write automated unit tests for new aspects of the MI/BI service.
3. Write clear and concise supporting documentation for deliverable items.
4. Be a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third-line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
13. Comply with any reasonable instructions or regulations issued by the Company from time to time, including those set out in the terms of the dealing and other manuals, staff handbooks, and all other group policies.


Location – Bangalore

 

Carsome

Posted by Piyush Palkar
Kuala Lumpur
3 - 5 yrs
₹20L - ₹25L / yr
SQL
Python
Business Analysis
Statistical Modeling
MS-Office
+6 more

Your Day-to-Day

  1. Derive insights and drive major strategic projects to improve business metrics, taking responsibility for cost efficiency and revenue management across the country
  2. Perform market research and post-mortem analyses of competitor expansion and market-penetration patterns
  3. Provide in-depth business analysis and data insights for internal stakeholders to help improve the business; derive and launch projects to reduce the gaps between targeted and projected business metrics
  4. Optimize Carsome's C2B and B2C customer acquisition and dealer retention funnels; work closely with the Marketing and Tech teams to create, produce, and implement creative digital marketing campaigns and drive CRM initiatives and strategies
  5. Analyse revenue flows and process large datasets to gather process insights and propose process-improvement ideas for Carsome across SE Asia
  6. Lead commercial projects and process mapping, from conceptualization to completion, to build or re-engineer business models, tools, and processes
  7. Experience with analyses and insights on unit economics, COGS, and P&L is preferred, but not mandatory
  8. Use business intelligence and data science tools to answer the appropriate business problems using SQL, Tableau, or Python
  9. Coordinate with the HQ Data Insights team and manage internal stakeholders across departments to ensure the smooth delivery of strategic projects
  10. Work across departments/functions (BI, DE, tech, pricing, finance, operations, marketing, CS, CX), as well as on high-impact projects, and support business-expansion initiatives
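The acquisition-funnel analysis in point 4 is the kind of question SQL answers directly: count distinct users reaching each stage and read off the drop-off. A hedged sketch with an in-memory SQLite table — the schema and numbers are invented for illustration, not Carsome's data:

```python
import sqlite3

# Illustrative only: a made-up funnel-events table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE funnel (user_id INTEGER, stage TEXT)")
conn.executemany(
    "INSERT INTO funnel VALUES (?, ?)",
    [(1, "visit"), (2, "visit"), (3, "visit"), (4, "visit"),
     (1, "inspection"), (2, "inspection"), (1, "sale")],
)

# Distinct users reaching each stage: conversion at a glance.
rows = conn.execute(
    """SELECT stage, COUNT(DISTINCT user_id) AS users
       FROM funnel
       GROUP BY stage
       ORDER BY users DESC"""
).fetchall()
for stage, users in rows:
    print(stage, users)
# visit 4
# inspection 2
# sale 1
```

Dividing adjacent stage counts gives the stage-to-stage conversion rates that funnel-optimization work targets.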





Your Know-Know


  • At least a Bachelor's degree in Accounting, Finance, Business, or the equivalent
  • 3-5 years of experience in strategy / consulting / analytical / project management roles; experience in e-commerce, start-ups, or unicorns (CARS24, OLA, SWIGGY, FLIPKART, OYO) or entrepreneurial experience preferred, plus at least 2 years of experience leading a team
  • Top-notch academics from a Tier 1 college (IIM / IIT / NIT)
  • Must have SQL/PostgreSQL/Tableau experience
  • Excellent market research, reporting, and analytical skills, including carrying out weekly and monthly reporting
  • Experience working with a Data/Business Intelligence team
  • Analytical mindset with the ability to present data in a structured and informative way
  • Enjoys a fast-paced environment and can align business objectives with product priorities
  • Good to have: financial modelling, developing financial forecasts, and development of a financial strategic plan/framework
Top 3 Fintech Startup

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
4 - 7 yrs
₹11L - ₹17L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Python
+6 more
Responsible for leading a team of analysts to build and deploy predictive models that infuse core business functions with deep analytical insights. The Senior Data Scientist will also work closely with the Kinara management team to investigate strategically important business questions.

Lead a team through the entire analytical and machine learning model life cycle:

• Define the problem statement
• Build and clean datasets
• Perform exploratory data analysis
• Carry out feature engineering
• Apply ML algorithms and assess their performance
• Write code for deployment
• Test and troubleshoot code
• Communicate analysis to stakeholders
• Manage data analysts and data scientists
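The life-cycle steps above can be sketched end to end in miniature. A deliberately tiny, stdlib-only illustration — the data is invented and a simple threshold rule stands in for a real ML algorithm:

```python
# Minimal model life cycle: dataset -> feature -> fit -> evaluate.
# Stdlib only; a threshold rule stands in for a real ML algorithm,
# and the loan-repayment data is invented for illustration.

# 1. Build and clean a toy dataset: (monthly_income, repaid_loan)
raw = [(20_000, 1), (35_000, 1), (8_000, 0), (12_000, 0), (50_000, 1), (9_000, 0)]

# 2. Feature engineering: income rescaled to thousands
data = [(income / 1000, label) for income, label in raw]

# 3. "Fit": place the decision threshold midway between class means
mean_pos = sum(x for x, y in data if y == 1) / sum(1 for _, y in data if y == 1)
mean_neg = sum(x for x, y in data if y == 0) / sum(1 for _, y in data if y == 0)
threshold = (mean_pos + mean_neg) / 2

# 4. Assess performance (here, naively, on the training data itself)
predictions = [1 if x >= threshold else 0 for x, _ in data]
accuracy = sum(p == y for p, (_, y) in zip(predictions, data)) / len(data)
print(f"threshold={threshold:.1f}k, accuracy={accuracy:.2f}")
```

A real cycle would add a held-out test split, richer features, and a proper learning algorithm, but the shape of the loop is the same.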
Agiletech Info Solutions pvt ltd
Chennai
4 - 8 yrs
₹4L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Spark
SQL
+1 more
We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Responsibilities for Data Engineer
• Create and maintain optimal data pipeline architecture.
• Assemble large, complex data sets that meet functional and non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
• Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
• Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
• Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
• Work with data and analytics experts to strive for greater functionality in our data systems.
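The extraction-transformation-loading responsibility above can be sketched in miniature. A hedged illustration with SQLite standing in for both the source system and the AWS warehouse target — table and column names are invented:

```python
import sqlite3

# Tiny ETL sketch: extract from a "source", transform, load to a "warehouse".
# SQLite stands in for real source systems and big-data targets.

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 1250), (2, 899), (3, 15000)])

# Extract
rows = source.execute("SELECT id, amount_cents FROM orders").fetchall()

# Transform: cents -> currency units, drop orders under 10.00
transformed = [(oid, cents / 100) for oid, cents in rows if cents >= 1000]

# Load
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")
warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?)", transformed)

total = warehouse.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # 162.5
```

A production pipeline adds incremental extraction, schema management, and orchestration, but each run reduces to these three phases.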
Qualifications for Data Engineer
• Experience building and optimizing big data ETL pipelines, architectures, and data sets.
• Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Strong analytic skills related to working with unstructured datasets.
• Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
Agency job
via Amazon by Debiprabha Sarkar
Bengaluru (Bangalore)
0.6 - 1 yrs
₹4L - ₹8L / yr
Data Analytics
SQL
Google Analytics
Mathematics
MS-Excel
+2 more
Data Analyst Job Description:
About GlowRoad:
GlowRoad is building India's most profitable social e-commerce platform, where resellers share the catalog of products through their network on Facebook, WhatsApp, Instagram, etc., and convert them to sales. GlowRoad is on a mission to create micro-entrepreneurs (resellers) who can set up their web store, market their products, and track all transactions through its platform.
The GlowRoad app has ~15M downloads and 1 million+ MAUs. GlowRoad has been funded by global VCs like Accel Partners, CDH, KIP, and Vertex Ventures, and recently raised Series C funding. We are scaling our operations across India.
GlowRoad is looking for team members passionate about building platforms for the next billion users and reimagining e-commerce for mobile-first users. It is a fun, open, energetic, and creative environment with approachable leadership, passionate people, open communication, and high growth for employees.
Role:
● Gather, process/analyze, and report business data across departments
● Report key business data/metrics on a regular basis (daily, weekly, and monthly as relevant)
● Structure concise reports to share with management
● Work closely with Senior Analysts to create data pipelines for analytical databases for the Category, Operations, Marketing, and Support teams
● Assist Senior Analysts in projects by learning new reporting tools like Power BI and advanced analytics with R
Basic Qualifications
● Engineering graduate
● 6–24 months of hands-on experience with SQL, Excel, and Google Spreadsheets
● Experience creating MIS/dashboards in Excel/Google Spreadsheets
● Strong in mathematics
● Ability to take full ownership of timelines and data sanity with respect to reports
● Basic verbal and written English communication
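The MIS/dashboard work described above largely amounts to aggregating raw records into periodic metrics. A small sketch of that daily roll-up with invented sample data (a spreadsheet or SQL would do the same job):

```python
import csv
import io
from collections import defaultdict

# Sketch of a daily MIS report: aggregate raw order rows by day.
# The CSV content is invented sample data, not real metrics.
raw_csv = """date,orders,gmv
2023-01-01,10,5000
2023-01-01,5,2500
2023-01-02,8,4000
"""

daily = defaultdict(lambda: {"orders": 0, "gmv": 0})
for row in csv.DictReader(io.StringIO(raw_csv)):
    daily[row["date"]]["orders"] += int(row["orders"])
    daily[row["date"]]["gmv"] += int(row["gmv"])

for date in sorted(daily):
    print(date, daily[date])
# 2023-01-01 {'orders': 15, 'gmv': 7500}
# 2023-01-02 {'orders': 8, 'gmv': 4000}
```

The same grouping, done in Excel as a pivot table or in SQL as a GROUP BY, is the core of a recurring daily/weekly/monthly report.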
DataMetica

Posted by Nikita Aher
Pune
2 - 5 yrs
₹1L - ₹8L / yr
Google Cloud Platform (GCP)
Big Query
Workflow
Integration
SQL
Job Title/Designation: GCP Engineer - Big Query, Dataflow
Employment Type: Full Time, Permanent

Job Description:

Experience - 2 to 5 Years
Work Location - Pune

Mandatory Skills:

  • Sound understanding of Google Cloud Platform
  • Should have worked on BigQuery and Workflow or Composer
  • Experience with migrations to GCP and integration projects in large-scale environments
  • ETL technical design, development, and support
  • Good SQL skills and Unix scripting
  • Programming experience with Python, Java, or Spark would be desirable, but not essential
  • Good communication skills
  • Experience with SOA and services-based data solutions would be advantageous
MNC

Agency job
via Fragma Data Systems by Priyanka U
Remote, Bengaluru (Bangalore)
5 - 8 yrs
₹12L - ₹20L / yr
PySpark
SQL
Data Warehouse (DWH)
ETL
SQL Developer with 7 years of relevant experience and strong communication skills.

Key responsibilities:
  • Creating, designing, and developing data models
  • Preparing plans for all ETL (Extract/Transform/Load) procedures and architectures
  • Validating results and creating business reports
  • Monitoring and tuning data loads and queries
  • Developing and preparing a schedule for a new data warehouse
  • Analyzing large databases and recommending appropriate optimizations
  • Administering all requirements and designing various functional specifications for data
  • Providing support across the software development life cycle
  • Preparing various code designs and ensuring their efficient implementation
  • Evaluating all code and ensuring the quality of all project deliverables
  • Monitoring data warehouse work and providing subject matter expertise
  • Hands-on BI practices, data structures, data modeling, and SQL skills
  • Minimum 1 year of experience in PySpark
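The validation and load-monitoring duties above often reduce to reconciliation queries: compare what the source holds against what landed in the warehouse. A hedged sketch with SQLite — all table names and counts are invented:

```python
import sqlite3

# Reconciliation check after a data load: source vs. target row counts.
# SQLite and the invented tables stand in for real source/DWH systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_customers (id INTEGER)")
conn.execute("CREATE TABLE dwh_customers (id INTEGER)")
conn.executemany("INSERT INTO src_customers VALUES (?)", [(i,) for i in range(100)])
conn.executemany("INSERT INTO dwh_customers VALUES (?)", [(i,) for i in range(98)])

src = conn.execute("SELECT COUNT(*) FROM src_customers").fetchone()[0]
tgt = conn.execute("SELECT COUNT(*) FROM dwh_customers").fetchone()[0]
status = "OK" if src == tgt else f"MISMATCH: {src - tgt} rows missing"
print(status)  # MISMATCH: 2 rows missing
```

Scheduled after each load, checks like this surface silent data loss before business reports do.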
Prescience Decision Solutions

Posted by Shivakumar K
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹20L / yr
Big Data
ETL
Spark
Apache Kafka
Apache Spark
+4 more

The Data Engineer will be responsible for selecting and integrating the required Big Data tools and frameworks, and will implement data ingestion and ETL/ELT processes.

Required Experience, Skills and Qualifications:

  • Hands-on experience with Big Data tools/technologies such as Spark, Databricks, MapReduce, Hive, and HDFS
  • Expertise and excellent understanding of big data toolsets such as Sqoop, Spark Streaming, Kafka, and NiFi
  • Proficiency in any of the programming languages Python, Scala, or Java, with 4+ years' experience
  • Experience with cloud infrastructure such as MS Azure, Data Lake, etc.
  • Good working knowledge of NoSQL DBs (MongoDB, HBase, Cassandra)
IT solutions specialized in Apps Lifecycle management (MG1)

Agency job
via Multi Recruit by Ayub Pasha
Bengaluru (Bangalore)
5 - 6 yrs
₹8L - ₹10L / yr
Data migration
Data Warehouse (DWH)
ETL
SQL
PostgreSQL
+4 more
  • Excellent working knowledge of data warehousing / data migration activity using an ETL tool.
  • Strong data integration, PostgreSQL/Oracle database, shell scripting, and Python programming and development know-how.
  • Hands-on experience working with and generating XML documents.
  • Good analytical and business-process understanding.
  • Familiarity with data models, source-to-target data mapping, and transactional and master data concepts.
  • Well experienced in high-level/detailed design and performance tuning of ETL jobs.
  • Very good communication, interpersonal, and stakeholder management skills; self-motivated, a quick learner, and a team player.
  • Exposure to the After Sales business domain is highly preferred.
  • Experience using HP ALM and Jira for ticketing.
  • Experience in release management.

 

Techknomatic Services Pvt. Ltd.

Posted by Techknomatic Services
Pune, Mumbai
2 - 6 yrs
₹4L - ₹9L / yr
Tableau
SQL
Business Intelligence (BI)
Role Summary:
Lead and drive development in the BI domain using the Tableau ecosystem, with deep technical and BI-ecosystem knowledge. The resource will be responsible for dashboard design, development, and delivery of BI services using the Tableau ecosystem.

Key functions & responsibilities:
• Communicate and interact with the Project Manager to understand requirements
• Design, develop, and deploy dashboards using the Tableau ecosystem
• Ensure delivery within the given time frame while maintaining quality
• Stay up to date with current tech and bring relevant ideas to the table
• Proactively work with the Management team to identify and resolve issues
• Perform other related duties as assigned or advised
• Be a leader who sets the standard and expectations through example in conduct, work ethic, integrity, and character
• Contribute to dashboard design, R&D, and project delivery using Tableau

Candidate's Profile

Academics:
• Bachelor's degree, preferably in Computer Science.
• A Master's degree would be an added advantage.

Experience:
• Overall 2–5 years of experience in DWBI development projects, having worked on BI and visualization technologies (Tableau, QlikView) for at least 2 years.
• At least 2 years of experience covering the Tableau implementation life cycle, including hands-on development/programming, managing security, data modelling, data blending, etc.

Technology & Skills:
• Hands-on expertise in Tableau administration and maintenance
• Strong working knowledge of, and development experience with, Tableau Server and Desktop
• Strong knowledge of SQL, PL/SQL, and data modelling
• Knowledge of databases like Microsoft SQL Server, Oracle, etc.
• Exposure to alternative visualization technologies like QlikView, Spotfire, Pentaho, etc.
• Good communication and analytical skills, with excellent creative and conceptual thinking abilities
• Superior organizational skills, attention to detail and quality, and strong communication skills, both verbal and written
Cemtics

Posted by Tapan Sahani
Remote, NCR (Delhi | Gurgaon | Noida)
4 - 6 yrs
₹5L - ₹12L / yr
Big Data
Spark
Hadoop
SQL
Python
+1 more

JD:

Required Skills:

  • Intermediate- to expert-level hands-on programming in one of the following languages: Java, Python, PySpark, or Scala
  • Strong practical knowledge of SQL
  • Hands-on experience with Spark/Spark SQL
  • Data structures and algorithms
  • Hands-on experience as an individual contributor in the design, development, testing, and deployment of applications based on Big Data technologies
  • Experience with Big Data application tools such as Hadoop, MapReduce, Spark, etc.
  • Experience with NoSQL databases like HBase, etc.
  • Experience with the Linux OS environment (shell scripting, AWK, SED)
  • Intermediate RDBMS skills; able to write SQL queries with complex relations on top of a big RDBMS (100+ tables)
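The "complex relations" requirement above is the many-table join case: aggregating a measure across several related tables. A small sketch with SQLite — the three-table schema is invented for illustration:

```python
import sqlite3

# A query across a three-table relation: customers -> orders -> items.
# Schema and data are invented to illustrate multi-join aggregation.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
CREATE TABLE items (order_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO orders VALUES (10, 1), (11, 1), (12, 2);
INSERT INTO items VALUES (10, 5.0), (10, 2.5), (11, 4.0), (12, 1.0);
""")

# Revenue per customer, rolled up through both joins.
rows = conn.execute("""
    SELECT c.name, SUM(i.amount) AS revenue
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    JOIN items  i ON i.order_id = o.id
    GROUP BY c.name
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('Asha', 11.5), ('Ravi', 1.0)]
```

On a 100+ table schema the same pattern scales out: each additional JOIN walks one more foreign-key relation before the final GROUP BY.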