
11+ ER/Studio Jobs in India

Apply to 11+ ER/Studio Jobs on CutShort.io. Find your next job, effortlessly. Browse ER/Studio Jobs and apply today!

A global provider of Business Process Management services

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
4 - 10 yrs
₹15L - ₹22L / yr
SQL Azure
ADF
Business process management
Windows Azure
SQL
+12 more

Desired Competencies:

 

•  Expertise in Azure Data Factory V2

•  Expertise in other Azure components like Data Lake Store, SQL Database, Databricks

•  Must have working knowledge of Spark programming

•  Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules

•  Strong knowledge of the CI/CD process

•  Experience in building Power BI reports

•  Understanding of different components like pipelines, activities, datasets & linked services

•  Exposure to dynamic configuration of pipelines using datasets and linked services

•  Experience in designing, developing and deploying pipelines to higher environments

•  Good knowledge of file formats for flexible usage, and of file location objects (SFTP, FTP, local, HDFS, ADLS, Blob, Amazon S3, etc.)

•  Strong knowledge of SQL queries

•  Must have worked in full life-cycle development from functional design to deployment

•  Should have working knowledge of Git, SVN

•  Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.

•  Should have working knowledge of different resources available in Azure like Storage Account, Synapse, Azure SQL Server, Azure Databricks, Azure Purview

•  Any experience related to metadata management, data modelling, and related tools (Erwin, ER/Studio or others) would be preferred
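
The dynamic-configuration point above can be sketched outside ADF itself: a plain-Python stand-in for how one parameterized dataset/linked service serves many concrete sources. The placeholder syntax imitates ADF's `@dataset()` expressions; all names here are illustrative assumptions, not real resources.

```python
# Hypothetical sketch of ADF-style dynamic configuration: one parameterized
# dataset template, bound to concrete values at runtime. Names are invented.

def resolve(template: str, params: dict) -> str:
    """Replace @dataset().<name> placeholders with runtime parameter values."""
    out = template
    for name, value in params.items():
        out = out.replace(f"@dataset().{name}", str(value))
    return out

# A single dataset definition covering many blob sources:
blob_path_template = (
    "https://@dataset().account.blob.core.windows.net/"
    "@dataset().container/@dataset().file"
)

# Pipeline parameters select the actual source for this run:
sales = resolve(blob_path_template,
                {"account": "mystore", "container": "sales", "file": "2024-01.csv"})
print(sales)  # https://mystore.blob.core.windows.net/sales/2024-01.csv
```

In real ADF the binding happens in the pipeline's activity settings rather than in code, but the idea is the same: one reusable dataset/linked service instead of one per source.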

 

Preferred Qualifications:

•  Bachelor's degree in Computer Science or Technology

•  Proven success in contributing to a team-oriented environment

•  Proven ability to work creatively and analytically in a problem-solving environment

•  Excellent communication (written and oral) and interpersonal skills

Qualifications

BE/B.Tech

KEY RESPONSIBILITIES :

You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts and other extracts, and delivering these via SSIS, SSRS, SSAS, and Power BI. The role is pivotal in delivering a single version of the truth on the client's data and in providing the MI & BI that enable both operational and strategic decision-making.

You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision. This would include technical analysis, design, development, and test support as well as managing the delivery to production.

The initial project being resourced is around the development and implementation of a Data Warehouse and associated MI/BI functions.

 

Principal Activities:

1. Interpret written business requirements documents.

2. Specify (High-Level Design and Tech Spec), code, and write automated unit tests for new aspects of the MI/BI service.

3. Write clear and concise supporting documentation for deliverable items.

4. Become a member of the skilled development team, willing to contribute, share experiences, and learn as appropriate.

5. Review and contribute to requirements documentation.

6. Provide third-line support for internally developed software.

7. Create and maintain continuous deployment pipelines.

8. Help maintain Development Team standards and principles.

9. Contribute and share learning and experiences with the greater Development Team.

10. Work within the company's approved processes, including design and service transition.

11. Collaborate with other teams and departments across the firm.

12. Be willing to travel to other offices when required.

13. Comply with any reasonable instructions or regulations issued by the Company from time to time, including those set out in the terms of the dealing and other manuals, staff handbooks, and all other group policies.


Location – Bangalore

 

Leading technology and digital marketing company.(IC1)

Agency job
via Multi Recruit by Chandra Kanth
Bengaluru (Bangalore)
3 - 7 yrs
₹12L - ₹14L / yr
PowerBI
Data analyst
Data Analytics
Business Intelligence (BI)
Tableau
+2 more

We are looking for a Business Intelligence (BI)/Data Analyst to create and manage Power BI and analytics solutions that turn data into knowledge. In this role, you should have a background in data and business analysis. If you are self-directed, passionate about data, and have business acumen and problem-solving aptitude, we'd like to meet you. Ultimately, you will enhance our business intelligence system to help us make better decisions.

 

Requirements and Qualifications

  • BSc/BA in Computer Science, Engineering, or a relevant field.
  • Financial experience and a marketing background are a plus.
  • Strong Power BI development skills, including migration of existing deliverables to Power BI.
  • Ability to work autonomously.
  • Data modelling, calculations, conversions, and scheduling data refreshes in Power BI.
  • Proven experience as a Power BI Developer is a must.
  • Industry experience is preferred. Familiarity with other BI tools (Tableau, QlikView).
  • Analytical mind with a problem-solving aptitude.

 

Responsibilities

  • Design, develop and maintain business intelligence solutions
  • Craft and execute queries upon request for data
  • Present information through reports and visualization based on requirements gathered from stakeholders
  • Interact with the team to gain an understanding of the business environment, technical context, and organizational strategic direction
  • Design, build and deploy new, and extend existing dashboards and reports that synthesize distributed data sources
  • Ensure data accuracy, performance, usability, and functionality requirements of BI platform
  • Manage data through MS Excel, Google Sheets, and SQL applications as required, and support other analytics platforms
  • Develop and execute database queries and conduct analyses
  • Develop and update technical documentation requirements
  • Communicate insights to both technical and non-technical audiences.
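
The "craft and execute queries upon request" loop described above can be illustrated with a minimal, self-contained sketch using Python's built-in sqlite3; the table and column names are invented for the example.

```python
# Illustrative only: answer a stakeholder's ad hoc request ("total sales by
# region") with a query whose result is ready to feed a report or visual.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 120.0), ("South", 80.0), ("North", 50.0)])

rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 170.0), ('South', 80.0)]
```

In practice the same query would run against the production database and the rows would be handed to a Power BI or Tableau visual rather than printed.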
Miratech Group
Posted by Elena Kurtieieva
Remote only
2 - 3 yrs
₹10L - ₹15L / yr
Windows Azure
LUIS
node.js
Language understanding
JSON
+1 more

Company Description

Miratech is an IT services and outsourcing company that provides services to multinational organizations all over the world. Our highly professional team achieves success with 99% of IT projects in the financial, telecommunications, and technology domains. Founded in 1989, Miratech is headquartered in New York, USA, with R&D centers in Poland, the Philippines, Slovakia, Spain, and Ukraine. Technical complexity is our passion, stability is our standard, and a friendly work environment is our style. We empower our employees to grow together with the company, to achieve ambitious goals, and to be part of a relentless international team that helps visionaries change the world.

Job Description

We are looking for a Bot Developer to join our team, who will help us build solutions and implement technologies.

The ideal candidate will have strong knowledge of the technologies and programming languages used to develop conversational chatbots. A good understanding of dialog systems, and experience developing and programming conversational chatbots with the Microsoft Bot Framework, is required.

Responsibilities:

  • Design and implement voice and chat bots.
  • Troubleshoot and resolve issues related to voice/chat bots.
  • Assist in planning and estimating development projects/sprints.
  • Take part in code reviews and contribute to team knowledge sharing.
  • Provide technical guidance and support to other team members.
  • Work in an agile environment, using methodologies like Scrum or Kanban.

Qualifications

  • 2-3 years of experience in bot development using Node.js
  • Strong experience in developing bots using the Azure Bot Framework.
  • Experience building ML-based conversational AI using Azure Cognitive Services.
  • Experience building conversational bots with ML-based services such as LUIS.
  • Experience working with REST API calls, JSON, and systems integration.

Secondary Skills

  • Ability to work with business and technology teams to build and deploy an analytical solution as per client needs.
  • Ability to multi-task, solve problems and think strategically.
  • Strong communication and collaboration skills


Stanza Living
Posted by Parul Pal
Gurugram
1 - 3 yrs
₹6L - ₹9L / yr
SQL
Microsoft Excel
Dashboard

Who we are:


Stanza Living is India's largest and fastest-growing tech-enabled managed accommodation company, delivering a hospitality-led living experience to migrant students and young working professionals across India.


We have a full-stack business model that focuses on the design, development and delivery of daily living solutions tailored to young consumers' lifestyles. From smartly planned residences and a host of amenities and services for hassle-free living to exclusive community engagement programmes – everything is seamlessly integrated through technology to ensure the highest consumer delight.


Today, we are:


• India’s largest managed accommodation company with over 50,000 beds under management across 24+ cities

• Most capitalized player in the managed accommodation space, backed by global marquee investors – Falcon Edge, Equity International, Sequoia Capital, Matrix Partners, Accel Partners

• Recognized as the Best Real Estate Tech company globally in 2020 by the leading analysis firm Tracxn

• LinkedIn Top Startup to Work for - 2022


The opportunity: Job Responsibilities:


• Perform data analysis on large volumes of data to identify trends and/or data processing rules

• Be a core member of the analytics team.

• Responsible for weekly and monthly sales/marketing reports on a gross and net basis, plus other ad hoc reports.

• Generate reports on a daily basis at all stages.

• Analyze the data to produce insights on what leads to better conversions, student preferences, the role of various investments and channels, optimizing spend, etc.

• Prepare reports and dashboards for various business functions to keep track of important business metrics.

• Elicit and document requirements at various levels, including business, logical, and physical/technical.

Skill Sets


• Good hands-on knowledge of advanced Excel & SQL.

• Has extensively worked on live dashboards, reporting, data manipulation, and building flat tables in SQL.

• Knowledge of Python/R.

• Strong analytical skills and the ability to interpret data.

• Natural curiosity and self-drive to understand the broader business in order to provide the appropriate reporting support.

• Extremely high ownership; a self-starter who can work in a constantly changing, fast-growing environment.

• Establish collaborative, trusting relationships with the business's key internal leaders and stakeholders to ensure a free flow of ideas and information across the business.

• First-principles thinking and strong problem-solving.
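
The weekly gross-vs-net reporting described in the responsibilities above might look like this in plain Python. This is a sketch only; the record fields are invented for illustration.

```python
# Illustrative rollup: aggregate bookings into a weekly gross/net report,
# the kind of flat summary that would feed an Excel sheet or SQL dashboard.
from collections import defaultdict

bookings = [
    {"week": "2024-W01", "amount": 9000, "refund": 0},
    {"week": "2024-W01", "amount": 6000, "refund": 1500},
    {"week": "2024-W02", "amount": 7000, "refund": 500},
]

report = defaultdict(lambda: {"gross": 0, "net": 0})
for b in bookings:
    report[b["week"]]["gross"] += b["amount"]
    report[b["week"]]["net"] += b["amount"] - b["refund"]

for week in sorted(report):
    print(week, report[week])
# 2024-W01 {'gross': 15000, 'net': 13500}
# 2024-W02 {'gross': 7000, 'net': 6500}
```

In practice the same aggregation would be a `GROUP BY` in SQL; the point is the gross/net split computed per period.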


What Can You Expect:


• A phenomenal work environment, with extremely high ownership and growth opportunities

• Opportunity to shape a potential unicorn

• Quick iterations and deployments - fail-fast attitude

• Opportunity to work on cutting-edge technologies

• Access to a world-class mentorship network

AppsTek Corp


Agency job
via Venaatics Consulting by Mastanvali Shaik
Gurugram, Chennai
6 - 10 yrs
Best in industry
Data management
Data modeling
PostgreSQL
SQL
MySQL
+3 more

Function : Sr. DB Developer

Location : India / Gurgaon / Tamil Nadu

 

>> THE INDIVIDUAL

  • Strong background in data platform creation and management.
  • In-depth knowledge of data management, data modelling, and ingestion; able to develop data models and ingestion frameworks based on client requirements and advise on system optimization.
  • Hands-on experience with a SQL database (PostgreSQL) and a NoSQL database (MongoDB).
  • Hands-on experience in DB performance tuning.
  • Good to have: knowledge of database setup in a cluster node.
  • Should be well versed in data security aspects and data governance frameworks.
  • Hands-on experience in Spark, Airflow, ELK.
  • Good to have: knowledge of a data cleansing tool such as Apache Griffin.
  • Preferably involved during project implementation, bringing a background in business knowledge as well as technical requirements.
  • Strong analytical and problem-solving skills. Exposure to data analytics and knowledge of advanced data analytical tools will be an advantage.
  • Strong written and verbal communication skills (presentation skills).
  • Certifications in the above technologies are preferred.

 

>> Qualification

 

  1. B.Tech / B.E. / MCA / M.Tech from a reputed institute.

More than 4 years of experience in data management, data modelling, and ingestion; 8-10 years of total experience.

Hyderabad
3 - 7 yrs
₹1L - ₹15L / yr
Big Data
Spark
Hadoop
PySpark
Amazon Web Services (AWS)
+3 more

Big data Developer

Exp: 3 to 7 yrs.
Job Location: Hyderabad
Notice: Immediate / within 30 days

1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience in developing Lambda functions with AWS Lambda
3. Expertise with Spark/PySpark; the candidate should be hands-on with PySpark code and able to do transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus

Hadoop and Hive can be treated as good-to-have requirements; a working understanding is enough rather than a hard requirement.
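
A minimal sketch of the Lambda-function shape referred to in point 2 above. The event format is an illustrative assumption, and no AWS SDK calls are made, so the sketch runs anywhere; in AWS this function would be wired to a trigger such as S3 or API Gateway.

```python
# Sketch of an AWS Lambda handler: the small glue step that often sits
# between Glue/Athena stages in a pipeline. Event fields are invented.
import json

def handler(event, context=None):
    """Filter incoming records and return a summary response."""
    records = event.get("records", [])
    valid = [r for r in records if r.get("amount", 0) > 0]
    return {
        "statusCode": 200,
        "body": json.dumps({"received": len(records), "valid": len(valid)}),
    }

# Invoking locally with a sample event:
print(handler({"records": [{"amount": 10}, {"amount": 0}, {"amount": 5}]}))
```

The `handler(event, context)` signature is the standard Lambda entry point; everything else here stands in for real pipeline logic.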

PayU

at PayU

Posted by Vishakha Sonde
Remote, Bengaluru (Bangalore)
2 - 5 yrs
₹5L - ₹20L / yr
Python
ETL
Data engineering
Informatica
SQL
+2 more

Role: Data Engineer  
Company: PayU

Location: Bangalore/ Mumbai

Experience : 2-5 yrs


About Company:

PayU is the payments and fintech business of Prosus, a global consumer internet group and one of the largest technology investors in the world. Operating and investing globally in markets with long-term growth potential, Prosus builds leading consumer internet companies that empower people and enrich communities.

The leading online payment service provider in 36 countries, PayU is dedicated to creating a fast, simple, and efficient payment process for merchants and buyers. Focused on empowering people through financial services and creating a world without financial borders where everyone can prosper, PayU is one of the biggest investors in the fintech space globally, with investments totalling $700 million to date. PayU also specializes in credit products and services for emerging markets across the globe. We are dedicated to removing risks to merchants, allowing consumers to use credit in ways that suit them and enabling a greater number of global citizens to access credit services.

Our local operations in Asia, Central and Eastern Europe, Latin America, the Middle East, Africa and South East Asia enable us to combine the expertise of high growth companies with our own unique local knowledge and technology to ensure that our customers have access to the best financial services.

India is the biggest market for PayU globally, and the company has already invested $400 million in this region in the last 4 years. PayU, in its next phase of growth, is developing a full regional fintech ecosystem providing multiple digital financial services in one integrated experience. We are going to do this through three mechanisms: build; co-build/partner; and select strategic investments.

PayU supports over 350,000+ merchants and millions of consumers making payments online with over 250 payment methods and 1,800+ payment specialists. The markets in which PayU operates represent a potential consumer base of nearly 2.3 billion people and a huge growth potential for merchants. 

Job responsibilities:

  • Design infrastructure for data, especially for but not limited to consumption in machine learning applications 
  • Define database architecture needed to combine and link data, and ensure integrity across different sources 
  • Ensure performance of data systems for machine learning, from customer-facing web and mobile applications using cutting-edge open-source frameworks, to highly available RESTful services, to back-end Java-based systems
  • Work with large, fast, complex data sets to solve difficult, non-routine analysis problems, applying advanced data handling techniques if needed 
  • Build data pipelines, including implementing, testing, and maintaining infrastructural components related to the data engineering stack.
  • Work closely with Data Engineers, ML Engineers and SREs to gather data engineering requirements to prototype, develop, validate and deploy data science and machine learning solutions

Requirements to be successful in this role: 

  • Strong knowledge of and experience in Python, Pandas, data wrangling, ETL processes, statistics, data visualisation, data modelling, and Informatica.
  • Strong experience with scalable compute solutions such as Kafka and Snowflake
  • Strong experience with workflow management libraries and tools such as Airflow, AWS Step Functions etc. 
  • Strong experience with data engineering practices (i.e. data ingestion pipelines and ETL) 
  • A good understanding of machine learning methods, algorithms, pipelines, testing practices and frameworks 
  • (Preferred) MEng/MSc/PhD degree in computer science, engineering, mathematics, physics, or equivalent (preference: DS/AI) 
  • Experience with designing and implementing tools that support sharing of data, code, practices across organizations at scale 
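
The workflow-management idea behind tools like Airflow, mentioned above, reduces to running tasks in dependency order. Here is a minimal stand-in using Python's standard-library `graphlib`; the task names are invented for illustration.

```python
# Hedged sketch: an Airflow-style DAG reduced to a topological run order.
# Each key depends on the tasks in its set; graphlib computes a valid order.
from graphlib import TopologicalSorter

dag = {
    "load": {"extract_payments", "extract_merchants"},  # load needs both extracts
    "validate": {"load"},
    "publish": {"validate"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)
```

A real Airflow DAG adds scheduling, retries, and operators per task, but the dependency graph and its topological execution are the core of the model.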
Prescience Decision Solutions
Posted by Shivakumar K
Bengaluru (Bangalore)
3 - 7 yrs
₹10L - ₹20L / yr
Big Data
ETL
Spark
Apache Kafka
Apache Spark
+4 more

The Data Engineer will be responsible for selecting and integrating the required Big Data tools and frameworks, and will implement data ingestion and ETL/ELT processes.

Required Experience, Skills and Qualifications:

  • Hands-on experience with Big Data tools/technologies like Spark, Databricks, MapReduce, Hive, HDFS.
  • Expertise in and excellent understanding of the big data toolset, such as Sqoop, Spark Streaming, Kafka, NiFi.
  • Proficiency in any of the programming languages Python / Scala / Java, with 4+ years' experience.
  • Experience with cloud infrastructure such as MS Azure, Data Lake, etc.
  • Good working knowledge of NoSQL DBs (MongoDB, HBase, Cassandra).
A Chemical & Purifier Company headquartered in the US.

Agency job
via Multi Recruit by Fiona RKS
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹18L / yr
Azure Data Factory
Azure Data Engineer
SQL
SQL Azure
+2 more
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.


Basic Qualifications
  • 2+ years of experience in a Data Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with data pipeline and workflow management tools
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
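
The ETL pattern referenced in the qualifications above, reduced to a minimal in-memory sketch. The source and target here are stand-ins for the "wide variety of data sources"; in an ELT variant, the transform step would instead run as SQL inside the target warehouse after loading the raw rows.

```python
# Minimal ETL sketch (illustrative; all data is invented).

def extract():
    # stand-in for reading from a source system
    return [{"id": 1, "amt": "10.5"}, {"id": 2, "amt": "bad"}, {"id": 3, "amt": "4"}]

def transform(rows):
    # cast types; drop rows that fail validation
    clean = []
    for r in rows:
        try:
            clean.append({"id": r["id"], "amt": float(r["amt"])})
        except ValueError:
            pass  # in production: route to a rejects table with a reason
    return clean

def load(rows, target):
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse)  # 2 [{'id': 1, 'amt': 10.5}, {'id': 3, 'amt': 4.0}]
```

Choosing ETL vs ELT mostly comes down to where the compute lives: transform before loading when the target is constrained, transform after loading when the warehouse can do the heavy lifting.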
IT Consulting, System Integrator & Software Services Company

Agency job
via Jobdost by Ankitha Vyas
Chennai, Bengaluru (Bangalore)
3 - 8 yrs
₹5L - ₹12L / yr
Tableau
SQL
PL/SQL
Responsibilities


In this role, the candidate will be responsible for developing Tableau reports. They should be able to write effective and scalable code and improve the functionality of existing reports/systems.

• Design stable, scalable code.

• Identify potential improvements to the current design/processes.

• Participate in multiple project discussions as a senior member of the team.

• Serve as a coach/mentor for junior developers.


Minimum Qualifications

• 3 - 8 years of experience

• Excellent written and verbal communication skills

 

Must have skills

• Meaningful work experience

• Extensive work on the BI reporting tool Tableau, developing reports to fulfil end-user requirements.

• Experienced in interacting with business users to analyze business processes and requirements, and in refining those requirements into visualizations and reports.

• Must have knowledge of selecting appropriate data visualization strategies (e.g., chart types) for specific use cases. Ability to showcase complete dashboard implementations that demonstrate visual best practices (e.g., color themes, visualization layout, interactivity, drill-down capabilities, filtering, etc.).

• Should be an independent player with experience working with senior leaders.

• Able to explore options and suggest new solutions and visualization techniques to the customer.

• Experience crafting joins, including joins with custom SQL, and blending data from different data sources using Tableau Desktop.

• Building sophisticated calculations in Tableau Desktop (Aggregate, Date, Logical, String, Table, LOD expressions).

• Working with relational data sources (like Oracle / SQL Server / DB2) and flat files.

• Optimizing user queries and dashboard performance.

• Knowledge of SQL, PL/SQL.

• Knowledge of crafting DB views and materialized views.

• Excellent verbal and written communication skills and interpersonal skills are required.

• Excellent documentation and presentation skills; should be able to build business process mapping documents and functional solution documents, and own the acceptance/sign-off process end to end.

• Ability to make the right graph choices, use the data blending feature, and connect to several DB technologies.

• Must stay up to date on new and upcoming visualization technologies.

 

Preferred location: Chennai (priority) / Bengaluru

MNC

Agency job
via Fragma Data Systems by geeti gaurav mohanty
Bengaluru (Bangalore), Hyderabad
3 - 6 yrs
₹10L - ₹15L / yr
Big Data
Spark
ETL
Apache
Hadoop
+2 more
Desired Skill, Experience, Qualifications, and Certifications:
• 5+ years' experience developing and maintaining modern ingestion pipelines using technologies like Spark, Apache NiFi, etc.
• 2+ years' experience with healthcare payors (focusing on Membership, Enrollment, Eligibility, Claims, Clinical)
• Hands-on experience with AWS Cloud and its native components like S3, Athena, Redshift & Jupyter Notebooks
• Strong in Spark Scala & Python pipelines (ETL & streaming)
• Strong experience in metadata management tools like AWS Glue
• Strong experience in coding with languages like Java, Python
• Worked on designing ETL & streaming pipelines in Spark Scala / Python
• Good experience in requirements gathering, design & development
• Working with cross-functional teams to meet strategic goals
• Experience in high-volume data environments
• Critical thinking and excellent verbal and written communication skills
• Strong problem-solving and analytical abilities; should be able to work and deliver individually
• Good to have: AWS Developer certification, Scala coding experience, Postman API, and Apache Airflow or similar scheduler experience
• Nice to have: experience in healthcare messaging standards like HL7, CCDA, EDI, 834, 835, 837
• Good communication skills