Mixpanel Jobs in Hyderabad

11+ Mixpanel Jobs in Hyderabad | Mixpanel Job openings in Hyderabad

Apply to 11+ Mixpanel Jobs in Hyderabad on CutShort.io. Explore the latest Mixpanel Job opportunities across top companies like Google, Amazon & Adobe.

Indium Software

at Indium Software

16 recruiters
Mohamed Aslam
Posted by Mohamed Aslam
Hyderabad
3 - 7 yrs
₹7L - ₹13L / yr
Python
Spark
SQL
PySpark
HiveQL
+2 more

Indium Software is a niche technology solutions company with deep expertise in Digital, QA and Gaming. Indium helps customers in their Digital Transformation journey through a gamut of solutions that enhance business value.

With over 1,000 associates globally, Indium operates through offices in the US, UK and India.

Visit www.indiumsoftware.com to know more.

Job Title: Analytics Data Engineer

What will you do:
The Data Engineer must be an expert in SQL development, supporting the Data and Analytics team in database design, data flow and analysis activities. The role also plays a key part in developing and deploying innovative big data platforms for advanced analytics and data processing. The Data Engineer defines and builds the data pipelines that enable faster, better, data-informed decision-making within the business.

We ask:

Extensive experience with SQL and a strong ability to process and analyse complex data

Ability to design, build and maintain the business's ETL pipeline and data warehouse, with demonstrated expertise in data modelling and query performance tuning on SQL Server
Proficiency in product analytics, especially funnel analysis, with hands-on experience in tools such as Mixpanel, Amplitude, ThoughtSpot, Google Analytics and similar tools.

Should work on the tools and frameworks required for building efficient and scalable data pipelines
Excellent at communicating and articulating ideas, with an ability to influence others and continuously drive towards a better solution
Experience working in Python, Hive queries, Spark, PySpark, Spark SQL and Presto (a brief PySpark sketch follows the checklist below)

  • Relate metrics to the product
  • Programmatic thinking
  • Handling edge cases
  • Good communication
  • Understanding of product functionality
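
For illustration only, here is a minimal PySpark sketch of the kind of funnel computation described above; the Hive table, column names and funnel steps are assumptions rather than part of the role.

```python
# Minimal funnel sketch in PySpark (illustrative; table, columns and steps are assumed).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("funnel-sketch").getOrCreate()

# Assumed raw product-event table: user_id, event_name, event_time
events = spark.table("analytics.events")
steps = ["signup", "activate", "purchase"]          # assumed funnel steps

# Earliest occurrence of each step per user, one column per step.
firsts = (events.filter(F.col("event_name").isin(steps))
                .groupBy("user_id")
                .pivot("event_name", steps)
                .agg(F.min("event_time")))

# A user survives step N only if every earlier step happened at or before it.
reached, prev, agg_exprs = F.lit(True), None, []
for step in steps:
    cond = F.col(step).isNotNull()
    if prev:
        cond = cond & (F.col(step) >= F.col(prev))
    reached = reached & cond
    agg_exprs.append(F.sum(reached.cast("int")).alias(step))
    prev = step

firsts.agg(*agg_exprs).show()   # one row: distinct users remaining at each step
```

The same query shape could equally be expressed in Spark SQL, Hive or Presto.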

Perks & Benefits:
A dynamic, creative and intelligent team that will make you love being at work.
An autonomous, hands-on role where you can make an impact; you will be joining at an exciting time of growth!

Flexible work hours, an attractive pay package and perks
An inclusive work environment that lets you work in the way that works best for you!

Bengaluru (Bangalore), Mumbai, Delhi, Gurugram, Pune, Hyderabad, Ahmedabad, Chennai
3 - 7 yrs
₹8L - ₹15L / yr
AWS Lambda
Amazon S3
Amazon VPC
Amazon EC2
Amazon Redshift
+3 more

Technical Skills:


  • Ability to understand and translate business requirements into design.
  • Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
  • Experience in creating ETL jobs using Python/PySpark.
  • Proficiency in creating AWS Lambda functions for event-based jobs.
  • Knowledge of automating ETL processes using AWS Step Functions.
  • Competence in building data warehouses and loading data into them.


Responsibilities:


  • Understand business requirements and translate them into design.
  • Assess AWS infrastructure needs for development work.
  • Develop ETL jobs using Python/PySpark to meet requirements.
  • Implement AWS Lambda for event-based tasks.
  • Automate ETL processes using AWS Step Functions (a minimal Lambda-to-Step Functions sketch follows this list).
  • Build data warehouses and manage data loading.
  • Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
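
For illustration only, a minimal sketch of the event-based pattern referenced in the Lambda and Step Functions items above: a Python Lambda handler that reacts to an S3 object-created event and starts a Step Functions execution for the downstream ETL. The environment variable and payload shape are assumptions.

```python
# Illustrative Lambda handler: S3 "object created" event -> Step Functions execution.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["ETL_STATE_MACHINE_ARN"]   # assumed configuration

def handler(event, context):
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        resp = sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
        started.append(resp["executionArn"])
    return {"startedExecutions": started}
```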
A fast growing Big Data company

Agency job
via Careerconnects by Kumar Narayanan
Noida, Bengaluru (Bangalore), Chennai, Hyderabad
6 - 8 yrs
₹10L - ₹15L / yr
AWS Glue
SQL
Python
PySpark
Data engineering
+6 more

AWS Glue Developer 

Work Experience: 6 to 8 Years

Work Location:  Noida, Bangalore, Chennai & Hyderabad

Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integration and DataOps

Job Reference ID: BT/F21/IND


Job Description:

Design, build and configure applications to meet business process and application requirements.


Responsibilities:

7+ years of work experience with ETL, data modelling and data architecture. Proficient in ETL optimization and in designing, coding and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.), designing and developing data engineering solutions, and orchestrating workflows with Airflow.


Technical Experience:

Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications that stream and process large datasets at low latency.


➢ Enhancements, new development, defect resolution and production support of Big data ETL development using AWS native services.

➢ Create data pipeline architecture by designing and implementing data ingestion solutions.

➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.

➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.

➢ Author ETL processes using Python and PySpark (see the Glue job sketch after this list).

➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.

➢ ETL process monitoring using CloudWatch events.

➢ You will be working in collaboration with other teams, so good communication is a must.

➢ Must have experience using AWS service APIs, the AWS CLI and SDKs.
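
For illustration only, a minimal AWS Glue (PySpark) job skeleton of the kind referenced above; the catalog database, table, filter column and S3 path are assumptions.

```python
# Illustrative AWS Glue (PySpark) job: catalog table -> filter -> Parquet on S3.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read an assumed table from the Glue Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# A simple DataFrame transformation (assumed column and value).
df = dyf.toDF().filter(F.col("order_status") == "COMPLETED")

# Write curated output to an assumed S3 location as Parquet.
df.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")

job.commit()
```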


Professional Attributes:

➢ Experience operating very large data warehouses or data lakes; expert-level skills in writing and optimizing SQL; and extensive, real-world experience designing technology components for enterprise solutions and defining solution and reference architectures with a focus on cloud technology.

➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, DynamoDB, Athena and Glue in an AWS environment.

➢ Expertise in S3, RDS, Redshift, Kinesis and EC2 clusters is highly desired.


Qualification:

➢ Degree in Computer Science, Computer Engineering or equivalent.


Salary: Commensurate with experience and demonstrated competence

Synechron

at Synechron

3 recruiters
Ranjini N
Posted by Ranjini N
Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹2L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Python
Shell Scripting
+2 more

Position: ETL Developer

Location: Mumbai

Exp.Level: 4+ Yrs

Required Skills:

* Strong scripting knowledge, such as Python and Shell

* Strong relational database skills especially with DB2/Sybase

* Create high quality and optimized stored procedures and queries

* Strong with scripting languages such as Python and Unix / K-Shell

* Strong knowledge of relational database performance and tuning, such as proper use of indices, database statistics/reorgs, and de-normalization concepts.

* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation is a plus.

* Experienced in Agile development process

* Java Knowledge is a big plus but not essential

* Experience in delivery of metrics / reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects, Tableau, report design & delivery) is a plus

* Experience on ETL processes and tools such as Informatica is a plus. Real time message processing experience is a big plus.

* Good team player; Integrity & ownership

Hyderabad
4 - 8 yrs
₹6L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
  1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight (a brief Athena example follows this list)
  2. Experience in developing Lambda functions with AWS Lambda
  3. Expertise with Spark/PySpark – candidate should be hands-on with PySpark code and able to do transformations with Spark
  4. Should be able to code in Python and Scala
  5. Snowflake experience will be a plus
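
For illustration only, a minimal boto3 sketch of the Athena leg of a Glue -> Athena -> QuickSight pipeline: submit a query over a catalogued table and poll until it completes. The database, query and output location are assumptions.

```python
# Illustrative Athena query via boto3 (database, query and output path assumed).
import time

import boto3

athena = boto3.client("athena")

resp = athena.start_query_execution(
    QueryString="SELECT order_date, COUNT(*) AS orders "
                "FROM curated_db.orders GROUP BY order_date",
    QueryExecutionContext={"Database": "curated_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows) - 1} data rows")   # first row is the header
```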

 

Indium Software

at Indium Software

16 recruiters
Swaathipriya P
Posted by Swaathipriya P
Bengaluru (Bangalore), Hyderabad
2 - 5 yrs
₹1L - ₹15L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more
2+ years of analytics experience, predominantly in SQL, SAS, statistics, R, Python and visualization
Experienced in writing complex SQL select queries (window functions and CTEs) with advanced SQL knowledge (a small example follows below)
Should work as an individual contributor for the initial few months; a team will be aligned based on project movement
Strong in querying logic and data interpretation
Solid communication and articulation skills
Able to handle stakeholders independently with minimal intervention from the reporting manager
Develop strategies to solve problems in logical yet creative ways
Create custom reports and presentations accompanied by strong data visualization and storytelling
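
For illustration only, a small self-contained example of the kind of query referenced above, combining a CTE with a window function. It runs against an in-memory SQLite database (3.25+) purely so the snippet is runnable; the table and data are made up.

```python
# Illustrative CTE + window function (run on in-memory SQLite for self-containment).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-01-05', 120.0), (1, '2024-02-10', 80.0),
        (2, '2024-01-20', 200.0), (2, '2024-03-02', 40.0);
""")

query = """
WITH ranked AS (                               -- CTE
    SELECT customer_id,
           order_date,
           amount,
           ROW_NUMBER() OVER (                 -- window function
               PARTITION BY customer_id
               ORDER BY order_date DESC
           ) AS rn
    FROM orders
)
SELECT customer_id, order_date, amount
FROM ranked
WHERE rn = 1;                                  -- latest order per customer
"""

for row in conn.execute(query):
    print(row)
```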
DataMetica

at DataMetica

1 video
7 recruiters
Sayali Kachi
Posted by Sayali Kachi
Pune, Hyderabad
6 - 12 yrs
₹11L - ₹25L / yr
PL/SQL
MySQL
SQL server
SQL
Linux/Unix
+4 more

We at Datametica Solutions Private Limited are looking for an SQL Lead / Architect who has a passion for the cloud, with knowledge of different on-premises and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.



Job Description :

Experience: 6+ Years

Work Location: Pune / Hyderabad



Technical Skills :

  • Good programming experience as an Oracle PL/SQL, MySQL and SQL Server developer
  • Knowledge of database performance tuning techniques
  • Rich experience in database development
  • Experience in designing and implementing business applications using the Oracle Relational Database Management System
  • Experience in developing complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL

Required Candidate Profile :

  • Excellent communication, interpersonal, analytical skills and strong ability to drive teams
  • Analyzes data requirements and data dictionary for moderate to complex projects
  • Leads data model related analysis discussions while collaborating with Application Development teams, Business Analysts, and Data Analysts during joint requirements analysis sessions
  • Translate business requirements into technical specifications with an emphasis on highly available and scalable global solutions
  • Stakeholder management and client engagement skills
  • Strong communication skills (written and verbal)

About Us!

A global leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

We have our own products!

Eagle Data warehouse Assessment & Migration Planning Product

Raven Automated Workload Conversion Product

Pelican Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.



Why join us!

Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.



Benefits we Provide!

Working with Highly Technical and Passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy



Check out more about us on our website below!

www.datametica.com

Milestone Hr Consultancy

at Milestone Hr Consultancy

2 recruiters
Jyoti Sharma
Posted by Jyoti Sharma
Remote, Hyderabad
3 - 8 yrs
₹6L - ₹16L / yr
Python
Django
Data engineering
Apache Hive
Apache Spark
We are currently looking for passionate Data Engineers to join our team and mission. In this role, you will help doctors from across the world improve care and save lives by helping extract insights and predict risk. Our Data Engineers ensure that data are ingested and prepared, ready for insights and intelligence to be derived from them. We're looking for smart individuals to join our incredibly talented team, which is on a mission to transform healthcare.

As a Data Engineer you will be engaged in some or all of the following activities:
  • Implement, test and deploy distributed data ingestion, data processing and feature engineering systems computing on large volumes of healthcare data, using a variety of open source and proprietary technologies (a brief ingestion sketch follows this posting).
  • Design data architectures and schemas optimized for analytics and machine learning.
  • Implement telemetry to monitor the performance and operations of data pipelines.
  • Develop tools and libraries to implement and manage data processing pipelines, including ingestion, cleaning, transformation, and feature computation.
  • Work with large data sets, and integrate diverse data sources, data types and data structures.
  • Work with Data Scientists, Machine Learning Engineers and Visualization Engineers to understand data requirements and translate them into production-ready data pipelines.
  • Write and automate unit, functional, integration and performance tests in a Continuous Integration environment.
  • Take initiative to find solutions to technical challenges for healthcare data.

You are a great match if you have some or all of the following skills and qualifications:
  • Strong understanding of database design and feature engineering to support Machine Learning and analytics.
  • At least 3 years of industry experience building, testing and deploying large-scale, distributed data processing systems.
  • Proficiency in working with multiple data processing tools and query languages (Python, Spark, SQL, etc.).
  • Excellent understanding of distributed computing concepts and Big Data technologies (Spark, Hive, etc.).
  • Proficiency in performance tuning and optimization of data processing pipelines.
  • Attention to detail and focus on software quality, with experience in software testing.
  • Strong cross-discipline communication skills and teamwork.
  • Demonstrated clear and thorough logical and analytical thinking, as well as problem-solving skills.
  • Bachelor's or Master's in Computer Science or a related field.

Skills: Apache Spark, Python, Hive, SQL
Responsibility: Sr. Data Engineer
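
For illustration only, a minimal PySpark ingestion-and-cleaning sketch of the kind of pipeline described above; the source path, column names and target Hive table are assumptions.

```python
# Illustrative ingest-and-clean step (paths, columns and table names assumed).
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("ingest-observations")
         .enableHiveSupport()
         .getOrCreate())

# Read assumed raw healthcare observations landed as CSV.
raw = spark.read.csv("s3://example-raw-bucket/observations/", header=True)

# Standardise types and drop rows unusable for downstream feature engineering.
clean = (raw
         .withColumn("observed_at", F.to_timestamp("observed_at"))
         .withColumn("value", F.col("value").cast("double"))
         .filter(F.col("patient_id").isNotNull() & F.col("value").isNotNull()))

# Publish a partitioned Hive table for analytics and ML feature pipelines.
(clean.write
      .mode("overwrite")
      .partitionBy("observation_code")   # assumed partition column in the raw data
      .saveAsTable("clinical.observations_clean"))
```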
Syrencloud

at Syrencloud

3 recruiters
Samarth Patel
Posted by Samarth Patel
Hyderabad
3 - 7 yrs
₹5L - ₹8L / yr
Data Analytics
Data analyst
SQL
SAP
Our growing technology firm is looking for an experienced Data Analyst who is able to turn project requirements into custom-formatted data reports. The ideal candidate for this position is able to do complete life cycle data generation and outline critical information for each Project Manager. We also need someone who is able to analyze business procedures and recommend specific types of data that can be used to improve upon them.
15 years US based Product Company

Agency job
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
informatica developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
  • Experience with the SIF framework, including real-time integration.
  • Should have experience in building C360 Insights using Informatica.
  • Should have good experience in creating performant designs using Mapplets, Mappings and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures, such as Enterprise, Federated, and Multi-Tier architectures.
  • Should have experience in configuring Informatica Data Director with reference to the Data Governance of users, IT Managers, and Data Stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience with UNIX and shell scripting to run Informatica workflows and control the ETL flow.
  • Should know about Informatica Server installation and have knowledge of the Administration console.
  • Working experience with Developer along with Administration is added knowledge.
  • Working experience in Amazon Web Services (AWS) is an added advantage, particularly S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.
Bigdatamatica Solutions Pvt Ltd

at Bigdatamatica Solutions Pvt Ltd

1 video
1 recruiter
sriram bhattaram
Posted by sriram bhattaram
Hyderabad
4 - 8 yrs
₹45000 - ₹60000 / mo
Analytics
Python
R Programming
SQL server

Top MNC looking for candidates in Business Analytics (4-8 years of experience).

 

Requirement :

- Experience in metric development & Business analytics

- High Data Skill Proficiency/Statistical Skills

- Tools: R, SQL, Python, Advanced Excel

- Good verbal/communication Skills 

- Supply Chain domain knowledge

 

*Job Summary*

Duration: 6-month contract based at Hyderabad

Availability: 1 week/Immediate

Qualification: Graduate/PG from Reputed University

 

 

*Key Skills*

R, SQL, Advanced Excel, Python

 

*Required Experience and Qualifications*

5 to 8 years of Business Analytics experience.

 
