
11+ Apache Pig Jobs in India

Apply to 11+ Apache Pig Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Pig Jobs and apply today!

Dailyhunt
Posted by khushboo jain
Bengaluru (Bangalore)
3 - 9 yrs
₹3L - ₹9L / yr
Java
Big Data
Hadoop
Pig
Apache Hive
+13 more
What you'll do:
  • Develop analytics tools for a big data, distributed environment; scalability will be key
  • Provide architectural and technical leadership in developing our core analytics platform
  • Lead development efforts on product features in Java
  • Help scale our mobile platform as we experience massive growth

What we need:
  • Passion to build an analytics and personalisation platform at scale
  • 3 to 9 years of software engineering experience with a product-based company in the data analytics / big data domain
  • Passion for designing and developing from scratch
  • Expert-level Java programming and experience leading the full lifecycle of application development
  • Experience in analytics, Hadoop, Pig, Hive, MapReduce, Elasticsearch and MongoDB is an additional advantage
  • Strong communication skills, verbal and written
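As a rough, purely illustrative sketch of the Hadoop/Pig/Hive-style aggregation work named above, here is a minimal PySpark example that mirrors a typical Pig GROUP BY flow; the file path, columns and app name are assumptions, not details from the posting.

```python
# Hypothetical sketch: a Pig-style "GROUP BY + COUNT/AVG" expressed in PySpark.
# File path, column names and schema are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("article-analytics-sketch").getOrCreate()

# Roughly equivalent Pig Latin:
#   events = LOAD 'events.csv' AS (user_id, article_id, dwell_secs);
#   by_article = GROUP events BY article_id;
#   stats = FOREACH by_article GENERATE group, COUNT(events), AVG(events.dwell_secs);
events = spark.read.option("header", True).option("inferSchema", True).csv("events.csv")

stats = (
    events.groupBy("article_id")
          .agg(F.count("*").alias("views"),
               F.avg("dwell_secs").alias("avg_dwell_secs"))
          .orderBy(F.desc("views"))
)
stats.show(10)
```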
Agilisium
Agency job
via Recruiting India by Moumita Santra
Chennai
10 - 19 yrs
₹12L - ₹40L / yr
Big Data
Apache Spark
Spark
PySpark
ETL
+1 more

Job Sector: IT, Software

Job Type: Permanent

Location: Chennai

Experience: 10 - 20 Years

Salary: 12 – 40 LPA

Education: Any Graduate

Notice Period: Immediate

Key Skills: Python, Spark, AWS, SQL, PySpark

Contact at triple eight two zero nine four two double seven

 

Job Description:

Requirements

  • Minimum 12 years of experience
  • In-depth understanding of and knowledge about distributed computing with Spark
  • Deep understanding of Spark architecture and internals
  • Proven experience in data ingestion, data integration and data analytics with Spark, preferably PySpark
  • Expertise in ETL processes, data warehousing and data lakes
  • Hands-on with Python for big data and analytics
  • Hands-on experience in the Agile Scrum model is an added advantage
  • Knowledge of CI/CD and orchestration tools is desirable
  • Knowledge of AWS S3, Redshift and Lambda is preferred
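As a rough illustration of the Spark data-ingestion and AWS S3 skills listed above, here is a minimal, hypothetical PySpark ETL sketch; bucket names, paths and columns are assumptions for illustration only.

```python
# Hypothetical PySpark ETL sketch: read raw CSV from S3, clean it, and write
# partitioned Parquet back to S3 (e.g. as a staging area for a Redshift COPY).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("s3a://example-raw-bucket/orders/2024/")

cleaned = (
    raw.dropDuplicates(["order_id"])                        # basic hygiene
       .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalise types
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

(cleaned.withColumn("order_date", F.to_date("order_ts"))
        .write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3a://example-curated-bucket/orders/"))
```

The Parquet output is laid out so a downstream warehouse (for example Redshift via COPY) could pick it up; that wiring is outside this sketch.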
Thanks
Global Media Agency
Agency job
via Merito by Sana Patel
Gurugram
8 - 12 yrs
₹20L - ₹30L / yr
Data Analytics
Marketing analytics
SQL
Media Analytics
Digital Analytics
+4 more
Role - Analytics Associate Director

About our Client :-

Our Client is a global data and measurement-driven media agency whose mission is to make brands more valuable to the world. Clients include Google, Flipkart, NBCUniversal, L'Oréal and the Financial Times. The agency is more than 2,000 people strong, manages $4.5B in annualized media spend, and deploys campaigns in 121 markets via 22 offices in APAC, EMEA and the Americas.

About the role :-

Accountable for quantifying and measuring the success of our paid media campaigns and for delivering insights that enable us to innovate the work we deliver at MFG. You will lead multi-product projects, develop best practices, act as the main point of contact for other teams, and provide direct line management for multiple team members.

Some of the things we’d like you to do -

● Build a deep understanding of marketing plans and their objectives to help Account teams (Activation, Planning, etc.) build comprehensive measurement and test-and-learn plans
● Play an instrumental role in evolving and designing new, innovative measurement tools, managing the process through to delivery and taking ownership of the global roll-out
● Recruit, manage and mentor analytical resource(s), ensuring the efficient flow of work through the team, the timely delivery of high-quality outputs and their continuing development as professionals
● Lead the creation of clear, robust and thought-provoking campaign reviews and insights
● Work with Account teams (Activation, Planning, etc.) to help define the right questions and the correct metrics for quantifying campaign performance
● Help deliver “best in class” analytical capabilities across the agency with the wider Analytics team, including the use of new methods, techniques, tools and systems
● Develop innovative marketing campaigns and assist clients to define objectives
● Develop deep understanding of marketing platform testing and targeting abilities, and act in a consultative capacity in their implementation
● Provide hands-on leadership, mentorship, and coaching in the expert delivery of data strategies, AdTech solutions, audiences solutions and data management solutions to our clients
● Leading stakeholder management on certain areas of the client portfolio
● Coordination and communication with 3rd party vendors to critically assess new/bespoke measurement solutions. Includes development and management of contracts and SOWs.

A bit about yourself -

● 8+ years of experience in a data & insight role; practical experience of how analytical techniques/models are used in marketing. Previous agency, media, or consultancy background is desirable.
● A proven track record of working with a diverse array of clients to solve complex problems and deliver demonstrable business success, including (but not limited to) the development of compelling and sophisticated data strategies and AdTech/martech strategies to enable marketing objectives.
● Ideally you have worked with Ad Platforms, DMPs, CDPs, Clean Rooms, Measurement Platforms, Business Intelligence Tools, Data Warehousing and Big Data Solutions to some degree
● 3+ years of management experience and ability to delegate effectively
● Proficiency with systems such as SQL, Social Analytics tools, Python, and ‘R’
● Understanding of measurement for both Direct Response and Brand Awareness campaigns is desired
● Excellent at building and presenting data in a visually engaging and insightful manner that cuts through the noise
● Strong organizational and project management skills including team resourcing
● Strong understanding of what data points can be collected and analyzed in a digital campaign, and how each data point should be analyzed
● Established and professional communication, presentation, and motivational skills
The world’s largest media investment company
Agency job
via Merito by Sana Patel
Gurugram, Bengaluru (Bangalore)
3 - 8 yrs
₹7L - ₹14L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+4 more

Hi,

About the company: Our client is an agency of the world’s largest media investment company, which is a part of WPP. It is a global digital transformation agency with 1,200 employees across 21 nations. Our team of experts supports clients in programmatic, social, paid search, analytics, technology, organic search, affiliate marketing, e-commerce and across traditional channels.


Job Location: Gurgaon/Bangalore


Responsibilities of the role:

● Manage extraction of data sets from multiple marketing/database platforms and perform hygiene and quality-control steps, either via Datorama or in partnership with the Neo Technology team. Data sources will include web analytics tools, media analytics, customer databases, social listening tools, search tools, syndicated data, research & survey tools, etc.

● Implement and manage data system architecture

● Audit and manage data taxonomy/classifications from multiple systems and partners

● Manage the extraction, loading, and transformation of multiple data sets

● Cleanse all data and metrics; perform override updates where necessary

● Execute all business rules according to requirements

● Identify and implement opportunities for efficiency throughout the process

● Manage and execute a thorough QA process and ensure quality and accuracy of all data and reporting deliverables

● Manipulate and analyze “big” data sets synthesized from a variety of sources, including media platforms and marketing automation tools

● Generate and manage all data visualizations and ensure data is presented accurately and is visually pleasing

● Assist the analytics team in running numerous insights reports as needed

● Help maintain a performance platform and provide insights and ongoing recommendations
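Purely as an illustration of the extraction, cleansing and QA steps described above, here is a small, hypothetical pandas sketch; file names, columns and business rules are assumptions rather than client specifics.

```python
# Hypothetical pandas sketch of an "extract, cleanse, QA" step before reporting.
import pandas as pd

media = pd.read_csv("media_export.csv")          # e.g. a platform export

# Basic hygiene: normalise column names, drop exact duplicates, fix types
media.columns = media.columns.str.strip().str.lower().str.replace(" ", "_")
media = media.drop_duplicates()
media["date"] = pd.to_datetime(media["date"], errors="coerce")
media["spend"] = pd.to_numeric(media["spend"], errors="coerce").fillna(0)

# Simple QA checks before the data feeds any dashboard
assert media["date"].notna().all(), "rows with unparseable dates"
assert (media["spend"] >= 0).all(), "negative spend values"

# Business-rule example: derive CPM only where impressions are present
media["cpm"] = media["spend"] / media["impressions"].where(media["impressions"] > 0) * 1000
media.to_csv("media_clean.csv", index=False)
```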


What you will need:

● 3+ years’ experience in an analytics position working with large amounts of data

● Hands-on experience working with data visualization tools such as Datorama, Tableau, or PowerBI

● Additional desirable skills include tag management experience, application coding experience and a statistics background

● Digital media background preferred, including knowledge of DoubleClick and web analytics tools

● A 2-hour overlap with NY or Chicago in the morning (EST or CST time zones)

● Excellent communication skills


Regards

Team Merito

It's an OTT platform
Agency job
via Vmultiply solutions by HR Lakshmi
Hyderabad
6 - 8 yrs
₹8L - ₹15L / yr
Big Data
Apache Kafka
Kibana
Elasticsearch
Logstash
Passionate data engineer with the ability to manage data coming from different sources.
Should design and operate data pipelines.
Build and manage an analytics platform using Elasticsearch, Redshift and MongoDB.
Strong programming fundamentals in data structures and algorithms.
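As a hedged illustration of the Elasticsearch side of such a pipeline, here is a small Python sketch that bulk-indexes playback events with the official elasticsearch client; the host, index name and event fields are assumptions.

```python
# Hypothetical sketch of one pipeline step: bulk-indexing playback events into
# Elasticsearch so they can be explored in Kibana.
from datetime import datetime, timezone
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # assumed local cluster

def make_actions(events, index="playback-events"):
    """Turn raw event dicts into bulk-index actions."""
    for e in events:
        yield {
            "_index": index,
            "_source": {
                "user_id": e["user_id"],
                "title_id": e["title_id"],
                "watch_secs": int(e.get("watch_secs", 0)),
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            },
        }

sample = [{"user_id": "u1", "title_id": "t42", "watch_secs": 310}]
helpers.bulk(es, make_actions(sample))
```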
STP Research
Posted by Vivek Tyagi
Ghaziabad, Vaishali, Noida
0 - 2 yrs
₹2.5L - ₹3.5L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+3 more

Hi,


We are looking for a young and passionate data analyst. Candidate should have knowledge of SPSS and other analysis tools.

DataMetica
Posted by Sayali Kachi
Pune, Hyderabad
4 - 10 yrs
₹5L - ₹20L / yr
ETL
SQL
Data engineering
Analytics
PL/SQL
+3 more

We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.

Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description

Experience : 4-10 years

Location : Pune

 


Mandatory Skills - 

  • Strong in ETL/SQL development
  • Strong Data Warehousing skills
  • Hands-on experience working with Unix/Linux
  • Development experience in Enterprise Data warehouse projects
  • Good to have experience working with Python, shell scripting
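Purely as an illustration of the ETL/SQL development skills listed above, here is a tiny, self-contained Python sketch; sqlite3 stands in for a real warehouse (Teradata, BigQuery, etc.), and all table and column names are hypothetical.

```python
# Minimal illustration of an ETL-style SQL step driven from Python.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Extract/Load": land raw rows in a staging table
cur.execute("CREATE TABLE stg_sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?)",
                [("APAC", 120.0), ("EMEA", 90.5), ("APAC", 40.0)])

# "Transform": a typical warehouse-style aggregate into a reporting table
cur.execute("""
    CREATE TABLE rpt_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount, COUNT(*) AS orders
    FROM stg_sales
    GROUP BY region
""")

for row in cur.execute("SELECT * FROM rpt_sales_by_region ORDER BY region"):
    print(row)
conn.close()
```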

Opportunities -

  • Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
  • Would get a chance to be part of enterprise-grade implementations of Cloud and Big Data systems
  • Will play an active role in setting up a modern data platform based on Cloud and Big Data
  • Would be part of teams with rich experience in various aspects of distributed systems and computing


 

About Us!

A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL and analytics to the cloud, leveraging automation.

 

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica and Greenplum systems, along with ETL tools like Informatica, DataStage, Ab Initio and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes and cloud optimization.

 

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.

 

We have our own products!

Eagle – Data warehouse Assessment & Migration Planning Product

Raven – Automated Workload Conversion Product

Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

 

Why join us!

Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over the years are the key factors in achieving our success.

 

 

Benefits we Provide!

Working with highly technical, passionate, mission-driven people

Subsidized Meals & Snacks

Flexible Schedule

Approachable leadership

Access to various learning tools and programs

Pet Friendly

Certification Reimbursement Policy

 

Check out more about us on our website below!

www.datametica.com

NoBroker
Posted by noor aqsa
Bengaluru (Bangalore)
1 - 3 yrs
₹6L - ₹8L / yr
Java
Spark
PySpark
Data engineering
Big Data
+2 more
You will build, set up and maintain some of the best data pipelines and MPP frameworks for our datasets.
Translate complex business requirements into scalable technical solutions that meet data design standards. Strong understanding of analytics needs and the proactiveness to build generic solutions to improve efficiency.
Build dashboards using self-service tools on Kibana and perform data analysis to support business verticals.
Collaborate with multiple cross-functional teams.
Fragma Data Systems
Posted by Priyanka U
Remote only
4 - 10 yrs
₹12L - ₹23L / yr
Informatica
ETL
Big Data
Spark
SQL
Skill: Informatica with Big Data Management (BDM)

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working with Spark/SQL
3. Develops Informatica mappings/SQL
4. Should have experience with Hadoop, Spark, etc.
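Informatica BDM itself is configured through its own tooling rather than hand-written code, so purely as a stand-in illustration of the Spark/SQL skills listed above, here is a small PySpark sketch of a mapping-style join and aggregate; the tables and columns are made up.

```python
# Hypothetical Spark SQL sketch of a BDM-style mapping: join + aggregate.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bdm-style-mapping-sketch").getOrCreate()

customers = spark.createDataFrame(
    [(1, "Asha", "IN"), (2, "Ravi", "IN"), (3, "Mei", "SG")],
    ["customer_id", "name", "country"],
)
orders = spark.createDataFrame(
    [(101, 1, 250.0), (102, 1, 80.0), (103, 3, 45.0)],
    ["order_id", "customer_id", "amount"],
)

customers.createOrReplaceTempView("customers")
orders.createOrReplaceTempView("orders")

# Join and aggregate, the bread and butter of a mapping, expressed as SQL
result = spark.sql("""
    SELECT c.country, COUNT(o.order_id) AS orders, SUM(o.amount) AS revenue
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.country
""")
result.show()
```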

Work days: Sun-Thu
Day shift
 
 
 
Conviva
Agency job
via Wenger and Watson Inc by Bevin Baby
Remote, Bengaluru (Bangalore)
15 - 20 yrs
₹50L - ₹120L / yr
Scala
Big Data
Hadoop
Spark
JVM
+2 more
About the Company, Conviva:
Conviva is the leader in streaming media intelligence, powered by its real-time platform. More than 250 industry leaders and brands – including CBS, CCTV, Cirque Du Soleil, DAZN, Disney+, HBO, Hulu, Sky, Sling TV, TED, Univision, and Warner Media – rely on Conviva to maximize their consumer engagement, deliver the quality experiences viewers expect and drive revenue growth. With a global footprint of more than 500 million unique viewers watching 150 billion streams per year across 3 billion applications streaming on devices, Conviva offers streaming providers unmatched scale for continuous video measurement, intelligence and benchmarking across every stream, every screen, every second. Conviva is privately held and headquartered in Silicon Valley, California, with offices around the world. For more information, please visit us at www.conviva.com.

What you get to do:

● Be a thought leader. As one of the most senior technical minds in the India centre, influence our technical evolution journey by pushing the boundaries of possibility, testing forward-looking ideas and demonstrating their value.
● Be a technical leader. Demonstrate pragmatic skills in translating requirements into technical design.
● Be an influencer. Understand challenges and collaborate across executives and stakeholders in a geographically distributed environment to influence them.
● Be a technical mentor. Build respect within the team. Mentor senior engineers technically and contribute to the growth of talent in the India centre.
● Be a customer advocate. Be empathetic to the customer and domain by resolving ambiguity efficiently with the customer in mind.
● Be a transformation agent. Passionately champion engineering best practices and share them across teams.
● Be hands-on. Participate regularly in code and design reviews, drive technical prototypes and actively contribute to resolving difficult production issues.

What you bring to the role:
● Thrive in a start-up environment and have a platform mindset.
● Excellent communicator. Demonstrated ability to succinctly communicate and describe complex technical designs and technology choices both to executives and developers.
● Expert in Scala coding; a JVM-based stack is a bonus.
● Expert in big data technologies like Druid, Spark, Hadoop, Flink (or Akka) and Kafka.
● Passionate about one or more engineering best practices that influence design, quality of code or developer efficiency.
● Familiar with building distributed applications using web services and RESTful APIs.
● Familiarity with building SaaS platforms on either in-house data centres or public cloud providers.
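The role calls for Scala, but purely to illustrate the Spark-plus-Kafka streaming stack mentioned above, here is a minimal PySpark Structured Streaming sketch; the broker address, topic and event schema are assumptions, and the Spark Kafka connector package is presumed to be on the classpath.

```python
# Hypothetical streaming sketch: count sessions with heavy buffering from a Kafka topic.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("stream-qoe-sketch").getOrCreate()

event_schema = StructType([
    StructField("session_id", StringType()),
    StructField("buffering_ms", LongType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")
         .option("subscribe", "playback-events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

# Running count of sessions with noticeable buffering, printed to the console
alerts = (events.filter(F.col("buffering_ms") > 2000)
                .groupBy("session_id")
                .count())

query = alerts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```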
Cemtics
Posted by Tapan Sahani
Remote, NCR (Delhi | Gurgaon | Noida)
4 - 6 yrs
₹5L - ₹12L / yr
Big Data
Spark
Hadoop
SQL
Python
+1 more

JD:

Required Skills:

  • Intermediate to expert-level hands-on programming in one of the following languages: Java, Python, PySpark or Scala
  • Strong practical knowledge of SQL
  • Hands-on experience with Spark/Spark SQL
  • Data structures and algorithms
  • Hands-on experience as an individual contributor in the design, development, testing and deployment of Big Data applications
  • Experience with Big Data tools such as Hadoop, MapReduce, Spark, etc.
  • Experience with NoSQL databases like HBase
  • Experience with the Linux OS environment (shell scripting, AWK, SED)
  • Intermediate RDBMS skills; able to write SQL queries with complex relations on top of a large RDBMS (100+ tables)
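As a small, hypothetical illustration of the Spark/Spark SQL and complex-SQL skills listed above, here is a PySpark window-function sketch (top product per store); all data and column names are made up.

```python
# Hypothetical window-function example: top-selling product per store.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-sql-sketch").getOrCreate()

sales = spark.createDataFrame(
    [("store_a", "p1", 30), ("store_a", "p2", 75),
     ("store_b", "p1", 20), ("store_b", "p3", 50)],
    ["store", "product", "units"],
)

w = Window.partitionBy("store").orderBy(F.desc("units"))

top_per_store = (sales.withColumn("rank", F.row_number().over(w))
                      .filter(F.col("rank") == 1)
                      .drop("rank"))
top_per_store.show()
```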