Apache Drill Jobs in Bangalore (Bengaluru)


Apply to 11+ Apache Drill Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Apache Drill Job opportunities across top companies like Google, Amazon & Adobe.

Uber

1 video
10 recruiters
Posted by Suvidha Chib
Bengaluru (Bangalore)
7 - 15 yrs
₹0L / yr
Big Data
Hadoop
Kafka
Spark
Apache Hive
+9 more

Data Platform engineering at Uber is looking for a strong Technical Lead (Level 5a Engineer) who has built high-quality platforms and services that can operate at scale. A 5a Engineer at Uber exhibits the following qualities:

 

  • Demonstrate tech expertise: Demonstrate the technical skills to go very deep or broad in solving classes of problems or creating broadly leverageable solutions.
  • Execute large-scale projects: Define, plan and execute complex and impactful projects. You communicate the vision to peers and stakeholders.
  • Collaborate across teams: Act as a domain resource to engineers outside your team and help them leverage the right solutions. Facilitate technical discussions and drive to a consensus.
  • Coach engineers: Coach and mentor less experienced engineers and deeply invest in their learning and success. You give and solicit feedback, both positive and negative, to help improve the entire team.
  • Tech leadership: Lead the effort to define best practices in your immediate team, and help the broader organization establish better technical or business processes.


What You’ll Do

  • Build a scalable, reliable, operable and performant data analytics platform for Uber’s engineers, data scientists, products and operations teams.
  • Work alongside the pioneers of big data systems such as Hive, YARN, Spark, Presto, Kafka and Flink to build out a highly reliable, performant, easy-to-use software system for Uber’s planet-scale data.
  • Become proficient in the multi-tenancy, resource isolation, abuse prevention and self-serve debuggability aspects of a highly performant, large-scale service while building these capabilities for Uber's engineers and operations teams.

 

What You’ll Need

  • 7+ years of experience building large-scale products, data platforms and distributed systems in a high-caliber environment.
  • Architecture: Identify and solve major architectural problems by going deep in your field or broad across different teams. Extend, improve, or, when needed, build solutions to address architectural gaps or technical debt.
  • Software Engineering/Programming: Create frameworks and abstractions that are reliable and reusable. You have advanced knowledge of at least one programming language and are happy to learn more. Our core languages are Java, Python, Go, and Scala.
  • Data Engineering: Expertise in one of the big data analytics technologies we currently use, such as Apache Hadoop (HDFS and YARN), Apache Hive, Impala, Drill, Spark, Tez, Presto, Calcite, Parquet or Arrow. Under-the-hood experience with similar systems such as Vertica, Apache Impala, Drill, Google Borg, Google BigQuery, Amazon EMR, Amazon Redshift, Docker, Kubernetes or Mesos.
  • Execution & Results: You tackle large technical projects/problems that are not clearly defined. You anticipate roadblocks and have strategies to de-risk timelines. You orchestrate work that spans multiple teams and keep your stakeholders informed.
  • A team player: You believe that you can achieve more on a team, and that the whole is greater than the sum of its parts. You rely on others’ candid feedback for continuous improvement.
  • Business acumen: You understand requirements beyond the written word. Whether you’re working on an API used by other developers, an internal tool consumed by our operations teams, or a feature used by millions of customers, your attention to detail leads to a delightful user experience.
Marktine

1 recruiter
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
Cloud
Google Cloud Platform (GCP)
BigQuery
Python
SQL
+2 more

Specific Responsibilities

  • Minimum of 2 years of experience with Google BigQuery and Google Cloud Platform.
  • Design and develop the ETL framework using BigQuery.
  • Expertise in BigQuery concepts like nested queries, clustering, partitioning, etc.
  • Working experience with clickstream databases and Google Analytics/Adobe Analytics.
  • Should be able to automate the data load from BigQuery using APIs or a scripting language.
  • Good experience with advanced SQL concepts.
  • Good experience with Adobe Launch web, mobile & e-commerce tag implementation.
  • Identify complex, fuzzy problems, break them down into smaller parts, and implement creative, data-driven solutions.
  • Responsible for defining, analyzing, and communicating key metrics and business trends to the management teams.
  • Identify opportunities to improve conversion & user experience through data. Influence product & feature roadmaps.
  • Must have a passion for data quality and constantly look to improve the system. Drive data-driven decision making with stakeholders & drive change management.
  • Understand requirements to translate business and technical problems into analytics problems.
  • Effective storyboarding and presentation of the solution to the client and leadership.
  • Client engagement & management.
  • Ability to interface effectively with multiple levels of management and functional disciplines.
  • Assist in developing/coaching individuals technically as well as on soft skills during the project and as part of the client project’s training program.

 

Work Experience
  • 2 to 3 years of working experience with Google BigQuery & Google Cloud Platform
  • Relevant experience in Consumer Tech/CPG/Retail industries
  • Bachelor’s in Engineering, Computer Science, Math, Statistics or a related discipline
  • Strong problem solving and web analytical skills. Acute attention to detail.
  • Experience in analyzing large, complex, multi-dimensional data sets.
  • Experience in one or more roles in an online eCommerce or online support environment.
 
Skills
  • Expertise in Google BigQuery & Google Cloud Platform
  • Experience with advanced SQL and a scripting language (Python/R)
  • Hands-on experience with BI tools (Tableau, Power BI)
  • Working experience with & understanding of Adobe Analytics or Google Analytics
  • Experience in creating and debugging website & app tracking (Omnibug, Dataslayer, GA Debugger, etc.)
  • Excellent analytical thinking, analysis, and problem-solving skills.
  • Knowledge of other GCP services is a plus
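The BigQuery partitioning and clustering concepts called out above can be sketched in DDL. The following is a minimal, hypothetical illustration (the table and column names are invented for the example, not taken from any listing):

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement that partitions by a
    date derived from a timestamp column and clusters by other columns."""
    cluster_clause = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{table}` "
        f"PARTITION BY DATE({partition_col}) "
        f"CLUSTER BY {cluster_clause} "
        f"AS SELECT * FROM `{table}_staging`"
    )

ddl = partitioned_table_ddl(
    "analytics.events", "event_ts", ["country", "device_type"]
)
print(ddl)
```

Partitioning prunes scanned data when queries filter on the partition column; clustering co-locates rows that share cluster-column values, which is what makes the concepts worth calling out in interviews.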
 
Marktine

1 recruiter
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹15L / yr
Big Data
ETL
PySpark
SSIS
Microsoft Windows Azure
+4 more

Must Have Skills:

- Solid knowledge of DWH, ETL and Big Data concepts

- Excellent SQL skills (including SQL analytics functions)

- Working experience with an ETL tool, e.g. SSIS / Informatica

- Working experience with Azure or AWS big data tools

- Experience implementing data jobs (batch / real-time streaming)

- Excellent written and verbal communication skills in English; self-motivated with a strong sense of ownership and ready to learn new tools and technologies
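As an illustration of the SQL analytics (window) functions asked for above, here is a minimal runnable example using Python's stdlib sqlite3 module (SQLite 3.25+ supports window functions); the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day INTEGER, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [(1, 100), (2, 50), (3, 200)])

# A running total is a classic analytic (window) function:
# SUM() OVER an ordered window rather than a GROUP BY aggregate.
rows = conn.execute("""
    SELECT day,
           amount,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
""").fetchall()
print(rows)  # → [(1, 100, 100), (2, 50, 150), (3, 200, 350)]
```

The same `OVER (...)` syntax carries over to BigQuery, SQL Server and the other engines these listings mention.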

Preferred Skills:

- Experience with PySpark / Spark SQL

- AWS Data Tools (AWS Glue, AWS Athena)

- Azure Data Tools (Azure Databricks, Azure Data Factory)

Other Skills:

- Knowledge of Azure Blob, Azure File Storage, AWS S3, Elasticsearch / RediSearch

- Knowledge of the domain/function (across pricing, promotions and assortment)

- Implementation experience with a schema and data validator framework (Python / Java / SQL)

- Knowledge of DQS and MDM

Key Responsibilities:

- Independently work on ETL / DWH / Big data Projects

- Gather and process raw data at scale.

- Design and develop data applications using selected tools and frameworks as required and requested.

- Read, extract, transform, stage and load data to selected tools and frameworks as required and requested.

- Perform tasks such as writing scripts, web scraping, calling APIs, writing SQL queries, etc.

- Work closely with the engineering team to integrate your work into our production systems.

- Process unstructured data into a form suitable for analysis.

- Analyse processed data.

- Support business decisions with ad hoc analysis as needed.

- Monitor data performance and modify infrastructure as needed.

Responsibility: a smart resource with excellent communication skills

 

 
Kloud9 Technologies
Bengaluru (Bangalore)
4 - 7 yrs
₹10L - ₹30L / yr
Google Cloud Platform (GCP)
PySpark
skill iconPython
skill iconScala

About Kloud9:

 

Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers help retailers launch a successful cloud initiative so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce the cloud adoption time and effort so you can directly benefit from lower migration costs.

 

Kloud9 was founded with the vision of bridging the gap between e-commerce and the cloud. E-commerce in any industry is constrained by the huge finances spent on physical data infrastructure.

 

At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.

 

Our sole focus is to provide cloud expertise to the retail industry, empowering our clients to take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers for an average of more than 20 years.

 

We are a cloud vendor that is both platform and technology independent. Our vendor independence not only provides us with a unique perspective on the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.


●    Overall 8+ years of experience in web application development.

●    5+ years of development experience with Java 8, Spring Boot, microservices and middleware

●    3+ years designing middleware on the Node.js platform.

●    Good to have: 2+ years of experience using Node.js with the AWS serverless platform.

●    Good experience with JavaScript / TypeScript, event loops, Express.js, GraphQL, SQL databases (MySQL), NoSQL databases (MongoDB) and YAML templates.

●    Good experience with test-driven development (TDD) and automated unit testing.

●    Good experience with exposing and consuming REST APIs on the Java 8 / Spring Boot platform and Swagger API contracts.

●    Good experience building Node.js middleware performing transformations, routing, aggregation, orchestration and authentication (JWT/OAuth).

●    Experience supporting and working with cross-functional teams in a dynamic environment.

●    Experience working in Agile Scrum methodology.

●    Very good problem-solving skills.

●    A quick learner with a passion for technology.

●     Excellent verbal and written communication skills in English

●     Ability to communicate effectively with team members and business stakeholders


Secondary Skill Requirements:

 

● Experience working with any of LoopBack, NestJS, hapi.js, Sails.js, Passport.js


Why Explore a Career at Kloud9:

 

With job opportunities in prime locations in the US, London, Poland and Bengaluru, we help build your career path in cutting-edge technologies such as AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers.

Amagi Media Labs

3 recruiters
Posted by Rajesh C
Bengaluru (Bangalore), Noida
5 - 9 yrs
₹10L - ₹17L / yr
Data engineering
Spark
Scala
Hadoop
Apache Hadoop
+1 more
  • We are looking for: Data Engineer
  • Spark
  • Scala
  • Hadoop
Experience: 5 to 9 years
Notice period: 15 to 30 days
Location: Bangalore / Noida
Big revolution in the e-gaming industry. (GK1)

Agency job
via Multi Recruit by Ayub Pasha
Bengaluru (Bangalore)
2 - 3 yrs
₹15L - ₹20L / yr
Python
Scala
Hadoop
Spark
Data Engineer
+4 more
  • We are looking for a Data Engineer to build the next-generation mobile applications for our world-class fintech product.
  • The candidate will be responsible for expanding and optimising our data and data pipeline architecture, as well as optimising data flow and collection for cross-functional teams.
  • The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimising data systems and building them from the ground up.
  • Looking for a person with a strong ability to analyse and provide valuable insights to the product and business team to solve daily business problems.
  • You should be able to work in a high-volume environment and have outstanding planning and organisational skills.

 

Qualifications for Data Engineer

 

  • Working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimising ‘big data’ data pipelines, architectures, and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets. Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Looking for a candidate with 2-3 years of experience in a Data Engineer role who is a CS graduate or has equivalent experience.

 

What we're looking for?

 

  • Experience with big data tools: Hadoop, Spark, Kafka and alternatives.
  • Experience with relational SQL and NoSQL databases, including MySQL/Postgres and MongoDB.
  • Experience with data pipeline and workflow management tools: Luigi, Airflow.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
  • Experience with stream-processing systems: Storm, Spark Streaming.
  • Experience with object-oriented/functional scripting languages: Python, Java, Scala.
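The workflow managers named above (Luigi, Airflow) model pipelines as DAGs in which each task runs only after its dependencies. A minimal sketch of that core idea, with hypothetical task names, using Python's stdlib graphlib:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on -- the same
# dependency structure an Airflow DAG expresses with operators.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# A topological sort yields a valid execution order for the pipeline.
order = list(TopologicalSorter(dag).static_order())
print(order)  # → ['extract', 'transform', 'load', 'report']
```

Real schedulers add retries, backfills and parallel execution of independent tasks on top of exactly this ordering.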
upGrad

1 video
19 recruiters
Posted by Priyanka Muralidharan
Bengaluru (Bangalore), Mumbai
2 - 5 yrs
₹14L - ₹20L / yr
Product Analyst
Data Analytics
Python
SQL
Tableau
 

About Us

upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, Entrepreneurship, and more. upGrad is looking for people passionate about management and education to help design learning programs for working professionals, so they can stay sharp and relevant while helping build the careers of tomorrow.

  • upGrad was awarded the Best Tech for Education by IAMAI for 2018-19
  • upGrad was also ranked as one of the LinkedIn Top Startups 2018: the 25 most sought-after startups in India
  • upGrad was earlier selected as one of the top ten most innovative companies in India by Fast Company.
  • We were also covered by the Financial Times along with other disruptors in ed-tech
  • upGrad is the official education partner for the Government of India's Startup India program
  • Our program with IIIT-B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning

Role Summary

We are looking for an analytically inclined, insights-driven Data Analyst to make our organisation more data-driven. In this role you will be responsible for creating dashboards to drive insights for product and business teams. From day-to-day decisions to long-term impact assessment, and from measuring the efficacy of different products to that of certain teams, you'll be empowering each of them. The growing nature of the team will require you to be in touch with all of the teams at upGrad. Are you the go-to person everyone looks to for data? Then this role is for you.

Roles & Responsibilities

  • Lead and own the analysis of highly complex data sources, identifying trends and patterns in data, and provide insights/recommendations based on analysis results

  • Build, maintain, own and communicate detailed reports to assist Marketing, Growth/Learning Experience and other business/executive teams

  • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions

  • Analyze data and generate insights in the form of user analysis, user segmentation, performance reports, etc.

  • Facilitate review sessions with management, business users and other team members

  • Design and create visualizations to present actionable insights related to data sets and business questions at hand

  • Develop intelligent models around channel performance, user profiling, and personalization

Skills Required

  • 3-5 yrs of hands-on experience with product-related analytics and reporting

  • Experience building dashboards in Tableau or other data visualization tools such as D3

  • Strong data, statistics, and analytical skills with a good grasp of SQL

  • Programming experience in Python is a must

  • Comfortable managing large data sets

  • Good Excel/data management skills

Statusneo

6 recruiters
Posted by Yashika Sharma
Hyderabad, Bengaluru (Bangalore)
2 - 4 yrs
₹2L - ₹4L / yr
Data Science
Computer Vision
Natural Language Processing (NLP)
Machine Learning (ML)
Python
+2 more

Responsibilities Description:

Responsible for the development and implementation of machine learning algorithms and techniques to solve business problems and optimize member experiences. Primary duties may include, but are not limited to: design machine learning projects to address specific business problems determined in consultation with business partners; work with data sets of varying size and complexity, including both structured and unstructured data; pipe and process massive data streams in distributed computing environments such as Hadoop to facilitate analysis; implement batch and real-time model scoring to drive actions; develop machine learning algorithms to build customized solutions that go beyond standard industry tools and lead to innovative solutions; develop sophisticated visualizations of analysis output for business users.

 

Experience Requirements:

BS/MA/MS/PhD in Statistics, Computer Science, Mathematics, Machine Learning, Econometrics, Physics, Biostatistics or related Quantitative disciplines. 2-4 years of experience in predictive analytics and advanced expertise with software such as Python, or any combination of education and experience which would provide an equivalent background. Experience in the healthcare sector. Experience in Deep Learning strongly preferred.

 

Required Technical Skill Set:

  • Full cycle of building machine learning solutions:

o   Understanding of a wide range of algorithms and the problems they solve

o   Data preparation and analysis

o   Model training and validation

o   Model application to the problem

  • Experience using the full suite of open-source programming tools and utilities
  • Experience working on end-to-end data science project implementations
  • 2+ years of experience with development and deployment of machine learning applications
  • 2+ years of experience with NLP approaches in a production setting
  • Experience building models using bagging and boosting algorithms
  • Exposure/experience building deep learning models for NLP/computer vision use cases preferred
  • Ability to write efficient code with a good understanding of core data structures/algorithms is critical
  • Strong Python skills following software engineering best practices
  • Experience using code versioning tools like Git, Bitbucket
  • Experience working on Agile projects
  • Comfort & familiarity with SQL and the Hadoop ecosystem of tools, including Spark
  • Experience managing big data with efficient query programs is good to have
  • Good to have experience training ML models in tools like SageMaker, Kubeflow, etc.
  • Good to have experience with model-interpretability frameworks using libraries like LIME, SHAP, etc.
  • Experience in the healthcare sector is preferred
  • MS/M.Tech or PhD is a plus
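As a toy illustration of the bagging technique mentioned above: train simple learners on bootstrap resamples and aggregate them by majority vote. The "learner" below (a majority-class predictor) is deliberately trivial and the data is invented; real work would use decision trees, e.g. scikit-learn's BaggingClassifier:

```python
import random
from collections import Counter

def majority(labels):
    """Most common label in a list."""
    return Counter(labels).most_common(1)[0][0]

def bagged_predict(labels, n_estimators=25, seed=0):
    rng = random.Random(seed)
    votes = []
    for _ in range(n_estimators):
        # Bootstrap resample: draw with replacement, same size as input.
        sample = [rng.choice(labels) for _ in labels]
        # "Fit" a trivial learner on the resample.
        votes.append(majority(sample))
    # Aggregate the individual learners by majority vote.
    return majority(votes)

labels = ["spam"] * 8 + ["ham"] * 2
print(bagged_predict(labels))
```

Boosting differs in that the resamples are not independent: each round reweights the examples the previous learners got wrong.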
Digital Aristotle

2 recruiters
Posted by Digital Aristotle
Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹15L / yr
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Python

JD : ML/NLP Tech Lead

- We are looking to hire an ML/NLP Tech Lead who can own products from a technology perspective and manage a team of up to 10 members. You will play a pivotal role in re-engineering our products and in the transformation and scaling of AssessEd.

WHAT ARE WE BUILDING :

- A revolutionary way of providing continuous assessments of a child's skill and learning, pointing the way to the child's potential in the future. This is as opposed to the traditional one-time, dipstick methodology of a test that hurriedly bundles the child into a slot and in turn "declares" the child to be fit for a career in a specific area or a particular set of courses that would perhaps get him somewhere. At the core of our system is a lot of data - both structured and unstructured.

 

- We have books and questions and web resources and student reports that drive all our machine learning algorithms. Our goal is to not only figure out how a child is coping but to also figure out how to help him by presenting relevant information and questions to him in topics that he is struggling to learn.

Required Skill sets :

- The wisdom to know when to hustle and when to be calm and dig deep. A strong can-do mentality; someone joining us to build on a vision, not to do a job.

- A deep hunger to learn, understand, and apply your knowledge to create technology.

- The ability and experience to tackle hard Natural Language Processing problems: separating the wheat from the chaff, and the knowledge of mathematical tools to succinctly describe the ideas and implement them in code.

- A very good understanding of Natural Language Processing and Machine Learning, with projects to back the same.

- Strong fundamentals in Linear Algebra, Probability and Random Variables, and Algorithms.

- Strong Systems experience in Distributed Systems Pipeline: Hadoop, Spark, etc.

- Good knowledge of at least one prototyping/scripting language: Python, MATLAB/Octave or R.

- Good understanding of Algorithms and Data Structures.

- Strong programming experience in C++/Java/Lisp/Haskell.

- Good written and verbal communication.
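One minimal, dependency-free example of the kind of NLP building block implied by the skills above: bag-of-words vectors and cosine similarity between short texts. The sentences are invented for the example; production systems would use proper tokenization and embeddings:

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words vector: word -> count (naive whitespace tokenizer)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

s1 = bow("the child is learning algebra")
s2 = bow("the child is learning geometry")
s3 = bow("stock prices fell sharply today")
print(cosine(s1, s2) > cosine(s1, s3))  # → True
```

Matching student answers or resources to topics, as described above, often starts with exactly this kind of similarity scoring before moving to learned representations.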

Desired Skill sets :

- A passion for well-engineered products; you are "ticked off" when something engineered is off and you want to get your hands dirty and fix it.

- 3+ yrs of research experience in Machine Learning, Deep Learning and NLP

- Top-tier peer-reviewed research publications in areas like Algorithms, Computer Vision/Image Processing, Machine Learning or Optimization (CVPR, ICCV, ICML, NIPS, EMNLP, ACL, SODA, FOCS, etc.)

- Open Source Contribution (include the link to your projects, GitHub etc.)

- Knowledge of functional programming.

- International-level participation in ACM ICPC, IOI, TopCoder, etc.

- International-level participation in Physics or Math Olympiads

- Intellectual curiosity about advanced math topics like Theoretical Computer Science, Abstract Algebra, Topology, Differential Geometry, Category Theory, etc.

What can you expect :

- The opportunity to work on interesting and hard research problems, and to see state-of-the-art research put into practice.

- Opportunity to work on important problems with big social impact: Massive, and direct impact of the work you do on the lives of students.

- An intellectually invigorating, phenomenal work environment, with massive ownership and growth opportunities.

- Learn effective engineering habits required to build/deploy large production-ready ML applications.

- Ability to do quick iterations and deployments.

- We would be excited to see you publish papers (though certain restrictions do apply).

Website : http://Digitalaristotle.ai


Work Location: Bangalore

Verifone

1 recruiter
Posted by Soumya Khedagi
Bengaluru (Bangalore)
5 - 8 yrs
₹20L - ₹22L / yr
Data architecture
Data storage
Apache Kafka
Apache Spark

Responsibilities for Data Architect

  • Research and properly evaluate sources of information to determine possible limitations in reliability or usability
  • Apply sampling techniques to effectively determine and define ideal categories to be questioned
  • Compare and analyze provided statistical information to identify patterns, relationships and problems
  • Define and utilize statistical methods to solve industry-specific problems in varying fields, such as economics and engineering
  • Prepare detailed reports for management and other departments by analyzing and interpreting data
  • Train assistants and other team members in how to properly organize findings and read collected data
  • Design computer code using various languages to improve and update software and applications
  • Refer to previous instances and findings to determine the ideal method for gathering data
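The sampling duties above can be sketched with the Python stdlib: a simple stratified sample that draws proportionally from each category. The category names and the sampling fraction are hypothetical:

```python
import random
from collections import defaultdict

def stratified_sample(rows, key, fraction, seed=0):
    """Sample `fraction` of rows from each stratum defined by key(row)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for row in rows:
        strata[key(row)].append(row)
    sample = []
    for group in strata.values():
        # At least one row per stratum so small categories stay represented.
        k = max(1, round(len(group) * fraction))
        sample.extend(rng.sample(group, k))
    return sample

rows = ([("economics", i) for i in range(100)]
        + [("engineering", i) for i in range(50)])
picked = stratified_sample(rows, key=lambda r: r[0], fraction=0.1)
print(len(picked))  # → 15
```

Stratifying by category keeps each field's representation proportional, which is what makes the "ideal categories" evaluation above meaningful.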
Spoonshot Inc.

3 recruiters
Posted by Rajesh Bhutada
Bengaluru (Bangalore)
1 - 4 yrs
₹9L - ₹15L / yr
Data Analytics
Data Visualization
Analytics
SQLite
PowerBI
+5 more
- Prior experience in business analytics and knowledge of related analysis or visualization tools
- Expecting a minimum of 2-4 years of relevant experience
- You will be managing a team of 3 currently
- Take ownership of developing and managing one of the largest and richest food (recipe, menu, and CPG) databases
- Interact with cross-functional teams (Business, Food Science, Product, and Tech) on a regular basis to plan the future of client and internal food data management
- Should have a natural flair for playing with numbers and data, and a keen eye for detail and quality
- Will spearhead the Ops team in achieving targets while maintaining staunch attentiveness to the coverage, completeness, and quality of the data
- Shall plan and manage projects while identifying opportunities to optimize costs and processes
- Good business acumen in creating logic & process flows; quick and smart decision-making skills are expected
- Will also be responsible for the recruitment, induction, and training of new members
- Setting competitive team targets; guide and support team members to go the extra mile and achieve set targets


Added Advantages :
- Experience in a Food Sector / Insights company
- Has a passion for exploring different cuisines
- Understands industry-related jargon and has a natural flair for learning more about anything related to food