
24+ Apache HBase Jobs in India

Apply to 24+ Apache HBase Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache HBase Jobs and apply today!

Sigmoid

at Sigmoid

1 video
4 recruiters
Jayakumar AS
Posted by Jayakumar AS
Bengaluru (Bangalore), Hyderabad
2 - 5 yrs
₹12L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more

Sigmoid works with a variety of clients, from start-ups to Fortune 500 companies. We are looking for a detail-oriented self-starter to assist our engineering and analytics teams in various roles as a Software Development Engineer.


This position will be part of a growing team working towards building world-class, large-scale Big Data architectures. This individual should have a sound understanding of programming principles and experience programming in Java, Python or similar languages, and can expect to spend a majority of their time coding.


Location - Bengaluru and Hyderabad


Responsibilities:

● Good development practices

○ Hands-on coder with good experience in programming languages like Java or Python.

○ Hands-on experience with the Big Data stack: PySpark, HBase, Hadoop, MapReduce and Elasticsearch.

○ Good understanding of programming principles and development practices like check-in policy, unit testing and code deployment.

○ Self-starter able to grasp new concepts and technologies and translate them into large-scale engineering developments.

○ Excellent experience in application development and support, integration development and data management.

● Align Sigmoid with key Client initiatives

○ Interface daily with customers across leading Fortune 500 companies to understand strategic requirements


● Stay up-to-date on the latest technology to ensure the greatest ROI for customers and Sigmoid

○ Hands-on coder with a good understanding of enterprise-level code

○ Design and implement APIs, abstractions and integration patterns to solve challenging distributed computing problems

○ Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a parallel processing environment


● Culture

○ Must be a strategic thinker with the ability to think unconventionally and out-of-the-box.

○ Analytical and data-driven orientation.

○ Raw intellect, talent and energy are critical.


○ Entrepreneurial and Agile: understands the demands of a private, high-growth company.

○ Ability to be both a leader and a hands-on "doer".


Qualifications:

- A track record of relevant work experience and a Computer Science or related technical degree is required

- Experience with functional and object-oriented programming; Java is a must.

- Hands-on knowledge of MapReduce, Hadoop, PySpark, HBase and Elasticsearch (see the sketch at the end of this listing).

- Effective communication skills (both written and verbal)

- Ability to collaborate with a diverse set of engineers, data scientists and product managers

- Comfort in a fast-paced start-up environment


Preferred Qualification:

- Technical knowledge of MapReduce, Hadoop & the GCS stack is a plus.

- Experience in agile methodology

- Experience with database modeling and development, data mining and warehousing.

- Experience in architecture and delivery of enterprise-scale applications; capable of developing frameworks, design patterns, etc. Should be able to understand and tackle technical challenges, propose comprehensive solutions and guide junior staff

- Experience working with large, complex data sets from a variety of sources
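
For illustration only (not part of the original listing): a minimal PySpark sketch of the kind of batch transformation this role describes. The paths, column names and filter values are hypothetical.

    # Hypothetical example: paths, columns, and event types are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("event-aggregation-sketch").getOrCreate()

    # Read raw events from a distributed store (HDFS path is a placeholder).
    events = spark.read.parquet("hdfs:///data/raw/events")

    # A typical transformation: filter, aggregate, and write back for downstream use.
    daily_counts = (
        events
        .filter(F.col("event_type") == "purchase")
        .groupBy("user_id", F.to_date("event_ts").alias("event_date"))
        .agg(F.count("*").alias("purchases"), F.sum("amount").alias("revenue"))
    )

    daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
        "hdfs:///data/curated/daily_purchases"
    )
    spark.stop()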

Thoughtworks

at Thoughtworks

1 video
27 recruiters
Sunidhi Thakur
Posted by Sunidhi Thakur
Bengaluru (Bangalore)
10 - 13 yrs
Best in industry
Data modeling
PySpark
Data engineering
Big Data
Hadoop
+10 more

Lead Data Engineer

 

Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.

 

Job responsibilities

 

·      You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems

·      You will partner with teammates to create complex data processing pipelines in order to solve our clients' most ambitious challenges

·      You will collaborate with Data Scientists in order to design scalable implementations of their models

·      You will pair to write clean and iterative code based on TDD

·      Leverage various continuous delivery practices to deploy, support and operate data pipelines

·      Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

·      Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

·      Create data models and speak to the tradeoffs of different modeling approaches

·      On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product

·      Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

·      Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes

 

Job qualifications

Technical skills

·      You are equally happy coding and leading a team to implement a solution

·      You have a track record of innovation and expertise in Data Engineering

·      You're passionate about craftsmanship and have applied your expertise across a range of industries and organizations

·      You have a deep understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop

·      You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting (a pipeline sketch follows this list)

·      Hands-on experience in MapR, Cloudera, Hortonworks and/or cloud-based (AWS EMR, Azure HDInsight, Qubole etc.) Hadoop distributions

·      You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

·      You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments

·      Working with data excites you: you have created Big data architecture, you can build and operate data pipelines, and maintain data storage, all within distributed systems
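
Purely as an illustration of the pipeline work described in the skills above (not taken from the listing): a minimal Airflow DAG sketch with a data-quality step. Airflow 2.x is assumed, and the DAG id, task names and check logic are made up.

    # Hypothetical Airflow 2.x DAG; task logic is illustrative only.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull one day's worth of records from a source system.
        print("extracting partition", context["ds"])


    def check_quality(**context):
        # Placeholder: fail the run if basic row-count or null checks do not pass.
        print("running data-quality checks for", context["ds"])


    with DAG(
        dag_id="daily_events_pipeline",
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        quality_task = PythonOperator(task_id="check_quality", python_callable=check_quality)

        extract_task >> quality_task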

 

Professional skills


·      Advocate your data engineering expertise to the broader tech community outside of Thoughtworks, speaking at conferences and acting as a mentor for more junior-level data engineers

·      You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

·      An interest in coaching others, sharing your experience and knowledge with teammates

·      You enjoy influencing others and always advocate for technical excellence while being open to change when needed

Helps with software development

Agency job
via Qrata by Rayal Rajan
Pune
3 - 6 yrs
₹15L - ₹25L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Design patterns
+8 more

Requirements

• Extensive and expert programming experience in at least one general programming language (e.g. Java, C, C++) & tech stack to write maintainable, scalable, unit-tested code.

• Experience with multi-threading and concurrency programming.

• Extensive experience in object-oriented design skills, knowledge of design patterns, and a huge passion and ability to design intuitive modules and class-level interfaces.

• Excellent coding skills - should be able to convert design into code fluently.

• Knowledge of Test Driven Development.

• Good understanding of databases (e. g. MySQL) and NoSQL (e. g. HBase, Elasticsearch, Aerospike etc).

• Strong desire to solve complex and interesting real world problems.

• Experience with full life cycle development in any programming language on a Linux platform.

• Go-getter attitude that reflects in energy and intent behind assigned tasks.

• Worked in a startup-like environment with high levels of ownership and commitment.

• BTech, MTech or Ph.D. in Computer Science or related technical discipline (or equivalent).

• Experience in building highly scalable business applications, which involve implementing large complex business flows and dealing with huge amounts of data.

• 3+ years of experience in the art of writing code and solving problems on a large scale.

• Open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.

Provides IT Services and Consultancy.

Agency job
via Qrata by Blessy Fernandes
Pune
3 - 5 yrs
₹20L - ₹22L / yr
Java
C
C++
MySQL
Test driven development (TDD)
+5 more

Requirements

  • Extensive and expert programming experience in at least one general programming language (e.g. Java, C, C++) & tech stack to write maintainable, scalable, unit-tested code.
  • Experience with multi-threading and concurrency programming.
  • Extensive experience in object-oriented design skills, knowledge of design patterns, and a huge passion and ability to design intuitive modules and class-level interfaces.
  • Excellent coding skills - should be able to convert the design into code fluently.
  • Knowledge of Test Driven Development. Good understanding of databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike etc).
  • Strong desire to solve complex and interesting real-world problems.
  • Experience with full life cycle development in any programming language on a Linux platform. Go-getter attitude that reflects in energy and intent behind assigned tasks.
  • Worked in a startup-like environment with high levels of ownership and commitment.
  • BTech, MTech or Ph.D. in Computer Science or related technical discipline (or equivalent).
  • Experience in building highly scalable business applications, which involve implementing large complex business flows and dealing with huge amounts of data.
  • 3+ years of experience in the art of writing code and solving problems on a large scale.
  • An open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback


Play Games24x7

at Play Games24x7

2 recruiters
Agency job
via Zyoin Web Private Limited by Vishali Vashnavi
Bengaluru (Bangalore)
8 - 12 yrs
₹40L - ₹50L / yr
Java
J2EE
PostgreSQL
MySQL
MongoDB
+19 more
Requirements:
• B.E./B.Tech. in Computer Science or MCA from a reputed university.
• 3.5+ years of experience in software development, with an emphasis on Java/J2EE server-side programming.
• Hands-on experience in core Java, multithreading, RMI, socket programming, JDBC, NIO, web services and design patterns.
• Knowledge of distributed systems, distributed caching, messaging frameworks, ESB etc.
• Experience in Linux operating system and PostgreSQL/MySQL/MongoDB/Cassandra database.
• Additionally, knowledge of HBase, Hadoop and Hive is desirable.
• Familiarity with message queue systems such as AMQP and Kafka is desirable.
• Experience as a participant in agile methodologies.
• Excellent written and verbal communication skills and presentation skills.
• This is not a full-stack requirement; we are looking for a purely backend expert.
hiring for a leading client

Agency job
via Jobaajcom by Saksham Agarwal
Bengaluru (Bangalore)
1 - 3 yrs
₹12L - ₹15L / yr
Big Data
Apache Hadoop
Apache Impala
Apache Kafka
Apache Spark
+5 more
We are seeking a self-motivated Software Engineer with hands-on experience building sustainable data solutions, identifying and addressing performance bottlenecks, collaborating with other team members, and implementing best practices for data engineering. Our engineering process is fully agile and has a really fast release cycle, which keeps our environment very energetic and fun.

What you'll do:

Design and development of scalable applications.
Collaborate with tech leads to get maximum understanding of the underlying infrastructure.
Contribute to continual improvement by suggesting improvements to the software system.
Ensure high scalability and performance.
Advocate for good, clean, well-documented and performant code; follow standards and best practices.
We'd love for you to have:

Education: Bachelor/Master Degree in Computer Science
Experience: 1-3 years of relevant experience in BI/Big-Data with hands-on coding experience
Mandatory Skills

Strong problem-solving skills
Good exposure to Big Data technologies: Hive, Hadoop, Impala, HBase, Kafka, Spark (see the sketch at the end of this listing)
Strong experience in Data Engineering
Able to comprehend challenges related to database and data warehousing technologies, and the ability to understand complex design and system architecture
Experience with the software development lifecycle: design, develop, review, debug, document, and deliver (especially in a multi-location organization)
Working knowledge of Java and Python
Desired Skills

Experience with reporting tools like Tableau, QlikView
Awareness of CI-CD pipeline
Inclination to work on cloud platforms, e.g. AWS
Crisp communication skills with team members and business owners.
Be able to work in a challenging, dynamic environment and meet tight deadlines
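
As an illustration of the Kafka exposure asked for above (not part of the listing): a minimal consumer sketch using the kafka-python client. The topic, broker list and message format are hypothetical.

    # Hypothetical kafka-python consumer; topic, brokers, and payload are placeholders.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "clickstream-events",                   # placeholder topic
        bootstrap_servers=["localhost:9092"],   # placeholder broker list
        group_id="analytics-consumers",
        auto_offset_reset="earliest",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Downstream processing (e.g. writing to Hive or HBase) would go here.
        print(message.topic, message.partition, message.offset, event.get("event_type"))
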
Persistent System Ltd

Agency job
via Milestone Hr Consultancy by Haina khan
Bengaluru (Bangalore), Pune, Hyderabad
4 - 6 yrs
₹6L - ₹22L / yr
Apache HBase
Apache Hive
Apache Spark
Go Programming (Golang)
Ruby on Rails (ROR)
+5 more
We urgently require a Hadoop Developer for a reputed MNC company.

Location: Bangalore/Pune/Hyderabad/Nagpur

4-5 years of overall experience in software development.
- Experience with Hadoop (Apache/Cloudera/Hortonworks) and/or other MapReduce platforms
- Experience with Hive, Pig, Sqoop, Flume and/or Mahout
- Experience with NoSQL stores – HBase, Cassandra, MongoDB
- Hands-on experience with Spark development; knowledge of Storm, Kafka, Scala
- Good knowledge of Java
- Good background in Configuration Management/ticketing systems like Maven/Ant/JIRA etc.
- Knowledge of any Data Integration and/or EDW tools is a plus
- Good to have knowledge of Python/Perl/Shell

 

Please note - HBase, Hive and Spark are a must.

Meesho
Remote only
3 - 12 yrs
₹25L - ₹70L / yr
Java
Spring Boot
Apache Kafka
MySQL
Apache HBase
+2 more
  • 3+ years of SDE work experience at product-based companies
  • Experience in Java, Spring Boot, MySQL, Kafka, HBase, AWS
  • Experience in multithreading, distributed systems, coding best practices, and scaling
Acceldata

at Acceldata

5 recruiters
Richa  Kukar
Posted by Richa Kukar
Bengaluru (Bangalore)
6 - 10 yrs
Best in industry
SRE
Reliability engineering
Site reliability
Hadoop
HDFS
+1 more

Senior SRE - Acceldata (IC3 Level)


About the Job


You will join a team of highly skilled engineers who are responsible for delivering Acceldata's support services. Our Site Reliability Engineers are trained to be active listeners and demonstrate empathy when customers encounter product issues. In our fun and collaborative environment, Site Reliability Engineers develop strong business, interpersonal and technical skills to deliver high-quality service to our valued customers.


When you arrive for your first day, we’ll want you to have:

  • Solid skills in troubleshooting to repair failed products or processes on a machine or a system using a logical, systematic search for the source of a problem in order to solve it, and make the product or process operational again
  • A strong ability to understand the feelings of our customers as we empathize with them on the issue at hand
  • A strong desire to increase your product and technology skillset and your confidence supporting our products so you can help our customers succeed

In this position you will…

  • Provide Support Services to our Gold & Enterprise customers using our flagship Acceldata Pulse, Flow & Torch product suites. This may include assistance provided during the engineering and operations of distributed systems as well as responses for mission-critical systems and production customers.
  • Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products
  • Participate in the queue management and coordination process by owning customer escalations and managing the unassigned queue.
  • Be involved with and work on other support-related activities - performing POCs & assisting with onboarding deployments of Acceldata & Hadoop distribution products.
  • Triage, diagnose and escalate customer inquiries when applicable during their engineering and operations efforts.
  • Collaborate and share solutions with both customers and the Internal team.
  • Investigate product related issues both for particular customers and for common trends that may arise
  • Study and understand critical system components and large cluster operations
  • Differentiate between issues that arise in operations, user code, or product
  • Coordinate enhancement and feature requests with product management and Acceldata engineering team.
  • Flexibility to work in shifts.
  • Participate in a rotational weekend on-call roster for critical support needs.
  • Participate as a designated or dedicated engineer for specific customers. Aspects of this engagement translate to building long-term successful relationships with customers, leading weekly status calls, and occasional visits to customer sites

In this position, you should have…

  • A strong desire and aptitude to become a well-rounded support professional. Acceldata Support considers the service we deliver as our core product.
  • A positive attitude towards feedback and continual improvement
  • A willingness to give direct feedback to and partner with management to improve team operations
  • A tenacity to bring calm and order to the often stressful situations of customer cases
  • A mental capability to multi-task across many customer situations simultaneously
  • Bachelor's degree in Computer Science or Engineering or equivalent experience. A Master's degree is a plus
  • At least 2+ years of experience with at least one of the following cloud platforms: Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP), with experience managing and supporting a cloud infrastructure on any of the 3 platforms. Knowledge of Kubernetes and Docker is also a must.
  • Strong troubleshooting skills (for example, TCP/IP, DNS, file systems, load balancing, databases, Java)
  • Excellent communication skills in English (written and verbal)
  • Prior enterprise support experience in a technical environment strongly preferred

Strong Hands-on Experience Working With Or Supporting The Following

  • 8-12 years of experience with a highly scalable, distributed, multi-node environment (50+ nodes)
  • Hadoop operation including Zookeeper, HDFS, YARN, Hive, and related components like the Hive metastore, Cloudera Manager/Ambari, etc
  • Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
  • Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps

You might also have…

  • Linux, NFS, Windows, including application installation, scripting, basic command line
  • Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
  • Experience working with scripting languages (Bash, PowerShell, Python)
  • Working knowledge of application, server, and network security management concepts
  • Familiarity with virtual machine technologies
  • Knowledge of databases like MySQL and PostgreSQL
  • Certification on any of the leading Cloud providers (AWS, Azure, GCP ) and/or Kubernetes is a big plus

The right person in this role has an opportunity to make a huge impact at Acceldata and add value to our future decisions. If this position has piqued your interest and you have what we described - we invite you to apply! An adventure in data awaits.

Learn more at https://www.acceldata.io/about-us



Hiring for one of the MNC for India location

Agency job
via Natalie Consultants by Rahul Kumar
Gurugram, Pune, Bengaluru (Bangalore), Delhi, Noida, Ghaziabad, Faridabad
2 - 9 yrs
₹8L - ₹20L / yr
Python
Hadoop
Big Data
Spark
Data engineering
+3 more

Key Responsibilities (Data Developer - Python, Spark):

Exp: 2 to 9 yrs

Development of data platforms, integration frameworks, processes, and code.

Develop and deliver APIs in Python or Scala for Business Intelligence applications built using a range of web languages

Develop comprehensive automated tests for features via end-to-end integration tests, performance tests, acceptance tests and unit tests (see the sketch at the end of this listing).

Elaborate stories in a collaborative agile environment (SCRUM or Kanban)

Familiarity with cloud platforms like GCP, AWS or Azure.

Experience with large data volumes.

Familiarity with writing REST-based services.

Experience with distributed processing and systems

Experience with Hadoop / Spark toolsets

Experience with relational database management systems (RDBMS)

Experience with Data Flow development

Knowledge of Agile and associated development techniques.
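
To illustrate the automated-testing expectation above (not part of the listing): a small, testable transformation function with pytest unit tests. The function and field names are made up.

    # Hypothetical sketch: a pure transformation function plus pytest tests for it.
    import pytest


    def enrich_event(event: dict) -> dict:
        """Return a copy of the event with a derived 'revenue' field."""
        enriched = dict(event)
        enriched["revenue"] = round(event["quantity"] * event["unit_price"], 2)
        return enriched


    def test_enrich_event_computes_revenue():
        event = {"quantity": 3, "unit_price": 19.99}
        assert enrich_event(event)["revenue"] == 59.97


    def test_enrich_event_missing_field_raises():
        with pytest.raises(KeyError):
            enrich_event({"quantity": 3})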

Banyan Data Services

at Banyan Data Services

1 recruiter
Sathish Kumar
Posted by Sathish Kumar
Bengaluru (Bangalore)
3 - 15 yrs
₹6L - ₹20L / yr
Data Science
Data Scientist
MongoDB
Java
Big Data
+14 more

Senior Big Data Engineer 

Note:   Notice Period : 45 days 

Banyan Data Services (BDS) is a US-based data-focused Company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA. 

 

We are looking for a Senior Hadoop Big Data Engineer who has expertise in solving complex data problems across a big data platform. You will be a part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure.

 

It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates that aspire to be a part of the cutting-edge solutions and services we offer that address next-gen data evolution challenges. 

 

 

Key Qualifications

 

·   5+ years of experience working with Java and Spring technologies

· At least 3 years of programming experience working with Spark on big data; including experience with data profiling and building transformations

· Knowledge of microservices architecture is a plus

· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra

· Experience with Kafka or any streaming tools

· Knowledge of Scala would be preferable

· Experience with agile application development 

· Exposure to any cloud technologies, including containers and Kubernetes

· Demonstrated experience performing DevOps for platforms

· Strong skills in data structures & algorithms and in writing efficient, low-complexity code

· Exposure to Graph databases

· Passion for learning new technologies and the ability to do so quickly 

· A Bachelor's degree in a computer-related field or equivalent professional experience is required

 

Key Responsibilities

 

· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture

· Design and develop big data-focused microservices

· Be involved in big data infrastructure, distributed systems, data modeling, and query processing

· Build software with cutting-edge technologies on the cloud

· Willingness to learn new technologies and work on research-oriented projects

· Proven interpersonal skills while contributing to team effort by accomplishing related results as needed 

Volks Consulting

at Volks Consulting

18 recruiters
Akash Thakur
Posted by Akash Thakur
Remote, Bengaluru (Bangalore)
10 - 15 yrs
₹50L - ₹55L / yr
Engineering Management
Technical Lead
Engineering Manager
Tech Lead
Lead
+10 more

2018 Forbes Indonesia Choice Award winner and Galen Growth's 2018 Most Innovative HealthTech Startup in Asia. Ours is a secure health-tech platform with a mission to simplify access to healthcare by connecting millions of patients with licensed doctors, insurance, labs, and pharmacies in one mobile application.

 

Key Job Responsibilities:

  • He/she is a responsive team player who can proactively contribute to building technical strategies for applications and systems by promoting an understanding of the technology and business roadmap.
  • He/she is someone who thrives in a fun, fast-paced, dynamic, startup-like environment.
  • Work very closely with various business stakeholders to drive the execution of multiple business plans and technologies.
  • Work closely with Product, Design, and Marketing to conceive features, plan projects, and build roadmaps
  • Prior experience with scalable architecture, managing a team of minimum 5 engineers, and coaching and mentoring while maintaining a hands-on role with code development.
  • Proven history of contributing to product strategy and shipping products with multi-functional teams.
  • Highly involved in recruitment while building the team, also leading app development for both platforms
  • Promote and support company policies, procedures, mission, values, and standards of ethics and integrity.

 

Minimum Qualification:

  • Total of 10+ years of experience
  • Hands-on working on Java ({language understanding - Java 8, Lambdas, Collections, popular frameworks & libraries}, JVM, GC tuning, performance tuning)
  • Worked on REST frameworks/libraries like Spring MVC, Spring Boot, Dropwizard, RestExpress etc
  • Worked on Relational data stores viz. MySQL, Oracle, or Postgres
  • Worked on Non-relational data stores viz. Cassandra, HBase, Couchbase, MongoDB, etc
  • Worked on caching infra viz. Redis, Memcached, Aerospike, Riak, etc
  • Worked on Queueing infra viz. Kafka, RabbitMQ, ActiveMQ etc

 

Regards,
Volks consulting

India's best Short Video App

Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
4 - 12 yrs
₹25L - ₹50L / yr
Data engineering
Big Data
Spark
Apache Kafka
Apache Hive
+26 more
What Makes You a Great Fit for The Role?

You’re awesome at and will be responsible for
 
Extensive programming experience with cross-platform development in one of the following: Java/Spring Boot, JavaScript/Node.js, Express.js or Python
3-4 years of experience in big data analytics technologies like Storm, Spark/Spark Streaming, Flink, AWS Kinesis, Kafka streaming, Hive, Druid, Presto, Elasticsearch, Airflow, etc.
3-4 years of experience in building high-performance RPC services using different high-performance paradigms: multi-threading, multi-processing, asynchronous programming (non-blocking IO), reactive programming
3-4 years of experience working with high-throughput, low-latency databases and cache layers like MongoDB, HBase, Cassandra, DynamoDB, Elasticache (Redis + Memcache)
Experience with designing and building high-scale app backends and microservices leveraging cloud-native services on AWS like proxies, caches, CDNs, messaging systems, serverless compute (e.g. Lambda), monitoring and telemetry.
Strong understanding of distributed systems fundamentals around scalability, elasticity, availability, fault-tolerance.
Experience in analysing and improving the efficiency, scalability, and stability of distributed systems and backend microservices.
5-7 years of strong design/development experience in building massively large-scale, high-throughput, low-latency distributed internet systems and products.
Good experience in working with Hadoop and Big Data technologies like HDFS, Pig, Hive, Storm, HBase, Scribe, Zookeeper and NoSQL systems.
Agile methodologies, Sprint management, Roadmap, Mentoring, Documenting, Software architecture.
Liaison with Product Management, DevOps, QA, Client and other teams
 
Your Experience Across The Years in the Roles You’ve Played
 
Have a total of 5-7 or more years of experience, with 2-3 years in a startup.
Have a B.Tech or M.Tech or equivalent academic qualification from a premier institute.
Experience in product companies working on internet-scale applications is preferred
Thoroughly aware of cloud computing infrastructure on AWS, leveraging cloud-native services and infrastructure services to design solutions.
Follow the Cloud Native Computing Foundation, leveraging mature open-source projects, including an understanding of containerisation/Kubernetes.
 
You are passionate about learning or growing your expertise in some or all of the following
Data Pipelines
Data Warehousing
Statistics
Metrics Development
 
We Value Engineers Who Are
 
Customer-focused: We believe that doing what’s right for the creator is ultimately what will drive our business forward.
Obsessed with Quality: Your Production code just works & scales linearly
Team players. You believe that more can be achieved together. You listen to feedback and also provide supportive feedback to help others grow/improve.
Pragmatic: We do things quickly to learn what our creators desire. You know when it’s appropriate to take shortcuts that don’t sacrifice quality or maintainability.
Owners: Engineers at Chingari know how to positively impact the business.
Axtechnosoft Private Limited

Agency job
via InvokHR by Aanchal Tyagi
Remote only
8 - 14 yrs
₹15L - ₹25L / yr
Architect
Technical Architecture
Architecture
Information architecture
Software architecture
+13 more

Principal Software Engineer /Architect

Axtechnosoft Private Limited

 

Job Description

Responsibilities: -

  • You would take ownership of the existing system and scale it more than 10X over the next 2 years.
  • Apply best coding standards.
  • You would create the infrastructure that can serve hundreds of customers and millions of data requests per hour.
  • Over the next year or so, you would be able to guide a team of 5 to 15 people to accomplish your goals. Mentoring this team into a world-class engineering team would be a key part of your role.
  • You would draw on your earlier experience in successfully building, deploying and running complex, large-scale web or data products.
  • You would work hand-in-hand with the Product Management team to build engineering capabilities that align with the evolution of the product.
  • Eventually, work with the Data Science team to ensure that the algorithmic intelligence that we build is plugged into the product in an expected manner.
  • Overall, you would be responsible for end-to-end architecting from an engineering standpoint.


Must have: -

  • Total experience of 8+ years, with relevant experience of at least 2 years.
  • Have built a platform that handles at least 500k to 1 million data requests an hour.
  • Worked on building an infrastructure that serves 200k+ customers.
  • Hands-on coder.
  • Expert-level knowledge in at least one technology stack - Python or, ideally, Java. Also Angular, React, Node.js
  • Expert-level knowledge of Elastic Search or NoSQL technologies like MongoDB/HBase/Cassandra/Redis/Neo4j
  • Experience developing web applications.
  • Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
  • Working knowledge of databases (e.g. MySQL, MongoDB), web servers (e.g. Apache) and UI/UX design.
  • DevOps experience working with AWS / other cloud platforms.
  • Strong knowledge of APIs.
  • Excellent communication and teamwork skills
  • Implementing Software Engineering best practices.
  • Previously worked on user facing products with scale.
  • Agile methodology.
  • Great attention to detail.
  • Organizational skills
  • An analytical mind



Good to have: -

  • Working knowledge of React Red.
  • Open-source technology.
  • Working knowledge of AI/ ML.
  • Degree in Computer Science, Statistics or relevant field.
  • Experience working in a start-up environment.

 

Key Skills

Python

Angular Javascript

Reactor & Solids Processing

Node.js

Elastic Search

NoSQL

Web Applications

Database

Web Servers

UX/UI Design

AWS Cloud

Agile Methodology

Surplus Hand

Agency job
via SurplusHand by Anju John
Remote, Hyderabad
3 - 5 yrs
₹10L - ₹14L / yr
Apache Hadoop
Apache Hive
PySpark
Big Data
Java
+3 more
Tech Skills:
• Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc.)
• Should have good hands-on experience with Spark (Spark with Java/PySpark); see the sketch at the end of this listing
• Hive
• Must be good with SQL (Spark SQL/HiveQL)
• Application design, software development and automated testing
Environment Experience:
• Experience with implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, automated testing, and JUnit.
• Demonstrated experience with Agile or other rapid application development methods
• Cloud development (AWS/Azure/GCP)
• Unix / Shell scripting
• Web services, open API development, and REST concepts
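
For illustration of the Spark SQL / HiveQL skills above (not part of the listing): a minimal PySpark sketch that registers a DataFrame as a view and queries it with SQL. The table layout, path and query are hypothetical, and a Hive metastore is assumed for enableHiveSupport().

    # Hypothetical Spark SQL sketch; paths and columns are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("spark-sql-sketch")
        .enableHiveSupport()   # assumes a Hive metastore is available
        .getOrCreate()
    )

    # Register a DataFrame as a temporary view and query it with HiveQL-style SQL.
    orders = spark.read.parquet("hdfs:///data/raw/orders")   # placeholder path
    orders.createOrReplaceTempView("orders")

    top_customers = spark.sql("""
        SELECT customer_id, SUM(amount) AS total_spend
        FROM orders
        GROUP BY customer_id
        ORDER BY total_spend DESC
        LIMIT 10
    """)
    top_customers.show()
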
IT Services Firm

Agency job
Mumbai
10 - 18 yrs
₹18L - ₹22L / yr
Java
Amazon Web Services (AWS)
Windows Azure
GitHub
Jenkins
+5 more

Hi,

We are looking for cloud solution professionals with the following skill sets:

Experience: 10+ years in cloud architecting

Location: Mumbai

 

Job Responsibilities:

  • Analyze and understand customer business processes and workflows, define requirements and design appropriate solutions.
  • Provide end-to-end cloud solutioning along with secured infra
  • Collaborate with vendors for the execution
  • Good understanding of open-source stack frameworks and AWS & Azure cloud services
  • Solutioning extending from greenfield to an enterprise view
  • Presentation skills with a high degree of comfort with both large and small audiences.
  • High level of comfort communicating effectively across internal and external organizations
  • Intermediate/advanced knowledge of the cloud services, market segments, customer base and industry verticals.
  • Demonstrated experience leading or developing high quality, enterprise scale software products using a structured system development life cycle.
  • Demonstrated ability to adapt to new technologies and learn quickly.
  • Certified Solutions Architect( AWS / Azure)
  • Recommendations on security, cost, performance, reliability and operational efficiency to accelerate challenging, mission-critical projects
  • Experience migrating or transforming customer solutions to the cloud

Primary Skills :

JAVA / J2EE; Spring, Spring Boot, Microservices, AngularJS, in-stream data handling, Elasticsearch, MongoDB; DevOps tools - Jenkins, GitHub, Maven build; hands-on AWS & Azure cloud services; Mobile: native and hybrid app hands-on; Docker containers, AKS, Big Data and HBase, Data Lake, Service Bus, AD

Secondary Skills :

  • Extensive experience in Microservices, Rest Services, JPA, Automated unit testing through tools.
  • Proven design skills and expertise is required.
  • Good knowledge of current / emerging technologies and trends.
  • Good analytical, grasping and problem solving skills. Excellent written and verbal communication skills. High levels of initiative and creativity.
  • Good communication skills with all stakeholders; a good team player with the ability to mentor juniors
Thirdpresence

at Thirdpresence

5 recruiters
RahulM
Posted by RahulM
Hyderabad
2 - 5 yrs
₹8L - ₹12L / yr
Python
Hadoop
Elastic Search
Solr
Amazon Web Services (AWS)
+4 more

We are looking for a Senior Python Developer to produce large-scale distributed software solutions. You'll be part of a cross-functional team that's responsible for the complete software development life cycle, from conception to deployment.

If you’re also familiar with Agile methodologies, we’d like to meet you.


Responsibilities:

Work with development teams and product managers to ideate software solutions
Design client-side and server-side architecture
Build the front-end of applications through appealing visual design
Develop and manage well-functioning databases and applications
Write effective APIs
Test software to ensure responsiveness and efficiency
Troubleshoot, debug and upgrade software
Create security and data protection settings
Write technical documentation


Requirements
Proven experience as a Python Developer or similar role
Knowledge of Python, Django, MongoDB, Elasticsearch, AWS
Excellent communication and teamwork skills
Great attention to detail
Organizational skills
An analytical mind
Experience with Apache Kafka, HBase and graph databases is an added bonus
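
To illustrate the Elasticsearch requirement above (not part of the listing): a minimal sketch using the official elasticsearch Python client, assuming the 8.x API. The endpoint, index name, fields and query are hypothetical.

    # Hypothetical elasticsearch-py (8.x) sketch; index and fields are placeholders.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")   # placeholder endpoint

    # Index a document.
    es.index(
        index="ad-impressions",
        id="imp-0001",
        document={"campaign": "spring-sale", "device": "mobile", "clicks": 3},
    )

    # Search for documents matching a term.
    response = es.search(
        index="ad-impressions",
        query={"term": {"device": "mobile"}},
    )
    for hit in response["hits"]["hits"]:
        print(hit["_id"], hit["_source"])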

MNC

Agency job
via Fragma Data Systems by Harpreet kour
Bengaluru (Bangalore)
3 - 6 yrs
₹6L - ₹15L / yr
Apache Hadoop
Hadoop
HDFS
Apache Sqoop
Apache Flume
+5 more
1. Design and development of data ingestion pipelines (a sketch follows this list).
2. Perform data migration and conversion activities.
3. Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns and taking into account critical performance characteristics and security measures.
4. Collaborate with Business Analysts, Architects and Senior Developers to establish the physical application framework (e.g. libraries, modules, execution environments).
5. Perform end-to-end automation of the ETL process for various datasets that are being ingested into the big data platform.
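
As a sketch of item 1 above (not part of the listing): a minimal PySpark ingestion job that reads a table over JDBC and lands it as Parquet. The JDBC URL, credentials, table and output path are placeholders.

    # Hypothetical ingestion sketch: JDBC source -> Parquet landing zone.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("jdbc-ingestion-sketch").getOrCreate()

    source = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mysql://db-host:3306/sales")   # placeholder
        .option("dbtable", "orders")                        # placeholder
        .option("user", "ingest_user")                      # placeholder
        .option("password", "********")
        .option("driver", "com.mysql.cj.jdbc.Driver")
        .load()
    )

    # Land the raw extract in the data lake, partitioned by ingestion date.
    (
        source.withColumn("ingest_date", F.current_date())
        .write.mode("append")
        .partitionBy("ingest_date")
        .parquet("hdfs:///data/raw/orders")                 # placeholder
    )
    spark.stop()
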
Clairvoyant India Private Limited
Taruna Roy
Posted by Taruna Roy
Remote, Pune
3 - 8 yrs
₹4L - ₹15L / yr
Big Data
Hadoop
skill iconJava
Spark
Hibernate (Java)
+5 more
Job Title/Designation:
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of Big Data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keeping the total cost of ownership to a minimum.
Must Have:
  • 4-10 years of experience in software development.
  • At least 2 years of relevant work experience on large scale Data applications.
  • Strong coding experience in Java is mandatory
  • Good aptitude, strong problem-solving abilities and analytical skills, and the ability to take ownership as appropriate
  • Should be able to do coding, debugging, performance tuning and deploying the apps to production.
  • Should have good working experience with:
    o Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
    o Kafka
    o J2EE frameworks (Spring/Hibernate/REST)
    o Spark Streaming or any other streaming technology (see the sketch at the end of this listing)
  • Ability to work on the sprint stories to completion along with Unit test case coverage.
  • Experience working in Agile Methodology
  • Excellent communication and coordination skills
  • Knowledgeable in (and preferably hands-on with) UNIX environments and different continuous integration tools.
  • Must be able to integrate quickly into the team and work independently towards team goals
Role & Responsibilities:
  • Take the complete responsibility of the sprint stories' execution
  • Be accountable for the delivery of the tasks in the defined timelines with good quality.
  • Follow the processes for project execution and delivery.
  • Follow agile methodology
  • Work with the team lead closely and contribute to the smooth delivery of the project.
  • Understand/define the architecture and discuss its pros and cons with the team
  • Be involved in the brainstorming sessions and suggest improvements in the architecture/design.
  • Work with other team leads to get the architecture/design reviewed.
  • Work with the clients and counterparts (in the US) on the project.
  • Keep all the stakeholders updated about the project/task status/risks/issues if there are any.
Education: BE/B.Tech from reputed institute.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune
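
To illustrate the Spark Streaming item in the Must Have list (not part of the listing): a minimal Structured Streaming sketch that reads from Kafka. Broker addresses, topic and checkpoint path are placeholders, and the spark-sql-kafka connector package is assumed to be on the classpath.

    # Hypothetical Structured Streaming sketch (requires the spark-sql-kafka package).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")  # placeholder
        .option("subscribe", "events")                                   # placeholder topic
        .option("startingOffsets", "latest")
        .load()
    )

    # Kafka keys/values arrive as bytes; decode them and count messages per key.
    counts = (
        events.select(F.col("key").cast("string"), F.col("value").cast("string"))
        .groupBy("key")
        .count()
    )

    query = (
        counts.writeStream.outputMode("complete")
        .format("console")
        .option("checkpointLocation", "hdfs:///tmp/checkpoints/kafka-sketch")  # placeholder
        .start()
    )
    query.awaitTermination()
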
GeakMinds Technologies Pvt Ltd
John Richardson
Posted by John Richardson
Chennai
1 - 5 yrs
₹1L - ₹6L / yr
Hadoop
Big Data
HDFS
Apache Sqoop
Apache Flume
+2 more
• Looking for a Big Data Engineer with 3+ years of experience.
• Hands-on experience with MapReduce-based platforms, like Pig, Spark, Shark.
• Hands-on experience with data pipeline tools like Kafka, Storm, Spark Streaming.
• Store and query data with Sqoop, Hive, MySQL, HBase, Cassandra, MongoDB, Drill, Phoenix, and Presto (see the sketch below).
• Hands-on experience in managing Big Data on a cluster with HDFS and MapReduce.
• Handle streaming data in real time with Kafka, Flume, Spark Streaming, Flink, and Storm.
• Experience with Azure cloud, Cognitive Services, Databricks is preferred.
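
For illustration of the HBase storage item above (not part of the listing): a minimal sketch using the happybase Python client, which assumes an HBase Thrift server is running. The host, table, column family and row keys are placeholders.

    # Hypothetical happybase sketch; table and row keys are placeholders.
    import happybase

    connection = happybase.Connection("hbase-thrift-host")   # placeholder host
    table = connection.table("user_profiles")                # placeholder table

    # Write a row: column names are 'family:qualifier' byte strings.
    table.put(b"user#1001", {b"info:name": b"Asha", b"info:city": b"Pune"})

    # Point read of a single row.
    row = table.row(b"user#1001")
    print(row[b"info:name"])

    # Prefix scan over a range of row keys.
    for key, data in table.scan(row_prefix=b"user#"):
        print(key, data.get(b"info:city"))

    connection.close()
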
Pion Global Solutions LTD
Sheela P
Posted by Sheela P
Mumbai
3 - 100 yrs
₹4L - ₹15L / yr
Spark
Big Data
Hadoop
HDFS
Apache Sqoop
+2 more
Looking for Big Data developers in Mumbai.
Securonix

at Securonix

1 recruiter
Ramakrishna Murthy
Posted by Ramakrishna Murthy
Pune
3 - 7 yrs
₹10L - ₹15L / yr
HDFS
Apache Flume
Apache HBase
Hadoop
Impala
+3 more
Securonix is a Big Data Security Analytics product company, and the only product which delivers real-time behavior analytics (UEBA) on Big Data.
Foster Entrepreneurship Ventures
Debdas Sinha
Posted by Debdas Sinha
Bengaluru (Bangalore)
1 - 3 yrs
₹6L - ₹20L / yr
Apache HBase
Hadoop
MapReduce
www.aaknet.co.in/careers/careers-at-aaknet.html
You are extraordinary, a rock star, who hardly found a place to leverage or challenge your potential, and did not spot a sky-rocketing opportunity yet? Come play with us – face the challenges we can throw at you; chances are you might be humiliated (positively); do not take it that seriously though! Please be informed, we rate CHARACTER and attitude as high as, if not higher than, your great skills, experience and sharpness. :)
Best wishes & regards,
Team Aak!
Foster Entrepreneurship Ventures
Bengaluru (Bangalore)
1 - 2 yrs
₹6L - ₹18L / yr
Elastic Search
Apache Kafka
Solr
Apache HBase
www.aaknet.co.in/careers/careers-at-aaknet.html
You are extraordinary, a rock star, who hardly found a place to leverage or challenge your potential, and did not spot a sky-rocketing opportunity yet? Come play with us – face the challenges we can throw at you; chances are you might be humiliated (positively); do not take it that seriously though! Please be informed, we rate CHARACTER and attitude as high as, if not higher than, your great skills, experience and sharpness. :)
Best wishes & regards,
Team Aak!