Data processing Jobs in Delhi, NCR and Gurgaon


Apply to 11+ Data processing Jobs in Delhi, NCR and Gurgaon on CutShort.io. Explore the latest Data processing Job opportunities across top companies like Google, Amazon & Adobe.

Quess Corp Limited


6 recruiters
Posted by Anjali Singh
Noida, Delhi, Gurugram, Ghaziabad, Faridabad, Bengaluru (Bangalore), Chennai
5 - 8 yrs
₹1L - ₹15L / yr
Google Cloud Platform (GCP)
Python
Big Data
Data processing
Data Visualization

A GCP Data Analyst profile must have the below skill sets:

 

B2B - Factory app for retailers & buyers (well funded)


Agency job
via Qrata by Blessy Fernandes
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
3 - 7 yrs
₹8L - ₹13L / yr
Tableau
SQL
Python
Microsoft Excel
Data Analytics
+1 more

Job Title

Data Analyst

 

Job Brief

The successful candidate will turn data into information, information into insight and insight into business decisions.

 

Data Analyst Job Duties

Data analyst responsibilities include conducting full-lifecycle analysis covering requirements, activities and design. Data analysts will develop analysis and reporting capabilities. They will also monitor performance and quality control plans to identify improvements.

 

Responsibilities

● Interpret data, analyze results using statistical techniques and provide ongoing reports.

● Develop and implement databases, data collection systems, data analytics and other strategies that optimize statistical efficiency and quality.

● Acquire data from primary or secondary data sources and maintain databases/data systems.

● Identify, analyze, and interpret trends or patterns in complex data sets.

● Filter and “clean” data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems (see the sketch after this list).

● Work with management to prioritize business and information needs.

● Locate and define new process improvement opportunities.
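For illustration only, here is a minimal pandas sketch of the kind of filtering and cleaning described above; the file name, column names and threshold are hypothetical and not part of this role's stated stack.

```python
import pandas as pd

# Hypothetical raw export of performance indicators.
df = pd.read_csv("performance_indicators.csv")

# Typical cleaning steps: drop exact duplicates, normalize a code column,
# coerce the measure to numeric, and separate out suspicious rows.
df = df.drop_duplicates()
df["code"] = df["code"].str.strip().str.upper()
df["value"] = pd.to_numeric(df["value"], errors="coerce")

suspect = df[df["value"].isna() | (df["value"] < 0)]
clean = df.drop(suspect.index)

print(f"Kept {len(clean)} rows; flagged {len(suspect)} rows for review")
```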

 

Requirements

● Proven working experience as a Data Analyst or Business Data Analyst.

● Technical expertise regarding data models, database design and development, data mining and segmentation techniques.

● Strong knowledge of and experience with reporting packages (Business Objects etc.), databases (SQL etc.), and programming (XML, JavaScript, or ETL frameworks).

● Knowledge of statistics and experience using statistical packages for analyzing datasets (Excel, SPSS, SAS, etc.).

● Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.

● Adept at queries, report writing and presenting findings.

Job Location: South Delhi, New Delhi

Fintech lead,


Agency job
via The Hub by Sridevi Viswanathan
Gurugram, Noida
3 - 8 yrs
₹5L - ₹15L / yr
Natural Language Processing (NLP)
BERT
Machine Learning (ML)
Data Science
Python
+1 more

Who we are looking for

· A Natural Language Processing (NLP) expert with strong computer science fundamentals and experience in working with deep learning frameworks. You will be working at the cutting edge of NLP and Machine Learning.

Roles and Responsibilities

· Work as part of a distributed team to research, build and deploy Machine Learning models for NLP.

· Mentor and coach other team members

· Evaluate the performance of NLP models and ideate on how they can be improved

· Support internal and external NLP-facing APIs

· Keep up to date on current research around NLP, Machine Learning and Deep Learning

Mandatory Requirements

· A graduate in any discipline with at least 2 years of demonstrated experience as a Data Scientist.

Behavioural Skills

· Strong analytical and problem-solving capabilities.

· Proven ability to multi-task and deliver results within tight time frames

· Must have strong verbal and written communication skills

· Strong listening skills and eagerness to learn

· Strong attention to detail and the ability to work efficiently in a team as well as individually

Technical Skills

Hands-on experience with

· NLP

· Deep Learning

· Machine Learning

· Python

· BERT
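As a small illustration of the BERT/NLP stack listed above, the sketch below simply runs inference with a pretrained checkpoint through the Hugging Face transformers library; the library and model name are assumptions used for illustration, not requirements stated in the posting.

```python
from transformers import pipeline

# Load a pretrained BERT-family sentiment classifier.
# The checkpoint name is illustrative; any suitable model would do.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

texts = [
    "The loan approval process was quick and transparent.",
    "I have been waiting two weeks for a response.",
]

for text, result in zip(texts, classifier(texts)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {text}")
```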

Preferred Requirements

· Experience in Computer Vision is preferred

Role: Data Scientist

Industry Type: Banking

Department: Data Science & Analytics

Employment Type: Full Time, Permanent

Role Category: Data Science & Machine Learning

codersbrain

at codersbrain

1 recruiter
Posted by Tanuj Uppal
Delhi
4 - 8 yrs
₹2L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+5 more
  • Mandatory: hands-on experience in Python and PySpark.
  • Build PySpark applications using Spark DataFrames in Python, using Jupyter Notebook and PyCharm (IDE); see the sketch after this list.
  • Experience optimizing Spark jobs that process huge volumes of data.
  • Hands-on experience with version control tools like Git.
  • Experience with Amazon’s analytics services like Amazon EMR, Lambda functions, etc.
  • Experience with Amazon’s compute services like AWS Lambda and Amazon EC2, storage services like S3, and a few other services like SNS.
  • Experience/knowledge of bash/shell scripting will be a plus.
  • Experience working with fixed-width, delimited and multi-record file formats.
  • Hands-on experience with tools like Jenkins to build, test and deploy applications.
  • Awareness of DevOps concepts and the ability to work in an automated release pipeline environment.
  • Excellent debugging skills.
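As an illustration of the Spark DataFrame work described in the list above, here is a minimal PySpark sketch; the bucket path, delimiter and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start (or reuse) a Spark session.
spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read a pipe-delimited file; the path and layout are illustrative.
orders = (
    spark.read
    .option("header", True)
    .option("delimiter", "|")
    .csv("s3://example-bucket/raw/orders/")
)

# A typical transformation: filter, cast, aggregate, then write back out.
daily_totals = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")
```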
Top IT MNC


Agency job
Chennai, Bengaluru (Bangalore), Kochi (Cochin), Coimbatore, Hyderabad, Pune, Kolkata, Noida, Gurugram, Mumbai
5 - 13 yrs
₹8L - ₹20L / yr
Snowflake schema
Python
Snowflake
Greetings,

We are looking for a Snowflake developer for one of our premium clients, for their PAN-India locations.
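By way of context only, here is a minimal sketch of querying Snowflake from Python using the snowflake-connector-python package; the connection parameters and table are placeholders, since the posting does not specify a stack.

```python
import snowflake.connector

# All connection details below are placeholders.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ANALYST_USER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Simple aggregate over a hypothetical fact table.
    cur.execute(
        "SELECT region, SUM(amount) AS total_sales "
        "FROM fact_sales GROUP BY region ORDER BY total_sales DESC"
    )
    for region, total_sales in cur.fetchall():
        print(region, total_sales)
finally:
    conn.close()
```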
Top startup of India -  News App


Agency job
via Jobdost by Sathish Kumar
Noida
6 - 10 yrs
₹35L - ₹65L / yr
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
Computer Vision
TensorFlow
+6 more
This will be an individual contributor role; only candidates from Tier 1/2 and product-based companies can apply.

Requirements-

● B.Tech/Masters in Mathematics, Statistics, Computer Science or another quantitative field
● 2-3+ years of work experience in the ML domain (2-5 years of experience)
● Hands-on coding experience in Python
● Experience in machine learning techniques such as Regression, Classification, Predictive Modeling, Clustering, the Deep Learning stack and NLP (see the sketch after this list)
● Working knowledge of TensorFlow/PyTorch
Optional Add-ons-
● Experience with distributed computing frameworks: MapReduce, Hadoop, Spark, etc.
● Experience with databases: MongoDB
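For illustration, a minimal classification sketch in scikit-learn; the posting only names TensorFlow/PyTorch, so the library, synthetic data and baseline model here are assumptions used to keep the example short.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic binary-classification data standing in for real product data.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# A simple baseline classifier; in practice this could be a deep model.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```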
Celebal Technologies


2 recruiters
Posted by Payal Hasnani
Jaipur, Noida, Gurugram, Delhi, Ghaziabad, Faridabad, Pune, Mumbai
5 - 15 yrs
₹7L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
Job Responsibilities:

• Project Planning and Management
o Take end-to-end ownership of multiple projects / project tracks
o Create and maintain project plans and other related documentation for project
objectives, scope, schedule and delivery milestones
o Lead and participate across all the phases of software engineering, right from
requirements gathering to GO LIVE
o Lead internal team meetings on solution architecture, effort estimation, manpower
planning and resource (software/hardware/licensing) planning
o Manage RIDA (Risks, Impediments, Dependencies, Assumptions) for projects by
developing effective mitigation plans
• Team Management
o Act as the Scrum Master
o Conduct SCRUM ceremonies like Sprint Planning, Daily Standup, Sprint Retrospective
o Set clear objectives for the project and roles/responsibilities for each team member
o Train and mentor the team on their job responsibilities and SCRUM principles
o Make the team accountable for their tasks and help the team in achieving them
o Identify the requirements and come up with a plan for Skill Development for all team
members
• Communication
o Be the Single Point of Contact for the client in terms of day-to-day communication
o Periodically communicate project status to all the stakeholders (internal/external)
• Process Management and Improvement
o Create and document processes across all disciplines of software engineering
o Identify gaps and continuously improve processes within the team
o Encourage team members to contribute towards process improvement
o Develop a culture of quality and efficiency within the team

Must have:
• Minimum 8 years of experience (hands-on as well as leadership) in software / data engineering
across multiple job functions like Business Analysis, Development, Solutioning, QA, DevOps and
Project Management
• Hands-on as well as leadership experience in Big Data Engineering projects
• Experience developing or managing cloud solutions using Azure or another cloud provider
• Demonstrable knowledge of Hadoop, Hive, Spark, NoSQL DBs, SQL, Data Warehousing, ETL/ELT,
DevOps tools
• Strong project management and communication skills
• Strong analytical and problem-solving skills
• Strong systems level critical thinking skills
• Strong collaboration and influencing skills

Good to have:
• Knowledge of PySpark, Azure Data Factory, Azure Data Lake Storage, Synapse Dedicated SQL
Pool, Databricks, PowerBI, Machine Learning, Cloud Infrastructure
• Background in BFSI with focus on core banking
• Willingness to travel

Work Environment
• Customer Office (Mumbai) / Remote Work

Education
• UG: B. Tech - Computers / B. E. – Computers / BCA / B.Sc. Computer Science
European Bank headquartered in Copenhagen, Denmark.

Agency job
via Apical Mind by Rajeev T
NCR (Delhi | Gurgaon | Noida)
2 - 12 yrs
₹25L - ₹40L / yr
Data governance
DevOps
Data integration
Data engineering
Python
+14 more
Data Platforms (Data Integration) is responsible for envisioning, building and operating the Bank’s data integration platforms. The successful candidate will work out of Gurgaon as part of a high-performing team distributed across our two development centers, Copenhagen and Gurugram. The individual must be driven, passionate about technology and display a level of customer service that is second to none.

Roles & Responsibilities

  • Designing and delivering a best-in-class, highly scalable data governance platform
  • Improving processes and applying best practices
  • Contribute to all scrum ceremonies, assuming the role of ‘scrum master’ on a rotational basis
  • Development, management and operation of our infrastructure to ensure it is easy to deploy, scalable, secure and fault-tolerant
  • Flexibility with working hours as per business needs
Saviance Technologies


1 recruiter
Posted by Shipra Agrawal
NCR (Delhi | Gurgaon | Noida)
3 - 5 yrs
₹7L - ₹9L / yr
Power BI
Business Intelligence (BI)
DAX
Data modeling
+3 more

 

Job Title: Power BI Developer (Onsite)

Location: Park Centra, Sec 30, Gurgaon

CTC:        8 LPA

Time:       1:00 PM - 10:00 PM

  

Must Have Skills: 

  • Power BI Desktop Software
  • DAX Queries
  • Data modeling
  • Row-level security
  • Visualizations
  • Data Transformations and filtering
  • SSAS and SQL

 

Job description:

 

We are looking for a Power BI (PBI) Analytics Lead responsible for efficient data visualization, DAX queries and data modeling. The candidate will create complex Power BI reports, write complex M and DAX queries, and work on data modeling, row-level security, visualizations, data transformations and filtering. They will work closely with the client team to provide solutions and suggestions on Power BI.

 

Roles and Responsibilities:

 

  • Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
  • Understanding Data: You would be performing and documenting data analysis, data validation, and data mapping/design. You would be mining large datasets to determine their characteristics and select appropriate visualizations.
  • Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
  • Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
  • Manage upstream and downstream impact of all changes on automated reporting/dashboards
  • Independently apply problem-solving ability to identify meaningful insights to business
  • Identify automation opportunities and work with a wide range of stakeholders to implement the same.
  • The ability and self-confidence to work independently and increase the scope of the service line

 

Requirements: 

  • 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
  • Sound understanding and knowledge of PBI Visualization and Data Modeling with DAX queries
  • Experience in leading and mentoring a small team.

 

 

 

Elucidata Corporation


3 recruiters
Posted by Bhuvnesh Sharma
Remote, NCR (Delhi | Gurgaon | Noida)
4 - 6 yrs
₹15L - ₹20L / yr
Big Data
JavaScript
AngularJS (1.x)
React.js
About Elucidata:
Our mission is to make a data-driven understanding of disease the default starting point in the drug discovery process. Our products & services further the understanding of the ways in which diseased cells are different from healthy ones. This understanding helps scientists discover new drugs in a more effective manner and complements the move towards personalization. Biological big data will outpace data generated by YouTube and Twitter by 10x in the next 7 yrs. Our platform Polly will enable scientists to process different kinds of biological data and generate insights from them to accelerate drug discovery. Polly is already being used at premier biopharma companies like Pfizer and Agios, and at academic labs at Yale, MIT and Washington University.
We are looking for teammates who think out-of-the-box and are not satisfied with quick fixes or canned solutions to our industry's most challenging problems. If you seek an intellectually stimulating environment where you can have a major impact on a critically important industry, we'd like to talk to you.

About the Role
We are looking for engineers who want to build data-rich applications and love the end-to-end product journey, from understanding customer needs to the final product.

Key Responsibilities
- Developing web applications to visualize and process scientific data.
- Interacting with Product, Design and Engineering teams to spec, build, test and deploy new features.
- Understanding user needs and the science behind them.
- Mentoring junior developers.

Requirements
- Minimum 3-4 years of experience working in web development.
- In-depth knowledge of JavaScript.
- Hands-on experience with modern frameworks (Angular, React).
- Sound programming and computer science fundamentals.
- Good understanding of web architecture and single-page applications.

You might be a great cultural fit for Elucidata if:
- You are passionate about science.
- You are a self-learner who wants to keep learning every day.
- You regard your code as your craft that you want to keep honing.
- You like to work hard to solve big challenges and enjoy the process of breaking down a problem one blow at a time.
- You love science and can't stop being the geek at a party. Of course, you party harder than everybody else there.
UpX Academy


2 recruiters
Posted by Suchit Majumdar
Noida, Hyderabad, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹4L - ₹12L / yr
Spark
Hadoop
MongoDB
Python
Scala
+3 more
We are looking for a technically sound and excellent trainer on big data technologies. Get an opportunity to become well known in the industry and gain visibility. Host regular sessions on big data-related technologies and get paid to learn.