
11+ PuTTY Jobs in India

Apply to 11+ PuTTY Jobs on CutShort.io. Find your next job, effortlessly. Browse PuTTY Jobs and apply today!

Magic9 Media and Consumer Knowledge Pvt. Ltd.
Mumbai
3 - 5 yrs
₹7L - ₹12L / yr
ETL
SQL
Python
Statistical Analysis
Machine Learning (ML)
+4 more

Job Description

This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on driving business strategies for prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.


Problems being solved by our client: 

Measure consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers' interactions with multiple devices.


Duties and responsibilities:

  • The successful candidate will contribute to the development of novel audience measurement and demographic inference solutions. 
  • Develop, implement, and support statistical or machine learning methodologies and processes. 
  • Build and test new features and concepts, and integrate them into the production process
  • Participate in ongoing research and evaluation of new technologies
  • Apply your experience across the development lifecycle: analysis, design, development, testing and deployment of the system
  • Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely and quality data. You will be the knowledge expert, delivering quality data to our clients

Qualifications:

  • 3-5 years of relevant work experience in the areas outlined below
  • Experience in extracting data using SQL from large databases
  • Experience in writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must.
  • Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered. 
  • Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop are a plus.
  • Excellent verbal and written communication skills. 
  • Experience with TV or digital audience measurement or market research data is a plus. 
  • Familiarity with systems analysis or systems thinking is a plus. 
  • Must be comfortable analyzing complex, high-volume and high-dimensional data from varied sources
  • Ability to engage with Senior Leaders across all functional departments
  • Ability to take on new responsibilities and adapt to changes

 

Wonder Worth Solutions Pvt Ltd
Vellore
4 - 7 yrs
₹3.5L - ₹5L / yr
Machine Learning (ML)
Data Science
Python
Java
C++
+1 more

As a part of WWS, your expertise in machine learning helps us extract value from our data. You will lead all the processes, from data collection, cleaning and preprocessing to training models and deploying them to production. The ideal candidate will be passionate about artificial intelligence and stay up-to-date with the latest developments in the field.

What We Expect

  • A Bachelor's/Master's degree in IT, computer science or a related advanced field is preferred.
  • At least 3 years of experience working with ML libraries and packages.
  • Familiarity with coding and programming languages, including Python, Java, C++, and SAS.
  • Strong experience in programming and statistics.
  • Well-versed in Data Science and neural schematics in networking and software.
  • Flexibility in shifts is appreciated.

A Machine Learning Engineer's Ideal Day At WWS

Design and Develop. Primary duties include implementing machine learning algorithms and running experiments and tests on AI systems. Designing and developing machine learning systems, along with performing statistical analyses, falls under the day-to-day activities of the developer.

Algorithm Assertion. Engineers act as critical members of the data science team: their tasks involve researching, asserting and designing the artificial intelligence responsible for machine learning, and maintaining and improving existing artificial intelligence systems.

Research and Development. Analyze large, complex datasets, extract insights and decide on the appropriate techniques. Research and implement best practices to improve the existing machine learning infrastructure. Provide support to engineers and product managers in implementing machine learning in the product.

What You Can Expect

  • Full-time, salaried positions complemented by welfare programs.
  • Competitive salary and tailored training in the core space with recognition potential and annual bonus.
  • Periodic performance appraisals.
  • Attendance Incentives.
  • Working with the best and budding talent in the industry.
  • A conducive work environment with dynamic benefits.

Why Consider Machine Learning Engineer as a career with WWS?

WWS offers a very appealing work environment that makes it easy to build relationships with other staff members and clients. You may also have an opportunity to learn other aspects of the work on the job, which can enhance your experience and qualifications.

Many businesses must proactively react to changing factors, like patterns of customer behavior or prices. Tracking model performance and retraining models once fresher data is available is key to success. This falls under the machine learning engineer's range of responsibilities, which has become crucial for many organizations.

Please attach your resume and let us know through email your current address, phone number, and the best time to contact you by phone.

Apply to this Job

NextGen Invent Corporation
Posted by Deepshikha Gupta
Remote only
0 - 8 yrs
₹3L - ₹20L / yr
Python
Object Oriented Programming (OOPs)
Data Science
Machine Learning (ML)
Natural Language Processing (NLP)
+3 more

Experience: 1-5 Years

Job Location: WFH

No. of Position: Multiple

Qualifications: Ph.D. (must have)

Work Timings: 1:30 PM IST to 10:30 PM IST

Functional Area: Data Science

NextGen Invent is currently searching for a Data Scientist. This role reports directly to the VP, Data Science, in the Data Science Practice. The person will work on data science use cases for the enterprise and must have deep expertise in supervised and unsupervised machine learning, modeling and algorithms, with a strong focus on delivering use cases and solutions at speed and scale to solve business problems.

Job Responsibilities:

  • Leverage AI/ML modeling and algorithms to deliver on use cases
  • Build modeling solutions at speed and scale to solve business problems
  • Develop data science solutions that can be tested and deployed in Agile delivery model
  • Implement and scale-up high-availability models and algorithms for various business and corporate functions
  • Investigate and create experimental prototypes that work on specific domains and verticals
  • Analyze large, complex data sets to reveal underlying patterns and trends
  • Support and enhance existing models to ensure better performance
  • Set up and conduct large-scale experiments to test hypotheses and delivery of models

Skills, Knowledge, Experience:

  • Must have Ph.D. in an analytical or technical field (e.g. applied mathematics, computer science)
  • Strong knowledge of statistical and machine learning methods
  • Hands-on experience building models at speed and scale
  • Ability to work in a collaborative, transparent style with cross-functional stakeholders across the organization to lead and deliver results
  • Strong skills in oral and written communication
  • Ability to lead a high-functioning team and develop and train people
  • Must have programming experience in SQL, Python and R
  • Experience conceiving, implementing and continually improving machine learning projects
  • Strong familiarity with higher level trends in artificial intelligence and open-source platforms
  • Experience working with AWS, Azure, or similar cloud platform
  • Familiarity with visualization techniques and software
  • Healthcare experience is a plus
  • Experience with Kafka, chatbots and blockchain is a plus.


Digital Banking Firm

Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹40L / yr
Apache Kafka
Hadoop
Spark
Apache Hadoop
Big Data
+5 more
Location - Bangalore (Remote for now)
 
Designation - Sr. SDE (Platform Data Science)
 
About Platform Data Science Team

The Platform Data Science team works at the intersection of data science and engineering. Domain experts develop and advance platforms, including the data platform, the machine learning platform, and other platforms for Forecasting, Experimentation, Anomaly Detection, Conversational AI, Underwriting of Risk, Portfolio Management, Fraud Detection & Prevention and many more. We are also the Data Science and Analytics partners for Product and provide Behavioural Science insights across Jupiter.
 
About the role:

We're looking for strong Software Engineers who can combine EMR, Redshift, Hadoop, Spark, Kafka, Elasticsearch, TensorFlow, PyTorch and other technologies to build the next-generation Data Platform, ML Platform and Experimentation Platform. If this sounds interesting, we'd love to hear from you!
This role involves designing and developing software products that impact many areas of our business. The individual in this role will be responsible for helping define requirements, creating software designs, implementing code to these specifications, providing thorough unit and integration testing, and supporting products while they are deployed and used by our stakeholders.

Key Responsibilities:

Participate in, own and influence the architecture and design of systems
Collaborate with other engineers, data scientists and product managers
Build intelligent systems that drive decisions
Build systems that enable us to perform experiments and iterate quickly
Build platforms that enable scientists to train, deploy and monitor models at scale
Build analytical systems that drive better decision making
 

Required Skills:

Programming experience with at least one modern language such as Java or Scala, including object-oriented design
Experience in contributing to the architecture and design (architecture, design patterns, reliability and scaling) of new and current systems
Bachelor’s degree in Computer Science or related field
Computer Science fundamentals in object-oriented design
Computer Science fundamentals in data structures
Computer Science fundamentals in algorithm design, problem solving, and complexity analysis
Experience in databases, analytics, big data systems or business intelligence products:
Data lake, data warehouse, ETL, ML platform
Big data tech like: Hadoop, Apache Spark
Deep-Rooted.co (formerly Clover)

Posted by Likhithaa D
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹15L / yr
Java
Python
SQL
AWS Lambda
HTTP
+5 more

Deep-Rooted.Co is on a mission to get fresh, clean, community (local farmer) produce from harvest to your home with a promise of quality first! Our values are rooted in trust, convenience and dependability, with a bunch of learning & fun thrown in.


Founded out of Bangalore by Arvind, Avinash, Guru and Santosh, with the support of our investors Accel, Omnivore and Mayfield, we have raised $7.5 million to date in Seed, Series A and debt funding. Our brand Deep-Rooted.Co, launched in August 2020, was the first of its kind in India's Fruits & Vegetables (F&V) space. It is present in Bangalore and Hyderabad and on a journey of expansion to newer cities, managed seamlessly through a tech platform designed and built to transform the Agri-Tech sector.


Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.  

How is this possible? It's because we work with smart people. We are looking for Engineers in Bangalore to work with the Product Leader (Founder) (https://www.linkedin.com/in/gururajsrao/) and the CTO (https://www.linkedin.com/in/sriki77/). This is a meaningful project for us, and we are sure you will love it as it touches everyday life and is fun. This will be a virtual consultation.


We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.

Purpose of the role:

* As a startup, we have data distributed across various sources like Excel, Google Sheets, databases, etc. As we grow, we need swift decision-making based on all the data that exists. You will help us bring this data together and put it into a data model that can be used in business decision-making.
* Handle the nuances of the Excel and Google Sheets APIs (a sketch of this kind of pull-and-load appears after this list).
* Pull data in and manage its growth, freshness and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing and Business Heads.
* Understand the business problem, solve it using the technology and take it to production - no hand-offs - the full path to production is yours.
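
To make the pull-and-load step above concrete, here is a minimal sketch, assuming the gspread, pandas and SQLAlchemy libraries, of what consolidating one Google Sheet into a relational table could look like in Python. The service-account file, sheet key, worksheet name, connection string and table name are hypothetical placeholders, not Deep-Rooted's actual stack; the same pattern would extend to Excel exports and other sources.

    # Minimal sketch (assumptions noted above): pull a Google Sheet into a
    # DataFrame and land it in a SQL table for downstream reporting.
    import gspread
    import pandas as pd
    from sqlalchemy import create_engine

    def sync_sheet_to_table(sheet_key: str, worksheet_name: str, table_name: str) -> int:
        # Authenticate with a service account; the credentials file path is a placeholder.
        gc = gspread.service_account(filename="service_account.json")

        # Read every row of the worksheet; the header row becomes the dict keys.
        worksheet = gc.open_by_key(sheet_key).worksheet(worksheet_name)
        records = worksheet.get_all_records()

        # Normalize into a DataFrame so freshness/correctness checks can be applied.
        df = pd.DataFrame.from_records(records).drop_duplicates()

        # Load into a relational table that dashboards and analysts can query.
        engine = create_engine("sqlite:///warehouse.db")  # placeholder connection string
        df.to_sql(table_name, engine, if_exists="replace", index=False)
        return len(df)

    if __name__ == "__main__":
        rows = sync_sheet_to_table("YOUR_SHEET_KEY", "Sheet1", "raw_orders")
        print(f"Loaded {rows} rows")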

Technical expertise:
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of Data Warehousing and Data Architecture.
* Experience with Data Transformations and ETL.
* Experience with API tools and more closed systems like Excel, Google Sheets etc.
* Experience with the AWS Cloud Platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on the cloud.

Skills:
Java, SQL, Python, Data Build Tool, Lambda, HTTP, REST API, Extract Transform Load (ETL).
Propellor.ai

Posted by Anila Nair
Remote only
2 - 5 yrs
₹5L - ₹15L / yr
SQL
API
Python
Spark

Job Description - Data Engineer

About us
Propellor is aimed at bringing Marketing Analytics and other Business Workflows to the Cloud ecosystem. We work with International Clients to make their Analytics ambitions come true, by deploying the latest tech stack and data science and engineering methods, making their business data insightful and actionable. 

 

What is the role?
This team is responsible for building a Data Platform for many different units. The platform will be built on the cloud, so in this role the individual will be organizing and orchestrating different data sources and giving recommendations on the services that fulfil goals based on the type of data.

Qualifications:

• Experience with Python, SQL, Spark
• Knowledge/notions of JavaScript
• Knowledge of data processing, data modeling, and algorithms
• Strong in data, software, and system design patterns and architecture
• API building and maintaining
• Strong soft skills, communication
Nice to have:
• Experience with cloud: Google Cloud Platform, AWS, Azure
• Knowledge of Google Analytics 360 and/or GA4.

Key Responsibilities
• Design and develop the platform based on a microservices architecture.
• Work on the core backend and ensure it meets the performance benchmarks.
• Work on the front end with ReactJS.
• Design and develop APIs for the front end to consume.
• Constantly improve the architecture of the application by clearing the technical backlog.
• Meet both technical and consumer needs.
• Stay abreast of developments in web applications and programming languages.

What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them; we are open to promising candidates who are passionate about their work and are team players.
• Education - BE/MCA or equivalent.
• Agnostic/Polyglot with multiple tech stacks.
• Worked on open-source technologies – NodeJS, ReactJS, MySQL, NoSQL, MongoDB, DynamoDB.
• Good experience with Front-end technologies like ReactJS.
• Backend exposure – good knowledge of building API.
• Worked on serverless technologies.
• Efficient in building microservices, combining server and front end.
• Knowledge of cloud architecture.
• Should have sound working experience with relational and columnar DB.
• Should be innovative and communicative in approach.
• Will be responsible for the functional/technical track of a project.

Whom will you work with?
You will closely work with the engineering team and support the Product Team.

Hiring Process includes:

a. Written Test on Python and SQL

b. 2 - 3 rounds of Interviews

Immediate Joiners will be preferred

A Product Company

Agency job
via wrackle by Lokesh M
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹26L / yr
Looker
Big Data
Hadoop
Spark
Apache Hive
+4 more
Job Title: Senior Data Engineer/Analyst
Location: Bengaluru
Department: Engineering

Bidgely is looking for an extraordinary and dynamic Senior Data Analyst to be part of its core team in Bangalore. You must have delivered exceptionally high-quality, robust products dealing with large data. Be part of a highly energetic and innovative team that believes nothing is impossible with some creativity and hard work.

Responsibilities 
● Design and implement a high-volume data analytics pipeline in Looker for Bidgely's flagship product.
● Implement data pipelines in the Bidgely Data Lake.
● Collaborate with product management and engineering teams to elicit & understand their requirements & challenges and develop potential solutions 
● Stay current with the latest tools, technology ideas and methodologies; share knowledge by clearly articulating results and ideas to key decision makers. 

Requirements 
● 3-5 years of strong experience in data analytics and in developing data pipelines. 
● Very good expertise in Looker 
● Strong in data modeling, developing SQL queries and optimizing queries. 
● Good knowledge of data warehouse (Amazon Redshift, BigQuery, Snowflake, Hive). 
● Good understanding of Big data applications (Hadoop, Spark, Hive, Airflow, S3, Cloudera) 
● Attention to detail. Strong communication and collaboration skills.
● BS/MS in Computer Science or equivalent from premier institutes.
Venture Highway

Posted by Nipun Gupta
Bengaluru (Bangalore)
2 - 6 yrs
₹10L - ₹30L / yr
Python
Data engineering
Data Engineer
MySQL
MongoDB
+5 more
- Experience with Python and Data Scraping.
- Experience with relational SQL & NoSQL databases including MySQL & MongoDB.
- Familiar with the basic principles of distributed computing and data modeling.
- Experience with distributed data pipeline frameworks like Celery, Apache Airflow, etc.
- Experience with NLP and NER models is a bonus.
- Experience building reusable code and libraries for future use.
- Experience building REST APIs.

Preference for candidates working in tech product companies
Increasingly Technologies

Posted by Kumar V
Bengaluru (Bangalore)
4 - 6 yrs
₹6L - ₹9L / yr
MySQL
SQL
Stored Procedures

What essential skills you need

 

  • 4-6 years' experience in MySQL database development. You know your way around problems, logic & code, but you occasionally need help with how to do things. That's cool with us. We use Stack Overflow too.
  • Expertise in database design & development (preferably on MySQL).
  • Expertise in writing complex SQL queries, stored procedures, functions and triggers.
  • Expertise in performance tuning, query optimization, and using Performance Monitor, SQL Profiler and other related monitoring and troubleshooting tools.
  • BE/B.Tech (or equivalent education), good written & verbal communication skills.
5 years old AI Startup

Agency job
Pune
2 - 6 yrs
₹12L - ₹18L / yr
Data Science
Machine Learning (ML)
Python
Natural Language Processing (NLP)
Deep Learning
  •  3+ years of experience in Machine Learning
  • Bachelors/Masters in Computer Engineering/Science.
  • Bachelors/Masters in Engineering/Mathematics/Statistics with sound knowledge of programming and computer concepts.
  • 10th and 12th academics 70% & above.

Skills :
 - Strong Python/ programming skills
 - Good conceptual understanding of Machine Learning / Deep Learning / Natural Language Processing
 - Strong verbal and written communication skills.
 - Should be able to manage a team, meet project deadlines and interface with clients.
 - Should be able to work across different domains, quickly ramp up on business processes & flows, and translate business problems into data solutions

A fintech company

Agency job
via Connexions by nakul pareek
Bengaluru (Bangalore)
6 - 14 yrs
₹12L - ₹32L / yr
Java
Python
Javascript
Amazon Web Services (AWS)
Go Programming (Golang)
+2 more
Responsibilities:
  • Design simple architecture for complex business requirements and software.
  • Understand the product architecture and customize it as per customer requirements.
  • Design and implement Web APIs considering the service management aspects of orchestration, choreography, security, hosting and analytics.
  • Work with cross-functional teams from product management, Leads, QA, Design and customers.
  • Talk to CIOs and CTOs of customers, understand their technical requirements and provide solutions.
  • Provide technical leadership and be a role model for other team members.
  • Coach and mentor other engineers in technology and process.
  • Travel to customer locations; might need to work onsite in India and outside India.

Basic Qualification:
  • Bachelors/Masters in Engineering from a premier institute.
  • Total 6-10 years of experience in the software industry, with a minimum of 2 years as an Architect.
  • Passion for engineering and solving complex problems to delight customers.
  • Experience of working with open source and PaaS.
  • Hands-on experience with cloud and Docker.
  • Hands-on experience with Java, Spring Boot, microservices, integration patterns and databases.
  • Prior experience of integration with external systems like CRM, ERPs, Core Banking, BPMs, Core Insurance etc.
  • Experience in performance and security.

Preferred Qualification:
  • Prior experience in the Banking industry, especially Lending.
  • Prior experience of using Open Source software.