PuTTY Jobs in Mumbai


Apply to 11+ PuTTY Jobs in Mumbai on CutShort.io. Explore the latest PuTTY Job opportunities across top companies like Google, Amazon & Adobe.

Magic9 Media and Consumer Knowledge Pvt. Ltd.
Mumbai
3 - 5 yrs
₹7L - ₹12L / yr
ETL
SQL
Python
Statistical Analysis
Machine Learning (ML)
+4 more

Job Description

This requirement is to service our client, a leading big data technology company that measures what viewers consume across platforms to enable marketers to make better advertising decisions. We are seeking a Senior Data Operations Analyst to mine large-scale datasets for our client. Your work will have a direct impact on the business strategies of prominent industry leaders. Self-motivation and strong communication skills are both must-haves, and the ability to work in a fast-paced environment is desired.


Problems being solved by our client: 

Measuring consumer usage of devices linked to the internet and home networks, including computers, mobile phones, tablets, streaming sticks, smart TVs, thermostats and other appliances. There are more screens and other connected devices in homes than ever before, yet there have been major gaps in understanding how consumers interact with this technology. Our client uses a measurement technology to unravel the dynamics of consumers’ interactions with multiple devices.


Duties and responsibilities:

  • Contribute to the development of novel audience measurement and demographic inference solutions
  • Develop, implement, and support statistical and machine learning methodologies and processes
  • Build and test new features and concepts, and integrate them into the production process
  • Participate in ongoing research and evaluation of new technologies
  • Apply your experience across the development lifecycle: analysis, design, development, testing and deployment of the system
  • Collaborate with teams in Software Engineering, Operations, and Product Management to deliver timely, quality data. You will be the knowledge expert, delivering quality data to our clients.

Qualifications:

  • 3-5 years of relevant work experience in the areas outlined below
  • Experience in extracting data from large databases using SQL
  • Experience in writing complex ETL processes and frameworks for analytics and data management; hands-on experience with ETL tools is a must
  • Master’s degree or PhD in Statistics, Data Science, Economics, Operations Research, Computer Science, or a similar degree with a focus on statistical methods. A Bachelor’s degree in the same fields with significant, demonstrated professional research experience will also be considered.
  • Programming experience in a scientific computing language (R, Python, Julia) and the ability to interact with relational data (SQL, Apache Pig, SparkSQL). General-purpose programming (Python, Scala, Java) and familiarity with Hadoop is a plus.
  • Excellent verbal and written communication skills
  • Experience with TV or digital audience measurement or market research data is a plus
  • Familiarity with systems analysis or systems thinking is a plus
  • Must be comfortable analyzing complex, high-volume, high-dimension data from varying sources
  • Ability to engage with senior leaders across all functional departments
  • Ability to take on new responsibilities and adapt to changes
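The qualifications above centre on extracting data with SQL and building ETL processes. As a minimal, hypothetical sketch of that extract-transform-load pattern (not the client's actual pipeline), the steps can be shown with Python's built-in sqlite3 module; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a large production source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE viewing (device TEXT, minutes REAL)")
conn.executemany(
    "INSERT INTO viewing VALUES (?, ?)",
    [("tv", 120.0), ("tv", 30.0), ("mobile", 45.0)],
)

# Extract: aggregate in SQL rather than in application code.
rows = conn.execute(
    "SELECT device, SUM(minutes) FROM viewing GROUP BY device ORDER BY device"
).fetchall()

# Transform: convert minutes to hours.
usage_hours = {device: minutes / 60 for device, minutes in rows}

# Load: write the derived table back for downstream consumers.
conn.execute("CREATE TABLE usage_hours (device TEXT, hours REAL)")
conn.executemany("INSERT INTO usage_hours VALUES (?, ?)", usage_hours.items())
print(usage_hours)  # {'mobile': 0.75, 'tv': 2.5}
```

Real ETL frameworks add scheduling, retries and auditing around these three steps, but the shape is the same.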

 

Accrete.ai

Agency job
Mumbai
3 - 9 yrs
₹30L - ₹40L / yr
Machine Learning (ML)
Natural Language Processing (NLP)
Deep Learning
Data Science
Computer Vision
+4 more

Responsibilities


  • Work on execution and scheduling of all tasks related to assigned projects' deliverable dates
  • Optimize and debug existing code to make it scalable and improve performance
  • Design, development, and delivery of tested code and machine learning models into production environments
  • Work effectively in teams, managing and leading teams
  • Provide effective, constructive feedback to the delivery leader
  • Manage client expectations and work with an agile mindset with machine learning and AI technology
  • Design and prototype data-driven solutions

Eligibility


  • Highly experienced in designing, building, and shipping scalable, production-quality machine learning algorithms in Python
  • Working knowledge and experience in NLP core components (NER, Entity Disambiguation, etc.)
  • In-depth expertise in Data Munging and Storage (Experienced in SQL, NoSQL, MongoDB, Graph Databases)
  • Expertise in writing scalable APIs for machine learning models
  • Experience with maintaining code logs, task schedulers, and security
  • Working knowledge of machine learning techniques, feed-forward, recurrent and convolutional neural networks, entropy models, supervised and unsupervised learning
  • Experience with at least one of the following: Keras, Tensorflow, Caffe, or PyTorch
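The eligibility list above names feed-forward networks among the expected techniques. As a minimal illustration, deliberately not tied to any of the listed frameworks, a single-hidden-layer forward pass can be written in plain Python; the weights and layer sizes are arbitrary:

```python
import math

def sigmoid(x):
    """Standard logistic activation."""
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(x, w_hidden, w_out):
    """One forward pass of a tiny feed-forward net: input -> hidden -> scalar output."""
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    return sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)))

# Arbitrary example weights: 2 inputs, 2 hidden units, 1 output.
w_hidden = [[0.5, -0.25], [0.75, 0.5]]
w_out = [1.0, -1.0]
y = feed_forward([1.0, 2.0], w_hidden, w_out)
print(round(y, 3))
```

Keras, TensorFlow or PyTorch express the same computation with learned weights, batching and GPU support.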
Expand My Business
Bengaluru (Bangalore), Mumbai, Chennai
7 - 15 yrs
₹10L - ₹28L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+2 more

Design, implement, and improve the analytics platform

Implement and simplify self-service data query and analysis capabilities of the BI platform

Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility

Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale

Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms


Educational

At Ganit we are building an elite team, so we are seeking candidates who possess the following backgrounds:

7+ years of relevant experience

Expert-level skills in writing and optimizing complex SQL

Knowledge of data warehousing concepts

Experience in data mining, profiling, and analysis

Experience with complex data modelling, ETL design, and using large databases in a business environment

Proficiency with the Linux command line and systems administration

Experience with languages like Python/Java/Scala

Experience with Big Data technologies such as Hive/Spark

Proven ability to develop unconventional solutions, see opportunities to innovate, and lead the way

Good experience working with cloud platforms like AWS, GCP & Azure, including projects involving the creation of a data lake or data warehouse

Excellent verbal and written communication

Proven interpersonal skills and the ability to convey key insights from complex analyses in summarized business terms; ability to communicate effectively with multiple teams


Good to have

AWS/GCP/Azure Data Engineer Certification

Shiba Inu
Mumbai
2 - 10 yrs
₹10L - ₹50L / yr
Artificial Intelligence (AI)
Machine Learning (ML)
Python
Qualifications
  • You're proficient in the latest AI/machine learning technologies
  • You're proficient in GPT-3 based algorithms 
  • You have a passion for writing code as well as understanding and crafting the ways systems interact
  • You believe in the benefits of agile processes and shipping code often
  • You are pragmatic and work to coalesce requirements into reasonable solutions that provide value

Responsibilities
  • Deploy well-tested, maintainable and scalable software solutions
  • Take end-to-end ownership of the technology stack and product 
  • Collaborate with other engineers to architect scalable technical solutions
  • Embrace and improve our standards and processes to reduce friction and unlock efficiency


Current Ecosystem:
ShibaSwap: https://shibaswap.com/#/
Metaverse: https://shib.io/#/
NFTs: https://opensea.io/collection/theshiboshis
Game: Shiba Eternity on iOS and Android
MSMEx

Posted by Sujata Ranjan
Remote, Mumbai, Pune
4 - 6 yrs
₹5L - ₹12L / yr
Data Analytics
Data Analysis
Data Analyst
SQL
Python
+4 more

We are looking for a Data Analyst to oversee organisational data analytics. This will require you to design and help implement the data analytics platform that keeps the organisation running. The team will be the go-to for all data needs for the app, and we are looking for a self-starter who is hands-on yet able to abstract problems and anticipate data requirements.
This person should be a very strong technical data analyst who can design and implement data systems on their own. They should also be proficient in business reporting and have a keen interest in providing the data needed for the business.

 

Tools familiarity: SQL, Python, Mixpanel, Metabase, Google Analytics, CleverTap, App Analytics

Responsibilities

  • Lead the data analytics team; own processes and frameworks for metrics, analytics, experimentation and user insights
  • Align metrics across teams to make them actionable and promote accountability
  • Build data-based frameworks for assessing and strengthening Product Market Fit
  • Identify viable growth strategies through data and experimentation
  • Run experiments for product optimisation and understanding user behaviour
  • Take a structured approach to deriving user insights and answering questions using data
  • Work closely with Technical and Business teams to get this implemented

Skills

  • 4 to 6 years in a relevant data analytics role at a product-oriented company
  • Highly organised, technically sound & good at communication
  • Ability to handle & build for cross-functional data requirements / interactions with teams
  • Great with Python, SQL
  • Can build and mentor a team
  • Knowledge of key business metrics like cohort, engagement cohort, LTV, ROAS, ROE
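Two of the business metrics named above, LTV and ROAS, reduce to simple arithmetic over revenue data. A hypothetical sketch with invented per-customer figures (a naive average-revenue LTV; real LTV models discount future revenue and churn):

```python
# Invented revenue-per-order figures, keyed by customer.
orders = {
    "c1": [500.0, 300.0],
    "c2": [250.0],
    "c3": [400.0, 400.0, 200.0],
}
ad_spend = 500.0  # hypothetical marketing spend for the period

total_revenue = sum(sum(v) for v in orders.values())
ltv = total_revenue / len(orders)   # naive LTV: average revenue per customer
roas = total_revenue / ad_spend     # return on ad spend

print(ltv, roas)
```

Cohort and engagement-cohort metrics follow the same idea, with customers first grouped by signup period before the per-group aggregation.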

 

Eligibility

BTech or MTech in Computer Science/Engineering from Tier 1 or Tier 2 colleges

 

Good knowledge of data analytics and data visualization tools. A formal certification would be an added advantage.

We are more interested in what you CAN DO than your location, education, or experience levels.

 

Send us your code samples / GitHub profile / published articles if applicable.

Railofy

Posted by Manan Jain
Mumbai
2 - 5 yrs
₹5L - ₹12L / yr
Data Science
Python
R Programming

About Us:

We are a VC-funded startup solving one of the biggest transportation problems India faces. Most passengers in India travel long distances on IRCTC trains. At the time of booking, approximately 1 out of every 2 passengers ends up with a Waitlisted or RAC ticket. This creates a lot of anxiety for passengers, as the Railways announces only 4 hours before departure whether they have a confirmed seat. We solve this problem through our Waitlist & RAC Protection, which can be bought against each IRCTC ticket at the time of booking. If the train ticket is not confirmed, we fly the passenger to the destination. Our team consists of 3 founders from IIT, IIM and ISB.

Functional Experience:

  • Computer Science or IT Engineering background with a solid understanding of the basics of Data Structures and Algorithms
  • 2+ years of data science experience working with large datasets
  • Expertise in Python packages like pandas, NumPy, sklearn, matplotlib, seaborn, Keras and TensorFlow
  • Expertise in Big Data technologies like Hadoop, Cassandra and PostgreSQL
  • Expertise in cloud computing on AWS with EC2, AutoML, Lambda and RDS
  • Good knowledge of Machine Learning and statistical time series analysis (optional)
  • Unparalleled logical ability, making you the go-to person for all things related to data
  • You love coding like a hobby and are up for a challenge!

 

Cultural:

  • Assume a strong sense of ownership of analytics: design, develop & deploy
  • Collaborate with senior management, operations & business team
  • Ensure Quality & sustainability of the architecture
  • Motivation to join an early stage startup should go beyond compensation
Blenheim Chalcot IT Services India Pvt Ltd

Agency job
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, from text, video and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), and Azure Data Catalogue and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.

You will be building Power BI based analytics solutions to provide actionable insights into customer data, and to measure operational efficiencies and other key business performance metrics.

You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation and documentation of the technical aspects, including integration, to ensure the solution meets customer requirements. You will work closely with fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out data-driven solutions.
Expertise:

  • Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now Synapse Analytics)
  • Demonstrated expertise in data modelling and data warehouse methodologies and best practices
  • Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools
  • Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP/etc.)
  • Azure DevOps knowledge, essential for CI/CD of data ingestion pipelines and integrations
  • Experience with object-oriented/object-function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required
  • Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must
  • Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage
  • Expertise with big data tools (Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems) is a plus
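The expertise list mentions integrating both structured (JSON/XML) and flat (CSV) feeds. A minimal, tool-agnostic sketch of normalising the two shapes into common rows using Python's standard library; the feed contents and field names are invented for illustration, and a production pipeline (e.g. in Azure Data Factory) would add schema validation and auditing:

```python
import csv
import io
import json

# Two hypothetical feeds carrying the same kind of record in different shapes.
json_feed = '[{"id": 1, "amount": 10.5}, {"id": 2, "amount": 7.0}]'
csv_feed = "id,amount\n3,4.25\n4,1.75\n"

def normalise(record):
    """Coerce a raw record (dict of strings or typed values) into one (id, amount) row."""
    return int(record["id"]), float(record["amount"])

rows = [normalise(r) for r in json.loads(json_feed)]
rows += [normalise(r) for r in csv.DictReader(io.StringIO(csv_feed))]
print(rows)  # [(1, 10.5), (2, 7.0), (3, 4.25), (4, 1.75)]
```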
Essential Experience:

  • 5 or more years of hands-on experience in a data architect role covering the development of ingestion, integration, data auditing, reporting, and testing with the Azure SQL tech stack
  • Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential
  • Microsoft Azure and Data certifications, at least at the fundamentals level, are a must
  • Experience using agile development methodologies, version control systems and repositories is a must
  • A good, applied understanding of the end-to-end data process development life cycle
  • A good working knowledge of data warehouse methodology using Azure SQL
  • A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must
  • Experience working in the public sector, or in an organisation servicing the public sector, is a must
  • Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an environment undergoing a programme of transformational change
  • The ability to contribute and adhere to standards, excellent attention to detail, and a strong drive for quality
Desirables:

  • Experience with AWS or Google Cloud platforms will be an added advantage
  • Experience with Azure ML services will be an added advantage

Personal Attributes:

  • Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one
  • Ability to present highly technical concepts and ideas in business-friendly language
  • Ability to effectively prioritise and execute tasks in a high-pressure environment
  • Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
  • Extensive experience working in a team-oriented, collaborative environment as well as working independently
  • Comfortable with the multi-project, multi-tasking consulting Data Architect lifestyle
  • Excellent interpersonal skills with teams and in building trust with clients
  • Ability to support and work with cross-functional teams in a dynamic environment
  • A passion for achieving business transformation; the ability to energise and excite those you work with
  • Initiative; the ability to work flexibly in a team, working comfortably without direct supervision
Angel One

Posted by Vineeta Singh
Remote, Mumbai
3 - 7 yrs
₹5L - ₹15L / yr
Data Science
Data Scientist
Python
SQL
R Language
+1 more

Role : 

  • Understand and translate statistics and analytics to address business problems
  • Responsible for helping in data preparation and data pulls, which is the first step in machine learning
  • Should be able to cut and slice data to extract interesting insights
  • Model development for better customer engagement and retention
  • Hands-on experience in relevant tools like SQL (expert), Excel, R/Python
  • Working on strategy development to increase business revenue

 


Requirements:

  • Hands-on experience in relevant tools like SQL (expert), Excel, R/Python
  • Strong knowledge of statistics
  • Should be able to do data scraping & data mining
  • Be self-driven, and show ability to deliver on ambiguous projects
  • An ability and interest in working in a fast-paced, ambiguous and rapidly-changing environment
  • Should have worked on business projects for an organization, e.g. customer acquisition, customer retention
B2B SaaS platform For BFSI

Agency job
via Unnati by Samta Arora
Mumbai
1 - 5 yrs
₹10L - ₹11L / yr
Amazon Web Services (AWS)
SQL
NOSQL Databases
Python

Our client provides solutions in data, analytics, decisioning and automation. They focus on the lending lifecycle of financial institutions, and their products are designed around systemic fraud prevention, risk management, compliance, etc.

 

Our client is a one-stop solution provider, catering to the authentication, verification and diligence needs of various industries including, but not limited to, banking, insurance and payments.

 

Headquartered in Mumbai, our client was founded in 2015 by a team of three veteran entrepreneurs, two of whom are chartered accountants and one is a graduate from IIT, Kharagpur. They have been funded by tier 1 investors and have raised $1.1M in funding.

 
As a Web Scraper, you will be responsible for applying your knowledge to fetch data from multiple online sources, cleanse it and build APIs on top of it.

What you will do:

  • Developing a deep understanding of our vast data sources on the web and knowing exactly how, when, and which data to scrape, parse and store
  • Working closely with Database Administrators to store data in SQL and NoSQL databases
  • Developing frameworks for automating and maintaining constant flow of data from multiple sources
  • Working independently with little supervision to research and test innovative solutions

 

Desired Candidate Profile

What you need to have:

  • Bachelor’s/Master’s degree in Computer Science/Computer Engineering/Information Technology
  • 1 - 5 years of relevant experience
  • Strong coding experience in Python (knowledge of Java, JavaScript is a plus)
  • Experience with SQL and NoSQL databases
  • Experience with multi-processing, multi-threading, and AWS/Azure
  • Strong knowledge of scraping frameworks such as Python libraries (Requests, Beautiful Soup), Web Harvest and others
  • In-depth knowledge of algorithms and data structures; previous experience with web crawling is a must
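The profile above names Beautiful Soup for scraping. To keep this sketch dependency-free it uses the standard library's html.parser instead, on an invented inline HTML snippet; a real scraper would first fetch the page over HTTP (and should respect robots.txt and site terms):

```python
from html.parser import HTMLParser

class LinkTextParser(HTMLParser):
    """Collect the text of every <a> tag, a typical scraping extraction step."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

    def handle_data(self, data):
        if self.in_link:
            self.links.append(data.strip())

html = '<ul><li><a href="/a">First</a></li><li><a href="/b">Second</a></li></ul>'
parser = LinkTextParser()
parser.feed(html)
print(parser.links)  # ['First', 'Second']
```

Beautiful Soup offers the same extraction with far less boilerplate (e.g. tag lookups over a parsed tree), which is why it is the tool named in the requirement.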

 

They provide both wholesale and retail funding. (PM1)

Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
Teradata
Vertica
Python
DBA
Redshift
+8 more
  • Key responsibility is to design, develop & maintain efficient data models for the organization, ensuring optimal query performance for the consumption layer
  • Developing, deploying & maintaining a repository of UDXs written in Java/Python
  • Develop optimal data model designs, analyzing complex distributed data deployments and making recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
  • Periodic database health checks and maintenance
  • Designing collections in a NoSQL database for efficient performance
  • Document & maintain a data dictionary from various sources to enable data governance
  • Coordination with Business teams, IT, and other stakeholders to provide best-in-class data pipeline solutions: exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
  • Data Governance process implementation and ensuring data security

Requirements

  • Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
  • Programming experience using Python / Java.
  • Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
  • Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
  • Strong knowledge in Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model.
  • Extensive technical experience in SQL including code optimization techniques.
  • Strong knowledge of database performance tuning and troubleshooting
  • Knowledge of collection design in any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc), along with implementation of best practices.
  • Ability to understand business functionality, processes, and flows.
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
  • Any OLAP DWH DBA Experience and User Management will be added advantage.
  • Knowledge in financial industry-specific Data models such as FSLDM, IBM Financial Data Model, etc will be added advantage.
  • Experience in Snowflake will be added advantage.
  • Working experience in BFSI/NBFC & data understanding of Loan/Mortgage data will be added advantage.

Functional knowledge

  • Data Governance & Quality Assurance
  • Modern OLAP Database Architecture & Design
  • Linux
  • Data structures, algorithm & data modeling techniques
  • No-SQL database architecture
  • Data Security

 

Quantiphi Inc.

Posted by Anwar Shaikh
Mumbai
1 - 5 yrs
₹4L - ₹15L / yr
Python
Machine Learning (ML)
Deep Learning
TensorFlow
Keras
+1 more
1. The candidate should be passionate about machine learning and deep learning.
2. Should understand the importance and know-how of taking the machine-learning-based solution to the consumer.
3. Hands-on experience with statistical, machine-learning tools and techniques
4. Good exposure to Deep learning libraries like Tensorflow, PyTorch.
5. Experience in implementing Deep Learning techniques, Computer Vision and NLP. The candidate should be able to develop the solution from scratch, with code available on GitHub.
6. Should be able to read research papers and pick ideas to quickly reproduce research in the most comfortable Deep Learning library.
7. Should be strong in data structures and algorithms. Should be able to do code complexity analysis/optimization for smooth delivery to production.
8. Expert level coding experience in Python.
9. Technologies: Backend - Python (Programming Language)
10. Should have the ability to think about long-term solutions, modularity, and reusability of components.
11. Should be able to work in a collaborative way. Should be open to learning from peers as well as constantly bring new ideas to the table.
12. Self-driven. Open to peer criticism and feedback, and able to take it positively. Ready to be held accountable for the responsibilities undertaken.