Amazon Glacier Jobs in Chennai

11+ Amazon Glacier Jobs in Chennai | Amazon Glacier Job openings in Chennai

Apply to 11+ Amazon Glacier Jobs in Chennai on CutShort.io. Explore the latest Amazon Glacier Job opportunities across top companies like Google, Amazon & Adobe.

netmedscom

at netmedscom

3 recruiters
Vijay Hemnath
Posted by Vijay Hemnath
Chennai
5 - 10 yrs
₹10L - ₹30L / yr
Machine Learning (ML)
Software deployment
CI/CD
Cloud Computing
Snow flake schema
+19 more

We are looking for an outstanding ML Architect (Deployments) with expertise in deploying Machine Learning solutions/models into production and scaling them to serve millions of customers. The ideal candidate has an adaptable, productive working style that fits a fast-moving environment.

 

Skills:

- 5+ years of experience deploying Machine Learning pipelines in large enterprise production systems.

- Experience developing end-to-end ML solutions, from business hypothesis to deployment, with an understanding of the entire ML development life cycle.
- Expert in modern software development practices; solid experience with source control management and CI/CD.
- Proficient in designing relevant architectures/microservices to support application integration, model monitoring, training/re-training, model management, model deployment, model experimentation/development, and alert mechanisms.
- Experience with public cloud platforms (Azure, AWS, GCP).
- Serverless services like Lambda, Azure Functions, and/or Cloud Functions.
- Orchestration services like Data Factory, Data Pipeline, and/or Dataflow.
- Data science workbenches/managed services like Azure Machine Learning, SageMaker, and/or AI Platform.
- Data warehouse services like Snowflake, BigQuery, Azure SQL DW, and/or AWS Redshift.
- Distributed computing frameworks/services like PySpark, EMR, and/or Databricks.
- Data storage services like Cloud Storage, S3, Blob Storage, and/or S3 Glacier.
- Data visualization tools like Power BI, Tableau, QuickSight, and/or Qlik.
- Proven experience serving predictive algorithms and analytics through batch and real-time APIs.
- Solid experience working with software engineers, data scientists, product owners, business analysts, project managers, and business stakeholders to design the holistic solution.
- Strong technical acumen around automated testing.
- Extensive background in statistical analysis and modeling (distributions, hypothesis testing, probability theory, etc.).
- Strong hands-on experience with statistical packages and ML libraries (e.g., Python scikit-learn, Spark MLlib, etc.).
- Experience in effective data exploration and visualization (e.g., Excel, Power BI, Tableau, Qlik, etc.).
- Experience in developing and debugging in one or more of Java and Python.
- Ability to work in cross-functional teams.
- Ability to apply Machine Learning techniques in production, including, but not limited to, neural networks, regression, decision trees, random forests, ensembles, SVM, Bayesian models, K-Means, etc.
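To make one of the listed techniques concrete, here is a minimal, framework-free sketch of K-Means clustering (illustration only; a production deployment would use a library such as scikit-learn or Spark MLlib):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal K-Means for 2-D points: returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared Euclidean distance.
        for i, (x, y) in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: (x - centroids[c][0]) ** 2 + (y - centroids[c][1]) ** 2,
            )
        # Update step: move each centroid to the mean of its cluster.
        for c in range(k):
            cluster = [p for p, lab in zip(points, labels) if lab == c]
            if cluster:
                centroids[c] = (
                    sum(x for x, _ in cluster) / len(cluster),
                    sum(y for _, y in cluster) / len(cluster),
                )
    return centroids, labels

points = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
centroids, labels = kmeans(points, k=2)
# The two tight groups end up in separate clusters.
assert labels[0] == labels[1] and labels[2] == labels[3] and labels[0] != labels[2]
```

The production concern the role actually targets is everything around this loop: packaging, serving, monitoring, and re-training at scale.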

 

Roles and Responsibilities:

Deploying ML models into production, and scaling them to serve millions of customers.

Technical solutioning skills with deep understanding of technical API integrations, AI / Data Science, BigData and public cloud architectures / deployments in a SaaS environment.

Strong stakeholder relationship management skills - able to influence and manage the expectations of senior executives.
Strong networking skills, with the ability to build and maintain strong relationships with business, operations, and technology teams, both internally and externally.

Provide software design and programming support to projects.

 

Qualifications & Experience:

Engineering graduates and postgraduates, preferably in Computer Science from premier institutions, with 5-7 years of proven work experience as a Machine Learning Architect (Deployments) or in a similar role.

 

A global business process management company

Agency job
via Jobdost by Saida Jabbar
Pune, Bengaluru (Bangalore), Chennai, Mumbai, Gurugram, Nashik
5 - 10 yrs
₹20L - ₹22L / yr
Data Science
Kofax
Data Scientist
Machine Learning (ML)
Natural Language Processing (NLP)
+5 more

B1 – Data Scientist  -  Kofax Accredited Developers

 

Requirement – 3

 

Mandatory –

  • Accreditation of Kofax KTA / KTM
  • Experience in Kofax Total Agility Development – 2-3 years minimum
  • Ability to develop and translate functional requirements to design
  • Experience in requirement gathering, analysis, development, testing, documentation, version control, SDLC, Implementation and process orchestration
  • Experience in Kofax Customization, writing Custom Workflow Agents, Custom Modules, Release Scripts
  • Application development using Kofax and KTM modules
  • Good/advanced understanding of Machine Learning/NLP/Statistics
  • Exposure to or understanding of RPA/OCR/Cognitive Capture tools like Appian/UiPath/Automation Anywhere, etc.
  • Excellent communication skills and a collaborative attitude
  • Ability to work with multiple internal teams and stakeholders, such as the Analytics, RPA, Technology, and Project Management teams
  • Good understanding of compliance, data governance, and risk control processes

Total Experience – 7-10 years in the BPO/KPO/ITES/BFSI/Retail/Travel/Utilities/Service industry

Good to have

  • Previous experience working in an Agile & Hybrid delivery environment
  • Knowledge of VB.NET, C# (C-Sharp), SQL Server, and web services

 

Qualification -

  • Master's in Statistics/Mathematics/Economics/Econometrics, or BE/B-Tech, MCA, or MBA

 

AxionConnect Infosolutions Pvt Ltd
Shweta Sharma
Posted by Shweta Sharma
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur, Chennai
5.5 - 7 yrs
₹20L - ₹25L / yr
Django
Flask
Snowflake
Snow flake schema
SQL
+4 more

Job Location: Hyderabad/Bangalore/Chennai/Pune/Nagpur

Notice period: Immediate - 15 days

 

1. Python Developer with Snowflake

 

Job Description :


  1. 5.5+ years of strong Python development experience with Snowflake.
  2. Strong hands-on experience with SQL and the ability to write complex queries.
  3. Strong understanding of how to connect to Snowflake using Python and handle any type of file.
  4. Development of data analysis and data processing engines using Python.
  5. Good experience in data transformation using Python.
  6. Experience in Snowflake data loads using Python.
  7. Experience in creating user-defined functions in Snowflake.
  8. SnowSQL implementation.
  9. Knowledge of query performance tuning is an added advantage.
  10. Good understanding of Data Warehouse (DWH) concepts.
  11. Ability to interpret/analyze business requirements & functional specifications.
  12. Good to have: dbt, Fivetran, and AWS knowledge.
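The data-load step described above is typically a Snowflake PUT of a local file to a stage followed by a COPY INTO. As a hedged sketch (the stage and table names are hypothetical; in practice the statements would be executed through a snowflake-connector-python cursor), a small helper can generate those statements:

```python
def snowflake_load_sql(local_path, stage, table,
                       file_format="(TYPE = CSV SKIP_HEADER = 1)"):
    """Build the PUT + COPY INTO statements for a simple Snowflake CSV load."""
    put = f"PUT file://{local_path} @{stage} AUTO_COMPRESS=TRUE;"
    copy = f"COPY INTO {table} FROM @{stage} FILE_FORMAT = {file_format};"
    return put, copy

put_sql, copy_sql = snowflake_load_sql("/tmp/orders.csv", "my_stage", "ORDERS")
print(put_sql)   # PUT file:///tmp/orders.csv @my_stage AUTO_COMPRESS=TRUE;
print(copy_sql)  # COPY INTO ORDERS FROM @my_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```

Generating SQL as plain strings keeps the load logic testable without a live Snowflake connection.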
iLink Systems

at iLink Systems

1 video
1 recruiter
Ganesh Sooriyamoorthu
Posted by Ganesh Sooriyamoorthu
Chennai, Pune, Noida, Bengaluru (Bangalore)
5 - 15 yrs
₹10L - ₹15L / yr
Apache Kafka
Big Data
Java
Spark
Hadoop
+1 more
  • KSQL
  • Data Engineering spectrum (Java/Spark)
  • Spark Scala / Kafka Streaming
  • Confluent Kafka components
  • Basic understanding of Hadoop
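As a hedged illustration of the kind of windowed aggregation KSQL or Kafka Streams performs (a pure-Python sketch, not the Confluent API), events can be counted per key in tumbling time windows:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (key, window start) the way a KSQL
    `WINDOW TUMBLING` aggregation would, for (timestamp_ms, key) pairs."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms  # align to window boundary
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(100, "a"), (250, "a"), (999, "b"), (1200, "a")]
print(tumbling_window_counts(events, window_ms=1000))
# {('a', 0): 2, ('b', 0): 1, ('a', 1000): 1}
```

In a real Confluent deployment the same grouping would run continuously over a Kafka topic rather than over an in-memory list.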


Ganit Business Solutions

at Ganit Business Solutions

3 recruiters
Viswanath Subramanian
Posted by Viswanath Subramanian
Remote, Chennai, Bengaluru (Bangalore), Mumbai
3 - 7 yrs
₹12L - ₹25L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
R Programming
+5 more

Ganit has flipped the data science value chain: we do not start with a technique; for us, consumption comes first. With this philosophy, we have successfully scaled from a small start-up to a 200-person company with clients in the US, Singapore, Africa, the UAE, and India.

We are looking for experienced data enthusiasts who can make the data talk to them. 

 

You will: 

  • Understand business problems and translate business requirements into technical requirements. 
  • Conduct complex data analysis to ensure data quality & reliability i.e., make the data talk by extracting, preparing, and transforming it. 
  • Identify, develop and implement statistical techniques and algorithms to address business challenges and add value to the organization. 
  • Gather requirements and communicate findings in the form of a meaningful story with the stakeholders  
  • Build & implement data models using predictive modelling techniques. Interact with clients and provide support for queries and delivery adoption. 
  • Lead and mentor data analysts. 

 

We are looking for someone who has: 

 

  • Apart from your love for data and an ability to code even while sleeping, you will need the following.
  • A minimum of 2 years of experience in designing and delivering data science solutions.
  • Successful retail/BFSI/FMCG/Manufacturing/QSR projects in your kitty to show off.
  • Deep understanding of various statistical techniques, mathematical models, and algorithms to start the conversation with the data in hand.
  • Ability to choose the right model for the data and translate it into code using R, Python, VBA, SQL, etc.
  • Bachelor's/Master's degree in Engineering/Technology, an MBA from a Tier-1 B-school, or an MSc in Statistics or Mathematics.

Skillset Required:

  • Regression
  • Classification
  • Predictive Modelling
  • Prescriptive Modelling
  • Python
  • R
  • Descriptive Modelling
  • Time Series
  • Clustering
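As a minimal sketch of the first listed skill, ordinary least-squares regression with one predictor reduces to two closed-form coefficients (pure Python, for illustration only; real work here would use R or scikit-learn):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (single predictor)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Slope: covariance of x,y divided by variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    a = mean_y - b * mean_x  # intercept
    return a, b

a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # data generated by y = 1 + 2x
assert abs(a - 1.0) < 1e-9 and abs(b - 2.0) < 1e-9
```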

What is in it for you: 

 

  • Be a part of building the biggest brand in Data science. 
  • An opportunity to be a part of a young and energetic team with a strong pedigree. 
  • Work on awesome projects across industries and learn from the best in the industry, while growing at a hyper rate. 

 

Please Note:  

 

At Ganit, we are looking for people who love problem solving. You are encouraged to apply even if your experience does not precisely match the job description above. Your passion and skills will stand out and set you apart—especially if your career has taken some extraordinary twists and turns over the years. We welcome diverse perspectives, people who think rigorously and are not afraid to challenge assumptions in a problem. Join us and punch above your weight! 

Ganit is an equal opportunity employer and is committed to providing a work environment that is free from harassment and discrimination. 

All recruitment, selection procedures and decisions will reflect Ganit’s commitment to providing equal opportunity. All potential candidates will be assessed according to their skills, knowledge, qualifications, and capabilities. No regard will be given to factors such as age, gender, marital status, race, religion, physical impairment, or political opinions. 

Kaleidofin

at Kaleidofin

3 recruiters
Poornima B
Posted by Poornima B
Chennai, Bengaluru (Bangalore)
2 - 4 yrs
Best in industry
PowerBI
Business Intelligence (BI)
Python
Tableau
SQL
+1 more
We are looking for a developer to design and deliver strategic data-centric insights leveraging next-generation analytics and BI technologies. We want someone who is data- and insight-centric rather than report-centric. We are looking for someone who wants to make an impact by enabling innovation and growth; someone with passion for what they do and a vision for the future.

Responsibilities:
  • Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
  • Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
  • Become an expert on data and trends, both internal and external to Kaleidofin.
  • Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
  • Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
  • Automate scheduling and distribution of reports and support auditing and value realization.
  • Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
  • Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks.
Requirements:
  • Experience leading development efforts through all phases of SDLC.
  • 2+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
  • Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
  • Hands on experience in SQL, data management, and scripting (preferably Python).
  • Strong data visualisation design skills, data modeling and inference skills.
  • Hands-on and experience in managing small teams.
  • Financial services experience preferred, but not mandatory.
  • Strong knowledge of architectural principles, tools, frameworks, and best practices.
  • Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
  • Candidates with a notice period of less than 30 days preferred.
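Dashboarding work of this kind usually starts from a metric query. As a hedged sketch (the table and column names are hypothetical), computing a simple monthly business metric with the stdlib sqlite3 module:

```python
import sqlite3

# Hypothetical loans table; in practice this would live in the BI warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?)",
    [("2024-01", 100.0), ("2024-01", 250.0), ("2024-02", 400.0)],
)

# Monthly disbursal totals -- the kind of metric a dashboard tile would show.
rows = conn.execute(
    "SELECT month, SUM(amount) FROM loans GROUP BY month ORDER BY month"
).fetchall()
print(rows)  # [('2024-01', 350.0), ('2024-02', 400.0)]
```

The same query would feed a QuickSight/Power BI/Tableau tile once scheduled and automated, as the responsibilities above describe.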
Virtusa

at Virtusa

2 recruiters
Priyanka Sathiyamoorthi
Posted by Priyanka Sathiyamoorthi
Chennai
11 - 15 yrs
₹15L - ₹33L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more

We are looking for a Big Data Engineer with Java for the Chennai location.

Location: Chennai

Experience: 11 to 15 years



Job description

Required Skill:

1. Candidate should have a minimum of 7 years of total experience.

2. Candidate should have a minimum of 4 years of experience in Big Data design and development.

3. Candidate should have experience in Java, Spark, Hive & Hadoop, and Python.

4. Candidate should have experience in any RDBMS.

Roles & Responsibility:

1. To create work plans, and monitor and track the work schedule for on-time delivery as per the defined quality standards.

2. To develop and guide the team members in enhancing their technical capabilities and increasing productivity.

3. To ensure process improvement and compliance in the assigned module, and participate in technical discussions or reviews.

4. To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations.


Regards,

Priyanka S


Rudhra Info Solutions

at Rudhra Info Solutions

1 recruiter
Monica Devi
Posted by Monica Devi
Bengaluru (Bangalore), Chennai
5 - 6 yrs
₹7L - ₹15L / yr
Data engineering
Python
Django
SQL
  • Analyze and organize raw data 
  • Build data systems and pipelines
  • Evaluate business needs and objectives
  • Interpret trends and patterns
  • Conduct complex data analysis and report on results 
  • Build algorithms and prototypes
  • Combine raw information from different sources
  • Explore ways to enhance data quality and reliability
  • Identify opportunities for data acquisition
  • Should be a senior developer with experience in Python and Django microservices, with a Financial Services/Investment Banking background.
  • Develop analytical tools and programs
  • Collaborate with data scientists and architects on several projects
  • Should have 5+ years of experience as a data engineer or in a similar role
  • Technical expertise with data models, data mining, and segmentation techniques
  • Should have experience with programming languages such as Python
  • Hands-on experience with SQL database design
  • Great numerical and analytical skills
  • Degree in Computer Science, IT, or similar field; a Master’s is a plus
  • Data engineering certification (e.g. IBM Certified Data Engineer) is a plus
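The "combine raw information from different sources" duty above is, at its smallest, a keyed merge. A hedged, dependency-free sketch (source names and fields are hypothetical):

```python
def combine_by_key(primary, secondary, key="id"):
    """Merge records from two raw sources on a shared key,
    keeping primary-source values on field conflicts."""
    index = {rec[key]: rec for rec in secondary}
    combined = []
    for rec in primary:
        merged = {**index.get(rec[key], {}), **rec}  # primary wins
        combined.append(merged)
    return combined

crm = [{"id": 1, "name": "Asha"}]
billing = [{"id": 1, "plan": "gold"}, {"id": 2, "plan": "silver"}]
print(combine_by_key(crm, billing))  # [{'id': 1, 'plan': 'gold', 'name': 'Asha'}]
```

In a real pipeline the same join would run in SQL or a Django data-access layer over much larger extracts.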
Telecom Client

Agency job
via Eurka IT SOL by Srikanth a
Chennai
5 - 13 yrs
₹9L - ₹28L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more
  • Demonstrable experience owning and developing big data solutions using Hadoop, Hive/HBase, Spark, Databricks, and ETL/ELT for 5+ years
  • 10+ years of Information Technology experience, preferably with telecom/wireless service providers
  • Experience in designing data solutions following Agile practices (SAFe methodology); designing for testability, deployability, and releasability; rapid prototyping, data modeling, and decentralized innovation
  • DataOps mindset: allowing the architecture of a system to evolve continuously over time, while simultaneously supporting the needs of current users
  • Create and maintain the Architectural Runway and Non-Functional Requirements
  • Design for the Continuous Delivery Pipeline (CI/CD data pipeline), enabling built-in quality & security from the start
  • Ability to demonstrate an understanding, and ideally use, of at least one recognised architecture framework or standard, e.g. TOGAF, the Zachman Architecture Framework, etc.
  • Ability to apply data, research, and professional judgment and experience to ensure our products are making the biggest difference to consumers
  • Demonstrated ability to work collaboratively
  • Excellent written, verbal, and social skills - you will be interacting with all types of people (user experience designers, developers, managers, marketers, etc.)
  • Ability to work in a fast-paced, multiple-project environment independently and with minimal supervision
  • Technologies: .NET, AWS, Azure; Azure Synapse, NiFi, RDS, Apache Kafka, Azure Databricks, Azure Data Lake Storage, Power BI, Reporting Analytics, QlikView, SQL on-prem data warehouse; BSS, OSS & Enterprise Support Systems

VIMANA

at VIMANA

4 recruiters
Loshy Chandran
Posted by Loshy Chandran
Remote, Chennai
2 - 5 yrs
₹10L - ₹20L / yr
Data engineering
Data Engineer
Apache Kafka
Big Data
Java
+4 more

We are looking for passionate, talented and super-smart engineers to join our product development team. If you are someone who innovates, loves solving hard problems, and enjoys end-to-end product development, then this job is for you! You will be working with some of the best developers in the industry in a self-organising, agile environment where talent is valued over job title or years of experience.

 

Responsibilities:

  • You will be involved in end-to-end development of VIMANA technology, adhering to our development practices and expected quality standards.
  • You will be part of a highly collaborative Agile team which passionately follows SAFe Agile practices, including pair-programming, PR reviews, TDD, and Continuous Integration/Delivery (CI/CD).
  • You will be working with cutting-edge technologies and tools for stream processing using Java, NodeJS and Python, using frameworks like Spring, RxJS etc.
  • You will be leveraging big data technologies like Kafka, Elasticsearch and Spark, processing more than 10 Billion events per day to build a maintainable system at scale.
  • You will be building Domain Driven APIs as part of a micro-service architecture.
  • You will be part of a DevOps culture where you will get to work with production systems, including operations, deployment, and maintenance.
  • You will have an opportunity to continuously grow and build your capabilities, learning new technologies, languages, and platforms.

 

Requirements:

  • Undergraduate degree in Computer Science or a related field, or equivalent practical experience.
  • 2 to 5 years of product development experience.
  • Experience building applications using Java, NodeJS, or Python.
  • Deep knowledge in Object-Oriented Design Principles, Data Structures, Dependency Management, and Algorithms.
  • Working knowledge of message queuing, stream processing, and highly scalable Big Data technologies.
  • Experience in working with Agile software methodologies (XP, Scrum, Kanban), TDD and Continuous Integration (CI/CD).
  • Experience using NoSQL databases like MongoDB or Elasticsearch.
  • Prior experience with container orchestrators like Kubernetes is a plus.
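A hedged, framework-free sketch of the stream-processing pattern the role describes (the real system runs on Kafka/Spark at billions of events per day; this only shows per-key running aggregation over an event stream):

```python
from collections import defaultdict

def running_average(stream):
    """Consume (device_id, value) events and yield the running
    average per device after each event, Kafka-Streams style."""
    totals = defaultdict(lambda: [0.0, 0])  # device -> [sum, count]
    for device, value in stream:
        acc = totals[device]
        acc[0] += value
        acc[1] += 1
        yield device, acc[0] / acc[1]

events = [("m1", 10.0), ("m2", 4.0), ("m1", 20.0)]
print(list(running_average(events)))
# [('m1', 10.0), ('m2', 4.0), ('m1', 15.0)]
```

Writing the aggregation as a generator mirrors how a stream processor maintains per-key state while events flow through.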
About VIMANA

We build products and platforms for the Industrial Internet of Things. Our technology is being used around the world in mission-critical applications - from improving the performance of manufacturing plants, to making electric vehicles safer and more efficient, to making industrial equipment smarter.

Please visit https://govimana.com/ to learn more about what we do.

Why Explore a Career at VIMANA
  • We recognize that our dedicated team members make us successful and we offer competitive salaries.
  • We are a workplace that values work-life balance, provides flexible working hours, and full time remote work options.
  • You will be part of a team that is highly motivated to learn and work on cutting edge technologies, tools, and development practices.
  • Bon Appetit! Enjoy catered breakfasts, lunches and free snacks!

VIMANA Interview Process
We usually aim to complete all interviews within a week and provide prompt feedback to the candidate. As of now, all interviews are conducted online due to the COVID situation.

1. Telephonic screening (30 min)

A 30-minute telephonic interview to understand and evaluate the candidate's fit with the job role and the company.
Clarify any queries regarding the job/company.
Give an overview of further interview rounds.

2. Technical Rounds

This is a deep technical round to evaluate the candidate's technical capability pertaining to the job role.

3. HR Round

The candidate's team and cultural fit will be evaluated during this round.

We would proceed with releasing the offer if the candidate clears all the above rounds.

Note: In certain cases, we might schedule additional rounds if needed before releasing the offer.
Maveric Systems

at Maveric Systems

3 recruiters
Rashmi Poovaiah
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytics platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing, and challenging work environment, where you will play an important role in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience, including 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL & Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
  • Proven experience in coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of SCRUM discussions and gather requirements
  • Adhere to the SCRUM timeline and deliver accordingly
  • Participate in a team environment for design, development, and implementation
  • Should take on L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g. BA team, testing and business teams) and any other relevant stakeholders