11+ MLS Jobs in Bangalore (Bengaluru) | MLS Job openings in Bangalore (Bengaluru)
● Object-oriented languages (e.g. Python/PySpark, Java, C#, C++) and frameworks (e.g. J2EE or .NET)
● Proficiency in Linux.
● Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
● Must have SQL knowledge and experience working with relational databases and query authoring, as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
● Must have experience with Python/Scala.
● Must have experience with Big Data technologies like Apache Spark.
● Must have experience with Apache Airflow.
● Experience with data pipelines and ETL tools like AWS Glue (a minimal sketch of such a pipeline follows this list).
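As a rough illustration of the pipeline skills listed above, here is a minimal Airflow DAG sketch; the DAG name, task names, and schedule are hypothetical, and the extract/load bodies are left as placeholders rather than any real implementation.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw rows from a source system (hypothetical step).
    ...


def load_to_redshift():
    # Placeholder: copy transformed data into Redshift (hypothetical step).
    ...


with DAG(
    dag_id="daily_orders_etl",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_redshift)
    extract >> load                   # run extract before load
```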
at Vola Finance
Roles & Responsibilities
Basic Qualifications:
● The position requires a four-year degree from an accredited college or university.
● Three years of data engineering, AWS architecture, and security experience.
Top candidates will also have:
A proven, strong understanding of and/or experience in many of the following:
● Experience designing scalable AWS architectures.
● Ability to create modern data pipelines and data processing using AWS PaaS components (Glue, etc.) or open-source tools (Spark, HBase, Hive, etc.).
● Ability to develop SQL structures that support high volumes and scalability using an RDBMS such as SQL Server, MySQL, or Aurora.
● Ability to model and design modern data structures, SQL/NoSQL databases, data lakes, and cloud data warehouses.
● Experience creating network architectures for secure, scalable solutions.
● Experience with message brokers such as Kinesis, Kafka, RabbitMQ, AWS SQS, AWS SNS, and Apache ActiveMQ (see the sketch after this list).
● Hands-on experience with AWS serverless components such as Glue, Lambda, Redshift, etc.
● Working knowledge of load balancers, AWS Shield, AWS GuardDuty, VPCs, subnets, network gateways, Route 53, etc.
● Knowledge of building disaster recovery systems and security log notification systems.
● Knowledge of building scalable microservice architectures with AWS.
● Ability to create a framework for monthly security checks, plus broad knowledge of AWS services.
● Experience deploying software using CI/CD tools such as CircleCI, Jenkins, etc.
● ML/AI model deployment and production maintenance experience is mandatory.
● Experience with REST APIs and tools such as Swagger, Postman, and Assertible.
● Version control tools such as GitHub, Bitbucket, and GitLab.
● Debugging and maintaining software in Linux or Unix platforms.
● Test-driven development.
● Experience building transactional databases.
● Python and PySpark programming experience.
● Must have experience engineering solutions in AWS.
● Working AWS experience; AWS certification is required prior to hiring.
● Experience working in Agile/Kanban frameworks.
● Must demonstrate solid knowledge of computer science fundamentals like data structures & algorithms.
● Passion for technology and an eagerness to contribute to a team-oriented environment.
● Demonstrated leadership on medium to large-scale projects impacting strategic priorities.
● Bachelor’s degree in Computer Science, Electrical Engineering, or a related field is required.
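For the message-broker item above, here is a minimal, hedged sketch of producing and consuming messages with AWS SQS via boto3; the queue URL, region, and message body are placeholders, and it assumes AWS credentials are already configured.

```python
import boto3

sqs = boto3.client("sqs", region_name="ap-south-1")
queue_url = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"  # hypothetical

# Producer: enqueue a message.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"event": "card_application"}')

# Consumer: long-poll for messages and delete each one after processing.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```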
Job Title: Head of Analytics
Job Location: Bangalore (on-site)
About Qrata:
Qrata matches top talent with global career opportunities at the world’s leading digital companies, including some of the world’s fastest-growing start-ups, through Qrata’s talent marketplaces. To sign up, please visit Qrata Talent Sign-Up.
We are currently scouting for a Head of Analytics.
Our Client Story:
Our client was founded by a team of seasoned bankers with over 120 years of collective experience in banking, financial services, and cards, encompassing strategy, operations, marketing, risk, and technology, both in India and internationally.
We offer a credit card processing solution that helps banks manage their credit card portfolios end-to-end. These solutions are customized to meet the unique strategic, operational, and compliance requirements of each bank.
1. Card programs built for everyone: limit assignment based on customer risk assessment and credit profiles, including secured cards
2. Cards that can be used everywhere: through POS machines, UPI, and e-commerce websites
3. A card for everything: enable customer purchases, both large and small
4. Customized card configurations: restrict usage based on merchant codes, location, amount limits, etc.
5. End-to-end support: we undertake complete customer life-cycle management, from KYC checks and onboarding to risk profiling, fraud control, billing, and collections
6. Rewards program management: we manage the entire card rewards and customer loyalty programs for you
What you will do:
We are seeking an experienced individual for the role of Head of Analytics. As the Head of Analytics, you will be responsible for driving data-driven decision-making, implementing advanced analytics strategies, and providing valuable insights to optimize our credit card business across operations, sales and marketing, risk management, and customer experience. Your expertise in statistical analysis, predictive modelling, and data visualization will be instrumental in driving growth and enhancing the overall performance of our credit card business.
Qualification:
- Bachelor's or master’s degree in Technology, Mathematics, Statistics, Economics, Computer Science, or a related field
- Proven experience (7+ years) in leading analytics teams in the credit card industry
- Strong expertise in statistical analysis, predictive modelling, data mining, and segmentation techniques
- Proficiency in data manipulation and analysis using programming languages such as Python, R, or SQL
- Experience with analytics tools such as SAS, SPSS, or Tableau
- Excellent leadership and team management skills, with a track record of building and developing high-performing teams
- Strong knowledge of credit card business and understanding of credit card industry dynamics, including risk management, marketing, and customer lifecycle
- Exceptional communication and presentation skills, with the ability to effectively communicate complex information to a varied audience
What you can expect:
1. Develop and implement Analytics Strategy:
o Define the analytics roadmap for the credit card business, aligning it with overall business objectives
o Identify key performance indicators (KPIs) and metrics to track the performance of the credit card business
o Collaborate with senior management and cross-functional teams to prioritize and execute analytics initiatives
2. Lead Data Analysis and Insights:
o Conduct in-depth analysis of credit card data, customer behaviour, and market trends to identify opportunities for business growth and risk mitigation
o Develop predictive models and algorithms to assess credit risk, customer segmentation, acquisition, retention, and upsell opportunities (a minimal modelling sketch follows this list)
o Generate actionable insights and recommendations based on data analysis to optimize credit card product offerings, pricing, and marketing strategies
o Regularly present findings and recommendations to senior leadership, using data visualization techniques to effectively communicate complex information
3. Drive Data Governance and Quality:
o Oversee data governance initiatives, ensuring data accuracy, consistency, and integrity across relevant systems and platforms
o Collaborate with IT teams to optimize data collection, integration, and storage processes to support advanced analytics capabilities
o Establish and enforce data privacy and security protocols to comply with regulatory requirements
4. Team Leadership and Collaboration:
o Build and manage a high-performing analytics team, fostering a culture of innovation, collaboration, and continuous learning
o Provide guidance and mentorship to the team, promoting professional growth and development
o Collaborate with stakeholders across departments, including Marketing, Risk Management, and Finance, to align analytics initiatives with business objectives
5. Stay Updated on Industry Trends:
o Keep abreast of emerging trends, techniques, and technologies in analytics, credit card business, and the financial industry
o Leverage industry best practices to drive innovation and continuous improvement in analytics methodologies and tools
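As a purely illustrative sketch of the predictive modelling this role describes, the following toy example fits a logistic-regression credit-risk scorer on synthetic data; the features, data, and evaluation are assumptions for illustration, not the client's actual models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1_000, 3))  # hypothetical features, e.g. utilization, tenure, delinquencies
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1_000) > 0).astype(int)  # synthetic default flag

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# AUC is a common discrimination metric for credit-risk scorecards.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```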
What are we looking for:
- Strong experience in MySQL and writing advanced queries
- Strong experience in Bash and Python
- Familiarity with Elasticsearch, Redis, Java, Node.js, ClickHouse, and S3
- Exposure to cloud services such as AWS, Azure, or GCP
- 2+ years of experience in production support
- Strong experience with log management and performance monitoring stacks such as ELK, Prometheus + Grafana, and the logging services of the major cloud platforms
- Strong understanding of Linux distributions such as Ubuntu, CentOS, and Red Hat
- Interest in learning new languages/frameworks as needed
- Good written and oral communications skills
- A growth mindset and a passion for building things from the ground up; most importantly, you should be fun to work with
As a product solutions engineer, you will:
- Analyze recorded runtime issues, diagnose and do occasional code fixes of low to medium complexity
- Work with developers to find and correct more complex issues
- Address urgent issues quickly, work within and measure against customer SLAs
- Write shell and Python scripts to automate manual/repetitive activities
- Build anomaly detectors wherever applicable (see the sketch after this list)
- Relay articulated customer feedback to the development and product teams
- Maintain an ongoing record of problem analysis and resolution in an on-call monitoring system
- Offer technical support needed in development
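As a hedged sketch of the anomaly-detector item above, the following flags metric values that deviate sharply from a rolling baseline; the window size, threshold, and example series are illustrative assumptions.

```python
import numpy as np


def detect_anomalies(values, window=30, threshold=3.0):
    """Flag indices whose value deviates sharply from the preceding window."""
    values = np.asarray(values, dtype=float)
    anomalies = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma == 0:
            # Flat baseline: any change at all is anomalous.
            if values[i] != mu:
                anomalies.append(i)
        elif abs(values[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies


# Example: a steady latency series with one injected spike at index 50.
series = [100] * 50 + [500] + [100] * 10
print(detect_anomalies(series))  # -> [50]
```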
- Develop telemetry software to connect Junos devices to the cloud
- Rapidly prototype and lay the software foundation for product solutions
- Move prototype solutions to a production-grade, multi-tenant cloud SaaS solution
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources (a PySpark sketch follows this list)
- Build analytics tools that utilize the data pipeline to provide significant insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
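The ETL responsibilities above might look something like this minimal PySpark sketch; the S3 paths, event schema, and column names are hypothetical placeholders, not a real data model.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw event data (path is illustrative).
events = spark.read.json("s3://example-bucket/raw/events/")

# Transform: aggregate daily signups per channel.
daily = (
    events.filter(F.col("event_type") == "signup")
          .groupBy(F.to_date("timestamp").alias("day"), "channel")
          .count()
)

# Load: write the results back as partitioned Parquet.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://example-bucket/marts/daily_signups/"
)
```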
Qualification and Desired Experiences
- Master's degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field, with a strong mathematical background
- 5+ years of experience building data pipelines for data science-driven solutions
- Strong hands-on coding skills (preferably in Python) for processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- A good team player with excellent interpersonal, written, verbal, and presentation skills
- Create and maintain optimal data pipeline architecture
- Assemble large, sophisticated data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience with AWS, S3, Flink, Spark, Kafka, and Elasticsearch
- Previous work in a start-up environment
- 3+ years of experience building data pipelines for data science-driven solutions
- Master's degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field, with a strong mathematical background
- We are looking for a candidate with 9+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, and who has experience with the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Strong hands-on coding skills (preferably in Python) for processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Proven understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and interpersonal skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Experience with relational SQL & NoSQL databases including MySQL & MongoDB.
- Familiar with the basic principles of distributed computing and data modeling.
- Experience with distributed data pipeline frameworks like Celery, Apache Airflow, etc.
- Experience with NLP and NER models is a bonus.
- Experience building reusable code and libraries for future use.
- Experience building REST APIs (a minimal sketch follows below).
Preference for candidates working in tech product companies
Web scraping and problem-solving skills
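For the REST API item above, here is a minimal sketch using FastAPI (chosen for illustration; the listing names no framework); the routes, payload schema, and in-memory store are hypothetical.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    # Hypothetical payload schema for illustration only.
    name: str
    price: float


items: dict[int, Item] = {}  # in-memory store, just for the sketch


@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item):
    items[item_id] = item
    return {"item_id": item_id, "item": item}


@app.get("/items/{item_id}")
def read_item(item_id: int):
    return items.get(item_id, {"error": "not found"})

# Run with: uvicorn main:app --reload  (assuming this file is named main.py)
```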
- 1-5 years of experience building and maintaining robust data pipelines, enriching data, and building low-latency, high-performance data analytics applications.
- Experience handling complex, high-volume, multi-dimensional data and architecting data products on streaming, serverless, and microservices-based architectures and platforms.
- Experience in Data warehousing, Data modeling, and Data architecture.
- Expert-level proficiency with relational and NoSQL databases.
- Expert-level proficiency in Python and PySpark.
- Familiarity with Big Data technologies and utilities (Spark, Hive, Kafka, Airflow).
- Familiarity with cloud services (preferably AWS).
- Familiarity with MLOps processes such as data labeling, model deployment, the data-model feedback loop, and data drift (a drift-check sketch follows this list).
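As an illustrative sketch of the data-drift monitoring mentioned above, the following compares a feature's reference distribution against live data with a two-sample Kolmogorov-Smirnov test; the synthetic distributions and the 0.05 threshold are assumptions, not a prescription.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)  # reference (training) data
live_feature = rng.normal(loc=0.3, scale=1.0, size=5_000)   # shifted "live" data

# Two-sample KS test: a small p-value suggests the distributions differ.
stat, p_value = ks_2samp(train_feature, live_feature)
if p_value < 0.05:
    print(f"Possible drift detected (KS={stat:.3f}, p={p_value:.2e})")
else:
    print("No significant drift detected")
```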
Key Roles/Responsibilities:
- Act as a technical leader in resolving problems, working with both technical and non-technical audiences.
- Identify and solve issues with data pipelines regarding consistency, integrity, and completeness.
- Lead data initiatives, architecture design discussions, and implementation of next-generation BI solutions.
- Partner with data scientists and tech architects to build advanced, scalable, efficient self-service BI infrastructure.
- Provide thought leadership and mentor data engineers in information presentation and delivery.