
11+ MSTR Jobs in India

Apply to 11+ MSTR Jobs on CutShort.io. Find your next job, effortlessly. Browse MSTR Jobs and apply today!

Latent Bridge Pvt Ltd


6 recruiters
Posted by Mansoor Khan
Remote only
3 - 7 yrs
₹5L - ₹20L / yr
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

Familiar with the MicroStrategy architecture; Admin Certification preferred

· Familiar with administrative functions using Object Manager and Command Manager, installation/configuration of MSTR in a clustered architecture, and applying patches and hot-fixes

· Monitor and manage existing Business Intelligence development/production systems

· MicroStrategy installation, upgrade, and administration on Windows and Linux platforms

· Ability to support and administer multi-tenant MicroStrategy infrastructure, including server security troubleshooting and general system maintenance

· Analyze application and system logs during troubleshooting and root cause analysis

· Work on operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration

· Monitor, report on, and investigate solutions to improve report performance

· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting

· Provide support for the platform, report execution and implementation, the user community, and data investigations

· Identify improvement areas in environment hosting and upgrade processes

· Identify automation opportunities and participate in automation implementations (a minimal scripting sketch follows this list)

· Provide on-call support for Business Intelligence issues

· Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features like Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.

· Familiar with AWS and Linux scripting

· Knowledge of MSTR Mobile

· Knowledge of capacity planning and system scaling needs
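For the automation bullet above, the following is a minimal Python sketch of driving MicroStrategy Command Manager from a script so that routine audits (users, schedules) can run unattended. It is a sketch under assumptions, not a definitive recipe: it assumes a cmdmgr command-line executable on the PATH that accepts -n (project source), -u/-p (credentials), -f (input script), and -o (results file), and uses a simple LIST ALL USERS; statement. Verify the executable name, flags, and script syntax against your own installation's documentation.

import subprocess
import tempfile
from pathlib import Path

# Example Command Manager statement; replace with your own audit script.
SCRIPT_TEXT = "LIST ALL USERS;\n"

def run_cmdmgr(project_source: str, user: str, password: str) -> str:
    """Run a Command Manager script non-interactively and return its output."""
    with tempfile.TemporaryDirectory() as tmp:
        script_file = Path(tmp) / "audit.scp"
        results_file = Path(tmp) / "results.log"
        script_file.write_text(SCRIPT_TEXT)
        # Assumed CLI flags; check your MicroStrategy version's documentation.
        subprocess.run(
            ["cmdmgr",
             "-n", project_source,
             "-u", user,
             "-p", password,
             "-f", str(script_file),
             "-o", str(results_file)],
            check=True,
        )
        return results_file.read_text()

if __name__ == "__main__":
    print(run_cmdmgr("MicroStrategy Analytics Modules", "administrator", ""))

A wrapper like this can be scheduled (cron or Task Scheduler) so the results feed into routine platform monitoring.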

Vola Finance
Bengaluru (Bangalore)
3yrs+
Up to ₹20L / yr (varies)
Amazon Web Services (AWS)
Data engineering
Spark
SQL
Data Warehouse (DWH)

Lightning Job By Cutshort⚡

 

As part of this feature, you can expect status updates about your application and replies within 72 hours (once the screening questions are answered)


Roles & Responsibilities


Basic Qualifications:

● The position requires a four-year degree from an accredited college or university.

● Three years of data engineering / AWS Architecture and security experience.


Top candidates will also have:

Proven, strong understanding of and/or experience in many of the following:

● Experience designing scalable AWS architectures.

● Ability to create modern data pipelines and data processing using AWS PaaS components (Glue, etc.) or open-source tools (Spark, HBase, Hive, etc.) (see the PySpark sketch after this list).

● Ability to develop SQL structures that support high volumes and scalability using an RDBMS such as SQL Server, MySQL, Aurora, etc.

● Ability to model and design modern data structures, SQL/NoSQL databases, data lakes, and cloud data warehouses.

● Experience creating network architectures for secure, scalable solutions.

● Experience with message brokers such as Kinesis, Kafka, RabbitMQ, AWS SQS, AWS SNS, and Apache ActiveMQ. Hands-on experience with AWS serverless services such as Glue, Lambda, Redshift, etc.

● Working knowledge of load balancers, AWS Shield, AWS GuardDuty, VPCs, subnets, network gateways, Route 53, etc.

● Knowledge of building disaster management systems and security-log notification systems.

● Knowledge of building scalable microservice architectures with AWS.

● Ability to create a framework for monthly security checks; broad knowledge of AWS services.

● Experience deploying software using CI/CD tools such as CircleCI, Jenkins, etc.

● ML/AI model deployment and production maintenance experience is mandatory.

● Experience with API tools such as REST, Swagger, Postman, and Assertible.

● Version management tools such as GitHub, Bitbucket, GitLab.

● Experience debugging and maintaining software on Linux or Unix platforms.

● Test-driven development.

● Experience building transactional databases.

● Python and PySpark programming experience.

● Must have experience engineering solutions in AWS.

● Working AWS experience; AWS certification is required prior to hiring.

● Experience working in an Agile/Kanban framework.

● Must demonstrate solid knowledge of computer science fundamentals like data structures and algorithms.

● Passion for technology and an eagerness to contribute to a team-oriented environment.

● Demonstrated leadership on medium to large-scale projects impacting strategic priorities.

● Bachelor’s degree in Computer Science, Electrical Engineering, or a related field is required.
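For the data-pipeline bullet above, here is a minimal PySpark sketch of the kind of batch processing it describes: read raw JSON events from S3, clean and aggregate them, and write a partitioned Parquet table back to S3. The bucket names, paths, and column names are hypothetical placeholders, and the job assumes a Spark environment (for example Glue or EMR) already configured with S3 access; it is an illustration, not the employer's actual pipeline.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

# Placeholder input location; on AWS Glue/EMR this would be your raw zone.
raw = spark.read.json("s3://example-raw-bucket/events/")

# Basic cleaning and a daily aggregate per event type.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(
           F.count("*").alias("events"),
           F.countDistinct("user_id").alias("unique_users"),
       )
)

# Placeholder output location; partitioning by date keeps downstream scans cheap.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/daily_events/"))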

6sense


15 recruiters
Posted by Romesh Rawat
Remote only
5 - 8 yrs
₹30L - ₹45L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark

About Slintel (a 6sense company) :

Slintel, a 6sense company, is the leader in capturing technographics-powered buying intent. It helps companies uncover the 3% of active buyers in their target market. Slintel evaluates over 100 billion data points and analyzes factors such as buyer journeys, technology adoption patterns, and other digital footprints to deliver market & sales intelligence.

Slintel's customers have access to the buying patterns and contact information of more than 17 million companies and 250 million decision makers across the world.

Slintel is a fast-growing B2B SaaS company in the sales and marketing tech space. We are funded by top-tier VCs and are going after a billion-dollar opportunity. At Slintel, we are building a sales development automation platform that can significantly improve outcomes for sales teams, while reducing the number of hours spent on research and outreach.

We are a big data company and perform deep analysis on technology buying patterns and buyer pain points to understand where buyers are in their journey. Over 100 billion data points are analyzed every week to derive recommendations on where companies should focus their marketing and sales efforts. Third-party intent signals are then combined with first-party data from CRMs to derive meaningful recommendations on whom to target on any given day.

6sense is headquartered in San Francisco, CA and has 8 office locations across 4 countries.

6sense, an account engagement platform, secured $200 million in a Series E funding round, bringing its total valuation to $5.2 billion 10 months after its $125 million Series D round. The investment was co-led by Blue Owl and MSD Partners, among other new and existing investors.

Linkedin (Slintel) : https://www.linkedin.com/company/slintel/

Industry : Software Development

Company size : 51-200 employees (189 on LinkedIn)

Headquarters : Mountain View, California

Founded : 2016

Specialties : Technographics, lead intelligence, Sales Intelligence, Company Data, and Lead Data.

Website (Slintel) : https://www.slintel.com/slintel

Linkedin (6sense) : https://www.linkedin.com/company/6sense/

Industry : Software Development

Company size : 501-1,000 employees (937 on LinkedIn)

Headquarters : San Francisco, California

Founded : 2013

Specialties : Predictive intelligence, Predictive marketing, B2B marketing, and Predictive sales

Website (6sense) : https://6sense.com/

Acquisition News : 

https://inc42.com/buzz/us-based-based-6sense-acquires-b2b-buyer-intelligence-startup-slintel/ 

Funding Details & News :

Slintel funding : https://www.crunchbase.com/organization/slintel

6sense funding : https://www.crunchbase.com/organization/6sense

https://www.nasdaq.com/articles/ai-software-firm-6sense-valued-at-%245.2-bln-after-softbank-joins-funding-round

https://www.bloomberg.com/news/articles/2022-01-20/6sense-reaches-5-2-billion-value-with-softbank-joining-round

https://xipometer.com/en/company/6sense

Slintel & 6sense Customers :

https://www.featuredcustomers.com/vendor/slintel/customers

https://www.featuredcustomers.com/vendor/6sense/customers

About the job

Responsibilities

  • Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for Data Lake/Data Warehouse
  • Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs
  • Assemble large, complex data sets from third-party vendors to meet business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technologies
  • Streamline existing and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems

Requirements

  • 3+ years of experience in a Data Engineer role
  • Proficiency in Linux
  • Must have SQL knowledge and experience working with relational databases and query authoring (SQL), as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena
  • Must have experience with Python/Scala
  • Must have experience with Big Data technologies like Apache Spark
  • Must have experience with Apache Airflow (a minimal DAG sketch follows this list)
  • Experience with data pipeline and ETL tools like AWS Glue
  • Experience working with AWS cloud services (EC2, S3, RDS, Redshift) and other data solutions, e.g. Databricks, Snowflake
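For the Apache Airflow requirement above, here is a minimal DAG sketch showing how a daily extract-transform-load flow is typically wired together. It assumes Airflow 2.x; the DAG id, schedule, and task bodies are illustrative placeholders rather than an actual production pipeline.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull data from the source systems")

def transform(**context):
    print("clean and aggregate the extracted data")

def load(**context):
    print("load curated data into the warehouse")

with DAG(
    dag_id="example_daily_etl",          # placeholder name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract, then transform, then load.
    t_extract >> t_transform >> t_load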

 

Desired Skills and Experience

Python, SQL, Scala, Spark, ETL

 

Top 3 Fintech Startup

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
6 - 9 yrs
₹16L - ₹24L / yr
SQL
Amazon Web Services (AWS)
Spark
PySpark
Apache Hive

We are looking for an exceptionally talented Lead Data Engineer with exposure to implementing AWS services to build data pipelines, API integrations, and data warehouse designs. A candidate with both hands-on and leadership capabilities will be ideal for this position.

 

Qualification: At least a bachelor’s degree in Science, Engineering, or Applied Mathematics; a master’s degree is preferred.

 

Job Responsibilities:

• Total 6+ years of experience as a Data Engineer and 2+ years of experience in managing a team

• Have a minimum of 3 years of AWS Cloud experience.

• Well versed in languages such as Python, PySpark, SQL, NodeJS, etc.

• Has extensive experience in the real-time Spark ecosystem and has worked on both real-time and batch processing (a minimal streaming sketch follows this list)

• Have experience in AWS Glue, EMR, DMS, Lambda, S3, DynamoDB, Step Functions, Airflow, RDS, Aurora, etc.

• Experience with modern database systems such as Redshift, Presto, Hive, etc.

• Worked on building data lakes in the past on S3 or Apache Hudi

• Solid understanding of data warehousing concepts

• Good to have experience with tools such as Kafka or Kinesis

• Good to have AWS Developer Associate or Solutions Architect Associate Certification

• Have experience in managing a team
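For the real-time Spark point above, here is a minimal Spark Structured Streaming sketch: consume a Kafka topic, count events per one-minute window, and print the running aggregate to the console. The broker address and topic name are hypothetical placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster; it illustrates the pattern rather than any specific production workload.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Placeholder broker and topic; requires the spark-sql-kafka connector.
orders = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "localhost:9092")
         .option("subscribe", "orders")
         .load()
)

# Count messages per one-minute event-time window.
per_minute = (
    orders.select(F.col("timestamp"))
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

query = (per_minute.writeStream
                   .outputMode("complete")
                   .format("console")
                   .start())
query.awaitTermination()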

Standard Rail Technologies
Posted by Vishal Bharti
Bengaluru (Bangalore), Pune
9 - 15 yrs
₹10L - ₹15L / yr
IT infrastructure
DevOps
Computer Networking
Data center
Amazon Web Services (AWS)

JOB DESCRIPTION

  • The IT team lead must guide and manage DevOps engineers, cloud system administrators, and desktop support analysts, and also assist in procuring and managing assets.
  • Design and develop a scalable IT infrastructure that benefits the organization.
  • Take part in IT strategic planning activities that reflect the future vision of the organization.
  • Introduce cost-effective best practices related to the business needs of the organization.
  • Research and recommend solutions that circumvent potential technical issues.
  • Provide high levels of customer service as it pertains to enterprise infrastructure.
  • Review and document key performance metrics and indicators to ensure high performance of IT service delivery systems.
  • Take charge of available client databases, networks, storage, servers, directories, and other technology services.
  • Collaborate with the network engineer to design infrastructure improvements and changes and to troubleshoot any issues that arise.
  • Plan, design, and manage infrastructure technologies that can support complex and heterogeneous corporate data and voice infrastructure.
  • Execute, test, and roll out innovative solutions to keep up with the growing competition.
  • Create and document proper installation and configuration procedures.
  • Assist in handling software distributions and software updates and patches.
  • Oversee deployment of systems and network integration in association with partner clients, business partners, suppliers and subsidiaries.
  • Create, update, and manage IT policies.
  • Manage, & drive assigned vendors. Perform cost benefit analysis and provide recommendations to management

Key Proficiencies

* Bachelor’s or Master’s degree in computer science, information technology, electronics, telecommunications or any related field.

* Minimum 10 years of experience in the above-mentioned fields.

SaaS-based product company

Agency job
via SilverPeople Consulting by Manasa Rao
Bengaluru (Bangalore), Gurugram
4 - 8 yrs
₹6L - ₹15L / yr
NOC
Computer Networking
Network operations
Amazon Web Services (AWS)
Windows Azure
Hi

We have an opportunity for a Lead Operations Engineer role with our client in Bangalore/Gurgaon. Sharing the JD for your reference. Please revert if you would be interested in this opportunity and we can connect accordingly.

JOB DETAILS

Shift timing: 

9.00 AM - 6.00 PM / 11.00 AM - 8.00 PM / 2.00 PM - 11.00 PM / 7.00 PM - 3.00 AM IST (night shift allowance will be provided)

Position

Lead Operations Engineer

Location

Bangalore/ Gurgaon

About Our client

Who we are :

At a time when consumers are connected and empowered like never before, our client is helping the world's largest brands provide amazing experiences at every turn. It offers a set of powerful social capabilities that allow our clients to reach, engage, and listen to customers across 24 social channels. We empower entire organizations to work together across social, marketing, advertising, research, and customer care to manage customer experience at scale. Most excitingly, our client works with 50% of the Fortune 500 and nine of the world's 10 most valued brands, including McDonald's, Nestle, Nike, P&G, Shell, Samsung, and Visa.

What You'll Do

As a Lead Operations Engineer at our client, you should be passionate about working on new technologies and high-profile projects, and motivated to deliver solutions on an aggressive schedule.

Candidates from product-based companies only.

1. 5-7 years of exposure to and working knowledge of data centers, on-premise or on AWS/Azure/GCP.

2. Working experience with Jenkins, Ansible, Git, and release & deployment processes.

3. Working experience with ELK, MongoDB, Kafka, and Kubernetes.

4. Implement and operate SaaS environments hosting multiple applications and provide production support.

5. Contribute to automation and provisioning of environments.

6. Strong Linux systems administration skills with RHCE/CentOS.

7. Scripting knowledge in one of the following: Python/Bash/Perl (a minimal health-check sketch follows this list).

8. Good knowledge of Gradle, Maven, etc.

9. Knowledge of service monitoring via Nagios, Sensu, etc.

10. Good to have: knowledge of setting up and deploying application servers.

11. Mentoring team members.
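For the scripting and service-monitoring items above (7 and 9), here is a minimal Python health-check sketch: probe a few HTTP endpoints and exit non-zero if any are down, so it can be plugged into cron, Jenkins, or a Nagios/Sensu-style custom check. The endpoint URLs are hypothetical placeholders.

import sys
import urllib.request

# Placeholder endpoints; replace with your own application health URLs.
ENDPOINTS = [
    "http://localhost:8080/health",
    "http://localhost:9200/_cluster/health",
]

def is_up(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with a 2xx status within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 300
    except Exception as exc:  # connection errors, timeouts, HTTP errors
        print(f"DOWN {url}: {exc}")
        return False

if __name__ == "__main__":
    results = [is_up(url) for url in ENDPOINTS]
    # Exit code 0 = all healthy, 2 = at least one failed (monitoring-friendly).
    sys.exit(0 if all(results) else 2)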
Hyderabad
3 - 7 yrs
₹1L - ₹15L / yr
Big Data
Spark
Hadoop
PySpark
Amazon Web Services (AWS)

Big data Developer

Exp: 3 yrs to 7 yrs.
Job Location: Hyderabad
Notice: Immediate / within 30 days

1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience in developing Lambda functions with AWS Lambda (a minimal handler sketch follows this list)
3. Expertise with Spark/PySpark: the candidate should be hands-on with PySpark code and able to do transformations with Spark
4. Should be able to code in Python and Scala.
5. Snowflake experience will be a plus
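For item 2 above, here is a minimal AWS Lambda handler sketch: triggered by an S3 object-created event, it starts a downstream Glue job for each newly arrived object via boto3. The Glue job name and the argument key are hypothetical placeholders; boto3 ships with the Lambda Python runtime.

import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    """Start a Glue job run for every S3 object referenced in the event."""
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Placeholder job name and argument; align with your Glue job's parameters.
        response = glue.start_job_run(
            JobName="example-etl-job",
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
        started.append(response["JobRunId"])
    return {"started_runs": started}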

Hadoop and Hive are good to have; an understanding of them is sufficient rather than a hard requirement.

A fast-growing SaaS commerce company

Agency job
via Jobdost by Mamatha A
Bengaluru (Bangalore)
5 - 8 yrs
₹25L - ₹35L / yr
API
Integration
SQL
Amazon Web Services (AWS)
Vendor Management

As a Customer Success Manager at this company, you will prioritize the goals and needs of our customers. CSMs form a direct relationship with customers and provide them with timely value propositions. CSM teams work with customers to help them with product onboarding and technical implementation support, understanding their business use cases and mapping them to the product features. The key skills required are:

  • You have a customer first approach while implementing a technology product and can independently lead the project in a highly consultative and proactive manner.
  • You can work with cross-functional teams (e.g. Product, Sales, technology) to quickly come up with solutions that achieve customer objectives.
  • You are adept in client relationships and capable of engaging in business-level and technical conversations at multiple levels of the organization. You are empathetic and a good listener. 
  • You constantly strive to improve customer health metrics like product implementation time, CSAT, LTV, repeat purchase, churn, retention, NPS, upsell and cross-sell.

You should have :

  • 5+ years of experience working with enterprise-level strategic customers on technology solutions.
  • Ability to understand complex business requirements. 
  • Ability to quickly learn and explain technical concepts. 
  • Good project management skills. 
  • Strong critical thinking skills and ability to draw insights to improve the product and customer experience. 
  • Very good verbal and written communication & presentation skills.
  • Knowledge of technology solutions like APIs, integrations, SQL, AWS, Pendo, HubSpot, and Freshdesk would be a big plus.
  • Very hands-on experience in Excel sheets and advanced data analysis. 
  • Excellent communication skills and experience working with a SaaS company would be ideal.

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts yet maintain the quality of your work, interact and share your ideas, and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the comprehensive benefits that the company offers.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. The company works with over 1,000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners, and consumers for better business results.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We assure you that we will attempt to maintain a reasonable timeframe for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

 
Hiring for a leading client

Agency job
via Jobaajcom by Saksham Agarwal
New Delhi
3 - 5 yrs
₹10L - ₹15L / yr
Big Data
Apache Kafka
Business Intelligence (BI)
Data Warehouse (DWH)
Coding
Job Description:
Senior Software Engineer - Data Team

We are seeking a highly motivated Senior Software Engineer with hands-on experience building scalable, extensible data solutions, identifying and addressing performance bottlenecks, collaborating with other team members, and implementing best practices for data engineering. Our engineering process is fully agile and has a really fast release cycle, which keeps our environment very energetic and fun.

What you'll do:

Design and development of scalable applications.
Work with Product Management teams to get maximum value out of existing data.
Contribute to continual improvement by suggesting improvements to the software system.
Ensure high scalability and performance.
You will advocate for good, clean, well-documented, and performant code; follow standards and best practices.
We'd love for you to have:

Education: Bachelor's/Master's degree in Computer Science.
Experience: 3-5 years of relevant experience in BI/DW with hands-on coding experience.

Mandatory Skills

Strong in problem-solving
Strong experience with Big Data technologies: Hive, Hadoop, Impala, HBase, Kafka, Spark
Strong experience with orchestration frameworks like Apache Oozie and Airflow
Strong experience in data engineering
Strong experience with Database and Data Warehousing technologies and ability to understand complex design, system architecture
Experience with the full software development lifecycle, design, develop, review, debug, document, and deliver (especially in a multi-location organization)
Good knowledge of Java
Desired Skills

Experience with Python
Experience with reporting tools like Tableau, QlikView
Experience with Git and CI/CD pipelines
Awareness of cloud platforms, e.g. AWS
Excellent communication skills with team members, Business owners, across teams
Be able to work in a challenging, dynamic environment and meet tight deadlines
MNC
Bengaluru (Bangalore)
3 - 9 yrs
₹3L - ₹17L / yr
Scala
Spark
Data Warehouse (DWH)
Business Intelligence (BI)
Apache Spark
Dear All,

We are looking for candidates with good BI/DW experience of 3 - 6 years, with Spark, Scala, and SQL expertise. An Azure background is needed.
     * Spark hands-on: Must have
     * Scala hands-on: Must have
     * SQL expertise: Expert
     * Azure background: Must have
     * Python hands-on: Good to have
     * ADF, Databricks: Good to have
     * Should be able to communicate effectively and deliver technology implementations end to end
Looking for candidates who can join within 15 to 30 days or who are available immediately.


Regards
Gayatri P
Fragma Data Systems
INSOFE


1 recruiter
Posted by Nitika Bist
Hyderabad, Bengaluru (Bangalore)
7 - 10 yrs
₹12L - ₹18L / yr
Big Data
Data engineering
Apache Hive
Apache Spark
Hadoop
Roles & Responsibilities:
  • Total experience of 7-10 years; should be interested in teaching and research
  • 3+ years’ experience in data engineering, which includes data ingestion, preparation, provisioning, automated testing, and quality checks.
  • 3+ years of hands-on experience with Big Data cloud platforms like AWS and GCP, data lakes, and data warehouses
  • 3+ years with Big Data and analytics technologies. Experience in SQL and writing code on the Spark engine using Python, Scala, or Java. Experience in Spark and Scala.
  • Experience in designing, building, and maintaining ETL systems
  • Experience in data pipeline and workflow management tools like Airflow
  • Application development background along with knowledge of analytics libraries, open-source Natural Language Processing, and statistical and big data computing libraries
  • Familiarity with visualization and reporting tools like Tableau and Kibana.
  • Should be good at technology storytelling
Please note that candidates should be interested in teaching and research work.

Qualification: B.Tech / BE / M.Sc / MBA / B.Sc. Certifications in Big Data technologies and cloud platforms like AWS, Azure, and GCP will be preferred
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free-of-cost training on Data Science from top-notch professors