MicroStrategy administration jobs

11+ MicroStrategy administration Jobs in India

Apply to 11+ MicroStrategy administration Jobs on CutShort.io. Find your next job, effortlessly. Browse MicroStrategy administration Jobs and apply today!

Latent Bridge Pvt Ltd

6 recruiters
Posted by Mansoor Khan
Remote only
3 - 7 yrs
₹5L - ₹20L / yr
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

  • Familiar with the MicroStrategy architecture; Admin Certification preferred.
  • Familiar with administrative functions: using Object Manager and Command Manager, installing/configuring MSTR in a clustered architecture, and applying patches and hot-fixes.
  • Monitor and manage existing Business Intelligence development/production systems.
  • MicroStrategy installation, upgrade, and administration on Windows and Linux platforms.
  • Ability to support and administer multi-tenant MicroStrategy infrastructure, including server security, troubleshooting, and general system maintenance.
  • Analyze application and system logs during troubleshooting and root cause analysis.
  • Work on operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration.
  • Monitor, report, and investigate solutions to improve report performance.
  • Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.
  • Provide support for the platform, report execution and implementation, the user community, and data investigations.
  • Identify improvement areas in environment hosting and upgrade processes.
  • Identify automation opportunities and participate in automation implementations.
  • Provide on-call support for Business Intelligence issues.
  • Experience working on MSTR 2021, including knowledge of Enterprise Manager and new features such as Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.
  • Familiar with AWS and Linux scripting.
  • Knowledge of MSTR Mobile.
  • Knowledge of capacity planning and the system's scaling needs.

Read more
Top IT MNC

Agency job
via People First Consultants by Aishwarya KA
Chennai, Coimbatore, Noida, Pune, Kolkata, Bengaluru (Bangalore), Gurugram, Hyderabad, Mumbai, Cochin
8 - 15 yrs
Best in industry
Amazon Web Services (AWS)
Technical Architecture
We are looking for an AWS Architect for a leading MNC.
Experience: 8+ years
AWS certification is a must.
Location: Pan-India
Read more
Carsome

3 recruiters
Posted by Piyush Palkar
Remote, Kuala Lumpur
1 - 6 yrs
₹10L - ₹30L / yr
Data Science
Machine Learning (ML)
Python
SQL
Problem solving
+4 more

Carsome’s Data Department is on the lookout for a Data Scientist/Senior Data Scientist with a strong passion for building data-powered products.

 

The Data Science function under the Data Department is responsible for standardising methods (including code libraries and documentation), mentoring a team of data science resources/interns, quality assurance of outputs, and modeling techniques and statistics, leveraging a variety of technologies, open-source languages, and cloud computing platforms.

 

You will get to lead and implement projects such as price optimization/prediction, enabling iconic personalization experiences for our customers, inventory optimization, etc.

 

Job Description

 

  • Identify and integrate datasets that can be leveraged through our product, and work closely with the data engineering team to develop data products.
  • Execute analytical experiments methodically to help solve problems and make a true impact across functions such as operations, finance, logistics, and marketing.
  • Identify, prioritize, and design testing opportunities that will inform algorithm enhancements.
  • Devise and utilize algorithms and models to mine big data stores; perform data and error analysis to improve models; clean and validate data for uniformity and accuracy.
  • Unlock insights by analyzing large amounts of complex website traffic and transactional data.
  • Implement analytical models into production by collaborating with data analytics engineers.

 

Technical Requirements

 

  • Expertise in model design, training, evaluation, and implementation.
  • ML algorithm expertise: k-nearest neighbors, Random Forests, Naive Bayes, regression models, gradient boosting, t-SNE.
  • Deep learning expertise: PyTorch, TensorFlow, Keras.
  • Languages and tools: Python, PySpark, SQL, R, AWS SageMaker/Personalize, etc.
  • Machine Learning / Data Science certification.
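As a concrete illustration of one of the algorithms named above, here is a minimal from-scratch k-nearest-neighbors sketch in plain Python; the toy points and labels are invented for the example and are not part of the role.

```python
# Minimal k-nearest-neighbors classifier: label a query point by
# majority vote among its k closest training points.
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest points."""
    # Sort (distance, label) pairs by distance to the query.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two well-separated clusters.
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b"), ((4.9, 5.2), "b")]

pred = knn_predict(train, (0.05, 0.1))
print(pred)  # a
```

In practice this is the same train/evaluate loop a library such as scikit-learn wraps behind `fit`/`predict`.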

 

Experience & Education 

 

  • Bachelor’s in Engineering / Master’s in Data Science  / Postgraduate Certificate in Data Science. 
Read more
Personal Care Product Manufacturing

Agency job
via Qrata by Rayal Rajan
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.
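The pipeline responsibilities above can be sketched at toy scale. Here is a minimal extract-transform-load example in plain Python using only the standard library; SQLite stands in for the warehouse, and the table and column names are illustrative, not from the posting.

```python
# Minimal ETL sketch: extract raw records, apply a data-quality
# transform, and load the result into a warehouse table.
import csv
import io
import sqlite3

# Extract: parse raw CSV (stands in for a source system).
raw = "order_id,amount\n1,250\n2,\n3,980\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop incomplete records and cast types.
clean = [
    {"order_id": int(r["order_id"]), "amount": int(r["amount"])}
    for r in rows
    if r["amount"]  # data-quality check: reject missing amounts
]

# Load: write into a warehouse table (SQLite as a stand-in).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, amount INTEGER)")
db.executemany("INSERT INTO orders VALUES (:order_id, :amount)", clean)
total = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 1230
```

A production pipeline replaces each stage with a scalable equivalent (object storage, Spark jobs, a warehouse such as Redshift), but the extract/transform/load shape is the same.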


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Read more
Uber9 Business Process Services Pvt Ltd
Lakshmi J
Posted by Lakshmi J
Chennai
1 - 4 yrs
₹1L - ₹4L / yr
MongoDB
Machine Learning (ML)
Deep Learning
Natural Language Processing (NLP)
Amazon Web Services (AWS)

You will work alongside our highly motivated, advanced Machine Learning team; key responsibilities are to research, design, develop, and implement applications that will be integrated into our workflows.

Responsibilities and Accountabilities:

 

  • Provide ML and Deep Learning solutions and build models for day-to-day needs.
  • Work on end-to-end automation of complex workflows.
  • Ability to read and understand the relevant Deep Learning research papers and draw solutions from them.
  • Information extraction from various kinds of documents submitted by our customers. These documents will be images (different formats and resolutions) and PDFs (text and scanned images).
  • Advanced Natural Language Processing algorithms to extract metadata and drive research-based workflows.
  • Work collaboratively with the Engineering and Product teams to design and implement the company’s technical vision.

Experience:


  • Nature of experience: practical experience applying machine learning to computer vision or NLP tasks.
  • Length of experience: 1-2 years (freshers with extraordinary projects are also considered).


Skill Set & Personality Traits Required:

  • Have a proven understanding of computer vision and machine learning theory.
  • Able to analyze and synthesize data both syntactically and semantically using NLP techniques built on neural networks (Transformers: BERT and its variants; RNN, LSTM, Bi-LSTM).
  • Reasoning on knowledge graphs.
  • Applied and computational linguistics.
  • In-depth knowledge of Computer Vision (image classification with CNNs, image processing) and Natural Language Processing (syntactic and semantic regimes).
  • Must have the following Machine Learning skills: probabilistic learning (Naive Bayes), neural networks (CNN, RNN, LSTM, Bi-LSTM, GCNN, object detection networks such as YOLO).
  • Proficiency in Python (mandatory); Scala is an add-on.
  • Working knowledge of frameworks: TensorFlow/PyTorch/Keras, Spark, scikit-learn, Flask, FastAPI.
  • Working knowledge of databases: MongoDB.
  • Working knowledge of cloud: AWS (S3 and Lambda).
  • Vision and experience to build end-to-end Machine Learning platform solutions.
  • Proven experience working in a product-driven environment, building and shipping early-stage technologies.
  • Professionally credible, with strong integrity.
  • Good communication skills.
  • Strong interpersonal skills.
  • Organizational skills and the ability to manage deadlines.
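As a toy illustration of the probabilistic-learning skill listed above, here is a from-scratch Naive Bayes text classifier with Laplace smoothing; the documents and labels are made up for the example.

```python
# Multinomial Naive Bayes from scratch: learn word counts per class,
# then score a new document by log prior + smoothed log likelihoods.
from collections import Counter, defaultdict
import math

def train_nb(docs):
    """docs: list of (tokens, label). Returns (doc counts, word counts, vocab)."""
    class_docs = defaultdict(int)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        class_docs[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return class_docs, word_counts, vocab

def predict_nb(model, tokens):
    class_docs, word_counts, vocab = model
    total_docs = sum(class_docs.values())
    best, best_lp = None, -math.inf
    for label, n_docs in class_docs.items():
        lp = math.log(n_docs / total_docs)  # log prior
        total_words = sum(word_counts[label].values())
        for t in tokens:  # log likelihood with Laplace (add-one) smoothing
            lp += math.log((word_counts[label][t] + 1)
                           / (total_words + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

docs = [("invoice due payment".split(), "finance"),
        ("invoice overdue payment".split(), "finance"),
        ("team lunch friday".split(), "social"),
        ("lunch menu friday".split(), "social")]
model = train_nb(docs)
print(predict_nb(model, ["payment", "invoice"]))  # finance
```

The same independence assumption over word occurrences is what a library implementation such as a multinomial Naive Bayes classifier encodes.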
Read more
Emids Technologies

2 candid answers
Posted by Rima Mishra
Bengaluru (Bangalore)
5 - 10 yrs
₹4L - ₹18L / yr
Jasper
JasperReports
ETL
JasperSoft
OLAP
+3 more

Job Description - Jasper 

  • Knowledge of Jasper report server administration, installation and configuration
  • Knowledge of report deployment and configuration
  • Knowledge of Jaspersoft Architecture and Deployment
  • Knowledge of User Management in Jaspersoft Server
  • Experience in developing Complex Reports using Jaspersoft Server and Jaspersoft Studio
  • Understand the Overall architecture of Jaspersoft BI
  • Experience in creating Ad Hoc Reports, OLAP, Views, Domains
  • Experience in report server (Jaspersoft) integration with web applications
  • Experience with the JasperReports Server web services API and the Jaspersoft Visualize.js API
  • Experience in creating dashboards with visualizations
  • Experience in security and auditing, metadata layer
  • Experience in Interacting with stakeholders for requirement gathering and Analysis
  • Good knowledge of ETL design and development, and of logical and physical data modeling (relational and dimensional)
  • Strong self-initiative to strive for both personal and technical excellence.
  • Coordinate efforts across the Product Development and Business Analyst teams.
  • Strong business and data analysis skills.
  • Domain knowledge of Healthcare is an advantage.
  • Should coordinate well with onshore resources on development.
  • Data-oriented professional with good communication skills and a great eye for detail.
  • Interpret data, analyze results and provide insightful inferences
  • Maintain relationship with Business Intelligence stakeholders
  • Strong Analytical and Problem Solving skills 


Read more
Simplilearn Solutions

1 video
36 recruiters
Posted by STEVEN JOHN
Anywhere, United States, Canada
3 - 10 yrs
₹2L - ₹10L / yr
Java
Amazon Web Services (AWS)
Big Data
Corporate Training
Data Science
+2 more
To introduce myself, I head Global Faculty Acquisition for Simplilearn.

About My Company: SIMPLILEARN has transformed 500,000+ careers across 150+ countries with 400+ courses, and is a Registered Professional Education Provider offering PMI-PMP, PRINCE2, ITIL (Foundation, Intermediate & Expert), MSP, COBIT, Six Sigma (GB, BB & Lean Management), Financial Modeling with MS Excel, CSM, PMI-ACP, RMP, CISSP, CTFL, CISA, CFA Level 1, CCNA, CCNP, Big Data Hadoop, CBAP, iOS, TOGAF, Tableau, Digital Marketing, Data Scientist with Python, Data Science with SAS & Excel, Big Data Hadoop Developer & Administrator, Apache Spark and Scala, Tableau Desktop 9, Agile Scrum Master, Salesforce Platform Developer, Azure & Google Cloud. Our official website: www.simplilearn.com

If you're interested in teaching, interacting, sharing real-life experiences, and a passion to transform careers, please join hands with us.

Onboarding Process
  • Send your updated CV to my email id, with relevant certificate copies.
  • Sample e-learning access will be shared with a 15-day trial after you register on our website.
  • My Subject Matter Expert will evaluate you on your areas of expertise over a telephonic conversation (15 to 20 minutes).
  • Commercial discussion.
  • We will register you for an ongoing online session to introduce you to our course content and the Simplilearn style of teaching.
  • A demo will be conducted to check your training style and internet connectivity.
  • Freelancer Master Service Agreement.

Payment Process
  • Once a workshop (or the last day of training for the batch) is completed, share your invoice.
  • An automated tracking ID will be shared from our automated ticketing system.
  • Our faculty group will verify the details provided and share the invoice with our internal finance team to process your payment; if any additional information is required, we will coordinate with you.
  • Payment will be processed within 15 working days from the date the invoice is received, as per policy.

Please share your updated CV to proceed to the next step of the onboarding process.
Read more
Freelancer

4 recruiters
Posted by Nirmala Hk
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹35L / yr
Python
Shell Scripting
MySQL
SQL
Amazon Web Services (AWS)
+3 more

  • 3+ years of experience in deployment, monitoring, tuning, and administration of high-concurrency MySQL production databases.

  • Solid understanding of writing optimized SQL queries on MySQL databases
  • Understanding of AWS, VPC, networking, security groups, IAM, and roles
  • Expertise in scripting in Python or Shell/PowerShell
  • Must have experience in large-scale data migrations
  • Excellent communication skills.
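A minimal sketch of the "optimized SQL queries" skill listed above: adding an index so an equality lookup can avoid a full table scan. SQLite is used here as a stand-in for MySQL, and the schema is invented for illustration.

```python
# Index a frequently-filtered column, then inspect the query plan to
# confirm the lookup uses the index instead of scanning the table.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
db.executemany("INSERT INTO users (email) VALUES (?)",
               [(f"user{i}@example.com",) for i in range(1000)])

# Without an index on email, the WHERE predicate forces a full scan.
db.execute("CREATE INDEX idx_users_email ON users (email)")

plan = db.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM users WHERE email = ?",
    ("user500@example.com",),
).fetchall()
print(plan)  # the plan should mention idx_users_email
```

On MySQL the equivalent check is `EXPLAIN SELECT ...`, where the `key` column shows which index (if any) the optimizer chose.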
Read more
INSOFE

1 recruiter
Posted by Nitika Bist
Hyderabad, Bengaluru (Bangalore)
7 - 10 yrs
₹12L - ₹18L / yr
Big Data
Data engineering
Apache Hive
Apache Spark
Hadoop
+4 more
Roles & Responsibilities:
  • Total experience of 7-10 years; should be interested in teaching and research.
  • 3+ years' experience in data engineering, including data ingestion, preparation, provisioning, automated testing, and quality checks.
  • 3+ years' hands-on experience with Big Data cloud platforms such as AWS and GCP, data lakes, and data warehouses.
  • 3+ years with Big Data and analytics technologies; experience in SQL and in writing code for the Spark engine in Python, Scala, or Java.
  • Experience in designing, building, and maintaining ETL systems.
  • Experience with data pipeline and workflow management tools like Airflow.
  • Application development background, along with knowledge of analytics libraries, open-source Natural Language Processing, and statistical and big data computing libraries.
  • Familiarity with visualization and reporting tools like Tableau and Kibana.
  • Should be good at storytelling in technology.
Please note that candidates should be interested in teaching and research work.

Qualification: B.Tech / BE / M.Sc / MBA / B.Sc. Certifications in Big Data technologies and cloud platforms such as AWS, Azure, and GCP are preferred.
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free of cost training on Data Science from top notch professors
Read more
Company is into Product Development.

Agency job
via Master Mind Consultancy by Dnyanesh Panchal
Remote, Mumbai
10 - 18 yrs
₹30L - ₹55L / yr
Scala
Big Data
Java
Amazon Web Services (AWS)
ETL

What's the role?

Your role as a Principal Engineer will involve working with various teams. You will need full knowledge of the software development lifecycle and Agile methodologies, and will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the product you drive.

  • Setup coding practice, guidelines & quality of the software delivered.
  • Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
  • Prepares and installs solutions by determining and designing system specifications, standards, and programming.
  • Improves operations by conducting systems analysis; recommending changes in policies and procedures.
  • Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
  • Protects operations by keeping information confidential.
  • Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.

Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/ME/M.Tech or equivalent degree from a reputed college/university.

 

Essential Skills / Experience:

  • 10+ years of engineering experience
  • Experience in designing and developing high volume web-services using API protocols and data formats
  • Proficient in API modelling languages and annotation
  • Proficient in Java programming
  • Experience with Scala programming
  • Experience with ETL systems
  • Experience with Agile methodologies
  • Experience with Cloud service & storage
  • Proficient in Unix/Linux operating systems
  • Excellent oral and written communication skills

Preferred:

  • Functional programming languages (Scala, etc.)
  • Scripting languages (bash, Perl, Python, etc.)
  • Amazon Web Services (Redshift, ECS, etc.)
Read more
Data ToBiz

at Data ToBiz

2 recruiters
PS Dhillon
Posted by PS Dhillon
Chandigarh, NCR (Delhi | Gurgaon | Noida)
2 - 6 yrs
₹7L - ₹15L / yr
ETL
Amazon Web Services (AWS)
Amazon Redshift
Python
Job Responsibilities:
  • Develop new data pipelines and ETL jobs for processing millions of records; they should scale with growth.
  • Optimise pipelines to handle real-time data, batch updates, and historical data.
  • Establish scalable, efficient, automated processes for complex, large-scale data analysis.
  • Write high-quality code to gather and manage large data sets (both real-time and batch) from multiple sources, perform ETL, and store them in a data warehouse.
  • Manipulate and analyse complex, high-volume, high-dimensional data from varying sources using a variety of tools and data analysis techniques.
  • Participate in monitoring pipeline health and optimising performance, as well as quality documentation.
  • Interact with end users/clients and translate business language into technical requirements.
  • Act independently to expose and resolve problems.

Job Requirements:
  • 2+ years of experience in software development and data pipeline development for enterprise analytics.
  • 2+ years of working with Python, with exposure to various warehousing tools.
  • In-depth work with commercial tools such as AWS Glue, Talend, Informatica, or DataStage.
  • Experience with relational databases like MySQL, MSSQL, Oracle, etc. is a must.
  • Experience with analytics and reporting tools (Tableau, Power BI, SSRS, SSAS).
  • Experience with various DevOps practices to help clients deploy and scale systems as required.
  • Strong verbal and written communication skills with other developers and business clients.
  • Knowledge of the Logistics and/or Transportation domain is a plus.
  • Hands-on with traditional databases and ERP systems like Sybase and PeopleSoft.
Read more